Big Data Architect

Company: SAIC
Location: Chantilly
Closing Date: 29/10/2024
Salary: £100 - £125 Per Annum
Hours: Full Time
Type: Permanent
Job Requirements / Description

Description

SAIC is seeking an experienced, results-oriented, mission-driven Big Data Architect with a specialized focus on Data Engineering to perform data model design, data formatting, and ETL development optimized for efficient storage, access, and computation in support of national security objectives.

Responsibilities include, but are not limited to:

  • As part of an Agile team, increase innovation capacity and accelerate the development of data ingestion and data analysis capabilities.
  • Coordinate with other task areas in assembling data technologies that manage the flow of data from source to value, with the goal of speeding up the delivery of value and insight.
  • Bring a passion for unlocking the insights held in a dataset, along with solid understanding and experience in developing, automating, and enhancing all parts of the data pipeline, including ingestion, processing, storage, and exposing data for consumption.
  • Implement data quality tests, improve inefficient tooling, and adopt transformative new technologies while maintaining operational continuity.

Qualifications

Required:

  • Active TS/SCI with Polygraph Clearance
  • Bachelor’s Degree in Computer Science, Information Systems, or Engineering, or equivalent years of experience in lieu of a degree
  • 14 years of overall related professional experience
  • 3+ years of hands-on development experience using Java, JavaScript, or Python to ETL data
  • ETL experience, including working with formats such as XML, JSON, and YAML, normalizing data, and high-volume data ingestion
  • 3+ years' experience using and ingesting data into SQL and NoSQL database systems
  • Familiarity with the NEXIS platform
  • Experience with Apache NiFi
  • Experience programming in Apache Spark and PySpark

Desired:

  • Familiarity with building containerized services (e.g., via Docker)
  • Familiarity with the Databricks platform
  • Experience developing and maintaining data processing flows
  • Experience with Amazon Web Services (AWS)
  • Experience with CI/CD pipelines
  • Experience with Agile methodologies and the Kanban framework
  • Experience with relational databases, including using MySQL and/or Oracle to design database schemas
  • Experience with Linux, REST services, and HTTP
Apply Now