
Description
SAIC is seeking a Data Extract, Transform, Load (ETL) Systems Engineer in Chantilly, VA to focus on Application Programming Interface (API) design, implementation, and maintenance to extract data from various sources, transform it, and visualize it for analysis and use by senior decision makers. This role also involves advising on data infrastructure, data quality, and data security while employing a variety of technical tools and solutions.
The successful applicant will exhibit flexibility in task execution and provide technical assistance. As a SETA advisor, the candidate will be expected to exercise value-added judgment in advising the government on program activities. The successful candidate will produce recommendations and deliverables in a thorough, practicable, and consistent manner congruent with the organization's objectives. US citizenship is required. An active Top Secret/Sensitive Compartmented Information (TS/SCI) clearance with polygraph is required to start and must be maintained.
Responsibilities:
- Design, develop, and implement ETL processes to extract, transform, and load data from various sources (e.g., databases, flat files, APIs).
- Build, maintain, and optimize data pipelines to ensure efficient and reliable data flow.
- Ensure data quality and integrity throughout the ETL process, addressing data cleansing, validation, and security concerns.
- Design and maintain data warehouse schema and data models, ensuring data consistency and accuracy.
- Provide technical expertise and guidance on data infrastructure, data modeling, and data governance practices.
- Monitor and optimize ETL pipeline performance, addressing bottlenecks and improving execution times.
- Troubleshoot ETL issues, identify root causes, and implement solutions.
- Create and maintain documentation for ETL processes, data mappings, and data models.
- Collaborate with cross-functional teams (e.g., data analysts, business users) to understand data requirements and ensure data quality.
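For illustration only (not part of the position requirements), the following is a minimal Python sketch of the extract-transform-load pattern the responsibilities above describe. The source table, column names, and file paths are hypothetical; a real pipeline in this role would instead target production databases, flat files, or APIs.

    import csv
    import sqlite3

    # Hypothetical source database and target file, chosen only for the sketch.
    SOURCE_DB = "source.db"
    TARGET_CSV = "report_extract.csv"

    def extract(conn):
        # Extract: pull raw rows from the source system.
        return conn.execute("SELECT id, name, amount FROM transactions").fetchall()

    def transform(rows):
        # Transform: basic cleansing and validation. Drop incomplete rows,
        # normalize whitespace, and coerce amounts to numbers.
        cleaned = []
        for row_id, name, amount in rows:
            if not name or amount is None:
                continue
            cleaned.append((row_id, name.strip(), float(amount)))
        return cleaned

    def load(rows):
        # Load: write validated rows to the target for downstream analysis.
        with open(TARGET_CSV, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["id", "name", "amount"])
            writer.writerows(rows)

    if __name__ == "__main__":
        with sqlite3.connect(SOURCE_DB) as conn:
            load(transform(extract(conn)))

In practice the same pattern scales up through the integration tools named in the qualifications below (e.g., AWS Glue, Apache Kafka); the sketch shows only the cleansing and validation step in miniature.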
Qualifications
Required Skills and Qualifications:
- Must be a U.S. Citizen.
- Active TS/SCI clearance with polygraph.
- Bachelor's degree in science, technology, engineering, or mathematics (STEM).
- Aptitude for designing and implementing data pipelines that automate the Extraction, Transformation, and Loading (ETL) process.
- Knowledge of Application Programming Interfaces (APIs) and experience implementing existing APIs as-is or reworking them.
- Experience with Tableau extract, transform, and load functions.
- A background in Python, Java, Scala, and/or SQL.
- Knowledge of AWS services.
- Understanding of data architecture and best practices.
- Ability to write scripts for data ETL.
- Knowledge or understanding of data integration tools and platforms, such as Apache Kafka, Apache Flume, AWS Glue, Azure Data Factory, or Google Cloud Dataflow.
- Understanding of RESTful APIs and web services.
- Ability to use relational databases (e.g., MS SQL, MySQL, PostgreSQL, Oracle) and NoSQL databases (e.g., MongoDB, Cassandra, DynamoDB).
- Knowledge of data warehousing concepts and data warehousing platforms (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Experience documenting project requirements and schedules using agile project management techniques.
- Understanding of software configuration management techniques.
- Understanding of Retrieval Augmented Generation (RAG).
- AWS Solutions Architect, Data Analytics, or Developer Associate certification preferred.
- Cloudera Data Platform, Azure Data Engineer, or Google Data Engineer certification(s) preferred.
- Familiarity with data modeling, data cataloging, and/or data governance is desired.
- Familiarity with the application and use of Artificial Intelligence (AI) and Machine Learning (ML) services is desired.