Data Engineer

Job Details

Lorton, VA
Hybrid
Full Time
Information Technology

Description

Data Engineer

A qualified candidate is responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams while acting as a subject matter expert. The candidate will have extensive experience supporting software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure that an optimal data delivery architecture is consistent across ongoing projects. The candidate must be self-directed, comfortable supporting the data needs of multiple teams, systems, and products, and prepared to optimize or even re-design the company's data architecture to support the next generation of products and data initiatives.

 

Clearance: Interim Secret, minimum

Onsite Requirements: 3 days onsite, 2 days remote

 

Key Responsibilities

  • Create data tools that help analytics and data science team members build and optimize the product into an innovative industry leader.
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex datasets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS/Azure ‘big data’ technologies.
  • Work with data and analytics experts to strive for greater functionality in the data systems.

Qualifications

Required Qualifications

  • Hands-on experience with a wide variety of data platforms, including SQL database design, PostgreSQL, and Microsoft SQL Server.
  • Working knowledge of programming languages and frameworks such as Python, PySpark, and R.
  • Three or more (3+) years of experience as a data engineer or in a similar role.
  • Capable of supporting and working with cross-functional teams in a dynamic environment.
  • Technical expertise with data models, data mining, and segmentation techniques.
  • Familiar with ETL solutions such as Azure Data Factory for extracting, transforming, and loading data into databases or other storage types.
  • Familiar with cloud platforms (e.g., AWS, Azure).
  • Familiar with various logging solutions, including but not limited to application logs, system logs, network logs, Splunk, and Elasticsearch.
  • Understanding of data normalization techniques and their implementation.

Bonus Qualifications

  • Bachelor's degree in computer science, IT, or a similar field, or equivalent work experience.
  • Data engineering certification (e.g., IBM Certified Data Engineer).
  • Strong understanding of data pipelines; should be able to work with REST, SOAP, FTP, HTTP, and ODBC.
  • Working knowledge of message queuing, stream processing, and highly scalable data stores.
  • Experience with Palantir Foundry.
  • Basic knowledge of Agile Scrum.