EPIC SYSTEMS INC
Databricks Engineer – Must have an Active Secret (S) clearance and be able to obtain a TS/SCI clearance and DHS Suitability
Epic Systems Inc is supporting a U.S. Government customer on a critical development and sustainment program. We are seeking a Databricks Engineer to assist in migrating customer applications and services to a medallion model, ensuring optimal performance and integration of data assets on the Databricks platform.
Responsibilities
Support teams in migrating services, applications, and platforms from legacy back-end systems to Databricks
Identify the optimal migration path, build the migration plan, and execute it
Migrate legacy data pipelines from NiFi to Databricks, including validation of the migrated pipelines
Implement the medallion model for each data asset migrated to Databricks (a minimal sketch follows this list)
Develop an SOP for integration of data assets into the Databricks platform focused on efficiency, instrumentation and performance
Optimize development, testing, monitoring and security for data assets being added to the Databricks platform
Develop and implement a strategy for optimizing migration and integration of data assets to the Databricks platform
Develop code in various programming and scripting languages to automate and optimize data ingestion and pipeline orchestration and to improve data management processes
Provide ingest transparency, leveraging technologies such as AWS CloudWatch to identify where to measure and gather performance information on automated data pipelines (illustrated in the sketch after this list)
Ensure Data Engineering Team Standard Operating Procedures are appropriately captured and communicated across the team
Ensure technical correctness, timeliness and quality of delivery for the team
Demonstrate excellent oral and written communication skills with all levels of management and the customer
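For illustration only, the sketch below shows roughly what the medallion-model and CloudWatch-instrumentation responsibilities above can look like in practice. It is not the program's actual pipeline: it assumes a Databricks notebook (where spark is predefined) and AWS credentials configured for boto3, and the table names, S3 paths, column names, and metric namespace are hypothetical placeholders.

    import boto3
    from pyspark.sql import functions as F

    # Bronze: land raw source data as-is in a Delta table (path is a placeholder).
    raw = spark.read.json("s3://example-bucket/raw/events/")
    raw.write.format("delta").mode("append").saveAsTable("bronze.events")

    # Silver: validated, de-duplicated records (event_id/event_ts are hypothetical columns).
    silver = (
        spark.table("bronze.events")
        .dropDuplicates(["event_id"])
        .filter(F.col("event_ts").isNotNull())
    )
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")

    # Gold: business-level aggregate ready for downstream consumers.
    gold = (
        silver.groupBy(F.to_date("event_ts").alias("event_date"))
        .agg(F.count("*").alias("event_count"))
    )
    gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_event_counts")

    # Instrumentation: publish a simple pipeline health metric to AWS CloudWatch.
    cloudwatch = boto3.client("cloudwatch")
    cloudwatch.put_metric_data(
        Namespace="DataPipelines/Medallion",
        MetricData=[{"MetricName": "SilverRowCount",
                     "Value": float(silver.count()),
                     "Unit": "Count"}],
    )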
Qualifications
Required
Must have an Active Secret (S) clearance and be able to obtain a TS/SCI clearance
Must be able to obtain DHS Suitability
8+ years of directly relevant software development experience
Minimum of 5 years of experience performing data engineering work in a cloud environment
Experience with relational, NoSQL, and/or file-based storage (e.g., Databricks, Elasticsearch, Postgres, S3, Athena)
Experience working in a CI/CD pipeline factory environment
Working knowledge of Databricks, Cloud Relational Database Services, NiFi, AWS Redshift and Elasticsearch
Bachelor's degree in Software Engineering, Computer Science, or a related discipline is required. [Ten (10) years of experience (for a total of eighteen (18) or more years) may be substituted for a degree.]
Preferred
Experience with Databricks workflows
Experience with Databricks Unity Catalog
Experience with Databricks Autoloader (see the sketch after this list)
Experience with Databricks Delta Live Tables/Delta Lake
Experience with Databricks Workspace/Notebooks
Experience with MLflow
Experience with Apache Spark
Experience with collaboration tools including MS Teams, MS Outlook, MS SharePoint, and Confluence
Amazon Web Services (AWS) Professional certification or equivalent
Excellent problem-solving and communication skills
Familiarity with CISA: Securing the Software Supply Chain
Familiarity with CISA: Cybersecurity Best Practices
Familiarity with CISA: Open-Source Software Security
Familiarity with NIST SP 800-218, Secure Software Development Framework V1.1: Recommendations for Mitigating the Risk of Software Vulnerabilities
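To illustrate the preferred Databricks tooling above (Autoloader, Delta Lake, Spark), here is a minimal Auto Loader sketch under stated assumptions: it requires the Databricks Runtime (the cloudFiles source is Databricks-specific) and a Spark version that supports availableNow triggers, and the S3 paths and table name are hypothetical placeholders rather than details of the actual program.

    # Incrementally ingest newly arrived files into a bronze Delta table with Auto Loader.
    stream = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "s3://example-bucket/_schemas/events/")
        .load("s3://example-bucket/raw/events/")
    )

    (
        stream.writeStream
        .option("checkpointLocation", "s3://example-bucket/_checkpoints/bronze_events/")
        .trigger(availableNow=True)  # process all files available now, then stop
        .toTable("bronze.events")
    )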