
GRUPO ZAPATA · 14 hours ago

Software / Data Engineer (1451) (Remote)

Zapata Technology is seeking an experienced Software / Data Engineer for a remote role working West Coast hours. The Data Engineer will lead the transformation of the data infrastructure and build efficient ETL processes while collaborating with cross-functional teams to meet analytics and reporting needs.

Packaging Services
Diversity & Inclusion
No H1B · Security Clearance Required · U.S. Citizen Only

Responsibilities

Design and develop modular ETL solutions that facilitate data ingestion and delivery across various systems, utilizing custom ASP.NET applications and Postgres stored procedures
Develop and maintain scalable data storage systems, focusing on Postgres or other relational databases, to support efficient data retrieval and transformation processes
Develop optimized SQL queries and stored procedures that process and transform both structured and unstructured data, ensuring performance and reliability in data handling
Build and manage data pipelines tailored for reporting and analytics, leveraging custom scripting rather than relying on third-party ETL tools to maintain flexibility and control (a sketch of such a scripted step follows this list)
Collaborate with cross-functional teams to understand data requirements and implement custom data solutions that support the organization’s analytics and reporting needs
Create custom scripts and applications to enhance data transformation capabilities and address specific data management requirements
Document data ingestion and transformation processes thoroughly, ensuring compliance with data governance frameworks, including HRPP, Privacy, HIPAA, and SOP guidelines
Perform quality assurance tasks to validate data accuracy and troubleshoot data quality issues, upholding system integrity and reliability
Support metadata management practices, ensuring all data processes comply with governance standards and industry best practices
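
For illustration only, here is a minimal sketch of the kind of custom-scripted pipeline step these responsibilities describe: a short Python script (using psycopg2, which the posting does not prescribe) that invokes a hypothetical Postgres stored procedure to transform a staged batch into a reporting table. The connection settings, procedure name, and batch identifier are assumptions, not details from the posting.

# Hypothetical sketch of a custom-scripted ETL step (not from the posting):
# call a Postgres stored procedure that transforms a staged batch of records.
import os
import psycopg2

def run_transform(batch_id: int) -> None:
    # Connection details come from the environment; defaults are illustrative.
    conn = psycopg2.connect(
        host=os.environ.get("PGHOST", "localhost"),
        dbname=os.environ.get("PGDATABASE", "analytics"),
        user=os.environ.get("PGUSER", "etl_user"),
        password=os.environ.get("PGPASSWORD", ""),
    )
    try:
        # Using the connection as a context manager commits on success.
        with conn, conn.cursor() as cur:
            # CALL assumes a Postgres 11+ stored procedure named
            # transform_staged_batch (a hypothetical name) that holds
            # the actual transformation logic.
            cur.execute("CALL transform_staged_batch(%s);", (batch_id,))
    finally:
        conn.close()

if __name__ == "__main__":
    run_transform(batch_id=42)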

Qualifications

ETL development · Postgres · SQL optimization · Python · ASP.NET · AWS · Data pipelines · Shell scripting · Agile methodologies · Security+ Certification

Required

4+ years of software development experience
Security+ Certification
Active Secret clearance with the ability to obtain a TS/SCI

Preferred

Experience with Python, ASP.NET, C#, and related web application development frameworks
Advanced knowledge of Postgres, particularly in writing complex stored procedures, query optimization, and transitioning data infrastructure to Postgres
Strong familiarity with AWS cloud environments, with experience in migrating data systems to AWS
Experience with low-code/no-code platforms and distributed data processing tools such as Spark, Databricks, or similar
Knowledge of real-time data processing, streaming applications, and data warehousing concepts, especially in cloud-native environments
Proficiency in shell scripting and automation to support cloud migrations and system integrations (see the sketch after this list)
Experience working in Agile methodologies, particularly within cloud-based or hybrid cloud environments
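
As a companion sketch for the preferred cloud-migration and scripting skills, here is a short Python example (using boto3 and psycopg2, neither of which the posting prescribes) that pulls a CSV export from S3 and bulk-loads it into a Postgres staging table. The bucket, key, table name, and connection string are hypothetical.

# Hypothetical sketch (not from the posting): stage an S3 CSV export into Postgres.
import boto3
import psycopg2

def load_export(bucket: str, key: str, dsn: str) -> None:
    # Download the export locally; boto3 resolves AWS credentials from the environment.
    local_path = "/tmp/export.csv"
    boto3.client("s3").download_file(bucket, key, local_path)

    # Bulk-load with COPY; staging_orders is an illustrative table name.
    conn = psycopg2.connect(dsn)
    try:
        with conn, conn.cursor() as cur, open(local_path, "r") as f:
            cur.copy_expert(
                "COPY staging_orders FROM STDIN WITH (FORMAT csv, HEADER true);", f
            )
    finally:
        conn.close()

if __name__ == "__main__":
    load_export("example-data-bucket", "exports/orders.csv",
                "dbname=analytics user=etl_user host=localhost")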

Company

GRUPO ZAPATA


Funding

Current Stage: Late Stage
Company data provided by Crunchbase