A2C · 2 weeks ago
Data Engineer w/ GCP
A2C is seeking a Data Engineer to build and maintain scalable data pipelines and cloud data infrastructure on GCP. The role focuses on using BigQuery, Dataflow, and modern ETL/ELT processes to support analytics and machine learning workflows.
Consulting · Information Services · Information Technology
Responsibilities
Build and optimize batch/streaming pipelines using Dataflow, Pub/Sub, Composer
Develop and tune BigQuery models, queries, and ingestion processes
Implement IaC (Terraform), CI/CD, monitoring, and data quality checks
Ensure data governance, security, and reliable pipeline operations
Collaborate with data science teams and support Vertex AI–based ML workflows
Qualifications
Required
A problem solver with the ability to analyze and research complex issues and propose actionable solutions and/or strategies
Solid understanding of and hands-on experience with major cloud platforms
Experience in designing and implementing data pipelines
Strong Python, SQL, and GCP skills
3–5+ years of data engineering experience
Hands-on GCP experience (BigQuery, Dataflow, Pub/Sub)
Solid ETL/ELT and data modeling experience
Preferred
GCP certifications
Spark
Kafka
Airflow
dbt/Dataform
Docker/K8s
Company
A2C
A2C provides IT staff augmentation and direct-hire solutions for technology consultants.
H1B Sponsorship
A2C has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. Additional information is provided below for reference. (Data powered by the US Department of Labor)
Charts: Distribution of Different Job Fields Receiving Sponsorship; Trends of Total Sponsorships (2021: 1)
Funding
Current Stage: Growth Stage (company data provided by Crunchbase)