White Cap Canada
Data Platform Engineer
White Cap Canada is committed to Building Trust on Every Job and is seeking a Data Platform Engineer to design and implement high-performance data pipelines and integrations. The role involves collaborating with product, architecture, and InfoSec teams to ensure secure data movement, and automating deployment and testing processes.
Construction
Responsibilities
Design, build, and maintain batch and streaming data pipelines using Databricks (PySpark, Delta Live Tables, Unity Catalog); a pipeline sketch follows this list
Develop and manage inbound/outbound data feeds via APIs, SFTP, pub/sub, or middleware platforms; a feed sketch follows this list
Build and optimize data models in Postgres and synchronize with analytical layers
Collaborate with product, architecture, and InfoSec teams to ensure secure and compliant data movement
Implement data quality, observability, and governance standards
Automate deployment and testing with CI/CD tools (e.g., Databricks Asset Bundles, GitHub Actions, or Azure DevOps); a test sketch follows this list
Participate in refactoring existing data pipelines toward modern, scalable approaches; help retire legacy techniques and communicate the new methods
Create build-vs-buy proposals; implement "greenfield" solutions or integrate third-party apps and connectors
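As a rough illustration of the Databricks pipeline work described above (not code from the posting), here is a minimal Delta Live Tables sketch: an Auto Loader ingest into a bronze table, then a typed silver table with a data-quality expectation. The landing path, table names, and columns are hypothetical, and the code only runs inside a Databricks DLT pipeline, where spark and the dlt module are provided by the runtime.

```python
# Minimal Delta Live Tables sketch; names and schema are hypothetical.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw order events ingested incrementally with Auto Loader.")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")      # `spark` is provided by the DLT runtime
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/raw/orders/")         # hypothetical landing path
    )


@dlt.table(comment="Typed orders with a basic data-quality expectation.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    )
```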
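In the same spirit, here is a minimal sketch of an inbound API feed landed into Postgres with an idempotent upsert. The endpoint, schema, table, and columns are invented for illustration, and the requests and psycopg2 libraries and the PG_DSN environment variable are assumptions rather than tools named by the posting.

```python
# Sketch of an inbound API feed -> Postgres upsert; endpoint and table are hypothetical.
import os

import psycopg2
import requests
from psycopg2.extras import execute_values

API_URL = "https://api.example.com/v1/shipments"  # hypothetical endpoint


def fetch_shipments(session: requests.Session) -> list[dict]:
    resp = session.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()["items"]


def upsert_shipments(conn, rows: list[dict]) -> None:
    # ON CONFLICT keeps reruns idempotent if the feed resends records.
    sql = """
        INSERT INTO staging.shipments (shipment_id, status, updated_at)
        VALUES %s
        ON CONFLICT (shipment_id) DO UPDATE
        SET status = EXCLUDED.status, updated_at = EXCLUDED.updated_at
    """
    values = [(r["shipment_id"], r["status"], r["updated_at"]) for r in rows]
    with conn.cursor() as cur:
        execute_values(cur, sql, values)
    conn.commit()


if __name__ == "__main__":
    with requests.Session() as session, psycopg2.connect(os.environ["PG_DSN"]) as conn:
        upsert_shipments(conn, fetch_shipments(session))
```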
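Finally, a sketch of the kind of small, deterministic test that a CI workflow (for example, GitHub Actions) could run on every commit before a deployment. The normalize_feed_record helper and its rules are hypothetical, not part of the posting.

```python
# Sketch of a CI-friendly unit test for a feed transformation; the helper is hypothetical.
from decimal import Decimal

import pytest


def normalize_feed_record(record: dict) -> dict:
    """Trim identifiers and round amounts to cents before loading downstream."""
    return {
        "order_id": record["order_id"].strip(),
        "amount": Decimal(str(record["amount"])).quantize(Decimal("0.01")),
    }


def test_normalize_feed_record_trims_and_rounds():
    result = normalize_feed_record({"order_id": "  A-1001 ", "amount": 19.999})
    assert result == {"order_id": "A-1001", "amount": Decimal("20.00")}


def test_normalize_feed_record_requires_order_id():
    with pytest.raises(KeyError):
        normalize_feed_record({"amount": 5})
```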
Qualifications
Required
Demonstrated skill in data analysis techniques, including resolving missing or incomplete information and inconsistencies or anomalies in more complex research and data
Typically requires a BS/BA in a related discipline with generally 2-5 years of experience in a related field, or an MS/MA with generally 2-4 years of experience in a related field; certification is required in some areas
Preferred
Proficiency in Python or Scala, with strong SQL skills
Hands-on experience with Databricks or Spark-based data engineering
Experience integrating APIs, building middleware connectors, and managing event-based data flows
Solid understanding of Postgres or similar OLTP databases
Familiarity with cloud environments (Azure preferred) and containerization (Docker/Kubernetes)
Strong problem-solving, performance tuning, and communication skills
Relevant certifications (e.g., Databricks Certified Data Engineer, Azure Data Engineer Associate)
Experience working in Agile/Scrum environments
Strong documentation and technical writing skills
Company
White Cap Canada