
Galent · 3 hours ago

Data Quality Engineer

Galent is a technology solutions company focused on digital transformation, and they are seeking a Data Quality Engineer to establish a data quality function for their Snowflake TD migration. The role involves implementing governance frameworks, automating data pipelines, and ensuring the integrity and accessibility of data products.

Computer Software
Hiring Manager
Aravin Kumar

Responsibilities

Stand up a Minimum Viable Data Quality function to support the Snowflake TD migration, ensuring the trustworthy, accessible, and secure data necessary to successfully launch an MVP Data Product Marketplace
Roll out a federated, lightweight governance framework focusing on quality standards, policies, and tooling
Identify and align to a North Star for data quality (level 90) and enable the availability of data products
Utilize Snowflake-native DQ and observability capabilities to ensure high-quality data and move away from legacy tools
Ensure governance integrates with existing systems such as the EDAG RBAC customer application
Conduct a current-state analysis focusing on existing engineering services and data quality
Define a North Star through identification of key metrics and targets
Use DQ dimensions (e.g., accuracy, completeness, consistency, timeliness)
Align and socialize the North Star target with data owners and data stewards across the organization
Develop a lightweight, federated governance framework, with an emphasis on standards, policies, and Snowflake tooling, that can be socialized across the organization
Support the implementation of Data Security, Privacy, and Role-Based Access Controls
Enable Snowflake's native data quality and observability capabilities to provide the monitoring and assurance foundation for an MVP Data Product Marketplace
Identify and automate data pipelines to support data quality
Provide best-practice considerations related to Snowflake configuration, accounts, data governance, security guidance, data management, and other topics as directed
Defining the business value (ROI) of improved data quality
Establishing clear Data Ownership and Stewardship roles to be accountable for DQ targets
Assessing data ownership coverage across critical domains
Transparency & Context:
Organizational alignment on metadata standards for tracking data characteristics (DQ dimensions, rules, metrics)
Ensuring metadata availability to end users to provide the context needed for proper data use, supporting DQ usability
Assessing metadata standards to ensure consistent understanding and sharing
Accessibility & Usage:
Evaluating how data rights and access are captured (single source of truth), whether in a manual or automated fashion
Integrity & Confidentiality:
Evaluating how data rights and access are captured (single source of truth) to prevent unauthorized changes that compromise data integrity
Data Lifecycle:
Evaluating the Data Lifecycle Management framework to ensure data is retired, archived, or purged according to defined policies, preventing the use of obsolete data and supporting DQ timeliness
Assessing adherence to data retention policies to ensure data is available only while valid
Data & Technical Architecture:
Evaluating selected architecture patterns to ensure they support data democratization without introducing quality decay
Reviewing lineage discovery and recording across environments to enable rapid root-cause analysis of quality issues
Snowflake Governance:
Accountability
Cataloging & Classification
Accessibility & Usage
Protection & Privacy
Data Lifecycle
Data & Technical Architecture
Capabilities assessment of the current DG/DQ program and prioritization of capabilities to move the program forward
Classification Framework / Tagging Framework
PHI Management Framework / Tagging Framework
Data Architecture Design
Data Pipeline Development
Failover Playbook
Build the Data Sharing Enablement implementation, leveraging Snowflake key considerations and best practices to mitigate technical debt
Snowflake Platform Capabilities: data storage; cross-border data movement; metadata; classification and tagging; data lineage; Marketplace catalog; Identity and Access Management; Role-Based Access Control (RBAC); external tokenization; row access policies; Dynamic Data Masking; tag-based policies; data sharing; encryption in transit and at rest; column-level encryption
Accessibility & Usage features: eliminating data silos with a single source of truth; minimizing data movement; Snowflake regional high availability; replication and failover; client redirect
Conduct current-state analysis, define the North Star vision, and establish the business case and value metrics for governance
Create and socialize a lightweight, federated governance framework, establishing roles, ownership, and organizational alignment on standards
Define and ensure metadata availability for self-service, implement data retention policies, and record end-to-end data lineage
Support the implementation of Data Security, Privacy, RBAC, and DLP policies, including encryption and access-rights management
Enable Snowflake-native data quality and observability capabilities and automate data pipelines to enhance data quality
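For context on the DQ dimensions named above (accuracy, completeness, consistency, timeliness): in practice these are operationalized as row-level checks against target thresholds. A minimal, hypothetical Python sketch of two such checks follows; the record fields, thresholds, and function names are illustrative, not from the posting, and production checks would run against Snowflake tables rather than in-memory rows.

```python
from datetime import datetime, timedelta, timezone

# Illustrative records; in a real pipeline these would be queried from Snowflake.
now = datetime.now(timezone.utc)
records = [
    {"id": 1, "email": "a@example.com", "updated_at": now},
    {"id": 2, "email": None,            "updated_at": now - timedelta(days=40)},
    {"id": 3, "email": "c@example.com", "updated_at": now - timedelta(days=2)},
]

def completeness(rows, field):
    """Fraction of rows where `field` is populated (DQ completeness dimension)."""
    return sum(r[field] is not None for r in rows) / len(rows)

def timeliness(rows, field, max_age_days):
    """Fraction of rows whose `field` timestamp is within `max_age_days` (DQ timeliness)."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    return sum(r[field] >= cutoff for r in rows) / len(rows)

print(f"completeness(email): {completeness(records, 'email'):.2f}")
print(f"timeliness(30 days): {timeliness(records, 'updated_at', 30):.2f}")
```

A "North Star" target like level 90 would then translate into asserting that each such metric stays at or above 0.90 for the governed domains.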

Qualifications

Snowflake · Data Quality Governance · Data Pipeline Development · Data Lifecycle Management · Data Security · Data Architecture Design · Data Lineage · Metadata Management

Required

Snowflake

Company

Galent

Galent is an AI-native digital engineering firm at the forefront of the AI revolution, dedicated to delivering unified, enterprise-ready AI solutions that transform businesses and industries.

Funding

Current Stage
Late Stage
Company data provided by Crunchbase