VSP Vision Care
Senior Data Engineer (Snowflake)
VSP Vision Care, a provider of vision care solutions, is seeking a Senior Data Engineer to design, build, and optimize data pipelines for key analytics capabilities. The role involves collaborating with cross-functional teams to create reliable data processes and drive automation of data integration tasks.
Industry: Insurance
Responsibilities
Design, build, and optimize data pipelines for key data and analytics capabilities in the enterprise
Collaborate within an agile, multi-disciplinary team to deliver optimal data integration and transformation solutions
Analyze data requirements (functional and non-functional) to design and develop robust, scalable, automated, fault-tolerant data pipeline solutions for business and technology initiatives
Profile data to assess the accuracy and completeness of data sources and provide feedback in data gathering sessions
Design, build, maintain, and operationalize data pipelines for high volume and complex data using appropriate tools and practices in development, test, and production environments
Develop and design data mappings, programs, routines, and SQL to acquire data from legacy, web, cloud, and purchased package environments into the analytics environment
Understand and apply ELT, ETL, data virtualization, and other methods to balance minimal data movement against performance, and mentor others on their appropriate use (a minimal sketch follows this list)
Drive automation of data pipeline preparation and integration tasks to minimize manual and error-prone processes and improve productivity using modern data preparation, integration, and AI-enabled metadata management tools and techniques
Leverage auditing facilities to monitor data quality and detect emerging issues
Deploy transformation rules to cleanse data against defined rules and standards
Participate in architecture, governance, and design reviews, identifying opportunities and making recommendations
Participate in health check assessments of the existing environment and evaluations of emerging technologies
Collaborate with architects to design and model application data structures, storage, and integration in accordance with enterprise-wide architecture standards across legacy, web, cloud, and purchased package environments
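To make the ELT responsibility above concrete, here is a minimal, hypothetical sketch in Python using the Snowflake connector: raw data is landed as-is and the transformation is pushed down to the warehouse. The connection settings, stage, and table names (CLAIMS_LANDING, CURATED.CLAIMS, etc.) are illustrative assumptions, not details from this posting.

```python
# Minimal ELT-style sketch: land raw data in Snowflake, then transform with SQL
# pushed down to the warehouse. All connection parameters, database, schema,
# stage, and table names below are hypothetical placeholders.
import os

import snowflake.connector


def run_elt() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="ANALYTICS_DB",    # hypothetical database
        schema="RAW",               # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # "E" and "L": copy staged files into a raw landing table unchanged.
        cur.execute("""
            COPY INTO RAW.CLAIMS_LANDING
            FROM @RAW.CLAIMS_STAGE
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """)
        # "T": transform inside the warehouse (minimal data movement) by
        # cleansing and deduplicating into a curated table.
        cur.execute("""
            CREATE OR REPLACE TABLE CURATED.CLAIMS AS
            SELECT DISTINCT
                   claim_id,
                   TRY_TO_DATE(service_date)            AS service_date,
                   UPPER(TRIM(provider_name))           AS provider_name,
                   TRY_TO_NUMBER(billed_amount, 10, 2)  AS billed_amount
            FROM RAW.CLAIMS_LANDING
            WHERE claim_id IS NOT NULL
        """)
    finally:
        conn.close()


if __name__ == "__main__":
    run_elt()
```

Keeping the transformation in SQL inside Snowflake is one way to satisfy the "minimal data movement versus performance" trade-off described above.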
Qualifications
Required
Bachelor's degree in computer science, data science, statistics, economics, or related functional area; or equivalent experience
Excellent written and verbal communication skills with the ability to gather requirements and effectively communicate technical concepts and ideas to all levels of employees and management
6+ years of experience working on a development team providing analytical capabilities
6+ years of hands-on experience in the data space spanning data preparation, SQL, integration tools, ETL/ELT/data pipeline design
SQL coding experience
Experience working in an agile development environment (Scrum, Kanban) with a focus on Continuous Integration and Delivery
Knowledge of various data architectures, patterns, and capabilities such as event-driven architecture, real-time data flows, non-relational repositories, data virtualization, cloud storage, etc.
Knowledge of and experience with multiple data integration platforms (IBM InfoSphere DataStage, Oracle Data Integrator, Informatica PowerCenter, MS SSIS, AWS Glue, Denodo) and data warehouse MPP platforms such as Snowflake, Netezza, Teradata, Redshift, etc.
Familiarity with DataOps practices and their application within analytics environments, as well as how they extend data and analytics capabilities to other operational systems and consumers
Familiarity with event store and stream processing (Apache Kafka and platforms like Confluent) and with API development and management platforms (MuleSoft, Axway) is beneficial
Capable of focusing on a specific set of tasks while ensuring alignment with a broader strategic design
Preferred
Snowflake certification or strong hands-on experience with Snowflake (data modeling, performance tuning, security, and optimization)
Experience with DataStage (ETL development and maintenance)
Good understanding of and practical experience with Data Vault modeling (Raw Vault, Business Vault, Information Marts)
Experience with workflow orchestration tools such as Apache Airflow (see the sketch after this list)
Proficiency with GitHub (version control, branching strategies, code reviews, CI/CD integration)
Experience building and maintaining ETL/ELT pipelines
Strong SQL and data transformation skills
Familiarity with cloud platforms (AWS/Azure/GCP)
Understanding of data warehousing concepts, performance tuning, and best practices
Ability to provide technical leadership, mentor junior engineers, and support production issue resolution
Ability to participate in Agile ceremonies and contribute to continuous improvement initiatives
Ability to work in Agile/Scrum environments
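As a companion to the orchestration and Snowflake items above, the following is a minimal sketch of an Apache Airflow DAG, assuming a recent Airflow 2.x release; the DAG id, schedule, and task callables are hypothetical placeholders rather than an actual VSP pipeline.

```python
# Hypothetical Airflow DAG sketch: orchestrate an extract/load step followed by
# a warehouse-side transform. Names and schedule are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load(**context) -> None:
    """Placeholder: pull source extracts and COPY them into a raw landing table."""
    ...


def transform(**context) -> None:
    """Placeholder: run warehouse-side SQL to build curated tables."""
    ...


with DAG(
    dag_id="claims_elt_daily",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["elt", "snowflake"],
) as dag:
    load = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
    curate = PythonOperator(task_id="transform", python_callable=transform)

    load >> curate  # run the transform only after the load succeeds
```

Splitting load and transform into separate tasks keeps failures isolated and lets each step be retried independently.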
Benefits
Eligible bonuses and commissions
VSP Vision benefits
Company
VSP Vision Care
We help people see by delivering what matters most to our members—quality care, personalized attention, and the best choices in eyewear at the lowest out-of-pocket cost.
H1B Sponsorship
VSP Vision Care has a track record of offering H1B sponsorship. Please note that this does not guarantee sponsorship for this specific role. Additional information is provided below for your reference. (Data powered by the US Department of Labor)
Total sponsorships by year: 2025 (20), 2024 (16), 2023 (8), 2022 (21), 2021 (16), 2020 (15)
Funding
Current Stage: Late Stage
Recent News
Sacramento Business Journal (2024-02-03)
Business Journals (2022-07-12)
Company data provided by Crunchbase