
Creighton University · 2 months ago

Senior Data Engineer (Developer)

Creighton University is looking for a Senior Data Engineer who will be an expert in developing and building data management processes from various data sources. The role involves data modeling, data mapping, and constructing data pipelines while ensuring data quality and performance in a collaborative agile team environment.

Higher Education
H1B Sponsor Likely

Responsibilities

Create and maintain data pipelines, jobs, etc. for extracting, transforming, and loading data from diverse sources (e.g., databases, APIs, flat files)
Extract, Transform, Load (ETL): ETL/iPaaS processes are the backbone of data pipelines. Extract data from various sources (e.g., APIs, databases), transform it (cleaning, aggregating, enriching), and load it into storage systems (data warehouses, databases). Ensure data flows smoothly, adhering to best practices and industry standards
Monitoring: Evaluate code and performance to identify opportunities for improvement
Automation: Write code (Python, SQL, or other languages) to automate data management tasks and ensure consistent, reliable data flow. Schedule jobs, handle errors, and monitor pipeline performance
Data Quality: Implement data quality checks to ensure data is accurate, complete, and consistent. Define rules (e.g., for missing values, outliers) and monitor adherence
Deliver support/troubleshooting for operational data integration issues. Debug and provide solutions for performance issues related to data flows
Testing: Conduct unit, integration, and end-to-end tests for data quality. Prepare test data, set up test environments, and validate all components of ETL (Extract, Transform, Load) pipelines and data flows to maintain system integrity
Documentation: Create and maintain comprehensive documentation for data processes, pipelines, architecture, and metadata, following the prevailing standard so that team members and stakeholders can understand the structure and logic of data systems and so that the documentation supports scalability and troubleshooting
Mapping: Conduct mapping of data sources to target systems, describing how data fields correspond between systems to support migration, integration, and transformation efforts
Business Requirements: Work closely with stakeholders to translate business needs into technical requirements, including understanding domain-specific needs, designing corresponding data solutions, and ensuring pipelines and storage meet objectives
Modeling: Collaborate with data architects and stakeholders to create robust data schemas and structures aligned with business requirements
As business needs require, serve as a resource for team members, answering questions and providing solutions for data integration work outside their primary responsibilities
Demonstrate active membership in internal/external organizations like the Association of Jesuit Colleges and Universities (AJCU)
Provide effective communication on matters relating to the job description
Be an active contributor to onboarding and ongoing training of team members
Remain current in the field of expertise to ensure assigned applications remain at the industry forefront
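
The ETL and data-quality duties above can be sketched in Python, one of the languages the posting names. The source data, field names, and validation rules below are hypothetical, chosen only to illustrate the extract → validate → load pattern with rejected-row handling:

```python
import csv
import io
import sqlite3

# Hypothetical flat-file extract standing in for a real data source.
RAW_CSV = """id,name,gpa
1,Alice,3.9
2,Bob,
3,,2.5
4,Dana,4.7
"""

def extract(text):
    """Extract: parse rows from a CSV source into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: apply illustrative data-quality rules
    (no missing name/gpa; gpa within 0.0-4.0) and split
    rows into clean and rejected sets for monitoring."""
    clean, rejected = [], []
    for row in rows:
        if not row["name"] or not row["gpa"]:
            rejected.append((row, "missing value"))
            continue
        gpa = float(row["gpa"])
        if not 0.0 <= gpa <= 4.0:
            rejected.append((row, "gpa out of range"))
            continue
        clean.append((int(row["id"]), row["name"], gpa))
    return clean, rejected

def load(rows, conn):
    """Load: write only validated rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS students "
        "(id INTEGER PRIMARY KEY, name TEXT, gpa REAL)"
    )
    conn.executemany("INSERT INTO students VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
clean, rejected = transform(extract(RAW_CSV))
load(clean, conn)
loaded = conn.execute("SELECT COUNT(*) FROM students").fetchone()[0]
```

In production this skeleton would be run on a schedule by an orchestrator, with the rejected-row list fed into monitoring rather than discarded, matching the automation and data-quality responsibilities listed above.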

Qualifications

ETL/iPaaS tools · SQL · Python · Data quality assurance · Cloud platforms · DBMS · Data modeling · Analytical skills · Problem-solving skills · Interpersonal skills · Communication skills

Required

Bachelor's degree (required) in Computer Science, Data Science, or a related field
5+ years of demonstrated enterprise data engineering experience
Strong experience with ETL/iPaaS tools such as Mulesoft, Talend, Informatica, Boomi, Workato, etc.
Familiarity with DBMSs, ideally Oracle and MS SQL Server
Proficiency in SQL: essential for querying databases and manipulating data
Experience with Python, SSIS, Talend, or other scripting tools for ETL and data analysis
Familiarity with cloud platforms (e.g., AWS, Azure, GCP) for scalable data solutions
Demonstrated ability to develop ETL/ELT jobs, database schemas, tables, indexes, views, queries, and understand the implementation tradeoffs of various methodologies
Strong analytical and problem-solving skills: optimizing queries, handling large datasets, or troubleshooting pipeline issues
Familiarity with continuous integration and continuous deployment (CI/CD)
Demonstrated proficiency at utilizing APIs and flat file loads to move data from source to target
Demonstrated success in data quality assurance, testing, and data tuning
Demonstrated ability to work independently as well as collaboratively with other departments
Ability to thrive in a fast-paced, goal-oriented environment
History of creative problem-solving while adhering to established standards
Coachable and open to feedback
Outstanding interpersonal and communication skills, along with the ability to work effectively within a diverse community

Company

Creighton University

At Creighton, our Jesuit mission shapes our vision.

H1B Sponsorship

Creighton University has a track record of offering H1B sponsorship. Please note that this does not guarantee sponsorship for this specific role. The additional information below is provided for your reference. (Data powered by the US Department of Labor)
Trends of Total Sponsorships
2025 (21)
2024 (26)
2023 (10)
2022 (17)
2021 (21)
2020 (15)

Funding

Current Stage
Late Stage

Leadership Team

John R. Stone, MD, PhD
Co-Founder & Co-Executive Director
Russ Pearlman
Vice President of Information Technology, Chief Information Officer
Company data provided by crunchbase