
MUFG · 10 hours ago

Data Integration

MUFG is one of the world’s leading financial groups, dedicated to making a difference for every client and community they serve. The role involves architecting and executing cloud-based solutions, managing data integration, and ensuring compliance with regulatory data aggregation and reporting standards.

Financial Services

Responsibilities

Architecting and executing new cloud-based solutions aligned with enterprise standards; overseeing decommissioning of legacy platforms and seamless migration to modern infrastructure; and collaborating with infrastructure, security, and application teams to ensure scalable and secure deployments
Supporting and implementing Tier 1 and Tier 2 reports with high availability and accuracy; leading BCBS 239 compliance efforts, ensuring data aggregation and reporting meet regulatory expectations; and partnering with business stakeholders to align reporting outputs with decision-making needs
Ensuring Data Management and Control Implementation Framework (DMCIF) controls are properly implemented, documented, and tested; driving periodic control assessments and remediation plans in coordination with audit and risk teams; and maintaining traceability and transparency across control execution and reporting pipelines
Driving end-to-end cloud implementations, managing legacy platform migrations to a modern tech stack, and ensuring robust reporting and regulatory compliance across Tier 1 and Tier 2 domains
Designing, developing and implementing data warehouse solutions for the Bank utilizing expertise in enterprise data architecture, designs, solutions and technologies
Functioning as a Subject Matter Expert (SME) for technical applications relating to banking, including loans, deposits, treasury
Performing API-based data integration between data consumers and the Enterprise Data Platform in Cloud
This Enterprise Data Platform build-out involves bringing in the Bank’s commercial banking products to support regulatory reporting and the BCBS 239 Risk Aggregation Framework
Leading, planning, designing, documenting, developing, testing, implementing, monitoring, maintaining and supporting enterprise data warehouse and data mart solutions, including subject area marts and interfaces to downstream applications
Liaising with internal stakeholders to assess business, technical, quality and security requirements to achieve intended outcomes in accordance with Bank processes, standards and procedures
Developing and maintaining complex ETL mappings, workflows and Unix shell scripts in a normalized/denormalized data warehouse/data mart environment based on technical specifications and other supporting documentation, utilizing Informatica PowerCenter, Unix Shell Scripts, advanced SQL and Tidal Enterprise Scheduler
Supporting Business initiatives and day-to-day activities across the entire spectrum of our Bank, including all Front Office, Middle Office and Back Office users as well as Credit Risk, Finance, Compliance, Comptroller and Tokyo Head Office personnel, where it is used for reporting and investigation purposes
Performing unit testing, System Integration Testing, User Acceptance Testing, Disaster Recovery and regression testing activities
Ensuring best practices, standards, processes and procedures are followed
Supporting Information Management and other Information Technology Department initiatives
Directing, monitoring and assigning work to those on the team
Leading the team in the preparation of functional requirements and design documents, conducting feasibility and cost-benefit studies, and applying a comprehensive knowledge of the Bank's business and system flow within the organization
Working on large, complex projects that have enterprise wide impact and require subject matter expertise of multiple process improvement areas and mastery of process improvement tools
Leading the API-based data integration between data consumers and the Enterprise Data Platform in Cloud using AWS API Gateway, Python, PostgreSQL, Kafka, Amazon MQ (ActiveMQ), Informatica, WhereScape
Coordinating with the development groups to ensure data accuracy to business analysts, leadership groups, and other end users to aid in ongoing operational insights
Conducting training and knowledge-sharing sessions, and ensuring best practices and standards are followed
Leading and designing Data Migration process and activities from current warehouses to Data Lake
Supervising 8-10 Data Developers and Analysts and ensuring timely delivery of migration milestones, cloud onboarding and reporting enhancements

Qualifications

Enterprise Data Warehouse, ETL using Informatica, Cloud Solutions, Unix Shell Scripting, Python, AWS, Data Migration, Team Leadership, Communication Skills, Problem Solving

Required

Bachelor's Degree in Computer Science, Information Technology, Electronic Engineering or a related field (or foreign equivalent degree)
7 years of technical software development experience implementing an Enterprise Data Warehouse with ETL using Informatica
5 years of experience must be in the banking or financial services domain with ETL development using Informatica PowerCenter
Unix Shell Scripting
Designing relational and dimensional databases
Developing and scheduling batch jobs using Cisco Tidal Enterprise Scheduler or Autosys
1 year of experience must include core banking products & services (loans, deposits and treasury)
Developing scripts using Python, HIVE or AWS
Required to work nights and weekends and be on-call during non-business hours for technical support and maintenance purposes
Position requires employment in-office 4 days per week and remotely 1 day per week

Company

MUFG (Mitsubishi UFJ Financial Group) is one of the world's leading financial groups.

Funding

Current Stage
Late Stage

Leadership Team

Greidy Puig
Vice President Finance- Business Unit CFO
Mark Fernandez
Vice President, Global Markets CFO Office
Company data provided by crunchbase