Shoreline AI
Senior Cloud Data Platform Engineer
Shoreline AI is an industrial AI/IoT startup providing a cloud-native, subscription-based SaaS solution to optimize asset performance and operational efficiency. They are seeking a Senior Cloud Data Platform Engineer/Architect to build and maintain the data infrastructure and platform that powers their product.
Responsibilities
Design, implement, and maintain a scalable and secure data lake to handle both structured and semi-structured data, implement flexible data governance, and provide secure access to Data Scientists and Software Developers
Design and build the 'Data API' on top of the data lake platform, giving developers easy programmatic access to the available data for processing, analytics, and visualization
Create data pipelines to ingest, clean, and transform data from multiple sources
Develop a strategy for easy creation and deployment of containerized applications
Develop and maintain internal tools and frameworks for data ingestion using Python and SQL
Monitor data pipelines and cloud infrastructure for availability, low latency, and data correctness
Collaborate cross-functionally to define data models, contracts, schemas, access, and retention policies
Embrace software development and deployment best practices, including continuous integration/continuous deployment (CI/CD), Infrastructure-as-Code (IaC), and automated testing
Learn and adapt to new cloud technologies and development best practices
Maintain a strong customer-first attitude and ensure that all technical solutions focus on delivering customer delight
Participate in architecture, design, and code reviews, and maintain a high standard of quality, testing, documentation, and compliance with security standards
Qualifications
Required
3+ years of experience architecting, designing, developing, and implementing cloud solutions on AWS
Platform-builder mindset, demonstrated by experience defining and building APIs and tools that help other developers be productive
Demonstrated ability to work with AI-based coding tools (e.g., Cursor, Claude Code, Gemini CLI) to accelerate learning, architecture and project planning, and the implementation of code and tests
Deep understanding of SQL and modern data lake architectures (e.g., using Parquet, Iceberg, or Delta Lake)
Experience working with real-time or batch data ingestion at scale, and designing fault-tolerant ETL/ELT pipelines
Familiarity with event-driven architectures and messaging systems like Kafka or Kinesis
Hands-on experience with AWS services including but not limited to: S3, Lambda, API Gateway, Glue, Kinesis, Athena, and RDS
Excellent collaboration and communication skills and the ability to work with remote teams