Leidos · 22 hours ago
Kafka Cloud Architect
Leidos is seeking a Kafka Cloud Architect to help lead SSA’s Digital Modernization Strategy. The role involves architecting, designing, coding, and implementing next-generation data streaming and event-based architecture while mentoring team members and influencing the development of data stream solutions.
Computer · Government · Information Services · Information Technology · National Security · Software
Responsibilities
Architect, design, code, and implement a next-generation data streaming and event-based architecture/platform using system/software engineering best practices, modern software development, and the latest technologies: Confluent Kafka, Apache Flink, and Kafka Connect
Work alongside customers to determine expanded use of Kafka within the Agency
Strategize within Leidos to set up opportunities to explore new technologies to use with Kafka
Define the strategy for streaming data to the data warehouse and for integrating the event-based architecture with microservice-based applications
Establish Kafka best practices and standards for implementing the Kafka platform based on identified use cases and required integration patterns
Mentor existing team members by imparting expert knowledge to build a high-performing team in our event-driven architecture. Assist developers in choosing correct patterns, modeling events, and ensuring data integrity
Provide software expertise in one or more of these areas: application integration, enterprise services, service-oriented architecture (SOA), security, business process management/business rules processing, data streaming/event driven design, or data ingestion/data modeling
Triage, investigate, and advise in a hands-on capacity to resolve platform issues regardless of component
Influence the development of data stream solutions that impact strategic project/program goals and business results. Recommend and develop new technical solutions, products, and/or standards in support of the function's strategy and operations. Lead and manage the work of other technical staff that has significant impact on project results/outputs
Brief management, customers, the team, or vendors in writing or orally at the appropriate technical level for the audience
All other duties as assigned or directed
Qualifications
Required
Must be able to obtain and maintain a Public Trust clearance (contract requirement)
Bachelor's degree in computer science, mathematics, engineering, or a related field with 12 years of relevant experience or master's degree with 10 years of relevant experience. Additional years of experience may be substituted/accepted in lieu of degree
12+ years of experience with modern software development including systems/application analysis and design
7+ years of combined experience with Kafka (Confluent Kafka and/or Apache Kafka)
2+ years of combined experience with designing, architecting, and deploying to AWS cloud platform
1+ years of leading a technical team
Selected candidate must be willing to work on-site in Woodlawn, MD 5 days a week
Expert-level, hands-on production experience with Confluent Kafka, including capacity planning, installation, and administration/platform management, and a deep understanding of Kafka architecture and internals
Expert in Kafka cluster and application security
Strong knowledge and experience with Event Driven Architecture (EDA)
Expert experience in data pipelines, data replication, and/or performance optimization
Kafka installation & partitioning on OpenShift or Kubernetes, topic management, HA & SLA architecture
Strong knowledge and application of microservice design principles and best practices: distributed systems, bounded contexts, service-to-service integration patterns, resiliency, security, networking, and/or load balancing in large mission critical infrastructure
Expert experience with Kafka Connect, KStreams, and KSQL, with the ability to use each effectively for different use cases
Hands-on experience with scaling Kafka infrastructure including Broker, Connect, ZooKeeper, Schema Registry, and/or Control Center
Hands-on experience in designing, writing, and operationalizing new Kafka Connectors
Solid experience with data serialization using Avro and JSON and data compression techniques
Experience with AWS services such as ECS, EKS, Amazon Managed Service for Apache Flink, Amazon RDS for PostgreSQL, and/or S3
Basic knowledge of relational databases (PostgreSQL, DB2, or Oracle), SQL, and ORM technologies (JPA2, Hibernate, and/or Spring JPA)
Preferred
Experience creating a disaster recovery strategy
Experience with Domain Driven Design (DDD)
AWS cloud certifications
Continuous integration/continuous delivery (CI/CD) best practices and use of DevOps to accelerate quality releases to production
Experience with PaaS using Red Hat OpenShift/Kubernetes and Docker containers
Experience with configuration management tools (Ansible, CloudFormation / Terraform)
Solid experience with Spring Framework (Boot, Batch, Cloud, Security, and Data)
Solid knowledge with Java EE, Java generics, and concurrent programming
Solid experience with automated unit testing, TDD, BDD, and associated technologies (Junit, Mockito, Cucumber, Selenium, and Karma/Jasmine)
Working knowledge of the open-source visualization platform Grafana and the open-source monitoring system Prometheus, and their use with Kafka
Benefits
Health and Wellness programs
Income Protection
Paid Leave
Retirement
Company
Leidos
Leidos is a Fortune 500® innovation company rapidly addressing the world’s most vexing challenges in national security and health.
Funding
Current Stage: Public Company
Total Funding: unknown
2025-02-20: Post-IPO Debt
2013-09-17: IPO
Recent News
MarketScreener · 2025-12-16
Company data provided by crunchbase