Noblesoft Solutions
Hadoop Developer (W2 only; local candidates only)
Noblesoft Solutions is seeking a Hadoop Developer with extensive experience in big data technologies. The role involves technical design and coding, primarily focusing on Hadoop, Spark, and cloud-based platforms, while also requiring strong communication skills and leadership experience.
Responsibilities
Hadoop
Spark
Most recent version of HBase
Experience with NoSQL databases (e.g., MongoDB)
Spark/Kafka streaming (see the streaming sketch after this list)
Scala
DB2
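For candidates gauging the Spark/Kafka streaming expectation above, the following is a minimal Structured Streaming sketch in Scala: it subscribes to a Kafka topic and echoes records to the console. The broker address, topic name, and checkpoint path are illustrative placeholders (not details from this posting), and the spark-sql-kafka connector is assumed to be on the classpath.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-stream-sketch")
      .getOrCreate()

    // Subscribe to a Kafka topic as a streaming DataFrame; broker and topic are placeholders.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()
      .select(col("key").cast("string"), col("value").cast("string"))

    // Echo records to the console; a production job would write to HBase, DB2, or Snowflake instead.
    val query = events.writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/events")
      .start()

    query.awaitTermination()
  }
}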
Qualifications
Required
Hadoop
Spark
Most recent version of HBase
Experience with NoSQL databases (e.g., MongoDB)
Spark/Kafka streaming
Scala
DB2
5+ years of related work experience, including professional experience with technical design and coding in the IT industry
Strong verbal and written communication skills
Lead experience with Cloudera Data Platform (CDP)
Solid experience working on cloud-based platforms: well-versed in cloud architecture, deployment, and management of big data applications, with a strong understanding of cloud security, scalability, and cost optimization. Experience with cloud storage services such as Amazon S3 and Azure Blob Storage
Working knowledge of Snowflake architecture, data modeling, and data warehousing best practices, with hands-on Snowflake experience
Ability to analyze existing Spark code, identify areas of improvement, and refactor the code to leverage Snowflake's data warehousing capabilities (illustrated in the sketch after this list)
Experience with data transformation, data quality, and data validation
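As a rough illustration of the Spark-to-Snowflake refactoring called out above, the sketch below writes a Spark DataFrame to a Snowflake table through the Snowflake Spark connector. It assumes the spark-snowflake connector is available on the classpath, and every connection option, path, and table name is a placeholder rather than a detail of this role.

import org.apache.spark.sql.{SaveMode, SparkSession}

object SparkToSnowflakeSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("spark-to-snowflake-sketch")
      .getOrCreate()

    // Connection options for the Snowflake Spark connector; all values are placeholders.
    val sfOptions = Map(
      "sfURL"       -> "myaccount.snowflakecomputing.com",
      "sfUser"      -> sys.env.getOrElse("SNOWFLAKE_USER", ""),
      "sfPassword"  -> sys.env.getOrElse("SNOWFLAKE_PASSWORD", ""),
      "sfDatabase"  -> "ANALYTICS",
      "sfSchema"    -> "PUBLIC",
      "sfWarehouse" -> "COMPUTE_WH"
    )

    // Read an existing Parquet dataset and push it into a Snowflake table,
    // replacing a hand-rolled export step in legacy Spark code.
    val orders = spark.read.parquet("/data/orders")

    orders.write
      .format("net.snowflake.spark.snowflake")
      .options(sfOptions)
      .option("dbtable", "ORDERS")
      .mode(SaveMode.Overwrite)
      .save()

    spark.stop()
  }
}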
Preferred
Experience with Snowflake features such as data clustering, caching, and query optimization
Experience migrating existing Spark code to write data to Snowflake
Familiarity with Snowflake's APIs and SDKs for seamless integration with Spark
Ability to design and implement ETL workflows that leverage Snowflake's compute capabilities, including but not limited to Snowflake's SQL, Python, and Scala APIs
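To illustrate the kind of ETL workflow described in the last item, here is a minimal sketch using the Snowpark Scala API, assuming the Snowpark library is on the classpath; the connection settings and table names are placeholders. The aggregation runs inside Snowflake's own compute rather than in a Spark cluster.

import com.snowflake.snowpark.{SaveMode, Session}
import com.snowflake.snowpark.functions._

object SnowparkEtlSketch {
  def main(args: Array[String]): Unit = {
    // Snowpark session configuration; every value here is a placeholder.
    val configs = Map(
      "URL"       -> "https://myaccount.snowflakecomputing.com",
      "USER"      -> sys.env.getOrElse("SNOWFLAKE_USER", ""),
      "PASSWORD"  -> sys.env.getOrElse("SNOWFLAKE_PASSWORD", ""),
      "WAREHOUSE" -> "COMPUTE_WH",
      "DB"        -> "ANALYTICS",
      "SCHEMA"    -> "PUBLIC"
    )
    val session = Session.builder.configs(configs).create

    // The transformation executes in Snowflake: aggregate a raw table
    // and materialize the result as a new table.
    val orders = session.table("RAW_ORDERS")
    val totals = orders
      .groupBy(col("REGION"))
      .agg(sum(col("AMOUNT")).as("TOTAL_AMOUNT"))

    totals.write.mode(SaveMode.Overwrite).saveAsTable("ORDER_TOTALS_BY_REGION")

    session.close()
  }
}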