Required Skills

Big Data, Scala, Spark, Core Java

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 8th Jul 2024

JOB DETAIL

Experience: 5+ years

Client is a leading provider of platforms, digital innovation, artificial intelligence, and end-to-end IT services and solutions for Global 1000 companies.

We are transforming corporations through deep domain expertise, knowledge-based ML platforms, and profound anthropological efforts to understand the end customer and design products and interactions that create delight. We are deeply committed to developing a comprehensive understanding of our clients' problems and building platforms to address them.

We are seeking a highly skilled and motivated Spark Scala Developer to join our dynamic team.

As a Spark Scala Developer, you will play a critical role in the design, development, deployment, and optimization of data processing applications.

Key Responsibilities:

  • Develop and maintain data processing applications using Spark and Scala.

  • Collaborate with cross-functional teams to understand data requirements and design efficient solutions.

  • Implement test-driven development practices to enhance the reliability of applications.

  • Deploy artifacts from lower to higher environments, ensuring smooth transitions.

  • Troubleshoot and debug Spark performance issues to ensure optimal data processing.

  • Work in an agile environment, contributing to sprint planning and development and delivering high-quality solutions on time.

  • Provide essential support for production batches, addressing issues and providing fixes to meet critical business needs.

Skills/Competencies:

  • Strong knowledge of the Scala programming language.

  • Excellent problem-solving and analytical skills.

  • Proficiency in Spark, including the development and optimization of Spark applications.

  • Ability to troubleshoot and debug performance issues in Spark.

  • Understanding of design patterns and data structures for efficient data processing.

  • Familiarity with database concepts and SQL.

  • Java and Snowflake (Good to have).

  • Experience with test-driven development practices (Good to have).

  • Familiarity with Python (Good to have).

  • Knowledge of Databricks (Good to have).

  • Understanding of DevOps practices (Good to have).
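To illustrate the kind of work the role involves, here is a minimal sketch of a Spark Scala batch job using the DataFrame API. It assumes the spark-sql dependency is on the classpath and is submitted via spark-submit; the paths, column names, and job name are illustrative only, not taken from the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal Spark batch job: read raw data, aggregate, write curated output.
// Paths and column names below are hypothetical examples.
object OrdersDailyAggregate {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-daily-aggregate")
      .getOrCreate()

    // Read raw input (Parquet assumed for the example).
    val orders = spark.read.parquet("/data/raw/orders")

    // Keep only completed orders, then aggregate per day.
    val daily = orders
      .filter(col("status") === "COMPLETED")
      .groupBy(col("order_date"))
      .agg(
        sum(col("amount")).as("total_amount"),
        count(lit(1)).as("order_count")
      )

    // Write the curated result, replacing any previous run's output.
    daily.write.mode("overwrite").parquet("/data/curated/orders_daily")

    spark.stop()
  }
}
```

Keeping the transformation logic in small, pure DataFrame functions like the filter/groupBy chain above makes it straightforward to unit-test with a local SparkSession, which supports the test-driven practices mentioned in the responsibilities.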

Company Information