Required Skills

Hadoop, Spark, Scala Developer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 20th Aug 2022

JOB DETAIL


Hadoop, Spark, Scala Developer

 

Your future duties and responsibilities
• 5+ years of demonstrated ability with Big Data tools and technologies, including work on a Hadoop project in a production environment.
• 2+ years of experience with Spark, PySpark, SQL, Hive, Impala, Oozie, HDFS, Hue, Git, MapReduce and Sqoop.
• 2+ years of programming experience in Scala and application development.
• Experience with Test Driven Development (TDD) and/or Continuous Integration/Continuous Deployment (CI/CD) is a plus.
• Big Data development using the Hadoop ecosystem, including Pig, Hive and other Cloudera tools.
• Analytical and problem-solving skills, applied to a Big Data environment.
• Experience with large-scale distributed applications.
• Experience with Agile methodologies to iterate quickly on product changes, developing user stories and working through the backlog.
• Experience with Cloudera Hadoop distribution components and custom packages is preferred.
• Traditional Data Warehouse/ETL experience.
• Excellent planning, organization, communication and thought leadership skills.
• Ability to learn and apply new concepts quickly.
• Proven ability to mentor and coach junior team members.
• Strong leadership, communication and interpersonal skills.
• Ability to adapt to constant change; a sense of innovation, creativity, organization and autonomy, with quick adaptation to new technologies.
• Capable and eager to work under minimal direction in a fast-paced, energetic environment.


Required qualifications to be successful in this role
• Strong hands-on experience with Hadoop, Python, Mainframe, DB2 and Teradata.
• Experience in analysis, design, development, support and improvement of data warehouse environments with Big Data technologies, with a minimum of 5 years' experience in Hadoop, MapReduce, Sqoop, HDFS, Hive, Impala, Oozie, Hue, Kafka and Yarn.
• 2+ years of experience in PySpark and Spark.
• 2+ years of experience in Scala development.
• Python experience is a plus.
• Experience with Agile methodologies to iterate quickly on product changes.

Education: Bachelor's Degree, or a level of education, training and experience equivalent to a Bachelor's Degree in a related field.

Company Information