Required Skills

PySpark, Hive, Hadoop, HBase, Spark, NiFi, Scala, Kafka, Python

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 13th Oct 2023

JOB DETAIL

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall 4 to 10 years of software development experience and hands-on working knowledge of Big Data technologies such as PySpark, Hive, Hadoop, HBase, Spark, NiFi, Scala, Kafka, and Python
  • Excellent knowledge of Java, SQL, and Linux shell scripting
  • Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively
  • Proven experience coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
  • Ability to manage a diverse and challenging stakeholder community
  • Coordinate SIT and UAT testing; gather feedback and provide necessary remediation/recommendations on time.
  • Drive small projects individually and coordinate changes and deployments on time

Company Information