Required Skills

SDLC, Scala, Zookeeper, Kafka, Impala, Spark, Hadoop

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 18th Nov 2020

JOB DETAIL

  • At least 3-5 years of experience with the Software Development Life Cycle (SDLC)
  • At least 3-5 years of experience working on a big data platform (Hadoop, Spark, Impala, NiFi, Kafka, Zookeeper, and Scala)
  • At least 2-3 years of experience with Spark
  • At least 3 years of experience working with unstructured datasets
  • At least 1 year of experience building data pipelines, CI/CD pipelines, and fit-for-purpose data stores
  • At least 1 year of Agile experience
  • Good understanding of data lake fundamentals
  • Technical proficiency with data integration and data lake design patterns
  • Strong communication, collaboration, and multi-tasking abilities
  • Experience with Agile methodologies
  • Healthcare experience preferred
  • Flexibility to work with an offshore team




Company Information