Required Skills

Hadoop, Spark

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 10th May 2022

JOB DETAIL

  • 3-8 years of professional Big Data Hadoop development experience is preferred.
  • Expertise with Big Data ecosystem services such as Spark (Scala/Python), Hive, Kafka, and Unix, and experience with any cloud stack, preferably GCP (BigQuery, Dataproc) or AWS (Glue, EMR, Redshift).
  • Object-oriented programming and component-based development with Java.
  • Experience working with large cloud data lakes.
  • Experience with large-scale data processing, complex event processing, and stream processing.
  • Experience working with CI/CD pipelines, source code repositories, and operating environments.
  • Experience working with both structured and unstructured data, with strong SQL knowledge.
  • Experience designing and implementing scalable ETL/ELT processes and modeling data for low-latency reporting.
  • Experience in performance tuning, troubleshooting and diagnostics, process monitoring, and profiling.
  • Understanding of containerization, virtualization, and cloud computing.

PREFERRED EXPERIENCE 

  • Experience working in the Scrum Agile software development framework.

  • Ability to work in a fast-paced environment with evolving requirements and capability goals.


Company Information