Required Skills

Hadoop

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Jan 2026

JOB DETAIL

Apache Hadoop


Nice-to-have skills: Shell scripting, Python, Hive, Apache Sentry, Impala, Hadoop Administration, Delivery Management, Risk Management, Project Stakeholder Management, Cloudera Data Platform, Hadoop/Big Data Admin

We are seeking a Sr. Developer (Sr. Associate - Projects) with 6-10 years of experience, specializing in Apache Hadoop ecosystems and Hive, to join our dynamic team.
This role involves developing high-quality solutions, optimizing data processing, and ensuring data security.
The ideal candidate will contribute to our mission by enhancing our data capabilities, thereby positively impacting our global financial services.

Roles & Responsibilities
  • Perform Big Data administration and engineering activities on multiple Hadoop, Kafka, HBase, and Spark clusters

  • Work on performance tuning and increase operational efficiency on a continuous basis; monitor platform health, generate performance reports, and provide continuous improvements

  • Work closely with development, engineering, and operations teams on key deliverables, ensuring production scalability and stability

  • Develop and enhance platform best practices

  • Ensure the Hadoop platform can effectively meet performance SLA requirements

  • Take responsibility for the Big Data production environment, which includes Hadoop (HDFS and YARN), Hive, Spark, Livy, Solr, Oozie, Kafka, Airflow, NiFi, HBase, etc.

  • Perform optimization, debugging, and capacity planning of Big Data clusters

  • Perform security remediation, automation, and self-healing as per requirements


Company Information