Required Skills

Hadoop Admin

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 9th Aug 2022

JOB DETAIL

• The candidate will be responsible for performing Big Data platform administration and engineering activities on multiple Hadoop, Kafka, HBase, and Spark clusters

• Perform performance tuning and continuously improve operational efficiency

• Monitor platform health, generate performance reports, and drive continuous improvements

• Work closely with development, engineering, and operations teams on key deliverables, ensuring production scalability and stability

• Develop and enhance platform best practices

• Ensure the Hadoop platform effectively meets performance and SLA requirements

• Be responsible for the Big Data production environment, which includes Hadoop (HDFS and YARN), Hive, Spark, Livy, Solr, Oozie, Kafka, Airflow, NiFi, HBase, etc.

• Perform optimization, debugging, and capacity planning for Big Data clusters

• Perform security remediation, automation, and self-healing as required

• Hands-on experience in Hadoop administration, Hive, Spark, and Kafka; experience in maintaining, optimizing, and resolving issues on large-scale Big Data clusters, and in supporting business users and batch processes

• Hands-on experience with NoSQL databases such as HBase is a plus

• Prior experience in Linux/Unix OS services and administration, and in shell and awk scripting, is a plus

• Excellent oral and written communication, analytical, and problem-solving skills

• Self-driven, with a proven track record and the ability to work independently and as part of a team

• Experience with the Hortonworks distribution or open-source Hadoop preferred

Company Information