Required Skills

Big Data Hadoop

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 12th Jul 2022

Job Detail

 

 • Person will be responsible for performing Big Data platform administration and engineering activities on multiple Hadoop, Kafka, HBase, and Spark clusters

 

 • Work on performance tuning and increase operational efficiency on a continuous basis

 

• Monitor the health of the platforms, generate performance reports, and provide continuous improvements

 

• Work closely with development, engineering, and operations teams on key deliverables, ensuring production scalability and stability

 

• Develop and enhance platform best practices

 

• Ensure the Hadoop platform can effectively meet performance & SLA requirements 

 

• Responsible for the Big Data production environment, which includes Hadoop (HDFS and YARN), Hive, Spark, Livy, SOLR, Oozie, Kafka, Airflow, NiFi, HBase, etc.

 

• Perform optimization, debugging, and capacity planning of Big Data clusters

 

• Perform security remediation, automation, and self-healing as required

 

• Hands-on experience in Hadoop administration, Hive, Spark, and Kafka; experience in maintaining, optimizing, and resolving issues on large-scale Big Data clusters, supporting business users and batch processes

 

• Hands-on experience with NoSQL databases (HBase) is a plus

Company Information