Required Skills

Apache Pig, Apache Hive, Apache Mahout

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 5th Mar 2024

JOB DETAIL

  • Deploying and maintaining the Hadoop cluster; adding and removing nodes using cluster monitoring tools such as Ganglia, Nagios, or Cloudera Manager; configuring NameNode high availability.
  • Keeping track of all the running Hadoop jobs.
  • Implementing, managing, and administering the overall Hadoop infrastructure.
  • Taking care of the day-to-day running of Hadoop clusters.
  • Working closely with the database, network, BI, and application teams to make sure that all big data applications are highly available and performing as expected.
  • Knowledge of the Apache distribution, where all configurations must be set up manually: core-site.xml, hdfs-site.xml, yarn-site.xml, and mapred-site.xml.
  • Awareness of configuration setup using Hadoop distributions such as Hortonworks, Cloudera, or MapR.
  • Capacity planning: estimating the requirements for scaling the Hadoop cluster up or down, and sizing the cluster based on the data to be stored in HDFS.
  • Ensuring that the Hadoop cluster is up and running at all times; monitoring cluster connectivity and performance.
  • Managing and reviewing Hadoop log files; performing backup and recovery tasks; handling resource and security management.
  • Troubleshooting application errors.
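The posting notes that on an Apache distribution the site files must be maintained by hand. As a minimal illustration, the fragments below show the shape of two of those files; the hostname, port, and paths are placeholders, not values from the posting:

```xml
<!-- core-site.xml: points clients at the default filesystem -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode-host:9000</value> <!-- placeholder host/port -->
  </property>
</configuration>

<!-- hdfs-site.xml: replication factor and NameNode metadata directory -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/data/hdfs/namenode</value> <!-- placeholder path -->
  </property>
</configuration>
```

Equivalent properties also exist in yarn-site.xml and mapred-site.xml; distribution tooling such as Cloudera Manager generates these files instead of requiring manual edits.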
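The capacity-planning duty above largely comes down to simple sizing arithmetic. A minimal sketch follows; the replication factor, compression ratio, non-DFS overhead, and disk-per-node figures are illustrative assumptions, not requirements from the posting:

```python
import math

def estimate_raw_capacity_tb(data_tb, replication=3, compression_ratio=1.0,
                             non_dfs_overhead=0.25):
    """Raw disk (TB) needed to hold `data_tb` of logical data in HDFS.

    replication       -- HDFS block replication factor (commonly 3)
    compression_ratio -- on-disk size / logical size after compression
    non_dfs_overhead  -- fraction of disk reserved for OS, logs, temp space
    """
    stored = data_tb * compression_ratio * replication
    return stored / (1.0 - non_dfs_overhead)

def estimate_node_count(raw_tb, disk_per_node_tb=48):
    """DataNodes needed, given an assumed usable disk figure per node."""
    return math.ceil(raw_tb / disk_per_node_tb)

# Example: 100 TB of logical data, triple-replicated, 25% overhead
raw = estimate_raw_capacity_tb(100)   # -> 400.0 TB raw
nodes = estimate_node_count(raw)      # -> 9 nodes at 48 TB each
```

The same arithmetic runs in reverse when deciding how much data a fixed-size cluster can absorb before it must be grown.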

Skills Required:

  • Experience as Cloudera Admin
  • Preferably Cloudera Hadoop Admin Certification
  • Excellent knowledge of Red Hat Linux (5 years' experience).
  • Excellent oral and written communication.
  • Prior experience administering and performance-tuning large Cloudera Hadoop clusters in the BFS domain is preferred; this is needed in both NAM and India.
  • Configuration management and automation tools such as Puppet or Chef for non-trivial installations.
  • Cluster monitoring tools like Ambari, Ganglia, or Nagios.

Good to Have:

  • Knowledge of core Java; good understanding of OS concepts, process management, and resource scheduling.
  • Basics of networking, CPU, memory, and storage; good command of shell scripting.
  • Exposure to components of the Hadoop ecosystem such as Apache Pig, Apache Hive, Apache Mahout, etc.

Company Information