Required Skills

Hadoop, Kafka, Data Engineers, Data Scientists, System Administrators, Storage Administrators, Network team, vendors

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 9th Jan 2021

JOB DETAIL

Position: Hadoop Administrator

Location: Jersey City, NJ (onsite work necessary; relocation is fine)

Duration: 6 months, possible extension

 

Description:

 

Hadoop Administrator
The Hadoop administrator is responsible for the care, maintenance, administration and reliability of the Hadoop ecosystem. The role will be responsible for working as part of a 24x7 shift to provide Hadoop platform support and perform administrative tasks on production Hadoop clusters. The role includes ensuring system availability, security, stability, reliability, capacity planning, recoverability (protecting business data) and performance, in addition to delivering new system and data management solutions to meet the growing and evolving data demands of the enterprise.
Essential Job Skills:
• Hadoop administration using Cloudera: administer Hadoop technology and systems, responsible for backup, recovery, architecture, performance tuning, security, auditing, metadata management, optimization, statistics, capacity planning, connectivity and other data solutions for Hadoop systems
• Responsible for installation and ongoing administration of Hadoop infrastructure
• Propose and deploy new hardware and software environments required for Hadoop, and expand existing environments as the need arises. Provision new users, groups and roles and set up Kerberos principals using Active Directory
• Addition and deletion of nodes from Hadoop clusters; monitoring the Hadoop ecosystem
• Kafka administration: add/delete brokers, rebalance Kafka topic workloads, set up SSL and security on Kafka brokers
• Monitor and troubleshoot Kafka workloads
• Log file and file system management, HDFS, storage management and capacity planning
• Troubleshooting and resolving application errors
• Ensure the high availability and performance of Hadoop clusters
• Responsible for data movement in and out of Hadoop clusters
• Perform backup and recovery for the metadata store, databases and configuration files
• Data modeling, design and implementation of Hive and Impala SQL tables
• Installing security patches and upgrading software on release schedules
• Automation of manual tasks; experience with automation tools such as Ansible, Chef, Puppet or SaltStack
• Monitor the health of all Hadoop clusters
• Thorough knowledge of the overall Hadoop architecture and its key components, such as HDFS, YARN, Sqoop, Hive, Impala, Spark, Cloudera Manager, etc.
• Ability to effectively collaborate with Data Engineers, Data Scientists, System Administrators, Storage Administrators, the Network team, vendors, etc.
• A solid background in Linux shell scripting and system administration
• Excellent communication and problem-solving skills
• Ability to work with Senior Enterprise Architects to develop a Big Data platform
• 10 years' experience in large-scale relational database administration with data warehousing systems
• 4 years' experience in Hadoop (Cloudera preferred)

• Hadoop Certification is preferred

 

 

Mahima Raj

Mahima@tekinspirations.com

Direct: 469-353-7766

IT Technical Recruiter | TEK Inspirations LLC

13573 Tabasco Cat Trail, Frisco, TX 75035
