Required Skills

Big Data, JVM, Heap, Thread pools, memory management, HA, Kerberos, Encryption using Sentry / Ranger / KMS

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 4th Jan 2021

JOB DETAIL

Big Data Administration Architect
Pleasanton, California
Long Term Contract

Net 60 Payment 

Install and configure the various components of the Hadoop ecosystem (HDFS, YARN, HBase, ZooKeeper, Kafka, Hive, Spark; NameNode, YARN, and Hive HA configuration; and Sentry) and maintain their integrity
Maintain a Hadoop cluster, adding and removing nodes using Cloudera Manager
Experience with Cloudera; work with data delivery teams to set up new Hadoop users, including creating user accounts, setting up Kerberos principals, and testing Cloudera components
Expert knowledge of Active Directory/LDAP security integration with the Cloudera Big Data platform; HDFS permissions using Kerberos; single sign-on for all Cloudera components, including but not limited to the OS, console, and other tools
Understanding of Java applications and their components, including but not limited to containers, the JVM, heap, thread pools, and memory management
Good understanding of and experience configuring HA, Kerberos, and encryption using Sentry, Ranger, and KMS
Debug, configure, and tune the various components of the Hadoop ecosystem
Monitor and troubleshoot cluster activities and performance
Manage all aspects of our infrastructure (storage, network, compute) in a cloud environment.
Set up Active Replica for the Cloudera environment.
Experience with Cloudera upgrades and migrations
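As a point of reference for the Kerberos-secured HDFS responsibilities above, a minimal configuration sketch is shown below. The property names are standard Hadoop settings; the EXAMPLE.COM realm and the keytab path are placeholder assumptions, not values from this posting.

```xml
<!-- core-site.xml: enable Kerberos authentication and service-level authorization -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>

<!-- hdfs-site.xml: NameNode principal and keytab (EXAMPLE.COM and the path are placeholders) -->
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hdfs/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.namenode.keytab.file</name>
  <value>/etc/security/keytabs/hdfs.keytab</value>
</property>
```

In a Cloudera-managed cluster these settings are normally generated by Cloudera Manager's Kerberos wizard rather than edited by hand.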
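The NameNode HA configuration mentioned above can be sketched with the standard hdfs-site.xml properties below; the "mycluster" nameservice, host names, and port are placeholders assumed for illustration.

```xml
<!-- hdfs-site.xml: NameNode HA sketch ("mycluster" and hosts are placeholders) -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>nn1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>nn2.example.com:8020</value>
</property>
<property>
  <name>dfs.ha.automatic-failover.enabled</name>
  <value>true</value>
</property>
```

Automatic failover additionally relies on a ZooKeeper quorum and failover controllers, which ties this requirement to the ZooKeeper administration skills listed earlier.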
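The JVM heap and memory-management skills listed above are typically exercised through daemon options in hadoop-env.sh. A hedged sketch follows; the 4 GB and 1 GB sizes are illustrative assumptions, not recommendations for any particular cluster.

```shell
# hadoop-env.sh fragment -- illustrative sizes, not tuned recommendations.
# Setting min and max heap to the same value avoids runtime resize pauses
# on the NameNode JVM; G1 is a common collector choice for large heaps.
export HADOOP_NAMENODE_OPTS="-Xms4g -Xmx4g -XX:+UseG1GC"
# DataNodes usually need far less heap than the NameNode.
export HADOOP_DATANODE_OPTS="-Xms1g -Xmx1g"
```

NameNode heap in particular scales with the number of files and blocks in HDFS, which is why sizing it is an admin task rather than a fixed default.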

Company Information