HDFS, Yarn, Hive, LLAP, Druid, Impala, Spark, Kafka, HBase, Cloudera Work Bench
Work Authorization
US Citizen
Green Card
Preferred Employment
Corp-Corp
W2-Permanent
W2-Contract
Contract to Hire
Employment Type
Consulting/Contract
Education Qualification
UG: Not Required
PG: Not Required
Other Information
No. of positions: 1
Posted: 8th Jun 2024
JOB DETAIL
Bachelor's degree in Information Systems, Engineering, Computer Science, or a related field from an accredited university.
Intermediate experience in a Hadoop production environment.
Must have intermediate experience and expert knowledge with at least 4 of the following:
Hands-on experience with Hadoop administration in Linux and virtual environments.
Well versed in installing and managing Hadoop distributions (Cloudera).
Expert knowledge and hands-on experience with Hadoop ecosystem components, including HDFS, Yarn, Hive, LLAP, Druid, Impala, Spark, Kafka, HBase, Cloudera Work Bench, etc.
Thorough knowledge of the overall Hadoop architecture.
Experience using and troubleshooting open-source technologies, including configuration management and deployment.
Data Lake and Data Warehousing design and development.
Experience reviewing existing DB and Hadoop infrastructure and determining areas for improvement.
Implementing a software lifecycle methodology to ensure adherence to supported releases and the product roadmap.
Configuring high availability for NameNodes (see the first sketch after this list).
Scheduling and taking backups for the Hadoop ecosystem (a backup scheduling sketch follows the list).
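For illustration only, the following is a minimal sketch of the kind of routine HA check an administrator of a NameNode pair might script. It assumes the `hdfs` CLI is on PATH and uses the hypothetical NameNode IDs nn1 and nn2, which would need to match the IDs defined in the cluster's hdfs-site.xml; it is not a prescribed part of this role.

```python
# Minimal sketch: query the HA state of each NameNode via the hdfs CLI.
# Assumes the `hdfs` binary is on PATH; nn1/nn2 are hypothetical IDs
# standing in for the values of dfs.ha.namenodes.<nameservice>.
import subprocess

NAMENODE_IDS = ["nn1", "nn2"]  # hypothetical; replace with the cluster's own IDs

def namenode_state(nn_id: str) -> str:
    """Return 'active', 'standby', or an error message for one NameNode."""
    result = subprocess.run(
        ["hdfs", "haadmin", "-getServiceState", nn_id],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        return f"error: {result.stderr.strip()}"
    return result.stdout.strip()

if __name__ == "__main__":
    for nn_id in NAMENODE_IDS:
        print(f"{nn_id}: {namenode_state(nn_id)}")
```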
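Likewise, a hedged sketch of one common backup approach, scheduled HDFS snapshots; the directory /data/warehouse and the retention count are hypothetical, and the directory must first be made snapshottable with `hdfs dfsadmin -allowSnapshot`. In practice it could be run daily from cron or an equivalent scheduler.

```python
# Minimal sketch: take a dated HDFS snapshot of a snapshottable directory
# and prune snapshots beyond a retention window.
# The path and retention count below are hypothetical examples.
import subprocess
from datetime import date

SNAP_DIR = "/data/warehouse"   # hypothetical path; must already be snapshottable
KEEP = 7                       # hypothetical retention: keep the last 7 snapshots

def hdfs(*args: str) -> str:
    """Run an hdfs CLI subcommand and return its stdout."""
    out = subprocess.run(["hdfs", *args], capture_output=True, text=True, check=True)
    return out.stdout

# Create today's snapshot, e.g. s2024-06-08.
hdfs("dfs", "-createSnapshot", SNAP_DIR, f"s{date.today().isoformat()}")

# List existing snapshots and delete the oldest ones beyond the retention window.
listing = hdfs("dfs", "-ls", f"{SNAP_DIR}/.snapshot")
names = sorted(line.split("/")[-1] for line in listing.splitlines() if "/" in line)
for old in names[:-KEEP]:
    hdfs("dfs", "-deleteSnapshot", SNAP_DIR, old)
```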