Required Skills

HDFS, HBase, Hive, Pig, Spark, MapReduce, Cloudera

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 18th Oct 2023

JOB DETAIL

• Advanced knowledge of application, data, and infrastructure architecture disciplines

• Experience with AWS, Redshift, and Snowflake
• Experience with security, isolation, and multi-tenant design of distributed cloud services
• Understanding of RESTful API design best practices and experience in developing them
• Experience with Hadoop ecosystem technology stacks such as HDFS, HBase, Hive, Pig, Spark, MapReduce, Cloudera, etc.
• Experience using Eclipse/IntelliJ, Maven, Jenkins, Git, JIRA, Control-M, or equivalent tools
• Working knowledge of at least a few common frameworks, such as Spring, Apache, Hibernate, or similar ORM tools
• Experience with Cassandra and Kafka preferred
• Experience with ETL-based, high-volume real-time and batch application processing
• Experienced in developing large-scale enterprise applications using open-source Big Data solutions such as Hadoop, Spark, Kafka, and Elasticsearch
• Experience with Scala, Java, and/or Python
• Hands-on experience with RDBMS (Oracle, MySQL) and NoSQL (Cassandra) databases
• Experience with Change Management and Incident Management processes

Company Information