Required Skills

Big Data / Hadoop

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not required

  • PG: Not required

Other Information

  • No. of positions: 1

  • Posted: 15th Jul 2022


Requirements:

  • 5+ years of professional experience as an administrator
  • 3+ years of hands-on experience with Linux/UNIX operating systems
  • 3+ years of hands-on experience with Big Data and AWS Cloud Computing technologies
  • Proficient in Hortonworks/Cloudera Hadoop stack technologies, MapReduce, HDFS
  • Experience with security components (e.g., Ranger, AD, Kerberos)
  • Experience with S3, IAM policies, Redshift, Kinesis, Lambda, Elasticsearch
  • Experience with ETL tools (e.g., Datameer, Informatica, Talend)
  • Experience with NoSQL databases (e.g., HBase, Cassandra)
  • Knowledge of various ETL techniques and frameworks (e.g., NiFi, Kafka)
  • Proficient in one or more scripting or automation tools (e.g., Shell Script, Perl)
  • Proficient in one or more languages (e.g., Python, Spark, Java, Bash)
  • Experience with reporting tools (e.g., Tableau, MicroStrategy, Power BI, Talend) is a plus
  • Experience with streaming stacks (e.g., NiFi, PySpark)
  • Experience with Jenkins, Git, Airflow
  • Experience with continuous software integration, testing, and deployment
  • Experience with agile software development paradigm (e.g., Scrum, Kanban)
  • Ability to work within a dynamic programmatic environment with evolving requirements and capability goals
  • Strong customer service and communication skills
  • Ability to collaborate and share knowledge with a diverse team of individuals across Technology and Business
  • Self-motivated; capable of working with little or no supervision


Primary responsibilities:

  • Administer and support Big Data technologies including AWS Cloud, Hortonworks/Cloudera Hadoop, Datameer, Redshift, Tableau, Jenkins
  • Develop scripts to automate processes (e.g., monitoring, installation, deployments)
  • Troubleshoot issues, provide root cause analysis, restore service
  • Engage with multiple vendors to resolve issues and obtain/test/implement bug fixes
  • Install, configure, upgrade and patch Big Data technologies
  • Provision new servers and scale clusters to meet Enterprise needs
  • Identify and implement continuous improvements to JCP’s Big Data environment
  • Identify technical implementation options and issues
  • Establish and communicate standards and best practices to ensure data standardization and consistency


Company Information