Required Skills

Big Data, Spark, HDFS, Kafka, Hive, MySQL, PostgreSQL, SQL Server, Snowflake, Python, PySpark, Java, SQL, Shell Scripting, Sqoop

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H-1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 5th Jan 2021

JOB DETAIL

Job Description :

  • Programming Languages - Python, PySpark, Java, SQL, Shell Scripting, Sqoop
  • Big Data Tools - Spark, HDFS, Kafka, Hive
  • Databases - MySQL, PostgreSQL, SQL Server, Snowflake
  • Cloud Technologies - Amazon Web Services
  • Unix/Linux expertise

Responsibilities

  • Design and develop data ingestion and processing code using Python/PySpark/Hive on the Cloudera CDH platform.
  • Create and update design specs and reference architecture documents to accelerate solution development.
  • Innovate on the Cloudera Data Platform: research related technologies, develop new concepts, prototype, and deliver implementations.
  • Participate in testing and peer code reviews to identify any bugs and ensure reusability of code.
  • Automate the deployment of the solutions by using Shell scripts/Python/Oozie.
  • Work with internal subject matter experts to define requirements for new demo environments
  • Collaborate with the Apache community on Hadoop and other related open-source projects.
  • Knowledge of Snowflake and AWS Cloud.
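As a minimal illustrative sketch (not part of the posting) of the Python-driven deployment automation the responsibilities mention, the snippet below assembles a `spark-submit` command for a PySpark job; all paths, job names, and configuration values are hypothetical:

```python
import shlex

def build_spark_submit(app_path, master="yarn", deploy_mode="cluster",
                       conf=None, app_args=None):
    """Assemble a spark-submit command line for deploying a PySpark job.

    All defaults here are illustrative; a real deployment would pull
    them from a config file or an Oozie workflow definition.
    """
    cmd = ["spark-submit", "--master", master, "--deploy-mode", deploy_mode]
    # Each Spark config entry becomes a --conf key=value pair.
    for key, value in (conf or {}).items():
        cmd += ["--conf", f"{key}={value}"]
    cmd.append(app_path)
    cmd += app_args or []
    return cmd

# Example: command for a hypothetical daily ingestion job.
cmd = build_spark_submit(
    "hdfs:///apps/ingest/daily_load.py",          # hypothetical HDFS path
    conf={"spark.executor.memory": "4g"},
    app_args=["--run-date", "2021-01-05"],
)
print(" ".join(shlex.quote(part) for part in cmd))
```

In practice a wrapper like this would be invoked from a shell script or an Oozie shell action, keeping cluster-specific settings out of the job code itself.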

Company Information