Required Skills

Spark, Kafka, HBase, Pig, Impala, Sqoop, Oozie, Flume, Mahout, Storm, Tableau, Talend, PeopleCode, Hadoop, Data Tokenization, Oracle, SQL, Informatica, NDM, Big Data, Python, SSO

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 3rd Feb 2021

JOB DETAIL

JD:

  • 6-10 years of total IT development experience across all phases of the SDLC.

  • Hadoop/Java developer experience in all phases of Hadoop and HDFS development, with ETL/Informatica exposure.

  • Extensive experience in requirements gathering, analysis, design, coding, code reviews, and unit and integration testing.

  • Hands-on experience with the Hadoop ecosystem, including Spark, Kafka, HBase, Pig, Impala, Sqoop, Oozie, Flume, Mahout, Storm, Tableau, and Talend.

  • Experience converting Hive/SQL queries into Spark transformations using Spark RDDs and PySpark.

  • Experience with SQL, PL/SQL, and NoSQL databases such as Microsoft SQL Server, Oracle, HBase, and Cassandra.

  • Experience importing and exporting data between HDFS and databases such as MySQL, Oracle, Netezza, Teradata, and DB2 using Sqoop and Talend.

  • Experience developing and scheduling ETL workflows in Hadoop using Oozie.

  • Experience with Tableau as a reporting tool.

  • Excellent communication and strong architecture skills.

  • Ability to learn and adapt quickly to emerging technologies.
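The Hive-to-Spark conversion mentioned above typically maps a SQL aggregation onto RDD-style map/reduceByKey transformations. A minimal sketch of the pattern (using only the Python standard library to emulate the RDD operations, since no Spark cluster is assumed here; the data and helper names are illustrative, and in PySpark the same lambdas would be passed to `rdd.map` and `rdd.reduceByKey`):

```python
from operator import add

# Hive query being converted (illustrative):
#   SELECT dept, SUM(salary) FROM employees GROUP BY dept
# PySpark equivalent (sketch):
#   rdd.map(lambda r: (r["dept"], r["salary"])).reduceByKey(add)

# Sample rows standing in for the employees table (hypothetical data).
rows = [
    {"dept": "eng", "salary": 100},
    {"dept": "eng", "salary": 120},
    {"dept": "ops", "salary": 90},
]

# map step: each row becomes a (key, value) pair, as rdd.map(...) would produce.
pairs = [(r["dept"], r["salary"]) for r in rows]

def reduce_by_key(pairs, fn):
    """Emulate Spark's reduceByKey: merge values sharing a key with fn."""
    acc = {}
    for k, v in pairs:
        acc[k] = fn(acc[k], v) if k in acc else v
    return acc

totals = reduce_by_key(pairs, add)
print(totals)  # {'eng': 220, 'ops': 90}
```

The same GROUP BY rewrite generalizes to other SQL aggregates by swapping the reduce function (e.g. `max` instead of `add`).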


 

Company Information