Required Skills


Work Authorization

  • Citizen

Preferred Employment

  • Full Time

Employment Type

  • Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 24th Nov 2022


  • At least three years of experience with big data technologies such as Hadoop and Spark architecture, performance optimization, Spark SQL, Streaming, HIVE, SQOOP, KAFKA, Impala, HBASE, Entitlements, etc.

  • Two or more years of experience with Python or Java is required.

  • Expertise in writing SQL is required.

  • A strong background in UNIX shell scripting is required.

  • Experience with cloud computing systems, ideally AWS, is required.

  • DevOps experience is required.

  • Familiarity with relational database environments, including databases, tables/views, stored procedures, agent jobs, etc. is required.

  • Working knowledge of the Hadoop framework and complex ETL processes is required.

  • Experience with MapReduce is a plus.

  • Previous experience using Hadoop-based solutions for data analytics is a plus.

  • Knowledge of ETL tools is a plus.

  • Strong analytical abilities, with the capacity to collect, organize, analyze, and disseminate large amounts of information accurately and with attention to detail.

  • The ability to present information in an engaging manner is a bonus, as is proficiency with the full spectrum of database and business intelligence technologies.


Company Information