Required Skills

Hive, Hadoop, Scala, Spark, SQL, Scripting, Unix, HQL, Linux, HDFS, Teradata

Work Authorization

  • Citizen

Preferred Employment

  • Full Time

Employment Type

  • Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 18th May 2022

JOB DETAIL

6-9 years of post-college work experience as a developer and architect with Hadoop and data warehousing tools.

  1. 6+ years of experience with Hadoop and Hive SQL is required.
  2. 6+ years of Linux/Unix/Python coding experience, including scripting, is required.
  3. Spark programming knowledge is preferred.
  4. Hands-on experience with Dataiku and Dremio products is preferred.
  5. Strong conceptual and creative problem-solving skills, ability to work with considerable ambiguity, and ability to learn new and complex concepts quickly.
  6. Experience leading teams in a complex organization involving multiple reporting lines.
  7. The candidate should be able to tune queries and work on performance enhancement.
  8. The candidate will be responsible for delivering code, setting up environments and connectivity, and deploying the code to production after it has been properly tested.
  9. The candidate should have strong functional and technical knowledge to deliver what is required and should be well acquainted with banking terminology. Occasionally, the candidate may be responsible as a primary contact and/or driver for small to medium-sized projects.

Skills & Attributes: Must have (maximum of 5 bullet points)

  1. Good development, analytical, and technical expertise in HDFS, Hive, SQL/HQL, and other Hadoop ecosystem components.
  2. Production support knowledge/experience; problem-solving skills to manage incidents.
  3. Minimum 3-4 years of development experience within 3-6 years of overall work experience. A hunger for knowledge, self-initiative, and a willingness to learn.
  4. Good understanding of and experience with Agile projects.
  5. Familiarity with SDLC and the banking domain is an added advantage.

 

Must have Technical skills

  1. Familiarity with Enterprise Data Warehouse and Reference Data Management is a plus.
  2. Familiarity with ETL processes and Hadoop.
  3. Familiarity with Spark, Data API, and MLP.
  4. Familiarity with SQL scripting, VB scripting, and Python.
  5. Familiarity with Unix/Linux and batch scripting is a plus.
  6. Ability to develop and execute an automation framework, especially complementing the data warehouse domain.
  7. Familiarity with the Teradata database is a plus.

Nice to have

  1. Hadoop certification
  2. Banking domain knowledge is a plus
  3. Development and support experience in data warehousing is welcome

Note: We are open to sub-contracting as well. Interested companies, please reach us at M: +91 99620 02035

 

Company Information