Required Skills

ETL Developer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 19th Nov 2022

JOB DETAIL

In this role, you would be part of the data integrity/analysis team in the Capital Markets domain. You will be responsible for working with various systems to verify the ETL process, perform data reconciliation, and identify gaps where data is lost. To support these tasks, you would be encouraged to build a validation framework that checks rule-based data-processing scenarios. You also need to understand the business ask/problem, assess the scope, quantity, and quality of the available data, and prepare and build accurate reports, including pulling the data and creating analytics on top of it.
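A validation framework of the kind described above can be sketched in plain Python as a set of named rule predicates applied to each record. This is a minimal illustrative sketch; the field names (`trade_id`, `quantity`, `currency`) and the rules themselves are assumptions, not part of the actual role:

```python
# Illustrative rule-based validation sketch: each rule is a named predicate,
# and records failing any rule are collected for reporting.

RULES = {
    "trade_id_present": lambda rec: bool(rec.get("trade_id")),
    "quantity_positive": lambda rec: isinstance(rec.get("quantity"), (int, float)) and rec["quantity"] > 0,
    "currency_known": lambda rec: rec.get("currency") in {"USD", "EUR", "GBP"},
}

def validate(records):
    """Return a list of (record_index, failed_rule_names) for bad records."""
    failures = []
    for i, rec in enumerate(records):
        failed = [name for name, rule in RULES.items() if not rule(rec)]
        if failed:
            failures.append((i, failed))
    return failures

sample = [
    {"trade_id": "T1", "quantity": 100, "currency": "USD"},
    {"trade_id": "", "quantity": -5, "currency": "JPY"},
]
print(validate(sample))
# -> [(1, ['trade_id_present', 'quantity_positive', 'currency_known'])]
```

Keeping rules as data (a dict of predicates) rather than hard-coded checks makes it easy to add new rule-based scenarios without touching the validation loop.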


Responsibilities

• Interact with clients to understand requirements and provide support

• Fetch data from the data warehouse, analyze it, and produce reports per client requirements

• Provide assistance and resolve issues faced by end users

• Build data reconciliation and validation frameworks

• Create rule-based scripts to check data quality

• Work with Hive, Hadoop, and Unix in a data lake environment whose data pipelines are built with PySpark

• Write Unix shell scripts and complex SQL

• Hands-on experience with the ETL tool Informatica

• Ensure data integrity and customer delight by granting appropriate levels of access to requested tools/channels in accordance with business rules
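The data reconciliation responsibility can be illustrated with a minimal Python sketch that compares a source extract to a target load, flagging rows lost or altered in the ETL process. The key and measure column names here are hypothetical:

```python
# Illustrative source-to-target reconciliation: find keys missing from the
# target and keys whose measure value changed during the load.

def reconcile(source_rows, target_rows, key="trade_id", measure="quantity"):
    """Report keys missing from the target and keys whose measure differs."""
    src = {r[key]: r[measure] for r in source_rows}
    tgt = {r[key]: r[measure] for r in target_rows}
    missing = sorted(set(src) - set(tgt))
    mismatched = sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k])
    return {"missing_in_target": missing, "measure_mismatch": mismatched}

src = [{"trade_id": "T1", "quantity": 100}, {"trade_id": "T2", "quantity": 50}]
tgt = [{"trade_id": "T1", "quantity": 99}]
print(reconcile(src, tgt))
# -> {'missing_in_target': ['T2'], 'measure_mismatch': ['T1']}
```

In practice the same comparison would run as HQL/SQL against the warehouse and data lake tables; the set-difference logic is the same.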

Qualifications we seek in you


Minimum qualifications

• Graduate/B.Tech/BCA

• Knowledge of or experience in Capital Markets is an added advantage for this position


Preferred qualifications

• Strong hands-on experience with core Python or Unix scripting

• Experience with data warehouse and data lake environments

• Experience with data modelling concepts, including dimensional models

• Expertise in HQL and writing complex SQL queries

• Knowledge of the Hadoop environment

• Good understanding of ETL with Informatica

• Good interpersonal, problem-solving, and verbal communication skills

• Experience with web scraping and transforming data from one format to another is a plus

Company Information