Required Skills

ETL

Work Authorization

  • US Citizen

  • Green Card

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 24th Feb 2023

Job Details

•        Knowledge of Azure Data Factory is a must; knowledge of other tools such as SSIS and Talend is good to have.

•        Hands-on experience in migration projects on Azure Cloud.

•        Good experience in Python scripting.

•        Should be able to evaluate and recommend the best ETL tool for the data migration requirements.

•        Hands-on experience in migration projects using Azure Data Factory from SQL Server, DB2 (LUW or mainframe), or mainframe databases (VSAM, flat files, DB2) to Oracle. (If not, should have experience migrating any relational source to a cloud database using ADF.)

•        Experience in validating data migrations.

•        Experience in writing Python scripts and working with Azure Databricks.

•        Hands-on experience with data extraction strategies (full data + incremental data) from source systems such as relational databases (SQL Server, DB2) and mainframe databases (VSAM files, flat files, sequential files).

•        Hands-on experience implementing bi-directional synchronization, including the database configuration required to enable it.

•        Document technical and system specifications for all ETL processes, perform unit tests on all processes, and prepare required programs and scripts.

•        Design, implement, and continuously expand data pipelines by performing extraction, transformation, and loading activities.

•        Gather requirements and business process knowledge to transform the data to meet the needs of end users.

•        Maintain and improve existing processes.

•        Ensure that the data architecture is scalable and maintainable.

•        Work with the business to design and deliver correct, high-quality data.

•        Investigate data to identify potential issues within ETL pipelines, notify end users, and propose appropriate solutions.
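As a rough illustration of the "full data + incremental data" extraction strategies mentioned above, the sketch below shows a watermark-based approach in Python. All table and column names are hypothetical, and SQLite stands in for the real source; an actual migration would read from SQL Server/DB2 (e.g. via ADF or pyodbc) and persist the watermark in a control table.

```python
import sqlite3

# Hypothetical source table: src(id, payload, updated_at).
# Full load pulls everything; incremental load pulls only rows whose
# updated_at is later than the watermark recorded by the previous run.

def full_extract(conn):
    """Initial load: pull every row from the source table."""
    return conn.execute("SELECT id, payload, updated_at FROM src").fetchall()

def incremental_extract(conn, last_watermark):
    """Delta load: pull only rows changed since the last watermark."""
    return conn.execute(
        "SELECT id, payload, updated_at FROM src WHERE updated_at > ?",
        (last_watermark,),
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, payload TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO src VALUES (?, ?, ?)",
    [(1, "a", "2023-01-01"), (2, "b", "2023-02-01"), (3, "c", "2023-03-01")],
)
print(len(full_extract(conn)))                       # full load: 3 rows
print(len(incremental_extract(conn, "2023-01-15")))  # delta: 2 rows
```

The same pattern underlies ADF's incremental copy: the watermark (a timestamp or monotonically increasing key) bounds each delta, so repeated runs move only new or changed rows.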

Company Information