Required Skills

AWS, Python, MapR, IBM, Snowflake, BO, Tableau

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 19th Dec 2020

JOB DETAIL

Python, Amazon Web Services (AWS) Cloud Computing, Big Data and Hadoop Ecosystem - MapR, IBM InfoSphere DataStage, Snowflake

• Data asset analysis: analyze sources & prepare inventory

• Build/modify extract jobs from various sources to a given target – data analysis, profiling and replication

• Analyze existing on-prem ETL jobs on DataStage & Informatica and extract the transformation logic to rebuild in Data Build Tool (DBT)

• Test-driven development & adherence to sensitive data management/data governance processes

• Downstream dependency analysis and integration with the Cloud Warehouse solution – BO, Tableau, SAP Lumira

• Building validation, reconciliation, and quality-check frameworks for data migration and daily ETL/ELT

• Building data-sharing capabilities

• Building Cloud Migration Accelerators – Snowflake, AWS S3, StreamSets, DBT, Python, Unix, Java, APIs, etc.

• Adopting data engineering best practices

• Data migration rollout from the on-premises analytical environment to the Snowflake warehouse: implementation and process

• Assist in implementation and enablement of the proposed technology stack (e.g., recommended tool implementation & configuration strategies)

• Provide insights, best practices & documentation for operationalizing the new technology stack

• Provide transformational best practices, including communications and change management, as well as insights on how to plan, budget & optimize in an audit-ready environment
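The validation/reconciliation framework mentioned above can be sketched in miniature: a minimal, hypothetical Python example (names and structure are illustrative, not the employer's actual tooling) that compares row counts and content fingerprints between a source extract and a migrated target, assuming rows are already available as lists of dicts.

```python
# Minimal reconciliation sketch for a data migration (illustrative only).
# In practice, source_rows/target_rows would be fetched from the legacy
# system and the Snowflake target rather than hard-coded.
import hashlib

def row_fingerprint(row):
    """Column-order-independent fingerprint of a single row."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()

def reconcile(source_rows, target_rows):
    """Compare row counts and per-row content between two datasets."""
    src = {row_fingerprint(r) for r in source_rows}
    tgt = {row_fingerprint(r) for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": len(src - tgt),
        "unexpected_in_target": len(tgt - src),
    }

# Example: one row failed to migrate, so the report flags a mismatch.
source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 1, "amt": 10}]
report = reconcile(source, target)
```

A production framework would add tolerance rules for floating-point columns, sampling for very large tables, and logging of the mismatched keys, but the count-plus-fingerprint pattern above is the core check.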

Company Information