Required Skills

ADLS, Kafka, SQL, Synapse SQL, Cosmos DB, Graph DBs

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 2nd Nov 2023

JOB DETAIL

You will partner with the DnA Data Management team and other stakeholders to document Data Governance (DG) and Data Quality (DQ) requirements for the implementation of various data projects.

Work with the Data Governance team and business partners to research, evaluate, document, and maintain standards, best practices, and design patterns around project requirements and other aspects of existing and emerging ETL technologies, in support of cloud and big data implementations.

Collate project requirements from the perspective of the Data Governance team and participate in overall project implementation.

Perform a consultative role in assessing policy and standards alignment, data inventories for scoped applications, data content management, DQ rules, access management, data security, and data risk.

Interact with business analysts and functional analysts during requirements gathering and implementation.

Collaborate with the extended project team on compiling proposals, including a high-level technical solution, estimates, and a project plan.


To help you succeed, you need to have:

  • 5+ years' experience in enterprise-wide implementation of Data Governance, Data Quality, or Technical Data Management programs.
  • 3+ years of experience working with various Data Governance tools (Collibra experience preferred).
  • Advanced English.
  • Demonstrated experience with designing, implementing, and deploying scalable and performant data hubs at global scale.
  • Demonstrated experience in innovative database technologies and cloud services such as Azure, GCP, Databricks, or Snowflake.
  • Experience in technologies such as Spark (Scala/Python/Java), ADLS, Kafka, SQL, Synapse SQL, Cosmos DB, and Graph DBs.
  • Experience with Collibra, EDC, IDQ or other Data Management tools.
  • Expertise in Python and experience writing Azure Functions using Python/Node.js.
  • Experience using Event Hub for data integrations; knowledge of the Collibra operating model and tool architecture, including workflows, roles, and responsibilities.

Company Information