Required Skills

Data Modeler

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 20th Apr 2022

JOB DETAIL

  • Responsible for designing, deploying, and maintaining data models for the DWL layer.

  • Responsible for creating STM and data models (ER models) for every entity.

  • Understand the work intake from the client at the required granularity and business level, and model the requirements into a new or existing business model.

  • Evaluate new and upcoming data solutions and recommend their adoption for existing and new data entities.

  • Responsible for handling, maintaining, and optimizing the global EDW design and aligning each entity to the common format according to its functional role.

  • Create dynamic DDLs and scripts as per requirements.

  • Guide the ETL team in understanding the data model and the functional role of each entity.

  • Responsible for the migration of Dev and QA data entities.

  • Create STM and data models in an ER tool.

  • Identify gaps in the existing platform and improve its quality, robustness, maintainability, and speed.

  • Perform development, QA, and DevOps roles as needed to ensure end-to-end responsibility for solutions.
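The "dynamic DDLs" responsibility above could look something like the following minimal sketch: a generator that renders a CREATE TABLE statement from a simple entity spec. The entity and column names (`dim_customer`, `customer_id`, etc.) are hypothetical examples, not taken from the posting.

```python
# Illustrative sketch only: generating a CREATE TABLE DDL from a simple
# entity spec. Entity and column names below are hypothetical.

def build_ddl(entity: str, columns: dict[str, str], keys: list[str]) -> str:
    """Render a CREATE TABLE statement for one entity."""
    col_lines = [f"    {name} {dtype}" for name, dtype in columns.items()]
    if keys:
        col_lines.append(f"    PRIMARY KEY ({', '.join(keys)})")
    body = ",\n".join(col_lines)
    return f"CREATE TABLE {entity} (\n{body}\n);"

ddl = build_ddl(
    "dim_customer",
    {"customer_id": "BIGINT", "customer_name": "VARCHAR(200)", "load_ts": "TIMESTAMP"},
    keys=["customer_id"],
)
print(ddl)
```

In practice the column spec would be driven by the STM or ER model rather than hard-coded, which is what makes the DDL "dynamic."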

 

Requirements & Qualifications:

  • Experience building, maintaining, and improving data processing pipelines / data routing in large-scale environments.

  • Fluency in common query languages, API development, data transformation, and integration of data streams.

  • Strong experience with large-dataset platforms (e.g., Azure SQL Database, Teradata).

  • Experience with Azure Synapse is preferred.

  • Fluency in multiple programming languages, such as Python, shell scripting, SQL, Java, or similar languages and tools appropriate for large-scale data processing.

  • Experience with any ER tool.

  • Experience acquiring data from varied sources such as APIs, data queues, flat files, and remote databases.

  • Basic Linux administration skills and multi-OS familiarity (e.g., Microsoft Windows, Linux) required.

  • Data pipeline and data processing experience using common platforms and environments.

  • Understanding of traditional data warehouse components (e.g., ETL, business intelligence tools).

  • Creativity to go beyond current tools to deliver the best solution to the problem.

  • 5+ years working in data processing environments.
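The "varied sources" requirement above amounts to normalizing heterogeneous inputs into one row format. A minimal sketch, assuming two of the listed source types (a flat file and a JSON API payload); the field names and payload shape are hypothetical:

```python
# Illustrative sketch: normalizing records from a flat file (CSV) and a
# JSON API payload into a common row format. Field names are hypothetical.
import csv
import io
import json

def rows_from_flat_file(text: str) -> list[dict]:
    """Parse CSV text into a list of dicts (all values are strings)."""
    return list(csv.DictReader(io.StringIO(text)))

def rows_from_api(payload: str) -> list[dict]:
    """Parse a JSON payload assumed to have the shape {'items': [...]}."""
    return json.loads(payload)["items"]

flat = "customer_id,amount\n1,10.5\n2,7.0\n"
api = '{"items": [{"customer_id": "3", "amount": 4.25}]}'

# Coerce both sources into one typed record shape.
combined = [
    {"customer_id": int(r["customer_id"]), "amount": float(r["amount"])}
    for r in rows_from_flat_file(flat) + rows_from_api(api)
]
print(combined)
```

Real pipelines would read from files, queues, or HTTP endpoints instead of inline strings, but the shape of the work (parse per source, coerce to a shared schema) is the same.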

Company Information