- Modernize our Data Warehouse and build an Analytical Data Store, while also maintaining our existing legacy Data Warehouses.
- Design, develop, and implement scalable batch and real-time data pipelines (ETLs) to integrate data from a variety of sources into the Data Warehouse and Data Lake.
- Design and implement data model changes that align with warehouse dimensional modeling standards.
- Apply proficiency in Data Lake and Data Warehouse concepts and Dimensional Data Modeling.
- Maintain and support all database environments; design and develop data pipelines, workflows, and ETL solutions in both on-prem and cloud-based environments.
- Design and develop SQL stored procedures, functions, views, and triggers.
- Design, code, test, document, and troubleshoot deliverables.
- Collaborate with others to test and resolve issues with deliverables.
- Maintain awareness of and ensure adherence to standards regarding privacy.
- Create and maintain design documents, source-to-target mappings, unit test cases, and data seeding.
- Perform data analysis and data quality tests and create audits for the ETLs.
- Perform continuous integration and deployment using Azure DevOps and Git.
Skills and Experience You Will Need:
Required Skills:
- Strong experience with SQL, dbt, and Snowflake.
- Strong experience designing and implementing Data Warehouses.
- Experience working with the Microsoft BI stack (SSIS/SSRS/SSAS) and Microsoft SQL Server. (5 years)
- Must have experience with at least one columnar MPP cloud data warehouse (Snowflake, Azure Synapse, or Redshift). (2+ years)
- Working knowledge of managing data in a Data Lake.
- Experience with Agile methodologies, Jira, and Confluence.
- Solid understanding of programming SQL Server objects (stored procedures, triggers, views, functions) and advanced T-SQL, including indexes. Experience optimizing SQL queries is a plus.
- Healthcare experience (claims, payments).
Desired Skills:
- Azure Data Factory (ADF) and broader Azure experience.
- Demonstrated proficiency in designing and developing Azure Data Factory pipelines.
- Experience with ETL tools such as Fivetran and dbt. (2 years)