Analyzes business requirements for data file requests to design, develop, and deploy fully automated ETL jobs that produce data files.
Designs, develops, and deploys ETL programs to facilitate the receipt of data from various data sources and the loading of data into destination tables within the overall data architecture framework.
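The extract–transform–load responsibility described above can be sketched as a minimal Python pipeline. This is an illustrative sketch only: the CSV feed, column names, and `claims` destination table are invented for the example and do not come from any actual file specification.

```python
import csv
import io
import sqlite3

# Hypothetical source feed; field names are illustrative assumptions.
SOURCE_CSV = """member_id,claim_amount
1001,250.00
1002,75.50
"""

def extract(text):
    """Read raw rows from the delimited source feed."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast string fields to their destination column types."""
    return [
        {"member_id": int(r["member_id"]),
         "claim_amount": float(r["claim_amount"])}
        for r in rows
    ]

def load(rows, conn):
    """Insert transformed rows into the destination table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS claims "
        "(member_id INTEGER, claim_amount REAL)")
    conn.executemany(
        "INSERT INTO claims VALUES (:member_id, :claim_amount)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
loaded = conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0]
```

In production this pattern is typically implemented in SSIS or Azure Data Factory rather than hand-written Python; the sketch just makes the extract/transform/load stages concrete.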
Collaborates to design end-to-end solutions for file creation and distribution, and supports ETL production processes and operations.
Utilizes advanced understanding of healthcare data models to identify data sources and map data to required file specifications.
Designs and automates data quality and data validation functions as part of the overall solution.
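An automated data-quality gate of the kind described here might look like the following sketch. The required fields, the negative-amount rule, and the 5% error-rate threshold are all illustrative assumptions, not a required specification.

```python
# Field names and thresholds below are illustrative assumptions.
REQUIRED_FIELDS = ("member_id", "claim_amount")

def validate_row(row):
    """Return the list of rule violations for one record."""
    errors = []
    for field in REQUIRED_FIELDS:
        if row.get(field) in (None, ""):
            errors.append(f"missing {field}")
    amount = row.get("claim_amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("negative claim_amount")
    return errors

def validate_batch(rows, max_error_rate=0.05):
    """Reject the whole file if too many rows fail validation."""
    failures = [e for row in rows for e in validate_row(row)]
    rate = len(failures) / max(len(rows), 1)
    return {"failures": failures, "accepted": rate <= max_error_rate}

result = validate_batch([
    {"member_id": 1, "claim_amount": 10.0},
    {"member_id": None, "claim_amount": -5.0},
])
```

Running such checks before the load step lets a bad source file be rejected as a whole instead of partially loading the destination table.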
Leverages expertise in ETL and the Microsoft SQL Server platform to tune the performance of data management processes.
Applies experience in data analysis, data relationships, aggregation, and summarization.
Performs technical peer reviews and code reviews.
Participates in review of potential technical architectures and solutions to improve and enhance existing systems, processes, and data services.
Expertise in Microsoft SQL Server (T-SQL)
Experience with Microsoft Azure, AWS, or other cloud-based data warehouse platforms.
Experience building data pipelines with Azure Data Factory.
Proficiency with ETL and automation tools (SSIS, ActiveBatch)
Working knowledge of job scheduling tools such as Automic and Tidal
In-depth understanding of database design and data warehouse principles.
Knowledge of scripting languages such as PowerShell
Expertise in database architecture, storage, and query optimization
Extensive background in business intelligence tools (e.g., SSIS, SSRS, and SSAS)
Architectural understanding of reporting tools (e.g., Tableau, SSRS, Crystal Reports, and Power BI)
Experience building CI/CD pipelines with Azure DevOps
Experience with Git, Visual Studio, and Visual Studio Code is preferred.
Working knowledge of .NET 4.5 or later is an added advantage.