Must-have skills: ETL, SSIS, SQL, Databricks, Data Lake, SQL Server, Azure SQL, DACPAC
- Maintain operation of the legacy data warehouse and related ETL processes.
- Contribute to the design, development, deployment, and maintenance of the team's data architecture in Databricks Delta Lake (see the gold-layer sketch after this list).
- Proactively propose solutions and/or improvements to how the team captures, stores, and accesses data, with an eye toward increased efficiency, ease of use, and value.
- Create and maintain documentation of the physical and logical data models, data dictionaries, and ETL processes.
- Design and deploy data table structures, reports, and queries.
- Manage SQL data-tier (DACPAC) deployments between proprietary applications.
- Design and manage the information access and security requirements for the data warehouse.
- Design, develop, and support complex integration/SSIS processes (including interfaces) using SQL Server technologies, stored procedures, and SQL program code.
- Implement incremental loads for many SSIS packages, with error handling, debugging, and error logging for production support in SSIS (see the watermark sketch after this list).
- Improve slow-running jobs through redesign and better ETL processes to meet business needs.
- Participate in code reviews, analyze execution plans, and refactor inefficient code (see the SARGability example after this list).
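For the Delta Lake bullet above, a minimal sketch of what maintaining a gold-layer table in Databricks might look like, assuming a medallion layout; every object name here (silver.orders, gold.daily_order_summary) is an illustrative placeholder, not the team's actual model:

```sql
-- Hypothetical gold-layer aggregate in Databricks SQL; all names are placeholders.
CREATE TABLE IF NOT EXISTS gold.daily_order_summary (
  order_date   DATE,
  region       STRING,
  order_count  BIGINT,
  total_amount DECIMAL(18, 2)
) USING DELTA;

-- MERGE keeps the gold aggregate incrementally in sync with silver
-- instead of fully rebuilding it on every run.
MERGE INTO gold.daily_order_summary AS tgt
USING (
  SELECT order_date,
         region,
         COUNT(*)          AS order_count,
         SUM(order_amount) AS total_amount
  FROM silver.orders
  WHERE order_date >= date_sub(current_date(), 7)  -- reprocess a short trailing window
  GROUP BY order_date, region
) AS src
ON  tgt.order_date = src.order_date
AND tgt.region     = src.region
WHEN MATCHED THEN UPDATE SET
  tgt.order_count  = src.order_count,
  tgt.total_amount = src.total_amount
WHEN NOT MATCHED THEN INSERT *;
```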
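For the SSIS incremental-load bullet, one common pattern is a watermark table driven by a stored procedure that an Execute SQL Task invokes. A hedged T-SQL sketch; every table and column name (etl.Watermark, stg.Orders, dbo.Orders, etl.LoadErrors) is assumed for illustration:

```sql
-- Hypothetical watermark-driven incremental load with error logging (T-SQL).
DECLARE @LastLoaded   DATETIME2,
        @NewWatermark DATETIME2 = SYSUTCDATETIME();

SELECT @LastLoaded = LastLoadedAt
FROM etl.Watermark
WHERE TableName = 'dbo.Orders';

BEGIN TRY
    BEGIN TRANSACTION;

    -- Pull only rows changed since the last successful load.
    INSERT INTO dbo.Orders (OrderId, OrderDate, Amount)
    SELECT s.OrderId, s.OrderDate, s.Amount
    FROM stg.Orders AS s
    WHERE s.ModifiedAt > @LastLoaded
      AND NOT EXISTS (SELECT 1 FROM dbo.Orders AS t WHERE t.OrderId = s.OrderId);

    -- Advance the watermark only after the load succeeds.
    UPDATE etl.Watermark
    SET LastLoadedAt = @NewWatermark
    WHERE TableName = 'dbo.Orders';

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    -- Error logging for production support, as the bullet above calls for.
    INSERT INTO etl.LoadErrors (TableName, ErrorNumber, ErrorMessage, LoggedAt)
    VALUES ('dbo.Orders', ERROR_NUMBER(), ERROR_MESSAGE(), SYSUTCDATETIME());
    THROW;
END CATCH;
```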
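For the execution-plan bullet, a typical refactor that plan analysis surfaces: a non-SARGable predicate rewritten as a range so an index can be seeked instead of the table scanned (dbo.Orders and its columns are assumed names):

```sql
-- Before: the function wrapped around the column forces a full scan.
SELECT OrderId, Amount
FROM dbo.Orders
WHERE YEAR(OrderDate) = 2022;

-- After: an open-ended range lets an index on OrderDate be used.
SELECT OrderId, Amount
FROM dbo.Orders
WHERE OrderDate >= '2022-01-01'
  AND OrderDate <  '2023-01-01';
```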
- Would prefer someone who has Databricks experience
- Will only be responsible for managing the gold layer of the data lake
- Does not need extensive Databricks experience - just exposure
- If they are light on Databricks, they need heavy SQL Server experience
- Current team does all of its reporting on the current data lake, which is built on SQL Server
- Although they are migrating from SQL Server to Databricks, they still need someone to maintain the warehouse during the migration and afterward, until around April/May 2023
- Right now Deloitte is dependent on them, and there is a bidirectional data feed that will need to be maintained until Deloitte switches to Databricks