Design, develop, and maintain ETL/ELT pipelines using dbt, Python, and GCP tools, ensuring efficient data flow and transformation.
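For illustration, the extract-transform-load flow named above can be sketched in Python. This is a minimal, hypothetical example (the source rows, column names, and the `load` stub are all invented for the sketch); a real pipeline would pull from an API or GCS bucket and write to the warehouse:

```python
def extract():
    # Hypothetical source rows; a real pipeline would read from an API or GCS.
    return [
        {"order_id": 1, "amount_cents": 1250, "currency": "usd"},
        {"order_id": 2, "amount_cents": 400, "currency": "USD"},
    ]

def transform(rows):
    # Normalize units and casing before loading into the warehouse.
    return [
        {
            "order_id": r["order_id"],
            "amount_usd": r["amount_cents"] / 100,
            "currency": r["currency"].upper(),
        }
        for r in rows
    ]

def load(rows):
    # Stand-in for a warehouse write; returns the number of rows loaded.
    return len(rows)

print(load(transform(extract())))  # prints 2
```

In practice, dbt would own the in-warehouse transformation layer, while Python handles extraction and orchestration.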
Manage and optimize Snowflake data warehouse infrastructure, focusing on performance, scalability, and cost-efficiency.
Collaborate with BI Analysts to understand reporting requirements and ensure timely and accurate delivery of data.
Build and maintain data models, tables, and views in Snowflake to support the reporting needs of the business.
Implement data quality checks and validation processes to ensure the accuracy and reliability of data.
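A data quality check of the kind described might look like the following sketch. The column names and rules (non-null keys, no duplicate IDs, non-negative amounts) are hypothetical examples, not a prescribed standard:

```python
def check_quality(rows, required=("order_id", "amount_usd")):
    """Return a list of human-readable issues; an empty list means the batch passes."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Required columns must be present and non-null.
        for col in required:
            if row.get(col) is None:
                issues.append(f"row {i}: missing {col}")
        # Primary keys must be unique within the batch.
        oid = row.get("order_id")
        if oid in seen_ids:
            issues.append(f"row {i}: duplicate order_id {oid}")
        seen_ids.add(oid)
        # Amounts are expected to be non-negative.
        if (row.get("amount_usd") or 0) < 0:
            issues.append(f"row {i}: negative amount")
    return issues
```

In a dbt-based stack, many of these rules would typically live as schema tests (`unique`, `not_null`) rather than hand-rolled Python.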
Monitor, troubleshoot, and optimize data pipelines to ensure smooth operation and minimal downtime.
Develop automation scripts in Python to streamline data processes and improve efficiency.
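One common automation helper in this vein is a retry wrapper for flaky pipeline steps. The sketch below is a generic pattern with invented parameter names, not a specific library's API:

```python
import time

def with_retries(task, attempts=3, base_delay=1.0):
    """Run task(), retrying with exponential backoff; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return task()
        except Exception:
            if attempt == attempts - 1:
                raise  # exhausted all attempts; surface the error
            time.sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
```

Wrapping extraction or load calls this way keeps transient network or warehouse hiccups from failing an entire run.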
Work with cross-functional teams to integrate various data sources, ensuring data is centralized and easily accessible.
Ensure data security and privacy in line with company policies and regulatory requirements.
Stay up to date with the latest technologies and trends in data engineering and analytics, continually improving existing processes and infrastructure.