Lead the design, development and deployment of data pipelines, storage solutions and analytics tools
Ensure scalability, reliability and security of the platform to meet current and future demands
Oversee the implementation of data lakes, data warehouses and ELT/ETL pipelines on Azure and Snowflake
Develop and review code, ensuring adherence to development best practices and standards
Foster a culture of collaboration, innovation and continuous improvement within the team
Drive the integration of real-time and batch data processing capabilities
Ensure compliance with data governance policies, security and regulatory requirements
Implement robust data quality, lineage and cataloging processes
Build and integrate APIs and backend systems using frameworks like Django, Flask or FastAPI
Develop scripts and tools for data analysis, transformation and automation
Write unit and integration tests and debug code to ensure software quality
Work closely with cross-functional teams including developers, product managers, business analysts and other stakeholders to align the data platform with organizational goals
Translate business requirements into scalable technical solutions