- Collaborate with cross-functional teams to design and implement scalable, reliable systems on Google Cloud Platform, balancing performance, security, and cost-effectiveness.
- Build data ingestion pipelines that extract data from various sources (Azure Blob Storage, Azure SQL, flat files, semi-structured sources, AWS S3) and load it into the data warehouse on GCP.
- Utilize GCP services to build robust and scalable data solutions.
- Design, develop, and maintain data pipelines and implement data architecture on GCP using services such as BigQuery, Cloud Storage, and Cloud Composer.
- Apply tools and technologies for collecting, cleaning, transforming, and modelling data to turn it into useful information.
- Leverage GCP capabilities and services to migrate existing databases to the cloud.
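As a minimal sketch of the ingestion work above, the snippet below flattens a semi-structured JSON record into a flat row ready for loading into a warehouse table. The schema, field names, and sample record are illustrative assumptions, not taken from an actual project.

```python
import json

# Target column order for the warehouse table (illustrative schema).
COLUMNS = ["id", "name", "country", "amount"]

def flatten_record(raw: str) -> dict:
    """Flatten one semi-structured JSON record into a flat row.

    Nested fields (e.g. customer.name, customer.address.country) are
    promoted to top-level columns so the row can be loaded into a
    flat warehouse table as-is.
    """
    rec = json.loads(raw)
    customer = rec.get("customer", {})
    return {
        "id": rec["id"],
        "name": customer.get("name"),
        "country": customer.get("address", {}).get("country"),
        "amount": float(rec.get("amount", 0)),
    }

raw = '{"id": 1, "customer": {"name": "Acme", "address": {"country": "US"}}, "amount": "12.50"}'
row = flatten_record(raw)
```

In a real pipeline, rows produced this way would be written out and loaded into the warehouse by a BigQuery load job.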
- Collaborate with cross-functional teams to understand data requirements and implement scalable solutions.
- Implement and optimize BigQuery tables and complex SQL queries for efficient data retrieval and query performance.
- Experienced in data migration from on-premises databases to BigQuery and in BQ conversion.
- Experienced in building and scheduling data pipelines using Cloud Composer (Airflow) and in data and file transformation using Python.
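A common BigQuery optimization behind the table and query tuning mentioned above is partition pruning: filtering on a table's partitioning column so BigQuery scans only the relevant partitions. The sketch below builds such a query as a string; the project, dataset, table, and column names are hypothetical.

```python
from datetime import date

def daily_sales_query(ds: date, project: str = "my-project",
                      dataset: str = "warehouse") -> str:
    """Build a query that prunes to a single day's partition.

    Assumes a table partitioned on a DATE column `order_date`
    (hypothetical names); filtering on the partition column limits
    the bytes scanned, which drives both cost and query speed.
    """
    return (
        f"SELECT order_id, customer_id, amount "
        f"FROM `{project}.{dataset}.sales` "
        f"WHERE order_date = DATE '{ds.isoformat()}'"
    )

sql = daily_sales_query(date(2024, 1, 15))
```

Building the filter from a typed `date` (rather than interpolating arbitrary strings) keeps the partition predicate well-formed.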
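The Python file-transformation step can be sketched as a plain callable of the kind a Cloud Composer (Airflow) task would run: converting a CSV file to newline-delimited JSON, the row format BigQuery load jobs accept for JSON source files. The sample data is illustrative.

```python
import csv
import io
import json

def csv_to_ndjson(csv_text: str) -> str:
    """Convert CSV text to newline-delimited JSON (NDJSON).

    Each CSV data row becomes one JSON object per line, the layout
    BigQuery expects for JSON load jobs.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return "\n".join(json.dumps(row) for row in reader)

sample = "id,name\n1,alpha\n2,beta"
ndjson = csv_to_ndjson(sample)
```

In Composer this function would typically be wrapped in a `PythonOperator` task, with upstream extract and downstream load tasks wired into the same DAG.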