Master’s in Data Analytics, Information Technology, or Management Information Systems (preferred)
Strong understanding of data fundamentals, knowledge of data engineering, and familiarity with core cloud concepts
Must have solid implementation experience with GCP data storage and processing services such as BigQuery, Dataflow, Bigtable, Dataform, Data Fusion, Cloud Spanner, and Cloud SQL
Must have programming experience with SQL, Python, and Apache Spark
At least 3-5 years of professional experience building data engineering capabilities across analytics portfolios, including at least 2 years on GCP or another cloud-based platform.
Your expertise in one or more of the following areas is highly valued:
Google Cloud Platform, ideally with BigQuery, Cloud Composer, Cloud Data Fusion, Cloud Spanner, and Cloud SQL
Experience with legacy data warehouses (on SQL Server or any relational data warehouse platform)
Experience with our main tools: dbt (Data Build Tool), Terraform/Terragrunt, and Git (CI/CD)
Experience with a testing framework.
Experience with Business Intelligence tools such as Power BI and/or Looker.
What sets you apart:
Experience with complex migrations from legacy data warehousing solutions or on-premises data lakes to GCP
Experience building generic, reusable capabilities, and an understanding of data governance and data quality frameworks.
Experience building real-time ingestion and processing frameworks on GCP.
Adaptability and willingness to learn new technologies and products as the job demands.
Multi-cloud & hybrid cloud experience
Any cloud certification (preference for GCP certifications)
Experience working in the financial services and banking industry