Solid experience with, and understanding of, the considerations involved in large-scale architecture, solution design, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
Create detailed target-state technical, security, data, and operational architecture and design blueprints that incorporate modern data technologies and cloud data services and demonstrate the modernization value proposition.
Minimum of 8 years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools - Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub - including performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP.
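For illustration only, a minimal sketch of how several of the services named above typically compose in practice: an Apache Beam (Python SDK) pipeline that reads events from Cloud Pub/Sub and streams them into BigQuery, runnable locally or on Cloud Dataflow. The project, topic, table, and schema names are placeholder assumptions, not details from this posting.

    # Minimal Beam pipeline sketch: Pub/Sub -> parse JSON -> BigQuery.
    # Placeholder project/topic/table names; swap DirectRunner for
    # DataflowRunner (plus --project, --region, --temp_location) to run
    # the same pipeline on Cloud Dataflow.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


    def run():
        options = PipelineOptions(runner="DirectRunner")
        # Pub/Sub reads require a streaming pipeline.
        options.view_as(StandardOptions).streaming = True

        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/events")  # placeholder topic
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="my-project:analytics.events",        # placeholder table
                    schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )


    if __name__ == "__main__":
        run()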