Active Google Cloud Professional Data Engineer or Professional Cloud Architect certification.
Minimum 8 years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using GCP data and analytics services alongside third-party tools (Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub).
Minimum 5 years of experience performing detailed assessments of current data platforms and crafting strategic migration plans to GCP.
Strong Python development experience (mandatory).
2+ years of data engineering experience with distributed architectures, ETL, EDW, and big data technologies.
Demonstrated knowledge of and experience with Google BigQuery (mandatory).
Experience with Dataproc and Dataflow using Java on GCP.
Experience with serverless data warehousing concepts on Google Cloud.
Experience with DW/BI data modelling frameworks.
Strong understanding of Oracle databases; familiarity with Oracle GoldenGate is highly desirable.
Expertise in Debezium and Apache Flink for change data capture and processing.
Experience working with both structured and unstructured data sources using cloud analytics platforms (e.g., Cloudera, Hadoop).
Experience with data mapping and modelling.
Experience with data analytics tools.
Proven proficiency in one or more programming/scripting languages: Python, JavaScript, Java, R, UNIX shell, PHP, or Ruby.
Experience with Google Cloud services for streaming and batch processing: Cloud Storage, Cloud Dataflow, Dataproc, Cloud Functions, BigQuery, and Bigtable.
Knowledge and proven use of contemporary data mining, cloud computing, and data management tools: Microsoft Azure, AWS, Google Cloud, Hadoop, HDFS, MapR, and Spark.
Bachelor's degree, or equivalent work experience (minimum 12 years).