Roles and Responsibilities
- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data.
- Design and build production-grade data solutions, from ingestion to consumption, using Java/Python.
- Design and optimize data models on GCP using data stores such as BigQuery.
- Optimize data pipelines for performance and cost across large-scale data lakes.
- Write complex, highly optimized queries across large data sets and create data-processing layers.
- Interact closely with Data Engineers to identify the right tools to deliver product features by performing POCs.
- Collaborate as a team player with business stakeholders, BAs, and other Data/ML engineers.
- Research new use cases.
Desired Candidate Profile
- Bachelor's degree in computer science, software/computer engineering, mathematics, or equivalent practical experience.
- Should have 1-3 years of working experience with Java/Python and SQL.
- Should have 1-3 years of working experience with big data technologies such as Apache Flink, Apache Beam, PySpark, or the Spark framework.
- Experience with data warehouses or BigQuery.
- Familiarity with messaging technologies (Kafka) and workflow orchestration tools (Airflow).
- Should have experience with Agile development methodologies.
- Excellent communication skills, strong critical thinking skills, and an ability to pick up new technologies quickly and deliver.
- Google Cloud Professional Data Engineer certification would be a plus.