We are hiring an experienced and highly skilled Senior Data Engineer to join our Data Engineering team. The ideal candidate will have a strong background in Google Cloud Platform (GCP), with expertise in Dataflow, Apache Airflow, Cloud Composer, Dataproc, and BigQuery, plus strong proficiency in Python (Java experience is a plus).
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using GCP Dataflow and Apache Airflow.
- Implement and orchestrate data workflows using Cloud Composer (GCP's managed Apache Airflow service).
- Use Dataproc to process large-scale datasets efficiently.
- Optimize and tune BigQuery queries for maximum performance.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Ensure data quality, integrity, and reliability throughout the data pipeline.
- Stay up-to-date with the latest advancements in GCP and data engineering technologies.
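To give a flavor of the data-quality work described above, here is a minimal sketch of a record-validation step one might run before loading data into BigQuery. It is illustrative only: plain Python with no GCP dependencies, and the field names (`user_id`, `event_ts`) are hypothetical, not part of any actual schema.

```python
REQUIRED_FIELDS = {"user_id", "event_ts"}  # hypothetical schema fields


def validate_records(records):
    """Split records into valid and rejected lists.

    A common data-quality gate before loading rows into a warehouse:
    records missing any required field are set aside with a reason,
    so they can be logged or sent to a dead-letter destination.
    """
    valid, rejected = [], []
    for rec in records:
        missing = sorted(f for f in REQUIRED_FIELDS if rec.get(f) in (None, ""))
        if missing:
            rejected.append({"record": rec, "missing": missing})
        else:
            valid.append(rec)
    return valid, rejected


# Example usage with two sample events, one of which lacks a user_id.
events = [
    {"user_id": "u1", "event_ts": "2024-01-01T00:00:00Z"},
    {"user_id": None, "event_ts": "2024-01-01T00:05:00Z"},
]
good, bad = validate_records(events)
print(len(good), len(bad))  # → 1 1
```

In a real pipeline this kind of check would typically live inside a Dataflow transform or an Airflow task rather than a standalone script.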
Qualifications:
- 6-7 years of experience in data engineering.
- Strong proficiency in Python; experience in Java is a plus.
- Extensive experience with Dataflow, Apache Airflow, Cloud Composer, Dataproc, and BigQuery.
- Solid understanding of SQL and its application in data engineering.
- Hands-on experience with Google Cloud Platform.