Responsibilities:
- As a Senior Data Engineer, you will design and develop big data applications using the latest open-source technologies.
- Work in an offshore, managed-outcome delivery model.
- Develop logical and physical data models for big data platforms.
- Automate workflows using Apache Airflow.
- Create data pipelines using Apache Hive, Apache Spark, Scala, and Apache Kafka.
- Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
- Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.
- Mentor junior engineers on the team.
- Lead daily standups and design reviews.
- Groom and prioritize backlog using JIRA.
- Act as the point of contact for your assigned business domain.

Requirements:
- 8+ years of hands-on experience with developing data warehouse solutions and data products.
- 4+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Scala, and Airflow or another workflow orchestration solution.
- 4+ years of experience with GCP, including GCS, Dataproc, and BigQuery.
- 2+ years of hands-on experience in data modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.
- Experience with programming languages: Python, Java, Scala, etc.
- Experience with scripting languages: Perl, Shell, etc.