Required Skills

Data Engineer: Scrum, Python, Java, Scala

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 28th Mar 2024

JOB DETAIL

Must Have Skills:

Data Engineer: 14+ years of experience

  • Spark: 4-8+ years of experience

  • GCP: 2-5+ years of experience

  • Hive: 8+ years of experience

  • SQL: 8+ years of experience

  • ETL processes / data pipelining: 8+ years of experience

 

Description:

Responsibilities: As a Senior Data Engineer, you will design and develop big data applications using the latest open-source technologies.

•         Comfortable working in an offshore, managed-outcome delivery model.

•         Develop logical and physical data models for big data platforms.

•         Automate workflows using Apache Airflow.

•         Create data pipelines using Apache Hive, Apache Spark, Scala, and Apache Kafka.

•         Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.

•         Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.

•         Mentor junior engineers on the team.

•         Lead daily standups and design reviews.

•         Groom and prioritize backlog using JIRA.

•         Act as the point of contact for your assigned business domain.

Requirements:

•         8+ years of hands-on experience with developing data warehouse solutions and data products.

•         4+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Scala, and Airflow or another workflow orchestration solution.

•         4+ years of experience with GCP: GCS, Dataproc, BigQuery.

•         2+ years of hands-on experience modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.

•         Experience with programming languages: Python, Java, Scala, etc.

•         Experience with scripting languages: Perl, Shell, etc.

•         Experience working with, processing, and managing large data sets (multi-TB/PB scale).

•         Exposure to test-driven development and automated testing frameworks.

•         Background in Scrum/Agile development methodologies.

•         Capable of delivering on multiple competing priorities with little supervision.

•         Excellent verbal and written communication skills.

•         Bachelor's Degree in computer science or equivalent experience.

 

The most successful candidates will also have experience in the following:

•         Gitflow

•         Atlassian products: Bitbucket, JIRA, Confluence, etc.

•         Continuous integration tools such as Bamboo, Jenkins, or TFS

Location: This position will be based in Sunnyvale, CA.

 

Company Information