Required Skills

Apache Hive, Apache Spark, Scala, Apache Kafka

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 9th Oct 2023

JOB DETAIL

Requirements:
• 8+ years of hands-on experience developing data warehouse solutions and data products.
• 4+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Scala, and Airflow or another workflow orchestration solution.
• 4+ years of experience with GCP, including GCS, Dataproc, and BigQuery.
• 2+ years of hands-on experience in data modeling (Erwin) and designing schemas for data lakes or RDBMS platforms.
• Experience with programming languages: Python, Java, Scala, etc.
• Experience with scripting languages: Perl, Shell, etc.
• Experience working with, processing, and managing large data sets (multi-TB/PB scale).
• Exposure to test-driven development and automated testing frameworks.
• Background in Scrum/Agile development methodologies.
• Capable of delivering on multiple competing priorities with little supervision.
• Excellent verbal and written communication skills.
• Bachelor's degree in Computer Science or equivalent experience.
The most successful candidates will also have experience in the following:
• Gitflow
• Atlassian products: Bitbucket, JIRA, Confluence, etc.
• Continuous Integration tools such as Bamboo, Jenkins, or TFS

Company Information