Required Skills

Big Data

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

Employment Type

  • Consulting/Contract

Education Qualification

  • UG:

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 22nd Nov 2022

JOB DETAIL

1. Airflow - Orchestration Engine

2. Python and SQL

3. Redshift (nice to have)

4. Big Data - Hive, Spark, and Presto

5. Data Ingestion Frameworks experience (nice to have)

6. Must-have skills: Big Data (Hive, Spark, Presto), Airflow, Python, strong SQL skills, EMR

7. Cloud: AWS

8. Nice-to-have skills: Oracle Autonomous Data Warehouse experience

 

Roles & Responsibilities

1. 6+ years of relevant work experience; strong problem-solving and analytical mindset; experience building scalable, real-time, high-performance cloud data lake solutions; experience with relational SQL and scripting languages such as Shell and Python

2. Hands-on coder who understands the concepts behind distributed databases and batch processing

3. Experience building and shipping production data pipelines that source data from a diverse array of sources (Git)

4. Experience with Hadoop and related processing frameworks such as Spark, Hive, and Airflow

5. Experience with source control tools such as GitHub and the related development processes

6. Experience with workflow scheduling and orchestration tools

7. Hands-on experience cleaning, preparing, and optimizing data for ingestion and consumption

8. Work distribution: 50% coordination, 50% technical

9. Will be required to provide on-call support over weekends

 

Company Information