Required Skills

Data Engineer: Python, Scala, SQL, NoSQL, data modeling, data warehousing, Spark

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Feb 2024

JOB DETAIL

Responsibilities

As a data engineer on the DE team, you will apply your strong technical experience to build highly reliable services for managing and orchestrating multi-terabyte-scale data lakes, and to implement a Data Mesh architecture in close collaboration with the Data Architecture/Modeling team. You enjoy working in an agile environment and are able to turn vague requirements into solid solutions. You are motivated by solving challenging problems, where innovation, problem-solving, and creativity are as important as your ability to write code and test cases.

Minimum Qualifications and Expectations:

• At least 3 years (5 or 10 based on level) of professional experience as a software engineer or data engineer

• A BS in Computer Science or equivalent experience

• Strong programming skills (some combination of Python, Java, and Scala)

• Experience writing SQL, structuring data, and data storage practices

• Experience with NoSQL databases such as MongoDB and Cassandra

• Experience with data modeling

• Knowledge of data warehousing concepts

• Experience building data pipelines and microservices

• Experience with Spark, Kafka, Flink, Hive, Airflow, and other streaming and data-pipeline technologies for processing large volumes of streaming data

• Experience working on Amazon Web Services (in particular EMR, Kinesis, Redshift, S3, SQS, and the like)

• Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.

• An open mind to try solutions that may seem impossible at first

It's preferred, but not technically required, that you have:

• Experience building self-service tooling and platforms

• Experience designing and building Data Mesh architecture platforms

• A passion for building and running continuous integration pipelines.

• Experience building pipelines with Databricks and familiarity with its APIs

• Contributions to open-source projects (e.g., operators in Airflow)

Skills:

Python, Scala, SQL, NoSQL, data modeling, data warehousing, Spark, Kafka, Flink, Hive, Airflow, AWS


Additional Skills & Qualifications:

Solid data engineering and AWS experience. The technology details for these two areas are listed in the job description above.

Experience Level:

Expert Level

Company Information