Required Skills

Data Engineer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: ( )

  • Posted: 6th May 2023

JOB DETAIL

  • Apply data and systems engineering principles to develop code spanning the data lifecycle (ingest, transform, consume) end to end, from source to consumption, for operational and analytical workloads, minimizing complexity and maximizing business value.
  • Work as part of an agile scrum team to deliver business value.
  • Participate in design sessions to understand customers' functional needs.
  • Work with solution architect and development team to build quick prototypes leveraging existing or new architecture.
  • Provide the end-to-end flow for a data process and map technical solutions to that process.
  • Develop and deploy code in continuous development pipelines leveraging off-the-shelf and open-source components of Enterprise Data Warehouse, ETL, and Data Management processes adhering to the solution architecture.
  • Perform software analysis, code analysis, requirements analysis, release analysis, and deployment.

 

Experience & Qualifications:

  • Hands-on development experience in distributed, analytical, cloud-based, and/or open-source technologies.
  • At least 10 years of professional experience building software for data ingestion/data movement ETL pipelines for operational and/or analytical systems.
  • Expertise coding and implementing data pipelines in cloud-based data infrastructure, analytical databases, and NoSQL databases (e.g., AWS, Snowflake, MongoDB, Postgres).
  • Hands-on programming experience in Python, Java, and/or SnowSQL.
  • Experience leveraging build and deploy tools (e.g., GitHub, Gradle, Maven, Jenkins).
  • Ability to travel up to 10% of the time, often less.
  • Bachelor’s degree (or higher).
  • Experience implementing software leveraging flow-based pipelines such as NiFi or Airflow, and streaming services such as Kafka.
  • Experience building data pipeline frameworks.

Company Information