Required Skills

Big Data Engineer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 27th Jul 2021

JOB DETAIL

  • Spark Scala developer position with key skills in Spark, Kafka, and big data platform tools, e.g. Cloudera/Hortonworks/Databricks.
  • Experience with Databricks would be a plus.
  • Must have a bachelor’s degree in Computer Science or related IT discipline.
  • Must have at least 8 years of IT development experience.
  • Must have at least 3 years of Scala/Python Spark programming experience.
  • Must have relevant professional experience working with big data toolsets.
  • Must have knowledge of standard software development methodologies such as Agile and Waterfall.
  • Must be willing to flex work hours to support application launches and manage production outages as necessary.


Mandatory Skills:

  • Scala
  • SQL
  • Spark/Spark Streaming
  • Big Data Tool Set
  • Linux
  • Python


Essential Job Functions:

  • Design and develop data ingestion pipelines.
  • Perform data migration and conversion activities.
  • Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns, taking into account critical performance characteristics and security measures.
  • Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).

  • Perform end-to-end automation of ETL processes for various datasets ingested into the big data platform.

Company Information