Required Skills

SQL

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 27th Jan 2024

Job Detail

The successful candidate will join the Enterprise Data Warehouse team. The engineer will have professional experience in data integration, transformation, and developing data pipelines. This individual will need experience pulling API data from different sources/vendors along with internal data, enriching and anonymizing that data, then transforming it into a readable (gold) format for visualizations, and maintaining current data feeds using streaming practices. Strong experience with Databricks, Spark (Python or Scala), and SQL is required, along with a good understanding of streaming, Spark, and architecture concepts. The engineer will follow the medallion architecture (bronze/silver/gold), operating in AWS (S3) and potentially Azure (Blob Storage), with heavy use of Databricks. This is a sole-contributor role, so the candidate must be a self-starter who is confident in their abilities and has strong communication skills.
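For illustration only, the sketch below shows the kind of medallion-style streaming pipeline the paragraph above describes: raw vendor/API data landed as bronze, enriched and anonymized into silver, then aggregated into a gold table for visualization. The S3 paths, schema, and column names are hypothetical assumptions, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw vendor/API JSON from S3 as-is (hypothetical path and schema).
bronze = (
    spark.readStream
    .format("json")
    .schema("customer_id STRING, email STRING, amount DOUBLE, event_ts TIMESTAMP")
    .load("s3://example-bucket/raw/vendor_events/")
)

# Silver: enrich and anonymize -- hash direct identifiers, drop raw PII.
silver = (
    bronze
    .withColumn("customer_hash", F.sha2(F.col("customer_id"), 256))
    .drop("customer_id", "email")
    .withColumn("ingest_date", F.to_date("event_ts"))
)

# Gold: aggregate into a readable shape for downstream visualizations.
gold = (
    silver
    .withWatermark("event_ts", "1 hour")
    .groupBy(F.window("event_ts", "1 hour"), "ingest_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("event_count"))
)

# Keep the gold feed current by writing the stream to a Delta location (hypothetical paths).
query = (
    gold.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/gold_hourly/")
    .start("s3://example-bucket/gold/hourly_sales/")
)
```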

Required Skills and Experience

 

  • Bachelor's Degree in Computer Science, Information Management, or a related field; Master's preferred.
  • 7+ years of relevant experience, or 4+ years with a Master's
  • Databricks experience building data pipelines and workflows is a must
  • Proven record of standalone projects
  • Proficient in programming languages (SQL, and Python or Scala -- PySpark, PySQL)
  • Streaming experience
  • Understanding of cloud infrastructure and technologies, specifically AWS

 

Nice to Have Skills and Experience

 

  • Unity Catalog
  • Kinesis
  • Lambda

Company Information