Required Skills

ETL, Python

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Jan 2024

Job Detail

  • 10+ years of hands-on experience with Azure Databricks; must be well versed with the platform
    • Design and implement scalable, efficient data pipelines to ensure smooth data movement from multiple sources using Azure Databricks
    • Develop scalable, reusable frameworks for ingesting data sets
    • Ensure data quality and integrity throughout the entire data pipeline, implementing robust data validation and cleansing mechanisms
    • Work with event-based/streaming technologies to ingest and process data
    • Develop and maintain data pipelines on Snowflake and Azure platforms; assist in deploying applications with Kubernetes and Docker; contribute to event streaming and processing using Kafka
  • Must have 3+ years of relevant Java development experience with strong Azure knowledge; develop and maintain Java applications using Spring Boot, JBoss, and other technologies
  • Azure Cloud experience: hands-on experience with storage formats, provisioning options, and single-tenant and multi-tenant concepts
  • Big Data experience: proficient in ETL development with Talend, Big Data processing and storage, and MDM implementation for data consistency
  • Python/Spark/Scala development: 5+ years of coding experience
  • Lead the design and implementation of cloud-based data solutions
  • The candidate must be very well versed in cloud technology; Java is a must

Company Information