Required Skills

Databricks, Delta Lake, Hive, HDFS, Sqoop, Kafka, Kerberos, Impala

Work Authorization

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 16th Dec 2024

JOB DETAIL

  • Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.
  • Develop reusable frameworks to reduce development effort, thereby ensuring cost savings for projects.
  • Develop quality code with well-thought-out performance optimizations in place at the development stage.
  • Appetite to learn new technologies and readiness to work on cutting-edge cloud technologies.
  • Work with teams spread across the globe to drive project delivery and recommend development and performance improvements.
  • Build and implement data ingestion and curation processes using big data tools such as Spark (Scala/Python/Java), Databricks, Delta Lake, Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc.
  • Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS, ADF, Databricks, Cosmos DB, and CDP 7.x.
  • Ingest huge volumes of data from various platforms for analytics needs and write high-performance, reliable, and maintainable ETL code.
  • Proficiency and extensive experience with Spark, Scala, Python, and performance tuning is a MUST.
  • Hive database management and performance tuning (partitioning/bucketing) is a MUST; an illustrative sketch follows this list.
  • Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.
  • Strong analytic skills related to working with unstructured datasets
  • Performance tuning and problem-solving skills are a must.
  • Code versioning experience using Bitbucket/Azure DevOps (AzDo). Working knowledge of AzDo pipelines would be a big plus.
  • Monitor performance and advise on any necessary infrastructure changes.
  • Strong experience in building and designing data warehouses and data stores for analytics consumption (real-time as well as batch use cases).
  • Eagerness to learn new technologies on the fly and ship to production
  • Expert in technical program delivery across cross-functional / LOB teams
  • Expert in driving delivery through collaboration in a highly complex, matrixed environment
  • Possesses strong leadership and negotiation skills
  • Excellent communication skills, both written and verbal
  • Ability to interact with senior leadership teams in IT and business.
Preferred:
  • Expertise in Python and experience writing Azure Functions using Python/Node.js
  • Experience using Event Hub for data integrations.
  • Hands-on expertise in implementing analytical data stores on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/Graph API)
  • Experience ingesting data using Azure Data Factory and building complex ETL using Databricks.
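
For context, the sketch below is a minimal PySpark example of the kind of partitioned and bucketed Hive write referenced in the list above. It is illustrative only; all paths, table names, column names, and parameter values are assumptions, not part of the role description.

    # Minimal, illustrative PySpark sketch: ingest raw files, lightly curate
    # them, and write a Hive table partitioned by date and bucketed by a join
    # key. Names and paths are assumptions for illustration only.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("orders-ingestion")      # hypothetical job name
        .enableHiveSupport()              # needed for metastore-backed tables
        .getOrCreate()
    )

    # Raw data landed by an upstream tool (e.g. Sqoop, Kafka, or ADF).
    raw = spark.read.parquet("/data/landing/orders")    # hypothetical path

    # Light curation: derive a partition column and drop rows missing the key.
    curated = (
        raw.withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("order_id").isNotNull())
    )

    # Partitioning by date enables partition pruning for date-bounded queries;
    # bucketing by customer_id can avoid shuffles when joining on that key.
    (
        curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .bucketBy(16, "customer_id")
        .sortBy("customer_id")
        .format("parquet")
        .saveAsTable("analytics.orders_curated")         # hypothetical table
    )

The same ingest-curate-write pattern carries over to Databricks/Delta Lake, though Delta tables typically rely on partitioning plus OPTIMIZE/Z-ordering rather than Hive-style bucketing.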

Company Information