Required Skills

Databricks, Azure Cloud, ADF, Synapse, Unity Catalog

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 17th Dec 2024

Job Details

  • Data Pipeline Development: Design, develop, and migrate robust ETL/ELT pipelines in Azure Data Factory and Databricks to support data ingestion, transformation, and storage.
  • Unity Catalog Implementation & Migration: Implement and manage Unity Catalog within the Databricks environment to organize, secure, and govern data assets. Migrate legacy pipelines, notebooks, and tables to a Unity Catalog-compliant format.
  • Data Modeling: Develop and maintain scalable data models and architectures that support business intelligence and analytics needs.
  • Data Governance: Collaborate with data governance teams to ensure data is properly classified, secured, and audited within Unity Catalog.
  • Performance Optimization: Monitor and optimize the performance of data pipelines and Databricks clusters to ensure cost-effective and efficient data processing.
  • Data Security: Implement and manage role-based access control (RBAC) and data encryption practices to safeguard sensitive data.
  • Collaboration: Work with cross-functional teams to understand data requirements and deliver high-quality data solutions that meet business needs.
  • Automation: Develop automation scripts and tools to streamline data engineering processes and improve productivity.
  • Troubleshooting: Identify and resolve issues related to data pipelines, data quality, and performance bottlenecks.

Company Information