Required Skills

Big Data, AWS

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 25th Mar 2024

JOB DETAIL

Databricks Engineer

 

  • Responsibilities will include designing, implementing, and maintaining the Databricks platform, and providing operational support. Operational support responsibilities include platform setup and configuration, workspace administration, resource monitoring, providing technical support to the data engineering, data science/ML, and application/integration teams, performing restores/recoveries, troubleshooting service issues, determining root causes, and resolving issues. The position will also involve security and change management.

 

  • The position will work closely with the Team Lead, other Databricks Administrators, System Administrators, and Data Engineers/Scientists/Architects/Modelers/Analysts. This position will involve participation in an on-call rotation for 24/7 support.

 

Responsibilities:

 

  • Responsible for the administration, configuration, and optimization of the Databricks platform to enable data analytics, machine learning, and data engineering activities within the organization.
  • Collaborate with the data engineering team to ingest, transform, and orchestrate data.
  • Manage privileges across the entire Databricks account, as well as at the workspace, Unity Catalog, and SQL warehouse levels.
  • Create workspaces, configure cloud resources, view usage data, and manage account identities, settings, and subscriptions.
  • Install, configure, and maintain Databricks clusters and workspaces.
  • Maintain Platform currency with security, compliance, and patching best practices.
  • Monitor and manage cluster performance, resource utilization, platform costs, and troubleshoot issues to ensure optimal performance.
  • Implement and manage access controls and security policies to protect sensitive data.
  • Manage schema data with Unity Catalog: create and configure catalogs, external storage locations, and access permissions.
  • Administer interfaces with Azure AD and AWS.
  • Hadoop administration skills are preferred.
  • Knowledge of Apache Kafka is desirable.

 

Skills & Experience

 

  • 5+ years of production support of the Databricks platform
  • 2+ years of AWS/Azure PaaS experience
  • 2+ years with an automation framework such as Terraform
  • 2+ years of Hadoop administration (Cloudera/Hortonworks)
  • 2+ years of Apache Kafka streaming platform administration

Company Information