Required Skills

Data Architect

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not required

  • PG: Not required

Other Information

  • No. of positions: 1

  • Posted: 20th Sep 2023

Job Detail

  • Leading the architecture and implementation of Spark and Databricks-based ETL frameworks for large-scale enterprise systems.

  • Designing and developing high-throughput data pipelines using Spark Core and streaming technologies.

  • Implementing and enforcing architectural standards and frameworks to ensure a flexible and scalable data environment.

  • Collaborating with cross-functional teams to gather requirements, analyze data, and design effective solutions.

  • Hands-on development of Python-based scripts and applications to support data processing and transformation.

  • Utilizing tools such as Apache Airflow, Azure Data Factory, and Change Data Capture (CDC) for orchestrating and managing data workflows.

  • Playing a key role in DevOps activities, including deployment of Spark jobs and infrastructure setup.

  • Providing mentorship and technical guidance to junior team members.

  • Staying current with the latest industry trends and technologies in data engineering and analytics.

 

Qualifications:

  • Bachelor's or master's degree in computer science, information technology, or a related field.

  • 5+ years of hands-on IT experience, with a strong focus on ETL and Python technologies.

  • Proven expertise in designing and implementing data solutions using Spark and Databricks.

  • Extensive experience with Spark Core and streaming development.

  • Solid understanding of Python programming for data manipulation and transformation.

  • Hands-on experience with Databricks Workflows, Delta Live Tables, and Unity Catalog.

  • Proven ability to troubleshoot and optimize Spark queries for analytics and business intelligence use cases.

  • Proficiency in using Apache Airflow, Azure Data Factory, and Change Data Capture (CDC) for data orchestration.

  • Basic knowledge of DevOps practices for managing Spark job deployments.

  • A Databricks certification in advanced data engineering is a plus.

  • Strong problem-solving skills and the ability to work effectively in a collaborative team environment.

  • Excellent communication and interpersonal skills.

Company Information