Required Skills

Data Architect

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Feb 2024

JOB DETAIL

The top skills are Databricks, Spark, Python, and PySpark.

This person will be in charge of architecting the Databricks platform for the client's Logistics Platform. They will also need to be hands-on with coding, as they will be making code changes and recommending changes to the architecture.

Client is searching for a Senior Data Architect at their Dallas, TX HQ!

Client is at the forefront of expanding data infrastructure and analytics capabilities. In the role of Senior Data Architect, you will be instrumental in architecting, implementing, and optimizing our data processing and analytics platform based on Databricks for our customer-facing Logistics Platform (LP). This role requires a collaborative mindset to work with cross-functional teams, understand business requirements, and ensure the seamless integration of Databricks within our technology stack.

What you'll do:

• Develop and oversee a comprehensive data architecture, aligning with business goals and integrating technologies such as Azure, Databricks, and Palantir to craft a forward-looking data management and analytics landscape

• Lead the design of enterprise-grade data platforms addressing needs across Data Engineering, Data Science, and Data Analysis, capitalizing on the capabilities of Azure Databricks

• Architect, develop, and document scalable data architecture patterns, ETL frameworks, and governance policies, adhering to Databricks’ best practices to support future and unknown use cases with minimal rework

• Define cloud data standards and DevOps and Continuous Integration/Continuous Delivery (CI/CD) processes, and help expand the corporate metadata repository

• Offer hands-on technical guidance and leadership across teams, driving the development of KPIs for effective platform cost management and the creation of repeatable data patterns for data integrity and governance

• Direct the strategic implementation of Databricks-based solutions, aligning them with business objectives and data governance standards while optimizing performance and efficiency

• Promote a culture of teamwork, leading evaluations of design, code, data assets, and security features, and working with key external data providers like Databricks and Microsoft to follow best practices. Create and deliver training materials, such as data flow diagrams, conceptual diagrams, UML (Unified Modeling Language) diagrams, and ER diagrams, to explain data model meaning and usage clearly to a diverse audience of technical and non-technical users

• Communicate complex technical concepts effectively to both technical and non-technical stakeholders, ensuring clear understanding and alignment across the organization

• Implement robust audit and monitoring solutions, design effective security controls, and collaborate closely with operations teams to ensure data platform stability and reliability

What you'll need:

• Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field

• 8+ years of experience in technical roles with expertise in Software/Data Engineering, Development Tools, and Data Applications Engineering

• Proficiency in SQL, Python, Scala, or Java. Experience with big data technologies (e.g., Spark, Hadoop, Kafka), MPP databases, and cloud infrastructure

• Strong background in data modeling, ETL/ELT workloads, and enterprise data architecture on platforms like Azure Databricks

• Experience with data governance tools, BI tools (Tableau, Power BI), version control systems, and CI/CD tools

• Relevant certifications in Databricks, cloud technologies (AWS or Azure), or related fields are a plus

Company Information