Required Skills

Data Architect

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 14th Jul 2025

JOB DETAIL

•     10+ years - Enterprise Data Management

•     10+ years - SQL Server-based development of large datasets

•     5+ years of experience in data warehouse architecture, with hands-on experience on the Databricks platform and extensive PySpark coding experience; Snowflake experience is good to have

•     3+ years of Python (NumPy, Pandas) coding experience

•     Experience in Data warehousing - OLTP, OLAP, Dimensions, Facts, and Data modeling

•     Good knowledge of Azure Cloud and services such as ADF, Active Directory, App Services, ADLS, etc.

•     Hands-on experience with CI/CD pipeline implementations

•     Previous experience leading an enterprise-wide Cloud Data Platform migration with strong architectural and design skills

•     Experience with Snowflake utilities such as SnowSQL and Snowpipe (good to have)

•     Capable of discussing enterprise-level services independent of the technology stack

•     Experience with Cloud-based data architectures, messaging, and analytics

•     Superior communication skills

•     Cloud certification(s)

•     Any experience with reporting is a plus

•     Excellent written and verbal communication, intellectual curiosity, a passion for understanding and solving problems, and a consulting and customer-service mindset

•     Structured and conceptual mindset coupled with strong quantitative and analytical problem-solving aptitude

•     Exceptional interpersonal and collaboration skills within a team environment  

Responsibilities:

•     Migrate AbInitio graphs to DBT jobs; design, develop, and deploy DBT jobs to process and analyze large volumes of data.

•     Collaborate with data engineers and data scientists to understand data requirements and implement appropriate data processing pipelines.    

•     Optimize DBT jobs for performance and scalability to handle big data workloads.  

•     Implement best practices for data management, security, and governance within the Databricks environment; design and develop Enterprise Data Warehouse solutions.

•     Demonstrate proficiency with data analytics and data insights.

•     Write SQL queries and programs, including stored procedures, and reverse-engineer existing processes.

•     Leverage SQL, a programming language (Python or similar), and/or ETL tools (Azure Data Factory, Databricks, Talend, and SnowSQL) to develop data pipeline solutions that ingest and exploit new and existing data sources.

•     Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.

•     Optimize Databricks jobs for performance and scalability to handle big data workloads.

Company Information