Our client is currently looking for an Azure Cloud Data Architect to join their growing Data Mesh team. This is a critical role that will provide cloud deployment expertise for an enterprise-wide data and digital transformation initiative and deliver next-generation data solutions.
This role supports the Data Architect, the technology leader who brings experience with SaaS/cloud solutions (Azure), data engineering techniques, DevSecOps automation, and best practices in building, maintaining, and enhancing state-of-the-art big data platforms.
Responsibilities:
- Work with the Data Architect to migrate legacy systems to the Azure cloud for an enterprise-scale big data ecosystem; assess the existing data architecture and collaborate with the core mesh team to continuously enhance and evolve the data tech stack.
- Promote data mesh technology to broad user groups in both technology and business organizations, and partner with key stakeholders to assist with defining the data and analytics strategies.
- Work alongside the Data Architect on deploying the data mesh architecture and development recommendations, and stay hands-on in developing and maintaining the framework, cookbook, and bootstrap code for data infrastructure as a platform.
- Partner with the Data Governance team to ensure the completeness and accuracy of metadata collections deployed in the Azure cloud environment.
- Evaluate, prototype, and recommend emerging cloud data technologies and platforms, open source or vendor-provided, to deploy and host in the Azure cloud environment.
- Communicate complex technical topics to non-technical business audiences and senior executives, and assist the Data Architect with scoping and architecting Azure cloud data solutions.
Required:
- At least 8 years of experience developing data solutions and services in an enterprise and/or cloud environment.
- Deep understanding of data warehouse and data lake architectures and data management processes.
- Hands-on expertise with multiple modern cloud storage and database services such as Snowflake, Azure SQL, ADLS2, Azure Cosmos DB, MongoDB, CockroachDB, DataStax, or Redis.
- Working experience with ETL/analytics tools such as Talend, NiFi, dbt, Databricks, or ADF.
- Knowledge of on-prem/cloud data virtualization tools such as Tibco DV, Denodo, Starburst, or Dremio.
- Skilled in a variety of programming languages such as Java or Python.
- Experience in writing scripts for Linux shell and Windows PowerShell.
- Strong organization and communication skills.
- Bachelor's degree in Computer Science or related discipline.