Experience with Snowflake sizing, query performance tuning, zero-copy cloning, and Time Travel, and an understanding of how to use these features
Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns
Experience with data security, data access controls, and their design
Should have experience with ELT or ETL tools such as Informatica, Talend, or Matillion (preferred)
Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
Build processes supporting data transformation, data structures, metadata, dependency and workload management
Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
Resolve a wide range of complex data pipeline problems, both proactively and as issues surface
Must have expertise in AWS or Azure Platform as a Service (PaaS).
Certified Snowflake Cloud Data Warehouse Architect (desirable).
Should be able to troubleshoot problems across infrastructure, platform and application domains.
Must have experience with Agile development methodologies
Strong communication skills; effective and persuasive in both written and oral communication
As a Specialist Solutions Architect (SSA) in Big Data / Data Engineering, you will guide customers in building big data solutions on Databricks that span a wide variety of use cases
This is a customer-facing role in which you will work with and support the Solution Architects; it requires hands-on production experience with Apache Spark and expertise in other data technologies. SSAs help customers through the design and successful implementation of essential workloads while aligning their technical roadmap for expanding the