· Design the overall data architecture, ensuring that Snowflake's features (e.g., data sharing, scalability, and secure data exchange) are fully leveraged to meet business requirements.
· Create a blueprint for how data will be stored, processed, and accessed within the Snowflake platform.
· Optimize data pipelines and workflows for performance, scalability, and cost efficiency.
· Design ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes, and optimize queries and data storage strategies.
· Integrate with other cloud services (e.g., AWS, Azure, GCP), third-party tools, and on-premises data systems.
· Design and implement strategies to control access to sensitive data, applying encryption, role-based access control, and data masking as necessary.
· Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand their requirements and ensure the Snowflake environment meets those needs.
· Monitor the performance of the Snowflake environment, identify bottlenecks, and ensure optimal query performance.
· Automate administrative tasks using Snowflake SQL and scripting languages such as Python or shell.
· Perform data loading using bulk loading with COPY INTO, Snowpipe for continuous ingestion, and external tables.
· Use Snowflake's zero-copy cloning capabilities for databases and schemas.
· Configure and manage Snowflake virtual warehouses, including scaling, resizing, and auto-suspend/auto-resume settings.
· Implement roles and privileges for managing secure access using Snowflake RBAC (Role-Based Access Control).
· Integrate Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management.
· Configure alerts and monitor data pipeline failures, resource spikes, and cost thresholds.
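The data-loading bullet above can be sketched in Snowflake SQL. This is a minimal illustration, not a production pipeline; the stage, table, and pipe names (`@my_s3_stage`, `raw.events`, `raw.events_pipe`) are hypothetical:

```sql
-- Bulk load JSON files from an external stage (hypothetical names):
COPY INTO raw.events
  FROM @my_s3_stage/events/
  FILE_FORMAT = (TYPE = 'JSON')
  ON_ERROR = 'CONTINUE';

-- Continuous ingestion of newly arriving files with Snowpipe:
CREATE PIPE raw.events_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.events
    FROM @my_s3_stage/events/
    FILE_FORMAT = (TYPE = 'JSON');
```

With `AUTO_INGEST = TRUE`, Snowpipe relies on cloud-storage event notifications (e.g., S3 event messages) to trigger loads, which must be configured on the storage side.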
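Zero-copy cloning, mentioned in the cloning bullet, duplicates metadata rather than data, so clones are near-instant and initially consume no extra storage. A brief sketch with hypothetical object names (`analytics`, `analytics_dev`, `analytics.reporting`):

```sql
-- Clone an entire database for a development environment:
CREATE DATABASE analytics_dev CLONE analytics;

-- Clone a schema as of a past point in time using Time Travel:
CREATE SCHEMA analytics.reporting_backup
  CLONE analytics.reporting
  AT (OFFSET => -3600);  -- schema state one hour ago
```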
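The virtual-warehouse bullet covers sizing, auto-suspend/auto-resume, and scaling. A minimal configuration sketch (warehouse name `etl_wh` is hypothetical; multi-cluster settings require Enterprise edition or higher):

```sql
CREATE WAREHOUSE IF NOT EXISTS etl_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60          -- suspend after 60 seconds idle to save credits
  AUTO_RESUME = TRUE         -- resume automatically when a query arrives
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3;     -- multi-cluster scale-out for concurrency

-- Resize on demand, e.g., for a heavy backfill job:
ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'X-LARGE';
```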
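The RBAC bullet typically translates into creating functional roles, granting object privileges, and assigning roles to users. A hedged sketch with hypothetical role, schema, and user names:

```sql
-- Functional role for read-only reporting access (names are illustrative):
CREATE ROLE IF NOT EXISTS analyst;

GRANT USAGE ON DATABASE analytics TO ROLE analyst;
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.reporting TO ROLE analyst;

-- Assign the role to a user:
GRANT ROLE analyst TO USER jdoe;
```

Granting on `FUTURE TABLES` keeps the role's access consistent as new tables are created in the schema.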
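Data masking, noted in the sensitive-data bullet, is implemented in Snowflake with dynamic masking policies. A minimal example (policy, role, and table names are hypothetical):

```sql
-- Show cleartext only to an authorized role; mask for everyone else:
CREATE MASKING POLICY pii.email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
    ELSE '***MASKED***'
  END;

-- Attach the policy to a column:
ALTER TABLE crm.customers MODIFY COLUMN email
  SET MASKING POLICY pii.email_mask;
```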
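The alerting bullet can be realized with Snowflake's native `CREATE ALERT`. A sketch that flags a credit spike, assuming a pre-configured notification integration (`ops_email_int`) and warehouse (`ops_wh`), both hypothetical; the 10-credit threshold is illustrative:

```sql
CREATE ALERT ops.credit_spike_alert
  WAREHOUSE = ops_wh
  SCHEDULE = '60 MINUTE'
  IF (EXISTS (
    SELECT 1
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time > DATEADD('hour', -1, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    HAVING SUM(credits_used) > 10
  ))
  THEN CALL SYSTEM$SEND_EMAIL(
    'ops_email_int',
    'ops@example.com',
    'Snowflake credit spike',
    'A warehouse exceeded the hourly credit threshold.');
```

Note that `ACCOUNT_USAGE` views have ingestion latency (up to a few hours for some views), so near-real-time alerting may instead query `INFORMATION_SCHEMA` table functions.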