Primary Tech skills – SAS, Databricks, PySpark
Secondary Tech skills – Azure, Synapse
- Experience building end-to-end architectures for Data Lakes, Data Warehouses, and Data Marts
- Experience with DWH, Data Integration, Cloud, Architecture, Design, and Data Modelling
- Experience with SAS Base, SAS Macros, SAS SQL, and SAS Enterprise Guide (EG)
- Hands-on experience in PySpark is a must.
- Experience working with structured and unstructured data.
- Extensive knowledge of ETL and Data Warehousing concepts, strategies, and methodologies
- Experience with data ingestion, preparation, integration, and operationalization techniques to optimally address data requirements
- Experience with relational data processing technologies such as MS SQL Server, Delta Lake, and Spark SQL
- Experience with orchestration tools and GitHub
- Ability to own end-to-end development, including coding, testing, debugging, and deployment
- Must be team oriented, with strong communication, collaboration, prioritization, and adaptability skills
- Able to perform both as an individual contributor and as a lead