- UG :- Not Required
- PG :- Not Required
- No. of positions :- 1
- Posted :- 25th Jul 2022
- Job Description: Understanding of data lake and data warehousing concepts, e.g., dimensional modeling, star and snowflake schemas, data catalogs, etc.
- Deep understanding of Snowflake with excellent knowledge of SQL, PL/SQL, Tasks, Streams, and Snowpipe.
- Expertise in designing, building, and deploying data processing pipelines on the AWS cloud using Python/Unix scripting in a Linux environment.
- Working experience with AWS data services such as S3, Glue, Athena, Lambda, SNS, SQS, RDS, and Redshift.
- Knowledge of open-source orchestration tools such as Apache Airflow (a minimal sketch follows this list).
- Understanding of cloud-native file formats, e.g., Parquet, Avro, and ORC.
- AWS and/or Snowflake certification is a plus.
- Financial services industry experience is a plus.
- Ability to work with shifting priorities in an Agile environment.
- Excellent communication skills.
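
For illustration only, a minimal sketch of the kind of pipeline this role describes: an Airflow DAG that checks an S3 landing prefix for Parquet files and loads them into Snowflake with a COPY INTO statement. The bucket, prefix, stage, table, and connection values below are hypothetical placeholders, not details from this listing.

```python
# A minimal sketch only, not this employer's actual pipeline.
# It chains two tasks: discover Parquet files in an S3 landing prefix,
# then load them into Snowflake via COPY INTO from an external stage.
# All bucket, stage, table, and connection values are placeholders.
from datetime import datetime

import boto3
import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator

S3_BUCKET = "example-data-lake"   # hypothetical bucket
S3_PREFIX = "landing/orders/"     # hypothetical prefix


def list_new_parquet_files(**_):
    """Return the Parquet object keys currently sitting in the landing prefix."""
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=S3_BUCKET, Prefix=S3_PREFIX)
    return [o["Key"] for o in resp.get("Contents", []) if o["Key"].endswith(".parquet")]


def copy_into_snowflake(**_):
    """Load staged Parquet data into a Snowflake table; all names are assumptions."""
    conn = snowflake.connector.connect(
        account="xy12345", user="etl_user", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
    )
    try:
        conn.cursor().execute(
            "COPY INTO RAW.ORDERS FROM @ORDERS_STAGE FILE_FORMAT = (TYPE = PARQUET)"
        )
    finally:
        conn.close()


with DAG(
    dag_id="s3_to_snowflake_orders",
    start_date=datetime(2022, 7, 25),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    discover = PythonOperator(task_id="list_new_parquet_files",
                              python_callable=list_new_parquet_files)
    load = PythonOperator(task_id="copy_into_snowflake",
                          python_callable=copy_into_snowflake)
    discover >> load
```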