- Extensively worked on data extraction, transformation, and loading (ETL) from source to target systems using BTEQ, FastLoad, and MultiLoad.
- In-depth understanding and usage of Teradata; proficient in Teradata SQL, stored procedures, macros, views, and indexes.
- Designed and developed Unix shell scripts and ETL processes for loading data into the data warehouse.
- Developed scripts for loading data into the target data warehouse.
- Tuned the performance of Teradata queries against high volumes of data.
- Developed BTEQ scripts to load data into Teradata (see the load sketch after this list).
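
A minimal sketch of the kind of load these bullets describe, assuming the open-source `teradatasql` Python driver and a hypothetical `stage_sales` staging table; the host, credentials, and CSV layout are placeholders, not details taken from the resume:

```python
# A minimal load sketch, assuming the open-source teradatasql driver
# (pip install teradatasql). The host, credentials, CSV layout, and the
# stage_sales table are illustrative placeholders, not resume details.
import csv

import teradatasql

def load_csv_to_stage(path: str) -> None:
    with teradatasql.connect(host="tdhost.example.com",
                             user="etl_user",
                             password="***") as con:
        with con.cursor() as cur:
            with open(path, newline="") as f:
                rows = [(int(r["id"]), float(r["amount"]))
                        for r in csv.DictReader(f)]
            # executemany sends the rows as batched bind values, playing
            # roughly the role of a BTEQ .IMPORT / USING flat-file load.
            cur.executemany(
                "INSERT INTO stage_sales (id, amount) VALUES (?, ?)",
                rows,
            )

if __name__ == "__main__":
    load_csv_to_stage("sales.csv")
```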
Candidates will also need 2+ years’ experience with the following:
- Working knowledge of GCP and the ability to design and develop optimized data pipelines for batch and real-time data processing (two sketches follow this list).
- Experience using GitHub and leveraging CI/CD pipelines.
- Experience with GCP services such as BigQuery, Composer, Dataproc, and scheduling tools (Tidal or Airflow).
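
As an illustration of the batch side of such a pipeline, here is a sketch that loads CSV files from Cloud Storage into BigQuery with the `google-cloud-bigquery` client; the bucket, project, dataset, and table names are assumptions made for the example:

```python
# A minimal batch-load sketch with the google-cloud-bigquery client
# (pip install google-cloud-bigquery). The bucket, project, dataset,
# and table names are assumptions made for the example.
from google.cloud import bigquery

def load_gcs_to_bq() -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the CSV header row
        autodetect=True,      # infer the schema from the data
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    load_job = client.load_table_from_uri(
        "gs://example-bucket/sales/*.csv",
        "example-project.sales_ds.stage_sales",
        job_config=job_config,
    )
    load_job.result()  # block until the BigQuery load job finishes

if __name__ == "__main__":
    load_gcs_to_bq()
```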
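And for the scheduling side, a minimal Airflow 2.x DAG (the engine behind Cloud Composer) that kicks off the load daily; the DAG id, cron schedule, and script path are likewise illustrative:

```python
# A minimal scheduling sketch: an Airflow 2.x DAG (the engine behind
# Cloud Composer) that runs the batch load once a day. The DAG id,
# cron expression, and script path are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    BashOperator(
        task_id="load_gcs_to_bq",
        bash_command="python /opt/pipelines/load_gcs_to_bq.py",
    )
```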