UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 24th Dec 2021
As part of the Data Integration team, you will be tasked with provisioning new datasets for analytic and operational use cases, as well as with changes to existing loads. As an ETL developer, you will build and manage Extract, Transform, and Load processes and implement technical solutions. This role works directly with a diverse array of stakeholders, from IS to analytics teams across the firm. Our main technology platform consists of AWS, Snowflake, SQL/ETL, Airflow, Jira, and Python.
• Interact directly with requestors from multiple divisions of the firm to understand their documented data requirements
• Recommend an ETL design based on the requirements of the specific use case and provide accurate estimates of effort
• Write complex SQL to extract, transform, and load data from multiple sources into Snowflake (see the sketch after this list)
• Partner with RDBMS DBAs to understand the source data structures and design of the target structures
• Use the appropriate tools and frameworks available to develop the data acquisition and ingestion process in accordance with the approved design
• Leverage workflow tools to maintain accurate status of assigned tasks (JIRA, etc.)
• Ensure the necessary data validation steps are included to ensure completeness and accuracy of data
• Perform initial validation of the process and inspection of the data
• Utilize the deployment frameworks available to move artifacts from development to test to production environments
• Ensure jobs are scheduled to run on a frequency consistent with stakeholder requirements
• Ensure role-based access is established on target tables
• Support jobs once they are deployed in production to ensure SLAs are met for data consumers
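
The following is a minimal sketch of the kind of scheduled load described above, assuming Airflow 2.x with the apache-airflow-providers-snowflake package installed; every name in it (DAG id, connection id, schemas, tables, role) is a hypothetical placeholder, not part of this posting. It ties together several of the responsibilities listed: a SQL-based load into Snowflake, a completeness check, a role-based access grant, and a run frequency agreed with stakeholders.

    # Minimal sketch, assuming Airflow 2.x and the Snowflake provider package.
    # All identifiers below are hypothetical placeholders.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    default_args = {
        "owner": "data-integration",
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
    }

    with DAG(
        dag_id="orders_daily_load",          # hypothetical dataset
        default_args=default_args,
        start_date=datetime(2021, 12, 1),
        schedule_interval="@daily",          # frequency agreed with stakeholders
        catchup=False,
    ) as dag:
        # ETL step: SQL pushed down to Snowflake; {{ ds }} is Airflow's run date.
        load = SnowflakeOperator(
            task_id="load_orders",
            snowflake_conn_id="snowflake_default",
            sql="""
                INSERT INTO analytics.orders_fact
                SELECT order_id, customer_id, order_ts, amount
                FROM staging.orders_raw
                WHERE order_ts::date = '{{ ds }}'
            """,
        )

        # Completeness check: division by zero fails the task if no rows landed.
        validate = SnowflakeOperator(
            task_id="validate_orders",
            snowflake_conn_id="snowflake_default",
            sql="SELECT 1 / COUNT(*) FROM analytics.orders_fact WHERE order_ts::date = '{{ ds }}'",
        )

        # Role-based access on the target table for downstream consumers.
        grant = SnowflakeOperator(
            task_id="grant_access",
            snowflake_conn_id="snowflake_default",
            sql="GRANT SELECT ON TABLE analytics.orders_fact TO ROLE analyst_role",
        )

        load >> validate >> grant

In practice the GRANT would usually be applied once through the deployment framework rather than on every run; it is included here only to show the access step alongside the load and validation.
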
• 3 years of experience in an ETL Developer role using SQL
• Bachelor’s degree in Computer Science, Information Systems, or another applicable field is preferred
• Advanced working knowledge of data acquisition frameworks and ETL development
• Advanced coding and performance-tuning experience
• Experience building solutions using AWS, Snowflake, and Python
• Experience with Airflow or other orchestration platforms
• Linux/UNIX platform experience
• Experience working with different databases and platforms, e.g. SQL Server, Teradata, NoSQL, Hadoop, etc.