UG :- Not Required
PG :- Not Required
No. of positions :- 1
Posted :- 24th Dec 2021
As part of the Data Integration team, you will be tasked with provisioning new datasets for analytic and operational use cases, as well as making changes to existing loads. As an ETL developer you will build and manage the Extract, Transform, and Load processes, implementing technical solutions. This role will meet with a diverse array of stakeholders, ranging from IS to our analytics stakeholders across the firm. Our main technology platform consists of AWS, Snowflake, SQL/ETL, Airflow, Jira and Python.
• Interact directly with requestors from multiple divisions of the firm to understand their documented data requirements
• Recommend an ETL design based on the requirements of the specific use case and provide accurate estimates of effort
• Write complex SQL to perform the ETL from multiple data sources to be loaded into Snowflake
• Partner with RDBMS DBAs to understand the source data structures and design of the target structures
• Use the appropriate tools and frameworks available to develop the data acquisition and ingestion process in accordance with the approved design
• Leverage workflow tools (JIRA, etc.) to maintain accurate status of assigned tasks
• Include the necessary data validation steps to ensure completeness and accuracy of data
• Perform initial validation of the process and inspection of the data
• Utilize the deployment frameworks available to move artifacts from development to test to production environments
• Ensure jobs are scheduled to run on a frequency consistent with stakeholder requirements
• Ensure role-based access is established on target tables
• Support ETL jobs once deployed in production to ensure SLAs are met for data consumers
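The responsibilities above follow a common extract → transform → validate → load pattern. As a minimal illustrative sketch only (the table and column names are hypothetical, and an in-memory sqlite3 database stands in for the actual Snowflake/RDBMS sources, which would need their respective connectors), one such step with a completeness check might look like:

```python
import sqlite3

def run_etl(src: sqlite3.Connection, tgt: sqlite3.Connection) -> int:
    # Extract: pull rows from the source system (in practice this would be
    # a more complex SQL query joining multiple data sources).
    rows = src.execute("SELECT id, amount FROM orders").fetchall()

    # Transform: e.g. normalize dollar amounts to integer cents.
    transformed = [(rid, int(round(amt * 100))) for rid, amt in rows]

    # Load into the hypothetical target table.
    tgt.executemany(
        "INSERT INTO orders_cents (id, amount_cents) VALUES (?, ?)",
        transformed,
    )
    tgt.commit()

    # Validate: source and target row counts must match (completeness).
    loaded = tgt.execute("SELECT COUNT(*) FROM orders_cents").fetchone()[0]
    assert loaded == len(rows), "row count mismatch after load"
    return loaded

# Stand-in source and target databases for the sketch:
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.99), (2, 4.50)])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders_cents (id INTEGER, amount_cents INTEGER)")

print(run_etl(src, tgt))
```

In a production setting the validation step would typically go beyond row counts (checksums, null-rate checks, referential integrity), and the job itself would be orchestrated and scheduled by a platform such as Airflow rather than run inline.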
• 3 years of experience in an ETL Developer role using SQL
• Bachelor's degree in Computer Science, Information Systems or another applicable field is preferred
• Advanced working knowledge of data acquisition frameworks and ETL development
• Advanced SQL coding experience and performance tuning
• Experience building solutions using AWS, Snowflake, Python
• Experience with Airflow or other orchestration platforms
• Linux/Unix platform experience
• Experience working with different databases and platforms, e.g. SQL Server, Teradata, NoSQL, Hadoop, etc.