Major responsibilities of the role shall include, but are not limited to:
- Developing and maintaining Python- and Airflow-based data platforms (on-premises and cloud)
- Supporting data pipelines across various source and target systems
- Developing, documenting, and following standards and practices for the rest of the firm
- Providing help and support to application teams in their use of the data platforms
- Maintaining and supporting traditional ETL tools and technologies such as Linux shell scripting and Informatica
- Migrating and upgrading data pipelines from traditional tools to the newer data platforms
- Being available in the evenings, as needed, to overlap with US time zones
REQUIREMENTS
- Bachelor's or Master's degree in Computer Science or a related technical field, or equivalent practical experience
- 2-5 years of experience, preferably in the Data Engineering / Data Science domain
- Knowledge of Linux, relational databases, and Python, along with familiarity with testing frameworks (e.g., pytest)
- A good understanding of, or hands-on experience with, Snowflake is highly desired
- Knowledge of cloud-based technologies such as AWS is preferred but not mandatory
- Knowledge of big data technologies such as Hive and Hadoop is preferred but not mandatory