MINIMUM EDUCATION AND RELATED WORK EXPERIENCE:
- Bachelor's degree in a technology field and 5 years of prior IT work experience; OR
- Completion of a Coding/IT Bootcamp and 5 years of prior IT work experience; OR
- 7 years of experience in most phases of IT systems deployments in one or more of the following areas: design and deployment of cloud services, software development, hardware installation, system administration, cyber security, or other functional IT area.
RESPONSIBILITIES:
- Design, build, and maintain scalable and reliable data pipelines and ETL processes.
- Develop and maintain data infrastructure systems, including data warehouses and data lakes.
- Collaborate with data scientists and analysts to understand data requirements, and create mapping documents and data models.
- Ensure data quality and integrity by implementing data validation and cleansing processes.
- Optimize data pipelines and systems for performance and scalability.
- Troubleshoot data pipeline and system issues and provide timely resolution.
- Create and maintain documentation of data architecture, data pipelines, and data systems.
SKILLS AND QUALIFICATIONS:
- Good problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to mentor team members.
- Subject matter expert in specific areas; ability to lead projects.
- Good understanding of writing complex SQL queries.
- Good knowledge of scripting languages such as Python, Node.js, JavaScript, and Java.
- Good knowledge of AWS Cloud Platform services such as AWS S3, AWS Lambda, Glue, Managed Airflow, IAM, Amazon RDS, Amazon Redshift, and MSK.
- Familiarity with Kafka and Apache Airflow.
- Familiarity with creating infrastructure as code using Terraform.