- 10+ years of experience in data architecture, data engineering, or data management.
- Strong experience in building and optimizing data pipelines, architectures, and data sets.
- Proven experience in data modeling, database design, and data integration.
- Experience managing both structured (SQL) and unstructured (NoSQL, Hadoop) data.
- Demonstrated experience in data analysis and delivering data-driven insights to stakeholders.
- Experience with cloud platforms for data management (e.g., AWS, Azure, Google Cloud).
- Knowledge of data governance and compliance frameworks.
Technical Skills:
- Proficiency with ETL tools (e.g., Talend, Informatica) and data pipeline orchestration frameworks (e.g., Apache Airflow, AWS Glue).
- Strong SQL and Python programming skills.
- Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) and big data technologies (e.g., Hadoop, Spark).
- Familiarity with data analysis and visualization tools (e.g., Tableau, Power BI, Looker).
- Experience with advanced analytics and machine learning platforms is a plus.
- Knowledge of data lake architecture and working with real-time streaming data solutions (e.g., Kafka, Kinesis) is highly desirable.
Preferred Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Data Science, Information Systems, or a related field.
- Certifications such as AWS Certified Solutions Architect, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate.
- Strong problem-solving and analytical-thinking skills, with experience delivering strategic data solutions.
- Excellent communication skills, with the ability to engage both technical and non-technical stakeholders.
- Experience with Agile methodologies and data engineering best practices.