- Strong SQL knowledge – able to translate complex scenarios into queries
- Strong programming experience in Java or Python
- Experience with data modeling and mapping
- Experience with Google Cloud Platform (especially BigQuery and Dataflow)
- Experience with the Google Cloud SDK and API scripting
- Experience with Hadoop (Hive, MapReduce, Spark)
- Experience with onsite-offshore coordination
- Experience with test-driven development
- Experience with Agile processes and DevOps methodologies
- Experience with NoSQL databases
- Experience with AWS, Snowflake, or Azure services
- Experience in the retail domain is an added advantage
Technical/Functional Skills:
- Programming Languages – Java / Python
- Google Cloud – BigQuery, Pub/Sub, Dataflow, Composer DAGs, Cloud Storage
- CI/CD – GitHub, Jenkins
Roles & Responsibilities:
- Create and maintain optimal data pipeline architecture replicating the existing process.
- Analyze, explore and select the appropriate tools for data migration and transformation.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
- Keep data separated and secured using masking and encryption.
- Work with data and analytics experts to strive for greater functionality in data systems.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Translate business requirements into well-engineered, tested and deployed application systems that are used by the business.
- Explain requirements to the offshore team and create
Soft Skills:
- Good organizational and problem-solving skills
- Good team player who is self-motivated and well organized
- Strong oral and written communication skills
- Ability to work with remote teams
- Ability to manage project scope