Ability to work individually or mentor a small group in an Agile development environment.
Communicate effectively with global customers and collaborate well within a team environment to drive results.
Embrace new technologies and work with a variety of tools to achieve the desired functionality.
Work on problems of diverse scope, develop solutions to technology challenges, and deliver requirements ahead of deadlines.
Follow standard practices and procedures when analyzing situations or data from which answers can be readily obtained.
Contributing to your BU/Practice by:
Documenting your learnings from current work and engaging with the external tech community by writing blogs, contributing on GitHub and Stack Overflow, and participating in meet-ups/conferences, etc.
Keeping up to date on the latest technologies through technology trainings and certifications.
Actively participating in organization-level activities and events related to learning, formal training, interviewing, special projects, etc.
Qualifications
REQUIRED
Minimum 7 years of experience in the design, development, and deployment of large-scale, distributed environments.
Bachelor’s degree in Computer Science or a related discipline.
Must have been part of at least 2 end-to-end implementation projects and have handled defined modules independently.
Experience developing SAP HANA information views and HANA data models, including consumption of hierarchies and calculation views.
SAP HANA modelling and scripting experience, including the ability to write complex stored procedures, table functions, and HANA views using SQL.
Expert in SQL, with strong data modelling skills for relational and analytical workloads and for performance improvement.
Clear understanding of Snowflake’s architecture and data-sharing concepts.
End-to-end implementation of at least one Snowflake project is a must.
Expertise in Snowflake data modeling, ELT using Snowflake SQL, implementing stored procedures, and standard DWH ETL concepts.
Experience with Snowpipe and SnowSQL development.
Experience working with the data services of at least one cloud platform: AWS, Azure, or Google Cloud.
Knowledge of building REST API endpoints for data consumption.
Knowledge of implementing CI/CD pipelines.
Knowledge of Spark and experience building jobs in Python/Scala/Java is a plus.
Willingness to learn and adapt.
Delivery-focused, with a willingness to work in a fast-paced environment.
Take initiative and be responsible for delivering complex software.
Excellent oral and written communication skills are a must.