US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H1B Work Permit
Corp-Corp
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 12th Apr 2022
· Bachelor's or Master's degree in a technology-related field (e.g., Computer Science, Engineering) required.
· 6+ years of related experience in data engineering, analysis, data warehouses, and data lakes, with specialist understanding of and experience with methodologies such as data warehousing, data visualization, and data integration.
· Strong experience with relational database technologies (Oracle SQL & PL/SQL or a similar RDBMS), preferably including Snowflake or another cloud data warehousing service.
· Strong expertise in all aspects of data movement technologies (ETL/ELT) and experience with schedulers.
· Practical experience delivering and supporting Cloud strategies including migrating legacy products and implementing SaaS integrations.
· Proven ability to understand multi-functional enterprise data, bridge business analytics needs and the underlying data, and work hand-in-hand with other members of technical teams to execute product roadmaps that enable new insights from our data.
· Experience designing and implementing operational data stores, as well as data lakes, in production environments.
· Experience with DevOps, Continuous Integration, and Continuous Delivery, including developing and deploying pipelines; experience deploying within a cloud-native infrastructure would be advantageous.
· Able to work collaboratively with a geographically diverse team.
The Skills you bring
· Proven track record of working in collaborative teams to deliver high-quality data solutions in a multi-developer agile environment, following design and coding best practices.
· Outstanding SQL skills and experience performing deep data analysis on multiple database platforms.
· Ability to develop ELT/ETL pipelines to move data to and from the Snowflake data store using a combination of Python and Snowflake SnowSQL.
· Knowledge and expertise in data modelling techniques and best practices (relational, dimensional), plus any prior experience with data modelling tools (e.g., PowerDesigner).
· Prior experience with data ingestion tool sets (e.g., Apache NiFi, Kafka) is advantageous.
· Experience working with AWS, MS Azure, or other cloud providers; experience with AWS services such as Lambda or S3 and AWS certification are a plus.
· Experience in data architecture (database design, performance optimization).
· Prior experience setting up reliable infrastructure (hardware, scalable data management systems, and frameworks) to perform data-related tasks, particularly with Kafka.
· Understanding of the basics of distributed systems and Kubernetes.
· Strong focus on resiliency and reliability.
· Excellent written and oral communication skills.
· Nice to have: scripting/coding experience in any of the following: Python, Unix shell, Java.
The Value you deliver
· Simplifying and effectively communicating technical challenges, solution options, and recommendations to business partners and technology leadership.
· Providing technical leadership and support on data and solutioning to team members (coaching others to their full potential).
· Producing scalable, resilient, cloud-based system designs aligned with our long-term strategy.
· Collaborating with chapter leads, squad leads, tech leads, and architects to set technical roadmaps.
· Recognizing opportunities to apply emerging technologies to deliver innovative solutions to business challenges.
· Understanding detailed requirements and delivering solutions that meet or exceed customer expectations.
· Taking ownership and accountability.