US Citizen
Green Card
Corp-Corp
Consulting/Contract
UG :- Not Required
PG :- Not Required
No. of positions :- 1
Posted :- 17th Aug 2021
Responsibilities:
Design and develop data pipelines using Python, Kafka, Snowflake, and AWS technologies
Assemble large, complex data sets that meet functional / non-functional business requirements
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Work with stakeholders including the Marketing, CRM, Finance, Operations, Product, and Development teams to assist with data-related technical issues and support their data infrastructure needs
Develop repeatable and scalable data quality audits
This is a hands-on coding role
Required Skills:
Bachelor’s Degree in Software Engineering, Computer Science, MIS or related field
3+ years’ experience with object-oriented Python programming
3+ years’ experience with AWS services (all three required: S3, Lambda, Fargate)
2+ years of advanced SQL knowledge
Experience working with relational databases and query authoring, including advanced SQL: multi-table joins, group functions, subqueries, set operations, functions, and stored procedures
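As a hedged illustration only (the schema, table names, and data below are hypothetical, not from the posting), the query styles listed above, multi-table joins, group functions, and subqueries, often appear combined in a single statement. A minimal sketch using Python's built-in sqlite3 module:

```python
import sqlite3

# Hypothetical toy schema: customers and their orders, in memory.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 40.0);
""")

# Multi-table join + group function (SUM) + subquery in the HAVING clause:
# customers whose total order amount exceeds the overall average order.
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    HAVING SUM(o.amount) > (SELECT AVG(amount) FROM orders)
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('Acme', 350.0)]
```

The same join/aggregate/subquery pattern carries over to Snowflake or any other relational engine; only the connection layer changes.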
Experience with Git
Experience with Docker and Unix environments
Experience with queue and data streaming technologies (one of the following: Kafka, Kinesis, SQS, RabbitMQ)
Experience with Infrastructure as Code (Terraform, CloudFormation, Serverless)
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
Strong understanding of data modeling and data warehousing principles
Excellent documentation and general organization of ongoing tasks, including the ability to evaluate and question business rules