US Citizen
Green Card
Corp-Corp
Consulting/Contract
UG: Not Required
PG: Not Required
No. of Positions: 1
Posted: 11th Oct 2022
Years of Experience: 10
Required Skills: Spark, PySpark, Big Data Management
Nice-to-have Skills: AWS Services, AWS EMR
Design and implement distributed data processing pipelines using Spark, Hive, Python, and other tools and languages prevalent in the Hadoop ecosystem.
Ability to design and implement end-to-end solutions.
Experience publishing RESTful APIs to enable real-time data consumption, using OpenAPI specifications
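As a rough illustration of the OpenAPI-described endpoints mentioned above, here is a minimal OpenAPI 3.0 document expressed as a Python dict; the path, title, and schema are hypothetical examples, not this role's actual API:

```python
# Hypothetical minimal OpenAPI 3.0 document for a real-time data
# consumption endpoint (all names here are illustrative assumptions).
openapi_spec = {
    "openapi": "3.0.3",
    "info": {"title": "Realtime Data API", "version": "1.0.0"},
    "paths": {
        "/events/latest": {
            "get": {
                "summary": "Fetch the most recently processed events",
                "responses": {
                    "200": {
                        "description": "A batch of events",
                        "content": {
                            "application/json": {"schema": {"type": "array"}}
                        },
                    }
                },
            }
        }
    },
}
```

In practice such a document is typically served as JSON/YAML alongside the API so consumers can generate clients against it.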
Roles & Responsibilities
• Experience with open-source NoSQL technologies such as HBase, DynamoDB, Cassandra
• Familiarity with distributed stream-processing frameworks for fast and big data, such as Apache Spark, Flink, and Kafka Streams
• Build utilities, user defined functions, and frameworks to better enable data flow patterns.
• Work with architecture/engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and adhered to.
• Experience with business rule management systems such as Drools
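The "utilities, user defined functions, and frameworks" responsibility above can be sketched in plain Python, independent of any Spark cluster; the helper, function names, and sample records below are hypothetical:

```python
from functools import reduce
from typing import Callable, Dict, Iterable, Iterator

# A tiny, hypothetical data-flow helper in the spirit of the utilities
# described above: compose per-record transformations into one pipeline.
def pipeline(*steps: Callable[[Dict], Dict]) -> Callable:
    """Compose transformation steps, applied left to right to each record."""
    def run(records: Iterable[Dict]) -> Iterator[Dict]:
        for record in records:
            yield reduce(lambda acc, step: step(acc), steps, record)
    return run

# Example UDF-style transformations (illustrative only).
def normalize_name(r: Dict) -> Dict:
    return {**r, "name": r["name"].strip().lower()}

def add_valid_flag(r: Dict) -> Dict:
    return {**r, "valid": bool(r["name"])}

clean = pipeline(normalize_name, add_valid_flag)
rows = list(clean([{"name": "  Alice "}]))
# rows == [{"name": "alice", "valid": True}]
```

The same composition pattern carries over to Spark, where each step would typically become a registered UDF or a DataFrame transformation.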