US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H1B Work Permit
Corp-Corp
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 8th Apr 2022
Your future duties and responsibilities
• Design the architecture for migrating the on-premises data warehouse and data marts to a data lake, and for subsequent data vending on the AWS Cloud.
• Architect solutions for the design and implementation of Big & Fast Data infrastructure on the AWS Cloud using Kafka, Kinesis, Glue, Athena, Redshift, DynamoDB, and QuickSight.
• Define information models supporting data assets for complex data structures represented across various data management systems, such as graph, relational, and hierarchical databases.
• Guide other teams to design, develop, and deploy data sets and tools that support product use cases.
Required qualifications to be successful in this role
• 10+ years of relevant experience.
• Must have both AWS data and application experience.
• PySpark/Spark
• SQL skills
• Big Data
• Strong Python or Java skills
• AWS experience
• Database systems (SQL and NoSQL)
• Data warehousing solutions
• ETL tools
• Data APIs
• Understanding of the basics of distributed systems
• Knowledge of algorithms and data structures
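As a minimal sketch of the ETL and SQL skills listed above, the snippet below extracts raw string records, transforms the amounts into numeric values, and loads them into a query-ready table. The table name, column names, and sample rows are illustrative assumptions, not part of the role description; an in-memory SQLite database stands in for a real warehouse.

```python
import sqlite3

# Extract: raw rows as they might arrive from a source system (assumed sample data).
raw = [("2022-04-01", "acme", "120.5"), ("2022-04-02", "acme", "98.0")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, account TEXT, amount REAL)")

# Transform: cast amount strings to floats; Load: insert into the target table.
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(day, account, float(amount)) for day, account, amount in raw],
)

# A downstream consumer can now query the loaded data with plain SQL.
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)
```

In a production pipeline the same extract-transform-load shape would typically be expressed with tools named in this posting (Glue jobs, Spark, Redshift) rather than SQLite.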
Desired Skillset
• Knowledge of Machine Learning concepts such as KNN, Random Forest, Naïve Bayes, and Neural Networks, and experience deploying them on SageMaker, is a plus.
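To illustrate the KNN concept mentioned above, here is a minimal classifier sketch in plain Python: it predicts a label by majority vote among the k nearest training points. The function name, toy data, and choice of Euclidean distance are assumptions for illustration only; real work at this level would use a library such as scikit-learn and deploy on SageMaker.

```python
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (features, label) pairs; features are numeric tuples.
    """
    neighbors = sorted(train, key=lambda pair: dist(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy data: two well-separated clusters.
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (0.5, 0.5)))  # query point near the first cluster
```

The same voting idea underlies library implementations; they add indexing structures (e.g. KD-trees) so the neighbor search scales beyond toy data.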
EDUCATION REQUIREMENTS
Bachelor's degree in Computer Science, Information Systems, or a related field