Required Skills

Big Data on AWS: Hadoop, Spark, Hive, PySpark, Scala

Work Authorization

  • Green Card

Preferred Employment

  • Corp-Corp

  • W2-Contract

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 1st Nov 2021

Job Detail

  1. At least 6 years of experience in IT and at least 3 years using Big Data on AWS 
  2. Hands-on experience with cloud-platform Big Data technologies as well as non-Big Data technologies
  3. EC2, EMR, Redshift, and S3 on Amazon Web Services
  4. Hands-on experience setting up cloud platforms for client use cases
  5. Good practical programming knowledge of Java (Core Java), J2EE technologies, and Python
  6. Good knowledge of Big Data technologies: Hadoop, Spark, Hive
  7. Experience with programming frameworks: Spark, PySpark, Scala
  8. Experience with, or the ability to learn, NoSQL technologies: MongoDB, DynamoDB
  9. Good knowledge of Big Data, relational database, and data architecture concepts
  10. Strong analytical, problem-solving, data analysis and research skills 
  11. Good people-management skills to manage, guide, and mentor a technical team
  12. Ability to work with various business teams to resolve technical challenges and understand requirements
  13. Demonstrated ability to interact, collaborate, and drive consensus and confidence among different groups, both onshore and offshore
  14. Demonstrated ability to think outside the box without depending on readily available tools
  15. Excellent communication, presentation and interpersonal skills are a must 

Company Information