Required Skills

UNIX, Sqoop

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Jul 2022

Job Detail

• Design, build, and unit test highly scalable applications.
• Provide maintenance support to applications as required, including support for incident escalations.
• Identify new technologies, trends, and opportunities.

Agile Development
• Participate in sprint planning, design, coding, unit testing, and sprint reviews.
• Provide basic design documents and translate them into component-level designs to accelerate development. Design, develop, and distribute reusable technical components.
• Assist in developing technical documentation; participate in test-plan development, integration, and deployment.
• Define and develop project requirements, functional specifications, and detailed designs of application solutions for clients.

Requirements:
• 4-6 years of experience in Big Data development using Hadoop, with a good understanding of all phases of the software development life cycle.
• Experience using Hadoop-ecosystem technologies such as Hive, Spark, Pig, or Kafka.
• Must have extensive hands-on experience in designing, developing, and maintaining software solutions on Big Data platforms such as the Hadoop ecosystem.
• Must have strong UNIX shell-scripting experience (a minimal shell/Sqoop sketch follows this list).
• Must have experience with an IDE such as Eclipse.
• Must have working experience with Spark and Scala/Python.
• Preferred: experience developing Pig scripts/HiveQL, HBase, Sqoop, and UDFs for analyzing semi-structured, unstructured, and structured data flows.
• Preferred: experience developing MapReduce programs that run on a Hadoop cluster, using Java/Python.
• Preferred: experience developing in a cloud environment such as Azure.
• Not mandatory, but a big advantage: prior experience using Talend with Hadoop technologies.
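For candidates unfamiliar with how the headline UNIX and Sqoop skills combine in practice, the sketch below shows a typical task: a shell script that imports one relational table into HDFS with Sqoop. It is purely illustrative and not part of this posting; the JDBC URL, user, table name, and paths are hypothetical placeholders.

    #!/bin/bash
    # Hypothetical sketch: JDBC URL, user, table, and paths are placeholders.
    set -euo pipefail

    DB_URL="jdbc:mysql://db.example.com:3306/sales"
    DB_USER="etl_user"
    TARGET_DIR="/data/raw/sales/orders"

    # Import the 'orders' table into HDFS as tab-delimited text,
    # using 4 parallel mappers.
    sqoop import \
      --connect "$DB_URL" \
      --username "$DB_USER" \
      --password-file /user/etl/.db_password \
      --table orders \
      --target-dir "$TARGET_DIR" \
      --num-mappers 4 \
      --fields-terminated-by '\t'

    # Sqoop runs as a MapReduce job, so a _SUCCESS marker in the target
    # directory indicates the import completed before downstream Hive or
    # Spark jobs consume the data.
    hdfs dfs -test -e "$TARGET_DIR/_SUCCESS" && echo "Import complete"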

Company Information