Required Skills

Hadoop Developer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 27th Jul 2022


Desired Skills and Experience

• 3-6+ years' experience with the Hadoop stack and storage technologies (HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie)
• 1-3+ years' experience in Scala programming for Big Data
• Extensive knowledge of Big Data enterprise architecture (Cloudera preferred)
• Experienced in HBase, RDBMS, SQL, ETL, and data analysis
• Experience with NoSQL technologies (e.g., Cassandra, MongoDB)
• Experienced in scripting (Unix/Linux) and scheduling (Autosys)
• Experience with team delivery/release processes and cadence pertaining to code deployment and release
• Experience in Hive tuning, bucketing, and partitioning, and in debugging and tracing errors/exceptions in Spark job execution
• Knowledgeable in techniques for designing Hadoop-based file layouts optimized for the business
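As an illustration of the bucketing item above: a bucketed Hive table routes each row to one of a fixed number of buckets (files) by hashing the bucketing column modulo the bucket count, which enables efficient sampling and bucket-map joins. A minimal Python sketch of the idea, with illustrative helper names and `crc32` standing in for Hive's own hash function:

```python
import zlib
from collections import defaultdict

def bucket_for(key: str, num_buckets: int) -> int:
    # Deterministic stand-in for Hive's bucketing hash:
    # bucket index = hash(key) mod number of buckets.
    return zlib.crc32(key.encode("utf-8")) % num_buckets

def bucket_rows(rows, key_field, num_buckets):
    # Group rows into buckets, mirroring how a bucketed table
    # splits data into a fixed number of files per partition.
    buckets = defaultdict(list)
    for row in rows:
        buckets[bucket_for(row[key_field], num_buckets)].append(row)
    return dict(buckets)
```

Because the mapping depends only on the key, all rows with the same key land in the same bucket, which is what makes bucket-wise joins on that column possible without a full shuffle.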

• Object-oriented programming and design experience
• Experience with automated testing methodologies and frameworks, including JUnit, is a plus
• Exposure to the Java/Spring framework
• Fundamentals of Python: data structures, collections, Pandas for file and other data handling, visualizations, etc.
• Knowledge of visual analytics tools (Tableau)
• Degree in Computer Science or equivalent
• Any Big Data certification (e.g., Cloudera CCP or CCA) is a plus
• Excellent analytical capabilities and a strong interest in algorithms; research-oriented, motivated, proactive self-starter with good interpersonal skills
• A team player with good verbal and written skills, capable of working with a team of Architects, Developers, Business/Data Analysts, QA, and client stakeholders
• Proficient understanding of distributed computing principles
• Python frameworks (Django, Flask), data wrangling, and analytics in a Python-based environment
• Experience with Big Data analytics and business intelligence tools integrated with the Hadoop ecosystem (R, Python)
• Data integration and data security on the Hadoop ecosystem (Kerberos)
• Research-oriented, motivated, proactive self-starter with strong technical, analytical, and interpersonal skills
• A versatile resource with balanced development skills and the business acumen to operate quickly and accurately


Day-to-Day

Insight Global is looking for Senior Hadoop Developers to support the Data and Analytics Platform, Information Management, and Solution Delivery for one of our largest financial clients in Dallas TX, Charlotte NC, Pennington NJ, or New York. The role ensures that the design and engineering approach for complex data solutions is consistent across multiple flows and systems, while building processes to support data transformation, data structures, metadata, data quality controls, and dependency and workload management. The individual will be responsible for defining internal controls; identifying gaps in adherence to data management standards and working with the appropriate partners to develop plans to close them; leading concept and experimentation testing to synthesize the results and validate and improve the solution; and documenting and communicating the information required for deployment, maintenance, support, and business functionality. They may also be required to mentor more junior Data Engineers and coach team members in delivery/release activities.

Company Information