Required Skills

Python, Spark, Java, Hadoop, DevOps

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 24th Dec 2020

JOB DETAIL

Job duties include, but are not limited to, the following: 
Stand up cutting-edge analytical capabilities, leveraging automation and cognitive, science-based techniques to manage data and models, and drive operational efficiency by offering continuous insights and improvements. 
Help with system integration, performance evaluation, application scalability, and resource refactoring, drawing on a thorough understanding of applicable technologies and tools such as Python, Spark, Hadoop, AtScale, and Dremio, as well as existing designs. 
Collaborate with the team on changes to the architecture, prepare reusable functions, and reduce time to market through automated testing. 
Use a variety of languages, tools, and frameworks to bring data and systems together. 

Required Qualifications 
• 8+ years of software engineering experience 
• 3+ years of development experience with Python, Spark, Java 
• 3+ years of Hadoop experience (Hortonworks preferred) 
• 2+ years of business intelligence and reporting experience 
• 1+ years of development experience with AtScale/Dremio/Tableau 

Desired Qualifications 
• 1+ years of development experience in Anaconda environment 
• 1+ years of DevOps experience 
• An industry-standard technology certification 
• Strong verbal, written, and interpersonal communication skills 
• SAS programming experience in model implementation, reporting, and complex data manipulations 

Company Information