Required Skills

Big Data

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 30th May 2022

JOB DETAIL

 

Primary Skills: Docker, Hadoop, Cascading, Spark, AWS, DynamoDB, Redshift, Redis, Akka

 

 

Essential Key Responsibilities (% of Time)

1. Responsible for the design and development of a Big Data predictive analytics SaaS-based customer data platform using object-oriented analysis, design, and programming skills, and design patterns. (60%)

2. Responsible for continuously improving the reliability, scalability, and stability of microservices and the platform. (10%)

3. Contribute to and lead the continuous improvement of the software development framework and processes by analysing, designing, and developing test cases and implementing automated test suites. (10%)

4. Reproduce, troubleshoot, and determine the root cause of production issues. (10%)

5. Participate in daily stand-up team meetings, bi-weekly sprint planning, and sprint-end demos/retrospectives, and work cross-functionally with other teams at Lattice to drive the innovation of our products. (10%)

Education/Experience and Competencies

 

The knowledge, skills, abilities, experience, licenses, training, and educational requirements for the position are listed below. These are not job functions but rather the attributes an individual must possess to be qualified for the position.

 

1. 8-12 years of experience building enterprise solutions, with expertise in large-scale distributed system design and data processing.

2. Strong knowledge of common algorithms, data structures, and object-oriented programming and design.

3. Strong analytical and problem-solving skills. Ability to hit the ground running and learn/adapt quickly.

4. Hands-on experience with Docker, Hadoop, Cascading, Spark, AWS, DynamoDB, Redshift, Redis, Akka, or similar technologies is desired.

5. Fluency in English with excellent verbal and written communication skills.

6. Self-driven, willing to work in a fast-paced, dynamic environment.

7. Computer Science degree or equivalent; a Master's degree is preferred.

 

Company Information