Required Skills

VPCs, Security Groups, EC2, RDS, S3 ACLs, KMS, AWS CLI

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 11th Jul 2022

JOB DETAIL

  • 8+ years of experience
  • Strong Java 8 experience
  • Multithreading, Collections
  • Kafka, Kafka Streams, and Spring Boot
  • AWS foundational services

    This role is part of a high-profile migration project: the person hired will architect, design, and build streaming solutions within a risk-based set of systems. We need a major contributor to the development of scalable, resilient, hybrid cloud-based distributed computing solutions supporting critical financial risk management activities. You will help transform the enterprise into a data-driven organization. The role calls for someone with cloud development experience and the ability to design large-scale, microservices-based streaming solutions. This person should also have hands-on technical skills in creating prototypes and in setting the right standards around software development practices.

    SKILLS AND EXPERIENCE REQUIRED:

    • 8+ years of technical experience building data-centric software solutions

    • Advanced Java 8+ knowledge, with experience using multithreading, Collections, the Streams API, and functional programming in real enterprise projects

    • Expert working knowledge of SQL and of scripting languages such as Python, shell, etc.

    • Minimum 1 year of experience developing cloud-native streaming applications using Kafka, Kafka Streams, and Spring Boot (see the illustrative sketch after this list)

    • Hands-on experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc.

    • Hands-on experience with at least one distributed data store: HBase, Cassandra, MongoDB, AWS DynamoDB, etc.

    • Hands-on experience with at least one distributed message broker: Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.

    • Hands-on experience with AWS foundational services such as VPCs, Security Groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, IAM, etc.

    • Experience with Big Data architectures and BI solutions

    • Intermediate working knowledge of DevOps tools such as Terraform, Ansible, Jenkins, Maven/Gradle, Nexus/Artifactory, and CI/CD pipelines

    • Comprehensive debugging and troubleshooting skills, resourcefulness, and strong research skills

    • Proficient in oral and written communication
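
    To illustrate the kind of cloud-native Kafka Streams and Spring Boot work described above, the following is a minimal, hypothetical sketch built on spring-kafka's @EnableKafkaStreams support. The topic names, the RiskExposureTopology class, the per-account exposure aggregation, and the state-store name are illustrative assumptions rather than details from this posting; a real application would also need the spring-kafka and kafka-streams dependencies and the usual spring.kafka.streams.* properties (application id, bootstrap servers).

```java
// Hypothetical sketch: a Spring Boot configuration that defines a small
// Kafka Streams topology aggregating exposure amounts per account.
// Topic names and the store name below are assumptions for illustration.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;

@Configuration
@EnableKafkaStreams
public class RiskExposureTopology {

    @Bean
    public KStream<String, Long> riskExposureStream(StreamsBuilder builder) {
        // Exposure amounts keyed by account id (assumed input topic).
        KStream<String, Long> exposures = builder.stream(
                "risk-exposure-events",
                Consumed.with(Serdes.String(), Serdes.Long()));

        // Keep a running exposure total per account in a local state store.
        KTable<String, Long> totals = exposures
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Long()))
                .reduce(Long::sum, Materialized.as("exposure-totals-store"));

        // Publish each updated total to a downstream topic (assumed name).
        totals.toStream().to(
                "risk-exposure-totals",
                Produced.with(Serdes.String(), Serdes.Long()));

        return exposures;
    }
}
```

    Defining the topology as a @Bean against the auto-configured StreamsBuilder leaves lifecycle management to Spring's StreamsBuilderFactoryBean and keeps the processing logic easy to unit test with TopologyTestDriver.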

Company Information