Required Skills

Big Data, Java, AWS, Spark, Kafka, Python

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 15th Jan 2022

JOB DETAIL

Job Title: Big Data Engineer

Location: San Jose, CA / Costa Mesa, CA / Allen, TX
Experience: 10+ years
Contract Position: C2C

Duration: 6 months, extendable

Job Description:

Big Data Engineer with Java and AWS experience

Required Skills:

·         BS degree in Computer Science, Computer Engineering, or equivalent

·         5–6 years of experience delivering enterprise software solutions for developers; 8+ years for a lead role

·         Proficient in Java, Spark, Kafka, Python, AWS Cloud technologies

·         Must have current hands-on experience with Scala, Java, Python, Oracle, Cassandra, HBase, and Hive

·         3+ years of experience across multiple Hadoop/Spark technologies such as Hadoop, MapReduce, HDFS, Cassandra, HBase, Hive, Flume, Sqoop, Spark, Kafka, and Scala

·         Familiarity with AWS scripting and automation

·         Flair for data, schemas, and data modeling, and for bringing efficiency to the big data life cycle

·         Must be able to quickly understand technical and business requirements and translate them into technical implementations

·         Experience with Agile Development methodologies

·         Experience with data ingestion and transformation

·         Solid understanding of secure application development methodologies

·         Experience developing microservices using the Spring Framework is a plus

·         Understanding of automated QA needs related to big data

·         Strong object-oriented design and analysis skills

·         Excellent written and verbal communication skills

 

Responsibilities:

·         Utilize your software engineering skills, including Java, Spark, Python, and Scala, to analyze disparate, complex systems and collaboratively design new products and services

·         Integrate new data sources and tools

·         Implement scalable and reliable distributed data replication strategies

·         Mentor and provide direction in architecture and design to onsite/offshore developers

·         Collaborate with other teams to design, develop, and deploy data tools that support both operations and product use cases

·         Perform analysis of large data sets using components from the Hadoop ecosystem

·         Own product features from development and testing through to production deployment

·         Evaluate big data technologies and prototype solutions to improve our data processing architecture

·         Automate everything

Company Information