Required Skills

Data Engineer, J2EE, Hadoop, Apache Spark, Kafka, AWS Cloud

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG :-

  • PG :-

Other Information

  • No. of positions: 1

  • Posted: 17th Nov 2020

JOB DETAIL

  • Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Develop enterprise-scale applications, microservices, and web applications using Java and J2EE technologies.

 

Required Skills:

  • 5-7+ years of experience (senior level) and strong programming experience with object-oriented/functional languages: Java, J2EE, Scala.
  • 5+ years of experience (mid-level) with big data tools: Hadoop, Apache Spark, Kafka, etc.
  • Strong experience developing enterprise-scale applications, microservices, and web applications using Java and J2EE technologies.
  • 1+ years of strong technical experience with AWS cloud services and DevOps engineering: S3, IAM, EC2, EMR, RDS, Redshift, and CloudWatch, along with Docker, Kubernetes, GitHub, Jenkins, and CI/CD.
  • Experience with stream-processing systems: Storm, Spark Streaming, etc. (nice to have).
  • 1+ years of experience with relational SQL (PostgreSQL).
  • Experience with Snowflake and NoSQL databases (nice to have).

 

Must Have Skills

  • DevOps
  • Apache Hadoop
  • Amazon Web Services
  • ANSI SQL
  • Core Java

 

Good To Have Skills

  • Spark
  • Scala
  • Kafka
  • J2EE


Monika Chabbra
monika@smartitframe.com

Company Information