Required Skills

ETL Developer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 19th Aug 2022

JOB DETAIL

 

UST® is looking for a talented and creative ETL Developer who takes responsibility and ownership in providing solutions and contributing to the overall success of the team.

 

Responsibilities:

• Hands-on architecture/development of ETL pipelines using our internal framework written in Java (a minimal illustrative sketch follows this list)

• Hands-on architecture of real-time REST APIs or other solutions for streaming data from Graph using Spark

• Interpret data, analyze results using statistical techniques and provide ongoing reports

• Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality

• Acquire data from primary or secondary data sources and maintain databases/data systems

• Identify, analyze, and interpret trends or patterns in complex data sets

• Filter and clean data by reviewing reports and performance indicators to locate and correct problems

• Work with management to prioritize business and information needs

• Locate and define new process improvement opportunities
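
The internal Java framework named in the first responsibility is not described in this posting, so the following is only a rough sketch of a comparable batch ETL job written against Apache Spark's public Java API; the class name, paths, and column names are all hypothetical.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

public class MinimalEtlJob {
    public static void main(String[] args) {
        // Hypothetical batch ETL: extract raw events, filter bad rows, load as Parquet.
        SparkSession spark = SparkSession.builder()
                .appName("minimal-etl-sketch")
                .getOrCreate();

        // Extract: the S3 path is a placeholder, not taken from the posting.
        Dataset<Row> raw = spark.read()
                .option("header", "true")
                .csv("s3a://example-bucket/raw/events/");

        // Transform: drop rows missing a key field, keep recent records.
        Dataset<Row> cleaned = raw
                .filter(col("user_id").isNotNull())
                .filter(col("event_date").geq("2022-01-01"));

        // Load: write partitioned Parquet for downstream analytics.
        cleaned.write()
                .mode("overwrite")
                .partitionBy("event_date")
                .parquet("s3a://example-bucket/curated/events/");

        spark.stop();
    }
}
```

The same extract/transform/load shape carries over when the source or sink is Redshift, Hive, or a Kafka topic rather than flat files.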

 

Skills/Requirements:

• 8+ years of experience architecting and implementing complex ETL pipelines, preferably with the Spark toolset

• 4+ years of experience with Java, particularly within the data space

• Technical expertise regarding data models, database design and development, data mining, and segmentation techniques

• Good experience writing complex SQL and ETL processes

• Excellent coding and design skills, particularly in Java/Scala and Python.

• Experience working with large data volumes, including processing, transforming, and transporting large-scale data

• Experience in AWS technologies such as EC2, Redshift, CloudFormation, EMR, S3, and AWS Analytics required.

• Big-data-related AWS technologies such as Hive, Presto, and Hadoop required.

• AWS certification preferred: AWS Developer/Architect/DevOps/Big Data

• Excellent working knowledge of Apache Hadoop, Apache Spark, Kafka, Scala, Python, etc. (see the streaming sketch after this list)

• Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy

• Good understanding and usage of algorithms and data structures

• Good experience building reusable frameworks.
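
As a small, self-contained illustration of the Spark-plus-Kafka streaming knowledge listed above (an assumption-laden sketch, not part of the posting: the broker address and topic name are placeholders, and running it requires the spark-sql-kafka connector on the classpath):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaStreamSketch {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-stream-sketch")
                .getOrCreate();

        // Read a stream of records from a hypothetical Kafka topic.
        Dataset<Row> stream = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker:9092")
                .option("subscribe", "events")
                .load();

        // Kafka delivers key/value as binary; cast the value to a string for processing.
        Dataset<Row> values = stream.selectExpr("CAST(value AS STRING) AS payload");

        // Write the stream out; a console sink keeps the sketch self-contained.
        StreamingQuery query = values.writeStream()
                .format("console")
                .outputMode("append")
                .start();

        query.awaitTermination();
    }
}
```

A console sink keeps the example runnable end to end; in practice the stream would typically land in S3, Redshift, or another durable store.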
