Required Skills

ETL Developer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 19th Aug 2022

JOB DETAIL

The client is looking for a talented and creative ETL Developer who takes responsibility and ownership in providing software solutions and contributing to the overall success of the team.

Responsibilities:

  • Hands-on architecture/development of ETL pipelines using our internal framework written in Java
  • Hands-on architecture of real-time REST APIs or other solutions for streaming data from Graph using Spark
  • Interpret data, analyze results using statistical techniques and provide ongoing reports
  • Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality
  • Acquire data from primary or secondary data sources and maintain databases/data systems
  • Identify, analyze, and interpret trends or patterns in complex data sets
  • Filter and clean data by reviewing reports and performance indicators to locate and correct problems
  • Work with management to prioritize business and information needs
  • Locate and define new process improvement opportunities

 

Requirements:

  • 8+ years of experience architecting and implementing complex ETL pipelines, preferably with the Spark toolset
  • 4+ years of experience with Java, particularly within the data space
  • Technical expertise in data models, database design and development, data mining, and segmentation techniques
  • Strong experience writing complex SQL and ETL processes
  • Excellent coding and design skills, particularly in Java/Scala and Python
  • Experience working with large data volumes, including processing, transforming, and transporting large-scale data
  • Experience in AWS technologies such as EC2, Redshift, CloudFormation, EMR, S3, and AWS Analytics required
  • Big data technologies on AWS such as Hive, Presto, and Hadoop required
  • AWS certification is preferable: AWS Developer/Architect/DevOps/Big Data
  • Excellent working knowledge of Apache Hadoop, Apache Spark, Kafka, Scala, Python, etc.
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Good understanding and use of algorithms and data structures
  • Good experience building reusable frameworks
  • Experience working in an Agile team environment
  • Excellent communication skills, both verbal and written

Awaiting your response.

 

Please share your availability so I can contact you directly by phone.

Company Information