US Citizen
Green Card
Corp-Corp
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 19th Jul 2022
UST® is looking for a talented and creative Java Developer who takes responsibility and ownership in providing software solutions and contributing to the overall success of the team.
The individual in this position will act as a trailblazer for the team and increase development, delivery, and operational efficiencies through best practices, industry standards, and high quality of engineering.
Responsibilities:
Hands-on building of ETL pipelines using our internal framework written in Java
Hands-on design and delivery of real-time REST APIs or other solutions for streaming data from Graph
Interpret data, analyze results using statistical techniques and provide ongoing reports
Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality
Acquire data from primary or secondary data sources and maintain databases/data systems
Identify, analyze, and interpret trends or patterns in complex data sets
Filter and clean data by reviewing reports and performance indicators to locate and correct problems
Work with management to prioritize business and information needs
Locate and define new process improvement opportunities
Data visualization tool experience (Tableau, ThoughtSpot, Grafana)
Document design and data flow for existing and new applications
Coordinate with multiple teams, including QA, Operations, and other development teams within the organization
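To give candidates a concrete sense of the day-to-day work, the core responsibility above (building ETL pipelines in Java) can be sketched in miniature. The actual pipelines use UST's internal framework, which is not public, so the class and record format below are purely illustrative: a plain-Java extract-transform-load step that parses raw records, filters and cleans them, and emits normalized output.

```java
import java.util.List;
import java.util.stream.Collectors;

// Illustrative sketch only: the real pipelines use an internal UST framework.
// This shows the shape of an extract-transform-load step in plain Java.
public class EtlPipelineSketch {

    // "Extract": raw CSV-like records, e.g. "name,age".
    // "Transform": drop malformed rows (the filter/clean responsibility),
    //              then normalize the name and keep the age.
    // "Load": return the cleaned records; a real pipeline would write to a sink.
    public static List<String> run(List<String> rawRecords) {
        return rawRecords.stream()
                .map(r -> r.split(",", -1))
                .filter(f -> f.length == 2 && !f[1].isBlank()) // filter/clean step
                .map(f -> f[0].trim().toUpperCase() + ":" + f[1].trim())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // "bob," is malformed (missing age) and is filtered out.
        List<String> cleaned = run(List.of("alice,30", "bob,", "carol,25"));
        System.out.println(cleaned); // [ALICE:30, CAROL:25]
    }
}
```

In a production pipeline the transform stage would typically run on Spark (per the requirements below) rather than local streams, but the extract/filter/transform/load structure is the same.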
Requirements/Tech Stack:
At least 4 years of experience implementing complex ETL pipelines, preferably with the Spark toolset
At least 4 years of experience with Java, particularly within the data space
Technical expertise in data models, database design and development, data mining, and segmentation techniques
Good experience writing complex SQL and ETL processes
Excellent coding and design skills, particularly in Java, Scala, and Python
Experience working with large data volumes, including processing, transforming and transporting large-scale data
Experience in AWS technologies such as EC2, Redshift, CloudFormation, EMR, S3, and AWS Analytics is required
Experience with big-data technologies on AWS such as Hive, Presto, and Hadoop is required
Excellent working knowledge of Apache Hadoop, Apache Spark, Kafka, Scala, and Python
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
Good understanding & usage of algorithms and data structures
Good experience building reusable frameworks
Experience working in an Agile Team environment.
AWS certification (Developer, Architect, DevOps, or Big Data) is preferred
Education Qualification & Work Experience Criteria:
Bachelor's Degree in Engineering (Computer Science or IT or equivalent technical discipline)
Excellent communication skills, both verbal and written