Required Skills

NoSQL, Hadoop, Infrastructure, HDFS, Apache, Monitoring, SQL, Python, Software Engineering

Work Authorization

  • Citizen

Preferred Employment

  • Full Time

Employment Type

  • Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 27th Oct 2022

JOB DETAIL

Design, develop, and maintain infrastructure for streaming, processing, and storing data. Build tools for effective maintenance and monitoring of the data infrastructure.
Contribute to key data pipeline architecture decisions and lead the implementation of major initiatives.
Work closely with stakeholders to develop scalable and performant solutions for their data requirements, including extraction, transformation, and loading of data from a range of data sources.
Develop the team's data capabilities: share knowledge, enforce best practices, and encourage data-driven decisions.
Develop data retention policies and backup strategies, and ensure that the firm's data is stored redundantly and securely.

Job requirements:
Solid Computer Science fundamentals, excellent problem-solving skills, and a strong understanding of distributed computing principles.
At least 3 years of experience in a similar role, with a proven track record of building scalable and performant data infrastructure.
Expert SQL knowledge and deep experience working with relational and NoSQL databases.
Advanced knowledge of Apache Kafka and demonstrated proficiency in Hadoop v2, HDFS, and MapReduce.
Experience with stream-processing systems (e.g., Storm, Spark Streaming), big data querying tools (e.g., Pig, Hive, Spark), and data serialization frameworks (e.g., Protobuf, Thrift, Avro).
Bachelor's or Master's degree in Computer Science or a related field from a top university.
Able to work within the GMT+8 time zone.

Company Information