US Citizen
Green Card
W2-Permanent
W2-Contract
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 1st Mar 2024
Analyzing and translating business needs into long-term solution data models.
Evaluating existing data systems.
Working with the development team to create conceptual data models and data flows.
Developing best practices for data coding to ensure consistency within the system.
Reviewing modifications of existing systems for cross-compatibility.
Experience in Azure cloud environment
Defining databases at the physical, logical, and conceptual levels
Kafka Expert with Big Data Background
Required skills:
A bachelor’s or master’s degree in relevant Business/IT studies with at least 5 years of experience in a similar role
Experience with Kafka publish-subscribe principles
Experience in Internet-of-things integration scenarios
Understanding of event-driven microservice architectures
At least 3 years of experience in the field of Kafka Data Streaming
Experience with Kafka connectors on different hyperscalers and in various integration scenarios
Experience with stream transformation tools such as KSQL/Kafka Streams
Experience in cloud Big Data technologies and architectures on Azure, Google Cloud, or AWS
Experience with Java, Scala, Python, and MySQL
Business Consulting and Technical Consulting skills
An entrepreneurial spirit and the ability to foster a positive and energized culture
A growth mindset with a curiosity to learn and improve.
Team player with strong interpersonal, written and verbal communication skills.
Fluent communication skills in English and Mandarin (spoken and written)
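The Kafka publish-subscribe principle asked for above can be illustrated with a minimal, broker-free sketch in plain Python (no Kafka client involved; the topic and event names here are invented for the example):

```python
from collections import defaultdict
from typing import Callable

class MiniBus:
    """Tiny in-memory publish-subscribe bus illustrating the pattern
    Kafka implements at scale: named topics, with producers decoupled
    from any number of independent subscribers."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan-out: every subscriber of the topic receives every event,
        # and the publisher knows nothing about who is listening.
        for handler in self._subscribers[topic]:
            handler(event)

received = []
bus = MiniBus()
bus.subscribe("sensor-readings", lambda e: received.append(e))          # consumer 1
bus.subscribe("sensor-readings", lambda e: print("alert check:", e))    # consumer 2
bus.publish("sensor-readings", {"device": "iot-42", "value": 7.5})
```

In real Kafka, topics are partitioned, durable logs and consumers track their own offsets; this sketch shows only the decoupling idea behind the pattern.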
Roles & Responsibilities:
Managing Kafka real-time data streaming scenarios in a production environment
Working with Confluent Control Center
Supporting Kafka components such as brokers, ZooKeeper, and Schema Registry
Building complex Kafka KSQL Streams
Setting up Kafka scenarios on different hyperscalers
Design and build Data Flows using different Kafka connectors
Develop and optimize data models and pipelines for performance and scalability, making them reusable and cataloguing them in libraries for future use
Support industrialization of streaming solutions
Enable meaningful and insightful reports for Data Analysis and Monitoring
Ensure systematic quality assurance to validate accurate data processing
Building reusable code and libraries for future use
Optimization of applications for maximum speed and scalability
Implementation of security and data protection
Translation of stakeholder requirements into concrete specifications for Kafka and self-service solutions
Work diligently toward a better relationship with the customer
Ability to create instruction/operations manuals and stay on the job until it is finished
Accountability, courtesy, reliability, flexibility, cooperation, and adaptability
Initiative and the ability to take responsibility
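The KSQL-style stream transformations mentioned in the responsibilities above amount to stateless filters and stateful aggregations over an event stream. A broker-free Python sketch of both (field names and the threshold are invented for the example):

```python
from typing import Iterable, Iterator

def filter_stream(events: Iterable[dict], threshold: float) -> Iterator[dict]:
    """Stateless filter, analogous to a KSQL statement like
    CREATE STREAM hot AS SELECT * FROM readings WHERE temp > threshold."""
    for event in events:
        if event["temp"] > threshold:
            yield event

def count_by_device(events: Iterable[dict]) -> dict:
    """Stateful aggregation, analogous to a KSQL table built with
    SELECT device, COUNT(*) FROM hot GROUP BY device."""
    counts: dict = {}
    for event in events:
        counts[event["device"]] = counts.get(event["device"], 0) + 1
    return counts

stream = [
    {"device": "a", "temp": 21.0},
    {"device": "b", "temp": 35.5},
    {"device": "a", "temp": 36.1},
]
hot = list(filter_stream(stream, threshold=30.0))
counts = count_by_device(hot)
```

Real KSQL/Kafka Streams jobs run these operations continuously over unbounded, partitioned topics with managed state stores; the sketch only shows the shape of the logic.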