Required Skills

Big Data Engineer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 15th Jul 2022



Mandatory Skills (candidates should provide a detailed write-up covering the skills below):


  • API Designing & Development
  • Microservices Architecture
  • REST API Principles


Duties and responsibilities:


  • Use big data technologies to develop distributed, fault-tolerant, scalable data solutions.
  • Participate in discussions with customers, along with the product team, to understand their data requirements.
  • Translate business requirements into corresponding data requirements.
  • Collect and process data at scale from a variety of sources for different project needs.
  • Participate in identifying, evaluating, selecting, and integrating big data frameworks and tools required for the big data platform.
  • Design, develop, and maintain data pipelines and data platforms using the selected frameworks and tools, based on requirements from different projects.
  • Convert structured and unstructured data into a form suitable for processing, and support different teams in analysing data.
  • Design, develop, and maintain data APIs.
  • Integrate data from a variety of data sources using federation/virtualization techniques.
  • Develop solutions based on high-level design and architecture with minimal supervision.
  • Monitor the performance of the data platform regularly and tune the infrastructure and platform components to ensure the best performance.
  • Maintain a high level of expertise in data technologies and stay current with the latest developments.




Required experience and qualifications:


  • 10+ years of overall experience in software design and development.
  • 5+ years of experience in data engineering.
  • Prior experience implementing big data platform components that are scalable, high-performing, and low in operational cost.
  • Proven experience with integration of data from multiple heterogeneous and distributed data sources.
  • Experience with processing large amounts of data (structured and unstructured), building data models, data cleaning, data visualization and reporting.
  • Experience in production support and troubleshooting.
  • Hands-on knowledge of containers and of API design and implementation is a must.
  • Experience with NoSQL, graph, relational, and time-series databases.
  • Excellent knowledge of various ETL techniques and frameworks, messaging systems, stream-processing systems, big data ML toolkits, and big data querying tools.
  • Experience in Python, Go, Perl, JavaScript, Kafka, Spark, Kubernetes.
  • Good knowledge of Agile software development methodology.
  • Excellent interpersonal, communication (verbal and written) skills.
  • Proven experience in managing and working with teams based in multiple geographies.
  • Bachelor’s Degree or higher in Computer Science or a related field.


Company Information