Required Skills

Cassandra, MongoDB, MySQL, PostgreSQL, Apache Airflow, NiFi

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 23rd Aug 2025

JOB DETAIL

  • Data Architecture & Design: Design and implement scalable, high-performance data systems and pipelines to support large datasets, data processing, and analytical workflows.
  • Data Processing: Develop and manage ETL (Extract, Transform, Load) pipelines for data ingestion, processing, and transformation. Utilize big data technologies such as Apache Hadoop, Spark, Kafka, and other related frameworks.
  • Data Storage: Implement and manage data storage solutions using both relational and NoSQL databases (e.g., HDFS, Hive, Cassandra, MongoDB, etc.).
  • Data Integration: Integrate data from various structured and unstructured sources, ensuring that data is accurately transformed and processed.
  • Optimization & Performance Tuning: Monitor, optimize, and troubleshoot the performance of big data pipelines and systems to ensure efficient and reliable data processing.
  • Collaboration: Collaborate with data scientists, analysts, and other team members to understand data requirements and provide data-driven solutions for business needs.
  • Data Security: Ensure proper security measures are in place to protect sensitive data in compliance with company policies and industry regulations.
  • Documentation: Document data pipelines, architectures, and workflows for team reference and future scalability.
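The extract-transform-load workflow described in the duties above can be sketched in miniature. This is an illustrative toy only, using Python's standard library (csv and sqlite3) in place of the Spark/Kafka/Airflow stack the role actually calls for; the table name, fields, and sample data are all hypothetical.

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize fields and drop malformed records."""
    out = []
    for r in rows:
        try:
            out.append((r["id"], r["name"].strip().lower(), float(r["amount"])))
        except (KeyError, ValueError):
            continue  # skip rows with missing or non-numeric fields
    return out

def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: upsert cleaned records into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id TEXT PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", records)
    conn.commit()
    return len(records)

# Hypothetical input: one row is malformed and should be filtered out.
raw = "id,name,amount\n1, Alice ,10.5\n2,Bob,oops\n3,Carol,7\n"
conn = sqlite3.connect(":memory:")
n = load(transform(extract(raw)), conn)
```

At production scale the same three stages map onto the listed tools: Spark for transform, Kafka for ingestion, and Airflow or NiFi for orchestration.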

Company Information