Required Skills

Spark, Kafka, Redshift, AWS, Kubernetes

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H-1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Permanent Direct Hire

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of Positions: 1

  • Posted: 23rd Nov 2020

JOB DETAIL


•    6+ years of experience in Data Engineering and Business Intelligence
•    Proficient in IoT data tools such as MQTT, Kafka, and Spark
•    Proficient with AWS, S3, and Redshift
•    Experience with Presto and Parquet/ORC
•    Proficient with Apache Spark and DataFrames
•    Experienced in containerization, including Docker and Kubernetes
•    Expert in tools such as Apache Spark, Apache Airflow, and Presto
•    Expert in designing and implementing reliable, scalable, and performant distributed systems and data pipelines
•    Extensive programming and software engineering experience, especially in Java and Python
•    Experience with columnar databases such as Redshift and Vertica
•    Hands-on design and development of streaming and IoT data pipelines
•    Develop streaming pipelines using MQTT, Kafka, and Spark Structured Streaming (see the sketch after this list)
•    Orchestrate and monitor pipelines using Prometheus and Kubernetes
•    Deploy and maintain streaming jobs using CI/CD and related tooling
•    Python scripting for automation and application development
•    Design and implement pipelines with Apache Airflow and other dependency-enforcement and scheduling tools (see the DAG sketch after this list)
•    Hands-on data modeling and data warehousing
•    Deploy solutions using AWS, S3, Redshift, and Docker/Kubernetes
•    Develop storage and retrieval systems using Presto and Parquet/ORC
•    Scripting with Apache Spark and DataFrames
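
For illustration, a minimal sketch of the kind of Kafka-to-S3 streaming pipeline described above, using PySpark Structured Streaming. The broker address, topic name, S3 bucket, and payload schema are all hypothetical placeholders, not details from this posting:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    # Requires the spark-sql-kafka connector on the classpath.
    spark = SparkSession.builder.appName("iot-stream-example").getOrCreate()

    # Hypothetical IoT payload schema; real field names will differ.
    schema = StructType([
        StructField("device_id", StringType()),
        StructField("temperature", DoubleType()),
    ])

    # Subscribe to a Kafka topic (broker and topic names are placeholders).
    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")
           .option("subscribe", "iot-telemetry")
           .load())

    # Parse the JSON value column into typed fields.
    events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    # Sink to S3 as Parquet, with checkpointing for fault-tolerant file output.
    (events.writeStream
        .format("parquet")
        .option("path", "s3a://example-bucket/telemetry/")
        .option("checkpointLocation", "s3a://example-bucket/checkpoints/telemetry/")
        .start()
        .awaitTermination())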
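Likewise, a minimal Airflow DAG sketch of the dependency-enforcement and scheduling pattern the role calls for. The DAG id, task commands, and schedule are hypothetical:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator  # airflow.operators.bash_operator on Airflow 1.x

    with DAG(
        dag_id="daily_telemetry_rollup",  # hypothetical pipeline name
        start_date=datetime(2020, 11, 23),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extract")
        load = BashOperator(task_id="load", bash_command="echo load")

        # Dependency enforcement: load only runs after extract succeeds.
        extract >> load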

Company Information