Required Skills

Big Data Engineer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 3rd Aug 2022

JOB DETAIL

·        7 or more years of experience working directly with enterprise data solutions

·        Hands-on experience working in a public cloud environment and on-prem infrastructure.

·        Specialization in columnar databases such as Redshift Spectrum, time-series data stores such as Apache Pinot, and AWS cloud infrastructure

·        Experience with in-memory, serverless, and streaming technologies and orchestration tools such as Spark, Kafka, Airflow, and Kubernetes

·        Current hands-on implementation experience required, with 7 or more years of IT platform implementation experience.

·        AWS Certified Big Data - Specialty certification is desirable

·        Experience designing and implementing AWS big data and analytics solutions in large digital and retail environments is desirable

·        Advanced knowledge of and experience with online transaction processing (OLTP) and online analytical processing (OLAP) databases, data lakes, and schemas.

·        Experience with AWS Cloud Data Lake technologies and operational experience with Kinesis/Kafka, S3, Glue, and Athena.

·        Experience with any of the following message/file formats: Parquet, Avro, ORC

·        Design and development experience with streaming services, EMS, MQ, Java, XSD, File Adapter, and ESB-based applications

·        Experience in distributed architectures such as Microservices, SOA, RESTful APIs, and data integration architectures.

·        Experience with a wide variety of modern data processing technologies, including:

o   Big Data stack (Spark, Spectrum, Flume, Kafka, Kinesis, etc.)

o   Data streaming (Kafka, SQS/SNS queuing, etc.)

o   Columnar databases (Redshift, Snowflake, Firebolt, etc.)

o   Commonly used AWS services (S3, Lambda, Redshift, Glue, EC2, etc.)

o   Expertise in Python, PySpark, or similar programming languages

o   BI tools (Tableau, Domo, MicroStrategy)

o   Understanding of Continuous Integration/Continuous Delivery (CI/CD)

Company Information