Required Skills

Python Engineer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 26th Jan 2022

JOB DETAIL

  • 3+ years of experience in building web applications in Python
  • 3+ years of experience in building data pipelines using Spark or data frames.
  • 2+ years of experience in AWS, Azure, or GCP.
  • LinkedIn Profile
  • Excellent communication skills

As a Sr. Data Infrastructure Engineer on the security data infrastructure team, you will build reliable, large-scale, multi-geo data pipelines that support detecting internal and external threats in Databricks systems, incident-response forensics analysis, and periodic compliance audits. You will also build and deploy data pipelines in multi-cloud (AWS, Azure, and GCP) environments to process data and logs from external SaaS systems.

The impact you will have:

  • Architect and build data pipelines to collect telemetry and logs from millions of virtual machines running in the cloud (AWS, Azure, and GCP).
  • Design the base ETL framework that can be used by all pipelines developed in the security team.
  • Partner with security engineers, detection engineers, and incident response engineers to build bronze-, silver-, and gold-quality data sets that meet detection, forensics, and compliance needs (see the sketch after this list).
  • Develop best practices and standards that data engineers in the security team can use to build, optimize, and maintain data pipelines.
  • Build tools to detect and improve data quality and to monitor data pipeline performance.
  • Participate in an on-call rotation to support production issues and troubleshoot production jobs.
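The bronze/silver/gold layering mentioned above is the medallion pattern common in Spark and Databricks environments. As a rough, hypothetical sketch of what one bronze-to-silver step with a basic data-quality check could look like (storage paths, table schema, and quality rules here are illustrative assumptions, not the team's actual pipeline):

```python
# Hypothetical bronze -> silver step for a security log pipeline in PySpark.
# Paths, column names, and quality rules are assumptions for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("auth-log-silver").getOrCreate()

# Bronze layer: raw authentication events landed as-is from a SaaS log export.
bronze = spark.read.json("s3://security-logs/bronze/auth_events/")  # assumed path

# Silver layer: parsed, deduplicated, and quality-filtered records.
silver = (
    bronze
    .withColumn("event_time", F.to_timestamp("event_time"))
    .filter(F.col("event_time").isNotNull() & F.col("user_id").isNotNull())
    .dropDuplicates(["event_id"])
    .withColumn("ingest_date", F.to_date("event_time"))
)

# Simple data-quality signal: fraction of rows dropped between layers.
bronze_count = bronze.count()
silver_count = silver.count()
drop_ratio = 1 - silver_count / bronze_count if bronze_count else 0.0
print(f"bronze={bronze_count} silver={silver_count} dropped={drop_ratio:.2%}")

silver.write.mode("overwrite").partitionBy("ingest_date").parquet(
    "s3://security-logs/silver/auth_events/"  # assumed path
)
```

In practice the team's shared ETL framework would likely wrap steps like this with common logging, schema enforcement, and monitoring hooks; the snippet only illustrates the layer-to-layer shape of such a pipeline.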

What we look for:

  • 3+ years of experience in software or data engineering.
  • 3+ years of experience programming in Python, Scala, or SQL (preference for Python).
  • 3+ years of experience in building data pipelines using Spark or data frames.
  • 2+ years of experience in AWS, Azure, or GCP.
  • Comfortable in exploring new tech or finding creative ways to solve problems.
  • Experience in security engineering or detection engineering.
  • BS in Computer Science, Engineering, or Information Systems, or equivalent experience.


Company Information