Required Skills

Python

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 6th Mar 2024

JOB DETAIL

1. Python

2. AWS

3. Kafka (or any other data processing skill)

  • Provides technical direction, guides the team on key technical aspects, and is responsible for product tech delivery
  • Leads the design, build, test, and deployment of components, where applicable in collaboration with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
  • Understands requirements and use cases to outline technical scope and leads delivery of the technical solution
  • Confirms the developers and skillsets required for the product
  • Provides leadership, direction, peer review, and accountability to developers on the product (key responsibility)
  • Works closely with the Product Owner to align on delivery goals and timing
  • Assists the Product Owner with prioritizing and managing the team backlog
  • Collaborates with Data and Solution Architects on key technical decisions, and on the architecture and design to deliver the requirements and functionality


Required:

Python, AWS Redshift, Lake Formation, Kafka, Terraform; experience with Postgres, MySQL, and DocumentDB

Nice to have: ECS/EKS; stream processing application development experience using Kafka Streams and/or the Flink Java APIs; EMR; Flink; Glue Catalog; at least an AWS Solution Architect Associate, Developer, or Data Specialty certification.
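
As a loose illustration of the stack named above (not part of the original posting), here is a minimal Python sketch of a consumer that reads JSON events from a Kafka topic and loads them into Postgres using the kafka-python and psycopg2 libraries; the topic name, table, and connection settings are hypothetical placeholders.

import json

from kafka import KafkaConsumer   # kafka-python
import psycopg2

# Hypothetical topic, table, and connection details -- placeholders only.
consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    group_id="analytics-loader",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
conn = psycopg2.connect("dbname=analytics user=etl host=localhost")
cur = conn.cursor()

for message in consumer:
    event = message.value
    # Upsert each event into a staging table keyed by event id.
    cur.execute(
        "INSERT INTO staging_events (event_id, payload) VALUES (%s, %s) "
        "ON CONFLICT (event_id) DO UPDATE SET payload = EXCLUDED.payload",
        (event["id"], json.dumps(event)),
    )
    conn.commit()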


Roles & Responsibilities:

  • Support or collaborate with application developers, database architects, data analysts and data scientists to ensure optimal data delivery architecture throughout ongoing projects/operations.
  • Design, build, and manage analytics infrastructure that can be utilized by data analysts, data scientists, and non-technical data consumers, which enables functions of the big data platform for Analytics.
  • Develop, construct, test, and maintain architectures, such as databases and large-scale processing systems that help analyze and process data in the way the Analytics organization requires.
  • Develop highly scalable data management interfaces, as well as software components by employing programming languages and tools.
  • Work closely with a team of Data Science staff to take existing or new models and convert them into scalable analytical solutions.
  • Design, document, build, test and deploy data pipelines that assemble large complex datasets from various sources and integrate them into a unified view.
  • Identify, design, and implement operational improvements: automating manual processes, data quality checks, error handling and recovery, re-designing infrastructure as needed (a minimal example sketch follows this list).
  • Create data models that will allow analytics and business teams to derive insights about customer behaviors. Build new data pipelines, identify existing data gaps and provide automated solutions to deliver analytical capabilities and enriched data to applications.
  • Responsible for obtaining data from the System of Record and establishing batch or real-time data feed to provide analysis in an automated fashion.
  • Develop techniques supporting trending and analytic decision-making processes.
  • Apply technologies for responsive front-end experience.
  • Ensure systems meet business requirements and industry practices.
  • Research opportunities for data acquisition and new uses for existing data.
  • Develop data set processes for data modeling, mining and production.
  • Integrate data management technologies and software engineering tools into existing structures.
  • Employ a variety of languages and tools (e.g., scripting languages).
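
For the operational-improvements item above, the short Python sketch below suggests what an automated data quality gate with basic error handling might look like; the field names and rules are hypothetical, not taken from this posting.

# Minimal sketch of an automated data quality gate; "id" and the
# rejection rules below are illustrative placeholders.
from typing import Iterable

def validate_batch(rows: Iterable[dict]) -> list[dict]:
    """Return only rows that pass basic quality checks; report the rest."""
    seen_ids = set()
    clean = []
    for row in rows:
        if row.get("id") is None:
            print(f"rejected (missing id): {row}")
            continue
        if row["id"] in seen_ids:
            print(f"rejected (duplicate id): {row}")
            continue
        seen_ids.add(row["id"])
        clean.append(row)
    return clean

if __name__ == "__main__":
    batch = [{"id": 1, "value": 10}, {"id": None}, {"id": 1, "value": 12}]
    print(validate_batch(batch))   # -> [{'id': 1, 'value': 10}]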


Company Information