Required Skills

Principal Data Engineer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 27th Aug 2021

Job Detail

Hands-on expertise in Airflow, Redshift, Python, relational databases such as Postgres, AWS cloud services, and distributed systems

You will lead teams in implementing modern data architecture, data engineering pipelines, and advanced analytical solutions. Our projects range from implementing enterprise data lakes and data warehouses using cloud best practices to building visualizations and dashboards, unifying data for a single view of the customer, and predicting next-best outcomes with advanced analytics.

You will act as the primary technical lead on projects to scope and estimate work streams, architect and model technical solutions to meet business requirements, serve as a technical expert in client communications, and mentor junior project team members. On a typical day, you might expect to participate in design sessions, build data structures for an enterprise data lake or statistical models for a machine learning algorithm, coach junior resources, and manage technical logs and release management tools. Additionally, you will seek out new business development opportunities at existing and new clients.

  • Have a minimum of 10 years of technical, hands-on experience building, optimizing, and implementing data pipelines and architecture
  • Experience leading teams to wrangle, explore, and analyze data to answer specific business questions and identify opportunities for improvement
  • Be a highly driven professional who enjoys serving in a fast-paced, dynamic, client-facing role where delivering solutions that exceed high expectations is a measure of success
  • Have a passion for leading teams and providing both formal and informal mentorship
  • Possess strong communication and interpersonal skills, and the ability to engage customers at a business level as well as a technical level
  • Have a deep understanding of data governance and data privacy best practices
  • Hold a degree in Computer Science, Computer Engineering, Engineering, Mathematics, Management Information Systems, or a related field of study
  • Big data tools (e.g. Hadoop, Spark, Kafka, etc.)
  • Relational SQL and NoSQL databases (e.g. Postgres, MySQL, RedShift, MongoDB, etc.)
  • Data pipeline and workflow management tools (e.g. Azkaban, Oozie, Luigi, Airflow, etc.)
  • Stream-processing systems (e.g. Storm, Spark-Streaming, etc.)
  • Programming and scripting languages (e.g. Python, Java, C++, Scala, etc.)
  • Containers and container orchestration (e.g. Docker, Kubernetes, etc.)
  • Experience with one or more of the following cloud service providers (AWS, GCP, Azure)
  • Thrive in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success
  • Contribute in a team-oriented environment
  • Prioritize multiple tasks in order to consistently meet deadlines
  • Creatively solve problems in an analytical environment
  • Adapt to new environments, people, technologies and processes
  • Excel in leadership, communication, and interpersonal skills
  • Establish strong work relationships with clients and team members
  • Generate ideas and understand different points of view

Company Information