Required Skills

Teradata, ETL, Hadoop, PySpark, Kafka, Hive, Data Curation

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 17th Nov 2020

Job Detail

Interview mode: Phone and Skype
No H1B

Need more Data Engineers with heavy Teradata experience. One spot has been filled; two more remain open.

Teradata, ETL, Hadoop, PySpark, Kafka, Hive, Data Curation

Must have Teradata development experience: working with Teradata pipelines, preparing data, and pushing it into Teradata.
Expertise in how data is processed in a Teradata environment is required.
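
As a rough illustration of that kind of task, the minimal PySpark sketch below prepares a dataset and pushes it into Teradata over JDBC. The Hive source table, Teradata host, credentials, and target table names are hypothetical, and the Teradata JDBC driver (TeraDriver) is assumed to be available on the Spark classpath.

# Minimal sketch: prepare data in PySpark and push it into Teradata over JDBC.
# Host, credentials, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("teradata-load-sketch")
    .getOrCreate()  # assumes the Teradata JDBC driver jar is on the classpath
)

# Read curated events from a (hypothetical) Hive table and prepare them for load.
events = spark.table("curated_db.events")
prepared = (
    events
    .filter(F.col("event_date") >= "2020-11-01")
    .withColumn("load_ts", F.current_timestamp())
)

# Push the prepared data into a (hypothetical) Teradata staging table.
(
    prepared.write
    .format("jdbc")
    .option("url", "jdbc:teradata://td-host.example.com/DATABASE=analytics")
    .option("driver", "com.teradata.jdbc.TeraDriver")
    .option("dbtable", "analytics.events_stage")
    .option("user", "etl_user")
    .option("password", "********")
    .mode("append")
    .save()
)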


Job Profile Summary

The Data Engineer will work closely with senior engineers, data scientists and other stakeholders to design and maintain moderate to advanced data models. The Data Engineer is responsible for developing and supporting advanced reports that provide accurate and timely data for internal and external clients. The Data Engineer will design and grow a data infrastructure that powers our ability to make timely and data-driven decisions.

Job Description
· Extract data from multiple sources, integrate disparate data into a common data model, and load the data into a target database, application, or file using efficient programming processes
· Document and test moderate data systems that bring together data from disparate sources, making it available to data scientists and other users through scripting and/or programming languages
· Write and refine code to ensure performance and reliability of data extraction and processing
· Participate in requirements-gathering sessions with business and technical staff to distill technical requirements from business requests
· Develop SQL queries to extract data for analysis and model construction
· Own delivery of moderately sized data engineering projects
· Define and implement integrated data models, allowing integration of data from multiple sources
· Design and develop scalable, efficient data pipeline processes to handle the data ingestion, cleansing, transformation, integration, and validation required to give analysts and data scientists access to prepared data sets (a minimal sketch follows this list)
· Ensure performance and reliability of data processes
· Define and implement data stores based on system requirements and consumer requirements
· Document and test data processes, including performance of thorough data validation and verification
· Collaborate with cross-functional teams to resolve data quality and operational issues and ensure timely delivery of products
· Develop and implement scripts for database and data process maintenance, monitoring, and performance tuning
· Analyze and evaluate databases in order to identify and recommend improvements and optimization
· Design eye-catching visualizations to convey information to users
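
As noted above, a minimal sketch of the ingest, cleanse, transform, and validate cycle is shown here; the file path, schema, table name, and validation threshold are hypothetical, not part of the posting.

# Minimal sketch of one pipeline step: ingest -> cleanse -> transform -> validate -> publish.
# Paths, column names, and the row-loss threshold are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-step-sketch").getOrCreate()

# Ingest: raw CSV landed by an upstream process.
raw = spark.read.option("header", True).csv("/landing/orders/2020-11-17/")

# Cleanse: drop duplicates and rows missing required keys.
clean = raw.dropDuplicates(["order_id"]).dropna(subset=["order_id", "customer_id"])

# Transform: cast types and derive a partition column.
transformed = (
    clean
    .withColumn("order_amount", F.col("order_amount").cast("decimal(12,2)"))
    .withColumn("load_date", F.to_date(F.col("order_ts")))
)

# Validate: fail fast if cleansing dropped too many rows.
raw_count, clean_count = raw.count(), transformed.count()
if raw_count > 0 and clean_count / raw_count < 0.95:
    raise ValueError(f"Row loss too high: {raw_count - clean_count} of {raw_count} rows dropped")

# Publish: write prepared data where analysts and data scientists can query it.
transformed.write.mode("overwrite").partitionBy("load_date").saveAsTable("curated_db.orders")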

Hiring Requirements
· Bachelor’s degree in Computer Science or related field or equivalent experience
· 3 years of SQL programming skills (intermediate to advanced)
· 3 years of programming experience in Python, R, or another programming language
· Demonstrated experience working with large and complex data sets
· Experience with business intelligence tools (Tableau)

Hiring Preferences
· Experience with Hadoop, Hive and/or other Big Data technologies
· Experience with ETL or Data Pipeline tools
· Experience with query and process optimization
· Experience working in AWS and/or using Linux based systems
· Ability to translate task/business requirements into written technical requirements
· Reliable task estimation skills
· Excellent quantitative, problem solving and analytic skills
· Ability to document data pipeline architecture and design
· Ability to collaborate effectively with business stakeholders, performance consultants, data scientists, and other data engineers
· Proficient in the use of MS Office applications, including expert-level Excel programming
· Ability to quickly become an expert in the operational processes and data of the lines of business
· Ability to troubleshoot and document findings and recommendations
· Ability to communicate risks, problems, and updates to leadership
· Ability to keep up with a rapidly evolving technology space

Company Information