Required Skills

AWS and Data Vault

Work Authorization

  • US Citizen

  • Green Card

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

Employment Type

  • Permanent Direct Hire

Education Qualification

  • UG :-

  • PG :-

Other Information

  • No of position :- ( )

  • Posted :- 20th May 2022

JOB DETAIL

Job Description : Data Engineer

Employment Type: Contract

Client: New York University

City: New York

State: NY

 

Description: NYU Washington Square has a unique opportunity for a Data Engineer on the Learning & Analytics project. As part of this team, you will work on collecting, storing, processing, and analyzing large sets of data. The primary focus will be the construction and maintenance of our data pipeline, ETL processes, and data warehouse. The Data Engineer will also be responsible for data quality and for understanding the data needs of our various source systems in order to anticipate and scale our systems.

 

Candidates must be willing to learn, work well independently, be open to feedback, and be enthusiastic, with demonstrated technical aptitude, skills, and abilities. The role supports the NYU Returns project, which uses student and vaccine data to control individuals' access to the university.

 

Hiring Manager: Ram Boopalam, Project Director

 

The person will sit within the Enterprise Data Management group, which handles the data needs for Tableau reporting off the Enterprise Data Warehouse. The group gets data from various systems, consolidates and curates it, and uses ETL tools to load the data into the marts for reporting.

 

The project will be related to Covid data and modernizing an old legacy application used by the Student Health Center. Data comes into S3 buckets in AWS; Snowflake is used to transform the data and load it into the marts. The marts feed data to the badging system to enable/disable badges for building access.

 

They work off an AWS backbone (Lambda, S3, CloudWatch, CloudTrail, EC2 instances).

 

They are looking for someone to come in and work on building the ETL data pipelines.

 

They want the person to have experience:

  • Working with SnowSQL to maintain and manage data pipelines and to support new development. Should have experience with Java/JavaScript, as they use JavaScript with SnowSQL.
  • Strong SQL skills. Should know how stored procedures and functions work within SQL / SnowSQL. That experience can be with SQL Server or Oracle, not necessarily Snowflake (Snowflake is preferred).
  • WhereScape for modeling (the automation tool they use with Data Vault).
  • Data Vault (preferably 2.0); they build the model in Data Vault, and data is loaded via Hubs, Links, and Satellites.

 

This link describes Data Vault and the hub, link, and satellite concepts: https://en.wikipedia.org/wiki/Data_vault_modeling
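For candidates unfamiliar with the Hub/Link/Satellite terminology above, here is a minimal, illustrative Python sketch of Data Vault 2.0-style hash keys. The sample business keys and attribute names are invented for illustration; the choice of SHA-256 is an assumption (MD5 is also common in Data Vault 2.0 implementations):

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Data Vault 2.0-style hash key: normalize the business key parts,
    join them with a delimiter, and hash the result (SHA-256 here)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hub: one row per unique business key (hypothetical student ID).
hub_student = {"student_hk": hash_key("N12345678"), "student_id": "N12345678"}

# Link: relates two or more hubs (e.g., a student to a vaccination site);
# its key is a hash over both hubs' business keys.
link_hk = hash_key("N12345678", "SITE-001")

# Satellite: descriptive attributes hanging off a hub; a "hashdiff" over
# the attribute values makes change detection a single comparison.
attrs = {"vaccine": "Pfizer", "dose": "2"}
hashdiff = hash_key(*attrs.values())
```

The delimiter prevents distinct key combinations from colliding (e.g., `("A", "B")` vs `("AB")`), and the normalization makes the keys stable across whitespace and casing differences in source data.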

 

Old notes from the last time he had the role open:

  • Should know how APIs are used and how they work
  • MuleSoft is not mandatory; it is a nice-to-have
  • Dimensional Modeling experience would be a big plus

 

  • The other things in the job description below, like Python, R, etc., are nice-to-haves. The core skills they need are bulleted above.

 

Ram will try to do a one-and-done video interview (1 to 1.5 hours in length). If someone from the team is missing, a second interview will be needed.

 

Candidates need to be local to NYC; they will be expected to come in 1 to 2 days per week, since the project is for the Student Health Center, and may be onsite more in the future. Since they will be working hybrid but coming onsite, candidates must be fully vaccinated and have their booster.

 

 

Thanks & Regards

Hussain Ali Mirza

Sr. Technical Recruiter

E-mail: hussain.m@implifyinc.com

US: +1(610) 890-9860 Ext:287

India: +91 9951751472

 
