US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H1B Work Permit
Corp-to-Corp
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 8th Jan 2021
Title: Data Engineer
Location: Detroit, MI - 100% Remote
Duration: 6+ months
Candidates should have at least 10 years of experience.
Top three Skills:
1) PySpark - Python on Spark for creating data pipelines
2) AWS serverless analytics tools - Glue, EMR, Redshift, S3, Lake Formation
3) Big data fundamentals - a core understanding of building data pipelines, data modeling, and Linux
Job description:
The Data Engineer is responsible for engaging in the design, development and maintenance of the big data platform and solutions at Rock Central. This includes the platform that hosts the data sets supporting various business operations and enabling data-driven decisions, as well as the analytical solutions that provide visibility and decision support using big data technologies. The Data Engineer is responsible for administering a Hadoop cluster, developing data integration solutions, resolving technical issues, and working with Data Scientists, Business Analysts, System Administrators and Data Architects to ensure the platform meets business demands. This team member also ensures that solutions are scalable, include the necessary monitoring, and adhere to best practices and guidelines. The Data Engineer helps mentor new team members and continues to grow their knowledge of new technologies.
Responsibilities:
Requirements:
Additional Information:
Subvendor/H1B candidates are fine. Quicken Loans is fine with 100% remote candidates for this engagement.
Work Environment:
Strong Scaled Agile environment; the candidate will work as part of Scrum teams within the Data Intelligence group. There are roughly 380 people in Data Intelligence across over 30 teams. Each Scrum team has 8-15 members, depending on its focus.
Who is the Internal/External Customer:
External data consumers at Rock Connections
Impact to the Internal/External Customer:
Rock Connections will have a brand new data environment built completely on AWS.
Business Challenge:
Rock Central is building out a new data infrastructure for a sister company within the Rocket Companies family, Rock Connections. They are building the solution from scratch on AWS, due to a split of the Rock Connections environment between Rock Connections and Rocket Auto.
EVP:
Rock Central is a professional services organization that supports the other 90 companies that make up the Rocket Companies family. Rocket Companies recently had their IPO, and the family of companies has consistently been recognized as a best place to work.
Non-Technical Skills
Experience working with Lead and/or Call Center Analytics data is a plus
Please run these questions past all candidates and send the answers to me with the submission package: 1) What/when was the last code you pushed into production? What language was it written in? What was it for? Bonus question: What version control tool was used, and what tool was used to deploy the code?