US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H-1B Work Permit
Corp-Corp
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 4th Aug 2021
Job Responsibilities:
• The Data Engineer will work directly on the Oracle enterprise data warehouse (EDW) to deliver batch and real-time data for analytics and reporting to the Linear Ad Sales business unit.
• Work across the different stages of the EDW data pipeline, using tools such as Oracle Exadata (ExaCC), SAP Data Services, Oracle GoldenGate, Oracle Big Data Extension, and Kalido DIW.
• Help understand our data by performing exploratory and quantitative analytics, data mining, and discovery.
• Think of new ways to make our data platform more scalable, resilient, and reliable, then work across our team to put your ideas into action.
• Ensure performance is optimized for real-time data by implementing and refining robust data processing across the landing, integration, and data mart layers.
• Help us stay ahead of the curve by working closely with product, data modelers, API developers, the DevOps team, and analysts to design systems that can scale elastically.
• Mentor other software engineers by developing reusable frameworks. Review design and code produced by other engineers.
• Embrace the DevOps mentality to build, deploy, and support applications in the cloud with minimal help from other teams.
Required:
• Bachelor’s degree or better in Computer Science or a related technical field or equivalent job experience.
• 5+ years of experience as an Oracle/Snowflake database developer (Oracle 11g or greater) or with similar technologies
• 5+ years of development experience in a data warehouse/data mart or big data environment
• 5+ years of data warehouse/data mart design experience
• Experience with PL/SQL, SQL, and database performance tuning and optimization
• Experience with the Python programming language
• Experience with ETL and other data integration tools such as SAP Data Services, Oracle GoldenGate, and Tidal Enterprise Scheduler
• Ability to develop, implement, and maintain standards established by the architecture and development teams
• Strong data analysis and root-cause analysis skills
• Self-motivated independent thinker and collaborative team member
• Experience using CI/CD pipelines (GitLab)
• Experience implementing code quality checks (e.g., pep8/Pylint or other code quality tools)
• Ability to implement industry standards and best practices
• Excellent analytical and problem-solving skills
• Excellent verbal and written communication skills
Preferred Qualifications:
• Media Ad Sales and Finance experience
• Experience with Kalido
• Experience with AWS, Kafka, Snowflake, Airflow or other cloud technologies
Required Education:
• Bachelor’s degree or better in Computer Science