Required Skills

Data Engineer

Work Authorization

  • US Citizen

  • Green Card

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 18th Aug 2023

JOB DETAIL

  • Search and autocomplete – Vespa search engine (vectorized docs)

  • Argo Flow – data ingestion: listens to Kafka, APIs and S3, and delivers data in any form

  • Data processing and schemas – Vespa on AWS and Azure (multi-cloud strategy)

  • AKS containers, Jenkins pipelines, 50+ repos, ML modeling (50K training cost)

  • Grafana-based logs and log data aggregation

  • Some big data
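The ingestion bullet above (listen to Kafka, an API, or S3 and deliver data "in any form") comes down to normalizing heterogeneous inputs into one common schema before indexing into the search engine. A minimal stdlib-only sketch of that normalization step; the field names and source labels are illustrative assumptions, not the actual pipeline:

```python
import json
from typing import Any

def normalize_record(source: str, payload: Any) -> dict:
    """Map a raw event from any ingestion source (Kafka, API, S3)
    onto one common schema before indexing."""
    if source == "kafka":
        # Kafka delivers raw bytes; assume JSON-encoded messages.
        body = json.loads(payload.decode("utf-8"))
    elif source == "api":
        # An API handler typically hands over an already-parsed dict.
        body = payload
    elif source == "s3":
        # S3 objects arrive as text, e.g. one JSON document per line.
        body = json.loads(payload)
    else:
        raise ValueError(f"unknown source: {source}")
    # Common schema ("id"/"text" are hypothetical field names).
    return {
        "id": body["id"],
        "text": body.get("text", ""),
        "source": source,
    }

# The same logical document arriving from two different sources:
from_kafka = normalize_record("kafka", b'{"id": "42", "text": "hello"}')
from_api = normalize_record("api", {"id": "42", "text": "hello"})
```

Whatever the transport, downstream consumers only ever see the one normalized shape.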

 

Your Impact:
• Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients’ business
• Translate client requirements into system designs and develop solutions that deliver business value
• Lead, design, develop and deliver large-scale data systems, data processing and data transformation projects
• Automate data platform operations and manage the post-production system and processes
• Conduct technical feasibility assessments and provide project estimates for the design and development of the solution
• Mentor and help grow junior team members


Your Skills & Experience:
• Demonstrable experience with data platforms, including implementation of end-to-end data pipelines
• Good communication skills and willingness to work as part of a team
• Hands-on experience with at least one of the leading public cloud data platforms (Amazon Web Services, Azure or Google Cloud)
• Implementation experience with column-oriented database technologies (e.g., BigQuery, Redshift, Vertica), NoSQL database technologies (e.g., DynamoDB, BigTable, Cosmos DB) and traditional database systems (e.g., SQL Server, Oracle, MySQL)
• Experience in implementing data pipelines for both streaming and batch integrations using tools/frameworks like Glue ETL, Lambda, Google Cloud DataFlow, Azure Data Factory, Spark, Spark Streaming, etc.
• Ability to handle module- or track-level responsibilities and contribute to tasks hands-on
• Experience in data modeling, warehouse design and fact/dimension implementations
• Experience working with code repositories and continuous integration
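The fact/dimension bullet above can be made concrete with a toy star schema: one fact table of measurable events joined to a dimension table of descriptive attributes. A sketch using SQLite; the table and column names are illustrative, not from the posting:

```python
import sqlite3

# In-memory database with a minimal star schema: one dimension, one fact.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        name        TEXT,
        category    TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(100, 1, 2, 19.98), (101, 2, 1, 5.00), (102, 1, 1, 9.99)])

# The typical warehouse query shape: aggregate the fact table,
# grouped by an attribute from the dimension table.
rows = conn.execute("""
    SELECT d.name,
           SUM(f.quantity)          AS units,
           ROUND(SUM(f.amount), 2)  AS revenue
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.name
    ORDER BY revenue DESC
""").fetchall()
```

The design choice is the point: measures live in the fact table, descriptive attributes live in dimensions, and reports join the two rather than denormalizing everything into one wide table.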


Set Yourself Apart With:
• Developer certifications for any of the cloud services like AWS, Google Cloud or Azure
• Understanding of development and project methodologies
• Willingness to travel

Company Information