Required Skills

Data Engineer

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 6th Aug 2021

Job Detail

  • Deploy, maintain, and administer analytic platforms & tools that enable state of the art, modern data capabilities for data engineers, data scientists, and analytic users and applications.
  • Assist application development teams during application design and development for highly complex and critical data projects to ensure platforms are enabled for targeted use-cases.
  • Contribute to the underlying platforms of all data projects including migration to new cloud-based data platforms for unstructured, streaming and high-volume data.
  • Leverage DevOps techniques and practices such as Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable rapid deployment of end-user capabilities
  • Create and maintain DevOps processes and application infrastructure, and utilize cloud services (including database systems and data models)
  • Develop data solutions on cloud platforms such as AWS
  • Provide 24x7 rotating support of production systems to ensure business continuity
  • Contribute to and define environment and platform standards, processes, and best-practice documentation

 
 
Essential Functions:

  • Handle multiple priorities simultaneously
  • Ability to build strong trusting relationships with business partners
  • Ability to work in a fast-paced, team environment
  • Very strong communication skills and ability to lead initiatives
  • Strong collaborator with experience wearing multiple hats on smaller teams and cross training other team members
  • Basic project management skills required

 
Basic Qualifications:

  • Experience implementing and maintaining data platform core services (data lake, data integration, data warehouse) and configuring and integrating infrastructure and PaaS offerings (e.g., Snowflake, Databricks, EMR)
  • Experience with AWS development on S3, EC2, EMR/Hive, CloudFormation templates, IAM, Security Groups, SQS, SNS, Redshift, Athena, and Lambda
  • Experience with Hadoop ecosystem (Cloudera) and Apache Spark
  • Experience with streaming technologies and data formats: Kafka, Kinesis, Firehose, XML, JSON
  • 3-5 years of Python, Java, or Scala programming experience
  • SQL and database basics (Structures, Data Models, Partitions, Statistics, Optimizations)
  • Experience with source control tooling such as Git, SVN, etc.
  • Linux/Bash scripting

  • Open-source toolset configuration (NiFi, Kafka, Jupyter, Zeppelin)

  • DevOps and IaC tooling experience with tools such as Ansible, Terraform, and Pulumi
  • AWS Certification Preferred
  • Understanding of networking and infrastructure fundamentals including firewalls, Active Directory integration, and general troubleshooting
  • Prior experience with Talend Cloud is a huge plus

Company Information