Required Skills

Data Engineer-Architect

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG :- Not Required

  • PG :- Not Required

Other Information

  • No. of positions :- 1

  • Posted :- 21st Sep 2021

JOB DETAIL

  • Design and architect solutions in the areas of DevOps, Security, Governance, CI/CD, and/or Containerization

  • Monitor and manage implementation to ensure high quality and alignment with defined requirements

  • Deep product knowledge and understanding of product features, including IaaS, PaaS, and SaaS solutions

  • High Availability and Disaster Recovery principles, patterns, and usage

  • Experience with the Cloud ecosystem and leading-edge emerging cloud technologies

  • Experience with networking principles and technologies (DNS, Load Balancers, Reverse Proxies)

  • Automation experience with CloudFormation, Resource Manager, Puppet, Chef, Ansible

  • Container experience with Docker, Vagrant, Kubernetes, etc.

  • Experience developing solutions and a passion for getting hands dirty with code/scripting (e.g., Python, Java, C#, .NET, Node.js)

  • Hands-on experience writing scripts in Bash, Python, PowerShell, or similar

  • Streamline deployment automation and elastic sizing of environments for multi-tenant platforms

  • Resolve complex Linux and Windows operating systems administration issues

  • Experience with the AWS Well-Architected Framework, AWS IAM and Security Groups/ACLs, AWS CloudFormation, AWS CloudTrail, AWS Control Tower, AWS Config, Lambda, Route 53, VPC, S3, SQS, SNS, RDS

  • Hands-on experience leading the design, development, and deployment of business software at scale, or current hands-on experience with technology infrastructure, network, compute, storage, and virtualization

  • Infrastructure automation through DevOps scripting (e.g., shell, Python, Ruby, PowerShell)

  • Strong practical Linux and Windows-based systems administration skills in a Cloud or Virtualized environment

  • Experience maintaining Cloudera Hadoop infrastructure such as HDFS, YARN, Spark, Impala, and edge nodes

  • Experience with Apache NiFi, Apache Kafka, and Talend for data ingestion

  • Experience with Apache Airflow for scheduling / job orchestration

  • Experience developing Cloud-based Big Data solutions on AWS or Azure

  • Experience with Cloud Data Platforms like Snowflake or Databricks

  • Experience with analytics tool platforms like Dataiku and Elasticsearch

  • Experience with Data Federation or Virtualization technologies

  • Experience in development of other application types (Web applications, batch, or streaming)

 

Company Information