Required Skills

Parquet · Glue · IAM · PySpark · QuickSight · Lambda · Amazon EMR · Redshift

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H-1B Work Permit

Preferred Employment

  • Corp-Corp

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 26th Nov 2020

JOB DETAIL

Designs, develops and tests analytic solutions and advanced visualizations. Develops and executes policies, practices and procedures that manage the collection, quality, standardization, integration and aggregation of data. Uses knowledge of distributed computing techniques to design, develop and test scalable applications that operate on large-volume datasets. Familiar with handling datasets containing mixes of structured and unstructured data. Transforms unstructured data into suitable forms for analysis and modeling. Performs extract, transform and load (ETL) integrations with a variety of data sources. Writes ad-hoc scripts and queries, schedules jobs, and develops real-time streaming applications and monitoring. Background in technologies such as the following (a minimal ETL sketch appears after the list):
· Terraform / Terraform Enterprise
· S3
· Parquet
· Glue
· IAM
· Kinesis
· Athena
· Redshift
· DynamoDB
· Lambda
· QuickSight
· PySpark
· Amazon EMR
· DevOps
· CI/CD Pipeline (GitOps)
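
The sketch below illustrates the kind of batch ETL described above: PySpark reading raw Parquet from S3, aggregating, and writing partitioned Parquet back for query through Glue/Athena. It is a minimal example only; the bucket, paths, and columns (status, created_at, amount) are hypothetical assumptions, not details from this posting.

    # Minimal PySpark ETL sketch; all names below are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Extract: read raw Parquet from a hypothetical S3 prefix.
    raw = spark.read.parquet("s3a://example-bucket/raw/orders/")

    # Transform: filter and aggregate on assumed columns.
    daily_totals = (
        raw.filter(F.col("status") == "COMPLETE")
           .withColumn("order_date", F.to_date("created_at"))
           .groupBy("order_date")
           .agg(F.sum("amount").alias("total_amount"),
                F.count("*").alias("order_count"))
    )

    # Load: write partitioned Parquet so Glue/Athena can query it by date.
    (daily_totals.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://example-bucket/curated/daily_order_totals/"))

    spark.stop()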

Duties
• Participates in sprint planning; provides work estimates to deliver product stories; owns development stories
• Provides data design/data movement services to help develop solutions that improve overall quality of data and support enterprise strategies
• Designs and executes the policies, practices, and procedures that support data quality, data security, data governance, and data standardization and integration
• Translates functional and technical requirements into detailed design
• Provides support for data movement solutions including extract, transformation, load (ETL)
• Understands data infrastructure needed to support data services and automation
• Designs, develops, tests and supports software in support of big data objectives
• Completes required coding to satisfy the defined acceptance criteria and deliver desired outcome
• Assists in development of automated testing and supporting code as necessary (see the test sketch after this list)
• Completes required documentation to communicate information to deployment, maintenance, and business teams
• Utilizes agile software development practices, data and testing standards, code reviews, source code management, continuous delivery, and software architecture
• Adopts Service Design, where appropriate, through architecture modularity to enable continuous delivery
• Resolves problems to decrease time to market; improves quality, enhances flexibility, and embraces a solution-provider mindset
• Provides input into overall testing plan; contributes to test approach and scenarios for requirements
• Provides product and/or process expertise necessary to support design, development, testing and execution of solutions
• Exhibits DevOps mindset
• Possesses an understanding of how technology solutions meet the business outcomes and offers a range of solutions for business partners
• Applies an expert understanding of development tools, processes, applications, programming languages and environments to assignments
• Provides highest level of support for problem and issue resolution and provides technical consultation and direction to business and product team members
• Utilizes application architecture to resolve complex issues more efficiently and effectively
• Champions and provides guidance with an innovative mindset to deliver product solutions
• Influences and provides direction on product development practices, coding, data and testing standards, code reviews and software architecture
• Experience with AWS ETL and consumption patterns.
• Understanding of the cost-benefit tradeoffs between various AWS services and appropriate use cases for each.
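
As a rough sketch of the automated-testing duty above, the following pytest-style check runs an assumed ETL transform against a tiny in-memory DataFrame on a local Spark session; the function, columns, and values are hypothetical examples, not requirements from this posting.

    # Hypothetical pytest-style test for an ETL transform.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    def daily_order_totals(df):
        # Assumed transform: keep completed orders, sum amounts per day.
        return (df.filter(F.col("status") == "COMPLETE")
                  .withColumn("order_date", F.to_date("created_at"))
                  .groupBy("order_date")
                  .agg(F.sum("amount").alias("total_amount")))

    def test_daily_order_totals():
        spark = (SparkSession.builder
                 .master("local[1]")
                 .appName("etl-test")
                 .getOrCreate())
        rows = [
            ("COMPLETE", "2020-11-26", 10.0),
            ("COMPLETE", "2020-11-26", 5.0),
            ("CANCELLED", "2020-11-26", 99.0),  # filtered out by the transform
        ]
        df = spark.createDataFrame(rows, ["status", "created_at", "amount"])
        result = daily_order_totals(df).collect()
        assert len(result) == 1
        assert result[0]["total_amount"] == 15.0
        spark.stop()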

• Understanding of the AWS Analytics & Big Data architecture and patterns
• 7+ years of development experience in AWS Analytics & Big Data
• Professional Certification in AWS Developer/Architect (preferred)
• AWS Certified Data Analytics – Specialty (preferred)

Company Information