Required Skills

SQL, Apache Hadoop, Spark, Airflow

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 29th Aug 2024

JOB DETAIL

1. 10 years’ experience designing and building performant, cost-optimized systems in AWS.

2. 10 years’ experience designing data platforms, data lakes, data meshes, data fabrics, data warehouses, or large data management systems.

3. 8 years’ experience using Agile software development methodologies.

4. 8 years’ experience designing and implementing data pipelines/ETL/ELT technologies.

5. 8 years’ experience designing databases, including creating entity relationship diagrams for logical and physical database models and the associated data dictionaries.

6. Knowledge of data technologies such as SQL, Apache Hadoop, Spark, and Airflow.

7. Experience presenting technical solutions to an architecture/software review board.

8. Excellent communication and interpersonal skills, with the ability to interact effectively with stakeholders of varying levels of technical acumen.

9. Knowledge of NIST and FISMA security reviews and the Authority to Operate (ATO) process.

10. Experience acquiring an in-depth understanding of large, complex software systems to isolate defects, reproduce defects, assess risk, and understand varied customer deployments.

11. Hands-on experience building cloud infrastructure on AWS.

12. Hands-on experience implementing data pipelines in AWS.

Company Information