Required Skills

Data Architect

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit

Preferred Employment

  • Corp-Corp

  • W2-Permanent

  • W2-Contract

  • Contract to Hire

Employment Type

  • Consulting/Contract

Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 27th Feb 2024

JOB DETAIL

As a Data Platforms Solutions Architect, you will be responsible for the technical delivery of our Data Platform's core functionality and strategic solutions. This includes developing reusable tooling/APIs, applications, data stores, and software stacks to accelerate our relational data warehousing, big data analytics, and data management needs. You will also design and develop strategic solutions that leverage big data, cloud, and other modern technologies to meet constantly changing business requirements.

Responsibilities include: 

  • Day-to-day management of several small development teams focused on our Big Data platform and Data Management applications 
  • Designing and driving execution of strategic technology solutions on Hadoop, Cloud, and other new relevant data technologies based on business requirements and firm-wide initiatives 
  • Collaboration within the organization on current and future state architecture 
  • Hands-on development working alongside the team 
  • Coordination with our DevOps teams, platform engineers, and production support to ensure the stability of our Big Data platform 

Skills Required: 

  • 10+ years of relevant technical experience 
  • Advanced-level experience developing modular solutions in Python, Scala, and Java 
  • Experience with Spark for data processing 
  • High-quality software architecture and design methodologies and patterns 
  • Experience maintaining Cloudera Hadoop infrastructure such as HDFS, YARN, Spark, Impala and edge nodes 
  • Strong SQL skills with commensurate experience in a large database platform 
  • Experience with the complete SDLC process and Agile methodology (Scrum) 
  • Strong oral and written communication skills 

Skills Desired: 

  • Experience building Java-based web applications 
  • Experience with Apache NiFi, Apache Kafka and Talend for data ingestion 
  • Experience with Apache Airflow for scheduling / job orchestration 
  • Experience with developing Cloud-based Big Data solutions on AWS or Azure 
  • Experience with Cloud Data Platforms like Snowflake or Databricks 
  • Experience with analytics tools/platforms like Dataiku and Elasticsearch 
  • Experience with Data Federation or Virtualization technologies 
  • Experience developing other application types (web applications, batch, or streaming) 

 

Company Information