Required Skills

BIG DATA ENGINEER

Work Authorization

  • US Citizen

  • Green Card

  • EAD (OPT/CPT/GC/H4)

  • H1B Work Permit


Education Qualification

  • UG: Not Required

  • PG: Not Required

Other Information

  • No. of positions: 1

  • Posted: 2nd Oct 2021

JOB DETAIL

• Responsible for all tasks involved in administration of the ETL tool (Ab Initio).
• Maintain access, licensing, and file systems on the ETL server.
• Provide guidance to ETL developers on ETL design and integration.
• Manage the Metadata Hub and Operational Console, and troubleshoot environmental issues that affect these components.
• Responsible for technical metadata management.
• Work with the team to maintain data lineage and resolve data lineage issues.
• Design and develop automated ETL processes and architecture.
• Interact with the client on a daily basis to define the scope of different applications.
• Work on break-fix and continuous development items, and perform review and inspection of production changes.
• Perform code reviews of ETL code developed by the development team and provide guidance to resolve any issues.
• Work with various other groups (DBAs, Server Engineering, Middleware, Citrix, Network, Data Transmission, etc.) to resolve performance- and integration-related issues.
• Provide updates to the client on the progress of infrastructure/development and unit testing tasks in weekly meetings.

    The incumbent’s accountabilities include, but are not limited to, the following:  

• 30% System Maintenance: Independently installs and maintains Big Data (Cloudera, Hortonworks, etc.) clusters in a highly available, load-balanced configuration across multiple environments (Production, User Acceptance, Development).

 

• 25% Systems Implementation: Independently implements complex technical system design changes for Big Data environments.

 

• 10% Procedural: Updates and maintains operations manuals, inventories, and written procedures relative to installing, maintaining, and using the Big Data environments. Works closely with Change Management, Configuration Management, and Security Management in developing and maintaining these procedures.

 

• 25% Technology Consulting: Leads problem resolution and coordination with Level 2 support. Performs integration of component-level systems into solutions as documented/directed by the Lead Big Data Administrator. Liaises with Infrastructure, Security, and application development.

 

• 10% Research and Development: Researches new technologies to benefit the business. Works with the Lead/Expert Big Data Administrator to develop recommendations and provide implementation support.

 

This position has no direct reports, but the incumbent may provide some technical guidance to less experienced staff.

This position requires a BA/BS in Computer Science, Information Systems, Information Technology, or a related field with 3-5 years of prior experience in software development, data warehousing, and business intelligence, or equivalent experience.

 

•     Administrator experience working with batch processing and tools in the Hadoop technical stack (e.g. MapReduce, YARN, Pig, Hive, HDFS, Oozie)

•     Administrator experience working with tools in the stream processing technical stack (e.g. Spark, Storm, Samza, Kafka, Avro)

•     Administrator experience with NoSQL stores (e.g. Elasticsearch, HBase, Cassandra, MongoDB, CouchDB)

•     Expert knowledge of AD/LDAP security integration with Big Data

•     Hands-on experience with at least one major Hadoop distribution such as Cloudera, Hortonworks, MapR, or IBM BigInsights

•     Advanced experience with SQL and at least two major RDBMSs

•     Advanced experience as a systems integrator with Linux systems and shell scripting

•     Advanced experience with data-related benchmarking, performance analysis and tuning, and troubleshooting

•     Excellent verbal and written communication skills

•     Experience with system usage and optimization tools such as Splunk

•     Healthcare experience

•     ETL solution experience, preferably on Hadoop

•     Experience with industry leading Business Intelligence tools

•     Big Data Administrator certification

•     Experience with Machine Learning and Artificial Intelligence

The physical demands described here are representative of those that must be met by an employee to perform the essential duties and responsibilities of the position successfully.  Requirements may be modified to accommodate individuals with disabilities.

 

 

Company Information