High-level proficiency in one or more ETL products such as Talend, Informatica, or Pentaho (Kettle), with at least 2 years of hands-on experience in Talend
Medium-level proficiency in EDI (preferably handling healthcare-related files, particularly the 834 benefit enrollment file format)
Build, test, deploy, and maintain ETL processes, handling various file formats such as CSV, fixed-length, and EDI files
One or more ETL tools (preferably Talend Studio), REST APIs, JSON, MySQL, Git
Continuously optimize, enhance, monitor, support and maintain all data integration processes.
Use REST APIs to export data from, or import data into, the in-house product
Analyze data, design and build database tables to capture all data elements, and load data while maintaining data lineage and operational tracking
Maintain documentation, manage source code and deployments, and implement best practices
Experience building scalable systems
Experience in processing large data volumes
Eagerness to learn, with strong analytical and critical-thinking skills
Excellent organizational, interpersonal, verbal, and written communication skills.
Ability to work well under deadlines in a fast-paced, changing environment
Ability to successfully execute many complex tasks simultaneously
Ability to work as a member of a globally distributed remote team, as well as independently
Good to have
Familiarity with CI/CD and Jenkins pipelines
Experience with cloud (preferably AWS) and microservices architectures, and multi-tenant solutions
Experience working in Agile and DevOps delivery environments
Low- to mid-level Java coding experience
Provide analysis and recommendations, and influence the overall strategy and roadmap for building the data integration ecosystem and its supporting toolsets
Design and build environments, frameworks, and processes to acquire data from healthcare clients/vendors in a secure and efficient manner
Set up processes to acquire data in structured or semi-structured formats, and implement tools and processes to consume the data received, either as a batch process or in real time
Focus on streamlining and improving data integration efficiency, reducing development and support effort while providing security, flexibility, and scalability
Advanced SQL, data integration, data modeling, and data architecture skills
Advanced knowledge of physical database design and data structures