- High-level proficiency in one or more ETL products such as Talend, Informatica, or Pentaho (Kettle), with at least 2 years working in Talend
- Medium-level proficiency in EDI (preferably handling healthcare-related files, particularly the 834 file format; a rough parsing sketch follows this list)
- Build, test, deploy and maintain ETL processes, handling various file formats such as CSV, fixed-length, and EDI files
- One or more ETL tools (preferably Talend Studio), REST APIs, JSON, MySQL, Git
- Continuously optimize, enhance, monitor, support and maintain all data integration processes.
- Use REST APIs to import data into, or export data from, the in-house product (an example call follows this list)
- Analyze data, design and build database tables to capture all data elements, and load data while maintaining data lineage and operational tracking
- Maintain documentation, manage source code and deployments, and implement best practices
- Experience building scalable systems
- Experience in processing large data volumes
- Eagerness to learn, with strong analytical and critical-thinking skills.
- Excellent organizational, interpersonal, verbal, and written communication skills.
- Ability to work well under deadlines in a fast-paced, changing environment.
- Ability to successfully execute many complex tasks simultaneously
- Ability to work as a member of a globally distributed remote team, as well as independently
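
To give a flavor of the EDI handling described above, here is a minimal sketch of splitting an X12 834 file into segments and elements. It assumes the common default delimiters ("~" segment terminator, "*" element separator) and a file path passed on the command line; production files should read the delimiters from the ISA header instead, and the INS filter is only an illustration.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Minimal sketch: split an X12 834 enrollment file into segments and elements.
// Assumes default "~" and "*" delimiters; real parsers read them from the ISA header.
public class Edi834Sketch {
    public static void main(String[] args) throws Exception {
        String raw = Files.readString(Path.of(args[0]));   // path to an 834 test file

        // Segments are terminated by "~"; strip newlines that some senders add.
        String[] segments = raw.replace("\r", "").replace("\n", "").split("~");

        for (String segment : segments) {
            String[] elements = segment.split("\\*");
            // INS segments carry member-level enrollment detail in an 834.
            if (elements.length > 0 && elements[0].equals("INS")) {
                System.out.println("Member record: " + List.of(elements));
            }
        }
    }
}
```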
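Similarly, a minimal sketch of pushing a transformed record into an internal product over REST using Java's built-in HttpClient. The endpoint URL, bearer token, and JSON payload fields are placeholders, not the actual product API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch: POST a transformed record to an internal REST endpoint.
// The URL, token, and payload fields below are placeholders.
public class RestImportSketch {
    public static void main(String[] args) throws Exception {
        String payload = "{\"memberId\": \"12345\", \"planCode\": \"GOLD\"}"; // hypothetical body

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.internal/api/members"))      // placeholder URL
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer <token>")                    // placeholder auth
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println("Import status: " + response.statusCode());
    }
}
```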
Good to have:
- Familiarity with CI/CD and Jenkins pipelines
- Experience with cloud (preferably AWS) and microservices architectures, and multi-tenant solutions
- Experience working in Agile and DevOps delivery environments
- Low- to mid-level Java coding experience
- Provide analysis and recommendations, and influence the overall strategy and roadmap for building the data integration ecosystem and supporting toolsets
- Design and build environments, frameworks, and processes to acquire data from healthcare clients/vendors in a secure and efficient manner
- Set up processes to acquire data in structured/semi-structured formats, and implement tools/processes to consume the data received, either as a batch process or in real time
- Focus on streamlining and improving data integration efficiencies, reducing effort related to development and support, while providing security, flexibility and scalability
- Advanced skill level in SQL, data integration, data modeling, and data architecture
- Advanced knowledge of physical database design and data structures
Additional considerations:
- While the project supports a flexible 40-hour week, core working hours need to fall within the window of roughly 7am ET to 7pm ET.
- Candidates need to be able to attend meetings within the above window; in particular, morning availability is critical for collaborating with the offshore team (alternate time zones can be considered if this is agreeable).
- While the job is remote, candidates cannot expect to work from outside the US.