US Citizen
Green Card
Corp-Corp
W2-Permanent
W2-Contract
Contract to Hire
Consulting/Contract
UG: Not Required
PG: Not Required
No. of positions: 1
Posted: 28 Apr 2025
The Department's Office of Information Technology (OIT) is seeking the services of two (2) experienced
Data Architects. The Candidates will be required to successfully complete tasks in relation to the Scope
of Work defined in Section 2.3.
RFQ-25-075
2.3. Scope of Work/Job Characteristics
The Data Architects, under the working job title of Extract, Transform, Load (ETL) Architects, will serve
as the principal line of communication for the project team. The ETL Architects will drive the development
of data integration pipelines, enabling efficient, reliable access to critical data within the Correction
Information Management System (CIMS) Data Warehouse/Data Lake on Azure, working with Azure Data
Factory (ADF), Azure Databricks, Azure Synapse, Power BI, and Azure Purview. At the forefront of
transforming complex data into actionable insights, the ETL Architects will be responsible for ensuring
data integrity, security, and performance while meeting mission-critical needs. The duties and
responsibilities of this position are as follows:
ETL Pipeline Design and Development:
• Lead the design and development of high-performing ETL processes to integrate and transform data
across disparate sources;
• Deliver efficient, reliable pipelines that meet business needs and maintain the highest standards of
security; and
• Utilize ADF to automate and streamline data workflows, ensuring smooth transitions from source to
target.
Data Integration and Transformation:
• Build and manage complex ETL workflows that extract, transform, and load data for downstream
analytics and reporting, ensuring data is accurate, timely, and secure; and
• Take ownership of data quality and validation, creating resilient ETL processes that ensure only
trusted data reaches its destination.
Cloud Platform Expertise:
• Leverage the full power of the Azure ecosystem—ADF, Databricks, Synapse, and Purview—to
manage and process high volumes of structured and unstructured data, delivering solutions that are
scalable and performance-optimized; and
• Integrate large datasets into Azure Synapse Analytics, enabling analytics teams to deliver data-driven
insights that support the Department’s mission.
Performance Optimization:
• Continuously optimize ETL jobs to minimize latency and maximize throughput; and
• Ensure the architecture supports fast, reliable data access for end-users and systems, meeting
stringent performance metrics.
Security and Compliance:
• Embed security and compliance best practices in every step of the ETL process;
• Protect sensitive data by adhering to industry standards and ensuring compliance with the
Department’s data governance policies; and
• Use Azure Purview to enforce data governance, track data lineage, and ensure that data handling
meets the highest standards of integrity.
Collaboration and Stakeholder Engagement:
• Partner with cross-functional teams (e.g., data engineers, analysts, business stakeholders, and
security experts) to design and implement ETL solutions that meet the Department’s evolving needs;
and
• Act as a technical leader and mentor, helping guide junior team members and providing expert
guidance on data processing and transformation best practices.
Documentation and Best Practices:
• Develop and maintain clear, detailed documentation for ETL processes, ensuring the team can
consistently deliver high-quality, reliable solutions; and
• Establish and enforce best practices for data handling, ETL development, and security, driving a
culture of excellence and accountability.
3. Work Environment
3.1. Location
The work will be conducted on-site with the Department's OIT in the Carlton Building at 501 South
Calhoun Street, Tallahassee, Florida.
3.2. Provided Equipment
The Department will provide furnished office space, computer equipment, licensed software, access to
the Department's network, and internet access.
4. Requirements/Qualifications
The Department is seeking two (2) full-time, on-site ETL Architects throughout the PO term of this RFQ.
The positions' required and preferred qualifications are described in Sections 4.1 and 4.2 and must be
verifiable in the Candidates' resumes.
NOTE: Any successful Candidate with access to the Department's network is required to complete the
Department's Security Awareness Training within 30 calendar days of hire.
4.1. Required Qualifications
A bachelor’s degree or master's degree from an accredited college or university in Computer Science,
Information Systems, or a related field is required. Alternatively, equivalent work experience, including
experience in Service-Oriented Architecture (SOA) and Microsoft Azure Cloud Solutions, can be
substituted for the educational requirement on a year-for-year basis, when applicable.
The Department requires the following experience, skills, and knowledge for this position:
• Seven (7) or more years of experience in ETL development and data engineering;
• Three (3) or more years of hands-on experience working with ADF, Azure Databricks, Azure
Synapse Analytics, and Azure Purview;
• Proven track record of building and optimizing large-scale ETL pipelines for high-performance,
high-availability environments;
• Extensive expertise in Spark, Python, and/or Scala for large-scale data transformations;
• Strong Structured Query Language (SQL) proficiency and experience working with complex data
structures;
• In-depth knowledge of data governance, security protocols, and role-based access control (RBAC)
within the Azure ecosystem; and
• Ability to design ETL processes that are resilient, efficient, and fully compliant with regulatory
standards.
NOTE: In addition to the above list, the selected Candidates must successfully complete a Level II
Background Check.
4.2. Preferred Qualifications
The Department prefers the Candidates to have the following experience, skills, and/or knowledge for
this position:
• Possession of Microsoft Certifications as an Azure Data Engineer Associate, Azure
Solutions Architect Expert, and Azure Fundamentals; and
• Databricks Certification as a Data Engineer Associate.