PDG B-5 2025 APPR: Comparing Previous Instrument Against New Instrument
Big Picture

Goals for New Instrument | Revised Instrument | Previous Instrument
More streamlined reporting to reduce the burden on grantees, making it easier and faster to complete. | 3 narrative response questions, reducing completion time. | 13 narrative response questions.
Limiting the number of required questions, focusing only on the most critical data elements needed for analysis. | 10 questions; reduced number of questions to focus on meaningful data. | 16 questions.
Collecting questions that provide data that can be easily aggregated, allowing us to identify trends across all grantees. | Questions include multiple-response questions and checkboxes to allow easily aggregated responses. | Heavy narrative questions, requiring tedious reading and analysis of narrative text to identify data points.
Formatting changes to ensure consistency in responses, enabling more meaningful comparisons across programs. | Formatting changes to ensure consistent data responses to compare across states. | Differing narrative responses produced inconsistent answers to questions.
Updates to improve quality and usability of collected information that can inform other priorities. | Simplified questions increase the usability of data to develop trends used to inform leadership. | Questions resulted in segmented responses that were difficult to compare state by state.
Content

Key Topic/Theme | Revised Instrument | Previous Instrument
Strengthening ECE Workforce | Q1-2 (Data/list questions): Collected data. | Q1-5 (Narrative questions): Make brief reference to initiatives to support the ECCE workforce in the context of improving program quality.
Expanding Access to ECE | Removed this section. | Q6-9 (Narrative questions): Collect information on the extent to which recipients used grant funds for specific subgroups, such as infants and toddlers, underserved children, children with/at risk for disabilities, etc.
Improving ECE Program Quality | Q3 (Data/list question): Collected data. Q4-5 (Data/list questions): Collected data. | Q10 (Narrative question): Collects information on what specific approaches recipients have used to improve program quality. Q16 (Narrative question): Collects information on coordination and referral of specific populations.
Family Choice and Engaging Families | Q6-7 (Data/list questions): Collected data. | Q11-12 (Narrative questions): Collect information on how grant funds were used to engage families as leaders and maximize parent and family choice, and how many family representatives have been engaged as leaders.
Strengthening ECCE Systems | Q8 (Narrative question): Collected data. | Q13 (Narrative question): Addresses how grant funds are used to support the state ECCE system; closely aligned with the previous instrument.
Coordination and Referral/Coordinated Application, Enrollment & Eligibility (CAEE) | Q9 (Narrative question): Collected data. | Q14 (Narrative question): Addresses how grant funds improve the coordination and delivery of ECCE services, including coordinated applications and eligibility.
State integrated data systems (ECIDS) | Q10 (Narrative question): Collected data. | Q15 (Narrative question): Addresses how grant funds strengthen/expand the state's integrated data system.