Attachment F - ED Response to 60-Day Public Comments


EDFacts Data Collection School Years 2022-23, 2023-24, and 2024-25 (With 2021-22 Continuation)


OMB: 1850-0925


Attachment F

Paperwork Reduction Act Submission Supporting Statement



Annual Mandatory Collection of Elementary and Secondary Education Data through EDFacts



March 2022












EDFacts Data Set

for School Years 2022-23, 2023-24, and 2024-25

Response to 60-Day Public Comments


Introduction

This attachment contains responses to public comments on the Annual Mandatory Collection of Elementary and Secondary Education Data through EDFacts. The 60-day comment period for the EDFacts package closed on January 14, 2022. The Department (ED) received a total of 46 comment submissions, many covering multiple topics, totaling 788 individual comments. The majority of submissions and comments came from states (see below).


Submitters      Submissions   Individual Comments
Total           46            788
State           38            770
Association     3             13
Individual      5             5


The Department received comments on each of the 37 directed questions. This document is organized by directed question topic. Comments not related to any directed questions are provided at the end of this document.


Each section provides a summary of the public comments received, the Department’s response(s) to those comments, and any resulting changes being made to the proposed data collection package. In addressing the public comments and making revisions to the package, the Department focused on recommendations from the public comments that continue to move EDFacts forward in achieving the goals of consolidating collections, obtaining high-quality data, and reducing burden on data submitters.


The Department appreciates the time and attention the public spent on reviewing the EDFacts package and in composing thoughtful comments that shape the final data set, as evidenced in this attachment. The Department reviewed, summarized and documented each statement prior to analyzing all statements. This documentation will aid in the finalization of this data clearance package and will serve to inform future policy decisions regarding EDFacts.






Special Education

Directed Question #1: IDEA Personnel Data: Under the last EDFacts Information Collection Package, OSEP revised the Part B Child Count and Educational Environments data collected in FSs 002 and 089 to require reporting children with disabilities who are age 5 and in kindergarten in the school age Part B Child Count and Educational Environments data (FS 002) and reporting children with disabilities age 5 and not in kindergarten in the preschool Part B Child Count and Educational Environments data (FS 089). ED is proposing to change the age groups used in the IDEA Personnel data collected in FSs 070 (special education teachers) and 112 (paraprofessionals) to align with the Part B Child Count and Educational Environments data age groups. This change is also in response to public comments received in the last EDFacts package regarding the misalignment between child and staff counts.

  1. Can your state report IDEA personnel data in these revised age groups?

  2. What is the impact of this change?



Public Comments

Twenty-four states and one association provided input on the proposed change to revise the age groups associated with reporting the special education teacher full time equivalent (FTE) counts and the paraprofessional FTE counts from ages 3-5 and ages 6-21 to ages 3-5 (not in kindergarten) and ages 5 (in kindergarten)-21 in the IDEA Personnel data. The majority of states responded that they would be able to report the IDEA Personnel data disaggregated by the proposed age groups and expressed overall support for this change.

Several commenters expressed concerns with collecting and reporting the special education teacher FTE counts and paraprofessional FTE counts by these revised age groups. Two commenters said their current data systems/data collections do not allow them to report the special education teacher data disaggregated by these revised age groups, and two commenters said they could not report paraprofessional data by these revised age groups with their current data system/collection. Three commenters said that they are not able to report personnel data by age but are able to report it by grade.

ED Response

The majority of comments supported this change, and the Department is maintaining the requirement for states to report IDEA Personnel data by age-based categories. These proposed changes resulted from suggestions made by states and an association during the 30-day public comment period for the EDFacts Data Collection School Years 2019-20, 2020-21, and 2021-22, and were noted in the Special Education Teachers File Specifications (FS 070). Beginning with the SY 2019-20 IDEA Personnel data, the Department has allowed states to report special education teachers and paraprofessionals who are employed or contracted to work with 5-year-old children with disabilities who are in kindergarten in the school age reporting category (permitted value 6TO21). If states took advantage of this flexibility, the Department requested that the state upload a data note explaining where the personnel employed or contracted to work with 5-year-old children with disabilities who are in kindergarten were reported in the IDEA Personnel data.

The Department understands that reporting the IDEA Personnel data by the revised age groups may result in states having to make changes to their data collections and/or data systems. It is important to the Department to align the age groups used for reporting in the IDEA Part B Child Count and Educational Environments data collection and the IDEA Part B Personnel data collection in order to understand and analyze the number of special education personnel needed to support children with disabilities who are receiving services under Part B of IDEA. The proposed change does not amend the requirement that states must report IDEA Personnel data by age-based categories; it would require states to report 5-year-olds by the grade levels of preschool and school age in order to better align these data with the changes to the Part B Child Count and Educational Environments data.



Directed Question #2: ED is now proposing to implement four new data points associated with CCEIS in the MOE & CEIS EMAPS survey.

  1. Does your state collect the data needed to report the number of children receiving comprehensive coordinated early intervening services as defined by 34 C.F.R. §300.646 for each LEA or ESA that receives a Section 611 or Section 619 subgrant from the state in the Maintenance of Effort and Coordinated Early Intervening Services data collection?

  2. Does your state collect the data needed to report these counts disaggregated by the following age groups: preschool age (ages 3 through 5 not in kindergarten) and school age (ages 5 in kindergarten through 21)?

  3. Does your state collect the data needed to report these counts disaggregated by disability status: children with disabilities (IDEA) and children without disabilities?

  4. If states do not collect the needed information at this time, how long would states need to be able to report these data to ED?

  5. Would this proposed collection create unique challenges of which you would like ED to be made aware?



Public Comments

Twenty-five states, one association, and two individuals provided input on the proposed change to collect four new data elements associated with children receiving comprehensive coordinated early intervening services (CCEIS) in the Maintenance of Effort Reduction and Coordinated Early Intervening Services (MOE Reduction and CEIS) data collection. The majority of commenters communicated that they do not have all the data needed to report on the proposed data elements and expressed concern with the Department requiring the reporting of the proposed data elements. However, several other commenters indicated they collect the data needed to report on the proposed data elements. Several commenters requested information on how the Department will use the data collected under the proposed data elements considering the large increase in reporting burden.

ED Response

To limit reporting burden, the Department is withdrawing the proposal to collect the count of children receiving CCEIS disaggregated by age group. The Department will use the count of children receiving CCEIS disaggregated by disability status to monitor the implementation of the Equity in IDEA Regulations on Significant Disproportionality, published in the Federal Register on December 19, 2016. Allowing children with disabilities to receive CCEIS was a significant change in the implementation of IDEA. These data will allow OSEP to investigate, monitor, and understand the impact of this change on findings of significant disproportionality. Due to the need for states to collect new data to report on these elements, the Department is proposing to delay the implementation of the proposed data elements associated with children receiving CCEIS for one year.



Directed Question #3: IDEA State Supplemental Survey: ED is considering separating the questions in the IDEA State Supplemental Survey into two surveys that align with the submission and resubmission timelines for the associated FSs and EMAPS surveys: one survey associated with those IDEA Section 618 Part B data collections due in November (FSs 005, 006, 007, 088, 143, 144, 009, 070, 099, and 112) and one survey associated with those IDEA Section 618 Part B data collections due in April and May (FSs 002 and 089 and the Maintenance of Effort and Coordinated Early Intervening Services survey in EMAPS). ED expects that this change would provide a positive impact on the efficiency and accuracy of the information provided via the IDEA State Supplemental Survey. ED is not proposing changes to any of the questions currently present in the IDEA State Supplemental Survey.

  1. Can your state report this information split in two different surveys?

  2. How would splitting this survey impact your state?


Public Comments

We received input on the proposed change to divide the current IDEA State Supplemental Survey (IDEA SSS) metadata survey into two surveys from 23 commenters. Of those 23 commenters, 22 represented states and one represented an association. The proposed change would align the submission and resubmission dates of the surveys with the submission and resubmission dates for the data due the first Wednesday of November (file specifications 005, 006, 007, 088, 143, 144, 009, 070, 099, and 112) and the data due the first Wednesday of April (file specifications 002 and 089). Close to half of the commenters expressed that this change would have little or no impact on their ability to collect and report the IDEA SSS metadata. Over half of the commenters expressed concerns that this change would increase reporting burden. Additionally, several commenters expressed concerns with the question on how the state defines significant disproportionality in the IDEA SSS metadata survey.

Several states expressed concerns that the question on how states define significant disproportionality in the IDEA SSS metadata survey is duplicative. The comments noted that the information is also collected in the annual IDEA Part B formula grant application in more detail.

ED Response

In response to these comments and concerns, OSEP is withdrawing the proposal to divide the current IDEA SSS metadata survey into two surveys. The Department appreciates the comments and the concerns with dividing the current IDEA SSS metadata survey into two surveys and understands that, while dividing the current IDEA SSS metadata survey into two surveys may better align the submission dates with the associated IDEA Section 618 data submission dates, it does increase burden on the states. The proposed change would increase the number of submission and resubmission periods for states to track and would require States to submit the IDEA SSS metadata survey twice annually.

The Department agrees with the commenters that the question on how states define significant disproportionality is addressed in more detail in the annual grant application. To reduce reporting burden and eliminate a duplicative data element, the Department will remove the significant disproportionality question from the IDEA SSS metadata survey.




Title I

Directed Question #4: Title I School Status: Title I School Status (DG 22) is currently collected in the CCD School file specification (FS 129). ED is proposing to move this data group into the Title I Part A SWP/TAS Participation file specification (FS 037), connecting its collection with other Title I data groups. This would also change the due date of this data group. Currently, FS 129 is collected in March of the same school year, while FS 037 is collected the following February. This means there will be at least a one-year delay in receiving the Title I School Status data group for SY 2022-23.

  1. What is the impact on your state associated with reporting Title I school status in February of the following school year rather than in March of the same school year, as currently collected?



Public Comments

Twenty states and one association provided comments to the directed question about the impact of moving Title I School Status (DG 22) from the CCD School file (FS 129) to the file collecting counts of students participating in and served by Title I, Part A schools operating schoolwide programs and targeted assistance programs (FS 037). Nineteen states and the association supported the proposal to move the data group or said the move would have little to no impact on their burden. A few states reported that the new file’s deadline would provide them more time to collect and report this data. One state expressed concern based on the understanding that the change would move reporting of the data group to an earlier time; in fact, the proposed change would move reporting to a later time in the following school year.

ED Response

Based on nearly all responses indicating the proposed change would have little or no impact on state reporting, the Department is keeping the proposed change to move collection of Title I DG 22 out of FS 129. In addition, to simplify reporting of DG 22 so that counts and statuses for schools are reported in files for the specific types of data, the Department is proposing to move DG 22 into its own file specification beginning in SY 2022-23.



Directed Question #5: Section 1003 Funds: ED is proposing to move the Economically Disadvantaged Students data group (DG 56) out of the Section 1003 Funds file (FS 132) and into its own file specification. This change is being proposed because DG 56 is not related to Section 1003 funds, and its inclusion in that file could cause confusion when reporting this data group.

  1. What is the impact on your state associated with reporting this data group in a separate file?


Public Comments

Nineteen states provided comments to the directed question about moving the counts of economically disadvantaged students (DG 56) from the Section 1003 Funds file (FS 132) and into its own file. Thirteen states indicated the proposed change would have little or no impact on their reporting, and two of these states explicitly supported the change. Four states indicated making the proposed change would create burden for them, and one state indicated the added burden would be substantial. Another state indicated on-going submission of an additional file would add burden. One state indicated it would be helpful to understand the benefits of creating a new file rather than moving the data group to an existing file, such as FS 033 (Free Lunch and Reduced-Price Lunch).

ED Response

Based on the majority of responses indicating the proposed change would have little or no impact on state reporting, and given the value of collecting the counts of economically disadvantaged students (DG 56) separately from the file more narrowly focused on subawards of Section 1003 funds (FS 132), the Department is keeping this proposed change. While certain other files collect related data (e.g., FS 033), both the nature of the data and the timelines for collecting the data vary across DG 56 and those other files.


Directed Question #6: Title I Allocations (FS 193): Most LEAs (over 75 percent) do not receive a McKinney-Vento subgrant, and the mandatory Title I, Part A LEA homeless set-aside under ESSA is the primary Federal education resource for addressing the unique or specific needs of students experiencing homelessness, who are also automatically eligible for Title I services. Therefore, ED is proposing to add a data group to collect the dollar amount of the Title I, Part A allocation reserved by the LEA to serve homeless children and youth.

  1. Can your state report this data group?

  2. What is the anticipated impact of reporting this data group in your state?



Public Comments

Twenty states provided comments to the directed question about the impact of collecting LEA homeless reservation amounts for Title I, Part A in FS 193. Thirteen states commented that the proposed change would entail a low additional burden and seven states commented that it would require some additional burden. Of the states commenting in support of the proposal, two states commented that it would provide valuable information on services for students experiencing homelessness. Three states said the information would be easier to collect if it were the amount reserved and not the actual expenditures. Of the seven states that commented that it would require some extra burden to collect and submit, several commented that the data are available.


ED Response

Based on the majority of responses that the proposed change would have little impact on burden for the state and the value of the additional data for supporting services for homeless children and youth under Title I, Part A, the Department is keeping this proposed change and has clarified the definition to show that the initially reserved amounts, not actual expenditures, are to be reported.



Directed Question #7: Title I, Part A Homeless Reservation: The Elementary and Secondary Education Act of 1965 (ESEA) requires LEAs to reserve funds from their Title I, Part A allocations to provide services to homeless children. ED is proposing to collect the number of homeless children and youth served by Title I, Part A programs to better understand the numbers of these students being served.

  1. Can your state distinguish homeless children and youth served by Title I, Part A programs under the reservation for homeless children and youth?

  2. What is the anticipated impact of reporting this data group in your state?


Public Comments

Twenty-one states provided comments to the directed question about the impact of reporting the number of homeless children and youth served under the Title I, Part A LEA homeless set-aside. Six states commented that the proposed change would have little impact on the burden of reporting for the state. Two states commented the proposed change would raise awareness of barriers for children and youth experiencing homelessness. However, fifteen states commented the new data collection would be a heavy burden. One state noted there is no definition of what “served” by the Title I, Part A set-aside means for students experiencing homelessness. Several states commented these data are not collected in the state and collecting the data would be burdensome and costly. One state confused this proposed new collection with the retired collection for McKinney-Vento subgrants (FS 043, DG 560). Three states proposed using existing LEA homeless enrollment counts from FS 118 in place of the proposed new collection since all enrolled homeless students may be and sometimes are served by the set-aside.


ED Response

Because the clear majority of responses indicated that the proposed change would create a heavy burden for states and LEAs, and because three states suggested the alternative of using existing LEA homeless enrollment counts instead, the Department is no longer proposing this change.



Directed Question #8: Students in Foster Care: The Elementary and Secondary Education Act of 1965 (ESEA) requires SEAs and LEAs to coordinate with state and local child welfare agencies to ensure the educational stability of children in foster care. ED is proposing to collect the number of students who are in foster care and enrolled in a public LEA and who are eligible for Title I, Part A services under the reservation for students in foster care to better understand the numbers of these students being served.

  1. Can your state distinguish students who are in foster care for all your LEAs or only those LEAs who are reported in file specification 134 as eligible for Title I, Part A services?

  2. What is the anticipated impact of reporting this data group in your state?


Public Comments

Twenty-two states and one association provided comments on this new data group. Fourteen states indicated the proposed change to collect counts of students in foster care in LEAs that receive Title I, Part A funds would have little or no impact on the burden of reporting for the state. Eight states indicated the proposed change would be a more substantial burden, particularly for those states not already collecting the data. Similarly, a few states requested clarification regarding which students in Title I LEAs would be included in the counts reported. Comments from four of these states suggest the estimated increase in burden was based on a misunderstanding of the proposed change. Specifically, these commenters were concerned about a state’s ability to report whether students in foster care are eligible for Title I, Part A services; that determination is outside the scope of the proposed data collection, which would require counts only for LEAs that receive Title I, Part A funds. A few comments from states and a national association underscored that data on the numbers of students in foster care would be beneficial for supporting this student population.

ED Response

Based on the majority of responses that the proposed change would have little impact on burden for the state and the value of the additional data for supporting services for students in foster care under Title I, Part A, the Department is keeping this proposed change. Regarding which students in Title I LEAs would be included in the data reported, the Department has refined the definition to be the number of students in foster care enrolled in Title I LEAs, rather than a count of students in foster care who are receiving Title I, Part A services in schools (i.e., targeted assistance or schoolwide programs).



Directed Question #9: Comprehensive Support and Targeted Support Identification (FS212): Consistent with the ESEA, each State is permitted to create State-specific subgroups. This change is intended to allow States that utilize State-specific subgroups to include those subgroups when reporting the reason for identification for schools identified as additional targeted support and improvement (ATSI) and targeted support and improvement due to consistently underperforming subgroups (TSI).

  1. Will this change support your State in accurately reporting the reason(s) for identification for ATSI and TSI schools?

  2. If not, what is needed to accurately report the reasons for identification?



Public Comments

Eighteen states provided comments to the directed question regarding the proposed change to allow states that have defined state-specific subgroups in approved ESEA Consolidated State Plans to report the performance of those subgroups as the reason(s) for identification for schools identified for additional targeted support and improvement (ATSI) and targeted support and improvement (TSI) due to consistently underperforming subgroups. Nine states indicated the change would support the state in accurately reporting the reason(s) for identification of schools for ATSI and TSI. The remaining nine states indicated the change would have no impact on their reporting, most because the states had not defined state-specific subgroups.

ED Response

Based on the responses that this change would support states that have defined state-specific subgroups and has no impact on states that do not, the Department is keeping this proposed change.

The Department is making a conforming change to remove the permitted value for an Underserved Race/Ethnicity subgroup. Because such a subgroup can be reported as a state-defined subgroup, a permitted value for this specific subgroup is no longer needed. In addition, because records are not required for reasons that do not apply, the Department is removing the Reason Does Not Apply permitted value, which will decrease burden. Finally, the Department has become aware that states are not able to report reasons for comprehensive support and improvement (CSI) identification for schools that have not met state-defined exit criteria and are required to implement more rigorous state-determined action, because there is no permitted value for these schools in FS 212. As such, the Department added a permitted value for reporting these schools.



Directed Question #10: ED is proposing to add permitted values to five indicator status file specifications (199, 200, 201, 202, and 205) to allow reporting results in cases where a state has defined more than one measure for the indicator. These changes align with the approach for multiple measures currently used in FS 202 and will allow states to report the indicator data in the form in which they likely already maintain it for their own uses. This change will have no impact on states that have defined only one indicator measure that is submitted for the file.

  1. Is this change sufficient to allow complete reporting of the indicators?

  2. If not, what is needed to allow complete reporting of the indicators?


Public Comments

Seventeen states and one association provided comments to the directed question about the impact of adding permitted values to five indicator status file specifications (199, 200, 201, 202, and 205) to allow reporting results in cases where a state has defined more than one measure for the indicator. Seventeen states indicated the change would have no impact and/or be sufficient to allow complete reporting of the indicators. One state raised several questions about how the indicator statuses would be reported.

ED Response

Based on the responses that the proposed change would be sufficient to allow complete reporting of the indicators, the Department is keeping this proposed change to allow reporting results in cases where a state has defined more than one measure for the indicator. However, the Department is adjusting the number of measures that may be reported for specific indicators to better align with states’ approved ESEA Consolidated State Plans. The proposal allows a state to report results for each measure it has defined for an indicator, allowing for 6 measures for the Academic Achievement indicator, 6 measures for the Other Academic indicator, 3 measures for the graduation rate indicator, 2 measures for the progress in achieving English language proficiency indicator, and 12 measures for the school quality or student success indicator.


The Department has also revised the associated Accountability Metadata Survey to collect information to support collection and interpretation of data reported for indicator statuses. To ensure consistency between these data files and the Accountability Metadata survey, the Department streamlined and reformatted questions regarding indicator type and definitions, streamlined questions about the names of elements/measures and related performance levels and values for the performance levels, and reformatted questions about grades/grade levels to which the measures for indicators apply. In addition, to more narrowly focus on indicator statuses, the Department is removing questions no longer needed. Regarding FS 199 (Graduation Rate Indicator Status), the Department is removing the questions about an alternate diploma to reduce burden; the Department will rely on related information collected in FS 151 (Cohorts for Adjusted Cohort Graduation Rate) as a substitute. The Department is also removing the questions about FS 160 (High School Graduates Postsecondary Enrollment); the Department will rely on directions provided in the file specification and as needed, information collected elsewhere to replace this collection of information. (See Attachment C for the revised Accountability Metadata Survey.)


Major Racial Ethnic (Accountability)

Public Comment

Two states provided comments on the Major Racial Ethnic data category for accountability file specifications (199, 200, 201, 202). One state questioned whether states must collect data for the Major Racial and Ethnic subgroups as defined for Title I assessment and accountability files. Another state recommended revising the permitted values for the subgroups in the assessment files to allow reporting for a Native Hawaiian subgroup and a Pacific Islander subgroup, consistent with the permitted values for the accountability data groups.

ED Response

A state is only required to report data for the major racial and ethnic subgroups defined for the state’s Title I accountability system as outlined in the state’s approved ESEA Consolidated State Plan for the reporting year.

The proposed change to indicator data groups allows states that have state-defined subgroups for Title I accountability purposes in the state’s ESEA Consolidated State Plan to report data for up to three State-defined subgroups. With this change, states will be able to report data for a Native Hawaiian subgroup and a Pacific Islander subgroup, or other state-defined subgroups not currently included in the permitted values, for the indicator files. Because data in the indicator and school identification files must align with approved ESEA Consolidated State Plans and are used for a state’s system of annual meaningful differentiation, the Department believes collecting data for state-defined subgroups, where applicable, for these files justifies the additional burden. The assessment and completer files are used to calculate indicators, but are not by themselves used for accountability, therefore, the Department does not believe collecting data for state-defined subgroups for these files justifies the additional burden.

Title I: Neglected or Delinquent

Directed Question #11: N or D Participation - State Agency: Currently, ED collects data on students participating in neglected programs and those participating in delinquent programs as one count, even though these programs are different (FS 119 / DG 656). ED is proposing to collect the number of students participating in neglected programs separately from those participating in delinquent programs to better understand and support these different student populations. In addition, ED is proposing to add a disaggregation by economically disadvantaged status to better understand who is served within these two programs. ED is proposing these changes to gain a better understanding of the number of students within each program.

  1. Can your state report the number of students participating in neglected programs separate from those participating in delinquent programs?

  2. Can your state disaggregate these counts by economically disadvantaged status?

  3. What is the anticipated impact of reporting this data group in your state?



Public Comments

Twenty-five states provided comments to the directed question about separating students participating in neglected programs from those in delinquent programs, disaggregating by economically disadvantaged status, and the impact of reporting this data group. Twenty-two states responded they would be able to report students participating in neglected programs separate from delinquent programs. Three states reported that they could not report that data at this time and would need to update their data collection tools. Of the states that responded, roughly three-quarters commented that the collection would pose a significant burden.

Fifteen states responded that they could collect data on economically disadvantaged status; ten responded they could not. Of the fifteen who said they could, nine reported it would be a significant burden and would require updating their data collection methods. One state noted that its numbers were small and the amount of work needed to obtain the small amount of data was not cost effective. Of the ten who said they could not, five noted that all students in facilities receive free lunch, which would render the count at 100 percent. One state noted that the high percentage of students in facilities who are low SES would render the data meaningless. Two states also noted that facilities do not have access to the state’s student information system (SIS) and had no method of reporting this data. Several states also asked for clarification of the definition of “economically disadvantaged” and the method for making this determination.

ED Response

Based on the responses received that SEAs could report neglected and delinquent programs separately, the Department will move forward with this proposed change.


Based on the negative response regarding the request to disaggregate student data by economically disadvantaged status, the Department is withdrawing the proposed change. States cited a significant burden in updating data collection tools and challenges in providing facilities with access to existing student information systems. In addition, states noted that all children in facilities may be categorically eligible for free meals, the standard often used to denote “economically disadvantaged,” which would render the data meaningless.



Directed Question #12: N or D Participation – LEA: Currently, ED collects data on students participating in at-risk programs and those participating in delinquent programs as one count, even though these programs are different (FS127 / DG657). ED is proposing to collect the number of students participating in at-risk programs separate from those participating in delinquent programs to better understand the number of students within each program and to support these different student populations.

  1. Can your state report the number of students participating in delinquent programs separate from those participating in programs for at-risk students?

  2. What impacts are anticipated in your state with reporting this data group?


Public Comments

Twenty-four states provided comments to the directed question regarding the ability to report students participating in delinquent programs separate from those participating in programs for at-risk students. Twenty-one states responded that they were able to collect this data. Of those, eleven reported that there would be little to no impact because they are already collecting the data, or it would not be difficult to do so. Three reported some impact, but not a significant amount. Six states reported that there would be a significant burden to the SEA to update data collection systems and that additional time and technical assistance would be needed. Three states responded that the data collection was not applicable as they did not fund at-risk programs. Only one state responded that it could not collect the data, noting it could do so by SY 2023-24.

ED Response

Based on the majority of responses indicating that this data can be collected with minimal impact, the Department is keeping this proposed change.



Directed Question #13: N or D Program Data Categories: Currently, OESE collects data on students participating in neglected, delinquent, and at-risk programs by one set of permitted values, even though the program types vary by program. OESE is proposing to separate out each program to better understand where each program serves its students.

  1. Can your state report the number of students participating in these programs?

  2. What impacts are anticipated in your state with reporting these data categories?

  3. Under Section 1432(4)(A) of the ESEA, a neglected program is defined as a public or private residential facility, other than a foster home, that is operated for the care of children who have been committed to the institution or voluntarily placed in the institution under applicable State law, due to abandonment, neglect, or death of their parents or guardians. Due to this broad definition, neglected programs that could receive Title I, Part D funds would not be limited to the program types currently proposed. Are there any program types missing from the proposed list that would more accurately categorize the neglected programs or congregate care settings in your State?


Public Comments

Twenty-four states provided comments to the directed question regarding the ability to report students participating in neglected, delinquent, and at-risk programs, including the impact that this change would have. Ten states responded that this data is already collected or could be collected with minimal impact. Ten states responded that this data could be collected but would present an additional burden due to the need to update data collection forms and the need for additional professional development and technical assistance to help facilities collect the data correctly. One state noted it would also need to update data sharing agreements.

Three states reported that they could not collect this data. One state noted it does not collect information on types of facilities. One state noted that it could collect this data on Subpart 2 programs, but not on Subpart 1. The third state does not collect this data but noted support for collecting data on additional categories.

Additionally, states reported on missing types of facilities in these categories. The following additional categories were suggested: other (two states), adult detention (one), tribal facilities (one), transitional camp (one), care and treatment (one), group home (one), shelters (one). Two states asked if non-residential programs would be eligible for funding. One state asked for clarification on whether this information would be collected at the building level or program level as they have facilities offering multiple program types under one roof. One state noted that it did not currently count certain types of neglected facilities served by LEAs and asked for further clarity. One state asked about non-residential community day programs, which are eligible for Subpart 1 services and are a proposed reporting category. Several states inquired why only Subpart 1 programs for neglected students were included in this question; the Department notes that neglected students are served only under Subpart 1 of this program, not Subpart 2, which serves delinquent and at-risk students.

ED Response

Based on the responses, the Department proposes to keep the change. Regarding the suggested categories for neglected programs under Subpart 1, most of the suggestions received were specific to only one state and therefore are not advisable to add as reporting categories. A permitted value of “other” was added to allow for state-specific responses.



Directed Question #14: N or D In Program Outcomes: ESEA requires ED to evaluate the program and the program’s impact while students are still enrolled. To reduce burden, ED is proposing to eliminate the disaggregation of program type from the current collection (FS180 / DGs 782 and 783) and replace it with two new data groups.

  1. Can your state report the number of students participating in neglected AND delinquent programs who attained academic and career and technical outcomes while enrolled in the programs?

  2. Can your state report the number of students participating in at-risk AND delinquent programs who attained academic and career and technical outcomes while enrolled in the programs?

  3. What impacts are anticipated in your state with reporting these data groups?

  4. Since at-risk programs could include a wide range of focuses and formats, developing uniform evaluation measures without stakeholder input could impact the ability to effectively report and analyze any data that is collected. Based on the at-risk programs in your State, what student or program outcomes are currently collected to evaluate all at-risk programs upon completion?



Public Comments

Twenty-four states provided comments to the directed question regarding the reporting of academic and CTE outcomes for students enrolled in neglected and delinquent programs and at-risk and delinquent programs. Twenty-two states responded that they would be able to collect the number of students participating in neglected AND delinquent programs who attained academic and career and technical outcomes while enrolled in the programs; three noted they would have to update their reporting systems, and one state noted that the data could only be collected while students were in the facility. Another state noted that CTE participation data was difficult to collect due to the structure of programs in their state. One state also suggested that data be collected “while enrolled” and “upon exit.” Two states reported that they could not collect the data, but one of those noted that they could if they updated their data collection tool.

Twenty-three states responded that they would be able to collect the number of students participating in at-risk AND delinquent programs who attained academic and career and technical outcomes while enrolled in the programs. One state reported that it could not. As with the first question, states noted the need to update data collection systems. Again, one state suggested collecting data “while enrolled” and “upon exit.”

Eighteen states reported no impacts in collecting these data groups. Six states reported they would need to update their data collection tool. Three states reported a need for additional technical assistance. One state requested clearer definitions. One state noted they cannot collect outcome data once a student turns 18.

Of the states that responded to the question about at-risk programs, four noted they do not fund at-risk programs, six did not report additional outcomes beyond those already collected, and three felt that the outcome data was problematic as it was difficult to collect, was not reliable, and did not yield information that was meaningful. One state suggested eliminating the “90 days after exit” collection as it is difficult to follow student outcomes after students leave the program (the Department notes that this concern pertains to FS 181 rather than FS 180). One state noted that it collects outcome data on students at risk of academic failure and in contact with the juvenile justice system.

ED Response

Based on the responses, the Department proposes to keep the proposed changes, which provide language clarifying that Title I, Part D Subpart 1 serves students participating in neglected and delinquent programs, while Subpart 2 serves students participating in at-risk and delinquent programs.



Directed Question #15: N or D Exited Program Outcomes: ESEA requires ED to evaluate the Title I, Part D, Subpart 1 and 2 programs and the outcomes of those students. ED is proposing to remove the 90-day outcome period (FS181 / DGs 785 and 784) and replace it with two new data groups.

  1. Can your state report the number of students participating in neglected AND delinquent programs who attained academic and career and technical outcomes at the time of exiting from the program?

  2. Can your state report the number of students participating in at-risk AND delinquent programs who attained academic and career and technical outcomes at the time of exiting from the program?

  3. What impacts are anticipated in your state with reporting these data groups?

  4. With the proposed change of collecting this data within 14 calendar days rather than 90 days after exit, what data quality issues, if any, do you believe will affect your State’s ability to report these outcomes?



Public Comments

Twenty-four states provided comments to the directed question regarding the reporting of outcome data on students participating in neglected and delinquent programs and at-risk and delinquent programs who attained academic and CTE outcomes at the time of exiting the program.

Twenty-one states responded that they would be able to collect this data. Of those who responded yes, one noted that additional technical assistance would be needed, one noted that many LEAs could not track this data, one noted the data was not accurate, one noted it was difficult to track after exit, and two noted they would have to modify their data collection tool. One state suggested the data should be collected “while enrolled” and “at exit.” Three states reported that they could not collect the data. Of these, one reported that they would need to change their survey tool and one noted challenges with the structure of programs in the state.

Three states reported there would be no impact, one reported a minimal impact, seven reported that the impact would be burdensome as they would need to update data collection forms, and five reported a need for additional technical assistance. Two states felt the data would be more accurate.

Five states reported that the proposed change to collect the data within 14 days rather than 90 days would be more accurate. Two states indicated a strong preference to report outcomes at the time of exiting the program, and two additional states noted they could not track students after exit due to state laws. Several states requested additional clarification about the “within 14 days” timeline. In general, respondents reported it would be very difficult to report meaningful outcomes within a two-week period.

ED Response

Based on the responses received, the Department will maintain the proposed clarifying language regarding the breakout of outcomes for students in neglected and delinquent programs under Subpart 1 and students in at-risk and delinquent programs under Subpart 2.


Regarding the question about whether states would prefer to report on outcomes within 14 days of exit rather than 90 days, the responses were mixed, and it appears that states need clarification on this question. Several states commented on the difficulty of reporting outcomes after students exit the programs; however, the statute requires that states report on long-term outcomes such as graduation, high school credit accrual, employment, and enrollment in postsecondary education and job training programs. While some states felt that 14 days would provide more accurate data, others felt that the data would not be meaningful within 14 days of exit. Therefore, we will withdraw this question as many states do not feel confident that this change will yield more meaningful data.



Directed Question #16: N or D New Data Groups: OESE is proposing to move the collection of length of stay from the Consolidated State Performance Report to EDFacts and is proposing two new data groups to collect information on recidivism of students in delinquent programs, since one of the requirements of Title I, Part D is to help students transition back into their local schools.

  1. Does your state collect these data so that the counts can be disaggregated by program type?

  2. If states do not collect the needed information at this time, how long would states need to be able to report these data to ED?

  3. What impacts are associated with reporting these data to ED?



Public Comments

Twenty-five states provided comments to the directed question regarding the ability to collect length of stay and recidivism data for delinquent programs at the SEA and LEA levels. Eighteen states reported the ability to collect data on length of stay for both SEA and LEA programs. Seven states responded that they did not collect this data, but all responded that they could, given time and support. No states currently collect data on recidivism.

Four states responded that they would need at least a year to collect this data. Five stated they would need two years. Two reported needing at least three years. One reported this was an undue burden due to staffing issues and overwhelming numbers of students in the programs.

Six states noted they would need to update their data collection tools and provide additional technical assistance. Nine states wanted more clarity around the definition of recidivism, including how to count students who return to different programs or whose returns span multiple reporting periods. Several states noted that this data is reported by their LEAs in the aggregate and they do not have it at the student level. One state noted its numbers are small and the change would be burdensome.

ED Response

Due to difficulty in collecting recidivism data and questions about how the data would be used to improve program outcomes, the Department will withdraw this proposed collection. Based on the number of generally positive responses, the Department will maintain the collection of average length of stay but has withdrawn the proposal to move this collection to EDFacts from CSPR.



Directed Question #17: N or D Academic Achievement: ED is proposing to remove program type as a category in both the State and LEA level files. However, in order to fulfill the program evaluation requirement under Section 1431(a)(1) of the ESEA, the current proposed EDFacts package will continue to collect pre- and post-testing data to determine improvement in student academic performance. Across the nation, less than 25% of the students served by Title I, Part D complete pre- and post-tests, as reported by State educational agencies.

  1. How can data quality for this item be improved in your State?

  2. What academic performance data outside of or in lieu of pre- and post-tests does your State currently collect?



Public Comments

Twenty-two states provided comments to the directed question regarding how data quality can be improved and what academic performance data outside of pre- and post-testing is currently collected (FS 113, 125). States reported challenges in reporting pre- and post-testing. Six states noted that students leave facilities with little notice and there is no time to post-test. One state noted that students may be impacted by trauma and do not perform well on tests. One state noted that this data is reported by their LEAs in the aggregate and they do not have it at the student level. Five states noted that most students were not in facilities long enough to meet the 90-day post-test requirement, which means that many students cannot be captured in this data. Two states suggested that the program could mandate testing at 90 days rather than testing when students leave the program; this would capture the students who leave programs with little notice, which is common. States also noted that students in facilities do not make an effort to perform well on assessments, that there is no standardized test across all facilities, and that the pandemic affected both assessments and the ability to provide in-person learning.

States reported gathering the following additional data points: two states reported collecting graduation and drop-out data for long-term facilities, five states reported collecting state-wide assessment data from long-term facilities, and several states reported collecting benchmark assessments, summative assessments, and end-of-course assessments. One state requested that testing data only be reported for the residential facilities that generate funding. One state suggested creating an ad hoc group to develop a protocol, suggested a 45-day timeline for collecting post-test data, and suggested a focus on long-term facilities providing state assessments as required under ESEA 1111(b).

ED Response

Based on the responses, the Department is proposing a new optional data group that would give states more ways to report academic achievement data, using state assessment data in reading and math for students in programs and those who have exited (participation and proficiency) to provide more data on program participants. The Department is also proposing clarifications to the language of the existing data groups to make clear that states do not need to wait until a student exits the program to complete this assessment.

Title III

Directed Question #18: Title III Students Served: In the Fiscal Year 2020 appropriations bill, the Department received report language encouraging disaggregated data to present a more complete picture of academic achievement for this diverse population. ED is proposing to add the racial/ethnic data category to the Title III Students Served (FS116) Data Group 648 (Title III Students Served Table). ED is also proposing to change the reporting period for DG648 from “October 1 (or closest school day)” to “School Year - Any 12-month period” to match the reporting period of the other Title III data groups (849, 864, and 865).

  1. Can your state report these data groups by racial/ethnic data of the Title III students served?

  2. What impacts with reporting this data category are anticipated in your state?

  3. Will the change in reporting period for DG648 change the number of students reported or increase the reporting burden for your state?


Public Comments

Twenty-two states and one association responded to the proposed changes to data elements for “Title III Students Served”. All states but one noted that they could, or would be able to, collect and report data disaggregated by racial and ethnic categories for English learners served by an English language instruction educational program supported with Title III, Part A of ESEA. Most states anticipated minimal to no impacts from having to disaggregate this data by racial and ethnic categories. A few states anticipated moderate to significant impacts, including programming and database updates, additional staff review time, and a potential delay in the timeline for districts to submit data during the year the revision is implemented. One state asked whether the racial and ethnic category disaggregation would be a separate count from the grade level count or if it would be incorporated into the grade-level count and how multiethnic students should be reported.

For the shift in the reporting period for DG 648, from October 1 to a full school year count, most states anticipated minimal or no impacts, and one state noted this would decrease burden since it would align the reporting timeframe within FS 116. Some states anticipated moderate to significant impacts, including reduced time to create the files after the year-end data snapshot and added burden to deduplicate counts of students who have moved across schools or LEAs. One state recommended moving the data submission to the EDFacts Part II submission window in February to allow states more time to report. One state asked for clarification whether a student would be counted at any LEA in which they were enrolled during the school year and whether the state would be able to pick the reporting period date. All states addressing impacts on the number of students reported said that going from October 1 to a full year count would likely increase the count of students reported in the data group.

ED Response

The Department recognizes there may be additional burden in the first year of reporting to set up the data disaggregation and revise other reporting guides and data quality review processes, but the impacts should be limited since these data elements are already collected for program accountability purposes and federal civil rights data collection and are flagged electronically in student identifier systems. To simplify the reporting and reduce burden, the Department is proposing that states disaggregate only DG648, the unduplicated count of students, by racial and ethnic categories.

Regarding the question of whether the racial and ethnic category disaggregation would be a separate count from the grade-level count or would be incorporated into the grade-level count, the Department plans to request that information as a separate count. Regarding how multiethnic students should be reported, this is the same racial/ethnic category set used in the membership FS 052 (DG 39), which includes “Two or more races” as an option.

For the question of how to report counts for students who move between LEAs during the school year, for DG 648, states should report the data once at the SEA level and once for each LEA receiving Title III, Part A funds in which the student was served by a language instruction educational program (LIEP) during the “School Year – any 12 months” revised reporting period. The Department recognizes that a student who moved between LEAs within the reporting period may be reported by both LEAs; the SEA bears responsibility for reconciling the data to report the student only once at the SEA level. DG 837 (in FS 205) identifies a school's performance on the progress in achieving English language proficiency indicator and is at a different unit of analysis, where the reporting occurs for the school as a whole. The “partial attendance” requirements in ESEA section 1111(c)(4)(F) apply to reporting on measures used for annual meaningful differentiation, including a school's performance on the progress in achieving English language proficiency indicator.

Students

Directed Question #19: Dropouts and Graduates: For Dropouts (FS032/DG326) and Graduates/Completers (FS040 / DG306), the language in the reporting guidance and dropout definition will be updated to reflect all high school equivalencies. More specifically, every reference to “GED” will be changed to “High School Equivalency Diploma (HSED)” allowing for states to include all high school equivalencies and not just GEDs.

  1. Will this change impact your agency’s reporting of these items?

  2. If so, how will it increase or decrease your burden?



Public Comment

Twenty-one states responded to the question about changing “GED” to “High School Equivalency Diploma (HSED)”. All twenty-one states said this change would have little to no impact on their reporting. A few states noted this is already how they report, others said this would allow them to include additional high school equivalency diplomas with little impact, and others noted they do not have these types of completers.


ED Response

Based on the overwhelming response from states that the proposed change would have little to no impact, the Department is keeping this proposed change.



Directed Question #20: Free and Reduced-Price Lunch: The reporting period for the Free and Reduced-Price Lunch (FS033/DG565) will be updated to align with the reporting period for Direct Certification (DG813), changing it to “October 1 or the date that aligns with the reporting period for USDA.”

  1. Will this change impact your agency’s reporting of this item?

  2. If so, how will it increase or decrease your burden?


Public Comment

Nineteen states responded to the question on the Free and Reduced-Price Lunch reporting period. Sixteen states said the change would have little to no impact on their reporting burden. Three states noted that moving the reporting period would impact their reporting.


ED Response

Based on the overwhelming response from states that the proposed change would have little to no impact, the Department is keeping this proposed change. For the states that indicated a possibility of increased burden, the Department is not sure whether those states misunderstood the proposed change; the change is not substantive. Currently in FS 33, the wording of the reporting periods is:

  • DG 565 Free and Reduced-price Lunch: October 1 (or closest school day) – October 1 or the closest school day to October 1

  • DG 813 Direct Certification: October 1 (or USDA reporting period) – October 1 or the date that aligns with the reporting period for USDA

The Department’s proposal makes the reporting period consistent as "October 1 (or closest school day) or USDA reporting period" for both DG 565 and DG 813.

Directed Question #21: Chronic Absenteeism/Economically Disadvantaged: ED is proposing to add the economically disadvantaged data category, by sex, to Data Group 814 (Chronic absenteeism table). This added category would allow ED to determine the impact of being economically disadvantaged on students’ chronic absenteeism status.

  1. Can your state report the unduplicated number of students absent 10% or more school days during the school year by economically disadvantaged status and sex?

  2. What impacts are anticipated in your state with reporting this data group?

  3. The existing data categories used with this data group are collected by sex; would it add burden to collect economically disadvantaged status without sex? Would important data points be missed by removing sex from this data category?



Public Comment

Twenty-three states provided comments on the proposal to add the economically disadvantaged data category, by sex, to DG 814 (Chronic absenteeism table) and on the question about whether it would add burden to collect economically disadvantaged status without sex. Eighteen states reported that there would be minimal burden in reporting these counts. Five states commented that there would be some burden, but one state explained that it was mainly if the counts had to be unduplicated at the LEA and SEA levels. One state commented that there would be a burden in removing sex from the counts if the data are not required by the Civil Rights Data Collection. Two states commented that there could be an advantage to not reporting counts by sex as some states and students use categories other than male/female. One state commented that it does not collect attendance data for high school students, and another state indicated that Free and Reduced-Price Lunch waivers to serve meals to all students are rendering the economically disadvantaged category meaningless in their State.


ED Response

Based on the majority of states indicating minimal burden in adding the economically disadvantaged student subgroup to FS 195, the Department is keeping the proposed change. The Department will continue to collect chronic absenteeism data by sex at the school level given the Office for Civil Rights' use case within the Civil Rights Data Collection (CRDC). See the response to Directed Question #22 for the discussion of unduplicated reporting. Given that the CRDC information collection is concurrently going through the public comment process, it is important to note that CRDC is proposing to add data about students who are nonbinary to CRDC. EDFacts is maintaining the existing Sex (Membership) permitted values, and DG 814 will not include data about students who are nonbinary.

Directed Question #22: Chronic Absenteeism Unduplicated: ED currently collects Data Group 814 (Chronic absenteeism table) at the school level by sex and racial/ethnic, disability, 504, English learner, homeless, and the proposed economic disadvantaged. ED would like to understand if states can provide unduplicated counts of this data group and categories at the LEA and SEA levels.

  1. Can your state report the unduplicated number of students absent 10% or more school days during the school year at the LEA and SEA level?

  2. Can your state report these unduplicated LEA and SEA counts by the existing data categories and the one proposed category (economically disadvantaged)?

  3. What impacts are anticipated with reporting these unduplicated counts at the LEA and SEA level?



Public Comment

Twenty states provided comments in response to the directed question about unduplicated chronic absenteeism counts at the LEA and SEA levels. Thirteen states commented that it would require minimal burden. Seven states commented that there would be a relatively high burden. Two states pointed out the need for clearer guidance on how to count students enrolled in more than one school or district during the day when unduplicating the data at the LEA and SEA levels. One state indicated the need to know the intent of this question in order to respond to it more completely.

ED Response

States provided mixed responses about the burden associated with the proposal to report chronic absenteeism data at lower levels. Given the value and the planned use of the data, and because unduplicated counts at the LEA and SEA levels are important for accurate reporting, data quality review, and calculating multi-year trends, the Department is proposing to collect unduplicated counts of chronically absent students for the homeless enrolled and economically disadvantaged subgroups at the SEA and LEA levels for Data Group 814. The Department uses homeless student chronic absenteeism as a national program performance measure for the McKinney-Vento program, reports it by state in national reports, and publishes it at the LEA level on ED Data Express.


assessment

Directed Question #23: In order to collect data for multiple assessments the assessment files will need to be split into 12 new file specifications.

  1. Does your state currently administer more than one assessment for any subject for any grade? If yes,

    1. Can students be reported more than once in a single year in the assessment participation and the assessment achievement data when more than one assessment is used? If so, how do you handle these duplicates when you aggregate into one assessment result, as you currently do for reporting the assessment participation and achievement data?

  2. If your state does not currently administer more than one assessment for any subject for any grade, is your state planning to expand the assessments? If yes,

    1. Can students be reported more than once in a single year in the assessment participation and the assessment achievement data when more than one assessment is used? If so, how do you handle these duplicates when you aggregate into one assessment result, as you currently do for reporting the assessment participation and achievement data?

  3. What impacts from the proposed change are anticipated in your state?


Public Comments

Twenty states and one association provided comments about the proposed changes to the assessment files. Six states indicated general support for the changes or otherwise noted that the impact of these changes would be low in terms of updating their systems to meet the revised requirements. Two states noted general support for the changes in the EDFacts data collection package regarding multiple assessments. The one association supported the Department’s proposal, noting it would better represent which assessments are being administered.

Eleven states indicated that these changes would result in large impacts and additional burden and were not supportive of the proposed changes. One state noted that they only offer one general assessment and one alternate assessment in high school. One state noted that it does offer multiple pathways in the mathematics subject area, but it only offers one test to a student in a given school year. A general concern was that states lacked sufficient staff resources to perform the additional file creation and validation tasks that would be necessitated by these changes.

One state questioned why the Department was making these changes, since the statute requires that only one assessment is required at the high school level. This same state also asked if these proposed changes would mean that they would now have to report results of all advanced testing that occurs in high school. Two states raised questions about why grade level specifications for high school science were different than those for reading/language arts and mathematics.

ED Response

The Department would like to clarify a few reporting points. For states that offer only one end-of-year assessment in high school, the proposed changes in this package would have minimal impact on their current efforts in supporting EDFacts reporting. To ease some of the burden for states with only one assessment, the Department will maintain the structure of the assessment files within the different grade levels. For example, for mathematics academic achievement, the grade level would continue to be the 7th field in the file, and assessment administered or assessment type would continue to be the 15th field in the file. This means that for states that do not have additional assessment types, the only change would be splitting the files into lower grades and high school. The Department believes that the split of grade levels has long-term benefits for the data quality review.


If a state uses multiple end-of-course tests to meet the high school testing requirement for ESEA, then it should report those tests in a way that reflects the administration of those tests for a given school year. Doing so ensures that the state can demonstrate that all students appropriately participate in required high school assessments. The proposed changes are designed to fully capture the test taking patterns in states that offer such pathways.


The Department also would like to clarify that only test results used to fulfill ESEA assessment requirements are required to be reported through EDFacts. Other state assessments that are not used to meet ESEA assessment requirements should not be reported.


As for the grades by subject, per 34 CFR § 200.5(a)(ii), states must administer the same statewide assessments in science at not less than one time during each of grades 3 through 5; grades 6 through 9; and grades 10 through 12. 34 CFR § 200.5(a)(i) specifies that states must administer the same statewide assessments in reading/language arts and mathematics in each of grades 3 through 8; and at least once in grades 9 through 12. These proposed changes reflect these requirements of the statute and the regulations.


The Department recognizes that these changes will initially create some additional burden on states, and carefully considered that impact when developing the proposed changes to data collection requirements for assessments. The Department would also reiterate that these changes are designed to gather detail to describe four patterns of test administration allowed under the statute and regulations: 1) advanced tests allowed under the 8th grade math exception in 34 CFR § 200.5(b); 2) locally selected, nationally recognized high school tests allowed under 34 CFR § 200.3; 3) schools participating in Innovative Assessment Demonstration Authority (IADA) pilot tests authorized in 34 CFR §§ 200.104-109, and 4) states that administer multiple end-of-course tests to fulfill ESEA high school testing requirements. Based on Departmental records, about 12 states utilize the 8th grade math exception; about four states utilize locally selected nationally recognized high school tests; five states currently have the IADA pilot authority; and about 20 states administer end-of-course tests to fulfill ESEA high school testing requirements. The Department believes that states not utilizing any of these four flexibilities should not experience significant additional burden in reporting their high school test results to EDFacts.



Directed Question #24: ED is also proposing to revise the permitted values regarding assessment types to support States reporting assessment data on the Innovative Assessment Demonstration Authority, Advanced Assessments, and locally-selected nationally recognized high school assessments.

  1. Can the State report the counts of students disaggregated by certain assessment types now while the State needs more time to be able to report the counts of students disaggregated by other assessment types? If so, please explain which assessment types your State needs more time to report?

  2. What impact from reporting the count of students disaggregated by the new proposed assessment types (if applicable) are anticipated in your state?


Public Comments

Seventeen states provided comments on the directed question about assessment types. Twelve states reported that they do not have more than one assessment type, or that they do but could report these counts with minimal effort and no additional time, or that the impact of the proposed changes would not be significant.

Six states indicated that they could report the data disaggregated by current assessment types but that doing so for the proposed assessment types would create significant additional burden for them. Five states provided comments that described how the addition of increased detail for high school assessment results would create significant additional burden. Several of these states questioned the need for these proposed changes. Two states requested guidance on how to develop assessment data codes for multiple high school courses when their state only offers one end-of-grade test for each required subject in high school. One state suggested an alternate approach to the proposed changes to assessment types that would essentially maintain the current reporting structures for high school assessments in states that do not offer any of the four flexibilities described earlier.

ED Response

The Department again recognizes that while these changes could initially create some additional burden, the overall benefit to public transparency in documenting the patterns of test administration of ESEA required assessments will be worth that combined effort. As noted elsewhere in this document, the Department carefully considered that impact when developing the proposed changes to data collection requirements for assessments. The Department would further clarify that these changes are designed to gather detail to describe four patterns of test administration allowed under the ESEA statute and regulations and only test results used to fulfill ESEA assessment requirements are required to be reported through EDFacts.


As noted in the response above, to ease some of the burden for states with only one assessment, the Department will maintain the structure of the assessment files within the different grade levels. For example, for mathematics academic achievement, the grade level would continue to be the 7th field in the file, and assessment administered or assessment type would continue to be the 15th field in the file. This means that for states that do not have additional assessment types, the only change would be splitting the files into lower grades and high school. The Department believes that the split of grade levels has long-term benefits for the data quality review.


The Department also provided more clarification in this package for the data category and permitted value names and presentation.



Directed Question #25: States will continue to report on the assessment data on children with disabilities (IDEA) under the Disability Status (IDEA) category set C in the above data groups. In addition, new file specifications and data groups will be added to collect further disaggregated data on the subpopulation children with disabilities (IDEA). The counts in these files will be expected to equal the aggregated counts of children with disabilities (IDEA), as reported in category set C in the files containing all students. States would report the assessment data on children with disabilities (IDEA) disaggregated by Major Racial and Ethnic Groups and by Disability Category.

    1. Does your state collect the data needed to report the assessment data on children with disabilities by major racial/ethnic groups?

      1. If your state does not collect the needed information at this time, how long would states need to be able to report the assessment data on children with disabilities by this disaggregation to ED?

      2. What impacts are anticipated with reporting these data to ED?

    2. Does your state collect the data needed to report the assessment data on children with disabilities by disability category?

      1. If your state does not collect the needed information at this time, how long would states need to be able to report the assessment data on children with disabilities by this disaggregation to ED?

      2. What impacts are anticipated with reporting these data to ED?



Public Comments

Twenty-three states and one association provided input on the proposed change to collect the assessment participation and performance data on children with disabilities (IDEA) disaggregated by major racial and ethnic groups and disaggregated by disability category. The majority of states indicated they have some or all of the data needed to report but expressed concerns with the proposed change. Several other commenters indicated the proposed change would have little or no impact on their ability to collect and report the data.

Several commenters provided information on the challenges and burden of capturing disability category data at the time of the assessment testing window. A few commenters expressed concern that, given the gap between the timing of the State’s child count data and the assessment administration, reporting data by race/ethnicity may not be valid. Several commenters noted that the proposed changes would result in a large increase in reporting burden. Two commenters expressed concern that this change would greatly increase the size of the assessment data files. Two commenters noted that the proposed changes would result in a need to report many zero counts in the data submission. Several commenters expressed concern about the very small cell sizes at the LEA level. One state asked whether the Department is also proposing to have these data reported at the school level. A few commenters indicated the change would require linking data, data systems, or data platforms that are not currently linked. A few commenters stated that they would need additional time to implement this change. Additionally, commenters expressed concern that using the Fall child count data for this purpose would not provide valid data.

Several commenters asked the Department to explain how the data will be used at the Federal level.

ED Response

The Department maintains the proposal to collect the assessment participation and performance data on children with disabilities (IDEA) disaggregated by major racial and ethnic groups. In response to the comments received, the Department is withdrawing the proposal to collect the participation and performance assessment data on children with disabilities (IDEA) disaggregated by disability category.

States are currently required to collect and report assessment data for all students disaggregated by major racial and ethnic groups as part of the assessment administration process. The Department expects that many states will use the same process of collecting/reporting major racial and ethnic group data for all students in order to report the assessment participation and performance data on children with disabilities (IDEA) disaggregated by major racial and ethnic groups. Additionally, states select which major racial and ethnic groups they will use for reporting the assessment data. These reporting categories or permitted values may differ from those used to report race/ethnicity in the Part B child count data.

The Department agrees that reporting participation and performance assessment data on children with disabilities (IDEA) disaggregated by disability category would result in a large increase in reporting burden. However, the Department does not believe reporting participation and achievement assessment data on children with disabilities disaggregated by major racial and ethnic groups will add a significant amount of burden, considering states already collect this as part of the assessment administration process. States are currently required to collect and report assessment data for all students disaggregated by major racial and ethnic groups in FS 175, 178, 185, and 188.

The Department agrees with commenters that SEAs and LEAs have found it helpful to identify subgroups of students within common categories, such as race/ethnicity, in order to improve education outcomes and offer targeted support. The analysis of subgroups can lead to improved programs and policies. Some states already disaggregate assessment data by race/ethnicity to target support toward specific categories of students to improve performance. Collecting these data disaggregated by race/ethnicity also addresses the Executive Order on Advancing Racial Equity and Support for Underserved Communities Through the Federal Government. The Department has received requests for this disaggregation of the assessment data on children with disabilities from internal and external stakeholders in order to understand which children with disabilities are participating in different types of assessments as well as to understand whether differences in performance are evident across different subgroups of children with disabilities.

It is important for OSEP to be able to distinguish between zero counts, counts that are missing, and counts that are not applicable in the IDEA Assessment data submissions. Zero counts are only required for all valid combinations at the SEA level. All combinations that are valid for the state that are not included in the file are assumed to be zeros at the LEA level. The Department believes that zero counts are not overly burdensome at the SEA level.


Directed Question #26: Assessment Metadata: To support the revised assessment collection, a new metadata collection is being proposed, see Attachment C EMAPS Collections for the list of questions.

  1. Will your state have any issues responding to these assessment metadata questions?

  2. Are there questions that are missing but needed for ED to understand your state’s assessment data?


Public Comments

Twenty states provided comments to the directed question regarding the assessment metadata collection. Sixteen states indicated they would not have an issue providing the information requested through EMAPS as a result of these proposed changes. Four states raised the same concerns about providing this information that they had raised in their comments on Directed Questions #23 and #24.

ED Response

Based on the responses indicating that these metadata questions support this data collection, the Department is keeping this proposed change.



staff

Directed Question #27: Staff FTE/Ungraded Teachers: During recent data reviews, several questions have come up regarding reporting of ungraded teachers. The definition of ungraded teachers in the FS059 reporting guidance is “Teachers of a group or class that is not organized on the basis of grade grouping and has no standard grade designation” and in EMAPS is “Are any teachers identified as teachers for ungraded classes?” ED is not currently suggesting a change to the definition used in FS059 or in EMAPS but would like to better understand how SEAs report data in this permitted value.

  1. How do States report teachers (including special education teachers) who provide instruction to students in more than one grade – do SEAs assign them to a specific grade or are they reported as ungraded?

  2. What impacts are anticipated with the ungraded teachers permitted value?



Public Comment

Twenty states provided feedback to the Department regarding their ungraded teacher counts. Approaches to assigning teachers who provide instruction to students in more than one grade included: splitting the teacher’s FTE by the percentage of students in the different grades in the class; assigning them to ungraded; assigning them to a grade if more than 50 percent of students are in one grade; and using the teacher’s assignment codes. States also described how special education teachers are categorized by grade.


ED Response

The Department appreciates this feedback and will use this as a reference document when questions are raised by users about how states are reporting ungraded teachers.



Directed Question #28: Staff FTE/Health Professionals: COVID-19 brought a renewed focus on student access to qualified health professionals, specifically school nurses. The information reported in EDFacts is variable over the years for health professionals. Beginning with the 2019-20 school year, school psychologists are now being reported as a separate permitted value. Currently health professionals like Nurses and other Health Specialists are included in the Student Support Services Staff permitted value.

ED is interested in understanding if SEAs already collect Health Specialists separate from Student Support Services Staff (Staff FTE DG528). Common Education Data Standards defines Health Specialists as “Professional staff members or supervisors assigned specific duties related to providing any health services that are not specific to mental health.”

  1. Does your state data collection differentiate Health Specialists separately from other Student Support Staff?

  2. If so, how does your state define Health Specialists?

  3. If not, what would be the burden to differentiate Health Specialists from other student support staff?

  4. Are nurses the main component of your Health Specialists?

  5. Does your state data collection collect Nurses separately from Health Specialists AND other Student Support Staff?

  6. If so, how does your state define Nurses?

  7. Are there other issues with these health staff that should be considered?



Public Comment

Twenty-two states responded to the question regarding health professionals. Nine states noted that they currently collect and can differentiate Health Specialists from other Student Support Staff. Some states collect a version of health professionals or cannot separate it from student support staff. A few states can also report Nurses separately. Most states that responded have a more complicated categorization of their student support staff.


ED Response

While some states would be able to report Health Specialists and/or Nurses separately from other Student Support Staff, most other states could not. Therefore, the Department will not be proposing the separate collection of Health Specialists or Nurses from Other Student Support Staff.



retired data groups

Directed Question #29: DG 24 Magnet Status (FS129 CCD School): A detailed review of the SY2017-18 CCD and CRDC data files shows that the magnet data collected in the CRDC is more reliable than the data collected for CCD. The proposal is for CRDC to continue to collect and release Magnet School Status every two years (or the frequency of the CRDC) and to retire DG24 Magnet Status from EDFacts/CCD.

  1. What use cases of CCD are impacted if Magnet Status is not released in the CCD public data files?


Public Comment

Sixteen states responded to the question about retiring the magnet school status from the CCD files. Fifteen states noted that retiring this data group would have a positive impact, or no impact, on their reporting. One state asked that the magnet school link between the CCD and CRDC be available in public data, and another state noted that the CRDC is only released once every two years, which would make it more difficult to find magnet schools.


ED Response

Based on the majority of responses that the proposed change would have no impact, the Department is keeping this proposed change. CRDC has a public data use tool where users can look up individual school profiles, including magnet status. Once CCD discontinues collecting the magnet data, CCD will consider adding a link for the magnet variable in "school search" to direct users to the CRDC tool.



Directed Question #30: DG 458 Chief State School Officer (FS029 Directory): ED is proposing to retire this data group as it is not currently used.

  1. What information for ED and the public is lost if this data group is retired?


Public Comment

Fourteen states responded to the question of retiring the Chief State School Officer data group. All of the states said that no information would be lost by retiring this data group, and they supported the change.


ED Response

Based on the responses that the proposed change would have no impact, the Department is keeping this proposed change.



Directed Question #31: DG 699 State poverty designation (FS103 Poverty Quartile): ED is proposing to retire this data group and use data from FS203 and LEA SAIPE poverty data obtained from the Census.

  1. What information for ED and the public is lost if this data group is retired?



Public Comment

Fifteen states responded to the question on the state poverty designation. Seven states indicated that nothing would be lost by retiring file FS 103, and one state indicated that poverty percentages are more useful than poverty quartiles.


Four states expressed concerns about how the Department’s use of SAIPE data instead of data from FS 103 will affect public reporting of teacher data by poverty quartile. These states were concerned that quartile data reported by the Department will no longer match similar data reported by states in their report cards if the Department switches to reporting quartiles using SAIPE data rather than FRPL data. Two states raised concerns about whether the Department’s decision to use SAIPE data instead of FRPL data to sort schools into poverty quartiles would have implications for how SEAs award funds to school districts. Two states asked whether SAIPE poverty data will be made available for review.


ED Response

Based on the responses, the Department is keeping this proposed change. Since the ESEA was most recently reauthorized in 2015, this data group has been used only to calculate national program performance measures for the Title II, Part A program and to report poverty quartiles for teacher data in CSPR reports. Because of data quality issues with data that states have submitted in FS 103, and difficulties states have had in matching school-level data in FS 103 with school-level data about teachers in FS 203, these data have not been useful for Title II, Part A national program performance purposes. As a result, the Title II, Part A program office proposed a new performance measure that makes use of SAIPE poverty data instead of the school-level poverty quartile data provided in FS 103, and OMB approved the new national performance measures in 2020.


In regard to the Department’s reporting on teacher data by poverty quartile, while sections 1111(h)(1)(C)(ix) and 1111(h)(2)(C) of the ESEA require states to report teacher information disaggregated by high and low poverty in their state and local report cards, section 1111(h)(5)(D) does not require states to submit teacher data disaggregated by poverty in the annual report to the Secretary (CSPR). As a result, when FS 103 is retired, the Department will no longer report teacher data disaggregated by poverty quartiles in its CSPR report, so there will be no mismatch in quartile reporting between what states include in their report cards and what the Department reports from the CSPR. The SAIPE poverty data will be used for national performance measures for the Title II, Part A program, not for public CSPR reports. If states wish to do so, they may continue to use FRPL data in reporting teacher data by poverty in their report cards. Similarly, two states indicated that they use school-level poverty data sorted by quartiles, as reported in FS 103 for other purposes; these states could continue to use these data for these other purposes, even after FS 103 is retired.


Retiring FS 103 has no implications for how the Department makes awards to states or for how SEAs make subaward allocations to LEAs as the change does not affect any allocation formulas in Federal statute. The Department further notes that for at least some ESEA programs (Title I, Part A and Title II, Part A, for example), the ESEA requires states to use SAIPE population data rather than enrollment or FRPL data in making suballocations to LEAs.


The Census Bureau makes district-level SAIPE data, including poverty data, publicly available on its web site and updates the data annually: https://www.census.gov/programs-surveys/saipe.html. One state further asked if LEAs will have an opportunity to review the SAIPE poverty data in order to amend or dispute it. LEAs already have an opportunity to dispute SAIPE estimates: each year, once updated Census data are publicly available (usually in December), the Department sends the data to states and provides information about how to contact the Census Bureau if the SEA or LEAs have questions or concerns about the data. The Department also provides information about the deadline by which challenges to the data must be made. The Department encourages states to share this information with their LEAs.


technical corrections

Directed Question #32: Migratory Files: Change “students” to “children” to match legislative language in all file names; data group names and definitions; data category names and definitions; permitted values; and file specification language.

  1. Are there other issues with this change that should be considered?


Public Comments

Fourteen states provided comments to the directed question to change “students” to “children” in all the migrant education program files. Thirteen states reported there would be no issues with the Department’s proposal. One state noted that the change may cause some confusion for those migrant education programs that serve only students aged 18-21.

ED Response

Based on the overwhelming response that this change in language would not have an impact, the Department is keeping this proposed change.



Directed Question #33: LEA Subgrant Status (FS170): ED proposes to drop the No permitted value so that only those LEAs that received a McKinney-Vento subgrant would be reported. This would allow ED to only collect those needed LEAs and would cut down on needless data quality checks on this item.

  1. Does your state agree that this change would reduce time spent on needless data quality checks?

  2. Are there other issues with this change that should be considered?



Public Comments

Seventeen states provided comments to the directed question to drop the “No” permitted value and only report LEAs that received a McKinney-Vento subgrant. All of the states agreed this change would reduce time spent on needless data quality checks or they have no issues with the Department’s proposal.

ED Response

Based on the overwhelming response that this change would reduce data quality burden, the Department is keeping this proposed change.



Directed Question #34: Consolidated MEP funds status (FS165): ED proposes to drop the Not Applicable permitted value so that only those schools that have a schoolwide program and/or receive federal migrant education funds under ESEA Title I, Part C would be reported. This would allow ED to only collect those needed schools and would cut down on unnecessary data quality checks on this item.

  1. Does your state agree that this change would reduce time spent on unnecessary data quality checks?

  2. Are there other issues with this change that should be considered?


Public Comments

Sixteen states provided comments to the directed question on MEP fund status. All sixteen states supported the Department’s proposal to only collect the schools that have a schoolwide program and/or receive federal migrant education funds under ESEA Title I, Part C. They agreed that the proposal would reduce time spent on unnecessary data quality checks and no other issues were raised with the proposed change.

ED Response

Based on the overwhelming response that this change reduces the time spent on unnecessary data quality checks, the Department is keeping this proposed change.



Directed Question #35: Adjusted Cohort Graduation Rate: ED is proposing to change the education unit totals from “operational schools with a 12th grade” to “LEAs/Schools with a 12th grade that have at least one student in the cohort.” This will match the data states are already reporting and better align with program implementation in states.

  1. Will this change impact your state’s reporting of this item?

  2. If so, how will it increase or decrease your burden?



Public Comments

Fifteen states provided comments to the directed question on the education unit totals for ACGR to include LEAs/Schools with a grade 12 that have at least one student in the cohort. All fifteen states supported the Department’s proposal to change the education unit total definition. Several states also commented that the proposed change would have no or minimal impact on the burden of reporting.

ED Response

Based on the overwhelming response that this change supports the way states have been reporting, the Department is keeping this proposed change.

sex permitted values

Directed Question #36: The last EDFacts information collection package included directed questions about changing the definition of Sex/Gender to remove references to biological traits (note that sex/gender were terms used synonymously in historical ED documents). ED also requested information about how states collect and report sex and gender. Based on responses to those directed questions, ED did not consider additional changes. In preparing this package, ED data stewards considered reporting requirements in existing statutes, data privacy implications, and data release plans. ED is not proposing a change to the reporting of sex permitted values in data files in this package.

ED is proposing new metadata questions to ask states to annually report the permitted values the state uses. This will provide context about discrepancies between subtotals by Sex and totals of all students and inform future data collection proposals.

  1. Would the proposed metadata questions achieve the goal of providing context about discrepancies between subtotals by Sex and totals of all students?

  2. Are there other ways of providing context about these discrepancies?

  3. If you are in a state, do you have a preferred method for responding to the questions (e.g., in the State Submission Plan)?


The following metadata questions were proposed in Attachment C, EMAPS Collections. The questions are presented here because they are related to the Directed Questions above and referenced in the response below:

The following may be added to the existing survey or asked separately (but on the SSP timeline).

  1. For sex, does your state collect more than two permitted values (male and female)? If so, what are the other values?

  2. If your state collects more permitted values, does your state publicly report out data on all permitted values?

  3. If you collect additional permitted values but do not publicly report, why?


Public Comment

Twenty-one (21) states and one organization submitted comments to this directed question. None of the 21 respondents objected to the proposal to maintain existing permitted values and add metadata questions.

Eleven states agreed that proposed metadata would help explain the discrepancies (when male plus female does not equal the total number of students). Three states specifically noted that their state would not be impacted because they would not have a discrepancy. The organization also agreed that the metadata would help explain discrepancies. Two states agreed with the first part of the proposed metadata question but stated the other parts were not true metadata and were not necessary for the Department to collect.

Seven states and the organization recommended adding an option for “Other.” Specifically, these states noted, “While it may be difficult for ED to create an all-encompassing set of permitted values for Sex, and it makes sense to use a metadata collection to gather information from states, it would be helpful to have an “other” category in addition to Male/Female for actual reporting purposes so that states do not have to explain differences between subtotals by Sex and totals of all students. If the metadata information later shows some level of consensus on the categories that comprise “other,” the reporting categories could be expanded at a later date.”

Six states recommended adopting the proposal to use the State Submission Plan (SSP, an existing metadata survey) to collect the metadata. One state opposed the use of the SSP and did not offer an alternative. One state recommended a new survey and to expand the concept to other data groups with discrepancies. Two states had no preference.

States also noted that the responses to the metadata questions should be used to turn off data quality errors, responses should be applied across data files, and that the information should be collected only once per year.

Twelve states provided information about current state practices. Two states reported collecting additional permitted values beyond male and female. Four states specifically mentioned reporting “gender” using only male and female. Five states reported using only male and female without mentioning gender or sex. Some states noted that state policy limits what is collected (e.g., binary-only; “need to know” information).


ED Response

As a result of the public comments received, the Department

  • revised the proposed metadata questions so that only questions useful for interpreting data are included in the 30-day proposed package.

  • is clarifying that the purpose of the metadata responses includes decreasing the need for data quality follow-up. The Department will use the responses to automate data quality information across files so that the state is reporting metadata once and the Department is applying the responses across files that collect sex information. This should improve confidence in data quality, provide context for data users, and decrease reporting burden for states with observed data discrepancies.

  • is clarifying that another purpose of the metadata responses is to consider meaningful changes to the collection of this data in the future.

  • will maintain the plan to collect metadata via the SSP. The Department will make the final decision about the collection tool and timing based on cost and efficiency for the Department and states.



The Department is declining to add the permitted value of “Other” to the category Sex (Membership) because the burden exceeds the utility. The commenters noted that the recommendation to add Other was to minimize data quality feedback; that issue is already addressed by the use of the metadata responses. Using a general category of “Other” is less specific than the responses the Department will receive by asking the metadata question about values used by each state and adding “Other” would have limited analytic value. As the commenters noted, adding “Other” would be a temporary solution until the Department has more information from the proposed metadata responses.

The cost of making a temporary change, with a plan to make another change when more information is gathered, is significant for the Department and states. If the Department makes changes in permitted values, the Department must be able to justify the cost and burden, per the Paperwork Reduction Act. The Department justifies cost and burden based on statutory or regulatory requirement changes (not met here), based on grant program monitoring needs (not met here), and/or on a planned policy use of data (not met here). For the Department to change existing permitted values, data stewards must be able to clearly define those permitted values, must be prepared to answer clarifying questions from states about definition anomalies and practices that vary across states, and must provide information clear enough to support an expectation of high quality data reporting by states. To meet requirements of the Evidence Act, data stewards should have data quality checks that can identify low quality data and should be able to describe how data can and will be used. None of those requirements or conditions are met.

At this time, the Department has no confidence that permitted value changes would lead to meaningful and usable data. The Department does have confidence that gathering information via metadata questions will add value and knowledge to the bigger discussion of outcomes for students that are not reported (or not accurately reported) in existing sex permitted values.

The Department is committed to improving knowledge and awareness about equity issues and is committed to making meaningful changes to support equity for students. The Department is concerned if students cannot be accurately represented in the data. To improve data, the Department must have better information to make meaningful changes.

The Department would also like to acknowledge a challenge in the collection of data using the category “Sex (Membership)” that has been in place across multiple EDFacts data groups for over 15 years. Historically, the terms “sex” and “gender” were used interchangeably and inconsistently by the Department across programs and data collections. There are known disconnects between statutory language using “gender” and the variable name which references “sex.” Some collections use “gender identity.” The Department is aware of the complexities behind these issues and is engaged in conversations about impacts. The Department is gathering information across many collections, not just EDFacts, and anticipates additional clarity in the near future about use of these and other related terms and definitions across education data collections.



EDFACTS DATA SYSTEM AND PROCESSES

Directed Question #37: The Department has several directed questions to gather comments and input from states as ED considers modernization.

  1. Do you agree with ED’s assessment of the current issues? If not, what improvements would most effectively contribute to improving data quality so that data are usable at the due date?

  2. Do you agree with ED’s priorities for modernizing the collection system? If not, what would you change?

  3. If ED provided a CEDS-based tool that allowed states to run pre-submission business rules to review both unit- and aggregate-level data quality, could your state use that tool? What would be the barriers to using this tool in your state?

  4. To what extent is CEDS currently used in your state (e.g., using CEDS elements, using CEDS tools, implementing a CEDS solution for state use and/or federal reporting)?


Public Comment and ED Responses

Twenty-two states and one organization provided a response to the questions on modernization. The organization agreed with the modernization goal because of the size and value of the data collected. The organization recommended that applicable process improvements be expanded to other data collections. The comments from the twenty-two states are discussed below.

Defining Problems Modernization Must Address. Over half the states that responded agreed with the problems defined by the Department. The states that did not agree indicated that the problem should be directed to the federal level and not the state level. One state specifically mentioned that the volume of business rules and inconsistent guidance and instructions in the file specifications were problems that the Department needs to address.

The Department acknowledges that changes are needed at the federal level and specifically acknowledges that success depends on having effective business rules and accurate guidance and instructions in the file specifications. The consensus from the comments is that modernization is needed.

Pre-submission Data Quality and the Common Education Data Standards (CEDS). States’ concerns with pre-submission data quality centered on states being held accountable for business rules that were not applicable or did not account for school and local education agency (LEA) changes. Some states pointed out that the results of the business rules often require explanation rather than corrections to the data. Other states mentioned that some business rules use data groups that are due at different times and inquired how the data quality checks would work in those cases.

Two states reminded the Department that data quality depends on LEAs which may have limited resources. In at least one state, the state is unable to change LEA data after the state certification date.

The Department acknowledges that for pre-submission data quality to be effective the business rules must be valid. The Department is reviewing the business rules to ensure that those used for pre-submission data quality are the rules required to assess data quality. The Department will provide the tool for states to assess data quality before submission. The tool will include functionality to submit explanations when business rules fail.

Using CEDS and Interest in CEDS Tool. Twenty states responded to either or both parts of the questions about CEDS, and twelve expressed interest in the tool. Two states indicated that CEDS is not used for federal reporting. The rest of the states ranged from CEDS compliant to various levels of progress in implementing CEDS to just exploring CEDS. Based on this information, the Department intends to pursue the CEDS tool.

States mentioned that due dates will need to be changed. States also mentioned the need to have the due dates, instructions, and ability to submit files earlier. The Department will be assessing the due dates for SY 2022-23 and will be publishing the due dates shortly after the approval of the package.

other public comments

Public Comment – Migrant Students Eligible and Served

Guidance on verifying when children turn 3 differs between FS121 and FS122. States should be able to use enrollment date, withdrawal date, residency date, or residency verification to verify residency (not just enrollment).

ED Response

This comment is related to FS121 and FS122 and the alignment between the two. The Department agrees with the comment and has corrected the reporting guidance in the file specification starting with SY 2021-22.


Public Comment – Migrant Files

Two comments were submitted about the requirement to provide explanatory comments for Migrant files during reporting. This included asking the Department to remove the comment requirement when counts over time are consistent.

ED Response

The Department will take these comments under consideration.


Public Comment – Continuation (Only)

Data Group 102: This revision could have a big impact on how we track and report students who are identified under Continuation of Services (COS). This change is more than a “Technical Correction”. Clarification is required from OME concerning COS and provision of services. Does this change require states to include services provided to students after they have ended their eligibility? There is confusion under the “Do not include” column. The title indicates not to include those students who receive a service until the end of term under 1304(e)(1); however, guidance provided below the title indicates to include them in the column with Category Set C. This guidance is contradictory.

ED Response

After reviewing state comments, the Department has revised the DG102 definition for the 30-Day package to eliminate confusion. The Department is also revising the data category title to “Continuation of Services”. OME plans to collect the count of children served under the Continuation of Services (COS) authority in section 1304(e)(1-3). This change requires states to include the count of all children served under the COS authority. The Department believes that including children who continue to be served until the end of the term in which their eligibility expired provides more comprehensive data on all formerly migratory children who continue to receive MEP funded services.


Public Comment – MEP Services

In the definition of “Support Services,” the Department removed a sentence regarding the one-time act of providing instructional or informational packets. One state asked if this constitutes a change in policy.


ED Response

The Department agrees that the one-time act of providing instructional or informational packets to a child or family does not constitute a support service. There has been no change to policy on this issue. It was removed from FS 145 to align with the language in Non-Regulatory Guidance (NRG).


Public Comment – Closure/Takeover

One individual asked questions about how EDFacts data is used to help track closures and takeovers and recommended new data items.

ED Response

The Department does keep the NCES school identifier (DG 529) when a school switches LEAs. This blog, Accessing the Common Core of Data (CCD), provides more information. There is a variable, DG 743 (Reconstituted Status), in FS 029 that collects information on whether the school was restructured, transformed, or otherwise changed as a consequence of the state’s accountability system under ESEA or as a result of School Improvement Grants (SIG) but is NOT recognized as a new school for CCD purposes. The variable is included in the CCD school directory file. The Department does not have any further mandate to collect more information on school closures or takeovers, and the proposed variables would have a burden that cannot be justified at this time.

Public Comment – Ungraded

One state noted that the Department could improve how it collects and uses information from states about Ungraded. The state recommended that the Department collect this information once in the State Submission Plan and use the response across files, rather than repeat the information for each relevant file.

ED Response

The Department agrees with the comment. The Department will look for a solution to ensure a state can indicate one time if they use optional permitted values, like ungraded and grade 13.

Public Comment – Timing of Package

Two states noted that final changes to the data collection need to be released sooner. One state noted that they start data collections 6 months before the start of the school year (Directory) and that technical teams within the state need changes at least 9 months before the start of a school year to establish technical requirements and make system changes. The state noted they do this work with both state and contract staff. Both states suggested that since changes will not be final until almost the start of SY 2022-23, ED should consider delaying the collection of new or changed data for one year.

ED Response

The Department agrees that data collection changes have not been released in a timely enough manner and is continuously working to improve timeliness. The EDFacts team will speak with data stewards about the possibility of delaying new data collections until SY 2023-24.


