National Evaluation of the 2019 Comprehensive Centers Program Grantees

OMB: 1850-0970




Part A: Supporting Statement for Paperwork Reduction Act Submission


Implementation Evaluation of the Comprehensive Centers




January 2022



Prepared for:

Andrew Abrams

U.S. Department of Education

Institute of Education Sciences

550 12th Street, SW

Washington, DC 20202

Contract No. 91990019C0077



Submitted by:

Abt Associates Inc.

10 Fawcett Street

Cambridge, MA 02138



Part A: Supporting Statement for Paperwork Reduction Act Submission



A. Justification

Introduction

The U.S. Department of Education (the Department)’s Institute of Education Sciences (IES) requests clearance for new data collection activities to support a national evaluation of the Comprehensive Centers (CCs) program. Specifically, this request covers surveys of state and local education agency staff who receive CC technical assistance (TA), and interviews with the directors of the CCs and of the Regional Educational Laboratories (RELs), another federal technical assistance network with which the CCs are intended to collaborate.

The CC program, funded by the Department at over $50 million per year, provides training, tools, and other supports to help state education agencies and local education agencies carry out their education plans and take steps to close achievement gaps. The CCs’ services aim to build individual and organizational capacity to help identify and solve key problems. This evaluation will examine the delivery and usefulness of the Centers’ technical assistance, given potential new stakeholder needs and changes in the Center program that took effect with the 20 new grants awarded in 2019.

A.1 Circumstances Making the Collection of Information Necessary

The Educational Technical Assistance Act (ETAA) of 2002 mandates that results of a national evaluation of the CC program be reported to Congress.1 This data collection on the CCs is critical because it will help the Department identify program improvements for the next cohort of CC grantees, and it may have implications for the evaluation of federal technical assistance efforts more broadly. Results of this evaluation are intended to:

(a) help better align CC services to state education agencies’ needs,

(b) identify how CC services can be more effectively delivered,

(c) identify how CCs can better support the development of state and local education agencies’ capacity, and

(d) provide lessons learned from the new structure of the CCs introduced in the 2019 cycle.

A.2 Purposes and Use of the Information Collection

IES has contracted with Abt Associates and its partners AnLar, Educational Testing Service (ETS), and the Center for Research in Education & Social Policy (CRESP) at the University of Delaware to conduct this evaluation. The evaluation will collect data in order to address the research questions and sub-questions presented in Exhibit A1.

The study will result in a report on the implementation of the CCs. The report will describe:

  • The types of projects and services undertaken by the CCs, and the extent to which the CCs addressed the needs of TA recipients;

  • The capacities that CCs aimed to address, the extent to which CCs were successful in building TA recipients’ capacity, and how capacity building services changed between the 2012 and 2019 cycles; and

  • Lessons learned about changes introduced in the 2019 cycle, which include the elimination of Content Centers, the addition of a National Center, and new requirements to collaborate in needs sensing activities with RELs.

The report will include information about the CCs from the first two years of the 2019 cycle, which include the 2019-2020 and 2020-2021 school years. The report will also include details about planned activities for the third year of the cycle, which includes the 2021-2022 school year.

Exhibit A2 lists the types of primary and administrative data to be collected. For each type of data, the exhibit lists the respondent sample, timeframe for data collection, and uses in the evaluation.



Exhibit A1: CC Evaluation Research Questions

RQ1: What types of educational problems are addressed by Comprehensive Centers, and does the Comprehensive Centers’ focus align with the priorities identified by TA recipients? To what extent did Comprehensive Centers shift the focus of their work in response to COVID-19, and how?

1.1 What are the most common educational problems that Comprehensive Centers’ projects focus on, and to what extent do they vary across Centers?

1.2 To what extent do the problems that Comprehensive Centers focus on align with the priorities of TA recipients?

1.3 Which TA recipient priorities are not addressed by Comprehensive Center projects, and why?

1.4 To what extent did Comprehensive Centers shift the focus of their work in response to COVID-19? What are the most common COVID-19-related issues that Comprehensive Centers addressed?

1.5 To what extent were the services and activities provided by Comprehensive Centers responsive to TA recipients’ COVID-19 needs?

RQ2: What services are Comprehensive Centers providing to TA recipients, and to what extent do the services vary in type and intensity?

2.1 What are the most common services that Comprehensive Centers provide to TA recipients?

2.2 To what extent is there variation in the types and intensity of services provided by Comprehensive Centers?

RQ3: What dimensions of capacity are Comprehensive Center projects designed to increase, and to what extent do TA recipients report that the Centers improve their capacities? To what extent do the Comprehensive Centers build sustainable capacity?

3.1 What dimensions of capacity (human, organizational, policy, and resource) are Comprehensive Center projects designed to increase?

3.2 To what extent do TA recipients report that Comprehensive Centers improve their capacity and in what dimensions?

3.3 To what extent do TA recipients agree that Comprehensive Center support builds sustainable capacity?

RQ4: What are the perceived successes and challenges of changes to the Comprehensive Center program, including the shift in the number and geographic reach of the Regional Centers and expectations to increase collaboration with the Department's Regional Educational Laboratories?

4.1 What are the perceived benefits and challenges of changes to the structure of the Comprehensive Center program?

4.2 To what extent, and in what ways, do Regional Comprehensive Centers collaborate with each other and with the National Comprehensive Center?

4.3 To what extent, and in what ways, do Comprehensive Centers collaborate with the RELs? In what areas do Comprehensive Center and REL activities overlap with each other, if at all?

4.4 To what extent do Comprehensive Centers refer TA recipients to other Department-funded TA Centers?

Exhibit A2: Data Sources, Respondent Sample, and Timeframe, including Overview of Content and Use(s) in Evaluation

Primary Data

Data Source: TA Recipient Survey
Respondent Sample: Key contact (TA recipient) for every active Year 2 CC project (233)
Mode and Timing of Data Collection: Web survey, May 2022
Use(s) in Evaluation:
  • Identify educational problems, such as instructional issues or inequity, that CCs address (RQ 1.1)
  • Document adaptation of CC projects in response to the COVID-19 pandemic (RQ 1.4)
  • Measure the types of capacity—human, organizational, policy, or resource—addressed by CCs (RQ 3.1)
  • Measure TA recipient perceptions of the extent to which CCs increase capacity (RQ 3.2)
  • Measure TA recipient perceptions of the usefulness of CC tools provided (RQ 3.2)
  • Measure perceptions of the extent to which TA recipients have tools to address needs on their own (RQ 3.3)
  • Compare differences in services delivered and TA recipient experiences between the 2012 and 2019 cohorts (RQ 4)

Data Source: Interviews with CC Directors
Respondent Sample: Regional Center Directors (19); National Center Director (1)
Mode and Timing of Data Collection: Video-conference interviews, Jun-Jul 2022
Use(s) in Evaluation:
  • Document adaptations to CC projects in response to the COVID-19 pandemic (RQ 1.4)
  • Document intensity of projects (staffing requirements/duration) (RQ 2.2)
  • Document approaches to building sustainable capacity (RQ 3.3)
  • Document nature and frequency of supports that the National Center provides to Regional Centers (RQ 4.2)
  • Identify successes and challenges of collaboration between the National Center and Regional Centers (RQ 4.2)
  • Identify how collaboration between RELs and CCs can be improved to avoid duplication of effort and/or gaps in services (RQ 4.3, 4.4)

Data Source: Interviews with REL Directors
Respondent Sample: REL Directors (10)
Mode and Timing of Data Collection: Video-conference interviews, Jun-Jul 2022
Use(s) in Evaluation:
  • Document nature and frequency of collaboration between CCs and RELs (RQ 4.3)
  • Document successes and challenges related to REL and CC collaboration (RQ 4.3)
  • Identify opportunities for CCs and RELs to improve collaboration and avoid duplication of effort and/or gaps in services (RQ 4.3)

Administrative Data2

Data Source: Annual Service Plans, Annual Evaluation Reports, and Joint Needs Sensing Memos
Respondent Sample: None (administrative data from the U.S. Department of Education, IES)
Mode and Timing of Data Collection: Administrative data; Years 1 & 2: Spring/Fall 2021; Year 3: Spring/Fall 2022
Use(s) in Evaluation:
  • Document type of problems that CC projects are addressing (RQ 1.1)
  • Document types and focus of services that CCs are delivering (RQ 2.1)
  • Document process and focus of collaboration between RELs and CCs (RQ 4.3)
  • Pre-populate interview protocols (RQ 1, 2, 3, 4)

Data Source: Restricted Use Data from 2012 Cycle CC Evaluation
Respondent Sample: None (administrative data from the U.S. Department of Education, IES)
Mode and Timing of Data Collection: Spring 2022
Use(s) in Evaluation:
  • Document dimensions of capacity addressed by CC projects (RQ 3.1)
  • Compare perceptions of the extent to which CCs improved capacity between the 2012 and 2019 cohorts (RQ 4.1)
  • Compare perceived successes and challenges in addressing capacity needs between the 2012 and 2019 cohorts (RQ 4.1)




A.3 Use of Information Technology and Burden Reduction

To minimize burden on study participants, the study team will use strategies that have proven successful in past studies the team has conducted. General strategies to minimize burden using technology are described below. Other strategies for minimizing burden are discussed in section A.4.

A.3.1 Administrative Data

The study is limiting the scope of the surveys and interviews by drawing on administrative data collected by the Department. These administrative data include:

  1. Annual Service Plans, which are collected at the beginning of each fiscal year and provide information about projects planned for the coming year. The CC program office provides Annual Service Plan templates to each CC to ensure these data are structured consistently across Centers.

  2. Annual Evaluation Reports, which are collected at the end of each fiscal year and provide internal CC evaluation findings. These reports are in narrative format.

  3. Joint Needs Sensing Memos, which present the results of discussions between the CCs and the RELs to identify key needs in each region.

In addition, to allow for some comparisons of the CC program over time, the study will draw on the restricted use data from an evaluation of the 2012 cycle of CC grants.

A.3.2 TA Recipient Survey

The study team will use the Confirmit Horizons platform, which will allow TA recipients to respond to the survey at the most convenient time within the survey window, keeping survey response as low-burden as possible. The study team chose this platform for the TA Recipient Survey given its compatibility with desktop, laptop, tablet, and mobile survey completion. Features of the survey include:

  • Personalized Access. A customized survey link will be created for each potential respondent. Respondents can exit the survey and return to finish their response at any time.

  • Closed-ended Questions. The survey questions are closed-ended, which reduces burden on participants. For a small number of items, respondents will be able to select “other” and provide an additional option, if applicable.

  • Automated Skip Patterns. Skip logic will be programmed into the survey to reduce burden on participants by omitting questions that are not applicable based on previous responses. This programming helps eliminate the need to follow up with respondents about missing data or responses that would not be feasible in any circumstance.

  • Automated Validation Checks. The study team will program the survey to limit inaccurate data entries by validating any numerical entries to ensure they are within an acceptable range. The survey will limit the potential for missing item-level data by prompting respondents for responses when they skip a question. These checks help ensure accuracy and reduce the need for respondent follow-up. (A generic sketch of this skip and validation logic follows this list.)
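
The survey platform implements skip patterns and validation through its own configuration tools; purely as an illustration of the underlying logic, the following minimal Python sketch (with hypothetical question IDs, not the Confirmit API) shows how a skip pattern and a numeric range check operate:

```python
# Illustrative only: hypothetical question IDs, not the Confirmit API.

def applicable_questions(responses):
    """Return follow-up question IDs, omitting items skipped by logic."""
    questions = ["q1_received_ta"]
    # Skip pattern: ask about usefulness only if TA was received.
    if responses.get("q1_received_ta") == "yes":
        questions += ["q2_usefulness", "q3_hours_of_ta"]
    return questions

def validate_numeric(value, low, high):
    """Range check: re-prompt the respondent if the entry is out of bounds."""
    try:
        number = float(value)
    except ValueError:
        return False, "Please enter a number."
    if not low <= number <= high:
        return False, f"Please enter a value between {low} and {high}."
    return True, ""

# A respondent who did not receive TA never sees q2 or q3.
print(applicable_questions({"q1_received_ta": "no"}))  # ['q1_received_ta']
print(validate_numeric("250", 0, 100))                 # (False, 'Please enter...')
```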

Before finalizing the TA Recipient Survey, the study team pilot tested it with three TA recipients from different regions who were nominated by CC Directors, embedding cognitive interviews in the process. This piloting process collected feedback about survey length, clarity, and relevance, and identified areas of confusion, misunderstanding, or gaps in the survey instrument. Based on feedback from the piloting process, the study team adjusted the survey to ensure the questions are clear, relevant, and will elicit useful data, thus reducing burden on respondents. Respondents who piloted the survey will not be asked to complete the survey again during the data collection window.

A.3.3 Interviews

All interviews will be conducted over video conferencing platforms, which are easily accessible on multiple devices such as desktop computers, laptops, tablets, and smartphones, and thus will reduce burden on interviewees. Interviewers and note takers may use laptops to record responses throughout the interview. In addition, the interviews will be audio-recorded, provided the interviewees consent. This will ensure that notes are accurate and ready for analysis and will reduce the likelihood that the study team will need to follow up with respondents.

Interview Procedures. The CC Director interview and REL Director interview will be conducted by a senior researcher as the lead interviewer and a junior researcher as the note taker. Ahead of the CC interviews, the interview team will review preliminary survey data from the TA recipients in the CC’s region, information from the pre-interview survey template, and the applicable Annual Service Plans, Annual Evaluation Reports, and Joint Needs Sensing Memos. This review will provide the study team with insights into the CC’s work with TA recipients and its collaboration with the REL and the National Center. The study team will pre-populate the interview protocol with relevant information to reduce the burden on the respondent and to ensure interviews address information that is not already available.

Pre-Interview Survey. The study team will administer a pre-interview survey to CC Directors to gather project-specific details that may be difficult for CC Directors to recall during the interview. Prior to the interview, each Center Director will receive the survey pre-populated with the names of the projects listed as active in Year 2 in the Annual Service Plans. CC Directors will be asked to respond to a series of questions about each project. This interview preparation process will minimize the need for follow-up about project details after the interviews are complete.

Piloting the Interview. The study team pilot tested the interview protocol with one CC Director, including additional questions at the end to collect feedback about interview length, clarity, and relevance. Based on feedback from the piloting process, the study team adjusted the instruments to ensure questions are clear, relevant, and non-duplicative, thus reducing burden on respondents. During the data collection window, the CC Director who piloted the protocol will be asked only those questions added to the protocol after the pilot.

A.4 Efforts to Identify Duplication

Whenever possible, the study team will use administrative data and restricted use data from the evaluation of the 2012 cohort to answer the evaluation’s research questions (see Exhibit A1 for a list of research questions). Detailed plans for avoiding duplication of information included in these documents and information collected in the TA recipient survey and interviews are explained below.

A.4.1 Administrative Data

The study team conducted a systematic review of administrative documents available to date from the Department, and will update its review as new documents become available. These documents—which include Annual Service Plans, Annual Evaluation Reports, and CC-REL Joint Needs Sensing Memos—will provide important descriptive information about CC services and activities and guide the preparation of the interviews and survey. The study team will also access restricted use data from the evaluation of the 2012 CC cohort to compare the services provided and experiences of TA recipients between the previous and current cycles.

A.4.2 TA Recipient Survey

The study team will minimize duplication in data collection by identifying and surveying a single TA recipient per project. Prior to fielding the TA Recipient Survey, the study team will ask each CC Director to provide the name of the key contact person for each project that was active in Year 2. If a project spans multiple states, Directors will be asked to identify one key contact in each participating state. If a TA recipient serves as the key point of contact for multiple projects, they will receive one survey that will cycle through a series of project-specific questions for each project they oversee.
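
As a minimal Python sketch of this de-duplication step, using hypothetical project IDs and contact addresses rather than the study’s actual contact files, the logic amounts to grouping active projects by contact so that each person receives a single survey:

```python
from collections import defaultdict

# Hypothetical contact list: (project_id, contact_email) pairs supplied
# by CC Directors. A contact listed on several projects appears repeatedly.
project_contacts = [
    ("P-101", "ssmith@state.ed.example"),
    ("P-102", "ssmith@state.ed.example"),
    ("P-103", "tjones@district.example"),
]

# Group projects by contact so each respondent receives one survey that
# cycles through project-specific questions for every project they oversee.
surveys = defaultdict(list)
for project_id, email in project_contacts:
    surveys[email].append(project_id)

for email, projects in surveys.items():
    print(email, "->", projects)
# ssmith@state.ed.example -> ['P-101', 'P-102']
# tjones@district.example -> ['P-103']
```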

A.4.3 Interviews

The study team will minimize duplication in interview data collection by reviewing all administrative data prior to the start of the interview.

A.5 Efforts to Minimize Burden in Small Businesses and Other Small Entities

No information in this study will be collected from small businesses or other small entities.

A.6 Consequences of Not Collecting the Information

An independent evaluation of the CC program is mandated by Title II, Sec. 203, of the Educational Technical Assistance Act (ETAA) of 2002. The evaluation is intended to inform program improvement for the next cohort of grantees and to report to Congress findings about the problems the CCs are addressing, the services they are providing to address those problems, and the successes and challenges related to the new program structure. Without this data collection effort, the Department could not meet its obligations to Congress and the study team could not inform program improvements for the next CC cohort.

A.7 Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6

There are no special circumstances for the proposed data collection. While CC Director interview participants will be asked to complete pre-interview questions, they will be given more than 30 days to respond to the request.

A.8 Consultation Outside the Agency

A.8.1 Federal Register Announcement

On September 17, 2021, a 60-day Federal Register Notice was published at 86 FR 51878. One comment was received during the 60-day period. A 30-day Notice will be published.

A.8.2 Consultations Outside the Agency

The study team will consult with a Technical Working Group (TWG) in the summer of 2022 to review study findings and discuss their implications for the CC program. The TWG will consist of practitioners and researchers with experience in capacity building related to the implementation of evidence-based practices.

A.9 Payments or Gifts to Respondents

Data collection for this study does not involve payments or gifts to respondents.

A.10 Assurance of Confidentiality

The study team will conduct all data collection activities for this study in accordance with relevant regulations and requirements mandated by the Privacy Act of 1974 (5 U.S.C. 552a) and the Education Sciences Reform Act of 2002, Title I, Part E, Section 183.

Administrative data were emailed to Abt Associates by the Department in a secure zip file. No personally identifiable information (PII) was included in these files.

The study team and the Department will use the primary data collected in this study for research purposes only and will protect the confidentiality of all study participants. All electronic data will be stored in Abt Associates’ Secure Analytics Enclave. Survey data will initially be stored on the Confirmit platform and will be transferred to Abt’s Secure Analytics Enclave for analysis. The only PII available to the study team will be the names and email addresses of survey and interview respondents, which will be used to contact potential respondents; the surveys will not ask for this information. For data analysis, names will be replaced with study-specific identifiers after the data are collected, and the names will be destroyed once they are no longer required for the study. All study team members with access to the data will be trained in confidentiality and data security procedures. Reports will present data in aggregate form and will not identify specific TA recipients, CC Directors, or REL Directors. The survey, pre-interview template, and interviews will include a notice of confidentiality.

The following is an example of a type of statement that will be included on a request for data:

Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific individual. Any willful disclosure of such information for non-statistical purposes, except as required by law, is a class E felony.

The following safeguards are routinely employed by Abt Associates to ensure confidentiality, and they will be consistently applied to this study:

  • All employees sign a confidentiality pledge that emphasizes the importance of confidentiality and describes employees’ obligations to maintain it.

  • PII is maintained in separate forms and files, which are linked only by sample identification numbers.

The study will take several steps to safeguard respondent information:

  1. All contractor staff will comply with the security investigation requirements governed by their risk/sensitivity level as detailed in the Department Contractor Vetting Security Requirements (11-1-2019).

  2. All contractor staff will receive instruction in the privacy requirements of the study.

  3. Access to any data with identifying information will be limited to contractor staff directly working on the study. Access to electronic data will require individual usernames and passwords.

  4. Names and other identifying information for survey and interview respondents will be replaced with numerical identifiers after the data are collected and prior to analysis. A key linking the names to the identifiers will be kept in a separate location, accessible to Abt staff only on a need-to-know basis. (A sketch of this step follows the list.)

  5. Any quotations from responses used in public reporting will be edited to ensure that the identity of the respondent cannot be ascertained.
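
A minimal Python sketch of the identifier-replacement step described in item 4, using hypothetical names and file paths (the study’s actual procedure may differ in its details):

```python
import csv

# Hypothetical respondent records; real PII never appears in analysis files.
respondents = [
    {"name": "Jane Doe", "email": "jdoe@example.org", "response": "..."},
    {"name": "John Roe", "email": "jroe@example.org", "response": "..."},
]

# Replace names and emails with sequential study-specific identifiers.
key, analysis = [], []
for i, rec in enumerate(respondents, start=1):
    study_id = f"R{i:04d}"
    key.append({"study_id": study_id, "name": rec["name"], "email": rec["email"]})
    analysis.append({"study_id": study_id, "response": rec["response"]})

# The crosswalk is stored separately with need-to-know access; the
# analysis file contains no direct identifiers.
with open("id_crosswalk.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["study_id", "name", "email"])
    writer.writeheader()
    writer.writerows(key)
with open("analysis_file.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["study_id", "response"])
    writer.writeheader()
    writer.writerows(analysis)
```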

The study team will house survey data on the Confirmit platform, which is FISMA Moderate compliant. For analyses, the study team will store data on Abt’s Analytic Computing Environment (ACE), which complies with HIPAA, FERPA, and FISMA Moderate standards implemented in FedRAMP-certified Amazon Web Services (AWS) environments. Additional security features include encrypted storage, intrusion detection, and audit log aggregation. ACE is monitored seven days a week, 24 hours a day using advanced monitoring and alerting tools. Expert security and IT staff continuously review audit logs, conduct regular scans, and apply patches. Redundant backups are stored both on AWS and at a remote data center.

Abt’s Institutional Review Board (IRB) will review the study design protocols and data collection plan in order to determine whether the study needs to undergo a full review or whether it is exempt from full review. If the Abt IRB determines that a full review is necessary, the IRB will review the informed consent process, data security plan, and all data collection instruments and procedures to ensure that study participants are protected.

A.11 Sensitive Questions

The data collection for this study does not include any questions of a sensitive nature.

A.12 Estimate of Response Burden

The total annual respondent burden for the data collection effort covered by this clearance request is 105.98 hours, for a total cost burden of $6,113.62. Exhibit A3 presents the estimated time burden to respondents, and Exhibit A4 presents the estimated cost burden to respondents. The following assumptions informed these burden estimates; a worked example of the arithmetic follows the list:

  • TA Recipient Representatives:

  • Three TA recipients will complete the TA recipient survey pilot and cognitive interview. Completing both these tasks will take approximately 1 hour per respondent. The cost to the TA recipients is based on a national average hourly wage of $45.11 for education administrators in 2020 (BLS).

  • TA recipients identified as key contacts will complete one TA Recipient Survey per project. If TA recipients lead more than one project within their agencies, the survey will cycle through the series of project-specific questions. There are 233 CC projects, and the study team assumes an 80% completion rate for the survey, which will result in responses covering approximately 186 projects. The cost to the TA recipients is based on a national average hourly wage of $45.11 for education administrators in 2020 (BLS).

  • CC Director:

  • The 19 Regional Center Directors and 1 National Center Director will participate in the CC Director Interview. The study team assumes a 100% completion rate for CC Director interviews. Interview participation will take approximately 60 minutes per participant, plus 20 minutes to fill in the pre-interview template. The cost to the CC Director is based on an average hourly wage of $77.15 for General and Operations Managers within Management, Scientific, and Technical Consulting Services in 2020 (BLS).

  • All 20 CC Directors will help identify the name and contact information for the key contact for every project active in Year 2 in their regions and will send a notification email about the data collection activities to the key contacts, as well as a reminder email to those who have not responded. These tasks will take approximately 30 minutes combined. The cost to the CC Director is based on an average hourly wage of $77.15 for General and Operations Managers within Management, Scientific, and Technical Consulting Services in 2020 (BLS).

  • REL Director:

  • The 10 REL Directors will participate in the REL Director interview. The study team assumes a 100% completion rate for REL Director interviews. Interview participation will take approximately 30 minutes per respondent. The cost to the REL Director is based on an average hourly wage of $77.15 for General and Operations Managers within Management, Scientific, and Technical Consulting Services in 2020 (BLS).
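
For reference, the totals in Exhibits A3 and A4 follow directly from these assumptions; for the largest component, the TA Recipient Survey:

\begin{align*}
\text{Completed surveys} &= 233 \text{ projects} \times 0.80 \approx 186 \\
\text{Survey hours} &= 186 \times 0.33 \text{ hours} = 61.38 \\
\text{Survey cost} &= 61.38 \text{ hours} \times \$45.11/\text{hour} = \$2{,}768.85 \\
\text{Total annual hours} &= 3 + 61.38 + 6.6 + 20 + 10 + 5 = 105.98
\end{align*}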





Exhibit A3: Estimate of Respondent Time Burden

Respondent Type                                   Time per Response   Maximum Number   Number of     Total Time
                                                  (Hours)             of Responses     Respondents   Burden (Hours)

TA Recipient Representatives
  TA Recipient Survey Pilot                       1.00                1                3             3.00
  TA Recipient Survey                             0.33                1                186           61.38

CC Director
  CC Director Pre-Interview Template              0.33                1                20            6.60
  CC Director Interview                           1.00                1                20            20.00
  Identify contacts for survey and send emails    0.50                1                20            10.00

REL Director
  REL Director Interview                          0.50                1                10            5.00

Total Hours                                                                                          105.98



Exhibit A4: Estimate of Respondent Cost Burden by Year

Information Activity or IC                        Sample   Response   Number of     Average Burden       Total Annual   Estimated Average   Total Annual
                                                  Size     Rate       Respondents   Hours per Response   Burden Hours   Hourly Wage         Cost

TA Recipients
  TA Recipient Survey Pilot                       3        100%       3             1.00                 3.00           $45.11 a            $135.33
  TA Recipient Survey                             233      80%        186           0.33                 61.38          $45.11 a            $2,768.85

CC Director
  CC Director Pre-Interview Template              20       100%       20            0.33                 6.60           $77.15 b            $509.19
  CC Director Interview                           20       100%       20            1.00                 20.00          $77.15 b            $1,543.00
  Identify contacts for survey and send emails    20       100%       20            0.50                 10.00          $77.15 b            $771.50

REL Director
  REL Director Interview                          10       100%       10            0.50                 5.00           $77.15 b            $385.75

Annualized Totals                                                                                                                           $6,113.62

a The cost to TA recipients is based on a 2020 national average hourly wage of $45.11 for education administrators (BLS).
b The cost to CC Directors and REL Directors is based on a 2020 national average hourly wage of $77.15 for general and operations managers (BLS).

A.13 Estimate of Total Capital and Startup Costs/Operation and Maintenance Costs to Respondents or Record-Keepers

There are no annualized capital/startup or ongoing operation and maintenance costs involved in the collection of the proposed data.

A.14 Estimate of Cost to the Federal Government

The total estimated cost of the evaluation to the federal government, including the data collection activities described above, protocol development, analysis, and reporting, is $1,939,185. The average annual cost, spread across 3 years, is $646,395.

A.15 Change in Burden

This is a request for a new collection of information.

A.16 Plans for Analysis, Publication, and Schedule

A.16.1 Analysis Plan

To address the research questions, the study team will use two types of analytic methods.

  • Descriptive analyses – The study team will produce summary statistics such as means and standard deviations for continuous variables and tabulations such as frequency distributions and percentages for categorical variables.

  • Comparative analyses – To compare groups (for example, administrative challenges by the type of capacity that CCs aimed to develop), the study team will conduct cross-tabulations and use common statistical tests, such as an F-test, to determine whether differences are statistically significant or likely due to chance. (A minimal sketch of this approach follows this list.)
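
A minimal sketch of these analyses, using hypothetical variable names and illustrative data rather than actual study data:

```python
import pandas as pd
from scipy import stats

# Hypothetical survey extract: the capacity dimension a project targeted
# and the TA recipient's 1-5 rating of capacity improvement.
df = pd.DataFrame({
    "capacity_type": ["human", "human", "organizational", "organizational",
                      "policy", "policy", "resource", "resource"],
    "improvement_rating": [4, 5, 3, 4, 2, 3, 4, 4],
})

# Descriptive analysis: frequency distribution of ratings by capacity type.
print(pd.crosstab(df["capacity_type"], df["improvement_rating"]))

# Comparative analysis: one-way F-test of whether mean ratings differ
# across capacity types more than chance would predict.
groups = [g["improvement_rating"] for _, g in df.groupby("capacity_type")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```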

The study team will summarize findings and present them in tables and exhibits in the final report. All text describing findings will be presented in plain language.

A.16.2 Time Schedule and Publications

The evaluation requires publishing a final report and delivering a restricted use data file.

  • Final Report: After incorporating feedback from the Department, the study team will deliver the final report, expected in May 2023.

  • Restricted Use Data File: The study team will produce a restricted use data file that contains all data collected for the evaluation, subject to any applicable privacy laws, with all direct personal identifiers removed. This file and its accompanying documentation are expected in September 2023.

A.17 Approval to Not Display Expiration Date

No exemption is requested. The data collection instruments will display the expiration date.

A.18 Exceptions to Item 19 of OMB Form 83-1

No exceptions are necessary for this information collection.

2 Because administrative data are provided directly by ED to the study team, they are not included in this clearance request. By the time data collection will take place, Annual Service Plans will be available for Years 1-3, Annual Evaluation Reports will be available for Years 1-2, and Joint Needs Sensing Memos will be available for Years 1-3.

