
National Evaluation of the Investing in Innovation (i3) Program

OMB: 1850-0913




Supporting Statement for Information Collection Request—Part A




Investing in Innovation (i3) Technical Assistance and Evaluation Project




January 8, 2018





Prepared for:

Tracy Rimdzius

Institute of Education Sciences

U.S. Department of Education

550 12th Street, SW

Washington, DC 20024




Submitted by:

Abt Associates

55 Wheeler Street

Cambridge, MA 02138





A. Justification

The Institute of Education Sciences (IES), within the U.S. Department of Education (ED), requests that the Office of Management and Budget (OMB) approve, under the Paperwork Reduction Act of 1995, clearance for IES to conduct data collection efforts for the Investing in Innovation (i3) Technical Assistance and Evaluation Project. The i3 program is designed to support school districts and nonprofit organizations in expanding, developing, and evaluating evidence-based practices and promising efforts to improve outcomes for the nation’s students, teachers, and schools. Each i3 grantee is required to fund an independent evaluation to document the implementation and outcomes of its intervention. The i3 Technical Assistance and Evaluation Project has two goals: (1) to provide technical assistance (TA) to support the independent evaluators in conducting evaluations that are well-designed and well-implemented and (2) to summarize the strength of the evidence and findings from the independent evaluations.

Data collection will be required to address the second goal of summarizing the strength of the evidence and findings of the individual i3 evaluations being conducted by independent evaluators contracted by each i3 grantee. The required data collection will entail each i3 evaluator providing data on their impact estimates and fidelity of implementation, either via the standardized templates provided by the project (shown in Appendix A) or in a draft report with sufficient detail that the information can be extracted by the i3 Technical Assistance and Evaluation Project team. Part A of this request discusses the justification for these data collection activities. Part B, submitted under separate cover, describes the data collection procedures.

A.1 Circumstances Requiring the Collection of Data

To date, ED has awarded over $1.4 billion in funding through i3. Across seven cohorts, one per fiscal year (FY 2010 through FY 2016), ED awarded grants to support the implementation and evaluation of 172 programs (11 Scale-Up grants, 46 Validation grants, and 115 Development grants). Across cohorts, Scale-Up grants were intended to support taking to scale programs that have strong evidence of their effectiveness in some settings, Validation grants were given to programs to validate their effectiveness, and Development grants were given to programs to establish the feasibility of implementation and to test their promise for improving outcomes.

The i3 program, overseen by ED’s Office of Innovation and Improvement (OII), requires each grantee to fund an independent evaluation to document the implementation and outcomes of the intervention funded by i3. The Scale-Up and Validation grantees are expected to fund rigorous impact studies using experimental or quasi-experimental designs (QEDs); such designs have the potential to yield strong evidence about a range of approaches to improving educational outcomes. The Development grants also support independent evaluations, but those evaluations are not necessarily expected to be as rigorous. IES has contracted with Abt Associates and its partners1 (the i3 Evaluation Team) to conduct the i3 Technical Assistance and Evaluation Project.

Achieving the first goal of the i3 Technical Assistance and Evaluation Project requires one group within the i3 Evaluation Team, called the Technical Assistance (TA) Team, to take an active role in maximizing the strength of the design and implementation of the evaluations, so that the evaluations yield the strongest evidence possible about what works for whom in efforts to improve educational outcomes. Achieving the second goal of the i3 Technical Assistance and Evaluation Project requires a separate group within the i3 Evaluation Team, known as the Analysis and Reporting (AR) Team, to (1) assess the evaluation designs and implementation, both to inform ED of the progress of the evaluations2 and to provide important context for the summary of results, and (2) provide clear synopses of what can be learned from this unprecedented investment in educational innovation. To ensure that the work of the TA Team does not influence the assessments and analysis of the AR Team, the two groups of staff members do not overlap and are separated by a firewall maintained by the Project Director. The systematic data collection proposed for the i3 Technical Assistance and Evaluation Project is necessary to provide the information for the AR Team to review and summarize the findings from the independent evaluations funded by i3. The data collection will be led by the AR Team, with oversight from the Project Director.

A.2 Purposes and Use of the Data

This section of the supporting statement provides an overview of the research design and data collection efforts planned to meet the main research questions and overall objectives of the i3 Technical Assistance and Evaluation Project. The section begins with an overview of the research design, including the main objectives of the evaluation and key research questions, followed by a description of the data collection activities for which OMB clearance is requested.

As noted earlier, the main objectives of the i3 Technical Assistance and Evaluation Project are (1) to provide technical assistance (TA) to support the independent evaluators in conducting evaluations that are well-designed and well-implemented and (2) to summarize the strength of the evidence and findings from the independent evaluations. The data collected for this evaluation will address the second objective. Specifically, the data will be used to support reviews and reports to ED that have five goals:

  1. Describe the intervention implemented by each i3 grantee.

  2. Present the evidence produced by each i3 evaluation.

  3. Assess the strength of the evidence produced by each i3 evaluation.

  4. Identify effective and promising interventions.

  5. Assess the results of the i3 program.

These five goals are described in more detail below, along with the research questions that address each goal.

A.2.1 Goal One: Describe the intervention implemented by each i3 grantee

While the interventions are described in the applications submitted by the grantees for funding, it is important to augment these descriptions with information reflecting what happened as opposed to what was planned during the application stage. This information allows the i3 Evaluation Team to describe key components of each i3-funded intervention—for example, in the approach taken to training teachers and delivering instruction to students. An accurate description of the details of each project’s intervention is critical to ED’s understanding of the breadth and depth of the approaches funded, and of the implications for what these specific projects can potentially teach the field about promising approaches to education reform.

This goal is fulfilled by answering the following research questions:

Q1. What are the components of the intervention?

Q2. What are the ultimate student outcomes that the intervention is designed to affect?

Q3. What are the intermediate outcomes through which the intervention is expected to affect student outcomes?

A.2.2 Goal Two: Present the evidence produced by each i3 evaluation

In fall 2017, we began presenting the available findings from both the implementation and impact studies as they emerged from the i3 evaluations. As more grants finish their evaluations, the i3 AR Team will continue reporting the findings provided by the independent evaluators, after processing those data to ensure consistency in reporting across grants.

This goal is fulfilled by answering the following research questions:

Q4. For each i3 grant, how faithfully was the intervention implemented?

Q5. For each i3 grant, what were the effects of the intervention on/promise of the intervention to improve educational outcomes?

A.2.3 Goal Three: Assess the strength of the evidence produced by each i3 evaluation

In addition to providing ED with information necessary to report key Government Performance and Results Act (GPRA) measures, the i3 AR Team’s assessments of the i3 independent evaluations provide important context for the interpretation of the findings generated by the independent evaluations and summarized in the reports. Specifically, ED has explicitly articulated the expectation that Scale-Up and Validation grantees conduct evaluations that are “well-designed and well-implemented,” as defined by What Works Clearinghouse™ (WWC) Evidence Standards, because findings generated by such studies provide more convincing evidence of the effectiveness of the programs than evaluations that do not meet this standard. ED expects evaluations of Development grants to provide evidence on the intervention’s promise for improving student outcomes.3

This goal is fulfilled by answering the following research questions:

Q6. For each i3 grant, how strong is the evidence on the causal link between the intervention and its intended student outcomes?

Q6a. For Scale-Up and Validation grants, did the evaluation provide evidence on the effects of the intervention?

Q6b. For Development grants, did the evaluation produce evidence on whether the intervention warrants a more rigorous study of the intervention’s effects? For Development grants, if applicable, did the evaluation provide evidence on the outcomes of the intervention?

A.2.4 Goal Four: Identify effective and promising interventions

The i3 Evaluation Team produces lists of interventions with positive effects/outcomes. Since educational interventions are typically designed to improve outcomes in one or more domains, and are targeted for students at particular grade levels, we list interventions by outcome domain and educational level (e.g., reading achievement for students in elementary school). Each list includes the names of the i3-funded interventions that produced positive effects, based on i3 studies that meet WWC standards (with or without reservations).4

This goal is fulfilled by answering the following research questions:

Q7. Which i3-funded interventions were found to be effective at improving student outcomes?

Q8. Which i3-funded interventions were found to be promising at improving student outcomes?

A.2.5 Goal Five: Assess the results of the i3 program

To inform federal policymakers, we will summarize the effectiveness of the i3-funded interventions. This goal is fulfilled by answering the following research questions:

Q9. How successful were the i3-funded interventions?

Q9a. What fraction of interventions was found to be effective or promising?

Q9b. What fraction of interventions produced evidence that met i3 evidence standards (with or without reservations) or i3 criteria for providing credible evidence of the intervention’s promise for improving educational outcomes?

Q9c. What fraction of interventions produced credible evidence of implementation fidelity?

Q9d. Did Scale-Up grants succeed in scaling up their interventions as planned?

A.2.6 Alignment of the data collection goals and the data elements

As stated above, the i3 Technical Assistance and Evaluation Project has nine research questions that address the five major goals of the evaluation. Exhibit 1 presents a crosswalk aligning the research questions with the individual items from the data collection templates. Appendix A contains the data collection instrument; evaluators may choose to complete this instrument or they may submit a final report.

Exhibit 1. Crosswalk Aligning Goals, Research Questions, and Data Collection Instrument

Goal 1: Describe the intervention implemented by each i3 grantee.

Q1. What are the components of the intervention?
Data collection instrument items: IMPLEMENTATION: A, B, D, I, N, S, X, & AB

Q2. What are the ultimate student outcomes that the intervention is designed to affect?
Data collection instrument items: IMPACT: F; BACKGROUND: 5

Q3. What are the intermediate outcomes through which the intervention is expected to affect student outcomes?5
Data collection instrument items: n/a

Goal 2: Present the evidence produced by each i3 evaluation.

Q4. For each i3 grant, how faithfully was the intervention implemented?
Data collection instrument items: IMPLEMENTATION: C, E-H, J-M, O-R, T-W, Y-AA, AC-AJ

Q5. For each i3 grant, what were the effects of the intervention on educational outcomes?
Data collection instrument items: IMPACT: CA-CQ

Goal 3: Assess the strength of the evidence produced by each i3 evaluation.

Q6. For each i3 grant, how strong is the evidence on the causal link between the intervention and its intended student outcomes?
Q6a. Was the evaluation well-designed and well-implemented (i.e., for Scale-Up and Validation grants, did the evaluation provide evidence on the effects of the intervention)?
Q6b. Did the evaluation provide evidence on the intervention’s promise for improving student outcomes (i.e., for Development grants, did the evaluation produce evidence on whether the intervention warrants a more rigorous study of the intervention’s effects)?
Data collection instrument items: IMPACT: A, B, F-CG

Goal 4: Identify effective and promising interventions.

Q7. Which i3-funded interventions were found to be effective at improving student outcomes?
Data collection instrument items: IMPACT: A, B, F-CG

Q8. Which i3-funded interventions were found to be promising at improving student outcomes?
Data collection instrument items: IMPACT: A, B, F-CG

Goal 5: Assess the results of the i3 Program.

Q9. How successful were the i3-funded interventions?
Q9a. What fraction of interventions was found to be effective or promising?
Q9b. What fraction of interventions produced evidence that met i3 evidence standards or i3 criteria?
Q9c. What fraction of interventions produced credible evidence of implementation fidelity?
Q9d. Did Scale-Up grants succeed in scaling up their interventions as planned?
Data collection instrument items: BACKGROUND: 1-7; IMPACT: A, B, F-CG; IMPLEMENTATION: A-D, E-I, J-N, O-S, T-X, Y-AJ

A.2.6.1 Types of data to be collected each year

Data on the i3 evaluation designs, activities, and findings will be collected toward the end of each evaluator’s contract. The evaluator may submit findings whenever they determine the findings are final and will not be subject to future changes. The exact timing will vary across i3 projects depending on the duration of their funding (which ranges from 3 to 5 years). While evaluators are permitted to submit data at any time, we anticipate most will do so near the end of their grant.

An earlier clearance package covered several years of data collection, during which 96 grants from the FY 2010 through FY 2014 cohorts submitted findings. Data collection from February 2018 through January 2021 will be covered by this clearance package, while data collection from February through December 2021 will be included in a future OMB clearance package.

When evaluators are ready to share findings, we ask them to either: (1) complete the data collection instrument shown in Appendix A or (2) submit a report (or other documents with sufficient data that the i3 Evaluation Team can complete the instrument based on the information provided). We will also ask them to complete the Background section of the data collection instrument shown in Appendix A.

Each i3 grant undergoes the following process on a timeline that is unique to its cohort and its award length but includes the same key components.

For each grant, we:

  1. Collect, via the i3 TA Team’s standardized templates, impact and implementation tables produced by the evaluators, or the evaluators’ draft report or other documents with information that would otherwise be covered in the templates.

  2. Review the documents described in step 1 to verify completeness.

  3. Follow up with the evaluator to a) actively confirm the information contained in the Background section of the data collection instrument and b) request missing data.


A.2.6.2 Information to be collected by the i3 Technical Assistance and Evaluation Project

The data collection instrument requests the following information:

  • Information on interventions, including:

  • Name of the intervention.

  • Descriptions of the interventions.

  • People or aspects of educational organizations that the intervention aims to affect or benefit.

  • Logic models of the inputs, mediators, and expected outputs of interventions. (For Scale-Up grants this includes a logic model of the mechanisms and pathways of the scale-up process).

  • Scale of the intervention (in terms of the number of students, teachers, schools, and/or districts intended to be served).

  • Evaluation methods, including:6

  • Research questions (specified as confirmatory and exploratory).

  • Research design (i.e., randomized controlled trials, quasi-experimental designs with comparison groups, regression discontinuity designs, or other types of designs).

  • Treatment and counterfactual conditions (specifically, how the counterfactual differs from the intervention).

  • Characteristics of the potential intervention versus comparison group contrasts to be analyzed.

  • Plans for measuring fidelity of intervention implementation.

  • Outcomes or dependent variables (e.g., name, timing of measurement for the intervention and the comparison groups, reliability and validity properties, whether outcome data were imputed in the case of missing data, and a description of the measurement instrument if not standard).

  • Independent variables (e.g., name, timing of measurement, and reliability and validity properties).

  • Analysis methods including descriptions of how impacts in the full sample and for subgroups were estimated, how the baseline equivalence was established (in QEDs and RCTs with high attrition), and how clustering of participants and noncompliance to treatment were accounted for.

  • Role of the developers or grantee in the conduct of the evaluation (to assess the independence of the evaluators).

  • Description of the study sample, including:

  • Group sizes (overall and per group).

  • Characteristics (e.g., grades or age levels).

  • Location.

  • Evaluation findings from the implementation study, including:

  • Revised description of the intervention components.

  • Level of implementation fidelity.

  • Evaluation findings from impact study for each contrast tested in the full sample or subgroups, plus information that is helpful in interpreting the impact findings, including:

  • Impact estimates.

  • Standard errors (to allow the i3 AR Team to conduct tests of statistical significance).

  • Standard deviation of the outcome measures (to allow the i3 AR Team to construct effect size measures).

  • Means and standard deviations for baseline variables (to allow the i3 AR Team to test for baseline equivalence; a sample check is sketched after this list).

  • Attrition rate and analysis sample sizes by group for each contrast.
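
To illustrate the kind of check that the baseline means and standard deviations support (see the bullet above), the following is a minimal sketch, not part of the i3 data collection itself, of a baseline-equivalence test along the lines used in WWC reviews: the baseline difference is expressed as an effect size, differences of at most 0.05 standard deviations satisfy equivalence, differences between 0.05 and 0.25 require statistical adjustment, and larger differences do not satisfy equivalence. The function names and numeric values are hypothetical and for illustration only.

```python
def baseline_effect_size(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized baseline difference between groups, using the pooled SD."""
    pooled_sd = (((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)) ** 0.5
    return (mean_t - mean_c) / pooled_sd

def equivalence_rating(es, low=0.05, high=0.25):
    """Map the absolute baseline difference to WWC-style equivalence categories."""
    es = abs(es)
    if es <= low:
        return "satisfies baseline equivalence"
    if es <= high:
        return "requires statistical adjustment for the baseline measure"
    return "does not satisfy baseline equivalence"

# Hypothetical values for a single baseline measure.
es = baseline_effect_size(mean_t=501.2, mean_c=498.7, sd_t=40.0, sd_c=41.5, n_t=220, n_c=210)
print(round(es, 3), equivalence_rating(es))
```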



A.2.6.3 Data collection procedures and timeline

The i3 Technical Assistance and Evaluation Project has collected or will collect data from the universe of all i3 projects funded under the i3 Program. We have already collected data from the independent evaluators of 96 grants from the FY 2010 through FY 2014 cohorts. We anticipate that the evaluators of approximately 69 projects will participate in the data collection included in this clearance package, while approximately seven remaining projects will participate in data collection covered by a future OMB clearance package. The data collection covered in this OMB clearance package will occur from February of 2018 through January of 2021, with the exact timing dependent on the grant. Each year, we will produce a set of summary findings tables using data from any grants that participated in data collection in the prior year. These tables are still under development with IES but will be similar to summary tables included in the project’s findings report, which is currently under review by IES. The findings report includes the 67 grants that participated in our data collection prior to December 31, 2016. Exhibit 2, below, summarizes the timing of data collection and the production of the findings report/summary tables for grants covered by the original clearance package (“past request (no. 1)”), the current information collection request (“current request (no. 2)”), and a final clearance package (“future request (no. 3)”).

Exhibit 2: Data Collection and Reporting Plan Overview

Grant End Date | Timing of Data Collection | Timing of Report/Findings Summary Tables | Information Collection Request (Number)
Grants ending in 2015-2016 (67 grants) | Before December 31, 2016 | Submitted to IES’ Standards and Review office September 2017 | Past request (no. 1)
Grants ending in 2017 (29 grants) | Before December 31, 2017 | June (draft)/July (final) 2018 | Past request (no. 1)
Grants ending in 2018 (37 grants) | Before December 31, 2018 | June (draft)/July (final) 2019 | Current request (no. 2)
Grants ending in 2019 (18 grants) | Before December 31, 2019 | June (draft)/July (final) 2020 | Current request (no. 2)
Grants ending in 2020 (14 grants) | Before December 31, 2020 | June (draft)/July (final) 2021 | Current request (no. 2)
Grants ending in 2021 (7 grants) | Before December 31, 2021 | October (draft)/November (final) 2021 | Future request (no. 3)


A.3 Use of Information Technology to Reduce Burden

Evaluators will submit data to the study team via email, regardless of the format of the data evaluators submit (i.e., data collection instrument or report). The use of email to collect data (combined with the flexible timeline) means that evaluators can complete the data collection process at a time and place convenient for them.

A.4 Efforts to Identify Duplication

Potential duplications of effort are of two general types: addressing research questions already answered and duplicating data collection. The i3 Technical Assistance and Evaluation Project will not address research questions already answered. As explained in Section A.1, the i3 grants were newly funded in FYs 2010 – 2016, and this evaluation is the only federally funded information collection with plans to collect data in order to assess the extent to which the i3 independent evaluations are well-designed and well-implemented and to report the results across the evaluations. To avoid duplicating data collection efforts, independent evaluators will be encouraged to send the AR Team findings reports already produced as part of their own work. The AR Team will review these reports and extract the needed information rather than requiring evaluators to provide the same data through the data collection instrument.

A.5 Small Businesses

The primary entities for this study are independent evaluators employed by large and small businesses as well as by universities. Every effort is being made to reduce the burden on the evaluators, including giving them the option to share findings via a final report instead of completing the AR Team’s data collection instrument.

A.6 Consequences of Not Collecting the Information

The data collection described in this supporting statement is necessary for conducting this evaluation, which is consistent with the goals of the Investing in Innovation program to identify and document best practices that can be shared and taken to scale based on demonstrated success (American Recovery and Reinvestment Act of 2009 (ARRA), Section 14007(a)(3)(C)). The information that will be collected through this effort is also necessary to report on the performance measures of the i3 program, required by the Government Performance and Results Act:

Long-Term Performance Measures
  • The percentage of programs, practices, or strategies supported by a [Scale-Up or Validation] grant with a completed well-designed, well-implemented, and independent evaluation that provides evidence of their effectiveness at improving student outcomes.

  • The percentage of programs, practices, or strategies supported by a [Development] grant with a completed evaluation that provides evidence of their promise for improving student outcomes.

  • The percentage of programs, practices, or strategies supported by a [Scale-Up or Validation] grant with a completed well-designed, well-implemented and independent evaluation that provides information about the key elements and the approach of the project so as to facilitate replication or testing in other settings.

  • The percentage of programs, practices, or strategies supported by a [Development] grant with a completed evaluation that provides information about the key elements and the approach of the project so as to facilitate further development, replication or testing in other settings.

A.7 Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6

There are no special circumstances required for the collection of this information.

A.8 Federal Register Notice and Consultation

A.8.1 Federal Register Notice

The 60-day Federal Register Notice was published on Monday, November 6, 2017, Volume 82, Number 213, page 51414. A copy of this notice is included as Appendix C. During the notice and comment period, the government received two comments, neither of which was substantive to the collection.

A.8.2 Consultation with Experts

The Abt study team assembled two technical working groups (TWGs) consisting of consultants with various types of expertise relevant to this evaluation. The TWG meetings were held in April 2011 and April 2016. The TWGs discussed the overall approach to the i3 Technical Assistance and Evaluation Project, including the provision of TA and the data collection and reporting plans.

Exhibit 3: TWG Members

Name | Affiliation | April 2011 TWG | April 2016 TWG
David Francis, Ph.D. | University of Houston | x | x
Tom Cook, Ph.D. | Northwestern University | x | x
Brian Jacob, Ph.D. | University of Michigan | x |
Lawrence Hedges, Ph.D. | Northwestern University | x |
Carolyn Hill, Ph.D. | Georgetown University | x | x
Chris Lonigan, Ph.D. | Florida State University | x |
Neal Finkelstein, Ph.D. | WestEd | x |
Bob Granger, Ph.D. | W.T. Grant Foundation | x |
Vivian Tseng | W.T. Grant Foundation | | x
Russ Whitehurst, Ph.D. | The Brookings Institution | | x


A.9 Payments or Gifts to Respondents

Data collection for this study does not involve payments or gifts to respondents.

A.10 Assurance of Confidentiality

Abt Associates and its subcontractors follow the confidentiality and data protection requirements of IES (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183), which require that all collection, maintenance, use and wide dissemination of data conform to the requirements of the Privacy Act of 1974 (5 U.S.C. 552a), the Family Educational Rights and Privacy Act of 1974 (20 U.S.C. 1232g), and the Protection of Pupil Rights Amendment (20 U.S.C. 1232h). The study team will not be collecting any individually identifiable information.

All study staff involved in collecting, reviewing, or analyzing study data will be knowledgeable about data security procedures. The privacy procedures adopted for this study for all data collection, data processing, and analysis activities include the following:

  • The study will not request any personally identifiable information (PII) data that was collected by independent evaluators. All of the data requested from independent evaluators will be in the form of aggregated reports of the methods, measures, plans, and results in their independent evaluations.

  • Individual i3 grants will be identified in this study. Their characteristics, results, and findings reported by independent evaluators, as well as assessments of the quality of the independent evaluations and i3 projects, will be reported. Reports from this study cannot, however, associate responses with a specific school or individual participating in the i3 independent evaluations, because the study is not collecting that information.

  • To ensure data security, all individuals hired by Abt Associates Inc. are required to adhere to strict standards and sign an oath of confidentiality as a condition of employment. Abt’s subcontractors will be held to the same standards.

  • All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only.

  • Identifiable data will be kept for three years after the project end date and then destroyed.

  • Written records will be shredded and electronic records will be purged.

A.11 Questions of a Sensitive Nature

The data collection for this study does not include any questions of a sensitive nature.

A.12 Estimates of Response Burden

Burden for the data collection covered by this clearance request is 1,028 hours, for a total cost to respondents of $39,955. On an annual basis, 23 respondents will provide 23 responses, and the burden during each year of data collection will be 343 hours. Exhibit 4 presents time estimates of response burden for the data collection activities requested for approval in this submission. The burden estimates are based on the following assumptions:

  • Approximately half of evaluators (35) will elect to submit a findings report in place of the i3’s data collection instruments.

  • The remaining evaluators (34) will complete the i3’s data collection instruments.

  • The i3 team will follow up with evaluators to request any missing information.

Estimated hourly costs to independent evaluators are based on an average hourly wage of $38.87 for social scientists and related workers, as reported by the U.S. Department of Labor, Bureau of Labor Statistics (U.S. Department of Labor 2016).
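
As a rough check on the figures above, the following sketch reproduces the annualized numbers from the totals reported in this section, assuming burden hours are spread evenly across the three years of the clearance period; the small difference from the reported dollar total reflects rounding in the underlying burden table (Exhibit 4).

```python
total_hours = 1028        # total respondent burden over the clearance period
respondents = 69          # evaluators expected to respond under this request
years = 3                 # February 2018 through January 2021
hourly_wage = 38.87       # BLS average wage, social scientists and related workers

print(respondents / years)         # 23 respondents per year
print(total_hours / years)         # ~343 burden hours per year
print(total_hours * hourly_wage)   # ~$39,958, close to the reported $39,955
print(total_hours / respondents)   # ~15 hours average burden per respondent
```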

Exhibit 4. Estimate of Respondent Burden

Notes:

This OMB package covers the period from 2/1/2018 through 1/31/2021. Therefore, it includes all data collection activities for the 2 FY 2010 grants, 1 FY 2011 grant, 4 FY 2012 grants, 17 FY 2013 grants, 25 FY 2014 grants, 13 FY 2015 grants, and 7 FY 2016 grants.

a During the period covered by this OMB package, we expect approximately half of all grants (35) to submit a findings report to i3, in place of completing portions of the i3's data collection instrument. These grants will still have to complete questions 3, 4, 5, and 7 in the background section of the i3's instrument (see Appendix A).

b During the period covered by this OMB package, we expect approximately half of all grants (34) to complete the i3's data collection instrument.

c Average hourly wage for “Social Scientists, and Related Workers, All Others” from the Industry-Specific Occupational and Wage Estimates, U.S. Department of Labor (see http://www.bls.gov/oes/current/oes193099.htm#nat), accessed October 6, 2017.

A.13 Cost Burden to Respondents or Record Keepers

There are no annualized capital, start-up, or ongoing operation and maintenance costs involved in collecting this information.

A.14 Estimates of Costs to the Federal Government

The estimated cost to the Federal Government for data collection is $754,416. The average annual cost to the federal government is approximately $251,472.  

A.15 Changes in Burden

This is the second of three OMB packages that will be submitted for the i3 Technical Assistance and Evaluation Project. We estimate that there will be no remaining burden for the previously approved instrument. The overall burden decreased between the first and second packages due to a decrease in the number of respondents (96 in the first package vs. 69 in the current package). The average burden per respondent is 15 hours in both packages.

A.16 Plans for Publication, Analysis, and Schedule

In this section, we present our approach to analyzing the data collected in order to address the five goals of the data collection (and related research questions) introduced in Section A.2: (1) Describe the intervention implemented by each i3 grantee; (2) Present the evidence produced by each i3 evaluation; (3) Assess the strength of the evidence produced by each i3 evaluation; (4) Identify effective and promising interventions; and (5) Assess the results of the i3 Program. After describing our approach to addressing the five goals, we discuss our plans for reporting the results. The five goals and research questions we plan to address over the course of our evaluation, presented in Section A.2 above, are reiterated in Exhibit 5.

Exhibit 5: i3 Technical Assistance and Evaluation Project’s Goals and Research Questions

Goal 1: Describe the intervention implemented by each i3 grantee.


Q1. What are the components of the intervention?


Q2. What are the ultimate student outcomes that the intervention is designed to affect?


Q3. What are the intermediate outcomes through which the intervention is expected to affect student outcomes?

Goal 2: Present the evidence produced by each i3 evaluation.


Q4. For each i3 grant, how faithfully was the intervention implemented?

Q5. For each i3 grant, what were the effects of the intervention on/promise of the intervention to improve educational outcomes?


Goal 3: Assess the strength of the evidence produced by each i3 evaluation.


Q6. For each i3 grant, how strong is the evidence on the causal link between the intervention and its intended student outcomes?

Q6a. For Scale-Up and Validation grants, did the evaluation provide evidence on the effects of the intervention?

Q6b. For Development grants, did the evaluation produce evidence on whether the intervention warrants a more rigorous study of the intervention’s effects? For Development grants, if applicable, did the evaluation provide evidence on the outcomes of the intervention?

Goal 4: Identify effective and promising interventions.


Q7. Which i3-funded interventions were found to be effective at improving student outcomes?

Q8. Which i3-funded interventions were found to be promising at improving student outcomes?

Goal 5: Assess the results of the i3 Program.


Q9. How successful were the i3-funded interventions?


Q9a. What fraction of interventions was found to be effective or promising?

Q9b. What fraction of interventions produced evidence that met i3 evidence standards (with or without reservations) or i3 criteria for providing credible evidence of the intervention’s promise for improving educational outcomes?

Q9c. What fraction of interventions produced credible evidence of implementation fidelity?

Q9d. Did Scale-Up grants succeed in scaling-up their interventions as planned?



A.16.1 Approach to i3 analysis and reporting

The nature of our analysis and reporting is to synthesize and review findings, rather than to estimate impacts using statistical techniques. In this section, we provide more information on how we plan to address the research questions posed for the i3 Technical Assistance and Evaluation Project by analyzing the data we are requesting permission to collect.



Goal 1: Describe the intervention implemented by each i3 grantee.


Q1. What are the components of the intervention?

Components are defined as the activities and inputs that are under the direct control of the individual or organization responsible for program implementation (e.g., program developer, grant recipient), and are considered by the developer to be essential in implementing the intervention. Components may include financial resources, professional development for teachers, curricular materials, or technology products. We will review the evaluators’ reported intervention components, code them into one of 11 categories to allow comparison of key components across evaluations, and report them in a succinct manner to facilitate “browsing” of activities across interventions.


Q2. What are the ultimate student outcomes that the intervention is designed to affect?

We will report the ultimate student outcome domains that were evaluated for each grant. As necessary, we will rephrase the outcome domains to maintain their original meaning but to be consistent with outcome domains named in other interventions; for example, we will use “Mathematics” to describe outcome domains labeled by evaluators as “Math Achievement,” “Mathematics Achievement,” “Mathematical Understanding,” etc.


Q3. What are the intermediate outcomes through which the intervention is expected to affect student outcomes?

We will report the intermediate outcomes through which the intervention is expected to affect outcomes. We will code the intermediate outcome into one of three categories – school, teacher/classroom, or family – to facilitate comparison with intermediate outcomes named in other interventions.

Goal 2: Present the evidence produced by each i3 evaluation.


Q4. For each i3 grant, how faithfully was the intervention implemented?

We will ask evaluators to report their criteria for assessing whether each key component of the intervention was implemented with fidelity. In addition, we will ask evaluators to report annual fidelity estimates for each key component so that the i3 AR Team can assess whether or not the intervention component was implemented with fidelity. We will state that, overall, an intervention was implemented with fidelity if the evaluator reports that the thresholds for fidelity were met for the majority of the intervention’s key components in at least 50 percent of the years of measurement.
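
The decision rule described above can be summarized in a short sketch. This is illustrative only and assumes that each evaluator’s submission can be reduced to a yes/no fidelity determination for each key component in each year of measurement; the component names and values are hypothetical.

```python
def implemented_with_fidelity(component_met_by_year):
    """component_met_by_year: dict mapping year -> {component_name: True/False}.
    Returns True if the fidelity thresholds were met for a majority of the
    intervention's key components in at least 50 percent of the measured years."""
    years_meeting_rule = 0
    for year, components in component_met_by_year.items():
        met = sum(1 for ok in components.values() if ok)
        if met > len(components) / 2:            # majority of key components met this year
            years_meeting_rule += 1
    return years_meeting_rule >= len(component_met_by_year) / 2   # at least half of the years

# Hypothetical two-year, three-component example.
report = {
    2016: {"training": True, "coaching": True, "materials": False},
    2017: {"training": True, "coaching": False, "materials": False},
}
print(implemented_with_fidelity(report))   # True: 2016 meets the majority rule, 2017 does not
```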


Q5. For each i3 grant, what were the effects of the intervention on/promise of the intervention to improve educational outcomes?

We will convert all of the reported impact estimates into effect sizes by following WWC guidelines, and report these effect sizes and their statistical significance.
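
For a continuous outcome, the conversion to an effect size would look roughly like the following sketch, which uses Hedges’ g with a small-sample correction, one of the effect-size computations described in WWC guidance. Cluster-level assignment and dichotomous outcomes involve additional adjustments that are omitted here, and the input values are hypothetical.

```python
from math import sqrt

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Hedges' g: standardized mean difference with a small-sample correction."""
    pooled_sd = sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd
    omega = 1 - 3 / (4 * (n_t + n_c) - 9)   # small-sample correction factor
    return omega * d

# Hypothetical impact estimate reported by an evaluator.
print(round(hedges_g(mean_t=75.0, mean_c=72.0, sd_t=12.0, sd_c=11.5, n_t=150, n_c=145), 3))
```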


Goal 3: Assess the strength of the evidence produced by each i3 evaluation.


Q6. For each i3 grant, how strong is the evidence on the causal link between the intervention and its intended student outcomes?

Q6a. For Scale-Up and Validation grants, did the evaluation provide evidence on the effects of the intervention?

Our WWC-certified reviewers will apply the most recent WWC handbook rules to determine whether the evaluation meets standards without reservations, meets standards with reservations, or does not meet standards. For this reason, the data collection includes all elements that would be required to complete a WWC Study Review Guide (SRG).

Q6b. For Development grants, did the evaluation produce evidence on whether the intervention warrants a more rigorous study of the intervention’s effects? For Development grants, if applicable, did the evaluation provide evidence on the outcomes of the intervention?

Our WWC-certified reviewers will apply the most recent WWC handbook rules to determine whether the evaluation meets standards without reservations, meets standards with reservations, or does not meet standards. If it does not meet standards, the WWC-certified reviewers will determine that an evaluation provides evidence on the intervention’s promise of improving student outcomes if:

      • The evaluation uses a Randomized Controlled Trial, a Regression Discontinuity Design, a Quasi-Experimental Design, an Interrupted Time Series Design without a comparison group, or a pre-post design without a comparison group.7

      • The outcome measure meets WWC standards.

Goal 4: Identify effective and promising interventions.


Q7. Which i3-funded interventions were found to be effective at improving student outcomes?

Effective interventions will be defined as those with evidence of positive and significant effects based on a study that meets WWC evidence standards. This analysis requires all of the data found in a completed WWC Study Review Guide (SRG). The evidence will indicate a positive effect if for any domain, at least one confirmatory impact estimate in the domain is positive and statistically significant, after correcting for multiple comparisons, and no estimated effects for any confirmatory contrasts in the domain are negative and statistically significant.
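
The domain-level decision rule described above (which also applies, with “effects” read as “outcomes,” to the definition of promising interventions under Q8 below) can be sketched as follows. The text says only that impact estimates are corrected for multiple comparisons; the Benjamini-Hochberg procedure used here is one common choice and is an assumption, as are the example values.

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Flag which p-values remain significant after a Benjamini-Hochberg correction."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    max_k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank * alpha / m:
            max_k = rank                      # largest rank satisfying the BH condition
    significant = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= max_k:
            significant[i] = True
    return significant

def domain_shows_positive_effect(effects, pvalues, alpha=0.05):
    """effects: signed confirmatory impact estimates for one domain.
    Positive finding: at least one positive, significant estimate after correction,
    and no negative, significant estimates."""
    sig = benjamini_hochberg(pvalues, alpha)
    any_positive = any(e > 0 and s for e, s in zip(effects, sig))
    any_negative = any(e < 0 and s for e, s in zip(effects, sig))
    return any_positive and not any_negative

# Hypothetical domain with three confirmatory contrasts.
print(domain_shows_positive_effect(effects=[0.15, 0.02, -0.05], pvalues=[0.01, 0.40, 0.30]))
```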

Q8. Which i3-funded interventions were found to be promising at improving student outcomes?

Promising interventions will be defined as Development Grant-funded interventions with evidence of positive and significant outcomes based on a study with a rating of Provides Evidence of the Intervention’s Promise for Improving Outcomes. The evidence will indicate a positive outcome if for any domain, at least one confirmatory impact estimate in the domain is positive and statistically significant, after correcting for multiple comparisons, and no estimated outcomes for any confirmatory contrasts in the domain are negative and statistically significant.






Goal 5: Assess the results of the i3 Program.


Q9. How successful were the i3-funded interventions?


Q9a. What fraction of interventions was found to be effective or promising?

We will use basic tabulation methods to compute the fraction of interventions that were found to be effective or promising.

Q9b. What fraction of interventions produced evidence that met WWC evidence standards or i3 criteria?

We will use basic tabulation methods to compute the fraction of interventions whose evaluations were found to meet WWC evidence standards without reservations, meet WWC evidence standards with reservations, provide evidence of the intervention’s promise for improving outcomes, or not meet standards.

Q9c. What fraction of interventions produced credible evidence of implementation fidelity?


We will use basic tabulation methods to compute the fraction of interventions that were found to produce credible evidence of implementation fidelity.

Q9d. Did Scale-Up grants succeed in scaling-up their interventions as planned?

We will describe each Scale-Up grant’s scale-up goals and compare those goals to the scale-up efforts that took place, based on the evaluator’s reports.




A.16.2 Plans for publication and study schedule

Each year, the i3 Technical Assistance and Evaluation Project will produce a set of summary findings tables using data from any grants that participated in data collection in the prior year. This means that the content of these tables is driven by the progress made by the independent evaluators. Exhibit 6, below, summarizes the timing of data collection and the production of findings summary tables for data collected between February 2018 and January 2021. The schedule assumes OMB approval will be received by January 31, 2018.

Exhibit 6. Schedule of Reports

Grant End Date | Timing of Data Collection | Timing of Findings Summary Tables
Grants ending in 2018 (37 grants) | Before December 31, 2018 | June (draft)/July (final) 2019
Grants ending in 2019 (18 grants) | Before December 31, 2019 | June (draft)/July (final) 2020
Grants ending in 2020 (14 grants) | Before December 31, 2020 | June (draft)/July (final) 2021



A.17 Approval to Not Display Expiration Date

No exemption is requested. The data collection instruments will display the expiration date.



A.18 Exceptions to Item 19 of OMB Form 83-1

This submission requires no exceptions to the Certification for Paperwork Reduction Act Submissions (5 CFR 1320.9).



1 The partners are ANALYTICA, Century Analytics, Measured Decisions Inc., and Mike Puma Associates, LLC.

2 It is important to note that the assessment of whether the independent evaluations are well-designed and well-implemented forms the basis for some of the Government Performance and Results Act (GPRA) measures for the i3 program.

3 The i3 Promise criteria operationalize OII’s GPRA requirements, which distinguish between Development and Validation/Scale-up grants in their evidence requirements. These Promise criteria were developed to assess the quality of evaluations of Development grants that do not meet WWC evidence standards (with or without reservations). These criteria are intended to determine the extent to which the interventions funded by Development grants provide evidence of “promise” for improving student outcomes and may be ready for a more rigorous test of their effectiveness.

5 This information is collected from documents previously provided by evaluators to the AR team.

6 We expect that some projects will use more than one research design. In order to be clear, in this plan we refer only to a single design per evaluation.

7 The i3 Technical Assistance and Evaluation Project will treat an Interrupted Time Series Design or a pre-post design with a comparison group as a Quasi-Experimental Design.

