Supporting_Statement_Part_B_revised 1-8-2018_clean

National Evaluation of the Investing in Innovation (i3) Program

OMB: 1850-0913




Supporting Statement for Information Collection Request—Part B



Investing in Innovation (i3) Technical Assistance and Evaluation Project




January 8, 2018




Prepared for:

Tracy Rimdzius

Institute of Education Sciences

U.S. Department of Education

550 12th Street, SW

Washington, DC 20024




Submitted by:

Abt Associates

55 Wheeler Street

Cambridge, MA 02138





  B. Collections of Information Employing Statistical Methods

The Investing in Innovation (i3) Technical Assistance and Evaluation Project is being conducted by Abt Associates Inc. and its subcontractors, ANALYTICA, Measured Decisions Inc., and Mike Puma Associates, LLC (the i3 Evaluation Team) for the Institute of Education Sciences (IES), U.S. Department of Education. As indicated in Part A of this submission, the i3 Technical Assistance and Evaluation Project has two broad goals: (1) to provide technical assistance (TA) to support the independent evaluators of i3 projects in conducting evaluations that are well-designed and well-implemented and (2) to summarize the strength of the evidence and findings from the independent evaluations.

Data collection will be required in order to address the second goal of summarizing the quality of and findings from the individual evaluations being conducted by independent evaluators contracted by each i3 grantee. Part B describes the “Collection of Information Employing Statistical Methods” and Part A, submitted under separate cover, describes the justification for these data collection activities.

    B.1 Respondent Universe and Sampling Methods

The i3 Technical Assistance and Evaluation Project has collected or will collect data from the universe of all i3 projects funded under the i3 Program. We have already collected data from the independent evaluators of 96 grants from the FY 2010 through FY 2014 cohorts. Because cooperation with the i3 Technical Assistance and Evaluation Project is a requirement for all i3 grantees, we anticipate that the evaluators of approximately 69 projects will participate in the data collection included in this clearance package; the approximately seven remaining projects will participate in data collection covered by a future OMB clearance package. In the event that an evaluation is not received by the i3 Technical Assistance and Evaluation Project, the Project will not impute information about the missing evaluation, nor will it conduct any statistical adjustment in its reporting on the overall outcomes of the i3 Program.

    B.2 Information Collection and Estimation Procedures

In this section we briefly describe the procedures for the collection of information for the i3 Technical Assistance and Evaluation Project. The information for this study is being collected on behalf of the Institute of Education Sciences (IES), U.S. Department of Education by Abt Associates Inc. (Abt) and its subcontractors. The specific staff working on the data collection, analysis, and report writing activities are known as the Analysis and Reporting (AR) team.

The required data collection will call for each i3 evaluator to either (1) complete the data collection instrument shown in Appendix A or (2) submit a report (or other documents with sufficient data that the i3 Evaluation Team can complete the instrument based on the information provided) and complete questions 3, 4, 5, and 7 in the background section of the data collection instrument shown in Appendix A. Evaluators will be asked to submit findings once. During the data collection period covered by the prior clearance package, i3 evaluators in the FY 2010 through FY 2014 cohorts completed data collection for approximately 96 grants. During the data collection period covered by this clearance package, we expect evaluators for an additional 69 grants to complete data collection. Evaluators for the remaining seven grants are expected to complete data collection covered by a future OMB clearance package.

This study will summarize the results of all i3 evaluations, so no sample selection or stratification survey methods are applicable. We will not adopt specialized sampling methods in the event of unusual data collection problems; rather, we will analyze and report solely the evaluation findings reported to us.

An important goal of the i3 Technical Assistance and Evaluation Project is to ensure that the field learns more from the evaluations than could be learned from a separate assessment of each evaluation. Therefore, the Project will describe the i3 interventions, assess the strength of the evidence from each evaluation, present the findings from the i3 independent evaluations, and conduct a cross-study analysis to assess the overall results of the i3 program.

The accumulated evidence from i3 independent evaluations provides an opportunity to learn about the effects/outcomes of the i3-funded interventions. The i3 Technical Assistance and Evaluation Project will:

  • Review evaluators’ written descriptions of the intervention and its components, intermediate outcomes, and ultimate outcomes and summarize them succinctly and with a common lexicon.

  • Apply the most recent What Works Clearinghouse™ (WWC) Evidence Standards to review each evaluation, employing WWC-certified reviewers to conduct the reviews. This practice can involve some simple statistical procedures, such as the need to apply adjustments for clustering or for multiple comparisons, or to compute effect sizes; however, these methods are common and will follow the latest WWC guidelines.

  • Apply simple decision rules to assess the independence, adherence to scientific process, implementation fidelity, and relevance of each evaluation:

    • The AR Team will assess whether the data collection, analysis, and reporting efforts were independent based on whether the parties conducting the data collection, analysis, and reporting efforts were the same parties responsible for the development or ultimate success of the intervention.

    • The AR Team will conclude that an evaluation adhered to the scientific process if the evaluator submitted an evaluation plan prior to collecting outcome data and the evaluation findings address all of the confirmatory, or primary, research questions specified a priori.

    • The AR Team will ask evaluators to report their criteria for assessing whether each intervention component was implemented with fidelity. In addition, we will ask evaluators to report annual fidelity estimates for each intervention component so that the AR Team can assess whether the component was implemented with fidelity. We will state that, overall, an intervention was implemented with fidelity if the thresholds for fidelity were met for the majority of the intervention’s key components in at least 50 percent of the years of measurement.


    • The AR Team will compare the sample of schools and students included in the impact evaluation to the schools and students served by the i3-funded intervention. The evaluation sample will be considered representative of those served if (1) less than 25 percent of districts or schools served are excluded from the impact evaluation and (2) less than 10 percent of teachers or students served by the intervention are excluded from the impact evaluation based on characteristics that are expected to be related to outcomes. These characteristics include but are not limited to teacher experience, prior student achievement, prior academic performance, race or ethnicity, income (e.g., eligibility for free or reduced-price lunch), English language proficiency, and special education status.


  • Tabulate the ratings, findings, educational levels (e.g., elementary school, middle school), and outcome domains across all of the studies.
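The fidelity and representativeness decision rules above reduce to simple threshold arithmetic. The sketch below is illustrative only: the function names and data structures are hypothetical, and it adopts one plausible reading of the fidelity rule (a component counts if it met its threshold in at least 50 percent of measurement years, and the intervention passes if a majority of key components count).

```python
def implemented_with_fidelity(annual_component_fidelity):
    """annual_component_fidelity: hypothetical dict mapping each key
    component to a list of booleans, one per measurement year, indicating
    whether that component met its evaluator-defined fidelity threshold.

    Rule (one plausible reading of the text): the intervention as a whole
    was implemented with fidelity if a majority of key components met
    their thresholds in at least 50 percent of the measurement years."""
    components_ok = 0
    for years_met in annual_component_fidelity.values():
        share_of_years = sum(years_met) / len(years_met)
        if share_of_years >= 0.5:
            components_ok += 1
    return components_ok > len(annual_component_fidelity) / 2


def sample_is_representative(pct_units_excluded, pct_individuals_excluded):
    """Rule from the text: the evaluation sample is representative if
    (1) less than 25 percent of districts or schools served are excluded
    from the impact evaluation, and (2) less than 10 percent of teachers
    or students served are excluded based on characteristics expected to
    be related to outcomes."""
    return pct_units_excluded < 25 and pct_individuals_excluded < 10
```

The thresholds (50, 25, and 10 percent) are the ones stated in the decision rules; everything else in the sketch is an assumed encoding.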

We will produce a list of the interventions for which the i3 evaluation found positive effects on student academic outcomes. The list will include the names of the i3-funded interventions that produced positive effects, based on i3 evaluations that meet WWC evidence standards (with or without reservations). This list will answer Q7: Which i3-funded interventions were found to be effective at improving student outcomes?

In defining what constitutes evidence of a positive effect, we will adopt the convention from the WWC. In particular, we will consider an impact estimate as evidence of a positive effect if the impact estimate is positive and statistically significant at the 5-percent level, after accounting for clustering and multiple comparisons. For more details on our plan for clustering and multiple comparisons adjustments, see the WWC Evidence Standards.1

We propose to inform federal policymakers by summarizing the contribution of the i3 program. From an evaluation perspective, the contribution of i3 can be measured in terms of the amount of useful research that it produced, measured by the percentage of studies that meet reasonable standards for demonstrating the effectiveness or promise of the intervention. From a policy perspective, the contribution of i3 can be measured in terms of the effective interventions that were implemented by i3 grantees. To characterize the contribution of i3 from a policy perspective, we will calculate and present the number and share of i3 evaluations that implemented an effective intervention. To characterize the contribution of i3 from an evaluation perspective, we will measure and report the number and share of grants and of i3 funding that meet WWC evidence standards and i3 criteria and that produced credible evidence of implementation fidelity.

    B.3 Methods for Maximizing the Response Rate and Addressing Issues of Nonresponse

In this section, we describe the strategies that will be used to maximize response rates and deal with issues of nonresponse. Section B.2 describes the procedures that we use to collect data from evaluators. These procedures were developed to minimize burden and encourage cooperation and completion of the data collection process by the end of each grant’s funding period.

Data collection procedures involve obtaining the support of TA liaisons who ensure that evaluators are systematically collecting and recording the information contained in the data collection instrument. As described in Section A of this OMB submission, the TA liaisons’ role is to maintain relationships with the independent evaluators of all of the i3 projects throughout their tenure and provide the independent evaluators valuable suggestions, information, and resources that will support and strengthen their evaluations. As a result of this strong, ongoing relationship, the TA liaisons are a key factor in ensuring that evaluators collect information contained in the data collection instrument and complete these instruments (or submit a report), thereby maximizing the response rates of the independent evaluators in the i3 Technical Assistance and Evaluation Project’s data collection.

The study team will also enlist the support of the Office of Innovation and Improvement (OII) program staff to contact the i3 grantees about independent evaluators who are nonresponsive. OII provides funding for the i3 grantees and requires i3 grantees to cooperate with the i3 Technical Assistance and Evaluation Project and, therefore, may be in the best position to influence i3 grantees with incomplete data. To date, only one i3 grantee from among completed grants has not shared all of the data requested by the i3 AR Team.

Exhibit B.1 highlights these steps and other specific strategies we will employ to maximize response rates and deal with issues of nonresponse.

Exhibit B.1: Strategies to Maximize Survey Response

Advance notification of data collection

  • Gain support and cooperation of respondents by providing advance notice of the data collection process and asking respondents to determine when they will submit their findings.

Provide clear instructions and user-friendly materials

  • TA liaisons will suggest using standardized tools and templates to track the progress of the evaluation.

Offer to review any information provided by respondents

  • If evaluators are not completing the data collection instrument, invite them to submit relevant information through a report, which they are required by their grant to produce.

Monitor progress regularly

  • Produce monthly reports of completed data collection.

  • Maintain regular contact between study team members to monitor response rates, identify nonrespondents, and resolve problems.

  • Conduct follow-up and reminder calls and e-mails to nonrespondents, as necessary.

Obtain support of TA liaisons and OII staff

  • For nonresponsive independent evaluators: contact the TA liaison and OII program staff working with these grantees and ask them to intervene.



Ninety-six of 172 grants have already participated in data collection; for these grants, the response rate was 99 percent. The 96 grants include 49 from the FY 2010 cohort, 21 from the FY 2011 cohort, 17 from the FY 2012 cohort, 8 from the FY 2013 cohort, and 1 from the FY 2014 cohort. Therefore, among the grants that we expect to participate in data collection during the period covered by this OMB package, we anticipate a 100 percent response rate.

    B.4 Tests of Procedures or Methods

The data collection instruments and procedures have already been used with evaluators participating in data collection covered by the prior OMB package. Prior to use with all evaluators, the instrument was pilot tested by Abt staff and one FY 2010 cohort grant. The pilot test helped to ensure that the instructions were understandable, the content was comprehensible, and the technology functioned appropriately. In response to our pilot testing, we clarified some of the instrument’s questions and instructions and added additional programming to the survey. While our pilot test was limited, it did help us identify sections of the instrument that are not applicable to some evaluations so that we could include skip patterns to reduce burden in those cases. Further, the pilot provided a more accurate indication of the burden experienced by people completing the instrument and allowed us to revise our burden estimates accordingly.

    B.5 Individuals Consulted on Statistical Aspects of the Design

The i3 Technical Assistance and Evaluation Project is being conducted by Abt Associates Inc., and its subcontractors, ANALYTICA, Century Analytics, Measured Decisions Inc., and Mike Puma Associates, LLC (the i3 Evaluation Team) for the Institute of Education Sciences (IES), U.S. Department of Education. With IES oversight, the contractors for the evaluation are responsible for study design, instrument development, data collection, analysis, and report preparation.

The individuals listed in Exhibit B.2 worked closely in developing the data collection instruments and will have primary responsibility for the data collection and data analysis. Contact information for these individuals is provided below.

Exhibit B.2: Key Individuals Consulted

Name               Title/Affiliation                              Telephone
Beth Boulay        Project Director, Abt Associates               617-520-2903
Barbara Goodson    Principal Investigator, Abt Associates         617-349-2811
Rachel McCormick   Deputy Project Director, Abt Associates        617-520-2668
Rob Olsen          Analysis and Reporting Leader, Rob Olsen, LLC  240-461-0503
Eleanor Harvill    Associate, Abt Associates                      301-347-5638
Catherine Darrow   Senior Associate, Abt Associates               617-520-3034





