NEi3 OMB SS Part B 10-30-14

National Evaluation of the Investing in Innovation (i3) Program

OMB: 1850-0913

Supporting Statement for Information Collection Request—Part B




National Evaluation of the Investing in Innovation (i3) Program




October 22, 2014




Prepared for:

Tracy Rimdzius

Institute of Education Sciences

U.S. Department of Education

555 New Jersey Avenue, NW

Washington, DC 20208




Submitted by:

Abt Associates

55 Wheeler Street

Cambridge, MA 02138





B. Collections of Information Employing Statistical Methods

The National Evaluation of the Investing in Innovation (i3) Program is being conducted by Abt Associates Inc. and its subcontractors, Dillon-Goodson Research Associates, Chesapeake Research Associates, Analytica, Westat, Century Analytics, and American Institutes for Research (the i3 Evaluation Team) for the Institute of Education Sciences (IES), U.S. Department of Education. As indicated in Part A of this submission, the National Evaluation of i3 (NEi3) has two broad goals: (1) to provide technical assistance (TA) to support the independent evaluators of i3 projects in conducting evaluations that are well-designed and well-implemented and (2) to summarize the strength of the evidence and findings from the independent evaluations.

Data collection will be required in order to address the second goal of summarizing the quality of and findings from the individual evaluations being conducted by independent evaluators contracted by each i3 grantee. Part B describes the “Collection of Information Employing Statistical Methods” and Part A, submitted under separate cover, describes the justification for these data collection activities.

B.1 Respondent Universe and Sampling Methods

The NEi3 will collect data from the universe of all i3 projects currently funded under the i3 Program. We anticipate that the independent evaluators of all i3 projects will participate in one of the three data collections included in this clearance package, or in one of two additional data collections (to be included in a future OMB clearance package), since cooperation with the NEi3 is a requirement for all i3 grantees. In the event that an evaluation is not received by the NEi3, the NEi3 will not impute information about the missing evaluation nor will it conduct any statistical adjustment in its reporting on the overall outcomes of the i3 Program.

B.2 Description of Procedures for the Collection of Information

In this section we briefly describe the procedures for the collection of information during each of the annual surveys for the NEi3, which will begin in January 2015. The information for this study is being collected on behalf of the Institute of Education Sciences (IES), U.S. Department of Education by Abt Associates Inc. (Abt) and its subcontractors. Abt and its subcontractors have developed the survey instruments that will be used during each annual data collection period. The specific staff working on the data collection, analysis and report writing activities are known as the Analysis and Reporting (AR) team.

The required data collection will call for each i3 evaluator to complete a Primary Survey; some evaluators will choose to submit preliminary findings through an Early Survey and some evaluators will complete a Follow-Up Survey in order to provide additional or updated findings. Each of the three survey types will use the same survey template. In order to minimize the burden on survey respondents and maximize survey response, the AR Team will pre-fill survey item responses with information from existing materials (reports, conference presentations, etc.) that report evaluators’ findings. Evaluators will be asked to submit these materials to the AR Team before being asked to complete the survey.

Surveys will be administered online. Evaluators will be given a unique password to log onto a secure website to complete their survey. The AR Team will run a help desk with a toll-free phone number and email address through which evaluators can ask questions about technical issues or survey content.

Each i3 evaluator will be required to participate in at least one of the five data collection periods planned by the NEi3. During the three data collection periods covered by this clearance package, we anticipate that i3 evaluators in the FY 2010, FY 2011, and FY 2012 cohorts will complete at least one survey. We also expect about twenty percent of FY 2013 cohort evaluators to participate in an Early Data Collection during this period. While annual data collection is not required, evaluators that would like the NEi3 to report their latest findings will be invited to submit data on an annual basis. This study will summarize the results of all i3 evaluations, so no sample selection or stratification survey methods are applicable. We will not adopt specialized sampling methods in the event of unusual data collection problems; rather, we will analyze and report solely the evaluation findings reported to us.

An important goal of the NEi3 is to ensure that the field learns more from the evaluations than could be learned from a separate assessment of each evaluation. The NEi3 will describe the i3 interventions, assess the strength of the evidence from each evaluation, present the findings from the i3 independent evaluations and conduct a cross-study analysis to assess the overall results of the i3 program.

The accumulated evidence from i3 independent evaluations provides an opportunity to learn about the effects/outcomes of different types of interventions. The NEi3 will:

  • Review evaluators’ written descriptions of the intervention and its components, intermediate outcomes, and ultimate outcome domains and summarize them succinctly and with a common lexicon.

  • Apply the most recent What Works Clearinghouse (WWC) evidence standards to review each evaluation, employing WWC-certified reviewers to conduct the reviews. These reviews can involve simple statistical procedures, such as adjustments for clustering or for multiple comparisons, or the computation of effect sizes; however, these methods are common and will follow the latest WWC guidelines.

  • Apply simple decision rules to assess the independence, adherence to scientific process, implementation fidelity, and relevance of each evaluation:

    • The NEi3 will assess whether the data collection, analysis, and reporting efforts were independent based on whether the parties conducting the data collection, analysis, and reporting efforts were the same parties responsible for the development or ultimate success of the intervention.

    • The NEi3 will conclude that an evaluation adhered to the scientific process if the evaluator submitted an evaluation plan prior to collecting outcome data; if the evaluation report addresses all of the confirmatory, or primary, research questions specified a priori; and if the evaluation methods used match the methods described in the evaluation plan.

    • The NEi3 will ask evaluators to report their criteria for assessing whether each component of the intervention was implemented with fidelity. In addition, we will ask evaluators to report annual fidelity estimates so that the NEi3 can assess whether the intervention was implemented with fidelity. We will state that an intervention was implemented with fidelity in a given year if the evaluator reports that the thresholds for fidelity were met for that year.


    • The NEi3 will review the survey responses of evaluators that compare the evaluation sample (the sample of students/teachers/schools for which the impact estimates were computed) and the implementation sample (the sample of students/teachers/schools for which implementation fidelity was assessed). If the evaluation sample is a non-representative subset of the implementation sample, the NEi3 will caveat the reported findings with a short statement summarizing the misalignment.


  • Tabulate the ratings, findings, intervention types, student types, and outcome domains across all of the studies.
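The simple statistical procedures noted above for the WWC-style reviews — computing effect sizes and adjusting for clustering — can be illustrated with a minimal sketch. This is not the official WWC procedure; the formulas below are simplified textbook versions (Hedges' g with its small-sample correction, and a clustering deflation of the t-statistic based on the intraclass correlation), and all function names are illustrative.

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across treatment and comparison groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    # Small-sample correction factor
    omega = 1 - 3 / (4 * (n_t + n_c) - 9)
    return omega * d

def cluster_adjusted_t(t, n_total, avg_cluster_size, icc):
    """Deflate a t-statistic for clustering (a simplified version of the
    WWC-style correction; icc is the intraclass correlation)."""
    num = (n_total - 2) - 2 * (avg_cluster_size - 1) * icc
    den = (n_total - 2) * (1 + (avg_cluster_size - 1) * icc)
    return t * math.sqrt(num / den)
```

For example, a 5-point difference on an outcome with pooled standard deviation 10 yields g ≈ 0.50, and a t-statistic of 2.5 from a sample of 200 students in clusters of 20 with an ICC of 0.10 deflates to roughly 1.45 — no longer significant, which is exactly why the clustering adjustment matters.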

To help school and district officials identify effective interventions, we will produce lists of interventions with positive effects/outcomes. Since educational interventions are typically designed to improve outcomes in one or more domains, and are targeted for students at particular grade levels, we will list interventions by ultimate outcome domain and educational level (e.g., reading achievement for students in elementary school). We will produce two key types of lists:

  • One list will include the names of the i3-funded interventions that produced positive effects, based on i3 studies that meet WWC evidence standards (with or without reservations). This list will answer Q7: Which i3-funded interventions were found to be effective at improving student or teacher outcomes?

  • The second list will include the names of the i3-funded interventions with positive outcomes, based on i3 studies that Meet i3 Criteria for Providing Evidence of Promise1. This list will answer Q8: Which i3-funded interventions were found to be promising at improving student or teacher outcomes?
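The grouping of interventions by ultimate outcome domain and educational level described above could be implemented along the following lines. The record fields, rating labels, and intervention names are hypothetical, not drawn from the NEi3 data.

```python
from collections import defaultdict

# Hypothetical evaluation records; field names and values are illustrative.
findings = [
    {"intervention": "Intervention A", "domain": "reading achievement",
     "level": "elementary", "meets_wwc": True, "positive": True},
    {"intervention": "Intervention B", "domain": "reading achievement",
     "level": "elementary", "meets_wwc": False, "positive": True},
    {"intervention": "Intervention C", "domain": "math achievement",
     "level": "middle school", "meets_wwc": True, "positive": False},
]

def list_effective(records):
    """List interventions with positive effects that meet WWC evidence
    standards, grouped by (outcome domain, educational level)."""
    grouped = defaultdict(list)
    for r in records:
        if r["positive"] and r["meets_wwc"]:
            grouped[(r["domain"], r["level"])].append(r["intervention"])
    return dict(grouped)
```

With these sample records, only Intervention A appears on the "effective" list, under (reading achievement, elementary); the same grouping applied with the Promise criteria instead of the WWC flag would produce the second list.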

In defining what constitutes evidence of a positive effect, we will adopt the convention from the WWC. In particular, we will consider an impact estimate as evidence of a positive effect if the impact estimate is positive and statistically significant at the 5-percent level, after accounting for clustering and multiple comparisons. For more details on our plan for clustering and multiple comparisons adjustments, see the WWC Evidence Standards.2

We also propose to inform federal policymakers by summarizing the contribution of the i3 program. From an evaluation perspective, the contribution of i3 can be measured in terms of the amount of useful research that it produced, measured by the percentage of studies that meet reasonable standards for demonstrating the effectiveness or promise of the intervention. To characterize this contribution, we propose to measure and report the number and share of grants, and of i3 funding, that meet WWC evidence standards and i3 criteria and that produced credible evidence of implementation fidelity. From a policy perspective, the contribution of i3 can be measured in terms of the effective interventions that were implemented by i3 grantees. To characterize this contribution, we will calculate and present the number and share of i3 evaluations that implemented an effective intervention.
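As one illustration of a multiple-comparisons adjustment of the kind referenced above, the sketch below applies the Benjamini-Hochberg step-up procedure at the 5-percent level to a set of p-values from related impact estimates. It is a simplified stand-in for exposition, not the exact adjustment specified in the WWC Evidence Standards.

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Return a parallel list of booleans marking which impact estimates
    remain statistically significant after the Benjamini-Hochberg
    correction for multiple comparisons."""
    m = len(p_values)
    # Sort p-values, remembering their original positions
    order = sorted(range(m), key=lambda i: p_values[i])
    significant = [False] * m
    k_max = 0  # largest rank k such that p_(k) <= (k / m) * alpha
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * alpha:
            k_max = rank
    # All estimates at or below rank k_max are declared significant
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            significant[i] = True
    return significant
```

For example, with three confirmatory contrasts yielding p-values of 0.001, 0.049, and 0.20, only the first survives the correction — the 0.049 result, nominally significant at the 5-percent level, does not — illustrating how the adjustment guards against spurious positive findings.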

B.3 Description of Procedures for Maximizing Response Rates

In this section, we describe the strategies that will be used to maximize response rates and deal with issues of nonresponse. Section B.2 describes the procedures that we will use to implement the annual surveys. These procedures were developed to encourage cooperation and completion of the survey within the data collection period.

Procedures will involve obtaining the support of TA liaisons in encouraging their assigned independent evaluators to complete the survey. As described in Section A of this OMB submission, the TA liaisons’ role is to maintain relationships with the independent evaluators of all of the i3 projects throughout their tenure and provide the independent evaluators valuable suggestions, information, and resources that will support and strengthen their evaluations. As a result of this strong, ongoing relationship, we expect that the TA liaisons will be a key factor in maximizing the response rates of the independent evaluators in the annual surveys.

The study team will also enlist the support of the OII program staff to contact the i3 grantees about independent evaluators who are nonresponsive. The OII provides funding for the i3 grantees and requires them to cooperate with the National i3 Evaluation, and therefore may be in the best position to influence i3 grantees with incomplete surveys.

Exhibit B.1 highlights these steps and other specific strategies we will employ to maximize survey response rates and deal with issues of nonresponse.

Exhibit B.1: Strategies to Maximize Survey Response

Advance notification of survey

  • Gain support and cooperation of respondents by providing advance notice of the survey

Provide clear instructions and user-friendly materials

  • Send the survey with a cover page that includes the purpose of the study and the provisions to protect respondents’ privacy and confidentiality; a toll-free telephone number, answered by a member of the AR team, to call with questions; and the web address for the online survey along with instructions for completing it.

Offer to pre-fill surveys with information provided by respondents

  • Invite respondents to submit relevant information in whatever format they already have prepared for their own uses (e.g., tables of sample information) prior to receiving surveys so that the AR Team can pre-fill information into the surveys for respondents’ review.

Offer IT assistance for survey respondents

  • Provide toll-free IT assistance telephone number and email address.

  • Provide study website with instructions for web-based survey completion.

Monitor progress regularly

  • Produce bi-weekly data collection report of completed surveys.

  • Maintain regular contact between study team members to monitor response rates, identify nonrespondents, and resolve problems.

  • Conduct follow-up and reminder calls and e-mails to nonrespondents.

Obtain support of TA liaisons and OII staff

  • For nonresponsive independent evaluators: contact the TA liaison and OII program staff working with these grantees and ask them to intervene.


We expect that 92 grants (all 49 FY 2010 grants, all 23 FY 2011 grants, and all 20 FY 2012 grants) will participate in Primary Data Collection and 38 grants (40 percent of the FY 2010 grants, 40 percent of the FY 2011 grants, 20 percent of the FY 2012 grants, and 20 percent of the FY 2013 grants) will participate in Early or Follow-Up Data Collection. Among all grants that we expect to participate in data collection during the period covered by this OMB package, we anticipate a 100 percent response rate.

B.4 Description of Tests, Procedures and Methods

During the 60-day comment period, the survey was pilot tested by Abt staff and one FY 2010 cohort grant. The pilot test helped ensure that the instructions were understandable, the content was comprehensible, and the technology functioned appropriately. In response to the pilot test, we clarified some of the survey questions and instructions and added programming to the survey. While our pilot test was limited, it did help us identify sections of the survey that are not applicable to some evaluations so that we could include skip patterns to reduce burden in those cases. Further, the pilot provided a more accurate indication of the burden experienced by people taking the survey and allowed us to revise our burden estimates accordingly.

B.5 Individuals Consulted on Statistical Aspects of the Design

The NEi3 is being conducted by Abt Associates Inc., and its subcontractors, Dillon-Goodson Research Associates, Chesapeake Research Associates, Analytica, Westat, Century Analytics, and American Institutes for Research (the i3 Evaluation Team) for the Institute of Education Sciences (IES), U.S. Department of Education. With IES oversight, the contractors for the evaluation are responsible for study design, instrument development, data collection, analysis, and report preparation.

The individuals listed in Exhibit B.2 worked closely in developing the survey instruments and will have primary responsibility for the data collection and data analysis. Contact information for these individuals is provided below.

Exhibit B.2: Key Individuals Consulted

Name                Title/Affiliation                                            Telephone
Beth Boulay         Project Director, Abt Associates                             617-520-2903
Barbara Goodson     Principal Investigator, Dillon-Goodson Research Associates   617-349-2811
Rachel McCormick    Deputy Project Director, Abt Associates                      617-945-6506
Rob Olsen           Analysis and Reporting Leader, Abt Associates                301-634-1716
Katie Gan           Director of Analysis, Abt Associates                         301-347-5546
Judy Geyer          Senior Analyst, Abt Associates                               617-520-2972
Catherine Darrow    Associate, Abt Associates                                    617-520-3034



1 The i3 Promise criteria operationalize OII’s GPRA requirements, which distinguish between Development and Validation/Scale-Up grants in their evidence requirements. These Promise criteria were developed to assess the quality of evaluations of Development grants that do not meet WWC evidence standards (with or without reservations). These criteria are intended to determine the extent to which the interventions funded by Development grants provide evidence of “promise” for improving student outcomes and may be ready for a more rigorous test of their effectiveness.



