National Evaluation of School Turnaround AmeriCorps

OMB: 3045-0164

NATIONAL EVALUATION OF SCHOOL TURNAROUND AMERICORPS


SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSIONS

A. Justification

A1.  Need for Information Collection

Background

CNCS and the U.S. Department of Education are collaborating on a new grant program to increase high school graduation, college readiness, and educational attainment for students in the nation’s persistently lowest-achieving schools. Since fall 2013, School Turnaround AmeriCorps has provided AmeriCorps grants to eligible organizations that work with schools receiving School Improvement Grant (SIG) or Priority school funding1. The grants are designed to improve academic outcomes for disadvantaged children in SIG-funded2 schools as the schools implement school improvement strategies. SIG schools must implement one of four school intervention models (turnaround, transformation, restart, or school closure), while Priority schools may implement interventions aligned with the ESEA flexibility turnaround principles. The national evaluation of the School Turnaround AmeriCorps initiative is designed to understand the value added by AmeriCorps members who provide direct services in low-performing schools, above and beyond the school turnaround resources already invested in these schools, and to describe the mechanisms by which this value is achieved.

The national evaluation will contribute toward the evidence and knowledge base associated with this initiative. Results of the evaluation will be used by grantees and staff at CNCS and the Department of Education to strengthen programming, and to document the effects of School Turnaround AmeriCorps and the contributions of AmeriCorps members. The results of the study will inform policy and funding decisions at both agencies.

Program Theory of Change

The goal of the School Turnaround AmeriCorps program is to turn around the lowest-performing schools by improving students’ academic performance, academic engagement and/or attendance, high school graduation rates, and college readiness. Its premise is that AmeriCorps members are particularly well-suited to deliver effective turnaround interventions and achieve the desired student outcomes in eligible schools. The program logic models, shown in Exhibits A-1 and A-2, depict the core activities that define the interventions being implemented by AmeriCorps members that are expected to lead to the intended short-term, intermediate, and long-term outcomes, and the target population for the intervention. They also place the program in context by describing key assumptions that underlie the hypothesized causal relationship between program activities and intended outcomes, and factors (moderators) that may condition the degree to which those activities achieve intended effects.

Both logic models also illustrate how activities funded through the program address multiple student needs, and align with comprehensive school turnaround plans by incorporating at least one of the following six SIG strategies:

  • Providing ongoing mechanisms for family and community engagement.

  • Establishing a school culture and environment that improve school safety, attendance, and discipline and address other non-academic factors that impact student achievement, such as students’ social, emotional, and health needs.

  • Accelerating students’ acquisition of reading and mathematics knowledge and skills.

  • Increasing graduation rates through strategies such as early warning systems, credit-recovery programs, and re-engagement strategies.

  • Increasing college enrollment rates through college preparation counseling, including assistance with completing the Free Application for Federal Student Aid (FAFSA) and college applications, and educating students and their families on financial literacy for college.

  • Supporting school implementation of increased learning time.

The six strategies are aligned with those required of SIG schools in implementing one of the four SIG turnaround models (e.g., transformation, restart), as well as the requirements associated with Priority schools that are implementing the turnaround principles under ESEA flexibility. In addition, these strategies are based on research on turning around the lowest-performing schools.

The program theory of change also recognizes that leveraging community, LEA, and school-level support systems can be an important component in comprehensive turnaround efforts. As such, the School Turnaround AmeriCorps funding guidelines encourage grantees to partner with multiple eligible schools within an LEA and coordinate turnaround efforts among multiple school sites. Implementing the six strategies through a comprehensive and coordinated approach is hypothesized to enable grantees to take advantage of economies of scale, and aid in changing community, LEA, and school cultures.

The program-level logic model in Exhibit A-1 represents a comprehensive view of the resources needed to support the funded activities (AmeriCorps human capital, grant funding, school partnerships) and the full range of activities involved in implementing the program, including both the provision of direct services and management and oversight of AmeriCorps members and school partnerships. The long-term outcomes consist of improved student educational achievement as well as school-level and program-level outcomes that are expected to result from schools’ improved capacity to implement their turnaround plans.

The direct-services logic model in Exhibit A-2 provides a narrower depiction of only those inputs, activities, and outcomes related to direct services provision – AmeriCorps members working directly with students to improve student educational achievement. This “zoomed in” view provides more detail about the types of activities grantees are implementing within the six SIG strategies.


Exhibit A-1: CNCS School Turnaround AmeriCorps Program – Program Level Logic Model



Exhibit A-2: CNCS School Turnaround AmeriCorps Program – Direct Service Provision by AmeriCorps Members Logic Model


Overview of the Evaluation

The goal of the evaluation is to understand the effect that AmeriCorps members have on grantee schools’ capacity to implement their respective turnaround models successfully and to improve key turnaround outcomes. The evaluation will assess the contributions of AmeriCorps members toward the success of turnaround models in low-performing schools in which they provide direct services, and will seek to understand the mechanisms underlying those contributions.

Specifically, the primary lines of inquiry guiding the evaluation are to:

  1. Examine the capacity building strategies, school-level interventions, and direct services that AmeriCorps members deliver to support school turnaround efforts;

  2. Understand how local context may affect program implementation and identify best practices for the School Turnaround AmeriCorps program in terms of supporting schools’ ability to implement their turnaround plans; and

  3. Compare the implementation of school turnaround efforts in AmeriCorps schools to those of a matched group of comparison schools with SIG/Priority funding and no or minimal AmeriCorps presence to provide insights into the perceived effectiveness of the program with respect to the following outcomes:

    • Overall success in school turnaround

    • Academic achievement

    • Students’ socio-emotional health

    • School climate

    • School capacity to implement its turnaround effort


The research design for the national evaluation of School Turnaround AmeriCorps is based upon a quasi-experimental design that compares implementation of turnaround models in SIG and Priority schools with AmeriCorps members to a matched comparison group of schools with little or no AmeriCorps presence. There are 13 grantees across 16 states, working with 62 treatment schools3; the study will attempt to recruit 62 matched comparison schools. This component of the study design will help to isolate the effect of AmeriCorps members’ service in supporting schools’ turnaround efforts and improving their capacity to implement their turnaround plans.


The evaluation is also designed to examine implementation, through collection of primary and secondary data from multiple stakeholder groups to assess multiple perspectives and describe stakeholder perceptions. The evaluation will use multiple analytic approaches—quasi-experimental, qualitative, and descriptive—to synthesize findings across data collection strategies and contextualize the findings appropriately.


Research Questions


The goals of the evaluation are to:

  • Describe how AmeriCorps members are supporting school turnaround efforts;

  • Contrast the implementation of school turnaround efforts at School Turnaround AmeriCorps sites with school turnaround sites that are not supported by AmeriCorps; and

  • Identify best practices for the School Turnaround AmeriCorps program in terms of supporting schools’ ability to implement their turnaround plans.


The study’s guiding research questions include:

  1. How do AmeriCorps members help schools implement their turnaround plans?

    1. How do AmeriCorps grantees work with teachers and other school personnel to identify and target students with whom their members will engage so that the school is more likely to achieve its turnaround goals?

    2. What are the specific direct service activities and school-level interventions that AmeriCorps members conduct at each school and how are those activities believed to support school turnaround?

    3. What are the specific capacity-building strategies that AmeriCorps members contribute to each school? How do school leaders and staff view the role and contributions of AmeriCorps members in building the school’s capacity to implement their turnaround effort? What are the areas in which schools believe AmeriCorps members have the most and least influence over the school’s ability to achieve its turnaround goals, and why? In what ways, if any, does the presence of AmeriCorps members allow school staff or volunteers to modify their activities in ways that might benefit students?

    4. Do the specific activities that AmeriCorps members conduct change over the course of the grant period? To what extent do grantees use data to inform continuous improvement efforts to meet changing needs and improve their interventions?

  2. How and to what extent do School Turnaround AmeriCorps programs adhere to grantees’ program designs across schools or exhibit flexibility to adapt to schools’ needs and local contexts?

    1. Which aspects of grantee-school partnerships appear to be the most promising practices in terms of satisfaction of the school leadership and the participating AmeriCorps members?

    2. What elements of the implementation are sensitive to local contexts and might be difficult to generalize and replicate in other contexts?

    3. Which elements of implementation are potentially replicable in other schools?

  3. Are AmeriCorps members perceived by school leaders and other stakeholders to be more vital in supporting certain SIG/Priority strategies than others? Which activities pursued by AmeriCorps members are perceived as being more or less helpful in supporting schools’ turnaround efforts with respect to the following outcomes, and why?

    1. Overall success in school turnaround

    2. Academic achievement

    3. Students’ socio-emotional health

    4. School climate

    5. School capacity to implement its turnaround effort

Comparing treatment and comparison schools will help in answering these questions and address the overall study goal of understanding the contributions and value-added of placing AmeriCorps resources in low-performing schools above and beyond the school turnaround resources already invested in these schools.


Study Design


To answer these research questions, the national evaluation of School Turnaround AmeriCorps will use a quasi-experimental design that compares low-performing schools (SIG and Priority schools) with School Turnaround AmeriCorps (i.e., the treatment group) to a matched comparison group of low-performing schools without the School Turnaround AmeriCorps initiative (i.e., the comparison group). This research design will attempt to isolate the effects of AmeriCorps members’ service.


As a condition of receiving funding, all 13 funded grantees are required to participate in the national evaluation, and the evaluation will include all these grantees. Most School Turnaround AmeriCorps schools will be included in the evaluation, with the exception of one large grantee for which only a representative sample of schools will be included.4 In total, 62 grantee schools5 will be included in the evaluation, along with a similar number of comparison schools. The comparison group will consist of a matched group of schools with SIG grants or Priority school funding that are not implementing School Turnaround AmeriCorps. Comparison schools will be matched on key characteristics (state, grade level, turnaround model) and will come from within the same Local Educational Agencies (LEAs) as their School Turnaround AmeriCorps counterparts wherever possible. Potentially eligible comparison schools will be selected (among other criteria) on the basis of no or minimal AmeriCorps presence.
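The matching criteria described above can be sketched in code; this is a minimal illustration only, and the field names, data structures, and selection rule are hypothetical rather than the evaluation's actual matching procedure.

```python
# Illustrative sketch of the comparison-school matching described above.
# Field names ("state", "grade_level", etc.) and the candidate pool are
# hypothetical; the evaluation's actual procedure may differ.

def match_comparison_school(treatment, candidates):
    """Select a comparison school that matches the treatment school on
    state, grade level, and turnaround model, requires no or minimal
    AmeriCorps presence, and prefers a school from the same LEA."""
    eligible = [c for c in candidates
                if c["state"] == treatment["state"]
                and c["grade_level"] == treatment["grade_level"]
                and c["turnaround_model"] == treatment["turnaround_model"]
                and c["americorps_presence"] in ("none", "minimal")]
    # Prefer a match from within the same LEA wherever possible.
    same_lea = [c for c in eligible if c["lea"] == treatment["lea"]]
    pool = same_lea or eligible
    return pool[0] if pool else None
```

Under this sketch, a candidate outside the treatment school's LEA is selected only when no within-LEA candidate satisfies the other criteria.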

Primary data collection will consist of surveys, semi-structured interviews, and focus groups with grantees, AmeriCorps members, school leaders, teachers, counselors, and parents/guardians of students who receive AmeriCorps services. Secondary data analysis will be based primarily on mid-year and annual grantee progress reports and performance measure data and, pending availability, grantee activity logs and any additional outcomes data (achievement scores, attendance and behavior records, etc.) that can be obtained from grantees and partner schools. The focus of both the primary and secondary data collection is to understand and compare implementation effectiveness; therefore, collection of secondary student outcome data is supplemental and will be used only if feasible. The study purposefully does not include direct data collection from students so as not to detract from their instructional time.

A2.  Indicate how, by whom, and for what purpose the information is to be used.


The national evaluation of School Turnaround AmeriCorps will increase the evidence and knowledge base associated with this initiative. Results of the evaluation will be used by grantees and staff at CNCS and the Department of Education to strengthen programming and document the effects of School Turnaround AmeriCorps and the contributions of AmeriCorps members. The results of the analysis will inform policy and funding decisions at both agencies.


A3.  Minimize Burden: Use of Improved Technology to Reduce Burden


The surveys will use Internet administration. The contractor’s staff will work with grantees and school districts to obtain respondent email addresses. Once the appropriate respondents have been identified, each respondent will be emailed an individualized survey link and will have 3-4 weeks to take the survey at their convenience. Regular reminders will be emailed to respondents who have not yet completed the survey over the course of the field period. We assume that all respondents will have work email addresses and access to the Internet. We will use multiple modes for reminders including email, mail (e.g., fliers for teachers’ school mailboxes), and telephone calls with school liaisons. We have attempted to minimize the burden on respondents by conducting the surveys through the Internet and minimizing the length of administration. The surveys are expected to take on average 30 minutes or less per respondent.

A4.  Non-Duplication

We have reviewed previous CNCS data collection efforts. There are no other surveys, interviews, or focus groups that ask questions specifically related to the effectiveness of the School Turnaround AmeriCorps initiative.

A5.  Minimizing economic burden for small businesses or other small entities.

The information collection will not involve small businesses. The data collection procedures have been designed to minimize the burden on AmeriCorps grantees as well as representatives from larger organizations by: 1) administering the survey on the Web, which further reduces burden insofar as respondents may end any session and return to their previous answers at their discretion, allowing them to complete the survey at a time and place most convenient to them; and 2) scheduling interviews and focus groups at times most convenient for all respondents.

A6.  Consequences of the collection if not conducted, conducted less frequently, as well as any technical or legal obstacles to reducing burden.

A national evaluation is required as part of the School Turnaround AmeriCorps partnership between CNCS and the Department of Education, one of the signature partnerships of the Presidential Task Force established to expand national service. Under Goal 3 of its strategic plan, CNCS aims to maximize the value it adds to grantees, partners, and participants by developing a relevant and accessible knowledge base informed by research and rigorous evaluation. This evaluation will help us understand how AmeriCorps members are supporting school turnaround efforts and will identify best practices for the School Turnaround AmeriCorps program in terms of supporting schools’ ability to implement their turnaround plans.

It is important that data collection occurs each year during the two-year evaluation, as grantees may be improving or refining their program models such that outcomes could change over time. Some surveys are designed to be administered at the beginning of the school year and again at the end of the school year, since this pre-post design can most accurately capture changes in school and student conditions. It is also critical that data be collected from multiple stakeholder groups in order to accurately assess a variety of perspectives and fully capture the perceptions of all who are involved in the School Turnaround AmeriCorps initiative.

A7.  Special circumstances that would cause information collection to be collected in a manner requiring respondents to report more often than quarterly; report in fewer than 30 days after receipt of the request; submit more than an original and two copies; retain records for more than three years; and other ways specified in the Instructions focused on statistical methods, confidentiality, and proprietary trade secrets.


The information collection will not involve any of these circumstances.


A8.  Provide copy and identify the date and page number of publication in the Federal Register of the Agency’s notice. Summarize comments received and actions taken in response to comments. Specifically address comments received on cost and hour burden.


The 60 day Notice soliciting comments was published on June 27, 2014, on Regulations.gov (docket ID: CNCS-2014-0018-0001). No comments were received. The 30 day Notice soliciting comments was published on September 12, 2014, on Regulations.gov (docket ID: CNCS-2014-0018-0003). No comments were received.


A9.  Payment to Respondents


Incentive payments to survey respondents have been used extensively for many years to improve survey response rates, and there is considerable research-based evidence supporting the value of compensation for increasing cooperation and improving the speed and quality of response in a broad range of data collection efforts. The research literature on incentives, including both experiments and meta-analyses, suggests that incentives are associated with increased cooperation rates and response rates across modes; depending on the study, cooperation rates, response rates, or both are reported (Brick et al. 2005; Church 1993; Edwards et al. 2002, 2005; James and Bolstein 1992; Shettle and Mooney 1999; Singer et al. 1999; Singer, Van Hoewyk, and Maher 2000; Yammarino, Skinner, and Childers 1991).


There is greater variation in the literature with respect to the comparative effectiveness of prepaid and postpaid (i.e., conditional on survey completion) incentives. Findings are inconsistent across meta-analyses. Singer et al. (1999) find no significant difference between prepaid and postpaid incentives, both of which are more effective than no incentive. Edwards et al. (2002) find prepaid incentives more effective than postpaid incentives, though both pre- and postpaid incentives are more effective than no incentive. Church (1993) finds no significant difference between postpaid and no incentive conditions. Cantor, O’Hare, and O’Connor (2007) draw a valuable distinction between mail surveys, which have generally found positive effects for postpaid incentives, and interviewer-mediated surveys, which have not generally found postpaid incentives to be effective.


Because the surveys for the national evaluation of School Turnaround AmeriCorps are self-administered, they should be closer to the experience of mail surveys than of interviewer-mediated surveys. Cantor et al. (2007) find that postpaid incentives in the $15 to $35 range increase response rates for mail surveys (cf. Cantor et al. 2003; Strouse and Hall 1997). In the Reading First Impact Study commissioned by the Department of Education, a sub-study requested by OMB on the effect of incentives on teacher survey response rates showed significant increases when an incentive of $15 or $30 was offered, as opposed to no incentive (Gamse et al. 2008). In another study, Rodgers (2011) offered adult participants $20, $30, or $50 in one wave of a longitudinal study and found that the highest incentive of $50 produced the greatest improvement in response rates and also had a positive effect on response rates for the next four waves. Informed by this literature, modest individual incentives ($10 gift cards) will be offered to teacher survey respondents in districts where individual incentives are permitted, along with larger monetary awards for the three schools with the highest response rates. The contractor for the national evaluation of School Turnaround AmeriCorps is offering individual incentives as part of two current national studies that include teacher surveys and has received positive feedback from districts and school staff about these incentives.

Parents will receive a $20 incentive payment for participation in parent interviews, which is based on prior experience with the challenges inherent in securing participation from a difficult-to-reach parent population. Participants in on-site focus groups will be provided with food/refreshments valued at approximately $40 per focus group. Finally, the evaluation is greatly strengthened by high response rates from comparison schools, so each of the comparison schools will receive a $250 stipend as an incentive to participate in the study.

A10.  Assurance of Confidentiality and its basis in statute, regulation, or agency policy.


Participation in the surveys is voluntary. All analyses, summaries, or briefings will be presented at the aggregate level and it will not be possible to link specific responses to individual respondents in any way.

Assurance of privacy will be provided to all respondents in the text used for survey invitations and reminders. In addition, measures will be taken by the contractor to remove key identifiers prior to data analysis, so that individual responses cannot be linked to a specific individual. The basis for the assurance of privacy is from the privacy statement and nondisclosure agreement that is part of the project’s contract.


The survey data will be stored on a subcontractor’s computer that is protected by a firewall that monitors and evaluates all attempted connections from the Internet. The subcontractor, Data Star, will use a secure transfer site to send data files to the contractor. Identifiable information on each survey respondent (name, email address) will be maintained in a separate data file apart from the survey data. Access to data with identifying information will be limited to only contractor staff directly working on the survey. Additional measures will be taken to remove other identifiers prior to data analysis, so that individual responses cannot be linked to a specific individual. The de-identified analysis files will be stored on a secure server at the contractor’s site.


Once the project is completed, all identifiable data on each respondent will be deleted, though it should be noted that the contractor maintains backup tapes that are not amenable to the deletion of particular files. The entire database will be encrypted so that any data stored will be further protected. Finally, access to any data with identifying information will be limited to only contractor staff directly working on the surveys.


All study staff will be trained in project-specific confidentiality and security procedures. We will not link particular responses to individual respondents, maintaining confidentiality of the individual.

Access to interview and focus group files will be limited to the contractor’s staff directly working on the data collection and analysis for this project. Hard copy notes and consent forms will be stored in locked file cabinets. Identifying information will be removed from analytic files immediately after data cleaning, and files that include individual identifiers will be stored on a secure server accessible only to key project staff. All interview and focus group analyses, summaries, or briefings will be presented at the aggregate level.


A11.  Sensitive Questions 


No questions of a sensitive nature will be asked in the surveys, interviews, or focus groups.


A12. Hour burden of the collection


Average Time Per Response: 30 minutes.

Estimated Maximum Total Burden Hours: 2,274 hours per year; 4,548 hours total over two years.

Surveys

Respondent group      AmeriCorps   Comparison   Pre/post?   Total responses
Grantee staff                 13            0   No                       13
AmeriCorps members           440            0   No                      440
Principals                    62           62   Yes                     248
Teachers                     348          348   Yes                   1,392
Total                        863          410                         2,093

Unique respondents: 1,273
Total minutes: 125,580
Total hours: 2,093




Interviews

Respondent group      AmeriCorps   Comparison   Pre/post?   Total responses
Grantee staff                 13            0   Yes                      26
AmeriCorps members            26            0   No                       26
Principals                    26           26   Yes                     104
Teachers                      26           26   No                       52
Parents                       50            0   No                       50

Unique respondents: 193
Total minutes: 7,740
Total hours: 129




Focus groups

Respondent group      AmeriCorps
Grantee staff                 13
AmeriCorps members            39
Principals                    13
Teachers                      39
Unique respondents           104

Total minutes: 3,120
Total hours: 52





Total hours per year: 2,274
Total hours over two years: 4,548
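As a check on the totals above, the annual and two-year burden hours follow directly from the minutes reported for each instrument type. The sketch below simply restates that arithmetic using only figures from this section:

```python
# Annual burden minutes reported for each instrument type in the tables above.
survey_minutes = 125_580
interview_minutes = 7_740
focus_group_minutes = 3_120

# Convert each to hours.
survey_hours = survey_minutes // 60        # 2093
interview_hours = interview_minutes // 60  # 129
focus_group_hours = focus_group_minutes // 60  # 52

# Sum across instruments for the annual total, then double for two years.
annual_hours = survey_hours + interview_hours + focus_group_hours
two_year_hours = annual_hours * 2
print(annual_hours, two_year_hours)  # prints: 2274 4548
```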






A13. Cost burden to the respondent


The telecommunications costs of the interviews are considered part of customary and usual business practices. The survey will not involve any additional cost burden to respondents or record-keepers, other than that described above. 


A14. Cost to Government


The cost to the Federal Government for the entire first year of the evaluation is $706,326. The cost of data collection is estimated at $260,725. This includes administering the surveys, conducting interviews and focus groups, transcriptions, preparing data files, and incentive payments.


A15. Reasons for program changes or adjustments in burden or cost.


Not applicable.


A16.  Publication of results


School Turnaround AmeriCorps grantees began operating in schools in fall 2013. As shown in Exhibit A-3, 2013-14 was the first program year of grantees implementing the intervention, 2014-15 represents the second year of operating the program, and 2015-16 will be the third program year. The 2013-14 school year served as a pilot year for the national evaluation to develop the evaluation design, develop and pilot test the data collection instruments, and prepare and submit the OMB clearance package. The two-year national evaluation will span the 2014-15 and 2015-16 school years.


Exhibit A-3: Program Implementation and Evaluation Timeline

School Year             2012-2013      2013-2014         2014-2015    2015-2016
Program Implementation  Pre-program    First year        Second year  Third year
Program Evaluation      Baseline data  Pilot evaluation  Year 1       Year 2

It is anticipated that data collection for the national evaluation will begin in November 2014, pending OMB, contractor IRB, and district IRB approvals. It is important that data collection occurs each year during the two-year evaluation, as grantees may be improving or refining their program models such that outcomes could change over time. Some surveys are designed to be administered at the beginning of the school year and again at the end of the school year to capture changes that may have occurred in school and student conditions.

The evaluation team will conduct identical or very similar activities in Year 2 of the evaluation as those occurring in Year 1. We anticipate learning important information about some activities during the first year, such as comparison school recruitment, improving survey response rates, and grantee activity tracking, that could require adjustments to the design and implementation of the evaluation in the second year (if, for example, comparison schools decline to participate for a second year). Thus, the design presented here will serve as the blueprint for Year 2 of the evaluation, recognizing that Year 1 will inform any non-substantive changes needed that could impact decisions about Year 2. We will consult with OMB prior to beginning data collection in Year 2 in order to discuss any such process changes.


The chart below displays the time schedule for the first year of the evaluation (September 2014-August 2015). A similar schedule will be repeated in year two of the evaluation (September 2015-August 2016).


Office of Management and Budget (OMB) review and approval    July-October 2014
Conduct Fall surveys                                         November-December 2014
Conduct Spring surveys                                       April-June 2015
Interviews and Focus Groups                                  November 2014-June 2015
Interim Findings Report                                      May 2015
Final Findings Report                                        August 2015


Results will be compiled into interim and final internal reports to CNCS stakeholders, with accompanying briefing and presentation materials. Depending on the quality of the results, a peer-reviewed article may be developed.


A17.  Explain the reason for seeking approval to not display the expiration date for OMB approval of the information collection.

The expiration date will appear on the materials.

A18.  Exceptions to the certification statement

There are no exceptions.

References

Brick, J. Michael, Jill Montaquila, Mary Collins Hagedorn, Shelly Brock Roth, and Christopher Chapman. 2005. “Implications for RDD Design from an Incentive Experiment.” Journal of Official Statistics 21:571-589.

Cantor, David, P. Cunningham, T. Triplett, and R. Steinbach. 2003. “Comparing Incentives at Initial and Refusal Conversion Stages on a Screening Interview for a Random Digit Dial Survey.” Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Nashville, TN.

Cantor, David, Barbara C. O’Hare, and Kathleen S. O’Connor. 2007. “The Use of Monetary Incentives to Reduce Nonresponse in Random Digit Dial Telephone Surveys.” Pp. 471-498 in Advances in Telephone Survey Methodology, edited by James M. Lepkowski, Clyde Tucker, J. Michael Brick, Edith de Leeuw, Lilli Japec, Paul J. Lavrakas, Michael W. Link, and Roberta L. Sangster. New York: Wiley.

Church, Allan H. 1993. “Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis.” Public Opinion Quarterly 57:62-79.

Edwards, Phil, Rachel Cooper, Ian Roberts, and Chris Frost. 2005. “Meta-Analysis of Randomised Trials of Monetary Incentives and Response to Mailed Questionnaires.” Journal of Epidemiology and Community Health 59:987-999.

Edwards, Phil, Ian Roberts, Mike Clarke, Carolyn DiGuiseppi, Sarah Pratap, Reinhard Wentz, and Irene Kwan. 2002. “Increasing Response Rates to Postal Questionnaires: Systematic Review.” British Medical Journal 324:1183-85.

Gamse, Beth C., Howard S. Bloom, James J. Kemple, and Robin T. Jacob. 2008. “Reading First Impact Study: Interim Report.” NCEE 2008-4016. National Center for Education Evaluation and Regional Assistance.

James, Jeannine M. and Richard Bolstein. 1992. “Large Monetary Incentives and Their Effect on Mail Survey Response Rates.” Public Opinion Quarterly 56:445-53.

Rodgers, Willard. 2011. “Effects of Increasing the Incentive Size in a Longitudinal Study.” Journal of Official Statistics 27:279-299.

Shettle, Carolyn and Geraldine Mooney. 1999. “Monetary Incentives in U.S. Government Surveys.” Journal of Official Statistics 15:231-50.

Singer, Eleanor, John Van Hoewyk, Nancy Gebler, Trivellore Raghunathan, and Katherine McGonagle. 1999. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics 15:217-30.

Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. 2000. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly 64:171-88.

Strouse, Richard C. and John W. Hall. 1997. “Incentives in Population-Based Health Surveys.” Proceedings of the American Statistical Association, Survey Research Section, 952-957.

Yammarino, Francis J., Steven J. Skinner, and Terry L. Childers. 1991. “Understanding Mail Survey Behavior: A Meta-Analysis.” Public Opinion Quarterly 55:613-39.


1 School Improvement Grants (SIG), authorized under section 1003(g) of Title I of the Elementary and Secondary Education Act of 1965 (Title I or ESEA), are grants to State educational agencies (SEAs) that SEAs use to make competitive subgrants to local educational agencies (LEAs) that demonstrate the greatest need for the funds and the strongest commitment to use the funds to provide adequate resources in order to raise substantially the achievement of students in their lowest-performing schools. Source: http://www2.ed.gov/programs/sif/index.html. Accessed October 9, 2014. Priority Schools are defined here: http://www.ed.gov/sites/default/files/demonstrating-meet-flex-definitions.pdf. Accessed October 27, 2014.


2 Throughout this document, all references to SIG-funded schools also include Priority-funded schools.

3 This count excludes (1) schools with no School Turnaround AmeriCorps members during the 2013-14 school year and (2) some Teach For America schools. The number of treatment schools may decrease if the final evaluation design warrants excluding a few treatment schools for which no comparable matches can be found.

4 Teach for America (TFA)’s intervention involves providing AmeriCorps members who have been trained as teachers to teach in school classrooms, in contrast to other School Turnaround AmeriCorps grantees whose members provide services, such as tutoring and mentoring, to support student engagement and academic achievement. Because of its distinct intervention, only a representative sample of TFA schools will be included in the evaluation.


5 This number excludes the schools newly added in 2014-2015, since their experiences in implementing the program will be qualitatively different from the experiences of the schools in the second year of programming that have already completed one year of program implementation.


