OMB: 1850-0875

Identifying Potentially Successful Approaches to Turning Around Chronically Low Performing Schools

ED-04-CO-0025/0020





Supporting Statement for Paperwork Reduction Act Submission, Description of Statistical Methods

Part B








Table of Contents


Supporting Statement for Paperwork Reduction Act Submission, Description of Statistical Methods (Part B)

1. Sampling Design

Principal Survey

In-Depth Case Studies

2. Procedures for Data Collection

Principal Survey

In-Depth Case Studies

3. Methods to Maximize Response Rate

4. Expert Review and Piloting Procedures

5. Individuals and Organizations Involved in Project


List of Exhibits


Exhibit 1: Principal Survey Sample

Exhibit 2. Organizations, Individuals Involved in Project


Supporting Statement for Paperwork Reduction Act Submission, Description of Statistical Methods (Part B)

1. Sampling Design

The respondent universe is defined using longitudinal data from three states (North Carolina, Florida, and Texas) to identify chronically low-performing and turnaround schools. The focus is on elementary and middle schools in these states, and in particular on chronically low-performing schools. This study includes the collection of new data through two activities: a principal survey and school case studies. The sampling design for each is described below.

Principal Survey

School sample. The principal survey sample will be selected as follows. Equal numbers of turnaround (TA), moderate improvement (MI), and not improving (NI) schools will be selected from within school-level strata (elementary school/middle school) to produce six groups of schools with 125 schools per group (Exhibit 1). A minimum of 100 respondents per group is needed to detect significant differences among the sampled groups; taking into account a projected principal response rate of 80 percent, we plan to sample 125 schools per group. Within the TA strata, the first priority will be to select schools that turned around in both reading and mathematics; if the number of such schools is less than 125, additional schools will be selected at random from those that have turned around in one subject or the other.

Exhibit 1: Principal Survey Sample


                               Elementary    Middle    Total
Turnaround Schools                  125        125      250
Moderate Improvement Schools        125        125      250
Not Improving Schools               125        125      250
Total                               375        375      750
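The sample-size arithmetic above (a minimum of 100 completed surveys per group, inflated by the projected 80 percent response rate) can be sketched as follows; the function name is illustrative, not part of the study's materials.

```python
import math

def schools_to_sample(min_completes: int, response_rate: float) -> int:
    """Number of schools to draw per group so that the expected number
    of completed surveys meets the minimum analytic requirement."""
    return math.ceil(min_completes / response_rate)

# 100 completed surveys needed per group at an expected 80 percent
# response rate implies sampling 125 schools per group.
print(schools_to_sample(100, 0.80))  # -> 125
```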


A simple random sampling procedure will yield numbers of schools that are roughly proportional to the numbers of TA, MI, and NI schools that Study I found in each state. If the findings of Study I indicate that one of the three states would have fewer than 10 schools in any group within the survey sample (e.g., TA middle schools), we will discuss with IES the pros and cons of oversampling from that state. Our preference, in general, is to sample schools without stratifying by state, because we are most interested in focusing on schools with the most clear-cut trajectories of turning around student achievement, starting from very low levels of performance. However, we recognize that having too few cases from a particular state would limit our opportunity to understand how that state's policies translate into school-level policies, programs, and practices.

Respondent sample. To understand TA schools, we must study behaviors and actions that happened several years ago, and retrospective data collection means that sitting principals may not have been in the school throughout the period of interest, 2004-07.1 We will attempt to locate and survey the individuals who served in the role of principal during that time.

We will undertake the following steps to identify the principal(s) who served from fall 2004 through spring 2007. In the initial contact with the school, we will ask the school secretary for this information. If the sitting principal is new to the school since 2004, we will ask the secretary whether the former principal(s) is/are still working in the district, retired and known to be living nearby, or not reachable. If the secretary is unsure, we will ask the current principal about his/her predecessor(s). We will also search Google and the Wayback Machine internet archive (archive.org) for old web pages listing the names of the then-principals.

If it is not possible to contact the individual(s) who served as principal throughout the 2004-07 period, we will attempt to interview a staff member who has been in the school from that period through the present, with preference for a department chair or instructional coach.

In-Depth Case Studies

School sample. We propose to conduct case studies in 36 schools: 24 TA schools and 12 NI schools. The inclusion of NI schools will provide an invaluable source of data for disconfirming mistaken conclusions about turnaround, because approaches found in both groups of schools cannot be the keys to turnaround. Our data analysis will therefore aim to describe the work of turnaround in ways that accurately characterize events and approaches in TA schools but do not apply equally well to the NI schools.

We plan to select TA and NI schools in a two-to-one ratio within each state as well as in the sample as a whole. Our rationale for including more TA than NI schools is that, notwithstanding the value of contrasting the two groups analytically, we expect to learn more from the details of successful turnaround efforts than from the details of efforts that failed or fizzled.

The first step in sampling will be to select TA schools from among those identified in Study I as turnarounds in both reading and mathematics. Using the survey results, we will exclude from consideration schools with exogenous explanations for their gains and schools whose 2004-07 principals cannot be contacted. From the remaining schools, we will randomly select 24 schools, including both elementary and middle schools, and 15 backup schools. If the pool is not sufficient to yield these numbers, we will round out the sample by making random selections from the full pool of TA schools whose turnaround was in either reading or mathematics but not both. If warranted by the findings of the administrative data analysis, we will stratify the sampling by amount of teacher turnover and will select at least 8 TA schools and 4 NI schools that had a sizable influx of new teachers.
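The selection logic above (draw 24 schools plus 15 backups from the both-subject pool, rounding out from single-subject turnarounds when that pool runs short) can be sketched as follows; the function name and pool variables are hypothetical placeholders, not the study's actual code.

```python
import random

def draw_ta_case_studies(both_subject_pool, single_subject_pool,
                         n_main=24, n_backup=15, seed=2011):
    """Randomly select the TA case-study sample and its backups,
    preferring schools that turned around in both reading and math.
    Pools are lists of school identifiers (illustrative)."""
    rng = random.Random(seed)
    need = n_main + n_backup
    pool = list(both_subject_pool)
    rng.shuffle(pool)
    if len(pool) < need:
        # Round out with random single-subject turnarounds.
        pool.extend(rng.sample(list(single_subject_pool), need - len(pool)))
    return pool[:n_main], pool[n_main:need]
```

The seed is fixed only so the sketch is reproducible; an actual draw would document its randomization procedure separately.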

We will not formally match NI schools with particular TA schools. However, after selecting the TA sample, we will randomly select NI schools from within strata such that the two samples are roughly proportionate to one another in school level (elementary/middle), size (larger/smaller than the state median for the school level), urbanicity (urban/rural/other), and poverty level (above/below the state median for the school level). We considered using these variables as criteria in sampling the TA schools but decided against that procedure: introducing additional stratification variables makes it less likely that the sample will capture the strongest examples of turnaround (schools that had exceptionally low performance and then greatly improved), and we believe that these examples will yield the findings with the greatest implications for future research and policy, irrespective of the schools' demographic makeup. The pool will be refreshed as schools consent to, or decline, participation.
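A minimal sketch of the proportionate NI draw described above, assuming each school record carries a "stratum" tuple of (level, size, urbanicity, poverty); the field name and the simple rounding rule are illustrative assumptions, not the study's specification.

```python
import random
from collections import Counter

def select_ni_schools(ta_sample, ni_pool, n_ni=12, seed=2011):
    """Draw NI schools so the NI sample is roughly proportionate to the
    TA case-study sample across stratification cells (school level,
    size, urbanicity, and poverty level)."""
    rng = random.Random(seed)
    ta_counts = Counter(s["stratum"] for s in ta_sample)
    total = sum(ta_counts.values())
    selected = []
    for stratum, count in ta_counts.items():
        # Proportionate quota for this cell, capped by availability.
        quota = round(n_ni * count / total)
        candidates = [s for s in ni_pool if s["stratum"] == stratum]
        selected.extend(rng.sample(candidates, min(quota, len(candidates))))
    return selected
```

Because quotas are rounded per cell, the total can come out one school over or under; in practice the remainder would be resolved by adjusting the largest cell, consistent with "roughly proportionate" rather than exact matching.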

Respondent sample. Using procedures described below, we will obtain a list of professional staff in each school, with job titles and year hired. We will purposively select eight respondents to be interviewed (in addition to the principal) from that list. Four respondents will be selected from among administrators, coaches, department chairs, or other members of a school leadership team, with a preference for staff in the school since 2004–05 or earlier. Four respondents will be selected from among teachers, with a preference for teachers in the school since 2004–05 and those teaching reading/English and/or mathematics (depending on the subject(s) in which the school turned around).

2. Procedures for Data Collection

Principal Survey

Survey Development

A key decision underlying our development of the principal survey is that we will pose the same questions in TA, MI, and NI schools alike. The premise of the interview will be that the school has faced challenges and has made at least some efforts to improve student performance; we are confident that this will apply to all the schools. By asking about the improvement efforts that the school relied on most heavily in the period from four to six years prior to the survey, we will elicit descriptive information from each school. This will permit the comparative analysis of the types of policies, programs, and practices that were central in TA, MI, and NI elementary and middle schools, respectively. This analysis, with appropriate caveats for the limitations of survey research regarding events in the past, will provide answers to the first research question (See Part A for the study research questions).

Types of external and structural policies, programs, and practices on which the survey will focus are the following: external mandates, incentives, and resources; teacher recruitment and deployment; curriculum or pedagogy; added time; school organization/grouping strategies; support services; comprehensive reform designs; and inter-organizational partnerships. A draft of the survey can be found in the Appendix. Note that this draft survey has not yet been formatted for Web use. It will also be adapted for telephone administration (as described below).



Survey Administration

The principal survey will be administered online and as a telephone interview. We will pilot both formats in 4 chronically low-performing elementary schools and 4 chronically low-performing middle schools across the target states, for a total of 8 pilot schools. These pilot schools will be excluded from the survey sample. Following warranted revisions to the instruments, we will begin data collection.

Whenever possible, the principal survey will be administered initially via the Web. We will follow up with nonrespondents using computer-assisted telephone interview (CATI) techniques. We anticipate that many principals may be difficult to reach online and may prefer a CATI survey; we will give principals the option but anticipate that much of the data will be collected via CATI. A CATI survey has benefits with regard to response burden and ensuring a high response rate. Among its advantages is ease of response: principals may be contacted at times that accommodate their schedules, and respondents do not need to interpret skip patterns or mail completed hard copies. Moreover, for the purposes of this study, we believe that CATI, which offers the most flexibility in tailoring questions and gathering open-ended data, is the most appropriate technique. A telephone survey administered by highly trained staff offers the best opportunity to fully understand the nature of the practices, programs, and policies in place in turnaround schools, and to gauge accurately whether schools can be categorized as turnarounds.

CATI technology significantly improves data quality by automatically providing correct sequencing and branching, reducing interviewer error, and substantially decreasing the time required to conduct interviews. Further, all data are validated immediately for completeness and consistency as they are entered into a centrally located database. Respondents can also provide the requested data over a number of sessions if interruptions occur, as is likely when interviewing principals.

However, to ensure a high response rate, we will not rely entirely on CATI techniques. We will supplement our follow-up activities with e-mail prompts that include a link to an online version of the survey, and a hard-copy pencil-and-paper questionnaire will be mailed to any respondent who requests one. Once data collection has been approved by OMB, we will send the sampled principals a package containing: (1) a cover letter; (2) promotional material describing the evaluation; (3) an outline of the topics to be addressed during the telephone survey; and (4) information concerning the data collection window. Principals will be strongly encouraged to participate in the telephone survey, but they will also be given the option of responding online if they prefer.

Approximately one week after the initial mailing, trained study staff will begin contacting principals to schedule and conduct telephone surveys. Multiple attempts will be made over a 6-week period to contact and schedule interviews with the principals. We recognize the importance of obtaining the cooperation of the principal's secretary or administrative assistant as a gatekeeper; CATI staff will be provided with specific protocols and directions for reaching the appropriate persons in each school.

In implementing the survey, it will be important to keep in mind the job responsibilities, schedules, and time constraints of the principals. That is one reason why we will provide the option of completing the survey by telephone or online. We will conduct multiple contact attempts with principals, establishing contacts through their administrative assistants and secretaries, as appropriate, to schedule the time to conduct the 25-minute interviews. All interviewers will be trained to successfully encourage principals to complete the survey while remaining flexible to accommodate their schedules.

In communicating with the principals through advance letters and telephone calls, we will also obtain their email addresses, if available, to provide another route for contacting them. The online version of the principal survey will be accessible at all times and CATI staff can work outside of regular business hours to accommodate scheduled phone calls. Principals will be provided with a toll-free number to use to call DIR’s CATI center.

Data-collection staff will be given a thorough understanding of the study purpose and research goals so that they can effectively encourage principals to participate. Interviewers will be trained to answer questions that principals and gatekeepers might raise, such as how long the interview will take or how the analyzed data will be used. Interviewers will also be provided with an easy-to-reference guide of frequently asked questions that addresses concerns likely to be raised by respondents and gives an appropriate response to each. Responses to anticipated questions will also be incorporated into the CATI instruments and the training materials.

Using these procedures, we expect to generate a response rate of greater than 80 percent.

In-Depth Case Studies

Case Study Procedures and Instruments

Through case studies, we will investigate how TA and NI schools implemented improvement initiatives, shedding light on the conditions that may prompt the adoption of policies, programs, and practices and on the elements of school capacity that may support and accompany their implementation. The primary means of data collection will be interviews and observations, conducted by experienced site visitors trained specifically for this study. Site visits will be conducted in winter and spring 2011, scheduled so as not to conflict with district assessment dates.

We plan to explore resource issues as part of our case study research. The protocols for these conversations will build upon AIR's prior work on successful turnaround schools (see Aladjem et al., 2010), which identified state and federal grants as critical to successful school turnarounds. We do not, however, plan to conduct intensive fiscal analyses of school or district expenditure data. We will limit the resource analyses to qualitative descriptions provided by case study participants, one question on the school survey (B1e, though others have resource implications), and the human capital analyses.

Profile and timeline of reforms. Prior to site visits, we will use a template to complete profiles of the case-study state, district, and school reform efforts from 2002 through 2007. Data will be obtained through online searches, interviews with district officials, and a review of district and local school improvement plans. We will develop a “request for documents” form letter requesting relevant reports across districts. The content of state and district profiles, like the data gathered from the principal survey, will be used in tailoring the probes used in the onsite interviews in each school. They will also inform the analysis of case-study data, shedding light on the origins of various pressures and opportunities associated with turnaround that respondents in the schools may report. The profile template will include a wide range of the types of external policies, programs, and practices shown in our conceptual framework. These include staffing strategies (replacing principals, hiring sizable cohorts of new teachers), empowerment strategies (undertaking efforts to increase principal control over budget, curriculum, schedule, and staffing), accountability strategies (raising standards, closing failing schools, increasing school choice, rewarding successful schools and/or teachers, replacing failing teachers), and capacity-building strategies (undertaking major professional development efforts, engaging external support providers). The profile template will also include a wide range of the types of school policies, programs, and practices shown in our conceptual framework. These include targeted changes in curriculum, pedagogy, organization of the school day and instructional groups, packaged comprehensive reforms, and inter-organizational partnerships, etc. Profile data will be used to construct a timeline or history of reform efforts and milestones, 2000-01 to 2007-08.

Design and instrument for site visits. In designing the case study process, we will pay careful attention to issues in retrospective data collection. These include respondents’ not experiencing relevant events, forgetting events, recalling events incorrectly, and misidentifying events on a timeline. Strategies will include the following:

  • We will identify and interview individuals employed in the case study schools since 2004 and earlier.

  • We will construct a timeline mapping school experiences over the last decade.2

  • We will email the timeline to respondents one week prior to our site visit, asking them to begin reflecting on notable milestones or turning points in their school’s history.3


The basic interview instrument will have a common structure across all respondents. Respondents will include the principal, four additional administrators, and four teachers. In middle schools, we include the following in the category of administrators: assistant principal (instruction), ELA department chair, math department chair, grade team leader, leadership coach, instructional coaches, etc. In elementary schools, we include the following in the category of administrators: all assistant principals, grade team leaders, leadership coach, instructional coaches, etc.


We are developing broad questions to initiate discussion, followed by focused probes to ascertain insights in important areas. In developing protocol questions, the study team will avoid language that may be loaded, leading, or likely to yield socially desirable responses.

We will pilot test the protocols to ensure that they are capturing all the data elements required and that the questions are adequately interpreted by the respondents. In two schools, one TA and one NI, we will conduct interviews and ask respondents about (1) the overall organization, flow, and length of the interview, (2) the clarity of the interview wording and language, (3) specific questions that were unclear or difficult to answer, and (4) any recommended changes. We will use the responses from these pilots to modify the protocols. A draft interview instrument is included in the appendix.

Data-Collection Procedures

Recruitment. Some of the districts in which our sampled schools are located will require a research proposal for local approval. We will follow the requirements of each district that has such a procedure.

In each case study school, the current principal will be asked to designate a study liaison to schedule interviews and to provide a list of professional school staff (with job titles and year hired).

In seeking the agreement of principals and the site coordinators they designate, we will communicate the study purposes and focus, procedures for limiting burden and ensuring confidentiality, and an offer of modest incentives for the building and the coordinator. We will emphasize to all schools, TA and NI alike, that we hope to visit them because other schools can learn from their experiences and accomplishments in the challenging process of improvement.

Site-visitor training. All site visitors will be experienced qualitative researchers from the firms participating in this study. Nevertheless, training is essential to ensure a shared understanding of the particular study’s requirements. In this study, training for the site visitors will have a dual focus on the conceptual framework for the study and on relevant procedures. Conceptually, site visitors will review major findings from the existing literature on school improvement. To make clear how the site visits must provide data directly addressing the conceptual framework, the training will emphasize what constitutes an adequate or inadequate answer to a question on the interview guide, illustrated with examples. (For example, “We had professional development” would be an inadequate answer without details on duration, intensity, and extent of participation.) Training will also include review of the school visit checklist developed for the study outlining all tasks to be performed before, during, and after each visit.

Case-study visits. Prior to on-site data collection, site visitors will review state, district, and school reform profiles. On-site, a two-person team will be in the school for two or three days, conducting interviews and observing school processes (e.g., staff meetings or professional development). The choice of a two- or three-day visit will depend on school size and organizational complexity: three-day visits will be conducted in schools that are above average in size, compared with the rest of the sample, and in which both reading and math performance turned around. In some cases the timing of events to be observed, or other scheduling complications, will also necessitate a three-day visit.

As described above (in the section on sampling), we will conduct individual interviews with eight individuals in each school, in addition to the principal. Observations of group events such as grade team meetings will confirm/challenge reports of standard school policies, programs, and practices obtained during interviews (see question #2). Following site visits, to add any needed elaboration or clarification, we will conduct additional brief, targeted telephone interviews and email correspondence, as needed.

We will capture data onsite using a combination of note-taking and audio-recording. In our experience, a skilled note-taker can produce a high-quality record of the interview, especially if notes are cleaned immediately following the site visit. This process has additional benefits: the staff member who takes notes is better positioned to assist with analyses, and it eliminates the need to wait and pay for verbatim transcriptions. In the event that we need exact quotes or need to verify information, the audio file will be available.

Throughout the process of data collection and reporting, the team will make all efforts to protect the privacy of respondents participating in the site visits. We will not identify by name any of the interviewees; nor will we attribute quotes that could be construed in a negative manner. Although we will identify the names of states in the final reporting of case studies, districts and schools will be identified by pseudonyms.

3. Methods to Maximize Response Rate

Data collection is a complicated process that requires careful planning. The research team has developed interview, survey, and data collection protocols that are streamlined and designed to place as little burden on respondents as possible. The team will also pilot and subsequently refine all instruments to ensure that they are user-friendly and easily understood; these steps increase respondents’ willingness to participate in the data collection activities and thus increase response rates.

To further ensure a high response rate on the principal survey, we will not rely entirely on Web-based administration. Our experience demonstrates that multi-mode survey administration maximizes response rates by offering respondents the mode they find easiest to complete. As an added incentive to participate, we will provide $50 payments to survey participants. We will supplement our follow-up activities with telephone prompts, as necessary, and a CATI alternative; a hard-copy pencil-and-paper questionnaire will be mailed to any respondent who requests one. Further, after mailing the packages to school districts, we will prepare a twice-weekly log of all responses received online and by mail. Approximately two weeks after the initial mailing, we will begin survey follow-up by sending a letter reminding respondents about the survey. After two more weeks, we will implement a series of three follow-up calls at approximately ten-day intervals; during the third call, we will offer to complete the questionnaire as a telephone interview. The research team has extensive experience administering Web-based and CATI surveys with high response rates using these procedures.

Due to our reliance on respondents’ good will to achieve a high response rate, we anticipate an 80 percent or greater response rate from principals.

4. Expert Review and Piloting Procedures

As discussed above, we will pilot both the Web and telephone (CATI) forms of the principal survey with principals of 4 chronically low-performing elementary schools and 4 chronically low-performing middle schools across the target states, for a total of 8 pilot schools. These pilot schools will be excluded from the survey sample. We will ask respondents to complete the survey in a natural manner, and we will note the length of time required for completion. For purposes of the pilot, we will add questions on the following topics at the end of the survey: (1) the clarity and usefulness of the survey instructions; (2) the overall organization and flow; (3) the clarity of survey wording and language; (4) specific items that were unclear or difficult to answer; (5) any recommended changes; and (6) any other comments on the questionnaire. Responses from these pilots will be used in making final changes to the questionnaire, and data collection will begin after any warranted revisions.

Case study protocols will also be pilot tested, to ensure that they capture all required data elements and that respondents interpret the questions as intended. In two schools, one TA and one NI, we will conduct interviews and ask respondents about (1) the overall organization, flow, and length of the interview; (2) the clarity of the interview wording and language; (3) specific questions that were unclear or difficult to answer; and (4) any recommended changes. Lessons learned during the piloting process will inform refinements to the protocols, as well as to procedures for scheduling and conducting interviews. Prior to conducting site visits, we will develop consent forms for interviews and have the forms and protocols reviewed by the IRB.

5. Individuals and Organizations Involved in Project

AIR is the prime contractor for this study. Decision Information Resources (DIR), Policy Studies Associates (PSA), and Urban Institute are subcontractors. Dr. Daniel Aladjem, of AIR, is the project director; Drs. Brenda Turnbull and Eileen Foley, of PSA, will lead the case studies; Dr. Pamela Wells, of DIR, will lead the administration of the principal survey.

Contact information for these individuals and organizations is presented in Exhibit 2.

Exhibit 2. Organizations, Individuals Involved in Project

Responsibility                     Organization       Contact Name           Telephone Number
Project Director                   AIR                Dr. Daniel Aladjem     (202) 403-5386
Subcontractors:
Case Study Team Leader             PSA                Dr. Brenda Turnbull    (202) 939-5324
Case Study Team Leader             PSA                Dr. Eileen Foley       (202) 939-9780
Principal Survey Team Leader       DIR                Dr. Pamela Wells       (832) 485-3730
Administrative Data Team Leader    Urban Institute    Dr. Jane Hannaway      (202) 261-5753




1 This period has been selected as it represents a balance between competing demands on the study. On the one hand, a long time period is ideal from the standpoint of making certain that schools have truly achieved turnaround and sustained improvement. On the other, we want to examine instances of turnaround that are as recent as possible so that we have greater confidence that respondents remember accurately what happened.

2 For example, prior to school site visits, to learn more about the school improvement efforts identified by principals as most intensive and to confirm intervention timelines, we will conduct individually tailored phone calls. In these calls, we will ask principals to confirm or correct their survey responses with regard to their most intensive policies, programs, and practices (PPP) and to indicate, if possible, when each intensive PPP was introduced and when it was most fully implemented.

3 In the interview, using data gathered from the principals and in the state and district profiles, we will prompt respondents for their recollections of PPP that they have not yet mentioned but that we have learned about from other sources.
