
Study of Schools Targeted for Improvement Using Title I Section 1003(g) Funds Provided Under ARRA (Study of School Turnaround)

ED Response to OMB Passback

OMB: 1850-0878


Memorandum

To: OMB
From: Thomas Wei, IES, Dept of Ed
Date: February 10, 2011
Re: Responses to OMB questions



1. The evaluation questions (pg. 8) do not seem to get at structural changes (for example, a longer school day or year, or a change in the school's schedule). How will the study get at these approaches to school reform?

While the evaluation questions themselves do not specifically cite structural changes, such changes will be captured in the data collection and are implied in several evaluation questions. For example, the first sub-question associated with EQ2 reads “What specific actions are states, districts, and schools taking to improve instruction in the SIG-funded schools in this study?” In responding to this question, the study team will collect data on the change strategies listed in the conceptual framework, including human capital management, curriculum and instruction, use of time, data use, professional learning, leadership, resource allocation, parent and community outreach, school climate, and student supports. We expect any relevant changes in structures in the school to occur in one or more of these arenas and thus to be picked up in the interviews. Indeed, questions in the interview protocols and focus group protocols (see, for example, Elementary Teacher Focus Group #6) probe for details about change strategies, including the rationale and sequence of activities. In site visit training activities, research staff will be coached to collect data on the primary structural changes in each case study school.

2. Will the study, in its examination of use of funds (EQ6), get at any relationship between SIG funds and other funds available to these schools (RTT, Title I, Title II, IDEA, state compensatory education dollars)? Also, will the study be able to say anything about the percentage increase in funds experienced by these schools or the relative costs of similar reform interventions or strategies across the case study schools?

The focus of this study is on the use of SIG funds to support implementation of school turnaround models. Nevertheless, in interviews with district officials and principals, we will ask how they are leveraging all available sources of funds to support their turnaround strategies. We have not, however, planned to explicitly examine the relationship between the amount of SIG funds and the amount of funds from other sources; it would be a challenge to collect information on all funding sources in enough detail to analyze the relationships among them. We will certainly analyze overall changes in school budgets, as well as changes in expenditures (see the next question), but due to data availability, it would be difficult to break down the expenditures by exact funding source. (See Question 23 in the elementary principal interview, Question 24 in the high school principal interview, and Question 28 in the district official interview.)

We will be able to describe the percentage increase in funds each district experiences from SIG funds, examining per-pupil expenditures over time for 2007-08, 2008-09, 2009-10, and 2011-12. The Baseline Data Report has also estimated the average percentage increase in funds for schools in each state. The cross-case analyses will enable a comparison of costs for schools planning to adopt similar turnaround models and strategies.
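To make the intended calculation concrete, the following is a minimal sketch (not the study's actual code) of how the percentage increase in per-pupil expenditures might be computed; the district names, dollar figures, and the choice of a simple pre-SIG average as the baseline are all illustrative assumptions:

    # Illustrative sketch only: percentage change in per-pupil expenditures
    # after the SIG award, relative to an average of the pre-SIG years.
    # District names and dollar amounts are hypothetical.

    baseline_years = ["2007-08", "2008-09", "2009-10"]  # pre-SIG school years
    sig_year = "2011-12"                                # second year of SIG funds

    per_pupil = {
        "District A": {"2007-08": 9800, "2008-09": 9900, "2009-10": 9500, "2011-12": 11200},
        "District B": {"2007-08": 8400, "2008-09": 8600, "2009-10": 8300, "2011-12": 9100},
    }

    for district, spending in per_pupil.items():
        baseline = sum(spending[year] for year in baseline_years) / len(baseline_years)
        pct_change = 100 * (spending[sig_year] - baseline) / baseline
        print(f"{district}: baseline ${baseline:,.0f} -> {sig_year} "
              f"${spending[sig_year]:,}, change {pct_change:+.1f}%")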

3. Will EQ6 examine how total school budgets (federal, state, and local funding) change?

Yes. School budgets collected each year from case study schools will provide information about how total budgets change over the years 2011-2013.  We also plan to examine how total district budgets change over time before and after the influx of SIG funds.  We will examine total per-pupil expenditures for each district in which a case study school is situated in years 2008-09, 2009-10, and 2011-12 to understand, as context, whether SIG funds provide a significant increase in overall funds or simply replace funds that have been recently cut.

4. Please clarify how the 25 core schools and 20 special topics schools will be selected from the 60 schools and how the 60 schools will be selected from the universe.

See Questions 11 and 22 below for details about selecting the 60 schools from the universe. Extant information from document review and data from initial principal telephone interviews in winter 2011 in all 60 schools will inform the selection of the 25 core case schools. The selection of the 25 schools will be an iterative process, similar to the process through which the study team selected the sample of 60 schools. That is, the first step will be to prioritize specific variables that are most likely to ensure a balanced sample that will achieve the study's analytic goals, and in doing so, reduce the sample to approximately 40 schools. In the second step, the study team will focus on specific contextual variables that will be addressed during the principal interviews and will favor rich variation in the sample.

Step 1: Reduce sample based on key variables:

  • School levels (will seek 13 elementary and 12 high schools)

  • Nesting of multiple schools within districts (20 of the 25 schools should be in districts with at least one other case study school)

  • Urbanicity (the distribution should reflect the distribution within the full sample of 60 schools)

  • Intervention model (all models should be represented, in approximate proportion to their distribution in the full sample of 60 schools)

  • The extent to which SIG funds will change funding at the school level (e.g., whether SIG funds replace other funds or are additional, and the range in SIG funding)

Step 2: Select 25 schools ensuring a rich composition of interventions, reform history, and other contextual variables.

  • Student demographics (majority African-American, majority Hispanic, mixed ethnic groups, other)

  • Contextual influences (prior to SIG funding), e.g., prior reform efforts; fiscal resources; school autonomy in making key decisions and discretion to allocate funds in school budget

  • Leverage points for school change (e.g., whether priorities focus on changes in curriculum and instruction vs. improving professional development opportunities for teachers)

  • Type of external support provider (e.g., a provider from a CMO vs. an independent consultant)

  • Experienced vs. new principal


The special topics schools will be selected based on whether they fit the special topic. For example, schools with high concentrations of ELL students are a special topic, so we will select the 10 schools from among the 60 base-sample schools with the highest concentrations of ELL students.
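To illustrate the two-step winnowing described above, here is a minimal sketch; the candidate records, field names, and ranking heuristic are hypothetical stand-ins for what is, in practice, an iterative and judgment-driven process:

    # Illustrative sketch only (not the study's actual procedure). Candidate
    # records and the ranking heuristic are hypothetical; the real selection
    # iteratively balances all of the variables listed above.
    from collections import Counter

    # Each candidate school: (name, level, district); other criteria omitted.
    candidates = [
        ("School 1", "elementary", "District A"),
        ("School 2", "high", "District A"),
        ("School 3", "elementary", "District B"),
        # ... remaining base-sample schools
    ]

    # Step 1: favor schools nested with another candidate in the same
    # district, reducing the pool toward approximately 40 schools.
    district_counts = Counter(district for _, _, district in candidates)
    pool = sorted(candidates, key=lambda s: district_counts[s[2]], reverse=True)[:40]

    # Step 2: fill the level quotas (13 elementary, 12 high). In practice this
    # step also balances demographics, context, and the other Step 2 variables.
    core = ([s for s in pool if s[1] == "elementary"][:13]
            + [s for s in pool if s[1] == "high"][:12])
    print(f"Selected {len(core)} core case study schools")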



5. The package suggests it is a possibility that the different schools (core and special topics) may overlap. Wouldn't it be better to maximize the number of schools from which we are getting detailed implementation data?

Although we recognize the rationale for limiting the overlap among schools in the three subsamples, some overlap seems unavoidable. First, the set of core case study schools is intended to include variation on a number of important variables, including school level, geography, SIG intervention model, urbanicity, and student demographics. Although we have already identified the first "special topic" of schools with high concentrations of English Learners (ELs), we do not want to exclude high-EL schools from the core case study sample. Moreover, we do not yet know the second special topic; one of the unique features of this study is that we will identify a topic that is of interest to the policy audience. This could be a focus on rural schools, high schools, a particular intervention model, or schools in a particular district reform context. Because we do not know this topic, we cannot select a core case study sample in a way that would eliminate all chances of overlap among the subsamples.

6. Has the study team done any piloting of the feasibility of getting the types of expenditure data necessary for the study from districts at the district and school level?

AIR's team has collected expenditure files from districts in other studies (e.g., Chambers et al., 2009; Chambers et al., 2000) and will follow similar practices to request these files from the districts selected for the current study, incorporating lessons learned such as requesting files immediately when they become available, talking to the district staff person who is most familiar with the files, and making sure to explicitly request site codes. Study staff have already worked with districts and schools with whom they have maintained existing relationships and will collect examples of school budgets (overall and SIG) to prepare for site visits and to use in site visit training.

7. On page 14, the package describes that the study will analyze State applications to learn about SIG policies. Is it reasonable to assume the applications will reflect the program as actually implemented?

We cannot make an assumption that the state SIG applications will reflect the program as actually implemented; rather, the applications reflect the intent of the State. In the analysis of State SIG applications, and as stated in the Baseline Data Report, one of the key research questions is: What SIG-related policies and practices do states intend to implement based on their SIG applications (emphasis added)? The study team has been careful to note how states plan to determine LEA capacity, how they intend to monitor LEA progress, and how they plan to support SIG implementation. In follow-up data collections (e.g., site visits), we will be able to better assess whether their intent based on the applications matched up with actual implementation.

8. The answer to efforts to identify duplication needs to discuss the relationship between the samples and the data collection activities for this study, the integrated ARRA evaluation, and the RTT/SIG impact study.

All studies of the ARRA programs have a vested interest in limiting response burden on stakeholders while ensuring data of appropriate depth and reliability. The study team for the Study of School Turnaround is working closely with the RTT/SIG Impact Study team to limit overlap; this coordination is facilitated by the fact that AIR and Mathematica are contractors on both studies. The study teams anticipate little overlap among respondents for these two studies.

First, the Study of School Turnaround seeks to begin data collection in the spring of 2011, while the RTT-SIG Impact study will not begin data collection until the spring of 2012, thus avoiding one year of simultaneous data collection efforts.  With regard to the respondent groups, there is limited overlap, primarily at the state and district levels.  In the few cases in which there is overlap, the study teams may conduct joint interviews: that is, a researcher from one study team will conduct the interview while a representative from the other study team will listen, only adding questions as necessary to address study requirements.  Because both studies will be probing issues related to state SIG policies, the joint interview format should be effective and efficient.

State

  • Study of School Turnaround: Interviews with state officials with primary responsibility for SIG in 6 states.

  • RTT/SIG Impact Study: Interviews with state officials with primary responsibility for RTT and SIG in 50 states + DC.

  • Extent of overlap and proposed solution: In the spring of 2011, only the SST will conduct state-level interviews. For subsequent study years, the study teams will conduct joint interviews.

District

  • Study of School Turnaround: Interviews with district officials in the districts in which the case study sub-samples are nested; at most, 23 districts in 6 states.

  • RTT/SIG Impact Study: RTT sample: interviews with administrators in 3 districts in all states except DC and HI, which do not have districts. STM sample: interviews with administrators in a subset of districts in 25 states.

  • Extent of overlap and proposed solution: Because the SST will only be in 23 districts in 6 states, the majority of the RTT/SIG district data collection will not overlap. However, it is possible that there will be overlap with the large districts in the SST sample (possibly including Philadelphia, Miami-Dade, Cleveland, San Francisco Unified, and Los Angeles Unified). In these few cases, the study teams will coordinate to ensure that interviews are conducted jointly, or we will avoid interviewing the same individuals.

School

  • Study of School Turnaround: Interviews with principals in 60 schools nested within 6 states; survey of teachers in the same 60 schools; interviews and focus groups with stakeholders in subsamples of schools.

  • RTT/SIG Impact Study: Survey of principals in 1,200 schools nested in 25 states (STM sample).

  • Extent of overlap and proposed solution: If the 25 RTT/SIG states overlap with the 6 SST states, then it is possible that a few of the SST-sampled schools will be included in the RTT/SIG set of schools. Although the RTT/SIG study will seek to avoid duplication to the extent possible, the design imperatives may require some overlap. In such cases, the contractors will again coordinate interviews; the SST will conduct interviews in the fall, while the RTT/SIG team will do so in the spring. The RTT/SIG study is not collecting teacher-level data, so there is no overlap with the other school-level data collections.



With regard to the integrated ARRA evaluation, some overlap among respondents is inevitable, given that the ARRA evaluation is collecting data from state officials, district administrators, and school principals as a nationally representative sample from all 50 states.  However, the topics of data collection and the data collection strategies are notably different.  For example, the integrated ARRA study will administer a closed-ended survey to state officials that focuses on issues related to standards, assessments, evaluating educator performance, recruiting and retaining teachers, strategies for improving low-performing schools, reform priorities and finances.  While the state-level interview for the Study of School Turnaround also addresses strategies for supporting SIG schools, the interview probes more deeply into issues related to the state context, decision-making processes with regard to SIG, the selection of intervention models, the role of external support providers, state support and monitoring, and challenges.  Thus, data collection for the Integrated ARRA study and the SST should not be perceived as duplicative by respondents.

9. What is the rationale for seeking sets of 2 to 3 schools nested within districts (Part B pg. 16)?

The study team chose to nest two to three schools within a single district for a number of reasons. The primary motivation was to maximize our ability to understand the district’s role in the school change process. Having multiple SIG-funded schools per district provides more points of reference within a district and enables analysts to better distinguish between variables that are associated with the school itself and those that are associated with the district context. Additionally, from a practical perspective, traveling to multiple districts each with only one SIG-funded school would be a much less efficient use of resources than clustering schools within a smaller number of districts.

10. We are interested in learning about middle schools but we understand there are limitations to including them in the study. Can you clarify further why including all three levels compromises analyses?

The decision to include all three levels of schooling—elementary, middle and high—would add yet another constraint on our sample selection process. Our objective is to allow for enough variation in our sample (in terms of states, turnaround models, urbanicity etc.) to be able to reach meaningful conclusions about each subset of schools—elementary and high. If middle schools were to be included with elementary and high schools, then it would become difficult to conduct meaningful analyses about any level of schooling given our existing sampling criteria and the fact that we can draw a sample of only 60 schools. We feel the more effective approach is to focus on only two levels of schooling, but in greater depth. The sample is clearly not meant to be representative, but is purposively selected to incorporate adequate variation on the few key variables we have identified.

11. Please clarify how you intend to take concentration of SIG schools in the state into account (Part B page 18).

The concentration of SIG schools in a given state (essentially, the number of SIG-awarded schools within a given state) has implications for both funding levels and the degree to which the state education agency can focus attention, supports, and monitoring efforts on given schools. Thus, as one of the sampling criteria, the study team considered the number of SIG-awarded schools in each state. In doing so, the study team prioritized states with higher numbers of SIG-awarded schools for several reasons. First, states with more SIG-awarded schools are also more likely to have schools clustered within districts (see question 9, above) and are more likely to capture multiple school levels and models. In addition, states with multiple SIG-awarded schools are more likely to have replacement schools, should some refuse to participate in this study. However, the study team also sought to include states with fewer SIG-awarded schools, to provide variation in the state context and support for SIG.

12. Will the interviews of district and school administrators be the only source of detailed data on specific uses of funds (Part B pg. 20)?

No.  In addition to interview information (which will provide a wealth of information on the strategies behind expenditure decisions), the study team will collect budgets from each case study school in the fall of each year, in conjunction with the site visit.  Study staff will also collect audited district expenditure files for school year 2011-12 to understand district- and school-level expenditures and to compare to the school budgets.

Also, please share the results of the pilot test mentioned on pg. 21.

Under Section 4: Expert Review and Piloting Procedures on pg. 28, the study team included a more detailed description of piloting the interview protocols, excerpted below:

"Following publication of this OMB submission for public comment, the study team conducted piloting interviews with district officials, principals, and teachers in three states. Study team members asked these educators and administrators to react to (1) the overall organization, flow, and length of the interview, (2) the clarity of the interview wording and language, (3) specific questions that were unclear or difficult to answer, and (4) any recommended changes. The study team then used the responses from these pilots to modify the protocols to make them as suitable as possible for the actual site visits."

Following are comments from district officials, principals, and teachers, and how the study team modified the protocols.

District (overall)
Comment: Pointed out that we use "chronically low performing" sometimes and then "persistently low-achieving" other times. He suggests we be consistent; "low-performing" might have broader connotations.
Study team response: We made the terminology consistent and use "low-performing" throughout the protocols.

District, Q3
Comment: Rather than use "some" schools, might want to refer to the particular set of schools that they mentioned in an earlier question.
Study team response: We modified "some" schools to "these" schools.

District, Q3-4
Comment: Wondered whether it might be useful to ask what they were already doing for their chronically low-performing schools before SIG. He thinks question 3 reads as an assumption that nothing was going on before SIG.
Study team response: We did not change the wording of the question but broke it out into two questions.

District, Q3
Comment: In the probe, not just district policies; might want to say something like "district policies, practices, or contractual agreements."
Study team response: We added "practices and/or contractual agreements" to the probe.

District, Q6
Comment: Suggested that we add "Why or why not?"
Study team response: Done.

District, Q7
Comment: Suggested that "For distributing these funds across identified schools" should come later under Use of Funds.
Study team response: We changed the wording to "Were there any SIG-eligible schools for which the district did not seek funding?"

District, Q7
Comment: In the probes, suggested adding the probes that are in question 8 about who made decisions, whether the state provided guidance, and whether schools and/or the community were involved. In his specific district, schools were involved in decision-making. Suggested deleting the probe on whether the strategy was similar to or different from the district's general approach to allocation of funds, as it is asked later.
Study team response: Done.

District, Q8
Comment: Suggested adding "community" in the probe.
Study team response: Done in both questions 7 and 8.

District, Q8
Comment: Suggested adding to the probe whether the district representative or the district had any prior experience with reconstitution.
Study team response: Done.

District, Q17
Comment: Suggested adding "In any low-performing schools?"
Study team response: Done.

District, Q20
Comment: Suggested changing union "policies" to "contracts."
Study team response: Done.

District, Q21
Comment: Suggested pressing district leaders about "how do they know" school leadership is strong enough.
Study team response: Added "How do you know?"

District, Q21
Comment: Suggested that district leaders will be inclined just to say yes. Better to ask about the leadership challenges in the school(s) and how the leader or leadership team at the site is equipped and planning to address those.
Study team response: Added probes to address these suggestions.

District, Q26
Comment: Suggested adding "measure" along with how the district will monitor SIG schools' success.
Study team response: Done.

District, Q27
Comment: Describe the process for deciding and approving.
Study team response: Added probe: "What was the district's strategy for distributing these funds across the identified schools?"

District, Q27
Comment: Listen for any school involvement.
Study team response: Added a probe to address this suggestion.

Principal, Q2
Comment: Re students, it might be a good idea to ask or see if the principal will answer questions about the diversity of the school population (ethnicity, socioeconomic status, ELL, SPED). Also, whether students in the school have a sense of the value of an education.
Study team response: We added to the "things to listen for": diversity of the student population (ethnicity, socioeconomic status, ELL, SPED). We did not include sense of the value of an education.

Principal, Q3
Comment: Maybe add a probe [or separate question]: How do you support faculty and staff members in the change? I believe this is a very important part of "change"; if the teachers do not buy into it, or are not supported properly, little change will happen.
Study team response: We added a probe: supporting the faculty and staff members in the change process.

Principal, Q4
Comment: There is a lot of talk around the idea that principals have a very hard time getting out of their office to visit classrooms because of a variety of issues. Books on the subject say that to be an effective instructional leader a principal must have a firm understanding of what is going on in classrooms on a day-to-day basis. A principal must be visible to both students and teachers. A probe might be to ascertain what percentage of their day/job is spent on instructional focuses and what percentage is taken up by the "external environment."
Study team response: We added probes on the percentage of the day/job spent on instruction and learning and on the external environment, and on how often they get into classrooms in a typical week.

Principal, Q7
Comment: Suggest asking how much autonomy the school has from the district to make decisions the school feels will help its particular school/kids. All schools are unique, and in a lot of districts all schools use the same curriculum even though it might not be good for a particular school's needs.
Study team response: We added a new question about school autonomy and for what kinds of decisions.

Principal, Q8-9
Comment: There are two BIG questions here, and they are the "meat" of the interview. I would separate these two questions and make them individual questions. Ask about why he/she believes the school is low-achieving first. Then ask the question about what needs to be done to increase student scores.
Study team response: We separated out these two questions.

Principal, Q9
Comment: Suggested adding parents to the probes. Maybe have a probe about parent involvement: how do parents support their children in school; communication with teachers and the principal; do they attend important school-wide meetings and parent conferences.
Study team response: We added a probe on parental involvement.

Principal, Q25
Comment: Suggested adding a probe: Were you involved in making decisions regarding how SIG funds should be spent in your school? How is this different from how expenditure decisions are usually made, if at all?
Study team response: We added this probe.

Principal, Q26
Comment: Suggested adding probes: Do you believe you have enough funds to ensure improved academic achievement? Do you believe funds are being used efficiently and effectively?
Study team response: We added these probes.

Teacher (overall)
Comment: T1: With these questions and with the probes, she said, you can easily get the answers. The wording is appropriate. All items are understood. T2: She asked whether the teachers will know about SIG, and if not, some questions might need more clarification. She thinks the school context questions are not too long. T3/coach: She looked over the teacher interviews, coach interview, and teacher focus group protocols. She didn't find any of them difficult to understand or answer. Overall, she thought the protocols were good, easy to understand, addressed most of the right issues, and answerable by teachers. She thinks the school context questions are important and not too long.

Teacher, Q1
Comment: T2: She thought we might want to add examples to the probe about capacities: dept. chair, coach, technology mentor.
Study team response: We added these to the probe.

Teacher, Q7
Comment: T2: This question about the needs of students who are struggling opened up a "can of worms" for her. She said that it really depends on who is in the teacher's class, and which class if they teach a number of classes. There could be a variety of students in the class, e.g., GATE, ELL, at-risk, special ed, regular, etc.
Study team response: We added: [If teacher has multiple classes, s/he can refer to each class].

Teacher, Q8
Comment: T2: She did not know what "wraparound services" means.
Study team response: We added: [or community-based intervention].

Teacher, Q11
Comment: T2: Without any examples of the school's goals, this question seemed ambiguous to her. She wanted some examples: are we talking about academic, discipline, dress code, faculty, admin.? T1: We should include a question about the standards within the school, both academic and behavioral, and whether teachers think the standards promote or hinder success. She would like to know if the school has high expectations. In her experience, administration is failing and they do not hold students to any standard.
Study team response: We added probes: Where do the goals come from and how do they affect what you do in the classroom? Do the goals include academic and behavioral standards? Does the school have high expectations of its students?

Teacher, Q13
Comment: T1: Important question (leadership). T2: She thinks we should also listen for "backing" the teacher when appropriate, e.g., if a parent calls and complains about a student's grade, not just automatically change the grade but check in with the teacher first.
Study team response: We added "support you as a teacher" in the question and "spending sufficient time in classrooms" in the probes.

Teacher, Q15
Comment: T3: Suggested adding a question about the most important factor in school improvement.
Study team response: We added this question about core improvement strategies or approaches.

Teacher, Q16
Comment: Principal suggested adding: Does the staff have time in their day for common planning time? Do they have adequate time for preparation of lessons for the differentiation of instruction? T3: She suggested asking teachers about extra time they might have to take for training, planning, implementing programs, and additional paperwork.
Study team response: We added to the Use of Time probe: for training, common planning time, implementing programs, etc.

Teacher, Q16
Comment: T2: She felt that the guidance counselor support was missing in the improvement strategies.
Study team response: We added "additional counselors" to the probe on staff (additional positions; support staff).

Teacher, Q23
Comment: T2: This question was not clear to her, and she wanted to re-word it to something like: "Are you happy here? [if yes] What do you like about this school? [if not] What examples can you give that would make teaching at this school better?"
Study team response: We changed the question to: What do you like about this school? Can you provide me with examples that would make teaching better at this school?



13. Does the study have any mechanisms for determining if other funds are reallocated by the district to other schools given the infusion of SIG funding into certain schools (see also question 2 above)?

Please see response to question 2 above.

14. Is the base year for fiscal data collection early enough to detect changes resulting from the recession and budget cuts (Part B pg. 24)?

Upon further discussion, the study team has determined it would be prudent to collect fiscal data for 2007-08 as well as the later years, especially since we are already collecting fiscal data, so the burden would be virtually unchanged. Thus, the study will collect expenditure files for school years 2007-08, 2008-09, and 2009-10 (the three years prior to SIG) and 2011-12 (the second year of SIG funds). The effects of the downturn (following the financial crisis of the fall of 2008) will have started at different times in different states, and there was a lag between the start of the recession and the impact on state and local budgets. Because the study team cannot predict the timeline of the economic decline in all sampled districts, it is preferable to collect expenditure files from 2007-08 onward and then empirically determine which year would be an appropriate baseline for study purposes.

15. Will the surveys of State administrators get at any changes to teacher licensure or certification systems, teacher evaluation systems, or teacher distribution efforts that are part of the policy context in which SIG is being implemented?

Several states have changed their teacher evaluation/certification systems in response to RTT or other external pressures. Such changes potentially intersect with SIG in a number of ways. First is the extent to which the staffing changes implied or required by the models are facilitated (or impeded) by some of these teacher policies. Second, to the extent that these policies actually reach the school level by the time SST data collection is complete, they may have implications for the broader talent management strategies and practices in the schools. The study team has added a probe, in broad terms, about these teacher policies to question 10 of the state administrator interview protocol, which concerns whether the state needed to make (or is considering making) any systematic changes in state policies to implement SIG. Questions about teacher evaluation systems are included in the district administrator and principal interview protocols, and we will add specific probes about licensure/certification systems and teacher distribution efforts to both of these protocols. For example, rules about tenure may prevent replacement of some staff, or the standard ways teachers are assigned to schools or can transfer from one school to another might affect SIG implementation.

16. We are concerned that these school personnel are going to be pretty overwhelmed by the magnitude of their reform efforts, coupled with the study requirements. We think these schools will be of policy interest to those working on a variety of K-12 topics. Has IES reached out to all other offices/entities within ED, and ideally beyond, to identify any other data collection needs so that they can be incorporated as efficiently as possible without undertaking additional data collections later? Alternatively, is ED willing to be judicious and discourage other program offices from "hitting up" these schools separately in the next few years?

In addition to the coordination efforts described in response to question 8, IES has sought to communicate with other ED offices to ensure this study addresses multiple needs. Toward that end, the Title III offices added funds to the study in the summer of 2010 to ensure data collection on issues related to English Learners in SIG schools. In addition, staff from OESE and OPEPD attended the study TWG meeting in January 2011 and will continue to be included in important study communications.

17. Data collection

a. Consistent with many other longitudinal IES studies, please provide the introductory information and how it will be presented to teachers at the study's onset so that they can "buy in" to the full study and understand what will or may be asked of them over the duration.

The study director and principal investigators will initiate communication with the district administrators, and senior staff assigned to each school will be responsible for recruiting each school and responding to questions regarding participation in the study. At the school level, the first contact between the study team and the school will be with the principal. As appropriate, the study team will follow up with email and telephone communication to provide detailed responses to the principal's questions. With regard to teachers, the first information about the study will be transmitted through the principal or when the online survey is first administered. An introductory email preceding the administration of the web-based survey will provide information about the data collection activities, the study contractors, and the study purpose, and will include a brief statement about data safeguards. In addition, at the beginning of each interview and focus group, study staff will provide an overview of the study and will carefully respond to all questions that teachers may have about the study, data collection activities, and how their data will be used.

b. Please provide more detail on how all focus group participants will be recruited and screened, especially the youth. If there are screening instruments, those should be provided.

The study team has not developed screening instruments for this study.

Student Focus Groups: The study team will send parents of high school students under the age of 18 a letter asking for their permission to allow their child to participate in a focus group. Students selected to participate who are 18 or older will sign a consent form at the start of the focus group. Students will be selected to participate based on their school schedules and school staff recommendations.

Teacher Focus Groups: The study team will not randomly select teachers for focus groups. We will provide guidelines to the principal and school coordinator for the site visits for the selection of teachers from core subjects and other subjects such as vocational education, arts, and foreign languages. Focus groups should also include teachers of English as a Second Language and teachers with primary responsibility for special education students. We will explicitly ask the school to select teachers who have a variety of perspectives and experiences with the reform strategy.

Parent Focus Groups: The study team will not randomly select parents for focus groups. We will provide guidelines to the principal and school coordinator for the site visits for the selection of parents of students in a range of grade levels, with varying degrees of involvement with their children’s education and participation in school activities.

18. Incentives – The logic underlying the proposed uses and non-uses of incentives is not clear to us. First, there's the notion that the study is mandatory to grantees, therefore 100% participation is anticipated of everyone except teachers. We see the mandatory nature as more relevant to teachers than to students and parents. We also see the burden on teachers of a focus group after school hours as potentially more problematic than a 10-minute survey, so we don't understand why you would feel compelled to incentivize the latter and not the former. Further, we do not consider $10 a "token" for a low-burden survey, particularly if it is prepaid. This amount seems unnecessary. We also don't understand how a gift card can be "included with this questionnaire" if it is delivered via the web.

Please see response to question 19. We plan to conduct quantitative analysis using the teacher survey data so it will be essential to have a high response rate from teachers. The incentive is identified as a token of our appreciation for their taking the time to support the research effort. The underlying principle used to justify payments to study participants is that the payment ought to correlate closely to the participants’ labor market wages. According to the latest data available from the Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wages, May 2008, the average hourly wage of elementary school teachers is $40.81 (http://www.bls.gov/oes/current/oes252021.htm). Given the incentive payment is for a 10-15 minute online survey, $10 seemed an appropriate incentive level.

The survey contractor will provide teachers with an electronic gift card, which will be made accessible to them once successful completion of the web survey is confirmed. The survey contractor determined that a prepaid approach (sending a prepaid gift card with the questionnaire) was not the most cost-effective use of study funds because most communications with teachers will be through electronic means. With electronic gift cards, once a completed survey has been verified through the web, the survey contractor emails the respondent a claim code number in a gift card template, which can be personalized with the study logo. The respondent goes online to the gift card vendor (i.e., Amazon.com or icardgiftcard.com), redeems the value of the card, and can apply it to purchases from any number of vendors available through the gift card provider. The survey contractor is then able to track whether the gift card is downloaded and can maintain an audit trail for the use of the respondent payments. The survey contractor is currently negotiating with these two potential vendors of electronically accessible gift cards. Hard copy cards can also be sent through these services to respondents who do not wish to access an electronic gift card.

With respect to the focus groups, there are different ways of establishing rapport and incentivizing participation. Indeed, a skilled focus group moderator or interviewer can effectively communicate the importance of a participant's contributions to the study, express interest in the teacher's feedback, and foster a conversational context that is enjoyable, rather than burdensome, for respondents. Moreover, the focus groups will include only a handful of teachers in a school and are not meant to be representative, so we do not expect a problem with assembling focus groups.

19. Please clarify why a high teacher response rate is important. Is ED planning to produce statistics from these surveys that will be released as part of the public reports? If so, we would like to see more detail on how teacher lists will be obtained, maintained, and updated, how movers will be treated, etc., over the life of the study.

The teacher survey is only one data collection strategy in the Study of School Turnaround, but it has a specific purpose: to collect data on topics for which data from the full set of teachers is critical to understanding the construct in question. Indeed, many topics will be measured with data from interviews and focus groups; for these, a sample of teachers and other stakeholders is sufficient. However, to explore principal leadership, for example, teachers may feel more comfortable responding honestly through a survey, rather than expressing their views in a focus group setting. Moreover, teachers’ opinions about principal leadership may vary among subsets of teachers. To accurately measure such variation, and to limit systematic patterns of non-response, the study team has determined that a high response rate is necessary.

The teacher survey will collect information from all of the elementary and high school teachers in the sampled schools in the winter of each study year. The results provide a snapshot at the time of the survey; no effort is made to follow teachers longitudinally. The survey will be completed by teachers in the school at the time of data collection. Teachers will be provided an access code and instructions on how to complete the survey through a web-based platform. Teachers will be alerted by email and regular postal service about the survey—its objectives, sponsorship, and the information needed to access and complete the survey.

Teacher rosters (containing current teacher names, email addresses, and grade and subject area) will be obtained from each school during each data collection year and serve as the teacher survey contact list for each school. Each year’s teacher survey will be conducted with the teachers who are employed at the school at that time, so teachers who have moved will not be followed. Each year will be treated as an independent data collection with analysis conducted at the school level.

20. Confidentiality –

a. Pledge language -- We have a well-established approach to citing the ESRA confidentiality pledge and would like this study to stick with that approach. The pledge language in SS A 9 and on all consent materials and questionnaires should be changed to include the use statement ("solely for research purposes" or the like) and should exclude the phrase "in a confidentiality manner." Also, strike the term "anonymous" from consent forms since none of the collections are anonymous.



In response to this concern, we have made the following edits to all of the interview and focus group protocols:

In the introductory text on each protocol, we have revised the passage to read as follows:

Confidentiality

I want to assure you that all information obtained today will be treated in a manner that carefully protects your privacy, in accordance with the Education Sciences Reform Act of 2002, Title I, Subsection (c) of Section 183. Only selected research staff will have access to data. We will NOT present results in any way that would permit them to be identified with you or any other specific individual. No personally identifiable information, such as your name or your district or school affiliation, will be disclosed to anyone outside the project. We also will not share what you and I discuss with other people in this district. Our study will identify the states that we visit, but will not disclose the names of the districts or schools in each state. Also, you should know that your participation is voluntary, and you do not have to respond to any questions you do not want to. Please let us know at any time if you would prefer not to participate.

I'd like to ask you to sign a consent form before we begin. It outlines some of the issues I've just mentioned with regard to privacy considerations. Please take a minute to read it and let me know if you have any questions.

In the consent forms, we have revised the following passages:

Purpose

The Institute of Education Sciences (IES) of the U.S. Department of Education (ED) requests clearance for the data collection for the Study of School Turnaround (SST). The purpose of the study is to document over time the intervention models, approaches and strategies adopted and implemented by a subset of schools receiving federal School Improvement Grant (SIG) funds. To this end, the evaluation will employ multiple data collection strategies. The data collected will be used solely for research purposes.

Privacy Considerations

We will treat the information that you supply in a manner that protects your privacy, in accordance with the Education Sciences Reform Act of 2002, Title I, Subsection (c) of Section 183. Only selected research staff will have access to data. We will NOT present results in any way that would permit them to be identified with you or any other specific individual. No personally identifiable information, such as name or district/school affiliation, will be disclosed to anyone outside the project.

b. Please clarify what level and type of information is being provided to individual schools, districts, or states. We are concerned that if schools are given school-level results of the teacher surveys, there would potentially be disclosure issues. Related, what exactly are state and other public officials being told about how their names and other direct identifiers are being protected? We want to be sure that this is not overstated given the difficulties noted in providing confidentiality. Related, SS B3 says that part of the teacher nonresponse follow-up involves contacting the schools. We want to be sure that school administrators are not "middle men" in the data collection process in any way.

With regard to schools, the study team has revised the approach originally described in the OMB submission. The study team will not be providing any feedback to the schools aside from the publicly-released study reports and associated research briefs. School stakeholders will not receive school-level survey results.

At the state level, the study team will not divulge the names of the districts or schools to be included in the study sample. In some cases, particularly in states in which there are fewer SIG-awarded schools, state officials could likely determine at least some of the sampled schools. However, the study team will not identify districts or schools in any study reports, no individual names or job titles (other than very generic job titles) will be used, and distinguishing features of a school or district will be masked. As noted, it is very difficult to mask states; however, the study team will not identify individuals within state education agencies, nor will quotes be attributed to specific state officials.

With regard to survey follow-up activities, school administrators will not be “middle men” in the data collection process. The primary strategy for prompting non-respondents will be email communication, through which all non-respondents are sent a reminder email with the survey link. In cases of persistent non-response, the survey administrators will follow up by telephone (directly to the non-respondent teachers, not via the principal). As a last resort, the survey administrators will send a paper-and-pencil survey to teachers to be returned via U.S. Postal Mail. In no cases will the principal be asked to act as an intermediary.

21. In multiple places (e.g., Exhibit 5, footnote 1; SS A 16, where "will be submitted" is probably to ED, not to OMB; SS B1, 6th paragraph; …), the document is written in the "contractor's voice" and should be written to OMB from the department's perspective.

These requested changes have been made.

22. In SS B 1, there is no discussion of the universe size or how the sample will be selected. We understand that it will be purposive and designed to meet a range of criteria such as elementary and secondary, but how cost and data collection practicalities and early study findings fit in is not specified.

The universe of schools for this study is the full set of all SIG-awarded schools nationally, from which the study team has selected 60 schools. The American Institutes for Research compiled a database of SIG-eligible and SIG-awarded schools. Information on SIG-eligible schools was obtained through the state SIG applications for all 50 states and the District of Columbia, currently available on the ED Web site (http://www2.ed.gov/programs/sif/summary/index.html). Data on SIG-awarded schools, including school names, selected intervention models, and award allocations, were derived from information available through the SEA Web sites. As of February 3, 2011, 49 states and the District of Columbia had provided information on SIG awards to LEAs and schools (information for Hawaii is currently unavailable). Availability of specific data elements differed across these states: for instance, data on selected intervention models were available for 48 states and the District of Columbia and total award allocations for 43 states and the District of Columbia.

From the universe of SIG-awarded schools in the 49 states and the District of Columbia, the contractors identified 60 schools—30 elementary and 30 high—in 23 districts from six states. Potential schools were identified through the SIG database and then narrowed down based on a set of sampling criteria designed to yield a base sample of schools that would provide informative, yet varied, data.

In selecting the 30 elementary and 30 high schools within the six states, we sought to optimize variation on several key dimensions, including model distribution, urbanicity, school size, percentage increase over base per-pupil funding associated with SIG, and student demographics. In general, we sought to reflect the distribution among the full universe of eligible SIG-awarded schools within the constraints of managing the number of districts.

Specific exceptions and decision rules are as follows:

  • Nesting of schools within districts: As discussed in response to question 9, there are both methodological and practical reasons for selecting multiple SIG schools within a given district. With regard to the practicality issues mentioned in the question above, conducting site visits to two or three schools in a given district will limit travel time and expense, as well as reduce the overall number of district officials from whom the study must collect data.

  • Oversampling of restart schools: Because there are very few restart schools (5 percent of all eligible elementary schools and 2 percent of all eligible high schools), we determined that we would need to oversample restart schools.

  • Oversampling of ELL students: While we sought to reflect the student demographic distribution among eligible SIG-awarded schools, we recognized the need to oversample schools with at least 25 percent ELL enrollment to ensure enough schools for the ELL special topic sub-sample.

  • Distribution of case study sample schools across states: Sample schools were selected in proportion to the number of eligible SIG-awarded schools in the six states.


Certain features of the sample drove the sequence of steps in identifying the 60 schools. For example, because of the small number of restart schools nationally (and among the six selected states), restart schools and the districts in which they are located were the first set identified for possible inclusion in the sample. Next, the study team identified districts in which there are schools with high concentrations of ELLs (i.e., at least 25 percent). Third, districts with large numbers of eligible SIG-awarded schools were selected to encompass clusters of SIG schools and to reflect the high proportion of urban schools among SIG-awarded schools. After the initial triage of schools, the identification of the final set became an iterative process of seeking a balance among urbanicity, SIG models, and student demographics.
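As a schematic illustration of this sequence (a sketch rather than the study's actual code; the records and the district-size threshold are hypothetical, and only the 25 percent EL cutoff comes from the text above):

    # Schematic sketch of the sequenced triage described above.
    # Each record: (school, district, model, pct_ell, eligible_in_district).
    schools = [
        ("School 1", "District A", "restart", 10.0, 4),
        ("School 2", "District B", "transformation", 30.0, 2),
        ("School 3", "District C", "turnaround", 5.0, 9),
        # ... remaining eligible SIG-awarded schools in the six states
    ]

    shortlist, seen = [], set()

    def take(matching):
        """Add schools to the shortlist, skipping any already selected."""
        for school in matching:
            if school[0] not in seen:
                seen.add(school[0])
                shortlist.append(school)

    # 1. Restart schools first (rare: ~5% of eligible elementary schools,
    #    ~2% of eligible high schools).
    take([s for s in schools if s[2] == "restart"])
    # 2. Schools with high EL concentrations (at least 25 percent).
    take([s for s in schools if s[3] >= 25.0])
    # 3. Districts with large numbers of eligible SIG-awarded schools
    #    (the threshold of 5 here is hypothetical).
    take([s for s in schools if s[4] >= 5])
    # 4. The final balancing of urbanicity, models, and demographics is
    #    iterative and done by the study team, not by a rule like those above.
    print([s[0] for s in shortlist])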

23. SS B 5 should also list the IES contacts for the study.

The section now includes IES contacts for the study.

24. On the teacher survey, we note that all of the subparts to questions 7 and 8 are written in the positive, rather than the typical mixed approach used in question 6 and elsewhere. Please explain why this is the case and how you have confidence that you will avoid straight-lining.

The majority of the sub-items on items 7 and 8 of this study’s teacher survey come directly from a teacher survey that researchers developed at the Consortium on Chicago School Research and have been well-validated in the field. Aside from the prior evidence of validation of these items, there are strong survey design reasons to justify this approach.

Cognitive survey researchers are well aware of the fact that respondents are cognitive misers. They will satisfice (that is, choose the first reasonable response rather than the best response). They may not read items carefully. Accordingly, the provision of an underlying logical structure (e.g., the higher the level of agreement, the better things are) will help to minimize respondent errors due to poor reading. This approach reduces cognitive burden, allowing respondents to use a simple rule (more agreement is better) to indicate their judgments. Moreover, it would be unwise to write some of these items in the negative, because agree/disagree items should never have a negatively worded stem. The cognitive complexity associated with having to say “I disagree that ‘The principal does not place the needs of children ahead of personal and political interests’” is much greater than saying “I agree that ‘The principal places the needs of children ahead of personal and political interests.’”

Next, let us consider the potential problem of “straight lining.” This methodological challenge assumes, rightly, that respondents will do the least amount of cognitive work necessary. That is, they will skim the items and just mark the same response in each column by drawing a straight line down the column – or, in the case of an online survey, clicking all check-boxes in a given column. However, there is another, more problematic risk: in their haste, respondents can easily misread the item and mark a response other than the intended response. It is our belief that the incidence of “recording errors” will be greater than the incidence of straight-lining.

So, how does one reduce the probability of respondent recording errors? By providing a response structure so that “strongly agree” is always the best and “strongly disagree” is consistently the worst. This reduction in cognitive load may also allow the respondent to focus more of their cognitive resources on judgment formation (that is, answering the question), leading to more valid data.

Finally, someone who wishes to straight-line probably is not reading and considering each question very carefully, and so it is not clear that changing the response orders or question framing would do much to significantly reduce the incidence of straight-lining.
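For context, a screen for straight-lining in the completed surveys could look like the following minimal sketch (respondent IDs and codings are hypothetical; this check is an illustration, not part of the study design):

    # Hypothetical post-hoc screen: flag respondents who give the identical
    # response to every item in an agree/disagree battery (e.g., items 7-8).
    responses = {
        # respondent_id -> coded responses (1 = strongly disagree ... 4 = strongly agree)
        "t001": [3, 3, 4, 3, 2, 3, 4, 3],
        "t002": [4, 4, 4, 4, 4, 4, 4, 4],  # no variance: possible straight-liner
        "t003": [2, 3, 3, 4, 3, 2, 3, 3],
    }

    flagged = [rid for rid, items in responses.items() if len(set(items)) == 1]
    print("Possible straight-liners:", flagged)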




