OMB: 1850-0838

A STUDY OF THE EFFECTIVENESS
OF A SCHOOL IMPROVEMENT INTERVENTION/
DATA COLLECTION
(Study 2.1d)

OMB Clearance Package Supporting Statement

Part A: Justification


Regional Educational Laboratory

for the

Central Region


Contract #ED-06-CO-0023



Elisabeth A. Palmer, Ph.D.
ASPEN Associates, Inc.




Submitted to:

Institute of Education Sciences
U.S. Department of Education
555 New Jersey Ave., N.W.
Washington, DC 20208

Submitted by:

REL Central at
Mid-continent Research for Education and Learning
4601 DTC Blvd., #500
Denver, CO 80237
Phone: 303-337-0990
Fax: 303-337-3005





Project Officer: Sandra Garcia, Ph.D.

Project Director: Louis F. Cicchinelli, Ph.D.


Deliverable #2007-2.3/2.4

July 22, 2007




© 2007

This report was prepared for the Institute of Education Sciences under Contract
#ED-06-CO-0023 by Regional Educational Laboratory Central Region, administered by Mid-continent Research for Education and Learning. The content of the publication does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government.

TABLE OF CONTENTS

A. Justification
   Introduction
   Overview
   1. Circumstances Necessitating the Data Collection
   2. Purposes and Uses of the Data
      Purpose
      Research Questions
      Data Collection
      Student Data
      Teacher Data
      School Data
      Implementation Fidelity and Contamination Data
   3. Use of Improved Information Technology to Reduce Burden
   4. Efforts to Identify and Avoid Duplication
   5. Impacts on Small Businesses and Other Small Entities
   6. Consequences to Federal Programs or Policies if Data Collection is Not Conducted
   7. Special Circumstances
   8. Federal Register Comments and Persons Consulted Outside the Agency
   9. Payments to Respondents
   10. Assurances of Confidentiality Provided to Respondents
   11. Justification for Questions of a Sensitive Nature
   12. Estimates of Hour Burden of Data Collection
   13. Estimates of Total Cost Burden to Respondents
   14. Estimates of Annualized Cost to the Federal Government
   15. Reasons for Changes or Adjustments in Burden
   16. Tabulation, Analysis, and Publication Plans and Schedule
      Data Analysis
         Cluster Consideration
         Attrition Considerations
         Pre-intervention Analyses
         Descriptive Statistics
         Assumptions/Outliers/Data Treatment
         Analysis of Attrition
         Analysis of Confounding Events, Cross-Over, and Contamination
         Primary Analysis: Main Effects
         Secondary Analysis: Proximal Outcomes
      Reporting
         Study Report Preparation
         Public- or Restricted-Use Data Files
         Dissemination
   17. OMB Expiration Date
   18. Exceptions to Certification Statement

References

A. JUSTIFICATION

Introduction

This submission requests approval for a data collection plan for a study of the effectiveness of a school improvement intervention. The project is sponsored by the Institute of Education Sciences within the U.S. Department of Education and will be conducted by the Regional Educational Laboratory for the Central Region (REL Central) (contract #ED-06-CO-0023), administered by Mid-continent Research for Education and Learning (McREL). The study, which is being conducted by ASPEN Associates, Inc., will examine the impact of a comprehensive approach to school improvement.

Overview

A Study of the Effectiveness of a School Improvement Intervention will be conducted by ASPEN Associates, Inc., a small business subcontractor to REL Central. This section provides an overview of the study design and the planned data collection.

A Study of the Effectiveness of a School Improvement Intervention will address the growing concern over how to assist the increasing number of schools failing to meet the adequate yearly progress provision of the No Child Left Behind Act (NCLB). The intervention to be studied, Success in Sight, is a widely used comprehensive approach to school improvement developed by McREL. Success in Sight has been field-tested in various settings; however, there is no experimental evidence regarding its efficacy when fully implemented.

This randomized controlled trial will examine the effectiveness of Success in Sight in 52 elementary schools with low to moderate student achievement. Random assignment will occur at the school level, with 26 schools assigned to the treatment group and 26 to the control group. This study will specifically examine the impact of Success in Sight on student achievement and on school practices associated with school improvement. The primary data collection will include a teacher survey assessing school improvement practices (data-based decision-making, practices associated with improved student achievement, shared leadership, and purposeful community) and student achievement data. Data collection will occur over a two-year period. Hierarchical Linear Modeling (HLM) will be used to determine the effects of Success in Sight on school-level student achievement and school-level reform practices.

1. Circumstances Necessitating the Data Collection

The Regional Educational Laboratories (RELs) are authorized under the Education Sciences Reform Act of 2002 (Pub. L. 107-279), Part D, Section 174 (20 U.S.C. 9564), and are administered by the Institute of Education Sciences’ National Center for Education Evaluation and Regional Assistance (see Exhibit A for authorizing legislation). The primary mission of the RELs is to serve the educational needs of each region, using applied research, dissemination, and technical assistance to bring the latest and best scientifically valid research and proven practices into school improvement efforts.

The national priority for the current REL contract is addressing the goals of the No Child Left Behind Act of 2001, which states that “each state shall establish a timeline for adequate yearly progress. The timeline shall ensure that no later than 12 years after the 2001-2002 school year all students…will meet or exceed the state’s proficient level of academic achievement on the state’s assessments” (Title I, Sec. 1111(b)(2)(F)). State plans must include the criteria for designating schools as making adequate yearly progress and for being identified as in need of improvement. The plans must also address how states intend to close persistent achievement gaps between disadvantaged children and their more advantaged peers.

Low-performing schools need assistance to meet the many challenges they face: improving student achievement, closing the achievement gap, and bringing all students to proficiency in reading and mathematics by 2014. Given the stakes associated with student performance and the steady approach of the 2014 deadline, schools do not have the luxury of a trial-and-error approach to school improvement. Schools need a systematic method for implementing systemic change using programs that have been proven effective in improving student performance.

This study will respond to regional and national needs by examining the effects of participation in a widely used comprehensive approach to school improvement, Success in Sight, on student achievement. Although Success in Sight is a widely used, research-based approach to school improvement with evidence of its effectiveness from early field tests, no direct causal evidence of its effectiveness is available. This study will respond to the legislative intent of the Education Sciences Reform Act of 2002 regarding the role of the Regional Educational Laboratories in supporting activities that identify successful educational programs and make such information available so that such programs may be considered for inclusion in the national education dissemination system (Pub. L. 107-279, Section 174(g)(5)).

2. Purposes and Uses of the Data

Purpose. This study addresses the growing concern over how to assist the increasing number of schools failing to meet the adequate yearly progress provision of the No Child Left Behind Act of 2001. Specifically, this experimental study will evaluate the effectiveness of Success in Sight, a facilitated comprehensive approach to school improvement. Success in Sight is derived from years of research at McREL (e.g., Marzano, Pickering, & Pollock, 2001; Marzano, 2003; Marzano, Waters, & McNulty, 2005) and reflects McREL’s accumulated knowledge about the essential tasks that schools must undertake if they are to improve student performance. These essential tasks include specific school, teacher, and leadership practices that are associated with high levels of student achievement. Success in Sight focuses on helping schools understand and address these practices as they develop and implement plans that are tailored to site-specific needs for data-driven, standards-based education to promote short-term and long-term gains in student achievement.

Success in Sight focuses on the actions that make a difference in student achievement. It is unique in its emphasis on building school capacity to plan for, manage, and sustain change (i.e., to understand and navigate the change process). To implement Success in Sight, external change agents (“mentors”) guide schools through an iterative, five-stage cycle to help schools share leadership responsibilities, develop a purposeful community, and apply specific strategies for managing the differential impacts of change on members of the school community. Success in Sight is intended to be implemented over the equivalent of two nine-month school years (i.e., a two-year intervention).

Success in Sight has been field-tested in various settings; however, there is no experimental evidence regarding its efficacy when fully implemented. This randomized controlled trial will examine the efficacy of Success in Sight in 52 elementary schools (26 treatment and 26 control) with low to moderate student achievement (i.e., schools that are not making adequate yearly progress or are at risk of falling into that status) over a two-year intervention period. The data will be used by the U.S. Department of Education to assess the effectiveness of this comprehensive approach to school improvement.

Research Questions. Ultimately, we are interested in the impact of Success in Sight on student achievement as measured through test scores. However, Success in Sight is designed to influence school practices (such as data-based decision-making and shared leadership) that, in turn, positively influence achievement levels. As such, our primary research question will be:

  1. Does implementation of Success in Sight significantly improve student achievement?

This study will also examine the effects of Success in Sight on four proximal outcomes. These outcomes reflect Success in Sight’s theory of action, which asserts that changes in student achievement occur as a result of changes in school capacity to engage in school reform practices (i.e., mechanisms that create and sustain improvements in student achievement). The following secondary research questions address the impact of Success in Sight on these proximal outcomes:

  1. Does implementation of Success in Sight have a significant impact on the extent to which schools engage in data-based decision-making?

  2. Does implementation of Success in Sight have a significant impact on the extent to which schools engage in practices associated with improved student achievement?

  3. Does implementation of Success in Sight have a significant impact on the extent to which schools develop and maintain a purposeful community?

  4. Does implementation of Success in Sight have a significant impact on the extent to which leadership is shared in schools?

Data Collection. This study will involve the collection of data from students, teachers, and school leaders at both the treatment and control schools. As mentioned above, data collection will occur over a two-year intervention period, beginning in the spring of the 2007–2008 school year. Existing data to be used in the study include demographic data on schools and students and student achievement data from state-administered assessments. New data will be collected via a teacher survey, a standardized achievement test, and site visits. Table 1 provides a summary of the planned data collection for the study, including the purpose of data collection, data source, respondents, and timing of data collection. See also Exhibits B and C for copies of the data collection instruments and protocols and Exhibits D-H for copies of the memorandum of agreement, informed consent forms, and staff confidentiality forms.

Table 1: Data Collection Plan for the Study
(N = new data collection; E = existing data)

Purpose: Describe the study sample
  Data source: School demographic data publicly available from state database (E)
  Respondent: School
  Person collecting data: Research team obtains from state
  Timing: Spring 2008, Fall 2008, Spring 2009, Fall 2009

  Data source: Student demographic data from state database (E)
  Respondent: Student
  Person collecting data: Research team obtains from state
  Timing: Spring 2008, Fall 2008, Spring 2009, Fall 2009

  Data source: Teacher background data from teacher survey (N)
  Respondent: Teacher
  Person collecting data: Research team fields survey and collects data online
  Timing: Spring 2008, Spring 2009, Fall 2009

Purpose: Describe the nature of school improvement practices to assess implementation fidelity at treatment sites and similarity of activities at control sites
  Data source: Site visit interviews and focus groups (N)
  Respondent: Teachers, leadership teams, principals, mentors
  Person collecting data: Research team conducts interviews and focus groups
  Timing: Spring 2008, Spring 2009, Fall 2009

Purpose: Estimate differences in student achievement (main outcomes) between the treatment and control groups
  Data source: State assessments in reading and mathematics (E)
  Respondent: Student
  Person collecting data: State administers regularly scheduled assessments
  Timing: Spring 2008, Spring 2009

  Data source: Standardized achievement tests in reading and mathematics (N)
  Respondent: Student
  Person collecting data: Site coordinators administer test
  Timing: Fall 2009

Purpose: Estimate differences in school improvement practices (proximal outcomes) between the treatment and control groups
  Data source: Teacher survey (N)
  Respondent: Teacher
  Person collecting data: Research team fields survey and collects data online
  Timing: Spring 2008, Spring 2009, Fall 2009

This study has been evaluated against the guidelines for quality, utility, integrity, and objectivity as outlined by OMB (Office of Information and Regulatory Affairs, 2006a and 2006b) and NCES (National Center for Education Statistics, 2003). Based on that evaluation, this proposed data collection will result in information that will be collected, maintained, and used in a manner consistent with the information quality guidelines of OMB and IES.

Student Data. Student demographic and student achievement data will be collected for this study. Student demographic data will be used to examine the student sample characteristics. Student achievement data will be used to examine the effects of the intervention on the main outcome, student performance. With the exception of the standardized achievement test administered at the end of the intervention, all student data are collected by the state as a matter of course. As a requirement for participation, participating districts will be asked to authorize the release of such data from the state to the research team. The research team will provide all required assurances for complying with the Privacy Act of 1974. This includes the use of a memorandum of understanding that outlines the roles and responsibilities of participating districts and schools (see Exhibit D), informed consent forms for teachers (see Exhibit E), informed consent forms for school staff participating in the site visits (see Exhibit F), informed consent forms for parents of students being tested (see Exhibit G), and staff confidentiality agreements (see Exhibit H).

In Minnesota, the state in which we have proposed to conduct this study, student demographic data such as ethnicity, language, eligibility for free and reduced-price lunch, ELL status, and special education status are collected by the state department of education as a matter of course and will be available at no cost to the research team.

The baseline and first follow-up measures of student achievement will come from the state assessments of reading and mathematics, which are administered every spring under the annual testing requirements of the No Child Left Behind Act. The Minnesota Comprehensive Assessments — Series II (MCA-IIs) are the state-administered reading and mathematics tests that help districts measure student progress toward Minnesota's academic standards and meet the requirements of No Child Left Behind. They are used to determine whether schools and districts have made adequate yearly progress toward all students being proficient in reading and mathematics by 2014. Reading and mathematics tests are given in grades 3-8, 10, and 11. These data will be available to the research team at no cost. Copies of the MCA-II assessments are not included in the exhibits of data collection protocols as the items are not publicly released.

(Note: Although McREL is collecting student achievement data from the state assessment, it is not requesting clearance for this particular data collection as OMB does not consider achievement tests or assessments administered by the state as a matter of course to be a public burden. Rather, information on the state student assessments is included here to provide a complete picture of the study.)

The only new student data collection required for this study will be the administration of a standardized achievement test at the end of the intervention. This additional assessment is required to provide the second and final follow-up measure of student achievement within the study's follow-up period, which begins in spring 2009. At the end of the intervention, the Northwest Achievement Level Tests (NALT), a nationally normed test, will be administered for reading and mathematics in grades 5, 6, and 7. The NALT is particularly useful as a follow-up measure in a study where the baseline and initial follow-up measures are the state assessment, because it is a nationally normed standardized achievement test specifically aligned and highly correlated with the Minnesota state standards. The NALT group norms, collected in 2002, include scores from more than 1,000,000 students. The NALT reading and mathematics tests are two separate tests administered online. Together, the two tests take approximately 90 minutes for students to complete. One of the participating districts already administers this assessment as a matter of course every fall (i.e., at the time the intervention ends); consequently, the only new student data collection will be testing in the districts that do not currently administer this standardized assessment. Site coordinators will facilitate student completion of this achievement test online. The NALT data will be available to the research team at a minimal cost for data retrieval. Copies of the NALT assessments are not included in the exhibits of data collection protocols as the items are not publicly released.

Three cohorts of students will be included in this study (see Table 2). At the beginning of the study, Cohort A will be third graders, Cohort B will be fourth graders, and Cohort C will be fifth graders. These cohorts were selected to take into account the typical configuration of elementary schools in the districts being targeted in the study. In one of the districts, the elementary schools serve grades K-6; in the other, grades K-8. As shown in Table 2, data from the state tests of reading and mathematics will be obtained for grades 3 through 6, while the assessment data at the end of the study will come from the nationally normed standardized tests of reading and mathematics.

Table 2. Student Cohorts (grade level at each data collection)

         Spring 2008              Spring 2009                  Fall 2009
         Baseline:                First Follow-Up:             Final Follow-Up:
Cohort   State Assessment         State Assessment             Standardized Achievement Test
A        Grade 3                  Grade 4                      Grade 5
B        Grade 4                  Grade 5                      Grade 6
C        Grade 5                  Grade 6                      Grade 7

Teacher Data. As shown in Table 1, all teachers in the treatment and control schools will be asked to complete a teacher survey at baseline (spring 2008) and in the spring and fall of 2009 (see Exhibit B for a copy of the teacher survey). This survey provides a teacher report of school improvement practices related to the four proximal outcomes for this study: data-based decision-making, shared leadership, purposeful community, and other effective school practices such as parent involvement and a safe and orderly climate. Written permission to use these copyrighted questions was obtained from the authors. Questions related to school practices are worded to reflect the school, not the teacher, as the unit of analysis. Additional questions on the background of participating teachers will be used to describe the teacher sample (e.g., years of teaching, highest degree earned). The survey will be administered online and takes approximately 25 minutes to complete. Site coordinators will assist with coordination of the online teacher survey. Data from the teacher survey will be downloaded by the research team, which will post and manage the online survey.

School Data. School demographic data will be collected to examine the characteristics of schools in the sample. These school-level data will be developed by aggregating the student-level demographic data, as noted above (e.g., percentage of students receiving free or reduced price lunches, student mobility). As noted above, these data are collected by the state department of education as a matter of course and will be available at no cost to the research team.

Implementation Fidelity and Contamination Data. In the spring of 2008, and in the spring and fall of 2009, two members of the research team will make site visits to each of the treatment and control schools to assess the implementation fidelity at treatment schools and similarity of activities at control schools. Each site visit will include semi-structured interviews that will be conducted with principals (treatment and control), leadership teams (treatment schools) or other school leaders (control schools), and mentors (treatment schools). Each site visit will also include a focus group with a cross-section of teachers (treatment and control schools). Interview and focus group questions are designed to elicit evidence of schools’ engagement in the key activities involved in the intervention being studied (see Exhibit B for Site Visit Protocols). Supporting artifacts will be requested, as available, as further evidence to support responses to interview and focus group questions. Site coordinators will assist with scheduling for the half-day site visits and the selection of teachers for the focus groups.

3. Use of Improved Information Technology to Reduce Burden

Five general strategies will be used to minimize the reporting burden for participants:

  • Online Teacher Survey. The online nature of the teacher survey will allow participants to complete the data collection more quickly because they will not have to manage paper documents or mailing. The protocols for the online teacher survey reflect research on how to conduct online surveys to maximize response rates (e.g., three-wave postings, including an initial posting and two follow-ups) while addressing potential disadvantages of Web-based administration. The use of online technology for data collection allows teachers to complete surveys at their convenience; however, past experience of researchers involved in this project suggests that having all teachers complete the survey at a predetermined time (e.g., during an initial planning meeting or faculty meeting) may be necessary. This option will be discussed with individual schools. If teachers are allowed to complete the survey at their convenience, high response rates are expected to be obtained through early notification of the survey timelines and of the secure nature of the responses (i.e., security certificates and special logins for the study); e-mail announcements that the survey is posted online, sent directly to the staff (via the school contact to avoid deletion as “spam”); automatic reminders (one week after posting to all participants and three weeks after posting to nonrespondents); and further follow-up with nonrespondents as needed (including notifying the school contact of their school's response rate). The teacher survey also will be programmed to accommodate the most common Web browsers for both PC and Mac operating platforms, and the smallest screen sizes typically in use (so that no scrolling down is required to view questions). In following these protocols, response rates of 80% or more have been readily obtained by the study team from similar populations. (An illustrative sketch of this contact schedule follows this list.)

  • Online Achievement Testing. The standardized achievement test administered at the end of the intervention is given online as a matter of course in one of the participating districts and will be administered in the same manner in the other participating school districts for the purposes of this study. Online administration of this student assessment will reduce the coordination required of the school relative to paper-and-pencil assessments.

  • Online Access to Data Files. Online administration of the teacher survey and standardized achievement test allows the authorized members of the research team to obtain these data files without burden to the school district. The teacher survey data are housed on a secure server accessible only to the research firm. The testing company will provide secure and direct access to the student achievement data to the authorized research team.

  • Ongoing Communication with Participants via E-mail. The full schedule of data collection activities and timelines will be communicated at the beginning of the study, with reminders of each upcoming event sent two weeks in advance via e-mail and a four-week response window provided for the actual collection of data. This advance schedule, reminder, and response window structure will allow participants plenty of time to plan and to incorporate the data collections into their schedules. In addition, the research team will communicate with the site coordinator via e-mail on an ongoing basis regarding the study and any issues that arise.

  • Maintaining Electronic Participant Lists. Instruments have been designed to reduce response burden by focusing only on the information necessary to carry out the study. This includes participant data (e.g., identification number, name, grade level) provided by respondents, which will be maintained in a secure location. Maintaining these lists means that, for follow-up data collections, such information need be collected only from new participants (i.e., teachers and students who come to the school after the baseline data collection).
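
To make the contact protocols above concrete, the following sketch outlines the three-wave schedule (initial posting, a one-week reminder to all participants, and a three-week reminder to nonrespondents) and the four-week response window described in these bullets. It is illustrative only, written in Python; the function names, dates, and rosters are hypothetical and are not part of the study's actual survey system.

    # Illustrative sketch (hypothetical) of the three-wave online survey protocol:
    # initial posting, a reminder to all participants one week after posting, and
    # a reminder to nonrespondents three weeks after posting, within a four-week
    # response window.
    from datetime import date, timedelta

    def reminder_schedule(posting_date: date) -> dict:
        """Return the planned contact dates for one survey administration."""
        return {
            "initial_announcement": posting_date,
            "reminder_all_participants": posting_date + timedelta(weeks=1),
            "reminder_nonrespondents": posting_date + timedelta(weeks=3),
            "response_window_closes": posting_date + timedelta(weeks=4),
        }

    def needs_followup(roster: set, respondents: set) -> set:
        """Nonrespondents to target with the third wave and further follow-up."""
        return roster - respondents

    # Example: a survey hypothetically posted at baseline in spring 2008.
    for event, when in reminder_schedule(date(2008, 3, 17)).items():
        print(event, when)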

4. Efforts to Identify and Avoid Duplication

The purpose of this experimental study is to evaluate the effectiveness of Success in Sight, a comprehensive approach to school improvement. Although Success in Sight has been widely used and previously field tested by the developers, no rigorous experimental studies have been conducted to determine its effects. In addition, no other studies of this intervention are currently being conducted by other entities.

To the extent possible, the research team will utilize existing data to avoid duplicating data collection efforts. Existing data to be used in the study will include demographic data on schools and students and student achievement data from state-administered assessments. New data will be collected via an online teacher survey, a standardized achievement test administered at the end of the intervention in the districts that do not already administer it as a matter of course, and site visits.

5. Impacts on Small Businesses and Other Small Entities

Schools are the only small entities included in this study. The primary respondents in our study will be teachers, school administrators, other school staff who are members of school leadership teams, and students (see response to Item 12). Each of these groups will be asked to participate in data collection activities.

As noted in our response to A-4, all efforts will be made to utilize existing data and avoid duplicate data collection. To reduce response burden, schools and districts will not be asked to provide student data. Instead, data from the standardized student achievement tests will be obtained directly from the testing company, and existing state assessment data will be obtained directly from the state department of education. Student demographic data will also be obtained from the state department of education. Moreover, all data collection protocols have been designed to reduce response burden by focusing only on the information necessary to carry out the study. Response burden will be further minimized through the assistance of a designated site coordinator at each school who will assist the research team with data collection.

6. Consequences to Federal Programs or Policies if Data Collection is Not Conducted

In response to regional and national needs to comply with the NCLB requirement that all subgroups of students achieve to the same high level, this study will examine the impact of a comprehensive approach to school improvement in elementary schools with low to moderate student achievement. Increasing pressures on schools to improve the achievement of all students, close the achievement gap, and bring all students to proficiency in reading and mathematics by 2014 require the use of research-based programs and practices designed for the populations and settings in which they are implemented. Determining whether such programs have the potential to help schools raise student achievement requires the highest quality evidence of effectiveness. The Education Sciences Reform Act of 2002 charges the regional educational laboratories with providing such evidence by conducting rigorous research. Without the data and findings from this study, REL Central and the National Laboratory Network will be unable to disseminate scientifically valid research on the effects of this broadly used comprehensive approach to school improvement.

7. Special Circumstances

None of the special circumstances apply to this data collection.

8. Federal Register Comments and Persons Consulted Outside the Agency

We have published 60-day and 30-day Federal Register notices to allow public comment. The 60-day notice to solicit public comments was published in the Federal Register on May 30, 2007, with an end date of July 29, 2007 (see Exhibit I for Federal Register Notices). The 30-day notice to solicit public comments was published in the Federal Register on July 31, 2007, with an end date of August 29, 2007. Table 3 summarizes the public comments received and the response to these comments in terms of changes to the study procedures or instruments.

Table 3: Public Comments, Responses, and Subsequent Changes to the Study

60-day Notice
  Public comment: No comments received.
  Response: Not applicable.
  Subsequent changes to the study: Not applicable.

30-day Notice
  Public comment: --
  Response: --
  Subsequent changes to the study: --

In addition, throughout the course of this study, we have drawn and will continue to draw on the experience and expertise of a technical working group (TWG) that provides a diverse range of experience and perspectives as well as expertise in relevant methodological and content areas. The first meeting of the TWG was held from May 31 through June 2, 2006. The second meeting of the TWG was held from September 5 through September 7, 2006. The members of this group are:

  • Dr. Geoffrey Borman, Associate Professor, University of Wisconsin-Madison

  • Dr. Robert Boruch, Professor, University of Pennsylvania, Wharton School,
    Graduate School of Education

  • Dr. Robert D. (Robin) Morris, Vice President of Research, Georgia State University

  • Dr. Andrew Porter, Director, Learning Sciences Institute, Vanderbilt University

  • Dr. Robert St. Pierre, President, STP Associates

  • Dr. William Clune, Voss-Bascom Professor of Law Emeritus, Law School, University of Wisconsin-Madison

  • Dr. Norman Webb, Senior Research Scientist, Wisconsin Center for Education Research, University of Wisconsin-Madison

In addition, the research team has solicited input from the IES analytical and technical services contractor, Mathematica Policy Research, and from the IES team assigned to the project.

9. Payments to Respondents

As noted in item 12, we acknowledge that this study will place a data collection burden on teachers, school administrators, and other school staff who are members of school leadership teams. Consequently, participants in both treatment and control sites will receive partial compensation for participating in data collection that is over and above their regularly assigned duties at the school and occurs outside of the school day. For the proposed study, participant compensation will be $25/hour for teachers and $35/hour for administrators, payable annually at the end of the school year. We believe these rates to be reasonable in that they reflect the level of partial compensation noted in a review of OMB submissions for studies with a similar design and the current rates of compensation for educators in the state being targeted (see footnote 1 to Table 4). Moreover, the rates for participant compensation in the proposed study were set just below the averages noted in our review to further indicate that these payments are intended as partial rather than full compensation. Designated site coordinators who assist with scheduling the teacher survey, student testing, and site visits will also receive partial compensation of $25/hour for assisting the research team with coordination of data collection, to the extent that this assistance is over and above their regularly assigned duties at the school and occurs outside of the school day. There will be no remuneration for students in the study.

10. Assurances of Confidentiality Provided to Respondents

Per OMB guidelines, McREL and ASPEN Associates follow the confidentiality and data protection requirements of IES (Education Sciences Reform Act of 2002, Title I, Part E, Section 183). McREL and ASPEN Associates will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their institution but not to any individually identifiable information. No individually identifiable information will be maintained by the research team. All institution-level identifiable information will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. McREL and ASPEN Associates will obtain signed NCEE Affidavits of Nondisclosure from all employees, subcontractors, and consultants who may have access to these data and will submit them to their NCEE COR. Specific protocols for ensuring confidentiality and protection of data are described below.

District IRB. During site recruitment, information on each district’s IRB policies and procedures will be collected. The study’s Principal Investigator (PI) and research team members will adhere to all district and state IRB policies and procedures. District IRB applications will be submitted early in the study, during initial discussions with districts; once a potential site has expressed sufficient interest that its final decision warrants IRB review, the application will be submitted so that approval is in place by the time the study begins. The PI and authorized personnel from each participating district and school will sign Memoranda of Understanding (MOUs) to help ensure clarity of expectations and of roles and responsibilities (see Exhibit D for Memoranda of Understanding). Informed consent from participants will be sought after school principals have approved the conduct of the study in their school and signed a school MOU (see Exhibits E, F, and G for Letters of Consent). Consent from teachers, administrators, and other school staff will be active consent; consent from parents will be passive consent.

Informed Consent. Informed consent will be sought and obtained from each participating school staff member (teachers, school administrators, and other non-instructional staff members) who will be asked to complete the teacher survey (see Exhibit E) and participate in data collection during site visits (i.e., leadership team interview, principal interview, and teacher focus group) (see Exhibit F). Informed consent will also be obtained from the parents or legal guardians of students participating in the standardized achievement test administered at the end of the intervention (see Exhibit G). School staff and parent informed consent letters are written to clearly communicate the research purposes, procedures, risks, and benefits. The consent letters will assure participants that reports prepared for this study will summarize findings across the sample and will not associate responses with a specific school or individual, and that the study team will not provide information that identifies specific schools or individuals to anyone outside the study team, except as required by law. Also included in these letters are statements offering teachers and parents the opportunity to ask questions and to withdraw at any time. The parent letter is written at an eighth-grade reading level and will be provided in other languages as requested by districts or schools.

Letters of consent will also provide the documentation most districts need to comply with the Family Educational Rights and Privacy Act (FERPA, 34 CFR Sect. 99.31), as they inform parents of the release of student achievement records for the purposes of research and evaluation on district programs. These letters of consent are supplemental forms documenting that informed consent was obtained from all participants; they are not part of the data collection protocols.

OMB Confidentiality Language. All communication about the study, including letters of consent and data collection instruments, will include the following notice regarding confidentiality as suggested by OMB:

“Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law.”

Safeguards. In addition to following the OMB guidelines regarding confidentiality, as noted above, this study has been designed to protect against and reduce risks to participants’ confidentiality through the use of the following safeguards to store and secure data:

  • Responses to the online teacher survey will be stored on a secure server accessible only to ASPEN Associates. Access to the online survey data will be password protected and accessible to authorized members of the research team only.

  • Standardized achievement testing at the end of the intervention will occur online via a secure server operated by the testing company. Data will be made available by download to authorized members of the research team via password protected Web access.

  • Each school, teacher, and student participant will be assigned an ID number and all identifying information stripped from data files. No individually-identifiable information will be kept in the data files.

  • Participant-ID number lists will be kept in a locked and/or password-protected file and will not be released outside the authorized research team.

  • Informed consent forms will be stored separately from the responses to data collection, such that no link can be made between the consent forms and participant responses.

The PI will monitor the research team members’ compliance with the procedures identified above and report the status of compliance annually to McREL’s Institutional Review Board (IRB).

11. Justification for Questions of a Sensitive Nature

No questions of a sensitive nature will be included in the study.

12. Estimates of Hour Burden of Data Collection

This study design calls for new data collection from three different sources: an online teacher survey, a one-time standardized achievement test administered at the end of the intervention, and site visits (see Exhibit B for the teacher survey and Exhibit C for the site visit interviews and focus group protocols; the standardized achievement test is proprietary and thus not included as an Exhibit). Data will also be gathered from two existing data sources: student demographic and state achievement data will be obtained from the state department of education. In each of these activities there will be some burden placed upon individuals — students, teachers, school administrators, site coordinators, other school staff, and the state department of education. Table 4 presents the anticipated respondent burden associated with these activities.

The study sample includes a total of 6,995 unduplicated respondents. This 6,995 is the sum of 5,850 students; 1,040 teachers; 52 school principals; 52 site coordinators; and 1 state representative who each provide data. In some cases, a respondent may be asked to participate in more than one form of data collection (e.g., all teachers will be asked to complete a teacher survey and a subset of these will be asked to participate in a teacher focus group during the annual site visits). As such, the total number of responses to data collection noted below is greater than the total number of respondents (i.e., the total number of individuals responding).

Primary data collection will occur during 2008 (Study Year Three) and 2009 (Study Year Four). The total number of responses to data collection over the two-year study is 7,958, which as noted above, includes more than one data collection with some respondents. Of the 7,958 total responses to data collection, 71% will be gathered electronically.

When averaged across the two years of the study, the total annualized burden is 3,979 annual responses (7,958 total responses / 2 years), 6,452 hours (12,904 total hours / 2 years), and $52,098 ($104,195 total monetary burden / 2 years); this arithmetic is illustrated in the sketch following Table 4.

Given that data collection is not evenly distributed within each study year, the actual annual burden is as follows:

  • Year Three will include only baseline data collection in spring 2008, for a total burden across respondents of 1,217 hours and $30,715.

  • Year Four will include follow-up data collection in the spring and fall of 2009 (including standardized testing of students at the end of the intervention) and thus will have a greater respondent burden: 11,687 hours and $73,480.

Table 4. Estimated Respondent Burden

Data Collection Activity           Responses per     Number of    Time per     Total Hour   Hourly     Total Monetary
& Responsible Party                Data Collection   Collections  Response     Burden       Rate [1]   Burden
                                                                  (minutes)
1. Standardized Ach. Test [2]      5,850             1            90           8,775        NA         NA
   Coordinate Testing [3]          26                1            1,110        481          $25.00     $12,025
2. State Achievement Test [4]
   Obtain Data from State          1                 2            120          4            $0.00      $0
3. Student Demographic Data [4]
   Obtain Data from State          1                 4            60           4            $0.00      $0
4. Teacher Survey                  1,040 [5]         3            25           1,300        $25.00     $32,500
   Coordinate Teacher Survey [6]   52                3            30           78           $25.00     $1,950
5. Annual Site Visit
   Coordinate Site Visit [6]       52                3            60           156          $25.00     $3,900
   Teacher Focus Group             520               3            45           1,170        $25.00     $29,250
   Leadership Team Interview       364               3            45           819          $25.00     $20,475
   Principal Interview             52                3            45           117          $35.00     $4,095
Total                              7,958 [7]         --           --           12,904       --         $104,195

[1] A review of other OMB packages for similar whole-school studies revealed customary rates of participant partial compensation for data collection activities to be $26-$30/hour for teachers and $36-$48/hour for administrators. Studies reviewed included Reading First (Abt Associates, Inc., 2004); Longitudinal Analysis of Comprehensive School Reform Implementation and Outcomes (LACIO) (WestEd & Cosmos Corporation, 2006); and Trends in International Mathematics and Science Study (TIMSS) (Windwalker Corporation & Westat, Inc., 2005). Current rates of compensation in Minnesota are $27-$37 for teachers and $37-$47 for principals, with the higher averages representing the large, urban districts in the metropolitan area, which are the target populations for this study.

[2] One of the two targeted school districts already administers, as a matter of course, the standardized achievement test being used as the final follow-up measure. Thus, the respondent burden noted here reflects the additional time required of site coordinators in the district that does not currently utilize this test.

[3] Time involved in site coordination of student testing is 18.5 hours spread over a four-week period.

[4] The state administers a student assessment and gathers student demographic data as a matter of course. These data are available at no cost but do require state staff time.

[5] The anticipated response rate for the teacher survey, as reflected in the number of respondents, is 80%.

[6] Site coordinators assist with scheduling the online teacher survey, including forwarding initial and follow-up emails to teachers in their school, and with scheduling the site visit activities.

[7] The 7,958 total responses refers to the total number of data collection responses within the study sample of 6,995 individuals, some of whom participate in more than one data collection activity.
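
The hour and monetary burden figures in Table 4 follow directly from the arithmetic: responses per collection, times number of collections, times minutes per response, divided by 60, and multiplied by the hourly rate where one applies. The following sketch (Python; illustrative only, not a study deliverable) reproduces the table's totals and the annualized figures reported above.

    # Illustrative check of the burden arithmetic in Table 4. Each row:
    # (responses per collection, collections, minutes per response, hourly rate or None).
    rows = {
        "standardized achievement test": (5850, 1, 90, None),  # no rate (NA)
        "coordinate testing":            (26, 1, 1110, 25.00),
        "obtain state test data":        (1, 2, 120, 0.00),
        "obtain demographic data":       (1, 4, 60, 0.00),
        "teacher survey":                (1040, 3, 25, 25.00),
        "coordinate teacher survey":     (52, 3, 30, 25.00),
        "coordinate site visit":         (52, 3, 60, 25.00),
        "teacher focus group":           (520, 3, 45, 25.00),
        "leadership team interview":     (364, 3, 45, 25.00),
        "principal interview":           (52, 3, 45, 35.00),
    }

    total_responses = sum(n for n, _, _, _ in rows.values())
    total_hours = sum(n * c * m / 60 for n, c, m, _ in rows.values())
    total_cost = sum(n * c * m / 60 * r for n, c, m, r in rows.values() if r is not None)

    print(total_responses)        # 7958 responses (column sum, per footnote [7])
    print(total_hours)            # 12904.0 hours
    print(total_cost)             # 104195.0 dollars
    print(total_hours / 2)        # 6452.0 hours annualized over the two years
    print(round(total_cost / 2))  # 52098 dollars annualized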


13. Estimates of Total Cost Burden to Respondents

There will be no respondent costs associated with this data collection other than the hour and cost burden estimated in item 12. These estimates include the often “hidden” costs associated with data collection, such as the need for a contact person at the school to meet with the research team prior to and during data collection. In this study, this type of coordination is built into the role of the site coordinator, who assists the research team with scheduling and coordinating all data collection at the school. In addition, there will be no start-up costs associated with the data collection for this project.

14. Estimates of Annualized Cost to the Federal Government

The estimated cost to the federal government of conducting the Study of the Effectiveness of a School Improvement Intervention (Success in Sight) is approximately $5.8 million across the five years of the study. Of this, $3.2 million goes to the independent research firm conducting the study, ASPEN Associates; $1.5 million is the cost to implement the intervention, which goes to McREL; and $1.1 million goes to McREL for contract administration and oversight. Table 5 shows the annualized cost to the federal government as being highest in Years 3 and 4, when the intervention and data collection occur.

Table 5. Annualized Cost to the Federal Government (in $1,000s)

Activity                                                5-Year Total   Year 1   Year 2   Year 3   Year 4   Year 5
Semi-annual meetings with REL Directors and the
Department of Education; planning, development,
document review and revision; and consultations
with Mathematica and IES                                $164           $35      $35      $34      $30      $30
Consultation with Technical Working Group               $155           $31      $31      $31      $31      $31
Recruitment of sites                                    $36            --       $36      --       --       --
Design, IRB, and OMB approval processes                 $120           $60      $60      --       --       --
Baseline data collection and random assignment          $625           --       --       $625     --       --
Follow-up data collection                               $850           --       --       --       $850     --
Data analysis                                           $725           --       --       $200     $400     $125
Report preparation                                      $525           --       --       $75      $275     $175
Intervention materials, training, and implementation    $1,500         --       --       $750     $750     --
McREL contract administration and oversight,
distributed as follows:
  Communication with IES and study subcontractor        $440           $88      $88      $88      $88      $88
  Reporting                                             $220           $50      $40      $40      $40      $50
  Quality Assurance                                     $187           $19      $42      $42      $42      $42
  Overhead                                              $417           $19      $23      $143     $192     $40
Annualized Totals                                       $5,800         $267     $320     $1,995   $2,668   $550

15. Reasons for Changes or Adjustments in Burden

This is a new collection. Therefore, the entire burden is new.

16. Tabulation, Analysis, and Publication Plans and Schedule

This study will begin during the spring of the 2007-2008 school year. Success in Sight will be implemented in treatment schools during the 2007-2008 school year (spring), 2008-2009 (full year), and 2009-2010 (fall). Outcome data on school improvement practices and student achievement will be collected in both treatment and control schools. Intermediate and cumulative effects of the intervention will be analyzed using data collected through the end of the first year and over the course of the study. Table 6 presents the schedule for the major study activities.

Table 6. Schedule of Activities

Activity                                                    Schedule
Create District and School Pool for Site Selection/
Recruitment                                                 Spring-Fall 2007
District and School Recruitment (pending OMB approval)      Winter 2007
District IRB                                                Spring-Fall 2007
District and School MOUs                                    March 2008
Random Assignment                                           March 2008
Informed Consent                                            March 2008
Start Baseline Data Collection                              March 2008
Start Intervention                                          March 2008
Year 1 Follow-up Data Collection                            Spring 2009
Interim Report of One-Year Findings                         Fall 2009
Year 2 Follow-up Data Collection                            Fall 2009/Winter 2010
Final Report of Findings                                    January 2011

Data Analysis. The purpose of this study is to determine whether or not the Success in Sight whole-school intervention is effective in raising the academic performance of students in low- to moderate-performing schools by building school capacity for comprehensive school improvement.

The data collected will be analyzed to examine the impact of participation in Success in Sight. The primary emphasis of the data analysis will be to examine the effects of Success in Sight on the reading and mathematics achievement of the students after two years. Additional analyses will be conducted to examine the effects of Success in Sight on four proximal outcomes as well. Analysis of the effect of Success in Sight on both proximal and ultimate outcomes will be used to gauge the progress of the intervention and the effects after two years of participation in Success in Sight. Prior to data analysis, data files will be examined to ensure data quality. Data management and analysis procedures will be documented for quality control and reporting.

Cluster Consideration. Success in Sight is a school-level intervention and therefore requires the selection and assignment of schools to the treatment and control groups. Schools will be the unit of analysis. Consequently, this cluster randomized trial will involve randomization of schools within districts (i.e., blocking at the district level) and collection of outcome data at the level of the student and teacher, which will be aggregated to the school level (i.e., school means).
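
For illustration, the blocked assignment described above could proceed as in the following sketch (Python; the district and school identifiers are hypothetical, and this is not the study's actual randomization procedure). The sketch assumes an even number of participating schools within each district, so that the 26/26 split is preserved.

    # Hypothetical sketch of random assignment blocked at the district level:
    # within each district, half of the participating schools go to treatment
    # and half to control.
    import random

    def blocked_assignment(schools_by_district: dict, seed: int = 20080301) -> dict:
        rng = random.Random(seed)  # fixed seed keeps the assignment reproducible
        assignment = {}
        for district, schools in schools_by_district.items():
            shuffled = list(schools)
            rng.shuffle(shuffled)
            half = len(shuffled) // 2  # assumes an even count per district
            for school in shuffled[:half]:
                assignment[school] = "treatment"
            for school in shuffled[half:]:
                assignment[school] = "control"
        return assignment

    # Example: two hypothetical districts totaling 52 schools (26 per group).
    pool = {
        "district_1": [f"d1_school_{i}" for i in range(1, 29)],  # 28 schools
        "district_2": [f"d2_school_{i}" for i in range(1, 25)],  # 24 schools
    }
    groups = blocked_assignment(pool)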

To allow for this nesting of students and teachers within schools, hierarchical linear modeling (HLM) software will be used to generate more accurate statistical estimates. The advantages of HLM and its applicability to understanding schools are well known and documented in education research (Raudenbush & Bryk, 2002). A two-level hierarchical model will account for student- and school-level sources of variability in the analysis of main effects and teacher- and school-level variability in the analysis of proximal outcomes; in both analyses, estimates of treatment effects will be reported at the level of random assignment (i.e., school-level). The blocking will enhance the statistical precision of the impact estimates. It will also help avoid unbalanced distributions of treatment and control schools within districts that might lead schools to withdraw from the study. For example, in the absence of blocking, a district could, through simple random assignment, have many more control schools than treatment schools. In this case, a district might be less inclined to remain in the study than a district that feels its participation in the control group is balanced by equal participation in the treatment group, which is perceived as a tangible benefit.
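
In notation, a minimal form of the two-level model for the main (student achievement) analysis can be sketched as follows, where Y_ij is the achievement score of student i in school j and T_j indicates assignment to Success in Sight. This is a sketch of the general form only; covariates, pretest adjustments, and district block effects are omitted here.

    % Minimal two-level HLM sketch for the main-effects analysis.
    % Level 1 (student i within school j):
    Y_{ij} = \beta_{0j} + r_{ij}, \qquad r_{ij} \sim N(0, \sigma^2)
    % Level 2 (school j, with treatment indicator T_j = 1 for Success in Sight):
    \beta_{0j} = \gamma_{00} + \gamma_{01} T_{j} + u_{0j}, \qquad u_{0j} \sim N(0, \tau_{00})
    % The school-level impact estimate is \gamma_{01}.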

Final analyses will occur after an examination of and adjustment for sample differences (see Analysis of Attrition below) and basic descriptive analyses (including examination of outliers and variances and confirmation of reliability and validity) to determine data integrity given the planned analyses.

Attrition Considerations. This study of a comprehensive school improvement approach (i.e., whole-school reform) will have the school as its unit of analysis. Attrition of students is expected to occur naturally during the course of this study and has been accounted for in the anticipated sample size for each population within each school and in the inclusion of two different student samples. Sample A: Longitudinal Cohort comprises “stayers,” or students who remained in the same school throughout the study. Sample B: Typical School Cohort comprises “stayers plus in-movers,” students who remained plus students who moved into the school after the baseline year. Each sample reflects the natural fluctuation in the student population due to student mobility. In this particular study, it will be important to include “in-movers,” or students who moved into the school after the study began, as this is a common occurrence in any school and even more common in schools that may be perceived as making improvements. Being able to analyze the effects of the intervention on both of these populations will provide valuable information on the typically fluctuating school population, in addition to examining the effects on the more stable population of students who return year after year.

Likewise, teacher attrition is expected to occur naturally during the course of this study. The Success in Sight approach to school improvement recognizes this and is designed to show schools how to bring new teachers into the reform effort as part of the planned activities. McREL’s Technical Working Group (TWG) also discussed the related issue of “cross-over” effects and concluded that even if a teacher moved from a treatment school to a control school (or vice versa), this individual teacher would not be able to bring the same magnitude of change to the control school because the intervention is a school-wide reform effort, not an individual reform effort. Likewise, teachers (as well as students) might naturally want to move from a control school into a treatment school if the latter is perceived as “improving.” Again, the TWG concluded that this is a natural effect of school reform and that in both instances, the experimental design accounts for these issues.

Nevertheless, in an effort to reduce the amount of attrition during the study among teachers, both tangible and intangible benefits to participation are built into the study (see response to B-3). To reduce attrition at the school level, the unit of analysis, the study team will explore the potential for school closings or restructuring during the selection process (see section on Recruitment under response to B-1).

Pre-intervention Analyses. Data analyses will be conducted prior to the implementation of the intervention in order to compare the intervention and control schools on school improvement practices and student achievement. Any large differences between the two groups will warrant a review of the random assignment procedure and possible use of statistical methods to adjust for pre-intervention differences. The distribution of intervention and control schools within each district will also be checked prior to intervention.

Descriptive Statistics. Descriptive statistics will be produced for both groups on all instruments included in the study. Descriptive statistics (e.g., means, standard deviations, frequency distributions, item-total correlations, and internal consistency) will be used to examine the psychometric adequacy of all instruments. In addition, the mean and standard deviation of students’ achievement scale scores in the study sample will be compared to those of the target population in the state (i.e., low to moderate performing schools) to examine the degree to which the study sample is similar to the target population in terms of student achievement. Since the study is not based on a probability sample, any resulting differences will not be weighted as a corrective measure. Rather, this comparative information will inform the interpretation of findings by further elucidating the nature of the sample. Finally, descriptive statistics will also include information on sample sizes for each experimental group and for sub-groups within the experimental groups (i.e., ethnic subgroups). Characteristics of “leavers” and “in-movers” in the participating schools will also be described to provide a clearer picture of the two main student samples of “stayers” and “stayers plus in-movers.”
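
As one example of the internal consistency checks named above, coefficient (Cronbach's) alpha for a multi-item survey scale can be computed from the item variances and the variance of the total score. The following is a generic sketch with simulated data (Python with NumPy), not the study's actual analysis code.

    # Generic sketch: Cronbach's alpha for a k-item survey scale, where `items`
    # is an (n_respondents x k) array of item scores.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Example with simulated responses (100 respondents, 5 items, 5-point scale).
    rng = np.random.default_rng(0)
    simulated = rng.integers(1, 6, size=(100, 5))
    print(cronbach_alpha(simulated))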

Assumptions/Outliers/Data Treatment. Data will be examined for relevant statistical assumptions and will be compared to results from existing measures on comparable samples to check the reasonableness of the data. Where existing data are not available, results will be compared to previously collected data sets to check the reasonableness and stability of the data. The data will also be examined for outliers. Any treatment of the data to deal with violations of assumptions or outliers will be reported.

Analysis of Attrition. A number of analyses will be conducted in order to determine the rate of attrition. Of primary concern is attrition that results in the sample being below the minimum necessary to permit a sufficiently precise estimate of the effect size. Results will also be presented that show the percentage of individuals in each group for whom outcome data could not be obtained.

Analyses will be conducted and reported to examine the possibility of differential attrition. Analyses include the comparison of baseline results for the initial sample in each group with the baseline results for the sample of schools that complete the study, in order to determine if there are any systematic differences between those who complete the study and those who drop out. Particular attention will be paid to the number of low-performing schools that are lost from each group. These comparisons will help determine if the sample of schools that complete the study differs from the sample of schools that began the study. Large differences on important variables could be an indication of differential attrition.
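
To illustrate, one way to operationalize this check, assuming baseline scores and a completion indicator per school (hypothetical column names throughout), is a simple comparison of completers and leavers within each experimental group:

    # Sketch of a differential-attrition check: compare baseline scores of
    # schools that completed the study with those that dropped out, within
    # each experimental group. Column names are hypothetical placeholders.
    import pandas as pd
    from scipy import stats

    schools = pd.read_csv("school_baseline.csv")
    for group, df in schools.groupby("treatment"):
        stay = df.loc[df["completed"] == 1, "baseline_score"]
        left = df.loc[df["completed"] == 0, "baseline_score"]
        t, p = stats.ttest_ind(stay, left, equal_var=False)  # Welch's t-test
        print(f"group={group}: completers={len(stay)}, leavers={len(left)}, "
              f"t={t:.2f}, p={p:.3f}")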

Data will also be analyzed and presented from the implementation fidelity measures to determine the number of schools assigned to the intervention group that actually participated in the intervention. Analysis of the fidelity data will also help determine if the intervention was implemented in a manner consistent with its design.

No separate analyses of student and teacher attrition are proposed; rather, these forms of attrition are addressed by the research design. Given that this is a study of a comprehensive school improvement intervention (i.e., a school-wide initiative), attrition of students and teachers is expected. The attrition of students is accounted for by the inclusion of two student samples, each of which assumes 30% student mobility. Sample A: Longitudinal School Cohort represents students who remained in the schools throughout the study (i.e., “stayers”). Sample B: Typical School Cohort represents the typical mobility of students into and out of a school population over time (i.e., “stayers plus in-movers”). Teacher mobility is addressed by the manner in which the school improvement intervention purposefully socializes new teachers and in the sample size proposed for this study.

Analysis of Confounding Events, Cross-Over, and Contamination. Data will be collected using the Site Visit Protocols (interviews and focus groups) on the types of school improvement activities engaged in by treatment and control schools. These data will be useful in examining and describing any confounding events, cross-over, or contamination occurring among treatment and control teachers and schools. Confounding events that would likely affect the results of the study include the introduction of a new curriculum or other initiative that requires substantial resources (time, effort) on the part of the school. Cross-over occurs when teachers and students from treatment schools move into control schools, or vice versa. Contamination would be evident if teachers and students at the control schools were exposed to a low dose of the intervention; for example, when cross-over occurs and a teacher from a treatment school initiates a similar effort at the control school. To prevent these circumstances from occurring, the research team will communicate expectations for maintaining implementation fidelity to both treatment and control sites at the beginning of each school year during a study orientation and throughout the school year. Between site visits, the research team will check in with the site coordinator to inquire about and address any such issues as they arise. If evidence of confounding events, cross-over, or contamination is found, these circumstances will be fully described in the discussion of site activities in the study report and their implications noted in the discussion of outcomes.

Primary Analysis: Main Effects. Consistent with the random assignment of schools to either the intervention or control group, the school-level effects of assignment to the intervention on student achievement will be analyzed for both student samples. As noted above, intervention effects will be estimated using HLM to account for sources of variability of students nested within schools.

The school-level effects of assignment to Success in Sight after one and two years on average student achievement will be analyzed using a two-level hierarchical model. Two models will be run, one for reading and one for mathematics.

The Level 1 model will nest students within schools and will include the students’ grade level as a predictor. Student grade level will be coded into two variables to represent the three grade levels, with grade 3 as the reference grade level. The variable grade34 will code grade 3 as -1.0 and grade 4 as +1.0 (with grade 5 coded 0.0). Similarly, the variable grade35 will code grade 3 as -1.0 and grade 5 as +1.0 (with grade 4 coded 0.0). This coding will control for school-level differences in the proportion of students in each grade level and will allow the Level 2 intercept for overall school performance to be interpreted as the average performance of the third through fifth graders. The Level 1 model is specified as:

Yij = β0j + β1j(grade34)ij + β2j(grade35)ij + rij
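
A minimal sketch of this grade-level coding (illustrative data; the 0.0 codes for the non-referenced grade follow from the intercept interpretation described above):

    # Sketch of the grade-level effect coding, with grade 3 as the reference.
    import pandas as pd

    students = pd.DataFrame({"grade": [3, 4, 5, 4, 3]})  # illustrative data
    students["grade34"] = students["grade"].map({3: -1.0, 4: 1.0, 5: 0.0})
    students["grade35"] = students["grade"].map({3: -1.0, 4: 0.0, 5: 1.0})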

The Level 2 model will include an indicator for assignment to treatment or control school as a predictor of mean school achievement to estimate the effects of the intervention on student achievement. Student achievement is measured using the state assessment scale scores at baseline (the Level 2 covariate) and as the first outcome measure. The second student achievement outcome measure, administered immediately following the end of the intervention, comes from the NWEA standardized test. The NWEA assessment is based on the state standards and is highly correlated with the state assessment. As such, the Level 2 model is specified as:

β0j = γ00 + γ01(mean Baseline Assessment) + γ02(Treatment) + γ03(District) + u0j

The Level 2 model also includes group assignment and district as predictors. Group assignment will code treatment as 1.0 and control as 0.0. The district variable will code one district as 1.0 and the other as 0.0. If additional districts are included in the study, then additional district variables will be included in the model up to the number of districts minus one. Level 2 will also include a cluster level covariate (baseline achievement) to explain additional between-school variance not explained in the Level 1 model and to improve the power of estimation of the intervention’s effect (Raudenbush, Spybrook, Liu, & Congdon, 2006). The cluster level covariate represents each school’s average level of achievement at baseline on the standardized NWEA assessment. As such, it represents the school level achievement for the baseline cohort of all students in grades 3, 4, and 5.
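
By way of illustration only, a model of this general form could be fit as a linear mixed model with a random school intercept, which is a common way to express a two-level HLM in its combined form; the data file and variable names below are hypothetical placeholders, and this sketch does not represent the study’s specified software or procedure:

    # Sketch of the two-level model in combined form: Level 1 grade codes plus
    # Level 2 treatment, district, and baseline covariates, with a random
    # intercept for schools. Names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    data = pd.read_csv("student_outcomes.csv")
    model = smf.mixedlm(
        "reading_score ~ grade34 + grade35 + treatment + district + school_baseline",
        data=data,
        groups=data["school_id"],  # students nested within schools
    )
    result = model.fit()
    print(result.summary())

In this combined-equation form, the coefficient on the treatment indicator corresponds to γ02 in the Level 2 equation above.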

Teacher-level effects are not included in the analysis of student achievement to reflect the school-level nature of the intervention (i.e., a whole-school improvement process focused on organizational change, rather than on specific improvement strategies targeting a single classroom). Student achievement outcome data, in both the Level 1 and Level 2 models, will include data for all students for whom data are available at the time of the outcome measure. The outcome measures are the state assessments in reading and mathematics administered in the spring of 2008 (year 1), 2009 (year 2), and 2010 (year 3, if warranted).

The findings related to the impact of participation on student achievement will be interpreted in light of the descriptive analysis of implementation fidelity in treatment sites and the descriptive analysis on possible contamination in control sites.

Secondary Analysis: Proximal Outcomes. In addition to the analyses that address the ultimate outcomes of the intervention as outlined in the primary research question, this study will also examine the effects on the four proximal outcomes: data-based decision-making, effective practices, purposeful community, and shared leadership.

Consistent with the random assignment of schools to either the intervention or control group, the secondary effects of the intervention on school improvement practices will be analyzed at the school level. Intervention effects will be estimated at the school level using HLM to account for sources of variability in the nested structure of the school environment using separate two-level models for each of the four proximal outcomes.

The school-level effects of assignment to Success in Sight after one and two years on school improvement practices will be analyzed using a two-level hierarchical model. Four models will be run, one for each of the four proximal outcomes.

The Level 1 model will nest teachers within schools and will include years of teaching and teacher certification as predictors. Both measures are collected via the teacher survey. Years of teaching will be coded into a dichotomous variable that distinguishes new teachers (0 to 5 years) from more veteran teachers (more than 5 years of teaching). This coding will control for school-level differences in the proportion of teachers with less versus more teaching experience and will allow the Level 2 intercept for overall school improvement practices to be interpreted as the average extent to which teachers in the school engage in these practices across years of experience. Teacher certification will also be included in the Level 1 model. The categories of teacher certification will be coded to reflect the state definition of a “highly qualified teacher,” such that provisional and alternative certifications are appropriately categorized. The Level 1 model is specified as:

Yij = β0j + β1j(Years Teach)ij + β2j(Highly Qualified)ij + rij
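
A minimal sketch of this teacher-level coding (the certification categories and their mapping to the “highly qualified” definition below are hypothetical placeholders):

    # Sketch of the teacher-level (Level 1) predictor coding described above.
    import pandas as pd

    teachers = pd.read_csv("teacher_survey.csv")  # hypothetical file
    # New (0-5 years) vs. veteran (more than 5 years) teachers.
    teachers["years_teach"] = (teachers["years_teaching"] > 5).astype(float)
    # Collapse certification categories to the state "highly qualified"
    # definition; the mapping here is a hypothetical placeholder.
    hq_map = {"standard": 1.0, "provisional": 0.0, "alternative": 0.0}
    teachers["highly_qualified"] = teachers["certification"].map(hq_map)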

The Level 2 model will include an indicator for assignment to treatment or control school as a predictor of mean school improvement practice (e.g., use of data-based decision-making) to estimate the effects of the intervention on each of the four school improvement practices: data-based decision-making, effective practices, purposeful community, and shared leadership. Each of the school improvement practices is represented by a set of scaled items on the teacher survey indicating the extent to which teachers in the school engage in a particular practice. The Level 2 model is specified as:

β0j = γ00 + γ01(mean school improvement practice) + γ02(Treatment) + γ03(District) + u0j

The Level 2 model also includes group assignment and district as predictors. Group assignment will code treatment as 1.0 and control as 0.0. The district variable will code one district as 1.0 and the other as 0.0. If additional districts are included in the study, then additional district variables will be included in the model up to the number of districts minus one. Level 2 will also include a cluster level covariate (baseline engagement in the practice) to explain additional between-school variance not explained in the Level 1 model and to improve the power of the estimate of the intervention’s effect (Raudenbush, Spybrook, Liu, & Congdon, 2006). The cluster level covariate represents each school’s average engagement in a particular school improvement practice at baseline (e.g., use of data-based decision-making). As such, it represents the extent to which all teachers in the school, new and veteran alike, engage in the school improvement practice.
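
Because four parallel models are planned, one per proximal outcome, the fitting step from the earlier sketch can simply be repeated over the four outcome scales (again with hypothetical variable names):

    # Sketch: fit one two-level model per proximal outcome.
    import pandas as pd
    import statsmodels.formula.api as smf

    data = pd.read_csv("teacher_outcomes.csv")
    outcomes = ["dbdm", "effective_practices",
                "purposeful_community", "shared_leadership"]
    for y in outcomes:
        m = smf.mixedlm(
            f"{y} ~ years_teach + highly_qualified + treatment + district + {y}_baseline",
            data=data,
            groups=data["school_id"],  # teachers nested within schools
        ).fit()
        print(y, m.params.get("treatment"))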

Reporting. This section provides an overview of the reporting plan for this study.

Study Report Preparation. During 2009-2010, the study team will prepare a technical report consistent with IES technical standards. The report will fully explicate the rationale, study questions, research design, method, and results. Findings for each of the research questions will be presented, threats to validity will be considered and ruled out as appropriate, and conclusions about the research questions will be drawn based on these considerations. The report will be prepared such that it is appropriate for a peer-reviewed scholarly journal.

To accommodate the timing of the final data collection in the fall of 2009, the research team will prepare a cumulative report on the status of the study at the conclusion of the first year of data collection. This year-one report will include a complete set of all deliverables (reports, datasets, etc.) required for the final report. IES and MPR will be invited to critique these deliverables as if they were being submitted at the conclusion of the study. This will allow time for IES to approve the format and proposed content of all deliverables. In part, the purpose of this year-one report is to simulate the final technical report from the study that will be due prior to contract end. Thus, it will also provide McREL an opportunity to adjust the plan for submitting the final report while final data collection is under way. After the final data collection, the year-one report would simply need to be updated with the final data, analyses, and a synthesis of findings from the study as a whole.

At the conclusion of the study, a non-technical report will also be prepared that discusses the study rationale and summarizes the findings from the final technical report submitted to IES. This non-technical report will highlight selected conclusions given in the technical report and discuss implications of the study for education policy and practice.

Public- or Restricted-Use Data Files. Prior to data collection, the study team will begin to create a data codebook. Datasets will be created, updated, and managed for the duration of the study’s data collection and analysis efforts. The codebook will be updated throughout the study. The study team will consult with the Technical Working Group regarding the development of the dataset(s) and codebook(s). Upon completion of the study, the data file and a complete and accurate codebook will be finalized to document the public- or restricted-use data files. All participant identifiers will be removed from the final files.

Dissemination. Pending approval of findings, the study team will submit a version of the technical report for publication in a scholarly journal and for presentation at one or more research conferences, as appropriate. A non-technical report also will be prepared that discusses the study rationale, presents the research questions, and summarizes the findings. The report will summarize the study results and highlight selected conclusions. Implications of the study for education policy and practice will be discussed. Dissemination will be aligned with the dissemination activities described in McREL’s dissemination plan under the current regional educational laboratory contract.

17. OMB Expiration Date

Not applicable. No exemption from displaying the expiration date is being sought; the OMB control number and the expiration date of OMB approval will be displayed on all data collection forms.

18. Exceptions to Certification Statement

No exceptions to the certification statement are requested or required.

REFERENCES

Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2001). Classroom instruction that works. Alexandria, VA: Association for Supervision and Curriculum Development.

Marzano, R. J., Waters, T., & McNulty, B. A. (2005). School leadership that works: From research to results. Alexandria, VA: Association for Supervision and Curriculum Development.

National Center for Education Statistics. (2003). NCES statistical standards (Report No. NCES 2003-601) [Electronic version]. Washington, D.C.: U.S. Department of Education, Institute of Education Sciences. Retrieved March 19, 2007, from http://nces.ed.gov/statprog/2002/std3_2.asp

Office of Information and Regulatory Affairs. (2006). Questions and answers when designing surveys for information collections. Washington, D.C.: Executive Office of the President, Office of Management and Budget. Retrieved March 19, 2007, from http://www.whitehouse.gov/omb/inforeg/pmc_survey_guidance_2006.pdf

Office of Information and Regulatory Affairs. (2006). Standards and guidelines for statistical surveys. Washington, D.C.: Executive Office of the President, Office of Management and Budget. Retrieved March 19, 2007, from http://www.whitehouse.gov/omb/inforeg/statpolicy/standards_stat_surveys.pdf

Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Thousand Oaks, CA: Sage Publications.

Raudenbush, S., Spybrook, J., Liu, S., Congdon, R., & Martinez, A. (2006). Optimal Design for Longitudinal and Multi-level Research (Version 1.77) [Computer software]. Ann Arbor: University of Michigan, Survey Research Center. Retrieved June 1, 2006, from http://sitemaker.umich.edu/group-based/files/odmanual-20060517-v156.pdf

1 A review of other OMB packages for similar whole-school studies revealed customary rates of participant compensation for data collection activities to be $26-$30/hour for teachers and $36-$48/hour for administrators. OMB packages reviewed included Reading First (Abt Associates, Inc., 2004); Longitudinal Analysis of Comprehensive School Reform Implementation and Outcomes (LACIO) (WestEd & Cosmos Corporation, 2006); and the Trends in International Mathematics and Science Study (TIMSS) (Windwalker Corporation & Westat, Inc., 2005). Current rates of compensation in Minnesota are $27-$37 for teachers and $37-$47 for principals, with the higher averages representing the large, urban districts in the metropolitan area that are the target population for this study.


2 This study was reviewed and approved by McREL’s Institutional Review Board on May 21, 2007.

