
DRAFT Supporting Statement, REL Midwest – CLC Study

Revised November 2007

IMPROVING ADOLESCENT LITERACY ACROSS THE CURRICULUM IN HIGH SCHOOLS: AN EVALUATION OF THE STRATEGIC INSTRUCTION MODEL’S CONTENT LITERACY CONTINUUM



OMB CLEARANCE REQUEST

Supporting Statement Part A


November 2007

Prepared for:

Institute of Education Sciences
United States Department of Education
Contract No. ED-06-CO-0019

Prepared by:

MDRC
16 East 34th Street
New York, NY 10016

Learning Point Associates
1120 East Diehl Road, Ste. 200
Naperville, IL 60563

SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION




A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


Data from the 2002 NAEP indicate that a sizable percentage of the American high school students tested were reading at or below the basic proficiency level (where 'basic' denotes only partial mastery of fundamental skills).1 Within these results are three alarming findings: first, the results pertain to students who were still enrolled in school and say nothing about the literacy skills of students who drop out before 12th grade (and the study team assumes that students who drop out of high school would have lower scores if they were tested); second, there is a noticeable achievement gap between underserved and privileged populations;2 and third, the reading achievement of 12th-grade students has decreased over the past 30 years.3


Research also suggests that as students advance to the high school grades and confront increasingly challenging text for the first time, they may have greater difficulty with reading,4 and their motivation, engagement, and confidence in their own reading abilities may decrease as a result.5 Failure in school, and especially failure to be promoted to tenth grade, has been shown to be a predictor of subsequent dropout.6 Thus, interventions that help students acquire requisite literacy skills can help forestall a high school trajectory marked by failure.


While many studies have led to a greater understanding of the reading skills that young children need and how best to teach those skills,7 the corresponding literature on adolescent populations is sparse.8 This study, then, responds to the need for more rigorous research to identify effective approaches to improving adolescent literacy and school achievement. For this project, the study team will use a randomized controlled trial to study the Content Literacy Continuum (CLC), a school-wide literacy across the curriculum model developed by the University of Kansas Center for Research on Learning (KU-CRL), to determine whether such an approach to literacy intervention produces positive impacts on literacy achievement. The use of a random assignment design helps ensure that, all else equal, the study will yield the strongest, most reliable evidence possible on which to base policy and practice.


The current authorization for the Regional Educational Laboratories program is under the Education Sciences Reform Act of 2002, Part D, Section 174, (20 U.S.C. 9564), administered by the Institute of Education Sciences’ National Center for Education Evaluation and Regional Assistance.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


Research Questions and Overview of the Project Plan


This study focuses on an intervention that emphasizes literacy instruction throughout a high school student's school day, includes all grade levels in the school, and addresses students across a continuum of literacy needs.9 Information will be collected by the study team of MDRC and Learning Point Associates (LPA), the prime contractor of the Regional Educational Laboratory Midwest (REL Midwest), and will be analyzed and used to address the research questions and build upon existing IES initiatives.


The study team will address four research questions in this impact and implementation study:


  1. To what extent does a literacy across the curriculum intervention improve students’ reading skills and other academic outcomes such as attendance, persistence in school, course-taking patterns, and performance on high-stakes standards-based assessments?

  2. For which grade levels and subgroups of students is a literacy across the curriculum approach most effective?

  3. What is the effect of a literacy across the curriculum approach on literacy instruction?

  4. What factors promote or impede successful implementation of a literacy across the curriculum approach in high schools? What factors appear to account for the impact on instruction and achievement outcomes? What are the associated costs?


To answer these questions, the study team has developed an experimental research design wherein data will be collected over three years from 50 high schools (at a minimum, 40 high schools are needed for this study) from diverse (urban/rural, large/midsize/small) districts in the Midwest region (Illinois, Indiana, Iowa, Michigan, Minnesota, Ohio, and Wisconsin). Within each district, the high schools will be randomly assigned in equal proportion to treatment and control groups. Assuming there are 50 schools participating, this will yield 25 treatment schools and 25 control schools. The study team will need to collect new information from the participating districts and schools through student testing, observations of classrooms, and interviews of building and district personnel. Data routinely collected and compiled by school districts will also help answer these questions.
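
To make the assignment procedure concrete, the following sketch (written in Python, with hypothetical district and school names; it is not part of the study's actual tooling) randomly splits the schools within each district into equal-sized treatment and control groups:

    import random

    def assign_schools(schools_by_district, seed=2007):
        """Randomly assign schools to treatment or control in equal
        proportion within each district (an illustrative sketch)."""
        rng = random.Random(seed)  # fixed seed makes the assignment reproducible
        assignments = {}
        for district, schools in schools_by_district.items():
            shuffled = list(schools)
            rng.shuffle(shuffled)
            half = len(shuffled) // 2  # an odd school out falls to control here
            for school in shuffled[:half]:
                assignments[school] = "treatment"
            for school in shuffled[half:]:
                assignments[school] = "control"
        return assignments

    # Hypothetical example: two districts with four high schools each.
    example = {
        "District A": ["HS 1", "HS 2", "HS 3", "HS 4"],
        "District B": ["HS 5", "HS 6", "HS 7", "HS 8"],
    }
    print(assign_schools(example))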


TABLE A1



PROPOSED DATA COLLECTION PLAN

Data Collection Level | Mode | Timeline | Key Data

Student level | Achievement testing of students | Spring 2009; Spring 2010 | Reading achievement

Classroom level | Classroom observations | 2007-2008, 2008-2009, and 2009-2010 school years | Instructional quality; treatment contrast; dosage

School/District level | Semi-structured interviews with school and district administrators | 2007-2008, 2008-2009, and 2009-2010 school years | Instructional quality; treatment contrast; professional development; technical assistance; attitudes regarding district-wide school reform

School/District level | Administrative/school records data on students | Fall 2008; Fall 2009 | State and district test scores; transcript/course data; student attendance; promotion and graduation data




Data Collection Plan


Achievement testing: The study team will administer the GRADE, a nationally normed literacy assessment, the results of which will allow us to answer the first research question: To what extent does a literacy across the curriculum intervention improve students’ reading skills and other academic outcomes such as attendance, persistence in school, course-taking patterns, and performance on high-stakes standards-based assessments? The administration of an achievement test like the GRADE is not considered an information collection burden under 5 CFR 1320.3(h)(7).


Classroom observation: The main method for measuring students’ exposure to literacy instruction and its quality will be classroom observations. These will help us answer the questions: What factors promote or impede successful implementation of a literacy across the curriculum approach in high schools? and What factors appear to account for any observed impact (or lack of impact) on teacher instruction and achievement outcomes? Observers will use an adapted version of a validated, literacy-focused classroom observation tool developed by LPA that is being used in an IES-funded Striving Readers evaluation in the Chicago Public Schools. This observation tool is designed for observing both teacher instruction and student activity during a class period. It is also general enough to be used for observations in both CLC and non-CLC high schools. Classrooms will be selected for observation by selecting student schedules at random from treatment and control high schools. Observers will then follow these schedules to observe an entire day of instruction across academic classes, thereby allowing us to see the literacy across the curriculum component of the CLC program at treatment schools and to contrast it with instruction at control schools. It is our understanding that observational data collection is not considered in calculations of paperwork burden.
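
As a minimal sketch of the schedule-sampling step described above (with a hypothetical roster and sample size; the study's actual procedure may differ):

    import random

    # Draw a few student schedules at random from a school's roster; observers
    # then follow each sampled student's schedule for a full day of classes.
    rng = random.Random(2008)
    roster = [f"student_{i:04d}" for i in range(1, 1201)]  # hypothetical roster
    sampled = rng.sample(roster, k=3)  # hypothetical number of schedules per school
    print(sampled)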


Interviews: Interview protocols are based on pre-existing instruments developed by LPA for the Striving Readers program. Interviews of building-level administrators will be the same at both the treatment and the control schools, and will target the administrator at each school with the greatest responsibility for instruction and school-wide reform efforts. Although the study team does not expect to obtain a quantified result for each school based on these interviews, interviews from the treatment and control schools will be compared descriptively, still taking advantage of the experimental nature of comparisons between CLC and non-CLC schools, to get a qualitative sense of the impact CLC may have on school change as well as of the treatment contrast. Additionally, these interviews will provide an opportunity to learn about the professional development and technical assistance provided to teachers. The study team will also interview district administrators. These interviews will provide context for understanding issues of school change and school improvement in the district, as well as the overall district context in which the CLC study is being conducted. The interviews are included in our calculations of respondent burden. Copies of the building- and district-level administrator interviews are included with this document as appendices A and B, respectively.


School records: Data routinely collected and compiled about student characteristics and measures of academic progress – such as test scores, grades, attendance, courses attempted and passed, and credits earned toward graduation – will be collected as baseline and follow-up measures on all students in the study sample from all of the participating districts. Having these data will help the study team answer the questions: To what extent does a literacy across the curriculum intervention improve students’ reading skills and other academic outcomes such as attendance, persistence in school, course-taking patterns, and performance on high-stakes standards-based assessments? And, for which grade levels and subgroups of students is a literacy across the curriculum approach most effective? These data should be available electronically from participating districts. As these data are pre-existing, already collected by schools and districts, they are not included in our calculation of burden. Using existing school records data helps to minimize overall burden because it avoids the collection of these same data items from individual respondents at the school level.





3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision of adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


Whenever possible, the study team will use information technologies to maximize the efficiency and completeness of the information gathered for this evaluation and to minimize the burden on respondents. In particular, data will be collected from existing electronic school administrative records. The study team will also use laptop computers to record field observations and to type respondents’ answers to questions in the building and district administrator interviews.

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


The study team is planning to administer the GRADE to students in the study sample because the data collected with such an instrument are not available from any other source on a national basis; state-administered assessments vary in what they measure and in how well they measure student reading achievement. Administering our own assessment is therefore necessary because it provides site-to-site consistency and assures a reliable and common reading achievement score. Data collection through classroom observation will help us understand the fidelity of implementation of the CLC program as well as instructional quality, the treatment dosage, and the treatment contrast between CLC (treatment) and non-CLC (control) schools. Interviewing at the building and district levels at both treatment and control schools will also help us understand issues surrounding implementation, training and technical assistance, and the nature and extent of the school change effort. This final point is of critical import to our study given that the CLC framework requires school-wide reform. The data to be collected through observations and interviews do not currently exist.


5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-1), describe any methods used to minimize burden.


The focus of this study is on school districts and their schools, including the students and their parents, the teachers, and the administrators within these districts. The study team has reduced burden for respondents by using a data collection plan that requests the minimum information needed to successfully execute this study. To minimize burden on respondents, the study design requests information that is already collected by schools and districts, and any new collection instruments have been designed to ask questions that cannot be answered through any other available sources. Data collection activities are conducted by the study team of MDRC and Learning Point Associates.



6. Describe the consequences to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


The systematic collection, analysis, and reporting of assessment data are required to accomplish the goals of the research project approved by IES. Participation in all data collection activities is voluntary. Multiple data collection strategies are planned, including analysis of a nationally normed literacy assessment (the GRADE), student transcripts, state assessment results, and attendance records. These data are necessary to measure impacts on academic achievement, school progress, and school participation.


The GRADE will be administered to students at treatment and control schools as an outcome measure at the end of the 2008-2009 and 2009-2010 school years. It is the primary outcome measure for literacy skills; if it is not collected, the study will lack a common measure of students’ reading ability.


Schools in the study sample, both treatment and control, will be visited for a total of 15 hours in each year of the study in order to conduct classroom observations. These observations will provide valuable descriptive and quantitative information about instructional quality, especially the integration of literacy strategies into instructional practice, and about the treatment contrast. The study team will not be able to speak adequately to either of these topics if the observations are not conducted.


Interviews will be conducted once per year, in every year of the study, with administrators at both treatment and control schools. If these interviews are not conducted, the study team will not be able to descriptively assess perceptions of implementation at the classroom, building, and district levels, and will not learn about the teachers, schools, and districts charged with implementing the CLC programs. These data are essential for understanding how teachers and administrators perceive literacy education and reform efforts, as well as how they view the CLC training and technical assistance. These factors affect the implementation of the programs, and having these data will provide context that facilitates our ability to understand how these programs might work in other settings with other personnel.


High response rates are anticipated given that states, districts, and schools will be selected based on their willingness to fully participate in the project. Additionally, the combined experience and expertise of MDRC and LPA in conducting impact studies and data collection around supplemental adolescent literacy programs (ERO, Striving Readers) will help achieve high response rates. The study team is confident that the use of multiple measures of data collection will ensure accurate analyses and results.





7. Explain any special circumstances that would cause an information collection to be conducted in a manner inconsistent with Section 1320.5(d)(2) of the federal regulations:


No special circumstances apply.


8. Federal Register Comments and Persons Consulted Outside of the Agency

A 60-day notice to solicit public comments was published on page 25755 of the Federal Register on May 7, 2007, with an expiration date of July 6, 2007. No public comments were received.


The following individuals were consulted in the development of these materials:


Steve Cantrell, LPA

Matthew Dawson, LPA

James Kemple, MDRC

William Corrin, MDRC

Judy Stewart, Taylor Education Consulting, Inc.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


The treatment schools will receive the intervention and all materials and support that go with it. Schools assigned to the treatment group will pay a participation fee of $4,000 to receive the treatment. Because control schools will be subject to data collection and other study activities without receiving the intervention program, these fees will be transferred, in sums of $2,000 per year, to the schools assigned to the control group as compensation for participating in the study without receiving the treatment.10


10. Describe any assurances of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


All data collection activities will be conducted in full compliance with Department of Education regulations to maintain the confidentiality of data obtained on private persons and to protect the rights and welfare of human research subjects. These activities will also be conducted in compliance with other federal regulations, in particular the Privacy Act of 1974, P.L. 93-579, 5 USC 552a; the “Buckley Amendment,” the Family Educational Rights and Privacy Act of 1974, 20 USC 1232g; the Freedom of Information Act, 5 USC 552; and related regulations, including but not limited to 41 CFR Part 1-1 and 45 CFR Part 5b and, as appropriate, the Federal common rule or ED’s final regulations on the protection of human research participants.

MDRC and LPA follow the confidentiality and data protection requirements of IES (the Education Sciences Reform Act of 2002, Title I, Part E, Section 183). MDRC and LPA will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their institution but not to any individually identifiable information. No individually identifiable information will be maintained by the study team. All institution-level identifiable information will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. MDRC and LPA obtain signed NCEE Affidavits of Nondisclosure from all employees, subcontractors, and consultants who may have access to these data and submit them to the NCEE COR. All members of the study team having access to the institution-level data have been certified by MDRC’s and LPA’s Institutional Review Boards as having received training in the importance of confidentiality and security. Please refer to appendices D and E for the MDRC and LPA confidentiality agreements.

An explicit statement describing the project, the data collection, and the confidentiality protections will be provided to study participants. These participants will include adults (administrators at participating high schools and district offices will be interviewed), who will need to sign this statement to acknowledge their willingness to participate. In some districts it may also be necessary to provide a similar statement for students, which will need to be signed by the student and a parent before the study team collects data about the student. Examples of the proposed consent forms for these two types of participants are attached as appendices F and G.


A Privacy Impact Assessment has been conducted, and the Privacy Act System of Records Notice is currently being developed.


The study team prepared a System of Records (SOR), and notice was published in the Federal Register on [DATE TO BE ENTERED LATER].


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. The justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


None of the questions utilized to collect data for this study, including interviews, concern topics commonly considered private or sensitive, such as religious beliefs or sexual practices.


12. Provide estimates of the hour burden of the collection of information.


Table A2 below provides annual burden estimates for the data collection activities that carry burden:


TABLE A2


ANNUAL HOUR BURDEN ESTIMATES


Instrument | Annual number of respondents | Average time per response (hours) | Frequency of response | Annual number of responses | Total respondent time (hours)

Building administrator interview | 48 (note 11) | 1.5 | 1 per year | 48 | 72.0

District administrator interview | 15 (note 12) | 1.5 | 1 per year | 15 | 22.5

Administrator Study Participation Agreement form | 21 | 0.01 | 1 (one time) | 21 | 0.21

Student Study Participation Agreement Form | 2,000 | 0.01 | 1 (one time) | 2,000 | 20.0

Total | 2,084 | -- | -- | 2,084 | 115 (rounded)

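As an arithmetic check on Table A2 (a sketch with the table's figures hard-coded; each row's burden is simply the number of respondents multiplied by the average time per response):

    # Verify the Table A2 burden arithmetic (figures copied from the table).
    rows = [
        ("Building administrator interview", 48, 1.5),
        ("District administrator interview", 15, 1.5),
        ("Administrator Study Participation Agreement form", 21, 0.01),
        ("Student Study Participation Agreement Form", 2000, 0.01),
    ]
    total = 0.0
    for name, respondents, hours_each in rows:
        hours = respondents * hours_each
        total += hours
        print(f"{name}: {respondents} x {hours_each} = {hours:.2f} hours")
    print(f"Total annual burden: {total:.2f} hours (reported as 115)")
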


13. Describe any other costs to respondents.


No additional costs to respondents are associated with this data collection.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expenses that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.


Tasks for FY 2007 (Year 2)

Capital/startup costs
Facility Operations | $25,512
Overhead | $78,211
Total annualized capital/startup costs | $103,723

O&M costs
Staff | $166,385
Consultants | $1,515,838
Student Assessments | $70,000
Travel | $34,500
Total O&M costs | $1,786,723

Total annualized cost | $1,890,446


The costs shown in the table above are directly related to the study. Funding includes paying for staff at REL Midwest, as well as covering the costs of subcontractors who will assist in study design, data collection, and data analysis. Also included in the “Consultants” line are costs associated with delivering the intervention to treatment schools and collecting data from both treatment and control schools. We have also budgeted for the costs of the third-party student assessments.


15. Describe any changes in the burden from prior approvals.


A program change of 1307.1 hours of data collection burden is shown for this new data collection effort.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of the report, publication dates, and other actions.


The results of this data collection may be used in several ways. First, they will be used to report formative information to IES on a quarterly basis. Second, REL Midwest may use the results to prepare presentations at regional or national conferences. Third, reports, presentations, and other training and technical assistance activities specified in the contract will be conducted using the results from data collection activities. (See Appendix H for the study timeline.)


The first technical report, due in draft form in November 2009 and in final form in May 2010 (the draft and final versions of the first non-technical report are due in July and September of 2010, respectively), will follow students in the first study cohort through the spring of the 2008-2009 school year. At that point, the study team will have data on student literacy outcomes from the GRADE assessment, administered in the spring of 2009. Our initial impact analyses will center on this relatively short-term outcome. The report will also discuss findings from the classroom observations and from the district and building interviews about the implementation of the CLC program and about the difference between treatment schools and control schools in their exposure to comprehensive change in instruction, particularly as it relates to literacy, across high school curricula.


The second technical report, due in draft form in October 2010 and final form in January 2011 (non-technical draft and final versions of this report are due in December 2010 and February 2011, respectively), will analyze the impacts of the intervention on a wider set of outcomes and for a larger group of students as the program is phased into the high schools. The report will include two years of follow-up for Cohort 1 (through the end of the scheduled 10th grade year) and one year of follow-up for Cohort 2. Data for Cohort 2 will include the GRADE Assessment. School records data will also be available for both cohorts of students including information on attendance, course-taking, and promotion. For Cohort 2, the study team will have information on student performance on state or district standardized tests for those that administer them in the 10th grade.


The data collection, analysis, and reporting for this study are driven by the key research question: What is the impact of a literacy across the curriculum intervention on student outcomes? Our approach to estimating the effects of CLC has the following core features:

  1. A focus on impacts based directly on the experimental design.

  2. Estimation of impacts in ways that account for the randomization of schools.

  3. Use of student- and school-level baseline covariates to increase precision.

  4. Estimation of impacts separately for each follow-up year and for each grade in question.


The basic logic of our analysis strategy is to compare the schools randomly assigned to receive the treatment with those that are not. Because random assignment occurs at the school level, schools are the primary unit of analysis. The data for this evaluation are nested, however: individual students are clustered within schools, and student observations within the same school tend to vary together rather than being independent of one another. Because observations within the same group are not statistically independent, the most appropriate way to estimate the effect of the intervention, and to correctly estimate statistical precision, is to apply a multilevel (hierarchical linear) model that estimates separate equations at the student and school levels.
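
As a minimal sketch of such a two-level model (using Python's statsmodels library with hypothetical file and column names; the study's actual estimation may differ in its covariates and software):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical student-level file: one row per student, with the school's
    # random assignment status attached to every student in that school.
    # Assumed columns: school_id, treatment (0/1), pretest, posttest (GRADE).
    data = pd.read_csv("student_outcomes.csv")

    # Random intercepts by school capture the non-independence of students
    # within the same school; the baseline covariate improves precision.
    # The coefficient on `treatment` is the estimated impact.
    model = smf.mixedlm("posttest ~ treatment + pretest",
                        data=data, groups=data["school_id"])
    result = model.fit()
    print(result.summary())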


The analysis methods are discussed in further detail in Part B of the Supporting Statement for this project.


17. Describe arrangements for displaying the number provided by OMB and its expiration date.


The approval number provided by OMB and its expiration date will appear in the heading on all data collection instruments.


18. Exceptions to Certification Statement


No exceptions are requested.


REFERENCES PART A


Alliance for Excellent Education. (2004, May). How to know a good adolescent literacy program when you see one: Quality criteria to consider. Washington, DC: Author. Retrieved April 17, 2007, from http://www.all4ed.org/adolescent_literacy/issue_briefs.html


Alvermann, D. E. (2001). Effective literacy instruction for adolescents. Executive summary and paper commissioned by the National Reading Conference. Chicago: National Reading Conference. Retrieved April 17, 2007, from http://www.nrconline.org/publications/alverwhite2.pdf


Caldwell, J., and Leslie, L. (2004). Does proficiency in middle school reading assure proficiency in high school reading? The possible role of think-alouds. Journal of Adolescent and Adult Literacy, 47, 324-335.


Grigg, W. S., Daane, M. C., Jin, Y., and Campbell, J. R. (2003). The Nation’s Report Card: Reading 2002. Washington, DC: National Center for Education Statistics.


Guthrie, J. T., and Alvermann, D. E. (1999). Engaged reading: Processes, practices and policy implications. New York: Teachers College Press.


Kamil, M. (2003). Adolescents and literacy: Reading for the 21st century. Washington, DC: Alliance for Excellent Education. Retrieved April 17, 2007, from http://www.all4ed.org/publications/AdolescentsAndLiteracy.pdf


Kemple, J., and Herlihy, C. (2004). The Talent Development high school model: Context, components, and initial impacts on ninth-grade students’ engagement and performance. New York: MDRC. Retrieved April 17, 2007, from http://www.mdrc.org/publications/388/full.pdf


Moore, D. W., Bean, T. W., Birdyshaw, D., and Rycik, J. A. (1999). Adolescent literacy: A position statement. Newark, DE: International Reading Association.


National Reading Panel. (2000, April). Report of the National Reading Panel: Teaching children to read. Washington, DC: National Institute of Child Health and Human Development, National Institutes of Health, U.S. Department of Health and Human Services. Retrieved April 17, 2007, from http://www.nationalreadingpanel.org/Publications/publications.htm


RAND Reading Study Group. (2002). Reading for understanding: Toward a research and development program in reading comprehension. Santa Monica, CA: RAND.


Snow, C. E., and Biancarosa, G. (2003). Adolescent literacy and the achievement gap: What do we know and where do we go from here? New York: Carnegie Corporation. Retrieved April 17, 2007, from http://www.ode.state.or.us/teachlearn/subjects/elarts/reading/literacy/summerinstitute/resources/carnegieadolescentliteracyreport.pdf


Snow, C. E., Burns, M. S., and Griffin, P. (Eds.), the Committee on the Prevention of Reading Difficulties in Young Children, Commission on Behavioral and Social Sciences and Education, and the National Research Council. (1998). Preventing reading difficulties in young children. Washington, DC: National Academy Press.


Wigfield, A. (2004). Motivation for reading during the early adolescent and adolescent years. In D. S. Strickland and D. E. Alvermann (Eds.), Bridging the literacy achievement gap, grades 4-12 (pp. 56-69). New York: Teachers College Press.


1 Grigg, Daane, Jin, and Campbell (2003).

2 Snow and Biancarosa, (2003).

3 RAND Reading Study Group (2002).

4 Caldwell and Leslie (2004).

5 Guthrie (2002); Guthrie and Alvermann (1999); Wigfield (2004).

6 Kemple and Herlihy (2004).

7 National Reading Panel (2000); National Research Council (1998).

8 The Alliance for Excellent Education (2004; Kamil, 2003), the International Reading Association (Moore, Bean, Birdyshaw, and Rycik, 1999), and the National Reading Conference (Alvermann, 2001) have all published discussions relating to policies for adolescent literacy instruction.

9 For further description of the Content Literacy Continuum, please see Appendix I.

10 The Mathematics Professional Development Impact Study, a 2007 study of the impact on student achievement of teacher professional development designed to enhance teacher content knowledge and pedagogical content knowledge in mathematics, currently being conducted by the American Institutes for Research for the Institute of Education Sciences/U.S. Department of Education (Contract No. ED-01-CO-0026/0020; OMB number 1850-0816 v.1), guarantees middle schools in the control group $1,000 per year for participation in the data collection activities without receipt of any treatment. The nature and scope of the REL Midwest study is similar to that of the mathematics professional development evaluation, and the compensation amounts were derived, in part, by comparing the two studies. It should be noted that the REL Midwest study works with high schools, while the mathematics professional development study works with middle schools. As middle schools are typically smaller than high schools, the study team determined that $2,000 annually per high school (compared with $1,000 per middle school in the mathematics professional development study) was the appropriate commensurate compensation for control high schools. For example, for a high school with 1,000 students, an annual $2,000 compensation equates to $2 per student.

11 The study team assumes there will be at least a 95% response rate by interviewees. Assuming 50 schools participate, a 95% response rate yields approximately 48 respondents.

12 Although we expect our sample will come from about 12 districts we are overestimating here in case it is necessary to interview more than one administrator in a few districts.


