
Contract No.: ED-01-CO-0039/0010

MPR Reference No. 6136-600





Supporting Justification for OMB Clearance of Information Collection Forms for the Evaluation of Reading Comprehension Interventions


Revision of Currently Approved Collection (OMB #1850-0812)


Section A


June 2007













Submitted to:


U.S. Department of Education

Institute of Education Sciences

555 New Jersey Ave., NW, Rm. 308

Washington, DC 20208

(202) 208-7078


Project Officer:

 Audrey Pendleton


Submitted by:


Mathematica Policy Research, Inc.

600 Maryland Ave., SW, Suite 550

Washington, DC 20024-2512

(202) 484-9220



Project Director:

Jerry West, Ph.D.

Deputy Project Director:

Wendy Mansfield, Ph.D.

CONTENTS

A. JUSTIFICATION

1. Circumstances Necessitating Collection of Information

2. How, by Whom, and for What Purpose Information Is to Be Used

3. Use of Automated, Electronic, Mechanical or Other Technological Collection Techniques

4. Efforts to Avoid Duplication of Effort

5. Sensitivity to Burden on Small Entities

6. Consequences to Federal Program or Policy Activities if the Collection is Not Conducted or Is Conducted Less Frequently than Proposed

7. Special Circumstances

8. Federal Register Announcement and Consultation

9. Payment or Gift to Respondents

10. Confidentiality of the Data

11. Additional Justification for Sensitive Questions

12. Estimates of Hour Burden

13. Estimate of Total Annual Cost Burden to Respondents or Record-Keepers

14. Estimates of Annualized Cost to the Federal Government

15. Reasons for Program Changes or Adjustments

16. Plan for Tabulation and Publication and Schedule for Project

17. Approval Not to Display the Expiration Date for OMB Approval

18. Exception to the Certification Statement


REFERENCES





CONTENTS (continued)

APPENDIX A: TEACHER SURVEYS

APPENDIX B: SCHOOL RECORDS FORMS

APPENDIX C: SUMMARY TABLES OF INSTRUMENT ITEMS AND QUESTION-BY-QUESTION JUSTIFICATIONS

APPENDIX D: PARENT LETTER, PARENT CONSENT FORM, STUDENT ASSENT FORM, AND BROCHURE

APPENDIX E: CLASSROOM OBSERVATION FORMS

APPENDIX F: EVALUATION LEGISLATION

APPENDIX G: CONFIDENTIALITY PLEDGE

APPENDIX H: INTRODUCTORY LETTER TO STATES

APPENDIX I: INTRODUCTORY LETTER TO DISTRICTS




TABLES AND FIGURES

Tables

1 Data Collection Instruments

2 Estimated Response Time

3 Schedule of Activities


Appendix Tables

C.1 Question-by-Question Justification of Teacher Survey Questions

C.2 Question-by-Question Justification for School Records Forms


SUPPORTING STATEMENT
REQUEST FOR CLEARANCE OF INFORMATION COLLECTION FORMS
FOR AN EVALUATION OF READING COMPREHENSION INTERVENTIONS

This submission is a request for a revision of Office of Management and Budget (OMB) clearance for the Evaluation of Reading Comprehension Interventions, sponsored by the U.S. Department of Education’s Institute of Education Sciences. The interventions being evaluated are designed to teach reading comprehension strategies to fifth-grade students in the content areas of science and social studies. The existing clearance (Number 1850-0812) was issued on March 15, 2006, and will expire March 31, 2009. The revision being requested is for a second year of the study, as required by the OMB terms of clearance for the first year of the study. The second year has two components:

1. Sustainability of Initial Impacts. This component will examine whether the results of instruction in reading comprehension strategies for fifth graders can be sustained through the next school year.

2. Benefits of a Second Year of Implementation. This component of the study will examine whether reading comprehension interventions are more effective at improving student outcomes after teachers and schools have had one year of experience using them.

Both components are discussed in more detail below.

A. JUSTIFICATION

Many of the nation’s children struggle with comprehending complex texts and other reading materials that are used in the upper elementary grades for subjects such as social studies and science. This is especially true of children from disadvantaged backgrounds (Snow, Burns, and Griffin 1998).

Title I of the No Child Left Behind Act of 2001 (NCLB) calls on educators to close the gap between low and high achievers by using scientifically sound instructional approaches. Recent research has identified features of reading comprehension instructional approaches that are linked to improvements in students’ ability to abstract meaning from expository texts (e.g., National Reading Panel 2000; Gersten et al. 2001; Rosenshine et al. 1996). These features include:

  • Engaging students in elaborative questions and answers, including providing feedback and giving students opportunities to ask and answer their own questions about the text.

  • Using text structures (e.g., compare-contrast, cause-effect, explanation, and sequencing) or graphic organizers as guides for teachers or students in generating questions, helping students approach expository text, or eliciting elaborated responses.

  • Teaching students to make predictions based on subtitles or materials in preceding paragraphs, record ideas about what they read, evaluate the accuracy of their predictions, and summarize these ideas after reading.

  • Using multiple strategies to improve comprehension (e.g., question generation, summarization, and prediction).

  • Having students practice strategies in small groups with their peers.

Although research on instructional approaches that improve reading comprehension is accumulating, little is known about the effectiveness of different approaches to teaching reading comprehension strategies within science and social studies. Scientifically rigorous evidence is lacking with respect to both (a) the effectiveness of specific interventions intended to help students better comprehend expository text and (b) taking such interventions to scale. As a consequence, it is difficult for state and local educators in Title I schools to decide how best to improve the capacity of students to comprehend complex, expository text.

Responding to the need for scientifically based evidence of the effectiveness of specific instructional interventions, the first year of the Evaluation of Reading Comprehension Interventions addressed three major questions:

  1. Can reading comprehension interventions improve student reading achievement in social studies or science?

  2. What are the most effective reading comprehension interventions for improving student reading achievement in social studies or science?

  3. Under what conditions and practices do reading comprehension interventions improve student achievement in reading in social studies or science?

Determining whether improvements gained from instructional interventions are long lasting is also an important consideration for policy makers. Therefore, continuation of the study for a second year would address two additional questions:

  1. Does reading comprehension instruction in fifth grade result in improvement in student outcomes that is sustained for another year?

  2. Are reading comprehension interventions more effective at improving student outcomes after teachers, or schools, have had one year of experience using them?

Interventions that have sustained effects will be more appealing to educators, policymakers, and parents than those whose effects dissipate quickly. In addition, the reading comprehension strategies being taught to students as part of the interventions are hypothesized to have long-term effects by changing the way students approach reading.

Component 1 – Sustainability of Initial Impacts. To assess the sustainability of initial impacts, we will test the original cohort of fifth graders as sixth graders in the spring of the 2007-08 school year. We will test general reading comprehension as well as reading comprehension of social studies and science texts, compare second-year assessment impacts for various intervention groups and the control group, and examine how second-year impacts vary by student characteristics, such as prior achievement. Component 1 does not require the implementation of the reading comprehension interventions in sixth grade.

Tests to be administered are sixth-grade versions of the end-of-year assessments conducted in the 2006-07 spring follow-up: Pearson/AGS’s Group Reading Assessment and Diagnostic Evaluation (GRADE) passage comprehension subtest, and the social studies and science reading comprehension tests developed specifically for this study.1 All students will be administered the general reading comprehension test; half will be randomly selected to take the science test, and the other half will take the social studies test.2 Randomization to either the science or the social studies test will occur within each intervention (and control) group, so that the same proportion of students takes each test in each group. This approach reduces the burden on students by administering two tests, rather than three, to each student.
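
To make the randomization mechanics concrete, the following is a minimal sketch of the within-group test assignment described above; the function name, data layout, and fixed seed are illustrative assumptions rather than part of the study’s actual procedures.

import random

def assign_content_tests(students_by_group, seed=0):
    # Within each intervention (and control) group, randomly assign half
    # of the students the science test and the other half the social
    # studies test; every student also takes the general GRADE test.
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    assignments = {}
    for group, students in students_by_group.items():
        shuffled = list(students)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        for student in shuffled[:half]:
            assignments[student] = "science"
        for student in shuffled[half:]:
            assignments[student] = "social studies"
    return assignments

# Example: two groups of four students each.
groups = {"intervention_A": ["s01", "s02", "s03", "s04"],
          "control": ["s05", "s06", "s07", "s08"]}
print(assign_content_tests(groups))

Because the split is carried out separately within each group, the same proportion of students takes each content test in every group, which keeps the treatment-control comparison on each test balanced.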

An important consideration when interpreting sustained impacts is that the middle school environment could influence the extent to which initial impacts are sustained. For example, if students enter a middle school without qualified or experienced teachers, then the initial benefits of the reading interventions might quickly fade. To measure the school environment, we propose to administer a short teacher survey to all sixth-grade English, science, and social studies teachers at each school during spring 2008 (Appendix A.1). This survey would cover professional background, including education, certification, and experience. We also propose to collect school-level information from secondary sources such as the Common Core of Data (CCD) and SchoolMatters.

Parent letters disseminated last year requested consent for both the 2006-07 and 2007-08 school years. Therefore, we do not plan to obtain consent again for study purposes (though if requested by a school, we will resend the letters). We will test only those sixth graders for whom we had consent in the 2006-07 school year and who remained within their public school district in 2007-08. Before administering the reading assessments in spring 2008, we will collect lists from the schools to learn which students have transferred, and will then ask each school to identify where its students transferred. We will contact the receiving schools and arrange to test these students at their new schools within the district.

Component 2 – Benefits of a Second Year of Implementation. To assess the impact of fifth graders being taught by teachers with a year of experience with the interventions, we will repeat the original study design using the same schools and teachers but a new cohort of fifth graders. Only teachers who participated in the first year of the study would be included in the data analysis.

To assess the impact on fifth graders of being taught in schools with a full year of experience with the interventions, we will repeat the study design using the same schools, all fifth-grade teachers (both new and original) and a new cohort of fifth graders. Both new and original teachers will be included in this data analysis.

Component 2 repeats nearly all the data collection activities from the first year:

  • Obtain parental consent. We expect the consent process and rates for the 2007-08 school year to mirror last year’s experience: 9 of the 10 participating districts allowed the use of passive consent letters, while the remaining district required active consent. We obtained an overall consent rate of 99 percent. District consent rates ranged from 98 percent to 100 percent in passive-consent districts, and the rate was 94 percent in the active-consent district. At the start of the 2007-08 school year, we will disseminate consent letters to participating schools in all districts, and schools will send the letters home with fifth graders. In the active-consent district, we will work with school staff to obtain consent forms from parents.

  • Administer baseline tests in the fall of 2007-08. We will administer the GRADE and ProEd’s Test of Silent Contextual Reading Fluency (TOSCRF) to fifth-grade students with consent.

  • Conduct classroom observations from January to March of 2008. We will analyze the observation data from the Year 1 high-intensity/low-intensity experiment to determine the appropriate number of observations to conduct per classroom to achieve high-quality data at a reasonable cost.3 (One possible form of that reliability analysis is sketched after this list.) Based on that analysis, we will develop and implement a plan for the 2007-08 observations to assess the quality of instruction. For teachers in the intervention groups, we will also conduct a fidelity observation for the assigned curriculum, covering both returning and new teachers. Teachers in the control group, and teachers who have stopped using their assigned curriculum, will be observed only with the teacher quality observation measure (a fidelity observation will not be conducted).

  • Administer follow-up tests in the spring of 2008. We will administer the same end-of-year assessments that were used in 2006-07: the GRADE passage comprehension subtest and the social studies and science reading comprehension tests. One-half of the sample will receive the social studies test and one-half the science test.

  • Collect student records at the end of the 2007-08 school year. We will follow each district’s preferred collection approach. In general, districts are able to provide the student records data electronically, although in a small number of cases a few items are available only in hard copy at the schools.

  • Administer the teacher surveys in fall 2007 to those teachers who did not complete a survey in the 2006-07 school year. Local field staff will give the surveys to teachers when they administer tests to students, and they will collect them before leaving the school. (Appendix A.2 contains the survey administered to teachers in control schools, and Appendix A.3 contains the survey administered to teachers in treatment schools.)
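
As noted in the classroom observation bullet above, the Year 1 high-intensity/low-intensity experiment compares the reliability of one versus three observations per classroom (see footnote 3). The sketch below shows one conventional way such an analysis could be run: a one-way random-effects intraclass correlation for a single observation, projected to multiple observations with the Spearman-Brown formula. The formulas and the made-up ratings are illustrative assumptions; the study’s own analysis plan may differ.

import numpy as np

def icc1(scores):
    # One-way random-effects ICC(1): the share of rating variance that
    # reflects true classroom differences rather than occasion-to-occasion noise.
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape  # n classrooms, k observations per classroom
    grand_mean = scores.mean()
    ms_between = k * ((scores.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

def spearman_brown(icc, k):
    # Reliability of the mean of k observations, given the single-observation ICC.
    return k * icc / (1 + (k - 1) * icc)

# Hypothetical quality ratings: four classrooms, each observed three times.
ratings = [[3.0, 3.2, 2.8], [4.1, 3.9, 4.0], [2.0, 2.4, 2.2], [3.5, 3.3, 3.7]]
single = icc1(ratings)
print(single, spearman_brown(single, 3))

Comparing the single-observation reliability with the projected reliability of the mean of three observations is one way to judge whether the added cost of high-intensity observation is warranted.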

This clearance request pertains to the administration of the teacher surveys (Appendix A), the school records form (Appendix B), and the reading tests. The OMB clearance package provides a question-by-question justification for each item on the teacher survey and school records form (Appendix C). We have also included the materials that will be sent to parents: a letter describing the study, a consent form to be used if required by the district, an assent form to be used if required by the district, and a question-and-answer brochure (Appendix D). The data collection effort also encompasses classroom observations (Appendix E), which will be completed by trained field staff. Mathematica Policy Research (MPR) will carry out the second year of the evaluation as two separate components, described in detail in the next two sections. Table 1 lists the instruments, the components in which they will be used, the intended respondents, and the timeframe.

TABLE 1

DATA COLLECTION INSTRUMENTS


Instrument (respondent): timeframe

Reading assessments (students):

  Component 1
    Level 6 GRADE passage comprehension: spring 2008
    6th-grade social studies and science comprehension tests: spring 2008

  Component 2
    Level 5 GRADE passage comprehension: fall 2007
    TOSCRF: fall 2007
    5th-grade social studies and science comprehension tests: spring 2008

Classroom observation, Component 2 only (completed by trained field staff): January-March 2008

Teacher questionnaire (teachers):

  Component 1: spring 2008
  Component 2: fall 2007

School records form, Component 2 only (school staff): spring 2008

1. Circumstances Necessitating Collection of Information

Title I, Part E, Section 1501, of the NCLB (Appendix F) mandates a national assessment of the implementation and impact of Title I and an Independent Review Panel (IRP) to advise on the conduct of that assessment. The IRP recommended that the Title I evaluation initially assess the impact of reading interventions on low-income students’ reading achievement. A panel of reading experts formed subsequently by the Department’s Institute of Education Sciences (IES) indicated that strategies for improving comprehension are not as developed as those for decoding and fluency, and that research on teaching reading comprehension strategies within content areas (e.g., science, social studies) is scarcer than research demonstrating techniques for comprehending narrative text.

The panel advised IES to focus its evaluation on direct instruction of multiple comprehension strategies for expository text in science and social studies. Sustainability of improvements is also key to determining the value of instructional interventions such as the reading comprehension programs examined in Year 1 of this study. Analyses of the data collected in Year 2 will provide critical information regarding the immediate and longer-term effectiveness of reading comprehension interventions and could substantially influence both policy and practice regarding reading instruction for expository text in social studies or science.

2. How, by Whom, and for What Purpose Information Is to Be Used

The evaluation will include an examination of the extent to which impacts on students are sustained over one year and an assessment of the effects of these interventions after schools and teachers have had one year of experience with them. IES will use the information from this study to determine the efficacy of reading comprehension interventions for improving students’ comprehension of expository text in science and social studies, as well as the conditions and practices through which reading comprehension can be improved. ED could also use the information to determine the feasibility of implementing reading comprehension interventions on a large-scale basis under Title I.

The data will also be useful for state and local policymakers, districts, and schools. An important goal of the evaluation is to produce findings that will trigger wide adoption of effective strategies for improving reading comprehension among the elementary school population, particularly low-income students. The information will support policy decisions about the funding of reading comprehension programs. In addition, the data will support additional research on reading comprehension, by academics or others interested in the subject area. Restricted-use data files from the evaluation will be submitted to and distributed by IES, and can be used for independent studies on topics of interest to the reading research and policy community.

3. Use of Automated, Electronic, Mechanical or Other Technological Collection Techniques

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered from existing data sources, using the most efficient methods available. For example, school records for the majority of students will be gathered via computer files. In addition, school-level data for newly recruited middle schools in Component 1 will be abstracted from the CCD and SchoolMatters, rather than asking schools to provide the information. Some data, however, can be obtained only from students, their teachers, and school staff.

The student tests will be administered in a group setting. Children will benefit from the guidance of a test administrator, who will be present to explain directions and answer questions. Questionnaires for teachers will be delivered to the schools and collected either by test administrators or by mail, with telephone follow-up for nonresponse or consistency checks. When requested, questionnaires will be transmitted to and from respondents by fax. In addition, MPR’s electronic mail address and toll-free telephone number are printed on the front of the questionnaire for respondents who have questions. These procedures are all designed to minimize the burden on respondents.

4. Efforts to Avoid Duplication of Effort

This effort will yield unique data to evaluate reading comprehension programs in science and social studies. There are no similar evaluations being conducted and there is no alternative source for the information to be collected. Moreover, the data collection plan reflects careful attention to the potential sources of information for this study and particularly to the reliability of the information and efficiency in gathering the information. The data collection plan avoids unnecessary collection of information from multiple sources.

5. Sensitivity to Burden on Small Entities

The primary entities for the study are schools. Burden is minimized for all respondents by requesting only the minimum amount of information required to meet the study objectives. The burden on schools has also been minimized through careful specification of information needs, restriction of questions to generally available information, and the design of the data collection strategy (particularly the survey methods). All primary data collection will be coordinated by MPR employees so as to reduce the burden on school employees.

6. Consequences to Federal Program or Policy Activities if the Collection is Not Conducted or Is Conducted Less Frequently than Proposed

If the proposed data were not collected, IES would not fulfill its Title I mandate for a national evaluation and would be unable to provide information on the efficacy of reading comprehension program practices and conditions and their sustainability over time. As a result, IES would not know whether the programs have any short- or long-term impacts, either positive or negative, on participating students. Thus, federal resources would be allocated and program decisions would be made in the absence of valid evidence of the effectiveness of various reading comprehension programs.

7. Special Circumstances

There are no special circumstances.

8. Federal Register Announcement and Consultation

a. Federal Register Announcement

We will publish a 30-day Federal Register Notice to allow public comment.

b. Consultations Outside the Agency

A panel of reading experts was established to select the reading comprehension programs through a competitive process and consists of the following nationally recognized reading researchers:

  • Dr. Donna Alvermann, University of Georgia

  • Dr. Timothy Shanahan, University of Illinois-Chicago

  • Dr. Joseph Torgesen, Florida State University

  • Dr. Joanna Williams, Columbia University

In addition, to provide advice on the study, the evaluation team formed a technical working group (TWG) of researchers who combine expertise in large-scale random assignment studies and impact evaluation, knowledge of reading comprehension, and familiarity with interventions designed to improve reading comprehension. The evaluation team has consulted with the TWG on the overall study design, the data collection plan, and the survey instruments.

Members from the panel of reading experts also serve on the TWG. This promotes further continuity between the rationale for selecting reading comprehension interventions and the approach taken in the evaluation of the impacts on reading comprehension of expository text in science and social studies. In addition to the panel members, the technical working group includes:

  • Dr. Mark Berends, Vanderbilt University

  • Dr. Isabel Beck, University of Pittsburgh

  • Dr. Thomas Cook, Northwestern University

  • Dr. David Francis, University of Houston

  • Dr. Larry Hedges, University of Chicago



c. Unresolved Issues


None.

9. Payment or Gift to Respondents

For Component 1, we propose providing all schools with a $500 non-monetary incentive for study-related efforts by school administrators and teachers. This amount is one-third of the incentive payment offered last year to control schools participating in baseline and follow-up testing, observations, the teacher survey, and school records collection. A lesser amount may not be sufficient to gain the cooperation of middle schools. We will also provide $20 to teachers who complete the teacher survey; this is the same amount paid to teachers who completed the survey in Year 1.

For Component 2, we again propose a $1,500 payment for each control school for study-related efforts by school administrators and teachers. This will facilitate recruitment of schools for the second year, maintain integrity of the control group, and serve as an alternative treatment to minimize the possibility of a Hawthorne effect for the treatment schools. We will also provide $20 to teachers who complete the teacher survey.

10. Confidentiality of the Data

The data collection efforts that are the focus of this clearance package will be conducted in accordance with all relevant federal regulations and requirements. These include the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires that all collection, maintenance, use, and wide dissemination of data by the Institute “conform with the requirements of section 552a of Title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment. In addition, for student information, the data collection efforts will ensure that all individually identifiable information about students, their academic achievements, and their families, as well as information with respect to individual schools, remains confidential in accordance with those same provisions. The study will also adhere to the requirements of subsection (d) of Section 183, which prohibits disclosure of individually identifiable information and makes the publishing or inappropriate communication of such information by employees or staff a felony.

Data to be collected will not be released with individual student, teacher, or school identifiers. Data will be presented in aggregate statistical form only. A statement to this effect is included in a letter accompanying each questionnaire and will be read to students by a field examiner before administering tests and by an interviewer before completing a telephone survey. All MPR interviewers and field examiners will be knowledgeable about confidentiality procedures and will be prepared to describe them in full detail, if needed, or to answer related questions raised by respondents. Respondents will be assured that all information identifying them or their school or program will be kept confidential.

The following safeguards are routinely employed by MPR to carry out confidentiality assurances:

  • All employees at MPR sign a confidentiality pledge (Appendix G) emphasizing its importance and describing their obligation.

  • Access to sample selection data is limited to those who have direct responsibility for providing and maintaining sample locating information. At the conclusion of the research, these data are destroyed.

  • Identifying information is maintained on separate forms and files, which are linked only by sample identification number.

  • Access to the file linking sample identification numbers with the respondents’ ID and contact information is limited to a small number of individuals who have a need to know this information.

  • Access to the hard copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded material is shredded.

  • Computer data files are protected with passwords and access is limited to specific users. With especially sensitive data, the data are maintained on removable storage devices that are kept physically secure when not in use.

The Privacy Act of 1974 applies to this collection. MPR will make certain that all surveys are held strictly confidential, as described above, and that in no instance will responses be made available except in tabular form. Under no condition will information be made available to school or program personnel. Project and school staff responsible for assisting MPR in the data collection will be fully informed of MPR’s policies and procedures regarding confidentiality of the data.

11. Additional Justification for Sensitive Questions

No questions of a highly sensitive nature are included in the teacher questionnaire. Teachers will be asked to provide only demographic (ethnicity, race, age), educational, and professional background information. Such items may be sensitive to some respondents, but they are important as variables that may be associated with student outcomes.

The questions are worded in a sensitive, nonjudgmental manner and have been successfully pretested and used extensively in previous studies with no evidence of harm. Furthermore, survey responses will be strictly confidential, as described above, and responses will not affect the teachers’ professional status in any way.

12. Estimates of Hour Burden

The total reporting burden associated with this data collection is 1,044 hours, including 44 hours for the teacher survey and 1,000 hours for school records. (See Table 2 below.)

TABLE 2

ESTIMATED RESPONSE TIME


Instrument / Respondents / Response Time / Total Time

Teacher survey (teachers):

  Grade 6 teachers (2007-08): 240 respondents (80 schools x 3 teachers) at 10 minutes each = 40 hours
  Control teachers (2007-08): 1 Year 1 nonrespondent at 15 minutes = 0.25 hours
  Treatment teachers (2007-08): 10 Year 1 nonrespondents at 20 minutes each = 3.33 hours

School records form (Component 2 only): 89 schools; 10 minutes x 6,000 students = 1,000 hours

TOTAL: 1,044 hours
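
As a check on the figures in Table 2, the short calculation below reproduces the burden arithmetic, converting per-respondent minutes to hours; all inputs come directly from the table.

# Burden arithmetic behind Table 2 (minutes converted to hours).
teacher_survey_hours = (240 * 10 + 1 * 15 + 10 * 20) / 60  # 43.58, reported as 44
school_records_hours = (6000 * 10) / 60                    # 1,000 hours
total_hours = teacher_survey_hours + school_records_hours  # 1,043.58, reported as 1,044
print(round(teacher_survey_hours, 2), school_records_hours, round(total_hours))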




13. Estimate of Total Annual Cost Burden to Respondents or Record-Keepers

None.

14. Estimates of Annualized Cost to the Federal Government

The estimated cost to the federal government for extending the Evaluation of Reading Comprehension Interventions for another year—including processing and analyzing the data, preparing reports, and a restricted-use data file and documentation—is approximately $7.1 million. This amount includes approximately $1.5 million for Component 1 and $5.6 million for Component 2.

15. Reasons for Program Changes or Adjustments

This is a one-year extension of a data collection with a program change of 1,044 hours.

16. Plan for Tabulation and Publication and Schedule for Project

a. Tabulation Plans


Tabulation plans cover both the impact and implementation evaluations. Each is discussed below.

Implementation Evaluation. Classroom observation data will be used to assess the fidelity with which the interventions were implemented and the quality of the reading instruction in treatment and control classrooms. We will characterize the frequency and duration of reading comprehension instruction and the extent to which it conforms to current best practices in reading comprehension instruction. Data from the implementation evaluation will be used in statistical models to estimate the association between intervention inputs and impacts.

Impact Evaluation. For the impact evaluation, we will tabulate the outcomes (mean reading achievement scores) for each treatment group and report the differences and regression-adjusted differences, indicating which are statistically significant. For Component 2, we will report these comparisons for outcomes measured at baseline as a check to ensure that treatment and control groups are similar at baseline. We will look for differences at both the school and student levels. School-level variables include enrollment, Title I status, percentage of students eligible for federally funded free and reduced-price lunch, and percentage of minority students. Student-level variables include standard demographic information (age, race and ethnicity, sex, federally funded free and reduced-price lunch status), baseline reading achievement, and key contextual variables (status as an English language learner and whether the student has an Individualized Education Program or 504 Service Agreement).
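
To illustrate what the regression-adjusted differences above involve, the sketch below fits an ordinary least squares model of the spring score on a treatment indicator and the baseline score. The file name and column names are hypothetical, and for simplicity the sketch ignores the clustering of students within schools that the actual analysis would need to account for.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level file: 'score' is the spring outcome,
# 'treatment' is 1 for intervention students and 0 for controls,
# and 'baseline' is the fall pretest score.
df = pd.read_csv("student_scores.csv")

# Unadjusted difference in group means.
raw_diff = (df.loc[df["treatment"] == 1, "score"].mean()
            - df.loc[df["treatment"] == 0, "score"].mean())

# Regression-adjusted difference: the treatment coefficient after
# controlling for baseline achievement.
model = smf.ols("score ~ treatment + baseline", data=df).fit()
adj_diff = model.params["treatment"]
print(raw_diff, adj_diff, model.pvalues["treatment"])

The regression-adjusted estimate is generally more precise than the raw difference because baseline achievement explains much of the variation in spring scores.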

We will also tabulate impact estimates for selected subgroups of students and schools. Looking at impacts for subgroups can offer important insight into how interventions affect students. Intervention impacts might differ, for example, for boys and girls, across racial and ethnic groups, by income status, and for students with learning disabilities. The school records form gathers the data needed, along with the baseline tests, to classify sample members into the appropriate groups for these subgroup analyses.

b. Publication Plans

The evaluation report is scheduled to be completed in September 2009, following the completion of data collection for the 2007-08 school year. A key objective of the report is to identify and discuss effective intervention models and practices. Analytic techniques will range from descriptive statistics and impact analysis of data from surveys, records, and reading tests to qualitative analysis of information from the classroom observations.

c. Time Schedule


The timeline for Year 2 of the evaluation is shown in Table 3.

TABLE 3


SCHEDULE OF ACTIVITIES


Activity: Schedule

Implementation of Components 1 and 2: August 2007 to September 2008

Analysis and report: October 2008 to September 2009

17. Approval Not to Display the Expiration Date for OMB Approval

Approval not to display the expiration date for OMB approval is not requested.

18. Exception to the Certification Statement

No exceptions to the certification statement are being sought.

REFERENCES

Box, G.E.P., Hunter, W.G., and Hunter, J.S. (1978). Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. New York: John Wiley.

Bryk, A., Camburn, E., and Louis, K.S. (1999). “Professional Community in Chicago Elementary Schools: Facilitating Factors and Organizational Consequences.” Educational Administration Quarterly, 35, 751-781.

Gersten, R., Fuchs, L.S., Williams, J.P., and Baker, S. (2001). “Teaching Reading Comprehension Strategies to Students with Learning Disabilities: A Review of Research.” Review of Educational Research, 71, 279-320.

James-Burdumy, S., Myers, D., Deke, J., Mansfield, W., Gersten, R., Dimino, J., Dole, J., Liang, L., Vaughn, S., and Edmonds, M. (2006). The National Evaluation of Reading Comprehension Interventions: Design Report. Princeton, NJ: Mathematica Policy Research, Inc.

National Reading Panel (2000). Teaching Children to Read: An Evidence-Based Assessment of the Scientific Literature on Reading and Its Implications for Reading Instruction. Washington, DC: National Institute for Literacy.

Rosenshine, B., Meister, C., and Chapman, S. (1996). “Teaching Students to Generate Questions: A Review of the Intervention Studies.” Review of Educational Research, 66, 181-221.

Snow, C.E., Burns, M.S., and Griffin, P. (eds.) (1998). Preventing Reading Difficulties in Young Children. Washington, DC: National Academy Press.

Sparks, G.M. (1988). “Teachers’ Attitudes Toward Change and Subsequent Improvements in Classroom Teaching.” Journal of Educational Psychology, 80(1), 111-117.

Trabasso, T. (2003). “Teaching of Comprehension in Content Areas: Design Criteria.” Paper commissioned by the Institute of Education Sciences.


1 Pearson/AGS publishes a standardized passage comprehension subtest appropriate for sixth-grade students. We will need to work with the Educational Testing Service to develop a sixth-grade version of the reading comprehension tests that were used for fifth graders.

2 We will administer the same version of the assessments to all students, even those who are retained in fifth grade. Based on previous studies with similar populations, we expect very few students to be retained.

3 The purpose of the experiment was to measure and compare the reliability of classroom observations based on two experimental conditions: (1) low-intensity observations where the classroom was observed one time and (2) high-intensity observations where the classroom was observed three times by the same observer.


