OMB: 1850-0864


Summer Reading Program Study

OMB Clearance Request—Part B

Supporting Materials

February 2009

Prepared For:

Institute of Education Sciences

United States Department of Education

Contract No. ED‑06‑CO‑0017

Prepared By:

Regional Educational Laboratory—Southwest

Edvance Research, Inc.

9901 IH‑10 West, Suite 700

San Antonio, Texas 78230

(210) 558‑1902

(210) 558‑1075 (fax)

Contents

Supporting Statement for Paperwork Reduction Act Submission: PART B

B. Description of Statistical Methods

1. Respondent Universe and Sampling Methods

Development of an Overall Pool of Potential Sites

Recruitment

2. Procedures for Data Collection

Student Background Data

Parent Consent Forms and Student Interest Survey

Monitoring Postcard

Final Postcard

Pre- and Post-test

Statistical Methodology and Stratification

Estimation Procedures/Analysis Methods

Sample Characteristics and Baseline Group Equivalence

Fidelity of Implementation

Analysis for Assessing the Effects of SRP on Student Reading Achievement

Exploratory Analysis for Assessing the Effects of SRP on Student Reading Achievement Within Demographic Subgroups

Degree of Accuracy Needed

3. Procedures to Maximize Response Rates

4. Tests of Procedures to Be Undertaken

5. Individuals Consulted on Statistical Aspects of Design

References



List of Exhibits

Exhibit 1. Identified Districts in Texas

Exhibit 2. Data Collection Purposes and Responsibility

Exhibit 3. Sample size needed (assuming 80% participation rate)



Supporting Statement for Paperwork Reduction Act Submission: PART B

B. Description of Statistical Methods

1. Respondent Universe and Sampling Methods

This study is an evaluation of a Summer Reading Program (SRP) for students between their 3rd and 4th grade years who read below the 50th percentile on a nationally normed reading assessment and qualify for the free/reduced lunch program. The research design calls for up to three districts with a total of approximately 1,900 participating students. The study is being conducted as part of the Regional Educational Laboratory – Southwest (REL Southwest) contract No. ED-06-CO-0017. Therefore, the region of focus includes the states served by REL Southwest: Arkansas, Louisiana, New Mexico, Oklahoma, and Texas.

The study is a randomized controlled trial designed to evaluate the causal impact of the intervention. The study is not designed as a survey, and there is no attempt to collect a representative sample of some population of interest, nor any other kind of probability sample. Within selected districts, all schools will participate in the study. Within each participating school, all eligible students who have returned consent forms will be randomly assigned to either treatment or control conditions.

Because the study will utilize Lexile measures, we identified Texas as the state in which to recruit as all of its public schools/districts administer the TAKS assessment, which allows for the identification of a Lexile measure for each student that describes his or her reading ability. In addition, recruitment for the study will be concentrated in medium to large districts, as it will be easier to obtain the full sample of approximately 1,900 students in one large district or three medium-sized districts as compared with several smaller districts. Focusing efforts on one to three medium to large districts will also reduce travel costs associated with project management, data collection, and other activities.

To further focus the recruitment efforts, districts known to be currently planning summer reading programs will be contacted. We will work with these districts if they are willing to conduct the summer reading program as a randomized controlled trial under the guidance of REL Southwest and if they agree to all study-related data collection efforts.

Based on prior studies, it is expected that 80% of the students will provide pre- and post-test data for our primary analysis of the intent-to-treat impact of the Summer Reading Program. In accordance with the Office of Management and Budget Standards and Guidelines for Statistical Surveys (OMB, 2006), in particular Guideline 3.2.9, there would not be any need to examine non-response bias if the response rate is at least 80 percent. We are not conducting a survey analysis, and therefore non-response bias is of less concern (since it would have to be differential non-response bias to affect the proposed analyses). Nevertheless, if the response rate is below 80%, we will examine non-response bias and, if bias is indicated, appropriate design weights will be used in the analysis to account for it.
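The study plan does not specify how such design weights would be constructed. One common approach is to model the probability of providing post-test data from baseline characteristics and weight respondents by the inverse of that estimated probability; the sketch below illustrates that approach only, and the data frame, column names (has_posttest and the covariates), and logistic specification are assumptions for illustration.

```python
# Illustrative sketch of non-response (inverse-probability) weighting.
# Assumes a student-level data frame with a 0/1 response indicator and baseline covariates.
import pandas as pd
import statsmodels.api as sm

def nonresponse_weights(df, covariates, response_col="has_posttest"):
    """Estimate inverse-probability-of-response weights from baseline covariates."""
    X = sm.add_constant(df[covariates].astype(float))
    model = sm.Logit(df[response_col].astype(int), X).fit(disp=0)
    p_respond = pd.Series(model.predict(X), index=df.index)
    # Respondents get weight 1 / P(response); non-respondents have no weight.
    weights = (1.0 / p_respond).where(df[response_col] == 1)
    return df.assign(weight=weights)

# Example: weighted = nonresponse_weights(students, ["pretest", "female", "ell"])
```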

The postcards (which are brief surveys in postcard format) are expected to have an 80% return rate, based on prior summer reading programs conducted by MetaMetrics, as noted in the Estimates of Hour Burden section of Part A. The postcards are used only descriptively and are not part of any inferential or confirmatory analysis.

Development of an Overall Pool of Potential Sites

REL Southwest used the Common Core of Data (CCD) website, which utilizes publicly available datasets to describe characteristics of schools and districts, to identify Texas districts with student enrollments of 25,000 or greater. Based on the CCD results, 42 districts were a match (see Exhibit 1. Identified Districts in Texas).



Exhibit 1. Identified Districts in Texas

Aldine ISD

El Paso ISD

Mesquite ISD

Alief ISD

Fort Bend ISD

North East ISD

Amarillo ISD

Fort Worth ISD

Northside ISD

Arlington ISD

Garland ISD

Pasadena ISD

Austin ISD

Houston ISD

Pharr-San Juan-Alamo ISD

Brownsville ISD

Humble ISD

Plano ISD

Carrollton-Farmers Branch ISD

Irving ISD

Richardson ISD

Clear Creek ISD

Katy ISD

Round Rock ISD

Conroe ISD

Keller ISD

San Antonio ISD

Corpus Christi ISD

Killeen ISD

Socorro ISD

Cypress-Fairbanks ISD

Klein ISD

Spring ISD

Dallas ISD

Lewisville ISD

Spring Branch ISD

Ector County ISD

Lubbock ISD

United ISD

Edinburg CISD

Mansfield ISD

Ysleta ISD

Recruitment

After identifying the potential districts, we will contact each of the 42 sites listed in Exhibit 1. If a district expresses interest in the study, we will schedule a meeting with that district to learn more about the number of students who (1) have a Lexile measure below the 50th percentile and (2) qualify for free/reduced lunch.

We will request that interested districts provide us with a letter of intent to participate, signed by the appropriate district-level administrator (such as the superintendent or research director). Based on the total number of students per district, we will select one to three districts that qualify for the SRP and have provided REL Southwest with written permission to participate in the study.

2. Procedures for Data Collection

Data collection will be carried out by REL Southwest and MetaMetrics, Inc. REL Southwest will have responsibility for managing data collection and ensuring quality, coordination, and timeliness, as well as collecting student background data from the district. REL Southwest is also in charge of creating student-level rosters with the district, updating these rosters periodically, and creating unique study IDs for all participating students.

All collected data will be processed for data entry by REL Southwest. As a part of the general data management, REL Southwest will track response rates, using the unique study ID numbers created. REL Southwest will be responsible for converting responses into electronic analysis files, and ultimately producing public use data sets in accordance with the requirements of the Department of Education. Data files with unique study IDs will be sent to MetaMetrics. MetaMetrics will provide a database of books and associated Lexile measures and interest areas. REL Southwest will create a computer program that will randomly select eight books from the database for each student based on the student’s Lexile measure and interests indicated by the student on the Student Interest Survey. A book publisher will be contracted to ship the eight books to the students. This publisher will be required to conform to the confidentiality rules around the names and addresses of the students. This publisher will not be allowed access to this data for any purpose other than to distribute books for this study. Upon completion of the study, the publisher will destroy all study-related data. Exhibit 2 shows data collection purpose and the timing of different data collection activities.

Exhibit 2. Data Collection Purposes and Responsibility

| Responsible Organization | Data Collection Instrument | Provide Context/Covariates | Measure Outcomes | Early Summer 2009 | Mid/Late Summer 2009 | Fall 2009 | Winter/Spring 2010 |
|---|---|---|---|---|---|---|---|
| REL Southwest | Student Background Data | X | | X | | | |
| REL Southwest | Student Interest Survey | X | | X | | | |
| REL Southwest | Pre- and post-test scores obtained from extant data from participating districts | X | X | X | | X | Depending on the measure used by the district, may collect in spring 2010 |
| REL Southwest | Monitoring Postcard (8 quantity) | X | | | X | | |
| REL Southwest | Final Postcard | X | | | X | | |

Student Background Data

REL Southwest will request student-level achievement/Lexile measure and demographic data from school districts as soon as possible after the recruitment has been completed. The type of the request (i.e., district-wide or a tailored school-level request) will depend on each school district’s preferences. The data sets will include identifiers because the data have to be linked to specific children in the study. Once study IDs have been created for students, the identifiers will be removed from the data set. The data will be stored in accordance with the IES Confidentiality statute as indicated in Part A.

Parent Consent Forms and Student Interest Survey

REL Southwest will be responsible for working with each district and school to distribute and collect Parent Consent Forms. The Student Interest Survey will be attached to each Parent Consent Form to facilitate completion and an 80% or higher response rate. The Student Interest Survey will take approximately 3 minutes to complete. The parent is asked to work with the student on the survey. The survey gathers information on the types of reading (e.g., history, animals, mystery) the student would find of interest over the summer.

All students will fill out the Student Interest Survey prior to the end of the 2008/09 school year. The student's interests and Lexile measure from the district will be entered into a database with the student's unique ID. The database will be used to select eight books matching the student's reading ability and interest.
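The study plan specifies only that eight books will be matched to each student's Lexile measure and interests. The sketch below illustrates one possible selection rule; the field names, Lexile tolerance band, and fallback behavior are assumptions for illustration, not the program's actual matching logic.

```python
# Illustrative sketch of selecting eight books per student (hypothetical fields and tolerance).
import random

def select_books(student, book_db, n_books=8, lexile_band=100, seed=None):
    """Randomly pick n_books whose Lexile measure falls within lexile_band of the
    student's measure and whose topic matches one of the student's interests."""
    rng = random.Random(seed)
    candidates = [
        b for b in book_db
        if abs(b["lexile"] - student["lexile"]) <= lexile_band
        and b["topic"] in student["interests"]
    ]
    if len(candidates) < n_books:        # fall back to an ability-only match
        candidates = [b for b in book_db
                      if abs(b["lexile"] - student["lexile"]) <= lexile_band]
    return rng.sample(candidates, min(n_books, len(candidates)))

# Example:
# books = select_books({"lexile": 520, "interests": {"animals", "mystery"}}, book_db)
```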

Students will be randomly assigned to either the treatment or control group. Students assigned to the treatment group will be sent books during summer 2009: four books in early summer 2009 and an additional four books four weeks after the first mailing. Students in the control group will be sent eight books at the beginning of summer 2010.

Monitoring Postcard

Monitoring postcards, which are brief surveys in postcard format, will be sent to the treatment students' mailing addresses provided by the district. (Note that these postcards are mailed to the students in an envelope. The postcards themselves have only an identification number on them and do not contain the student's name. Additionally, the postcards are pre-addressed and pre-stamped for easy return.) Treatment students will be asked to read approximately one book a week over the summer. To monitor the students' progress and amount of reading over the summer, students will be sent one postcard a week for eight weeks. Students are asked to write the title of the book they have just read and to answer a few questions about the book. Students are then asked to send the pre-addressed, prepaid postcard back to the study team. REL Southwest will enter the data collected from the postcards. The information from the monitoring postcards will be used for descriptive purposes only.

Final Postcard

Final postcards, which are brief surveys in postcard format, will be sent to the treatment and control students’ mailing addresses provided by the district. (Note that these postcards are mailed to the students in an envelope. The postcards themselves only have an identification number on them and do not contain the student’s name. Additionally, the postcards are pre-addressed and pre-stamped for easy return.) The primary purpose of the final postcard is to collect information about each student to determine the amount/level of reading for all students over the summer as well as to determine which students might be involved in other summer learning programs (e.g., summer school). Students are then asked to send the self-addressed, prepaid postcard back to the study team. REL Southwest will enter the data collected from the postcard.

Pre- and Post-test

REL Southwest will collect pretest and post-test student data from participating districts. Pretest data consists of students' reading scores from the spring TAKS administration. In addition to serving as the pretest, these scores will be used to determine which students fall below the study's cut point and can be included in the study. The post-test data consists of SRI scores.

Statistical Methodology and Stratification

The mission of REL Southwest is to conduct research that is relevant for the Southwest region of the United States. This is a study about the effectiveness of a SRP providing free, targeted books to low-income, struggling readers during the summer between the 3rd and 4th grades. Struggling readers are defined as those who score below the 50th percentile on national norms for a reading test that reports Lexile measures. Low-income students are defined as those receiving free or reduced lunch.

The study will be conducted in one or more Texas districts with a high proportion of low-income struggling readers. A sufficient number of districts will be recruited to obtain at least 1,900 eligible students.

This study is a randomized controlled trial in which the eligible students in each school will be randomly assigned to a treatment or control group. In order to be eligible, students will need to be low-income, score below the 50th percentile on national norms for a reading test that reports Lexile measures, and be able to read in English. English language learners (ELLs) and bilingual students are eligible; however, monolingual non-English speakers will not be included in the study.

The treatment students will receive eight free books during the program summer. These books will be selected to match the students’ individual reading abilities (via Lexile measures) as well as self-selected areas of interest. The control students will not receive books during the intervention summer. However, control students will receive eight free books the following summer, though this will not be part of the data analysis.

Students will be excluded from data collection if their parents do not consent to participation in the study. Thus, the final sample will include all eligible students whose parents/guardians allow study participation.
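The assignment procedure is described only at this level of detail. The sketch below illustrates one way a simple 1:1 within-school randomization could be carried out; the data structure, field names, and fixed seed are assumptions for illustration.

```python
# Illustrative sketch of within-school random assignment (hypothetical data structure).
import random

def assign_within_schools(students, seed=20090601):
    """Randomly split the consented, eligible students in each school into
    treatment and control groups of (nearly) equal size."""
    rng = random.Random(seed)
    schools = {}
    for s in students:                               # group study IDs by school
        schools.setdefault(s["school_id"], []).append(s["study_id"])
    assignments = {}
    for school_id, ids in schools.items():
        rng.shuffle(ids)
        half = len(ids) // 2
        for sid in ids[:half]:
            assignments[sid] = "treatment"
        for sid in ids[half:]:
            assignments[sid] = "control"
    return assignments
```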

Estimation Procedures/Analysis Methods

This study is intended to assess the effect of a SRP on student reading achievement gain/loss over the summer through a randomized controlled trial in which low-income, struggling readers are randomly assigned to treatment (receiving eight free interest- and ability-targeted books) or control (not receiving books) groups. The primary hypothesis to be tested is whether students who receive the books will demonstrate less summer reading loss than students who do not receive the books. To clarify, the test of the primary hypothesis will compare post-test scores between the two groups, adjusting for pretest scores. Prior to testing the primary hypothesis, preliminary data analyses will be conducted, including outlier analysis and a careful examination of the sample characteristics and baseline equivalence of the two study groups. In addition to the primary confirmatory analysis, various exploratory analyses will be conducted examining student demographic and fidelity data. Fidelity (i.e., the degree to which the students actually read the books) will be examined only descriptively, and all students, regardless of level of fidelity, will be included in the intent-to-treat analysis of the primary hypothesis.

Sample Characteristics and Baseline Group Equivalence

The primary focus of preliminary data analysis will be on sample characteristics and group equivalence at baseline. Descriptive analyses of sample characteristics (e.g., demographic composition and attrition) will be performed with both the full sample and the two study groups separately.

Although the random assignment of the study sample is expected to produce two groups that are statistically equivalent on all measured, as well as unmeasured characteristics, there may still be differences between the study groups due to sampling error. Moreover, post-randomization attrition of the study participants may affect the baseline equivalence of the SRP group and the control group in the analytic sample; we will therefore examine the baseline equivalence of treatment and control groups for the post-test sample. Significant baseline differences between the study groups, if not properly controlled, will lead to biased estimates of the program’s impacts. Therefore, it is essential to examine baseline group equivalence prior to conducting the impact analyses, so that significant baseline differences can be adequately controlled through the use of covariates in the impact analyses.

Specifically, group equivalence of the analytic sample will be assessed by comparing the SRP group and the control group on the following student characteristics:

  • Student characteristics: sex, race, ELL status

Differences in the above characteristics between the two groups will be tested using independent-samples t-tests. Significant differences based on the t-tests will be statistically controlled in the main impact analyses.1
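For illustration, the baseline comparisons described above could be run as in the sketch below. The column names are hypothetical, and characteristics such as sex, race, and ELL status are assumed to be coded as 0/1 indicators before comparison.

```python
# Illustrative sketch of baseline equivalence checks (hypothetical column names).
from scipy import stats

def baseline_comparison(df, characteristics, group_col="srp"):
    """Independent-samples t-tests comparing SRP (group_col == 1) and control (== 0) students."""
    results = {}
    for var in characteristics:
        treat = df.loc[df[group_col] == 1, var].dropna()
        control = df.loc[df[group_col] == 0, var].dropna()
        res = stats.ttest_ind(treat, control)
        results[var] = {"t": res.statistic, "p": res.pvalue,
                        "significant": res.pvalue < 0.05}
    return results

# Example: baseline_comparison(students, ["female", "ell", "black", "hispanic"])
```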

Fidelity of Implementation

It should be noted that the intervention being assessed is enrollment in a SRP, in which the books are made available to the students. This intervention is fully implemented by providing the books to the students, regardless of the degree to which the students actually read the books.

Nevertheless, it is of interest for descriptive analyses to examine the degree to which the students read the books. Data on students’ reading of the books will be collected throughout the summer via postcards (as described in Section 2).

Analysis for Assessing the Effects of SRP on Student Reading Achievement

The primary hypothesis of this study will be tested using ordinary least-squares (OLS) models that compare the outcomes of students in the SRP with those of students assigned to the control group. Student outcomes will be modeled as a function of students' pretest scores, SRP status, and school fixed effects. Although randomization will not require the use of covariate adjustments to obtain unbiased estimates of the program's effects, the inclusion of covariates strongly related to the outcome, particularly pretest scores, will lead to improved statistical precision of the parameter estimates (Bloom, Richburg-Hayes, & Black, 2005; Raudenbush, Martinez, & Spybrook, 2005). Moreover, the use of covariates can also adjust for significant group differences that occur by chance.

Yi = π0 + π1*(Pretest)i + π2*(SRP)i + Σg=2…G πg*(school_g)i + ei

Where:

Yi is the outcome for student i;2

(Pretest)i is the pretest score of student i;3

(SRP)i is an indicator variable representing whether the student was assigned to the treatment group;

π0 is the model intercept (the adjusted average outcome for control students in the reference school);

π1 is the effect of the pretest on the outcome of student i;

π2 is the difference in the outcome between treatment and control students;

school_g, g = 2, 3, …, G, are (G-1) dummy indicator variables representing the G schools, with school_1 as the omitted reference school;

πg, g = 2, 3, …, G, represent the (G-1) fixed school effects for the G schools; and

ei is a random error associated with student i; ei ~ N(0, σ2).

In addition to the statistical significance of SRP effect, the analysis will also gauge the magnitude of the effect with the effect size index. Specifically, the effect size will be computed as a standardized mean difference (Hedges’s g) by dividing the adjusted mean difference (π2 ) by the unadjusted pooled within-group standard deviation of the outcome measure.
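For illustration, the impact model and effect size described above could be estimated as in the sketch below. Column names such as posttest, pretest, srp, and school are assumptions; the specification mirrors the OLS model with pretest, treatment status, and school fixed effects, and the effect size divides the adjusted treatment-control difference by the unadjusted pooled within-group standard deviation of the outcome.

```python
# Illustrative sketch of the impact model and Hedges's g effect size (hypothetical columns).
import numpy as np
import statsmodels.formula.api as smf

def estimate_impact(df):
    """OLS of the post-test on pretest, treatment indicator, and school fixed effects."""
    model = smf.ols("posttest ~ pretest + srp + C(school)", data=df).fit()
    adj_diff = model.params["srp"]            # adjusted treatment-control difference (π2)

    # Unadjusted pooled within-group standard deviation of the outcome.
    treat = df.loc[df["srp"] == 1, "posttest"]
    control = df.loc[df["srp"] == 0, "posttest"]
    pooled_sd = np.sqrt(
        ((len(treat) - 1) * treat.var(ddof=1) + (len(control) - 1) * control.var(ddof=1))
        / (len(treat) + len(control) - 2)
    )
    return {"impact": adj_diff,
            "p_value": model.pvalues["srp"],
            "effect_size": adj_diff / pooled_sd}
```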

Degree of Accuracy Needed

Based on the positive effect sizes for reading gains shown in previous research conducted on SRPs (e.g., Kim, 2006), we have designed a study that can detect a minimum detectable effect size of approximately 0.12 for the main treatment effect for the primary hypothesis. The power analysis assumes a design in which students are randomly assigned to treatment or control conditions. In particular, it uses formulas presented in Murray (1998, pp. 378–380) for mixed-model ANCOVA with stratification and regression adjustment for covariates.

The power calculations are based on the following additional assumptions:

  • Desired statistical power—80 percent.

  • Statistical significance level—0.05 (two-tailed).

  • Minimum detectable effect size—assume desired MDES=.12.

  • Explanatory power of the pretest—assume that the pretest will explain 30 percent of the variance in the post-test (R2 = 0.3), with the resultant reduction in error variance.

  • Attrition and parental consent return rate—assume that 80% of eligible students will return signed consent forms and provide outcome data. This is consistent with consent form rates in other similar studies we have conducted.

Exhibit 3 includes findings from the power analyses incorporating the above assumptions. This table provides the required sample size (per group) for MDES ranging from 0.10 to 0.13 and with R2 ranging from 0 to 0.5. Our targeted sample size is based on MDES=0.12 and R2 = 0.3, which is 948 students per condition.

Exhibit 3. Sample size needed (assuming 80% participation rate)

Sample size needed per group, by MDES and R2:

| MDES | R2 = 0 | R2 = 0.3 | R2 = 0.5 |
|---|---|---|---|
| 0.10 | 1,945 | 1,361 | 974 |
| 0.12 | 1,354 | 948 | 678 |
| 0.13 | 1,154 | 809 | 578 |
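The per-group sample sizes in Exhibit 3 can be roughly reproduced with a standard two-group normal-approximation formula, inflated for the assumed 80 percent participation rate. The sketch below uses that simpler formula rather than the exact Murray (1998) mixed-model ANCOVA formulas, so it matches Exhibit 3 only approximately.

```python
# Approximate per-group sample size for a two-group comparison with a covariate
# explaining R2 of the outcome variance, inflated for an expected participation rate.
from scipy.stats import norm

def approx_n_per_group(mdes, r2=0.0, power=0.80, alpha=0.05, participation=0.80):
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_analysis = 2 * (z ** 2) * (1 - r2) / mdes ** 2   # students providing outcome data
    return n_analysis / participation                   # students to recruit per group

# approx_n_per_group(0.12, r2=0.3) -> roughly 950, close to the 948 shown in Exhibit 3
```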

3. Procedures to Maximize Response Rates

Although this is a randomized controlled trial and we are not attempting to collect a representative sample, we still wish to maximize response rates. Therefore, the following steps will be taken to obtain high response rates and high-quality data:

  • Clear Parental Consent Forms that explain the purpose of the study and related data collection without jargon.

  • Preparation of high-quality materials (i.e., Student Interest Survey, Monitoring Postcard, and Final Postcard) that are clear and do not burden students excessively.

  • Assignment of a staff member(s) with experience in complex data collection as data manager. This person will be responsible for:

    • Building and maintaining good working relationships with the school districts and school personnel.

    • Scheduling data collection.

    • Overseeing and participating in data collection.

  • Collaboration with a point person at each school so that any address changes can be handled in a timely fashion.

4. Tests of Procedures to Be Undertaken

The assessments in this study will not be specifically piloted or tested as part of this study, since they are either commercially developed tests or have been used in earlier studies.

The items in the Student Interest Survey are based on postcards created by MetaMetrics in a previously conducted SRP pilot based upon James Kim's research and program design. In April 2009, we conducted a small pretesting pilot with 8 third-grade children to identify any usability or other concerns with both the Student Interest Survey and the postcards. The students were given the instruments and were asked to read them and fill them out. Students were able to follow the directions and complete both instruments with no problems. Students were then interviewed regarding the readability and understandability of both instruments, and it was determined that the instruments are appropriate for third-graders.

The pretest scores will be from the spring 2009 TAKS administration. The post-test scores will be from a fall 2009 administration of the SRI. Test data will be provided by the district.

5. Individuals Consulted on Statistical Aspects of Design

The following individuals were consulted on the statistical aspects of the study’s design:

  • Dr. Chuck Wilkins, Senior Statistician, REL Southwest

  • Dr. Kim Brunnert, Senior Researcher, REL Southwest

In addition to the above, members of the TWG (listed in Part A) provided substantial input to the study design and data collection plan. The members of the TWG are:

Technical Working Group Members

| Expert | Affiliation |
|---|---|
| Dr. Roger Bybee | Biological Sciences Curriculum Study (BSCS) |
| Dr. David Chard | University of Oregon |
| Dr. David Francis | University of Houston |
| Dr. Jeremy Kilpatrick | University of Georgia |
| Dr. David Myers | American Institutes for Research |







References

Bloom, H. S., Richburg-Hayes, L., & Black, A. R. (2005). Using covariates to improve precision: Empirical guidance for studies that randomize schools to measure the impacts of educational interventions. New York: MDRC.

Kim, J. S. (2006). Effects of a voluntary summer reading intervention on reading achievement: Results from a randomized field trial. Educational Evaluation and Policy Analysis, 28(4), 335-354.

Murray, D. M. (1998). Design and analysis of group-randomized trials. New York: Oxford University Press.

Raudenbush, S. W., Martinez, A., & Spybrook, J. (2005). Strategies for improving precision in group-randomized experiments. New York: William T. Grant Foundation.



1 These analyses will not make corrections for multiple comparisons. The purpose of these tests is to identify whether the two study groups are equivalent at baseline. Consequently it is preferable to be conservative and use t-tests uncorrected for multiple comparisons, as such corrections would make it harder to detect significant baseline differences.

2 The outcome or post-test measure will be the Scholastic Reading Inventory (SRI).

3 The pre-test measure will be the score from the spring administration of the Texas Assessment of Knowledge and Skills (TAKS).
