September 22, 2014

U.S. Department of Education

PPSS TO 13. High School Reform Study

Information Package





September 22, 2014



OMB Clearance Package



PPSS Task Order 13. High School Reform Study


Contract Number GS-10F-0554N, Order # ED-PEP-11-O-0090/TO13
SRI Project #P21496













Submitted to:

Joanne Bogart
Policy and Program Studies Service

U.S. Department of Education

400 Maryland Avenue, SW

Washington, DC 20202



Prepared by:

Christine Padilla, Ellen Schiller, and Rebecca Schmidt, SRI International
Deborah Herget, RTI International


Revised Supporting Statement, Part B
Paperwork Reduction Act Submission

B. Collection of Information Employing Statistical Methods

B.1. Sampling Design

Potential Respondent Universe

The study will survey a nationally representative sample of high school principals to examine the prevalence and characteristics of high school reform strategies, particularly dropout prevention strategies. The proposed survey of 2,088 schools is appropriate for providing descriptive information on these questions based on a representative sample of the universe of public high schools (see section B.2 for a discussion of the degree of precision this sample will provide).

Selection of the sample and administration of the survey will be carried out by the Research Triangle Institute (RTI).

The sampling frame for this survey is based on the Common Core of Data (CCD). Public high schools providing instruction to 12th-grade students in the fall of 2010 will be included unless (1) the lowest offered grade is 11th grade or higher, (2) there are fewer than five students in grades 9 through 12, (3) the percentage of students enrolled in grades 9 through 12 is under 20 percent of the total school enrollment and the total number of students in grades 9 through 12 is fewer than 20, or (4) the school name contains one of nine keywords indicating a juvenile detention center or hospital. Of the 103,813 total schools listed in the 2010–11 CCD, 22,447 high schools meet the criteria for inclusion in the sampling frame.
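A minimal sketch of how these exclusion rules could be applied to a CCD extract is shown below. The column names and the keyword list are illustrative placeholders only; this statement does not specify the CCD field names or the nine keywords used by RTI.

    import pandas as pd

    # Illustrative keyword list; the actual nine keywords are not listed in this document.
    EXCLUSION_KEYWORDS = ["juvenile", "detention", "hospital"]

    def build_sampling_frame(ccd: pd.DataFrame) -> pd.DataFrame:
        """Apply the four exclusion rules to a CCD extract of schools serving 12th graders.

        Assumes hypothetical columns: lowest_grade (int), enroll_9_12 (int),
        total_enroll (int), and school_name (str).
        """
        frame = ccd[ccd["lowest_grade"] < 11]                      # rule 1: lowest grade below 11th
        frame = frame[frame["enroll_9_12"] >= 5]                   # rule 2: at least 5 students in grades 9-12
        share_9_12 = frame["enroll_9_12"] / frame["total_enroll"]
        small_program = (share_9_12 < 0.20) & (frame["enroll_9_12"] < 20)
        frame = frame[~small_program]                              # rule 3: drop low-share, low-count schools
        name_flag = frame["school_name"].str.lower().str.contains("|".join(EXCLUSION_KEYWORDS))
        return frame[~name_flag]                                   # rule 4: drop detention/hospital schools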

Sample Selection Process

The sample will be selected using a stratified random sampling approach. The research team at RTI classified schools in the sampling frame into locale (urban, suburban, and rural) and graduation rate (high and low) strata. RTI combined the 12 locale codes provided in the CCD into three broader groups. Exhibit 2 shows the three values of the locale sampling strata and their relationship to the 12 locale codes available in the CCD.


Exhibit 2. Sampling Locale and CCD Locale Codes

Sampling Locale    CCD Locale
Urban              11 - City, Large Territory
Urban              12 - City, Midsize Territory
Urban              13 - City, Small Territory
Suburban           21 - Suburb, Large Territory
Suburban           22 - Suburb, Midsize Territory
Suburban           23 - Suburb, Small Territory
Suburban           31 - Town, Fringe Territory
Suburban           32 - Town, Distant Territory
Suburban           33 - Town, Remote Territory
Rural              41 - Rural, Fringe Census-defined Rural Territory
Rural              42 - Rural, Distant Census-defined Rural Territory
Rural              43 - Rural, Remote Census-defined Rural Territory


Graduation rates for each school were gathered from EDFacts, a central repository of performance data supplied by K-12 state education agencies. To identify the two graduation rate strata (high and low), the RTI research team established a threshold of 80 percent to define high graduation rates, based on the ALL_RATE_1011 variable available through EDFacts. The 80 percent threshold was chosen because approximately 50 percent of schools reported graduation rates of 80 percent or less. Because schools often reported graduation rates in categories (e.g., 80 to 85 percent) rather than as a specific number, additional decision rules were necessary. Three categories include 80 percent as a lower bound (80 to 85 percent, 80 to 89 percent, and greater than or equal to 80 percent); all schools reporting rates in any of these three categories were classified in the “high graduation rate” stratum.
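The sketch below illustrates how the locale recode in Exhibit 2 and the 80 percent graduation rate threshold could be combined to assign each frame school to one of the six sampling strata. The column names and the parsing of categorical rate reports are assumptions made for illustration; they are not RTI's actual code or the exact EDFacts reporting formats.

    import pandas as pd

    # Map the 12 CCD locale codes to the three sampling locales (Exhibit 2).
    LOCALE_MAP = {
        11: "Urban", 12: "Urban", 13: "Urban",
        21: "Suburban", 22: "Suburban", 23: "Suburban",
        31: "Suburban", 32: "Suburban", 33: "Suburban",
        41: "Rural", 42: "Rural", 43: "Rural",
    }

    def graduation_stratum(rate_report: str) -> str:
        """Classify a reported graduation rate as 'High' or 'Low'.

        Treats any report whose lower bound is 80 or above (e.g., '83', '80-84',
        'GE80') as 'High'; the parsing here is a simplification for illustration.
        """
        text = rate_report.strip().upper().replace("GE", "").replace("%", "")
        lower_bound = float(text.split("-")[0])
        return "High" if lower_bound >= 80 else "Low"

    def assign_strata(frame: pd.DataFrame) -> pd.DataFrame:
        frame = frame.copy()
        frame["sampling_locale"] = frame["ccd_locale_code"].map(LOCALE_MAP)
        known = frame["grad_rate_report"].notna()
        # Schools with missing EDFacts rates are left unclassified here; they are
        # assigned to a graduation rate stratum by the imputation step described below.
        frame.loc[known, "grad_stratum"] = frame.loc[known, "grad_rate_report"].map(graduation_stratum)
        return frame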

There were 3,302 schools in the frame without graduation rate information in EDFacts. RTI used an imputation approach to assign these schools to either the high or the low graduation rate stratum. The imputation process began by examining the distribution of the high/low graduation rate classification by sampling locale for the 19,145 schools with known graduation rates. Exhibit 3 shows the distribution of these 19,145 schools by sampling locale and graduation rate classification.

Exhibit 3. Initial Distribution of HSRS Sampling Frame Schools by Sampling Strata

Graduation Rate\Locale    Rural    Suburban    Urban    Total
Total                     7,828    7,057       4,260    19,145
High                      5,351    4,449       1,748    11,548
Low                       2,477    2,608       2,512     7,597



The percentage of schools classified as high graduation rate was calculated separately for each locale sampling stratum: 68.4 percent of rural schools, 63.0 percent of suburban schools, and 41.0 percent of urban schools were classified as high graduation rate. Each of the 3,302 schools with unknown graduation rates was then randomly assigned to the high graduation rate stratum with probability 0.684 if the school was classified as rural, 0.630 if the school was classified as suburban, and 0.410 if the school was classified as urban.
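This locale-specific random assignment can be expressed compactly. The sketch below assumes the data frame from the previous sketch and uses the high-rate shares reported above; the random seed and column names are illustrative assumptions.

    import numpy as np
    import pandas as pd

    # Share of schools with known rates classified as "High", by sampling locale.
    P_HIGH = {"Rural": 0.684, "Suburban": 0.630, "Urban": 0.410}

    def impute_grad_stratum(frame: pd.DataFrame, seed: int = 2014) -> pd.DataFrame:
        """Randomly assign schools with missing graduation rates to the High or Low
        stratum using the locale-specific probabilities in P_HIGH."""
        rng = np.random.default_rng(seed)
        frame = frame.copy()
        missing = frame["grad_stratum"].isna()
        p_high = frame.loc[missing, "sampling_locale"].map(P_HIGH).to_numpy()
        draws = rng.random(missing.sum())
        frame.loc[missing, "grad_stratum"] = np.where(draws < p_high, "High", "Low")
        return frame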

The final distribution of the 22,447 schools by strata is shown in Exhibit 4.

Exhibit 4. Final Distribution of HSRS Sampling Frame Schools by Sampling Strata

Graduation Rate\Locale    Rural    Suburban    Urban    Total
Total                     9,141    8,180       5,126    22,447
High                      6,260    5,164       2,086    13,510
Low                       2,881    3,016       3,040     8,937



B.2. Procedures for Collection of Information

Initial Contact and Follow up

Procedures for collecting data for the High School Reform Study (HSRS) will commence upon receipt of OMB approval, anticipated in November 2014. Once OMB approval is received, a letter will be sent to school districts to notify them that the study team will contact selected schools in their district in about a week. If it is known that a school district requires a research application prior to contacting schools, that application will be submitted along with the district letter. Project staff will be prepared to comply with all district protocols and will not contact schools until district approval is received. The district letter will include a toll-free number that district staff can call to alert project staff to any requirements that must be met before contacting schools. If project staff are already in contact with schools and are informed by school or district staff that specific district requirements must be met before schools may commit to participating, contact with those schools will cease until the requirements are met and approval to resume contact is received.

Data collection with school administrators will commence about one week after district letters are sent. A letter and study fact sheet will be mailed to each principal emphasizing the importance of the study and providing the URL, user ID, and password for the online survey. School administrators will be informed that they may complete the 30-minute survey themselves or ask a designee to do so. A designee may be any person at the school who is knowledgeable about school policies and programs. Administrators who prefer to have someone else complete the survey will be asked to share the login information provided in the notification letter. The first question on the survey will ask for the name and title of the person completing it. A toll-free number and email address will be provided so school staff may contact project staff with questions as needed. Copies of the letters and the fact sheet can be found in Appendix B.

Prompting calls to nonresponding schools will begin after about three weeks to answer any questions about the study, remind administrators that they may delegate the survey to another knowledgeable staff member, and encourage participation. Administrators who prefer a telephone interview to the online survey may complete the survey by phone.

Up to three mailings will be sent to the school over the data collection period to remind staff to complete the online survey. Email reminders will be sent every 10 days.

The research team will email a link to a secure web-based version of the survey to the school principal in each of the 2,088 sampled schools on December 1, 2014. The survey will remain open through June 1, 2015. In June, the researchers will download the survey data and begin cleaning and analysis. A full draft of the survey is included in Appendix A.

Statistical Methodology and Estimation Procedures

The study will examine descriptive statistics for each survey question. For example, the researchers will produce tables with the frequency and percentage of schools responding “Yes” to each question asking whether a particular strategy is available at the school, such as: “In the 2014-15 school year, does your school have formal adult mentor(s)?” These data will answer the question about the prevalence and characteristics of key high school reforms in the nation’s public high schools. Researchers will weight responses to the total number of schools in each stratum and will report the standard error of each reported percentage in the appendix. Where appropriate, the study may also examine survey responses by school characteristics such as school type (e.g., regular school, alternative school) and school enrollment. The source of these variables is either the Common Core of Data (CCD) or survey background questions.
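The sketch below illustrates the kind of weighted percentage and standard error computation this implies, using standard stratified-design formulas with a finite population correction. The stratum population counts are taken from Exhibit 4; the respondent and “Yes” counts are purely illustrative.

    import numpy as np

    def stratified_percentage(yes_counts, resp_counts, pop_counts):
        """Weighted 'Yes' percentage and its standard error under a stratified
        simple random sample with a finite population correction."""
        yes = np.asarray(yes_counts, dtype=float)
        n_h = np.asarray(resp_counts, dtype=float)   # responding schools per stratum
        N_h = np.asarray(pop_counts, dtype=float)    # frame schools per stratum
        W_h = N_h / N_h.sum()                        # stratum weights
        p_h = yes / n_h                              # stratum-level proportions
        p_hat = np.sum(W_h * p_h)                    # weighted overall proportion
        fpc = 1.0 - n_h / N_h
        var = np.sum(W_h**2 * fpc * p_h * (1.0 - p_h) / (n_h - 1.0))
        return 100.0 * p_hat, 100.0 * np.sqrt(var)

    # Frame counts N_h from Exhibit 4 (High then Low, each rural/suburban/urban);
    # the respondent counts and "Yes" counts below are hypothetical.
    N_h = [6260, 5164, 2086, 2881, 3016, 3040]
    n_h = [261] * 6
    yes = [150, 140, 120, 130, 125, 118]
    pct, se = stratified_percentage(yes, n_h, N_h)
    print(f"Weighted 'Yes' percentage: {pct:.1f} (SE {se:.1f})")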

Degree of Accuracy Needed

RTI determined sample sizes by estimating the number of responding sample members needed to achieve certain power requirements and then adjusting that number upward to account for 25 percent school nonparticipation and for design effects arising from potential weight adjustments1 and imputation misclassification.

RTI estimated precision requirements using four assumptions. First, statistical tests will employ an alpha of 0.05 and will have a required power of 80 percent. Second, the underlying proportion was assumed to be 0.30 for purposes of calculating sample sizes for estimates and tests of proportions. Third, the design effect was assumed to be no greater than 1.6. Finally, the researchers expect a 25 percent nonresponse or nonparticipation rate among schools.

The selected total sample size of 2,088 schools is designed to support comparisons between schools from any two strata such that there is 80 percent power to detect a 15 percentage point difference in proportions. The sample of 2,088 is allocated equally across the six strata: 348 schools will be sampled from each stratum in order to obtain 261 responding schools per stratum.
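Under the four assumptions above, the per-stratum targets can be reproduced approximately with a standard two-proportion power calculation. The sketch below uses statsmodels and arrives at figures close to the 261 responding and 348 sampled schools per stratum stated here; small differences likely reflect rounding choices not documented in this statement.

    import math

    from statsmodels.stats.power import NormalIndPower
    from statsmodels.stats.proportion import proportion_effectsize

    ALPHA = 0.05           # two-sided significance level
    POWER = 0.80           # required power
    BASE_P = 0.30          # assumed underlying proportion
    DIFF = 0.15            # detectable difference (15 percentage points)
    DEFF = 1.6             # assumed design effect
    RESPONSE_RATE = 0.75   # 25 percent nonresponse

    # Cohen's h effect size for comparing proportions of 0.30 and 0.45.
    effect = proportion_effectsize(BASE_P + DIFF, BASE_P)

    # Responding schools needed per stratum under simple random sampling.
    n_srs = NormalIndPower().solve_power(effect_size=effect, alpha=ALPHA,
                                         power=POWER, ratio=1.0,
                                         alternative="two-sided")

    n_respondents = math.ceil(n_srs * DEFF)               # inflate for the design effect
    n_sampled = math.ceil(n_respondents / RESPONSE_RATE)  # inflate for nonresponse

    print(f"Responding schools needed per stratum: {n_respondents}")  # approximately 259
    print(f"Schools to sample per stratum: {n_sampled}")              # approximately 346
    print(f"Total initial sample across 6 strata: {6 * n_sampled}")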

These sample size calculations are sensitive to assumptions about nonresponse rates and to misclassification rates associated with imputation of graduation rate status. A reserve sample will be established, comprising 174 schools in each of the six sampling strata, so that additional schools are available if the nonresponse rate is substantially higher than 25 percent. The reserve sample also could be used to supplement the initial sample of 2,088 schools if the imputation classification was found to have a high error rate. However, misclassified schools must not be excluded from data collection; rather, additional schools could be added from the reserve sample in order to try to obtain 261 responding schools in each sampling stratum. The degree to which this may be possible depends on actual field data collection experiences and costs.

Use of Periodic Data Collection

The survey will only be administered once during the 2014–15 school year.

B.3. Methods for Maximizing Response Rate and Dealing with Nonresponse

Response Rate

The approach for gaining cooperation from school administrators includes a series of mailings and emails, with follow-up telephone calls to prompt for outstanding surveys. Letters to school districts and schools will be printed on Department letterhead and signed by high-ranking officials at the U.S. Department of Education’s Office of Planning, Evaluation and Policy Development and the High School Graduation Initiative (HSGI). The inclusion of an HSGI signature will lend legitimacy to the study, because many sampled schools and districts will be familiar with the HSGI grant program, which supports dropout prevention and reentry efforts.

With an estimated 2,088 schools in the sample and an expected 80 percent participation rate (see B.2), 1,670 school administrators should complete the survey. A six-month data collection is planned to allow school administrators to participate at a time most convenient to them. At the end of data collection, a school analytic weight will be constructed that will account for school nonresponse.
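A common way to construct such an analytic weight is to multiply each stratum's base sampling weight by the inverse of its response rate. The short sketch below illustrates this standard adjustment; it is not necessarily the exact adjustment RTI will use.

    def nonresponse_adjusted_weight(N_h: int, n_sampled_h: int, n_responding_h: int) -> float:
        """Base weight (N_h / n_sampled_h) times the inverse of the stratum response rate."""
        base_weight = N_h / n_sampled_h
        response_rate = n_responding_h / n_sampled_h
        return base_weight / response_rate   # simplifies to N_h / n_responding_h

    # Illustrative example: the rural high-graduation-rate stratum (N_h = 6,260 from
    # Exhibit 4), with 348 sampled schools and 261 respondents as planned in B.2.
    print(nonresponse_adjusted_weight(N_h=6260, n_sampled_h=348, n_responding_h=261))  # about 24.0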

Generalizability of the Sample

The research design for the survey relies on a simple random sample stratified by locale and graduation rate and is intended to capture descriptive information. As such, the findings from the survey will be generalizable to all public high schools within the sampling frame, as well as to schools in each locale and to schools with high and low graduation rates.

B.4. Test of Procedures and Methods

The research team has conducted internal pretesting of survey items to ensure clarity. Additionally, the research team confirmed that all items are related to the research questions, ensuring the survey will capture all necessary information.

Many of the survey questions have been adapted from relevant questions used in other national surveys of high school reforms. For example, some questions related to credit recovery programs have their roots in questions on the Fast Response Survey System–Alternative Schools and Programs Survey (2008) and Distance Education Survey (2010).

The research team will pilot the survey items with nine regionally dispersed district administrators in July. After the pilot participants test the instruments (i.e., take the survey), the researchers will have phone conversations with each participant to discuss clarity of wording and flow, interpretations of the questions, and any other issues that emerge. The researchers will revise the instruments based on this feedback.

B.5. Consultations on Statistical Aspects of the Design

The research team consulted with Dr. David Wilson, Senior Research Scientist at RTI International, on sampling for the survey. He can be reached at 919-541-6990.

Agency

Joanne Bogart of the U.S. Department of Education is the Contracting Officer’s Representative for the study. She can be reached at 202-205-7855.

Contractors

SRI International and RTI will be responsible for data collection, under the direction of
Ellen Schiller, who can be reached at 703-247-8503.










1 Design effects quantify the extent to which the survey’s sampling error varies from the sampling error expected when using a simple random sample.

