Studies of Rural Education Achievement Program (REAP) Grantee

OMB: 1875-0274



October 1, 2014



Fourth Draft OMB Package, Version Two – Part B (revised)



PPSS TO 10: Studies of Rural Education Achievement Program (REAP) Grantees

Subtask 3.5: Prepare Fourth Draft OMB Clearance Package

Contract Number GS-10F-0554N; Order Number ED-PEP-11-O-0090

SRI Project #P21494













Submitted to:

Andrew Abrams

Policy and Program Studies Service

U.S. Department of Education

400 Maryland Avenue, SW

Washington, DC 20202



Prepared by:

SRI International

Kyra Caspary, Chris Padilla, Nancy Adelman, Rebecca Schmidt, Erica Harbatkin, and Kaily Yee


SUPPORTING STATEMENT, PART B
PAPERWORK REDUCTION ACT SUBMISSION

B. Collection of Information Employing Statistical Methods

B.1. Sampling Design

Potential Respondent Universe

As discussed in Part A, the study will include a survey of a representative sample of REAP districts, interviews with REAP coordinators in all states receiving REAP funds in school year 2014–15,[1] and interviews with REAP coordinators in 30 districts receiving REAP funds. The REAP Program Office is interested in learning about REAP districts’ needs, their experiences with eligibility determination, how they use REAP funds, and the challenges and technical assistance needs they face. The proposed survey of 1,000 districts is appropriate to provide descriptive information on these questions using a representative sample of the universe of SRSA and RLIS grantees (see section B.2 for a discussion of the degree of precision this sample will provide).

Exhibit 8 provides an estimate of the sampling universe of REAP districts in school year 2014–15 using the numbers of districts in school year 2013–14. Nationwide, two-thirds of REAP districts received SRSA funds and one-third were eligible for RLIS funds in the 2013–14 school year. The REAP Office can provide lists of SRSA grantee districts and RLIS-eligible districts by November 2014 (in time for sampling), but does not maintain RLIS subgrantee lists. Therefore, the sampling frame for the survey comprises all districts that receive SRSA funds in school year 2014–15 and all districts that are eligible for RLIS funds in school year 2014–15. Although the number of districts actually receiving RLIS funds will be slightly lower than the number of RLIS-eligible districts, the RLIS take-up rate in school year 2013–14 was approximately 96 percent. The research team expects a similar take-up rate in school year 2014–15, so there should be few RLIS-eligible districts that are sampled for the survey and do not receive funds. The email invitation to REAP coordinators in RLIS districts will verify that the district received funds in school year 2014–15. Districts that did not receive funds will be excluded from the sample, and for each excluded district the research team will randomly select another district from the same stratum to take its place.
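This verification-and-replacement step can be sketched as follows in Python with pandas. The sketch is illustrative only; the DataFrame column names ('district_id', 'stratum', 'received_funds') are assumptions for the example, not fields defined by the study.

import pandas as pd

def replace_nonrecipients(sample, frame, seed=0):
    # Keep sampled RLIS districts confirmed (via the email invitation) to have
    # received funds in 2014-15; drop the rest.
    keep = sample[sample["received_funds"]]
    dropped = sample[~sample["received_funds"]]
    # Districts in the frame that have not already been sampled are available
    # as replacements.
    available = frame[~frame["district_id"].isin(sample["district_id"])]
    replacements = []
    for stratum, grp in dropped.groupby("stratum"):
        pool = available[available["stratum"] == stratum]
        # Draw one replacement per dropped district, at random, from the same stratum.
        replacements.append(pool.sample(n=len(grp), random_state=seed))
    return pd.concat([keep] + replacements, ignore_index=True)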





Exhibit 8. Universe of REAP Grantees by Grantee Type in 2013–14 [2]

[Figure not reproduced in this text version: counts of REAP districts by grantee type (SRSA grantees and RLIS-eligible districts) in 2013–14.]



RLIS and SRSA districts are concentrated in different parts of the United States, with the greatest number of SRSA districts in the Midwest and the greatest number of RLIS districts in the South (see Exhibit 9). In addition, SRSA districts are eligible to exercise a provision called REAP Flex that allows greater flexibility in the use of Title funds. In 2013–14, 49 percent of SRSA districts used the REAP Flex provision.





Exhibit 9. Total Number of SRSA and RLIS Districts per Region in 2013–14 [3]

Region      Census Division       Number of RLIS-Eligible Districts   Number of SRSA Districts
Midwest     East North Central                    341                         622
            West North Central                    170                       1,094
            Total                                 511                       1,716
Northeast   Middle Atlantic                       113                         214
            New England                            75                         270
            Total                                 188                         484
South       East South Central                    363                          18
            South Atlantic                        319                          22
            West South Central                    471                         793
            Total                               1,153                         833
West        Mountain                              114                         689
            Pacific                               122                         577
            Total                                 236                       1,266
Total                                           2,088                       4,299




Sample Selection Process

Survey of REAP Districts

Starting with the approximately 6,300 districts that receive SRSA funds or are eligible for RLIS funds in school year 2014–15, the survey sample selection process will ultimately identify approximately 1,000 districts that are representative of each geographic region and grantee type. Upon receipt of all state lists of RLIS-eligible and SRSA grantee districts for school year 2014–15, the research team will select a random sample of districts stratified by grantee type and census division.[4] There are two program type categories and nine census divisions, resulting in 18 strata (see Exhibit 10).
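A minimal sketch of this stratified selection step, written in Python with pandas, is shown below as an illustration only. The column names ('grantee_type', 'census_division') and the structure of the per-stratum targets are assumptions for the example, not specifications from the study.

import pandas as pd

def draw_stratified_sample(frame, targets, seed=2014):
    # frame   -- one row per district, with 'grantee_type' and 'census_division' columns
    # targets -- dict mapping (grantee_type, census_division) to the number of
    #            districts to sample in that stratum (18 strata in total)
    pieces = []
    for (grantee_type, division), n in targets.items():
        stratum = frame[(frame["grantee_type"] == grantee_type) &
                        (frame["census_division"] == division)]
        # Never ask for more districts than the stratum contains.
        pieces.append(stratum.sample(n=min(n, len(stratum)), random_state=seed))
    return pd.concat(pieces, ignore_index=True)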

The target sample for the study is 1,004 districts. Because the REAP Office is interested in learning about the particular experiences of SRSA grantees in choosing whether or not to exercise REAP Flex and how they use this provision, the study will allocate two-thirds of the sample to SRSA districts (668 districts) and one-third to RLIS districts (336 districts). Because the Flex use rate is approximately 50 percent, this allocation should yield approximately 334 SRSA districts that exercise Flex and 334 that do not. Because lists of REAP Flex users will not be available from the REAP Program Office in time to create the sampling frame, the study will not stratify SRSA grantees by Flex use but will include questions about districts’ use of REAP Flex on the survey.

The study’s Technical Working Group has stressed that rural districts vary greatly depending on the part of the country in which they are located. Therefore, to ensure adequate numbers of districts to make statements about REAP districts by region, the study will allocate equal numbers of sampled districts to each region within each program type: 84 RLIS districts and 167 SRSA districts in each region, as shown in Exhibit 10. Within each region and program type, the study will allocate the sampled districts across census divisions in proportion to the number of districts in the sampling universe. This sampling design ensures adequate precision to report survey results by both program type and region, and it should yield a sample of SRSA districts of which approximately half exercise REAP Flex, allowing the study to report survey results for REAP Flex users and nonusers.

The researchers will therefore be able to report, at the national level, on the responses of RLIS subgrantees, SRSA Flex users, and SRSA grantees that did not use Flex. The researchers will also be able to report the responses of REAP districts in each of the four regions, but the sample size is not adequate to report the responses of RLIS or SRSA districts separately within each region. Because the number of RLIS and SRSA districts varies greatly by census division and some strata will have few sampled districts, the study will not report survey results by census division. The study stratifies at the census division level to ensure that at least one district is sampled in each populated stratum for purposes of the follow-up interviews, as described below. Likewise, the study will report on the responses of Flex districts at the national level only, not broken down by region, because the predicted number of sampled Flex districts in some regions is too low to report results with precision.

Exhibit 10. Estimated Sample of Districts by Region and Grantee Type [5]

Region      Census Division       Sampled RLIS Districts   Sampled SRSA Districts   Estimated Sampled REAP-Flex Districts
Midwest     East North Central              56                       61                          13
            West North Central              28                      106                          57
            Total                           84                      167                          70
Northeast   Middle Atlantic                 50                       74                          21
            New England                     34                       93                          17
            Total                           84                      167                          38
South       East South Central              26                        4                           0
            South Atlantic                  23                        4                           3
            West South Central              34                      159                         114
            Total                           83                      167                         117
West        Mountain                        41                       91                          30
            Pacific                         43                       76                          57
            Total                           84                      167                          87
Total                                      336                      668                         312
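The within-region allocation described above can be illustrated with the following Python sketch, which splits a region's quota across its census divisions in proportion to the 2013–14 district counts. The largest-remainder rounding shown here is an assumption made for illustration; the study documents do not specify how fractional allocations are rounded.

def allocate_within_region(region_quota, division_counts):
    # Split a region's quota across its census divisions in proportion to
    # the number of districts in each division.
    total = sum(division_counts.values())
    raw = {d: region_quota * c / total for d, c in division_counts.items()}
    alloc = {d: int(share) for d, share in raw.items()}
    # Assign any seats lost to truncation to the largest fractional remainders.
    shortfall = region_quota - sum(alloc.values())
    for d in sorted(raw, key=lambda k: raw[k] - alloc[k], reverse=True)[:shortfall]:
        alloc[d] += 1
    return alloc

# Example: the Midwest SRSA quota of 167 split across its two census divisions,
# using the 2013-14 counts from Exhibit 9 (622 and 1,094 districts).
print(allocate_within_region(167, {"East North Central": 622, "West North Central": 1094}))
# {'East North Central': 61, 'West North Central': 106}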



Follow-Up Interviews

The research team will use a similar stratification scheme to select a subsample of districts for follow-up interviews. This sample will not be representative of REAP districts overall; instead, the words of district administrators will be used to illustrate findings from the survey. For this reason, the research team will begin with the sample of surveyed districts and select a subsample of 30 to be interviewed. The research team will select one RLIS district in each census division and two SRSA districts in each census division, for a total of 27 districts. The study will rely on REAP Flex use in 2013–14 to guide the selection of the two SRSA districts within each census division, with an effort to select one district that used Flex in 2013–14 and one that did not, although this will not be possible in all strata. For example, none of the SRSA districts in the East South Central census division used Flex in the 2013–14 school year, so in this stratum the study will select two SRSA districts, neither of which used Flex in 2013–14. The study will also select three extra districts for a total of 30 districts: one extra RLIS district in the West South Central census division, which has the largest number of RLIS-eligible districts, and two extra SRSA districts (one that exercised REAP Flex in 2013–14 and one that did not) in the West North Central census division, which has the largest number of SRSA grantee districts.
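A minimal sketch of this base selection (before the three extra districts are added) is shown below, again in Python with pandas. The DataFrame column names, including a hypothetical 'used_flex_2013_14' flag, are illustrative assumptions.

import pandas as pd

def pick_interview_districts(surveyed, seed=2015):
    # Select one RLIS district per census division and two SRSA districts per
    # division, pairing a Flex user with a non-user where the stratum allows.
    picks = []
    for division, grp in surveyed.groupby("census_division"):
        rlis = grp[grp["grantee_type"] == "RLIS"]
        if len(rlis) > 0:
            picks.append(rlis.sample(n=1, random_state=seed))
        srsa = grp[grp["grantee_type"] == "SRSA"]
        flex = srsa[srsa["used_flex_2013_14"]]
        no_flex = srsa[~srsa["used_flex_2013_14"]]
        if len(flex) > 0 and len(no_flex) > 0:
            picks.append(flex.sample(n=1, random_state=seed))
            picks.append(no_flex.sample(n=1, random_state=seed))
        elif len(srsa) >= 2:
            # e.g., East South Central, where no SRSA district used Flex in 2013-14.
            picks.append(srsa.sample(n=2, random_state=seed))
    return pd.concat(picks, ignore_index=True)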

Finally, the research team will also conduct in-depth interviews with the state-level REAP coordinator in each state that received REAP funds in 2013–14.

B.2. Procedures for the Collection of Information

Once the study has determined the sample of approximately 1,000 districts for the survey, the research team will ask the state REAP coordinator in each of the 48 participating states to identify the respondent and contact information for each sampled district in his/her state, via a template (Appendix F). The study expects that this person will commonly be the district superintendent or a federal grant coordinator.

The research team will email a link to a secure web-based version of the survey to these individuals in December 2014. The survey will remain open through April 2015. While the survey is open, the researchers will send weekly email reminders to the district contacts who have not yet responded. In May, the researchers will download the survey data and begin cleaning and analysis. Appendix A contains a full draft of the survey instrument.

For the follow-up district interviews, the research team will select 30 of these district REAP coordinators, as described in section B.1, and schedule an approximately 45-minute phone interview with each.[6] These phone interviews will be conducted from January 2015 through March 2015. During the same time period, the research team will also interview one state-level REAP coordinator for approximately 45 minutes in each of the 48 states receiving REAP funds in 2013–14.

Statistical Methodology and Estimation Procedures

The interviews with state and district REAP coordinators involve qualitative data collection. Statistical methodology is not applicable. The research team will quantify the qualitative data collected from all state REAP coordinators when this quantification can be done reliably.

The study will examine descriptive statistics for each survey question. For example, the researchers will produce tables with the frequency and percentage of districts responding “Yes” to each subtopic in this survey question: “Were any of the following people involved in deciding how to spend your [RLIS][SRSA] funds?” These data will answer the question of which personnel are involved in deciding how to target REAP funds. The researchers will weight responses to represent the total number of districts in each stratum and will report the standard error of each reported percentage in an appendix.
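A minimal sketch of this weighting and standard error computation is shown below in Python with pandas. It assumes each responding district carries a design weight equal to the number of districts in its stratum divided by the number of respondents in that stratum, uses a standard stratified (Taylor linearization) variance estimator, and ignores the finite population correction; the column names are illustrative.

import pandas as pd

def weighted_percent_and_se(responses, yes_col, weight_col="weight", stratum_col="stratum"):
    # Weighted percentage of districts answering "Yes" (yes_col coded 1/0),
    # with a stratified Taylor-linearization standard error.
    w = responses[weight_col]
    y = responses[yes_col].astype(float)
    p_hat = (w * y).sum() / w.sum()
    var = 0.0
    for _, grp in responses.groupby(stratum_col):
        n_h = len(grp)
        if n_h > 1:
            z = grp[weight_col] * (grp[yes_col].astype(float) - p_hat)
            var += n_h / (n_h - 1) * ((z - z.mean()) ** 2).sum()
    se = var ** 0.5 / w.sum()
    return 100 * p_hat, 100 * se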

Where appropriate, the study will also examine survey responses by district characteristics (see Exhibit 11). The source of these variables is either the eligibility spreadsheets maintained by the federal REAP Program Office (“Eligibility Spreadsheets”) or the Common Core of Data (CCD).

Exhibit 11. Source of District Characteristics

Variable                                        Source
Grantee type                                    Eligibility Spreadsheets
Region                                          CCD
Size                                            CCD
Percent of students and families in poverty     Eligibility Spreadsheets
Award amount*                                   Eligibility Spreadsheets

* Available only for SRSA grantees


The PPSS guidelines for categorizing district size classify districts with 2,500 or fewer students as “small.” However, this category applies to more than 90 percent of the districts that receive REAP funds. Therefore, the research team will further classify “small” districts into two groups based on the median average daily attendance (ADA) for districts with 2,500 or fewer students. Based on the 2013–14 SRSA grantee and RLIS-eligible districts, the three categories for district size would be: medium or large (ADA of more than 2,500 students), small (346 to 2,500 students), and smallest (345 or fewer students). Award amount is not available from the REAP Program Office for RLIS districts as it is for SRSA grantees. States may allocate RLIS funds to districts by competition or by formula, and states that allocate by formula may use ADA or an alternative formula. Through the state interviews, the study will confirm or refute the impression of REAP Program Office staff that all states allocate RLIS funds by formula using ADA. If confirmed, ADA may serve as a proxy for award amount for RLIS districts.
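A minimal sketch of this size classification follows, using the 2013–14 thresholds named above; the 345-student cut point is the 2013–14 median and would be recomputed when 2014–15 data arrive.

def size_category(ada):
    # Classify a district by average daily attendance (ADA) using the 2013-14
    # thresholds described in the text.
    if ada > 2500:
        return "medium or large"
    if ada > 345:
        return "small"
    return "smallest"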

Degree of Accuracy Needed

With an expected 85 percent response rate, the sample size described above will allow the study to provide estimates with a margin of error of no greater than plus or minus 6.1 percentage points for RLIS districts and for each group of SRSA Flex users and nonusers, 4.2 percentage points for SRSA districts overall, 3.4 percentage points for the REAP program as a whole, and 6.9 percentage points for the REAP program within each census region. These margin of error values were obtained by simulating the survey results 20 times and averaging the 20 margin of error estimates. The standard deviation of the margin of error estimates was less than 0.03 percentage points.
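For reference, the sketch below shows one standard way to approximate such a margin of error analytically for a stratified sample, assuming a proportion near 50 percent, the expected 85 percent response rate, and a finite population correction within each stratum. This is not the simulation procedure the study used, and the two strata in the usage example are hypothetical.

import math

def margin_of_error(strata, p=0.5, z=1.96, response_rate=0.85):
    # strata -- list of (N_h, n_h) pairs: population count and sampled count
    #           for each stratum.
    # Returns an approximate 95 percent margin of error, in percentage points,
    # for a proportion near p, after applying the expected response rate and a
    # finite population correction within each stratum.
    N = sum(N_h for N_h, _ in strata)
    var = 0.0
    for N_h, n_h in strata:
        r_h = max(1.0, n_h * response_rate)          # expected respondents
        fpc = (N_h - r_h) / max(N_h - 1, 1)          # finite population correction
        var += (N_h / N) ** 2 * fpc * p * (1 - p) / r_h
    return 100 * z * math.sqrt(var)

# Illustrative use with two hypothetical strata
# (populations of 300 and 700 districts, samples of 84 and 167):
print(round(margin_of_error([(300, 84), (700, 167)]), 1))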

Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems anticipated.

Use of Periodic Data Collection

The survey will be administered and the data collected only once. State and district coordinators will be interviewed only once.

B.3. Methods for Maximizing Response Rate and Dealing with Nonresponse

Response Rate

SRI has extensive experience in administering surveys and carrying out interviews in schools and districts for research purposes. SRI will provide each selected state and district with a letter describing the study and its importance for the field (see Appendix E). These letters will also include the purpose of the survey, information on why the district was selected, and how to learn more about the study. The research team will use additional key access strategies, such as designating a researcher as the primary contact for each district, sending weekly email reminders, making weekly reminder phone calls beginning in the fourth week of data collection, sending a mailed reminder if necessary, and providing opportunities for district contacts to ask questions about the study. The research team anticipates an 85 percent response rate — a realistic goal given two past surveys of REAP districts — and will persist with these efforts until this response rate is achieved or the data collection period ends. A survey of district administrators in districts receiving RLIS funds in 2007–08, conducted by Berkeley Policy Associates, achieved a response rate of 84 percent (U.S. Department of Education, 2010). A 2005–06 survey of districts eligible for SRSA, conducted by the Urban Institute, achieved a response rate of 94 percent (U.S. Department of Education, 2007).

Generalizability of the Sample

The research design for the survey relies on a random sample stratified by grantee type and census division and is intended to capture descriptive information. As such, the findings from the survey will be generalizable to all districts that receive REAP funds in school year 2014–15. The findings will also be generalizable to each geographic region and grantee type, when weighted to account for the stratification design.

State interviews will be conducted with REAP coordinators in all states that received REAP funds in school year 2014–15. The findings will describe the practices and experiences of this population of state coordinators. The district interviews will not be generalizable but will be used to illustrate survey findings.

B.4. Test of Procedures and Methods

The research team has conducted internal pretesting of protocol items to ensure clarity. Additionally, the research team confirmed that all protocols are aligned with the research questions, ensuring the protocols will capture all necessary information. In April 2014, the research team piloted the survey with seven district administrators selected at random across the four census regions to approximate typical respondents.

Many of the protocol questions have been adapted from relevant questions used in other REAP and SRI studies. For example, questions related to technical assistance have their roots in questions developed and used as part of a past study of RLIS districts (U.S. Department of Education, 2010). Questions related to the use of REAP Flex started with items used in a past study of districts eligible to exercise REAP Flex (U.S. Department of Education, 2007).

After the pilot participants tested the instruments (i.e., took the survey), the researchers conducted phone conversations with each of them to discuss clarity of wording and flow, how they interpreted the questions, and any other issues that came up. The researchers revised the instruments based on this feedback.

B.5. Consultations on Statistical Aspects of the Design

The research team consulted with Dr. Harold Javitz, Distinguished Scientist at SRI International, on sampling for the survey. He can be reached at 650-859-5274.

Agency

Andrew Abrams of the U.S. Department of Education is the Contracting Officer’s Representative for the study. He can be reached at 202-401-1232.

Contractors

SRI International will be responsible for data collection and analysis, under the direction of
Kyra Caspary, who can be reached at 503-477-4228.

References

U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service. (2007). Evaluation of Flexibility under No Child Left Behind: Volume III—The Rural Education Achievement Program (REAP Flex). Washington, D.C.

U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service. (2010). Evaluation of the Implementation of the Rural and Low-Income School (RLIS) Program: Final Report. Washington, D.C.





[1] All but two states (Hawaii and Vermont) had districts that received REAP funds in the 2013–14 school year. The research team anticipates that the number of states with REAP districts in 2014–15 will be close to 48.

[2] The research team will update this exhibit with 2014–15 data when they are available from the REAP Office.

[3] The research team will update this exhibit when 2014–15 data become available. The researchers expect the number of districts in each stratum to be similar in school year 2014–15.

[4] In analysis and reporting, the study will weight responses to the total number of districts in each stratum.

[5] The research team will update the numbers in this table when 2014–15 data are available from the REAP Office.

[6] If there are empty strata in the 2014–15 school year, the research team will select additional SRSA districts in the next largest stratum.
