OMB: 1875-0250

Evaluation of the Implementation of the
Rural and Low-Income Schools (RLIS) Program





Request for OMB Approval

Supporting Statement for Paperwork Reduction Act Submission

Part B





Version 1


Submission: June 2008









Submitted to:

U.S. Department of Education

Office of Planning, Evaluation and Policy Development (OPEPD)

Policy and Program Studies Service (PPSS)

400 Maryland Avenue, SW

Washington, DC 20202


Submitted by:

Berkeley Policy Associates

440 Grand Avenue, Suite 500

Oakland, CA 94610


Contracting Officer’s Representative:

Erica Lee

U.S. Department of Education

Office of Planning, Evaluation and Policy Development (OPEPD)

Policy and Program Studies Service (PPSS)

(202) 260-1463


Project Director:

Kay Magill, Ph.D.

Berkeley Policy Associates

(510) 465-7884 ext. 206




TABLE OF CONTENTS

Supporting Statement B: Data Collection Procedures and Statistical Methods




B1. Respondent Universe and Sampling Methods

B2. Statistical Power of the Survey Sample

B3. Maximizing Response Rates

B4. Pretesting

B5. Contact Information



Appendices


Appendix A. RLIS District Coordinator Interview Protocol


Appendix B. RLIS State Coordinator Survey


Appendix C. RLIS District Coordinator Survey


Appendix D. RLIS District Coordinator Interview Introduction letter/email


Appendix E. RLIS State and District Survey Mail/Email Letter


Appendix F. Federal Register Notice


Appendix G. Confidentiality Forms




Evaluation of the Implementation of the Rural and Low-Income Schools (RLIS) Program


SUPPORTING STATEMENT PART B:

Collections of Information Employing Statistical Methods



B1. Respondent Universe and Sampling Methods


Respondent Universe/Population


The study has a multi-component evaluation design that includes interviews with staff from a sample of states and districts, an online survey of staff from all states receiving RLIS funding, and an online survey of a random sample of staff from districts receiving RLIS funding.


Specifically, the RLIS state administrator population includes the 39 RLIS state coordinators in the 39 states that received RLIS funding during the 2007-2008 school year. The RLIS district coordinator population includes the 1,247 district administrators in the 1,247 districts that received RLIS funding during the 2007-2008 school year.



Sample Design and Selection


Sampling Strategy


District Interview Sampling Strategy


The rationale for the sampling strategy for the interviews with RLIS district coordinators was developed from the strategy used for the pre-test interviews with the RLIS state coordinators. For the district-level interviews, we will conduct interviews and collect documents from a sample of districts in the nine states where we conducted the state-level interviews (the states that received the highest levels of RLIS funding during the 2007-2008 school year), to provide more depth to our focused analyses. To align district coordinator responses with those of the state coordinators and with the available achievement data, which run one year behind, we will ask the district coordinators about their goals, priorities, and uses of funds for the 2007-08 school year. In addition, we will ask about any notable changes in the implementation of RLIS in their districts that occurred in the 2008-09 school year. We will randomly sample five districts from each of the nine states where we conducted the state-level interviews, for a district-level interview sample of 45 districts, as sketched below.
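To illustrate, the following is a minimal sketch of this sampling step in Python. The state names, district identifiers, and random seed are placeholders for illustration only, not actual study data:

    import random

    random.seed(2008)  # fixed seed so the draw can be reproduced

    # Hypothetical frame: each of the nine interview states mapped to
    # the identifiers of its RLIS-funded districts.
    districts_by_state = {
        "State A": ["A-01", "A-02", "A-03", "A-04", "A-05", "A-06"],
        "State B": ["B-01", "B-02", "B-03", "B-04", "B-05", "B-06"],
        # ... seven more states ...
    }

    # Draw five districts at random, without replacement, from each state.
    interview_sample = {
        state: random.sample(districts, 5)
        for state, districts in districts_by_state.items()
    }
    # With all nine states present, this yields 9 x 5 = 45 districts.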




State and District Online Survey Sampling Strategy


To obtain a representative sample of survey respondents from each of the two constituent populations (state staff and district staff), we plan to use FY 2007 information on RLIS-funded states and districts that includes contact names, titles, addresses, phone numbers, and email addresses. We intend to include all 39 states that received RLIS funding in FY 2007. For the district staff survey, we will obtain additional information from the interviews and the extant data sources described previously that will allow us to create sampling strata. We intend to sample 689 of the 1,247 districts that received RLIS funding in FY 2007. We selected 689 districts because it is the smallest sample that provides the statistical power, or confidence, we need to report findings from this survey. (See Exhibit 1.)


Exhibit 1. Estimates of the respondent universe, sample, and expected respondents by respondent type for the online surveys

                               RLIS-funded Districts    RLIS-funded States
Respondent universe            1,247                    39
Sample                         689                      39
Expected respondents           586                      35
The district survey will rely on stratified random sampling from this list of districts to represent the diversity of the population of districts receiving RLIS funding. We plan to stratify the sampling frame by key variables (to be determined after analysis of the in-depth interviews with the states). The purpose of stratification is to minimize random sampling variation in the survey sample and to increase the face validity of survey results. Statistically, stratification is carried out by dividing the survey sampling frame into strata and then drawing members from each stratum with a probability equal to the ratio of the overall survey sample size to the size of the sampling frame (proportional allocation, illustrated in the sketch below). Stratification modestly increases the statistical precision of survey estimates, especially in small samples. However, we cannot take these gains in statistical precision into account at this time, because the data needed to construct survey strata are not yet available to us. The results of the statistical power calculations presented in Exhibit 2 below are therefore somewhat conservative.
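The following is a minimal sketch of stratified sampling with proportional allocation in Python. Because the stratification variables have not yet been chosen, the stratum labels below are placeholders:

    import random

    def stratified_sample(frame, stratum_of, total_n, seed=2008):
        # Proportional allocation: each stratum contributes members in
        # proportion to its share of the sampling frame.
        random.seed(seed)
        rate = total_n / len(frame)
        strata = {}
        for unit in frame:
            strata.setdefault(stratum_of(unit), []).append(unit)
        sample = []
        for members in strata.values():
            k = round(len(members) * rate)  # rounding may shift the total by a few units
            sample.extend(random.sample(members, k))
        return sample

    # Hypothetical usage: 1,247 districts tagged with placeholder
    # stratum labels; draw a sample of about 689.
    frame = [{"id": i, "stratum": f"S{i % 4}"} for i in range(1247)]
    sample = stratified_sample(frame, lambda d: d["stratum"], 689)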



B2. Statistical Power of the Survey Sample


The proposed sample composition is shown in Exhibit 2. A total of 689 potential district respondents will be sampled. Assuming a response rate of 85 percent for the district sample, the expected respondent sample will include 586 individuals. For the state survey, assuming a response rate of 90 percent, the expected respondent sample will include 35 individuals. (We expect a higher response rate from the state coordinators than from the district coordinators because they are more likely to respond to requests from the National Office to participate, and because there are fewer of them to follow up with if they do not respond immediately.) With these sample sizes, we expect 95 percent confidence intervals of 50 percent plus or minus 2.9 percent for the full sample of districts (assuming a binomial outcome with a mean of .5), and 50 percent plus or minus 5.2 percent for the full sample of states.

We calculated the confidence interval by multiplying the z-score for a 95 percent confidence interval by the standard error of the mean (the mean being 50 percent, or .5). Since we are sampling a large proportion of the population, we applied the finite population correction factor to our confidence interval. The finite population correction is typically used when a survey samples a large fraction of the population. It is applied by multiplying the variance term in the standard error calculation by (1-f), where f = n/N; equivalently, the standard error is multiplied by the square root of (1-f). Applying the finite population correction narrows the confidence interval because it accounts for the fact that most of the population is being surveyed. When less than 5 to 10 percent of the population is being sampled, the correction has a negligible effect on the confidence interval.

Confidence intervals were calculated using the following equation:

1.96 * SQRT((.25/n) * (1-f))

where:

SQRT = square root

n = expected total respondent sample

(1-f) = the finite population correction factor

f = n/N, the expected total respondent sample divided by the respondent universe
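As a check on the figures in Exhibit 2, the following is a minimal sketch of this calculation in Python. It assumes the expected respondent counts are carried unrounded (85 percent of 689 districts and 90 percent of 39 states) until the half-widths are rounded at the end:

    import math

    def ci_half_width(n, N, z=1.96, p=0.5):
        # z times the standard error of a proportion p, with the
        # finite population correction (1 - f), where f = n/N.
        f = n / N
        return z * math.sqrt((p * (1 - p) / n) * (1 - f))

    # Districts: 85% of the 689 sampled, out of 1,247 in the universe.
    print(round(100 * ci_half_width(0.85 * 689, 1247), 1))  # 2.9 (percent)
    # States: 90% of all 39, out of 39 in the universe.
    print(round(100 * ci_half_width(0.90 * 39, 39), 1))     # 5.2 (percent)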



Exhibit 2. Sample composition and confidence interval calculations

                                                       RLIS-funded    RLIS-funded
                                                       Districts      States
Population                                             1,247          39
Sample                                                 689            39
Expected total respondent sample
  (85% response rate for districts, 90% for states)    586            35
Confidence interval for a 50 percent Yes/No outcome    +/-2.9%        +/-5.2%



B3. Maximizing Response Rates


An important challenge in conducting these surveys will be to obtain sufficiently high response rates (85 percent for the district coordinators and 90 percent for the state coordinators) so that the findings will be valid and reliable. To address this challenge, we will administer the surveys as follows:


  1. We will send sampled respondents a letter/postcard announcing the survey and explaining its importance for the evaluation and for rural education in the United States (Appendix E). This letter will also include information on how to access and complete the online survey. Letters returned because of incorrect addresses will be corrected and re-sent when possible.


  2. We will send all sampled respondents an email with a web link and ask them to complete the survey online. Returned emails will be corrected and re-sent wherever possible. If a respondent has not completed the survey within one week, a reminder email will be sent, and will be re-sent once a week until the respondent completes the survey or the field period ends, whichever occurs first.


  3. If respondents do not respond within two weeks of receiving the first email, we will contact them by telephone and send a second reminder email. (If the phone is not answered after a number of attempts, which together count as a single contact, or no valid phone number is available, we will follow up both by weekly email and by regular mail.) If sample members do not have time to complete the survey during this follow-up call,[1] they will be sent another email with a link to the survey; if they fail to respond to that email, they will receive a second call. This effort is expected to produce a high response rate.


Using this approach, we can survey a sufficiently large and representative sample in a cost-effective manner. We expect an overall response rate of about 85 percent for district coordinators and 90 percent for state coordinators.


The online and telephone formats of the survey allow data quality control measures to be built into the data collection process. The survey will be programmed with skip patterns to reduce both the burden on respondents and the amount of data cleaning required later (a small illustration of skip logic follows). All telephone interviewers will be trained in conducting the interviews, and some calls will be monitored for quality assurance.
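The following is a minimal sketch, in Python, of how a skip pattern routes respondents past questions that do not apply to them. The item identifiers are hypothetical, not items from the actual instruments:

    # Hypothetical two-item skip rule: respondents who answer "no" to a
    # gateway question are routed past its follow-up.
    SKIP_RULES = {
        ("Q10_gateway", "no"): "Q12",  # skip follow-up Q11 entirely
    }

    ITEM_ORDER = ["Q10_gateway", "Q11_followup", "Q12"]

    def next_item(current, answer):
        # Return the next item to display, applying any skip rule.
        target = SKIP_RULES.get((current, answer))
        if target is not None:
            return target
        i = ITEM_ORDER.index(current)
        return ITEM_ORDER[i + 1] if i + 1 < len(ITEM_ORDER) else None

    assert next_item("Q10_gateway", "no") == "Q12"            # skip applied
    assert next_item("Q10_gateway", "yes") == "Q11_followup"  # no skip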



B4. Pretesting


We used the state interviews as a pilot test for the district interviews. In this pilot, conducted with nine people (state staff) in roles similar to those who will be interviewed, the mean time to respond was one hour. Because the district interview protocol contains questions similar to those in the state interview protocol, the state interviews helped ensure the clarity of the district protocol.


We have conducted limited pretesting of the items designed specifically for the online surveys of state and district RLIS administrators to ensure clarity, and we have administered the full surveys to nine respondents whose roles are similar to those we will sample, to confirm that respondent burden does not exceed our estimates. This pretest confirmed that our burden estimate of 20 minutes to read the instructions and complete the survey in full is conservative.







B5. Contact Information


BPA Contact:


Kay Magill (Project Director)

Berkeley Policy Associates

440 Grand Avenue, Suite 500

Oakland, CA 94610

510-465-7884 ext. 206

[email protected]


U.S. Department of Education Contact:


Erica Lee (COR)

U.S. Department of Education

Office of Planning, Evaluation and Policy Development (OPEPD)

Policy and Program Studies Service (PPSS), Room 6W205

400 Maryland Avenue, SW

Washington, DC 20202

202-260-1463

[email protected]


[1] We expect most respondents to complete the survey during the follow-up phone call.
