Att_IIA Use of Funds Supporting Statement 12-20-10 Part B

SURVEY ON THE USE OF FUNDS UNDER TITLE II, PART A ("IMPROVING TEACHER QUALITY STATE GRANTS – SUBGRANTS TO LEAS")

OMB: 1810-0618


REQUEST FOR CLEARANCE OF PROPOSED STUDY

SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

Survey on Use of Funds Under Title II, Part A



B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


  1. Respondent Universe


We will draw a sample of 800 LEAs for this study from the 2011-12 NCES Common Core of Data (CCD) Public Elementary and Secondary Agency Universe File. This file contains over 16,500 agencies, of which almost 15,000 (roughly 90 percent) are classified as regular school districts.


Multiple factors, including respondent burden, data quality, and cost, were taken into account in designing this study. A nationally representative sample of 800 LEAs minimizes respondent burden while still allowing the Department to perform the analyses needed for the GPRA indicators.




We expect a response rate among LEAs of at least 80 percent; in the last data collection (2009-10), a response rate of 81 percent was achieved.


  2. Sample Design


A stratified sample of 800 LEAs will be selected for the survey. To select the sample, we propose to stratify the LEAs in the frame by size and poverty status. The most recent district-level poverty estimates will be used to stratify the LEAs by level of school district poverty.


There are 8 LEA size categories and 5 poverty status categories (5 rather than 4 because poverty status was missing for a few LEAs, such as charters), yielding 40 cells. Proportional allocation of the 800 LEAs would have yielded very few large LEAs and (to a lesser extent) high-poverty LEAs. Because the Department is interested in both of these groups, 100 LEAs will be sampled from each size category, which in effect oversamples the large and high-poverty LEAs. Because these LEAs are also more difficult to recruit, oversampling helps ensure adequate numbers of completed surveys; sampling weights will be applied so that these LEAs are not overrepresented in the final analyses.
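The contrast between proportional and equal allocation can be illustrated with a minimal sketch. The per-stratum frame counts below are hypothetical placeholders chosen only to sum to roughly the frame size cited above; they are not actual CCD counts.

```python
# Minimal sketch of equal vs. proportional allocation across the 8 LEA
# size strata. Frame counts per stratum are hypothetical placeholders.
frame_counts = [7000, 3500, 2000, 1500, 1200, 700, 400, 200]
total_frame = sum(frame_counts)  # 16,500 agencies in this illustration
sample_size = 800

# Proportional allocation: the stratum with the fewest districts
# (the largest LEAs) would receive very few sampled LEAs.
proportional = [round(sample_size * n / total_frame) for n in frame_counts]

# Equal allocation: 100 LEAs per size stratum, oversampling large LEAs.
equal = [sample_size // len(frame_counts)] * len(frame_counts)

# Base weights (N_h / n_h) undo the oversampling in the final analyses.
base_weights = [n / a for n, a in zip(frame_counts, equal)]

print(proportional)  # last stratum (largest LEAs) gets only 10
print(equal)
```

Under equal allocation, each size stratum contributes 100 LEAs, and the base weights compensate so that national estimates are not distorted by the oversample.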


The sample has been stratified by size and poverty status because the Department wants to disaggregate the survey results by these categories. Poverty quartiles, in particular, were used because they mirror the CSPR data collection and for cost reasons. Urban/rural stratification and a state-representative sample were originally explored but were not cost effective within the study's budget. In any case, urban/rural status is likely correlated with size, since large LEAs tend to be urban and small LEAs tend to be rural.


In addition to estimates of the percentage of LEAs having specified characteristics, some of the key statistics produced from the survey are aggregate measures, such as total dollar amounts allocated to allowable activities under Title II, Part A and total numbers of teachers participating in various professional development activities. In the 2006-07 survey, the relative standard errors (RSEs) of the aggregate measures ranged roughly from 5 to 20 percent, depending on the statistic, while those of the percentage estimates generally ranged from 3 to 9 percent. Although explicit precision requirements have not been specified for this study, the levels of precision achieved in past surveys serve as a reasonable goal, and we expect the current design to yield similar precision.
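For reference, the relative standard error cited above is simply the standard error expressed as a percentage of the estimate. A minimal sketch, with hypothetical estimate and standard-error values chosen only for illustration:

```python
# Relative standard error (RSE) = standard error / estimate, as a percentage.
def relative_standard_error(estimate, standard_error):
    return 100.0 * standard_error / estimate

# Hypothetical illustration: a total-dollars estimate of $2.4 billion
# with a standard error of $240 million.
rse = relative_standard_error(2.4e9, 2.4e8)
print(f"RSE = {rse:.1f}%")  # 10.0%, within the 5-20 percent range cited above
```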


There are no unusual problems requiring specialized sampling procedures. This data collection will be annual.


  3. Methods for Maximizing the Response Rate


A letter (Appendix C) will be sent to each LEA along with the survey, reiterating the reasons for its selection and requesting cooperation. LEAs may respond by mail or by fax, whichever they find easier. If a completed survey is not received by the expected date, a reminder card will be sent to the LEA. Following the reminder card, contractor personnel will re-send the data collection instrument to non-respondents and will continue to pursue completed surveys for 3 months after the initial mail-out. The contractor for this study has a long history of achieving high response rates through repeated mail and telephone follow-ups. This study will therefore yield reliable data that can be generalized to the universe.


The contractor will examine non-respondents to assess potential nonresponse bias. Nonresponse bias analyses will be modeled on the procedures used for NAEP. It is Westat's understanding that the Department requires these analyses when the response rate falls below 85 percent; for NAEP, this threshold was interpreted to apply to key subgroups, not just the overall rate. Westat will therefore examine the weighted response rates for the 8 LEA size categories and the 4 poverty quartiles. If none falls below 85 percent, Westat does not anticipate the need for nonresponse bias analyses. If some do, Westat will model the analyses on NAEP, examining the weighted distributions of respondents before and after nonresponse adjustment for key frame characteristics that are related to the survey responses.
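The subgroup check described above can be sketched as follows. This is a minimal illustration, not Westat's actual procedure: the LEA records (stratum label, base weight, responded flag) are hypothetical, and the adjustment shown is a simple weighting-class inflation by the inverse of the stratum response rate.

```python
# Sketch of a weighted subgroup response-rate check with a simple
# weighting-class nonresponse adjustment. All records are hypothetical.
from collections import defaultdict

# (stratum, base weight, responded?)
leas = [
    ("large/high-poverty", 2.0, True),
    ("large/high-poverty", 2.0, False),
    ("small/low-poverty", 70.0, True),
    ("small/low-poverty", 70.0, True),
    ("small/low-poverty", 70.0, False),
]

# Weighted response rate per stratum: respondent weight sum over
# the weight sum of all eligible LEAs in the stratum.
num = defaultdict(float)
den = defaultdict(float)
for stratum, weight, responded in leas:
    den[stratum] += weight
    if responded:
        num[stratum] += weight
rates = {s: num[s] / den[s] for s in den}

# Strata under the 85 percent threshold would trigger a bias analysis.
flagged = [s for s, r in rates.items() if r < 0.85]

# Weighting-class adjustment: inflate each respondent's weight by the
# inverse of its stratum response rate so respondents represent the
# non-respondents in the same stratum.
adjusted = [(s, w / rates[s]) for s, w, responded in leas if responded]
```

Comparing weighted distributions computed from the base and adjusted weights for key frame characteristics is then the before/after check described above.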


  4. Tests of Procedures and Methods


For the original Title II, Part A study, the Department consulted with several SEAs and LEAs to determine the feasibility of the data collection. Additionally, we conducted a pilot test of the revised 2004-05 data collection instrument with five LEAs, selected to vary in size and location. The purpose of the test was to (1) verify that LEAs would be able to provide information for all of the data items on the instrument and (2) ensure that the burden estimates used in this clearance package were accurate. As a result, we made several revisions to the wording of the instrument and determined that we had somewhat underestimated the respondent burden. This OMB clearance request has been modified to reflect the changes made to the LEA survey for the 2011-12 data collection.


  5. Consultations on Statistical Aspects of the Design


All sample design development will be provided by Hyunshik Lee, Senior Statistician in Westat's Statistical Support Group (301-610-5112). Westat will collect and analyze the information for the Department.


