NASS Comments


National School Lunch Program (NSLP) Direct Certification Improvement Study


OMB: 0584-0529


NSLP Direct Certification Study


  1. National Survey of Direct Certification Practices.


  A. State Level.


This survey will include all 50 states, the District of Columbia, and five territories (a census of 56 state-level entities). Knowledgeable staff members in each state or territory will provide the state-level data. This will be a web-based data collection effort. A 90% response rate is anticipated.


Comment: No nonresponse adjustment is indicated, and the adjustment methodology is not specified.
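One common remedy for the gap noted above is a weighting-class nonresponse adjustment, in which respondents' base weights are inflated by the inverse of the weighted response rate within their class. The sketch below illustrates the idea; the class labels, weights, and data structure are hypothetical assumptions, not part of the submission under review.

```python
from collections import defaultdict

def nonresponse_adjusted_weights(units):
    """Inflate base weights by the inverse weighted response rate in each class."""
    totals = defaultdict(float)  # sum of base weights per weighting class
    resp = defaultdict(float)    # sum of respondents' base weights per class
    for u in units:
        totals[u["cls"]] += u["w"]
        if u["responded"]:
            resp[u["cls"]] += u["w"]
    # Keep only respondents; scale each weight by (class total / class respondents)
    return [
        {**u, "w_adj": u["w"] * totals[u["cls"]] / resp[u["cls"]]}
        for u in units if u["responded"]
    ]

# Hypothetical example: class A has a 50% response rate, class B has 100%.
units = [
    {"cls": "A", "w": 1.0, "responded": True},
    {"cls": "A", "w": 1.0, "responded": False},
    {"cls": "B", "w": 1.0, "responded": True},
]
adj = nonresponse_adjusted_weights(units)
```

The adjusted weights preserve each class's weighted total, so respondents in the lower-responding class A carry double weight.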


  B. District Level.


There are 19 states that currently use district-level matching methodology for certification. A separate survey will be used to collect information from each district in the 19 district-matching states.


A sample of 2,500 districts in the district-level matching states will receive a detailed, long-form version of the questionnaire. The remaining districts (approximately 5,000) will receive a shortened version. All districts in these states will therefore be enumerated with either the short-form or long-form version. Sampling is done to reduce respondent burden.


A stratified cluster sample design will be used to assign districts to strata. Strata will be defined by state, enrollment size, and by public/private status. Sampling allocation will give a roughly proportional allocation to each state. Strata containing districts in key subgroups may be oversampled to obtain a reasonable number of reports. Response rates of about 80% are anticipated. It appears that nonresponse adjustments to the weights will be used, but no details were apparent to the reviewer.
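The allocation described above can be sketched as a roughly proportional split of the 2,500 long-form sample across strata, with key-subgroup strata boosted by an oversampling factor. The stratum names, population counts, and boost factor below are illustrative assumptions; the submission does not give the actual stratum definitions or counts.

```python
def allocate(strata, n_total):
    """Proportional allocation with optional per-stratum oversampling.

    strata: {name: {"N": population count, "boost": oversampling factor}}
    """
    weighted = {s: v["N"] * v.get("boost", 1.0) for s, v in strata.items()}
    total = sum(weighted.values())
    alloc = {s: round(n_total * w / total) for s, w in weighted.items()}
    # A stratum's sample cannot exceed its population.
    return {s: min(n, strata[s]["N"]) for s, n in alloc.items()}

# Hypothetical strata for one state, defined by size and public/private status.
strata = {
    "stateA_large_public": {"N": 400},
    "stateA_small_public": {"N": 900},
    "stateA_private":      {"N": 200, "boost": 2.0},  # key subgroup oversampled
}
alloc = allocate(strata, 500)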


Reviewer Comments:


The reviewer assumes that the most important research items will be collected on both questionnaire versions. One might consider using some type of imputation procedure to "fill in" the missing long-form data for short-form respondents, particularly if there are strong relationships between the items collected on the short form and the long-form items missing from it. Given the substantial amount of data that would need to be filled in, however, this might not be feasible.
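The imputation idea above can be sketched with a simple regression imputation: fit a model on long-form respondents (who report both items) and predict the long-form-only item for short-form respondents. The variable names and values are hypothetical; the reviewer proposes no specific procedure or items.

```python
def fit_line(xs, ys):
    """Ordinary least squares fit for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept, slope

# Long-form respondents supply both items; fit the relationship on them.
long_x = [10, 20, 30, 40]  # hypothetical item asked on both forms
long_y = [12, 22, 33, 41]  # hypothetical item asked only on the long form
a, b = fit_line(long_x, long_y)

# Impute the long-form-only item for short-form respondents.
short_x = [15, 25]
imputed = [a + b * x for x in short_x]
```

This is a deterministic sketch; a production procedure would typically add a stochastic residual or use hot-deck donors so imputed values do not understate variance.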


  2. Case Study Survey.


  A. In-depth Interview.


Seven states will be selected as case-study states. Data will be collected at the state level and for 2 or 3 selected districts in each of the 7 states. States selected for this component will have direct certification infrastructure in place that best addresses the key research questions. Convenience sampling will be employed in state selection. Data will be collected through in-depth, semi-structured, personal interviews. No nonresponse is anticipated.


  B. Unmatched SNAP Participant Record Survey.


A sample of 4 districts will be selected in each of the 7 in-depth study states for participation in a study of record-matching algorithms. Districts selected for this study will supply their entire lists of records requiring matching, along with the algorithms used and an indicator of matching success. The characteristics of the various record-matching algorithms employed by the districts will then be investigated. A stratified cluster sample design will be employed. Strata will be defined by whether the district resides in a district-level matching state and by the number of NSLP applicants in the district. The number of records actually examined under this sampling process is random, but the expected number is given as 2,128. Sample allocation is specified: strata with more applications, lower data collection costs, a smaller average number of applications per district, and larger DEFFs will, on average, receive larger samples (although the number of districts sampled in each state is fixed at 4).
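The stated allocation behavior (more applications, cheaper collection, and larger DEFFs all pull sample toward a stratum) is consistent with a cost- and design-effect-adjusted optimal allocation of the Neyman type. A minimal sketch, assuming a rule of the form n_h ∝ A_h·sqrt(deff_h / c_h); the inputs and the rule itself are illustrative assumptions, not the study's actual allocation.

```python
import math

def allocate_records(strata, n_records):
    """Allocate an expected record count: n_h proportional to
    A_h * sqrt(deff_h / c_h), where A_h is applications in stratum h."""
    score = {h: v["apps"] * math.sqrt(v["deff"] / v["cost"])
             for h, v in strata.items()}
    total = sum(score.values())
    return {h: round(n_records * s / total) for h, s in score.items()}

# Hypothetical strata: matching-state status crossed with applicant volume.
strata = {
    "match_state_large": {"apps": 1200, "deff": 2.0, "cost": 1.0},
    "match_state_small": {"apps": 400,  "deff": 1.5, "cost": 1.0},
    "other_state":       {"apps": 800,  "deff": 1.2, "cost": 2.0},
}
alloc = allocate_records(strata, 2128)  # 2,128 is the expected total from the plan
```

Note how the higher cost halves the pull of "other_state" despite its application volume, matching the direction of the effects described in the plan.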


A 100% response rate is anticipated for the selected districts (all are expected to supply the required files). Nonresponse adjustments should not be required, and weighting might not be applicable for this type of survey.


Reviewer Comments:


It seems that the requirement of 4 districts per state would drive the allocation. Significant differences in the quality of the matching algorithms might exist across states and districts, so the number of districts selected would seem just as important as the number of records obtained for review. Is it possible to screen states/districts on their basic matching approach prior to sample selection and use this information in the stratification phase? Or perhaps to increase the sample size? It is not entirely clear to this reviewer what types of estimates are of interest.
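The quality differences the reviewer raises can be illustrated by contrasting two matching rules of differing strictness. The normalization steps below are illustrative assumptions; the districts' actual algorithms are not described in the submission.

```python
def normalize(name):
    """Strip punctuation, casing, and extra whitespace before comparing."""
    cleaned = "".join(c.lower() for c in name if c.isalnum() or c.isspace())
    return " ".join(cleaned.split())

def exact_match(snap, nslp):
    """Strict rule: raw strings must be identical."""
    return snap == nslp

def normalized_match(snap, nslp):
    """Lenient rule: compare after normalization."""
    return normalize(snap) == normalize(nslp)

# Hypothetical record pair that is the same child under different formatting.
snap_rec = "SMITH,  JOHN A."
nslp_rec = "Smith John A"
strict = exact_match(snap_rec, nslp_rec)        # misses the match
lenient = normalized_match(snap_rec, nslp_rec)  # finds it
```

A district relying on the strict rule would report this pair as unmatched while a normalizing district would not, which is exactly the kind of cross-district heterogeneity that pre-selection screening could capture in stratification.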



Matt Fetter- USDA/NASS

February 28, 2011






