Supporting Statement for
Evaluation of the Summer Food Service Program (SFSP) Participant Characteristics
Part B
(OMB Control Number: 0584-0595)
April 14, 2014
Updated October 30, 2014
Prepared for:
USDA, Food and Nutrition Service
Office of Research and Analysis
3101 Park Center Drive, Room 1014
Alexandria, VA 22302
COR: Laura Zatz
CS: Ashley Owens
Prepared by:
Optimal Solutions Group, LLC
University of Maryland M Square Research Park
5825 University Research Court, Suite 2800
College Park, MD 20740-9998
Point of Contact:
Mark Turner, Ph.D., Project Director
Email: [email protected]
Phone: 301-918-7301
PART B. Collections of Information Employing Statistical Methods
2. Describe the procedures for the collection of information.
3. Describe methods to maximize response rates and to deal with issues of non-response.
4. Describe any tests of procedures or methods to be undertaken.
Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
Developing the sample frame information for the study will require data collection at several stages. State agencies will be contacted three times, in December 2014 and in May and June of 2015, to provide the lists of sponsors and sites used to develop the sample frame for the selection of sponsors. After the lists of 2014 sponsors and sites are obtained in January 2015, the availability and completeness of the states' administrative data will be examined, especially with respect to the sites' characteristics and contact information. If the data obtained from the states are not sufficient to support sampling or data collection, an alternative approach of contacting sampled sponsors will be implemented. In that case, sponsors selected for the sample will be contacted twice to provide their respective site lists in order to develop the sample frame for the selection of sites: first in April and May 2015 (on a rolling basis, depending on the state's application deadline), and again in June 2015. For a more detailed description, see Part A.
Two school districts in SFSP-eligible areas will be contacted to provide lists of eligible children (those participating in school lunch or breakfast programs) from which to select respondents for the semi-structured interviews with parents or caregivers of SFSP participants and eligible nonparticipants. Twenty-five parents or caregivers of SFSP-participating children and 25 parents or caregivers of SFSP-eligible nonparticipating children will be randomly selected from the lists. The parent/caregiver surveys will not be representative; they will rely on a convenience sample. Because of the low response rate reported in the 2003 study (45%), financial incentives will be used to increase response rates.
To develop the sampling frame and to answer some research questions, administrative data will be collected from the states, possibly from some of the sampled sponsors if state data are insufficient, and from two schools or school food authority (SFA) sites. The data will be used to develop the sampling frame of sponsors, sites, and parents/caregivers of participants and eligible nonparticipants. Administrative data on sponsor and site characteristics will be used to address research Objectives 1, 2, and 3, as well as to provide comparisons with the ERS/MPR 2003 results (research Objective 6). The administrative data for this study will be collected from three sources:
The states’ administrative data will include a request for a complete listing, contact information, and types of current SFSP sponsors and sites; planned dates of meal service; tabulations of SFSP participation (e.g., number of children and number of meals served), meal types, the peak month of operation, the number of meals expected to be provided during the summer, and anticipated or typical timeframe for approving or rejecting new sponsor and site applications (or sponsor and site application submission and approval/rejection dates). If available, we will also request anticipated or actual number of replacement sponsors and sites enrolling after the official deadline.
The administrative data requested from some sampled sponsors, if needed, will include a request for a complete listing, contact information, and types of current SFSP sites; planned dates of meal service; and meal types and the number of meals expected to be provided during the summer.
The administrative data from the two schools or SFA sites will include the lists and the contact information for their NSLP and SBP participants and eligible nonparticipants.
Based on the 2003 study, the expected universe and response rates for this request are presented in Exhibit 5.
Exhibit 5. Sampling Universe, Sample, and Response Rates

| Type of sample | Respondent universe | Sample size | Expected response rate | Expected non-respondents |
|---|---|---|---|---|
| State agencies | 54 | 54 | 100% | 0 |
| Sponsor organizations | 4,397 | 300 | 90% | 30 |
| Sites | 35,530 | 350 | 90% | 35 |
| Parents/caregivers of participants | n/a (list of 200) | 25 | 80% | 5 |
| Parents/caregivers of eligible nonparticipants | n/a (list of 200) | 25 | 80% | 5 |

Note: The sample of parents/caregivers will be a sample of convenience, without an expectation of representativeness.
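The expected non-respondent counts in Exhibit 5 follow directly from each group's sample size and expected response rate. The arithmetic can be sketched as follows (the figures are taken from the exhibit; this is an illustration, not part of the study's methodology):

```python
# Expected non-respondents = sample size x (1 - expected response rate),
# using the sample sizes and rates from Exhibit 5.
samples = {
    "State agencies": (54, 1.00),
    "Sponsor organizations": (300, 0.90),
    "Sites": (350, 0.90),
    "Parents/caregivers of participants": (25, 0.80),
    "Parents/caregivers of eligible nonparticipants": (25, 0.80),
}

for group, (n, rate) in samples.items():
    nonrespondents = round(n * (1 - rate))
    print(f"{group}: {nonrespondents} expected non-respondents")
```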
Statistical methodology for stratification and sample selection, estimation procedure, degree of accuracy needed for the purpose described in the justification, unusual problems requiring specialized sampling procedures, and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Statistical methodology for stratification and sample selection
Based on the obtained sponsor and site lists, a stratified sampling approach will be used to select nationally representative samples of sponsors and sites by USDA region and location (urban/rural), with probability proportional to size, where the measure of size will be average daily attendance at SFSP sites. The overall design for the study involves two-stage stratified sampling that uses the USDA regions as the primary sampling strata and selects sponsors and sites at each stage with probability proportional to size. The sampling frame for the first stage of this effort will be derived from the states' listings of sponsors and sites and data on program volume. The second stage (selecting program sites) will be based on information obtained from the states, with possible follow-up with some program sponsors if state data on sites are insufficient, and will target up to two sites per responding sponsor. The first-stage selection of sponsors will involve two phases. In the first phase, a sample of approved sponsors will be selected from lists obtained from states during April and May 2015. Additional sponsors will be selected in the second phase, after all states have been contacted in June and July 2015 regarding any late sponsor and site approvals.
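As a concrete illustration, selection with probability proportional to size (PPS) within a single stratum can be sketched as below. This is a simplified sketch only, not the study's actual sampling code: the sponsor names and attendance figures are hypothetical, and systematic PPS sampling is used here as one common implementation of the technique.

```python
import random

def pps_systematic_sample(frame, k):
    """Select k units with probability proportional to size (PPS) using
    systematic sampling. `frame` is a list of (unit_id, size) pairs, where
    size is the measure of size (e.g., average daily attendance)."""
    total = sum(size for _, size in frame)
    step = total / k                     # sampling interval on the size scale
    start = random.uniform(0, step)      # random start within the first interval
    points = [start + i * step for i in range(k)]

    selected, cumulative, idx = [], 0.0, 0
    for unit_id, size in frame:
        cumulative += size
        # A selection point falling inside this unit's cumulative-size range
        # selects the unit; larger units cover wider ranges.
        while idx < len(points) and points[idx] <= cumulative:
            selected.append(unit_id)
            idx += 1
    return selected

# Hypothetical stratum (one USDA region): sponsors with average daily attendance.
stratum = [("Sponsor A", 1200), ("Sponsor B", 300), ("Sponsor C", 2500),
           ("Sponsor D", 800), ("Sponsor E", 150)]
print(pps_systematic_sample(stratum, k=2))
```

In the study's two-stage design, this kind of draw would be made within each USDA region at stage one (sponsors) and again within each selected sponsor at stage two (sites).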
These sampling procedures will target the selection of a total of 300 sponsors and 350 sites. Under standard assumptions, power calculations indicate that the target primary sample of 300 sponsors and 350 sites should be sufficiently large to detect the influence of environmental or policy factors individually, using analysis of variance (ANOVA) or a general linear modeling framework, even when effect sizes are small (0.1-0.2). The calculations also assume a design effect of 1.5 and at least an 80% response rate among sponsors and sites. Non-sampling errors arising from unit and item nonresponse will be addressed through weighting and imputation where appropriate. The sponsor and site data will be weighted by calculating the sampling weight as the inverse of the probability of selection. The sampling weight for a site will depend on the probability of selection of its sponsor at the first stage.
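The weighting described above amounts to taking the inverse of the product of the stage-one and stage-two selection probabilities. The sketch below illustrates that calculation, along with the effective sample size implied by the stated design effect; the selection probabilities shown are hypothetical, not the study's actual values.

```python
def site_weight(p_sponsor, p_site_given_sponsor):
    """Two-stage sampling weight for a site: the inverse of the overall
    probability of selection (the sponsor's stage-one probability times
    the site's conditional stage-two probability)."""
    return 1.0 / (p_sponsor * p_site_given_sponsor)

# Hypothetical example: a sponsor selected with probability 0.05 and a site
# selected with conditional probability 0.5 represents 40 frame sites.
w = site_weight(0.05, 0.5)
print(w)  # 40.0

# Effective sample size for the 350 sites under the assumed design effect of 1.5.
n_sites = 350
design_effect = 1.5
effective_n = n_sites / design_effect
print(effective_n)  # approximately 233
```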
The study will utilize a mixed-mode data-collection approach to maximize the survey response rates and minimize the respondent burden. The following data-collection techniques will be used:
Use USDA official introductory letters and e-mails to establish the legitimacy and the importance of the survey and to make respondents aware that this is a USDA initiative.
Use introductory letters and e-mails from state agencies to make sponsors and sites aware of the importance and legitimacy of the survey. The introduction will describe the study’s purpose and value, the privacy and confidentiality of the survey data, and its use in the analysis.
Send the color brochure by mail and by e-mail to explain the study’s value; to outline the research objectives, data-collection approach, dates, and the information being collected; and to provide relevant contact information.
Send up to five e-mail reminders at various times and on various days of the week to increase the odds of the survey invitation’s being noticed and acknowledged.
Send up to five reminder postcards after the introductory letter to increase the number of contacts with the respondents.
Provide self-addressed, prepaid envelopes; closely monitor the status of postal and e-mail survey invitations and survey activity; and send targeted, friendly e-mail reminders to non-respondents.
Conduct up to five reminder telephone calls with non-respondents. The respondents will have the option of taking the web survey, responding to the telephone survey, rescheduling for a more convenient time, requesting a paper version of the survey, or identifying another suitable point of contact.
Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from ten or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The five types of survey instruments (state agencies, sponsors, sites, parents/caregivers of participants, and parents/caregivers of nonparticipants) were pretested with two state administrators, six sponsors, six site supervisors, two parents/caregivers of the participants, and three parents/caregivers of eligible nonparticipants. The state and sponsor respondents were also asked to estimate the burden required to provide the lists and administrative data on the sponsors and the sites that they operate.
The pretesting involved cognitive interviews to estimate respondents' burden and to identify respondents' difficulties in interpreting and responding to the questions. This approach used probes to determine whether respondents had difficulty understanding the meaning of questions or words, problems remembering the information needed to answer, concerns about unnecessary questions or response options, or trouble with missing questions or response categories. Items that respondents reported, or interviewers observed, to be problematic, difficult, or time-consuming to answer were discussed further to determine the reasons for the difficulties.
Based on the pretesting results, the surveys were revised to minimize the respondents' burden. The revisions involved rewording some questions to improve comprehension and ease of response, omitting some items that were confusing or difficult to answer, adding skip logic to avoid asking questions that were not applicable, revising some of the response scales, adding a few specific questions and items, and revising the surveys' overall layout and format. The final versions of the surveys are attached as Appendices E-1, F-1, G-1, H-1, and I-1.
The names and telephone numbers of individuals consulted on statistical and methodological aspects of the design are:
EXPERT ADVISORY PANEL - SFSP

| Expertise | Name | Position | Affiliation |
|---|---|---|---|
| Sampling, survey methodology | Laura Stapleton | Associate Professor, Measurement, Statistics and Evaluation | Department of Human Development and Quantitative Methodology, University of Maryland |
| Subject matter, food security programs, program administration | Judi Bartfeld | Director | Institute for Research on Poverty (IRP), USDA Food Security Research, Innovation, and Development Grants in Economics (RIDGE) program, University of Wisconsin |
| Subject matter, food security programs, program evaluation, social services | | Child Nutrition Policy Analyst | Food Research and Action Center (FRAC) |
| Subject matter, food security programs, program evaluation, social services | Kate Sims | Outreach and Policy Associate | Food Research and Action Center (FRAC) |
Statistical Contact
For questions regarding the study or questionnaire design or statistical methodology, contact:
Mark Turner, Ph.D., Project Director
Optimal Solutions Group, LLC
University of Maryland, M Square Research Park
5825 University Research Court, Suite 2800
College Park, MD 20740-9998
Telephone: 301-918-7301; e-mail: [email protected]
File Type | application/vnd.openxmlformats-officedocument.wordprocessingml.document |
Author | Patrick Mulford |
File Created | 2021-01-26 |