Evaluation of the Summer Food Service Program Enhancement Demonstrations

OMB: 0584-0560


Evaluation of the Summer Food Service Program Enhancement Demonstrations

Supporting Statement and Tables

OMB Submission Part B – May 12, 2011


























Contact Information:

Chan Chanhatasilpa, Ph.D.

FNS Project Manager

USDA, Food and Nutrition Service

3101 Park Center Drive, 10th floor

Alexandria, VA 22302

703-305-2115

[email protected]




Table of Contents


B Collection of Information Requiring Statistical Methods

B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.2.1 Information Collection Procedures
B.2.2 Expected Level of Precision
B.3 Methods to Maximize Response Rates and Deal with Non-response
B.4 Tests of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

References


Tables

B.1.1 Sample sizes for summer and fall 2011 surveys
B.1.2 Sample sizes for summer 2012 surveys
B.2.1 Contents of parent questionnaire
B.2.2 Contents of document review forms
B.2.3 Minimum detectable differences between groups, assuming one-tailed test; alpha = 0.05 and power = 0.80
B.5.1 Individuals involved in statistical aspects, data collection, and analysis



B. Collection of Information Requiring Statistical Methods

B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



Data for the evaluation will be collected from the following sources:

  • Parents of participants in the Meal Delivery and Backpack demonstrations;

  • Separate cost instruments completed by demonstration state grantees and sponsors in all four types of demonstrations;

  • Interviews with key informants representing state officials, demonstration sponsors and SFSP sites in Demonstrations 1 (Extended Operations), 2 (Enrichment Activity), and 4 (Backpack), and state officials and demonstration sponsors in Demonstration 3 (Meal Delivery); and

  • Abstraction of key information from demonstration project proposals and other project materials in all four types of demonstrations.

Statistical methodologies will be employed to select households for inclusion in the parent data collection effort. The sampling approach and expected response rates for this component are discussed in this section. Data collection from other sources (cost instruments, key informant interviews, and materials collection and abstraction) will not use sampling. A discussion of the selection process and the number of entities involved in these data collection components is provided in Section B.2.


Sampling of Households for Parent Interviews


In preparing the lists that will be used for sampling households for parent interviews, it is important that complete and up-to-date lists of demonstration participants be constructed to ensure substantially unbiased coverage of the population of interest. While it is theoretically possible to develop such lists from participant records maintained by the demonstration sponsors and sites, the complete list of participants will generally not be known until the demonstration is well underway. Since the interviews for the outcome evaluation will be conducted over a relatively short time period during the summer months (i.e., primarily in July and August), there needs to be sufficient lead time to compile, edit, and process the participant lists prior to selecting the samples. The way that children are enrolled to receive services under the Meal Delivery and Backpack demonstrations will also affect the timing and efficiency of the sampling procedures. For example, in the Backpack demonstration, children are typically invited to a park or community center to participate in summer activities, at which time they can be recruited into the study. This leaves little time to compile the required lists and ascertain eligibility of the children for inclusion in the study. In the Meal Delivery demonstration, on the other hand, sponsors are required to compile and maintain accurate lists of participating children in order to determine the most efficient way of delivering meals to participating households.


In view of the challenges associated with the sampling of the participants, substantial efforts will be made by local field staff (employed by the contractor) to coordinate sampling and data collection activities with the demonstration sponsors and sites. An important goal of these activities will be to better understand the processes involved in enrolling children into the various types of demonstration projects in order to determine an appropriate and practical way of sampling participants. As described below, a multistage sample design will be employed where necessary to reduce the scope of the list construction activities to a representative sample of sites operating within the demonstration areas. Note that while the general discussion below assumes that reasonably complete lists can be obtained for sampling purposes, the actual procedures to be used may involve less rigorous procedures if it is not feasible to develop such lists in a timely fashion.


For the Meal Delivery Demonstration, it will be feasible to select children from lists compiled by the demonstration sponsors. In this case, unclustered stratified samples will be selected from these lists (e.g., within strata defined by state and sponsor) to ensure appropriate representation of participants across the demonstration area. Although data collection will be conducted by telephone, it will be necessary in some instances for local field staff to visit the household in order to make sure the interview is completed. Thus, while the sample for the Meal Delivery demonstration is currently planned to be unclustered, consideration will also be given to the possibility of clustering the sample geographically (e.g., by ZIP code or Census tract) if it is determined that this will result in appreciable reductions in data collection costs.


For the Backpack demonstration, where lists are not available at the sponsor level but must be compiled with the assistance of the staff at the sites where meals are delivered, a two-stage probability sample design will be used to select participants for the household surveys on food security. The general approach will be to design and select a first-stage sample of approximately 40 sites (e.g., schools and community centers) from lists maintained by the demonstration sponsors. To ensure that the sample is as diverse as possible, the sites will be selected through a stratified sample design (e.g., with strata defined by geography, type of sponsor, size of site, and other site characteristics to the extent feasible). Within strata, sites will be selected with probabilities proportionate to a measure of size (e.g., the projected number of children to be served by the site). Use of a measure of size in sample selection will help ensure that roughly equal workloads per site can be maintained without unduly increasing the variability of sampling weights.


Through information gathered by local field staff, lists of participants will be constructed for the selected sites. Thus, at the second stage of selection, subsamples of demonstration participants will be selected from these lists. Roughly equal numbers of participants (children) will be subsampled from each site. To accommodate the fact that some lists may be incomplete or inaccurate, the sampling may be done in stages if feasible to allow extra time for list construction as the need arises.
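As an illustration, the two-stage selection described above (PPS sampling of sites within strata, followed by roughly equal participant subsamples per selected site) can be sketched as follows. This is a simplified sketch, not the contractor's actual procedure; the field names and measure-of-size values are hypothetical.

```python
import random

def pps_systematic(sites, n_select, size_key="projected_children"):
    """First stage: systematic PPS sampling, so each site's chance of
    selection is proportional to its measure of size (e.g., the projected
    number of children to be served). Applied separately within each stratum."""
    total = sum(s[size_key] for s in sites)
    interval = total / n_select
    point = random.uniform(0, interval)  # random start within the first interval
    chosen, cum = [], 0.0
    for site in sites:
        cum += site[size_key]
        # Select this site once for each selection point falling in its range
        while point < cum and len(chosen) < n_select:
            chosen.append(site)
            point += interval
    return chosen

def subsample_participants(lists_by_site, per_site=15):
    """Second stage: roughly equal numbers of participants per selected site."""
    return {site: random.sample(people, min(per_site, len(people)))
            for site, people in lists_by_site.items()}
```

With roughly 40 sites and 15 households per site, this design yields an initial sample on the order of the 605 households per demonstration type discussed below.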


It should be noted that the sampling frames of participants will typically consist of lists of individual children. Thus, to the extent possible, children included on the lists will be aggregated to the household level prior to sample selection to allow for the direct selection of households rather than indirect selection through the selection of individuals. The initial number of participants to be selected from the demonstration areas will be large enough to achieve the desired number of completed interviews after anticipated losses due to nonresponse and attrition.
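Aggregating child-level frames to the household level, as described above, amounts to grouping child records by a household identifier before selection; a minimal sketch, with hypothetical field names:

```python
from collections import defaultdict

def aggregate_to_households(child_records):
    """Group a child-level participant list into household-level sampling
    units so that households, not children, are selected directly."""
    households = defaultdict(list)
    for child in child_records:
        households[child["household_id"]].append(child["child_name"])
    return dict(households)
```

If aggregation is not feasible and children are sampled directly, a household's selection probability is proportional to its number of listed children, which the sampling weights must then reflect.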


Households participating in the 2011 Meal Delivery and Backpack demonstrations will be surveyed in the summer and fall of 2011. In the following summer of 2012, two additional rounds of interviews will be conducted with a representative sample composed of (a) households that responded in the fall of 2011 and (b) a supplemental sample designed to replenish the continuing sample and extend coverage to newly eligible households.


Sample Size. Tables B.1.1 and B.1.2 summarize the proposed target sample sizes for the household surveys to be conducted in 2011 and 2012, respectively. Since the interval between the time the sample is drawn and the start of the summer survey will be very short, we expect that sample losses (e.g., resulting from moves, loss of eligibility, etc.) prior to contacting the household will be relatively small. As indicated in the footnotes to these tables, we have assumed a 5% loss rate between the time the household is sampled and the initial contact. Among households that are determined to be eligible for the study, there will be losses due to survey nonresponse (refusal, unavailable during field period, language problems, etc.). Given the extensive efforts that will be used to follow up and convert telephone nonrespondents (described in B.3 below), we have assumed that it will be possible to achieve an overall survey response rate of 70 percent.


We estimate a total sample size of 804 respondents in the summer 2011 survey. As indicated in Table B.1.1 under the heading “summer 2011,” we will sample about 605 households (participants) from each demonstration type to obtain approximately 400 completed interviews per demonstration type for the first summer survey. Under the proposed two-stage sample design for the Backpack demonstration, the initial sample of 605 households will be clustered in 40 sampled sites, or about 15 households per site. The resulting sample will be approximately self-weighting (i.e., every sample household will have an approximately equal probability of selection) assuming that it is feasible to aggregate children to the household level prior to sampling. Otherwise, the probability of selecting the household will be proportional to the number of children in the household. To minimize the possibility of sampling a site with fewer than 15 participants, which can lead to excessive variation in sampling weights, sites with small numbers of participants will be combined with a nearby site prior to sampling whenever possible. Note that for the Meal Delivery demonstration, the initial sample of 605 households will be unclustered.


For the subsequent fall survey, all responding households in the summer 2011 survey will be included in the sample. Since the time interval between surveys will be relatively short, we assume a further loss of 5 percent due to moves or change of address, and a followup response rate of 80 percent among those households that can be recontacted. Under these assumptions, an estimated 306 households per demonstration type will complete the fall 2011 interview as shown in the last column of Table B.1.1.
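As a check on the figures in Table B.1.1, applying the assumed loss and response rates to the initial sample of 605 households per demonstration type reproduces the expected yields (values are rounded in the table):

```python
# Sample flow for one demonstration type, summer and fall 2011
sampled = 605
summer_eligible = sampled * (1 - 0.05)         # 5% pre-contact loss -> ~575
summer_completes = summer_eligible * 0.70      # 70% response rate   -> ~402
fall_eligible = summer_completes * (1 - 0.05)  # further 5% loss     -> ~382
fall_completes = fall_eligible * 0.80          # 80% followup rate   -> ~306
```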




Table B.1.1. Sample sizes for summer and fall 2011 surveys

                              Summer 2011                          Fall 2011
Demonstration type    No. to be  Exp. no.     Exp. no.    No. to be  Exp. no.     Exp. no.
                      sampled    eligible[1]  resp.[2]    followed   eligible[3]  resp.[4]

Meal Delivery
  Participant         605        575          402         402        382          306

Backpack
  Participant         605        575          402         402        382          306

All Areas
  Participant         1,210      1,150        804         804        764          612

[1] Assumes 5% loss (due to moves, change in eligibility or program participation status) between time of sample selection and summer 2011.

[2] Assumes 70% response rate.

[3] Assumes additional 5% loss between summer and fall 2011.

[4] Assumes 80% fall followup response rate.



The sample of households for the summer 2012 interviews will include respondents in the previous fall survey plus a supplemental sample drawn from updated lists of participants provided by the demonstration grantees. For the Backpack demonstration, the supplemental samples will be drawn from the same sites selected for the 2011 surveys, plus a small sample of new sites if warranted. As shown in Table B.1.2 under the heading “Continuing sample,” an estimated 245 of the 306 fall respondents (per demonstration type) will be eligible for the summer 2012 survey based on an assumed year-to-year loss rate of 20 percent. To compensate for these anticipated losses, an additional 310 households will be selected from current lists of participants, of which 295 are expected to be eligible for the survey based on a 5 percent loss rate. Assuming a response rate of 80 percent for the continuing sample and a response rate of 70 percent for the supplemental sample, the two components of the sample will yield approximately 400 completed household interviews for the initial round of data collection in summer 2012. A second interview will be attempted approximately 30 days later. Assuming a response rate of 95 percent for the second interview, an estimated 382 of the 402 respondents will complete the second interview for each demonstration.
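The same bookkeeping reproduces the expected summer 2012 yields in Table B.1.2 (per demonstration type; fractional values are rounded in the table, so individual cells may differ by one):

```python
# Continuing sample: fall 2011 respondents carried forward
continuing_eligible = 306 * (1 - 0.20)           # 20% year-to-year loss -> ~245
continuing_resp = continuing_eligible * 0.80     # 80% followup response -> ~196
# Supplemental sample: freshly drawn from updated participant lists
supplemental_eligible = 310 * (1 - 0.05)         # 5% pre-contact loss   -> ~295
supplemental_resp = supplemental_eligible * 0.70 # 70% response rate     -> ~206
first_interview = continuing_resp + supplemental_resp  # ~402 completes
second_interview = first_interview * 0.95              # 95% retention -> ~382
```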


Table B.1.2. Sample sizes for summer 2012 surveys

                      Continuing sample          Supplemental sample     Total sample
Demonstration type    No. retained  Exp. no.     No. to be  Exp. no.     Eligible  Exp. no.   Exp. no.   No. resp. -  No. resp. -
                      from fall     eligible     sampled    eligible     cases     followup   supp.      first int.   2nd int.[6]
                      2011[1]       cases[2]                cases[3]               resp.[4]   resp.[5]

Meal Delivery
  Participant         306           245          310        295          539       196        206        402          382

Backpack
  Participant         306           245          310        295          539       196        206        402          382

All Areas
  Participant         612           490          620        589          1,078     391        412        804          764

[1] See Table B.1.1.

[2] Assumes 20% loss rate between 2011 and 2012. For example, data from the 2009 Current Population Survey indicate year-to-year mobility rates ranging from 17 to 25 percent among households with incomes below 150% of poverty (U.S. Census Bureau, 2010).

[3] Assumes 5% loss between time of sample selection and summer 2012.

[4] Assumes 80% followup response rate.

[5] Assumes 70% initial survey response rate.

[6] Assumes 95% response rate for second interview. Response rate is expected to be relatively high because the second interview will be conducted within a month of the first interview.



B.2 Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.



B.2.1 Information Collection Procedures

Information collection will occur over a two-year period beginning in 2011 and will consist of four types of data collection.


Telephone Interviews with Parents

Interviews will be conducted with parents of participants in the Meal Delivery and Backpack demonstration projects at four points in time – summer 2011, fall 2011, and twice during the summer of 2012. The interviews will be administered by trained telephone interviewers from the evaluation contractor’s telephone research center. To overcome nonresponse due to incomplete contact information and households with no phones, the study will use the “Take Phone Home” approach, described in detail in Section B.3.


The parent questionnaire will include an assessment of household food security using the USDA 18-item food security module. This module permits assessment of food security of children in the household under the age of 18 or of those older than 18 eligible to participate in the demonstration (i.e., graduated seniors over age 18, or those who have been determined by the state to have a mental or physical disability and who participate in a public or nonprofit school program established for people with disabilities). Based on responses provided, households can be classified as food insecure without hunger, food insecure with moderate hunger, and food insecure with severe hunger. The questionnaire will also contain items on respondent and participant characteristics, household characteristics, other factors that may be contributing to demonstration project impacts, and parent and participant satisfaction with the demonstration project (Table B.2.1).


Attachment H contains seven versions of the parent questionnaire (i.e., separate questionnaires for parents of participants in the Meal Delivery demonstration in summer 2011, summer 2012 – Round 1, and summer 2012 – Round 2; separate questionnaires for parents of participants in the Backpack demonstration in summer 2011, summer 2012 – Round 1, and summer 2012 – Round 2; and a fall 2011 questionnaire for parents of participants in either the Meal Delivery or Backpack demonstration). Spanish versions of these instruments are also contained in Attachment H. Many of the questions in the parent questionnaires were drawn from previously validated or cognitively tested questions, in some cases with slight revisions.1

Table B.2.1. Contents of parent questionnaire

  • Participation in demonstrations (Meal Delivery and Backpack)

  • Demonstration operations (e.g., source of information on demonstration, location, scheduling meal delivery)

  • Parent and demonstration participant satisfaction with demonstration project

  • Participation in other nutrition assistance programs

  • School year nutrition program participation

  • Household food security

  • Perception of change in food expenditure

  • Meal targeting accuracy (food storage, food consumption by participants, sharing food, leftover food)

  • Concern about stigma


  • Household and respondent characteristics

  • Demographics of respondent

  • Presence of a person with a disability in the household

  • Employment status of adults in household




Site Visits and In-person Key Informant Interviews

For the implementation process evaluation, trained project staff will visit each of the four types of demonstrations and conduct in-person key informant interviews using semi-structured interview guides (Attachment J). Staff will conduct up to 11 key informant interviews for each demonstration project. Key informants will consist of state officials (demonstration project grantees), sponsors, and staff and volunteers from demonstration sites. The contractor will work with FNS and state officials to develop criteria for the selection of key informants and will seek their assistance in obtaining individuals to participate in the interviews. Interviews for all four types of demonstrations will take place in 2011. Additional interviews for Demonstrations 3 and 4 only will take place in 2012.


Collection of Cost Data

State grantees will be asked to complete a short cost instrument (Attachment I1) by telephone to collect information on start-up and operational costs related to their role as a demonstration grantee. Grantees from the states in which the four types of demonstrations are being conducted (up to 8 grantees) will be asked to complete the cost instrument in 2011. The state grantees in the Meal Delivery and Backpack demonstrations will be asked to complete the instrument again in 2012. State grantees will be sent the questionnaire prior to the telephone interview so they can prepare for it. An extended cost instrument (Attachment I2) will also be fielded to all demonstration sponsors in 2011 to collect information on administrative and benefit costs, and to those sponsors still operating demonstration projects in 2012 (Backpack and Meal Delivery projects only). The contractor will also work closely with state grantees to determine whether some of the cost data to be collected in the extended cost instrument (Attachment I2) have already been collected from sponsors by the state grantees and can be obtained directly from them.

We expect response rates for the short and extended cost instruments to approach 100 percent. Demonstration project sponsors have agreed to cooperate with the evaluation effort as a condition of their participation in the demonstrations. To facilitate collection of cost data from all four demonstration types, webinars will be held to describe the instruments and data collection methodology to sponsors. An accompanying letter will include a contact name and phone number in case state grantees or sponsors have questions about the content of the cost instruments. Because the number of state grantees and project sponsors is quite small for the Backpack and Meal Delivery projects (a total of 8 state grantees and 16 sponsors), evaluation contractor staff will be able to conduct intensive follow-up and provide one-on-one assistance to sponsors in these demonstrations if they experience difficulty completing the instrument or fail to complete the questionnaire within the requested timeframe.


Document Review and Abstraction of Data


State grantees and sponsors will be asked to provide copies of their proposals, participant recruitment materials, and other project documents (e.g., reports and menus). Project staff will review the materials and abstract key information using an abstraction form. Examples of the type of information expected to be contained in the documents are shown in Table B.2.2. This information will serve to supplement data collected in key informant interviews. Abstractors will be trained to obtain 80 percent reliability.

Table B.2.2. Contents of document review forms

State proposals

  • Name of state

  • Department

  • Staffing (roles and responsibilities)

  • Methods for providing administrative oversight

  • Types of sponsors sought*

  • Project design

  • Sponsor outreach*

  • Application process

Sponsor proposals

  • Sponsor name

  • Organization type

  • Site type (e.g., open, enrolled, camp)

  • Previous experience with Summer Food Service Program

  • Location

  • Methods for providing nutritional integrity (expected contents of meal)

  • Methods for maintaining food safety

  • Outreach to sites, participants, and families

  • Duration of program (if also a site)

Other materials (e.g., menus, recruitment materials)

  • Demonstration start and end dates

  • Contents of meals

  • Description of contents

  • Frequency of meal delivery (e.g., days of the week)

*Demonstrations 1 and 2



B.2.2 Expected Level of Precision

A stated goal of the sample design is that it should be able to detect policy-relevant changes in food security with 95 percent confidence (i.e., a five percent significance level) and 80 percent power (i.e., a type II error probability of 20 percent). To illustrate the levels of precision that can be achieved under the proposed design, we have calculated the minimum detectable difference (MDD) in food security under various assumptions about the underlying prevalence of household food security in the selected demonstration sites.

The assumptions are based on national estimates of the prevalence of food security derived from the 2008 Current Population Survey (CPS) Food Security Supplement (FSS) (Nord et al., 2009). Among all households with children under 18 years of age, about seven percent experienced very low food security during the past year. Among low-income households (i.e., households with incomes below 130 percent of federal poverty guidelines), the prevalence of very low food security was 17 percent. We anticipate that the levels of very low food security in the various demonstration sites could be higher than this national rate of 17 percent for low-income households. It should be noted that the CPS also collects information about food security during the past month, which is the definition to be used in the proposed household surveys. Under this definition, the rates of very low food security are four percent for all households with children under 18 years of age, and 11 percent for low-income households. Moreover, the prevalence of households with very low food security among children is much lower. For example, the percentage of households with children that have very low food security among children is about 1.3 percent across all income groups; among low-income households, it ranges from 3 to 4 percent.

Thus, in what follows, we consider a range of very low food security prevalence rates to assess the MDDs that can be achieved under the proposed sample design.


Table B.2.3 summarizes the MDDs for prevalence rates relating to very low food security ranging from 5 to 25 percent under various assumptions about the sample sizes of the subgroups being compared. The MDDs shown in the last column of the table were calculated assuming a design effect (DEFF) of 1.10, reflecting nonresponse weighting adjustments and other aspects of the sample design leading to unequal weights. For some types of comparisons, the DEFFs may be larger than the 1.10 used in the calculations. In particular, if there is a high degree of homogeneity of the survey responses within sites, the DEFFs could be considerably larger.


The MDDs for various types of comparisons are shown in the table. For example, the entries in the uppermost part of the table, denoted “summer 2011 (cross-sectional),” apply to cross-site comparisons of the summer 2011 survey results based on a nominal sample size of 400 respondents per demonstration. The samples being compared in this case are independent. Under the stated assumptions, the MDDs are expected to range from 4.02% to 7.98%, depending on the underlying prevalence rate being estimated. For the corresponding cross-sectional comparisons of fall 2011 survey data, the MDDs will be somewhat larger due to the smaller expected sample sizes. The entries in the lower part of the table apply to comparisons across years, which can be viewed as either “longitudinal” (in which case the same set of respondents is included in the analysis for both years) or “cross-sectional” (in which case only a portion of the samples being compared will overlap). In either case, the use of overlapping samples is expected to reduce sampling variance (i.e., increase sampling precision) because of the year-to-year correlation of household characteristics. Consequently, as can be seen in the last column of the table, the MDDs for year-to-year comparisons will generally be smaller than for within-year comparisons based on samples of roughly similar size.



Table B.2.3. Minimum detectable differences between groups, assuming one-tailed test; alpha = 0.05 and power = 0.80

Type of comparison      Prevalence of    DEFF*   First subgroup   Second subgroup   MDD
                        food security            sample size      sample size

Summer 2011              5%              1.10    400              400               4.02%
(cross-sectional)       10%              1.10    400              400               5.53%
                        15%              1.10    400              400               6.58%
                        20%              1.10    400              400               7.37%
                        25%              1.10    400              400               7.98%

Fall 2011                5%              1.10    306              306               4.59%
(cross-sectional)       10%              1.10    306              306               6.32%
                        15%              1.10    306              306               7.52%
                        20%              1.10    306              306               8.43%
                        25%              1.10    306              306               9.12%

Summer 2012              5%              1.10    400              400               4.02%
(cross-sectional)       10%              1.10    400              400               5.53%
                        15%              1.10    400              400               6.58%
                        20%              1.10    400              400               7.37%
                        25%              1.10    400              400               7.98%

Summer 2011              5%              1.10    196              196               4.06%
vs. summer 2012         10%              1.10    196              196               5.58%
(longitudinal)**        15%              1.10    196              196               6.65%
                        20%              1.10    196              196               7.45%
                        25%              1.10    196              196               8.06%

Summer 2011              5%              1.10    400              400               3.49%
vs. summer 2012         10%              1.10    400              400               4.80%
(cross-sectional)**     15%              1.10    400              400               5.72%
                        20%              1.10    400              400               6.41%
                        25%              1.10    400              400               6.93%

* Design effects due to unequal weighting, nonresponse adjustment, and clustering. For the Meal Delivery demonstration, the design effects will be lower since there is no clustering effect under a one-stage stratified design.

**Year-to-year correlation for the overlapping portion of sample is assumed to be 0.50.
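The MDDs in Table B.2.3 are consistent with the standard formula for a one-tailed two-sample comparison of proportions, inflated by the design effect. The sketch below reproduces several table cells; the overlap adjustment, implemented here as a variance factor of (1 − rho × overlapping share), is our reading of the table's footnotes rather than a documented formula from the evaluation plan:

```python
from math import sqrt
from statistics import NormalDist

def mdd(p, n1, n2, deff=1.10, alpha=0.05, power=0.80, overlap_factor=0.0):
    """Minimum detectable difference for a one-tailed comparison of two
    proportions at underlying prevalence p. overlap_factor = rho times the
    overlapping share of the samples: 0 for independent samples, 0.50 for
    fully overlapping longitudinal samples with rho = 0.50."""
    z = NormalDist().inv_cdf(1 - alpha) + NormalDist().inv_cdf(power)
    variance = deff * p * (1 - p) * (1 / n1 + 1 / n2) * (1 - overlap_factor)
    return z * sqrt(variance)
```

For the final panel (summer 2011 vs. summer 2012, cross-sectional), an overlap of 196 longitudinal respondents out of 400 gives overlap_factor = 0.50 × 196/400 ≈ 0.245, which reproduces the table's 3.49% at 5% prevalence.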



B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.



Response rate projections are presented in Sections B.1 and B.2. Attaining these response rates will be accomplished through a variety of procedures intended to gain cooperation from sponsoring organizations, local sites in the Backpack demonstration, and parents of demonstration participants. This section describes the approaches that will be used to maximize response for each component of the data collection effort.


Coordination with State and Local Agencies. Ensuring the full support and active participation of state and local agencies (SFSP sponsors and sites) will be critical to successful data collection throughout the project. Early in the project schedule, meetings will be held with state grantees and sponsors to familiarize them with the evaluation and to coordinate the key informant and cost data collection efforts. Soon after the initial meetings, the evaluation contractor will conduct an orientation webinar with all sponsors for each demonstration to familiarize them with the evaluation, the contractor’s role, and the contractor’s plan for supporting them in the field. The webinars will be followed by regular meetings with each sponsor to learn more about their proposed process for implementing the demonstration; discuss the most efficient procedures for recruiting households into the evaluation, gaining consent, and completing the parent interviews; and jointly plan the identification and recruitment of local field coordinators (described below) to support the sponsors and sites. A priority will be to finalize the method for publicizing the summer demonstrations, prepare advance materials for parents (see Attachments E1 – E4), and establish procedures for building the list sample.


The evaluation contractor plans to assign one member of its data collection and coordination team (DCCT) to each demonstration state to coordinate evaluation activities. Team members will schedule meetings, draft minutes, oversee materials development, finalize procedures, conduct site visits and key informant interviews, and oversee local field coordinators, hired by the contractor, to carry out evaluation activities. Each DCCT member assigned to Demonstration #3 (Meal Delivery) will oversee one field coordinator, who will be responsible to the sponsor(s) in that state. Each DCCT member assigned to Demonstration #4 (Backpack) will oversee three field coordinators, and each field coordinator will support approximately five sponsors (64 sites per demonstration). The field coordinators will help the sponsors build their lists of demonstration participants, implement the Take Phone Home program, and follow up on nonresponse as needed.


Local Field Coordinators. The evaluation contractor’s DCCT members assigned to each demonstration will work closely with the sponsors and sites to identify and recruit local field coordinators. The contractor will obtain names from each site of local persons who may be qualified to fill the position. Field coordinators will be recruited based on longevity and interest in their community and awareness of food insecurity and the SFSP in their local area. Bilingual staff will be recruited in communities with a high percentage of non-English speaking households.


The contractor will hire approximately 12 field coordinators (exact numbers will be determined based on the geographic location of sponsors). The coordinators will be assigned sites close to their homes to minimize travel time. After making contact with their assigned sites and working out the logistics of their support, the field coordinators will help the sites compile their participant lists. Once the lists are prepared and the sample selected, the field coordinators will work with their assigned sites to facilitate the process of collecting the data from parents of participants in sampled households.


Take Phone Home Program. The Take Phone Home program assumes that the participant lists will lack telephone numbers for about 60 percent of sampled households. In summer 2011, the contractor will collect telephone contact information from participants so that about 90 percent of that same sample can be contacted by telephone for the fall 2011 interview. Incorrect telephone numbers are expected for about 30 percent of the returning sample in summer 2012, and no telephone contact information is expected for about 60 percent of the supplemental sample. As with the interview in summer 2011, the contractor will collect additional contact information on the supplemental sample during the interview to ensure there is up-to-date contact information for the second 2012 interview. For the second interview in 2012, telephone numbers are expected for 90 percent of the sample.


The Take Phone Home program is a two-step process. In the first step, local field coordinators will make arrangements to have disposable cell phones delivered. The cell phones (valued at about $10 each) will be working and will have 60 minutes of prepaid time (30 minutes will be used for the interview). Respondents will receive instructions explaining that if they call the contractor’s toll-free number and complete an interview within 24 hours, they will receive a total of 50 minutes of free prepaid time for their own use and may also keep the phone (Attachment B, in English and Spanish). If the respondent does not call within the predetermined time, the contractor’s Telephone Research Center interviewers will try to contact them via the cell phone. If they cannot be reached within 3 days by calling the cell phone, the field coordinators will move to Step 2 and attempt to schedule an in-person contact with a member of the household. The coordinator will either arrange to conduct the interview with the parent or caregiver at the SFSP demonstration site when the child is dropped off or picked up, or will visit the home.


For the Meal Delivery demonstration, field coordinators will be present at locations that assemble meals and will be responsible for preparing Take Phone Home packages (which include instructions [Attachment B] and a cell phone) and labeling them with the name and address of the child participants who should receive one. Drivers who deliver the meals will also deliver these packages. Drivers will be briefed by the field coordinator so they can provide a simple overview of the evaluation and instructions to parents when they make their deliveries. For the Backpack project, field coordinators will also prepare Take Phone Home packages that include instructions (Attachment B) and a cell phone. The local field coordinators will be present at each SFSP site to ensure that packages are delivered to those who require them. Field coordinators will provide the package to the child participant (or parent, if available) in a backpack and review the contents of the letter and instructions.


In-Person Followup. Field coordinators will begin followup with households that have not completed the phone interview within 3 days of receiving the Take Phone Home package (Attachment B). For the Meal Delivery project, depending on the location of the household, the local field coordinator will either make an in-person visit to the home or send a letter to the household via the delivery drivers. In addition, the field coordinator will either arrange for the parent to complete the interview the same day by telephone or schedule a time to complete the telephone interview within the next few days. For the Backpack project, field coordinators will make in-person contact with the household at the SFSP program locations during drop-off or pick-up time and encourage them to complete the interview. They will attempt to engage nonrespondents and convince them to take the time to complete the interview on the spot (using the coordinator’s cell phone). If unsuccessful, the coordinator will attempt to convince the household member to commit to a scheduled phone interview, either in the home or at the SFSP site. The primary function of the field coordinator in the nonrespondent followup effort is to facilitate contact with a phone interviewer; they will not conduct parent interviews. This will ensure the data collection mode is the same for all respondents, provide help to the respondent if needed, and improve response rates in an efficient manner.


Incentives. Parents with contact information who complete the interview will receive $20 for each interview as an incentive. Those with no contact information will become part of the Take Phone Home program and be given 30 prepaid minutes for the interview and an additional 50 prepaid minutes for their own use if they complete the interview in a timely manner. They will also be allowed to keep the cell phones after the interviews are complete. Attachments K1 and K2 contain thank you letters for the Meal Delivery and Backpack demonstrations (in English and Spanish).




B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Most questions in the parent survey have been well tested and validated. However, for those questions that were written specifically for the Backpack demonstration project, we conducted one round of cognitive testing with SFSP program administrators in Fort Worth, Texas, who implemented a summer food backpack program in 2010. The purpose of the cognitive testing was to solicit expert input on the clarity of wording and on primary caregivers’ ability to understand and respond to the questions before finalizing the parent questionnaire for submission with this information collection request. Testing consisted of 3 interviews in English and 2 interviews in Spanish with program administrators, conducted over the telephone. Overall, the administrators thought the parent survey was a good questionnaire and that parents in the program would be able to answer the questions.


Based on feedback from the testing, minor revisions were made to several questions to increase understanding, better reflect how meals are delivered to participants, or add response categories. Some administrators felt it would be difficult for parents with multiple children in the Backpack demonstration to answer some questions for each child. However, to meet the evaluation objectives of determining household food security and factors that might be related, the questionnaire continues to ascertain participation in the demonstration projects of each eligible individual in the household. Some questions are more general (e.g., whether anyone shared food in the backpack with others) and therefore may be easier to answer.


Nonetheless, interviewers will be trained to be mindful of the questions that require responses for each child, especially for families with a large number of children (more than 5) who brought home a backpack. They will be trained to take the parent through these questions patiently and carefully and to provide probes and reminders as necessary. For example, when the respondent is asked to name all foods in each backpack and says s/he cannot remember, interviewers will probe using food categories to help the parent remember: “Thinking about that most recent backpack, were there any milk products; were there any fruits or vegetables; was there any bread, or any type of meat included?” When a respondent does list foods, the interviewer will always ask, “Anything else?” before going on to the next child’s backpack.


If the respondent says all children received the same foods, the interviewer will be instructed to confirm each child by name on the CATI screen before coding the response as the same for all. Interviewers will also be required to record respondent comments on questions that require responses for each child. If the respondent makes a comment that needs to be captured, the interviewer will use a command to enter the comment in CATI. This links each comment to the question that prompted it, the text of the comment, and the ID of the case. Early in the data collection period, we will review comments daily and provide interviewers with additional probes informed by these comments.


We will also examine the frequencies of all variables early in the data collection period to look for red flags (e.g., more than 15 percent of respondents said “don’t know” or did not answer) (see Section A.16). For the series of questions that requires responses for each child in the family who receives a backpack, we will also examine the frequencies of “don’t know” responses and missing data broken down by the number of children in the family, to see whether response patterns differ between families with many children and families with few.
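As an illustration only, using hypothetical variable names and synthetic records (the evaluation’s actual CATI data files and item names are not specified here), a red-flag check of this kind could be sketched as:

```python
import pandas as pd

# Hypothetical interview records: one row per household, with the number of
# children who received backpacks and the coded outcome of one per-child item.
records = pd.DataFrame({
    "n_children": [1, 1, 2, 3, 5, 6, 6, 2],
    "food_item_response": ["answered", "dont_know", "answered", "answered",
                           "dont_know", "missing", "dont_know", "answered"],
})

# Red flag: "don't know" or missing exceeds 15 percent of responses overall.
problem_rate = records["food_item_response"].isin(["dont_know", "missing"]).mean()
flagged = problem_rate > 0.15

# Break the DK/missing rate down by family size (few vs. many children) to see
# whether larger families have more trouble with the per-child items.
records["many_children"] = records["n_children"] > 3
by_size = (records.groupby("many_children")["food_item_response"]
                  .apply(lambda s: s.isin(["dont_know", "missing"]).mean()))
print(flagged, by_size.to_dict())
```

The same breakdown could be run item by item across the per-child question series, with the 15 percent threshold taken from Section A.16.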


In addition, a parent with a large number of children may tire of the questions that require responses for each child and, by the third or fourth child, may start responding in the same way for all children. Besides training interviewers to be on the lookout for this possibility, we will examine the variability of the data for this series of questions to see whether this might be occurring.
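As a sketch only, with made-up household IDs and response codes (the actual data structure is not shown here), the variability check for this kind of “straight-lining” could look like:

```python
import pandas as pd

# Hypothetical per-child records: household ID, child's position within the
# household, and the coded response to one per-child question.
rows = pd.DataFrame({
    "hh_id":  [1, 1, 1, 1, 1, 2, 2, 2, 2, 2],
    "child":  [1, 2, 3, 4, 5, 1, 2, 3, 4, 5],
    "answer": ["A", "B", "A", "C", "B", "A", "A", "A", "A", "A"],
})

def straight_lined(group: pd.DataFrame) -> bool:
    # Restricting to later children (3rd onward) catches parents who vary
    # their early answers and then settle into one identical response.
    later = group.loc[group["child"] >= 3, "answer"]
    return later.nunique() <= 1

# One flag per household: True when later children's answers show no variation.
flags = rows.groupby("hh_id")[["child", "answer"]].apply(straight_lined)
print(flags.to_dict())
```

Flagged households would not be excluded automatically; the flags would simply direct analyst review and interviewer retraining, consistent with the monitoring described above.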

Detailed information on the changes made to individual questionnaire items is provided in the pretest memo contained in Attachment L.


B.5 Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Data collection and data analysis activities will be conducted by Westat, the research organization contracted to complete an independent evaluation of the demonstration projects. In consultation with USDA, Westat developed the sample design for the evaluation (as described in Section B.1) using standard statistical methods.

Key individuals and their roles in the evaluation are shown in Table B.5.1.

Table B.5.1. Individuals involved in statistical aspects, data collection, and analysis


USDA, Food and Nutrition Service

3101 Park Center Drive, 10th Floor

Alexandria, VA 22302

Chan Chanhatasilpa, Ph.D.

FNS Project Manager

Phone: 703-305-2115

E-mail: [email protected]

USDA, Economic Research Service

1800 M Street NW
Washington, DC 20036

Mark Nord, Ph.D.

Advisor on use of Food Security Module in parent questionnaire

Phone: 202-694-5433

E-mail: [email protected]

Westat

1600 Research Boulevard

Rockville, MD 20850

Westat Project Team

Lynn Elinson, Ph.D.

Principal Investigator; Task Director, Design, Development, Analysis Team Lead

Phone: 240-314-5844

E-mail: [email protected]

Susie McNutt, M.S., R.D.

Co-PI, Operations Design, Management, QC Team Lead

Phone: 301-738-3554

E-mail: [email protected]

Adam Chu, Ph.D.

Senior Sampling Statistician

Phone: 301-251-4326

E-mail: [email protected]

James Bethel, Ph.D.

Senior Statistical Advisor

Phone: 301-294-2067

E-mail: [email protected]

Mustafa Karakas, Ph.D.

Health Economist

Phone: 301-294-2874

E-mail: [email protected]

Sujata Dixit-Joshi, Ph.D., M.P.H.

Senior Nutritionist

Phone: 508-435-0402

E-mail: [email protected]





Westat Advisory Group

David Cantor, Ph.D.

Westat Associate Director;

Research Professor, Joint Program in Survey Methodology, University of Maryland; experience in Take Phone Home program

Phone: 301-294-2080

E-mail: [email protected]


Brad Edwards, B.A.

Westat Vice President; Expert in Survey Management and Design and Special Populations (e.g., low income)

Phone: (301) 294-2021

E-mail: [email protected]


Martha Berlin, B.A.

Westat Vice President and Associate Director, Survey Operations Group

Phone: (301) 251-4287

E-mail: [email protected]


Jerry Cater, Ph.D., RD

Consultant for school nutrition programs in Mississippi

Research Scientist (retired)
Applied Research Division
National Food Service Management Institute
Hattiesburg, MS

Phone: (228) 860-9636

E-mail: [email protected]

Katherine Yadrick, Ph.D., RD

Professor and Chair

Department of Nutrition and Food Systems
University of Southern Mississippi

Hattiesburg, MS

Phone: (601) 266-4479

E-mail: [email protected]








References

Nord, M., Andrews, M., & Carlson, S. (2009). Household Food Security in the United States, 2008 (ERR-83). U.S. Department of Agriculture, Economic Research Service.

U.S. Census Bureau. (2010). Current Population Survey, 2009 Annual Social and Economic Supplement. Internet release date: May 2010.


1 Summer and school year food consumption: revised from a study on the SFSP (funded by FNS). Participation in nutrition assistance programs: another FNS study (the Healthy Incentives Pilot). Household size and age: the California Survey of WIC participants. Gender, language, education, date of birth: NHANES. Hispanic or Latino origin and annual household income from all sources: the Behavioral Risk Factor Surveillance System.
