National Household Food Acquisition and Purchase Survey
Technical Working Group Recommendations
OMB: 0536-0068
APPENDIX C
RECOMMENDATIONS OF THE TECHNICAL WORK GROUP
FOR ADJUSTMENTS TO FIELD TEST PROCEDURES


MEMORANDUM

955 Massachusetts Avenue, Suite 801
Cambridge, MA 02139
Telephone (617) 491-7900
Fax (617) 491-8044
www.mathematica-mpr.com

TO: Mark Denbaly, USDA Economic Research Service

FROM: Nancy Cole

SUBJECT: Recommended Modifications for FoodAPS Based on the TWG Meeting

DATE: 7/19/2011 (Rev. 9/23/2011)

The FoodAPS Technical Work Group (TWG) meeting was held on July 15 to present
findings from the field test and receive recommendations from academic and government
experts. Below is a list of FoodAPS design changes recommended by the TWG.
1. Adopt the high incentive level – The field test included a low and high base incentive for
the main respondent ($50 and $100). The high incentive yielded significantly greater
response rates and is recommended for the full-scale survey.
2. Include the “Pre-Test with interviewers” – The field test included a “pre-test with
interviewers” that is not budgeted for the full-scale survey. This pre-test was suggested by
OMB to gather additional information about survey protocols in advance of the field test.
Interviewers used the food reporting data collection instruments for one week prior to their
training for the field test. For the field test, Mathematica provided “respondent training” to
interviewers in a series of two-hour group training sessions. For the full-scale survey, we will
produce a training video for distribution via multiple media. The hands-on experience that
interviewers gained through the “pre-test” gave them an in-depth understanding of the
protocols and resulted in more efficient and effective training for the field test.
3. Acquire Additional SNAP Data – One finding from the field test is that the SNAP
caseload is fluid, which reduces the efficiency of the SNAP sampling frame. About half of
SNAP households were obtained from the ABS frame, while non-SNAP cases were obtained
from the SNAP frame. TWG members were not concerned with this frame “contamination”
in terms of the impact on sampling weights. However, they were concerned with the reduced
efficiency of the sample over time: the SNAP frame (based on State caseload data) will age
even more over the 6-month full-scale survey than it did over the field test. The TWG
suggested that processing additional SNAP administrative data for sampling in the second
half of the field period may improve the efficiency of field operations.
4. Research and acquire additional UPC data – The TWG was concerned with the low match
rate of scanned UPCs with Gladson data because item descriptions from receipts are often
impossible to link to food items. ERS indicated that they have research to share on this issue,
and suggested that aggregation of UPCs may be appropriate for research purposes. However,
non-matching UPCs pose added costs when entering price data from receipts. Alternative
sources of UPC data, for example from large retailers or private-label manufacturers, can
improve data entry operations and data quality.
5. Replicate weights - The TWG recommended the construction of replicate weights for
inclusion in public access data files. Replicate weights can be used to account for the
complex sampling design without revealing the identity of PSUs.
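
To make the recommendation concrete, the sketch below shows how replicate weights are typically used for variance estimation. It is a minimal illustration assuming JK1-style jackknife replicates and simulated data; the memo does not specify FoodAPS replicate construction details (number of replicates, multipliers).

```python
# Minimal sketch: variance estimation from replicate weights (JK1 jackknife).
# All data below are simulated placeholders, not FoodAPS values.
import numpy as np

def replicate_variance(y, full_wt, rep_wts, coef=None):
    """Weighted-mean estimate and its replicate-based variance.

    y       : (n,) outcome values
    full_wt : (n,) full-sample analysis weights
    rep_wts : (n, R) replicate weight columns
    coef    : (R,) replicate multipliers; defaults to (R-1)/R for JK1
    """
    R = rep_wts.shape[1]
    if coef is None:
        coef = np.full(R, (R - 1) / R)
    theta = np.average(y, weights=full_wt)               # full-sample estimate
    theta_r = np.array([np.average(y, weights=rep_wts[:, r]) for r in range(R)])
    return theta, float(np.sum(coef * (theta_r - theta) ** 2))

rng = np.random.default_rng(0)
y = rng.normal(50, 10, 100)                          # e.g., weekly food spending
w = rng.uniform(0.5, 2.0, 100)                       # full-sample weights
rw = w[:, None] * rng.uniform(0.8, 1.2, (100, 40))   # 40 replicate columns
est, var = replicate_variance(y, w, rw)
print(f"estimate = {est:.2f}, SE = {var ** 0.5:.2f}")
```

Because the replicate weights embed the sampling design, analysts of the public-use file can compute design-consistent standard errors without ever seeing PSU identifiers.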
6. Increase the number of Secondary Sampling Units (SSUs) from 4 to 8 – The full-scale
survey was designed to include 4 SSUs per PSU. At the first TWG meeting (January 2009),
TWG members suggested an increase to 8 SSUs per PSU for the field test and this change
was made in the first contract modification. Eight SSUs resulted in acceptable ICCs that were
driven primarily by weighting and not by clustering. TWG members recommended adoption
of 8 SSUs per PSU for the full-scale survey.
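
As textbook background for this recommendation (these are standard Kish approximations, not field test estimates), the variance inflation from clustering and from unequal weighting can be written separately:

```latex
% \bar{m} = average completed interviews per SSU, \rho = intraclass correlation
\mathrm{deff}_{\mathrm{cluster}} \approx 1 + (\bar{m} - 1)\,\rho
% CV(w) = coefficient of variation of the analysis weights
\mathrm{deff}_{\mathrm{weight}} \approx 1 + \mathrm{CV}(w)^2
```

Doubling the SSUs per PSU spreads a fixed PSU workload over more clusters, lowering \bar{m} and hence the clustering term; this is consistent with the field test finding that weighting, not clustering, was the dominant driver.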
7. Measures to improve response rates – The field test achieved lower response rates than
expected. Several TWG members were comfortable attributing the rates to the difficult
survey areas and winter field conditions. However, OMB expressed concerns with the
screener response rates and stated that we must implement a solution for the low screener
response rate. The following measures are designed to improve response rates:
a. Add one additional day to the field interviewer training schedule – One additional
day will be used to provide in-depth training on screening, including locating,
planning routes, filling time slots, logging attempts, and refusal conversion.
b. Procedural changes to increase response rates – TWG members suggested that
sources of nonresponse should be reduced through: (a) an incentive to gatekeepers of
locked buildings and gated communities; (b) a redesigned advance mailing and
subsequent mailings to convert refusals. The advance mailing redesign is warranted
because the incentive should have had a greater impact at screening, and the lack of
impact indicates ineffective communication of the incentive prior to screening.
c. Implement a web-based sample management system – Mathematica recommended
implementation of a web-based sample management system to better track the
multiple attempts needed to make contact with the address-based sample, and to
better track the multiple contacts with households throughout the data collection
week. It is critical that we receive detailed and timely sampling reports to manage
sample release and provide feedback to field staff on their productivity. TWG
members generally concurred that better tracking of the screening effort is essential to
achieving improved response rates.
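
As a minimal illustration of the tracking such a system would support (field names are hypothetical, not a specification of the proposed system):

```python
# Hypothetical per-attempt record for a sample management system.
from dataclasses import dataclass
from datetime import datetime
from collections import Counter

@dataclass
class ContactAttempt:
    case_id: str          # sampled address or household
    attempt_number: int
    timestamp: datetime
    mode: str             # "in_person", "phone", "mail"
    outcome: str          # "no_answer", "refusal", "screened", ...
    interviewer_id: str

log = [
    ContactAttempt("A-0001", 1, datetime(2012, 3, 6, 18, 30),
                   "in_person", "no_answer", "FI-12"),
    ContactAttempt("A-0001", 2, datetime(2012, 3, 8, 10, 15),
                   "in_person", "screened", "FI-12"),
]
# Timely rollups of such records drive sample release decisions and
# interviewer feedback, e.g., outcomes by time slot or attempts per case.
print(Counter(a.outcome for a in log))
```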
8. Extend the field period – TWG members noted that the current field period, March 5
through September 3, will prove difficult because many households will be on vacation in
August when we are trying to wrap up operations. The TWG suggested that we extend the
field period by at least one month, and possibly into October or November because those
months provide a more responsive environment. We would also thereby include a longer
period when children are in school.
9. Revise household interviews to reduce burden – Item nonresponse on Household
Interview #2 was a problem (employment, income, and non-food expenditures). TWG
members suggested that ERS examine their research plans for these data and identify survey
questions that may be eliminated. Three suggestions for reducing burden and improving
response are: (a) eliminate questions about utility expenditures and supplement the survey
with averages by geographic area; (b) design and produce an advance brochure to inform
survey respondents of the content covered in HH2; and (c) combine HH2 and HH3 so that the
difficult income questions are part of an in-person survey.
10. Increase sample release to account for lower rates – Mathematica presented findings from
the field test that showed higher than expected eligibility rates from the ABS frame, but
lower than expected screener completion rates from the SNAP frame. The higher eligibility
rate is due largely to the fact that field test PSUs were selected to provide very low-income
areas, and they are thus not representative of the average PSU for the full-scale survey.
The lower than expected SNAP completion rate indicates that SNAP households are not as
responsive as expected, perhaps due to the expansion and increased diversity of SNAP
caseloads. Lower SNAP household cooperation must be taken into account in the full-scale
survey with a larger release of sample from the SNAP frame.
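
The required release follows from a simple chain of rates; the sketch below uses invented placeholder rates (the memo does not give the full-scale targets) to show the calculation:

```python
# Back-of-envelope SNAP-frame sample release; all figures are illustrative.
target_completes = 1200      # hypothetical target of completed SNAP households
eligibility_rate = 0.80      # released addresses that prove eligible
screener_rate = 0.65         # screener completion rate
interview_rate = 0.75        # completion rate among screened eligibles
release = target_completes / (eligibility_rate * screener_rate * interview_rate)
print(f"addresses to release: {release:.0f}")  # ~3077
```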
11. Provide additional analysis of field test data – TWG members suggested that the field test
data should be analyzed to address the following questions.

Incentives:
- Include incentive levels in nonresponse analysis.
- Use regression analysis to obtain estimates of the marginal impact of the high incentive
  (see the sketch after this list).
- Examine the impact of incentives as measured by the total incentive paid to the household.
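
A minimal version of the suggested regression might look like the following; the variable names and simulated data are hypothetical, since the memo does not describe the field test file layout:

```python
# Sketch: marginal effect of the high incentive on screener response.
# Simulated placeholder data; real controls would include frame, area, etc.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
high_incentive = rng.integers(0, 2, n)    # 1 = $100 arm, 0 = $50 arm
snap_frame = rng.integers(0, 2, n)        # 1 = address from SNAP frame
p = 1 / (1 + np.exp(-(-0.5 + 0.6 * high_incentive + 0.2 * snap_frame)))
responded = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([high_incentive, snap_frame]))
fit = sm.Logit(responded, X).fit(disp=False)
print(fit.get_margeff().summary())        # average marginal effects
```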

Sampling:
- Examine whether the accuracy of the SNAP frame changed over time.
- Determine whether the counties were similar in terms of the inaccuracy of the frame.
- Determine whether the SNAP frame yielded a disproportionate percentage of very
  low-income households.
- Examine the incremental cost of using the SNAP vs. ABS frame for additional SNAP
  households, and the relative cost of obtaining very low-income households from the
  SNAP vs. ABS frame (see the sketch after this list).
- Conduct analysis of the efficiency/cost tradeoff of oversampling households adjacent to
  SNAP households.
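
The frame cost comparison reduces to cost per completed very low-income household; the sketch below shows the shape of that calculation with invented rates and costs (none of these figures come from the field test):

```python
# Illustrative frame cost comparison; every number is a placeholder.
def cost_per_complete(cost_per_release, eligibility, screener, completion):
    """Cost per completed very low-income household from one frame."""
    return cost_per_release / (eligibility * screener * completion)

snap = cost_per_complete(12.0, eligibility=0.55, screener=0.60, completion=0.70)
abs_frame = cost_per_complete(10.0, eligibility=0.15, screener=0.70, completion=0.75)
print(f"SNAP frame: ${snap:.2f} per complete")
print(f"ABS frame:  ${abs_frame:.2f} per complete")
```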

Quality of food acquisition data:
- Examine the percentage of acquisitions by day of the data collection week to see if there
  is survey fatigue over the week (see the sketch after this list).
- Examine food acquisitions separately for single-person and multi-person households.
- Examine SNAP participants’ food acquisitions relative to their benefits distribution.
- Examine acquisitions in relation to the frequency of paychecks.
- Examine characteristics associated with response through different reporting channels
  (e.g., Daily List, red/blue page, scanner, receipts).
- Assess whether misclassification of income at the screener has an impact on inferences
  with regard to food acquisition behavior.
- Compare reported food acquisitions with HH1 responses to questions about “How often
  do you eat dinner out.”
- Examine the disparity between booklets and FRS by timing of phone calls throughout the
  week, to determine whether survey fatigue played a role in reporting accuracy.
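
The day-of-week fatigue check is a simple tabulation; the sketch below uses invented column names and toy data, since the memo does not describe the acquisition file:

```python
# Sketch: share of reported acquisitions by day of the collection week.
import pandas as pd

events = pd.DataFrame({
    "hh_id":       [1, 1, 1, 2, 2, 3, 3, 3],
    "day_of_week": [1, 2, 6, 1, 4, 2, 3, 7],   # 1 = first day of the week
})
share = (events.groupby("day_of_week").size() / len(events)).reindex(
    range(1, 8), fill_value=0.0)
print(share)  # a steady decline across days would suggest reporting fatigue
```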

