Cognitive Test Results

Field Test for the National Household Food Acquisition and Purchase Survey

OMB: 0536-0067
APPENDIX B
COGNITIVE TEST RESULTS

National Household Food Acquisition and Purchase Survey
Report on Cognitive Tests

June 7, 2010

Holly Matulewicz
Nancy Cole

Contract Number: AG-3K06-D-09-0212
Mathematica Reference Number: 6687-031

Submitted to:
U.S. Department of Agriculture
Economic Research Service
1800 M Street, NW, Room 5125
Washington, DC 20036
Project Officer: Mark Denbaly
Contracting Officer: Jennifer Crouse

Submitted by:
Mathematica Policy Research
955 Massachusetts Avenue, Suite 801
Cambridge, MA 02139
Telephone: (617) 491-7900
Facsimile: (617) 491-8044
Project Director: Nancy Cole

CONTENTS

I    INTRODUCTION

II   METHODOLOGY
     A. Sample of Households
     B. Instruments Tested
     C. Household Training and Cognitive Interviews
        1. First Household Visit
        2. Second Household Visit
        3. Length of Household Visits

III  RESULTS
     A. Use of Scanner and Bar Code Book
     B. Individual Use of Food Booklets
     C. Understanding and Using the Data Collection Forms
        1. Instructions and Reference Pages
        2. Daily List (CV)
        3. Master List for the Household (SV1 and SV2)
        4. Red Pages for Reporting FAFH
        5. Blue Pages for Reporting FAH
        6. Purple Pages for Reporting FAH and FAFH (SV2)
        7. Youth Booklet
        8. Saving Receipts
        9. Respondents' Overall Assessments
     D. Data Quality
        1. Did Respondents Use the Scanner Correctly?
        2. What Was the Quality of the Written "Descriptions" of Foods?

IV   CONCLUSIONS AND RECOMMENDATIONS
     1. Generalizability of These Results
     2. Implications for the Field Test

APPENDIX A: FOODAPS COGNITIVE TEST CASES

TABLES

1  Three Versions of Instruments for Collecting Food Data
2  Allocation of Households to Three Versions of Instruments
3  Length of Household Visits
4  Use of Individual Booklets by Interview Type
5  Percentage of Food Acquisitions with a Receipt
6  Scanner Data Validity by Type of Interview

I. INTRODUCTION
This report describes findings from cognitive test interviews conducted from April 30 to May
25, 2010, as part of the National Household Food Acquisition and Purchase Survey (FoodAPS)
sponsored by the Economic Research Service (ERS). The cognitive tests were fielded to obtain
information about respondent understanding and use of survey materials designed to collect data
about household food acquisitions over a one-week period. The cognitive tests provided feedback
on three slightly different versions of these instruments. For respondents, the tests included (a)
training, (b) using instruments over a two-day period, and (c) completing a debriefing interview.
Results were obtained from interviewer observations and questions during the training session,
qualitative reports from respondents about the data collection process, interviewer analysis of
respondents’ use of the data collection instruments, and comparison of the scanner data with the
data collection forms for food at home (FAH). Video and audio files from seven interviews were
sent to ERS under separate cover.
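The comparison of scanner data with the FAH forms described above is, at its core, a record-matching check: for each shopping trip, items captured by the scanner are reconciled against items written on the paper form. The sketch below illustrates one way such a check might be implemented; all field names and sample records are hypothetical, not drawn from the actual FoodAPS files.

```python
# Hypothetical sketch of a per-trip data-quality check: reconcile items
# captured by the handheld scanner with items recorded on the paper FAH
# form. Identifiers and sample values are illustrative only.

def compare_trip(scanned_items, form_entries):
    """Return items found in both sources, only in the scanner data,
    and only on the paper form, for one shopping trip."""
    scanned = set(scanned_items)
    written = set(form_entries)
    return {
        "both": scanned & written,
        "scanner_only": scanned - written,
        "form_only": written - scanned,
    }

# Example: one item was scanned but never written down, and one item
# without a bar code was written down but could not be scanned.
trip = compare_trip(
    scanned_items={"04963406", "01234565", "07000001"},
    form_entries={"04963406", "01234565", "blueberry muffin"},
)
print(sorted(trip["scanner_only"]))  # ['07000001']
print(sorted(trip["form_only"]))     # ['blueberry muffin']
```

In practice the reconciliation would also need fuzzy matching, since the paper forms carry free-text descriptions rather than UPCs; this sketch shows only the exact-match core of the comparison.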
The cognitive interviews provide data to help us answer three questions about data collection
procedures for the National Food Survey:
1. Do respondents understand the tasks involved in participation (scanning, saving receipts,
and filling out forms)?
2. Do respondents complete the tasks involved in participation and do they use the survey
tools correctly?
3. Are there differences in respondents’ understanding and completion of tasks, and in data
quality, across the three instrument formats used for the cognitive tests?
This report is organized in three sections: (1) Methodology, (2) Results, and (3) Conclusions and
Recommendations.
II. METHODOLOGY
The participants for the 16 cognitive interviews were drawn from a convenience sample of
persons currently receiving benefits from the Special Supplemental Nutrition Program for Women,
Infants, and Children (WIC), Supplemental Nutrition Assistance Program (SNAP), or Social Security
benefits. Our respondents self-identified as the primary food shopper in their household. To recruit
this sample, Mathematica Policy Research conducted outreach with two local WIC offices (to find
households with young children), two locally based Councils on Aging (COAs) (to recruit elders),
and two sites for the Department of Transitional Assistance (DTA) (for remaining households with
and without children). We sent letters to directors of these agencies and followed up with a
telephone call and an in-person visit either to drop off recruitment brochures or conduct on-site
recruitment. At each agency, we sought input on the most effective ways to reach out to the agency’s
clients. We did not ask for lists of clients, and we did not sample from a list. Methods of recruiting
participants varied by site:
• At one COA, the nutrition director promoted the opportunity on the grocery shopping
van service the COA offers to seniors. The second COA distributed flyers, but this did
not yield calls from elders in time to schedule cognitive tests.


• The DTA offices first sought state-level approval for our activities and then allowed
Mathematica staff to recruit in their office. We set up a table in the office lobby during
peak traffic hours and gathered names and contact information from those who
approached the table for more information.
• The WIC office elected to distribute our flyers to clients. Interested clients contacted us
by telephone for more information.
We will contact the same agencies again in June 2010 when we recruit households for the
pretest; at that time they will receive a $50 gift as a token of our appreciation for their assistance.
A. Sample of Households
We completed screeners with a total of 46 households to determine household size and
composition (nonelderly households were assumed to be low-income based on recruitment site).
Our goal was to recruit 16 households. Three households initially agreed to participate but did not
complete a first appointment, two rescheduled their first appointment, and one of them ultimately
did not complete the appointment. Thus, we scheduled 20 interviews to obtain 16 completed
interviews. We completed a first interview/training with 16 households and a second interview with
15 households. One household completed a first appointment as scheduled but was unable to keep
the second appointment, and, to date, that household has not taken or returned our calls in an effort
to reschedule.
We selected the 16 participating households to represent different demographic groups. Our
goal was to interview 8 households with children, 4 households with nonelderly adults and without
children, and 4 households with elderly persons. Due to last-minute cancellations and the availability
of others to book on short notice, we ended with 9 households with children, 3 nonelderly
households without children, and 4 elderly households. Recruitment of households with dependent
children ages 11 to 18 was the most difficult, as their incidence at the DTA offices was low on the
days we conducted the on-site outreach efforts.
Households with children ranged in size from two to four persons, with an average of two
persons. Households without children ranged in size from one to four persons, with an average of
two persons. Three households had adults living or staying in the household during the study period
who were not identified in the screening process: in one household, a grandmother who provided
child care during weekdays; in two households, relatives staying for an extended period of time.
A detailed description of the participating households, by case identification number, is
included as Appendix A. Of the 16 respondents, one had less than a high school education, nine
graduated from high school, and six graduated from college (or had postgraduate schooling).
Participants included racial and ethnic minorities; two households living in a homeless shelter
serving families; and persons with physical, cognitive, and mental disabilities.
B. Instruments Tested
The purpose of the cognitive tests was to obtain feedback on three slightly different versions of
instruments designed to collect data on food acquisitions. These three versions are referred to as:
• Comprehensive version (CV)
• Simple version 1 (SV1)

• Simple version 2 (SV2)
These versions differ primarily in the layout and organization of data collection forms. The
comprehensive version collects all data items specified in the statement of work; the two simplified
versions do not collect information about whether a tip was paid and the tip amount (for food away
from home, or FAFH); SV2 also does not collect information about purchases for persons outside
the household.1
All three versions included the following components:
1. Handheld Scanner and “Scanner Instructions and Book of Bar Codes” to scan
foods and drinks brought into the home for home preparation and consumption (food
at home, or FAH);2 the bar code book contains pictures and bar codes for bulk items
(dried fruits, candy, grains and rice, nuts, and seeds), deli items (meats, cheeses, and
salads), and fruits and vegetables.
2. Youth Food Booklets for persons ages 11–18 willing to track their FAFH acquisitions
3. Adult Food Booklets to report FAFH
We summarize differences between versions in Table 1. The first item (scanner and book) was
the same for SV1 and SV2; for CV, the book also included data collection forms for FAH (blue
pages). The last item (Youth Food Booklets) was exactly the same for CV, SV1, and SV2. Three
different Adult Food Booklets were designed as follows:
• CV Adult Food Booklets—include Daily List (green pages) for each day located at the
front of the booklet and red pages for reporting FAFH.
• SV1 Adult Food Booklets—include red pages to report FAFH and blue pages for
reporting FAH. A master list (ML) for the household, designed to reside separately from
the food booklets, replaced the daily list (DL).
• SV2 Adult Food Booklets—include purple pages to report FAFH and FAH. An ML for
the Household, designed to reside separately from the food booklets, replaced the DL.
The two simple versions were designed to consolidate the FAFH and FAH data collection
forms in the Adult Food Booklet. SV1 is similar to CV, retaining the red and blue color coding for
FAFH and FAH, while SV2 contains a single purple form for reporting all food acquisitions. The
red and purple forms retain a similar one-page-per-place design in all versions. The CV blue page
has a one-page-per-place design, and the SV1 blue page has a grid design (one page for the week).
Allocation of households to each version of instruments is shown in Table 2.

1 We originally envisioned simplified versions that would exclude even more data items in order to reduce burden. However, after simplifying the design of the forms, Mathematica and ERS agreed to test the redesigned forms without excluding additional data items.

2 Respondents were trained to scan the bar code on the food product, if available, or to look up the food item in the book of bar codes if the item had no bar code.

Table 1. Three Versions of Instruments for Collecting Food Data

Instrument               Comprehensive (CV)     Simple Version 1 (SV1)  Simple Version 2 (SV2)

1. Scanner Book          - Instructions         - Instructions          - Instructions
                         - Bar codes            - Bar codes             - Bar codes
                         - Blue forms for FAH

2. Adult Booklet         - Daily list           - Red forms for FAFH    - Purple forms for
                         - Red forms for FAFH   - Blue grid for FAH       FAFH and FAH

3. Master List for       NA                     Separate from           Separate from
   the Household                                booklets                booklets

4. Youth Booklet         Separate from booklets; one version used for all tests

FAFH = food away from home; FAH = food at home.
NA = not available.

Table 2. Allocation of Households to Three Versions of Instruments

Household Type                 Comprehensive  Simple 1  Simple 2  Total
                               (CV)           (SV1)     (SV2)

Households with Children (a)
  0-6 years old                1              1         1         3
  7-10 years old               2              0         1         3
  11-14 years old              0              1         0         1
  15-18 years old              0              1         1         2
Households Without Children
  19-30 years old              1              0         0         1
  31-64 years old              1              1         0         2
  65 years or older            1              1         2         4

Total                          6              5         5         16

(a) Households may have had children or adults in other age categories. For the purposes of identifying
interview targets or quotas, households were first categorized as being with or without children, then
placed in the appropriate group based on the oldest adult in the household (for those without children)
and the oldest child in the household (for those with children).

CV = comprehensive version; SV1 = simple version 1; SV2 = simple version 2.

C. Household Training and Cognitive Interviews
A member of Mathematica’s Cambridge survey staff (hereafter, “interviewer”) visited each
respondent twice, in the respondent’s home or, for two respondents, at the COA. The first visit was
designed to exactly replicate the training portion of the first household interview planned for the
full-scale survey. The second visit was designed as a debriefing interview. We asked each household
to track its food acquisitions for two days following the first visit. We gave each respondent $10 in
cash at the first visit and asked him or her to spend the money on food within the next two days. We
gave each respondent a postal money order for $50 after completing the debriefing interview in the
second visit.
1. First Household Visit

At the first visit, the interviewer provided the consent form and asked the respondent to review
and sign it, indicating consent to participate, along with consent (or not) to audio- or videotape the
session. Each respondent received a copy of his or her consent form. The interviewer then
described the purpose of the study, provided an overview of the data collection activities, reviewed
each component of each booklet, provided instruction on how to use the scanner and when to use
the bar code book, and finished with a “practice” activity so that respondents could practice the
reporting of FAH and FAFH that they would do as part of the study. We developed separate
training scripts for each of the three instrument designs, using the same language for items that are
the same across instruments.
To train respondents for scanning, interviewers described the purpose of bar codes, showed
examples from groceries they brought with them, and then demonstrated how to scan an item,
holding the scanner on a slight angle about two inches away from a bar code. To indicate proper
use, the scanner provides two verifications: (1) it emits a red laser beam across the bar code when
the scanning button is depressed and (2) it emits a loud “beep” noise when the item has scanned
successfully.
To provide practice for reporting FAH, interviewers brought a supermarket receipt and grocery
bag with eight items. These items were purposively selected as items that might be challenging to
scan: rounded items (cans), soft packages (such as dried fruit), produce that had the new style of
“double bar codes,” produce and bulk items without a bar code that had to be looked up and
scanned in the bar code book,3 and a blueberry muffin that could not be scanned:
1. Orange—produce with double bar code
2. Apple—no bar code
3. Craisins package—squishy item
4. Blueberry muffin—store bakery item without bar code
5. Tuna can—defective bar code
6. Banana chips—bulk item
7. Toothpaste—nonfood item
8. Jalapeño pepper—no bar code, possibly hard to find in bar code book
Respondents practiced the reporting of FAH by reviewing the receipt (to determine the place),
scanning the practice items, and filling in the FAH form (blue or purple) using the supermarket
receipt and information about the grocery items that they scanned (or not). To practice reporting
FAFH, respondents completed the appropriate form to describe a recent meal or snack purchased
away from home.
During the first visit, we asked a series of questions to determine respondents’ understanding of
terms used in the booklets. These terms included the following:
• Store and Manufacturers’ Coupons. All respondents provided examples to illustrate
their understanding of this term, including “the ones that come out of the register” or
“coupons you get in the newspaper.”

3 The bar code book includes pictures and bar codes for bulk foods, deli items, fruits, and vegetables.

• Loyalty Card. Several respondents were confused by use of this term until they saw an
image of an example card, which they immediately recognized. When asked what they
call this card, respondents offered the following responses: customer card, frequency of
use type card, rewards card, member card, savings card, store card, or (fill the name of
store) card.
• EBT. Respondents were generally familiar with the term, which is an abbreviation for
electronic benefit transfer, and able to connect it to “food stamps” or “the card they
gave me for food assistance,” though they were not always sure what the acronym
meant. One described it as “the blue card from the state—they put food stamps on it
like a credit card.”
• “Please be as specific as possible” (bottom section of red and blue pages).
Respondents understood the concept and provided examples such as the name of the
brand, flavor of a drink, and type of food. Many respondents repeated the example given
on the instruction page facing the red forms.
• “Weight/#” (blue page). Respondents provided examples to reflect their
comprehension of the terms such as “either ounces or small/medium/large,” or “include
both number and weight.” One respondent said she did not understand why it was
necessary and she did not want to do it. Another suggested using the weight listed on the
container in that field. During the training, another respondent held the tuna and the
muffin in each hand to estimate the weight of the muffin relative to the tuna can (which
had the weight recorded in ounces on the label).
• “List ALL the foods and drinks you could not scan” (blue page). When respondents
were asked to put this in their own words, one said, “Things you can’t scan, if you ate
out, anything not eaten at home and anything not scannable;” others struggled to
rephrase it differently than written, feeling it was stated plainly and clearly.
• Size or Amount (purple page). When asked what they would include in the size or
amount column, respondents’ answers indicated comprehension of the concept and one
indicated kilograms would be easier to report than ounces (provided in the example).
One interpreted “amount” as dollar value.
Interviewers left each respondent one booklet for each household member age 11 or older.
Respondents were instructed to share what they had learned with other
household members to help them use their booklets. If another household member did not want to
use his or her own booklet, we instructed the primary respondent to record the information in his or
her own booklet. Respondents were told that the household could use individual booklets or one
booklet for all members of the household, at their discretion.
2. Second Household Visit

During the second visit, the interviewer began by asking respondents to provide an overview of
their experiences over the past two days and their food acquisitions over that time period. Next, the
interviewer reviewed the daily lists (for the CV group) or master list (for the SV1 and SV2 groups)
and each section of each booklet to identify inconsistencies between the recorded data and the
overview of acquisitions. Interviewers probed for clarification when data appeared to be recorded
incorrectly, and they identified sections in which respondents had questions or concerns. After this
review, interviewers asked scripted questions to obtain information about respondents’ experiences,
including the following:

• Understanding and application of concepts covered in the first training sessions: for
example, how to record a pizza delivered to their home, a snack purchased at a vending
machine, whether alcohol is something to record in the booklets, and how household
members can practice using the scanner.
• Interpretation of terms used in the booklets, such as “EBT,” “free food,” “loyalty card,”
and “detailed description.”
• Suggestions for improving the instruments or the process (these are discussed in the
results section).
The first set of “test” questions for respondents was about their understanding of where to
report different types of acquisitions (red or blue pages). In truth, acquisitions can be entered
anywhere and Mathematica can reclassify them as FAH or FAFH, but the concern is whether this
categorization placed a burden on respondents. Our two examples of pizza delivery and vending
machine revealed some confusion. For example, one respondent thought that food delivered to the
home is in the category of “food brought home” to be put on a blue page. She understood the
distinction between FAH and FAFH better when it was explained in terms of “food that you bring
home and prepare yourself.” She also suggested that “delivery” should be listed as a place for red
pages. Most respondents said they would put vending machine items on a red page, but at least one
respondent said they brought the item home and scanned it (either method is acceptable).
3. Length of Household Visits

The length of the first household visit varied slightly by type of instrument (Table 3). SV2
interviews were shortest, as expected, because this version included training for one detailed data
collection form (purple page) rather than two (red and blue pages). The length of the second
household visit depended on the number of reported food acquisitions and the amount of feedback
the respondent wished to provide.
Table 3. Length of Household Visits

Interview /               Comprehensive  Simple Version 1  Simple Version 2
Interview Length          (CV)           (SV1)             (SV2)

First Visit (minutes)
  Minimum                 45             60                50
  Maximum                 105            90                70
  Average                 67             70                62

Second Visit (minutes)
  Minimum                 60             90                75
  Maximum                 95             90                90
  Average                 78             90                81


III. RESULTS
This section presents results of the cognitive test interviews. We obtained these results from
interviewer observations during the training of respondents, qualitative reports from respondents
about the data collection process, interviewer analysis of respondents’ use of the data collection
instruments, and comparison of the scanner data with the data collection forms for FAH. The
discussion is organized by survey instrument.
Overall, during the first household visit, respondents remarked how easy the process would be,
and few asked questions when offered the opportunity to do so at each stage in the training. At the
second visit, the vast majority of respondents appeared confident they had completed the tasks
correctly, regardless of the quality of the data provided. Many said that the practice during the first
visit left them feeling equipped to collect these data on their own, and they said they gained
confidence after their first few entries. Furthermore, they were clearly engaged in the activity and
often followed up on specific details with the interviewer during the detailed review of the booklets.
A. Use of Scanner and Bar Code Book
All respondents were able to use the scanning device during the practice session. When asked
about the most helpful part of the training, respondents often cited the act of scanning groceries and
completing the data collection forms. Interviewers observed that several respondents had initial
problems using the scanner, but each problem was easily resolved by the interviewer. Scanning
problems included the following:
• Attempts to scan a bar code with the laser beam pointing vertically rather than
horizontally. Interviewers clarified that the red beam must cover the entire barcode, and
there was no beep because the beam was only flashing on one or two of the lines rather
than the whole code. (This will be addressed by adding an image to the scanner
instructions showing a red laser beam across the barcode horizontally.)
• Holding the scanner too close or too far from the item. After doing this in practice
and receiving feedback, respondents were able to keep the device at the correct distance
from the item (about two inches) in subsequent scans.
• Failure to press the “scan” button firmly. One respondent tried to use the edges of
her acrylic nail to depress the scan button and, although it emitted a beam, it did not
scan/beep because it was not depressed firmly enough. Likewise, an elderly respondent
who had mobility issues had difficulty at first in training because her hands shook and
were not always able to depress the button firmly at the same time the beam hit the bar
code. With practice, however, this respondent was able to use the device and reported
the scanning activity for her second shopping trip took only half as long as it did for the
first trip.
Respondents observed that the beep when they delete is different from the beep when they
scan, which appeared to be helpful. One noted that although she saw the red line hitting her item,
she knew it had not scanned because she did not hear a beep, and that led her to realize she was
holding the scanner too far from the item.

Respondents were instructed to look in the bar code book for items that did not have bar codes
affixed to them or that had bar codes that would not scan. As respondents leafed through the
booklet in training, some commented that the book was comprehensive, pointing out items such as
the potato salad or unusual fruits and vegetables.
Second Household Visit. All 15 respondents with a completed second visit returned the
scanner undamaged. The majority stored the scanner in close proximity to the booklets, often in a
pile together in either the kitchen or living room, on a high shelf (away from children), or in a
cupboard. Many respondents reported that scanning was fun and they enjoyed it. In one household,
the 10-year-old daughter helped scan the groceries from each shopping trip.
During this visit, each household answered a series of questions about scanning:
Can you tell me how and when you used the scanner? In general, respondents reported
scanning their groceries upon returning home. Most reported completing the blue (or purple) page
most of the way and stopping where the form asked if they scanned all, most, or some items,
because they could not answer that question until they tried to scan all their items. Those who were
able to scan all items and did not have to write any detailed descriptions reported forgetting to go
back to the form to check that final box. However, their memory of the scanning results for the
shopping trips was salient at the time of the second visit, and they would likely be able to report
such data over the telephone if prompted to do so.
Can you walk me through what you or others did to practice? Most respondents were able
to describe the process correctly. Some respondents said the person practicing should scan an item
then delete it. Others felt no one should practice using the device and it was their role as respondent
to do the scanning. Still others said that because they practiced during training, they did not need
additional practice before scanning actual foods.
Are the places where you normally shop included on the PLACES page? Most
respondents indicated that the places page in the scanner book encompassed all the places they
either shop or get food for free. Questions arose for the following situations:
• Uncertainty about whether a place should be scanned if grocery items are brought to the
home by a family member (free for the respondent, purchased at a store by the family member).
• Confusion about where an ethnic market would be classified in the list: respondent used
convenience store/bodega for this scenario.
Did you have any problems finding items in the scanner book? Respondents reported that
pictures were helpful in enabling someone to visually scan a page quickly for a particular item. A few
commented on the use of color to highlight groups of items—such as peppers, mushrooms, and
lettuce—in which varieties were grouped together and broke the alphabetical order of the book.
When probed about whether the coloring and layout were helpful, respondents did not express
strong opinions one way or another. One respondent did not notice that the book was alphabetically
organized and had trouble finding items, but others reported no problems. Specific comments for
improvements included
• Adding two items to the fruits and vegetables section that were said to be common
among Hispanic people: plantains and guineitos

• Using consistent descriptions to reinforce whether the bar code should be scanned once
per item (for example, use the singular avocado rather than avocados for a single item,
but the plural bananas for a bunch of bananas)
Did you have any questions about the scanner that were not included in the instruction
book? Respondents provided the following suggestions for questions and answers to include in the
booklets or training:
• Do I need to turn it on/off?
• Do I need to recharge the battery?
• Do I scan food I already have, that I am eating from my home, or only new things?
• Troubleshooting: What if I see the line but do not hear a beep?
• Troubleshooting: What do I do if an item will not scan?
• What do I do if I forgot to scan “place” before I scanned my groceries for that trip?
• Do I scan food (groceries) someone brought to my house for free?
• Do I scan things I eat out, such as a candy bar or soda (reported as a “snack”)?
• Do I scan each item if I bought a six-pack or is one bar code sufficient?
• How do I scan meat from the meat market? (The respondent concluded it should be
treated like anything else that could not be scanned.)
One respondent also suggested a start and end code to define a shopping trip.
B. Individual Use of Food Booklets
As noted earlier, we provided households with adult and youth food booklets corresponding to
the number of household members. We allowed households discretion in using multiple booklets or
collecting all household information in one booklet. Table 4 summarizes the use of booklets by
interview type. In six cases (40 percent), we expected only one booklet from the main respondent
because the households consisted of only one person or all children were younger than 11.
In the nine cases in which multiple household members could have completed their own
booklet, only two households collected data that way. In the remaining seven households, the main
respondent recorded all information for the household. When other adults in the household did not
complete booklets, respondents cited reasons such as poor penmanship, cognitive disabilities, or
refusal to take part in the study at all. Three households included youths who were age-eligible for
booklets but did not complete them. One of those households cited burden on the youth during a
stressful time when the teen would be living out of the household (the mother recorded data for the
teen). The second household included two teens: one completed a youth booklet and one did not
(this teen refused to participate in any way). The third household with a teen has not yet completed a
second interview.
How a respondent elects to allocate the booklets across household members can either increase
or decrease data quality. A proxy report with intensive followup for nonreporters might yield better
quality data than a self report. On the other hand, proxy reporting might result in fewer entries of
food acquisitions than actually occurred or entries that carry less detail than might have been directly
reported by a household member. Clearly, in this small sample, the primary respondents who
received training were better motivated than other household members or felt more able to
complete the tasks. (As discussed in the Conclusions section, we did not offer incentives to each
household member. Plans for the field test include a base incentive for the main respondent and
additional incentive for reporting by each additional household member.)
Table 4. Use of Individual Booklets by Interview Type

Number of Household                                       Version
Members Ages 11      Number of Booklets         ----------------------------     Percentage of
and Older            Completed                  CV     SV1    SV2     Total      Total Cases
1                    1                           3      1      2        6           40.0
More than 1          1                           2      1      2        5           33.3
More than 1          More than 1, all            1      0      1        2           13.3
                     completed by main
                     respondent
More than 1          More than 1, completed      0      2      0        2           13.3
                     by multiple household
                     members
Total                                            6      4      5       15

C. Understanding and Using the Data Collection Forms
The three versions of data collection instruments were organized differently but contained
similar components. This section is organized by the components of the data collection forms, with
information collated across the CV, SV1, and SV2 groups, where appropriate.
1. Instructions and Reference Pages
The instructions and question-and-answer pages were basically the same across all three
instrument versions, with identical step-by-step layout and identical language except where each
version required different instruction. Respondents did not ask questions about the instructions or
question-and-answer pages during training. During the debriefing we received the following
feedback:
• CV Group: Five of six respondents referred back to these sections during the data
collection and found them to be “helpful” and “clearly written.”
• SV1 Group: One respondent commented that “Few people with the info given wouldn’t
be able to do it.” Several found the step-by-step guide helpful, and one remarked that it
was “extremely straightforward.”
• SV2 Group: One respondent reported that she looked through all the instructions to
“Make sure she did things right” (then scanned and recorded foods consumed from
home each day). One respondent reflected on her reading of the instructions as being
“So easy even a trained monkey could do this.” Respondents looked for guidance on:
- Estimating the size and quantity of food (one respondent reported the form of
food feature was helpful)
- What to do if they lost a receipt
- Whether or not to record or scan foods eaten from an existing home food
supply.

2. Daily List (CV)

The DL was designed to be the first stop in the Adult Food Booklet, where respondents record
the place where they acquire food and are guided, by the structure of the DL, to the appropriate red
or blue page. The DL was in the CV Adult Food Booklet only.
During training, four of five respondents had no questions about the DL; one asked whether
school lunches go in Section A (for red pages) or B (for blue pages). At the second visit, many
respondents said that the list on the facing page was helpful in deciding between Sections A and B.
Five of six respondents completed the DL at the end of the day; one completed it just prior to
the second interview. Respondents reported they often skipped the DL and went straight to the red or blue
page. One respondent missed a full day of daily list entries, but had completed red and blue pages
for that day.
Issues encountered in the use of the DL included the following:
• Three of six respondents found the instructions on the DL confusing. Instructions to
“include what you bought and got for free” led them to think they should record item
details on the DL.
• When one household member refused to take part in the study, the respondent checked
“nothing to report that day” to record the refusal, because there was no other feature
within the forms to indicate that.
• Some respondents expressed confusion about recording vending machine foods (listed
as an example for Section A) if such foods are brought home.
3. Master List for the Household (SV1 and SV2)
The ML was designed as an alternative to the DL. Technical Work Group (TWG) members
suggested that a simple ML could provide essential details about food acquisition events for
households or individual household members who are unwilling to fill out detailed data collection
forms in the food booklets.
During training, the following feedback was obtained:
• One respondent did not like the ML being separate and said she might tape her ML to
her booklet to keep them together.
• Three respondents sought clarification on issues such as saving receipts, recording free
foods, and recording what they eat when out of the house.
At the second visit, respondents were asked questions to determine how well they retained the
information covered in the training session. They were asked if they referenced the guide on the
front page to decide between red or blue columns for place; only one respondent said yes.
Respondents were also asked the hypothetical question of what they would do if “all food came
from within your home” on a given day. Three of four SV1 respondents identified the “nothing to
report that day” box on the ML, and one said he or she “would not have recorded anything.” SV2
respondents struggled with this question. One respondent, who recorded all foods eaten, said “I’d
just write down what I had here from home.” When probed by the interviewer, she replied, “What
am I supposed to write—nothing, nothing, nothing?” Another echoed similar confusion about the
box on the ML, pointing to the box and saying, “But that is confusing because we eat every day.”
Only three of nine respondents in the SV1 and SV2 groups used the ML correctly:
• Three of four SV1 respondents used the ML, but two misunderstood it to be only for
their entries and not for everyone in the household. The three that used the ML said that
they recorded the entries on the form as they went through their day, soon after their
return home from getting any food.
• Two of five SV2 respondents used the ML correctly.
Keeping the ML form loose, or storing it in the side pocket of the scanner binder, did not
work. One respondent said she forgot about the ML but would not have forgotten it if it had been
part of her adult booklet.
Several factors caused incorrect use of the ML:
• One respondent checked the boxes on the front cover to indicate the places she got
food, but she left the columns blank.
• One respondent confessed she did not record day care snacks at all, but focused on “big
things” due to burden. (She suggested adding a day care example to the red list.)
• One respondent was unsure whether to check red or blue for groceries brought to the
home by a nonhousehold member; the text refers to “groceries brought home,” but she
did not bring them home herself, so she did not know how to record those groceries.
Respondents in the SV2 group were asked if it was confusing to have red or blue column colors
that both lead the respondent to a purple page; most found the question strange. Two said it was not
confusing. One added, “They were separate booklets. One is for place and the other is for
everything you ate.” (This respondent often eats outside the home.)
4. Red Pages for Reporting FAFH

The CV and SV1 Adult Food Booklets contained red pages for reporting FAFH. These red
pages had a different layout but collected most of the same information (SV1 did not include tip).
Only 3 of 10 respondents asked questions about the red page during training. These included
• Clarification of the response category “SNAP EBT,” as it was an unfamiliar term. The
respondent showed the interviewer his card and confirmed it was SNAP EBT.
• One respondent asked whether small items—such as taco sauce—should be included.

• One respondent was confused by the concept of who “got” the food—relating it to the
Council on Aging (COA) and wondered if it referred to who provided the food to the
COA.
Several challenges were reported at the second visit:
• A respondent did not have detailed information about meals and snacks served at his
child’s day care. The day care information form showed only “French toast,” but the
child also received snacks and beverages not listed on the form. The respondent did not
follow up with the provider for the details and did not record them on the red page.
• Respondents did not know quantities of food served in school meals. One reported he
“made up” the quantities.
• Respondents noted there were not enough red pages. One CV respondent ran out of red
pages. This respondent recorded his child’s meals and snacks in his booklet. After
running out of red pages, he “converted” blue pages to red ones.
• Multiple respondents were unsure of the form to use for some acquisitions. This
occurred for an item purchased in a grocery-like setting, partially consumed outside the
home, and then brought home and scanned. Another respondent was confused about
delivery items.
• Another respondent was not sure about the source of food. The respondent received
food from a friend and was unsure whether to report “place” as “friend” or as the place
the friend bought the item(s).
• One respondent did not record a restaurant meal that was paid for by someone else,
because she thought of this as an acquisition of the “buyer.”
Many respondents reported that the most difficult part of the data collection was describing
quantities of food. At least one reported that the portion size reference guide on the page facing red
pages was helpful, especially the examples using familiar items such as a deck of cards. Others
admitted that they didn’t know quantities, especially for school and day care meals, and entered their
best guess.
5. Blue Pages for Reporting FAH

Blue pages were used to collect information about FAH from the CV and SV1 groups. For the
CV group, blue pages were located in the bar code book and were formatted in the same way as red
pages, to report one place per page. For the SV1 group, blue pages were in the Adult Food Booklet
and formatted as a grid to collect all FAH events for the period.
During training, only one of 10 respondents had a question about the blue pages. That
respondent asked for a definition of a loyalty card. When the definition was provided, the
respondent then understood and described loyalty cards as “Shaws cards” or “CVS cards.”
The SV1 blue pages were somewhat more complicated than the one-page-per-place format of
the blue pages used for the CV group. For the SV1 group, a single food acquisition (shopping trip)
might span multiple pages of the form: the first page is a grid to list places and characteristics of the
acquisition, the next pages provide space to report descriptions of items that could not be scanned.
Following the practice exercise during training, each SV1 respondent was able to describe in his own
words how he would complete a row on page 1 and skip to the next page, as applicable. The
respondents also acknowledged receipts were to be stored in the pouch provided.
During the second visit, most respondents reported that they completed the blue pages soon
after their arrival home from getting food.
Feedback about the blue pages varied for the two groups:
• CV Group: Some felt it would be helpful to have DL, red, and blue pages all in one
binder with tabs for each section, especially when one respondent recorded all
information for the household. One respondent thought the blue pages were easier to
complete than red pages, because the practice with scanning made her feel prepared for
blue pages. Another felt blue pages were “less demanding” than red because scanning
was easier than recording all the details.
• SV1 Group: Overall there were no problems with the change in format between red and
blue pages. One respondent, however, recorded details of a shopping trip in place 1 (not
2) on the second page because she understood it to be the same “day” and did not notice
the header for “place” at the top of the column.
Respondents reported that the total time spent scanning and reporting FAH ranged from 5 to 30
minutes per trip for the CV group and from 15 to 30 minutes for the SV1 group, with 15 minutes per
trip the most common report. Most reported that they scanned items as they put them away.
6. Purple Pages for Reporting FAH and FAFH (SV2)

The purple pages in the SV2 Adult Food Booklet were designed to test the hypothesis that
respondents would find it easier to track food if they did not have to categorize their acquisitions as
FAH or FAFH to determine the proper form to use.
During training, many respondent questions about the purple form were general questions
about reporting food acquisitions and did not seem to be driven by the consolidation of the red and
blue forms. These questions included
• How to record if someone else paid for the meal?
• How to categorize brunch?
• How to record free refills or all-you-can eat salad bars?
• How to record the total of the practice scanning exercise because the purchase included
health and beauty products?
One elderly respondent had difficulty with the size of the boxes on this form (which were
consistent with other forms) and recorded $2.25 as $225 in the preset currency-formatted boxes;
however, that respondent was able to clarify (when prompted) that it was $2.25.
Respondents in this group were less clear about how to use the “Description” section at the
bottom of the form, as compared with users of the red form. This is likely because the directions
could not be targeted specifically for either FAH or FAFH. When asked how or when to use the
bottom section of the purple form (in their own words), respondents offered responses such as,
“When no receipt or when you couldn’t scan stuff;” “What I had at home to eat;” and “What you
ate and how much of it, what size it was, grocery trip has a receipt—you have to remember a lot.”
During the second visit, it was clear that respondents had more difficulty with this form than
the red or blue forms:
• In the “who got food” field, one respondent recorded her children’s names, interpreting
that field as being for who received the meal that was purchased.
• One respondent did not complete a ML and reported her meal away from home, her
daughter’s meal away from home, and a grocery trip all on one page, having retained the
idea of day at a glance from ML, but applying it to a purple page.
• One respondent reported all meals eaten at home and said she had a hard time getting
the form to work (again using one page for each day, across all meals, including those
eaten from the home food supply).
• One respondent reported food eaten from home for breakfast (from home supply)
because she thought she should, but was not entirely sure, and she reported challenges
getting the form to work to record foods eaten from home.
Other difficulties were probably not a result of the uniqueness of the purple form:
• One respondent reported the size and amount part of the form was the hardest because
“Not everything has it on it and you have to guess.” The size and amount portion of the
form was not completed consistently across booklets.
• One respondent erred in recording the prices for a pizza delivery entry, though the
respondent did save the receipt. The receipt showed that the pizza cost $10.61, and she
remembered giving a $3.00 tip, but she had recorded the total price (on the form) as
$3.00. When asked to clarify this response on the form, she just repeated these same
statements and was unable to recognize the error.
When asked when they completed the purple pages, two respondents reported completing them
at the end of the day (all at once). The others reported completing it after the shopping trip or after
the master list from each event. Respondents reported the total time spent scanning and reporting
information on the purple form to range from 10 to 25 minutes per food acquisition. The
respondent reporting 25 minutes said, “I was going back and forth to make sure I wasn’t doing
something wrong.”
7. Youth Booklet

Two of the 15 completed households had youth ages 11 to 18 (none in the CV group, one in
each of the SV1 and SV2 groups). The three households with youths ages 7 to 10 were also offered
the youth booklet, and two of the three declined. One respondent said it was easier to record his
daughter’s food in his adult diary because the training focused on that version and he “felt more
comfortable” using it. Thus youth booklets were given to three households for four youths, and one
youth completed the booklet on her own.
The youth under age 11 was a 10-year-old boy who came home from school when the
interviewer was at the home. The interviewer was able to train him directly using the current day’s
school lunch as an example. He was unable to remember how many chicken nuggets he was offered,
so he guessed “small.” He did remember he got white milk and when prompted was able to identify
it as one-percent milk. During this practice, the mother prompted him to record anything else he
got. (This boy is in the household that failed to keep their appointment for the second interview.)
One household had a teenage girl and boy. The girl took part in the study, but the boy refused
to complete a booklet or respond to his mother’s questions about food he got. The girl asked only
about how to record vegetables on a submarine sandwich. The second household with a teen had an
unusual situation because the teen girl was outside the home for an exam. The main respondent
recorded her daughter’s food each day in the adult booklet. The respondent asked her daughter to
call or text her to report all meals (eaten outside their home), but had to follow up each day with
phone calls. This respondent elected not to report the daughter’s lunch of chips because she deemed
it not worthy of being written down, as it was “just chips and not a lunch.”
8. Saving Receipts

Respondents were more likely to save receipts from FAH than FAFH. They did a good job of
taping receipts in the space provided on the data collection forms. Respondents were provided tape
for this task and the majority gave the tape back at the second interview.
We present the numbers of food purchases in Table 5, along with the number and percentage
of purchases where we expect a receipt, and the number that had a receipt. The counts in this table
exclude meals at school, day care, COA, and other places where respondents received food without
paying.
Table 5. Percentage of Food Acquisitions with a Receipt

                                        Blue Pages     Red Pages      Purple Pages
                                        (FAH)          (FAFH)         (FAH & FAFH)
Number of households                        10             10                5
Number of pages                             22             19               12
Number of pages with receipt                18              4                9
Percentage of pages with a receipt          82             21               75

9. Respondents’ Overall Assessments

During the second visit, respondents were asked to share with us their overall assessment of the
data collection task. We asked them to rate the difficulty of the task and to tell us whether they
considered using the toll-free number, or if they would use it if needed. We also asked for their
assessment of whether they would be able and willing to report the food acquisition data by
telephone and how difficult it would be to collect data for seven days. Finally, we asked for their
opinion about an adequate incentive payment for households that would collect these data over a
seven-day period.
General Impressions. When respondents were asked during the second interview to assess
how difficult it was to track the food they got, most respondents said it was “easy,” “not bad,” or
“easier than I thought it would be.” One respondent said it was difficult because of her busy
schedule but the tasks themselves were “not extremely hard.”
The greatest challenge reported by the CV group was the task of estimating sizes of portions or
amounts of food, especially for children’s meals and snacks outside the home. Similarly, the SV1
respondents were challenged by “writing details of when you went out to eat” and having to “eyeball
amounts and how many.” One SV1 respondent recognized the difficulty of writing portion sizes,
saying “some people may not be accustomed to watching portion sizes, but I am because I am a
diabetic.” Two SV1 respondents felt there was “no hard part.” SV2 respondents were challenged by
the size or amount column, the process of writing things down due to physical impairment
(arthritis), deciding between the red or blue check boxes on the ML, and remembering to use the
ML at all.
When we asked respondents if there was anything that would have helped them understand the
process better or make them feel more prepared, we received the following responses:
• More information about how to handle receipts without item descriptions
• Clarification about the difference between the foods one “gets” (or acquires) and the
food one “eats” (or consumes), as it was an important concept for the study
• Add meat to the list of examples of items that might not scan
• Place the step-by-step guide more prominently in the booklet, perhaps opposite the page
on how to use the scanner
• Provide a small booklet that respondents could take with them during the day to save
receipts from a variety of potential stops
Toll-Free Number. A toll-free number is listed at the bottom of most pages throughout the
food booklets, accompanied by the phrase, “Have Questions? Call . . .” Two of the 16 households
used this toll-free number just shortly before their second interview. One was in distress about
whether or not (and where) she should have recorded “all the food she ate”; the second was from
the respondent who completed all her forms just before the second interview and called to ask
whether she should record separate pages for herself and her infant daughter.
All but three respondents said that they would use a toll-free number if needed. One
respondent said she preferred a web site, because “sometimes they put you on hold” (with a
telephone call); one respondent said she would not call because “the materials were
self-explanatory”; and the third respondent said she thought about calling but then decided her
question was “stupid and she should know the answer,” even though the answer was not provided in
any of the booklets. She also expressed concern about who would answer the telephone if she called.
Telephone Reporting. The full-scale survey will include protocols for respondents to call the
survey center during the data collection week to report the information they record in the food
booklets. We asked cognitive test respondents, “How difficult do you think it would be to describe
what was recorded in your booklet over the telephone?”
All CV respondents said describing what was recorded in their booklets over the telephone
would be extremely easy. SV1 respondents thought it would not be difficult, but one respondent felt
it would be time consuming. Three of five SV2 respondents felt it would be easy to do, with one
saying, “Just pull out the book.” Another respondent said it would be a little difficult, especially with
regard to the “stuff on sizes of portions.” Still another respondent expressed the same concerns she
had over completing the ML: describing the event and selecting the red or blue box would be
potentially difficult and confusing.
Seven-Day Data Collection. Respondents were asked to rate on a scale from 1 to 10, with 10
being extremely difficult, how difficult it would be to gather these data for seven days instead of
two:
• CV Group: Responses ranged from 2 to 5. The higher rating was explained as the
difficulty of adding the recording tasks to the already demanding schedule of a single
mother.
• SV1 Group: Most responses were 1 or 2, with one rating of 8. The high rating was
“because of the number of days.”
• SV2 Group: Responses ranged from 2 to 5. One respondent said it would not be
difficult because as you do it, the process gets easier. Reasons for higher ratings
included buying more groceries or shopping at many stores, putting it on your calendar
and not forgetting to do it each day, and fitting the recording time in among taking care
of children and other household or work responsibilities.
Incentive Payment for a Seven-Day Collection. When respondents were asked what they
thought someone should be paid to complete these tasks for seven days instead of two, responses
ranged from $50 to $140 ($20 per day) for the CV group, from $50 to $125 for the SV1 group, and
from $50 to $250 for the SV2 group.
One respondent who answered $50 explained that “it was not that difficult really,” and another
respondent who answered $50 explained by saying, “If you ain’t got nothing else to do, you are
getting paid for doing something you already do (shopping).” The person suggesting $140 said that
the high incentive might be needed to keep people from dropping out after a few days of
completing the task.
The single, working mother who had to call the day care center to inquire about meals (and
snacks) for her two children felt between $100 and $250 was fair, but remarked how it was also
helpful for her to write these things down over the past few days, expressing concern about both the
amount she spent eating out and the quality of the foods she obtained.
D. Data Quality
We reviewed the collected data from the cognitive tests to evaluate two issues of data quality:
(1) Did respondents use the scanner correctly? and (2) What was the quality of the “descriptions” of
foods written in the bottom sections of the red, blue, and purple pages (foods that would not scan
and foods not listed on a receipt)?
1. Did Respondents Use the Scanner Correctly?
Before examining the data, we knew of three scanner problems reported by respondents:
• PLACE code not used correctly. The place code is designed to delineate items in the
data file that were acquired from different sources. Each scanned item has a time/date
stamp in the data file. However, rather than rely on the time/date stamp, we designed
the place code to ensure that we distinguished items brought into the home at the same
time from different sources. There were two issues with the PLACE code:
- Respondents forgot to scan PLACE.
- Respondents scanned PLACE even if no items would scan. (For example, one
respondent went to the meat market, scanned the place, then realized that no
items had a bar code.)

The second problem is not critical, because it can be verified by the number of
acquisitions and response to “Did you scan some, none, or all items?” The first problem
might be minimized by affixing a small sticker to the scanner with a reminder to “SCAN
PLACE.”
• Respondents scanned too many items. During their second interview, two of the
four elder respondents reported scanning foods already in their home prior to
consuming them. When interviewers probed about why they did this, one replied that
she “just thought you’d want to know about what we ate.” The forms “really didn’t let
me record that, but since this was a study about food, it seemed important to tell you
what we ate so I did.”
This problem can be difficult to resolve in the scanner data, and we did not attempt to
resolve it for the comparisons discussed below. We can resolve it by “throwing out”
items that are scanned on days when there is no corresponding “blue page,” but we will
still overcount items on days when a respondent reports a food acquisition and scans
additional items from his or her home. We may also minimize this problem by adding
consistent messaging throughout the data collection materials that “It’s not about what
you eat.”
• Failure to use the scanner immediately after shopping. One respondent admitted
she completed all of the booklets just prior to her second interview, having to take
groceries out of the cabinets, refrigerator, and freezer to do so. This practice is likely to
lead to errors with respondents scanning too few or too many items. This problem may
be mitigated by the midweek telephone calls about FAFH, which will include a few
questions to check on respondents’ scanning activities.
We reviewed the scanner data to examine three questions: (1) Did respondents scan the PLACE
code to delineate trips? (2) Did respondents scan items for each trip reported as having “all” or
“some” items scanned? (3) Does the number of scanned and unscanned (written) items match the
number of foods listed on receipts? This information is summarized in Table 6.

Table 6. Scanner Data Validity by Type of Interview

Comprehensive Version (CV)

        N Places Scanned/   Total Items Scanned/
ID #    N Blue Pages (a)    Total Items on Receipts (b)   Match?         Comments
CV-1    2/2                 28/5                                         Scanned foods from cupboard
                                                                         immediately before 2nd interview
CV-2    3/2                 9/0                                          No receipts (c)
CV-3    0/2                 25/24                         Items only     Did not scan PLACE
CV-4    1/3                 60/55                         Items only     Did not scan PLACE for 2nd & 3rd
                                                                         acquisitions
CV-5    5/7                 23/22                         Items only     Did not scan PLACE for 6th & 7th
                                                                         acquisitions
CV-6    1/1                 20/22                         Near perfect   Missed two items

Simple Version 1 (SV1)

        N Places Scanned/   Total Items Scanned/
ID #    N Blue Rows         Total Items on Receipts       Match?         Comments
SV1-1   0                   0                                            Nonrespondent
SV1-2   3/3                 14/15                         Near perfect   Missed one item
SV1-3   2/2                 33/33                         Perfect!
SV1-4   1/1                 9/9                           Perfect!
SV1-5   2/2                 95/83                         Perfect!       Items match for grocery store; no
                                                                         receipt for food bank

Simple Version 2 (SV2)

        N Places Scanned/
        N Purple Pages      Total Items Scanned/
ID #    for FAH             Total Items on Receipts       Match?         Comments
SV2-1   0/0                 3/0                                          Scanned foods consumed (elder)
SV2-2   3/3                 25/19                                        May have scanned nonfoods, but
                                                                         large difference
SV2-3   1/2                 6/5                           Items only     Did not scan PLACE for one
                                                                         acquisition
SV2-4   2/1                 6/5                           Items only     Appears to be failure to fill a
                                                                         purple page for one acquisition
SV2-5   0/1                 16/8                                         Did not scan PLACE; scanned foods
                                                                         consumed (elder)

(a) Number of places is a count of booklet pages/rows where the respondent indicated that some or
all items were scanned.
(b) Number of items on receipts excludes nonfood items and items described in booklets because
they could not be scanned.
(c) This respondent did not remember to save receipts and did not list the purchases in the detailed
description section provided in the blue pages. Instead, these details were provided on the daily list
for some items.

Only 5 of the 15 respondents could be categorized as making perfect or near-perfect use of the
scanner; four of these used the SV1 protocol that bundled red and blue pages in one booklet. An
additional 5 respondents scanned a number of items that closely matched their receipts, even
though they did not scan PLACE codes to properly delineate all food acquisitions. The slight
mismatch in the number of items might be due to scanning of nonfood items, which we cannot identify at
this time, prior to obtaining a UPC data dictionary.4 The remaining 5 respondents have large
discrepancies in the number of scanned items: they scanned foods they consumed
from the home supply, did not save receipts for our verification, possibly played with the scanner
without scanning the practice code, or, in one case, may have scanned six nonfood items that we
cannot verify.
4 Our training included a tube of toothpaste, and we told respondents that it was not necessary to scan
nonfood items but that it was okay if they did, because we can identify them.


Overall, the SV2 booklet worked least well for FAH. The CV and SV1 versions used the
red/blue color coding that may have served as a reminder to use the scanner book and scan the
place code. In addition, the two elders using the SV2 booklet had a problem with the concept of
reporting foods acquired, not foods consumed.
2. What Was the Quality of the Written “Descriptions” of Foods?

All 15 respondents used the “written description” section on at least one data collection form.
The number of food acquisitions with a written description of items ranged from 1 to 11 per
household. The number of individual items recorded ranged from 2 to 49 per household for the
two-day period, with an average of 17 items per household.5
Descriptions of food items were written legibly. Respondents were clearly engaged in the task
of communicating the food they acquired, and they provided detailed specifications. For example:
“Starbucks small iced coffee w/skim milk,” “Coffee w/creamer (used 2),” “Potatoes (tater tots),” “6
piece chicken Big Kids meal, small FF, med drink root beer.” However, the level of detail for certain
key nutritional characteristics varied among respondents. Milk was reported 23 times, and 10 of
those reports included fat content. Bread was reported 7 times, and each occurrence included the
type of bread (wheat, Italian, or garlic).
Many respondents told us that the most difficult part of the process was writing down
quantities of foods that could not be scanned or were not listed on a receipt. Some respondents told
us they guessed, and they seemed frustrated that they had to do this. The following types of
information were reported for size:
• Red pages (FAFH): Just about half of the 145 items had no information about size or
were characterized only as small, medium, or large.
• Blue pages (FAH): All but two of the 26 items had size information in ounces, pounds,
or grams. The two exceptions were a whole avocado (“medium, .5 lb?”) and a “bunch”
of bananas.
• Purple pages (FAH and FAFH): 41 of the 92 items were characterized as small, medium,
or large; 30 were characterized by ounces; and 21 had no size information.
These counts overstate the problem of missing size information because many entries with a
blank size field contained information embedded in the item description (for example, “2 slices of
bread,” “1 dozen jumbo eggs,” “2 saltines”). In addition, some items listed without size or
quantity are from chain restaurants, so the information would be readily available to us.
Nonetheless, respondents were frustrated by the task of providing size information when it was not
readily available to them. Many provided information that may not be accurate, and “small,”
“medium,” and “large” may mean different things to different people when applied to items that a
menu available to the research staff does not characterize in those terms.

5 Items consumed from the home food supply are not included in the counts.


IV. CONCLUSIONS AND RECOMMENDATIONS
Mathematica scheduled cognitive test interviews with 20 households that voluntarily contacted
us for the opportunity to take part in the study and earn $50. We completed a first training interview
with 16 households, our target number of completes. One household participated in the first
interview and did not complete the second interview. One household completed the data collection
forms immediately prior to the second interview. Thus, 14 households were actively engaged and
compliant with the data collection.
It is significant that cognitive test interviewees were actively interested in understanding the
request for information. Regardless of the quality of his or her data, the primary respondent in all
households felt confident in the work he or she had done. Most thought it was easy, and many
offered to take part in future interviews.
There are several major findings that should lead to revision of our data collection plans, and
several other findings that suggest making minor changes to data collection forms.
Training is critical. Respondents told us that the most useful part of training was the practice
they had using the scanner and completing the forms. Incorporating this practice time in the first
visit resulted in an average first visit time of one hour, including time to (a) introduce the study, (b)
obtain consent, (c) explain the protocols, and (d) complete the training with practice exercises.6 The
practice exercises were the most time-consuming part of the visit. We need to consider whether a
respondent could withstand the cognitive burden of completing “Household Interview #1”
(which is expected to last 30 minutes and was originally designed to be completed during this same
visit) while keeping his or her focus and attention through an hour’s worth of detailed content. We
may need to revisit plans for the timing of the first household interview.
The scanner was a success. Respondents were not confused or intimidated by the scanner
and found it better than “writing things out.” They did not have problems scanning items, and they
understood how and when to use the bar code book and seemed to enjoy browsing the book of
items. Respondents did not always “Scan place first,” and we plan to put this message on a label on
the scanner to improve compliance.7
There is no clear winner among the three versions of materials tested. We found no
significant differences in training time or respondent burden for the three versions tested.
Furthermore, it is not clear that the “simple” version was “easier” or provided better-quality data.
One clear finding is that the data collection materials might work best if packaged together (an
option not tested). CV respondents did not like having red and blue pages in different books; SV1
and SV2 respondents did not use the master list largely because it was not integrated with other

6 To ensure high levels of between-interviewer consistency, we will videotape a master version of a training session
so that interviewers can watch our “gold standard.” This video will include an interviewer fielding commonly asked
questions, responding to interruptions (respondent attending to young children, telephone calls, and so on), and
handling household complexities (guests in the household).

7 Items acquired from different places can be separated by the time/date stamp alone (if acquired on different
days) or by comparing the items with receipts. However, this increases data processing costs.
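The fallback noted in footnote 7, separating acquisitions by date stamp when no PLACE code was scanned, could be sketched as follows (a minimal illustration, not project code; same-day trips to different places would still require receipt comparison):

```python
# Illustrative sketch (not project code) of the fallback in footnote 7:
# when PLACE codes are missing, item scans can be split into trips by
# their date stamps alone.
from datetime import datetime
from itertools import groupby

def trips_by_date(scan_times):
    """Group scan timestamps into trips, one trip per calendar day.
    Items acquired on the same day at different places cannot be
    separated this way and still require comparison with receipts."""
    ordered = sorted(scan_times)
    return [list(g) for _, g in groupby(ordered, key=lambda t: t.date())]

# Hypothetical scan log: two items on 5/10 and one on 5/12
scans = [datetime(2010, 5, 10, 14, 5), datetime(2010, 5, 10, 14, 6),
         datetime(2010, 5, 12, 9, 30)]
```

Applied to the hypothetical log above, this yields two trips, which is why the footnote notes that this recovery step increases data processing costs: it is an extra pass over the scan log plus manual receipt checks for same-day cases.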


materials. In addition, the main respondents tended to report information for all household
members and, in this case, they wanted to use only one book.
Overall, the red/blue color coding provided the intended cue about whether or not to scan,
while the purple forms appear to be associated with lower-quality data because respondents did not
have the reminder that blue is for food brought home and red is for “eat out and take-out.” Two
elders using purple forms reported food consumption, not acquisition. One user of purple forms
tried to write all acquisitions for a day on one page (rather than one place per page). The purple
forms also lacked clear directions about what to write under “Detailed descriptions” because the
form had to fit both types of acquisitions. (The purple forms are also associated with higher data
processing costs.)
The daily list and master list both led to some confusion. The daily list (DL) was used correctly by half of
the CV respondents, while the master list (ML) was used correctly by one-third of the SV1 and SV2 respondents. At
least one respondent wanted to write details on the DL, and this can be corrected with better
instructions. The problem with the ML was one of compliance, as respondents forgot about it
entirely. This suggests that some type of list with better instructions may work if incorporated into a
booklet with the other forms.
“Food acquisition” is a tough concept, but only for some. Two elderly respondents felt
compelled to report food consumption in addition to acquisitions. This finding is confounded with
use of the purple pages, so it is too soon to know if this poses a large problem. However, we can
add additional messages to our data collection forms to remind respondents that “It’s not about
what you eat.”
Teen participation might be a problem. One of three children over age 10 refused to
participate, one completed a youth booklet, and one provided her mother with information to
report. The cognitive tests did not provide an individual incentive, so it is not possible to know if
teens will respond to an incentive.8 ERS has suggested alternate data collection methods for teens
(picture or text messaging). Teens may also be responsive to an information web site with
instructions. It may also be useful to redesign the youth booklet to be more teen friendly by
downsizing it to something that is easier to carry around, requiring less information especially for
foods that are not prepackaged, and changing the layout to more closely resemble the adult booklet
so that parents can better train teens to use this booklet.
Adults can provide proxy reports for kids in school or day care settings, but lack full
information. Several adults said it was difficult to accurately record the meals and snacks their
children got throughout their day, particularly in day care settings. Some reported they “guessed” at
portion sizes for school meals. Although some day care providers give parents a summary sheet for
the day, it often did not include drinks and snacks, and there was no detail on portion sizes. It
appeared that the job of quizzing the children was made substantially more difficult by the
requirement to report portion sizes. Therefore, it is possible that the response rate for reporting
acquisitions will be improved if the burden is reduced by eliminating portion size reporting and
using imputed portion sizes.

8 As outlined in the OMB package, the field test will provide a fixed incentive for the main respondent plus an
additional incentive for each household member who completes a food booklet.


Training and instruments must address a tendency toward satisficing. Two respondents
told us that they did not include all food acquisitions because some things didn’t meet their criteria
for “important” acquisitions. One mother thought it was okay to “report the big things” but not all
snacks received by her child in day care. Another mother did not report the chips that her daughter
had for lunch because “that’s not a meal.”
Minor revisions. Overall, respondents found the instructions to be clear and straightforward.
They completed the data collection forms without finding them burdensome, with the exception of
portion size information. Respondents suggested that the following areas are in need of
improvement:
• Information about how to scan fruits and vegetables (for example, bunch of bananas)
• Information about how to scan or report multipacks
• More examples of foods that might not scan (for example, meat market items)
• Information about the scanner (Do I need to turn it on/off? Do I need to recharge the
battery? What if I see the line but do not hear a beep? What do I do if an item will not
scan?)
• What do I do if I forgot to scan “place” before I scanned my groceries for that trip?
• How to report food (groceries) someone brought to my house for free
• How to report packaged snacks that are eaten out but could be brought home and
scanned
It is also worth considering whether we might obtain higher-quality data by asking for portion
size “if available from package or menu” and discouraging guesswork. In this case, we can
distinguish good-faith reports based on package information from missing data. Our imputations of
missing data may be more accurate than respondents’ guesses.
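As a sketch of how such imputation might work under this approach (the data layout and the per-item median rule below are our assumptions for illustration, not a project specification):

```python
# Illustrative sketch (assumed data layout): asking for portion size only
# "if available from package or menu" lets good-faith package reports be
# distinguished from missing values, which can then be imputed -- here by
# a simple per-item median of the sizes that were reported.
from statistics import median

def impute_sizes(records):
    """records: list of (item_name, size_or_None). Returns the size list
    with missing values filled by the median reported size for that item."""
    reported = {}
    for name, size in records:
        if size is not None:
            reported.setdefault(name, []).append(size)
    out = []
    for name, size in records:
        if size is not None:
            out.append(size)                   # good-faith package report
        elif name in reported:
            out.append(median(reported[name]))  # imputed from other reports
        else:
            out.append(None)                    # no basis for imputation
    return out
```

A production imputation would draw on external reference data (for example, standard serving sizes) rather than only within-survey reports, but the key distinction, reported versus imputed, is the same.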
1. Generalizability of These Results

The cognitive tests were conducted over a two-day period, and the second debriefing interview
occurred at a time when respondents to the full survey would have their first telephone check-in to
report FAFH. Thus several data quality issues would be caught and corrected during the telephone
interview. An interviewer will ask probing questions tailored to the respondents’ specific questions
or circumstances and, accordingly, could do the following:
• Help respondents decide between red and blue classifications for places listed on their ML
or DL (or an alternative form in future fieldings).
• Provide corrective guidance, where needed, for those who report foods eaten from their
home food supply.
• Clarify what to record in the “place” field on the blue/purple forms when a
nonhousehold member bought or brought food to the household.
• Facilitate an account of all food activity for all household members. This minimizes the
opportunity to “forget” to account for meals and snacks outside the household that may
not have been gathered by the respondent when completing or compiling all the
booklets.
• Help with determining portion size by walking respondents through techniques, such as
helping them picture the food they bought and asking them if it is bigger than a fist.
(However, this will not solve the problem of proxy guesswork.)
Several data quality issues observed in the cognitive tests cannot be resolved by telephone
followup:
• Respondents will still struggle in estimating weights or portion sizes of items. Some may
still guess at these amounts, as they admitted to doing in the cognitive interviews.
• Respondents will not always be able to account for food activity for all household
members (unable to check in, household members refusing to take part, not having their
booklets on hand).
• Respondents may be less inclined to keep detailed information in their booklets because
they anticipate being able to “remember it all” when prompted for these data over the
telephone. Should they use their booklets less frequently, they may also be less inclined
to remember to save their receipts.
2. Implications for the Field Test

Original plans for the field test included a test of two survey protocols: (1) paper food booklets
plus telephone interviews about FAFH and (2) paper food booklets without telephone interviews.
During development of the survey instruments, Mathematica and ERS agreed that telephone
interviews were critical to obtaining high data quality because there are many special circumstances
that cannot be included in the food booklet instructions without making the process daunting for all
respondents. In addition, the TWG expressed concern that the protocol for collecting food data was
too complex, especially because of the multiple instructions and multiple modes of collection
(scanner, receipts, and forms). The TWG suggested that we test a “comprehensive” and “simple”
protocol in the field test.
The three versions of instruments used for cognitive tests were not substantially different. In
practice, we did not test a simple version. Fortunately, respondents—with the proper training—were
not intimidated by any of the data collection protocols. But respondents also did not comply
perfectly with any of the protocols, and the cognitive tests offered areas for improvement and
testing.
Perhaps the largest problem revealed by the cognitive tests is that some household members
will not participate. One teen and one adult refused, and others participated by proxy. Thus we
recommend that the field test focus on testing methods to encourage full participation within the
household. This could concentrate on the role of the main respondent as gatekeeper and record
keeper, or on reducing the specific burdens that might discourage
participation. The following are two examples of possible alternate tests:
1. One group gets a single binder with all data collection forms to be completed by the
main respondent, and the second group gets individual food booklets for members of
the household.

2. One group is asked to report portion sizes, and the second group is asked to report
portion sizes “if available on the package or menu.” (Both groups receive a revised youth
booklet with less reporting burden and layout resembling the adult booklet.)
It would also be useful to test some type of text messaging approach with teens. ERS has
provided us with information about the use of text messaging to locate respondents. We could use
text messaging in a similar way to prompt teens to report their food acquisitions to their parent
(reminding them of the incentive they will receive), or to prompt them to complete their own
booklet (coupled with a method for them to return their booklet directly to us). Currently, we are
not aware of other surveys using text messaging to collect data, so this method may need substantial
testing.
The first test provides information about the relative data quality with reporting by proxy. The
second provides information about the trade-off between burden and data quality. An additional
focus, though not currently in scope, is to test teen responses to alternative data collection methods
such as text messages or reporting by web site. We look forward to discussing these potential next
steps with ERS.


APPENDIX A
FOODAPS COGNITIVE TEST CASES

Table A.1. FoodAPS Cognitive Test Cases

CASE ID   FI   Appt 1 Date   Appt 2 Date   Recruitment Source   Education Level   Video   Audio

CV: Comprehensive instruments
cc1       HM   Wed 5/12      Sat 5/15                           HS grad           -       -
cc2       HM   Tue 5/4       Fri 5/7       DTA office           HS grad           √       -
cc3       HM   Mon 5/3       Thu 5/6       Council on Aging     College grad      -       -
cc4       LK   Mon 5/3       Thu 5/6       DTA office           College grad      √       √
cc5       PS   Tue 5/4       Fri 5/7       DTA office           College grad      √       √
cc6       PS   Tue 5/4       Fri 5/7       DTA office           College grad      -       √

SV1: Simple instruments, red & blue
ss1-1     PS   Tue 5/18      Fri 5/21      DTA office           HS grad           √       -
ss1-2     HM   Mon 5/10      Thu 5/13      DTA office           College grad      √       √
ss1-3     HM   Wed 5/12      Sat 5/15      DTA office           HS grad           √       √
ss1-4     PS   Tue 5/11      Fri 5/14      COA Salem            HS grad           -       -
ss1-5     PS   Mon 5/10      Thu 5/13      DTA office           College grad      √       √

SV2: Simple instruments, purple
ss2-1     HM   Tue 5/11      Fri 5/14      Council on Aging     < High School     -       -
ss2-2     HM   Mon 5/10      Thu 5/13      DTA office           HS grad           √       √
ss2-3     HM   Tue 5/11      Thu 5/14      WIC                  HS grad           -       -
ss2-4     PS   Mon 5/10      Thu 5/13      DTA office           HS grad           √       √
ss2-5     PS   Tue 5/18      Fri 5/21      Council on Aging     HS grad           √       √

Field Interviewers: HM = Holly Matulewicz, PS = Premini Sabaratnam, LK = Laura Kalb
Shaded rows denote cognitive test interview recording delivered to ERS.

