APPENDIX P
PRE-TEST RESULTS AND RECOMMENDATIONS FOR ADJUSTMENTS

Field Test for the National Household Food Acquisition and Purchase Survey
OMB: 0536-0067

MEMORANDUM

TO:      Mark Denbaly
FROM:    Holly Matulewicz, Nicholas Redel, Laura Kalb
SUBJECT: FoodAPS Pretest
DATE:    7/30/2010

955 Massachusetts Avenue, Suite 801
Cambridge, MA 02139
Telephone: (617) 491-7900
Fax: (617) 491-8044
www.mathematica-mpr.com

This memo summarizes the results of the pretest for the FoodAPS Field Test. The pretest
was conducted in June and July of 2010. The memo is divided into four sections: (1) Pretest
Background and Respondents’ Characteristics, (2) Pretest Design, (3) Pretest Results, and (4)
Conclusions and Recommendations. This memo is accompanied by several appendices, which
provide greater detail on the responding households’ characteristics (Appendix A), the timing for
each survey instrument (Appendix B), the proposed revisions to each survey instrument
(Appendix C), and the design and implementation of the web interface (Appendices D through
F) for food acquisition reporting.
Section 1. Pretest Background and Respondent Characteristics
This pretest followed the cognitive interviews that took place in May 2010. It served as a dry
run for many of the processes that will be used during the 400-case field test in 2011. The pretest
goals were:
1. Test the full set of procedures for the data collection process to assess participants’
understanding of and cooperation with the study procedures and to generate burden
estimates for the OMB materials.
2. Evaluate the clarity and flow of the survey instruments and generate
recommendations for revisions to the survey instruments and other reporting
materials.
3. Gather respondent feedback about the data collection process and the challenges they
faced, as well as qualitative observations on any differences in use of the study
documents between the simple and comprehensive versions.
Project staff completed a total of six interviews between June 21 and July 19, 2010. The
original field period was extended to include two Supplemental Nutrition Assistance Program
(SNAP) households during the week that they receive their SNAP benefit issuance during the
data collection week.¹

¹ Ultimately, three of the four SNAP households had the field period within their SNAP issuance dates.

Participating households were drawn from the existing list of households
that expressed interest in participating in cognitive interviews in May 2010. Interviewers
contacted potential participants by phone to discuss the pretest protocol and incentives for
participation. All of the households were located in either the North Shore or Metro-West areas
of Boston, Massachusetts. This convenience sample was composed of:
 Two elderly-adult households without children (one or more individuals age 60+)
 One household with school-age preteen children
 Two households with at least one teenager (age 14+)
 One non-elderly adult household without children (members under age 65)
Half of the households completed each version of the data collection instruments, three
receiving the comprehensive version (CV) and three receiving the simple version (SV). Of the
six primary respondents, two identified themselves as disabled; five of six were female; and four
identified themselves as White, with one also identifying as Hispanic. Households ranged in size
from one to five persons, with an average of three persons. In total, across the six participating
households there were 16 possible individual participants. All 16 agreed to participate. A
detailed summary of the participating households is found in Appendix A.
During recruitment we encountered three household-level refusals. Two refused to take part
on the day of the initial visit (one for health reasons as a result of broken ribs, and one who did
not answer the door or return telephone calls). The third case completed both a screener and the
initial visit as scheduled. However, as the appointment progressed, this respondent grew
increasingly concerned about the purpose of the study, who would see her data, and why USDA
wanted to know about her household’s food acquisitions. The respondent visibly changed her
demeanor during the first household interview when the interviewer asked about “who lives or
stays in the household,” and she was alarmed by the citizenship question. The interviewer kept
her engaged and fielded her questions, and by the end of the visit she agreed to “give it a try.”
However, this household did not complete the scheduled telephone interviews. There was a mild
language barrier that may have contributed to the respondent’s discomfort. We used a Spanish-speaking staff person to contact the respondent to check in, answer her questions, and address
concerns. These contacts appeared to put the respondent at ease, though the respondent
continued to question the purpose of the study. By Day 3, when it was clear she was not
participating, the Spanish-speaking staff member contacted her again and the respondent
conceded she decided not to take part. The field interviewer returned to the household, retrieved
the study materials, and provided compensation for time spent on the initial visit. Each of the
three refusals was immediately replaced by an alternate household in order to maintain six
households in the pretest.

Section 2. Pretest Design
The pretest included all of the instruments and protocols planned for the FoodAPS Field Test, except that instruments planned for computer-assisted personal interview (CAPI) and computer-assisted telephone interview (CATI) administration were administered on paper. The survey
protocols include a seven-day data collection from households that are screened eligible for the
study. Data collection includes household interviews and reporting of food acquisitions. The
FoodAPS Field Test will test the efficacy of two alternate instrument designs for collecting food
data, both of which were included in the pretest. The two versions of the food instruments are:

 Comprehensive version (CV) - Multiple food booklets are provided to the household so that individual household members may report their own acquisitions of food away from home (FAFH). A separate book is provided for reporting household food at home (FAH) acquisitions.

 Simple version (SV) - One binder is provided for reporting FAH and FAFH acquisitions of all household members.
The CV featured the same basic design as that used during the cognitive test. This included:
(1) the Scanner Instructions, Blue Pages, and Book of Barcodes binder that included Blue Pages
and a section of foods with barcodes to scan to record foods and drinks brought into the home for
home preparation or consumption (that is, food at home, or FAH); (2) the Adult Food Booklets,
which included seven Daily Lists, and Red Pages to report food away from home (FAFH) for
each participating adult in the household; and (3) the Youth Food Booklets for individuals ages
11-18 willing to track their own FAFH acquisitions. Based on feedback from the cognitive
interviews, minor revisions were made to the formatting and text of the CV materials, and the design and layout of the youth booklet underwent major revisions.² The youth booklet was resized to a small format (5.5 x 8.5 inches), pictures of teens were added to the cover, and the layout of the inside pages was revised to mirror the Red Pages in the adult booklet, but simplified.
The SV featured one binder for the household, which included all components of the
Scanner Instructions, Blue Pages, Book of Barcodes, and the Adult Food Booklets. The SV
binder included seven Daily List pages, Red Pages for FAFH acquisitions, Blue Pages for FAH
acquisitions, and a Questions & Answers section. Each section of the binder was separated by a
labeled tab.³

² See National Household Food Acquisition and Purchase Survey: Report on Cognitive Tests, delivered to ERS on June 7, 2010.

³ In the Questions & Answers section of the simple version, questions from all sections of the binder are combined in one place, color coded to align with the section of the binder from which the question may arise. For example, Daily List questions are in green, Blue Pages questions are in blue, and so on.

Since this was a household binder, one main respondent was responsible for
household reporting. This could be accomplished either by asking other household members for
their information each day, or by asking household members to record their own acquisitions in
the master household binder. The SV used for the pretest was informed by the cognitive tests, which included a Master List (a modified version of the Daily List that appears in the Adult Food Booklets). Because the Master List was largely misused or misunderstood by respondents, it was dropped from the SV design for the pretest.
Both SV and CV materials were revised following cognitive tests to include a formal
practice page in each section (Daily List, a Red Page, and a Blue Page), which respondents used
during the formal training provided by field interviewers.
All households received a calendar magnet which depicted the seven-day data collection
week and indicated the days when they needed to complete telephone interviews. They also
received a Meals and Snacks Form, on which to record all meals and snacks consumed by each
household member during the seven-day period. The Meals and Snacks Form could be affixed to
the refrigerator with the calendar magnet.
Apart from the design of the food instruments, the two versions featured the same study
requirements, which included:
1. Completing the Meals and Snacks Form.
2. Completing the screener and Household Interviews 1, 2, and 3.
3. Recording food acquisitions for all household members (in the individual booklets
[CV] or single binder [SV]).
4. Scanning groceries brought into the home, as well as a place code to identify the type
of store or place where they were purchased or acquired.
5. Reporting FAFH acquisitions by telephone three times during the survey week.
6. Completing two debriefing interviews. The first was conducted by telephone and
asked about respondents’ experiences with the telephone reporting process and the
questions in Household Interview 2 (conducted by the telephone interviewer). The
second was conducted in-person on Day 8 and gathered feedback on questions in
Household Interviews 1 and 3, as well as general feedback on recording food data
during the week.
Regardless of the version assigned, each household was offered a base honorarium of $50,
plus $20 for each additional household member who participated. Participating members were
defined as those who provided information on their food acquisitions. Households also received
a $25 bonus if they adhered to the telephone reporting schedule (discussed in Section 3.9).
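As a quick illustration of the incentive structure described above, the payment for a given household can be computed as follows (a minimal sketch; the function name and structure are ours, not part of the study protocol):

```python
def household_honorarium(n_participants: int, met_phone_schedule: bool) -> int:
    """Illustrative calculation of the pretest incentive structure:
    a $50 base honorarium, plus $20 for each additional participating
    household member beyond the primary respondent, plus a $25 bonus
    when the household adhered to the telephone reporting schedule.
    """
    base = 50
    additional = 20 * max(n_participants - 1, 0)
    bonus = 25 if met_phone_schedule else 0
    return base + additional + bonus

# A five-person household, all participating, that kept the phone schedule:
print(household_honorarium(5, True))  # 50 + 4*20 + 25 = 155
```

For example, a five-person household in which every member participated and which kept the telephone reporting schedule would receive $155.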

A toll-free telephone number was provided on all survey materials so that respondents (Rs)
could obtain support during the data collection week. The study week followed the same
schedule for all households, as described in Table 1 and in Sections 2.1-2.4 below.
Table 1. Summary of Household Activity During Study Week

Day 0: In-person visit: screener, HH interview 1, training. Staff: field interviewer (FI).
Day 1: Data collection week begins. Staff: telephone interviewer (TI) fields questions, as needed.
Day 2: R reports FAFH by phone; check on process. Staff: TI using web database.
Day 3: HH interview 2 by phone. Staff: TI.
Day 4: No scheduled activity. Staff: TI fields questions, as needed.
Day 5: R reports FAFH by phone. Staff: TI using web database.
Day 6: No scheduled activity. Staff: TI fields questions, if needed.
Day 7: R reports FAFH by phone. Staff: TI using web database.
Day 8: In-person visit: collect study materials, HH interview 3, distribute incentive. Staff: FI.
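The study-week schedule in Table 1 can also be represented as a simple day-to-activity mapping (an illustrative sketch; the data structure and names are ours):

```python
# Study-week schedule summarized from Table 1. Days 4 and 6 have no
# scheduled household activity and are omitted from the mapping.
STUDY_WEEK = {
    0: "In-person: screener, HH interview 1, training (FI)",
    1: "Data collection week begins (TI fields questions as needed)",
    2: "R reports FAFH by phone; check on process (TI, web database)",
    3: "HH interview 2 by phone (TI)",
    5: "R reports FAFH by phone (TI, web database)",
    7: "R reports FAFH by phone (TI, web database)",
    8: "In-person: collect materials, HH interview 3, distribute incentive (FI)",
}

# Days on which the respondent phones in food-away-from-home reports:
fafh_call_days = sorted(day for day, activity in STUDY_WEEK.items()
                        if "reports FAFH" in activity)
print(fafh_call_days)  # [2, 5, 7]
```

This makes it easy to confirm, for instance, that Days 2, 5, and 7 are the scheduled telephone reporting days.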

2.1. Household Visit on Day 0. Careful consideration was given to the sequence of
activities for the initial household visit. The cognitive test showed that respondent training for
food acquisition reporting took 60 minutes, on average. To balance the cognitive burden of
listening to instructions with more active tasks, such as answering survey questions or
completing mock forms, the sequence of events for the initial visit (hereafter, the Day 0 visit) in
the pretest was as follows:
1. Contact respondent by telephone on the day of interview to confirm time of visit.
2. Complete consent form and provide a copy to the respondent.
3. Provide overview of the activities for that visit, overview of the study as a whole,
and copies of the study brochure and advance letter to the respondent.
4. Complete Household Screener.
5. Complete Household Interview 1.
6. Fill out the calendar magnet with day/dates for the study week.
7. Review and personalize the Meals and Snacks Form.
8. Conduct training on how to complete the Daily List pages, Red Pages, Youth
Booklet (if applicable), and Blue Pages, and carry out the scanning process. This
included completion of a mock Red Page from the respondent’s recent history of
foods acquired and a mock Blue Page (using props provided by the interviewer to
scan and record).

9. Recap key points from food acquisition reporting, answer any final questions, and set
expectations for the next step in the study process (telephone reporting).
The placement of the Meals and Snacks Form in this series of events was challenging,
because the purpose of the form is for respondents to record all the meals and snacks consumed
by each household member, whereas all other tasks focus on foods acquired. By placing it at the
start of the training, our goal was to keep participants’ focus for the rest of the training on foods
acquired.
2.2. Reporting of Food Acquisitions. Regardless of the version assigned, each household
was asked to record the food acquisitions of all household members identified in Household
Interview 1 during the seven-day study week. For CV participants, each household member over
age 11 received his or her own food booklet, and respondents were identified by name on the
front cover. Foods for those under age 11 and for household members who elected not to keep
their own books were recorded in the primary respondent’s booklet. For SV participants, all
household members logged their food acquisitions in one binder. In both versions, participants
were to complete a Daily List page for each of the seven days. Households were trained to see
this sheet as a “snapshot” of their day. This sheet included instructions guiding participants to
complete either Red or Blue Pages for FAFH and FAH, respectively.
2.3. Telephone Reporting on Days 2, 5, 7. Mathematica project staff manned a toll-free
call-in line from 8:00 a.m. to 9:00 p.m. each day. Respondents could call the toll-free number
with questions about the study at any time during those hours. On Days 2, 5, and 7, participants
were instructed to call the toll-free line between the hours of 6:00 p.m. and 9:00 p.m., when
meals and shopping activities were completed for the day, to report what they recorded in the
booklets. These instructions were covered in the training, placed as a reminder on the back of
each adult booklet or binder, and printed on the calendar magnet.
The structure of the telephone reporting calls varied slightly depending on the version
assigned (CV or SV), since SV households used only one booklet. During the calls, each booklet
was reviewed independently and the primary respondent provided details from the Daily List
(FAH and FAFH) and the Red Pages for FAFH acquisitions. If the respondent listed at least one
FAH acquisition in Section B, the interviewer asked if he or she was able to scan the items
purchased from this place, if he or she had any trouble scanning, and if the receipt was attached
to the Blue Page. The interviewer entered all of the information gathered into the web reporting
interface, which closely resembled the layout of the data collection forms. Screenshots of the
web interface, as well as the script for telephone reporting calls, and a Step-by-Step Guide for
Web-Based Reporting, are provided in Appendices D, E, and F of this memo. Households that
did not complete the Day 7 call were flagged in the system and the field interviewer was notified
of the need to review the booklets in greater detail during the Day 8 visit; this included asking
clarifying questions, where needed, to improve the quality of the data. The field interviewer
called the telephone interviewer to report these acquisitions after leaving the household.

2.4. Household Interview 2. A telephone interviewer called respondents on Day 3 to
complete Household Interview 2, which focused on household finances. This interview was
recorded on paper for the pretest but will be programmed into CATI for the field test. Telephone
interviewers used this opportunity to check on the completeness of the food record and record
any data in the web database that the respondents had previously omitted.
2.5. Household Visit on Day 8. In the final visit, the field interviewer returned to the
household at a time specified by the respondent. During this visit, the field interviewer
completed Household Interview 3; completed a debriefing interview, which sought feedback on
the food acquisition process and on specific items in the in-person survey instruments; briefly
reviewed the household food booklets/binder; collected the data collection forms and study
materials; and provided the honorarium for participation. The same field interviewer who
completed the Day 0 visit was assigned to the Day 8 visit.
Section 3. Results
The results of the six-case pretest are reported in Sections 3.1-3.8 below. These include
burden estimates for each component of the study.
3.1 Respondent Trainings. The training scripts used for cognitive tests were revised for the
pretest to better address key concepts respondents struggled with in the cognitive test, as well as
to reflect the changes in the study protocol. These changes included:

 Rewording the introductory language to emphasize the purpose of the study and how it is not about “what you eat” but rather about “what you get” or acquire.
 Adding language to explain the tiered incentive structure.
 Reducing the number of examples of food acquisitions, so as not to overwhelm participants.
 Removing text from the SV script that referenced the Master List, which was no longer being used.
 Providing a sequence of events to follow for the Day 0 visit (consent, screener timing, Household Interview 1, training, and recap).
 Including training for the Meals and Snacks Form and an explanation of the calendar magnet, which were not used during the cognitive test.

Overall, trainings went very smoothly and averaged 70 minutes per household.⁴ Mirroring
results in the cognitive interviews, the mock entries in various pages in the booklets continued to
work well and helped respondents “own” their new knowledge. These practice pages also served
as points of reference when the primary respondent trained other household members on the
study. Interviewers wove examples from the household members’ daily lives into the training,
making the training more salient; examples included how to record food from a neighbor’s
garden or food from a pantry at church on Blue Pages and in Section B of the Daily List.
Likewise, interviewers used examples such as meals at locally based restaurants or meals at a
friend’s home for Section A on the Daily List and Red Pages. Immediately following the
explanation of the Red Pages in the adult booklet, interviewers walked through youth booklets
(for applicable households), showing how much of the same information is captured in a similar
format. During her training, one respondent commented the youth booklets were “so cute” and
“easy” to follow.
3.2. Household Screener and Household Interview 1. Administration of the screener
flowed smoothly overall. Many of the respondents found it somewhat humorous that we were
explaining ourselves or introducing ourselves again (formally through the script), but understood
we had to read all items to capture accurate timings for administration. The length of
administration ranged from 5 to 10 minutes, with an average of 8 minutes across the six cases.
Issues identified in the screener, which will be addressed in a subsequent revision, include:
 Providing a place to document the name of the person to whom we are speaking when
we contact the household. This will be used for verification as well as potential
refusal conversion efforts.
 Item Q3 (whether there are other housing units or living quarters at this address)
confused several respondents, particularly those in multi-unit dwellings. We will add
clarification and cover this issue in detail during training.
 One respondent struggled with the questions about meal planning/food shopping
because she employs someone for these services. Although the respondent would be
the person interviewed for the study (household size of one), the employee does the
shopping and meal preparation, which caused confusion. We will cover this issue in
the interviewer training.
The length of administration for Household Interview 1 ranged from 14 to 26 minutes, with
an average of 19 minutes across the six households. Respondents had mixed reactions to this
interview, depending on their household’s composition and size and their citizenship status.
⁴ This average time is 10 minutes longer than the cognitive test time because training included the Meals and Snacks Form, calendar magnet, and instructions for telephone calls.
During administration, we identified several areas to address with revisions to wording or skip
patterns. These included:
 We will add introductory text for the household roster series, to allay potential
concerns the household is being investigated or that we do not believe responses
provided in the screener.
 Items A9 and A9a, which ask whether any household members are noncitizens and
request the names of the noncitizens, proved to be extremely sensitive for one
household (the respondent later withdrew from the study). Given the current political
climate, this item could be sensitive in certain parts of the country and in some
immigrant communities. Therefore, we propose revising the item (eliminating
household member names) and moving the item to Household Interview 2.
 For item A12c about food obtained at work, we will revise response options to
minimize confusion for those who work in the food-service industry and eat food
from work while on duty.
 For item B5 about whether a child was attending school, respondents tended to
conceptualize whether or not a child was in school “in general.” We will add an
interviewer probe to clarify that children on school vacation or school break are not
attending school.
 We will modify Item B12, which identifies any household members currently
pregnant, with age- and gender-appropriate skip patterns.
 We will modify Item C13, which asks how many times the household eats dinner out
on an average week. Respondents asked if we meant “eating out together.” We’d like
to capture the number of FAFH dinners and will modify this question to capture
individual-level as well as household-level information.
During both the cognitive test and the pretest, we found many of the participants self-identified as being persons with a disability. These included individuals with physical
impairments, chronic health conditions, mental illnesses, substance abuse, and cognitive
impairments. In some cases, the respondent’s disability made it difficult to complete data
collection tasks. For example, one person with a cognitive impairment had trouble recalling
details of food acquisitions during the telephone interviews; one physically disabled respondent
had a support person who did her grocery shopping and other errands, and the wording of survey
questions was awkward because the respondent did not acquire food herself. Survey protocols
may need to be adapted for persons with disabilities in the following ways:
 Persons who have support people to help with daily tasks might be asked to invite their
support person to attend the study training.

 Persons with physical or visual impairments might have difficulty writing on the data
collection forms, and may therefore be contacted by telephone on a daily basis to
provide information verbally.
 Persons with cognitive impairments might benefit from more frequent check-ins and could be contacted more than three times per week to obtain information before it is lost to recall impairment.
We propose adding three questions to Household Interview 1, which would identify persons
with disabilities that may impair their ability to complete the data collection activities.
Information about disability status will be transmitted to telephone interviewers so that they can
provide additional support and in-depth prompting about progress during the telephone
interviews. These questions, below, are adapted from a 6-question module used in the Current
Population Survey and the American Community Survey.
We’d like to know if you might need extra help with this survey because of physical,
mental, or emotional conditions that cause difficulty with daily activities.
1. Because of a disability, do you have difficulty using the telephone?
2. Because of a disability, do you have difficulty writing with a pen or pencil?
3. Because of a disability, do you have serious difficulty concentrating or
remembering?
In the debriefing, respondents were asked about aspects of Household Interview 1. We
sought feedback on how challenging it was to identify all the people who live or stay in the
household. The majority of respondents found these questions very easy to answer, saying they
“know who lives there.” However, one respondent (household size of one) has a family member
who does not live or stay in the house, but who comes over each day and takes food. From her
perspective, he is a household member, in that he impacts her budget and food supply
substantially. From the way the questions were worded, she was not able to list him as a
household member, which she found confusing. We also asked for feedback on the questions
about the place(s) respondents do most of their food shopping. Respondents said these questions
were also very easy to answer because they are “creatures of habit” or they have “routines” based
on their favorite or most convenient places to shop. Item-level descriptions of all the proposed
modifications to the Screener and Household Interview 1 are found in Appendix C.
3.4. Household Interview 2. This instrument was administered over the telephone on Day
3. It collects information about income, assets, and non-food expenditures, which are
components of household resources and non-food demands on those resources. The length of
administration ranged from 16 to 44 minutes, with an average of 26 minutes across the six
households. Overall, this interview went smoothly; no respondents were reluctant or unwilling
to answer the questions and all indicated it was easy to provide specific responses. During the

pretest administration, several items were identified for further modification. A comprehensive
list of proposed changes to the instrument is provided in Appendix C. These items include:
 Items A6-A10 (cell phone, landline, cable, and internet services) were too specific to
capture bundles and alternate plans. We will revise skip logic to correct for errors and
add two questions on internet expenses to allow for proper capture of bundled
services.
 In Item C1 (unearned income), some respondents were able to provide a total for
unearned income from Social Security Retirement Benefits (SSA) and Social Security
Disability Benefits (SSDI) but were unable to break the total down into its
components. The current design, however, requires respondents to list the total
amount of each income component separately. We will revise these questions to
allow for combined amounts.
 Section B establishes earned income for each household member. Although the
respondent provides a list of household members during the Day 0 visit, reconfirming
this list at the beginning of Section B will promote reliability, increase clarity, and
allow for a more fluid skip to Item B5. We will revise the introduction to include an
interviewer instruction to confirm the household members working for pay.
 Item A15 (educational expenses) does not apply to many elderly households. We will
insert a screening question before A15 to determine whether anyone in the household
incurred educational expenses in the past 12 months. Interviewers will “check all that
apply” for Item A15, and A15b will loop for each item checked in A15.
 Item A17 (number of automobiles possessed by the household) is followed by
questions relating to vehicular expenses. The current skip pattern does not account for
expenses associated with frequently used vehicles that are not owned or leased (e.g.
borrowed or rented), which may account for large non-food expenditures. We will
resolve this issue by adding an option to indicate the frequent use of another vehicle
outside the household’s possession.
 We created Item A24 in the non-food expenditures section, which asks respondents
about public transportation expenses incurred by the household.
One respondent took a very long time to consider exact dollar amounts for many of the
income and expenditure categories included in Household Interview 2. We will revise the
instrument so that interviewers may remind respondents, when necessary, that best guesses are
acceptable during Household Interview 2, so that they do not feel the need to make precise
calculations or look for relevant financial documentation.
During the telephone debriefing, respondents were asked if they would have preferred to
complete this interview in person during the Day 0 or Day 8 meeting, or if they preferred

completing it by telephone. All respondents indicated that the telephone reporting was either
acceptable or preferable. One suggested, “I might have felt a little embarrassed to give that
information face-to-face” and another said, “Maybe it would have been more comfortable to do
in person, but the convenience of the phone is better.”
3.5. Household Interview 3. This instrument was administered in person on Day 8 and
ranged from 12 to 30 minutes, with an average of 18 minutes. Overall, the administration flowed
smoothly. Because this pretest included households ranging in size from one to five members,
we identified several items in this interview that needed skip patterns or text substitutions to
improve the flow for single-person households. In the debriefing, we asked how difficult it was
to remember which days guests came to the house for meals or snacks. The majority of
respondents found these questions very easy to answer, either because they had few guests or
because their households follow a set routine. One respondent commented that households who
have a lot of kids coming and going each day would have a “hard time” answering those
questions. When asked for their feedback on the height and weight questions, most had already
disclosed during the interview whether they were making their "best guess." One respondent felt
these items were extremely sensitive and said she would not tell her boyfriend that she had disclosed
this information about him in the interview. For the field test, we will develop a protocol
whereby respondents will be able to enter information on weight directly into the interviewer's
computer without having to say the number out loud, if they prefer to do so.
When asked if there were any other items that were particularly sensitive or made them feel
uncomfortable, none of the respondents mentioned the Food Security series and the majority felt
the interview was relatively benign. Finally, when asked whether they were familiar with the
acronym "SNAP," the vast majority of the respondents indicated their understanding of the term,
saying it was "Food Stamps," and some were able to say what the acronym stood for. When
asked what SNAP meant in their own words, one respondent said, "It helps you get groceries,"
and another said that he had recently seen a presentation on it at the senior apartments and the
presenter had explained, "it is a supplemental assistance program for people to get food."
3.6. Food Consumption Reporting via the Meals and Snacks Form. This form was to be
completed for (or by) all participating household members. Households often kept this form on
their refrigerator, as suggested, affixed by the study calendar magnet. As noted, this form
requires household members to “shift gears” away from food acquisitions and report on meals
and snacks consumed each day. To minimize perceived levels of burden, two formats of the grid
were created, one for households with four or fewer members and one for households with six or
more members (containing more rows for more members). In the debriefing, when asked about
the process for completing this form, five of the six primary respondents reported that it was "easy"
to complete. The one respondent who had difficulty had several household members who ate
food outside the home; she found it challenging to record this information while they were away
and to keep track of all of their booklets. One commented, "it was easy, except I missed
marking a few meals for the last days." Others reported that the form was "self-explanatory" and
said they kept it on their refrigerator. While participants reported that the form was easy to
complete, as we reviewed the completed forms, we had concerns about the quality or validity of
the data provided. Some respondents had difficulty following across the row for each household
member and recorded some entries in rows that were not populated with a person’s name. For
some cases, entire days were missed within the study week (column left blank), often in the latter
part of the week. Finally, in one household in which two of the five members were morbidly
obese, no member was recorded as having eaten any snacks on any day. To obtain better
quality data on the Meals and Snacks Form, we suggest adding a reminder about this form during
the three telephone calls for FAFH reporting and having the field interviewer review the form with the
respondent during the final household visit. We considered and rejected the possibility of asking
for meal/snack information during the telephone calls because we do not want to shift the
respondent’s focus from food acquisitions to food consumption.
3.7. Reporting Food Acquisitions via Telephone. One of the main goals of the pretest was
to determine how many households would call Mathematica according to the study schedule and
how many would require outbound follow-up the next day. The pretest also obtained burden
estimates for these calls. Table 2 summarizes the results of the telephone follow-up activity.
Table 2. Results of Inbound and Outbound Calling Efforts

| Day   | N Called Mathematica on Correct Evening | N Called Mathematica Next Day | N Mathematica Called Next Day | N Missed Entirely | Average Call Length in Minutes |
|-------|----|---|---|----|----|
| Day 2 | 6  | 0 | 0 | 0  | 16 |
| Day 5 | 5  | 0 | 1 | 0  | 10 |
| Day 7 | 4  | 1 | 0 | 1a | 12 |
| Total | 15 | 1 | 1 | 1  | 12 |

Note: Day 7 call times include food reporting as well as the telephone debriefing interview.

a This respondent was not able to call out from her telephone any longer because of nonpayment. When we called her
to complete this interview, she had received bad news and was unable to speak for several days. As a result, we
reviewed these data at the Day 8 in-person visit instead.

These calls ranged from 2 to 30 minutes in length, with an average of 12 minutes.
Overall, the telephone reporting of food acquisitions worked well. Almost all respondents
indicated that it was easy to remember the calling schedule because of the calendar magnet; none
of the respondents felt a need to prepare for the calls; five out of six respondents indicated that it
was “easy” to report their food acquisitions over the telephone; and each respondent indicated
that there was nothing that would have made the calls easier. Most respondents (four out of six)
were happy with the number of calls; one felt there could have been fewer calls; and one felt that
daily calls would have been helpful.
The telephone reporting of food acquisitions was designed to obtain high quality data about
FAFH acquisitions by providing an opportunity to ask respondents to clarify responses and
provide additional detail, especially with regard to the precise identification of foods acquired.
These calls also provide an opportunity for additional respondent training.
Opportunity for Refresher Training. Most respondents (four out of six) understood the study
requirements, and demonstrated their understanding by successfully completing the required
food acquisition forms and through comments made while reporting their food acquisitions. For
example:
 One respondent called before the scheduled time, but said, “I’m calling you now
because I already got my dinner. I haven’t eaten it yet, but it’s not about what you eat
it’s about what you get. So is it okay to do the call now?”
 Another had nothing to report and explained, "I almost went to the refreshment table
at church today, so I would have had a red page. But I didn't go because I was too
upset" (she explained that her church is closing).
 Another completed the call while commuting and engaged in a detailed discussion of
her Daily List and Red Pages. The interviewer clarified, “Do you have your booklet
with you?” and respondent replied, “No, but I don’t really need the booklet because I
am calling you and you’re writing everything down.”
Even though these households understood the study requirements, the call presented an
opportunity to provide training reminders based on the respondents’ upcoming acquisitions. For
instance, one respondent noted that he might go to a friend’s home the following day, so the
interviewer reminded him, “If you get food at your friend’s house remember to write it in the top
section of the Daily List and fill out a Red Page.” Another respondent revealed her plans to
complete her monthly grocery shopping before the next call in, so the interviewer reminded her,
“When you get home from your shopping trip, remember to fill out the Blue Page. Just follow
the directions on the page and scan your groceries before you put them away. And please feel
free to call if you have any questions while scanning.”
Opportunity to Ensure Data Quality. For households that understood the study less clearly,
the calls created opportunities to identify and correct problems that could have negatively
impacted data quality. For instance, one CV (Comprehensive Version) household failed to complete the Daily List pages.
The interviewer identified this oversight during the Day 2 call-in and retrained the respondent
accordingly. The same respondent revealed trouble with the scanning process during the call.
This respondent decided to rescan her groceries because she thought she had made a mistake.
She told the interviewer that she scanned the “Oops barcode, which clears everything out” and
then rescanned all of her items (due to this error, the wording and location of the "Oops
barcode" will be adjusted; see Appendix C). The interviewer was able to flag this error in the reporting
interface and note the double counting of groceries during this trip. The interviewer was also able
to explain the proper procedure for future reference. For another household, the interviewer
discovered the respondent was double counting meals (by recording the information in two
separate booklets when more than one household member got the food) and was able to correctly
record the information in the web database while retraining the respondent on how to complete
Red Pages. The same respondent, who was cognitively impaired, also had trouble with the
scanning procedure. In response, the interviewer talked her through the process of completing
the Blue Page and scanning her groceries, which the respondent did successfully while on the
telephone.
Household Experiencing Extreme Difficulty. The one household that encountered significant
difficulties during the study week presented an opportunity to test the protocol for households
that require greater support. The primary respondent self-identified as a person with a disability
during the Day 0 visit. During the Day 8 visit, we learned the disability was cognitive and
involved short-term memory loss. This respondent also elected to complete each of the other
household members’ booklets because she wanted to keep the $20 bonuses for grocery money.
Despite doing well in training, the respondent grew increasingly confused about the booklets and
required a significant amount of retraining. During the Day 2 call, the interviewer provided this
retraining, focusing on how the pages relate to one another and highlighting the information
required for each type of page. This call lasted 30 minutes, and the interviewer was only able to
obtain information from one of the five household members. Likewise, the Day 5 call lasted 30
minutes, and the interviewer was again only able to obtain the Daily List information from one of
the five household members. This household failed to complete the Day 7 call.
The respondent’s confusion was significant and persistent. For example:
 Despite repeated training, the respondent listed the place, items acquired, and the
price on the Daily List pages, and did not complete any corresponding Red Pages.
 When asked to open a booklet to the page that says “Daily List - Day 1” the
respondent replied, “Wait this says Friday but today is not Friday.” The interviewer
explained that the page should say Thursday because Thursday was the first day that
she was supposed to record information. The respondent replied, “But then why isn’t
it Monday, because Monday should always be Day 1” (apparently because it is the
first day of the week).
This confusion highlighted the need for phone interviewers to be able to provide thorough
retraining on the survey instruments, to resend the field interviewer to retrain the
respondent, and to offer more frequent (perhaps daily) calls if the respondent would benefit from
frequent reporting. In this instance, the field interviewer asked the respondent if daily calls would
have been helpful, and the respondent indicated that daily calls would have made reporting easier.
The respondent’s misunderstanding of the study parameters also presented an opportunity to

MEMO TO: Mark Denbaly
FROM:
Holly Matulewicz, Nicholas Redel, Laura Kalb
DATE:
7/30/2010
PAGE:
16
develop protocols to handle extreme deviation from the normal telephone reporting call
structure. Finally, the respondent’s failure to provide complete information over the telephone
led us to establish a protocol for handling missing data during the Day 8 visit (discussed in
Section 4).
3.8. Data Quality from Telephone Reporting of Food Acquisitions. During the pretest,
we reviewed the collected data to evaluate four issues related to data quality: (1) Did the
information gathered during the telephone reporting correspond to the information provided in
the respondents’ booklets? (2) Did respondents use the scanner correctly? (3) Did respondents
alter their behavior as a result of participating in the study? (4) Were there differences in the
quality of the data obtained in the comprehensive and simple survey instruments? Our findings
on these issues are described in greater detail below.
 Did data reported by telephone match respondents’ booklets? After field
interviewers collected household booklets, they compared data in the booklets to data
recorded in the web database. Data from three of the six respondents could be
categorized as a perfect or near-perfect match to the data recorded on the web
database.6 However, data from three of the respondents did not match the web
database: one failed to report information from three Red Pages, and one failed to
report information from one Red Page. Data from the third non-matching respondent
did not correspond to the information collected over the telephone because she
misunderstood the study parameters and had extreme difficulty reporting acquisitions
over the phone (as discussed in Section 3.7).7
 Did respondents use the scanner correctly? During the cognitive test that preceded
this pretest, we reviewed the scanner data to examine three questions: (1) Did
respondents scan items for each trip reported as having “all” or “some” items
scanned? (2) Does the number of scanned and unscanned (written) items match the
number of foods listed on receipts? (3) Did respondents scan the PLACE code to
delineate trips? In the cognitive test, we discovered that only 5 of the 15 cognitive test
respondents demonstrated perfect or near-perfect use of the scanner; an additional
five scanned a number of items that produced a close match to receipts, even though
they did not scan PLACE codes to properly delineate all food acquisitions; and the
remaining five had large discrepancies in the number of scanned items.

6 In a near-perfect match, all differences between web and booklet data were instances where the interviewer
modified respondent data in order to comply with the study parameters. For example, if a respondent inappropriately
recorded Red Page information on a Blue Page, the interviewer would record the information correctly in the web
interface.

7 As outlined in the initial proposal, situations like this will be addressed through additional in-person visits to
the home during the data collection week.
To address the shortcomings discovered during the cognitive test, we revised the step-by-step
guide and relocated it to the page facing each Blue Page, added BEGIN and END codes
as additional delimiters of shopping trips, and added a blank practice page that the
interviewer used during the training. We reviewed the pretest scanner data to assess the impact of
these changes; the results are summarized in Table 3.
Table 3. Scanner Data Validity by Type of Interview

| ID # | Scan Begin(a) | Scan Place | Scan End | N Places Scanned / N Blue Pages(b) | Total Items Scanned / Total on Receipts(c) | Match? | Comments |
|------|------|------|------|------|------|------|------|
| Comprehensive Version (CV) | | | | | | | |
| CV-1 | No | No | No | 1/6 | 24/0 | No | No receipts. Did not scan trip-delimiting barcodes. Completed many Blue Pages that should have been Red Pages. |
| CV-2 | Yes | Yes | Yes | 0/1 | 8/8 | Perfect | Skipped question indicating some items were scanned |
| CV-3 | Yes | Yes | Yes | 1/1 | 10/13 | Near Perfect | Unscanned items not listed on page |
| Simple Version (SV) | | | | | | | |
| SV-1 | No | Yes | No | 1/1 | 45/45 | Perfect | Perfect match |
| SV-2 | Yes | Yes | No | 1/1 | 34/46 | No | Missed 12 items |
| SV-3 | Yes | Yes | Yes | 1/1 | 19/19 | Perfect | |

a Each respondent had one shopping instance where they scanned items, so Scan Begin, Scan Place, and Scan End
apply to that trip.
b Number of places is a count of booklet pages/rows where the respondent indicated that some or all items were scanned.
c Number of items on receipts excludes nonfood items and items described in booklets because they could not be
scanned.

In the pretest, four of the six respondents demonstrated perfect or near perfect use of the
scanner; one successfully scanned most of the items but failed to list items that would not scan at
the bottom of the page; and one failed to use the scanner successfully. Furthermore, five of the
six scanned at least one of the BEGIN, PLACE, or END trip delimiting barcodes, and three out
of six respondents scanned every delimiter (per the instructions).
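The receipt-match check summarized in Table 3 can be sketched as a simple classification. This is a sketch only: the function name and the "near perfect" tolerance are our assumptions, and the actual pretest review was performed by hand against booklets and receipts.

```python
def classify_scan_match(items_scanned: int, items_on_receipts: int) -> str:
    """Classify a household's scanner use, mirroring the Match? column of Table 3.

    NOTE: hypothetical sketch; the tolerance for "Near Perfect" is an assumption,
    not a threshold stated in the memo.
    """
    if items_on_receipts == 0:
        # Nothing to validate against (e.g., CV-1, which kept no receipts).
        return "No"
    if items_scanned == items_on_receipts:
        return "Perfect"
    # A small shortfall (unscanned items listed in the booklet) counts as near perfect.
    if abs(items_scanned - items_on_receipts) <= 3:
        return "Near Perfect"
    return "No"

print(classify_scan_match(45, 45))  # SV-1: Perfect
print(classify_scan_match(10, 13))  # CV-3: Near Perfect
print(classify_scan_match(34, 46))  # SV-2: No (missed 12 items)
```

The same three-way outcome appears in the Match? column above; the sketch simply makes the comparison rule explicit.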
 Did respondents alter their normal pattern of food acquisition during the study
week? Data reported in the food acquisition process raised issues concerning both
their quality and validity. For example, one five-person CV household (with a
monthly income of approximately $840) reported spending over $80 on fast-food
purchases during the first two days of the study, then reported no acquisitions for the
next five days. Did members of this household modify their behavior because they
were anticipating the study incentive? Did they modify their behavior once they
understood the burden of saving receipts and reporting food acquisitions over the
telephone? Or worse, did they adhere to their normal patterns of food acquisition but
falsify their food reports because they realized the burden of reporting numerous
acquisitions over the telephone? During the telephone debriefing, the interviewer
asked the primary respondent specifically about this issue. The respondent claimed
that the household did not alter its buying behaviors in any way during the week and
was surprised by the question. In fact, all households were asked if they had altered
their behavior in any way as a result of participating in the study; each indicated that
they had not.
 Were there differences in data quality between SV and CV? The pretest data were
examined to determine if there were differences in data quality (e.g. frequency of
errors in food reporting, level of detail obtained, and amount of missing data)
between the two instruments. Given that there are only three cases per instrument, it
is not possible to make a valid claim about the differences between the instruments.
However, there appears to be very little difference in the quality of data obtained by
each instrument. Across the instruments, households encountered similar difficulties
with Blue Pages, Red Pages and web reporting (all outlined above) that do not appear
to be associated with the specific instruments. That said, one five-person CV
household started the week by having each member complete his or her own booklet,
but the primary respondent took over reporting for all household members after a
couple of days. During the telephone debriefing, this respondent indicated that it was
easier for her to record all of the information and said that she would have preferred
to have all materials in one book. In fact, all of the CV respondents indicated that they
would have preferred to have the data collection instruments in one book or binder,
and all SV respondents indicated that they were happy with the instrument design.
However this sample is too small to support recommendations for instrument
changes.
3.9. Incentive. As part of the pretest design, we used a tiered approach to distribute
respondent incentives. The tiered incentive was based on household size, as it relates to increased
burden, and the telephone bonus was based on the cost savings associated with incoming calls
compared with outgoing calls.8
8 During the pretest, our tiered approach involved a base amount of $50.00 for the primary respondent, plus
$20.00 for each additional household member who participated, and a $25.00 bonus for calling in to Mathematica to
report food acquisitions on Days 2, 5, and 7. However, single-person households received a minimum of $100.00 for
full participation because this incentive amount had been communicated to them early in the recruitment process.
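The tiered amounts reduce to simple arithmetic. As a sketch only (the function name is ours, and the assumption that the $25 bonus requires completing all three scheduled call-ins reflects compliance rules that are still being specified), the calculation could be expressed as:

```python
def pretest_incentive(household_size: int, call_ins_completed: int) -> float:
    """Sketch of the pretest's tiered incentive (hypothetical helper).

    $50 base for the primary respondent, $20 for each additional participating
    member, a $25 bonus for completing the Day 2, 5, and 7 call-ins, and a
    $100 minimum for fully participating single-person households.
    """
    base = 50.00 + 20.00 * (household_size - 1)
    # ASSUMPTION: bonus is all-or-nothing on the three scheduled call-ins.
    bonus = 25.00 if call_ins_completed == 3 else 0.00
    total = base + bonus
    # Single-person households were promised at least $100 for full participation.
    if household_size == 1 and call_ins_completed == 3:
        total = max(total, 100.00)
    return total

print(pretest_incentive(5, 3))  # 50 + 4*20 + 25 = 155.0
print(pretest_incentive(1, 3))  # raised to the $100.0 minimum
```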

This approach presented several challenges. First, we were concerned that some household
members might be tempted to take the "easy way out" by simply checking the box at the top of the
Daily List that says "nothing to report today" to minimize their reporting burden. This response
“counts” as a valid response (and makes the respondent eligible for the bonus), as there may be
times when this is, in fact, the case. However, there were several occasions when the telephone
interviewer doubted the validity of this report, based on previous days’ reporting or on the
baseline data from Household Interview 1 on household eating habits.
Second, we realized we needed to provide greater clarification to households about what
qualifies a household for the $25 call-in bonus. The current text simply states the household is
eligible for the bonus if the food acquisition reporting is “completed” over the study week,
without specifying that the main respondent must initiate the call to Mathematica on the
scheduled days and not have Mathematica call him or her the next day. The idea behind this
bonus was that the amount would be offset by the reduced interviewer labor costs associated
with making follow-up calls to non-responders. It would also serve as an additional source of
compensation for participation. We are developing more thorough specifications regarding
“compliance” with this incentive for the field test.
Third, we realized that this incentive structure does not enable the field interviewer to give
the respondent the appropriate check(s) at the end of the study week. In the original design, we
proposed a flat rate for all households, which would be provided in person at the Day 8 visit. In
the pretest design, we did not know until the close of Day 7 whether the household had
completed all of its call-ins. This did not allow enough time for us to process a check and ship it
to the field interviewer. Furthermore, it put the field interviewer in, at best, an awkward position
of delivering the news about the bonus (or lack thereof) in person. Therefore, for the field test,
we plan to provide the $50 or $100 base check to the main respondent on Day 8 to compensate
him or her for completing the three interviews and taking part in the Day 0 training. The field
interviewer will tell the respondent that we will mail the balance of the honorarium to the
household within a few weeks’ time.
Section 4. Conclusions and Recommendations

Overall, the six-case pretest indicated that the process developed for training respondents,
conducting the interviews, and reporting the food acquisitions works well.
Inter-Staff Communication. Communication between the phone and field interviewing teams
played a key role in the pretest process. The training and Household Interview 1 provided
information about households’ eating habits and respondents’ level of comprehension and
comfort with participation that enabled the field interviewer to highlight for the telephone
interviewer cases that would likely (1) pose little difficulty, (2) require refusal conversion efforts,
or (3) need special attention due to cognitive limitations or language barriers. Bringing this level
of communication to scale for the field test will not be without its challenges. However, we
propose creating a respondent “comfort code” whereby field interviewers can document in case
reports (when transmitting the screener data) the likely level of difficulty certain cases may
present. Field interviewers could also include detailed notes about the types of challenges that
may be encountered (such as cognitive impairments or reluctance to participate).
Call Assignments. During the pretest, respondents were assigned to a single telephone
interviewer who conducted all of their check-ins, enabling telephone interviewers to establish
rapport. While staffing schedules may not permit this process to take place on a large scale, we
will attempt to assign households to a particular interviewer (based on respondents’ comfort
level) and route calls to that interviewer. When this is not possible, calls will be routed to
interviewers who are adept at fielding calls from individuals with that particular comfort level.
Supplemental Training. Pretest findings also highlight that, regardless of the materials used
or caliber of the in-person training session, we must equip telephone interviewers with the skills
necessary to “re-train” respondents on the spot on any aspect of the food acquisition reporting
process. This includes training on completing the forms, completing the Meals and Snacks Form,
and scanning items from a grocery shopping trip. In addition, interviewers must be given the
resources to offer additional support (in the form of daily check-ins with a telephone interviewer
or an in-person visit from the field interviewer) to households having extreme difficulty with the
reporting process.
Missing Data. During the pretest, it became evident that we would need to develop protocols
for collecting FAFH data for respondents who fail to report all days during the Day 2, 5, and 7
telephone calls. We propose the following procedure to address this issue: in the early morning
before each Day 8 visit, a report providing details about missing data and specialized instructions
will be automatically generated and sent to field interviewers. During the Day 8 visit, field
interviewers will review the respondents’ booklets, clarify information contained in the booklets,
and prompt respondents about missing data. After the meeting, field interviewers will call the
telephone operations center to report the information gathered during the visit.
It also became evident that some respondents completed the Day 2, 5, and 7 telephone calls
but did not report all the red pages. To ensure that these data are not lost, we will develop a form
for field interviewers to complete when they mail survey materials back to the central office.
Field interviewers will provide a count of red pages for each day. These counts will be entered in
a database and compared to the information obtained through telephone reporting.
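The planned red-page count comparison could be sketched as follows (the data structures and function name are illustrative, not the study's actual database schema):

```python
def flag_missing_red_pages(mailed_counts: dict, phone_counts: dict) -> list:
    """Return the days on which the mailed-back booklet contains more Red Pages
    than were reported by telephone, indicating acquisitions never called in.

    NOTE: hypothetical sketch; day labels and counts are illustrative only.
    """
    return [day for day, n in mailed_counts.items()
            if n > phone_counts.get(day, 0)]

# Counts from the field interviewer's mail-back form vs. telephone reporting:
mailed = {"Day 1": 2, "Day 2": 1, "Day 3": 3}
reported = {"Day 1": 2, "Day 2": 0, "Day 3": 3}
print(flag_missing_red_pages(mailed, reported))  # ['Day 2']
```

Days flagged this way would be followed up using the booklet data already in hand, so no acquisitions are lost.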
Instrument Revisions. By administering the survey instruments to a variety of types of
households, we identified questions that need to be revised to accommodate single-person
households or that should be asked of only certain age or gender groups. In addition, we
identified some skip patterns that need to be modified. All of these changes are documented in
detail in Appendix C.

Scanner Feedback. As demonstrated in the cognitive interviews, respondents’ feedback on
the scanner and use of this technology was extremely positive. Many commented that the device
was easy to use and they liked that it provided two forms of confirmation that an item had
scanned correctly (visually and with sound). Based on our comparison of the data reported in the
Blue Pages and the places and items scanned with the device, it is clear that respondents had no
problem scanning items. Also, revisions to the step-by-step guide positively impacted
respondents’ adherence to the scanning protocol, including use of the PLACE barcode.
In summary, the pretest was a success and yielded many valuable insights on the proposed
design and study instruments. We welcome the opportunity to discuss these findings in greater
detail.

cc: Nancy Cole, Project Director

