Q. Pretest Summary Memo

Study of Nutrition and Activity in Child Care Settings II (SNACS-II) (New)

OMB: 0584-0669


Memo
To: Constance Newman
From: SNACS-II Instrument Development Team
Date: 01/06/2021
Subject: Pretest Report and Revised Instruments and Recruiting Materials (Tasks 2.2, 2.3, and 3.2)

This memorandum describes the procedures used in implementing the pretest for the second Study of
Nutrition and Activity in Child Care Settings (SNACS-II). It also summarizes major findings from the
pretest and instrument changes made in response to these findings. Finally, the memo describes the
methodology our team is using to translate instruments and recruiting materials into Spanish. In keeping
with the requirements specified in the Performance Work Statement, the table below shows staff who
reviewed this memorandum.

Personnel                      Name
Quality assurance reviewer     Laura Kalb
Copy editor                    Donna Daniels Verdier
Other staff                    Mary Kay Fox, Sarah Forrestal, Liz Gearan, Thea Zimmerman

We look forward to receiving feedback from the Food and Nutrition Service (FNS) by January 27, 2021,
which will ensure that the project stays on schedule. We will then complete any remaining revisions and
the Spanish translations, which are currently underway, and submit the final instruments and recruiting
materials by February 16, 2021.

A. Pretest procedures
We conducted pretests with providers, parents/guardians,1 and youth in November and December 2020.
We identified potential Child and Adult Care Food Program (CACFP) providers through our expert
consultants, contacts from other projects, and State agency staff. FNS notified the Regional Offices (ROs)
about the timing and content of the pretest. ROs then notified State agencies in the five States in which we
planned to recruit providers—California, New York, North Carolina, Ohio, and Texas. We then began to
recruit providers.
1. Provider pretests

To maximize diversity among the providers, we wanted to recruit four who were operating different types
of programs in different States. We began by contacting providers by email and telephone; ultimately, the
following four providers agreed to participate in the pretest: an independent child care center director in
Ohio, a sponsored child care center director in North Carolina, an at-risk afterschool (AR) center director
in New York, and a family day care home (FDCH) operator in California.

1 For simplicity, we generally refer to the pretest participants as parents in this memo. However, the term
parents/guardians may be used to accurately report language used in specific survey items.

111 East Wacker Drive, Suite 3000, Chicago, IL 60601-4303 • (312) 994-1002 phone (312) 994-1003 fax • mathematica.org
An Affirmative Action/Equal Opportunity Employer
Within two days after providers agreed to participate, we shipped them a pretest version of the Provider
Survey, handouts for the Sponsor/Center Cost Interview (except for the FDCH operator), and a simplified
version of the Infant Intake Form (for the two providers with enrolled infants), as well as provider
recruiting materials (the Provider Recruitment Letter, Study Fact Sheet and FAQs2, Provider Recruitment
Call Script, and Pre-Visit Planning Interview Script). These materials incorporated FNS and external
reviewers’ feedback on the draft instruments and recruiting materials submitted under Tasks 2.1 and 3.1.
We solicited providers’ feedback through telephone interviews. We conducted separate interviews with
each of the four providers to obtain feedback on the Provider Survey, the meal observation procedures,
Infant Intake Form incentives (if applicable), and provider recruiting materials. Interviews with the two
child care directors and the AR center director also obtained feedback on the Sponsor/Center Cost
Interview. After the interviews were completed, we sent each provider a $50 gift card to thank them for
their participation.
Specific procedures for the provider pretests are summarized below:
• We conducted cognitive interviews to pretest the Provider Survey. We asked pretest participants to
read each question and associated response options out loud and tell us what they thought the
question was asking and which response (or responses) they would choose, including any that were
not among the listed response options. We probed for participants’ interpretation of questions and
response options and the level of difficulty they had answering each question. We also asked
providers to recommend improvements to the wording of questions and response options.
Consistent with the revised study plan, we pretested a limited set of questions from the Provider
Survey. The pretest version had 32 questions—21 questions that were new for SNACS-II and 11
questions that our team thought might not be interpreted uniformly or might be improved by adding
response options. In addition, we tailored the pretest version of the Provider Survey and recruiting
materials to the provider type and age groups served. For example, only the two providers who serve
infants received the infant feeding module in the Provider Survey, and only the AR center director got
the version with two items focused on physical activity offerings before or after school for school-age
children.

• We described the meal observation procedures and asked questions to assess the feasibility of data
collection plans for this component of the study and identify potential challenges. Although soliciting
feedback on the meal observation procedures was not part of our original plans for the pretest, our
team thought it made sense to take advantage of the opportunity to get feedback from providers.

• We reviewed the Infant Intake Form instructions and asked pretest participants for their feedback
about the planned incentive amount (a $5 gift card for each completed form). We sought feedback
about the incentives because one of our expert consultants thought infant caregivers and teachers
might find the modest amount off-putting.

2 As discussed in Section B5, this document is now called Study FAQs.

• We administered the Sponsor/Center Cost Interview to the three center-based providers and asked
follow-up questions throughout the interview. For example, we asked whether the definitions of
technical terms were clear and whether it was easy or difficult to provide the requested information.
We had originally intended to pretest the Pre-Visit Cost Interview and Pre-Visit Cost Form. However,
as described in the revised study plan, we decided to instead pretest the Sponsor/Center Cost
Interview, which covers a broader range of content. For example, the Sponsor/Center Cost Interview
includes questions about food costs and revenues, and the staffing grids and handouts are similar to
those included in the Center Director Cost Interview and Center Food Service Cost Interview.

• We asked for their feedback on the provider recruiting materials, such as their general impressions
of the written materials, whether they understood the descriptions of study activities, and whether
they identified issues that were not addressed in the recruiting materials but should be. We tailored
the materials to each provider to minimize confusion. For example, the materials we shipped to the
FDCH operator omitted references to the cost study.

2. Parent and youth pretests

We pretested two modules (Modules A and C) in the Parent Interview for an In-Care Day (ICD).3 Module
A includes the Automated Self-Administered 24-Hour Dietary Assessment (ASA24), including use of
the Child Food Diary (CFD), for an ICD. Module C assesses child physical activity. We pretested the
Parent Interview modules with two parents and the entire Teen Survey with two youth.
The AR center director provided a password-protected list with contact information for parents of 10- and
11-year-olds and distributed a recruiting flyer to the parents. We contacted all 17 parents on the list who
were identified as English-speaking and left voice mail messages with 13 of them explaining the purpose
of the call and requesting a call back if they were interested in participating in the pretest.4 None of the
parents who received a message returned our call. During follow-up phone calls, we scheduled interviews
with three parents, to be conducted when both the parent and the child were available so the pretests could
be completed in one session. In scheduling the interviews, we also made sure that parents could complete
the CFD for an in-care day. After we obtained parental consent and scheduled each interview, we shipped
a packet with instructions for completing the CFD and, for the two youth participants, a copy of the Teen
Survey. We texted parents a reminder to complete the CFD one day before the interview. Two parents
rescheduled several times, and the third did not respond to reminder calls. We thus had to recruit another
parent. We conducted in-depth cognitive interviews with parents and youth by telephone. Mothers were
the respondents for two of the parent interviews, whereas the mother and father jointly completed the
third interview. We sent a $40 gift card to the parents and a $20 gift card to the youth to thank them for
their participation.

3 The other modules include questions from the SNACS-I Parent Interview with little to no modification, so were
not pretested.
4 Among the other four parents, one spoke Spanish, one hung up and would not speak with us, one did not have
voice mail set up and we were unable to leave a message, and one had a wrong number listed.


B. Pretest findings and recommended revisions
1. Provider Survey

Below, we provide an item-by-item summary of the pretest findings, the changes we made in the revised
version of the Provider Survey to address these findings and, in some cases, additional changes we
recommend be incorporated into the final version of the survey. The item numbers correspond to the
numbering in the revised version of the survey.
• M1.7. Pretest participants did not understand the intent of this question or the meaning of the
response options. The providers’ perspective was that they do not charge families in any way for the
cost of CACFP meals and snacks. To address this feedback, we added “meals and snacks” to the two
existing response options and added a third response option (“Price of meals and snacks is not
charged to families in tuition or separately”). However, given that all providers were confused by this
question and the fact that volume 1 of the CACFP Sponsor and Provider Characteristics Study
documented that few providers charge families separately for meals and snacks (pp. 5-16 and 5-17),
we recommend deleting this question from the final version of the Provider Survey.

• M1.8. Two pretest participants interpreted this question as asking about the language the majority of
parents/guardians speak. They also asked whether the question focused on the language
parents/guardians speak when they come to pick up or drop off their children or the language they
speak at home. We modified the wording of the question to focus on languages spoken at home.

• M1.9. Except for the FDCH operator, pretest participants misinterpreted the intent of this question,
thinking it was asking about the language their staff speak in their own homes. We modified the
wording of the question to focus on the language staff usually speak at the child care site.

• M2.5. Pretest participants understood this question and felt the response options were relevant to their
programs. However, the AR center director noted that the dinner meal is a CACFP supper, but the
afternoon snack is reimbursed through the National School Lunch Program (NSLP), and each is
planned by a different organization. We modified the wording of this question to clarify that the focus
is CACFP meals and snacks. Because potential confusion about afternoon snacks reimbursed through
the NSLP is cross-cutting and could affect many aspects of data collection and analysis, our team
plans to develop comprehensive plans for dealing with this situation. We discussed this issue with
FNS in a meeting on December 18, 2020, and FNS agreed to look into how the situation was handled
in SNACS-I. Our plans will build off of the previous study and will be incorporated in final versions
of the instruments, recruiting materials, and study plan.
One pretest participant said she is the child care provider, director, and teacher at her site, so she
would select all three of these response options. To avoid the appearance that multiple people are
engaged in an activity when it is actually one person with many responsibilities, we added an
instruction to select the person’s main role. We also added this instruction to questions M3.1 and
M3.6, which have similar structures.

• M2.9. Two pretest participants did not understand this question as it was intended. These two
respondents recommended new response options that focused on general menu planning challenges
rather than challenges associated with planning menus that meet the updated meal patterns (“parent
preferences” and “dealing with food allergies”). The other two participants said they had no
challenges planning menus that meet the updated meal patterns. To address this feedback, we
underlined “meet the updated CACFP meal patterns” to emphasize the question’s intent and added a
“no challenges” response option.
• M4.2. All pretest participants clearly understood this question and the response options, and they did
not suggest any new response options.

• M5.7. Pretest participants were unsure whether this question was asking about what happens to the
food that is left on children’s plates after the meal or what happens to food that is prepared but not
served. We modified the wording of the question to clarify that it is asking about food remaining in
the classroom, not foods left on individual children’s plates. During earlier stages of instrument
development, Mathematica and FNS decided to omit items in the SNACS-I Classroom Waste
Measurement Form that had been designed to capture this information at the food-item level, and to
add a question to the Provider Survey to capture more general information about the handling of food
remaining in the classroom.

• M5.8. One pretest participant did not understand the intent of this question. In response, we added an
introductory sentence, “We are interested in methods centers use to prevent or reduce food waste”
and underlined “to prevent or reduce food waste” for emphasis.
In addition, all three center directors did not understand response option b (“Serving pre-cut, ready-to-eat fruits or vegetables (e.g., apple slices, orange slices, or carrot sticks)”) and two center directors
and the FDCH operator did not understand response option l (“Tailoring the number of meals and
snacks prepared daily to meet usual attendance”). With probing, we identified and incorporated
changes in the wording that would both clarify the questions to participants and pertain to a wider
range of provider practices: “Serving pre-cut, ready-to-eat fruits or vegetables (e.g., apple slices,
orange slices, or carrot sticks) so that children can take or request only the amount they want to eat”
and “Tailoring the number of meals and snacks prepared daily based on expected attendance.”

• M6.4. Two pretest participants interpreted this question as asking only about accommodations related
to meals and snacks. We added a sentence to clarify that the question is asking about procedures that
accommodate children with disabilities or impairments at mealtimes and at other times.
Additionally, one participant said she would not know how to respond to this question because the
center has both written and informal policies on this topic. We added a response option for providers
to indicate that they have “both an informal and a written policy.” We also added this response option
to other questions in the survey that ask about the existence of a policy.

• M6.5. As with question M6.4, pretest participants did not understand the focus of the question, so we
added language to clarify that they should select the procedures they use to accommodate children
with disabilities or impairments at mealtimes and at other times. Two participants described other
accommodations they believe are common among providers serving children of all ages. To address
this feedback, we added two additional response options: “Communicate with pictures and signs” and
“Provide breaks from the group for individual children to help them self-regulate”.

• M7.3 and M7.3.a. The AR center director interpreted the term “recreational activities” to include
time spent playing table games, such as foosball. We revised question M7.3 to ask specifically about
“programming that includes time for physical activity,” and changed “activities” to “programming” in
question M7.3.a for consistency.


Module 8: Infant feeding and physical activity
The draft version of the Provider Survey included placeholders for several new infant questions that
needed to be developed after getting more input from FNS. The questions listed in italics in this section
are those that were developed after we received this input from FNS (and thus were not included in the
draft deliverable).
• M8.1. This question was in the SNACS-I Provider Survey, but we included it in the pretest because
we thought the response options were awkwardly worded. Both of the pretest participants who served
infants reported concerns about this question. They thought that providers would have a difficult time
answering it because the timing of feedings varies for each infant and is affected by several factors,
including parent request, when the infant was fed before arriving at child care, and the age of the
infant (pretest participants reported that younger infants are generally fed when they show they are
hungry, while older infants start transitioning to a meal-like schedule as they begin to eat more solid
foods).
Based on this feedback, we recommend dropping this question from the final version of the Provider
Survey. The RQ that this item was intended to address—RQ 2 under Objective 5 (Are providers
feeding infants “on-demand” or at set meal times?)—is already addressed by two other items in the
Provider Survey. Question M8.2 asks about how often staff use responsive feeding techniques and
question M8.3 asks about how staff determine the end of infant feedings. If FNS agrees with this
recommendation, the final version of the study plan will omit “timing of infant feeding” as an
outcome for RQ 2 in Table VII.5. The other two planned outcomes for this RQ (frequency of using
responsive feeding techniques and determination of end of infant feedings) will be retained.

• M8.4. Pretest participants reported that there is no easy way to determine the “average” age for
introducing solids. Participants preferred the term “typical” for this question, so we adjusted the
wording of the question accordingly.

• M8.5 and M8.6. These questions were new to SNACS-II and were designed to address the part of RQ
4.b under Objective 5 that asks, “How do providers determine when to introduce solid foods?”
Question M8.5 asks about how staff determine whether an infant is ready for solids, and question
M8.6 asks about how staff work with parents to determine when infants are ready for solids. The two
pretest participants who completed these questions felt strongly that the questions do not reflect the
factors that determine when solid foods are introduced. They emphasized that child care staff do not
make decisions about when solid foods are fed to infants—this is determined by CACFP policy and
by parental request/guidance. Based on this feedback, we recommend dropping these questions from
the final version of the Provider Survey. In email discussions with FNS during earlier stages of
instrument development, FNS indicated that the main interests in asking about solid foods are to (1)
assess whether providers are following American Academy of Pediatrics (AAP) and CACFP
recommendations about when to introduce solid foods, (2) identify the types of solid foods that are
typically introduced first, and (3) understand the challenges providers may face in introducing solids.
Questions M8.4, M8.7, and M8.8 address these topics.
If FNS agrees with this recommendation, the final version of the study plan will omit “how providers
determine when to introduce solid foods” as an outcome for RQ 4b in Table VII.5. The other two
outcomes currently included in Table VII.5 for this RQ (types of solid foods typically introduced first
and challenges providers face related to introduction of solid foods) will be retained. We will also
adjust the wording of RQ 4b to match the outcomes that will be examined—Which solid foods are
typically introduced first? What challenges do providers face related to solid foods? Finally, for RQ
4a (“Does the timing of the introduction of solid foods follow AAP recommendations?”), we
recommend adding a second outcome that focuses on the percentage of providers following
AAP/CACFP recommendations for introducing solids to more fully address this RQ (the outcome
currently included for this RQ in Table VII.5 is the average age solids are introduced).
• M8.7. Pretest participants clearly understood the question. However, one pretest participant said that
it would be difficult to say what is “typical” because it can vary by infant and parent preference. We
changed the wording of this question to focus on the single type of food that is most often introduced
first and changed the format to “select only one” response (versus check all that apply). We also
updated the response options to include more disaggregated groupings of foods (for grains and
meat/meat alternates) to reflect the change in the focus of the question on a single type of food.

• M8.8. Pretest participants said it would be difficult to rate whether a challenge, such as an emphatic
parental request to start solid foods early, is “major” or “minor” because although it is a big challenge
when it happens, it does not happen often. The strong and consistent negative feedback on how
difficult it would be to differentiate major from minor challenges led us to change the question
wording and format to allow for a Yes, No, or Don’t know response for each challenge.

• M8.10. The participants clearly understood this question and the response options, although they
suggested that the second response option should be revised to clarify that parent preference, not
infant preference, drives the decision to send solid foods from home for infants. We changed the
wording of this response option to focus solely on parent/guardian preference. (Note that we also
administered question M.8.9 in the pretest because the response to this question was needed to
determine whether respondents should answer M.8.10; pretest participants did not have feedback
about M8.9.)

After revising the Provider Survey following the pretest, we revisited the estimated burden. We concluded
that the 50-minute estimate need not be changed. If FNS agrees to delete the recommended questions,
their removal will not meaningfully reduce the estimated burden.
2. Meal observation procedures

Following the pretest of the Provider Survey, we asked the pretest participants for their reactions to our
plans for conducting the meal observations. Specifically, we asked for their input about plans for:
• The measurement of reference portions,
• Observations of three children at one table, and
• The general observation procedures.

Below we summarize the feedback we received and any revisions to the procedures that we made in
response.
a. We asked these questions about our plans for measuring reference portions:

• How easy or difficult would it be for staff to provide us with two servings of each food item?
• What might be the best time to ask staff to provide servings of food for measurement?


Providers indicated that it would not be a challenge to provide us with two servings of each food for use
in measuring reference portion sizes. Providers suggested that the optimal time for obtaining reference
portion measurements would be just before the start of each meal. Center directors also recommended that
they introduce the observer to staff at the start of the data collection visit. At that time, observers and staff
can best discuss the approach for measuring reference portions and the optimal time for the observer to
return to the kitchen to measure the portions. We did not make any changes in response to this input. Our
existing procedures for measuring reference portion sizes—that is, requesting that observers be
introduced to the food preparer, teachers, and any other staff, and having observers measure reference
portions before the start of each meal—are consistent with this feedback. As in SNACS-I, training
materials will specify that observers should start measuring reference portions 15 minutes before the start
of meal service.
b. We asked pretest participants these questions about observing three children at a table:
• How easy or difficult would it be for staff to have the three sampled children eat their meals at the
same table and at the same time?

All pretest providers replied that having the sampled children sit at the same table would be feasible.
Center-based directors emphasized that communication with the classroom teacher will be very
important. Moreover, they recommended that observers ask classroom teachers to seat sampled children
next to each other (not just at the same table), which will make it easier for the observers to observe
trading or any other additions or subtractions from the child’s plate.
We did not make any changes as a result of this feedback. Our existing procedures include plans to
discuss the meal observation approach with the classroom teacher (and with FDCH operators).
c. Finally, we asked for general feedback on planned procedures for the meal observations.

Center-based directors noted that if observers are also completing the EOF, there may be too little time
between the end of an activity session and the start of meal/snack service to allow observers to visit the
kitchen and get reference portion measurements. On the basis of this input, we made the following
modifications to the onsite data collection procedures:
• During the pre-visit planning phase, we will request a schedule that documents activities and meal
and snack times for the sampled classroom. We added this to the Pre-Visit Planning Interview so that
providers sampled for child-only or child-and-cost data collection can upload both the roster and the
schedule.

• During their initial in-person meeting with the food preparer, field interviewers (FIs) will ask about
all foods and drinks to be served at meals and snacks that day and populate the appropriate fields on
the Reference Portion Measurement Forms (within the Meal Observation Booklet). If the FI is unable
to record all foods and drinks during this initial meeting, the Reference Portion Measurement Forms
can be completed when measuring reference portions for a given meal and snack. The FI will also
confer with the kitchen staff (and possibly the teacher) during this initial meeting so that all of the
parties involved with meals and snacks are aware of the logistics of getting the reference portion
measurements. Working out the process in advance and having observers start to populate the
Reference Portion Measurement Form at the beginning of the day will speed up the actual
measurement process and maximize the time available for the EOF. We will add these details about
our approach to measuring reference portions to the final version of the Study Plan.
3. Infant Intake Form incentives

As noted in Section A, we asked the two providers who serve infants for their opinion about the planned
incentive of $5 per completed Infant Intake Form. Both pretest participants said that completing this form
would not be very burdensome and the incentive amount is reasonable. However, because staff rotate
through the infant room throughout the day, more than one person might need to contribute to completing
each form. We considered three alternatives to deal with this situation:
a. Give the monetary incentive to the lead teacher to divide among participating staff.

Although this approach is straightforward for the field interviewers to carry out, it places additional
burden on the teacher if the number of $5 gift cards is different from the number of staff (for example,
three caregivers who help to complete two Infant Intake Forms).
b. Make the cash incentive a classroom gift.

Although this strategy is less burdensome on the lead teacher, the modest dollar amounts given to the
classroom are not likely to incentivize individual caregivers’ behavior.
c. Give the classroom children’s books.

This is the strategy that we recommend. We can supply field interviewers with a small selection of age-
and culturally appropriate titles and offer one book for up to two infants, two books for three to six
infants, and three books for seven or more infants. Mathematica conducts several early childhood
projects, including some in Head Start centers, and the survey directors report that the books are well-
received. If FNS agrees with this change in approach, we will incorporate the new incentive strategy in
the study materials and Office of Management and Budget package. (Before purchasing supplies for data
collection, we will consult with our colleagues for book recommendations, examples of which are A Ball
for Daisy and Good Night, Gorilla.)
4. Sponsor/Center Cost Interview

We pretested three modules in the Sponsor/Center Cost Interview with the three non-FDCH participants.
Below, we summarize the findings and describe the resulting revisions for each of the modules and
associated handouts.
a. Sponsor CACFP Staff Cost Interview module

The pretest interviews confirmed our expectation that, depending on the type of provider, more than one
person may be needed to complete all of the modules of the Sponsor/Center Cost Interview. However, the
Sponsor CACFP Staff Cost Interview module will need to be completed only by staff at the sponsor level,
not by center directors or other staff. We have updated the cover page for this module to remove center-level staff from the list of possible respondents.


• Sponsor Staff Overview. We updated the formatting of the instructions to improve clarity for both
the interviewer and the respondent. We now provide the definitions of direct and support roles first
and then offer examples. We also structured these instructions as a bulleted list.

• Sponsor Staff Grid. The term “central kitchen” was not familiar to at least one of the pretest
participants. We added a definition below the grid for reference during the interview.
−

Participants were unsure about whether to include contracted services, such as trash or janitorial
services, in this staff grid. Because the instructions for this section are already long and detailed,
we decided that the most effective approach would be to emphasize in interviewer training that
these types of services should not be included in the grid. Rather, as the instructions state, the grid
is intended to capture information specifically about the individual staff members the
organization employs.

• Roster for Sponsor Staff with Direct CACFP Roles. We updated the formatting of the check boxes
in this table to ensure they are uniformly centered.

• Handout 1: CACFP/Food Service Activities. On the basis of feedback from pretest respondents, we updated this handout as follows:
− We added a bullet to the last category ("Other CACFP/Food Service Activities for CCCs") to indicate that any other CACFP activities not listed should be included in that category.
− One pretest participant pointed out that we included a third bullet ("Any other work that involves direct production…") under breakfast, lunch, and breakfast and lunch production, but did not include the corresponding bullet under snack and supper production. We added the corresponding bullets to those items for consistency.

b. Support Staff Cost Interview module

• Introduction. The independent center director expressed confusion about the purpose of this section,
which is to ensure that the interviewer does not miss capturing any staff with support roles in CACFP
at the center level. We slightly revised the introductory text in this section to make the purpose
clearer.

• Handout 2: Definitions for Support Functions. We changed the title of this handout to reference
support functions rather than the “Support Function Cost Grid.” Referring to the grid in the
instrument was confusing to pretest participants because they were not looking at the instrument
alongside the handout.

c. Food Price and USDA Foods Checklist module

• FPM1. Two pretest participants suggested that we reference "catering" or "caterers" in our interview.
While neither used a caterer, they said that it was a common term used among CACFP providers to
describe meals prepared for the organization. We updated the text of this item to ask about “a food
service management company or other vendor or caterer” to bring our terminology in line with that
used by providers.

• FPM2b, FPM3b, FPM4b, and FPM5b. We updated the formatting in column C of each grid to
remove unneeded lines.

• Handout 3: List of Foods. One pretest participant was unfamiliar with the term "USDA Foods." We added a brief definition of USDA Foods to the handout to address this issue.

d. Handouts

We identified additional ways to improve the cost instrument handouts.
• It was difficult for the pretest participants to visualize the types of time and salary information we
were collecting. We therefore added examples of the time and salary grids to the handouts, not only
for the Sponsor/Center Cost Interview handouts but also for the Center Director and Center Food
Service Cost Interview handouts.

• We made capitalization of the handout titles consistent across the handouts.

5. Provider recruiting materials

Pretest participants uniformly recommended that the content of the recruiting materials be streamlined
and simplified. We revised the Provider Recruitment Letter and consolidated the Study Fact Sheet and
FAQs. We moved some of the Study Fact Sheet and FAQs content to the recruiting website home page
for sample members who wish to read more details about the study. Because we revised the content from
the fact sheet portion of the document to be in question-and-answer format, we will now refer to the
document as the Study FAQs. Pretest participants also suggested that the term “onsite liaison” be replaced
because it is unfamiliar. In response, we changed onsite liaison to point-of-contact in study materials.
During our discussion of the Provider Recruitment Call Script, the AR center director explained that her
center is typical of many afterschool programs, in that children are not grouped by classroom for
activities, nor does each child have a predetermined schedule of activities on any day. She explained that
the center usually offers four or more activities each day and children individually choose the three
different activities they want to participate in, as well as the sequence of activities. Children are not
allowed to engage in the same activity for the entire afternoon.
The SNACS-II sample design includes first sampling classrooms or groups of children to serve as the unit
of observation for the EOF, and then sampling children within the classroom or group for the meal
observations and other child/parent data collection. Centers that are “ungrouped” will need a different
approach. To address this issue, we added two screening questions to the Provider Recruitment Call
Script to screen for ungrouped AR centers and outside school hours care centers for both activities and
meals. Our team is working with FNS to determine how this arrangement was handled in SNACS-I so
that we can replicate those procedures. We will incorporate the procedures into the final study plan and
relevant study materials.
6. Parent Interview for In-Care Day and Child Food Diary

The parent pretest focused on two modules (Modules A and C) in the Parent Interview for an ICD.
Module A includes administration of the ASA24, including use of the CFD for an ICD. We did not
pretest telephone administration of the ASA24 because both Mathematica and Westat have successfully
administered the ASA24 in prior studies. Rather, the pretest for Module A focused on the CFD. Although
our team has used CFDs in several other studies, we wanted to test the version that will be used in SNACS-II to assess whether parents understand the directions and complete the CFD as intended. Module
C, which focuses on child physical activity, was pretested in its entirety.
Module A: Child Food Diary
None of the parents completed the CFD before the interview. During the interview, we asked parents to
read the instructions and then complete the CFD by reporting part of their child’s intake on the previous
day. We then asked about (a) reasons for not completing the CFD before the interview and the usefulness
of the reminder text message we sent, (b) their understanding of the directions for completing the CFD,
and (c) the ease of reporting their child’s intake.
a. Reasons for not completing the CFD and usefulness of the reminder text message

Parents indicated that they had not opened the packet and reviewed the materials before the interview
because they had been busy. Two parents indicated that if they had been part of a “real” study—that is,
not the pretest—they would have taken the time to complete the CFD. All three parents commented that
study participants should find the reminder text message helpful. One parent stated that, rather than use
the CFD, she would involve the child in reporting foods and drinks during the interview.
Recognizing that the CFD is an optional tool intended to make the ASA24 interview easier and more time
efficient, we will emphasize its utility to parents during interview scheduling and in the reminder text
message and phone call. We also revised the language in the cover letter and on the instructions for the
CFD to further emphasize its utility. In addition, the current interview protocol for children ages 9 and
older requests that the child participate in the ASA24 portion of the interview along with the parent; thus,
if the parent did not complete the CFD, the interview can proceed and the child can help report on intake.
However, if the child is not available and the parent did not complete the CFD, the interviewer will
proceed with other modules in the survey and return to the ASA24 module. These procedures were
already included in the instrument logic.
b. Understanding of the directions for completing the CFD

Parents found it easier to record foods by meal type, rather than the time of the meal, and recommended
changing the directions to record food and drinks by meal type (breakfast, lunch, dinner, or snacks, for
example) rather than time of day. One parent also recommended different wording in two places to
simplify the directions: (1) replacing the term “amount” with “quantity” and (2) replacing “when they are
not at child care or with you” to “when they are not in your care.” On the basis of this input, we revised
the directions and the column header in the CFD grid to ask parents to record the time and/or name of the
meal. We also revised the CFD directions to use the term “quantity” (instead of “amount”). Finally, we
revised the directions to clarify that all foods and drinks consumed outside of child care should be
recorded and also added a new instruction for parents of school-age children to ask their child about the
foods and drinks they had while at school.
c. Ease of reporting their child's intake

All parents said their child ate breakfast and lunch at school, and they did not know what foods and drinks
their child had at those meals. All parents were confident that their child would be able to provide
information about the foods eaten at school, but they were unsure whether their child would be able to report the amount of that food. None of the parents sent foods to the child care center, although one had
recently started sending a bottle of water.
All parents indicated that outside of school and child care, their children ate all meals and snacks at home,
and thus they need not ask others about meals consumed outside of school or child care. One parent felt
that talking with other adults who served foods and drinks to their child (outside school or child care)
could be considered very personal, and if their child did eat elsewhere, they would not ask others about
what they served to their child. All the parents said they would just ask their child what they ate and
drank.
As described above, we added instructions to the CFD asking parents to specifically talk to their child
about foods and drinks they consumed while at school. We did not make any further changes as a result of
this feedback because the current data collection protocol addresses such scenarios. In all dietary intake
studies, respondents are sometimes unable to provide details about the amount of food consumed. Our
coding guidelines will use the same methods that the National Health and Nutrition Examination Survey
(NHANES) does to resolve these issues.
Module C: Child’s Physical Activity
Module C includes seven questions. Of these, five are slightly modified versions of questions from the Physical Activity Questionnaire (PAQ) used in NHANES,5 one is a validated question from Adamo and coauthors,6 and one is entirely new. Below, we provide an item-by-item summary of the pretest
findings and the changes we made to Module C in the revised version of the Parent Interview. The item
numbers correspond to the numbering in the revised version of the instrument.
• Q9. All three parents were able to answer this question easily, reflecting activities their child was
currently participating in. They reported that they thought about the time their child spent being active
after school, particularly their participation in paid after-school extracurricular activities. We did not
make any changes to this question.

• Q10 and Q11. In the draft version of the interview, one question asked, "Thinking now about a
typical weekday, about how many hours does [CHILDNAME] spend watching TV or videos when
[she/he] is not at child care or school?” Parents asked for clarification about the time period for
reporting, given that the amount of time children spend watching TV or videos varies by season. Two
parents indicated that they would be able to report the time easily, because they set time limits. The
other parent reported that they allow their child to watch TV or use the computer but not both at the
same time or on the same day. When parents were asked how they reported the time their child spent
watching TV while also using social media (for example, watching TV while checking social media
on their phone or computer), parents said they included that time as spent watching TV, as this was
the primary activity. One parent misunderstood "videos" to mean video games and included time spent playing video games. Two parents indicated that having response options instead of reporting time in an open-ended format would be helpful.

5 NHANES 2019, Physical Activity and Physical Fitness – PAQ, Target Group: SPs 2+. Available at https://wwwn.cdc.gov/nchs/data/nhanes/2019-2020/questionnaires/PAQ_K.pdf.
6 Adamo, K.B., S. Papadakis, L. Dojeiji, M. Turnau, L. Simmons, M. Parameswaran, J. Cunningham, A.L. Pipe, and R.D. Reid. "Using Path Analysis to Understand Parents' Perceptions of Their Children's Weight, Physical Activity, and Eating Habits in the Champlain Region of Ontario." Paediatrics & Child Health, vol. 15, no. 9, November 2010, pp. e33–e41. doi:10.1093/pch/15.9.e33.
On the basis of these findings, we added a lead-in question about the number of weekdays in the past
week that the child watched TV rather than asking about a typical day. We also focused the question
about time spent watching on days when the child did watch TV or videos. These revisions will focus
parents on the current season and address the issue that some children may not watch TV every day of
the week. Further, to reduce respondent burden, we replaced the initial open-ended response for time
spent with categorical response options. These response options are consistent with the NHANES,
National Youth Fitness Survey PAQ, Q710.7 Finally, we added instructions to clarify that if the child
engages in more than one activity at a time, parents should report the time spent on the primary
activity and not double count the time for the two activities.
• Q12 and Q13. In the draft version of the interview, one question asked, "Thinking again about a
typical weekday, about how many hours does [CHILDNAME] spend using a computer or playing
computer games, when [she/he] is not at child care or school? Include PlayStation, Nintendo, DS, or
other portable video games.” Parents’ comments and questions about this item were similar to those
for the preceding item about watching TV and videos. In addition, one parent reported that on a given
day her child either watched TV or played computer games and reported hours for each of these
(three hours for TV and three hours for computer games). Because these activities never actually
happen on the same day, these data would have overestimated (double counted) the amount of screen
time per day.
Considering these findings, we made changes similar to those we made for Q10 and Q11, adding the
lead-in question to ask first about the number of weekdays the child played video games or used a
computer (which avoids the concern about double counting). We also modified the wording of the
question so we could exclude time spent using the computer to do homework. We added instructions
to ensure that parents are not double counting time spent watching TV and using the computer. This
approach will allow us to accommodate instances in which children do not engage in these activities
every day. Finally, to reduce respondent burden and for consistency, we standardized the response
options to match those provided for Q11.

• Q14 and Q15. In the draft version of the interview, one question asked, "Thinking again about a typical weekday, about how many hours does [CHILDNAME] usually spend sitting, when [she/he] is not at child care or school? Include time spent sitting at home, getting to and from places, doing homework, reading, or playing cards or board games." None of the parents were able to answer this question. They all said "a lot" and indicated that they would not be able to estimate this time, as children are in their rooms or parents are occupied with chores after work. In general, the time the child spent at home was mostly seated, with periods of getting up and down.
Also, it was apparent that none of the parents noticed that the example included time seated in car
travel from one destination to another.
Because this question is very broad and parents are not confident about their ability to provide a valid
estimate, we replaced the original question with two questions focused on the past week. The first
question asks about how much time the child spent sitting while traveling (by vehicle) from one place
to another. The second question asks about time spent sitting while engaged in activities such as homework, reading, playing cards or board games, excluding computer screen time. Since travel times may vary considerably and may be less than one hour for some, we did not create categorical response options for this question. We did provide categorical response options for the question about time spent sitting while engaged in activities, however. These response options are consistent with those provided for Q11 and Q13.

7 NHANES, NNYFS. PAQ Q710. Available at https://www.cdc.gov/nchs/data/nnyfs/paq.pdf.
• Q16 and Q17. The draft version of the interview included two questions asking about activities that increased children's heart rates: "In the past 7 days, on how many days did [CHILDNAME] spend time doing things that increased [his/her] heart rate and made [him/her] breathe hard? Include biking; brisk walking; swimming; dancing; competitive sports; or playing active video games such as Wii Sports, Wii Fit, Xbox, Kinect, PlayStation 3, or Dance Dance Revolution" and "On a typical day, how much time does [CHILDNAME] spend doing things that increase [his/her] heart rate and make [him/her] breathe hard? Include biking; brisk walking; swimming; dancing; competitive sports; or playing active video games such as Wii Sports, Wii Fit, Xbox, Kinect, PlayStation 3, or Dance Dance Revolution." All parents answered these questions without difficulty. However, during follow-up probing, it became apparent that they were only counting the days their child played competitive sports or other structured activities. They were not including activities such as biking and brisk walking. Reading the list of example activities was time consuming, and it was apparent that by the time we finished reading the list, parents may not have fully understood or remembered the intent of the question. It was also apparent that the time frames for the two questions were not in sync; one asked about the past seven days and one asked about a typical day (implying weekdays).
We modified the structure of these questions to match the structure used in Q10 through Q15. We
also added a preamble before Q16 that includes examples of activities first, then asks the question.
We also modified the wording of the questions to clarify that parents should focus on weekdays and
activities outside of school and child care.

• Q18 and Q19. In the draft version of the interview, one question asked, "On a typical day, how much
time does [CHILDNAME] spend playing outdoors? Include time spent biking; brisk walking;
swimming; competitive sports; or in other activities outdoors.”
One parent said it would be hard to track and monitor their child’s outdoor activities because (1) they
would not know if the child was playing or just sitting and hanging out with friends when outside,
and (2) parents don’t know how much time their children spend outdoors at school or child care.
Another parent said she knows about the time her child spends in an activity because she pays for the
class.
Because the question asked about a typical day, parents were unsure if they should include weekend
activities. For consistency and ease of reporting, we structured the question to be comparable to other
questions in the module and provided a time anchor (the past week).

Because we fully pretested only one of the modules in the Parent Interview, we did not change the estimated response burden. Although we added questions to Module C, we expect that these changes will reduce confusion without increasing respondent burden.

7. Teen Survey

Two boys, both age 10, pretested the Teen Survey. One of the participants completed the survey on his own. The parent of the other participant said that the child,8 who had a learning disability, would not be able to complete the survey independently. Instead, the parent read the survey questions to the child and
recorded the responses. To deal with such a situation during data collection, we will train FIs to
administer the survey, if needed; encourage respondents to ask questions as they complete the survey; and
provide neutral assistance (that is, avoid leading respondents to certain answers).
Below is an item-by-item summary of the pretest findings and the revisions we made in the Teen Survey
to address these findings.
• Assent and instructions pages. None of the pretest participants read the assent page when they
completed the survey. We gave them time to do so when we did the follow-up interviews and probed
to assess whether they found any part of the form hard to understand or confusing. None of the
participants thought the form was hard to understand or confusing. We will amend the study protocol
to indicate that FIs will instruct respondents to read the assent page before completing the survey. No
changes to the assent and instructions page are required.

• Q1a, Q1b, and Q1c. Questions in the draft version of the survey asked, "In the past week, how many
hours did you spend at this program after school each day? How many hours did you spend before
school?” and “In the past week, how many hours did you spend at this program on the weekend
(Saturday or Sunday)?” Neither child could estimate the hours spent at their afterschool program.
When probed, we learned that the afterschool program schedule did not vary by weekday, and they
were able to tell us when they arrived and when they left the afterschool programs.
Based on this feedback, we split these questions into three parts to ask for arrival and departure times
before school (Q1a), after school (Q1b), and on the weekends (Q1c).

• Q2, Q3a, and Q3b. Questions in the draft version of the survey asked, "On each of the past 7 days,
how many minutes were you physically active? Add up all the time you spent in any kind of physical
activity that increased your heart rate and made you breathe hard some of the time” and “Which of
the following activities did you do? [a list of activities followed the question]” One child felt it would
be easier to report the number of days he spent being physically active for at least 30 and 60 minutes,
rather than report the total time for each day. The parent administering the survey to his child
recommended simplifying the instructions so that examples of activities immediately followed the
instructions to add up the time they spent in any kind of activity.
Both pretest participants selected multiple activities from the list. When we asked if they did any
other activity that was not on the list, one child reported engaging in skateboarding and taekwondo,
but had not written these activities in the “other, specify” field because it was only when probed for
other activities that these activities had come to mind. Both participants (particularly the parent)
suggested reversing the question sequence so that respondents first select the activities they do and
then provide a time estimate.
Based on this feedback, we modified the list of activities to split out martial arts (such as karate,
taekwondo, or jiu-jitsu) and simplified the wording for response options that include a number of different activities, including the option that includes skateboarding. We also modified the order of the questions to ask first about activities and then about the number of days the child was physically active (for durations of at least 30 minutes and at least 60 minutes). To reflect the expansion to focus on both 30 minutes and 60 minutes of physical activity, we will add the mean number of days with at least 30 minutes of physical activity as an outcome for RQ 1 in Table VIII.3c in the final Study Plan.

8 Because both pretest participants were 10 years of age, we use the term "child" rather than "teen" throughout this discussion for simplicity.
• Q4 and Q5. Questions in the draft version of the survey asked, "During the past 7 days, on how many
days did you do any exercises to strengthen or tone your muscles, such as push-ups, pull-ups, sit-ups,
weight lifting, climbing (on rocks, ropes, trees or playground equipment) or yoga?” and “Which of
the following exercises to strengthen or tone your muscles did you do?” Neither pretest participant
had any difficulty understanding or responding to these questions, so we made no changes to the
content of the questions. However, for consistency, we reversed the order of the questions to ask
about the activities first.

• Q6. Both pretest participants understood this question and answered it with no difficulty. However,
we simplified the construction to ask about how many days the respondent goes to physical education
(PE) class.

• Q7. The parent participant felt that the concepts of "past 12 months" and "extracurricular" activities
are hard for children to understand. Moreover, if a child joined a sport or activity and participated in
it for less than 12 months, they may be confused about whether they should count this sport or
activity. The draft question had been adapted from a question in the Youth Risk Behavior Survey
(YRBS). Judging from this feedback, it is clear that the changes we made to incorporate wording
from a parallel NHANES question made the question more confusing. Consequently, we replaced the
draft question with the original YRBS question.

• Q8a, Q8b, Q9a, and Q9b. Two questions in the draft version of the survey inquired about time spent
in various screen-time activities on school days and on non-school days. The parent participant
reported that the term “average” day is hard for their child to understand and recommended changing
it to “usual.” The child who completed the survey unaided reported that it was unclear how to record
the 1.5 hours he spent on these activities in the open-ended response space in the survey.
In response to this feedback, we adjusted the wording of the questions to ask about “usual” days. We
also replaced the open-ended response spaces for reporting time spent with the categorical response
options used in YRBS. In addition, for both school days and non-school days, we split the question to
ask about two different types of screen time: (1) time watching TV or videos; and (2) time playing
video games, accessing the Internet, or using social media. To reflect this change, we will adjust the
screen time outcome for RQ 1 in Table VII.3c in the final Study Plan from mean number of hours of
screen time to (1) mean number of hours watching TV or videos, and (2) mean number of hours
playing video games, accessing the Internet, or using social media.

• Q10 through Q18. The child who completed the survey independently found these questions easy to
answer. The child who was assisted by the parent responded “sometimes” after the parent read the
response options for Q10 aloud. The parent subsequently decided not to read the response categories
to the child and the child responded with a “no” for all remaining questions in this section. The parent
thought these questions would be hard for younger children to answer and suggested a yes/no option
might work better.


When we asked why the child said “sometimes” in response to Q10, the child was unable to say why.
The parent stepped in and said that they had a well-stocked refrigerator and pantry and that even if
they ran out of money, they would not make the child aware of this information. The father
speculated that sometimes the child may have asked for chips or frozen pizza, and the items weren’t
in the house at the time. The mother concurred with the father’s speculation.
We did not make any changes to these questions since the goal is to use the Connell food security
module9 verbatim, and the child who completed the questions independently did not report any issues.
We note, however, that the Connell module was tested with children 12 and older, whereas our
sample will include younger children (10- and 11-year-olds). If FNS wishes to simplify the response
options for younger children, we can do this. Alternatively, we could add a gate question similar to
the question used in the full (adult) food security module, so children who report being food secure
can skip the remaining questions. We will incorporate any changes FNS requests into the final
version of the Teen Survey.
• Q19 through Q23. Pretest participants were readily able to answer these questions, so we did not
make any changes.
The child who completed the survey independently did so in about 11 minutes. The parent who
administered the survey reported that completion took 22 minutes. We believe the current response
burden of 10 minutes remains reasonable, given the amount of time the 10-year-old took to complete the survey independently and the fact that older children might well be able to read and respond to the survey even faster.

C. Translation methodology
Highly proficient translators are translating the instruments and recruiting materials for providers,
parents/guardians, and youth. They are using existing translations from the FNS website as guidance for
names of programs and benefits to ensure that respondents recognize the terminology in their language.
They are also using a neutral Spanish dialect and choosing language appropriate for a wide range of
educational levels to facilitate understanding. All the instruments and most of the recruiting materials use
formal language, but the recruiting call scripts are more conversational. All documents will use the formal
second person usted to address participants.
After a document has been translated, another member of the team will review the work to ensure an accurate translation of the English document. For example, the reviewer will identify any divergence in
meaning between the original and the translation and check for potential misinterpretations due to
negative connotations or secondary meanings in the translation. The original translator and the reviewing
team member then go through a reconciliation process to discuss and resolve any issues found during
translation and review. Frequently recurring language or challenging terms are being translated by the
Mathematica team and sent to Westat’s translators to ensure consistency across documents. When we
submit the final instruments and recruiting materials, we will describe any issues the translators
encountered and relate how the issues were resolved.
cc: Mary Kay Fox; Cassandra McClellan

9 Connell, C.L., M. Nord, K.L. Lofton, and K. Yadrick. "Food Security of Older Children Can Be Assessed Using a Standardized Survey Instrument." Journal of Nutrition, vol. 134, no. 10, October 2004, pp. 2566–2572.

