Appendix U. APEC III Cognitive Pretest Findings Report

Third Access, Participation, Eligibility and Certification Study Series (APEC III)

OMB: 0584-0530

1 Introduction and Methods

The U.S. Department of Agriculture's Food and Nutrition Service (FNS) is conducting the Access, Participation, Eligibility and Certification Study Series (APEC III). The purpose is to estimate, and identify ways to reduce, errors in payments under the National School Lunch Program (NSLP) and the School Breakfast Program (SBP). Components of the data collection include data abstractions, meal observations, and talking directly with parents and guardians, School Food Authority (SFA) Directors, and Cafeteria Managers to better understand whether there are parts of the application, certification, meal reimbursement, or claiming process that could be improved.
In preparation for study launch, Westat's Instrument Design, Evaluation and Analysis (IDEA) Services tested five data collection instruments – the Household Survey, the SFA Director Survey, and the in-depth interview guides for households, SFA Directors, and Cafeteria Managers. The goals of the testing were to ensure that 1) respondents interpret the questions as intended and can easily respond, and 2) interviewers can easily administer the instruments. All testing materials were reviewed and approved by the Westat IRB prior to recruitment and data collection. As shown in Table 1-1, IDEA Services completed 2 to 9 telephone or in-person interviews for each data collection instrument, using cognitive, feasibility, and expert review interviews. The in-person interviews were conducted at Westat's Rockville facilities. All interviews lasted up to an hour and were audio-recorded for later review and analysis. Three of the in-person interviews were observed by APEC III project staff, either in the interview room itself (when the interview was conducted in a conference room) or from a separate observation room.
Table 1-1. Completed Interviews by APEC III Instrument

APEC III Instrument | Interview Mode | Interview Type | Completed Interviews
Household Survey | In-person | Cognitive | 9
Household In-depth Interview Guide | Telephone | Feasibility | 3
SFA Director Survey | Telephone | Expert Review | 9
SFA Director In-depth Interview Guide | Telephone | Feasibility | 3
Cafeteria Manager In-depth Interview Guide | Telephone | Feasibility | 2*

* The original target was to complete 3 interviews with Cafeteria Managers.


The remainder of this report describes the recruitment strategies and respondent characteristics; data
collection methods; approach to data analysis; and issues that surfaced during testing. We also
include for FNS review and approval our recommendations for addressing the issues found in
testing. Once the recommendations are approved, we will submit all instruments with the approved
revisions incorporated in tracked changes.

1.1 Recruitment Methods

1.1.1 SFA Directors

IDEA recruited SFA Directors from a list of 61 contacts who participated in focus groups for Child Nutrition Analysis and Modeling (CNAM) Task 10—Healthy, Hunger-Free Kids Act Research Briefs—in 2015 and had given consent to participate in future studies. To be eligible for the current testing effort, respondents had to be knowledgeable about all SFA activities, including the application certification process, the direct certification process, meal claiming, meal counting and claiming reports, and, for districts with Community Eligibility Provision (CEP) schools, the process to determine an Identified Student Percentage (ISP) rate.

Table 1-2 provides demographic information for the nine SFA Directors who participated in the expert review telephone interviews and the three who participated in the feasibility telephone interviews for the in-depth interview guide. Only one of the nine SFA Directors participated in both the expert review and feasibility interviews; two others participated in only the feasibility interview. While most were female, respondents had a range of experience in their jobs, sizes of district, and numbers of CEP schools in their districts. Almost half reported that they conduct an administrative or secondary review of school meal applications. SFA Director respondents participated as individuals (on their own personal time), were given $75 per interview, and used their own discretion when scheduling the session time.


Table 1-2. SFA Director Respondents

Respondent ID | Gender | Years' Experience | Number of Schools in District | Number of CEP Schools | Conducts Administrative or Secondary Review
1 | M | 1.25 | 230 | 230 | No
2 | F | 16 | 4 | 1 | DK
3 | M | 2.5 | 1 | None | Yes
4 | F | 25 | 25 | None | No
5 | F | 30 | 60 | 32 | Yes
6 | F | 18 | 3 | None | No
7** | F | 2.5 | 10 | 8 | Yes
8 | F | 8 | 4 | None | DK
9 | F | 8 | 12 | None | Yes
10* | F | 5 | 18 | 5 | No
11* | F | 8 | 7 | None | Yes

* Respondent participated in the in-depth interview only.
** Respondent participated in both the survey and in-depth interviews.

1.1.2 Household Respondents

Respondents for the Household Survey in-person cognitive interviews and the feasibility telephone interviews for the in-depth interview guide were recruited through a Craigslist advertisement for parents of school-age children who had applied for free and reduced-price school meals in the 2015-2016 school year. All respondents had a child in elementary, middle, or high school and had filled out an application for free and reduced-price school meals during the 2015-2016 school year. During screening, respondents were also asked if they currently have a child who receives free or reduced-price school meals, and if they receive Temporary Assistance for Needy Families (TANF) or Supplemental Nutrition Assistance Program (SNAP) benefits.

Table 1-3 shows demographics for the nine parents who participated in the cognitive and feasibility interviews, including information on receiving free or reduced-price lunch and TANF or SNAP. Respondents were given $50 for completing the Household Survey cognitive interview and an additional $20 for compiling and bringing income documentation with them to the cognitive interview. Of these nine respondents, three agreed to participate in the in-depth feasibility interview, for which they were given an additional $30.


Table 1-3. Household Survey Respondents

Receive Free or Reduced-Price Lunch | Receive TANF or SNAP | Age | Gender | Hispanic | Race | Education | Household In-depth Interview Respondent
Yes | No | 29 | Female | No | Black | Some college | X
Yes | No | 45 | Female | No | Black | College |
Yes | No | 38 | Male | No | Black | Some college |
Yes | No | 34 | Female | No | Black | College |
Yes | Yes | 28 | Female | No | Black | Some college |
Yes | Yes | 41 | Male | No | Black | High school |
Yes | Yes | 31 | Female | Yes | Black | College |
No | No | 34 | Female | Yes | Hispanic | Some college | X
No | No | 35 | Female | No | White | College | X

1.1.3 Cafeteria Managers

As shown in Table 1-1, we aimed to complete feasibility interviews to test the in-depth interview guide with three Cafeteria Managers. We searched for Cafeteria Managers at the elementary, middle, and high school levels by posting Craigslist ads and by utilizing personal networks. We found two respondents through personal networking; the Craigslist ad yielded no responses. To be eligible for the Cafeteria Manager interviews, respondents needed to be the person in charge of meal counting and claiming records for the breakfast and lunch school meals program, and the person in charge of running the cash register in the school cafeteria. In addition, they needed to be extremely or very knowledgeable about the meal counting, claiming, and recording processes at their school. Ultimately, we conducted two feasibility interviews by phone with the Cafeteria Managers shown in Table 1-4. They participated as individuals (on their own personal time), received $50 for participating, and used their own discretion when scheduling the interview time.
Table 1-4. Cafeteria Manager Respondents

Age | Gender | How Knowledgeable About Meal Counting, Claiming & Recording Processes? | How Long Working in Current Position? | How Long Working in Cafeteria Operations? | Grade Level of School | Number of Students in School
42 | M | Extremely | 2 years | 6 years | Head Start–6th | 630
59 | F | Very | 5 years | 18 years | 6th–8th | 600


1.2 Data Collection

Three trained senior interviewers conducted the interviews. The interview sessions lasted up to 60 minutes and included the following:


• The interviewer administered the study introduction—explaining the study purpose and the respondent's rights as a research subject.

• Respondents in in-person Household Survey interviews were asked to read and sign a consent form to document awareness of the voluntary nature of participation, confidentiality, and agreement with the discussion being recorded. For all other interviews, the telephone interview respondents were asked for their verbal consent after interviewers explained the voluntary nature of their participation and confidentiality, and requested that the interview be audio-recorded.

• For in-person interviews, which were all conducted at Westat, respondents were informed that observers may be present.

• The interviewer followed the interview guide and administered scripted probes.

• At the conclusion of the interview, the respondents were provided an opportunity to offer any additional feedback or reactions.

• After the end of the session, the respondent was thanked for participating. Cash incentives were given to in-person interview respondents, and checks were mailed to telephone interview respondents.

The three different approaches used to test the five data collection instruments – expert review, feasibility interviews, and cognitive interviews – are described below.

1.2.1 Expert Review Interview: SFA Director Survey

The approach for testing the SFA Director Survey was an expert review interview administered by telephone, in which respondents are asked to review and provide comments on the survey without actually answering the questions. It would have been difficult, if not impossible, for respondents to track down and provide the significant amount of school meal program data the survey asks for, as well as complete a traditional cognitive interview, within the hour we were requesting from them. Providing the requested data on a paper instrument, when ultimately the survey will be conducted online, would also have added unnecessary burden to the testing task, and perhaps yielded findings and recommendations that weren't directly applicable to the online instrument. SFA Director respondents were sent the survey prior to the interview and asked to document comments and feedback about items that they thought might be difficult to understand or problematic to answer. During the interview, the interviewer and respondent went through each section of the survey and discussed the respondent's documented feedback in response to interviewer-administered scripted probes.

1.2.2 Feasibility Interview: SFA Director, Cafeteria Manager, and Household In-Depth Interview Guides

IDEA used feasibility interviews to test the Household, Cafeteria Manager, and SFA Director in-depth interview guides. Because cognitive testing is itself a form of in-depth interview, it is not the preferred approach for testing in-depth interview guides: respondents tend to confuse the tested questions with the "probes" that are asked to help evaluate the tested questions. Instead, the approach for testing these types of qualitative data collection instruments is to administer them as written and observe how respondents answer, noting any difficulties they encounter. The feasibility interviews were conducted over the phone with all three types of respondents. Interviewers administered the full in-depth interview guide, timed the process, and observed and documented issues that arose for both respondents and interviewers. Household respondents were emailed a copy of, or link to, their school district's school meal application prior to the interview, then asked to refer to the application during the interview and provide feedback about specific items and sections on the application. After completion of the in-depth guide across all the feasibility interviews, interviewers followed up on any areas of difficulty respondents encountered while answering questions.

1.2.3 Cognitive Interview: Household Survey

The Household Survey lent itself best to the traditional cognitive testing approach, whereby interviewers administered the survey instrument along with retrospective scripted probes, by survey section, to address specific research objectives. IDEA conducted the cognitive interviews in person at Westat facilities. Prior to the interview, respondents were sent an income documentation worksheet and asked to complete it on their own and bring it, along with the associated income documentation, to the interview. At the end of the interview, when observers were present, the interviewer also administered additional unscripted probes on behalf of the observers when requested.


1.3 Data Analysis

Interviewers also served as analysts. They reviewed the interview audio recordings and their own
notes to produce written summaries of each interview. Interview summaries included respondent
answers to items, discussion of responses to probes, and (where appropriate) verbatim quotes.
Analysts identified themes and patterns within the data, focusing on problems and issues with the
instruments overall as well as individual items. Themes and patterns were organized, evaluated,
synthesized, and summarized into report form.

1.4 Findings and Recommendations

The remainder of this report summarizes the issues found in each instrument we tested and provides
recommendations for addressing the issues.

2 Summary of Recommendations

2.1 Expert Review Interviews: SFA Director Survey

1. Global Issues

While most findings on the SFA Director Survey relate to a specific item, several issues that were revealed may result in more comprehensive revisions to the survey: issues with the race and ethnicity items, the inclusion of "Don't know" as a response option, and, in some cases, finding a replacement for the term "certification."

1a. Race and Ethnicity Items: A10 and D6b, c & d

Finding: Data on student characteristics are requested in items in Sections A and D. Not all respondents have access to race, ethnicity, and gender data. One respondent said she does not have direct or indirect access to the student information system where race and ethnicity information is stored. Two other respondents said the ethnicity/race data are not readily available but could probably be obtained from somebody else. One of those two respondents reported that she felt uncomfortable providing that information, since the department of education instructs them not to be discriminatory. One wasn't sure if gender is tracked.

Recommendation:
• Leave the items as they are. Add the response option "Data Not Available."

1b. "Don't know" Response Option: A12 and F3

Finding: Although this issue was only brought up in regard to two items, it may be an issue for many other items in the survey as well. For question A12 ("Does your SFA receive a NSLP 60% subsidy?"), one respondent did not know if her SFA receives the subsidy; she would answer "no." Two respondents did not know the answer to question F3 ("In what year did your district begin using direct certification?") because they were not in their current position when their district started using direct certification. One said she could get the answer from the state department or her assistant. The other didn't know how she would find the answer.

Recommendation:
• Add a "Don't Know" response option. For certain types of items, it may be more helpful to understand that respondents don't know the answer rather than force them to choose an answer that may not accurately reflect their situation.

1c. Use of the term "certification"

Finding: Some respondents confused "certification," when referring to the approval of applications for free or reduced-price meals, with "direct certification."

Recommendation:
• Define "certification" and, where needed, define "direct certification" based on the National School Lunch Act, regulations, and FNS guidance such as the Eligibility Manual for School Meals.

1d. D6e, f, & g

Finding: These items refer to "students certified for free meals, reduced price meals, or paid meals." One respondent pointed out that the use of "certified" is too close to the term "direct certification," which has a very specific meaning. He suggested the term "meals by application" instead. For another respondent, the term "certified" means the list of students the state sends who receive food assistance, so she would use the numbers from the state. She also said that for D6f she does not have any "certified" students, but she does have students who are "approved through their application." Another explained that the term "certified" is not used to talk about "paid" students, but that it is okay to use it to talk about free and reduced-price students.

Recommendation:
• One respondent suggested the following replacement phrasing in D6e, f, & g:
  D6e. Number of enrolled students approved for free meals by application (Non CEP School).
  D6f. Number of enrolled students approved for reduced price meals by application (Non CEP School).
  D6g. Number of enrolled students determined as paid meals status by application (Non CEP School).

1e. E8, E9, E10, & E13

Finding: These items refer to either a student's certification status or the certification process. One respondent explained that using the word certification "gets you mixed up with the direct certification, and they're two separate things. They're just wanting the results of a free and reduced application that has been processed. I really wouldn't call that certification." Another respondent said "eligibility" is easier to understand. Another said to use "free and reduced status" instead of "certification."

Recommendation:
• Use additional clarifying language for certification in this section. The question(s) will include an example, or a clarification of the definition and/or intent of the question (E7, E8, E9, E10 & E13).

2. Survey Instructions

Finding: No consistent problems, but three respondents made suggestions worth noting.

Recommendations:
• For the online SFA Director Survey, provide the survey as a PDF for respondents to print out and use to record data as they gather it in preparation for entering it online.
• Include in the instructions assurance that if the respondent stops partway through the online survey, they can come back to it later with their entered answers saved.
• Include in the instructions that the online survey will take an hour or less to complete.

3. Q1

Finding: The first three questions on the survey ask how data will be reported and for what dates. The first question, Q1, uses the term "opportunity to participate," which was confusing for two respondents in California. California's ED Code 49558 requires all schools to offer at least one meal per day. The question implies that schools have a choice, but not in California: the number of enrolled students is the same as the number who have the opportunity to participate in the program.

Recommendation:
• Add the following instruction to Q1: "Select both if all enrolled students have the opportunity to participate."

4. A3

Finding: A3 asks, "How many public school districts or legal entities are in your SFA?" One respondent thought the term "legal entities" sounded odd and said it's not commonly used in this context. He suggested using "private school districts" instead.

Recommendation:
• Revise A3 to: "What is the total number of school districts (public and private) in your SFA?"

5. A7

Finding: A7 asks how SFAs manage their food service operations. The phrase "consulting company or independent consultant" was unfamiliar to one respondent, who suggested defining the terms used or using "vended meal company." The respondent uses a vended meal company but was unsure if he would answer "yes" to this question, even though he said third-party companies that bring food ready to go "do try to consult with us and tell us what to do."

Recommendation:
• Include "vended meal company" in A7 so that the item reads: "Is your SFA food service operation under the direction of a food service management company, or does your SFA use a consulting company, vended meal company, or independent consultant to help plan or manage food service operations?"

6. A8

Finding: A8 gives instructions, in parentheses, on how to define elementary schools. In general, respondents tend to overlook text in parentheses. One respondent appears to have missed the parenthetical instructions for how to define the elementary schools category. She noted that she intentionally excluded Pre-K students because of the word "access" in the second column of A8.

Recommendations:
• Remove the parentheses from the question stem. Also, move the instructions to the beginning of the paragraph, where respondents are less likely to miss them because they are "forced" to read them before the question is posed. See the revised instructions below.
• A8. "Elementary schools are most typically thought of as grades K-5, middle schools or junior high as grades 6-8, and high schools as grades 9-12. In the table below, please record the number of schools and enrolled students for your entire SFA, overall and by type of school. Record the information as of [DATE]. If your schools don't align with the categories listed in the table, please fit them as closely as possible."

7. A8 & A9

Finding: A8 and A9 ask for data by grade level and by type of meal plan, respectively. The difference between A8 and A9 was not clear to one respondent, who said they seemed repetitive. She suggested combining them.

Recommendation:
• Label the data tables in A8 and A9 to help respondents understand the difference. For example:
  A8: Schools and Enrolled Students by Grade Level
  A9: Schools and Enrolled Students by Type of Meal Program

8. A13

Finding: The question "Does your SFA receive a SBP severe need subsidy?" implies that it is asking about all schools in the SFA; one respondent said the subsidy is provided on a per-school basis. One respondent did not know what "SBP severe need subsidy" meant and had to Google it.

Recommendations:
• Revise A13 to: "Do any schools within your SFA receive a SBP severe need subsidy?"
• Provide a definition of the subsidy in question A13, in the same manner a definition is provided in A12 for the 60% subsidy.

9. A14

Finding: Respondents whose SFAs included CEP schools struggled a bit with this item, which asks for the number of meals claimed for reimbursement by program type (NSLP and SBP) for the entire SFA. One respondent wondered if he should include meals claimed from the CEP schools in the "# of Free Meals Claimed" column. Another felt that the column categories do not make sense for an SFA that is 100 percent CEP. That respondent said, "We don't claim free meals, reduced price meals or paid meals, we just claim meals."

Recommendation:
• If possible, use programming to tailor the column headers based on whether respondents have CEP schools in their SFAs.

10. B1

Finding: B1 begins with an explanation of what happens under Provisions 2 and 3. One respondent didn't understand the requirements for Provision 2 and 3 schools, noting that the explanations describe what the reimbursements are based on, not what qualifications the school or SFA must meet to use either of the Provisions. She looked it up on her state's (Indiana) Department of Education website, where she said she found a brief definition that explained the qualifications clearly.

Recommendation:
• Revise the definitions of Provisions 2 and 3 to include the qualifications a school must meet to use them.

11. B2

Finding: Four respondents had trouble selecting a response option as the primary reason their SFA does not participate in Provision 2 or 3. Two of the four noted they do not participate because CEP is a better option for their schools. One of them explained that he didn't feel like any of the response options fit his situation because "everybody's using Community Eligibility." The other said it would be easiest to answer in the "other" category rather than selecting option 4, "Provision 2 or 3 is not economically beneficial or appropriate for our schools." The other two were unfamiliar with Provisions 2 and 3 and did not know why they do not participate. One of those two also had CEP schools and was sure they don't qualify for Provision 2 or 3 but did not know why.

Recommendation:
• Add a response option that allows respondents to indicate that they're using CEP instead of Provision 2 or 3, such as the example one respondent provided: "Community Eligibility (CEP) is a better option for our schools."

12. C1

Finding: Three respondents said they would have difficulty selecting a response option about their SFA's participation in CEP. One respondent felt that there is no option for SFAs that serve more than one district where there's a "mixed bag" of CEP and non-CEP districts: "It doesn't allow you to say I have one LEA that's 100% and this other situation for these other LEAs." Another respondent, in a district with only one group of schools, said the third response option ("only groups of schools within LEAs") confused her because it said "LEAs" in plural form. Yet another respondent was unfamiliar with the term LEA; she has only one district and no CEP schools and said she would skip the item.

Recommendations:
• Move the last response option, "No participation in CEP," to the top of the list so that respondents for whom the rest of the response options don't apply can quickly find their answer.
• No other recommendations for revising this item. It's unclear why the other respondents couldn't fit their situations to the existing response options, since the list seems to be mutually exclusive and exhaustive, and it does include options for the situations they described along with a place to answer for "other" kinds of situations.

13. C2

Finding: Question C2 asks for the primary reason the SFA did not elect CEP for any schools. One respondent wanted a response option that explained that they participate in Provision 2 or 3.

Recommendation:
• Add a response option to C2: "Provision 2 or 3 is a better option for our schools."

14. D2 & D3

Finding: D2 and D3 ask for the start and end dates of the school year for the sampled schools. One respondent pointed out that D1 through D3 are repetitive if the response at A4 is "yes" (all schools start and end on the same date) and dates are provided in A5 and A6.

Recommendation:
• Skip respondents out of D1 through D4 if they answered "yes" in A4 and provided dates in A5 and A6.

15. D11

Finding: D11 asks if a school participates in CEP with the entire LEA. Two respondents found the survey's use of both "SFA" and "LEA" confusing. One respondent, with an SFA that serves two districts, said it would make more sense to ask D11 about her SFA.

Recommendations:
• Provide definitions for SFA and LEA.
• Replace "LEA" with "SFA" in D11, as below:
  D11. "Is [SCHOOL] participating in CEP with the entire SFA, as an individual school or as part of a group of schools?"

16. E3 and E4

Finding: E3 and E4 ask about the types of technical assistance available. One respondent explained that it would be hard to select one type of assistance, since they often provide all types of assistance.

Recommendation:
• Program E3 and E4 for "Check all that apply."
17. E7

Finding: This question asks about the number of staff with responsibility for reviewing applications and "determining certification status." Some respondents said it's confusing to use the word "certification" because it calls to mind "direct certification." Two respondents commented that they hire temporary help for a short period of time at the beginning of the school year; they were unsure if they should count the temporary workers.

Recommendation:
• Use additional clarifying language for certification in this section.

18. E9

Finding: This question asks how long certification status is extended into the current school year. Three respondents pointed out that the requirement to extend eligibility is for 30 days and that 30 days is not 1 month. Two respondents said the time period was closer to 6 weeks; one of them said she would have to decide between the response options for 1 or 2 months.

Recommendation:
• Revise E9 to include response options that accurately reflect the SFA requirement to extend eligibility for 30 days, and provide clarifying language for "certification":
  • Less than 30 days
  • 30 days
  • More than 30 days
  • Until a certain date (specify)
  • Some other time period (specify)
19. E10 & E12

Finding: Eight respondents were asked if they would be more likely to enter a long, detailed response or a short, general response to these open-ended questions about the application and certification process. All eight said their open-ended responses to E10 and E12 would be short. One said the answers would be just 2 to 4 sentences. Another said she could answer by giving 6-8 simple points about what she does.

Recommendation:
• Leave the items as they are. Because the size of the answer space is a visual clue to respondents about the length of the answer they're expected to provide, display space appropriate for a brief paragraph, or up to 5-6 sentences of description.

20. E12

Finding: Four respondents did not understand what this question is asking ("How are the household's responses about their application recorded?"). One, who was also confused about the use of "certification" in E8, E9, and E10, wondered if this question was about certification, or if it referred to the questions that households sometimes write on their applications. Another wondered what part of the process the question was about, "the letter of eligibility or what?" Another was confused by the word "responses" in the question; she said responses given "on" the application are recorded, but not responses "about" the application, such as comments or complaints. One respondent didn't know if the question refers to the calls she makes to households regarding a question about their application or to the denial letters she sends to households that don't qualify.

Recommendation:
• Revise E12 so that it is more clearly linked to E11, which is about processing households' applications: "Once a household is contacted regarding questions about their application, how are their answers about their application recorded?"

21. Section F: Direct Certification for Non CEP Schools

Finding: One respondent with a "100% CEP SFA" felt that this section should apply to his SFA, since "direct certification without application, that's the heart of what we do." If asked, he would answer "yes" to F1.

Recommendations:
• Leave as is; this section is designed for Non CEP SFAs.
• Add questions on State- and District-level matching to the CEP section.

22. Section G: Verification

Finding: On the paper version it is not clear whether SFAs with 100% of their schools as CEP schools will automatically skip this section about the verification process.

Recommendation:
• Program a skip for Section G for SFAs with 100% of their schools as CEP schools, and insert skip language on paper copies.

23. H1

Finding: H1 asks about manual and automated processes for meal counting and claiming. Two respondents questioned the order of H1a, b, and c. One felt that "b" (Point of sale meal counting) should come before "a" (SFA preparation and submission of meal reimbursement claims to state agency) because it happens first. Another felt that "a" should come after "c" (School preparation of meal counts submitted to SFA) because the first step is getting claim data from the Cafeteria Manager; then that data are sent to the state.

Recommendation:
• Revise the order of items in H1 to match the real-time order of steps for meal counting and claiming:
  a. Point of sale meal counting
  b. School preparation of meal counts submitted to SFA
  c. SFA preparation and submission of meal reimbursement claims to state agency

24. H2

Finding: H2 is an open-ended question about ensuring the accuracy of meal counting and claiming. Five of the six respondents who were asked about this question said their open-ended responses to H2 would be short, perhaps a couple of sentences or a couple of short paragraphs. Just one was not sure how much detail was being requested and said the description "could get very lengthy." He suggested asking, "What error checks do you have in place?"

Recommendation:
• Leave the item as is. Because the size of the answer space is a visual clue to respondents about the length of the answer they're expected to provide, display space appropriate for a brief paragraph, or up to 5-6 sentences of description.

25. I1 and I5

Finding: These items ask about training "during the past 12 months." Respondents were probed on what time period they would consider when answering these items. Five of the six who were asked this probe said "the last 12 months," as instructed. Several noted that it makes sense, since training is done annually. One respondent didn't notice the instructions and said "the last school year" when probed.

Recommendations:
• Leave the items as they are.
• Ask follow-up question(s) for any SFA Director who answers "no."

26. I3

Finding: Item I3 asks about the topics that are covered in training. Respondents made suggestions to revise and add to the training topics listed.

Recommendations:
• Include the following additional training topics in I3:
  • The meal line training
  • "Smart Snacks" training
• Revise response option #7: "Approval for meal benefits."

27. I4

Finding: This question asks, "What types of staff received your training?" One respondent suggested adding "Food Service Director" to the response option list.

Recommendations:
• Add "Food Service Director" to the list of staff who received training.
• Revise the question to remove "your."

28. I5

Finding: I5 asks about technical assistance received from the State Agency. Two respondents asked what is meant by "technical assistance." One asked if it means calling with a question and also wondered if the required web-based training should be included. Another thinks of "technical assistance" as in-person, on-site training and wondered if it included the continuing education all administrators receive at the state conference.

Recommendation:
• Add a definition of "technical assistance" to I5.

29. I9 and I10

Finding: These questions are about school visits. Two respondents were confused about who the questions assume is doing the visiting. One said, "Who are we asking that was visiting? That probably would need to be a little more specific." She did not know if the questions were asking whether she, as the SFA, visits each site, or someone else; she explained that she conducts visits to each of the 4 schools in her district daily. Another respondent was not sure if the question was asking about her or the state.

Recommendations:
• Revise the question language to clarify who is doing the visiting:
  I9. "How many schools does your SFA visit in a typical year?"
  I10. "What percentage of all the schools in the SFA are visited by your SFA?"
• Revise the questions to clarify that the visit is specifically for monitoring (and define monitoring).

30. I12

Finding: I12 asks, "Is there a records review component that is conducted remotely?" One respondent explained that "for us there's no remote because we're one school." Another did not know what it meant; she said she would answer "yes" because "we can't over claim free and reduced students when you're only approved for so many…, that would be remotely from my computer, it wouldn't be done at the school." Another was generally confused: "I don't know what a records review is."

Recommendation:
• Provide text that clarifies what is meant by "review component" and "remotely" in this question.

31. Wrap Up – "Are you aware of errors or mistakes in the certification and meal claiming processes in your district?"

Finding: One respondent believes errors are introduced because parents are not always truthful and the SFA does not or cannot follow up to verify parents' claims. One respondent suggested adding a question about whether any errors were found during an audit. One respondent, with 3 schools under 1 roof, also serves as the Cafeteria Manager. She said the only errors that could occur are when a student doesn't take the complete reimbursable meal. She knows when that happens and stops it "99% of the time": she either asks the child to come back and get what they need, or she records it and reduces her total number by that student who didn't take the full meal.

Recommendation:
• No recommendations. The issue of errors is addressed in the in-depth interview. The other two comments are informational, not requiring survey revision.

2.2 Feasibility Interviews: SFA Director In-depth Interview Guide

32. Section A: Application Certification Errors and Section B: Direct Certification Errors

Finding: All three respondents commented that questions in these sections felt repetitive, and during questioning they gave information that overlapped between sections. One respondent had already answered all items in Section B when answering Section A. Another could not answer most of the questions in Section B because they were not applicable to her district. SFA Directors seemed reluctant to admit errors, or unaware of the errors being made, so the focus on errors and the similar phrasing of questions often did not yield new information.

Recommendations:
• Replace the word "certification" with a more appropriate phrase, such as "approved for" (or some variation of it, such as "determined eligible by application"), in Section A.
• Section B currently lacks any introductory text. Insert text to highlight the transition and clarify the difference between A and B: "We just finished covering your district's process for certifying applications. Now let's focus on the direct certification process."

33. Section A, Question 4: "What changes, if any, have there been over time in the types and extent of errors that occur in the certification process? If so, what kinds of things have contributed to those changes?"

Finding: Two respondents said this question was redundant and that they had already answered it in previous items.

Recommendation:
• Leave the question as is; it is asking about change over time and is not redundant. We will provide clarification or an interviewer probe on how this question is distinct.

34. Section C: CEP/ISP Errors

Finding: For the two SFA Directors who received these items, interviewers noted that the transition was abrupt and that adding language explaining the new section would help with flow. One respondent asked what an ISP rate was.

Recommendations:
• Add transition language before questioning: "Now let's talk about your ISP rate and the sources you use to determine it."
• Provide a definition for the ISP rate.

35. Section D: Meal Claiming Errors

Finding: Respondents were unsure what part of the process "meal claiming" was. Two respondents had difficulty understanding what this section was about because they did not call this step "meal claiming," despite the explanation in the introductory text. One respondent asked for clarification and called this moment "point of service." Another considered "meal claiming" to be the time when claims are keyed and reported after meals have been distributed, "when I actually am going to submit my claim"; she called the moment at the cash register "identifying reimbursable meals."

Recommendation:
• Revise the introductory text to further clarify what is meant by "meal claiming," as below: "Meal claiming is the point at which cafeteria staff identify meals as reimbursable or not reimbursable. Meal claiming errors occur at the end of the serving line, after a student has filled his or her tray, typically at the point of sale or cash register."

36. Section E: Training

Finding: The introductory text reads, "Now I'd like to learn more about training and guidance for SFA staff." In one small school district the SFA Director was the only staff member involved in the certification process, so reading "SFA staff" was awkward.

Recommendation:
• Change the text to allow the interviewer to read either "SFA staff" or "you."

37. Section E: Training, Q5: "What type of training or guidance does the person who is in charge of the meal counting and claiming receive?"

Finding: Two respondents were not sure how to answer and had to ask for clarification about which person this question refers to. One needed to clarify whether the intention was a person at the site, at the central office, or at the state; the phrasing "in charge of meal counting and claiming" did not make sense for her. Another respondent was not sure if training meant the administrative training she had received as an SFA Director or the training for cashiers in her schools.

Recommendation:
• Revise the question so it is asked in two parts, first to learn, from the respondent's view, who is in charge of meal counting and claiming, and then to learn what training that person receives:
  "Who completes the meal counting and claiming paperwork?"
  "What type of training or guidance does that person receive about how to complete the meal counting and claiming paperwork?"

2.3 Feasibility Interviews: Cafeteria Manager In-depth Interview Guide

38. Global Issues

Finding: Items were repetitive, causing interviewers to either skip questions or respondents to repeat information they had already given.
• For both respondents, training came up organically while they answered questions in the first three sections. Probing further at the time the respondent mentioned training would have allowed for a more natural flow than waiting until the end of the interview to repeat subject matter.
• Conceptual boundaries between Sections A and B seemed unclear to one respondent, who focused her answers on how they track which students get free, reduced, or paid meals when responding for both sections.

Recommendation:
• Allow interviewers to move around in the protocol and ask questions out of order, or skip questions if they have already been answered.

39. Section A, Q4: "What are some challenges to knowing if a meal is reimbursable?"

Finding: Q4 is repetitive and elicits information already given. Two respondents had already provided this information in Questions 1 and 3.

Recommendation:
• Move Q4, Section A to an "if needed" probe in Q1.

40. Section A: Serving Meals and Section B: Recording Reimbursable Meals

Finding: Section A starts vaguely, so respondents have difficulty focusing their answers, and its content potentially overlaps with Section B in respondents' understanding.
• One respondent answered about recording meals during the serving section. The interviewer needed to emphasize "serving" to keep the respondent focused when answering.
• Both interviewers needed to use all probes for Question 1 because respondents were not providing the type of information being sought.

Recommendations:
• Better define Sections A and B to direct respondents to the desired subject matter. To help respondents better distinguish between the sections, change the introductory text for each section to emphasize serving and recording meals, respectively.
• Introduce Section A with the text, "I'd like to learn more about the process of serving reimbursable meals. Let's start there."
• Introduce Section B with, "Now let's talk about the process of recording a reimbursable meal. We would like to better understand how a student's meal could be missing some necessary components and still be recorded as reimbursable."

41. Section D: Staff Training

Finding: One respondent declined to answer several probes for Question 1. He did not feel comfortable answering because he did not want to give negative feedback about his supervisors. He commented, "You say nothing gets back, but you don't want to throw your superiors under the bus."

Recommendation:
• Add brief text to the introductory section reminding respondents that their answers will not be shared, will not affect their job or their school district, and will be used to help make improvements.

2.4 Feasibility Interviews: Household In-depth Interview Guide

42. Global

Finding: Web-based applications are designed differently across school districts. Both interviewers and respondents had difficulty quickly navigating the differing formats.
• One web-based application could not be viewed in its entirety; instead, the applicant had to enter information section by section. The interviewer had difficulty following along with the respondent as she entered information into each screen. Entering fake data to keep up with the respondent did not always land the interviewer on the same screen as the respondent.
• One respondent had difficulty finding the appropriate sections in a PDF of her electronic application. The interviewer needed to guide her to the correct section.

Recommendation:
• To mitigate potential navigational issues during interviews about online applications, print screenshots for interviewers to reference. Also, develop step-by-step navigation instructions for online applications and have interviewers give them to respondents, to ensure respondents land on the screens that are relevant to the interview questions.

43. Global

Finding: Numbering restarts frequently, sometimes within the same section.

Recommendation:
• For ease of interviewer use, number items sequentially from the beginning to the end of the questionnaire.

44. Section A: Experience Completing the Application, Q2 – IF NO: "How did you choose to ask the person you just mentioned?"

Finding: The probe was awkward to administer.

Recommendation:
• Reword this part of the question as: "What made you think of that person/those people?"

45. Section A: Experience Completing the Application, Q3 – "How much time did you have from when you first received the application to when it had to be turned in? In your opinion, was it enough time to gather the information you needed to complete the application?"

Finding:
• One respondent answered "the next day," saying how quickly she completed the application rather than how much time she was given to do it.
• Two questions are read in succession, making it difficult for respondents to answer both.
• The wording "received" and "turned in" implies a hard-copy application, which is inaccurate if the person completed an online application.

Recommendations:
• Rephrase the first sentence to emphasize the time that respondents were given to return the application, and remove language that suggests a paper application was used: "How much time were you given to complete and submit the application?"
• Display the second question as a sub-question so that the two are read and answered separately:
  a. "In your opinion, was it enough time to gather the information you needed to complete the application?"

46. Section A: Experience Completing the Application, Q1C (pg. 2): "If so, why did you choose to fill out the application on-line?"

Finding: This wording makes it sound as if the respondent needs to justify his or her decision.

Recommendation:
• In general, it is good practice to avoid the word "why" when asking for the reasons people engage in any given behavior; it can evoke a defensive reaction, or imply that there is a "right" answer. Reword this part of the question as: "What made you decide to fill out the application on-line?"

47. Section A: Experience Completing the Application, Q1D (pg. 2): "Were you able to complete the application over time or did it have to be done in one sitting?"

Finding: This wording assumes the respondent completed the application over time. The one respondent who completed the application online did so in one sitting and was not sure if she would have been able to save and come back.

Recommendations:
• Reword as: "Did you complete the application over time or all in one sitting?"
• If the respondent reports completing the application all in one sitting, ask: "Do you think you would have been able to leave and come back if you needed to?"

48. Section B: Understanding Application Components, Q2 – "Let me ask about a couple of types of income that can be confusing to people. If you were to receive income from child support would you enter that information in the income box? Explain. What about income and alimony payments? Explain. Income from public assistance? Explain."

Finding: One respondent was not a native English speaker and was relatively new to the United States. She was not familiar with the terms "child support," "alimony," and "public assistance." The interviewer had to provide extensive explanations for the respondent to understand.

Recommendations:
• Have definitions ready for interviewers to use in case respondents need more explanation of terms, including child support, alimony, public assistance, and worker's compensation.
• Add a probe: if the respondent does need an explanation, first ask what they think the terms mean before providing the definition.

49. Section B: Understanding Application Components, Listing Income, Q1 probe: "Was it unclear as to whether the income needed to be before or after taxes and other deductions?"

Finding: This probe was very important for the respondent who was not a native English speaker. She initially was confident in her response, but after receiving this probe she had to think through what gross and net income mean. The application she completed switched between asking for gross and net, which she found confusing.

Recommendations:
• To better assess income reporting, and whether respondents understand the differences between gross and net income, make this a required question and not an "if needed" probe.
• Revise the question to read: "Was it clear whether the income you reported needed to be before or after taxes and other deductions?"

50. Section B: Understanding Application Components, Listing Income, Q4 – "Now I'm going to read you the application instructions about reporting a child's income. [INTERVIEWER READ INSTRUCTIONS ON APPLICATION]. Based on those instructions…"

Finding: In two applications the instructions to be inserted into the question were long and awkward for the interviewer to read in full.

Recommendation:
• Have interviewers direct respondents to the instructions in the application and allow them to read the instructions to themselves. Interviewers could prompt with the first sentence or two if necessary.

APPENDIX U. APEC III COGNITIVE PRETEST FINDINGS REPORT

2.5

Cognitive Interviews: Household Survey
Household Survey Findings

51

Section B: Participating in School Breakfast and Lunch
Programs.

Recommendations


This section begins with a definition of school meals
served through the school breakfast and lunch
programs. Interviewers found B1 lengthy and repetitive
to read. Specifically, the word “school” or “schools”
appears up to 8 times in 4 sentences; “meals” appears
4 times; and the words “breakfast” and “lunch” appear
3 times.

52

Section B: Participating in School Breakfast and Lunch
Programs.

Revise the text so that it is more
concise and less repetitive.
B1. “The next questions are about the
meals TARGET STUDENT eats at school.
I am going to ask about whether your
child had a school breakfast or lunch
each day during the last full week of
school. I am referring to the meals
provided under the School Breakfast and
School Lunch Program. They are the
meals that are on the menu for free or a
single price, as opposed to individual
foods, such as salads, meats, and
desserts that are priced and bought
separately.”



Allow interviewers to enter data for
each date when respondents make
statements about the entire week.



Field interviewers will have a CAPI
instrument programmed to fill the
dates based on the answer to B2. In
addition, provide field interviewers with
a calendar for reference.



Ask separate questions for breakfast
and lunch on questions C1a, C1b, and
C2a.

Questions about daily school attendance and
participation in the school breakfast and lunch program
were repetitive when the respondent states that the
student attended or ate breakfast/lunch at school every
day of the week.
Interviewers went off script several times and recorded
responses based on respondents’ overall statement
about the entire week. All interviewers agreed that
administering the items by date were at times repetitive.
53

Section B: Participating in School Breakfast and Lunch
Programs.
The questions ask for the dates of the last full week of
school and student attendance on each date. Interviewers
had some difficulty going through the days and dates of
the prior week without the help of a calendar.

54

Section C: Perception of School Meals.
The questions in this section ask about student and
parent satisfaction with school meals. Respondents
have difficulty with Section C questions when they have
different answers for school breakfasts versus school
lunches. The questions are not broken down separately
by breakfast and lunch as done in Section B.


Two respondents had difficulty answering because
they didn’t know if the question was asking about
breakfast or lunch.



One child liked the lunches but not the breakfasts.



One parent felt differently about the “healthfulness”

APPENDIX U. APEC III COGNITIVE PRETEST FINDINGS REPORT

Household Survey Findings
of the breakfasts and the lunches served.
55

Section C Response Lists

Recommendations



Add “or” after “Somewhat dissatisfied” in
the response list. For all questions where
the response options are read aloud, be
sure they are scripted conversationally
for ease of interviewer administration.



Revise C1c to become two questions:
1) asking about food served in the
Breakfast Program and 2) asking about
food served in the Lunch Program.



Same change for C2b.



Leave as is.



Add “or” after “Somewhat difficult” in
the response list.



Eliminate the “/” in “need/request.”
Develop two separate questions for
“need” and “request.”

For smoother reading, cognitive interviewers
spontaneously inserted “or” after “Somewhat
dissatisfied” in the response list.
56

C1c
Two respondents found C1c (satisfaction with the food
program overall) difficult to answer on behalf of their
young children (Kindergarten and 4th grade), noting
that their children do not even know about the
program.

57

C2a
In C2a (parent satisfaction with “the healthfulness of
the food”), one respondent mis-heard “healthfulness”
as “helpfulness” and another heard it as “healthiness.”

58

Section D: Perceptions of the Household Application.
D1 – For smoother reading, interviewers inserted “or”
after “Somewhat difficult” in the response list. Cognitive
interviewers did this spontaneously. The more scripted
approach used by field interviewers requires revisions.

59. Section D: Perceptions of the Household Application.
Finding: D5 – The double-barreled question ("Did you request/need assistance to complete the application?") asks about both a "request" and a "need" for assistance in one question. The slash in "request/need" is awkward for the interviewer to administer aloud.
• Two respondents were probed on whether they were thinking about needing assistance or requesting assistance. Both were thinking about whether they needed assistance.
• Because items D6 – D8 are about applicants requesting assistance, it is assumed that determining whether an applicant requested assistance is an important research objective.
Recommendation: Eliminate the "/" in "need/request." Develop two separate questions for "need" and "request":
D5. "Did you need assistance to complete the application?"
IF YES:
D5a. "Did you request assistance to complete the application?"

60. Section E: Categorical Eligibility.
Finding: Question E1 asks about the student's relationship to the respondent. Interviewers do not consistently verify the exact relationship.
• Interviewers did not consistently verify whether the child is adopted, foster, or biological. For example, when a respondent answered "son" or "daughter," some interviewers recorded "NATURAL CHILD" without verification.
Recommendations:
• Provide instructions to the interviewer to read the first three response options to signal the respondent to provide the exact relationship. Instruct the interviewer to continue reading response options until the respondent provides the answer.
• Train interviewers to probe for the exact relationship when respondents' initial answers do not indicate what it is.

61. Section E: Categorical Eligibility.
Finding: E4 asks for the application month but not the year. Interviewers recorded the year even though it was not requested in the survey item.
Recommendation: Revise the question to ask about APPLICATION MONTH, YEAR. This is especially important should the survey be conducted in the second half of the school year.
62. Section E: Categorical Eligibility.
Finding: This section begins with an introduction about benefits received through government programs. Then questions E6 and E19 ask if household members receive TANF or SNAP benefits. The questions are followed by a qualifying statement about what not to include.
• Respondents answered "no" during the introduction of the Household Benefits section, before the questions were asked. Some respondents began to retrieve their documentation at that moment, when it is not necessary to look at the documentation until E7.
• It was also disruptive for interviewers to read the second sentence of E6 and E19 ("Do not include TANF/SNAP benefits received by another household member with their own TANF/SNAP case number that does not include you, your spouse, and/or your child/children"), which was placed after the question. The respondents answered "yes" or "no" to the question before the second sentence could be read.
• These issues occurred with four respondents, all of whom answered "no" to E6 and E19.
Recommendations:
• Revise the second sentence of the introduction to more accurately reflect the process, preventing the respondent from showing the interviewer their documentation but informing them that they will need it soon, before the interviewer administers E6. See the revised text below.
• Move the instructions ("Do not include TANF/SNAP benefits received by another household member with their own TANF/SNAP case number that does not include you, your spouse, and/or your child/children") to the introduction of the Household Benefits section, so that they are read prior to questions E6 and E19. See the revised text below.

HOUSEHOLD BENEFITS
The next questions are about benefits received through government programs. Soon we'll need to look at any documentation you have about payments from these programs. Do you have that ready? IF NO, GIVE TIME FOR R TO COLLECT IT.
INTERVIEWER: WHENEVER POSSIBLE, USE AVAILABLE DOCUMENTS TO VERIFY OR CLARIFY RESPONDENT'S RESPONSES.
TANF BENEFITS
Let's discuss TANF benefits. Do not include TANF benefits received by another household member with their own TANF case number that does not include you, your spouse, and/or your child/children.
E6. During [application month and year], did you, or anyone in your household, receive Temporary Assistance for Needy Families (TANF), also known as cash welfare, or [DC & VA = TANF, MD = Temporary Cash Assistance]?
SNAP BENEFITS
Now let's discuss SNAP benefits. Do not include SNAP benefits received by another household member with their own SNAP case number that does not include you, your spouse, and/or your child/children.
E19. During [application month and year], did you, your spouse, and/or child/children receive Supplemental Nutrition Assistance Program (SNAP) benefits (formerly known as Food Stamps), or [DC & VA = SNAP, MD = Food Supplement Program]?

63. Section E: Categorical Eligibility.
Finding: This item instructs the interviewer to record the period ending date on the TANF statement document. Sometimes periodic statements, rather than monthly statements, serve as documentation. The date on a periodic statement might not reflect a month.
• E11 – The interviewer recorded the date from a periodic statement, which was not a monthly statement. The interviewer did not use the option to circle "END DATE NOT FOUND ON THE DOCUMENT."
Recommendation: During interviewer training, include instruction for selecting "END DATE NOT FOUND ON THE DOCUMENT" when the documentation is not a monthly statement.
64. Section E: Categorical Eligibility.
Finding: E20 – The item asks for a statement from the application month and year. This makes it difficult for a respondent who only gets a statement once a year, not monthly.
Recommendation: Revise the question to ask for a statement, and add a follow-up question to determine whether the amount on the statement matches the amount received in the application month and year:
E20. "We need to record the total amount (you and your (child/children)/you and your spouse and (child/children)) received in (State SNAP/SNAP) benefits during [application month and year]. We can get that amount from your SNAP award statement or notification of payment. Do you have a statement or notification of the amount of your monthly benefits?"
E20a. "Is that the amount you received in [application month and year]?"

65. Section F: Household Composition.
Finding: F3 – Respondents do not consistently exclude themselves from the number of household members reported.
• Two respondents included themselves. One corrected her response after the interviewer repeated "NOT including yourself."
Recommendation: Rephrase the question so that "not including yourself" comes first:
F3. "Not including yourself, how many people live with you?"

66. Section F: Household Composition.
Finding: F15 – "Did anyone (else) not currently in this household live with you in [application month and year]?" Some respondents struggled with this item, perhaps in part because it begins with negative phrasing, which can pose a more challenging comprehension task.
• In one case the respondent asked "not lived with me?" The interviewer repeated the question and the respondent was able to answer "no."
• In another case the respondent thought it was a different question about visitors. She answered "yes" because her mother was temporarily visiting at the time she completed the application.
Recommendation: Revise the question so that it does not begin with a negative. Provide an instruction not to include visitors, and move that instruction to the front of the item:
F15. "For this next question, do not include temporary visitors. Did anyone (else) live with you in this household in [application month and year] but does not live with you now?"

67. Section G: Income and Earning Sources.
Finding: Respondents are not familiar with some of the income and earnings sources listed on the Showcard.
• During probing, some respondents were unfamiliar with Temporary Assistance, Black Lung Benefits, Alimony Payments, Payments from Large Amounts or Settlements, Private Pension, Housing Subsidy, Strike Benefits, Interest and Dividends Income, and General Assistance Benefits.
Recommendation: Provide brief definitions for some Showcard items that are commonly reported.

68. Section G: Income and Earning Sources.
Finding: The Worksheet is long and text heavy.
• One respondent thought the worksheet was easy, but not initially. She said, "At first I thought it was so long, like oh my gosh, more reading and more understanding. But once I printed it and started to fill it, it is very easy. It just took one minute or so."
• Another said, "It's really wordy. Just make it quick." This respondent stopped reading after the first row of the second table on page 3.
• Another wasn't sure what to do other than write in the name of the income earner.
Recommendations: To the extent possible, streamline the Worksheet and instructions:
• Delete repetitive phrases or sentences.
• Delete unnecessary instructions.
• Reformat tables to fit on one page.
69. Section H: Income and Earning Amounts.
Finding: The introduction to the section and the initial text in H1, leading up to the question, are very long and difficult to administer.
• The first paragraph is long and has too many examples.
• The introduction asks for documentation, which breaks up the flow when the respondent responds to the request.
• The second sentence of the introduction, "For each type of income you reported," is not applicable when the respondent has just talked about one source of income.
• The introduction and H1 switch back and forth between asking for documentation, information from the last time paid, and information from the application month.
• The third bullet in H1, "farm or non-farm business," is awkward to read in many respondents' situations.
Recommendations: Streamline the introduction and the initial text of H1:
• Delete some examples from the first paragraph.
• Delete the first part of the second sentence, "For each type of income you reported."
• Revise language to consistently ask for the application month and year.

70. Section H: Income and Earning Amounts.
Finding: H2 – This question asks "How often are these earnings paid to (you/person's name)?" Two respondents said "biweekly," but the response option says "every two weeks."
Recommendation: Revise the response option to: EVERY TWO WEEKS (BI-WEEKLY).
71. Section H: Income and Earning Amounts.
Finding: H5–H10 – These items are for the interviewer to document details about the respondent's paid income documentation. The items are difficult to administer if the respondent's documentation is a contract stating the income earner's annual salary rather than a payment statement.
• H5 has "award letter" as a response option. If this is the same as a contract, revise it to "contract or award letter."
• H6 does not have a response option for a salary contract.
• H7 and H8 refer to the application month, which does not work when the documentation is a salary contract.
Recommendations: Revise H5–H10 to accommodate contract documentation:
• H5. Revise the response option to: AWARD LETTER/CONTRACT.
• H6. Revise to: DOES THE PAY STATEMENT REFLECT EARNINGS DURING [APPLICATION MONTH], THE CURRENT MONTH, CURRENT YEAR, OR ANOTHER TIME PERIOD? Add CURRENT YEAR as a response option.
• H7 & H8. Add the appropriate skip instruction when annual contract documentation is presented instead of monthly documentation.

72. Section J: Demographic Characteristics.
Finding: J8 – This question asks "How long have you lived in the United States?" One respondent answered "6 years" immediately after hearing the question. Then, after hearing the probe "Include the total number of years/months living in the United States. The time does not need to be consecutive," she changed her answer to 4 years and 6 months, subtracting the time for two visits to her home country (once for 1 year and again for 5 months). She explained that when she heard the word "consecutive" she thought of "continuous," which made her think she was to report the amount of time "in total" that she had lived in the United States.
Recommendation: Include "IF NEEDED" before the probe and eliminate the second sentence of the probe, so that it reads: "Include the total number of years/months living in the United States."

