Attachment J. National Agricultural Statistics Service Comments and Response
NASS Comments
Study Objectives: These objectives seem reasonable given the survey instrument.
Background
Under the Food and Nutrition Act of 2008, as amended in 2014, the U.S. Department of Agriculture's (USDA's) Food and Nutrition Service (FNS) administers the Supplemental Nutrition Assistance Program (SNAP). Under a congressional amendment, participants must meet a work requirement to be eligible to participate in SNAP, with some exceptions.
SNAP State agencies operate an employment and training (E&T) program to help SNAP participants. Of note, States have some flexibility in implementing the rules concerning who is required to participate in the E&T program. USDA FNS has contracted with Westat to perform a study to assess and monitor the equity of administration of the SNAP E&T programs. All SNAP State agencies are included in this study.
Data Collection Instruments/Study
The study includes a survey of all 53 SNAP State agencies, document review, and key informant
interviews with individuals from six States. This Information Collection Request includes five data
collection instruments: (1) Survey Instrument (attachments C.1 and C.2); (2) SNAP State Agency
Interview Protocol (attachment D); (3) SNAP Local Agency Interview Protocol (attachment E); (4) SNAP
E&T Provider Interview Protocol (attachment F); and (5) Interested Parties Interview Protocol
(attachment G) [Part A OMB].
Comments/Suggestions
Overall, this seems like a very interesting study to evaluate the equity of E&T programs. However, I have a few issues, delineated below, that I am concerned could introduce bias into the results. In particular, a nonresponse strategy for the web-based survey, so that nonresponse is adequately accounted for, is important.
1. Part B.1. Web-based survey: How will nonresponse be handled? Will there be any nonresponse? Is this a "required" survey? Per the questionnaire's Public Burden Statement, it is a "voluntary" data collection. If nonresponse is expected, how it will be addressed should be discussed so as to help mitigate any potential bias in results.
2. Part B.1. Key informant interviews: One should consider laying out a sampling strategy here that properly stratifies the respondents so their output can adequately represent the population.
3. Part B.1. How was the sample size for this determined?
4. Table B.1.1. The table for Respondents and Nonrespondents is confusing due to the goal of presenting "unique" responders as the total. Consider breaking out the numbers as a Total and a Total Unique. For example, if there are 10 SNAP State Directors, but only 7 are unique given they have responded elsewhere, consider putting 10 (7). Currently the table does not add up, and the calculated response rate for SNAP State Director does not necessarily make sense given the number of non-responders. (See the worked example following this list.)
5. Section B.2. While there is no sample defined, it may still be useful to discuss estimation strategies (presumably descriptive statistics and simple aggregation) that will be presented in the final report.
6. Section B.2. Web-based survey: The follow-up plan seems adequate. However, how will any nonresponse that remains at the end be addressed?
7. Survey instrument, questionnaire. The questionnaire contains 88 questions, and the allotted time to complete is 75 minutes. A majority of the questions can probably be answered from respondents' memory, but some may require extra work to confirm or validate. For example, someone may have to spend time looking at the SNAP eligibility system to validate the answer to question 55 and others in that section. Consider reevaluating the burden, as it's possible the respondents can complete this in 75 minutes, but without in-depth knowledge I cannot confirm. (I will note that Document K, Section C, does indicate it could take more than 2 hours to complete.)
8. The web-based questionnaire has a good flow to it. Is question 39 open-ended? Or will there be options for "type of service"?
9. I perused the attachments, and they all seemed appropriate.
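To make the counting issue in comment 4 concrete, the following is a minimal worked example using only the hypothetical figures given in that comment (10 SNAP State Director respondents, of whom 7 are unique because 3 also responded under another category). The assumption that all 10 directors were eligible and responded is illustrative only, not a figure from the supporting statement.

\[
\text{Reported count} = \text{Total}~(\text{Total Unique}) = 10~(7)
\]
\[
\text{Unique-only rate} = \frac{7}{10} = 70\%
\qquad \text{vs.} \qquad
\text{All-response rate} = \frac{10}{10} = 100\%
\]

If only the 7 unique responders appear in the table, the row total understates actual participation, which is why the rows fail to add up and the computed response rate can look inconsistent with the number of non-responders.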
FNS Response

#1. Location in Supporting Statement: Part B.1. Web-based survey
NASS Comment: How will nonresponse be handled? Will there be any nonresponse? Is this a "required" survey? Per the questionnaire's Public Burden Statement, it is a "voluntary" data collection. If nonresponse is expected, how it will be addressed should be discussed so as to help mitigate any potential bias in results.
Response: For States that do not respond to the study's email requests for participation, trained data collectors will follow up by phone to encourage participation, as indicated on page 10. This is a process we have used successfully with recent studies that included a census of State agencies. One recent example, the WIC Breastfeeding Practices Inventory, resulted in an 85% response rate among State agencies. Because the surveys are not intended to generate inferential data representative of a larger population, bias resulting from nonresponse will not affect the study's findings. In response to this comment, we included a sentence on page 10 indicating that because the surveys are not designed to provide results representative of a larger population, nonresponse will not affect the study's ability to address primary research objectives.
#2. Location in Supporting Statement: Part B.1
NASS Comment: Key informant interviews: One should consider laying out a sampling strategy here that properly stratifies the respondents so their output can adequately represent the population.
Response: In response to this comment, we added additional language on page 2 to clarify that the key informant interviews are designed to provide in-depth information, rather than representative information, from persons with knowledge and experience directly relevant to the study objectives. We added table B.1.1. on page 3, which describes our key informant interview State selection criteria in more detail.

#3. Location in Supporting Statement: Part B.1
NASS Comment: How was the sample size for this survey determined?
Response: The web survey is a census of all State SNAP agencies, as noted on page 7. For the key informant interviews, we added additional information about how FNS determines the sample sizes for studies (page 2).

#4. Location in Supporting Statement: Table B.1.1
NASS Comment: The Respondents and Nonrespondents table is confusing due to the goal of presenting "unique" responders as the total. Consider breaking out the numbers as a Total and a Total Unique. For example, if there are 10 SNAP State Directors, but only 7 are unique given they have responded elsewhere, consider putting 10 (7). Currently the table does not add up, and the calculated response rate for SNAP State Director does not necessarily make sense given the number of non-responders.
Response: We have made the recommended changes to Table B.1.2 on page 6. We added a new table that became B.1.1; the table referenced here is now B.1.2.

#5. Location in Supporting Statement: Section B.2
NASS Comment: While there is no sample defined, it may still be useful to discuss estimation strategies (presumably descriptive statistics and simple aggregation) that will be presented in the final report.
Response: We have added text to Section B.2 on page 8 that describes our approach to presenting statistical information.

#6. Location in Supporting Statement: Section B.2. Web-based survey
NASS Comment: The follow-up plan seems adequate. However, how will any nonresponse that remains at the end be addressed?
Response: In response to this comment, we included a sentence on page 10 indicating that because the surveys are not designed to provide results representative of a larger population, nonresponse will not affect the study's ability to address primary research objectives.

#7. Location in Supporting Statement: Survey Instrument, Questionnaire
NASS Comment: The questionnaire contains 88 questions, and the allotted time is 75 minutes. A majority of the questions can probably be answered from respondents' memory, but some may require extra work to confirm or validate. For example, one may have to spend time looking at the SNAP eligibility system to validate the answer to question 55 and others in that section. Consider re-evaluating the burden, as it's possible the respondents can complete this in 75 minutes, but without in-depth knowledge I cannot confirm. (I will note that Document K, Section C, does indicate it could take more than 2 hours to complete.)
Response: As noted in the pretest memorandum (Attachment K), pretest participants' estimates of the time needed to complete the survey ranged from 40-60 minutes to 90-120 minutes. Based on this range, we anticipate 75 minutes to be a reasonable average time, across all participants, to complete the survey. We did not make any changes in response to this comment because 75 minutes is an average time, where some respondents might take longer and others less time, based on our pretest results. (See the worked illustration following this table.)

#8. Location in Supporting Statement: Survey Instrument, Questionnaire
NASS Comment: The web-based questionnaire has a good flow to it. Is question 39 open-ended? Or will there be options for "type of service"?
Response: In response to this comment, we updated the structure of question 39 to allow for a yes or no answer to each item listed.
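As a rough illustration of how the 75-minute figure in response #7 can be reconciled with the pretest range, one can average the midpoints of the low (40-60 minute) and high (90-120 minute) pretest estimates. This averaging method is an assumption for illustration only; the response does not state how the 75-minute figure was derived.

\[
\frac{\tfrac{40+60}{2} + \tfrac{90+120}{2}}{2} = \frac{50 + 105}{2} = 77.5 \approx 75~\text{minutes}
\]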