ATTACHMENT A.18

Responses to Comments from US Department of Agriculture National Agricultural Statistics Service

Supplemental Nutrition Assistance Program (SNAP) Employment and Training (E&T) Study

Reviewed by Leslie Smith, USDA-NASS, Methodology Division

October 22, 2014

Part A

A.1 – This section looks good. It clearly describes the need for the data.

A.2 – In Table A.2, you indicate that the survey will help you answer "are fees or costs charged to participants?" but I didn't see any questions that would provide this data. This is also referenced in Section 2 (the provider survey). Additionally, Section 2 states that the provider survey will take 15 minutes, but A.12 says 30 minutes, as does the questionnaire itself.

A.3 – This section looks good. I was concerned that one of your two modes of data collection is the web for a survey of low-income persons. However, you clearly state that many of the SNAP E&T participants are likely to have familiarity with and access to computers through the SNAP E&T work programs.

A.4 through A.11 – These sections look fine.

A.12 – I'm not sure where the annual burden total in Section 1 comes from; it isn't consistent with the calculations in Table A.12a. Also, parts a and b could use a little clarification. When I first read them, it sounded like you had a separate sample of people who you knew wouldn't respond. What you really meant is that, out of a total sample size of X, you expect a certain percentage not to respond because they refuse or are inaccessible. You have an estimated number of respondents who will complete the survey at one burden level and an estimated number of nonrespondents who will only read the letter at another burden level. This is clearly demonstrated in Table A.12a.
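
For reference, the arithmetic I would expect the Section 1 total to follow is sketched below. It only illustrates the two-burden-level structure shown in Table A.12a; the counts and per-case minutes are placeholders, not the study's figures.

```python
# Illustrative sketch of an annual burden total built from two groups with
# different per-case burden; the counts and minutes below are placeholders.
groups = [
    # (description, number of cases, minutes per case)
    ("survey respondents", 1000, 30),
    ("nonrespondents who only read the letter", 250, 2),
]

total_hours = sum(n * minutes / 60 for _, n, minutes in groups)
print(f"Estimated annual burden: {total_hours:.1f} hours")
```

Whatever the actual figures are, the Section 1 total should equal the sum of these group-level products from Table A.12a.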

A.13 through A.15 – These sections look fine.

A.16 – Overall, this section documents the objectives and identifies the limitations of the survey data. However, I believe the reference to Table A.17 in Section 2 should be to Table A.16. There is no Table A.17.

A.17 & A.18 – These sections look fine.

Part B

B.1 – If you are making inferences to the entire U.S., I am a little concerned that you are only surveying participants from 25 states when Part A clearly says that it is up to each state to determine how to provide the SNAP E&T services. Since each state has the potential to be unique, wouldn't selecting registrants, participants, and providers from all states (excluding RI because of the lack of frame data) be more representative, particularly for the characteristics of the providers? However, I understand that this would mean the work registrant, participant, and provider sample sizes would need to increase, which may be limited by funding.

That being said, I'm not sure I completely follow the PPS sampling methodology in Section 1. The three phases are defined clearly. However, the last paragraph on page 2 states that the "25 certainty states" are used to provide the list for the registrant/participant samples. If there are 25 certainty states, meaning they have a probability of 1 and are automatically included in Phase 1, why is Phase 3 necessary?
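
To make my reading of the certainty logic concrete, the sketch below shows one common way certainty units arise in a PPS selection: any unit whose size measure meets or exceeds the sampling interval is taken with probability 1, and systematic PPS is applied to the rest. This is a generic illustration and an assumption on my part, not the study's documented procedure.

```python
import random

def pps_select(sizes, n):
    """Select n units with probability proportional to size (PPS).
    Units whose size meets or exceeds the current sampling interval are
    taken with certainty; the rest are chosen by systematic PPS.
    sizes: dict mapping unit (e.g., state) -> size measure (e.g., caseload).
    Generic illustration only, not the study's documented procedure."""
    certainties, remaining = [], dict(sizes)
    while len(certainties) < n and remaining:
        interval = sum(remaining.values()) / (n - len(certainties))
        too_big = [u for u, s in remaining.items() if s >= interval]
        if not too_big:
            break
        for u in too_big[: n - len(certainties)]:
            certainties.append(u)
            del remaining[u]
    # Systematic PPS over the non-certainty units.
    sampled, k = [], n - len(certainties)
    if k > 0 and remaining:
        interval = sum(remaining.values()) / k
        start = random.uniform(0, interval)
        points = [start + i * interval for i in range(k)]
        cumulative, idx = 0.0, 0
        for u, s in remaining.items():
            cumulative += s
            while idx < k and points[idx] < cumulative:
                sampled.append(u)
                idx += 1
    return certainties, sampled
```

For example, pps_select(caseloads_by_state, 25) would return the states taken with certainty together with the PPS draws among the remaining states.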

In Section 2, while the selection of providers is random, you are limiting the sample population to providers located near members of the participant sample, which may introduce bias.

In Section 3, the focus group sample is selected from the same frame as the participant survey sample. Is it possible for a participant selected for the survey to also be selected for the focus group? Overall, this section describes the makeup of the sample but not the sampling methodology used to achieve it.

B.2 – In Section 1 you mention administrative data from two sources that were used to build the sampling frame. Can duplication exist between the two lists? Was this addressed? Otherwise, I think you have a very clear data collection strategy for the work registrant and participant survey and the focus group. However, there is no discussion of data collection activities for the provider survey in this section.
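
To illustrate the duplication concern, the sketch below shows the kind of deduplication step I would expect when the two administrative extracts are merged into a single frame. The list and field names (list_a, list_b, case_id) are assumptions for illustration, not variables from the supporting statement.

```python
def build_frame(list_a, list_b, key="case_id"):
    """Merge two administrative extracts into one sampling frame, keeping
    only the first record seen for each identifier so that a person who
    appears on both lists enters the frame once. Illustration only."""
    seen, frame = set(), []
    for record in list_a + list_b:
        if record[key] not in seen:
            seen.add(record[key])
            frame.append(record)
    return frame
```

If no common identifier exists across the two sources, the supporting statement should say how overlap is detected and handled.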

In Section 2 there is a lot of detail about your nonresponse adjustment methodology, which is very clearly described, but there is little about the coverage adjustment and the outlier adjustment. Does this part need to be expanded?
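
For reference, the weighting-class style of nonresponse adjustment I read Section 2 as describing is sketched below. It is a generic illustration, not the study's exact specification, and the field names (cell, base_weight, responded) are assumptions.

```python
from collections import defaultdict

def nonresponse_adjust(cases):
    """Within each weighting class ("cell"), multiply respondent base
    weights by (sum of all base weights) / (sum of respondent base
    weights). Generic weighting-class adjustment, shown for illustration;
    the field names are assumptions, not the study's variables."""
    total, resp = defaultdict(float), defaultdict(float)
    for c in cases:
        total[c["cell"]] += c["base_weight"]
        if c["responded"]:
            resp[c["cell"]] += c["base_weight"]
    adjusted = []
    for c in cases:
        if c["responded"]:
            factor = total[c["cell"]] / resp[c["cell"]]
            adjusted.append({**c, "adj_weight": c["base_weight"] * factor})
    return adjusted
```

A comparable level of detail for the coverage and outlier adjustments (what triggers them and how the factors are computed) would make Section 2 complete.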

B.3 – In Section 1, it is apparent that your primary means of maximizing response is incentives, coupled with verification of addresses and phone numbers and extensive use of reminder letters. However, at the bottom of page 7, I'm not sure I quite follow the last paragraph where it says 'Address and telephone numbers will be included for many sample members, or might be incorrect.' Is the point that these data will be missing or incorrect for some sample members, or that, although the data are provided, some of them might be incorrect?

B.4 – This section looks fine. It seems that the data collection instruments have been thoroughly pre-tested and modified accordingly.

Questionnaires

1.1 – There is a skip in Q36 to 'Go to Q37,' but the instructions on Q37 say 'If 36 ≠ 1.' Should the skip be to Q38?

1.2 – In the provider survey you ask them to break out much of the data by E&T participants (sometimes by voluntary and mandatory) and non-E&T participants. Is this something that they are required to track, possibly for the SNAP program? If not, are there concerns that they may not be able to provide the breakouts, or was this addressed during the pre-test?
