
Supplemental Nutrition Assistance Program (SNAP) Employment & Training Study


OMB: 0584-0602


ATTACHMENT A.16

Survey Pretest Memorandum



FINAL MEMORANDUM




P.O. Box 2393

Princeton, NJ 08543-2393

Telephone (609) 799-3535

Fax (609) 799-0005

www.mathematica-mpr.com



TO: Michael DePiro and Wesley Dean


FROM: Gretchen Rowe, Stephanie Boraas, Brian Estes, and Carla Bozzolo

DATE: 9/16/2014

SUBJECT: Results of pretesting the “Supplemental Nutrition Assistance Program (SNAP) Employment and Training (E&T) Study” surveys and focus group discussion guide


The pretest of the data collection instruments for the SNAP E&T study was conducted in July and August 2014. We tested three data collection instruments: (1) the SNAP registrant and E&T participant survey, (2) the E&T provider survey, and (3) the SNAP E&T participant focus group discussion guide. The goals of all three pretests were to:

  • determine completion times for each instrument;

  • identify any challenges with concepts, wording, saliency, or recall;

  • learn if service providers maintain and are able to access the requested data; and

  • uncover any issues with instrument accessibility or layout.

Mathematica, in consultation with the Food and Nutrition Service (FNS), selected Pennsylvania as the pretest location. Pennsylvania was a good candidate for the pretest, as it has a sizeable E&T population and is geographically close to our offices. In addition, Pennsylvania’s approach to SNAP E&T is consistent with the approach used in about half of all States—clients volunteer to participate in E&T (the program is not mandatory). After Pennsylvania was selected, FNS sent a letter to the State to encourage its participation in the pretest. Mathematica followed up with an email and a phone call, and the State agreed to participate.

As FNS is aware, Pennsylvania was substantially delayed in providing us with the data extract used to identify registrants and participants who could be recruited to participate in the pretest. This delay resulted in a compressed timeline for the pretest. The compressed schedule and poor quality of the data provided by the State presented challenges in recruiting participants for the focus groups and identifying recent E&T participants for the survey. Overall, however, the pretest was extremely informative, both in terms of what we learned about the questions in the instruments and about the quality of contact information we are likely to receive from States. This memo describes how the pretest was structured and implemented, summarizes the main findings, and describes proposed changes to the instruments based on these findings. During a conversation with FNS staff held on September 3, 2014, we discussed the details of this memo and the proposed changes. The decisions made during that conversation and subsequent follow-up are included (in italics) as well. Attached are the surveys and focus group guide used for pretesting. (Final versions of the instruments based on the changes approved in this memo will be provided under a separate deliverable.)

I. Registrant and Participant Survey

The goal of the registrant and participant survey is to learn about the characteristics of the individuals in each of these two groups—their educational and employment histories, job search activities, barriers to employment, and participation in SNAP and E&T. The survey includes two pathways—one for registrants and one for participants. The path for any individual respondent is determined by responses to certain questions throughout the survey. To ensure we tested both pathways, we used screening questions during recruiting to identify participants versus registrants. Pretests were conducted in both English and Spanish.

A. Respondent Recruitment and Profiles

Pennsylvania provided a file with data for registrants and participants as of July 25, 2014. The file included contact information and preferred language. We separated the data into two lists, one that included registrants (not participating in E&T) and the other that included E&T participants. Experienced interviewers at Mathematica’s Survey Operations Center (SOC) began calling individuals on both lists in an attempt to contact four to five registrants and an equal number of participants. We recruited respondents based initially on their registrant or participant status as reported by the State, but also asked each person we contacted if they had participated in SNAP E&T services, to determine if they were participants.1 We offered respondents a $20 gift card as both an incentive to participate in the pretest and as a token of appreciation for their time. After obtaining consent, interviewers completed the survey using a hard copy form.2

Following each interview, we conducted a short debriefing in which the interviewer asked respondents about their perceptions of the survey, any difficulties they had in understanding or answering the questions, whether they felt any topics were sensitive, and how they felt about the length of the survey.

Although we screened individuals to ensure we had an equivalent number of registrants and participants, respondents’ answers to survey questions determined their pathway through the survey. This generally worked well, but we initially encountered an issue in identifying current or recent E&T participants. One screened participant disclosed during the survey that she had not actually participated in SNAP E&T and was treated as a work registrant, while two other individuals identified by the screener as participants later reported they had participated years in the past but not recently. These respondents did answer some of the participant questions, but could not completely answer the full set of questions and were sent to the end of the survey. After reviewing responses and realizing we had not fully tested all of the questions in the participant pathway, we modified the screener question to probe for current participation and conducted additional interviews to ensure that some current participants were included in the pretest sample. Ultimately, we conducted six registrant surveys and five participant surveys. In addition, we pre-tested the Spanish version of the instruments. However, because of the limited numbers of Spanish speakers included in the file provided by the State, we were able to complete only two interviews with Spanish-speaking registrants. Table 1 shows the number of surveys that were completed by respondent type and language.

Table 1. Pretest surveys completed by respondent type and language

                           Registrant    Participant    Registrant    Participant
                           (English)     (English)      (Spanish)     (Spanish)        Total
Number completed                    6              5             2              0         13
Minimum completion time           20             18             7             --          7
Maximum completion time           30             32            20             --         32
Average completion time           26             23            14             --         23

Note: Completion times are in minutes. The "Total" column combines the registrant and participant pathways across both languages.

The registrants and participants included in the pretest represented a range of those typically required to register or participate. The eight registrants ranged in age from 19 to 58 years and included six women and two men. The overall educational attainment of the registrants was low: half had not completed high school, and the other half had a high school diploma or some college (none had completed a degree). Among the five participants, the age range was 23 to 59 years, and four were women. Educational attainment was slightly higher among the participants; one respondent had not completed high school, two had a high school diploma or GED, one had attended some college, and another had an associate’s degree and a nursing certification.

The average completion times for both pathways of the survey were relatively close to the estimated time of 20 minutes. As shown in Table 1, completion times for the registrant survey ranged from 7 to 30 minutes (average of 26 minutes), and completion times for the participant survey ranged from 18 to 32 minutes (average of 23 minutes). The 7- and 32-minute completion times were outliers; 8 of the 13 interviews had a completion time within 4 minutes of the average.

B. Findings

During the pretest recruiting, we identified two important findings that we will incorporate into our strategies for recruiting respondents during data collection. First, we found that the time of the month we contact individuals could have a substantial impact on cooperation rates. Many of the individuals contacted for the pretest had Federally funded Lifeline Assistance phones, which they are eligible for as SNAP recipients. Many of these potential respondents would not participate in the pretest because we contacted them toward the end of the month and they were approaching the minutes limit on their cell phone plans. For this reason, we will plan to begin our calling efforts early in the month during data collection. Second, we determined that the State data identifying E&T participation may not be accurate or updated frequently (particularly in volunteer States), so we need to clarify the wording of the screening questions to better target recent or current SNAP E&T participants. In addition, we will plan to begin our calls with targeted participants prior to recruiting registrants to ensure we are recruiting enough participants for the study. (FNS staff had no concerns about this approach.)

Overall, the survey instrument worked quite well. The majority of respondents reported that they were able to answer most of the questions with little difficulty. None of the survey items were reported to be sensitive, and none of the respondents refused to answer specific items. Respondents considered the length of the survey to be reasonable, and there were no hang-ups during the survey. Several respondents expressed satisfaction at being able to participate in the survey and to contribute to improving SNAP. The successful pretest of this survey suggests that we will be able to collect the data needed to address the research questions FNS has identified for this study. While the average time to complete the survey was slightly over the planned 20 minutes, we have identified a few areas where additional clarification or changes would contribute to meeting our 20-minute target.3 Below, we describe the specific issues in the survey and our recommendations for changes:

  • Questions 6(7/8)b ask respondents about the type(s) of business in which they worked. In general, respondents did not seem to understand this question. Only four responded appropriately and, without noting confusion, identified a type of business. All of these respondents worked for private, for-profit companies. The rest of the respondents did not understand the concept, although the options were read to them. Several respondents selected “other” and reported doing “outdoor work” or “security,” for example. Interviewers were able to explain the concept to some respondents and help them choose a response, but they reported that most respondents had difficulty responding. Because respondents did not understand the concept of “type of business,” we likely will not obtain good information for this question. Therefore, we suggest eliminating it. If necessary, we may be able to develop a proxy for type of employer from responses to other questions, such as questions 5 and 6c.

Final Decision: FNS agreed to eliminate these items.

  • Questions 6(7/8)d through 6(7/8)k collect information about duration of employment, hours, and earnings. These questions were sometimes difficult for respondents to answer, particularly for jobs they held several years in the past. Six respondents were able to give both a start and end date for their job and three respondents who were still employed were able to report a start date. The other four respondents were not able to provide one or more of these dates. The ability to provide dates decreased for the subsequently listed jobs (items 7d through 7f and 8d through 8f), most likely due to longer recall periods and gaps between jobs.

The existing question does not place a time restriction on employment history. Most respondents reported on jobs held since 2000, although two reported on jobs back to the 1970s and 1980s. For the hours of employment and earnings (6h-k), respondents generally were able to answer the questions; however, several respondents mentioned that they had to guess on these items. In addition, for the older jobs, respondents generally spent more time searching their memories to provide their best estimate. Because of the recall issues around past employment, we suggest asking only about respondents’ two most recent jobs (eliminating all questions related to the third job, items 8a-8m). This will likely provide the same level of reliable data while shortening overall response time.

Final Decision: FNS agreed to ask about only the two most recent jobs.

  • Question 16 relates to why individuals participated in E&T. The current version of this item includes “Never got told I had to participate/Didn’t want to volunteer” in the same response option (“0”). In the pretest, respondents who selected this option almost universally responded to question 16a (which asks about reasons for not participating in E&T) with “other” and told interviewers that they did not have to participate in E&T. We believe this suggests that option “0” in question 16 is not specified correctly. We recommend splitting option “0” into two responses, “Never got told I had to participate” and “Didn’t want to volunteer,” and altering the skip pattern. Respondents who were never told they had to participate would be skipped to item 34, and respondents who chose not to participate (did not volunteer) would continue on to question 16a.

In addition, on question 16, two respondents noted that they had previously participated in E&T but had not participated for several years. To target participants who will be able to recall their participation, we recommend adding the response option, “Participated in the past but have not participated in the last 12 months” and routing this response to question 34 where participation in E&T programs in the past 24 months will be probed.

Final Decision: FNS agreed to these changes.

  • Question 16b. Because of concerns about the quality of data obtained by asking providers about the goals and motivations of participants, FNS suggested that we include a question on motivations in the participant survey. For individuals who respond to question 16 that they were either told to participate or volunteered, we will ask the following question:



16b. “What were your main reasons for participating in SNAP E&T?”

Select all that apply

Keep SNAP benefits, 1

Get childcare, 2

Get other benefits, 3

Improve my English, 4

Gain job search skills, 5

Learn about self-employment, 6

Earn a certification/credential/license, 7

Learn a new skill/industry, 8

Get promoted, 9

Get a raise, 10

Get a job, 11

Find a better job, or 12

Something else? 99

Specify: ____________



Final Decision: This change was developed and agreed to during the call.



II. Provider Survey

The goal of the provider survey is to learn about the characteristics of the organizations providing E&T services, the services these organizations provide and the types of participants they target, participant outcomes, and the sources of funding. We targeted four providers for the pretest and were able to contact directors or those overseeing programs at a variety of provider types. Respondents were informed in advance of the types of questions that would be asked. We suggested it would be helpful for them to have information on expenditures and participants (characteristics, number participating in each activity, and outcomes) readily accessible at the time of the interview to help expedite the process.

A. Respondent Recruitment and Profiles

Pennsylvania’s SNAP E&T program is delivered mainly by organizations that administer Workforce Investment Act (WIA) services and by community colleges running Keystone Education Yields Success (KEYS) programs, although there are other types of providers operating mainly in the Philadelphia area. The State provided us with lists of all of these providers, and we purposefully selected four that offered different types of programs and services. They included a workforce investment board (WIB), a community college running a KEYS program, and two WIA and Employment, Advancement, and Retention Network (EARN) offices. We emailed the providers to obtain their cooperation, and respondents who agreed to participate were contacted to set up an appointment to complete the survey. Although the provider survey will ultimately be a self-directed online instrument, the pretest, as is standard, was conducted by telephone and documented on a hard copy form.4 Following each interview, we conducted a short debriefing in which the interviewer asked respondents about their perceptions of the survey, any difficulties they had in understanding or answering the questions, whether they felt any topics were sensitive, and how they felt about the length of the survey.

Average completion time was about 60 minutes for the first three interviews. This is considerably longer than the planned length of 15 minutes. After identifying this issue, we reviewed the survey responses and identified areas where the providers could not answer the questions, needed additional clarification to answer, and did not provide complete or useful information. We revised and streamlined the instrument based on our assessment of the data to test the proposed changes. We interviewed the fourth provider using this revised instrument. The completion time for the revised instrument was considerably shorter (26 minutes) and the provider seemed to have less difficulty providing answers. In Section B, we summarize the changes that were made to shorten the instrument and our recommendations on whether these changes should be retained in the final version of the instrument.

B. Findings

Overall, respondents were able to answer the majority of the questions in the provider survey. However, most respondents struggled with questions that asked them to report data on the number of participants and the amount of funding by activity. These questions took respondents longer to understand and answer, which increased overall completion time. In addition, respondents who completed the original version of the survey agreed that it was far too long. The respondent who completed the revised (abbreviated) survey thought it seemed a bit long but would probably be acceptable if he were able to complete the survey online and work at his own pace.

Based on the results of the pretest, we do not believe it is possible to collect all of the information needed to address the study research questions in a 15 minute survey. Rather than make major revisions to the survey content to reach this target, we suggest increasing the target response time for the provider survey to 30 minutes. With the suggested modifications to the survey tested during the pretest (and a few more changes described later in this section), we believe we can attain an average completion time of 30 minutes for online administration. We do not believe this change will affect the overall response rate for the provider survey. On several other studies with providers we have conducted surveys ranging from 30 to 45 minutes and achieved high response rates. If we are clear upfront about the time involved and the types of reference documents that will be helpful for providers to have on hand, we believe providers will cooperate and complete the survey.

Final Decision: FNS agreed to increase the target length of the provider survey to 30 minutes.

In addition to concerns about survey length, the pretest of the provider survey revealed two other important issues:

  1. Two respondents had difficulty reporting data for SNAP participants versus all other E&T participants. This seemed particularly difficult for respondents whose organizations served both TANF and SNAP clients. To address this issue, we suggest that the survey instructions emphasize the importance of separately reporting data for SNAP clients. In addition, we suggest providing a clear definition of what we mean by SNAP E&T participants—those whose training is reimbursed by SNAP funds—in the survey instructions, so there is no confusion about SNAP clients who are receiving E&T funded by other programs versus SNAP E&T participants. A notes box following the data reporting items would allow providers to give any caveats or additional information they felt necessary for understanding their responses.

(FNS staff had no concerns about this approach.)

  2. One of the respondents refused to provide any financial information. This issue may arise during data collection, particularly among private organizations that cite confidentiality concerns. We recommend adding instructions to the financial section of the survey that encourage respondents to at least provide information about the percentage of funds spent on various types of activities (question 28), even if they are not willing to report actual budget estimates. This will give us a sense of how each organization spends its funding.5

(FNS staff suggested that this issue may be location-specific and not a widespread problem. Therefore, they suggested we add a disclaimer to the beginning of the section to assure providers that we will use their data for research purposes only and no state or federal staff will receive the information with identifiers. We added this language to the final instrument.)



The remainder of this section summarizes recommended changes to the provider survey to decrease survey length. We note instances where the recommended change was tested with the fourth respondent (an abbreviated instrument) and how the suggested modifications worked in practice.

  • Question 5 asks how long the organization has been serving SNAP E&T participants. The responses to this item essentially duplicated responses to question 4, which asks how long the organization has been providing E&T services. We recommend eliminating question 4. If there is variation in how long organizations have been providing SNAP E&T versus E&T more generally, the former response is the more relevant for this study. (We eliminated question 4 in the abbreviated interview.)

Final Decision: FNS agreed to this deletion.

  • Questions 7 and 15 both ask about capacity for enrollment. We believe question 15 is a better question (it collects enrollment data by activity) and recommend eliminating question 7. (We eliminated question 7 in the abbreviated interview.)

Final Decision: FNS agreed to this deletion.

  • Questions 8a and 8b ask about upfront assessments of good cause exemptions. In the first three interviews, respondents found these items to be confusing. The intent of the questions was to determine whether providers screen for possible good cause exemptions from participation, not whether they assess which E&T activity to place a client in. Because this was not clear in the initial wording, we suggest modifying the questions to: (8a) "Does your organization conduct screenings to determine if individuals are eligible for good cause exemptions from participation?" and (8b) "What percentage of those you screen are found to qualify for a good cause exemption?" (This alternative wording was tested in the abbreviated interview and worked well.)

Final Decision: FNS agreed to this wording change.

  • Question 10 asks about the likelihood of participants meeting their E&T goals. All of the pretest respondents provided answers in the mid-range. Given that there was not much variation in responses to this question, we suggest eliminating it. Questions 29 and 30 collect information on activity completion rates. Although we cannot specifically link completion to goals, the data collected in questions 29 and 30 will likely be more useful for analysis. (We did not eliminate question 10 in the abbreviated interview. This would be an additional elimination, further contributing to reduced survey length).

Final Decision: FNS agreed to this deletion.

  • Question 11 asks about client motivation. Like question 10, respondents tended to select mid-range responses. It may be difficult for providers to link motivation with completion and some programs are ongoing with no end dates. Therefore, we suggest eliminating this question, as it likely will not provide much variation and could be difficult for providers to assess. (We did not eliminate question 11 in the abbreviated interview. This would be an additional elimination, further contributing to reduced survey length).

Final Decision: FNS agreed to this deletion. To address the research question about participant motivations, item 16b was added to the R/P survey.

  • Question 14 asks about the information and data providers use to design or modify their E&T programs. Most of the respondents reported that they use all of the information and data included in the pre-coded list of response options. This may be accurate, but it also may be a sign of social desirability bias leading respondents to answer this question with what they perceive are the “right” answers. Therefore, we suggest eliminating this question or modifying the wording to indicate we are asking for the primary information and data sources used to design or modify programs. (We did not eliminate question 14 in the abbreviated interview. This would be an additional elimination, further contributing to reduced survey length).

Final Decision: FNS suggested keeping the question but modifying it to better understand if providers design training based on the economy and need for high-demand jobs. We reworded the question and circulated it to FNS for final approval, which was given, before incorporating the changes in the final instrument.

  • Question 15 identifies the number of SNAP E&T participants enrolled in different E&T activities. The initial version of this question asked for a breakdown of SNAP mandatory, SNAP voluntary, and non-SNAP participants. Pretest respondents had questions about mandatory versus volunteer participants (Pennsylvania is a volunteer-only State, so none of the participants were mandatory). Because not all States will have mandatory participants, we suggest simplifying this matrix to include just two columns—SNAP E&T participants and non-SNAP E&T participants.6 This will cause less confusion for the providers and likely lead to more accurate data. (We modified this question for the abbreviated interview and believe the simplified matrix worked well. The respondent was able to answer the question without additional clarification.)

Final Decision: The two column table described above will be implemented. However, FNS suggested adding a question to identify the percentage of mandatory versus voluntary participants. We added this item.

  • Questions 17 and 17a ask about activities that are unique to the provider, that is, activities that are not available from other providers in the area. The first three providers indicated, in their responses to question 17, that they provided services that were not available from other providers in the area. However, the services they listed in question 17a did not appear to be unique. The responses to question 17a often simply provided more detail about a common type of activity. It may be that providers cannot easily assess their activities in relation to the broader E&T system. For these reasons, we suggest eliminating these two questions. Doing so will not result in any loss of data because question 13 asks for a list of the activities provided and includes an “other” option that providers can use if they offer services that are not on the precoded list of response options. (We eliminated questions 17 and 17a in the abbreviated interview.)

Final Decision: FNS agreed to these deletions.

  • Question 19 asks respondents to report the minimum number of hours required to complete each activity and the duration of each activity. Each of the first three respondents had difficulty completing this table. In particular, they pointed out that there is no minimum hours requirement for some activities and that duration could be difficult to track because some participants combine activities. Based on this feedback, we recommend revising the columns to ask for the average number of hours per week and the average number of weeks participants spend in each activity. (We tested this modification in the abbreviated interview and it worked well. The respondent was able to provide the requested information.)

Final Decision: FNS agreed to this change.

  • Questions 22 and 23 ask about participant referrals to other State and Federal programs and to local community organizations. This information is similar to that captured in question 18 (options 5 and 6), so we recommend eliminating these questions. (We eliminated these questions in the abbreviated interview.)

Final Decision: FNS agreed to these deletions.

  • Questions 23a and 23b focus on the types of organizations with which providers have formal agreements. We suggest that two questions are not needed to capture this information and that one combined question can be used instead. The question would read: “Does your organization have agreements or coordinate with any of the following kinds of organizations?” (listing the types of organizations from 23b). (This modification was tested in the abbreviated interview and worked well.)

Final Decision: FNS agreed to this modification.

  • Question 30 asks respondents to provide employment rates for participants who enrolled in each activity and for those who completed each activity. The first three respondents struggled to provide this information because some activities do not have defined completion timeframes and participants who enroll in one year may not complete the program in the same fiscal year. Based on this feedback, we recommend revising the question to ask only for the percentage of participants who entered employment in that fiscal year by activity. (We tested this modification in the abbreviated interview and it worked well.)

Final Decision: FNS agreed to this modification.

  • Questions 37 and 38 ask providers to assess the types of skills needed by employers and which skills participants are often lacking. Based on pretest responses, we suggest combining these two questions into one that more directly identifies which skills the providers perceive are needed to become employable in the community. We suggest asking, “What are the primary types of skills E&T participants need to become employable in your community?” (We tested this modification in the abbreviated instrument and it worked well.)

Final Decision: FNS agreed to this modification.



III. Focus Group Discussion Guide

The goal of the focus groups with SNAP E&T participants is to gather information about participants’ employment goals, their skill gaps and training needs, their perceived barriers to obtaining and retaining employment, and the training they received from SNAP E&T programs.

A. Respondent Recruitment and Profiles

Recruitment of focus group participants was challenging due to the compressed timeframe for recruiting and the poor quality of contact information in the data file provided by the State. For logistical reasons, the potential sample for the focus groups was restricted to participants in the Philadelphia area. After removing individuals with no phone numbers, the pool of potential participants included 296 individuals. We were unable to reach 233 of these individuals (79 percent) because of incorrect or disconnected telephone numbers. Among the individuals we were able to reach, refusal rates were high. This could be in part due to the short amount of time between the recruiting call and the focus group—recruiting continued up until two days prior to the focus group. It is also possible that the $25 incentive we offered was too low relative to the burden involved in participating in a focus group.

Ultimately, 10 English-speaking participants were recruited for an in-person focus group. Participants were reminded of the scheduled focus group the day before, but only one person showed up. The nine recruited participants who did not appear were called 15 and 20 minutes after the scheduled start time. About half were reached and cancelled without giving clear reasons for not coming; the other half could not be reached. To maintain the study timeline, we elected to conduct semi-structured interviews by telephone rather than recruit for another formal focus group. Recruiters contacted the nine individuals who did not show for the English focus group, five individuals who were originally unavailable for the focus group at the set time, and 65 participants who had not previously been contacted. We offered a $50 incentive for participating in the telephone interview. Interviews were scheduled with six SNAP E&T participants, but four of them could not be reached to complete their interviews.

Recruitment of Spanish-speakers was particularly challenging as the potential pool of Spanish-speakers was very small (33 individuals), and over half of the telephone numbers were incorrect. Because of these limitations, a formal focus group was not conducted with Spanish speakers. Instead, the study team arranged two in-depth telephone interviews to test the Spanish discussion guide, but we were able to reach only one of the respondents to complete the interview.

In total, four in-depth interviews were completed—three in English and one in Spanish. Following each interview, we conducted a short debriefing in which the interviewer asked respondents if they had any difficulties understanding or answering the questions, whether they felt any topics were sensitive, and how they felt about the length of the interview.


The difficulties encountered in convening pretest focus groups have important implications for our data collection. Although we will have a longer time frame for recruitment and larger sample frames, we are likely to encounter similar problems with poor contact information and smaller sample frames for Spanish-speakers. Therefore, staff are currently identifying strategies to implement during focus group recruiting next year to ensure that we recruit enough individuals to meet our targets for focus group participants. This may include targeting focus groups to more populated areas, providing larger incentives, providing more pre-focus group follow-up to remind participants of the meeting times and locations, and using alternatives to telephone numbers, such as mailings or emails (when available). These strategies were discussed with FNS during our meeting on pretest findings and can be revisited in the future.

During the discussion with FNS, we identified some specific strategies for improving the recruiting during our data collection next year. In general, FNS was happy with our strategies to move up the recruiting timeframe closer to when we receive the data, leverage providers as much as possible to help with recruiting, and use alternative forms of contact, like mail and email. In addition, we suggested increasing the incentive from $25 to $40 and offering a $10 bonus for participants who arrive 15 minutes early for the focus group. We suggested that the $40 incentive would likely improve our ability to recruit participants and the extra $10 “early bird” incentive may guarantee attendance and ultimately increase participation. FNS agreed to increasing the incentive and providing an early bird incentive.

B. Findings

Overall, results of the in-depth interviews indicated that the questions included in the focus group discussion guide were clear, appropriate, and easily answered by respondents. Based on feedback from interviewers and respondents, we believe that no revisions are necessary for the “Perceptions of the Labor Market,” “Barriers to Employment,” and “Final Exercise” sections (VI, VII, and VIII) of the discussion guide, and only minor revisions are required for the other sections. Suggested revisions are summarized below and are motivated primarily by feedback from interviewers that some questions were redundant and that others were not applicable for certain respondents because of the structure of the E&T programs in which they participated.

Because we conducted one-on-one interviews rather than focus groups, we were not able to assess whether some respondents may be more hesitant to respond to certain questions in a group setting. However, interviewers asked about this during the respondent debriefings and none of the respondents indicated that they would have concerns about a group setting. We also were not able to estimate how long the focus group discussions are likely to take. However, the estimated 90 minute length should be more than adequate based on the time it took to complete the in-depth interviews—the English interviews lasted approximately 20 to 30 minutes and the Spanish interview lasted 40 minutes, excluding the time for debriefing.

The following describes the issues identified during the interviews and our recommendations for changes:

  • Section II: E&T Participation. Respondents addressed the questions in this section with ease, and no revisions are necessary. However, the question that asked, “Did you leave the program before it ended?” may not be applicable to all respondents depending on the structure of the E&T program. For instance, two respondents were enrolled in programs that prepared individuals for job interviews and connected them with employers. These programs did not have a defined length, and the participants left them when they found employment. We will plan to include more guidance in the discussion guide so moderators understand what types of services would include defined periods of time and which would not. This will help direct the conversation appropriately.

Final Decision: FNS agreed to this plan for guidance.

  • Section III: Employment Goals. The questions in this volunteer-only section were easily answered and the conversation flowed well, but interviewers thought this section was somewhat redundant with the introductory question for the volunteer groups, “What were your reasons for enrolling in the Employment and Training program?” We recommend eliminating the introductory question to facilitate the flow of the discussion. Instead, we recommend using the mandatory participant introductory question for all focus groups: “When you enrolled in the Employment and Training program, what did you hope to get out of it?”

Interviewers also thought that the question: “How easy or difficult will it be for you to get the training you need from the SNAP E&T program?” seemed redundant with the subsequent section that focuses on participants’ perceptions of E&T programs. We recommend removing the question here and adding a probe to the question in the perceptions section to ensure the moderators distinguish between the ease of getting services and the usefulness and quality of services received.

Final Decision: FNS agreed to these changes.

  • Section IV: Perceptions of E&T Program. This section flowed well, and most participants easily answered questions regarding the strengths and weaknesses of their E&T program. However, questions on participation in other job-preparation programs could not be tested because none of the participants had enrolled in other training programs.

One participant had difficulty thinking of a response to the question, “What parts of the program worked best/were most helpful for you?” This respondent elaborated when probed about classroom versus hands-on training, and a second respondent also highlighted the differences in teaching styles when asked what parts of the program worked best. We suggest adding this probe to the discussion guide to yield additional data about effective teaching methods.

Final Decision: FNS agreed to this probe.

  • Section V: Workforce Preparedness. Participants were forthcoming when asked about their employment situation, how they obtained their jobs, and the adequacy of their jobs in meeting their basic needs. While this section was generally successful, the question “Why do you think you were able to get this job?” gave the respondents the impression that they were being asked to justify their qualifications. For this reason, we suggest rephrasing the question to increase clarity and ensure that it captures the intended data (that is, whether respondents believe the E&T program was helpful in securing a job). A more neutral question would be, “Were any specific programs or supports helpful to you in getting this job?”

If in response to the above question (Why do you think you were able to get this job?) the respondent mentions the E&T program, the moderator is to ask, “Do you think you could have gotten the job without going through the Employment and Training program?” This question often was redundant with questions in the “Perceptions of the E&T Program” section, and we recommend moving it to that section of the guide as a probe. The remaining follow-up questions, “If no, what part of the training was most helpful in getting the job? If yes, are there parts of the training that could be improved to help you get work?”, are also redundant with the prior section, and we suggest deleting them.

Final Decision: FNS agreed to these modifications.



1 Recruiters made calls at various times of the day and recorded the details of each contact attempt. Because of the small number of respondents needed and the short time frame, no voicemails were left.

2 The final version of the survey will ultimately be programmed for Computer Assisted Telephone Interviewing (CATI) and online administration.

3 During the course of the pretest we noticed some of the skip patterns needed to be altered and minor wording changes were needed for clarification. We have not included details about these minor changes in this memo. Instead, with FNS approval, we will provide a track changes version of the survey when we submit the final version of the instruments. This track changes version will allow FNS to review all of these minor changes.

4 Because of the costs associated with making changes to the CATI and web versions of surveys, we traditionally program the surveys only after the instruments are final, not during the pretest phase.

5 For analysis, we plan to discuss funding and expenditures in percentage terms; however, in our experience respondents find it easier to provide dollar amounts than to calculate percentages, which is why the survey focuses on capturing dollar amounts.

6 Another reason to consider this change is that some States told us, during preliminary discussions about their data, that providers do not know who is mandatory and who is voluntary (even in mandatory States).


