ATTACHMENT N: PRETEST RESULTS MEMORANDUM
MEMORANDUM
505 14th Street, Suite 800
Oakland, CA 94612-1475
Telephone (510) 830-3700
Fax (510) 830-3701
www.mathematica-mpr.com

TO:       Wesley Dean
FROM:     Michael Ponza and Betsy Santos
DATE:     7/17/15
SUBJECT:  SNAP E&T 12-Month Follow-Up Survey Pretest Results (SNAPET-13)
This memo summarizes the results of the pretest of the 12-month Follow-Up CATI
instrument conducted by Mathematica and the instrument changes proposed on the basis of
these results.
The purpose of the pretest was to address the following issues:
Survey length when administered in English
Survey flow
Respondents’ interpretation of questions
Respondents’ ability to recall required information
Completeness of response category options
To obtain timing estimates, the pretest interviews were conducted by telephone, mimicking
actual field conditions as closely as possible. To explore respondents’ understanding and
recall of the information required by the survey questions, we conducted a short debriefing
after the interview to learn whether respondents interpreted questions as intended, which
questions, if any, they found difficult to answer and why, and how easy or difficult it was to
recall the information required.
Testing Details and Procedures
There were two rounds of pretest interviews. The first round took place between June 23
and June 25, 2015, and was conducted by Brianna Sullivan, a Survey Associate at Mathematica,
with assistance from Betsy Santos. Three interviews were conducted using the telephone
questionnaire. Results from the round 1 interviews were reviewed and changes were made to the
CATI survey. Ms. Sullivan conducted a second round of six interviews between July 1 and July
16, 2015, to confirm that the changes successfully addressed the issues discovered during the
first round of testing.
Mathematica’s budget assumed that the follow-up surveys would take 30 minutes to
complete on average. To obtain timing estimates, the pretest interviews were conducted without
any interruptions during rounds 1 and 2. After the telephone survey was administered, the
interviewer asked a series of debriefing questions to explore respondents’ understanding of
the survey items.
Pretest respondents were sent a check for $40 for participating in the 30- to 50-minute
telephone interview, which included extra time for responding to debriefing questions about
their perceptions of the interview.
Recruitment and Respondent Profile
We attempted to recruit pretest respondents from four States – Vermont, Virginia,
California, and Wisconsin. Evaluation leads sent an email to these States explaining the purpose
of the pretest and requesting a list of 30 potential respondents who were currently receiving
SNAP E&T services. Two pilot States – Vermont and California – ultimately participated. The
other pilot States either did not respond to our request or had an approval process for releasing
a list of potential respondents that was judged too lengthy.
Respondents were recruited from the convenience sample files provided by Vermont and
California. Ms. Sullivan called potential respondents from the lists provided by the States,
explained the purpose of the pretest, and scheduled a convenient time for the interview with
those who agreed to participate. Table 1 summarizes the characteristics of the individuals who
ultimately participated in the pretest interviews.
TABLE 1: CHARACTERISTICS OF PRETEST RESPONDENTS

                                                        Round 1   Round 2   Total
Location
  Vermont                                                   3         2        5
  California                                               -          4        4
Gender
  Female                                                    1         4        5
  Male                                                      2         2        4
Age
  Under 45                                                  1         5        6
  45 and older                                              2         1        3
Education level
  Less than 8th grade
  8th to 12th, no diploma
  High school diploma or GED
  Adult Basic Education (ABE) Certificate
  Some college but no degree
  Vocational/technical degree or certificate
  Business degree or certificate
  Associate degree (AA)
  Bachelor’s degree (BA/BS)
  Master’s degree (MA/MS) or higher (MD, Ph.D.)
Number of jobs held in past year
  0 jobs                                                    2         2        4
  1 job                                                     -         2        2
  2 jobs                                                    1         2        3
Number of employment and training programs
attended during the previous year
  0 programs                                                2         2        4
  1 program                                                 1         3        4
  2 programs                                                -         1        1
Findings
Survey Timing. During the first round of the pretest, interviews averaged 38 minutes. This
was quite long considering that 2 of the 3 participants did not hold any jobs or participate in
any education or training programs within the past year, meaning they skipped two entire
sections of questions (see Table 2). As a result, we cut questions from the survey that we
judged did not contribute to primary or secondary outcomes of interest and tested the revised
instrument with the remaining 6 participants during the second round of testing.
The second round of interviews averaged approximately 33 minutes. The most time-consuming
sections were those that required recall of specific details of any jobs (Section B) or education
and/or training programs (Section C). Questions were cut from each of these sections. However,
it is important to note that additional cuts were made just before the last interview: many of
the changes to Section B occurred after respondent 8’s interview but prior to respondent 9’s
interview. The time savings gained by removing these questions are reflected in the difference
between respondent 8’s and respondent 9’s Section B length (ten versus six minutes). Time
savings are also reflected in the lower mean interview length for round two compared to round
one, even though participants in round 2 had more jobs and programs to discuss in Sections B
and C. A list of the cuts that were made can be found in Table 3.
With the current cuts, we believe the survey will average between 30 and 35 minutes. Additional
cuts could be made to ensure the survey stays within the 30-minute budgeted time, but they would
require eliminating questions that collect data on secondary outcomes, such as the mental health
question. However, if the interviews average more than 30 minutes, there are budget
implications. We look to FNS for feedback on whether further cuts are desired.
Table 2. Survey Length in Minutes by Round

                              Round 1                          Round 2
Respondent              1    2    3   RD 1     4    5    6    7    8    9   RD 2   Total
                                      Mean                                   Mean    Mean
# Jobs                  0    0    2    0.7     0    0    2    2    1    1    1.0     0.9
# Programs              0    0    1    0.3     0    1    2    1    1    0    0.8     0.7
S – Screening           2    2    2    2       2    2    2    2    2    1    2       1.9
A – Household           1    1    1    1       1    1    1    1    1    1    1       1.0
B – Employment          2    6   13    7       2    2   14   15   10    6    8       7.8
C – E&T                 8   13   11   11       4    9   14   12   12    5    9       9.8
D – Assistance          4    4    3    4       3    2    4    3    3    4    3       3.3
E – Food security       2    4    2    3       2    1    2    2    3    3    2       2.3
F – Health              5   11    4    7       2    3    2    3    3    4    3       4.1
G – Housing             1    1    1    1       1    1    1    1    1    1    1       1.0
H – Contact info        3    4    3    3       3    3    3    3    3    3    3       3.1
Total time             28   46   40   38      20   24   43   42   38   28   33      34.3
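As a quick check, the RD 1, RD 2, and Total means reported in Table 2 follow directly from the
per-respondent interview totals; a minimal Python sketch (with the Total time row transcribed
from the table) reproduces them:

    # Total interview length in minutes per respondent, from the "Total time" row of Table 2.
    round_1 = [28, 46, 40]              # respondents 1-3
    round_2 = [20, 24, 43, 42, 38, 28]  # respondents 4-9

    def mean(values):
        """Arithmetic mean of a list of interview lengths (minutes)."""
        return sum(values) / len(values)

    print(round(mean(round_1), 1))            # 38.0 -> reported RD 1 Mean of 38
    print(round(mean(round_2), 1))            # 32.5 -> reported RD 2 Mean of 33 (rounded up)
    print(round(mean(round_1 + round_2), 1))  # 34.3 -> reported Total Mean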
Table 3. Item Deletions Due To Length

Question: S1
Reduced the length of the introduction.

Question: B8
Removed quit/fired follow-up:
B8. IF QUIT OR FIRED: Why did you (quit/get fired) from [FILL COMPANY NAME]?

Question: B16, B16a
Removed important job resource and opportunities for promotion items:
B16. What was the most important resource you used to find this job?
B16a. (Did/Does) your (current/most recent) job have opportunities for promotion?

Question: B19, B19a
Removed questions about earnings from odd jobs, side jobs, and under-the-table jobs:
B19. Since (fill RA MONTH/YEAR), how much did you earn, in total, from odd jobs, side
jobs, under-the-table jobs, or any other activities? Do not include income from gifts, child
support, lottery winnings, and things like that. Please remember that all of your responses
on this survey will be kept private and will not affect any benefits you receive now or in the
future.
B19a. Would you say less than $500, $500 to less than $1,000, or $1,000 or more?

Question: B20
Removed job hunting/retention question:
B20. Sometimes people have problems getting or keeping a job. What problems, if any,
have you had getting or keeping a job?

Question: C1
Removed the following from the introduction:
“We’re interested in individual meetings you may have had in person or over the phone.
Please do not include any meeting you may have had as part of an interview or meetings
with individuals you may have spoken with at job fairs or hiring events.”

Question: C6, C7, C8
Removed additional questions about career assessment testing from the survey:
C6. How many days did it take to do the testing?
C7. On average, about how many hours each day did you take the career assessment
testing?
C8. Were the results of the test(s) shared with you?

Question: C33, C34
Removed questions about transportation costs and career goals:
C33. On average, how much, if anything, do you yourself pay for transportation to get to
and from work, training or school?
C34. The next few questions are about your career goals. Whether or not you are
currently working, please tell me if you strongly disagree, somewhat disagree, somewhat
agree, or strongly agree with the following statements:
a. I have specific goals for my future
b. I have a plan for achieving my career goals
c. Planning for a career is not worth the effort
e. If I have a career, I won’t be able to enjoy other things in life

Question: F3, F4
Removed questions regarding general well-being and challenges:
F3. Taken all together, how would you say things are these days? Would you say that you
are very happy, pretty happy, not too happy, or not happy at all?
F4. The next questions are about how you feel and handle challenges in your life. Please
tell me if you strongly disagree, somewhat disagree, somewhat agree, or strongly agree
with each of the following statements:
a. I can do just about anything I set my mind to
b. When I really want to do something, I usually find a way to succeed at it
c. Whether or not I’m able to get what I want is in my own hands
d. What happens to me in the future mostly depends on me
e. I can do the things I want to do

Question: G1a, G2
Removed more detailed housing questions:
G1a. Do you own the place you live in, rent your own place or contribute to rent at a friend
or family’s place, or live rent free?
G2. What type of group quarters do you live in?
Survey Content. Overall, respondents seemed to understand most questions and did not appear
to have much difficulty answering them. Some questions did require further clarification, so
adjustments were made to the question wording, or interviewer probes were added, to address
respondent misunderstandings of question intent. Table 4 lists the questions that posed an issue
during administration and the suggested modifications.
Debriefings. Pretest respondents were asked five debriefing questions: (1) Did you find
any of the questions difficult to answer? (2) Did I ask you about anything that was confusing or
hard to understand? (3) How easy or difficult was it for you to recall or remember the details
about some of your jobs/education programs? (4) How confident did you feel about your
answers? (5) In general, is there anything you would change to improve the questions?
All respondents said that they did not find any questions confusing or hard to understand,
and all expressed a high level of confidence in their responses. Regarding questions that
were difficult to answer, one respondent said it was hard to remember how much she had
received in SNAP benefits each month. Two respondents mentioned that answering the
depression scale was emotionally difficult for them. Another mentioned it was difficult figuring
out who was paying for the education program in which they were participating. When asked
about recall, one respondent said it was hard to recall how many hours they worked during
their last week of work. Another said it was difficult to recall how many months they had lived
in their apartment. The remaining participants said that recall was “easy” or not a problem.
Finally, when asked how the survey could be improved, participants felt overall that the
questions were good and “fair.” One mentioned that some items seemed repetitive, but
understood that they were needed to cover everyone’s experience.
Table 4. Survey Content – Problems Identified and Recommended Changes
Question: S3
Issue
The reminder that all survey responses would be kept private was repetitive, as this was just
mentioned in S1, and most respondents began answering as soon as we asked for the last four
digits of their Social Security number anyway.
Recommendation
Move the following from the question text itself to an optional interviewer probe: “IF
NECESSARY: Please remember that all of your responses on this survey will be kept private
and will not affect any benefits you receive now or in the future.”

Question: B1
Issue
The question asked respondents about being self-employed, then mentioned working at a job
for pay. Respondents may concentrate on the self-employed portion of the question and answer
no without listening to the second half of the question.
Recommendation
To help respondents focus on working at a job for pay, switch the order of these two items in
the question: “Are you currently working at a job for pay or self-employed?”
Question: B2
Issue
Respondents may give the reason they lost their last job rather than the reason they have not
been able to find another job.
Recommendation
Add an interviewer probe to clarify: “IF R MENTIONS HOW LAST JOB ENDED (I.E., FIRED,
LAID OFF) PROBE: What is the main reason you have not been able to get a new job?”
Also, ‘pregnancy’ was added to the family responsibilities response category.
Question: C10
Issue
Respondents sometimes had a hard time realizing they were being asked about a different type
of program in item C10 (e.g., education and training programs) than in the previous questions in
Section C (e.g., career counseling or one-on-one assistance from an employment professional).
Recommendation
Add an introduction to C10 that begins with, “Now we’re going to ask you about…” to help
differentiate the types of programs we are asking about.

Question: C12
Issue
One respondent said the main reason she had not participated in any education or training
programs was her pregnancy and subsequent maternity leave.
Recommendation
Add ‘pregnancy’ to the need to care for children or others response category. Pregnancy was
also added to a similar response category in question B2.

Question: C18
Issue
Respondents needed clarification about what counts as a general education program.
Recommendation
Add the following interviewer probe to clarify: “General education programs include adult basic
education or GED courses, college, and other types of school.”

Question: C19
Issue
One respondent was unclear about what was meant by “on-the-job training”. This respondent
said yes to this question because he worked with actual forklifts during the program, even
though the program was not associated with an actual job site.
Recommendation
Add this clarifying interviewer probe to the question: “On-the-job training, also called ‘OJT,’
involves getting on-the-job experience from a particular employer.”

Question: C28, C29, C29a, C30, C31, C32
Issue
This question about support services was lengthy and, at times, confusing for respondents
because some items lacked further clarification (e.g., one respondent indicated he had received
medical assistance because he was enrolled in Medicaid).
Recommendation
Edit the question for length by removing less important items and the sub-question about
medical assistance, and remove follow-up sub-questions C29, C29a, C30, C31, and C32.

Question: D1d
Issue
Two respondents found the term “Unemployment Insurance” confusing and did not recognize
that they were being asked about general unemployment benefits.
Recommendation
Reword the item to say, “Unemployment Insurance or Unemployment Benefits.”
Question: D1g
Issue
Two respondents were unsure whether they received Medicaid. One wasn’t sure if he got
Medicaid or Medicare, and another said he had ‘Obamacare’.
Recommendation
Add a fill for the state-specific Medicaid name to the question text, e.g., “Medicaid, also known
as Medi-Cal.”

Question: D1h
Issue
One respondent mentioned WIC when asked if they received any other assistance.
Recommendation
Add a specific response category for WIC assistance.

Question: D2b
Issue
Respondents had difficulty recalling the exact amount of their monthly SNAP/Food Stamp
benefits, or would mention that the amount varied.
Recommendation
Add two interviewer probes – “Your best estimate is fine” and “IF MONTHLY AMOUNT
VARIED, PROBE: How much was the most recent amount?” – to encourage respondents to
provide an answer rather than leave incomplete data.

Question: G1
Issue
Respondents often thought they were being asked for their address instead of what kind of
place they lived in (e.g., house, apartment).
Recommendation
Add the interviewer probe, “What kind of place do you live in?”

Question: G3, G4
Issue
Respondents often went to a bit of effort to recall exactly how long they had lived in their
current place of residence in G3 (e.g., “Well my daughter was 3 I think, so I guess that was
2010…”).
Recommendation
Since the data being collected are limited to whether the respondent has lived there less than or
at least one year, the question text was changed to reflect that and ease the recall burden: “How
long have you lived there? Would you say less than one year or one year or longer?”
For consistency, the same wording change was made for G4: “How long have you been without
a regular place to stay? Would you say less than one year, or one year or longer?”
Next Steps
With the cuts to the questionnaire described earlier, we believe the follow-up survey will
average between 30 and 35 minutes. Converting the questionnaire to CATI should streamline
administration. Additional cuts could be made to ensure the survey stays within the 30-minute
budgeted time, but they would require eliminating questions that collect data on secondary
outcomes, such as the mental health question. However, because our planning and budget
assume interviews will take 30 minutes on average, there could be budget implications if
administration of the survey exceeds this threshold. We look to FNS for feedback on whether
further cuts are desired and, if so, which questions could be deleted.