Appendix F1. Pre-test Findings Memo
From: Kim McDonald, Emma Wells, Dory Thrasher, Julie Hartnack, and Jodie Davis
Subject: SNAP Mobile Payment Pilots Evaluation: Pre-Test Results
This memo summarizes our findings from pre-testing four data collection instruments for the SNAP Mobile Payment Pilots Evaluation: the client experience survey, the survey of non-adopters, the key informant interview discussion guide, and the retailer interview discussion guide. Section I presents findings from the survey pre-tests with SNAP participants, Section II describes findings from the interview discussion guide pre-tests, and Section III describes next steps.
The goal of the survey pre-test was to assess survey length and flow, respondents’ interpretation of questions, and the comprehensiveness of response options. We describe how we revised the Client Experience Survey and Survey of Non-Adopters based on observations from pre-tests conducted in three States (California, New York, and Oregon), two modes (self-administered and interviewer-administered), and two languages (English and Spanish). First, we describe the pre-test procedures and sample; then we review the findings and suggested modifications from the pre-test.
Survey structure and modifications for pre-testing. Questions in Section A and Section E are the same for both surveys, whereas Sections B, C, and D are tailored to specific respondent populations (Exhibit 1).
Exhibit 1. Sections by survey type
Section | Client Experience Survey | Survey of Non-Adopters
Section A: Screener | Screen for survey administration | Screen for survey administration
Section B: Awareness and Engagement with Pilot | Experiences setting up and using SNAP mobile payments | Familiarity with and interest in SNAP mobile payments
Section C: Barriers to Use and Feedback | Challenges encountered using SNAP mobile payments | Hesitations, challenges, and other opinions about SNAP mobile payments
Section D: Respondent Characteristics | Respondent characteristics | Respondent characteristics and cell phone capability
Section E: End | Collect contact information for incentives | Collect contact information for incentives
The option to redeem SNAP benefits by using an electronic benefit transfer (EBT) card on a mobile device (SNAP mobile payments) is not yet available. However, setting up and using mobile payments with a debit or credit card is a widely available payment option that works similarly to how SNAP mobile payments would work. To make the survey clearer for pre-test participants, the study team modified the surveys in the pre-test to use the term “mobile payments” instead of “SNAP mobile payments.” The study team also removed several references to SNAP that could have confused pre-test participants responding to questions about mobile payments in general. Some of the survey changes made for pre-testing are shown in Exhibit 2.
Exhibit 2. Example survey modifications for pre-test
Pre-test sample. Pre-test participants were recruited by study team staff in June 2023. The study team administered the pre-test in three States: California, Oregon, and New York. These States were selected to help test questions that might differ by State, including those with references to State-specific SNAP apps, and to allow for in-person administration by the study team. In the three States, study team staff reached out to organizations that serve adults similar to those in the study population for the SNAP Mobile Payment Pilots Evaluation, including two food pantries, one local farmers’ market, and a local college. In coordination with these organizations, the study team invited individuals to participate in a pre-test interview with Mathematica. All pre-test participants were screened to ensure they were receiving SNAP benefits at the time of the pre-test interview and to confirm their use or non-use of mobile payments.
The Client Experience Survey was pre-tested with nine participants who had used mobile payments in a store to buy food (eight in English and one in Spanish), and the Survey of Non-Adopters was pre-tested with nine participants who had not (eight in English and one in Spanish).
Pre-test procedures. Pre-test interviews were conducted in person or by telephone. To mirror self-administration of the survey by web, 11 participants filled out a printed version of the survey. These respondents self-administered the questionnaire by reading each question to themselves and circling or highlighting their answers. To replicate interviewer administration of the survey by telephone, interviewers on the study team read survey questions aloud to seven participants and asked them to respond verbally. Exhibit 3 shows the breakdown of pre-test administration mode by survey type.
Exhibit 3. Pre-test administration mode by survey type
 | Self-administered (paper) | Interviewer-administered (verbal) | Total
Client Experience Survey | 7 | 2 | 9
Survey of Non-Adopters | 4 | 5 | 9
Total | 11 | 7 | 18
Before beginning each pre-test, the interviewer explained the context of the survey to help clarify to participants why they were being asked questions about both mobile payments and SNAP. Exhibit 4 includes the introductory text that interviewers read to pre-test participants.
Exhibit 4. Pre-test introduction
Following administration of the survey, the interviewer asked the respondent to go back through the survey to discuss specific questions, asking a series of cognitive interview prompts to elicit feedback. These questions deepened our understanding of how respondents answered the survey, the challenges they encountered, and any suggestions they had for improvement.
The study team conducted two of the pre-test interviews in Spanish to obtain feedback on the Spanish translation. Interviews took between 20 and 30 minutes, and pre-test respondents were given or sent a $20 Visa gift card to thank them for their participation.
Overall, respondents thought the survey was straightforward and easy to understand. Respondents said the flow between sections made sense, and they thought the survey length was appropriate. Next, we discuss findings on respondent burden and recommendations for item-specific revisions.
Respondent burden. The average time to complete the survey across modes, locations, and languages was 4.8 minutes. However, completion times differed between surveys administered on paper (self-administered) and surveys administered verbally (interviewer-administered): self-administered surveys averaged 3.5 minutes, and interviewer-administered surveys averaged 6.6 minutes. A few factors account for this difference. First, participants were more inclined to ask questions during interviewer-administered surveys despite the study team’s instruction to hold questions until the end. In addition, one interviewer-administered survey took 10 minutes and another took 8 minutes, whereas none of the others took longer than 6 minutes. With those two outliers removed, the average interviewer-administered survey time was 5.7 minutes. Therefore, although completion time varied by administration mode, the overall burden of the pre-tested instruments was consistent with our five-minute estimate.
Exhibit 5. Survey modifications from pre-test (English and Spanish)
This section summarizes our findings from pre-testing the draft implementation interview guides for key informants and retailers. The goal of the pre-test was to assess interview guide length and flow, understand respondents’ interpretation of questions, and gauge the comprehensiveness of the guides. We describe how we revised the Key Informant Interview Guide and the Retailer Interview Guide based on suggestions from pre-tests with staff involved in the pilots from the Illinois Department of Human Services, Aldi, and Schnuck’s Markets. First, we describe the pre-test procedures; then we present the findings.
Pre-test respondents. Pre-test participants were recruited by study team staff in June and July 2023. The study team administered the pre-tests with likely key informants in one pilot State—Illinois—to ensure the interview guides covered the activities that the State agency and its retailer partners expect to complete during pilot planning and implementation. The study team invited staff from the Illinois Department of Human Services (IDHS) who are involved in the pilot planning to participate in the pre-test and requested their help identifying retailer partners. We then reached out to corporate-level contacts at Aldi and Schnuck’s Markets and requested their participation in a pre-test. We asked our corporate contacts to invite store managers or other key staff to join the conversation. Store managers and other corporate staff involved with payment technologies participated in the pre-test with Schnuck’s Markets, and one corporate-level staff person participated from Aldi.
Pre-test guides.
Key informant interview guide. The key informant interview guide includes over 100 questions that cover all key informant respondents and three rounds of data collection. The guide will ultimately be tailored to ask only questions relevant for the respondent’s role in the pilot and round of data collection and will take no longer than an hour.
Retailer interview guide. The retailer interview guide is designed for use with corporate or store-level staff at participating pilot retailers. The interview guide includes fewer questions than the key informant interview guide and is designed to take no longer than 30 minutes with any respondent.
Example pre-testing questions
- What feedback do you have on the questions in this section?
- What did you think of the organization and flow of the questions?
- Could any questions be eliminated because they are repetitive or not applicable?
- Are there any questions or topics you found sensitive?
Pre-test procedures. We took slightly different approaches to pre-testing the key informant interview guide and the retailer interview guide. We asked IDHS staff to review the full interview guide and consider recommendations for improving content or clarity, and we then held a one-hour virtual meeting to debrief and ask targeted questions. Retailer staff did not review any materials in advance; instead, we held a 45-minute virtual meeting to walk through the questions together and ask for targeted feedback.
All virtual meetings were held in June and July 2023. In each virtual meeting we began by explaining the context of the SNAP Mobile Payment Pilots Evaluation, the pre-testing process, and the structure of the interview guides. We clarified that we were seeking feedback on the structure of the interview guides, the content of the questions, and the likelihood that the ultimate respondents would be able to answer the questions. We asked questions about each section and sub-section of the interview guide during the virtual meeting (see pop-out box for examples). We took notes throughout the conversations and recorded the meetings.
General acceptability: Overall, participants said the interview guides were comprehensive and the questions were straightforward. Respondents made several suggestions for improving the clarity of certain sections and questions. For example, IDHS suggested adding another sentence to the introduction of the key informant interview to provide more context for the type of work Mathematica typically does for FNS. In addition, IDHS suggested adding more information about the reference period for each interview, given that the key informant interviews will occur several times over a period of years. During discussions about interview guide administration, some retailers raised concerns that customer-facing workers, such as cashiers and clerks, might not have the knowledge to answer certain questions. Our respondents suggested that certain questions, particularly those about the retailer’s technology systems and customer shopping patterns, would be better answered by managers and corporate-level staff with access to store data.
Language choice: During the discussion of the store environment, retailers expressed a preference for using the term “EBT” instead of “SNAP” in the text, noting that most in-store language, including point-of-sale systems, uses “EBT” to refer to the payment type.
Exhibit 6. Question modifications from pre-test
Discussion Guide Section | Question(s) | Feedback | Revision
Key Informant Interview Guide
Section B | Question 7. Are these formal or informal partnerships? | One respondent stated that “Partners could be contracted in another way, but still be an advisor on this.” |
Section C | Question 14. Were new staff hired specifically for the pilot? | One respondent suggested we ask whether any staff have been added over time or whether they felt there were additional staffing needs. |
Section D | Questions 44 and 45. What steps did you take with your State’s mobile application developer to prepare to offer NFC/QR code technology? | Respondents are not planning to include QR codes in their State’s pilot. |
Section F | Operations (Round 2 only) | One respondent thought that the guide does not include enough questions about EBT testing at the retailer level. |
Retailer Interview Guide
Introduction | “…what strategies were used to ensure program integrity…” | Retailers stated that the average cashier/clerk may not understand the term “program integrity.” |
Section B | Question 7. Did the store offer mobile payment methods (for non-SNAP purchases) prior to your involvement in the SNAP mobile payment pilot? | One respondent commented that in-store staff use the term EBT instead of the term SNAP. Respondents also suggested asking this question of both in-store and corporate-level staff. |
 | | Respondents stated that mobile payment needs to be defined in this question. |
Section C | Question 12. What did your store need to do to prepare to offer SNAP mobile payments? | Some respondents wondered whether in-store cashiers or clerks would know how to answer this question. |
Section D | Question 23. Has your store experienced any challenges with mobile payments processing incorrectly? | A few respondents noted that complaints could also go to the customer satisfaction department. |
Section E | Question 28. How difficult would it be to roll out mobile payments to stores in the rest of the State? | Some respondents were not sure whether this question would be relevant for store clerks and managers in their stores. |
 | Question 29. What advice would you give other retailers that want to implement mobile payments? | Some respondents were not sure whether this question would be relevant for store clerks and managers in their stores. |
Respondent burden. Retailer staff estimated that the interview would take 15 to 30 minutes, depending on the level of engagement of the interviewee. (We did not ask IDHS staff to estimate the time for key informant interviews because the pre-test covered all questions in the guide, whereas only selected questions will be asked in each interview round.)
The study team has revised the surveys, key informant interview guide, and retailer interview guide to reflect the changes described in this memo. We have had both revised surveys translated into Spanish. Finalized surveys and interview guides will be included in the final OMB package.