OMB Number: 0584-XXXX
Expiration Date: XX/XX/XXXX

Supporting Statement Part B
OMB No. 0584-XXXX
Modernizing Channels of Communication With SNAP Participants
March 2, 2020
Office of Policy Support
Food and Nutrition Service
U.S. Department of Agriculture
1320 Braddock Place
Alexandria, VA 22314
Contents
Part B. Collection of Information Employing Statistical Methods
B.1. Respondent Universe and Selection Methods
B.2. Procedures for the Collection of Information
B.3. Methods to Maximize Response Rates and the Issue of Nonresponse
B.4. Tests of Procedures or Methods to Be Undertaken
B.5. Individuals Consulted on Statistical Aspects of the Design
Tables
Table B.1.1. Breakout of Respondents and Nonrespondents by Respondent Type
Table B.3.1. Expected Response Rates
Table B.4.1. Summary of Pretest Findings
Table B.5.1. Individuals Consulted on Statistical Aspects of Design
Appendices
A. Legal Authority Statutes and Regulations
B. Research Objectives and Questions by Data Source
C. Pretest Methods and Summary of Findings
D. Use of Incentives
E. State SNAP Director Interview Protocol
F. Introductory Telephone Call With State MCS Staff and Administrators Protocol
G. State MCS Staff and Administrators Interview Protocol
H. Business Software Developers Interview Protocol
I. Local Office Frontline Staff Group Interview Protocol
J. Business Not-For-Profit Community Partners Interview Protocol
K. SNAP Participant Focus Group Protocol
L. SNAP Office Waiting Room Questionnaire
M. Conceptual Framework for MCS Functions Diagram
N. SNAP Participant Focus Group Eligibility Screener
O. SNAP Participant Focus Group Demographic Questionnaire
P. Consent Form for Waiting Room Questionnaire Participants
Q. Consent Form for Focus Group With SNAP Participants
R. Consent Form for Stakeholder Interviewees (60 Minutes)
S. Consent Form for Stakeholder Interviewees (90 Minutes)
T. Insight Policy Research Confidentiality Pledge
U. Propel Public Comments
V. FNS Response to Propel Public Comments
W. Code for America Public Comment
X. FNS Response to Code for America Public Comment
Y. Case Study Site Recruitment Email to States from Research Team
Z. Template Recruitment Email from Regional Office to Case Study Site
AA. Project Overview for Case Study Site Recruitment
AB. Confirmation Email for Case Study Site
AC. Advance Materials for Confirmed Case Study Site
Part B. Collection of Information Employing Statistical Methods
B.1. Respondent Universe and Selection Methods

Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
Respondent Universe
According to a 2018 national scan of States’ use of mobile communication strategies (MCS) for the Supplemental Nutrition Assistance Program (SNAP), most States and U.S. territories (n = 38) had optimized their websites for mobile devices. Only a few States (n = 3) had fully implemented a comprehensive MCS (fully operational mobile applications [apps], text messaging [SMS], and mobile-optimized websites). Some States had implemented mobile apps (n = 8) or had apps under development (n = 7), and some had implemented text messaging (n = 15) or planned to implement it (n = 7).1 The U.S. Department of Agriculture’s (USDA) Food and Nutrition Service (FNS) will select five States (i.e., a State, county, or local municipality) that represent a mix of MCS functionality and features as well as participant adoption and satisfaction. The proposed selection will also include States with a mix of SNAP caseload characteristics and geographic locations to ensure study findings are relevant to the range of States’ circumstances. We will ensure the five selected States collectively represent the following:
Five of the seven FNS regions
A range of MCS capabilities (e.g., at least one State that uses text messaging only, at least one State that uses a combination of text messaging and mobile app technology/a mobile-optimized website, at least one State that employs multiprogram MCS technology)
Functionality covering all five major pathways of the client experience: (1) application/recertification, (2) change reporting, (3) notifications, (4) client inquiries, and (5) benefits and electronic benefit transfer
A variety of States whose MCS include other safety net programs
Both early and recent adopters of MCS
A variety of other demographic characteristics (e.g., one rural State facing broadband access challenges, one State with an older SNAP population)
When narrowing the list of States, FNS will work closely with the research team to identify the considerations of greatest interest for this study. After the States are sorted against these considerations, the research team will select 10 States that represent a mix of characteristics and flag the 5 it recommends for inclusion in the study; the other 5 States will serve as potential alternatives. FNS and the research team will discuss each of the 10 States. After considering all factors, the research team will finalize the list of five States for recruitment and provide FNS with a memorandum describing the selection criteria and process.
Estimated Number of Respondents
The study will gather data through site visits to five States with SNAP MCS. Data will be collected in each of the five study States through (1) interviews with the State SNAP director, State MCS leads and other staff involved in MCS implementation, local SNAP office staff, and State software developers or IT staff; (2) interviews with community partners; (3) interviews with for-profit organizations (e.g., software developers or IT staff); and (4) focus groups with individuals/households (i.e., SNAP participants) and surveys of SNAP applicants and waiting room visitors. These data will provide information on States’ and program recipients’ use of MCS and client satisfaction with and perspectives on MCS.
This new information collection will contact 444 individuals (72 State and local government staff, 10 for-profit organization staff, 21 not-for-profit staff [staff from community partners], 169 SNAP participants, and 172 SNAP office waiting room visitors). Of the 444 contacted, an estimated 326 will be responsive and 118 will be nonresponsive. Table B.1.1 provides the breakout of respondents and nonrespondents by respondent type.
Table B.1.1. Breakout of Respondents and Nonrespondents by Respondent Type
Respondent Type | Total Contacted | Number of Respondents | Number of Nonrespondents
State and local government
  State SNAP directors | 6 | 5 | 1
  State staff involved in MCS | 32 | 27 | 5
  State software developers or IT staff | 7 | 6 | 1
  Local office staff | 27 | 22 | 5
Business or other for-profit
  Software developers or IT staff | 10 | 5 | 5
Business or other not-for-profit
  Community partners | 21 | 16 | 5
Individuals
  SNAP participants/MCS users | 169 | 123 | 46
  SNAP-eligible individuals | 172 | 122 | 50
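Taken together, these counts imply an overall expected response rate of roughly 73 percent. This is a simple arithmetic check on the figures above, not an additional estimate:

\[ \frac{326}{444} \approx 0.73 \]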
B.2. Procedures for the Collection of Information

Describe the procedures for the collection of information, including:
Statistical methodology for stratification and sample selection
Estimation procedure
Degree of accuracy needed for the purpose described in the justification
Unusual problems requiring specialized sampling procedures
Any use of periodic (less frequent than annual) data collection cycles to reduce burden
FNS will send a letter to the State SNAP directors to inform them their States have been selected for the study and to encourage their participation. Within 1 week of the delivery of the recruitment letters, the FNS research team will follow up with an email to the State SNAP directors to schedule a call to discuss the study. These initial communications will clearly outline the expectations for participation, including requests for staff to participate in interviews, assist with recruiting focus group participants, and broker connections with a local office that has space to hold the focus groups. The initial recruitment call will address the State’s questions, identify points of contact at the SNAP office, and establish next steps. If any of the recommended States are unwilling or unable to host the case study, the team will contact the alternative States. Once the selected States have agreed to participate, the team will hold an informational onboarding webinar to orient them to the project.
Once the selected States identify local offices to assist with the study, the team will conduct an additional 1-hour virtual training for the local office staff, which will include information on the following activities:
Recruiting focus group participants using the designated screening instrument
Tracking the recruitment process
Protecting participants’ privacy
Reminding participants
Hosting the focus groups
No statistical sampling methodology will be employed, no estimation procedures will be required, and no unusual problems requiring specialized sampling procedures have been identified. Communication will occur via email, telephone calls, and in-person site visits. Because this is a one-time data collection, no periodic data collection cycles are applicable.
B.3. Methods to Maximize Response Rates and the Issue of Nonresponse

Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
The staff at each local office the study team visits will be responsible for assembling two groups of at least six individuals who meet the screening criteria (e.g., have installed and used MCS on their mobile devices) and can attend the group discussions at the designated date and location. To compensate for no-shows, 10–12 individuals will be recruited per focus group to ensure at least six attend. The FNS research team will also ask the local office staff to remind participants of the upcoming focus group by telephone or text the day before the discussion.
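As a rough illustration of this over-recruitment target (assuming the 54 percent in-person focus group attendance rate anticipated in table B.3.1 applies at the group level, which this paragraph does not state explicitly), recruiting 10–12 individuals per group would be expected to yield about 5–6 attendees:

\[ 0.54 \times 10 \approx 5.4, \qquad 0.54 \times 12 \approx 6.5 \]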
To fully support the local offices in their recruitment efforts, the research team will provide the following recruitment materials and guidance to each local office:
A 1-hour virtual training to provide a detailed overview and instructions for the recruitment process
Screener documents local SNAP staff can use to identify eligible participants
Recruitment scripts
A log for tracking names and contact information of clients who have agreed to participate during each time slot
A study overview handout that describes the purpose of the focus groups, planned topics of discussion, and incentive offered
The study team expects the planned data collection methods will yield the accurate and reliable data needed to satisfy the objectives of this study. Table B.3.1 shows the anticipated response rates.
Table B.3.1. Expected Response Rates
Respondent Type | Research Activity | Expected Response Rate (%)
State and local government
  State SNAP directors | Site recruitment | 83
  | Advance materials | 100
  | Onsite interviews | 100
  State staff involved in MCS | Site recruitment | 83
  | Advance materials | 100
  | Onsite interviews | 83
  State software developers or IT staff | Advance materials | 100
  | Onsite interviews | 83
  Local office staff | Advance materials | 100
  | Onsite interviews | 80
Business or other for-profit
  Software developers or IT staff | Advance materials | 100
  | Onsite interviews | 50
Business or other not-for-profit
  Community partners | Advance materials | 100
  | Onsite interviews | 75
Individuals
  SNAP participants/MCS users | Advance materials | 100
  | Eligibility screener | 100
  | Reminder | 72
  | In-person focus group | 54
  SNAP-eligible individuals | Onsite recruitment | 100
  | Onsite interview | 70
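For reference, the onsite interview rates above are roughly consistent with the respondent counts in table B.1.1 for most respondent types; this is an observation on consistency between the two tables, not a calculation made in this section. For example:

\[ \frac{22}{27} \approx 0.81 \text{ (local office staff)}, \qquad \frac{16}{21} \approx 0.76 \text{ (community partners)} \]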
The FNS research team anticipates that all the selected States will participate. However, in the event a State selected to participate is unable to do so, the study team will contact the alternative States.
B.4. Tests of Procedures or Methods to Be Undertaken

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
FNS pretested all the interview data collection instruments for the study to evaluate the clarity of the questions asked, identify possible modifications to the order or wording of the questions that could improve the quality of the data, and estimate respondents’ burden. After completing the pretest interviews via telephone and in person, the team prepared a memorandum that summarized the pretest procedures and findings and addressed pretest respondents’ understanding and ability to answer the interview questions or provide requested data. The memorandum also recorded the time required to complete each data collection activity and identified recommended changes to improve the clarity or flow of questions. Based on the results of the pretests and additional feedback from FNS and the research team, we finalized the data collection instruments.
Pretest Findings and Changes to Interview Protocols
The pretest methods and findings are detailed in appendix C (Pretest Methods and Summary of Findings) and summarized in table B.4.1.
Table B.4.1. Summary of Pretest Findings
Instrument | Summary of Findings | Resulting Changes to Instrument
Introductory Phone Call With State MCS Staff and Administrators Protocol | Duration: 41 minutes (anticipated 60 minutes); further clarification needed on research team approach to compiling State-specific understanding of MCS; one probe was deemed unnecessary | Adjusted introductory text; removed one probe
State MCS Staff and Administrators Interview Protocol | Duration: 75 minutes (anticipated 90 minutes); clarification needed on definition of MCS; further clarification needed on research team approach to compiling State-specific understanding of MCS; several questions appeared redundant; some confusion about definition of terms | Adjusted introductory text and added definition of MCS; removed redundant questions; added clarifying context to provide background
Software Developers Interview Protocol | Duration: 55 minutes (anticipated 60 minutes); further clarification needed on research team approach to compiling State-specific understanding of MCS; wording of some questions was unclear | Adjusted introductory text; clarified wording in some questions
Local Office Frontline Staff Group Interview Protocol | Duration: 51 minutes (anticipated 60 minutes); further clarification needed on research team approach to compiling State-specific understanding of MCS; further clarification needed on a few questions; even in States lacking a formal MCS, participants could use mobile communication for SNAP | Adjusted introductory text; added clarifying text and adjusted wording of a few questions; added two optional questions for States with no text messaging capabilities or mobile app to ask staff about the presence of third-party or informal mobile communications
Other Stakeholders/Community Partners Interview Protocol | Duration: 27 minutes (anticipated 60 minutes); further clarification needed on research team approach to compiling State-specific understanding of MCS; even in States lacking a formal MCS, participants could use mobile communication for SNAP; lack of general understanding about how community partners’ clients use MCS | Adjusted introductory text; added two optional questions for States with no text messaging capabilities or mobile app to ask staff about the presence of third-party or informal mobile communications; added an introductory question to assess whether, when, and how clients use MCS for SNAP
SNAP Participant Focus Group Protocol | Duration: 61 minutes (anticipated 90 minutes); lack of flow between introduction and icebreaker; some participants focused on third-party MCS; some participants may use MCS informally to communicate with their caseworkers; even in States lacking a formal MCS, participants could use mobile communication for SNAP; confusion around the wording of some questions | Adjusted ordering of icebreaker and introductory text; added clarifying text to focus conversation on State-sponsored MCS; added probe to assess informal MCS; added two optional questions for States with no text messaging capabilities or mobile app to ask participants about the presence of third-party or informal mobile communications; adjusted wording in some questions to improve clarity
SNAP Office Waiting Room Questionnaire | Duration: 10–11 minutes (anticipated 5–7 minutes); confusion around the wording of some questions | Deleted two questions to ensure completion within allotted time; adjusted wording in some questions to improve clarity
SNAP Participant Focus Group Demographic Questionnaire | Duration: 5–11 minutes (anticipated to be included as part of 90-minute focus group); confusion around the wording of some questions; additional response options needed in some questions | Adjusted wording in some questions to improve clarity; added response options in some questions
B.5. Individuals Consulted on Statistical Aspects of the Design

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
FNS consulted with a mathematical statistician from USDA’s National Agricultural Statistics Service (NASS), who reviewed the study methodology and procedures (see table B.5.1).
Table B.5.1. Individuals Consulted on Statistical Aspects of Design

Name | Title | Organizational Affiliation | Contact Information
Sofiya Cherni | Mathematical Statistician | Sampling and Frame Development Section, Methodology Division, National Agricultural Statistics Service | (202) 690-2178
FNS has contracted with Insight Policy Research (part of our research team) to assist in conducting this study. The Project Officer for the contract providing funding for the evaluation, Mr. Andrew Burns, will be responsible for receiving and approving all contract deliverables.
1 See the memorandum “Task 2.1: National Scan of States’ Use of Mobile Communication Strategies (MCS) to Enhance SNAP Participant Experiences” submitted February 15, 2019, by Insight to FNS.