Supporting Statement – PART A Justification for OMB Clearance
Paperwork Reduction Act of 1995
OMB No. 0584-[NEW]
Rapid Cycle Evaluation of Operational Improvements in Supplemental Nutrition Assistance Program (SNAP) Employment & Training (E&T) Programs
Office of Policy Support
Food and Nutrition Service
United States Department of Agriculture
1320 Braddock Place
Alexandria, Virginia 22314
Phone: 703-305-0414
Email: [email protected]
TABLE OF CONTENTS
A1. Circumstances Making the Collection of Information Necessary
A2. Purpose and Use of the Information
A3. Use of Information Technology and Burden Reduction
A4. Efforts to Identify Duplication and Use of Similar Information
A5. Impact on Small Businesses or Other Small Entities
A6. Consequences of Collecting the Information Less Frequently
A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5(d)(2)
A8. Comments in Response to the Federal Register Notice and Efforts to Consult with Persons Outside the Agency
A9. Explanation of Any Payments or Gifts to Respondents
A10. Assurances of Confidentiality Provided to Respondents
A11. Justification for Sensitive Questions
A12. Estimates of Hour Burden Including Annualized Hourly Costs
A13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
A14. Annualized Cost to the Federal Government
A15. Explanation for Program Changes or Adjustments
A16. Plans for Tabulations and Publication and Project Time Schedule
A17. Display of Expiration Date for OMB Approval
A18. Exception to the Certification Statement Identified in Item 19 of OMB Form 83-I
APPENDICES
A. Legal Authority
B1. Summary of Intervention Designs
B2. Intervention Outreach Materials
B3. Intervention Assessment Materials
C. Data Collection Summary Table
D1. SNAP Administrative Data Request
D2. Administrative Data: RAPTER Screenshots
E1.1. Colorado Participant Survey Specifications
E1.2. Colorado Participant Survey Specifications: Spanish
E1.3. Colorado Participant Survey Screenshots
E1.4. Colorado Participant Survey Screenshots: Spanish
E2.1. Massachusetts Participant Survey Specifications
E2.2. Massachusetts Participant Survey Specifications: Spanish
E2.3. Massachusetts Participant Survey Screenshots
E2.4. Massachusetts Participant Survey Screenshots: Spanish
E3.1. Connecticut Participant Survey Specifications
E3.2. Connecticut Participant Survey Specifications: Spanish
E3.3. Connecticut Participant Survey Screenshots
E3.4. Connecticut Participant Survey Screenshots: Spanish
E4.1. Rhode Island Participant Survey Specifications
E4.2. Rhode Island Participant Survey Specifications: Spanish
E4.3. Rhode Island Participant Survey Screenshots
E4.4. Rhode Island Participant Survey Screenshots: Spanish
E5.1. Participant Survey Advance Letter
E5.2. Participant Survey Initial Email/Text
E5.3. Participant Survey Reminder Email/Text
E5.4. Participant Survey Reminder Letter
E5.5. Participant Survey Reminder Postcard
F1. Participant Focus Group Discussion Guide
F1.1. Participant Focus Group Information Form
F2.1. Participant Focus Group Recruitment Email/Text
F2.2. Participant Focus Group Confirmation and Reminder Email/Text
F3. Participant FAQ
G1. Participant In-depth Interview Discussion Guide
G2.1. Participant In-depth Interview Recruitment Email/Text
G2.2. Participant In-depth Interview Confirmation and Reminder Email/Text
H1. Staff Semi-structured Interview Guide
H2. Staff Semi-structured Interview Invitation Email
I1. Staff Questionnaire Specifications
I1.1. Staff Questionnaire Screenshots
I2. Staff Questionnaire Advance Email with FAQ
J. 60-Day Federal Register Notice
K1. Public Comment 1
K2. FNS Response to Public Comment 1
L. NASS Comments
M. Participant Survey Pretest Memo
N. Incentive and Response Rates
O. FNS-8 USDA/FNS Studies and Reports
P. FNS-10 USDA/FNS Persons Doing Business with the Food and Nutrition Service
Q1. Confidentiality Pledge
Q2. Data Security Plan
R. Connecticut Evaluation Consent Form
S. Institutional Review Board (IRB) Approval
T. Burden Table
U. Site-specific MDI Tables
V. Site-specific Intervention Design Diagrams
A1. Circumstances Making the Collection of Information Necessary

Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
This is a new information collection request. In addition to providing food assistance benefits to support adequate access to food, the Supplemental Nutrition Assistance Program (SNAP) provides Employment and Training (E&T) services to improve recipients’ economic self-sufficiency and reduce long-term reliance on SNAP. State SNAP agencies are required to operate an E&T program (Food and Nutrition Act of 2008, Section 6(d)(4)(A)(i)), but have considerable flexibility to determine their program operations and the services and activities they offer. The U.S. Department of Agriculture’s Food and Nutrition Service (FNS) seeks to ensure the quality of the services and activities offered through SNAP E&T programs by investing resources and providing technical assistance to help States build capacity, create more robust services, and increase engagement in their programs.
Section 17 of the Food and Nutrition Act of 2008, as amended in March 2022, authorizes the Secretary of Agriculture to contract with private organizations and conduct research to improve the administration and effectiveness of SNAP (Appendix A: Legal Authority). The planned data collection will allow FNS to test new, low-cost, small-scale interventions in SNAP E&T operations or service delivery using rapid cycle evaluation (RCE). States and providers typically do not have the resources to implement major changes to business processes, service delivery approach, or service options, but may be able to make small changes that promote better outcomes, such as improved program engagement and service take-up. Developing and testing these interventions will enable States to improve SNAP E&T programs and FNS to identify how to provide technical assistance to States.
A2. Purpose and Use of the Information

Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate how the agency has actually used the information received from the current collection.
The purpose of this information collection is to address the objectives presented in Table A.1.
Table A.1. Study objectives

1. Describe how RCE can be used to improve SNAP E&T operations, service delivery, and program outcomes, including whether the use of RCE in SNAP E&T differs from its use in other public assistance programs and whether there are special considerations and challenges FNS should review when using RCE in SNAP E&T.
2. Design and implement RCEs to obtain estimates of the effectiveness of small-scale interventions on SNAP E&T outcomes, including identifying which States and providers are best suited to implement these changes and how the changes are expected to affect SNAP E&T operations and service delivery.
3. Conduct an implementation evaluation to identify challenges to implementing programmatic changes; learn about the context in which changes were made; and assess how well staff and participants view, understand, and react to the changes.
4. Assess the scalability of small-scale interventions to SNAP E&T operations and service delivery in other programs, including identifying what modifications would be needed to scale these changes to other SNAP E&T programs.
5. Determine and document the costs associated with implementing and maintaining small-scale interventions for both the entities involved in the implementation and the recipients of the programmatic changes.
No other new or ongoing effort addresses these research study objectives. This study’s findings will inform how FNS improves the efficiency and effectiveness of SNAP E&T programs by enabling program administrators to target scarce resources and maximize potential impacts on the individuals they serve. FNS has contracted with a vendor to conduct this evaluation.
A2.a Background. FNS has partnered with eight sites for this study: (1) Colorado Department of Human Services, (2) Connecticut: Community Colleges, (3) District of Columbia Department of Human Services, (4) Kansas Department for Children and Families, (5) Minnesota Department of Human Services, (6) Minnesota: Hennepin County Department of Human Services, (7) Massachusetts Department of Transitional Assistance, and (8) Rhode Island Department of Human Services. The study team is collaborating with sites to identify their main challenges to recruitment, outreach, participant engagement, and receipt of services, and to design interventions to address these challenges (Appendix B1: Summary of Intervention Designs). Examples of proposed interventions include sending behaviorally informed text messages/emails to encourage enrollment in SNAP E&T or attendance at appointments or activities (Appendix B2: Intervention Outreach Materials), and using work readiness assessments to improve referrals and enhance case management (Appendix B3: Intervention Assessment Materials).
Most interventions will be evaluated using randomized controlled trials in which individuals eligible for the intervention will be randomly assigned to a treatment group that receives the intervention or a control group that does not. The control group will continue to be offered the site’s existing approach. Following a short pilot period to ensure interventions operate smoothly, sites will implement their interventions for three to six months, depending on the site.
A2.b Information collections. The study will gather data from State SNAP agency administrative records, State SNAP E&T administrative records, SNAP participants (in the form of a participant survey, focus groups, and in-depth interviews), and State and local agency staff and local E&T provider staff (in the form of staff semi-structured interviews and a staff questionnaire) to evaluate each intervention’s effectiveness and implementation. The Data Collection Summary Table (Appendix C) provides an overview of the information being collected. Participation in the participant survey, focus groups, and in-depth interviews (IDIs) is voluntary for SNAP participants. Participation in the study, provision of administrative data, and participation in the staff semi-structured interviews and staff questionnaire are also voluntary for State and local SNAP agencies and E&T providers. Each data collection activity will be conducted once in 2023.
SNAP administrative data. In all eight sites, State SNAP agency staff will provide the study team with contact information and demographic and household information for SNAP participants targeted by each intervention (Appendix D1: SNAP Administrative Data Request). These data will be obtained electronically through a secure File Transfer Protocol (FTP) site. Depending on the intervention, some sites will use this contact information to send individuals electronic messages for recruitment, outreach, and engagement. SNAP administrative data will also be used by the contractor to sample and recruit SNAP participants into information collection activities, including participant surveys, focus groups, and IDIs, which are described below. All communications will make clear that study participation is voluntary and that participants can opt out at any point. If participants opt out of the study during the intervention period, they will not be contacted for the participant survey, focus groups, or IDIs. Demographic and household information will be used in the study’s impact analysis to analyze differences in outcomes by participant characteristics.
SNAP E&T administrative data. In all eight sites, State or county SNAP agencies and local E&T providers will provide outcome information such as participants’ contacts with E&T providers, enrollment in E&T, completion of assessments, receipt of referrals, and engagement in activities. Depending on each site’s preferences, the team will collect these data either electronically through a secure FTP site or using Random Assignment, Participant Tracking, Enrollment and Reporting (RAPTER®), a secure information system in which SNAP E&T and provider staff can enter outcome information for each participant (Appendix D2: Administrative Data: RAPTER Screenshots). The RAPTER® system allows caseworkers, intervention staff, site administrators, and site team leaders to monitor outcome information in real time throughout the intervention period for quality assurance and completeness. RAPTER® is designed to securely store sensitive information and is built on Federal Information Security Management Act (FISMA) Moderate services in the Amazon Web Services (AWS) Cloud. These data will be used as outcome measures in the study’s impact analysis, which will assess the effectiveness of the interventions.
Participant survey. The study also will include a 15-minute survey administered to 4,000 participants across four of the eight sites (Appendices E1 – E4: Participant Survey Specifications & Screenshots). FNS purposively selected four sites in which to conduct the participant survey to obtain evaluation information on a range of unique interventions. The contractor will send an advance letter to all participants (Appendix E5.1: Participant Survey Advance Letter), as well as an email or text if possible (Appendix E5.2: Participant Survey Initial Email/Text). Throughout data collection, nonrespondents will also receive a reminder email or text (Appendix E5.3: Participant Survey Reminder Email/Text), a reminder letter (Appendix E5.4: Participant Survey Reminder Letter), and a reminder postcard (Appendix E5.5: Participant Survey Reminder Postcard). The multimode survey will be available in English and Spanish, and respondents will have the option to self-administer the survey by web or complete it by phone with a trained interviewer. The survey is cross-sectional. The data collection period will last up to four months, but will depend on each site’s timeline, enrollment, and outcomes. Depending on the site’s intervention, the survey will collect information on receipt of SNAP E&T recruitment and outreach materials, assessments, case management, and referral services. The participant survey also will assess barriers to engaging with services and seeking employment, program satisfaction, and reasons for engagement decisions (for those who engaged in the E&T programs, and those who either never engaged or disengaged). This information will be used as outcomes in the study’s impact analysis when outcome information cannot be obtained from the SNAP E&T administrative data. The information also will be used to describe participants’ experiences in the intervention to provide context for those analyses.
Participant focus groups. SNAP participants in each of the eight sites will participate in 90-minute in-person focus groups conducted by trained two-person teams (Appendix F1 & F1.1: Participant Focus Group Discussion Guide and Information Form). Teams will conduct two focus groups per site, each with 8 to 10 SNAP participants.1 The team will use contact information available in the SNAP administrative data to identify potential respondents at each site and contact them via email and text messages, asking SNAP participants to contact the study team if they are interested in participating (Appendix F2.1 & F2.2: Participant Focus Group Recruitment & Confirmation Email/Text). Individuals will be recruited to participate by site based on the treatment they received and whether they decided to participate in the program (see SSB B.1). For example, in Kansas, the team will conduct one focus group with individuals who received text message reminders and one group with individuals who received a behavioral nudge. Two members of the study team will facilitate the focus groups, using a participant FAQ document to respond to participant questions (Appendix F3: Participant FAQ). The focus groups will obtain participants’ perspectives on intervention recruitment and retention efforts and explore why people do and do not participate in E&T activities, their opinions about the services offered or received, and the types of barriers they face that could prevent them from participating in E&T or finding a job. This information will provide context for quantitative data findings, adding breadth and depth to the impact evaluation.
Participant in-depth interviews. A total of 60 SNAP participants in the four sites administering the participant survey will participate in 90-minute in-depth interviews conducted by trained two-person interviewer teams (Appendix G1: Participant IDI Discussion Guide). The interviews will occur at a location convenient for respondents. Recruitment for the in-depth interviews will follow the same recruitment process described above for the participant focus groups (Appendix G2.1 & G2.2: Participant IDI Recruitment & Confirmation Email/Text). The data will provide detailed, contextual information for the impact and implementation analyses about participants’ situations and insights into their lives; details of their goals and perceptions of factors that might impede them from reaching their goals; their relationship with site staff, training providers, and employers; and their perceptions of how the intervention has helped them progress toward their goals.
Staff semi-structured interviews. State SNAP agency staff and local staff, including local SNAP office staff, E&T service provider staff, and relevant partner organization staff, will participate in 60- to 90-minute semi-structured interviews conducted by trained two-person teams during in-person visits to all eight sites (Appendix H1: Staff Semi-Structured Interview Guide). The team will send an invitation email to all potential respondents, inviting them to participate in the interviews (Appendix H2: Staff Semi-Structured Interview Invitation Email). Types of respondents and interview locations will vary by intervention depending on the structure of the program. The team will conduct a total of 240 interviews with State administrative staff and local/business frontline staff across the eight sites, approximately 15 interviews of each respondent type in each site. The semi-structured interviews will obtain staff perspectives on the intervention’s implementation, including details about the context and process of the intervention and service provision, and lessons learned from the experience. The data will be used in the study’s implementation analysis to describe intervention services and workflow, assess implementation fidelity and costs, and determine how to replicate and sustain changes.
Staff questionnaire. The study team will administer a 15-minute web-based questionnaire to a total of 120 frontline agency and provider staff implementing the intervention across the eight sites (Appendix I1 & I1.1: Staff Questionnaire Specifications and Screenshots). The questionnaire will be deployed approximately six to eight weeks before the site visit, so the team can use the information collected to inform its discussions on-site and ensure it follows up on any patterns identified in the data during the staff semi-structured interviews. The team will send an invitation email to all sample members, inviting them to participate in the questionnaire (Appendix I2: Staff Questionnaire Advance Email with FAQ). The questionnaire will collect information about staff’s characteristics and experiences; time savings or cost of the interventions; understanding of, opinions about, and satisfaction with the changes; effect of the changes on their job; challenges they face in providing services; and what program aspects they consider critical for success. This information will be used in the study’s implementation analysis to describe staff skills, experience, understanding, and perceptions of the intervention; assess implementation consistency across staff and geographic areas within sites; and inform how to replicate practices and services tested through the interventions.
FNS will use all of the information collected in the study to improve SNAP E&T operations, service delivery, and program outcomes beyond the eight sites included in the study. FNS will learn how RCE can be applied to SNAP E&T as a tool for learning about State agency and provider challenges; innovating current recruitment, outreach, and engagement practices to increase their efficiency and effectiveness; and improving SNAP E&T participants’ experiences, barrier mitigation, and employment outcomes while reducing burden on administrative staff. FNS also will use the information from the study to learn whether there are agency-, program-, or participant-specific considerations that differentiate using RCE in SNAP versus using it in other settings. Using this information, FNS will decide how to replicate or sustain innovations and best practices for SNAP E&T identified through the interventions implemented in the study.
Information shared with any other organizations inside or outside USDA or the government. The findings of the study will be published in a set of eight reports and briefs that will be available to SNAP E&T program administrators and the general public on FNS’ website. Restricted-use data files and documentation will be prepared for FNS staff, allowing the agency to conduct future analyses.
A3. Use of Information Technology and Burden Reduction

Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.
FNS is committed to complying with the E-Government Act of 2002 to promote the use of technology. The study will obtain SNAP administrative data and SNAP E&T administrative records electronically either through a secure FTP site or through entry into RAPTER® (Appendix D2: Administrative Data: RAPTER Screenshots). State and local administrative staff may submit these data in whatever form is most convenient for them, given the variety of systems and databases used by States nationwide. The FTP site URL is https://www.websiteforthcoming.com.2 Any site that already tracks participant outcome information in its own management information system will be able to import these data directly into RAPTER® without manually entering the information into the system. The study team expects that 100 percent of administrative data collection will be conducted electronically.
The study will administer the participant survey through a mobile-optimized online instrument, available in English and Spanish, that uses Computer Assisted Web Interviewing (CAWI) and is developed on the Confirmit web survey platform. Participants will receive a unique web survey link that will take them directly to their survey instrument. The study team expects 40 percent of participant surveys to be completed electronically by web.
The study team expects the remaining 60 percent of participant surveys to be completed with a trained telephone interviewer using Computer Assisted Telephone Interviewing (CATI). Telephone interviewers will be available to complete any modules that the respondent would like to complete over the phone. Telephone interviewers will also follow up with nonrespondents to complete the participant survey by phone after several weeks of nonresponse to the web-based version.
The study team will use mobile-optimized CAWI to administer the staff questionnaire and expects 100 percent of staff questionnaires to be collected electronically.
Online data collection enables efficient participation: programming limits questions to relevant respondents and constrains data ranges, keeping responses within a certain length and simplifying data cleaning. Web-based instruments also allow respondents to complete and submit data securely using unique, password-protected logins. Respondents may save their progress, facilitating completion of the instrument in more than one session. By including programmed skip patterns and consistency and data range checks, both the CAWI and CATI methods employ technology to reduce the data entry errors that often necessitate callbacks to respondents to clarify responses recorded by an interviewer using pencil-and-paper methods. CATI administration also allows soft and hard checks on entries that improve data quality and reduce recall bias. These systems additionally track the length of time spent on the participant survey and staff questionnaire, allowing the team to improve internal processes in real time to address respondent burden.
Participant focus groups and in-depth interviews and staff semi-structured interviews will be conducted in-person and not electronically. It is not practicable to offer electronic reporting for these qualitative, discussion-based activities. All notes will be recorded electronically.
A4. Efforts to Identify Duplication and Use of Similar Information

Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Question 2.
The study has made every effort to avoid duplication. FNS has examined Federal and non-Federal sources and has determined there are no similar prior or ongoing efforts that duplicate the proposed information collection. Because the study is testing new changes to program operations, data describing these changes are not available. Thus, the proposed participant survey, participant focus groups, participant in-depth interviews, staff semi-structured interviews, and staff questionnaire will collect information about impact and implementation that is not available from other data sources. For sites that are testing the effectiveness of the interventions on outcomes measuring enrollment and engagement in SNAP E&T, the study will use existing data that the State agency or provider staff already collect in SNAP E&T administrative data records, rather than requesting that staff collect data.
A5. Impact on Small Businesses or Other Small Entities

If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.
Information being requested has been held to the minimum required for the intended use. The study team expects the burden on small businesses or entities to be minimal. Some SNAP E&T providers (local SNAP agencies and for-profit and not-for-profit organizations) reporting SNAP E&T administrative data and participating in the staff semi-structured interviews and staff questionnaire may be small entities as defined by OMB Form 83-I. FNS estimates that 6.5 percent of business sample members (approximately 25 out of 382 sample members), or 0.03 percent of the total sample (approximately 25 out of 91,910 sample members), will be small entities. The one-time staff semi-structured interviews occurring during site visits will be scheduled in collaboration with program staff to minimize any disruption of daily activities. The site visit team will conduct group discussions to the extent feasible, and no more than 90 minutes will be required of any one individual. The staff questionnaire contains only a limited set of questions with which to assess staff’s experiences, intervention costs, and perceptions of the intervention.
The SNAP administrative data requested from State government respondents at SNAP agencies have also been held to the minimum required for the purposes of implementing the interventions and conducting the evaluation. FNS is requesting minimal amounts of participant contact information with which to send communications via text message, email, and mail, and a limited set of demographic and household variables for analysis. These data are being requested only once to limit burden on agency staff.
A6. Consequences of Collecting the Information Less Frequently

Describe the consequence to Federal program or policy activities if the collection is not conducted, or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
Data collection for the proposed study will occur one time only. Without this data collection, there would be no data to examine the effectiveness of the changes made to SNAP E&T programs under this initiative and to identify areas for improvement. These data are critical for monitoring program trends and identifying necessary and effective improvements to SNAP E&T program operations at the State and local levels.
A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5(d)(2)

Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
There are no special circumstances. The collection of information is conducted in a manner consistent with the guidelines in 5 CFR 1320.5.
A8. Comments in Response to the Federal Register Notice and Efforts to Consult with Persons Outside the Agency

If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years, even if the collection of information activity is the same as in prior years. Circumstances that preclude consultation in a specific situation should be explained.
Name | Title | Affiliation | Contact Information
Peter Quan | NASS Reviewer | National Agricultural Statistics Service |
Lisa Strunk | Assistant Director of Employment Services, Food Assistance Employment & Training Program | Kansas Department for Children and Families | (316) 283-3015 ext. 211
Eileen Peltier | Chief Regional Workforce Development Officer, North West Region | Connecticut Community Colleges | (860) 253-3032
Miriam Kaufmann | SNAP Employment and Training Manager | Massachusetts Department of Transitional Assistance | (857) 408-6193
Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
Incentives for this information collection are planned for the participant survey, participant focus groups, and participant in-depth interviews, all of which are voluntary for respondents. Providing incentives to respondents can be an essential component of the multiple approaches used to minimize non-response bias described in Section B.3 of this information collection request. All respondents who receive incentives will have been SNAP participants at the time of the intervention and will be asked to engage in a data collection effort while they face competing demands from participating in SNAP E&T activities and meeting family and work obligations. Research has long shown that incentives can increase survey response rates (thus minimizing non-response bias and improving population representativeness) without compromising data quality (Singer and Ye 2013; Singer and Kulka 2002; Singer et al. 1999).3
Based on the empirical evidence (Appendix N: Incentives and Response Rates), participant survey respondents will receive a $30 post-participation incentive paid by gift card. Incentives can help offset expenses respondents may incur to participate, such as mobile smartphone airtime or internet connectivity charges. The potential for response bias among subsets of respondents must be proactively avoided to ensure high-quality data overall and, for the impact analysis based on the participant survey data, to ensure balanced rates of response from individuals in treatment and control groups.
Focus group and in-depth interview respondents will receive a $50 gift card at the end of each focus group or interview to offset costs of participation in the study, such as transportation or child care costs. Providing payments of $50 has resulted in adequate sample sizes for in-depth interviews and focus groups with similar populations (resulting in response rates of approximately 20%) and is consistent with other OMB-approved FNS information collections which utilized incentives.
Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
The study team will comply with all Federal and State laws to protect privacy, including the requirements of the Privacy Act of 1974. The FNS Privacy Officer, Michael Bjorkman, reviewed this information collection request, had no comments or concerns regarding privacy, and determined on September 29, 2022, that the data collection meets the requirements of the Privacy Act. The study team will adhere to the requirements in the system of records notice (SORN) FNS-8 USDA/FNS Studies and Reports, published in the Federal Register on April 25, 1991, volume 56, No. 80, pages 19078–19080 (Appendix O: FNS-8), as well as FNS-10 USDA/FNS Persons Doing Business with the Food and Nutrition Service, published in the Federal Register on March 31, 2000, volume 65, No. 63, pages 17251–17252 (Appendix P: FNS-10). The study team will sign a Confidentiality Pledge (Appendix Q1: Confidentiality Pledge), participate in annual security awareness training, and follow a data security plan (Appendix Q2: Data Security Plan).
State and local SNAP administrators and individual participants will be notified that the information provided will be kept private and will not be disclosed to anyone but the researchers conducting this investigation, except as otherwise required by law. Administrative data files sent to the contractor will be shared through a secure FTP site or through RAPTER®, a secure information sharing application. No personally identifiable information will be attached to any reports or data supplied to FNS or any other researchers. While no local program operators will be identifiable in the reports or data, the files will contain site identifiers.
Consent. Respondents either will read a consent form online and click a button to affirm consent if responding to a web-based instrument (i.e., participant survey or staff questionnaire), will hear the consent form read aloud by a trained interviewer and provide verbal consent if responding to an interviewer-administered collection (i.e., participant survey, participant focus group, participant in-depth interview, or staff semi-structured interview), or will provide written consent (Appendix R: Connecticut Evaluation Consent Form). All respondents will be informed that participation is voluntary, will not affect benefits or employment, and will not carry penalties if some or all questions are unanswered. The study team will assure participants that the information they provide will not be published in a way that identifies them. For reporting of results, data will be presented only in aggregate form without identifying individuals and institutions.
Information technology and physical safeguards. The contractor uses extensive corporate administrative and security systems to prevent the unauthorized release of personal records, including state-of-the-art hardware and software for encryption that meets Federal standards and other methods of data protection (for example, requirements for regular password updating), as well as physical security that includes limited key card access and locked data storage areas. All contractor staff sign a corporate non-disclosure agreement (Appendix Q1: Confidentiality Pledge) and undergo annual security training. At the end of the project, all interview recordings will be destroyed by the contractor. Appendix Q2 (Data Security Plan) describes the security protocols employed by the contractor.
Institutional Review Board approval. The study protocol, data collection tools and procedures, informed consent forms, and data handling and security procedures were reviewed and approved by the Health Media Lab IRB on June 15, 2022 (Appendix S: IRB Approval).
Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
For individuals who agreed to participate in the participant survey, participant in-depth interview, or participant focus group, the study team will ask some questions that respondents may consider sensitive. Potentially sensitive topics may include race, ethnicity, and reasons for noncompliance with work requirements or ability to participate in program activities or maintain employment (which could include drug use, mental health issues, and other sensitive reasons).
Collecting information about these topics is necessary to gain a better understanding of the context in which individuals experience and view SNAP E&T, their responses to the intervention (such as whether they responded to text message reminders about available services), and the perceived effects of the intervention or broader SNAP E&T program. Race and ethnicity are critical background characteristics, both because they define key subgroups of individuals and because they are important control variables in the assessment of intervention impacts. Questions about race and ethnicity will follow the OMB Standards for the Classification of Federal Data on Race and Ethnicity.
FNS cannot obtain information about noncompliance with work requirements, program engagement, or employment history from existing sources because it is either not collected as part of standard program administration or not measured consistently across sites. FNS may be able to collect race and ethnicity information from administrative data, in which case it will not be asked in the participant survey. However, because this information is not measured consistently across sites, it may not always be available from existing sources.
The study team will inform respondents that their identities will be kept private to the extent permitted by law, their responses will not affect services or benefits they or their family members receive, and they can choose to skip questions without penalty. On September 29, 2022, the FNS Privacy Officer, Michael Bjorkman, reviewed this information collection request and determined that the data collection meets the requirements of the Privacy Act.
Provide estimates of the hour burden of the collection of information. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated.
The burden table (Appendix T: Burden Table) shows estimates for this information collection, including the number of respondents, frequency of response, average time to respond, and annual hour burden for each part of the data collection. Estimated burden accounts for the time to participate in the interventions; read recruitment, reminder, and confirmation messages; respond to data collection instruments; and report SNAP and SNAP E&T administrative data. There are 61,782 sample members (49,476 respondents and 12,306 nonrespondents), 239,256 responses (173,224 from respondents and 66,032 from nonrespondents), and 16,216 burden hours (14,987 hours for respondents and 1,229 hours for nonrespondents). The sample includes 61,316 individuals, 431 State/local program staff, and 35 business program staff. The average number of responses per respondent is 3.50 and the average number of responses per nonrespondent is 5.37; for respondents and nonrespondents combined, the average number of responses is 3.87.
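The totals above can be reconciled from their components. The following is an illustrative sketch only, using the figures quoted in this paragraph; the authoritative estimates remain those in Appendix T:

```python
# Illustrative consistency check of the burden figures quoted in the text.
respondents, nonrespondents = 49_476, 12_306
resp_responses, nonresp_responses = 173_224, 66_032

total_people = respondents + nonrespondents            # 61,782 sample members
total_responses = resp_responses + nonresp_responses   # 239,256 responses

avg_resp = resp_responses / respondents                # ~3.50 per respondent
avg_nonresp = nonresp_responses / nonrespondents       # ~5.37 per nonrespondent
avg_all = total_responses / total_people               # ~3.87 combined

print(total_people, total_responses)
print(round(avg_resp, 2), round(avg_nonresp, 2), round(avg_all, 2))
```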
See the burden table (Appendix T: Burden Table) for estimated annualized burden. The annualized cost of respondent burden is the product of each type of respondent's annual hour burden and an average hourly wage rate, with 33 percent added to the estimated base to reflect fully loaded wages. Wage rates are based on May 2021 Occupational Employment and Wage Statistics data from the U.S. Department of Labor, Bureau of Labor Statistics for selected occupations, and on State minimum wage rates.4,5 Wage rates for the participant survey, focus groups, and in-depth interviews use the Federal minimum wage rate of $7.25 per hour because those activities are conducted with sample members from a mix of States. Table A12.1 shows the occupation code and hourly mean wage rate used to calculate the annual respondent cost estimates for each respondent occupation category. The estimated total annualized cost to respondents, with fully loaded wages, is $323,742.81.
Table A12.1. Hourly wage rates used to estimate annualized respondent costs

Respondent category | Type of respondents | Occupation category4 | Wage rate
State government | State program staff | 11-0000, Management Occupations | $59.31
Local government | Local program staff | 21-0000, Community and Social Service Occupations | $25.94
Business (for-profit, non-profit, or farm staff) | Local program staff | 21-0000, Community and Social Service Occupations | $25.94
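The annualized-cost formula described above (annual burden hours times the hourly wage rate, times 1.33 for fully loaded wages) can be sketched as follows. The burden-hour values below are hypothetical placeholders for illustration, not figures from Appendix T:

```python
# Sketch of the fully loaded annualized respondent cost formula.
LOAD_FACTOR = 1.33  # base wage plus 33 percent for fully loaded wages

def annualized_cost(burden_hours, hourly_wage):
    """Fully loaded annualized cost for one respondent category."""
    return burden_hours * hourly_wage * LOAD_FACTOR

# Wage rates from Table A12.1 and the Federal minimum wage;
# 100 hours is a hypothetical burden figure for illustration only.
print(round(annualized_cost(100, 59.31), 2))  # State program staff
print(round(annualized_cost(100, 25.94), 2))  # Local program staff
print(round(annualized_cost(100, 7.25), 2))   # Participants (Federal minimum wage)
```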
Provide estimates of the total annual cost burden to respondents or recordkeepers resulting from the collection of information (do not include the cost of any hour burden shown in questions 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.
There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.
Provide estimates of annualized cost to the Federal government. Provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.
The total cost to the Federal Government is $7,008,555.20 over a four-year period, or $1,752,138.80 on an annualized basis. The annualized cost includes payment to a contractor, based on fully loaded wages, in the amount of $1,720,783.25, to engage with sites; design the interventions and evaluation plans; develop data collection instruments; conduct the study; complete the analysis; and deliver data files, briefings, and reports. This amount includes $107,000 in respondent incentives ((3,200 survey respondents x $30) + (160 focus group participants x $50) + (60 in-depth interview participants x $50)). The annualized cost of this information collection also assumes $28,137.48 for a GS-13, step 2 Social Science Research Analyst ($52.89/hour x 400 hours/year x 1.33) and $3,218.07 for a GS-14, step 1 Branch Chief ($60.49/hour x 40 hours/year x 1.33). Federal employee rates are based on the 2022 General Schedule of the Office of Personnel Management (OPM) for the Washington, D.C. locality.
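As an illustrative check (all figures are quoted from the paragraph above), the components of the Federal cost estimate reconcile as follows:

```python
# Reconciliation of the Federal cost components cited in the text.
contractor = 1_720_783.25  # contractor payment, includes incentives below

# Respondent incentives: survey, focus groups, in-depth interviews.
incentives = 3_200 * 30 + 160 * 50 + 60 * 50   # 107000

gs13 = 52.89 * 400 * 1.33  # Social Science Research Analyst, fully loaded
gs14 = 60.49 * 40 * 1.33   # Branch Chief, fully loaded

annualized = contractor + round(gs13, 2) + round(gs14, 2)

print(incentives)                                # 107000
print(round(gs13, 2), round(gs14, 2))            # 28137.48 3218.07
print(round(annualized, 2), round(annualized * 4, 2))
```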
Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-1.
This is a new information collection that will add 16,216 estimated burden hours (16,215.74 rounded up) and 239,256 annual responses to the OMB information collection inventory as a result of program changes.
For collections of information whose results are planned to be published, outline plans for tabulation and publication.
Study schedule. The planned schedule for the activities in the study is as follows:
Table A16.1 Data collection schedule
Project Activity | Months after OMB Approval
Conduct participant survey, focus group, and in-depth interviewer training | 1 to 2 months after OMB approval
Implement interventions with sites |
Send recruitment and advance materials to participants and agency staff | 3 to 6 months after OMB approval
Collect SNAP administrative data | 1 to 2 months after OMB approval
Collect SNAP E&T administrative data | 1 to 6 months after OMB approval
Collect participant survey data | 3 to 6 months after OMB approval
Conduct participant focus groups and participant in-depth interviews | 4 to 7 months after OMB approval
Administer staff semi-structured interviews and staff questionnaire | 4 to 7 months after OMB approval
Analyze information collected | 8 to 12 months after OMB approval
Write reports and briefs | 12 to 16 months after OMB approval
Conduct briefings with agency staff | 15 months after OMB approval
Publish reports and briefs | 16 months after OMB approval
The quantitative information obtained through SNAP administrative data and SNAP E&T administrative data will be analyzed separately by site to examine causal impacts of the eight interventions on participants’ engagement in SNAP E&T programs, service receipt, and barriers. In some sites, impacts also will be estimated for specific subgroups, such as those defined by age, household composition, and recent employment history. Analyses based on the participant survey data will describe participants’ experiences with the interventions, such as how well they understood communication materials and why they continued or discontinued participating in specific activities.
The qualitative information obtained through participant focus groups and participant in-depth interviews, as well as staff semi-structured interviews and staff questionnaire data, will document the context and operations of each intervention and help to interpret and understand the impact findings. The team will use a structured write-up guide to synthesize collected information by respondent and question. The write-up will highlight themes, provide examples and illustrative quotes, and identify discrepancies and areas of agreement among data sources. The information will be coded according to a scheme that aligns with the implementation framework, study objectives, research questions, and intervention designs. Based on the analysis of the coded data, the study team will develop site visit summaries, process map diagrams, and thematic tables and illustrative quotes to organize the findings. Finally, the analysis will assess conditions needed to conduct RCEs and replicate and sustain the interventions.
FNS will share study findings internally through potential products such as a summary of findings, data visualizations, and study briefings. Final dissemination products, in the form of eight site-specific, non-technical reports, will be available to the public on the FNS research website. The reports will provide insights into the implementation of each intervention, identify what worked well and what could have affected the outcomes, and explain how to replicate or sustain the interventions. To summarize findings at a higher level, two-page briefs for each intervention will also be developed and published.
If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The agency will display the expiration date for OMB approval on all instruments.
Explain each exception to the certification statement identified in Item 19 of the OMB 83-I “Certification for Paperwork Reduction Act.”
There are no exceptions to the certification statement. The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-I.
1 Should in-person data collection not be feasible due to the evolving nature of the COVID-19 pandemic, the study team will be prepared to conduct participant focus groups, participant in-depth interviews, and staff semi-structured interviews (described below) telephonically or virtually.
2 The secure FTP website is yet to be created.
3 Singer, E., and C. Ye. 2013. "The Use and Effects of Incentives in Surveys." Annals of the American Academy of Political and Social Science, 645(1): 112–141.
Singer, E., and R. Kulka. 2002. "Paying Respondents for Survey Participation." In Studies of Welfare Populations: Data Collection and Research Issues, eds. M. Ver Ploeg, R. Moffitt, and C.F. Citro, pp. 105–128. Washington, DC: National Academy Press.
Singer, E., J. van Hoewyk, N. Gebler, T. Raghunathan, and K. McGonagle. 1999. "The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys." Journal of Official Statistics, 15(2): 217–230.
4 We obtained average hourly wage rates for Management Occupations (11-0000) from https://www.bls.gov/oes/current/oes110000.htm and Community and Social Service Occupations (21-0000) from https://www.bls.gov/oes/current/oes210000.htm.
5 We obtained the Federal minimum wage rate from https://www.dol.gov/general/topic/wages/minimumwage and State minimum wage rates from https://www.dol.gov/agencies/whd/minimum-wage/state.
Author | Kim McDonald |
File Created | 2023-12-12 |