
SUPPORTING STATEMENT - PART B






OMB Control Number 0584-[NEW]







Survey of Supplemental Nutrition Assistance Program (SNAP) Employment and Training (E&T) Case Management


















June 25, 2021


Project Officer: Kristen Corey

Office of Policy Support

Food and Nutrition Service

U.S. Department of Agriculture

1320 Braddock Place, Alexandria, Virginia 22314







B.1. Respondent universe and sampling methods. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

This is a new information collection request. This collection includes two primary data collection components: (1) a structured web survey (see Appendices B and F) of SNAP E&T programs and (2) site visits with selected States (see Appendices C, D, and E). By surveying all 53 State and territory SNAP E&T directors and conducting in-depth case studies of four States, this study will provide FNS a comprehensive picture of case management in SNAP E&T.

Survey. The survey will collect information on how each SNAP E&T program approaches case management. The survey will be a complete census of the 53 States and territories implementing SNAP E&T programs; there will be no sampling. Since States have substantial flexibility in the design of their E&T programs and case management components, only a complete census will provide sufficient information to help FNS understand how all States and territories provide case management and how they have responded to the new case management requirement in the 2018 Farm Bill. The main point of contact for the survey will be the 53 SNAP directors, who may designate as many as three staff members to complete sections of the survey. Although this process could account for an additional 159 State or territory respondents if all SNAP directors designate three additional staff members to complete the survey, FNS expects fewer additional staff to participate. We anticipate a 100 percent response rate for the survey. To ensure a high response rate, the study team will use a reminder strategy that increases the frequency of contact with respondents via both email (see Appendices K through P) and phone (see Appendix Q) as the field period progresses.

Site visits. The data collection also includes site visits to four States to (1) explore case management in depth and (2) identify lessons learned and best practices for implementation. The respondent universe consists of 53 State or territory SNAP agencies. FNS will select four States purposively for site visits and include as many as four backup States. FNS will select and recruit diverse States for the study that (1) provide case management primarily through a network of SNAP E&T providers; (2) recently implemented new case management practices in their SNAP E&T programs; or (3) recently implemented innovative approaches to case management, assessment, or participant reimbursements.

The site visits include semi-structured interviews with State and local SNAP office directors and staff and SNAP provider staff, and observations of SNAP E&T providers (staff of for-profit and not-for-profit businesses and local SNAP offices) and SNAP E&T participants (individuals) during case management activities. Among the States that agree to participate in the case studies, we anticipate a 100 percent response rate among the staff identified for interviews and observations and a 96.8 percent response rate among E&T participants identified for case management observations. For site visits, FNS will identify the SNAP director, who will then identify other State staff to work with the study team and FNS to identify local SNAP offices and E&T providers to include in the site visits based on the following criteria: (1) the number of E&T participants served; (2) the length of time the agency has implemented case management; (3) whether the agency has implemented or piloted innovative case management strategies, assessment tools or approaches, or participant reimbursements; (4) the SNAP E&T components provided by the agency; and (5) the location of the agencies in comparison to the State office to facilitate efficient travel. The SNAP director will then identify a main point of contact in each of the local offices and providers who will help identify respondents at each location. Local office or E&T provider supervisors and case managers will identify staff and SNAP participants for observations to represent a variety of activities (e.g., initial assessment, service planning, progress monitoring) and multiple case managers at each service provider. Immediately prior to each case study observation, study team observers will ask SNAP participants for verbal consent to observe and record the session and will read the public burden statement before the observation takes place (see Appendices D and E for the script).

Respondents. The total number of estimated respondents is 564. Members of the public affected by the data collection include individuals, State and local governments, and for-profit and not-for-profit agencies administering SNAP E&T programs. Table B.1 shows the respondent universe, sample size, and expected response rate for each respondent type.

Table B.1. Summary of Respondents and Nonrespondents by Respondent Type

                                          Total to Be   Expected Number     Expected Number      Expected
Respondent Type/Data Collection             Contacted    of Respondents   of Nonrespondents   Response Rate

Survey
  State and Local Government
    State/territory SNAP director                  53                53                   0          100%
    State/territory SNAP E&T director              53                53                   0          100%
    State/territory SNAP policy staff              53                53                   0          100%
    State/territory SNAP financial staff           53                53                   0          100%

Site Visits (a)
  State and Local Government
    State/territory SNAP director                   8                 4                   4           50%
    State/territory SNAP E&T director               4                 4                   0          100%
    State/territory SNAP policy staff               8                 8                   0          100%
    State/territory SNAP financial staff            4                 4                   0          100%
    Local SNAP office director                     10                10                   0          100%
    Local SNAP office supervisor                   10                10                   0          100%
    Local SNAP office frontline staff              30                30                   0          100%
  Business or Other for Profit
    SNAP E&T provider directors                     5                 5                   0          100%
    SNAP E&T provider supervisors                   5                 5                   0          100%
    SNAP E&T provider frontline staff              15                15                   0          100%
  Nonprofit
    SNAP E&T provider directors                     5                 5                   0          100%
    SNAP E&T provider supervisors                   5                 5                   0          100%
    SNAP E&T provider frontline staff              15                15                   0          100%
  Individuals
    SNAP E&T program participants                 248               240                   8         96.8%

Total                                             564               556                  12         98.6%

(a) The State/territory site visit respondents are a subset of the people who will respond to the survey. The same Business or Other for Profit and Nonprofit frontline staff will participate in both the interviews and observations.
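
For reference, each expected response rate in Table B.1 is the expected number of respondents divided by the total to be contacted. The short Python sketch below is illustrative only (it is not part of the data collection instruments) and reproduces the participant and overall rates shown in the table.

```python
# Illustrative only: recompute the expected response rates in Table B.1
# as expected respondents divided by the total to be contacted.
def response_rate(expected_respondents: int, total_contacted: int) -> float:
    """Expected response rate, expressed as a percentage."""
    return 100.0 * expected_respondents / total_contacted

# SNAP E&T program participants observed during site visits
print(round(response_rate(240, 248), 1))  # 96.8
# All respondent types combined (Table B.1 total row)
print(round(response_rate(556, 564), 1))  # 98.6
```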

B.2. Procedures for the collection of information. Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

1. Data collection

Survey. The study team will collect data from the 53 States and territories through a survey (Appendix B). The study team will also request that States submit one assessment tool as part of the survey, as described in the survey instrument and emails. The survey will be a complete census of States and territories implementing SNAP E&T programs; there will be no sampling. FNS will send an email to each SNAP agency introducing the study, the survey, and the vendor's role (Appendix R). Trained study team recruiters will then send a personalized introductory email for the survey to each State SNAP director from a study-specific email account (Appendix S). The study team will program the survey instrument as a web survey and deploy it for a 16-week field period. State- and territory-level SNAP staff will log in to a secure online portal to access the full survey or individual sections (Appendix F). The study team will also offer SNAP agency staff the option to complete the full survey or individual sections over the phone by calling an 800 number, or on hard copy as a fillable PDF returned to the study team. The study team will send periodic survey reminder emails and make reminder calls to SNAP directors (Appendices L, M, P, and Q) and designated staff (Appendices K, N, O, and Q).

Site visits. The site visits will consist of in-person visits to the four States to conduct semi-structured individual and group discussions with key program staff and observations of case management activities. The study team anticipates that each interview and observation will last about 60 minutes. FNS will email the four selected States (Appendix T) to invite them to participate in the study and will contact the backup States only if one of the initial four declines to participate. About one week after FNS' email, the study team will send the selected States' SNAP directors FNS-approved personalized recruitment letters (Appendix U) and a study description (Appendix V) via email and follow up with phone calls to secure their participation in the study. Within two weeks of sending the recruitment email (Appendix U), the study team will schedule orientation phone calls with selected States, to the extent possible. During the call, the study team will introduce the study, review the objectives and the site visits, and answer questions. Once a State agrees to participate, the study team will reach out again to confirm its plans for the visits and identify possible dates. In consultation with the State SNAP and E&T directors, the study team will select SNAP E&T providers and/or local SNAP offices to visit and identify specific respondents to interview. The study team will develop a tentative agenda of the locations, respondents, and dates for the visit, including an appointment schedule for interviews and observations.

The study team will collect data during the site visits using (1) a semi-structured interview discussion guide (Appendix C), (2) an observation guide to use during one-on-one intake and case management meetings (Appendix D), and (3) an observation guide to use during group activities, including orientations and formal assessments (Appendix E). The visits will take place over three days and will include a three-person team: a site lead, an analyst, and an observation lead. The site lead and the analyst will interview staff and observe group activities. The observation lead will separately observe and audio-record one-on-one case management appointments at local SNAP offices or E&T providers. These one-on-one observations and recordings will provide more in-depth information about how one-on-one case management interactions between staff and participants occur, including the content and intensity of services provided. In addition, as part of and in advance of the site visit, the study team will request any existing program documents (Appendix W) and aggregate data (Appendix X) on receipt of case management, assessments, and reimbursements over each of the previous two years (FY 2019 and 2020).

During the site visits, the site lead and the analyst will interview State SNAP staff, local SNAP office staff, and/or SNAP E&T provider staff, depending on the State model. For example, in some States, local SNAP office staff provide SNAP E&T services, so the team would not interview SNAP E&T provider staff. The site lead will primarily lead the discussions, and the analyst will take detailed notes. The study team will not audio-record the interviews. If key staff become unavailable during the visits, the study team will conduct the interviews after the visits by phone.

The observation lead will join the visit on the second and third days to observe and record case management sessions (conducted in person, over the phone, or via live web) at two of the selected E&T providers or local SNAP offices. The observation lead will coordinate with the SNAP E&T administrator at each location before the visit to ensure that enough case management appointments are scheduled to observe, with backup appointments available if necessary. If possible, the observation lead will aim to observe three case managers at each provider, and two of the case managers twice with different participants, so the study team can observe the range of techniques used. The site lead and the analyst will also arrange to observe and record case management sessions at the visited E&T entities as backup. The team will complete about 4 to 5 observations each day, for a total of about 8 to 10 observations per State, and about 32 to 40 observations across all case study States.
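
The observation counts above follow from simple arithmetic: 4 to 5 observations on each of the two observation days yield about 8 to 10 per State, and four case study States yield about 32 to 40 in total. A minimal sketch of that calculation, assuming exactly two observation days per visit, follows.

```python
# Illustrative arithmetic for the planned observation counts. Assumes two
# observation days per State visit (the observation lead joins on days 2 and 3)
# and four case study States, as described above.
OBS_PER_DAY = (4, 5)       # low and high estimates of observations per day
OBS_DAYS_PER_STATE = 2
CASE_STUDY_STATES = 4

per_state = tuple(n * OBS_DAYS_PER_STATE for n in OBS_PER_DAY)    # (8, 10)
across_states = tuple(n * CASE_STUDY_STATES for n in per_state)   # (32, 40)
print(per_state, across_states)
```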

2. Statistical methodology, estimation, and degree of accuracy

The survey is a census of all SNAP E&T programs; there will be no sampling or estimation. The site visits are intended to gather in-depth information on SNAP E&T case management in selected States and to identify best practices and lessons learned, not to produce generalizable or representative information about typical SNAP E&T program activities. After OMB approval, survey and site visit staff will participate in a training on the purpose of the study and the data collection, management, and analysis procedures. FNS staff will be invited to attend the training. The study team will have immediate access to survey responses and frequency data for each survey question. The study team will monitor survey data quality and completeness, including checking for inconsistent or contradictory responses across respondents from the same State and cross-checking States' survey data on a rolling basis against their SNAP E&T State Plans. In addition, each State and territory will be given the opportunity to review its survey and case study findings for accuracy during the reporting phase.
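
As a purely hypothetical illustration of the kind of cross-respondent consistency check described above, a sketch like the one below could flag States in which two respondents give conflicting answers to the same survey item. The field names, item identifier, and example records are assumptions for illustration and do not reflect the actual survey instrument or data system.

```python
# Hypothetical sketch only: flag States where respondents give conflicting
# answers to the same survey item. Field names and data layout are assumed.
from collections import defaultdict

responses = [
    # (state, item_id, answer) -- hypothetical example records
    ("State A", "B4_case_mgmt_model", "provider network"),
    ("State A", "B4_case_mgmt_model", "local SNAP office"),
    ("State B", "B4_case_mgmt_model", "local SNAP office"),
]

answers_by_state_item = defaultdict(set)
for state, item, answer in responses:
    answers_by_state_item[(state, item)].add(answer)

# A State with more than one distinct answer to the same item is flagged
# for follow-up with the respondents.
conflicts = [key for key, answers in answers_by_state_item.items() if len(answers) > 1]
print(conflicts)  # [('State A', 'B4_case_mgmt_model')]
```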

3. Unusual problems requiring specialized sampling procedures

The study has no unusual problems requiring specialized sampling procedures.

4. Periodic data collection cycles to reduce burden

The study has only one cycle of data collection.

B.3. Methods to maximize the response rates and to deal with nonresponse. Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

To encourage respondents to complete the survey and achieve a 100 percent response rate, the study team will highlight the flexibility for respondents to complete the survey in the mode that is most convenient for them. The study team will also emphasize the relatively low overall response burden for the survey (about 45 minutes, with each of the three main survey sections estimated to take about 13 minutes) and that the survey sections can be completed in different sittings. In addition, the study team will employ a reminder strategy that increases the frequency of contact with respondents via both email and phone as the field period progresses. The study team will send reminder emails to SNAP directors and assigned survey respondents who have not completed the survey starting in the second week of the fielding period (Appendices K through P). The study team will send reminder emails biweekly through the twelfth week of the fielding period, and weekly after that through the sixteenth week, until the survey is complete. The study team will supplement the email outreach with biweekly and then weekly telephone calls (Appendix Q), following the same schedule, to SNAP directors and assigned survey respondents starting in the eighth week of the fielding period. The team will also ask FNS to contact the SNAP directors in States and territories that have not responded to the survey by the twelfth week to encourage them to complete it. Using the customizable reporting features of the Confirmit software, the study team will generate daily, weekly, and real-time reports to track the completion status of each survey section in each State and will closely monitor response rates. The study team will fine-tune the outreach and reminder approach, using customized emails and phone calls, if the reporting indicates a potential problem in meeting targets.
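
To illustrate the reminder cadence described above, the sketch below lists the weeks in which reminder emails and calls would be sent under one reading of that schedule (biweekly contacts through the twelfth week and weekly contacts from the thirteenth through the sixteenth week, with emails beginning in week 2 and calls in week 8). The function and week numbering are illustrative assumptions, not part of the approved contact plan.

```python
# Illustrative sketch of the reminder cadence described above (one reading of
# the schedule: biweekly contacts through week 12, then weekly through week 16).
def reminder_weeks(start_week: int, field_period_weeks: int = 16) -> list[int]:
    """Weeks in which a reminder is sent, beginning at start_week."""
    weeks = list(range(start_week, 13, 2))            # biweekly through week 12
    weeks += list(range(13, field_period_weeks + 1))  # weekly, weeks 13-16
    return weeks

print(reminder_weeks(2))  # reminder emails: [2, 4, 6, 8, 10, 12, 13, 14, 15, 16]
print(reminder_weeks(8))  # reminder calls:  [8, 10, 12, 13, 14, 15, 16]
```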

The site visits are not intended to produce generalizable or representative information about typical SNAP E&T program activities. The information collected during the site visits will be used to develop detailed case studies that will provide insights and perspectives on the design and operation of SNAP E&T case management, provide context for the survey findings, and identify best practices and lessons learned that FNS, States, and E&T program providers can use to develop robust programs. To maximize response rates, we will contact States that are likely to participate because they have more robust or innovative case management program components. These States may be more interested in having their programs highlighted in the study and supporting FNS' efforts to better understand E&T case management approaches. We will also rely on relationships that the vendors and FNS staff have with States. This approach was useful during the pretest period and is expected to be useful in the full study.

B.4. Test of procedures or methods to be undertaken. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

Survey pretest. The study team pretested the survey instrument with three State SNAP office directors in February and March 2020 to assess respondents' comprehension, their response burden, and the effectiveness of the delivery method, and to identify whether specific items, the question structure, or the question order could negatively affect data quality. For each pretest, the study team emailed the SNAP directors a hard copy of the survey instrument and an explanation of the pretest purpose. The study team asked pretest respondents to complete the survey independently and time each section; note anything that was unclear, confusing, or difficult to answer; and then debrief the study team by telephone. The study team held hour-long debriefing calls with each respondent, consisting of a set of open-ended questions and probes to gauge the respondents' understanding and ease in answering the questions. The study team asked about the modular structure of the survey, whether the division of content across topical sections was logical, and the ease with which the respondents could answer all questions within their assigned section. As a result of the pretest, we adjusted our burden estimates for the survey to reflect that respondents required more time than estimated (45 minutes vs. 40 minutes). We also made several revisions to the survey to help clarify its purpose and to ensure questions and response options were easy to understand. Appendix G includes a list of the revisions.

Site visit discussion guide pretest. The study team pretested the discussion guide with three respondents from one State in February 2020 to ensure the respondents understood the phrasing and content of the questions and to determine the need to add or remove questions. The study team conducted three 75-minute pretest discussions over the phone: one with a State-level SNAP director, one with a local SNAP E&T office director, and one with a local SNAP office E&T provider. One study team member led each discussion using the sections of the discussion guide appropriate to each respondent. At the end of each discussion, the study team debriefed the respondent with a set of open-ended questions to gather feedback. As a result of the pretest, we added questions to the guide to collect more complete information and revised other questions for clarity. Appendix G includes a list of the revisions. The pretest confirmed that initial burden estimates for the site visit discussions were accurate.

B.5. Individuals consulted on statistical aspects and individuals collecting and/or analyzing data. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

The following individuals from USDA were involved in the design for this project:

  • Kristen Corey, Social Science Research Analyst, USDA, FNS, 703-305-2517

  • Danielle Deemer, Social Science Research Analyst, USDA, FNS, 703-305-2952

  • Leigh Gantner, Analyst, USDA, FNS, 703-305-2822

  • Fatou Thiam, Mathematical Statistician, USDA, National Agricultural Statistics Service

The following individuals from the vendors will be involved in the design, data collection, and analysis for this project:

  • Kristen Joyce, Researcher, Mathematica, 617-715-6963

  • Gretchen Rowe, Senior Researcher, Mathematica, 202-484-4221

  • Alexandra Stanczyk, Researcher, Mathematica, 202-838-3632

  • Jeanette Holdbrook, Research Analyst, Mathematica, 609-275-2296

  • Miranda Kharsa, Research Analyst, Mathematica, 312-585-3328

  • Natalie Larkin, Programmer, Mathematica, 510-830-3722

  • Johnny Willing, Research Associate, Mathematica, 609-297-4569

  • Mary Kalb, Research Associate, Mathematica, 312-585-3314

  • Madeleine Levin, Senior Associate, Social Policy Research Associates, 510-768-8277

  • Anne Paprocki, Senior Associate, Social Policy Research Associates, 510-768-8499

  • Ivette Gutierrez, Associate, Social Policy Research Associates, 510-788-2487

  • Maureen Sarver, Associate, Social Policy Research Associates, 510-788-2480

  • Mahika Rangnekar, Associate, Social Policy Research Associates, 510-788-2467


