
Assessment of the Contributions of an Interview to Supplemental Nutrition Assistance Program Eligibility and Benefit Determinations:
OMB Supporting Statement

Part B

May 2, 2013

Project Officer: Rosemarie Downer

Contract Number:

AG-3198-D-10-0072

Mathematica Reference Number:

06831.202

Submitted to:

U.S. Department of Agriculture

3101 Park Center Dr

Alexandria, VA 22302

Project Officer: Rosemarie Downer

Submitted by:

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Scott Cody






CONTENTS

Part B: Collection of Information Employing Statistical Methods

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and to Deal with Nonresponse

B4. Tests of Procedures

B5. Individuals Consulted


APPENDIX C: Letters to SNAP Directors and Community-Based Organizations

APPENDIX D: Client Survey

APPENDIX E: Client Survey Correspondence

APPENDIX F: Focus Groups with Procedural Denials Guide

APPENDIX G: Focus Group Correspondence

APPENDIX H: Time-Use Data Collection Protocol

APPENDIX I: Site Visits and Interview Protocol

APPENDIX J: QC-Like Review Form


TABLES

B.1.1 Sampling and Response Rates Among SNAP Staff and Partners

B.1.2 Sampling and Response Rates Among SNAP Clients

B.1.3 Recruitment Plan for Procedural Denials Focus Groups in Each State

B.5.1 Individuals Consulted on Data Collection or Analysis


Part B: Collection of Information Employing Statistical Methods

B1. Respondent Universe and Sampling Methods

Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

This section describes the respondent universe and sampling methods for site visit interviews with Supplemental Nutrition Assistance Program (SNAP) staff at the State, county, and local levels, as well as interviews with community-based organizations (CBOs). This section also describes the universe for the survey of SNAP clients and for the focus groups with procedurally denied applicants. The samples for the staff and CBO site visit interviews, as well as the focus groups with procedural denials, will be selected using convenience sampling. The samples for the survey of clients will be selected using probability selection methods.

All respondents for the site visit interviews of SNAP staff and CBOs, as well as for the client surveys, will come from the study sites in each of the three selected States that have agreed to participate in the study. To select the States participating in this study, the Food and Nutrition Service (FNS) issued a request for applications (RFA), which detailed the goals of the study and States’ requirements for participating in it. In response to the RFA, which was announced under OMB #0584-0512 (expiration date January 31, 2016), three States submitted applications to participate in the study: North Carolina, Oregon, and Utah. All three States’ applications were deemed acceptable, and all three States were selected to participate in the study.

Site selection procedures. In order to gain meaningful, accurate insights into the potential impact of eliminating client interviews at certification and recertification, it is crucial to conduct interviews and observe program operations during the site visit to demonstration and comparison sites in each participating State. This will produce a rich evidence base from which to draw conclusions about the effect of the waiver on program access, payment accuracy, and administrative costs and procedures.

The selection procedures for study sites will depend on the evaluation model employed in each State. Two States—North Carolina and Oregon—will use the demonstration site model for this study. Each State will identify one or more localities to implement the no-interview model (the demonstration sites). Each State also will identify one or more comparison sites with characteristics similar to the demonstration site. The comparison sites will continue to interview applicants using the State’s typical interview procedures.1 Site visit observations will occur in the demonstration and comparison sites in these two States.

Utah operates a Statewide eligibility system with centralized intake and processing. As a result, a demonstration site approach to testing the no-interview model is not feasible. The centralized system facilitates a random assignment approach, in which applicants will be randomly assigned to demonstration and control groups Statewide. In Utah, the research team will identify study sites that reflect a diverse mix of urban and rural portions of the State.
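To illustrate the random assignment design, the short sketch below shows one common way to assign applicants to the demonstration or control group with equal probability; the hash-based approach, the seed value, and the case ID format are illustrative assumptions, not a description of Utah's eligibility system.

    import hashlib

    def assign_group(case_id, seed="snap-interview-demo"):
        """Assign an applicant to the demonstration (no-interview) group or the
        control (standard interview) group with equal probability. Hashing the
        case ID with a fixed seed keeps the assignment reproducible and auditable."""
        digest = hashlib.sha256(f"{seed}:{case_id}".encode()).hexdigest()
        return "demonstration" if int(digest, 16) % 2 == 0 else "control"

    # Example: assign_group("UT-000123") returns "demonstration" or "control".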

Staff interview respondent identification procedures. The universe for the site visit interviews consists of employees working in the State, county, and local SNAP and CBO offices in the study areas. The study team will use a tiered approach to identifying staff interview respondents, asking a point of contact at each level to help identify staff at that level, as well as a point of contact at the next lower level. First, staff will work with State officials to identify the appropriate county and local offices to visit. The study team also will work with the State to identify with whom to speak at each office and any CBOs that should be interviewed. After determining which offices to visit, staff will contact the directors of those offices and work with each to identify key office staff to interview, including supervisors and frontline eligibility workers. Each director will be sent an introductory letter from his or her State (Appendix C).

Because the States have already agreed to participate, a 100 percent response rate for the site visit interviews in each of the States is expected (Table B.1.1).

Table B.1.1. Sampling and Response Rates Among SNAP Staff and Partners

Respondent Type                       Number of Offices (Universe)    Sampling Method         Respondents Contacted    Respondents Participating
State SNAP Office Staff               3                               Convenience sampling    12                       12
District/County SNAP Office Staff     Universe unknown                Convenience sampling    18                       18
Local SNAP Office Staff               Universe unknown                Convenience sampling    60                       60
CBO Staff                             Universe unknown                Convenience sampling    12                       12
Total                                                                                         102                      102
Expected Response Rate                                                                                                 100%


Selection methods for client survey respondents. To provide the clients’ perspectives on the process, survey staff will conduct a short survey (Appendix D) of SNAP clients to ask about their recent application or recertification interview experiences. The research team will select samples from the demonstration and comparison sites (or, in the case of Utah, the demonstration and control groups) from State-supplied lists of newly certified or recertified clients residing in each site. In the demonstration site States of North Carolina and Oregon, the sample will be divided equally between demonstration and comparison sites. Within sites, research staff will use implicit stratification by certification status and ZIP code to ensure proportionate representation of new certifications and recertifications and of different locations within the site. In the randomization State (Utah), the sample will be allocated equally between those assigned to treatment status and those serving as the comparison group from throughout the State; staff will again use implicit stratification by certification status and ZIP code to ensure proportionate representation of new certifications and recertifications and of different locations within the State.
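A minimal sketch of implicit stratification with systematic selection appears below; the client-list field names ("status" and "zip") are hypothetical, and the States' actual file layouts and sampling software are not specified in this document.

    import random

    def implicit_stratified_sample(frame, n):
        """Select n cases with implicit stratification: sort the frame by
        certification status and ZIP code, then take a systematic sample with
        a random start so each stratum is represented roughly proportionately."""
        ordered = sorted(frame, key=lambda case: (case["status"], case["zip"]))
        interval = len(ordered) / n
        start = random.random() * interval
        return [ordered[int(start + i * interval)] for i in range(n)]

    # Example: draw 608 cases from one demonstration site's client list.
    # site_sample = implicit_stratified_sample(site_clients, 608)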

The study team will select a total sample of 3,648 applicants across the three States: 608 for each demonstration and comparison (or control) group in each of the three States. In the demonstration site States, sampled applicants will be drawn from across the participating sites. In the random assignment State, staff will ensure approximately equal samples for the demonstration and control groups. Staff anticipate that 95 percent of those sampled will be eligible for the survey and that 80 percent of these will complete the interview, yielding a total of 2,772 completed interviews (see Table B.1.2 and the sketch that follows it).

Table B.1.2. Sampling and Response Rates Among SNAP Clients

State                                    Number Sampled a    Number Eligible    Completed Interviews
North Carolina (Demonstration Sites)     608                 578                462
North Carolina (Comparison Sites)        608                 578                462
Oregon (Demonstration Sites)             608                 578                462
Oregon (Comparison Sites)                608                 578                462
Utah (Statewide)                         1,216               1,155              924
Total                                    3,648               3,467              2,772

a Depending on the final number of sites included in the demonstration by each State, the sample may be spread across additional sites, but the total sample allotted to each State will be unchanged.
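The expected yields in Table B.1.2 follow directly from the stated assumptions of 95 percent eligibility and 80 percent completion; the sketch below reproduces the table's arithmetic, assuming counts are rounded to the nearest whole case.

    # Expected client survey yield under the stated assumptions:
    # 95 percent of sampled cases eligible, 80 percent of eligibles completing.
    ELIGIBILITY_RATE = 0.95
    COMPLETION_RATE = 0.80

    allocations = {
        "North Carolina (Demonstration Sites)": 608,
        "North Carolina (Comparison Sites)": 608,
        "Oregon (Demonstration Sites)": 608,
        "Oregon (Comparison Sites)": 608,
        "Utah (Statewide)": 1216,
    }

    totals = {"sampled": 0, "eligible": 0, "completed": 0}
    for group, sampled in allocations.items():
        eligible = round(sampled * ELIGIBILITY_RATE)    # e.g., 608 -> 578
        completed = round(eligible * COMPLETION_RATE)   # e.g., 578 -> 462
        totals["sampled"] += sampled
        totals["eligible"] += eligible
        totals["completed"] += completed
        print(f"{group}: {sampled} sampled, {eligible} eligible, {completed} completed")

    print(totals)  # {'sampled': 3648, 'eligible': 3467, 'completed': 2772}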

Selection methods for focus group members. Procedural denials are individuals who submit a SNAP application but are denied benefits because they fail to complete subsequent stages of the application process. An important question under study is whether waiving the SNAP interview results in fewer or more procedural denials. The study team will conduct focus groups with a sample of procedural denials in order to examine whether the reasons for not completing the application process vary by model.

Focus group locations are currently undetermined, pending finalization of States’ plans for the demonstration. However, in all three States, the focus groups will occur in the same locations that are included in the site visit portion of the study. The sampling frame for the recruitment of procedural denials will be a list of all SNAP applicants in each location who submitted an initial application for benefits during the previous three months but were denied benefits because they failed to complete the application process. From each State, staff will collect administrative records for these individuals, including their contact information and demographic characteristics. Staff will request records for approximately 400 cases in each State. The study team will sort each sample into a random order. Interviewers from Mathematica’s Survey Operations Center (SOC) will then call sampled clients, explain the study and its purpose, and ask them to participate in the focus group. The study team will attempt to recruit a mix of Spanish- and English-speaking clients, including elderly, young, working, and unemployed individuals. The mix, although not statistically representative, will provide a variety of perspectives.

Given their qualitative nature and small number of participants, focus groups are not intended to include representative samples of a population, but they do require denied clients who are sufficiently experienced with the issues of interest and who, ideally, have the capacity to offer meaningful insights and suggestions. To increase the likelihood of identifying such clients, staff might exclude from recruitment any cases that appear anomalous and not representative of a broader pattern of procedural denials. For instance, if procedural denials typically occur within a certain time range after application and an individual procedural denial fell well outside that range, that case would likely be excluded from the focus group sample. Likewise, if, in the course of telephone recruiting, a respondent seems to lack the verbal or cognitive skills to make meaningful contributions in a group setting, staff will politely screen out that individual and not invite him or her to join the group.

The study team will continue to recruit procedural denial subjects from the list of 400 per State until 25 individuals have agreed to attend each focus group discussion. It is expected that agreement from 25 subjects will yield only about 10 actual focus group participants, and that staff likely will have to contact 50 procedural denials to obtain agreement from 25 subjects. In each State, assuming a sufficient concentration of Spanish speakers, staff will aim to recruit at least one group composed of Spanish-speaking participants for each interview mode; this will increase the inclusiveness of the overall sample and enable the study team to examine whether language issues contribute to procedural denials under either interview mode.

Staff will conduct a total of 12 focus groups with procedural denials (4 in each State). The study team will conduct all focus groups during the second site visits, approximately 13 months into the demonstration.
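As a rough check on the recruitment arithmetic above, the sketch below traces one State's funnel from records requested to expected focus group participants; the figure of 200 total contacts per State is an implication of the stated assumptions (roughly 50 contacts per 25 agreements, 4 groups per State) rather than a number given in the text.

    # Focus group recruitment funnel for one State, using the planning
    # assumptions stated in the text.
    records_requested = 400       # procedural-denial records requested per State
    groups_per_state = 4
    contacts_per_group = 50       # roughly 50 contacts to obtain 25 agreements
    agreements_per_group = 25
    attendance_rate = 0.40        # 25 agreements expected to yield 10 attendees

    attendees_per_group = round(agreements_per_group * attendance_rate)   # 10
    participants_per_state = attendees_per_group * groups_per_state       # 40
    contacts_per_state = contacts_per_group * groups_per_state            # 200 of 400 records

    print(attendees_per_group, participants_per_state, contacts_per_state)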

Table B.1.3. Recruitment Plan for Procedural Denials Focus Groups in Each State

State            Sample Frame    Sample Selected    Sample Recruited per Group    Focus Group Attendance Rate    Number Attending per Group    Number of Groups    Total Focus Group Participants
North Carolina   Unknown #       400                25                            40%                            10                            4                   40
Oregon           Unknown #       400                25                            40%                            10                            4                   40
Utah             Unknown #       400                25                            40%                            10                            4                   40
Total                            1,200              75                                                           30                            12                  120


B2. Procedures for Collection of Information

Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection

  • Estimation procedure

  • Degree of accuracy needed for the purpose described in the justification

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

This study employs three primary data collection activities: (1) interviews with State and local SNAP and CBO staff, (2) a short survey of SNAP clients, and (3) focus groups with SNAP procedural denials. Methods for sample selection and stratification are discussed in Section B.1 above.

In-person interviews. Two researchers will conduct each semistructured, in-person interview, typically at the respondent’s workplace. A senior member of the study team will lead the discussion using the guide in Appendix I, while the second researcher primarily takes notes. After the interviews, the research team will prepare a site visit summary of individuals’ responses to the questions in the discussion guide. The research team will use those summaries later to analyze the results of the in-person interviews and compare them with those from other data sources.

Client surveys. The methods for selecting the samples for the client survey were described previously. A programmer will check each State-supplied file to make sure that it can be read, that the information the States agreed to provide is present, and that the cases included in the file meet the study’s criteria based on date of certification or recertification. Following confirmation of the quality of the sampling frame file, staff will draw the client survey sample; mail an advance letter (Appendix E), including a small prepaid cash incentive ($2); and start the interviews a few days later, beginning with households certified or recertified two months earlier and moving to those certified most recently. This approach maximizes respondent recall while producing the required sample sizes for analysis. The client survey (Appendix D) will be administered as a stand-alone, computer-assisted telephone interviewing (CATI) survey approximately seven months following implementation of the demonstration. Clients will receive a $10 Visa gift card after completing the survey.
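A minimal sketch of the kind of frame-file check described above appears below; the column names, date format, and study window are hypothetical placeholders, since the actual file layout each State will deliver is not specified in this document.

    import csv
    from datetime import date, datetime

    # Hypothetical required fields; the actual list will reflect what each
    # State agreed to provide.
    REQUIRED_FIELDS = {"case_id", "certification_date", "certification_type", "zip"}

    def check_frame_file(path, window_start, window_end):
        """Read a State-supplied client list, confirm the agreed-upon fields are
        present, and keep only cases certified or recertified in the study window."""
        with open(path, newline="") as f:
            reader = csv.DictReader(f)
            missing = REQUIRED_FIELDS - set(reader.fieldnames or [])
            if missing:
                raise ValueError(f"Frame file is missing agreed-upon fields: {missing}")
            in_window = []
            for row in reader:
                cert_date = datetime.strptime(row["certification_date"], "%Y-%m-%d").date()
                if window_start <= cert_date <= window_end:
                    in_window.append(row)
        return in_window

    # Example: keep cases certified in the two months before sampling begins.
    # frame = check_frame_file("state_client_list.csv", date(2013, 3, 1), date(2013, 4, 30))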

Focus groups. All focus groups will be conducted during the site visit, approximately 13 months into the demonstration. The study team will identify and recruit SNAP participants using administrative case record data submitted by each study State. From these data, staff will select a total of 400 cases in each State, within the ZIP codes nearest the focus group site. The interviewers will proceed through each list until they have recruited the target number of clients. Clients will be offered $30 as a token of our appreciation. The study team will inform all the invited SNAP clients that these incentives will not affect the value of their SNAP benefits.

FNS will provide a light meal and refreshments to focus group participants. Those who accept the invitation for the focus group will receive a letter with information about the study (Appendix G) and detailed information about the time and location of the focus group. SOC interviewers will also make reminder calls (Appendix G) to participants a few days before the scheduled focus group to maximize attendance.

All participants will be asked to sign a consent form (Appendix G) and will be assured of the privacy of their contributions to the groups. An experienced moderator will lead the focus groups.

The focus group moderator will follow the guide in Appendix F; a Spanish translation of the guide is also included in Appendix F. With the approval of all respondents, the discussion will be tape-recorded and later transcribed. The study team will use transcripts and notes to analyze the results and compare them with those from other data sources.

Extant data collection. In addition to the staff and client interviews and focus groups, Mathematica also will collect monthly administrative data to examine program costs and trends. The study team will work with participating sites to collect monthly administrative cost data tied to the operations of the demonstration and each State’s typical procedures from existing financial statements, fiscal reports, audit reports, and similar records. A senior Mathematica programmer will work closely with State data managers to articulate the study’s data needs, determine an appropriate data delivery format, and ensure that the project team understands the cost elements included in the variables sent by each State. If requested by a State, staff will provide a memorandum of agreement that outlines the roles of all parties and pledges client confidentiality. This memorandum of agreement was submitted under a separate package (OMB #0584-0512) and expires 1/31/2016.

In addition, Mathematica will collect office-wide performance data from monthly management reports, time use data from caseworkers, and information from quality control (QC) reviews of active SNAP cases from the State’s QC staff. These data will help identify the impact of the waiver on SNAP costs and operations. Burden associated with the collection of extant data is included in estimates for the separate OMB clearance (OMB #0584-0512) for States’ participation in the demonstration program.

B3. Methods to Maximize Response Rates and to Deal with Nonresponse

Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

Because the States agreed to participate during the RFA grant process, FNS anticipates 100 percent participation. The study team does not expect difficulties in securing interviews with staff members at SNAP offices or CBOs. However, ensuring high participation rates for the client survey and for the focus groups is critical. The team will use several techniques employed in previous studies to ensure high participation.

Methods to Maximize Response Rates and Deal with Non-response for the Client Survey.

Telephone locating. Survey staff will use telephone and web locating techniques, such as directory searches, to maximize the likelihood of reaching the desired sample.

Structured opportunities to build rapport. To minimize client survey respondent burden, the study team expects to cover this limited number of questions through a telephone interview that lasts from five to seven minutes (see Appendix D); total burden associated with responding to the client survey is estimated at 10 minutes, including receipt of an advance letter (Appendix E). Telephone interviewers selected for the project will demonstrate a combination of interviewing experience and high-level training focused on encouraging participation among low-income households. Project-specific training will address the study’s purpose and goals, the data collection instrument, and best practices in data collection, while reinforcing concepts for eliminating bias and remaining sensitive to at-risk and special populations. Experienced supervisors will closely monitor all interviewers periodically throughout data collection.

Strategies for encouraging participation without coercion help convince sample members that the study is worthwhile and that their participation will not affect their receipt of benefits. All interviewers will be trained in refusal-aversion techniques and prepared to address common respondent questions, such as “What is this study about? Why should I participate? Is this a voluntary study? How long will the interview take? What will be expected of me? Where did you get my name? Can’t you ask someone else? Will this affect my immigration status, my job, or my SNAP benefits? What will be done with the information I give you? Is this confidential?” (FAQs are included in Appendix E.)

Language accommodations. Mathematica will translate all study mailings, data collection materials, and CATI questionnaires into Spanish and offer to conduct interviews in Spanish to minimize unit nonresponse due to language barriers. Staff will conduct interviews in additional languages as needed, based on our existing multilingual interviewing capacity. The team will endeavor to identify non–English-speaking households before contacting them, using information from the SNAP administrative records, including primary language, language of application, language of certification interview, and other relevant data. For such clients, a bilingual interviewer will initiate contact.

Respondent incentives. Following confirmation of the quality of the sampling frame file, the study team will draw the sample; mail an advance letter, including a small cash incentive ($2); and start the interviews a few days later to ensure receipt of the letter, beginning with households certified or recertified two months earlier and moving to those certified most recently. This approach maximizes respondent recall while producing the required sample sizes for analysis. Clients will receive a $10 Visa gift card after completing the survey.

Nonresponse analysis. The study team will construct analysis weights within a site or randomization group to account for nonresponse. Nonresponse adjustment cells will be formed based on household characteristics, such as the number of people in the household, benefit amount, and race/ethnicity, as these factors may be associated with the propensity to respond and correlated with the key client outcomes being measured. This household characteristic information will be obtained from existing SNAP records, which will be included in the sample frame file. Using this information, staff will conduct a nonresponse analysis for each site; the results of this analysis will inform the definition of the cells. If the response rate among the eligible sample is lower than the 80 percent expected, staff will extend the nonresponse analysis to include an estimate of potential bias and the extent to which the weights correct for it.
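A minimal sketch of the weighting-cell nonresponse adjustment described above appears below, assuming a hypothetical input record layout (one record per sampled case, with a base weight and response status) and a hypothetical cell definition of household size by race/ethnicity; the study's actual cells will be set by the nonresponse analysis.

    from collections import defaultdict

    def nonresponse_adjusted_weights(sample):
        """Weight respondents up to represent the full eligible sample within
        each nonresponse adjustment cell."""
        # Hypothetical cell definition; the study will define cells from its
        # own nonresponse analysis.
        cell = lambda case: (case["household_size"], case["race_ethnicity"])

        sampled_weight = defaultdict(float)
        respondent_weight = defaultdict(float)
        for case in sample:
            sampled_weight[cell(case)] += case["base_weight"]
            if case["responded"]:
                respondent_weight[cell(case)] += case["base_weight"]

        weights = {}
        for case in sample:
            if case["responded"]:
                # Adjustment factor: total sampled weight over responding
                # weight within the case's cell.
                factor = sampled_weight[cell(case)] / respondent_weight[cell(case)]
                weights[case["case_id"]] = case["base_weight"] * factor
        return weights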

Methods to Maximize Response Rates and Deal with Nonresponse for Focus Groups

Structured opportunities to build rapport. Recruiting will take place in the three weeks leading up to the focus group. The study team will send reminder letters as individuals agree to attend the groups. These letters (Appendix G) will reiterate the purpose of the study; clearly state the date, time, location, and directions; and address issues such as privacy. Additionally, interviewers in charge of prescreening and recruiting will be trained to build rapport during the first minutes of the initial telephone contact. Some groups might be scheduled in the early evening or on weekends to make it easier for individuals to attend.

Reminder calls. One to two days before each focus group, reminder calls (Appendix G) will be made to those who agreed to attend.

Burden and location. The focus group discussions will last for approximately 90 minutes. The focus groups will be conducted in comfortable conference rooms that can accommodate the appropriate number of people around a table. Given the negative outcome of participants’ SNAP applications, groups will meet in neutral facilities—that is, locations not associated with SNAP, such as a library—so that the respondents feel comfortable speaking frankly.

Respondent incentives. Respondents will be offered a $30 token of our appreciation. Participants will be reassured that accepting this token will not affect their benefits or eligibility for SNAP or other programs. Light refreshments will also be provided.

Language accommodations. Mathematica’s survey operations division will take into account special considerations of the target population. Because a significant portion of SNAP participants in some States are fluent in Spanish but not in English, the prescreening call will identify sites in which there are large numbers of monolingual Spanish speakers; focus groups will be held in Spanish (Appendix F) when necessary. The discussion group moderator will be bilingual, fluent in both Spanish and English.

B4. Tests of Procedures

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The site visit and focus group guides for this collection will be semistructured and, therefore, will not be tested. To make the interviews and focus groups run more efficiently, the study team will tailor the guides for each State so that they include only questions that are relevant for that State. To test the utility of the client survey, survey staff conducted a small demonstration test with 9 respondents. The test examined the understandability of the survey questions and the survey’s length. Results indicated that the questions were salient and easy to answer, and the length of the final version of the survey was in line with estimates.

B5. Individuals Consulted

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Mathematica staff and the FNS project officer contributed to planning for the survey and other aspects of the collection (Table B.5.1). Comments from the public and from the National Agricultural Statistics Service (NASS) were also considered.




Table B.5.1. Individuals Consulted on Data Collection or Analysis

Mathematica Staff (Contractor)                                   Telephone Number
Scott Cody, Associate Director of Research and Senior Advisor    617-715-6937
Gretchen Rowe, Project Director                                  202-484-4221
John Hall, Senior Statistician                                   609-275-2357
Eric Zeidman, Survey Director                                    609-936-2784

FNS Staff
Rosemarie Downer, FNS Project Officer                            703-305-2129



1 The research team will provide guidance to each State as the States select their demonstration and comparison sites to ensure the characteristics of the sites are equivalent.
