Study of the Effectiveness of Efforts to Improve Supplemental Nutrition Assistance Program Access Among Medicare's Extra Help Population Pilot Projects

OMB: 0584-0572



Supporting Statement Part B for the Study of the Effectiveness of Efforts to Improve Supplemental Nutrition Assistance Program Access Among Medicare’s Extra Help Population Pilot Projects

September 17, 2012

FNS Project Officer: Bob Dalrymple

Contract Number:

AG-3198-D-10-0079/GS-10F-005 L

Mathematica Reference Number:

06843

Submitted to:

Food & Nutrition Service

3101 Park Center Drive

Alexandria, VA 22302

Project Officer: Bob Dalrymple

Submitted by:

Mathematica Policy Research

505 14th Street, Suite 800

Oakland, CA 94612-1475

Telephone: (510) 830-3700

Facsimile: (510) 830-3701

Project Director: Laura Castner

Supporting Statement Part B for the Study of the Effectiveness of Efforts to Improve Supplemental Nutrition Assistance Program Access Among Medicare’s Extra Help Population Pilot Projects

September 17, 2012

Laura Castner
Daniel Friend





CONTENTS



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collecting the Information

B3. Methods to Maximize Response Rates and Deal with Nonresponse

B4. Tests of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting or Analyzing Data




TABLES

B1.1 Sample Size and Response Rate Estimates for Client Interviews, by State and Site

B2.1 Minimum Detectable Differences for Findings from Client Survey, by State and SNAP Participation

B5.1 Individuals Responsible for Statistical Aspects and Data Collection and Analysis





















Appendices

A.) SNAP CATI Client Survey

B.) SNAP CATI Client Survey - Spanish

C.) Focus Group Protocol

D.) Focus Group Protocol - Spanish

E.) Public Comments

F.) Focus Group Recruitment

G.) Pretest Summary

H.) Advance Letter

I.) Locating Letter

J.) Thank You Letter

K.) Tribal Letter to New Mexico

L.) Tribal Letter to Washington

M.) Advance Letter- Spanish

N.) Locating Letter - Spanish

O.) Thank You Letter - Spanish

P.) IRB Application

Q.) IRB Approval

R.) National Agricultural Statistics Service (NASS) Review

S.) Focus Group Recruitment - Spanish





B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

The Food and Nutrition Service (FNS) of the U.S. Department of Agriculture (USDA) is funding three pilot projects to address some of the challenges of improving SNAP access among the elderly, specifically among beneficiaries of Medicare’s Extra Help program. The pilots take three approaches: (1) targeted outreach in Washington, (2) simplified eligibility criteria in Pennsylvania, and (3) standardized SNAP benefits in New Mexico.

Overview of the study design. The overarching goal of the evaluation is to understand how the pilot programs operated; who they served; and the extent to which they generated any measurable effects on participation, cost, and SNAP benefits. As part of the evaluation, FNS will assess the overall pilot experience among SNAP participants and nonparticipants within the target group.

The information collection requested for this project addresses the assessment of the overall pilot experience among SNAP participants and eligible nonparticipants. Feedback will be solicited from participants and nonparticipants through a 20-minute telephone survey and through 60-minute focus groups to better understand the client experience with SNAP in general and with the pilot projects more specifically. In pilot locations, the evaluation will also ask about respondents’ impressions of the pilot initiative.

B1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Client survey. The total estimated survey sample is 6,000 individuals (an additional 138 sample members will be recruited for the focus groups described below). This includes 1,000 individuals in New Mexico, 2,000 in Washington, and 3,000 in Pennsylvania. The total estimated number of survey respondents is 4,803, or 80 percent of the sample in each state (85 percent of the participant sample and 75 percent of the nonparticipant sample). For statistical efficiency, equal numbers of surveys will be conducted in the pilot and comparison sites. Where necessary, sampling could be used to deal with varying site sizes so that the survey is fielded over the same period of time in all states and in both the pilot and comparison counties. This may be necessary in Pennsylvania, based on that state’s estimates of the target population size; sampling may be less relevant to the client surveys in New Mexico and Washington. In any state where sampling is required, it will be done in proportion to the number of Extra Help applicants in the counties.

Table B1.1 shows the target response rates and the expected number of completed interviews for each state.


Table B1.1. Sample Size and Response Rate Estimates for Client Interviews, by State and Site


                                                  Pilot Sites   Comparison Sites     Total

Response Rates (AAPOR Response Rate 1)
  Target response rate–participants                    85%             80%
  Target response rate–nonparticipants                 81%             76%

New Mexico
  Initial sample size                                  500             500           1,000
  SNAP elderly participation rate                      35%             30%
  Number of SNAP participants                          175             150
  Number of SNAP nonparticipants                       325             350
  Target completed interviews–participants             149             120
  Target completed interviews–nonparticipants          263             266
  Total number of completed interviews                 412             386             798
  Target overall response rate                       82.4%           77.2%           79.8%

Pennsylvania
  Initial sample size                                1,500           1,500           3,000
  SNAP elderly participation rate                      39%             34%
  Number of SNAP participants                          585             510
  Number of SNAP nonparticipants                       915             990
  Target completed interviews–participants             497             408
  Target completed interviews–nonparticipants          741             752
  Total number of completed interviews               1,238           1,160           2,399
  Target overall response rate                       82.6%           77.4%           80.0%

Washington
  Initial sample size                                1,000           1,000           2,000
  SNAP elderly participation rate                      48%             43%
  Number of SNAP participants                          480             430
  Number of SNAP nonparticipants                       520             570
  Target completed interviews–participants             408             344
  Target completed interviews–nonparticipants          421             433
  Total number of completed interviews                 829             777           1,606
  Target overall response rate                       82.9%           77.7%           80.3%


Note: SNAP participation rates in the comparison sites are based on state estimates of the elderly participation rate in 2006 (Cunnyngham 2010). It is likely that SNAP participation rates among Extra Help applicants will be lower because very low income seniors are likely to be automatically eligible for Extra Help. It is assumed that the participation rate increases by 5 percentage points in the pilot sites. The response rates are based on the contractor’s experience surveying elderly beneficiaries and are calculated in accordance with the standards set forth by the American Association for Public Opinion Research (AAPOR) in the 2009 Edition of Standard Definitions.
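
The figures in Table B1.1 follow directly from the initial sample sizes, the assumed SNAP elderly participation rates, and the target response rates. The following sketch (Python, for illustration only; all inputs are taken from the table, and small differences from the published totals can arise from where rounding is applied) reproduces the arithmetic:

```python
# Illustrative arithmetic behind Table B1.1; all inputs are taken from the table.
# Target response rates by site type and SNAP participation status.
RESPONSE = {"pilot": {"part": 0.85, "nonpart": 0.81},
            "comparison": {"part": 0.80, "nonpart": 0.76}}

# Initial sample size and assumed SNAP elderly participation rate, by state and site.
STATES = {
    "New Mexico":   {"pilot": (500, 0.35),  "comparison": (500, 0.30)},
    "Pennsylvania": {"pilot": (1500, 0.39), "comparison": (1500, 0.34)},
    "Washington":   {"pilot": (1000, 0.48), "comparison": (1000, 0.43)},
}

for state, sites in STATES.items():
    for site, (n, part_rate) in sites.items():
        participants = n * part_rate
        nonparticipants = n - participants
        completes = (participants * RESPONSE[site]["part"]
                     + nonparticipants * RESPONSE[site]["nonpart"])
        print(f"{state}, {site}: {completes:.0f} completed interviews, "
              f"overall response rate {completes / n:.1%}")
```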

SNAP participants will likely be interviewed at higher rates than nonparticipants because (1) it is likely that contact information for participants will be better than for nonparticipants, making them easier to locate and contact; and (2) participants may be more willing than nonparticipants to share information about themselves. Higher response rates in the pilot sites versus comparison sites are expected because individuals in the pilot sites will have had more recent contact with the state. While it is difficult to predict an exact response rate, experience with surveys of elderly beneficiaries suggests that it is possible to target a response rate of at least 85 percent for participants in the pilot sites and 76 percent for nonparticipants in the comparison sites. Methods to maximize response rates are described in more detail in section B.3.

Focus groups. The total estimated number of sample members for the focus groups is 138. The total number of focus group participants is expected to be 110, or approximately 80 percent of the focus group sample in each state. This includes two groups of 10 people in Pennsylvania, five groups of 10 in Washington, and four groups of 10 in New Mexico. Focus group participants will be eligible nonparticipants in the pilot sites.

B2. Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

The procedures for the client survey and focus group activities are described below.

Survey data collection. Survey data will be collected through a CATI survey on a rolling basis over the course of one year in each state. Working with each pilot site to identify the appropriate month for contact and using the contact information provided by the sites for Extra Help applicants, FNS expects to contact Extra Help applicants approximately three to four months after the Medicaid office receives their information. This will give the pilot staff time to provide services to applicants and give applicants time to apply for SNAP, if they so desire. To manage respondent burden due to potential stamina issues among elderly respondents, the survey questionnaire is limited to 20 minutes.

The first contact with each sample member will be by United States Postal Service (USPS) mail. The advance letter (Appendices H and M) will explain the importance and legitimacy of the survey and notify sample members that their information will be kept private to the extent required by law. It will also explain that participation is voluntary, that they will receive $25 as a token of appreciation for completing the survey, and that the $25 will not affect their SNAP benefit.

Subsequent contact attempts with sample members will be by telephone. Interviewers will be flexible with sample members and make appointments to call back at more convenient times when appropriate. Interviewers will also assess the respondent’s fatigue and offer to continue the interview at a later date if the respondent seems tired or not focused on the interview. The survey questionnaire has been translated into Spanish. For respondents more comfortable in Spanish, a bilingual interviewer will administer the interview in that language.

For their participation, respondents will be offered a $25 incentive. As some sample members may live in rural areas or in remote areas on Indian reservations, they may be difficult to contact by telephone. Sample members that are difficult to locate and for whom no reliable telephone information can be found will be offered an additional $10 incentive. This incentive is intended as a token of our appreciation and to offset the extra effort required of sample members to participate if they do not have easy or convenient access to a telephone.

The materials used to contact respondents, including the advance letter, the nonresponse letter offering the additional incentive, and the letter that accompanies the incentive for respondents who completed the survey, along with their Spanish translations, are located in Appendices H–O and S.

Focus groups. In addition to the survey interview, focus groups with SNAP nonparticipants in the pilot sites will be conducted. The focus groups will capture information about barriers that elderly individuals face in applying for SNAP and build understanding of how the pilot programs could be changed to address remaining barriers. To conduct the focus groups, a list of eligible Extra Help applicants (those who applied more than four months prior) will be obtained from the pilot sites. These lists will be ordered first by ZIP code and then randomly ordered by name within each ZIP code. Two to four weeks before the focus group, calls will begin to people who live in the ZIP codes closest to the focus group location, in the order they appear on the list. This approach ensures that those for whom transportation is less burdensome are called first, but otherwise focus group members are selected randomly. The purpose of the focus group will be described during the recruitment call. About seven days before the focus group date, each person who agreed to participate will be sent a reminder letter along with directions and a map to the focus group location. The day before the focus group date, calls will be made to everyone who agreed to participate to remind them of the meeting date, time, and location. On average, 10 clients per focus group are expected to attend. The scripts used to recruit focus group participants are located in Appendices F and S.
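
The call-ordering step described above can be summarized in a short sketch (Python; the applicant list and the ZIP-code distance lookup are hypothetical stand-ins, not part of any study system):

```python
# Hypothetical sketch of the focus group call order: sort eligible applicants by
# how close their ZIP code is to the focus group location, with names randomly
# ordered within each ZIP code (sorted() is stable, so the shuffle is preserved
# among applicants sharing a ZIP code).
import random

def call_order(applicants, zip_distance):
    """applicants: list of dicts with a "zip" key; zip_distance: ZIP code -> miles."""
    shuffled = applicants[:]
    random.shuffle(shuffled)                    # random order within each ZIP code
    return sorted(shuffled, key=lambda a: zip_distance[a["zip"]])  # nearest ZIPs first
```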

Procedures for contacting sample members living on Indian reservations. As both New Mexico and Washington will have sample members living on Indian reservations, the data collection procedures will include communication with tribal leaders in advance of contacting sample members. A letter will be sent to tribal leaders in the two states (Appendices K and L) to make them aware of the study. In New Mexico, where tribal members often have limited access to telephones and are less likely to speak English, the letter (Appendix K) will ask the tribal leaders for their support in encouraging their local leaders at chapter houses and/or senior centers to work with sample members who need help participating in the study. A letter will be mailed to the local leaders (Appendices K and L) to let them know that the data collection effort is underway. Sample members identified as living on reservations, either through their address or from program information, will be mailed a letter they can send back indicating the best way to contact them, including whether they prefer to be contacted on the phone of a close friend or family member, or at a chapter house or other community center (Appendices I and N). The letter will also ask about days of the week and blocks of time that would be convenient for the survey interview. These contact procedures will be used for both the survey and the focus groups. When selecting focus group locations, consideration will be given to locations that are convenient for sample members to attend.

Degree of accuracy needed for the purpose described in the justification. As described in Part A of this Supporting Statement, the survey data will be analyzed by comparing groups on certain types of survey questions:

  1. All Extra Help applicants in the comparison sites and pilot sites will be compared on their responses to questions about previous SNAP experience and food security.

  2. Extra Help applicants who are SNAP participants in comparison sites will be compared with those in pilot sites on questions about their reasons for applying for SNAP and their application and participation experiences.

  3. Extra Help applicants who are not SNAP participants will be compared across comparison and pilot sites on their knowledge of SNAP and reasons for nonparticipation.

  4. Pilot site Extra Help applicants who are SNAP participants will be compared with pilot site Extra Help applicants who are nonparticipants on their previous SNAP experience, food security, and pilot experience.

  5. Comparison site Extra Help applicants who are SNAP participants will be compared with comparison site Extra Help applicants who are nonparticipants on their previous SNAP experience, food security, and pilot experience.

Analysis of the survey data will include producing simple descriptive statistics on pilot program experiences for Extra Help applicants in the pilot sites, as well as basic cross-tabulations for each of these comparisons. Additionally, the survey data will be analyzed by comparing responses between the pilot and comparison sites. These comparisons will use two-tailed t-tests, and the sample design targets an overall 80 percent response rate. Table B2.1 shows the comparison groups, the expected number of completed interviews, and the minimum detectable differences from the two-tailed t-tests. As described in Part A, the survey data will also be analyzed for specific subgroups on certain types of survey questions. Caution should be taken with sample subgroups that show low response rates, because the accuracy of the descriptive and inferential statistics for those groups may be affected. Possible subgroups include counties or regions with low response, or groups of elderly individuals who have special needs and cannot participate in telephone interviews.

Table B2.1 shows the minimum detectable differences (MDDs) in proportions for each state. The MDD represents the smallest difference that can be expected to be statistically significant when comparing the pilot site with the comparison site. If the “true effect” of the pilot is positive but smaller than the MDD, it will not be detectable. The table shows differences in proportions because many of the survey questions are ratings that can be reduced to binary measures (for example, having a positive view of the SNAP application process or not). The table displays MDDs for two-tailed tests with 80 percent power and a significance level of 5 percent. The table assumes that 50 percent of the comparison population has a given characteristic or outcome (such as having a positive view); either a higher or a lower percentage would lead to a smaller MDD.


Table B2.1. Minimum Detectable Differences for Findings from Client Survey, by State and SNAP Participation


                                           Number of Completed Interviews
                                           Pilot Sites   Comparison Sites       MDD

New Mexico
  Overall comparisons                            412             386           9.9%
  Comparisons of SNAP nonparticipants            263             266          12.2%
  Comparisons of SNAP participants               149             120          17.2%

Pennsylvania
  Overall comparisons                          1,238           1,160           5.7%
  Comparisons of SNAP nonparticipants            741             752           7.2%
  Comparisons of SNAP participants               497             408           9.4%

Washington
  Overall comparisons                            829             777           7.0%
  Comparisons of SNAP nonparticipants            421             433           9.6%
  Comparisons of SNAP participants               408             344          10.2%


Note: The targeted numbers of completed interviews in each state and subgroup are from Table B1.1. MDDs are for two-tailed tests with 80 percent power and a significance level of 5 percent. We assumed that 50 percent of the comparison population has a binary characteristic or outcome (a higher or lower percentage would lead to a smaller MDD).
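
For reference, the MDDs in Table B2.1 are consistent with the standard two-sample difference-in-proportions calculation; a minimal sketch (Python, using approximate critical values) is shown below:

```python
# Minimal sketch of the MDD calculation assumed for Table B2.1: two-tailed test,
# 5 percent significance level, 80 percent power, comparison proportion of 0.50.
from math import sqrt

Z_ALPHA = 1.96   # critical value for a two-tailed test at the 5 percent level
Z_BETA = 0.84    # critical value for 80 percent power
P = 0.50         # assumed proportion in the comparison population

def mdd(n_pilot, n_comparison):
    """Smallest detectable difference in proportions for the given completed interviews."""
    return (Z_ALPHA + Z_BETA) * sqrt(P * (1 - P) * (1 / n_pilot + 1 / n_comparison))

print(f"{mdd(412, 386):.1%}")   # New Mexico overall comparison: about 9.9%
```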

B3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

For the survey data collection that forms part of this evaluation, numerous methods will be used to encourage response. These include training interviewers in effective ways of communicating with elderly respondents and persuading them to participate, which helps avert refusals; extensive locating to minimize noncontacts; and an additional incentive for sample members for whom no valid telephone information can be found.

The advance letter (Appendix H) to sample members will be produced on FNS letterhead to heighten the salience of the survey request and to activate the norm of reciprocity. The survey incentive will also be mentioned in the letter to further heighten salience and reciprocity. The contractor’s professional interviewers have extensive experience interviewing elderly respondents, participants in federal income transfer programs, and individuals with disabilities such as hearing loss or dementia. They will be trained and prepared to accommodate respondents’ needs by providing standardized explanations for common respondent concerns (such as fear of telephone scams), averting refusals, accommodating hard-of-hearing respondents, and responding with neutral but encouraging probes. In addition, interviewers will be as flexible and accommodating of respondents’ needs as possible, such as by offering to conduct the interview in parts or rescheduling interviews for more convenient times.

In addition to using information from returned mail using the USPS address correction service, locating efforts will also include major national databases, such as Accurint; the Social Security death index; professional license databases; military locator database; property/deed transfer records; state, county, and civil court records; and other agency databases.

Table B1.1 presents the expected target response rates for SNAP participants and nonparticipants in each state. These rates are achievable given the contractor’s success in conducting other phone interviews with elderly program beneficiaries and previous experience with these well-tested procedures.

If the survey response rate is less than 80 percent, nonresponse patterns will be analyzed using whatever relevant information is known about both respondents and nonrespondents. The characteristics of respondents, nonrespondents, and the total attempted sample, both unweighted and weighted, will be compared to evaluate the risk of nonresponse bias in the estimates, which cannot be measured directly.
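
A minimal sketch of such a comparison is below (Python; the "responded" flag, the list of frame characteristics, and the optional weight column are illustrative assumptions about what the sample frame would contain, not actual study variables):

```python
# Illustrative respondent/nonrespondent comparison on characteristics known for the
# full attempted sample; column names are assumptions, not actual frame variables.
import pandas as pd

def nonresponse_comparison(frame, characteristics, weight_col=None):
    """Mean characteristics of respondents, nonrespondents, and the total sample."""
    groups = {
        "respondents": frame[frame["responded"]],
        "nonrespondents": frame[~frame["responded"]],
        "total sample": frame,
    }

    def mean(group):
        if weight_col is None:
            return group[characteristics].mean()          # unweighted means
        w = group[weight_col]                             # weighted means
        return group[characteristics].mul(w, axis=0).sum() / w.sum()

    return pd.DataFrame({name: mean(g) for name, g in groups.items()})
```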

The data collection involving focus groups is not based on a probability sample and is not meant to represent anyone other than the respondents; therefore, a response rate does not apply to this activity.

B4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The survey questionnaire was pretested in July 2011. The pretest confirmed that, overall, the information being requested in the survey is reasonable, clearly stated in coherent and unambiguous language, and collected in the least burdensome way possible.

The survey questionnaire was pretested with fewer than 10 respondents to learn about problems respondents might experience in providing the requested information and to make appropriate changes to the questionnaire. Pretest responses and comments to the survey questionnaire were collected by telephone to emulate as closely as possible the way the survey will ultimately be administered. Contractor staff mailed pretest respondents a letter and then followed up by telephone to conduct the interview.

While conducting the interview, contractor staff documented comments and questions from respondents, problems with respondent comprehension, and respondent sensitivity. As a result of the pretest, some minor wording changes were made to the survey instrument. These include changes to improve the flow of the interview, improve respondent comprehension, maximize interviewer ease of use to ensure the questionnaire is administered properly, and mitigate the potential sensitivity of some questions. A summary of the pretest is located in Appendix G.

The pretest was also used to establish the average interview length. Because the pretest interviews ran slightly longer than 20 minutes, some items that were less critical to the evaluation were deleted. The questionnaire is now estimated to average 20 minutes per respondent.

The protocol for the focus group with nonparticipants will not be pretested as it is not feasible to pretest a focus group protocol without conducting the focus group itself.

B5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

A review by the USDA National Agricultural Statistics Service (NASS) was conducted (Appendix R). No other individuals outside the evaluation project were consulted on statistical aspects of the design. FNS has contracted with Mathematica Policy Research to conduct this study. Table B5.1 identifies the individuals at this organization who will be responsible for collecting and analyzing the data. The Project Officer for the contract providing funding for the evaluation, Bob Dalrymple, will be responsible for receiving and approving all contract deliverables. His contact information is included in Table B5.1.


Table B5.1. Individuals Responsible for Statistical Aspects and Data Collection and Analysis

Laura Castner, Senior Researcher (data collection design, analysis)
Mathematica Policy Research, 1100 1st Street, NE, 12th Floor, Washington, DC 20002-4221
(202) 484-3282

Daniel Friend, Survey Researcher (data collection design, management)
Mathematica Policy Research, 1100 1st Street, NE, 12th Floor, Washington, DC 20002-4221
(202) 250-3540

Elizabeth Clary, Research Analyst (data collection management)
Mathematica Policy Research, 1100 1st Street, NE, 12th Floor, Washington, DC 20002-4221
(202) 484-4831

Rhoda Cohen, Senior Survey Researcher (data collection design, management)
Mathematica Policy Research, 600 Alexander Park, Princeton, NJ 08540
(609) 275-2324

Mindy Hu, Survey Specialist (data collection design, management)
Mathematica Policy Research, 505 14th Street, Suite 800, Oakland, CA 94612
(510) 830-3710

Jennifer McNulty, Senior Programmer (data collection programming)
Mathematica Policy Research, 600 Alexander Park, Princeton, NJ 08540
(609) 716-4545

Elizabeth Potamites, Researcher (data collection design, analysis)
Mathematica Policy Research, 111 East Wacker Dr., Suite 920, Chicago, IL 60601
(312) 994-1011

Emily Sama-Miller, Researcher (data collection design, analysis)
Mathematica Policy Research, 1100 1st Street, NE, 12th Floor, Washington, DC 20002-4221
(202) 484-4512

Bob Dalrymple, Senior Analyst for the Family Programs Staff
USDA Food and Nutrition Service, Office of Research and Analysis, 3101 Park Center Dr., Alexandria, VA 22302
(703) 305-2122

Tom Pordugal
USDA, National Agricultural Statistics Service, Statistical Methods Branch, 1400 Independence Ave., SW, Washington, DC 20250
(202) 720-7017


