
SUPPORTING STATEMENT - PART B for

OMB Control Number 0584-NEW





Identifying Program Components and Practices that Influence

SNAP Application Processing Timeliness Rates


SNAP Timeliness Study





Contracting Officer Representative: Rosemarie Downer


USDA, Food and Nutrition Service

3101 Park Center Drive

Alexandria, Virginia 22302

July 2016




Table of Contents


PART B: Collection of Information Employing Statistical Methods

B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS


Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The Identifying Program Components and Practices that Influence SNAP Application Processing Timeliness Rates study (SNAP Timeliness Study) is a policy-driven analysis of State SNAP office procedures. The study will collect information from two groups of respondents: (1) State SNAP directors (or designees) in all 50 States plus the District of Columbia and (2) managers who administer the program in the local SNAP offices in six States.

All local offices within the selected States are included in the study in order to describe similarities and differences in how processes and procedures are implemented at the local level within each of these States. Examining the administrative procedures used by local SNAP offices in States with acceptable APT status and/or recent improvement in APT rates is central to the study’s purpose of identifying factors that promote high APT rates. For this reason, the subsample of six States includes States with acceptable APT rates and States whose rates improved from FY 2013 to FY 2014.

FNS and the study team used the criteria reflected in the Table B-3 sampling frame to determine the States, listed in Table B-1, in which information will be collected from all local offices:

Table B-1: States Selected in Order to Collect Data from Local SNAP Offices

State | # of Local Offices | Reason for Selection
MS    | 82                 | higher performer
CT    | 12                 | significant improvement
WA    | 65                 | higher performer
KY    | 120                | higher performer
NV    | 15                 | significant improvement
DC    | 7                  | higher performer
Total | 301                |


FNS and the study team have also selected four back-up States to replace any of the above States where the local offices are unable or unwilling to participate. These States are listed in Table B-2.

Table B-2: Back-Up States Selected in Order to Collect Data from Local SNAP Offices

State | # of Local Offices | Reason for Selection
UT    | 35                 | higher performer
MA    | 26                 | higher performer
CO    | 90                 | higher performer
PA    | 101                | borderline performer


Data will be collected from all local offices in each of the six States; because every local office in a given State is included, the data will provide a generalizable assessment of that State.
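For illustration only, the logic behind the “higher performer” and “significant improvement” designations in Tables B-1 and B-2 could be sketched as follows. This is a minimal sketch: the file name and column names are assumptions, and the actual six-State selection reflected FNS’s APT-status criteria and judgment rather than a fixed ranking rule.

    import pandas as pd

    # Hypothetical export of the Table B-3 sampling frame.
    frame = pd.read_csv("table_b3_sampling_frame.csv")

    # The improvement column mixes numbers with "<1" and "-", so coerce to numeric
    # (non-numeric entries become NaN and sort last).
    frame["improvement"] = pd.to_numeric(frame["apt_improved_fy2013_14"], errors="coerce")

    # Candidate "higher performers": highest FY 2014 APT rates.
    top_performers = frame.sort_values("apt_fy2014", ascending=False).head(10)

    # Candidate "significant improvers": largest reported FY 2013-14 gains.
    top_improvers = frame.sort_values("improvement", ascending=False).head(10)

    print(top_performers[["state", "apt_fy2014"]])
    print(top_improvers[["state", "apt_fy2013", "apt_fy2014", "improvement"]])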

The universe represented in this sampling frame is the 50 States plus the District of Columbia, as shown in Table B-3 below.

Table B-3: Sampling Frame

State | SNAP Region | State or County Administered | # SNAP Offices | # of Counties | APT Rates* FY 2013 | APT Rates* FY 2014 | APT Rates Improved FY 2013-14 | Average # of Households 2015
California | WRO | County | 380 | 58 | 86.57 | 86.82 | <1 | 2,094,188
Florida | SERO | State | 97 | 67 | 94.07 | 88.65 | - | 1,744,008
New York | NERO | County | 58 | 62 | 91.89 | 83.36 | - | 1,667,237
Texas | SWRO | State | 311 | 254 | 93.53 | 90.06 | - | 1,557,424
Illinois | MWRO | State | 115 | 102 | 82.13 | 63.36 | - | 1,061,490
Pennsylvania | MARO | State | 101 | 67 | 80.00 | 85.54 | 4 | 916,571
Georgia | SERO | State | 170 | 159 | 77.99 | 64.82 | - | 839,998
Michigan | MWRO | State | 82 | 83 | 89.52 | 85.58 | - | 827,060
Ohio | MWRO | County | 108 | 88 | 80.47 | 79.72 | - | 820,297
North Carolina | SERO | County | 100 | 100 | 75.36 | 72.63 | - | 803,651
Tennessee | SERO | State | 97 | 95 | 78.44 | 84.59 | 6 | 613,903
Washington | WRO | State | 65 | 39 | 93.10 | 93.13 | <1 | 573,142
New Jersey | MARO | County | 32 | 21 | 68.81 | 76.57 | 8 | 453,687
Massachusetts | NERO | State | 26 | 14 | 83.78 | 85.05 | 1 | 449,312
Oregon | WRO | State | 147 | 36 | 90.58 | 91.72 | 1 | 442,829
Arizona | WRO | State | 86 | 15 | 91.21 | 91.92 | <1 | 439,220
Alabama | SERO | State | 70 | 67 | 85.88 | 84.91 | - | 418,117
Wisconsin | MWRO | County | 79 | 72 | 91.85 | 95.87 | 4 | 408,857
Virginia | MARO | County | 120 | 133 | 91.57 | 93.32 | 1.5 | 405,013
Maryland | MARO | State | 43 | 24 | 89.78 | 86.74 | - | 404,588
Missouri | MPRO | State | 124 | 115 | 82.88 | 84.00 | 2 | 398,606
Louisiana | SWRO | State | 63 | 64 | 87.17 | 84.67 | - | 388,784
Indiana | MWRO | State | 92 | 92 | 87.86 | 90.91 | 3 | 381,315
South Carolina | SERO | State | 46 | 46 | 76.76 | 89.40 | 13 | 380,299
Kentucky | SERO | State | 120 | 120 | 98.41 | 90.21 | - | 370,675
Mississippi | SERO | State | 82 | 82 | 95.13 | 94.88 | - | 296,248
Oklahoma | SWRO | State | 90 | 77 | 91.50 | 93.63 | 2 | 270,703
Connecticut | NERO | State | 12 | 8 | 57.36 | 80.21 | 2.3 | 248,180
Minnesota | MWRO | County | 87 | 87 | 91.52 | 89.46 | - | 240,851
Colorado | MPRO | County | 90 | 64 | 94.94 | 91.91 | - | 233,506
Arkansas | SWRO | State | 83 | 75 | 90.57 | 92.42 | 2 | 214,513
Nevada | WRO | State | 15 | 17 | 73.90 | 83.93 | 10 | 208,913
New Mexico | SWRO | State | 34 | 33 | 98.64 | 85.75 | - | 205,219
Iowa | MPRO | State | 99 | 99 | 90.64 | 89.10 | - | 185,317
West Virginia | MARO | State | 54 | 55 | 90.10 | 91.15 | 1 | 182,174
Kansas | MPRO | State | 45 | 105 | 92.36 | 88.24 | - | 122,287
Maine | NERO | State | 16 | 16 | 92.51 | 84.25 | - | 105,507
Rhode Island | NERO | State | 5 | 5 | 91.87 | 91.93 | <1 | 100,955
Hawaii | WRO | State | 45 | 5 | 91.95 | 94.41 | 2 | 95,865
Utah | MPRO | State | 35 | 29 | 93.75 | 89.64 | - | 88,320
Idaho | WRO | State | 27 | 44 | 98.98 | 99.61 | <1 | 84,093
D.C. | MARO | State | 7 | 1 | 97.62 | 94.53 | - | 80,062
Nebraska | MPRO | State | 63 | 93 | 68.03 | 65.80 | - | 77,665
Delaware | MARO | State | 18 | 3 | 85.51 | 73.93 | - | 71,860
Montana | MPRO | State | 44 | 56 | 88.62 | 93.29 | 5 | 56,368
New Hampshire | NERO | State | 12 | 10 | 91.12 | 92.89 | 1.8 | 51,632
Vermont | NERO | State | 0 | 14 | 85.58 | 79.46 | - | 45,050
South Dakota | MPRO | State | 66 | 66 | 96.02 | 91.85 | - | 43,227
Alaska | WRO | State | 17 | 29 | 87.88 | 85.66 | - | 34,136
North Dakota | MPRO | County | 51 | 53 | 97.24 | 97.14 | - | 24,767
Wyoming | MPRO | State | 29 | 23 | 90.18 | 94.70 | 4.5 | 13,897

The trained study team will use survey methods that have been demonstrated to maximize response rates1 to administer the web-based survey, with the goal of obtaining at least an 80 percent response rate overall. The study plan does not include financial incentives, since the respondents are State or local SNAP agency employees who will complete the survey and provide data during work hours. The primary incentives for State and local SNAP offices to participate in the survey are FNS’ sponsorship of the study and the value of the study’s findings to SNAP agencies, especially those that lack the resources to conduct research to identify factors that would improve their APT rates.

The study team has conducted a survey pre-test, inviting nine State SNAP directors to participate. Eight State SNAP directors (or designees) completed the survey pre-test (89% response rate), and seven of the eight completed an additional pre-test feedback form (87% response rate).


B.2 PROCEDURES FOR THE COLLECTION OF INFORMATION


Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The SNAP Timeliness Study will collect program administrative information, procedure manuals, and survey data from all 51 State SNAP agencies (the 50 States and the District of Columbia) and from all local SNAP agencies in six States. To minimize the need to collect information from State and local SNAP agency employees, the study team reviewed a variety of documents prior to developing the survey instrument to identify existing data sources and information about State and local SNAP policies and administrative procedures. These documents include FNS reports, previous studies conducted by contractors, and State websites. The study team found some of the information about administrative procedures needed for the study in the State Options Report (Eleventh and Twelfth Editions), the 2014 State Activity Report, and the SNAP Workload Management Matrix downloaded from the FNS website. Respondents will not be asked to compile and submit any of the data identified in these sources.

The study team will download copies of State policy manuals from State websites when these are available online and will request additional administrative information, policy documents, and procedure manuals from State and local SNAP agencies only if needed to supplement information obtained elsewhere. For qualitative analyses of policy documents and procedure manuals, the study team will use policy-driven qualitative analysis techniques to examine common themes in these resources.
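As a purely illustrative sketch of one way such a thematic review could be supported (the theme keywords, folder name, and file format are hypothetical; the study’s actual qualitative coding will be performed by analysts, not by keyword matching):

    from pathlib import Path
    from collections import Counter

    # Hypothetical theme keywords; the study's coding scheme may differ.
    THEMES = {
        "expedited_service": ["expedited", "seven-day", "7-day"],
        "interview_scheduling": ["interview", "appointment", "scheduling"],
        "verification": ["verification", "documentation", "proof"],
    }

    # Assumes downloaded policy manuals have been converted to plain text
    # and saved in a local "policy_manuals" folder, one file per State.
    counts = {theme: Counter() for theme in THEMES}
    for manual in Path("policy_manuals").glob("*.txt"):
        text = manual.read_text(errors="ignore").lower()
        for theme, keywords in THEMES.items():
            counts[theme][manual.stem] = sum(text.count(kw) for kw in keywords)

    for theme, by_state in counts.items():
        print(theme, dict(by_state))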

The study team will collect additional data via the web-based survey (or optional telephone interview) during a 6-month period. After the data collection, the study team will analyze the data according to the crosswalk between study objectives and research questions, relevant variables, and proposed analytic methods presented in Appendix I.

The study team will conduct quantitative analyses to summarize all variables using statistical packages such as SPSS or Stata. The study team will calculate descriptive statistics to describe the central tendencies and variation in the data across States. For binary variables, the analyses will indicate the percentage of States that have implemented a certain policy, procedure, practice, or initiative. For other categorical variables, the study team will calculate quartiles, medians, or modal values, as appropriate. After analyzing the data using descriptive statistics, the study team will use bivariate analysis methods, and possibly multivariate methods, to examine associations between State policies and procedures and SNAP APT rates, SNAP APT status (acceptable, borderline acceptable, unacceptable), and SNAP APT status over time. For two categorical variables, the study team will use chi-square and other appropriate tests of association (such as Fisher’s exact test) to assess whether there is an association between the implementation of a particular policy or procedure and APT status. To examine the association of the SNAP APT rate with a particular State policy or procedure, the study team will use t-tests to compare the mean APT rate for States that have implemented the policy or procedure with the mean for States that have not. A minimal sketch of these bivariate tests is shown below.
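The following sketch illustrates the chi-square (or Fisher’s exact) test and the t-test described above using Python; the variable names and example values are hypothetical, and the study’s actual analyses may be run in SPSS or Stata.

    import pandas as pd
    from scipy import stats

    # One row per State: a hypothetical binary policy indicator, APT status,
    # and FY 2014 APT rate (values below are illustrative only).
    df = pd.DataFrame({
        "uses_online_application": [1, 1, 0, 0, 1, 0, 1, 0],
        "apt_status": ["acceptable", "acceptable", "unacceptable", "borderline",
                       "acceptable", "unacceptable", "borderline", "acceptable"],
        "apt_rate_fy2014": [93.1, 91.9, 63.4, 80.2, 94.9, 65.8, 85.5, 89.4],
    })

    # Association between a binary policy variable and categorical APT status.
    table = pd.crosstab(df["uses_online_application"], df["apt_status"])
    chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
    # For sparse 2x2 tables, stats.fisher_exact(table_2x2) is preferable.

    # Compare mean FY 2014 APT rate for States with vs. without the policy.
    with_policy = df.loc[df["uses_online_application"] == 1, "apt_rate_fy2014"]
    without_policy = df.loc[df["uses_online_application"] == 0, "apt_rate_fy2014"]
    t_stat, p_ttest = stats.ttest_ind(with_policy, without_policy, equal_var=False)

    print(f"chi-square p = {p_chi2:.3f}; t-test p = {p_ttest:.3f}")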

B.3 METHODS TO MAXIMIZE THE RESPONSE RATES AND TO DEAL WITH NONRESPONSE


Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Data collection at both the State and local levels will occur in several stages: (1) an advance email notification; (2) a second email providing directions for completing the survey, a link to access the online survey, and the option of being interviewed by telephone; and (3) follow-up email reminders to non-responders requesting that they complete the survey (see Appendix E). The study team will develop a semi-structured interview protocol for the study coordinators to use when completing the survey via telephone interview (see Appendix C).

One week after the advance email notification, the study team will send the second email to State and local SNAP directors (or designee) requesting that they complete the survey. This second email will contain directions for logging on to the website, as well as the website URL, unique username, and password; instructions for completing the survey; and a reminder to contact the study coordinator if they need assistance completing the survey. Study coordinators will be available to answer respondents’ questions about survey questions and other data requirements or to help resolve technical problems with the online survey throughout the data collection period.

The study team will monitor survey completion throughout the data collection period using the survey software capabilities. Study coordinators will maintain contact with the State and local SNAP office contact persons to ensure the submission of all information and documents and maintain a record of submitted documents and information. The study team will review the survey response rates and receipt of needed documentation on a weekly basis.

Multiple follow-up email reminders are typically needed to achieve an adequate response rate.2 To promote a high response rate, the study team will send multiple follow-up email reminders to non-respondents. The first reminder will be sent one week after the email request to complete the survey, and an additional reminder will be sent to all non-respondents a week later. Decisions about further reminders will be made after weekly reviews of response rates and feedback from study coordinators about their interactions with State and local SNAP office staff and their perceptions of the potential effectiveness of additional email contacts. Study coordinators will make follow-up phone calls to State and local SNAP office staff to clarify survey responses, as needed.

If at some point during the data collection period it appears that survey completion has slowed to the point that an adequate response rate will not be reached, the study team will consult the COR about requesting a follow-up letter from FNS that encourages participation in the survey (see Appendix E).

To monitor survey completion throughout the data collection, the study team will develop a Survey Tracker spreadsheet for recording survey completion, mode of administration, response rates, dates of reminders sent, follow-up telephone calls, and collection of all data and documentation. The study team will send the COR weekly email updates summarizing response rates and progress. A minimal sketch of how such a tracker could be summarized is shown below.
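This sketch assumes the Survey Tracker is exported as a CSV with one row per invited respondent; the file name, column names, and status codes are hypothetical and may differ from the spreadsheet the study team actually builds.

    import pandas as pd

    # Columns assumed: state, respondent_type ("state" or "local"), status
    # ("complete", "partial", "not started"), and mode ("web" or "phone").
    tracker = pd.read_csv("survey_tracker.csv")

    summary = (
        tracker.assign(is_complete=tracker["status"].eq("complete"))
        .groupby(["state", "respondent_type"])["is_complete"]
        .agg(invited="size", completed="sum")
    )
    summary["response_rate"] = (summary["completed"] / summary["invited"]).round(2)

    print(summary)  # weekly figures shared with the COR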

B.4 TEST OF PROCEDURES OR METHODS TO BE UNDERTAKEN

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The study team convened a Study Advisory Board to provide advice about the data collection activities and to pre-test the survey instrument. Nine State officials were invited, and eight agreed to participate on the Study Advisory Board and in the pre-test. The study team conducted conference calls with these eight State officials, and in some cases several of their staff, between January 26, 2016 and February 23, 2016. During the conference calls, the study team provided State staff with more detailed information about the study and asked questions about the accuracy of FNS State Options Reports; the availability of documents that describe current State policies and options; the best sources of administrative cost, staff, and caseload data and county/local APT rates; and variations in SNAP administrative practices and case assignment models used in their State. Study Advisory Board members also provided advice about selecting the staff in local offices who are most knowledgeable about SNAP administrative procedures.

The study team conducted the pre-test of the survey instrument between February 22 and March 2, 2016. The Study Advisory Board members completed the survey either online or by telephone, and then provided feedback about the survey instrument by completing a feedback form (see Appendix J). The study team sent an email to respondents with instructions for completing the pre-test survey and returning the feedback form. Study Advisory Board members received this email on February 22, followed by emails to provide assistance with technical problems, to remind them to complete the survey, and to respond to requests for additional time to complete the survey and feedback form. Due to problems with the online link and requests from several members for additional time to complete the survey, the deadline was extended. The study team administered the survey via telephone with one Study Advisory Board member to pre-test a computer-administered telephone interview option. The respondents completed both the pre-test telephone survey and all online surveys by March 2, 2016. In total, eight respondents completed the pre-test survey, including seven who completed the online survey (see Appendix D) and one who completed the survey through the telephone interview.

The study team used the feedback provided by the pre-test respondents to modify the survey instrument and to correct technical problems with the survey software. The study team deleted or modified some questions on the survey to reduce the time burden on respondents. The average response time for the survey was 70 minutes for the seven State officials testing the online version, which included time to gather materials needed to support survey responses. The time required to complete the survey was greater for those respondents who had to spend time consulting with other staff or looking through reports or archives to find information.

B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA


Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The following study team members from contractors WRMA, Inc. and IMPAQ International designed the study and/or will collect and analyze information for the Food and Nutrition Service.

The person responsible for receiving and approving the contract deliverables is Contracting Officer Representative, Rosemarie Downer, Ph.D., Social Science Researcher, SNAP Research and Analysis Division, Office of Policy Support, Food and Nutrition Service, [email protected].

1 Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th ed.). Hoboken, NJ: John Wiley & Sons.

2 Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th ed.). Hoboken, NJ: John Wiley & Sons.
