SUPPORTING STATEMENT - PART B for

OMB Control Number 0584-NEW:

Assessment of States’ Use of Computer Matching Protocols in SNAP








Contracting Officer Representative: Danielle Deemer



USDA, Food and Nutrition Service

3101 Park Center Drive

Alexandria, Virginia 22302

August 2017




B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS


Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



The respondent universe is all 50 States, the District of Columbia, and two territories (the U.S. Virgin Islands and Guam). The survey will also collect data at the county and local level in the 10 States that have county-administered SNAP to account for variations in processes and procedures at those levels. Because States use many and varied systems to match data for initial and continuing program eligibility, participation, and integrity checks, we anticipate that any particular State could have multiple county/local respondents who are best positioned to answer system, process, technical, and cost-related questions.

Sampling is not possible for this information collection because the research objectives of this study require that data be collected from every State (for example, inventorying all data matches that States currently use, plan to use, or have discontinued using, and identifying and describing all data systems used for matching by each SNAP State agency). The variation among States in data-matching processes and procedures also precludes sampling, which would not yield reliable data that could be generalized to all States.

Estimated Number of Respondents: In each of the 43 States with State-administered systems, there will be one respondent, a State administrator who will fill out the State module of the survey only.

Ten States have county-administered systems. Based on findings from the pilot test, we estimate that a State representative will provide information about county activities (as well as State activities) in five of these States. The State representative will still complete just one survey, but he or she will complete a county module in addition to the State module to answer questions about county-administered programs as a whole.

In the other five county-administered States, a State representative will fill out the State module, but a county and/or local administrator will be asked to fill out the county module in each county/local SNAP office. We assume a total of 300 county and/or local staff will be asked to complete the survey (5 States × 60 counties). This gives a total of 353 potential survey respondents. We estimate a 100 percent response rate at the State level and a 50 percent response rate at the county/local level, resulting in 203 completes [(48 × 1) + (5 × 1) + (300 × 0.50) = 203]; the 48 State-level respondents comprise the 43 State-administered States plus the 5 county-administered States whose representative completes both modules.
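
To make the arithmetic concrete, the following is a minimal sketch of the expected-completes calculation, using only the figures stated above (the variable names are illustrative):

```python
# Expected survey completes, per the figures in Section B.1.
state_admin = 43            # State-administered States: one State respondent each
county_admin_combined = 5   # county-administered States where the State rep
                            # also completes the county module
county_admin_separate = 5   # county-administered States with separate county respondents
county_local = 300          # county/local respondents (5 States x 60 counties)

potential = state_admin + county_admin_combined + county_admin_separate + county_local
assert potential == 353

state_rate, county_rate = 1.00, 0.50  # planned response-rate assumptions
expected_completes = ((state_admin + county_admin_combined) * state_rate  # 48 x 1
                      + county_admin_separate * state_rate                # 5 x 1
                      + county_local * county_rate)                       # 300 x 0.50
assert expected_completes == 203
```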

To ensure we achieve the desired response rate, we have planned a three-month fielding period. We will prepare survey materials designed to maximize engagement and response by State and county SNAP program offices, and will submit all materials to FNS for approval before release. FNS and the regional offices will be involved in the data collection to help boost response rates. Specifically, we will send weekly status updates to FNS identifying States that have and have not responded to the survey. FNS, along with our research team, will determine the best reminder protocols, such as reminder emails or phone calls; contacting the regional offices or State offices (in the case of county-administered systems); and contacts made by FNS staff or the contractor.

Number of Responses per Respondent: Each respondent will complete the survey once.


B.2 PROCEDURES FOR THE COLLECTION OF INFORMATION


Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The survey link will be emailed to the designated point of contact at each State SNAP agency. This email will include the URL for the website that respondents will visit to complete the survey, and a unique PIN to access the web-based survey instrument (see Attachment I). We will provide a toll-free number for respondents to contact the contractor if they have questions about the survey. All phone calls will be directly routed to staff specifically trained for, and assigned to, this project.
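
The document does not specify how the unique PINs are generated; as a hypothetical sketch, the contractor might assign them along these lines (function and variable names are illustrative):

```python
import secrets
import string

def make_pin(length: int = 8) -> str:
    """Generate one random alphanumeric survey-access PIN."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def assign_pins(points_of_contact: list[str]) -> dict[str, str]:
    """Assign a unique PIN to each State point of contact (keyed by email)."""
    pins: dict[str, str] = {}
    used: set[str] = set()
    for contact in points_of_contact:
        pin = make_pin()
        while pin in used:  # re-draw on the (unlikely) collision
            pin = make_pin()
        used.add(pin)
        pins[contact] = pin
    return pins
```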

No statistical sampling methodology will be employed, no estimation of the number of data sources or systems used will be required, and no specialized sampling procedures will be used. Communication consists of email and postal mail, with a follow-up phone call if a phone number is available.

No unusual problems requiring specialized sampling procedures have been identified.

B.3 METHODS TO MAXIMIZE THE RESPONSE RATES AND TO DEAL WITH NONRESPONSE


Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.



The evaluator will use well-established methods to maximize response rates and data reliability for the survey, including:

  • Enlisting the assistance of the Regional SNAP Directors to encourage all States to complete the survey;

  • Providing an advance copy of the survey so that State administrators can determine who is best qualified to answer each section;

  • Making regular follow-up contact with respondents who have not yet completed the survey;

  • Providing sufficient time to field the survey so that respondents can gather the necessary information; and

  • Allowing the State SNAP Director to hand off the survey to other staff to complete any questions that he or she is unable to answer.

The strategy for maximizing survey response begins with survey development and carries through the entire survey process. The methods employed attempt to mitigate all types of individual nonresponse, from failure to reach the designated respondent to refusal to participate. We anticipate a 100 percent response rate from State SNAP administrators, who have established working relationships with FNS and are accustomed to reporting information about their SNAP programs as part of their job duties. We anticipate a lower response rate from county and local SNAP administrators, who are less likely to have established these relationships and may be unaccustomed to answering questions about SNAP data-matching processes and procedures.

Survey Language and Length: The questionnaire is designed to be easy to complete. The questions use clear and straightforward language, and skip patterns ensure that respondents answer only items that apply to them. The estimated average completion time for the web survey is 0.8611 hours (about 52 minutes) for the State-level survey and 0.8372 hours (about 50 minutes) for the county-level survey.
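
For reference, those burden estimates convert to minutes as shown below. The total is illustrative only: it assumes the 53 State-level respondents and 150 expected county/local completes from Section B.1 and is not a figure stated in this document.

```python
state_hours, county_hours = 0.8611, 0.8372
print(f"State module:  {state_hours * 60:.1f} minutes")   # ~51.7 minutes
print(f"County module: {county_hours * 60:.1f} minutes")  # ~50.2 minutes

# Illustrative total (assumption: 53 State + 150 county/local completes).
total_hours = 53 * state_hours + 150 * county_hours
print(f"Illustrative total burden: {total_hours:.1f} hours")  # ~171.2 hours
```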

Gaining and Maintaining Cooperation: One week after sending email invitations to participate in the survey, and at several other points in the 12-week field period, the study team will send email reminders to encourage response (see Attachment M). If email reminders are not sufficient, the evaluation team will follow up by telephone. The telephone script is provided in Attachment N. Telephone follow-up with non-respondents will begin in week eight of the survey period and continue throughout the remainder of the field period. In these calls, trained study team members who were involved with the study design and other data collection efforts will answer any questions the respondents have about the survey, and will encourage them to complete the survey online or complete it via telephone.
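
A schematic of this contact schedule follows. The week-1 email reminder and the week-8 start of telephone follow-up come from the text; the intermediate reminder weeks are assumptions for illustration, since the document specifies only "several other points" in the 12-week field period.

```python
from datetime import date, timedelta

def contact_schedule(field_start: date) -> list[tuple[date, str]]:
    """Lay out the invitation, email reminders, and telephone follow-up."""
    events = [(field_start, "email invitation"),
              (field_start + timedelta(weeks=1), "email reminder")]
    for week in (3, 5, 7):  # assumed additional reminder points
        events.append((field_start + timedelta(weeks=week), "email reminder"))
    for week in range(8, 12):  # telephone follow-up, week 8 onward
        events.append((field_start + timedelta(weeks=week),
                       "telephone follow-up with nonrespondents"))
    return events

for when, action in contact_schedule(date(2017, 9, 1)):
    print(when.isoformat(), "-", action)
```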

B.4 TEST OF PROCEDURES OR METHODS TO BE UNDERTAKEN


Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



All instruments and protocols used to administer the National Survey of State SNAP Data Matching have been tested to evaluate the clarity of the questions asked, identify possible modifications to question wording or order that could improve the quality of the data, and estimate respondents’ burden.

Pilot Study Participants and Method of Selection: In March 2017, the study team, in coordination with FNS, thoroughly pre-tested the survey instrument using the web interface in a pilot test with State- and county-level SNAP administrators in four States. Using characteristics of State SNAP agencies as the primary selection criteria (e.g., total number of SNAP participants, number of matching systems, frequency of recertification, number of online applications), the study team recommended six States with State-administered SNAP programs and four with county-administered SNAP programs for FNS consideration. FNS selected two States from each group.

How Participants Were Contacted: Before contacting the State administrators, an introductory email was sent to Regional SNAP Directors on behalf of FNS, alerting them that States in their regions had been selected to participate in the pilot (see Attachment K). The email explained the purpose of the survey and indicated that a member of the study team would be contacting the States with additional information. A few days later, the study team sent an email to the four State administrators, inviting them to participate in the pilot test (see Attachment L). The emails indicated that the field period would remain open for two weeks, and included the URLs and passwords that respondents would need to access the web-based survey instrument. The email explained the purpose and nature of the survey, the types of questions that would be asked, the approximate length of the survey, and the privacy protections governing responses to the survey. The instructions indicated that if the point of contact could not answer all of the survey items, he or she should forward the URL to a representative who could complete the rest. The email included several attachments: a summary of the study, the State survey, and the county module for States that performed some data-matching at the county level. The instructions also explained that respondents would be asked to complete a few follow-up questions at the end to gather feedback about their survey experience.

Pilot Test Findings: All States successfully clicked the URL in the email and logged onto the survey site. No respondent reported any technical issues with the survey programming.

In three of the four States selected for the pilot test, State administrators completed the entire survey as well as the feedback questions. One State with a county-administered SNAP program forwarded the URL to two county administrators in different counties, who then successfully completed the survey.

The team conducted an item-response analysis to examine survey items that were partially answered or skipped by both State and county administrators. The results indicate that several items were repeatedly skipped, partially completed, or marked “Don’t Know,” suggesting that these questions were particularly difficult for respondents to answer.
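
As a minimal sketch of such an item-response analysis, assuming a hypothetical response matrix with one row per respondent and one column per item (skipped items stored as missing values; the item names are invented for illustration):

```python
import pandas as pd

# Toy response matrix; None marks a skipped item.
responses = pd.DataFrame({
    "q1_data_sources":    ["12", "20", None, "17"],
    "q2_match_frequency": ["Don't Know", "Monthly", None, "Don't Know"],
})

skipped = responses.isna().mean()              # share of respondents skipping each item
dont_know = responses.eq("Don't Know").mean()  # share answering "Don't Know"

# Items with the highest combined rates flag questions that were
# particularly difficult for respondents to answer.
problem_items = (skipped + dont_know).sort_values(ascending=False)
print(problem_items)
```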

Many of the comments provided by the pre-test respondents referred to the technical and detailed nature of the questions. Because the pre-test window was only two weeks and respondents were asked to provide information on wide-ranging data-matching activities, some may not have had sufficient time to hand off the survey to technical staff who were better positioned to answer the questions. One respondent remarked that, given more time, he would have been able to work with staff to answer the questions more thoroughly and completely than he did for the pre-test.

On average, State and county-level administrators reported that they used 20 data sources to match SNAP applicant and recipient data, ranging from 17 to 24 data sources. Because States reported using more data sources than anticipated, the survey took longer to complete than expected during the pilot study and several respondents provided specific feedback that the survey was too long.

How the Survey Changed as a Result of the Feedback: Based on participant responses, the study team identified several ways to provide greater technical support and increase coordination among all stakeholders to maximize the response rate. For example, one of the main findings from the pre-test was that identifying the correct respondents in the State and county is critical to obtaining the most accurate and complete information. This can be encouraged by providing each respondent with introductory material that highlights: (1) the types of questions in the survey that the State and counties can expect, with the option of a printable version of the survey, if requested; (2) frequently asked questions; and (3) a clear project description highlighting the reason for the study and the importance of obtaining useful, accurate data-matching information. By providing these materials in advance, State agencies will be able to gather information in anticipation of the survey field period, distribute the survey questions to other relevant staff for review, and marshal the staff resources best positioned to answer particular questions in the survey. Further, the three-month field period will ease some of the resource and time constraints that were evident in the compressed pre-test period.

Once the survey is in the field, survey administration staff will monitor survey responses and be proactive in: (1) reaching out to respondents to assist them in completing the survey; (2) triaging any issues the respondents may have; and (3) directing them to either the survey technical team or the survey content experts who can provide further technical assistance. Specifically, survey administration staff will contact the 10 county-administered States early in the field period to discuss how data matching works in each State and to assist the State in completing the survey. Both the advance materials and the field-period technical assistance will reduce respondent burden while helping respondents complete the survey with the most accurate information they can provide.

Additional Efforts to Reduce Burden: Each of the pre-test respondents completed the survey, but the completion times and the quality of the responses to particular questions indicated a need to reevaluate the survey to balance project goals and objectives against overall respondent burden. The most time-consuming aspect of the survey is answering specific questions for each data match performed; on average, the four pre-test States perform approximately 20 different data matches to administer their SNAP programs.

The study team and FNS worked collaboratively to revise the survey to meet the project’s research goals and objectives, while reducing the overall burden for respondents. The study team began by developing specific recommendations to streamline the SNAP survey and presented these recommendations to FNS. Suggestions included reducing the number of items; revising some of the items so that respondents did not have to answer the question for every data source used; rewording or combining some of the items; and asking States that use more than 10 data sources to answer questions for only a subset of data sources, rather than all of them.
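
The subset rule could work along the following lines. This is a hypothetical sketch: the cap of 10 data sources comes from the text, but the random selection and fixed seed are assumptions, since the document does not say how the subset would be chosen.

```python
import random

def sources_to_detail(data_sources: list[str], cap: int = 10, seed: int = 0) -> list[str]:
    """Return the data sources a State answers detailed questions about."""
    if len(data_sources) <= cap:
        return data_sources       # 10 or fewer sources: answer for all of them
    rng = random.Random(seed)     # fixed seed keeps the subset reproducible
    return rng.sample(data_sources, cap)

# Example: a State reporting 20 matches answers detail items for 10 of them.
matches = [f"source_{i}" for i in range(1, 21)]
print(sources_to_detail(matches))
```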

FNS staff approved 13 of the 20 revisions the study team proposed, such as removing several items that did not directly address any of the Research Questions or Objectives; rewording some items to ask respondents to provide a percentage or their best estimate, rather than an exact number; and combining some items to reduce the length of the survey.

B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS & INDIVIDUALS COLLECTING AND/OR ANALYZING DATA

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


To ensure that the methodology and instruments reflect best practices in survey science, several subject matter experts and a number of senior survey methodologists were consulted. The individuals listed in Exhibit 1 were consulted in the development of the data collection instruments and data collection methodologies.



Exhibit 1: Survey development consultants

Organization                   Name              Contact Information
NASS                           Jennifer Rhorer   800-727-9540
Avar Consulting                Z. Joan Wang      301-977-6553, ext. 222; [email protected]
                               Cynthia Prince    301-977-6553, ext. 223; [email protected]
                               Steven Fink       301-977-6553, ext. 219; [email protected]
Mathematica Policy Research    Brandon Kyler     609-716-4381; [email protected]
                               Kevin Conway      609-750-4083; [email protected]
                               Robbi Ruben-Urm   609-275-2348; [email protected]

