Assessment of States’ Use of Computer Matching Protocols in SNAP

OMB: 0584-0633

SUPPORTING STATEMENT - PART A for

OMB Control Number 0584-[NEW]:

Assessment of States’ Use of Computer Matching Protocols in SNAP









Contracting Officer Representative: Danielle Deemer


USDA, Food and Nutrition Service

3101 Park Center Drive

Alexandria, Virginia 22302

August 2017



Table of Contents



Attachments


  A. Improper Payments Elimination and Recovery Act of 2010

  B. 7 CFR Parts 272 and 273

  C. National Survey of State SNAP Data Matching

    C.1. State-Level Survey Instrument

    C.2. County-Level Survey Instrument

  D. Screen shot of web survey

  E. Introduction letter to regional directors of upcoming study

  F. Project description

  G. Introduction letter to State SNAP directors of upcoming study

  H. Frequently Asked Questions (FAQ)

  I. Letter to State administrative staff to participate in survey

  J. PRETEST – National Survey of State SNAP Data Matching

    J.1. State-Level Survey Instrument

    J.2. County-Level Survey Instrument

  K. PRETEST – Introduction letter to regional directors of upcoming pretest

  L. PRETEST – Letter to State administrative staff to participate in pretest

  M. Follow-up e-mail

  N. Follow-up telephone script

  O. Public comments

    O.1. Public Comment DCF Florida State

    O.2. FNS Response to DCF Florida State

  P. NASS comments

    P.1. NASS comments

    P.2. NASS comments with responses

  Q. Confidentiality Agreements

    Q.1. Contractor Data Confidentiality Agreement

    Q.2. Subcontractor Data Confidentiality Agreement








PART A: JUSTIFICATION


A1. CIRCUMSTANCES THAT MAKE THE COLLECTION OF INFORMATION NECESSARY


Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


This is a new information collection request. The Supplemental Nutrition Assistance Program (SNAP) is the largest food assistance program in the United States, serving as a safety net for families who are experiencing difficulties in obtaining adequate nutrition. In Fiscal Year 2016, SNAP served more than 45 million participants who received an average monthly benefit amount of $125.78. Participation and average benefits have decreased since 2013, when SNAP participation peaked at 47.6 million individual recipients. Despite this decrease, the number of participants remains much higher than pre-recession levels, creating strain on SNAP staff and resources, potentially delaying program entry for eligible recipients, and introducing opportunities for improper payments that threaten program integrity. In addition, since the Personal Responsibility and Work Opportunity Reconciliation Act of 1996 took effect, some States have opted to decentralize the administration of SNAP to the county level. This move has changed the way those States allocate resources, manage and verify participation, and identify and mitigate recipient fraud. More generally, the historically high numbers of participants and the need to verify continuing eligibility demand efficient, effective, and comprehensive data-matching systems and processes in all States to meet the needs of SNAP and its recipients.

Almost all Federal and State programs use computer data matching to determine or verify eligibility for benefits. In order to receive SNAP, households must meet financial and non-financial eligibility criteria and provide information and verification about their household circumstances. Additional non-income eligibility requirements include work requirements.

State Agencies administering SNAP use data matching to verify information submitted at the application and recertification stages of an application process and to monitor changes in benefit recipients’ household circumstances. In fiscal year 2001, States reported using an average of 14 computer-matching systems, including the State Wage Information Collection Agency (SWICA), State Data Exchange, Unemployment Insurance (UI), and Beneficiary Data Exchange.1 Currently, State agencies have access to even more systems and advances in data-matching techniques, so they have greater options to improve program operations and ensure program integrity.

For SNAP, States also use computer data matching to ensure program integrity. The Improper Payments Elimination and Recovery Act of 2010 (IPERA),2 in conjunction with FNS’ program integrity goals, compels State agencies to assess the existence of potential fraud and improper payments at both the application and receipt of benefits levels3 (see Attachment A. Improper Payments Elimination and Recovery Act of 2010, and Attachment B. 7 CFR Parts 272 and 273). State agencies are required to check for disqualified recipients in the Electronic Disqualified Recipient System, validate against a list of incarcerated people using the Social Security Administration’s Prisoner Verification System, verify applicant employment data through the National Directory of New Hires (NDNH), and confirm an individual is not in the Social Security Administration’s Death Master File. Additional program integrity tools and methods vary by State and can vary within States, particularly those that are decentralized and administer SNAP at the county level. Local offices may also conduct matches that differ from those used at the county or State level.

In order for USDA to make informed decisions, it is important to gather current information about how and to what extent SNAP agencies conduct data matching and systematically use that information to improve program integrity. The most recent study examining SNAP computer matching is now 16 years old,4 and it has been seven years since the passage of IPERA. With the Assessment of States’ Use of Computer Matching Protocols in SNAP, FNS will build on prior research on computer matching in SNAP. These findings will help FNS understand the purposes of current data matches, data-matching system characteristics, SNAP data used in matching, data-matching algorithms and requirements, recent and planned changes to data matching, and the cost and resource allocation required to perform matches. The study will produce a comprehensive final report, detailed profiles of each State’s matching activities, and a repository of data-matching information. The repository will serve as an efficient data analytic tool, providing FNS the ability to view the most up-to-date data on State data-matching practices and database characteristics.

A2. PURPOSE AND USE OF THE INFORMATION


Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate how the agency has actually used the information received from the current collection.


This is a one-time, voluntary data collection. This study builds upon prior research sponsored by FNS that examined computer matching.5 The current study will help FNS update the nationwide inventory of State SNAP data-matching and improve SNAP computer-matching efforts across the nation to maximize efficiencies and minimize fraud and waste.

The National Survey of State SNAP Data Matching (Attachments C.1 National Survey of State SNAP Data Matching, State-Level Survey Instrument, and C.2 National Survey of State SNAP Data Matching, County-Level Survey Instrument) data will be collected to address the following research objectives:

  1. Objective 1: Inventory all data matches that State SNAP offices currently use or plan to use, as well as those they have discontinued in the previous three years.

  2. Objective 2: Identify and describe all data systems used for matching by each SNAP State agency. Such systems include automated systems, web-based systems, and/or software that integrates data from multiple sources.

  3. Objective 3: Identify and describe the purposes for which States pursue each data match.

  4. Objective 4: Calculate the annual and per-usage costs incurred in carrying out data matches, in total and, when possible, for each individual match.

To address the study objectives, three types of data will be collected and analyzed: (1) extant documentation on State data-matching procedures gathered by reviewing existing State documents; (2) extant documentation on administrative costs of data matching, also gathered by reviewing existing State documents; and (3) survey data from State SNAP agencies collected via the National Survey of State SNAP Data Matching (Attachments C.1 and C.2).

The review of extant documentation has two goals: (1) to inform the design and development of the National Survey of State SNAP Data Matching (Attachments C.1 and C.2); and (2) to gather relevant State data-matching documentation to generate a holistic view of current and planned State data-matching systems, methods, and techniques. The information gathered will provide insight into State data-matching practices that the survey alone might not fully capture. Extant documentation we aim to collect includes State match procedural guides, detailed cost information, and/or system documentation that will provide additional detail not included in the questionnaire (Attachments C.1 and C.2) or will provide needed context for questionnaire responses about State SNAP data-matching procedures or systems. We will offer States the opportunity to upload any data-matching documentation they have available.

The National Survey of State SNAP Data Matching will be a self-administered web survey (see Attachment D. Screen shot of web survey). Potential respondents will include SNAP administrative staff in all 50 States, the District of Columbia, two territories (U.S. Virgin Islands and Guam), and all county or local SNAP offices in States that are county-administered. We will explore the characteristics and matching processes of the data matching systems used by each State. Information derived from the survey will aid FNS in improving SNAP computer matching efforts across the nation in order to maximize efficiencies and minimize fraud and waste.

To prepare for the launch of the survey, the FNS study team will develop and distribute via email an introductory letter from FNS to the Regional SNAP Directors (see Attachment E. Introduction letter to regional directors of upcoming study). The email will describe the survey and alert the Regional Directors that FNS will be approaching the States in their regions to participate in the survey. The email will include as an attachment a description of the project (see Attachment F. Project description). The following week, the study team will email an introductory letter from FNS to the State SNAP directors that describes the survey, the approximate amount of time required to complete it, and the kind of information that will be collected, and indicates that FNS’ contractor will be in touch with them shortly thereafter to invite them to participate in the survey on behalf of FNS (see Attachment G. Introduction letter to State SNAP directors of upcoming study). The email will include as attachments the project description (Attachment F), a list of frequently asked questions (Attachment H. Frequently Asked Questions (FAQ)), and an advance copy of the survey instrument (Attachments C.1 and C.2). The announcement will be sent to all 53 State SNAP offices. Several days later, the study team will email letter packets to the State SNAP Directors with more detailed information about the survey, the URL for the website that respondents will visit to complete the survey, and a unique PIN for each SNAP office to access the web-based survey instrument (see Attachment I. Letter to State administrative staff to participate in survey).

For respondent access to the survey, the study team proposes to generate a single randomly generated PIN, which will be sent to the representative (point of contact) for each State SNAP agency. If the point of contact cannot answer all of the survey items, he or she will forward the URL to another representative to complete them. Any previously recorded responses will be viewable, but not changeable, by another representative.

PRETESTING CONDUCTED (MARCH 8 – 22, 2017):

The study team pre-tested the National Survey of State SNAP Data Matching in March 2017 (see Attachments J.1 PRETEST – National Survey of State SNAP Data Matching, State-Level Survey Instrument, and J.2 PRETEST – National Survey of State SNAP Data Matching, County-Level Survey Instrument). The pre-test was conducted using commercial, off-the-shelf software. The software included skip patterns, branching, custom preferred layout and design, extensive piping features, and conditional validation. Using FNS-approved selection criteria (i.e., size of SNAP caseload, number of matching systems, frequency of recertification, and availability of online applications and recertification), the study team recommended six States with State-administered SNAP programs and four with county-administered SNAP programs. FNS selected two from each group, for a total of four States, purposively chosen to represent a mix of high-burden and medium-burden States from a variety of regions and a mix of selection-criteria characteristics. State administrators and respective regional directors were contacted by email to ask for their participation (see Attachment K. PRETEST – Introduction letter to regional directors of upcoming pretest, and Attachment L. PRETEST – Letter to State administrative staff to participate in pretest). The emails explained the purpose of the survey, the types of questions that would be asked, the approximate length of the survey, and the privacy protections governing responses to the survey, and invited them to participate in the pilot test. The emails indicated that the field period would remain open for two weeks and included the URLs6 and passwords that respondents would need to access the web-based survey instrument. The instructions indicated that, if the point of contact could not answer all of the survey items, he or she should forward the URL to a representative who could complete the rest.
The email included several attachments: a SNAP project description (Attachment F), the State survey (Attachment J.1), and the County-level survey for States that performed some data-matching at the county level (Attachment J.2). The instructions also explained that respondents would be asked to complete a few follow-up questions at the end to gather feedback about their survey experience.

All States successfully clicked the URL in the email and logged onto the survey site. No respondent indicated any technical issues with the survey programming or “glitches” when completing the survey. At the completion of the pretest, respondents were asked to provide their opinions and feedback on the questionnaire. Based on the questionnaire responses and the respondent feedback, the study team conducted an item-response analysis to examine survey items that were partially answered or skipped by respondents or that respondents indicated they had difficulty answering. Based on this analysis and consultation with FNS SNAP staff, a number of questions were reworded or removed to ensure that the questionnaire is as clear and straightforward as possible. Some questions that did not directly address the study objectives were also removed in order to minimize respondent fatigue and burden and maximize response rate.

A3. USE OF INFORMATION TECHNOLOGY AND BURDEN REDUCTION

Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


We designed the data collection methodology for this study to minimize burden and provide the flexibility required to meet the needs of respondents. We are committed to compliance with the E-Government Act of 2002 to promote the use of technology. All surveys will be administered online, allowing easy access and efficient collection of data. Survey respondents will complete a web-based survey (Attachments C.1 and C.2) that will be programmed using Qualtrics, a commercial off-the-shelf software package that was used during the pretest. A screenshot of the web survey is provided in Attachment D. Because the survey will include program-specific and technical questions, we have designed the web survey so that a respondent can save responses and then ask other appropriate administrators or staff who have relevant knowledge to enter the survey and complete the remaining sections/questions. The web survey will include functions for tracking survey responses, enabling project staff to keep abreast of the status of survey respondents. The database will alert staff of past-due surveys so staff can follow up with non-respondents (see Attachment M. Follow-up e-mail, and Attachment N. Follow-up telephone script).

The questionnaire was designed with skip patterns to ensure that respondents only answer relevant questions. This is particularly critical in reducing burden for State administrators in a county-administered SNAP State who may not be able to answer technical matching questions. In this case, the questionnaire would allow the State respondent to skip out of the technical matching questions, and the respondent would be instructed to submit the questionnaire to the appropriate county respondents. FNS anticipates that 100 percent of responses will be submitted through the web-based system at www.qualtrics.com.


A4. EFFORTS TO IDENTIFY DUPLICATION


Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Question 2.


The data requirements for the evaluation have been carefully reviewed to determine whether the needed information is already available. Because technology changes rapidly and data-matching procedures continually evolve, efforts to identify duplication included a review of FNS reporting requirements, State administrative agency reporting requirements, and special studies by government and private agencies. The information on computer matching procedures to be collected in this study does not exist elsewhere. Previous data collection efforts have collected similar data, but these data are no longer current, they do not account for more recent changes in direct certification regulations, they do not collect data from all States and counties, and they do not provide the level of detail required for this study.7 The recent U.S. Government Accountability Office report, “More Information on Promising Practices Could Enhance States’ Use of Data Matching for Eligibility,”8 provides information on States’ use of computer matching and potential promising practices, but it does not provide specific information on the number and details of the data sources used by each State, the purpose of the data matching, or any cost information associated with the data-matching systems and procedures. Additionally, because FNS does not require States to report this level of information related to their computer matching activities, this data collection does not duplicate State efforts. Though the collection of extant data may provide useful State matching information, the questionnaire is required to capture more detailed information consistently across States.

A5. IMPACTS ON SMALL BUSINESSES OR OTHER SMALL ENTITIES


If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.


No small businesses or entities will be involved in this study.


A6. CONSEQUENCES OF COLLECTING THE INFORMATION LESS FREQUENTLY


Describe the consequence to Federal program or policy activities if the collection is not conducted, or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


This is a one-time voluntary data collection request. If the data are not collected, FNS will not have access to important information on recent advances in data-matching techniques and the effect on improving program operations and ensuring program integrity in SNAP. The Improper Payments Elimination and Recovery Act of 2010 (Attachment A), in conjunction with FNS’ program goals (Attachment B), compels State agencies to perform fraud detection for both eligibility and benefits. Without a detailed inventory of SNAP data-matching sources and procedures, FNS will not be able to identify and describe all data systems used for matching by each SNAP State agency; FNS will not be able to identify and describe the purposes for which States pursue each data match; and FNS will not be able to assist in the improvement of SNAP computer-matching efforts across the nation to maximize efficiencies and minimize fraud and waste. Further, lack of SNAP data-matching information will significantly hinder FNS’ ability to assess the feasibility of improving the data matches used for eligibility verification and address the project’s objectives as outlined in Section A.2.

A7. SPECIAL CIRCUMSTANCES RELATING TO THE GUIDELINES OF 5 CFR 1320.5


Explain any special circumstances that would cause an information collection to be conducted in a manner:

  • Requiring respondents to report information to the agency more often than quarterly;

  • Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

  • Requiring respondents to submit more than an original and two copies of any document;

  • Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;

  • In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

  • Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

  • That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

  • Requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


There are no special circumstances. We will conduct data collection in a manner consistent with the guidelines in 5 CFR 1320.5.


A8. COMMENTS TO THE FEDERAL REGISTER NOTICE AND EFFORTS FOR CONSULTATION


If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years even if the collection of information activity is the same as in prior years. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


A Federal Register Notice (FRN) was published in the Federal Register on May 30, 2017, Volume 82, Number 102, Pages 24659-24664.

One comment was received in response to the Federal Register Notice (see Attachments O.1 Public Comment DCF Florida State and O.2 FNS Response to DCF Florida State).

In addition, Jennifer Rhorer with the National Agricultural Statistics Service’s Summary, Estimation, and Disclosure Methodology Branch reviewed Part A and Part B of this OMB Clearance Package (see Attachments P.1 NASS Comments, and P.2 NASS Comments with Responses).

Three State-level SNAP staff and two county-level SNAP staff also provided feedback on the survey instrument. Three reviewers gave permission to FNS to publish their names and contact details: Tiffany Vasey, Iowa Department of Human Services, [email protected], Windy Rickman, Iowa Department of Human Services, [email protected], and Arlene Dura, North Dakota Department of Human Services, [email protected].

The following individuals reviewed this OMB Clearance Package: Tamieka Muns, USDA Food and Nutrition Service, 3101 Park Center Drive, Alexandria, VA, 22302, [email protected] and Laura Tiehen, USDA Economic Research Service, 1400 Independence Avenue, SW, Washington, DC, 20250, [email protected].


A9. EXPLAIN ANY DECISIONS TO PROVIDE ANY PAYMENT OR GIFT TO RESPONDENTS


Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


There are no payments or gifts to respondents.


A10. ASSURANCES OF CONFIDENTIALITY PROVIDED TO RESPONDENTS


Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


Since the questionnaire will contain no questions of a private nature and will not collect any personally identifiable information, privacy of responses is not a concern for State respondents. Web surveys will be password protected, with data transmitted via a secure tunnel to a database residing behind a monitored firewall.

The Department complies with the Privacy Act of 1974. No private information is associated with this collection of information. We will assure respondents in writing that they will not be personally identified in any publications. Moreover, we will ensure that any published reports with tabular summaries or frequency distributions will not allow the deductive disclosure of any participant in this study. This disclosure statement is contained in the frequently asked questions page that will be transmitted to respondents with the introductory materials (Attachment H). All trained study team members have signed a Data Confidentiality Agreement (see Attachment Q.1 Contractor Data Confidentiality Agreement and Q.2 Subcontractor Data Confidentiality Agreement).

A11. JUSTIFICATION FOR ANY QUESTIONS OF A SENSITIVE NATURE


Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


There are no questions of a sensitive nature included in this information collection request.

A12. ESTIMATES OF THE HOUR BURDEN OF THE COLLECTION OF INFORMATION


Provide estimates of the hour burden of the collection of information. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated.


A. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.


Number of Respondents: 372 (total sample size). Of the 372 sampled, 221 are expected to respond: 18 pretest respondents, 53 State agencies, and 150 county and local staff. The National Survey of State SNAP Data Matching will be a self-administered web survey. Potential respondents will include administrative staff in all 50 States, the District of Columbia, and two territories (U.S. Virgin Islands and Guam). Due to the many and varied systems States use to match data for initial and continuing program eligibility, participation, and integrity checks, we anticipate that any particular State could have multiple county and/or local administrative staff respondents who can best answer system, process, technical, and cost-related questions. Therefore, the survey will also collect data at the county and local levels from SNAP staff in the ten States that have county-administered SNAP to account for variations in processes and procedures at the county level. Although State SNAP Directors will be the individuals contacted to participate in the survey, they will be informed that they may hand off the survey to other knowledgeable individuals to complete any questions that they cannot answer, including administrators in county or local SNAP offices who may be best positioned to answer questions about data matching conducted at the county or local levels.

There will be one respondent in each of the 43 States and territories with State-level systems. Based on findings from the pre-test, we estimate that about half of the 10 States with county-administered SNAP will ask county and/or local administrators to complete the sections of the survey about county/local level processes and procedures. In these 5 States where county or local staff are asked to complete the survey, we estimate that about 60 county or local staff will be asked to respond, for a total of about 300 potential county/local level respondents.9 In total, therefore, we expect there to be about 221 unique respondents to the survey (based on 18 pretest respondents, an expected 100 percent response rate from 53 State administrators, and a 50 percent response rate from 300 county/local administrators). Details are provided in Table A12.A.
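The respondent totals above reduce to simple arithmetic; as a check only (all figures are taken directly from this section, none are new assumptions):

```python
# Check of the respondent-count arithmetic described in this section.
pretest_respondents = 18       # pretest respondents (as reported above)
state_agencies = 53            # 50 States + DC + 2 territories, 100% response expected
county_sample = 300            # ~60 county/local staff in each of 5 States
county_response_rate = 0.50    # expected county/local response rate

expected_county_responses = int(county_sample * county_response_rate)  # 150
expected_responses = pretest_respondents + state_agencies + expected_county_responses
print(expected_responses)      # 221 expected unique respondents

total_annual_responses = 372   # total sample size, as reported in this section
non_responses = total_annual_responses - expected_responses
print(non_responses)           # 151 expected non-responses
```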

Estimated Number of Responses per Respondent: Each respondent will complete the survey once. The survey will be web-based and will be completed by multiple respondents in a secured web portal.

Estimated Total Annual Responses: 372 (221 responses + 151 non-responses)

Estimated Time per Response: 0.8611 hours for the State-level survey and 0.8372 hours for the county-level survey (excludes pretest respondents).

Among the 43 States that have State-administered SNAP, we estimate respondent burden based on three tiers, as follows:

  • Tier 1 includes 4 States with a low respondent burden (i.e., those using 5 or fewer data sources).

  • Tier 2 includes 28 States with a medium respondent burden (i.e., those using 6-14 data sources).

  • Tier 3 includes 11 States with a high respondent burden (i.e., those using 15 or more data sources to match SNAP applicant and recipient data).

In the 5 States where State administrators will likely report on county-level processes and procedures, we estimate respondent burden based on three tiers, as follows:

  • Tier 1 includes 1 State with a low respondent burden (i.e., those in which counties/localities use 5 or fewer data sources).

  • Tier 2 includes 2 States with a medium respondent burden (i.e., those in which counties/localities use 6-14 data sources).

  • Tier 3 includes 2 States with a high respondent burden (i.e., those in which counties/localities use 15 or more data sources to match SNAP applicant and recipient data).

We estimate that about half of the 10 States with county-administered SNAP will ask county and/or local administrators to complete the sections of the survey about county/local level processes and procedures. In each of these five States, we estimate that 60 counties will be asked to respond, for a total of 300 potential county and local respondents. We estimate respondent burden based on three tiers, as follows:

  • Tier 1 includes 1 State with a low respondent burden (i.e., those in which counties/localities use 5 or fewer data sources).

  • Tier 2 includes 2 States with a medium respondent burden (i.e., those in which counties/localities use 6-14 data sources).

  • Tier 3 includes 2 States with a high respondent burden (i.e., those in which counties/localities use 15 or more data sources to match SNAP applicant and recipient data).

We expect a 100 percent response rate from the 53 State administrators asked to complete the survey and a 50 percent response rate from the 300 county- and local-level administrators, for a total of 150 county/local responses. This estimate is based on pretest results and on comments from county/local SNAP staff about the burden required to complete the survey.


Estimated Total Annual Burden on Respondents and Non-Respondents: 196.17 Hours

The pretest burden estimate is based on 6 pretest respondents who were administered the web-based survey (4 State-level administrators and 2 county-level administrators). One director in a State with a county-administered SNAP forwarded the URL to administrators in two different counties to complete the survey. Five respondents completed the survey, including three of the four State SNAP directors and both county administrators. State SNAP administrators took approximately 2.4 hours to complete the survey, and county-level administrators took approximately 1.2 hours, including 0.0334 hours to read the e-mail notification and 0.0167 hours for the e-mail follow-up. Based on the pretest findings, FNS and the study team worked collaboratively to remove several survey items and modify others to reduce respondent burden.

For the main study, the estimated time per response varies by administration type and the number of data sources used (see Table A12.A):

  • State-administered SNAP (43 States): 0.3340 hours for States using 5 or fewer data sources, 0.5845 hours for States using 6 to 14 data sources, and 1.00 hour for States using 15 or more data sources.

  • County-administered States completing both the State survey and the county/local module for their State (an expected 5 States): 0.50, 1.00, and 1.25 hours for low, medium, and high burden, respectively.

  • County-administered States completing the State portion of the survey only: 0.3340, 0.5845, and 1.00 hours, respectively.

  • County/local respondents in county-administered States: 0.25 hours for low burden, 0.50 hours for medium burden, and 0.80 hours for high burden.

The slight difference in the time required for State and county/local staff to complete the survey reflects several additional items on the State survey.

The total estimated annual burden, including the pretest and non-respondents, is 196.1701 hours: 9.9516 hours for the pretest, 35.7995 hours for the 43 States with State-administered SNAP, 9.8390 hours for the 10 States that administer SNAP at the county level, and 140.58 hours for county/local-level respondents in those 10 States. For individuals who decline to participate in the survey, the burden estimate is 17.6217 hours (0.1167 hours per non-respondent); this covers the time to read an e-mail, review a few questions, and decide to exit the survey. We anticipate a 100 percent response rate from State-level staff and a 50 percent response rate from county/local staff.
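The tiered burden arithmetic described above can be sketched as follows. This is an illustrative check only; the tier counts and hours per response are copied from the estimates in the text, and the figures of record remain those in Table A12.A.

```python
# Illustrative sketch of the tiered burden arithmetic; counts and hours per
# response are taken from the estimates above.

# Each tier: (number of responding units, hours per completed survey)
state_tiers = [(4, 0.3340), (28, 0.5845), (11, 1.00)]       # 43 State-administered States
state_and_county_tiers = [(1, 0.50), (2, 1.00), (2, 1.25)]  # 5 States also reporting county detail
state_only_tiers = [(1, 0.3340), (2, 0.5845), (2, 1.00)]    # State portion of the survey only
county_local_tiers = [(30, 0.25), (60, 0.50), (60, 0.80)]   # 150 expected county/local respondents

def burden_hours(tiers):
    """Total survey hours = sum over tiers of (respondents x hours per response)."""
    return sum(n * hours for n, hours in tiers)

print(burden_hours(state_tiers))         # survey hours for the 43 State-administered States
print(burden_hours(county_local_tiers))  # survey hours for county/local respondents
```

Note that these totals cover survey completion only; the subtotals in Table A12.A also include time for e-mail notifications, follow-ups, and phone reminders.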



Table A12.A. Burden Table among Respondents and Non-Respondents by Respondent Description

Affected Public for all rows: State, Local, and Tribal Government. "Resp." columns describe respondents; "Non-Resp." columns describe non-respondents.

| Respondent Description | Type of Survey Instrument | Instrument | Sample Size | Resp.: Est. Number | Resp.: Frequency (Annual) | Resp.: Total Annual Responses | Resp.: Avg. Time per Response (Hrs) | Resp.: Annual Burden (Hrs) | Non-Resp.: Est. Number | Non-Resp.: Frequency | Non-Resp.: Total Annual Responses | Non-Resp.: Avg. Time per Response (Hrs) | Non-Resp.: Annual Burden (Hrs) | Grand Total Burden (Hrs) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| PRETEST ONLY | | | | | | | | | | | | | | |
| Regional Directors | Introductory e-mail | Letter | 3 | 3 | 1 | 3 | 0.0167 | 0.0501 | 0 | 0 | 0 | 0.00 | 0.00 | 0.0501 |
| State administrators | e-mail notification of data collection | Letter | 4 | 4 | 1 | 4 | 0.0334 | 0.1336 | 0 | 0 | 0 | 0.00 | 0.00 | 0.1336 |
| Regional Directors | e-mail notification of data collection | Letter | 3 | 3 | 1 | 3 | 0.0167 | 0.0501 | 0 | 0 | 0 | 0.00 | 0.00 | 0.0501 |
| State administrators | e-mail follow-up | Letter | 2 | 2 | 1 | 2 | 0.0167 | 0.0334 | 0 | 0 | 0 | 0.00 | 0.00 | 0.0334 |
| County/Local SNAP Staff | e-mail follow-up | Letter | 1 | 1 | 1 | 1 | 0.0167 | 0.0167 | 0 | 0 | 0 | 0.00 | 0.00 | 0.0167 |
| State SNAP Staff | Web Survey | Survey | 4 | 3 | 1 | 3 | 2.4170 | 7.2510 | 1 | 1 | 1 | 0.0167 | 0.0167 | 7.2677 |
| County/Local SNAP Staff | Web Survey | Survey | 2 | 2 | 1 | 2 | 1.20 | 2.40 | 0 | 0 | 0 | 0.00 | 0.00 | 2.40 |
| Subtotal, Pretest (including feedback survey) | | | 19 | 18 | 7 | 18 | 0.5519 | 9.9349 | 1 | 1 | 1 | 0.0167 | 0.0167 | 9.9516 |
| STATE-ADMINISTERED (includes the District of Columbia) | | | | | | | | | | | | | | |
| Regional Directors | e-mail | Letter | 7 | 7 | 1 | 7 | 0.0668 | 0.4676 | 0 | 0 | 0 | 0.00 | 0.00 | 0.4676 |
| State administrators | e-mail | Letter | 43 | 43 | 1 | 43 | 0.1336 | 5.7448 | 0 | 0 | 0 | 0.00 | 0.00 | 5.7448 |
| State SNAP Staff (low burden) | Web Survey | Survey | 4 | 4 | 1 | 4 | 0.3340 | 1.3360 | 0 | 0 | 0 | 0.00 | 0.00 | 1.3360 |
| State SNAP Staff (medium burden) | Web Survey | Survey | 28 | 28 | 1 | 28 | 0.5845 | 16.3660 | 0 | 0 | 0 | 0.00 | 0.00 | 16.3660 |
| State SNAP Staff (high burden) | Web Survey | Survey | 11 | 11 | 1 | 11 | 1.00 | 11.00 | 0 | 0 | 0 | 0.00 | 0.00 | 11.00 |
| State administrators | e-mail follow-up | Letter | 43 | 43 | 1 | 43 | 0.0167 | 0.7181 | 0 | 0 | 0 | 0.00 | 0.00 | 0.7181 |
| State administrators | Reminder | Phone | 20 | 10 | 1 | 10 | 0.0167 | 0.1670 | 0 | 0 | 0 | 0.00 | 0.00 | 0.1670 |
| Subtotal, 43 States administering SNAP at the State level | | | 43 | 43 | 1 | 43 | 0.8325 | 35.7995 | 0 | 0 | 0 | 0.00 | 0.00 | 35.7995 |
| STATES ADMINISTERING SNAP AT THE COUNTY LEVEL | | | | | | | | | | | | | | |
| State administrators | e-mail | Letter | 10 | 10 | 1 | 10 | 0.1336 | 0.1336 | 0 | 0 | 0 | 0.00 | 0.00 | 0.1336 |
| State SNAP Staff Completing for State and County (low burden) | Web Survey | Survey | 1 | 1 | 1 | 1 | 0.50 | 0.50 | 0 | 0 | 0 | 0.00 | 0.00 | 0.50 |
| State SNAP Staff Completing for State and County (medium burden) | Web Survey | Survey | 2 | 2 | 1 | 2 | 1.00 | 2.00 | 0 | 0 | 0 | 0.00 | 0.00 | 2.00 |
| State SNAP Staff Completing for State and County (high burden) | Web Survey | Survey | 2 | 2 | 1 | 2 | 1.25 | 2.50 | 0 | 0 | 0 | 0.00 | 0.00 | 2.50 |
| State SNAP Staff Completing for State only (low burden) | Web Survey | Survey | 1 | 1 | 1 | 1 | 0.3340 | 0.3340 | 0 | 0 | 0 | 0.00 | 0.00 | 0.3340 |
| State SNAP Staff Completing for State only (medium burden) | Web Survey | Survey | 2 | 2 | 1 | 2 | 0.5845 | 1.1690 | 0 | 0 | 0 | 0.00 | 0.00 | 1.1690 |
| State SNAP Staff Completing for State only (high burden) | Web Survey | Survey | 2 | 2 | 1 | 2 | 1.00 | 2.00 | 0 | 0 | 0 | 0.00 | 0.00 | 2.00 |
| State administrators | e-mail follow-up | Letter | 10 | 10 | 1 | 10 | 0.0167 | 0.1670 | 0 | 0 | 0 | 0.00 | 0.00 | 0.1670 |
| State administrators | Reminder | Phone | 5 | 5 | 1 | 5 | 0.0167 | 0.0835 | 0 | 0 | 0 | 0.00 | 0.00 | 0.0835 |
| Subtotal, 10 States administering SNAP at the county level | | | 10 | 10 | 1 | 10 | 0.9839 | 9.8390 | 0 | 0 | 0 | 0.00 | 0.00 | 9.8390 |
| GRAND TOTAL OF STATE BURDEN (excludes pretest) | | | 53 | 53 | 1 | 53 | 0.8611 | 45.6385 | 0 | 0 | 0 | 0.00 | 0.00 | 45.6385 |
| SNAP ADMINISTERED AT COUNTY LEVEL | | | | | | | | | | | | | | |
| County administrators | e-mail | Letter | 300 | 300 | 1 | 300 | 0.1336 | 40.08 | 0 | 0 | 0 | 0.00 | 0.00 | 40.08 |
| County/Local SNAP Staff (low burden) | Web Survey | Survey | 60 | 30 | 1 | 30 | 0.25 | 7.50 | 30 | 1 | 30 | 0.10 | 3.00 | 10.50 |
| County/Local SNAP Staff (medium burden) | Web Survey | Survey | 120 | 60 | 1 | 60 | 0.50 | 30.00 | 60 | 1 | 60 | 0.10 | 6.00 | 36.00 |
| County/Local SNAP Staff (high burden) | Web Survey | Survey | 120 | 60 | 1 | 60 | 0.80 | 48.00 | 60 | 1 | 60 | 0.10 | 6.00 | 54.00 |
| Subtotal, County/Local SNAP Staff | | | 300 | 150 | 1 | 150 | 0.8372 | 125.5800 | 150 | 1 | 150 | 0.10 | 15.00 | 140.58 |
| GRAND TOTAL | | | 372 | 221 | 1 | 221 | 0.8197 | 181.1534 | 151 | 1 | 151 | 0.1167 | 15.0167 | 196.1701 |



B. Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.


The total cost to respondents for their time in this collection is $3,544.65 (Table A12.B). To calculate the annualized cost to State and local agencies and business respondents, we used the mean hourly wage rate categories determined by the Bureau of Labor Statistics, May 2016, National Occupational Employment and Wage Estimates.
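A minimal sketch of this cost calculation, using the $17.81 mean hourly wage noted in the footnote to Table A12.B (illustrative only; the figures of record are those in the table):

```python
# Respondent cost = estimated burden hours x BLS mean hourly wage
# ($17.81, May 2016 National Occupational Employment and Wage Estimates).
MEAN_HOURLY_WAGE = 17.81  # dollars per hour, mean of all occupations

def respondent_cost(burden_hours: float) -> float:
    """Dollar cost of respondent time for a given number of burden hours."""
    return burden_hours * MEAN_HOURLY_WAGE

# Example: county administrators reading the notification e-mail,
# 300 respondents x 0.1336 hours = 40.08 burden hours.
print(round(respondent_cost(40.08), 4))  # 713.8248
```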

Table A12.B. Annualized Cost to Respondents and Non-Respondents by Respondent Description

Affected Public for all rows: State, Local, and Tribal Government.

| Respondent Description | Estimated # of Respondents | Responses Annually per Respondent | Total Annual Responses | Estimated Avg. Hours per Response | Estimated Total Hours | Respondent Cost ($) |
| --- | --- | --- | --- | --- | --- | --- |
| PRETEST | | | | | | |
| Regional Directors (introductory e-mail) | 3 | 1 | 1 | 0.0167 | 0.0501 | 0.8923 |
| State administrators (e-mail notification) | 4 | 1 | 1 | 0.0334 | 0.1336 | 2.3794 |
| Regional Directors (e-mail notification) | 3 | 1 | 1 | 0.0167 | 0.0501 | 0.8923 |
| State administrators (e-mail follow-up) | 2 | 1 | 1 | 0.0167 | 0.0334 | 0.5949 |
| County/Local SNAP Staff (e-mail follow-up) | 1 | 1 | 1 | 0.0167 | 0.0167 | 0.2974 |
| State SNAP Staff (web survey) | 4 | 1 | 1 | 2.4170 | 7.2510 | 129.1403 |
| County/Local SNAP Staff (web survey) | 2 | 1 | 1 | 1.20 | 2.40 | 42.7440 |
| STATE-ADMINISTERED | | | | | | |
| Regional Directors (e-mail) | 7 | 1 | 1 | 0.0668 | 0.4676 | 8.3280 |
| State administrators (e-mail) | 43 | 1 | 1 | 0.1336 | 5.7448 | 102.3149 |
| State SNAP Staff (low burden) | 4 | 1 | 1 | 0.3340 | 1.3360 | 23.7942 |
| State SNAP Staff (medium burden) | 28 | 1 | 28 | 0.5845 | 16.3660 | 291.4785 |
| State SNAP Staff (high burden) | 11 | 1 | 11 | 1.00 | 11.00 | 195.91 |
| State administrators (e-mail follow-up) | 43 | 1 | 43 | 0.0167 | 0.7181 | 12.7894 |
| State administrators (phone reminder) | 20 | 1 | 10 | 0.0167 | 0.1670 | 2.9743 |
| STATE-ADMINISTERED SNAP AT COUNTY LEVEL | | | | | | |
| State administrators (e-mail) | 10 | 1 | 10 | 0.1336 | 1.3360 | 23.7942 |
| State SNAP Staff Completing for State and County (low burden) | 1 | 1 | 1 | 0.50 | 0.50 | 8.9050 |
| State SNAP Staff Completing for State and County (medium burden) | 2 | 1 | 2 | 1.00 | 2.00 | 35.62 |
| State SNAP Staff Completing for State and County (high burden) | 2 | 1 | 2 | 1.25 | 2.50 | 44.5250 |
| State SNAP Staff Completing for State only (low burden) | 1 | 1 | 1 | 0.3340 | 0.3340 | 5.9485 |
| State SNAP Staff Completing for State only (medium burden) | 2 | 1 | 2 | 0.5845 | 1.1690 | 20.8199 |
| State SNAP Staff Completing for State only (high burden) | 2 | 1 | 2 | 1.00 | 2.00 | 35.62 |
| State administrators (e-mail follow-up) | 10 | 1 | 10 | 0.0167 | 0.1670 | 2.9743 |
| State administrators (phone reminder) | 5 | 1 | 5 | 0.0167 | 0.0835 | 1.4871 |
| SNAP ADMINISTERED AT COUNTY LEVEL | | | | | | |
| County administrators (e-mail) | 300 | 1 | 300 | 0.1336 | 40.08 | 713.8248 |
| County/Local SNAP Staff (low burden) | 60 | 1 | 30 | 0.25 | 7.50 | 133.5750 |
| County/Local SNAP Staff (medium burden) | 120 | 1 | 60 | 0.50 | 30.00 | 534.30 |
| County/Local SNAP Staff (high burden) | 120 | 1 | 60 | 0.80 | 48.00 | 854.88 |
| Non-respondents | 151 | 1 | 151 | 0.1167 | 17.6217 | 313.8425 |
| TOTAL* | 372 | 1 | 221 | 12.5053 | 199.0256 | 3,544.6459** |



* Includes 18 pretest respondents, 53 State responses, and 150 county/local responses

**Assumes $17.81 represents the median of all occupations. See https://www.bls.gov/oes/current/oes_nat.htm.

A13. ESTIMATES OF OTHER TOTAL ANNUAL COST BURDEN


Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information, (do not include the cost of any hour burden shown in questions 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.


There is no other cost to respondents beyond what is discussed in A12.

A14. PROVIDE ESTIMATES OF ANNUALIZED COST TO THE FEDERAL GOVERNMENT


Provide estimates of annualized cost to the Federal government. Provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.


The total average annualized cost is $338,008.28: $306,953.68 for the contractor and, based on 2017 Federal salary rates, $28,683.00 for the Federal project officer and $2,371.60 for the Branch Chief.

The total cost to the Federal Government for all data collection activities performed by the contractor is $767,384.20 over 30 months. These costs include study design, development of data collection instruments, preparation of the OMB clearance submission, data collection, analysis, and report writing. Based on 2017 Federal salary rates, the Federal project officer, a GS-11, Step 1, will spend approximately 2,250 hours over 30 months managing the data collection, costing the Federal Government $71,707.50. The FNS Branch Chief providing project oversight is estimated at GS-14, Step 4, at $59.29 per hour based on 2,080 hours per year; we anticipate that this person will work 100 hours, for a cost of $5,929.
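The annualization arithmetic above can be sketched as follows. This is illustrative only; the project officer's $31.87 hourly rate is derived here from the stated totals ($71,707.50 / 2,250 hours) rather than quoted from the wage schedule.

```python
# Average annualized cost: 30-month project totals divided by 2.5 years.
PROJECT_YEARS = 30 / 12  # 2.5 years

contractor_total = 767_384.20           # all contractor data collection activities
project_officer_total = 2_250 * 31.87   # GS-11, Step 1; hourly rate derived from stated totals
branch_chief_total = 100 * 59.29        # GS-14, Step 4, at $59.29 per hour

annualized = (contractor_total + project_officer_total + branch_chief_total) / PROJECT_YEARS
print(round(annualized, 2))  # 338008.28
```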

A15. EXPLANATION OF PROGRAM CHANGES OR ADJUSTMENTS


Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


This is a new information collection that will add 196.1701 burden hours and 372 annual responses to the FNS OMB inventory.


A16. PLANS FOR TABULATION, AND PUBLICATION AND PROJECT TIME SCHEDULE


For collections of information whose results are planned to be published, outline plans for tabulation and publication.


We will provide analyses derived from this data collection via three key deliverables: (1) a comprehensive final written report; (2) a briefing presented at FNS; and (3) a repository of SNAP data-matching information with a robust data analytic tool. The report and the briefing will present key findings of the study in clear, nontechnical language understandable by a broad audience. Table A16 presents the schedule for data collection, analysis, and reporting.

Table A16. Data Collection, Analysis, and Reporting Schedule

| Activity | Time Schedule |
| --- | --- |
| Data Collection | |
| Pretesting | May 8-22, 2017 |
| Send introductory letter from FNS to regional directors | 1 week after OMB approval |
| Send introductory letter from FNS to State administrators | 2 weeks after OMB approval |
| Send letter from contractor to survey respondents | 3 weeks after OMB approval |
| Send web survey with URL | May 20, 2018 |
| Data Analysis | |
| Compile survey data and prepare survey data files | November 5, 2018 |
| Complete data analysis | December 5, 2018 |
| Final table shells | December 19, 2018 |
| Reports | |
| Main Report – Final | March 28, 2019 |
| Presentations | |
| FNS presentation | May 20, 2019 |
| Data Files | |
| SNAP Data-Matching and Analytic Tool (S-DMAT) with documentation | May 21, 2019 |

Main Final Report. The main final report, to be published on the FNS website (http://www.fns.usda.gov/ora/), will present a comprehensive inventory and assessment of each State’s SNAP data-matching methods, procedures, and program data sources as well as a discussion of barriers and challenges encountered. The report will consist of an executive summary and five sections, plus appendices: (1) Introduction and background of SNAP data-matching, (2) study objectives, (3) methodology, (4) results from the National Survey of State SNAP Data Matching and reviews of the data-matching documentation, (5) lessons learned and conclusions, and (6) appendices.

We will present information in tables and in narrative form, with analysis of the study’s objectives and questions based on the information gathered. We will organize this discussion by topic in order to provide readers with a fuller, more textured understanding of each dimension of direct certification. Our general approach will be to use information from the national survey to give the broader picture of direct certification practices nationally, and then to use analysis of the data-matching documentation to support the conclusions of the national survey analysis, provide counterexamples, or give a more nuanced local perspective on the patterns we find nationally.

FNS Briefing. The briefing at FNS will provide an opportunity to discuss the key findings of the project. The project team will organize the presentation in a way that clearly highlights the condition of SNAP data-matching—both nationally and across representative States. The briefing will go over all aspects of the project and provide to FNS a clear and concise view of the types and purpose of SNAP data-matching at the national and State level for both State-administered and county-administered SNAP agencies.

SNAP Data-Matching and Analytic Tool (S-DMAT). The SNAP Data-Matching and Analytic Tool will be a one-stop technical solution to gather and display the most up-to-date information on data-matching database characteristics to facilitate improved direct certification systems in each State. The data analytic tool will provide a comprehensive, multidimensional view of SNAP data-matching, allowing users to view information in different ways to see SNAP data-matching characteristics from the national level down to the individual State and county level (for those States that perform matching at that level). The database will include edited information gathered from the National Survey of State SNAP Data Matching and supplementary technical documentation gathered. Information stored in the database will include the following: computer matching sources, procedures, program data used in matching, attributes of State information systems, data element attributes, frequency of matching, data-matching algorithms, data-matching rates, and cost data.

The S-DMAT couples the need for current information to identify promising new practices with the technological capacity to update process and technological changes quickly. Users will be able to query the database using a simple Section 508-compliant interface. FNS owners of the data will also have the ability to provide access to other users10 – such as State SNAP directors – to enable them to update the database with new information as well as to use the search and query features to foster improvements in SNAP data-matching.

For example, we plan to use Tableau as the business intelligence software to view and analyze the data collected for this project. The Tableau interface will be connected live to the SQL database tables, presenting the collected data in visually informative ways that will enable FNS and authorized State users to summarize figures and reach conclusions about State data matching easily. The tool will provide users with a high-level national overview of States' data-matching metrics as well as the ability to drill down and see a detailed profile of each State's matching practices, including the data sources that a State currently uses, the data sources it plans to begin using in the near future, the frequency with which the State conducts matches, and the costs incurred with each matching session, where possible. The tool also offers the ability to view the data in different ways through ad-hoc queries, so users can manipulate the data and present it in different ways based on their needs and questions. In addition, the S-DMAT will have reporting functionality so that information can be readily used to assist States and districts with their continuous improvement plans.
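As a hypothetical illustration of the kind of ad-hoc query the tool could support: the table name, column names, and sample rows below are invented for this sketch and are not the actual S-DMAT schema, and SQLite stands in for the production SQL database.

```python
# Sketch of an ad-hoc query against illustrative data-matching tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE state_matches (
    state TEXT, data_source TEXT, match_frequency TEXT)""")
conn.executemany(
    "INSERT INTO state_matches VALUES (?, ?, ?)",
    [("AL", "SSA Death Master File", "monthly"),
     ("AL", "National Directory of New Hires", "at certification"),
     ("AK", "SSA Death Master File", "quarterly")])

# National overview: how many data sources does each State match against?
rows = conn.execute("""
    SELECT state, COUNT(data_source) AS n_sources
    FROM state_matches GROUP BY state ORDER BY state""").fetchall()
print(rows)  # [('AK', 1), ('AL', 2)]
```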

A17. DISPLAYING THE OMB APPROVAL EXPIRATION DATE


If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The agency will display the OMB approval number and expiration date on all instruments.


A18. EXCEPTIONS TO THE CERTIFICATION STATEMENT IDENTIFIED IN ITEM 19


Explain each exception to the certification statement identified in Item 19 of the OMB 83-I "Certification for Paperwork Reduction Act."


There are no exceptions to the certification statement.


1 Borden, William S., and Robbi L. Ruben-Urm. (2002, January). “An Assessment of Computer Matching in the Food Stamp Program.” Final Report. OMB Control Number 0584-0504, expiration date 3/31/2002. Alexandria, VA: U.S. Department of Agriculture, Food and Nutrition Service.

2 "S. 1508 — 111th Congress: Improper Payments Elimination and Recovery Act of 2010." www.GovTrack.us. 2009. Retrieved July 31, 2017 from https://www.govtrack.us/congress/bills/111/s1508l

3 Food and Nutrition Service. (2012, August 13). “Supplemental Nutrition Assistance Program: Disqualified Recipient Reporting and Computer Matching Requirements.” A Rule by the Food and Nutrition Service on August 13, 2012. Federal Register, Document 77 FR 48045, Number 2012-19768, 7 CFR 272-273, pp. 48045-48058, https://www.gpo.gov/fdsys/pkg/FR-2012-08-13/pdf/2012-19768.pdf

U.S. Government Accountability Office. (2014). “Supplemental Nutrition Assistance Program: Enhanced Detection Tools and Reporting Could Improve Efforts to Combat Recipient Fraud.” GAO-14-641. A report to Ranking Member, Committee on the Budget, U.S. Senate.

4 Borden, William S., and Robbi L. Ruben-Urm. (2002, January). “An Assessment of Computer Matching in the Food Stamp Program.” Final Report. OMB Control Number 0584-0504, expiration date 3/31/2002. Alexandria, VA: U.S. Department of Agriculture, Food and Nutrition Service.


5 Ibid.

7 Borden, William S., and Robbi L. Ruben-Urm. (2002, January). “An Assessment of Computer Matching in the Food Stamp Program.” Final Report. OMB Control Number 0584-0504, expiration date 3/31/2002. Alexandria, VA: U.S. Department of Agriculture, Food and Nutrition Service.

8 Government Accountability Office. (2016). More Information on Promising Practices Could Enhance States’ Use of Data Matching for Eligibility. (GAO Publication No. 17-111). Washington, D.C.: U.S. Government Printing Office.

9 The average number of counties per state is 62. https://en.wikipedia.org/wiki/County_(United_States)

10 The ability of FNS to provide access to the information in the data analytic tool to other users is dependent on the environment, software version, data security protocol, and so on particular to FNS.
