Evaluation of SNAP Employment and Training Pilots
OMB Supporting Statement 0584-0604
Part A: Justification
Submitted to:
Office of Management and Budget
Submitted by:
Project Officer: Danielle Deemer
Food and Nutrition Service
United States Department of Agriculture
PART A: JUSTIFICATION
A.1. Explanation of circumstances that make collection of data necessary
A.2. How the information will be used, by whom, and for what purpose
A.3. Uses of improved information technology to reduce burden
A.4. Efforts to identify and avoid duplication
A.5. Efforts to minimize burden on small businesses or other entities
A.6. Consequences of less frequent data collection
A.7. Special circumstances requiring collection of information in a manner inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations
A.8. Federal Register comments and efforts to consult with persons outside the agency
1. Federal Register notice and comments
2. Consultations outside the agency
A.9. Payments to respondents
A.10. Assurance of privacy
1. Privacy
2. Institutional Review Board (IRB)
A.11. Questions of a sensitive nature
A.12. Estimates of respondent burden
A.13. Estimates of other annual costs to respondents
A.14. Estimates of annualized government costs
A.15. Changes in hour burden
A.16. Time schedule, publication, and analysis plans
1. Publication of study results
2. Plans for analysis
A.17. Display of expiration date for OMB approval
A.18. Exceptions to certification statement
TABLE
Table A.2.a. SNAP E&T Pilots study data collection instruments for OMB approval
EXHIBIT
Exhibit A.16.a. Project publication schedule
ATTACHMENTS
A Agricultural Act of 2014 (The 2014 Farm Bill)
B Study Description: Research Questions, Data Sources, and Key Outcomes
C.1 Registration Document – English
C.2 Registration Document – Spanish
C.3 Registration Document – Screenshots
D.1 Study Consent Document – English
D.2 Study Consent Document – Spanish
D.3 Study Consent Document Mandatory – English
D.4 Study Consent Document Mandatory – Spanish
E.1 Welcome Packet Letter – English
E.2 Welcome Packet Letter – Spanish
F.1 Study Brochure – English
F.2 Study Brochure – Spanish
G.1 Seasonal Postcard - English
G.2 Seasonal Postcard - Spanish
H Master Site Visit Protocol
I.1 Interview Guide for Client Case Study
I.2 Interview Guide for providers
I.3 Observation Guide Case Study
J.1 Focus Group Moderator Guide for clients – English
J.2 Focus Group Moderator Guide for clients – Spanish
J.3 Focus Group Moderator Guide for employers
K.1 Client Focus Group Recruitment Guide – English
K.2 Client Focus Group Recruitment Guide – Spanish
L.1 Focus Group Confirmation Letter: Client – English
L.2 Focus Group Confirmation Letter: Client – Spanish
L.3 Focus Group Recruitment Email – Employer
L.4 Focus Group Confirmation Letter - Employer
M.1 Participant Information Survey: Client Focus Group – English
M.2 Participant Information Survey: Client Focus Group – Spanish
M.3 Participant Information Survey – Employer Focus Group
N Pretest Results Memorandum
O.1 SNAP E&T Pilots 12-Month Follow-Up Survey – English
O.2 SNAP E&T Pilots 12-Month Follow-Up Survey – Spanish
O.3 SNAP E&T Pilots 12-Month Follow-Up Survey – Screenshot
O.4 SNAP E&T Pilots 36-Month Follow-Up Survey – English
O.5 SNAP E&T Pilots 36-Month Follow-Up Survey – Spanish
O.6 SNAP E&T Pilots 36-Month Follow-Up Survey – Screenshot
P.1 Survey Advance Letter – English
P.2 Survey Advance Letter – Spanish
Q.1 Survey Reminder Letter – English
Q.2 Survey Reminder Letter – Spanish
R.1 Survey Reminder Postcard – English
R.2 Survey Reminder Postcard – Spanish
S.1 Survey Refusal Letter – English
S.2 Survey Refusal Letter – Spanish
T Administrative Data Elements
U Pilot Costs Workbook
V.1 Staff Time-Use Survey
V.2 Staff Time-Use Survey Screenshots
W.1 Time-Use Survey Initial Email
W.2 Time-Use Survey Reminder Email
X.1 Sample Respondent Burden table (Excel)
X.2 Sample Respondent Burden Table (Word)
Y.1 NASS Reviewer Comments
Y.2 NASS Reviewer Comments and Responses to Comments
Z SNAP E&T Pilots Memorandum of Understanding
AA Document List for Document Review Process
BB Confidentiality Pledge
CC.1 Federal Register Comment
CC.2 Response to Federal Register Comment
CC.3 Federal Register Comment
CC.4 Response to Federal Register Comment
DD Summary of Pilot Projects
EE IRB Approval Letters
A.1. Explanation of circumstances that make collection of data necessary

Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
This is a revision of a currently approved information collection request. In the Agricultural Act of 2014 (Public Law 113-79, Section 4022), Congress added a new section authorizing the implementation and evaluation of pilot projects to “reduce dependency and increase work requirements and work effort under the Supplemental Nutrition Assistance Program.” Specifically, the pilot programs or projects (whose operators are also referred to as grantees) are tasked with reducing dependence on the Supplemental Nutrition Assistance Program (SNAP) and other public assistance programs by increasing the number of SNAP participants who obtain unsubsidized employment and by increasing the income of employed participants.
This section mandates an independent longitudinal evaluation of the pilot programs “to measure the impact of employment and training programs and services provided by each State agency under the pilot projects on the ability of adults in each pilot project target population to find and retain employment that leads to increased household income and reduced reliance on public assistance, as well as other measures of household well-being, compared to what would have occurred in the absence of the pilot project.” The data being collected under this submission are necessary to meet the congressionally mandated requirement for an independent evaluation of the demonstration projects being conducted by the U.S. Department of Agriculture (USDA), Food and Nutrition Service (FNS) under this authorizing legislation. A copy of the statute is included in Attachment A.
FNS received nearly 50 applications in response to a Request for Applications (RFA) released in August 2014 and screened them to identify the top third of the State proposals. FNS’s contractor then reviewed these applications to support FNS’s selection of the final 10 pilots. FNS sought 10 pilots that were diverse in geographic location, in whether the existing employment and training program is mandatory or voluntary, in the services they offer, and in the groups of SNAP participants they target. FNS also sought pilots that demonstrated plans for strong implementation of innovative SNAP employment and training programs that would introduce distinct and meaningful differences in the services received by treatment and control group members, coupled with faithfully implemented research designs with adequate sample sizes.
FNS’s contractor implemented a multi-step approach to narrow the pool to the 10 pilots with the strongest potential program and evaluation designs: (1) a multi-person team assessed each application and completed a review template used across all applications; (2) the contractor drafted a memorandum summarizing the strengths and weaknesses of each applicant’s written proposal and of the applicant’s written responses to questions that FNS submitted based on its initial review; and (3) the contractor conducted up to several additional rounds of questions to applicants based on detailed reviews of the applications. Finally, FNS’s contractor submitted a Technical Review memorandum in which it ranked the top third of applicants as “high,” “medium,” or “low.” Using this as input, FNS awarded grants to pilots in California, Delaware, Georgia, Illinois, Kansas, Kentucky, Mississippi, Virginia, Vermont, and Washington State.
A.2. How the information will be used, by whom, and for what purpose

Indicate how, by whom, how frequently, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
FNS/USDA will use the information gathered in the data collection activities discussed here to describe the pilot projects and to determine if the employment and training programs and services provided by each pilot program led to unsubsidized work, increased earnings, and reduced reliance on public assistance programs, including SNAP. A detailed overview of the study can be found in Attachment B.
The data collection described in this document is essential for meeting the Congressional mandate for an independent longitudinal evaluation of the pilot projects to foster work and self-sufficiency. (The instruments used for data collection are listed in Table A.2.a, below.) Data will be collected through June 2021 to allow grantees time to gather and format data covering the 36-month follow-up period concluding December 2020. This information collection will result in six annual reports to Congress from 2015 to 2020, as well as interim and final evaluation reports for each of the 10 pilots, submitted by FNS’s contractor to FNS in September 2019 and September 2021, respectively. There is currently no other effort that can address the research objectives of the study.
The information collection is still underway, and the contractor will begin analyzing the data in early 2019 for the interim report. The agency has used some of the information collected to submit three annual reports to Congress, in 2015, 2016, and 2017. Each report summarized the pilot projects and described the progress and challenges pilots experienced, participant enrollment trends, and progress in the evaluation. The remaining three reports will be submitted in 2018, 2019, and 2020. The annual Reports to Congress are posted on USDA’s website (https://www.fns.usda.gov/2014-ET-Pilots).
Table A.2.a. SNAP E&T Pilots study data collection instruments for OMB approval
Attachment | Description
Instruments and guides
C.1 & C.2 | Registration Document
H | Master Site Visit Protocol
I.1 | Interview Guide for Client Case Study
I.2 | Interview Guide for Providers
I.3 | Observation Guide Case Study
J.1 & J.2 | Client Focus Group Moderator Guide
M.1 & M.2 | Client Focus Group Participant Information Survey
J.3 | Employer Focus Group Moderator Guide
M.3 | Employer Participant Information Survey
O.1 & O.2 | 12-Month Follow-Up Survey
O.4 & O.5 | 36-Month Follow-Up Survey
U | Pilot Cost Data Collection Workbook
V.1 | Staff Time-Use Survey
Other study materials
D.1, D.2, D.3, D.4 | Study Consent Document
E.1 & E.2 | Welcome Packet Letter
F.1 & F.2 | Study Brochure
G.1 & G.2 | Seasonal Postcard
K.1 & K.2 | Focus Group Recruitment Guide – Client
L.1, L.2, L.4 | Focus Group Confirmation Letter
L.3 | Focus Group Recruitment Email – Employer
P.1 & P.2 | Survey Advance Letter
Q.1 & Q.2 | Survey Reminder Letter
R.1 & R.2 | Survey Reminder Postcard
S.1 & S.2 | Survey Refusal Letter
W.1 | Staff Time-Use Survey Initial Email
W.2 | Staff Time-Use Survey Reminder Email
A.3. Uses of improved information technology to reduce burden

Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
This study strives to comply with the E-Government Act of 2002 (Public Law 107-347, 44 U.S.C. Ch. 36). The consent document (Attachments D.1–D.4) and registration document (Attachments C.1 and C.2) were collected from SNAP E&T Pilots participants electronically (unless a particular site had logistical circumstances that did not allow for electronic collection) via EPIS, a web-based random assignment system. Burden is also being reduced by using computer-assisted telephone interviewing (CATI) to administer the follow-up surveys (Attachments O.1–O.6) of SNAP E&T pilot program participants. By including programmed skip patterns and consistency and data range checks, this technology reduces the data entry errors that often necessitate callbacks to respondents to clarify responses recorded by an interviewer conducting an interview with pencil and paper. The study is collecting 100 percent of follow-up survey responses electronically using CATI.
To the extent possible, any administrative and cost data requested from programs will be collected using Excel workbooks (Attachment U) which will be sent to sites via email. This format will enable the evaluation team to systematically collect data across pilot programs while limiting the burden associated with hardcopy completion.
The staff time-use survey administered as part of cost data collection was administered using Opinio, an internet-based survey software. This format allowed respondents to enter data at their own pace and on their own schedules. No personal data will be maintained in the system. Screenshots of the time-use survey can be found in Attachment V.2.
The total number of responses is 84,916, and the total number of electronic responses is 12,050; approximately 14 percent of responses are electronic.
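As an illustrative check of the share reported above (both figures are taken from the text), the calculation can be sketched as:

```python
# Share of responses collected electronically, per the totals cited above.
total_responses = 84_916
electronic_responses = 12_050

share = electronic_responses / total_responses
print(f"{share:.0%}")  # prints 14%
```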
A.4. Efforts to identify and avoid duplication

Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose described in item 2 above.
FNS has made every effort to avoid duplication. FNS has reviewed USDA reporting requirements, State administrative agency reporting requirements, and special studies by other government and private agencies. To our knowledge, there is no similar information available or being collected for the current time frame that could be used to evaluate these congressionally mandated pilot programs.
The information in the Baseline Registration document consists of demographic and economic characteristics of the pilot participant and was collected just before he or she was randomly assigned. Although some of this information, such as whether the pilot participant is employed, is contained in many states’ management information systems, the evaluation team needed to ask the questions the same way across all ten grantees so that the data are consistently defined across grantees in the impact analyses. The evaluation team also needed to obtain these data just before random assignment, rather than using existing information from prior weeks or months, so that the data accurately describe pilot participants’ circumstances at baseline. Finally, for several grantees, the Baseline Information Registration data were collected at the service provider level rather than at the SNAP administrative level. Thus, although pilot participants’ Social Security Numbers (SSNs) are available in many grantees’ SNAP administrative data systems, the evaluation team requested SSNs on the Baseline Information Registration in order to link the Baseline Information Registration data to the SNAP administrative data. Without them, it would not be possible to link the two data sources.
The information in the follow-up surveys is not available elsewhere. The evaluation team is collecting data on quarterly earnings from UI wage records, for example, but those data do not have more detailed employment and earnings information such as measures of job quality, job tenure, and more detailed wage information that is required for the impact analysis.
A.5. Efforts to minimize burden on small businesses or other entities

If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
Information being requested or required has been held to the minimum required for the intended use. Although smaller grantees (State agencies) and their for-profit and not-for-profit awardee partners are involved in this data collection effort, they deliver the same program benefits and perform the same functions as any other State agency or business partner. Thus, they maintain the same kinds of administrative information on file. The evaluation team estimates that one small business or other small entity will serve as a partner to each pilot program (ten total). The same methods to minimize burden will be used with all such entities. The total number of small entities is 960, or 1.8 percent of the total number of respondents. To avoid burdening for-profit contractors and entities playing a minor role in the pilot program, the evaluation contractor will exclude from the data collection those that receive minimal funding or resources from the awardee.
A.6. Consequences of less frequent data collection

Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
The study will evaluate the impact, implementation, and costs of pilot programs that foster employment and reduce reliance on public assistance programs. This is an ongoing data collection, and participation is voluntary. Data for the study will be collected from 2016 to 2020 from State agency staff; private sector for-profit and not-for-profit partner organization staff; and SNAP E&T Pilot program participants. Without this information, FNS will not be able to produce the required annual Reports to Congress. Moreover, collecting data less frequently would jeopardize the impact evaluation because the design requires an assessment of change over time, including both a short-term and a longer-term assessment. Tracking SNAP E&T participants via a longitudinal survey will also allow FNS to better understand how participants engage in services, acquire job skills, and obtain (and subsequently retain) employment.
A.7. Special circumstances requiring collection of information in a manner inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

Explain any special circumstances that would cause an information collection to be conducted in a manner
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.
There are no special circumstances that would cause FNS to conduct this information collection in a manner inconsistent with 5 CFR 1320.5.
A.8. Federal Register comments and efforts to consult with persons outside the agency

If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.
1. Federal Register notice and comments

A notice of the information collection renewal and an invitation for public comment were published in the Federal Register on August 10, 2018 (volume 83, number 155, pages 39657–39665). In the notice, FNS indicated that this was an extension; FNS has since determined that it is a revision. FNS received two comments on this information collection. The comments and responses appear in Attachments CC.1 through CC.4.
2. Consultations outside the agency

In addition to soliciting comments from the public, FNS consulted with the following people for their expertise in matters such as data sources and availability, research design, sample design, level of burden, and clarity of instructions for this collection.
Jennifer Rhorer
USDA National Agricultural Statistics Service (NASS)
Methodology Division
1400 Independence Ave., SW
Washington, DC 20250
(800) 727-9540
See Attachment Y.1 and Y.2 for NASS reviewer comments and responses to these comments.
Kelly Kinnison
Director, Division of Economic Support for Families
Office of the Assistant Secretary for Planning and Evaluation
U.S. Department of Health and Human Services
200 Independence Avenue, S.W.
Washington, DC 20201
(202) 730-3904
Harvey Krahn
Department of Sociology
5-21 HM Tory Building
University of Alberta
Edmonton, Alberta
Canada T6G 2H4
(780) 492-5234
Bryan Wilson
Workforce Data Quality Campaign Director
National Skills Coalition
1730 Rhode Island Avenue NW, Suite 172
Washington, DC 20036
(202) 223-8355, ext. 115
A.9. Payments to respondents

Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
The first incentive, a magnet sent with the welcome letter, is non-monetary. The magnet is a small gift that provides an ongoing, tangible reminder of the study. It also includes the study’s toll-free number, which sample members can call if they have questions or want to update their contact information. The first monetary incentive provided to sample members was the $40 MAX Discover® prepaid card given to pretest respondents, as described in the pretest results memorandum (Attachment N).
As mentioned, the longitudinal household surveys will be conducted with a randomly selected subsample of study participants. To achieve the response rate needed to obtain reliable impact estimates, our contractor, on behalf of FNS, offered an incentive of $30 to all households for the 12-month follow-up survey and will offer $40 for the 36-month follow-up survey. (See Attachment B for a more detailed overview of the approach to offering incentives.)
A.10. Assurance of privacy

Describe any assurance of privacy provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
1. Privacy

In accordance with the Privacy Act of 1974, our contractor and study team will protect the identifiable information collected for the evaluation from disclosure, to the extent permitted by law, and will use it for research purposes only, except as otherwise required by law. To reduce the risk of disclosure, personally identifiable data will not be entered into the analysis file, and data records will contain a numeric identifier only. The terms and protections provided to respondents are discussed in two systems of records notices (SORNs): (1) FNS-8 USDA FNS Studies and Reports, published in the Federal Register on April 25, 1991 (Volume 56, page 19078); and (2) USDA/FNS-10 Persons Doing Business with the Food and Nutrition Service, published in the Federal Register on March 31, 2000 (Volume 65, pages 17251–17252). Pilot program and partner staff and individual program participants were notified that the information they provide will not be released in a form that identifies them, except as otherwise required by law. No identifying information will be attached to any reports or data supplied to USDA or any other researchers. The identities of the project directors from the States are known, because their information was included on applications to participate in the pilot program.
When the results are reported, data will be presented only in aggregate form, so that individuals and institutions cannot be identified. A statement to this effect is included with all requests for data. All members of the study team with access to the data are trained on the importance of privacy and data security. All data are kept in secured locations, and identifiers are destroyed as soon as they are no longer required.
FNS staff will never handle or see any of the personal data collected. Mathematica Policy Research’s system does not tie into any of FNS’s data management and analysis systems, nor was Mathematica’s data creation and processing system created for this contract. FNS does not have any control over the contractor’s system.
The following safeguards will be employed by FNS’s contractor to protect privacy during the study:
Access to identifying information on sample members is limited to those who have direct responsibility for providing and maintaining sample locating information. At the conclusion of the research, these data will be destroyed.
Identifying information is maintained on separate forms and files that are linked only by sample identification numbers; the identification numbers alone cannot be linked back to any individual.
Access to the file linking sample identification numbers with respondents’ IDs and contact information is limited to a small number of individuals who have a need to know this information.
Access to hard copy documents is rigorously limited to project staff with a need to know. Documents are stored in locked files and cabinets. Documents containing PII are shredded when discarded.
Computer data files are protected with passwords and access is limited to specific users on the research team.
Employees must notify their supervisor, the project director, and the Mathematica security officer if secured and private information has been disclosed to an unauthorized person, used in an improper manner, or altered in an improper manner.
A copy of the Confidentiality Pledge in which the employees of the contractor provide assurances to the above safeguards can be seen in Attachment BB.
2. Institutional Review Board (IRB)

The contractor obtained clearance from the New England Institutional Review Board (NEIRB), a recognized IRB for research involving human subjects. NEIRB is responsible for ensuring that the organization’s research (1) meets the highest ethical standards and (2) receives fair, timely, and collegial review by an external panel. NEIRB is part of the contractor’s Federal Wide Assurance (FWA) and is committed to complying with the requirements of the HHS Protection of Human Subjects regulations at 45 CFR part 46.
For review, the NEIRB requires a summary of the study and several forms describing the pilot sites, as well as all instruments and attachments from the OMB package. After the final instruments were sent to FNS, the contractor submitted the IRB application to the NEIRB. The contractor was granted NEIRB clearance in October 2016. Some states also require local IRB reviews, and the contractor obtained clearance from these as well (i.e., Washington, Delaware, California, Georgia, Virginia, and Vermont). IRB documentation is provided in Attachment EE.
A.11. Questions of a sensitive nature

Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
While the questions in the registration document (Attachments C.1 and C.2) and follow-up surveys (Attachments O.1, O.2, O.4, and O.5) are generally not of a sensitive nature, some individuals may be reluctant to provide information on race, family structure, income, food security, and/or mental health and well-being, as well as their Social Security number.
When participants are called to complete a survey, they will be reminded that participation is voluntary, that they may decline to answer any question they do not wish to answer, and that there will be no penalty for doing so.
Calculating accurate estimates of program costs also requires collecting information on staff salaries. The importance of this information will be explained to study respondents.
A.12. Estimates of respondent burden

Provide estimates of the hour burden of the collection of information.
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
There are a total of 25,870 respondents and non-respondents (18,720 respondents + 7,150 non-respondents). The affected public in this study comprises 190 State agency staff; 770 for-profit and not-for-profit business staff; and 52,870 individuals/households. FNS anticipates 100 percent participation by its grantees (State agencies and businesses). There are also a total of 84,916 responses (77,766 responses from respondents + 7,150 responses from non-respondents); these totals include those who choose not to participate (known as non-respondents). Attachments X.1 and X.2 show sample sizes, estimated burden, and estimated annualized cost of respondent burden for each part of the data collection and for the data collection as a whole. The annual total estimated burden across all data collection components is 17,964.62 hours (17,607.12 hours for respondents + 357.5 hours for non-respondents). Time for reading data collection materials such as letters, postcards, and emails is included in the time estimates in the burden table. No respondents will be asked to keep records as part of this data collection; therefore, no burden hours have been estimated for recordkeeping.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.
Annualized cost of respondent burden is the product of each type of respondent’s annual burden hours and average hourly wage rate. The total cost of respondent burden is $539,530.45. The total annualized cost of this information collection is calculated as the sum of the annualized costs by respondent category. For each respondent category, the annualized cost is the product of burden hours (including pretest burden and non-respondent burden) and an assumed wage rate for a corresponding occupation.
The hourly wage rate of $7.25 for individuals/participants is the federal minimum wage rate according to the Department of Labor Wage and Hour Division (http://www.dol.gov/whd/minimumwage.htm).
Remaining wage rates for the other affected publics were determined using the most recent available data, the May 2017 National Occupational Employment and Wage Estimates data from the Bureau of Labor Statistics. (http://www.bls.gov/oes/current/oes_nat.htm). Using this website, the salaries of State, local, or Tribal agency director/manager respondents ($57.65) are the average hourly earnings of government workers in management occupations (11-0000). The salaries of State, local, or Tribal agency direct service staff respondents ($23.10) are the average hourly earnings of workers in community and social services occupations (21-0000).
For the Private sector, the salaries for the for-profit business director/manager respondents ($57.65) are the average hourly earnings of workers in management occupations (11-0000). The salaries of Private sector not-for-profit agency director/manager respondents ($33.91) are the average hourly earnings of social and community services managers (11-9151). The salaries of Private sector not-for-profit agency employer training supervisor respondents and direct service staff ($21.53) are the average hourly earnings of community and social service specialists (21-1099).
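The calculation described above can be sketched as follows. The wage rates are those cited in the text; the burden hours below are hypothetical placeholders for illustration only, not the study's actual category-level figures (which appear in Attachments X.1 and X.2).

```python
# Sketch of the annualized-cost calculation: for each respondent category,
# annualized cost = annual burden hours x average hourly wage rate, and the
# total is the sum across categories.

# Wage rates cited in the text (dollars per hour).
wage_rates = {
    "individuals/participants": 7.25,            # federal minimum wage
    "government director/manager": 57.65,        # SOC 11-0000
    "government direct service staff": 23.10,    # SOC 21-0000
    "for-profit director/manager": 57.65,        # SOC 11-0000
    "not-for-profit director/manager": 33.91,    # SOC 11-9151
    "not-for-profit service staff": 21.53,       # SOC 21-1099
}

# Hypothetical annual burden hours per category (placeholders only).
burden_hours = {
    "individuals/participants": 15_000.0,
    "government director/manager": 400.0,
    "government direct service staff": 900.0,
    "for-profit director/manager": 150.0,
    "not-for-profit director/manager": 300.0,
    "not-for-profit service staff": 800.0,
}

total_cost = sum(burden_hours[cat] * wage_rates[cat] for cat in wage_rates)
print(f"Total annualized cost of respondent burden: ${total_cost:,.2f}")
```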
A.13. Estimates of other annual costs to respondents

Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14.) The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.
No capital and start-up or ongoing operational and maintenance costs are associated with this information collection.
Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.
The total cost to the Federal government is $28,999,516.78 over a 72-month period, or $4,833,252.73 on an annualized basis. The largest cost to the Federal government is payment to a contractor of $28,992,644² to conduct the study and deliver data files. The information collection also assumes a total of 80 hours of Federal employee time per year for a GS-15, step 1 in the Washington, DC locality, at $64.59 per hour, for a total of $5,167.20 (or $6,872.38 in fully loaded wages). Federal employee pay rates are based on the General Schedule of the Office of Personnel Management (OPM) effective January 2018.
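The arithmetic behind these figures is straightforward; a minimal check is shown below using the values stated above. Note that the annualized total computed here differs from the stated figure by a few cents, presumably due to rounding conventions in the original calculation.

```python
# Sketch of the Federal-cost arithmetic described above, using values from the text.
total_cost = 28_999_516.78
months = 72
annualized = total_cost / (months / 12)  # 72 months = 6 years

gs15_hourly = 64.59          # GS-15, step 1, Washington, DC locality (Jan 2018)
hours_per_year = 80
federal_staff_cost = gs15_hourly * hours_per_year

print(round(annualized, 2))          # approximately $4,833,252.80 per year
print(round(federal_staff_cost, 2))  # $5,167.20
```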
Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-1.
This is a revision of a currently approved information collection as a result of program changes. These burden estimates were revised from 49,972.09 burden hours and 317,108 annual responses to 18,522.68 burden hours and 84,916 annual responses. The reason for the revision is that the research team has concluded study enrollment, the 12-month follow-up survey, and all three rounds of site visits. The burden hours for these data collection activities were removed from the burden estimates.
During this renewal process, we discovered that the burden hours and total responses identified and approved in ROCIS, which indicates 41,764 burden hours and 162,082 responses, are incorrect; the approved supporting statement correctly displayed the requested burden hours and total responses indicated above. This request will reflect program adjustments in ROCIS of an additional 8,202 burden hours and 155,026 responses.
For collections of information whose results are planned to be published, outline plans for tabulation and publication.
Exhibit A.16.a shows the planned publication schedule for SNAP E&T Pilots evaluation.
Exhibit A.16.a. Project publication schedule
Activity | Schedule
Annual Reports to Congress | December 2015-2020
Interim evaluation report published | 01/01/19-10/31/19
Special analyses published | 10/15/19-08/30/21
Final evaluation report published | 01/01/21-10/31/21
FNS will submit an Annual Report to Congress in December of each year of the base contract (6 annual reports). Each annual report will summarize the implementation status of the pilots and the activities of the evaluation during the previous year, as well as planned activities for the next year.
From January 2019 through October 2019, the evaluation team will prepare one interim report, which will be a combination of project-specific and topical cross-project reports. Following the interim report, from October 2019 to August 2021, the evaluation team will conduct up to ten additional analyses that will be published in the form of memos and short reports. From January to October 2021, the evaluation team will produce ten final report volumes, one for each pilot project, describing project implementation and participation, costs, and impacts. Each project-specific final report will include appendices that describe the study methodology and provide technical details about the study design, sampling, weighting, response rates, data processing, and analysis.
In addition, from June to October 2019 and again from June to October 2021, the evaluation team will prepare integrated cross-pilot summary reports, targeted to a policy-focused audience, that examine impacts, implementation, and cost-effectiveness across all pilot projects. The reports will synthesize findings across pilots, using theme tables and side-by-side comparisons of the services provided by the pilots, impacts, participation patterns, and benefits and costs. Each report will include a two-page executive summary of key findings for policymakers who do not have time to delve into the full details. The final reports will be posted on the USDA FNS website (http://www.fns.usda.gov/ops/research-and-analysis).
The implementation and process analysis will draw on data collected from the site visits, MIS files, and all other interactions with sites. Following each site visit, the evaluation team will use a structured write-up guide to synthesize collected information by implementation dimension, highlight themes, provide examples and illustrative quotes, and identify discrepancies and areas of agreement among data sources. This information will be uploaded into a database developed for the study, structured around the study questions and protocols. Descriptive tables will also be used to keep track of the components of each site and ensure consistency in knowledge across all staff and sites. Using theme tables, site notes, and the standardized descriptive tables, the evaluation team will describe the environment in which the pilot participants (both treatment and control group members) operate, as well as the implementation process and outcomes for each pilot. The assessment will identify how the pilot was planned; whether it was implemented as intended; what challenges the sites experienced in planning, implementing, and operating the pilots, and the associated solutions; and how the pilot components and services differ from those available to the control groups and how these differences might affect impacts. In addition to pilot program-level analysis, the evaluation team will conduct cross-site analysis and subgroup analysis within and between pilots.
At the heart of the evaluation is the estimation of the impact of the pilot projects on participants’ outcomes. The primary outcomes in these analyses will be measures of earnings, employment, and receipt of SNAP, TANF, and Medicaid. Key secondary outcomes will include household food security, health status, and self-esteem.
As impacts on some outcomes may emerge or fade over time, the evaluation team will estimate impacts on outcomes measured regularly throughout the period between random assignment and each follow up survey. For employment and earnings, the evaluation team will estimate the impact for each quarter since random assignment, whereas for public assistance receipt, the evaluation team will estimate an impact for each month since random assignment. Impacts on secondary outcomes will be measured at each follow up.
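The quarterly impact estimation described above can be sketched as a simple treatment-control comparison. This is a minimal illustration only: the function name, data layout, and figures below are hypothetical, and the actual evaluation would use regression adjustment, survey weights, and variance estimation rather than raw mean differences.

```python
# Minimal sketch of per-quarter impact estimation: for each quarter since
# random assignment, the impact on earnings is approximated as the difference
# in mean earnings between treatment ("T") and control ("C") group members.
# All data below are synthetic placeholders, not study data.
from statistics import mean

def quarterly_impacts(records):
    """records: list of (quarter, group, earnings) tuples; group is 'T' or 'C'."""
    quarters = sorted({q for q, _, _ in records})
    impacts = {}
    for q in quarters:
        treatment = [e for qq, g, e in records if qq == q and g == "T"]
        control = [e for qq, g, e in records if qq == q and g == "C"]
        impacts[q] = mean(treatment) - mean(control)
    return impacts

# Synthetic earnings records for two quarters after random assignment
data = [(1, "T", 3200), (1, "T", 2800), (1, "C", 2500), (1, "C", 2700),
        (2, "T", 3600), (2, "T", 3400), (2, "C", 3000), (2, "C", 3200)]
print(quarterly_impacts(data))
```

The same structure applies to the monthly public assistance receipt impacts, with months replacing quarters and a receipt indicator replacing earnings.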
A critical part of the analysis will be to assess what works and for whom. These analyses can be used to assess the extent to which intervention effects vary across policy-relevant subpopulations. Results from subgroup analyses can help inform decisions about how best to target specific interventions, and possibly to suggest ways to improve the design or implementation of the tested interventions.
The participation analysis will describe the employment and training services received by SNAP participants for each tested intervention, the variation in service receipt over time within and across pilot sites, and the contrast in the services received by the treatment and control group members. It will also examine the extent to which the pilot E&T initiatives encourage or discourage entry into the SNAP program.
The evaluation team will describe the nature, amount, and types of services received by SNAP participants in each SNAP E&T activity and differences in services between the treatment and control groups. Analyses will be conducted by site and by groups of sites with similar program features, with an emphasis on identifying factors associated with cross-site variation in service receipt.
The evaluation team will also examine the extent to which the pilot E&T initiatives encourage or discourage entry into the SNAP program. The approach for estimating entry effects will be tailored to the pilot project designs and, indeed, some designs may preclude the analysis of entry effects. In some cases it may be necessary to use non-experimental methods to analyze entry effects, such as comparing SNAP participation rates in areas with and without the pilot, before and after the pilot, and between the SNAP E&T-eligible and ineligible populations.
Future decisions about whether to replicate and expand the SNAP E&T pilots will hinge in part on whether their benefits are large enough to justify their costs. The evaluation team will conduct a cost-benefit analysis at the end of each follow up period that uses a systematic accounting framework (Ohls et al. 1996) to determine (a) overall, per-participant and per-component costs of providing treatment services relative to those of providing control services; (b) whether the benefits of each pilot exceed its costs; and (c) the extent to which planning, start-up, and early implementation costs are offset by the stream of current and future benefits.
The evaluation team will use the ingredient, or resource cost, method (Levin and McEwan 2001) to compute pilot costs. This approach entails estimating pilot costs through itemizing and collecting data on the amounts and costs of the resources (or ingredients) necessary to provide services. Using cost data provided by pilots and estimates of the additional services received by each participant (estimated from the SNAP E&T MIS), the evaluation team will estimate the total pilot costs, per-component costs (for example, the cost of each type of service), and per-participant costs. For all three types of costs, distributions of costs will be compared within and across grantees.
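The ingredient-method arithmetic above can be sketched as follows. The function name, component labels, and figures are hypothetical placeholders for illustration, not the pilots' actual cost structure.

```python
# Sketch of the ingredient (resource cost) method: total pilot cost is the sum
# over ingredients of (quantity x unit cost); per-component costs sum the
# ingredients within each service component, and per-participant cost divides
# the total by the number of participants. All figures are hypothetical.

def pilot_costs(ingredients, participants):
    """ingredients: {component: [(quantity, unit_cost), ...]}.

    Returns (total cost, per-component costs, per-participant cost)."""
    per_component = {
        component: sum(qty * unit_cost for qty, unit_cost in items)
        for component, items in ingredients.items()
    }
    total = sum(per_component.values())
    return total, per_component, total / participants

# Hypothetical ingredients for a single pilot
ingredients = {
    "case management": [(1200, 25.0)],  # staff hours x hourly cost
    "training": [(300, 80.0)],          # sessions x per-session cost
}
total, per_component, per_participant = pilot_costs(ingredients, participants=150)
print(total, per_component["training"], round(per_participant, 2))
```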
The evaluation team will also conduct a series of up to ten special analyses to supplement findings from the interim evaluation report. These additional analyses on specific topics will enable stakeholders to learn more about the impact of program components and how impacts differ across policy-relevant subgroups.
If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The agency plans to display the expiration date for OMB approval of the information collection on all instruments.
Explain each exception to the certification statement identified in Item 19 “Certification for Paperwork Reduction Act”.
This study does not require any exceptions to the Certification for Paperwork Reduction Act (5 CFR 1320.9).
¹ Fully-loaded wages total $717,575.50.
² Because research services for this study were procured under a firm fixed-price contract, contractor costs include fully loaded wages.