P.1. National Agricultural Statistics Service Comments

Assessment of Mandatory SNAP E&T Programs

OMB: 0584-0645

Appendix P.1 National Agricultural Statistics Service Comments

OMB No. 0584-[NEW]

Assessment of Mandatory E&T Programs

Month XX, 2018

Project Officer: Jordan Younes

Office of Policy Support

Food and Nutrition Service

U.S. Department of Agriculture

3101 Park Center Drive

Alexandria, VA 22303

703-305-2935

[email protected]


NASS Review of OMB Document 0584-[NEW] Assessment of Mandatory Employment and Training (E&T) Programs Administered Through the Supplemental Nutrition Assistance Program (SNAP)

Linette Lanclos, Mathematical Statistician

I. Study Purpose/Justification

The purpose/justification of this study is clearly stated. States have the option to offer mandatory or voluntary employment and training programs for all non-exempt SNAP participants or a subset of participants. While mandatory reports describe participation and general outcomes, the variation in program practices and features is not assessed. The data for this study are being collected to examine program features and administrative practices of mandatory SNAP E&T programs and assess how these features and practices may affect participation, sanctions, and outcomes, namely stable, well-paying employment and economic self-sufficiency for mandatory participants.

II. Proposed Process for Data Collection

Information collected will come from semi-structured interviews, a process-mapping exercise, and the collection of SNAP administrative data in six states. Semi-structured interviews will be conducted with:

    1. the State SNAP office staff

      1. Questions address capacity to meet E&T goals, program outcomes, and challenges

      2. In-person interview with:

        1. SNAP director

        2. E&T manager

    2. local SNAP offices

      1. Questions address screening, intake, and drop-off points, sanctions, and capacity

      2. Two local offices in each of 6 States (12 offices)

        1. one rural

        2. one urban

      3. In-person interview with:

        1. local director

        2. 2 local eligibility staff

      4. Observation data from the local office will also be gathered

    3. SNAP E&T providers

      1. Questions address how local E&T programs serve the population and community

      2. Types of providers

        1. Three staff from State providers in each of 6 States (total 18)

        2. Three staff from business or other for-profit organizations in each of 6 States (total 18)

        3. Three staff from not-for-profit organizations in each of 6 States (total 18)

      3. Rural and urban

      4. Observation data of staff/client interaction and available support will be collected

Once state and local SNAP officials have provided procedural information and personal observations of general practices have been documented, the study team will hold a small group discussion to complete a process-mapping exercise. The group mapping exercise will involve 5 local staff from each of the 2 sites in each of 6 States (total of 60 staff). The mapping will be used to complete the understanding of the offices' screening, intake, and referral processes; any drop-off points; procedures for reviewing noncompliant cases and issuing sanctions; and staff training and capacity to handle the caseload.

Extant administrative data will be solicited from SNAP case records and E&T provider records (i.e., State, business/for-profit, or not-for-profit). The data are collected as follows:

  1. The State database administrator

    1. One file per State

    2. Identify mandatory E&T participants, exemption status, sanctions, SNAP benefits, and case closures

  2. E&T provider

    1. 2 State providers from 6 States

    2. 2 Business or for-profit providers from 6 States

    3. 2 Not-for-profit providers from 6 States

    4. The data to be collected are not specified

The process is fairly well defined, but a graphic up front that summarizes the persons contacted and data collected would be very helpful, as the study is very complex.

The data analysis is not explained in detail for this program. However, there may be concerns that the extant data collection will not provide adequate data for assessment of E&T outcomes. I recommend providing more detail regarding the analysis to better assess the data needed.

III. Participating State Selection Process

There are 17 states with operating mandatory SNAP E&T programs: 3 pilot programs and 14 non-pilot programs. From the 14 non-pilot program states, 6 states were selected to reflect variation across 14 selection criteria. Selection criteria come from characteristics captured in mandatory reporting and analysis of the SNAP and SNAP E&T programs.

For the SNAP E&T, the mandatory reporting for all states includes:

  • The number and percentage of E&T participants and former participants who are in unsubsidized employment during the second quarter after completion of participation in E&T;

  • The number and percentage of E&T participants and former participants who are in unsubsidized employment during the fourth quarter after completion of participation in E&T;

  • The median quarterly earnings of all the E&T participants and former participants who are in unsubsidized employment during the second quarter after completion of participation in E&T;

  • The number and percentage of participants that completed a training, educational, work experience or an on-the-job training component; and

  • The number of all E&T participants who:

  1. Are voluntary vs. mandatory participants;

  2. Have received a high school degree (or GED) prior to being provided with E&T services;

  3. Are able-bodied adults without dependents (ABAWDs);

  4. Speak English as a second language;

  5. Are male or female;

  6. Are within each of the following age ranges: 16-17, 18-35, 36-49, 50-59, 60 or older.

In addition, cross tabulations of the above quantities are reported. State agencies also include reports of E&T components that are projected to have 100 or more participants.

The six states were chosen based on primary selection criteria, which include participant statistics (e.g., the ratio of SNAP E&T participants to SNAP recipients), E&T program characteristics (e.g., the number of E&T components offered), and head-of-household race, and on secondary selection criteria, which include characteristics of the broader SNAP program and FNS administration, such as time limit waivers and support of the SNAP E&T target population. Six states are invited to participate in the study based on the diversity of mandatory SNAP E&T programs they represent, and a back-up state is designated for each selected state. The six named states are Colorado, Florida, Louisiana, New Jersey, Ohio, and South Dakota.

While an algorithm was not provided to solidify an understanding of the selection process, a review of the various state websites indicated that the programs are largely diverse. I recommend providing more detail regarding this selection process.
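
For illustration only, the following is a minimal sketch of the kind of documented, reproducible selection step such detail could describe. It assumes the 14 selection criteria are available as numeric columns in a table of the 14 candidate states and applies a simple greedy diversity rule; neither the rule nor the handling of the criteria reflects the study's actual procedure.

```python
# Hypothetical sketch of a documented state-selection step (not the study's method).
# Assumes a DataFrame indexed by state, with the 14 selection criteria as numeric columns.
import pandas as pd

def select_diverse_states(criteria: pd.DataFrame, n_select: int = 6) -> list[str]:
    """Greedily pick states that are spread out across standardized criteria."""
    z = (criteria - criteria.mean()) / criteria.std(ddof=0)   # put criteria on a common scale
    chosen = [z.pow(2).sum(axis=1).idxmax()]                   # start with the most atypical state
    while len(chosen) < n_select:
        remaining = z.drop(index=chosen)
        # Distance of each remaining state to its nearest already-chosen state.
        dist = remaining.apply(
            lambda row: min(((row - z.loc[c]) ** 2).sum() ** 0.5 for c in chosen), axis=1
        )
        chosen.append(dist.idxmax())                           # add the state farthest from the current set
    return chosen
```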

Moreover, while the expense of incorporating data from all states may be a limiting factor, the differences in the programs and practices might be better observed by incorporating more states into the evaluation set.

IV. Participating Local Offices and Providers

In the pre-test, local offices were suggested by the State office. Again, an algorithm was not provided to solidify an understanding of the selection process. I recommend clearly defining a selection process that ensures the selected local offices represent: 1) rural and urban areas, 2) mandatory programs, and 3) a mix of State, for-profit, and not-for-profit providers.

There are a large number of interviews with providers and a more limited collection of extant administrative data. In comparing State, for-profit, and not-for-profit providers, the interviews may reveal varying degrees of success. The extant data, however, are limited, so the two data sources may not complement each other. I recommend compiling data from a broader base of providers.

V. Pre-test Observations on Data Collection

Generally, the pre-test questions and data collection were well thought out and thorough. The multi-pronged approach of surveys, observation, process mapping, and extant data collection is appropriate and insightful. The approach recognizes that data will be acquired from organizations with likely very different structures and participation levels.

From the pre-test, some areas were identified for improvement. I recommend revisiting the comments from the pre-test and evaluating avenues to ensure that as much data as possible can be collected from the interviews, process mapping, and data collection. I would also recommend having a strategy in place to coordinate data from all sources, for example, establishing what identifiers are needed to ensure that State data and E&T provider data can be merged. Specific examples of areas that I recommend revisiting are given below.
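
To make the coordination point concrete, the sketch below shows the kind of merge that depends on a shared identifier. The column names (case_id, component, and so on) are placeholders I am assuming for illustration; they are not fields from the actual data specification.

```python
# Illustrative sketch only: joining State SNAP case records to E&T provider records
# requires an identifier present in both files. "case_id" and the other column names
# are hypothetical; the actual identifiers would need to be agreed on with each State
# and provider before data collection.
import pandas as pd

state_cases = pd.DataFrame(
    {"case_id": [101, 102, 103], "et_mandatory": [1, 1, 0], "sanctioned": [0, 1, 0]}
)
provider_records = pd.DataFrame(
    {"case_id": [101, 101, 103], "component": ["job search", "training", "job search"]}
)

merged = state_cases.merge(
    provider_records,
    on="case_id",               # shared identifier that must exist in both sources
    how="left",                 # keep every SNAP case, even without a provider match
    validate="one_to_many",     # a case may have several E&T activity records
    indicator=True,             # flags cases with no provider record (possible drop-offs)
)
unmatched = merged[merged["_merge"] == "left_only"]
print(f"{len(unmatched)} case(s) have no matching E&T provider record")
```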

In the State SNAP Director and E&T Director Interview Protocol, it was identified that some questions would need advance research and that the interview exceeded the time limit. Sending a list of topics in advance is a good take-away and will facilitate a smoother process. I recommend supplementing the topic list with a general indication of the depth of the questions. One way to do this is to ask for pre-collected information. Each State program is different, and some of these questions could elicit very lengthy answers. For example, question F.1.c., "Do you produce other reports for internal program management purposes?" is followed by "What kind of information is in these reports?" It may be beneficial to ask for a copy of the internal report, if one is produced. While documents are requested prior to starting the interview process, the request is fairly open and document collection may vary. Asking for copies of materials that support the interview questions will help the State Director identify the types of questions that may be asked, such as the key analyses of their programs and the content of their evaluations.

In the Local SNAP Office Director Protocol, the original plan was for the director to receive an interview focused on higher-level local economic issues and program support. Some questions, however, were more detailed than the office director could easily answer. It was identified that these questions might be better handled by individuals within the office with a specialized focus. Moreover, some questions were less tractable at a local level. To mitigate the unresolved questions, the survey was administered to both the Local SNAP Office Director and the supervisor for employment counselors. I observed the following:

    1. Interviewees answered select sets of questions. While the interview may have lasted only an hour, if all questions had been answered the time likely would have exceeded an hour. I recommend increasing the time.

    2. Process questions, policy questions, examples, and observations are appropriate interview questions. Questions regarding percentages, quantities, frequency, etc. are typically only requested in formats that allow for earlier or concurrent retrieval of the data. Given the quantity of questions and the limited time frame, I recommend either providing the questions in advance of the interview or requesting a follow-up.

    3. The program evaluated by the pre-test had a very narrow focus with only one provider. If there is more than one provider or venue for a client, the answers may be complex. I recommend reviewing the wording, or asking the set of questions separately for the services of each provider, to avoid confusion. For example, E.4.a., "What supports do they most commonly need?" may have a different response depending on whether it refers to a State provider or a for-profit provider. Covering more complex programs will likely require more interview time. I recommend increasing the allotted time.

    4. If topics are distributed pre-interview, more interviewees may attend to address the topics. While this is a good opportunity to collect all the information, the response burden accounts for only one interviewee. I recommend making a good-faith adjustment to account for the potential increased burden.

    5. I recommend changing the placement of some questions from one interview to another. For example, "G.9. Are there any parts of NOAAs that clients seem to have difficulty understanding?" might be better answered by the eligibility worker who works directly with the clients, whereas the Local SNAP Office Eligibility Worker Interview Protocol question "E.5. Approximately how many (what percentage) of your clients who receive a NOAA comply with the requirements in time to avoid being sanctioned?" might be better answered by the local office director. My recommended list of changes in question placement is in Appendix A.



    6. I recommend sending the topics in advance, along with enough description of each topic to ensure the director has the staff support available to answer the more detailed questions. If the office is large, there may be many managers of specific areas involved.

The Local SNAP Office Eligibility Worker Interview Protocol called for two eligibility staff to receive interviews discussing client interaction. The pre-test interview selected a supervisor, who may have somewhat more in-depth knowledge than a non-supervisory employee. The pre-test situation was limited in that only one provider was available, data requests were not answered on the spot, and some questions were determined to be too difficult to answer in the given forum. I would make the following recommendations:

  1. In section E there are a number of questions asking for percentages. These questions may be phrased as "Based on your experience, what percentage…." or "Approximately how many (what percentage)….". Answers based on personal experience may be influenced by divisions within an office, whether geographic, by type of client, etc., so I recommend using this information cautiously. More global questions may be more difficult for an eligibility worker to answer; I recommend directing those questions to managers, directors, or program sponsors.

  2. I recommend sending the topics in advance, along with enough description of each topic to ensure the appropriate staff are available to answer the questions. If the office is large, there may be specialization and some sections may not be addressed.

The E&T Provider Interview Protocol calls for 3 staff each from State, for-profit, and not-for-profit organizations in each State. The pre-test interview was of an E&T provider supervisor and a frontline staff member from a State provider. General observations are as follows:

  1. The testing was limited to a very narrow band of services and to a State provider only. I recommend extending pretesting to evaluate the questions using a for-profit organization and a not-for-profit organization to thoroughly vet the range of answers. For example, questions about capacity may yield differing responses.

  2. As noted, any data requirements should be sent in advance, as these types of questions are difficult to answer without preparation. I recommend sending these questions in advance or allowing for interviewee follow-up.

  3. If multiple providers are available in a local area, some questions may not naturally be answerable. For example, a "new" referral may be confusing because the provider may only know whether the referral is "new" to that specific provider. I recommend restating such questions as "for your organization."

The observation checklist assesses whether the policies, procedures, and requirements of the program are covered. Observations may also need to be placed in the context of the program and the requirements of the individual client. The focus is on the determination of requirements and the explanation of client requirements, with little observation of the opportunities and support presented.

The process-mapping exercise is discussed in Part A of the OMB docket; however, no materials are provided. The stated purpose of the exercise is to help identify each step in the client pathway and drop-off points in the process that may lead to sanctions. I recommend clarifying the task and its relationship to the other types of data collection. The intake observation checklist may also benefit from more detail around the process mapping.

Extant administrative data are requested from State SNAP staff and E&T providers. The data specification file combines the following data: individual variables, household variables, and E&T activities and outcomes. The pre-test findings suggest the following:

  1. limit the data to participants aged 16 to 59,

  2. provide only data for months participated,

  3. add a variable for individual earnings,

  4. add variables for sanction dates, and

  5. make changes to wording.

I recommend some additional adjustments:

  1. For E&T activities, the ET_EARNINGS variable needs to state explicitly whether it contains monthly or quarterly earnings, or the values cannot be compared.

  2. It is not clear that not-for-profit or for-profit organizations will track participants past the point of payment for the specified service, which may be the start date of unsubsidized employment. In that case, earnings for the following months would be unclear.

  3. ET_EARNINGS is also unclear as to whether it represents all income earned by the individual or only income from the employment obtained via the E&T program. In some states, eligibility includes SNAP participants who have a part-time job but do not work the requisite number of hours.

  4. One tabulation you may want to do is by whether the provider is State, not-for-profit, or for-profit. While the provider name may assist in developing that information, an indicator variable would be preferable.

  5. There is no indication of total hours worked per week or hours worked per week in E&T employment. I would include these variables for perspective.

  6. I would include State as a variable.

  7. Local offices may serve clients who go to various service providers (for-profit, not-for-profit, etc.) based on geography or other considerations. My understanding is that, in that case, we may only have outcomes for a limited number of recipients.
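
As one way to capture these adjustments, the sketch below writes the recommended additions into a hypothetical record layout. Only ET_EARNINGS appears in the current data specification; every other field name is a placeholder of my own.

```python
# Sketch of a record layout reflecting the adjustments above. Only ET_EARNINGS comes
# from the actual data specification; every other field name is a hypothetical
# placeholder for a recommended addition.
from dataclasses import dataclass

@dataclass
class ETActivityRecord:
    ET_EARNINGS: float         # item 1: document whether this is monthly or quarterly
    ET_EARNINGS_PERIOD: str    # item 1: "monthly" or "quarterly", so values can be compared
    ET_EARNINGS_SCOPE: str     # item 3: all individual income vs. E&T placement income only
    PROVIDER_TYPE: str         # item 4: "state", "for_profit", or "not_for_profit" indicator
    HOURS_WORKED_TOTAL: float  # item 5: total hours worked per week
    HOURS_WORKED_ET: float     # item 5: hours worked per week in E&T employment
    STATE: str                 # item 6: State identifier
```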



The pre-test case was strictly a case of a State provider, and all data likely came from a single system and/or a single database. I recommend acquiring data from at least one not-for-profit or for-profit entity to ensure that data from alternative sources can be joined to provide a complete perspective on the client case. In addition, there may be no need to obtain data from outside sources if all variables can be obtained from the State database. If states require providers to report information to the SNAP or SNAP E&T program, there may be limited need to request these data from E&T providers.

There is limited information on the database structures. Hence, criteria should be provided to establish the subgroup of clients for which you would like to have data (e.g., mandatory SNAP E&T participants). Based on the mandated reporting for all States, States should be able to provide unsubsidized employment reporting for several quarters after completion of the service. Even if the participant is no longer receiving SNAP, there should be a record of the individual. I recommend a thorough discussion of what data are available from the State, the source of their data, and their established criteria for data transfer files or documented descriptions of what data need to be reported, prior to requesting data from providers. It may be challenging to acquire meaningful data from programs more diversified than the pilot program.
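
A short sketch of the kind of explicit subgroup criteria I have in mind follows; the column names are assumed for illustration and would need to match whatever the State database actually contains.

```python
# Illustrative only: an explicit, documented filter for the subgroup of interest
# (mandatory SNAP E&T participants aged 16 to 59, months of participation only).
# The column names are assumptions, not the States' actual database fields.
import pandas as pd

def extract_subgroup(cases: pd.DataFrame) -> pd.DataFrame:
    return cases[
        (cases["et_mandatory"] == 1)             # mandatory (non-exempt) E&T participants
        & cases["age"].between(16, 59)           # pre-test recommendation: ages 16 to 59
        & (cases["et_participation_flag"] == 1)  # only months of actual E&T participation
    ]
```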

VI. Maximizing Response Rates

Methods to maximize response rates are addressed in Part B of the document. The methods include advance letters, instructions, and follow-up. A 100% response is expected at the State and local level. Onsite interviews at not-for-profit and for-profit providers have an expected response rate of 67%. In addition, the study materials indicate that, if a State does not have the capacity or is unwilling to cooperate when advance materials are distributed, a complementary State will be chosen.
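
For perspective, a rough arithmetic check of what these expectations imply, assuming the 67% rate applies to the 18 planned staff interviews for each of the for-profit and not-for-profit provider types:

```python
# Rough arithmetic only: planned provider-staff interviews (3 staff x 6 States per
# provider type, from the data collection plan) times the expected 67% onsite
# response rate for for-profit and not-for-profit providers.
planned_per_type = 3 * 6      # 18 interviews planned per provider type
expected_rate = 0.67
for provider_type in ("for-profit", "not-for-profit"):
    expected = round(planned_per_type * expected_rate)
    print(f"{provider_type}: about {expected} of {planned_per_type} interviews expected")
```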

Organizational structures will vary widely, and the potential to collect data efficiently may be compromised. I recommend accounting for higher levels of complexity that may hinder meeting reporting requirements and lower response rates.

VII. Statistical Methods for Summary and Analysis

There is no information on the type of results that will be reported, tabulated, or assessed. I recommend providing a short description. The stated purpose is to assess how program features and administrative practices affect participation, sanctions, and outcomes, namely stable, well-paying employment and economic self-sufficiency for mandatory participants. Program aspects could be documented and evaluated or tabulated. The analysis documentation could be brief if it will just be basic frequencies or cross tabulations. If there are plans for a more complex analysis, they should be documented.
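
If the plan is limited to basic frequencies and cross tabulations, even a brief description along the lines of the sketch below would help; the variable names here are hypothetical placeholders, not items from the data specification.

```python
# Sketch of simple documentation-level analyses: frequencies and cross tabulations.
# The variable names (provider_type, sanctioned, employed_q2) are hypothetical.
import pandas as pd

def basic_tables(cases: pd.DataFrame) -> dict[str, pd.DataFrame]:
    return {
        # How often sanctions occur, by provider type (counts with row/column totals).
        "sanctions_by_provider": pd.crosstab(
            cases["provider_type"], cases["sanctioned"], margins=True
        ),
        # Share in unsubsidized employment in the second quarter after E&T, by provider type.
        "employment_q2_by_provider": pd.crosstab(
            cases["provider_type"], cases["employed_q2"], normalize="index"
        ),
    }
```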



Appendix A

Questionnaire Review

E&T Components and Services

Local Director Survey question E.1.a. might be queried at the State Director Survey level, as providers may register at a State level (see questions C.1.b. and C.1.c. on the State Director Survey).

Training

Wording differences may or may not have been intentional. Please be aware that question G.1. on the Eligibility Worker Survey states "practices for implementing mandatory E&T policies," whereas question C.1. on the Local Director Survey states "implementing mandatory E&T policies." This may produce slightly different answers.

Compliance and Sanctions

Questions D.1.a. and D.2. on the State Director Survey seem more appropriate for the Eligibility Worker Survey.

Questions 5 and 8 on the Eligibility Worker Survey might be more applicable to the State Director Survey (nested in Questions D.6. and D.7.) and the Local Director Survey. Summary statistics may be made available to them.

On the Eligibility Worker Survey, Conciliation and Good Cause Determination appear in their own separate section, while on the Local Director Survey they are placed within the Compliance and Sanctions section. These should be made consistent.

In the State Director Survey, question 6 reads a little oddly ("does the State track of"). Perhaps a word or two is missing from this question.

In the Local Director Survey, Question G.3. might either be asked of the providers themselves or rephrased to ask what requirements or agreements are in place. Some local directors may not be willing to speak to the details of other organizations, or may not know them.
