
PART A: JUSTIFICATION FOR EVALUATION OF NHE GRANTS TO ADDRESS THE OPIOID CRISIS

OMB No.

OCTOBER 2020

Part A: Justification

The Chief Evaluation Office of the U.S. Department of Labor (DOL), in collaboration with the Employment and Training Administration, has commissioned an evaluation of the National Health Emergency (NHE) Dislocated Worker Demonstration Grants to Address the Opioid Crisis. These grants enable states to test innovative approaches to address the economic and workforce-related impacts of the opioid epidemic. The evaluation of the NHE Grants to Address the Opioid Crisis offers a unique opportunity to build knowledge about the implementation of these approaches, identify perceived challenges and promising practices, and share information with grantees and other stakeholders as they seek to address the opioid crisis. Mathematica Policy Research and its subcontractor Social Policy Research Associates have been contracted to conduct an implementation evaluation. As required by the Paperwork Reduction Act (PRA), this package requests clearance for the following data collection tools as part of the implementation evaluation:


  1. Administrator and staff interview protocol

  2. Interview respondent information form

  3. Program participant focus group protocol

  4. Participant focus group information form

1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.

The widely publicized impacts of the opioid crisis are so staggering that the Secretary of Health and Human Services declared the crisis a national public health emergency in October 2017. The crisis is affecting local community service and workforce programs that are increasingly forced to address opioid issues in their service delivery. As part of the response to this crisis, in July 2018, DOL awarded an initial set of NHE grants totaling $22 million to six states (Alaska, Maryland, New Hampshire, Pennsylvania, Rhode Island, and Washington). The purpose of these two-year grants is to support state implementation of approaches including services for people or family members affected by opioid addiction, training for workers to address the crisis, and system-wide investments to align workforce services with services provided by other organizations in the community.


A letter to states announcing the availability of the NHE grants described the grants as, “DOL’s first phase of funding opportunities meant to counter the employment impacts of the opioid crisis and encourage training opportunities for skilled professions positioned to impact the underlying causes of the crisis.” The letter encouraged states to consider “innovative approaches” and to “creatively align and deliver career, training, and supportive services to best serve the affected individuals.” In addition, grantees were required to identify partners that would ensure the goals of their projects would be met. Potential partners included at least one local workforce development board or American Job Center; employers or industry organizations; community health providers; justice or law enforcement organizations; community-based organizations; or educational institutions.


Understanding the implementation and context of the NHE grantees’ programs is essential for identifying promising practices that contribute to the body of evidence on workforce strategies to address the opioid crisis, an area for which there is very limited research to date (see the discussion of the literature review below). Collecting qualitative data from grantees and participants will facilitate an understanding of each grant’s context; structure, management, and partnerships; planned and implemented activities; program participants and experiences; and perceived challenges and successes, as well as implementation factors that lead to similarities and differences across the grantees. This exploratory study is examining grant implementation by NHE grantees and subgrantees, and will contribute to the ongoing dialogue of a field in its nascent stages.

As part of the evaluation, the contractor conducted a literature review to summarize evidence on topics related to the intersection of employment and the opioid crisis. The contractor performed an extensive examination of existing research on employment services for people with opioid use disorder, employer practices for addressing opioid use disorder, and considerations for developing the health care workforce to address the opioid crisis (Vine et al. 2020). A key conclusion from this review is that the existing literature is limited in size and scope, and there are critical gaps in the current knowledge base. Most significantly, very few studies examine employment interventions for people with opioid use disorder. Another important consideration is that none of the existing studies examined interventions implemented within the public workforce system in the U.S.

The data collected using the questions in this ICR will thus be the first effort to identify in depth how state workforce agencies and their local subgrantees used funds under a discretionary grant (i.e., the NHE dislocated worker demonstration grants) to provide services, create partnerships as required under the grants, develop new service models, and keep track of service levels and results. Although this implementation study will not examine the effectiveness of the various interventions, it will document types of service provision and partnership development, as well as challenges and lessons learned from these experiences.

Citation of sections of laws that justify this information collection: The NHE grant program and subsequent evaluation are authorized by Title 29 of the American Competitiveness and Workforce Improvement Act, which states that “the Secretary of Labor shall . . . award grants to eligible entities to provide job training and related activities for workers to assist them in obtaining or upgrading employment in industries and economic sectors . . . projected to experience significant growth and ensure that job training and related activities funded by such grants are coordinated with the public workforce investment system (29 USC 3224(a)).”

This package requests clearance for four additional data collection activities that need to start in August 2020 as part of the implementation evaluation. Given that the NHE grants address opioid use disorder and employment, which are priorities for DOL, a timely start to the information collection is critical for providing DOL near real-time information about how the grants were implemented, as well as if and how they were adapted during the COVID-19 pandemic.

2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.

The data collected through the activities summarized in this request will be used by DOL to comprehensively describe implementation of the NHE grant program, including grantees’ partnerships, training and support services provided, target population, and common implementation successes and challenges. These data will provide DOL and other policymakers with important information to guide management decisions, support future planning efforts regarding such grant programs, and share information on the implementation of various approaches to delivering workforce services to individuals and communities affected by opioid addiction.

1. Overview of the evaluation

The evaluation of the NHE grants will take place over three years (2018 to 2021) and will address the following research questions:

  1. How were the grants implemented and what perceived factors influenced implementation and participant outcomes?

  2. Who were the major partners involved and what services did they provide?

  3. What theory of change and evidence on intervention, recovery, and/or program services, if any, did grantees use to inform program services?

  4. What perceived challenges did grantees encounter in implementation and how were those addressed?

To answer these research questions, the implementation evaluation includes a review of grant documents from all six grantees, interviews with state and local program administrators and staff, and participant focus groups during site visits to all six grantees. During multiday visits, the site visit team will collect information about each grantee’s state and community context, strategies and approach to service delivery, target populations and recruiting, the nature of partners’ involvement in grant activities, perceived successes and challenges, and other topics.

2. Overview of the data collection

Understanding the implementation of the NHE grants requires data collection from multiple sources. The implementation evaluation data collection instruments in this clearance request include the protocols that will be used to conduct in-person interviews and focus groups during site visits to all six grantees. These in-person interviews will be semi-structured and tailored to the circumstances of each grantee. Over the course of a multiday site visit, the site visit team will spend time at the state level, interviewing the state grantee and state-level partners, and at select local areas participating in the grant. For grantees providing services in many locations, we will work with the identified grantee liaison to identify local areas in which to conduct the field work. To the extent possible, we will work with the liaison to ensure geographic diversity in the local areas visited.

This package seeks clearance for interview protocols and information forms for two types of respondents: state and local program administrators, staff, and partners; and small groups of current program participants.

  1. State and local administrator, staff, and partner interview protocol. This protocol will be used to conduct in-person interviews with state administrators, state grant directors, state partners, local subgrant directors, frontline staff, employers, and local partners. This protocol will cover the rationale behind grant plans and activities, the context of the state’s opioid crisis and other initiatives addressing it, and implementation progress, successes, and challenges. In-person interviews are expected to take an average of 60 minutes, depending on respondent type.

  2. Interview respondent information form. This form will be sent by email to administrators and staff who will be interviewed during in-person site visits. In cases where respondents do not complete the form in advance, it will be administered before the start of each administrator and staff interview. The respondent will complete a short questionnaire that includes basic background information, such as their highest education level and experience. The form is expected to take five minutes to complete.

  3. Participant focus group protocol. This protocol will be used to conduct focus groups with up to 10 participants at each visited site. This protocol will gather data on their backgrounds, employment experience, motivation for participation, and description and assessment of training received. These groups will be conducted in person and are expected to take approximately 60 minutes to complete.

  4. Participant focus group information form. This form will be distributed to focus group participants for completion at the beginning of each focus group. The information form will collect details on how participants were referred to the program, how long they have been involved, and their previous work experience. The form is expected to take approximately five minutes to complete.

Proposed uses for each data collection activity are described in Table A.1.

Table A.1. How data will be used, by data collection activity

Data collection activity | How the data will be used
1. State and local administrator, staff, and partner interview | Data from in-person interviews with NHE state administrators and staff, local subgrant administrators and staff, and partner staff will be used to describe state and community context, strategies and approach to service delivery, theory of change and evidence base for intervention design, target populations and recruiting, the nature of partners’ involvement in grant activities, successes and challenges, and other topics.
2. Interview respondent information forms | Data will be used to summarize basic background information about respondents.
3. Participant focus groups | Data will be used to describe participant backgrounds, program experiences, and expectations.
4. Participant focus group information form | Data will be used to summarize characteristics of the individuals participating in the focus groups.



3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.

The evaluation team will primarily use email to help facilitate the logistics and scheduling of the site visits and interviews to reduce the burden on participants. Site visitors for the evaluation of the NHE grants will use electronic audio recorders to record the semi-structured interviews. This will allow the visitors to conduct interviews in the shortest amount of time possible, as they will not be required to use interview time to take notes on the content of the conversation. There will be no other information technology used by site visitors.

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

The evaluation of the NHE grants will not require collection of information that is available through alternate sources. For example, the evaluation will use available information from grantee applications to ensure that data collected through interviews and focus groups are not available elsewhere.

5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

Interviews could be conducted with employers or program stakeholders from small businesses or other small entities, if they are participating as subgrantees or partners in the NHE grants. We will only request information required for the intended use and minimize burden by restricting the length of interviews to the minimum required time.

6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

If the interviews are not conducted, DOL and other stakeholders will not have the information necessary to answer the evaluation’s key research questions. Without collecting the information specified in the site visit interviews, a comprehensive implementation analysis of the NHE grants could not occur. This would prevent information being provided to policymakers about the context in which the partnerships and programs operated, any operational challenges faced by grantees and partners, how the partnerships and services evolved over time, whether the approaches were perceived to be effective and implications for interpreting results, or implications for program improvement based on evidence obtained through the evaluation.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner inconsistent with the guidelines in 5 CFR 1320.5.

No special circumstances apply to this data collection.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.

A 60-day notice soliciting public comments was published in the Federal Register (83 FR 54943) on December 26, 2018. One comment was received; it suggested that the funding should be directed elsewhere. The Chief Evaluation Office (CEO) acknowledged the comment in its response.

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.

Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.



Mathematica and Social Policy Research Associates did not consult with any outside individuals on the design of this evaluation.

9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

There are no payments or gifts to program and partner staff, as activities are expected to be carried out in the course of their employment, and no additional compensation will be provided outside of their normal pay. Respondents participating in the participant focus groups will receive a $40 gift card.

10. Describe any assurance of privacy provided to respondents and the basis for the assurance in statute, regulation, or agency policy.

Information collected will be kept private to the extent permitted by law. The evaluation team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data. Further, the evaluation is being conducted in accordance with all relevant regulations and requirements.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.



There are no sensitive questions included in the interview protocols for state and local administrators, staff, and partners. Participant focus groups will collect demographic data such as gender, employment status, education level, and sources of financial support. In addition, participant focus groups will collect information on how the opioid epidemic has affected participants’ communities, the availability of jobs in participants’ communities, and participant backgrounds and reasons for seeking services through the NHE grants, which could be considered sensitive. These questions are important to understanding the context in which the grants are being implemented and participants that are being reached, as well as the participants’ experiences with the grants. We will inform participants that their answers to these questions will be kept private, and that they do not need to answer any questions that they do not feel comfortable with. In addition, we will convey to participants that they may share the answers to these questions in written form, or discuss them with the research team individually after the focus group, if they prefer.

12. Provide estimates of the hour burden of the collection of information

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”



Table A.3 provides annual burden estimates for each of the data collection activities for which this package requests clearance. All of the activities covered by this request will take place over a three-year period. To calculate the estimated cost burden for respondents, average hourly wages from the U.S. Bureau of Labor Statistics National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates for May 2019 were multiplied by the number of hours per respondent type. The following summarizes the annual burden estimates for each of the four data collection activities:

Semi-structured program stakeholder interviews (in-person): State and local administrators, staff, and partners. State, local, and partner staff interviews will be conducted in person at all six grantee sites. Each visit will, on average, involve individual or group interviews with 30 respondents (e.g., 5 state-level staff, 5 state-level partner agency staff, 4 local administrators, 8 frontline staff, and 8 partners). The team assumes the average burden time per response to be 1.5 hours, although some meetings will be shorter and some will be longer.

Interview respondent information forms for state and local administrators, staff and partners. These forms will be completed by state, local and partner staff in advance of the site visit interviews and will take 5 minutes to complete.

Participant focus groups. Focus groups with a subset of participants will take place during in-person site visits. Each focus group will take 60 minutes to complete. Ten participants are expected at each of the six sites visited, for a total of 60 respondents (10 participants × 6 grantees).

Participant focus group information form. Forms will be administered to participants at the start of each focus group. Each form will take 5 minutes to complete. Ten participants are expected at each of the six sites visited, for a total of 60 respondents (10 participants × 6 grantees).



Table A.3. Estimated Annualized Respondent Hour and Cost Burden

Data collection activity | Annual number of respondents | Number of responses per respondent | Total number of responses | Average burden per response (in hours) | Annual estimated burden hours | Average hourly wage^a | Annual monetized burden hours
Semi-structured program stakeholder interviews (in-person): State and local administrators, staff, and partners^b | 60 | 1 | 60 | 1.5 | 90 | $45.87 | $4,128
Respondent information form: State and local administrators, staff, and partners | 60 | 1 | 60 | 0.083 | 5 | $45.87 | $229.35
Participant focus groups | 20 | 1 | 20 | 1 | 20 | $19.17 | $383.40
Participant focus group information form | 20 | 1 | 20 | 0.083 | 1.67 | $19.17 | $32
Unduplicated total | 80^c | -- | 160 | -- | 117 | -- | $4,772

a The hourly wage of $45.87 is the May 2019 median wage for Education Administrators, Postsecondary (see http://www.bls.gov/oes/current/oes_nat.htm); $19.17 is the May 2018 median wage across all occupations in the United States.

b Assumes each visit will, on average, involve individual or group interviews with 30 respondents (e.g., 5 state-level staff, 5 state-level partner agency staff, 4 local administrators, 8 frontline staff, and 8 partners). The team assumes the average burden time per response to be 1.5 hours, although some meetings will be shorter and some will be longer.

c Participants in stakeholder interviews and focus groups will also be the respondents to the information forms.
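
The figures in Table A.3 follow from simple arithmetic: annual respondents × responses per respondent × burden per response gives the annual burden hours, which are multiplied by the hourly wage to produce the monetized burden. As an illustration only (not part of the data collection), the short sketch below recomputes the table values; small differences from the published figures reflect rounding of the 0.083-hour (5-minute) entries.

```python
# Illustrative recomputation of the Table A.3 burden arithmetic.
# Values are taken directly from the table; small differences from the
# published figures reflect rounding of the 0.083-hour (5-minute) entries.
rows = [
    # (activity, annual respondents, responses per respondent, hours per response, hourly wage)
    ("Stakeholder interviews",        60, 1, 1.5,   45.87),
    ("Respondent information form",   60, 1, 0.083, 45.87),
    ("Participant focus groups",      20, 1, 1.0,   19.17),
    ("Focus group information form",  20, 1, 0.083, 19.17),
]

total_hours = total_cost = 0.0
for activity, respondents, responses, hours_each, wage in rows:
    hours = respondents * responses * hours_each     # annual estimated burden hours
    cost = hours * wage                              # annual monetized burden (dollars)
    total_hours += hours
    total_cost += cost
    print(f"{activity}: {hours:.2f} hours, ${cost:,.2f}")

print(f"Total: {total_hours:.2f} hours, ${total_cost:,.2f}")
# Totals come to roughly 117 hours and $4,772 per year, matching Table A.3.
```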


13. Estimate of cost burden to respondents

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collections services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.

There are no direct costs to respondents other than their time.

14. Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.



The total cost to the Federal government over three years is $174,316. The total annualized cost to the federal government is $58,105.33.¹ Costs result from the following categories:

The estimated cost to the federal government for the contractor to carry out the site visit interviews is $118,618. Annualized over the three-year period, this comes to $39,539.33:

$118,618 ÷ 3 years = $39,539.33

The annual cost borne by DOL for federal technical staff to oversee the contract is estimated to be $18,566. We expect the annual level of effort to perform these duties will require 200 hours for one federal GS-14, step 2 employee based in Washington, D.C., earning $58.02 per hour (see the Office of Personnel Management 2019 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2019/DCB_h.pdf). To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6:

200 hours × $58.02 × 1.6 = $18,566.40

Thus the total annualized federal cost is $39,539.33 + $18,566.40 = $58,105.73.
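
For reference, the brief sketch below restates the cost arithmetic described above using only the figures already given in this section (contractor cost of $118,618 annualized over three years, plus 200 hours of federal oversight at $58.02 per hour with a 1.6 overhead factor).

```python
# Restatement of the annualized federal cost calculation described above,
# using only the figures given in this section.
contractor_total = 118_618                 # contractor cost over the three-year evaluation
contractor_annual = contractor_total / 3   # ≈ $39,539.33

oversight_hours = 200      # annual hours of federal technical staff oversight
hourly_rate = 58.02        # GS-14, step 2, Washington, D.C. (2019 hourly table)
overhead_factor = 1.6      # fringe benefits and other overhead
oversight_annual = oversight_hours * hourly_rate * overhead_factor   # $18,566.40

total_annual = contractor_annual + oversight_annual
print(f"Annualized contractor cost:    ${contractor_annual:,.2f}")
print(f"Annualized oversight cost:     ${oversight_annual:,.2f}")
print(f"Total annualized federal cost: ${total_annual:,.2f}")        # ≈ $58,105.73
```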

15. Explain the reasons for any program changes or adjustments reported on the burden worksheet.

This is a new information collection.

16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.

1. Analysis plan

Analysis of interview and focus group data will involve coding and triangulating across data sources. The evaluation team will begin by writing up detailed field notes from in-person interviews and focus groups in a structured format. To code the qualitative data for key themes and topics, a coding scheme will be developed and organized according to key research questions and topics. The evaluation team will then code the data using qualitative analysis software. To ensure reliability across team staff, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies. These data will be used to develop a thorough understanding of each grantee including the grant’s context; structure, management, and partnerships; planned and implemented activities; program participants and experiences; and challenges and successes.
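
As a hypothetical illustration of the reliability check described above (the contractor’s actual coding scheme and qualitative analysis software are not specified here), the sketch below computes simple percent agreement between two coders’ theme assignments on an initial set of write-ups; documents where codes differ would be flagged for discussion so the team can refine code definitions before full coding.

```python
# Hypothetical illustration of an intercoder reliability check on an initial
# set of coded documents; the documents and themes shown are examples only.
coder_a = {"site_visit_1": "partnerships", "site_visit_2": "recruitment",
           "site_visit_3": "challenges",   "site_visit_4": "services"}
coder_b = {"site_visit_1": "partnerships", "site_visit_2": "services",
           "site_visit_3": "challenges",   "site_visit_4": "services"}

shared = coder_a.keys() & coder_b.keys()
agreements = sum(coder_a[doc] == coder_b[doc] for doc in shared)
disagreements = [doc for doc in shared if coder_a[doc] != coder_b[doc]]

print(f"Percent agreement: {agreements / len(shared):.0%}")   # 75% in this example
print(f"Documents to reconcile: {disagreements}")             # ['site_visit_2']
```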

2. Publications

In 2021, we will produce a report on the implementation evaluation as well as other dissemination products such as fact sheets and issue briefs on topics of interest to DOL.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.

The OMB approval number and expiration date will be displayed or cited on all forms completed as part of the data collection.

18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”

No exceptions are necessary for this information collection.

References

Vine, Michaela, Colleen Staatz, Crystal Blyler, and Jillian Berk. “The Role of the Public Workforce System and Employers in Addressing the Opioid Crisis: A Review of the Literature.” Cambridge, MA: Mathematica Policy Research, February 2020.



¹ The annualized cost to the federal government for the contractor includes the cost of the $40 gift cards paid to focus group participants.
