Evaluation of Programs Supporting the Mental Health of the Health Professions Workforce
The Health Resources and Services Administration (HRSA) is requesting OMB approval for a new information collection request to evaluate three provider resiliency programs administered by the HRSA Bureau of Health Workforce (BHW). The Public Health Service Act and the American Rescue Plan Act of 2021 authorized the three programs, which are as follows: 1) the Health and Public Safety Workforce Resiliency Training Program (the Training Program); 2) the Promoting Resilience and Mental Health among Health Professional Workforce program (the Workforce Program); and 3) the Health and Public Safety Workforce Resiliency Technical Assistance Center (the Technical Assistance Center, or TAC).
Each program has a unique set of objectives to increase provider resiliency:
The Training Program funds evidence-based provider wellness training activities and aims to increase knowledge of these activities throughout the health workforce.
The Workforce Program supports entities that provide health care by funding programs or protocols aimed at creating a culture of wellness within the entities.
The TAC assists the Training Program and the Workforce Program awardees in deploying evidence-based resilience strategies within their respective populations and works with the 10 HRSA Regional Public Health Training Centers (PHTCs) to develop and advance a framework to reduce burnout.
Each program has unique design features that address provider resiliency to meet its respective objectives over a three-year period (January 2022 to December 2024):
HRSA awarded $68.2 million for the Training Program and made awards to 34 health professional schools, academic health centers, and State or local governments. Awardees conduct training activities using evidence-based strategies focused on reducing burnout and promoting resiliency among the health workforce in rural and medically underserved communities.
HRSA awarded $28.6 million for the Workforce Program and made awards to 10 healthcare-providing entities, healthcare provider organizations, and Federally Qualified Health Centers. The main activities are to establish and expand organizations’ evidence-based programs or protocols that foster resilience and wellness among the health workforce in rural and medically underserved communities.
For the TAC, HRSA awarded $5.9 million to the George Washington University (GWU) Fitzhugh Mullan Institute for Health Workforce Equity at the Milken Institute School of Public Health. The TAC’s main activities are to provide tailored training and technical assistance (TA) to HRSA’s health workforce resiliency award recipients and to expand infrastructure to implement evidence-based strategies that promote resilience and wellness in rural and medically underserved communities.
The planned evaluation will assess the three programs with respect to their shared goal to promote resiliency and wellness in the health workforce. This evaluation will consider each program individually, employing an overlapping set of questions and a shared set of methods to achieve the following goals:
Develop and implement an evaluation to assess programs’ efforts to promote resiliency and mental health in the health workforce.
Develop and implement a robust evaluation methodology that measures program outcomes.
Develop and direct data collection efforts over a four-year period (2022-2026) to allow the evaluation to inform BHW leadership on an ongoing basis about program progress and to make recommendations for continuous process improvement. Data measures and collection efforts will align with parallel efforts across HRSA.
Develop recommendations and provide actionable strategies and/or methodologies that HRSA can use to inform future programming and future investment strategies.
Data collection efforts will inform BHW leadership about the progress, costs and benefits, and impact of the three programs to strengthen and support resiliency of the health care workforce in the United States.
The purpose of the planned primary data collection activities is to understand program outcomes in ways not otherwise captured by administrative sources. The planned data collection will enable a uniquely comprehensive assessment of these important HRSA-funded programs to promote resiliency and mental health in the health workforce. The information collected will enable BHW to address evaluation questions including, but not limited to, the following:
The Training Program
What are the perceived changes in outcomes before and after activities, trainings, and/or services? Key outcomes include burnout, resiliency, work environment, support needs, and mental health.
What are the best practices, innovations, challenges, and lessons learned in implementing the program?
What are the overall costs and benefits of the program?
The Workforce Program
What are the perceived changes in outcomes before and after the activities, trainings, and/or services? Key outcomes include burnout, resiliency, work environment, support needs, and mental health.
What are the best practices, innovations, challenges, and lessons learned in implementing the program?
What are the overall costs and benefits of the program?
How did awardees make progress toward organizational change?
The Technical Assistance Center
How effective was the Workplace Change Collaborative (WCC) at providing support?
What are the best practices, innovations, challenges, and lessons learned in implementing technical assistance to the Training Program and Workforce Program?
Data will be collected from awardees as well as individuals in each awardee’s target population. This evaluation will gather both quantitative and qualitative data, using instruments administered twice for each award program over the four-year evaluation period, as well as a one-time survey for a comparison group. (While the Cost Workbook will be administered twice, awardees will be asked to complete two Cost Workbooks during the first administration: one for 2022 and one for 2023.)
Data collection efforts are critical to understanding program outcomes and will inform BHW leadership on program progress as well as on timely corrective program actions. The evaluation has been designed to minimize the potential burden on respondents by tailoring the primary data collection instruments for efficiency and by leveraging secondary administrative data (for example, awardee abstracts and applications, awardee performance measures, and awardee progress and final reports) to obtain characteristics of individuals in the programs’ target population as well as awardee characteristics, activities, and implementation progress. To achieve its aims, the evaluation will implement the following data collection efforts:
For the Healthcare Workforce Survey, questions will assess perceived outcomes associated with award-funded trainings/activities (such as burnout, resiliency, absenteeism, and intent to leave profession and employment setting) as well as other related factors (such as perceptions of mental health, whether respondents found trainings/activities to be helpful, and reasons for burnout). The Healthcare Workforce Survey will also assess perceptions about organizational culture and whether respondents feel prepared for another infectious disease outbreak like COVID-19.
The Awardee Training and Services Report is an Excel-based tool that will be used to obtain a current and complete list of activities and key descriptive information for each awardee. Each report will include pre-populated information to minimize burden on awardees, who will confirm, revise, or add details as needed.
The Training Program Comparison Group Survey is a web-based survey to assess key outcomes among those in the health workforce who did not have access to Health and Public Safety Workforce Resiliency Training Program-funded activities. Two third-party vendors will provide the healthcare workforce sample. Eligibility for the survey will be assessed using a brief web-based Screener, whose purpose is to identify respondents with characteristics similar to those of the Health and Public Safety Workforce Resiliency Training Program target population.
For the Awardee Survey about the TAC, survey questions will ask whether awardees were satisfied with the TA that was provided, whether TA was perceived to be effective, and whether TA achieved desired outcomes.
For the Awardee Interview, topics will include effectiveness of models/programs, changes in design, barriers/challenges during implementation, program innovation, implementation facilitators, impact of the program, effectiveness of the interventions, role of the COVID-19 pandemic, and program sustainability.
For the Organizational Assessment Interviews, topics will include leadership through organizational change, shared vision of organizational wellness and resilience, embedding equity, partner support, setting the stage for change, current organizational culture, role of the COVID-19 pandemic, and change strategies.
The Awardee Cost Workbook is an Excel-based tool to conduct a cost-benefit analysis. It will be pre-populated with existing data for awardee managers to verify and update as needed. Awardees for the Workforce Program are expected to have lower response burden because they must report staff turnover through annual reporting, while awardees for the Training Program do not have the same reporting requirement.
The Healthcare Workforce Fielding Tracker is an Excel-based tool to assess how each awardee distributes the Healthcare Workforce Survey. The tool will also gather aggregated demographic information on the total target population, required for a non-response bias analysis.
Evaluation data collection is designed to minimize survey respondent burden by using web-based technology. The Healthcare Workforce, Comparison Group, and Awardee Surveys will be programmed in Voxco and/or Qualtrics; both survey platforms are certified by the Federal Risk and Authorization Management Program (FedRAMP). The surveys can be completed using a smart phone, tablet, or computer.
The web-based Voxco and Qualtrics survey platforms will minimize burden by reducing the survey length using skip patterns, using previous responses to pre-populate later questions, and pre-populating fields when feasible. This evaluation will also conduct extensive quality control testing to ensure the skip patterns and logic allow for maximum efficiency. Awardee survey respondents may stop the survey and return later, rather than start over, allowing respondents to tailor the time needed to complete the survey into increments of their choosing.
For the Awardee Survey about the TAC, respondents will have the option of a paper self-administered questionnaire (SAQ). Respondents will be able to submit the completed paper survey by fax, saving approximately 15 minutes compared with manually entering completed responses into the web survey.
As noted earlier, this evaluation will minimize respondent burden while ensuring that survey objectives are met by leveraging secondary administrative data (for example, awardee applications, awardee progress and final reports, and awardee performance measures) to obtain characteristics of the awardees’ target population as well as awardee characteristics, activities, and implementation progress. Many awardees also survey their target populations using validated measures of burnout, resiliency, mental health, and other constructs. However, this evaluation provides BHW with a unique, consistent data set collected at the same time, using standardized data collection protocols and the same measures of the same constructs, across individuals in the awardees’ healthcare workforce. Consistent data collection across awardees will provide BHW with the data needed to assess the impact and implementation of the three programs.
The evaluation will leverage secondary data across data collection activities in a manner that avoids duplication and informs analysis.
Quantitative Descriptive Characteristics. This evaluation will use available administrative data, such as awardee applications and annual reports, to characterize the interventions under evaluation. Progress reports contain information on target population demographic characteristics, discipline, and participation in trainings (at the training level for the Training Program and the awardee level for the Workforce Program), site characteristics and location, training (curriculum) development and enhancement activities, and faculty development activities. This evaluation will also use this data to describe the types and frequency of trainings, as well as site and target population characteristics. Names and descriptions of grant-funded activities from progress reports will also be used to pre-populate the Awardee Training and Services Report. Awardees will be able to verify and clarify this data, which will be used to ensure that the primary data collection instruments are tailored to reduce respondent burden.
Structured Qualitative Assessment. This evaluation will use a combination of inductive and deductive coding to conduct a thematic analysis as part of the implementation assessment component of this evaluation. This will include a review of awardee applications, progress reports, and final reports. Awardee applications contain information related to the timeline of activities and types of evidence-based activities utilized by awardees. Progress reports will provide data on activities, challenges, and successes, which the evaluation will use to inform awardee interviews. Final reports will provide summative information on best practices, lessons learned, implementation challenges and successes, and qualitative costs and benefits, offering context for evaluation results.
Awardee Cost Workbook. For the Awardee Cost Workbook, the National Opinion Research Center (NORC) will abstract data from Progress and Performance Reports, current budgets and budget justifications, and required federal assistance application standard forms (such as the SF-424 and SF-425) to prefill fields where possible and minimize the burden on awardees. In addition, the evaluation will abstract the total funding amount from applications or other reporting to HRSA. From these sources, the evaluation will obtain information including costs of personnel, contracted services, facilities, supplies and materials, overhead, administration, and others. In addition, the evaluation will abstract staff turnover data for the Workforce Program awardees from their annual performance reporting (Faculty Development FD5 form) to pre-populate that section of the Workbook. Abstracted data will be entered into the Awardee Cost Workbook, which will be shared with awardee managers to verify and update as needed. Verification is needed because costs may have evolved as new data became available or as awardees’ understanding of what is being asked developed.
Quantitative Outcomes Assessment. For the Training Program and the Workforce Program awardees, the evaluation will use public data (for example, HRSA’s Area Health Resources File and the U.S. Centers for Disease Control and Prevention (CDC)’s Social Vulnerability Index) to characterize the communities where the programs’ target population and employees work. This data will be used to help illustrate the distribution of the awards in rural and underserved communities and to characterize the communities in terms of demographic and socioeconomic measures of advantage and vulnerability. This characterization will allow the evaluation to account for differences between the treatment and comparison groups.
While the targeted population includes health care providers like physicians (that is, small businesses), we include only items that provide critical information for conducting the evaluation, and the requested information is the minimum required for the intended use of the data. For example, each Awardee Training and Services Report will include information pre-populated from awardee applications and annual reporting to minimize burden on awardees while allowing awardees to confirm, revise, or add details, as needed. Furthermore, data from the awardee completed Awardee Training and Services Report will be used to pre-populate the Healthcare Workforce Survey with relevant award activities and to generate skip patterns to minimize respondent burden. In addition, the Healthcare Workforce Survey’s skip patterns will ensure respondents are only asked questions that are relevant to their experience.
The planned frequency of data collection is necessary to assess program adoption and effectiveness accurately and completely. The evaluation’s approach requires that all data collection instruments be administered twice (with the exception of the Comparison Group Survey) to account for respondents who may leave their organizations early in the award period, for example, due to staff turnover or students graduating from programs. Two rounds of data collection also shorten the amount of time that passes after participants have completed activities, which will reduce recall bias, and for the awardee and organizational interviews, the extra round of data collection will allow for assessment of program implementation at various phases across the award period.
This request fully complies with the information collection guidelines of 5 CFR 1320.5.
Federal Register Notice and Comments
A 60-day Federal Register Notice was published in the Federal Register on May 5, 2023, vol. 88, No. 87; pp. 29137-38. There were no public comments. A 30-day Federal Register Notice was published in the Federal Register on July 24, 2023, vol. 88, No. 140; pp. 47511-13. There were no public comments.
Consultation with Experts Outside of the Evaluation
During questionnaire development, HRSA sought input on item content and wording from two subject matter experts. There were no major problems that could not be resolved during consultation.
Respondents for the Healthcare Workforce Survey and the Awardee Survey about the TAC will not receive any payments or gifts.
One evaluation goal is to measure the intent of the healthcare workforce to leave their primary discipline/profession and employment setting, relative to a comparison group. Identifying a valid comparison group will be challenging but critical for providing HRSA with actionable findings related to intent to leave. The evaluation proposes identifying the Training Program Comparison Group sample by purchasing access to two established panels, AmeriSpeak and Survey Healthcare Group (SHG), to ensure an adequate sample for analysis. Without these panels, it would be difficult to identify an appropriate sample frame, especially with multiple respondent types (Hutchinson and Sutherland, 2019). Purchasing access to the panels requires the use of post-paid incentives for survey completion (Exhibit 1), and the cost of such incentives is included in the purchase price. Appropriate incentives play a key role in ensuring access to applicable respondents. The incentive amount will depend on the panel, the target audience, and the time and effort required for survey participation. Panelist incentive levels are typically outside the control of the purchaser.
HRSA will not handle the incentives, distribute them, or determine their value. The panel vendors that conduct recruitment set the parameters, including the amount of compensation to panelists for their participation in research studies. Incentive rates reflect the difficulty of reaching a particular audience and the time and effort of survey response. Incentive levels are designed to cover more than the time involved in the survey itself: panelists undergo lengthy screening processes to join the panel and must maintain their member profiles to ensure current personal information, which takes time away from alternative activities that may be important to them. Exhibit 1 details respondent incentive costs per panel.
Exhibit 1. Incentives for the Comparison Group Survey Panel Respondents
AmeriSpeak Panel | Incentive Amount | Incentive Type
All healthcare workforce respondents | $3* | gift card or cash
Survey Healthcare Group (by profession) | Incentive Amount | Incentive Type
Behavioral Health workforce respondents | $55 | gift card or cash
All other healthcare workforce respondents | $45 | gift card or cash
*3,000 AmeriSpeak points = $3
Per AmeriSpeak standard protocol, panel members will receive the incentive as survey choice “points” to redeem for prizes commonly provided to survey panel respondents who complete online surveys. Respondents redeem these points for cash, Amazon gift codes, virtual Mastercard currency, or physical goods, by using the AmeriSpeak Panel member web portal or by calling the AmeriSpeak support toll-free telephone number.
The respondent incentive plan for the Training Program Comparison Group Survey is consistent with AmeriSpeak’s best practices to assure an optimal survey cooperation rate. AmeriSpeak routinely offers points to keep respondents engaged and motivated and to obtain maximum survey participation. The use of a points incentive with panelists is positively associated with response rates and helps to build trust (Dillman et al., 2014).
Respondents will receive information about privacy protection throughout the recruitment process, through emailed invitations and reminders explaining data collection requests. Such information will also be provided when respondents consent to survey and interview participation in the Healthcare Workforce Survey, the Training Program Comparison Group Survey, the Awardee Survey about the TAC, the Awardee Interview, the Organizational Assessment Interviews, and/or the Awardee Cost Workbook. Privacy concerns will be addressed first through emails introducing the programs’ target population and awardees to the survey and then through any follow-up contacts (see Attachments 1B-J). Emailed survey invitations and reminders will include a link to the respective survey’s Frequently Asked Questions (FAQs; Attachment 1I) explaining how HRSA protects respondent information and uses the data collected through the surveys. Finally, each survey contains carefully worded consent statements explaining in simple, direct language the steps that HRSA will take to protect the privacy of the information provided. Each data collection instrument provides clear information on how the confidentiality of the information will be maintained.
Safeguards for the security of data include protection of computer files against access by unauthorized individuals and groups through a multi-tiered approach of access control and monitoring, data encryption during transmission, and continuous upgrade of plans and policies.
The Healthcare Workforce Survey includes items related to mental health, burnout, and other sensitive topics. Because the focus of this evaluation is to assess programs that promote resiliency and mental health among healthcare workers, it is important to ask questions that assess changes in mental health and experiences of burnout. During the consent process, respondents will be told that their decision to complete the survey is voluntary, that they can stop at any time, and that they do not have to answer any questions they do not want to answer. In addition, mental health resources will be made available at the end of the survey tool (Attachment 1). All the data collected from the Healthcare Workforce Survey, including the sensitive questions, will be used to help answer the evaluation questions and assess perceived outcomes associated with award-funded trainings/activities (e.g., burnout, resiliency, absenteeism, and intent to leave profession and employment setting), as well as other related factors (e.g., perceptions of mental health, whether respondents found trainings/activities to be helpful, and reasons for burnout). HRSA will use the data to develop recommendations and offer actionable approaches/methodologies that programs can implement to inform future program incentives and investment strategies.
The following sensitive questions in the Healthcare Workforce Survey relate to mental health and feelings of burnout:
19. *Thinking about how you feel now, compared to before you participated in these trainings/activities/services/other initiatives, how would you rate each of the following?
[For the comparison group and those who indicated that they were not aware of/did not participate in activities, this question will be worded, “Thinking about how you feel now, compared to a year ago, how would you rate each of the following?”]
Response options (one per row): Much better now | A little better now | About the same now | A little worse now | Much worse now
My feelings of burnout* at work are....
My resiliency** is....
My ability to manage my work-related stress is....
The flexibility I have at work is....
My workload is...
My organization’s efforts to address staff burnout are....
The stigma about mental health at work is....
The resources my workplace provides to manage my mental health, stress and burnout are....
My organization’s culture with regards to workplace well-being and burnout is…
[For students, the following items will be shown:]
My feelings of burnout* in my <autopopulate #3 answer> program are...
My resiliency** is....
My ability to manage my school-related stress is....
*Please use this definition of burnout when responding: “Burnout is a type of stress that can last a long time. It makes you feel like you stopped caring about your patients and can cause you to be really tired and feel like you are not doing a good job. It can also make it hard for you to understand how your patients feel.”
**Please use this definition of resilience when responding: “Resilience is the ability to bounce back from stressful situations, endure hardships, and repair your own well-being, while creating a positive adaptation in the face of disruptive changes.”
Q25. The following questions ask about your organization’s commitment to mental health and staff well-being.
Response options (one per row): Yes | No | Not Sure
Does your organization make it clear that mental health is a top priority?
Does your organization lessen barriers to access mental health resources?
Is your organization training colleagues to understand signs of burnout and distress?
Is your organization getting feedback from employees about mental health supports/burnout through trainings or surveys?
Is your organization holding leaders and managers accountable to support employee mental health and resiliency?
Q26. Please select the response that best describes your feelings or experiences on each item.
[Students will be asked: Please select the response that best describes your feelings and experiences training to be a <autopopulate #3 answer>. If the question asks about work or job, please answer the question about your experiences in your <autopopulate #3 answer> program as a whole, including rotations or clinical experiences]
Response options (one per row): Disagree Strongly | Disagree Slightly | Neutral | Agree Slightly | Agree Strongly | Not Applicable
Events in this work setting affect my life in an emotionally unhealthy way
I feel burned out from my work
I feel fatigued when I get up in the morning and have to face another day on the job
I feel frustrated by my job
I feel I am working too hard on my job
27a. *Which, if any, of the following factors related to your work demands have contributed to your feelings of burnout: [For students: Which, if any, of the following factors related to work demands do you think you will experience and may make you feel burned out when you work as a <autopopulate #3 answer>?]
Select all that apply.
Administrative work stress
Concerns for physical health or safety at work
Fear of making serious mistakes
Feeling numb or tired from witnessing patient suffering (compassion fatigue)
Increased clinical demands (e.g., patient load, electronic health record documentation)
Lack of control over my work
Lack of resources compared to other similar settings
Not enough balance between work and personal life
Professional impact of COVID-19
Schedule is not flexible
Stress of hearing about people’s suffering and traumatic experiences
Understaffed at work
Unmanageable workload
None of the above
27b. *Which, if any, of the following factors related to your colleagues and organizational support have contributed to your feelings of burnout: [For students: Which, if any, of the following factors related to colleagues and organizational support do you think you will experience and may make you feel burned out when you work as a <autopopulate #3 answer>?]
Select all that apply.
Colleagues don’t trust each other
Employees are not included in decision making at my organization
Impacts of reimbursement models or other government and/or insurer policies on work
Lack of manager or leadership support
Lack of resources for mental health and wellness at work
My opinions don’t matter to the organization
Not enough support from colleagues
Organization does not prioritize diversity, equity, and inclusion
Too much mental health stigma at work
None of the above
27c. *Which, if any, of the following factors related to your position and career growth have contributed to your feelings of burnout: [For students: Which, if any, of the following factors related to position and career growth do you think you will experience and may make you feel burned out when you work as a <autopopulate #3 answer>?]
Select all that apply.
Lack of professional development
Lack of role clarity
My contributions are not valued enough
Not enough financial compensation at work
Unfair treatment/lack of equity at work (harassment and discrimination)
Working outside of my scope/training
None of the above
27d. *Which, if any, of the following factors related to your personal life have contributed to your feelings of burnout: [For students: Which, if any, of the following factors related to your personal life do you think you will experience and may make you feel burned out when you work as a <autopopulate #3 answer>?]
Select all that apply.
Chronic health problems (e.g., pain, fatigue, health conditions)
Depression, anxiety, and/or substance use
Family stressors (e.g., divorce, incarceration)
Financial stress
Feeling lonely
Lack of suitable and affordable childcare
Lack of time to take care of myself (e.g., to do things I enjoy)
Legal stressors
Personal impact of COVID-19
Stress of caring for others (e.g., older adults, children)
Uneven distribution of household responsibilities
None of the above
27e. Please list any other factors that have contributed to your feelings of burnout:
[For students: Please list any other factors you think you will experience and may make you feel burned out when you work as a <autopopulate #3 answer>.]
Q28. *Please select the top three reasons you feel burned out.
[list all factors indicated in Question 27a-e above].
Q42. * Please respond to each statement below by selecting one response per row.
Response options (one per row): Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree
I tend to bounce back quickly after hard times.
I have a hard time making it through stressful events.
It does not take me long to recover from a stressful event.
It is hard for me to snap back when something bad happens.
I usually come through difficult times with little trouble.
I tend to take a long time to get over setbacks in my life.
Q45. *In general, how would you rate your overall mental health now:
Excellent
Very good
Good
Fair
Poor
In addition, the U.S. Department of Health & Human Services (HHS) requires that information about race and ethnicity be collected on all HHS data collection instruments (ASPE, 2011). The proposed questions below have been revised to conform with existing OMB standards and align with those used in the 2020 Census. As noted above, the Healthcare Workforce Survey Informed Consent statement states that a respondent’s decision to complete the survey is voluntary, that they can stop at any time, and that they do not have to answer any questions they do not want to answer.
Are you Hispanic or Latino/a? Select one.
Yes
No
What is your race? Select all that apply.
American Indian or Alaska Native
Asian
Black or African American
Native Hawaiian or Other Pacific Islander
White
Prefer not to answer
This section includes estimates of the total burden hours for information collection (Exhibit 2) and of the cost associated with those hours (Exhibit 3). Exhibits 4-6 provide detailed explanations of how we estimated the number of responses (Exhibit 4), the source and methodology the agency used to determine the average burden per response (Exhibit 5), and the source and methodology the agency used to determine the median hourly wage rate, including overhead and benefits (Exhibit 6). Regarding the number of responses per respondent, we plan to field each data collection form no more than once per year. Accordingly, we estimate one response per respondent.
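For clarity, the burden figures in Exhibits 2 and 3 follow the standard arithmetic: total responses equal respondents times responses per respondent, total burden hours equal total responses times average burden per response, and cost burden equals burden hours times the hourly wage rate. A minimal Python sketch of that arithmetic, using illustrative rows copied from the exhibits below:

```python
# Minimal sketch of the burden arithmetic used in Exhibits 2 and 3.
# Figures are copied from illustrative rows of the exhibits below.

forms = [
    # (form name, respondents, responses per respondent, hours per response)
    ("Healthcare Workforce Survey", 29_359, 1, 0.25),
    ("Comparison Group Screener", 180_000, 1, 0.05),
]

for name, respondents, per_respondent, hours in forms:
    total_responses = respondents * per_respondent
    burden_hours = total_responses * hours
    print(f"{name}: {total_responses:,} responses, {burden_hours:,.0f} burden hours")

# Cost burden applies each respondent type's hourly wage to its burden hours,
# e.g., nurses in Exhibit 3: 2,392 hours at $78.10 per hour.
print(f"Nurse cost burden: ${2_392 * 78.10:,.2f}")  # $186,815.20
```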
Exhibit 2. Estimated Annualized Burden Hours
Form Name | Number of Respondents | Number of Responses per Respondent | Total Responses | Average Burden per Response (in hours) | Total Burden Hours
Healthcare Workforce Survey | 29,359 | 1 | 29,359 | 0.25 | 7,340
The Training Program Comparison Group Screener | 180,000 | 1 | 180,000 | 0.05 | 9,000
The Training Program Comparison Group Survey | 2,600 | 1 | 2,600 | 0.17 | 442
The Training Program Awardee Cost Workbook | 34 | 1 | 34 | 6.00 | 204
Awardee Interview Guide | 44 | 1 | 44 | 1.50 | 66
Awardee Training and Services Report | 44 | 1 | 44 | 1.00 | 44
Fielding Tracker | 44 | 1 | 44 | 4.00 | 176
The Workforce Program Awardee Cost Workbook | 10 | 1 | 10 | 6.00 | 60
The Workforce Program Organizational Assessment Interview Protocol | 50 | 1 | 50 | 1.00 | 50
Awardee Survey about the Technical Assistance Center | 44 | 1 | 44 | 1.00 | 44
Total | 212,229 | 10 | 212,229 | 20.97 | 17,426
Exhibit 3. Estimated Annualized Burden Cost to Respondents
Type of Respondent | Number of Respondents | Total Burden Hours | Hourly Wage Rate | Total Cost Burden
Nurses | 9,569 | 2,392 | $78.10 | $186,815.20
Advanced Practice Registered Nurses (Nurse Practitioners) | 1,309 | 327 | $116.94 | $38,239.38
Behavioral Health Providers (Social Workers, Counselors, Psychologists) | 4,760 | 1,190 | $47.48 | $56,501.20
Behavioral Health Students | 601 | 150 | $43.26 | $6,489.00
Physicians (MDs and DOs) | 4,941 | 1,235 | $214.82 | $265,302.70
Physician Assistants | 1,429 | 357 | $121.16 | $43,254.12
Pharmacists | 1,200 | 300 | $127.64 | $38,292.00
Resident Physicians1 | 1,500 | 375 | $158.26 | $59,347.50
All other students (nursing, medical students, nurse practitioner, physician assistants) | 4,050 | 1,013 | $19.96 | $20,219.48
Total | 29,359 | 7,339 | $927.62 | $714,460.58
Exhibit 4. Explanation for Estimated Number of Responses
Form Name | Number of Respondents | Source/Methodology for Determining the Number of Respondents
Healthcare Workforce Survey | 29,359 | This number reflects 30 percent of the sum of the estimated target populations for the Training Program and the Workforce Program, as 30 percent is the estimated response rate. HRSA identified the target population for the Training Program using data extracted from awardee abstracts and applications. Data about the target population by respondent type were not included in awardee abstracts and applications for the Workforce Program. For this reason, HRSA developed proportional estimates using HRSA’s Health Center Program Uniform Data System (UDS) data for the following respondent types: nurses, nurse practitioners, behavioral health providers (total mental health services row in UDS data), physicians, and physician assistants.
The Training Program Comparison Group Screener | 180,000 | This number reflects the estimate from Survey Healthcare Global (SHG)’s non-probability panel that roughly 100 people would need to be screened for each eligible and successfully completed case (inclusion criteria include nurses, physicians, physician assistants, behavioral health providers, nurse practitioners, and students who are not employed by one of the awardee institutions). This estimate is based on SHG’s previous experience with panel eligibility and response rates.
The Training Program Comparison Group Survey | 2,600 | This number reflects our intent to utilize NORC’s AmeriSpeak Panel supplemented by Survey Healthcare Global (SHG)’s non-probability panel to provide estimates on burnout and retention for members of the healthcare workforce. HRSA plans to obtain up to 1,500 responses across both panels, comprising 800 from AmeriSpeak and 700 from SHG. This sample size will allow HRSA to detect a difference of 10 percentage points in the retention rate between the comparison group and the awardees, based on a power analysis for a two-tailed test with the following parameters: beta of 0.2, alpha of 0.05, population retention rate of 30 percent, and a design effect of 2.5 (a sketch of this calculation follows this exhibit).
The Training Program Awardee Cost Workbook | 34 | This is the total number of Training Program awardees.
Awardee Interview Guide | 44 | This is the total number of Training and Workforce Program awardees.
Awardee Training and Services Report | 44 | This is the total number of Training and Workforce Program awardees.
Fielding Tracker | 44 | This is the total number of Training and Workforce Program awardees.
The Workforce Program Awardee Cost Workbook | 10 | This is the total number of Workforce Program awardees.
The Workforce Program Organizational Assessment Interview Protocol | 50 | This number assumes an average of five interview participants (e.g., program director, key support staff, key partner staff) at each of the ten Workforce Program awardees.
Awardee Survey about the Technical Assistance Center | 44 | This is the total number of Training and Workforce Program awardees.
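The sample-size rationale in the Comparison Group Survey row above can be reproduced with the standard two-proportion power formula, inflated by the design effect. A minimal sketch, assuming the 10-percentage-point difference means detecting 40 percent retention against the 30 percent baseline (the second proportion is an assumption; the exhibit does not state it):

```python
from scipy.stats import norm

# Two-tailed, two-proportion sample-size calculation with a design effect.
# Parameters from Exhibit 4: alpha = 0.05, beta = 0.2 (power = 0.8),
# baseline retention p1 = 0.30, design effect = 2.5.
# p2 = 0.40 is an assumed value, ten percentage points above the baseline.
alpha, power = 0.05, 0.80
p1, p2, deff = 0.30, 0.40, 2.5

z_alpha = norm.ppf(1 - alpha / 2)  # ~1.96
z_beta = norm.ppf(power)           # ~0.84

n_srs = (z_alpha + z_beta) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2
n_per_group = deff * n_srs         # inflate for clustering and weighting

print(f"{n_per_group:.0f} completed cases per group")  # prints 883
```

Under these assumptions the formula implies roughly 880 completed cases per group, broadly consistent with the planned 1,500 comparison-group completes.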
Exhibit 5. Explanation for Estimated Average Burden per Response
Form Name | Source/Method
Healthcare Workforce Survey | Based on the understanding that this survey has approximately 60 items, exclusive of skip logic, for respondent consideration. In evaluating the potential burden, we employed a variety of testing and estimation measures. We conducted cognitive interviews and tracked the response time (excluding retrospective probing) for items. We tested the items internally for average completion time. Additionally, we compared the survey to similar instruments’ average completion times. Finally, we compared these estimates to best practice estimates in survey research (e.g., the average respondent would answer three 200-word questions per minute).
The Training Program Comparison Group Screener | Based on the understanding that this screener has approximately 10 items, exclusive of skip logic, for respondent consideration. In evaluating the potential burden, we employed a variety of testing and estimation measures. We conducted cognitive interviews and tracked the response time (excluding retrospective probing) for items. We tested the items internally for average completion time. Additionally, we compared the screener to similar instruments’ average completion times. Finally, we compared these estimates to best practice estimates in survey research (e.g., the average respondent would answer three 200-word questions per minute).
The Training Program Comparison Group Survey | Based on the understanding that this survey has approximately 45 items, exclusive of skip logic, for respondent consideration. In evaluating the potential burden, we employed a variety of testing and estimation measures. We conducted cognitive interviews and tracked the response time (excluding retrospective probing) for items. We tested the items internally for average completion time. Additionally, we compared the survey to similar instruments’ average completion times. Finally, we compared these estimates to best practice estimates in survey research (e.g., the average respondent would answer three 200-word questions per minute).
The Training Program Awardee Cost Workbook | Based on consultation with subject matter experts with experience conducting similar cost abstraction (e.g., the CDC-funded STEADI Algorithm for Fall Risk Screening, Assessment & Intervention, where we estimated the initiative’s economic cost-effectiveness, and a Special Projects of National Significance (SPNS) Initiative, where we assessed the cost-effectiveness of an HIV health outcomes program). The estimate includes reviewing instructions and pre-filled data, time to make any corrections or updates to the Cost Workbook, reporting the estimated time that each person spent on project activities, and submitting any missing cost descriptions.
Awardee Interview Guide | Based on an understanding of the breadth and depth of content; feedback sessions, including cognitive interviews and focus groups (each involving fewer than nine individuals); and awareness that interviews will be scheduled with a firm start and stop time.
Awardee Training and Services Report | Based on the understanding that these pre-populated reports will include a current and complete list of activities and key descriptive information for each awardee organization. We considered the average number of activities per awardee, time to review the instructions and layout of the report, and time to make any corrections or updates to the report (based on an average silent reading time for adults of 238 words per minute).
Fielding Tracker | Based on consideration of the time required to review instructions for completing the Fielding Tracker (based on an average silent reading time for adults of 238 words per minute), as well as the multiple staff who may be involved in providing target population demographics and/or disseminating the Healthcare Workforce Survey email invitations.
The Workforce Program Awardee Cost Workbook | Based on consultation with subject matter experts with experience conducting similar cost abstraction (e.g., the CDC-funded STEADI Algorithm for Fall Risk Screening, Assessment & Intervention, where we estimated the initiative’s economic cost-effectiveness, and a Special Projects of National Significance (SPNS) Initiative, where we assessed the cost-effectiveness of an HIV health outcomes program). The estimate includes reviewing instructions and pre-filled data, time to make any corrections or updates to the Cost Workbook, reporting the estimated time that each person spent on project activities, and submitting any missing cost descriptions.
The Workforce Program Organizational Assessment Interview Protocol | Based on the breadth and depth of content; feedback sessions, including cognitive interviews and focus groups (each involving fewer than nine individuals); and awareness that interviews will be scheduled with a firm start and stop time.
Awardee Survey about the Technical Assistance Center | Based on the understanding that this survey has approximately 48 items, exclusive of skip logic, for respondent consideration. In evaluating the potential burden, we employed a variety of testing and estimation measures. The level of detail required by this survey, which will likely require consulting documentation or colleagues, results in a higher response burden than a standard survey. We conducted expert reviews and tracked the response time (excluding retrospective probing) for items. Additionally, we compared the survey to similar instruments’ average completion times. Finally, we compared these estimates to best practice estimates in survey research (e.g., the average respondent would answer three 200-word questions per minute).
Exhibit 6. Explanation for Source and Methodology for How the Agency Determined the Hourly Wage Rate
Type of Respondent | Source | Source/Method
Nurses | DOL-BLS Wage Data | Median hourly wage rate for Registered Nurses (wages as of 05/2022, accessed 12/19/23). https://www.bls.gov/oes/current/oes291141.htm
Advanced Practice Registered Nurses (Nurse Practitioners) | DOL-BLS Wage Data | Median hourly wage rate for Nurse Practitioners (wages as of 05/2022, accessed 12/19/23). https://www.bls.gov/oes/current/oes291171.htm
Behavioral Health Providers (Social Workers, Counselors, Psychologists) | DOL-BLS Wage Data | Median hourly wage rate for Community and Social Service Occupations (wages as of 05/2022, accessed 12/19/23). https://www.bls.gov/oes/current/oes210000.htm
Behavioral Health Students | Zippia | The average hourly wage rate for behavioral health students is based on the graduate teaching assistant salary on zippia.com (Graduate Teaching Assistant Salary, April 2023, Zippia). The Bureau of Labor Statistics website did not include an hourly wage for this position (Teaching Assistants, Postsecondary, bls.gov).
Physicians (MDs and DOs) | DOL-BLS Wage Data | Median hourly wage rate for Physicians, All Other (wages as of 05/2022, accessed 12/19/23). https://www.bls.gov/oes/current/oes291229.htm
Physician Assistants | DOL-BLS Wage Data | Median hourly wage rate for Physician Assistants (wages as of 05/2022, accessed 12/19/23). https://www.bls.gov/oes/current/oes291071.htm
Pharmacists | DOL-BLS Wage Data | Median hourly wage rate for Pharmacists (wages as of 05/2022, accessed 12/19/23). https://www.bls.gov/oes/current/oes291051.htm
Resident Physicians | Zippia | We separated out resident physicians from medical students/all other students in this table since resident physicians receive a salary. The average hourly wage for resident physicians is based on data found on zippia.com (https://www.zippia.com/resident-physician-jobs/salary/). There was no information on hourly wage for resident physicians on the Bureau of Labor Statistics website.
All other students (nursing, medical students, nurse practitioner, physician assistants) | Minimum wage | The average hourly wage for all other students is based on the national average minimum hourly wage for 2022, assuming health care workforce students have some external employment (https://www.laborlawcenter.com/state-minimum-wage-rates).
Estimates of other Total Annual Cost Burden to Respondents or Recordkeepers/Capital Costs
There are no direct costs to respondents other than their time to participate in the data collection.
The total estimated cost of this data collection and evaluation for the contractors is $7,348,769. The contract spans a 48-month project period and represents an annual cost of $1,837,192.
The costs associated with the data collection and evaluation activities for the project include the contractor’s project development and project management costs, as well as the costs to develop provider resiliency program evaluation questions and methodology; to develop an evaluation plan; to develop data collection instruments; to conduct information collection, data development, and coding procedures; to conduct data analysis; and to report results.
In addition, the cost to the government includes the salaries of the HRSA staff (Exhibit 7) who:
1) determine the content of the data collection instruments,
2) oversee the scope of work conducted under the contract, and
3) assist in analyzing the results and recommend changes in questionnaire wording.
There are no equipment or overhead costs. The only cost to the Federal Government will be the salary of HRSA staff and funding for the contractor (NORC) to support the development of the study design, data collection, analysis of results, and associated tasks.
Exhibit 7. Estimated Government Staff Costs
Type of Federal Program Staff | Average Total Annual Burden Hours | Hourly Wage Rate* | Total Staff Costs
Public Health Analyst GS-13, Step 5 average | 520 (0.25 FTE) | $91.25 | $47,450.00
Public Health Analyst GS-13, Step 5 average | 208 (0.10 FTE) | $91.25 | $18,980.00
Public Health Analyst GS-13, Step 5 average | 104 (0.05 FTE) | $91.25 | $9,490.00
Public Health Analyst GS-13, Step 5 average | 104 (0.05 FTE) | $91.25 | $9,490.00
Total | 936 | | $85,410.00
*Wage rate is based on 2023 OPM Pay Schedule for Washington DC area:
https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2023/DCB_h.pdf
Rates were adjusted by a factor of 1.5 to account for overhead costs.
Total (contracts and staff) is $7,405,705.88.
This is a new information collection.
Exhibit 8 presents a summary of the proposed time schedule for data collection activities. The exact start date for data collection activities depends on the OMB clearance date.
Exhibit 8. Proposed Data Collection Time Schedule
Task | Time Period
Conduct the Training Program and the Workforce Program Healthcare Workforce Survey (pending OMB approval) | Dependent on the OMB clearance date
Conduct Awardee Interviews | 2023 and 2024
Conduct Organizational Assessment Interviews | 2023 and 2024
Complete Awardee Cost Workbook | Winter 2024 (for 2022 and 2023 data) and Fall 2024/Early Winter 2025 (for 2024 data)
Conduct Awardee Training and Services Report | Fall 2023 and Fall 2024
Conduct Awardee Survey about the TAC | Fall 2023 and Fall 2024
Conduct Comparison Group Survey | Fall 2024/Early Winter 2025
Analyze secondary quantitative data, secondary qualitative data, awardee survey data, and qualitative interview data | Results provided annually in each draft Interim and Final Report, eight weeks prior to the end of the contract year
Peer-Reviewed Publication #1 | 2024 (Year 2)
Peer-Reviewed Publication #2 | 2024 (Year 3)
Peer-Reviewed Publication #3 | 2025 (Year 4)
Interim Reports | 2025
Final Report | 2026 (8 weeks prior to the end of the contract)
Briefing slides | 2026 (3 weeks prior to the end of the contract)
Data files | 2026 (4 weeks prior to the end of the contract)
The evaluation will use both quantitative and qualitative analyses to describe the characteristics of the programs and to assess outcomes. Each phase of this evaluation will be informed by literature, especially related to identifying evaluation best practices and to comparing the findings with other relevant research. Using the Healthcare Workforce Survey data, HRSA will address evaluation research questions by assessing the characteristics of the target population and their experiences with the intervention and by noting differences in outcomes between the target population and the comparison group.2 HRSA will integrate and synthesize survey data with award program and secondary public data.
The awardees will be asked to invite their entire target population (that is, those identified to participate in the Training Program and employees of the Workforce Program awardees) to complete the survey. It is expected that some individuals will decline to respond. The evaluation will use the Healthcare Workforce Fielding Tracker to collect data on the total number of individuals within the awardees’ target population and the basic demographics of the population. Using data from the Tracker, the evaluation will assess whether survey results reflect the target population and if necessary, use weighting to make responses of the sampled individuals more representative of the target population.3
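As an illustration of that weighting step, a minimal post-stratification sketch in Python; the strata, column names, and counts are hypothetical, with population totals assumed to come from the Fielding Tracker:

```python
import pandas as pd

# Hypothetical stratum counts: population totals from the Fielding Tracker,
# respondent totals from the Healthcare Workforce Survey.
population = pd.DataFrame({
    "profession": ["nurse", "physician", "student"],
    "pop_n": [10_000, 5_000, 4_000],
})
respondents = pd.DataFrame({
    "profession": ["nurse", "physician", "student"],
    "resp_n": [3_200, 1_100, 900],
})

weights = population.merge(respondents, on="profession")
# Each respondent stands in for pop_n / resp_n members of their stratum,
# pulling the weighted sample back toward the target population's mix.
weights["weight"] = weights["pop_n"] / weights["resp_n"]
print(weights[["profession", "weight"]])
```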
This evaluation will report descriptive univariate respondent characteristics and outcomes collected in the Healthcare Workforce Survey and in Annual Performance Review data. The evaluation will report current outcomes (for example, burnout and absenteeism), whether respondents in activities perceive changes in outcomes since participating in the program, and whether respondents attribute any improvement in outcomes to the activities. Pooled individual-level data will be used to report the descriptive results at the overall program level. The evaluation will assess the feasibility of reporting descriptive results for awardees, major target population types (for example, nurses and medical students), and other subpopulations. A key consideration for reporting target population data at the level of target population type or awardee is whether the number of respondents is sufficient to meet requirements to mask (that is, not report) results to protect respondents' privacy. Results of descriptive analyses will be synthesized into tables and visuals.
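A minimal sketch of that small-cell masking rule; the 11-respondent threshold is an illustrative assumption, not a stated HRSA disclosure standard:

```python
import pandas as pd

# Hypothetical awardee-level summary with made-up figures.
summary = pd.DataFrame({
    "awardee": ["A", "B", "C"],
    "n_respondents": [250, 8, 40],
    "pct_burned_out": [41.2, 62.5, 38.0],
})

THRESHOLD = 11  # assumed minimum cell size before results are suppressed
# Suppress the reported statistic wherever the cell count is too small.
summary["pct_burned_out"] = summary["pct_burned_out"].where(
    summary["n_respondents"] >= THRESHOLD
)
print(summary)  # awardee B's percentage is masked (NaN)
```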
For the Training Program, the evaluation will apply appropriate statistical tests to compare all outcomes between the comparison and target population groups, based on variable type and distribution (for example, a chi-squared test for categorical variables and t-tests or analysis of variance (ANOVA) for continuous variables). Key outcomes include absenteeism, intent to leave, burnout, and resilience of the target population and the comparison group. For the Workforce Program, the evaluation will compare results on the level of intent to leave (before and after individuals participate in awardee activities) with point estimates from comparable studies in the literature or from publicly available data.
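A minimal sketch of those bivariate comparisons, with hypothetical column names (`burnout_score` continuous, `intent_to_leave` binary, `group` distinguishing target from comparison respondents):

```python
import pandas as pd
from scipy import stats

def compare_outcomes(df: pd.DataFrame) -> None:
    """Bivariate tests on a pooled survey data frame (hypothetical columns)."""
    # Continuous outcome: Welch's t-test on burnout scores by group.
    target = df.loc[df["group"] == "target", "burnout_score"]
    comparison = df.loc[df["group"] == "comparison", "burnout_score"]
    t_stat, p_t = stats.ttest_ind(target, comparison, equal_var=False)
    print(f"burnout t-test: t = {t_stat:.2f}, p = {p_t:.3f}")

    # Categorical outcome: chi-squared test on intent to leave by group.
    table = pd.crosstab(df["group"], df["intent_to_leave"])
    chi2, p_c, dof, _ = stats.chi2_contingency(table)
    print(f"intent-to-leave chi-squared: chi2 = {chi2:.2f}, p = {p_c:.3f}")
```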
The next step in evaluating the Training Program will be to run multivariable regression models that assess differences in outcomes, adjusting for individual-level factors collected in the survey (that is, demographics and profession) and for community-level factors (such as rural/urban location and community-level socioeconomic status) using workplace zip code data. Adjusting for such factors will account for observed differences between respondents in the target population and the comparison group that may relate to outcomes, although the evaluation will not be able to measure all relevant factors outside the intervention. For the pooled assessment, the evaluation will consider propensity score adjustments to address differences in demographic and other characteristics between respondents in the treatment and comparison groups; the evaluation expects that sample sizes will be insufficient for propensity score adjustments at the awardee level. An appropriate functional form will be selected for each outcome based on the distribution of the data (for example, logistic regression for binary outcomes). To account for clustering of respondents within awardee organizations, the evaluation will use a random effect for organization. As with descriptive results, the evaluation will report awardee-level, respondent-type, and subpopulation results in addition to the pooled assessment, if feasible.
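A minimal sketch of an adjusted model with an organization-level random intercept, using statsmodels on synthetic stand-in data (variable names are hypothetical; binary outcomes such as intent to leave would instead use a mixed-effects logistic model):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the pooled respondent-level data set.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "burnout_score": rng.normal(3.0, 1.0, n),  # continuous outcome
    "treated": rng.integers(0, 2, n),          # 1 = target population
    "rural": rng.integers(0, 2, n),            # community-level factor
    "org_id": rng.integers(0, 20, n),          # awardee organization
})

# Random intercept for organization accounts for clustering of respondents
# within awardees.
model = smf.mixedlm("burnout_score ~ treated + rural", data=df, groups=df["org_id"])
result = model.fit()
print(result.summary())
```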
The evaluation will use a mixed methods approach for the organizational assessment of the Workforce Program awardees. Analyses will draw on qualitative and quantitative data from the Annual Performance Reports (APR) and other HRSA reporting, specific questions about organizational change from the Healthcare Workforce Survey, and interviews with the Workforce Program awardees. To analyze interviews, the evaluation will use a team-based approach to establish a codebook, provide training to coders, and conduct quality assurance checks for inter-rater reliability. The initial codebook will be based on the interview guide, updated as needed during the analysis. After coding is completed, the evaluation will conduct content analysis at multiple levels to describe findings at the overall program level; by specific awardee characteristics and perspectives (type of interviewee); and at the awardee level.
The evaluation will use a similar qualitative approach to analyze awardee interviews. The evaluation will use a combined deductive and inductive approach and develop an initial codebook based on the awardee interview guide. HRSA will ensure strong inter-rater reliability. Potential domains include implementation processes, challenges and facilitators to collaboration, and best practices and lessons learned.
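Inter-rater reliability checks of the kind described above are often quantified with Cohen's kappa (the specific statistic is our assumption; the text does not name one). A minimal sketch with hypothetical codes from two coders:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned by two coders to the same ten excerpts.
coder_1 = ["barrier", "facilitator", "barrier", "best_practice", "barrier",
           "facilitator", "barrier", "barrier", "best_practice", "facilitator"]
coder_2 = ["barrier", "facilitator", "facilitator", "best_practice", "barrier",
           "facilitator", "barrier", "barrier", "barrier", "facilitator"]

kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa: {kappa:.2f}")  # values near 1.0 indicate strong agreement
```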
Evaluation results will be disseminated through three manuscripts drafted for publication in peer-reviewed health journals, at a rate of one per year. The evaluation team will review proposed topics and possible journals to ensure that manuscripts align with HRSA priorities and represent the key findings of the evaluation to date. HRSA’s National Center for Health Workforce Analysis (NCHWA) staff, editors, and subject matter experts will collaborate to develop manuscripts covering relevant topic areas. Early in the project, NORC will work with NCHWA to focus the dissemination of results toward intended audiences and to contribute to research, policy, and programs. Examples of peer-reviewed health journals that might be relevant to this evaluation include the Journal of the American Medical Association, Psychiatric Services, American Journal of Preventive Medicine, Journal of Health Care for the Poor and Underserved, Health Services Research, and Journal of Healthcare.
Per 42 USC 292 et seq, HRSA cannot share individual-level data from this evaluation. Specifically, U.S. Code Title 42 Chapter 6A Subchapter V Part E Section 295k(e)(3), states the following: “(A) Notwithstanding any other provision of law, personal data collected by the Secretary or any program entity under this section may not be made available or disclosed by the Secretary or any program entity to any person other than the individual who is the subject of such data unless (i) such person requires such data for purposes of this section, or (ii) in response to a demand for such data made by means of compulsory legal process. Any individual who is the subject of personal data made available or disclosed under clause (ii) shall be notified of the demand for such data. (B) Subject to all applicable laws regarding confidentiality, only the data collected by the Secretary under this section which is not personal data shall be made available to bona fide researchers and policy analysts (including the Congress) for the purposes of assisting in the conduct of studies respecting health professions personnel.”
After these three manuscripts are published, HRSA will publish all public government data assets generated by this information collection online as open data, using standardized, machine-readable data formats in compliance with the OPEN Government Data Act (Title II of the Foundations for Evidence-Based Policymaking Act of 2018, P.L. 115-435).
Does not apply. The OMB number and expiration date will be displayed on every page of all forms and instruments.
No exceptions are necessary for this information collection.
Hutchinson, M., & Sutherland, M. A. (2019). Conducting surveys with multidisciplinary health care providers: Current challenges and creative approaches to sampling, recruitment, and data collection. Research in Nursing & Health, 42(6), 458–466. https://doi.org/10.1002/nur.21976
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method (Fourth Edition). John Wiley & Sons, Inc.
Office of the Assistant Secretary for Planning and Evaluation (ASPE) (2011). HHS Implementation Guidance on Data Collection Standards for Race, Ethnicity, Sex, Primary Language, and Disability Status. https://aspe.hhs.gov/reports/hhs-implementation-guidance-data-collection-standards-race-ethnicity-sex-primary-language-disability-0
Attachments
Supporting Document | Attachment
The Health and Public Safety Workforce Resiliency Training Program (the Training Program) / Promoting Resilience and Mental Health among Health Professional Workforce (the Workforce Program) Healthcare Workforce Survey; The Training Program Comparison Group Screener and Survey | 1
The Healthcare Workforce Fielding Tracker | 1A
The Healthcare Workforce Survey Respondent Contact Materials | 1(B-K)
The Training Program Comparison Group Survey Respondent Contact Materials | 1(L-P)
Awardee Survey about the Technical Assistance Center (TAC) | 2
Awardee Survey about the TAC Respondent Contact Materials | 2(A-H)
The Awardee Cost Workbook | 3
Cost-Benefit Workbook Respondent Contact Materials | 3(A-D)
The Training Program / The Workforce Program Awardee Training and Services Report | 4
The Awardee Training and Services Report Respondent Contact Materials | 4(A-C)
The Training Program Awardee Interview Guide | 5
The Training Program Awardee Interview Guide Respondent Contact Materials | 5(A-B)
The Workforce Program Awardee Interview Guide | 6
The Workforce Program Awardee Interview Guide Respondent Contact Materials | 6(A-B)
1 NORC separated out resident physicians from medical students/all other students in this table since resident physicians receive a salary.
2 For the Workforce Program, the evaluation will use national benchmarks for comparisons to intent to leave rather than the comparison group.
3 While the Annual Performance Data includes some information on healthcare workforce demographics, it provides information at the training level, and individuals can be targeted for multiple interventions. In addition, the APR data does not include individuals who declined to enroll in activities. This information collection requests use of the Healthcare Workforce Fielding Tracker because nonresponse analysis requires information on unduplicated target group members.