Supporting Statement B


Evaluation of Programs Supporting the Mental Health of the Health Professions Workforce



OMB Control No. 0906-XXXX-New





List of Exhibits


Exhibit 1. Provider Resiliency Programs Overview

Exhibit 2. Provider Resiliency Evaluation Data Collection Instruments and Numerical Estimates of Respondents by Respondent Type

Exhibit 3. Cost-Benefit Assessment Data Collection and Communications

Exhibit 4. The Awardee Training and Services Report Email Communications

Exhibit 5. The Training Program Awardee Interview Email Communications

Exhibit 6. The Workforce Program Awardee Interview Email Communications

Exhibit 7. Healthcare Workforce Survey Pre-Notification Efforts

Exhibit 8. Healthcare Workforce Survey Emails (from Awardees to Healthcare Workforce Respondents)

Exhibit 9. Awardee Survey about the TAC Emails (to Awardee Contacts)















Table of Attachments


Supporting Document | Attachment
The Health and Public Safety Workforce Resiliency Training Program (the Training Program) / Promoting Resilience and Mental Health among Health Professional Workforce (The Workforce Program) Healthcare Workforce Survey | 1
The Training Program Comparison Group Screener and Survey | 1
The Healthcare Workforce Fielding Tracker | 1A
The Healthcare Workforce Survey Respondent Contact Materials | 1(B-J)
The Training Program Comparison Group Survey Respondent Contact Materials | 1(K-P)
Awardee Survey about the Technical Assistance Center (TAC) | 2
Awardee Survey about the TAC Respondent Contact Materials | 2(A-H)
The Training Program/The Workforce Program Awardee Cost Workbook | 3
Cost-Benefit Assessment Respondent Contact Materials | 3(A-D)
The Training Program/The Workforce Program Awardee Training and Services Report | 4
Awardee Training and Services Report Respondent Contact Materials | 4(A-C)
The Training Program Awardee Interview Guide | 5
The Training Program Awardee Interview Guide Respondent Contact Materials | 5(A-B)
The Workforce Program Awardee Interview Guide | 6
The Workforce Program Awardee Interview Guide Respondent Contact Materials | 6(A-B)

Supporting Statement B

Bureau of Health Workforce Provider Resiliency Evaluation



Collection of Information Employing Statistical Methods


  1. Respondent Universe and Sampling Methods


The Health Resources and Services Administration's (HRSA) Bureau of Health Workforce seeks to evaluate each of the three award programs (the Training Program, the Workforce Program, and the Technical Assistance Center). The programs allow awardees to address their unique needs with regard to reducing burnout in their healthcare workforce and creating organizational change. Exhibit 1 below provides a high-level summary of these programs' objectives and awardees. The flexibility in the HRSA programs leads to significant variation across the awardee programs with respect to:

  • Type of organization

  • Discipline, specialty, and training background of their target population

  • Other characteristics of their target population

  • Activities and curriculum developed

  • Number of trainees

Exhibit 1. Provider Resiliency Programs Overview

Program: The Health and Public Safety Workforce Resiliency Training Program (the Training Program)
Objectives: Reduce burnout by funding evidence-based provider wellness training activities and increase knowledge of these strategies throughout the health workforce.
Awardees: 34 health professional schools, academic health centers, and state or local governments that conduct training activities using evidence-based strategies focused on reducing burnout and promoting resiliency among the health workforce in rural and underserved communities.

Program: Promoting Resilience and Mental Health among Health Professional Workforce (the Workforce Program)
Objectives: Support health care providing entities by funding programs or protocols aimed at creating a culture of wellness within their organizations.
Awardees: 10 health care providing entities, health care providers' associations, and Federally Qualified Health Centers.

Program: The Health and Public Safety Workforce Resiliency Technical Assistance Center (the Technical Assistance Center)
Objectives: Assist the Training Program and the Workforce Program awardees in deploying evidence-based resilience strategies within their respective populations, and work with the 10 Regional Public Health Training Centers to develop and advance a framework to reduce burnout.
Awardees: George Washington University (GWU) Fitzhugh Mullan Institute for Health Workforce Equity at the Milken Institute School of Public Health.


Respondent Universe

The three types of respondents for this evaluation are:

  1. Healthcare workforce (target population) across two grant programs: the Training Program and the Workforce Program.

  2. Awardee project staff - the Training Program, the Workforce Program, and Health and Public Safety Workforce Resiliency Technical Assistance Center (The Technical Assistance Center).

  3. Healthcare workers not participating in the grant programs (for the comparison group).


Exhibit 2 defines the eligible respondents for each data collection instrument.

Exhibit 2. Provider Resiliency Evaluation Data Collection Instruments and Numerical Estimates of Respondents by Respondent Type

Data Collection Instrument | Training Program Target Population | Workforce Program Target Population | Training Program Awardees | Workforce Program Awardees | Health Workforce Comparison Group
1. The Health and Public Safety Workforce Resiliency Training Program (the Training Program) / Promoting Resilience and Mental Health among Health Professional Workforce (The Workforce Program) Healthcare Workforce Survey | 22,481 | 6,878 | | |
2. The Training Program/The Workforce Program Awardee Training and Services Report | | | 34 | 10 |
3. The Training Program Comparison Group Survey | | | | | 1,500
4. The Awardee Survey about the Technical Assistance Center (TAC) | | | 34 | 10 |
5. The Training Program/The Workforce Program Awardee Cost Workbook | | | 34 | 10 |
6. The Awardee Interview | | | 34 | 10 |
7. Organizational Assessment Interview Protocol | | | | 30 |
8. The Healthcare Workforce Fielding Tracker | | | 34 | 10 |

Sampling Methods

The Provider Resiliency Evaluation does not employ any statistical methods to select respondents. A census approach will be used for all data collection efforts except for the comparison group (which is explained in the Design and Sampling for the Training Program Comparison Group Survey section below).


A census approach will be used for The Training Program/The Workforce Program Awardee Training and Services Report (Awardee Training and Services Report) and the Healthcare Workforce Fielding Tracker for the Training Program and the Workforce Program awardees because the data provided on the form are needed to tailor the survey for each awardee, and the data collected about each awardee's target population will be used for nonresponse analysis. A census approach is also needed for the Awardee Survey about the Technical Assistance Center (TAC), The Training Program/The Workforce Program Awardee Cost Workbook (the Awardee Cost Workbook), The Awardee Interview, and the Organizational Assessment Interview Protocol because of the wide range of programs and the need to understand the implementation and cost-benefit of each awardee's approach to reducing burnout and/or creating organizational change.


For the Healthcare Workforce Survey, a census approach is optimal for several reasons:

  1. There is a lack of auxiliary information that could be used to develop an efficient sample design.

  2. A census reduces the complexity of survey administration and of information transfers between HRSA, the data collection contractor (NORC), and the awardees.

  3. Although we strive to achieve high response rates, a census approach has the best chance of achieving sample sizes large enough to yield sufficient statistical power.

  4. There is a desire to conduct subgroup analyses, and some subgroups cannot be identified prior to data collection under a sampling approach.

  5. Given the diversity of respondents, a census approach has the best chance of yielding a respondent pool that represents the full range of respondent types.

Design and Sampling for the Training Program Comparison Group Survey

The Training Program Comparison Group Survey is designed to provide nationally representative estimates of burnout and retention for members of the healthcare workforce. The target population for this survey is the healthcare workforce in the United States. The Survey will utilize NORC's AmeriSpeak Panel supplemented by Survey Healthcare Global (SHG)'s non-probability panel. HRSA plans to obtain up to 1,500 responses across both panels, comprising 800 from AmeriSpeak and 700 from SHG. This sample size will allow HRSA to detect a difference of 10 percentage points in the retention rate between the comparison group and the awardees.1
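For illustration, the power analysis in footnote 1 can be approximated with the following minimal sketch (Python, statsmodels). The group ratio and the handling of the design effect (deflating the nominal sample size) are our assumptions, not stated in the source.

```python
# A minimal sketch of the footnote 1 power analysis; assumptions noted below.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

alpha = 0.05                       # two-tailed significance level
deff = 2.5                         # design effect from footnote 1
n_comparison = 1_500               # planned responses (AmeriSpeak + SHG)
n_effective = n_comparison / deff  # effective sample size after design effect

# Detect a 10-percentage-point difference around a 30% retention rate.
effect = proportion_effectsize(0.40, 0.30)  # Cohen's h

power = NormalIndPower().power(
    effect_size=effect,
    nobs1=n_effective,
    alpha=alpha,
    ratio=1.0,                     # assumed comparison-to-awardee group ratio
    alternative="two-sided",
)
print(f"Power to detect 30% vs. 40% retention: {power:.2f}")  # ~0.95, above the 0.80 target
```

Under these illustrative assumptions, the computed power exceeds the 0.80 implied by the footnote's beta of 0.2.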


AmeriSpeak Panel


AmeriSpeak Frame

NORC recruits its AmeriSpeak panel members using address-based sampling (ABS) to contact U.S. households at random. The primary sampling frame for AmeriSpeak is the NORC master sample or the National Frame, a multistage probability sample that represents the U.S. household population for over 90% of the sample segments. NORC uses the U.S. Postal Service (USPS) Delivery Sequence File (DSF) to update the addresses annually.


For the construction of the National Frame, the primary sampling units comprised 1,917 National Frame Areas (NFAs), where each NFA is an entire metropolitan area (made up of one or more counties), a county, or a group of counties with a minimum population of 10,000. A total of 126 NFAs were selected in the first stage, including 28 non-urban NFAs, 38 certainty NFAs, and 60 urban NFAs. Implicit stratification was achieved by sorting the segments by location (NFA, state, and county), principal city indicator, and ethnic and income indicators. From each non-urban and urban NFA, a sample of five and eight segments, respectively, was selected using systematic probability-proportional-to-size (PPS) sampling, where the measure of size is the number of housing units per segment. Overall, a stratified probability sample of 1,514 segments was selected into the National Frame in the second-stage sampling. In addition to NORC's National Frame, the DSF is used as a supplemental sample frame in four states (Alaska, Iowa, North Dakota, and Wyoming) to assure AmeriSpeak presence in all U.S. states and Washington, D.C.
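As an illustration of the second-stage selection, the sketch below implements generic systematic PPS sampling with housing units as the measure of size. The segment counts and sizes are simulated, not NORC frame data.

```python
# Illustrative systematic probability-proportional-to-size (PPS) selection.
import numpy as np

rng = np.random.default_rng(42)
housing_units = rng.integers(50, 2_000, size=500)  # hypothetical segment sizes
n_select = 8                                       # e.g., 8 segments in an urban NFA

cum_size = np.cumsum(housing_units)                # cumulative measure of size
interval = cum_size[-1] / n_select                 # systematic sampling interval
start = rng.uniform(0, interval)                   # random start in first interval
points = start + interval * np.arange(n_select)    # equally spaced selection points

# A segment is selected when a point falls within its cumulative size range,
# so its selection probability is proportional to its number of housing units.
selected = np.searchsorted(cum_size, points)
print("Selected segment indices:", selected)
```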


AmeriSpeak Panel Construction

AmeriSpeak panel recruitments take place annually. The NFA segments are stratified into six sampling strata based on the race/ethnicity and age composition of each segment. To support the second stage of panel recruitment, initially sampled but nonresponding housing units are subsampled for nonresponse follow-up (NRFU). Overall, approximately one in five initially nonresponding housing units is subsampled for NRFU, using the same six sampling strata defined above. Due to NRFU, these initially nonresponding housing units have a higher selection probability compared to the housing units that were recruited during the first stage of panel recruitment. NORC's National Frame is designed to represent the U.S. household population nationally. Each year, a major assessment of panel representativeness is conducted to inform the annual sampling strategy and to ensure representativeness by state as well as across a full range of demographic variables. If needed, additional statewide samples are generated from the USPS DSF file for supplemental recruitment.


AmeriSpeak Sample Design

The Training Program Comparison Group Survey sample will be drawn from the active members of the AmeriSpeak Panel. Panel members will be eligible for the sample if they reported studying or being employed in a healthcare or healthcare-related field during panel recruitment or other data collections since recruitment.


Because the number of panel members in these occupations is limited, all members working in or studying these occupations will be selected initially for the Survey, yielding a sample of approximately 3,500 panel members. Eligibility for the Training Program Comparison Group Survey will be assessed using a brief web-based screener (Attachment 1). We expect that approximately 800 panel members will be eligible for and respond to the Survey.


Non-Probability Survey Healthcare Global Sample

The AmeriSpeak sample will be supplemented with panel members from a commercial opt-in panel of U.S. healthcare workers from SHG. The panel is drawn from a population of more than two million physicians and allied healthcare professionals whose information is regularly updated (using American Medical Association (AMA) records, hospital directories, and other sources). The sample will be targeted to physicians, nurses, and healthcare students.


Response Rate

For the qualitative data collection instruments, as well as the Awardee Cost Workbook, Awardee Training and Services Report, Healthcare Workforce Fielding Tracker, and Awardee Survey about the TAC, we anticipate an 80% response rate.


The Training Program/Workforce Program Healthcare Workforce Survey (Healthcare Workforce Survey) marks the first time that survey data will be collected by an external evaluator to evaluate the specific objectives of the Provider Resiliency award programs. For surveys of healthcare professionals, response rates have been under 60% since the 1990s, and for web-only surveys they had dropped to 38% on average by 2012 (Cho, Johnson, & VanGeest, 2013). A recent study, the 2021 Public Health Workforce Interests and Needs Survey (PH WINS), conducted by the de Beaumont Foundation and the Association of State and Territorial Health Officials (ASTHO), achieved a response rate of 32.5% (de Beaumont Foundation, 2023). That survey addressed topics similar to those in the Healthcare Workforce Survey (including stress, burnout, and intent to leave), although PH WINS focused on the public health workforce rather than on the healthcare workforce. In addition, the data collection contractor (NORC) has recent experience with a similar evaluation, the Bureau of Health Workforce (BHW) Substance Use Disorder Evaluation, for which it obtained a response rate of 22% among awardee trainees who participated in training programs. We aim to achieve a 30% response rate for the Healthcare Workforce Survey, or approximately 8,800 completed surveys across the two programs' target populations. We discuss our approach to maximizing response rates in Section 3.


  2. Procedures for the Collection of Information


We have designed data collection procedures to maximize response rates, to reduce burden on respondents, and to promote accuracy and completeness of responses. In this section, we outline planned steps for the qualitative and quantitative data collection instruments, including: 1) the Cost-Benefit Assessment (including the Awardee Cost Workbook); 2) the Awardee Training and Services Report; 3) the Awardee Interview; 4) the Healthcare Workforce Survey; 5) the Awardee Survey about the TAC; and 6) the Training Program Comparison Group Survey.


Cost-Benefit Assessment

The project team will provide a coordinated set of communications to Awardees (Attachments 3(A-D)) regarding the Cost-Benefit Assessment. After data abstraction activities are completed, an Awardee training webinar will be conducted to introduce the Cost-Benefit Assessment and Awardee Cost Workbook (Attachment 3) and to provide the instructions needed to review and update the data. In addition, the project team will schedule and hold several office hours sessions to answer questions from Awardees and assist them in completing the Awardee Cost Workbook review and data input. Awardees will be asked to complete a Cost Workbook for each of their three award years.

Exhibit 3. Cost-Benefit Assessment Data Collection and Communications

Task | Attachment | Details | Timeline (2023 and 2024)
Overview and Introduction to the Awardee Cost Workbook (Email) | 3A | Introduce Cost-Benefit Assessment and Awardee Cost Workbook; inform Awardees of upcoming webinar/call to introduce both; mention two office hours sessions for the Awardee Cost Workbook. | Following Office of Management and Budget (OMB) approval (2024 for 2022 and 2023 data) and data abstraction activities (Fall 2024/Early Winter 2025 for 2024 data)
Awardee Cost Workbook Email Invitation | 3B | Provide pre-populated Awardee Cost Workbook for Awardee review, editing, and completion; remind Awardees of office hours sessions for support; include project email and telephone number for support. | Following training webinar
Cost-Benefit Assessment Email Reminder 1 | 3C | Remind Awardees of deadline to complete Awardee Cost Workbook; note that Awardee Cost Workbook Training recording is available; include project email and telephone number for support. | Every other week after providing Awardee Cost Workbooks
Awardee Cost Workbooks due from awardees | NA | | Two months after providing Awardee Cost Workbooks
Cost-Benefit Assessment Email Reminder 2 | 3D | Remind Awardees that Awardee Cost Workbook deadline has passed; note that Awardee Cost Workbook Training and office hours recordings are available; include project email and telephone number for support. | Every other week after providing Awardee Cost Workbooks to awardees
Review Awardee Cost Workbooks received | NA | Review for completeness and appropriateness of included data. | Within one month of receiving completed Awardee Cost Workbooks
Contact Awardees as needed | NA | Verify data as needed. | Within two months of receiving completed Awardee Cost Workbooks



Awardee Training and Services Report

The project team will email the Awardee Training and Services Report form to each Awardee; the form will be a pre-populated Excel document that includes available descriptive program data from Awardee reports. The project team will provide detailed instructions for completing the Report and hold office hours for Awardees to ask any clarifying questions. As shown in Exhibit 4, Awardees will be sent a reminder email after one week.



Exhibit 4. The Awardee Training and Services Report Email Communications


Task | Attachment | Details | Timeline (2023 and 2024)
The Awardee Training and Services Report Initial Invitation Email | 4A | Provide pre-populated Report to Awardees for review and completion; provide contact information for any questions. | After OMB approval (2023) and Fall 2024
The Awardee Training and Services Report Reminder Email 1 | 4B | Remind Awardees to review and edit pre-populated Report by deadline; provide contact information for any questions. | 1 week after Invitation Email was sent
The Awardee Training and Services Report Reminder Email 2 | 4C | Remind Awardees that Report is past due; provide contact information for any questions. | Sent two days past Report deadline



Awardee Interview

The project team will interview each Awardee annually—once in 2023 and again in 2024—using semi-structured interview guides that allow for the flexibility to ask follow-up questions (Attachment 5). The interviews will document implementation, clarify data collected in the Awardee Cost Workbooks, and provide context for assessment of quantitative data. Exhibits 5 and 6 summarize data collection plans and communications regarding scheduling and conducting the Awardee Interviews, to take place over an eight-week period after receipt and review of the Awardee Cost Workbook. For the Workforce Program Awardee Interview communications (Exhibit 6), each email is customized to the specific audience or role (that is, Project Manager/Director, Awardee Partner, and healthcare workforce member).

Exhibit 5. The Training Program Awardee Interview Email Communications

Task | Attachment | Details | Timeline (2023 and 2024)
Identify Awardee Interview participants | NA | Begin with an Awardee point of contact identified from administrative data. | 1-2 months prior to interviews
The Training Program Awardee Interview Invitation Email | 5A | Remind Project Manager/Director of evaluation; request to schedule interview and select an available date and time. | 2-4 weeks prior to interviews
The Training Program Awardee Interview Invitation Reminder Email | 5B | Remind Project Manager/Director to schedule interview and select an available date and time. | 1 week after Invitation Email was sent


Exhibit 6. The Workforce Program Awardee Interview Email Communications

Contact | Attachment | Audience/Role | Description | Timeline (2023 and 2024)
Identify Awardee Interview participants | NA | Project Manager/Director; Awardee Partner; health workforce member | Begin with an Awardee point of contact identified from administrative data. | 1-2 months prior to the interviews
The Workforce Program Awardee Interview Invitation Email | 6A | Project Manager/Director; Awardee Partner; health workforce member | Remind audience of evaluation; request to schedule interview and select an available date and time. | 2-4 weeks prior to the interviews
The Workforce Program Awardee Interview Invitation Reminder Email | 6B | Project Manager/Director; Awardee Partner; health workforce member | Remind audience to schedule interview and select an available date and time. | 1 week after Invitation Email was sent



Healthcare Workforce Survey

Data will be collected from respondents using a web-based Survey (Attachment 1) that respondents will access through a secure link. To ensure privacy of personally identifiable information (PII), Awardees will send out email invitations to complete the Survey. Awardees will have detailed instructions on how to contact their target program population throughout the data collection period. Outreach will be through email only, using customized email templates, with an initial Survey invitation email followed by approximately weekly email follow up (Attachments 1 (B-J)).

Awardees will use the Healthcare Workforce Fielding Tracker (Attachment 1A) to record all outreach conducted, including the number of emails sent, date of sent emails, and the number of undeliverable emails.

Exhibits 7 and 8 summarize the types and timing of Survey pre-notification and contact efforts. Outreach emails will introduce Awardees and respondents to the Survey, provide instructions on completing the Survey, include a secure link to the Survey, and include Survey support team contact information for questions and concerns and a link to frequently asked questions (FAQs) about the Survey.

Exhibit 7. Healthcare Workforce Survey Pre-Notification Efforts

Time Before Survey Launch | Title | From | Audience (To) | Description
4 weeks | Pre-Notification Email | HRSA | Awardees | Notify Awardees of upcoming survey and awardee responsibilities for email outreach; look out for instructions from NORC.
3 weeks | Instructions for Awardee Email Outreach | NORC | Awardees | Email detailed instruction guide for contacting program target population with invitation to complete Survey.
2 weeks | Pre-Notification Email (Attachment 1B) | Awardees | Healthcare workforce respondents (program target population) | Email introduction of upcoming Survey, importance of Survey, and where to find more information.
1-2 weeks | Healthcare Workforce Survey Flyer (Attachment 1J) | Awardees | Healthcare workforce respondents (program target population) | Post and discuss flyer informing healthcare workforce of upcoming Survey, importance of Survey, and where to find more information.
1 week | Reminder of Upcoming Email Outreach | NORC | Awardees | Email reminder of detailed instructions to contact target population to complete Survey.


Exhibit 8. Healthcare Workforce Survey Emails (from Awardees to Healthcare Workforce Respondents)

Email to Send | Title | Attachment | Description
Week 1 | Healthcare Workforce Survey Initial Email | 1C | Initial contact email asking respondents to complete the Survey to help address burnout in the workforce.
Week 2 | Healthcare Workforce Survey Reminder 1 | 1D | Reminder to complete Survey; sent approximately one week after Initial Email Invitation.
Week 3 | Healthcare Workforce Survey Reminder 2 | 1E | Reminder to complete Survey; sent approximately one week after Reminder Email 1.
Week 4 | Healthcare Workforce Survey Reminder 3 | 1F | Reminder to complete Survey; sent approximately one week after Reminder Email 2.
Week 5 | Healthcare Workforce Survey Last Chance 1 | 1G | Reminder to complete Survey and note that Survey is ending soon; sent approximately one week after Reminder Email 3.
Week 6 | Healthcare Workforce Survey Last Chance 2 | 1H | Reminder to complete Survey and note that Survey is ending soon; sent approximately one week after Last Chance Email 1.


Awardee Survey about the Technical Assistance Center (TAC)

Data will be collected from Awardee respondents using a web-based survey (Attachment 2) that respondents will access via a secure link. Outreach will be through email only, with an initial Survey invitation email followed by approximately weekly email follow-up contacts (Attachments 2 (A-H)).

Exhibit 9 summarizes the timing of Survey email contact efforts. Outreach will introduce awardees to the Survey, provide instructions on completing the Survey, include a secure link to the Survey, and include Survey support team contact information for questions and concerns as well as a link to FAQs about the Survey.

In addition, HRSA will send a pre-notification email to all Awardee contacts one to two weeks before Survey launch. The email will note the upcoming Survey administration and emphasize the importance of a prompt and thorough response.

Exhibit 9. Awardee Survey about the TAC Emails (to Awardee Contacts)

Email to Send | Title | Attachment | Description
Week 1 | Awardee Survey about the TAC Initial Email | 2A | Initial contact email asking awardees to complete the Survey to understand awardees' experience with the TA provided by the Workplace Change Collaborative (WCC).
Week 2 | Awardee Survey about the TAC Reminder 1 | 2B | Reminder to complete Survey; sent approximately one week after Initial Email Invitation.
Week 3 | Awardee Survey about the TAC Reminder 2 | 2C | Reminder to complete Survey; sent approximately one week after Reminder Email 1.
Week 4 | Awardee Survey about the TAC Reminder 3 | 2D | Reminder to complete Survey; sent approximately one week after Reminder Email 2.
Week 5 | Awardee Survey about the TAC Last Chance 1 | 2F | Reminder to complete Survey and note that Survey is ending soon; sent approximately one week after Reminder Email 3.
Week 6 | Awardee Survey about the TAC Last Chance 2 | 2G | Reminder to complete Survey and note that Survey is ending soon; sent approximately one week after Last Chance Email 1.


In addition, Awardees can request the Survey questions by emailing or calling the Survey support team for assistance. If Awardees complete a hard copy version of the Survey, they may either enter their responses into the online Survey or fax the completed hard copy to the Survey support team.



The Training Program Comparison Group Survey

For the Training Program Comparison Group Survey, respondents will access a web-based Screener (Attachment 1) through a secure link. The Screener will confirm eligibility for the Survey: that respondents work or study in one of the approved healthcare fields and that they do not work at an Awardee organization.

AmeriSpeak panelists who prefer taking a web-based Survey will be emailed throughout the field period with varied messages to encourage responses. Phone-preferred respondents are emailed if an email address is available and are called several times, with messages left when possible. These respondents may call back and reach a live interviewer at a time convenient for them. Phone-preferred respondents can also schedule a callback. See Attachment 1 (K-P) for the telephone script. Further, AmeriSpeak can send Short Message Service (SMS) texts during the field period to phone- or web-preferred respondents who have provided their mobile numbers and permission to receive such text messages (Attachment 1 (K-P)). All AmeriSpeak panelists receive an incentive in the form of AmeriPoints that can be exchanged for money or gift cards. For this study, AmeriPoints equivalent to $3 will be offered.


SHG panelists will be invited to complete the Survey online. Survey invitations are sent primarily through email but may also be extended by phone or postal mail as needed (Attachment 1 (K-L)). Invitations are personalized, and reminders are sent three to six times depending on response rates. All SHG panelists receive an incentive payment of $45-$55 for their participation.


Weighting of Collected Data

AmeriSpeak Panel Weighting


AmeriSpeak panel weights—including both household-level and person-level weights—are developed to account for the probability of selection of the housing unit, adjustments for unknown eligibility of the housing unit, nonresponse associated with panel recruitment, panel attrition, and nonresponse from secondary panel members (other eligible adults in the same household), as well as to include raking ratio adjustments to external population benchmarks. Specifically, the weighting steps for panel weights are as follows:

  • Computation of base weights

  • Adjustment for unknown eligibility

  • Adjustment for household nonresponse

  • Adjustment to household population totals to yield the final household-level panel weight

  • Initial person-level weight

  • Adjustment for nonresponse associated with panel members

  • Raking ratio adjustment to person-level population totals to yield the final person-level panel weight


Base Weights. AmeriSpeak annual recruitments use a stratified random sample of housing units selected from the NORC National Frame, as well as address-based sampling (ABS) frames. Initial base weights are calculated as the inverse of the probability of selection of the housing units for the combined samples, currently including samples selected from 2014 to 2022. In most years, nonrespondent households to panel recruitment are subsampled for NRFU. The subsampled housing units have their initial base weights adjusted to account for NRFU subsampling. The inverse of the probability of selection, combined with the subsampling adjustment, generates the base weight.
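As a simplified illustration of this step, the sketch below computes base weights as inverse selection probabilities and applies an NRFU subsampling adjustment; all values are hypothetical.

```python
# Simplified base-weight construction with an NRFU subsampling adjustment.
import pandas as pd

frame = pd.DataFrame({
    "selection_prob": [0.001, 0.001, 0.002, 0.002],   # hypothetical
    "initial_nonrespondent": [False, True, True, False],
    "subsampled_for_nrfu": [False, True, False, False],
})
nrfu_rate = 0.2  # roughly one in five initial nonrespondents subsampled

# Base weight: inverse of the housing unit's selection probability.
frame["base_weight"] = 1.0 / frame["selection_prob"]

# NRFU cases also carry the inverse of the subsampling rate.
nrfu = frame["subsampled_for_nrfu"]
frame.loc[nrfu, "base_weight"] /= nrfu_rate

# Initial nonrespondents not subsampled for NRFU drop out of recruitment.
frame.loc[frame["initial_nonrespondent"] & ~nrfu, "base_weight"] = 0.0
print(frame)
```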


Adjustment for Unknown Eligibility. AmeriSpeak uses a weighting class approach to adjust the base weights for housing units with known eligibility to account for housing units with unknown eligibility. To create the adjustment cells under the weighting class approach, we use sample design variables such as sampling strata, recruitment year, and tract-level information of household characteristics obtained from the 5-year American Community Survey (ACS) and Tract-Level Planning Database. Within each adjustment cell, base weights for housing units with known eligibility are adjusted to represent all housing units.


Household Nonresponse Adjustment. This adjustment compensates for eligible households that did not complete the recruitment Survey. Furthermore, panel attrition could result in some household members being withdrawn from the panel. For purposes of weighting, if no other adult in the household belongs to the panel after an adult is withdrawn, we consider the household a nonrespondent household. AmeriSpeak uses a weighting class approach to adjust the weights from the previous step for household nonresponse. The adjustment cells under the weighting class approach are created in the same way as described earlier. Within each adjustment cell, weights from the previous step for eligible respondent households are adjusted to represent all eligible households.


Adjustment to Household Population Control Totals. The final household level weight is developed by applying a ratio adjustment. For each Census division, the weights after the household nonresponse adjustment are adjusted such that the sum of the weight equals the total number of households in the division based on the most recent Current Population Survey (CPS) data.


Person-Level Nonresponse Adjustment. The primary panel member identifies and provides contact information for other eligible adults in the same household; subsequently, the eligible adults from the same household are contacted and asked to complete the recruitment Survey. The adjustment compensates for nonresponse due to the following:

  • No contact information is available for eligible adults in the same household as the primary panel member contacted for recruitment

  • Recruitment Survey not completed by eligible adults in the same household as the primary panel member

  • Withdrawal from the panel when at least one other adult in the same household continued to be an active panel member


A weighting class approach is used to adjust the weights from the previous step for eligible respondents to account for eligible nonrespondents. In addition to the household-level variables used earlier, age group and sex are also used to support the person-level nonresponse adjustment.


Raking Adjustment to Derive Final Person-Level Panel Weights. The last step in deriving person-level weights for the panel is the raking adjustment to person-level population totals obtained from the CPS, ACS, and National Health Interview Survey (NHIS). The following person-level characteristics are used in this raking adjustment: age group, sex, Census division, education, race/ethnicity, housing tenure, and household phone status.


Weighting for the Training Program Comparison Group Survey. The base weight for the Training Program Comparison Group Survey will be the AmeriSpeak final person-level panel weight, since all healthcare workers in the panel will be selected for the survey. These weights will then be adjusted for nonresponse. Because not all eligible sampled members complete the main Training Program Comparison Group Survey, an adjustment is needed to account for eligible nonrespondents. A weighting class approach will be used to adjust the screener nonresponse-adjusted weights for eligible respondents to account for eligible nonrespondents. To create the adjustment cells for the weighting class approach, we use household-level and person-level information collected during panel recruitment, such as age group, sex, education, and race/ethnicity. After the nonresponse adjustment, a raking adjustment will be made so that the sum of the weights matches known control totals, specifically employment counts by profession obtained from the Bureau of Labor Statistics Occupational Employment and Wage Statistics.
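The weighting-class adjustment described here can be illustrated with a minimal sketch; the cells, weights, and response indicators below are hypothetical.

```python
# Weighting-class nonresponse adjustment: within each adjustment cell,
# respondents' weights are scaled up so that they represent all eligible
# sampled members in the cell.
import pandas as pd

sample = pd.DataFrame({
    "cell": ["A", "A", "A", "B", "B"],       # e.g., age group x education cells
    "weight": [10.0, 12.0, 8.0, 20.0, 20.0],
    "responded": [True, False, True, True, False],
})

cell_totals = sample.groupby("cell")["weight"].sum()
resp_totals = sample[sample["responded"]].groupby("cell")["weight"].sum()
adjustment = cell_totals / resp_totals       # >= 1 within each cell

resp = sample["responded"]
sample["nr_adjusted_weight"] = 0.0
sample.loc[resp, "nr_adjusted_weight"] = (
    sample.loc[resp, "weight"] * sample.loc[resp, "cell"].map(adjustment)
)
print(sample)  # weighted respondent totals now equal each cell's full total
```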


Probability/Nonprobability Weighting. We will use NORC's TrueNorth methodology to combine the probability sample from AmeriSpeak and the non-probability SHG sample. Unlike with the AmeriSpeak panel, selection probabilities cannot be assigned to SHG panel respondents; for this reason, base weights are not readily available. We will impute weights for the nonprobability SHG panel using TrueNorth, as follows (a simplified sketch appears after the list):

  • The weighting process uses statistical matching of the nonprobability sample to the probability sample. Every nonprobability record will match to one and only one probability sample record according to a distance metric using variables known to be associated with propensity to be in the nonprobability SHG sample.

  • Among the records that matched, a logistic regression model is fitted to predict the propensity for each case to be in the nonprobability SHG sample. The inverse of the response propensity forms the imputed weight for the nonprobability SHG panelists. The probability panelists maintain their original non-response adjusted weight throughout the weighting process.

  • The nonprobability weights are calibrated to estimates of key target populations based on the probability sample that matched. This helps align the nonprobability weights to the portion of the population that they are believed to cover, based on similarity of the key matching variables.

  • The full combined sample is calibrated to benchmarks such as the employment counts by profession from the Bureau of Labor Statistics Occupational Employment and Wage Statistics used in the AmeriSpeak weighting.
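The sketch below illustrates the matching-and-propensity idea in highly simplified form. The covariates and samples are simulated, and the production TrueNorth methodology is more elaborate (for example, one-to-one matching without replacement and calibration of the imputed weights to population benchmarks).

```python
# Simplified matching-and-propensity weighting for a nonprobability sample.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X_prob = rng.normal(size=(800, 3))               # probability (AmeriSpeak-like) sample
X_nonprob = rng.normal(0.3, 1.0, size=(700, 3))  # nonprobability (SHG-like) sample

# Step 1: match each nonprobability record to its closest probability record.
nn = NearestNeighbors(n_neighbors=1).fit(X_prob)
_, idx = nn.kneighbors(X_nonprob)
matched_prob = X_prob[idx.ravel()]

# Step 2: among matched records, model the propensity to be nonprobability.
X = np.vstack([matched_prob, X_nonprob])
y = np.r_[np.zeros(len(matched_prob)), np.ones(len(X_nonprob))]
propensity = LogisticRegression().fit(X, y).predict_proba(X_nonprob)[:, 1]

# Step 3: the inverse propensity serves as the imputed nonprobability weight
# (prior to calibration); probability cases keep their existing weights.
imputed_weight = 1.0 / propensity
print(imputed_weight[:5])
```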

  3. Methods to Maximize Response Rates and Deal with Nonresponse


For the provider resiliency evaluation, several data collection issues present a challenge to achieving high response rates, including a data collection strategy that does not allow the evaluator to administer the survey or track responses directly, and a condensed timeline. Cognitive testing (with fewer than nine respondents) revealed additional challenges related to:

  • Demanding schedules for all respondents

  • Awardee IRB limitations and concerns about sharing healthcare workforce PII that may impede reaching potential respondents

  • Concerns about privacy that may make the healthcare workforce less willing to participate

  • Perceptions by awardees that the healthcare workforce already faces work-related burdens and evaluation-related survey fatigue

  • Awardee-reported burden from various regulatory reporting requirements, site visits and inspections, and other government-supported data collection efforts

The evaluation’s approach to maximizing response rates will emphasize increasing the perceived benefits of participation. Clarity in all communication about privacy should further reduce the perceived costs of participation. As we field both the Healthcare Workforce Survey and the Awardee Survey about the TAC, multiple strategies will be used to help maximize response rates.


Maximizing Response Rates

Outreach strategies to maximize response rates will include:

  • Publicizing the survey prior to the start of data collection; see Exhibit 7 for an example of such pre-notification efforts.

  • Creating contact materials designed to foster a successful first encounter with each respondent by communicating the importance of the evaluation for the different respondent types and anticipating concerns likely to prevent participation.

  • Sending regular reminders/prompts during the data collection period, with varied text to keep up interest.

  • Varying the day of the week and time of day of reminders to maximize the possibility of reaching respondents.

  • Avoiding sending survey requests or reminders during specific times that are historically more difficult to reach respondents (for example, after 4 p.m. local time on a Friday).

  • Applying known best practices to avoid employer email filters where possible to ensure survey emails reach the intended respondents.

  • Designing the survey questionnaire with respondent burden in mind, including optimization for ease of completion on mobile devices, tablets, and/or desktop computers.

  • Addressing potential respondent concerns about privacy and confidentiality through clear communication in survey outreach materials, the availability of a survey helpdesk team to answer questions, and access to a web page of frequently asked questions.

  • Providing survey helpdesk email and/or telephone support to respondents who have questions or who encounter technical issues.


Additional strategies to maximize response rates include the following:


Questionnaire design and mode. To facilitate cooperation and reduce item nonresponse, we prioritized creating logical, clear questionnaires with concrete question wording, closed-ended response choices, simple grammar, and questions grouped according to subject areas. In addition, the web-based Voxco/Qualtrics survey platforms will make it easy for respondents to participate. We will pre-populate fields where relevant to allow respondents to complete the survey more efficiently and to skip questions that do not pertain to them. The web-based platform will also allow Awardee respondents to save and continue work, making survey completion more convenient.

Media platforms. During each year of data collection, HRSA will use its existing channels of communication to inform respondents about the Surveys and to promote the importance and value of the Surveys. For the communications, we will use OMB-approved respondent materials text (for example, the Healthcare Workforce flyer, Attachment 1 (B-J)) in HRSA newsletters and other announcements from HRSA. In addition, the survey team will create webpages for the evaluation surveys to provide background on each Survey, to post FAQs, and to promote participation.


Respondent support (or TA). We will provide contact information for the survey support team should respondents have questions or concerns, as well as contact information for our Institutional Review Board (IRB) should respondents have concerns about their rights as a study participant. Respondents will be provided a toll-free number to speak directly with staff trained to assist survey respondents and to be as responsive as possible in addressing concerns.


Addressing Nonresponse

For the Healthcare Workforce Survey and the Awardee Survey about the TAC, respondents will receive a weekly reminder email during the first month of data collection. After the third reminder email, respondents will receive a “last chance” email noting that the Survey will close soon, followed approximately one week later by a final last chance email. We may also ask HRSA awardees to prompt HRSA sites and trainees to complete their Surveys using information provided in the Healthcare Workforce flyer (Attachment 1 (B-J)). The Surveys will be closed approximately six to eight weeks after opening.

We will consider the need for post-stratification weighting of Awardee sites. First, we will determine which strata are relevant for comparisons, such as facility type. Based on the responses received as of the Survey closings, we will assess whether any group was disproportionately under-represented. As needed, we would use calibration techniques to create weights that make responding sites and participants representative of the total population surveyed. The technique involves selecting a set of variables whose population distribution is known and then adjusting each respondent's weight iteratively until the weighted respondent distribution aligns with the total population for those variables. Calibration weighting techniques can reduce both the variance and the bias in final survey estimates.
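As an illustration of this iterative calibration, the following sketch rakes a set of unit weights to two known margins; the margins and records are hypothetical.

```python
# Raking (iterative proportional fitting): adjust weights margin by margin
# until the weighted distribution matches known control totals.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "sex": ["F", "F", "M", "M", "F", "M"],
    "age": ["<40", "40+", "<40", "40+", "40+", "<40"],
    "weight": np.ones(6),
})
margins = {
    "sex": {"F": 520.0, "M": 480.0},
    "age": {"<40": 550.0, "40+": 450.0},
}

for _ in range(50):  # iterate until weighted margins converge to benchmarks
    for var, targets in margins.items():
        sums = df.groupby(var)["weight"].sum()
        df["weight"] *= df[var].map(lambda k: targets[k] / sums[k])

print(df.groupby("sex")["weight"].sum())  # matches the sex margin
print(df.groupby("age")["weight"].sum())  # matches the age margin
```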

Nonresponse Analysis. We will use secondary administrative data (for example, Annual Performance Reports, awardee applications, Progress Reports, Final Reports) and public data (for example, HRSA’s Area Health Resources File and the Centers for Disease Control and Prevention (CDC)’s Social Vulnerability Index), as well as target population demographic and discipline data from the completed Healthcare Workforce Fielding Tracker to conduct a nonresponse analysis. This will assess potential differences in sociodemographic and training background characteristics between those who respond and those who do not.
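A nonresponse analysis of this kind can be illustrated with a simple goodness-of-fit comparison; the discipline categories and counts below are hypothetical, standing in for data from the Fielding Tracker.

```python
# Illustrative nonresponse check: compare respondents' discipline mix with
# the fielded population using a chi-square goodness-of-fit test.
from scipy.stats import chisquare

fielded = {"Nursing": 9_000, "Medicine": 6_000, "Behavioral health": 3_000}
responded = {"Nursing": 2_400, "Medicine": 2_100, "Behavioral health": 600}

total_resp = sum(responded.values())
total_fielded = sum(fielded.values())
expected = [total_resp * fielded[k] / total_fielded for k in fielded]

stat, p = chisquare([responded[k] for k in fielded], f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {p:.3g}")  # a small p flags differential response
```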

The Training Program Comparison Group Survey (AmeriSpeak Panel): Maximizing Response and Addressing Nonresponse

AmeriSpeak strives to attain sufficient sample sizes to support analyses having substantial statistical power for adults aged 18 and over residing in the United States (50 states plus the District of Columbia). AmeriSpeak panel recruitment occurs annually, and the panel size as of August 2022 is 54,001 panel members aged 13 and over residing in more than 43,000 households.


AmeriSpeak panel recruitment is a two-stage process: 1) initial recruitment using U.S. Postal Service (USPS) mailings, telephone contact, and modest incentives; and 2) a more elaborate NRFU recruitment using FedEx mailings, enhanced incentives, and in-person (face-to-face) visits by NORC field interviewers.


  • For the initial recruitment, sample households are invited to join AmeriSpeak online by visiting the panel member web portal (AmeriSpeak.org) or by calling a toll-free telephone line (inbound/outbound-supported). The initial recruitment data collection protocol features the following: an over-sized pre-notification postcard, a USPS recruitment package in a 9”x12” envelope (containing a cover letter, a summary of the privacy policy, FAQs, and a study brochure), two follow-up postcards, and contact by NORC’s telephone research center for sample units with a matched telephone number.


  • For the second stage of NRFU recruitment, a stratified random sample is selected from the nonrespondents of the initial recruitment. Units sampled for NRFU are sent a new recruitment package by FedEx with an enhanced incentive offer. Shortly thereafter, NORC field interviewers make personal, face-to-face visits to the pending cases to encourage participation. Once the households are located, the field interviewers administer the initial AmeriSpeak recruitment survey in-person using computer-assisted personal interviews (CAPI), or else they encourage the respondents to register online or by telephone. A sample household is considered recruited if at least one adult in the household joins the panel. The weighted household response rate (AAPOR RR3) is about 6% for initial recruitment and 28% for NRFU recruitment. We report two recruitment response rates: one for all the panel recruitment years (2014-2021) and one for the recruitment years with NRFU (2014-2018 and 2021). For all recruitment years, the cumulative weighted household response rate is 21.9%; for recruitment years with NRFU, the cumulative weighted household response rate is 34%.


The Training Program Comparison Group Survey Response. For individual client surveys using AmeriSpeak, the all-in, cumulative American Association for Public Opinion Research (AAPOR) Response Rate 3 (RR3)—estimating “what proportion of cases of unknown eligibility is actually eligible” (AAPOR, 2016)—is between 10% and 20%. Variation reflects study-specific parameters such as target population, survey length, time in the field, cross-sectional versus longitudinal designs, salience of subject, incentive amount, and level of effort for refusal conversion. For AmeriSpeak, RR3 considers the panel recruitment rate, panel retention rate, and survey participation rate.
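For reference, AAPOR RR3 can be computed from final case dispositions as in the sketch below; the disposition counts shown are hypothetical.

```python
# AAPOR Response Rate 3 (RR3): complete interviews divided by
# (interviews + partials) + (refusals + noncontacts + others) + e * unknowns,
# where e estimates the share of unknown-eligibility cases that is eligible.
def aapor_rr3(complete, partial, refusal, noncontact, other,
              unknown_household, unknown_other, e):
    known_eligible = complete + partial + refusal + noncontact + other
    return complete / (known_eligible + e * (unknown_household + unknown_other))

rate = aapor_rr3(complete=300, partial=20, refusal=150, noncontact=400,
                 other=30, unknown_household=500, unknown_other=100, e=0.4)
print(f"RR3 = {rate:.1%}")
```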


  4. Tests of Procedures or Methods to be Undertaken

The survey instruments were developed with input from HRSA staff, reviewed by NORC survey methodologists and subject matter experts, and edited by copy editors. The project team conducted cognitive testing as well as a focus group session in March 2023. The same question was asked of no more than nine people across the pilot test and focus group.

For cognitive testing of the Healthcare Workforce Survey and the Training Program Comparison Group Survey, five respondents were recruited through NORC’s networks, with the goal of testing with individuals external to the current grant programs but familiar with issues affecting healthcare workers. Between March 21 and 29, 2023, the project team completed five interviews—two with students and three with mid-to-senior career individuals.


Pilot testers were asked to comment on survey length, clarity of instructions and questions, and whether response categories were comprehensive and coherent. Based on their feedback, content was modified as needed to clarify terminology, streamline questions, and refine response options. The interviewer used concurrent probing to ask interviewees to define key terms (such as burnout, resiliency, activities, programs, training), to suggest response options that were not included, or to consider ways to reduce complexity. Finally, the interviewer used retrospective probing and wrap-up questions to identify any key topics not yet captured in the survey, including ways to improve recruitment messaging.


Recruitment for the focus group was done in collaboration with the HRSA Project Officers (POs), who identified representatives from nine awardees. The project team sent email invitations to all representatives provided by the POs. Ultimately, six organizations were represented (one person per organization). The project team conducted a focus group with awardees to solicit feedback on strategies to field the Healthcare Workforce Survey and the Training Program Comparison Group Survey and on the instruments planned for awardee data collection.


The focus group began with introductions (including asking for names and affiliations), presentation of consent language, and information about the requested cadence for the discussion. The group then reviewed specific components of proposed instruments, including the Awardee Training and Services Report and key considerations related to administration of the Healthcare Workforce Survey, the Awardee Survey about the TAC, the Awardee Interview protocols, and the Healthcare Workforce Fielding Tracker. The focus group concluded with solicitation of concerns and suggestions around administration and ways the project team could best support awardees in these efforts.


  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The NORC at the University of Chicago evaluation team will conduct the data collection and analysis. The evaluation team can be contacted at [email protected] and (301) 634-9339.





References

American Association for Public Opinion Research. (2016). Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys (9th ed.). AAPOR. https://aapor.org/wp-content/uploads/2022/11/Standard-Definitions20169theditionfinal.pdf

Cho, Y. M., Johnson, T. P., & VanGeest, J. B. (2013). Enhancing surveys of health care professionals. Evaluation & the Health Professions, 36(3), 382–407. https://doi.org/10.1177/0163278713496425

de Beaumont Foundation. (2023, March 30). PH WINS 2021: Key findings and data dashboard. https://debeaumont.org/phwins/2021-findings/

NORC at the University of Chicago. (2022). A guide for seeking OMB clearance in studies using AmeriSpeak. https://amerispeak.norc.org/content/dam/amerispeak/research/pdf/AmeriSpeak%20Guide%20for%20Obtaining%20OMB%20Approval.pdf

NORC at the University of Chicago. (2021). TrueNorth: An advanced calibration tool for combining probability and nonprobability samples. https://amerispeak.norc.org/content/dam/amerispeak/research/pdf/NORC%20-%20White%20Paper%20%20TrueNorth%20Calibration%20tool%20for%20probability%20and%20nonprobability%20samples%20-%20March%202020.pdf

1 This is based on a power analysis for a two-tailed test with the following parameters: beta of 0.2 (power of 0.8), alpha of 0.05, a population retention rate of 30%, and a design effect of 2.5.
