Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
National and Tribal Evaluation of the 2nd Generation of the Health Profession Opportunity Grants
OMB Information Collection Request
0970-0462
Supporting Statement
Part B
Revised October 2021
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officers:
Nicole Constance
Lisa Zingman
Amelia Popham
Part B
B1. Objectives
Study Objectives
As described further in Supporting Statement A, the U.S. Department of Health and Human Services (HHS), Administration for Children and Families (ACF) awarded grants to 32 organizations to administer the second generation of the Health Profession Opportunity Grants (HPOG 2.0) Program. The Program provides healthcare occupational training for Temporary Assistance for Needy Families (TANF) recipients and other low-income people. OMB has approved various data collection activities in support of the HPOG 2.0 National and Tribal Evaluations under OMB #0970-0462 (see Supporting Statement A, Study Design, for a summary of the previous approvals).
As noted in Supporting Statement A, ACF is ready to begin reengagement activities with study participants in anticipation of conducting the HPOG 2.0 long-term follow-up study. ACF is also interested in fielding a short-term follow-up survey of HPOG 2.0 participants enrolled after the onset of the COVID-19 pandemic. These two efforts are the focus of this information request.
The goal of the long-term follow-up survey (fielded five and a half years after randomization with the same cohort of participants selected for the ongoing intermediate follow-up survey) is to measure the outcomes of interest over the longer term. To maximize response rates to that long-term follow-up survey, this request seeks approval for some minor changes in the procedures for collecting contact updates from the participants selected for the long-term follow-up survey data collection using previously approved Instrument #5b (see Section B4 for more detail).
The goal of the COVID-19 Cohort Short-term Survey (STS) is to capture participant experiences in the 15 months following randomization. The evaluation contractor already administered a similar follow-up survey to participants who enrolled in portions of 2017 and 2018. This new follow-up survey will allow ACF’s Office of Planning, Research, and Evaluation (OPRE) to measure the robustness of HPOG 2.0 impacts. The primary research question is whether the pandemic dampened or intensified the impact of HPOG 2.0 on participants who enrolled after its onset, relative to those who enrolled pre-pandemic.
Much of this Supporting Statement B pertains to the statistical implications of the COVID-19 Cohort Short-term Survey, but Section B4 also discusses the requested changes to the procedures for tracking the long-term follow-up survey sample; those tracking activities were approved by OMB in July 2021. Additional details specific to the long-term follow-up survey will be included in a future information collection request seeking approval for the actual long-term follow-up survey and supporting materials.
ACF seeks approval to make non-substantive changes to Instrument 12a, the HPOG 2.0 COVID-19 Cohort Short-term Survey (COVID-19 Cohort STS), and the corresponding advance letter (Attachment K-Revised), both of which were approved in July 2021 under OMB #0970-0462. The requested changes, justified in more detail in Supporting Statement A and the attached Nonsubstantive Change Request Memo (OMB#0970-0462_NonSubstantiveChangeRequest_October 2021), are to:
Delete one sentence from the advance letter (Attachment K-Revised);
Make minor wording changes to one question (I4b) and the introduction in Instrument 12a, the COVID-19 Cohort STS; and
Add a new version of Instrument 12a that is shorter to administer and includes only the items most critical to the study (Instrument 12b, COVID-19 Cohort Short-term Survey, Critical Items).
Generalizability of Results
This randomized study is intended to produce internally valid estimates of the intervention’s causal impact, not to support statistical generalization to other sites or service populations. ACF is pursuing this COVID-19 Cohort STS to take advantage of a unique opportunity: understanding how the experiences of, and program impacts on, those who enrolled in the HPOG 2.0 Program after the onset of the pandemic differ from those who enrolled pre-pandemic. We are interested both in the pre- and post-COVID comparisons and in understanding the post-COVID impacts for their own sake. The results will formally apply to those randomly assigned by the HPOG 2.0 grantees during the pandemic. While the HPOG population is not formally representative of a broader population, it is similar to the populations served by other federal job training programs (e.g., WIOA). As such, these results should be informative about how the levels and impacts of job training programs more broadly evolved through the COVID-19 pandemic. They are also likely to be informative about how participant experiences differed as programs under extreme stress adapted to the rapid changes the pandemic forced upon training institutions and the healthcare industry.
Appropriateness of Study Design and Methods for Planned Uses
This COVID-19 Cohort STS is the best way to help ACF understand how the experiences of, and program impacts on, those who enrolled in the HPOG 2.0 Program after the onset of the pandemic differ from those who enrolled pre-pandemic. As survey research has often found, even small wording changes or alterations in question order can substantially change mean responses.1 Given that the primary purpose of the COVID-19 Cohort Survey is to measure changes in outcomes and impacts, it is vital that the instrument be changed as little as possible and that any new questions be added at the end of the instrument. Accordingly, the evaluation contractor has recommended that the instrument and study protocols for the 15-month follow-up of pandemic-era enrollees be very similar to the instrument used at the same follow-up interval for those enrolled in 2017-2018. This will reduce the threat that differences in impacts across the two eras are caused by methodological differences rather than the pandemic. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information. However, since documenting changes in treatment is an objective of the study, we have added a supplemental module at the end of the interview to understand the effects of COVID-19. We included this module because we currently have very limited knowledge about these changes.
B2. Methods and Design
Target Population
Grantees began enrolling participants in 2016, and enrollment will continue through the end of the grant period in September 2021; over 54,000 participants had enrolled through April 2021. The target population for the COVID-19 Cohort STS is the universe of applicants to the non-tribal HPOG 2.0 programs who were randomly assigned between May 2020 and the end of the grant period (September 2021). Based on HPOG programs’ enrollment projections, OPRE estimates the sample will be approximately 7,500. This non-substantive change request does not require any changes to the respondent universe or the sampling methods.
Sampling
The target population will not be subsampled. A large sample size is required to detect changes in impacts across the two time periods. Exhibit B-1 summarizes the minimum detectable differential impacts (MDDIs) for critical outcomes obtainable only through a survey (employment in healthcare, receipt of means-tested benefits, educational progress, and healthcare credentials). The MDDI calculations assume a 3-to-1 randomization ratio, no subsampling of the anticipated 7,500 enrollees in the target population, and a 70 percent response rate.2
Exhibit B-1: Minimum Detectable Differential Impact of HPOG 2.0 between the Pre-Pandemic and Pandemic Eras
Healthcare employment at survey | Means-tested benefits at survey | Education progress at Q5 | Healthcare credential at Q5
4.7 pp | 3.6 pp | 11.4 pp | 12.5 pp
Notes: “Minimum detectable” is defined as 80 percent power given a two-sided test at the 5 percent significance level. Other assumptions include a target population size of 6,400, a response rate of 70 percent, and a 3:1 randomization ratio (3 enrollees assigned to the treatment condition for every enrollee assigned to the control condition).
Given that these MDDIs are rather large, OPRE intends to include the largest available sample, with no subsampling. Even if differential impacts of this size are not detected, the survey will still support the second key research question, which focuses on HPOG 2.0 impacts after the onset of the pandemic.
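To illustrate the power calculation underlying Exhibit B-1, the sketch below computes an MDDI under simplified assumptions: simple two-proportion variances, no design effects or covariate adjustment, a base rate of p = 0.5, and an illustrative pre-pandemic cohort equal to the 12,923 participants eligible for the original short-term survey (see Section B5). The function names and these assumptions are illustrative choices, not the evaluation's actual specification, so the output will not exactly reproduce the exhibit's outcome-specific values.

```python
from scipy.stats import norm

def impact_se(n, response_rate=0.70, ratio=3.0, p=0.5):
    """SE of an impact estimate on a binary outcome with base rate p for one
    enrollment cohort of size n, given a ratio:1 treatment-to-control split."""
    n_resp = n * response_rate            # expected respondents
    n_t = n_resp * ratio / (ratio + 1)    # treatment-group respondents
    n_c = n_resp / (ratio + 1)            # control-group respondents
    return (p * (1 - p) * (1 / n_t + 1 / n_c)) ** 0.5

def mddi(n_pandemic, n_pre, alpha=0.05, power=0.80, **kwargs):
    """Minimum detectable differential impact: the smallest difference between
    two independent cohorts' impacts detectable with the stated power."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    se_diff = (impact_se(n_pandemic, **kwargs) ** 2
               + impact_se(n_pre, **kwargs) ** 2) ** 0.5
    return z * se_diff

# Pandemic-era cohort of 7,500; pre-pandemic short-term survey cohort of 12,923.
print(f"MDDI at p = 0.5: {mddi(7500, 12923):.3f}")  # about 0.056, i.e., 5.6 pp
```

Because the pandemic-era cohort is the smaller of the two, its sample size dominates the differential-impact standard error; this is why no subsampling of the COVID-19 cohort is planned.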
Key Research Questions for COVID-19 Cohort STS:
Are the impacts of HPOG 2.0 different for people who started training after the onset of the pandemic than for people who trained before the outbreak?
What are the post-COVID impacts on participants during this unique point in our economic history?
B3. Design of Data Collection Instruments
Development of Data Collection Instrument
The COVID-19 Cohort STS (Instrument #12a) was developed by making minimal modifications to the HPOG 2.0 Short-term Follow-up Survey (previously approved Instrument #12) used for people randomized between March 2017 and February 2018. Using the HPOG 2.0 Short-term Follow-up Survey (Instrument #12) as the base for the new survey helps to minimize contextual effects that could complicate comparisons with the pre-pandemic cohort. Compared to Instrument 12, Instrument 12a includes two key categories of changes. First, we dropped some questions that are not applicable to the COVID-19 Cohort STS, such as questions that were included for the cost-benefit analysis. Second, Instrument 12a includes a new module to capture information specifically about the impact of the COVID-19 pandemic on employment and financial well-being. See Supporting Statement A for more detail on the research questions and instrument content, and Attachment L-Revised, COVID-19 Cohort Short-Term Survey Sources, for a listing of the sources used for each question in the COVID-19 Cohort STS.
This submission seeks approval for a new, shorter version of the COVID-19 Cohort STS (20 minutes as opposed to 60 minutes). It is intended to serve as a tool to maximize response rates by offering participants who would otherwise likely become final refusals the opportunity to complete a shorter version of the instrument. As noted in the attached Nonsubstantive Change Request Memo (OMB#0970-0462_NonSubstantiveChangeRequest_October 2021) and Supporting Statement A, OMB approved the use of a critical items instrument for the Intermediate Follow-up Survey (Instrument 18a). The results from that data collection effort show that the shorter instrument was very successful: it increased the overall survey response rate for the Intermediate Follow-up Survey by 9.2 percentage points and reduced the differential in response rates between the treatment and control groups from 7.0 percentage points to 3.6 percentage points.
B4. Collection of Data and Quality Control
The COVID-19 Cohort STS will follow the same procedures approved and implemented for the previously approved Short-term Follow-up Survey (Instrument #12 under this OMB number, initially approved in June 2018). The survey will be programmed and administered using Confirmit Computer Aided Personal Interviewing (CAPI) technology. Trained interviewers will use tablets equipped with the Confirmit software to conduct the surveys. The evaluation contractor will send an advance letter to all participants selected for the COVID-19 Cohort STS data collection effort. This submission requests approval for non-substantive changes to the data collection instrument and procedures. The changes in procedures pertaining to the COVID-19 Cohort STS (Instruments 12a and 12b, and Attachment K-Revised) are summarized here.
This information collection request also seeks approval of two procedural changes related to the previously approved contact update requests (Instrument 5b, initially approved in June 2017). The first change is to include a newsletter (Attachment AA) with one of the contact update request mailings; the second is to conduct one round of contact update requests by making outbound telephone calls to participants rather than reaching out by mail first and allowing the participant to respond by mail, phone, or online (Instrument 5b Contact Update Form Phone Version). Neither of these requested procedural changes involves changes to the previously approved burden for Instrument 5b. The requested changes, which are necessary to support the HPOG 2.0 Long-term Survey, are described further below. Additional details specific to the long-term follow-up survey data collection procedures and quality control will be included in a future information collection request for approval of the actual long-term follow-up survey and supporting materials. These changes were approved in July 2021; ACF is not seeking any further revisions to these instruments as part of this non-substantive change request.
Participant Contact Update Request Procedures
The participant contact update form (Instrument 5b) is self-administered. As of this request for clearance, contact update requests are sent to two groups of participants: participants enrolled between September 2017 and January 2018 (those who are in the Intermediate Follow-up Survey sample and will be included in the forthcoming Long-term Follow-up sample), and participants enrolling between May 2020 and September 2021 (those who will be in the COVID-19 Cohort STS sample). The form is mailed to sample members quarterly, beginning three months after random assignment. Participants are encouraged to update their information by returning the form by mail, through a secure online portal, or by calling the evaluation contractor’s toll-free number. Participants can indicate that the information is correct, or they can make any necessary changes to their contact information. The contact update requests improve the quality of the contact information in our records by allowing participants to update addresses, add apartment numbers, correct zip codes, and update phone numbers, all of which improves the accuracy of outreach efforts.
As part of this information collection request, ACF proposes to replace one round of mailed contact update requests with updates conducted by phone. Local interviewers working under the evaluation contractor will make outbound calls to conduct a short check-in with study participants eight months prior to the start of the long-term follow-up survey. This call allows the interviewers to build upon the rapport with participants established during the short-term and intermediate follow-up data collection efforts. Interviewers will inform study participants about the next survey data collection effort and address any questions about the study. Interviewers will conclude the call by collecting updated contact information. All new contact information will be appended to the study database prior to the start of the long-term follow-up survey. (See Instrument 5b HPOG 2.0 Contact Update Phone Version, which includes the script the evaluation contractor will use to make the outbound calls.)
HPOG 2.0 Participant Newsletter. To keep study participants selected for the forthcoming long-term follow-up survey (those randomized between September 2017 and January 2018) engaged in the study, this request for clearance seeks approval of a new tool to maximize response rates: a participant newsletter. The HPOG 2.0 Participant Newsletter (Attachment AA) will remind participants that they are part of this study. It will thank participants for their continued cooperation and remind them of the importance of their participation, even if they were assigned to the control group and did not participate in the program. It will also include a summary of key HPOG 2.0 impact evaluation accomplishments since grants were awarded in 2015. Finally, it will explain the remaining data collection activities and offer participants an opportunity to update their contact information via the online system or on paper. This one-time newsletter will be sent 12 months prior to the release of the long-term survey, along with the standard contact update request form that would go out at that time.
COVID-19 Cohort STS Procedures
Interviewer Staffing: An experienced, trained staff of interviewers will conduct the COVID-19 Cohort STS. To the extent possible, the evaluation contractor will recruit interviewers who worked successfully on the HPOG 2.0 Short-term and Intermediate Follow-up data collection efforts, as these interviewers are familiar with the HPOG 2.0 program, the career pathways model, and the study goals, and they have valuable experience working with this study population. All interviewers will participate in a training that includes didactic presentations, numerous hands-on practice exercises, and role-play interviews. The evaluator’s training materials will build on those prepared for the Short-term and Intermediate Surveys.
Advance Letter: The evaluation team will mail an advance letter to all participants in the randomization cohort selected for inclusion in the COVID-19 Cohort STS, as it did prior to the short-term and intermediate follow-up survey efforts. The advance letter (Attachment K-Revised) will be mailed to selected study participants approximately one and a half weeks before interviewers begin data collection. The advance letter alerts participants to the upcoming survey effort, so they are more prepared for the interviewer’s call, and provides each selected study participant with a toll-free number that they can call to set up an interview. See the previously approved Supporting Statement A from June 2018 for more information on the use of the advance letter. This non-substantive change request seeks approval to delete one sentence in the advance letter (Attachment K-Revised), a change requested by OPRE’s Privacy Analyst to reflect the fact that OPRE or other researchers may need to access the data in the future.
Email Reminder: Interviewers will attempt to contact participants by telephone first. If initial phone attempts are unsuccessful, interviewers can use their project-specific email accounts to introduce themselves as the local data collection staff, explain the study, and attempt to set up an interview. Interviewers send this email, along with the advance letter, about halfway through the period during which they are working their assigned cases (see Attachment N-Revised COVID-19 Cohort Short-Term Survey Email Reminder Text).
Interviewing: The COVID-19 Cohort STS data collection will be mixed mode: phone with in-person follow-up, assuming it remains appropriate to resume face-to-face interviewing.3 Data collection begins when interviewers attempt to reach the selected study participants by telephone, using the full contact information history for the respondent and any alternate contacts (such as family or friends whom study participants identified as people who will know how to reach them). After the interviewers exhaust all phone efforts, they will work non-completed cases in person (if feasible). Interviewers may leave specially designed project flyers with family members or friends (see Attachment M-Revised_COVID-19 Cohort Short-term Survey Trying to Reach You Flyer). As noted in Supporting Statement A and the OMB#0970-0462_NonSubstantiveChangeRequest_October 2021 memo, ACF intends to administer a new, shorter version of Instrument 12a COVID-19 Cohort Short-term Survey (COVID-19 Cohort STS) that includes only the most critical items (see new Instrument 12b). This new version, Instrument 12b COVID-19 Cohort Short-term Survey Critical Items, will be offered as the last refusal conversion effort, and only to those who would likely otherwise have a final disposition of “refusal” (those with chronic broken appointments or soft refusals).4
Sample Release Schedule: We will follow a similar schedule for the COVID-19 Cohort sample as we did for the short-term survey. This is important because some of the critical outcomes are expected to show time-varying impacts.5
B5. Response Rates and Potential Nonresponse Bias
This non-substantive change request builds upon the methods to maximize response rates and address nonresponse previously approved in July 2021. Now that COVID-19 pandemic social distancing guidelines are easing nationwide, we expect to be able to conduct face-to-face interviewing for this information collection. As such, we expect the data collection approach and the methods used to maximize response rates for the COVID-19 Cohort STS to be nearly identical to those approved for use in the HPOG 2.0 Short-term Follow-up Survey (Instrument #12; please see the revisions approved in June 2018 for more detail on those approved methods, such as sample control, tokens of appreciation, and contact update procedures).6 ACF seeks permission to administer a shorter version of Instrument 12a, the COVID-19 Cohort STS; Instrument 12b captures just the critical items of interest.
For the HPOG 2.0 Short-term Follow-up Survey, after excluding those participants who withdrew from the study or were ineligible to participate, the evaluation team completed an interview with 9,620 of the 12,923 participants eligible for the data collection:7 a 74.4 percent response rate. The only planned difference for the COVID-19 Cohort STS is to slightly stretch the time elapsed between recontact attempts for initial nonrespondents in the treatment group relative to those in the control group. The purpose of this stretching is to reduce the gap between the response rates for the two study arms without altering the average gap between the target interview date and the actual interview date. We are expecting a response rate of 75 percent for the COVID cohort.8 However, the actual response rate will depend on respondent cooperation and field conditions. At the time that the original request for approval of the COVID-19 Cohort STS was submitted, the Intermediate Follow-up Survey data collection was still underway, and the success of the shorter critical items version of the instrument was unknown. Given the success of that shorter version (a 9.2 percentage point increase in the overall response rate and a 3.4 percentage point reduction in the differential response rate between the treatment and control groups), ACF seeks permission to add a shorter, critical items version of Instrument 12a, the COVID-19 Cohort STS. This shorter instrument, Instrument 12b, will help to improve response rates and balance the treatment/control differential for the COVID-19 Cohort STS. This will be particularly useful if the status of the pandemic (including the surge in the Delta or other variants) affects the feasibility of in-person locating and interviewing, which may depress the response rate.9
We plan to use the same nonresponse analyses and adjustments as we used for the Short-Term Impact Report. Analyses for that report did find significant relationships between survey response and a variety of baseline variables from PAGES and current variables from the National Student Clearinghouse. The weights we prepared neutralized most of those relationships, leading to estimated impacts that should have minimal nonresponse bias. This was confirmed in analyses of National Directory of New Hires (NDNH) data, where weighted impact estimates on NDNH outcomes prepared using only survey respondents and the nonresponse weights closely tracked impacts estimated on the full NDNH sample (including survey nonrespondents).
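As a general illustration of this weighting approach, the sketch below estimates response propensities with a logistic regression on baseline characteristics, weights respondents by the inverse of those propensities, and then checks that weighted respondent means track the full sample. It is a minimal sketch on synthetic data: the covariate names are hypothetical stand-ins for PAGES variables, and the actual STIR weighting procedures (Judkins, Klerman, and Locke 2020) are more elaborate.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Synthetic stand-in for the study frame; these covariate names are
# hypothetical placeholders, not the evaluation's actual PAGES fields.
df = pd.DataFrame({
    "age": rng.integers(18, 60, n),
    "female": rng.integers(0, 2, n),
    "tanf_receipt": rng.integers(0, 2, n),
})
# Simulate response propensity that depends on baseline traits
# (the source of potential nonresponse bias).
logit = -0.5 + 0.02 * df["age"] + 0.4 * df["female"] - 0.3 * df["tanf_receipt"]
df["responded"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

covariates = ["age", "female", "tanf_receipt"]
X = sm.add_constant(df[covariates])
fit = sm.Logit(df["responded"], X).fit(disp=0)

# Respondents are weighted by the inverse of their estimated response propensity.
df["p_hat"] = fit.predict(X)
resp = df[df["responded"] == 1].copy()
resp["nr_weight"] = 1.0 / resp["p_hat"]

# Weighted respondent means should track full-sample means if the model is adequate.
print("full sample:", df[covariates].mean().round(3).to_dict())
print("weighted respondents:",
      {c: round(np.average(resp[c], weights=resp["nr_weight"]), 3) for c in covariates})
```

The NDNH check described above follows the same logic as the final comparison here: if weighted respondent-only estimates track estimates from the full sample, the weights have neutralized most observable nonresponse bias.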
ACF is mindful of contextual effects that might be caused by shortening the instrument relative to the original STS. We are also concerned about the potential for increased nonresponse bias given the resurgence of COVID in many communities. By fielding the short instrument as a last resort, we hope to minimize both contextual effects and changes in nonresponse bias that would cloud comparisons of short-term impacts in the pre-COVID and COVID eras. The analysis team will run sensitivity tests to assess how the shorter, critical items version affected comparability between the original Short-term and the COVID-19 Cohort Short-term surveys, in addition to using the same nonresponse analyses and adjustments to minimize contextual effects.10
B6. Production of Estimates and Projections
Impact estimates for the COVID cohort based on this survey will be published using the same methodology used for the HPOG 2.0 Short-term Impact Report (STIR). OPRE anticipates that a wide range of policy makers and policy analysts will use these reports in deciding whether to continue funding programs such as HPOG and what guidance to include for grantees in funding opportunity announcements. The evaluation design and analysis methods are detailed in the HPOG 2.0 National Evaluation Impact Evaluation Design Plan (Klerman, Judkins, and Locke 2019) and the Short-term Analysis Plan (Judkins, Klerman, and Locke 2020); additional information on analysis methods will be documented in the forthcoming STIR.11,12 These methods include an approach to variance estimation that is specifically designed to support prospective inferences rather than retrospective inferences; this means that a higher burden of proof is required for outcomes that the local HPOG 2.0 programs impact differentially rather than uniformly. A brief supplement to that analysis plan will be published for the report based on the COVID cohort. This nonsubstantive change request will require adding procedures to the impact estimation to accommodate the COVID-19 Cohort STS Critical Items instrument, similar in nature to the procedures used for the Intermediate Follow-up Survey analysis (Instrument 18a).
ACF will work with the evaluation contractor to prepare data for archiving with copious documentation of data structures to try to maximize the chances that secondary analysts will be able to prepare high-quality reports and papers. The work of secondary analysts will also be made easier by the very detailed technical methods appendices that have been prepared for the STIR.
B7. Data Handling and Analysis
The COVID-19 Cohort STS would be an additional component under the HPOG 2.0 National Evaluation impact evaluation. Please refer to prior revisions of OMB #0970-0462 for additional details on the data handling and analysis for the previously approved HPOG 2.0 National Evaluation descriptive evaluation, cost-benefit analysis study, and earlier impact evaluation survey efforts. Prior revisions also cover the HPOG 2.0 Tribal Evaluation. This nonsubstantive change request does not include any changes to the data handling procedures. It does require minor additions to the analysis as noted below.
Data Handling
To ensure data security and enhance data quality, the trained interviewers administering the COVID-19 Cohort STS will collect data on tablets using the Confirmit CAPI system. The Confirmit CAPI interviewer console requires a secure login-based interface, and all case-level data files are stored and encrypted by the CAPI software installed on the devices. This includes survey data as well as PII associated with respondents and their secondary contacts. All survey data will be transmitted to Abt servers via a secure internet connection, and the CAPI system itself encrypts data flowing between interviewer consoles and the centralized server. Datasets generated by the CAPI system will be stored in a restricted-access folder on Abt Associates’ secure Analytical Computing Environment (ACE3), a FISMA-moderate server where most analyses will be conducted. Once Abt programmers are satisfied with the extraction of survey data from the CAPI software into SAS, the SAS files will be transferred to Abt analysts. Data transfer between the survey system and the analysis system will be conducted through Abt’s secure online file transfer platform, which utilizes FIPS 140-2 validated cryptographic modules. Only those project research staff with security approval will be granted access to the data. All analyses will take place within the secure ACE3 computing environment at Abt Associates.
Data Analysis
As mentioned in Section B6, a detailed analysis plan was prepared for the STIR (Judkins, Klerman, and Locke 2020), and this plan will largely guide analysis of the data from the new COVID-19 Cohort survey. A new COVID-19 Cohort STS supplement to the STIR analysis plan will be prepared and published. This supplement will address new issues that stem from comparing the two sets of cohorts, as well as any response rate and weight adjustments. It will pay particular attention to changes in the baseline profiles of students recruited to the study over the different time periods. An important question will be whether to adjust any such differences out of comparisons or to treat changes in the profile of the student bodies as just another consequence of the pandemic. The supplement will also incorporate any additional changes necessary to address weighting and nonresponse for items excluded from the COVID-19 Cohort STS Critical Items instrument, the focus of this nonsubstantive change request.
Data Use
With ACF oversight, Abt and its partners MEF Associates, the Urban Institute, Insight Policy Research, and NORC are responsible for conducting the HPOG 2.0 National and Tribal Evaluation. This team has published:
an Impact Evaluation Design Plan with considerable detail on planned analytic procedures: https://www.acf.hhs.gov/opre/report/national-and-tribal-evaluation-2nd-generation-health-profession-opportunity-grants-1;
a Descriptive Study Analysis Plan: https://www.acf.hhs.gov/sites/default/files/documents/opre/final_hpog_2_0_impl_dsn_report_1_18_2018_b508.pdf;
a Tribal Evaluation Plan: https://www.acf.hhs.gov/opre/report/health-profession-opportunity-grants-hpog-20-tribal-evaluation-evaluation-plan; and
the Short-term Impact Study Analysis Plan: https://www.acf.hhs.gov/opre/report/national-and-tribal-evaluation-2nd-generation-health-profession-opportunity-grants-0.
See Supporting Statement A for more detail on the timeline for publication of the Short-term Impact Report (STIR). ACF anticipates the evaluation contractor will employ similar analyses and develop appendices about methods used to compare impacts for the original and COVID-19 Cohort STS samples. The appendices prepared for the STIR are very thorough and we plan a similar effort for the report based on this new survey. This includes clear statements of the limitations of the data and explorations of data quality.
The HPOG 2.0 Intermediate Impact Analysis Plan is under development and should be published in mid-2021. The evaluation contractor is working on the design of the Long-term Follow-up Survey and data collection plan, which will be submitted under a subsequent information collection request.
Publicly Posted Data
The HPOG 2.0 National Evaluation Impact Study, Short-term Impact Analysis Plan is registered at the Registry of Efficacy and Effectiveness Studies (REES), registration ID 1948.1v1.13 It is also registered with the Open Science Framework (OSF).14 The evaluation team is developing an analysis plan for the Intermediate Impact Report and will register it accordingly. The team will also develop and register the supplement to the STIR analysis plan for the COVID-19 Cohort STS.
B8. Contact Person(s)
The individuals listed in Exhibit B-2 below contributed to this information collection request.
Exhibit B-2: Contributors
Name | Role in HPOG 2.0 National and Tribal Evaluation | Organization/Affiliation
Gretchen Locke | National Evaluation Project Director | Abt Associates
Jacob Klerman | National Evaluation Co-Principal Investigator | Abt Associates
Bob Konrad | National Evaluation Co-Principal Investigator | Abt Associates
Robin Koralek | National Evaluation Deputy Project Director | Abt Associates
Larry Buron | National Evaluation Project Quality Advisor | Abt Associates
David Judkins | National Evaluation Director of Impact Analysis | Abt Associates
Debi McInnis | National Evaluation Site Coordinator | Abt Associates
Inquiries regarding the statistical aspects of the HPOG 2.0 National Evaluation design should be directed to:
Gretchen Locke, Project Director
Abt Associates
10 Fawcett Street, Suite 5
Cambridge, MA 02138
(617) 349-2373
The following HHS staff—including the HHS project officers Nicole Constance, Lisa Zingman, and Amelia Popham—have overseen the design process and can be contacted at:
Nicole Constance
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
330 C Street S.W., 4th Floor, Washington, D.C. 20201
(202) 401-7260
Lisa Zingman
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
330 C Street S.W., 4th Floor, Washington, D.C. 20201
(202) 260-0323
Amelia Popham
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
330 C Street S.W., 4th Floor, Washington, D.C. 20201
(202) 401-5322
Attachments
New Instrument:
Instrument 12a: COVID-19 Cohort Short-term Survey Revised
Instrument 12b: COVID-19 Cohort Short-term Survey Critical Items
Previously Approved Instruments Still in Use:
Instrument 1: PAGES Grantee- and Participant-Level Data Items List
Instrument 5: HPOG 2.0 National Evaluation welcome packet and participant contact update forms
Instrument 5a: HPOG 2.0 National Evaluation welcome packet and contact update form_REV
Instrument 5b: HPOG 2.0 National Evaluation participant contact update letter and form
Instrument 5b: HPOG 2.0 Contact Update Form Phone Version
Previously Approved Instruments No Longer in Use:
Instrument 2: HPOG 2.0 National Evaluation Screening Interview
Instrument 3: HPOG 2.0 National Evaluation first-round telephone interview protocol
Instrument 4: HPOG 2.0 National Evaluation in-person implementation interviews
Instrument 4A HPOG 2.0 National Evaluation In-Person Implementation Interview
Instrument 4B HPOG 2.0 National Evaluation In-Person Implementation Interviews Basic Skills Training
Instrument 4C HPOG 2.0 National Evaluation In-Person Implementation Interviews Career Pathways
Instrument 4D HPOG 2.0 National Evaluation In-Person Implementation Interviews Work-Readiness
Instrument 4E HPOG 2.0 National Evaluation In-Person Implementation Interviews Sustainability
Instrument 6: HPOG 2.0 Tribal Evaluation grantee and partner administrative staff interviews
Instrument 7: HPOG 2.0 Tribal Evaluation program implementation staff interviews
Instrument 8: HPOG 2.0 Tribal Evaluation employer interviews
Instrument 9: HPOG 2.0 Tribal Evaluation program participant focus groups
Instrument 10: HPOG 2.0 Tribal Evaluation program participant completer interviews
Instrument 11: HPOG 2.0 Tribal Evaluation program participant non-completer interviews
Instrument 12: HPOG 2.0 National Evaluation Short-term Follow-up Survey
Instrument 13: HPOG 2.0 Screening Interview Second Round
Instrument 14: HPOG 2.0 Second Round Telephone Interview Guide
Instrument 15: HPOG 2.0 Program Operator Interview Guide for Systems Study
Instrument 16: HPOG 2.0 Partner Interview Guide for Systems Study
Instrument 17: HPOG 2.0 Participant In-depth Interview Guide
Instrument 18: HPOG 2.0 Intermediate Follow-up Survey_ REV_June2020
Instrument 18a: HPOG 2.0 Intermediate Follow-up Survey_Critical Items Only
Instrument 19: HPOG 2.0 Phone-based Skills Assessment Pilot Study Instrument
Instrument 20: HPOG 2.0 Program Cost Survey
New Attachments
Previously Approved Attachments Still in Use
Attachment A: References
Attachment B: New Informed Consent Forms, Updated Time Period
Attachment B: National Evaluation Informed Consent Form A (Lottery Required) _REV
Attachment B: National Evaluation Informed Consent Form C (Lottery Required) _Verbal_REV
Attachment B2: Tribal Evaluation informed consent form A (SSNs)
Attachment B3: Tribal Evaluation informed consent form B (Unique identifiers)
Attachment B2: Tribal Evaluation Informed Consent Form C (SSNs)_Verbal
Attachment B3: Tribal Evaluation Informed Consent Form D (Unique identifiers) _Verbal
Attachment L-Revised: COVID-19 Short-Term Survey Sources
Attachment M-Revised: COVID-19 Cohort Short-term Survey Trying to Reach You Flyer
Attachment N-Revised: COVID-19 Cohort Short-Term Survey Email Reminder Text
Attachment P: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Advance Letter_REV
Attachment Q: Intermediate Follow-up Survey Sources_REV
Attachment R: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Trying to Reach You Flyer
Attachment S: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Email Reminder_REV
Attachment AA: HPOG 2.0 Participant Newsletter
Previously Approved Attachments No Longer in Use
Attachment B: Previously Approved Informed Consent Forms
Attachment B: National Evaluation informed consent form A (Lottery Required)
Attachment B: National Evaluation informed consent form B (Lottery Not Required)
Attachment B: National Evaluation Informed Consent Form C (Lottery Required) _Verbal
Attachment B: National Evaluation Informed Consent Form D (Lottery Not Required) _Verbal
Attachment C: 60-Day Federal Register Notice
Attachment D: Previously Approved Sources and Justification for PAGES Grantee- and Participant-Level Data Items
Attachment E: Previously Approved Final Updated Attachment E PPR Data List and Mockup
Attachment F: First Round of HPOG Grantees Research Portfolio
Attachment G: Previously Approved Participant Contact Information Update Letter and Form (Obsolete, replaced by Instrument 5a and 5b)
Attachment H: HPOG Logic Model
Attachment I: Previously Approved Focus Group Participant Consent Form
Attachment I: New Focus Group Participant Consent Form_Remote
Attachment J: Previously Approved Interview Verbal Informed Consent Form
Attachment J: New Interview Verbal Informed Consent Form_Remote
Attachment K: HPOG 2.0 National Evaluation Short-term Follow-up Survey Advance Letter
Attachment L: HPOG 2.0 National Evaluation Short-term Follow-up Survey Sources
Attachment M: HPOG 2.0 National Evaluation Short-term Follow-up Survey Trying to Reach You Flyer
Attachment N: HPOG 2.0 National Evaluation Short-term Follow-up Survey Email Reminder
Attachment O: Research Questions for Previously Approved Data Collection Efforts (National Evaluation and Tribal Evaluation)
Attachment P: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Advance Letter
Attachment Q: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Sources
Attachment R: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Trying to Reach You Flyer
Attachment S: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Email Reminder
Attachment T: HPOG 2.0 National Evaluation phone-based Skills Assessment Pilot flyer
Attachment U: HPOG 2.0 National Evaluation phone-based Skills Assessment Pilot grantee letter
Attachment V: HPOG 2.0 National Evaluation phone-based Skills Assessment Pilot participant letter
Attachment W: HPOG 2.0 National Evaluation phone-based Skills Assessment Pilot recruitment script
Attachment X: Complete list of previously approved data collection instruments
Attachment Y: 60-day Federal Register Notice
Attachment Z: Participant Interview Recruitment Materials
1 See, for example, this white paper by Pew Research: https://www.pewresearch.org/methods/u-s-survey-research/questionnaire-design/#:~:text=Researchers%20have%20demonstrated%20that%20the,called%20%E2%80%9Corder%20effects%E2%80%9D).
2 Our expectation of a 75 percent response rate is based on the presumption that we will be able to resume in-person data collection efforts. However, we calculate MDDIs assuming a worst-case scenario of a 70 percent response rate in case in-person efforts are not feasible.
3 However, the rise in new cases due to variants of COVID-19 in some communities coupled with continued uneasiness of study members and local data collectors may pose challenges to planned in-person data collection efforts.
4 Any cases that are flagged as hostile refusals will not be contacted again by interview staff and therefore will be excluded from the critical items data collection effort. In addition, respondents who are deceased, have a language barrier, or are deemed unlocatable are ineligible for the critical items instrument.
5 Under the short-term survey, people randomized early in the 12-month catchment window were released for interviewing as late as the 19th month following study enrollment, while people randomized late in the window were released for interviewing as early as the 14th month following study enrollment. Similar release schedules will be used for the COVID-19 Cohort STS.
6 However, if changes in the pandemic and social distancing guidelines suggest that in-person interviewing will not be sustainable throughout the data collection period, we may need to rely solely on interviews by phone.
7 13,907 people were randomized during the sample window. Of these, 43 withdrew consent to have their data used in the study for any purpose, including nonresponse analysis. Another 42 became ineligible for the following reasons: death, incarceration, or inability to respond because of a permanent health/disability issue.
8 Every effort will be made to maximize that response and get closer to the response rate achieved for the Short-term Follow-up Survey effort.
9 We have encountered two classes of pandemic-related data collection challenges. First, some respondents were in stages of transition due to COVID-19 impacts (e.g., health issues, housing concerns or needing to move, and employment changes), which made them harder to locate. Second, some respondents were juggling multiple responsibilities due to COVID-19, including taking care of family and managing virtual learning for their children. These multiple responsibilities resulted in a higher percentage of broken appointments and refusals to complete the interview.
10 The ability to use the shorter instrument, if needed, may also be beneficial because a rise in new cases due to COVID-19 variants in some communities, or uneasiness among study members and local data collectors, may pose challenges to planned in-person data collection efforts.
11 Klerman, Jacob Alex, David Judkins, and Gretchen Locke. (2019). Impact Evaluation Design Plan for the HPOG 2.0 National Evaluation, OPRE Report # 2019-82. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. https://www.acf.hhs.gov/opre/report/national-and-tribal-evaluation-2nd-generation-health-profession-opportunity-grants-1
12 Judkins, David Ross, Jacob Alex Klerman, and Gretchen Locke. (2020). Analysis Plan for the HPOG 2.0 National Evaluation Short-Term Impact Report, OPRE Report # 2020-07. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. https://www.acf.hhs.gov/opre/report/national-and-tribal-evaluation-2nd-generation-health-profession-opportunity-grants-0
14 http://osf.io/nv2fz/