
Supporting Statement A


Health Resources and Services Administration
Maternal and Child Health Bureau
Pediatric Mental Health Care Access Program National Impact Study


OMB Control No. 0915-XXXX



Terms of Clearance: None

A. Justification

  1. Circumstances Making the Collection of Information Necessary

In compliance with Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995, this submission requests Office of Management and Budget (OMB) approval of a 3-year clearance for the Health Resources and Services Administration (HRSA) to examine the impact of the Maternal and Child Health Bureau (MCHB) Pediatric Mental Health Care Access (PMHCA) program (the Impact Study). This project will collect data to provide HRSA with information to guide future program decisions regarding the PMHCA program as it relates to (1) the impact of HRSA’s PMHCA program on changes in children/adolescents’ and their families/caregivers’ access to behavioral health care; their subsequent receipt and utilization of behavioral health care services, including culturally and linguistically appropriate care; and related behavioral health impacts, and (2) monetary and societal PMHCA program costs and benefits.

Title X, Section 10002 of the 21st Century Cures Act authorized funding to increase access to pediatric mental health care by supporting the development of new PMHCA programs and the expansion of existing ones. Section 10002 authorized the appropriation of $9 million each fiscal year (FY) for FYs 2018–2022 for this initiative.

Section 2712 of the American Rescue Plan Act allowed for additional funding for PMHCA programs. In addition to amounts otherwise available, Section 2712 appropriated to the Secretary of Health and Human Services for FY 2021, out of any money in the Treasury not otherwise appropriated, $80 million to remain available, until expended, for carrying out Section 330M of the Public Health Service Act (42 U.S.C. 254c–19).

Section 11005 of the Bipartisan Safer Communities Act added new program requirements and allowed for additional funding for PMHCA programs. Section 11005 authorized the appropriation of $31 million each FY for FYs 2023–2027 for this initiative. See Attachment A1 for a description of the legislation.

HRSA-funded PMHCA programs aim to increase the identification and treatment of behavioral health conditions for children/adolescents by:

  • Providing support to health professionals (HPs) in their delivery of high-quality and timely screening, assessment, treatment, and referrals for children/adolescents with behavioral health conditions, through the provision of clinical behavioral health consultation, care coordination support services (i.e., communication/collaboration, accessing resources, referral services), and training to HPs

  • Increasing access to clinical interventions, including by telehealth

PMHCA recipients also focus on achieving health equity related to racial, ethnic, and geographic disparities in access to care, especially in rural and other underserved areas.

JBS International, Inc. (JBS) will implement the Impact Study as part of a contract that is funded by HRSA (Contract Number: 75R60219D00046/Task Number: 75R60223F34003).

  2. Purpose and Use of Information Collection

As stated in Section A.1, the goal of this project is to provide HRSA with information to guide future program decisions regarding the PMHCA program as it relates to (1) the impact of HRSA’s PMHCA program on changes in children/adolescents’ and their families/caregivers’ access to behavioral health care; their subsequent receipt and utilization of behavioral health care services, including culturally and linguistically appropriate care; and related behavioral health impacts, and (2) monetary and societal PMHCA program costs and benefits.

The Impact Study uses a mixed-methods design, with data collection activities involving participants in all HRSA MCHB PMHCA award recipient programs that were funded in 2021, 2022, and 2023. Methodologies for this study include a survey (i.e., online, mailed) and virtual focus group discussions (FGDs).

As indicated in Exhibit 1 below, the project will collect survey data from HPs enrolled and/or participating in the 2021, 2022, and 2023 PMHCA programs and conduct FGDs with families/caregivers who have sought and/or received behavioral health care for their child(ren)/adolescent(s), as identified by 2021, 2022, and 2023 PMHCA programs. Additionally, families/caregivers identified by the PMHCA programs to participate in the Family/Caregiver FGD will be asked demographic questions (Family/Caregiver Demographic Questionnaire) over the phone for the purpose of FGD sampling and to inform qualitative analyses. HPs will complete surveys at 2 time points, 1 year apart, in 2024 and 2025. FGDs with families/caregivers will be conducted once in 2025.

Exhibit 1. Data Collection Activities

Tool                                          2024        2025                              2026
HP Impact Survey                              Fall 2024   Fall 2025                         N/A
Family/Caregiver FGD                          N/A         Spring 2025                       N/A
Family/Caregiver Demographic Questionnaire    N/A         Winter 2025 (prior to the FGDs)   N/A

Specifically, HRSA is requesting approval for the following:

HP Impact Survey – survey of enrolled/participating 2021, 2022, and 2023 PMHCA program HPs, examining their experiences with screening, diagnosing, treating, and referring children/adolescents with behavioral health conditions and their perception of behavioral health impact. HPs' first and last names and ZIP Code will also be collected to link their data with other data sources (e.g., Medicaid data) for the purpose of identifying impacts of the PMHCA program on access to behavioral health care.

Family/Caregiver FGD – discussions with families/caregivers who have sought and/or received behavioral health care for their child/adolescent about their experiences with behavioral health care access, receipt, and utilization; satisfaction with behavioral health care services; and impact of the behavioral health services on their child/adolescent.

Family/Caregiver Demographic Questionnaire – demographic questionnaire of families/caregivers identified by PMHCA programs to participate in the Family/Caregiver FGD about themselves and their child/adolescent for the purpose of FGD sampling and to inform qualitative data analyses.

After careful consideration of existing data and literature, the HP Impact Survey and Family/Caregiver FGD were designed to address the unavailability of quantitative and qualitative data that are essential for understanding the impact of the PMHCA program. Specifically, the HP Impact Survey will collect data on:

  • Provider estimates of the number of patients screened

  • Types of behavioral health disorders screened

  • Number of patients diagnosed with behavioral health disorders and resulting diagnostic classifications

  • Number of patients referred to behavioral health specialty treatment

  • Type of behavioral health resources to which patients are referred

  • Number of those referrals resulting in a visit to a behavioral health specialist

Specifically, the Family/Caregiver FGD will collect the perspectives and viewpoints of PMHCA program beneficiaries (i.e., families/caregivers) on their:

  • Experiences with behavioral health care access, receipt, and utilization for their child/adolescent

  • Satisfaction with behavioral health care services for their child/adolescent

  • Impact of the behavioral health services on their child/adolescent

The Family/Caregiver Demographic Questionnaire will collect information from PMHCA program beneficiaries (i.e., families/caregivers and their child/adolescent) including:

  • Family/caregiver contact information

  • Family/caregiver ZIP Code

  • Family/caregiver preferred language

  • Child/adolescent age

  • Child/adolescent race/ethnicity

  • Child/adolescent gender identity

In coordination with HRSA MCHB, JBS will use the data collected to:

  • Enhance our understanding of the impact of the PMHCA program, including how it improves access to and receipt and utilization of behavioral health care

  • Study the impact of the PMHCA program on unmet behavioral health needs

  • Examine differences between PMHCA program impact, based on child/adolescent and community characteristics and/or locations in which PMHCA programs are implemented

  • Explore PMHCA program cost-benefit (both monetary and societal)

  • Provide data in reports and webinars to HRSA MCHB and PMHCA programs

Supporting Statement B contains additional information on study procedures for collecting information with these data collection tools, which are included as attachments to Supporting Statement B.

  3. Use of Improved Information Technology and Burden Reduction

The Impact Study will follow a multimethod approach. Data collection methodologies for this evaluation will use a survey (i.e., web-based, email), virtual FGD (e.g., Microsoft Teams, Zoom), and questionnaire (i.e., phone). All technology used for the survey administration (i.e., web-linked survey administered via email and via survey platform) will meet Federal requirements for Section 508 accessibility. Information technology will be used in the following ways:

  • All survey participants will receive the web-linked survey via email. Electronic responses will be downloaded directly into a securely stored server.

  • All FGDs will be conducted via a web-based platform (e.g., Microsoft Teams, Zoom). All respondents must agree to be recorded; individuals who do not agree will not be eligible to participate. Interviewers will record responses as they are given and upload the recordings to a secured server.

  • All demographic questionnaires will be conducted over the phone. The Impact Study staff will record responses as they are given and save responses to a secured server.

  • Reports and materials (e.g., resources) generated from this project may be made available to the public.

The data collection methods were selected because they reduce participant burden while providing the study with the necessary data. Offering a web-based survey reduces burden by eliminating the time it takes to write responses on a paper-based, mail-in survey and the time needed to mail a paper survey back. Burden is also reduced for respondents participating in FGDs via a web-based platform (e.g., Microsoft Teams, Zoom) because they will not have to write out responses or travel to an in-person FGD. Similarly, burden is reduced for potential FGD participants completing the demographic questionnaire over the phone because they will not have to write out responses or access the questionnaire via an online platform.

Using protected electronic data is the most secure form of data management because it eliminates the possibility of paper documents or data being lost in transit or delivered to an incorrect location. However, because not all respondents may prefer to complete a web-based survey, and to maximize completion rates, we may use alternative forms of administration (i.e., providing a printable PDF to participants). In this case, printable PDF surveys can be returned either as attachments to encrypted emails or by mail, depending on the respondent’s preference. All hard copies will be entered into the online system at JBS and stored in a locked file cabinet, with participants’ names and identifying information removed.

  4. Efforts to Identify Duplication and Use of Similar Information

During the initial planning phase of the Impact Study, JBS conducted a literature review and review of existing data sources. These reviews found that no similar data collection is being conducted to examine the impact of the PMHCA program.

An outcome evaluation of the HRSA MCHB PMHCA and Screening and Treatment for Maternal Mental Health and Substance Use Disorders (formerly Screening and Treatment for Maternal Depression and Related Behavioral Disorders) cooperative agreement-funded programs is being conducted by JBS and is ongoing through 2026. However, no duplication of efforts between the evaluation and Impact Study exists because the data collection instruments for the Impact Study were developed considering the HRSA-required data awardees already report and the data collected to support the evaluation of the programs. No other bureaus or agencies are currently conducting an impact study of the PMHCA programs.

All potential data items were mapped to the evaluation questions to ensure no duplication of information and to reduce participant burden.

  5. Impact on Small Businesses or Other Small Entities

Physicians, as part of participating HPs, are included in the data collection efforts (i.e., surveys) for this Impact Study. Although a portion of physicians may be employed by large hospitals or health systems, none of which are considered small businesses, some may be in private practice or practice in small groups. Information collection for this study is not anticipated to have a significant impact on physicians.

The information to be obtained from physicians is the minimum required for the intended use of the data and to achieve the objectives of the Impact Study; completion of the survey instrument is expected to impose minimal burden. To reduce this burden further, we have developed the survey to be as short as possible while still collecting necessary data, and we have designed it to move respondents quickly through questions (e.g., skip patterns allow respondents to bypass questions that are not relevant to them).

  6. Consequences of Collecting the Information Less Frequently

As noted above in Section A.2, the collection of these data is critical to assessing the impact of PMHCA awardee programs. The frequency of data collection, as specified below, is held to the minimum necessary to meet the needs of the Impact Study goals and objectives.

HP Impact Survey. The HP Impact Survey will be administered annually to HPs enrolled/participating in 2021, 2022, and 2023 PMHCA programs, with anticipated data collection in 2024 and 2025. Annual HP Impact Survey administration will allow for:

  • Data collection from all potentially enrolled/participating HPs because they may enroll/participate in PMHCA programs on a rolling basis

  • Examination of changes over time related to screening, diagnosing, treating, and referring children/adolescents with behavioral health conditions and on their perception of behavioral health impact for HPs who complete the survey at both timepoints

Family/Caregiver FGD. The Family/Caregiver FGDs will be administered via a web-based platform (e.g., Microsoft Teams, Zoom) to family members/caregivers identified by the PMHCA awardees, with anticipated data collection once in 2025. To complement and expand on data collected from other sources, the FGDs will collect in-depth, contextual, qualitative information from families/caregivers who have sought and/or received behavioral health care services for their child/adolescent, regarding behavioral health care access, receipt, and utilization; culturally and linguistically appropriate services; PMHCA cost-benefit; and satisfaction.

Family/Caregiver Demographic Questionnaire. The Family/Caregiver Demographic Questionnaire will be administered over the phone to family members/caregivers identified by the PMHCA awardees as eligible to participate in the Family/Caregiver FGD, with anticipated data collection once in 2025, prior to conducting the FGDs. The questionnaire will collect demographic information (e.g., ZIP Code, preferred language, child/adolescent age) about the family member/caregiver and their child/adolescent for the purpose of Family/Caregiver FGD sampling and to inform qualitative data analyses.

There are no legal obstacles to reduce the burden.

  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

The request fully complies with the regulation.

  8. Comments in Response to the Federal Register Notice/Outside Consultation

Section 8A:

A 60-day Federal Register Notice was published in the Federal Register on February 06, 2024, vol. 89, No. 25, pp. 8210-11 (see Attachment A2). There were two public comments. The public comments and responses to the comments are provided as an attachment to this supporting statement (see Attachment A3). No substantive changes were made as a result of the public comments. A 30-day Federal Register Notice was published in the Federal Register on April 24, 2024, vol. 89, No. 80, pp. 31210-11.

Section 8B:

Consultations on the Impact Study's key research questions, design, data collection instruments (i.e., HP Impact Survey, Family/Caregiver FGD) and protocols, data management, data quality plan, and analysis occurred throughout the planning phase of the project in 2023. Consultations were provided by External Partner Group (EPG) members identified and convened specifically for the Impact Study to advise on key contract activities; they include non-Federal and Federal leaders in pediatric mental health and individuals with lived experience. In addition, the RAND Corporation (RAND), the subcontractor to JBS on the Impact Study, also provided consultation on Impact Study activities. These consultations provided, and will continue to provide, the opportunity to ensure the technical quality and appropriateness of the overall Impact Study design and data analysis plans, obtain advice and recommendations concerning the data collection instruments, and structure the Impact Study and instruments to minimize overall and individual response burden. Consultations have occurred with the following individuals in connection with this study (listed in Exhibit 2 below by affiliation):



Exhibit 2. Consultants

Affiliation: Johns Hopkins Medicine, Department of Psychiatry and Behavioral Sciences; Maryland PMHCA; National Network of Child Psychiatry Access Programs (NNCPAP) member
Title: Assistant Professor, Clinician (EPG Chair)
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: Integrated Service Division; Chickasaw Nation PMHCA
Title: Executive Officer
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: American Academy of Pediatrics
Title: Director
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: Families as Allies
Title: Executive Director
Consultation Year: 2023
Reviewed/Consulted on: Key research questions; study design; data quality plan; primary data collection instruments

Affiliation: Consultant in school health, mental health, and social and emotional learning
Title: LCSW
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: University of Texas (TX) at Austin; TX Institute for Excellence in Mental Health
Title: Senior Research Analyst
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; data quality plan; primary data collection instruments

Affiliation: George Mason University; National Bureau of Economic Research; Institute for Labor Economics
Title: Health and Labor Economist, Associate Professor
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions

Affiliation: Massachusetts PMHCA, NNCPAP
Title: Founding Director
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: Tribal Early Childhood; Tribal Home Visiting Program; Office of Early Childhood Development; Administration for Children and Families
Title: Senior Policy Advisor (SPA), Director
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions

Affiliation: HRSA MCHB Office of Epidemiology and Research (OER)
Title: Public Health Analyst (PHA)
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: HRSA MCHB Division of State and Community Health
Title: Deputy Director
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions

Affiliation: Centers for Medicare & Medicaid Services; CHIP Services Office of the Center Director
Title: Senior Policy Advisor for Youth
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions

Affiliation: HRSA MCHB Office of Epidemiology and Research
Title: Director
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: HRSA MCHB Office of the Associate Administrator
Title: Senior PHA
Consultation Year: 2023
Reviewed/Consulted on: Key research questions; study design; data quality plan; primary data collection instruments

Affiliation: Substance Abuse and Mental Health Services Administration; National Center of Excellence for Integrated Health Solutions
Title: Lieutenant Commander
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions

Affiliation: RAND
Title: Economist; Professor of Policy Analysis
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: RAND
Title: Senior Analyst
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments

Affiliation: RAND
Title: Director
Consultation Year: 2023
Reviewed/Consulted on: Formative research (e.g., literature review, existing data sources); key research questions; study design; data quality plan; primary data collection instruments


  9. Explanation of any Payment/Gift to Respondents

HP Impact Survey respondents will not receive any payments or gifts.

To encourage participation in the Family/Caregiver FGDs, a payment of $25 will be provided to each family member/caregiver who participates in a focus group. The payment amount is realistic in that it is $4.76 less than the mean hourly wage ($29.76) for all occupations as reported by the Bureau of Labor Statistics (BLS). This incentive is justified by the following factors:

  • Time and Expertise: Respondents are dedicating their time and sharing their experiences/expertise to contribute to the understanding of the impact of the PMHCA program. Payment communicates the importance of their insights to the Impact Study and demonstrates respect for the commitment and effort required to participate.

  • Enhancing Response Rates: Offering payment increases the likelihood of participation in the FGDs, ultimately contributing to the overall quality of the data collection.

The research literature supports offering monetary incentives to FGD participants (i.e., researchers found that participants were more likely to participate in qualitative research if provided a monetary incentive, as compared to a nonmonetary incentive or no incentive).1 Similarly, other researchers discussed that focus group participants should receive incentives as it takes time and effort to participate.2

  10. Assurance of Confidentiality Provided to Respondents

The current project will fully comply with the Privacy Act of 1974 (5 U.S.C. Section 552a, 1998). The Act may apply to some data collection activities (e.g., the study will collect first name, last name, practice ZIP Code, and email addresses from survey respondents).

We will assure all respondents that their data will be kept private to the extent allowed by law. In addition, communications to inform participants about the data collection and any other introductory materials about the data collection will indicate HRSA’s Federal status and the purpose of the data collection. Please see Attachments A4-A11 for communication/recruitment materials (e.g., email notifications). Supporting Statement B contains additional information on study procedures related to the communications.

The study fulfills the criteria for expedited review under 45 CFR 46.110, category #7, and was approved by expedited review by the JBS Institutional Review Board (IRB) without requirement for Continuing Review (see Attachment A12).

  11. Justification for Sensitive Questions

Personally identifiable information (PII), including participants' names and email addresses, will be collected to administer the surveys and FGDs. The surveys ask for first and last names and practice ZIP Code(s) to link survey data with other data sources (e.g., Medicaid data). Collection of these data is necessary for the Impact Study to link data solely for the purpose of identifying impacts of the PMHCA program as a whole on access to behavioral health care; individual HPs and their practices will not be individually evaluated.

For the Family/Caregiver FGDs, PII, including participants' names, contact information, preferred language, and ZIP Code, as well as their child's/adolescent's age, gender identity, and race and ethnicity, will be collected from the identified families/caregivers via a demographic questionnaire prior to their participation in the focus group. Collection of these data is necessary for the Impact Study to ensure diverse sampling and to understand differences in the experiences of families/caregivers based on the characteristics collected. We have elected to include OMB's SPD-15 race and ethnicity question with minimum categories only (i.e., American Indian or Alaska Native, Asian, Black or African American, Hispanic or Latino, Middle Eastern or North African, Native Hawaiian or Pacific Islander, White) because the potential benefit of detailed data in this collection would not justify the additional burden to the agency or the public, given that the demographic data will primarily be used for FGD sampling. Additionally, because the Family/Caregiver FGDs will occur only once (anticipated in 2025) and the Impact Study will conclude in 2026, we will not adopt SPD-15 Figure 1 in the future.

During the Family/Caregiver FGDs, participants will be asked to share only their first name (or preferred name) and the age of their child; participants will not be required to share this information with the group.

All data and information from participants will be stored in secure facilities for 10 years after the study is completed, and we will adhere to Federal requirements regarding the collection and storage of PII.

  12. Estimates of Annualized Hour and Cost Burden

This section summarizes the total burden hours for this information collection effort, in addition to the cost associated with those hours.

12A. Estimated Annualized Burden Hours

Exhibit 3 contains estimated response burdens for each subject population participating in the Impact Study’s data collection activities.


We calculated estimates for the response-hour burden (1) based on the methodology being used with each respondent population and (2) using the average completion time based on instrument pilot testing. Supporting Statement B contains additional information on pilot tests of the data collection tools to be used in the evaluation, as well as summaries of pilot test feedback and changes made to the data collection tools based on this feedback.


It should be noted that the burden estimates in the 60- and 30-day Federal Register Notices were aggregated by type of respondent (e.g., physician; nurse practitioner; physician assistant; counselor, social worker, and other community and social services specialist) and are disaggregated here.

Exhibit 3. Estimated Annualized Burden Hours

Form Name                                      No. of Respondents   No. Responses per Respondent   Total Responses   Average Burden per Response (in hours)   Total Burden Hours*
HP Impact Survey                               21,070               2                              42,140            0.17                                     7,163.80
Family/Caregiver FGD                           42                   1                              42                1.00                                     42.00
Family/Caregiver Demographic Questionnaire     270                  1                              270               0.08                                     21.60
Total                                          21,382                                              42,452                                                     7,227.40

* Totals are rounded up in ROCIS.


12B. Estimated Annualized Burden Costs

Exhibit 4 summarizes the estimated, annualized cost burden to respondents of the Impact Study. Median hourly wage estimates and occupational profile codes were obtained from BLS 2023 wage estimates (the most recently available). The total respondent cost is calculated as (hourly wage rate × 2 [to account for overhead costs]) × (time spent on the instrument × number of responses). A worked example of this calculation follows Exhibit 4.

Exhibit 4. Estimated Annualized Burden Costs

Type of Respondent (Occupational Profile Code)                                             Total Burden Hours   Hourly Wage Rate*   Total Respondent Costs
Physicians (29-1215; 29-1216; 29-1221)                                                     4,584.90             $207.26             $950,266.37
Nurse Practitioners (29-1171)                                                              1,361.02             $121.40             $165,227.83
Physician Assistants (29-1071)                                                             429.76               $125.02             $53,728.60
Counselor, Social Worker, and Other Community and Social Services Specialist (21-0000)     573.24               $50.00              $28,662.00
Other Health Care Professionals (29-0000)                                                  214.88               $77.72              $16,700.47
Family member/Caregiver (00-0000)**                                                        63.60                $46.22              $2,939.59
Total                                                                                      7,227.40                                 $1,217,524.86

*SOURCE: U.S. Department of Labor, Bureau of Labor Statistics. (2024, April). Occupational employment and wage statistics. https://www.bls.gov/oes/current/oes_stru.htm

**The median hourly wage for all occupations included in the Occupational employment and wage statistics was used for the family/caregiver Hourly Wage Rate.
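
To make the Section 12B calculation concrete, the following is an illustrative check (in Python) that reproduces the Exhibit 4 arithmetic by multiplying each row's total burden hours by its listed hourly wage rate; the listed rates appear to already incorporate the overhead doubling described in the formula above. This sketch is for transparency only and is not part of the study software.

# Illustrative check of the Exhibit 4 arithmetic; figures are copied from the exhibit.
rows = [
    ("Physicians", 4584.90, 207.26),
    ("Nurse Practitioners", 1361.02, 121.40),
    ("Physician Assistants", 429.76, 125.02),
    ("Counselor, Social Worker, and Other Specialists", 573.24, 50.00),
    ("Other Health Care Professionals", 214.88, 77.72),
    ("Family member/Caregiver", 63.60, 46.22),
]

total_hours = 0.0
total_cost = 0.0
for name, burden_hours, hourly_rate in rows:
    cost = burden_hours * hourly_rate            # total burden hours x hourly wage rate
    total_hours += burden_hours
    total_cost += cost
    print(f"{name}: {burden_hours:,.2f} hours x ${hourly_rate:,.2f} = ${cost:,.2f}")

print(f"Total: {total_hours:,.2f} hours, ${total_cost:,.2f}")    # 7,227.40 hours, $1,217,524.86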

  13. Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers/Capital Costs

Other than time, there is no cost to respondents.

  14. Annualized Cost to Federal Government

The cost to the Federal Government for this 4-year project is $3,170,965, or $792,741 per year on average. These costs cover all aspects of data collection design, testing, collection, and analysis. The method used to estimate the cost includes preparation of a detailed line-item budget that specifies all staff/consultant rates and labor hours by task, along with operational and other direct costs (e.g., telephone calls, reproduction).

In addition, it is estimated that one full-time equivalent HRSA staff member (Grade 13, Step 4) will spend 20% of his or her time (416 hours) to manage and administer the project. Assuming an annual salary of $194,638.50 ($129,759 × 1.5 to account for benefits), Government personnel costs will be $38,927.70 ($25,951.80 × 1.5 to account for benefits) over a 1-year period.3
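
As an illustrative check of the arithmetic above (a minimal sketch: the salary, benefits multiplier, and 20% effort figures come from the paragraph above, and the 2,080-hour work year is our assumption):

# Illustrative check of the Federal personnel cost estimate; not official guidance.
base_salary = 129_759.00       # GS-13, Step 4 annual salary cited above
benefits_multiplier = 1.5      # factor used above to account for benefits
effort_share = 0.20            # 20% of one full-time equivalent
hours_per_year = 2_080         # standard full-time work year (assumption)

loaded_salary = base_salary * benefits_multiplier                   # $194,638.50
effort_hours = effort_share * hours_per_year                        # 416 hours
personnel_cost = base_salary * effort_share * benefits_multiplier   # $25,951.80 x 1.5 = $38,927.70
print(loaded_salary, effort_hours, personnel_cost)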

  15. Explanation for Program Changes or Adjustments

This is a new information collection effort.

  16. Plans for Tabulation, Publication, and Project Time Schedule


Project Time Schedule. As Exhibit 5 shows, the project covers a 2-year period commencing upon receipt of OMB approval.


Exhibit 5. Project Time Schedule

Activity                                                Time Schedule
Obtain OMB approval                                     June–September 2024
Administer HP Impact Survey                             1–4 months after OMB approval and at the same timeframe 1 year after OMB approval
Administer Family/Caregiver FGD                         6–9 months after OMB approval
Administer Family/Caregiver Demographic Questionnaire   3–6 months after OMB approval (prior to FGDs)

Analysis Plan. The HRSA MCHB Impact Study will encompass the use of multiple instruments, collection of information, and analytical strategies. Both qualitative and quantitative data will be collected and analyzed to assess (1) the impact of HRSA’s PMHCA program on changes in children/adolescents’ and families/caregivers’ access to behavioral health care; their subsequent receipt and utilization of behavioral health care services, including culturally and linguistically appropriate care, and related behavioral health impacts, and (2) monetary and societal PMHCA program costs and benefits. Qualitative data analysis will use a thematic approach to uncover underlying themes among the Family/Caregiver FGD responses. Quantitative data analyses will include the use of descriptive statistics, univariate analysis, and multivariate analysis. Finally, triangulation of methods (i.e., qualitative and quantitative data), when feasible, will be used to examine additional aspects of program achievements and impact that may not be accomplished with individual methods. The planned qualitative and quantitative data analyses are explained in more detail in the remainder of this section.

Qualitative Data Analysis: Analysis will begin with JBS cleaning transcripts based on audio recordings of the Family/Caregiver FGDs. Data will be analyzed and coded both deductively (i.e., based on pre-existing concepts) and inductively (i.e., concepts arising from the transcripts) by a team using a process of thematic analysis. To guide analysis, the team will develop a qualitative codebook that contains initial codes (i.e., conceptual tags to apply to chunks of text) derived from the research questions, Family/Caregiver FGD guide questions, and literature on PMHCA/child psychiatric access programs. The codebook will include code descriptions, inclusion and exclusion criteria, and exemplars (i.e., example quotations) from the transcripts. The team lead, a trained qualitative researcher, will develop an analysis protocol, including use of the codebook, upon which the team will be trained.

Coding will begin with a deep reading of the transcripts to promote overall understanding of the perspectives. A primary analyst will code each transcript, with coding assessed by a reviewer. Primary coders and reviewers will use electronic memos embedded in the transcript files to exchange questions and answers and to refine coding. Interrater reliability (i.e., consistency between coders) will be assessed by comparing analyses during weekly team meetings and through the primary coder and reviewer exchange; we will also explore the use of a software algorithm for interrater reliability. Team use of ATLAS.ti qualitative data analysis software will facilitate the coding and analytic process.
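
The interrater-reliability algorithm referenced above has not been specified in this statement. As one illustration only, the sketch below computes Cohen's kappa, a common agreement statistic for paired coder assignments; the code labels and segment assignments are hypothetical and are not drawn from the study codebook.

# Illustrative only: Cohen's kappa as one possible interrater-reliability statistic.
from sklearn.metrics import cohen_kappa_score

# Codes applied by the primary coder and the reviewer to the same ten text segments
# (hypothetical labels, not the study codebook).
primary_coder = ["access", "access", "cost", "satisfaction", "access",
                 "cost", "impact", "impact", "satisfaction", "access"]
reviewer      = ["access", "cost", "cost", "satisfaction", "access",
                 "cost", "impact", "access", "satisfaction", "access"]

kappa = cohen_kappa_score(primary_coder, reviewer)
print(f"Cohen's kappa = {kappa:.2f}")   # values near 1 indicate strong agreement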

As coding progresses, the analysts will identify potential key themes and subthemes. After finishing coding of the transcripts, a primary analyst and reviewer will be assigned to produce a summary of findings for each applicable research question, based on the coding and identification of themes, including example de-identified quotations from the participants. During the summarization process, the team will discuss fit between the constructed themes and the coded data and refine the summaries as needed.

To enhance reliability and validity, the study team will triangulate findings by data type (i.e., quantitative data, qualitative data) and data source (i.e., qualitative data, literature review, expert opinion). Study team members will convene a series of analytical working sessions to compare and contrast findings derived by data types and sources for each research question to identify potential areas requiring further analysis and refinement.

Quantitative Data Analysis: The statistical analysis of quantitative data will consider the research questions, measurement characteristics of the selected variables, sample size, distribution of the data, scale of measurement of the variables in the analysis, assumptions underlying the potential statistical tests, statistical power, interpretability of findings and their accessibility for all stakeholders, and acceptability of the approach and results to the scientific community. The goal of the quantitative analytic approach is to produce findings that are credible and accepted by all PMHCA stakeholders.

To guide quantitative data analysis, JBS’s analytic strategy will be implemented at three levels: (1) assessing the distribution of the variables in the primary and secondary data sets, (2) conducting bivariate analyses and assessing associations between variables to identify their interrelationships, and (3) conducting advanced multivariate and multivariable statistical tests to suggest meaningful inferences that can be put forward in response to the Impact Study’s key research questions.

Level 1. Univariate Analyses and Assessment of the Distribution of Variables: Univariate analysis is the most basic level of analysis, considering one variable at a time; its main purpose is to describe the data. We will calculate measures of central tendency (i.e., mean, median, mode) to provide a simple and concise summary of each variable. Such measures allow us to represent each variable at each time point, suggesting the center value of each data element at individual points in time, as well as a general indication of change over time for variables with multiple time points.

We will calculate statistics that summarize the distribution of each included data element or measure to obtain a picture of how much spread there is within and between variables. These statistics will include standard deviation (i.e., variability or spread of data), skewness (i.e., symmetry of the data distribution), kurtosis (i.e., peakedness or flatness of the data distribution), and range (i.e., separation between low and high values for each variable). These statistics will also provide information important to assessing whether the assumptions of potential statistical tests are met and to consequent analytic decisions.
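
For illustration, the sketch below shows how these univariate summaries could be produced with standard tooling; the variable (screen_rate) and its values are hypothetical placeholders rather than study data.

# Illustrative univariate summaries for one hypothetical survey measure.
import pandas as pd

screen_rate = pd.Series([0.42, 0.55, 0.37, 0.61, 0.55, 0.48, 0.71, 0.33, 0.55, 0.50])

summary = {
    "mean": screen_rate.mean(),
    "median": screen_rate.median(),
    "mode": screen_rate.mode().iloc[0],             # most frequent value
    "std": screen_rate.std(),                       # variability or spread
    "skewness": screen_rate.skew(),                 # symmetry of the distribution
    "kurtosis": screen_rate.kurt(),                 # peakedness or flatness
    "range": screen_rate.max() - screen_rate.min(),
}
print(summary)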

We will examine errors or anomalies in the data distributions, using appropriate visualization methods (e.g., histograms may identify spikes or gaps that suggest errors or outliers; box plots may graphically present central tendency, spread, and outliers; scatter plots of two variables may reveal patterns or clusters of related data points). All these measures may be useful in informing subsequent analyses and interpretations.

We will assess the extent and nature of missing or suppressed values, using frequency distributions. Such an analysis will help guide decisions about imputation, which is an important decision point because restricting analysis to only observations with complete data could bias sample statistics (e.g., regression coefficients), resulting in less powerful and reliable estimates. To address these potential limitations, JBS will consider the imputation of missing values for all variables with a high number of missing values. We will review, select, and apply the most efficient method based on careful consideration of the dataset and the type of missing data.
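
As an illustration of the kind of imputation decision described above, the sketch below applies one widely used approach (scikit-learn's iterative imputer); the dataset, column names, and choice of imputer are assumptions for demonstration, not the method the study has selected.

# Illustrative imputation of missing values; the data and columns are hypothetical.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (activates IterativeImputer)
from sklearn.impute import IterativeImputer

df = pd.DataFrame({
    "patients_screened": [12, 30, np.nan, 22, 18, np.nan, 40],
    "referrals_made":    [3, 9, 5, np.nan, 4, 2, 11],
})

print(df.isna().sum())                     # frequency of missing values per variable

imputer = IterativeImputer(random_state=0)
imputed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)
print(imputed.round(1))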

In summary, our first level of analysis will assess the accuracy of measurements, identify sources of error, suggest methods with which to address error, and provide essential descriptive information. Such analyses provide a foundation for subsequent more advanced statistical tests.

Level 2. Bivariate Analyses and Assessing Interrelationships Between Variables: Bivariate analyses are the building blocks on which subsequent, more sophisticated multivariable analyses rest. Examples of bivariate analyses include the correlation of screening rates with measures of unmet need or the comparison of impacts in subgroups (e.g., PMHCA versus non-PMHCA geographic areas and PMHCA-participating versus PMHCA-nonparticipating respondents to the HP Impact Survey). In addition to providing insights into individual variables and sets of individual variables, bivariate analyses assist in making informed decisions about more advanced statistical tests. We will conduct bivariate tests as appropriate and based on whether the data (1) are nominal, ordinal, or continuous (i.e., interval or ratio), (2) are paired or independent, and (3) can be assumed to be normally distributed. Parametric tests will include Student’s t-test and ANOVA. Nonparametric tests will include chi-square (or Fisher’s exact), Mann–Whitney U, Kruskal–Wallis, and Wilcoxon. Comparisons of groups via the above tests will show the significance of differences in central tendency and variability or in observed versus expected frequencies.

We will use correlation to test the strength of the relationship between variables, sometimes presented as the amount of variance the variables share (i.e., the square of the correlation coefficient, called the coefficient of determination) or, conversely, do not share (i.e., the amount of unexplained variance, called the coefficient of alienation). Correlation also identifies whether relationships between variables are positive, negative, or lacking. These tests can be applied to continuous variables (i.e., Pearson product-moment), continuous and dichotomous combinations (i.e., point-biserial), or two dichotomous variables (i.e., phi coefficient). In addition to assessing independence/dependence, correlation can be useful in identifying potential covariates for more advanced analyses.
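
The sketch below illustrates, with hypothetical values, how several of the bivariate tests and correlations named above could be computed; it demonstrates the class of tests rather than the study's actual analysis code or data.

# Illustrative bivariate tests; the groups and values are hypothetical, not study data.
import numpy as np
from scipy import stats

pmhca_area     = np.array([0.62, 0.55, 0.71, 0.66, 0.58, 0.69])   # e.g., screening rates
non_pmhca_area = np.array([0.48, 0.52, 0.45, 0.50, 0.41, 0.47])

# Parametric comparison of means and its nonparametric analogue.
t_stat, t_p = stats.ttest_ind(pmhca_area, non_pmhca_area)
u_stat, u_p = stats.mannwhitneyu(pmhca_area, non_pmhca_area)

# Chi-square test of observed versus expected frequencies (2 x 2 contingency table).
contingency = np.array([[30, 10],
                        [18, 22]])
chi2, chi_p, dof, expected = stats.chi2_contingency(contingency)

# Pearson correlation (two continuous measures) and point-biserial correlation
# (one continuous measure, one dichotomous indicator).
screening  = np.array([0.62, 0.55, 0.71, 0.66, 0.58, 0.69, 0.48, 0.52])
unmet_need = np.array([0.20, 0.25, 0.15, 0.18, 0.24, 0.16, 0.35, 0.30])
group      = np.array([1, 1, 1, 1, 0, 0, 0, 0])

r, r_p = stats.pearsonr(screening, unmet_need)
rpb, rpb_p = stats.pointbiserialr(group, screening)

print(t_p, u_p, chi_p, r ** 2)   # r**2 is the coefficient of determination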

Level 3. Advanced Multivariable and Multivariate Statistical Tests to Suggest Meaningful Analytic Inferences: Based on the Impact Study key research questions and consideration of individual variables in Level 1 analyses and Level 2 bivariate analyses, we will determine the appropriate multivariable (i.e., analysis with one outcome variable) and multivariate (i.e., analysis with more than one outcome variable) tests. The principal tests will include assessment of change over time.

We will conduct interrupted time series or event study models to examine trends before and after the implementation of PMHCA in a given region or area. Because PMHCA programs were implemented at different time points in different places, we will conduct analyses standardized in “event time” (i.e., years relative to implementation) when combining areas with different implementation timelines. Depending on the outcome of interest, we will use generalized linear models (GLM); generalized estimating equations, an extension of GLM; or repeated measures ANOVA, as appropriate, based on the nature of the data under consideration and the assumptions and analytic merits of each test.
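
As a hedged illustration of the event-time approach, the sketch below fits a simple event study specification as a generalized linear model with statsmodels; the data frame, column names, and Gaussian family are placeholders, and the actual specification will depend on the outcome and data under consideration.

# Illustrative event study in "event time" (years relative to PMHCA implementation).
# The data frame and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "outcome":    [0.40, 0.42, 0.41, 0.50, 0.55, 0.38, 0.39, 0.40, 0.41, 0.42],
    "event_time": [-2, -1, 0, 1, 2, -2, -1, 0, 1, 2],   # years relative to implementation
    "area":       ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
})

# Indicators for each event-time period (the earliest period is the reference by default),
# with area fixed effects; a GLM with a Gaussian family reduces to a linear model here.
model = smf.glm("outcome ~ C(event_time) + C(area)", data=df,
                family=sm.families.Gaussian()).fit()
print(model.summary())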

We will also use difference-in-differences (DID) analysis to assess the relative impact of PMHCA by comparing changes in outcomes over time in areas in which PMHCA was implemented with changes in areas in which the program was not implemented. DID is particularly useful when, as in our case, it is not possible to randomly assign areas to treatment and control groups. As such, this analysis can compare how outcomes unfold over time for children and families more likely to be exposed to PMHCA programs with how outcomes unfold for children and families less likely to be exposed because they are in non-PMHCA areas. Assuming similar trends in the two groups prior to exposure to PMHCA, DID allows an assessment of the impact of PMHCA by comparing the difference between these two groups before and after implementation. If tests of the pre-trend assumption for DID analysis are not satisfied, we can conduct bounding exercises to assess the direction and extent of potential bias.
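
For illustration only, the minimal difference-in-differences sketch below estimates the program-by-period interaction described above; the data and column names are hypothetical, and the study's actual models will reflect the design, covariates, and stratifications described in this section.

# Illustrative difference-in-differences (DID) regression; the data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "outcome":    [0.40, 0.43, 0.41, 0.52, 0.38, 0.39, 0.37, 0.40],
    "pmhca_area": [1, 1, 1, 1, 0, 0, 0, 0],   # 1 = area with a PMHCA program
    "post":       [0, 0, 1, 1, 0, 0, 1, 1],   # 1 = period after implementation
})

# The coefficient on the interaction term is the DID estimate of the program's impact.
did = smf.ols("outcome ~ pmhca_area + post + pmhca_area:post", data=df).fit()
print(did.params["pmhca_area:post"])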

Because PMHCA implementation timing and methods vary by location, there will be variation in the populations served in different regions at different times. Therefore, after estimating our main DID models on the overall population, we will stratify our models, to the extent possible, based on information about the implementation method (e.g., whether a prescribing behavioral health provider was part of the program) or population characteristics (e.g., rurality, racial and ethnic minority, Health Professional Shortage Areas). We will also stratify the model to target specific subgroups relevant for our extrapolation of downstream societal outcomes.

We will use estimates derived from the DID and event study models as inputs into an exploratory cost-benefit analysis. Many of the downstream outcomes of interest (e.g., high school graduation, employment) have not yet been realized by patients currently being treated by physicians participating in PMHCA. Therefore, we will conduct a hypothetical exercise, using estimates from the literature, to extrapolate the potential impact of any observed changes in utilization or receipt of care on these longer-term outcomes.

Publication Plan. As stated in Section A.2, the goal of the Impact Study is to guide future program decisions regarding the PMHCA program. It is therefore important to prepare and disseminate information that clearly and concisely presents Impact Study results so that they can be appreciated by both technical and nontechnical audiences. Publication activities will include:

  • Hosting two webinars annually—one for HRSA staff, and one for HRSA staff and PMHCA awardees—to provide an overview of the progress of the Impact Study, including:

    • Background on the current contract and on what the Bureau will get from this investment

    • Accomplishments and lessons learned

    • Cost-benefit of the PMHCA program

    • PMHCA program impact on:

      • Children/adolescents’ access to behavioral health care, including among groups historically underserved

      • Children/adolescents' receipt of behavioral health services

      • Children/adolescents' subsequent behavioral health care utilization

      • Receipt of culturally and linguistically appropriate behavioral health care

      • Children/adolescents' behavioral health outcomes

  • Preparing and submitting:

    • An annual report that summarizes the findings presented in the webinar for HRSA staff

    • A comprehensive final report on the aggregate results of the national Impact Study based on the data collected and analyzed. This comprehensive final report will be published on the public-facing PMHCA website.

  17. Reason(s) Display of OMB Expiration Date is Inappropriate

The OMB number and expiration date will be displayed on every page of every form/instrument.

  18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.



1 Kelly, B., Margolis, M., McCormack, L., LeBaron, P. A., & Chowdhury, D. (2017). What affects people’s willingness to participate in qualitative research? An experimental comparison of five incentives. Field Methods, 29(4), 333–350. https://doi.org/10.1177/1525822X17698958

2 Adler, K., Salanterä, S., & Zumstein-Shaha, M. (2019). Focus group interviews in child, youth, and parent research: An integrative literature review. International Journal of Qualitative Methods, 18. https://doi.org/10.1177/1609406919887274

3 SOURCE: U.S. Office of Personnel Management. (n.d.). Salary Table 2024-DCB. https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/2024/general-schedule




