SPF-Rx Supporting Statement A - 11-7-2022

Strategic Prevention Framework for Prescription Drugs (SPF Rx)

OMB: 0930-0377

Program Evaluation for Prevention Contract Evaluation Activities


Supporting Statement

A. Justification

A.1. Circumstances of Information Collection

The Substance Abuse and Mental Health Services Administration’s (SAMHSA) Center for Substance Abuse Prevention (CSAP) is requesting approval from the Office of Management and Budget (OMB) for a revision to the data collection activities related to the cross-site evaluation of SAMHSA’s Strategic Prevention Framework for Prescription Drugs (SPF Rx) fiscal year (FY) 2021 cohort. The SPF Rx (OMB No. 0930-0377) expiration date is February 29, 2024. SAMHSA funds the Program Evaluation for Prevention Contract (PEPC), which supports the cross-site evaluation activities for SPF Rx.

SAMHSA requests approval for the following data collection tools in Exhibit 1:

Exhibit 1. Data Collection Tools


Instrument | OMB Request | Attachment
Annual Reporting Tool (survey instrument; formerly the Annual Implementation Instrument [AII]) | Major Revision | 1
Grantee-Level Prescription Drug Monitoring Program (PDMP) Outcomes Module (secondary data collection instrument) | Minor Revision | 2
Community-Level PDMP Outcomes Module (secondary data collection instrument) | Minor Revision | 3
Grantee-Level Interview (interview protocol) | Major Revision | 4


The grant program is summarized below.

  • The SPF Rx grant program is designed to address nonmedical use of prescription drugs as well as opioid overdoses by raising awareness about the dangers of sharing medications and by working with pharmaceutical and medical communities on the risks of overprescribing. SPF Rx program grantees also raise community awareness and bring prescription drug abuse prevention activities and education to schools, communities, parents, prescribers, and their patients. SAMHSA also plans to track reductions in opioid overdoses and the incorporation of Prescription Drug Monitoring Program (PDMP) data into needs assessments and strategic plans as indicators of the program’s success.

  • At the end of FY 2021, SAMHSA awarded the 5-year SPF Rx grant to 18 states and three tribal organizations. These grantees identified 105 subrecipient community partners in their grant applications (we estimate 110 subrecipients below as the eventual number of funded subrecipients).

  • The SPF Rx evaluation consists of three interrelated components: (1) performance monitoring, (2) a process or formative evaluation, and (3) an outcome or impact evaluation.

A.1.a. Statement of Need for Program Evaluation for Prevention Contract (PEPC) Evaluation Activities of Strategic Prevention Framework for Prescription Drugs (SPF Rx)

SPF Rx grantees use SAMHSA’s Strategic Prevention Framework (SPF) to plan, implement, and evaluate their prevention projects. The SPF, which comprises five steps and two guiding principles (cultural competency and sustainability), provides a comprehensive process for focusing communities’ prevention initiatives on the most pressing needs and priority substance use and related health problems.


Use of the SPF ensures that prevention efforts are data-driven, dynamic (i.e., involve continuous needs assessment and adjustment of prevention strategies, as needed), engage diverse state, tribal, and community partners, and focus on population-level change. The SPF Rx program has several important requirements, including that grantees must work collaboratively with other state, tribal, and community stakeholders (schools; businesses; law enforcement; pharmaceutical and medical communities; youth, young adults, and other community members, including parents) to achieve their objectives. Most, though not all, grantees choose to fund subrecipient community partner organizations to implement interventions at the community level. Grantees must also design and implement their own local evaluations and participate in SAMHSA’s cross-site evaluation.

SPF Rx Evaluation

Opioid misuse, opioid use disorder, and opioid overdose are significant public health issues in the United States (U.S.). In the most recent survey, SAMHSA’s National Survey on Drug Use and Health (NSDUH) estimated that 3.3% of respondents reported misuse of prescription pain relievers in the past year (SAMHSA, 2020). Since 1999, the number of deaths from drug overdoses has quadrupled, including a 5% increase from 2018 to 2019 (Centers for Disease Control and Prevention, 2021). Drug-related overdose has become the nation’s leading cause of accidental death, with deaths from opioid overdose playing a significant role in this increase (Hedegaard, Minino, & Warner, 2018). Opioid overdose fatalities can be attributed to both prescription medications, such as morphine, codeine, oxycodone, and others, and to illegal drugs such as heroin and illegally manufactured fentanyl and its analogs. While synthetic opioids (primarily fentanyl) contributed to the majority of opioid overdose deaths in 2020, prescription opioid overdoses continued to contribute to a substantial number of deaths (16,416 deaths in 2020; National Institute on Drug Abuse, 2022). In response to this crisis, the SPF Rx grant program seeks to provide infrastructure for states, tribal entities, and their subrecipients to address issues of prescription drug misuse.


The SPF Rx evaluation will help SAMHSA better understand whether the SPF Rx program impacts opioid prescribing, opioid misuse, prescription opioid overdoses, and related deaths at the state, tribal, and subrecipient community levels, and which programmatic factors contribute to that change. The tools included in this package allow evaluators to collect and process outcome data to better understand the program. In addition, the Public Health Service Act requires SAMHSA to monitor program performance and document the impact of government funding. The SPF Rx data collection tools also provide information required for proper performance of SAMHSA’s required agency function of program oversight.

A.1.b. Overview of Study Design and Evaluation Questions

SPF Rx Evaluation

Data collected through the tools described in this statement will be used for the national cross-site evaluation of SAMHSA’s SPF Rx program FY 2021 cohort. This request for revision covers continued data collection through FY 2025 (three years), but the evaluation is expected to continue through FY 2026 to cover all 5 years of the FY 2021 SPF Rx cohort (before expiration of this requested approval, a new OMB package will be submitted to cover the remaining year). The PEPC team will systematically collect and maintain Annual Reporting Tool (ART) and Grantee- and Community-level PDMP Outcomes data submitted by SPF Rx grantees through the online PEPC Cross-Site Data Collection Platform (CS-DCP). The evaluation also includes qualitative interviews with SPF Rx grantee project directors using the Grantee-Level Interview tool.


The following monitoring and evaluation questions guide the cross-site evaluation:

  • Performance Monitoring

    • Did grantees and subrecipients conduct activities as planned?

    • Did grantees and subrecipients use resources as intended?

    • To what extent (how quickly and appropriately) did grantees and subrecipients progress through the steps outlined in the SPF?

  • Process Evaluation

    • How did SPF Rx grantees use the SPF to plan their activities and manage their own progress?

    • What barriers and facilitators affected SPF Rx implementation and outcomes (e.g., characteristics of partnerships, concentration of effort, infrastructure, laws and regulations, state/community contextual factors)?

    • How did grantees address implementation barriers or leverage implementation facilitators?

  • Outcome Evaluation

    • Was the implementation of SPF Rx associated with desired proximal and distal outcomes, including safer opioid prescribing practices and decreases in prescription drug misuse and opioid overdose?

    • How did SPF Rx grantees use PDMPs to improve proximal and distal outcomes?


The evaluation includes yearly data collection of the ART and of the Grantee- and Community-Level PDMP Outcomes Modules, and data collection every other year using the Grantee-Level Interview. The ART contains questions for subrecipients on their needs assessment, data sources, resources, capacity building, and sustainability, along with questions for both grantees and subrecipients on the SPF Rx interventions they implemented and their targeted population and reach. The ART will serve as a source of both process and implementation data and will provide independent and moderator variables for the evaluation of outcomes. Grantee- and Community-Level PDMP outcomes are one set of dependent variables for the evaluation and were selected as indicators of improvements in the use of PDMPs. The PDMP Outcomes Modules include two categories of indicators: opioid prescribing patterns and prescriber use of PDMPs. The Grantee-Level Interview seeks to answer the performance monitoring and process evaluation questions in more detail by focusing on how grantees implemented their grants, including their progress through the SPF steps and factors such as infrastructure, capacity, and collaboration. This qualitative interview will be conducted in Years 1, 3, and 5 of the evaluation.


Exhibit 2 provides an overview of the data collection method, frequency of data collection, and number of times each tool is collected for the SPF Rx data collection instruments.


Exhibit 2. SPF Rx Cross-Site Evaluation Data Collection Tools (N = 21 grantees and 110 subrecipients)


Instrument | Data Collection Method | Frequency of Data Collection | Maximum Number of Data Collections | Attachment Number
ART | Grantees and subrecipients both submit in PEPC CS-DCP. | Yearly (subrecipients complete for each year funded) | 5 times: Years 1–5 | 1
Grantee-Level PDMP Outcomes Module | Grantees submit in PEPC CS-DCP. | Yearly | 5 times: Years 1–5 | 2
Community-Level PDMP Outcomes Module | Grantees submit for subrecipient communities in PEPC CS-DCP. | Yearly | 5 times: Years 1–5 | 3
Grantee-Level Interview | Grantee interviews are recorded and transcribed; stored on secure drive. | Baseline and follow-ups | 3 times: Years 1, 3, and 5 | 4


To reduce burden, the SPF Rx evaluation also will use data from other data sources including SPF Rx grantee proposals, SAMHSA's Performance Accountability and Reporting System (SPARS), and secondary outcomes data sources. Document review of grantee proposals will provide information on SPF Rx grantee planned activities and targets. In SPARS (OMB #0930-0354 Expiration Date: 10/31/2024), SPF Rx grantees report quarterly on their grant progress and challenges into the Division of State Programs-Management Reporting Tool (DSP-MRT) and annually into the outcomes module, providing grantee- and community-level information on opioid overdoses, steps to enhance access to and use of PDMP data, and targeted consumption pattern and consequence National Outcomes Measures (NOMS). Secondary outcomes data sources for the SPF Rx evaluation will include the National Vital Statistics System (NVSS - for opioid mortality), the National Poison Data System (NPDS - for opioid overdose calls), and IQVIA (for prescribing outcomes).


The PEPC team carefully reviewed each of the data sources to ensure nonduplication of data collection efforts and streamline data collection for the PEPC evaluation.

Potential Impacts of SPF Rx Data Collection

SAMHSA’s SPF Rx program is designed with the premise that changes at the community level will lead to measurable changes in substance use and misuse at the state and tribal levels. It assumes that effective state, tribal, and community change requires comprehensive efforts targeting youth and adults, as well as the environments in which they live.


The goal of SAMHSA’s SPF Rx cross-site evaluation is to provide data on activities and services that were delivered; program participants; infrastructure supports that facilitate program implementation; implementation barriers; program outcomes and impacts; and the extent to which grantees were prepared or able to sustain their programs at the end of the grant period.

A.2. Purpose and Use of Information

The theory of change guiding the SPF Rx program is that well planned and implemented prevention efforts at the community and grantee levels will result in population-level change. This includes reduced prescription drug misuse and continued enhancements to state, tribal, and community prevention systems. This section describes the practical utility of the SPF Rx data collection.

SPF Rx Evaluation

The SPF Rx evaluation is designed to objectively and rigorously measure population-level changes in prescription drug misuse and its impact (e.g., opioid overdose morbidity and mortality), and describe conditions and changes in state, tribal, and community prevention systems (e.g., PDMP use). Across its 5 years, the evaluation will address performance monitoring, process evaluation, and outcome evaluation. It employs mixed methods, with in-depth qualitative and quantitative data, to answer the evaluation questions and extend SAMHSA’s understanding of what is required to implement strategies and prevent prescription drug misuse and opioid overdose.


The SPF Rx cross-site evaluation is expected to have important program and policy implications at the federal, state, tribal, and community levels. It will provide valuable information to the prevention field about best practices in these real-world settings, including what types of interventions should be funded and implemented to reduce prescription drug misuse. Additionally, the evaluation will provide information about ways to build capacity for PDMP access and use at the state, tribal, and community levels. The evaluation is ultimately designed to use data to inform resource allocation and programming to prevent prescription drug misuse.

Instrumentation

The SPF Rx data collection efforts include the ART, Grantee- and Community-Level PDMP Outcomes Modules, and Grantee-Level Interview. These cross-site measures provide process data regarding progression through the SPF model; challenges and successes experienced during these steps; PDMP infrastructure; intervention implementation; prescriber use of PDMPs and prescribing patterns; training and technical assistance (T/TA); and funding. This data collection emphasizes the SPF Rx impact on outcomes related to the prevalence of prescription drug misuse, and capacity for and use of PDMPs for monitoring prescriber behavior and prevention strategies. The emergence of prescription drug misuse as a serious public health issue highlights the critical need for the SPF Rx cross-site evaluation to examine the implementation and effectiveness of prevention interventions developed to target this issue.

The ART, Grantee- and Community-Level PDMP Outcomes Modules, and Grantee-Level Interviews are used to collect data to measure the main constructs of interest to answer the SPF Rx evaluation questions. Collecting data through these instruments at multiple time points is necessary to assess the grantees’ progress and change in outcomes over the course of the grant. The instrumentation is described in detail below.

Annual Reporting Tool (ART, Attachment 1):

The ART is a survey instrument collected through PEPC’s CS-DCP. It is designed to be completed by grantees and subrecipient project directors. The PEPC evaluation team collects ART data yearly to monitor state, tribal entity, and community-level performance, and to evaluate the effectiveness of the SPF Rx program across states, tribal entities, and subrecipient communities. The ART provides process data related to funding use and effectiveness, organizational capacity, collaboration with community partners, data infrastructure, planned intervention targets, intervention implementation (e.g., categorization, timing, dosage, and reach), evaluation, contextual factors, T/TA needs, and sustainability. Repeated collection of these data is needed to: (1) track the grantees and subrecipients’ progress and changes on the indicators over time; and (2) allow SAMHSA and the grantees to monitor performance and ongoing implementation.

The ART included in this data collection builds upon the prior SPF Rx evaluation Annual Implementation Instrument primarily by integrating the Consolidated Framework for Implementation Research (CFIR) (Damschroder et al., 2009) constructs to provide questions on barriers that map to the SPF steps. In addition, the evaluation team clarified the instructions, questions, and responses for some of the variables to reduce confusion for respondents and improve the quality of the data.

The ART will be collected annually under this data collection, for a total of five times, from Year 1 (2022) through Year 5 (2026).

Grantee- (Attachment 2) and Community-Level PDMP Outcomes Modules (Attachment 3):

The Grantee- and Community-Level PDMP Outcomes Modules are survey instruments collected through PEPC’s CS-DCP. Grantees use the PDMP Outcomes Modules instruments to provide outcome data for opioid prescribing practices and prescribers’ use of PDMPs (from PDMP data) at both the grantee and subrecipient community levels.


The PDMP Outcomes Modules included in this data collection are the same as those used by the prior SPF Rx evaluation aside from a few minor updates.


The Grantee- and Community-Level PDMP Outcomes Modules will be collected annually under this data collection, for a total of five times, from Year 1 (2022) through Year 5 (2026).

Grantee-Level Interview (Attachment 4):

The Grantee-Level Interview is a semi-structured interview, conducted by telephone with grantee staff. This instrument is designed to collect more in-depth qualitative information on progress toward implementing the grant, organizational infrastructure, use of PDMP data, collaboration, funding use and effectiveness, subrecipient collaboration, criteria for intervention selection, adaptations to interventions, challenges, and lessons learned.

The Grantee-Level Interview protocol underwent the most extensive changes from the prior SPF Rx evaluation. In addition to reducing the total number of items, most of the existing items were dropped and new items were added to better incorporate implementation science constructs into the evaluation. Specifically, the interview reflects CFIR constructs related to grant implementation; inner (e.g., infrastructure and capacity) and outer (e.g., collaboration) setting factors; and intervention characteristics influencing implementation.

The Grantee-Level Interview will be conducted in FY 2023 to cover activities from the first year of the grant and will occur again in the third (2024) and then final years of the grant (2026).

A.3. Use of Information Technology

All efforts have been made to minimize respondent burden while obtaining the essential information needed to answer the evaluation questions. The use of web-based data submission methods decreases respondent burden compared with alternate methods, such as a paper format, by allowing direct transmission of the data. During the data collection period, respondents can enter and submit the data at a time and location that is convenient for them. In addition, the data entry and quality control mechanisms built into the web-based portal reduce errors that might otherwise require follow-up, resulting in less burden than hardcopy data collection. Whenever possible, the PEPC team uses automated data checks to improve data quality and reduce burden for respondents. Additionally, any publicly produced documents will be 508-compliant for accessibility to the public.

SPF Rx Evaluation Cross-site Data Collection Platform (CS-DCP)

SPF Rx grantee staff will submit their responses electronically by providing their ART and the Grantee- and Community-Level Outcomes data through SAMHSA’s CS-DCP. Use of a web-based system reduces both respondent burden and data entry error, thereby increasing the efficiency of data entry and improving data quality, for the reasons listed below.

  • On the ART, both grantees and subrecipients will answer questions on the prevention interventions that they have implemented; subrecipients will also answer questions on their progress through the SPF steps, prevention capacity, and related funding resources. The CS-DCP provides a different set of questions depending on whether the respondent is a grantee or a subrecipient, so each respondent sees only the questions that are required for them.

  • Each of the web-based instruments has automated data checks as well as skip procedures and prepopulated fields based on prior responses to certain questions. Only the questions that are required at that time will appear on the instrumentation.

  • The automated data checks will ensure that responses follow the expected format (e.g., numbers or dates where those are expected). For relevant items, they will also flag responses that appear to be out of the normal range of responses for that item (e.g., responses for intervention reach that exceed the target population in the community).
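
To make the validation step concrete, the following Python sketch illustrates the kinds of automated format and range checks described above. The field names (e.g., intervention_reach, target_population) and the flag wording are hypothetical illustrations only, not the actual CS-DCP schema or validation rules.

from datetime import date


def check_record(record: dict) -> list:
    """Return a list of validation flags for a single hypothetical submission."""
    flags = []

    # Format check: numeric fields must parse as non-negative numbers.
    for field in ("intervention_reach", "target_population"):
        try:
            if float(record[field]) < 0:
                flags.append(f"{field} must be non-negative")
        except (KeyError, ValueError):
            flags.append(f"{field} is missing or not a number")

    # Format check: dates must be valid calendar dates in YYYY-MM-DD form.
    try:
        date.fromisoformat(record["implementation_start"])
    except (KeyError, ValueError):
        flags.append("implementation_start is missing or not a valid date")

    # Range check: reach should not exceed the community's target population.
    try:
        if float(record["intervention_reach"]) > float(record["target_population"]):
            flags.append("intervention_reach exceeds target_population; please verify")
    except (KeyError, ValueError):
        pass  # already flagged by the format checks above

    return flags


# Example: a reach value larger than the target population is flagged for review.
print(check_record({
    "intervention_reach": "12000",
    "target_population": "8500",
    "implementation_start": "2022-11-15",
}))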

Web-based systems also allow the cross-site evaluation team to review submissions efficiently, request revisions or clarifications as needed, and then approve grantee submissions as appropriate. This process increases accuracy of data, which ultimately makes these data easier to analyze and strengthens the analysis and results.

A.4. Effort to Identify Duplication

This evaluation collects information unique to SPF Rx programs that is otherwise not available to Project Officers or the PEPC cross-site evaluation team. With an eye toward minimizing duplication and burden, the PEPC evaluation team ensured data collected from each instrument is nonduplicative and complementary to the other evaluation components and program monitoring tools.

A.5. Involvement of Small Entities

Participation in this evaluation will not impose a significant impact on small entities. SPF Rx grantees are state agencies, tribal entities, and subrecipients. Some subrecipients may be small entities, such as local coalitions; however, the data collection instruments have been designed to include only the most pertinent information needed to understand progress and to carry out the evaluation and feasibility study effectively. The ART is the only evaluation tool that subrecipients need to complete, and skip patterns allow respondents to report only on activities they conducted during the prior year. Burden on small entities is expected to be minimal.

A.6. Consequences if Information Is Collected Less Frequently

The data collection schedule represents the minimum amount of information needed for the government to accomplish the objectives of its evaluation and to meet data reporting requirements. SAMHSA made every effort to ensure that data are collected only when necessary. Performance monitoring goals of this evaluation generally require annual data collection to allow for prompt, ongoing feedback and course corrections as needed. For example, the ART tool for SPF Rx collects grantee and subrecipient implementation data annually, allowing the PEPC team to track implementation progress among grantees and their subrecipients and for regular data feedback to grantees on subrecipient implementation. Timing information can be found in Exhibit 2 for SPF Rx.

SPF Rx Evaluation

The multiple data collection points for the SPF Rx cross-site evaluation in the CS-DCP are necessary to track and evaluate progress and change over time for grantees, tribes, and communities. SAMHSA uses these data for performance monitoring, as well as for the cross-site evaluation for the SPF Rx programs; grantees use these data to track ongoing implementation of their efforts under this grant. Less frequent reporting could impede SAMHSA’s and the grantees’ ability to do so effectively. For example, SAMHSA is federally required to report on Government Performance and Results Act (GPRA) measures annually. GPRA measures are included in the SPF Rx cross-site evaluation and therefore must be collected each year from grantees.

ART and Grantee- and Community-Level PDMP Outcomes Module

The evaluation team collects ART and Grantee- and Community-Level Outcomes data annually for performance monitoring and to track trends across time. Annual updates to the ART allow SAMHSA and grantees to monitor how long it takes grantees and subrecipients to begin implementing interventions as well as provide trend data on the type of interventions implemented and the reach of the SPF Rx program interventions. The evaluation team uses the outcomes data from the Grantee- and Community-Level PDMP Outcomes Modules related to PDMP use and prescription drug misuse as the dependent variables to measure the impact of the program over time. Collecting data less frequently than annually would inhibit our ability to conduct reliable trend analyses on those outcomes.

Grantee-Level Interview

The Grantee-Level Interview provides for data collection at three time points: baseline, Year 3, and Year 5. These three time points allow for a baseline assessment of the grantees’ initiation of their SPF Rx activities; an assessment of how those activities progressed halfway through their grants (which also allows for course corrections as needed); and a final assessment of how the grantees progressed through SPF Rx.

A.7. Consistency with the Guidelines in 5 CFR 1320.5(d)(2)

This information collection fully complies with the guidelines in 5 CFR 1320.5(d)(2).

A.8. Consultation outside the Agency

The notice required by 5 CFR 1320.8(d) was published in the Federal Register on August 17, 2022 (87 FR 50633). No comments were received.

SPF Rx Evaluation

The SPF Rx evaluation tools were developed by SAMHSA and the PEPC evaluation team. These tools originated with earlier evaluations of SAMHSA CSAP programs and were refined for the evaluation of the FY 2016 SPF Rx cohort evaluation using grantee and subrecipient feedback, as well as an assessment of data accuracy and completeness. For example, items related to the demographics of implemented environmental strategy interventions were removed, as many subrecipients were not able to provide this information. Additional feedback and a review of responses from the FY 2016 SPF Rx cohort grantees and subrecipients resulted in further minor revisions to items and response options.

A.9. Payment to Respondents

No cash incentives or gifts will be given to respondents.

A.10. Assurance of Confidentiality

All members of the PEPC team will receive general awareness training and role-based training commensurate with the responsibilities required to perform the tasks of the project. Prior to performing any project work or accessing any system, and annually thereafter throughout the life of the study, each team member will complete the SAMHSA Security Awareness Training required by the agency, as well as Records Management and Human Subjects Research Training. The project will maintain a list of all individuals who have completed these trainings and will submit this list to the Project Officer upon request.

The SPF Rx data collection instruments do not request personally identifiable information (PII). They collect programmatic data at the grantee and community levels, along with aggregated, nonidentifying outcomes data. Identifiers such as name, email address, phone number, and organizational role will be collected separately to facilitate survey administration (i.e., to provide respondents access to the CS-DCP) and to notify respondents of the Grantee-Level Interviews. Sensitive respondent information, such as birthdates and Social Security numbers, will not be collected. Individuals and organizations providing information to the SPF Rx evaluation will be told the purposes for which the information is collected and that any identifiable information about them will not be used or disclosed for any other purpose. Once data collection is complete, any personal identifiers will be removed from the data and destroyed.


The study teams will safeguard the names of respondents, all information or opinions collected during interviews, and any information about respondents learned incidentally during the project. The PEPC team is trained on privacy and the proper handling of sensitive data. Hard copies of evaluation data and notes containing personal identifiers will be kept in locked containers or a locked room when not being used. Every effort will be made to limit access to data to only those persons who are working on the project and who have been instructed in appropriate Human Subjects requirements for the project.


Electronic files and audio files will be accessible only to project staff who have received permission from the PEPC Project Director to access them. Files containing data are stored on a platform requiring password protection and additional authentication prior to accessing. Access to data in the system will be handled by a hierarchy of user roles, with each role conferring only the minimum level of access to system data needed to perform their specific functions. Access to network-based data files will be controlled using Access Control Lists or directory- and file-access rights based on user account ID and the associated user group designation. Staff will be instructed on the proper storage, transfer, and use of sensitive information and available tools (e.g., encryption). The PEPC team takes responsibility for ensuring that the web and data systems are properly maintained, monitored, and secured. Server staff will follow standard procedures for applying security patches and conducting routine maintenance for system updates.


All data, notes, recordings, etc., will be provided to SAMHSA at least 30 days prior to the contract end date. SAMHSA will ensure documentation of data destruction is completed by the contractor once all information and data are provided to SAMHSA.


The SPF Rx cross-site evaluation study was presented to the PEPC contractor’s Institutional Review Board (IRB) and, as a program evaluation, was determined not to constitute research with human subjects. This designation applies only to the protocols submitted as attachments to this data collection; if any of the protocols are changed in the future, the study will be resubmitted to determine whether further IRB review is required.

A.11. Questions of a Sensitive Nature

The information reported by respondents for the SPF Rx does not ask for sensitive personal information or include questions of a sensitive nature. The focus of the SPF Rx data collection is on the programmatic characteristics of the SPF Rx grantees and subrecipients. Grantee staff provide information about their organizations and SPF Rx activities, rather than information about themselves personally.

A.12. Estimates of Annualized Hour Burden

This section provides annualized and total burden estimates for each SPF Rx instrument included in this OMB package. The total burden for this entire OMB package is 3,559.5 hours and $123,447.09, shown in Exhibit 3. The average annualized burden for this OMB package is 713 hours and $24,689.41, shown in Exhibit 10.


Exhibit 3. Total Data Collection Burden for SPF Rx FY 2021 Cohort Evaluation


Study | Total Responses | Total Burden Hours a | Total Wage Cost b
SPF Rx FY 2021 Cohort Total | 1,373 | 3,559.5 | $123,447.09

a Total burden hours include hours for both the Grantee PD or Evaluator and Subrecipient Staff respondent types. The Grantee PD or Evaluator hourly wage is based on the mean hourly wage for state government managers, as reported in the 2020 Occupational Employment (OES) by the Bureau of Labor Statistics (BLS), found at https://www.bls.gov/oes/current/naics4_999200.htm#11-00000. The Subrecipient Staff hourly wage is based on the mean hourly wage for local government counselors, social workers, and other community and social service specialists, as reported in the 2020 OES by the BLS, found at https://www.bls.gov/oes/current/naics4_999300.htm.

b Total wage cost is calculated as total burden hours × average hourly wage by staff type.

SPF Rx Evaluation

For the 5 years of the SPF Rx FY 2021 Cohort evaluation, we assume that the number of annual data collection responses will be consistent for subrecipients but vary for grantees by year based on the timing of the Grantee-Level Interviews. As such, the burden and respondent cost may vary by year. Exhibit 4 provides an overview of the total estimated number of responses per year for the evaluation.

Exhibit 4. SPF Rx FY 2021 Cohort Evaluation Burden Totals by Year

Year | Total Responses | Total Burden Hours a | Total Wage Cost b
Year 1 | 283 | 724.5 | $25,277.71
Year 2 | 262 | 693 | $23,806.98
Year 3 | 283 | 724.5 | $25,277.71
Year 4 | 262 | 693 | $23,806.98
Year 5 | 283 | 724.5 | $25,277.71
Total | 1,373 | 3,559.5 | $123,447.09

a Total burden hours include hours for both the Grantee PD or Evaluator and Subrecipient Staff respondent types. The Grantee PD or Evaluator hourly wage is based on the mean hourly wage for state government managers, as reported in the 2020 Occupational Employment (OES) by the Bureau of Labor Statistics (BLS), found at https://www.bls.gov/oes/current/naics4_999200.htm#11-00000. The Subrecipient Staff hourly wage is based on the mean hourly wage for local government counselors, social workers, and other community and social service specialists, as reported in the 2020 OES by the BLS, found at https://www.bls.gov/oes/current/naics4_999300.htm.

b Total wage cost is calculated as total burden hours × average hourly wage by staff type.


ART

The ART is required annually of all grantees and subrecipients that have been funded during that year. Key assumptions related to the burden include:

  • In each of the 5 years, we expect that 21 SPF Rx grantees and 110 SPF Rx subrecipients will complete the ART one time each year.

  • For subrecipients, the ART is estimated to take 4 hours to complete each response period; this includes 2.5 hours to research and compile information and 1.5 hours to complete the web instrument in the CS-DCP.

  • For grantees, who primarily only complete items related to the implementation of their interventions, the ART is estimated to take 3 hours to complete each response period; this includes 2 hours to research and compile information, and 1 hour to complete the web instrument.

  • There are no direct costs to respondents other than their time to complete the instrument.

Exhibits 5–9 provide detail of the annual burden for the ART for Years 1–5 of the FY 2021 SPF Rx cohort evaluation. The estimated burden for the ART is 4 hours per subrecipient respondent, including time to gather relevant information and enter it into the CS-DCP. The current burden of 4 hours per respondent reflects the average of the actual burden estimates provided by six SPF Rx subrecipients from the FY 2016 SPF Rx cohort evaluation. Because grantees complete only a portion of the ART, we estimate their burden at 3 hours per grantee respondent, including time to gather relevant information and enter it into the CS-DCP. The burden for the ART is reported by respondent type to reflect a more accurate hourly wage and cost for completion.

Exhibit 10 presents average annualized estimates of the ART burden (503 hours) and the total respondent cost ($14,935.87 = total burden hours × the estimated hourly wage for respondents).
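
As a worked check, the annualized ART figures in Exhibit 10 follow directly from the assumptions above (110 subrecipients at 4 hours and $27.26 per hour; 21 grantees at 3 hours and $46.69 per hour). The short Python sketch below simply reproduces that arithmetic:

SUBRECIPIENTS, SUBRECIPIENT_HOURS, SUBRECIPIENT_WAGE = 110, 4, 27.26
GRANTEES, GRANTEE_HOURS, GRANTEE_WAGE = 21, 3, 46.69

subrecipient_hours = SUBRECIPIENTS * SUBRECIPIENT_HOURS    # 440 burden hours
grantee_hours = GRANTEES * GRANTEE_HOURS                   # 63 burden hours
total_hours = subrecipient_hours + grantee_hours           # 503 burden hours

# Respondent cost = burden hours x average hourly wage, by staff type.
total_cost = (subrecipient_hours * SUBRECIPIENT_WAGE
              + grantee_hours * GRANTEE_WAGE)

print(total_hours, round(total_cost, 2))                   # 503 14935.87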

Grantee-Level PDMP Outcomes Module

The Grantee-Level PDMP Outcomes Module is required annually of all grantees. Key assumptions related to the burden include:

  • In each of the 5 years, we expect that 21 SPF Rx grantees will complete the Grantee-Level PDMP Outcomes module one time each year.

  • The Grantee-Level PDMP Outcomes Module is estimated to take 2.5 hours to complete per response; this includes 1.5 hours to research and compile information, and 1 hour to complete the web instrument.

  • There are no direct costs to respondents other than their time to complete the instrument.

Exhibits 5–9 provide details on the annual burden for the Grantee-Level PDMP Outcomes Module for Years 1–5 of the FY 2021 SPF Rx cohort evaluation. The estimated burden of 2.5 hours reflects the average of actual burden estimates provided by six FY 2016 SPF Rx grantees (6.2 hours) and the removal of approximately two-thirds of the instrument used for that cohort.

Exhibit 10 presents estimates of the average annualized Grantee-Level PDMP Outcomes Module burden (52.5 hours) and the total respondent cost ($2,451.23 = total burden hours × the estimated hourly wage for respondents).

Community-Level PDMP Outcomes Module

The Community-Level PDMP Outcomes Module is required to be reported annually by grantees for all subrecipients. Key assumptions related to the burden include:

  • In each of the 5 years, we expect the 21 grantees to report Community-Level PDMP Outcomes for 110 subrecipients.

  • The Community-Level PDMP Outcomes Module is estimated to take 1.25 hours to complete per response; this includes 0.75 hours to research and compile information (including working with subrecipients to identify correct data) and 0.5 hours to complete the web instrument.

  • There are no direct costs to respondents other than their time to complete the instrument.

Exhibits 5–9 provide detail of the annual burden for the Community-Level PDMP Outcomes Module for Years 1–5 of the FY 2021 SPF Rx cohort evaluation. The estimated burden of 1.25 hours for the Community-Level PDMP Outcomes Module reflects the average of actual burden estimates provided by six FY 2016 SPF Rx grantees (3.5 hours) and the removal of approximately two-thirds of the instrument. Note that the Community-Level PDMP Outcomes Module is less burdensome than the Grantee-Level PDMP Outcomes Module because the compiled information comes from data already retrieved by grantees for the Grantee Module.

Exhibit 10 presents average annualized estimates of the Community-Level PDMP Outcomes Module burden (137.5 hours) and respondent cost ($6,419.88 = total burden hours × the estimated hourly wage for respondents).

Grantee-Level Interviews

The Grantee-Level Interviews are required in the baseline, third, and final years of the grant funding. Key assumptions related to this burden include:

  • In Years 1, 3, and 5, we expect that 21 SPF Rx grantees will complete the Grantee-Level Interview.

  • The Grantee-Level Interviews are estimated to take 1.5 hours to complete per response. This includes the time it takes for the grantee to complete the phone interview.

  • There are no direct costs to respondents other than their time to complete the instrument.

Exhibits 5–9 provide detail of the annual burden for the Grantee-Level Interview for Years 1–5 of the FY 2021 SPF Rx cohort evaluation. The estimated burden of 1.5 hours was confirmed through interviews conducted with a similar-length instrument for the FY 2016 SPF Rx cohort evaluation.

Exhibit 10 presents average annualized estimates for the Grantee-Level Interview burden (18.9 hours) and respondent cost ($882.44 = total burden hours × the estimated hourly wage for respondents).

Exhibit 5. Estimates of Annual Burden for Year 1 for the SPF Rx FY 2021 Cohort Data Collection (N = 21 Grantees; N = 110 Subrecipients)

Instrument | Number of Respondents | Respondent Type | Responses per Respondent | Total Number of Responses | Hours per Response | Total Burden Hours | Average Hourly Wage b, c | Total Cost d
ART a | 110 | Subrecipient Staff | 1 | 110 | 4 | 440 | $27.26 | $11,994.40
ART a | 21 | Grantee PD or Evaluator | 1 | 21 | 3 | 63 | $46.69 | $2,941.47
Grantee-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 1 | 21 | 2.5 | 52.5 | $46.69 | $2,451.23
Community-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 5.2 | 110 | 1.25 | 137.5 | $46.69 | $6,419.88
Grantee-Level Interview | 21 | Grantee PD or Evaluator | 1 | 21 | 1.5 | 31.5 | $46.69 | $1,470.74
Year 1 Total | 131 | | | 283 | | 724.5 | | $25,277.72

PD = Project Director

a The ART is used for both subrecipient and grantee-level reporting. Grantees report ART data at the state or tribal grantee level, and subrecipients report the ART at the community level. Grantees complete all other instruments.

b Grantee PD or Evaluator hourly wage is based on the mean hourly wage for state government managers, as reported in the 2020 Occupational Employment (OES) by the Bureau of Labor Statistics (BLS) found at https://www.bls.gov/oes/current/naics4_999200.htm#11-0000.

c Subrecipient Staff hourly wage is based on the mean hourly wage for local government counselors, social workers, and other community and social service specialists, as reported in the 2020 OES by the BLS found at https://www.bls.gov/oes/current/naics4_999300.htm.

d Total respondent cost is calculated as total burden hours × average hourly wage.


Exhibit 6. Estimates of Annual Burden for Year 2 for the SPF Rx FY 2021 Cohort Data Collection (N = 21 Grantees; N = 110 Subrecipients)

Instrument | Number of Respondents | Respondent Type | Responses per Respondent | Total Number of Responses | Hours per Response | Total Burden Hours | Average Hourly Wage b, c | Total Cost d
ART a | 110 | Subrecipient Staff | 1 | 110 | 4 | 440 | $27.26 | $11,994.40
ART a | 21 | Grantee PD or Evaluator | 1 | 21 | 3 | 63 | $46.69 | $2,941.47
Grantee-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 1 | 21 | 2.5 | 52.5 | $46.69 | $2,451.23
Community-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 5.2 | 110 | 1.25 | 137.5 | $46.69 | $6,419.88
Grantee-Level Interview | 0 | Grantee PD or Evaluator | N/A | N/A | 1.5 | N/A | $46.69 | N/A
Year 2 Total | 131 | | | 262 | | 693 | | $23,806.98

PD = Project Director

a The ART is used for both subrecipient and grantee-level reporting. Grantees report ART data at the state or tribal grantee level, and subrecipients report the ART at the community level. Grantees complete all other instruments.

b Grantee PD or Evaluator hourly wage is based on the mean hourly wage for state government managers, as reported in the 2020 Occupational Employment (OES) by the Bureau of Labor Statistics (BLS) found at https://www.bls.gov/oes/current/naics4_999200.htm#11-0000.

c Subrecipient Staff hourly wage is based on the mean hourly wage for local government counselors, social workers, and other community and social service specialists, as reported in the 2020 OES by the BLS found at https://www.bls.gov/oes/current/naics4_999300.htm.

d Total respondent cost is calculated as total burden hours × average hourly wage.


Exhibit 7. Estimates of Annual Burden for Year 3 for the SPF Rx FY 2021 Cohort Data Collection (N = 21 Grantees; N = 110 Subrecipients)

Instrument | Number of Respondents | Respondent Type | Responses per Respondent | Total Number of Responses | Hours per Response | Total Burden Hours | Average Hourly Wage b, c | Total Cost d
ART a | 110 | Subrecipient Staff | 1 | 110 | 4 | 440 | $27.26 | $11,994.40
ART a | 21 | Grantee PD or Evaluator | 1 | 21 | 3 | 63 | $46.69 | $2,941.47
Grantee-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 1 | 21 | 2.5 | 52.5 | $46.69 | $2,451.23
Community-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 5.2 | 110 | 1.25 | 137.5 | $46.69 | $6,419.88
Grantee-Level Interview | 21 | Grantee PD or Evaluator | 1 | 21 | 1.5 | 31.5 | $46.69 | $1,470.74
Year 3 Total | 131 | | | 283 | | 724.5 | | $25,277.72

PD = Project Director

a The ART is used for both subrecipient and grantee-level reporting. Grantees report ART data at the state or tribal grantee level, and subrecipients report the ART at the community level. Grantees complete all other instruments.

b Grantee PD or Evaluator hourly wage is based on the mean hourly wage for state government managers, as reported in the 2020 Occupational Employment (OES) by the Bureau of Labor Statistics (BLS) found at https://www.bls.gov/oes/current/naics4_999200.htm#11-0000.

c Subrecipient Staff hourly wage is based on the mean hourly wage for local government counselors, social workers, and other community and social service specialists, as reported in the 2020 OES by the BLS found at https://www.bls.gov/oes/current/naics4_999300.htm.

d Total respondent cost is calculated as total burden hours × average hourly wage.



Exhibit 8. Estimates of Annual Burden for Year 4 for the SPF Rx FY 2021 Cohort Data Collection (N = 21 Grantees; N = 110 Subrecipients)

Instrument | Number of Respondents | Respondent Type | Responses per Respondent | Total Number of Responses | Hours per Response | Total Burden Hours | Average Hourly Wage b, c | Total Cost d
ART a | 110 | Subrecipient Staff | 1 | 110 | 4 | 440 | $27.26 | $11,994.40
ART a | 21 | Grantee PD or Evaluator | 1 | 21 | 3 | 63 | $46.69 | $2,941.47
Grantee-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 1 | 21 | 2.5 | 52.5 | $46.69 | $2,451.23
Community-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 5.2 | 110 | 1.25 | 137.5 | $46.69 | $6,419.88
Grantee-Level Interview | 0 | Grantee PD or Evaluator | N/A | N/A | 1.5 | N/A | $46.69 | N/A
Year 4 Total | 131 | | | 262 | | 693 | | $23,806.98

PD = Project Director

a The ART is used for both subrecipient and grantee-level reporting. Grantees report ART data at the state or tribal grantee level, and subrecipients report the ART at the community level. Grantees complete all other instruments.

b Grantee PD or Evaluator hourly wage is based on the mean hourly wage for state government managers, as reported in the 2020 Occupational Employment (OES) by the Bureau of Labor Statistics (BLS) found at https://www.bls.gov/oes/current/naics4_999200.htm#11-0000.

c Subrecipient Staff hourly wage is based on the mean hourly wage for local government counselors, social workers, and other community and social service specialists, as reported in the 2020 OES by the BLS found at https://www.bls.gov/oes/current/naics4_999300.htm.

d Total respondent cost is calculated as total burden hours × average hourly wage.


Exhibit 9. Estimates of Annual Burden for Year 5 for the SPF Rx FY 2021 Cohort Data Collection (N = 21 Grantees; N = 110 Subrecipients)

Instrument | Number of Respondents | Respondent Type | Responses per Respondent | Total Number of Responses | Hours per Response | Total Burden Hours | Average Hourly Wage b, c | Total Cost d
ART a | 110 | Subrecipient Staff | 1 | 110 | 4 | 440 | $27.26 | $11,994.40
ART a | 21 | Grantee PD or Evaluator | 1 | 21 | 3 | 63 | $46.69 | $2,941.47
Grantee-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 1 | 21 | 2.5 | 52.5 | $46.69 | $2,451.23
Community-Level PDMP Outcomes Module | 21 | Grantee PD or Evaluator | 5.2 | 110 | 1.25 | 137.5 | $46.69 | $6,419.88
Grantee-Level Interview | 21 | Grantee PD or Evaluator | 1 | 21 | 1.5 | 31.5 | $46.69 | $1,470.74
Year 5 Total | 131 | | | 283 | | 724.5 | | $25,277.72

PD = Project Director

a The ART is used for both subrecipient and grantee-level reporting. Grantees report ART data at the state or tribal grantee level, and subrecipients report the ART at the community level. Grantees complete all other instruments.

b Grantee PD or Evaluator hourly wage is based on the mean hourly wage for state government managers, as reported in the 2020 Occupational Employment (OES) by the Bureau of Labor Statistics (BLS) found at https://www.bls.gov/oes/current/naics4_999200.htm#11-0000.

c Subrecipient Staff hourly wage is based on the mean hourly wage for local government counselors, social workers, and other community and social service specialists, as reported in the 2020 OES by the BLS found at https://www.bls.gov/oes/current/naics4_999300.htm.

d Total respondent cost is calculated as total burden hours × average hourly wage.



Exhibit 10. Average Annualized Burden for SPF Rx FY 2021 Cohort Data Collection a

Instrument | Average Number of Respondents | Respondent Type | Average Number of Responses per Respondent | Average Number of Responses | Hours per Response | Average Burden Hours | Hourly Wage c, d | Annualized Data Collection Burden
ART b | 110 | Subrecipient Staff | 1 | 110 | 4 | 440 | $27.26 | $11,994.40
ART b | 21 | Grantee PDs or Evaluators | 1 | 21 | 3 | 63 | $46.69 | $2,941.47
Total Burden for ART | 131 | | | 131 | | 503 | | $14,935.87
Grantee-Level PDMP Outcomes Module | 21 | Grantee PDs or Evaluators | 1 | 21 | 2.5 | 52.5 | $46.69 | $2,451.23
Community-Level PDMP Outcomes Module | 21 | Grantee PDs or Evaluators | 5.2 | 110 | 1.25 | 137.5 | $46.69 | $6,419.88
Grantee-Level Interview | 13 | Grantee PDs or Evaluators | 1 | 13 | 1.5 | 19 | $46.69 | $882.44
Total Annualized Burden | 131 | | | 275 | | 713 | | $24,689.41

PD = Project Director

a Annualized Data Collection Burden captures the average number of respondents and responses, burden hours, and respondent cost over the 5 years of the cohort (FY 2022–FY 2026).

b The ART is used for both subrecipient and grantee-level reporting. Grantees report ART data at the state or tribal grantee-level, and subrecipients report the ART at the community-level.

c Grantee PD or Evaluator hourly wage is based on the mean hourly wage for state government managers, as reported in the 2020 Occupational Employment (OES) by the Bureau of Labor Statistics (BLS) found at https://www.bls.gov/oes/current/naics4_999200.htm#11-00000.

d Subrecipient Staff hourly wage is based on the mean hourly wage for local government counselors, social workers, and other community and social service specialists, as reported in the 2020 OES by the BLS found at https://www.bls.gov/oes/current/naics4_999300.htm.


A.13. Estimates of Annualized Cost Burden to Respondents

There are no respondent costs for capital, startup, operation, or maintenance.

A.14. Estimates of Annualized Cost to the Government

SAMHSA plans to allocate resources for the management, processing, and use of the collected information in a manner that will enhance its utility to agencies. The contract award to cover this evaluation is $3,011,109 over a 3-year period. Thus, the annualized contract cost is $1,003,703. It is estimated that two SAMHSA employees will each be involved for 15% of their time, at an estimated annualized cost of $49,231 to the government. The total estimated average cost to the government per year is $1,052,934.
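
The annualized figures above follow directly from the stated inputs; the minimal Python sketch below reproduces the arithmetic using only the values given in this paragraph:

contract_award = 3_011_109                 # total contract cost over the 3-year period
annualized_contract = contract_award / 3   # $1,003,703 per year
samhsa_staff_cost = 49_231                 # two SAMHSA employees at 15% time each, per year
total_annual_cost = annualized_contract + samhsa_staff_cost

print(round(annualized_contract), round(total_annual_cost))  # 1003703 1052934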

A.15. Changes in Burden

There is a decrease in burden due to programmatic changes that shift a portion of the data collection from grantees to secondary data sources. To reduce burden, the SPF Rx program will use data from other sources, including SPF Rx grantee proposals, SAMHSA's Performance Accountability and Reporting System (SPARS), and secondary outcomes data sources.

A.16. Time Schedule, Publications, and Analysis Plan

Time Schedule

Exhibit 11 presents the timeline for data collection and reporting benchmarks for the SPF Rx evaluation.


Exhibit 11. Time Schedule for SPF Rx Evaluation Data Collection


Activity | Time Schedule
Disseminate Findings: Baseline Annual Report | July 2022
Year 1 ART, Grantee- and Community-Level PDMP Outcomes Data Collection, and Grantee-Level Interview | November 2022–January 2023
Disseminate Findings: Year 1 Annual Report | July 2023
Year 2 ART, Grantee- and Community-Level PDMP Outcomes Data Collection | November 2023–January 2024
Disseminate Findings: Year 2 Annual Report | July 2024
Year 3 ART, Grantee- and Community-Level PDMP Outcomes Data Collection, and Grantee-Level Interview | November 2024–January 2025
Disseminate Findings: Year 3 Annual Report | July 2025
Year 4 ART, Grantee- and Community-Level PDMP Outcomes Data Collection | November 2025–January 2026
Disseminate Findings: Year 4 Annual Report | July 2026
Year 5 ART, Grantee- and Community-Level PDMP Outcomes Data Collection, and Grantee-Level Interview | July 2026–September 2026
Disseminate Findings: Year 5 Annual Report | July 2027

Publications

The PEPC evaluation team will use the data collected through the SPF Rx evaluation to help SAMHSA reach its diverse stakeholders. The objective for all reports and dissemination products is to provide user-friendly documents and presentations that help SAMHSA successfully disseminate and explain the findings. The dissemination plan includes products in a variety of formats for a variety of target audiences, such as:

  • Annual reports that summarize findings: The SPF Rx reports will include brief profiles on each grantee, with helpful and easy-to-read graphics on performance data, rather than lengthy text.

  • Briefings for SAMHSA and other federal stakeholders: Audiences for briefings may include SAMHSA staff, grantees, and other stakeholders.

  • Aggregate information: This may also be used in journal articles, scholarly presentations, budget justifications, and other testimony related to the outcomes of the SPF Rx program.

Analysis

SPF Rx Evaluation

The PEPC SPF Rx evaluation uses a series of interdependent analyses to answer the key evaluation questions developed to assess the impact of the SPF Rx program on prescription drug misuse, opioid overdoses, and related outcomes. The evaluation will fully incorporate all data from the cross-site evaluation instruments, data that the grantees submit to the SPARS system, and secondary administrative and surveillance data to enhance the evaluation’s ability to address evaluation questions. SPARS will provide information on performance measures and grantee progression through steps outlined in the SPF. Secondary outcomes data sources for the SPF Rx evaluation will include the National Vital Statistics System (for opioid mortality), the National Poison Data System (for opioid overdose calls), and IQVIA (for prescribing outcomes).

The analysis plan includes a range of approaches, from basic descriptive analyses of GPRA measures, grantee performance measures, and NOMs (e.g., means, frequencies, percentages, trend analysis) to sophisticated qualitative analysis, multiple quantitative analytic frameworks, and models that reflect the anticipated complexities of data collected by the PEPC team.

Matched Comparison Groups:

The SPF Rx evaluation will use matched comparison groups when relevant and feasible. The PEPC team plans to obtain key county-level characteristics from baseline census, archival, and survey data sources and use that information to select comparison counties (or communities). For all grantees, the required estimates will be available through standard public reporting. Under no circumstances will new data collection be required for the matching process. Follow-up outcomes data for the matched comparison groups will come from the same data sources used for the matching process.
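
As an illustration of this step, the following Python sketch shows one simple way comparison counties could be selected: nearest-neighbor matching on standardized baseline county-level characteristics. The covariate names and values are placeholder examples only; the evaluation's actual matching variables and method may differ.

import numpy as np

# Baseline county-level characteristics (rows = counties), e.g.,
# log population, poverty rate, and baseline overdose rate per 100,000.
intervention = np.array([[11.2, 0.18, 21.4],
                         [10.1, 0.22, 27.9]])
candidates = np.array([[11.0, 0.17, 20.8],
                       [ 9.8, 0.25, 30.1],
                       [10.3, 0.21, 26.5],
                       [12.0, 0.10, 12.3]])

# Standardize against the candidate pool so no covariate dominates the distance.
mu, sigma = candidates.mean(axis=0), candidates.std(axis=0)
z_int = (intervention - mu) / sigma
z_cand = (candidates - mu) / sigma

# For each intervention county, choose the closest not-yet-used candidate county.
available = list(range(len(candidates)))
matches = []
for row in z_int:
    dists = np.linalg.norm(z_cand[available] - row, axis=1)
    matches.append(available.pop(int(np.argmin(dists))))

print(matches)  # indices of the selected comparison counties, e.g., [0, 2]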


Quantitative Analyses:

Several features of the evaluation design and evaluation questions guided the selection of the analysis frameworks for the SPF Rx evaluation:

  • repeated outcomes;

  • data from state and tribal grantees;

  • data from communities nested within grantees;

  • nonrandomized comparison of communities within grantee states; and

  • nonrandom selection of intervention types that often occur in combination.

Outcome Evaluation Models:

To address the above features, analyses will use the following models:


  • Longitudinal meta-regression: Meta-regression will be used to model summary data (i.e., percentage estimates and standard errors). We will model changes over time within and across grantees (and communities), relying on already aggregated data for grantee-reported proximal and distal outcomes, such as morbidity (hospital emergency department data) and PDMP outcomes. Covariates and their interaction terms will be added to the models to estimate their impact on intervention effects.

  • Multilevel latent growth models (MLLGM): The MLLGM framework will be used to examine SPF Rx impact, particularly changes in NPDS and NVSS outcomes and IQVIA prescribing measures. These methods examine differences in change in outcomes over time (“difference-in-differences” models), either with respect to a comparison group or in relation to repeated pre-intervention values within cases (interrupted time series). Moreover, these models allow for the disaggregation of grantee-level intervention effects from community-level intervention effects, such that, for example, states with a higher proportion of communities participating in SPF-Rx may see greater benefits at the grantee level, separate from the benefits to individual communities.

  • Interrupted time series analysis: For analyses where comparison communities are not available, we will conduct MLLGM equivalents of interrupted time series models. Through this approach, we can use an extended pre-implementation period as the within-community comparison period. Significant SPF Rx effects would be indicated by post-intervention deflection of the outcome trajectory, relative to the normative or secular trend in the baseline period. The proposed design would include a sufficient baseline time series to assess trends during the pre-intervention period. We would propose to use the interrupted time series models to estimate the changes in levels and trends of key outcomes after the implementation of SPF Rx overall or after the implementation of specific strategies in SPF Rx.

  • Idiographic clinical trials (ICT): ICT analyses will be used for localized SPF Rx evaluations of both consequences of opioid misuse (NPDS data) and proximal prescribing outcomes (IQVIA prescription data). ICT designs, which may be used for within-subject clinical trials involving small samples, early phase treatment evaluations, and pragmatic trials, are composed of three fundamental components: (1) data collected from each “subject” (or community) during multiple study phases (e.g., baseline and intervention phases); (2) time series data analyzed from each study phase; and (3) hierarchical regression modeling adjusted to provide conservative estimates for small sample analysis. The key benefit to the SPF Rx evaluation is the use of ICTs to investigate within-subject heterogeneity of intervention implementation outcomes within grantees.

  • Qualitative comparative analysis (QCA): To analyze the relationship between the process measures and outcomes, we will employ QCA. QCA will assess how different features of SPF Rx implementation—individually or in combination—are necessary or sufficient for changes in outcomes (must be present for an outcome to occur or produce the outcome, respectively). QCA, which integrates qualitative and quantitative data to maximize the utility of process evaluation data for SAMHSA, offers an approach for supporting systematic cross-case comparison and can accommodate small and intermediate-size samples. It uses formal logic and set theory, a branch of mathematics, to identify noncorrelational (set-theoretic) relationships using qualitative data, quantitative data, or both, derived from the cases included in the analysis. This approach differs from traditional variable-oriented statistical techniques with large sample size requirements, which are often not well suited to explaining complex social phenomena. Grantees and subrecipients will serve as the cases for the analysis.
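
To make the set-theoretic logic concrete, the following Python sketch computes crisp-set consistency and coverage for hypothetical implementation conditions, alone and in combination, treated as sufficient conditions for an outcome. The condition names and case data are invented for illustration and are not drawn from the evaluation.

from itertools import combinations

# One row per hypothetical grantee case:
# (strong PDMP data access, high collaboration, outcome improved); 1 = yes, 0 = no.
cases = [
    (1, 1, 1),
    (1, 0, 1),
    (0, 1, 0),
    (1, 1, 1),
    (0, 0, 0),
    (1, 0, 0),
]
conditions = {"pdmp_access": 0, "collaboration": 1}
OUTCOME = 2


def consistency_and_coverage(selected):
    """Treat the conjunction of the selected conditions as sufficient for the outcome."""
    idx = [conditions[name] for name in selected]
    in_set = [c for c in cases if all(c[i] == 1 for i in idx)]
    with_outcome = [c for c in in_set if c[OUTCOME] == 1]
    outcome_total = sum(c[OUTCOME] for c in cases)
    consistency = len(with_outcome) / len(in_set) if in_set else float("nan")
    coverage = len(with_outcome) / outcome_total if outcome_total else float("nan")
    return consistency, coverage


for r in (1, 2):
    for combo in combinations(conditions, r):
        cons, cov = consistency_and_coverage(combo)
        print(" & ".join(combo), f"consistency={cons:.2f}", f"coverage={cov:.2f}")

In QCA terms, a condition or combination with consistency near 1.0 is a candidate sufficient condition, and coverage indicates how much of the outcome it accounts for.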

A.17. Display of Expiration Date

OMB approval expiration dates will be displayed.

A.18. Exceptions to Certification for Statement

There are no exceptions to the certification statement. The certifications are included in this submission.

