Justification Statement - DFC

1_Justification Statement_2019_Aug6.docx

Drug Free Communities Support Program National Evaluation

Justification Statement - DFC

OMB: 3201-0012

Document [docx]
Download: docx | pdf

Office of Management and Budget



Clearance Package Supporting Statement and Revised Data Collection Instruments





Drug-Free Communities Support Program National Evaluation









Supported by:


Executive Office of the President

Office of Administration

Office of National Drug Control Policy

1800 G Street NW

Washington, DC 20006




Assistant Director, Drug-Free Communities Program:

Helen Hernandez

(202) 395.6665 (phone)

(202) 395-6711 (fax)













August 2019



Table of Contents



A. JUSTIFICATION 1


A.1. Circumstances Making the Collection of Information Necessary 1


A.2. Purpose, Requested Revisions/Additions, and Use of the Information 3


A.3. Use of Information Technology to Reduce Burden 9


A.4. Efforts to Identify Duplication 10


A.5. Impact on Small Businesses or Other Small Entities 10


A.6. Consequences of Less Frequent Data Collection 10


A.7. Special Circumstances Influencing Collection 11


A.8. Federal Register Notice and Consultation Outside the Agency 12


A.9. Explanation of Payment or Gift to Respondents .12


A.10. Assurance of Confidentiality Provided to Respondents 12


A.11. Justification for Sensitive Questions 13


A.12. Estimates of Hour Burden Including Annualized Hourly Costs 13


A.13. Estimate of Other Total Annual Cost Burden to Respondents or Record

Keepers 15


A.14. Annualized Cost to the Federal Government 15


A.15. Explanation for Program Changes or Adjustments 15


A.16. Time Schedule, Analysis Plans, and Publication 16


A.17. Reason(s) Display of OMB Expiration Date is Inappropriate 19


A.18. Exceptions to Certification Submissions 19


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS 20


B.1. Respondent Universe and Sampling 20


B.2. Procedures for the Collection of Information 20


B.3. Methods to Maximize Response Rates and Deal with Non-Response 23


B.4. Test of Procedures or Methods to be Undertaken 24


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data 24


LIST OF EXHIBITS


Exhibit 1. Estimates of Hour Burden 14


Exhibit 2. Annualized Cost to Respondents 15


Exhibit 3. Evaluation Time Schedule 16


Exhibit 4. Key Evaluation Questions, Products, and Analytic Methods 18




LIST OF ATTACHMENTS:


Attachment 1. Drug Free Communities Authorizing and Re-Authorizing Legislation


Attachment 2. Drug Free Communities Support Program National Logic Model and Evaluation Plan


Attachment 3. Sample Evaluation Report


Attachment 4. Proposed Revisions to Drug-Free Communities Progress Report and Core Measures


Attachment 5. CCT Proposed Data Collection Revisions


Attachment 6. Site Visit Protocols Final


Attachment 7. Progress Report and Core Measure Data Collection Final


Attachment 8. Sample Survey Review Guide


Attachment 9. Proposed Feedback Evaluation Items


Attachment 10. DFC Data Request Form


Attachment 11. Federal Register 60-Day and 30-Day Notices

A. Justification


A.1. Circumstances Making the Collection of Information Necessary


The current package reflects a request for revisions to the 2015 previously approved collection pertaining to the Drug Free Communities Support Program (DFC). DFC was created by the Drug Free Communities Act of 1997 (Public Law 105-20), reauthorized through the Drug Free Communities Reauthorization Act of 2001 (Public Law 107-82), and reauthorized again through the Office of National Drug Control Policy Reauthorization Act of 2006 (Public Law 109-469). The latest reauthorization extended the program for an additional five years until 2019. The DFC authorizing statute (21 USC §1521–1032) provides that community-based coalitions addressing youth substance use can receive Federal grant funds and that the amount of each DFC grant award shall not exceed $125,000 annually. (See Attachment 1 for authorizing statutes.)


ONDCP, the lead agency for setting U.S. drug control policy and strategy, provides funding through the DFC Program to build community capacity for preventing and reducing substance abuse among our nation’s youth. ONDCP directs the DFC Program. The Substance Abuse and Mental Health Services Administration’s (SAMHSA) Center for Substance Abuse Prevention (CSAP) currently provides Government Project Officers (GPO) and grants management support to the grant award recipients. DFC has two primary goals: to prevent and reduce youth substance use and to support community coalitions in building capacity to address youth substance use. These goals are addressed by establishing, strengthening, and fostering collaboration among public and private non-profit agencies, as well as Federal, State, local, and tribal governments. One important objective of the DFC Program is to empower community coalitions to become self-sufficient. There were 724 fiscal year (FY) 2018 grant recipients impacted by the revisions requested with this submission, along with grant recipients awarded in future fiscal years.


The current request pertains to revisions to the progress report and the Coalition Classification Tool (CCT), while retaining the case study protocols. These changes have been determined necessary to reduce burden on grant recipients, facilitate the monitoring and tracking of grant recipient progress, and improve the quality of the data.


ONDCP awarded a contract for a more robust DFC Management and Evaluation (DFC Me) system in January 2015 following a competitive request for proposals process. DFC Me provides: (1) a communication tool to facilitate system-wide or targeted announcements to grant recipients, distribute information related to the DFC program, and solicit responses for information; (2) a progress reporting tool for grant recipients to submit semi-annual quantitative and qualitative data pertaining to their grant progress (including Progress Reports, Core Measures, and the annual CCT); (3) a Learning Center where resources are shared and where grant recipients can share Success Stories; 4) a Portfolio Status dashboard to assist ONDCP and GPO in monitoring grant recipient compliance; and 5) a DFC coalition report card for grant recipients to track their own compliance with grant requirements. The DFC Me system is regularly updated and improved in order to facilitate usability based on guidance from ONDCP.


Under reauthorization legislation (21 § USC 1702), Congress mandated a National Evaluation be undertaken to determine the effectiveness of the DFC Program in meeting its objectives. ONDCP manages what is currently the third five-year evaluation from 2015–2020. The current evaluation builds on the prior evaluations resulting in knowledge to continue to assess how effective the DFC program has been at achieving its goals of: (1) increasing community collaboration to address youth substance use and (2) reducing substance use among youth (see Attachment 2, DFC National Evaluation Logic Model and Evaluation Plan). Additional information about how data is utilized for reporting can be found in Attachment 3, Sample Evaluation Report, which provides a sample published report and executive summary utilizing DFC data.


The 2015 DFC OMB package was also intended to cover the SAMHSA Sober Truth on Preventing Underage Drinking (STOP) Act Program, which funds current and former DFC grant recipients. The STOP Act program goals are to prevent and reduce alcohol use among youth ages 12-20 in communities throughout the United States. STOP Act grants are authorized under the Public Health Service (PHS) Act (42 U.S.C. 290bb-25b), Section 519B. The DFC core measures related to alcohol are the same as the STOP Act performance measures for Past 30-day use and parental disapproval. For perception of risk and perception of peer disapproval, DFC coalitions may choose to collect using either the DFC wording or the STOP Act wording. The proposed revisions will not involve any changes to DFC core measures. The alcohol core measures are the only shared data reporting requirements between DFC and STOP Act grant recipients and as no revisions to these measures are being proposed, the revisions requested in this document do not apply to the STOP Act program. ONDCP will share STOP Act relevant data with SAMHSA based on a data sharing agreement being in place in order to avoid duplication of effort.


In Fiscal Year (FY) 2018, the first Comprehensive Addiction and Recovery Act (CARA) Community-based Coalition Enhancement Grants to Address Local Drug Crises Grants (CARA-ALDC) were awarded. By definition these grants were awarded as an enhancement to current or formerly funded DFC grant recipients. ONDCP anticipates evaluating the CARA-ALDC grant in the future. Given this new grant, optional core measure items regarding heroin and methamphetamine use have been added to the Progress Report as well as additional items regarding addressing opioids in particular. These items will be optional for all grant recipients and are intended to improve understanding of how coalitions are working to address opioids and methamphetamines as they become issues in their communities. The core measures already include required information on youth use of prescription drugs which will also support the CARA-ALDC grants.


Grant recipients are required to participate in three aspects of data reporting related to the grant, most of which will remain unchanged from the revision requested in 2015 OMB under the revisions proposed here:


  1. Progress Report and monitoring data is updated every six months (February and August) and includes descriptive information on the community served and coalition members, coalition accomplishments and challenges, assessment activities, strategic plan information, implementation strategies, and technical assistance needs. Several aspects of the progress report support ONDCP and GPO in monitoring grant recipients (i.e., identification of active members from each of twelve sectors). In addition, a main emphasis for progress report data is reporting on the activities within seven strategy categories in which the grant recipient engaged during the prior six month period in order to achieve grant goals. This progress reporting of activities is vital because it informs all levels of grant recipient stakeholders: ONDCP and GPO are kept up-to-date on grant recipient activities, grant recipients are able to reflect on their work and identify successes/challenges twice a year, and the data provided are used to both inform Congress of substance use trends within communities served by grant recipient coalition as well as support the national evaluation in understanding these efforts. A range of small changes for clarity have been proposed in the Progress Report, including reordering some items and relabeling some scales (e.g., member involvement scales) as well as the addition of a small number of items (e.g., regarding opioids).

  2. Core measures data are required to be collected and reported every two years. The four core measures are past 30-day use, perception of risk or harm of use, perception of parent disapproval of use, and perception of peer disapproval of use. Each of the four core measures is collected with regard to four substances: alcohol, tobacco, marijuana, and prescription drugs. This data is critical in reporting on the extent to which grant recipients are achieving the goal of reducing youth substance use, in addition to increasing perception of risk and of parent and peer disapproval. When the new data is ready for submission, grant recipients submit it as part of the Progress Report. Going forward, grant recipients will have the option (but not be required) to include and report on the core measures with regard to two additional substances: heroin and methamphetamines.

  3. The Coalition Classification Tool (CCT) is a survey instrument collected once each year at the same time as the August Progress Report. The CCT is designed to capture information on coalition performance or characteristics with regard to a range of areas including building capacity, Strategic Prevention Framework Utilization, data and outcomes utilization, youth involvement, member empowerment and building sustainability. The CCT survey was introduced as part of the 2015 approved revisions. Based on analysis of the CCT since that time, many of the items had a limited range of responses and did not therefore differentiate between coalitions (one of the goals of the measure). In addition, at 300 items, many grant recipients found the CCT lengthy and cumbersome to complete. A revised and much shorter CCT has been proposed as part of this packet (from ~300 items to ~100 items).


Finally, case study protocols were approved as part of the prior package. The DFC National Evaluation includes a case study component intended to document coalition practices, successes and challenges. Approximately nine DFC grant recipients are selected each year to highlight in the case studies (see Section B. Collections of Information Employing Statistical Methods for site sample selection and data analysis). The information from the case studies will be used to illustrate not only what works, but also how and why it works. The questions are currently flexible enough that a focus can be applied to the protocols and no change to the protocols have been requested.


A.2. Purpose, Requested Revisions/Additions, and Use of the Information


Purpose of Collection


The overall goal of the DFC National Evaluation is to assess the DFC Program’s effectiveness in preventing and reducing youth substance use. Two primary objectives of the evaluation are to: 1) regularly monitor, measure, and analyze data in order to report on the progress of the DFC program and its grant recipients on program goals, and 2) provide technical assistance support to grant recipients in effectively collecting and submitting data and in understanding the role of data in driving local coalition efforts. Within these broad objectives, the evaluation addresses a series of specific questions which are presented in Section A.16, Time Schedule, Analysis Plans, and Publication, and in Attachment 2, Drug Free Communities Support Program National Evaluation Logic Model and Evaluation Plan.


With the data provided in the Progress Reports (including core measures which are attached to Progress Reports), the CCT, and from nine site visit case studies annually, the evaluation can examine both direct and indirect relationships between measures of the DFC Program’s effectiveness and changes in substance use outcomes in DFC-funded communities. First, the strategies, initiatives, and activities of DFC coalitions provided in progress reports are examined to determine trends and patterns in how DFC coalitions go about working for change. At the same time, core outcomes are assessed for change over time, including change across all coalitions (long-term) and changes within the most current group of coalitions (short-term change). Relationships between strategies and outcomes are also examined. The CCT provides both specific data regarding processes engaged in by grant recipients as well as information regarding community assets put into place as a result of the grant award. Finally, case studies provide more in-depth qualitative information regarding what works, why, and under what conditions as well as challenges faced by coalitions and potential strategies for overcoming these challenges.


Requested Revisions


As part of the current five-year evaluation, ICF International (ICF), the evaluation contractor, regularly reviews existing systems, measures, and tools available for the evaluation. ICF receives ongoing feedback from ONDCP, GPO, and grant recipients regarding the data collections. In addition, the case studies involved discussions with coalition leaders, members, and local evaluators, when appropriate, regarding feedback on data collection. Progress Report, core measures, and CCT data have been examined in a range of ways, including item-by-item review, factor analysis, content analysis of open-ended responses, review of the core measure data, and relationships between variables. In doing so, specific recommendations for revising and enhancing progress report, CCT, and case studies were identified. The proposed revisions continue to be aligned with the logic model for the DFC Program (see Attachment 2, Drug Free Communities Support Program National Evaluation Logic Model and Evaluation Plan).


The requested revisions to the Progress Report and the CCT represent system enhancements and item/content revisions and are intended to reduce grant recipient reporting burden while maintaining quality of data available for the evaluation. In addition, these requested revisions will allow for better reporting of process, output, and outcome data. The revisions are highlighted below (see Attachment 4, Proposed Revisions to Drug-Free Communities Progress Report and Core Measures and Appendix 5 Coalition Classification Tool Proposed Revision for a summary of these changes).


DFC Management and Evaluation (DFC Me)

The DFC Me system was launched in February 2016. As previously noted, DFC Me provides a range of support for the DFC program. Within this system, grant recipients report the following:

  • Progress Reports, submitted each February and August

  • Core Measures, attached to Progress Reports as available (minimum of every two years)

  • Core Measure Survey Submission (surveys are reviewed by the DFC National Evaluation team to ensure grant recipients are collecting appropriate core measures data

  • CCT submitted annually in August

  • Sector Representatives (grant recipients are required to have in place a Coalition Involvement Agreement (CIA) for each of the twelve sectors and these representatives are entered into DFC Me)

In addition, within the Learning Center, grant recipients have the option to enter information regarding local success stories they wish to share with other grant recipients. The Learning Center also provides an opportunity for ONDCP, GPO, and the DFC National Evaluation Team to provide a range of resources to grant recipients, including webinars, tip sheets, custom materials and reports.


Through the DFC Me system, ONDCP and GPO are able to communicate with grant recipients. In addition, grant recipients are able to see a history of previously sent e-blasts. This means they have access to communication even if they did not observe the communication within their local email account.


Most recently, in January 2018, the DFC Me system provided ONDCP and GPO with tools that allow them to better monitor grant recipient’s compliance with grant requirements. The system has a series of dashboards that summarizes compliance information as well as the capability to view a specific coalition’s report card with regard to compliance. Each grant recipient also has access to their own report card, which is summarized in a dashboard on their main page.


Progress Report Revisions (Semi-Annual Progress Report)


Grant recipients have generally reported few concerns with the Progress Report and this tool generally meets the needs of the national evaluation, ONDCP, and GPO. However, a small number of revisions are being requested. The specific deletions, modifications, and additions of items to the Progress Report are depicted in Attachment 4, Proposed Revisions to Drug-Free Communities Progress Report and Core Measures, including the rationale for each revision (see Attachment 7, Progress Report and Core Measures Data Collection Final for the final revised progress report based on accepting all changes). Following is a high level discussion of the changes.


  1. DFC Me pre-populates basic information about the coalition into the Progress Report so these items have been removed. The Coalition Information Section now asks grant recipients to indicate if they work with any local High Intensity Drug Trafficking Area (HIDTA) program recipient and to indicate if they have been awarded a CARA-ALDC grant.

  2. The number of fields where grant recipients can provide qualitative (open-text) responses regarding their local efforts has been increased. For example, in each Strategies section, grant recipients can now provide descriptions of their work within the strategy type. This will allow the evaluation to better describe grant recipients’ efforts beyond counting of activities.

  3. In line with CARA-ALDC, and more broadly with ONDCP interest in how grant recipients are working to address opioids, several additional items have been included in the progress report. As noted in the core measures section, this includes providing grant recipients the option to enter information regarding youth use of heroin and methamphetamines. In addition, a new section asks first if grant recipients are working on these issues. If the response is no, then no further items are asked. If the response is yes, they are asked to a) indicate substance focus; b) to indicate yes/no if they engage in a range of activities organized by strategy type; and c) to describe (open-text) specific activities that they have engaged in over the past six areas in this area. These fields are open to all grant recipients. In addition, throughout the Progress Report areas regarding prescription drugs were separated to allow grant recipients to indicate a focus on prescription opioids versus prescription non-opioids.

  4. A section to identify coalitions working to prevent or reduce youth engagement in vaping has been added to the report. Only coalitions who indicate yes they are working on vaping will be required to answer additional items regarding strategies to address.

  5. The majority of the remaining changes were in wording of the item to improve clarification or revisions to response items again for clarity. These changes typically had no overall impact on the burden to the recipient.


Core Measures: No Required Revisions


The core measures were revised as part of the 2015 package and no further revisions wording for the required core measures is requested at this time. However, the following recommendations are being made:

  1. Remove the requirement to report core measure data by gender. The current core measure reporting asks coalitions to report data on the core measure by gender, regardless of grade. Many DFC recipients have reported that to the extent that they have data by gender, it is by grade level and difficult to aggregate for reporting purposes. The DFC National Evaluation Team is in agreement that core measure data aggregated by gender regardless of grade level is of little value and does not support comparison to available national data. However, asking grant recipients to report on core measures by gender and grade level would significantly increase burden. Therefore, the recommendation is being made to no longer request this information.

  2. Addition of heroin and methamphetamines as optional core measure substances. To better understand grant recipients’ potential impact on a range of substances and aligned with CARA-ALDC, reporting for the four core measures has been expanded to include heroin and methamphetamines. Reporting for these core measures will be considered to be optional rather than required. Wording was aligned with wording used in the 2017 Youth Risk Behavior Survey1:

    1. heroin (also called smack, junk, China White)

    2. methamphetamines (also called speed, crystal, crank, ice)

Each grant recipient is permitted to use their own local survey in order to collect core measure data. To ensure that the grant recipient is collecting the core measure data using appropriately worded items, the grant recipients are required to submit a survey for review to the DFC National Evaluation team through a tool in the DFC Me system. Each survey is reviewed for all core measures and a survey review guide is returned to the grant recipient (see Attachment 8, Sample Survey Review Guide). This process will now include review for the additional optional core measure items.


CCT Revisions


The largest revisions in the current OMB package are associated with the Coalition Classification Tool (CCT). The 2015 version of the CCT was intended to assess a broad range of coalition functions and to differentiate between coalitions that might be beginners versus more mature with regard to acting as a community coalition. In total, this version included 336 items and had an estimated OMB time of completion of 3 hours. The DFC National Evaluation Team began to extensively analyze CCT data in 2017 following initial submission in August 2016 (see Attachment 5: Coalition Classification Tool Proposed Revisions). Based on this review and on feedback from DFC coalitions, it was determined that the CCT undergo a major revision. While a small number of items from the 2015 CCT have been retained, the revision was extensive enough that we have simply introduced the CCT here in the revised format. The proposed 2019 CCT will include as few as 96 items and no more than 106 items and is anticipated to take grant recipients approximately one hour complete, a significant reduction in burden. Following are some highlights of the revised CCT:

  1. The majority of the questions (65) ask the grant recipients to think over their work in the past year and to indicate to what extent their coalition engaged in or achieved a given activity. The scale for these items is To a Great Extent, To a Moderate Extent, To a Slight Extent, Not at All, or Not Applicable. Utilizing a single response format across the majority of items is intended to increase understanding of the scale and reduce time needed to respond when the scale changes across items. While the CCT included in the attachment is organized by theoretical scales, the CCT will be randomly ordered when grant recipients complete it in the DFC Me system.

  2. Coalition Structure will be assessed by nine items asking the grant recipient to indicate across a range of activities who is responsible for carrying out the activity: Primarily Staff, Staff and Coalition Members Equally or Primarily Coalition Members. These items are intended to build on our understanding of the extent to which grant recipients are building community capacity through engagement of coalition members.

  3. The CCT will continue to assess the extent to which the grant recipient has enabled the recipient to put into place Community Assets. Based on analyses of prior responses, we have reduced the number of assets identified from 45 to 22. However, the proposed revision also includes the option to add up to 10 community assets to be assessed annually. This flexibility will allow the DFC National Evaluation Team and ONDCP to identify innovative practices, through site visits and/or review of qualitative responses, that may be of interest of better understanding the extent to which a broad range of grant recipients engage in the practice.


Case Study Interviews


Since the prior package approval of case study interviews, 45 site visits have been completed. The case study interview protocols have proven flexible enough to work with understanding how a range of DFC coalitions function, including high-performing coalitions, coalitions working in border communities, coalitions working in inner-city communities, coalitions working in states with legalized marijuana, coalitions working with LGBT youth, coalitions working to address opioids, and coalitions highly engaged with law enforcement. No revisions are planned for these protocols, with the understanding that minor modifications will occur based on specific focus of a given site visit. A copy of the protocols can be found in Attachment 6 Site Visit Protocols Final.


In addition, many grant recipients participating in the site visits struggled to complete the Social Network Survey. Based on feedback provided from grant recipients participating in this survey, the survey would need to be specialized for each case in order to get informative data. At the same time, site visit interviews and focus groups have provided good information regarding the quality of collaborations across and within sectors. Given these factors and to reduce burden, the Social Network Survey will no longer be utilized as part of the case studies.


Feedback Evaluations


Grant recipients participating in webinars will continue to be asked to provide feedback regarding the webinars and there is no change to this instrument. In order to evaluate the assistance that grant recipients are being provided from the DFC National Evaluation Team, participants in Ask-an-Evaluator webinars will be asked to complete a short survey following the session. This will be voluntary. Items address the extent to which the webinar increased knowledge and/or skills, relevance to work, and more generally what participants liked/did not like about the webinar. The information from the feedback evaluations will be used to provide the National Evaluation team with feedback on the provision of technical assistance in order to improve as needed.


Additionally, grant recipients may be asked to complete a short survey occasionally regarding using DFC Me in order for the National Evaluation team to gather feedback on the user experience. This survey will also be voluntary. Items ask about satisfaction, usefulness, ease of use, and areas for improvement. This information will be shared with ONDCP to finalize any recommended changes to be made to the system.


The feedback evaluation items can be found in Attachment 9, Feedback Evaluation Items.


Use of the Revised Information


To date, the data collected through Progress Reports and the CCT have been widely used by ONDCP, GPO, and the previous and current national evaluator to monitor and assess progress, inform training and technical assistance delivery, and evaluate the effectiveness of DFC. Additionally, the data and results of the evaluation have been shared with the grant recipients to help inform coalition operations and programming. The data have been widely used by ONDCP to demonstrate to Congress, grant recipients, and other stakeholders the progress and impact of DFC (see Attachment 3, Sample National Evaluation Report).


Moving forward, the data will continue to be used to prepare annual reports and will be used in the comprehensive analysis for the current DFC National Evaluation (see Section A.16 for more detail on Analysis Plan). In addition, the DFC National Evaluation will continue to provide ONDCP, GPO and CADCA with ad hoc reports on approved request. These ad hoc requests typically focus on a specific substance (e.g., what grant recipients are reporting on opioids; what grant recipients are reporting on marijuana) or on subgroups of grant recipients (e.g., performance of grant recipients working in inner-city communities; grant recipients from a particular state). The general public may request copies of the survey data. The DFC National Evaluation team at ICF will compile and conduct a first round of review, and then make a recommendation to the ONDCP FOIA Officer. (see Attachment 10, DFC Data Request).


In addition, findings from the DFC National Evaluation will continue to be shared with grant recipients not only in written reports but also through presentations, including webinars specifically for grant recipients and at conferences attended by grant recipients as well as others interested in engaging community coalitions to prevent and reduce youth substance use, such as the CADCA Mid-Year Institute. Since 2016, each coalition also receives an updated snapshot following their Progress Report submissions. These snapshots are also shared with ONDCP, GPO, and CADCA as requested. ONDCP may occasionally use findings to engage with interested parties via social media (e.g., twitter).


A.3. Use of Information Technology and Burden Reduction


ONDCP awarded a contract in January 2015 that resulted in the development of DFC Me, an ongoing process. Grant recipients, ONDCP, GPO, and the DFC National Evaluation Team interact with the new system as appropriate to their role.


For example, ONDCP and GPO are able to review compliance with DFC terms and conditions via dashboards provided in DFC Me. GPO request revisions and or approve Progress Reports following submission. The DFC National Evaluation Team is able to dump data as appropriate from the system.


The revisions requested with this submission associated with the DFC Me system are intended to ensure that the use of the system does, in fact, provide the type of efficiencies expected with online reporting as explained in the previous OMB submission and summarized below:

  1. Integrated data collection. Integrating the data collection activities required by the evaluation and those needed for grant management/monitoring reduces the overall burden on grant recipients by providing a single source of information. This integrated system represents a forum where GPO and grant recipients can view identical information in real time and proactively answer questions of and about grant recipients.


Currently, all coalitions have Internet access and are able to access and utilize the system. However, for coalitions without Internet access, the Government Project Officer or the DFC National Evaluation Team are able to generate a blank “report” version to capture the progress reporting and evaluation information in hard copy, and then enter the information into the system on behalf of the coalition.


  1. Reducing reporting burden for grant management. DFC Me allows coalitions to enter or edit information and report accomplishments throughout the year. This spreads the reporting burden over a longer period if utilized, reducing the likelihood that a DFC recipient will need to expend significant time in a single week to complete requirements (although grant recipients may choose to complete reports in a single sitting as well). The introduction of the Learning Center in DFC Me creates not only opportunities for new learning but also increases grant recipients’ engagement and familiarity with the system and contributes to ease of use. As noted, DFC Me also provides grant recipients with a “report card” indicating their progress grant requirements. Project officers will be able to track grant recipients in real time as well, supporting their ability to check in with grant recipients who have not been in the system recently.


  1. Enhancing a coalition’s ability to succeed. Progress Reports are intended to be a training tool, as well as a data collection mechanism. By entering their Progress Report data, coalitions are able to reflect on their efforts and begin to think about how they are achieving desired outcomes, what they might need to do differently and how prepared they are to sustain beyond the end of grant funding.



A.4. Efforts to Identify Duplication and Use of Similar Information


The revisions requested to Progress Reports and the CCT are based on a review of ONDCP and grant recipient feedback, knowledge gained from site visits, and on current research and the need to incorporate new and appropriate measures to evaluate the effectiveness of the DFC Program in reducing youth substance use. Because Progress Reports and the CCT represent the most comprehensive dataset available for the evaluation of the DFC Program, it was determined that adding these variables did not represent duplication of information for the grant recipients. The proposed changes to the Progress Report are relatively small with little anticipated change in burden. The proposed revisions to the CCT will decrease burden from three hours to one hour annually for this requirement.


A.5. Impact on Small Businesses or Other Small Entities


Data captured in Progress Reports, including the core measures and the CCT, are collected from funded DFC coalitions, some of which may be small entities as defined by OMB. Compared with paper-based reporting requirements, web-based reporting streamlines access to and submission of coalition data, thus reducing the paperwork burden on these small entities. The DFC Me system has appropriate systems in place to support grant recipients and reduce burden, including training materials, detailed instructions, validation checks, and pop-up help features. Both an email help request address and a Helpline number are available for grant recipients who need support with any technical issues. Finally, as part of the DFC National Evaluation, technical assistance, including the “Ask an Evaluator” webinar series, is available to all grant recipients with special emphasis placed on data collection and reporting. No significant impact on small entities is expected.


A.6. Consequences of Collecting the Information Less Frequently


The proposed data collection supports multiple purposes as described in Section A.2, including grant monitoring, grant progress reporting, support of developmental progress, training and technical assistance, and evaluation. Current DFC Program policies require that these data be collected semi-annually, a requirement clearly spelled out in the Terms and Conditions of the grant award and in the request for applications. Grant recipient Progress Report data are collected to assess a coalition’s performance in meeting program goals and objectives and to support coalitions in receiving the technical assistance they need to meet goals and objectives.


While grant recipient Progress Report data is captured semi-annually, the CCT is completed annually. These two components of grant recipient reporting cannot be completed less frequently without adversely affecting the quality and reliability of evaluation data. Anecdotal evidence indicates that coalition capabilities and capacities can develop and expand substantially over the period of a year. Administering the CCT less frequently risks missing these developmentally important changes. Similarly, data from the Semi-Annual Progress Report cannot be collected less frequently without impairing the capacity to assess the effectiveness of the DFC Program as a whole. Specifically, grant recipients track strategies throughout the year and collecting this information at least twice annually helps to ensure that grant recipients submit as complete a picture as possible of their grant activities. Because progress reporting is interlinked with monitoring of grant recipients, the semi-annual progress reports facilitate GPOs in their monitoring of grant recipients, facilitating their ability to step in early when grant recipients may be facing implementation challenges.


Grant recipients are only required to collect and submit new core measure data every two years. Core measures data collection typically requires working with schools to conduct a survey of youth. Requiring new data only every two years is sufficient for the national evaluation and reduces burden on both schools and on the grant recipients. Grant recipients with sufficient resources not to consider it a burden are permitted to collect and submit new core measures data every year if it supports their local efforts. While grant recipients will continue to submit data by grade level, the proposed revisions reduce burden by no longer asking recipients to enter this data by gender.


A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


This information collection is consistent with the provisions in 5 CFR 1320.5(d)(2). Specifically:

  • Grant recipients are not required to report information to the agency more often than quarterly. For the DFC Program, progress reports are due semi-annually and the CCT completed annually. New core measure data is only required to be collected and reported every two years.

  • Grant recipients are not required to prepare a written response to the collection of information in fewer than 30 days after receipt.

  • Grant recipients are not required to submit more than one original and two copies of any document. For the DFC Program, grant recipient reports are completed online and not via hard copy, although should any DFC recipient identify challenges with completing a requirement online, a paper version will be made available to them.

  • Grant recipients are not required to retain records for more than 3 years.

  • All information collected has been designed to produce valid and reliable results that can be generalized to the universe of the study.

  • Statistical data classification will not occur in the absence of review and approval by OMB.

  • Information collection will not be conducted in a manner that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use.

  • Grant recipients are not required to submit proprietary trade secrets or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.


A.8. Federal Register Notice and Consultation Outside the Agency


The 60-day notice required in 5 CFR 1320.8(d) was published in the Federal Register on May 6, 2019 (84 FR 20357, pp. 20357-20358, Doc. 2019-09553) and the 30-day notice was published on July 18, 2019 (84 FR 34391, pp. 34391-34392, Doc. 2019-15303). No questions were received on either the 60-day or 30-day notice (Attachment 11, Federal Register 60-Day and 30-Day Notices).


These revisions, as stated previously, were informed by input from representatives from ONDCP, GPO, and grant recipients. ICF tracks technical assistance requests from grant recipients. In addition, grant recipients are encouraged to provide input on the system during presentations and one-on-one at CADCA Conferences (Leadership training in February annually and mid-year training in July annually). Finally, grant recipients that participate in site visits were specifically asked to discuss any recommendations they had with regard to the data required to be submitted and the system itself.


A.9. Explanation of Any Payment or Gift to Respondents


No payments are made to respondents.


A.10. Assurance of Confidentiality Provided to Respondents


The information to be collected pertains solely to DFC and CARA-ALDC coalitions (e.g., their characteristics, activities, functions, and community-level outcomes). No outcome data for any individual persons (e.g., youth who participate in DFC coalition activities) are sought. Names and contact information for individual persons in their official capacities as coalition leaders or representatives of organizations comprising the coalition (key personnel) are captured. Grant recipients submit core measures data collected from youth aggregated at grade level. This information is routinely provided in the course of grant application and administration. All personal contact information is treated in a confidential manner. No narratives gathered as part of the information collection activities are or will be attributed to a specific individual in any reports. Below we describe the handling and reporting procedures employed in order to maintain the privacy of individuals who provide data in their capacities as coalition representatives.


  • Every DFC coalition participating in the National Evaluation is assigned a unique identification number by the DFC National Evaluation team. This ID number is used to monitor the DFC coalition’s status throughout the evaluation. A master list linking the unique ID to coalition information is stored separately from the data.

  • Access to identifying information is limited to the DFC National Evaluation team.

  • Coding documents and computer files of survey data refer to DFC coalitions by their ID numbers only. No name or institutional identifiers other than ID numbers appear on forms.

  • Individual data bases and computer files are protected by passwords or other techniques to restrict access to staff involved in data analysis.

  • No data used in the DFC National Evaluation will be reported in any form that can be traced back to individual DFC coalitions. For example, cell sizes of less than 10 will not be reported to further protect respondents from identification.

  • Coalitions are not asked to provide individual-level outcome information or any outcome information for subgroups that could be used to identify responses of individuals.


There are two potential exceptions to the confidentiality rules. Grant recipients who participate in site visits are promised individual-level confidentiality. At the coalition level, participants are informed that should the coalition’s leadership agree in writing to be identified as associated with promising practices, this information will be shared with other grant recipients as a whole. Prior to agreeing to this, leadership is provided with a site visit summary and is notified regarding particular practices that may be called out. It is made clear to participating coalitions that they can choose to not have their coalition name called out and there are no consequences for requesting to remain confidential. This strategy was taken as many DFC coalitions indicated a desire to share their experiences and be able to discuss them with other grant recipients, and this cannot occur without identification.


The second exception is that Progress Report data are summarized on individual coalition level snapshots. Each coalition receives a copy of their own snapshot while ONDCP, GPO and CADCA may also receive these if agreed to by ONDCP. These summaries are high level and are intended to help the coalition and others to share their DFC stories based on data. These snapshots include core measure data, but as noted previously, this reflects data aggregated from youth surveys collected in the community. That is, they do not reflect individual level data.


In addition, as noted (see Attachment 10, DFC Data Request ), outside parties may seek permission to access DFC data. Requestors must follow the guidance established by the Freedom of Information Act to request the data from ONDCP. Considering the vast amount of data collected, the requests must be specific to prevent from overburdening ONDCP and ICF. Data requests may be emailed to the ONDCP FOIA address [email protected] or mailed to


SSDMD/RDS; EOP Office of National Drug Control Policy

ONDCP FOIA Officer

Joint Base Anacostia-Bolling (JBAB)

Bldg. 410/Door 123

250 Murray Lane, SW

Washington, DC 20509


follow.


Upon approval of the revisions to the Progress Report and CCT, ICF will seek and receive clearance for the protection of human subjects from their IRB, in compliance with 45 CFR 46.


A.11. Justification for Sensitive Questions


No questions are asked that are of a sensitive nature.


A.12. Estimates of Hour Burden Including Annualized Hourly Costs


The changes identified here for the DFC Progress Report are minimal. However, a number of opportunities have been made available for grant recipients to enter additional stories about their work. Should DFC and CARA-ALDC grant recipients choose to complete these sections it is anticipated burden will increase from five hours to six hours per reporting period. To help ensure minimum reporting burden on the grant recipients, ongoing technical assistance is available to prepare grant recipients for the changes and to address problems or issues in real-time. In addition, DFC and CARA-ALDC grant recipients will be encouraged to input data into the system on an ongoing basis in order to reduce burden at any one time.


The number of items in the CCT has been significantly reduced. It is anticipated that the burden associated with the revised CCT will be approximately one hour annually (a reduction from three hours annually). The burden estimate for the case studies is associated with interviews and focus groups, as well as the efforts of one staff member to assist in planning the site visit. Typically site visits last 1-2 days on site (8-12 hours across a range of participants from the coalition) as well as up to 2 hours for planning the visit. The burden estimates for Progress Reports, the CCT, and the Case Study Interviews are presented in the table below using the 724 FY 2018 grant recipients as number of respondents. No STOP Act grant recipients were awarded in FY 2018. Should this program be further funded in the future, prior data suggest that a relatively small number of recipients are not also current DFC recipients. Coalitions with both DFC and STOP Act simply complete grant requirements a single time. In 2018, 55 CARA-ALDC grants were awarded. All but 13 of the 55 recipients are current DFC recipients. Recipients with both awards will fill out one progress report. That is, they are not required to submit data separately for the two grants. CARA-ALDC recipients without a DFC will only be required to complete the progress report and CCT.


Table 1: Estimates of Hour Burden

Type of Respondents

Number of Respondents

Frequency of Response

Average Time per Response (in hours)

Total Annual Burden (in hours)

Instrument – Semi-Annual Progress Report

DFC Grant recipient Program Directorsa

724

2

6

8,688

CARA-ALDC recipient Program Directors

13

2

6

156

Instrument – CCT (Coalition Classification Tool)

DFC Grant recipient Program Directors

724

1

1

724

CARA-ALDC recipient Program Directors

13

1

1

13

Case Studies – Interviewsb

DFC Coalition Sector Members

144*

1

1.5

216

DFC Coalition Chair/Program Director

9

1

4

36

Total

9,833

Note: Estimate does not include estimates for completing feedback evaluations. For the technical assistance items, grant recipients may participate in 1-4 webinars per year. The follow-on survey is estimated to take 5-15 minutes of time to complete (six items). For the DFC Me items, grant recipients may choose to complete the survey as many times as they wish, but it is anticipated they will take it only once. There are six items estimated to take up to 5-15 minutes to complete, adding approximately 905 hours if grant recipients choose to participate.

aOnly key personnel, such as the Program Director, have access to the DFC Me system and can complete requirement.

bAverage of 16 members at each of 9 case study coalitions will be interviewed or participate in focus groups. Interviews with program directors will typically last 2 hours. An additional two hours has been allocated for project director or assigned coalition member to schedule site visit.


Table 2: Annualized Cost to Respondents

Type of Respondents

Number of Respondents

Frequency of Response X Average Time Per Response

Hourly Wage Ratea


Annualized

Cost to Respondents

Instrument – Semi-Annual Progress Report

DFC Grant recipient Program Directors

724

12

$23.11

$200,780

CARA-ALDC recipient Program Directors

13

12

$23.11

$3,605

Instrument: Coalition Classification Tool (CCT)

DFC Grant recipient Program Directors

724

1

$23.11

$16,732

CARA-ALDC recipient Program Directors

13

1

$23.11

$300

Case Studies – Interview Protocols

DFC Coalition Sector Members

144

1.5

$23.11

$4,992

DFC Coalition Chair

9

4

$23.11

$832

Total for DFC Coalitions and CARA-ALDC recipients

$227,241

Total per DFC Coalition/CARA-ALDC recipient (excluding site visit participants)

$300.43

a fThe hourly wage represents the average hourly wage for community and social services occupations reported by the Bureau of Labor Statistics for the National Occupational Employment and Wages, May 2017




A.13. Estimate of Other Total Annual Cost Burden to Respondents or Record Keepers


There are no capital/start-up or operational/maintenance of services costs to the respondents associated with this evaluation. Grant recipients may self-select to share success stories within DFC Me. The system is designed to allow the coalition to upload files, photos, videos, presentations developed for other purposes in order to support the ease of any submissions.


A.14. Annualized Cost to the Federal Government


The first year contract cost for development/revision of the data collection system and instruments, data collection, data processing, analysis, and reporting was $975,000. After that, the costs were reduced to an estimated $332,000 to $350,000 annually. In addition, one Federal employee will be involved for approximately 30% of his/her time over the five years of the project. The cost of the DFC National evaluation ranges from approximately $1.3 to $1.6 million annually. Annual costs to the government for Federal staff to oversee and support this project are $90,000 for each year. The total annualized cost to the Federal government is approximately $1.7 to $2.7 million annually. Travel to provide in-person training and site visits will not exceed up to an additional $50,000 annually (billed at actual cost).


A.15. Explanation for Program Changes or Adjustments


The Progress Report and the CCT are both existing tools used to collect required data from current and future grant recipients (DFC and CARA-ALDC). The revisions requiring OMB approval are based on input from current users of the system (DFC grant recipients) and the data currently reported (ONDCP, GPO, national evaluator). Based on this input, new burden estimates are necessary that more accurately reflect the effort involved in reporting required data using the system and tools. These revisions are necessary to reduce actual burden on grant recipients, ensure better data quality, and ultimately to support the DFC National Evaluation (as well as potential evaluations of STOP Act and/or CARA-ALDC).



A.16. Time Schedule, Analysis Plans, and Publication


Time Schedule


The data collection for the evaluation will occur throughout the five-year period of the contract (through November 2020). A specific time schedule is provided in the table below and has been and will continue to be dependent on the evaluation team’s access to required reporting data following each of the semi-annual report cycles. Case study data will be collected once a year with each of nine identified DFC grant recipients.


Exhibit 3: Evaluation Time Schedule


Activity

Time Schedule

E-blast sent to respondents informing of changes to Progress Report and CCT

2 weeks after OMB approval

Revised Semi-Annual Progress Reporting (new DFC Me)

Ongoing across the five years (semi-annually in February and August)

Core Measure Outcome data

Every other year

Revised Coalition Classification Tool

Annually for five years (August)

New Case Study Site Visits Interview Protocols

Annually for five years (nine each year)

Data analysis: Includes classification, data quality assessment, and cross-sectional analyses

Ongoing for five years; conducted with current and historical data and will continue with revised data

Submit Annual Report

Annually across the five years

Conduct longitudinal data analysis

Ongoing across the five years

Publish findings on DFC National Evaluation

As outlined in the contract between ONDCP and ICF


Analysis Plans and Publication


Since 1998, DFC recipients’ communities have expanded efforts to address social problems through collective action. Based on the belief that new financial support enables a locality to assemble stakeholders; assess needs; enhance and strengthen the community’s prevention service infrastructure; improve immediate outcomes; and reduce levels of substance use, DFC-funded coalitions have been able to implement strategies that have been supported by prior research.2 Research also shows that effective coalitions are holistic and comprehensive; flexible and responsive; build a sense of community; and provide a vehicle for community empowerment.3 Yet, there remain many challenges to evaluating them. Specific interventions vary from coalition to coalition, and the context within which interventions are implemented is dynamic. As a result, conventional evaluation models involving comparison sites are difficult to implement.4


Three major features of the current DFC National Evaluation Plan allow for the expansion of previous analysis to include a far greater range of hypotheses concerning the coalition characteristics that contribute to stronger outputs, stronger coalition outcomes, and ultimately, stronger community outcomes.

  • First, going forward, the evaluation approach will continue to utilize progress report and CCT data to systematically deconstruct more encompassing measures (e.g., maturation stages, engagement with youth coalitions) into specific constructs that are more clearly related to strategies and functions that coalitions must perform, and that define their capacity. This will provide measures of multiple coalition characteristics that may differentiate real world coalitions. Differentiating coalitions will help to better understand effective coalitions, across different settings and in different coalition systems.

  • Second, the evaluation will continue to use a natural variation approach. This approach examines the naturally occurring differences, or variations, in coalitions’ organization, function, procedures, and management strategies, with the intent of providing concrete lessons on how to construct effective coalitions in diverse settings. The data collected for the case studies contribute significantly to this part of the evaluation, as does the opportunity within Progress Reports for coalitions to provide detailed examples of the work in which they engage.

  • Third, the evaluation design and analysis will continue to use a multi-method approach, which will allow for different “sub-studies” within the larger DFC National Evaluation and will provide unique opportunities to contribute to overall program improvement.

By better understanding the DFC Program and its mechanisms for contributing to positive change, the National Evaluation can deliver an effective, efficient, and sensitive set of analyses that will meet the needs of the program at the highest level, while also advancing prevention science.

The exhibit below provides a summary of selected examples of research questions relevant to the evaluation, the products through which findings and lessons can be communicated in a useful way, and a preliminary identification of the analysis methods that the evaluation will support. An illustrative sketch of one of the quantitative methods named in the exhibit follows it.


Case studies: Participating coalitions are provided with case study reports tailored to their coalition to disseminate at their discretion. Reports highlight the community context, coalition capacity, strategies and activities, and accomplishments and challenges, and are developed with the goal of sharing accomplishments and lessons learned from the evaluation with all coalition stakeholders. Where appropriate, these case study reports will also be shared utilizing the DFC Me Learning Center.


Exhibit 4. Key Evaluation Questions, Products, and Analytic Methods


Key Research Questions:

  • Do DFC coalitions have positive outcomes on the core measures? Are they improving over time?
  • In what other ways are DFC coalitions having an impact?
  • What does DFC membership look like annually, and how does it change over time?
  • To what extent are DFC coalitions operating with a youth coalition? How is engagement with a youth coalition associated with outcomes, membership, and strategy implementation?
  • What strategies do DFC coalitions implement to prevent youth substance use in their communities?
  • How do DFC outcomes compare with national data?
  • What relationship exists between strategy implementation and/or coalition processes and outcomes?

Products:

  • Mid-year and end-of-year evaluation reports; infographics
  • Journal publications

Methods:

  • Change analysis for core measures, other outcomes, and grant recipient outputs and outcomes, in aggregate and by comparison groups
  • Analyses of Community Assets data (CCT) for programs put in place following DFC
  • Annual description of membership data, including involvement by sector, and examination of trends associated with maturation
  • Description of the frequency of engagement with youth coalitions and comparison of outcomes for coalitions with versus without a youth coalition
  • Descriptive frequencies of annual strategy implementation data; exploratory factor analysis and correlations to assess relationships between strategy practices
  • Comparison to YRBS and MTF data where appropriate
  • Correlations; t-tests; ANOVA; temporal associations (prior-period strategies with later outcomes)

Key Research Questions:

  • What are the key ingredients of successful collaboration between community partners?
  • What potential pitfalls exist in the implementation process, and how can they be avoided?
  • Are coalitions enhancing the prevention system?
  • Do coalitions support community engagement in the implementation of prevention science?
  • What specific initiatives or strategies should be implemented to keep youth drug- and alcohol-free?
  • What practices should be replicated in all coalitions, and which practices are useful in specific contexts?

Products:

  • Practice briefs; infographics
  • Mid-year and end-of-year evaluation reports
  • Journal publications

Methods:

  • Coding and profiling of coalitions using qualitative data (e.g., site visits, progress report open-text fields)
  • Profiling and correlation of process data from qualitative progress report data and site visits, with replication and confirmation as site visits accumulate across years
  • Exploratory analysis using enhanced process measures (CCT)
  • Confirmatory analysis of cross-site visit findings through enhanced process measures on all coalitions
  • Assessment of the perceived value of strategies in site visits
  • Bivariate (exploratory) correlation and multivariate relation of practice measures with outcomes using coded site visit data in cross-site analysis; multivariate (confirmatory) analysis
  • Correlation of coalition strength measures (e.g., CCT and others) with community resources and other measures

Key Research Questions:

  • What policies should be implemented to keep students and adults drug-free?
  • What are the most cost-effective ways to reduce substance use in communities?

Products:

  • Mid-year and end-of-year evaluation reports (executive summary for policymakers); infographics
  • Practice briefs

Methods:

  • Assessment of the perceived effectiveness of policies in site visits
  • Change analysis for GPRA/core measures regarding substance use, comparing coalitions that differ in the type and intensity of policies
  • Simple assessment and analysis of the cost of policies relative to their effectiveness in site visit communities

Key Research Questions:

  • Are comprehensive community initiatives reducing the negative effects of alcohol and other drug use among youth (e.g., reductions in DUI/drugged driving)? Among adults?

Products:

  • Mid-year and end-of-year evaluation reports; practice briefs

Methods:

  • Multivariate assessment of the relation between coalition strategies, implementation strength, and community outcomes
  • Exploratory site visit and cross-site analysis; confirmatory multivariate analysis in comparison samples and the full sample

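
To illustrate one of the quantitative methods named above, the following is a minimal sketch (in Python) of a change analysis for a single core measure across two reporting years. The file name, column names, and years are hypothetical and are not drawn from the DFC Me system; the sketch simply shows a paired comparison for coalitions that reported in both cycles.

import pandas as pd
from scipy import stats

# Hypothetical extract: one row per coalition per reporting year, with the
# reported past-30-day alcohol use rate for high school students.
df = pd.read_csv("core_measures.csv")  # columns: coalition_id, year, past30_alcohol

# Reshape to one row per coalition with a column per year, then keep only
# coalitions that reported in both comparison years.
wide = df.pivot(index="coalition_id", columns="year", values="past30_alcohol")
paired = wide[[2016, 2018]].dropna()

# Paired t-test of the change in the reported rate between the two years.
t_stat, p_value = stats.ttest_rel(paired[2016], paired[2018])
mean_change = (paired[2018] - paired[2016]).mean()
print(f"n = {len(paired)}, mean change = {mean_change:.2f} percentage points, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")

In practice, the evaluation's change analyses would also account for comparison groups and repeated reporting periods; this sketch is intended only to make the basic paired comparison concrete.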

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate


No exemption from displaying the expiration date is requested.


A.18. Exceptions to Certification for Paperwork Reduction Act Submissions


This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.



B. Collections of Information Employing Statistical Methods


B.1. Respondent Universe and Sampling


Grant Recipients. The respondent universe consists of all grant recipients receiving funding in FY 2018 (N=724 DFC plus N=13 CARA-ALDC) and any additional grant recipients funded throughout the evaluation period. Information will be used to estimate substance use rates among youth in DFC communities and compare trends in these rates with national trend data (e.g., YRBS data). Data for comparison will be abstracted from national- or state-level surveys (i.e., without the need for additional primary data collection or sampling). Therefore, no sampling procedures are being used for the evaluation. Instead, all grant recipients will continue to submit required data through DFC Me.


To reduce burden while still providing the information needed for the evaluation, case studies will be conducted with a limited sample of DFC grant recipients. Analysis of data from progress reports and the CCT will be used to identify appropriate DFC coalitions each year. In some cases the focus will be on best practices with regard to change in core outcomes. In other cases, sites will be selected based on specific interests identified by ONDCP (e.g., best practices in inner-city coalitions, best practices with regard to addressing synthetic drugs). A purposeful sample of nine grant recipients will be selected from the potential pool of relevant sites to ensure representation across sites where possible. This process will be repeated in each year of the DFC National Evaluation to identify nine new coalitions to highlight through the case studies. Annual site visits will be conducted at each case study location to conduct interviews and focus groups as described previously. All key personnel will be interviewed if possible. A sample of approximately 12-16 coalition members from each site will be selected to participate in interviews or focus groups. Whenever possible, coalitions will be encouraged to have youth and parent sector members participate in the site visits through focus groups (typically 2-8 members each). One member of each of the 12 key sectors represented on DFC coalitions will participate when possible. Each site will designate a local coordinator to determine who is best suited to participate in the site visit for that location.


B.2. Procedures for the Collection of Information


The Terms and Conditions that accompany grant recipients’ Notices of Award at the beginning of the grant award year contain specific information about the reporting requirements. Once all requested revisions to progress reports and the CCT have been approved by OMB, and once the revisions are made in DFC Me, grant recipients will be informed of the changes by ONDCP via E-blast. Additionally, a webinar explaining the changes will be prepared by the DFC National Evaluation team, and grant recipients will be given the opportunity to share their experiences and suggestions for improvement through a short, voluntary questionnaire (Attachment 9, Feedback Evaluation Items). Grant recipients will be encouraged to access technical assistance through the following venues:



Once case study sites have been identified by the National Evaluation team through the analysis of progress report and CCT data and consultation with ONDCP, a member of the National Evaluation team will contact the DFC grant recipient’s Program Director to inform them of the selection and invite them to participate in the case study. If they accept the invitation, a site visit will be planned and conducted; if the grant recipient declines to participate, an alternate will be identified and contacted. That is, participation in the site visits is voluntary. Site visits last up to two days, and interviews/focus groups are conducted with stakeholders at all levels of involvement with coalition activities. Site visits are limited to participation by 16 individuals unless the coalition deems it appropriate to involve additional members. Most typically, the total number of participants exceeds 16 only in those cases where youth or parent focus groups include 6-8 members, reflecting high engagement of these sectors by the coalition.


Data Management and Quality Control


Data quality will continue to be of particular concern on this project because the primary data for the evaluation are self-reported, and the DFC coalitions are responsible for identifying and reporting community-level outcome data. Data quality will be improved through the use of vetted survey items and through training and technical assistance in responding to survey items. A major focus of data quality assurance will be on the outcome measures. The outcome measures will be community-level statistics obtained from surveys that are conducted independently of this evaluation. Coalitions will be responsible for identifying the appropriate data source, locating the data corresponding to the outcomes and strata requested in this evaluation, and entering them accurately into the data collection instrument. This process may lead to data that fall below the minimum data quality standards needed to conduct an unbiased evaluation. Deficiencies may be indicated by, among other things, (1) significant amounts of missing or invalid data, (2) evidence of inaccurate data, and (3) the use of unreliable methods by coalitions for collecting outcome measures.


Potentially inaccurate data will be identified using a number of quality checks, including those listed below (an illustrative sketch of how several of these checks could be automated follows the list):


  • Identifying identical responses across multiple categories within the same strata and year.

  • Identifying identical responses in large numbers of cells over time.

  • Identifying discrepancies in sample sizes or estimated proportions across different levels of aggregation (e.g., the total number of respondents summed across grade levels does not equal the total number of respondents summed across gender, or the number of respondents changes radically from year to year).

  • Performing a formal, statistically based outlier analysis for reported outcomes.

  • Establishing criteria that may be indicative of potentially invalid or inaccurate responses, such as the reporting of 100% or 0% of youth for a particular outcome.
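
The following is a minimal sketch (in Python) of how several of these checks could be automated. The file name, column names, and the 3-standard-deviation threshold are hypothetical and do not reflect the actual DFC Me validation rules.

import pandas as pd

# Hypothetical extract of coalition-reported core measure data.
# Columns: coalition_id, year, measure, grade, pct_reporting_use
df = pd.read_csv("reported_outcomes.csv")

flags = []

# Check for boundary values (0% or 100%) that may indicate invalid entries.
bounds = df[(df["pct_reporting_use"] <= 0) | (df["pct_reporting_use"] >= 100)]
flags.append(("boundary_value", bounds))

# Check for identical values across all grade levels within one coalition,
# measure, and year (a possible sign of copied or placeholder entries).
n_unique = (df.groupby(["coalition_id", "year", "measure"])["pct_reporting_use"]
              .nunique().reset_index(name="n_unique"))
flags.append(("identical_across_strata", n_unique[n_unique["n_unique"] == 1]))

# Flag statistical outliers: values more than 3 standard deviations from the
# mean for the same measure and year across all coalitions.
z = (df.groupby(["measure", "year"])["pct_reporting_use"]
       .transform(lambda s: (s - s.mean()) / s.std(ddof=0)))
flags.append(("outlier", df[z.abs() > 3]))

for name, rows in flags:
    print(f"{name}: {len(rows)} flagged records")

Flagged records would still require manual review before any correction or exclusion; the sketch is intended only to show that such checks can be applied systematically.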


Coalitions are asked to report outcome measures but are not mandated as to how they obtain the requisite information. That is, each coalition may choose to employ a different survey technique to obtain this information, as well as to identify its own local sampling strategy. Therefore, there is the potential that some coalitions may rely on techniques that are known to be biased. As part of the information collected from coalitions, data on the instrument used for collecting outcome measures will continue to be requested. For example, coalitions are asked to indicate the source of their outcome data (state survey, established community survey, or custom survey) and whether they perceive the data to be representative of the community in which their work is conducted.


Outcome measures obtained from an established state or community survey are more likely to yield scientifically valid and representative results for the community. Outcome measures collected using other methods (e.g., use of custom surveys) are more likely to be biased or non-representative, and additional information will continue to be sought from coalitions that report using these methods in order to evaluate the validity of the reported outcomes. If grant recipients indicate the use of a custom survey, they must have the survey reviewed by the evaluation team and approved by their GPO. In addition, technical assistance will be made available to grant recipients to help them select valid instruments for collecting outcome measures.


Another possible challenge to data quality is the sampling technique used by grant recipients when administering their surveys. Since the results of these surveys form the core findings for the DFC National Evaluation, additional steps are planned to ensure the validity of the sampling process, and by extension, the validity of the evaluation results. Through proactive technical assistance to grant recipients, the National Evaluation team will provide instructions on how to sample students for outcome surveys. The importance of obtaining a representative sample will be emphasized. Moreover, it is critical that the sampling frame remain consistent over time so changes across time can be accurately measured.
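
As one example of the kind of guidance that could be provided, the following is a minimal sketch (in Python) of proportionate stratified sampling of students by grade for an outcome survey. The enrollment figures and target sample size are hypothetical; this is not the actual instruction issued by the National Evaluation team.

import random

random.seed(42)  # fixed seed so the draw is reproducible

# Hypothetical enrollment roster: student identifiers keyed by grade level.
enrollment = {
    9:  [f"s9_{i}" for i in range(300)],
    10: [f"s10_{i}" for i in range(280)],
    11: [f"s11_{i}" for i in range(260)],
    12: [f"s12_{i}" for i in range(240)],
}
target_n = 400
total_students = sum(len(students) for students in enrollment.values())

sample = []
for grade, students in enrollment.items():
    # Allocate the sample in proportion to each grade's share of enrollment.
    n_grade = round(target_n * len(students) / total_students)
    sample.extend(random.sample(students, n_grade))

print(f"Drew {len(sample)} students across {len(enrollment)} grade levels")

Because the allocation is proportionate and random within each grade, repeating the same procedure in later years keeps the sampling frame consistent, which supports the comparability over time emphasized above.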


Data quality will also be enhanced through two concurrent strategies that are currently employed by the evaluation team: 1) continuation of current data cleaning procedures, and 2) provision of technical assistance to grant recipients.


The data entered by grant recipients are cleaned at multiple points (an illustrative sketch of two of these steps follows the list):


  • Data are reviewed by GPO and, once approved, data are cleared for release to the National Evaluation team.

  • A more in-depth two-step cleaning process is conducted by the National Evaluation team.

  1. Raw data are cleaned and processed using SAS and SPSS code, then appended to existing raw databases. Most of the procedures involve logic checks within given databases.

  2. The raw data are processed to develop a set of analysis databases, which are used for all analyses. Data cleaning procedures conducted at this step mainly involve logic checks both within and across databases. Particular attention is paid to core measures data to identify any reported changes that are extreme outliers and may reflect incorrectly entered data.

  • A final round of data cleaning is conducted within the analysis programs. For example, before data are analyzed, duplicate records are removed (duplicates are created when grant recipients update records from previous reporting periods).
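
The following is a minimal sketch (written in Python for brevity; the evaluation team's actual SAS and SPSS procedures are not reproduced here) of two of the steps described above: removing duplicate records created when a grant recipient updates a prior reporting period, and a simple within-database logic check. The file name and field names are hypothetical.

import pandas as pd

# Hypothetical raw extract.
# Columns: coalition_id, reporting_period, submitted_at, youth_served
raw = pd.read_csv("progress_report_raw.csv", parse_dates=["submitted_at"])

# Keep only the most recently submitted record for each coalition and
# reporting period, dropping earlier duplicates.
clean = (raw.sort_values("submitted_at")
            .drop_duplicates(subset=["coalition_id", "reporting_period"], keep="last"))

# Simple within-database logic check: reported counts should not be negative.
invalid = clean[clean["youth_served"] < 0]

print(f"{len(raw) - len(clean)} duplicate records removed; "
      f"{len(invalid)} records failed the logic check")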

Ensuring that data are of the highest quality possible increases the confidence in the findings. Standard procedures will also be employed by the DFC National Evaluation to ensure all data (qualitative and quantitative) from the case studies are entered, cleaned, and checked for problems before moving forward with the analysis. Given that DFC is implemented through the Executive Office of the President, and is attended to closely by members of Congress, it is expected that the current evaluation will be subject to scrutiny. Having confidence in the results is, therefore, of the utmost priority.


Much of the data central to this evaluation is collected or provided by grant recipient organizations or individual respondents within organizations. As noted above, the National Evaluation team will be conducting extensive data quality and missing data bias analyses on existing data. We will identify major challenges in past data collection and develop responses and procedures that will help ameliorate these challenges.


If data quality issues are identified, they will be reported to the GPO. Where possible, the grant recipient's report can be placed into a status referred to as Needs Revision After Approval so that the correction can be completed. If the issues cannot be resolved by the GPO in conjunction with the coalition, data from that coalition may be excluded from the statistical analyses, which will reduce the effective sample sizes and the resulting statistical power of hypothesis tests for the evaluation.
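
To make the consequence of such exclusions concrete, the following is a minimal sketch (in Python, using the statsmodels library) of how statistical power for a simple two-group comparison declines as coalitions are excluded. The effect size, alpha level, and group sizes are hypothetical and do not represent planned analyses.

from statsmodels.stats.power import TTestIndPower

# Hypothetical: power to detect a small effect (Cohen's d = 0.2) in a
# two-group comparison at alpha = 0.05, at successively smaller sample sizes.
analysis = TTestIndPower()
for n_per_group in (350, 300, 250):
    power = analysis.power(effect_size=0.2, nobs1=n_per_group, alpha=0.05, ratio=1.0)
    print(f"n per group = {n_per_group}: power = {power:.2f}")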


B.3. Methods to Maximize Response Rates and Deal with Non-Response


Given DFC Terms and Conditions, a 100% response rate among the DFC coalitions is expected for the Semi-Annual Progress Reports, core measures, and the CCT. DFC Me provides each coalition with a Report Card summarizing the extent to which they are in compliance, further ensuring that recipients are aware that they may have a reporting requirement that needs to be addressed.


In addition, a high response rate among DFC coalitions is anticipated based on current use of the system: while not 100%, in August 2018, 708 of 713 FY 2017 DFC recipients (99%) completed the required Progress Report. Moreover, using the progress reporting tool will allow coalitions to generate analytic information to assess their own performance and target efforts to improve sustainability. For example, a critical function of the CCT is to assist coalitions (and the National Evaluation team) in identifying areas where additional technical assistance and training may be required to further improve coalition performance.


Non-respondents will be contacted by their GPO and required to complete either the semi-annual report or the CCT in accordance with the Terms and Conditions provided with the Notice of Award. Failure to complete the semi-annual report, the bi-annual data reporting, and/or the annual CCT will result in further follow-up by the GPO. A corrective action plan will be implemented requiring the grant recipient to input the necessary data by a specified deadline. Failure to comply with the corrective action plan will result in the execution of the DFC Program's established progressive disciplinary action plan, which ranges from suspension to termination of a grant. Incentives in the form of ad hoc analyses/summaries of the collected information will be provided to participating coalitions.



B.4. Test of Procedures or Methods to be Undertaken


The revisions will be shared with the grant recipients during technical assistance webinars. Any lessons learned from the testing will be incorporated, as appropriate, before revisions are final. Any revisions will be communicated to OMB as required.


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Barbara O’Donnel, Ph.D.

Principal

ICF International

9300 Lee Highway

Fairfax, VA 22031

[email protected]


Jeremy Goldbach, Ph.D.

Co-Principal Investigator

[email protected]


James Demery, Ph.D.

Technical Specialist

[email protected]


Kelle Falls, M.A.

Technical Specialist

[email protected]


Erica McCoy, M.P.A.

Associate

[email protected]


Caitlin McLaughlin, M.A.

Principal

[email protected]


Jason Schoeneberger, Ph.D.

Senior Technical Specialist

[email protected]


Maria Asencio, M.S.

Senior Associate

[email protected]


Ted Coogan

Senior Project Manager

[email protected]


1 Youth Risk Behavior Survey Questionnaire Content: 1991-2019, Centers for Disease Control and Prevention, retrieved October 28, 2018, from https://www.cdc.gov/healthyyouth/data/yrbs/pdf/2019/YRBS_questionnaire_content_1991-2019.pdf

2 Brounstein, P. & Zweig, J. (1999). Understanding Substance Abuse Prevention Toward the 21st Century: A Primer on Effective Programs. Washington, DC: U.S. Department of Health and Human Services.

3 Wolf, T. (2001). Community Coalition Building–Contemporary Practice and Research: Introduction. American Journal of Community Psychology, 29(2), 165-172. Community Anti-Drug Coalition of America (2010). Research Support for Comprehensive Community Interventions to Reduce Youth Alcohol, Tobacco and Drug Use and Abuse. Washington, DC: Author. Retrieved from http://www.cadca.org/files/resources/ResearchSupport-4-ComprehensiveInterventions-09-2011.pdf

4 Gruenewald, P.J. (1997). Analysis Approaches to Community Evaluation. Evaluation Review, 21(2), 209-230.
