Drug-Free Communities Support Program National Evaluation

OMB: 3201-0012

Office of Management and Budget



Clearance Package Supporting Statement and Revised Data Collection Instruments





Drug-Free Communities Support Program National Evaluation









Supported by:


Executive Office of the President

Office of Administration

Office of National Drug Control Policy

1800 G Street NW

Washington, DC 20006




Assistant Director, Drug-Free Communities Program:

Helen Hernandez

(202) 395-6665 (phone)

(202) 395-6711 (fax)













August 2019



Table of Contents



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1. Respondent Universe and Sampling


B.2. Procedures for the Collection of Information


B.3. Methods to Maximize Response Rates and Deal with Non-Response


B.4. Test of Procedures or Methods to be Undertaken


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data





B. Collections of Information Employing Statistical Methods


B.1. Respondent Universe and Sampling


Grant Recipients. The respondent universe consists of all grant recipients receiving funding in FY 2018 (N=724 DFC plus N=13 CARA-ALDC) and any additional grant recipients funded throughout the evaluation period. This information will be used to estimate substance use rates among youth in DFC communities and to compare trends in these rates with national trend data (e.g., Youth Risk Behavior Survey [YRBS] data). Data for comparison will be abstracted from existing national- or state-level surveys, without the need for additional primary data collection or sampling; therefore, no sampling procedures are being used for the evaluation. Instead, all grant recipients will continue to submit required data through DFC Me.
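
Because community-level rates will be compared against national benchmarks such as YRBS, one way such a comparison could be made is a two-proportion z-test. The sketch below is illustrative only, assuming simple aggregate counts; the function name and all figures are hypothetical and are not drawn from the evaluation.

    # Minimal sketch (Python): comparing a community's past-30-day use rate
    # with a national benchmark via a two-proportion z-test. All counts are
    # hypothetical and for illustration only.
    import math

    def two_proportion_z_test(x1, n1, x2, n2):
        """Return (z, two-sided p-value) for H0: p1 == p2."""
        p_pool = (x1 + x2) / (n1 + n2)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
        z = (x1 / n1 - x2 / n2) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
        return z, p_value

    # e.g., 180 of 1,200 surveyed youth locally vs. 2,100 of 12,000 nationally
    z, p = two_proportion_z_test(180, 1200, 2100, 12000)
    print(f"z = {z:.2f}, p = {p:.4f}")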


To reduce burden while still providing the information needed for the evaluation, case studies will be conducted with a limited sample of DFC grant recipients. Analysis of data from progress reports and the CCT will be used to identify appropriate DFC coalitions each year. In some cases, the focus will be on best practices with regard to change in core outcomes; in other cases, sites will be selected based on specific interests identified by ONDCP (e.g., best practices in inner-city coalitions or in addressing synthetic drugs). A purposeful sample of nine grant recipients will be selected from the pool of relevant sites to ensure representation across sites where possible. This process will be repeated each year of the DFC National Evaluation to identify nine new coalitions to highlight through the case studies. Annual site visits will be conducted at each case study location to conduct the interviews and focus groups described previously. All key personnel will be interviewed if possible, and a sample of approximately 12-16 coalition members from each site will be selected to participate in interviews or focus groups. Whenever possible, sites will be encouraged to include focus groups with youth and parent sector members (typically 2-8 members each), and one member of each of the 12 key sectors represented on DFC coalitions will participate when possible. Each site will designate a local coordinator to determine who is best suited to participate in the site visit at that location.


B.2. Procedures for the Collection of Information


The Terms and Conditions that accompany grant recipients' Notices of Award at the beginning of the grant award year contain specific information about the reporting requirements. Once all requested revisions to progress reports and the CCT have been approved by OMB and implemented in DFC Me, grant recipients will be informed of the changes by ONDCP via e-blast. Additionally, the DFC National Evaluation team will prepare a webinar explaining the changes, and grant recipients will be given the opportunity to share their experiences and suggestions for improvement through a short, voluntary questionnaire (Attachment 9, Feedback Evaluation Items). Grant recipients will also be encouraged to access technical assistance through several venues.



Once case study sites have been identified by the National Evaluation team through analysis of progress report and CCT data and consultation with ONDCP, a member of the National Evaluation team will contact the DFC grant recipient's Program Director to inform them of the selection and invite them to participate in the case study. If they accept the invitation, a site visit will be planned and conducted; if the grant recipient declines, an alternate site will be identified and contacted (participation in the site visits is voluntary). Site visits last up to two days, and interviews/focus groups are conducted with all levels of stakeholders involved with coalition activities. Site visits are limited to 16 participants unless the coalition deems it appropriate to involve additional members; most typically, the total number of participants exceeds 16 only when youth or parent focus groups include 6-8 members, reflecting high engagement of these sectors by the coalition.


Data Management and Quality Control


Data quality will continue to be of particular concern on this project because the primary data for the evaluation are self-reported and the DFC coalitions are responsible for identifying and reporting community-level outcome data. Data quality will be improved through the use of vetted survey items and through training and technical assistance in responding to survey items. A major focus of data quality assurance will be the outcome measures, which are community-level statistics obtained from surveys conducted independently of this evaluation. Coalitions are responsible for identifying the appropriate data source, locating the data corresponding to the outcomes and strata requested in this evaluation, and entering them accurately into the data collection instrument. This process may lead to data that fall below the minimum quality standards needed to conduct an unbiased evaluation. Deficiencies may be indicated by, among other things, (1) significant amounts of missing or invalid data, (2) evidence of inaccurate data, and (3) the use of unreliable methods by coalitions for collecting outcome measures.


Potentially inaccurate data will be identified using a number of quality checks (several are illustrated in the sketch following this list), including:


  • Identical responses across multiple categories within the same stratum and year.

  • Identical responses in large numbers of cells over time.

  • Discrepancies in sample sizes or estimated proportions across different levels of aggregation (e.g., the total number of respondents summed across grade levels does not equal the total summed across gender, or the number of respondents changes radically from year to year).

  • Formal, statistically based outlier analysis of reported outcomes.

  • Criteria indicative of potentially invalid or inaccurate responses, such as the reporting of 100% or 0% of youth for a particular outcome.
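
A minimal sketch of how several of these checks could be automated is shown below, assuming a flat table of coalition-reported outcomes. The column names (coalition_id, year, pct_use) are illustrative assumptions, not the actual DFC Me schema, and the team's production checks (described above as SAS and SPSS code) may differ.

    # Minimal sketch (Python/pandas) of automated quality flags; column names
    # are illustrative assumptions, not the actual DFC Me schema.
    import pandas as pd

    def flag_quality_issues(df: pd.DataFrame) -> pd.DataFrame:
        flags = df.copy()
        # Boundary values (0% or 100%) are implausible population estimates.
        flags["boundary_value"] = df["pct_use"].isin([0.0, 100.0])
        # Identical responses across every stratum in a coalition-year.
        flags["identical_strata"] = (
            df.groupby(["coalition_id", "year"])["pct_use"]
              .transform("nunique") == 1
        )
        # Statistical outliers: |z| > 3 relative to all coalitions that year.
        by_year = df.groupby("year")["pct_use"]
        z = (df["pct_use"] - by_year.transform("mean")) / by_year.transform("std")
        flags["outlier"] = z.abs() > 3
        return flags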


Coalitions are asked to report outcome measures but are not mandated as to how they obtain the requisite information; each coalition may choose a different survey technique and its own local sampling strategy. Therefore, some coalitions may rely on techniques that are known to be biased. As part of the information collected from coalitions, data on the instrument used for collecting outcome measures will continue to be requested. For example, coalitions are asked to indicate the source of their outcome data (state survey, established community survey, or custom survey) and whether they perceive the data to be representative of the community in which their work is conducted.


Outcome measures drawn from an established state or community survey are more likely to yield scientifically valid and representative results for the community. Outcome measures collected using other methods (e.g., custom surveys) are more likely to be biased or non-representative, and additional information will continue to be sought from coalitions that report using these methods in order to evaluate the validity of the reported outcomes. If grant recipients indicate the use of a custom survey, they must have the survey reviewed by the evaluation team and approved by their GPO. In addition, technical assistance will be made available to grant recipients to help them select valid instruments for collecting outcome measures.


Another possible challenge to data quality is the sampling technique used by grant recipients when administering their surveys. Because the results of these surveys form the core findings for the DFC National Evaluation, additional steps are planned to ensure the validity of the sampling process and, by extension, the validity of the evaluation results. Through proactive technical assistance to grant recipients, the National Evaluation team will provide instructions on how to sample students for outcome surveys, emphasizing the importance of obtaining a representative sample. It is also critical that the sampling frame remain consistent over time so that changes can be measured accurately.
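
As one illustration of the kind of sampling guidance that could be offered, the sketch below draws a proportional stratified random sample from a student roster so that each grade is represented. The roster fields and the fixed seed are assumptions for the example, not part of the evaluation's actual instructions.

    # Minimal sketch (Python): proportional stratified sampling of students
    # by grade. Field names and the fixed seed are illustrative assumptions.
    import random

    def stratified_sample(roster, strata_key, fraction, seed=2019):
        """Sample the same fraction of students from each stratum."""
        rng = random.Random(seed)  # fixed seed keeps the frame reproducible
        strata = {}
        for student in roster:
            strata.setdefault(student[strata_key], []).append(student)
        sample = []
        for members in strata.values():
            k = max(1, round(fraction * len(members)))
            sample.extend(rng.sample(members, k))
        return sample

    roster = [{"id": i, "grade": 8 + i % 3} for i in range(600)]
    surveyed = stratified_sample(roster, "grade", fraction=0.25)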


Data quality will also be enhanced through two concurrent strategies that are currently employed by the evaluation team: 1) continuation of current data cleaning procedures, and 2) provision of technical assistance to grant recipients.


The data entered by grant recipients are cleaned at multiple points:


  • Data are reviewed by the GPO and, once approved, are cleared for release to the National Evaluation team.

  • A more in-depth, two-step cleaning process is conducted by the National Evaluation team.

  1. Raw data are cleaned and processed using SAS and SPSS code, then appended to the existing raw databases. Most of these procedures involve logic checks within a given database.

  2. The raw data are then processed to develop a set of analysis databases, which are used for all analyses. Data cleaning at this step mainly involves logic checks both within and across databases. Particular attention is paid to core measures data to identify reported changes that are extreme outliers and may reflect incorrectly entered data.

  • A final round of data cleaning is conducted within the analysis programs. For example, before data are analyzed, duplicate records are removed (duplicates are created when grant recipients update records from previous reporting periods); a sketch of this step follows the list.
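
The duplicate-removal step might look like the following sketch, which keeps only the most recent submission of each record. The column names are assumptions for illustration, not the DFC Me schema.

    # Minimal sketch (Python/pandas): when a grant recipient updates a record
    # from a previous reporting period, keep only the latest submission.
    # Column names are illustrative assumptions, not the DFC Me schema.
    import pandas as pd

    def drop_superseded_records(df: pd.DataFrame) -> pd.DataFrame:
        return (
            df.sort_values("submitted_on")  # oldest first
              .drop_duplicates(
                  subset=["coalition_id", "reporting_period", "measure"],
                  keep="last",  # retain the most recent update
              )
        )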

Ensuring that data are of the highest quality possible increases confidence in the findings. Standard procedures will also be employed by the DFC National Evaluation to ensure all data (qualitative and quantitative) from the case studies are entered, cleaned, and checked for problems before the analysis moves forward. Given that DFC is implemented through the Executive Office of the President and is attended to closely by members of Congress, the current evaluation is expected to be subject to scrutiny; having confidence in the results is therefore of the utmost priority.


Much of the data central to this evaluation are collected or provided by grant recipient organizations or by individual respondents within those organizations. As noted above, the National Evaluation team will conduct extensive data quality and missing-data bias analyses on existing data, identify major challenges in past data collection, and develop responses and procedures to help ameliorate those challenges.


If data quality issues are identified, they will be reported to the GPO. Where possible, a coalition's report can be placed into a status referred to as Needs Revision After Approval so the correction can be completed. If issues cannot be resolved by the GPO in conjunction with the coalition, data from that coalition may be excluded from the statistical analyses, which will reduce the effective sample sizes and the resulting statistical power of the evaluation's hypothesis tests.
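
To make the power consequence of such exclusions concrete, the sketch below approximates the power of a two-sided two-proportion z-test as the per-group sample size shrinks. The rates and sample sizes are hypothetical and not taken from the evaluation.

    # Minimal sketch (Python): approximate power of a two-sided two-proportion
    # z-test, showing how smaller effective samples lower statistical power.
    # All rates and sample sizes are hypothetical.
    import math

    def power_two_prop(p1, p2, n, alpha_z=1.96):
        """Approximate power with n observations per group (alpha = .05)."""
        se = math.sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)
        z = abs(p1 - p2) / se - alpha_z
        return 0.5 * math.erfc(-z / math.sqrt(2))  # standard normal CDF

    for n in (200, 400, 700):
        print(n, round(power_two_prop(0.15, 0.10, n), 2))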


B.3. Methods to Maximize Response Rates and Deal with Non-Response


Given the DFC Terms and Conditions, a 100% response rate among DFC coalitions is expected for the Semi-Annual Progress Reports, core measures, and the CCT. DFC Me provides each coalition with a Report Card summarizing the extent to which it is in compliance, further ensuring that recipients are aware of any reporting requirement that still needs to be addressed.


In addition, a high response rate among DFC coalitions is anticipated based on current use of the system: in August 2018, 708 of 713 FY 2017 DFC recipients (99%) completed the required Progress Report. Moreover, using the progress reporting tool will allow coalitions to generate analytic information to assess their own performance and target efforts to improve sustainability. For example, a critical function of the CCT is to assist coalitions (and the National Evaluation team) in identifying areas where additional technical assistance and training may be required to further improve coalition performance.


Non-respondents will be contacted by their GPO and required to complete either the semi-annual report or the CCT in accordance with the Terms and Conditions provided with the Notice of Award. Failure to complete the semi-annual report, the biannual data reporting, and/or the annual CCT will result in further follow-up by the GPO, and a corrective action plan will be implemented for the grant recipient to input the necessary data by a specified deadline. Failure to comply with the corrective action plan will result in execution of the DFC Program's established progressive disciplinary action plan, which ranges from suspension to termination of a grant. Incentives in the form of ad-hoc analyses/summaries of the collected information will be provided to participating coalitions.



B.4. Test of Procedures or Methods to be Undertaken


The revisions will be shared with grant recipients during technical assistance webinars. Any lessons learned from this testing will be incorporated, as appropriate, before the revisions are final, and any revisions will be communicated to OMB as required.


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Barbara O’Donnel, Ph.D.

Principal

ICF International

9300 Lee Highway

Fairfax, VA 22031

[email protected]


Jeremy Goldbach, Ph.D.

Co-Principal Investigator

[email protected]


James Demery, Ph.D.

Technical Specialist

[email protected]


Kelle Falls, M.A.

Technical Specialist

[email protected]


Erica McCoy, M.P.A.

Associate

[email protected]


Caitlin McLaughlin, M.A.

Principal

[email protected]


Jason Schoeneberger, Ph.D.

Senior Technical Specialist

[email protected]


Maria Asencio, M.S.

Senior Associate

[email protected]


Ted Coogan

Senior Project Manager

[email protected]

