Supporting Statement B DFC National Evaluation


Drug Free Communities Support Program National Evaluation

OMB: 3201-0012


Office of Management and Budget



Clearance Package Supporting Statement B: Collections of Information Employing Statistical Methods



Drug Free Communities Support Program National Evaluation









Supported by:


Executive Office of the President

Office of Administration

Office of National Drug Control Policy

750 17th Street, NW

Washington, DC 20006-4607




Government Project Officer:

Shannon Weatherly

(202) 395-6774 (phone)

(202) 395-6711 (fax)













August 2011

Table of Contents




B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1. Respondent Universe and Sampling


B.2. Procedures for the Collection of Information


B.3. Methods to Maximize Response Rates and Deal with Non-Response


B.4. Test of Procedures or Methods to be Undertaken


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data





B. Collections of Information Employing Statistical Methods


B.1. Respondent Universe and Sampling


DFC grantees. The respondent universe consists of all DFC coalitions receiving funding in FY2010 (N=718) and any additional grantees funded throughout the evaluation period. Data will be collected to estimate substance use prevalence rates among youth in DFC communities and compare trends in these rates with trends in matched non-DFC communities. Data for non-DFC communities will be abstracted from national- or state-level surveys (i.e., without the need for additional primary data collection or sampling). Therefore, no sampling procedures are being used for the evaluation. Instead, all DFC grantees will continue to submit required data through COMET.


To minimize burden, yet provide the detailed qualitative data needed to inform the analysis, interpretation, and use of the full set of COMET and CCT data, site visits will be conducted with a limited sample of DFC grantees. The site visits will strengthen the study by (a) providing exploratory insight for generating hypotheses to be tested in the full data; (b) elaborating understanding of how findings from the analysis of COMET and CCT data are reflected in specific programs; and (c) providing specific cases and examples of the application of evidence-informed practices by coalitions. The sample is selected through application of the following criteria.


  • Effectiveness. Because a primary purpose of the site visits is to explore in depth the coalition structures, processes, and strategies associated with effectiveness, and to provide examples of these coalition characteristics, sample coalitions will be selected because they have demonstrated effectiveness in one or more of the following ways: (a) shown through analysis of COMET and CCT data to have achieved consistent, positive core measure outcomes; (b) recognized as achieving community or capacity-building successes through data, local evaluation, or recognition by established professional organizations (e.g., CADCA); and/or (c) recognized as using innovative practices.

  • Relevance to Evaluation Research Focus. Each year, consideration for site selection will include the relevance of a coalition to one or more research questions that are a priority for the National Evaluation. For example, the evaluation may be interested in generating hypotheses for further exploration, and/or promising practices related to the application of specific environmental strategies. Or, as another example, the evaluation team may set a priority for coalitions with very active collaboration teams, or those that focus on high school binge drinking. In these instances, relevance to these priorities may be part of the site selection criteria.

  • Variation in Setting or Other Criteria. One advantage of in-depth field research is the ability to closely examine the interaction of contextual factors (e.g., community characteristics, target populations), coalition processes (e.g., membership, decision style, strategies), and specific outcomes. To gain greater understanding of how a particular evaluation emphasis (e.g., use of specific environmental strategies) interacts with other factors, variation in these factors may be a criterion for selection.


In summary, the selection of nine coalitions for site visits each year will be systematic in the consideration of multiple, clearly articulated criteria. However, it will not be dictated by a specific cut point, and the mix and weighting of criteria may vary according to information needs in a particular year.


Annual site visits will be conducted to each case study location to conduct interviews and administer the Social Network Survey as described previously. All coalition chairs will be interviewed and will complete the survey. Additionally, the DFC staff member identified as having primary oversight of the grant will be interviewed. Finally, a sample of 12 coalition members from each site will be selected to participate in interviews and to complete the survey. The selection of respondents will be conducted in cooperation with the coalition, but the ultimate identification of respondents will be made by the evaluation team. For example, in cases in which there are multiple active organizations in a given sector, we may ask our coalition liaison to provide a list of active participants most knowledgeable about the involvement of organizations in that sector, and make a selection from that list. The objective will be to include at least one interviewee from each sector with active members in the coalition. If a sector has more than one active member, the member with the most experience on the coalition will be selected to participate.


B.2. Procedures for the Collection of Information


The formal process of data collection began in February 2005 with a letter from ONDCP to all DFC coalition directors providing information about the National Evaluation. Specifically, this letter introduced COMET (referred to as PMMS at the time this communication was made); provided a password and instructions to access the system; and outlined a timetable for submission of data. After receiving the initial letter, coalitions were notified electronically and through the ONDCP Community Prevention listserv of dates to access the system to submit data for the instruments. In all subsequent years, the Terms and Conditions that accompany grantees’ Notice of Awards at the beginning of the grant award years have contained specific information about the reporting requirements. This information is also posted on the Drug Free Communities Program’s website (http://www.WhiteHouseDrugPolicy.gov/dfc). Once the requested revisions to COMET and the CCT have been approved by OMB, DFC grantees will be informed by ONDCP and SAMHSA of the changes. A training will be prepared jointly by ICF and KIT Solutions explaining the changes to the data collection, and all user guides and tools available from KIT Solutions will be updated, as necessary, to reflect the revisions to COMET and the CCT. Grantees will be encouraged to access technical assistance through these venues.



Once case study sites have been identified by the National Evaluation team through the analysis of COMET and CCT data and in consultation with ONDCP, a member of the National Evaluation team will contact the DFC Grantee Program Director to inform the director of the selection and extend an invitation to participate in the case study. If the grantee accepts the invitation, a site visit will be planned and conducted. If the grantee declines to participate, an alternate grantee will be identified and contacted. Collaborative pre-visit planning with the site is essential to efficient completion of site visit objectives. Planning with sites will begin approximately six weeks prior to a projected visit date. A site visit team member will serve as liaison to the site, and will work with a site representative to (a) gain information on potential coalition and sector key informants and select respondents; (b) make logistical arrangements; and (c) set an agenda and time schedule for the visit.


Data Management and Quality Control


Data quality will continue to be of particular concern on this project because the primary data for the evaluation are self-reported, and the DFC coalitions are responsible for identifying and reporting community-level outcome data. Data quality will be improved through the use of vetted survey items and training and technical assistance in responding to survey items.

A major focus of data quality assurance will be on the outcome measures. The outcome measures will be community-level statistics obtained from surveys that are conducted independently of this evaluation. Coalitions will be responsible for identifying the appropriate data source, locating the data corresponding to the outcomes and strata requested in this evaluation, and entering them accurately into the data collection instrument. This process may potentially lead to data that are below the minimum data quality standards needed to conduct an unbiased evaluation. Deficiencies may be indicated by, among other things, (1) significant amounts of missing or invalid data, (2) evidence of inaccurate data, and (3) the use of unreliable methods by coalitions for collecting outcome measures.


Evidence of potentially inaccurate data will be identified using a number of quality checks, including:


  • Identical responses across multiple categories within the same strata and year.

  • Identical responses in large numbers of cells over time.

  • Discrepancies in sample sizes or estimated proportions across different levels of aggregation (e.g., the total number of respondents summed across grade levels does not equal the total number of respondents summed across gender; or the number of respondents changes radically from year to year).

  • Performing a formal outlier analysis for reported outcomes.

  • Establishing criteria that may be indicative of potentially invalid or inaccurate responses, such as the reporting of 100% or 0% of youth for a particular outcome.
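Several of the checks above can be expressed as simple validation rules. The following is an illustrative sketch only, not the evaluation team's actual code; the record layout and column names (strata keyed by grade and year, percentage values) are hypothetical.

```python
# Hypothetical record layout: outcome percentages keyed by (stratum, year).

def flag_identical_across_categories(records, strata, year):
    """Flag when a coalition reports the same value for every category
    within one stratum and year -- a possible sign of copied data."""
    values = [records[(s, year)] for s in strata if (s, year) in records]
    return len(values) > 1 and len(set(values)) == 1

def flag_boundary_values(records):
    """Flag reported prevalence of exactly 0% or 100%, which is rarely
    plausible for community-level youth substance use outcomes."""
    return [key for key, pct in records.items() if pct in (0.0, 100.0)]

def flag_aggregation_mismatch(by_grade, by_gender):
    """Flag when respondent counts summed across grade levels disagree
    with counts summed across gender for the same survey."""
    return sum(by_grade.values()) != sum(by_gender.values())

# Example: one coalition's hypothetical past-30-day use rates by grade.
reported = {("grade8", 2010): 12.5, ("grade10", 2010): 12.5,
            ("grade12", 2010): 12.5, ("grade8", 2009): 0.0}

print(flag_identical_across_categories(
    reported, ["grade8", "grade10", "grade12"], 2010))  # True
print(flag_boundary_values(reported))  # [("grade8", 2009)]
```

Flagged records would not be dropped automatically; as described below, they would trigger follow-up with the coalition before any exclusion decision.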


Coalitions are asked to report outcome measures but are not directed in how they obtain the requisite information. That is, each coalition may choose to employ a different survey technique to obtain this information, so some coalitions may rely on techniques that are known to be biased. As part of the information collected from coalitions, data on the instrument used for collecting outcome measures will continue to be requested. For example, coalitions are asked to indicate the source of their outcome data (e.g., a state survey, an established community survey, or a custom survey).


Outcome measures using an established state or community survey are more likely to yield scientifically valid and representative results for the community. Outcome measures collected using other methods (e.g., use of custom surveys) are more likely to be biased or non-representative, and additional information will continue to be sought from coalitions that report using these methods to evaluate the validity of the reported outcomes. If grantees indicate the use of a custom survey, they must have the survey reviewed by the evaluation team and approved by their Government Project Officer. In addition, technical assistance will be made available to grantees to help them select valid instruments for collecting outcome measures.


Another possible challenge to data quality is the sampling technique used by grantees when administering their surveys. Since the results of these surveys form the core findings for the DFC National Evaluation, additional steps are planned to ensure the validity of the sampling process, and by extension, the validity of the evaluation results. Through proactive technical assistance to grantees, the National Evaluation team will provide detailed instructions on how to sample students for outcome surveys (an “Ask an Evaluator” webinar has already been delivered to DFC grantees to address general issues related to sampling). The team will work with in-house sampling statisticians and vet any plans with ONDCP before they are sent out to grantees. The importance of obtaining a representative sample will be emphasized. Moreover, it is critical that the sampling frame remain consistent so changes across time can be accurately measured.
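The kind of sampling guidance described above can be sketched as follows. This is a hedged illustration, not the actual instructions grantees receive; the roster, sample size, and seed are hypothetical.

```python
import random

def sample_students(roster, n, seed=2011):
    """Draw a simple random sample of size n from a fixed sampling frame.
    Seeding the generator makes the draw reproducible, and keeping the
    same frame definition each year supports consistent trend measurement."""
    if n > len(roster):
        raise ValueError("sample size exceeds frame size")
    rng = random.Random(seed)
    return sorted(rng.sample(roster, n))

# Hypothetical roster of 500 enrolled students.
frame = [f"student_{i:04d}" for i in range(1, 501)]
selected = sample_students(frame, 50)
print(len(selected))  # 50
```

Because the draw is seeded, re-running the selection against the same frame reproduces the same sample, which makes the sampling process auditable.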


Data quality will also be enhanced through two concurrent strategies that are currently employed by the evaluation team: (1) continuation of current data cleaning procedures, and (2) provision of technical assistance to grantees.


Currently, the data entered by DFC grantees are cleaned at multiple points:


  • A general cleaning process is conducted by KIT Solutions once data are entered into COMET. Cleaning procedures on the data include range checks and other standard techniques to ensure the quality of the data.

  • Data are reviewed by SAMHSA Government Project Officers and once approved by SAMHSA, data are cleared for release to the National Evaluation team.

  • A more in-depth cleaning process is conducted by the National Evaluation team. This cleaning process takes place in two steps:

  1. Raw data are cleaned and processed using structured query language (SQL) code, then appended to existing raw databases. Most of the procedures involve logic checks within given databases. ICF has completed an initial review of this code, which was developed by the previous contractor, and the cleaning decisions appear to be in line with standard practice. A second round of review will be conducted in the second year of the contract to document all cleaning decisions and to ensure that the cleaning process is transparent to ONDCP.

  2. The raw data are processed to develop a set of analysis databases, which are used for all analyses. Data cleaning procedures conducted at this step mainly involve logic checks both within and across databases.

  • A final round of data cleaning is conducted within the analysis programs. For example, before data are analyzed, duplicate records are removed (duplicates are created when grantees update records from previous reporting periods), and records missing critical information (e.g., outcome sample sizes, data collection dates) are removed.
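The duplicate-removal and missing-data steps in the final cleaning round can be sketched as below. This is a minimal illustration under assumed field names (grantee_id, period, updated, sample_size), not the National Evaluation team's actual SQL or analysis code.

```python
def clean_records(records):
    """Keep only the most recently updated record per (grantee_id, period),
    then drop records missing critical fields such as sample_size."""
    latest = {}
    for rec in records:
        key = (rec["grantee_id"], rec["period"])
        # Duplicates arise when grantees update records from prior periods;
        # the record with the highest update marker wins.
        if key not in latest or rec["updated"] > latest[key]["updated"]:
            latest[key] = rec
    return [r for r in latest.values() if r.get("sample_size") is not None]

# Hypothetical raw records: one revision and one record missing its sample size.
raw = [
    {"grantee_id": "A1", "period": "2010-1", "updated": 1, "sample_size": 200},
    {"grantee_id": "A1", "period": "2010-1", "updated": 2, "sample_size": 210},
    {"grantee_id": "B2", "period": "2010-1", "updated": 1, "sample_size": None},
]
print(clean_records(raw))  # one surviving record: A1's revised report
```

The same pattern generalizes to the other critical fields named above (e.g., data collection dates): any record lacking them is excluded before analysis.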

Ensuring that data are of the highest quality possible increases confidence in the findings. Standard procedures will also be employed by the DFC National Evaluation to ensure all data (qualitative and quantitative) from the case studies are entered, cleaned, and checked for problems before moving forward with the analysis. Given that DFC is implemented through the Executive Office of the President, and is attended to closely by members of Congress, it is expected that the current evaluation will be subject to a high level of scrutiny. Having confidence in the results is, therefore, of the utmost priority.


The evaluation team will also follow careful procedures to ensure the quality and usefulness of site visit information. Qualitative data quality and utility can be compromised, particularly in multi-site contexts, by (a) lack of preparation; (b) inadequate documentation in the field; (c) excessive variation in format or documentation across sites; (d) inadequate organization of field data to support analysis steps such as thematic coding; or (e) inadequate conceptual framing of data collection prior to field entry. These issues will be addressed in the following ways:


  • All members of site visit teams will be thoroughly trained in the purposes of the visits; interview techniques, and particularly in ensuring comprehensive responses through probes; focus group techniques, and particularly in techniques for encouraging full participation and interaction between members; and in team completion of site summary protocols upon completion of the site visit.

  • Before conducting the site visit, team members will be requested to pre-fill certain data already available through the COMET system. In preparation for the visits, they will be provided site documentation, including summaries of COMET and CCT data; proposals for funding; and relevant documentation of coalition structure and process, intervention strategies and implementation, and outcome results.

  • Clear statements of the reasons sites were selected, and the implications for data collection emphases will be provided to each member of the site visit team.

  • Teams will participate in developing a site summary protocol that will provide a structured template for summarizing results of interviews, focus groups and other site data in response to an organized set of topics and questions. Summaries will share a basic structure designed to capture the common components of the interview and focus protocols across sites, but will have emphases appropriate to each site. Protocols will provide an organized format for entering exemplary statements and examples. These protocols are a critical internal product of the visits that will support the use of cross-visit information to generate hypotheses or to elaborate evaluation findings.

These careful and systematic steps in data collection are designed to maximize the accuracy, relevance and utility of qualitative site visit information.

If data quality issues are identified, they will be immediately reported to the Government Project Officer. If the issues cannot be resolved by the Government Project Officer, in conjunction with the coalition, data from that coalition may be excluded from the statistical analyses, which will reduce the effective sample sizes and the resulting statistical power of hypothesis tests for the evaluation.


B.3. Methods to Maximize Response Rates and Deal with Non-Response


A 100% response rate among the DFC coalitions is expected for both COMET (the Semi-Annual Progress Reporting tool) and the CCT.


A high response rate among DFC coalitions is also anticipated based on current use of the system. Using COMET allows coalitions to generate analytic information to assess their own coalition’s performance and target efforts to improve sustainability. For example, a critical function of the CCT is to assist coalitions (and the National Evaluation team) in identifying areas where additional technical assistance and training may be required to further improve the performance of the coalition.


Non-respondents will be contacted by their Government Project Officers and required to complete either the semi-annual report or the CCT in accordance with the Terms and Conditions provided with the Notice of Award. Failure to complete the semi-annual report, the bi-annual data reporting, and/or the annual CCT will result in further follow-up by the Government Project Officer. A corrective action plan will be implemented for the grantee to input the necessary data by a specified deadline. Failure to comply with the corrective action plan will result in the execution of the DFC Program’s established progressive disciplinary action plan, which ranges from suspension to termination of a grant. Incentives in the form of ad hoc analyses and summaries of the collected information will be provided to participating coalitions.

The Social Network Survey will be administered at the start of each on-site interview to ensure completion and to guide the subsequent interview.



B.4. Test of Procedures or Methods to be Undertaken


The revisions to COMET and the CCT outlined in this justification and in the associated attachments will undergo pilot testing with 9 or fewer grantees. Additionally, the revisions will be shared with the grantees during training and technical assistance calls and through other venues (meetings, Webinars, etc.). Any lessons learned from the testing will be incorporated, as appropriate, before revisions are final. Any revisions will be communicated to OMB as required.


B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Wendy Sisson

Principal

ICF International

9300 Lee Highway

Fairfax, VA 22031

703-934-3000

[email protected]


Allan Porowski

Fellow

ICF International

9300 Lee Highway

Fairfax, VA 22031

703-225-2229

[email protected]


Kaz Uewaka, Ph.D.

Technical Specialist

ICF International

9300 Lee Highway

Fairfax, VA 22031

703-934-3000

[email protected]


Dan Cantillon, Ph.D.

Manager

ICF International

9300 Lee Highway

Fairfax, VA 22031

703-934-3000

[email protected]






Felix Fernandez, Ph.D.

Senior Associate

ICF International

9300 Lee Highway

Fairfax, VA 22031

703-934-3000

[email protected]


Sarah Decker

Senior Associate

ICF International

9300 Lee Highway

Fairfax, VA 22031

703-934-3000

[email protected]


Erin Williamson

Senior Associate

ICF International

9300 Lee Highway

Fairfax, VA 22031


703-934-3000

[email protected]


Ed Briggs

Principal

ICF International

11420 Rockville Pike

Rockville, MD 20852

240-747-4910

[email protected]








Glen Doss

Senior Manager

ICF International

11420 Rockville Pike

Rockville, MD 20852

240-747-4862

[email protected]


Michael Rankin

Technical Specialist

ICF International

11420 Rockville Pike

Rockville, MD 20852

240-747-4863

[email protected]



File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Joanne and Shukri Abed
File Created: 2021-01-24
