

Phase Five of the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program

Supporting Statement



A. JUSTIFICATION




1. CIRCUMSTANCES OF INFORMATION COLLECTION



The Substance Abuse and Mental Health Services Administration (SAMHSA), Center for Mental Health Services (CMHS), is requesting OMB approval for the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program: Phase V (OMB No. 0930-0280), a revision. Current data collection is approved under OMB No. 0930-0280 until October 31, 2009. The current request builds on experience gained during Phases I, II, III, IV, and V of the evaluation and enhances the design, data collection procedures, and instruments.


a. Background



The understanding of child and adolescent mental health disorders has improved significantly during the last two decades. As a result, the field is in a much better position today to estimate the extent to which mental health disorders occur in the population of children and adolescents at large, although it is likely that many children and youth in need go undetected. The President’s New Freedom Commission on Mental Health ([PNFC], 2003) and the Surgeon General’s Report on Mental Health (U.S. Department of Health and Human Services [DHHS], 1999) provide overviews of the literature on the prevalence of mental health disorders among children and adolescents and address the significant discrepancy between population need and utilization of services. Approximately 20 percent of all children and adolescents will qualify for a DSM-IV (American Psychiatric Association [APA], 1994) mental health diagnosis during the course of a year, and 5–9 percent of children with mental health diagnoses will meet the criteria for serious emotional disturbance (U.S. Public Health Service Office of the Surgeon General [USPH], 2001; Farmer, Mustillo, Burns, & Costello, 2003). More refined estimates of the number of children who have serious emotional disturbance range from 4.5 to 6.3 million children (Friedman, Katz-Leavy, Manderscheid, & Sondheimer, 1999); this number would be closer to 8 million now given population estimates for 2007 of over 80 million for children and youth under age 19 (U.S. Census Bureau, 2007). Consistent with these findings are the results of the 2005 National Survey on Drug Use and Health, which indicate that 21.8% (5.5 million) of youth age 12–17 seek treatment or counseling for emotional or behavioral problems each year (Kessler et al., 2005). However the problem is quantified, it is clear that a substantial subset of our nation’s children and youth, and their families, grapple with significant mental health problems.


Children and adolescents with serious emotional disturbance face challenges in many aspects of their daily lives. They are at greater risk for substance abuse disorders, and youth with less severe emotional disturbance are vulnerable to increased emotional problems as a result of substance use (CMHS, 2002, 2003a, 2003b, 2004, 2005, 2006, 2007a; Holden, 2003; Holden et al., 2003; Liao, Manteuffel, Paulic, & Sondheimer, 2001; Riehman, Schurig, & Stephens, 2008; Riehman, Stephens, & Tucker, 2008; Substance Abuse and Mental Health Services Administration [SAMHSA], 2002). Youth with serious emotional disturbance have greater risk for negative encounters with the juvenile justice system and have a high rate of criminal involvement when compared to all students with disabilities (Center for Mental Health Services [CMHS], 2002, 2003a, 2003b, 2004, 2005, 2006, 2007a; Davis & Vander Stoep, 1997). Youth within the juvenile justice system display an exceptionally high rate of mental health and substance abuse disorders (Heffron, Pumariega, Fallon, & Carter, 2003). Students with emotional disturbance fail more courses, earn lower grade point averages, miss more days of school, are retained in grade more often than students with other disabilities, and have high dropout rates (Epstein, Nelson, Trout, & Mooney, 2005; U.S. Department of Education [DOE], 2001). Research supports assertions that people with mental illness during childhood have higher use of health care services in adulthood than other adults (Knapp, McCrone, Fombonne, Beecham, & Wostear, 2002), and may have poor employment opportunities and experience periods of poverty in adulthood (National Advisory Mental Health Council Workgroup on Child and Adolescent Mental Health Intervention and Deployment, 2001). There is also the increased risk that youth with mental illness will not reach adulthood, as these youth are more likely to commit suicide than youth without mental illness. Suicide is the fourth leading cause of death among youth age 10–14, and the third leading cause of death among those age 15–24 (Centers for Disease Control and Prevention [CDC], 2001). Many of these suicide victims have undiagnosed or untreated mental illness (Institute of Medicine [IOM], 2002).


Advances in the knowledge base over the last decade have served to illuminate continuing challenges in delivering services and meeting needs for this population, and have thrust the issue of children’s mental health into the public spotlight. Despite these advances, service capacity has not kept pace with need (Friedman, 2002; Stroul, Pires, & Armstrong, 2001); it is estimated that only 1 in 5 children with serious emotional disturbance receive the specialty services they need (Burns et al., 1995; DHHS, 1999; Shaffer et al., 1996), and youth with co-occurring mental health and substance abuse disorders rarely receive appropriate and timely services (Federation of Families for Children’s Mental Health and Keys for Networking, Inc., 2001). Unfortunately, the prevalence and accompanying impairment associated with serious emotional disturbance are only likely to grow in the future.


Despite increased efforts to enhance access to services and improve service systems, children and youth with serious emotional disturbance are underidentified, and most children in need do not receive mental health services (DHHS, 1999). Furthermore, within this population, economic, demographic, and geographic factors disproportionately affect identification, placement, and completion of services (Burns & Hoagwood, 2002; Coutinho & Denny, 1996; PNFC, 2003). According to the President’s New Freedom Commission on Mental Health (2003), impoverished families, families from minority racial or ethnic backgrounds, and families living in rural areas confront barriers to accessing services, receiving quality care, and achieving positive outcomes. Serving the needs of persons of diverse backgrounds requires culturally and linguistically competent providers, culturally competent treatments and practices, and cultural adaptations to provide efficacious and effective services (Whaley & Davis, 2007). This underscores the need to develop effective community-based care that is sensitive to and structured for the diverse cultures in individual communities (Hernandez & Isaacs, 1998; Isaacs-Shockley, Cross, Bazron, Dennis, & Benjamin, 1996; PNFC, 2003) and for impoverished families, and that is available in even the most geographically remote communities in the country (PNFC, 2003).


There has been much debate about the best method to serve these children and their families. In 1969, the Joint Commission on the Mental Health of Children published a landmark study showing that these children were typically unserved or were served inappropriately in excessively restrictive settings (National Institute of Mental Health [NIMH], 1969). The Commission’s findings were later substantiated by numerous other studies, task forces, commissions, and reports. These studies concurred that community-based, family-driven, coordinated systems of care providing a range of services are necessary to effectively serve these children and their families.


In 1984, in response to these findings, NIMH initiated the Child and Adolescent Service System Program (CASSP). Later administered by CMHS within SAMHSA, CASSP provided funds to promote the development of comprehensive and integrated service delivery systems for children with serious emotional disturbance through a system of care approach. The 1999 Surgeon General’s Report on Mental Health documents the progress that has been made and the resources devoted to transforming the nature of service delivery for children with serious emotional disturbances and their families. In 2003, the President’s New Freedom Commission on Mental Health advocated for mental health care to be provided in communities, with treatments integrated across agencies and designed to meet the needs of individuals and their families. The report calls for research focused on outcomes—determining the treatments that promote quality care and recovery, and finding the most effective way to disseminate information about these practices. This objective includes investigating emerging best practices, such as wraparound services and systems of care for children with serious emotional disturbances and their families. Research should occur at all levels, with findings made available at the community level. A better understanding of this question of effectiveness is especially important in an era of managed care, accountability, and constrained Federal and State spending on mental health services. The 2005 Institute of Medicine report, “Improving the Quality of Health Care for Mental Health and Substance-Use Conditions,” states that to address mental health and substance-use conditions, communities need an infrastructure to produce and disseminate scientific evidence of effective treatments, as well as funds to conduct studies that are directly related to clinical practice and policy.


The system of care program theory model proposes a comprehensive spectrum of mental health and other necessary services that are organized into a coordinated network to meet the multiple and changing needs of children and adolescents with serious emotional disturbance. In this model, agencies in various child-serving sectors, such as education, juvenile justice, mental health, and child welfare, work together to provide the wide array of services needed by children with serious emotional disturbance and their families. Built upon the CASSP philosophy that calls for services to be child-centered, family-driven, community-based, and culturally competent, the model emphasizes the need to: (1) broaden the range of nonresidential community-based services, (2) strengthen case planning across child-serving sectors, and (3) increase case management capacity to ensure that services work together across sectors and providers.


In spite of the progress made through CASSP efforts to develop an infrastructure for systems of care, a deficit of appropriate, less restrictive treatment services remains. Studies indicate rising costs of residential services and increasing rates of child placement in residential facilities and in out-of-home care. These findings are reasons for continued concern that children are served in overly restrictive settings.


b. The Comprehensive Community Mental Health Services for Children and Their Families Program



While the system of care model has provided a conceptual framework to meet the needs of children and youth with serious emotional disturbance, funding to provide services at the local level has been either sporadic or missing. In 1992, the Federal Government addressed this gap with the passage of the Children’s and Communities’ Mental Health Services Improvement Act, part of the Alcohol, Drug Abuse and Mental Health Administration Reorganization Act (Public Law 102–321, Section 520). The Act was amended in 2000 to change the term of funding from five to six fiscal years (Public Law 106–310, Section 3105(c)). The resulting Comprehensive Community Mental Health Services for Children and Their Families Program (CMHI) provides support through grants and cooperative agreements to States, political subdivisions within States, the District of Columbia, and territories to develop integrated home and community-based systems and supports for children and youth with serious emotional disturbances and their families. This funding encourages communities to develop and expand systems of care. The CMHI is the largest Federal commitment to children’s mental health to date. The program is fully described in the grant Guidance for Applicants (see Attachment 1, Guidance for Applicants 2005 No. SM–05–010).


The goals of the CMHI program are to:


  • Expand community capacity to serve children and adolescents with serious emotional disturbances and their families;

  • Provide a broad array of effective services, treatments, and supports;

  • Create a case management team with an individualized service plan for each child;

  • Incorporate culturally and linguistically competent practices for serving all children, youth, and their families, and eliminate disparities related to race, ethnicity, or geographic location; and

  • Promote full participation of families and youth in service planning and in the development of local services and supports.

The goals of the CMHI program are harmonious with those outlined in the New Freedom Commission’s report Achieving the Promise: Transforming Mental Health Care in America (2003). Systems of care work to promote recovery and reduce stigma through the provision of youth-guided and family-driven care that is culturally and linguistically responsive. Services are informed by research, and evidence-based practices are used to treat children and youth, including those with co-occurring disorders. Finally, Federal, State, and local partnerships are encouraged across child- and youth-serving systems.


c. The Need for Evaluation


Section 565(c) of the Public Health Service Act mandates annual evaluation activities. A basic requirement is documentation of the characteristics of the children, youth, and families served by the system of care initiative, the type and amount of services they receive, and the cost to serve them. Equally important is the need to assess whether the program was implemented and services experienced as intended. It is also critical to assess whether the children and youth served by the program experience improvement in clinical and functional outcomes, whether family life is improved, and whether improvements endure over time. Finally, policymakers and service providers need to know whether those outcomes can be reasonably attributed to the system of care initiative.


A government contractor (referred to as the National Evaluation Team throughout this document) coordinates data collection for the national evaluation and provides training and technical assistance to facilitate the collection of data by local-level evaluators. In turn, each grant community is required by the cooperative agreement to hire a minimum of two evaluation staff (or their full-time equivalents) to ensure that data collection is systematic and can be sustained through the funding period. In this partnership between the National Evaluation Team and local evaluators, the National Evaluation Team provides training and technical assistance regarding data collection and research design. In addition, the National Evaluation Team receives data from all grant communities, monitors data quality, and provides feedback to grant communities. The grant communities help shape data collection procedures and provide feedback to the National Evaluation Team regarding successful approaches. This evaluation will primarily prepare data analyses for the national assessment of the program, but in doing so will make grant community-specific data available to the grant communities to help meet their local evaluation needs.


d. Clearance Request



This submission requests continuation of the original OMB clearance approval through evaluation year 6 for both 2005- and 2006-funded communities in Phase V of the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program.


The national evaluation is driven by the CMHI program theory model. This program theory asserts that to serve children and youth with serious emotional disturbance, service delivery systems need to offer a wide array of accessible, community-based service options that center on children and youths’ individual needs, include the child, youth, and family in treatment planning and delivery, and are provided in a culturally and linguistically competent manner. An emphasis is placed on serving children and youth in the least restrictive setting that is clinically appropriate. In addition, because many children and youth with serious emotional disturbance use a variety of services and have contact with several child- and youth-serving agencies, service coordination and interagency collaboration are critical. The program theory holds that if services are provided in this manner, outcomes for children, youth, and families will be better than can be achieved in traditional service delivery systems.


To examine the system of care theory, the national evaluation is designed to answer the following overarching questions:


  • Who are the children, youth, and families served by the program and by the funded communities? Does the served population change over time as systems of care mature?

  • How do systems of care develop according to system of care principles (e.g., youth and family involvement, cultural competence, interagency collaboration) over time? In what ways does funding accelerate system development?

  • To what extent do children and youth’s clinical and functional outcomes improve over time? How are family outcomes affected? How are changes in child, youth, family, and system outcomes associated with efforts to implement and develop systems of care?

  • What are the service utilization patterns (specific services, treatments, and supports) for children, youth, and families in systems of care and what are the associated costs? How cost-effective are systems of care over time?

  • To what extent are children, youth, and families’ experiences consistent with the system of care philosophy? How satisfied are children, youth, and families with the services they receive? To what extent are youth and family members involved in systems of care?

  • Are there subgroups of children, youth, and families for whom a system of care is more effective?

  • To what extent are systems of care able to sustain themselves after Federal funding has ended? What factors facilitate or impede sustainability?

  • How is the CQI Initiative being pursued by communities? How do communities use the CQI Progress Reports and associated technical assistance (TA) in their efforts toward CQI? How satisfied are communities with the CQI approach? What are communities’ perceptions of the effectiveness and utility of the CQI Progress Reports and TA provision?

  • What are practitioners’ awareness and knowledge of evidence-based practices (EBPs)? What EBPs and treatments are practitioners using? What are practitioners’ attitudes about the implementation of EBPs? How do practitioners assess their agencies’ readiness for change? What do families know and believe about the quality and effectiveness of the services they receive? How are grant communities implementing EBPs within their systems of care?

  • To what extent and in what ways are communities self-assessing their efforts to develop a culturally and linguistically competent system of care and utilizing their assessment findings to make improvements?

  • How are EBPs adapted to be culturally appropriate? What barriers exist to formulating and implementing appropriate adaptations of EBP models? How are youth and caregivers involved in the development of adaptations of evidence-based practices?


These evaluation questions have evolved over the last 17 years through the development of the CMHI and feedback from system of care personnel and other partners, and they extend the questions mandated by the CMHI authorizing legislation. The legislation requires funded communities to participate in a national evaluation that assesses the number of children and youth served; child, youth, and family characteristics; child, youth, and family outcomes; service utilization patterns; and system characteristics.

The evaluation design for the Phase V communities includes six core studies and three special studies that employ both qualitative and quantitative methods to comprehensively examine the impact of CMHI funding. This evaluation provides the opportunity to advance the assessment of evidence-based treatments within systems of care, and to examine in greater detail specific efforts and goals of the CMHI. Exhibit 1 below presents a summary of components for the Phase V evaluation. Note that the years listed in Exhibit 1 and throughout this supporting statement refer to the evaluation year, not the funding year. Because the cooperative agreements awarded to the communities cover one planning year and five option years, evaluation year 1 is actually contract or funding year 2.




Exhibit 1: Summary of Major Components in Phase V

Note: Years refer to evaluation year. (The exhibit groups the evaluation components at the system, service, and individual levels.)


Table 1 (below) describes forms included for clearance by study.


Table 1. Study Component and Forms for Phase V Re-submission



Study Components and Instruments

System of Care Assessment
  • Overview of System of Care Assessment Framework
  • Letter Templates
  • Informant Table
  • Pre-Visit Documentation
  • System of Care Assessment Interview Protocols
  • Interagency Collaboration Scale

Cross-Sectional Descriptive Study
  • Enrollment and Demographic Information Form
  • Child Information Update Form

Longitudinal Child and Family Outcome Study
  • Living Situations Questionnaire (LSQ)
  • Child Behavior Checklist 6–18 (CBCL 6–18)
  • Child Behavior Checklist 1½–5 (CBCL 1½–5)
  • Caregiver Strain Questionnaire (CGSQ)
  • Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS-2C)
  • Education Questionnaire—Revised (EQ-R)
  • Family Life Questionnaire (FLQ)
  • Delinquency Survey—Revised (DS-R)
  • GAIN-Quick Substance Related Issues (GAIN Quick-R)
  • Substance Use Survey—Revised (SUS-R)
  • Revised Children’s Manifest Anxiety Scales (RCMAS)
  • Reynolds Adolescent Depression Scale—Second Edition (RADS-2)
  • Youth Information Questionnaire (YIQ-I)
  • Youth Information Questionnaire (YIQ-F)
  • Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS-2Y)
  • Columbia Impairment Scale (CIS)
  • Vineland Screener (VS)
  • Caregiver Information Questionnaire (CIQ-I)
  • Caregiver Information Questionnaire (CIQ-F)

Service Experience Study
  • Multi-Sector Service Contacts Questionnaire—Revised (MSSC-R)
  • Evidence-Based Practices Experience Measure (EBPEM)
  • Youth Services Survey for Families (YSS-F)
  • Youth Services Survey (YSS)
  • Cultural Competence and Service Provision Questionnaire (CCSP)

Sustainability Study
  • Sustainability Study Respondent Selection Criteria
  • Sustainability Study Email Scripts
  • Sustainability Study Survey Cover Letter
  • Sustainability Study Survey
  • Sustainability Study Reminder Letters
  • Sustainability Study Web Screens

Services and Costs Study
  • Flex Funds Data Dictionary
  • Services and Costs Data Dictionary

Continuous Quality Improvement (CQI) Initiative Evaluation
  • CQI Initiative Evaluation Letter Templates
  • CQI Initiative Survey
  • CQI Initiative Interview Guide

Evidence-Based Practices Study
  • System-level Implementation Factors Discussion Guide
  • Service-level Implementation Factors Discussion Guide
  • Consumer-level Implementation Factors Discussion Guide

Cultural and Linguistic Competence Study
  • CCIOSAS – Beneficiaries of Self-Assessment Findings Focus Group Guide – Staff and Partners
  • CCIOSAS – Beneficiaries of Self-Assessment Findings Focus Group Guide – Caregivers
  • CCIOSAS – Beneficiaries of Self-Assessment Findings Focus Group Guide – Youth
  • CCIOSAS – Participants in Self-Assessments Focus Group Guide – Staff and Partners
  • CCIOSAS – Participants in Self-Assessments Focus Group Guide – Caregivers
  • CCIOSAS – Participants in Self-Assessments Focus Group Guide – Youth
  • CCIOSAS – Users of Self-Assessment Findings Focus Group Guide – Staff and Partners
  • CCIOSAS – Users of Self-Assessment Findings Focus Group Guide – Caregivers
  • CCIOSAS – Users of Self-Assessment Findings Focus Group Guide – Youth
  • CCIOSAS – Telephone Interview – Staff and Partners
  • CCEBPS – Managers of EBP Process Focus Group Guide
  • CCEBPS – Providers of EBP Focus Group Guide
  • CCEBPS – Family Focus Group Guide
  • CCEBPS – Youth Focus Group Guide
  • CCEBPS – Telephone Interview



2. PURPOSE AND USE OF INFORMATION



This evaluation serves several purposes. It: (1) describes who is being served by the CMHS-funded systems of care; (2) shows how much children, youth, and families’ outcomes are improved by systems of care; (3) shows whether there are observable differences in child, youth, and family outcomes that can be plausibly linked to a faithful implementation of the system of care approach; (4) describes how children, youth, and families experience the service system and how they use services and supports (i.e., utilization patterns); (5) estimates the cost of serving children and youth in systems of care; (6) illustrates the development of systems of care as they move toward offering integrated and comprehensive services; (7) describes characteristics and factors related to sustainability of system of care infrastructure; (8) determines the utility and effectiveness of the initiative among the communities; (9) describes the use of evidence-based practices by mental health services providers within system of care communities and how families perceive this use; (10) examines how evidence-based practices are adapted to be culturally and linguistically competent; (11) supports technical assistance activities to help CMHS best meet program goals; (12) supports CMHS in its efforts to establish standards for measuring its performance and effectiveness as required under the 1993 Government Performance and Results Act (GPRA); and (13) provides data for the National Outcome Measures (NOMs) for mental health programs as currently established by SAMHSA.


The data collected in Phase V are useful to CMHS and its partners, other Federal agencies, the grant communities, individual children, youth, and their families, and the research field. Findings from the Phase I, II, III, and IV evaluations have been used to describe the children, youth, and families served by the funded systems of care, to assess whether the children and youth in the samples have experienced improved outcomes, to measure service experiences and system development, and to request additional funding from local and State agencies to sustain system of care services. In addition to contributing further information on topics covered in prior phases, Phase V continues to add to the knowledge base through the development of a clearer understanding of the service environment needed to implement evidence-based practices, system of care communities’ readiness to sustain themselves and the barriers and facilitators to sustainability, and the ways in which evidence-based practices are adapted to address culturally diverse populations within systems of care. As in previous phases of the evaluation, the design allows for the exploration of the relationships between service use and outcomes and the study of the long-term impact of the program.


System of Care Assessment. This study examines whether programs have been implemented in accordance with system of care program theory and documents how systems develop over time to meet the needs of the children, youth, and families they serve. A particular interest is whether services are delivered in an individualized, family-driven, coordinated manner, and whether the system involves multiple child- and youth-serving agencies. For Phase V, site visits for each system of care community are conducted at 18–24 month intervals in evaluation years 1 through 6.


Information is collected through a combination of document reviews, review of randomly selected case records, semi-structured quantitative and qualitative interviews, observations made on site, follow-up telephone interviews to clarify information, and the administration of selected domains of the Interagency Collaboration Scale (IACS) (Greenbaum et al., 2003). Categories of interview respondents include project directors, core child- and youth-serving agency representatives, family organization representatives, care coordinators, direct service providers, caregivers, youth coordinators and youth served by the system of care. The IACS, which measures collaboration between child- and youth-serving agencies in system of care communities, is administered to project directors, core child- and youth-serving agency representatives, family organization representatives, care coordinators, and direct service providers.


Cross-Sectional Descriptive Study. This study describes child, youth, and family characteristics of all children and youth entering CMHS-funded systems of care. Data are obtained primarily through in-person interviews with caregivers conducted as part of the intake process and through case record reviews. Data for the intake instrument may be directly entered into a Web-based instrument by intake personnel to facilitate capture of basic descriptive characteristics of children and youth served. Data are collected on entry for all children, youth, and families who enter the system of care throughout the program’s funding period. For the children, youth, and families who participate in the Longitudinal Child and Family Outcome Study (see below), additional descriptive information is collected as part of the baseline interview, and the descriptive data elements that may have changed over time (e.g., diagnosis, insurance status) are collected again at follow-up data collection points.


Longitudinal Child and Family Outcome Study. This study, conducted among a sample of children and youth in each community, examines how the system affects child and youth clinical and functional status and family life. Outcome data on child and youth clinical and functional status are used to assess change over time in the following areas: symptomatology, diagnosis, social functioning, substance use, school attendance and performance, delinquency and juvenile justice involvement, and stability of living arrangements. Family life is assessed in the areas of family functioning and caregiver strain. These data are collected at all system of care communities within 30 days of the child’s entry into services and at 6-month intervals for the length of the evaluation. Every effort has been made to reduce the burden on children, youth, and families participating in the study, including offering to conduct the interviews in their homes or at other locations most convenient for them.


Service Experience Study. This study, conducted among the sample of children and youth participating in the Longitudinal Child and Family Outcome Study, investigates the extent to which system of care principles are experienced by families, and considers experiences from the perspectives of youth and caregivers. Data are used to assess intervention fidelity, satisfaction with services, cultural competence, accessibility and coordination of services, perceived helpfulness of services, and the impact of services on the ability of family members to work outside the home. Data collection occurs 6 months after intake and every 6 months thereafter, up to 36 months.

Sustainability Study. Using a Web survey, this study gathers data on system of care characteristics and factors related to sustainability of infrastructure during the life of the award and after the Federal funding cycle is completed. The survey questions cover the following topic areas: (a) availability of specific services in the system of care, (b) mechanisms used to implement system of care principles, (c) factors affecting sustainability (whether each factor has played a role in the development or maintenance of the system of care, and, if so, the extent to which each has affected the system of care), (d) success with objectives for implementing systems of care, (e) strategies for sustaining systems of care, and (f) financial resources contributing to the budget. The Web survey will be conducted with representatives from all grant communities in years 3, 4, 5, and 6 of the evaluation.


Services and Costs Study. This study will describe the types of services used by children, youth, and families in systems of care, their service use patterns, and the costs associated with these services. Of particular interest are the types of services within systems of care communities, the combination of services received, continuity or gaps in care, the length of treatment, and the costs of services. The relationship between service use and outcome indicators will also be explored, as well as cost effectiveness of the systems of care model.


The National Evaluation Team will request data needed for the Services and Costs Study from communities on a continuous basis, beginning in evaluation year 4. Data provided at that time will represent services received from the beginning of service delivery, typically starting in the second year of the community’s grant funding. Information collected by communities will include data on flexible funds expenditures, services received by children/youth, and the costs of those services. Although some communities already maintain some of these data in their existing fiscal (e.g., charge, billing) management information systems (MISs), none of these communities maintain complete data on all system of care services and the costs of those services. The National Evaluation Team will ask these communities to collect data in addition to what they already maintain. The National Evaluation Team will ask communities that do not maintain any data on services and costs to begin collecting data for this study.


The National Evaluation Team will ask communities that have existing MISs to extract and reformat their data to match the common data structure established for the study in the Flex Funds Data Dictionary and the Services and Costs Data Dictionary. Communities that do not have existing data will be asked to key enter services and costs data into two data entry applications provided by the National Evaluation Team. These include the Flex Funds Tool and the Services and Costs Data Entry Application.
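To make the extract-and-reformat step concrete, the minimal Python sketch below maps columns from a hypothetical MIS export to a common record layout. The column names and layout here are invented for illustration; the authoritative field definitions are those in the Flex Funds Data Dictionary and the Services and Costs Data Dictionary.

```python
import csv

# Illustrative mapping from a community's MIS export columns to a common
# layout. All column names here are invented; the actual target structure
# is defined by the Services and Costs Data Dictionary.
FIELD_MAP = {
    "client_id": "child_id",
    "svc_code": "service_type",
    "svc_date": "service_date",
    "units": "service_units",
    "amt_billed": "service_cost",
}

def reformat_mis_export(in_path, out_path):
    """Rewrite an MIS CSV export using the common column names."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP.values()))
        writer.writeheader()
        for row in reader:
            # Copy each source column into its common-structure counterpart.
            writer.writerow({new: row[old] for old, new in FIELD_MAP.items()})

if __name__ == "__main__":
    reformat_mis_export("mis_export.csv", "services_costs_common.csv")
```

In practice, a community would substitute its own export column names into the mapping; the point of the common structure is that every community's output file carries the same fields regardless of the source system.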


CQI Initiative Evaluation. The CQI Initiative Evaluation is a new component of the national evaluation. The CQI Initiative is a process developed to support systems of care in achieving the goals and objectives of the Comprehensive Community Mental Health Services for Children and Their Families Program. The CQI Progress Report is a performance measurement and benchmarking tool that the National Evaluation Team has provided to system of care communities to support CQI at the national and local levels. Infrastructure to support the grant communities in achieving benchmarking goals has been, and continues to be, developed and refined. The CQI Initiative Evaluation is designed to assess how communities are pursuing the CQI Initiative, how communities use the CQI Progress Reports and associated technical assistance (TA) in their efforts toward CQI, how satisfied communities are with the CQI approach, and what communities’ perceptions are of the effectiveness and utility of the CQI Progress Reports and TA provision.


It will include two data collection mechanisms: (1) a Web-based survey of local system of care constituents and (2) semi-structured interviews of key constituents from a subset of communities receiving Federal funding. The survey and interviews will both occur in year 4 of the evaluation. For each grant community, the National Evaluation Team will ask up to seven site-level respondents (i.e., principal investigator, project director, lead evaluator, cultural competence coordinator, social marketer, lead family representative, youth coordinator) to complete the CQI Initiative Survey. The National Evaluation Team will select a subset of six communities for participation in the semi-structured CQI Interview.


Evidence-Based Practices Study. The purpose of the EBP Study is to better understand the implementation of evidence-based treatment in systems of care. In an effort to include each of the 30 funded communities at some level, the evidence-based practices study will take a multilevel, mixed-method approach to collecting information from multiple respondent groups within and across communities. This study has been revised. The EBP study was previously composed of five substudies: the Assessment of Planned EBP Substudy (APEBPS), the Provider Practices Substudy (PPS), the Community Readiness Substudy (CRS), the Combined Provider Practices, Community Readiness and Outcomes Substudy (CPPCROS), and the Evidence-Based Practice Experiences Substudy (EBPES).


  • The Assessment of Planned EBP Substudy (APEBPS) was completed in March 2009. This substudy included a review of planned implementations of EBP among all 2005-funded communities. Information was collected through a careful review of grant applications and confirmed on an annual basis through both informal and systematic communication with the local communities using a semi-structured interview process. Two rounds of discussions with site leadership teams occurred during evaluation years 1, 2, and 3.

  • The Provider Practices Substudy (PPS) was designed to provide contextual detail concerning the knowledge and use of EBP among providers. Results from its administration in evaluation year 2 showed that although 55.6% (n = 213) of service providers began the EBP-R survey, only 31.5% (n = 67) of those who began it actually completed it. This poor completion rate may be attributable to the length of the survey. Based on these poor response rates, the National Evaluation Team will discontinue this survey.

  • The Community Readiness Substudy (CRS) examines organizational readiness for change and relies on a survey to gather information. Based on its administration in evaluation year 2, response rates have been good, but there has been a lack of variance in the results. Specifically, results show that for both agency directors and treatment providers, the average scores on all 18 of the subscales ranged between 3 and 4 (on a 1–5 point scale). Although there were statistically significant differences between directors and providers on four of the subscales, these differences seem simply to reflect the participants’ roles and degrees of authority. These results suggest a need for different approaches.

  • The Combined Provider Practices, Community Readiness and Outcomes Substudy (CPPCROS) used a pilot-study process to combine EBP-R survey data with individual-level data from the Cross-Sectional Descriptive and Longitudinal Child and Family Outcome Studies in evaluation years 1 and 2. The pilot study found that current CMHI systems of care do not generally record details about the treatments provided to their children, youth, and families. There are no reliable records from participating agencies as to which evidence-based treatments (EBTs) they provided to youth and families or their degree of fidelity in implementation. Therefore, the data-matching methods planned in the original design of the CPPCROS are unlikely to produce much reliable, generalizable data.

  • The Evidence-Based Practice Experiences Substudy (EBPES) was planned to examine youth and family awareness of and service experience with EBP, as well as differences in client awareness, knowledge, and perceived usefulness of EBP between groups of children and youth in system of care sites who do and do not receive an evidence-based treatment. Many of the EBPES research questions will be addressed with the introduction of the Culturally Competent Evidence-Based Practices Substudy and the Family and Youth Experiences Substudy. There is little need to duplicate this effort; therefore, the National Evaluation Team will discontinue the EBPES.


The five substudies originally designed will be replaced with the following three substudies:


  • The Community Plans Substudy (CPS). Formerly called the APEBPS, this substudy focuses on obtaining data about planned implementation of EBP among all Phase V communities in the early stages of their grants. Data collection was completed as of March 2009. The substudy ends in FY 2009 because sites will no longer be in the planning stage in FY 2010.

  • The Implementation Factors Substudy (IFS) combines elements of the PPS, CRS, and CPPCROS. The IFS examines the contextual factors that support or inhibit the implementation of evidence- and practice-based treatments and the impact of these approaches on consumers, providers, agencies, and systems of care. Beginning in evaluation year 4, the National Evaluation Team will conduct semi-structured qualitative interviews with professionals, consumers, and other constituents to better understand their experiences with and perceptions of treatment implementation factors. The data collection will include all three levels of system of care constituents: system, service, and consumer. A separate interview guide was developed for each category of key informant and will be used to gather preliminary information from the three levels. Contacting project directors, direct mental health providers, family members, and youth will be of particular importance. Interviews will be conducted during evaluation year 4.

  • The Family and Youth Experiences Substudy (FYES) is a continuation of the EBPES and focuses on youth and family awareness of and service experience with EBT. The information is collected from participants as part of the Longitudinal Child and Family Outcome Study during the 6-month follow-up interviews via the Evidence-Based Practices Experience Measure (EBPEM), an addendum to the Multi-Sector Service Contacts—Revised (MSSC-R) instrument.


Cultural and Linguistic Competence (CLC) Study. The CLC study encompasses three substudies: the Culturally and Linguistically Competent Implementation Substudy (CLCIS); the Culturally Competent Implementation and Outcomes Self-Assessment Substudy (CCIOSAS); and the Culturally Competent Evidence-Based Practices Substudy (CCEBPS). The first substudy, completed in evaluation year 1, examined the relationship between the cultural and linguistic characteristics of communities and the design and implementation of system of care practices, from infrastructure to service delivery. The National Evaluation Team conducted the CLCIS using semi-structured qualitative interviews to assess how cultural and linguistic community characteristics inform system of care implementation, and the barriers and facilitators encountered when implementing cultural and linguistic competence standards. The National Evaluation Team developed protocols for each type of key informant, conducted interviews at four sites, and administered no more than nine interviews using each version of the protocol; for this reason, OMB clearance was not requested. The National Evaluation Team used findings from these interviews to develop protocols for the remaining two substudies, for which clearance is being requested.


The CCIOSAS, the second substudy, conducted in evaluation year 3, involves a multi-method qualitative approach that will examine how system of care communities conduct self-assessments of their emerging CLC practices at the infrastructure and service delivery levels, and how results are used to improve system of care practice. The National Evaluation Team will conduct site visits in four communities. Key informants (e.g., system of care staff, agency partners, youth, and families) will participate in either focus groups or interviews to better understand how communities self-assess their CLC efforts. Communities will be active in determining who the informants will be and which data collection format (i.e., focus group or interview) is best for the participants. Eight communities—including the four communities that participated in the CLCIS—will be involved in the CCIOSAS. The four communities that participated in the CLCIS will not receive a site visit, but will participate in focus groups and/or interviews via teleconference. Lastly, to gain a broader perspective on how the remaining 2005- and 2006-funded system of care communities implemented self-assessments of their linguistic and cultural competence to improve their systems of care, the National Evaluation Team will conduct teleconference calls. These calls will be held with a small group of community representatives who develop, implement, or participate in the self-assessment process (e.g., project director, evaluator, cultural competence coordinator, service provider).


The CCEBPS, the third substudy, conducted in evaluation year 5, examines the extent to which the diverse characteristics of system of care communities shape service delivery, including what choices or adaptations of evidence-based practices are made to ensure cultural and linguistic needs are met. This study’s purpose is to develop a more systematic understanding of the approaches to adapting evidence-based practices to serve diverse populations, including both successes and barriers to the implementation of culturally competent evidence-based practices, and how these adaptations relate to outcomes. The National Evaluation Team will conduct site visits in four communities. Key informants (e.g., system of care staff, agency partners, youth, and families) will participate in either focus groups or interviews to better understand how communities approach adapting evidence-based practices to meet the diverse needs of the children, youth, and families they serve. Communities will be active in determining who the informants will be and which data collection format (i.e., focus group or interview) is best for the participants. Twelve communities—including the eight communities that participated in the CCIOSAS—will be involved in the CCEBPS. The eight communities that participated in the CCIOSAS will not receive a site visit, but will participate in focus groups and/or interviews via teleconference. Lastly, to gain a broader perspective on how the remaining 2005- and 2006-funded system of care communities adapted and implemented culturally competent evidence-based practices, teleconference calls will be conducted with a small group of community representatives who participated in this process (e.g., project director, evaluator, cultural competence coordinator, service provider).



Principal changes from Phase V 2006 submission to Phase V 2009 re-submission include:


  • Updates to three of the measures in the Longitudinal Child and Family Outcome Study instrument package to address information desired by the program;

  • The addition of the Services and Costs Data Dictionary and the Flex Funds Data Dictionary;

  • The addition of the CQI Initiative Evaluation to assess how the CQI Initiative approach is being pursued by communities, how communities use the CQI Progress Reports and associated technical assistance (TA) in their efforts toward CQI, how satisfied communities are with the CQI approach, and what communities’ perceptions are of the effectiveness and utility of the CQI Progress Reports and TA provision;

  • The modification in the design of the Evidence-Based Practices Study to help determine what attitudinal and organizational factors influence the implementation and receipt of evidence-based practices; and

  • The addition of two new substudies under the Cultural and Linguistic Competence Study, which focus on the adaptation of evidence-based practices and the organizational context that supports adaptation.


Table 2 (below) summarizes instrument additions and revisions for the Phase V Re-submission.


Table 2. Study Component and Instrument Revisions for Phase V Re-submission



Study Components and Instruments (status for the 2009 re-submission noted for each)

System of Care Assessment
  • Overview of System of Care Assessment Framework: No change
  • Letter Templates: No change
  • Informant Table: No change
  • Pre-Visit Documentation: No change
  • System of Care Assessment Interview Protocols: No change
  • Interagency Collaboration Scale: No change

Cross-Sectional Descriptive Study
  • Enrollment and Demographic Information Form: No change
  • Child Information Update Form: No change

Longitudinal Child and Family Outcome Study
  • Living Situations Questionnaire (LSQ): No change
  • Child Behavior Checklist 6–18 (CBCL 6–18): No change
  • Child Behavior Checklist 1½–5 (CBCL 1½–5): No change
  • Caregiver Strain Questionnaire (CGSQ): No change
  • Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS-2C): No change
  • Education Questionnaire—Revised (EQ-R): Revised; slight wording change to interviewer note, and the term “day care” changed to “childcare”
  • Family Life Questionnaire (FLQ): No change
  • Delinquency Survey—Revised (DS-R): No change
  • GAIN-Quick Substance Related Issues (GAIN Quick-R): No change
  • Substance Use Survey—Revised (SUS-R): No change
  • Revised Children’s Manifest Anxiety Scales (RCMAS): No change
  • Reynolds Adolescent Depression Scale—Second Edition (RADS-2): No change
  • Youth Information Questionnaire (YIQ-I): No change
  • Youth Information Questionnaire (YIQ-F): No change
  • Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS-2Y): No change
  • Columbia Impairment Scale (CIS): No change
  • Vineland Screener (VS): No change
  • Caregiver Information Questionnaire (CIQ-I): Revised; Question 39a skip pattern revised, and Question 39d list of medications updated
  • Caregiver Information Questionnaire (CIQ-F): Revised; Question 39a skip pattern revised, and Question 39d list of medications updated

Service Experience Study
  • Multi-Sector Service Contacts Questionnaire—Revised (MSSC-R): Revised; slight modification to Card 4, and Cards 6 and 7 are new
  • Evidence-Based Practices Experience Measure (EBPEM): No change
  • Youth Services Survey for Families (YSS-F): No change
  • Youth Services Survey (YSS): No change
  • Cultural Competence and Service Provision Questionnaire (CCSP): No change

Sustainability Study
  • Sustainability Study Respondent Selection Criteria: No change
  • Sustainability Study Email Scripts: No change
  • Sustainability Study Survey Cover Letter: No change
  • Sustainability Study Survey: No change
  • Sustainability Study Reminder Letters: No change
  • Sustainability Study Web Screens: No change

Services and Costs Study
  • Flex Funds Data Dictionary: New
  • Services and Costs Data Dictionary: New

Continuous Quality Improvement (CQI) Initiative Evaluation
  • CQI Initiative Evaluation Letter Templates: New
  • CQI Initiative Survey: New
  • CQI Initiative Interview Guide: New

Evidence-Based Practices Study
  • System-level Implementation Factors Discussion Guide: New
  • Service-level Implementation Factors Discussion Guide: New
  • Consumer-level Implementation Factors Discussion Guide: New

Cultural and Linguistic Competence Study
  • CCIOSAS – Beneficiaries of Self-Assessment Findings Focus Group Guide – Staff and Partners: New
  • CCIOSAS – Beneficiaries of Self-Assessment Findings Focus Group Guide – Caregivers: New
  • CCIOSAS – Beneficiaries of Self-Assessment Findings Focus Group Guide – Youth: New
  • CCIOSAS – Participants in Self-Assessments Focus Group Guide – Staff and Partners: New
  • CCIOSAS – Participants in Self-Assessments Focus Group Guide – Caregivers: New
  • CCIOSAS – Participants in Self-Assessments Focus Group Guide – Youth: New
  • CCIOSAS – Users of Self-Assessment Findings Focus Group Guide – Staff and Partners: New
  • CCIOSAS – Users of Self-Assessment Findings Focus Group Guide – Caregivers: New
  • CCIOSAS – Users of Self-Assessment Findings Focus Group Guide – Youth: New
  • CCIOSAS – Telephone Interview – Staff and Partners: New
  • CCEBPS – Managers of EBP Process Focus Group Guide: New
  • CCEBPS – Providers of EBP Focus Group Guide: New
  • CCEBPS – Family Focus Group Guide: New
  • CCEBPS – Youth Focus Group Guide: New
  • CCEBPS – Telephone Interview: New

CMHS uses the results from Phase V to develop policies and provide guidance regarding the development of systems of care. Specific findings on the successes and challenges that agencies have experienced in developing collaborative, coordinated, and comprehensive systems are used to tailor technical assistance to grant communities. Information and findings from the evaluation help CMHS plan and implement other efforts related to systems of care. Findings from the evaluation also enhance other CMHS programs that support system development (e.g., Projects for Assistance in Transition from Homelessness, Community Mental Health Services Block Grant, CMHS Community Support Programs, and Child and Adolescent Mental Health & Substance Abuse State Infrastructure Grants). In addition, the many partners that work in collaboration with CMHS, including the Federation of Families for Children’s Mental Health and the Technical Assistance Partnership, are able to use the results in their national efforts to help build systems of care to meet the needs of children, youth, and families.


Finally, CMHS also uses the findings from the evaluation to provide objective measures of its progress toward meeting targets of key performance indicators put forward in its annual performance plans as required by law under the GPRA. Globally, these measures for children and youth include increases in the number of children and youth served in the CMHS program, increased school attendance, decreased juvenile justice contacts, decreased use of inpatient hospitalization, decreased expenditures for inpatient hospitalization, and long-term program outcomes demonstrated by the percentage of grantees showing decreases in child/youth symptomatology, decreases in inpatient care costs (efficiency measure), and increases in programs maintained 5 years post-program funding. Specific measures from the Phase V instrumentation corresponding to these global measures include the Education Questionnaire—Revised (EQ-R) and the Delinquency Survey—Revised (DS-R) for assessing school attendance and juvenile justice contacts; the Living Situations Questionnaire (LSQ) for assessing usage of inpatient hospitalization; the Child Behavior Checklist (CBCL) for assessing child symptomatology; and the Sustainability Survey for assessing sustained program characteristics. These instruments are described in detail in Section B.2.


Findings from the evaluation are useful to policymakers, planners, and analysts in other Federal agencies involved in programs for this target population. The service program is coordinated with relevant Federal agencies, such as NIMH and the Administration for Children and Families in DHHS, the Office of Juvenile Justice and Delinquency Prevention in the Department of Justice (DOJ), and the Institute of Education Sciences and the Office of Special Education Programs under the Office of Special Education and Rehabilitative Services in the DOE. CMHS has held several meetings with representatives from these and other Federal agencies since the inception of this program. The involvement of staff from related agencies and programs ensures that the effort is coordinated at the Federal level and that results of the evaluation will be useful to a wider audience. See Attachment 2.A for a list of participants in the Federal/National Partnership for Children’s Mental Health.


Findings from the evaluation are used by grant communities to improve the implementation of their systems of care and achieve the goals of the CMHI. Demographic and outcome data on a sample of children, youth, and families who participate in the system of care aid grant communities in identifying the program elements that help children, youth, and families function better, that are family-driven and youth-guided, and that lead to client satisfaction. Grant communities are expected to use the information to better identify their target populations, improve their services, and support their efforts to obtain required matching funds and to sustain their systems of care after CMHI funding has ended. Indeed, several grant communities have used data collected for the Phase I, II, III, and IV studies to request additional funding from their State legislatures. The same is expected for Phase V. Service experience data provide useful feedback to grant communities on whether families experience services as the grant communities intended and identify their programs’ strengths and weaknesses. This information helps grant communities plan culturally competent services and supports that families report as useful and that are associated with improved child, youth, and family outcomes. System of Care Assessments provide useful feedback on how to refine the system by identifying gaps in system development and barriers to collaboration, which helps grant communities more effectively allocate personnel and funding and prioritize activities.


Grant communities also learn what barriers children, youth, and their families perceive and can work to eliminate those barriers. Clinicians are able to use the data collected with standardized objective measures to guide treatment.


The research community, particularly the field of children’s mental health services research, profits in a number of ways. First, evaluation of the CMHI adds significantly to the developing research base about systems of care. Second, the focus on child, youth, family, and system outcomes allows researchers to examine and understand the specific ways children and youth improve, how services can be enhanced, and the importance of adherence to service plans. Moreover, the relationship among these variables can be better understood. Finally, the analysis of evaluation data aids researchers in formulating new questions about systems of care and specific services, and helps both service providers and researchers improve the delivery of children’s mental health services. The information obtained from the Longitudinal Child and Family Outcome Study is of particular importance in addressing these research goals.


If these data are not collected, policymakers and program planners at the Federal and local levels would not have the information necessary to determine the extent to which children and youth with serious emotional disturbance and their families experience grant-funded services as they were intended. Without this evaluation, they would not know whether these systems have had any positive impact on the lives of the people they serve.



3. USE OF IMPROVED INFORMATION TECHNOLOGY


System of Care Assessment. System of Care Assessment data, which primarily are qualitative in nature, are collected by the National Evaluation Team during site visits and do not lend themselves to the use of special technology at this time.


Cross-Sectional Descriptive Study, Longitudinal Child and Family Outcome Study, and Service Experience Study. The majority of the child, youth, and family descriptive, outcome, and intervention-level data are collected through interviews with youth and families using standard instruments. The data collection is conducted by grant community staff. Previous experience has shown that grant communities differ in their access to hardware and software. Requiring special hardware or software for this evaluation would be disruptive and would increase rather than reduce burden, especially since grant communities must be capable of administering the instruments in a variety of settings. However, the National Evaluation Team has provided software for computer-assisted personal interviewing (CAPI) for those grant communities that have access to the necessary hardware. Across all study components, approximately half of total responses are collected electronically by CAPI or Web survey. The remaining half are keyed into an electronic form after hard-copy data collection.


Data for these studies are managed using an integrated Internet-based data input, management, and dissemination system—the interactive-collaborative network (ICN). The ICN, which was introduced in Phase III and refined in Phases IV and V of the national evaluation, reduces evaluation burden for grant communities and allows real-time access to data for grant community personnel and National Evaluation Team (NET) members. The system serves as a mechanism for communicating about evaluation activities and results.

The ICN was designed as a three-part system that allows systematic data input, provides immediate validation to identify data entry errors, and monitors data entry and evaluation in real time. It reduces processing time and provides the capability of creating interactive reports. The ICN is a secure system that maintains privacy by requiring different levels of password-protected access to site and national data. The ICN is accessible only to staff at systems of care grant communities, program partner organizations, the National Evaluation Team, and CMHS; it is not accessible to systems of care participants or the general public. The three software subsystems are as follows:


  1. Data Input. Data entry software allows rapid data entry off-line, and the Internet is used to transfer data from local grant communities to the national database. The off-line data entry feature of the ICN gives grant communities with available laptop computers the option of CAPI interviewing, with the participant’s responses entered directly into the data entry package during the interview. Specific descriptive information on Cross-Sectional Descriptive Study participants is entered directly into the ICN Web site. This Web-based data entry software is designed for use by intake workers or case managers, who are often located at various agencies rather than at a central evaluation office. Basic validations are completed during the data entry process. More complex validations requiring comparison of data across instruments and across time points are performed on the ICN after data are uploaded to and stored in the central repository; a sketch of this two-stage validation approach follows this list. The primary goal of this Web-based software is to maximize the entry of descriptive information on all children and youth served in system of care programs as efficiently as possible and to minimize the burden associated with the Cross-Sectional Descriptive Study.

  2. Data Monitoring and Management. Software allows the National Evaluation Team and CMHS to monitor the status of each grant community’s data submissions in real time and permits grant communities to check the status of their own data submissions.

  3. Data Dissemination. Features on the ICN support grant communities’ ability to use their data for local data needs and program performance monitoring purposes. Reports provided on the ICN include summary analysis of grant community-specific data, participant enrollment and retention rates, and analysis of continuous quality improvement indicators. Additional reports posted on the ICN facilitate review of aggregate data reports that CMHS has approved for public release.
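To illustrate the two-stage validation approach described in item 1 above, the sketch below shows simple field-level checks applied at entry time and a cross-instrument check run after upload. This is a minimal illustration only; the field names, rules, and data structures are hypothetical and are not the ICN’s actual implementation.

```python
from datetime import date

# Hypothetical field-level rules applied during data entry (illustrative only).
FIELD_RULES = {
    "participant_code": lambda v: isinstance(v, str) and len(v) > 0,
    "age_at_intake": lambda v: isinstance(v, int) and 0 <= v <= 21,
    "intake_date": lambda v: isinstance(v, date),
}

def validate_record(record):
    """Basic validation at entry time: flag missing or out-of-range fields."""
    errors = []
    for field, rule in FIELD_RULES.items():
        if field not in record:
            errors.append(f"{field}: missing")
        elif not rule(record[field]):
            errors.append(f"{field}: failed validation")
    return errors

def cross_validate(intake_record, followup_record):
    """More complex validation across instruments and time points, run after
    upload to the central repository: for example, a follow-up interview
    cannot be dated on or before the intake interview."""
    errors = []
    if followup_record["interview_date"] <= intake_record["intake_date"]:
        errors.append("follow-up interview dated on or before intake")
    return errors

# Example: an out-of-range age is flagged immediately during data entry.
record = {"participant_code": "A-0001", "age_at_intake": 34,
          "intake_date": date(2009, 6, 1)}
print(validate_record(record))  # ['age_at_intake: failed validation']
```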


The OMB control number, expiration date, and response burden statement are displayed at the beginning of instruments programmed in the ICN system, as shown in the screen shot below. The ICN complies with the requirements of Section 508 of the Rehabilitation Act to permit accessibility for people with disabilities.


The National Evaluation Team provides training and direct technical assistance support to grant communities to facilitate the implementation of the evaluation protocol and the use of evaluation results at the grant community level. The National Evaluation Team trains grant community personnel on the ICN at national training meetings and during evaluation technical assistance visits to the sites.


Sustainability Study. The Sustainability Study is conducted as a Web survey. Respondents enter a Web address, username, and password into their Web browsers to open and complete the survey. Because the National Evaluation Team maintains the names and contact information of respondents in communities funded in 2005 and 2006, e-mail contacts are available. A letter describing the survey and instructions for logging onto the Web survey is sent to respondents by either e-mail or mail. For those who cannot complete the survey on the Web, the option to complete a paper-and-pencil survey is provided. Previous experience indicates that approximately 90% of respondents will submit responses electronically. Survey completion is monitored through respondent logins to assess response rates and to implement targeted follow-up mailings and phone calls to nonrespondents.
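The login-based completion monitoring described above amounts to tracking which invitees have submitted the survey and computing a running response rate. The sketch below illustrates the idea; the roster, usernames, and e-mail addresses are hypothetical.

```python
# Hypothetical roster of survey invitees and the set of usernames whose
# logins show a completed Web survey submission.
invitees = {
    "jsmith": "jsmith@example.org",
    "mlee": "mlee@example.org",
    "tchan": "tchan@example.org",
}
completed = {"jsmith"}

def response_rate(invitees, completed):
    """Proportion of invited respondents who have completed the survey."""
    return len(completed) / len(invitees)

def nonrespondents(invitees, completed):
    """Invitees without a completed survey, targeted for follow-up
    mailings and phone calls."""
    return {user: email for user, email in invitees.items()
            if user not in completed}

print(f"Response rate: {response_rate(invitees, completed):.0%}")  # 33%
print("Follow up with:", sorted(nonrespondents(invitees, completed)))
```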


Services and Costs Study. Data for this study are collected by grant communities in various ways, depending on their local information technology infrastructure. Some communities have comprehensive data systems that already track services and costs data electronically, while other communities have limited or no data system in which to record these data. To minimize communities’ burden in providing data for this study, the National Evaluation Team is accommodating this variation in communities’ capacity to collect services and costs data. For communities that have existing data systems or are developing their own data systems, the National Evaluation Team is providing a common data dictionary structure for communities to use in extracting and recoding their data prior to transferring their data to the National Evaluation Team. For communities that have no data system, the National Evaluation Team is developing two data entry applications for communities to use for collecting data for this study.
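A common data dictionary of this kind maps each community’s local service codes and field names onto a shared structure before transfer. The sketch below illustrates one possible recoding step; the local codes, field names, and standard categories are hypothetical, not the evaluation’s actual data dictionary.

```python
# Hypothetical mapping from one community's local service codes to the
# common data dictionary's standard service categories.
LOCAL_TO_STANDARD = {
    "IND_THER": "individual_therapy",
    "FAM_THER": "family_therapy",
    "CM": "case_management",
}

def recode_service_record(local_record):
    """Recode a locally formatted service record into the common structure
    before transferring it to the National Evaluation Team."""
    units = float(local_record["units"])
    rate = float(local_record["rate"])
    return {
        "participant_code": local_record["client_id"],
        "service_type": LOCAL_TO_STANDARD.get(local_record["svc_code"], "other"),
        "units": units,
        "unit_cost": rate,
        "total_cost": units * rate,
    }

local = {"client_id": "A-0001", "svc_code": "CM", "units": "2", "rate": "45.00"}
print(recode_service_record(local))
```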


These two data entry applications provide communities with the tools to report services and costs data to the National Evaluation Team in a standard format and structure. Each of the two applications is designed for entering different types of data.


The first type of data provides information on how communities disburse the flexible spending funds that are included as part of their budget. To track these budget expenditures, the National Evaluation Team has developed the Flex Funds Tool for communities to use. The Flex Funds Tool is a stand-alone Microsoft Excel® application that includes password protection, data entry validation, and reporting features.


The second type of data provides information on each service received by children and youth from all system of care agencies, and the cost of each service. To compile information on services received and the costs of these services, the National Evaluation Team is developing a Services and Costs Data Tool. This data entry application is designed as a Web-based program that includes user accounts and password protection, data entry validation, and reporting features.


Services and costs data collected by communities provide valuable information that supports not only the National Evaluation Team’s Services and Costs Study but also grant communities’ local fiscal management, program performance measurement, and local data reporting needs. The National Evaluation Team’s development of these two data entry applications minimizes communities’ need to develop their own systems locally and the costs of such development.


CQI Initiative Evaluation. The CQI Initiative Survey will be Web-based and will comply with Section 508 of the Rehabilitation Act. The National Evaluation Team will recruit respondents primarily through e-mail. The National Evaluation Team will maintain names and contact information of respondents in Phase V communities; therefore, e-mail contacts will be available. The National Evaluation Team will send a letter describing the survey and instructions for logging onto the Web survey to respondents by either e-mail or ground mail. Respondents will enter a Web address and password into their Web browsers to open and complete the survey. Survey completion will be monitored through respondent logins to assess response rates and to implement targeted follow-up mailings and phone calls to nonrespondents. For those who cannot complete the survey online, the National Evaluation Team will provide the option to complete a paper-and-pencil survey. It is expected that less than 10% of respondents will complete a paper survey. The National Evaluation Team will conduct the CQI Initiative Interviews by telephone and will not utilize special technology.



Evidence-Based Practices Study. The National Evaluation Team will collect data for the Implementation Factors Study through key informant telephone interviews. The information will be primarily qualitative in nature and the data collection methodology does not lend itself to the use of special technology at this time.


Cultural and Linguistic Competence Study. Data from the CLC study are primarily qualitative in nature. They are collected by the National Evaluation Team during site visits and by telephone interviews and do not lend themselves to the use of special technology at this time.



4. EFFORTS TO IDENTIFY DUPLICATION


The 2005 report developed by the Institute of Medicine (IOM), “Improving the Quality of Health Care for Mental Health and Substance-Use Conditions,” encourages the development of an overall strategy to address mental health and substance-use conditions that includes an infrastructure to produce and disseminate scientific evidence of effective treatments and research funds that are used for studies directly related to clinical practice and policy. The new IOM report (IOM, 2009), Preventing Mental, Emotional, and Behavioral Disorders Among Young People: Progress and Possibilities, focuses on the importance of preventing mental, emotional, and behavioral (MEB) disorders through the application of universal, selective, and targeted interventions with individuals and groups of children and youth who are at risk of developing serious MEB disorders. It also identifies a number of programs that have a sufficient evidence base to warrant consideration of broader implementation.


The issue of real-world effectiveness has become a growing concern for those who have been supporting efficacy studies of treatments for specific child disorders. A conceptual model and strategic plan for improving the relationship between the results of efficacy trials and effectiveness research for both children and adults with mental illness was released by NIMH in 1998. At this critical juncture, the Phase V evaluation offers a unique opportunity to address the overlapping needs to understand the effectiveness of systems of care and to implement and measure evidence-based treatments in community contexts. This opportunity is consistent with the Federal Action Agenda (2005) in response to the President’s New Freedom Commission. In March 2007, SAMHSA launched an improved National Registry of Evidence-Based Programs and Practices, expanded to include interventions in mental health and substance abuse prevention and treatment.


The development of designs to address these needs within the national evaluation has generally followed questions emerging from the children’s mental health services field. Although many questions remain about the effectiveness of systems of care at the clinical outcome level (Burns & Hoagwood, 2002; Stephens et al., 2005; Surgeon General’s Report, 1999), data exist to support continued work on implementation of the approach within community settings, and the President’s New Freedom Commission (2003) calls for community services with programs integrated across levels of government and agencies. Strong consumer advocacy for alterations in traditional mental health services approaches for children and youth with serious emotional disturbance and their families is an important driving factor in sustaining Federal- and State-level efforts.


The President’s New Freedom Commission Report (2003) and the evidence-based treatment movement within children’s mental health (Burns & Hoagwood, 2002) are more recent events that have affected the evolution of research questions and the direction of the evaluation. Systems of care remain an area in need of further study, especially with respect to the integration of evidence-based interventions within these community-based programs. The most important question for the field is how to effectively integrate evidence-based interventions within the system of care philosophy, with the underlying hypothesis that the effects of these interventions will be maintained and will generalize more effectively within the context of coordinated, community-based service systems.


The National Evaluation Team also conducted an extensive literature search to identify existing evaluation research on systems of care and children’s mental health services. The search included a review of published literature, unpublished papers, works-in-progress, and working papers and documents. During the implementation of the Phase I–IV evaluations, the National Evaluation Team has kept abreast of the literature in children’s mental health services research and has been in close contact with the original grant communities. This has allowed the team to keep up with advances in practice and research. In addition, the Services Evaluation Committee for the national evaluation has helped keep the evaluation apprised of innovations in the field. These efforts yielded a broad list of useful references. While some of the research identified contains features similar to the planned evaluation, the scope of the research projects varies considerably and is driven by the particular research interests of each investigator. The Phase V evaluation offers unique contributions to the field not available in these other studies. The nature of these studies and the unique contributions being made by the Phase V evaluation are summarized below.


“Systems of Care for Children and Adolescents with Serious Emotional Disturbances: What Are the Results?” published by Beth Stroul in 1993, contains a complete review of studies of local systems of care. Stroul concluded that while there is a growing body of evidence to support the contention that systems of care provide high quality and more appropriate care, continuing commitments to research and evaluation are needed. Further, attention should be directed beyond the assessment of short-term outcomes. She called for the development of a common set of outcome indicators that would provide a framework for more systematic studies and multi-site analyses. The evaluations for all five phases of the project address these concerns because they cover multiple sites and share standard instrumentation. Phases I and II included comparison sites, and Phases II, III, and IV include evidence-based treatment studies. Beginning in Phase II and continuing in Phase V, data are collected from children, youth, and families after the completion of services to examine long-term outcomes.


In 2002, Stroul published Issue Brief—System of Care: A Framework for System Reform in Children’s Mental Health. The purpose of this issue brief was to re-examine system reform in children’s mental health, clarify what the system of care concept is, and explore the continued relevance of the system of care concept and philosophy as a framework for reform. Four questions are addressed: (1) What kind of system reform is needed for children’s mental health? (2) What is the actual meaning of the system of care concept? (3) Why should we continue to use the system of care concept and philosophy as a framework for system reform in children’s mental health? (4) How can we achieve our system reform goals in children’s mental health? The national evaluation addresses these questions through a number of its studies including the System of Care Assessment and the Longitudinal Child and Family Outcome Study.


In 2008, Stroul and Blau published their edited book The System of Care Handbook. The purpose of the book was to provide a compendium that informed the development of systems of care, drawing on the evidence base on effective strategies for systems building and service delivery. Emphasis was placed on providing recommendations for practice. Evaluation results were used to illustrate how data can be used to inform decision-making at various levels in system change initiatives. Content focused on building and sustaining systems of care, implementing evidence-based practices in these systems, and providing services in a culturally and linguistically competent way that promotes the elimination of disparities in mental health services delivery. The implications for future evaluation acknowledged the importance of developing generalizable knowledge about the effectiveness of systems of care. The national evaluation addresses these questions through a number of its studies including the CQI Initiative Evaluation, System of Care Assessment, Sustainability Study, and the CLC and EBP substudies.


The Alternatives to Residential Treatment Study (ARTS) project, which started in the early 1990s, was conducted by the Research and Training Center for Children’s Mental Health of the Florida Mental Health Institute to study the effectiveness of five innovative programs (Duchnowski, Hall, & Kutash, 1998; Duchnowski, Hall, Kutash, & Friedman, 1998). Components of this study included descriptions of the children and families served, interventions employed, program costs, and outcomes for children over time. This study contributed to the field by documenting the experiences of individuals affected by changes in service delivery systems. However, the ARTS project sample was relatively small (87 children). As a result, generalizable conclusions about the effectiveness of the system of care approach cannot be drawn. With a larger sample and more sites, Phase V offers an opportunity to produce generalizable findings for those elements covered in ARTS. In addition, unlike ARTS, Phase V will address the effect of system of care and service-level factors on outcomes.


The National Adolescent and Child Treatment Study (NACTS) was a 7-year longitudinal study conducted at 121 sites in six States by the Research and Training Center for Children’s Mental Health of the Florida Mental Health Institute. It assessed the treatment provided to children with serious emotional disturbance in residential mental health facilities and in community-based special education programs (Greenbaum, Dedrick, Friedman, Kutash, Brown, Lardieri, & Paugh, 1996). Although the NACTS project studied children in residential treatment and community-based special education programs, it focused on describing children rather than the services they received. The NACTS was not evaluative, but descriptive, in nature. In addition to describing children receiving services in a community-based system of care, the Phase V evaluation also assesses outcomes and service delivery and use.


The Robert Wood Johnson Foundation (RWJF) Mental Health Services Program for Youth, conceived in 1988, funded eight community programs that were evaluated by Brandeis University (Cole & Poe, 1993; Cole, 1996; Saxe & Cross, 1997). The evaluation of that program focused on changing financing policies and refining new treatment strategies and did not aim to assess client outcomes over time. While not mandated by the evaluation, some sites collected child and family outcome data. However, their findings were limited due to differences in instrumentation that compromised the ability to compare results across the sites. The national evaluation systematically evaluates child and family outcomes using a standard set of instruments, thus allowing for comparison across sites and, when appropriate, aggregation of data.


Another evaluation of the RWJF program in North Carolina was started in 1992 and conducted by researchers at Duke University (Burns, Farmer, Angold, Costello, & Behar, 1996; Angold, Burns, Costello, & Behar, 1998). For this study, children were randomly assigned to one of two models of case management to determine their impact on mental health outcomes for children. Unlike Phase V, this study did not evaluate the effectiveness of the full continuum of service options or study the roles of multiple child-serving sectors (e.g., juvenile justice, education, child welfare).


The Center for Mental Health Policy at Vanderbilt University evaluated the Fort Bragg Child and Adolescent Mental Health Demonstration Project. The evaluation of this project, which served children of military personnel in the Fort Bragg area, had four components. First, it described how the demonstration project was implemented and highlighted key process indicators (e.g., linkages among providers, extent of family involvement). Second, it examined whether the quality of services provided was sufficient to produce the predicted effect on outcomes. Third, it studied the cost of providing services and patterns of service use. Finally, it assessed the mental health outcomes of the children using a quasi-experimental design that included two comparison sites (Bickman, Guthrie, Foster, Lambert, Summerfelt, Breda, & Heflinger, 1995). Several of these general areas of inquiry overlap with the Phase V evaluation. However, the Fort Bragg study focused on services in the mental health sector, ignoring other child-serving sectors. The evaluation indicated that services delivered through a continuum of care did not produce significantly better clinical outcomes than regular CHAMPUS-funded services for military dependents. Access to services was greater in the demonstration site with resulting increases in costs. A subsequent investigation utilized a randomized control group design to evaluate the effectiveness of system of care services for children with serious emotional disturbance and their families seeking services in Stark County, Ohio. This latter effort also found no significant clinical and functional differences between children served in a system of care and those who received treatment as usual, although the children enrolled in this trial may have been minimally functionally impaired and the number of participants limited the power to detect significant differences (Bickman, Summerfelt, Firth, & Douglas, 1997).


The Phase V evaluation has a broader population scope than the Fort Bragg study since it is not limited to the children of military personnel. It is notable that more than one-half of the children in grant communities funded between 1997 and 2003 lived in poverty and less than 25 percent lived in households with both of their biological parents. Phase V grant communities are expected to serve similar populations, and, as such, findings from Phase V are more likely to generalize to the children and families served by public agencies.


The 1999 Mental Health: A Report of the Surgeon General included a review of the effectiveness of systems of care. The report concluded that while findings are encouraging, the effectiveness of systems of care has not been demonstrated conclusively, and that the findings of the Fort Bragg study, in particular, indicate the importance of evaluating the impact of changes at the system level on practice. The report’s findings indicate that further research needs to focus on practice-level issues, and examine the relationship between changes at the system level and changes at the practice level to demonstrate that services delivered within a system of care result in improved clinical outcomes relative to services delivered within traditional systems.


The New Freedom Commission on Mental Health published Achieving the Promise: Transforming Mental Health Care in America Final Report in 2003. This report outlined six goals developed by the New Freedom Commission to transform the mental health care delivery system in the United States. The fifth goal in this report was “excellent mental health care is delivered and research is accelerated.” The New Freedom Commission’s recommendations regarding how to meet this goal included advancing the use and understanding of evidence-based practices and the ability of mental health professionals to carry out these practices as well as developing a knowledge base in understudied areas in mental health. Phase V of the national evaluation has a focus on studying evidence-based practices both in terms of how children and families perceive their use and service professionals’ familiarity and expertise in their use. In addition, data collected through the national evaluation are important to add to the field’s knowledge base.


Research on mental health services for youth suggests that higher quality mental health services may be more expensive. However, previous research has ignored the impact of mental health services on other sectors that serve youth. Michael Foster’s 2005 study of Public Costs of Better Mental Health Services for Children and Adolescents used a quasi-experimental study design to better understand the fiscal impact of system of care services for youth. Expenditures for improved mental health services in the system of care communities were significantly higher when compared to the matched non-system of care communities. However, after costs in other sectors were included, the differences in expenditures among the communities dropped significantly. The full fiscal impact of improved mental health services can be assessed only in the context of their impact on other sectors, such as juvenile justice and child welfare. Phase V of the national evaluation will focus on the development of tools to allow communities to gather the fiscal data to measure the costs of system of care services and those provided in other sectors.


Two studies conducted by the National Evaluation Team are furthering knowledge of the effectiveness of evidence-based practices in community settings. Introducing and Evaluating Parent-Child Interaction Therapy in a System of Care (Franco, Soler, & McBride, 2005) examines whether children who receive an evidence-based treatment delivered in a system of care have better outcomes and maintain those outcomes longer than children in the same system who do not receive the evidence-based treatment. A second study, Evidence-Based Treatments in the Field: A Brief Report on Provider Knowledge, Implementation, and Practice (Walrath et al., 2005), reported high familiarity with, relatively high perceived effectiveness of, and generally high use of evidence-based treatments for children in community settings. As noted above, Phase V of the national evaluation focuses on studying evidence-based practices both in terms of how children and families perceive their use and in terms of service professionals’ familiarity and expertise in their use.


As explained above, Phase V does not duplicate extant studies, but instead enhances and expands the existing knowledge base. In addition, Phase V provides information that is specific to this service program. As required by the legislation, data must be collected from the communities in which the program has been funded.


As described above in Section A.1.d, advances in the field of children’s mental health have emphasized the importance of assessing the impact of implementing evidence-based practices in systems of care and the adaptation of those practices to address diverse communities. Consequently, Phase V addresses both of these issues by including an Evidence-Based Practices Study that focuses on the impacts of provider knowledge and community readiness on the implementation of evidence-based practices. This study will increase understanding of the factors that affect the implementation and effectiveness of evidence-based practices. In addition, the Cultural and Linguistic Competence Study will address the adaptation of evidence-based practices within diverse communities.



5. INVOLVEMENT OF SMALL ENTITIES


Some of the data for this evaluation are collected from mental health, juvenile justice, public health, education, and child welfare agencies. While most data are collected from public agencies, it is possible that some organizations providing services to the target population, such as community-based organizations, not-for-profit agencies, private providers, schools, or parent groups, would qualify as small entities. The information requested is the minimum required to meet the study objectives. The site visit interview guides used in the System of Care Assessment, the Web-based surveys employed in the Sustainability Study, CQI Initiative Evaluation, and Evidence-Based Practices Study, and the interviews used in the CLC Study, CQI Initiative Evaluation, and the Evidence-Based Practices Study are the only instruments administered to the staff of small entities.



6. CONSEQUENCES IF INFORMATION COLLECTED LESS FREQUENTLY


System of Care Assessment. Data for this component are collected every 18–24 months across the 6 years of system of care community funding, documenting how the program has led to system enhancement. This information is key to examining whether improved outcomes for the children and youth served by the system can be plausibly linked to this initiative. Because systems of care change slowly, collection of system data every 18–24 months is sufficient to provide information on system implementation, organizational involvement, and relationships. If these data were collected less frequently, important interim changes would not be documented. The System of Care Assessment data collected during the evaluations in Phases I, II, III, and IV have been valuable to CMHS and the system of care communities in mapping progress and making decisions about program resources and strategies, and have been useful in identifying interim technical assistance needs. In Phase V, continued efforts are made to apply System of Care Assessment results to CMHS program decisions and technical assistance efforts.


Cross-Sectional Descriptive Study. Data for this study are collected when children, youth, and families first access the system of care, during their administrative intake procedures. Grant communities collect data on children, youth, and families, including demographics, service use, status, treatment plans, and other information. These and other data elements provide basic profile information and document diagnostic eligibility for systems of care participation. For children, youth, and families also participating in the Longitudinal Child and Family Outcome Study, however, the descriptive information that may have changed over time (e.g., family income, caregiver’s marital status) is collected at each 6-month follow-up data collection point. Failure to collect these few data elements at each follow-up interview would preclude the detection of key changes in the child’s environment that could have an important impact on the child’s clinical outcomes, service use, or family functioning. Data from the grant communities are submitted to the National Evaluation Team continuously using the ICN.


Longitudinal Child and Family Outcome Study. For this component, data are collected at intake and every 6 months for the length of the evaluation, up to 36 months. Clinicians who work with this population of children suggest that once children enter services, they are likely to experience detectable improvements within the first 6 months of services. However, it is important to demonstrate whether improvement is sustained. Assessing outcomes every 6 months allows study of the course of improvement over time so that interventions can be planned for times that are likely to yield the greatest gains. Thus, waiting 12 months to collect outcome data would miss important changes that are likely to happen in children who are still developing. On the other hand, it was the judgment of the National Evaluation’s Services Evaluation Committee and prior grant communities that quarterly data collection would be too burdensome.


The data collection schedule calls for collecting data on all children, youth, and families in the Longitudinal Child and Family Outcome Study for the duration of the evaluation. It is important to follow children and youth as long as possible to capture changes that occur as children and youth enter new developmental stages, especially adolescence and young adulthood. Of particular interest are functional outcomes that indicate whether a child is developing into a productive member of society such as completing high school, obtaining a job, and abstaining from criminal behavior. However, because some children and youth enter services (and therefore the study) later than others, the children and youth recruited into the study in the first year of data collection are followed for 36 months, while the children and youth recruited in the fourth year of data collection are followed only for 18 months.

Service Experience Study. Data for this study component are collected 6 months after intake into the evaluation and at subsequent 6-month intervals in conjunction with the Longitudinal Child and Family Outcome Study. At each data collection point, a screening question determines whether any services have been received during the previous 6-month period. If so, questions for the Multi-Sector Service Contacts—Revised Survey (MSSC-R), the Youth Services Survey for both youth and family (YSS), and the Cultural Competence and Service Provision Questionnaire (CCSP) are asked. If not, these sets of questions are skipped. This provides youth and caregiver perspectives at various stages of treatment as their needs and services change (e.g., during intensive involvement, while transitioning to less intensive services, and after formal discharge from mental health services). If these data were collected less frequently, the National Evaluation Team would not be able to track the service changes that may be linked to changes in outcomes.
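The screening question described above functions as simple branch logic in the interview protocol: the three service experience instruments are administered only when services were received in the prior 6 months. A minimal sketch follows; the instrument names come from the text, but the function itself is purely illustrative.

```python
def service_experience_instruments(received_services_past_6_months):
    """Screening question gates the service experience module: if no
    services were received in the previous 6 months, the MSSC-R, YSS,
    and CCSP question sets are skipped entirely."""
    if not received_services_past_6_months:
        return []  # skip all three instruments
    return ["MSSC-R", "YSS (youth and caregiver versions)", "CCSP"]

print(service_experience_instruments(True))
print(service_experience_instruments(False))  # []
```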


Sustainability Study. Data on sustainability are collected from representatives of all award communities in evaluation years 3, 4, 5, and 6. It is necessary to collect these data at multiple points during the latter half of programs’ funding cycle to assess the progress being made toward sustaining funding for continued operation during the funding period and toward sustaining programs after the funding cycle ends. Evaluation of sustainability over time is needed because the amount of nonfederal funds required increases each year, as does the developmental stage of the systems of care. This makes each evaluation point distinct from previous points and will yield important information on the process of becoming increasingly independent of Federal support, the critical stages in efforts toward sustainability, and where in the process potential barriers to sustainability are most likely to arise. Assessing sustainability only at the end of the funding cycle would show whether a grant community had achieved sustainability but would not provide insight into the process of becoming sustainable or the barriers and facilitators along the way. The final survey administration and at least one of the other administrations will occur in the same year as programs’ System of Care Assessment, and having these complementary data from the same points in time will permit a more comprehensive understanding of sustainability efforts at each grant community.


Services and Costs Study. The Services and Costs Study is tasked to:

  • describe the services provided to children, youth, and families through systems of care,

  • identify service use patterns and estimate the associated costs of these services,

  • determine the cost-effectiveness of the systems of care program model, and

  • explore the relationship between service use and outcomes by linking services and costs data with outcomes data collected in the Longitudinal Child and Family Outcomes Study.


If services and costs data are not collected from the beginning of service delivery, within a consistent data structure across all grant communities, the ability to accomplish these study goals is seriously diminished. SAMHSA is often asked to demonstrate the cost-effectiveness or cost-benefit of this grant program. Without complete and consistent data from all communities, the validity of these types of cost analyses would be compromised.


Data collection for this study involves on-going data accumulation beginning when the grant communities initiate services within their systems of care program. Some grant communities currently collect this information electronically as part of their normal program procedures, some communities currently collect it on paper, and some communities are not yet collecting this information.


The national evaluation’s Phase V Services and Costs Study asks communities to collect services and costs data routinely as services are delivered. Transfer of these data to the national evaluation will occur at different times, depending on how communities enter and maintain the data. Communities that already enter these data in their existing data systems will be required to extract, recode, and transfer their cumulative data to the National Evaluation Team at the end of each fiscal year, beginning in evaluation year 4. Communities that elect to use the Flex Funds Tool for their flexible funds expenditures will be required to enter data from the beginning of service delivery and transfer their cumulative data to the National Evaluation Team at the end of each fiscal year, beginning in evaluation year 4. Communities that elect to use the Services and Costs Data Tool will provide their data to the National Evaluation Team’s central database as the data are entered into the Web-based system.


CQI Initiative Evaluation. The National Evaluation Team will collect information on the CQI Initiative from local constituents in year 4 of the national evaluation. The National Evaluation Team will administer a Web-based survey once to key constituents in each system of care community in evaluation year 4 and will conduct follow-up semi-structured interviews with survey respondents in a subset of communities. The purpose of the follow-up interviews is to obtain additional qualitative information on implementation issues. Not collecting this information would prevent a comprehensive assessment of the CQI Initiative and the extent to which it has been implemented.


Evidence-Based Practices Study. Clearance is being requested only for the Implementation Factors Substudy. The FYES involves data collected during the Longitudinal Child and Family Outcome Study interviews. The National Evaluation Team will collect data on evidence-based treatment implementation only once from program directors and administrators, direct mental health service providers, and youth and family members recruited from each of the 2005- and 2006-funded grant communities. The National Evaluation Team will conduct semi-structured telephone interviews in the last quarter of year 4 of the evaluation. It is expected that by then, key informants will have had more extensive experience with EBP implementation in their communities. Not conducting these interviews would prevent a thorough understanding of how evidence-based practices are implemented within systems of care.


Cultural and Linguistic Competence Study. Clearance for this study is being requested for the CCIOSAS and CCEBPS conducted in years 3 and 5 of the evaluation. The National Evaluation Team will collect data from key informant interviews conducted with program directors and administrators, clinical supervisors, direct service providers, and youth and family members during site visits and by telephone. The site visits and telephone calls will be conducted once for each substudy. Not collecting these data would prevent a thorough understanding of the strides communities have made in self-assessing their efforts to provide culturally and linguistically competent services and in adapting evidence-based practices to ensure that the cultural and linguistic needs of those served are met. In addition, these data will be useful in identifying technical assistance needs.


7. CONSISTENCY WITH THE GUIDELINES IN 5 CFR 1320.5(d)(2)


The data collection fully complies with the requirements of 5 CFR 1320.5(d)(2).



8. CONSULTATION OUTSIDE THE AGENCY


The notice required in 5 CFR 1320.8(d) was published in the Federal Register on April 7, 2009 (74 FR 15730), soliciting public comment on this study. SAMHSA received no comments.


Consultation on the design, instrumentation, data availability and products, and statistical aspects of the evaluation occurred continually throughout the implementation of Phases I, II, III, and IV and has continued during the implementation of Phase V. To capitalize on the experience and knowledge gained, the revisions for Phase V are based, in part, on this consultation. Since the beginning of this initiative, consultations have been sought from the following:


  • Federal representatives working in related program areas;

  • Experts in the area of child mental health services research;

  • CMHS grant communities;

  • Families caring for children with emotional and behavioral disorders;

  • Representatives of national organizations for children, families, and providers in the field (e.g., National Technical Assistance Center for Children’s Mental Health, National Mental Health Association, the Federation of Families for Children’s Mental Health, National Alliance on Mental Illness, State Mental Health Representatives for Children and Youth);

  • Experts in program evaluation, measurement, and statistical analysis; and

  • Experts in mental health service systems for American Indian/Alaskan Native children.


These consultations had several purposes: (1) to ensure continued coordination of related activities, especially at the Federal level; (2) to ensure the rigor of the evaluation design, the proper implementation of the design, and the technical soundness of study results; (3) to verify the relevance and accessibility of the data to be collected; and (4) to minimize respondent burden.


a. Federal Consultation



Input from representatives of Federal agencies involved in children’s mental health issues has been elicited throughout all phases of the national evaluation. CMHS receives input about its children’s services program from Federal offices including, but not limited to, the following: the Office of Special Education Programs, DoE; the Office of Juvenile Justice and Delinquency Prevention, DoJ; the Office of Disability, DHHS; and the Division of Adolescent and School Health, CDC. (See Attachment 2.A for a list of the participants in the Federal/National Partnership for Children’s Mental Health and their affiliations and telephone numbers.)


These offices are involved in a public-private interagency partnership group to ensure that services for children with serious emotional disturbance and their families are coordinated at the Federal level and that evaluation results are useful to a wide audience. Specifically, representatives from the listed Federal agencies have convened to develop strategies for coordinated training, technical assistance, and culturally competent services to communities across the country.


In addition, SAMHSA, the parent agency of CMHS, requires that its other two constituent centers, the Center for Substance Abuse Treatment (CSAT) and the Center for Substance Abuse Prevention (CSAP), conduct an internal review of the Annual Report to Congress on the Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program. Evaluation specialists at the CDC, NIMH, and the Office of the Assistant Secretary for Planning and Evaluation (ASPE) of DHHS have also reviewed and provided comments on the national evaluation. Furthermore, representatives of many Federal agencies participate in the national evaluation’s Services Evaluation Committee. (See Attachment 2.B for a list of methodological consultants and current members of the national evaluation’s Services Evaluation Committee.) Collaboration with NIMH led to the release of a program announcement (PA–00–135; Effectiveness, Practice, and Implementation in CMHS’ Children’s Service Sites) on September 21, 2000, by NIMH for the conduct of research studies on services delivered to children, adolescents, and their families in currently or previously CMHS-funded system of care communities. This mechanism encourages studies examining the nature and impact of routine clinical practice, and factors related to successful implementation of treatments or services. This program announcement addresses recommendations set forth in the NIMH report, “Bridging Science and Service: A Report by the National Advisory Mental Health Council’s Clinical Treatment and Services Research Workgroup,” and in the NIMH Child and Adolescent Services Research Strategic Planning Report. A revised program announcement (PA–04–019; Effectiveness, Practice, and Implementation in CMHS’ Children’s Service Sites) was released on November 10, 2003, by NIMH. The scope of this program announcement was broadened to include research in communities with Safe Schools Healthy Students grants.


b. Expert Consultation


The Services Evaluation Committee of the national evaluation, a workgroup of expert consultants, was organized to provide technical guidance and review for Phase I of the evaluation. The Services Evaluation Committee continues to provide input regarding the enhanced design and instrumentation for Phases II, III, IV, and V. Recommendations made by this group have influenced changes applied to the Phase V instrumentation. Services Evaluation Committee members have combined expertise in children’s mental health, the delivery of children’s mental health services, and the evaluation of systems of care. (See Attachment 2.B for a list of current Services Evaluation Committee members.)


Most of the individuals invited to provide consultation were chosen because of their involvement in past or current studies of children’s mental health service systems. During previous phases, input has also been received from the National Association of State Mental Health Program Directors and the State Mental Health Representatives for Children and Youth.


c. Grant Community Consultation



Previously funded grant communities have been key providers of input for all phases of the evaluation design. For the design of Phases IV and V, grant community input was used in the development of the instrument package. Project directors and evaluators from Phase II, III, and IV grant communities participated in the Measures Review Meeting. These participants helped determine which instruments are most appropriate for each component of the evaluation. In addition, evaluators and project directors from all grant communities were given the opportunity to provide their input to the recommendations made at the Measures Review Meeting. Additional grant community feedback was received during close-out site visits conducted with Phase II and III communities in which evaluation processes and data utilization were reviewed.


Several representatives from grant communities also participate in the Services Evaluation Committee of the national evaluation, and these members offer the grant community perspective on how research goals can be achieved at the grant community level with the least disruption. Grant community members have also provided input to several of the Phase V instruments through participation on special advisory groups. CMHS initiated an annual consumer survey of the Phase II and III grant communities in January and February 2002, and of the Phase IV and V grant communities in May 2007. The survey was designed to assess satisfaction with implementation of the national evaluation and the role of the National Evaluation Team in this implementation (OMB Control # 0930–0197). The survey also asked for feedback from grant community evaluators regarding desired changes in study design. CMHS repeated this survey in April 2003 for Phases II and III. CMHS received feedback from evaluators in almost all grant communities and synthesized these data for use in quality improvement efforts.



d. Youth and Family Consultation


Critical to the CMHI principles is the role of youth and family caregivers as active constituents in the system of care. That philosophy has been extended to all phases of the evaluation design in several ways. Caregivers participated on the Services Evaluation Committee and gave early input to the overall design. Caregivers also reviewed the instrumentation and key features of the evaluation design to ensure sensitivity to parent issues and concerns as well as to maximize clarity of meaning and to assess the feasibility of administering the questionnaires. Input from family members participating in assessment interviews indicated a need to reduce the length of the interview, and this recommendation is reflected in the Phase V instrument package. The Phase V package is modified only slightly from Phase IV, which caregivers and youth found to be acceptable in terms of length and content. Grant communities systematically solicit feedback from family members; hence the family perspective is also included in comments and consultation from grant communities. The evaluation team has a formal relationship with the Federation of Families for Children’s Mental Health to facilitate systematic and ongoing input to the evaluation. In April 2008, a diverse group of youth and youth coordinators from system of care communities across the country came together to create YADA (Youth Advisors Driving Action), a youth advisory group to the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program. YADA was created to ensure the presence of a strong youth voice throughout the national evaluation.



9. PAYMENT TO RESPONDENTS


As with previous phases, Phase V of the national evaluation uses a research-based approach to evaluation and, as such, requires participation of youth and families beyond their receipt of services in their system of care programs. Consequently, remuneration is essential to ensure good response rates across all study components.


Remuneration levels in the System of Care Assessment, Longitudinal Child and Family Outcome Study, and Sustainability Study for Phase V are the same as those currently approved in Phase IV.


System of Care Assessment. Four caregivers of children and youth who are receiving services in each system of care community are interviewed during each System of Care Assessment site visit; the National Evaluation Team provides them a payment of $25 at the time of their interviews in compensation for the additional burden and potential inconvenience. Two youth who are receiving services in each community are also interviewed during each site visit and receive a payment of $15 on the same basis. These incentives improve the validity and reliability of caregivers’ and youths’ responses by reducing reluctance to take the time to be interviewed thoroughly.


Cross-Sectional Descriptive Study. Data for the Cross-Sectional Descriptive Study may be collected by intake, care coordination, or evaluation team staff members through administrative or management information system record review. In some communities, some of these data may be obtained through in-person interviews with caregivers. As this information is usually collected at intake into any behavioral health care system, no incentives, payment, or gifts are proposed as part of this study.


Longitudinal Child and Family Outcome Study and Service Experience Study. The National Evaluation Team strongly recommends that grant communities remunerate caregivers and youth who participate in the Longitudinal Child and Family Outcome Study $20 each at each administration. Remuneration is standard practice in this type of longitudinal research to acknowledge participants’ value to the study. It is essential to help maximize participation rates, particularly given the additional time being asked of families who already face multiple challenges and demands on their time in caring for their children and youth with serious emotional disturbance. Youth and caregivers who participate in the Longitudinal Child and Family Outcome Study are asked to complete more assessments than ordinarily are required in the course of receiving services. Completing the instruments at the time of entry to services and at subsequent follow-up points requires the evaluation participants to spend time away from other activities. The combination of the number of instruments and their periodicity creates a burden on the caregivers and youth that exceeds the burden that ordinarily would be placed on them if they were seeking services not associated with this evaluation.


Sustainability Study. As with the Phase II, III, and IV Sustainability Survey, individuals asked to complete the Phase V Sustainability Survey receive a token incentive (e.g., a refrigerator magnet) when they are informed about the survey, to encourage survey completion.


Services and Costs Study. Data for the Services and Costs Study are collected entirely from administrative and fiscal records by staff paid by grant funding. No incentives, payment, or gifts are proposed as part of this study.


CQI Initiative Evaluation. Individuals asked to complete the CQI Initiative Survey and the CQI Interviews will be offered an incentive to encourage completion of the data collection activities. Specifically, survey respondents will receive a $20 pre-paid credit card, and interview participants will receive a $35 pre-paid credit card. These incentives may increase the validity and reliability of participants’ responses by lessening participants’ resistance to taking the time to complete the survey and interview, thus increasing the response rate.

Evidence-Based Practices Study. Youth and family members who decide to participate in the semi-structured telephone interviews will be provided an incentive in compensation for the additional burden and potential inconvenience. After completing their participation, they will be mailed a $20 gift certificate redeemable at a specific online retail store. Participating administrators and providers will be acting within the course of their normal professional duties (some of which are grant-funded) and thus will not receive incentives.


Cultural and Linguistic Competence Study. Youth and family members who participate in focus groups and interviews are compensated for their time: family members receive a payment of $50 and youth receive $25. Each site is expected to recruit 10 to 15 youth and caregivers. These incentives help improve the validity and reliability of youths’ and caregivers’ responses by reducing resistance to taking the time to be interviewed thoroughly.



10. ASSURANCE OF CONFIDENTIALITY


System of Care Assessment. Data collection for the System of Care Assessment occurs via face-to-face interviews. Because respondents’ identities are known, an active informed consent process is used to ensure that participants’ rights are protected. (See Attachments 3D1–3D23 for informed consent forms.) To ensure privacy, all System of Care Assessment Study supporting documents and protocols are stored in a locked cabinet in a locked storage room. Institutional Review Board (IRB) approval for this study was received and is renewed each year. The National Evaluation Team uses data from this study to examine whether communities have implemented systems of care in accordance with system of care program theory and to document how systems develop over time to meet the needs of the children, youth, and families they serve. A particular interest is whether services are delivered in an individualized, family-driven, coordinated manner, and whether the system involves multiple child- and youth-serving agencies.


Cross-Sectional Descriptive Study, Longitudinal Child and Family Outcome Study, and Service Experience Study. Phase V requires collecting descriptive and clinical data from children, youth, and families. In all grant communities, site staff collect the data. These staff are responsible for developing procedures that protect the privacy of all evaluation participants during data collection, data storage, and reporting of all information obtained through data collection activities. These procedures include limiting the number of individuals who have access to identifying information, using locked files to store hardcopy forms, assigning unique code numbers to each participant to ensure anonymity, and implementing guidelines pertaining to data reporting and dissemination.


Because of the sensitivity of the information that is collected, CMHS requires that all grant communities establish a system whereby data are gathered, stored, and accessed in the most confidential manner possible. The National Evaluation Team provides each grant community with a coding schema that the community uses to generate code numbers to assign to individual respondents, and trains staff responsible for data collection on the process of developing codes and linking them to individual respondents. Grant communities are instructed to maintain a list of the codes and their assignment to individual respondents. A secure, stand-alone software application that allows grant community evaluation staff to store codes with respondent names is also provided to grant communities. This application is password protected, and grant communities are instructed to limit database access to only those on-site evaluation staff who need this information. If a paper list is maintained, the list linking the assigned codes to respondent names is kept in a locked cabinet, and only the on-site data collection staff have access to it. The database or list will be maintained for the duration of the CMHS program so that the data can be linked back to the identified child, youth, and family throughout the data collection process. When the project is completed, the databases or lists will be destroyed. The National Evaluation Team developed this coding system to facilitate the tracking of children and youth during their involvement with the evaluation and to ensure that no personal identifying information from the grant communities would need to be made available to either the National Evaluation Team or CMHS.
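For illustration, a minimal sketch of this pattern in Python appears below. The evaluation’s actual coding schema and stand-alone application are not specified in this document, so the code format, function name, and registry structure shown here are all hypothetical.

```python
# Illustrative sketch only; the evaluation's actual coding schema and software
# are not described in this document. The pattern: assign each respondent a
# non-identifying code, keep the code-to-name link in a separate restricted
# store, and place only the code on records shared for analysis.
import secrets


def assign_respondent_code(community_id: str, registry: dict) -> str:
    """Generate a unique, non-identifying code, e.g., 'C12-483910'."""
    while True:
        code = f"{community_id}-{secrets.randbelow(1_000_000):06d}"
        if code not in registry:
            return code


# The registry linking codes to names would exist only inside the
# password-protected, stand-alone application (or a locked paper list).
registry = {}
code = assign_respondent_code("C12", registry)
registry[code] = "respondent name (restricted store only)"

# Records transmitted to the National Evaluation Team carry the code alone.
record = {"respondent_code": code, "instrument": "CIQ-IC", "wave": 1}
```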


The security and privacy of data entered and managed on the Internet-based ICN are also assured. Access to the ICN is password protected, and the ICN uses data encryption to further enhance security and protect privacy. Further, the project, including the ICN system, operates under an ADP/IT security plan approved by CMHS to ensure that project data are protected.


Each grant community has implemented an active consent procedure that informs participants of the purpose of the evaluation, describes what their participation entails, and addresses the maintenance of privacy as described above. Interviewers obtain informed assent from participating youth (ages 11–17 years), and informed consent is obtained from youth who have reached age 18 at follow-up data collection. Intake and/or evaluation staff obtain written informed consent or assent from youth and families at the point of entry into services. Each grant community has obtained local Institutional Review Board (IRB) approval for the informed consent or assent procedures used in this evaluation. Grant communities are instructed to determine whether updates to consents are required at each data collection point: the legal custody of a child or youth may change, a child may become old enough to participate in a youth interview, a youth may become an emancipated minor or age into adult status, and local IRBs may require regular updates. The National Evaluation Team uses data from the Cross-Sectional Descriptive Study to describe child, youth, and family characteristics of all children entering CMHS-funded systems of care, and data from the Longitudinal Child and Family Outcome Study to examine how the system affects child and youth clinical and functional status and family life. Data from the Service Experience Study are used to investigate the extent to which families experience system of care principles and to assess intervention fidelity, satisfaction with services, cultural competence, accessibility and coordination of services, perceived helpfulness of services, and the impact of services on family members’ ability to work outside the home.


Sustainability Study. Data collection for the Sustainability Study occurs via the Web-based Sustainability Survey. To protect the rights and privacy of respondents, an active informed consent process occurs. A letter mailed to potential participants explains the survey, including the voluntary nature of survey completion, the privacy of responses, and their risks, benefits, and rights as respondents. The letter advises recipients that, before completing the survey, they will be asked to indicate their agreement to participate by checking a box on the Web survey; information about the study and participant rights is presented in the Web survey before this check box. The letter and the Web survey also provide contact information in case the survey recipient has questions or desires clarification prior to participation. If the individual does not have e-mail access, a packet is sent by regular mail containing a cover letter, an informed consent form, a survey, and a return envelope (see Attachments 3.D.7, 4.B.3, and Instrument E.1); the cover letter indicates that the respondent is to return the informed consent form and the survey. (See Attachment 4.B.5 for Web screen shots of the survey.) Once study activities are concluded, the database containing contact information for respondents is destroyed, in keeping with IRB requirements. IRB approval for this study was received and is renewed each year. Data from this study are used to assess characteristics and factors related to sustainability of infrastructure during the life of the award and after the Federal funding cycle is completed.


Services and Costs Study. The National Evaluation Team trains all Phase V grant communities to include specific language in their consent and assent forms to describe the services and costs data that will be accessed through the child/youth’s records and shared with the National Evaluation Team. Although grant communities may work with personal identifying information to extract and link electronic records, no personally identifying information will be included in any data transferred to the National Evaluation Team for this study, other than the child/youth’s national evaluation child identification number.


For those communities electing to enter data in the Flex Funds Tool or the Services and Costs Data Tool, access to these applications is password protected to protect privacy. When data are transferred to the National Evaluation Team, data files are encrypted to protect privacy during electronic transfer.
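As a minimal sketch of encrypting a file before transfer (the document does not name the encryption tool used, so the widely available Python cryptography package and the file name below are assumptions):

```python
# Illustrative sketch only; the actual encryption tool is not specified in
# this document. Shown: encrypting a data file before electronic transfer so
# it is unreadable in transit without the key.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # shared with the recipient out-of-band
cipher = Fernet(key)

with open("services_costs_extract.csv", "rb") as f:   # hypothetical file
    ciphertext = cipher.encrypt(f.read())

with open("services_costs_extract.csv.enc", "wb") as f:
    f.write(ciphertext)  # only this encrypted file is transferred

# The recipient recovers the data with the same key:
# plaintext = Fernet(key).decrypt(ciphertext)
```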


The Phase V Services and Costs Study was submitted to Macro International Inc.’s IRB for review. The IRB determined this study to be exempt from review requirements because the data represent secondary analysis of administrative records. The National Evaluation Team uses data from this study to describe the types of services used by children, youth, and families in systems of care, their service use patterns, and the costs associated with these services.


CQI Initiative Evaluation. Data collection for the CQI Initiative Survey will occur via the Internet. To protect the rights and privacy of respondents, an active informed consent process will occur. A letter containing instructions for completing the Web survey will be sent to potential respondents by e-mail or ground mail. The letter will explain the survey and consent process, including the voluntary nature of survey completion, the privacy of responses, and the respondent’s right to discontinue participation at any time. The letter will also explain that participants may be contacted following completion of the Web survey to complete a semi-structured interview (see below). When respondents log on to the survey Web site, they will be directed to read a consent form prior to beginning the survey and instructed to click a box to indicate their consent to participate. Both the letter and the Web survey will provide contact information in case the respondent has questions before, during, or after participation. If the potential respondent does not have e-mail access, a packet will be sent by ground mail containing a cover letter, an informed consent form, a hard copy survey, and a return envelope; the cover letter will indicate that the respondent is to return the informed consent form and the survey. After respondents have been identified for interviews, any ID link between their surveys and their contact information will be destroyed to protect anonymity. A subset of communities will be targeted to participate in the CQI Initiative Interviews: up to six (20 percent) of the 30 funded communities will be targeted for follow-up interviews. Within each selected community, respondents who completed the Web survey will be contacted via telephone or e-mail to solicit participation in the semi-structured interview. Individuals who agree to participate in an interview will be sent a consent form via e-mail, ground mail, or facsimile and asked to read, sign, and return it via ground mail or facsimile. To further assure privacy, all instruments and consent forms for the CQI Initiative Evaluation undergo an IRB approval process through Macro International, Inc., and approval will be renewed each year. In addition, the database containing contact information for respondents will be destroyed after study activities are concluded, in keeping with IRB requirements. Data from this study will be used to assess how the CQI Initiative approach is being pursued by communities, how communities use the CQI Progress Reports and associated technical assistance (TA) in their efforts toward CQI, how satisfied communities are with the CQI approach, and what communities’ perceptions are of the effectiveness and utility of the CQI Progress Reports and TA provision.



Evidence-Based Practices Study. One-on-one and small group telephone interviews will be conducted to collect data for the Implementation Factors Substudy. Specific informants from all three levels of system of care constituents (system, service, and consumer, i.e., program administrators, direct service providers, youth, and family members) will be identified through collaboration with the project director and local evaluator. Initial contact will be made with the project director and local evaluator to explain the study, solicit potential respondents for the interviews, and obtain accurate contact information, including e-mail addresses. To protect the rights and privacy of respondents, an active informed consent process will take place with all participants. The consent forms will explain the purpose of the study, including the voluntary nature of participation, the privacy of responses, and participants’ risks, benefits, and rights as respondents. The consent forms will also indicate that only the research team will have access to the information gathered through the telephone interviews and that the data will be kept in locked cabinets. As part of the active informed consent process, the National Evaluation Team will send an invitation letter and informed consent form via e-mail to youth and family members. If potential participants do not have e-mail access, a packet will be sent by regular mail containing an invitation letter, an informed consent form, and a return envelope. Youth and family members will be required to review, sign, and fax or e-mail the consent forms to the National Evaluation Team. Once study activities are concluded, the database containing contact information for respondents will be destroyed, in keeping with IRB requirements. This study will go through Institutional Review Board (IRB) review, and approval will be renewed each year. Data from this study will be used to assess the implementation of EBP among Phase V communities, the contextual factors that support or inhibit the implementation of evidence- and practice-based treatments, and youth and family awareness of and service experience with EBT.


Cultural and Linguistic Competence Study. Prior to data collection, the two substudies will be submitted for human subjects review to the Institutional Review Board of Macro International. Data collection for both substudies will occur via face-to-face interviews, focus groups, and telephone interviews. To ensure privacy and protect the rights of participants, an active informed consent process will take place. The consent form will explain the purpose of the study, including the voluntary nature of participation, the privacy of responses, and participants’ risks, benefits, and rights as respondents. The consent forms will also indicate that only the research team will have access to the information gathered through the site visits and other data collection efforts and that data will be kept in locked cabinets. Participants in the telephone interview will also be required to review and sign a consent form, then fax or e-mail the form to the National Evaluation Team. This study will go through Institutional Review Board (IRB) review, and approval will be renewed each year. Data from this study will be used to assess how cultural and linguistic community characteristics inform system of care implementation; the barriers and facilitators encountered when implementing a culturally and linguistically competent system of care; how system of care communities self-assess their emerging CLC practices at the infrastructure and service delivery levels; how results are used to improve system of care practice; and the extent to which diverse characteristics of system of care communities shape system of care service delivery.


Federal Certificate of Confidentiality

As in previous phases of the national evaluation, to further protect Phase V study participants, all grant communities and the National Evaluation Team obtain a Federal Certificate of Confidentiality, authorized by Section 301(d) of the Public Health Service Act. This certificate provides additional protection of the data from civil and criminal subpoena. Additionally, the national evaluation conforms to all requirements of the Privacy Act of 1974, under the System of Records: Alcohol, Drug, and Mental Health Epidemiological, and Biometric Research Data, DHHS, #09–30–0036; the most recent publication in the Federal Register occurred on January 19, 1999 (64 FR 2914). Client records at the sites are also covered under this Privacy Act System of Records.



11. QUESTIONS OF A SENSITIVE NATURE


Cross-Sectional Descriptive Study, Longitudinal Child and Family Outcome Study, and Service Experience Study. Because this project concerns services to children and youth with serious emotional disturbance and their families, it is necessary to ask questions that are potentially sensitive. It should be noted, however, that only information central to the study is sought. Questions address dimensions such as child/youth emotions, behavior, social functioning, school performance, and involvement in unlawful activities. The answers to these questions are used to determine baseline status and to measure changes in these areas after entry into the system of care. Because each grant community must keep data on child, youth, and family status and service use, as well as treatment plans and other information, the data collection required for the national evaluation does not introduce new, sensitive domains of inquiry but parallels standard procedures in the field of children’s mental health.


Although substance use data are sensitive in nature, they do not represent a new domain of inquiry. The frequent comorbidity of substance use and serious emotional disturbance among adolescent populations, and the increased ability to record dual diagnoses, are cited in the case management and mental health literatures. Because of the increased risk of substance use by children and youth with mental illness, Phase V system of care communities are increasing their focus on children and youth with comorbid substance use. Consequently, it is necessary to collect data about substance use from children and youth to determine the prevalence of this comorbidity and to track changes in substance use after entry into a system of care.


In addition to information on child/youth clinical status and social functioning, the National Evaluation Team asks families other questions of a sensitive nature, including questions related to family functioning and caregiver strain. These questions are included in response to growing evidence of the powerful role families play in shaping children’s use of services and their related outcomes. This is particularly important in systems of care, where a basic tenet is to involve families in treatment planning and service delivery. Moreover, representatives of family organizations who consulted with the national evaluators during Phase III identified a lack of information on family life as a weakness in previous studies.


Before collecting data, each grant community obtains active consent from caregivers, and interviewers obtain assent from youth. In that process, respondents are made aware that the information they provide is maintained in the strictest confidence and that they can withdraw their participation at any time. Similarly, respondents can freely choose to refrain from answering any questions they find objectionable.



12. ESTIMATES OF ANNUALIZED HOUR BURDEN


In accordance with the evaluation design, the data for the 30 communities in Phase V of the national evaluation cover a period of six years, beginning in October 2006 and ending in September 2012. Data collection for the 25 communities funded in 2005 covers a period of five years, beginning in October 2006 and ending in September 2011; data collection for the five communities funded in 2006 also covers a period of five years, beginning in October 2007 and ending in September 2012.


Table 3 shows the burden associated with the Phase V evaluation of the 30 grant communities. For measures previously cleared by OMB, burden estimates presented in Table 3 are based on information supplied by grant communities. Measures that have been revised during Phase V have already been used in the national evaluation, and average burden estimates are based on that experience. These measures include the Caregiver Information Questionnaire (CIQ-IC), the Education Questionnaire—Revised (EQ-R), and the Multi-Sector Service Contacts—Revised (MSSC-R); although minor changes have been made to these instruments, the changes do not affect the burden previously estimated. The burdens for the surveys to be used for the CQI Initiative Evaluation, Evidence-Based Practices Study, and Cultural and Linguistic Competence Study were estimated from typical measures used for these purposes. The bases for hour and cost burden estimates are included in the footnotes below the tables.






Table 3. Detailed Estimate of Respondent Burden

Note: Total burden is annualized over a 3-year period. Bracketed numbers refer to the footnotes following Table 3b.

| Instrument | Respondent | Number of Respondents | Total Average Number of Responses per Respondent | Hours per Response | Total Burden Hours | 3-Year Average Annual Burden Hours | Hourly Wage Rate ($) | Total Cost per Year ($) |
|---|---|---|---|---|---|---|---|---|
| System of Care Assessment |  |  |  |  |  |  |  |  |
| Interview Guides and Data Collection Forms | Key site informants | 630 [1] | 1 | 1.00 | 630 | 210 | 19.23 [2] | 4,038 |
| Interagency Collaboration Scale (IACS) | Key site informants | 630 | 1 | 0.13 | 82 | 27 | 19.23 | 519 |
| Longitudinal Child and Family Outcome Study |  |  |  |  |  |  |  |  |
| Caregiver Information Questionnaire (CIQ-IC) | Caregiver | 8,810 [3] | 1 | 0.283 | 2,493 | 831 | 9.93 [4] | 8,252 |
| Caregiver Information Questionnaire Followup (CIQ-FC) | Caregiver | 8,810 | 2 [5] | 0.200 | 3,524 | 1,175 | 9.93 | 11,668 |
| Caregiver Strain Questionnaire (CGSQ) | Caregiver | 8,810 | 3 | 0.167 | 4,414 | 1,471 | 9.93 | 14,607 |
| Child Behavior Checklist (CBCL)/Child Behavior Checklist 1½–5 (CBCL 1½–5) | Caregiver | 8,810 | 3 | 0.333 | 8,801 | 2,934 | 9.93 | 29,135 |
| Education Questionnaire—Revised (EQ-R) | Caregiver | 8,810 | 3 | 0.333 | 8,801 | 2,934 | 9.93 | 29,135 |
| Living Situations Questionnaire (LSQ) | Caregiver | 8,810 | 3 | 0.083 | 2,194 | 731 | 9.93 | 7,259 |
| The Family Life Questionnaire (FLQ) | Caregiver | 8,810 | 3 | 0.050 | 1,322 | 441 | 9.93 | 4,379 |
| Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS-2C) | Caregiver | 7,488 [6] | 3 | 0.167 | 4,193 | 1,398 | 9.93 | 13,882 |
| Columbia Impairment Scale (CIS) | Caregiver | 8,369 [7] | 3 | 0.083 | 2,084 | 695 | 9.93 | 6,901 |
| The Vineland Screener (VS) | Caregiver | 1,321 [8] | 3 | 0.250 | 330 | 110 | 9.93 | 1,094 |
| Delinquency Survey—Revised (DS-R) | Youth | 5,286 [9] | 3 | 0.167 | 2,648 | 883 | 7.25 [10] | 6,402 |
| Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS-2Y) | Youth | 5,286 | 3 | 0.167 | 2,648 | 883 | 7.25 | 6,402 |
| GAIN-Quick Substance Related Issues (GAIN Quick-R) | Youth | 5,286 | 3 | 0.083 | 1,316 | 439 | 7.25 | 3,183 |
| Substance Use Survey—Revised (SUS-R) | Youth | 5,286 | 3 | 0.100 | 1,586 | 529 | 7.25 | 3,835 |
| Revised Children’s Manifest Anxiety Scales (RCMAS) | Youth | 5,286 | 3 | 0.050 | 793 | 264 | 7.25 | 1,914 |
| Reynolds Adolescent Depression Scale—Second Edition (RADS-2) | Youth | 5,286 | 3 | 0.050 | 793 | 264 | 7.25 | 1,914 |
| Youth Information Questionnaire—Baseline (YIQ-I) | Youth | 5,286 | 1 | 0.167 | 883 | 294 | 7.25 | 2,132 |
| Youth Information Questionnaire—Follow-up (YIQ-F) | Youth | 5,286 | 2 | 0.167 | 1,766 | 589 | 7.25 | 4,270 |
| Service Experience Study |  |  |  |  |  |  |  |  |
| Multi-Sector Service Contacts—Revised (MSSC-R) | Caregiver | 8,810 | 2 [11] | 0.250 | 4,405 | 1,468 | 9.93 | 14,577 |
| Evidence-Based Practice Measure (EBPEM) | Caregiver | 8,810 | 2 | 0.167 | 2,943 | 981 | 9.93 | 9,741 |
| Cultural Competence and Service Provision Questionnaire (CCSP) | Caregiver | 8,810 | 2 | 0.167 | 2,943 | 981 | 9.93 | 9,741 |
| Youth Services Survey—Family (YSS-F) | Caregiver | 8,810 | 2 | 0.117 | 2,062 | 687 | 9.93 | 6,822 |
| Youth Services Survey (YSS) | Youth | 5,286 | 2 | 0.083 | 877 | 292 | 7.25 | 2,117 |
| Services and Costs Study |  |  |  |  |  |  |  |  |
| Flex Funds Data Dictionary | Local staff compiling/entering data | 2,670 [12] | 3 [13] | 0.033 | 218 | 73 | 24.04 [14] | 1,755 |
| Services and Costs Data Dictionary | Local staff compiling/entering data | 10,680 [15] | 100 [16] | 0.033 | 29,073 | 9,691 | 26.44 [17] | 256,230 |
| Sustainability Study |  |  |  |  |  |  |  |  |
| Sustainability Survey—Caregiver | Caregiver [18] | 52 | 2 | 0.75 | 78 | 26 | 9.93 | 258 |
| Sustainability Survey—Provider | Provider/Administrator [18] | 156 | 2 | 0.75 | 234 | 78 | 26.44 | 2,062 |
| CQI Benchmarking Initiative Evaluation |  |  |  |  |  |  |  |  |
| CQI Initiative Survey | Key community staff | 150 | 1 | 0.5 | 75 | 25 | 26.44 | 661 |
| CQI Initiative Interview Guide | Key community staff | 50 | 1 | 1.0 | 50 | 17 | 26.44 | 449 |
| Evidence-Based Practices Study |  |  |  |  |  |  |  |  |
| The Implementation Factors Discussion Guide | SOC leadership team member | 90 | 1 | 0.75 | 68 | 23 | 26.44 | 608 |
| The Implementation Factors Discussion Guide | Provider | 60 | 1 | 0.75 | 45 | 15 | 26.44 | 397 |
| The Implementation Factors Discussion Guide | Caregivers | 30 | 1 | 0.5 | 15 | 5 | 9.93 | 50 |
| Cultural and Linguistic Competence Study |  |  |  |  |  |  |  |  |
| CCIOSAS – Beneficiaries of Self-Assessment Findings | Provider | 40 | 1 | 1.0 | 40 | 13 | 26.44 | 344 |
| CCIOSAS – Beneficiaries of Self-Assessment Findings | Administrators/Managers | 20 | 1 | 1.5 | 30 | 10 | 26.44 | 264 |
| CCIOSAS – Beneficiaries of Self-Assessment Findings | Caregivers | 40 | 1 | 0.75 | 30 | 10 | 9.93 | 99 |
| CCIOSAS – Beneficiaries of Self-Assessment Findings | Youth | 40 | 1 | 0.75 | 30 | 10 | 7.25 | 73 |
| CCIOSAS – Participants in Self-Assessments | Provider | 40 | 1 | 1.0 | 40 | 13 | 26.44 | 344 |
| CCIOSAS – Participants in Self-Assessments | Administrators/Managers | 20 | 1 | 1.5 | 30 | 10 | 26.44 | 264 |
| CCIOSAS – Participants in Self-Assessments | Caregivers | 16 | 1 | 0.75 | 12 | 4 | 9.93 | 40 |
| CCIOSAS – Participants in Self-Assessments | Youth | 16 | 1 | 0.75 | 12 | 4 | 7.25 | 29 |
| CCIOSAS – Users of Self-Assessment Findings | Provider | 40 | 1 | 1.0 | 40 | 13 | 26.44 | 344 |
| CCIOSAS – Users of Self-Assessment Findings | Administrators/Managers | 20 | 1 | 1.5 | 30 | 10 | 26.44 | 264 |
| CCIOSAS – Users of Self-Assessment Findings | Caregivers | 16 | 1 | 0.75 | 12 | 4 | 9.93 | 40 |
| CCIOSAS – Users of Self-Assessment Findings | Youth | 16 | 1 | 0.75 | 12 | 4 | 7.25 | 29 |
| CCIOSAS – Telephone Interview | Providers | 2 | 1 | 1.0 | 2 | 0.67 | 26.44 | 18 |
| CCIOSAS – Telephone Interview | Administrators/Managers | 3 | 1 | 1.0 | 3 | 1 | 26.44 | 26 |
| CCEBPS – Managers of EBP Process | Providers | 16 | 1 | 1.0 | 16 | 5 | 26.44 | 132 |
| CCEBPS – Managers of EBP Process | Administrators/Managers | 20 | 1 | 1.5 | 30 | 10 | 26.44 | 264 |
| CCEBPS – Providers of EBP | Providers | 40 | 1 | 1.0 | 40 | 13 | 26.44 | 344 |
| CCEBPS – Families and Youth | Caregivers | 40 | 1 | 0.75 | 30 | 10 | 9.93 | 99 |
| CCEBPS – Families and Youth | Youth | 40 | 1 | 0.75 | 30 | 10 | 7.25 | 73 |
| CCEBPS – Telephone Interview | Providers | 2 | 1 | 1.0 | 2 | 0.67 | 26.44 | 18 |
| CCEBPS – Telephone Interview | Administrators/Managers | 3 | 1 | 1.0 | 3 | 1 | 26.44 | 26 |

Table 3a. Summary Estimate of Respondent Burden

Summary of Burden Estimates for 3 Years

| Respondent Group | Number of Distinct Respondents | Average Number of Responses per Respondent | Total Number of Responses | Average Burden per Response (hours) | Total Burden (hours) | Total Cost ($) |
|---|---|---|---|---|---|---|
| Caregivers | 8,810 | 2.46 | 21,673 | 2.36 | 51,147 | 507,890 |
| Youth | 5,286 | 2.56 | 13,532 | 0.99 | 13,397 | 97,128 |
| Community staff | 870 | 72.22 | 62,831 | 0.86 | 54,035 | 1,428,685 |
| Total | 14,996 |  | 98,036 |  | 118,579 | 2,033,703 |


Table 3b. Summary Estimate of Annualized Respondent Burden

Summary of Annualized Burden Estimates for 3 Years

| Respondent Group | Number of Distinct Respondents | Number of Responses per Year per Respondent | Total Number of Responses per Year | Average Burden per Response (hours) | Total Annual Burden (hours) | Annual Cost ($) |
|---|---|---|---|---|---|---|
| Caregivers | 8,810 | 0.82 | 7,224 | 2.36 | 17,049 | 169,297 |
| Youth | 5,286 | 0.85 | 4,511 | 0.99 | 4,466 | 32,376 |
| Community staff | 870 | 24.07 | 20,944 | 0.86 | 18,012 | 476,228 |
| Total | 14,996 |  | 32,679 |  | 39,527 | 677,901 |


  1. An average of 21 constituents in up to 30 grant communities will complete the System of Care Assessment interview. These constituents will include site administrative staff, providers, agency representatives, family representatives, and youth.

  2. Assuming the average annual income across all types of staff/service providers/administrators is $40,000, the wage rate was estimated using the following formula: $40,000 (annual income)/2080 (hours worked per year) = $19.23 (dollars per hour).

  3. Number of respondents across 30 grantees. Average based on a 5 percent attrition rate at each data collection point.

  4. Given that 56 percent of the families in the Phase V evaluation sample fall at or below the 2008–2009 DHHS National Poverty Level of $20,650 (based on a family of four), the wage rate was estimated using the following formula: $20,650 (annual family income)/2080 (hours worked per year) = $9.93 (dollars per hour).

  5. Average number of responses per respondent is a weighted average of the possible numbers of responses per respondent for communities beginning data collection in FY2007 and FY2008. For the 24 communities beginning data collection in FY2007, the maximum numbers of responses per respondent are 1 follow-up data collection point remaining for children/youth recruited in year 2 (of grant community funding), 3 for children/youth recruited in year 3, 4 for children/youth recruited in year 4, and 4 for children/youth recruited in year 5. For the 6 communities beginning data collection in FY2008, the maximum numbers are 3 follow-up data collection points remaining for children/youth recruited in year 2, 5 for year 3, 6 for year 4, and 4 for year 5.

  6. Approximate number of caregivers with children over age 5, based on Phase V data submitted as of 12/08.

  7. Approximate number of caregivers with children 3 and older, based on Phase V data submitted as of 12/08.

  8. Approximate number of caregivers with children 5 or under, based on Phase V data submitted as of 12/08.

  9. Based on Phase III and IV finding that approximately 60 percent of the children/youth in the evaluation were 11 years old or older.

  10. Based on the 2009 Federal minimum wage rate of $7.25 per hour.

  11. Respondents only complete Service Experience Study measures at follow-up points. See Footnote #3 for the explanation about the average number of responses per respondent.

  12. Staff will enter data on flexible funds expenditures into a Web-based application or will recode existing data on flexible funds expenditures to match the Flex Funds Data Dictionary format. Each community will use flexible funds expenditures on average for approximately one-quarter of the estimated 356 children/youth enrolled, suggesting a total of 89 children/youth will receive services from flexible funds per community. Thus, there will be data entered for 89*30 = 2,670 children/youth using the Flex Funds Data Dictionary.

  13. Assumes that three expenditures, on average, will be spent on each child/youth receiving flexible fund benefits.

  14. Assuming that the average annual income across all types of programming staff is $50,000, the wage rate was estimated using the following formula: $50,000 (annual income)/2080 (hours worked per year) = $24.04 per hour.

  15. Staff will collect paper-based forms from agencies and enter them into a Web-based application or will extract data from agencies’ existing data systems. Staff will recode data to match the Services and Costs Data Dictionary format. Service and costs records will be compiled for all 356*30=10,680 children/youth enrolled.

  16. Assumes that each child/youth will have 100 service episodes, on average, during his/her time in a system of care.

  17. Assuming that the average annual income across all types of evaluators, agency staff, and administrative staff is $55,000, the wage rate was estimated using the following formula: $55,000 (annual income)/2080 (hours worked per year) = $26.44 per hour.

  18. This survey will be administered in 5 communities funded in 2006, 25 communities funded in 2005, 2 communities funded in 2000, and 20 communities funded in 1999. For each community, one respondent will be a caregiver and three respondents will be administrators/providers.
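The wage-rate footnotes above all apply the same conversion of annual income to an hourly rate (income divided by 2,080 work hours per year). A minimal Python sketch verifying the stated rates (the function name is illustrative):

```python
# Wage rates used in Table 3: hourly rate = annual income / 2,080 work hours.
def hourly_rate(annual_income: float) -> float:
    return round(annual_income / 2080, 2)

assert hourly_rate(40_000) == 19.23  # staff/providers/administrators [2]
assert hourly_rate(20_650) == 9.93   # caregivers at the poverty level [4]
assert hourly_rate(50_000) == 24.04  # programming staff [14]
assert hourly_rate(55_000) == 26.44  # evaluators and agency staff [17]
```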


As indicated in Table 3, the average total annual burden for data collection is estimated at 39,527 hours. This estimate is derived by calculating the total burden hours for each measure, dividing each total by 3 (the years of data collection in the national evaluation), and summing the results.
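For illustration, a minimal worked example of this calculation for a single Table 3 row (the CBCL caregiver row) follows; the rounding shown is an assumption about how the published figures were obtained:

```python
# One Table 3 row: respondents x responses x hours per response = total burden
# hours; divide by 3 to annualize; multiply by the wage rate to cost it.
respondents = 8_810          # caregivers completing the CBCL
responses = 3                # administrations per respondent
hours_per_response = 0.333
wage = 9.93                  # caregiver hourly rate [4]

total_hours = round(respondents * responses * hours_per_response)  # 8,801
annual_hours = round(total_hours / 3)                              # 2,934
annual_cost = round(annual_hours * wage)                           # 29,135
```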



13. ESTIMATES OF ANNUALIZED COST BURDEN TO RESPONDENTS


The cost of this data collection is minimal. The costs for operation and maintenance of materials necessary for ongoing data collection are similarly minimal.


Other costs related to this effort, such as the cost of obtaining copyrighted instruments, are costs to the Federal Government. Each grant community has been funded, as part of the overall cooperative agreement award, to support two staff positions (or the full-time equivalent) to assist in the evaluation.




14. ESTIMATES OF ANNUALIZED COST TO THE GOVERNMENT


CMHS has planned and allocated resources for the management, processing, and use of the collected information in a manner that will enhance its utility to agencies and the public. Including the Federal contribution to local grant community evaluation efforts, the contract with the National Evaluation Team, and the government staff who oversee the evaluation, the annualized cost to the government is estimated at $3,846,036. These costs are described below.


Each grant community is expected to hire two full-time equivalents to recruit families into the evaluation, collect information, manage and clean data, and conduct analyses at the local level. Assuming (1) an average annual salary of $40,000; (2) that 30 grant communities will be funded; and (3) that the average Federal contribution (not including State matching funds) will be 73 percent, the annual cost for Phase V at the grant community level is estimated at $730,000. These monies are included in the cooperative agreement awards.


The national evaluation contract has been awarded to Walter R. McDonald & Associates, Inc. (WRMA) and its primary partner Macro International Inc. (MACRO) for evaluation of the 30 grant communities in Phase V. The national evaluation contract provides for one base year of $3,346,108 with an option to renew for four more years. The estimated average annual cost of the contract will be $3,038,340. Included in these costs are the expenses related to developing and monitoring the national evaluation including, but not limited to, the following activities: developing the design, instrument package (including acquisition of copyrighted instruments), data manual, and training materials; monitoring and providing technical assistance to sites; traveling to sites and relevant meetings; and analyzing and disseminating data. Cost for acquisition of copyrighted instrumentation is projected to be $22,371 per year. This cost is included in the total contract award.


It is estimated that CMHS will allocate 75 percent of a full-time equivalent each year for government oversight of the evaluation. Assuming an annual salary of $103,594, these government costs will be $77,696 per year.
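As a quick arithmetic check, the three components described above sum to the stated annualized total:

```python
# Annualized cost to the government, from the three components above.
grant_community_level = 730_000          # Federal share of local evaluation staffing
national_contract = 3_038_340            # average annual national evaluation contract
cmhs_oversight = round(103_594 * 0.75)   # 75 percent FTE at $103,594 = $77,696

assert grant_community_level + national_contract + cmhs_oversight == 3_846_036
```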



15. CHANGES IN BURDEN


There are 27,936 burden hours in the original OMB approval. CMHS is now requesting 39,527 hours. This is an increase of 11,591 hours due to the following program changes:


  • Updates to three of the measures in the Longitudinal Child and Family Outcome Study instrument package to address information desired by the program;

  • The addition of the Services and Costs Study Data Dictionary and the Flex Funds Data Dictionary;

  • The addition of the CQI Initiative Evaluation to assess how the CQI Initiative approach is being pursued by communities, how communities use the CQI Progress Reports and associated technical assistance (TA) in their efforts toward CQI, how satisfied communities are with the CQI approach, and what communities’ perceptions are of the effectiveness and utility of the CQI Progress Reports and TA provision;

  • A modification of the design of the Evidence-Based Practices Study to help determine what attitudinal and organizational factors influence the implementation and receipt of evidence-based practices; and

  • The addition of two new substudies under the Cultural and Linguistic Competence Study, which focus on the adaptation of evidence-based practices and the organizational contexts that support adaptation.



16. TIME SCHEDULE, PUBLICATION, AND ANALYSIS PLANS


a. Time Schedule


The time schedule for implementing the Phase V evaluation is summarized in Table 4.


Table 4. Time Schedule

| Activity | Date |
|---|---|
| Receive initial OMB clearance for study | October 2006 |
| Begin data collection for 30 sites funded in FY 2005 | October 2006 |
| Receive OMB resubmission clearance for study | October 2009 |
| Data collection completed for 25 sites funded in FY 2005 | September 2011 |
| Data collection completed for 5 sites funded in FY 2006 | September 2012 |
| Process and analyze data | Ongoing |
| Produce public use database | September 2011 |
| Produce final report | September 2012 |


With regard to the public use data file, the National Evaluation Team has created a Data Access Group (DAG). The DAG provides an opportunity for interested investigators, both those affiliated with CMHS-funded system of care communities and those not affiliated, to conduct analyses with the aggregate national evaluation data. The purpose of the group is to promote interest among experts in the field, expand analytic resources, and enhance dissemination of national evaluation data.


Data made available to these investigators will be the combined datasets of Phases II, III, IV, and V, including the Cross-Sectional Descriptive Study and the Longitudinal Child and Family Outcome Study data. Data from other study components may be made available upon request and approval by Macro, WRMA, and SAMHSA.


b. Publication Plans


Applications of the system of care model have increased in number and funding over the past several years. Thus, the publication of evaluation results will be of great interest at the Federal, State, and local levels, all of which have been involved in promoting the system of care model. Interim reports have been prepared for CMHS annually beginning in October 2004. A final report will be prepared at the completion of the evaluation for internal use by CMHS and for wide distribution beyond CMHS.


Because of the importance of this evaluation to the field of children’s mental health and the expansion of the system of care model, results of the national evaluation will be published in relevant professional journals to inform the research community as well as the decision making of policymakers and program administrators. At least 10 publications are planned. Possible publications include manuscripts reporting results from the Evidence-Based Practices Study and the Cultural and Linguistic Competence Study. Additional publications may include articles on the development of community-based systems of care, effectiveness of services for targeted groups, cost effectiveness of treatment components, and implications of system development approaches for sustainability, among others. All publications will be submitted in draft form to the Government Project Officer (GPO) and an expert panel designated by the GPO for review and approval prior to submission to the selected journal.


The cross-agency, interagency, collaborative perspective represented by the system of care model involves multiple audiences, including those involved in mental health, child welfare, juvenile justice, public health, and education. Policymakers, program administrators, and researchers in each of these service sectors will be interested in the findings from this evaluation and will serve as the potential audience for publications. Examples of journals that will be considered as vehicles for publication include the following:


    • American Journal of Public Health;

    • American Psychologist;

    • Child Abuse and Neglect: The International Journal;

    • Child and Adolescent Psychiatric Clinics of North America;

    • Child Development;

    • Child Maltreatment;

    • Child and Youth Services Review;

    • Children Today;

    • Evaluation Review;

    • Evaluation Quarterly;

    • Journal of Autism and Developmental Disorders;

    • Journal of Behavioral Health Services and Research;

    • Journal of Child and Family Studies;

    • Journal of Clinical Child and Adolescent Psychology;

    • Journal of Consulting and Clinical Psychology;

    • Journal of Emotional and Behavioral Disorders;

    • Journal of Health and Social Behavior;

    • Journal of Mental Health Administration;

    • Journal of the American Academy of Child and Adolescent Psychology;

    • Mental Health Services Research;

    • Milbank Memorial Fund Quarterly;

    • Psychiatric Services; and

    • Social Services Review.


Besides audiences associated with specific service sectors, results of the project will be of interest to State legislators. This group often makes decisions about how to configure the service delivery system for children with serious emotional and behavioral disorders and determines the matching funds required for this program. The National Conference of State Legislatures can help identify the best strategies for reaching this group with evaluation findings.


c. Data Analysis Plan


All of the data collection and analytic strategies detailed in this package are linked to the evaluation questions. These linkages are shown in Table 5. Note that the majority of these data are collected at intake and at each 6-month follow-up data collection point. Exceptions include: (1) descriptive data elements that are not expected to change over time (e.g., gender, race), which are asked only at intake; (2) services and costs data, which will be collected in evaluation years 4, 5, and 6; (3) system of care data, which are collected every 18–24 months; (4) data for the Implementation Factors Substudy of the Evidence-Based Practices Study, which will be collected in year 4 of the evaluation; (5) sustainability data, which are collected in years 3, 4, 5, and 6 of the evaluation; (6) cultural and linguistic competence data, which are collected in years 1, 3, and 5; and (7) CQI evaluation data, which will be collected in evaluation year 4. Analyses to assess the reliability and validity of selected measures are conducted as sufficient data become available in the early stages of the study. These analyses include, but are not limited to, calculation of Cronbach’s coefficient alpha to determine the internal consistency of ordinal- and interval-level measures, calculation of the Kuder-Richardson formula 20 to determine the internal consistency of dichotomous measures, and confirmatory factor analysis to determine the latent variable structure and content of multi-component scales.
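For reference, a minimal sketch of the two internal-consistency statistics named above (illustrative code, not the National Evaluation Team’s analysis software); KR-20 is the special case of coefficient alpha for dichotomous items, so one computation serves both:

```python
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for a respondents-by-items score matrix."""
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances / total_variance)


# Passing a 0/1 (dichotomous) score matrix yields the KR-20 coefficient.
```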


Table 5. Evaluation Questions, Indicators, Data Sources, and Analysis Techniques

| Evaluation Questions | Indicators | Data Sources | Data Analysis |
|---|---|---|---|
| System of Care Assessment |  |  |  |
| Does the system maximize interagency collaboration? | Core agencies participate in a collaborative way; integration of staff, resources, functions, and funds; co-location of services of multiple agencies; interagency service planning; shared vision and goals; formal relationships established between agencies | Site Visit; IACS | Univariate/multivariate analysis; qualitative thematic analysis |
| Are the various service components of the system coordinated? | Co-location of services of multiple agencies; availability of case management/care coordination services; case manager/care coordinator has broad responsibilities and an active referral role; integration and consistency in case management/care coordination across systems/agencies | Site Visit | Univariate/multivariate analysis; qualitative thematic analysis |
| Are services and the system accessible? | Proportion of eligible population provided services; time between identification of need and entry to system; waiting lists for entry to system; waiting lists for delivery of key services; active outreach; logistics and supports that encourage access | Site Visit | Univariate analysis; qualitative thematic analysis |
| Is the service array comprehensive? | Availability of a broad array of residential, intermediate, outpatient, and wraparound services | Site Visit; MIS | Univariate analysis; qualitative thematic analysis |
| Are services and the system culturally competent? | Cultural diversity of the child and family population; cultural diversity of the provider population; agency commitment to cultural competency; equitable treatment of all children and families; adherence to national standards of cultural competence | Site Visit; CCSP; YSS; YSS-F | Univariate analysis; qualitative thematic analysis |
| Are services and the system family-driven? | System and services involve caregivers in developing individual child and family service plans, in overall system of care planning activities, and in service delivery; system and services address needs of caregivers and families for support | Site Visit; YSS; YSS-F | Univariate/multivariate analysis; qualitative thematic analysis |
| Are services individualized and youth-guided? | Active individualized service planning process; frequency of monitoring of ISP by case manager; system and services involve youth in developing his or her own service plan, in overall system of care planning activities, and in his or her own service delivery; system and services address needs of youth for support | Site Visit; YSS; YSS-F | Univariate/multivariate analysis; qualitative thematic analysis |
| Are services community-based? | Availability of services within the community; extent of reliance on out-of-county and out-of-State placements | Site Visit; MIS | Univariate/multivariate analysis |
| Do systems mature over time? | Development of infrastructure; development of service delivery capacity | Site Visit | Multivariate analysis; qualitative thematic analysis |
| Are services provided in the least restrictive setting that is appropriate? | Processes to ensure that children step down to lower levels of care when appropriate; extent of use of intermediate and outpatient placements; extent of use of wraparound services; stability and duration of placements; level of use of mental health services in normative settings (e.g., home, school) | Site Visit; MIS; LSQ | Univariate/multivariate analysis; qualitative thematic analysis |
| Cross-Sectional Descriptive Study |  |  |  |
| What are children, youth, and families like? | Gender; race; age; foster care placement; presenting problem(s); diagnosis at intake; intake and referral source; case status | EDIF; CIUF | Univariate/bivariate analysis |
| Longitudinal Child and Family Outcome Study |  |  |  |
| Are there differences between the children, youth, and families served in the systems that do and do not choose to participate in the Longitudinal Child and Family Outcome Study? | Gender; race; age; educational level and placement; socioeconomic status; parents’ employment status; living arrangement; presenting problem(s); diagnosis at intake; intake/referral source; risk factors for family and child; case status | EDIF; CIUF; CIQ | Univariate/bivariate analysis |
| Has there been a reduction in children’s/youths’ negative behaviors? | Number of problem behaviors | CBCL 1½–5; CBCL 6–18; CIS | Univariate/multivariate analysis |
| Has there been an increase in the level of the child’s/youth’s overall functioning? | Child’s ability to accomplish activities of daily living; quality of family relationships; quality of peer relationships | CBCL 1½–5; CBCL 6–18; BERS-2C; BERS-2Y; CIS; FLQ | Univariate/multivariate analysis |
| Has there been improvement in child/youth functioning in the educational environment? | School attendance; expulsions, dropouts, suspensions; academic performance | BERS-2C; BERS-2Y; EQ-R | Univariate/multivariate analysis |
| Has there been improvement in child/youth involvement with law enforcement? | Violations; number of contacts with law enforcement; number of incarcerations | DS-R | Univariate/multivariate analysis |
| Do families experience improvements in family life? | Family functioning; caregiver strain (burden of care) | FLQ; CGSQ | Univariate/multivariate analysis |
| Are there differences in family outcomes across systems of care? | Family functioning; caregiver strain (burden of care); material resources | FLQ; CGSQ | Univariate/multivariate analysis |
| Service Experience Study |  |  |  |
| How do children, youth, and families experience services? | Ratings of specific services; ratings of the overall system; provider attitudes and practices | YSS; YSS-F; CCSP | Univariate/multivariate analysis |
| Are there differences in service experiences across systems of care? Are differences, if any, associated with differential outcomes? | Comparison of ratings of specific services; comparison of ratings of the overall system; comparison of provider attitudes and practices; relationship to child outcomes | YSS; YSS-F; CCSP; CBCL 1½–5; CBCL 6–18; CIS | Univariate/multivariate analysis |
| Sustainability Study |  |  |  |
| To what extent are systems of care able to sustain themselves after Federal funding has ended? What factors facilitate or impede sustainability? | System of care characteristics; factors related to sustainability; success of sites in sustaining themselves post-funding | Sustainability Survey | Univariate/multivariate analysis |
| Services and Costs Study |  |  |  |
| What services do children, youth, and families receive and what are their service utilization patterns? | Previous service history; service setting and type; level of restrictiveness; mix of services; amount and duration; continuity of care | Community MISs; Flex Funds Data Dictionary; Services and Costs Data Dictionary; LSQ | Univariate/multivariate analysis |
| How do service use patterns relate to child/youth behavioral and functional outcomes? | Comparison of service use for children/youth who enter the system at varying levels of challenge; comparison of change in outcomes over time for children/youth in different utilization pattern groups | Community MISs; Flex Funds Data Dictionary; Services and Costs Data Dictionary; CBCL 1½–5; CBCL 6–18; CIS; BERS-2C; BERS-2Y; MSSC-R | Univariate/multivariate analysis |
| How do service use patterns differ across subgroups within a site? Across system of care sites? | Comparisons of types of services used; comparisons of level of restrictiveness; comparisons of service mix; comparisons of amount and duration; comparisons of continuity of care | Community MISs; Flex Funds Data Dictionary; Services and Costs Data Dictionary; LSQ; MSSC-R | Univariate/multivariate analysis |
| What costs are associated with services at the aggregate and child, youth, and family levels? | Total costs of services for individual children, youth, and families; average costs per child/youth/family; average cost per service type | Community MISs; Flex Funds Data Dictionary; Services and Costs Data Dictionary | Univariate/bivariate analysis |
| CQI Initiative Evaluation |  |  |  |
| Have communities continuously improved the quality of their systems of care? | Utilization of CQI Progress Reports; description of CQI infrastructure; effectiveness of technical assistance; development of a communication feedback loop | CQI Initiative Survey; CQI Initiative Interview | Univariate/bivariate analysis; trend analysis; qualitative thematic analysis |
| Has the CQI Initiative been effectively implemented, and to what extent have implementation goals been met in each community? | Key constituent involvement in implementing the CQI Initiative; positive and negative implications of the initiative; extent to which the initiative was implemented according to plans; satisfaction with implementation | CQI Initiative Survey; CQI Initiative Interview | Univariate/bivariate analysis; trend analysis; qualitative thematic analysis |
| Evidence-Based Practices Study |  |  |  |
| What are caregivers’ experiences with providers and EBTs? | Kinds of information provided to caregivers about the treatments their child/youth will receive | MSSC-R | Descriptive statistics |
| What are caregivers’ attitudes about the information they are receiving about treatment effectiveness? | Caregiver ratings of the importance of receiving various kinds of information about their child’s/youth’s treatments | MSSC-R | Descriptive statistics |
| What is the implementation process for EBTs and PBE approaches, and how do they affect agencies/programs? What are the barriers and facilitators of EBT and PBE implementation? | Descriptions of processes used to implement EBT and PBE; descriptions of specific factors that affected implementation of EBT and PBE approaches; descriptions of the impact of EBT and PBE implementation on agencies/programs | Program Directors; Service Providers | Qualitative thematic analysis |
| What is the understanding of families/youth about EBTs and PBE approaches and how they are integrated into their service planning? What are the experiences and outcomes of families/youth with EBT and PBE? | Descriptions of the extent to which caregivers and youth understand EBT and PBE approaches and are aware of their integration into, and impact on, treatment planning; descriptions of the extent to which caregivers and youth assess factors that affect implementation of EBT and PBE and the impact on health outcomes | Caregivers; Youth | Qualitative thematic analysis |
| Cultural and Linguistic Competence Study |  |  |  |
| What self-assessment instrumentation and supporting materials are available for use by communities to conduct self-assessments? | Descriptions of self-assessment tools | Program Directors; Service Providers | Qualitative thematic analysis |
| How do communities select, adapt, and conduct self-assessments? | Descriptions of processes used to conduct self-assessments | Program Directors; Service Providers | Qualitative thematic analysis |
| How do communities utilize their findings to inform program and service delivery improvements? | Descriptions of the extent to which self-assessment findings are used | Program Directors; Service Providers; Caregivers; Youth | Qualitative thematic analysis |
| How are evidence-based practices adapted to be culturally appropriate? | Descriptions of specific adaptations and how they differ from the evidence-based practice model | Program Directors; Service Providers | Qualitative thematic analysis |
| What barriers exist to formulating and implementing appropriate adaptations of EBP models? | Descriptions of processes used to develop adapted models | Program Directors; Service Providers | Qualitative thematic analysis |
| How are caregivers and youth involved in the development of adaptations of evidence-based practices? | Descriptions of the processes used to engage caregivers and youth in providing input into practice development; extent to which such practices model cultural competence | Program Directors; Service Providers; Caregivers; Youth | Qualitative thematic analysis |
| In what ways are adapted services meeting the needs of caregivers and youth? | Descriptions of the extent to which caregivers and youth assess the adequacy and consistency of services | Caregivers; Youth | Qualitative thematic analysis |
| How are issues in adapting evidence-based practices identified and resolved? | Descriptions of the processes for self-evaluating the approaches being put in place | Program Directors; Service Providers; Caregivers; Youth | Qualitative thematic analysis |



Analyses planned for each of the studies are described below. These analyses are possible for grant communities that are able to implement the evaluation as designed, including collection of cross-sectional descriptive data on the census of children, youth, and families who enter the system, recruitment of an adequately sized sample, minimal missing data within and across data collection points, retention of families over time, and adherence to prescribed data collection procedures. In sites with constraints (e.g., insufficient size of the target population), analyses are tailored to meet the needs of the individual site. The sample table shells presented in Attachment 5 provide examples of how data can be summarized.


Essentially, the objectives of the data analysis center on an overall goal of understanding the system of care approach and its effects. The analysis plan focuses on description, explanation, and prediction. The data analyzed in Phase V include both discrete and continuous variables. The scales on which these variables are measured have important implications for the choice of statistical procedures used in data analysis. Some of the variables used in this evaluation are nominal (e.g., race and ethnicity) and ordinal (e.g., services ranked in order of restrictiveness). These types of measurement scales require the use of nonparametric statistics. It is recognized that nonparametric statistics offer less power than parametric tests, and that parametric tests carry distributional assumptions but are relatively robust to violations of normality. For this reason, research questions measured with ordered discrete variables (such as ratings of system and service performance) that approach a continuous scale are tested using parametric statistics.
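For illustration only (this sketch is not part of the approved analysis plan), the following shows how the scale of measurement might drive the choice of test in a general-purpose statistics package; all variable names and data are hypothetical.

    # Illustrative sketch: choosing a test by measurement scale (simulated data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    group = rng.integers(0, 2, size=200)                 # two comparison groups
    ethnicity = rng.integers(0, 4, size=200)             # nominal variable
    restrict_rank = rng.integers(1, 6, size=200)         # ordinal variable
    perf_rating = rng.normal(50, 10, size=200)           # near-continuous rating

    # Nominal variable vs. group: chi-square test of independence.
    table = np.zeros((2, 4))
    for g, e in zip(group, ethnicity):
        table[g, e] += 1
    chi2, p_nom, dof, expected = stats.chi2_contingency(table)

    # Ordinal variable: nonparametric rank test (Mann-Whitney U).
    u, p_ord = stats.mannwhitneyu(restrict_rank[group == 0],
                                  restrict_rank[group == 1])

    # Ordered rating approaching a continuous scale: parametric t test.
    t, p_cont = stats.ttest_ind(perf_rating[group == 0],
                                perf_rating[group == 1])
    print(p_nom, p_ord, p_cont)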


System of Care Assessment. In this evaluation study, Phase V seeks to determine whether a system of care has been implemented in accordance with the system of care program theory and to document the maturation of the system over time. This study component includes both qualitative and quantitative analyses and both are based on a standard framework. Qualitative analyses are used to describe the infrastructure and the direct service delivery processes of system of care communities. The standard framework ensures that all system of care communities are characterized on similar system operations (e.g., management, client entry into the system of care, service planning and coordination processes) but the qualitative approach provides for the individual and unique features of each system of care community to be portrayed.


Qualitative data obtained through individual interviews at each system of care community and from document reviews are synthesized into a site-specific narrative report that is returned to each system of care community for review and correction. When the reports for each community are finalized after site comment, they are entered into a qualitative database software program (Atlas.ti) that allows for meta-analyses across system of care communities and across time.


The quantitative analyses are based on scores given to each system of care community that measure the extent to which it has achieved the program theory’s overarching principles (e.g., youth-guided, individualized and family-driven care, cultural competence, coordination) within the system operations described in the qualitative analysis and from quantitative interview questions (e.g., percentage of children and youth who receive an individualized service plan, number of child- and youth-serving agencies that attend governing body meetings). This approach allows systems of care to be assessed across principles (e.g., how well system operations incorporate a family-driven approach) and across operations (e.g., how well the overall management of the system of care reflects the principles as a whole). The relationships among service and system experiences; child, youth, and family characteristics; and outcomes over time are explored using correlational, regression, and path analyses.


Information from the Interagency Collaboration Scale (IACS) is analyzed quantitatively to assess the level of interagency collaboration in system of care communities and to better understand the multidimensional structure of the collaboration construct. General linear model (GLM) repeated measures analysis allows the National Evaluation Team to test whether changes over time are significant and whether some groups experience more improvement than others. IACS responses are analyzed using GLM to determine the extent to which the interagency collaboration factors of Beliefs/Values, Activities/Behavior, and Knowledge change over time. In addition, system-level characteristics are used to group communities to assess the impact of these characteristics on interagency collaboration scores.
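A minimal sketch of a repeated measures analysis of this kind is shown below, using the statsmodels AnovaRM routine on simulated long-format data; the column names are hypothetical stand-ins for IACS factor scores, not the actual evaluation dataset.

    # Sketch: repeated measures ANOVA on a simulated IACS factor score.
    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(1)
    n_respondents, waves = 40, 3
    df = pd.DataFrame({
        "respondent": np.repeat(np.arange(n_respondents), waves),
        "wave": np.tile([1, 2, 3], n_respondents),
    })
    # Simulate a Beliefs/Values factor score that drifts upward across waves.
    df["beliefs_values"] = rng.normal(3.0, 0.5, len(df)) + 0.2 * df["wave"]

    res = AnovaRM(df, depvar="beliefs_values",
                  subject="respondent", within=["wave"]).fit()
    print(res)  # F test for change over time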


Cross-Sectional Descriptive Study. Descriptive demographic and diagnostic eligibility data are analyzed with basic descriptive techniques to report frequencies and percentages. These data are reported for each grant community, as well as for all grant communities combined.


Longitudinal Child and Family Outcome Study. For this evaluation component, data collected at intake are analyzed to describe the sample in terms of intake demographic characteristics, symptomatology (i.e., Child Behavior Checklist [CBCL] scores), functional impairment (i.e., Columbia Impairment Scale [CIS] scores), social functioning (i.e., peer relations, Delinquency Survey—Revised [DS-R], and Substance Use Survey—Revised [SUS-R] scores), and stability of living arrangements (i.e., Living Situations Questionnaire [LSQ]). Families are described in terms of their intake demographic features, functioning (i.e., Family Life Questionnaire [FLQ] scores), and level of caregiver strain (i.e., Caregiver Strain Questionnaire [CGSQ] scores). Univariate descriptive analyses are performed to characterize the families participating in this evaluation, including score ranges, means, and medians. These analyses are reported for each system of care community as well as for all grant communities combined.


Changes in child, youth, and family outcomes over time are tested using a variety of techniques. Repeated measures analysis of variance (ANOVA) is used to test the significance of change over time within and between groups at the grant communities. Repeated measures analysis of covariance (ANCOVA) is conducted using the system of care development scores from the System of Care Assessment as a covariate. ANCOVA controls for differences present at intake, which is prudent even when those differences are not statistically significant. Because children and youth recruited in different years are followed for varying periods of time, these analyses include only intake, 6-month, 12-month, and 18-month data.
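As a rough illustration of the ANCOVA logic, the sketch below regresses a 6-month outcome on group membership while adjusting for the intake score and a system-development covariate; all column names and the simulated data are hypothetical.

    # Sketch: ANCOVA on 6-month symptom scores, adjusting for intake scores
    # and system of care development (hypothetical names and data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 300
    df = pd.DataFrame({
        "cbcl_intake": rng.normal(65, 8, n),
        "soc_development": rng.normal(0, 1, n),  # System of Care Assessment score
        "group": rng.integers(0, 2, n),          # e.g., EBT vs. no EBT
    })
    df["cbcl_6mo"] = (0.7 * df["cbcl_intake"] - 2.0 * df["group"]
                      - 1.0 * df["soc_development"] + rng.normal(0, 5, n))

    model = smf.ols("cbcl_6mo ~ C(group) + cbcl_intake + soc_development",
                    data=df).fit()
    print(model.summary())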


Following children and youth recruited in the first 2 years of data collection for more than 18 months enables hierarchical linear modeling (HLM) to be used. HLM provides improvement in estimating individual effects, an opportunity to model cross-level effects (i.e., individuals within systems, over time), and greater precision in partitioning components of effects across multiple levels. The following illustrates how HLM is used in the evaluation. The children, youth, and families in the longitudinal study are located (or “nested”) within systems of care. We assume that children and youth experience an intervention and that, as a result of that intervention, they experience change. We know from the evaluation of the 22 grant communities originally funded in 1993 and 1994 that systems of care vary in terms of their overall development (Brannan et al., 2002; Vinson et al., 2001). We expect that differential system development (approximated with system-level assessment scores) will mediate child, youth, and family outcomes. HLM allows us to estimate growth curves (e.g., changes in the level of symptomatology) based on repeated observations. These repeated measures are “nested” within the individual child/youth. Using this three-level design, HLM permits us to estimate how much of the variance found in the first level (e.g., changes in symptoms) is due to the second (e.g., the individual receiving treatment), and how much can be attributed to the third level (e.g., the degree of system of care development).
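One way to sketch such a growth model in general-purpose software is shown below. For simplicity, the full three-level model is approximated here as a two-level random-slope model (repeated measures nested in children) with a system-level development score entering as a cross-level moderator; the data and names are simulated, not the evaluation’s.

    # Sketch: growth-curve model with repeated measures nested in children and
    # a system-level development score (hypothetical data and names).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n_children, n_waves = 200, 4                 # intake, 6, 12, 18 months
    child = np.repeat(np.arange(n_children), n_waves)
    time = np.tile(np.arange(n_waves), n_children) * 6.0   # months since intake
    site_dev = np.repeat(rng.normal(0, 1, n_children), n_waves)
    child_icpt = np.repeat(rng.normal(0, 4, n_children), n_waves)
    symptoms = (65 + child_icpt - (0.4 + 0.2 * site_dev) * time
                + rng.normal(0, 3, len(child)))
    df = pd.DataFrame({"child": child, "time": time,
                       "site_dev": site_dev, "symptoms": symptoms})

    # Random intercept and slope per child; the time x site_dev interaction
    # tests whether system development moderates the rate of improvement.
    m = smf.mixedlm("symptoms ~ time * site_dev", df,
                    groups=df["child"], re_formula="~time").fit()
    print(m.summary())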


The GLM repeated measures analysis allows the National Evaluation Team to test whether changes over time are significant and whether some groups experience more improvement than others do. Within a community, these techniques are used to explore whether certain service utilization patterns yield better outcomes. Path analysis and other structural equation modeling techniques are used to investigate the direct and indirect effects of causal variables (such as ratings of system performance and adherence to service plans) on dependent outcome measures (such as clinical assessments, restrictiveness of care, and family functioning). The National Evaluation Team does not view the use of path analysis as a method of causal discovery, but rather as a method of confirming appropriate models derived from empirical and theoretical considerations.
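For the path-analytic piece, a minimal confirmatory sketch is shown below using the semopy package (one of several SEM tools for Python; API per semopy 2.x). The model and variable names are hypothetical: system performance is posited to affect an outcome both directly and indirectly through adherence to service plans.

    # Sketch: a simple path model (hypothetical variables, simulated data).
    import numpy as np
    import pandas as pd
    import semopy

    rng = np.random.default_rng(4)
    n = 500
    system_rating = rng.normal(0, 1, n)
    adherence = 0.5 * system_rating + rng.normal(0, 1, n)
    outcome = 0.3 * system_rating + 0.4 * adherence + rng.normal(0, 1, n)
    df = pd.DataFrame({"system_rating": system_rating,
                       "adherence": adherence, "outcome": outcome})

    desc = """
    adherence ~ system_rating
    outcome ~ system_rating + adherence
    """
    model = semopy.Model(desc)
    model.fit(df)
    print(model.inspect())  # path coefficients and standard errors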


Service Experience Study. The analysis for this study of the Phase V evaluation assesses the extent to which children, youth, and families receive services as they were intended, that is, consistent with the system of care program model. As with data from the Services and Costs Study, the distribution of self-reported service use across the client population is described (i.e., Multi-Sector Service Contacts—Revised [MSSC-R]). Service use patterns are also described. HLM or ANOVA will be performed to examine: (1) change in service use patterns of children, youth, and their families; (2) whether there are differences between groups of children and youth in the system of care communities who receive an evidence-based treatment and those who do not in terms of client satisfaction as measured by the abbreviated satisfaction questionnaires (i.e., Youth Services Survey [YSS-F, YSS]) and ratings of the cultural competence of services as measured by the Cultural Competence and Service Provision Questionnaire (CCSP); (3) whether children, youth, and families stay in services longer on average in communities with higher average service and system of care ratings; and (4) whether within communities, caregivers of children and youth who received fewer services in the previous 6 months (as measured by the Multi-Sector Service Contacts—Revised [MSSC-R]) also reported being less satisfied or rated their services and systems lower.


Sustainability Study. For the Sustainability Survey, the analysis plan includes both quantitative and qualitative components. Web survey data are aggregated and analyzed quantitatively and qualitatively. Quantitative data obtained from factors related to sustainability are examined for reliability, and are compared to system characteristics. To examine factors in relation to system development, survey data pertaining to system features are compared to responses related to factors contributing to sustainability. In addition, survey data are combined with data from final System of Care Assessment site visits, including assessment scores from these visits, to create a more robust picture of the status and process of sustainability in each community. Quantitative data obtained about system features and factors affecting sustainability are tallied for each grant community. This information is also tallied across all grant communities, yielding cross-site information on the extent to which specific system of care features are in place in Phase V grant communities during various stages of their funding, positive and negative factors affecting sustainability, and the effectiveness of strategies implemented to sustain systems of care. Quantitative ratings are assigned to each grant community across the various assessment areas, and are ranked according to their importance. Where appropriate, quantitative comparisons of these features are made across grant communities.


Services and Costs Study. For this study, analyses will focus primarily on service use patterns (e.g., types, combination, amount, and costs of services used) and the factors that influence use. Analyses will be conducted at the aggregate and individual child, youth, and family levels. At the aggregate level, the distribution of service use and costs across the population will be described. At the individual child, youth and family level, service use patterns will be described (e.g., distribution of children and youth using various combinations of services, mean and median amounts of services used, mean and median costs of services).


Latent class analysis and other case-grouping techniques will be used to group children/youth who experience similar service use patterns, based on combinations and amount of services. Multinomial logistic regression analysis will be employed to predict classes of service utilization patterns with child/youth, clinical, and family life variables measured at intake. The longitudinal outcomes of children and youth in various service use groups will be compared to see if some use patterns are associated with greater gains and, if so, for which groups of children and youth.
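Dedicated latent class software works with categorical indicators; as a rough stand-in only, the sketch below clusters continuous service-amount profiles with a Gaussian mixture and then predicts class membership from intake characteristics with a multinomial logit. All variables are hypothetical.

    # Sketch: grouping service use profiles, then predicting group membership.
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)
    n = 400
    # Hypothetical service-amount profile: outpatient, case management, respite.
    profiles = np.column_stack([rng.poisson(lam, n) for lam in (6, 3, 1)])

    # Stand-in for latent class analysis: mixture model over usage profiles.
    classes = GaussianMixture(n_components=3, random_state=0).fit_predict(profiles)

    # Predict class membership from intake characteristics (multinomial logit).
    intake = np.column_stack([rng.normal(65, 8, n),      # symptom score at intake
                              rng.integers(5, 18, n)])   # age
    clf = LogisticRegression(max_iter=1000).fit(intake, classes)
    print(clf.coef_)  # one row of coefficients per service use class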


Trend analysis will be used to analyze change in costs over time. Multivariate techniques that adjust for the skewed distribution of cost data will be employed to predict costs, controlling for variation in baseline characteristics. Examples of such techniques include log-transformation and generalized linear models assuming a gamma distribution.
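A minimal sketch of the gamma GLM approach to skewed costs, with hypothetical covariates and simulated data, follows.

    # Sketch: modeling right-skewed service costs with a gamma GLM (log link),
    # controlling for intake characteristics (hypothetical names).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    n = 300
    df = pd.DataFrame({"cbcl_intake": rng.normal(65, 8, n),
                       "age": rng.integers(5, 18, n)})
    mu = np.exp(2.0 + 0.03 * (df["cbcl_intake"] - 65))
    df["cost"] = rng.gamma(shape=2.0, scale=mu / 2.0)   # skewed cost outcome

    gamma_glm = smf.glm(
        "cost ~ cbcl_intake + age", data=df,
        family=sm.families.Gamma(link=sm.families.links.Log())).fit()
    print(gamma_glm.summary())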


CQI Initiative Evaluation. For the CQI Initiative Evaluation, the analysis plan includes both quantitative and qualitative components. Analyses for the survey data will include content/thematic analysis of open-ended questions, and descriptive, univariate, and bivariate statistical analyses of quantitative data. Interview data will be analyzed primarily using qualitative methods, such as content/thematic analysis. Data from the surveys and interviews will be used to assess the CQI process within communities, gauge the effectiveness of the CQI Initiative in providing appropriate technical assistance to communities, and inform the ongoing development of the Initiative.



Evidence-Based Practices Study. Data collected for this study are analyzed both qualitatively and quantitatively. Traditional statistics on quantitative data (e.g., means, standard deviations, percentages) will be used to profile the extent to which consumers are aware of the research and clinical bases supporting their treatments. Correlations and t-tests compare the service experiences of groups who receive different information about their treatment. Qualitative data will be processed and analyzed to look for contextual information to help shed light on the findings from the quantitative analyses. The thematic analysis will focus on common themes and patterns from both within and across sites.


Cultural and Linguistic Competence Study. Data from this study will be qualitative in nature. Data analysis will be conducted with Atlas.ti and will focus on themes derived from interviews with key informants, including project directors, program administrators, service providers, youth, and families, both within the sites that are visited and across sites. Queries will be performed on the coded text to compare themes across respondent types in order to understand differences and similarities in perceptions of, for example, how the self-assessment protocols and support materials were used to self-assess system of care communities, and how communities adapt and implement evidence-based practices. Specifically, patterns in the level of involvement in developing, conducting, and using self-assessments, as well as patterns related to the decision-making process for which evidence-based practice to use, adapt, or implement, will be assessed across respondents. The interviews will be compiled into summary narratives and analyzed to identify key features of cultural and linguistic competence practice implementation that have a bearing on effective approaches with diverse populations. Data collection site visits will occur in year 5.



17. DISPLAY OF EXPIRATION DATE


All data collection instruments will display the expiration date of OMB approval.



18. EXCEPTIONS TO THE CERTIFICATION STATEMENT


This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.

Phase Five of the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program

Supporting Statement



B. STATISTICAL METHODS



1. RESPONDENT UNIVERSE AND SAMPLING METHODS


System of Care Assessment. The universe for the System of Care Assessment consists of key system of care roles in the 30 grant communities. Respondents for the System of Care Assessment are selected based on their affiliation with the system of care community and must be serving in specific roles. To determine the respondents, the National Evaluation Team sends a site informant list to each community 8 weeks prior to its site visit. The site informant list identifies categories of respondents who offer a variety of perspectives about each community’s system of care. The document outlines the specific positions and roles, specialized functions, number of interviewees, and estimated interview time for each respondent category. The system of care community selects potential respondents that meet the requirements outlined in the list. System of care communities e-mail the completed list to the National Evaluation Team at least 4 weeks prior to the scheduled visit so that the list of projected interviewees can be reviewed to ensure that each category of respondent is adequately represented. The respondent categories include representatives of core child-serving agencies, project directors, family representatives and representatives of family advocacy organizations, quality monitoring participants, intake workers, youth coordinators, care coordinators and case managers, direct service providers, case review participants, youth, and caregivers. For each system of care community, there are approximately 27 respondents per site visit. Site visits will be conducted in all system of care communities. Based on previous experience, we expect a response rate for this study component of approximately 84 percent.


Cross-Sectional Descriptive Study, Longitudinal Child and Family Outcome Study, and Service Experience Study. The universe for the Phase V Cross-Sectional Descriptive Study, the Longitudinal Child and Family Outcome Study, and the Service Experience Study consists of the children served by the CMHS program in the 30 grant communities.


Cross-Sectional Descriptive Study. For this evaluation study, data are collected on children and families at intake into services. Descriptive data are collected on all children and their families who are being served by the CMHS program. To be included in this study component, children need to: (1) meet the community’s service program eligibility criteria; and (2) receive services in that community. Because these data are routinely collected at the sites for internal purposes, descriptive data on all the children and families who receive services are available.


Longitudinal Child and Family Outcome Study. To gather data for this study that can be meaningfully interpreted while not creating an overwhelming burden for some grant communities, a sample of families is selected for participation.


The Longitudinal Child and Family Outcome Study sample is selected from the pool of children and their families entering the Phase V-funded systems of care. Although each grant community is funded for 6 years, the first year is committed to initial system development with data collection occurring in the last 5 years of their funding. Hence, recruitment of family participants occurs in years 2, 3, 4, and 5 of program funding (or years 1, 2, 3, and 4 of the evaluation).


Systems of care develop differentially over the length of the project, so it is important to consider the growth of the system of care in designing the sample. If the entire sample is recruited in the first year, the opportunity would be lost to assess whether changes in the client population occurred as the system matured (e.g., increasingly serving children with more severe problems or children referred through the juvenile justice system). For that reason, recruitment is spread across 4 years.


It is important that we draw a large enough sample in each grant community to ensure that the evaluation is able to detect the impact of the system of care initiative on child and family outcomes. If the samples are too small, significant differences of an important magnitude might go undetected. The effect sizes of the phenomena of interest form the basis of determining the minimum sample size needed through a statistical power analysis. Briefly, the power of a statistical test is generally defined as the probability of rejecting a false null hypothesis. In other words, power gives an indication of the probability that a statistical test will detect an effect of a given magnitude that, in fact, really exists in the population. The power analysis does not indicate that a design will actually produce an effect of a given magnitude. The magnitude of an effect, as represented by the population parameter, exists independent of the study and is dependent on the relationship among the independent and the dependent variables in question. The probability of detecting an effect from sample data, on the other hand, depends on three factors: (1) the level of significance used, (2) the size of the treatment effect in the population, and (3) sample size.


For the Longitudinal Child and Family Outcome Study in the grant communities, the longitudinal design assesses whether individual children and families experience meaningful improvements in outcomes between the time they enter the systems of care and subsequent data collection points. Comparisons of outcomes among different groups are also made. Previous research has indicated that comparisons of served population groups yield small to medium effect sizes (.27 to .33). Table 6 shows the power calculations used to determine the sample size required to detect effect sizes of various magnitudes for the comparison of outcomes between groups. For example, detecting a difference between two groups with a small to medium effect size at a power of .80 would require a total sample size of 553. Thus, each grant community should collect data on at least 277 children (i.e., 553 / 2 = 276.5, rounded up). This ensures that sufficient power will be achieved for the longitudinal analysis within the systems of care over time, between different groups within grant communities, as well as between grant communities.


Table 6. Effect Size: Latent Variable Model
(Total sample size required to detect between-group differences at each effect size)

Power     Small (.20)     Small to Medium (.30)     Medium (.50)
.80       690             553                       330
.85       810             625                       420
.90       930             700                       510
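For comparison only, standard software can reproduce power calculations for a simple two-group comparison; because Table 6 is based on a latent variable model, its required sizes exceed what the plain two-sample t test below would suggest.

    # Sketch: per-group sample size for a two-group comparison at a given
    # effect size. Table 6 uses a latent variable model, so its totals are
    # larger than this simple t-test requirement.
    from statsmodels.stats.power import TTestIndPower

    n_per_group = TTestIndPower().solve_power(
        effect_size=0.30, power=0.80, alpha=0.05, alternative="two-sided")
    print(round(n_per_group))  # about 175 per group for a plain t test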


The estimate of the number of children and families that have to be recruited for the Longitudinal Child and Family Outcome Study incorporates an anticipated attrition rate of 5 percent at each data collection point, which results in approximately 85 percent retention at the end of data collection. That is, to end up with follow-up data on at least 277 families after 4 data collection points, 356 families have to be recruited. In addition, to study the longitudinal impact of the program on functional development (e.g., advancing to college, work), communities continue to follow children and families for the duration of the evaluation. Follow-up data collection will continue into the last year of the grant communities’ funding, allowing the children and families recruited in the first and second years of data collection to be followed for 36 months, those recruited in the third year to be followed for 30 months, and those recruited in the fourth year to be followed for 18 months.
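The compounding arithmetic behind such targets can be checked with a short calculation; the sketch below assumes a fixed 5 percent attrition at each wave, and the exact minimum depends on how many waves the attrition is applied to (the 356 recruitment figure builds in additional cushion).

    # Sketch: recruitment needed to retain a target sample under fixed
    # per-wave attrition (an assumption for illustration).
    import math

    def required_recruitment(target, attrition, waves):
        """Minimum intake sample so that `target` families remain after
        `waves` data collection points with constant per-wave attrition."""
        return math.ceil(target / (1.0 - attrition) ** waves)

    print(required_recruitment(277, 0.05, 4))  # 341
    print(required_recruitment(277, 0.05, 5))  # 358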


Table 7 shows the data collection schedule for the 4 years of recruitment and 5 years of data collection. While past experience with this study has indicated that some grant communities will have difficulty maintaining an attrition rate of 5 percent at each data collection point, a majority of grant communities in Phase III of the evaluation had retention rates above 80 percent at 6 months, with one-fourth retaining more than 90 percent of study participants at 6 months. Overall, retention rates at 12 months were above 70 percent. The National Evaluation Team has established a number of strategies and techniques for maximizing recruitment and retention (see Section B.3.) and works closely with all communities to determine the best methods for recruiting and retaining study participants.


Table 7. Data Collection Schedule for the Longitudinal Child and Family Outcome Study
(Expected sample sizes at intake and each 6-month follow-up, assuming 5 percent attrition per data collection point)

2005-Funded Communities (data collection years FY06–07 through FY10–11)

Year Recruited1     Intake   6 mo.   12 mo.   18 mo.   24 mo.   30 mo.   36 mo.
Year 2              2225     2114    2008     1908     1812     1722     1636
Year 3              2225     2114    2008     1908     1812     1722     1636
Year 4              2225     2114    2008     1908     1812     1722     —
Year 5              2225     2114    2008     1908     —        —        —
Year 6              Completion of data collection if data collection goals have not been met.

2006-Funded Communities (data collection years FY07–08 through FY11–12)

Year Recruited1     Intake   6 mo.   12 mo.   18 mo.   24 mo.   30 mo.   36 mo.
Year 2              534      507     482      458      435      413      393
Year 3              534      507     482      458      435      413      393
Year 4              534      507     482      458      435      413      —
Year 5              534      507     482      458      —        —        —
Year 6              Completion of data collection if data collection goals have not been met.

  1. Refers to the year in which the family was recruited into the study. Across all sites, the national evaluation spans 5 years of data collection. Recruitment occurs in years 2 through 5, with follow-up data collection continuing after recruitment ends. Any sites that have not met their participant recruitment goals are allowed to continue recruitment during year 6 as long as at least one follow-up interview can be completed before program funding ends.


To reach these numbers, some grant communities need to recruit all willing families into the Longitudinal Child and Family Outcome Study sample. For these grant communities, the cross-sectional descriptive and the longitudinal samples are identical. Other grant communities need to employ a sampling strategy to randomly select a sufficient number of families from the pool of children who enter the system of care. At these grant communities, a systematic sampling approach is used. A random starting point between 1 and the nearest integer to the sampling interval (N/n, where N = the number of children in the population and n = the number of children to be recruited into the sample) is selected using a table of random numbers. Children are then systematically selected for inclusion at intervals of the nearest integer to the sampling interval. For example, every tenth child (after the random starting point) would be sampled in a grant community serving 3560 children (N/n = 3560/356 = 10), and every fifth child would be sampled in a grant community serving half that number, or 1780 children (N/n = 1780/356 = 5).
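A minimal sketch of this systematic selection rule, representing the intake stream as simple sequential positions, follows.

    # Sketch: systematic sampling with a random start, as described above.
    import random

    def systematic_sample(population_size, sample_size, seed=None):
        """Return the intake positions selected at intervals of k = N/n,
        beginning at a random start between 1 and k."""
        k = round(population_size / sample_size)  # sampling interval
        rng = random.Random(seed)
        start = rng.randint(1, k)                 # random starting point
        return list(range(start, population_size + 1, k))

    # Community serving 3560 children, recruiting 356: every 10th child.
    picks = systematic_sample(3560, 356, seed=7)
    print(picks[:5], len(picks))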


The purpose of the sampling strategy described above is to maximize the chance that the children who participate in the Longitudinal Child and Family Outcome Study are indeed representative of the universe of children who enter the systems of care. If this is achieved, the findings from data collected from the randomly selected sample are more likely to generalize to the overall client pool. Every effort is made to recruit and follow the children who are randomly selected into the Longitudinal Child and Family Outcome Study. However, some of the families approached about entering the study will refuse to participate. When a family refuses to participate, the next family that meets the selection criteria is selected. Past experience indicates that grant communities vary in their ability to recruit Cross-Sectional Descriptive Study sample members into the Longitudinal Child and Family Outcome Study, with the majority of grant communities recruiting more than 60 percent of the Cross-Sectional Descriptive Study sample into the Longitudinal Child and Family Outcome Study sample. To estimate the effect of refusals on the representativeness of the sample, the families who refuse are compared to the participating sample on, at minimum, demographic characteristics. (See the Data Analysis Plan section above.) Recall that descriptive data are collected on all families that enter the systems of care; these data provide the basis for such comparisons.


Experience from previous Phases of the national evaluation has shown that, although grant communities can make estimates, it is difficult to predict precisely how many children will be served by the grant communities’ systems of care. In addition, the number of children who enter the systems of care may increase over time as grant communities expand their service capacity and enhance outreach efforts. For that reason, sampling strategies have to remain flexible during the recruitment period and are monitored closely by the National Evaluation Team. The sampling strategies are based on the sampling ratio approach to random selection described above. In the first year of their funding, grant communities monitor the number of children that enter their systems of care. Toward the end of the first year, a sampling ratio is developed based on the first year of enrollment into the systems of care. That sampling ratio is tested in the first 3 months of data collection and monitored throughout the recruitment period to ensure that it remains on target.


The actual process of recruitment differs across grant communities. This is necessary because children and families enter services differently across grant communities. For example, in one grant community, the primary portals of entry might be the schools, while in another it might be the court system. It is also likely that grant communities have a variety of portals of entry (e.g., mental health centers, schools, and courts). Every effort is made to ensure that the recruitment process is as standardized as possible across grant communities and at the various portals of entry. The rudiments of sample selection and recruitment are documented in the national evaluation procedures manual, with additional guidelines developed specifically for each grant community. Training is also conducted at each grant community. Whether a family is to be recruited into the Longitudinal Child and Family Outcome Study (i.e., whether they are selected for inclusion in the sample) is determined as soon as it is known whether they meet the eligibility criteria. Intake workers, regardless of their location, training or service sector affiliation, are trained to conduct the consent to contact process in a uniform manner. Scripts are used to make sure that each potential participant receives the same information before agreeing to be contacted by the evaluation staff. (See Attachment 3.B.) Similarly, evaluation staff are trained to conduct the informed consent process uniformly. Standard forms are used to document refusals to be contacted or to participate in the study. These are established procedures in field research, and the National Evaluation Team closely monitors them.


Service Experience Study. The sampling and recruitment procedures for this study, which includes administration of the Multi-Sector Service Contacts—Revised (MSSC-R), the Family and Youth Services Surveys (YSS, YSS-F), and the Cultural Competence and Service Provision Questionnaire (CCSP), are identical to those of the Longitudinal Child and Family Outcome Study; that is, the study uses the same randomly selected sample of children and families being served in all system of care communities. Thus, anticipated response rates and retention rates are the same as for the Longitudinal Child and Family Outcome Study.


Sustainability Study. The universe for the Sustainability Study consists of four key system of care roles in each of the 30 grant communities. For each site, four community respondents (i.e., project director, mental health representative, family organization representative, agency representative) will be asked to complete the Web survey. These four respondents are representative of the community members most familiar with sustainability efforts. The project director, the director of the local family organization, and the two agency representatives who will be asked to complete the survey are individuals interviewed for the System of Care Assessment. Previous experience indicates that the response rate for the Sustainability Survey should be 80 percent or higher.


Services and Costs Study. Data for the Services and Costs Study are collected only on children and youth enrolled in the Longitudinal Child and Family Outcome Study. The sampling and recruitment procedures for this study are identical to those of the Longitudinal Child and Family Outcome Study.


CQI Initiative Evaluation. The universe for the CQI Initiative Evaluation consists of key system of care roles in each of the 30 grant communities. For each grant community, up to seven site-level respondents (i.e., principal investigator, project director, lead evaluator, cultural competence coordinator, social marketer, lead family representative, and youth coordinator) will be asked to complete the CQI Initiative Survey. These respondents are representative of the community members most familiar with CQI efforts. Previous experience based on a similar study indicates that the response rate for the Web survey should be approximately 80 percent. Web survey results for each community will be ranked on a scale developed by the National Evaluation Team, which rates sites in terms of their engagement in and satisfaction with the CQI Initiative. Based on these ratings, a subset of six communities will be selected for participation in the CQI Initiative Interview. As a basic criterion for selection, a community’s Web survey response rate must exceed 50 percent. Within each selected community, each respondent to the Web survey will be contacted for administration of the semi-structured interview.


Evidence-Based Practices Study. Program directors and administrators, direct children’s mental health service providers, and youth and families affiliated with each Phase V system of care will be recruited to participate in the Implementation Factors Substudy (IFS), representing all three levels of system of care constituents: system, service, and consumer. To identify these participants, the national evaluation site liaisons will assist in making the initial contacts with project directors and local evaluators to explain the study and solicit participation. Project directors will be asked to participate in the interviews and to provide the names and contact information of two or more respondents who have had some experience with EBP for each of the three categories of participants. Potential participants will then be contacted via e-mail and asked to participate in the telephone interviews. Previous experience based on a similar study indicates that the response rate should be approximately 82 percent.


Cultural and Linguistic Competence Study. Respondents for the CCIOSAS and CCEBPS substudies of the Cultural and Linguistic Competence Study will be program directors and administrators, service providers, agency and community partners, and youth and family representatives from four of the CMHI-funded communities for each substudy. To identify these participants, the National Evaluation Team will first put out a call for volunteers and then identify four grant communities for each substudy that serve diverse populations and meet specific criteria determined by the research team. Community grant applications will be used to gather additional information about sites. Identified sites will be asked to recruit potential respondents and to identify a few key people to serve on a committee that helps coordinate the site visit and telephone calls. This committee will also participate in scheduling respondents, reviewing protocols, and determining the appropriate data collection method for particular respondents (e.g., focus groups or face-to-face interviews). Previous experience based on a similar study indicates that the response rate should be approximately 83 percent.







2. INFORMATION COLLECTION PROCEDURES


System of Care Assessment. The National Evaluation Team collects data for this study during periodic site visits. Data collection includes semi-structured interviews with key informants, review of documents and randomly selected case records, and observations. To document changes in system of care development that occur over time, all system of care communities are visited 3 times, at 18–24 month intervals, in evaluation years 1 through 6. Initial data collection site visits are scheduled according to the relative development of the individual programs: more advanced communities are scheduled first, followed by all others, until all have completed data collection within the allotted timeframe. The initial data collection site visits took place between February and September 2007, with subsequent site visits planned to occur at 18–24 month intervals.


The System of Care Assessment protocol yields an average of 23 individual interviews and 6 case record reviews per data collection site visit. It is expected that these averages will be achieved during the Phase V data collection process. Key informants include the local project director, representatives of core child-serving agencies, representatives of family organizations, youth coordinators, care coordinators, direct service providers, caregivers of children who are receiving services through the system of care, and youth who are receiving services through the system of care. The average time to obtain the required information from each person is about one hour.


Prior to the site visit, the National Evaluation Team sends out tables to be completed by the system of care community. These tables collect information on: (1) the structure and participants of the governing body; (2) trainings that have been provided on system of care principles; (3) demographics of program staff; (4) services provided in the system of care community’s service array; (5) amounts, sources, and types of funding; and (6) participants on the case review team. These completed tables are e-mailed to the National Evaluation Team approximately 4 weeks prior to the site visit. (See Attachments 4.A.1–4.A.4 for System of Care Assessment protocols.)


The Interagency Collaboration Scale (IACS) is administered to approximately 14 respondents per site visit, including project directors, core child-serving agency representatives, representatives from family organizations, care coordinators, and direct service providers. The System of Care Assessment interview guides, the IACS, and the protocol for arranging site visits and identifying potential respondents are presented in Instruments A.2 and A.3.


Cross-Sectional Descriptive Study. Data for the Cross-Sectional Descriptive Study are collected at entry into services for all children and families in the grant communities. Data for this study are collected by grant communities’ intake staff, who are trained by the National Evaluation Team to ensure standard collection of these data. To standardize the collection of these data across grant communities, the National Evaluation Team has developed the Enrollment and Demographic Information Form (EDIF) and the Child Information Update Form (CIUF). (See Instruments B.1 and B.2.) The information can be collected from case records or from interviews conducted at intake. The National Evaluation Team strongly recommends that all grant communities incorporate these items into their intake process. These data are entered directly into a Web-based database by intake personnel to facilitate capture of basic descriptive characteristics of children served. The information collected in the EDIF includes elements required in the Guidance for Applicants (listed below), plus a few additional elements specific to the evaluation. The required descriptive information includes the following:


  • The number of children served by the CMHS service program,

  • Demographic characteristics of the children and families, and

  • Diagnostic information on the child.


For families participating in the Longitudinal Child and Family Outcome Study, the descriptive information that may change over time (e.g., diagnosis, insurance status) is also collected at each follow-up data collection point using the CIUF. Evaluation staff collect these follow-up descriptive data elements in conjunction with other follow-up data collection for the Longitudinal Child and Family Outcome Study (see below).


Longitudinal Child and Family Outcome Study. Data collection for this evaluation study begins in the second year of the grant communities’ funding. Because respondents’ reading levels vary, the instruments are administered in interview format. This approach has been successfully implemented in Phases II, III, and IV. These data are collected at intake and at follow-up data collection points. In Phase V, outcome data are collected from a sample of children, youth, and their caregivers. (See Instrument C for instruments.) The CMHS program’s Guidance for Applicants requires grant communities to collect the following information on child and family outcomes:


  • Standardized assessments of child symptoms and social functioning;

  • Functional indicators including school performance and contacts with law enforcement;

  • Restrictiveness of child’s service placements; and

  • Family functioning.


Following children and families as long as possible allows the assessment of the long-term impact of the system and permits important functional outcomes to be assessed as children and youth develop toward maturity (e.g., completion of high school). Thus, children and families who enter the study in the first or second year of data collection are followed for 36 months, those who enter in the third year are followed for 30 months, and those who enter in the fourth year are followed for 18 months.


Seven of the measures are completed by youth 11 years of age and older. These include the following:

  • Youth Services Survey (YSS, YSS-F),

  • Delinquency Survey—Revised (DS-R),

  • Substance Use Survey—Revised (SUS-R),

  • Gain Quick-R Substance Problem Scale (Gain Quick-R),

  • Youth Information Questionnaire (YIQ),

  • Revised Children’s Manifest Anxiety Scales (RCMAS),

  • Reynolds Adolescent Depression Scale—Second Edition (RADS-2).


All of the measures planned to assess child mental health and family outcomes were already cleared by the OMB for use during Phase V of the national evaluation; many of the measures have been approved across multiple phases. Previously approved measures include the following:


  • Information regarding the residential status of children is collected from caregivers using the Living Situations Questionnaire (LSQ). (See Instrument C.1.)

  • To measure child clinical symptomatology, caregivers of children age 6 years and older complete the Child Behavior Checklist (CBCL 6–18). To measure clinical symptomatology in young children, caregivers of children age 5 years and younger complete the Child Behavior Checklist 1½–5 (CBCL 1½–5). The CBCL has been widely used in children’s mental health services research to assess social competence, behaviors, and feelings. (See Instrument C.2.)

  • The Caregiver Strain Questionnaire (CGSQ) is used to measure how families are affected by the special demands associated with caring for a child with serious emotional disturbance. (See Instrument C.3.)

  • To identify the emotional and behavioral strengths of children, caregivers of children older than age 5 years complete the Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS-2C). The BERS-2C is a strengths-based measure of social competence. (See Instrument C.4.)

    • To measure children’s functioning in school environments, caregivers complete the Education Questionnaire—Revised (EQ-R). (See Instrument C.5.)

  • The Family Life Questionnaire (FLQ) is used to assess how families interact and communicate. (See Instrument C.6.)

  • Youth complete the Delinquency Survey—Revised (DS-R). This measure identifies delinquent or risky behavior for which youth with mental illnesses may be at high risk. (See Instrument C.7.)

  • The Gain Quick-R Substance Problem Scale (Gain Quick-R) measures substance use, abuse and dependence and is administered to youth. (See Instrument C.8.)

  • The Substance Use Survey—Revised (SUS-R) is administered to youth to determine alcohol, tobacco, and drug use during the previous 30 days and 6 months. (See Instrument C.9.)

  • To determine if youth are experiencing anxiety, they are administered the Revised Children’s Manifest Anxiety Scales (RCMAS). (See Instrument C.10.)

  • Youth are administered the Reynolds Adolescent Depression Scale—Second Edition (RADS-2) to assess if they are experiencing depression. (See Instrument C.11.)

  • The Youth Information Questionnaire (YIQ) is a compilation of questions on a range of topics, including coercion, acculturation, symptomatology, peer relations, employment status, suicidality, and neighborhood safety that are answered by youth. (See Instrument C.12.)

  • To identify the emotional and behavioral strengths of children from their own perspective, youth complete the Behavioral and Emotional Rating Scale—Second Edition, Youth Scale (BERS-2Y). (See Instrument C.13.)

  • The Columbia Impairment Scale (CIS) is completed by caregivers of children older than 5 years to measure children’s general level of functioning. (See Instrument C.14.)

  • The Vineland Screener (VS), which assesses development in young children, is completed by caregivers of children age 5 years and younger. (See Instrument C.15.)

  • The Caregiver Information Questionnaire (CIQ), which collects descriptive information about the child and family, is completed by caregivers. (See Instrument C.16.)


On-site data collectors, hired and managed by grant communities, collect data in the funded systems of care. In these grant communities, the people who collect the data depend on the resources and needs of the grant communities. For example, some grant communities may choose to hire two full-time staff to manage the local evaluation and to collect all the data. Other grant communities might choose to hire one full-time evaluator to manage the evaluation but collect data with flexible part-time staff.


The National Evaluation Team documents and monitors data collection procedures in the system of care grant communities to ensure the greatest possible uniformity in data collection across grant communities. In addition, evaluation staff and data collectors are trained using standard materials developed by the National Evaluation Team.


Service Experience Study. All of the measures planned to assess service experience were already cleared by the OMB for use during Phase V of the national evaluation; many of the measures have been approved across multiple phases. Data for the Service Experience Study are collected along with data for the Longitudinal Child and Family Outcome Study and include:


(1) Recording service contacts on the Multi-Sector Service Contacts Questionnaire—Revised (MSSC-R) (Instrument D.1);

(2) An assessment of service experience, satisfaction, and perceived outcomes with the Family and Youth Services Surveys (YSS-F and YSS) (Instruments D.2 and D.3); and

(3) Caregiver report on the cultural competence of services provided using the Cultural Competence and Service Provision Questionnaire (CCSP) (Instrument D.4).


The Service Experience Study also examines the congruence between the program’s original design and what clients actually experience during implementation of that design. The Youth Services Surveys focus on whether the overall service system experienced by youth and their caregivers reflects the key principles of the system of care model. Youth and caregivers report their perceptions of whether the services they received were accessible, well coordinated, family-driven, culturally competent, helpful in meeting therapeutic goals, and matched with the individual needs of the child and family.


This corresponds to the Guidance for Applicants, which requires sites to collect data on:


    • Collaboration and coordination of system components;

    • Family involvement in services; and

    • Family and youth satisfaction with services.


Data for the Service Experience Study are collected in all system of care communities. These instruments are completed at follow-up by families who have received services, as indicated in the gate question, and are participating in the Longitudinal Child and Family Outcome Study. On average, children and families complete five follow-up points.


Sustainability Study. The Sustainability Study involves collecting data in each grant community via a Web-based survey. This study gathers data on system of care characteristics and factors related to sustainability, and monitors and evaluates grant communities’ ability to sustain their systems of care post-funding. The Sustainability Survey is completed by four selected staff (i.e., project director, family organization representative, agency representative, mental health representative) from each grant community in years 3, 4, 5, and 6 of the evaluation. (See Attachments B.1–5 and Instrument E.1.)


Following recruitment activities and verification of contact information, survey invitations are distributed by e-mail or mail. The National Evaluation Team implements this Web-based survey, adhering to accepted methods for mail and Internet surveys. After initial solicitation of participation by a key individual in each grant community and identification of appropriate survey participants, a pre-survey letter explaining that the recipient will be asked to participate in a survey is sent to the selected staff in each community, followed 1 week later by a letter containing a token incentive and directions for logging onto a Web site to complete the Internet survey. Instructions are also provided for obtaining a hard copy of the survey if desired. A follow-up reminder postcard is sent 1 week later; a second reminder letter is sent 1 week after that to those respondents who have not completed the survey; and 1 week after that, another letter containing a hard copy of the survey and a return envelope is sent to all respondents who have not completed the Web survey. Links to the survey Web site and reminder letters can also be sent by e-mail. Telephone reminder calls are made to any remaining nonrespondents. These data collection instruments and procedures are the same as those previously approved by OMB for Phases II, III, IV, and V of the national evaluation.


Data collected for this study correspond to the Guidance for Applicants, which requires grant communities to collect data on their progress toward becoming increasingly sustainable over the life of the award, with the amount of program funding from non-award sources increasing incrementally in each year of the award.


Services and Costs Study. To provide data for this study, grant communities will collect two types of data. The first type is budget data on services provided through flexible fund expenditures. The second type is child-level service event data, which include data on each service provided to each child/youth by as many partner agencies in the systems of care as possible. The availability of these data and the procedures that communities implement to access them vary widely across grant communities. Some of the data needed for this study are already collected by communities in existing data systems developed for their own program management purposes. Other data are recorded on paper-based forms or as part of the child’s case records. However, some communities do not currently collect the data needed for this study, either electronically or on paper. For data not already collected, communities will be asked to begin collecting these data specifically for the Services and Costs Study.


Data will be compiled either by extracting data from existing data systems and recoding them according to a specified data dictionary or by key-entering information collected from paper records. Some communities will rely on one of these methods exclusively, while other communities will use a combination of both.


The National Evaluation Team will supply two data dictionaries specifying how communities should recode data from existing data systems, one for flexible fund expenditures and the other for service event data. The National Evaluation Team will also provide two data entry applications for communities to use when key-entering data from paper records. The first application is the Flex Funds Tool, for data on flexible funding expenditures. The second application is the Services and Costs Data Tool, for child-level service event data.
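The extract-and-recode route can be pictured with a short sketch in which a community’s local service codes are mapped to standard codes; the mapping, codes, and field names below are illustrative stand-ins, not the actual data dictionary supplied by the National Evaluation Team.

    # Sketch: recoding local MIS service codes to a standard data dictionary.
    # The mapping below is hypothetical, for illustration only.
    import csv
    import io

    SERVICE_CODE_MAP = {          # local code -> hypothetical standard code
        "OP-THER": "01",          # outpatient therapy
        "CASE-MGMT": "02",        # case management
        "RESPITE": "03",          # respite care
    }

    local_extract = io.StringIO(
        "child_id,service_code,units,unit_cost\n"
        "1001,OP-THER,4,85.00\n"
        "1002,RESPITE,2,40.00\n"
    )

    recoded = []
    for row in csv.DictReader(local_extract):
        row["service_code"] = SERVICE_CODE_MAP[row["service_code"]]
        row["total_cost"] = f'{int(row["units"]) * float(row["unit_cost"]):.2f}'
        recoded.append(row)
    print(recoded)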


Data that are compiled by extracting and recoding existing data will be transmitted to the National Evaluation Team at regular intervals beginning in evaluation year 4. Data that are entered from paper records will be transmitted to a central database on an ongoing basis, as they are entered.


CQI Initiative Evaluation. The CQI Initiative Evaluation involves collecting data from respondents in all grant communities via a Web survey and from respondents in a subset of communities using semi-structured interviews. This study will gather data on the effectiveness of the CQI Initiative implementation and the extent to which implementation goals were met, i.e., the degree to which communities are engaged in CQI, the mechanisms by which CQI is being pursued by the communities, and community members’ perceptions of the effectiveness of CQI efforts. The CQI Initiative Survey will be completed by up to seven staff (i.e., principal investigator, project director, lead evaluator, cultural competence coordinator, social marketer, lead family representative, and youth coordinator) from each grant community in evaluation year 4.


Following recruitment activities and verification of contact information, survey participants will be directed to the Web-based survey. Implementation of this survey will adhere to accepted methods for mail and Internet surveys. The National Evaluation Team will seek to identify a key constituent in each grant community to provide assistance in identifying community respondents. After initial solicitation of participation by the key individual in each grant community and identification of appropriate survey participants, a pre-survey letter explaining that the recipient will be asked to participate in a survey will be sent to the selected members in each community via e-mail or standard mail, followed 1 week later by a letter containing directions for logging onto a Web site to complete the Internet survey. Instructions will also be provided for obtaining a hard copy of the survey if desired. A follow-up reminder will be sent 1 week later; 1 week after that, another letter containing a hard copy of the survey will be sent to all respondents who have not completed the Web survey (Dillman, 2001). Telephone reminder calls will be made to any remaining nonrespondents. Each respondent will be mailed a gift card upon completion of the survey.


Six communities will be selected for interviews based on quantitative and qualitative analysis of the CQI Initiative Survey, using a scale developed by the National Evaluation Team that ranks sites in terms of their engagement in and satisfaction with their CQI process. Respondents identified for the semi-structured interviews will be contacted by telephone following completion of the Web survey to solicit participation, obtain consent, and, if consent is given, schedule an appointment for the telephone interview. The National Evaluation Team will conduct the telephone interviews and enter responses into a database.
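
The composition of the ranking scale is not specified here; as a minimal sketch, assuming an unweighted mean of hypothetical engagement and satisfaction scores, the selection logic might resemble the following Python fragment. The community names and scores are invented for illustration.

def rank_communities(scores):
    """Rank communities by the mean of their (engagement, satisfaction)
    scores, highest first."""
    return sorted(scores, key=lambda name: sum(scores[name]) / 2.0,
                  reverse=True)

if __name__ == "__main__":
    survey_scores = {  # hypothetical scores on a 1-5 scale
        "Community A": (4.5, 4.0), "Community B": (2.0, 2.5),
        "Community C": (3.8, 4.2), "Community D": (1.5, 2.0),
        "Community E": (4.9, 4.7), "Community F": (3.0, 3.1),
        "Community G": (2.7, 3.4),
    }
    ranked = rank_communities(survey_scores)
    print("Selected for interviews:", ranked[:6])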


Data collection for the Web survey will begin in the first quarter of national evaluation year 5; data collection for interview respondents will begin in the second quarter of year 5.


Evidence-Based Practices Study. Data collection for the Implementation Factors Substudy involves obtaining data from a sample of 120 leadership team members, 90 service providers, and 90 youth and caregivers across all 30 grant communities. Each community's National Evaluation site liaison will assist with scheduling a conference call, during which the EBP Study Team will use the discussion guide to gather preliminary information. Members of the system of care leadership team in each community will be asked to recommend other community members for interviews using the respondent selection criteria, and invitation letters will be sent to potential respondents. Individual and small-group telephone interviews will be conducted using three semi-structured interview formats (the Implementation Factors Substudy Discussion Guide), one for each participant category. The formats address the contextual factors that support and/or inhibit the implementation of evidence- and practice-based treatments and the impact of these approaches on consumers, providers, agencies, and systems of care. These data are used to assess both the extent to which respondents are aware of barriers and facilitators in the implementation of evidence- and practice-based treatments and their impact on program, service, and health outcomes. It is anticipated that data collection will take place in the last quarter of year 4 of the national evaluation.

For the Family and Youth Experiences Substudy, data on caregiver experience with the receipt of evidence-based practices are collected as part of the Multi-Sector Service Contacts Questionnaire—Revised (MSSC-R; see Instrument D.1). These data are used to assess the extent to which caregivers are informed about the basis for the services they received and are aware of the evidence base used in defining a plan of treatment. The data collection procedures are described above under the Service Experience Study.


Cultural and Linguistic Competence Study. Data for the Cultural and Linguistic Competence Study will be collected via focus groups, in-person interviews, and telephone interviews using semi-structured interview protocols. Invitation letters will be sent to 2005- and 2006-funded communities. The protocols will address cultural and linguistic characteristics that may influence philosophy, infrastructure, service system, service array, individualized care planning, family engagement, and evaluation, as well as the barriers and facilitators encountered as cultural and linguistic standards are addressed. It is anticipated that data collection for the Culturally Competent Implementation and Outcomes Self-Assessment Study (CCIOSAS) will take place in national evaluation year 3 and data collection for the Culturally Competent Evidence-Based Practices Study (CCEBPS) in national evaluation year 5.


Table 8 summarizes the respondent, data collection procedure, and periodicity for each measure.


Table 8. Instrumentation, Respondents, and Periodicity


Measure

Indicators

Data Source(s)

Method

When Collected

System of Care Assessment (all sites)

System of Care Assessment Tool (Interview Guides and Data Collection Forms)

  • Family-driven

  • Youth-guided

  • Individualized services

  • Cultural competence

  • Interagency collaboration

  • Service coordination

  • Service array

  • System & service accessibility

  • Community-based services

  • Least restrictive service provision


  • Project staff

  • Core agency representatives

  • Family members

  • Caregivers

  • Youth

  • Service providers

  • Other constituents

  • Documents



Interview; Document review

Every 18–24 months






System of Care Assessment (all sites) (continued)

Interagency Collaboration Scale (IACS)

Interagency collaboration

  • Project staff

  • Core agency representatives

  • Family organization representatives

  • Service providers

  • Others

Survey

Every 18–24 months



Cross-Sectional Descriptive Study

Enrollment and Demographic Information Form (EDIF)

  • Agency involvement

  • Source of referral

  • Date of birth

  • Gender

  • Race/ethnicity

  • Zipcode

  • Presenting problems

  • Child welfare status

  • Health insurance status

  • Diagnoses

  • Type of diagnosing provider

  • Enrollment status

  • Service plan participation


  • Intake records

  • Caregivers

Record Review and Interview

At Intake

Child Information Update Form (CIUF)

  • Agency involvement

  • Source of referral

  • Zipcode

  • Child welfare status

  • Health insurance status

  • Diagnoses

  • Type of diagnosing provider

  • Enrollment status


  • Intake records

  • Caregivers

Record Review and Interview

At 6 months and every 6 months thereafter

Longitudinal Child and Family Outcome Study (a sample of children and families enrolled in the system of care)

Caregiver Information Questionnaire (CIQ)

  • Age

  • Educational level and placement

  • Socioeconomic status

  • Race/ethnicity

  • Parents' employment status

  • Living arrangement

  • Presenting problem(s)

  • Intake/referral source

  • Risk factors for family and child

  • Child and family physical health

  • Coercion for services

  • Service use


  • Caregiver

Interview

At Intake, and every 6 months thereafter

Living Situations Questionnaire (LSQ)

  • Living situations

  • Number of placements

  • Restrictiveness of placements

  • Caregiver

Interview

At Intake, and every 6 months thereafter

Longitudinal Child and Family Outcome Study (a sample of children and families enrolled in the system of care) (continued)

Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS-2C)


  • Strengths

  • Social competence

  • Caregiver of children age 6 years and older

Interview

At Intake, and every 6 months thereafter

Child Behavior Checklist (CBCL) and Child Behavior Checklist 1½–5 (CBCL 1½–5)


  • Symptomatology

  • Social competence

  • Caregiver

Interview

At Intake, and every 6 months thereafter

Education Questionnaire—Revised (EQ-R)

  • Functioning in school environments

  • Caregiver

Interview

At Intake, and every 6 months thereafter

The Family Life Questionnaire (FLQ)

  • Family interaction and communication


  • Caregiver

Interview

At Intake, and every 6 months thereafter

The Vineland Screener (VS)

  • Development

  • Personal and social sufficiency

  • Caregiver of children age 5 years and younger


Interview

At Intake, and every 6 months thereafter

The Columbia Impairment Scale (CIS)

  • General functioning

  • Caregiver of children age 6 years and older

Interview

At Intake, and every 6 months thereafter

Caregiver Strain Questionnaire (CGSQ)


  • Caregiver strain


  • Caregiver

Interview

At Intake, and every 6 months thereafter

Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS-2Y)


  • Strengths

  • Social competence

  • Youth

Interview

At Intake, and every 6 months thereafter

Delinquency Survey—Revised (DS-R)

  • Delinquent or risky behaviors

  • Youth 11 years and older

Interview

At Intake, and every 6 months thereafter

Gain-Quick Substance Problems Scale (Gain Quick-R)


  • Substance use, abuse, and dependence

  • Youth 11 years and older

Interview

At Intake, and every 6 months thereafter

Substance Use Survey—Revised (SUS-R)

  • Alcohol, tobacco, and drug use

  • Youth 11 years and older

Interview

At Intake, and every 6 months thereafter

Revised Children’s Manifest Anxiety Scales (RCMAS)


  • Child anxiety

  • Youth 11 years and older

Interview

At Intake, and every 6 months thereafter

Reynolds Adolescent Depression Scale—Second Edition (RADS-2)


  • Child depression

  • Youth 11 years and older

Interview

At Intake, and every 6 months thereafter

Longitudinal Child and Family Outcome Study (a sample of children and families enrolled in the system of care) (continued)

Youth Information Questionnaire (YIQ)

  • Acculturation

  • Coercion

  • Peer relations

  • Symptomatology

  • Suicidality

  • Neighborhood safety

  • Presenting problems

  • Employment status

  • Youth 11 years and older

Interview

At Intake, and every 6 months thereafter

Service Experience Study

Multi-Sector Service Contacts—Revised (MSSC-R)

  • Type of service

  • Amount of service

  • Location of service

  • Caregiver

Interview

Every 6 months after intake if services received

Youth Services Survey-Families (YSS-F)

  • Service experience

  • Client satisfaction

  • Perceived outcomes

  • Caregiver

Interview

Every 6 months after intake if services received

Youth Services Survey (YSS-Y)

  • Service experience

  • Client satisfaction

  • Perceived outcomes

  • Youth 11 years and older

Interview

Every 6 months after intake if services received

Cultural Competence and Service Provision Questionnaire (CCSP)

  • Cultural competence

  • Caregiver

Interview

Every 6 months after intake if services received

Sustainability Study

Sustainability Survey

  • System of care characteristics

  • Factors related to sustainability

  • Success of sites in achieving sustainability post-funding


  • Local site informants

Web-based survey

Once in evaluation years 3, 4, 5, and 6

Services and Costs Study (all enrolled in the Longitudinal Child and Family Outcome Study)

Flex Funds Data Dictionary

  • Child ID

  • Type of expenditure

  • Date of flex funds expenditure

  • Amount of expenditure

  • Budget expenditure information

Database extraction and recoding; data entry from paper records


Continuously; data transmitted at regular intervals in evaluation years 4, 5, and 6

Services and Costs Data Dictionary

  • Child ID

  • Date of service

  • Service type

  • Sponsoring agency

  • Provider type

  • Service location

  • Service units/number of units

  • Amount of charge

  • Amount of adjustment

  • Amount paid by source of payment

  • Estimated value for unbilled services


  • Fiscal or administrative databases; administrative records

Database extraction and data recoding; data entry from paper records

Continuously; data transmitted at regular intervals in evaluation years 4, 5, and 6

CQI Initiative Evaluation

CQI Initiative Survey

  • Utilization of CQI Progress Report

  • Description of CQI infrastructure

  • Effectiveness of technical assistance

  • Development of communication feedback loop

  • Satisfaction with CQI Progress Report and technical assistance

  • Key constituents

Web-based survey

Once in the first quarter of national evaluation year 4

CQI Initiative Interview

  • Key constituent involvement in implementing CQI Initiative

  • Community use of CQI Progress Report and technical assistance

  • Extent to which initiative was implemented according to plans

  • Satisfaction with implementation

  • Program changes resulting from the CQI process

  • Key constituents

Semi-structured telephone interview

Once in the second quarter of national evaluation year 4

Evidence-Based Practices Study

Multi-Sector Service Contacts—Revised (MSSC-R)

  • Type of service

  • Amount of service

  • Knowledge of service

  • Caregiver

Interview

Every 6 months after intake if services received

The Implementation Factors Discussion Guide

  • Extent to which EBT and PBE were implemented

  • Factors related to EBT and PBE implementation

  • Knowledge and experience with EBT and PBE

  • Impact of EBT and PBE implementation

  • Program Directors

  • Service providers

  • Caregivers

  • Youth

Interviews

Once at end of evaluation year 4

Cultural and Linguistic Competence Study

CCIOSAS – Beneficiaries of Self-Assessment Findings Focus Group Guide

  • Provider knowledge, attitudes, and practices

  • Administrator knowledge, attitudes, and practices

  • Family and youth attitudes

  • Service providers

  • Program Directors

  • Caregivers

  • Youth

Focus groups

Once in evaluation year 3

CCIOSAS – Participants in Self-Assessments Focus Group Guide

  • Provider knowledge, attitudes, and practices

  • Administrator knowledge, attitudes, and practices

  • Family and youth attitudes

  • Service providers

  • Program Directors

  • Caregivers

  • Youth

Focus groups

Once in evaluation year 3

CCIOSAS – Users of Self-Assessment Findings Focus Group Guide

  • System of care supports

  • Service providers

  • Program Directors

  • Caregivers

  • Youth

Focus groups

Once in evaluation year 3

CCIOSAS – Telephone Interview Guide

  • Provider knowledge, attitudes, and practices

  • Administrator knowledge, attitudes, and practices

  • Family and youth attitudes

  • Service providers

  • Program Directors

  • Caregivers

  • Youth

Telephone interviews

Once in evaluation year 3

CCEBPS – Managers of EBP/PBE Interventions Focus Group Guide

  • Provider knowledge, attitudes, and practices

  • Administrator knowledge, attitudes, and practices

  • Family and youth attitudes

  • Service providers

  • Program Directors

  • Caregivers

  • Youth

Focus groups

Once in evaluation year 5

CCEBPS – Providers of EBP/PBE Interventions Focus Group Guide

  • Provider knowledge, attitudes, and practices


  • Service providers


Focus groups

Once in evaluation year 5

CCEBPS – Family and Youth Focus Group Guide

  • Family and youth attitudes

  • Caregivers

  • Youth

Focus groups

Once in evaluation year 5

CCEBPS – Telephone Interview Guide

  • Provider knowledge, attitudes, and practices

  • Administrator knowledge, attitudes, and practices

  • Family and youth attitudes

  • Service providers

  • Program Directors

  • Caregivers

  • Youth

Telephone interviews

Once in evaluation year 5



3. METHODS TO MAXIMIZE RESPONSE RATES


To maximize the response rate for all data collection efforts, a number of steps are taken:


The National Evaluation Team continues to take an active role in providing technical assistance and support to the grant communities. This includes:

(1) A detailed Data Collection Procedures Manual;

(2) An initial training on evaluation protocols;

(3) Evaluation workshops at semi-annual national meetings;

(4) One-on-one contact with national evaluation liaisons;

(5) Regular teleconferences and site visits throughout the evaluation period;

(6) Forums for cross-community facilitated discussions;

(7) Reading materials; and

(8) Additional guidance and information, as questions arise.


In addition, the National Evaluation Team provides resources to ensure that grant community evaluators know when an interview is due for completion: a tracking system built in Microsoft Access specifically for this evaluation, and reminder e-mails generated by the Internet-based data collection system. These resources eliminate the need for sites to duplicate effort and expense in designing local tracking materials.
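
As an illustration of the kind of due-date logic such a tracking system could apply, the following minimal Python sketch flags follow-up interviews that fall due at roughly 6-month intervals after intake. The function names, the 183-day interval, and the 30-day reminder window are assumptions for illustration; the one-month lead time mirrors the contact procedures described later in this section.

from datetime import date, timedelta

FOLLOW_UP_INTERVAL_DAYS = 183  # roughly 6 months
REMINDER_WINDOW_DAYS = 30      # begin reminders one month before due

def next_follow_up_due(intake_date, completed_follow_ups):
    """Date the next follow-up interview is due, given how many
    follow-ups have already been completed."""
    interval = FOLLOW_UP_INTERVAL_DAYS * (completed_follow_ups + 1)
    return intake_date + timedelta(days=interval)

def needs_reminder(intake_date, completed_follow_ups, today):
    """True once today falls within the reminder window before the due date."""
    due = next_follow_up_due(intake_date, completed_follow_ups)
    return today >= due - timedelta(days=REMINDER_WINDOW_DAYS)

if __name__ == "__main__":
    # A child enrolled January 15, 2009, with no follow-ups completed yet,
    # is flagged for reminders by mid-June 2009.
    print(needs_reminder(date(2009, 1, 15), 0, date(2009, 6, 20)))  # True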


Additionally, the National Evaluation Team provides mechanisms for grant communities to communicate with the National Evaluation Team and with one another through an Internet-based listserv that facilitates communication about training and technical assistance regarding evaluation implementation and utilization. The listserv allows grant community evaluators to communicate with the National Evaluation Team and each other through group e-mail: any message sent to the listserv is automatically distributed to all grant community evaluators. The listserv is run at no cost to grant community evaluators.


Special training efforts are also conducted in communities with smaller service populations to ensure that as many people as possible from the target population are enrolled and that grant community staff are familiar with methods for maximizing response rates. The National Evaluation Team encourages these grant communities to keep in frequent contact with study participants to update telephone numbers and addresses and to create an identifier for the grant community that helps engage families. The National Evaluation Team also provides these grant communities with contact information for staff from other grant communities that have achieved high response rates and assists them in applying strategies that have been used successfully elsewhere.


To help ensure that data are collected regularly and in keeping with national evaluation standards, data collection staff at the local grant communities continue to work closely with local providers, staff from various agencies, and evaluation staff. These contacts focus on the evaluation, data collection procedures, and any questions or concerns of the participating providers or agencies. In addition, local parent groups are enlisted to encourage the cooperation of families in providing child and family information.


In keeping with national evaluation standards, information is collected from participants in the Longitudinal Child and Family Outcome Study to facilitate contacting them in the future. This includes the names, phone numbers, and addresses of close friends and family members who are likely to know the participants' whereabouts if they move. At the time of follow-up data collection, staff attempt to contact respondents at different times of the day and week using a variety of methods (e.g., phone calls, mailed postcards). This continues until it is determined that a family has refused further participation or cannot be found. Efforts to contact respondents for follow-up data collection begin at least one month before the follow-up interview is due. Other efforts to increase the response rate include:


    • Providing an incentive payment for completing follow-up interviews;

    • Administering the instruments to children and their parents or caregivers at times and settings of their choice and administering multiple instruments at one time;

    • Developing a close working relationship between the data collection staff and providers at each grant community to facilitate tracking;

    • Conducting follow-up and informational mailings throughout the study period to maintain contact with study participants;

    • Using a centralized data collection and tracking system involving trained interviewers and at least one person dedicated to the tracking of study participants over time to keep study attrition to a minimum;

    • Employing proven tracking techniques (e.g., request address corrections from the post office for forwarded mail, use Web-based address and telephone searches, employ locator services to search for respondents);

    • Obtaining permission from caregivers for evaluators to contact other agencies for the purpose of getting new addresses and phone numbers if the family has moved since the last interview; and

    • Providing grant communities with useful feedback on data obtained through the evaluation activities that assists them in planning and service delivery.



4. TESTS OF PROCEDURES


Many instruments for Phase V are standardized measures that have been tested through use in children's mental health services research and practice and have been used in the field for the past three years. These include the:

  • Child Behavior Checklist (CBCL),

  • Behavioral and Emotional Rating Scale—Second Edition (BERS-2),

  • Gain-Quick Substance Problems Scale (Gain Quick-R),

  • Youth Services Surveys (YSS),

  • Revised Children’s Manifest Anxiety Scales (RCMAS),

  • Reynolds Adolescent Depression Scale—Second Edition (RADS-2), and

  • Interagency Collaboration Scale (IACS).


Selection of measures was based on expert panel reviews and an assessment of measurement quality as reported in the literature. (Information on the reliability and validity of the measures and other supporting materials appears along with the instruments in the List of Instruments.) Decisions about Phase V instrumentation were made in conjunction with expert reviewers, site representatives, and family members. These consultants are listed in Attachment 2.


In addition to providing input into the selection of standardized instruments, the team of consultants suggested measures to remove from the evaluation and specific items to include; these suggestions have been incorporated into the new and revised measures. New and revised measures have been administered to determine burden estimates. Experience and data from Phase IV were further used to assess reliability and validity and contributed to the burden estimates.


The following are new measures in Phase V:


Flex Funds Data Dictionary. The Flex Funds Data Dictionary was reviewed by an expert panel and pilot tested by five communities. Community involvement in the pilot test included participation in a one-hour training call, entry of flexible funds data in the Flex Funds Tool for one month, submission of a copy of these data to the National Evaluation Team, and completion of a review form. Review and comment from communities about implementing the data dictionary were essential to ensuring that the structure established was efficient and usable by all communities.


Services and Costs Data Dictionary. The Services and Costs Data Dictionary was reviewed by an expert panel and pilot tested by four communities. Community involvement in the pilot test included an initial training conference call, review of the data dictionary, extraction and recoding of MIS or partner agency data, submission of the recoded data file to the National Evaluation Team, completion of a feedback form, and participation in a follow-up conference call.


CQI Initiative Survey and Interview Guide. The survey for the CQI Initiative Evaluation was pilot tested with several representatives from Phase IV communities to obtain feedback and calculate an accurate burden estimate. The survey was subsequently revised based on feedback from the pilot test participants. The CQI Initiative Evaluation interview protocol expands on questions included in the survey.


Evidence-Based Practices Discussion Guide. The Implementation Factors Substudy (IFS) is newly designed to combine elements of the three originally proposed EBP Substudies (PPS, CRS, and CPPCROS). The IFS Discussion Guide was created based on pilot test feedback from those three Substudies.


Cultural and Linguistic Competence Study Focus Group Guide and Interview Guide. The National Evaluation Team conducted and completed the first CLC Substudy (CLCIS) in year 1 of the evaluation. The Team conducted interviews at four sites, and no more than nine interviews were conducted using each version of the protocol; for this reason, OMB clearance was not requested. Findings from these interviews were used to develop protocols for the remaining two Substudies (CCIOSAS and CCEBPS), for which clearance is being requested.



Revised measures in Phase V include the following:


  • Caregiver Information Questionnaire (CIQ);

  • Education Questionnaire—Revised (EQ-R); and

  • Multi-Sector Service Contacts Questionnaire—Revised (MSSC-R).



Measures that are unchanged from previous phases of the evaluation include the following:


  • Living Situations Questionnaire (LSQ);

  • Child Behavior Checklist (CBCL);

  • Caregiver Strain Questionnaire (CGSQ);

  • Behavioral and Emotional Rating Scale—Second Edition (BERS-2);

  • Family Life Questionnaire (FLQ);

  • Delinquency Survey—Revised (DS-R);

  • Gain-Quick Substance Problems Scale (Gain Quick-R);

  • Substance Use Survey—Revised (SUS-R);

  • Revised Children’s Manifest Anxiety Scales (RCMAS);

  • Reynolds Adolescent Depression Scale (RADS-2);

  • Columbia Impairment Scale (CIS);

  • Vineland Screener (VS);

  • Youth Services Survey (YSS);

  • Youth Information Questionnaire (YIQ);

  • Cultural Competence and Service Provision Questionnaire (CCSP); and

  • Sustainability Survey.


All Cross-Sectional, Longitudinal Child and Family Outcome, and Service Experience Study measures, as well as the Sustainability Web Survey, have been translated into Spanish. The reliability and validity of the Spanish Child Behavior Checklist (CBCL) have been reported in the literature. Translations of measures are conducted using established procedures, as in earlier phases. First, experienced bilingual translation consultants translated the measures from English to Spanish. Then, to maximize the accuracy of the translation, full measures (or, in some cases, selected sections) were back-translated from Spanish to English by other translators, most of whom were native speakers in grant communities.



5. STATISTICAL CONSULTANTS


The National Evaluation Team has full responsibility for the development of the overall statistical design and assumes oversight responsibility for data collection and analysis for Phase V. Training, technical assistance, and monitoring of data collection are provided by the National Evaluation Team. The individual responsible for overseeing data collection and analysis is:


Carolyn Lichtenstein, Ph.D.

Walter R. McDonald & Associates, Inc.

12300 Twinbrook Parkway, Suite 310

Rockville, MD 20852

(301) 881–2590 x 237


The following individuals serve as statistical consultants to this project:


Susan Ettner, Ph.D.

Professor

David Geffen School of Medicine at UCLA

Division of General Internal Medicine and Health Services Research

911 Broxton Plaza, Room 106

Box 951736

Los Angeles, CA 90095-1736

Campus code: 173617

(310) 794-2289



Anna Krivelyova, M.S.

Macro International Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321–3211


Robert Stephens, Ph.D., M.P.H.

Macro International Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321–3211


Stephen L. Forssell, Ph.D.

Walter R. McDonald & Associates, Inc.

12300 Twinbrook Parkway, Suite 310

Rockville, MD  20852

(301) 881-2590 x 242


Carolyn Lichtenstein, Ph.D.

Walter R. McDonald & Associates, Inc.

12300 Twinbrook Parkway, Suite 310

Rockville, MD 20852

(301) 881–2590 x 237


The agency staff person responsible for receiving and approving contract deliverables is:


Sylvia K. Fisher, Ph.D.

Child, Adolescent, and Family Branch

Center for Mental Health Services

Substance Abuse and Mental Health Services Administration

1 Choke Cherry Road, Room 6–1047

Rockville, MD 20857

LIST OF INSTRUMENTS



A. System of Care Assessment

1. Site Visit Tables

2. System of Care Assessment Interview Protocols

A. Representative of Core Agency

B. Project Director

C. Family Representative/Representative of Family/Advocacy Organizations

D. Evaluation and Quality Monitoring: Project Staff, Agency Reps., Provider Reps., CBO Reps., Family Reps.

E. Intake Worker

F. Care Coordinator

G. Direct Service Delivery Staff

H. Case Review Structure—Staff Participant

I. Caregiver of Child Served by the System/Program

K. Case Review Family Participant

L. Direct Service Staff from Other Public Child-Serving Agencies

M. Care Record/Chart Review

N. Other Staff

O. Debriefing Document

P. Youth Served by the System of Care

Q. Youth Coordinator

3. Interagency Collaboration Scale

B. Cross-Sectional Descriptive Study

1. Enrollment and Demographic Information Form (EDIF)

2. Child Information Update Form (CIUF)

C. Longitudinal Child and Family Outcome Study

1. Living Situations Questionnaire (LSQ): Caregiver

2. Child Behavior Checklist (CBCL): Caregiver

a. Child Behavior Checklist (CBCL), 6-18: Caregiver

b. Child Behavior Checklist (CBCL), 1 ½ -5: Caregiver

3. Caregiver Strain Questionnaire (CGSQ): Caregiver

4. Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS-2C): Caregiver

5. Education Questionnaire—Revised (EQ-R): Caregiver

6. Family Life Questionnaire (FLQ): Caregiver

7. Delinquency Survey—Revised (DS-R): Youth

8. Gain-Quick Substance Problems Scale (Gain Quick-R): Youth

9. Substance Use Survey—Revised (SUS-R): Youth

10. Revised Children’s Manifest Anxiety Scales (RCMAS): Youth

11. Reynolds Adolescent Depression Scale—Second Edition (RADS-2): Youth

12. Youth Information Questionnaire (YIQ): Youth

a. Youth Information Questionnaire—Intake (YIQ-I): Youth

b. Youth Information Questionnaire—Follow-Up (YIQ-F): Youth

13. Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS-2Y): Youth

14. Columbia Impairment Scale (CIS): Caregiver

15. Vineland Screener

a. Vineland Screener, 0-Under 3 (VS1): Caregiver

b. Vineland Screener, 3-5 (VS2): Caregiver

c. Vineland Screener, 6-12 (VS3): Caregiver

16. Caregiver Information Questionnaire (CIQ)

a. Caregiver Information Questionnaire—Intake: Caregiver (CIQ-IC)

b. Caregiver Information Questionnaire—Follow-up: Caregiver (CIQ-FC)

c. Caregiver Information Questionnaire—Intake: Staff as Caregiver (CIQ-IS)

d. Caregiver Information Questionnaire—Follow-up: Staff as Caregiver (CIQ-FS)

D. Service Experience Study

1. Multi-Sector Service Contacts Questionnaire—Revised (MSSC-R): Caregiver

a. Multi-Sector Service Contacts Questionnaire—Revised: Caregiver (MSSC-RC)

b. Multi-Sector Service Contacts Questionnaire—Revised: Staff as Caregiver (MSSC-RS)

2. Youth Services Survey for Families (YSS-F): Caregiver

3. Youth Services Survey (YSS): Youth

4. Cultural Competence and Service Provision Questionnaire (CCSP): Caregiver

E. Sustainability Study

1. Sustainability Study Survey

F. Services and Costs Study

1. Flex Funds Data Dictionary

2. Services and Costs Data Dictionary

G. Continuous Quality Improvement (CQI) Initiative Evaluation

1. CQI Initiative Survey

2. CQI Initiative Interview Protocols

H. Evidence-Based Practices (EBP) Study

1. IFS Discussion Guides

a. IFS Discussion Guide—Service-Level Informant

b. IFS Discussion Guide—System-Level Informant

c. IFS Discussion Guide—Consumer-Level Informant

I. Cultural and Linguistic Competence (CLC) Study

1. CCIOSAS Beneficiaries of Self-Assessment Findings

a. CCIOSAS Beneficiaries of Self-Assessment Process and Findings – Focus Group Guide (Staff and Partners)

b. CCIOSAS Beneficiaries of Self-Assessment Process and Findings – Focus Group Guide (Caregivers)

c. CCIOSAS Beneficiaries of Self-Assessment Process and Findings – Focus Group Guide (Youth)

2. CCIOSAS – Participants in Self-Assessment

a. CCIOSAS Participants in Self-Assessment – Focus Group Guide (Staff and Partners)

b. CCIOSAS Participants in Self-Assessment – Focus Group Guide (Caregivers)

c. CCIOSAS Participants in Self-Assessment – Focus Group Guide (Youth)

3. CCIOSAS – Users of Self-Assessment Findings

a. CCIOSAS Users of Self-Assessment Findings – Focus Group Guide (Staff and Partners)

b. CCIOSAS Users of Self-Assessment Findings – Focus Group Guide (Caregivers)

c. CCIOSAS Users of Self-Assessment Findings – Focus Group Guide (Youth)

4. CCIOSAS Telephone Interview Guide – Staff and Partners

5. CCEBPS

a. CCEBPS Managers of Evidence-Based Practice/Practice-Based Evidence/Community-Defined Evidence Interventions – Focus Group Guide

b. CCEBPS Providers of Evidence-Based Practice/Practice-Based Evidence/Community-Defined Evidence Interventions – Focus Group Guide

c. CCEBPS Caregivers – Focus Group Guide

d. CCEBPS Youth – Focus Group Guide

6. CCEBPS Telephone Interview Guide


LIST OF ATTACHMENTS




Attachment 1. Guidance for Applicants No. SM-05-010


Attachment 2. Consultation

A. Federal/National Partnership for Children’s Mental Health Participants

B. Methodological Consultants and Services Evaluation Committee to the National Evaluation

C. Expert Reviewers of Instrumentation


Attachment 3. Consents

A. Guidelines for Obtaining Informed Consent

B. Model Script for Consent to Contact

C. Model Consent Forms

1. Sample Script to Introduce the Longitudinal Child and Family Outcome Study

2. Consent to Contact

3. Informed Consent—Caregiver Version

4. Informed Assent—Child Version

5. Informed Consent—Young Adult Version

D. National Evaluation Consent Forms

1. Informed Consent—Staff (System of Care Assessment)

2. Informed Consent—Caregiver (System of Care Assessment)

3. Informed Consent—Youth (System of Care Assessment)

4. Informed Assent—Youth (System of Care Assessment)

5. Informed Consent—Parent/Guardian Approval for Youth Participant (System of Care Assessment)

6. Informed Consent—Record Review (System of Care Assessment)

7. Informed Consent (Sustainability Study)

8. Informed Consent (CQI Initiative Evaluation Survey)

9. Informed Consent (CQI Initiative Evaluation Interview)

10. Informed Consent—System/Provider-Level (EBP—Implementation Factors Substudy)

11. Informed Consent—Caregiver/Youth (EBP—Implementation Factors Substudy)

12. Informed Consent—Youth (CCIOSAS Focus Group)

13. Informed Assent—Youth (CCIOSAS Focus Group)

14. Informed Consent—Parent/Guardian Approval for Youth Participant (CCIOSAS Focus Group)

15. Informed Consent—Family Respondent (CCIOSAS Focus Group)

16. Informed Consent—System/Service-Level Respondent (CCIOSAS Focus Group)

17. Informed Consent—System/Service-Level Respondent (CCIOSAS Telephone Interview)

18. Informed Consent—Youth (CCEBPS Focus Group)

19. Informed Assent—Youth (CCEBPS Focus Group)

20. Informed Consent—Parent/Guardian Approval for Youth Participant (CCEBPS Focus Group)

21. Informed Consent—Family Respondent (CCEBPS Focus Group)

22. Informed Consent—System/Service-Level Respondent (CCEBPS Focus Group)

23. Informed Consent—System/Service-Level Respondent (CCEBPS Telephone Interview)


Attachment 4. Data Elements and Supporting Materials

A. System of Care Assessment

1. Overview of System of Care Assessment Framework

a. Infrastructure Domain

b. Service Delivery Domain

2. Letter Templates

a. Introduction Letters

b. Confirmation Letter

c. Draft Report Letter

d. Final Report Letter

e. Thank You Letter

3. Informant Table

4. Pre-visit Documentation

a. Instructions for Completing Site Visit Tables and Lists

b. Site Informant List

c. Sample Agenda

d. Checklist of Planning Steps

B. Sustainability Study

1. Sustainability Study Respondent Selection Criteria

2. Sustainability Study E-mail Scripts

3. Sustainability Study Cover Letters

4. Sustainability Study Survey Reminder Letters

5. Sustainability Study Survey Web Screens

C. Continuous Quality Improvement (CQI) Initiative Evaluation

1. CQI Initiative Letter Templates

a. Invitation Letters

b. Reminder Letters

c. Thank You Letters

D. Evidence-Based Practices (EBP) Study

1. Implementation Factors Substudy (IFS) Respondent Selection Criteria

2. IFS Invitation Letters

E. Cultural and Linguistic Competence (CLC) Study

1. CLC Study Invitation Letters

a. Culturally Competent Implementation and Outcomes Self-Assessment Study (CCIOSAS) Invitation Letter

b. Culturally Competent Evidence-Based Practices Study (CCEBPS) Invitation Letter


Attachment 5. Sample Table Shells for Reporting Findings


