National Evaluation of the Comprehensive Mental Health Services for Children and Their Families Program: Phase VI

OMB: 0930-0307


Phase VI of the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program


Supporting Statement


A. JUSTIFICATION


1. CIRCUMSTANCES OF INFORMATION COLLECTION



The Center for Mental Health Services (CMHS) within the Substance Abuse and Mental Health Services Administration (SAMHSA) is requesting OMB approval for data collection associated with Phase VI of the national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program. The current request builds on experience garnered during Phases I, II, III, IV, and V of the evaluation and enhances the design, data collection procedures, and instruments. This data collection includes 66 instruments.


Serious emotional disturbance affects more than 4.5 million children and their families in the United States. There is consensus that an integrated, coordinated, and comprehensive system of care is the best approach for meeting the needs of this population. The Comprehensive Community Mental Health Services for Children and Their Families Program, which is administered by CMHS, provides funds to support a broad array of community-based and family-driven services delivered through the system of care model. Under this program, CMHS has funded 5- and 6-year grants and cooperative agreements to States and locales to expand the array and capacity of services for children with serious emotional disturbance. To date, this CMHS program has funded 164 such communities through these grants and cooperative agreements, including 38 sites funded in Phase VI (18 in FY 2008 and 20 in FY 2009).


The data collection effort proposed here relates closely to the completed evaluations of Phase I grantees (clearance number: 0930–0171), Phase II grantees (clearance number: 0930–0192), and Phase III grantees (clearance number: 0930–0209), and to the ongoing evaluations of Phase IV grantees (clearance number: 0930–0257) and Phase V grantees (clearance number: 0930–0280). Phase IV covers grantees funded in FY 2002, FY 2003, and FY 2004; Phase V covers grantees funded in FY 2005 and FY 2006. Phase VI of the national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program, for which approval is now being sought, expands data collection to the 38 communities awarded cooperative agreements in FY 2008 and FY 2009. Phase VI of the evaluation will continue for the duration of the 6-year award period, ending in September 2015.


The Phase VI evaluation has eight study components that will be conducted with all grantees. These study components collect information on a major nationwide initiative serving thousands of children and their families. These data are used for the national evaluation as well as for local evaluations by the grantees. The Phase VI studies include:


  • the System of Care Assessment, which will collect data through site visits conducted every 12–18 months to document the development of systems of care;

  • the Cross-Sectional Descriptive Study, which will collect descriptive data on all children and families who enter the CMHS-funded systems of care throughout the funding period;

  • the Child and Family Outcome Study, which will collect longitudinal data on child clinical and functional status and on family outcomes from a sample of children and families;

  • the Service Experience Study, which will collect data on child and family experience of and satisfaction with services in the overall system;

  • the Sustainability Study, which will gather data on system of care characteristics and factors related to sustainability of infrastructure during the life of the award and after the Federal funding cycle is complete;

  • the Services and Costs Study, which will assess the costs and cost-effectiveness of system of care services;

  • the Continuous Quality Improvement (CQI) Initiative Evaluation, which will document the development of the CQI process within communities and monitor changes in the process over time; and

  • the Alumni Network Study, which will examine the extent and degree to which currently and formerly funded system of care communities collaborate among themselves and with program partners as a result of the Alumni Network Web site; knowledge of, use of, and satisfaction with the Alumni Network Web site will also be assessed.


One study component will be conducted with a subsample of the grantees. The Sector-Specific Assessment and Quasi-Experimental Comparison Study, called the Sector and Comparison Study from here on, will compare the outcomes of children and families who are involved in a specific child-serving sector (i.e., child welfare, juvenile justice, special education) and receive services from agencies in funded systems of care with the outcomes of a similar group of children and families receiving services from agencies outside of funded systems of care.


Phase VI, like Phases I through V, has been structured to capture the linkages between an enhanced system of care and the outcomes and experiences of children and families over time.


a. Background



The understanding of child and adolescent mental health disorders has improved significantly during the last two decades. As a result, the field is in a much better position today to estimate the extent to which mental health disorders occur in the population of children and adolescents at large; however, it is still likely that many children in need go undetected. With the estimate that at least 20% of children and youth under age 19 may require mental health services (U.S. Public Health Service Office of the Surgeon General [USPHS], 2001), one can also estimate that at least 16 million children and youth are in need of some type of mental health service each year. Ten years ago, it was estimated that 4.5 to 6.3 million children had problems severe enough to be classified as serious emotional disturbance (Friedman, Katz-Leavy, Manderscheid, & Sondheimer, 1999), and that number is likely to have grown in the past decade. As noted in Promotion and Prevention in Mental Health (Substance Abuse and Mental Health Services Administration [SAMHSA], 2007), half of all diagnosed mental illnesses begin by age 14, and three-fourths begin by age 24. Clearly, a substantial subset of our nation’s children and youth, and their families, grapple with significant mental health problems. Given these conditions, the ability of child-serving providers to identify children in need of services in the settings where children and youth are found, and to know how and where to direct their families for services, is essential. Increasingly, the public health approaches of health promotion and prevention are being identified as needed for mental health (Institute of Medicine [IOM], 2009). The role that education, child welfare, juvenile justice, primary care, substance abuse, daycare, and other settings can play in early identification is facilitated by collaboration across systems and by the awareness that providers in these settings have of the mental health needs of the children and youth they serve, as well as of the services available to them.
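The arithmetic behind the 16 million figure is left implicit; as a worked version, assuming the roughly 80 million U.S. children and youth under age 19 that the estimate implies:

\[ 0.20 \times 80{,}000{,}000 \approx 16{,}000{,}000 \text{ children and youth in need of services each year} \]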


Children and adolescents with serious emotional disturbance face challenges in many aspects of their daily lives. Generally, they present with a variety of diagnoses, they experience high rates of risk factors for mental illness, and they exhibit severe clinical symptoms and functional impairment (Manteuffel, Stephens, Brashears, Krivelyova, & Fisher, 2008). They are at greater risk for substance abuse disorders, and youth with less severe emotional disturbance are vulnerable to increased emotional problems as a result of substance use (Center for Mental Health Services [CMHS], 2001, 2003, 2004; Holden, 2003; Holden et al., 2003; Liao, Manteuffel, Paulic, & Sondheimer, 2001; SAMHSA, 2002). Youth with serious emotional disturbance have greater risk for negative encounters with the juvenile justice system and have a high rate of criminal involvement when compared to all students with disabilities (CMHS, 2001, 2003, 2004; Davis & Vander Stoep, 1997). Youth within the juvenile justice system display an exceptionally high rate of mental health and substance abuse disorders (Feldmann, 2008; Heffron, Pumariega, Fallon, & Carter, 2003; Shelton, 2005). Students with emotional disturbance fail more courses, earn lower grade point averages, miss more days of school, are retained in grade more often than students with other disabilities, and have high dropout rates (Epstein, Nelson, Trout, & Mooney, 2005; U.S. Department of Education [DOE], 2001). Longitudinal research following samples into adulthood further supports assertions of high rates of poor long-term outcomes for these youth (Epstein, Kutash, & Duchnowski, 2005; Friedman, Kutash, & Duchnowski, 1996; Knapp, McCrone, Fombonne, Beecham, & Wostear, 2002; Pumariega & Winters, 2003), who may have poor employment opportunities and who may experience periods of poverty in adulthood (National Advisory Mental Health Council Workgroup on Child and Adolescent Mental Health Intervention and Deployment, 2001). There is also the increased risk that youth with mental illness will not reach adulthood, as these youth are more likely to commit suicide than youth without mental illness. In 2004, suicide was the third leading cause of death among youth ages 10–24; reversing a declining trend over the previous decade, rates increased 8% in this age group from 2003 to 2004, the largest single-year increase since 1990 (Centers for Disease Control and Prevention [CDC], 2007). Many of these suicide victims have undiagnosed or untreated mental illness (Institute of Medicine [IOM], 2002). Furthermore, within this population, economic and demographic factors disproportionately affect identification, placement, and completion of services, with many who initiate services terminating prematurely (Burns & Hoagwood, 2002; Coutinho & Denny, 1996).


Advances in the knowledge base over the last decade have served to illuminate continuing challenges in delivering services and meeting needs for this population, and have thrust the issue of children’s mental health into the public spotlight. Despite these advances, service capacity has not kept pace with need (Friedman, 2002; Stroul, Pires, & Armstrong, 2001); it has been estimated previously that only 1 in 5 children with serious emotional disturbance receive the specialty services they need (Burns et al., 1995; DHHS, 1999; Shaffer et al., 1996), and youth with co-occurring mental health and substance abuse disorders rarely receive appropriate and timely services (Federation of Families for Children’s Mental Health and Keys for Networking, Inc., 2001). More recent estimates suggest that, among youth 6–17 years old with mental health problems so severe that clinical mental health evaluation was indicated, 4 out of 5 did not receive a mental health evaluation or treatment in the past year (Kataoka, Zhang, & Wells, 2002); and among children with special health care needs, rates of unmet need are higher among those with a chronic emotional, behavioral, or developmental problem (Inkelas, Raghavan, Larson, Kuo, & Ortega, 2007). Unfortunately, the prevalence and accompanying impairment associated with serious emotional disturbance are only likely to grow in the future.


Despite increased efforts to enhance access to services and improve service systems, children and youth with serious emotional disturbance are underidentified; most children in need do not receive mental health services, and Latinos and the uninsured have especially high rates of unmet need relative to other children (DHHS, 1999; Kataoka, Zhang, & Wells, 2002). According to the President’s New Freedom Commission on Mental Health (PNFC, 2003), impoverished families, families from minority racial or ethnic backgrounds, and families living in rural areas confront barriers to accessing services, receiving quality care, and achieving positive outcomes. This underscores the need for the development of effective community-based care that is sensitive to and structured for the diverse cultures in individual communities and for impoverished families (Hernandez & Isaacs, 1998; Isaacs-Shockley, Cross, Bazron, Dennis, & Benjamin, 1996; PNFC, 2003), and that is available in even the most geographically remote communities in the country (PNFC, 2003). The Federal Action Agenda identifies expanding access to quality mental health care as one of the paths to system transformation (SAMHSA, 2005). For families to access services to meet the needs of their children and youth, and for youth to recognize their own need for services, available services must be perceived as accessible and appropriate to the cultures, traditions, problems, and needs of those seeking services. Access to services to address the mental health needs of children can be especially difficult for families of diverse cultural backgrounds who do not perceive that services or service providers will understand their values and needs (HHS, 2001a). Serving the needs of persons of diverse backgrounds requires culturally and linguistically competent providers, culturally competent treatments and practices, and cultural adaptations to provide efficacious and effective services (Whaley & Davis, 2007).


There has been much debate about the best method to serve these children and their families. In 1969, the Joint Commission on the Mental Health of Children published a landmark study showing these children were typically unserved or served inappropriately in excessively restrictive settings (National Institute of Mental Health [NIMH], 1969). Later, the Commission’s findings were substantiated by numerous other studies, task forces, commissions, and reports. These studies concurred that community-based, family-driven, coordinated systems of care providing a range of services are necessary to effectively serve these children and their families.


In 1984, in response to these findings, the NIMH initiated the Child and Adolescent Service System Program (CASSP). Later administered by CMHS within SAMHSA, CASSP provided funds to promote the development of comprehensive and integrated service delivery systems for children with serious emotional disturbance through a system of care approach. In spite of the progress made through CASSP efforts to develop an infrastructure for systems of care, a deficit of appropriate, less restrictive treatment services remained. Studies indicated rising costs of residential services and increasing rates of child placement in residential facilities and in out-of-home care. These findings were reasons for continued concern that children were being served in overly restrictive settings.


The system of care program theory model, first articulated by Stroul and Friedman in 1986, proposed a transformation of the mental health service delivery system to a comprehensive spectrum of mental health and other necessary services organized into a coordinated network to meet the multiple and changing needs of children and adolescents with serious emotional disturbance (Stroul & Friedman, 1994). In this model, agencies in various child-serving sectors, such as education, juvenile justice, mental health, and child welfare, work together to provide the wide array of services needed by children with serious emotional disturbance and their families. Built upon the CASSP philosophy that calls for services to be child-centered, family-driven, community-based, and culturally competent, the model emphasizes the need to: (1) broaden the range of nonresidential community-based services, (2) strengthen case planning across child-serving sectors, and (3) increase case management capacity to ensure that services work together across sectors and providers.


The 1999 Mental Health: A Report of the Surgeon General documented the progress that had been made to date and the resources devoted to transforming the nature of service delivery for children with serious emotional disturbances and their families (USPHS, 1999a). Numerous efforts since the 1999 publication of the Surgeon General’s report on mental health have brought increased attention to children’s mental health and have resulted in publications calling for and outlining action plans to address youth suicide (USPHS, 1999b); youth violence (HHS, 2001b); research on the use of medication for emotional and behavioral problems of young children; the mental health needs of diverse cultures, races, and ethnicities (HHS, 2001a); and the need for a national action agenda. In addition, the World Health Organization published the report Mental Health: New Understanding, New Hope (2001), and the National Institute of Mental Health released the report Blueprint for Change: Research on Child and Adolescent Mental Health (National Advisory Mental Health Council’s Workgroup on Child and Adolescent Mental Health Intervention Development and Deployment, 2001). The President’s New Freedom Commission on Mental Health was established in 2002 to evaluate the mental health service delivery system in the United States and to advise the President on approaches to improving the system so that adults and children with serious mental health problems can participate fully in their communities. In 2003, the PNFC published Achieving the Promise: Transforming Mental Health Care in America, which advocated for mental health care to be provided in communities with treatments integrated across agencies and designed to meet the needs of individuals and their families. The report calls for research focused on outcomes: determining the treatments that promote quality care and recovery, and finding the most effective way to disseminate information about these practices. This objective includes investigating emerging best practices, such as wraparound services and systems of care for children with serious emotional disturbances and their families. Research should occur at all levels, with findings made available at the community level. A better understanding of this question of effectiveness is especially important in an era of managed care, accountability, and constrained Federal and State spending on mental health services. The 2005 IOM report Improving the Quality of Health Care for Mental Health and Substance-Use Conditions states that, to address mental health and substance-use conditions, communities need an infrastructure to produce and disseminate scientific evidence of effective treatments, as well as funds to conduct studies that are directly related to clinical practice and policy. The more recent IOM report, Preventing Mental, Emotional, and Behavioral Disorders Among Young People: Progress and Possibilities (IOM, 2009), focuses on the importance of preventing mental, emotional, and behavioral (MEB) disorders through the application of universal, selective, and targeted interventions with individuals and groups of children and youth who are at risk of developing serious MEB disorders, and it identifies a number of programs with a sufficient evidence base to warrant consideration of broader implementation.
In addition, with the recent change in administrations comes a renewed emphasis on transparency and transformation at the Federal level, along with calls to further develop the comparability, accessibility, and independence of program performance measurement to ensure government accountability (Danker, Dohrmann, Killefer, & Mendonca, 2006). Thus, the call for mental health service delivery system reform continues to demand structural and functional changes in community mental health and in Federal programs that can demonstrate effectiveness.


The system of care approach is consistent with the vision for transformation in children’s mental health articulated by the PNFC (Huang et al., 2005), which calls for comprehensive home- and community-based services and supports, family partnerships and support, culturally competent and individualized care, evidence-based practices, coordination of services, responsibility and funding, prevention, early identification and early childhood intervention, mental health services in schools, and accountability. The system of care approach has evolved into a major organizing force shaping the development of community-based children’s mental health services in the United States.


b. The Comprehensive Community Mental Health Services for Children and Their Families Program (CMHI)



While the system of care model provided a conceptual framework to meet the needs of children with serious emotional disturbance, funding to provide services at the local level was either sporadic or missing. In 1992, the Federal Government addressed this gap with the passage of the Children’s and Communities Mental Health Services Improvement Act (CMHI), which is part of the Alcohol, Drug Abuse and Mental Health Administration Reorganization Act (Public Law 102–321, Section 520). The Act was amended in 2000 to change the term of funding from 5 to 6 fiscal years (Public Law 106–310, Section 3105(c)). CMHI provides support through grants and cooperative agreements to States, political subdivisions within States, the District of Columbia, and territories to develop integrated home- and community-based systems and supports for children and youth with serious emotional disturbances and their families. This funding encourages communities to develop and expand systems of care. The CMHI is the largest Federal commitment to children’s mental health to date; through FY 2008 it had provided more than $1.25 billion to support system development in 164 communities in 50 States, 2 territories, and the District of Columbia, including the 38 grants awarded in FY 2008 and FY 2009. The program is fully described in the grant Guidance for Applicants. (See Attachment 1, Request for Applications No. SM–08–004.)


The goals of the CMHS program are to:


  • Expand community capacity to serve children and adolescents with serious emotional disturbances and their families;

  • Provide a broad array of accessible, clinically effective and fiscally-accountable services, treatments and supports;

  • Serve as a catalyst for broad-based, sustainable systemic change inclusive of policy reform and infrastructure development;

  • Create a case management team with an individualized service plan for each child;

  • Deliver culturally and linguistically competent services, with special emphasis on racial, ethnic, and linguistically diverse and other underrepresented, underserved, or emergent cultural groups; and

  • Implement full participation of families and youth in service planning; in the development, evaluation, and sustainability of local services and supports; and in overall system transformation activities.


The goals of the CMHS program are harmonious with those outlined in Achieving the Promise: Transforming Mental Health Care in America (2003). Systems of care work to promote recovery and reduce stigma through the provision of youth-guided and family-driven care that is culturally and linguistically responsive. Services are informed by research, and evidence-based practices are utilized to treat children and youth, including those with co-occurring disorders. Finally, Federal, State, and local partnerships are encouraged across child- and youth-serving systems.


c. The Need for Evaluation


Section 564(c) of the Public Health Service Act, entitled Additional Provisions within the General Provisions, mandates annual evaluation activities. A basic requirement is documentation of the characteristics of the children and families served by the system of care initiative, the type and amount of services they receive, and the cost to serve them. Equally important is the need to assess whether the program was implemented, and services experienced, as intended. It is also critical to assess whether the children served by the program experience improvement in clinical and functional outcomes, whether family outcomes improve, and whether improvements endure over time. Finally, policymakers and service providers need to know whether those outcomes can be reasonably attributed to the system of care initiative.


A government contractor (referred to as the National Evaluator throughout this document) coordinates data collection for the national evaluation and provides training and technical assistance to facilitate the collection of data by local-level evaluators. In turn, each grantee is required by the cooperative agreement to hire a minimum of two evaluation staff (or their full-time equivalents) to ensure that data collection is systematic and can be sustained through the funding period. In this partnership between the National Evaluator and local evaluators, the National Evaluator provides training and technical assistance regarding data collection and research design. In addition, the National Evaluator receives data from all grantees, monitors data quality, and provides feedback to grantees. The grantees help shape data collection procedures and provide feedback to the National Evaluator regarding successful approaches. This evaluation will first and foremost prepare data analyses for the national assessment of the program, but in doing so will make grantee-specific data available to the grantees to help meet their local evaluation needs.


d. Clearance Request



This submission requests OMB clearance for the first 3 years of the 6-year data collection effort for Phase VI of the national evaluation of the Comprehensive Community Mental Health Services for Children and Families Program. The request estimates burden for data collection in 38 sites (18 sites funded in FY 2008 and 20 sites funded in FY 2009).


The national evaluation is driven by the system of care program theory model. This program theory asserts that to serve children with serious emotional disturbance, service delivery systems need to offer a wide array of accessible, community-based service options that center on children’s individual needs, include the family in treatment planning and delivery, and are provided in a culturally and linguistically competent manner. An emphasis is placed on serving children in the least restrictive setting that is clinically appropriate. In addition, because many children with serious emotional disturbance use a variety of services and have contact with several child-serving agencies, service coordination and interagency collaboration are critical. The program theory holds that if services are provided in this manner, outcomes for children and families will be better than can be achieved in traditional service delivery systems.


To examine the system of care theory, the national evaluation is designed to answer the following overarching questions:


  • Who are the children and families served by the program and by the funded communities? How do the characteristics of children and families who participate in systems of care differ? Does the served population change over time as systems of care mature?

  • How do systems of care develop according to system of care principles (e.g., family and youth involvement, cultural competence, interagency collaboration) over time? What are differences in the development of systems of care? In what ways does funding accelerate system development?

  • What is the degree to which each of the grantee communities has implemented, developed, and sustained their service systems according to the system of care conceptual framework, based on the results of a System of Care Assessment Tool?

  • To what extent do children’s clinical and functional outcomes improve over time? How are family outcomes affected? What is the nature of change in child, family, and system outcomes? How are changes in child, family, and system outcomes associated with efforts to implement and develop systems of care?

  • What are the service utilization patterns (specific services, treatments, and supports) for children and families in systems of care, and what are the associated costs? In what ways do the services and supports that children and families receive differ? Are systems of care cost-effective, and how does their cost-effectiveness change over time?

  • To what extent are children’s and families’ experiences consistent with the system of care philosophy? How satisfied are children and families with the services they receive? How well do grantee communities provide a broad array of services in a cultural context that is most appropriate for the child and the family and that ensures a full partnership with families? How effective are specific services, treatments, or supports in producing positive outcomes for children and families?

  • Are there subgroups of children and families for whom a system of care is more effective?

  • In what ways are the developing systems of care fiscally sustainable beyond the 6-year funding period? What factors facilitate or impede sustainability? What strategies are systems of care using to foster sustainability during their funding period?

  • How are communities pursuing continuous quality improvement? How well does the Continuous Quality Improvement (CQI) Initiative identify and address communities’ technical assistance (TA) needs? How effective is the CQI Initiative in providing appropriate, data-driven TA to communities?

  • To what extent do grantee communities receive technical assistance to implement the evaluation appropriately? How frequently is feedback provided to local grantee communities on the status of data collection and on findings of the evaluation?

  • To what extent do currently and formerly funded system of care communities collaborate on issues of governance, individualized care, funding, family-driven care, youth-guided care, culturally competent care, sustainability, and evaluation? What is the nature and level of collaboration between system of care communities and program partners on program and evaluation technical assistance? To what extent has the Alumni Network Web site facilitated these collaborations? What activities and features of the Alumni Network Web site facilitate and/or hinder collaboration among system of care communities? How satisfied are Alumni Network Web site users with the Web site?

  • To what degree are systems of care effective in producing positive outcomes for children and families?

  • In the Child Welfare sector:

    • What are the numbers and characteristics of children in systems of care who are also involved in child welfare, including the type of child welfare involvement, child and family demographics, placement history, and family risk factors?

    • What factors influence referrals of children involved in child welfare to systems of care in their communities?

    • Are systems of care providing mental health assessments for children in child welfare even if they are not ultimately determined to be in need of, or eligible for, system of care services?

    • What services are provided by systems of care to children involved in child welfare?

    • What is the extent of involvement in system of care service planning and implementation by child welfare staff, foster parents, biological parents, and children?

    • What are the child welfare and mental health outcomes of children involved in child welfare and systems of care, specifically:

      • To what extent are children who are receiving in-home services maintained in their own homes?

      • To what extent are children in out-of-home placement experiencing stable placements?

      • To what extent do their trauma symptoms change over time?

      • To what extent do their educational outcomes improve over time?

      • To what extent does their mental and behavioral health improve over time?

  • In the Juvenile Justice sector:

    • To what extent are juvenile justice-involved youth in systems of care prevented from further involvement in the juvenile justice system?

    • To what extent are juvenile justice-involved youth in systems of care prevented from escalation of criminal activity?

    • To what extent do juvenile justice-involved youth in systems of care experience less escalation in placement severity compared to non-system of care juvenile justice-involved youth?

    • Do juvenile justice-involved youth in systems of care show greater improvement in clinical outcomes compared to non-system of care juvenile justice-involved youth?

  • In the Education sector:

    • What are the characteristics of children served in school-focused systems of care?

    • Do educational outcomes of school-aged children in systems of care improve over time?

    • Do children in systems of care receive appropriate educational supports?

    • Do educational outcomes of school-aged children in systems of care improve more than those of non-system of care children?

    • Are children in systems of care more likely to receive appropriate educational supports compared to non-system of care children?

    • What are the service experiences of youth served in school-focused systems of care?

    • To what extent are teacher and caregiver reports of educational outcomes congruent?

    • What is the extent of teachers’ contact with caregivers of children, mental health providers, or care coordinators in systems of care?

    • What supports and training are provided to teachers of children in systems of care?

    • How do teacher involvement, supports, and training differ from those of teachers in non-system of care communities (or in schools that are not part of the system of care)?

    • What individual level services are available in schools in system of care communities?

    • What school level interventions are available in schools in system of care communities?

    • What are the types of mental health service delivery systems in schools in system of care communities?


These evaluation questions evolved over the last 15 years through development of the CMHI and feedback from system of care personnel and other partners and extend those mandated by the CMHI authorizing legislation. The legislation requires funded communities to participate in a national evaluation that assesses the number of children served, child and family characteristics, child and family outcomes, service utilization patterns, and system characteristics.


The evaluation design for the new communities includes six core study components and four special studies that employ both qualitative and quantitative methods to comprehensively examine the impact of CMHI funding. This evaluation provides the opportunity to advance the assessment of outcomes for children with serious emotional disturbance who have significant involvement with, or are identified for mental health services by, the juvenile justice, child welfare, and education sectors within systems of care, and to examine in greater detail specific efforts and goals of the CMHI. Specifically, the CQI study will document the development of the CQI process within communities and will monitor changes in the process over time. The Alumni Networking Study will measure the extent and nature of collaboration among system of care communities by examining how collaboration is used as a conduit for sharing and transferring knowledge, resources, and technology to achieve system of care goals. Exhibit 1 on the following page presents a flow chart of the study components for the Phase VI evaluation. Note that the years listed in Exhibit 1 and throughout this supporting statement refer to the evaluation year, not the funding year. Because the project consists of one planning year followed by 5 option years, evaluation year 1 is actually contract (funding) year 2.


System of Care Assessment. This component will examine whether programs have been implemented in accordance with the system of care program model and will document how systems develop over time to meet the needs of the children and families they serve. A particular interest is whether services are delivered in an individualized, family-driven and youth-guided, coordinated manner, and whether the system involves multiple child-serving agencies. For Phase VI, site visits for each system of care community will be conducted at 18–24-month intervals across their 6 years of funding, beginning in the second year of funding.


Information will be collected through a combination of document reviews, review of randomly selected case records, semi-structured qualitative interviews, observations made on site, and follow-up telephone interviews to clarify information. Categories of interview respondents will include project directors, cultural and linguistic competence coordinators, social marketers, program evaluators, staff responsible for care review and quality monitoring, core child-serving agency representatives, family organization representatives, care coordinators, direct service providers, youth served by the program, and caregivers of children and youth served by the system of care.


Exhibit 1: Summary of Major Components in Phase VI

Note: Years refer to evaluation year



Cross-Sectional Descriptive Study. This study will describe child and family characteristics of all children entering CMHS-funded systems of care. Data will be obtained primarily through in-person interviews with caregivers conducted as part of the usual intake process and through case record reviews; data will be directly entered into a Web-based database by intake personnel to facilitate capture of basic descriptive characteristics of children served. Data will be collected upon entry for all children and families who enter the system of care throughout the program’s funding period. For the children and families who participate in the Child and Family Outcome Study (see below), additional descriptive information is collected as part of the baseline interview, and the descriptive data elements that may have changed over time (e.g., diagnosis, insurance status) will be collected again at follow-up data collection points. Because sites routinely maintain basic descriptive data for administrative purposes, only the additional descriptive data collected on families at baseline and follow-up in the Child and Family Outcome Study sample constitute respondent burden.


Child and Family Outcome Study. This study, conducted among a sample of children in each community, will examine how the system affects child clinical and functional status and family functioning. Outcome data on child clinical and functional status will be used to assess change over time in the following areas: symptomatology, diagnosis, social functioning, substance use, school attendance and performance, delinquency and juvenile justice involvement, and stability of living arrangements. Family life will be assessed in the areas of family functioning and caregiver strain. These data will be collected at all system of care communities within 30 days of the child’s entry into services and at 6-month intervals for the length of the evaluation.


Service Experience Study. This study, conducted among the sample of children participating in the Child and Family Outcome Study, investigates the extent to which system of care principles are experienced by families, and considers experiences from the perspectives of caregivers and youth. Data will be used to assess intervention fidelity, satisfaction with services, cultural competence, accessibility and coordination of services, perceived helpfulness of services, and impact of services on ability of family members to work outside the home. Data collection occurs at intake and follow-up from those families who have received services in the previous 6 months.


Sector and Comparison Study. This study, conducted among a subset of children enrolled in the Child and Family Outcome Study, will provide comprehensive sector-specific assessments aimed at improving the quality of information for multiple child-serving sectors that are a part of systems of care. A subset of children enrolled in the core study will be randomly sampled into three sectoral groups (education, juvenile justice, child welfare). Sector-specific assessments will be conducted within 30 days of the child’s entry into services and at 6-month intervals for the length of the evaluation. Existing data from child records will also be utilized for this study. Other assessments will involve interviewing service providers and administrators. For the education sector, data will be collected from teachers of children enrolled in the sector study (at baseline and at 6-month intervals for the length of the evaluation) and school administrators (at baseline and at 12-month intervals for the length of the evaluation). For the juvenile justice sector, data will be collected from court representatives who are responsible for oversight of youth in the juvenile justice system who are enrolled in the sector study (at baseline and at 6-month intervals for the length of the evaluation). For the child welfare sector, data will be collected from the system of care coordinator in coordination with the child welfare social worker/case manager of children enrolled in the sector study (at baseline and at 6-month intervals for the length of the evaluation).


As part of the sector study, we will conduct a quasi-experimental comparison study, comparing children enrolled in the sector study with similar children who are involved with agencies in similar child-serving sectors in locations that are not receiving system of care funding. For each sector’s group of children, comparison samples will be drawn through rigorous child-level matching. Children and families selected from comparison agencies will have characteristics that would make them eligible for system of care services if grant funding were available in their location or jurisdiction. Existing data from child records will also be utilized for this study. Other assessments will involve interviewing service providers and administrators. For the education sector, data will be collected from teachers of children enrolled in the comparison study (at baseline and at 6-month intervals for the length of the evaluation) and school administrators (at baseline and at 12-month intervals for the length of the evaluation). For the juvenile justice sector, data will be collected from court representatives who are responsible for oversight of youth in the juvenile justice system who are enrolled in the comparison study (at baseline and at 6-month intervals for the length of the evaluation). For the child welfare sector, data will be collected from the system of care coordinator in coordination with the child welfare social worker/case manager of children enrolled in the comparison study (at baseline and at 6-month intervals for the length of the evaluation).
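The supporting statement does not prescribe a particular matching algorithm, so the following is only an illustrative sketch of one common approach to child-level matching: greedy 1:1 nearest-neighbor matching on an estimated propensity score, in Python. The covariates, sample sizes, and caliper width below are hypothetical assumptions, not features of the national evaluation.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical covariates for each child: age, baseline symptom score,
# and number of prior out-of-home placements. Data here are synthetic.
rng = np.random.default_rng(0)
n_soc, n_pool = 100, 400  # system-of-care children vs. comparison pool
X_soc = rng.normal([12.0, 60.0, 1.5], [3.0, 10.0, 1.0], size=(n_soc, 3))
X_pool = rng.normal([11.0, 55.0, 1.0], [3.0, 12.0, 1.0], size=(n_pool, 3))

# Estimate each child's propensity of system-of-care involvement.
X = np.vstack([X_soc, X_pool])
y = np.concatenate([np.ones(n_soc), np.zeros(n_pool)])
model = LogisticRegression(max_iter=1000).fit(X, y)
p_soc = model.predict_proba(X_soc)[:, 1]
p_pool = model.predict_proba(X_pool)[:, 1]

# Greedy 1:1 nearest-neighbor matching without replacement, restricted
# by a caliper so that poor matches are discarded rather than forced.
CALIPER = 0.05
available = np.ones(n_pool, dtype=bool)
matches = {}
for i in np.argsort(p_soc):
    dist = np.abs(p_pool - p_soc[i])
    dist[~available] = np.inf
    j = int(np.argmin(dist))
    if dist[j] <= CALIPER:
        matches[i] = j
        available[j] = False

print(f"Matched {len(matches)} of {n_soc} system-of-care children within the caliper.")

The evaluation could equally rely on exact matching on demographics, Mahalanobis-distance matching, or another design; the point of the sketch is only that each funded child is paired with the most similar unfunded child before outcomes are compared.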

Sustainability Study. Using a Web survey, this study gathers data on system of care characteristics and factors related to sustainability of infrastructure during the life of the award and after the Federal funding cycle is completed. The survey questions cover the following topic areas: (a) availability of specific services in the system of care, (b) mechanisms used to implement system of care principles, (c) factors affecting sustainability (whether each factor has played a role in the development or maintenance of the system of care and, if so, the extent to which each has affected the system of care), (d) success with objectives for implementing systems of care, (e) strategies for sustaining systems of care, and (f) financial resources contributing to the budget. The Web survey will be conducted with representatives from all sites in years 2 and 5 of the evaluation. The Web survey will also be utilized to conduct a 5-year post-funding assessment of the communities funded in 2002. A shorter version of the survey will be administered annually to the current or former project director of each graduated community in years 1 through 4 post-funding.


The State Strategies for Sustainability Study is an enhanced component of the Sustainability Study that assesses state strategies for taking systems of care to scale, including strategies for spreading systems of care to additional areas of the state to develop a statewide initiative. It is anticipated that interviews with no more than nine current and former project directors will be conducted to obtain more in-depth information. Because the interviews will be conducted with no more than nine participants, additional burden is not sought for this study. It is mentioned here so that the full scope of the Sustainability Study is described.


Services and Costs Study. This study will describe the types of services used by children and families, their utilization patterns, and the associated costs. The relationship between service use and outcomes will also be explored. The cost data collected for the children enrolled in the comparison study will also allow for cost-effectiveness analysis. These data are maintained continually by grantees in their fiscal (e.g., charge, billing) management information systems (MISs) and transmitted to the National Evaluator at regular intervals. Of interest are the types of services, the combination of services, continuity or gaps in care, and the length of treatment.


CQI Initiative Evaluation. This study will document the development of the CQI process within communities and will monitor changes in the process over time. More specifically, the evaluation will assess if and how communities pursue CQI; how well the CQI Initiative identifies and addresses communities’ technical assistance (TA) needs; and how effective the CQI Initiative is in providing appropriate, data-driven TA to communities. Data will be gathered through three complementary activities: a Baseline Survey of key constituents in all FY 2008–2009 funded communities; a subsequent Monitoring Survey administered every 2 years to the same constituents; and biennial Case Studies of four selected communities. For each community, up to eight respondents (i.e., principal investigator, project director, lead evaluator, cultural and linguistic competence coordinator, social marketing-communications manager, lead family contact, youth coordinator, TA coordinator) will be asked to complete the Baseline Survey and Monitoring Survey. A subset of four communities will be selected for participation in the Case Studies, which will consist of focus groups with local system of care personnel and national TA providers for each selected community.


Alumni Networking. This component measures the extent and nature of collaboration among system of care communities by examining how collaboration via the Alumni Network Web site is used as a conduit for sharing and transferring knowledge, resources, and technology to achieve system of care goals. In years 1 and 3 of the evaluation, data will be collected via a Web-based Networking and Collaboration Survey from the one key person in each formerly funded system of care community who is most knowledgeable about the site (e.g., principal investigator, project director, lead evaluator, or lead family representative), and from up to three key people from currently funded sites who are most knowledgeable about the site. The data collected will measure the extent to which communities are interacting with other communities on select key activities, such as governance, individualized care, funding, family-driven care, youth-guided care, culturally competent care, sustainability, evaluation, program technical assistance, and evaluation technical assistance.


End-user satisfaction with the Alumni Network Web site will be assessed through an online satisfaction survey that will examine the perceived utility of the design, format, and content of the Alumni Network Web site. All registered users of the Alumni Network Web site will be invited to participate in the satisfaction survey via a Web-based survey in years 2 and 4 of the evaluation. The Satisfaction Survey will also be administered to a random selection of non-registered users of the Alumni Network Web site via pop-up window technology during years 2 and 4 of the evaluation. Information collected via usability expert testing of the Alumni Network Web site during year 1 will be used to modify and strengthen the utility of the Web site for all end users.



2. PURPOSE AND USE OF THE INFORMATION



This evaluation will serve several purposes. It will:


  • describe who is being served by the CMHS-funded systems of care;

  • show whether there are observable differences in child and family outcomes that can be plausibly linked to a faithful implementation of the system of care approach;

  • describe how children and families experience the service system and how they use services and supports (i.e., utilization patterns);

  • estimate the cost of serving children in systems of care and assess the cost-effectiveness of services;

  • illustrate the development of systems of care as they move toward offering integrated and comprehensive services;

  • assess the development of the continuous quality improvement (CQI) process within communities and the effectiveness of the CQI Initiative;

  • compare outcomes and service experience between a group of children, youth, and families involved in one of three child-serving sectors and receiving services from CMHS-funded system of care communities and a similar group receiving services from non-funded communities;

  • describe how developing systems of care will be fiscally sustained;

  • identify types and strengths of collaboration between system of care communities and interactions with program partners, capture satisfaction with these relationships, and provide guidance on the features and utility of the Alumni Network Web site that will foster collaboration between system of care communities;

  • support technical assistance activities to help CMHS best meet program goals;

  • support CMHS in its efforts to establish standards for measuring its performance and effectiveness as required under the 1993 Government Performance and Results Act (GPRA); and

  • provide data for the National Outcome Measures (NOMs) for mental health programs as currently established by SAMHSA.


The data collected in Phase VI will be useful to CMHS and its partners, other Federal agencies, the grantees, individual children and their families, and the research field. Findings from the Phase I, II, III, IV and V evaluations have been used to describe the children and families served by the funded systems of care, to assess whether the children in the samples have experienced improved outcomes, to measure service experiences and system development, and to request additional funding from local and State agencies to sustain system of care services. In addition to contributing further information on topics covered in prior phases, Phase VI will continue to add to the knowledge base by developing a better understanding of the barriers and facilitators to sustainability and the service experience of children and families involved in multiple child-serving agencies through the development of sector-specific assessments aimed at improving the quality of information from these child-serving sectors. As in previous phases of the evaluation, the design allows for the exploration of the relationships between service use and outcomes and the study of the long-term impact of the program.


Principal changes from Phase V to Phase VI include:


  • Updates to several of the measures in the Child and Family Outcome Study instrument package to address information desired by the program, including more information on family advocacy, youth empowerment and self-efficacy, and transition age youth;

  • The addition of three measures to assess social and emotional development in early childhood, one of which replaces an instrument used in previous phases;

  • The addition of six sector-specific instruments aimed at obtaining more detailed information within three child-serving sectors (i.e., education, child welfare, and juvenile justice);

  • Updates to the sustainability survey and an addition of a brief survey on sustainability to be administered annually to graduated grantees post-funding to further the understanding of the factors affecting sustainability;

  • The addition of a Continuous Quality Improvement (CQI) Initiative evaluation aimed at documenting the development of the CQI process within communities, monitoring changes over time, and examining how well the CQI process identifies technical assistance needs;

  • The addition of a Networking and Collaboration Survey aimed at assessing the nature and extent of the interaction between alumni and currently funded communities, and of an Alumni Network Web Site Satisfaction Survey examining end-user satisfaction with the design, format, and content of the Web site.


CMHS will use the results from Phase VI to develop policies and provide guidance regarding the development of systems of care. Specific findings on the successes and challenges that agencies have experienced in developing collaborative, coordinated, and comprehensive systems will be used to tailor technical assistance to grantees. Information and findings from the evaluation will help CMHS plan and implement other efforts related to systems of care. Findings from the evaluation can also enhance other CMHS programs that support system development (e.g., Projects for Assistance in Transition from Homelessness, Community Mental Health Services Block Grants, Cooperative Agreements for State-Sponsored Youth Suicide Prevention and Early Intervention, Mental Health Transformation State Incentive Grants, and the National Registry of Evidence-Based Programs and Practices program). In addition, the many partners that work in collaboration with CMHS, including the National Federation of Families for Children’s Mental Health and the National Mental Health Association, will be able to use the results in their national efforts to help build systems of care to meet the needs of children and families.


Finally, CMHS will also use the findings from the evaluation to provide objective measures of its progress toward meeting targets of key performance indicators put forward in its annual performance plans as required by law under the GPRA. Globally, these measures for children include increases in the number of children served in the CMHS program (output measure), increased school attendance, decreased juvenile justice contacts, decreased use of inpatient hospitalization, decreased costs of inpatient hospitalization (efficiency measure), and long-term program outcomes demonstrated by the percentage of grantees showing decreases in child symptomatology and by increases in the percentage of programs sustained 5 years post-program funding. Specific measures from the Phase VI instrumentation corresponding to these global measures include the Education Questionnaire, Revision 2 (EQ–R2) and the Delinquency Survey, Revised (DS–R) for assessing school attendance and juvenile justice contacts; the Living Situations Questionnaire (LSQ) for assessing usage of inpatient hospitalization; the Child Behavior Checklist (CBCL) for assessing child symptomatology; and the Sustainability Survey for assessing sustained program characteristics. These instruments are described in detail in Section B.2.


Findings from the evaluation will be useful to policymakers, planners, and analysts in other Federal agencies involved in programs for this target population. The service program is being coordinated with relevant Federal agencies, such as NIMH, the Administration for Children and Families and the Children’s Bureau in DHHS, the Office of Juvenile Justice and Delinquency Prevention in the Department of Justice (DOJ), and the Institute of Education Sciences and the Office of Special Education Programs under the Office of Special Education and Rehabilitative Services in the DOE. CMHS has held several meetings with representatives from these and other Federal agencies since the inception of this program. The involvement of staff from related agencies and programs ensures that the effort is coordinated at the Federal level and that results of the evaluation will be useful to a wider audience. See Attachment 2.A. for a list of participants in the Federal/National Partnership for Children’s Mental Health.


Findings from the evaluation will be used by grantees to improve the implementation of their systems of care and achieve the goals of the CMHI. Demographic and outcome data on a sample of children and families who participate in the system of care will aid grantees in identifying the program elements that help children and families function better, that promote family involvement, and that lead to client satisfaction. Grantees are expected to use the information to better identify their target populations, improve their services, and support their efforts to obtain required matching funds and to sustain their systems of care after the CMHI funding has ended. Indeed, several grantees have used data collected for the Phase I, II, III, IV, and V studies to request additional funding from their State legislatures. The same is expected for Phase VI. Service experience data will provide useful feedback to grantees on whether families experience services as the grantees intended and will identify their programs’ strengths and weaknesses. This information will help grantees plan culturally competent services and supports that families and youth report as useful and that are associated with improved child, youth, and family outcomes. System of Care Assessments will provide useful feedback on how to refine the system by identifying gaps in system development and barriers to collaboration, which will help the grantees allocate personnel and funding more effectively and prioritize activities.

Grantees will also learn what barriers children or youth and their families perceive and will be able to work to eliminate such barriers. Clinicians will be able to use the data collected with standardized objective measures to guide treatment.


The research community, particularly the field of children’s mental health services research, will profit in a number of ways. First, evaluation of the CMHI will add significantly to the developing research base about systems of care. Second, the focus on child, family, and system outcomes will allow researchers to examine and understand the specific ways children improve, how services can be enhanced, and the importance of adherence to service plans. Moreover, the relationship among these variables will be better understood. Finally, the analysis of evaluation data will aid researchers in formulating new questions about systems of care and specific services, and will help both service providers and researchers improve the delivery of children’s mental health services. The information obtained from the Child and Family Outcome Study will be of particular importance in addressing these research goals.


If these data are not collected, policymakers and program planners at the Federal and local levels will not have the information necessary to determine the extent to which children with serious emotional disturbance and their families experience grant-funded services as they were intended. Without this evaluation, they will not know whether these systems have had any positive impact on the lives of the people they serve.



3. USE OF IMPROVED INFORMATION TECHNOLOGY


The majority of the child and family descriptive, outcome, and intervention-level data are collected through interviews with children, youth, and families using standard instruments. The data collection will be conducted by grantee site staff. Every effort has been made to reduce the burden on children, youth, and families participating in the study, including offering to conduct the interviews in their homes or at other locations most convenient for them. Previous experience has shown that sites differ in their access to hardware and software. Requiring special hardware or software for this evaluation would be disruptive and would increase rather than reduce burden, especially since grantees must be capable of administering the instruments in a variety of settings. However, the National Evaluator has provided software for computer-assisted personal interviewing (CAPI) for those grantee communities that have access to the necessary hardware. Across all study components, approximately 48 percent of total responses, based on the most recent assessment of previous use, will be obtained electronically by CAPI or Web survey. Because the collection of System of Care Assessment data is primarily qualitative in nature and does not lend itself to the use of special technology, these data are collected by the National Evaluator during site visits.


Data from the Cross-Sectional Descriptive Study, Child and Family Outcome Study, Service Experience Study, and the Sector and Comparison Studies are managed using an integrated Internet-based data input, management, and dissemination system—the interactive-collaborative network (ICN). The ICN, which was introduced in Phase III and refined in Phases IV and V of the national evaluation, reduces evaluation burden for the sites and allows real-time access to data for site personnel and National Evaluation Team members. The system serves as a mechanism for communicating about evaluation activities and results.


The ICN was designed as a three-part system that allows systematic data input, immediate validation to identify data input flaws, and monitoring of data entry and evaluation in real time. It reduces processing time and provides the capability of creating interactive reports. The ICN is a secure system that protects privacy through different levels of password-protected access to site and national data. The three software subsystems include:


  • Data Input. Data entry software allows rapid data entry off-line, and the Internet is used to transfer data from local sites to the national database. The off-line data entry feature of the ICN allows those sites with available laptop computers the option of CAPI interviewing by entering the participant’s responses directly into the data entry package during the interview. Specific descriptive information on Cross-Sectional Descriptive Study participants is entered directly into the ICN Web site. This software is designed to be used by intake workers or case managers often located at various agencies rather than at a central evaluation office. The primary goal of this Web-based software is to maximize the capture of descriptive information on all children served in system of care programs while minimizing the burden associated with the Cross-Sectional Descriptive Study.

  • Data Monitoring and Management. Software allows the National Evaluator and CMHS to monitor the status of each site’s data submissions in real time and permits sites to check the status of their own data submissions.

  • Data Dissemination. Reporting features support sites’ abilities to use their data for quality assurance monitoring and system improvement purposes. Basic validations are completed during the data entry process. More complex validations requiring comparison of data across instruments and across time are performed on the ICN after data are uploaded to and stored in the central repository (a minimal sketch of this kind of cross-wave check follows this list). Additional reports posted on the ICN provide a vehicle for the review of aggregate data that CMHS has approved for public release. For example, Data Profile Reports, created 3 times per year, display a summary of child- and family-level descriptive and outcome data collected at the community and aggregate levels. Continuous Quality Improvement reports, also created 3 times per year, display community- and aggregate-level progress on certain key indicators of performance. Each month, communities receive reports identifying any potential data errors or issues. The National Evaluator is currently automating these reports so that communities will have real-time, on-demand access to them. These features will be available to Phase VI communities as soon as data collection is underway.
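
To illustrate the kind of cross-instrument, cross-wave validation described above, the following minimal Python sketch flags a follow-up record whose values conflict with intake data. The field names and rules are hypothetical and illustrative only; they do not represent the ICN’s actual schema or implementation.

    # Illustrative cross-wave validation check; field names and rules are
    # hypothetical and do not reflect the ICN's actual schema.
    from datetime import date

    def validate_follow_up(intake, follow_up):
        """Return a list of validation issues found between two waves."""
        issues = []
        # A follow-up interview cannot precede the intake interview.
        if follow_up["interview_date"] < intake["interview_date"]:
            issues.append("Follow-up interview date precedes intake date.")
        # Date of birth should not change between waves.
        if follow_up["child_dob"] != intake["child_dob"]:
            issues.append("Child date of birth differs across waves.")
        return issues

    intake = {"interview_date": date(2010, 3, 1), "child_dob": date(2001, 7, 15)}
    follow_up = {"interview_date": date(2010, 1, 5), "child_dob": date(2001, 7, 15)}
    print(validate_follow_up(intake, follow_up))
    # ['Follow-up interview date precedes intake date.']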


The National Evaluator will provide training and direct evaluation technical assistance support to sites to facilitate the implementation of the evaluation protocol and the use of evaluation results at the site level. Site personnel will be trained to utilize the ICN at national training meetings and during evaluation technical assistance visits to the sites.


System of Care Assessment. System of Care Assessment data, which primarily are qualitative in nature, are collected by the National Evaluator during site visits and do not lend themselves to the use of special technology at this time.


Sustainability Study. The Sustainability Survey will be conducted as a Web survey. Because it will be necessary to link the responses of individuals who completed System of Care Assessment interviews to their Sustainability Survey responses, procedures to maintain anonymity will not be employed. Respondents will enter a Web address, username, and password into their Web browsers to open and complete the survey. Because the names and contact information of respondents in Phase VI as well as earlier-funded communities will be maintained by the National Evaluator, e-mail contacts will be available. A letter describing the survey and instructions for logging onto the Web survey will be sent by either e-mail or mail to respondents. For those people who cannot complete the survey on the Web, the option to complete a paper-and-pencil survey will be provided. Survey completion can be monitored through each login to assess response rates and to implement targeted follow-up mailings and phone calls to nonrespondents, as sketched below.
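
Because each respondent logs in with a unique account, completion can be tallied continuously and nonrespondents singled out for follow-up. The sketch below shows the basic bookkeeping; the identifiers and data structures are hypothetical, since the document does not describe the survey platform’s internals.

    # Hypothetical sketch of login-based completion monitoring; the actual
    # survey platform and data structures are not specified in this document.
    def completion_status(invited, completed_logins):
        """invited: list of respondent IDs; completed_logins: set of IDs
        whose logins show a completed survey."""
        nonrespondents = [r for r in invited if r not in completed_logins]
        response_rate = 1 - len(nonrespondents) / len(invited)
        return response_rate, nonrespondents

    rate, pending = completion_status(
        invited=["site01-pd", "site02-pd", "site03-pd", "site04-pd"],
        completed_logins={"site01-pd", "site03-pd"},
    )
    print(f"Response rate: {rate:.0%}; follow up with: {pending}")
    # Response rate: 50%; follow up with: ['site02-pd', 'site04-pd']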


Services and Cost Study. The data will be collected and submitted to the national evaluation using two tools. The Flexible Funds Tool (FFT) was developed by the national evaluation and is in use in multiple currently funded communities. The FFT, developed in Microsoft Excel with input from system of care representatives, tracks flexible funds expenditures at the service-episode level and has been pilot tested in several communities. The application features programmed validation checks for quality assurance, preset expenditure categories, and data reports that display graphs and data tables with dynamic query options. The tool has required core fields and allows communities to add customized fields. The Services and Costs Tool is a Web-based data collection application that is under development by the national evaluation. The application is designed to create a child-level data record for each system of care service received by children/youth. The resulting data file is intended to provide a comprehensive service record for each child/youth, with sector-specific pre-structured service categories and responses and other-specify text fields. Grant communities have the option to key in data in any of these service modules or to upload an extract file representing the same data. The application will feature programmed validation checks for quality assurance, preset response categories, secure access authorization for multiple persons within each community, and multiple automated reports. The uploading features will include accommodation of multiple file formats (SPSS, Excel, ASCII); data encryption during transfer to protect the data; and file inspection for compatibility with the central database.
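
As a minimal illustration of a child-level service record with preset response categories, an other-specify text field, and a programmed validation check, consider the sketch below. The category labels and field names are hypothetical and are not drawn from the Services and Costs Tool’s actual schema.

    # Hypothetical child-level service record with preset categories, an
    # other-specify text field, and a programmed validation check, in the
    # spirit of the Services and Costs Tool (actual schema not shown here).
    PRESET_CATEGORIES = {"case management", "outpatient therapy",
                         "respite care", "inpatient hospitalization", "other"}

    def validate_service_record(record):
        issues = []
        if record["service_category"] not in PRESET_CATEGORIES:
            issues.append("Unknown service category.")
        if record["service_category"] == "other" and not record.get("other_specify"):
            issues.append("'Other' category requires an other-specify description.")
        if record["units"] <= 0 or record["unit_cost"] < 0:
            issues.append("Units must be positive and unit cost non-negative.")
        return issues

    record = {"child_id": "C-0042", "service_category": "other",
              "other_specify": "", "units": 2, "unit_cost": 55.00}
    print(validate_service_record(record))
    # ["'Other' category requires an other-specify description."]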


Services and costs data collected by communities provide valuable information that supports not only the national evaluation’s Services and Costs Study, but also grant communities’ local fiscal management, program performance measurement, and local data reporting needs. The national evaluation’s development of these two data entry applications minimizes communities’ need to develop their own systems locally and the costs of such development.


Sector and Comparison Study. Sector and Comparison Study data collection will involve utilization of the Web-based ICN and Services and Costs Tool and the Excel-based Flexible Funds Tool described above in the Services and Costs Study. In addition, the Sector and Comparison Study will involve identification and collection of administrative data from schools, criminal justice systems, and child welfare agencies. These data will include individual youth records such as school records, court records, and child welfare records. Macro will identify and request administrative data for individual youth participating in the Sector and Comparison Study. Data will be transferred electronically directly to Macro via a secure Web site. Transfer of existing administrative data reduces the need for additional data entry by project staff and reduces the potential for error.


CQI Initiative Evaluation. Both the Baseline Survey and the Monitoring Survey will be administered as Web-based surveys. Respondents will enter a Web address and password into their Web browsers to open and complete the survey. Because the names and contact information of respondents in Phase VI communities will be maintained by the National Evaluation Team, e-mail contacts will be available. Instructions for logging onto the Web survey will be sent to respondents by either e-mail or ground mail. For those who cannot complete the survey on the Web, the option to complete a paper-and-pencil survey will be provided. Survey completion will be monitored through each login to assess response rates and to implement targeted follow-up mailings and phone calls to nonrespondents.


Alumni Networking Study. A variety of technologies will be used to conduct the Alumni Networking Study. The Alumni Networking and Collaboration Survey will be conducted as a Web-based survey. Respondents will enter a Web address, username, and password into their Web browsers to open and complete the survey. Because the names and contact information of respondents in Phase VI as well as earlier-funded communities will be maintained by the National Evaluator, e-mail contacts will be available. A letter describing the survey and instructions for logging onto the Web survey will be sent by either e-mail or mail to respondents. For those people who cannot complete the survey on the Web, the option to complete a paper-and-pencil survey will be provided. Survey completion can be monitored through each login to assess response rates and to implement targeted follow-up mailings and phone calls to nonrespondents.


The Alumni Network Web Site Satisfaction Survey will be incorporated into the Alumni Network Web site to gather end-user satisfaction data on the content, format, and design of the Web site. A random sample of non-registered users will be invited to complete the Web-based Satisfaction Survey via a pop-up window as they access the site (a minimal sketch of one way to draw such a sample follows below). Each user will have the ability to decline to participate in the Web-based survey by selecting the “no” option on the survey invitation. Additionally, all registered users of the Alumni Network Web site will be invited to participate in the Satisfaction Survey via an e-mail containing a description of the survey and instructions for accessing the Web survey. Targeted follow-up e-mails and phone calls will be directed to registered users based on login history and assessment of response rates.
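
One simple way to draw such a random sample is to make an independent probabilistic decision on each non-registered visit, as in the sketch below. The invitation rate shown is hypothetical; the document does not specify how the sample is drawn.

    # Hypothetical per-visit sampling decision for the pop-up survey
    # invitation; the actual sampling rate and mechanism are not specified.
    import random

    INVITATION_RATE = 0.10  # invite roughly 1 in 10 non-registered visits

    def should_invite(is_registered_user):
        """Registered users are invited by e-mail instead, so they are
        excluded from the pop-up sample."""
        if is_registered_user:
            return False
        return random.random() < INVITATION_RATE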



4. EFFORTS TO IDENTIFY DUPLICATION


The 2005 report developed by the Institute of Medicine (IOM), “Improving the Quality of Health Care for Mental Health and Substance-Use Conditions,” encourages the development of an overall strategy to address mental health and substance-use conditions that includes an infrastructure to produce and disseminate scientific evidence of effective treatments and research funds that are used for studies directly related to clinical practice and policy. The more recent IOM report (IOM, 2009), Preventing Mental, Emotional, and Behavioral Disorders Among Young People: Progress and Possibilities, focuses on the importance of preventing mental, emotional, and behavioral (MEB) disorders through the application of universal, selective, and targeted interventions with individuals and groups of children and youth who are at risk of developing serious MEB disorders, and identifies a number of programs with a sufficient evidence base to warrant consideration of broader implementation. Thus, the issue of real-world effectiveness is a continuing concern for interventions designed to prevent and treat mental health problems. At this critical juncture, the Phase VI evaluation offers a unique opportunity to address these overlapping needs through the quasi-experimental comparison study, conducted as part of the sector-specific enhanced study, that will examine the effectiveness of the system of care approach in specific target groups, congruent with the recommendations of the 2009 IOM report.

The study will compare children and families enrolled in system of care services through agencies in each of the three identified child-serving sectors with similar children and families who are involved with agencies in similar child-serving sectors in jurisdictions that are not receiving system of care funding. For each sector’s group of children, comparison samples will be drawn. Children and families selected from comparison agencies will have characteristics that would make them eligible for system of care services if grant funding were available in their jurisdiction. The program will identify jurisdictions without federally funded systems of care that are similar in structure, community demographics, and other relevant characteristics to the funded communities. As an example, for a juvenile justice child-serving sector in a funded system of care community, the comparison sample will be selected through child-level matching from agencies in unfunded service delivery systems that are serving similar youth with both justice-related and mental health involvement. Thus, we are not proposing a matched comparison site for each system of care site, as in previous comparison studies in the national evaluation; rather, we will conduct a rigorous child-level matching process in similar unfunded agencies.
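
The document does not specify the matching algorithm, but one common approach to child-level matching of this kind is nearest-neighbor matching on an estimated propensity score. The Python sketch below is illustrative only, with hypothetical covariate inputs, and should not be read as the evaluation’s actual procedure.

    # Illustrative child-level matching between funded and comparison
    # agencies via propensity-score nearest neighbors; the evaluation's
    # actual matching method is not described here.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def match_children(funded_X, comparison_X):
        """Match each funded child to the most similar comparison child.

        funded_X, comparison_X: 2-D arrays of child-level covariates
        (e.g., age, sex, prior service use). Returns one comparison-pool
        row index per funded child (1:1 matching with replacement).
        """
        X = np.vstack([funded_X, comparison_X])
        y = np.concatenate([np.ones(len(funded_X)), np.zeros(len(comparison_X))])
        # Propensity score: probability of being in a funded community
        # given the observed covariates.
        ps = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
        funded_ps = ps[:len(funded_X)].reshape(-1, 1)
        comparison_ps = ps[len(funded_X):].reshape(-1, 1)
        nn = NearestNeighbors(n_neighbors=1).fit(comparison_ps)
        _, idx = nn.kneighbors(funded_ps)
        return idx.ravel()

    # Example with toy covariates:
    rng = np.random.default_rng(0)
    print(match_children(rng.normal(size=(5, 2)), rng.normal(size=(20, 2))))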


Studies comparing CMHI-funded system of care communities with communities that do not have funding have provided invaluable information regarding the complex nature of system development, program implementation, and individual child and family characteristics, as well as how they affect change in outcomes across time (Brannan, Baughman, Reed, & Katz-Leavy, 2002; Foster & Connor, 2005; Hernandez et al., 2001; Stephens, Holden, & Hernandez, 2004). Comparison studies have used a variety of designs, including pre-post comparisons (Salazar, Sherwood, & Toche, 1997), matched communities with and without systems of care (Bickman et al., 1995; Foster et al., 2007; Stephens et al., 2005), comparisons of data from system of care demonstration communities to other counties within a State (Rosenblatt & Attkisson, 1993), and randomized trials (Bickman, Noser, & Summerfelt, 1999). However, large-scale multisite evaluations like the national evaluation of the CMHI present challenges in implementation, including the variation in context and cultures among funded communities, variation in characteristics of children and families served, and differences in the specific objectives of systems of care across communities. To date, conclusions about the effectiveness of systems of care based on data from quasi-experimental designs are somewhat limited because of these methodological challenges.


The development of designs to address these needs within the national evaluation has generally followed questions emerging from the children’s mental health services field. Although many questions remain about the effectiveness of systems of care at the clinical outcome level (Burns & Hoagwood, 2002; Stephens et al., 2005; Surgeon General’s Report, 1999), data exist to support continued work on implementation of the approach within community settings. Strong consumer advocacy for alterations in traditional mental health services approaches for children with serious emotional disturbance and their families is an important driving factor in sustaining Federal- and State-level efforts.


The National Evaluator also conducted an extensive literature search to identify existing evaluation research on systems of care and children’s mental health services. The search included a review of published literature, unpublished papers, works-in-progress, and working papers and documents. During the implementation of the Phase I–V evaluations, the National Evaluator has kept abreast of the literature in children’s mental health services research and has been in close contact with the original grantees. This has allowed the team to keep up with advances in practice and research. In addition, the Services Evaluation Committee for the national evaluation has helped keep the evaluation apprised of new innovations in the field. These efforts yielded a broad list of useful references. While some of the research identified contains features similar to the planned evaluation, the scope of the research projects varies considerably and is driven by the particular research interests of each investigator. The Phase VI evaluation offers unique contributions to the field not available in these other studies. The nature of these studies and the unique contributions being made by the Phase VI evaluation are summarized below.


“Systems of Care for Children and Adolescents with Serious Emotional Disturbances: What Are the Results?” published by Beth Stroul in 1993, contains a complete review of studies of local systems of care. Stroul concluded that while there is a growing body of evidence to support the contention that systems of care provide high-quality and more appropriate care, continuing commitments to research and evaluation are needed. Further, attention should be directed beyond the assessment of short-term outcomes. She called for the development of a common set of outcome indicators that would provide a framework for more systematic studies and multi-site analyses. The evaluations for all phases of the project address these concerns because they cover multiple sites and share standard instrumentation. Phases I, II, and VI include comparison sites, and Phases II, III, and IV include evidence-based treatment studies. Beginning in Phase II and continuing in Phase VI, data are collected from children and families after the completion of services to examine long-term outcomes.


In 2002, Stroul published Issue Brief—System of Care: A Framework for System Reform in Children’s Mental Health. The purpose of this issue brief was to re-examine system reform in children’s mental health, clarify what the system of care concept is, and explore the continued relevance of the system of care concept and philosophy as a framework for reform. Four questions are addressed: (1) What kind of system reform is needed for children’s mental health? (2) What is the actual meaning of the system of care concept? (3) Why should we continue to use the system of care concept and philosophy as a framework for system reform in children’s mental health? (4) How can we achieve our system reform goals in children’s mental health? The national evaluation addresses these questions through a number of its studies including the System of Care Assessment and the Child and Family Outcome Study.


In 2008, Stroul and Blau published their edited book The System of Care Handbook. The purpose of the book was to provide a compendium that informs the development of systems of care, drawing from the evidence base on effective strategies for systems building and service delivery. Emphasis was placed on providing recommendations for practice. Evaluation results were used to illustrate how data can be used to inform decision-making at various levels in system change initiatives. Content focused on building and sustaining systems of care, implementing evidence-based practices in these systems, and providing services in a culturally and linguistically competent way that promotes the elimination of disparities in mental health services delivery. The implications for future evaluation acknowledged the importance of developing generalizable knowledge about the effectiveness of systems of care. The evaluation for Phase VI addresses this ongoing need with the inclusion of the quasi-experimental comparison study.


The Alternatives to Residential Treatment Study (ARTS) project, which started in the early 1990s, was conducted by the Research and Training Center for Children’s Mental Health of the Florida Mental Health Institute to study the effectiveness of five innovative programs (Duchnowski, Hall, & Kutash, 1998; Duchnowski, Hall, Kutash, & Friedman, 1998). Components of this study included descriptions of the children and families served, interventions employed, program costs, and outcomes for children over time. This study contributed to the field by documenting the experiences of individuals affected by changes in service delivery systems. However, the ARTS project sample was relatively small (87 children). As a result, generalizable conclusions about the effectiveness of the system of care approach cannot be drawn from it. With a larger sample and more sites, Phase VI offers an opportunity to produce generalizable findings for those elements covered in ARTS. In addition, unlike ARTS, Phase VI will address the effect of system of care and service-level factors on outcomes.


The National Adolescent and Child Treatment Study (NACTS) was a 7-year longitudinal study conducted at 121 sites in 6 States by the Research and Training Center for Children’s Mental Health of the Florida Mental Health Institute. It assessed the treatment provided to children with serious emotional disturbance in residential mental health facilities and in community-based special education programs (Greenbaum, Dedrick, Friedman, Kutash, Brown, Lardieri, & Paugh, 1996). Although the NACTS project studied children in residential treatment and community-based special education programs, it focused on describing children rather than the services they received. The NACTS was not evaluative, but descriptive, in nature. In addition to describing children receiving services in a community-based system of care, the Phase VI evaluation also assesses outcomes and service delivery and use.


The Robert Wood Johnson Foundation (RWJF) Mental Health Services Program for Youth, conceived in 1988, funded eight community programs that were evaluated by Brandeis University (Cole & Poe, 1993; Cole, 1996; Saxe & Cross, 1997). The evaluation of that program focused on changing financing policies and refining new treatment strategies and did not aim to assess client outcomes over time. While not mandated by the evaluation, some sites collected child and family outcome data. However, their findings were limited due to differences in instrumentation that compromised the ability to compare results across the sites. The national evaluation systematically evaluates child and family outcomes using a standard set of instruments, thus allowing for comparison across sites and, when appropriate, aggregation of data.


Another evaluation of the RWJF program in North Carolina was started in 1992 and conducted by researchers at Duke University (Burns, Farmer, Angold, Costello, & Behar, 1996; Angold, Burns, Costello, & Behar, 1998). For this study, children were randomly assigned to one of two models of case management to determine their impact on mental health outcomes for children. Unlike Phase VI, this study did not evaluate the effectiveness of the full continuum of service options or study the roles of multiple child-serving sectors (e.g., juvenile justice, education, child welfare).


The Center for Mental Health Policy at Vanderbilt University evaluated the Fort Bragg Child and Adolescent Mental Health Demonstration Project. The evaluation of this project, which served children of military personnel in the Fort Bragg area, had four components. First, it described how the demonstration project was implemented and highlighted key process indicators (e.g., linkages among providers, extent of family involvement). Second, it examined whether the quality of services provided was sufficient to produce the predicted effect on outcomes. Third, it studied the cost of providing services and patterns of service use. Finally, it assessed the mental health outcomes of the children using a quasi-experimental design that included two comparison sites (Bickman, Guthrie, Foster, Lambert, Summerfelt, Breda, & Heflinger, 1995). Several of these general areas of inquiry overlap with the Phase VI evaluation. However, the Fort Bragg study focused on services in the mental health sector, ignoring other child-serving sectors. The evaluation indicated that services delivered through a continuum of care did not produce significantly better clinical outcomes than regular CHAMPUS-funded services for military dependents. Access to services was greater in the demonstration site with resulting increases in costs. A subsequent investigation utilized a randomized control group design to evaluate the effectiveness of system of care services for children with serious emotional disturbance and their families seeking services in Stark County, Ohio. This latter effort also found no significant clinical and functional differences between children served in a system of care and those who received treatment as usual, although the children enrolled in this trial may have been minimally functionally impaired and the number of participants limited the power to detect significant differences (Bickman, Summerfelt, Firth, & Douglas, 1997).


The Phase VI evaluation has a broader population scope than the Fort Bragg study since it is not limited to the children of military personnel. It is notable that more than one-half of the children in grant communities funded between 1997 and 2003 lived in poverty and less than 25 percent lived in households with both of their biological parents. Phase VI grantees are expected to serve similar populations, and, as such, findings from Phase VI are more likely to generalize to the children and families served by public agencies.


The 1999 Mental Health: A Report of the Surgeon General included a review of the effectiveness of systems of care. The report concluded that while findings are encouraging, the effectiveness of systems of care has not been demonstrated conclusively, and that the findings of the Fort Bragg study, in particular, indicated the importance of evaluating the impact of changes at the system level on practice. The report’s findings indicated that further research needs to focus on practice-level issues and the relationship between changes at the system level and changes at the practice level. The report also concluded that, at the time, research had not yet demonstrated that services delivered within a system of care resulted in improved clinical outcomes relative to services delivered within traditional systems. Since the report’s publication, there has been little additional work to address this gap beyond the comparison studies included in Phases I and II of the national evaluation. Findings from these studies suggested that, as expected, funded systems generally demonstrated greater adherence to system of care principles than did their matched comparison communities (Brannan, Baughman, Reed, & Katz-Leavy, 2002), yet clinical and functional outcomes of youth improved over time for those served in both systems of care and comparison communities (Stephens et al., 2005; Stephens, Holden, & Hernandez, 2004; Foster, Stephens, Krivelyova, & Gyamfi, 2007). Systems of care were more effective in reducing the risk of subsequent juvenile justice involvement (Foster, Qaseem, & Connor, 2004), and youth served in systems of care showed greater reductions in functional impairment than youth served in comparison systems, but these findings were not observed consistently across the community pairs (Stephens et al., 2005; Foster et al., 2007). There is evidence to suggest that a relationship exists between the experience of system of care principles in services and reductions in behavioral and emotional problems (Stephens, Holden, & Hernandez, 2004). Differences in the relative effectiveness of systems of care across sites may reflect differences in system implementation, especially with regard to service provision (Foster et al., 2007). Provision of services in a system of care costs more than traditional service delivery, but this increase in costs is partially offset by the decrease in costs associated with juvenile justice placements (Foster & Connor, 2005).

The Sector and Comparison Study proposed in Phase VI allows for continued examination of system of care effectiveness and attempts to address some of the methodological challenges seen in previous CMHI comparison studies. Through rigorous child-level matching, implementation challenges due to variation in contexts and in the characteristics of children and families will be reduced. This study also examines cost-effectiveness and includes a cost-benefit analysis, an element not examined in previous comparison studies. Taken together, these features will provide better information about the effectiveness of systems of care.


The New Freedom Commission on Mental Health published Achieving the Promise: Transforming Mental Health Care in America Final Report in 2003. This report outlined six goals developed by the New Freedom Commission to transform the mental health care delivery system in the United States. The fourth goal focused on early mental health screening and early assessment. One of the recommendations regarding this goal was to promote the mental health of young children. The Phase VI evaluation includes an early childhood package with measures of social and emotional development, children’s strengths, parental distress, and parent-child interaction. The fifth goal in this report highlighted accelerating research to improve mental health care. The comparison study added to Phase VI of the national evaluation is aimed at better understanding the effectiveness of systems of care in specific child-serving sectors, as well as understanding specific factors that improve mental health care in a system of care environment. In addition, data collected through the national evaluation are an important addition to the field’s knowledge base.


As explained above, Phase VI does not duplicate extant studies, but instead enhances the existing knowledge base. In addition, Phase VI provides information that is specific to this service program. As required by the legislation, data must be collected from the communities in which the program has been funded.


As described above in Section A.1.d, advances in the field of children’s mental health have emphasized the importance of assessing the impact of providing coordinated, community-based mental health services through a system of care environment, and the ability to sustain system of care services. Consequently, Phase VI addresses both of these issues by including a sector-specific, quasi-experimental comparison study aimed at increasing the understanding of the factors that affect improvements in clinical outcomes for children and their families. In addition, the enhanced study on sustainability will address the status of funded communities’ ability to sustain their systems of care after funding ends.



5. INVOLVEMENT OF SMALL ENTITIES


Some of the data for this evaluation will be collected from mental health, juvenile justice, education, and child welfare agencies. While most data will be collected from public agencies, it is possible that some organizations providing services to the target population, such as community-based organizations, not-for-profit agencies, private providers, schools, or parent groups, would qualify as small entities. The information requested is the minimum required to meet the study objectives. The site visit interview guides used in the System of Care Assessment, the Web-based surveys employed in the Sustainability Study and the CQI Initiative Evaluation, and the sector-specific instruments used in the Sector and Comparison Study are the only instruments that will be administered to the staff of small entities.



6. CONSEQUENCES IF INFORMATION IS COLLECTED LESS FREQUENTLY


System of Care Assessment. Data for this component will be collected every 18–24 months across the 6 years of system of care community funding (beginning in the second year), documenting how the program has led to system enhancement. This information is key to examining whether improved outcomes for the children served by the system can be plausibly linked to this initiative. Because systems of care change slowly, collection of system data every 18–24 months is sufficient to provide information on system implementation, organizational involvement, and relationships. If these data were collected less frequently, important interim changes would not be documented. The System of Care Assessment data collected during the evaluations in Phases I, II, III, IV, and V have been valuable to CMHS and the system of care communities in mapping progress and making decisions about program resources and strategies, and have been useful in identifying interim technical assistance needs. In Phase VI, continued efforts will be made to apply System of Care Assessment results to CMHS program decisions and technical assistance efforts.


Cross-Sectional Descriptive Study. Data for this component will be collected when children and families first access the system of care. As part of their normal operations, grantees collect data on children and families including demographics, service use, status, treatment plans, and other information. These and other data elements are maintained by the grantees for their own administrative purposes; hence their collection creates no additional respondent burden. For families participating in the Child and Family Outcome Study, however, the descriptive information that may have changed over time (e.g., family income, caregiver’s marital status) will be collected at each follow-up data collection point. Failure to collect these few data elements at follow-up would preclude the detection of key changes in the child’s environment that could have an important impact on the child’s clinical outcomes, service use, or family functioning. Data from the grantee sites will be submitted to the National Evaluator continuously using the ICN, resulting in a minimal burden to site staff.


Child and Family Outcome Study. For this component, data will be collected at intake and every 6 months for the length of the evaluation, up to 24 months. Clinicians who work with this population of children suggest that once children enter services, they are likely to experience detectable improvements within the first 6 months of services. However, whether improvement is sustained is important to demonstrate. Assessing outcomes every 6 months allows for the study of the course of improvement over time so that interventions can be planned for times that are likely to yield the greatest gains. Thus, waiting 12 months to collect outcome data would miss important changes that are likely to happen in children who are still developing. On the other hand, it was the judgment of the Research Advisory Board and prior grantees that quarterly data collection would be too burdensome.


The data collection schedule calls for collecting data on all children and families in the longitudinal Child and Family Outcome Study for the duration of the evaluation. It is important to follow children as long as possible to capture changes that occur as children enter new developmental stages, especially adolescence and young adulthood. We have changed our power analysis assumptions to reflect higher effect sizes (observed in the analysis of data from more recently funded communities) and a between-site versus within-site difference in our analytical approach. As a result, we are modifying the core outcome and service experience study requirement for enrollment into the outcome study and reducing outcomes data collection at the site level to 24 months after service intake. This will allow local evaluation staff to expend resources on longitudinal follow-up to increase the quality of data and improve retention rates.
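
To illustrate how an assumed effect size drives the required sample, a simple two-group power calculation is sketched below. The evaluation’s actual power analysis (longitudinal, between-site versus within-site) is more involved, and the effect size, alpha, and power values here are hypothetical.

    # Illustrative two-group power calculation; the parameter values are
    # hypothetical and simpler than the evaluation's actual analysis.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(
        effect_size=0.5,  # assumed standardized effect size (hypothetical)
        alpha=0.05,       # two-sided significance level
        power=0.80,       # desired statistical power
    )
    print(f"Required sample per group: {n_per_group:.0f}")
    # A larger assumed effect size reduces the required sample size.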

Service Experience Study. Data for this study component will be collected at intake into the evaluation and at subsequent 6-month intervals in conjunction with the Child and Family Outcome Study. At each data collection point, a screening question will indicate whether any services have been received during the previous 6-month period. If so, questions for the Multi-Sector Service Contacts, Revised (MSSC–R), the Youth Services Survey for both youth and family (YSS), and the Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R) will be asked. If not, these sets of questions will be skipped. This will provide youth and caregiver perspectives at various stages of treatment as their needs and services change (e.g., during intensive involvement, while transitioning to less intensive services, and after formal discharge from mental health services). If these data were collected less frequently, the National Evaluator would not be able to track the service changes that may be linked to changes in outcomes.


Sector and Comparison Study. Data for this study component will be collected at intake into the evaluation and at subsequent 6-month intervals in conjunction with the Child and Family Outcome Study. Of particular interest for the sector and comparison studies are functional outcomes such as educational performance, abstention from criminal behavior, and placement stability. It is important to follow children as long as possible to capture changes that occur as children enter new developmental stages, especially adolescence and young adulthood.

For the educational sector, teachers will be assessed at baseline and every 6 months at follow-up for similar reasons, since a large part of the assessment is aimed at collecting child-level data. The school administrator survey will be administered at baseline and every 12 months thereafter. The 12-month interval is chosen to follow the school academic year; collecting the data less frequently could miss important school-level changes that are likely to occur with each new academic year. For the juvenile justice sector, court representatives who are responsible for oversight of youth completion of court-required activities will be assessed at baseline and every 6 months at follow-up for similar reasons. Youth are required to report regularly to court representatives to ensure completion of activities, and completion may occur over a period of months or years, depending on the youth’s sentence or status. For the child welfare sector, child welfare social workers/case managers will be assessed at baseline and every 6 months at follow-up for similar reasons. Placement settings for children and youth involved in child welfare can change over a period of months.


Services and Cost Study. Data used in this study come from communities’ management information systems (MISs) and are aimed at capturing all services received by children and their families and the associated costs. These data are episodic in nature; failing to collect information on all episodes of service would result in underreporting of service utilization and underestimating the service costs incurred by children and families.
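
As a minimal illustration of why every episode matters, the sketch below aggregates hypothetical episode-level records into per-child utilization and cost totals; dropping any record understates both.

    # Minimal sketch of episode-level aggregation (hypothetical records);
    # omitting any episode understates both utilization and cost.
    episodes = [
        {"child_id": "C-0042", "service": "outpatient therapy", "cost": 110.0},
        {"child_id": "C-0042", "service": "case management", "cost": 65.0},
        {"child_id": "C-0077", "service": "respite care", "cost": 180.0},
    ]

    totals = {}
    for ep in episodes:
        t = totals.setdefault(ep["child_id"], {"episodes": 0, "cost": 0.0})
        t["episodes"] += 1
        t["cost"] += ep["cost"]

    print(totals)
    # {'C-0042': {'episodes': 2, 'cost': 175.0},
    #  'C-0077': {'episodes': 1, 'cost': 180.0}}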


If services and costs data were not collected from the beginning of service delivery within a consistent data structure across all grant communities, the ability to accomplish these study goals would be seriously diminished. SAMHSA is often asked to demonstrate the cost-effectiveness of this grant program. Without complete and consistent data from all communities, the validity of these types of cost analyses would be compromised.


Data collection for this study involves on-going data accumulation beginning when the grant communities initiate services within their system of care program. Some grant communities currently collect this information electronically as part of their normal program procedures, some communities currently collect it on paper, and some communities are not yet collecting this information. The national evaluation’s Services and Costs Study is requesting communities to collect services and costs data routinely as services are delivered.


Sustainability Study. Data on sustainability will be collected from representatives of all award communities in years 2 and 5 of the evaluation. It is necessary to collect these data at multiple points to assess the progress being made toward sustaining funding for continued operation during the funding period and toward sustaining programs after the funding cycle. Evaluation of sustainability over time is needed because the amount of nonfederal funds required increases each year, as does the developmental stage of the systems of care. This makes the second evaluation point distinct from the first and will yield important information on the process of becoming increasingly independent of Federal support, the critical stages in efforts toward sustainability, and where in the process potential barriers to sustainability are most likely to arise. Assessing sustainability only at the end of the funding cycle would yield information on whether a site has achieved sustainability but would not provide insight into the process of becoming sustainable or the barriers to and facilitators of sustainability. The final survey administration will occur in the same year as programs’ System of Care Assessment, and having these complementary data from the same points in time will permit a more comprehensive understanding of sustainability efforts at each site.


The shorter version of the Sustainability Survey will be collected from a former or current project director of graduated communities annually for up to 4 years (the 5-year post-funding assessment will use the long version of the survey). Annual post-funding assessment of graduated grantees is essential to obtain accurate information using a consistent format and parameters for determining sustainability. Sustainability at 5 years post funding is a long-term GPRA measure for the program. The long Sustainability Survey has been used with previously funded grantees to assess whether their programs have been sustained. Because SAMHSA is called upon to provide updates on the status of former grantees on a regular basis, and must report on the long-term GPRA measure, collecting these data less frequently would not be consistent with agency reporting needs.


CQI Initiative Evaluation. Data on the CQI Initiative will be collected via a Baseline Survey of key constituents in all 2008-funded communities in year 1 of program delivery; a subsequent Monitoring Survey administered to the same constituents in years 3 and 5; and biennial Case Studies of four selected communities in years 2 and 4. Longitudinal evaluation of the CQI Initiative is essential to document the development of and changes in the CQI process within communities over time. Collecting this information less frequently will not provide a comprehensive assessment of the CQI Initiative and the extent to which it has been implemented.


Alumni Networking Study. The Alumni Networking and Collaboration Survey will be collected in years 1 and 3 of the evaluation and will inquire about the extent to which currently funded and alumni system of care communities interact with each other and with program partners on activities such as governance, individualized care, funding, family-driven care, youth-guided care, culturally competent care, sustainability, evaluation, program technical assistance, and evaluation technical assistance as a result of the Alumni Network Web site. The initial data collection point in year 1 will provide baseline information on the extent of collaboration among currently and formerly funded communities. Between years 1 and 3 of the evaluation, the Alumni Network activities will target those topical areas covered in the Networking and Collaboration Survey that respondents reported as not being facilitated by the Alumni Network. This information is critical to understanding the extent and nature of the collaboration between sites through the use of the Alumni Network Web site, as this has implications for how systems of care sustain themselves after funding. It is anticipated that the results of the Networking and Collaboration Survey in year 3 will show increased collaboration around all topical areas facilitated by the Alumni Network. Therefore, measuring relationships among both currently and formerly federally funded system of care communities in alternating years will provide the minimum frequency of data collection required to assess change in collaboration over time as a result of the Alumni Network Web site.


The Alumni Network Web Site Satisfaction Survey will be administered in alternating years of the evaluation (years 2 and 4) to gain insight into the design, content, and format of the Alumni Network Web site from a wide variety of end users. Information collected during these data collection periods will be used to modify and strengthen the utility of the Web site. As content, design, and formatting are changed based on the feedback collected through this survey, it is anticipated that end-user satisfaction with the Web site will increase over time. If these data were collected less frequently, the national evaluation would not be able to measure changes in end-user satisfaction over time.



7. CONSISTENCY WITH GUIDELINES IN 5 CFR 1320.5(d)(2)


The data collection fully complies with the requirements of 5 CFR 1320.5(d)(2).



8. CONSULTATION OUTSIDE THE AGENCY


The notice in the Federal Register was published by SAMHSA on March 18, 2009 (Vol. 74, page 11593) to solicit public comment on this study. No comments were received.


Consultation on the design, instrumentation, data availability and products, and statistical aspects of the evaluation occurred continually throughout the implementation of Phases I, II, III, IV, and V. To capitalize on the experience and knowledge gained, the development of Phase VI was based, in part, on this consultation. Since the beginning of this initiative, consultations have been sought from the following:


  • Federal representatives working in related program areas

  • Experts in the area of child mental health services research

  • CMHS grantees

  • Families caring for children with emotional and behavioral disorders

  • Representatives of national organizations for children, families, and providers in the field (e.g., the National Technical Assistance Center for Children’s Mental Health, the National Mental Health Association, the Federation of Families for Children’s Mental Health, the National Alliance for the Mentally Ill, and the State Mental Health Representatives for Children and Youth)

  • Experts in program evaluation, measurement, and statistical analysis

  • Experts in Web site usability testing

  • Experts in mental health service systems for Native American children


These consultations had several purposes: (1) to ensure continued coordination of related activities, especially at the Federal level; (2) to ensure the rigor of the evaluation design, the proper implementation of the design, and the technical soundness of study results; (3) to verify the relevance and accessibility of the data to be collected; and (4) to minimize respondent burden.


a. Federal Consultation



Input from representatives of Federal agencies involved in children’s mental health issues has been elicited throughout all phases of the national evaluation. CMHS received input about its children’s services program from Federal offices including, but not limited to, the following: the Office of Special Education Programs, DoE; the Office of Juvenile Justice and Delinquency Prevention, DoJ; the Office of Disability, DHHS; and the Division of Adolescent and School Health, CDC. (See Attachment 2.A. for a list of the participants in the Federal/National Partnership for Children’s Mental Health and their affiliations and telephone numbers.)


These offices are involved in a public-private interagency partnership group to ensure that services for children with serious emotional disturbance and their families are coordinated at the Federal level and that evaluation results are useful to a wide audience. Specifically, representatives from the listed Federal agencies have convened to develop strategies for coordinated training, technical assistance, and culturally competent services to communities across the country.


In addition, SAMHSA, the parent agency of CMHS, requires that its other two constituent centers, the Center for Substance Abuse Treatment (CSAT) and the Center for Substance Abuse Prevention (CSAP), conduct an internal review of the Annual Report to Congress on the Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program. Evaluation specialists at the CDC, NIMH, and the Office of the Assistant Secretary for Planning and Evaluation (ASPE) of DHHS have also reviewed and provided comments on the national evaluation. Furthermore, NIMH has been represented on the Services Evaluation Committee of the national evaluation by various individuals over the past several years, including most recently Amy Goldstein, David Chambers, and Marina Broitman. (See Attachment 2.B. for a list of Methodological Consultants and Services Evaluation Committee to the national evaluation members.) Collaboration with NIMH led to the release of a program announcement (PA–00–135; Effectiveness, Practice, and Implementation in CMHS’ Children’s Service Sites) on September 21, 2000, by NIMH for the conduct of research studies on services delivered to children, adolescents, and their families in currently or previously CMHS-funded system of care communities. This mechanism encourages studies examining the nature and impact of routine clinical practice, and factors related to successful implementation of treatments or services. This program announcement addresses recommendations set forth in the NIMH report, “Bridging Science and Service: A Report by the National Advisory Mental Health Council’s Clinical Treatment and Services Research Workgroup,” and in the NIMH Child and Adolescent Services Research Strategic Planning Report. A revised program announcement (PA–04–019 [reissued as PA-06-526]; Effectiveness, Practice, and Implementation in CMHS’ Children’s Service Sites) was released on March 2, 2006, by NIMH. The scope of this program announcement was broadened to include research in communities with Safe Schools Healthy Students grants.


b. Expert Consultation


The Services Evaluation Committee of the national evaluation, a workgroup of expert consultants, was organized to provide technical guidance and review for Phase I of the evaluation. The Services Evaluation Committee continued to have input regarding the enhanced design and instrumentation for Phases II, III, IV, and V. Recommendations made by this group have influenced changes applied to the Phase VI instrumentation. Services Evaluation Committee members have combined expertise in children’s mental health, the delivery of children’s mental health services, and the evaluation of systems of care. (See Attachment 2.B. for a list of Services Evaluation Committee members.)


Most of the individuals invited to provide consultation were chosen because of their involvement in past or current studies of children’s mental health service systems. During previous phases, input has also been received from the National Association of State Mental Health Program Directors and the State Mental Health Representatives for Children and Youth.


c. Grantee Consultation



Previously funded grantees have been key providers of input for all phases of the evaluation design. For the design of Phase VI, grantee input was used in the development of the instrument package. In October 2008, project directors and evaluators from previously funded sites participated in the Phase VI Evaluation Review Meeting, where study design and instrumentation were discussed. These participants helped determine the instruments that are most appropriate for each component of the evaluation. Modifications to the Phase VI instrument package also reflect ongoing input received by the National Evaluator from Phases II, III, IV, and V grantees through conference calls, site visits, and semi-annual workshops and evaluator meetings. Additional grantee feedback was received during close-out site visits conducted with Phase IV communities, in which evaluation processes and data utilization were reviewed.


Several representatives from the grantee sites also participate in the Services Evaluation Committee of the national evaluation; these members offer the grantee site perspective on how research goals can be achieved at the sites with the least disruption.


In January and February 2002, CMHS initiated an annual consumer survey of the Phase II and Phase III grantee sites to assess satisfaction with implementation of the national evaluation and the role of the National Evaluator in this implementation (OMB Control # 0930–0197). The survey also asked for feedback from grantee site evaluators regarding desired changes in study design. This survey was repeated in April 2003 and in June 2007. CMHS received feedback from evaluators in almost all grantee communities and synthesized these data for use in quality improvement efforts.


d. Family Consultation


Critical to the CASSP principles is the role of family caregivers as active stakeholders in the system of care. That philosophy has been extended to all phases of the evaluation design in several ways. Caregivers participated on the Services Evaluation Committee and gave early input to the overall design. Caregivers also reviewed the instrumentation and key features of the evaluation design to ensure sensitivity to parent issues and concerns, to maximize clarity of meaning, and to assess the feasibility of administering the questionnaires. Input from family members participating in assessment interviews and in the Phase VI Evaluation Review Meeting indicated a need to reduce the length of the interview and to provide more accurate information; these recommendations are reflected in the Phase VI instrument package. Grantee sites systematically solicit feedback from family members; hence the family perspective is also included in comments and consultation from grantee sites. The evaluation team has a formal relationship with the Federation of Families for Children’s Mental Health to facilitate systematic and ongoing input to the evaluation.



9. PAYMENT TO RESPONDENTS


As with previous phases, Phase VI of the national evaluation will use a research-based approach to evaluation and, as such, will require participation of children and families beyond their receipt of services in their system of care programs. Consequently, remuneration is essential to ensure good response rates across all study components.


Remuneration levels in the System of Care Assessment, Child and Family Outcome Study, and Sustainability Study for Phase VI are the same as those currently approved in Phase V. Other Phase VI study components that will provide an incentive include the Sector and Comparison Studies, the CQI Initiative Evaluation, and the Alumni Networking Study.


System of Care Assessment. During each System of Care Assessment site visit, three caregivers of children who receive services in each system of care community are interviewed; the national evaluation will provide each a payment of $25 at the time of the interview to compensate for the additional burden and potential inconvenience. Two youth participants in each system of care community are also interviewed during each site visit and will receive a payment of $15 at the time of the interview on the same basis.


Child and Family Outcome Study. The National Evaluator strongly recommends that grantees remunerate the caregivers and youth who participate in the Child and Family Outcome Study $20 each at each administration. Remuneration is standard practice in this type of longitudinal research to acknowledge participants’ value to the study. It is essential to help maximize participation rates, particularly given the additional time being asked of families who already face multiple challenges and demands on their time in caring for their children with serious emotional disturbance. Caregivers and children who participate in the Child and Family Outcome Study are asked to complete more assessments than are ordinarily required in the course of receiving services, and completing the instruments at entry to services and at subsequent follow-up points requires participants to spend time away from other activities. The combination of the number of instruments and their periodicity creates a burden on caregivers and children that exceeds what would ordinarily be placed on them if they were seeking services not associated with this evaluation.


Services and Costs Study. Data for the Services and Costs Study are collected entirely from administrative and fiscal records by staff paid through grant funding. No incentives, payments, or gifts are proposed as part of this study.


Sustainability Study. As with the Phase II, III, and IV Sustainability Surveys, individuals asked to complete the Sustainability Survey will receive a token incentive (e.g., a refrigerator magnet) to encourage survey completion when they are informed about the survey.


Sector and Comparison Study. At baseline, incentives will be paid to caregivers and youth ($40 and $20, respectively). In addition, a bonus incentive of $50 will be paid to each caregiver and youth who completes all five waves of data collection. As noted, remuneration is standard practice in this type of longitudinal research to acknowledge participants’ value to the study and to help maximize participation rates, given the amount of time being asked of these families. Incentives of $20 will also be provided to agency representatives for their participation in interviews. In some cases, State and county agency representatives may not be allowed to accept incentives; in such cases, alternative methods of providing incentives will be devised, which may include a donation to the overall agency, to an agency project or activity, or to a charity of the respondent’s choice.


CQI Initiative Evaluation. Individuals who complete the CQI Initiative Evaluation Baseline Survey and Monitoring Survey will receive a $20 pre-paid credit card for the completion of each survey. Community members who participate in focus groups as part of the Case Studies will receive an additional $20 pre-paid credit card. These incentives may help to increase response rates and ensure continued participation over time.


Alumni Networking Study. Individuals who complete the Networking and Collaboration Survey will receive a $10 online gift certificate. Survey participants who request a paper copy of the Networking and Collaboration Survey will instead be sent a pre-paid credit card via ground mail. No incentives will be provided to end users who complete the Alumni Network Web Site Satisfaction Survey.



10. ASSURANCE OF CONFIDENTIALITY


Phase VI requires collecting descriptive and clinical data from children and families. In all the grantee sites, data are collected by site staff. These staff members are responsible for developing procedures to protect the privacy of all evaluation participants during data collection, data storage, and the reporting of all information obtained through data collection activities. These procedures include limiting the number of individuals who have access to identifying information, storing hardcopy forms in locked files, assigning a unique code number to each participant to protect his or her identity, and implementing guidelines pertaining to data reporting and dissemination.


Because of the sensitivity of the information that will be collected, CMHS has required that all grantees establish a system whereby data are gathered, stored, and accessed in a manner that protects the information as much as possible. The National Evaluator provides each grantee with a coding schema that each site uses to generate code numbers for individual respondents, and trains the staff responsible for data collection on developing codes and linking them to individual respondents. Sites are instructed to maintain a list of the codes and their assignment to individual respondents. A secure, stand-alone software application for storing codes with respondent names is also provided to sites. This application is password protected, and sites are instructed to limit access to the database to only those onsite evaluation staff who need the information. If a paper list is maintained, the list linking the assigned codes to respondent names is kept in a locked cabinet, and only the onsite data collection staff has access to it. The database or list will be maintained for the duration of the CMHS program so that the data can be linked back to the identified child and family throughout the data collection process. When the project is completed, the databases or lists will be destroyed. This coding system was developed to facilitate the tracking of children during their involvement with the evaluation and to ensure that no personal identifying information from the grantee sites would need to be made available to either the National Evaluator or CMHS.
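The mechanics of such a schema are straightforward; the sketch below is purely illustrative (the site prefix, code format, and file names are hypothetical, not the National Evaluator’s actual schema) and shows how the code-to-name linkage list can be kept separate from the data shared for analysis:

```python
import csv

def assign_code(site_prefix: str, sequence: int) -> str:
    """Generate a unique respondent code, e.g. 'SITE01-0042' (hypothetical format)."""
    return f"{site_prefix}-{sequence:04d}"

# The linkage list (code -> name) stays in a separate, access-restricted file.
linkage = [(assign_code("SITE01", i + 1), name)
           for i, name in enumerate(["Jane Doe", "John Roe"])]
with open("linkage_restricted.csv", "w", newline="") as f:
    csv.writer(f).writerows(linkage)

# Files shared with the national evaluation carry only the code, never the name.
scores = [("SITE01-0001", 42), ("SITE01-0002", 37)]
with open("outcome_data.csv", "w", newline="") as f:
    csv.writer(f).writerows(scores)
```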


The security of data entered and managed on the Internet-based ICN is also assured. Access to the ICN is password protected, and the ICN uses data encryption to further enhance security and privacy. Further, the project, including the ICN system, operates under an ADP/IT security plan approved by CMHS to ensure that project data are protected.


Each grantee has implemented an active consent procedure that informs the participants of the purpose of the evaluation, describes what their participation entails, and addresses how privacy is maintained as described above. Informed assent is obtained from participating older children and adolescents (aged 11–17 years). In addition, informed consent is obtained from adolescents who have reached the age of 18 at follow-up data collection. Written informed consent or assent is obtained from children and families at the point of entry into services. Each grantee has obtained local Institutional Review Board (IRB) approval for the informed consent or assent procedures used in this evaluation. Grantees are instructed to determine whether updates to consents are required at each data collection point, since the legal custody of a child may change, a child may become old enough to participate in a youth interview, a youth may become an emancipated minor or reach adult status, and local IRBs may have requirements for regular updates.


As in previous phases of the national evaluation, to further protect study participants for Phase VI, all grantees and the National Evaluator will obtain a Federal Certificate of Confidentiality, authorized by Section 301(d) of the Public Health Service Act. This certificate provides additional protection of the data from civil and criminal subpoena. Additionally, the National Evaluator will conform to all requirements of the Privacy Act of 1974, under the System of Records: Alcohol, Drug, and Mental Health Epidemiological, and Biometric Research Data, DHHS, #09–30–0036; the most recent publication in the Federal Register occurred on January 19, 1999 (64 FR 2914). Client records at the sites are also covered under this Privacy Act System of Records.


System of Care Assessment. Data collection for the System of Care Assessment will occur via face-to-face interviews. Because respondents’ identities will be known, an active informed consent process will be used to ensure that participants’ rights are protected. (See Attachments 3.D.1.–3.D.6. for informed consent forms.)


Services and Costs Study. The national evaluation trains all grant communities to include specific language in their consent and assent forms to describe the services and costs data that will be accessed through the child/youth’s records and shared with the national evaluation. Although grant communities may work with personal identifying information to extract and link electronic records, no personally identifying information will be included in any data transferred to the national evaluation for this study, other than the child/youth’s national evaluation child identification number.


For those communities electing to enter data in the Flex Funds Tool or the Services and Costs Data Tool, data in these applications are password protected to ensure privacy. When data are transferred to the national evaluation, data files will be encrypted to protect the information during electronic transfer. No child identifying information will be included in these data files other than the child/youth’s national evaluation child identification number.


Sustainability Study. Data collection for the Sustainability Survey (long and short versions) will occur using the Web-based Sustainability Survey. Because respondents’ identities will be known, an active informed consent process will be used to ensure that participants’ rights are protected. A letter mailed to potential participants will explain the survey, including the voluntary nature of participation, the privacy of responses, and respondents’ risks, benefits, and rights, and will advise recipients that they will be asked to indicate agreement to participate by checking a box on the Web survey before completing and returning it. Information about the study and participant rights will be presented in the Web survey before the consent check box. The letter and the Web survey will also provide contact information in case a recipient has questions or desires clarification prior to participation. If an individual does not have e-mail access, a packet will be sent by regular mail containing a cover letter, an informed consent form, a survey, and a return envelope; the cover letter will instruct the respondent to return both the informed consent form and the survey. (See Attachments 3.D.7., 3.D.8., 4.H.1.c., and 4.H.2.c.; see Attachment 4.H.2.f. for Web screen shots of the survey.)


Sector and Comparison Study. Caregiver informed consent and youth assent procedures for participants in the comparison study will follow those of the system of care participants described above. Caregivers of youth involved with the sector studies will provide consent for their children’s agency representative (e.g., teachers, child welfare case worker, or court representative) to complete the respective sector-specific instruments. The consent for completing these instruments will be included in the caregiver consent form for the Child and Family Outcome Study. (See Attachments 3.C.8-3.C.11 for informed consent forms.)


CQI Initiative Evaluation. Data collection for the CQI Initiative Evaluation will occur through two Web-based surveys (the Baseline Survey and the Monitoring Survey) and two focus groups. Because survey respondents’ identities will be known, an active informed consent process will be used to ensure that participants’ rights are protected. A letter mailed to potential participants will explain the survey, including the voluntary nature of participation, the privacy of responses, and respondents’ risks, benefits, and rights. The letter will also advise recipients that they will be asked to indicate agreement to participate by checking a box on the Web-based survey. Information about the study and participant rights will be presented in the Web-based survey before the consent check box. The letter and the Web survey will also provide contact information in case a recipient has questions or desires clarification prior to participation. If an individual does not have e-mail access, a packet will be sent by regular mail containing a cover letter, an informed consent form, a survey, and a return envelope; the cover letter will instruct the respondent to return both the informed consent form and the survey. Similar procedures will be followed for the focus groups: participants will be mailed a letter explaining the study and a consent form for focus group participation, and they will be asked to return the consent form via facsimile or ground mail.

(See Attachments 3.D.9.–3.D.11., 4.I.1.a.1., and 4.I.2.a.1.)


Alumni Networking Study. Data collection for the Alumni Networking Study will occur using the Web-based Alumni Networking and Collaboration Survey. Because survey respondents’ identities will be known, an active informed consent process will be used to ensure that participants’ rights are protected. A letter mailed to potential participants will explain the survey, including the voluntary nature of participation, the privacy of responses, and respondents’ risks, benefits, and rights, and will advise recipients that they will be asked to indicate agreement to participate by checking a box on the Web survey before completing and returning it. The letter and the Web survey will also provide contact information in case a recipient has questions or desires clarification prior to participation. If an individual does not have e-mail access, a packet will be sent by regular mail containing a cover letter, an informed consent form, a survey, and a return envelope. Contact information will be used to send incentives to respondents who complete the survey and to follow up with non-respondents. All contact information will be kept on a secure server and will be accessible only to key study personnel. The cover letter will instruct the respondent to return the informed consent form and the survey. (See Attachment 3.D.12.)


Data for the Alumni Network Web Site Satisfaction Survey will be collected from two groups of end users of the Web site. For non-registered users, pop-up window technology will be used to randomly select participants for the Satisfaction Survey. E-mail addresses will be collected during the survey process solely for the purpose of sending incentives to end users who complete the survey. Respondents must actively consent to complete the Satisfaction Survey and will be notified that e-mail addresses will be collected and used only to distribute survey incentives. Once incentives have been sent, the e-mail addresses will be deleted.


The identities of registered users of the Alumni Network Web site will be known, so an active consent process will be used to ensure that participants’ rights are protected. A letter e-mailed to potential participants will explain the survey, including the voluntary nature of participation, the privacy of responses, and respondents’ risks, benefits, and rights, and will advise recipients that they will be asked to indicate agreement to participate by checking a box on the Web survey before completing and returning it. The letter and the Web survey will also provide contact information in case a recipient has questions or desires clarification prior to participation. Contact information will be used to send incentives to respondents who complete the survey and to follow up with non-respondents. All contact information will be kept on a secure server and will be accessible only to key study personnel. (See Attachment 3.D.13.) The cover letter will instruct the respondent to check the box on the Web survey to indicate agreement to participate.



11. QUESTIONS OF A SENSITIVE NATURE


Because this project concerns services to children with serious emotional disturbance and their families, it is necessary to ask questions that are potentially sensitive. It should be noted, however, that only information central to the study is being sought. Questions address dimensions such as child emotions, behavior, social functioning, school performance, and involvement in unlawful activities. Questions are also asked about the child’s experience with sexual and physical abuse and suicidality. The answers to these questions will be used to determine baseline status and to measure changes in these areas experienced after entering the system of care. Questions about child abuse and suicidality have implications for local mandated reporting; grantees are instructed to consider these requirements and to train interviewers accordingly. Since each grantee must keep data on child and family status and service use, as well as treatment plans and other information, the data collection required for the national evaluation does not introduce new, sensitive domains of inquiry, but parallels standard procedures in the field of children’s mental health.


Although the inclusion of substance use data is sensitive in nature, it does not represent a new domain of inquiry. The frequent comorbidity of substance use and serious emotional disturbance among adolescent populations, and the increased ability to record dual diagnoses, are cited in the case management and mental health literatures. Because of the increased risk of substance use by children with mental illness, it is necessary for Phase VI system of care communities to collect data about substance use from the children to determine the prevalence of this comorbidity and to track changes in substance use after entering a system of care.


In addition to information on child clinical status and social functioning, other questions of a sensitive nature will be asked of families. These include questions related to family functioning, caregiver strain, and parental distress. These questions are included in response to growing evidence of the powerful role families play in shaping children’s use of services and their related outcomes. This is particularly important in systems of care, where a basic tenet is to involve families in treatment planning and service delivery. Moreover, family representatives who have consulted with the National Evaluator consistently identify a lack of information on family life as a weakness in previous studies.


Before collecting data, each grantee will obtain active consent from caregivers; child assent will also be obtained. In that process, respondents will be made aware that the information they provide will be strictly protected and that they can withdraw from participation at any time. Respondents may also freely decline to answer any questions they find objectionable.



12. ESTIMATES OF ANNUALIZED HOUR BURDEN


In accordance with the evaluation design, the descriptive, outcome, intervention, and service information collection for the 38 communities in Phase VI of the national evaluation will cover a period of 5 years beginning in October 2009 and ending in September 2014.


Table 1 shows the burden associated with the Phase VI evaluation of the 38 grantees. For measures that were previously cleared by OMB, burden estimates presented in Table 1 are based on information supplied by grantees in prior phases of the evaluation. Measures that were revised for Phase VI have been used in previous phases of the national evaluation, and average burden estimates are based on that experience. These measures include the Caregiver Information Questionnaire, Revised (CIQ–R); the Youth Information Questionnaire, Revised (YIQ–R); the Education Questionnaire, Revision 2 (EQ–R2); the Culturally Competent Services Provision, Revised (CCSP–R); the Multi-Sector Service Contacts, Revised (MSSC–R); the Youth Services Survey (YSS); and the Youth Services Survey for Families (YSS–F). Burdens for the surveys that will be used for the CQI Initiative Evaluation, the Sector and Comparison Studies, and the Alumni Networking Study were estimated from typical measures used for these purposes.


Table 1. Estimate of Respondent Burden

Note: Total burden is annualized over a 5-year period. Bracketed numbers refer to the numbered notes following the table.

Instrument | Respondent | Number of Respondents | Total Average Number of Responses per Respondent | Hours per Response | Total Burden Hours | 5-Year Average Annual Burden Hours | Hourly Wage Rate ($) | Average Annual Cost ($)

System of Care Assessment

Interview Guides A–S (17 guides; see list below) | Key site informants | 874 [1] | 3 | 1.00 | 2,622 | 524 | 19.23 [2] | 10,084

The burden row above covers the full set of System of Care Assessment interview guides: A. Core Agency Representative; B. Project Director; C. Family Representative/Representative of Family/Advocacy Organizations; D. Program Evaluator; E. Intake Worker; F. Care Coordinator; G. Direct Service Delivery Staff; H. Care Review Participant; I. Caregiver of Child or Youth Served by the Program; L. Direct Service Staff from Other Public Child-Serving Agencies; M. Care Record/Chart Review; N. Other Staff; O. Debriefing Document; P. Youth Respondent; Q. Youth Coordinator; R. Cultural and Linguistic Competence Coordinator; S. Social Marketing Communications Manager.

Child and Family Outcome Study

Caregiver Information Questionnaire, Revised—Intake (CIQ–RC–I / CIQ–RS–I) | Caregiver / Staff as Caregiver | 7,952 [3] | 1 | 0.37 | 2,916 | 583 | 9.93 [4] | 5,790
Caregiver Information Questionnaire, Revised—Follow-Up (CIQ–RC–F / CIQ–RS–F) | Caregiver / Staff as Caregiver | 7,952 | 4 [5] | 0.28 | 9,012 | 1,802 | 9.93 | 17,898
Caregiver Strain Questionnaire (CGSQ) | Caregiver | 7,952 | 5 | 0.17 | 6,640 | 1,328 | 9.93 | 13,187
Child Behavior Checklist (CBCL 1½–5 / CBCL 6–18) | Caregiver | 7,952 | 5 | 0.33 | 13,240 | 2,648 | 9.93 | 26,295
Education Questionnaire, Revision 2 (EQ–R2) | Caregiver | 7,952 | 5 | 0.33 | 13,240 | 2,648 | 9.93 | 26,295
Living Situations Questionnaire (LSQ) | Caregiver | 7,952 | 5 | 0.08 | 3,300 | 660 | 9.93 | 6,554
Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C) | Caregiver | 7,028 [6] | 5 | 0.17 | 5,868 | 1,174 | 9.93 | 11,655
Columbia Impairment Scale (CIS) | Caregiver | 7,670 [7] | 5 | 0.08 | 3,183 | 637 | 9.93 | 6,322
Parenting Stress Index (PSI) | Caregiver | 2,862 [8] | 5 | 0.08 | 1,220 | 244 | 9.93 | 2,423
Devereux Early Childhood Assessment (DECA 1–18M / DECA 18–36M / DECA 2–5Y) | Caregiver | 2,189 [9] | 5 | 0.08 | 912 | 182 | 9.93 | 1,811
Preschool Behavioral and Emotional Rating Scale (PreBERS) | Caregiver | 2,189 | 5 | 0.10 | 1,095 | 219 | 9.93 | 2,174
Delinquency Survey, Revised (DS–R) | Youth | 5,150 [10] | 5 | 0.13 | 3,433 | 687 | 7.25 [11] | 4,978
Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y) | Youth | 5,150 | 5 | 0.17 | 4,300 | 860 | 7.25 | 6,235
GAIN Quick–R: Substance Problem Scale (GAIN) | Youth | 5,150 | 5 | 0.08 | 2,137 | 427 | 7.25 | 3,099
Substance Use Survey, Revised (SUS–R) | Youth | 5,150 | 5 | 0.10 | 2,575 | 515 | 7.25 | 3,734
Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2) | Youth | 5,150 | 5 | 0.07 | 1,717 | 343 | 7.25 | 2,489
Reynolds Adolescent Depression Scale, Second Edition (RADS–2) | Youth | 5,150 | 5 | 0.05 | 1,288 | 258 | 7.25 | 1,867
Youth Information Questionnaire, Revised—Intake (YIQ–R–I) | Youth | 5,150 | 1 | 0.25 | 1,288 | 258 | 7.25 | 1,867
Youth Information Questionnaire, Revised—Follow-Up (YIQ–R–F) | Youth | 5,150 | 4 | 0.25 | 5,150 | 1,030 | 7.25 | 7,468

Service Experience Study

Multi-Sector Service Contacts, Revised—Intake (MSSC–RC–I / MSSC–RS–I) | Caregiver / Staff as Caregiver | 7,952 | 1 | 0.25 | 1,988 | 398 | 9.93 | 3,948
Multi-Sector Service Contacts, Revised—Follow-Up (MSSC–RC–F / MSSC–RS–F) | Caregiver / Staff as Caregiver | 7,952 | 4 | 0.25 | 7,952 | 1,590 | 9.93 | 15,793
Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R) | Caregiver | 7,952 | 4 [12] | 0.13 | 4,241 | 848 | 9.93 | 8,423
Youth Services Survey for Families (YSS–F) | Caregiver | 7,952 | 4 | 0.12 | 3,722 | 744 | 9.93 | 7,391
Youth Services Survey (YSS) | Youth | 5,150 | 4 | 0.08 | 1,710 | 342 | 7.25 | 2,479

Comparison and Sector Study: Juvenile Justice

Court Representative Questionnaire (CRQ) | Court representatives | 212 [13] | 5 | 0.50 | 530 | 106 | 26.44 [14] | 2,803
Electronic Data Transfer of Juvenile Justice Records | Key site personnel | 212 | 5 | 0.03 | 35 | 7 | 26.44 | 187

Comparison and Sector Study: Education

Teacher Questionnaire (TQ) | Teacher | 212 | 5 | 0.50 | 530 | 106 | 26.44 | 2,803
School Administrator Questionnaire (SAQ) | School administrators | 212 | 5 | 0.50 | 530 | 106 | 26.44 | 2,803
Electronic Data Transfer of Education Records | Key site personnel | 212 | 5 | 0.03 | 35 | 7 | 26.44 | 187

Comparison and Sector Study: Child Welfare

Child Welfare Sector Study Questionnaire—Intake (CWSQ–I) | Care coordinators | 212 | 1 | 0.50 | 106 | 21 | 26.44 | 561
Child Welfare Sector Study Questionnaire—Follow-Up (CWSQ–F) | Care coordinators | 212 | 4 | 0.50 | 424 | 85 | 26.44 | 2,242
Electronic Data Transfer of Child Welfare Records | Key site personnel | 212 | 5 | 0.03 | 35 | 7 | 26.44 | 187

Sustainability Study

Sustainability Survey: Brief Form | Project Director | 79 | 2 | 0.17 | 26 | 5 | 26.44 | 132
Sustainability Survey | Providers [15] | 162 | 2 | 0.75 | 243 | 49 | 26.44 | 1,296
Sustainability Survey | Caregiver [15] | 54 | 2 | 0.75 | 81 | 16 | 9.93 | 159

CQI Initiative Evaluation

CQI Baseline Survey, Web-Based | Key site personnel | 304 | 1 | 0.50 | 152 | 30 | 26.44 | 793
CQI Monitoring Survey, Web-Based | Key site personnel | 304 | 2 | 0.50 | 304 | 61 | 26.44 | 1,613
CQI Local Focus Group Guide | Key site personnel | 30 | 2 | 1.00 | 60 | 12 | 26.44 | 317
CQI National Focus Group Guide | National TA providers | 20 | 2 | 1.00 | 40 | 8 | 26.44 | 212

Alumni Networking Study

Networking and Collaboration Survey | Key site personnel | 308 | 2 | 0.50 | 308 | 62 | 26.44 | 1,639
Alumni Network Web Site Satisfaction Survey | Key site personnel, National TA providers, Branch staff | 518 | 2 | 0.25 | 259 | 52 | 26.44 | 1,375

Services and Costs Study

Flex Funds Data Dictionary/Tool | Local programming staff compiling/entering administrative data on children/youth | 1,909 [16] | 3 [17] | 0.03 | 189 | 38 | 24.04 [18] | 908
Services and Costs Data Dictionary/Data Entry Application | Local evaluator, staff at partner agencies, and programming staff compiling/entering service and cost records on children/youth | 7,952 | 100 [19] | 0.05 | 39,760 | 7,952 | 26.44 | 210,251

Summary of Annualized Burden Estimates for 5 Years

Respondent Group | Number of Distinct Respondents | Average Annual Number of Responses per Respondent | Total Annual Number of Responses | Average 5-Year Burden per Response (hours) | Total Annual Burden (hours) [20] | Total Annual Cost ($)
Caregivers | 7,952 | 0.9 | 82,395 | 2.2 | 15,722 | 156,117
Youth | 5,150 | 0.9 | 40,170 | 1.1 | 4,719 | 34,216
Providers/Administrators | 874 | 11.5 | 162,823 | 0.9 | 9,238 | 240,391
Total | 13,976 | | 285,387 | | 29,680 | 430,724

  1. An average of 23 stakeholders in up to 38 grant communities will complete the System of Care Assessment interview. These stakeholders will include site administrative staff, providers, agency representatives, family representatives, and youth.

  2. Assuming the average annual income across all types of staff/service providers/administrators/caregivers is $40,000, the wage rate was estimated using the following formula: $40,000 (annual income)/2080 (hours worked per year) = $19.23 (dollars per hour).

  3. Number of respondents across 38 grantees (7,634), in addition to 318 children/families from the comparison sample. Average based on a 5 percent attrition rate at each data collection point.

  4. Given that 56 percent of the families in the Phase V evaluation sample fall at or below the 2008–2009 DHHS National Poverty Level of $20,650 (based on a family of four), the wage rate was estimated using the following formula: $20,650 (annual family income)/2080 (hours worked per year) = $9.93 (dollars per hour).

  5. The number of responses per respondent is five over the course of the study (once every 6 months for 24 months, with one baseline/intake response and four follow-up responses).

  6. Approximate number of caregivers with children over age 5, based on Phase IV data submitted as of 12/08. Also includes 318 children/families from the comparison sample.

  7. Approximate number of caregivers with children 3 and older, based on Phase IV data submitted as of 12/08. Also includes 318 children/families from the comparison sample.

  8. Approximate number of caregivers with either: (1) children served at the roughly 10 early childhood-focused communities, for whom the instrument is required; or (2) children aged 0 to 12 at other communities, where the instrument is optional (we estimate that 1/3 of caregivers will be administered the instrument when it is optional). Estimates are based on Phase IV data submitted as of 12/08.

  9. Approximate number of caregivers with either: (1) children served at the roughly 10 early childhood-focused communities, for whom the instrument is required; or (2) children aged 0 to 5 at other communities, where the instrument is optional (we estimate that 1/3 of caregivers will be administered the instrument when it is optional). Estimates are based on Phase IV data submitted as of 12/08.

  10. Based on Phase IV finding that approximately 63 percent of the children in the evaluation were 11 years old or older. Also includes 318 children/families from the comparison sample.

  11. Based on the 2009 Federal minimum wage rate of $7.25 per hour.

  12. With the exception of the MSSC–R, respondents complete Service Experience Study measures only at follow-up points. See Note 3 for the explanation of the average number of responses per respondent.

  13. Approximate number of children/families in each sector, for the Sector and Comparison Study. This includes cases within the communities, as well as within the comparison sample.

  14. Assuming that the average annual income across all types of evaluators, agency staff, and administrative staff is $55,000, the wage rate was estimated using the following formula: $55,000 (annual income)/2080 (hours worked per year) = $26.44 per hour.

  15. For each community, 1 respondent will be a caregiver and 3 respondents will be administrators/providers.

  16. Assumes that each community will make flexible fund expenditures for approximately one quarter of the children/youth enrolled, on average.

  17. Assumes that an average of three expenditures will be made for each child/youth receiving flexible fund benefits.

  18. Assuming that the average annual income across all types of programming staff is $50,000, the wage rate was estimated using the following formula: $50,000 (annual income)/2080 (hours worked per year) = $24.04 per hour.

  19. Assumes that each child/youth in system of care communities and in the comparison sample will have 100 service episodes, on average.

  20. Total Annual Burden (hours) is the product of Number of Distinct Respondents × Average Annual Number of Responses per Respondent × Average 5-Year Burden per Response (hours).


As indicated in Table 1, the average total annual burden for data collection is estimated at 29,680 hours. This estimate is derived by calculating the burden for each measure, dividing those numbers by 5 (years of data collection in the national evaluation), and summing.
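As a minimal sketch of this calculation, the arithmetic behind two illustrative Table 1 rows is reproduced below (values copied from the table; rows whose hours-per-response figures are rounded repeating decimals will recompute slightly differently):

```python
# Burden arithmetic used in Table 1: respondents x responses per respondent x hours
# per response, then annualized over the 5-year data collection period.
rows = [
    # (instrument, respondents, responses per respondent, hours per response)
    ("YIQ-R-F", 5_150, 4, 0.25),
    ("CRQ", 212, 5, 0.50),
]
for name, n, responses, hours in rows:
    total_burden = n * responses * hours   # total burden hours over 5 years
    annual = total_burden / 5              # 5-year average annual burden hours
    print(f"{name}: total {total_burden:,.0f} h, annual {annual:,.0f} h")
# Summing the annualized values across all Table 1 rows yields the 29,680-hour estimate.
```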



13. ESTIMATES OF ANNUALIZED COST BURDEN TO RESPONDENTS


Grantees collect the majority of the required data elements as part of their normal operations, and maintain this information for their own service planning, quality improvement, and reporting purposes. The additional cost of this data collection is minimal. The costs for operation and maintenance of materials necessary for ongoing data collection are similarly minimal.


Other costs related to this effort, such as the cost of obtaining copyrighted instruments, are costs to the Federal Government. Each grantee has been funded, as part of the overall cooperative agreement award, to support two staff positions (or the full-time equivalent) to assist in the evaluation. Therefore, no cost burden is imposed on the grantee by this information collection effort.




14. ESTIMATES OF ANNUALIZED COST TO THE GOVERNMENT


CMHS has planned and allocated resources for the management, processing, and use of the collected information in a manner that shall enhance its utility to agencies and the public. Including the Federal contribution to local grantee evaluation efforts, the contract with the National Evaluator, and government staff to oversee the evaluation, the annualized cost to the government is estimated at $6,386,987. These costs are described below.


Each grantee is expected to hire two full-time equivalents to recruit families into the evaluation, collect information, manage and clean data, and conduct analyses at the local level. Assuming (1) an average annual salary of $55,000; (2) that 38 grantees will be funded; and (3) that the average Federal contribution (not including State matching funds) will be 73 percent, the annual cost for Phase VI at the grantee level is estimated at $3,051,400. These monies are included in the cooperative agreement awards.


The national evaluation contract has been awarded to Macro International Inc. for evaluation of the 38 grantees in Phase VI. The contract provides for 1 base year of $2,800,053, with an option to renew for 4 more years; the estimated average annual cost of the contract will be $3,238,087. Included in these costs are the expenses related to developing and monitoring the national evaluation, including, but not limited to, the following activities: developing the design, instrument package (including acquisition of copyrighted instruments), data manual, and training materials; monitoring and providing technical assistance to sites; traveling to sites and relevant meetings; conducting special studies; and analyzing and disseminating data. The cost of acquiring copyrighted instrumentation is projected to be $40,122.19 per year and is included in the total contract award.


It is estimated that CMHS will allocate 75 percent of a full-time equivalent each year for government oversight of the evaluation. Assuming an annual salary of $130,000, these government costs will be $97,500 per year.
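The three components above sum to the annualized total; a minimal sketch of the arithmetic (all figures are taken from this section):

```python
# Annualized cost to the government, reproduced from the figures in this section.
grantee_level = 2 * 55_000 * 38 * 0.73  # 2 FTEs x $55,000 x 38 grantees x 73% Federal share
contract = 3_238_087                    # estimated average annual cost of the evaluation contract
oversight = 0.75 * 130_000              # 0.75 FTE of government oversight at $130,000
total = grantee_level + contract + oversight
print(f"${grantee_level:,.0f}")         # $3,051,400
print(f"${oversight:,.0f}")             # $97,500
print(f"${total:,.0f}")                 # $6,386,987
```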



15. CHANGES IN BURDEN


This is a new project.



16. TIME SCHEDULE, PUBLICATION, AND ANALYSIS PLANS


a. Time Schedule


The time schedule for implementing the Phase VI evaluation is summarized in Table 2.


Table 2. Time Schedule


Activity | Date
Receive OMB clearance for study | May 2009
Begin data collection for 38 sites funded in FY 2008 and FY 2009 | October 2009
Complete data collection for 38 sites funded in FY 2008 and FY 2009 | September 2014
Process and analyze data | Ongoing
Produce annual reports | October 2009 and annually thereafter
Produce public use database | September 2014
Produce final report | September 2014


b. Publication Plans


Applications of the system of care model have increased in number and funding over the past several years. Thus, the publication of evaluation results will be of great interest at the Federal, State, and local levels, all of which have been involved in promoting the system of care model. As in the past, Reports to Congress will be prepared for CMHS annually. A final report will be prepared at the completion of the evaluation for CMHS and will be distributed widely beyond CMHS.


Because of the importance of this evaluation to the field of children’s mental health and the expansion of the system of care model, results of the national evaluation will be published in relevant professional journals to inform the research community as well as the decision making of policymakers and program administrators. At least 10 publications are planned. It is unlikely that any publications related to Phase VI will be submitted during the first year of funding, since sites will be involved in establishing their systems of care. For the remaining 4 years of the contract, a minimum of two publications will be submitted in each of years 2 and 3, and at least three publications in each of years 4 and 5. The national evaluation’s research questions will guide the topics, and the articles will report on findings from analyses of system of care assessments, child and family outcomes, and data from special studies. Possible publications include manuscripts reporting on the use of empowerment evaluation and continuous quality improvement in systems of care; implementing interagency systems of care; implementing evidence-based practice and practice-based evidence with community development teams in systems of care; outcomes of children, youth, and families referred to systems of care from schools, child welfare, and juvenile justice agencies; and other topics of interest to the children’s mental health field. Papers related to methodological issues could be prepared during the second and third years of funding. Additionally, specific publications, such as a special edition of a journal or monographs, may be developed to disseminate this unique information more broadly. Additional publications may include articles on the development of community-based systems of care, the effectiveness of system of care services for targeted groups, the cost effectiveness of treatment components, and the implications of system development approaches for sustainability, among other possibilities. All publications will be submitted in draft form to the Government Project Officer (GPO) and an expert panel designated by the GPO for review and approval prior to submission to the selected journal.


The cross-agency, interagency, collaborative perspective represented by the system of care model involves multiple audiences, including those involved in mental health, child welfare, juvenile justice, and education. Policymakers, program administrators, and researchers in each of these service sectors will be interested in the findings from this evaluation and will serve as the potential audience for publications. Examples of journals that will be considered as vehicles for publication include the following:


    • American Journal of Public Health

    • American Psychologist

    • Child Abuse and Neglect: The International Journal

    • Child and Adolescent Psychiatric Clinics of North America

    • Child Development

    • Child Maltreatment

    • Children and Youth Services Review

    • Children Today

    • Evaluation Review

    • Evaluation Quarterly

    • Journal of Autism and Developmental Disorders

    • Journal of Behavioral Health Services and Research

    • Journal of Child and Family Studies

    • Journal of Clinical Child and Adolescent Psychology

    • Journal of Consulting and Clinical Psychology

    • Journal of Emotional and Behavioral Disorders

    • Journal of Health and Social Behavior

    • Journal of Mental Health Administration

    • Journal of the American Academy of Child and Adolescent Psychology

    • Mental Health Services Research

    • Milbank Memorial Fund Quarterly

    • Psychiatric Services

    • Social Services Review


Besides audiences associated with specific service sectors, results of the project will be of interest to State legislators, who often make decisions about how to configure the service delivery system for children with serious emotional and behavioral disorders and who determine the matching funds required for this program. The National Conference of State Legislatures can help identify the best strategies for reaching this group with evaluation findings.


c. Data Analysis Plan


All of the data collection and analytic strategies detailed in this package are linked to the evaluation questions. These linkages are shown in Table 3. Note that the majority of these data are collected at intake and at each subsequent data collection point. Exceptions include: (1) descriptive data elements that are not expected to change over time (e.g., gender, race), which are asked only at intake; (2) service and cost data from grantee MISs, which will be collected on an ongoing basis; (3) System of Care Assessment data, which will be collected every 18–24 months; (4) sustainability data, which will be collected in years 2, 3, and 5 of the evaluation; (5) CQI Initiative Evaluation data, which will be collected in years 1, 3, and 5 of the evaluation in the 18 FY 2008 sites, and in years 2 and 4 in the 20 FY 2009 sites; and (6) Alumni Networking Study data, which will be collected in years 2 and 4 of the evaluation. The reliability and validity of selected measures will be assessed as soon as sufficient data are obtained in the early stages of the study. These analyses will include, but are not limited to, calculation of Cronbach’s coefficient alpha to determine the internal consistency of ordinal-level and interval-level measures, calculation of the Kuder-Richardson formula 20 to determine the internal consistency of dichotomous measures, and confirmatory factor analysis to determine the latent variable structure and content of multi-component scales.
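As an illustrative sketch of the internal consistency step, the alpha computation can be expressed as follows (the item responses below are simulated, not evaluation data; the Kuder-Richardson formula 20 is the same statistic applied to dichotomous items):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's coefficient alpha for an (n_respondents x n_items) score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()  # sum of individual item variances
    total_variance = items.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Simulated 5-item ordinal scale (scored 1-5) for 200 respondents.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(200, 1))  # shared "trait" level per respondent
items = np.clip(base + rng.integers(-1, 2, size=(200, 5)), 1, 5).astype(float)
print(round(cronbach_alpha(items), 2))
```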

Table 3. Evaluation Questions, Indicators, Data Sources, and Analysis Techniques


Evaluation Questions

Indicators

Data Sources

Data Analysis

System of Care Assessment

Does the system maximize interagency collaboration?

  • Core agencies participate in a collaborative way

  • Integration of staff, resources, functions, and funds

  • Co-location of services of multiple agencies

  • Interagency service planning

  • Shared vision and goals

  • Formal relationships established between agencies

  • Site Visit

Univariate/Multivariate Analysis

Are the various service components of the system coordinated?

  • Co-location of services of multiple agencies

  • Availability of case management/care coordination services

  • Case manager/care coordinator has broad responsibilities and active referral role

  • Integration and consistency in case management/care coordination across systems/agencies

  • Site Visit

Univariate/Multivariate Analysis

Are services and the system accessible?

  • Proportion of eligible population provided services

  • Time between identification of need and entry to system

  • Waiting lists for entry to system

  • Waiting lists for delivery of key services

  • Active outreach

  • Logistics and supports that encourage access

  • Site Visit

Univariate Analysis

Is the service array comprehensive?

  • Availability of broad array of residential, intermediate, outpatient, and wraparound services

  • Site Visit

  • MIS

Univariate Analysis

Are services and the system culturally competent?

  • Cultural diversity of the child and family population

  • Cultural diversity of provider population

  • Agency commitment to cultural competency

  • Equitable treatment of all children and families

  • Adherence to national standards of cultural competence

  • Site Visit

  • CCSP–R

  • YSS, YSS–F

Univariate Analysis

Are services and the system family-driven?

  • System and services involve caregivers in developing individual child and family service plans

  • System and services involve caregivers in overall system of care planning activities

  • System and services involve caregivers in service delivery

  • System and services address needs of caregivers and families for support

  • Site Visit

  • YSS, YSS–F

  • CIQ–R

Univariate/ Multivariate Analysis

Are services individualized and youth-guided?

  • Active individualized service planning process

  • Frequency of monitoring of ISP by case manager

  • System and services involve youth in developing his or her own service plan

  • System and services involve youth in overall system of care planning activities

  • System and services involve youth in his or her own service delivery

  • System and services address needs of youth for support

  • Site Visit

  • YSS, YSS-F

  • YIQ-R

Univariate/Multivariate Analysis

Are services community-based?

  • Availability of services within the community

  • Extent of reliance on out-of-county and out-of-State placements

  • Site Visit

  • MIS

Univariate/Multivariate Analysis

Do systems mature over time?

  • Development of infrastructure

  • Development of service delivery capacity

  • Site Visit

Multivariate Analysis

Are services provided in the least restrictive setting that is appropriate?

  • Processes to ensure that children step down to lower levels of care when appropriate

  • Extent of use of intermediate and outpatient placements

  • Extent of use of wraparound services

  • Stability and duration of placements

  • Level of use of mental health services in normative settings (e.g., home, school)

  • Site Visit

  • MIS

  • LSQ

Univariate/Multivariate Analysis

Cross-Sectional Descriptive Study

What are children and families like?

  • Gender

  • Race

  • Age

  • Foster care placement

  • Presenting problem(s)

  • Diagnosis at intake

  • Intake and referral source

  • Case status

  • EDIF

  • CIQ-R

Univariate/Bivariate Analysis

Child and Family Outcome Study

Are there differences between the children and families served in the systems who do and do not choose to participate in the Child and Family Outcome Study?

  • Gender

  • Race

  • Age

  • Educational level and placement

  • Socioeconomic status

  • Parents’ employment status

  • Living arrangement

  • Presenting problem(s)

  • Diagnosis at intake

  • Intake/referral source

  • Risk factors for family and child

  • Case status

  • EDIF

  • CIQ–R

Univariate/Bivariate Analysis

Has there been a reduction in children’s negative behaviors?

  • Number of problem behaviors

  • CBCL 1½–5

  • CBCL 6–18

  • CIS

  • DECA

Univariate/Multivariate Analysis

Has there been an increase in the level of the child’s overall functioning?

  • Child’s ability to accomplish activities of daily living

  • Child’s strengths

  • Quality of family relationships

  • Quality of peer relationships

  • CBCL 1½–5

  • CBCL 6–18

  • BERS–2C

  • BERS–2Y

  • PreBERS

  • CIS

Univariate/Multivariate Analysis

Has there been improvement in child functioning in the educational environment?

  • School attendance

  • Expulsions, dropouts, suspensions

  • Academic performance

  • BERS–2C

  • BERS–2Y

  • EQ–R2

Univariate/Multivariate Analysis

Has there been improvement in children’s involvement with law enforcement?

  • Violations

  • Number of contacts with law enforcement

  • Number of incarcerations

  • DS-R

Univariate/Multivariate Analysis

Do families experience improvements in family life?

  • Family functioning

  • Parenting stress

  • Caregiver strain (burden of care)

  • PSI

  • CGSQ

  • CIQ–R

Univariate/Multivariate Analysis

Are there differences in family outcomes across systems of care?

  • Family functioning

  • Caregiver strain (burden of care)

  • Material resources

  • PSI

  • CGSQ

  • CIQ–R

Univariate/Multivariate Analysis

Service Experience Study

How do children and families experience services?

  • Ratings of specific services

  • Ratings of the overall system

  • Provider attitudes and practices

  • YSS

  • YSS–F

  • CCSP–R

Univariate/Multivariate Analysis

Are there differences in service experiences across systems of care? Are differences, if any, associated with differential outcomes?

  • Comparison of ratings of specific services

  • Comparison of ratings of the overall system

  • Comparison of provider attitudes and practices

  • Relationship to child outcomes

  • YSS

  • YSS–F

  • CCSP–R

  • CBCL 1½–5

  • CBCL 6–18

  • CIS

Univariate/Multivariate Analysis

Sustainability Study

To what extent are systems of care able to sustain themselves after Federal funding has ended? What factors facilitate or impede sustainability?

  • System of care characteristics

  • Factors related to sustainability

  • Success of sites to be sustainable post-funding

  • Sustainability Survey

Univariate/ Multivariate Analysis

Services and Costs Study

What services do children and families receive and what are their service utilization patterns?

  • Previous service history

  • Service setting and type

  • Level of restrictiveness

  • Mix of services

  • Amount and duration

  • Continuity of care

  • MIS

  • LSQ

Univariate/Multivariate Analysis

How do service use patterns relate to child behavioral and functional outcomes?

  • Comparison of service use for children who enter the system at varying levels of challenge

  • Comparison of change in outcomes over time for children in different utilization pattern groups

  • MIS

  • MSSC–R–I

  • MSSC–R–F

  • EDIF

  • CIQ–R

  • YIQ–R

  • CBCL 1½–5

  • CBCL 6–18

  • CIS

  • GAIN

  • SUS–R

  • DS–R

  • RADS–2

  • RCMAS–2

  • BERS–2C

  • BERS–2Y

  • PreBERS

  • DECA

  • PSI

  • LSQ


  • EQ–R2

  • TQ

  • SAQ

  • CRQ

  • CWSQ–I

  • CWSQ–F

Univariate/Multivariate Analysis

How do service use patterns differ across subgroups within a site? Across system of care sites?

  • Comparisons of types of services used

  • Comparisons of level of restrictiveness

  • Comparisons of service mix

  • Comparison of amount and duration

  • Comparison of continuity of care

  • MIS

  • LSQ

  • MSSC–R–I

  • MSSC–R–F

  • EDIF

  • CIQ–R

  • YIQ–R

Univariate/Multivariate Analysis

What costs are associated with services at the aggregate and child/family levels?

  • Total costs of services for individual children and families

  • Average costs per child/family

  • Average cost per service type

  • MIS

  • LSQ

  • MSSC–R–I

  • MSSC–R–F

Univariate/Bivariate Analysis

Sector and Comparison Study

Education Sector

Do educational outcomes of school-aged children in systems of care improve over time?


Do educational and clinical outcomes of school-aged children in systems of care improve more compared to non-system of care children?


Are children in systems of care more likely to receive appropriate educational supports compared to non-system of care children?


How does teacher involvement, supports, and training in system of care communities differ from those of teachers in non-system of care communities (or schools that are not part of the system of care)?


What individual level services are available in schools in system of care communities?

What school level interventions are available in schools in system of care communities?

What are the types of mental health service delivery systems in schools in system of care communities?

  • Attendance

  • Performance

  • Delinquent behavior

  • Grade repetition

  • School mobility

  • Disciplinary actions

  • Receipt of special education and supports

  • Teacher’s supports and training

  • EQ–R2

  • TQ

  • SAQ

  • School records data

Univariate/Multivariate Analysis

Juvenile Justice Sector

Do juvenile justice outcomes of juvenile justice-involved children in systems of care improve over time?


Do juvenile justice and clinical outcomes of juvenile justice-involved children in systems of care improve more compared to non-system of care juvenile justice-involved children?


Are juvenile justice-involved children in systems of care more likely to receive appropriate juvenile justice supports compared to non-system of care juvenile justice-involved children?


How does court/juvenile justice representative involvement, supports, and training in system of care communities differ from those of court/juvenile justice personnel in non-system of care communities (or in juvenile justice systems that are not part of the system of care)?

  • Arrests

  • Adjudication process

  • Placements

  • Criminal activity

  • Substance use

  • Interaction with mental health providers

  • DS–R

  • SUS–R

  • GAIN

  • CRQ

  • Juvenile justice records data

Univariate/Multivariate Analysis

Child Welfare Sector

Do the child welfare outcomes of children involved in child welfare and systems of care improve over time?


Do the child welfare and clinical outcomes of children involved in child welfare and systems of care improve more than the child welfare and mental health outcomes of non-system of care children involved in child welfare?


Are child welfare-involved children in systems of care more likely to receive appropriate services compared to non-system of care children in child welfare?


How does child welfare staff involvement, supports, and training in system of care communities differ from those of child welfare staff in non-system of care communities (or in child welfare systems that are not part of the system of care)?


What factors influence referrals of children involved in child welfare to systems of care in their communities?


Are systems of care providing mental health assessments for children in child welfare even if they are not ultimately determined to be in need of, or eligible for, system of care services?

  • MH services provided

  • Maintenance in home

  • Out-of-home placement

  • Risk factors for child

  • Trauma symptoms

  • CWS–EDIFA

  • CWSQ–I

  • CWSQ–F

Univariate/Multivariate Analysis

CQI Initiative Evaluation

How are communities pursuing CQI? How well does the CQI Initiative identify and address communities’ TA needs? How effective is the CQI Initiative in providing appropriate data-driven TA to communities?

  • Utilization of CQI Progress Report

  • Development of CQI infrastructure

  • Utilization and effectiveness of technical assistance

  • Key constituent involvement in implementing CQI Initiative

  • Extent to which the Initiative was implemented according to plans

  • Satisfaction with implementation

  • Baseline Survey

  • Monitoring Survey

  • Local Focus Group

  • National TA Provider Focus Group

Descriptive Statistics

Univariate / Bivariate Analysis

Content/Thematic Analysis

Alumni Networking Study

To what extent do currently and formerly funded system of care communities collaborate? What is the nature and level of collaboration among system of care communities and program partners on program and evaluation technical assistance? What activities and features of the Alumni Network Web site facilitate and/or hinder collaboration among system of care communities?

  • Frequency and levels of collaboration on issues of governance, individualized care, funding, family-driven care, youth-guided care, culturally competent care, sustainability, and evaluation

  • Frequency and levels of collaboration on program and evaluation technical assistance

  • What Web site factors facilitate or hinder collaboration

  • Networking and Collaboration Survey

Descriptive and Social Network Analysis

How satisfied are the users from the currently funded and formerly funded communities with the design, format, and content of the Alumni Network Web site?

  • Satisfaction with design of the Web site

  • Satisfaction with format of the Web site

  • Satisfaction with content of the Web site

  • Satisfaction with utility of Web site for its intended purpose

  • Alumni Network Web Site Satisfaction Survey

Descriptive Statistics

Univariate / Bivariate Analysis



Analyses planned for each of the study components are described below. These analyses will be possible for grantee sites that are able to implement the evaluation as designed, including collection of cross-sectional descriptive data on the census of children and families who enter the system, the proper recruitment of an adequately sized sample, minimal missing data within and across data collection points, retention of families over time, and adherence to prescribed data collection procedures. In sites with constraints (e.g., insufficient size of target population), analyses will be tailored to meet the needs of the individual site. The sample table shells presented in Attachment 5 provide examples of how data can be summarized.


The data analysis serves an overall goal of understanding the effects of the system of care approach, with a plan focused on description, explanation, and prediction. The data analyzed in Phase VI will include both discrete and continuous variables, and the scales on which these variables are measured have important implications for the choice of statistical procedures. Some of the variables used in this evaluation are nominal (e.g., race and ethnicity) or ordinal (e.g., services ranked in order of restrictiveness); these measurement scales require nonparametric statistics. Nonparametric statistics offer less power than parametric tests, which carry more restrictive distributional assumptions but are often robust to violations of normality. For this reason, research questions measured with ordered discrete variables that approach a continuous scale (such as ratings of system and service performance) will be tested using parametric statistics.


System of Care Assessment. In this evaluation component, Phase VI seeks to determine whether a system of care has been implemented in accordance with the system of care program model and to document the maturation of the system over time. This study component includes both qualitative and quantitative analyses and both are based on a standard framework. Qualitative analyses will be used to describe the infrastructure and the direct service delivery processes of system of care communities. The standard framework ensures that all system of care communities will be characterized on similar system operations (e.g., management, client entry into the system of care, service planning and coordination processes) but the qualitative approach provides for the individual and unique features of each system of care community to be portrayed.


Qualitative data obtained through individual interviews at each system of care community and from document reviews will be synthesized into a site-specific narrative report that will be returned to each system of care community for review and correction. When the reports for each community are finalized after site comment, they will be entered into a qualitative database software program (Atlas.ti) that will allow meta-analyses across system of care communities and across time.


The quantitative analyses will be based on scores given to each system of care community that measure the extent to which it has achieved the program model’s overarching principles (e.g., youth-guided, individualized and family-driven care, cultural and linguistic competence, service coordination) within the system operations described in the qualitative analysis and from quantitative interview questions (e.g., percentage of children who receive an individualized service plan, number of child-serving agencies that attend governing body meetings). This approach allows systems of care to be assessed across principles (e.g., how well system operations incorporate a family-driven approach) and across operations (e.g., how well does the overall management of the system of care reflect the principles as a whole). The relationship among service and system experiences, child and family characteristics, and outcomes over time will be explored using correlational, regression, and path analyses.


Child and Family Outcome Study. For this evaluation component, data collected at intake will be analyzed to describe the sample in terms of intake demographic characteristics, symptomatology (i.e., Child Behavior Checklist [CBCL] scores), functional impairment (i.e., Columbia Impairment Scale [CIS] scores), social functioning (i.e., peer relations, Delinquency Survey, Revised [DS–R], and Substance Use Survey, Revised [SUS–R] scores), and stability of living arrangements (i.e., Living Situations Questionnaire [LSQ]). Families will be described in terms of their intake demographic features, functioning (i.e., Caregiver Information Questionnaire, Revised [CIQ–R], Parenting Stress Index [PSI] scores), and level of caregiver strain (i.e., Caregiver Strain Questionnaire [CGSQ] scores). Univariate descriptive analyses will be performed to characterize the families participating in this evaluation, including score ranges, means, and medians. These analyses will be reported for each system of care community as well as for all grantees combined.


Change in child and family outcomes over time will be tested using a variety of techniques. Repeated measures analysis of variance (ANOVA) will be used to test the significance of change over time within and between groups at each site. Repeated measures analysis of covariance (ANCOVA) will be conducted using the system of care development scores from the System of Care Assessment as a covariate. ANCOVA controls for differences present at intake, which is prudent, even when those differences are not statistically significant.
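
For illustration, the following is a minimal sketch of the repeated measures test in Python, assuming a long-format data set with hypothetical variable names (child, wave, cbcl) and synthetic scores; it shows the form of the within-subjects analysis rather than the National Evaluator’s actual code.

    # Minimal repeated measures ANOVA sketch: one row per child per wave.
    # Variable names and data are hypothetical.
    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(0)
    waves = [0, 6, 12, 18, 24]
    data = pd.DataFrame(
        [(c, w, 70 - 0.3 * w + rng.normal(0, 5)) for c in range(180) for w in waves],
        columns=["child", "wave", "cbcl"],
    )

    # Tests whether mean symptom scores change across waves within children.
    print(AnovaRM(data, depvar="cbcl", subject="child", within=["wave"]).fit())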


Following all children recruited into the Child and Family Outcome Study in the Phase VI evaluation for 24 months enhances our ability to use hierarchical linear modeling (HLM). HLM provides improvement in estimating individual effects, an opportunity to model cross-level effects (i.e., individuals within systems, over time), and greater precision in partitioning components of effects across multiple levels. The following provides an illustration of how HLM will be used in the evaluation. The children and families in the longitudinal study are located (or “nested”) within systems of care. We assume that children experience an intervention and that, as a result of that intervention, they experience change. We know from the evaluation of the 22 grant communities originally funded in 1993 and 1994 that systems of care vary in terms of their overall development (Brannan et al., 2002; Vinson et al., 2001). We expect that differential system development (approximated with system-level assessment scores) will mediate child and family outcomes. HLM allows us to estimate growth curves (e.g., changes in the level of symptomatology) based on repeated observations. These repeated measures are “nested” within the individual child. Using this three-level design, HLM permits us to estimate how much of the variance found in the first level (e.g., changes in symptoms) is due to the second (e.g., individual receiving treatment), and how much of the variance can be attributed to the third level (e.g., the degree of system of care development).
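
For illustration, the following sketch shows one way such a multilevel growth model could be approximated in Python with statsmodels: a site-level random intercept and slope plus a variance component for children nested within sites, with a cross-level interaction testing whether system development moderates change. The data are synthetic, and all variable names (site, child, month, soc_score, symptoms) are hypothetical; a full three-level HLM would typically be fit with dedicated multilevel software.

    # Sketch of a growth model with repeated measures (level 1) nested in
    # children (level 2) nested in sites (level 3). Synthetic data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    rows = []
    for site in range(10):
        soc = rng.uniform(1, 5)          # site-level system development score
        site_u = rng.normal(0, 3)        # site random intercept
        for child in range(30):
            child_u = rng.normal(0, 4)   # child random intercept
            for month in (0, 6, 12, 18, 24):
                y = 70 + site_u + child_u - (0.2 + 0.05 * soc) * month + rng.normal(0, 3)
                rows.append((site, f"{site}-{child}", month, soc, y))
    data = pd.DataFrame(rows, columns=["site", "child", "month", "soc_score", "symptoms"])

    # month * soc_score is the cross-level interaction; the variance
    # component gives children random intercepts within sites.
    model = smf.mixedlm(
        "symptoms ~ month * soc_score",
        data,
        groups="site",
        re_formula="~month",
        vc_formula={"child": "0 + C(child)"},
    )
    print(model.fit().summary())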


The GLM repeated measures analysis allows the National Evaluator to test whether changes over time are significant and whether some groups experience more improvement than others. Within a community, these techniques will be used to explore whether certain service utilization patterns yield better outcomes. Path analysis and other structural equation modeling techniques will be used to investigate the direct and indirect effects of causal variables (such as ratings of system performance and adherence to service plans) on dependent outcome measures (such as clinical assessments, restrictiveness of care, and family functioning). The National Evaluator does not view the use of path analysis as a method of causal discovery, but rather as a method of confirming appropriate models derived from empirical and theoretical considerations.


Service Experience Study. In this component of the Phase VI evaluation, analysis will assess the extent to which children and families receive services as they were intended; that is, consistent with the system of care program model. As with data from the Services and Costs Study, the distribution of self-reported service use across the client population will be described (i.e., Multi-Sector Service Contacts, Revised—Intake and Multi-Sector Service Contacts, Revised—Follow-Up [MSSC–R–I and MSSC–R–F]). Service utilization patterns also will be described. HLM or ANOVA will be performed to examine: (1) change in service utilization patterns of children and their families; (2) whether groups of children in the system of care communities who receive an evidence-based treatment differ from those who do not in terms of client satisfaction, as measured by the abbreviated satisfaction questionnaires (i.e., Youth Services Survey and Youth Services Survey for Families [YSS, YSS–F]), and ratings of the cultural competence of services, as measured by the Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R); (3) whether children and families stay in services longer on average in communities with higher average service and system of care ratings; and (4) whether, within communities, caregivers of children who received fewer services in the previous 6 months (as measured by the MSSC–R–I and MSSC–R–F) also reported being less satisfied or rated their services and systems lower.


The design of this study allows for the analysis of trends in outcomes over multiple data collection waves, as well as the analysis of differential rates of improvement between children in systems of care and comparison samples. Given the quasi-experimental design, the treatment and comparison groups may not be comparable at baseline. Because treatment is not assigned randomly, the effect of treatment potentially will be confounded with the effect of other factors associated with assignment to treatment. We will employ propensity score matching to account for possible baseline differences between children in system of care and comparison groups. Once group membership is modeled, subsequent analyses would incorporate the propensity score to adjust results on the basis of group differences at baseline. Repeated measures ANOVA with treatment group as a between-subjects factor and time as a within-subjects factor will be used to examine differences in continuous outcomes over time. Generalized estimating equations will be used in the analysis of dichotomous outcomes. Multivariate regression modeling across multiple time points will allow characterization of effects in terms of persistence over time and identification of both system-level and specific services factors that maintain short- and long-term positive outcomes. In addition, the appropriateness of multilevel modeling will be explored as a potential approach for linking site-level characteristics to changes in outcomes over time.
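
As a rough illustration of how a propensity score could be estimated and then carried into a generalized estimating equation for a dichotomous outcome, consider the sketch below. The covariates, the in_school outcome, and all data are hypothetical; the propensity score is entered as a covariate (matching and weighting are alternatives), and an exchangeable working correlation is used for simplicity.

    # Sketch: propensity score for system of care vs. comparison membership,
    # then a GEE for a repeated binary outcome adjusting for that score.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 230
    baseline = pd.DataFrame({
        "child": range(n),
        "treated": rng.integers(0, 2, n),
        "age": rng.integers(5, 18, n),
        "cbcl_intake": rng.normal(65, 8, n),
    })

    # Step 1: model group membership from baseline covariates.
    ps_model = smf.logit("treated ~ age + cbcl_intake", baseline).fit(disp=False)
    baseline["pscore"] = ps_model.predict(baseline)

    # Step 2: expand to five waves per child and simulate a binary outcome.
    long = baseline.loc[baseline.index.repeat(5)].copy()
    long["wave"] = np.tile([0, 6, 12, 18, 24], n)
    p = 1 / (1 + np.exp(-(-0.5 + 0.4 * long["treated"] + 0.01 * long["wave"])))
    long["in_school"] = rng.binomial(1, p)

    # Step 3: GEE with the propensity score as an adjustment covariate.
    gee = smf.gee(
        "in_school ~ treated * wave + pscore",
        groups="child",
        data=long,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    print(gee.fit().summary())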


Services and Costs Study. For this component, analyses will focus primarily on utilization patterns (e.g., types, combination, amount, and costs of services used) and the factors that influence use. Analyses will be conducted at the aggregate and individual child and family levels. At the aggregate level, the distribution of service use and costs across the client population will be described. At the individual child and family level, service utilization patterns will be described (e.g., distribution of children using various combinations of services, mean and median amounts of services used).


Latent class analysis and other case-grouping techniques will be used to group children who experience similar utilization patterns, based on combinations and amount of services. The longitudinal outcomes of children in various service utilization groups will be compared to see if some utilization patterns are associated with greater gains and, if so, for which groups of children.
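
A minimal sketch of the case-grouping idea follows, using a Gaussian mixture with BIC-based selection of the number of classes as a general-purpose stand-in; formal latent class analysis of categorical service indicators would typically use specialized software, and the utilization features below are synthetic.

    # Group children by service utilization profiles and let BIC choose the
    # number of classes. Feature columns (counts of outpatient, case
    # management, residential, and flexible-fund services) are hypothetical.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)
    X = np.vstack([
        rng.poisson([12, 6, 0, 2], size=(100, 4)),  # mostly outpatient users
        rng.poisson([2, 1, 8, 1], size=(40, 4)),    # residential-heavy users
    ]).astype(float)

    models = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
              for k in range(1, 6)}
    best_k = min(models, key=lambda k: models[k].bic(X))
    labels = models[best_k].predict(X)
    print(f"{best_k} utilization classes; sizes:", np.bincount(labels))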


Trend analysis will be used to analyze change in costs over time. Multivariate techniques that adjust for skewed distributions of cost data will be employed to predict costs while controlling for variation in baseline characteristics. We also will describe the allocation of service costs across children and different service categories, and we will model costs as a function of child and family characteristics. Given that utilization and cost data are often characterized by high skewness and/or a large proportion of zero outcomes, we propose using specialized statistical techniques (e.g., two-part models, logarithmic transformations, zero-inflated Poisson models) in analyzing utilization and cost study data. For cost-effectiveness analysis, we will use bootstrapping methods to account for uncertainty.
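
The two-part approach and the bootstrap can be sketched as follows: a logistic model for whether any cost was incurred, an ordinary least squares model of log costs among users (retransformed with the lognormal correction), and a simple nonparametric bootstrap of the mean. All variable names and data are hypothetical.

    # Two-part cost model sketch: P(cost > 0) from a logit, E[cost | cost > 0]
    # from OLS on log(cost), combined as P * E. Synthetic data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n = 500
    d = pd.DataFrame({"severity": rng.normal(0, 1, n)})
    d["any_cost"] = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.8 * d["severity"]))))
    d["cost"] = d["any_cost"] * np.exp(7 + 0.5 * d["severity"] + rng.normal(0, 0.6, n))

    part1 = smf.logit("any_cost ~ severity", d).fit(disp=False)
    part2 = smf.ols("np.log(cost) ~ severity", d[d["cost"] > 0]).fit()

    # Lognormal retransformation: E[cost | cost > 0] = exp(Xb + sigma^2 / 2).
    expected = part1.predict(d) * np.exp(part2.predict(d) + part2.mse_resid / 2)
    print("Mean expected cost:", round(expected.mean(), 2))

    # Nonparametric bootstrap of the mean observed cost.
    boots = [d["cost"].sample(n, replace=True).mean() for _ in range(1000)]
    print("95% CI:", np.percentile(boots, [2.5, 97.5]).round(2))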


Sustainability Study. For the Sustainability Survey, the analysis plan includes both quantitative and qualitative components, and Web survey data will be aggregated and analyzed both ways. Quantitative data on factors related to sustainability will be examined for reliability and compared to system characteristics; to examine these factors in relation to system development, survey responses about system features will be compared to responses about factors contributing to sustainability. In addition, survey data will be combined with data from final System of Care Assessment site visits, including assessment scores from those visits, to create a more robust picture of the status and process of sustainability in each community. Quantitative data about system features and factors affecting sustainability will be tallied for each site and across all sites, yielding cross-site information on the extent to which specific system of care features are in place during various stages of funding, the positive and negative factors affecting sustainability, and the effectiveness of strategies implemented to sustain systems of care. Quantitative ratings will be assigned to each site across the assessment areas and ranked according to their importance; where appropriate, quantitative comparisons of these features will be made across sites.


CQI Initiative Evaluation. For the CQI Initiative Evaluation, the analysis plan includes both quantitative and qualitative components. Analyses for the survey data will include: content/thematic analysis of open-ended questions; and descriptive, univariate, and bivariate statistical analyses of quantitative data. Focus group data will be analyzed primarily using qualitative methods, such as content/thematic analysis. Data from the surveys and focus groups will be used to assess the development of the CQI process within communities, gauge the effectiveness of the CQI Initiative in providing appropriate technical assistance (TA) to communities, and inform ongoing TA provision.


Alumni Networking Study. The Networking and Collaboration Survey has been developed using standardized social network analysis methods (Wasserman and Faust, 1995) that have been applied in health services research to describe and evaluate collaboration among health and mental health service organizations (Valente, 1995; Morrissey, 1999). The body of the survey asks respondents to select, from a listing of all system of care communities, those with which they interact as a result of the Alumni Network Web site on topics such as governance and decision-making. Using specialized social network analysis software such as UCINET (Borgatti, Everett, & Freeman, 1999), the frequency and types of linkages between communities, network characteristics such as centrality and clustering of the most highly interacting players, and gaps in linkages will be examined. Social network analysis will help identify levels of inter-organizational communication, clusters of activity, and system of care communities integral to collaboration, as well as change in interaction over time. This study component characterizes the Network in terms of the level of collaboration occurring and the influence of particular system of care communities as a result of the Alumni Network Web site. Analyses of the Alumni Network Web Site Satisfaction Survey data will include content/thematic analysis of open-ended questions and descriptive, univariate, and bivariate statistical analyses of quantitative data.
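
As an illustration of these network measures, the sketch below uses the open-source networkx library in place of UCINET to compute centrality, clustering, and linkage gaps from survey-reported ties; the community names and ties are hypothetical.

    # Directed graph of who reports collaborating with whom, with centrality,
    # clustering, and gap measures. Names and ties are hypothetical.
    import networkx as nx

    ties = [("SiteA", "SiteB"), ("SiteA", "SiteC"), ("SiteB", "SiteC"),
            ("SiteD", "SiteA"), ("SiteE", "SiteB")]
    G = nx.DiGraph(ties)

    print("In-degree centrality:", nx.in_degree_centrality(G))
    print("Betweenness:", nx.betweenness_centrality(G))
    # Clustering on the undirected projection flags tightly knit groups.
    print("Clustering:", nx.clustering(G.to_undirected()))
    # Communities never named as a partner indicate gaps in linkages.
    print("Never named:", [c for c, deg in G.in_degree() if deg == 0])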

17. DISPLAY OF EXPIRATION DATE


All data collection instruments will display the expiration date of OMB approval.



18. EXCEPTIONS TO THE CERTIFICATION STATEMENT


This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.

B. STATISTICAL METHODS



1. RESPONDENT UNIVERSE AND SAMPLING METHODS


System of Care Assessment. Respondents for the System of Care Assessment will be selected based on their affiliation with the system of care community and must serve in specific roles. To determine the respondents, the National Evaluator will send a site informant list to each community 8 weeks prior to its site visit. The site informant list identifies categories of respondents who offer a variety of perspectives about each community’s system of care. The document outlines the specific positions and roles, specialized functions, number of interviewees, and estimated interview time for each respondent category. The system of care community will select potential respondents that meet the requirements outlined in the list. System of care communities will e-mail the completed list to the National Evaluator at least 4 weeks prior to the scheduled visit so that the list of projected interviewees can be reviewed to ensure that each category of respondent will be adequately represented. The respondent categories include representatives of core child-serving agencies, project directors, family representatives and representatives of family advocacy organizations, social marketers, cultural and linguistic competence coordinators, program evaluators, intake workers, youth coordinators, care coordinators and case managers, direct service providers, care review participants, caregivers, and youth. For each system of care community, there will be approximately 27 respondents per site visit. Site visits will be conducted in all system of care communities. Based on previous experience, we expect a response rate for this study component of approximately 84 percent.


The universe for the Phase VI Cross-Sectional Descriptive Study, the Child and Family Outcome Study, and the Service Experience Study consists of the children served by the CMHS program in the 38 grantee sites.


Cross-Sectional Descriptive Study. For this evaluation component, data will be collected on children and families at intake into services. Descriptive data will be collected on the census of all children and their families who are being served by the CMHS program. To be included in this study component, children will need to: (1) meet the community’s service program eligibility criteria; and (2) receive services in that community. Because these data are routinely collected at the sites for internal purposes, descriptive data on all the children and families who receive services will be available.


Child and Family Outcome Study. To gather data that can be meaningfully interpreted without creating an overwhelming burden for some grantees, a sample of families will be selected for participation in this component. Recall that this is a longitudinal study; for ease of discussion, samples are discussed as longitudinal and cross-sectional samples.


The Child and Family Outcome Study sample will be selected from the pool of children and their families entering the Phase VI-funded systems of care. Although each site is funded for 6 years, the first year is committed to initial system development, with data collection occurring in the last 5 years of funding. Hence, recruitment of family participants will occur in years 2, 3, and 4 of program funding (or years 1, 2, and 3 of the evaluation) but could continue in later years if enrollment goals are not met.


As systems of care will develop differentially over the length of the project, it is important to consider the growth of the system of care. If the entire sample is recruited in the first year, the opportunity will be lost to assess whether changes in the client population occurred as the system matured (e.g., increasingly serving children with more severe problems or children referred through the juvenile justice system). For that reason, recruitment will be spread across 3 years and the number of children and families recruited each year will be standard across sites.


It is important that we draw a large enough sample in each grantee site to ensure that the evaluation will be able to detect the impact of the system of care initiative on child and family outcomes. If the samples are too small, significant differences of an important magnitude might go undetected. The effect sizes of the phenomena of interest form the basis of determining the minimum sample size needed through a statistical power analysis. Briefly, the power of a statistical test is generally defined as the probability of rejecting a false null hypothesis. In other words, power gives an indication of the probability that a statistical test will detect an effect of a given magnitude that, in fact, really exists in the population. The power analysis does not indicate that a design will actually produce an effect of a given magnitude. The magnitude of an effect, as represented by the population parameter, exists independent of the study and is dependent on the relationship among the independent and the dependent variables in question. The probability of detecting an effect from sample data, on the other hand, depends on three factors: (1) the level of significance used; (2) the size of the treatment effect in the population; and (3) sample size.
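
The interplay of these three factors can be illustrated with standard power routines, as in the sketch below for a two-group comparison of a continuous outcome; the effect size shown (Cohen’s d = 0.30) is purely illustrative and is not the evaluation’s planning value.

    # Power calculations for a two-sample t test: solve for the unknown
    # quantity given the other factors. The effect size is illustrative.
    from statsmodels.stats.power import TTestIndPower

    calc = TTestIndPower()

    # Per-group n needed to detect d = 0.30 with 80% power at alpha = .05.
    print(round(calc.solve_power(effect_size=0.30, power=0.80, alpha=0.05)))

    # Power actually achieved with 90 per group at the same d and alpha.
    print(round(calc.solve_power(effect_size=0.30, nobs1=90, alpha=0.05), 2))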


For the Child and Family Outcome Study in the grantee communities, the longitudinal design assesses whether individual children and families experience meaningful improvements in outcomes between the time they enter the systems of care and subsequent data collection points. Comparisons of outcomes among different groups within a community and across communities will also be made.


Power analysis assumptions have been modified from previous phases of the evaluation to reflect higher effect sizes observed in analyses of data from communities funded in Phases IV and V and a between- versus within-site difference in our analytical approach. As a result, each site will be expected to recruit sufficient numbers of children and families to ensure enrollment of 220 in each community. Relative to previous phases, this reduction in enrollment numbers and number of cohorts for initial enrollment will allow local evaluation staff to expend their limited evaluation resources on longitudinal follow-up to increase the quality of data and improve retention rates. Enrollment of 220 children and families at intake will result in a final sample of approximately 180, assuming 5 percent attrition at each of the four follow-up waves (6, 12, 18, and 24 months); because 220 × 0.95^4 ≈ 179, this reflects approximately 82 percent retention at the end of data collection. Children and families will be enrolled into the study beginning in year 2 of funding, and enrollment can continue through the end of year 4 of funding to ensure 24-month follow-up. The target sample sizes will be large enough to ensure the ability to detect changes in outcomes over time and to compare outcomes of relevant subgroups of children and families across a variety of characteristics (e.g., referral source, demographic characteristics, and risk factors) within a community.


Table 4 shows the data collection schedule for the 3 years of recruitment and 5 years of data collection. While past experience with this study component has indicated that some sites will have difficulty maintaining an attrition rate of 5 percent at each data collection point, a majority of sites in Phases IV and V of the evaluation have retention rates above 80 percent at 6 months, with one-fourth retaining more than 90 percent of study participants at 6 months. Overall, retention rates at 12 months were above 70 percent. The National Evaluator has established a number of strategies and techniques for maximizing recruitment and retention (see Section B.3.) and will work closely with all communities to determine the best methods for recruiting and retaining study participants.


Table 4. Data Collection Schedule for the Child and Family Outcome Study


Year Recruited1     Intake   6 Months   12 Months   18 Months   24 Months

Year 2 (FY10–11)    2812     2671       2538        2411        2290

Year 3 (FY11–12)    2812     2671       2538        2411        2290

Year 4 (FY12–13)    2812     2671       2538        2411        2290

Year 5              Follow-up data collection only; no new recruitment.

Year 6              Completion of data collection if data collection goals have not been met.

  1. Refers to the year of program funding in which the family was recruited into the study; each cohort is shown with the fiscal year in which its recruitment begins. Across all sites, the national evaluation spans 5 years (FY10–11 through FY14–15). Although data collection will occur in years 2 through 6, recruitment ends in year 4 with follow-up data collection continuing in year 6. Any sites that have not met their participant recruitment goals will be allowed to continue recruitment during year 6 as long as at least one follow-up interview can be completed before program funding ends.


To reach these numbers, some grantee sites will need to recruit all willing families into the Child and Family Outcome Study sample. For these sites, the cross-sectional descriptive and the longitudinal samples will be identical. Other sites will need to employ a sampling strategy to randomly select a sufficient number of families from the pool of children who enter the system of care. At these sites, a systematic sampling approach will be used. A random starting point between 1 and the sampling interval (the nearest integer to N/n, where N is the number of children entering the system and n is the number of children to be recruited into the sample) will be selected using a table of random numbers, and children will then be systematically selected at intervals of that length. For example, every tenth child (after the random starting point) would be sampled in a site serving 2,200 children (N/n = 2200/220 = 10), and every fifth child would be sampled in a site serving half that number, or 1,100 children (N/n = 1100/220 = 5).
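
A minimal sketch of this systematic selection procedure, using a hypothetical intake list, follows.

    # Systematic sampling: random start between 1 and the interval k, then
    # every k-th child thereafter. The intake list is hypothetical.
    import random

    def systematic_sample(intake_ids, target_n, seed=None):
        k = max(1, round(len(intake_ids) / target_n))  # sampling interval
        start = random.Random(seed).randint(1, k)      # random start in 1..k
        return intake_ids[start - 1::k]

    intake_ids = list(range(1, 2201))                  # site serving 2,200 children
    sample = systematic_sample(intake_ids, target_n=220, seed=42)
    print(len(sample), sample[:5])                     # ~220 ids, every 10th child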


The purpose of the sampling strategy described above is to maximize the chance that the children who participate in the Child and Family Outcome Study are indeed representative of the universe of children who enter the systems of care. If this is achieved, the findings from data collected from the randomly selected sample are more likely to generalize to the overall client pool. Every effort will be made to recruit and follow the children who are randomly selected into the Child and Family Outcome Study. However, one should expect that some of the families approached about entering the study would refuse to participate. When a family refuses to participate, the next family that meets the selection criteria will be selected. Past experience indicates that sites vary in their abilities to recruit Cross-Sectional Descriptive Study sample members into the Child and Family Outcome Study with the majority of sites recruiting more than 60 percent of the Cross-Sectional Descriptive Study sample into the Child and Family Outcome Study sample. To estimate the effect of the refusals on the representativeness of the sample, the families who refuse will be compared to the participating sample on, at minimum, demographic characteristics. (See the Data Analysis Plan section above.) Recall that descriptive data will be collected on all families that enter the system of care. This will provide the data upon which to make comparisons.


Experience from previous phases of the national evaluation has shown that, although sites can make estimates, it is difficult to predict precisely how many children will be served by the grantee systems of care. In addition, the number of children who enter the systems of care may increase over time as grantees expand their service capacity and enhance outreach efforts. For that reason, sampling strategies will have to remain flexible during the recruitment period and will be monitored closely by the National Evaluator. The sampling strategies will be based on the sampling ratio approach to random selection described above. In the first year of their funding, grantees will monitor the number of children that enter their systems of care. Toward the end of the first year, a sampling ratio will be developed based on the first year of enrollment into the system of care. That sampling ratio will be tested in the first 3 months of data collection and monitored throughout the recruitment period to ensure that it remains on target.


The actual process of recruitment will differ across sites. This is necessary because children and families will enter services differently across sites. For example, in one site, the primary portals of entry might be the schools, while in another it might be the court system. It is also likely that sites will have a variety of portals of entry (e.g., mental health centers, schools, and courts). Every effort will be made to ensure that the recruitment process is as standardized as possible across sites and at the various portals of entry. The rudiments of sample selection and recruitment will be documented in the national evaluation procedures manual, with additional guidelines developed specifically for each site. Training will also be conducted at each site. Whether a family is to be recruited into the Child and Family Outcome Study (i.e., whether they are selected for inclusion in the sample) will be determined as soon as it is known whether they meet the eligibility criteria. Intake workers, regardless of their location, training or service sector affiliation, will be trained to conduct the consent to contact process in a uniform manner. Scripts will be used to make sure that each potential participant receives the same information before agreeing to be contacted by the evaluation staff. (See Attachment 3.B.) Similarly, evaluation staff will be trained to conduct the informed consent process uniformly. Standard forms will be used to document refusals to be contacted or to participate in the study. These are established procedures in field research, and the National Evaluator will closely monitor them.


Service Experience Study. The sampling and recruitment procedures for this study, which includes administration of the Multi-Sector Service Contacts, Revised (MSSC–R–I and MSSC–R–F), the Youth Services Survey and Youth Services Survey for Families (YSS and YSS–F), and the Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R) are identical to that of the Child and Family Outcome Study; that is, the same randomly selected sample of children and families being served in all system of care communities. Thus, anticipated response rates and retention rates are the same as for the Child and Family Outcome Study.


Sector and Comparison Study. A total of 230 children will be sampled in each sectoral cluster at intake (115 in the system of care group and 115 in the comparison group), resulting in a final sample of 190 children (95 per group) in each cluster, assuming 5% attrition at each follow-up wave over five waves of data collection (intake and 6, 12, 18, and 24 months). We will compare sector-specific outcomes and service experience measures of children in each system of care cluster, from intake through 24 months after intake, to those of children in corresponding comparison clusters. We assume that many outcomes of interest will be dichotomous; for example, an education-related sector cluster may use as a sector-relevant outcome the proportion of youth who remain in school after entering services. Analysis would examine this outcome for youth entering services in a system of care compared to those not served in a system of care. An overall sample size of 190 (n=95 in each group) achieves power of .80 to detect an odds ratio of 1.75 (i.e., odds of remaining in school that are 1.75 times the odds in the comparison group) in a design with five repeated measurements (i.e., intake and 6, 12, 18, and 24 months). This calculation assumes the proportion with a successful outcome in the comparison group is .50, the correlation between observations on the same subject is .50, the covariance structure is first-order autoregressive, and α=.05.
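
The few lines below show the arithmetic behind that detectable odds ratio, given the stated .50 comparison-group proportion.

    # Converting the detectable odds ratio to group probabilities.
    p_comparison = 0.50
    odds_comparison = p_comparison / (1 - p_comparison)   # 1.0
    odds_soc = 1.75 * odds_comparison                     # odds ratio of 1.75
    p_soc = odds_soc / (1 + odds_soc)                     # approximately 0.636
    print(round(p_soc, 3))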


Services and Costs Study. Data for the Services and Costs Study are collected only on children and youth enrolled in the Child and Family Outcome Study. The sampling and recruitment procedures for this study are identical to those of the Child and Family Outcome Study, including the same randomly selected sample of children and families, response rates, and retention rates.


Sustainability Study. For each site, four site-level respondents (i.e., project director, key mental health representative, family organization representative, agency representative) will be asked to complete the Web survey. The project director, the director of the local family organization, and the two agency representatives who will be asked to complete the survey are individuals interviewed for System of Care Assessments. Previous experience indicates that the response rate for the Sustainability Survey should be 80 percent or higher. The brief version of the survey will be completed only by project directors. Historically, response rates have been highest among project directors; therefore, it is realistic to expect response rates to the brief version of the survey to be at least 80 percent.


CQI Initiative Evaluation. For each 2008-funded community, up to eight site-level respondents (i.e., principal investigator, project director, lead evaluator, cultural and linguistic competence coordinator, social marketing-communications manager, lead family contact, youth coordinator, TA coordinator) will be asked to complete the CQI Initiative Evaluation Baseline Survey and Monitoring Survey. Previous experience indicates that the response rate for both surveys should be approximately 75 percent. A subset of four communities will be selected to participate in the Case Studies. As a basic criterion for selection, the survey response rate for the selected communities must exceed 50 percent to ensure community investment in the case studies. Two focus groups will be conducted for each selected community: one with the site-level respondents who completed the Baseline and Monitoring Surveys; and one with the national-level technical assistance (TA) providers who work with the community.


Alumni Networking Study. For each of the currently funded and anticipated sites (n = 72 sites), up to three representatives most knowledgeable about the site will be asked to complete the Networking and Collaboration Survey (e.g., project director/principal investigator, lead family contact, lead evaluator). For each previously funded community (n = 92), one representative most knowledgeable about the site will be asked to complete the Networking and Collaboration Survey. Previous experience indicates that the response rate for the Networking and Collaboration Survey should be 80 percent or higher.

For the Alumni Network Web Site Satisfaction Survey, all registered users and a random sample of non-registered users of the Web site will be invited to participate. Estimates of the number of people expected to access the Web site will be based on current Web site usage statistics. Members of the Child, Adolescent and Family Branch (CAFB) at the Center for Mental Health Services who access the site will also be solicited to participate in the Satisfaction Survey. Previous experience indicates that the response rate for the Alumni Network Web Site Satisfaction Survey should be 80 percent or higher.



2. INFORMATION COLLECTION PROCEDURES


System of Care Assessment. The National Evaluator will collect data for this component during periodic site visits. Data collection will include semi-structured interviews with key informants, review of documents and randomly selected case records, and observations. To document changes in system of care development over time, all system of care communities will be visited three times during the 5 years of data collection (every 18–24 months), beginning in the second year of program funding. Initial data collection site visits will be scheduled according to the relative development of the individual programs, so that more advanced communities are visited first, followed by all others, until all have completed the data collection process within the timeframe allotted. It is anticipated that initial data collection site visits will take place between February and September 2010, with subsequent site visits occurring at 18–24-month intervals.


The System of Care Assessment protocol yields an average of 23 individual interviews and 6 case record reviews per data collection site visit. It is expected that these averages will be achieved during the Phase VI data collection process. Key informants include the local project director, representatives of core child-serving agencies, representatives of family organizations, cultural and linguistic competence coordinators, social marketers, program evaluators, youth coordinators, care coordinators, direct service providers, caregivers of children who receive services through the system of care, and youth who receive services through the system of care. The average time to obtain the required information from each person is about 1 hour.


Prior to the site visit, the National Evaluator will send out tables to be completed by the system of care community. These tables will collect information on: (1) the structure and participants of the governing body, (2) trainings that have been provided on system of care principles, (3) demographics of program staff, (4) services provided in the system of care community’s service array, (5) amounts, sources, and types of funding, and (6) participants on the care review team. These completed tables will be e-mailed to the National Evaluator approximately 4 weeks prior to the site visit. (See Attachments 4.A.1.–4.A.5. for System of Care Assessment protocols.)


Cross-Sectional Descriptive Study. Data for the Cross-Sectional Descriptive Study will be collected at entry into services for all children and families in the grantee sites. Data for this component will be collected by sites’ intake staff, who will be trained by the National Evaluator to ensure standard collection of these data. To standardize the collection of these data across sites, the National Evaluator has developed the Enrollment and Demographic Information Form (EDIF) and the Child Information Update Form (CIUF). (See Attachments 4.B.1. and 4.B.2.) The information can be collected from case records or from interviews conducted at intake. The National Evaluator strongly recommends that all grantees incorporate these items into their intake process. These data can be directly entered into a Web-based database by intake personnel to facilitate capture of basic descriptive characteristics of children served. There is no burden associated with the Enrollment and Demographic Information Form (EDIF) or the Child Information Update Form (CIUF). To the extent possible, the collection of this information will be coordinated with the collection of data elements required for National Outcome Measures (NOMs) reporting through TRAC. The Guidance for Applicants (GFA) for FY2008 funding states that grantees will be required to report a number of performance measures to ensure that SAMHSA can meet its reporting obligations under GPRA. The required descriptive information includes the number of persons served by age, gender, race, and ethnicity. This information will be gathered using the CMHS NOMs Adult Consumer Outcome Measures for Discretionary Programs or the Child Consumer Outcome Measures for Discretionary Programs (Child or Adolescent Respondent Version or Caregiver Respondent Version).


For families participating in the Child and Family Outcome Study, the descriptive information that may change over time (e.g., diagnosis, insurance status) will also be collected at each follow-up data collection point using the Child Information Update Form (CIUF). Evaluation staff will collect these follow-up descriptive data elements in conjunction with other follow-up data collection for the Child and Family Outcome Study (see below). Again, the information collected in the Cross-Sectional Descriptive Study creates no additional respondent burden.


Child and Family Outcome Study. Data collection for this evaluation component begins in the second year of the grantees’ funding. Because respondents’ reading levels will vary, the instruments will be administered in interview format. This approach was successfully implemented in Phases II, III, IV, and V. These data will be collected at intake and at follow-up data collection points. In Phase VI, child and family outcome data will be collected from a sample of children, their caregivers, and their service providers. (See Attachment 4.C. for instruments.) The CMHS program’s Guidance for Applicants requires grantees to collect information on the following outcomes:


  • mental illness symptomatology

  • employment/education

  • crime and criminal justice

  • stability in housing


Following children and families for 24 months will allow the assessment of the long-term impact of the system and will permit important functional outcomes to be assessed as children develop toward maturity (e.g., completion of high school).


Eight of the measures—the Youth Services Survey (YSS), the Delinquency Survey, Revised (DS–R), the Substance Use Survey, Revised (SUS–R), the Gain Quick–R: Substance Problem Scale (GAIN), the Youth Information Questionnaire, Revised (YIQ–R–I, YIQ–R–F), the Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2), the Reynolds Adolescent Depression Scale, Second Edition (RADS–2), and the Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y)—will be completed by youth 11 years of age and older.


Many of the measures have been cleared by the OMB across multiple phases. Previously approved measures include the following:


  • The Caregiver Strain Questionnaire (CGSQ) will be used to measure how families are affected by the special demands associated with caring for a child with serious emotional disturbance. (See Attachment 4.C.2.)

  • To measure child clinical symptomatology, caregivers of children age 6 years and older will complete the Child Behavior Checklist (CBCL 6–18); caregivers of children younger than age 6 will complete the Child Behavior Checklist 1½–5 (CBCL 1½–5). The CBCL has been widely used in children’s mental health services research to assess social competence, behaviors, and feelings. (See Attachment 4.C.3.)

  • Information regarding the residential status of children will be collected from caregivers using the Living Situations Questionnaire (LSQ). (See Attachment 4.C.5.)

  • To identify the emotional and behavioral strengths of children, caregivers of children older than age 5 years will complete the Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C). The BERS–2C is a strengths-based measure of social competence. (See Attachment 4.C.6.)

  • The Columbia Impairment Scale (CIS) will be completed by caregivers of children older than 5 years to measure children’s general level of functioning. (See Attachment 4.C.7.)

  • Youth will complete the Delinquency Survey, Revised (DS–R). This measure identifies delinquent or risky behavior for which youth with mental illnesses may be at high risk. (See Attachment 4.C.11.)

  • To identify the emotional and behavioral strengths of children from their own perspective, youth will complete the Behavioral and Emotional Rating Scale—Second Edition, Youth Scale (BERS–2Y). (See Attachment 4.C.12.)

  • The Gain Quick–R: Substance Problem Scale (GAIN) measures substance use, abuse, and dependence and will be administered to youth. (See Attachment 4.C.13.)

  • The Substance Use Survey, Revised (SUS–R) will be administered to youth to determine alcohol, tobacco, and drug use during the previous 30 days and 6 months. (See Attachment 4.C.14.)

  • To determine if youth are experiencing anxiety, they will be administered the Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2). (See Attachment 4.C.15.)

  • Youth will be administered the Reynolds Adolescent Depression Scale, Second Edition (RADS–2) to assess if they are experiencing depression. (See Attachment 4.C.16.)


Onsite data collectors, hired and managed by the sites, will collect data in the funded systems of care. In these sites, the people who will collect the data depend on the resources and needs of the sites. For example, some sites may choose to hire two full-time staff to manage the local evaluation and to collect all the data. Other sites might choose to hire one full-time evaluator to manage the evaluation but will collect data with flexible part-time staff.


The National Evaluator will document and monitor data collection procedures in the system of care sites to ensure the greatest possible uniformity in data collection across sites. In addition, evaluation staff and data collectors will be trained using standard materials developed by the National Evaluator.


Service Experience Study. Data for the Service Experience Study will be collected along with data for the Child and Family Outcome Study and include: (1) recording of service contacts on the Multi-Sector Service Contacts Questionnaire, Revised, Intake and Follow-Up (MSSC–R–I and MSSC–R–F) (Attachment 4.D.1.); (2) caregiver report on the cultural competence of services provided, using the Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R) (Attachment 4.D.2.); and (3) an assessment of service experience, satisfaction, and perceived outcomes with the Youth Services Survey for Families and Youth Services Survey (YSS–F and YSS) (Attachments 4.D.3. and 4.D.4.). The Service Experience Study will also examine the congruence between the program’s original design and what clients actually experience during implementation of that design. The Youth Services Surveys focus on whether the overall service system experienced by youth and their caregivers reflects the key principles of the system of care model. Caregivers and youth will report their perceptions of whether the services they received were accessible, well-coordinated, family-driven, culturally competent, helpful in meeting therapeutic goals, and matched with the individual needs of the child and family.


This corresponds to the Guidance for Applicants, which requires sites to collect data on:


    • Collaboration and coordination of system components;

    • Family involvement in services;

    • Family and youth satisfaction with services.


Data for the Service Experience Study will be collected in all system of care communities. These instruments will be completed at intake and follow-up by families who have received services, as indicated by the gate question, and are participating in the Child and Family Outcome Study.


Sector and Comparison Study. Data for this study will be collected in select system of care communities. A subset of children enrolled in the core study will be randomly sampled into three sectoral groups (juvenile justice, education, child welfare). Service enrollment expectations established in funding awards, diversity of populations served, stratification based on sector clustering, and other factors will be considered in establishing appropriate sampling strategies and sample sizes for this study in the funded communities. In sites where local evaluation capacity cannot ensure adequate data quality and retention of participants, national evaluation staff members will work with local evaluators to identify, hire, and train additional local staff to assist in conducting interviews. These interviewers will collect the more detailed data with sector-specific instruments. Local evaluation staff in sites with sufficient capacity will collect data as with other core study participants and, in addition, will collect measures included as part of the enhanced study. For the identification of the comparison sample, we will work with selected unfunded agencies to develop a process for the national evaluation study coordinator to oversee data collection for the comparison study. (See Attachments 4.E., 4.F., and 4.G.)


Services and Costs Study. To provide data for this study, grant communities will collect two types of data. The first type of data is budget data on services provided through flexible fund expenditures. The second type of data is child-level service event data. This includes data on each service provided to each child/youth by as many partner agencies in the systems of care as possible. The availability of these data and procedures that communities will implement in accessing these data will vary widely across grant communities. Some of the data needed for this study are already collected by communities in existing data systems developed for their own program management purposes. Other data are recorded on paper-based forms or as part of the child’s case records. However, some communities do not currently collect the data needed for this study, either electronically or on paper. For data not already collected, communities will be asked to begin collecting these data specifically for the Services and Costs Study.


Data will be compiled by either extracting data from existing data systems and recoding them according to a specified data dictionary or by key entering information collected from paper records. Some communities will either extract and recode their data or will key enter their data, while other communities will use a combination of both methods.


The national evaluation will provide two data dictionaries with specifications for communities to use in recoding data from existing data systems: one for flexible fund expenditures and the other for service event data. (See Attachment 4.K.) The national evaluation will also provide two data entry applications for communities to use in key entering data from paper records. The first application is the Flex Funds Tool for budget data on flexible funding expenditures. The second application is the Services and Costs Data Tool for child-level service event data. Data that are compiled by extracting and recoding existing data will be transmitted to the national evaluation at regular intervals. Data that are entered from paper records will be transmitted to a central database on an ongoing basis, as they are entered.
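
As an illustration of the extract-and-recode pathway, the sketch below maps local service codes to a common dictionary and computes a total cost per service event; the field names and codes are invented for illustration and do not reproduce the actual data dictionaries.

    # Recoding a service event extract to a common data dictionary.
    # Local codes, dictionary codes, and fields are hypothetical.
    import pandas as pd

    code_map = {"OP1": "outpatient_therapy", "CM": "case_management",
                "RES": "residential"}

    local = pd.DataFrame({
        "client_id": [101, 101, 102],
        "svc_code": ["OP1", "CM", "RES"],
        "units": [4, 2, 30],
        "unit_cost": [95.0, 60.0, 310.0],
    })

    national = local.assign(
        service_type=local["svc_code"].map(code_map),
        total_cost=local["units"] * local["unit_cost"],
    ).drop(columns=["svc_code"])
    print(national)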


Cost-effectiveness and cost-benefit analyses will use data collected as part of the longitudinal outcome and comparison studies. See the appropriate sections for descriptions of the information collection procedures for those studies.


Sustainability Study. The Sustainability Study involves collecting data in each grantee community via a Web-based survey. This study will gather data on system of care characteristics and factors related to sustainability, and will monitor and evaluate sites’ success in sustaining their systems of care post-funding. The Sustainability Survey will be completed by four selected staff (i.e., project director, family organization representative, agency representative, key mental health representative) from each grantee site in years 2 and 5 of the evaluation. The Web survey will also be used to conduct a 5-year post-funding assessment of the communities funded in 2002. (See Attachment 4.H.2.)


The Sustainability Study will also collect data from grantees during each of the first 4 years post-funding with a short form of the survey instrument from one respondent in each community (i.e., current or former project director). (See Attachment 4.H.1.)


CQI Initiative Evaluation. The CQI Initiative Evaluation will gather data on the implementation and effectiveness of the Initiative. Thus, the evaluation will assess if and how communities pursue CQI; how well the CQI Initiative identifies and addresses communities’ technical assistance (TA) needs; and how effective the Initiative is in providing appropriate, data-driven TA to communities.


The CQI Initiative Evaluation will collect data about grantee communities through three complementary activities: a Baseline Survey of key constituents in all 2008-funded communities administered in year 1 of program delivery; a subsequent Monitoring Survey administered to the same constituents in years 3 and 5; and Case Studies of four selected communities in years 2 and 4. (See Attachment 4.I.) For each community, up to eight respondents (i.e., principal investigator, project director, lead evaluator, cultural and linguistic competence coordinator, social marketing-communications manager, lead family contact, youth coordinator, TA coordinator) will be asked to complete the Web-based Baseline and Monitoring Surveys. A subset of four communities will be selected for participation in the Case Studies, which will consist of focus groups with local system of care personnel and national TA providers for each selected community.


Survey administration will adhere to accepted methods for mail and Web surveys. Following recruitment activities and verification of contact information, each survey participant will be mailed a pre-survey letter that explains the study and contains directions for logging onto a Web site to complete the survey. Instructions will also be provided for obtaining a hard copy of the survey if desired. A follow-up reminder will be sent to non-respondents after 1 week and again after 2 weeks. Lastly, a letter containing a hard copy of the survey will be sent to all remaining non-respondents, followed by a telephone reminder call.


Key personnel and TA providers for four communities will be contacted via e-mail or telephone to solicit their participation in focus groups. Each individual who agrees to participate will then be sent a consent form via e-mail or ground mail, and will be asked to return the signed consent form via facsimile or ground mail. In addition, consent by all participants will be verbally verified before each focus group commences.


Alumni Networking Study. The Alumni Networking Study will collect data via a Web-based Networking and Collaboration Survey in years 1 and 3 from up to three representatives from each currently funded and anticipated community (n = 72) and one representative from each previously funded community (n = 92). Each representative will be e-mailed a pre-survey letter that explains the study and contains directions for logging onto a Web site to complete the survey. Instructions will also be provided for obtaining a hard copy of the survey if desired. A follow-up reminder will be sent to non-respondents after 1 week and again after 2 weeks.


The Alumni Network Web Site Satisfaction Survey will collect information from a randomly selected sample of non-registered Web site users, who will be invited to complete the Web survey via pop-up window technology. All currently registered users will be solicited for participation via an e-mail explaining the purpose of the study and providing a link to the Satisfaction Survey. A follow-up reminder will be sent to non-respondents after 1 week and again after 2 weeks. Additionally, all members of CAFB will be solicited to participate in the Satisfaction Survey. The Satisfaction Survey will be administered in years 2 and 4.


Table 5 summarizes the respondent, data collection procedure, and periodicity for each measure.


Table 5. Instrumentation, Respondents, and Periodicity


System of Care Assessment (all sites)

Measure: System of Care Assessment Tool (Interview Guides and Data Collection Forms)
Indicators: Family-driven; youth-guided; individualized services; cultural competence; interagency collaboration; service coordination; service array; system and service accessibility; community-based services; least restrictive service provision
Data Source(s): Project staff; core agency representatives; family members; caregivers; youth; service providers; other constituents; documents
Method: Interview; document review
When Collected: Every 12–18 months

Child and Family Outcome Study (a sample of children and families enrolled in the system of care)

Measure: Caregiver Information Questionnaire, Revised (CIQ–R)
Indicators: Age; educational level and placement; socioeconomic status; race/ethnicity; parents’ employment status; family advocacy and peer support; living arrangement; presenting problem(s); intake/referral source; risk factors for family and child; child and family physical health; coercion for services; service use
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Living Situations Questionnaire (LSQ)
Indicators: Living situations; number of placements; restrictiveness of placements
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C)
Indicators: Strengths; social competence
Data Source(s): Caregiver of children age 6 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Preschool Behavioral and Emotional Rating Scale—Parent Rating Scale (PreBERS)
Indicators: Strengths; social competence
Data Source(s): Caregivers of children ages 3–5
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Child Behavior Checklist 6–18 (CBCL 6–18) and Child Behavior Checklist 1½–5 (CBCL 1½–5)
Indicators: Symptomatology; social competence
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Education Questionnaire, Revision 2 (EQ–R2)
Indicators: Functioning in school environments
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Devereux Early Childhood Assessment (DECA)
Indicators: Behavioral concerns; initiative, self-control, and attachment; attention problems, aggression, withdrawal, and emotional control
Data Source(s): Caregiver of children ages 0–5
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Parenting Stress Index (PSI)
Indicators: Parenting characteristics and child adjustment
Data Source(s): Caregiver of children age 12 years and younger
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Columbia Impairment Scale (CIS)
Indicators: General functioning
Data Source(s): Caregiver of children age 6 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Caregiver Strain Questionnaire (CGSQ)
Indicators: Caregiver strain
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y)
Indicators: Strengths; social competence
Data Source(s): Youth
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Delinquency Survey, Revised (DS–R)
Indicators: Delinquent or risky behaviors
Data Source(s): Youth 11 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Gain Quick–R: Substance Problem Scale (GAIN)
Indicators: Substance use, abuse, and dependence
Data Source(s): Youth 11 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Substance Use Survey, Revised (SUS–R)
Indicators: Alcohol, tobacco, and drug use
Data Source(s): Youth 11 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2)
Indicators: Child anxiety
Data Source(s): Youth 11 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Reynolds Adolescent Depression Scale, Second Edition (RADS–2)
Indicators: Child depression
Data Source(s): Youth 11 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Youth Information Questionnaire, Revised (YIQ–R)
Indicators: Acculturation; coercion; peer relations; symptomatology; suicidality; neighborhood safety; presenting problems; empowerment; self-efficacy; life skills; employment status
Data Source(s): Youth 11 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Service Experience Study

Measure: Multi-Sector Service Contacts, Revised—Intake (MSSC–R–I)
Indicators: Type of service; amount of service; location of service
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, if services received

Measure: Multi-Sector Service Contacts, Revised—Follow-Up (MSSC–R–F)
Indicators: Type of service; amount of service; location of service
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter, if services received

Measure: Youth Services Survey for Families (YSS–F)
Indicators: Service experience; client satisfaction; perceived outcomes
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter, if services received

Measure: Youth Services Survey (YSS)
Indicators: Service experience; client satisfaction; perceived outcomes
Data Source(s): Youth 11 years and older
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter, if services received

Measure: Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R)
Indicators: Cultural competence
Data Source(s): Caregiver
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter, if services received

Sector and Comparison Study

Measure: Child Welfare Sector Study Questionnaire—Intake (CWSQ–I)
Indicators: Mental health services provided; maintenance in home; out-of-home placement; risk factors for child
Data Source(s): Care coordinators/child welfare case workers
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Measure: Child Welfare Sector Study Questionnaire—Follow-Up (CWSQ–F)
Indicators: Mental health services provided; maintenance in home; out-of-home placement; risk factors for child
Data Source(s): Care coordinators/child welfare case workers
Method: Interview
When Collected: Intake, 6 months, and every 6 months thereafter

Sustainability Study

Measure: Sustainability Survey
Indicators: System of care characteristics; factors related to sustainability; sites’ success in sustaining the system of care post-funding
Data Source(s): Local site informants
Method: Web-based survey
When Collected: Once each in evaluation years 2 and 5

Services and Costs Study (all sites; caregivers: all enrolled in the Child and Family Outcome Study)

Measure: Management Information Systems (MIS)
Indicators: Previous service history; service setting and type; level of restrictiveness; mix of services; amount and duration; continuity of care; service costs; funding sources and third-party reimbursements
Data Source(s): MIS maintained by State and local agencies
Method: Data abstraction
When Collected: Continuously; data transmitted at regular intervals

CQI Initiative Evaluation

Measure: Baseline Survey
Indicators: Development of CQI infrastructure; utilization and effectiveness of technical assistance; key constituent involvement in implementing the CQI Initiative; extent to which the Initiative was implemented according to plans; satisfaction with implementation
Data Source(s): Key site personnel
Method: Web-based survey
When Collected: Once in year 1 of program delivery

Measure: Monitoring Survey
Indicators: Development of CQI infrastructure; utilization and effectiveness of technical assistance; key constituent involvement in implementing the CQI Initiative; extent to which the Initiative was implemented according to plans; satisfaction with implementation; utilization of the CQI Progress Report
Data Source(s): Key site personnel
Method: Web-based survey
When Collected: Once each in years 3 and 5 of program delivery

Measure: Local Focus Group Guide
Indicators: Development of CQI infrastructure; utilization and effectiveness of technical assistance; satisfaction with implementation; program changes resulting from the Initiative; recommendations to improve the CQI Initiative
Data Source(s): Key site personnel
Method: Focus group
When Collected: Once each in years 2 and 4 of program delivery

Measure: National TA Provider Focus Group Guide
Indicators: Development of CQI infrastructure; utilization and effectiveness of technical assistance; satisfaction with implementation; program changes resulting from the Initiative; recommendations to improve the CQI Initiative
Data Source(s): National TA providers
Method: Focus group
When Collected: Once each in years 2 and 4 of program delivery

Alumni Networking Study

Measure: Networking and Collaboration Survey
Indicators: Frequency and levels of collaboration on issues of governance, individualized care, funding, family-driven care, youth-guided care, culturally competent care, sustainability, and evaluation; frequency and levels of collaboration on program and evaluation technical assistance; Web site factors that facilitate or hinder collaboration
Data Source(s): Project directors/principal investigators, lead family contact, and lead evaluator
Method: Web-based survey
When Collected: Evaluation years 1 and 3

Measure: Alumni Network Web Site Satisfaction Survey
Indicators: Satisfaction with the design, format, and content of the Web site
Data Source(s): Project directors/principal investigators
Method: Web-based survey
When Collected: Evaluation years 2 and 4



3. METHODS TO MAXIMIZE RESPONSE RATES


To maximize the response rate for all data collection efforts, a number of steps will be taken:


The National Evaluator will continue to take an active role in providing technical assistance and support to the grantee sites by providing: (1) a detailed Data Collection Procedures Manual; (2) an initial training on evaluation protocols; (3) evaluation workshops at semi-annual national meetings and through Webinars; (4) one-on-one contact with national evaluation liaisons; (5) regular teleconferences and site visits throughout the evaluation period; (6) forums for facilitated cross-community discussions; (7) reading materials; and (8) additional guidance and information as questions arise. In addition, to keep site evaluators aware of when each interview is due, the National Evaluator will provide a tracking system built in Microsoft Access specifically for this evaluation, together with reminder e-mails generated by the Internet-based data collection system; these resources eliminate the need for sites to duplicate effort and expense in designing local tracking materials.


Additionally, the National Evaluator will provide mechanisms for sites to communicate with the National Evaluator and with other sites, including an Internet-based listserv for facilitating communication about training and technical assistance regarding evaluation implementation and utilization. The listserv allows site evaluators to communicate with the National Evaluator and with each other through group e-mail: any message sent to the listserv is automatically distributed to all site evaluators, at no cost to them. In addition, a computer bulletin board will be established to provide a secure avenue for exchanging electronic copies of documents, such as evaluation reports and research instruments, for training and technical assistance purposes.


Special training efforts will also be made in communities with smaller service populations to ensure that as many people as possible from the target population are enrolled and that site staff are familiar with methods for maximizing response rates. The National Evaluator will encourage these sites to keep in frequent contact with study participants to update telephone numbers and addresses, and to create program branding and materials that help the site engage families. The National Evaluator will also provide these sites with contact information for staff from other sites that have achieved high response rates and will assist them in applying strategies that have succeeded in other communities.


To help ensure that data are collected regularly and in keeping with national evaluation standards, data collection staff at the local sites will continue to work closely with local providers, staff from various agencies, and evaluation staff. These contacts will inform the evaluation implementation and data collection procedures and address any questions or concerns of the participating providers or agencies. In addition, local parent groups will be enlisted to encourage the cooperation of families in providing child and family information.


In keeping with national evaluation standards, information will be collected from participants in the longitudinal Child and Family Outcome Study to facilitate contacting them in the future. This will include the names, phone numbers, and addresses of close friends and family members who are likely to know the participants’ whereabouts if they move. At the time of follow-up data collection, staff will attempt to contact respondents at different times of the day and week using a variety of methods (e.g., phone calls, mailed postcards), continuing until the determination is made that a family has refused further participation or cannot be found. Efforts to contact respondents for follow-up data collection will begin by 1 month before the follow-up interview is due (a scheduling sketch appears after the list below). Other efforts to increase the response rate will include:


    • Providing an incentive payment for completing follow-up interviews;

    • Administering the instruments to children and their parents or caregivers at times and settings of their choice and administering multiple instruments at one time;

    • Developing a close working relationship between the data collection staff and providers at each site to facilitate tracking;

    • Conducting follow-up and informational mailings throughout the study period to maintain contact with study participants;

    • Using a centralized data collection and tracking system involving trained interviewers and at least one person dedicated to the tracking of study participants over time to keep study attrition to a minimum;

    • Employing proven tracking techniques (e.g., request address corrections from the post office for forwarded mail, use Web-based address and telephone searches, employ locator services to search for respondents);

    • Obtaining permission from caregivers for evaluators to contact other agencies for the purpose of getting new addresses and phone numbers if the family has moved since the last interview;

    • Providing sites with useful feedback on data obtained through the evaluation activities that will assist them in planning and service delivery.
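To illustrate the follow-up timeline referenced above, the following is a minimal sketch assuming interview waves anchored at intake, spaced approximately 6 months apart, with contact efforts beginning 1 month before each due date. The function and interval constants are hypothetical aids, not part of the evaluation’s actual tracking system.

    from datetime import date, timedelta

    WAVE_INTERVAL_DAYS = 183  # approximately 6 months between interview waves
    CONTACT_LEAD_DAYS = 30    # begin contact efforts about 1 month early

    def follow_up_schedule(intake, waves):
        """Return (wave, contact-start date, due date) for each follow-up wave."""
        schedule = []
        for wave in range(1, waves + 1):
            due = intake + timedelta(days=WAVE_INTERVAL_DAYS * wave)
            contact_start = due - timedelta(days=CONTACT_LEAD_DAYS)
            schedule.append((wave, contact_start, due))
        return schedule

    # Example: a child enrolled January 15, 2010 has follow-up interviews due
    # at roughly 6, 12, 18, and 24 months, with tracking efforts starting a
    # month before each due date.
    for wave, start, due in follow_up_schedule(date(2010, 1, 15), waves=4):
        print(f"Wave {wave}: begin contact {start}, interview due {due}")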



4. TESTS OF PROCEDURES


Many instruments planned for Phase VI are standardized instruments that have been tested through use in children’s mental health services research and practice. These include the Child Behavior Checklist (CBCL), the Behavioral and Emotional Rating Scale—Second Edition (BERS–2), the Preschool Behavioral and Emotional Rating Scale (PreBERS), the Devereux Early Childhood Assessment (DECA), the Gain Quick–R: Substance Problem Scale (GAIN), the Youth Services Surveys (YSS), the Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2), and the Reynolds Adolescent Depression Scale, Second Edition (RADS–2). Selection of measures was based on expert panel reviews and an assessment of measurement quality as reported in the literature. (Information on the reliability and validity of the measures and other supporting materials appears along with the instruments in Attachment 4.) Decisions about Phase VI instrumentation were made in conjunction with expert reviewers, site representatives, and family members. These consultants are listed in Attachment 2.


In addition to providing input into the selection of standardized instruments, the team of consultants suggested measures to remove from the evaluation and specific items to include; the suggested items have been incorporated into the new and revised measures. New and revised measures have been administered to determine burden estimates. Experience and data from previous phases were further used to assess reliability and validity and contributed to the burden estimates.


The following are new measures in Phase VI:


  • Parenting Stress Index (PSI)

  • Devereux Early Childhood Assessment (DECA)

  • Preschool Behavioral and Emotional Rating Scale (PreBERS)

  • Court Representative Questionnaire (CRQ)

  • Teacher Questionnaire (TQ)

  • School Administrator Questionnaire (SAQ)

  • Child Welfare Sector Study Questionnaire—Intake (CWSQ–I)

  • Child Welfare Sector Study Questionnaire—Follow-Up (CWSQ–F)

  • Sustainability Survey: Brief Form

  • Multi-Sector Service Contacts, Revised—Intake (MSSC–R–I)

  • CQI Baseline Survey

  • CQI Monitoring Survey

  • Local CQI Focus Group Guide

  • National CQI Focus Group Guide

  • Networking and Collaboration Survey

  • Alumni Network Web Site Satisfaction Survey

  • Flex Funds Data Dictionary/Tool

  • Services and Costs Data Dictionary/Data Entry Application


Revised measures in Phase VI include the following:


  • System of Care Assessment Interview Protocols

  • Caregiver Information Questionnaire, Revised (CIQ–R)

  • Education Questionnaire, Revision 2 (EQ–R2)

  • Substance Use Survey, Revised (SUS–R)

  • Delinquency Survey, Revised (DS–R)

  • Youth Information Questionnaire, Revised (YIQ–R)

  • Multi-Sector Service Contacts, Revised—Follow-Up (MSSC–R–F)

  • Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R)

  • Youth Services Survey for Families (YSS–F)

  • Youth Services Survey (YSS)

  • Sustainability Survey


Measures that are unchanged from previous phases of the evaluation include the following:


  • Living Situations Questionnaire (LSQ)

  • Child Behavior Checklist (CBCL)

  • Caregiver Strain Questionnaire (CGSQ)

  • Behavioral and Emotional Rating Scale—Second Edition (BERS–2)

  • Gain Quick–R: Substance Problem Scale (GAIN)

  • Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2)

  • Reynolds Adolescent Depression Scale, Second Edition (RADS–2)

  • Columbia Impairment Scale (CIS)


Measures included in previous phases but removed from Phase VI include the following:


  • Family Life Questionnaire (FLQ)

  • Vineland Screener (VS)

  • Interagency Collaboration Scale


In addition to these measures, the Phase VI evaluation will include an electronic data transfer to obtain records from the education and juvenile justice sectors. These data will be collected as part of the Sector and Comparison Study and will include administrative data such as school grades, school attendance, and arrest and adjudication records.


All measures for Phase VI have been or will be translated into Spanish. The reliability and validity of the Spanish Child Behavior Checklist (CBCL) have been reported in the literature. Translation of measures will follow established procedures, as in earlier phases: first, experienced bilingual translation consultants translate the measures from English to Spanish; then, to maximize accuracy, the full measures (or, in some cases, selected sections) are back-translated from Spanish to English by other translators, largely native speakers in grantee communities.



5. STATISTICAL CONSULTANTS


The National Evaluator has full responsibility for the development of the overall statistical design, and assumes oversight responsibility for data collection and analysis for Phase VI. Training, technical assistance, and monitoring of data collection will be provided by the National Evaluator. The individual responsible for overseeing data collection and analysis is:


Brigitte Manteuffel, Ph.D.

Macro International Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321-3211


The following individuals will serve as statistical consultants to this project:


Susan L. Ettner, Ph.D.

Professor

David Geffen School of Medicine at UCLA

Division of General Internal Medicine and Health Services Research

911 Broxton Plaza, Room 106

Box 951736

Los Angeles, CA 90095-1736

Phone: (310) 794-2289

Fax: (310) 794-0732


Anna Krivelyova, M.S.

Macro International Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321–3211


Robert Stephens, Ph.D., M.P.H.

Macro International Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321–3211


The agency staff person responsible for receiving and approving contract deliverables is:


Sylvia K. Fisher, Ph.D.

Child, Adolescent, and Family Branch

Center for Mental Health Services

Substance Abuse and Mental Health Services Administration

1 Choke Cherry Road, Room 6–1047

Rockville, MD 20857





LIST OF ATTACHMENTS



Attachment 1: Request for Applications (RFA) No. SM-08-004, pages 1–47


Attachment 2: Consultation

A. Federal/National Partnership for Children’s Mental Health

B. Methodological Consultants and Services Evaluation Committee to the National Evaluation

C. Expert Reviewers of Instrumentation


Attachment 3: Consents

A. Guidelines for Obtaining Consent

1. National Evaluation

2. Comparison Study

B. Model Scripts for Consent to Contact

1. National Evaluation

2. Comparison Study

C. Model Consent Forms

1. Sample Script to Introduce the Longitudinal Child and Family Outcome Study

2. Consent to Contact—Longitudinal Child and Family Outcome Study

3. Informed Consent—Longitudinal Child and Family Outcome Study: Caregiver Version

4. Informed Assent—Longitudinal Child and Family Outcome Study: Child Version

5. Informed Consent—Longitudinal Child and Family Outcome Study: Young Adult Version

6. Sample Script to Introduce the Sector and Comparison Study

7. Consent to Contact—Sector and Comparison Study

8. Informed Consent—Sector and Comparison Study: Caregiver Version

9. Informed Assent—Sector and Comparison Study: Child Version

10. Informed Consent—Sector and Comparison Study: Young Adult Version

11. Informed Consent—Sector and Comparison Study: Agency Representative

D. National Evaluation Consent Forms

1. Informed Consent—System of Care Assessment: Staff

2. Informed Consent—System of Care Assessment: Caregiver

3. Informed Consent—System of Care Assessment: Youth

4. Informed Assent—System of Care Assessment: Youth

5. Informed Consent—System of Care Assessment: Parent/Guardian Approval for Youth Participant

6. Informed Consent—System of Care Assessment: Record Review

7. Informed Consent—Sustainability Survey: Brief Form

8. Informed Consent—Sustainability Survey

9. Informed Consent—Continuous Quality Improvement Initiative Survey

10. Informed Consent—Continuous Quality Improvement Initiative Local Focus Group

11. Informed Consent—Continuous Quality Improvement Initiative National Focus Group

12. Informed Consent—Alumni Networking and Collaboration Survey

13. Informed Consent—Alumni Web Site Satisfaction Survey


Attachment 4: Original Instruments, Data Elements, and Supporting Materials

A. System of Care Assessment

1. Overview of System of Care Assessment Framework

a. Infrastructure Domain

b. Service Delivery Domain

2. Letter Templates

a. Introduction Letters

b. Confirmation Letter

c. Draft Report Letter

d. Final Report Letter

e. Thank You Letter

3. Informant Table

4. Pre-Visit Documentation

a. Instructions for Completing Site Visit Tables and Lists

b. Site Visit Tables

c. Site Informant List

d. Sample Agenda

e. Checklist of Planning Steps

5. System of Care Assessment Interview Protocols

A. Core Agency Representative

B. Project Director

C. Family Representative/Representative of Family/Advocacy Organizations

D. Program Evaluator

E. Intake Worker

F. Care Coordinator

G. Direct Service Delivery Staff

H. Care Review Participant

I. Caregiver of Child or Youth Served by the Program

L. Direct Service Staff from Other Public Child-Serving Agencies

M. Care Record/Chart Review

N. Other Staff

O. Debriefing Document

P. Youth Respondent

Q. Youth Coordinator

R. Cultural and Linguistic Competence Coordinator

S. Social Marketing Communications Manager

B. Cross-Sectional Descriptive Study

1. Enrollment and Demographic Information Form (EDIF)

2. Child Information Update Form (CIUF)

C. Longitudinal Child and Family Outcome Study

1. Caregiver Information Questionnaire, Revised (CIQ–R)

a. Caregiver Information Questionnaire, Revised: Caregiver—Intake (CIQ–RC–I)

b. Caregiver Information Questionnaire, Revised: Caregiver—Follow-Up (CIQ–RC–F)

c. Caregiver Information Questionnaire, Revised: Staff as Caregiver—Intake (CIQ–RS–I)

d. Caregiver Information Questionnaire, Revised: Staff as Caregiver—Follow-Up (CIQ–RS–F)

2. Caregiver Strain Questionnaire (CGSQ)

3. Child Behavior Checklist (CBCL)

a. Child Behavior Checklist (CBCL 1½–5)

b. Child Behavior Checklist (CBCL 6–18)

4. Education Questionnaire, Revision 2 (EQ–R2)

5. Living Situations Questionnaire (LSQ)

6. Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C)

7. Columbia Impairment Scale (CIS)

8. Parenting Stress Index (PSI)

9. Devereux Early Childhood Assessment

a. Devereux Early Childhood Assessment for Infants (DECA 1–18M)

b. Devereux Early Childhood Assessment for Toddlers (DECA 18–36M)

c. Devereux Early Childhood Assessment (DECA 2–5Y)

10. Preschool Behavioral and Emotional Rating Scale (PreBERS)

11. Delinquency Survey, Revised (DS–R)

12. Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y)

13. Gain Quick–R: Substance Problem Scale (GAIN)

14. Substance Use Survey, Revised (SUS–R)

15. Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2)

16. Reynolds Adolescent Depression Scale, Second Edition (RADS–2)

17. Youth Information Questionnaire, Revised (YIQ–R)

a. Youth Information Questionnaire, Revised—Intake (YIQ–R–I)

b. Youth Information Questionnaire, Revised—Follow-Up (YIQ–R–F)

D. Service Experience Study

1. Multi-Sector Service Contacts Questionnaire, Revised

a. Multi-Sector Service Contacts, Revised: Caregiver—Intake (MSSC–RC–I)

b. Multi-Sector Service Contacts, Revised: Caregiver—Follow-Up (MSSC–RC–F)

c. Multi-Sector Service Contacts, Revised: Staff as Caregiver—Intake (MSSC–RS–I)

d. Multi-Sector Service Contacts, Revised: Staff as Caregiver—Follow-Up (MSSC–RS–F)

2. Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R)

3. Youth Services Survey for Families, Abbreviated Version (YSS–F)

4. Youth Services Survey, Abbreviated Version (YSS)

E. Sector and Comparison Study: Juvenile Justice

1. Court Representative Questionnaire (CRQ)

F. Sector and Comparison Study: Education

1. Teacher Questionnaire (TQ)

2. School Administrator Questionnaire (SAQ)

G. Sector and Comparison Study: Child Welfare

1. Child Welfare Sector Study EDIF Addendum (CWS–EDIF)

2. Child Welfare Sector Study Questionnaire—Intake (CWSQ–I)

3. Child Welfare Sector Study Questionnaire—Follow-Up (CWSQ–F)

H. Sustainability Study

1. Sustainability Study Survey: Brief Form

a. Sustainability Study: Brief Form Respondent Selection Criteria

b. Sustainability Study: Brief Form Telephone Script

c. Sustainability Study: Brief Form Cover Letters

d. Sustainability Survey: Brief Form

e. Sustainability Study: Brief Form Reminder Letters

2. Sustainability Study Survey

a. Sustainability Study: Survey Respondent Selection Criteria

b. Sustainability Study: Survey Telephone Scripts

c. Sustainability Study: Survey Cover Letters

d. Sustainability Survey

e. Sustainability Study: Survey Reminder Letters

f. Sustainability Study Survey Web Screens

I. CQI Initiative Evaluation

1. CQI Initiative Evaluation Survey

a. Survey Cover Letter

b. CQI Initiative Evaluation Baseline Survey

c. CQI Initiative Evaluation Monitoring Survey

d. Survey Reminder Letters

e. Survey Thank You Letter

2. CQI Initiative Evaluation Focus Groups

a. Local Focus Groups

1. Local Focus Group Invitation Letter

2. Local Focus Group Confirmation Letter

3. Local Focus Group Reminder Letter

4. Local Focus Group Guide

5. Local Focus Group Thank You Letter

b. National Focus Groups

1. National Focus Group Invitation Letter

2. National Focus Group Confirmation Letter

3. National Focus Group Reminder Letter

4. National Focus Group Guide

5. National Focus Group Thank You Letter

J. Alumni Networking Study

1. Networking and Collaboration Survey

a. Networking and Collaboration Survey Cover Letters

b. Networking and Collaboration Survey

c. Networking and Collaboration Survey Reminder Letters

2. Web Site Satisfaction Survey

a. Web Site Satisfaction Survey Cover Letters

b. Web Site Satisfaction Survey

c. Web Site Satisfaction Survey Reminder Letters

K. Services and Costs Study

1. Flex Funds Tool Data Dictionary

2. Services and Costs Data Dictionary


Attachment 5: Sample Table Shells for Reporting Findings


