National Evaluation of the Comprehensive Mental Health Services for Children and Their Families Program: Phase VI

OMB: 0930-0307


Phase VI of the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program


Supporting Statement


A. JUSTIFICATION


1. CIRCUMSTANCES OF INFORMATION COLLECTION



The Substance Abuse and Mental Health Services Administration (SAMHSA), Center for Mental Health Services is requesting OMB approval for the continuation of currently approved data collection activities (26 instruments) in the nine (9) communities awarded cooperative agreements (CA) in FY 2010. These communities are part of the Phase VI cohort of the national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program—Children’s Mental Health Initiative (CMHI), which also includes communities funded in FY 2008 and FY 2009. The request is to extend the data collection from these nine (9) communities through September 2018. The currently approved data collection is under OMB No. 0930-0307, which expires on 9/30/2015.


Serious emotional disturbance affects more than 4.5 million children and their families in the United States. There is consensus that an integrated, coordinated, and comprehensive system of care is the best approach for meeting the needs of this population. The Comprehensive Community Mental Health Services for Children and Their Families Program, which is administered by the Center for Mental Health Services (CMHS) within the Substance Abuse and Mental Health Services Administration (SAMHSA), provides funds to support a broad array of community-based and family-driven services delivered through the system of care model. Under this program, CMHS has funded 5- and 6-year grants and cooperative agreements to States and locales to expand the array and capacity of services for children with serious emotional disturbance. Program funding has increased from $4.9 million in FY 1993 to $121.3 million in FY 2010 due in large part to the evidence of the effectiveness of the program provided by the national evaluation. This level of funding was maintained through FY 2011. To date, this CMHS program has funded 173 such communities through these grants and cooperative agreements. This includes 47 sites awarded cooperative agreements in Phase VI (18 in FY 2008, 20 in FY 2009, and 9 in FY 2010) for which approval is being sought.


The data collection effort proposed here relates closely to the completed and previously approved evaluations of Phase I (OMB No. 0930–0171), Phase II (OMB No. 0930–0192), Phase III (OMB No. 0930–0209), and Phase IV (OMB No. 0930–0257); the ongoing evaluation of Phase V (OMB No. 0930-0280); and the Phase VI CA awardees (OMB No. 0930-0307). Phases I through III cover grantees funded in FY 1993, FY 1994, and FY 1997 to FY 2000; Phase IV covers CA awardees funded in FY 2002 to FY 2004; Phase V covers CA awardees funded in FY 2005 and FY 2006; and Phase VI covers CA awardees funded in FY 2008, FY 2009, and FY 2010.



The previously cleared Phase VI evaluation is composed of six core study components: (1) the System of Care Assessment, which documents the development of systems of care through site visits conducted every 12–18 months; (2) the Cross-Sectional Descriptive Study, which collects descriptive data on all children and families who enter the CMHS-funded systems of care throughout the funding period; (3) the Child and Family Outcome Study, which collects data longitudinally on child clinical and functional status and family outcomes; (4) the Service Experience Study, which collects data longitudinally on family experience and satisfaction with services from a sample of children and families; (5) the Services and Costs Study, which assesses the costs and cost-effectiveness of system of care services; and (6) the Sustainability Study, which assesses the potential for communities to sustain their systems of care over time. These components were previously approved for data collection by communities funded in FY 2008 and FY 2009.


Of these six studies, five are currently being conducted with all CA awardees funded in FY 2009 and FY 2010. The present request proposes to continue data collection under this protocol with the nine CA awardees funded in FY 2010. CA awardees funded in FY 2010 will participate in five of these previously approved core study components. These five study components collect information on a major nationwide initiative serving thousands of children and their families. These data are used for the national evaluation as well as for local evaluations by the CA awardees.


The proposed data collection activities will continue the previously cleared data collection efforts and ensure data collection activities align with SAMHSA’s plan for achieving the goals of the agency’s six strategic initiatives entitled Leading Change 2.0: Advancing the Behavioral Health of the Nation. Through its longitudinal assessment of child and family living situations, employment, education and behavioral health outcomes, this evaluation assesses CMHI program progress in addressing SAMHSA’s strategic initiative focused on promoting recovery-oriented behavioral health service systems and establishing system-level approaches that foster health and resilience; increase permanent housing, employment, education and other necessary supports; and reduce barriers to social inclusion.


This request proposes to continue previously approved data collection activities for communities funded in FY 2010 through FY 2018, facilitating the collection of data for an additional 3 years. A total of 26 instruments will be used to collect the data.


Grant/Cooperative Agreement Review Process. The SAMHSA Office of Review selects the review panel based on a number of criteria including, but not limited to, geographic region and race/ethnicity. The review office requests recommendations of qualified reviewers from the program office, among other sources; however, the selection of reviewers is blind to the program office. The reviewers rate the applicants based on the evaluation criteria contained in the Request for Applications (RFA) No. SM-10-005. Once scoring has occurred, the program office writes a funding plan specifying the number of grants that can be funded based on the total budget available for the grant. Once the Administrator concurs, the Notice of Grant Award is sent to the successful applicants.


Characteristics of Nine Cooperative Agreement Communities Funded in 2010. Consistent with previous cohorts of CA-funded communities, the characteristics of the nine communities funded in 2010 vary by the governmental entity receiving the funding, the geographic location or catchment area served, and the diversity of the populations of focus. State mental health agencies are the recipients of the CA in Puerto Rico and Tennessee. County mental health agencies are the recipients in Los Angeles County, CA; Seminole County, FL; Saginaw County, MI; and Durham County, NC. A city agency is the recipient in Jacksonville, FL, and Tribal governments are the recipients for the Mescalero Apache Tribe, NM, and the Rosebud Sioux Tribe, SD. As in previously funded cohorts, communities are located in urban areas (Los Angeles, Jacksonville, Saginaw, Durham); suburban areas (Seminole County, part of metropolitan Orlando); multi-county, largely rural catchment areas (middle Tennessee); frontier/rural tribal communities (the Mescalero Apache and Rosebud Sioux tribes); and two small island communities in the Territory of Puerto Rico. The relative mix of populations of focus is similar to that of previously funded communities. This mix includes a particular focus on transition-age youth aged 16–21 (Durham); early childhood aged 0–5 (Los Angeles; middle Tennessee); and children and youth involved with child welfare or juvenile justice (Jacksonville and Seminole County). The tribal communities and the program in Puerto Rico have focused their services on children and youth living in poverty in generally underserved areas, and middle Tennessee plans to extend a special effort to serve children in military families who live near the military bases located in that area.


a. Background


The understanding of child and adolescent mental health disorders has improved significantly during the last two decades. As a result, the field is in a much better position today to estimate the extent to which mental health disorders occur in the population of children and adolescents at large; however, it is still likely that many children in need go undetected. With the estimate that at least 20% of children and youth under age 19 may require mental health services (U.S. Public Health Service Office of the Surgeon General [USPHS], 2001), applied to the roughly 80 million U.S. children and youth in this age group, one can estimate that at least 16 million children and youth are in need of some type of mental health service each year. As noted in Promotion and Prevention in Mental Health (Substance Abuse and Mental Health Services Administration [SAMHSA], 2007), half of all diagnosed mental illnesses begin by age 14, and three fourths begin by age 24. Given these conditions, the ability of child-serving providers to identify children in need of services in settings where children and youth are found, and to know how and where to direct their families to services, is essential. Increasingly, the need for the public health approaches of health promotion and prevention is being identified for mental health (Institute of Medicine [IOM], 2009). The role that education, child welfare, juvenile justice, primary care, substance abuse, daycare, and other settings can play in early identification is facilitated by collaboration across systems and by the awareness that providers in these settings have of the mental health needs of the children and youth they serve, as well as the services available to them.


Children and adolescents with serious emotional disturbance face challenges in many aspects of their daily lives. Generally, they present with a variety of diagnoses, they experience high rates of risk factors for mental illness, and they exhibit severe clinical symptoms and functional impairment (Manteuffel, Stephens, Brashears, Krivelyova, & Fisher, 2008). They are at greater risk for substance abuse disorders (Center for Mental Health Services [CMHS], 2001, 2003, 2004; Holden, 2003; Holden et al., 2003; Liao, Manteuffel, Paulic, & Sondheimer, 2001; SAMHSA, 2002), and have greater risk for negative encounters with the juvenile justice system (CMHS, 2001, 2003, 2004; Davis & Vander Stoep, 1997). Students with emotional disturbance fail more courses, earn lower grade point averages, miss more days of school, are retained at grade more than students with other disabilities, and have high dropout rates (Epstein, Nelson, Trout, & Mooney, 2005; U.S. Department of Education [DOE], 2001). Longitudinal research following samples into adulthood further supports assertions of high rates of poor long-term outcomes for these youth (Epstein, Kutash, & Duchnowski, 2005; Friedman, Kutash, & Duchnowski, 1996; Knapp, McCrone, Fombonne, Beecham and Wostear, 2002; Pumariega & Winters, 2003) who may have poor employment opportunities and who may experience periods of poverty in adulthood (National Advisory Mental Health Council Workgroup on Child and Adolescent Mental Health Intervention and Deployment, 2001). There is also the increased risk that youth with mental illness will not reach adulthood, as these youth are more likely to commit suicide than youth without mental illness (Centers for Disease Control and Prevention [CDC], 2007).


Although advances in the knowledge base over the last decade have illuminated continuing challenges in delivering services and meeting the needs of this population, service capacity has not kept pace with need (Friedman, 2002; Stroul, Pires, & Armstrong, 2001). It has been estimated previously that only 1 in 5 children with serious emotional disturbance receive the specialty services they need (Burns et al., 1995; DHHS, 1999; Shaffer et al., 1996). Latinos and the uninsured have especially high rates of unmet need relative to other children (DHHS, 1999; Kataoka, Zhang, & Wells, 2002). This underscores the need for the development of effective community-based care that is sensitive to and structured for the diverse cultures and impoverished families in individual communities (Hernandez & Isaacs, 1998; Isaacs-Shockley, Cross, Bazron, Dennis, & Benjamin, 1996), and that is available in even the most geographically remote communities in the country. The Federal Action Agenda states that expanding access to quality mental health care is one of the identified methods of system transformation (SAMHSA, 2005). Serving the needs of persons of diverse backgrounds requires culturally and linguistically competent providers, culturally competent treatments and practices, and cultural adaptations to provide efficacious and effective services (Whaley & Davis, 2007).


In 1984, in response to findings that children and families are most effectively served by community-based, family-driven, coordinated systems of care, the National Institute of Mental Health (NIMH) initiated the Child and Adolescent Service System Program (CASSP). Later administered by CMHS within SAMHSA, CASSP provided funds to promote the development of comprehensive and integrated service delivery systems for children with serious emotional disturbance through a system of care approach. The system of care program theory model, first articulated by Stroul and Friedman in 1986, proposes that agencies in various child-serving sectors, such as education, juvenile justice, mental health, and child welfare, work together to provide the wide array of services needed by children with serious emotional disturbance and their families. Built upon the CASSP philosophy that calls for services to be child-centered, family-driven, community-based, and culturally competent, the model emphasizes the need to (1) broaden the range of nonresidential community-based services, (2) strengthen case planning across child-serving sectors, and (3) increase case management capacity to ensure that services work together across sectors and providers.


The Patient Protection and Affordable Care Act (ACA) of 2010, which seeks to make health insurance coverage more affordable for individuals and families and the owners of small businesses, also addresses a variety of services that should be available for individuals with mental health and addiction needs (Health Reform: Overview of the Affordable Care Act, SAMHSA newsletter, May/June 2010, Volume 18, Number 3). The system of care approach is consistent with the vision for transformation in mental health services outlined in the ACA, which calls for enhancing community-based service options for individuals with a mental health or substance use condition, school-based health centers that will offer mental health and addiction services, coordination of primary and mental health care services, prevention, early identification, and funding for system transformation.


Under the ACA, many individuals and families previously ineligible for Medicaid or unable to obtain commercial insurance for, or because of, mental and substance use disorders will be covered by Medicaid or by commercial insurance obtained through employers, on the private market, or through the State health insurance exchanges. Estimates are that up to 32 million more people will become eligible for health insurance, of whom six to ten million will have significant untreated mental health and/or addiction conditions. Many of these individuals, including children and families, will be treated in primary care settings, utilizing referrals to treatment that will help prevent or offer recovery from significant disorders (SAMHSA, Justification of Estimates for Appropriations Committees, Fiscal Year 2011). The system of care approach works to increase access to such quality, evidence-based referral services.


b. The Comprehensive Community Mental Health Services for Children and Their Families Program (CMHI)


While the system of care approach provided a conceptual framework to meet the needs of children with serious emotional disturbance, funding to provide services at the local level was either sporadic or missing. In 1992, the Federal Government addressed this gap with the passage of the Children’s and Communities Mental Health Services Improvement Act (CMHI), which is part of the Alcohol, Drug Abuse and Mental Health Administration Reorganization Act (Public Health Service Act, Title V, Part E, Sections 561–565, as amended, Public Law 102-321, 42 U.S.C. 290ff). The Act was amended in 2000 to change the term of funding from 5 to 6 fiscal years (Public Law 106–310, Section 3105(c)). CMHI provides support through grants and cooperative agreements to States, political subdivisions within States, the District of Columbia, and territories to develop integrated home and community-based systems and supports for children and youth with serious emotional disturbances and their families. This funding encourages communities to develop and expand systems of care. The CMHI is the largest Federal commitment to children’s mental health to date, and through FY 2014 has provided over $1.9 billion to support system development in 173 communities in 50 States, 2 territories, the District of Columbia, and 22 tribes or tribal entities, including the 38 communities awarded grants in FY 2008 and FY 2009 and the nine awarded grants in FY 2010. The program is fully described in the grant Guidance for Applicants.


The goals of the CMHS program are to:


  • Expand community capacity to serve children and adolescents with serious emotional disturbances and their families.

  • Provide a broad array of accessible, clinically effective and fiscally-accountable services, treatments and supports.

  • Serve as a catalyst for broad-based, sustainable systemic change inclusive of policy reform and infrastructure development.

  • Create a case management team with an individualized service plan for each child.

  • Deliver culturally and linguistically competent services with special emphasis on racial, ethnic, linguistically diverse and other underrepresented, underserved or emergent cultural groups.

  • Implement full participation of families and youth in service planning; in the development, evaluation and sustainability of local services and supports; and in overall system transformation activities.


c. The Need for Evaluation


Section 565(c)(1) of the Public Health Service Act (PL 102–321) mandates annual evaluation activities. A basic requirement is documentation of the characteristics of the children and families served by the system of care initiative, the type and amount of services they receive, and the cost to serve them. Equally important is the need to assess whether the program was implemented, and services were experienced, as intended. It is also critical to assess whether the children served by the program experience improvement in clinical and functional outcomes, whether family life is improved, and whether improvements endure over time. Finally, policymakers and service providers need to know whether those outcomes can be reasonably attributed to the system of care initiative.


Further evaluation requirements under Section 565(c)(2) of PL 102–321 include:


  • Annual reports to the Secretary of Health and Human Services (HHS) that include a description of the number of children served, child demographic characteristics, types and costs of services provided, availability and use of third-party reimbursements, estimates of the unmet need for services within CA awardee jurisdictions, how the grant was expended to establish jurisdiction-wide systems of care, and other information as required by the Secretary.

  • Annual reports to Congress that provide information on longitudinal studies of outcomes of services provided by the funded systems of care, the effect of activities conducted under funded systems of care on the utilization of hospital and other institutional settings, barriers to and achievements in establishing interagency collaboration within systems of care, and parental assessment of the effectiveness of systems of care.


A government contractor (referred to as the National Evaluator throughout this document) coordinates data collection for the national evaluation and provides training and technical assistance to facilitate the collection of data by local-level evaluators. In turn, each CA awardee is required by the cooperative agreement to hire a minimum of two evaluation staff (or their full-time equivalents) to ensure that data collection is systematic and can be sustained through the funding period. In this partnership between the National Evaluator and local evaluators, the National Evaluator provides training and technical assistance regarding data collection and research design. In addition, the National Evaluator receives data from all CA awardees, monitors data quality, and provides feedback to CA awardees. The CA awardees help shape data collection procedures and provide feedback to the National Evaluator regarding successful approaches. This evaluation will first and foremost prepare data analyses for the national assessment of the program, but in doing so will make CA awardee-specific data available to the CA awardees to help meet their local evaluation needs.



2. PURPOSE AND USE OF THE INFORMATION


What follows is a description of the previously approved clearance, a summary of the revisions from the previously approved package, and a description of the uses of the information collected through the evaluation.


a. Previously Approved Clearance


Currently, data collection for the CMHI cross-site evaluation is operating under OMB clearance (OMB No. 0930-0307) valid until September 30, 2015. The national evaluation is designed to answer evaluation questions that have evolved over the last 18 years through development of the CMHI and feedback from system of care personnel and other partners and extend those mandated by the CMHI authorizing legislation. The legislation requires funded communities to participate in a national evaluation that assesses the number of children served, child and family characteristics, child and family outcomes, service utilization patterns, and system characteristics.


This evaluation will serve several purposes. It will fulfill the program’s legislatively mandated requirements for an annual report to Congress based on findings from a national evaluation of the program. In support of this purpose, it will (1) describe who is being served by the CMHS-funded systems of care; (2) show whether there are observable differences in child and family outcomes that can be plausibly linked to a faithful implementation of the system of care approach; (3) describe how children and families experience the service system and how they use services and supports (i.e., utilization patterns); (4) estimate the cost of serving children in systems of care and assess the cost-effectiveness of services; (5) illustrate the development of systems of care as they move toward offering integrated and comprehensive services; and (6) compare outcomes and service experience between a group of children, youth and families involved in one of three child-serving sectors and receiving services from CMHS-funded system of care communities and a similar group receiving services from non-funded communities. In addition, the evaluation will (7) provide data to CMHS and CA awardees to inform program implementation, improvement, and sustainability; and (8) support evaluation technical assistance activities to help CMHS best meet program goals.


The evaluation design includes participation among CA awardees in five core study components and one special study (a subsample of FY 2008-funded sites only) that employ both qualitative and quantitative methods to comprehensively examine the impact of the CMHS program. Longitudinal data collected from children and families in each cohort over a 24-month period provide an assessment of improvement in functional and behavioral outcomes over time, and satisfy the requirement of the Public Health Service Act, Title V, Part E, Section 565, Public Law 102-321, 42 U.S.C. 290ff–4(c) that information be collected on the longitudinal outcomes of services provided by the funded systems of care. The six currently approved study components, their associated instruments, and the purpose of data collection as it relates to the Public Law and to the Program Objectives stated in the Request for Applications (RFA) to which CA awardees applied are summarized in Table 1 below.


Table 1. Purpose of Currently OMB-Approved Cross-Site Evaluation Data Collection Instruments


STUDY AND ASSOCIATED INSTRUMENTS: Cross-Sectional Descriptive Study

  • Enrollment & Demographic Form (Web-based, record review)

  • Child Information Update Form

PURPOSE (Public Health Service Act, Title V, Part E, Public Law 102-321): The study addresses Section 565, 42 U.S.C. 290ff–4(c) that the public entity involved will annually submit to the Secretary a report on the activities of the entity under the grant that includes a description of the number and demographics of children provided access to systems of care.

PURPOSE (Program Objective): This study crosswalks with RFA requirement 2.1 that CA awardees serve children and/or adolescents with a serious emotional disturbance.



STUDY AND ASSOCIATED INSTRUMENTS: Child and Family Outcome Study and Service Experience Study

Caregiver Measures

  • Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C)

  • Caregiver Information Questionnaire, Revised (CIQ–R)

  • Caregiver Strain Questionnaire (CGSQ)

  • Child Behavior Checklist (CBCL)

  • Columbia Impairment Scale (CIS)

  • Education Questionnaire, Revision 2 (EQ–R2)

  • Living Situations Questionnaire (LSQ)

  • Multi-Service Sector Contacts Questionnaire, Revised (MSSC–R)

  • Culturally Competent Service Provision Questionnaire, Revised (CCSP–R)

  • Youth Services Survey for Families (YSS–F)

Caregivers of young children only:

  • Devereux Early Childhood Assessment (DECA)

  • Parenting Stress Index (PSI)

  • Preschool Behavioral and Emotional Rating Scale (PreBERS)

Youth Measures

  • Behavioral and Emotional Rating Scale (BERS–2)

  • Delinquency Survey (DS)

  • GAIN Quick—R: Substance Problems Scale (GAIN)

  • Revised Child Manifest Anxiety Scale, Second Edition (RCMAS–2)

  • Reynolds Adolescent Depression Scale, Second Edition (RADS–2)

  • Substance Use Survey (SUS-R)

  • Youth Information Questionnaire (YIQ–R)

  • Youth Services Survey (YSS)

PURPOSE (Public Health Service Act, Title V, Part E, Public Law 102–321): The study addresses Section 565, 42 U.S.C. 290ff–4(c) that evaluations assess the effectiveness of the system of care, including longitudinal studies of outcomes of services and the effectiveness of the system of care as assessed by parents.

PURPOSE (Program Objective): By assessing service experience, this study crosswalks with RFA requirement 2.2: Services Delivery. By assessing longitudinal outcomes, this study crosswalks with RFA requirement 2.5: Data Collection and Performance Measurement.




STUDY AND ASSOCIATED INSTRUMENTS: Services and Costs Study

  • Existing service utilization and cost data from agency management information systems and budgets, captured on an ongoing basis in the Services and Costs Tool provided by the national evaluation, and/or recoded and submitted to the national evaluation quarterly

  • Flexible fund expenditures, captured on an ongoing basis in the Flex Funds Tool provided by the national evaluation, and/or recoded and submitted to the national evaluation quarterly

PURPOSE (Public Health Service Act, Title V, Part E, Public Law 102-321): This study addresses Section 564, 42 U.S.C. 290ff–3(f) and Section 565, 42 U.S.C. 290ff–4(c) that the public entity annually submit to the Secretary a report that includes a description of the types and costs of services provided, the availability and use of third-party reimbursements, and estimates of the unmet need for services in the jurisdiction of the entity, and that evaluations assess the effect of activities on the utilization of hospital and other institutional settings.

PURPOSE (Program Objective): This study crosswalks with RFA requirement 2.4.1: Required Activities, that CA awardees develop financing approaches that promote provision of a cross-agency service delivery system, create flexible funds, and develop care review approaches that promote fiscal accountability. The study also crosswalks with RFA requirement 2.4.3: Sustainability, by providing data on the cost-effectiveness of systems of care.

STUDY AND ASSOCIATED INSTRUMENTS: System of Care Assessment

  • Semi-structured interviews with multiple stakeholders using the System of Care Assessment Interview Guides A–I, L–S

  • Review of randomly selected case records, document review, and follow-up telephone calls as needed

PURPOSE (Public Health Service Act, Title V, Part E, Public Law 102-321): This study addresses Section 564, 42 U.S.C. 290ff–3(f) and Section 565, 42 U.S.C. 290ff–4(c) that the evaluation assess barriers and achievements resulting from interagency collaboration in providing community-based services to children with a serious emotional disturbance, and that the public entity annually submit to the Secretary a report assessing the manner in which the grant has been expended toward the establishment of a jurisdiction-wide system of care.

PURPOSE (Program Objective): This study crosswalks with RFA requirements 2.4.1: Required Activities, 2.4.3: Sustainability, 2.4.4: System Development and Implementation Plan, and 2.5: Data Collection and Performance Measurement.



STUDY AND ASSOCIATED INSTRUMENTS: Sector and Comparison Study

  • Court Representative Questionnaire (CRQ)

  • Teacher Questionnaire (TQ)

  • School Administrator Questionnaire (SAQ)

  • Education Sector Caregiver Questionnaire (ESCQ)

PURPOSE (Public Health Service Act, Title V, Part E, Public Law 102–321): The study addresses Section 565, 42 U.S.C. 290ff–4(c) that evaluations assess the effectiveness of the system of care.



The national evaluation is driven by the system of care program theory. This program theory asserts that to serve children with serious emotional disturbance, service delivery systems need to offer a wide array of accessible, community-based service options that center on children’s individual needs, include the family in treatment planning and delivery, and are provided in a culturally and linguistically competent manner. An emphasis is placed on serving children in the least restrictive setting that is clinically appropriate. In addition, because many children with serious emotional disturbance use a variety of services and have contact with several child-serving agencies, service coordination and interagency collaboration are critical. The program theory holds that if services are provided in this manner, outcomes for children and families will be better than can be achieved in traditional service delivery systems.


To examine the system of care theory, the core studies of the national evaluation are designed to answer the following overarching questions:


  • Who are the children and families served by the program and by the funded communities? How do the characteristics of children and families who participate in systems of care differ? Does the served population change over time as systems of care mature?

  • How do systems of care develop according to system of care principles (e.g., family and youth involvement, cultural competence, interagency collaboration) over time? What are differences in the development of systems of care? In what ways does funding accelerate system development?

  • What is the degree to which each of the CA awardee communities has implemented, developed, and sustained its service system according to the system of care conceptual framework, based on the results of the System of Care Assessment Tool?

  • To what extent do children’s clinical and functional outcomes improve over time? How are family outcomes affected? What is the nature of change in child, family, and system outcomes? How are changes in child, family, and system outcomes associated with efforts to implement and develop systems of care?

  • What are the service utilization patterns (specific services, treatments, and supports) for children and families in systems of care, and what are the associated costs? In what ways do the services and supports that children and families receive differ? How cost-effective are systems of care, and does their cost-effectiveness change over time?

  • To what extent are children’s and families’ experiences consistent with the system of care philosophy? How satisfied are children and families with the services they receive? How well do CA awardee communities provide a broad array of services in a cultural context that is most appropriate for the child and the family and that ensures a full partnership with families? How effective are specific services, treatments, or supports in producing positive outcomes for children and families?

  • Are there subgroups of children and families for whom a system of care is more effective?

  • To what extent do CA awardee communities receive technical assistance to implement the evaluation appropriately? How frequently is feedback provided to local CA awardee communities on the status of data collection and on findings of the evaluation?

  • To what degree are systems of care effective in producing positive outcomes for children and families?


These evaluation questions evolved over the last 19 years through development of the CMHI and feedback from system of care personnel and other partners and extend those mandated by the CMHI authorizing legislation. The legislation requires funded communities to participate in a national evaluation that assesses the number of children served, child and family characteristics, child and family outcomes, service utilization patterns, and system characteristics.


b. Changes to Be Made


The previously approved Phase VI evaluation is composed of six core study components: (1) the System of Care Assessment that documents the development of systems of care through site visits conducted every 12–18 months; (2) the Cross-Sectional Descriptive Study that collects descriptive data on all children and families who enter the CMHS-funded systems of care throughout the funding period; (3) the Child and Family Outcome Study that collects data longitudinally on child clinical and functional status and family outcomes; (4) the Service Experience Study that collects data on family experience and satisfaction with services from a sample of children and families; (5) the Services and Costs Study that assesses the costs and cost-effectiveness of system of care services; and (6) the Sustainability Study. It also included three special studies: the Alumni Networking Study, the Continuous Quality Improvement (CQI) Initiative Evaluation, and the Sector and Comparison Study. Earlier revisions eliminated one of the core studies, the Sustainability Study, and two of the special studies: the Alumni Networking Study and the CQI Initiative Evaluation.


This revision requests the elimination of the Sector and Comparison Study. The eliminated studies have provided data to the program and are no longer needed. The Sustainability Study was implemented originally to collect data for a long-term Government Performance and Results Act (GPRA) outcome measure of sustainability 5 years post-funding. This long-term outcome measure is no longer in effect. Data were also collected to assess preparedness for sustainability at critical changes in the match requirement and at the end of funding for comparison to the long-term assessment. The first assessments for the Alumni Networking Study and the CQI Initiative Evaluation were completed; additional assessment is not needed. The Phase VI Special Study consisted of the Sector and Comparison Study, which was conducted with a subsample of the FY 2008-funded CA awardees and compared the outcomes of children and families involved in a specific child-serving sector (i.e., child welfare, juvenile justice, special education) and receiving services from agencies in funded systems of care with those of a similar group of children and families receiving services from agencies outside of funded systems of care. These data collections are therefore also no longer required.


c. Uses of Information Collected Through the CMHI Evaluation


CMHI Evaluation data and reports have been, and will continue to be, used by multiple stakeholders, including SAMHSA, CMHS Directors, and Grant Project Officers (GPOs), CA awardees, the practice community, and the research community.


SAMHSA


As with findings from Phases I, II, III, IV, and V, SAMHSA will be able to use the results from the Phase VI evaluation to


  • determine whether CA awardees implement their programs according to program-specific requirements and whether fidelity of program implementation is associated with child and family outcomes.

  • develop policies and provide guidance regarding the development of systems of care.

  • enhance other CMHS programs that support system development (e.g., Projects for Assistance in Transition from Homelessness, Community Mental Health Services Block Grants, Cooperative Agreements for State-Sponsored Youth Suicide Prevention and Early Intervention, Mental Health Transformation State Incentive Grants, and the National Registry of Evidence-Based Programs and Practices program).

  • support the many partners that work in collaboration with CMHS, including the National Federation of Families for Children’s Mental Health and the National Mental Health Association in their national efforts to help build systems of care for children's mental health services.

  • fulfill the program’s reporting requirement of an annual report to Congress based on findings from a national evaluation of the program, as mandated by the program’s authorizing legislation.


In addition, in 2014, to guide its work through 2018, SAMHSA identified six strategic initiatives with input from stakeholders including Federal, state, and local leaders; constituency groups; advisory council members; members of Congress; people in recovery; and family members. These initiatives are designed to focus SAMHSA’s work on improving lives and capitalizing on emerging opportunities. In particular, the CMHI evaluation responds to the following three strategic initiatives:


  • Recovery Support: SAMHSA is taking the lead on promoting recovery-oriented service systems and peer support for individuals with or in recovery from mental and substance use disorders. Thus, one of the six Strategic Initiatives—“Recovery Support”—is designed:


“to partner with people in recovery from mental and substance use disorders and family members to guide the behavioral health system and promote individual-, program-, and system-level approaches that foster health and resilience; increase permanent housing, employment, education, and other necessary supports; and reduce discriminatory barriers.”


The “Recovery Support” strategic initiative includes four goals with embedded objectives and action steps. Of those, the CMHI program and data collection associated with Phase VI of the CMHI evaluation contribute most specifically to the following:

  • Engaging individuals in recovery and their families in self-directed care, shared decision-making, and person-centered planning.

  • Ensuring that permanent housing and supportive services are available for individuals with or in recovery from mental and substance use disorders.

  • Increasing gainful employment and educational opportunities, while decreasing legal and policy barriers, for individuals in recovery with mental and substance use disorders.


  • Data, Outcomes and Quality Initiative: SAMHSA has highlighted the importance of supporting programming decisions with high quality data and of transparency in these decisions by making data readily available to the public. The objective of the initiative is:


“to realize an integrated data strategy that informs policy and measures program impact leading to improved quality of services and outcomes for individuals, families and communities.”


The initiative includes four goals with embedded objectives and action steps. Of those, the CMHI evaluation is guided by the following:

  • Improving the quality of SAMHSA’s program evaluations and services research.

  • Improving the quality and accessibility of surveillance, outcome/performance, and evaluation information for staff, stakeholders, funders, and policy makers.


  • Trauma and Justice Initiative: SAMHSA is one of the leading agencies addressing the impact of trauma on individuals, families and communities across the country. Thus, one of the six Strategic Initiatives—“Trauma and Justice”—is designed:


“to focus programmatic efforts on the goal of reducing the pervasive, harmful, and costly health impact of violence and trauma by integrating trauma-informed approaches throughout health and behavioral health care systems and by diverting people with substance use and mental disorders from criminal and juvenile justice systems into trauma-informed treatment and recovery.”


The “Trauma and Justice” strategic initiative includes five goals with embedded objectives and action steps. Of those, the CMHI program and data collection associated with the CMHI evaluation contribute most specifically to the following:

  • Reducing the impact of trauma.

  • Supporting programs to address trauma experienced in childhood.

  • Improving the availability of trauma-informed care.


In sum, in its design and through its established priorities and data collection approach, Phase VI of the CMHI evaluation, as in other phases of the evaluation, will provide data that will allow SAMHSA to assess and illustrate the ways in which, as well as the extent to which, the CMHI program has achieved goals in the areas of urgency and opportunity as outlined in SAMHSA’s Strategic Initiatives.


CMHS Leadership


CMHS leadership has been, and will continue to be, able to use CMHI evaluation data reported by CA awardees to determine whether funded activities are progressing as expected and to keep abreast of any issues that CA awardees are having related to carrying out their proposed activities. GPOs may also use the information to connect CA awardees who are conducting similar activities or serving comparable populations to facilitate collaboration across the CMHI.


In addition, the design for the CMHI evaluation provides for data collection, summarization, analysis, and reporting that can be used to address the following SAMHSA/CMHS priorities:


  • Accountability: The evaluation was designed to support SAMHSA/CMHS legislatively-mandated reporting requirements. Findings from the evaluation have been, and will continue to be, used to fulfill the legislatively mandated requirements for annual reports to the Secretary and to Congress. Information to be reported includes the following:

  • Description of the number of children provided access to systems of care.

  • Demographic characteristics of the children.

  • Types and costs of services provided.

  • Availability and use of third-party reimbursements.

  • Estimates of the unmet need for such services within grantee jurisdictions.

  • How the grant has been expended to establish a jurisdiction-wide system of care for children with a serious emotional disturbance.

  • Assessments of effectiveness of systems of care that examine longitudinal and other studies of outcomes, the effect of systems of care on the utilization of hospitals and other institutional settings, barriers to and achievements from interagency collaboration in providing community-based services, and parent or caregiver assessments of effectiveness.

  • Other information as may be required.


  • Program and policy planning. Findings from the evaluation inform both intra- and interagency discussion and decision-making for program and policy planning. The evaluation provides the most comprehensive data available about children with serious emotional disturbances and their long-term outcomes, and is therefore frequently drawn on to fill often urgent requests for information received by SAMHSA from the Secretary and other Federal child-serving entities within and beyond HHS (e.g., the Administration for Children and Families, the Department of Education, the Department of Justice) about the characteristics and long-term outcomes of children and youth with serious mental health concerns, and subgroups of these children and youth such as those who are involved with child welfare, juvenile justice, or education services; are at risk of suicide; are lesbian, gay, bisexual or transgender; have experienced trauma or bullying; have misused prescription drugs; or have co-occurring disorders.


  • Quality Improvement: Mechanisms for reporting useful data profiles, summaries, and/or reports have been developed in previous phases of the evaluation and will continue to be used in Phase VI to support quality improvement activities for clinical interventions, other products, and training/dissemination efforts, and to serve as an incentive for data collection by data providers.


  • Program justification purposes: Program justification requires indicators not only of the effectiveness of activities and products in the abstract or in the published literature, but also of wide distribution and actual uptake of the activities and products, and evidence that they are effective, cost-effective and sustainable in communities throughout the country.


CA Awardees


Findings from the evaluation have been and will continue to be used by CA awardees to


  • improve the implementation of their systems of care and achieve the goals of the CMHI;

  • improve their services, and support their efforts to obtain required matching funds and to sustain their system of care after the CMHI funding has ended. Indeed, several CA awardees have used data collected for the Phase I, II, III, IV and V studies to request additional funding from their State legislatures;

  • plan culturally competent services and supports which families and youth report as useful and that are associated with improved child, youth, and family outcomes;

  • learn what barriers children or youth and their families perceive and work to eliminate such barriers;

  • learn whether families experience services as the CA awardees intended and identify their programs’ strengths and weaknesses;

  • identify gaps in system development and barriers to collaboration, and more effectively allocate personnel and funding and prioritize activities.


Research Community


The research community, particularly the field of children’s mental health services research, will profit in a number of ways. First, evaluation of the CMHI will add significantly to the developing research base about systems of care. Second, the focus on child, family, and system outcomes will allow researchers to examine and understand the specific ways children improve, how services can be enhanced, and the importance of adherence to service plans. Moreover, the relationship among these variables will be better understood. Finally, the analysis of evaluation data will aid researchers in formulating new questions about systems of care and specific services, and will help both service providers and researchers improve the delivery of children’s mental health services. Data collected from the national evaluation have contributed to more than 750 publications and presentations.


Summary


The CMHI evaluation data and related reports produced will be useful to SAMHSA, CMHS GPOs and leadership, CA awardees and the research community. The level of evidence provided by the evaluation about program implementation and outcomes has enabled communities to use evaluation data to track activities funded by their CMHI CAs, provide summary reports to their local steering committees or other advisory boards, support statewide expansion efforts, develop interagency partnerships, and obtain resources to sustain systems with interagency agreements.


At all levels of government—Federal, state, and local—and in the private sector, decisions are being made that are dramatically changing the lives of children and families. To make these decisions in a responsible way, policymakers, communities, and other stakeholders need information such as the data and findings to be produced by the CMHI Evaluation.



3. USE OF IMPROVED INFORMATION TECHNOLOGY


The National Evaluator has provided software for computer-assisted personal interviewing (CAPI) to CA awardees. Across all study components, approximately 90 percent of total responses, based on our most recent assessment of previous use, will be obtained electronically by CAPI or Web survey.


Data from the Cross-Sectional Descriptive Study, Child and Family Outcome Study, and Service Experience Study are managed using an integrated Internet-based data input, management, and dissemination system—the interactive-collaborative network (ICN). The ICN, which was introduced in Phase III and refined in Phases IV, V, and VI of the national evaluation, reduces evaluation burden for the sites and allows real-time access to data for site personnel and National Evaluation Team members. The ICN is designed to capture the specific data collected by the national evaluation to meet the reporting requirements of the CMHI’s authorizing legislation. The system serves as a mechanism for communicating about data quality, and evaluation activities and results.


The ICN was designed as a three-part system that allows systematic data input, immediate validation to identify data input flaws, and monitoring of data entry and evaluation in real time. It reduces processing time and provides the capability of creating interactive reports. The ICN is a completely secure system that ensures privacy through the provision of different levels of password-protected access to site and national data.


  • Data Input. The data entry software allows sites with available laptop computers the option of CAPI interviewing by entering the participant’s responses directly into the data entry package during the interview. The software allows rapid data entry off-line, and the Internet is used to transfer data from local sites to the national database. Specific descriptive information on Cross-Sectional Descriptive Study participants is entered directly into the ICN Web site. This Web-based data entry is designed to be used by intake workers or case managers, who are often located at various agencies rather than at a central evaluation office. The primary goal of this Web-based data entry is to maximize the capture of descriptive information on all children served in system of care programs while eliminating burden associated with the Cross-Sectional Descriptive Study. Finally, for the Services and Costs Study, the National Evaluator has developed the Services and Costs Tool. This Web-based data collection application is designed to create a child-level data record for each system of care service received by children/youth. CA communities have the option to key in data in any of the service module fields or to upload an extract file representing the same data. The application features validation checks for quality assurance, preset response categories, secure access authorization for multiple persons within each community, and multiple automated reports (an illustrative sketch of this type of validation check appears after this list).

  • Data Monitoring, Management and Dissemination. Software allows the National Evaluator and CMHS to monitor the status of each site’s data submissions in real time and permits sites to check the status of their own data submissions. Reporting features support sites’ abilities to use their data for quality assurance monitoring and system improvement purposes. Basic validations are completed during the data entry process. Each month, communities receive reports that detail any potential data errors or issues. The National Evaluator has automated these reports, such that communities have real-time, on-demand access to them. These features are available to Phase VI communities that have started data collection. Reports posted on the ICN provide a vehicle for the review of aggregate data that CMHS has approved for public release. For example, Data Profile Reports, created three times per year, display a summary of child- and family-level descriptive and outcome data collected at the community and aggregate levels.
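To illustrate the type of front-end validation check described above, the following minimal sketch (written in Python for illustration) shows how a hypothetical child-level service record might be screened before submission. The field names, response categories, and rules shown here are assumptions made for illustration only; they do not represent the actual ICN or Services and Costs Tool implementation.

```python
from datetime import date

# Hypothetical preset response categories; the actual ICN/Services and
# Costs Tool categories may differ.
VALID_SERVICE_TYPES = {"case management", "outpatient therapy", "respite care"}
VALID_FUNDING_SOURCES = {"CMHI grant", "Medicaid", "third-party", "flexible funds"}


def validate_service_record(record: dict) -> list:
    """Return a list of validation errors for one child-level service record."""
    errors = []

    # Required fields must be present and non-empty.
    for field in ("child_id", "service_type", "service_date", "units", "cost"):
        if record.get(field) in (None, ""):
            errors.append("Missing required field: " + field)

    # Categorical fields must match the preset response categories.
    if record.get("service_type") not in VALID_SERVICE_TYPES:
        errors.append("Unknown service type: " + str(record.get("service_type")))
    if record.get("funding_source") not in VALID_FUNDING_SOURCES:
        errors.append("Unknown funding source: " + str(record.get("funding_source")))

    # Basic range and logic checks.
    if isinstance(record.get("units"), (int, float)) and record["units"] <= 0:
        errors.append("Units of service must be positive")
    if isinstance(record.get("cost"), (int, float)) and record["cost"] < 0:
        errors.append("Cost cannot be negative")
    if isinstance(record.get("service_date"), date) and record["service_date"] > date.today():
        errors.append("Service date cannot be in the future")

    return errors


# Example: a record with an unrecognized service type is flagged before upload.
example_record = {
    "child_id": "A-001",
    "service_type": "wraparound",  # not in the hypothetical preset list above
    "service_date": date(2015, 3, 2),
    "units": 2,
    "cost": 150.00,
    "funding_source": "CMHI grant",
}
print(validate_service_record(example_record))  # ['Unknown service type: wraparound']
```

In practice, checks of this kind allow errors to be flagged at the point of entry rather than in the monthly data quality reports described above.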


The National Evaluator will provide training and direct evaluation technical assistance support to sites to facilitate the implementation of the evaluation protocol and the use of evaluation results at the site level. Site personnel will be trained to utilize the ICN at national training meetings and during evaluation technical assistance visits to the sites.


The following CMHI evaluation surveys and forms are Web-based for Phase VI of the evaluation:


  • The Services and Costs Study Tool (Web-based data collection application)

  • Enrollment and Demographic Information Form (Web-based form)


The use of Web-based surveys and forms decreases respondent burden, as compared to alternative methods such as a paper format, by allowing for direct transmission of the survey or form. In addition, the data entry and quality control mechanisms built into the Web-based format reduce errors that might otherwise require follow-up, further reducing burden relative to a hard-copy administration. As well, respondents can complete the survey at a time and location that is convenient for them. The national evaluation’s development of the Services and Costs Study Tool has also minimized communities’ need to develop their own systems locally and the costs of such development.


All of the Web-based surveys associated with the evaluation recruit respondents to participate through an e-mail invitation. The e-mail process occurs in four stages: (1) an advance invitation to participate, (2) a formal invitation, which includes the Web site’s URL and unique user name and password, (3) a reminder to all respondents, and (4) a final targeted reminder to nonresponders and those who have only partially completed the survey.
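As an illustration of this staged sequence, the sketch below (again in Python, for illustration) shows one hypothetical rule for selecting a respondent’s next recruitment e-mail based on the stages already sent and the respondent’s survey status. The stage names and status values are assumptions for illustration and are not drawn from the actual recruitment system.

```python
from typing import Optional


def next_email(stage_sent: int, status: str) -> Optional[str]:
    """Hypothetical rule for selecting the next recruitment e-mail.

    stage_sent: highest stage already sent (0 = none, 1 = advance invitation,
                2 = formal invitation with URL and credentials, 3 = reminder).
    status: one of "not_started", "partial", or "complete".
    """
    if status == "complete":
        return None  # no further contact is needed
    if stage_sent == 0:
        return "advance invitation"
    if stage_sent == 1:
        return "formal invitation (Web site URL, user name, and password)"
    if stage_sent == 2:
        return "reminder to all respondents"
    if stage_sent == 3:
        return "final targeted reminder (nonresponders and partial completers)"
    return None  # sequence exhausted


# Example: a respondent who received the reminder but has only partially
# completed the survey receives the final targeted reminder.
print(next_email(stage_sent=3, status="partial"))
```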


Finally, SAMHSA and its contractors strive to ensure that all Web-based solutions are fully compliant with Section 508 of the Rehabilitation Act. This includes ensuring that all posted documents are compliant or have a compliant alternative. The National Evaluator utilizes Adobe products that are capable of producing compliant PDF files per the SAMHSA-recommended process. The National Evaluator has a thorough knowledge of Section 508 standards and employs accessibility specialists with experience in Section 508 compliance verification, including assessment with a variety of assistive technologies such as screen readers, screen magnifiers, and voice recognition software.



4. EFFORTS TO IDENTIFY DUPLICATION


This evaluation generates data that have not previously been collected, that have been collected only minimally in the field of children’s mental health services, or that have been collected only by the CMHI cross-site evaluation in the past. This includes information on access to quality, evidence-based care for children, youth, and their families and disparities in access to care by demographic groups, including a comparison of access to care within and outside of the CMHI; the process of developing, disseminating, and implementing evidence-based practices (EBPs) for children, youth and their families; and the national impact of the CMHI. As well, data from the five core studies, which include information on who receives system of care services, the types of services they receive, and the outcomes related to receipt of these services, are collected in a systematic manner that yields more extensive, detailed, and consistent information than has previously been obtained.


The National Evaluator also conducted an extensive literature search to identify existing evaluation research on systems of care and children’s mental health services. The search included a review of published literature, unpublished papers, works-in-progress, and working papers and documents. During the implementation of the Phase I–V evaluations, the National Evaluator has kept abreast of the literature in children’s mental health services research and has been in close contact with the original CA awardees. This has allowed the team to keep up with advances in practice and research. In addition, the Services Evaluation Committee for the national evaluation has helped keep the evaluation apprised of innovations in the field. These efforts yielded a broad list of useful references. While some of the research identified contains features similar to the planned evaluation, the scope of the research projects varies considerably and is driven by the particular research interests of each investigator. The Phase VI evaluation offers unique contributions to the field not available in these other studies.


Phase VI does not duplicate extant studies, but instead enhances the existing knowledge base. In addition, Phase VI provides information that is specific to this service program. As required by the legislation, data must be collected from the communities in which the program has been funded. Existing research and data in the area of children’s mental health services are not sufficient to address the questions posed in this evaluation. For questions related specifically to the functioning and impact of the CMHI, the evaluation has served, and will continue to serve, as a primary mechanism through which the CMHI will be understood, improved, and sustained.


The data collected under Phase VI of the national evaluation are not available in other Federal databases, nor are they collected through TRAC.



5. INVOLVEMENT OF SMALL ENTITIES


Some of the data for this evaluation will be collected from mental health, juvenile justice, education, and child welfare agencies. While most data will be collected from public agencies, it is possible that some organizations providing services to the target population, such as community-based organizations, not-for-profit agencies, private providers, schools, or parent groups, would qualify as small entities. The information requested is the minimum required to meet the study objectives.



6. CONSEQUENCES IF INFORMATION IS COLLECTED LESS FREQUENTLY


Below is a summary of the consequences if the CMHI Evaluation information is collected less frequently, organized by individual studies that all currently have OMB approval.


System of Care Assessment. Data for this component have been collected every 18–24 months across the 6 years of system of care community funding (beginning in the second year), documenting how the program has led to system enhancement. This information is key to examining whether improved outcomes for the children served by the system can be plausibly linked to this initiative. Because systems of care change slowly, collection of system data every 18–24 months is sufficient to provide information on system implementation, organizational involvement, and relationships. If these data were collected less frequently, important interim changes would not be documented. For this request, data will be collected for two additional 18-month follow-ups.


Cross-Sectional Descriptive Study. Data for this component will be collected when children and families first access the system of care. These data elements are maintained by the CA awardees for their own administrative purposes; hence their collection creates no additional respondent burden. For families participating in the Child and Family Outcome Study, however, the descriptive information that may have changed over time (e.g., family income, caregiver’s marital status) will be collected at each follow-up data collection point. Failure to collect these few data elements at follow-up would preclude the detection of key changes in the child’s environment that could have an important impact on the child’s clinical outcomes, service use, or family functioning. Data from the CA awardee sites will be submitted to the National Evaluator continuously using the ICN, resulting in a minimal burden to site staff.


Child and Family Outcome Study and Service Experience. For this component, data will be collected at intake and every 6 months for the length of the evaluation, up to 24 months. Clinicians who work with this population of children suggest that once children enter services, they are likely to experience detectable improvements within the first 6 months of services. However, whether improvement is sustained is important to demonstrate. Assessing outcomes every 6 months allows for the study of the course of improvement over time so that interventions can be planned for times that are likely to yield the greatest gains. Thus, waiting 12 months to collect outcome data would miss important changes that are likely to happen in children who are still developing. On the other hand, it was the judgment of the Research Advisory Board and prior CA awardees that quarterly data collection would be too burdensome.


Services and Costs Study. Data used in this study come from communities’ management information systems (MISs) and are used to assess all services received by children and their families and the associated costs. These data are episodic in nature, and not collecting information on all episodes of services would result in underreporting of service utilization and underestimating the service costs incurred by children and families. Without collecting services and costs data from the beginning of service delivery, within a consistent data structure across all grant communities, the ability to accomplish these study goals is seriously diminished. SAMHSA is often asked to demonstrate the cost-effectiveness of this grant program. Without complete and consistent data from all communities, the validity of these types of cost analyses would be compromised.



7. CONSISTENCY WITH GUIDELINES IN 5 CFR 1320.5(d)(2)


The data collection fully complies with the requirements of 5 CFR 1320.5(d)(2).



8. CONSULTATION OUTSIDE THE AGENCY


Federal Register Notice


The notice in the Federal Register was published by SAMHSA on January 21, 2015 (Vol. 80, p. 2953) to solicit public comment on this study. No comments were received.


Consultation Outside of the Agency


Consultation on the design, instrumentation, data availability and products, and statistical aspects of the evaluation occurred continually throughout the implementation of Phases I, II, III, IV, and V. To capitalize on the experience and knowledge gained, the development of Phase VI was based, in part, on this consultation. Since the beginning of this initiative, consultations have been sought from the following:


  • Federal representatives working in related program areas

  • Experts in the area of child mental health services research

  • CMHS CA awardees

  • Families caring for children with emotional and behavioral disorders

  • Representatives of national organizations for children, families, and providers in the field (e.g., National Technical Assistance Center for Children’s Mental Health, National Mental Health Association, the National Federation of Families for Children’s Mental Health, National Alliance for the Mentally Ill, State Mental Health Representatives for Children and Youth)

  • Experts in program evaluation, measurement, and statistical analysis

  • Experts in Web site usability testing

  • Experts in mental health service systems for Native American children


These consultations had several purposes: (1) to ensure continued coordination of related activities, especially at the Federal level; (2) to ensure the rigor of the evaluation design, the proper implementation of the design, and the technical soundness of study results; (3) to verify the relevance and accessibility of the data to be collected; and (4) to minimize respondent burden.


a. Federal Consultation



Input from representatives of Federal agencies involved in children’s mental health issues has been elicited throughout all phases of the national evaluation. CMHS received input about its children’s services program from Federal offices including, but not limited to, the following: the Office of Special Education Programs, DoE; the Office of Juvenile Justice and Delinquency Prevention, DoJ; the Office of Disability, DHHS; and the Division of Adolescent and School Health, CDC. (See Attachment A.1.a-c for a list of the participants in the Federal/National Partnership for Children’s Mental Health and their affiliations and telephone numbers.) Specifically, representatives from the listed Federal agencies have convened to develop strategies for coordinated training, technical assistance, and culturally competent services to communities across the country.


In addition, SAMHSA, the parent agency of CMHS, requires that its other two constituent centers, the Center for Substance Abuse Treatment (CSAT) and the Center for Substance Abuse Prevention (CSAP), conduct an internal review of the Annual Report to Congress on the Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program. Evaluation specialists at the CDC, NIMH, and the Office of the Assistant Secretary for Planning and Evaluation (ASPE) of DHHS have also reviewed and provided comments on the national evaluation. Furthermore, NIMH has been represented on the Services Evaluation Committee of the national evaluation by various individuals over the past several years, including most recently Beverly Pringle, David Chambers, and Carmen Moten. (See Attachment A.2 for a list of Methodological Consultants and Services Evaluation Committee).


b. Expert Consultation


The Services Evaluation Committee of the national evaluation, a workgroup of expert consultants, was organized to provide technical guidance and review for Phase I of the evaluation. The Services Evaluation Committee continued to have input regarding the enhanced design and instrumentation for Phases II, III, IV, and V. Recommendations made by this group influenced changes applied to the Phase VI instrumentation. Services Evaluation Committee members have combined expertise in children’s mental health, the delivery of children’s mental health services, and the evaluation of systems of care. (See Attachment A.2 for a list of Services Evaluation Committee members.)


c. Cooperative Agreement Awardee Consultation



Previously funded CA awardees have been key providers of input for all phases of the evaluation design. For the design of Phase VI, CA awardee input was used in the development of the instrument package. In October 2008, project directors and evaluators from previously funded sites participated in the Phase VI Evaluation Review Meeting, where study design and instrumentation were discussed. These participants helped determine the instruments most appropriate for each component of the evaluation. Additional input from CA awardees was also received by the National Evaluator through conference calls, site visits, semi-annual workshops and evaluator meetings, close-out visits in which evaluation processes and data utilization were reviewed, and CA awardee participation on the Services Evaluation Committee.


d. Family Consultation


Caregivers participated on the Services Evaluation Committee and gave early input to the overall design. Caregivers also reviewed the instrumentation and key features of the evaluation design to ensure sensitivity to parent issues and concerns as well as to maximize clarity of meaning and to assess feasibility of administering the questionnaires. CA awardee sites systematically solicited feedback from family members; hence the family perspective was also included in comments and consultation from CA awardee sites.



9. PAYMENT TO RESPONDENTS


As with previous phases, Phase VI of the national evaluation will use a research-based approach to evaluation and, as such, will require participation of children and families beyond their receipt of services in their system of care programs. Children and caregivers are provided with stipends to offset the costs of participation in the study, such as transportation to and from the location of the interviews, to compensate for the time burden associated with completing study instruments, and to reduce potential nonresponse bias.


Remuneration levels in the System of Care Assessment and the Child and Family Outcome Study/Service Experience Study are the same as those currently approved in Phase VI.


System of Care Assessment. Three caregivers of children who receive services in each system of care community are interviewed during each System of Care Assessment site visit. The national evaluation will provide a payment of $25 to them at the time of their interviews in compensation for the additional burden and potential inconvenience of these interviews. Two youth participants in each system of care community are interviewed during each System of Care Assessment site visit. The national evaluation will provide a payment of $15 to them at the time of their interviews in compensation for the additional burden and potential inconvenience of these interviews.


Child and Family Outcome Study/Service Experience Study. The National Evaluator strongly recommends that CA awardees remunerate respondents who participate in the Child and Family Outcome Study/Service Experience Study $20 each for caregivers and youth at each administration. Remuneration is essential to help maximize participation rates, particularly given the additional time being asked of families who already face multiple challenges and demands on their time in caring for their children with serious emotional disturbance. Completing the instruments at the time of entry to services and at subsequent follow-up points requires the evaluation participants to spend time away from other activities and creates a burden for the caregivers and children that exceeds the burden that would ordinarily be placed on them if they were seeking services not associated with this evaluation.



10. ASSURANCE OF CONFIDENTIALITY


Phase VI requires collecting descriptive and outcomes data from children and families. In all the CA awardee sites, data are collected by site staff. These staff members are responsible for developing procedures to protect the privacy of all participants in the evaluation data collection, storage of data, and reporting of all information obtained through data collection activities. These procedures include limiting the number of individuals who have access to identifying information, using locked files to store hardcopy forms, assigning unique code numbers to each participant to ensure anonymity, and implementing guidelines pertaining to data reporting and dissemination.


Because of the sensitivity of the information that will be collected, CMHS will require that all CA awardees establish a system whereby data are gathered, stored, and accessed in a manner that protects the information as much as possible. The National Evaluator will provide each CA awardee with a coding schema that each site will use to generate code numbers to assign to individual respondents, and will train staff responsible for data collection on the process of developing codes and linking them to individual respondents. Sites will be instructed to maintain a list of the codes and their assignment to individual respondents. A secure, stand-alone software application that allows site evaluation staff to store codes with respondent names will also be provided to sites. This application is password protected, and sites will be instructed to limit access to the database to only those onsite evaluation staff who need access to this information. If a paper list is maintained, the list linking the assigned codes to respondent names will be kept in a locked cabinet and only the onsite data collection staff will have access to the list. The database or list will be maintained for the duration of the CMHS program. The purpose of maintaining the list for this period of time is to ensure that the data can be linked back to the identified child and family throughout the data collection process. When the project is completed, the databases or lists will be destroyed. This coding system was developed to facilitate the tracking of children during their involvement with the evaluation and to ensure that no personal identifying information from the CA awardee sites would need to be made available to either the National Evaluator or CMHS.


The security of data entered and managed on the Internet-based ICN also will be assured. Access to the ICN will be password protected, and the ICN will use data encryption to further enhance security and privacy. Further, the project including the ICN system will operate under an ADP/IT security plan approved by CMHS to assure that project data are protected.


Each CA awardee will develop and implement an active consent procedure that informs the participants of the purpose of the evaluation, describes what their participation entails, and addresses how privacy will be maintained as described above. Informed assent will be obtained from participating older children and adolescents (aged 11–17 years). In addition, informed consent will be obtained from adolescents who have reached the age of 18 at follow-up data collection. Written informed consent or assent will be obtained from children and families at the point of entry into services. Each CA awardee will obtain local Institutional Review Board (IRB) approval for the informed consent or assent procedures used in this evaluation. CA awardees are instructed to determine whether updates to consents are required at each data collection point, since the legal custody of a child may change, a child may become old enough to participate in a youth interview, a youth may become an emancipated minor or age up into adult status, and local IRBs may have requirements for regular updates.


As in previous phases of the national evaluation, to further protect study participants for Phase VI, the National Evaluator has obtained a Federal Certificate of Confidentiality, authorized by Section 301(d) of the Public Health Service Act, as well as IRB approval within ICF Macro for the following studies: System of Care Assessment and Services and Costs Study. All CA awardees will also obtain a Certificate of Confidentiality. This certificate provides additional protections of the data from civil and criminal subpoena. Additionally, the National Evaluator will conform to all requirements of the Privacy Act of 1974, under the System of Records: Alcohol, Drug, and Mental Health Epidemiological, and Biometric Research Data, DHHS, #09–30–0036; the most recent publication in the Federal Register occurred on January 19, 1999 (64 FR 2914). Client records at the sites are also covered under this Privacy Act System of Records.


System of Care Assessment. Data collection for the System of Care Assessment will occur via face-to-face interviews. Because respondents’ identities will be known, to ensure that participants’ rights are protected, an active informed consent process will occur. (See Attachment B.5 for informed consent forms.)


Services and Costs Study. The national evaluation trains all grant communities to include specific language in their consent and assent forms to describe the services and costs data that will be accessed through the child/youth’s records and shared with the national evaluation. Although grant communities may work with personal identifying information to extract and link electronic records, no personally identifying information will be included in any data transferred to the national evaluation for this study, other than the child/youth’s national evaluation child identification number.


For those communities electing to enter data in the Flex Funds Tool or the Services and Costs Data Tool, data in these applications are password protected to ensure privacy. When data are transferred to the national evaluation, data files will be encrypted to protect the information during electronic transfer. No child identifying information will be included in these data files other than the child/youth’s national evaluation child identification number.



11. QUESTIONS OF A SENSITIVE NATURE


Because this project concerns services to children with serious emotional disturbance and their families, it is necessary to ask questions that are potentially sensitive. However, only information that is central to the study is being sought. Questions address dimensions such as child emotions, behavior, social functioning, school performance, substance use, and involvement in unlawful activities. Questions are also asked about the child’s experience with sexual and physical abuse and suicidality. The answers to these questions will be used to determine baseline status and to measure changes in these areas experienced after entering the system of care. Questions about child abuse and suicidality have implications for local mandated reporting, which CA awardees are instructed to consider and to train interviewers on accordingly. Since each CA awardee must keep data on child and family status and service use, as well as treatment plans and other information, the data collection required for the national evaluation does not introduce new, sensitive domains of inquiry, but parallels standard procedures in the field of children’s mental health.


In addition to information on child clinical status and social functioning, other questions of a sensitive nature will be asked of families. These include questions related to family functioning, caregiver strain, and parental distress, and they are included in order to measure family involvement in treatment planning and service delivery. Moreover, family representatives who have consulted with the National Evaluator consistently identify a lack of information on family life as a weakness in previous studies.


Before collecting data, each CA awardee will obtain active consent from caregivers. In addition, child assent will also be obtained. In that process respondents will be made aware that the information they provide will be protected strictly and that they can withdraw their participation at any time. Similarly, respondents can freely choose to refrain from answering any questions they find objectionable.



12. ESTIMATES OF ANNUALIZED HOUR BURDEN


In accordance with the evaluation design, the descriptive, outcome, intervention, and service information collection for the 47 communities in Phase VI of the national evaluation will cover a period of 5 years. Data collection for the 18 communities funded in FY 2008 will end in September 2014. Data collection for the 20 communities funded in FY 2009 will end in September 2015. Data collection for the 9 communities funded in FY 2010 will begin upon OMB approval and end in September 2018.


Table 2 shows the combined burden associated with the remaining three years of data collection for CA awardees funded in FY 2010. For measures that were previously cleared by the OMB, burden estimates presented in Table 2 are based on information supplied by CA awardees in prior phases of the evaluation.



Table 2. Estimate of Respondent Burden

(Bracketed numbers refer to the footnotes following the table.)

| Instrument | Respondent | Number of Respondents | Total Average Number of Responses per Respondent | Hours per Response | Total Burden Hours | Hourly Wage Rate ($) | Total Cost per Year ($) |
|---|---|---|---|---|---|---|---|
| System of Care Assessment |  |  |  |  |  |  |  |
| Interview Guides A–I, L–S | Key site informants | 207 [1] | 1 | 1.00 | 207 | 19.17 [2] | 3,968 |
| Child and Family Outcome Study |  |  |  |  |  |  |  |
| Caregiver Information Questionnaire, Revised—Intake (CIQ–R–I) | Caregiver | 1,099 [3] | 1 | 0.37 | 407 | 11.43 [4] | 4,652 |
| Caregiver Information Questionnaire, Revised—Follow-Up (CIQ–R–F) | Caregiver | 1,099 | 1 [5] | 0.28 | 308 | 11.43 | 3,520 |
| Caregiver Strain Questionnaire (CGSQ) | Caregiver | 1,099 | 2 | 0.17 | 374 | 11.43 | 4,275 |
| Child Behavior Checklist (CBCL)/Child Behavior Checklist 1½–5/6–18 | Caregiver | 1,099 | 2 | 0.33 | 725 | 11.43 | 8,287 |
| Education Questionnaire, Revision 2 (EQ–R2) | Caregiver | 1,099 | 2 | 0.33 | 725 | 11.43 | 8,287 |
| Living Situations Questionnaire (LSQ) | Caregiver | 1,099 | 2 | 0.08 | 176 | 11.43 | 2,012 |
| Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C) | Caregiver | 1,781 [6] | 2 | 0.17 | 606 | 11.43 | 6,927 |
| Columbia Impairment Scale (CIS) | Caregiver | 1,989 [7] | 2 | 0.08 | 318 | 11.43 | 3,635 |
| Parenting Stress Index (PSI) | Caregiver | 536 [8] | 2 | 0.08 | 86 | 11.43 | 983 |
| Devereux Early Childhood Assessment (DECA) | Caregiver | 504 [9] | 2 | 0.08 | 81 | 11.43 | 926 |
| Preschool Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (PreBERS) | Caregiver | 504 | 2 | 0.10 | 101 | 11.43 | 1,154 |
| Delinquency Survey—Revised (DS–R) | Youth | 1,504 [10] | 2 | 0.13 | 391 | 7.25 [11] | 2,835 |
| Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y) | Youth | 1,504 | 2 | 0.17 | 511 | 7.25 | 3,705 |
| GAIN Quick—R: Substance Problem Scale | Youth | 1,504 | 2 | 0.08 | 241 | 7.25 | 1,747 |
| Substance Use Survey, Revised (SUS–R) | Youth | 1,504 | 2 | 0.10 | 301 | 7.25 | 2,182 |
| Revised Children’s Manifest Anxiety Scales, Second Edition (RCMAS–2) | Youth | 1,504 | 2 | 0.07 | 211 | 7.25 | 1,530 |
| Reynolds Adolescent Depression Scale, Second Edition (RADS–2) | Youth | 1,504 | 2 | 0.05 | 150 | 7.25 | 1,088 |
| Youth Information Questionnaire, Revised—Baseline (YIQ–R–I) | Youth | 1,504 | 1 | 0.25 | 376 | 7.25 | 2,726 |
| Youth Information Questionnaire, Revised—Follow-Up (YIQ–R–F) | Youth | 1,504 | 1 | 0.25 | 376 | 7.25 | 2,726 |
| Service Experience Study |  |  |  |  |  |  |  |
| Multi-Sector Service Contacts, Revised—Intake (MSSC–R–I) | Caregiver | 2,257 | 1 | 0.25 | 564 | 11.43 | 6,447 |
| Multi-Sector Service Contacts, Revised—Follow-Up (MSSC–R–F) | Caregiver | 2,257 | 2 | 0.25 | 1,129 | 11.43 | 12,904 |
| Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R) | Caregiver | 2,257 | 1 [12] | 0.13 | 293 | 11.43 | 3,349 |
| Youth Services Survey—Family (YSS–F) | Caregiver | 2,257 | 1 | 0.12 | 271 | 11.43 | 3,098 |
| Youth Services Survey (YSS) | Youth | 1,504 | 1 | 0.08 | 120 | 7.25 | 870 |
| Services and Costs Study |  |  |  |  |  |  |  |
| Flex Funds Data Dictionary/Tool | Local programming staff compiling/entering administrative data on children/youth | 275 [13] | 3 [14] | 0.03 | 25 | 23.96 [15] | 599 |
| Services and Costs Data Dictionary/Data Entry Application | Local evaluator, staff at partner agencies, and programming staff compiling/entering service and cost records on children/youth | 2,257 | 20 [16] | 0.05 | 2,257 | 23.96 | 54,078 |


Summary of Annualized Burden Estimates for 1 Year

|  | Number of Distinct Respondents | Number of Responses per Respondent | Total Annual Burden (hours) [17] | Cost ($) |
|---|---|---|---|---|
| Caregivers | 2,257 | 1.5 | 9,059 | 70,455 |
| Youth | 1,504 | 1.6 | 2,682 | 19,408 |
| Providers/Administrators | 275 | 24.0 | 1,333 | 58,645 |
| Total Summary | 4,036 | 27 | 13,074 | 148,508 |




  1. An average of 23 stakeholders in up to 9 grant communities will complete the System of Care Assessment interview. These stakeholders will include site administrative staff, providers, agency representatives, family representatives, and youth.

  2. Assuming the average annual income across all types of staff/service providers/administrators is $40,000, the wage rate was estimated using the following formula: $40,000 (annual income)/2087 (hours worked per year) = $19.17 (dollars per hour).

  3. Number of respondents across 9 CA awardees (2,257). Average based on a 5 percent attrition rate at each data collection point.

  4. Given that 65 percent of the families in the Phase VI evaluation sample fall at or below the 2014 DHHS National Poverty Level of $23,850 (based on a family of four), the wage rate was estimated using the following formula: $23,850 (annual family income) / 2087 (hours worked per year) = $11.43 (dollars per hour).

  5. Number of responses per respondent is five over the course of the study (once every 6 months for 24 months, with one baseline/intake response, and 4 follow-up responses).

  6. Approximate number of caregivers with children over age 5, based on Phase VI data submitted as of 7/14.

  7. Approximate number of caregivers with children 3 and older, based on Phase VI data submitted as of 7/14.

  8. Approximate number of caregivers with either (1) children enrolled in Outcome Study at the two early childhood-focused communities, for whom the instrument is required; or (2) children aged 0 to 12 at other communities, where the instrument is optional (we estimate that 1/3 of caregivers will be administered the instrument when it is optional). Estimates are based on Phase VI data submitted as of 7/14.

  9. Approximate number of caregivers with either (1) children enrolled in Outcome Study at the two early childhood-focused communities, for whom the instrument is required; or (2) children aged 0 to 5 at other communities, where the instrument is optional (we estimate that 1/3 of caregivers will be administered the instrument when it is optional). Estimates are based on Phase VI data submitted as of 7/14.

  10. Based on findings from Phase VI data that approximately 67 percent of the children in the evaluation were 11 years old or older.

  11. Based on the 2014 Federal minimum wage rate of $7.25 per hour.

  12. With the exception of the MSSC-R, respondents only complete Service Experience Study measures at follow-up points. See Footnote #3 for the explanation about the average number of responses per respondent.

  13. Assumes that each community will use flexible funds expenditures on average for approximately one quarter of the children/youth enrolled.

  14. Assumes that three expenditures, on average, will be spent on each child/youth receiving flexible fund benefits.

  15. Assuming that the average annual income across all types of programming staff is $50,000, the wage rate was estimated using the following formula: $50,000 (annual income) / 2087 (hours worked per year) = $23.96 per hour.

  16. Assumes that each child/youth in system of care communities will have 20 service episodes, on average.

  17. Total Annual Burden (hours) is the product of Number of Distinct Respondents × Average Annual Number of Responses per Respondent × Average 3-Year Burden per Response (hours).


As indicated in Table 2, the average total annual burden for data collection is estimated at 13,074 hours. This estimate is derived by calculating the burden for 3 years of data collection (the period of national evaluation data collection for which approval is being sought) for each measure and summing.
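
For illustration only, the sketch below (in Python; it is not part of any evaluation data system) reproduces the arithmetic behind a single row of Table 2, using the CIQ–R–I figures: total burden hours are the product of the number of respondents, responses per respondent, and hours per response, and total cost is the rounded burden hours multiplied by the hourly wage rate, which appears consistent with how the published figures were rounded.

```python
# Illustrative sketch of the per-instrument burden arithmetic in Table 2 (CIQ-R-I row).
# The input values come from the table; the function itself is an illustration only.

def burden_row(n_respondents, responses_per_respondent, hours_per_response, hourly_wage):
    """Return (total burden hours, total cost) for one instrument row,
    rounding hours before costing, consistent with the published figures."""
    total_hours = round(n_respondents * responses_per_respondent * hours_per_response)
    total_cost = round(total_hours * hourly_wage)
    return total_hours, total_cost

hours, cost = burden_row(1099, 1, 0.37, 11.43)
print(hours, cost)  # 407 hours and $4,652, matching the CIQ-R-I row of Table 2
```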



13. ESTIMATES OF ANNUALIZED COST BURDEN TO RESPONDENTS


CA awardees collect the majority of the required data elements as part of their normal operations, and maintain this information for their own service planning, quality improvement, and reporting purposes. The additional cost of this data collection is minimal. The costs for operation and maintenance of materials necessary for ongoing data collection are similarly minimal.


Other costs related to this effort, such as the cost of obtaining copyrighted instruments, are costs to the Federal Government. Each CA awardee has been funded, as part of the overall cooperative agreement award, to support two staff positions (or the full-time equivalent) to assist in the evaluation. Therefore, no cost burden is imposed on the CA awardee by this information collection effort.



14. ESTIMATES OF ANNUALIZED COST TO THE GOVERNMENT


SAMHSA has planned and allocated resources for the management, processing, and use of the collected information in a manner that shall enhance its utility to agencies and the public. Including the Federal contribution to local CA awardee evaluation efforts, the contract with the National Evaluator, and government staff to oversee the evaluation, the annualized cost to the government is estimated at $9,168,221. These costs are described below.


Each CA awardee is expected to hire two full-time equivalents to recruit families into the evaluation, collect information, manage and clean data, and conduct analyses at the local level. Assuming (1) an average annual salary of $55,000; (2) that 47 CA awardees have been funded; and (3) that the average Federal contribution (not including State matching funds) will be 73 percent, the annual cost for Phase VI at the CA awardee level is estimated at $3,774,100. These monies are included in the cooperative agreement awards.


The national evaluation contract has been awarded to ICF Macro for evaluation of the 47 CA awardees in Phase VI. The first Round of Phase VI CA awardees began data collection in October 2009 and will continue data collection for 4 years until September 2015. The second Round of CA awardees is scheduled to begin data collection upon OMB approval and will continue data collection until September 2018. The national evaluation contract for Round one of Phase VI provides for 1 base year of $2,809,053 with an option to renew for 4 more years. The national evaluation contract for Round two of Phase VI provides for 1 base year of $2,444,200 with an option to renew for 5 more years. The estimated average annual cost of the contract for Round one of Phase VI will be $3,238,087. The estimated average annual cost of the contract for Round two of Phase VI will be $2,084,034. Together, the total annual cost across the two contracts is $5,322,121. Included in these costs are the expenses related to developing and monitoring the national evaluation including, but not limited to, the following activities: developing the design, instrument package (including acquisition of copyrighted instruments), data manual, and training materials; monitoring and providing technical assistance to sites; traveling to sites and relevant meetings; conducting special studies; and analyzing and disseminating data. The cost of acquiring copyrighted instrumentation is projected to be $48,295.44 per year. This cost is included in the total contract award.


It is estimated that CMHS will allocate 75 percent of a full-time equivalent each year for government oversight of the evaluation. Assuming an annual salary of $136,000, these government costs will be $102,000 per year.



15. CHANGES IN BURDEN


Currently there are 40,024 hours in the OMB inventory. SAMHSA is requesting 13,074 hours for this submission. The decrease of 26,960 hours is due to program changes resulting from the closing of 19 communities funded in FY 2009 that no longer require data collection, as well as the end of data collection for the Sector and Comparison Study.



16. TIME SCHEDULE, PUBLICATION, AND ANALYSIS PLANS


a. Time Schedule


The time schedule for implementing the Phase VI evaluation is summarized in Table 3. A 3-year clearance is requested for this project.


Table 3. Time Schedule

| Activity | Date |
|---|---|
| Receive OMB clearance for study | XXX |
| Re-submit for OMB approval of remaining 1 year of data collection for sites funded in FY 2010 | XXX |
| Continue data collection for 9 sites funded in FY 2010 | Ongoing |
| Data collection completed for 18 sites funded in FY 2008 | September 2014 |
| Data collection completed for 20 sites funded in FY 2009 | September 2015 |
| Data collection completed for 9 sites funded in FY 2010 | September 2018 |
| Process and analyze data | Ongoing |
| Produce annual report | September 2018 (2010 funded) |
| Produce public use database | September 2018 (2009/2010 funded) |
| Produce final report | September 2018 (2009/2010 funded) |


b. Data Analysis Plan


All of the data collection and analytic strategies detailed in this package are linked to the evaluation questions. These linkages are shown in Table 4. Analyses will be conducted to assess reliability and validity of selected measures as sufficient data to conduct these analyses are obtained in the early stages of the study. These analyses will include, but are not limited to, calculation of reliability using Cronbach’s coefficient alpha to determine internal consistency of ordinal-level and interval-level measures, calculation of the Kuder-Richardson formula 20 to determine internal consistency of dichotomous measures, and confirmatory factor analysis to determine latent variable structure and content of multi-component scales.
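
As a minimal, hypothetical sketch of the internal-consistency calculation described above (the respondent data and item columns are illustrative assumptions, not the evaluation's actual data), Cronbach's coefficient alpha for an ordinal scale could be computed as follows; for dichotomous items, the same formula with item variances of p(1 − p) yields the Kuder-Richardson formula 20.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 200 respondents answering five ordinal (1-4) items of a caregiver-report scale.
rng = np.random.default_rng(0)
demo = pd.DataFrame(rng.integers(1, 5, size=(200, 5)),
                    columns=[f"item{i}" for i in range(1, 6)])
print(round(cronbach_alpha(demo), 3))
```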


Table 4. Evaluation Questions, Indicators, Data Sources, and Analysis Techniques

| Evaluation Questions | Indicators | Data Sources | Data Analysis |
|---|---|---|---|
| System of Care Assessment |  |  |  |
| Does the system maximize interagency collaboration? | Core agencies participate in a collaborative way; integration of staff, resources, functions, and funds; co-location of services of multiple agencies; interagency service planning; shared vision and goals; formal relationships established between agencies | Site Visit | Univariate/Multivariate Analysis |
| Are the various service components of the system coordinated? | Co-location of services of multiple agencies; availability of case management/care coordination services; case manager/care coordinator has broad responsibilities and active referral role; integration and consistency in case management/care coordination across systems/agencies | Site Visit | Univariate/Multivariate Analysis |
| Are services and the system accessible? | Proportion of eligible population provided services; time between identification of need and entry to system; waiting lists for entry to system; waiting lists for delivery of key services; active outreach; logistics and supports that encourage access | Site Visit | Univariate Analysis |
| Is the service array comprehensive? | Availability of broad array of residential, intermediate, outpatient, and wraparound services | Site Visit; MIS | Univariate Analysis |
| Are services and the system culturally competent? | Cultural diversity of the child and family population; cultural diversity of provider population; agency commitment to cultural competency; equitable treatment of all children and families; adherence to national standards of cultural competence | Site Visit; CCSP–R; YSS; YSS–F | Univariate Analysis |
| Are services and the system family-driven? | System and services involve caregivers in developing individual child and family service plans; system and services involve caregivers in overall system of care planning activities; system and services involve caregivers in service delivery; system and services address needs of caregivers and families for support | Site Visit; YSS; YSS–F; CIQ–R | Univariate/Multivariate Analysis |
| Are services individualized and youth-guided? | Active individualized service planning process; frequency of monitoring of ISP by case manager; system and services involve youth in developing his or her own service plan; system and services involve youth in overall system of care planning activities; system and services involve youth in his or her own service delivery; system and services address needs of youth for support | Site Visit; YSS; YSS–F; YIQ–R | Univariate/Multivariate Analysis |
| Are services community-based? | Availability of services within the community; extent of reliance on out-of-county and out-of-State placements | Site Visit; MIS | Univariate/Multivariate Analysis |
| Do systems mature over time? | Development of infrastructure; development of service delivery capacity | Site Visit | Multivariate Analysis |
| Are services provided in the least restrictive setting that is appropriate? | Processes to ensure that children step down to lower levels of care when appropriate; extent of use of intermediate and outpatient placements; extent of use of wraparound services; stability and duration of placements; level of use of mental health services in normative settings (e.g., home, school) | Site Visit; MIS; LSQ | Univariate/Multivariate Analysis |
| Cross-Sectional Descriptive Study |  |  |  |
| What are children and families like? | Gender; race; age; foster care placement; presenting problem(s); diagnosis at intake; intake and referral source; case status | EDIF; CIQ–R | Univariate/Bivariate Analysis |
| Child and Family Outcome Study |  |  |  |
| Are there differences between the children and families served in the systems who do and do not choose to participate in the Child and Family Outcome Study? | Gender; race; age; educational level and placement; socioeconomic status; parents’ employment status; living arrangement; presenting problem(s); diagnosis at intake; intake/referral source; risk factors for family and child; case status | CIUF; CIQ–R | Univariate/Bivariate Analysis |
| Has there been a reduction in children’s negative behaviors? | Number of problem behaviors | CBCL 1½–5; CBCL 6–18; CIS; DECA | Univariate/Multivariate Analysis |
| Has there been an increase in the level of the child’s overall functioning? | Child’s ability to accomplish activities of daily living; child’s strengths; quality of family relationships; quality of peer relationships | CBCL 1½–5; CBCL 6–18; BERS–2C; BERS–2Y; PreBERS; CIS | Univariate/Multivariate Analysis |
| Has there been improvement in child functioning in the educational environment? | School attendance; expulsions, dropouts, suspensions; academic performance | BERS–2C; BERS–2Y; EQ–R2 | Univariate/Multivariate Analysis |
| Has there been improvement in the child’s involvement with law enforcement? | Violations; number of contacts with law enforcement; number of incarcerations | DS–R | Univariate/Multivariate Analysis |
| Do families experience improvements in family life? | Family functioning; parenting stress; caregiver strain (burden of care) | PSI; CGSQ; CIQ–R | Univariate/Multivariate Analysis |
| Are there differences in family outcomes across systems of care? | Family functioning; caregiver strain (burden of care); material resources | PSI; CGSQ; CIQ–R | Univariate/Multivariate Analysis |
| Service Experience Study |  |  |  |
| How do children and families experience services? | Ratings of specific services; ratings of the overall system; provider attitudes and practices | YSS; YSS–F; CCSP–R | Univariate/Multivariate Analysis |
| Are there differences in service experiences across systems of care? Are differences, if any, associated with differential outcomes? | Comparison of ratings of specific services; comparison of ratings of the overall system; comparison of provider attitudes and practices; relationship to child outcomes | YSS; YSS–F; CCSP–R; CBCL 1½–5; CBCL 6–18; CIS | Univariate/Multivariate Analysis |
| Services and Costs Study |  |  |  |
| What services do children and families receive and what are their service utilization patterns? | Previous service history; service setting and type; level of restrictiveness; mix of services; amount and duration; continuity of care | MIS | Univariate/Multivariate Analysis |
| How do service use patterns relate to child behavioral and functional outcomes? | Comparison of service use for children who enter the system at varying levels of challenge; comparison of change in outcomes over time for children in different utilization pattern groups | MIS; MSSC–R; EDIF; CIQ–R; YIQ–R; CBCL 1½–5; CBCL 6–18; CIS; GAIN; SUS–R; DS–R; RADS–2; RCMAS–2; BERS–2C; BERS–2Y; PreBERS; DECA; PSI; LSQ; EQ–R2 | Univariate/Multivariate Analysis |
| How do service use patterns differ across subgroups within a site? Across system of care sites? | Comparisons of types of services used; comparisons of level of restrictiveness; comparisons of service mix; comparison of amount and duration; comparison of continuity of care | MIS; LSQ; MSSC–R; EDIF; CIQ–R; YIQ–R | Univariate/Multivariate Analysis |
| What costs are associated with services at the aggregate and child/family levels? | Total costs of services for individual children and families; average costs per child/family; average cost per service type | MIS; LSQ; MSSC–R | Univariate/Bivariate Analysis |


Analyses planned for each of the study components are described below.


System of Care Assessment. This study component includes both qualitative and quantitative analyses and both are based on a standard framework. Qualitative analyses will be used to describe the infrastructure and the direct service delivery processes of system of care communities. Qualitative data obtained through individual interviews at each system of care community and from document reviews will be synthesized into a site-specific narrative report that will be returned to each system of care community for review and correction. When the reports for each community are finalized after site comment, they will be entered into a qualitative database software program (Atlas.ti) that will allow meta-analyses across system of care communities and across time.


The quantitative analyses will be based on scores given to each system of care community that measure the extent to which it has achieved the program model’s overarching principles within the system operations described in the qualitative analysis and from quantitative interview questions. The relationship among service and system experiences, child and family characteristics, and outcomes over time will be explored using correlational, regression, and path analyses.


Cross-Sectional Descriptive Study and Child and Family Outcome Study. For this evaluation component, univariate descriptive analyses will be performed to characterize the families participating in this evaluation, including score ranges, means, and medians. These analyses will be reported for each system of care community as well as for all CA awardees combined.


Change in child and family outcomes over time will be tested using a variety of techniques. Repeated measures analysis of variance (ANOVA) will be used to test the significance of change over time within and between groups at each site. Repeated measures analysis of covariance (ANCOVA) will be conducted using the system of care development scores from the System of Care Assessment as a covariate. Hierarchical linear modeling (HLM) will be used to estimate growth curves (e.g., changes in the level of symptomatology) based on repeated observations.
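
As one hedged illustration of the growth-curve approach described above (a sketch only; the file name, the variable names cbcl_total, months, and child_id, and the model specification are assumptions for this example rather than the evaluation's actual analysis code), a random-intercept, random-slope model could be fit in Python with statsmodels as follows.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format outcome data: one row per child per assessment wave.
# Columns assumed for illustration: child_id, months (0, 6, 12, 18, 24), cbcl_total.
data = pd.read_csv("outcome_long.csv")  # hypothetical file name

# Growth-curve model: fixed effect of time, random intercept and slope for each child.
model = smf.mixedlm("cbcl_total ~ months", data,
                    groups="child_id", re_formula="~months")
result = model.fit()
print(result.summary())
```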


The GLM repeated measures analysis will allow the National Evaluator to test whether changes over time are significant and whether some groups experience more improvement than others. Path analysis and other structural equation modeling techniques will be used to investigate the direct and indirect effects of causal variables (such as ratings of system performance and adherence to service plans) on dependent outcome measures (such as clinical assessments, restrictiveness of care, and family functioning).


Service Experience Study. In this component of the Phase VI evaluation, HLM or ANOVA will be performed to examine (1) change in service utilization patterns of children and their families; (2) whether there are differences between groups of children in the system of care communities who receive an evidence-based treatment and those who do not in terms of client satisfaction; (3) whether children and families stay in services longer on average in communities with higher average service and system of care ratings; and (4) whether within communities, caregivers of children who received fewer services in the previous 6 months.


Repeated measures ANOVA with treatment group as a between-subjects factor and time as a within-subjects factor will be used to examine differences in continuous outcomes over time. Generalized estimating equations will be used in the analysis of dichotomous outcomes. Multivariate regression modeling across multiple time points will allow characterization of effects in terms of persistence over time and identification of both system-level and specific services factors that maintain short- and long-term positive outcomes. In addition, the appropriateness of multilevel modeling will be explored as a potential approach for linking site-level characteristics to changes in outcomes over time.
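
A minimal sketch of a generalized estimating equations model for a dichotomous outcome appears below; the data file, the outcome and predictor names, and the exchangeable working correlation are illustrative assumptions rather than the evaluation's specified model.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format service-experience data; file and column names are illustrative only.
data = pd.read_csv("service_experience_long.csv")

# GEE logistic model for a yes/no outcome (e.g., reported satisfaction), with repeated
# observations clustered within child and an exchangeable working correlation.
model = smf.gee("satisfied ~ wave + ebp_received", groups="child_id", data=data,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())
```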


Services and Costs Study. For this component, analyses will focus primarily on utilization patterns (e.g., types, combination, amount, and costs of services used) and the factors that influence use. Analyses will be conducted at the aggregate and individual child and family levels. At the aggregate level, the distribution of service use and costs across the client population will be described. At the individual child and family level, service utilization patterns will be described (e.g., distribution of children using various combinations of services, mean and median amounts of services used).


Latent class analysis and other case-grouping techniques will be used to group children who experience similar utilization patterns, based on combinations and amount of services. The longitudinal outcomes of children in various service utilization groups will be compared to see if some utilization patterns are associated with greater gains and, if so, for which groups of children.
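
The sketch below illustrates the general case-grouping idea only; it substitutes a finite (Gaussian) mixture model for latent class analysis proper, which would typically be fit with dedicated LCA software, and the file and column names are hypothetical.

```python
import pandas as pd
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

# Hypothetical child-level utilization summary: one row per child, columns are
# illustrative counts/amounts of service types drawn from MIS records.
util = pd.read_csv("utilization_summary.csv")
features = util[["outpatient_visits", "case_mgmt_contacts", "residential_days"]]

# Standardize the utilization measures, then group children with similar profiles.
X = StandardScaler().fit_transform(features)
gmm = GaussianMixture(n_components=4, random_state=0).fit(X)
util["utilization_group"] = gmm.predict(X)
print(util["utilization_group"].value_counts())
```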


Trend analysis will be used to analyze change in costs over time. Multivariate techniques that adjust for the skewed distribution of cost data will be employed to predict costs, controlling for variation in baseline characteristics. We also will describe the allocation of service costs across children and different service categories, and we will model costs as a function of child and family characteristics. Given that utilization and cost data are often characterized by high skewness and/or a large proportion of zero outcomes, we propose using specialized statistical techniques (e.g., two-part models, logarithmic transformations, zero-inflated Poisson models) in analyzing utilization and cost study data. For cost-effectiveness analysis, we will use bootstrapping methods to account for uncertainty.
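
As a hedged sketch of the two-part modeling approach mentioned above (the file and variable names are hypothetical, and a production analysis would add covariates, retransformation adjustments, and bootstrapped uncertainty estimates), the two parts could be fit as follows.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical child-level cost data; columns assumed: total_cost, age, baseline_cbcl.
costs = pd.read_csv("child_costs.csv")

# Part 1: probability of incurring any cost (logistic regression).
costs["any_cost"] = (costs["total_cost"] > 0).astype(int)
part1 = smf.logit("any_cost ~ age + baseline_cbcl", data=costs).fit()

# Part 2: level of cost among users, modeled on the log scale to reduce skewness.
users = costs[costs["total_cost"] > 0].copy()
users["log_cost"] = np.log(users["total_cost"])
part2 = smf.ols("log_cost ~ age + baseline_cbcl", data=users).fit()

print(part1.summary())
print(part2.summary())
```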



17. DISPLAY OF EXPIRATION DATE


All data collection instruments will display the expiration date of OMB approval.



18. EXCEPTIONS TO THE CERTIFICATION STATEMENT


This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.
