


Phase VI of the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program


Supporting Statement


A. JUSTIFICATION


1. CIRCUMSTANCES OF INFORMATION COLLECTION



The Substance Abuse and Mental Health Services Administration (SAMHSA), Center for Mental Health Services is requesting OMB approval for (1) the continuation of currently approved data collection activities for communities awarded cooperative agreements in FY 2008 and FY 2009, with some revisions made to accommodate contract modifications, and (2) an extension of these approved data collection activities for an additional 9 communities awarded cooperative agreements (CA) in FY 2010. These communities are included in the Phase VI cohort of the national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program—Children’s Mental Health Initiative (CMHI). The Phase VI evaluation of these 9 new communities will continue for the duration of the award period, ending in September 2016. The currently approved data collection is under OMB No. 0930-0307, which expires on 12/31/2012.

Serious emotional disturbance affects more than 4.5 million children and their families in the United States. There is consensus that an integrated, coordinated, and comprehensive system of care is the best approach for meeting the needs of this population. The Comprehensive Community Mental Health Services for Children and Their Families Program, which is administered by the Center for Mental Health Services (CMHS) within the Substance Abuse and Mental Health Services Administration (SAMHSA), provides funds to support a broad array of community-based and family-driven services delivered through the system of care model. Under this program, CMHS has funded 5- and 6-year grants and cooperative agreements to States and locales to expand the array and capacity of services for children with serious emotional disturbance. Program funding has increased from $4.9 million in FY 1993 to $121.3 million in FY 2010 due in large part to the evidence of the effectiveness of the program provided by the national evaluation. This level of funding was maintained through FY 2011. To date, this CMHS program has funded 173 such communities through these grants and cooperative agreements. This includes 47 sites awarded cooperative agreements in Phase VI (18 in FY 2008, 20 in FY 2009, and 9 in FY 2010) for which approval is being sought.


The data collection effort proposed here relates closely to the completed and previously approved evaluations of Phase I (OMB No. 0930–0171), Phase II (OMB No. 0930–0192), Phase III (OMB No. 0930–0209), and Phase IV (OMB No. 0930–0257), the ongoing evaluation of Phase V (OMB No. 0930-0280), and the evaluation of Phase VI CA awardees (OMB No. 0930-0307). Phases I through III cover grantees funded in FY 1993, FY 1994, and FY 1997 through FY 2000; Phase IV covers CA awardees funded in FY 2002 through FY 2004; Phase V covers CA awardees funded in FY 2005 and FY 2006; and Phase VI covers CA awardees funded in FY 2008, FY 2009, and FY 2010.


The previously cleared Phase VI evaluation is composed of five core study components and three special studies. To accommodate contract modifications, the present request proposes to eliminate one of the core studies, the Sustainability Study, and two of the special studies: the Alumni Networking Study and the Continuous Quality Improvement (CQI) Initiative Evaluation. These eliminated studies have provided data to the program and are no longer needed. The Sustainability Study was originally implemented to collect data for a long-term Government Performance and Results Act (GPRA) outcome measure of sustainability five years post-funding. This long-term outcome measure is no longer in effect. Data were also collected to assess preparedness for sustainability at critical changes in the match requirement and at the end of funding, for comparison to the long-term assessment. These data collections are therefore also no longer required. The first assessments for the Alumni Networking Study and the CQI Initiative Evaluation have been completed, and additional assessment is not needed.


The five previously cleared core study components are currently being conducted with all CA awardees funded in FY 2008 and FY 2009; the present request proposes to continue data collection under this protocol with the 38 CA awardees funded in FY 2008 and FY 2009 and to extend the previously approved protocol to the 9 CA awardees funded in FY 2010. CA awardees funded in FY 2010 will participate in four of these previously approved core study components. The remaining special study, the Sector and Comparison Study, is being conducted with a subsample of CA awardees funded in FY 2008; it will not be conducted with CA awardees funded in FY 2009 or FY 2010. Together, these study components collect information on a major nationwide initiative serving thousands of children and their families. These data are used for the national evaluation as well as for local evaluations by the CA awardees.


The Phase VI Core Studies include (1) the System of Care Assessment, which will document the development of systems of care through site visits conducted every 12 to 18 months; (2) the Cross-Sectional Descriptive Study, which will collect descriptive data on all children and families who enter the CMHS-funded systems of care throughout the funding period; (3) the Child and Family Outcome Study, which will collect longitudinal data on child clinical and functional status, family outcomes, and family experience and satisfaction with services from a sample of children and families; and (4) the Services and Costs Study, which will assess the costs and cost-effectiveness of system of care services.


The Phase VI Special Study consists of the Sector and Comparison Study, which will be conducted with a subsample of the FY 2008-funded CA awardees and will compare the outcomes of children and families who are involved in a specific child-serving sector (i.e., child welfare, juvenile justice, special education) and receiving services from agencies in funded systems of care with those of a similar group of children and families receiving services from agencies outside of funded systems of care.


The proposed data collection activities will continue the previously cleared data collection efforts and ensure data collection activities align with SAMHSA’s recently released plan for achieving the goals of the agency’s eight strategic initiatives entitled Leading Change: A Plan for SAMHSA’s Roles and Actions 2011-2014. Through its longitudinal assessment of child and family living situations, employment, education and behavioral health outcomes, this evaluation assesses CMHI program progress in addressing SAMHSA’s strategic initiative focused on promoting recovery-oriented behavioral health service systems and establishing system-level approaches that foster health and resilience; increase permanent housing, employment, education and other necessary supports; and reduce barriers to social inclusion.


This request proposes to continue previously approved data collection activities for communities funded in FY 2008 and FY 2009 until FY 2014, and to extend these previously approved data collection efforts to include 9 additional communities funded in FY 2010. Rather than creating a new protocol while the CMHI may be in transition from a local community to a statewide focus, the request proposes to add these 9 communities to the previously approved package. In an effort to lessen participant burden, the request also proposes to replace the intake and follow-up questionnaires for the child welfare component of the Sector and Comparison Study with an administrative record review form, and to remove data collection activities for the Alumni Networking Study, the CQI Initiative Evaluation, and the Sustainability Study. To address CA awardee and family member recommendations regarding the education sector component of the Sector and Comparison Study, this request proposes the addition of a brief 8-item Education Sector Caregiver Questionnaire to capture family involvement in the development and use of an Individualized Education Program or Plan (IEP).


Grant/Cooperative Agreement Review Process. The SAMHSA Office of Review selects the review panel based on a number of criteria including, but not limited to, geographic region and race/ethnicity. The review office requests recommendations of qualified reviewers from program staff, among other sources; however, the selection of reviewers is blind to program staff. The reviewers rate the applicants based on the evaluation criteria contained in the Request for Applications (RFA) No. SM-10-005 (see Attachment A, page 32 of the attachment). Once scoring has occurred, program staff write a funding plan specifying the number of grants that can be funded based on the total budget available for the program. Once the Administrator concurs, the Notice of Grant Award is sent to the successful applicants.


Characteristics of 9 Cooperative Agreement Communities Funded in 2010. Consistent with previous cohorts of CA-funded communities, the characteristics of the 9 communities funded in 2010 vary by the governmental entity receiving the funding, the geographic location or catchment area served, and the diversity of the populations of focus. State mental health agencies are the recipients of the CA in Puerto Rico and Tennessee. County mental health agencies are the recipients in Los Angeles County, CA; Seminole County, FL; Saginaw County, MI; and Durham County, NC. A city agency is the recipient in Jacksonville, FL, and Tribal governments are the recipients for the Mescalero Apache Tribe, NM, and the Rosebud Sioux Tribe, SD. As in previously funded cohorts, communities are located in urban areas (Los Angeles; Jacksonville; Saginaw; Durham); suburban areas (Seminole County, part of metropolitan Orlando); multi-county, largely rural catchment areas (middle Tennessee); frontier/rural tribal communities (the Mescalero Apache and Rosebud Sioux tribes); and two small island communities in the Territory of Puerto Rico. The relative mix of populations of focus is similar to that of previously funded communities. This mix includes a particular focus on transition-age youth aged 16-21 (Durham); early childhood aged 0-5 (Los Angeles; middle Tennessee); and children and youth involved with child welfare or juvenile justice (Jacksonville and Seminole County). The tribal communities and the program in Puerto Rico have focused their services on children and youth living in poverty in generally underserved areas, and middle Tennessee plans to extend a special effort to serve children in military families who live near the military bases located in that area.



a. Background


The understanding of child and adolescent mental health disorders has improved significantly during the last two decades. As a result, the field is in a much better position today to estimate the extent to which mental health disorders occur in the population of children and adolescents at large; however, it is still likely that many children in need go undetected. With the estimate that at least 20% of children and youth under age 19 may require mental health services (U.S. Public Health Service Office of the Surgeon General [USPHS], 2001), one can also estimate that at least 16 million children and youth are in need of some type of mental health service each year. As noted in Promotion and Prevention in Mental Health (Substance Abuse and Mental Health Services Administration [SAMHSA], 2007), half of all diagnosed mental illnesses begin by age 14, and three-fourths begin by age 24. Given these conditions, the ability of child-serving providers to identify children in need of services in the settings where children and youth are found, and to know how and where to direct their families for services, is essential. Increasingly, the need for the public health approaches of health promotion and prevention is being identified for mental health (Institute of Medicine [IOM], 2009). The role that education, child welfare, juvenile justice, primary care, substance abuse, daycare, and other settings can play in early identification is facilitated by collaboration across systems and by the awareness that providers in these settings have of the mental health needs of the children and youth they serve, as well as of the services available to them.


Children and adolescents with serious emotional disturbance face challenges in many aspects of their daily lives. Generally, they present with a variety of diagnoses, they experience high rates of risk factors for mental illness, and they exhibit severe clinical symptoms and functional impairment (Manteuffel, Stephens, Brashears, Krivelyova, & Fisher, 2008). They are at greater risk for substance abuse disorders (Center for Mental Health Services [CMHS], 2001, 2003, 2004; Holden, 2003; Holden et al., 2003; Liao, Manteuffel, Paulic, & Sondheimer, 2001; SAMHSA, 2002) and have greater risk for negative encounters with the juvenile justice system (CMHS, 2001, 2003, 2004; Davis & Vander Stoep, 1997). Students with emotional disturbance fail more courses, earn lower grade point averages, miss more days of school, are retained in grade more often than students with other disabilities, and have high dropout rates (Epstein, Nelson, Trout, & Mooney, 2005; U.S. Department of Education [DOE], 2001). Longitudinal research following samples into adulthood further supports assertions of high rates of poor long-term outcomes for these youth (Epstein, Kutash, & Duchnowski, 2005; Friedman, Kutash, & Duchnowski, 1996; Knapp, McCrone, Fombonne, Beecham, & Wostear, 2002; Pumariega & Winters, 2003), who may have poor employment opportunities and who may experience periods of poverty in adulthood (National Advisory Mental Health Council Workgroup on Child and Adolescent Mental Health Intervention and Deployment, 2001). There is also the increased risk that youth with mental illness will not reach adulthood, as these youth are more likely to commit suicide than youth without mental illness (Centers for Disease Control and Prevention [CDC], 2007).


Despite advances in the knowledge base over the last decade that have illuminated continuing challenges in delivering services and meeting needs for this population, service capacity has not kept pace with need (Friedman, 2002; Stroul, Pires, & Armstrong, 2001). It has been estimated previously that only 1 in 5 children with serious emotional disturbance receive the specialty services they need (Burns et al., 1995; DHHS, 1999; Shaffer et al., 1996). Latinos and the uninsured have especially high rates of unmet need relative to other children (DHHS, 1999; Kataoka, Zhang, & Wells, 2002). This underscores the need for the development of effective community-based care that is sensitive to and structured for the diverse cultures and impoverished families in individual communities (Hernandez & Isaacs, 1998; Isaacs-Shockley, Cross, Bazron, Dennis, & Benjamin, 1996), and that is available in even the most geographically remote communities in the country. The Federal Action Agenda states that expanding access to quality mental health care is one of the identified methods for achieving system transformation (SAMHSA, 2005). Serving the needs of persons of diverse backgrounds requires culturally and linguistically competent providers, culturally competent treatments and practices, and cultural adaptations to provide efficacious and effective services (Whaley & Davis, 2007).


In 1984, in response to findings that children and families are most effectively served by community-based, family-driven, coordinated systems of care, the National Institute of Mental Health (NIMH) initiated the Child and Adolescent Service System Program (CASSP). Later administered by CMHS within SAMHSA, CASSP provided funds to promote the development of comprehensive and integrated service delivery systems for children with serious emotional disturbance through a system of care approach. The system of care program theory model, first articulated by Stroul and Friedman in 1986, proposes that agencies in various child-serving sectors, such as education, juvenile justice, mental health, and child welfare, work together to provide the wide array of services needed by children with serious emotional disturbance and their families. Built upon the CASSP philosophy, which calls for services to be child-centered, family-driven, community-based, and culturally competent, the model emphasizes the need to: (1) broaden the range of nonresidential community-based services, (2) strengthen case planning across child-serving sectors, and (3) increase case management capacity to ensure that services work together across sectors and providers.


The Patient Protection and Affordable Care Act (ACA) of 2010, which seeks to make health insurance coverage more affordable for individuals and families and the owners of small businesses, also addresses a variety of services that should be available for individuals with mental health and addiction needs (Health Reform: Overview of the Affordable Care Act, SAMHSA newsletter, May/June 2010, Volume 18, Number 3). The system of care approach is consistent with the vision for transformation in mental health services outlined in the ACA, which calls for enhancing community-based service options for individuals with a mental health or substance use condition, school-based health centers that will offer mental health and addiction services, coordination of primary and mental health care services, prevention, early identification, and funding for system transformation.


Under the ACA, many individuals and families previously ineligible for Medicaid or unable to obtain commercial insurance for, or because of, mental and substance use disorders will be covered by Medicaid, by commercial insurance through employers or on the private market, or through the State health insurance exchanges. Estimates are that up to 32 million more people will become eligible for health insurance, of whom six to ten million will have significant untreated mental health and/or addiction conditions. Many of these individuals, including children and families, will be treated in primary care settings, utilizing referrals to treatment that will help prevent or offer recovery from significant disorders (SAMHSA, Justification of Estimates for Appropriations Committees, Fiscal Year 2011). The system of care approach works to increase access to such quality, evidence-based referral services.


b. The Comprehensive Community Mental Health Services for Children and Their Families Program (CMHI)


While the system of care model provided a conceptual framework to meet the needs of children with serious emotional disturbance, funding to provide services at the local level was either sporadic or missing. In 1992, the Federal Government addressed this gap with the passage of the Children’s and Communities Mental Health Services Improvement Act (CMHI), which is part of the Alcohol, Drug Abuse and Mental Health Administration Reorganization Act (Public Health Service Act, Title V, Part E, Sections 561-565, as amended, Public Law 102-321, 42 U.S.C. 290ff). The Act was amended in 2000 to change the term of funding from 5 to 6 fiscal years (Public Law 106–310, Section 3105(c)). CMHI provides support through grants and cooperative agreements to States, political subdivisions within States, the District of Columbia, and territories to develop integrated home and community-based systems and supports for children and youth with serious emotional disturbances and their families. This funding encourages communities to develop and expand systems of care. The CMHI is the largest Federal commitment to children’s mental health to date, and through FY 2010 it has provided over $1.5 billion to support system development in 173 communities in 50 States, 2 territories, the District of Columbia, and 22 tribes or tribal entities, including the 38 grants awarded in FY 2008 and FY 2009 and the 9 grants awarded in FY 2010. The program is fully described in the grant Guidance for Applicants.


The goals of the CMHS program are to:


  • Expand community capacity to serve children and adolescents with serious emotional disturbances and their families;

  • Provide a broad array of accessible, clinically effective and fiscally-accountable services, treatments and supports;

  • Serve as a catalyst for broad-based, sustainable systemic change inclusive of policy reform and infrastructure development;

  • Create a case management team with an individualized service plan for each child;

  • Deliver culturally and linguistically competent services with special emphasis on racial, ethnic, linguistically diverse and other underrepresented, underserved or emergent cultural groups; and

  • Implement full participation of families and youth in service planning, in the development, evaluation and sustainability of local services and supports, and in overall system transformation activities.


c. The Need for Evaluation


Section 565(c)(1) of the Public Health Service Act (PL 102-321) mandates annual evaluation activities. A basic requirement is documentation of the characteristics of the children and families served by the system-of-care initiative, the type and amount of services they receive, and the cost to serve them. Equally important is the need to assess whether the program was implemented and services were experienced as intended. It is also critical to assess whether the children served by the program experience improvement in clinical and functional outcomes, whether family life is improved, and whether improvements endure over time. Finally, policymakers and service providers need to know whether those outcomes can be reasonably attributed to the system-of-care initiative.


Further evaluation requirements under Section 565(c)(2) of PL 102-321 include:


  • Annual reports to the Secretary of Health and Human Services (HHS) that include a description of the number of children served, child demographic characteristics, types and costs of services provided, availability and use of third-party reimbursements, estimates of the unmet need for services within CA awardee jurisdictions, how the grant was expended to establish jurisdiction-wide systems of care, and other information as required by the Secretary; and

  • Annual Reports to Congress that provide information on longitudinal studies of outcomes of services provided by the funded systems of care, the effect of activities conducted under funded systems of care on the utilization of hospital and other institutional settings, barriers to the achievements of establishing interagency collaboration within systems of care, and parental assessment of the effectiveness of systems of care.


A government contractor (referred to as the National Evaluator throughout this document) coordinates data collection for the national evaluation and provides training and technical assistance to facilitate the collection of data by local-level evaluators. In turn, each CA awardee is required by the cooperative agreement to hire a minimum of two evaluation staff (or their full-time equivalents) to ensure that data collection is systematic and can be sustained through the funding period. In this partnership between the National Evaluator and local evaluators, the National Evaluator provides training and technical assistance regarding data collection and research design. In addition, the National Evaluator receives data from all CA awardees, monitors data quality, and provides feedback to CA awardees. The CA awardees help shape data collection procedures and provide feedback to the National Evaluator regarding successful approaches. This evaluation will first and foremost prepare data analyses for the national assessment of the program, but in doing so will make CA awardee-specific data available to the CA awardees to help meet their local evaluation needs.


d. Clearance Request


This submission requests OMB clearance for (1) continued data collection under the previously approved package for Phase VI of the national evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program (OMB No. 0930-0307), with some revisions made to accommodate contract modifications, and (2) the estimate of burden for an additional 9 sites collecting data under this previously approved protocol. The request estimates burden for data collection in a total of 47 sites (18 sites funded in FY 2008, 20 sites funded in FY 2009, and 9 sites funded in FY 2010).



2. PURPOSE AND USE OF THE INFORMATION


What follows is a description of the previously approved clearance, a summary of the revisions from the previously approved package, and a description of the uses of the information collected through the evaluation.


a. Previously Approved Clearance


Currently, data collection for the CMHI cross-site evaluation is operating under OMB clearance (OMB No. 0930-0307) valid until December 31, 2012. The national evaluation is designed to answer evaluation questions that have evolved over the last 18 years through development of the CMHI and feedback from system of care personnel and other partners, and that extend those mandated by the CMHI authorizing legislation. The legislation requires funded communities to participate in a national evaluation that assesses the number of children served, child and family characteristics, child and family outcomes, service utilization patterns, and system characteristics.


This evaluation will serve several purposes. It will fulfill the program’s legislatively mandated requirement for an annual report to Congress based on findings from a national evaluation of the program. In support of this purpose, it will (1) describe who is being served by the CMHS-funded systems of care; (2) show whether there are observable differences in child and family outcomes that can be plausibly linked to a faithful implementation of the system of care approach; (3) describe how children and families experience the service system and how they use services and supports (i.e., utilization patterns); (4) estimate the cost of serving children in systems of care and assess the cost-effectiveness of services; (5) illustrate the development of systems of care as they move toward offering integrated and comprehensive services; and (6) compare outcomes and service experience between a group of children, youth, and families involved in one of three child-serving sectors and receiving services from CMHS-funded system of care communities and a similar group receiving services from non-funded communities. In addition, the evaluation will (7) provide data to CMHS and CA awardees to inform program implementation, improvement, and sustainability, and (8) support evaluation technical assistance activities to help CMHS best meet program goals.


The evaluation design includes participation among CA awardees in four core study components and one special study (a subsample of FY 2008-funded sites only) that employ both qualitative and quantitative methods to comprehensively examine the impact of the CMHS program. Longitudinal data collected from children and families in each cohort over a 24-month period provide an assessment of improvement in functional and behavioral outcomes over time and satisfy the requirement of the Public Health Service Act, Title V, Part E, Section 565, Public Law 102-321, 42 U.S.C. 290ff-4(c), that information be collected on the longitudinal outcomes of services provided by the funded systems of care. The five currently approved study components, their associated instruments, and the purpose of data collection as it relates to the Public Law and to the Program Objectives stated in the Request for Applications (RFA) to which CA awardees applied are summarized in Table 1 below.




Table 1. Purpose of Currently OMB-Approved Cross-Site Evaluation Data Collection Instruments

STUDY & ASSOCIATED INSTRUMENTS

PURPOSE

Cross-Sectional Descriptive Study

  • Enrollment & Demographic Form (Web-based, record review)

  • Child Information Update Form

Public Health Service Act, Title V, Part E, Public Law 102-321: The study addresses Section 565, 42 U.S.C. 290ff-4(c), that the public entity involved will annually submit to the Secretary a report on the activities of the entity under the grant that includes a description of the number and demographics of children provided access to systems of care.

Program Objective: This study crosswalks with RFA requirement 2.1 that CA awardees serve children and/or adolescents with a serious emotional disturbance.

Child and Family Outcome Study

Caregiver Measures

  • Behavioral and Emotional Rating Scale (BERS–2C)

  • Caregiver Information Questionnaire, Revised (CIQ-R)

  • Caregiver Strain Questionnaire (CGSQ)

  • Child Behavior Checklist (CBCL)

  • Columbia Impairment Scale (CIS)

  • Education Questionnaire (EQ-R2)

  • Living Situations Questionnaire (LSQ)

  • Multi-Service Sector Contacts Questionnaire (MSSC-RC)

  • Culturally Competent Service Provision Questionnaire (CCSP-R)

  • Youth Services Survey for Families (YSS–F)


Caregivers of young children only:

  • Phase VI: Devereux Early Childhood Assessment (DECA), Parenting Stress Index (PSI), Preschool Behavioral and Emotional Rating Scale (PreBERS)

Youth Measures

  • Behavioral and Emotional Rating Scale (BERS–2)

  • Delinquency Survey (DS)

  • GAIN Quick—R: Substance Problems Scale (GAIN)

  • Revised Child Manifest Anxiety Scale (RCMAS)

  • Reynolds Adolescent Depression Scale (RADS–2)

  • Substance Use Survey (SUS-R)

  • Youth Information Questionnaire (YIQ-R)

  • Youth Services Survey (YSS)

Public Health Service Act, Title V, Part E, Public Law 102-321: The study addresses Section 565, 42 U.S.C. 290ff-4(c), that evaluations assess the effectiveness of the system of care, including longitudinal studies of outcomes of services, and effectiveness of the system of care as assessed by parents.

Program Objective: By assessing service experience, this study crosswalks with RFA requirement 2.2: Services Delivery. By assessing longitudinal outcomes, this study crosswalks with RFA requirement 2.5: Data Collection and Performance Measurement.

Services and Costs Study

  • Existing service utilization and cost data from agency management information systems and budgets, captured on an ongoing basis in the Services and Costs Tool provided by the national evaluation, and/or recoded and submitted to the national evaluation quarterly

  • Flexible fund expenditures, captured on an ongoing basis in the Flex Funds Tool provided by the national evaluation, and/or recoded and submitted to the national evaluation quarterly

Public Health Service Act, Title V, Part E, Public Law 102-321: This study addresses Section 564, 42 U.S.C. 290ff-3(f), and Section 565, 42 U.S.C. 290ff-4(c), that the public entity annually submit to the Secretary a report that includes a description of the types and costs of services provided, the availability and use of third-party reimbursements, and estimates of the unmet need for services in the jurisdiction of the entity, and that evaluations assess the effect of activities on the utilization of hospital and other institutional settings.

Program Objective: This study crosswalks with RFA requirement 2.4.1: Required Activities, that CA awardees develop financing approaches that promote provision of a cross-agency service delivery system, create flexible funds, and develop care review approaches that promote fiscal accountability. The study also crosswalks with RFA requirement 2.4.3: Sustainability, by providing data on the cost-effectiveness of systems of care.

System of Care Assessment      

  • Semi-structured interview with multiple stakeholders using the System of Care Assessment Interview Guides A-I, L-S

  • Review of randomly selected case records, document review, and follow-up telephone calls as needed

Public Health Service Act, Title V, Part E, Public Law 102-321: This study addresses Section 564, 42 U.S.C. 290ff-3(f), and Section 565, 42 U.S.C. 290ff-4(c), that the evaluation assess barriers and achievements resulting from interagency collaboration in providing community-based services to children with a serious emotional disturbance, and that the public entity annually submit to the Secretary a report assessing the manner in which the grant has been expended toward the establishment of a jurisdiction-wide system of care.

Program Objective: This study crosswalks with RFA requirements 2.4.1: Required Activities, 2.4.3: Sustainability, 2.4.4: System Development and Implementation Plan, and 2.5: Data Collection and Performance Measurement.

Sector and Comparison Study

  • Court Representative Questionnaire (CRQ)

  • Teacher Questionnaire (TQ)

  • School Administrator Questionnaire (SAQ)

  • Education Sector Caregiver Questionnaire (ESCQ)

Public Health Service Act, Title V, Part E, Public Law 102-321: The study addresses Section 565, 42 U.S.C. 290ff-4(c), that evaluations assess the effectiveness of the system of care.



The national evaluation is driven by the system of care program theory model. This program theory asserts that to serve children with serious emotional disturbance, service delivery systems need to offer a wide array of accessible, community-based service options that center on children’s individual needs, include the family in treatment planning and delivery, and are provided in a culturally and linguistically competent manner. An emphasis is placed on serving children in the least restrictive setting that is clinically appropriate. In addition, because many children with serious emotional disturbance use a variety of services and have contact with several child-serving agencies, service coordination and interagency collaboration are critical. The program theory holds that if services are provided in this manner, outcomes for children and families will be better than can be achieved in traditional service delivery systems.


To examine the system of care theory, the core studies of the national evaluation are designed to answer the following overarching questions:


  • Who are the children and families served by the program and by the funded communities? How do the characteristics of children and families who participate in systems of care differ? Does the served population change over time as systems of care mature?

  • How do systems of care develop according to system of care principles (e.g., family and youth involvement, cultural competence, interagency collaboration) over time? What are differences in the development of systems of care? In what ways does funding accelerate system development?

  • What is the degree to which each of the CA awardee communities has implemented, developed, and sustained their service systems according to the system of care conceptual framework, based on the results of a System of Care Assessment Tool?

  • To what extent do children’s clinical and functional outcomes improve over time? How are family outcomes affected? What is the nature of change in child, family, and system outcomes? How are changes in child, family, and system outcomes associated with efforts to implement and develop systems of care?

  • What are the service utilization patterns (specific services, treatments, and supports) for children and families in systems of care and what are the associated costs? In what ways do the services and supports that children and families receive differ? How cost-effective are systems of care over time? Are systems of care cost-effective?

  • To what extent are children’s and families’ experiences consistent with the system of care philosophy? How satisfied are children and families with the services they receive? How well do CA awardee communities provide a broad array of services in a cultural context that is most appropriate for the child and the family and that ensures a full partnership with families? How effective are specific services, treatments, or supports in producing positive outcomes for children and families?

  • Are there subgroups of children and families for whom a system of care is more effective?

  • To what extent do CA awardee communities receive technical assistance to implement the evaluation appropriately? How frequently is feedback provided to local CA awardee communities on the status of data collection and on findings of the evaluation?

  • To what degree are systems of care effective in producing positive outcomes for children and families?


These evaluation questions evolved over the last 19 years through development of the CMHI and feedback from system of care personnel and other partners and extend those mandated by the CMHI authorizing legislation. The legislation requires funded communities to participate in a national evaluation that assesses the number of children served, child and family characteristics, child and family outcomes, service utilization patterns, and system characteristics.


b. Summary of Specific Revisions to Instruments


Principal changes from the previous Phase VI OMB approval include:


  • Replacement of intake and follow-up questionnaires for the Child Welfare Sector and Comparison Study with an administrative record review form

  • Addition of an Education Sector Caregiver Questionnaire to the Education Sector and Comparison Study

  • Removal of data collection activities for the Alumni Networking Study, the CQI Initiative Evaluation, and the Sustainability Study.


The intake and follow-up questionnaires for the Child Welfare Sector and Comparison Study are proposed to be replaced with an administrative record review form in an effort to lessen participant burden. To comply with budget modifications, the request proposes to remove data collection activities associated with the Alumni Networking Study, the CQI Initiative Evaluation, and the Sustainability Study. To obtain supplemental information, not captured on the Education Questionnaire, about caregiver experiences with Individualized Education Programs in the Education Sector and Comparison Study, the request proposes to amend data collection activities for this study to include an Education Sector Caregiver Questionnaire.


c. Uses of Information Collected Through the CMHI Evaluation


CMHI evaluation data and reports have been, and will continue to be, used by multiple stakeholders, including SAMHSA, CMHS Directors and Government Project Officers (GPOs), CA awardees, the practice community, and the research community.


SAMHSA


As with findings from Phases I, II, III, IV and V, SAMHSA will be able to use the results from the Phase VI evaluation to:


  • determine whether CA awardees implement their programs according to program-specific requirements and whether fidelity to program implementation is associated with child and family outcomes.

  • develop policies and provide guidance regarding the development of systems of care.

  • enhance other CMHS programs that support system development (e.g., Projects for Assistance in Transition from Homelessness, Community Mental Health Services Block Grants, Cooperative Agreements for State-Sponsored Youth Suicide Prevention and Early Intervention, Mental Health Transformation State Incentive Grants, and the National Registry of Evidence-Based Programs and Practices program)

  • support the many partners that work in collaboration with CMHS, including the National Federation of Families for Children’s Mental Health and the National Mental Health Association in their national efforts to help build systems of care for children's mental health services.

  • fulfill the program’s reporting requirement of an annual report to Congress based on findings from a national evaluation of the program, as mandated by the program’s authorizing legislation


In addition, in 2010, to guide its work through at least 2012, SAMHSA identified eight strategic initiatives with input from stakeholders including Federal, state and local leaders; constituency groups; advisory council members; members of Congress; people in recovery; and family members. These initiatives are designed to focus SAMHSA’s work on improving lives and capitalizing on emerging opportunities. In particular, the CMHI evaluation responds to the following three strategic initiatives:


  • Recovery Support: SAMHSA is taking the lead on promoting recovery-oriented service systems and peer support for individuals with or in recovery from mental and substance use disorders. Thus, one of the eight Strategic Initiatives—“Recovery Support”—is designed:


“to partner with people in recovery from mental and substance use disorders and family members to guide the behavioral health system and promote individual-, program-, and system-level approaches that foster health and resilience; increase permanent housing, employment, education, and other necessary supports; and reduce discriminatory barriers.”


The “Recovery Support” strategic initiative includes four goals with embedded objectives and action steps. Of those, the CMHI program and data collection associated with Phase VI of the CMHI evaluation contribute most specifically to the following:

  • Engaging individuals in recovery and their families in self-directed care, shared decision-making, and person-centered planning

  • Ensuring that permanent housing and supportive services are available for individuals with or in recovery from mental and substance use disorders

  • Increasing gainful employment and educational opportunities, while decreasing legal and policy barriers, for individuals in recovery with mental and substance use disorders


  • Data, Outcomes and Quality Initiative: SAMHSA has highlighted the importance of supporting programming decisions with high quality data and of transparency in these decisions by making data readily available to the public. The objective of the initiative is:


“to realize an integrated data strategy that informs policy and measures program impact leading to improved quality of services and outcomes for individuals, families and communities.”


The initiative includes four goals with embedded objectives and action steps. Of those, the CMHI evaluation is guided by the following:

  • Improving the quality of SAMHSA’s program evaluations and services research

  • Improving the quality and accessibility of surveillance, outcome/performance, and evaluation information for staff, stakeholders, funders and policymakers


  • Trauma and Justice Initiative: SAMHSA is one of the leading agencies addressing the impact of trauma on individuals, families and communities across the country. Thus, one of the eight Strategic Initiatives—“Trauma and Justice”—is designed:


“to focus programmatic efforts on the goal of reducing the pervasive, harmful, and costly health impact of violence and trauma by integrating trauma-informed approaches throughout health and behavioral health care systems and by diverting people with substance use and mental disorders from criminal and juvenile justice systems into trauma-informed treatment and recovery.”


The “Trauma and Justice” strategic initiative includes five goals with embedded objectives and action steps. Of those, the CMHI program and data collection associated with the CMHI evaluation contribute most specifically to the following:

  • Reducing the impact of trauma

  • Supporting programs to address trauma experienced in childhood

  • Improving the availability of trauma-informed care


In sum, in its design and through its established priorities and data collection approach, Phase VI of the CMHI evaluation, as in other phases of the evaluation, will provide data that will allow SAMHSA to assess and illustrate the ways in which, as well as the extent to which, the CMHI program has achieved goals in the areas of urgency and opportunity as outlined in SAMHSA’s Strategic Initiatives.


CMHS Leadership


CMHS leadership has been, and will continue to be, able to use CMHI evaluation data reported by CA awardees to determine whether funded activities are progressing as expected and to keep abreast of any issues that CA awardees are having related to carrying out their proposed activities. Government Project Officers (GPOs) may also use the information to connect CA awardees who are conducting similar activities or serving comparable populations to facilitate collaboration across the CMHI.


In addition, the design for the CMHI evaluation provides for data collection, summarization, analysis, and reporting that can be used to address SAMHSA/CMHS priorities including:


  • Accountability: The evaluation was designed to support SAMHSA/CMHS legislatively-mandated reporting requirements. Findings from the evaluation have been, and will continue to be, used to fulfill the legislatively mandated requirements for annual reports to the Secretary and to Congress. Information to be reported includes:

  • Description of the number of children provided access to systems of care

  • Demographic characteristics of the children

  • Types and costs of services provided

  • Availability and use of third-party reimbursements

  • Estimates of the unmet need for such services within grantee jurisdictions, and

  • How the grant has been expended to establish a jurisdiction-wide system of care for children with a serious emotional disturbance,

  • Assessments of the effectiveness of systems of care that examine longitudinal and other studies of outcomes, the effect of systems of care on the utilization of hospitals and other institutional settings, barriers to and achievements from interagency collaboration in providing community-based services, and parent or caregiver assessments of effectiveness, and

  • Other information as may be required.


  • Program and policy planning. Findings from the evaluation inform both intra- and interagency discussion and decision-making for program and policy planning. The evaluation provides the most comprehensive data available about children with serious emotional disturbances and their long-term outcomes; these data are therefore frequently drawn on to fill often urgent requests for information received by SAMHSA from the Secretary and from other Federal child-serving entities within and beyond HHS (e.g., the Administration for Children and Families, the Department of Education, the Department of Justice) about the characteristics and long-term outcomes of children and youth with serious mental health concerns, and of subgroups of these children and youth such as those who are involved with child welfare, juvenile justice, or education services; are at risk of suicide; are lesbian, gay, bisexual or transgender; have experienced trauma or bullying; have misused prescription drugs; or have co-occurring disorders.

  • Quality Improvement: Mechanisms for reporting useful data profiles, summaries, and/or reports have been developed in previous phases of the evaluation and will continue to be used in Phase VI to support quality improvement activities for clinical interventions, other products, and training/dissemination efforts, and to serve as an incentive for data collection by data providers.


  • Program justification purposes: Program justification requires indicators not only of the effectiveness of activities and products in the abstract or in the published literature, but also of wide distribution and actual uptake of the activities and products, and evidence that they are effective, cost-effective and sustainable in communities throughout the country.




CA Awardees


Findings from the evaluation have been and will continue to be used by CA awardees to:


  • improve the implementation of their systems of care and achieve the goals of the CMHI.

  • improve their services, and support their efforts to obtain required matching funds and to sustain their system of care after the CMHI funding has ended. Indeed, several CA awardees have used data collected for the Phase I, II, III, IV and V studies to request additional funding from their State legislatures.

  • plan culturally competent services and supports which families and youth report as useful and that are associated with improved child, youth and family outcomes.

  • learn what barriers children or youth and their families perceive and work to eliminate such barriers.

  • learn whether families experience services as the CA awardees intended, and identify their programs’ strengths and weaknesses.

  • help identify gaps in system development and barriers to collaboration, and help CA awardees more effectively allocate personnel and funding and prioritize activities.


Research Community


The research community, particularly the field of children’s mental health services research, will profit in a number of ways. First, evaluation of the CMHI will add significantly to the developing research base about systems of care. Second, the focus on child, family, and system outcomes will allow researchers to examine and understand the specific ways children improve, how services can be enhanced, and the importance of adherence to service plans. Moreover, the relationship among these variables will be better understood. Finally, the analysis of evaluation data will aid researchers in formulating new questions about systems of care and specific services, and will help both service providers and researchers improve the delivery of children’s mental health services. Data collected from the national evaluation have contributed to more than 750 publications and presentations.


Summary


The CMHI evaluation data and related reports produced will be useful to SAMHSA, CMHS GPOs and leadership, CA awardees and the research community. The level of evidence provided by the evaluation about program implementation and outcomes has enabled communities to use evaluation data to track activities funded by their CMHI CAs, provide summary reports to their local steering committees or other advisory boards, support statewide expansion efforts, develop interagency partnerships, and obtain resources to sustain systems with interagency agreements.


At all levels of government—Federal, State, and local—and in the private sector, decisions are being made that are dramatically changing the lives of children and families. To make these decisions in a responsible way, policymakers, communities, and other stakeholders need information such as the data and findings to be produced by the CMHI Evaluation.



3. USE OF IMPROVED INFORMATION TECHNOLOGY


The National Evaluator has provided software for computer-assisted personal interviewing (CAPI) to CA awardees. Across all study components approximately 90 percent of total responses, based on our most recent assessment of previous use, will be obtained electronically by CAPI or Web survey.


Data from the Cross-Sectional Descriptive Study, Child and Family Outcome Study, Service Experience Study, and the Sector and Comparison Studies are managed using an integrated Internet-based data input, management, and dissemination system—the interactive-collaborative network (ICN). The ICN, which was introduced in Phase III and refined in Phases IV, V, and VI of the national evaluation, reduces evaluation burden for the sites and allows real-time access to data for site personnel and National Evaluation Team members. The ICN is designed to capture the specific data collected by the national evaluation to meet the reporting requirements of the CMHI’s authorizing legislation. The system serves as a mechanism for communicating about data quality, and evaluation activities and results.


The ICN was designed as a three-part system that allows systematic data input, immediate validation to identify data input flaws, and monitoring of data entry and evaluation activities in real time. It reduces processing time and provides the capability to create interactive reports. The ICN is a secure system that protects privacy by providing different levels of password-protected access to site and national data.


  • Data Input. The data entry software allows sites with available laptop computers the option of CAPI interviewing by entering the participant’s responses directly into the data entry package during the interview. The software allows rapid data entry off-line, and the Internet is used to transfer data from local sites to the national database. Specific descriptive information on Cross-Sectional Descriptive Study participants is entered directly into the ICN Web site. This Web-based data entry is designed to be used by intake workers or case managers, who are often located at various agencies rather than at a central evaluation office. The primary goal of this Web-based data entry is to maximize the capture of descriptive information on all children served in system of care programs while eliminating burden associated with the Cross-Sectional Descriptive Study. Finally, for the Services and Costs Study, the National Evaluator has developed the Services and Costs Tool. This Web-based data collection application is designed to create a child-level data record for each system of care service received by children/youth. CA communities have the option to key in data in any of the service module fields or to upload an extract file representing the same data. The application features validation checks for quality assurance, preset response categories, secure access authorization for multiple persons within each community, and multiple automated reports (a minimal illustrative sketch of this kind of input validation appears after this list).

  • Data Monitoring, Management and Dissemination. Software allows the National Evaluator and CMHS to monitor the status of each site’s data submissions in real time and permits sites to check the status of their own data submissions. Reporting features support sites’ abilities to use their data for quality assurance monitoring and system improvement purposes. Basic validations are completed during the data entry process. Every month, detailed reports are provided to communities that detail any potential data errors or issues. The National Evaluator has automated these reports, such that communities have real-time, on-demand access to these reports. These features are available to Phase VI communities that have started data collection. Reports posted on the ICN provide a vehicle for the review of aggregate data that CMHS has approved for public release. For example, Data Profile Reports, created 3 times per year, display a summary of child- and family-level descriptive and outcome data collected at the community and aggregate level.
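
The validation checks and preset response categories described above can be pictured with a brief, hypothetical sketch. The Python code below is illustrative only: the field names, response categories, and rules are assumptions made for demonstration, not the actual ICN or Services and Costs Tool schema or the national evaluation's validation specification.

    # A minimal, hypothetical sketch of record-level validation of the kind a
    # Web-based data entry system such as the ICN might apply at input time.
    # Field names, response categories, and rules are illustrative assumptions.

    from datetime import date
    from typing import Dict, List

    # Hypothetical preset response categories for an enrollment record.
    ALLOWED_GENDER = {"male", "female", "unknown"}
    ALLOWED_REFERRAL = {"child welfare", "juvenile justice", "education",
                        "mental health", "other"}

    def validate_enrollment_record(record: Dict[str, object]) -> List[str]:
        """Return data-quality messages for one (hypothetical) enrollment record."""
        issues = []

        # Required fields must be present and non-empty.
        for field in ("child_id", "enrollment_date", "gender", "referral_source"):
            if not record.get(field):
                issues.append("missing required field: " + field)

        # Categorical fields must use the preset response categories.
        if record.get("gender") not in ALLOWED_GENDER | {None, ""}:
            issues.append("gender not in preset categories: %r" % record.get("gender"))
        if record.get("referral_source") not in ALLOWED_REFERRAL | {None, ""}:
            issues.append("referral_source not in preset categories: %r"
                          % record.get("referral_source"))

        # Simple logical check: an enrollment date cannot be in the future.
        enrolled = record.get("enrollment_date")
        if isinstance(enrolled, date) and enrolled > date.today():
            issues.append("enrollment_date is in the future")

        return issues

    if __name__ == "__main__":
        sample = {"child_id": "A-001", "enrollment_date": date(2011, 4, 12),
                  "gender": "female", "referral_source": "school"}
        # "school" is not one of the hypothetical preset categories, so it is flagged.
        print(validate_enrollment_record(sample))

Checks of this general kind, applied at the moment of entry, let the person entering the data correct problems immediately, which is the main error- and burden-reducing feature of the validated Web-based entry described above.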


The National Evaluator will provide training and direct evaluation technical assistance support to sites to facilitate the implementation of the evaluation protocol and the use of evaluation results at the site level. Site personnel will be trained to utilize the ICN at national training meetings and during evaluation technical assistance visits to the sites.


CMHI evaluation surveys and forms that are Web-based for Phase VI of the evaluation include:

  • The Services and Costs Study Tool (Web-based data collection application)

  • Enrollment and Demographic Information Form (Web-based form)

  • Transfer of administrative data from schools, criminal justice systems, and child welfare agencies for the Sector and Comparison Study to ICF Macro (via secure web site)


The use of Web-based surveys and forms decreases respondent burden, as compared to alternative methods such as a paper format, by allowing for direct transmission of the survey or form. In addition, the data entry and quality control mechanisms built into the Web-based format reduce errors that might otherwise require follow-up, thus reducing burden compared to a hard-copy administration. Respondents can also complete the survey at a time and location that is convenient for them. The national evaluation’s development of the Services and Costs Study Tool has also minimized communities’ need to develop their own systems locally and the costs of such development. Similarly, the transfer of existing administrative data for the Sector and Comparison Study reduces the need for additional data entry by project staff and reduces the potential for error.


All of the Web-based surveys associated with the evaluation recruit respondents to participate through an e-mail invitation. The e-mail process occurs in four stages: (1) an advance invitation to participate, (2) a formal invitation, which includes the Web site’s URL and unique user name and password, (3) a reminder to all respondents, and (4) a final targeted reminder to nonresponders and those who have only partially completed the survey.
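
As a rough illustration of how this staged contact sequence can be operationalized, the hypothetical sketch below selects the next e-mail for a single respondent based on how many contacts have already been sent and the respondent's completion status. The function name, message labels, and status values are assumptions made for illustration, not the actual survey system used by the national evaluation.

    # Hypothetical sketch of the four-stage e-mail contact sequence described
    # above. Names, labels, and statuses are illustrative assumptions only.

    from typing import Optional

    def next_contact(contacts_sent: int, status: str) -> Optional[str]:
        """Return the next e-mail to send to one respondent, or None if no contact is due.

        contacts_sent -- number of the four contacts already sent (0 through 4)
        status        -- 'not_started', 'partial', or 'complete'
        """
        if status == "complete":
            return None                    # no further contact once the survey is finished
        if contacts_sent == 0:
            return "advance_invitation"    # stage 1: advance invitation to participate
        if contacts_sent == 1:
            return "formal_invitation"     # stage 2: URL plus unique user name and password
        if contacts_sent == 2:
            return "reminder"              # stage 3: reminder to all respondents
        if contacts_sent == 3:
            return "targeted_reminder"     # stage 4: nonresponders and partial completers only
        return None                        # all four stages exhausted

    # Example: a respondent with two contacts sent and a partially completed
    # survey would next receive the stage 3 reminder.
    print(next_contact(2, "partial"))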


Finally, SAMHSA and its contractors strive to ensure that all Web-based solutions are fully compliant with Section 508 of the Rehabilitation Act. This includes ensuring that all posted documents are compliant or have a compliant alternative. The National Evaluator utilizes Adobe products that are capable of producing compliant PDF files per the SAMHSA recommended process. The National Evaluator has a thorough knowledge of Section 508 standards and employs accessibility specialists with experience in Section 508 compliance verification, including assessment with a variety of assistive technologies such as screen readers, screen magnifiers, and voice recognition software.



4. EFFORTS TO IDENTIFY DUPLICATION


This evaluation generates data that have not previously been collected in the field of children’s mental health services, or that have been collected only minimally or only by the CMHI cross-site evaluation in the past. This includes information on access to quality, evidence-based care for children, youth, and their families and on disparities in access to care by demographic group, including a comparison of access to care within and outside of the CMHI; the process of developing, disseminating, and implementing evidence-based practices (EBPs) for children, youth, and their families; and the national impact of the CMHI. As well, the four Core Studies collect data on who receives system of care services, the types of services they receive, and the outcomes related to receipt of these services in a systematic manner that yields more extensive, detailed, and consistent information than has previously been obtained.


The National Evaluator also conducted an extensive literature search to identify existing evaluation research on systems of care and children’s mental health services. The search included a review of published literature, unpublished papers, works-in-progress, and working papers and documents. During the implementation of the Phase I–V evaluations, the National Evaluator has kept abreast of the literature in children’s mental health services research and has been in close contact with the original CA awardees. This has allowed the team to keep up with advances in practice and research. In addition, the Services Evaluation Committee for the national evaluation has helped keep the evaluation apprised of innovations in the field. These efforts yielded a broad list of useful references. While some of the research identified contains features similar to the planned evaluation, the scope of the research projects varies considerably and is driven by the particular research interests of each investigator. The Phase VI evaluation offers unique contributions to the field not available in these other studies.


Phase VI does not duplicate extant studies, but instead enhances the existing knowledge base. In addition, Phase VI provides information that is specific to this service program. As required by the legislation, data must be collected from the communities in which the program has been funded. Existing research and data in the area of children’s mental health services are not sufficient to address the questions posed in this evaluation. For questions related specifically to the functioning and impact of the CMHI, the evaluation has and will serve as a primary mechanism through which the CMHI will be understood, improved, and sustained.


The data collected under Phase VI of the National Evaluation are not available in other Federal databases, nor are they collected through TRAC.


As described above in Section A.1.d, advances in the field of children’s mental health have emphasized the importance of assessing the impact of providing coordinated, community-based mental health services through a system of care environment, and the ability to sustain system of care services. Consequently, Phase VI addresses both of these issues by including a Sector and Comparison Study aimed at increasing the understanding of the factors that affect improvements on clinical outcomes for children and their families.



5. INVOLVEMENT OF SMALL ENTITIES


Some of the data for this evaluation will be collected from mental health, juvenile justice, education, and child welfare agencies. While most data will be collected from public agencies, it is possible that some organizations providing services to the target population, such as community-based organizations, not-for-profit agencies, private providers, schools, or parent groups, would qualify as small entities. The information requested is the minimum required to meet the study objectives. The site visit interview guides used in the System of Care Assessment and the sector-specific instruments used in the Sector and Comparison Study are the only instruments that will be administered to the staff of small entities.



6. CONSEQUENCES IF INFORMATION IS COLLECTED LESS FREQUENTLY


Below is a summary of the consequences if the CMHI Evaluation information is collected less frequently, organized by individual studies that all currently have OMB approval.


System of Care Assessment. Data for this component will be collected every 18–24 months across the 6 years of system of care community funding (beginning in the second year), documenting how the program has led to system enhancement. This information is key to examining whether improved outcomes for the children served by the system can be plausibly linked to this initiative. Because systems of care change slowly, collection of system data every 18–24 months is sufficient to provide information on system implementation, organizational involvement, and relationships. If these data were collected less frequently, important interim changes would not be documented.


Cross-Sectional Descriptive Study. Data for this component will be collected when children and families first access the system of care. These data elements are maintained by the CA awardees for their own administrative purposes; hence their collection creates no additional respondent burden. For families participating in the Child and Family Outcome Study, however, the descriptive information that may have changed over time (e.g., family income, caregiver’s marital status) will be collected at each follow-up data collection point. Failure to collect these few data elements at follow-up would preclude the detection of key changes in the child’s environment that could have an important impact on the child’s clinical outcomes, service use, or family functioning. Data from the CA awardee sites will be submitted to the National Evaluator continuously using the ICN, resulting in a minimal burden to site staff.


Child and Family Outcome Study. For this component, data will be collected at intake and every 6 months for the length of the evaluation, up to 24 months. Clinicians who work with this population of children suggest that once children enter services, they are likely to experience detectable improvements within the first 6 months of services. However, whether improvement is sustained is important to demonstrate. Assessing outcomes every 6 months allows for the study of the course of improvement over time so that interventions can be planned for times that are likely to yield the greatest gains. Thus, waiting 12 months to collect outcome data would miss important changes that are likely to happen in children who are still developing. On the other hand, it was the judgment of the Research Advisory Board and prior CA awardees that quarterly data collection would be too burdensome.


Sector and Comparison Study. Data for this study component will be collected at intake into the evaluation and at subsequent 6-month intervals, in conjunction with the Child and Family Outcome Study, for a subset of communities funded in FY 2008. Of particular interest for the sector and comparison studies are functional outcomes such as educational performance, abstention from delinquent and criminal behavior, and placement stability. It is important to follow children as long as possible to capture changes that occur as children enter new developmental stages, especially adolescence and young adulthood. For the educational sector, teachers and school administrators will be surveyed at baseline and every 6 months at follow-up for similar reasons. The caregiver questionnaire will be administered at baseline and every 6 months at follow-up only if the caregiver indicates that their child has received an Individualized Education Program (IEP) in the previous 6 months. For the juvenile justice sector, court representatives who are responsible for overseeing youths’ completion of court-required activities will be assessed at baseline and every 6 months at follow-up for similar reasons. Youth are required to report regularly to court representatives to ensure completion of these activities, and completion may occur over a period of months or years, depending on the youth’s sentence or status. Collecting the data less frequently for all sectors may miss important changes that are likely to occur with every new academic year.


Services and Costs Study. Data used in this study come from communities’ management information systems (MISs); the study is aimed at assessing all services received by children and their families and the associated costs. These data are episodic in nature, and failing to collect information on all episodes of service would result in underreporting of service utilization and underestimation of the service costs incurred by children and families. Without collecting services and costs data from the beginning of service delivery, within a consistent data structure across all grant communities, the ability to accomplish these study goals is seriously diminished. SAMHSA is often asked to demonstrate the cost-effectiveness of this grant program. Without complete and consistent data from all communities, the validity of these types of cost analyses would be compromised.



7. CONSISTENCY WITH GUIDELINES IN 5 CFR 1320.5(d)(2)


The data collection fully complies with the requirements of 5 CFR 1320.5(d)(2).



8. CONSULTATION OUTSIDE THE AGENCY


Federal Register Notice


The notice in the Federal Register was published by SAMHSA on April 12, 2012 (Vol. 77, p. 21986) to solicit public comment on this study.





Consultation Outside of the Agency


Consultation on the design, instrumentation, data availability and products, and statistical aspects of the evaluation occurred continually throughout the implementation of Phases I, II, III, IV, and V. To capitalize on the experience and knowledge gained, the development of Phase VI was based, in part, on this consultation. Since the beginning of this initiative, consultations have been sought from the following:


  • Federal representatives working in related program areas

  • Experts in the area of child mental health services research

  • CMHS CA awardees

  • Families caring for children with emotional and behavioral disorders

  • Representatives of national organizations for children, families, and providers in the field (e.g., National Technical Assistance Center for Children’s Mental Health, National Mental Health Association, the National Federation of Families for Children’s Mental Health, National Alliance for the Mentally Ill, State Mental Health Representatives for Children and Youth)

  • Experts in program evaluation, measurement, and statistical analysis

  • Experts in Web site usability testing

  • Experts in mental health service systems for Native American children


These consultations had several purposes: (1) to ensure continued coordination of related activities, especially at the Federal level; (2) to ensure the rigor of the evaluation design, the proper implementation of the design, and the technical soundness of study results; (3) to verify the relevance and accessibility of the data to be collected; and (4) to minimize respondent burden.


a. Federal Consultation



Input from representatives of Federal agencies involved in children’s mental health issues has been elicited throughout all phases of the national evaluation. CMHS received input about its children’s services program from Federal offices including, but not limited to, the following: the Office of Special Education Programs, DoE; the Office of Juvenile Justice and Delinquency Prevention, DoJ; the Office of Disability, DHHS; and the Division of Adolescent and School Health, CDC. (See Attachment A.1.a-c for a list of the participants in the Federal/National Partnership for Children’s Mental Health and their affiliations and telephone numbers.) Specifically, representatives from the listed Federal agencies have convened to develop strategies for coordinated training, technical assistance, and culturally competent services to communities across the country.


In addition, SAMHSA, the parent agency of CMHS, requires that its other two constituent centers, the Center for Substance Abuse Treatment (CSAT) and the Center for Substance Abuse Prevention (CSAP), conduct an internal review of the Annual Report to Congress on the Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program. Evaluation specialists at the CDC, NIMH, and the Office of the Assistant Secretary for Planning and Evaluation (ASPE) of DHHS have also reviewed and provided comments on the national evaluation. Furthermore, NIMH has been represented on the Services Evaluation Committee of the national evaluation by various individuals over the past several years, including most recently Beverly Pringle, David Chambers, and Carmen Moten. (See Attachment A.2 for a list of Methodological Consultants and Services Evaluation Committee).


b. Expert Consultation


The Services Evaluation Committee of the national evaluation, a workgroup of expert consultants, was organized to provide technical guidance and review for Phase I of the evaluation. The Services Evaluation Committee continued to have input regarding the enhanced design and instrumentation for Phases II, III, IV, and V. Recommendations made by this group influenced changes applied to the Phase VI instrumentation. Services Evaluation Committee members have combined expertise in children’s mental health, the delivery of children’s mental health services, and the evaluation of systems of care. (See Attachment A.2 for a list of Services Evaluation Committee members.)


c. Cooperative Agreement Awardee Consultation



Previously funded CA awardees have been key providers of input for all phases of the evaluation design. For the design of Phase VI, CA awardee input was used in the development of the instrument package. In October 2008, project directors and evaluators from previously funded sites participated in the Phase VI Evaluation Review Meeting, where the study design and instrumentation were discussed. These participants helped determine the instruments most appropriate for each component of the evaluation. Additional input from CA awardees was also received by the National Evaluator through conference calls, site visits, semi-annual workshops and evaluator meetings, close-out visits in which evaluation processes and data utilization were reviewed, and CA awardee participation on the Services Evaluation Committee.


d. Family Consultation


Caregivers participated on the Services Evaluation Committee and gave early input to the overall design. Caregivers also reviewed the instrumentation and key features of the evaluation design to ensure sensitivity to parent issues and concerns as well as to maximize clarity of meaning and to assess feasibility of administering the questionnaires. CA awardee sites systematically solicited feedback from family members; hence the family perspective was also included in comments and consultation from CA awardee sites.



9. PAYMENT TO RESPONDENTS


As with previous phases, Phase VI of the national evaluation will use a research-based approach to evaluation and, as such, will require participation of children and families beyond their receipt of services in their system of care programs. Consequently, remuneration is essential to ensure good response rates across all study components.


Remuneration levels for the System of Care Assessment, Child and Family Outcome Study, and Sector and Comparison Study are the same as those currently approved for Phase VI.


System of Care Assessment. Three caregivers of children who receive services in each system of care community are interviewed during each System of Care Assessment site visit. The national evaluation will provide a payment of $25 to them at the time of their interviews in compensation for the additional burden and potential inconvenience of these interviews. Two youth participants in each system of care community are interviewed during each System of Care Assessment site visit. The national evaluation will provide a payment of $15 to them at the time of their interviews in compensation for the additional burden and potential inconvenience of these interviews.


Child and Family Outcome Study. The National Evaluator strongly recommends that CA awardees remunerate respondents who participate in the Child and Family Outcome Study $20 each for caregivers and youth at each administration. Remuneration is essential to help maximize participation rates, particularly given the additional time being asked of families who already face multiple challenges and demands on their time in caring for their children with serious emotional disturbance. Completing the instruments at the time of entry to services and at subsequent follow-up points requires evaluation participants to spend time away from other activities and creates a burden for caregivers and children that exceeds the burden that would ordinarily be placed on them if they were seeking services not associated with this evaluation.


Sector and Comparison Study. At baseline, incentives will be paid to caregivers and youth ($40 and $20, respectively). The incentives will include a bonus incentive of $50 paid to each caregiver and youth who complete all five waves of data collection. As noted, remuneration is standard practice in this type of longitudinal research to acknowledge participants’ value to the study and to help maximize participation rates given the amount of time being asked of these families. Incentives will also be provided to the agency representatives in the amount of $20 for their participation in interviews. State and county agency representatives may not be allowed to accept incentives. In this case, alternative methods of providing incentives will be devised, which may include a donation to the overall agency, a donation to an agency project or activity, or donation to a charity of the respondent's choice.



10. ASSURANCE OF CONFIDENTIALITY


Phase VI requires collecting descriptive and outcomes data from children and families. In all the CA awardee sites, data are collected by site staff. These staff members are responsible for developing procedures to protect the privacy of all participants during evaluation data collection, data storage, and the reporting of all information obtained through data collection activities. These procedures include limiting the number of individuals who have access to identifying information, using locked files to store hardcopy forms, assigning unique code numbers to each participant to ensure anonymity, and implementing guidelines pertaining to data reporting and dissemination.


Because of the sensitivity of the information that will be collected, CMHS will require that all CA awardees establish a system whereby data are gathered, stored, and accessed in a manner that protects the information as much as possible. The National Evaluator will provide each CA awardee with a coding schema that each site will use to generate code numbers to assign to individual respondents, and will train staff responsible for data collection on the process of developing codes and linking them to individual respondents. Sites will be instructed to maintain a list of the codes and their assignment to individual respondents. A secure, stand-alone software application that allows site evaluation staff to store codes with respondent names will also be provided to sites. This program is password protected, and sites will be instructed to limit access to the database to only those onsite evaluation staff who need access to this information. If a paper list is maintained, the list linking the assigned codes to respondent names will be kept in a locked cabinet, and only the onsite data collection staff will have access to the list. The database or list will be maintained for the duration of the CMHS program. The purpose of maintaining the list for this period is to ensure that the data can be linked back to the identified child and family throughout the data collection process. When the project is completed, the databases or lists will be destroyed. This coding system was developed to facilitate the tracking of children during their involvement with the evaluation and to ensure that no personal identifying information from the CA awardee sites would need to be made available to either the National Evaluator or CMHS.
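As a rough illustration of this coding approach, the sketch below assigns a site-prefixed code to each respondent, keeps the name-to-code linkage in a local, access-restricted store, and attaches only the code to outgoing evaluation records. The code format, class, and field names are hypothetical; the actual schema and software are supplied by the National Evaluator.

```python
# Minimal sketch of a respondent coding schema (hypothetical format: SITE-NNNN).
# The linkage table stays with the site; only the code accompanies evaluation data.

class RespondentCodebook:
    def __init__(self, site_prefix):
        self.site_prefix = site_prefix
        self._linkage = {}      # name -> code; kept locally and access-restricted
        self._next_number = 1

    def assign_code(self, respondent_name):
        """Assign (or look up) the unique code for a respondent."""
        if respondent_name not in self._linkage:
            self._linkage[respondent_name] = f"{self.site_prefix}-{self._next_number:04d}"
            self._next_number += 1
        return self._linkage[respondent_name]

codebook = RespondentCodebook("SITE01")
code = codebook.assign_code("Jane Doe")  # placeholder name

# Only the code, never the name, is attached to records shared with the National Evaluator.
outgoing_record = {"respondent_code": code, "instrument": "CIQ-RC-I", "wave": "intake"}
print(outgoing_record)
```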


The security of data entered and managed on the Internet-based ICN also will be assured. Access to the ICN will be password protected, and the ICN will use data encryption to further enhance security and privacy. Further, the project including the ICN system will operate under an ADP/IT security plan approved by CMHS to assure that project data are protected.


Each CA awardee will develop and implement an active consent procedure that informs the participants of the purpose of the evaluation, describes what their participation entails, and addresses how privacy will be maintained as described above. Informed assent will be obtained from participating older children and adolescents (aged 11–17 years). In addition, informed consent will be obtained from adolescents who have reached the age of 18 at follow-up data collection. Written informed consent or assent will be obtained from children and families at the point of entry into services. Each CA awardee will obtain local Institutional Review Board (IRB) approval for the informed consent or assent procedures used in this evaluation. CA awardees are instructed to determine whether updates to consents are required at each data collection point, since the legal custody of a child may change, a child may become old enough to participate in a youth interview, a youth may become an emancipated minor or age up into adult status, and local IRBs may have requirements for regular updates.


As in previous phases of the national evaluation, to further protect study participants for Phase VI, the National Evaluator has obtained a Federal Certificate of Confidentiality, authorized by Section 301(d) of the Public Health Service Act, as well as IRB approval within ICF Macro for the following studies: System of Care Assessment, Services and Costs, and Sector and Comparison. All CA awardees will also obtain a Certificate of Confidentiality. This certificate provides additional protections of the data from civil and criminal subpoena. Additionally, the National Evaluator will conform to all requirements of the Privacy Act of 1974, under the System of Records: Alcohol, Drug, and Mental Health Epidemiological, and Biometric Research Data, DHHS, #09–30–0036; the most recent publication in the Federal Register occurred on January 19, 1999 (64 FR 2914). Client records at the sites are also covered under this Privacy Act System of Records.


System of Care Assessment. Data collection for the System of Care Assessment will occur via face-to-face interviews. Because respondents’ identities will be known, to ensure that participants’ rights are protected, an active informed consent process will occur. (See Attachment B.5 for informed consent forms.)


Services and Costs Study. The national evaluation trains all grant communities to include specific language in their consent and assent forms to describe the services and costs data that will be accessed through the child/youth’s records and shared with the national evaluation. Although grant communities may work with personal identifying information to extract and link electronic records, no personally identifying information will be included in any data transferred to the national evaluation for this study, other than the child/youth’s national evaluation child identification number.


For those communities electing to enter data in the Flex Funds Tool or the Services and Costs Data Tool, data in these applications are password protected to ensure privacy. When data are transferred to the national evaluation, data files will be encrypted to protect the information during electronic transfer. No child identifying information will be included in these data files other than the child/youth’s national evaluation child identification number.


Sector and Comparison Study. Caregiver informed consent and youth assent procedures for participants in the comparison study will follow those of the system of care participants described above. Caregivers of youth involved with the sector studies will provide consent for their children’s agency representative (e.g., teachers, child welfare case worker, or court representative) to complete the respective sector-specific instruments. The consent for completing these instruments will be included in the caregiver consent form for the Child and Family Outcome Study. (See Attachments E.2 for informed consent forms.)



11. QUESTIONS OF A SENSITIVE NATURE


Because this project concerns services to children with serious emotional disturbance and their families, it is necessary to ask questions that are potentially sensitive. However, only information that is central to the study is being sought. Questions address dimensions such as child emotions, behavior, social functioning, school performance, substance use, and involvement in unlawful activities. Also asked are questions about the child’s experience with sexual and physical abuse and suicidality. The answers to these questions will be used to determine baseline status and to measure changes in these areas experienced after entering the system of care. Questions about child abuse and suicidality have implications for local mandated reporting, which CA awardees are instructed to consider and for which they are to train interviewers accordingly. Since each CA awardee must keep data on child and family status and service use, as well as treatment plans and other information, the data collection required for the national evaluation does not introduce new, sensitive domains of inquiry, but parallels standard procedures in the field of children’s mental health.


In addition to information on child clinical status and social functioning, other questions of a sensitive nature will be asked of families. These include questions related to family functioning, caregiver strain, and parental distress, and they are included in order to measure family involvement in treatment planning and service delivery. Moreover, family representatives who have consulted with the National Evaluator consistently identify a lack of information on family life as a weakness of previous studies.


Before collecting data, each CA awardee will obtain active consent from caregivers. In addition, child assent will also be obtained. In that process respondents will be made aware that the information they provide will be protected strictly and that they can withdraw their participation at any time. Similarly, respondents can freely choose to refrain from answering any questions they find objectionable.



12. ESTIMATES OF ANNUALIZED HOUR BURDEN


In accordance with the evaluation design, the descriptive, outcome, intervention, and service information collection for the 47 communities in Phase VI of the national evaluation will cover a period of 5 years. Data collection for the 18 communities funded in FY2008 will end in September 2014. Data collection for the 20 communities funded in FY2009 will end in September 2015. Data collection for the 9 communities funded in FY2010 will begin upon OMB approval and end in September 2016.


Table 2 shows the combined burden associated with the remaining three years of data collection for CA awardees funded in FY 2008 and FY 2009, and the three years of data collection for CA awardees funded in FY 2010. For measures that were previously cleared by OMB, the burden estimates presented in Table 2 are based on information supplied by CA awardees in prior phases of the evaluation.


Table 2. Estimate of Respondent Burden

Note: Total burden is annualized over a 3-year period.

Note: Bracketed numbers refer to the numbered footnotes following the table.

| Instrument | Respondent | Number of Respondents | Total Average Number of Responses per Respondent | Hours per Response | Total Burden Hours | 3-Year Average Annual Burden Hours | Hourly Wage Rate ($) | Average Annual Cost ($) |

System of Care Assessment

| Interview Guides A–S | Key site informants | 1,081 [1] | 3 | 1.00 | 3,243 | 1,081 | 19.23 [2] | 12,473 |

Child and Family Outcome Study

| Caregiver Information Questionnaire, Revised: Caregiver—Intake (CIQ–RC–I); Staff as Caregiver—Intake (CIQ–RS–I) | Caregiver; Staff as Caregiver | 6,561 [3] | 1 | 0.37 | 2,406 | 802 | 9.93 [4] | 4,778 |
| Caregiver Information Questionnaire, Revised: Caregiver—Follow-Up (CIQ–RC–F); Staff as Caregiver—Follow-Up (CIQ–RS–F) | Caregiver; Staff as Caregiver | 6,561 | 4 [5] | 0.28 | 7,436 | 2,479 | 9.93 | 14,767 |
| Caregiver Strain Questionnaire (CGSQ) | Caregiver | 6,561 | 5 | 0.17 | 5,478 | 1,826 | 9.93 | 10,880 |
| Child Behavior Checklist 1½–5 (CBCL 1½–5); Child Behavior Checklist 6–18 (CBCL 6–18) | Caregiver | 6,561 | 5 | 0.33 | 10,924 | 3,641 | 9.93 | 21,695 |
| Education Questionnaire, Revision 2 (EQ–R2) | Caregiver | 6,561 | 5 | 0.33 | 10,924 | 3,641 | 9.93 | 21,695 |
| Living Situations Questionnaire (LSQ) | Caregiver | 6,561 | 5 | 0.08 | 2,723 | 908 | 9.93 | 5,408 |
| Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C) | Caregiver | 5,389 [6] | 5 | 0.17 | 4,500 | 1,500 | 9.93 | 8,937 |
| Columbia Impairment Scale (CIS) | Caregiver | 6,281 [7] | 5 | 0.08 | 2,607 | 869 | 9.93 | 5,117 |
| Parenting Stress Index (PSI) | Caregiver | 2,151 [8] | 5 | 0.08 | 896 | 299 | 9.93 | 1,780 |
| Devereux Early Childhood Assessment for Infants (DECA 1–18M); Devereux Early Childhood Assessment for Toddlers (DECA 18–36M); Devereux Early Childhood Assessment (DECA 2–5Y) | Caregiver | 1,576 [9] | 5 | 0.08 | 657 | 219 | 9.93 | 1,304 |
| Preschool Behavioral and Emotional Rating Scale (PreBERS) | Caregiver | 1,576 | 5 | 0.10 | 788 | 263 | 9.93 | 1,565 |
| Delinquency Survey, Revised (DS–R) | Youth | 3,986 [10] | 5 | 0.13 | 2,657 | 886 | 7.25 [11] | 3,853 |
| Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y) | Youth | 3,986 | 5 | 0.17 | 3,328 | 1,109 | 7.25 | 4,826 |
| GAIN Quick–R: Substance Problem Scale (GAIN) | Youth | 3,986 | 5 | 0.08 | 1,654 | 551 | 7.25 | 2,399 |
| Substance Use Survey, Revised (SUS–R) | Youth | 3,986 | 5 | 0.10 | 1,993 | 664 | 7.25 | 2,890 |
| Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2) | Youth | 3,986 | 5 | 0.07 | 1,329 | 443 | 7.25 | 1,927 |
| Reynolds Adolescent Depression Scale, Second Edition (RADS–2) | Youth | 3,986 | 5 | 0.05 | 997 | 332 | 7.25 | 1,445 |
| Youth Information Questionnaire, Revised—Intake (YIQ–R–I) | Youth | 3,986 | 1 | 0.25 | 997 | 332 | 7.25 | 1,445 |
| Youth Information Questionnaire, Revised—Follow-Up (YIQ–R–F) | Youth | 3,986 | 4 | 0.25 | 3,986 | 1,329 | 7.25 | 5,780 |

Service Experience Study

| Multi-Sector Service Contacts, Revised: Caregiver—Intake (MSSC–RC–I); Staff as Caregiver—Intake (MSSC–RS–I) | Caregiver; Staff as Caregiver | 6,561 | 1 | 0.25 | 1,640 | 547 | 9.93 | 3,258 |
| Multi-Sector Service Contacts, Revised: Caregiver—Follow-Up (MSSC–RC–F); Staff as Caregiver—Follow-Up (MSSC–RS–F) | Caregiver; Staff as Caregiver | 6,561 | 4 | 0.25 | 6,561 | 2,187 | 9.93 | 13,030 |
| Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R) | Caregiver | 6,561 | 4 [12] | 0.13 | 3,499 | 1,166 | 9.93 | 6,949 |
| Youth Services Survey for Families (YSS–F) | Caregiver | 6,561 | 4 | 0.12 | 3,071 | 1,024 | 9.93 | 6,098 |
| Youth Services Survey (YSS) | Youth | 3,986 | 4 | 0.08 | 1,323 | 441 | 7.25 | 1,919 |

Comparison and Sector Study: Juvenile Justice

| Court Representative Questionnaire (CRQ) | Court representatives | 202 [13] | 5 | 0.50 | 505 | 168 | 26.44 [14] | 2,670 |
| Electronic Data Transfer of Juvenile Justice Records | Key site personnel | 202 | 5 | 0.03 | 34 | 11 | 26.44 | 178 |

Comparison and Sector Study: Education

| Teacher Questionnaire (TQ) | Teacher | 202 | 5 | 0.50 | 505 | 168 | 26.44 | 2,670 |
| School Administrator Questionnaire (SAQ) | School administrators | 202 | 5 | 0.50 | 505 | 168 | 26.44 | 2,670 |
| Electronic Data Transfer of Education Records | Key site personnel | 202 | 5 | 0.03 | 34 | 11 | 26.44 | 178 |
| Education Sector Caregiver Questionnaire (ESCQ) | Caregiver | 202 | 5 | 0.08 | 81 | 27 | 26.44 | 427 |

Comparison and Sector Study: Child Welfare

| Electronic Data Transfer of Child Welfare Records | Key site personnel | 202 | 5 | 0.03 | 34 | 11 | 26.44 | 178 |

Services and Costs Study

| Flex Funds Data Dictionary/Tool | Local programming staff compiling/entering administrative data on children/youth | 1,565 [15] | 3 [16] | 0.03 | 155 | 52 | 24.04 [17] | 745 |
| Services and Costs Data Dictionary/Data Entry Application | Local evaluator, staff at partner agencies, and programming staff compiling/entering service and cost records on children/youth | 6,561 | 100 [18] | 0.05 | 32,805 | 10,935 | 26.44 | 173,473 |

Total Summary

| | Number of Distinct Respondents | Annual Number of Responses per Respondent | Total Annual Number of Responses | Average 5-Year Burden per Response (hours) | Total Annual Burden (hours) [19] | |
| Total Summary | 11,628 | 15 | 232,582 | | 40,024 | 582,444 |




  1. An average of 23 stakeholders in up to 47 grant communities will complete the System of Care Assessment interview. These stakeholders will include site administrative staff, providers, agency representatives, family representatives, and youth.

  2. Assuming the average annual income across all types of staff/service providers/administrators/caregivers is $40,000, the wage rate was estimated using the following formula: $40,000 (annual income)/2080 (hours worked per year) = $19.23 (dollars per hour).

  3. Number of respondents across 47 CA awardees (6,258), in addition to 303 children/families from the comparison sample. Average based on a 5 percent attrition rate at each data collection point.

  4. Given that 56 percent of the families in the Phase V evaluation sample fall at or below the 2008–2009 DHHS National Poverty Level of $20,650 (based on a family of four), the wage rate was estimated using the following formula: $20,650 (annual family income)/2080 (hours worked per year) = $9.93 (dollars per hour).

  5. Number of responses per respondent is five over the course of the study (once every 6 months for 24 months, with one baseline/intake response, and 4 follow-up responses).

  6. Approximate number of caregivers with children over age 5, based on Phase V & VI combined data submitted as of 12/10. Also includes 303 children/families from the comparison sample.

  7. Approximate number of caregivers with children 3 and older, based on Phase V & VI combined data submitted as of 12/10. Also includes 303 children/families from the comparison sample.

  8. Approximate number of caregivers with either: (1) children served at the 9 early childhood-focused communities, for whom the instrument is required; or (2) children aged 0 to 12 at other communities, where the instrument is optional (we estimate that 1/3 of caregivers will be administered the instrument when it is optional). Estimates are based on Phase V and VI combined data submitted as of 12/10.

  9. Approximate number of caregivers with either: (1) children served at the 9 early childhood-focused communities, for whom the instrument is required; or (2) children aged 0 to 5 at other communities, where the instrument is optional (we estimate that 1/3 of caregivers will be administered the instrument when it is optional). Estimates are based on Phase V and VI combined data submitted as of 12/10.

  10. Based on finding from Phase V and VI combined data that approximately 59 percent of the children in the evaluation were 11 years old or older. Also includes 303 children/families from the comparison sample.

  11. Based on the 2009 Federal minimum wage rate of $7.25 per hour.

  12. With the exception of the MSSC-R, respondents only complete Service Experience Study measures at follow-up points. See Footnote #5 for the explanation of the average number of responses per respondent.

  13. Approximate number of children/families in each sector, for the Sector and Comparison Study. This includes cases within the communities, as well as within the comparison sample.

  14. Assuming that the average annual income across all types of evaluators, agency staff, and administrative staff is $55,000, the wage rate was estimated using the following formula: $55,000 (annual income)/2080 (hours worked per year) = $26.44 per hour.

  15. Assumes that each community will use flexible funds expenditures on average for approximately one quarter of the children/youth enrolled.

  16. Assumes that three expenditures, on average, will be spent on each child/youth receiving flexible fund benefits.

  17. Assuming that the average annual income across all types of programming staff is $50,000, the wage rate was estimated using the following formula: $50,000 (annual income)/2080 (hours worked per year) = $24.04 per hour.

  18. Assumes that each child/youth in system of care communities and in the comparison sample will have 100 service episodes, on average.

  19. Total Annual Burden (hours) is the product of Number of Distinct Respondents x Average Annual Number of Responses per Respondent x Average 3-Year Burden per Response (hours).


As indicated in Table 2, the average total annual burden for data collection is estimated at 40,024 hours. This estimate is derived by calculating the burden for each measure, dividing those numbers by 3 (years of data collection in the national evaluation for which approval is being sought), and summing.
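As a concrete illustration, the wage-rate footnotes and the annualization just described combine as in the following sketch, which uses only figures already shown in the first row of Table 2.

```python
# Wage rates from the Table 2 footnotes: annual income / 2,080 hours worked per year
staff_wage = 40000 / 2080      # = 19.23 dollars per hour (footnote 2)
caregiver_wage = 20650 / 2080  # = 9.93 dollars per hour (footnote 4)

# Annualized burden for one instrument: the Interview Guides A-S row of Table 2
respondents = 1081                 # key site informants (footnote 1)
responses_per_respondent = 3
hours_per_response = 1.00
total_burden_hours = respondents * responses_per_respondent * hours_per_response  # 3,243
average_annual_burden = total_burden_hours / 3                                    # 1,081

# The 40,024-hour figure in the text is the sum of these annualized values
# across every instrument row in Table 2.
print(round(staff_wage, 2), round(caregiver_wage, 2),
      total_burden_hours, average_annual_burden)
```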



13. ESTIMATES OF ANNUALIZED COST BURDEN TO RESPONDENTS


CA awardees collect the majority of the required data elements as part of their normal operations, and maintain this information for their own service planning, quality improvement, and reporting purposes. The additional cost of this data collection is minimal. The costs for operation and maintenance of materials necessary for ongoing data collection are similarly minimal.


Other costs related to this effort, such as the cost of obtaining copyrighted instruments, are costs to the Federal Government. Each CA awardee has been funded, as part of the overall cooperative agreement award, to support two staff positions (or the full-time equivalent) to assist in the evaluation. Therefore, no cost burden is imposed on the CA awardee by this information collection effort.




14. ESTIMATES OF ANNUALIZED COST TO THE GOVERNMENT


SAMHSA has planned and allocated resources for the management, processing, and use of the collected information in a manner that shall enhance its utility to agencies and the public. Including the Federal contribution to local CA awardee evaluation efforts, the contract with the National Evaluator, and government staff to oversee the evaluation, the annualized cost to the government is estimated at $9,168,221. These costs are described below.


Each CA awardee is expected to hire two full-time equivalents to recruit families into the evaluation, collect information, manage and clean data, and conduct analyses at the local level. Assuming (1) an average annual salary of $55,000; (2) that 47 CA awardees have been funded; and (3) that the average Federal contribution (not including State matching funds) will be 73 percent, the annual cost for Phase VI at the CA awardee level is estimated at $3,774,100. These monies are included in the cooperative agreement awards.
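The awardee-level estimate follows directly from the three stated assumptions, as the brief sketch below shows.

```python
# Annual Federal cost of CA awardee-level evaluation staffing (Section 14 assumptions)
average_salary = 55000     # per full-time equivalent
ftes_per_awardee = 2
awardees = 47
federal_share = 0.73       # average Federal contribution, excluding State matching funds

annual_awardee_cost = average_salary * ftes_per_awardee * awardees * federal_share
print(f"${annual_awardee_cost:,.0f}")  # $3,774,100
```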


The national evaluation contract has been awarded to ICF Macro for evaluation of the 47 CA awardees in Phase VI. The first round of Phase VI CA awardees began data collection in October 2009 and will continue data collection for 4 years, until September 2014. The second round of CA awardees is scheduled to begin data collection upon OMB approval and will continue data collection until September 2016. The national evaluation contract for Round one of Phase VI provides for 1 base year of $2,809,053 with an option to renew for 4 more years. The national evaluation contract for Round two of Phase VI provides for 1 base year of $2,444,200 with an option to renew for 5 more years. The estimated average annual cost of the contract for Round one of Phase VI will be $3,238,087. The estimated average annual cost of the contract for Round two of Phase VI will be $2,084,034. Together, the total annual cost across the two contracts is $5,322,121. Included in these costs are the expenses related to developing and monitoring the national evaluation, including, but not limited to, the following activities: developing the design, instrument package (including acquisition of copyrighted instruments), data manual, and training materials; monitoring and providing technical assistance to sites; traveling to sites and relevant meetings; conducting special studies; and analyzing and disseminating data. The cost of acquiring copyrighted instrumentation is projected to be $48,295.44 per year. This cost is included in the total contract award.


It is estimated that CMHS will allocate 75 percent of a full-time equivalent each year for government oversight of the evaluation. Assuming an annual salary of $136,000, these government costs will be $102,000 per year.



15. CHANGES IN BURDEN


Currently there are 28,156 hours in the OMB inventory. SAMHSA is requesting 40,024 hours for this submission. This revision responds to a variety of program changes that explain the change in hours: (1) the addition of 9 sites funded in FY 2010, for which burden is estimated; (2) the estimate of burden for the remaining 3 years of data collection for sites funded in FY 2008 and FY 2009; (3) a reduction in the number of instruments requiring respondents for the Sector and Comparison Study; (4) the addition of the Education Sector Caregiver Questionnaire for the Sector and Comparison Study; and (5) the removal of data collection activities for the Alumni Networking Study, the CQI Initiative Evaluation, and the Sustainability Study.



16. TIME SCHEDULE, PUBLICATION, AND ANALYSIS PLANS


a. Time Schedule


The time schedule for implementing the Phase VI evaluation is summarized in Table 3. A 3-year clearance is requested for this project.



Table 3. Time Schedule


| Activity | Date |
| Receive OMB clearance for study | XXX |
| Re-submit for OMB approval of remaining 3 years of data collection for sites funded in FY 2010 | XXX |
| Continue data collection for 38 sites funded in FY 2008 and 2009 | Ongoing |
| Begin data collection for 9 sites funded in FY 2010 | XXX |
| Data collection completed for 18 sites funded in FY 2008 | September 2014 |
| Data collection completed for 20 sites funded in FY 2009 | September 2015 |
| Data collection completed for 9 sites funded in FY 2010 | September 2016 |
| Process and analyze data | Ongoing |
| Produce annual reports | October 2011, annually thereafter (2008/2009 funded); October 2012, annually thereafter (2010 funded) |
| Produce public use database | September 2014 (2008 funded); September 2016 (2009/2010 funded) |
| Produce final report | September 2014 (2008 funded); September 2016 (2009/2010 funded) |


b. Data Analysis Plan


All of the data collection and analytic strategies detailed in this package are linked to the evaluation questions. These linkages are shown in Table 4. Analyses will be conducted to assess reliability and validity of selected measures as sufficient data to conduct these analyses are obtained in the early stages of the study. These analyses will include, but are not limited to, calculation of reliability using Cronbach’s coefficient alpha to determine internal consistency of ordinal-level and interval-level measures, calculation of the Kuder-Richardson formula 20 to determine internal consistency of dichotomous measures, and confirmatory factor analysis to determine latent variable structure and content of multi-component scales.
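To illustrate the internal-consistency analyses named here, the minimal sketch below computes Cronbach’s coefficient alpha from an item-by-respondent matrix; with dichotomous (0/1) items the same calculation reduces to the Kuder-Richardson formula 20. The data shown are invented solely for demonstration and do not come from the evaluation.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's coefficient alpha for an (n respondents x k items) array.
    With 0/1 items this is equivalent to the Kuder-Richardson formula 20."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Invented example: 6 respondents rating 4 ordinal items
ratings = [[3, 4, 3, 4],
           [2, 2, 3, 2],
           [4, 5, 4, 5],
           [1, 2, 1, 2],
           [3, 3, 4, 3],
           [5, 4, 5, 5]]
print(round(cronbach_alpha(ratings), 3))
```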


Table 4. Evaluation Questions, Indicators, Data Sources, and Analysis Techniques


Evaluation Questions

Indicators

Data Sources

Data Analysis

System of Care Assessment

Does the system maximize interagency collaboration?

  • Core agencies participate in a collaborative way

  • Integration of staff, resources, functions, and funds

  • Co-location of services of multiple agencies

  • Interagency service planning

  • Shared vision and goals

  • Formal relationships established between agencies

  • Site Visit

Univariate/

Multivariate Analysis

Are the various service components of the system coordinated?

  • Co-location of services of multiple agencies

  • Availability of case management/care coordination services

  • Case manager/care coordinator has broad responsibilities and active referral role

  • Integration and consistency in case management/care coordination across systems/agencies

  • Site Visit

Univariate/

Multivariate Analysis

Are services and the system accessible?

  • Proportion of eligible population provided services

  • Time between identification of need and entry to system

  • Waiting lists for entry to system

  • Waiting lists for delivery of key services

  • Active outreach

  • Logistics and supports that encourage access

  • Site Visit

Univariate Analysis

Is the service array comprehensive?

  • Availability of broad array of residential, intermediate, outpatient, and wraparound services

  • Site Visit

  • MIS

Univariate Analysis

Are services and the system culturally competent?

  • Cultural diversity of the child and family population

  • Cultural diversity of provider population

  • Agency commitment to cultural competency

  • Equitable treatment of all children and families

  • Adherence to national standards of cultural competence

  • Site Visit

  • CCSP–R

  • YSS, YSS–F

Univariate Analysis

Are services and the system family-driven?

  • System and services involve caregivers in developing individual child and family service plans

  • System and services involve caregivers in overall system of care planning activities

  • System and services involve caregivers in service delivery

  • System and services address needs of caregivers and families for support

  • Site Visit

  • YSS, YSS–F

  • CIQ–R

Univariate/ Multivariate Analysis

Are services individualized and youth-guided?

  • Active individualized service planning process

  • Frequency of monitoring of ISP by case manager

  • System and services involve youth in developing his or her own service plan

  • System and services involve youth in overall system of care planning activities

  • System and services involve youth in his or her own service delivery

  • System and services address needs of youth for support

  • Site Visit

  • YSS, YSS-F

  • YIQ-R

Univariate/

Multivariate Analysis

Are services community-based?

  • Availability of services within the community

  • Extent of reliance on out-of-county and out-of-State placements

  • Site Visit

  • MIS

Univariate/

Multivariate Analysis

Do systems mature over time?

  • Development of infrastructure

  • Development of service delivery capacity

  • Site Visit

Multivariate Analysis

Are services provided in the least restrictive setting that is appropriate?

  • Processes to ensure that children step down to lower levels of care when appropriate

  • Extent of use of intermediate and outpatient placements

  • Extent of use of wraparound services

  • Stability and duration of placements

  • Level of use of mental health services in normative settings (e.g., home, school)

  • Site Visit

  • MIS

  • LSQ

Univariate/

Multivariate Analysis

Cross-Sectional Descriptive Study

What are children and families like?

  • Gender

  • Race

  • Age

  • Foster care placement

  • Presenting problem(s)

  • Diagnosis at intake

  • Intake and referral source

  • Case status

  • EDIF

  • CIQ-R

Univariate/Bivariate Analysis

Child and Family Outcome Study

Are there differences between the children and families served in the systems who do and do not choose to participate in the Child and Family Outcome Study?

  • Gender

  • Race

  • Age

  • Educational level and placement

  • Socioeconomic status

  • Parents’ employment status

  • Living arrangement

  • Presenting problem(s)

  • Diagnosis at intake

  • Intake/referral source

  • Risk factors for family and child

  • Case status

  • EDIF

  • CIQ–R

Univariate/Bivariate Analysis

Has there been a reduction in children’s negative behaviors?

  • Number of problem behaviors

  • CBCL1½–5

  • CBCL 6–18

  • CIS

  • DECA

Univariate/

Multivariate Analysis

Has there been an increase in the level of the child’s overall functioning?

  • Child’s ability to accomplish activities of daily living

  • Child’s strength

  • Quality of family relationships

  • Quality of peer relationships

  • CBCL1½–5

  • CBCL 6–18

  • BERS–2C

  • BERS–2Y

  • PreBERS

  • CIS

Univariate/

Multivariate Analysis

Has there been improvement in child functioning in the educational environment?

  • School attendance

  • Expulsions, dropouts, suspensions

  • Academic performance

  • BERS–2C

  • BERS–2Y

  • EQ–R2

Univariate/

Multivariate Analysis

Has there been improvement regarding the child’s involvement with law enforcement?

  • Violations

  • Number of contacts with law enforcement

  • Number of incarcerations

  • DS-R

Univariate/

Multivariate Analysis

Do families experience improvements in family life?

  • Family functioning

  • Parenting stress

  • Caregiver strain (burden of care)

  • PSI

  • CGSQ

  • CIQ–R

Univariate/

Multivariate Analysis

Are there differences in family outcomes across systems of care?

  • Family functioning

  • Caregiver strain (burden of care)

  • Material resources

  • PSI

  • CGSQ

  • CIQ–R

Univariate/

Multivariate Analysis

How do children and families experience services?

  • Ratings of specific services

  • Ratings of the overall system

  • Provider attitudes and practices

  • YSS

  • YSS–F

  • CCSP–R

Univariate/

Multivariate Analysis

Are there differences in service experiences across systems of care? Are differences, if any, associated with differential outcomes?

  • Comparison of ratings of specific services

  • Comparison of ratings of the overall system

  • Comparison of provider attitudes and practices

  • Relationship to child outcomes

  • YSS

  • YSS–F

  • CCSP–R

  • CBCL1½–5

  • CBCL 6–18

  • CIS

Univariate/

Multivariate Analysis

Services and Costs Study

What services do children and families receive and what are their service utilization patterns?

  • Previous service history

  • Service setting and type

  • Level of restrictiveness

  • Mix of services

  • Amount and duration

  • Continuity of care

  • MIS

  • LSQ

Univariate/

Multivariate Analysis

How do service use patterns relate to child behavioral and functional outcomes?

  • Comparison of service use for children who enter the system at varying levels of challenge

  • Comparison of change in outcomes over time for children in different utilization pattern groups

  • MIS

  • MSSC–R–I

  • MSSC–R–F

  • EDIF

  • CIQ–R

  • YIQ–R

  • CBCL1½–5

  • CBCL 6–18

  • CIS

  • GAIN

  • SUS–R

  • DS–R

  • RADS–2

  • RCMAS–2

  • BERS–2C

  • BERS–2Y

  • PreBERS

  • DECA

  • PSI

  • LSQ

  • DS–R

  • EQ–R2

  • TQ

  • SAQ

  • CRQ

  • CWRF

  • ESCQ

Univariate/

Multivariate Analysis

How do service use patterns differ across subgroups within a site? Across system of care sites?

  • Comparisons of types of services used

  • Comparisons of level of restrictiveness

  • Comparisons of service mix

  • Comparison of amount and duration

  • Comparison of continuity of care

  • MIS

  • LSQ

  • MSSC–R–I

  • MSSC–R–F

  • EDIF

  • CIQ–R

  • YIQ–R

Univariate/

Multivariate Analysis

What costs are associated with services at the aggregate and child/family levels?

  • Total costs of services for individual children and families

  • Average costs per child/family

  • Average cost per service type

  • MIS

  • LSQ

  • MSSC–R–I

  • MSSC–R–F

Univariate/Bivariate Analysis

Sector and Comparison Study


Education Sector

Do educational outcomes of school-aged children in systems of care improve over time?


Do educational and clinical outcomes of school-aged children in systems of care improve more compared to non-system of care children?


Are children in systems of care more likely to receive appropriate educational supports compared to non-system of care children?


How does teacher involvement, supports and training in system of care communities differ from that of teachers in non-system of care communities (or schools who are not part of the system of care)?


What individual level services are available in schools in system of care communities?

What school level interventions are available in schools in system of care communities?

What are the types of mental health service delivery systems in schools in system of care communities?

  • Attendance

  • Performance

  • Delinquent behavior

  • Grade repetition

  • School mobility

  • Disciplinary actions

  • Receipt of special education and supports

  • Teacher’s supports and training

  • EQ–R2

  • TQ

  • SAQ

  • School records data


Univariate/

Multivariate Analysis


Juvenile Justice Sector

Do juvenile justice outcomes of juvenile justice-involved children in systems of care improve over time?


Do juvenile justice and clinical outcomes of juvenile justice-involved children in systems of care improve more compared to non-system of care juvenile justice-involved children?


Are juvenile justice-involved children in systems of care more likely to receive appropriate juvenile justice supports compared to non-system of care juvenile justice-involved children?


How does court/juvenile justice representative involvement, supports and training in system of care communities differ from that of court/juvenile justice personnel in non-system of care communities (or in juvenile justice systems that are not part of the system of care)?

  • Arrests

  • Adjudication process

  • Placements

  • Criminal activity

  • Substance use

  • Interaction with mental health providers

  • DS–R

  • SUS

  • GAIN

  • CRQ

  • Juvenile justice records data

Univariate/

Multivariate Analysis


Child Welfare Sector

Do the child welfare outcomes of children involved in child welfare and systems of care improve over time?


Do the child welfare and clinical outcomes of children involved in child welfare and systems of care improve more than the child welfare and mental health outcomes of non-system of care children involved in child welfare?


Are child welfare-involved children in systems of care more likely to receive appropriate services compared to non-system of care children in child welfare?


How does child welfare staff involvement, supports and training in system of care communities differ from that of child welfare staff in non-system of care communities (or in child welfare systems that are not part of the system of care)?


What factors influence referrals of children involved in child welfare to systems of care in their communities?


Are systems of care providing mental health assessments for children in child welfare even if those children are not ultimately determined to be in need of, or eligible for, system of care services?

  • MH services provided

  • Maintenance In home

  • Out of home placement

  • Risk factors for child

  • Trauma symptoms

  • CWS–EDIFA

  • CWRF

Univariate/

Multivariate Analysis



Analyses planned for each of the study components are described below.


System of Care Assessment. This study component includes both qualitative and quantitative analyses and both are based on a standard framework. Qualitative analyses will be used to describe the infrastructure and the direct service delivery processes of system of care communities. Qualitative data obtained through individual interviews at each system of care community and from document reviews will be synthesized into a site-specific narrative report that will be returned to each system of care community for review and correction. When the reports for each community are finalized after site comment, they will be entered into a qualitative database software program (Atlas.ti) that will allow meta-analyses across system of care communities and across time.


The quantitative analyses will be based on scores given to each system of care community that measure the extent to which it has achieved the program model’s overarching principles within the system operations described in the qualitative analysis and from quantitative interview questions. The relationship among service and system experiences, child and family characteristics, and outcomes over time will be explored using correlational, regression, and path analyses.


Child and Family Outcome Study. For this evaluation component, univariate descriptive analyses will be performed to characterize the families participating in this evaluation, including score ranges, means, and medians. These analyses will be reported for each system of care community as well as for all CA awardees combined.


Change in child and family outcomes over time will be tested using a variety of techniques. Repeated measures analysis of variance (ANOVA) will be used to test the significance of change over time within and between groups at each site. Repeated measures analysis of covariance (ANCOVA) will be conducted using the system of care development scores from the System of Care Assessment as a covariate. Hierarchical linear modeling (HLM) will be used to estimate growth curves (e.g., changes in the level of symptomatology) based on repeated observations.

The GLM repeated measures analysis will allow the National Evaluator to test whether changes over time are significant and whether some groups experience more improvement than others. Path analysis and other structural equation modeling techniques will be used to investigate the direct and indirect effects of causal variables (such as ratings of system performance and adherence to service plans) on dependent outcome measures (such as clinical assessments, restrictiveness of care, and family functioning).
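As an illustration of the path-analytic logic only (not the National Evaluator's specified model), the following Python sketch estimates a direct and an indirect effect with two ordinary least squares regressions in statsmodels; all variable and file names (soc_rating, services_received, cbcl_total, outcome_study_wide.csv) are hypothetical.

    # Illustrative sketch of a simple path model estimated as two OLS regressions:
    # system-of-care rating -> services received -> clinical outcome.
    # All variable names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    data = pd.read_csv("outcome_study_wide.csv")

    # Path a: effect of the system performance rating on services received.
    path_a = smf.ols("services_received ~ soc_rating", data=data).fit()

    # Paths b and c': effect of services (b) and of the rating (direct, c') on the outcome.
    path_bc = smf.ols("cbcl_total ~ services_received + soc_rating", data=data).fit()

    indirect_effect = path_a.params["soc_rating"] * path_bc.params["services_received"]
    direct_effect = path_bc.params["soc_rating"]
    print("indirect effect:", indirect_effect, "direct effect:", direct_effect)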


Service Experience. In this component of the Phase VI evaluation, HLM or ANOVA will be performed to examine: (1) change in the service utilization patterns of children and their families; (2) whether children in system of care communities who receive an evidence-based treatment differ in client satisfaction from those who do not; (3) whether children and families stay in services longer, on average, in communities with higher average service and system of care ratings; and (4) whether, within communities, caregivers of children who received fewer services in the previous 6 months.


Repeated measures ANOVA with treatment group as a between-subjects factor and time as a within-subjects factor will be used to examine differences in continuous outcomes over time. Generalized estimating equations will be used in the analysis of dichotomous outcomes. Multivariate regression modeling across multiple time points will allow characterization of effects in terms of persistence over time and identification of both system-level and specific services factors that maintain short- and long-term positive outcomes. In addition, the appropriateness of multilevel modeling will be explored as a potential approach for linking site-level characteristics to changes in outcomes over time.
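A minimal sketch of the generalized estimating equations approach for a dichotomous outcome is shown below, assuming a long-format file and hypothetical column names (improved, wave, treatment_group, child_id); it is offered only to illustrate the technique named above.

    # Illustrative GEE sketch for a binary (yes/no) outcome measured repeatedly,
    # with an exchangeable working correlation to account for within-child clustering.
    # File and column names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    data = pd.read_csv("service_experience_long.csv")

    model = smf.gee(
        "improved ~ wave * treatment_group",      # time-by-group interaction
        groups="child_id",                        # repeated observations per child
        data=data,
        family=sm.families.Binomial(),            # dichotomous outcome
        cov_struct=sm.cov_struct.Exchangeable(),  # working correlation structure
    )
    result = model.fit()
    print(result.summary())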


Services and Costs Study. For this component, analyses will focus primarily on utilization patterns (e.g., types, combination, amount, and costs of services used) and the factors that influence use. Analyses will be conducted at the aggregate and individual child and family levels. At the aggregate level, the distribution of service use and costs across the client population will be described. At the individual child and family level, service utilization patterns will be described (e.g., distribution of children using various combinations of services, mean and median amounts of services used).


Latent class analysis and other case-grouping techniques will be used to group children who experience similar utilization patterns, based on combinations and amount of services. The longitudinal outcomes of children in various service utilization groups will be compared to see if some utilization patterns are associated with greater gains and, if so, for which groups of children.
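As a rough illustration of the case-grouping idea, the sketch below uses a Gaussian mixture model on service-use features as a simplified stand-in for a formal latent class analysis; the file and feature names are hypothetical placeholders.

    # Illustrative sketch: grouping children with similar service-utilization profiles.
    # A Gaussian mixture model is used here as a simplified stand-in for latent
    # class analysis; file and feature names are hypothetical placeholders.
    import pandas as pd
    from sklearn.mixture import GaussianMixture

    data = pd.read_csv("service_utilization.csv")
    features = data[["outpatient_visits", "case_management_hours", "residential_days"]]

    gmm = GaussianMixture(n_components=4, random_state=0).fit(features)
    data["utilization_class"] = gmm.predict(features)

    # Class sizes, as a first look at how children cluster by utilization pattern.
    print(data["utilization_class"].value_counts())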


Trend analysis will be used to analyze change in costs over time. Multivariate techniques that adjust for the skewed distribution of cost data will be employed to predict costs while controlling for variation in baseline characteristics. We also will describe the allocation of service costs across children and different service categories, and we will model costs as a function of child and family characteristics. Because utilization and cost data are often characterized by high skewness and/or a large proportion of zero values, we propose to use specialized statistical techniques (e.g., two-part models, logarithmic transformations, zero-inflated Poisson models) in analyzing utilization and cost study data. For cost-effectiveness analysis, we will use bootstrapping methods to account for uncertainty.
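To illustrate the two-part approach to skewed cost data with many zeros, the sketch below fits a logistic model for whether any cost was incurred and an OLS model of log costs among children with positive costs. It is a sketch only; the file and variable names (service_costs.csv, total_cost, age, baseline_cbcl) are hypothetical, not the evaluation's specification.

    # Illustrative two-part model sketch for skewed cost data with many zeros.
    # Part 1: logistic regression for any service cost vs. none.
    # Part 2: OLS on log costs among children with positive costs.
    # File and variable names are hypothetical placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    data = pd.read_csv("service_costs.csv")
    data["any_cost"] = (data["total_cost"] > 0).astype(int)

    part1 = smf.logit("any_cost ~ age + baseline_cbcl", data=data).fit()

    users = data[data["total_cost"] > 0].copy()
    users["log_cost"] = np.log(users["total_cost"])
    part2 = smf.ols("log_cost ~ age + baseline_cbcl", data=users).fit()

    print(part1.summary())
    print(part2.summary())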



17. DISPLAY OF EXPIRATION DATE


All data collection instruments will display the expiration date of OMB approval.



18. EXCEPTIONS TO THE CERTIFICATION STATEMENT


This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.


