1. RESPONDENT UNIVERSE AND SAMPLING METHODS
Respondent Universe. Although most funded CMHI jurisdictions are expected to participate in the Evaluation to some degree, the data collection burden on each grantee will be kept to a minimum. All grantee sites will participate in a small number of data collection activities, but only at the jurisdiction level (i.e., the “highest” level within the grantee), and all such participants will be stakeholder staff.
The amount of data collection requested of each grantee has been limited wherever possible. Planning grantees will be asked to complete only two instruments, the Stakeholder Interviews and the SAIS, each administered once. Among the implementation grantees, the Financial Benchmarking Component will involve only a limited number of volunteer grantees.
Sampling Methods. A convenience sample will be used to select local systems from which to gather more extensive data (see Table 8) and to select clients and caregivers to participate in the SOCEA data collection.
Selecting local systems. For most data collection efforts, a convenience sample will be used to select one local system from each grantee/jurisdiction to participate in the Evaluation (see Table 8). Grantees will be asked to identify a participating local system in their jurisdiction that is a good example of their SOC activities. To ensure that they are operational, selected local systems must have served at least 20 children, youth, or young adults since the grant was funded. Each local system will be treated as a case study.
This approach is deliberate. Grantees’ best local systems are sought for several reasons. First, the selected local systems are the most likely to represent the grantees’ goals and vision for their SOC expansion grant. Second, they are more likely to be implementing better practices, offering the best opportunity to learn about the most effective approaches. Finally, this approach acknowledges that grantees may be inclined to nominate their best local systems anyway. In practice, most grantees have few local systems (many have only one), so drawing a true sample of local systems is not feasible.
Because this approach has implications for the interpretation of results, findings will be qualified in light of the evaluation design. Specifically, the interpretation of findings will assume that the selected local system represents the grantee’s best effort, not a typical or representative example. It will be made explicit that this is a case study approach, and any limits to generalizability and representativeness will be noted.
Selecting clients and caregivers to participate in the SOCEA. A convenience sample of two caregivers and two clients age 14 to 21 from the sampled local system will participate in the SOCEA. Clients and caregivers will be nominated by the local system. Selection criteria will include: (1) having sufficient experience with services to be able to appraise the processes; (2) having the ability to articulate their experiences and interactions with various systems; and (3) being willing to participate in the Evaluation.
As with the local system, SOCEA participant selection will be nonrandom, and participants with a favorable experience of the local system will be more likely to be selected and to participate. This is not methodologically problematic, because the aim of SOCEA data collection is to gain a clear picture of what is occurring within the SOC in its best, most exemplary state. This approach will be made explicit in the interpretation and reporting of findings, and any limits to generalizability and representativeness will be noted.
Table 8. Participants: All eligible vs. sample for each data collection activity by grant type, data source, and frequency of data collection

| Data Collection Activity | Grant Type* | Data Sources | Frequency |
|---|---|---|---|
| All eligible grantees (all at the jurisdiction level) | | | |
| Stakeholder Interviews | All planning and implementation grantees | High-level key stakeholders within jurisdictions; CMHI quality monitors | Once for planning grants, during the first 12 months of funding. Twice for implementation grants: during the first 12 months and the last 12-18 months of grant funding. |
| SAIS | All planning and implementation grantees | Jurisdiction-level stakeholders | Once for planning grants. For implementation grants, baseline within the grant’s first 18 months, then annually through end of grant funding. |
| Network Analysis Survey: Jurisdiction | All implementation grantees | High-level jurisdiction stakeholders | Twice: baseline within the grant’s first 18 months, with follow-up 2-3 years later |
| GIS: Jurisdiction | All implementation grantees | Work addresses of event attendees | Quarterly |
| Financial Mapping Interview | All implementation grantees | Financial administrators | Twice: baseline within the grant’s first 18 months, with follow-up 2 years later |
| Convenience sample | | | |
| Benchmark Tool | Volunteer implementation grantees | Data compiled by personnel working with state Medicaid and MH Authority reporting and payment systems | Twice: baseline for two cohorts within the grant’s first 18 months; follow-up 2 years later for the first cohort. |
| Network Analysis Survey: Local System | Implementation grantees (one local system per grant) | Local service personnel involved in direct service delivery | Twice: baseline within the grant’s first 18 months, with follow-up 2-3 years later |
| SOCEA | Implementation grantees (one local system per grant) | Local service delivery and management personnel | Twice: within the first 18 months and the last 12 months of grant funding (i.e., 2-3 years later, depending on the cohort) |
| GIS: Local system | Implementation grantees (one local system per grant) | Work addresses of event attendees | Quarterly |
| GIS: Child & family | Implementation grantees (one local system per grant) | Census block group derived from family address by site staff | Once |
| Child and family outcome instruments | Implementation grantees (one local system per grant) | Caregivers; youth and young adults; clinical records (see Table 9) | Intake, discharge, and 6 and 12 months (while receiving SOC services) |

NOTE. * The Stakeholder Interviews and SAIS include both planning and implementation grants. All other data collection activities are limited to implementation grants.
Sample Size and Power Analysis for the Child and Family Outcome Component. For the child and family outcome component, it is important that CMHS draw enough participants from each jurisdiction to ensure that the evaluation can detect the impact of the SOC initiative on child and family outcomes. If the number of participants is too small, significant differences of an important magnitude might go undetected. The effect sizes of the phenomena of interest form the basis for determining the minimum number of participants needed through a statistical power analysis.1 To obtain complete follow-up data on 74 participants per site, it will be necessary to enroll 90 families into the evaluation at each site (based on a 90% retention rate at each follow-up data collection point). If grantees serve 45 children for each full year of service delivery, 112 children will be served during the 2.5-year enrollment period (i.e., the first six months will be start-up and the last year will be follow-up data collection). An initial response rate of 85% will therefore allow the enrollment of 90 families.
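The enrollment arithmetic above can be verified with a short script. The sketch below is purely illustrative (the variable names and the two-wave retention assumption are ours, and rounding conventions make the figures land within a participant or two of those reported):

```python
# Back-of-envelope check of the enrollment targets described above.
served_per_year = 45          # assumed annual service volume per site
enrollment_years = 2.5        # enrollment window (first 6 months are start-up)
retention_per_wave = 0.90     # assumed retention at each follow-up wave
followup_waves = 2            # 6- and 12-month follow-ups

served = served_per_year * enrollment_years                # ~112 children served
overall_retention = retention_per_wave ** followup_waves   # ~0.81 overall
enroll_needed = 74 / overall_retention                     # ~91 enrollees keep 74
response_needed = enroll_needed / served                   # ~0.81 initial response

print(f"served ~{served:.0f}; overall retention {overall_retention:.2f}")
print(f"must enroll ~{enroll_needed:.0f}; initial response rate needed "
      f"{response_needed:.0%} (so the assumed 85% rate suffices)")
```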
CMHS conducted power analyses to determine the appropriate sample size. The overall goals of the Evaluation are twofold. CMHS believes that individual sites should obtain a sufficient sample to conduct meaningful analyses for their own use. CMHS also needs to obtain sufficient data to conduct cross-site analyses related to the overall evaluation questions. Therefore, CMHS ran separate power analyses for these two separate yet related domains.
For individual jurisdiction power analyses, CMHS used the G*Power application to estimate the needed sample size under the following assumptions: an average of three time points of data analyzed in a repeated-measures ANOVA; a retention rate of approximately 90% at each time point (for an overall baseline-to-12-month follow-up retention rate of 81.5%); power of .80; an effect size of .26 or higher; a repeated-measures correlation of .5 or lower; and level-1 (time) variability of 1.0. Under these assumptions, an estimated final (complete) sample of 74 at each individual jurisdiction is needed to detect between-group differences in change over time for communities’ local analytic purposes.
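For readers who wish to reproduce this kind of calculation, the sketch below computes repeated-measures ANOVA power from the noncentral F distribution. It is a simplified single-group illustration of the machinery, not a replica of the G*Power run: the exact settings (effect-size metric, number of groups, nonsphericity correction) determine the figures reported above, so its output will differ from them.

```python
# Illustrative repeated-measures ANOVA power via the noncentral F
# distribution (a simplified sketch, not the exact G*Power configuration).
from scipy.stats import f as f_dist, ncf

def rm_anova_power(n, m, f_effect, rho, alpha=0.05):
    """Power for a within-subject time effect: n subjects, m time points,
    Cohen's f effect size, correlation rho among repeated measures."""
    lam = f_effect**2 * n * m / (1.0 - rho)   # noncentrality parameter
    df1 = m - 1
    df2 = (n - 1) * (m - 1)
    f_crit = f_dist.ppf(1 - alpha, df1, df2)  # critical F under the null
    return 1 - ncf.cdf(f_crit, df1, df2, lam)

# Assumptions taken from the text: 3 time points, effect size .26, rho = .5.
print(f"power at n=74: {rm_anova_power(74, 3, 0.26, 0.5):.3f}")
```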
For cross-site analyses, CMHS used the Optimal Design application to estimate the Minimum Detectable Effect Size (MDES) under the following assumptions: 130 jurisdictions; three time points of data; a 3-level longitudinal multilevel growth model testing a quadratic trend; a retention rate of approximately 90% at each time point (an overall baseline-to-12-month follow-up retention rate of 81.5%); power of .80; residual variability of .3; level-1 (time) variability of 1.0; and 74 people in each jurisdiction. The MDES ranges from .17 (a very small effect) for an ICC of .10 to .22 for an ICC of .20 and .27 for an ICC of .30.
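The role of the ICC in these results can be illustrated with a simple design-effect calculation. The sketch below is a deliberately crude two-level, cross-sectional approximation, not the three-level growth model fitted in Optimal Design, so its MDES values will not match those above; it shows only why a larger ICC shrinks the effective sample and inflates the MDES. The multiplier 2.8 is a conventional approximation for power .80 at a two-tailed alpha of .05.

```python
# Rough illustration of how the ICC inflates the minimum detectable effect
# size via the design effect (two-level approximation; NOT the 3-level
# growth-model computation performed in Optimal Design).
import math

J, n = 130, 74        # jurisdictions and complete cases per jurisdiction
M = 2.8               # approximate multiplier for power .80, alpha .05

for icc in (0.10, 0.20, 0.30):
    deff = 1 + (n - 1) * icc        # design effect from clustering
    n_eff = J * n / deff            # effective (independent) sample size
    mdes = M / math.sqrt(n_eff)     # approximate standardized MDES
    print(f"ICC={icc:.2f}: effective N={n_eff:,.0f}, approx. MDES={mdes:.3f}")
```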
Each participating site will be expected to recruit a sufficient number of children and families to ensure enrollment of 90 children and families in each jurisdiction (or 74 after attrition). Complete data on 74 children and families in each of the 130 jurisdictions will result in a final sample of 5,825 client families with complete data by the end of 2018.2 This sample size will be large enough to ensure the ability to detect changes in outcomes over time at both the local and national levels.
2. INFORMATION COLLECTION PROCEDURES
SAMHSA has contracted with Westat to conduct the Evaluation. Westat and its subcontractors and consultants (listed in Section B.5) are referred to throughout this document as the NET. The NET will conduct all jurisdiction- and local-system-level data collection activities directly with respondents. Child and family level data will be collected by local service provider agencies. The NET will provide training and TA regarding child and family outcome instruments added to the CDP tool and will support local agencies in the collection of child and family outcome data. Through the integrated CDP and CMHI data collection mechanisms, the NET will receive de-identified client-level data from all implementation grantees. Table 8 shows each data collection activity by respondent and data collection interval.
Implementation Assessment
Stakeholder interviews. Participants will include the program director, high-level administrators of participating service sectors (e.g., MH, juvenile justice, and child welfare), jurisdiction-level family and youth representatives, and quality monitors. Participants will be selected based on information gleaned from the document review and conversations with grantee personnel. Other informants identified by participants as being knowledgeable and having a unique perspective also will be interviewed. Following OMB clearance, these data will be collected in newly-funded jurisdictions within the first 12 months of funding, for both planning and implementation grants. For implementation grants, the Stakeholder Interview will also be administered during the last 12 to 18 months of grant funding. Stakeholder Interviews will be used to describe the expansion plans of planning and implementation grants. These semi-structured interviews will be conducted with approximately eight key stakeholders at higher levels in each jurisdiction. (This interview and the SAIS are the only two data collection activities that include planning grants.)
SAIS. This self-report survey will be administered via the online CMHI portal. Respondents will rate items on a Likert-type scale. Respondents will include approximately 30 jurisdiction-level stakeholders from each jurisdiction, such as representatives from family and youth organizations, child-serving sectors, advocacy organizations for diverse populations, provider organizations, and financial officers, among others. Evaluation staff will identify potential respondents through previous evaluation efforts (e.g., document review, Stakeholder Interviews) and invite those stakeholders to participate in this component. Implementation grantees will complete this survey in the first 12 to 18 months of funding and annually thereafter through the end of their funding period or June 2018, whichever comes first. Planning grantees will complete this survey once in the last quarter of their one-year funding period. (This survey and the Stakeholder interviews are the only two data collection activities that include planning grants.)
SOCEA. The SOCEA will be used to describe and assess strategies and mechanisms that implementation grantees used to expand SOCs at the local system level. The SOCEA is a semi-structured interview that includes numerous open-ended questions. Based on standard criteria, interviewers will use item responses from individual informants to rate degree of implementation (e.g., 1=lowest and 5=highest). SOCEA interviewers will be extensively trained to conduct interviews and to rate responses reliably based on established criteria. Before conducting SOCEA interviews, interviewers will be required to meet 85% agreement in ratings with a gold standard. Inter-rater reliability will be monitored throughout the project during annual booster training sessions and in the field.
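As an illustration of the 85% certification criterion, the sketch below computes simple percent agreement between a trainee’s ratings and gold-standard ratings (the ratings shown are hypothetical, and the NET’s actual certification procedure may use additional statistics):

```python
# Illustrative inter-rater agreement check against gold-standard ratings
# (a sketch of the 85% criterion described above, with made-up ratings).
def percent_agreement(trainee, gold):
    """Share of items on which the trainee matches the gold standard."""
    matches = sum(t == g for t, g in zip(trainee, gold))
    return matches / len(gold)

gold    = [5, 4, 4, 3, 5, 2, 4, 3, 1, 4]   # hypothetical 1-5 ratings
trainee = [5, 4, 3, 3, 5, 2, 4, 3, 2, 4]

agreement = percent_agreement(trainee, gold)
print(f"agreement: {agreement:.0%}  certified: {agreement >= 0.85}")
```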
Individual respondents to be interviewed include: heads of lead local agencies, representatives from family organizations, quality monitoring/evaluation staff, care coordinators/case managers, clinicians, and direct service delivery staff from multiple agencies. In addition, in each participating local system, the NET will talk to two caregivers of children and youth age 5 to 17 and two youth (age 14-21) who are receiving services. The communities will identify respondents. All implementation grantees will select one of their local systems to participate in the Evaluation. This selected local system will be assessed using the SOCEA within the first 12 months of implementation grant funding, with a single follow-up assessment in the last 12 months of the funding period.
Network Analysis and GIS Component
Network Analysis: Jurisdiction level. The Network Analysis Survey: Jurisdiction will be used to assess SOC networks at the jurisdiction level by collecting social network and network relational information. This survey will be administered online to implementation grantees via the CMHI portal described in Section A.3. Respondents at the jurisdiction level will be high-level stakeholders such as project directors, heads of child-serving agencies, and leaders of jurisdiction-level family and youth organizations. The survey will collect data on agencies and organizations with which the respondent interacts as part of the SOC implementation and expansion effort. The list of these partner agencies and organizations will be developed based on document review, interviews with key stakeholders, and other data collection efforts. In addition, respondents will have the opportunity to identify additional agencies/organizations with which they interact. The surveys contain indicators of networking behavior such as: holding joint expansion meetings; collaborating to develop policies, make funding decisions, improve service access, and empower youth and family leadership; developing the infrastructure to support the use of evidence-based models; and sharing electronic data. For each item, respondents will report the extent to which their agency/organization engages in that activity with other agencies. In addition, the NET will ask respondents to indicate whether those relationships are formalized (i.e., whether there are written agreements, memoranda of understanding, or contracts). The initial survey will be conducted within the first 18 months of the implementation grant’s funding, with the second administration 2 to 3 years later.
Network Analysis: Local system level. Respondents for the Network Analysis: Local System will be personnel from all agencies involved with direct service delivery in the SOC for that local service system. Respondents will complete the Network Analysis Survey: Local System online via the CMHI portal described in Section A.3. The list of these partner agencies and organizations will be developed based on information gleaned through liaison and TA efforts, the SOCEA, and other data collection efforts. In addition, respondents will have the opportunity to identify additional agencies/organizations with which they interact in the delivery of mental health services to children/youth and their families. The surveys will ask respondents to rate the extent to which they interact with other agencies on several service delivery indicators such as the following: joint case consultation meetings; collaboration to determine joint processes, service funding, and SOC client eligibility criteria; implementing and monitoring evidence-based treatment models; and other collaborations to implement practices and procedures consistent with the SOC model. One local system will be recruited to participate within each expansion implementation jurisdiction. The initial survey will be conducted within the first 18 months of the implementation grant’s funding, with the second administration 2 to 3 years later.
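For illustration, survey responses of this kind can be assembled into a weighted inter-agency network and summarized with standard measures such as density and degree centrality. The sketch below uses hypothetical agencies and ratings; it is not the NET’s analysis code.

```python
# Illustrative network analysis of survey responses (a sketch with
# hypothetical agencies and 0-4 interaction ratings as edge weights).
import networkx as nx

# Hypothetical weighted edges: (agency A, agency B, interaction rating)
responses = [
    ("Mental Health", "Child Welfare", 3),
    ("Mental Health", "Juvenile Justice", 2),
    ("Child Welfare", "Family Organization", 4),
    ("Juvenile Justice", "Family Organization", 1),
]

G = nx.Graph()
G.add_weighted_edges_from(responses)

print(f"network density: {nx.density(G):.2f}")          # overall connectedness
for agency, centrality in nx.degree_centrality(G).items():
    print(f"{agency}: degree centrality {centrality:.2f}")
```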
GIS. Geographic coverage will be examined at multiple levels. Methods of data collection by Evaluation level are as follows:
Jurisdiction. At this level, the GIS component will describe the extent to which important planning, implementation, and expansion events are attended by partners across the full jurisdiction. NET staff will obtain rosters of attendees at important SOC expansion planning and implementation events, along with basic information about the event itself, such as location, meeting title, type of event, and purpose. Examples of events include governance meetings, jurisdiction-wide seminars, policy summits, and training events. These events could be face-to-face or virtual (e.g., online, video-conferencing). GIS analyses will be based on attendees’ business or office addresses. This information will be compiled on an ongoing basis as events occur throughout the year. Grantees will have the option of providing this information as events occur, or on a monthly, quarterly, or annual basis.
Local system. GIS efforts at this level will focus on events related to planning, implementation, and expansion efforts associated with direct service provision and local program development. In addition to basic information about each event, such as its type, location, and purpose, the business or office addresses of individuals who attend these events, in person or virtually, will be recorded and used in GIS analyses. These events may include training of direct service providers and supervisors, training on system of care values, multi-agency care review meetings, and management meetings. This information will be compiled on an ongoing basis as events occur throughout the year. Local systems will have the option of providing this information as events occur, or on a monthly, quarterly, or annual basis.
Child and family. The purpose of the GIS component at this level is to map the areas served by the local service system. For children/youth receiving services and their caregivers, local site staff members already obtain addresses as part of routine intake procedures. To obscure families’ identities, the NET will provide site staff with software that converts home addresses to Census block groups. Site staff will enter Census block group data into the CMHI portal (but not addresses) at the same time they enter other administrative information into the CDP system. This information will be collected at baseline only. Client/family addresses will not be part of the evaluation dataset or transmitted by the grantees to SAMHSA or its contractors.
Implementation grantees will participate in this Evaluation component in years 2 and 4 of their funding cycles.
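As an illustration of the address-to-block-group conversion described above, the sketch below queries the U.S. Census Bureau’s public geocoding API. The software the NET provides to sites may work quite differently (e.g., offline, so that addresses never leave the site); the endpoint parameters and response layout here follow the Bureau’s published API.

```python
# Illustrative address-to-Census-block-group conversion using the U.S.
# Census Bureau's public geocoding API (a sketch only; not the NET's tool).
import requests

def census_block_group(address: str) -> str | None:
    """Return the 12-digit Census block group GEOID for an address."""
    resp = requests.get(
        "https://geocoding.geo.census.gov/geocoder/geographies/onelineaddress",
        params={
            "address": address,
            "benchmark": "Public_AR_Current",
            "vintage": "Current_Current",
            "format": "json",
        },
        timeout=30,
    )
    matches = resp.json()["result"]["addressMatches"]
    if not matches:
        return None
    block_geoid = matches[0]["geographies"]["Census Blocks"][0]["GEOID"]
    return block_geoid[:12]   # block group = first 12 digits of block GEOID

print(census_block_group("1600 Research Boulevard, Rockville, MD 20850"))
```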
Financial Mapping and Benchmark Component
Financial Mapping. The NET will make information requests and conduct semi-structured interviews with key implementation grantee administrators and staff. Specifically, data will be collected on children’s MH funding sources for all states, counties, and tribes with CMHI grantees during the first 18 months of the grant and during the next-to-last or last 12 months of the implementation grant funding period. The NET will review publicly available information to develop a preliminary list of children’s mental health services in the state, county, or tribe. This list will be sent to interview respondents at least a week before the interview with a request to make any needed corrections. The corrected list will then be incorporated into the interview schedules. The interview schedules will be shared via WebEx with state and county agencies, and with tribal representatives who can describe Medicaid-funded, MH Authority-funded, and Indian Health Service-funded services. In addition, the NET will speak to representatives from family organizations about their funding sources, and to provider associations to learn what services are covered by commercial insurance plans. Key information from interviews with mental health agencies, Medicaid agencies, and tribal authorities will be summarized in a matrix and sent back to respondents for validation.
Benchmark Component. Each implementation grantee volunteering to participate in the Benchmark Component will receive preparation support and begin cost data collection upon OMB approval. Data will be collected during the first 18 months for two cohorts and, for the first cohort, again during the third year and the beginning of the final year of the implementation grant funding period. Volunteer state or county MH and Medicaid agencies will collect and report a core set of data that will be used to calculate access, utilization, and costs for child MH services in the jurisdiction. The NET will provide states and counties with contact information to reach Evaluation staff if they have any questions about the data request. Evaluation staff have considerable experience in collecting these types of data and can effectively clarify any confusion or help to address limitations or problems that states may encounter when generating the requested information.
Child and Family Outcome Component
One local system within each grantee will be selected to collect child and family outcome data from all clients age 5 to 21, and their families, who meet the eligibility criteria. To be eligible, clients must: (1) receive services through a selected local service system within a funded jurisdiction; (2) meet the local system’s service program eligibility criteria for SOC services; (3) be between age 5 and 21 years; (4) have a MH diagnosis; (5) not have a sibling already participating in the Evaluation; (6) have a participating caregiver if the client is age 5 to 17 years old; and (7) provide informed consent/assent, as appropriate based on client age. Data collection for this Evaluation component will begin soon after OMB approval.
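For illustration only, the seven criteria can be encoded as a simple screening predicate, as in the sketch below (the field names are hypothetical and are not part of the CDP schema):

```python
# Illustrative eligibility screen encoding the seven criteria above
# (a sketch; field names are hypothetical, not the CDP schema).
def is_eligible(client: dict) -> bool:
    age_ok = 5 <= client["age"] <= 21
    # Clients age 5-17 must have a participating caregiver.
    caregiver_ok = client["has_participating_caregiver"] or client["age"] >= 18
    return (
        client["in_selected_local_system"]
        and client["meets_soc_program_criteria"]
        and age_ok
        and client["has_mh_diagnosis"]
        and not client["sibling_enrolled"]
        and caregiver_ok
        and client["consent_assent_obtained"]
    )

print(is_eligible({
    "age": 14, "in_selected_local_system": True,
    "meets_soc_program_criteria": True, "has_mh_diagnosis": True,
    "sibling_enrolled": False, "has_participating_caregiver": True,
    "consent_assent_obtained": True,
}))
```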
Child and family data will be collected at intake and at 6 and 12 months after service entry (as long as the child/youth is still receiving services). Data will also be collected at discharge if the child/youth leaves services before the 6- or 12-month data collection point. Evaluation staff will collect these follow-up data from caregivers of minor children and adolescents (age 5 to 17) and from youth and young adults age 11 to 21. Table 9 shows the child and family data elements added by the Evaluation and the target respondents. Data will be collected by local service system personnel through a combination of interviews with caregivers and youth, and abstraction of routinely collected clinical information from client records.
The NET will take an active role in providing support and TA to grantees for collecting the Evaluation-related child and family outcome data. This will be done by providing: (1) a detailed data collection procedures manual including relevant reading materials; (2) an initial training on these items; (3) one-on-one contact by evaluation liaisons; and (4) additional guidance and information, as questions arise. The NET will offer TA on recruiting families into the Child and Family Outcome Component. In addition, NET staff will monitor data quality and provide training and TA to SOC sites for Evaluation-specific additions to the CDP tool to ensure high-quality data collection. Using the CDP tool has the benefit of standardizing the collection of these data across sites and minimizing burden to participants and sites added by the Evaluation. CDP staff will be responsible for monitoring CDP data collection procedures in the SOC sites to ensure the greatest possible uniformity in data collection across sites.
Table 9. Data elements added to the CDP tool for the Child and Family Outcome component by respondent

| Data Element Added to CDP Tool Section H | Respondent |
|---|---|
| Administrative data | Clinical record review, abstracted by site staff for each client |
| Family/Living Information | Clients age 18-21 |
| Family/Living Information | Caregivers of clients age 5-17 |
| Caregiver Strain Questionnaire - Short Form | Caregivers of children and youth age 5-17 |
| Columbia Impairment Scale | Caregivers of children and youth age 5-17; youth and young adults age 11-21 |
| Pediatric Symptom Checklist-17 | Caregivers of children and youth age 5-17; youth and young adults age 11-21 |
*As part of the CMHS Client-Level Services Measures for Discretionary Programs (CMHS PROGRAM ONLY) data collection requirement, the CDP tool is administered to all clients age 11 to 21, which provides consistent instrumentation across clients. The Evaluation will also collect outcomes applicable to clients age 18-21 using the CDP Client-Level Services Measures for Discretionary Programs (CMHS PROGRAM ONLY) instrument. These additional questions are accounted for in our burden tables.
3. METHODS TO MAXIMIZE RESPONSE RATES
Several steps will be taken to maximize response rates and reduce non-response bias for all data collection efforts. The NET will lead and/or be available to support each data collection process. The NET will provide ongoing technical assistance and remain available to grantees and other respondents to respond to questions and provide clarification or guidance whenever needed.
For most data collection activities, the NET will collect data from participants involved in the planning, implementation, and expansion of SOCs at the jurisdiction and local system levels (i.e., stakeholders). Efforts to maximize response rates are presented here by type of data collection method, as these apply across evaluation components.
Requesting documents. Document requests will be combined across other Evaluation components to minimize the number of requests and to avoid duplicate requests. For example, for the GIS Component, the NET will send each grantee a Group Collaborative Events for GIS Analysis Form on a quarterly basis to be sure these data are collected regularly.
Identifying respondents among stakeholders. The NET will work with the grantee’s evaluation contact in each jurisdiction to identify the appropriate people to interview. All respondents will be partners in the planning, implementation, and expansion of systems of care and will participate in the evaluation as part of the performance of their roles.
Scheduling interviews. The NET will be flexible in scheduling interviews, provide a copy of the interview schedule ahead of time, and respect the specified time limits. To make the best use of informants’ time, the NET will review available documents and perform web searches to collect publicly available information prior to the interview. To keep logistics and costs manageable, interviews will be conducted with individual informants by telephone, Skype, or video-conferencing.
Site liaison model. Individual NET staff will serve as a site liaison to each participating grantee to facilitate communication in ways that the NET anticipates will enhance response rates, data quality, and grantee motivation. In addition, the site liaison model will enable the NET to understand the grantees more comprehensively, which will be of value when interpreting findings.
In addition, the NET will provide an Internet-based listserv for facilitating communication about training and TA regarding evaluation implementation and utilization. The listserv allows site evaluators to communicate with the NET and each other through group e-mail. Any e-mail message sent to the listserv will be automatically distributed to all site evaluators. The listserv is run at no cost to site evaluators.
The NET anticipates that grantees and other stakeholders will be particularly motivated to participate in several data collection efforts of the Evaluation. Examples relevant to specific Evaluation components are as follows:
Financial Mapping. The NET anticipates that most informants will be interested in finding ways to financially sustain their SOC and will be motivated to participate in the Financial Mapping component. The NET will follow up with the people interviewed to share the draft financial map and confirm the NET’s understanding of the state’s use of funds, which the NET anticipates will further enhance motivation to participate.
Benchmark Component. In the past, the NET has successfully collected similar data from more than 30 state and county MH authorities and/or Medicaid agencies, which also participated on a voluntary basis. Grantees that elect to participate will be able to benchmark their state’s use of children’s MH resources against other participating states. The NET believes that states with well-developed information systems that can readily compile the needed data will be interested in the rare opportunity to compare their use of inpatient and residential care with that of other states. States in the cohort that will be benchmarked twice will also have the opportunity to document how expansion of their SOC may have changed their service use patterns and expenditure rates. This information may be valuable in demonstrating the business case for SOC to legislators and other stakeholders.
Child and Family Outcome Component. As part of their grant requirements, local sites collect CDP data on all clients and their families who receive SOC services. Because CDP is the primary mechanism by which grantees document to SAMHSA the number of clients they serve, this is another Evaluation component in which grantees will be inherently motivated to participate. In addition, SAMHSA expects an 80% follow-up rate for clients’ CDP data. Data collection for the Child and Family Outcome Component will be limited to CDP data, so participation rates should be identical to those of CDP.
4. TESTS OF PROCEDURES
The selection of data collection activities was based on (1) a review of the instruments used during the earlier National CMHI Evaluation (OMB Nos. 0930-0192, 0930-0209, 0930-0257, 0930-0280), conducted in consultation with individuals involved in both evaluations; (2) an assessment of measurement quality as reported in the literature; and (3) input from expert reviewers, consumers, and family members. These consultants are listed in Section B5. Testing of each data collection activity proposed by this request is described here.
To obtain estimated completion times and to improve the clarity and flow of the questions, the NET pilot tested the Stakeholder Interview, the SOCEA, the SAIS, the Network Analysis Survey, and the Financial Mapping Interview with grantees. Pilot testing of data collection activities was approved by the Westat IRB. To recruit participants, the NET sent an email to 2012 and 2013 grantees asking for volunteers, and a total of 8 grantee respondents participated in the pilot testing. The pilot testing sessions were held with individual grantee participants. During each session, a NET member administered one tool to the respondent to obtain an estimated completion time, and the grantee then participated in a debriefing session, providing feedback on the clarity and flow of the questions. Participants indicated that the tools were comprehensive and that, in general, the questions were understandable and relevant to grantee SOC efforts. Feedback from the testing was used to clarify individual questions, including rewording items and adding definitions of terms, and additional information was added to the instructions and introductory sections of the tools for clarity. Grantee participants also provided feedback on the presentation and display of the data collection tools (particularly those displayed online) to make administration more user-friendly. For example, grantees indicated that it was helpful to display the Stakeholder Interview questions on the computer through WebEx so that they could read the questions as the interviewer asked them. Additional details about the pilot testing for individual tools are provided in the sections below.
Implementation Assessment
Stakeholder Interviews. This interview was developed by a multidisciplinary team that included experts in measurement and system of care development, as well as family and youth representatives. The framework for the Stakeholder Interviews was based on the SOC implementation literature and previous examinations of SOC sustainability (e.g., Stroul & Manteuffel, 2008) and expansion efforts (e.g., Stroul & Friedman, 2011). Testing of the interview included expert review of the framework and protocols by additional family and youth representatives (who were not on the development teams) and experts on systems of care. The Stakeholder interview was pilot-tested with four volunteer grantee participants (including a Project Director, two Family Representatives, and one Youth Representative). After recording the time it took to administer the interview, the grantee participants provided feedback on the clarity of questions and the experience of completing the interview. Grantee participants indicated that the questions were comprehensive and understandable. Feedback on the Stakeholder Interviews was used to shorten or reword specific questions, identify terms that needed further definition, and identify terms that needed to be customized for individual grantee sites.
SAIS. This web-based survey was developed by the NET to enable grantees to document and assess their own progress toward expanding the SOC in their jurisdiction. Development of this survey followed the framework used for the Stakeholder Interviews. The development team included experts in measurement and system of care development, as well as family and youth representatives. The survey has undergone expert review by family and youth representatives, experts in systems of care, and testing specialists. The SAIS was pilot-tested with one volunteer grantee participant using an online survey platform. The participant indicated that the questions were good and generally straightforward. Feedback from the pilot testing was used to identify items that required further clarification and ways the survey could be presented to improve the online experience for users.
SOCEA. The SOCEA was adapted from the System of Care Assessment (SOCA) (see Brannan, Brashears, et al., 2012) used in the prior evaluation (i.e., the National Evaluation of the CMHI). Adaptations were made by the same multidisciplinary team that developed the Stakeholder Interviews, and the instrument was reviewed by the same panel of experts that provided feedback on the other data collection tools. The SOCEA was pilot-tested with one volunteer grantee participant, followed by a debriefing on the clarity of questions and the experience of completing the interview. Feedback from the pilot testing indicated that there was no redundancy across questions.
Network Analysis and GIS Component
Network Analysis Survey: Jurisdiction and Local System. Indicators of networking behavior within the SOC were drawn from a literature review of network analyses in mental health and other social services sectors and were tailored to SOC expansion efforts. This comprehensive list of collaborative activities was finalized in consultation with experts in the field. The jurisdiction-level networking survey was pilot tested with one volunteer grantee participant through an online survey platform. Feedback from the session indicated that the survey was comprehensive in capturing collaborative activities between agencies and organizations.
GIS Component. This evaluation component involves the collection of a work/business address and Census block group. Identification of the appropriate location indicators was made in consultation with the NET’s GIS expert. GIS analysis is an established methodology for which addresses and Census block group data are commonly used indicators.
Financial Mapping and Benchmark Component
Financial Mapping Interview. This interview was developed by a team with extensive experience designing such tools and collecting similar data. The interview is informed by prior work with a wide variety of state and county MH systems on their use of key financing sources. The interview protocol was reviewed by the complete evaluation team, and a cognitive interview was conducted to assess the clarity of questions and the process by which grantees will provide these data. The Financial Mapping Tool was pilot tested with one volunteer grantee participant. Feedback was used to clarify the presentation of material during the interview.
Benchmark Tool. This is a well-annotated data collection tool which has previously been successfully used by states and counties. It has been updated to reflect current terminology, with input from a researcher who has recently analyzed a national database of Medicaid-funded children’s MH services. SAMHSA did not test the benchmarking tool, but it was developed based on prior tools that were used with over 30 states and counties.
Child and Family Outcome Component
The following instruments selected for the Child and Family Component of the Evaluation have been validated and tested with multiple populations. The Evaluation also plans to use outcomes collected by the CDP including the Family/Living Situation items; these measures were not tested for this Evaluation but have been widely used for this purpose. The CDP data collection is managed under a different contract and OMB approval (see Section A.3).
Columbia Impairment Scale (CIS). This 13-item scale assesses four dimensions of social functioning: interpersonal relations, job/school, use of leisure time, and specific psychological areas. There are two parallel forms; one is completed by caregivers, the other is a youth self-report. The CIS has demonstrated high internal consistency, good test-retest reliability, and strong convergent validity (Bird et al., 1993).
Pediatric Symptom Checklist-17 (PSC-17). The PSC-17 is a shorter version of the widely-used and well-validated PSC-35 (citation). According to the developer’s website3, the PSC-35 has a sensitivity of 0.95 and a specificity of 0.68 relative to clinician ratings. Test-retest reliability of the PSC-35 is high (r = .84 - .91) as is its internal consistency (α=.91). The PSC-17 is organized around the same three areas (internalizing, externalizing, and attention problems) assessed by the PSC-35. Studies indicate that the 17-item version yields scores that are generally congruent with the longer version (citation).
Caregiver Strain Questionnaire (CGSQ). The CGSQ is a 21-item questionnaire that focuses on the strain experienced by caregivers associated with caring for a child with emotional and behavior challenges. The CGSQ has demonstrated reliability and validity, with excellent internal consistency (α = 0.92). It has also demonstrated convergent validity with other caregiver distress and family functioning instruments (Brannan, Heflinger, & Bickman, 1997). The CGSQ was shortened to 13 items for this evaluation by its developer (CGSQ-13; Brannan & Pullmann, in development). The CGSQ-13 was developed using Rasch-based Item Response Theory, paired with traditional psychometric analyses. These analyses were performed on an existing dataset from previously funded SOC sites. The CGSQ-13 demonstrated reliability and construct validity comparable to the original 21-item version.
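As an illustration of the internal-consistency statistics cited in this subsection, the sketch below computes Cronbach’s alpha for a simulated 13-item scale (the data are synthetic; the published coefficients above come from the developers’ validation studies):

```python
# Illustrative internal-consistency (Cronbach's alpha) check of the kind
# reported for the CGSQ and PSC above (synthetic data, not the published
# psychometric analyses).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                          # shared trait
responses = latent + rng.normal(scale=0.8, size=(200, 13))  # 13 noisy items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```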
5. STATISTICAL CONSULTANTS
The NET has full responsibility for the development of the overall statistical design, and assumes oversight responsibility for data collection and analysis for this Evaluation. Training, TA, and monitoring of data collection will be provided by the NET. The individual responsible for overseeing the entire evaluation, including all aspects of the design, data collection and analysis, and who had some involvement in the prior CMHI Evaluation, is the Principal Investigator:
Ana Maria Brannan, Ph.D.
Associate Professor
Indiana University School of Education
Indiana University
201 N. Rose Avenue
W.W. Wright Education Building
Bloomington, Indiana 47405-1006
Office: (812) 856-8119
The following additional individuals will serve as statistical consultants to this project:
Alison Cuellar, Ph.D. (Health Economist, Financial Components, Consultant)
Associate Professor
Department of Health Administration and Policy
George Mason University
4400 University Drive, MS: 1J3
Fairfax, VA 22030-4444
Office: (703) 993-5048
Michael Giangrande, M.G.I.S. (GIS Specialist, GIS Component)
Senior Study Director
Westat
RW 3546
1600 Research Boulevard
Rockville, MD 20850
Office: (301) 610-5107
Michael Pullmann, Ph.D. (Mixed Methods Specialist, Child and Family Outcome Data)
Research Assistant Professor
Public Behavioral Health and Justice Policy
Department of Psychiatry and Behavioral Science
University of Washington
2815 Eastlake Ave. East,
Suite 200
Seattle, WA 98195-8015
Office: (206) 685-0408
Michael Steketee, Ph.D. (Content Expert, Network Analysis)
Senior Study Director
Westat
1600 Research Boulevard
Rockville, MD 20850
240-453-2603
Data Collection and Analysis of Information:
Daksha Arora, Ph.D. (Health Information Systems, Project Manager)
Senior Study Director
Westat
RB 1122
1600 Research Boulevard
Rockville, MD 20850
(240) 314-2481
Lacy Kendrick Burk (Content Expert, Youth)
Executive Director
Youth MOVE National
6641 Hwy 98 W
Suite 202
Hattiesburg, MS 39402
Office: (800) 580-6199 ext. 101
Eric Bruns, Ph.D. (Content Expert, Wraparound Services, Consultant)
Associate Professor
University of Washington, Dept. of Psychiatry & Behavioral Sciences
Division of Public Behavioral Health & Justice Policy
2815 Eastlake Avenue East, Suite 200
Box 358015
Seattle, WA 98102
Office: (206) 685-2477
Allen Daniels, Ed.D. (Content Expert, Health Care Systems, Consultant)
Senior Health Care Systems Specialist
Westat
1600 Research Boulevard
RB 4118
Rockville, MD 20850-3129
Office: (513) 319-5614
Michael Dennis, Ph.D. (Content Expert, Recovery Research, Consultant)
Senior Research Psychologist
Chestnut Health Systems
Lighthouse Institute
448 Wylie Drive,
Normal, IL 61761-5405
Office: (309) 451-7801
Richard Dougherty, Ph.D. (Content Expert, Financial Components)
CEO
DMA Health Strategies
9 Meriam Street, Suite 4
Lexington, MA 02420-5312
Office: (781) 863-8003
Lynda Gargan, Ph.D. (Content Expert, Family)
Senior Managing Director
National Federation of Families for Children's MH
9605 Medical Center Dr.
Suite 280
Rockville, Maryland
Office: (240) 403-1490
Preethy George, Ph.D. (Content Expert, Children’s Behavioral Health Prevention Specialist)
Senior Study Director
Westat
1600 Research Boulevard
RB 4114
Rockville, MD 20850-3129
(301) 738-3553
Craig Anne Heflinger, Ph.D. (Content Expert, SOC, prior CMHI Evaluation, Consultant)
Professor & Associate Dean
Department of Human and Organizational Development
Peabody College of Education and Human Development
Vanderbilt University
Mayborn Bldg., Room 206
130 Magnolia Circle
Nashville, TN 37203-5721
Office: (615) 322-8275
Wendy Holt, Ph.D. (Content Expert, Financial Components)
Principal
DMA Health Strategies
9 Meriam Street, Suite 4
Lexington, MA 02420-5312
Office: (781) 863-8003
Keri Jowers, Ph.D. (Content Expert, Report to Congress, prior CMHI Evaluation)
Walter R. McDonald & Associates, Inc.
12300 Twinbrook Parkway, Suite 310
Rockville, MD 20852-1698
Office: (301) 881-2590
Wendy Kissin, Ph.D. (Content Expert, Behavioral Health Specialist)
Senior Study Director
Westat
1600 Research Boulevard
RB 3143
Rockville, MD 20850-3129
(301) 294-3885
Craig Love, Ph.D. (Content Expert, Native American/Native Alaskans, Consultant)
Senior Study Director
Westat
1600 Research Boulevard
RB 3148
Rockville, MD 20850-3129
(240) 314-2443
Nanmathi Manian, Ph.D.
Senior Study Director
Westat
1600 Research Boulevard
RB 3143
Rockville, MD 20850-3129
301-294-2863
Brianne Masselli (Content Expert, Youth)
Director of Technical Assistance and Evaluation
Youth MOVE National
Office: (202) 808-3992, ext. 104
Allison Metz, Ph.D. (Content Expert, Program Implementation, Child Welfare, Consultant)
Associate Director
National Implementation Research Network
The University of North Carolina at Chapel Hill
Sheryl-Mar South, Room 142
Campus Box 8185
Chapel Hill, NC 27599-8185
Office: (919) 218-7540
Kurt Moore, Ph.D. (Report to Congress, Child and Family Outcomes, prior CMHI Evaluation)
Walter R. McDonald & Associates, Inc.
1626 Washington Street
Denver, CO 80203
Office: (916) 239-4020 ext. 409
Garrett Moran, Ph.D. (Project Director, Behavioral Health Systems Expert)
Vice President
Westat
1600 Research Boulevard
RB 4118
Rockville, MD 20850-3129
Office: (301) 294-3821
Mary Anne Myers, Ph.D. (Qualitative Assessment Expert)
Associate Director
Westat
1600 Research Boulevard
RB 4105
Rockville, MD 20850-3129
Office: (240) 453-2673
Marie Niarhos (Content Expert, Family)
Family Involvement Content Specialist
National Federation of Families for Children's MH
9605 Medical Center Dr.
Suite 280
Rockville, Maryland
Office: (240) 403-1901
Debra Rog, Ph.D. (Content Expert, Evaluation Design, Consultant)
Associate Director
Westat
1600 Research Boulevard
RW 3526
Rockville, MD 20850-3129
Office: (301) 279-4594
Martha Stapleton (Survey and Questionnaire Design and Testing Expert, Consultant)
Senior Study Director
Westat
1600 Research Boulevard
RB 4161
Rockville, MD 20850-3129
Office: (301) 251-4382
Beth A. Stroul, M.Ed. (Content Expert, SOC, prior CMHI Evaluation, Consultant)
Management & Training Innovations, Inc.
7417 Seneca Ridge Drive
McLean, VA 22102
(703) 448-7570
Sandra Spencer (Content Expert, Family, Consultant)
Executive Director
National Federation of Families for Children's MH
9605 Medical Center Dr.
Suite 280
Rockville, Maryland
Office: (240) 403-1901
Jessica Taylor, Ph.D. (TRAC Data Collection and Management Expert)
Westat
1600 Research Boulevard
RB 4144
Rockville, MD 20850-3129
Office: (240) 314-5852
The SAMHSA staff person responsible for receiving and approving contract deliverables is:
Kaitlyn Harrington, M.P.A., M.A.
Public Health Advisor
Center for Mental Health Services
Substance Abuse and Mental Health Services Administration
1 Choke Cherry Road, Room 6–1046
Rockville, MD 20857
Office: 240-276-1928
References
Barksdale, C. L., Ottley, P.G., Stephens, R., Gebreselassie, T., Fua, I., Azur, M., et al. (2012). System-level change in cultural and linguistic competence (CLC): How changes in CLC are related to service experience outcomes in system of care. American Journal of Community Psychology, 49(3-4), 483-493.
Bickman, L. & Heflinger, C. A. (1995). Seeking success by reducing implementation and evaluation failures. In L. Bickman & D. J. Rog (Eds.), Children's mental health services: Research, policy and innovation (pp.171-205). Newbury Park, CA: Sage.
Bird, H. R., Shaffer, D., Fisher, P., Gould, M. S., Staghezza, B., Chen, J. Y., et al. (1993). The Columbia Impairment Scale (CIS): Pilot findings on a measure of global impairment for children and adolescents. International Journal of Methods in Psychiatric Research, 3, 167–176.
Brannan, A. M. (2003). Ensuring effective mental health treatment in real-world settings and the critical role of families. Journal of Child and Family Studies, 12(1), 1-10.
Brannan, A. M., Brashears, F., Gyamfi, P., & Manteuffel, B. (2012). Implementation and development of federally-funded systems of care over time. American Journal of Community Psychology, 49, 467–482.
Brannan, A.M., & Hazra, M. (2012). Final report of the evaluation of the commUNITY cares system of care initiative. Pine Belt Mental Health Resources, Hattiesburg, Mississippi.
Brannan, A. M., Heflinger, C. A., & Bickman, L. (1997). The Caregiver Strain Questionnaire: The impact of living with a child with serious emotional disturbance. Journal of Emotional and Behavioral Disorders, 5(4), 212-222.
Fixsen, D., Blase, K., Metz, A., & van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children, 79(2), 213-230.
Manteuffel, B., Stephens, R. L., Sondheimer, D. L., & Fisher, S. K. (2008). Characteristics, service experiences, and outcomes of transition-age youth in systems of care: Programmatic and policy implications. Journal of Behavioral Health Services & Research, 35(4), 469-487.
Merikangas, K. R., He, J. P., Brody, D., Fisher, P. W., Bourdon, K., & Koretz, D. S. (2010). Prevalence and treatment of mental disorders among US children in the 2001-2004 NHANES. Pediatrics, 125(1), 75-81.
Spybrook, J., & Raudenbush, S. W. (2009). An examination of the precision and technical accuracy of the first wave of group-randomized trials funded by the Institute of Education Sciences. Educational Evaluation and Policy Analysis, 31(3), 298-318.
Stroul, B., Blau, G., & Friedman, R. (2010). Updating the system of care concept and philosophy. Washington, DC: Georgetown University Center for Child and Human Development, National Technical Assistance Center for Children’s Mental Health.
Stroul, B. A., & Friedman, R. M. (2011). Effective strategies for expanding the system of care approach. A report on the study of strategies for expanding systems of care. Atlanta, GA: ICF Macro.
Stroul, B. A., & Manteuffel, B. (2008). Sustaining systems of care. Baltimore, MD: Paul H. Brookes Publishing.
List of Attachments
Evaluation Logic Model
Semi-Structured Stakeholder Interviews
SAIS
Network Analysis Survey
Group Collaborative Events for GIS Analysis Form
Financial Mapping Interview Protocol
Financial Benchmarking Tool
Systems of Care Expansion Assessment (SOCEA)
Child and Family Level Data Tool
1 NOTE. Briefly, the power of a statistical test is generally defined as the probability of rejecting a false null hypothesis. In other words, power indicates the probability that a statistical test will detect an effect of a given magnitude that, in fact, exists in the population. The power analysis does not indicate that a design will actually produce an effect of a given magnitude. The magnitude of an effect, as represented by the population parameter, exists independent of the evaluation and depends on the relationship between the independent and dependent variables in question. The probability of detecting an effect from the data, on the other hand, depends on several major factors in multilevel or repeated-measures frameworks, including: (1) the level of significance used; (2) the size of the treatment effect in the population; (3) the sample size; (4) the intraclass correlation(s), that is, the amount of individual variance accounted for by membership within a group (or nesting), or, similarly, the correlation among repeated measures; and (5) the amount of measurement error.
2 This date assumes that evaluation funding is continued to allow complete follow-up for all participating sites. Alternative power analyses, using only completion estimates for data to be collected within the funded evaluation contract window (not reported here to conserve space), also indicate a sufficient sample size to detect small effects. Similarly, the inclusion of 130 jurisdictions assumes continued OMB approval following the initial 3-year period covered by the present request.
3 http://www2.massgeneral.org/allpsych/pediatricsymptomchecklist