Phase VI_Final_SS_9-13-2012-B


National Evaluation of the Comprehensive Mental Health Services for Children and Their Families Program: Phase VI

OMB: 0930-0307




Phase VI of the National Evaluation of the Comprehensive Community Mental Health Services for Children and Their Families Program


Supporting Statement



B. STATISTICAL METHODS


1. RESPONDENT UNIVERSE AND SAMPLING METHODS


System of Care Assessment. Respondents for the System of Care Assessment will be selected based on their affiliation with the system of care community and must serve in specific roles. To identify respondents, the National Evaluator will send a site informant list to each community 8 weeks prior to its site visit. The system of care community will select potential respondents who meet the requirements outlined in the list. System of care communities will e-mail the completed list to the National Evaluator at least 4 weeks prior to the scheduled visit so that the list of projected interviewees can be reviewed to ensure that each category of respondent is adequately represented. The respondent categories include representatives of core child-serving agencies, project directors, family representatives and representatives of family advocacy organizations, social marketers, cultural and linguistic competence coordinators, program evaluators, intake workers, youth coordinators, care coordinators and case managers, direct service providers, care review participants, caregivers, and youth. Each system of care community will have approximately 27 respondents per site visit. Based on previous experience, we expect a response rate of approximately 84 percent for this study component.


The universe for the Phase VI Cross-Sectional Descriptive Study, the Child and Family Outcome Study, and the Service Experience Study consists of the children served by the CMHS program in the 47 CA awardee sites.


Cross-Sectional Descriptive Study. For this evaluation component, data will be collected on children and families at intake into services. Descriptive data will be collected on the census of all children and their families who are being served by the CMHS program. To be included in this study component, children will need to: (1) meet the community’s service program eligibility criteria; and (2) receive services in that community. Because these data are routinely collected at the sites for internal purposes, descriptive data on all the children and families who receive services will be available.


Child and Family Outcome Study. A sample of families will be selected for participation in this component. The Child and Family Outcome Study sample will be selected from the pool of children and their families entering the Phase VI-funded systems of care. Although each site is funded for 6 years, the first year is committed to initial system development with data collection occurring in the last 5 years of their funding. Hence, recruitment of family participants will occur in years 2, 3, and 4 of program funding (or years 1, 2, and 3 of the evaluation) but could continue in later years if enrollment goals are not met.


As systems of care will develop differentially over the length of the project, it is important to consider the growth of the system of care. If the entire sample is recruited in the first year, the opportunity will be lost to assess whether changes in the client population occurred as the system matured (e.g., increasingly serving children with more severe problems or children referred through the juvenile justice system). For that reason, recruitment will be spread across 3 years and the number of children and families recruited each year will be standard across sites.


It is important to draw a large enough sample in each CA awardee site to ensure that the evaluation can detect the impact of the system of care initiative on child and family outcomes. If the samples are too small, significant differences of an important magnitude might go undetected. A statistical power analysis, based on the effect sizes of the phenomena of interest, determines the minimum sample size needed. Briefly, the power of a statistical test is the probability of rejecting a false null hypothesis; that is, power indicates the probability that a statistical test will detect an effect of a given magnitude that really exists in the population. The power analysis does not indicate that a design will actually produce an effect of a given magnitude. The magnitude of an effect, as represented by the population parameter, exists independent of the study and depends on the relationship between the independent and dependent variables in question. The probability of detecting an effect from sample data, on the other hand, depends on three factors: (1) the level of significance used; (2) the size of the treatment effect in the population; and (3) the sample size.
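The interplay of these three factors can be illustrated with a simplified sketch. The function below computes approximate power for a two-sided two-sample z-test using a normal approximation; it illustrates the general principle only and is not the repeated-measures power calculation used for this evaluation (function names are illustrative):

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_sample_power(effect_size, n_per_group, z_crit=1.959964):
    """Approximate power of a two-sided two-sample z-test at alpha = .05.

    Illustrates the three determinants named in the text: significance
    level (via z_crit), effect size in the population, and sample size.
    """
    ncp = effect_size * sqrt(n_per_group / 2.0)  # noncentrality parameter
    return normal_cdf(ncp - z_crit) + normal_cdf(-ncp - z_crit)

# Power rises with either a larger effect or a larger sample:
higher_effect = two_sample_power(0.5, 90) > two_sample_power(0.3, 90)
larger_sample = two_sample_power(0.3, 180) > two_sample_power(0.3, 90)
```

Holding the significance level fixed, power grows with either the effect size or the group size, which is the pattern the power discussion above describes.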


For the Child and Family Outcome Study in the CA awardee communities, the longitudinal design assesses whether individual children and families experience meaningful improvements in outcomes between the time they enter the systems of care and subsequent data collection points. Comparisons of outcomes among different groups within a community and across communities will also be made.


Power analysis assumptions have been modified from previous phases of the evaluation to reflect higher effect sizes observed in analyses of data from communities funded in Phases IV and V and a between- versus within-site difference in our analytical approach. As a result, each site will be expected to recruit sufficient numbers of children and families to ensure enrollment of 220 in each community. Relative to previous phases, this reduction in enrollment numbers and number of cohorts for initial enrollment will allow local evaluation staff to expend their limited evaluation resources on longitudinal follow-up to increase the quality of data and improve retention rates. Enrollment of 220 children and families at intake will result in a final sample of 180, assuming 5 percent attrition at each wave over five waves of data collection (intake, 6, 12, 18, and 24 months), which will reflect approximately 82 percent retention at the end of data collection. Children and families will be enrolled into the study beginning in year 2 of funding, and enrollment can continue through the end of year 4 of funding to ensure 24-month follow-up. The target sample sizes will be large enough to ensure the ability to detect changes in outcomes over time and to compare outcomes of relevant subgroups of children and families across a variety of characteristics (e.g., referral source, demographic characteristics, and risk factors) within a community.
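The stated enrollment and retention figures can be checked with simple arithmetic; the computed values below reproduce the text's numbers to within rounding (220 × 0.95⁴ ≈ 179, which the text rounds to 180, and overall retention of about 81.5 percent, reported as approximately 82 percent):

```python
# Arithmetic check of the enrollment and retention assumptions stated above.
intake = 220                 # children and families enrolled at intake
retention_per_wave = 0.95    # i.e., 5 percent attrition at each wave
follow_up_waves = 4          # follow-ups at 6, 12, 18, and 24 months

final_n = intake * retention_per_wave ** follow_up_waves   # about 179 (text: 180)
overall_retention = retention_per_wave ** follow_up_waves  # about 0.815 (text: ~82%)
```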


These analyses can be accomplished using repeated measures analysis of variance (ANOVA). As an example, with the proposed 180 children and families enrolled in each community, a repeated measures design with one between-subjects factor with two groups of 90 each (e.g., school referral vs. not) and time as a within-subjects factor with five levels (intake and 6, 12, 18, and 24 months) has power greater than .80 to detect an effect size of 0.24 for the Group × Time interaction with α = .05. Analysis using hierarchical linear modeling (HLM) will allow for examination of variation in child-level outcomes across sites and for relating that variation to site-level characteristics. We assume a three-level model, where level 1 includes linear and quadratic change in outcomes across five waves of data collection at 6-month intervals from entry into services to 24 months, level 2 includes 180 children per site, and level 3 includes 47 sites. From our previous experience using HLM with similar populations, we assume a dependent variable with σ² = 39 (the within-person variation), τ = 130 (the sum of between-person and between-site variation), and ρ = .05 (meaning that approximately 5% of total variation in outcome change is attributed to between-site variation). Given these assumptions, the model would have power of .80 to detect an effect size of .19.
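As an arithmetic check of these HLM assumptions, the implied variance components can be derived directly; note that the split of τ into between-person and between-site pieces below is inferred from the stated definition of ρ and is not given explicitly in the text:

```python
# Variance components implied by the stated HLM assumptions.
sigma2_within = 39.0   # within-person variation (sigma squared)
tau = 130.0            # between-person plus between-site variation (tau)
rho = 0.05             # share of total variation attributed to sites

total_variance = sigma2_within + tau    # 169.0
between_site = rho * total_variance     # 8.45 (inferred from rho's definition)
between_person = tau - between_site     # 121.55 (inferred remainder)
```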


Table 5 shows the data collection schedule for the 5 years of data collection for communities funded in FY 2010. Table 6 shows the data collection schedule for the remaining 3 years of data collection for communities funded in FY 2008 and FY 2009.


Table 5. Data Collection Schedule for the Child and Family Outcome Study:
Communities funded in FY 2010

Data Collection Year Recruited¹ (data collection years FY12-13 through FY16-17); projected sample at each successive wave, beginning in the data collection year shown:

  Year 2 (beginning FY12-13): 666, 633, 601, 571, 542
  Year 3 (beginning FY13-14): 666, 633, 601, 571, 542
  Year 4 (beginning FY14-15): 666, 633, 601, 571, 542
  Year 5: no new recruitment
  Year 6: Completion of data collection if data collection goals have not been met.

1. Refers to the year of the national evaluation in which the family was recruited into the study. Across all sites, the national evaluation spans 5 years. Although data collection will occur in years 2 through 6, recruitment ends in year 4, with follow-up data collection continuing in year 6. Any sites that have not met their participant recruitment goals will be allowed to continue recruitment during year 6 as long as at least one follow-up interview can be completed before program funding ends.



Table 6. Data Collection Schedule for the Child and Family Outcome Study:
Communities funded in FY 2008 and FY 2009

Data Collection Year Recruited¹ (data collection years FY12-13 through FY16-17); projected sample at each remaining wave:

  Year 2: 2290 (FY12-13)
  Year 3: 2538, 2411, 2290 (FY12-13 through FY14-15)
  Year 4: 2812, 2671, 2538, 2411, 2290 (FY12-13 through FY16-17)
  Year 5: no new recruitment
  Year 6: Completion of data collection if data collection goals have not been met.

1. Refers to the year of the national evaluation in which the family was recruited into the study. Across all sites, the national evaluation spans 5 years. Although data collection will occur in years 2 through 6, recruitment ends in year 4, with follow-up data collection continuing in year 6. Any sites that have not met their participant recruitment goals will be allowed to continue recruitment during year 6 as long as at least one follow-up interview can be completed before program funding ends.



To reach these numbers, some CA awardee sites will need to recruit all willing families into the Child and Family Outcome Study sample. For these sites, the cross-sectional descriptive and the longitudinal samples will be identical. Other sites will need to employ a sampling strategy to randomly select a sufficient number of families from the pool of children who enter the system of care. At these sites, a systematic sampling approach will be used. The sampling interval is the nearest integer to the ratio N/n, where N is the number of children in the population and n is the number of children to be recruited into the sample. A random starting point between 1 and the interval will be selected using a table of random numbers, and children will then be systematically selected at that interval thereafter. For example, every tenth child (after the random starting point) would be sampled in a site serving 2,200 children (N/n = 2200/220 = 10), and every fifth child would be sampled in a site serving half that number, or 1,100 children (N/n = 1100/220 = 5).
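This systematic selection procedure can be sketched as follows; the sketch substitutes a pseudorandom start for the table of random numbers, and the function and variable names are illustrative:

```python
import random

def systematic_sample(population_size, target_sample_size, rng=None):
    """Systematic sampling with a random start.

    The interval is the nearest integer to the ratio of population size
    to target sample size; the starting point is drawn uniformly from
    1..interval. Returns the 1-based positions, in intake order, of the
    children selected for recruitment.
    """
    rng = rng or random.Random()
    interval = round(population_size / target_sample_size)
    start = rng.randint(1, interval)
    return list(range(start, population_size + 1, interval))

# A site serving 2,200 children with a target of 220 samples every 10th child:
positions = systematic_sample(2200, 220, rng=random.Random(2012))
```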


The purpose of the sampling strategy described above is to maximize the chance that the children who participate in the Child and Family Outcome Study are representative of the universe of children who enter the systems of care. If this is achieved, findings from the randomly selected sample are more likely to generalize to the overall client pool. Every effort will be made to recruit and follow the children who are randomly selected into the Child and Family Outcome Study; however, some of the families approached about entering the study will refuse to participate. When a family refuses, the next family that meets the selection criteria will be selected. Past experience indicates that sites vary in their ability to recruit Cross-Sectional Descriptive Study sample members into the Child and Family Outcome Study, with the majority of sites recruiting more than 60 percent of the Cross-Sectional Descriptive Study sample. To estimate the effect of refusals on the representativeness of the sample, families who refuse will be compared to the participating sample on, at minimum, demographic characteristics. (See the Data Analysis Plan section above.) Because descriptive data will be collected on all families that enter the system of care, the data needed to make these comparisons will be available.


Experience from previous phases of the national evaluation has shown that, although sites can make estimates, it is difficult to predict precisely how many children will be served by the CA awardee systems of care. In addition, the number of children who enter the systems of care may increase over time as CA awardees expand their service capacity and enhance outreach efforts. For that reason, sampling strategies will have to remain flexible during the recruitment period and will be monitored closely by the National Evaluator. The sampling strategies will be based on the sampling ratio approach to random selection described above. In the first year of their funding, CA awardees will monitor the number of children that enter their systems of care. Toward the end of the first year, a sampling ratio will be developed based on the first year of enrollment into the system of care. That sampling ratio will be tested in the first 3 months of data collection and monitored throughout the recruitment period to ensure that it remains on target.
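The interval-setting and monitoring logic described above can be sketched as follows. The 10 percent revision threshold is a hypothetical monitoring rule for illustration; the text requires close monitoring but does not specify a numeric trigger:

```python
def sampling_interval(observed_annual_intake, annual_recruitment_target):
    """Nearest-integer sampling interval based on observed enrollment."""
    return max(1, round(observed_annual_intake / annual_recruitment_target))

def interval_needs_revision(interval, observed_intake, target, tolerance=0.10):
    """Flag an interval whose projected yield drifts from the target.

    `tolerance` is a hypothetical threshold (10 percent); the actual
    monitoring criterion would be set by the National Evaluator.
    """
    projected_yield = observed_intake / interval
    return abs(projected_yield - target) / target > tolerance

# An interval of 10 stays on target while intake holds near 2,200 per year,
# but should be revisited if intake falls well short of projections:
keep = not interval_needs_revision(10, 2200, 220)
revise = interval_needs_revision(10, 1500, 220)
```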


The actual process of recruitment will differ across sites. This is necessary because children and families will enter services differently across sites. For example, in one site, the primary portals of entry might be the schools, while in another it might be the court system. It is also likely that sites will have a variety of portals of entry (e.g., mental health centers, schools, and courts). Every effort will be made to ensure that the recruitment process is as standardized as possible across sites and at the various portals of entry. Procedures for sample selection and recruitment will be documented in the national evaluation procedures manual, with additional guidelines developed specifically for each site.


Service Experience Study. The sampling and recruitment procedures for this study are identical to those of the Child and Family Outcome Study; the two studies draw on the same randomly selected sample of children and families being served in all system of care communities. Anticipated response and retention rates are therefore the same as for the Child and Family Outcome Study.


Sector and Comparison Study. For this component, 202 children will be sampled in each sectoral cluster at intake (101 in the system of care group and 101 in the comparison group), resulting in a final sample of 190 children (95 in each group) per cluster, assuming 5% attrition at each wave over five waves of data collection (intake and 6, 12, 18, and 24 months). An overall sample size of 190 (n = 95 in each group) achieves power of .80 to detect an odds ratio of 1.75 (i.e., a probability of remaining in school that is 1.75 times greater in the system of care group than in the comparison group) in a design with five repeated measurements.


Services and Costs Study. Data for the Services and Costs Study are collected only on children and youth enrolled in the Longitudinal Child and Family Outcome Study. The sampling and recruitment procedures are identical to those of that study, including the same randomly selected sample of children and families and the same response and retention rates.



2. INFORMATION COLLECTION PROCEDURES


System of Care Assessment. The National Evaluator will collect data for this component during periodic site visits. Data collection will include semi-structured interviews with key informants, review of documents and randomly selected case records, and observations. To document changes in system of care development that occur over time, all system of care communities will be visited three times during the 5 years of data collection (every 18–24 months), beginning in the second year of program funding.


The System of Care Assessment protocol yields an average of 23 individual interviews and 6 case record reviews per data collection site visit. It is expected that these averages will be achieved during the Phase VI data collection process. Key informants include the local project director, representatives of core child-serving agencies, representatives of family organizations, cultural and linguistic competence coordinators, social marketers, program evaluators, youth coordinators, care coordinators, direct service providers, caregivers of children who receive services through the system of care, and youth who receive services through the system of care. The average time to obtain the required information from each person is about 1 hour.


Prior to the site visit, the National Evaluator will send out tables to be completed by the system of care community. These tables will collect information on: (1) the structure and participants of the governing body, (2) trainings that have been provided on system of care principles, (3) demographics of program staff, (4) services provided in the system of care community’s service array, (5) amounts, sources, and types of funding, and (6) participants on the care review team. These completed tables will be e-mailed to the National Evaluator approximately 4 weeks prior to the site visit. (See Attachments B.6.A-S for System of Care Assessment protocols.)


Cross-Sectional Descriptive Study. Data for the Cross-Sectional Descriptive Study will be collected at entry into services for all children and families in the CA awardee sites. Data for this component will be collected by sites’ intake staff, who will be trained by the National Evaluator to ensure standard collection of these data. To standardize collection across sites, the National Evaluator has developed the Enrollment and Demographic Information Form (EDIF) and the Child Information Update Form (CIUF). The information will be collected primarily from administrative records, although some information can be obtained from interviews conducted at intake. The National Evaluator strongly recommends that all CA awardees incorporate these items into their intake process. These data can be entered directly into a Web-based database by intake personnel to facilitate capture of basic descriptive characteristics of children served. There is no burden associated with the EDIF or the CIUF.

To the extent possible, the collection of this information will be coordinated with the collection of data elements required for National Outcome Measures (NOMs) reporting through TRAC. The GFA for FY 2010 funding states that CA awardees will be required to report a number of performance measures to ensure that SAMHSA can meet its reporting obligations under GPRA. The required descriptive information includes the number of persons served by age, gender, race, and ethnicity. This information will be gathered using the CMHS NOMs Adult Consumer Outcome Measures for Discretionary Programs or the Child Consumer Outcome Measures for Discretionary Programs (Child or Adolescent Respondent Version or Caregiver Respondent Version).


For families participating in the Child and Family Outcome Study, the descriptive information that may change over time (e.g., diagnosis, insurance status) will also be collected at each follow-up data collection point using the Child Information Update Form (CIUF). Evaluation staff will collect these follow-up descriptive data elements in conjunction with other follow-up data collection for the Child and Family Outcome Study (see below). Again, the information collected in the Cross-Sectional Descriptive Study creates no additional respondent burden.


Child and Family Outcome Study. Data collection for this evaluation component begins in the second year of the CA awardees’ funding. Because respondents’ reading levels will vary, the instruments will be administered in interview format. This approach has been successfully implemented in Phases II, III, IV, and V. These data will be collected at intake and follow-up data collection points. In Phase VI, child and family outcome data will be collected from a sample of children, their caregivers, and their service providers. (See Attachment D.1-31 for instruments.)


Eight of the measures—the Youth Services Survey (YSS), the Delinquency Survey, Revised (DS–R), the Substance Use Survey, Revised (SUS–R), the Gain Quick–R: Substance Problem Scale (GAIN), the Youth Information Questionnaire, Revised (YIQ–R–I, YIQ–R–F), the Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2), the Reynolds Adolescent Depression Scale, Second Edition (RADS–2), and the Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y)—will be completed by youth 11 years of age and older.


Onsite data collectors, hired and managed by the sites, will collect data in the funded systems of care. In these sites, the choice of data collectors will depend on the resources and needs of each site. The National Evaluator will document and monitor data collection procedures in the system of care sites to ensure the greatest possible uniformity in data collection across sites. In addition, evaluation staff and data collectors will be trained using standard materials developed by the National Evaluator.


Sector and Comparison Study. Data for this study will be collected in select system of care communities. A subset of children enrolled in the core study will be randomly sampled into three sectoral groups (juvenile justice, education, child welfare). Service enrollment expectations established in funding awards, diversity of populations served, stratification based on sector clustering, and other factors will be considered in establishing appropriate sampling strategies and sample sizes for this study in the funded communities. In sites where local evaluation capacity will not ensure adequate data quality and participant retention, national evaluation staff will work with local evaluators to identify, hire, and train additional local staff to assist in conducting interviews. These interviewers will collect the more detailed data with sector-specific instruments. Local evaluation staff in sites with sufficient capacity will collect data as with other core study participants and will also collect the measures included in the enhanced study. To identify the comparison sample, we will work with selected unfunded agencies to develop a process for the national evaluation study coordinator to oversee data collection for the comparison study. (See Attachments E.3.A-F.)


Services and Costs Study. To provide data for this study, grant communities will collect two types of data: budget data on services provided through flexible fund expenditures, and child-level service event data, that is, data on each service provided to each child or youth by as many partner agencies in the systems of care as possible. The availability of these data, and the procedures communities will implement to access them, will vary widely across grant communities. Some of the data needed for this study are already collected by communities in existing data systems developed for their own program management purposes. Other data are recorded on paper-based forms or as part of the child’s case records. However, some communities do not currently collect the needed data, either electronically or on paper. These communities will be asked to begin collecting the data specifically for the Services and Costs Study.


Data will be compiled either by extracting data from existing data systems and recoding them according to a specified data dictionary, or by key entering information from paper records. Some communities will use one method exclusively, while others will use a combination of both.


The national evaluation will provide two data dictionaries with specifications for communities to use in recoding data from existing data systems: one for flexible fund expenditures and one for service event data. (See Attachment F.1-2.) The national evaluation will also provide two data entry applications for communities to use in key entering data from paper records: the Flex Funds Tool, for budget data on flexible funding expenditures, and the Services and Costs Data Tool, for child-level service event data. Data compiled by extracting and recoding existing data will be transmitted to the national evaluation at regular intervals. Data entered from paper records will be transmitted to a central database on an ongoing basis, as they are entered.


Cost-effectiveness and cost-benefit analyses will use data collected as part of the Child and Family Outcome Study and the Sector and Comparison Study. See the corresponding sections for descriptions of the information collection procedures for these studies.


Table 7 summarizes the respondents, data collection procedures, and periodicity for each measure.


Table 7. Instrumentation, Respondents, and Periodicity


Measure

Indicators

Data Source(s)

Method

When Collected

System of Care Assessment (all sites)

System of Care Assessment Tool (Interview Guides and Data Collection Forms)

  • Family-driven

  • Youth-guided

  • Individualized services

  • Cultural competence

  • Interagency collaboration

  • Service coordination

  • Service array

  • System & service accessibility

  • Community-based services

  • Least restrictive service provision

  • Project staff

  • Core agency representatives

  • Family members

  • Caregivers

  • Youth

  • Service providers

  • Other constituents

  • Documents

Interview

Review

Every 18–24 months

Child and Family Outcome Study (a sample of children and families enrolled in the system of care)

Caregiver Information Questionnaire, Revised (CIQ–R)

  • Age

  • Educational level and placement

  • Socioeconomic status

  • Race/ethnicity

  • Parents employment status

  • Family advocacy and peer support

  • Living arrangement

  • Presenting problem(s)

  • Intake/referral source

  • Risk factors for family and child

  • Child and family physical health

  • Coercion for services

  • Service use

Caregiver

Interview

Intake, 6 months, and every 6 months thereafter

Child and Family Outcome Study (a sample of children and families enrolled in the system of care)

Living Situations Questionnaire (LSQ)

  • Living situations

  • Number of placements

  • Restrictiveness of placements

Caregiver

Interview

Intake, 6 months, and every 6 months thereafter

Behavior and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C)

  • Strengths

  • Social competence

Caregiver of children age 6 years and older

Interview

Intake, 6 months, and every 6 months thereafter

Preschool Behavior and Emotional Rating Scale— Parent Rating Scale (PreBERS)

  • Strengths

  • Social competence

Caregivers of children age 3-5

Interview

Intake, 6 months, and every 6 months thereafter

Child Behavior Checklist 6–18 (CBCL 6–18) and Child Behavior Checklist 1½–5 (CBCL 1½ –5)

  • Symptomatology

  • Social competence

Caregiver

Interview

Intake, 6 months, and every 6 months thereafter

Education Questionnaire, Revision 2 (EQ–R2)

  • Functioning in school environments

Caregiver

Interview

Intake, 6 months, and every 6 months thereafter

Devereux Early Childhood Assessment (DECA)

  • Behavioral concerns

  • Initiative, self-control, attachment

  • Attention problems, aggression, withdrawal, emotional control

Caregiver of children aged 0–5

Interview

Intake, 6 months, and every 6 months thereafter

Parenting Stress Index (PSI)

  • Parenting characteristics and child adjustment

Caregiver of children age 12 years and younger

Interview

Intake, 6 months, and every 6 months thereafter

The Columbia Impairment Scale (CIS)

  • General functioning

Caregiver of children age 6 years and older

Interview

Intake, 6 months, and every 6 months thereafter

Caregiver Strain Questionnaire (CGSQ)

  • Caregiver strain


Caregiver

Interview

Intake, 6 months, and every 6 months

Behavior and Emotional Rating Scale—Second Edition, Youth Scale (BERS–2Y)

  • Strengths

  • Social Competence

Youth

Interview

Intake, 6 months, and every 6 months

Delinquency Survey, Revised (DS–R)

  • Delinquent or risky behaviors

Youth 11 years and older

Interview

Intake, 6 months, and every 6 months thereafter

Gain Quick–R Substance Problem Scale (GAIN)

  • Substance use, abuse, and dependence

Youth 11 years and older

Interview

Intake, 6 months, and every 6 months thereafter

Substance Use Survey, Revised (SUS–R)

  • Alcohol, tobacco, and drug use

Youth 11 years and older

Interview

Intake, 6 mo., and every 6 months thereafter

Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2)

  • Child anxiety

Youth 11 years and older

Interview

Intake, 6 months, and every 6 months thereafter

Reynolds Adolescent Depression Scale, Second Edition (RADS–2)

  • Child depression

Youth 11 years and older

Interview

Intake, 6 months, and every 6 months thereafter

Child and Family Outcome Study (a sample of children and families enrolled in the system of care)

Youth Information Questionnaire, Revised (YIQ–R)

  • Acculturation

  • Coercion

  • Peer relations

  • Symptomatology

  • Suicidality

  • Neighborhood Safety

  • Presenting problems

  • Empowerment

  • Self-efficacy

  • Life skills

  • Employment status

Youth 11 years and older

Interview

Intake, 6 months, and every 6 months thereafter

Multi-Sector Service Contacts, Revised—Intake (MSSC–R–I)

  • Type of service

  • Amount of service

  • Location of service

Caregiver

Interview

Intake if services received

Multi-Sector Service Contacts, Revised—Follow-Up (MSSC–R–F)

  • Type of service

  • Amount of service

  • Location of service

Caregiver

Interview

Intake, 6 months, and every 6 months thereafter if services received

Youth Services Survey for Families (YSS–F)

  • Service experience

  • Client satisfaction

  • Perceived outcomes

Caregiver

Interview

Intake, 6 months, and every 6 months thereafter if services received

Youth Services Survey (YSS)

  • Service experience

  • Client satisfaction

  • Perceived outcomes

Youth 11 years and older

Interview

Intake, 6 months, and every 6 months thereafter if services received

Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R)

  • Cultural competence

Caregiver

Interview

Intake, 6 months, and every 6 months thereafter if services received

Sector and Comparison Study

Teacher Questionnaire (TQ)

  • School attendance

  • Grade Level

  • School Achievement

  • Alternate/special school placements

  • Reasons for having an Individualized Education Plan (IEP)

Teacher

Interview


Education Sector Caregiver Questionnaire (ESCQ)

  • Caregiver experience with child’s IEP

Caregiver

Interview

Intake and every 6 months thereafter if child has IEP

School Administrator Questionnaire (SAQ)

  • Quality and availability of mental health services in schools

School administrator

Interview


Court Representative Questionnaire (CRQ)

  • Service referral

  • Completion of court-ordered activities

  • Arrests

  • Court appearances

Court representative

Interview


Services and Costs Study (all sites; caregivers: all enrolled in the Child and Family Outcome Study)

Management Information Systems (MIS)

  • Previous service history

  • Service setting and type

  • Level of restrictiveness

  • Mix of services

  • Amount and duration

  • Continuity of care

  • Service costs

  • Funding sources and third-party reimbursements

MIS systems maintained by State and local agencies

Data abstraction

Continuously; data transmitted at regular intervals



3. METHODS TO MAXIMIZE RESPONSE RATES


The retention of participants in multi-site, longitudinal studies is a critical concern for program evaluators. A distinctive feature of longitudinal studies is that the loss of study participants tends to accumulate as successive data collection waves are conducted. Non-response in longitudinal studies can occur in several ways and can display different patterns. If a participant fails to complete (or skips) certain items in a data collection instrument, this is referred to as item non-response. When data are missing for all subsequent waves because a participant has been lost, this is referred to as unit non-response (or attrition). Some participants may skip one wave of a longitudinal study but return in a subsequent wave of data collection; this pattern is known as wave non-response.

Participant attrition, particularly differential attrition in randomized or quasi-experimental designs, plagues many longitudinal research studies. High rates of participant attrition compromise the ability of evaluators to report the outcomes of longitudinal research accurately in two major ways. First, estimates of the amount of change over time can be distorted when participants are lost to follow-up at different rates; change may be over- or underestimated depending on the outcomes of the participants who remain, rather than reflecting the impact of the intervention. Second, if attrition substantially changes the composition of the study population over time, the outcomes may not be generalizable to similar populations outside the study. Loss of participants during follow-up can also reduce overall sample size and decrease the statistical power available to detect changes over time. High retention of participants therefore enables the accurate measurement of outcomes and is critical for minimizing the bias that can result from missing data.
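The three non-response patterns described above (item, unit, and wave non-response) can be made concrete with a short, purely illustrative sketch, which is not part of the national evaluation protocol. It classifies one participant's record of completed data collection waves; the function name and record format are invented for illustration:

```python
def classify_nonresponse(waves):
    """Classify one participant's wave record (list of booleans, one entry
    per scheduled wave, True = wave completed).

    Returns:
      'complete'           - every scheduled wave was completed
      'attrition'          - participant dropped out and never returned
                             (all missing waves fall at the end of the record)
      'wave non-response'  - participant skipped a wave but returned later
    """
    if all(waves):
        return "complete"
    # Index of the last completed wave, or -1 if no wave was completed.
    last_done = max((i for i, done in enumerate(waves) if done), default=-1)
    # A missing wave *before* the last completed one means the participant
    # skipped a wave and then returned.
    if any(not done for done in waves[:last_done + 1]):
        return "wave non-response"
    return "attrition"

# Example: intake plus three follow-up waves
print(classify_nonresponse([True, True, True, True]))    # complete
print(classify_nonresponse([True, False, True, True]))   # wave non-response
print(classify_nonresponse([True, True, False, False]))  # attrition
```

Item non-response is not shown here because it occurs within a wave (skipped items on an instrument) rather than across waves.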

National evaluation staff members recently examined the factors associated with retention of children and youth (and their families) in the longitudinal Child and Family Outcome Study. Data from the Child and Family Outcome Study, as well as site-level information on communities participating in the evaluation, were used to investigate predictors of retention in this study at 6, 12, 18, and 24 months in a 3-level hierarchical linear model. In particular, the following areas were examined for their impact on retaining study participants: 1) how characteristics of the study participants affect retention; 2) how staffing characteristics influence the ability to follow study participants over time; and 3) the effects of incentives on participant retention. Missing data patterns and the steps taken to impute missing data also were examined. The findings of this study will soon be submitted to the American Journal of Evaluation. In brief, specific findings indicated that behavioral and emotional problems at intake, referral source, proportion of total staff devoted to interviewers, and total amount of incentive paid out at baseline (but not incentive amounts offered per wave for subsequent waves) influenced retention of participants. In addition, significant random effects at both level-2 and level-3 indicated that retention varied significantly between communities and among children even after controlling for child and family characteristics.
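The bias mechanism described in this section, over- or under-estimation of change when attrition is related to participants' outcomes, can be demonstrated with a small, purely illustrative simulation. This is not one of the national evaluation's analyses; the variable names, distributions, and parameter values are all invented for illustration:

```python
# Illustrative simulation of differential attrition biasing a change estimate.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

baseline = rng.normal(50.0, 10.0, n)             # symptom score at intake
true_change = -5.0                               # everyone improves by 5 points on average
follow_up = baseline + true_change + rng.normal(0.0, 5.0, n)

# Differential attrition: participants with worse (higher) follow-up scores
# are more likely to be lost to follow-up.
p_retained = 1.0 / (1.0 + np.exp((follow_up - 50.0) / 10.0))
retained = rng.random(n) < p_retained

full_change = follow_up.mean() - baseline.mean()             # close to the true -5
naive_change = follow_up[retained].mean() - baseline.mean()  # overstates improvement

print(f"complete-data change estimate: {full_change:.2f}")
print(f"retained-only change estimate: {naive_change:.2f}")
```

Because the participants who remain have systematically better follow-up scores, the retained-only estimate suggests more improvement than actually occurred; attrition in the opposite direction would understate improvement instead.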


To maximize the response rate and reduce non-response bias for all data collection efforts, a number of steps will be taken:


The National Evaluator will continue to take an active role in providing technical assistance and support to the CA awardee sites. This will be done by providing: (1) a detailed Data Collection Procedures Manual; (2) an initial training on evaluation protocols; (3) evaluation workshops at semi-annual national meetings and through Webinars; (4) one-on-one contact with national evaluation liaisons; (5) regular teleconferences and site visits throughout the evaluation period; (6) forums for cross-community facilitated discussions; (7) reading materials; and (8) additional guidance and information as questions arise. In addition, resources will be provided to ensure that site evaluators know when an interview is due for completion: a tracking system built in Microsoft Access specifically for this evaluation, and reminder e-mails generated by the Internet-based data collection system, which eliminate the need for sites to duplicate effort and expense in designing local tracking materials.


Additionally, the National Evaluator will provide mechanisms for sites to communicate with the National Evaluator and with other sites. This will be done by providing an Internet-based listserv to facilitate communication about training and technical assistance regarding evaluation implementation and utilization. The listserv allows site evaluators to communicate with the National Evaluator and each other through group e-mail; any message sent to the listserv is automatically distributed to all site evaluators. The listserv is run at no cost to site evaluators.


Special training efforts will also be conducted in communities with smaller service populations to ensure that as many people as possible from the target population are enrolled and that site staff are familiar with methods for maximizing response rates. The National Evaluator will encourage these sites to keep in frequent contact with study participants to update telephone numbers and addresses, and to create program branding and materials to engage families. In addition, the National Evaluator will provide these sites with contact information for staff from other sites that have achieved high response rates and will assist them in applying strategies that have been used successfully in other communities.


To help ensure that data are collected regularly and in keeping with national evaluation standards, data collection staff at the local sites will continue to work closely with local providers, staff from various agencies, and evaluation staff. These contacts will inform the evaluation implementation and data collection procedures and address any questions or concerns of the participating providers or agencies. In addition, local parent groups will be enlisted to encourage the cooperation of families in providing child and family information.


In keeping with the national evaluation standards, information will be collected from participants in the Child and Family Outcome Study to facilitate contacting them in the future. This will include the names, phone numbers, and addresses of close friends and family members who are likely to know where the participants are if they move. At the time of follow-up data collection, staff will attempt to contact respondents at different times of the day and week using a variety of methods (e.g., phone calls, mailed postcards). These attempts will continue until a determination is made that a family has refused further participation or cannot be found. Efforts to contact respondents for follow-up data collection will begin at least 1 month before the follow-up interview is due. Other efforts to increase the response rate and reduce non-response bias will include:


    • Providing an incentive payment for completing follow-up interviews;

    • Administering the instruments to children and their parents or caregivers at times and settings of their choice and administering multiple instruments at one time;

    • Developing a close working relationship between the data collection staff and providers at each site to facilitate tracking;

    • Conducting follow-up and informational mailings throughout the study period to maintain contact with study participants;

    • Using a centralized data collection and tracking system involving trained interviewers and at least one person dedicated to the tracking of study participants over time to keep study attrition to a minimum;

    • Employing proven tracking techniques (e.g., request address corrections from the post office for forwarded mail, use Web-based address and telephone searches, employ locator services to search for respondents);

    • Obtaining permission from caregivers for evaluators to contact other agencies for the purpose of getting new addresses and phone numbers if the family has moved since the last interview;

    • Providing sites with useful feedback on data obtained through the evaluation activities that will assist them in planning and service delivery.



4. TESTS OF PROCEDURES


Many instruments planned for Phase VI are standardized instruments that have been tested through use in children’s mental health services research and practice. These include the Child Behavior Checklist (CBCL), the Behavioral and Emotional Rating Scale—Second Edition (BERS–2), the Preschool Behavioral and Emotional Rating Scale (PreBERS), the Devereux Early Childhood Assessment (DECA), the Gain Quick–R: Substance Problem Scale (GAIN), the Youth Services Surveys (YSS), the Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS–2), and the Reynolds Adolescent Depression Scale, Second Edition (RADS–2). Selection of measures was based on expert panel reviews and an assessment of measurement quality as reported in the literature. Decisions about Phase VI instrumentation were made in conjunction with expert reviewers, site representatives, and family members. These consultants are listed in Attachment A.


In addition to providing input into the selection of standardized instruments, the team of consultants also suggested measures to be removed from the evaluation, and specific items to include in the evaluation (which have been incorporated into the new and revised measures). New and revised measures have been administered to determine burden estimates. Experience and data from previous phases were further used to assess reliability and validity and contributed to the burden estimates.


Table 8 below shows the currently OMB-approved data collection instruments that this request proposes to continue:


Table 8. Currently OMB-Approved Data Collection: Propose to Continue

  • Child Behavior Checklist (CBCL)

  • Services and Costs Data Dictionary/Data Entry Application

  • Behavioral and Emotional Rating Scale—Second Edition (BERS-2)

  • System of Care Assessment Interview Protocols

  • Preschool Behavioral and Emotional Rating Scale (PreBERS)

  • Caregiver Information Questionnaire, Revised (CIQ-R)

  • Devereux Early Childhood Assessment (DECA)

  • Education Questionnaire, Revision 2 (EQ-R2)

  • Gain Quick-R: Substance Problem Scale (GAIN)

  • Substance Use Survey, Revised (SUS-R)

  • Youth Services Survey (YSS)

  • Delinquency Survey, Revised (DS-R)

  • Revised Children’s Manifest Anxiety Scale, Second Edition (RCMAS-2)

  • Youth Information Questionnaire, Revised (YIQ-R)

  • Reynolds Adolescent Depression Scale, Second Edition (RADS-2)

  • Multi-Sector Service Contacts, Revised—Follow-Up (MSSC-R-F)

  • Cultural Competence and Service Provision Questionnaire, Revised (CCSP-R)

  • Parenting Stress Index (PSI)

  • Youth Services Survey for Families (YSS-F)

  • Court Representative Questionnaire (CRQ)


  • Teacher Questionnaire (TQ)

  • Living Situations Questionnaire (LSQ)

  • School Administrator Questionnaire (SAQ)

  • Columbia Impairment Scale (CIS)


  • Multi-Sector Service Contacts, Revised—Intake (MSSC-R-I)

  • Caregiver Strain Questionnaire (CGSQ)

  • Flex Funds Data Dictionary/Tool



New instruments added to Phase VI:


  • Education Sector Caregiver Questionnaire (ESCQ) added to Education Sector and Comparison Study

  • Child Welfare Sector Study Record Review Form (CWRF). There is no burden associated with this instrument because data are obtained from administrative records.


Measures removed from the previously approved Phase VI evaluation:

  • Child Welfare Sector Study Questionnaire: Initial and Follow-up (CWSQ-I and CWSQ-F)

  • Alumni Networking and Collaboration Survey

  • Alumni Network Web Site Satisfaction Survey

  • Local CQI Focus Group Guide

  • National CQI Focus Group Guide

  • CQI Monitoring Survey

  • Sustainability Survey

  • Sustainability Survey: Brief Form


In addition to these measures, the Phase VI evaluation will include an electronic data transfer to obtain records from the education and juvenile justice sectors. These data will be collected as part of the Sector and Comparison Study and will include administrative data such as school grades, school attendance, and arrest and adjudication records.


All the measures for Phase VI have been or will be translated into Spanish. The reliability and validity of the Spanish Child Behavior Checklist (CBCL) have been reported in the literature. Translation of measures will be conducted using established procedures, as in earlier phases. First, experienced bilingual translation consultants translated the measures from English to Spanish. Then, to maximize the accuracy of the translation, full measures (or in some cases selected sections) were back-translated from Spanish to English by other translators, most of whom were native Spanish speakers in CA awardee communities.



5. STATISTICAL CONSULTANTS


The National Evaluator has full responsibility for the development of the overall statistical design, and assumes oversight responsibility for data collection and analysis for Phase VI. Training, technical assistance, and monitoring of data collection will be provided by the National Evaluator. The individual responsible for overseeing data collection and analysis is:


Brigitte Manteuffel, Ph.D.

Macro International Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321-3211


The following individuals will serve as statistical consultants to this project:


Susan L. Ettner, Ph.D.

Professor

David Geffen School of Medicine at UCLA

Division of General Internal Medicine and Health Services Research

911 Broxton Plaza, Room 106

Box 951736

Los Angeles, CA 90095-1736

Phone: (310) 794-2289

Fax: (310) 794-0732


Anna Krivelyova, M.S.

Macro International Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321–3211


Robert Stephens, Ph.D., M.P.H.

Macro International Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321–3211


Tesfayi Gebreselassie, Ph.D.

ICF Macro

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321–3211


The agency staff person responsible for receiving and approving contract deliverables is:


Ingrid Goldstrom, M.Sc.

Child, Adolescent, and Family Branch

Center for Mental Health Services

Substance Abuse and Mental Health Services Administration

1 Choke Cherry Road, Room 6–1047

Rockville, MD 20857

LIST OF ATTACHMENTS



Attachment A: Consultation

1. Federal/National Partnership for Children’s Mental Health

2. Methodological Consultants and Services Evaluation Committee

3. Expert Reviewers of Instrumentation


Attachment B: System of Care Assessment

1. System of Care Assessment Framework

2. Letter Templates

3. Informant Table

4. Pre-Visit Documentation

5. Consent Forms

6. System of Care Assessment Interview Protocols


Attachment C: Cross-Sectional Descriptive Study

1. Enrollment and Demographic Information Form (EDIF)

2. Child Information Update Form (CIUF)


Attachment D: Longitudinal Child and Family Outcome Study and Service Experience Study

1. Caregiver Information Questionnaire, Revised: Caregiver—Intake (CIQ–RC–I)

2. Caregiver Information Questionnaire, Revised: Caregiver—Follow-Up (CIQ–RC–F)

3. Caregiver Information Questionnaire, Revised: Staff as Caregiver—Intake (CIQ–RS–I)

4. Caregiver Information Questionnaire, Revised: Staff as Caregiver—Follow-Up (CIQ–RS–F)

5. Caregiver Strain Questionnaire (CGSQ)

6. Child Behavior Checklist (CBCL 1½–5)

7. Child Behavior Checklist (CBCL 6–18)

8. Education Questionnaire, Revision 2 (EQ–R2)

9. Living Situations Questionnaire (LSQ)

10. Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C)

11. Columbia Impairment Scale (CIS)

12. Parenting Stress Index (PSI)

13. Devereux Early Childhood Assessment for Infants (DECA 1–18M)

14. Devereux Early Childhood Assessment for Toddlers (DECA 18–36M)

15. Devereux Early Childhood Assessment (DECA 2–5Y)

16. Preschool Behavioral and Emotional Rating Scale (PreBERS)

17. Delinquency Survey, Revised (DS–R)

18. Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y)

19. Gain Quick–R: Substance Problem Scale (GAIN)

20. Substance Use Survey, Revised (SUS–R)

21. Revised Children’s Manifest Anxiety Scale (RCMAS)

22. Reynolds Adolescent Depression Scale—Second Edition (RADS–2)

23. Youth Information Questionnaire, Revised—Intake (YIQ–R–I)

24. Youth Information Questionnaire, Revised—Follow-Up (YIQ–R–F)

25. Multi-Sector Service Contacts, Revised: Caregiver—Intake (MSSC–RC–I)

26. Multi-Sector Service Contacts, Revised: Caregiver—Follow-Up (MSSC–RC–F)

27. Multi-Sector Service Contacts, Revised: Staff as Caregiver—Intake (MSSC–RS–I)

28. Multi-Sector Service Contacts, Revised: Staff as Caregiver—Follow-Up (MSSC–RS–F)

29. Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R)

30. Youth Services Survey for Families, Abbreviated Version (YSS–F)

31. Youth Services Survey, Abbreviated Version (YSS)


Attachment E: Sector and Comparison Study

1. Introduction Scripts

2. Consent Forms

3. Instruments

a. Court Representative Questionnaire (CRQ)

b. Teacher Questionnaire (TQ)

c. School Administrator Questionnaire (SAQ)

d. Education Sector Caregiver Questionnaire (ESCQ)

e. Child Welfare Sector Study Record Review Form (CWRF)


Attachment F: Services and Costs Study

1. Flexible Funds Data Dictionary

2. Services and Costs Data Dictionary


Attachment G: Consent Forms for Longitudinal Child and Family Outcome Study and Service Experience Study

1. Sample Informed Consent—Caregiver Version

2. Sample Informed Assent—Youth Version

3. Sample Informed Consent—Young Adult Version


Crosswalk of Uploaded Documents


Caregiver—Instruments


Attachment B: System of Care Assessment

6. System of Care Assessment Interview Protocols

i. Caregiver of Child or Youth Served by the Program


Attachment D: Longitudinal Child and Family Outcome Study and Service Experience Study

1. Caregiver Information Questionnaire, Revised: Caregiver—Intake (CIQ–RC–I)

2. Caregiver Information Questionnaire, Revised: Caregiver—Follow-Up (CIQ–RC–F)

3. Caregiver Information Questionnaire, Revised: Staff as Caregiver—Intake (CIQ–RS–I)

4. Caregiver Information Questionnaire, Revised: Staff as Caregiver—Follow-Up (CIQ–RS–F)

5. Caregiver Strain Questionnaire (CGSQ)

6. Child Behavior Checklist (CBCL 1½–5)

7. Child Behavior Checklist (CBCL 6–18)

8. Education Questionnaire, Revision 2 (EQ–R2)

9. Living Situations Questionnaire (LSQ)

10. Behavioral and Emotional Rating Scale—Second Edition, Parent Rating Scale (BERS–2C)

11. Columbia Impairment Scale (CIS)

12. Parenting Stress Index (PSI)

13. Devereux Early Childhood Assessment for Infants (DECA 1–18M)

14. Devereux Early Childhood Assessment for Toddlers (DECA 18–36M)

15. Devereux Early Childhood Assessment (DECA 2–5Y)

16. Preschool Behavioral and Emotional Rating Scale (PreBERS)

25. Multi-Sector Service Contacts, Revised: Caregiver—Intake (MSSC–RC–I)

26. Multi-Sector Service Contacts, Revised: Caregiver—Follow-Up (MSSC–RC–F)

27. Multi-Sector Service Contacts, Revised: Staff as Caregiver—Intake (MSSC–RS–I)

28. Multi-Sector Service Contacts, Revised: Staff as Caregiver—Follow-Up (MSSC–RS–F)

29. Cultural Competence and Service Provision Questionnaire, Revised (CCSP–R)

30. Youth Services Survey for Families, Abbreviated Version (YSS–F)


Attachment E: Sector and Comparison Study

3. Instruments

d. Education Sector Caregiver Questionnaire (ESCQ)



Caregiver—Other


Attachment B: System of Care Assessment

2. Letter Templates: Family Stipend Receipt

5. Consent Forms—Caregiver, Parent/Guardian Approval for Youth Participant Aged 14–17, Informed Consent for Record Review


Attachment E: Sector and Comparison Study

1. Introduction Scripts

2. Consent Forms


Attachment G: Consent Forms for Longitudinal Child and Family Outcome Study and Service Experience Study

1. Sample Informed Consent—Caregiver Version



Youth—Instruments


Attachment B: System of Care Assessment

6. System of Care Assessment Interview Protocols

p. Youth Respondent


Attachment D: Longitudinal Child and Family Outcome Study and Service Experience Study

17. Delinquency Survey, Revised (DS–R)

18. Behavioral and Emotional Rating Scale—Second Edition, Youth Rating Scale (BERS–2Y)

19. Gain Quick–R: Substance Problem Scale (GAIN)

20. Substance Use Survey, Revised (SUS–R)

21. Revised Children’s Manifest Anxiety Scale (RCMAS)

22. Reynolds Adolescent Depression Scale—Second Edition (RADS–2)

23. Youth Information Questionnaire, Revised—Intake (YIQ–R–I)

24. Youth Information Questionnaire, Revised—Follow-Up (YIQ–R–F)

31. Youth Services Survey, Abbreviated Version (YSS)



Youth—Other


Attachment B: System of Care Assessment

2. Letter Templates: Youth Stipend Receipt

5. Consent Forms—Youth (18–21 years old), Youth (14–17 years old)


Attachment E: Sector and Comparison Study

2. Consent Forms


Attachment G: Consent Forms for Longitudinal Child and Family Outcome Study and Service Experience Study

2. Sample Informed Assent—Youth Version

3. Sample Informed Consent—Young Adult Version



Provider/Administrator—Instruments


Attachment B: System of Care Assessment

6. System of Care Assessment Interview Protocols

a. Core Agency Representative

b. Project Director

c. Family Representative/Representative of Family/Advocacy Organizations

d. Program Evaluator

e. Intake Worker

f. Care Coordinator

g. Direct Service Delivery Staff

h. Care Review Participant

l. Direct Service Staff from Other Public Child-Serving Agencies

m. Care Record/Chart Review

n. Other Staff

o. Debriefing Document

q. Youth Coordinator

r. Cultural and Linguistic Competence Coordinator

s. Social Marketing-Communications Manager


Attachment C: Cross-Sectional Descriptive Study

1. Enrollment and Demographic Information Form (EDIF)

2. Child Information Update Form (CIUF)


Attachment E: Sector and Comparison Study

3. Instruments

a. Court Representative Questionnaire (CRQ)

b. Teacher Questionnaire (TQ)

c. School Administrator Questionnaire (SAQ)

e. Child Welfare Sector Study Record Review Form (CWRF)


Attachment F: Services and Costs Study

1. Flexible Funds Data Dictionary

2. Services and Costs Data Dictionary



Provider/Administrator—Other


Attachment B: System of Care Assessment

1. System of Care Assessment Framework

2. Letter Templates

3. Informant Table

4. Pre-Visit Documentation

5. Consent Forms—Staff

Attachment E: Sector and Comparison Study

2. Consent Forms



Consultation


Attachment A: Consultation

1. Federal/National Partnership for Children’s Mental Health

2. Methodological Consultants and Services Evaluation Committee

3. Expert Reviewers of Instrumentation



