Enhanced Services for the Hard-to-Employ Demonstration and Evaluation: Rhode Island 36-Month Follow-Up Data Collection

OMB: 0970-0337












SUPPORTING STATEMENT

FOR OMB CLEARANCE

PART B


DHHS/ACF/ASPE/DOL

ENHANCED SERVICES FOR THE HARD-TO-EMPLOY (HtE)

DEMONSTRATION AND EVALUATION PROJECT




RHODE ISLAND 36-MONTH DATA COLLECTION INSTRUMENTS

October 24, 2007




B. COLLECTION OF INFORMATION USING STATISTICAL METHODS


B1. Sampling


The follow-up survey sample size for the parents will be 400 respondents. Depending on whether a mother has one focal child, two focal children in the same age group, or one focal child in each age group, the estimated sample size is 298 for the youth survey and 164 for the younger children. These follow-up sample estimates, shown in Exhibit B1-1, are based on the assumption that 80 percent of the research sample will be successfully interviewed.

Exhibit B1-1

Follow-up Survey Sample Sizes

Survey Efforts/Sites                              Research Samples    Follow-Up Survey Samples

Rhode Island 36-month Parent Survey                      500                     400
Rhode Island 36-month Youth Survey                       373                     298
Rhode Island 36-month Child Direct Assessment            205                     164
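
The follow-up sample sizes in Exhibit B1-1 follow directly from applying the assumed 80 percent response rate to each research sample. The short sketch below, in Python, simply restates that arithmetic; the labels are shorthand for the survey efforts listed in the exhibit.

# Expected follow-up survey samples under the assumed 80 percent response rate,
# using the research sample sizes from Exhibit B1-1.
RESPONSE_RATE = 0.80

research_samples = {
    "Parent Survey": 500,
    "Youth Survey": 373,
    "Child Direct Assessment": 205,
}

for name, size in research_samples.items():
    print(f"{name}: {round(size * RESPONSE_RATE)} expected completed interviews")
# Prints 400, 298, and 164, matching the Follow-Up Survey Samples column.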


The evaluation literature often discusses the appropriateness of the sample size for a study by focusing on the smallest program impacts that are likely to be detected with a specified level of confidence, assuming a sample of a given size and characteristics. These are usually called the program’s “minimum detectable effects” (MDEs). Analysis of MDEs is also referred to as “power analysis,” as it estimates the study's power to measure the effects it was designed to find.


As a guide to determining appropriate sample sizes, Exhibit B1-2 projects the statistical power of sampling plans for a two-group impact estimate, using results for children aged 11-16 years in the New Hope 5-Year Follow-Up.[1] Exhibit B1-2 reports MDEs, the minimum program impact that a sample has an acceptable chance of detecting (with a .10 significance level and .80 power), for four child outcomes and for samples of different sizes.


Exhibit B1-2

Minimum Detectable Effects for Key Outcomes in Effect Size Units,
5 Years of Follow-Up, New Hope Project

Size of Program and    Positive Behavior    Externalizing    Internalizing    Achievement
Control Group          Scale                Behaviors        Behaviors

75/75                        .21                  .28              .26            .44
100/100                      .18                  .24              .23            .38
150/150                      .15                  .20              .18            .31
200/200                      .13                  .17              .16            .27
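
For reference, the magnitudes in Exhibit B1-2 can be compared against the standard formula for the minimum detectable effect of a two-group experimental comparison. The sketch below, in Python, is illustrative only: it assumes a two-tailed test at the .10 significance level, .80 power, equal program and control groups, and no covariate adjustment. The unadjusted values it produces are larger than those in the exhibit; the smaller exhibit values are consistent with additional precision from design features such as regression adjustment for baseline characteristics in the New Hope analysis.

# Illustrative only: approximate minimum detectable effects (MDEs), in
# effect-size (standard deviation) units, for a two-group experimental design.
# Assumptions (not taken from the New Hope analysis itself): two-tailed test at
# alpha = .10, power = .80, equal group sizes, no covariate adjustment.
from statistics import NormalDist

ALPHA = 0.10   # two-tailed significance level
POWER = 0.80   # target statistical power

def mde(n_program, n_control, r_squared=0.0):
    """Approximate MDE in effect-size units for a program/control contrast.

    r_squared is the share of outcome variance explained by baseline
    covariates; regression adjustment shrinks the MDE by sqrt(1 - R^2).
    """
    z = NormalDist()
    multiplier = z.inv_cdf(1 - ALPHA / 2) + z.inv_cdf(POWER)
    standard_error = (1 / n_program + 1 / n_control) ** 0.5
    return multiplier * standard_error * (1 - r_squared) ** 0.5

for n in (75, 100, 150, 200):
    print(f"{n}/{n}: unadjusted MDE = {mde(n, n):.2f}")
# Prints 0.41, 0.35, 0.29, and 0.25; the smaller MDEs in Exhibit B1-2 are
# consistent with the extra precision captured by the r_squared argument above.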



Meta-analyses of the randomized clinical trial literature on the treatment of outpatients with major depression suggest that differences in post-treatment depression symptoms between active treatment and control groups average about 1 standard deviation.[2] Given that these are efficacy trials in controlled clinical settings, an effectiveness trial such as the current study may result in a smaller expected effect size on depression of .66 standard deviation, based on prior work on the effects of care management as used in the current study.[3] Given that these effectiveness trials have found stronger effects in low-income samples,[4] this may be a conservative estimate for this study. The same assumption was made in a related NIH grant application testing this same intervention in a working population (P. Wang, Principal Investigator; Kessler and Simon, co-Principal Investigators).



Longitudinal research on the effects of parental depression on outcomes for children finds effect sizes ranging from .36 (based on rates of Major Depressive Disorder (MDD) in a four-year follow-up of a community sample of parents, some of whom had affective disorders[5]) to .6 (found in a 10-year follow-up of clinically referred parents[6]). Assuming the average of these two estimates (.48) and combining it with the .66 treatment effect size on symptom severity, we expect an effect size of this intervention on children's diagnoses of depression of about .32 (.66 × .48). To detect an effect of .32, the total sample of children must be at least 242, given that diagnoses of depression will be assessed only in the older age group. Our total sample of 298 for this age group will therefore be able to detect this level of impact.
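
As a rough check on the figure cited above, the following sketch (illustrative only, not the evaluation's formal power analysis) reproduces a required total of 242 youth under one set of conventional assumptions: a two-tailed test at the .10 significance level, .80 power, equal program and control groups, and no covariate adjustment.

# Illustrative check of the total youth sample needed to detect an effect size
# of .32; assumes a two-tailed test at alpha = .10, power = .80, equal program
# and control groups, and no covariate adjustment.
import math
from statistics import NormalDist

effect_size = 0.32                      # roughly .66 x .48, as derived in the text
z = NormalDist()
multiplier = z.inv_cdf(1 - 0.10 / 2) + z.inv_cdf(0.80)

# Per-group sample size for a two-group design: n = 2 * (multiplier / effect)^2
n_per_group = math.ceil(2 * (multiplier / effect_size) ** 2)
total_needed = 2 * n_per_group
print(f"per group: {n_per_group}, total: {total_needed}")
# Prints a total of 242, matching the figure in the text; the available youth
# sample of 298 therefore exceeds this minimum.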

B2. Procedures for Collection of Information


The follow-up survey data will be collected through in-person, paper-and-pencil interviews for the parents and through Audio-CASI (audio computer-assisted self-interviewing) for the youth survey. In-person outreach and interviewing strategies will be used to maximize response rates. MDRC will work with HumRRO to develop strategies designed to achieve an 80 percent response rate.


All completed surveys will be reviewed to ensure that all applicable fields are correctly completed and that all relevant interviewer notes are included in the data set. Any open-ended and "other, please specify" items will be coded based on coding schemes developed by HumRRO and approved by MDRC. Preliminary data files, with documentation, will be created and shared with MDRC on an agreed-upon schedule.

B2.1 Procedures for the surveys


Interviewer Selection. MDRC will work with HumRRO to ensure that the interviewers administering these follow-up surveys are professional interviewers, many of whom have worked on social research projects. Preference will be given to those who are multilingual, depending on the languages spoken by the research samples. Familiarity with the special requirements of interviewing low-income populations will be desirable. New personnel will be trained along with the seasoned interviewers.


Interviewer Training. MDRC will work with HumRRO to ensure sufficient interviewer training. In the past, this has typically involved four-day trainings. Personnel who are new to interviewing will be trained in general interviewing techniques and approaches on the first day of the session; professional interviewers will then join the new recruits for training on project-specific material during the remaining three days. Some pre-training exercises are likely to be required, and the actual training will include an item-by-item review of the survey instruments, practice interviews, and critiques of those interviews. Direct assessments will first be practiced through role-playing, and later with actual children prior to entering the field.


Training will take place close to the time when the first cohorts of research subjects reach the 36-month anniversary of their random assignment date.


All interviewers will sign a confidentiality pledge during training. They will be instructed on the importance of maintaining confidentiality and told that breaches of confidentiality will lead to dismissal.


MDRC will also work with HumRRO to establish procedures for monitoring each interviewer's early interviews (e.g., by videotaping them), for periodically monitoring interviews throughout the course of data collection, and for offering feedback.


Conducting Interviews. In all cases, the interviewers will explain the purpose of the interview, and inform respondents that they will receive a small incentive for completing the survey. Each interviewer will be prepared to answer any questions about the study that sample members might have.


Interviewer Supervision. Interviewing staff will be supervised directly by staff from HumRRO.

B3. Maximizing Response Rates


The goal will be to achieve an 80 percent response rate in each site. Procedures for obtaining the maximum degree of cooperation include:

  • Conveying the purposes of the survey to respondents so that they thoroughly understand them and perceive that cooperating is worthwhile;

  • Providing a toll-free number that respondents can use to ask questions about the survey or about the survey firm's staff;

  • Training site staff to be encouraging and supportive, and to provide assistance to respondents as needed;

  • Hiring interviewers who have necessary skills for encouraging respondent cooperation;

  • Training interviewers to maintain one-on-one personal rapport with respondents; and

  • Offering appropriate payments to respondents.


The HtE Rhode Island follow-up surveys are designed to be administered in person. Individuals who decline to meet in person will be offered the option of completing the parent and youth surveys over the telephone, forgoing the opportunity for a direct child assessment (if applicable).


Interviewers will also be trained to distinguish "soft" refusals from "hard" ones. Soft refusals often occur when the sample member has been reached at an inopportune time. In these cases, it is important to back off gracefully and to establish a convenient time to call or come back rather than to persist at the moment. Hard refusals do occur and must also be accepted gracefully by the interviewer.

B4. Pre-testing


Most of the questions proposed for this survey are either identical to questions used in prior MDRC evaluations or are similar, if not identical, to questions used in previous national surveys or major evaluations. Consequently, many of the items have been thoroughly tested on larger samples.


The HtE Rhode Island follow-up surveys have already undergone a number of revisions, following critiques by internal staff, by project consultants, and by staff at HHS. Revisions were also made on the basis of our small-scale consultations.


MDRC will also work closely with HumRRO's senior staff to conduct formal pre-tests of these follow-up surveys, using nine sample members with ample contact information to complete each survey in person. These pre-tests will provide more definitive estimates of the length of each survey and its various components, and will lead to improvements in question wording and document formatting. Following each of the pre-tests, respondents will be debriefed about the clarity of the questions and any potential problems with the instruments. Interviewers will also be debriefed about any problems they encountered in survey administration and will be asked to recommend improvements. Translated versions of the surveys will be developed once the English versions are finalized.



B5. Consultants on Statistical Aspects of the Design


There were no consultants on the statistical aspects of the design. For consultants on survey development aspects, see A8.



[1] Huston et al., 2005.

[2] Agency for Healthcare Policy and Research, 1999; DeRubeis et al., 1999; Gaffan, Tsaousis, & Kemp-Wheeler, 1995; Gloaguen et al., 1998; Joffe, Sokolov, & Streiner, 1996; Moncrieff, Wessely, & Hardy, 1998.

[3] Katzelnick et al., 2000; Wells et al., 2000.

[4] Miranda et al., 2003.

[5] Beardslee et al., 1993.

[6] Weissman et al., 1997.



