Supporting Healthy Marriage Project Baseline Data Collection

OMB: 0970-0299




PART B. COLLECTION OF INFORMATION USING STATISTICAL METHODS

B1. Sampling

The SHM research sample, including the baseline survey sample, will contain a maximum of 8,000 couples (16,000 individual respondents - husbands and wives) who volunteer to participate in marriage education classes across 8 demonstration sites. The number of couples/individuals may be lower depending on site program capacity or budget constraints. The random assignment ratio is 50/50. To be included in the programs and research, both the husband and wife must enroll and agree to participate. Because the low-income married population is so heterogeneous, the universe of respondents for the survey will include individuals who differ by age, race, income, level of marital distress, and a host of other factors.

The evaluation literature often discusses the appropriateness of the sample size for a study by focusing on the smallest program impacts that are likely to be detected with a specified level of confidence, assuming a sample of a given size and characteristics. These are usually called the program’s “minimum detectable effects” (MDEs). Analysis of MDEs is also referred to as “power analysis,” as it estimates the study's power to measure the effects it was designed to find.

Exhibit B1.1 shows the minimum detectable effects (MDEs) that can be achieved with different sample sizes. Because the literature on marriage education often expresses results in effect sizes (that is, in terms of the number of standard deviations of the outcome), the first column shows minimum detectable effect sizes. The remaining columns show the expected MDEs for several key outcomes — marital satisfaction, divorce rates, child well-being, and parental earnings — expressed as percentages of likely control group levels, based on recent experiments with low-income families.

We expect each site to randomly assign 400-500 couples to a program group and a similar number of couples to a control group. We present the MDEs based on the lower sample estimates to be conservative in the presentation of detectable effects. We expect 80 percent of the sample (320 control and 320 program group couples in each site) to complete surveys at the 12-month follow-up (OMB #0970-0339). The exhibit therefore shows MDEs for several sample sizes:

  1. 160 couples in each research group, which could represent a subgroup of half of the sample in a site;

  2. 320 couples in each research group, representing a single site;

  3. 640 couples in each research group, representing results for the two curricula that will be used in two sites;

  4. 960 couples in each research group, representing results for the curriculum that will be used in three sites; and

  5. 2,560 couples in each research group, representing results for all sites pooled.

As the exhibit indicates, the MDE in each site is 0.20 standard deviations. This means that if the true effect of an intervention is 0.20 standard deviations, then the difference in survey-based outcomes between program and control groups would be statistically significant in 80 percent of experimental tests of that intervention. Compared with many marital interventions studied using random assignment with middle-class white couples, a short-term impact of 0.20 standard deviations is not especially large. Meta-analyses of marriage education and marital and family therapy have found average effect sizes at post-program assessment of 0.50 standard deviations or more.

If sites are pooled, the study has a much better chance of finding statistically significant impacts on survey-based outcomes. For two sites using the same curriculum, for example, the MDE is about 30 percent lower when the two sites are combined than when they are looked at separately. For three sites using another curriculum, the MDE is more than 40 percent smaller when the three sites are pooled. We also plan to estimate results pooling all eight sites. This will reduce the MDE by nearly two thirds.
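The arithmetic behind these pooling gains can be illustrated with the standard approximation that the minimum detectable effect size (MDES) for a two-group comparison is proportional to one over the square root of the sample size per group, so pooling k equally sized sites multiplies the single-site MDES by 1/sqrt(k). The sketch below uses the conventional 2.8 multiplier (80 percent power, two-tailed test at the 5 percent level), the 320-couples-per-group survey completion figure from above, and no covariate adjustment; these are simplifying assumptions for illustration, not the study's exact calculation, which is why the single-site value comes out slightly above the 0.20 standard deviations shown in Exhibit B1.1.

```python
import math

def mdes(n_per_group, multiplier=2.8):
    """Approximate minimum detectable effect size (in standard deviations)
    for a two-group comparison with n_per_group couples in each group."""
    return multiplier * math.sqrt(2.0 / n_per_group)

per_site = 320  # expected 12-month survey completers per research group in one site
for k, label in [(1, "single site"),
                 (2, "two sites pooled"),
                 (3, "three sites pooled"),
                 (8, "all eight sites pooled")]:
    n = k * per_site
    reduction = 1 - 1 / math.sqrt(k)  # proportional shrinkage relative to one site
    print(f"{label}: n = {n} couples per group, "
          f"MDES ~ {mdes(n):.2f} SD, {reduction:.0%} smaller than one site")
```

Under these assumptions the pooled reductions come out to roughly 29, 42, and 65 percent, consistent with the 30 percent, 40 percent, and nearly-two-thirds figures cited above.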


It is assumed that nearly all sample members assigned for the study will be interviewed at baseline; a 95 percent response rate is anticipated. Sufficient sample will be identified to produce approximately 1,600 completed baseline interviews in each site. Procedures for maximizing response rates are discussed in section B3.

B2. Procedures for Collection of Information

The following approaches will be used to collect the baseline data:

  1. Program staff will complete the eligibility checklist with each member of the couple.

  2. If found eligible, a staff member will then assist the couple in completing the baseline information form (BIF). Although the couple will complete BIFs at the same time, each member of the couple will complete their own form.

  3. The couple will be separated in order to complete the self-administered questionnaire (SAQ). Due to the sensitive nature of some SAQ items, all efforts will be made to ensure privacy for respondents, including moving them to different rooms when possible. During completion of the SAQ, staff members will be available to answer questions or provide assistance, but will not be actively involved in administering the SAQ. Respondents will be instructed to place their completed SAQ into an envelope and seal it; the sealed envelope will be returned directly to the researchers.

  4. Each member of the couple will be asked to complete a contact information form indicating the name, address, and other contact information of friends and family members for future reference.

All completed interviews, except the SAQ, will be reviewed internally by data team personnel at MDRC to ensure that all applicable questions are correctly completed and that all relevant interviewer notes are included in the data set. Any open-ended and “other, please specify” items will be coded based on codes approved by MDRC. Data files are transferred to MDRC electronically and securely on a regular basis from Social Solutions, the developers of the MIS.


B2.1 Procedures for the Baseline Data Collection


Site Staff. In all SHM sites, specific site staff are designated and trained to assist in the administration of the eligibility screener, BIF, SAQ, and contact information form. Each site has designated one or two people who are responsible for this phase of the study.

Training Site Staff. Project personnel from MDRC have conducted training for the designated staff at each site. In the training, the use of the computer system and MIS was explained, confidentiality issues were discussed, the questionnaires were reviewed, and mock interviews were entered into the system. The record-keeping and data transmission schedule required by MDRC was also explained. The trainer then observed the site staff using the system with actual sample members. Additional training sessions for new staff are conducted over the course of the data collection effort. In addition, telephone-based technical support is available to site staff administering the baseline data collection.

Conducting Interviews. The designated site staff person(s) handles all interview sessions. Individuals identified as potential sample members are directed to that staff person, who initiates the survey session at that time and answers any questions about the study that sample members might have. The staff person reads a script that provides sample members with assurances of confidentiality and discusses their rights as study participants. The staff person administers the eligibility screener, assists respondents in completing the BIF, and provides guidance about completing the SAQ.

Supervision. Site staff are supervised under the normal supervisory system established at the site. In addition, one staff member at each site reports to project personnel at MDRC once each week. At each contact, the number of interviews conducted is compared with the number transmitted electronically. Any anomalies in the weekly intake numbers are discussed, as are any problems with completed interviews, software, or hardware.

B3. Maximizing Response Rates

The goal is to administer the baseline interview to all sample members in each site. Procedures for obtaining the maximum degree of cooperation include:

  • Conveying the purposes of the study and the baseline data collection to respondents so they thoroughly understand why the data are being collected and perceive that cooperating is worthwhile;

  • Providing a toll-free number for respondents to use to ask questions of MDRC about the survey;

  • Training site staff to be encouraging and supportive, and to provide assistance to respondents as needed; and

  • Training interviewers to maintain one-on-one personal rapport with respondents.

In addition to the above procedures, the privacy that the SAQ process affords respondents during the administration of sensitive questions has been found to increase response rates.

B4. Pre-testing

Most of the questions proposed for this instrument are either identical to questions used in prior MDRC evaluations or are similar, if not identical, to questions used in previous national surveys or major evaluations. Consequently, many of the items have been thoroughly tested on larger samples.

The baseline instruments went through several reviews by MDRC staff, expert consultants, and staff at ACF. Revisions were also made on the basis of cognitive testing that assessed the comprehensibility of the draft survey instruments with a very small sample of low-income married couples in Washington, DC. Dr. Lina Guzman and colleagues at Child Trends analyzed pre-test results and recommended appropriate revisions to the instruments as well as revisions to administration procedures.


The pretests were also undertaken with the goal of improving the quality of the data the instruments would yield, and thus great care was taken in gathering information about question wording. Following each pretest, respondents were debriefed and asked about question clarity and about any problems or confusion that arose. Research personnel assisting with baseline administration were also debriefed about problems they encountered and about their recommendations for improving the instruments. The pretests resulted in some changes to the draft instruments, including revised skip patterns and improved or simplified question wording. Instrument pretests confirmed that the burden estimates were in line with actual completion times.


B5. Consultants on Statistical Aspects of the Design

The MDRC research team has extensive expertise and experience. They also drew on the considerable expertise of others within MDRC, specifically Charles Michalopoulos and Howard Bloom, as well as Larry Orr of Abt Associates.
