
Building Bridges and Bonds (B3)


OMB Information Collection Request

New Collection

OMB: 0970-0485




Supporting Statement

Part B

March 2016


Submitted By:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Aleta Meyer

Anna Solmeyer


B1. Respondent Universe and Sampling Methods


Sampling

The B3 impact study will use a randomized controlled trial design to provide rigorous evidence on the impacts of a parenting intervention in three sites, an employment intervention in three other sites, and an engagement intervention in the same three sites where the parenting intervention is being tested. Across the three sites testing the parenting intervention, the study will attempt to enroll approximately 1,500 fathers; across the three sites testing the employment intervention, approximately 1,200 fathers.


Within each site, the B3 study will enroll fathers who come to a responsible fatherhood program seeking services, and it may also enroll some fathers who are already receiving business-as-usual services from the fatherhood program when the B3 study begins. Program staff will identify fathers who are eligible for the study by asking a set of screening questions. A staff member will then explain the study and obtain informed consent. Fathers will be enrolled in the study from the time of program launch (estimated to be August 2016, following OMB approval of this request) for a period of approximately 18 to 24 months. We anticipate that enrolling all eligible fathers who provide consent over the enrollment period will meet our sample size target of 2,700 fathers.


Statistical power

Fathers will be randomly assigned either to a program group that receives enhanced program services (parenting or employment, depending on the site) or to a control group that is not eligible for the enhanced services. Fathers in both the program and control groups can receive the business-as-usual services offered by the fatherhood program. In parenting sites, fathers in the program group will be further randomly assigned, as part of an engagement sub-study, to receive either enhanced or standard engagement services.


The estimation of program impacts will compare outcomes for the program and control groups. To achieve sufficient statistical power, our analyses will pool across the three sites testing the parenting intervention and across the three sites testing the employment intervention. The sample sizes and minimum detectable effect sizes described below are based on these pooled samples. The sample for the engagement sub-study will likely be too small to generate statistically significant impacts, so that analysis will be exploratory; statistical power for the engagement study is therefore not discussed in this section.


Although families within a site may be more similar to one another than to families in other sites, this would not affect the statistical power of the pooled estimates presented below. That is because individuals will be randomly assigned to the program or control group within a site. Previous studies such as MIHOPE (0970-0402), Supporting Healthy Marriage (0970-0339), and PACT (0970-0403) have used a similar approach.
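To make the pooled within-site design concrete, the sketch below shows one standard way such a pooled impact could be estimated: a regression with site fixed effects and a baseline covariate. This is a minimal illustration on synthetic data, not the study's actual analysis specification; all variable names and values are our assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the pooled analysis file; names and values are
# illustrative only.
rng = np.random.default_rng(0)
n = 1500
df = pd.DataFrame({
    "site": rng.integers(0, 3, n),        # three parenting sites
    "treatment": rng.integers(0, 2, n),   # randomized within each site
    "baseline_outcome": rng.normal(0, 1, n),
})
df["outcome"] = (0.11 * df["treatment"] + 0.55 * df["baseline_outcome"]
                 + 0.20 * df["site"] + rng.normal(0, 0.8, n))

# Site fixed effects absorb between-site differences; because assignment
# is randomized within site, the coefficient on "treatment" is the pooled
# impact estimate and no cross-site cluster adjustment is needed.
model = smf.ols("outcome ~ treatment + C(site) + baseline_outcome", data=df).fit()
print(model.params["treatment"])
```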


Tables B.1 and B.2 show the "minimum detectable effect" (MDE) of this sampling plan for the analysis of program impacts at several sample sizes: the full sample of study participants pooled across the B3 parenting or employment sites, and a range of possible subgroup sample sizes within those pooled samples. A minimum detectable effect is the smallest true effect that is likely to generate statistically significant estimated effects. For purposes of the design, calculations were performed to find the smallest effects that would generate statistically significant findings in 80 percent of studies with a similar design, using two-tailed t-tests with a 10 percent significance level. All results are presented as effect sizes, that is, in terms of standard deviations of the outcome being examined. Results are presented both for administrative data, which would be available for all study participants, and for survey data, which are assumed to be available for 80 percent of study participants.
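The Supporting Statement does not display the formula behind these calculations. The sketch below uses the standard MDE approximation for an individually randomized design; under the assumptions stated in the table notes (even program/control split, 80 percent power, two-tailed test at the 10 percent level, baseline covariates explaining 30 percent of outcome variance), it reproduces the values in Tables B.1 and B.2.

```python
from scipy.stats import norm

def mde_effect_size(n, r2=0.30, alpha=0.10, power=0.80, split=0.5):
    """Minimum detectable effect in standard-deviation units for an
    individually randomized design with an even program/control split."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # approx. 2.49
    return multiplier * ((1 - r2) / (split * (1 - split) * n)) ** 0.5

# Parenting sites: administrative data (full sample) and survey data
# (80 percent response).
print(round(mde_effect_size(1500), 2))        # 0.11
print(round(mde_effect_size(0.8 * 1500), 2))  # 0.12
```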



Table B.1

Minimum Detectable Effects of Proposed Sampling Plan in Parenting Sites, by Data Source


Parenting sites                           Administrative data    Survey data

Full sample (n = 1,500)                          0.11               0.12

Percent of full sample in subgroup
  75 percent (n = 1,125)                         0.12               0.14
  66 percent (n = 1,000)                         0.13               0.15
  50 percent (n = 750)                           0.15               0.17
  33 percent (n = 500)                           0.19               0.21

Notes: Sample assumed to be split evenly across research conditions. Administrative data are assumed to be available for the full sample. Survey data are assumed to be available for 80% of the full sample. Minimum Detectable Effects are expressed in terms of standard deviations of outcomes. Results are the smallest true impact that would generate a statistically significant impact estimate in 80 percent of studies with a similar design using two-tailed t-tests with a 10 percent significance level. No adjustment for multiple comparisons is assumed. Baseline data are assumed to explain 30 percent of variation in outcomes across father study participants.


Statistical Power in Parenting Sites. As Table B.1 indicates, the MDEs for the sample pooled across the parenting sites provide reasonable statistical power for evaluating the intervention of interest. The parenting intervention being tested in B3 has not been experimentally evaluated, so there is no benchmark for the expected effect size, but other parenting interventions such as Triple P have yielded effect sizes in the range of 0.15 to 0.40 standard deviations for mothers in randomized controlled trials (Nowak and Heinrichs 2008). There are two reasons why we plan to have the power to detect somewhat smaller effect sizes for the B3 parenting test. First, prior parenting studies have yielded smaller effect sizes for fathers than for mothers. Second, the control group in B3 will receive some services, meaning the treatment contrast will not be as strong as in studies with a no-services control group.


In sites testing a parenting intervention, the minimum detectable effect for the pooled sample would be 0.11 standard deviations for administrative records and 0.12 for survey-based outcomes. For example, using administrative data to estimate program impacts, if 50 percent of fathers in the control group were paying child support, this design would have an 80 percent chance of finding a statistically significant impact if the true impact is an increase of 5.5 percentage points (from a 50.0 percent payment rate in the control group to a 55.5 percent payment rate in the program group). Likewise, using survey data to estimate program impacts, if 50 percent of fathers in the control group reported seeing their child in the prior month in the follow-up survey, this design would have an 80 percent chance of finding a statistically significant impact if the true impact is an increase of 6 percentage points (from 50 percent of the control group to 56 percent of the program group).
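These percentage-point translations follow from the standard deviation of a binary outcome; the check below is our arithmetic, not shown in the source.

```python
def sd_units_to_percentage_points(mde_sd, control_rate):
    """Convert an effect size in SD units into percentage points for a
    binary outcome, using the control-group standard deviation."""
    return mde_sd * (control_rate * (1 - control_rate)) ** 0.5

print(sd_units_to_percentage_points(0.11, 0.50))  # 0.055 -> 5.5 points
print(sd_units_to_percentage_points(0.12, 0.50))  # 0.060 -> 6.0 points
```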


Depending upon the size of the subgroups being examined, it may be more difficult to find statistically significant impacts for particular subgroups. The MDEs for subgroups of various sizes range from about 0.12 to 0.21 standard deviations, depending upon the data source and subgroup size when pooled across all of the B3 parenting sites.


Table B.2

Minimum Detectable Effects of Proposed Sampling Plan in Employment Sites, by Data Source


Employment sites                          Administrative data    Survey data

Full sample (n = 1,200)                          0.12               0.13

Percent of full sample in subgroup
  75 percent (n = 900)                           0.14               0.16
  66 percent (n = 792)                           0.15               0.17
  50 percent (n = 600)                           0.17               0.19
  33 percent (n = 396)                           0.21               0.23

Notes: Sample assumed to be split evenly across research conditions. Administrative data are assumed to be available for the full sample. Survey data are assumed to be available for 80% of the full sample. Minimum Detectable Effects are expressed in terms of standard deviations of outcomes. Results are the smallest true impact that would generate a statistically significant impact estimate in 80 percent of studies with a similar design using two-tailed t-tests with a 10 percent significance level. No adjustment for multiple comparisons is assumed. Baseline data are assumed to explain 30 percent of variation in outcomes across father study participants.


Statistical Power in Employment Sites. As Table B.2 indicates, the MDEs for the sample pooled across the employment sites provide reasonable statistical power. The employment enhancement we will be testing is innovative, and we do not yet have a strong evidence base for how it will affect employment and related outcomes among fathers served by fatherhood programs. There is, however, an evidence base for similar programs aimed at reducing recidivism among a prisoner re-entry population, many of whom are fathers. These cognitive-behaviorally informed interventions have reduced recidivism among ex-offenders by around 0.20 standard deviations (Lipsey et al. 2007). In B3, we are seeking the ability to detect smaller effects because the control group will receive some program services and because we aim to detect effects on a broader set of outcomes than recidivism over a relatively short-term, 6-month follow-up period.


In sites testing an employment intervention, the minimum detectable effect for the pooled sample would be 0.12 standard deviations for administrative records and 0.13 for survey-based outcomes. For example, using administrative data to estimate program impacts, if 50 percent of fathers in the control group were employed, this design would have an 80 percent chance of finding a statistically significant impact if the true impact is an increase of 6 percentage points (from a 50 percent employment rate in the control group to a 56 percent employment rate in the program group). Likewise, using survey data to estimate program impacts, if 50 percent of fathers in the control group reported ever working at a formal or informal job over the follow-up, this design would have an 80 percent chance of finding a statistically significant impact if the true impact is an increase of 6.5 percentage points (from 50 percent of the control group to 56.5 percent of the program group).


Depending upon the size of the subgroups being examined, it may be more difficult to find statistically significant impacts for particular subgroups. The MDEs for subgroups of various sizes range from about 0.14 to 0.23 standard deviations, depending upon the data source and subgroup size when pooled across all of the B3 employment sites.



B2. Procedures for Collection of Information


This section describes the procedures for data collected from fathers at study enrollment, data collected from fathers or mothers after study enrollment, and data collected from staff members.


Data collected from fathers at study enrollment

B3 study enrollment will build upon each fatherhood program’s existing data collection processes. This includes their use of the nFORM management information system (MIS) and surveys that are being developed for federal Responsible Fatherhood grantees by the Fatherhood and Marriage Local Evaluation and Cross-site (FaMLE Cross-site) Project (0970-0460). The following describes the procedures for data collection at study enrollment and how study enrollment will be combined with program operations.

Before recruiting fathers into the study, the fatherhood program will follow its existing procedures to collect the information needed to determine whether a father is eligible for the program's services. If he is determined eligible, the staff member will then determine eligibility for the B3 study.1 In sites offering the parenting intervention, eligibility will be determined using a screener provided by the study team (Attachment 1 - Screening questions for parenting intervention). While the screener will be done on paper, a few key pieces of information from it that summarize eligibility will be stored in nFORM (Attachment 4: B3-specific eligibility data). In sites offering the enhanced employment services, eligibility will be determined using information the site has already collected as part of its existing data collection procedures or a screener provided by the study team (Attachment 2 - Screening questions for employment intervention). The study team's screener will be administered on paper, since only a summary score and an indicator of moderate or high risk need to be recorded electronically. The staff person will ask the father the screener questions, calculate the total score, and enter the score into nFORM. In addition to entering this summary information, staff members will be asked to mark in nFORM whether three other eligibility criteria have been met (Attachment 4: B3-specific eligibility data).

Regardless of whether fathers are eligible for the B3 study, the staff person will then enter basic information about the father into an Application Form in the nFORM system. Much of this information would be collected in nFORM even in the absence of the study, including the father's name, date of birth, and contact information, as well as contact information for one or more people who would know how to reach the father if the program has difficulty contacting him.2 For fathers who are eligible for the B3 study in sites testing the parenting intervention, this will also include entering some key pieces of information from the screener about the focal child and co-parent that will be helpful for program operations (Attachment 5: B3-specific enrollment data). For all B3-eligible fathers, staff members will try to obtain contact information for three people who would know how to contact the father if the program or survey firm cannot find him (Attachment 5: B3-specific enrollment data).


For study enrollment, the program staff will then conduct the following procedures:

  • Introduce the study and the enhanced intervention being tested at that site. For the parenting intervention, there will be attractive introductory materials about the study (Appendix F).

  • Provide a commitment to privacy, explain random assignment, and answer questions just prior to going through the consent form with the father, to ensure he understands the implications of the study and has an opportunity to ask any questions he may have.

  • Attempt to obtain informed consent for the father to participate in B3. Informed consent forms (Appendix A) will reference the baseline and follow-up data collections and will allow the study team to collect state administrative data on the father. Based on prior studies such as the Evaluation of the Center for Employment Opportunities (CEO) Transitional Jobs Program (part of the Enhanced Services for the Hard‑to‑Employ Demonstration and Evaluation (0970-0251)), 95 percent of fathers are assumed to provide consent to participate in the study. Thus, the evaluation expects to describe the study and attempt to obtain consent from 2,842 fathers in order to enroll 2,700 fathers. A different version of the form will be used to gain assent from fathers who are under 18 (Appendix B). The staff member will have a consent/assent form on a tablet or computer with a hard copy version for the father to read while the staff person reviews it with him. Staff members will be trained to explain the issues in the consent/assent form and to be able to answer questions. If the father would like to participate, he will electronically sign the consent form. The father will also be given a paper copy of the consent form to take home with him.


  • If an applicant is a minor, it will be necessary to obtain consent from his parent as well, unless the state's emancipated minor laws make this unnecessary. This consent will be obtained by telephone. The study team anticipates that one-third of program applicants in the sites testing the parenting intervention will be minors. (Applicants under 18 are not eligible for the B3 employment intervention.) The script for obtaining consent from parents of minors is included as Attachment 3 - Consent materials for parents of fathers under 18. The script will be easily accessible in nFORM so that staff members can read it while speaking with the father's parent and check a box on the same page indicating whether or not consent was given.

  • Fathers assigned to the parenting intervention group in one or two sites will be asked to grant consent to be video recorded during parenting workshops (Attachment 21), and the custodial parent of the father's child (in some cases the father himself, in other cases a co-parent) will be asked to grant consent for the child to be video recorded during parenting workshops (Attachment 22).

  • A staff person will load the 15-minute Applicant Characteristics survey (part of the FaMLE Cross-site data collection) for the father to complete on a tablet or computer. All fathers will complete this survey regardless of whether or not they are eligible for or consented to the B3 study.

  • If the father consents to participate in B3, the staff person will then load either the baseline survey for sites testing the parenting intervention or the baseline survey for sites testing the employment intervention (Attachment 8 or 9) for the father to complete on the same tablet or computer. Both baseline surveys will take 30 minutes to complete. Like the Applicant Characteristics survey, the B3 baseline surveys will have an audio component (audio computer-assisted self-interview, or ACASI) that reads the questions and response options to the respondent, so that literacy issues do not prevent the respondent from completing the survey independently. The participant can choose to use the audio, read the questions and responses on screen, or both, whichever is most comfortable. After the baseline survey has been completed, the staff person will hand the father a physical gift card as a sign of appreciation for participating in the survey. The father may instead opt to receive an electronic incentive by e-mail if he prefers.

  • Enter a few additional pieces of information needed for the study into the nFORM MIS for eligible fathers. This will include two questions about the type of cell phone and text messaging plan that the father has and collection of relevant ID numbers needed to access administrative records (Attachment 5 – B3-specific enrollment data).

  • Indicate in the MIS that the father is ready to be randomly assigned to the program or control group. If the father does not consent to participate in B3, random assignment will still be completed to determine if the enhanced program services will be provided. This ensures that participation in the evaluation does not affect the father’s ability to receive the B3 enhanced services. The result of random assignment will appear in the MIS.

  • Inform the father whether he was assigned to receive the new services available to the program group or the standard services available to the control group.

Data collected from fathers or mothers after study enrollment

Several data sources will be collected after study enrollment to inform the impact and process research questions. The data collection procedures for each are described below.

Attachment 10 – 6-month follow-up survey for sites testing the parenting intervention,

Attachment 11 – 6-month follow-up survey for sites testing the employment intervention. The 6-month follow-up survey will be conducted using a mixed-mode methodology that combines computer-assisted telephone interviews (CATI) and computer-assisted in-person interviews (CAPI). In a mixed-mode approach, the survey firm first attempts to survey each respondent by telephone; field interviewers then attempt in-person interviews with respondents who cannot be reached by telephone. Study participants can refuse to complete the survey, or refuse to answer any question on it, and will not be penalized in any way.


About 6 months following random assignment, fathers enrolled in the B3 study will receive a letter from Abt Associates reminding them of their participation in the B3 study and informing them that an Abt representative will soon call to interview them over the phone. Abt Associates' interviewers will call the contact numbers on file for B3 study participants and administer the 40-minute follow-up survey to all willing participants. In all cases, the interviewers will explain the purpose of the interview and inform respondents that they will receive a small monetary incentive for participating in the 6-month survey. Each interviewer will be prepared to answer sample members' questions about the study, using a Frequently Asked Questions resource and additional guidance embedded within the CATI or CAPI scripted interviews. If attempts to reach a respondent by phone are unsuccessful, field interviewers will attempt to interview the respondent in person. Following the survey interview, respondents will be given a choice of receiving their incentive by e-mail, by mail (physical gift card), or by money order (for interviews done in person).


To ensure that the interviews are handled professionally and that the data are of high quality, considerable effort will be put into interviewer selection, training and supervision.


Attachment 16 - Participant focus groups. Focus groups of approximately 8 program group members per group will be convened across all B3 sites during site visits approximately 6 and 18 months after program launch. The research team will consult with B3 program administrators and staff to appropriately target who should be invited to the focus groups and to plan the logistics to accommodate attendance as much as possible. Focus groups will complement the information collected in the mobile device surveys, providing a more in-depth and qualitative understanding of participants' experiences. Gift cards will be handed out at the conclusion of the focus groups as a "thank you" to fathers for participating.

Attachment 17 - Mother Focus Groups. At each parenting intervention site, twenty mothers whose children are engaged in the parenting intervention with their fathers will be asked to participate in focus groups. The research team will consult with B3 program administrators and staff to appropriately target who should be invited to the focus groups and to plan the logistics to accommodate attendance as much as possible. Since mothers are not the primary participants in the B3 study, there is only one data collection effort focused on mothers. Approximately the first 5 minutes of each focus group will be dedicated to obtaining consent from mothers (Appendix C – Consent forms for focus groups with mothers). Focus groups were chosen as the method for data collection because they are an efficient way to obtain an in-depth and qualitative understanding of mothers' perceptions of fathers and the program. Gift cards will be handed out at the conclusion of the focus groups as a "thank you" to mothers for participating.


Attachment 18 - Mobile device employment survey, Attachment 19 - Mobile device parenting and co-parenting survey. B3 will use mobile phones to collect data from program and control group members who have working cell phone numbers. Brief surveys will be administered to fathers by text message an average of 3 times after fathers enroll in the study in employment sites and an average of 3.5 times in parenting sites (a weighted average of 3.28 across the full sample: (1,500 × 3.5 + 1,200 × 3) ÷ 2,700 ≈ 3.28). Delivering these brief surveys by mobile device allows the team to get data from sample members who stopped participating in their assigned components as well as from those who keep participating. The mobile device survey will be used to collect real-time responses about sample members' experiences, focusing particularly on questions for which we are concerned about recall bias. At the beginning of each survey, the participant chooses how to receive the incentive, either by email or by text message. If he elects a text message incentive, within a few hours of completing the last question of the module he will receive a text message containing a weblink to redeem the incentive. This link can be accessed on web-enabled phones or manually copied down to be entered into a computer. If he elects an email incentive, he will be sent an email within 2 days after completing the module; the email will contain the weblink to redeem the incentive.


In addition to these data collection efforts, fathers and their children in one or two parenting sites will be video recorded during the father/child play sessions that are a regular part of the parenting intervention. This data collection is not described here, because it is passive data collection that does not impose burden.


Data collected from staff members

A few data sources will be collected from staff members to inform the process research questions. The data collection procedures for each are described below.

Attachment 6 - B3 tracking of attendance in services for program group members. OFA Responsible Fatherhood Grantees are required to enter information about fathers' program participation into the nFORM MIS. The study team will ask staff members to record a few extra data fields, as appropriate, to measure participation in the enhanced services offered through the study. Staff members will do this by checking boxes and/or selecting from drop-down menus on the same screens in the MIS where they already record other information about fathers' participation in services. We estimate that 72 staff members across the 6 B3 sites will be responsible for entering this information into the nFORM MIS.


Attachment 12 - Staff and management semi-structured interviews for sites testing parenting intervention, Attachment 13 - Staff and management semi-structured interviews for sites testing employment intervention. Staff working with the program and control groups, as well as B3 program administrators, will be asked to participate in semi-structured interviews over the course of two site visits scheduled to occur approximately 6 and 18 months after study launch. We plan to spend a maximum of 90 minutes with each staff person during each site visit; the same staff may not be interviewed during both visits. B3 site administrators and the designated research liaison will help the research team plan these visits. Teams of two will conduct two- to three-day visits, and B3 staff and management will be interviewed individually or in small groups of no more than three individuals, depending on their roles and the questions of interest. Semi-structured interviews will complement the information collected in the staff surveys, providing a more in-depth and qualitative understanding of staff members' roles, the challenges and solutions encountered in program operations, and related topics.

Attachment 14 - Staff survey for sites testing parenting intervention, Attachment 15 - Staff survey for sites testing employment intervention. Staff working with the program and control groups will be asked to complete a web-based survey in 2017. The web-based format allows for efficient administration by using skip logic to move quickly to the next appropriate question depending upon a respondent's previous answer. This survey will take approximately 40 minutes to complete. The research liaison at each site will help the research team at MDRC with administration.


Attachment 20 - Post-session debrief for sites testing parenting intervention. The post-session debrief notes will be completed by staff working with the program group after each father-child play session. These notes provide information about fidelity of program delivery. The notes will be completed on a tablet so they can be transmitted efficiently to the program developer for assessment purposes.


B3. Methods to Maximize Response Rates and Deal with Nonresponse


Expected Response Rates


The expected response rates will vary by instrument. For the screening materials (Attachments 1 & 2), the consent materials for parents of fathers under 18 (Attachment 3), the eligibility criteria (Attachment 4), and the baseline surveys (Attachments 8 & 9), we expect nearly 100 percent participation, since those instruments are required before the father can progress in the enrollment process. We will also ask staff members to track fathers' attendance in B3-specific services (Attachment 6). We expect response rates to be close to 100 percent here as well, given that staff members are already required to enter information about fathers' participation in services in nFORM as a condition of their federal funding and there is minimal burden involved with the additional information being requested by B3. For Attachment 5 – B3-specific enrollment data, we expect very high response rates for most of the data elements. We expect nearly 100 percent response rates for data on the focal child and co-parent, since much of this information was already collected in the screening instrument and since staff members will need it to involve focal children and co-parents in the parenting intervention. The Enhanced Transitional Jobs Demonstration (0970-0413) was able to obtain Social Security numbers and at least one criminal justice ID from over 99 percent of sample members in sites targeting individuals recently released from prison. We expect similar results for this study, which will have a similar study population. We also expect nearly 100 percent of fathers to provide some contact information for people who could help locate them, since this is a standard part of program data collection and staff members are heavily invested in strategies to keep fathers engaged in services.

For the B3 site that is not an OFA Responsible Fatherhood Grantee, we will ask fathers to fill out the FaMLE Cross-site Applicant Characteristics questionnaire and staff members to enter information about the father into the nFORM MIS, even though this site is not part of the FaMLE Cross-site study (these instruments are part of the FaMLE Cross-site Project OMB package (0970-0460)). We expect nearly 100 percent participation in the Applicant Characteristics questionnaire, since it must be filled out before the father can progress in the enrollment process. We also expect a non-grantee site to enter fairly complete information about the father and his participation into nFORM, since part of the site payment will be designated to cover staff time entering data into nFORM and since we will be monitoring nFORM data quality on an ongoing basis and providing regular feedback. We are following a similar approach in the Mother and Infant Home Visiting Program Evaluation (MIHOPE), where 64 of 88 sites have completion rates over 85 percent.

For the 6-month follow-up surveys (Attachments 10 & 11), we expect a response rate between 80 and 90 percent. A response rate of 80 percent is commonly our goal for follow-up surveys, and it is typically achievable when using a mixed-mode approach that combines computer-assisted telephone interviews (CATI) and computer-assisted in-person interviews (CAPI), along with incentives for participation. Numerous MDRC studies with similar populations have achieved response rates of at least 80 percent. For example, the Work Advancement and Support Center demonstration achieved an 81 percent response rate for the 12-month follow-up survey for a sample that included ex-offenders (Miller et al., 2012). The Parents' Fair Share study, which included non-custodial parents, achieved a response rate of 78 percent (Miller & Knox, 2001). The Philadelphia Hard-to-Employ study (a transitional jobs program for TANF recipients) achieved a 79 percent response rate (Jacobs & Bloom, 2011). Several sites in the Employment Retention and Advancement evaluation achieved 80 percent response rates as well (Hendra et al., 2010).


Staff working with the program and control groups will be asked to complete a survey in 2017 (Attachments 14 & 15). Based on the response rates for the staff surveys in the Enhanced Transitional Jobs Demonstration, we expect around 85 percent of staff to complete the survey.

Staff members will also be asked to participate in semi-structured interviews approximately 6 and 18 months after program launch (Attachments 12 & 13). We expect response rates to be nearly 100 percent for these interviews, with the only nonrespondents being those staff members who are not available on the days the interviews are occurring.

The post-session debriefs (Attachment 20) were created by, and will be collected by, the model developer who will be monitoring program fidelity. Response rates are expected to be high among staff, since they will implement the intervention with the understanding that it is part of this study and since the questionnaires will be very quick to fill out at the end of each workshop.

Focus groups will be convened with fathers across all B3 sites (Attachment 16), and with mothers in sites testing the parenting intervention (Attachment 17). Ideally, each group would include approximately 8 individuals. As is usually the case with focus groups, we will recruit at least double the number of people for each focus group with the anticipation that half will not attend.

The data collected from mobile devices will be used to understand mechanisms for change and reasons for non-participation, rather than to measure impacts. For surveys using mobile data collection (Attachments 18 & 19), we expect about a 30 percent response rate for any given administration of a survey via mobile device. This estimate was derived from Qualtrics response rates for short message service (text message) surveys and adjusted for our low-income sample. Since we expect 80 percent of the study sample to have cell phones, this means that we expect about a quarter of the full study sample to respond to any given mobile survey. However, as with other modes of survey administration, we will deliver each survey to non-respondents more than once and expect the response rate to increase with each delivery. Taking into account the proportion who have phones and the proportion who would respond after multiple attempts to reach non-respondents, we expect a final response rate of about 40 to 50 percent for each module that we administer.
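As a rough check on these figures, the sketch below compounds the per-delivery response rate across repeated deliveries, assuming (a simplification of ours, not a claim from the source) that response chances on successive deliveries are independent.

```python
def expected_mobile_response(phone_rate=0.80, per_wave=0.30, deliveries=1):
    """Cumulative share of the full sample responding to a module when it
    is delivered several times, assuming independent response chances."""
    return phone_rate * (1 - (1 - per_wave) ** deliveries)

print(round(expected_mobile_response(deliveries=1), 2))  # 0.24, about a quarter
print(round(expected_mobile_response(deliveries=2), 2))  # 0.41
print(round(expected_mobile_response(deliveries=3), 2))  # 0.53
```

Under these assumptions, two to three deliveries per module are consistent with the expected final response rate of roughly 40 to 50 percent.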

For most of our data collection instruments, our goal is to achieve response rates of 80 percent or higher. Respondents are more likely to differ from nonrespondents when the response rate is low; with response rates of at least 80 percent, we have typically not found strong evidence of nonresponse bias. In addition, we aim to recruit enough sample members to have power to detect impacts in the range of 0.12 to 0.15 standard deviations on survey outcomes with survey response rates of 80 percent.


Dealing with Nonresponse


As described in more detail in the next section, all efforts will be made to obtain information on a high proportion of fathers at follow-up, including offering fathers incentives for completing the follow-up surveys. To assess the impact of survey nonresponse, an analysis will be conducted to determine whether the results from the 6-month follow-up surveys (Attachments 10 & 11) may be biased by non-response. In particular, two types of bias will be assessed: (1) whether estimated effects among survey respondents apply to the full study sample, and (2) whether program group respondents are similar to control group respondents. The first type of bias affects whether results from the study can be generalized to the wider group of families involved in the study, while the second assesses whether the impacts of the programs are being confounded with pre-existing differences between program group and control group respondents.


To assess non-response bias, several tests will be conducted.


  • The proportion of program group and control group respondents will be compared to make sure the response rate is not significantly higher for one research group.

  • A logistic regression will be conducted among respondents. The "left-hand-side" variable will be research group assignment (program group or control group), while the explanatory variables will include a range of baseline characteristics. An omnibus test such as a log-likelihood test will be used to test the hypothesis that the set of baseline characteristics is not significantly related to whether a respondent is in the program group. Failure to reject this null hypothesis will provide evidence that program group and control group respondents are similar. (A sketch of this omnibus test follows this list.)

  • Impacts from administrative records sources – which will be available for the full sample – will be compared for the full sample and for the sample of survey respondents to determine whether there are substantial differences between the two groups.

  • Baseline characteristics of respondents will be compared to baseline characteristics of non-respondents. This will be done using a logistic regression where the outcome variable is whether someone is a respondent and the explanatory variables are baseline characteristics. An omnibus test such as a log-likelihood test will be used to test the hypothesis that the set of baseline characteristics is not significantly related to response status. Failure to reject this null hypothesis will provide evidence that non-respondents and respondents are similar at baseline.
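A minimal sketch of the omnibus likelihood-ratio test used in the second and fourth checks appears below. The column names are hypothetical illustrations; the actual covariates will come from the B3 baseline instruments.

```python
import pandas as pd
import statsmodels.api as sm

BASELINE_VARS = ["age", "employed_at_baseline", "num_children"]  # illustrative

def omnibus_lr_test(df: pd.DataFrame, outcome: str):
    """Likelihood-ratio test of whether baseline characteristics jointly
    predict `outcome`: program-group membership among respondents, or
    response status in the full sample."""
    X = sm.add_constant(df[BASELINE_VARS])
    fit = sm.Logit(df[outcome], X).fit(disp=0)
    # llr / llr_pvalue compare the fitted model to an intercept-only model.
    return fit.llr, fit.llr_pvalue
```

A large p-value from this test is consistent with respondents resembling the comparison group on observed baseline characteristics.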


If any of these tests indicate that non-response could bias impact estimates, a standard technique such as multiple imputation or weighting by the inverse probability of response will be used to determine the sensitivity of impact estimates to non-response.
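If weighting is needed, a minimal inverse-probability-of-response sketch (reusing the illustrative BASELINE_VARS from the previous sketch) could look like the following; multiple imputation would be an alternative approach.

```python
import pandas as pd
import statsmodels.api as sm

BASELINE_VARS = ["age", "employed_at_baseline", "num_children"]  # illustrative

def response_weights(df: pd.DataFrame) -> pd.Series:
    """Weight respondents by the inverse of their estimated probability of
    responding, so the weighted respondent sample resembles the full
    randomized sample on baseline characteristics."""
    X = sm.add_constant(df[BASELINE_VARS])
    fit = sm.Logit(df["responded"], X).fit(disp=0)
    p_hat = pd.Series(fit.predict(X), index=df.index)
    return (1.0 / p_hat)[df["responded"] == 1]  # used in weighted impact models
```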

It is also possible that there could be high item nonresponse for certain questions. For example, it is common to see higher rates of missing data for earnings or income questions. To maximize the amount of usable income data, the 6-month follow-up survey for sites testing the B3 employment intervention uses several strategies, including: 1) priming the father to think about all of the sources of income he might be receiving, 2) providing probes that interviewers can use to elicit an answer from fathers who are hesitant to respond, and 3) asking fathers who do not report an exact income amount to report the category in which their income falls. Similar strategies will be used to maximize the amount of usable earnings data. In addition, we will be getting quarterly wage data from the National Directory of New Hires, so that we have information on employment and earnings even if rates of missingness for these items on the survey are high.

There could also be item nonresponse for some sensitive questions, such as those related to the fathers’ criminal justice involvement. In sites with high populations of ex-offenders, we will be getting criminal justice administrative records, in addition to asking about criminal involvement on the survey, so that we will have an additional source of data even if there are high rates of missingness on some of the criminal justice survey questions.

All staff members will be asked to fill out a survey (Attachments 14 & 15), while some staff members will be asked to document their post-session debrief (Attachment 20) or participate in semi-structured interviews (Attachments 12 & 13). Efforts will be made to maximize response rates on all of these data collection activities, as described in more detail in the next section. While these data collection efforts will be used for largely descriptive purposes, we will still note response rates, and will caveat our analysis as appropriate.

The data from focus groups (Attachments 16 & 17) will also be used largely for descriptive purposes. However, we will still aim to get good response rates by offering incentives for participation in focus groups and by taking steps to make it easier for participants to attend (for example, by scheduling the meetings at the right times and locations). We will note response rates and systematic observable differences between respondents and nonrespondents (based on baseline data), and will caveat our analysis as appropriate.


The data from mobile device surveys (Attachments 18 & 19) will be used to describe participants' and nonparticipants' experiences with the program. While we will aim to maximize response rates through a number of strategies described in the next section, we do not expect response rates higher than 40 to 50 percent for each module that we administer. We think these response rates will be adequate to provide useful information on participants' and nonparticipants' experiences with the program, but we will conduct nonresponse bias analysis to understand any observable differences between the fathers who do and do not respond. One variable we expect to differ between the two groups is the type of cell phone the father reports having at the time of random assignment.


Maximizing Response Rates


Minimizing sample attrition is of utmost importance to any longitudinal study. Many B3 fathers are likely to be highly mobile, creating a risk of attrition at the 6-month follow-up (Attachments 10 & 11). Several strategies will be adopted to mitigate this risk:

  • Maximizing the use of contact information collected by the program at the point of random assignment, including email addresses and alternate contact information for at least three other individuals whom the respondent identified as likely to know how to find him;

  • Mailing a welcome letter (Appendix G) to each enrolled participant, shortly after enrollment, that contains a magnet with a toll-free number that can be used to provide updated contact information to Abt SRBI;

  • Using advance letters and email contacts (Appendix G);

  • Conveying the purposes of the survey to respondents so they thoroughly understand them and perceive that cooperating is worthwhile;

  • Training site staff to be encouraging and supportive, and to provide assistance to participants as needed;

  • Hiring interviewers who have necessary skills for encouraging cooperation;

  • Training in-person interviewers in the skills needed to locate hard to find respondents;

  • Training interviewers thoroughly in conversion and avoidance of refusals. Interviewers will also be trained to distinguish "soft" refusals from "hard" ones. Soft refusals often occur when the sample member has been reached at an inopportune time. In these cases, it is important to back off gracefully and to establish a convenient time to call or come back rather than to persist at the moment. Hard refusals do occur and must also be accepted gracefully by the interviewer.

  • Using a mixed-mode methodology that combines computer-assisted telephone interviews (CATI) and computer-assisted in-person interviews (CAPI). In a mixed-mode approach, the survey firm first attempts to survey each respondent by telephone. Once this relatively low-cost approach has been thoroughly tried, field interviewers attempt in-person interviews with people who could not be contacted by telephone.

  • Timing cases from the telephone center to the field so that each case will not remain in the telephone center for more than 30 days. Generally, the telephone center will have thoroughly worked all available phone numbers for the follow-up sample within 21-28 days. Therefore, moving non-completed cases to the field team quickly helps maximize the response rate while ensuring that interviews are completed in a reasonable timeframe.

  • Offering appropriate, modest incentives to participants for participating in the follow-up survey effort (See section A.9 for additional information);

  • Coordinating with state and local departments of correction to gain access to incarcerated respondents so that they can also be interviewed for the survey.


Maximizing response rates for the data collection efforts targeted towards staff members (Attachments 12-15) is also important. When a site enters the study, the research team will explain the importance of the data collection efforts for advancing the field of fatherhood, particularly the enhancements being tested. In addition:

  • For web-based staff surveys, we will maximize response rates primarily through good design and monitoring of completion reports. It is important to 1) keep the survey invitation attractive, short, and easy to read, 2) make accessing the survey clear and easy, and 3) communicate to respondents that the completed survey is saved and thank them for completing it. Research staff will closely monitor data completion reports for the survey. If a site's surveys are not completed within one week of the targeted time-frame, the site liaison will follow up with the site point of contact to remind staff that survey responses are due.

  • For semi-structured interviews, it is particularly important to plan the visits well in advance with the assistance of program management and schedule interviews at the most convenient times for staff.

  • For post-session debriefs, it will be important that the forms be very quick and easy to fill out and that the expectation be set that they will be completed at the end of each workshop.


Efforts will also be made to maximize response rates for focus groups (Attachments 16 & 17). The research team will work with staff at each site to plan focus groups to be accommodating to fathers and mothers and to maximize attendance; this includes scheduling the meeting at the right time and location.


Several strategies will be used to maximize response rates for the mobile device surveys (Attachments 18 & 19). Surveys will be kept short and nicely formatted to make them easy for fathers to take. The informed consent/assent forms (Appendices A and B) mention the mobile device surveys so that fathers have advance notice that they will be contacted about them. We will deliver each survey to non-respondents more than once and expect the response rate to increase with each delivery. In addition, administration of each module will be timed for the day of the week and time of day that maximize response rates. Finally, modest incentives will be given to respondents of mobile device surveys.


B4. Tests of Procedures or Methods to be Undertaken


The study team pretested both the parenting and employment baseline surveys (Attachments 8-9) with 9 respondents each to identify revisions to the procedures and instruments for the baseline data collection. We identified 9 respondents in a fatherhood program in Los Angeles (including both English- and Spanish-speaking participants) to pretest the parenting survey. The respondents were a mix of resident and nonresident fathers, all with young children. The parenting survey is being translated into Spanish; once we have the final instruments approved, Spanish versions will be finalized and sent to OMB as a nonsubstantive change.


The team also identified 9 respondents in an employment program in New York to pretest the employment survey. The respondents were all English-speaking, since the employment survey will only be offered in English, and all were fathers with criminal justice backgrounds. None of these respondents will be in our research sample.


When pretesting the baseline surveys, the interviewers began by introducing the study, assuring privacy to the extent permitted by law, and reiterating that participation in the survey is voluntary. The interviewers also asked for permission to audio-record the interview, and they asked the questions exactly as worded. The length of each interview was monitored, and participants and interviewers were debriefed to collect feedback on the instruments. The telephone interviews were recorded so that survey management staff could review them and obtain accurate estimates of interview length for OMB burden estimates. Pretesting has not yet been completed for the 6-month follow-up data collection instruments, but when it is, a similar protocol will be followed.


The employment mobile device survey (Attachment 18) was pretested with 5 respondents in a fatherhood program in Cleveland, Ohio. The respondents had a variety of mobile devices on which to test the survey. When pretesting, the interviewer began by introducing the study, assuring privacy, and reiterating that participation was voluntary. Respondents answered questions as they were delivered to their mobile devices. After each module was delivered, the interviewer asked the respondent specific questions about their reactions to the questions and their experience taking the survey on their mobile device. The interviewer took notes throughout the conversation.


The staff survey was pretested with 8 staff from a fatherhood program in Cincinnati, Ohio. The interviewer walked the staff through each section of the survey, asked them to give their reactions to the questions, and probed on specific questions. Staff were also encouraged to ask questions and to give suggestions for improving the survey. The interviewer took notes throughout the conversation.


Any resulting modifications to the instruments that have not yet been pretested will be submitted as nonsubstantive changes for OMB approval.


B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The following is a list of individuals involved in the design of the B3 project, the plans for data collection, and the analysis.


Rekha Balu, Senior Associate, MDRC

Annie Bickerton, Research Analyst, MDRC

Dan Bloom, Director, MDRC

Emily Brennan, Research Assistant, MDRC

Rachel Dash, Research Assistant, MDRC

Katie Egan, Research Assistant, MDRC

Sam Elkin, Subcontractor, MEF Associates

Emily Ellis, Subcontractor, MEF Associates

Mike Fishman, Subcontractor, MEF Associates

Kristen Harknett, Consultant

JoAnn Hsueh, Senior Associate, MDRC

Rebecca Hughes, Operations Associate, MDRC

Dina Israel, Senior Associate, MDRC

Ginger Knox, Director, MDRC

Erika Lundquist, Research Associate, MDRC

Patrizia Mancini, Research Analyst, MDRC

Michelle Manno, Research Associate, MDRC

Aleta Meyer, Project Officer, OPRE

Carly Morrison, Subcontractor, MEF Associates

Doug Phillips, Research Analyst, MDRC

Cindy Redcross, Senior Associate, MDRC

Bright Sarfo, Subcontractor, MEF Associates

Anna Solmeyer, Project Officer, OPRE

Samantha Wulfsohn, Senior Associate, MDRC


References


Hendra, Richard, Keri-Nicole Dillman, Gayle Hamilton, Erika Lundquist, Karin Martinson, and Melissa Wavelet. 2010. How Effective Are Different Approaches Aiming to Increase Employment Retention and Advancement? Final Impacts for Twelve Models. New York: MDRC.


Jacobs, Erin, and Dan Bloom. 2011. Alternative Employment Strategies for Hard-to-Employ TANF Recipients: Final Results from a Test of Transitional Jobs and Preemployment Services in Philadelphia. OPRE Report 2011-19. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.


Lipsey, Mark W., Nana A. Landenberger, and Sandra J. Wilson. 2007. "Effects of Cognitive-Behavioral Programs for Criminal Offenders." Campbell Systematic Reviews.


Miller, Cynthia, Mark van Dok, Betsy L. Tessler, and Alexandra Pennington. 2012. Strategies to Help Low-Wage Workers Advance: Implementation and Final Impacts of the Work Advancement and Support Center (WASC) Demonstration. New York: MDRC.


Miller, Cynthia, and Virginia Knox. 2001. The Challenge of Helping Low-Income Fathers Support Their Children: Final Lessons from Parents' Fair Share. New York: MDRC.


Nowak, Christoph, and Nina Heinrichs. 2008. "A Comprehensive Meta-Analysis of Triple P-Positive Parenting Program Using Hierarchical Linear Modeling: Effectiveness and Moderating Variables." Clinical Child and Family Psychology Review 11(3): 114–144.

1 There may be sites in which fathers are recruited for the study after they are already enrolled in program services. In these cases, study enrollment may happen weeks or months after a father has already entered the program.

2 For fathers being recruited into the study after they have already been enrolled in the program, this information will not have to be re-entered, though staff will be instructed to verify that the contact information is still accurate.


