Building Bridges and Bonds (B3)

OMB: 0970-0485



OMB Information Collection Request

New Collection

Supporting Statement

Part A

March 2016



Submitted By:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:


Aleta Meyer

Anna Solmeyer


A1. Necessity for the Data Collection

The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval to collect information from fathers enrolled in responsible fatherhood programs, from co-parents of these fathers’ children, and from program staff as part of the Building Bridges and Bonds (B3) study.


The Office of Planning, Research and Evaluation (OPRE) launched the B3 study in 2014. Using a mix of research methods, this five-year study is partnering with six programs that serve low-income fathers to understand the effectiveness of strategies used to enhance fathers’ participation in fatherhood programs, to increase fathers’ stable employment and improve their economic circumstances, to encourage fathers’ consistent and positive engagement with their children, and to improve constructive cooperation with co-parents.


There is substantial evidence that fathers’ parenting support – both financial and emotional – and family stability are critical foundations for child well-being. Declining wages for less educated men and high rates of family instability have together created a sense of urgency toward providing services that can increase fathers’ capacity to provide supportive contexts for their children. Nevertheless, it has been difficult to design employment and responsible parenting programs that produce substantial impacts in the lives of men and their children (Knox et al. 2011). B3 offers an important opportunity to advance the goals of fatherhood programs by building on recent advances in program design and implementation to rigorously test the effects of innovative program enhancements in the areas of parenting and employment.


In addition, fatherhood programs have found time and again that engaging fathers is a substantial challenge, suggesting that building a knowledge base about effective engagement strategies is fundamentally important to these programs’ future effectiveness. Thus, B3 seeks to advance the field by testing new evidence-informed approaches to bolster engagement and sustained participation of fathers in program services.


Since 2006 Congress has authorized dedicated funding for discretionary grants from the Office of Family Assistance (OFA) to programs to promote responsible fatherhood. B3 comes at an important time for research on fatherhood programming. Programs use a number of promising models to work with fathers, but rigorous studies have not yet shown which are effective and worth expanding or replicating. B3 is one of several new studies funded by ACF taking complementary approaches to provide needed evidence about program strategies to serve low-income fathers and their families.


The B3 impact study will use a randomized controlled trial design to provide rigorous evidence on the impacts of a parenting intervention in three sites, an employment intervention in three sites, and an engagement intervention in three sites (the engagement intervention will be tested in the same three sites as the parenting intervention). The table below shows the six B3 sites and the interventions that will be tested in each site:


Site Name | Location | Parenting test | Employment test | Engagement test
Children’s Institute Inc. | Los Angeles, CA | X | | X
People for People Inc. | Philadelphia, PA | X | | X
Structured Employment Economic Development Corporation (Seedco) | New York, NY | X | | X
Kanawha Institute for Social Research & Action (KISRA) | Dunbar, WV | | X |
Ohio Department of Job and Family Services | Cleveland, OH | | X |
The Fortune Society Inc. | New York, NY | | X |



The B3 process study will provide detailed information on program implementation from the perspective of staff and program participants at multiple points in time and from the perspective of the mothers of participants’ children.



A2. Purpose of Survey and Data Collection Procedures


Overview of Purpose and Approach


The B3 study is designed as a structured demonstration project in six sites to test evidence-informed enhancements to the core components of responsible fatherhood programs. These enhancements will include a parenting intervention, an employment intervention, and tests of strategies designed to promote greater engagement and participation of fathers in program services. The study received a generic OMB clearance on March 31, 2015, to gather information from staff at responsible fatherhood programs on existing services to aid in intervention design (0970-0356). The current request covers all of the data collection for the implementation study and the assessment of program impacts. The data collection activities will include:


1) Recruiting fathers into the study and collecting baseline information1

2) Tracking program implementation and program participation

3) Collecting follow-up survey data for estimating program impacts.

These B3 data collection efforts will complement other ACF studies, such as the Parents and Children Together (PACT) Evaluation (0970-0403) and the Fatherhood and Marriage Local Evaluation (FaMLE) and Cross-site Project (0970-0460). PACT and FaMLE are investigating the impacts of fatherhood programs as they exist in the field. B3 will complement these projects by looking inside the “black box” and testing the impacts of different features or components of programs, rather than the programs as a whole.

The data collection described in this request will be supplemented with administrative records data on employment and criminal justice outcomes. These administrative records data will not impose any additional burden on participants or data collectors because they are routinely collected for purposes independent of the B3 study. The data collection efforts we are proposing in this request will gather crucial information for the B3 impact and process studies that is not available in administrative data sources.

Data collection timeline

Starting in Summer of 2016 (following OMB approval of this request), B3 will assess the eligibility of fathers, enroll them in the study and randomly assign them to the program or control group, and collect baseline data at intake.


Screening for eligibility, enrollment, and baseline data collection will continue for approximately 18 – 24 months. Participant information will be collected throughout the study using the nFORM management information system (MIS).2 Program staff will be responsible for entering participant information into nFORM for approximately 27 months. Other data will be collected from participants to inform the process study during the 27-month period beginning in the summer of 2016; for example, mobile data collection will occur during nearly the whole period, and site visits will occur approximately 6 and 18 months after program start. During the same time period, some mothers of children engaged in the parenting interventions with their fathers will also be asked to participate in a focus group.

B3 will collect information on staff through a survey to be fielded in 2017 and through semi-structured interviews at two points in time, approximately six and eighteen months after program launch. Staff session debrief notes will also be collected from the parenting sites.

Follow-up survey data will be collected from participants 6 months after enrollment. Administrative records will supplement the baseline and follow-up surveys and will be collected for a period of approximately 4½ years starting approximately 18 months before program launch.

Beginning in early 2017 through the intervention period (of about 18 to 24 months), video recordings will be made of parenting workshop sessions in one or two parenting sites. These recordings do not impose any burden on participants as they will be of regularly-scheduled parenting program sessions.

Research Questions

The B3 study will aim to address a set of fundamental research questions for the fatherhood field across several key areas:

1) What are effective approaches to engage fathers in fatherhood program services?

2) How was the parenting program enhancement implemented in each site? What factors seem to support effective implementation more than others?

3) How was a cognitive behaviorally informed employment program enhancement implemented in each site? What factors seem to support effective implementation more than others?

4) What are the program impacts of enhanced engagement, parenting, and employment components when added to current fatherhood services? Who seems to benefit more or less from these interventions?

Below we situate the study design within the broader literature on what we know and what we need to know about encouraging responsible fatherhood, supporting father/child relationships, fostering healthy co-parenting relationships, encouraging fathers’ stable employment, and increasing fathers’ engagement with and participation in fatherhood program services. Then, we turn to a more specific set of research questions and the data collection components necessary to address these open research questions.

Brief Review of Scientific Literature.


Father-child Relationships. Research shows a strong link between supportive fathering and child outcomes, and that the absence and disengagement of fathers from the family can pose developmental risks for children (e.g., Amato and Gilbreth 1999; Cabrera et al. 2007; Cancian et al. 2010; Carlson and Magnuson 2011; Cowan, Cowan, Cohen, Pruett, and Pruett 2008; King and Sobolewski 2006). Evidence on the positive correlation between father involvement and child well-being is particularly strong for fathers who live with their children. The research on nonresident fathers is not as clear. Overall, the literature suggests that the amount of contact between nonresident fathers and their children is not necessarily correlated with positive child outcomes; rather, the quality of interaction may matter more.


A catalog of research on fatherhood programs for low-income fathers funded by ACF identified eight rigorous impact studies that included programming intended to increase and improve fathers’ amount and quality of involvement with their children (Avellar et al. 2011). While some of these studies showed positive impacts on fathers’ involvement with children, most were very small-scale and not all were for low-income fathers; there is much left to learn about how to best support parenting skills among low-income fathers. The evidence base for programs that involve children and co-parents is particularly limited. The B3 study represents an important opportunity to break new ground by testing targeted parenting skills training that directly engages fathers with their children and co-parents.


Economic Security. Although responsible fatherhood programs have always considered bolstering fathers’ earnings to be an important goal, very few rigorous studies have tested employment-oriented programs targeted specifically to fathers. The first such study, MDRC’s Parents’ Fair Share Demonstration (PFS), found small positive impacts on some employment outcomes, but only for the least job-ready participants (Miller and Knox 2001). A few other studies of programs targeting noncustodial parents using weaker research designs have found modest impacts on employment outcomes, but it is difficult to determine which specific kinds of employment services were effective (Bloom 2014).


Adding to the employment challenge is that a substantial proportion of fathers served in responsible fatherhood programs have had some involvement with the criminal justice system. Individuals returning from a period of incarceration face a number of well-documented issues, including the need for a job and immediate income, for stable housing, and for reestablishing healthy relationships with family and friends. For many, these needs are compounded by spotty work histories, low levels of educational attainment, and mental health and substance abuse issues (Bauldry et al. 2009). Some criminologists argue that for services to be effective, it is important to identify and address underlying “core criminogenic needs” such as pro-criminal attitudes and behaviors, antisocial peers, impulsivity, substance abuse, and lack of motivation prior to engaging in treatment of secondary needs, such as employment, family relationships, and parenting skills (Andrews and Bonta 2010).


An approach integrating cognitive-behavioral and economic security services may be a promising direction for improving outcomes for these highly disadvantaged men. Our site visits (OMB generic clearance number 0970-0356) revealed that cognitive-behavioral approaches were being incorporated into responsible fatherhood services and were an area of interest to the field. B3 has an opportunity to add to the evidence base on how cognitively informed employment programs affect fathers’ employment, parenting, co-parenting, criminal justice, and other outcomes.

Behavioral approaches to strengthening engagement in fatherhood programs. A review of fatherhood programs (Martinson and Nightingale 2008) concluded that these programs have consistently struggled to recruit and retain fathers. Therefore, testing strategies to boost fathers’ engagement with programs is a pressing need for the field. B3 will draw on recent insights from behavioral economics (BE). BE has shown that most human decision-making is characterized by predictable limits on computation, willpower, and self-interest, contrary to the core assumption of classical economics that humans are rational utility maximizers (Thaler and Sunstein 2008). With these insights in mind, program designers should assume that their clients (and their staff) will take mental shortcuts, have trouble following through, be overwhelmed by choices, and be led by group norms. BE provides a set of tools that are relatively inexpensive and easy to implement to guard against these obstacles to rational decision-making (Ly et al. 2013; Richburg-Hayes et al. 2014; Sanders and Kirkman 2014). The B3 study will test a set of these tools to provide evidence on their effectiveness in boosting participation in fatherhood program services.

Impact Study Research Questions

The baseline and 6-month follow-up surveys will be administered to fathers enrolled in the B3 study and assigned to either the program or control group. Table 1 summarizes the impact research questions that B3 is designed to address, and the baseline and 6-month follow-up survey data required to address particular impact questions.

In general, we propose to collect baseline and follow-up measures of outcomes for which we will estimate program impacts. The baseline measures of outcomes will be used as covariates to increase the precision of estimates of program impacts at the 6-month follow-up. Some of our baseline survey measures are expected to moderate program impacts and therefore will also be used to identify key subgroups. The study team will also rely on pre-existing administrative data.



Table A1. Research Questions addressed by Baseline and Follow-Up Survey Data and nFORM MIS data

Domain | Research Question | Construct | Baseline survey (covariate) | 6-month follow-up survey (outcome) | nFORM (MIS)
Father involvement | (1) Does the program increase fathers' access to their children and the time they spend with their children? | Contact with children; frequency of various types of shared activities | x | x |
Parenting skills and efficacy | (2) Does the program strengthen fathers' parenting skills and confidence? | Harsh or positive discipline; parenting efficacy; parent/child dysfunctional interaction scale | x | x |
Parental commitment | (3) Does the program increase fathers' commitment to their children? | Father commitment/dedication to child; father perceived influence on child | x | x |
Father/child relationship quality | (4) Does the program increase the quality of fathers’ relationship with their children? | Father reports of overall quality of relationship with child, and feelings of disappointment, pride, or frustration with child | x | x |
Co-parenting conflict and cooperation | (5) Does the program improve fathers' relationships with co-parents? | Co-parenting alliance, conflict, support, maternal gatekeeping, undermining | x | x |
Cognitive behavioral | (6) Does the program reduce fathers' parental stress and improve other cognitive behavioral outcomes? | Perceived Stress Index; impulsivity; coping; self-efficacy; perseverance; self-control; problem-solving skills | x | x |
Employment, earnings, and economic well-being | (7) Does the program increase fathers’ employment, earnings, and economic well-being? | Hours worked, wages, income, income volatility; job characteristics | x | x |
Child support | (8) Does the program increase fathers’ child support payments or informal financial support for children? | Financial and in-kind contributions to children | x | x |
Criminal justice involvement and substance use3 | (9) Does the program reduce recidivism and drug use? | Parole or probation violations; time spent in prison; problems with work or relationships caused by drug or alcohol use | x | x |
Household and family structure | (10) Do program impacts vary for fathers with different marital and fertility histories? | Number and ages of children, number of childbearing partners, marital and partnership status | x | |
Participation | (11) Between two approaches to encourage participation in a parenting program, does one increase participation more than another? | Enrollment, attendance, completion, re-enrollment | | x | x
Engagement | (12) Between two approaches to encourage participation in a parenting program, does one increase fathers’ satisfaction and perceived usefulness of the program more than another? | Satisfaction, perceived usefulness | | x |







Process Study Research Questions

Qualitative and quantitative data will be collected from multiple sources and triangulated for the process study. While the multiple data sources are complementary, they are not duplicative. Different perspectives on the program will inform our understanding of key implementation issues. Sources include staff, study participants and their co-parents, the nFORM or program MIS, and pre-existing program materials. Together, the multiple data sources will give us information about how implementation varies at three levels – organizational, staff, and participant – to provide as comprehensive an understanding as possible of how these interventions were implemented across different contexts and individuals.

Table A2. Research Questions addressed by Process Study Data

Research Question | Data Collected from Study Participants | Data Collected from Program Staff
1- What was the community context and service environment in which the parenting or employment interventions operated and how did it change over the course of the evaluation? | x | x
2- What is the business as usual service model and how did this differ from the parenting or employment models within sites? | | x
3- How are study participants identified, recruited, determined eligible, and enrolled in parenting or employment services? To what extent do sites re-engage participants who do not complete their program? | x | x
4- How were the engagement, parenting, or employment interventions offered and used by participants? | x | x
5- What are the implementation systems that sites used to support the engagement, parenting, or employment interventions? | | x
6- What were key challenges and lessons of implementing the engagement, parenting, or employment interventions? | x | x
7- What are the organizational characteristics of each B3 site? | | x
8- What are the study participant perspectives on how well the program met their needs? | x |






Study Design


The design of the B3 study will involve an intake process in which fathers are screened for eligibility for enhanced program services and then asked to consent to be in the study. Eligible fathers will be randomly assigned to enhanced services or to a control group whether or not they consent to be in the study. The eligibility screening will ensure that only fathers for whom the interventions are well-suited are included in the study.

In the sites testing a parenting intervention, fathers will be eligible if they have had recent contact with a young child of theirs. In sites testing an employment intervention, fathers’ eligibility will be determined by prior criminal justice involvement and a score on a risk assessment screener that was developed to assess the risk for recidivism among ex-offenders. Two of the sites offering the enhanced employment services will use a screener that they were already using, while the third site will use a screener provided by the study team. The screener provided by the study team will ask about fathers’ criminal history, education, employment, family and social support, alcohol and drug use, and attitudes to determine their risk for recidivism. Fathers will be given 0, 1, or 2 points for each question, with a higher number of points corresponding to a higher risk of recidivism. Staff members will add up the total number of points that a father has across all 35 questions in the screener. If the father has at least 15 points, he will be considered to have at least a moderate risk of recidivism and will be eligible for the study.
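To make the scoring rule concrete, the minimal sketch below illustrates the arithmetic described above; the function and variable names are ours and are not part of the screener instrument itself.

# Illustrative sketch of the scoring rule described above (not the actual
# B3 screener instrument): each of the 35 items is scored 0, 1, or 2, the
# points are summed, and a total of at least 15 indicates at least moderate
# recidivism risk, which makes the father eligible for the study.

RISK_THRESHOLD = 15            # minimum total score for eligibility
NUM_ITEMS = 35                 # number of screener questions
VALID_ITEM_SCORES = {0, 1, 2}  # allowed score for each question

def is_eligible(item_scores):
    """Return True if the summed screener score meets the eligibility threshold."""
    if len(item_scores) != NUM_ITEMS:
        raise ValueError(f"Expected {NUM_ITEMS} item scores, got {len(item_scores)}")
    if any(score not in VALID_ITEM_SCORES for score in item_scores):
        raise ValueError("Each item must be scored 0, 1, or 2")
    return sum(item_scores) >= RISK_THRESHOLD

# Example: scoring 1 on 20 items and 0 on the remaining 15 totals 20 points,
# which meets the 15-point threshold, so the father would be eligible.
print(is_eligible([1] * 20 + [0] * 15))  # True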


Participant tracking will be done using the nFORM management information system (MIS), for which clearance was previously obtained (0970-0460), starting at intake and throughout the program. This clearance includes burden for participant tracking in Federally-funded fatherhood grantee sites.


At baseline, fathers will complete an Applicant Characteristics questionnaire, which has already been approved under the Fatherhood and Marriage Local Evaluation and Cross-site (FaMLE Cross-site) Project OMB package (0970-0460). This clearance includes burden for the Applicant Characteristics survey in Federally-funded fatherhood grantee sites. Fathers will also be asked to complete a baseline survey, administered by audio computer-assisted self-interview software (ACASI), that is expected to take an average of 30 minutes per respondent.

Fathers will also respond to a follow-up survey 6 months after baseline that will be administered by computer-assisted telephone interview (CATI) and is expected to take an average of 40 minutes. For sites testing the parenting intervention, both the baseline and 6-month follow-up surveys will be translated into Spanish. The employment instruments will not be translated because of the very low numbers of Spanish-speaking fathers at those particular sites. Once the English versions of the instruments are approved, Spanish versions will be finalized and sent to OMB as a nonsubstantive change.

The impact study will take advantage of the experimental research design. Outcomes for program and control group members will be compared at follow-up. By virtue of random assignment, the program and control groups should be identical, on average, in their measured and unmeasured characteristics at the point of random assignment. For this reason, the internal validity of our estimates of program impacts is particularly strong.


The B3 study addresses a pressing need in the fatherhood field for demonstrations of how innovative approaches can be implemented and for early efficacy tests of these approaches. Our study sample will not be a probability sample; therefore, we will not be able to statistically generalize our findings to a broader population. However, our extensive research into the landscape of fatherhood programs under the generic clearance will allow us to assess and explain how our six sites fit in the context of the broader responsible fatherhood field. The baseline survey will collect detailed data on the characteristics of the fathers in our study and will allow us to compare our sample to the characteristics of the broader population of low-income fathers.


The process study is designed to provide insight into the treatment differentials and context for interpreting the findings of the impact study, to describe and document each newly established intervention and how it operated, and to distill lessons for the fatherhood field about program implementation challenges and effective strategies. The process study will also highlight what it takes to engage participants in services from the perspective of fathers and mothers. In contrast to other process studies of fatherhood programs, the B3 process study will emphasize gathering information about the specific enhancements being tested, the process for implementing them, and understanding how those differ from services as usual, with less emphasis on the fatherhood organizations as a whole. With this approach, the process study will be able to provide lessons and guidance for other fatherhood programs that may be interested in implementing similar program enhancements.


The process study uses a mixed-methods approach, employing both quantitative and qualitative approaches and triangulating data from a variety of sources, including questionnaires, site visits, and analysis of nFORM MIS data. Whenever possible, quantifiable measures of key process dimensions will be used to characterize program processes in a clear and systematic way. With data collected from staff, we propose a sequential mixed-methods approach in which quantitative data are collected first via a staff survey, followed by qualitative interviews. Based on the results of early quantitative analyses of the staff surveys and the nFORM MIS data, the semi-structured interviews will be designed to help us understand how and why certain patterns or challenges emerged. While two different interventions are being tested, much of the process study approach will be common across the interventions; the primary differences will be content-specific inquiries embedded in questionnaires and other forms of data collection. One difference is the plan to collect information from mothers whose children are engaged in the parenting intervention with their fathers; we do not plan to collect information from mothers whose co-parents are engaged in the employment intervention.



Universe of Data Collection Efforts


Attachment 1 – Screening questions for parenting intervention (row 1 in Table A.4), and

Attachment 2 – Screening questions for employment intervention (row 2 in Table A.4).


Fathers entering fatherhood programs will be screened for their eligibility for enhanced parenting or employment services. In parenting intervention sites, fathers will be eligible if they have had recent contact with a young child in the eligible age range (between 2 months and 2 years of age). (We may raise the upper bound of the eligible age range to 4 years of age if recruiting a sufficient sample proves difficult.) In employment intervention sites, fathers will be eligible if they have prior criminal justice involvement and if their score on a risk assessment indicates that they would potentially benefit from a cognitive-behaviorally informed employment program.


Attachment 3 – Consent Materials for Parents of Fathers under 18 (row 3 in Table A.4)


Fathers entering fatherhood programs will be required to sign consent forms (Appendix A – Consent materials for fathers and Appendix B – Assent materials for fathers under 18). If an applicant is a minor, it will be necessary to obtain consent from the parent as well, unless the state’s emancipated minor laws make this unnecessary. This consent will be obtained by telephone. The script for obtaining consent from parents of minors is included as Attachment 3 - Consent materials for parents of fathers under 18.

Attachment 4 – B3-specific eligibility data (row 4 in Table A.4)


B3-specific eligibility will be largely determined using paper instruments before staff members enter any data into nFORM. Once the paper screeners have been completed, a few key data fields will be entered into the nFORM MIS by staff.


Attachment 5 – B3-specific enrollment data (row 5 in Table A.4)

All fathers who consent to be in the study will be asked about the type of cell phone and messaging plan they have to help with the nonresponse bias analysis for the mobile device survey. The study team will also be collecting relevant ID numbers needed to access administrative records and contact information for others who know how to get in touch with the father to help ensure that the father can be reached for later data collection activities. In sites testing the parenting intervention, information will be collected about the child and co-parent in the nFORM MIS system. Some of this information will be used to populate fields in the baseline survey. In addition, some of this information is meant to help facilitate the focal child and co-parent’s participation in program services.


Attachment 6 – B3 tracking of attendance in services for program group members (row 6 in Table A.4)

Tracking attendance in program services is an essential component of the process study. Staff will be responsible for tracking attendance in services for program group members in nFORM. This data collection effort contributes to research question (11) in Table A1.


Attachment 7 – Additional nFORM burden for non-Grantee site (row 7 in Table A.4)

B3 will implement an employment intervention in one site that is not an OFA Responsible Fatherhood Grantee. Therefore we are requesting additional burden to cover collection of data on applicant characteristics and program operations for a non-Grantee site. The instruments associated with this burden can be found in the FaMLE Cross-site OMB package (0970-0460). This data collection effort contributes to research question (12) in Table A1.


Attachment 8 – Baseline survey for sites testing parenting intervention, and

Attachment 9 – Baseline survey for sites testing employment intervention (row 8 in Table A.4)


We will collect baseline information from fathers in the program and control groups. These baseline data will be used to describe the population being served, to assess whether random assignment led the program and control groups to be well balanced in their baseline characteristics and circumstances, to identify key subgroups for analysis, and to collect baseline measures of outcomes. This survey will be self-administered using an Audio Computer-Assisted Self-Interview (ACASI) on a laptop or tablet. Enrollees who are eligible for B3 program services will go through a brief consent process (Appendix A) or assent process for minors (Appendix B) before taking the survey. The consent process will cover all data collected from father participants in this OMB new data collection request.


Attachment 8 – Baseline survey for sites testing parenting intervention. The baseline survey in parenting sites will collect data on service receipt, household and family structure, father/child contact, father engagement in particular caretaking, educational, or recreational activities with the child, discipline, father/child relationship quality, parenting efficacy, father commitment to the child, co-parenting relationship quality, child support, employment, and cognitive/behavioral outcomes. These data will be used as baseline covariates in impact analyses at 6 months to address research questions (1) through (5) in Table A1 about program impacts on parenting and co-parenting outcomes and research questions (6), (7), (8), and (9) in Table A1 about program impacts on cognitive behavioral outcomes, employment, child support, and substance abuse. These data will also provide information needed to analyze moderators of program impacts, as described in research question (10) in Table A1.

Attachment 9 – Baseline survey for sites testing employment intervention. The baseline survey in sites testing an employment intervention will collect data on service receipt, employment, income and economic well-being, child support, criminal justice, parenting/co-parenting, and cognitive/behavioral outcomes. These data will be used as baseline covariates in impact analyses at 6 months to address research questions (6) through (9) in Table A1 about program impacts on cognitive behavioral outcomes, employment, criminal justice and substance use, and child support payments. These data will also be used as baseline covariates in impact analyses on parenting and co-parenting outcomes, as shown in research questions (1), (3), (4), and (5) in Table A1.

Attachment 10 – 6 month follow-up survey for sites testing parenting intervention, and

Attachment 11 – 6 month follow-up survey for sites testing employment intervention (row 9 in Table A.4)

A computer-assisted telephone interview (CATI) survey will be administered to fathers 6 months after baseline. Fathers in the program and control groups will be interviewed. The follow-up survey data will be the primary source of data with which to estimate program impacts. The survey will be supplemented with data available through administrative records on quarterly employment and earnings, and criminal justice system involvement.


Attachment 10 – 6 month follow-up survey for sites testing parenting intervention. The follow-up survey in sites testing the parenting intervention will collect father reports of the amount of father/child contact, the activities fathers and children engaged in together, the quality of father/child relationship, father’s parenting self-efficacy, the quality of father’s relationship with the co-parent, employment, stress, and other cognitive/behavioral outcomes. These data will be a primary source of 6 month outcome data to address research questions (1) through (5) in Table A1 about program impacts on parenting and co-parenting outcomes and research questions (6), (7), (8), and (9) in Table A1 about program impacts on cognitive behavioral outcomes, employment, child support, and substance abuse.

We are not planning to include any measures of child outcomes on the assumptions that (a) many nonresident fathers will lack sufficient contact with children to accurately report on their child’s behavioral, cognitive, and other outcomes, and (b) we are unlikely to be able to detect impacts on these measures in a 6-month follow-up.


Attachment 11 – 6 month follow-up survey for sites testing employment intervention. The follow-up survey in sites testing an employment intervention will collect father reports of employment, earnings, income, child support payments, father/child contact, quality of relationship with co-parents, cognitive behavioral outcomes, and criminal justice outcomes. These data will be a primary source of 6-month outcome data to address research questions (6) through (9) in Table A1 about program impacts on cognitive behavioral outcomes, employment, criminal justice and substance use, and child support payments, and research questions (1), (3), (4), and (5) in Table A1 about program impacts on parenting and co-parenting.

Attachment 12 – Staff and management semi-structured interviews for sites testing parenting intervention, and

Attachment 13 – Staff and management semi-structured interviews for sites testing employment intervention (row 10 in Table A.4)

Staff working with the program and control groups (which could include workshop facilitators, case workers, or support staff) as well as B3 program administrators will be asked to participate in semi-structured interviews over the course of two site visits scheduled to occur approximately 6 and 18 months after program launch. These interviews are designed to understand the “how” and “why” of program implementation, with an emphasis on the adoption of a new intervention and how it was implemented. The specific questions asked in each interview will depend on the staff person’s role but could include the topics of community context and service environment, program operations (business as usual and the enhanced models), participant outreach and recruitment, the systems supporting the interventions, challenges and lessons of program implementation, and organizational characteristics of the B3 organizations. Information collected from staff will address each of the research questions listed in Table A2 except for study participant perspectives (question 8).


Attachment 14 – Staff survey for sites testing parenting intervention, and

Attachment 15 – Staff survey for sites testing employment intervention (row 11 in Table A.4)


Staff working with the program and control groups will be asked to complete a web-based survey. This survey serves as a point-in-time snapshot to address questions 2, 4, 5, and 7 listed in Table A2, including assessing the use of business-as-usual strategies by the staff serving the control group, assessing usage of and experiences with implementing the enhancements (including involvement in recruitment), and understanding the implementation systems supporting the enhancements that could be potential drivers of implementation variation.

Attachment 16 – Participant focus groups (row 12 in Table A.4)


A focus group of approximately 8 program group members will be convened in each B3 site during site visits approximately 6 and 18 months after program launch. The purpose of these groups is to gather information to address research questions 1, 3, 4, 6, and 8 listed in Table A2, including community context, intake and maintaining engagement with program (including barriers to program engagement and program completion), participant experiences in the respective programs and challenges to participating, and opinions about program services and staff.


Attachment 17 – Mother Focus Groups (row 13 in Table A.4)

A total of twenty mothers of children engaged in the parenting interventions with their fathers will be asked to participate in focus groups at each of the three parenting intervention sites. The focus groups are designed to better understand mothers’ barriers to getting their child to the program, their experiences and satisfaction with the program, and their needs and concerns. Responses will inform research questions 4, 6, and 8 in Table A2. The mothers will also be asked about how the program has affected their relationship with their child’s father. The focus groups will begin with a brief consent process (Appendix C – Consent forms for focus groups with mothers).


Attachment 18 – Mobile device employment survey, and

Attachment 19 – Mobile device parenting and co-parenting survey (row 14 in Table A.4)



Mobile phones will be used to collect information from program and control group members in employment and parenting sites about their engagement in and experiences with the program, including from sample members who stop participating in their assigned components. Responses will inform research questions 4, 6, and 8 in Table A2. Information may be collected through a mobile application or text message. Questions for program group fathers will be specific to the intervention so that feedback can be received from participants in real time; the questions will be developed from existing literature and similar models.


Attachment 20 – Post-session debrief for sites testing parenting interventions (row 15 in Table A.4)

Throughout the parenting intervention, staff will complete debrief notes after each father-child play session. These notes provide information about the degree to which the father adhered to the lesson taught that day, which will inform research question 4 in Table A2. B3 will analyze these data as part of the process study.


Attachment 21 - Consent Materials for Father Video Observations (row 8 in Table A.4)


In one or two parenting sites, fathers will be asked to consent to being video recorded during parenting sessions with their child. Attachment 21 contains the consent form fathers will be asked to sign.


Attachment 22 – Consent Materials for Child Video Observations (row 21 in Table A.4)


In one or two parenting sites, custodial parents will be asked to consent for their child to be video recorded during parenting sessions with their father. Attachment 22 contains the consent form that custodial parents will be asked to sign.



A3. Improved Information Technology to Reduce Burden

This study will use information technology, when possible, to minimize respondent burden and to collect data efficiently.

When information is available from a centralized, computerized source, such information has not been included in the data collection instruments described in this submission. For example, Responsible Fatherhood Grantees are required to track participation in the nFORM MIS, and since most of the B3 sites will be Responsible Fatherhood Grantees, we will be using data from this system to track participation in services offered by the site.

The baseline surveys for fathers will be collected using a secure web-based survey that is self-administered by the participant. There will be an audio component (Audio Computer-Assisted Self-Interview, or ACASI) that will read the questions and response options to the respondent to ensure that literacy issues do not affect the respondent’s ability to complete the survey independently. The participant can choose to use the audio, read the questions and responses on screen, or both, whichever is most comfortable for him. Conducting the baseline survey in this manner means that the respondent can answer the survey questions without site staff involvement and his responses are kept confidential. ACASI also allows for the efficient administration of a survey by using skip logic to quickly move to the next appropriate question depending upon a respondent’s previous answer.

To ensure that baseline data collection is as seamless as possible for fathers and program staff, some information that is needed for the survey will be pushed from the nFORM MIS to the baseline survey. During the enrollment session with each participant, once the initial information has been collected in nFORM, nFORM will pass several data fields to the baseline survey using a web service. The survey will then be prepopulated with key fields such as the focal child’s name and age and the name of the focal child’s mother or guardian. The site staff will then pass the laptop or tablet to the participant, along with a set of headphones.
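The sketch below is purely illustrative of this kind of field hand-off; the actual nFORM web service interface is not specified in this document, and the field names and survey item identifiers shown are hypothetical stand-ins.

# Hypothetical sketch of prepopulating a baseline survey from MIS fields.
# Field names and survey item IDs below are illustrative assumptions, not
# the actual nFORM interface.

def prepopulate_baseline_survey(mis_record: dict) -> dict:
    """Map MIS enrollment fields to the survey items they prefill."""
    field_to_item = {
        "focal_child_name": "Q_CHILD_NAME",
        "focal_child_age_months": "Q_CHILD_AGE",
        "co_parent_name": "Q_COPARENT_NAME",
    }
    missing = [field for field in field_to_item if field not in mis_record]
    if missing:
        raise ValueError(f"MIS record is missing required fields: {missing}")
    return {item: mis_record[field] for field, item in field_to_item.items()}

# Example hand-off at an enrollment session (values are placeholders).
prefill = prepopulate_baseline_survey({
    "focal_child_name": "Focal Child",
    "focal_child_age_months": 14,
    "co_parent_name": "Co-Parent Name",
})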

The 6-month follow-up surveys for fathers will be conducted using computer-assisted telephone interviews (CATI), and computer-assisted in-person interviews (CAPI) for respondents who cannot be contacted by telephone. CATI and CAPI also allow for the efficient administration of a survey by using skip logic and accommodate participants who have low literacy. 

The team will also use short surveys delivered on mobile devices to capture key pieces of information during fathers’ participation in the study. This method of data collection will be particularly useful for capturing information on perceptions and experiences that might not be reported accurately if we asked about them a number of months after the experience, due to recall bias. Since the number of characters allowed in a text message is limited, respondents will be sent a link directing them to a website with information about the study and mobile device survey.

A4. Efforts to Identify Duplication

The primary target population for this research is fathers who are participating in programs offered by OFA-funded Responsible Fatherhood Grantees. All fathers who receive program services from the grantees will provide data for the FaMLE Cross-Site study (0970-0460) through the nFORM MIS.

As part of the FaMLE Cross-Site study, fathers will be given an Applicant Characteristics survey. B3 will use some of these data to supplement the B3 baseline surveys. B3 will obtain these survey data from the FaMLE Cross-Site research team to avoid duplication of effort.

The FaMLE Cross-Site study also includes a pretest, delivered at the time of the first service, that asks about some topics similar to questions asked on the B3 baseline surveys (Attachments 8 & 9). We have worked with FaMLE Cross-Site to shorten the pretest in B3 sites so that respondents do not have to answer very similar questions on two different surveys, both of which are given at or near the beginning of their entry into the program.

The FaMLE Cross-Site study also includes a Program Operation survey for program staff; the B3 process study instruments cover some topics related to items in the FaMLE Program Operations survey, but we will be asking for more detail than is included in the Program Operations survey. The B3 baseline and 6-month follow-up surveys and the process study data collection components will focus on information that cannot be found in administrative records, program MIS, or other existing sources. For example, the survey will facilitate the collection of data on the father/child relationship and the father/co-parent relationship, nuanced characteristics of employment such as work hours and schedules, and cognitive behavioral outcomes. As such, the B3 instruments do not duplicate other information accessible to ACF.

A5. Involvement of Small Organizations

We anticipate that at least some of the six sites that will be part of B3 are small community organizations. To minimize the burden of the study on program staff, B3 will provide resources for each site to hire a Research Coordinator.

We will use technology to reduce the burden of data collection on staff. Fathers will be able to complete the baseline survey on a laptop or tablet, minimizing the staff time required for data collection. We will also take advantage of program participation data collected using the nFORM system, which will allow us to avoid duplication of data collection efforts.

To minimize the burden of the process study data collection efforts, we will work in partnership with sites to identify the best opportunities for administering surveys to staff and study participants to minimize interruption of the program. Furthermore, site visits will be scheduled and planned in conjunction with program leadership.


A6. Consequences of Less Frequent Data Collection

The B3 data collection aims to collect information only as frequently as needed to achieve the aims of the study. Eliminating any of the proposed data collection items would compromise our ability to address key research questions.

Baseline survey data (Attachments 8 & 9). The baseline survey will be administered once. Without it, we would be unable to verify that random assignment had yielded program and control groups that were similar in their observed background characteristics and in their baseline measures of outcomes. The baseline survey is also essential for describing the baseline characteristics of our study sample, for providing covariates for regression-adjusted impact analyses, and for collecting variables that will be used to identify subgroups.

Follow-up survey data (Attachments 10 & 11). The follow-up survey will be administered only once. The follow-up survey is essential for allowing us to estimate program impacts. There is a reasonable expectation of significant change in key measures between the baseline and follow-up survey, particularly in the treatment condition.

Semi-structured interviews with staff (Attachments 12 & 13). B3 staff working with the program and control groups will be interviewed at two points in time, approximately 6 months and 18 months after program launch. Two interviews are needed because programs and their services are likely to change over time as a result of learning from program implementation. The study team will keep track of these program changes in order to understand observed service delivery and participation patterns.

Surveys of staff at B3 program sites (Attachments 14 & 15). Staff working with the program and control groups will be asked to complete a survey in 2017. The survey is needed to gauge staff background and experience, efficacy, perceptions of their organization, and experience with the new enhancements. This information will contribute to the analysis of program implementation.

Mobile device surveys (Attachments 18 & 19). To capture information from both participants and non-participants in the program and control groups, in order to inform practice and learning among site staff, frequent data collection is beneficial. More regular data collection can help us understand potential points of disengagement and aspects of the program that may not feel suitable to participants. While this data collection effort would be somewhat frequent (average of 3 surveys per respondent at an employment intervention site and 3.5 surveys per respondent at a parenting intervention site for a weighted overall average of 3.28 surveys per respondent), the surveys are limited to a handful of the most crucial questions to limit respondent burden.
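As a back-of-envelope illustration of how the overall figure combines the two site-type averages (the respondent share used below is an assumption chosen only to reproduce the reported 3.28 and is not a study count):

# Illustrative arithmetic only. The per-site-type averages (3 and 3.5
# surveys per respondent) come from the text above; the share of
# respondents at employment intervention sites is an assumed weight.

emp_avg, par_avg = 3.0, 3.5   # surveys per respondent, by site type
emp_share = 0.44              # assumed share of respondents at employment sites

overall_avg = emp_share * emp_avg + (1 - emp_share) * par_avg
print(round(overall_avg, 2))  # 3.28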

A7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.

A8. Federal Register Notice and Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on December 1, 2015 (Volume 80, Number 230, pages 75117-75119) and provided a 60-day period for public comment. A copy of this notice is included as Appendix D. The government did not receive any comments in response to the Federal Register notice during the comment period.

Consultation with Experts Outside of the Study

A panel of experts in the fatherhood field provided consultation to the study team and members of ACF in a meeting convened on January 13-14, 2015. These experts represented a range of disciplines and included both practitioners and researchers. Input we received at the meeting was incorporated into the design of the B3 interventions and the design of the B3 research study. The study team has also consulted with experts from the Fatherhood Research and Practice Network on questionnaire development.

Finally, the team asked Kathryn Edin (Bloomberg Distinguished Professor in the Department of Sociology, Zanvyl Krieger School of Arts and Sciences, and Department of Population, Family, and Reproductive Health, Bloomberg School of Public Health), Cynthia Osborne (Associate Professor of Public Affairs; Director, Center for Health and Social Policy; Director, Child and Family Research Partnership, University of Texas at Austin), Jay Fagan (Professor in the School of Social Work at Temple University and Principal Investigator and Co-director of the Fatherhood Research and Practice Network), and Paul Florsheim (Professor, Community and Behavioral Health Promotion, School of Public Health, University of Wisconsin-Milwaukee) to review the baseline (Attachments 8 & 9) and follow-up (Attachments 10 & 11) surveys for fathers. We also consulted with Richard Guare on measures for the baseline and follow-up surveys in employment sites (Attachments 9 & 11, respectively).


A9. Incentives for Respondents

The evaluation’s data collection plan includes gift cards, checks, or money orders to participants for completing baseline and follow-up surveys, for completing a mobile device survey, and for participating in focus groups. In sites testing a parenting intervention, mothers who participate in focus groups will be offered a small gift card or check as a thank you for their participation. In addition, small amounts will be given to sample members to encourage them to update their contact information prior to the 6-month survey. Each respondent will receive the designated amounts after each data collection activity that they participate in. (See Table A.3 for remuneration amounts.)


The purpose of the incentives to participants is to improve response rates and reduce non-response bias by decreasing the number of refusals, enhancing respondent retention, speeding the data collection process, and providing a gesture of goodwill to acknowledge respondent burdens. In addition, providing a small monetary incentive at baseline will also increase the likelihood that these sample members will respond to the follow-up survey because sample members who receive monetary incentives for completing a past survey are more likely to respond to subsequent surveys (Singer et al., 1998).


To be effective, the amount of the incentives must fit the burden of the survey. We have based the amounts to be offered to B3 respondents on prior research and on MDRC’s and Abt SRBI’s experience collecting data from similar populations. The proposed amounts are listed in Table A.3.


The proposed amounts are designed to be sufficient to encourage individuals to participate in both the study and the survey but are not overly generous. Offering a lower amount could jeopardize the study and actually cost the government more because it could result in a lower uptake of fathers into the study and more effort expended by the evaluation team to successfully enroll and survey fathers.


The incentive amounts are in line with those used in recent studies that worked with economically disadvantaged samples similar to this one. For instance, MIHOPE (0970-0402) offered $40 in cash plus in-kind gifts of appreciation for completing the 60-minute baseline participant survey; the survey was required, so the response rate was 100%. Building Strong Families (0970-0304 & 0970-0344) offered $50 for completing two 50-minute parent interviews, and response rates among fathers were 72% for the 15-month survey and 69% for the 36-month survey. The Enhanced Transitional Jobs Demonstration (0970-0413) offered $40 for completing a 45-minute follow-up survey at 12 months, and response rates ranged from 67-82% across sites. The Parents and Children Together (PACT) Evaluation (0970-0403) offered $25 for the follow-up survey, and the final response rate was 78%; as is the case in many studies, the PACT study offered $10 more for the follow-up survey than for the baseline survey to ensure high response rates for the follow-up survey.


Table A.3 Incentives for Participation

Research Activity | Length | Incentive Amount | When
Baseline surveys (Attachments 8 & 9) | 30 min | $25 | At time of study intake
Updating contact information with the survey firm | 10 min | Maximum of $7 | 1-4 months after baseline
Follow-up surveys (Attachments 10 & 11) | 40 min | $35 | 6 months after baseline
Participant focus groups (Attachment 16) | 120 min | $20 | 6 or 18 months after baseline
Mobile device surveys (Attachments 18 & 19) | 30 min | Maximum of $25 | 1-2 months after baseline
Mother focus group (Attachment 17) | 60 min | $20 | 6 or 18 months after baseline


Fathers who participate in all data collection activities would receive $112. Mothers who participate in a focus group would receive $20.

A large body of research provides evidence of the benefits of offering monetary incentives. In a seminal meta-analysis, Singer et al. (1999) found that incentives in face-to-face and telephone surveys were effective at increasing response rates, with a one-dollar increase in incentive resulting in approximately a one-third of a percentage point increase in response rate, on average. A study by Berlin and colleagues found that incentives increased the response rates of respondents with low levels of literacy, as well as lowering interviewer costs (Berlin et al., 1992). James also found that an incentive was effective in lowering non-response rates and that any incentive lowered the number of interviewer visits per case (James, 1997). The Mack et al. study of respondents to the Survey of Income and Program Participation (SIPP) found that incentives reduced non-response rates in initial and subsequent interviews, and were particularly effective in reducing non-response rates in poor and African-American households (Mack, Huggins, Keathley & Sudukchi, 1998). Moreover, the use of incentives has been found to be efficacious for increasing the response rates of sensitive subject matter surveys (Mosher et al., 1994). OPRE is currently conducting experiments on incentives for another approved project (MIHOPE, 0970-0402) to examine the effectiveness of different incentive levels to inform future projects.
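For a rough sense of scale, the arithmetic below applies the Singer et al. (1999) average effect to a hypothetical difference in incentive amounts; the $10 figure is illustrative only and is not a study parameter.

# Illustrative arithmetic only, based on the Singer et al. (1999) average
# effect of roughly one-third of a percentage point per incentive dollar.

effect_per_dollar = 1 / 3       # percentage points of response rate per $1
incentive_difference = 10       # hypothetical difference in incentive, in dollars

expected_gain = effect_per_dollar * incentive_difference
print(round(expected_gain, 1))  # about 3.3 percentage points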


Finally, our prior experience fielding data collection instruments with economically disadvantaged and Temporary Assistance for Needy Families (TANF)-receiving populations also supports the evidence that incentives increase response rates. For example, in a follow-up interview with Job Corps applicants, experimental evidence showed that incentives increased response rates and substantially improved the efficiency of locating respondents. Experience in these and similar studies of disadvantaged populations suggests that incentives can help convince reluctant respondents to participate (Moffitt, 2004).


The B3 study team developed the plan for offering incentives based on research showing the effectiveness of incentives and the amounts given for data collection activities in previous studies. In most cases, longer data collection activities are associated with larger incentives to acknowledge the larger time commitment required by those activities. One notable exception is the amount being offered for focus group participation ($20 for both participant and mother focus groups). While the focus groups are a relatively large time commitment compared with the other data collection activities, getting high response rates for the focus groups is less crucial to the study since they will not be informing the impact study. Given this, the team decided to offer more modest incentives for the focus groups.


We believe that the studies summarized here, and MDRC’s previous experiences with fielding surveys and other kinds of assessments with low-income populations, make a strong case for the use of respondent incentives for completing surveys or focus groups.


A10. Privacy of Respondents


The study team is committed to protecting the privacy of participants and keeping private the data that are entrusted to us to the extent permitted by the law. In addition, the study team is experienced in implementing stringent security procedures. Every MDRC, MEF Associates, and Abt SRBI employee, including field staff employed for data collection, is required to sign a privacy pledge as an assurance of nondisclosure of private information. Field staff will also be trained in maintaining strict privacy and data security.


Our IRB submission received a full committee review and was approved on December 4, 2015 pending modifications. We submitted the required modifications and received the full approval on March 4, 2016. The expiration date of the IRB approval is December 4, 2016. The Project Title is [797029-1] Building Bridges and Bonds (B3) and was reviewed by IRB #0003522, FWA#00003694.


The evaluation has also obtained a Certificate of Confidentiality from HRSA (Appendix H). The Certificate of Confidentiality helps to assure sites and families that their information will be kept private unless a court compels disclosure.


Consent will be obtained by a trained staff member employed by the fatherhood programs selected to be a part of B3 (Appendix A – Consent materials for fathers). The staff member will explain the B3 study to the father after the father has been deemed eligible to be a participant in B3. The staff member will have a consent form on a tablet or computer with a hard copy version for the father to read while the staff person reviews it with him. If the father would like to participate, he will electronically sign the consent form. The father can take the paper copy of the consent form home with him. A different version of the form will be used to gain assent from fathers who are under 18 (Appendix B – Assent materials for fathers under 18).


If an applicant is a minor, it will be necessary to obtain consent from the parent as well, unless the state’s emancipated minor laws make this unnecessary. This consent will be obtained by telephone. The script for obtaining consent from parents of minors is included as Attachment 3 - Consent materials for parents of fathers under 18.


A similar process will be conducted in order to gain consent from co-parents who take part in focus groups (Appendix C – Consent forms for focus groups with mothers).


In one or two of the parenting sites, fathers assigned to the program intervention group will be asked to sign an additional consent form that asks for fathers’ consent to be videotaped during regularly scheduled parenting workshop sessions. This separate consent form is included as Attachment 21. If these fathers are custodial parents, they will also be asked to sign a separate consent form that grants permission for their child to be videotaped during regularly scheduled parenting workshop sessions. This consent form is included as Attachment 22. If fathers are not custodial parents, the co-parent of the father’s child will be asked to sign the consent (Attachment 22) for their child to be videotaped.


The following privacy and data security measures will be in place to protect respondents’ privacy, including any personally identifiable [4] or sensitive information collected about respondents:


  • All data, including paper files, portable media (e.g., voice/video recordings) and computerized files, are kept in secure areas. Paper files and portable media are stored in locked storage areas with limited access on a need-to-know basis. Computerized files are managed via password control systems to restrict access as well as physically secure the source files.

  • Merged data sources have identification data stripped from the individual records or encoded to preclude overt identification of individuals.

  • When files with personally identifiable or sensitive data are transferred between MDRC and service providers, government agencies, site staff, or subcontractors, personally identifiable information is first stripped from the file whenever possible. When portable media with personally identifiable or sensitive data need to be transferred, the data are encrypted before transfer. For computerized files, MDRC uses secure file transfer methods for transmitting confidential data, including Axway, which encrypts data in transit (via SSL) and at rest on MDRC’s secure network transfer site, or data providers’ secure FTP sites.

  • All paper records and electronic records containing personally identifiable information will be destroyed at the end of the study.

  • All reports, tables, and printed materials are limited to the presentation of aggregate numbers.

  • Compilations of individualized data are not provided to participating agencies.

  • Confidentiality agreements are executed with any participating research subcontractors and consultants who must obtain access to detailed data files. These agreements are corporate forms and will not be distributed to respondents.

  • Some of the information will be collected through the nFORM MIS (specifically, the information in Attachments 4, 5, 6, and 7). The ACF Office of Planning, Research and Evaluation completed a Privacy Impact Assessment (PIA) for the nFORM system, and the additional information in B3 that is collected through nFORM will fall under nFORM’s Authority To Operate. The PIA, titled Information, Family Outcomes, Reporting & Management, was finalized and signed by the HHS Senior Agency Official for Privacy on June 8, 2016. See Appendix I.

All research staff conducting site visits will be trained on appropriate privacy and data security matters, including consenting research subjects when needed. Staff will be trained to minimize risk by using encrypted laptops and recording equipment, using secure storage locations and transfer mechanisms, limiting access to only those who need to know, and destroying data once analysis is complete.


A11. Sensitive Questions


Our baseline and 6-month follow-up surveys of fathers will contain questions on some sensitive topics, such as relationships with children and co-parents, stressors and risks, substance abuse, criminal justice involvement, spanking, child support requirements and payments, and employment status and wages (Attachments 8 – 11). The sensitive survey questions included in the baseline and follow-up surveys are necessary for understanding the barriers fathers face in their relationships with children, relationships with co-parents, and employment stability, and for understanding program impacts. The interventions being tested are expected to improve parent/child and parent/co-parent relationships, promote healthy thought patterns and behaviors, reduce risky behaviors, and promote employment stability.


One of the sites testing the employment intervention will be using a new screening instrument to determine eligibility for the B3 study (Attachment 2). As part of the screening process, fathers will be asked questions about potentially sensitive topics such as employment status, criminal justice involvement, and alcohol or drug problems. These questions are needed to compute fathers’ risk scores and determine whether they are appropriate candidates for the employment intervention.


At intake, fathers will be asked to provide personally identifiable information such as relevant ID numbers (SSN, criminal justice IDs, etc.) and contact information (e.g. name, address, phone numbers, and e-mail address) for the father, the co-parent, and additional contacts that could help the research team find the father and the co-parent (Attachment 5). The collection of personal identifiers is necessary for participant tracking for follow-up surveys and to allow us to access and match administrative records data. For example, social security numbers and criminal justice IDs will be used to match to New Hires data and criminal justice administrative records. Contact information will be used by the survey firm to contact fathers for the follow-up survey, including matching with various tracking databases. These identifiers will not be used to retrieve information on individuals. Rather, information will be retrieved in batches based on date of enrollment in the study. We will describe the uses of this information to respondents.


Focus groups may also ask questions on sensitive topics like barriers to participation and experiences in the program (Attachment 16). The process study data are necessary to describe the challenges and approaches to implementing program interventions.


At baseline, these questions will be answered in a self-administered format, which should minimize discomfort (Attachments 8 & 9). At follow-up, the data will be collected by an interviewer over the telephone or in person (Attachments 10 & 11). For baseline and follow-up surveys, as well as focus groups, respondents will be informed that they do not have to answer any question they do not want to answer (Attachments 8, 9, 10, 11, & 16). Also, respondents will be informed by research staff prior to the start of the interviews or surveys that their answers will be kept private to the extent permitted by law, that results will only be reported in the aggregate, and that their responses will not affect any services or benefits they or their family members receive.


A “system of records” under the Privacy Act is:

  • a group of records about individuals

  • under agency control

  • maintained in a paper or electronic system from which the records are actually and directly retrieved by those individuals’ personal identifiers


For this project, information will not be maintained in a paper or electronic system from which it is actually or directly retrieved by an individual’s personal identifiers. Therefore, the Privacy Act’s requirements for systems of records do not apply.


A12. Estimation of Information Collection Burden


Burden Hours


Table A.4 shows the annual burden of the activities described in this supporting statement. Appendix E explains how the burden estimates were calculated for each instrument in the table. The total annual burden for applicants and staff members is estimated to be 3,017 hours.
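As an illustration of how the entries in Table A.4 relate to one another (a walk-through of the standard burden calculation, not text from Appendix E), the annual burden hours for each row are the product of the annual number of respondents, the number of responses per respondent, and the average burden hours per response. For the applicant row of Instrument 1:

Annual burden hours = 1,333 respondents × 1 response × 0.083 hours ≈ 111 hours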


Table A.4 Burden Estimates

| Instrument | Respondent | Total Number of Respondents | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Annual Burden Hours | Average Hourly Wage | Total Annual Cost |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 - Screening questions for parenting intervention | Applicant | 4,000 | 1,333 | 1 | 0.083 | 111 | $4.92 | $546.12 |
| | Staff [5] | 36 | 12 | 111 | 0.083 | 111 | $27.86 | $3,092.46 |
| 2 - Screening questions for employment intervention | Applicant | 900 | 300 | 1 | 0.250 | 75 | $4.92 | $369.00 |
| | Staff [6] | 12 | 4 | 75 | 0.250 | 75 | $27.86 | $2,089.50 |
| 3 - Consent Materials for Parents of Fathers under 18 | Parent of Applicant | 500 | 167 | 1 | 0.167 | 28 | $4.92 | $137.76 |
| | Staff [7] | 36 | 12 | 14 | 0.167 | 28 | $27.86 | $780.08 |
| 4 - B3-specific eligibility data | Applicant | 6,400 | 2,133 | 1 | 0.017 | 36 | $4.92 | $177.12 |
| | Staff [8] | 72 | 24 | 89 | 0.017 | 36 | $27.86 | $1,002.96 |
| 5 - B3-specific enrollment data | Applicant | 2,700 | 900 | 1 | 0.153 | 138 | $4.92 | $678.96 |
| | Staff [9] | 72 | 24 | 38 | 0.151 | 138 | $27.86 | $3,844.68 |
| 6 - B3 tracking of attendance in services for program group members | Staff [10] | 72 | 24 | 363 | 0.008 | 70 | $27.86 | $1,950.20 |
| 7 - Additional nFORM burden for non-Grantee site | Applicant | 450 | 150 | 1 | 0.250 | 38 | $4.92 | $186.96 |
| | Staff [11] | 12 | 4 | 1,969 | 0.0343 | 270 | $27.86 | $7,522.20 |
| 8 & 9 - Baseline surveys | Applicant | 2,842 | 947 | 1 | 0.642 | 608 | $4.92 | $2,991.36 |
| 10 & 11 - 6-month follow-up surveys | Applicant | 2,430 | 810 | 1 | 0.667 | 540 | $4.92 | $2,656.80 |
| 12 & 13 - Staff and management semi-structured interviews | Staff [12] | 228 | 76 | 2 | 1.5 | 228 | $27.86 | $6,352.08 |
| 14 & 15 - Staff surveys | Staff [13] | 240 | 80 | 1 | 0.667 | 53 | $27.86 | $1,476.58 |
| 16 - Participant focus groups | Applicant | 160 | 53 | 1 | 2.0 | 106 | $4.92 | $521.52 |
| 17 - Mother focus groups | Co-parent of Applicant | 80 | 27 | 1 | 1.0 | 27 | $4.92 | $132.84 |
| 18 & 19 - Mobile device surveys | Applicant | 1,728 | 576 | 3.28 | 0.1 | 189 | $4.92 | $929.88 |
| 20 - Post-session debrief for sites testing parenting intervention | Staff [14] | 36 | 12 | 104 | 0.083 | 104 | $27.86 | $2,897.44 |
| 21 - Consent for child video coding study | Staff | 12 | 4 | 13 | 0.083 | 4 | $27.86 | $111.44 |
| | Co-parent of Applicant | 150 | 50 | 1 | 0.083 | 4 | $4.92 | $19.68 |
| Data Collection Total | | | | | | 3,017 | | $40,467.62 |





Total Annual Cost


For cost calculations in Table A.4, we estimate the average hourly wage for staff at the participating organizations as the hourly wage of “social and community service managers” taken from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2012 ($27.86). The average hourly wage of program applicants is estimated from the average hourly earnings ($4.92) of study participants in the Building Strong Families Study (Wood et al., 2010). These average hourly earnings are lower than the minimum wage because many study participants were not working; we expect that this will also be the case for B3 study participants. The estimated total annual burden cost is $40,467.62.
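Each row’s total annual cost in Table A.4 is, in turn, the product of that row’s annual burden hours and the applicable average hourly wage (again a walk-through of the table’s arithmetic, not text from Appendix E). For the applicant row of Instrument 1:

Total annual cost = 111 hours × $4.92 per hour = $546.12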


A13. Cost Burden to Respondents or Record Keepers


There are no additional costs to respondents.


A14. Estimate of Cost to the Federal Government


The total cost for the data collection activities under this current request will be $16,606,471. This amount includes costs for new data collection activities under this request, including instrument development and design; data collection and analysis plans; data collection activities including surveys and qualitative research; data analysis; and dissemination of findings. It also includes labor costs for implementation and site monitoring activities.


Annual costs to the Federal government will be $5,535,490 for this proposed data collection.


A15. Change in Burden


This nonsubstantive change includes adjustments to burden assumptions based on experience in the field.


A16. Plan and Time Schedule for Information Collection, Tabulation and Publication


Analysis Plan


Estimating program impacts. Although the use of a randomized design will ensure that simple comparisons of experimental and control group means will yield unbiased estimates of program effects, the precision of the estimates will be enhanced by estimating multivariate regression models that control for factors at baseline that also affect the outcome measures. Such impacts are often referred to as “regression-adjusted” impacts. Examples of factors that may affect outcomes are the sample members’ age, number of children, and baseline measures of parenting, co-parenting, and employment outcomes.
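As an illustration only (not the study’s final specification), a regression-adjusted impact model of the kind described above could take the form

Y_i = \alpha + \beta T_i + \gamma' X_i + \varepsilon_i

where Y_i is an outcome for sample member i (for example, earnings), T_i equals 1 for program group members and 0 for control group members, X_i is a vector of baseline characteristics (such as age, number of children, and baseline parenting, co-parenting, and employment measures), and the estimate of beta is the regression-adjusted program impact.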

In drawing inferences about estimated impacts, standard statistical tests will be used to determine whether estimated effects are statistically significant: the two-group t-test for continuous variables (such as earnings) and dichotomous measures (such as employment or any child contact), and chi-square tests for categorical measures. Similar methods will be used to estimate the impact of engagement strategies on participation and completion rates.


Statistical analyses will be performed in SAS. Our sampling design does not require the use of survey weights. Parenting program impacts will be estimated for a pooled sample across the 3 sites testing a parenting intervention, and employment program impacts will be estimated for a pooled sample across the 3 sites testing an employment intervention. In addition, we will estimate impacts for key subgroups defined by baseline characteristics. Subgroup impact estimates will also use samples pooled across the 3 parenting sites or the 3 employment sites.
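A minimal sketch of this kind of pooled, regression-adjusted estimation is shown below, written in Python with the statsmodels library purely for illustration; the study team will conduct its analyses in SAS, and the file name, variable names, and subgroup flag below are hypothetical placeholders rather than B3 data elements.

```python
# Illustrative sketch of a pooled, regression-adjusted impact estimate.
# Assumes a hypothetical analysis file with one row per father and columns:
# earnings_6mo (outcome), treatment (1 = program group, 0 = control),
# site (one of the 3 sites testing the same intervention), and baseline covariates.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("b3_analysis_file.csv")  # hypothetical analysis file

# Pool the 3 sites testing the same intervention; site indicators absorb
# site-level differences, and baseline covariates improve precision.
model = smf.ols(
    "earnings_6mo ~ treatment + C(site) + age + num_children + baseline_earnings",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

# The coefficient on `treatment` is the regression-adjusted impact estimate;
# its t-statistic and p-value provide the two-group significance test.
print(model.params["treatment"], model.pvalues["treatment"])

# Subgroup impacts can be estimated by re-fitting the same model on the
# subgroup sample (or by adding an interaction term).
subgroup = df[df["has_prior_conviction"] == 1]  # hypothetical subgroup flag
sub_model = smf.ols(
    "earnings_6mo ~ treatment + C(site) + age + num_children + baseline_earnings",
    data=subgroup,
).fit(cov_type="HC1")
print(sub_model.params["treatment"])
```

The covariates and standard-error treatment in this sketch are illustrative; the actual models will follow the study’s analysis plan.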


Analyzing Process Study data. Notes from qualitative data collection (for example, interviews and focus groups) will be imported into Dedoose, MDRC’s mixed-methods analysis software. Notes will be coded using a pre-specified coding scheme that reflects the priorities of the research questions and what we hope to learn from the process study. Quantitative data (for example, surveys) will be analyzed with descriptive statistics in SAS. If warranted, quantitative data may also be imported into Dedoose for analysis. Where possible, we will analyze process study data for all parenting sites together and all employment sites together. Further, where informative, we will analyze data for individual sites within the intervention types.


Analyzing Video Observations. Videotaped father/child interaction will be analyzed through systematic coding of videos using validated coding schemes designed for parent-child interactions. The quantitative coded values will then be analyzed over the course of the parenting intervention to learn whether father/child interaction quality improves from the beginning to the end of the intervention period and to describe the patterns of any improvements. The quantitative codes will also be used descriptively to compare the quality of father/child interactions with that observed in other populations.


Time Schedule and Publication


Dates included in Table A.5 are based on OMB approval of this information collection request prior to August 2016. The timeline will be adjusted, if necessary.


Table A.5

Data Collection Timeline

| Data Collection Activity | Start Date | End Date |
| --- | --- | --- |
| Screening, enrollment, and baseline data (Attachments 1, 2, 3, 4, 8 & 9) | August 2016 | July 2018 |
| Program participant data (nFORM MIS) | August 2016 | February 2019 |
| Focus groups (Attachments 16 & 17) | January 2017 | June 2018 |
| Mobile data collection (Attachments 18 & 19) | August 2016 | October 2018 |
| Post-session debrief (Attachment 20) | August 2016 | October 2018 |
| Data collection from staff (semi-structured interviews and surveys) (Attachments 12-15) | January 2017 | February 2019 |
| 6-month follow-up survey data (Attachments 10 & 11) | January 2017 | March 2019 |
| Administrative records | January 2015 | June 2019 |

Data Analysis Timeline

| Data Analysis Activity | Start Date | End Date |
| --- | --- | --- |
| Data analysis for engagement study | October 2016 | December 2018 |
| Data analysis for process study | July 2017 | September 2019 |
| Data analysis for impact study | June 2018 | September 2019 |
| Data analysis for video observations | June 2018 | September 2019 |

Publications Timeline

| Publication | Expected Date |
| --- | --- |
| Brief introducing interventions and sites in study | 2016 |
| Study Design Report | 2017 |
| Brief/infographic describing baseline characteristics | 2017 |
| Report on process and impacts for engagement interventions | 2018 |
| Briefs summarizing process results for employment and parenting studies | 2018 |
| Two reports on process study and impacts for (1) parenting and (2) employment interventions | 2019 |
| Research brief summarizing B3 findings across interventions | 2019 |
| Two practitioner briefs for parenting and employment interventions | 2019 |
| Journal Article | 2019 |


A17. Reasons Not to Display OMB Expiration Date

All instruments will display the expiration date for OMB approval.

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.



References

Amato, Paul R., and Joan G. Gilbreth. 1999. “Nonresident Fathers and Children’s Well-Being: A Meta-Analysis.” Journal of Marriage and the Family 61: 557-573.

Andrews, D.A., and James Bonta. 2010. “Rehabilitating Criminal Justice Policy and Practice.” Psychology, Public Policy, and Law 16, 1: 39-55.


Avellar, Sarah, et al. 2011. Catalog of research: Programs for low-income fathers. Mathematica Policy Research.


Bauldry, Shawn, Danijela Korom-Djakovic, Wendy S. McClanahan, Jennifer McMaken, and Lauren J. Kotloff. 2009. Mentoring Formerly Incarcerated Adults: Insights from the Ready4 Work Reentry Initiative. Philadelphia, PA: Public/Private Ventures.


Berlin, M., Mohadjer, L., Waksberg, J., Kolstad, A., Kirsch, I., Rock, D., & Yamamoto, K. 1992. “An experiment in monetary incentives.” Proceedings of the Survey Research Section of the American Statistical Association, 393-398.


Bloom, Dan. 2014. “Framing the Future of Economic Security Evaluation Research for the Fatherhood Research and Practice Network.” Fatherhood Research & Practice Network.


Cabrera, Natasha J., Jacqueline D. Shannon, and Catherine Tamis-LeMonda. 2007. “Fathers’ Influence on Their Children’s Cognitive and Emotional Development: From Toddlers to Pre-K.” Applied Developmental Science 11, 4: 208-213.


Cancian, Maria, Slack, Kristen Shook, Yang, Mi Youn. 2010. “The Effect of Family Income on Risk of Child Maltreatment.” Institute for Research on Poverty Discussion Paper 1385-10.


Carlson, Marcia J. and Katherine Magnuson. 2011. “Low-Income Fathers’ Influence on Children.” The Annals of the American Academy of Political and Social Science 635: 95-116.


Cowan, Carolyn P., Philip A. Cowan, Nancy Cohen, Marsha K. Pruett, and Kyle Pruett. 2008. “Supporting Fathers’ Engagement with Their Kids.” Pages 44-80 in Jill Duerr Berrick and Neil Gilbert (eds.), Raising Children: Emerging Needs, Modern Risks, and Social Responses. New York: Oxford University Press.


James, T. 1997. “Results of the Wave 1 Incentive Experiment in the 1996 Survey of Income and Program Participation.” Proceedings of the Survey Research Section of the American Statistical Association.


King, Valerie, and Juliana M. Sobolewski. 2006. “Nonresident Fathers’ Contributions to Adolescent Wellbeing.” Journal of Marriage and Family 68: 537-557.


Knox, V., Cowan, P. A., Cowan, C. P., & Bildner, E. (2011). “Policies That Strengthen Fatherhood and Family Relationships: What Do We Know and What Do We Need to Know?” The ANNALS of the American Academy of Political and Social Science, 635(1): 216–239.


Ly, Kim, Nina Mažar, Min Zhao and Dilip Soman. 2013. A Practitioner’s Guide to Nudging. Toronto, Ontario: University of Toronto, Rotman School of Management.


Mack, S., Huggins, V., Keathley, D., & Sudukchi, M. 1998. “Do monetary incentives improve response rates in the Survey of Income and Program Participation?” Proceedings of the Survey Research Section of the American Statistical Association.


Martinson, Karin and Demetra Nightingale. 2008. “Ten Key Findings from Responsible Fatherhood Initiatives.” Washington: The Urban Institute.


Miller, Cynthia, and Virginia Knox. 2001. The Challenge of Helping Low-Income Fathers Support Their Children: Final Lessons from Parents’ Fair Share. New York: MDRC.


Moffitt, R. 2004. The Three-City Study Incentive Experiment: Results from the First Two Waves. Retrieved from http://www.jhu.edu/~welfare.


Mosher, W.D., Pratt, W.F., and Duffer, A.P. 1994. “CAPI, event histories, and incentives in the NSFG cycle 5 pretest.” Proceedings of the Survey Research Methods Section of the American Statistical Association, 1, 59-63.


Richburg-Hayes, Lashawn, Caitlin Anzelone, Nadine Dechausay, Saugato Datta, Alexandra Fiorillo, Louis Potok, Matthew Darling, and John Balz. 2014. Behavioral Economics and Social Policy: Designing Innovative Solutions for Programs Supported by the Administration for Children and Families. OPRE Report 2014-16a. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.


Sanders, M., & Kirkman, E. (2014). I've booked you a place. Good luck. A field experiment applying behavioural science to improve attendance at high-impact recruitment events (No. 13/334). Department of Economics, University of Bristol, UK.


Singer, E., J. Van Hoewyk, N. Gebler, T. Raghunathan, and K. McGonagle. 1999. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics 15, 2: 217-230.


Singer, E., J. Van Hoewyk, and M.P. Maher. 1998. “Does Payment of Incentives Create Expectation Effects?” Public Opinion Quarterly 62: 152-164.

Thaler, Richard H., and Cass R. Sunstein. 2008. Nudge: Improving Decisions about Health, Wealth, and Happiness. New Haven, CT: Yale University Press.

1. Fathers who are entering one of the fatherhood programs that is participating in the B3 study will be recruited into the study. Some fathers who are already enrolled in those programs before the study starts may also be recruited for B3.

2. Staff members in programs that are federal Responsible Fatherhood Grantees have to enter information about fathers and their program participation into nFORM (the Information, Family Outcomes, Reporting, and Management System), an MIS being developed by the Fatherhood and Marriage Local Evaluation and Cross-site (FaMLE Cross-site) Project (0970-0460).

3. The B3 baseline and follow-up surveys ask about criminal justice involvement and substance use. In addition, the study will be collecting criminal justice administrative records in the three sites testing the employment intervention.

4. The personally identifiable information to be collected, along with a description of how these data will be used, is described in Section A11.

5. This burden is categorized as record-keeping burden.

6. This burden is categorized as record-keeping burden.

7. This burden is categorized as record-keeping burden.

8. This burden is categorized as record-keeping burden.

9. This burden is categorized as record-keeping burden.

10. This burden is categorized as record-keeping burden.

11. This burden is categorized as record-keeping burden.

12. This burden is categorized as record-keeping burden.

13. This burden is categorized as record-keeping burden.

14. This burden is categorized as record-keeping burden.




