
Fatherhood and Marriage Local Evaluation and Cross-Site Data Collection

OMB: 0970-0460


U.S. Department of Health and Human Services


Administration for Children and Families


Office of Planning, Research, and Evaluation


330 C Street, SW

Washington DC 20201


Federal Project Officer:

Seth Chamberlain



OMB Supporting Statement for the Fatherhood and Marriage Local Evaluation and Cross-Site Data Collection

Part A: Justification

June 2018











LIST OF ATTACHMENTS



ATTACHMENT A: LEGISLATIVE AUTHORITY


ATTACHMENT B: 60-DAY FEDERAL REGISTER NOTICE


ATTACHMENT C: CONSENT STATEMENT FOR HM AND RF FOCUS GROUP PROGRAM PARTICIPANTS


ATTACHMENT D: GUIDANCE FOR INSTITUTIONAL REVIEW BOARD (IRB) APPROVAL OF HEALTHY MARRIAGE AND RESPONSIBLE FATHERHOOD GRANTEE ACTIVITIES


ATTACHMENT E: CONFIDENTIALITY PLEDGE


ATTACHMENT F: ASSUMPTIONS FOR CALCULATION OF BURDEN ESTIMATES IN TABLE A.4


ATTACHMENT G: FOCUS GROUP RECRUITMENT LETTER AND PHONE SCRIPT

A1. Circumstances Making the Data Collection Necessary

This information collection request (ICR) is for renewal of clearance for OMB package # 0970-0460, which was originally approved in July 2015 to collect information for the Fatherhood and Marriage Local Evaluation (FaMLE) and Cross-Site Project. The FaMLE Cross-Site Project gathers information from the current round of healthy marriage (HM) and responsible fatherhood (RF) grantees; the funding is authorized under Sec. 811 (b) Healthy Marriage Promotion and Promoting Responsible Fatherhood Grants of the Claims Resolution Act of 2010, Pub. L. No. 111-291, 124 Stat. 3064 (Dec. 8, 2010). A copy of the legislative authority is included as Attachment A. The HM and RF grants were awarded in fall 2015.

The project is being undertaken by the U.S. Department of Health and Human Services, Administration for Children and Families (ACF), and is being implemented by Mathematica Policy Research.

A. Background

Healthy marriage and responsible fatherhood (HMRF) programs have been undergoing a transformation in the past few decades. At first a new approach for serving vulnerable families, such programs have become an established presence in many communities, with connections to other agencies and a growing number of families served. Responsible Fatherhood (RF) programs began in the 1990s with such efforts as the Young Unwed Parents program, Parents’ Fair Share, and Partners for Fragile Families. In the early 2000s, ACF announced the Healthy Marriage Initiative, which provided funding to federal grantees through existing legislative authorities to add marriage education to their service offerings. This effort coincided with findings from the longitudinal Fragile Families and Child Well-being Study that suggested the period around a child’s birth could be an opportunity for intervening with unmarried couples, who typically were romantically involved and interested in marriage (McLanahan et al. 2001).

The Deficit Reduction Act of 2005 created the HMRF grant program, which authorized $150 million per year to support program activities aimed at promoting and sustaining healthy marriages, fostering economic stability, and promoting responsible parenting. The Claims Resolution Act of 2010 reauthorized this grant program. In September 2015, a third round of grants was awarded. The Office of Family Assistance (OFA) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) administers the HMRF grant program; to date, 437 HMRF grants have been awarded (some agencies have received successive grants).

As the reach and variety of HMRF programs grow, so too does interest in their operations and effectiveness. Some research results have accumulated—largely from a few multisite, federally funded evaluations. For example:

  • The Building Strong Families (BSF) and Supporting Healthy Marriage (SHM) evaluations employed rigorous research designs to evaluate HM programs aimed at unmarried parenting couples (BSF) and married parents (SHM). These evaluations focused on a small number of programs with the capacity to recruit large samples and meet other "evaluability" criteria.

  • Also using a rigorous experimental design, the Parents and Children Together (PACT) Evaluation is assessing the implementation and effectiveness of four RF and two HM programs chosen from among the 2011 cohort of grantees.1

  • Two rigorous experimental designs—the Building Bridges and Bonds (B3) and Strengthening Relationship Education and Marriage Services (STREAMS) evaluations—are assessing the implementation and effectiveness of eleven programs chosen from among the 2015 cohort of grantees.

Strong evidence is still limited, however, and is derived mostly from a narrow range of programs selected more for their suitability for such studies than for their representativeness of HMRF programs in general. According to an extensive review of HM and RF program evaluations conducted by Mathematica Policy Research (Avellar 2011, 2012), local evaluations carried out on grant-funded HMRF programs often are small, and most do not use rigorous methods. About 20 percent of the RF and HM programs that have been the subject of research were assessed with designs moderately or well suited to detecting program effects. Most other studies reported participant outcomes but, because of their evaluation designs, could not determine whether the programs or other factors caused the observed changes or differences (Avellar 2011, 2012). Policymakers need to know more about what works to guide future funding decisions, and program operators need to understand what makes for an effective program to guide future programming.

B. Overview of the project

ACF contracted with Mathematica Policy Research to conduct the FaMLE Cross-Site project, with the dual goals of supporting quality and consistent collection of performance measures data, and fostering strong local evaluations by the 2015 cohort of HMRF grantees. More specifically, Mathematica: (1) collects information on the design and implementation of grantees’ programs; and (2) has developed and maintains data collection tools that grantees use to collect information on program participation and participant outcomes for use in reporting performance and conducting evaluations.

The project relies on two complementary, ongoing sources of data collection: (1) data collection by the contractor for cross-site analysis of program design and implementation, hereafter referred to as DCI (Data Collected by the Contractor Itself); and (2) data collection by the grantees themselves for performance reporting and cross-site analysis, hereafter referred to as DCS (Data Collected for Cross-Site Analyses). OMB renewal will allow these sources of data collection to continue for another three years.

1. DCI

The DCI effort focuses on program design and implementation. Information for this effort is collected via document review, a series of stakeholder interviews with grantee staff and partner organizations, and focus groups with clients. All document reviews rely on materials readily available to ACF and impose no burden on participants.

DCI occurs in four stages (Table A.1). The first two stages focus on a broad group of grantees and document program design, while stages three and four, focused on a narrower group of grantees, will document program implementation. Stage one consisted of document reviews examining the grant applications of all (approximately 90) HMRF grantees; telephone interviews in stage two will probe a subset of approximately 60 grantees for additional detail on program design. In stage three, further document review of performance reports will focus on a subset of about 20 grantees within the group of 60 contacted during stage two. The documents for review at this stage will be those that grantees must submit to ACF as part of performance monitoring. In stage four, telephone interviews and site visits will probe for additional detail on program implementation among these 20 grantees, adding depth to the implementation information gathered during stage three.

Table A.1. Overview of stages with increasingly focused subsets of grantees

Stage | Purpose | Data Source | Number of programs
1 | Collect basic program design information | Grant applications | 90
2 | Confirm and further explore program design information with a subset of well-designed programs | Telephone interviews | 60
3 | Review basic program implementation information | Semiannual reports | 20
4 | Confirm and further explore program implementation | Telephone interviews, site visits, client focus groups | 20



2. DCS

The DCS effort focuses on performance data collected by the current round of HMRF grantees, and used by ACF to monitor program performance and to conduct cross-site analyses. As a condition of their grant award, all HMRF grantees are required to collect information on:

  1. Program applicant characteristics

  2. Program operations

  3. Enrollment and participation in program services

  4. Participant outcomes, measured pre- and post-program.

Grantees provide these data using a management information system (MIS) called Information, Family Outcomes, Reporting, and Management (nFORM), which was developed and is managed on behalf of ACF by the contractor for this project.

C. Data Collection Activities Requiring Clearance

This ICR is for a renewal of clearance for seven data collection activities and two data reporting activities:

  • DCI: Data Collection. Three DCI activities have been or will be carried out by contractor staff to collect information on the design and implementation of grantees’ programs.

  • DCS: Data Collection. There are four DCS data collection activities, which are conducted by the grantees themselves to document program applicant characteristics, program operations, services received, and outcomes for program participants.

  • DCS: Performance Reporting. Grantees are required to submit two semi-annual reports and two quarterly reports to OFA each grant year; these reports draw upon a subset of the DCS data described above.

These data collection and reporting activities are described along with the instruments that are used for each in Section A.2.



A2. Purpose and Use of the Information Collection

The information obtained through the FaMLE Cross-Site project is critical to understanding the broad array of RF and HM programs to be funded—the services provided, the context in which they operate, the nature and extent of participation, and the outcomes for program participants. Information will be used to report performance to OFA and for cross-site analyses. The goal is to better understand the design, operations, and benefits of HMRF programs, thereby informing decisions about future government investments in HMRF programming.

1. DCI: Data Collection

The information collected by the contractor for DCI will help clarify the processes and contextual factors associated with program design and implementation and will fill gaps in the information about these topics gathered via document review. To achieve these purposes, we propose collecting the following information.

  • Program design. A topic guide will be used to conduct semi-structured telephone interviews with lead program staff at the approximately 60 grantees selected for stage two DCI data collection. The semi-structured Instrument DCI-1, Grantee Staff Topic Guide on Program Design, will focus on information about program design, including clarifying information from the grant application, perceived strengths and weaknesses, and the array of services grantees intend to offer. This will help the project clearly document aspects of strong program design and analyze these across sites to provide information for ACF and other programs to use when designing future programs. The topic guide will be tailored based on the grantees selected for this stage of data collection.

  • Implementation. A topic guide will be used to conduct semi-structured interviews, by phone or in person, with a range of program staff from the approximately 20 grantees selected in stage three to learn about program implementation. The semi-structured Instrument DCI-2, Grantee Staff Topic Guide on Implementation, will focus on information about program implementation. The interviews will focus on the timing, duration, and frequency of program activities and services the grantee delivered; enrollment and participation; characteristics of clients who enroll and participate; actual staffing; and successes and challenges. The interview data will allow the project to gain a deeper understanding of how a subset of grantees with strong program designs actually implemented and operated their programs. This information will help ACF and other programs identify practical examples, promising practices, and lessons learned when implementing future programs. The topic guide will be tailored based on the grantees selected for this stage of data collection.

  • Program participants. A topic guide will be used to conduct four (in-person) focus groups with participants at each of the 20 stage-three grantees to learn about their program experiences. Instrument DCI-3, the Program Participant Focus Group Topic Guide, will be used to explore and document program participants’ perspectives on their motivation for enrolling in the program, and the availability, quality, and value of program services. Of particular interest will be participants’ level of satisfaction with the program and their assessment of the knowledge and skills gained as a result of program participation. This information will help ACF and programs learn which aspects of HMRF programs are of greatest interest and value to clients. The topic guide will be tailored based on the grantees selected for this stage of data collection.



2. DCS: Data Collection

The information collected by the HMRF grantees for DCS is and will be used: (1) in analyses of program design and implementation; (2) in cross-site analyses of outcomes across all grantees and impacts among a subset of grantees conducting experimental or quasi-experimental evaluations; and (3) for reporting program performance to OFA. DCS data may also be used for special topics reports as requested by ACF. The specific use and purpose of each of the DCS information collection activities is described below.2

  • Applicant characteristics. All HMRF grantees collect and enter information about individuals and couples applying to the program. HMRF grantees collect information on demographic characteristics (e.g., gender, age); financial well-being (e.g., employment status); family status (e.g., marital and parenting status); health and well-being (e.g., psychological distress); and how the program applicant heard about the program and reasons for enrolling. This information is collected from all program applicants at program intake using Applicant Characteristics, Instrument DCS-1. Information from DCS-1 is used in cross-site descriptive analyses of who applies to and who eventually participates in HMRF programs, and for reporting participant characteristics to OFA in the semi-annual reports. There are 24 HMRF grantees conducting experimental or strong quasi-experimental local evaluations. Applicant characteristics may also be used by local evaluators and for the cross-site analysis of impacts as covariates in a regression model to increase the precision of the impact estimates, to check that the characteristics of program and control group members are on average similar at baseline, and to predict program participation in analyses of the impacts among those who actually participate in the program(s).

  • Program operations. All HMRF grantees collect and enter data on: (1) strategies used to market to and recruit individuals and couples into their programs (such as the amount and types of mass marketing strategies; recruitment methods; and the number of FTE staff dedicated to marketing, outreach, and recruitment); (2) practices to monitor quality (such as staff training, staff supervision, and program observations); (3) staff qualifications (including the proportion of staff with various levels of educational attainment, training, and years of experience); and (4) implementation challenges (such as staff turnover and recruitment challenges) using Program Operations, Instrument DCS-2. Grantees enter this information quarterly, and it is used for reporting performance to OFA semi-annually and in cross-site descriptive analyses of program design and implementation.

  • Service receipt. All HMRF grantees collect and enter data on program services offered and on individuals’ and couples’ participation in these services using Service Receipt, Instrument DCS-3 (screenshots showing the information grantees enter on service receipt). Attendance in program activities is typically documented immediately (for example, from workshop sign-in sheets) and entered into nFORM at least once a week. HM grantees track participation at both the individual and couple levels. This information is used for reporting performance to OFA in the quarterly and semi-annual reports and in cross-site descriptive analyses of program implementation.

  • Self-Administered Questionnaires (SAQs) upon program entry (pre-test) and program exit (post-test). All HMRF grantees ask participants to complete an SAQ at program entry (at the first workshop attended) and at program exit (the last core program activity), or one month post-exit if the program is structured to last less than one month. HMRF grantees collect information in five outcome domains: (1) parenting, co-parenting, and fatherhood; (2) economic stability; (3) healthy marriage and relationships; (4) personal development; and (5) program perceptions. Within these five domains, there are 21 outcome constructs, as shown in Table A.2. Twelve of these constructs are common to both RF and HM grantees given their common program components, another four constructs (pertaining to healthy relationships and marriage) are specific to HM grantees, and another five constructs (pertaining to parenting, co-parenting, and fatherhood) are unique to RF grantees.

The English versions of the pre- and post-test SAQs are included in this ICR for renewal of clearance as Instruments DCS-4HM and DCS-4RF. There are two versions of the pre- and post-program SAQs for HM grantees: one for adult populations (DCS-4HM.1 and DCS-4HM.2) and one for youth populations (DCS-4HM.3 and DCS-4HM.4). There are also two versions of the pre- and post-test SAQs for RF grantees: one for fathers residing in the community (DCS-4RF.1 and DCS-4RF.2) and one for incarcerated fathers (DCS-4RF.3 and DCS-4RF.4). These instruments are also available in Spanish.

Outcomes data collected through the pre- and post-tests (Instruments DCS-4HM and DCS-4RF) are used for reporting performance to OFA semi-annually and in cross-site descriptive analyses. For the 24 HMRF grantees conducting experimental or strong quasi-experimental evaluations, outcomes data for both program and control/comparison groups may be used by local evaluators and in the cross-site analysis to calculate the average impacts of program participation. Measuring many of the same variables at pre-test and post-test will increase the precision of estimated impacts. Outcomes data may also be used for special topics reports as requested by ACF.

Table A.2. Unique and Overlapping Constructs for RF and HM Grantees, by Outcome Domain


Outcome Domain | HM Grantees Only | Both HM and RF Grantees | RF Grantees Only
Parenting/Co-parenting | None | Parenting attitudes; Parenting skills/behavior and efficacy; Parenting alliance; Amount of contact with child | Engagement with child; Attempts to connect with child; Responsibility for child’s financial support; Knowledge of child support
Economic Stability | None | Ability to manage money; Human capital development; Employment; Job stability | None
Healthy Marriage/Relationship | Relationship quality; Infidelity; Attitudes toward sex (youth only) | Relationship stability; Attitudes toward marriage/relationships; Communication and conflict management | None
Personal Development | None | Psychological well-being; Social support (HM youth only); Involvement with criminal justice system | None
Program Perceptions | None | Perceived helpfulness; Other thoughts | None


3. DCS: Performance Reporting

Grantees are required to report on their performance using the data they collect through the DCS task. A description of and the purpose of each of the two kinds of performance reports is provided below.

  • Semi-annual performance progress report (PPR, Instruments DCS-5HM and DCS-5RF). The semi-annual PPR includes quantitative descriptive information on program applicants and quantitative information on a subset of DCS performance measures pertaining to program operations (staff training and supervision; marketing, outreach, and recruitment; and implementation challenges); program enrollment; and program participation. The semi-annual report also includes narrative descriptions of the grantee’s major activities and accomplishments; implementation challenges and steps taken to address these challenges; program successes; and emerging promising practices. The template for use by HM grantees is included in this ICR as Instrument DCS-5HM, and the template for use by RF grantees is included in this ICR as Instrument DCS-5RF.

Grantees are required by ACF’s Office of Grants Management (OGM) to submit a PPR twice during each grant year (in October and April), reporting on the programmatic activities conducted by the grantee in the prior six months and activities planned for the next six months. This report meets OGM reporting requirements and is used by grantees to self-monitor semi-annual performance and by the Office of Family Assistance (OFA) to monitor and manage these grants.

  • Quarterly Performance Report (QPR, Instruments DCS-6HM and DCS-6RF). Grantees also report quarterly on a subset of the quantitative performance measures included in the semi-annual PPR: staff training and supervision; program enrollment; program participation; and implementation challenges. Grantees are required to submit these two quarterly reports (in January and July) to provide an interim view of performance. OFA and the grantees use these quarterly reports to assess progress and identify areas for improvement.

The DCS performance reporting is facilitated by nFORM, which automatically pre-populates the PPR or QPR with required performance measures from the data grantees collect and enter into nFORM.

A3. Use of Improved Information Technology and Burden Reduction

The FaMLE Cross-Site Evaluation uses the following technology to collect information for the DCI and DCS study components.

1. DCI: Data Collection

Audio recording, with respondent permission, will be used to facilitate interviewer-participant dialogue and interaction without the distraction of extensive note taking and to increase the accuracy with which points raised during the focus group discussions with clients are documented.

2. DCS: Data Collection

As part of the FaMLE Cross-Site project, the contractor has developed and operates an MIS, called nFORM, which grantees use to enter performance measures data (applicant characteristics, program operations, enrollment and participation, and participant outcomes). The web-based MIS has a user-friendly interface accessible to authorized users from any computer with internet access, allowing grantees to enter data without purchasing or installing additional software or changing the configuration of their computers. All data are housed on secure servers behind the contractor’s firewall, thereby maintaining data security. Each grantee can view and report only the data for its own program. The nFORM system is an adaptation of a similar MIS that the contractor designed for HMRF grantees participating in the PACT Evaluation. This web-based MIS reduces grantees’ reporting burden by providing a convenient and simple method for submitting data electronically.

Program participants use computers or tablets to self-administer the pre- and post-tests in nFORM. This method presents several advantages over interviewer-administered surveys. It ensures greater privacy, and respondents will be more likely to avoid socially desirable responses, particularly with sensitive questions (Turner et al. 1998; Tourangeau and Smith 1996). It also reduces burden for grantee staff who would otherwise need to administer the surveys. To address possible literacy limitations, respondents have the option to wear headphones and listen to a recording of the questions, known as Audio Computer-Assisted Self-Interview (ACASI).

3. DCS: Performance Reporting

nFORM allows grantees to generate the required quantitative performance measures for quarterly and semi-annual reporting to OFA at the touch of a button, thereby minimizing grantee burden while maximizing the cross-site consistency and quality of performance data.

A4. Efforts to Identify Duplication and Use of Similar Information

There are no other sources of information that would allow us to assess the design, implementation, and outcomes of all ACF-funded RF and HM programs for the current round of grant funding. We use measures that have been used successfully in prior studies involving similar populations and programs. No superfluous or unnecessary information is requested of program staff or participants from the current round of HMRF grants, and none of the instruments asks for information that is available elsewhere or that can be reliably obtained through other sources.

A5. Impact on Small Businesses or Other Small Entities

The potential exists for data collection activities to affect small entities associated with the grantee. HMRF grantee partners and direct service providers may be included as part of DCI interviews. Additionally, some HMRF grantees conduct local evaluations led by local evaluators; if so, they may task the local evaluator with the collection of some or all of the DCS performance measures data. Current data collection efforts are designed to minimize the burden on all organizations involved, including small businesses and entities, by collecting only critical information.

A6. Consequences of Not Collecting Information or Collecting Information Less Frequently

The purpose of each information collection instrument included in this submission is described in Item A2, above. Not collecting information using these instruments would limit the government’s ability to document the performance of its grantees and to assess the extent to which these federal grants are successful in achieving their purpose. Furthermore, the FaMLE Cross-Site Project provides a valuable opportunity for OFA, practitioners, and researchers to gain empirical knowledge about the design and implementation of a broad range of HMRF programs and the characteristics of and outcomes for program participants.

Specifically, without the information collected through grantee staff interviews and participant focus groups, the FaMLE Cross-site Project would have to rely entirely on implementation information reported by a single source: the HMRF grantee semi-annual progress reports. Thus, the study would lack the broader perspectives of other staff and program participants, and we would be severely hampered in our understanding of how HMRF grantees design and implement their programs, critical program challenges and successes, and what leads applicants to participate fully in program services.

In addition, without collecting information on applicant characteristics (Instrument DCS-1), program operations (Instrument DCS-2), service receipt (Instrument DCS-3), and participant outcomes (collected through SAQs, Instrument DCS-4), HMRF grantees would not be able to report on the required performance measures, and the cross-site evaluation would be unable to link participant outcomes to various implementation factors (such as levels or combinations of specific services received).

If service receipt data were collected less frequently, providers would have to store service data or try to recall it weeks or months after delivery. Less frequent data collection would also reduce our ability to identify and address data quality issues, such as missing data and data entry errors, in a timely way. Finally, if participant outcomes were not collected at both program entry and program exit (or one month after program exit), we would not be able to assess changes in outcomes pre- and post-program participation, which is required for grantees to report performance to OFA.

A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This request fully complies with the general information collection guidelines of 5 CFR 1320.5(d)(2). No special circumstances apply to the ongoing data collection.

A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

In accordance with the Paperwork Reduction Act of 1995, the public was given an opportunity to review and comment through the 60-day Federal Register Notice, published on February 8, 2018 (Vol. 83, No. 27, pp. 5631–5633). A copy of this notice is attached as Attachment B. The notice provided 60 days for public comment; no comments were received from the public.

A9. Explanation of Any Gifts to Respondents

As previously approved, in stage four of DCI data collection we propose to provide a $25 gift card to each focus group participant in appreciation of their participation. The gift card is intended to cover incidental expenses, such as child care and transportation, and to increase the likelihood of participation. We expect the focus groups to be approximately 90 minutes long.

A10. Assurance of Privacy Provided to Respondents

Respondents to Instrument DCI-3 (Program Participant Focus Group Topic Guide) will be informed that the identifying information they provide will be kept private. Participants will be given a hard copy of the consent statement for their records (Attachment C contains the consent statement for RF and HM program participants). All consent forms that are given to focus group participants will include assurances that the research team will protect their privacy to the fullest extent possible under the law. At the beginning of each focus group, the data collectors will state that the information provided by the respondent will be kept private and that the results of the study will be presented in aggregate form only. All focus group respondents will be provided with the informed consent form before the interviews are conducted. The consent form will explain the purpose of the evaluation, the duration of the interviews, and any benefits, risks, or discomfort involved.

For DCS data collection, grantees are responsible for obtaining any necessary IRB approvals for their data collection. Each grantee executed a data sharing and user agreement with Mathematica to document data security and data sharing requirements in connection with the grantee’s use of nFORM.

In all data collection and performance reporting efforts, ACF has taken the following specific measures to protect respondents’ privacy:

  • Adopt strict security measures and web security best practices to protect data collected through the project MIS, called nFORM. Data entered into nFORM are housed on secure servers that conform to the requirements of the HHS Information Security Program Policy. This MIS employs strict security measures and web security best practices to ensure the data are submitted, stored, maintained, and disseminated securely and safely. Strict security measures are employed to protect the privacy of participant information stored in the system including data authentication, monitoring, auditing, and encryption. Specific security procedures include, but are not limited to:

  • nFORM received an Authority to Operate (ATO) from HHS. The ATO will be renewed during the summer of 2019 per the HHS security policy.

  • All data are encrypted in transit (using TLS protocol backward compatible to SSL).

  • Data are encrypted at rest and reside behind firewalls.

  • nFORM users can access the system only within the scope of their assigned roles and responsibilities:

    • Only authorized contractor staff have access to the securely-held individual-level data.

    • Other FaMLE Cross-Site Project staff have access only to auto-generated reports that provide aggregated information only.

    • Only authorized staff at each grantee are able to view all individual-level data for their participants. Other staff have access to auto-generated reports that provide aggregated information only.

  • Authorized research staff are assigned a password only with permission from the study director. Each user has a unique user id/password combination.

  • Security procedures are integrated into the design, implementation, and day-to-day operations of the portal.

  • To further ensure data security, project personnel are required to adhere to strict standards, receive periodic security training, and sign security agreements as a condition of employment.

  • Standard procedures have been developed and implemented in nFORM for assigning identification numbers to all participant-level data. Case- and individual-level ID numbers are content-free; for example, they do not include special codes indicating enrollment dates, participant location, gender, age, or other characteristics. Data extracts from nFORM, which may not be secured, include these IDs and not PII.

Regarding this ICR, grantees and local evaluators collect data from participants, including PII, such as name and contact information. The data are stored in nFORM, which is hosted and maintained by Mathematica. Each grantee is required to review the rules of their governing body to determine whether IRB approval for their data collection is necessary. If applicable, the IRBs are responsible for reviewing and approving the procedures that grantees have in place for protecting PII. Attachment D is the guidance provided to grantees regarding their responsibilities related to IRB approval. Mathematica has secured a memorandum of understanding with each grantee to share data; ACF has provided guidelines for grantees and local evaluators for protecting PII. Only contractor staff responsible for ensuring data quality have access to PII; research staff have access only to the de-identified data. Limiting the number of contractor staff with access to PII reduces the risk of disclosure.

  • Train cross-site evaluation interviewers in privacy procedures. All site visit interviewers will be knowledgeable about privacy procedures and will be prepared to describe them in detail or to answer any related questions raised by respondents. During the introduction to each interview, site visit informants will be told that none of the information they provide will be used for monitoring or accountability purposes and that the results of the study will be presented in aggregate form only.

In addition to these study-specific procedures, the contractor has extensive corporate administrative and security systems to prevent the unauthorized release of personal records, including state-of-the-art hardware and software for encryption that meets federal standards, other methods of data protection (e.g., requirements for regular password updating), and physical security that includes limited key card access and locked data storage areas.

Finally, the contractor requires every employee to sign a pledge to protect the privacy of data and respondent identity, and breaking that pledge is grounds for immediate dismissal and possible legal action. A copy of that pledge is provided as Attachment E.

A11. Justification for Sensitive Questions

There are no sensitive questions in the protocols for the DCI data collection.

For the DCS data collection, some of the items that grantees are required to collect may be considered sensitive questions. Some sensitive questions are necessary when a key project goal is the development of performance measures and when the programs involved are designed to affect personal relationships and employment. Grantees are responsible for obtaining any necessary IRB approvals for their data collection, including the necessary consent procedures. Table A.3 lists these topics and the justifications for including them.

Table A.3. Sensitive Topics and Justification for Inclusion

Sensitive topic: Attitudes about sex
Relevant instrument(s): DCS-4HM (youth questionnaire only)
Justification: Healthy marriage and relationship programs for youth in high school aim to prevent nonmarital childbearing by educating youth on the disadvantages that most children face when they are born outside of marriage. Attitudes and intentions regarding engaging in sex are strong predictors of subsequent behavior (Buhi and Goodson 2007); in particular, sexually active teens are more likely to cohabit as young adults (Raley et al. 2007). These questions were adapted from the Toledo Adolescent Relationships Study, the PREP evaluation, and Connections: Dating and Emotions (Kay Reed, Dibble Institute).

Sensitive topic: Infidelity
Relevant instrument(s): DCS-4HM
Justification: Infidelity has been found to be a major obstacle to marriage for unwed parents (Edin and Kefalas 2005). The curricula used by the HMRF programs address this in different ways, including discussing the importance of fidelity and trust in building healthy relationships and marriages. Several large surveys have included similar questions concerning infidelity, such as the Study of Marital Instability Over the Life Course, the Louisiana Fragile Families Study, and the Baseline Survey of Family Experiences and Attitudes in Florida. These questions were also used in the Building Strong Families 15- and 36-month follow-up surveys and had low nonresponse rates (Wood et al. 2010).

Sensitive topic: Psychological distress
Relevant instrument(s): DCS-4HM, DCS-4RF
Justification: Psychological distress is likely to affect key HMRF goals (improved parenting, employment, and relationship quality) and thus may be an important mediator of program outcomes. Symptoms of parental depression and anxiety have been shown to have adverse consequences for child outcomes (Downey and Coyne 1990; Gelfand and Teti 1990). To measure psychological distress, we use the K-6, a brief but highly reliable and valid measure frequently used in government health surveys in the U.S. and Canada and by the World Health Organization (Kessler et al. 2002).

Sensitive topic: Harsh discipline
Relevant instrument(s): DCS-4HM, DCS-4RF
Justification: A measure of harsh disciplinary practices will enable us to determine whether the HMRF programs’ emphasis on conflict management and parenting skills leads to a reduction in the use of harsh discipline techniques among participants. These items were adapted from the Supporting Healthy Marriage evaluation, where they were successfully used with a population of low-income married couples with children (Lundquist et al. 2014).

Sensitive topic: Criminal history
Relevant instrument(s): DCS-4HM, DCS-4RF
Justification: Recent research suggests that a history of incarceration and involvement with the criminal justice system may be fairly common among men in the HMRF target population (Zaveri et al. 2014; Pearson et al. 2011). Incarceration has major negative effects on child and family well-being, including reducing the financial and other support adults can provide to their partners, children, and families; documenting its incidence is therefore important. Further, because fatherhood programs encourage men to become more responsible, we want to explore whether the programs had any effect on criminal involvement. Similar questions have been included in other large national studies, such as the Fragile Families and Child Wellbeing Study, the National Job Corps Study, the Building Strong Families Study, and the Parents and Children Together evaluation. In the Building Strong Families survey (the most recent completed study cited), nonresponse was less than 1 percent for these items (Wood et al. 2010).

Sensitive topic: Income
Relevant instrument(s): DCS-1
Justification: A key goal of RF and some HM programs is to improve participants’ economic stability. The outcomes of an individual who is employed when he or she enters the program may be very different from those of an individual who enters without employment. The applicant characteristics survey asks whether the respondent is currently working and, if so, how much he or she has earned in the past 30 days. Questions on earnings are asked in many surveys, including the Building Strong Families survey (Wood et al. 2010); in that survey, only 0.4 percent of mothers and 0.1 percent of fathers did not respond to the earnings questions.



A12. Estimates of Annualized Burden Hours and Costs

Table A.4 provides the estimated annual reporting burden for DCI (contractor data collection) and DCS (grantee data collection). Estimates are broken out separately for HMRF program applicants and participants and for HMRF grantee staff. The total annual burden across all respondents (including application intake, participant pre- and post-tests, participant focus groups, and grantee staff activities) is estimated to be 132,227 hours, and the associated annual burden cost is estimated to be $1,377,858.

For all cost calculations, we estimate the average hourly wage for program directors and managers to be the average hourly wage of “Social and Community Service Manager” (OES 11-9151; $34.07), that of grantee staff to be the average hourly wage of “Social Worker” (OES 21-1029; $28.56), and that of data entry specialists to be the average hourly wage of “Data Entry Keyers” (OES 43-9021; $15.21); all are taken from the U.S. Bureau of Labor Statistics, Occupational Employment Statistics (OES), 2016. The average hourly wage of HMRF program clients is estimated from the average hourly earnings ($4.92) of study participants in the Building Strong Families Study (Wood et al. 2010). These average hourly earnings are lower than the minimum wage because many study participants were not working; we expect that to also be the case for the grantee clients.

The estimates in Table A.4 are per year of the grant. The burden and costs for DCI data collection are annualized over three years, meaning the total number of respondents has been divided by three. DCI data collection at stages one and three imposes no additional burden on respondents, because the information is extracted from grant applications the HMRF grantees have prepared and submitted previously (stage one) and from document review of PPRs and QPRs (stage three).
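For reference, each row of the table below can be reproduced by the standard burden arithmetic (annual respondents multiplied by responses per respondent and by hours per response, with annualized cost equal to annual burden hours multiplied by the hourly wage). A worked example using two rows, with figures taken directly from Table A.4:

\[ \text{DCS-1 (program applicants): } 265{,}838 \div 3 \approx 88{,}613 \text{ annual respondents}; \quad 88{,}613 \times 1 \times 0.25 \approx 22{,}153 \text{ hours}; \quad 22{,}153 \times \$4.92 \approx \$108{,}993 \]

\[ \text{DCS-4 (participant pre-test): } 79{,}831 \times 1 \times 0.42 \approx 33{,}529 \text{ hours}; \quad 33{,}529 \times \$4.92 \approx \$164{,}963 \]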

Table A.4. Estimates of Burden and Costs for the FaMLE Cross-site

Activity, by Respondent | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent (a) | Average Burden per Response (hours) | Total Annual Burden Hours | Average Hourly Wage | Total Annualized Cost

DCI (data collection by contractor)
DCI-1: Topic Guide on Program Design | 60 | 20 | 1 | 1 | 20 | $34.07 | $681
DCI-2: Topic Guide on Program Implementation | 300 | 100 | 1 | 1 | 100 | $34.07 | $3,407
DCI-3: Focus Group Protocol | 801 | 267 | 1 | 1.50 | 401 | $4.92 | $1,972

DCS (data collection by grantees)
DCS-1: Applicant Characteristics (program applicants) | 265,838 | 88,613 | 1 | 0.25 | 22,153 | $4.92 | $108,993
DCS-1: Applicant Characteristics (program staff) | 360 | 360 | 246 | 0.10 | 8,856 | $28.56 | $252,927
DCS-2: Grantee Program Operations | 120 | 120 | 1 | 0.75 | 90 | $15.21 | $1,369
DCS-3: Service Receipt in MIS | 239,493 (b) | 79,831 (b) | 15 | 0.033 | 39,916 | $15.21 | $607,122
DCS-4: SAQ Pre-Test (program participants) | 239,493 | 79,831 | 1 | 0.42 | 33,529 | $4.92 | $164,963
DCS-4: SAQ Post-Test (program participants) | 132,087 | 44,029 | 1 | 0.42 | 18,492 | $4.92 | $90,981
DCS-4: SAQ Pre- and Post-Test (program staff, entry from paper) | 60 | 20 | 1,285 | 0.30 | 7,710 | $15.21 | $117,269
DCS-5: Semi-annual Progress Report | 120 | 120 | 2 | 3 | 720 | $34.07 | $24,530
DCS-6: Quarterly Performance Report | 120 | 120 | 2 | 1 | 240 | $15.21 | $3,650

Total | | | | | 132,227 | | $1,377,858


a Total number of responses per respondent is rounded to the nearest whole number.

b Thirteen staff members at each grantee are assumed to enter service delivery data for an estimated 79,831 clients per year. The burden hours per response are based on the number of clients for whom data are entered. See Attachment F for the breakdown of the calculations.



The process for generating the burden and cost estimates for this ICR for renewal of clearance, including assumptions regarding the number of respondents and the periodicity of data collection, is described in Attachment F. Attachment F also details, for each instrument, the estimated burden used to date; of the originally approved 180,090 annual burden hours and $1,610,705 total annualized cost, we estimate that 52,687 annual burden hours and $502,233 in annualized cost will have been used through July 31, 2018. Of the originally approved burden, 127,403 annual burden hours and $1,108,472 in annualized cost remain unused. In this ICR for renewal of clearance, we have maintained the same number of instruments and the same estimated time for completion, but we have substantially revised the estimated number of respondents for the three-year renewal period relative to the initial ICR. These changes result in a decrease of 47,863 annual burden hours and a decrease of $232,847 in total annualized cost compared to the initial ICR (see the arithmetic check following the list below). The changes include:

  • Decreased annual burden hours for program applicants and program clients on the entrance and exit survey; revised estimates are based on the actual number of clients enrolled and receiving services in the first 18 months of data collection for the current grant cohort (July 1, 2016 through December 31, 2017) as well as projected increases during the renewal period. This change applies to DCS-1, DCS-3, and DCS-4.

  • Increased program staff estimates for completion of entrance and exit surveys on paper, based on estimates of current grantee activity.

  • Increased average hourly wages to reflect new averages provided by the Bureau of Labor Statistics since the completion of the initial ICR. May 2016 data were used in the renewal calculations.
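As a check, the decreases cited above follow directly from the originally approved and renewal totals:

\[ 180{,}090 - 132{,}227 = 47{,}863 \text{ annual burden hours}; \qquad \$1{,}610{,}705 - \$1{,}377{,}858 = \$232{,}847 \]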

A13. Estimates of Other Total Cost Burden to Respondents and Record Keepers

These information collection activities do not place any additional costs on respondents or record keepers other than those described above.

A14. Cost to the Federal Government

If all core and optional service components are exercised over the seven-year project period, the total cost of the FaMLE Cross-Site project to the federal government is $9,608,052, and the annualized cost to the federal government is $1,372,578.86.

A15. Explanation for Program Changes or Adjustments

This is a renewal request for an ongoing data collection. Burden has been updated to reflect expected response rates over the next three years of data collection.

A16. Plans for Tabulation and Publication and Project Time Schedule

A. Plans for tabulation

For data collected through both DCI and DCS, we will continue to follow standard protocols for cleaning data, constructing variables that address the project’s purposes and research questions, and computing descriptive statistics. Additional plans for tabulation of DCI and DCS data are described below.

1. DCI

The contractor will use standard qualitative procedures to analyze and summarize information from the telephone and in-person interviews conducted using the semi-structured topic guides for program staff and from the focus groups conducted using the focus group discussion guide. Analysis will involve organization, coding, triangulation, and theme identification. For each qualitative data collection activity, standardized templates will be used to organize and document the information and then to code this documentation. Coded text will be searched to gauge consistency and to triangulate across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics, themes, and categories (Yin 1994; Coffey and Atkinson 1996), which can then be analyzed to address the study’s research questions.

2. DCS

Grantees are able to produce quarterly tabulations of the information below as necessary for reporting performance to ACF. The contractor tabulates data as requested by ACF for conducting cross-site analyses. To achieve these purposes, we anticipate conducting the descriptive analyses presented below.

  • Applicant characteristics (Instrument DCS-1). Summary statistics within and across all HMRF grantees on applicant characteristics—for all program applicants, and for those who ultimately participate in program services. For HM grantees, some applicant characteristics will be calculated at the couple level. For grantees conducting impact studies, t-tests will assess baseline equivalence in applicant characteristics between program and control/comparison groups.

  • Enrollment (Instrument DCS-3). For example, the number enrolled in the program in the previous quarter, and the total number enrolled in the program since the beginning of the grant year. For HM grantees, enrollment will be calculated at both the individual and couple levels. For grantees conducting impact studies, numbers enrolled in the control/comparison group will also be tracked.

  • Program participation (Instrument DCS-3). For example, the proportion of enrollees who attend a core workshop within two months of enrollment, and the average number of hours of services received by program participants. For HM grantees, program participation will be calculated at both the individual and couple levels.

  • Program operations (Instrument DCS-2):

    • Marketing, outreach, and recruitment. Summary statistics within and across all HMRF grantees on the marketing strategies, recruitment methods, and referral sources used.

    • Staff Characteristics. Summary statistics on the proportion of program staff with various levels of education and experience.

    • Quality assurance and monitoring. Summary statistics within and across all HMRF grantees on measures of staff training, supervision, and observation of program services.

    • Implementation challenges. Summary statistics within and across all HMRF grantees on the degree to which potential implementation challenges have been a problem.

  • Participant outcomes (Instrument DCS-4). Summary statistics within and across all HMRF grantees on outcomes collected at program entry and exit (or one-month post program exit, for programs shorter than one month). For HM grantees, some outcomes will be calculated at the couple level.

  • Program impacts (Instrument DCS-4). For the 24 HMRF grantees conducting experimental or strong quasi-experimental evaluations, we will also estimate impacts of the programs. We anticipate that these grantees will provide our research team with individual-level data on participants. This structure makes it feasible to identify program effects using ordinary least squares (OLS) (for linear outcomes) and probit regressions (for binary outcomes), such as

\[ Y_{ig} = \beta\, Treat_{ig} + X_{ig}'\gamma + \delta_g + \varepsilon_{ig} \]

and

\[ \Pr(Y_{ig} = 1) = \Phi\!\left(\beta\, Treat_{ig} + X_{ig}'\gamma + \delta_g\right). \]

Y_ig represents an outcome of interest for individual i served by grantee g, Treat indicates an individual’s treatment status, X represents demographic controls such as age and race, the δ_g terms are grantee fixed effects, ε_ig is a residual, and Φ is the standard normal cumulative distribution function. The key coefficient from the OLS regressions is β, the change in an outcome associated with treatment status. (When we use probit specifications, we will propose reporting marginal effects, interpretable as the change in the probability of the outcome associated with a change in a covariate for the average individual.)
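For illustration only, a minimal sketch of how such models could be estimated from a de-identified, individual-level data extract is shown below. The variable names (y_cont, y_bin, treat, age, grantee) and the synthetic data are assumptions made for the example; they are not the actual nFORM field names or the approved analysis code.

    # Illustrative sketch only: OLS and probit impact estimates with grantee fixed effects.
    # Assumes the numpy, pandas, and statsmodels packages; the data below are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "treat": rng.integers(0, 2, n),      # hypothetical treatment indicator
        "age": rng.integers(18, 55, n),      # hypothetical demographic control
        "grantee": rng.integers(1, 25, n),   # hypothetical grantee identifier
    })
    df["y_cont"] = 1.0 + 0.2 * df["treat"] + 0.01 * df["age"] + rng.normal(0, 1, n)
    df["y_bin"] = (rng.random(n) < 0.40 + 0.05 * df["treat"]).astype(int)

    # OLS for a continuous outcome; the coefficient on "treat" corresponds to beta above.
    ols = smf.ols("y_cont ~ treat + age + C(grantee)", data=df).fit()
    print(ols.params["treat"])

    # Probit for a binary outcome; average marginal effects approximate the change in
    # the probability of the outcome associated with treatment.
    probit = smf.probit("y_bin ~ treat + age + C(grantee)", data=df).fit(disp=0)
    print(probit.get_margeff(at="overall").summary())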

B. Time schedule and publications

The schedule for FaMLE Cross-site data collection and reporting is shown below in Table A.5.

Table A.5. Schedule for the FaMLE Cross-site Project

Activity* | Date
Grantee applications and awards | May 11, 2015 – September 30, 2015
DCI data collection |
  Program design data collection | Summer 2018 – Summer 2019
  Program implementation data collection | Summer 2018 – Summer 2019
  Participant focus groups | Summer 2018 – Summer 2019
DCS data collection* |
  Applicant Characteristics | On-going, Spring 2016 – September 2021
  Program Operations | Spring 2016, updated quarterly through September 2021
  Service delivery data | On-going, Spring 2016 – September 2021
  Pre-test instruments | On-going, Spring 2016 – September 2021
  Post-test instruments | On-going, Spring 2016 – September 2021
  Quarterly Performance Report | Summer 2016, updated quarterly through Winter 2021
  Semi-Annual Program Performance Report (PPR) | Fall 2016, updated semi-annually through October 2021
Reports |
  Program design report | Fall 2018
  Program implementation report | Fall 2018
  Program outcome report | Spring 2020

* Schedule reflects activities covered under initial and continued approval.

In addition to the reports described above, the FaMLE Cross-Site Project provides opportunities for analyzing and disseminating additional information through special topics reports and research or issue briefs on an as-requested basis.

A17. Reason(s) Display of OMB Expiration Date Is Inappropriate

All instruments will display the expiration date for OMB continued approval.

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

REFERENCES

Avellar, S., A. Clarkwest, M. R. Dion, S. Asheer, K. Borradaile, M. Hague Angus, T. Novak, J. Redline, H. Zaveri, and M. Zukiewicz. “Catalog of Research: Programs for Low-Income Couples.” OPRE Report # 2012-09. Report submitted to the Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. Washington, DC: Mathematica Policy Research, May 2012.

Avellar, S., M. R. Dion, A. Clarkwest, H. Zaveri, S. Asheer, K. Borradaile, M. Hague Angus, T. Novak, J. Redline, and M. Zukiewicz. “Catalog of Research: Programs for Low-Income Fathers.” OPRE Report # 2011-20. Report submitted to the Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. Washington, DC: Mathematica Policy Research, December 2011.

Buhi, E.R. and P. Goodson. “Predictors of adolescent sexual behavior and intention: a theory-guided systematic review.” Journal of Adolescent Health, 40(1): 4-21, 2007.

Coffey, A., B. Holbrook, and P. Atkinson. “Qualitative Data Analysis: Technologies and Representations.” Sociological Research Online, vol. 1, no. 1, 1996. Available at http://www.socresonline.org.uk/1/1/4.html.

Dion, M. R., B. Devaney, M. Ford, H. Hill, S. McConnell, and P. Winston. “Helping Low-Income Families Build Strong and Healthy Marriages: A Conceptual Framework for Interventions.” Washington, DC: Mathematica Policy Research, January 2003.

Downey, G. and J.C. Coyne. “Children of depressed parents: An integrative review.” Psychological Bulletin, July, 108(1): 50-76, 1990.

Edin, Kathryn and Maria Kefalas. Promises I Can Keep: Why Poor Women Put Motherhood Before Marriage. Berkley and Los Angeles, CA: University of California Press, 2005.

Gelfand, D.M. and D.M. Teti. The effects of maternal depression on children. Clinical Psychology Review, 10, 329-353, 1990.

Kessler, R., G. Andrews, L. Colpe, E. Hiripi, D. Mroczek, S. Norman, E. Walters, and A. Zaslavsky. “Short screening scales to monitor population prevalences and trends in non-specific psychological distress.” Psychological Medicine, 32, 959-976, 2002.

Lundquist, Erika, JoAnn Hsueh, Amy Lowenstein, Kristen Faucetta, Daniel Gubits, Charles Michalopoulos, and Virginia Knox. A Family-Strengthening Program for Low-Income Families: Final Impacts from the Supporting Healthy Marriage Evaluation. OPRE Report 2014-09A. Washington, DC: Office of Planning Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2014.

McLanahan, S., I. Garfinkel, N. Reichman, J. Teitler, M. Carlson, and C. A. Audigier. “The National Report: The Fragile Families and Child Wellbeing Study Baseline Report.” Princeton, NJ: The Center for Research on Child Wellbeing, August 2001.

Raley, R. K.; Crissey, S.; & Muller, C. (2007). Of sex and romance: Late adolescent relationships and young adult union formation. Journal of Marriage and Family, 69(5), 1210-1226.

Tourangeau, R., & Smith, T. W. (1996). Asking sensitive questions: The impact of data collection mode, question format, and question context. Public Opinion Quarterly, 60, 275-304.

Turner, C., Ku, L., Rogers, S. M., Lindberg, L. D., Pleck, J. H., & Sonenstein, F. L. (1998). Adolescent sexual behavior, drug use, and violence: Increased reporting with computer survey technology. Science, 280, 867-873.

Wood, Robert G., Sheena McConnell, Quinn Moore, Andrew Clarkwest, and JoAnn Hsueh. “Strengthening Unmarried Parents’ Relationships: The Early Impacts of Building Strong Families.” Princeton, NJ: Mathematica Policy Research, May 2010.

Yin, R. (1994). Case study research: Design and methods (2nd ed.). Thousand Oaks, CA: Sage Publishing.

Zaveri, Heather, P. Holcomb, R. Dion, D. Friend, and R. Selekman. “Fathers’ motivations for enrolling and engaging in responsible fatherhood programs: Insights from the Parents and Children Together (PACT) evaluation.” Presentation for the Welfare Research and Evaluation Conference, May 2014.







1 Other evaluations of HMRF programs include the Community Healthy Marriage Initiative, which assessed the implementation and outcomes of a community-wide approach to strengthening relationships; and the Ex-Prisoner Reentry Strategies Study, which assesses program implementation.

2 In addition to the activities described, ACF may conduct limited additional analyses on special topics (e.g., marketing and recruitment strategies associated with greater enrollment and participation).


