Strengthening Relationship Education and Marriage Services (STREAMS) Evaluation



OMB Information Collection Request

New Collection

Supporting Statement

Part A

February 2016

Updated June 2016



Submitted by:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


330 C Street, SW
Washington, D.C. 20201


Project Officer:

Samantha Illangasekare






CONTENTS

A1. Necessity for the data collection
  Study background
  Legal or administrative requirements that necessitate the collection
A2. Purpose of survey and data collection procedures
  Overview of purpose and approach
  Research questions
  Study design
  Universe of data collection efforts
A3. Improved information technology to reduce burden
A4. Efforts to identify duplication
A5. Involvement of small organizations
A6. Consequences of less frequent data collection
A7. Special circumstances
A8. Federal register notice and consultation
  Federal register notice and comments
  Consultation with experts outside the study
A9. Incentives for respondents
A10. Privacy of respondents
A11. Sensitive questions
A12. Estimation of information collection burden
  Newly requested information collections
  Process study instruments
  Total annual cost
A13. Cost burden to respondents or record keepers
A14. Estimate of cost to the federal government
A15. Change in burden
A16. Plan and time schedule for information collection, tabulation, and publication
  Analysis plan
  Time schedule and publications
A17. Reasons not to display OMB expiration date
A18. Exceptions to certification for Paperwork Reduction Act submissions
References

TABLES

A.1 STREAMS expert group
A.2 Respondent payments proposed for data collection activities
A.3 Justification for sensitive questions
A.4 Total burden requested under this information collection
A.5 Schedule for the STREAMS Evaluation







A1. Necessity for the data collection

The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services seeks approval to collect process and impact study data from six healthy marriage and relationship education (HMRE) programs funded by the Office of Family Assistance (OFA) within ACF. This information collection is being carried out as part of the Strengthening Relationship Education and Marriage Services (STREAMS) evaluation. The purpose of STREAMS is to measure the effectiveness and quality of HMRE programs, program components, and implementation factors designed to strengthen and improve the quality of romantic relationships. The evaluation will examine HMRE programs serving both youth and adults and will fill knowledge gaps about the effectiveness of these programs and strategies for improving program delivery and participant engagement in services.

Proposed data collection activities in study sites are: (1) data collection for documenting program implementation, including: semi-structured interviews with grantee staff and community stakeholders, focus groups with program participants, a survey of program staff, and use of a management information system by grantee staff to record data on session adherence; and (2) data collection for measuring program impacts, including: baseline and follow-up surveys of 3,600 individuals in sites serving youth in high schools; and baseline and follow-up surveys of 4,000 individuals in sites serving adults.

Study background

Healthy marriage and relationship education (HMRE) programs have been undergoing a transformation over the past decade. At first a new approach for serving vulnerable families, such programs have become an established presence in many communities, with connections to other agencies and a growing number of families served. In the early 2000s, ACF announced the Healthy Marriage Initiative, which provided funding to federal grantees through existing legislative authorities to add marriage education to their service offerings. This effort coincided with findings from the longitudinal Fragile Families and Child Well-being Study that suggested the period around a child’s birth could be an opportunity for intervening with unmarried couples, who typically were romantically involved and interested in marriage (McLanahan et al. 2001).

The Deficit Reduction Act of 2005 created the HMRE grant program, which authorized $150 million over five years to support program activities aimed at promoting and sustaining healthy marriages, providing relationship education services to youth, and fostering economic stability. The Claims Resolution Act of 2010 re-authorized this grant program, and three-year grants totaling $150 million were awarded in September 2011 (and subsequently extended through September 2015). In October 2015, ACF awarded five-year grants to 46 HMRE grantees that plan to serve a broad spectrum of couples, single adults, and youth.

Studies of government-funded HMRE programs serving couples have produced mixed results. Most federally sponsored HMRE research in recent years—including the Building Strong Families (BSF) evaluation, the Supporting Healthy Marriage (SHM) evaluation, and the ongoing Parents and Children Together (PACT) evaluation—has focused on programs serving low-income couples raising children together. BSF, which examined HMRE programs for unmarried parents, found no overall effects on relationship quality or status (Wood et al. 2012; 2014). Positive effects in one BSF site at 15 months generally did not persist at 36 months. SHM, which examined HMRE programs for low-income married parents, found small positive effects on relationship quality and declines in psychological abuse and distress at 12 and 30 months, but no effects on relationship status (Hsueh et al. 2012; Lundquist et al. 2014). The PACT evaluation is assessing the implementation and effectiveness of a next generation of HMRE programs for low-income couples chosen from the 2011 cohort of grantees serving couples; results are not yet available.

However, many HMRE programs serve youth; many others serve adults as individuals rather than as couples. Little rigorous research has been conducted on these programs or the curricula they implement. One rigorous study of an HMRE youth curriculum, Relationship Smarts PLUS, conducted in 39 Alabama high schools, found short-term positive effects on measures of healthy relationship beliefs and perceived conflict management ability (Kerpelman et al. 2009). Other prior studies used less rigorous research methods. A pre-post study of Love U2: Communication Smarts found improvements on measures of relationship attitudes and skills (Antle et al. 2011). A small pre-post study of the Within My Reach curriculum for single adults reported improved relationship knowledge and skills (Antle et al. 2013). A small quasi-experimental study of another curriculum for single adults, PICK a Partner, reported increases in participants' knowledge and in their confidence in their ability to develop healthy relationships (Van Epp et al. 2008).

To expand the research base on the full range of HMRE programs funded by ACF, in March 2015 the agency contracted with Mathematica Policy Research to conduct the Strengthening Relationship Education and Marriage Services (STREAMS) evaluation. The study entails designing and implementing a multi-site impact evaluation of HMRE programs. It aims to identify strategies for improving the effectiveness and delivery of HMRE programs by conducting multiple rigorous tests of program components and implementation factors across six study sites. The six STREAMS sites will be chosen from among the ACF healthy marriage grantees that received awards in October 2015. Sites will be selected to answer a range of distinct research questions that address specific policy priorities for ACF and that will help to fill key gaps in the existing research on healthy marriage programming. In particular, the study will assess the effectiveness of commonly used HMRE curricula for both youth and adults, as well as the integration of HMRE and economic stability services for adults. The study will also assess strategies for improving the quality of HMRE program delivery in high schools, the effectiveness of HMRE programming delivered at different levels of dosage, and the effectiveness of behavioral economics strategies to encourage participant engagement in services. Attachment Q presents a set of six potential STREAMS sites and the research questions they would address. Sites will be selected for their suitability for answering priority research questions, not to be nationally representative of all healthy marriage grantees. A process study in the six sites will aid in interpreting impact findings and generate evidence to support future replication of effective curricula and implementation strategies. Both process and impact study findings will be used to improve programming implemented through ACF's HMRE grant program.

Legal or administrative requirements that necessitate the collection

This is a discretionary data collection authorized under Sec. 811 (b) Healthy Marriage Promotion and Promoting Responsible Fatherhood Grants of the Claims Resolution Act of 2010, Pub. L. No. 111-291, 124 Stat. 3064 (Dec. 8, 2010). A copy of the legislative authority is included as Attachment A.

A2. Purpose of survey and data collection procedures

Overview of purpose and approach

To address existing knowledge gaps about HMRE programs for youth and adults, STREAMS will conduct a process and impact study in six sites. The impact study will use a random assignment research design. The plan is to select two sites serving youth in high schools and four sites serving adults. Some sites serving adults will serve couples; others will serve adults as individuals. The current request is for approval of sample intake forms, a baseline and follow-up survey for youth in high school, a baseline and follow-up survey for adults, and process study data collection instruments.

The STREAMS evaluation and data collection will build on the data collection being conducted as part of ACF’s Fatherhood and Marriage Local Evaluation (FaMLE) Cross-Site Project. The FaMLE Cross-Site data collection has already been approved by OMB under 0970‑0460. As part of FaMLE Cross-Site, ACF is collecting performance measure data from all healthy marriage and responsible fatherhood grantees, including the six healthy marriage grantees that will participate in STREAMS. These performance measure data include information on program participation collected through the Information, Family Outcomes, Reporting, and Management (nFORM) information system. They also include data on program participant characteristics and outcomes. As described in more detail below, to reduce the overall burden on respondents, STREAMS data collection will build on the FaMLE Cross-Site data collection effort already approved by OMB.

Research questions

STREAMS will examine a mix of HMRE programs serving both youth and adults. The evaluation will address a range of research questions, including:

  1. What is the effect on youth relationship skills, knowledge, and attitudes of programs designed to offer relationship skills education to high school students as part of the regular school curriculum?

  2. What is the effect on youth relationship skills, knowledge, and attitudes of programs designed to provide enhanced training and support to program facilitators who deliver relationship skills education to high school students?

  3. What is the effect on youth relationship skills, knowledge, and attitudes of delivering a shortened relationship education curriculum to high school students?

  4. What is the effect on adult relationship quality and stability of programs designed to offer relationship skills education services to adults who participate in program services as individuals (and not as part of a couple)?

  5. What is the effect on adult relationship quality and stability of programs designed to offer integrated HMRE and economic stability services?

  6. Are strategies such as text messaging effective in promoting regular attendance in HMRE program services?

  7. How are the programs implemented and how do participants respond to program services?

Different research questions will be addressed in the different evaluation sites. The evaluation in some sites will be designed to address multiple research questions. Addressing these questions will fill gaps in the literature and field concerning the full range of HMRE programs funded by ACF and will provide rigorous evidence related to HMRE program components and implementation factors that will help ACF strengthen the HMRE grant initiative.

Study design

The information to be obtained through STREAMS is critical to understanding the effectiveness and implementation of HMRE programs for youth and adults and to strengthening the overall grant initiative. Results from the study will be used to inform future government investments in HMRE programming, as well as the design and operation of such services. To address the research questions noted above, STREAMS will include process and impact studies in six study sites.

Process study. The goal of the process study is to examine how the programs included in the evaluation are implemented. This information will support the interpretation of impact findings and document program operations to support future replication if the programming is shown to be effective. As part of the process study, STREAMS will conduct semi-structured interviews with program staff and selected community stakeholders to document program operations, as well as staff and stakeholder experiences with implementation. Researchers will conduct focus groups with program participants to learn about their motivations for enrolling in the program and their perspectives on the availability, quality, and value of program services. STREAMS will administer a paper-and-pencil survey to program staff about their work activities, implementation experiences, training, supervision, and perspectives on the supportiveness of the grantee agency. STREAMS will also rely on the nFORM data on program participation being collected as part of the FaMLE Cross-Site project (OMB no. 0970-0460). STREAMS will collect data on adherence to program curricula through an add-on to the nFORM system.

The results from the process study will help inform the interpretation of impact findings. For example, the process study will assess how closely grantees adhered to curricula developers’ requirements for curriculum implementation. The process study will also document how each type of programming was implemented, the challenges encountered, and lessons learned about implementing HMRE programming. Researchers will gather process study information using established protocols that incorporate best practices in qualitative and quantitative methods.

Impact study. The impact study will examine the effectiveness of HMRE services and program components. It will also examine the effect of key program implementation factors, such as the dosage of program services and strategies for encouraging regular attendance. Each study site will examine a distinct research question or questions. In all sites, the impact study will use a random assignment research design. Results from each site will provide rigorous research evidence that can be used to improve ACF’s HMRE grant initiative.

In study sites that serve adults, individuals will be randomly assigned to either a group that is offered program services or a control group that is not. In sites serving youth in high schools, random assignment will occur at the classroom level and will involve either two or three research groups. In sites serving adults, grantee staff will conduct random assignment using a web-based system that STREAMS developed as an added component to the nFORM system (OMB no. 0970-0460). The burden for this random assignment feature for nFORM is included in this information collection request. In sites using classroom-level random assignment, evaluation team members will conduct random assignment. Therefore, no burden associated with random assignment for these sites is included in this request.

In five of six evaluation sites, STREAMS will collect baseline and follow-up survey data. Baseline surveys will be collected at the time of study entry. Data from the STREAMS baseline survey will also be used for two initial purposes. First, the baseline data will be used to describe characteristics of the study sample. This step will enable ACF to understand the characteristics of the populations being served and to provide guidance on how the study sample and findings might generalize to a broader policy setting. Second, baseline data may also be used for exploratory subgroup analyses, to examine the demographic and personal characteristics that may moderate the impacts of healthy relationship education programming.

Follow-up surveys will be conducted 12 months after the baseline survey, to measure the effectiveness of HMRE services and program components on participant outcomes. The evaluation team will limit the primary analyses for each site to a small set of key outcomes. In selecting these primary outcomes for the impact analysis, the evaluation team will rely on the program logic model developed for each site. For sites serving adults, the team anticipates that most of these outcomes will be measures of the quality and stability of relationships. For sites serving youth in high schools, we anticipate that the primary outcomes will be measures of relationship knowledge, skills, and attitudes.

Some of these sites will serve youth in high schools; others will serve adults. There are two different versions of both baseline and follow-up surveys, one tailored for youth in high schools and the other tailored for adults.

In one evaluation site, STREAMS will not collect baseline or follow-up survey data. In this site, participants will be randomly assigned to different strategies to promote program participation, such as text messages using different techniques to motivate regular attendance. OPRE plans to conduct this program participation test in partnership with the University of Florida healthy marriage grantee (see Attachment Q). This grantee will serve couples in five counties across the state that include a mix of urban, suburban, and rural areas. The test will include approximately 1,200 couples from all five of these counties. Couples will be randomly assigned to one of four research groups, with each group receiving a different set of text messages to encourage regular attendance at program sessions. The test will help ACF determine which kinds of messages are most successful at encouraging couples to attend healthy marriage programming and will inform the guidance that the agency gives to grantees concerning the best strategies for encouraging regular program attendance. In this site, the evaluation team will rely solely on client characteristic and program participation data being collected as part of performance measure data collection. This burden has already been approved as part of the FaMLE Cross-Site project (OMB no. 0970-0460).

Universe of data collection efforts

Process study. The following instruments are associated with the STREAMS process study:

  1. Topic guide for staff and stakeholder interviews (Instrument 1). The purpose of this information collection is to document manager, staff, and community stakeholder experiences and perspectives about the implementation and operation of HMRE programs. This guide will be used during site visits to collect information from program managers, program staff, and community stakeholders on topics such as program plans and goals, staffing structure, recruitment and engagement, service delivery, enrollment and retention strategies, goal attainment, and community context.

  2. Focus group guide for adults (Instrument 2). The purpose of the focus group guide for adults is to obtain information about adult program participants’ motivations for enrolling in the program and their perspectives on the availability, quality, and value of program services. The evaluation team will also ask about participants’ satisfaction with the program and their perceptions of the knowledge and skills they gained through program participation. Focus group data will not be used to measure program effects; program effects will be measured using data from follow-up surveys, which will cover a range of topics, including the quality and stability of romantic relationships and other outcomes.

  3. Focus group guide for youth (Instrument 3). The purpose of the focus group guide for youth is to obtain information about youth program participants’ motivations for enrolling in the program and their perspectives on the availability, quality, and value of program services. The evaluation team will also ask about participants’ satisfaction with the program and their perceptions of the knowledge and skills they gained through program participation. Focus group data will not be used to measure program effects; program effects will be measured using data from follow-up surveys, which will cover a range of topics, including the quality and stability of romantic relationships and other outcomes.

  4. Staff survey (Instrument 4). The purpose of this survey is to obtain more systematic and potentially more candid information than can be gained from program staff through interviews during site visits. Site visit interviews of program staff are often conducted in group or other semi-public settings, which may limit staff’s willingness to disclose certain work experiences. Staff completing the survey may be more willing to candidly report on their experiences working with the program due to the anonymity afforded by the survey and the ability to complete the instrument in a private setting. The survey will gather information from program staff on their work activities, work experience, interactions with other staff members, opportunities to receive training, supervision, and perceptions of the supportiveness of the organization. The survey will be self-administered using a paper-and-pencil instrument.

  5. Session adherence form (Instrument 5). The purpose of this form is to collect data on facilitators’ adherence to the HMRE curriculum during each group session. Facilitators will report on the materials, lessons, and activities they used during the sessions as well as any disruptions or difficulties with conducting the session as planned. The STREAMS evaluation team will include an additional screen in the nFORM system (OMB no. 0970-0460) for use in STREAMS sites that facilitators can use to record this information.

Impact study. The following instruments are associated with the STREAMS impact study:

  1. Introductory script (Instrument 6). In STREAMS sites serving adults, grantee staff will use this script to introduce applicants to the HMRE program and the STREAMS evaluation and to answer applicants’ questions about the study. Grantee staff will read the introductory script to applicants prior to conducting random assignment.

  2. Add-on to nFORM to conduct random assignment (Instrument 7). In STREAMS sites serving adults, grantee staff will use an added component to the nFORM system (OMB no. 0970-0460) to conduct random assignment. After reading the introductory script to applicants and determining eligibility, grantee staff will enter applicant information to enroll sample members and perform random assignment.

  3. Baseline survey for youth in high schools (Instrument 8). For sites serving youth in high schools, baseline surveys will be conducted via audio computer-assisted self-administered interviewing (ACASI) on a tablet device. Surveys will be administered through group administration in high school classrooms. The baseline survey will be used to collect information from study participants on their characteristics, pre‑program measures of key outcomes, and contact information used to locate them for follow-up survey completion.

Attachment B contains a question-by-question justification for items on the youth baseline survey. To reduce burden on respondents and avoid duplication of effort, the STREAMS youth baseline survey integrates required items that program participants must complete as part of performance measure data collection. These items come from the Applicant Characteristics Survey and Pre-Program Survey designed for the FaMLE Cross-Site Project. The burden for these items has already been cleared by OMB (0970-0460). The evaluation team has supplemented these required items with additional items to collect information needed for the STREAMS evaluation. Attachment B indicates which items on the youth baseline survey are previously approved required performance measure items and which are items added for STREAMS. As indicated in the far right-hand column of the table in Attachment B, each item added for STREAMS was selected for one of the following reasons: (1) because the item is a potential outcome of relationship education programming for youth, (2) because the item is needed to conduct exploratory subgroup analyses examining the demographic and personal characteristics that may influence or “moderate” the impacts of healthy relationship education programming, or (3) because the item will be used to collect contact information for survey locating.

  4. Follow-up survey for youth in high schools (Instrument 9). For sites serving youth in high schools, follow-up surveys will be conducted via ACASI on a tablet device. In most cases, these surveys will be conducted in schools through group administration. To ensure a high response rate to the follow-up survey, the evaluation team will contact youth who do not complete the follow-up in school and request that they complete the survey by telephone (see Attachment L for the call script). An advance letter will be sent to the youth and their parents prior to attempting contact by telephone (Attachment M). The follow-up survey will include the same items as the baseline survey but will omit the contact information items, which are no longer needed once youth complete the follow-up survey. See Attachment B for a question-by-question justification of these items. Follow-up surveys for youth in high schools will be conducted 12 months after study enrollment.

  5. Baseline survey for adults (Instrument 10). In sites serving adults, the baseline survey will be conducted via computer assisted telephone interviewing (CATI). When an individual enrolls in the study, grantee staff will call the Mathematica Survey Operations Center and connect the applicant to a trained telephone interviewer who will then administer consent and conduct the baseline survey with the applicant. The baseline survey will be used to collect information from study participants on their characteristics, pre-program measures of key outcomes, and contact information used to locate them for follow‑up survey completion.

Attachment C contains a question-by-question justification for items on the adult baseline survey. To reduce burden on respondents and avoid duplication of effort, the STREAMS adult baseline survey integrates required items that program participants must complete as part of performance measure data collection. These items come from the Applicant Characteristics Survey and Pre-Program Survey designed for the FaMLE Cross-Site Project. The burden for these items has already been cleared by OMB (0970-0460). The evaluation team has supplemented these required items with additional items to collect information needed for the STREAMS evaluation. Attachment C indicates which items on the adult baseline survey are previously approved required performance measure items and which are items added for STREAMS.

  6. Follow-up survey for adults (Attachment D). In STREAMS sites serving adults, the follow-up survey will be conducted via CATI. Follow-up surveys will be conducted 12 months after study enrollment. The adult follow-up survey instrument has not yet been fully developed and will be finalized based on the initial phase of data collection. Once the instrument has been finalized, we will request clearance for it as a nonsubstantive change to the current package. Attachment D provides an overview of the plans for the adult follow-up survey. Burden for this information collection was included in the 60-day Federal Register Notice and is included in this request.

A3. Improved information technology to reduce burden

Process study. Semi-structured interviews and focus groups conducted during process study site visits will involve two study team members, with one asking questions and the other typing near-verbatim notes on a laptop to capture key quotes and responses. With respondents’ permission, site visit teams will audio-record the sessions so they can later confirm direct quotes and other details from the interviews and focus groups.

The session adherence form will be accessible to program staff via the nFORM web-based system (OMB no. 0970-0460). nFORM will prompt session facilitators to complete the form when they record attendance data. The form will make use of check boxes and drop-down menus to reduce burden. nFORM will be accessible from any computer, allowing for ease of data entry.

Impact study. In STREAMS sites serving adults, grantee staff will use a component added to nFORM to conduct random assignment. Key descriptive information gathered from applicants on CATI baseline surveys will be automatically sent to the system, allowing for an automated check to confirm that the applicant is not already in the study sample and is therefore eligible for random assignment. Program staff will be immediately notified of the applicant’s research status, avoiding burden on both staff and applicants that would be created if applicants had to be contacted later to inform them of their research status.
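For illustration only, the following minimal Python sketch shows the general flow just described: an automated duplicate check followed by random assignment with immediate notification of research status. The function and field names are hypothetical and do not represent the actual nFORM implementation.

import random

enrolled_ids = set()  # identifiers of applicants already in the study sample

def enroll_and_assign(applicant_id):
    """Hypothetical sketch: reject duplicates, then randomly assign.

    Returns the applicant's research status, or None if the applicant
    is already in the study sample and therefore cannot be re-enrolled.
    """
    if applicant_id in enrolled_ids:
        return None  # automated duplicate check
    enrolled_ids.add(applicant_id)
    # Assign with equal probability to the program or control group so
    # that staff can be notified of the research status immediately.
    return random.choice(["program", "control"])

print(enroll_and_assign("applicant-001"))  # e.g., "program"
print(enroll_and_assign("applicant-001"))  # None: already enrolled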

The evaluation team will use technology to reduce burden for completing the baseline and follow-up surveys. In STREAMS sites serving youth in high schools, the evaluation team will conduct baseline and follow-up surveys using ACASI. The respondent listens to a recording of the survey questions and then enters responses on a tablet device. ACASI allows programming of skip logic and response validation, creating a more streamlined experience for respondents than a paper-and-pencil survey and ensuring that respondents do not inadvertently complete sections of the survey that they should skip.

In STREAMS sites serving adults, the evaluation team will administer baseline and follow-up surveys using CATI. This technology is well suited to interviews with complex skip patterns, the need for interviewer probes, and large numbers of respondents. CATI reduces respondent burden by automating skip logic and question wording adaptations and eliminating delays caused when interviewers have to determine for themselves the next question to ask. CATI is programmed to accept only valid responses based on pre-programmed checks for logical consistency across responses. Interviewers are able to correct errors during the interview, eliminating the need for burdensome call-backs to respondents.
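As a simple illustration of automated skip logic and response validation of the kind described above, consider the short Python sketch below. The items and response options are hypothetical, not questions from the STREAMS instruments.

def ask(prompt, valid):
    # Re-prompt until the response is one of the valid options.
    while True:
        answer = input(prompt + " ").strip().lower()
        if answer in valid:
            return answer
        print("Please answer one of:", "/".join(sorted(valid)))

# Hypothetical two-item sequence with a skip pattern: the second item
# is asked only when the first ("gate") item is answered "yes".
employed = ask("Are you currently employed? (yes/no)", {"yes", "no"})
full_time = None
if employed == "yes":  # skip logic: applies only to employed respondents
    full_time = ask("Do you usually work 35 or more hours per week? (yes/no)",
                    {"yes", "no"})
print({"employed": employed, "full_time": full_time})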

A4. Efforts to identify duplication

The STREAMS evaluation will not require the collection of information that is available from alternative sources. To avoid potential duplication, the STREAMS evaluation team will rely on data being collected from grantees as part of performance measurement. This data collection is being overseen through the FaMLE Cross-Site project, with data recorded in the nFORM system (OMB no. 0970-0460). For example, the STREAMS process study will use service use data recorded by grantees through nFORM, avoiding the need for them to record this information again for STREAMS. In addition, as described earlier, the evaluation team has integrated required performance measure items being gathered directly from participants into the STREAMS baseline surveys. The evaluation team will provide this information to grantees so that they can use it to meet their performance measure data collection requirements. This integration of STREAMS and FaMLE Cross-Site instruments avoids participants having to report this information twice.

At each stage of the evaluation, the evaluation team will ensure that they do not collect information that is available elsewhere. None of the instruments will ask for information that can be reliably obtained through other sources.

A5. Involvement of small organizations

Data collection will not involve small businesses other than HMRE grantees and their partners. Some of the HMRE grantees and partners may be small entities, such as community‑based organizations and schools. We will request only information required for the intended use and will minimize burden by restricting surveys to the minimum required length, conducting interviews on site or by telephone at times convenient to respondents, and convening focus groups in a convenient location.

A6. Consequences of less frequent data collection

The topic guide for staff and stakeholder interviews, focus group guide for adults, focus group guide for youth, staff survey, introductory script, add-on to nFORM to conduct random assignment, baseline survey for youth in high schools, follow-up survey for youth in high schools, baseline survey for adults, and follow-up survey for adults are one-time data collections.

The session adherence form (Instrument 5) requires multiple entries for program facilitators throughout the study period. These multiple entries are necessary to provide documentation on variation in delivery of program sessions that is critical for assessing adherence to curricula, documenting actual delivery of HMRE services, and interpreting findings from the impact analyses. Reduced frequency of session adherence entries would adversely affect the evaluation team’s ability to assess adherence to curricula for programming in which sample members participate, which would in turn limit its ability to accurately interpret program impacts.

A7. Special circumstances

There are no special circumstances for the proposed data collection.

A8. Federal register notice and consultation

Federal register notice and comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on December 9, 2015, Volume 80, Number 236, page 76499, and provided a 60-day period for public comment. A copy of this notice is included as Attachment J (a copy of the 30-day notice is also included). No comments were received during the notice and comment period.

Consultation with experts outside the study

The STREAMS evaluation team consulted with external experts to complement the knowledge and experience of the evaluation team (Table A.1). Consultants included program and policy experts and researchers. Collectively, these consultants have specialized knowledge in HMRE programs for youth and adults, strategies for improving the quality of program implementation, and research design and data collection methods relevant to this work.

Table A.1. STREAMS expert group

Name                  Affiliation
Paul Amato            Pennsylvania State University
Thomas Bradbury       University of California-Los Angeles, Relationship Institute
Carolyn Pape Cowan    University of California-Berkeley
Philip Cowan          University of California-Berkeley
William Doherty       University of Minnesota
Ariel Kalil           University of Chicago, Behavioral Insights and Parenting Lab
Jennifer Kerpelman    Auburn University
Wendy Manning         Bowling Green State University
Allison Metz          University of North Carolina, National Implementation Research Network
Shari Miller          RTI International
Kay Reed              The Dibble Institute
Mindy Scott           Child Trends
Renee Sieving         University of Minnesota
Scott Stanley         University of Denver, Center for Marital and Family Studies
Luis Torres           University of Houston
David Wolfe           University of Toronto



A9. Incentives for respondents

In STREAMS sites serving youth in high schools, the evaluation team proposes offering gifts of appreciation for participation in data collection activities. The team proposes offering a $25 gift card to focus group participants and a $15 gift card to youth who complete the follow-up survey during the standard in-school group administration that will be conducted a year after study enrollment (Table A.2). The team proposes offering a $20 gift card to respondents who miss the in-school administration of the follow-up survey and instead complete the survey by telephone. The slightly larger amount for youth completing the survey outside of school conveys appreciation for the additional effort that completing the survey on personal time, rather than at school, might require. The gift amounts proposed here are identical to those used successfully in the ACF-sponsored PREP Multi-Component Evaluation (OMB no. 0970-0398). The PREP evaluation included multiple sites serving young teenagers (ages 13 to 15), similar to the ages anticipated in the STREAMS youth sites. These gifts of appreciation were an important part of the PREP evaluation team’s approach for achieving high response rates to PREP follow-up surveys (90 percent or higher in sites serving young teens). The STREAMS evaluation team anticipates that these gifts of appreciation will also serve as a useful tool for ensuring high response rates to STREAMS follow-up surveys. The team also proposes offering schools up to $500 for each year they participate in the study, in appreciation for their cooperation with follow-up data collection activities. The gifts of appreciation proposed here are similar to gifts used effectively in previous studies, such as the Head Start Family and Child Experiences Survey, in appreciation for cooperation with data collection in Head Start classrooms (OMB no. 0970-0151).

In STREAMS sites serving adults, the evaluation team proposes offering gifts of appreciation for participation in both baseline and follow-up survey data collection. In these sites, the evaluation team proposes offering a $10 gift card for completing the baseline survey (Table A.2). The team proposes offering a $25 gift card for completing the follow-up survey. In addition, the team proposes offering a $25 gift card to focus group participants. The proposed incentive amounts are identical to the incentives used successfully in other recent and ongoing ACF-sponsored evaluations of similar populations, including the Parents and Children Together evaluation (OMB no. 0970-0403), the Building Strong Families (BSF) evaluation (OMB no. 0970-0344), and the Child Support Noncustodial Parent Employment Demonstration (CSPED) (OMB no. 0970-0439). These incentives have been instrumental in helping the evaluation team achieve high response rates in these other evaluations, including response rates well above 80 percent for couples enrolled in healthy relationship programs like the ones to be evaluated in STREAMS. In addition, incentives of $25 for follow-up data collection with adult sample members are also supported by the work of Singer and Kulka (2002), who found that an effective range for data collection incentives fell between $20 and $30.

The evaluation team proposes providing participants with these gifts of appreciation for two reasons:

  1. To increase response rates and mitigate nonresponse bias. When respondents know that their participation will be appreciated, the likelihood increases that they will complete the data collection activity. Research has shown that such tokens of appreciation are effective at increasing response rates for populations similar to participants in HMRE programs—people with lower educational levels (Berlin et al. 1992) and low-income populations (James and Bolstein 1990). In addition, Singer and Kulka (2002) showed that incentives reduced differential response rates and thus potential nonresponse bias.

  2. To encourage study enrollment. To ensure adequate statistical power for detecting likely program effects, it is essential to have large enough samples in each study site. Modest gifts associated with baseline survey data collection can make it easier for grantee staff to meet enrollment targets. Program staff involved with both the ACF-sponsored PACT and CSPED evaluations reported that offering a $10 gift card after study applicants completed the baseline survey was helpful in gaining their cooperation throughout the sample intake process.

Table A.2. Respondent incentives proposed for data collection activities

Data collection activity                                  Length of activity (minutes)  Respondent payment (per participant)
Process study
Focus groups (youth)                                      90                            $25
Focus groups (adults)                                     90                            $25
Impact study
Baseline survey (adults)                                  40                            $10
12-month follow-up survey (youth, completed in school)    30                            $15
12-month follow-up survey (youth, completed by phone)     40                            $20
12-month follow-up survey (adults)                        45                            $25



A10. Privacy of respondents

Information collected will be kept private to the extent permitted by law. All STREAMS staff—at sites or on the evaluation team—with access to private data will receive study-specific training on (1) limitations on disclosure; (2) safeguarding the physical work environment; and (3) storing, transmitting, and destroying data securely. These procedures will be documented in training manuals. Refresher training will occur annually. All Mathematica staff sign the Mathematica Confidentiality Agreement (see Attachment K), complete online security awareness training when they are hired, and receive annual refresher training thereafter. Training addresses security policies and procedures found in the Mathematica Corporate Security Manual. Subcontractors and consultants will not handle personally identifiable information (PII).

Access to all PII will be available only on a need-to-know basis. A study identification number will be used to identify each study member. A link file will associate each study identification number with the name and other identifying information of each study participant. All analysis files will contain only the identification numbers and no identifying information. When private data are transmitted, the evaluation team will encrypt the data with SecureZIP, using the option to encrypt in Federal Information Processing Standard (FIPS) mode with Advanced Encryption Standard 256-bit encryption. All data from STREAMS will also be encrypted when stored, using FIPS 140-2 compliant cryptographic modules and in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable federal and departmental regulations. The evaluation team will establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process private information. Data will be securely destroyed at the earliest opportunity. The evaluation team will ensure that all project staff report any suspected or actual computer incidents immediately to Mathematica’s security incident team, and Mathematica will report immediately to the OPDIV Senior Information Systems Security Officer or other designated personnel. The evaluation team will submit a plan for minimizing, to the extent possible, the inclusion of private information on paper records and for protecting any paper records, field notes, or other documents that contain PII, ensuring secure storage and limits on access.
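For illustration only, the snippet below sketches AES 256-bit authenticated encryption in Python using the widely available cryptography package. It is an assumption-laden sketch, not the SecureZIP tool named above, and operating in FIPS mode additionally requires a validated cryptographic module.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key, per the standard cited above
aesgcm = AESGCM(key)
nonce = os.urandom(12)  # unique 96-bit nonce for each encryption

plaintext = b"study-id-0001,survey-response"  # hypothetical record
ciphertext = aesgcm.encrypt(nonce, plaintext, None)  # encrypt and authenticate
recovered = aesgcm.decrypt(nonce, ciphertext, None)  # raises if data were altered
assert recovered == plaintext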

Through the informed consent process, respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. Due to the private and sensitive nature of some information that will be collected as part of this research (see Section A11), the evaluation will obtain a Certificate of Confidentiality. The Certificate of Confidentiality helps to assure participants that their information will be kept private to the fullest extent permitted by law. As described in Section A11, in sites serving adults, the evaluation team will request Social Security numbers (SSNs) in order to gather information on their employment outcomes from the National Directory of New Hires. Respondents will still be eligible for the study and for program services if they choose not to provide their SSN.

The Office of Planning, Research and Evaluation completed a Privacy Impact Assessment (PIA) for the nFORM system to ensure that information handling conforms with applicable legal, regulatory, and policy requirements regarding privacy; determine the risks of collecting and maintaining PII; assist in identifying protections and alternative processes for handling PII to mitigate potential privacy risks; and communicate the system’s privacy practices to the public. A PIA covering OPRE research projects in general is currently in the approval process. The PIAs will be available online through the Department of Health and Human Services.

A11. Sensitive questions

For the STREAMS impact study, both the surveys for youth in high schools (Instrument 8 and Instrument 9) and the surveys for adults (Instrument 10 and Attachment D) include sensitive questions on relationship topics, such as intimate partner violence and parents’ discipline practices. Several of these questions are required for the FaMLE Cross-Site data collection and were thus previously approved by OMB under 0970-0460. For STREAMS, a limited number of additional sensitive questions were added beyond those required by the FaMLE Cross-Site data collection. These additional topics are described in Table A.3. When asked to complete surveys for the STREAMS impact study, all participants will be informed that their identities will be kept private and that they do not have to answer questions that make them uncomfortable.

Table A.3. Justification for sensitive questions

Question topic: Involvement with the criminal justice system
Asked of: Adults
Justification: Recent research suggests that a history of incarceration and involvement with the criminal justice system may be fairly common among men in the STREAMS target population (Zaveri et al. 2014; Pearson et al. 2011). Incarceration has major negative effects on child and family well-being, including reducing the financial and other support adults can provide to their partners, children, and families; documenting its incidence is therefore important. Further, because some relationship education programs encourage men to become more responsible, we want to explore whether the programs had any effect on criminal involvement. Similar questions have been included in other large national studies, such as the Fragile Families and Child Wellbeing Study, the National Job Corps Study, the Building Strong Families Study, and the Parents and Children Together evaluation. In the Building Strong Families survey, nonresponse was less than 1 percent for these items (Wood et al. 2010).

Question topic: Relationship experiences and characteristics
Asked of: Youth in high schools and adults
Justification: Several questions on participants’ relationship experiences and characteristics are required for the FaMLE Cross-Site data collection and were previously approved by OMB under 0970-0460. For STREAMS, a limited number of additional questions on relationship experiences and characteristics were added to both the surveys for youth in high schools and the surveys for adults. For youth in high schools, the additional questions ask about the respondent’s current relationship status and, for those youth currently in a relationship, about their relationship satisfaction and experiences. For adults, the additional questions ask about the number of romantic partners the respondent has had in the past year and the characteristics of the respondent’s current romantic relationship. These questions are necessary both to understand the populations being served and as potential moderators and outcomes of healthy relationship education programming.

Question topic: Sexual activity
Asked of: Youth in high schools
Justification: For youth in high schools, healthy relationship education programming often includes information on decision making around sexual activity. To measure the potential impact of this program component, the STREAMS surveys for youth in high schools include four questions on youth sexual activity: (1) whether the respondent has ever had sex, (2) whether the respondent was sexually active in the past three months, (3) whether the respondent had sex without a condom in the past three months, and (4) whether the respondent had sex without any effective contraceptive method in the past three months. All of these questions derive from recent federal evaluations of adolescent pregnancy prevention programs.

Question topic: Sexual orientation
Asked of: Youth in high schools and adults
Justification: There is a growing emphasis in healthy relationship education on inclusivity with respect to sexual orientation. For the STREAMS impact study, we will ask respondents to self-identify their sexual orientation, both to better understand the populations being served and as a potential moderator of relationship education impacts.

Question topic: Social Security number
Asked of: Adults
Justification: In evaluation sites serving adults, the respondent’s Social Security number is essential for two reasons. First, it will be used to collect important outcome data on sample members’ employment from the National Directory of New Hires. These data will be important for measuring potential employment effects in sites that are providing a combination of relationship education and economic stability services. Second, Social Security numbers will also be used to locate study participants for follow-up data collection.

A12. Estimation of information collection burden

Newly requested information collections

Table A.4 summarizes the estimated reporting burden and costs for each of the study instruments included in this information collection request. The request is for three years. Figures are estimated as follows:

Process study instruments

  1. Topic guide for staff and stakeholder interviews (Instrument 1). During the three year clearance period, we expect to interview up to 150 program staff and community stakeholders (25 staff and stakeholders per site * six sites) for up to one hour. The annualized burden is 50 hours per year.

  2. Focus group guide for adults (Instrument 2). During the three year clearance period, we expect to conduct focus groups with up to 120 adult program participants (10 participants per focus group * three focus groups per site serving adults * four sites serving adults) for up to 1.5 hours. The annualized burden is 60 hours per year.

  3. Focus group guide for youth in high school (Instrument 3). During the three year clearance period, we expect to conduct focus groups with up to 60 youth program participants (10 participants per focus group * three focus groups per site serving youth in high school * two youth sites) for up to 1.5 hours. The annualized burden is 30 hours per year.

  4. Staff survey (Instrument 4). During the three year clearance period, we expect to conduct a paper-and-pencil survey with 120 program staff (20 staff per site * six sites). We expect each survey to take 0.5 hours to complete. The annualized burden is 20 hours per year.

  5. Session adherence form (Instrument 5). We expect 48 program staff (eight staff per site * six sites) to each complete a total of 312 adherence form entries, or 104 annually. We expect each entry to take 0.08 hours to complete. The annualized burden is 399 hours per year (48 staff * 104 entries per staff member * 0.08 hours per entry).
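For illustration only, the short Python calculation below replicates the session adherence estimate in item 5; the variable names are ours, and the figures come from the text above.

staff = 8 * 6                  # eight staff per site, six sites = 48 staff
entries_per_year = 312 // 3    # 312 entries over three years = 104 per year
hours_per_entry = 0.08

annual_burden_hours = staff * entries_per_year * hours_per_entry
print(round(annual_burden_hours))  # 399 hours per year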

Impact study instruments

  1. Introductory script (Instrument 6). In sites serving adults, grantee staff will introduce the evaluation to program applicants at study enrollment. We assume that four evaluation sites will serve adults. We assume one of these sites will serve couples and that, in this site, both members of the couple will be included in the research sample. The other three sites will enroll adults individually into the study and the program. We estimate the burden on grantee staff and program applicants as follows:

    1. Grantee staff. During the three year clearance period, we expect eight program staff (two staff per site * four sites conducting individual-level random assignment) to provide information about the HMRE program and the STREAMS evaluation to 5,250 applicants. Each staff member involved with sample enrollment will conduct a total of about 656 of these meetings over the three year period, or 219 each year. During these meetings, program staff will explain program services and the fact that the applicant will be randomly assigned to be eligible or not eligible for services; the meetings will last approximately 0.08 hours. The total annualized burden for grantee site staff is 140 hours (8 staff members * 219 meetings * 0.08 hours).

    2. Program applicants. During the three year clearance period, we expect 5,250 program applicants to participate in meetings with program staff to deliver the introductory script for sample enrollment. Each meeting will last approximately 0.08 hours. Thus, the total annualized burden for program applicants is 140 hours (1,750 applicants per year * 0.08 hours).

  2. Add-on to nFORM to conduct random assignment (Instrument 7). During the three year clearance period, we expect approximately 95 percent of adult program applicants, or 5,000 of 5,250 applicants, to enroll in the study sample. This burden is based on the number of computer entries grantee site staff will make as they enroll the study participants. We estimate that eight program staff (two staff members per site * four sites conducting individual-level random assignment) will conduct these entries, for a total of 625 entries per staff member over three years, or 208 entries per staff member per year, each taking 0.08 hours to complete. Therefore, the total annualized burden is 133 hours (8 staff members * 208 entries per year * 0.08 hours).

  3. Baseline survey for youth in high schools (Instrument 8). During the three year clearance period, we anticipate that two evaluation sites will enroll youth in high schools. We expect to include 100 classrooms in the study sample in each of these sites, with 25 students per classroom. We expect 72 percent of students and their parents to consent to participate in the study. This yields a study sample of 3,600 youth across the two sites (100 classrooms * 25 youth per classroom * 72 percent consent rate * 2 sites = 3,600 youth). We expect each baseline survey to last 0.6 hours, with 0.1 hours of each baseline survey devoted to required performance measure items. The 0.1 hours of burden for these required items has already been approved (OMB no. 0970‑0460). Therefore, we are requesting 0.5 hours of additional burden per baseline survey, for a total of 1,800 burden hours (0.5 hours * 3,600 youth); the annualized burden over three years is 600 hours per year.

  4. Follow-up survey for youth in high schools (Instrument 9). During the three year clearance period, we expect 90 percent of the 3,600 youth completing the baseline survey to complete the follow-up survey, for a total of 3,240 youth (3,600 youth * 90 percent response rate). We expect each follow-up survey to last 0.5 hours. We anticipate that the follow-up survey will be slightly shorter than the baseline survey because we will not be collecting contact information at follow-up. We estimate the total burden for the youth follow-up survey to be 1,620 hours (0.5 hours * 3,240 youth); the annualized burden over three years is 540 hours per year.

  10. Baseline survey for adults (Instrument 10). During the three year clearance period, we anticipate that we will collect baseline survey data in three sites serving adults. (In the fourth evaluation site serving adults, the analysis will rely solely on nFORM data; no participant surveys will be conducted.) We anticipate enrolling 4,000 individuals across these three sites: 1,000 in each of two sites serving adults as individuals and 1,000 couples (2,000 individuals) in a third site serving couples. We expect each baseline survey to last 0.75 hours, with 0.25 hours of each baseline survey devoted to required performance measure items. The 0.25 hours of burden for these required items has already been approved (OMB no. 0970-0460). Therefore, we are requesting 0.5 hours of additional burden per baseline survey, for a total of 2,000 burden hours (0.5 hours * 4,000 adults); the total annualized burden over three years is 667 hours per year.

  11. Follow-up survey for adults (Instrument 11). During the three year clearance period, we expect 80 percent of the 4,000 adults completing the baseline survey to complete the follow-up survey, for a total of 3,200 adults (4,000 adults * 80 percent response rate). We expect each follow-up survey to last 0.75 hours. We estimate the total burden for the adult follow-up survey to be 2,400 hours (0.75 hours * 3,200 adults); the total annualized burden over three years is 800 hours per year.
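
For readers checking the arithmetic, the sketch below reproduces the annualized burden calculations for items 6 through 11. It is illustrative only: the inputs are the figures given above, and the helper name annual_hours is hypothetical.

```python
# A minimal check of the annualized burden arithmetic for items 6-11.
# Annualized burden = (total responses * hours per response) / 3-year clearance.
CLEARANCE_YEARS = 3

def annual_hours(total_responses, hours_per_response):
    """Return the annualized burden in whole hours."""
    return round(total_responses * hours_per_response / CLEARANCE_YEARS)

print(annual_hours(5_250, 0.08))   # 6a. introductory script, grantee staff -> 140
print(annual_hours(5_250, 0.08))   # 6b. introductory script, applicants -> 140
print(annual_hours(5_000, 0.08))   # 7.  nFORM random assignment entries -> 133
print(annual_hours(3_600, 0.5))    # 8.  youth baseline survey -> 600
print(annual_hours(3_240, 0.5))    # 9.  youth follow-up survey -> 540
print(annual_hours(4_000, 0.5))    # 10. adult baseline survey -> 667
print(annual_hours(3_200, 0.75))   # 11. adult follow-up survey -> 800
```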



Table A.4. Total burden requested under this information collection

| Instrument | Total number of respondents | Annual number of respondents | Number of responses per respondent | Average burden hours per response | Annual burden hours | Average hourly wage | Total annual cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Process Study |  |  |  |  |  |  |  |
| 1. Topic guide for staff and stakeholder interviews | 150 | 50 | 1 | 1 | 50 | $27.86 | $1,393 |
| 2. Focus group guide for adults | 120 | 40 | 1 | 1.5 | 60 | $7.25 | $435 |
| 3. Focus group guide for youth in schools | 60 | 20 | 1 | 1.5 | 30 | $7.25 | $218 |
| 4. Staff survey | 120 | 40 | 1 | 0.5 | 20 | $27.86 | $557 |
| 5. Session adherence form | 48 | 48^1 | 104 | 0.08 | 399 | $27.86 | $11,116 |
| Impact Study |  |  |  |  |  |  |  |
| 6a. Introductory script, grantee staff | 8 | 8^1 | 219 | 0.08 | 140 | $27.86 | $3,900 |
| 6b. Introductory script, program applicants | 5,250 | 1,750 | 1 | 0.08 | 140 | $7.25 | $1,015 |
| 7. Add-on to nFORM to conduct random assignment | 8 | 8^1 | 208 | 0.08 | 133 | $27.86 | $3,705 |
| 8. Baseline survey for youth | 3,600 | 1,200 | 1 | 0.5 | 600 | $7.25 | $4,350 |
| 9. Follow-up survey for youth | 3,240 | 1,080 | 1 | 0.5 | 540 | $7.25 | $3,915 |
| 10. Baseline survey for adults | 4,000 | 1,333 | 1 | 0.5 | 667 | $7.25 | $4,836 |
| 11. Follow-up survey for adults | 3,200 | 1,067 | 1 | 0.75 | 800 | $7.25 | $5,800 |
| Total estimated annual burden |  | 6,644 |  |  | 3,579 |  | $41,240 |

^1 These rows are annualized at the response level (as opposed to the respondent level).



Total annual cost

We estimate the average hourly wage for staff at the grantee organizations to be the average hourly wage of “social and community service managers” from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2010 ($27.86). We estimate the average hourly wage of program applicants based on the current federal minimum wage ($7.25).

A13. Cost burden to respondents or record keepers

There are no additional costs to respondents.

A14. Estimate of cost to the federal government

The cost over the three years of the requested clearance is $15,520,710, and the annualized cost to the Federal Government is $5,173,570.

A15. Change in burden

This is a new data collection.

A16. Plan and time schedule for information collection, tabulation, and publication

Analysis plan

Process study. The process study will document the implementation inputs and outputs in each evaluation site, assess adherence to plans for programming and curricula, assess participant responsiveness to the programming, identify factors that supported or hindered implementation, and document the counterfactual condition. These findings will aid in interpreting impact findings and generate evidence to support future replication of effective interventions and implementation strategies. The process study will use both qualitative and quantitative analysis methods to analyze data collected using the instruments included in this OMB package.

The evaluation team will use standard qualitative procedures to analyze and summarize information from semi-structured interviews and focus groups conducted using topic guides. Analysis will involve organization, coding, triangulation, and theme identification. For each qualitative data collection activity, standardized templates will be used to organize and document the information and then code this documentation. Coded text will be searched to gauge consistency and triangulate across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics, themes, and categories (Yin 1994; Coffey, Holbrook, and Atkinson 1996) that can then be analyzed to address the study’s research questions.

To code the qualitative data for key subtopics and themes, the evaluation team will first develop a coding scheme based on the interview or focus group questions. Senior members of the evaluation team will refine the initial coding scheme by reviewing codes and a preliminary set of data output, making adjustments to ensure alignment with the topics that emerge from the data. For each round of coding, multiple project team members will be trained to code the data using a qualitative analysis software package such as NVivo. To ensure reliability across coders, all team members will code an initial document and compare codes to identify and resolve discrepancies. As coding proceeds, the lead team member will review a sample of coded documents from each coder to monitor reliability. Coded data will enable the team to compare responses across respondents within and across sites by searching on specific codes. The software will also allow the team to retrieve data on particular codes by type of respondent. To compare information, the evaluation team may retrieve data for subsets of sites, such as sites serving youth in high schools.
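
As a hedged illustration of one way to monitor inter-coder reliability, the sketch below compares two coders’ code assignments for the same document segments and computes Cohen’s kappa. The data and the use of scikit-learn are assumptions for illustration, not the evaluation team’s specified procedure.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical code assignments by two coders for the same ten document
# segments (each label is an illustrative theme code).
coder_1 = ["recruitment", "attendance", "attendance", "curriculum", "recruitment",
           "staffing", "curriculum", "attendance", "recruitment", "staffing"]
coder_2 = ["recruitment", "attendance", "curriculum", "curriculum", "recruitment",
           "staffing", "curriculum", "attendance", "staffing", "staffing"]

# Cohen's kappa measures agreement beyond chance; values near 1 indicate
# strong reliability, and low values would prompt reconciliation of codes.
print(cohen_kappa_score(coder_1, coder_2))
```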

The evaluation team will summarize quantitative data using basic descriptive methods. Sources of quantitative data include the staff survey and session adherence data. Analysis of data from each source will follow a common set of steps involving data cleaning, variable construction, and computing descriptive statistics. To facilitate analysis of each data source, we will create variables to address the study’s research questions. Construction of these analytic variables may combine several survey responses into a scale, aggregate adherence data from a set time period, or compare responses to identify a level of agreement.
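
As a hedged illustration of this kind of variable construction, the following sketch combines several hypothetical survey items into a scale and computes basic descriptive statistics using pandas. The column names are placeholders, not actual STREAMS instrument fields.

```python
import pandas as pd

# Hypothetical staff survey extract; column names are placeholders.
staff = pd.DataFrame({
    "satisfaction_q1": [4, 3, 5, 2],
    "satisfaction_q2": [5, 3, 4, 2],
    "satisfaction_q3": [4, 4, 5, 3],
})

# Combine several survey responses into a single scale (here, a mean score).
items = ["satisfaction_q1", "satisfaction_q2", "satisfaction_q3"]
staff["satisfaction_scale"] = staff[items].mean(axis=1)

# Basic descriptive statistics for the constructed variable.
print(staff["satisfaction_scale"].describe())
```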

The evaluation team will use nFORM data (OMB no. 0970-0460) to document service receipt. These data will allow the evaluation team to generate summary statistics for key program features (an illustrative sketch follows the list below):

  • Enrollment patterns. For example, the average number of new applicants each month.

  • Services provided by grantees. For example, the average number of group sessions offered each month or the average number of individual case management contacts each month.

  • Participation patterns. For example, the number of participants who engage in a group session within two months of enrollment and the average number of hours of group sessions received by program participants.
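
A minimal pandas sketch of such summaries appears below. The attendance extract and its column names are hypothetical stand-ins; they are not the actual nFORM schema.

```python
import pandas as pd

# Hypothetical nFORM-style attendance extract; column names are illustrative.
attendance = pd.DataFrame({
    "participant_id": [101, 101, 102, 103, 103, 103],
    "session_month":  ["2016-07", "2016-08", "2016-07",
                       "2016-07", "2016-08", "2016-09"],
    "session_hours":  [2.0, 2.0, 2.0, 2.0, 2.0, 2.0],
})

# Participation pattern: number of group sessions attended in each month.
sessions_by_month = attendance.groupby("session_month").size()
print(sessions_by_month)

# Average total hours of group sessions received per participant.
hours_per_participant = attendance.groupby("participant_id")["session_hours"].sum()
print(hours_per_participant.mean())
```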

Impact study. Program impacts will be analyzed separately for each site. In most sites, the outcome data will come from the STREAMS baseline and follow-up surveys. In one site, the evaluation team will rely solely on nFORM data to examine strategies to promote regular program attendance. With a random assignment research design, unbiased impact estimates can be obtained by comparing mean outcomes for the treatment and control groups based on follow-up data alone. However, the precision of the impact estimates can be improved by estimating multivariate regression models that control for baseline covariates, such as baseline measures of the outcome variables. Regression adjustment can also address any differences between the treatment and control groups in baseline characteristics that arise by chance or from survey nonresponse.

The empirical specification for the regression model will depend on the unit of random assignment. In sites that randomly assign individuals to the treatment or control groups, the regression model can be expressed as follows:

(1) y_i = β′x_i + λT_i + ε_i

where y_i is the outcome of interest for individual i; x_i is a vector of baseline characteristics; T_i is an indicator equal to one for individuals in the treatment group and zero for individuals in the control group; and ε_i is a random error term. The vector of baseline characteristics x_i will include demographic characteristics, such as age, gender, and race/ethnicity, as well as baseline measures of the outcomes. The parameter estimate for λ is the estimated impact of the program.
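
As a hedged illustration, the sketch below estimates a regression of this form on simulated data using the statsmodels package. All variable names are placeholders rather than actual STREAMS measures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data standing in for a site with individual-level random
# assignment; variable names (treatment, age, baseline_outcome) are placeholders.
rng = np.random.default_rng(0)
n = 1_000
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),        # T_i: 1 if assigned to the program
    "age": rng.integers(18, 45, n),
    "baseline_outcome": rng.normal(0, 1, n),   # baseline measure of the outcome
})
df["outcome"] = (0.3 * df["treatment"]
                 + 0.5 * df["baseline_outcome"]
                 + rng.normal(0, 1, n))

# Regress the outcome on treatment status and baseline covariates, as in
# Equation (1); the coefficient on treatment is the estimated impact, lambda.
fit = smf.ols("outcome ~ treatment + age + baseline_outcome", data=df).fit()
print(fit.params["treatment"])
```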

In other study sites, random assignment will occur at the classroom level. In these sites, the estimated regression model must account for the correlation of outcomes among individuals in the same cluster: because all students in a classroom are randomly assigned as a single unit, individual sample members cannot be considered statistically independent. To account for this dependence, the regression model used to estimate program impacts can be expressed as follows:

(2) y_is = β′x_is + λT_is + η_s + ε_is

The general structure of the model is the same, but now y_is is the outcome measure for individual i in cluster s (and similarly for the treatment status indicator, T_is; the vector of baseline characteristics, x_is; and the error term, ε_is). Most importantly, Equation (2) accounts for the clustering of youth within classrooms through the inclusion of the cluster-level error term η_s, a cluster “random effect.” If this term is excluded, the precision of the impact estimates could be seriously overstated. As in Equation (1), the estimated impact of the program is λ.
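
As a companion illustration, the sketch below fits a model of this form to simulated classroom-randomized data, using a mixed model with a classroom random intercept in the role of η_s. All names are placeholders; this is one way to estimate Equation (2), not the evaluation team’s specified procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated classroom-randomized data; names are placeholders. The classroom
# random intercept plays the role of the cluster-level error term eta_s.
rng = np.random.default_rng(1)
n_classrooms, n_students = 100, 25
df = pd.DataFrame({"classroom": np.repeat(np.arange(n_classrooms), n_students)})

# Random assignment occurs at the classroom level, so treatment status is
# constant within each cluster.
assignment = rng.integers(0, 2, n_classrooms)
df["treatment"] = assignment[df["classroom"]]

eta = rng.normal(0, 0.5, n_classrooms)         # cluster random effect
df["outcome"] = (0.2 * df["treatment"]
                 + eta[df["classroom"]]
                 + rng.normal(0, 1, len(df)))

# Mixed model with a classroom random intercept, as in Equation (2); omitting
# the random effect would overstate the precision of the estimated impact.
fit = smf.mixedlm("outcome ~ treatment", df, groups=df["classroom"]).fit()
print(fit.params["treatment"])
```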

Time schedule and publications

Table A.5 displays the tentative timeline for data collection and reporting activities. Sample enrollment and baseline data collection are expected to begin around July 2016, after OMB approval is obtained, and to continue for roughly two years, until August 2018. Data collection for the process study will begin as soon as service delivery to sample members begins and will continue through the end of service delivery, approximately July 2016 through December 2018. Data collection for the follow-up survey will begin around July 2017, a year after the start of sample enrollment and baseline data collection, and will continue for just over two years, until November 2019.

Table A.5. Schedule for the STREAMS Evaluation

| Activity | Timing^a |
| --- | --- |
| Data collection |  |
| Sample enrollment and baseline surveys | July 2016 through August 2018 |
| Process study data collection | July 2016 through December 2018 |
| Follow-up surveys | July 2017 through November 2019 |
| Reporting |  |
| Process study reports | January 2019 through September 2019 |
| Impact study reports | October 2019 through July 2020 |
| Final synthesis reports | August 2020 |

^a Subject to timing of obtaining OMB approval.

The planned reporting activities have three main components: (1) process study reports, (2) impact study reports, and (3) a final synthesis report. The process study reports will be released on a rolling basis from roughly January 2019 through September 2019. A separate process study report will be prepared for each site. Site-specific impact reports will be released on a rolling basis from roughly October 2019 through July 2020. A final cross-site synthesis report will summarize and highlight key findings from the site-specific process study and impact reports. It is slated to be released in August 2020.

A17. Reasons not to display OMB expiration date

All instruments will display the expiration date for OMB approval.

A18. Exceptions to certification for Paperwork Reduction Act submissions

No exceptions are necessary for this information collection.

References

Antle, B., B. Sar, D. Christensen, F. Ellers, A. Barbee, and M. van Zyl. “The Impact of the Within My Reach Relationship Training on Relationship Skills and Outcomes for Low-Income Individuals.” Journal of Marital and Family Therapy, vol. 39, 2013, pp. 346-357.

Antle, B.F., D.J. Sullivan, A. Dryden, E.A. Karam, and A. Barbee. “Healthy Relationship Education for Dating Violence Prevention among High-Risk Youth.” Children and Youth Services Review, vol. 33, 2011, pp. 173-179.

Berlin, Martha, Leyla Mohadjer, Joseph Waksberg, Andrew Kolstad, Irwin Kirsch, D. Rock, and Kentaro Yamamoto. “An Experiment in Monetary Incentives.” In Proceedings of the Section on Survey Research Methods, pp. 393-398. Alexandria, VA: American Statistical Association, 1992.

Buhrmester, D., W. Furman, M.T. Wittenberg, and H.T. Reis. “Five Domains of Interpersonal Competence in Peer Relationships.” Journal of Personality and Social Psychology, vol. 55, no. 6, 1988, p. 991.

Carson, K.D., and A.G. Bedeian. “Career Commitment: Construction of a Measure and Examination of Its Psychometric Properties.” Journal of Vocational Behavior, vol. 44, no. 3, 1994, pp. 237-262.

Cobb, N.P., J.H. Larson, and W.L. Watson. “Development of the Attitudes About Romance and Mate Selection Scale.” Family Relations, vol. 52, no. 3, 2003, pp. 222-231.

Coffey, A., B.L. Holbrook, and P. Atkinson. “Qualitative Data Analysis: Technologies and Representations.” Sociological Research Online, vol. 1, no. 1, 1996. Available at: http://www.socresonline.org.uk/index_by_issue.html.

Diemer, M.A., and D.L. Blustein. “Vocational Hope and Vocational Identity: Urban Adolescents’ Career Development.” Journal of Career Assessment, vol. 15, no. 1, 2007, pp. 98-118.

Foshee, V.A., K. Fothergill, and J. Stuart. “Results from the Teenage Dating Abuse Study Conducted in Githens Middle School and Southern High Schools.” Technical report. Chapel Hill, NC: University of North Carolina, 1992.

Hsueh, J., D. Principe Alderson, E. Lundquist, C. Michalopoulos, D. Gubits, D. Gein, and V. Knox. “The Supporting Healthy Marriage Evaluation: Early Impacts on Low-Income Families.” New York, NY: MDRC, February 2012.

James, Jeannine M., and Richard Bolstein. “The Effect of Monetary Incentives and Follow-Up Mailings on the Response Rate and Response Quality in Mail Surveys.” Public Opinion Quarterly, vol. 54, 1990, pp. 346-361.

Kerpelman, J.L., J.F. Pittman, F. Adler-Baeder, S. Eryigit, and A. Paulk. “Evaluation of a Statewide Youth-Focused Relationship Education Curriculum.” Journal of Adolescence, vol. 32, 2009, pp. 1359-1370.

Lippman, L.H., R. Ryberg, M. Terzian, K.A. Moore, J. Humble, and H. McIntosh. “Positive and Protective Factors in Adolescent Well-Being.” In Handbook of Child Well-Being, pp. 2823-2866. Springer Netherlands, 2014.

Lundquist, E., J. Hsueh, A.E. Lowenstein, K. Faucetta, D. Gubits, C. Michalopoulos, and V. Knox. “A Family-Strengthening Program for Low-Income Families: Final Impacts from the Supporting Healthy Marriage Evaluation.” OPRE Report 2014-09A. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2014.

McLanahan, S., I. Garfinkel, N. Reichman, J. Teitler, M. Carlson, and C.A. Audigier. “The National Report: The Fragile Families and Child Wellbeing Study Baseline Report.” Princeton, NJ: The Center for Research on Child Wellbeing, August 2001.

Pearson, Jessica, Lanae Davis, and Jane Venohr. “Parents to Work!” Denver: Center for Policy Research, February 2011.

Pinto-Meza, A., A. Serrano-Blanco, M.T. Peñarrubia, E. Blanco, and J.M. Haro. “Assessing Depression in Primary Care with the PHQ-9: Can It Be Carried Out over the Telephone?” Journal of General Internal Medicine, vol. 20, no. 9, 2005, pp. 738-742.

Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, pp. 105-128. Washington, DC: National Academy Press, 2002.

Straus, M.A., S.L. Hamby, S. Boney-McCoy, and D.B. Sugarman. “The Revised Conflict Tactics Scales (CTS2): Development and Preliminary Psychometric Data.” Journal of Family Issues, vol. 17, no. 3, 1996, pp. 283-316.

Van Epp, M.C., T.G. Futris, J.C. Van Epp, and K. Campbell. “The Impact of the PICK a Partner Relationship Education Program on Single Army Soldiers.” Family and Consumer Sciences Research Journal, vol. 36, 2008, pp. 328-349.

Vennum, A., and F.D. Fincham. “Assessing Decision Making in Young Adult Romantic Relationships.” Psychological Assessment, vol. 23, no. 3, 2011, p. 739.

Wood, R.G., S. McConnell, Q. Moore, A. Clarkwest, and J. Hsueh. “The Effects of Building Strong Families: A Healthy Marriage and Relationship Skills Education Program for Unmarried Parents.” Journal of Policy Analysis and Management, vol. 31, no. 2, spring 2012, pp. 228–252.

Wood, R.G., S. McConnell, Q. Moore, A. Clarkwest, and J. Hsueh. “Strengthening Unmarried Parents’ Relationships: The Early Impacts of Building Strong Families.” Princeton, NJ: Mathematica Policy Research, 2010.

Wood, R.G., Q. Moore, A. Clarkwest, and A. Killewald. “The Long-Term Effects of Building Strong Families: A Program for Unmarried Parents.” Journal of Marriage and Family, vol. 76, April 2014, pp. 446-463.

Yin, R. Case Study Research: Design and Methods, 2nd ed. Thousand Oaks, CA: Sage Publishing, 1994.

Zaveri, Heather, P. Holcomb, R. Dion, D. Friend, and R. Selekman. “Fathers’ Motivations for Enrolling and Engaging in Responsible Fatherhood Programs: Insights from the Parents and Children Together (PACT) Evaluation.” Presentation at the Welfare Research and Evaluation Conference, May 2014.
