Child Support Noncustodial Parent Employment Demonstration (CSPED)

OMB: 0970-0439



U.S. Department of Health and Human Services

Administration for Children and Families

Office of Child Support Enforcement (OCSE)

Aerospace Building, 4th Floor

901 D Street, SW

Washington, DC 20447

Project Officer: Elaine Sorensen

Child Support Noncustodial Parent Employment Demonstration (CSPED)



New Collection

OMB Supporting Statement for Implementation, Cost, and Impact Studies

Part A: Justification

AUGUST 2013

1. Necessity for the Data Collection

The Office of Child Support Enforcement (OCSE) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services seeks approval to collect information for the Child Support Noncustodial Parent Employment Demonstration (CSPED) evaluation. Under CSPED, OCSE has issued grants to eight state child support agencies to provide employment, parenting, and child support services to noncustodial parents (NCPs) who are having difficulty meeting their child support obligations. OCSE has undertaken this demonstration to test whether state child support agencies can improve their effectiveness by providing these services.

The overall objective of the CSPED evaluation is to document and evaluate the effectiveness of the approaches taken by the eight CSPED grantees. The study will use an experimental design in eight sites to compare outcomes for study participants who will be randomly assigned to treatment and control groups. The evaluation is being undertaken by OCSE within ACF at the U.S. Department of Health and Human Services and its grantee, the Wisconsin Department of Children and Families, and is being implemented by the University of Wisconsin-Madison and its partner, Mathematica Policy Research.

a. Study Background

The past several decades have witnessed sweeping changes in family structure. In 1980, 77 percent of children lived with two married parents; by 2010, this figure had fallen to 66 percent (U.S. Census Bureau 2013). Child support is a key resource for children living apart from one of their parents, and recent demographic and policy changes have made an effective child support system increasingly important. More than four in ten children are born to unmarried parents, and many married couples with minor children go through divorce, making child support a potentially important source of support for most children at some point in their lives. Changes in the social safety net, which no longer includes an entitlement to cash assistance for low-income single parents, also make reliable child support increasingly important. At the same time, many NCPs, including a disproportionate share of those whose children are living in poverty, have limited earnings and limited ability to pay support. Thus, a successful child support system must both enforce and enable NCPs' contributions, which requires effective policies to encourage employment among noncustodial parents. The CSPED evaluation is designed to identify effective policy alternatives to address these needs.

The problem of NCPs not paying child support, and the degree to which it disproportionately affects lower-income families, has been documented in earlier research (Heinrich et al. 2011; Manning and Smock 2000; U.S. Census Bureau 2013). This problem affects the states, custodial parents and their families, and the NCPs themselves. The debt is primarily owed to the states because many of the mothers who would otherwise receive the child support payments are receiving public assistance, which is paid by the state collecting the support. Statistics suggest that unemployed (and sometimes incarcerated or part-time employed) NCPs account for the largest share of this debt and that the likelihood of the debt being repaid is slim. Studies of programs aimed at addressing these issues have been inconclusive. In Maryland, for example, an initiative aimed at employing NCPs was combined with a debt forgiveness effort, so the contribution of the employment component could not be clearly distinguished from that of the debt forgiveness (Heinrich, Burkhardt, and Shager 2011).

Few rigorous studies of child support programs have been conducted to date; most states are instead conducting non-experimental evaluations of TANF-related policy changes (Meyer and Cancian 2002). The CSPED evaluation will use a rigorous, randomized controlled trial design to examine whether child support programs that provide employment and other services to NCPs are effective at improving child support payments. In addition, the evaluation will generate extensive information on how these programs operated, what they cost, the effects they had, and whether their benefits exceeded their costs. The information gathered will be critical to informing decisions related to future investments in child-support-led, employment-focused programs for NCPs who have difficulty meeting their child support obligations.

b. Overview of the Evaluation

The CSPED evaluation is a rigorous, innovative, and efficient study that will advance the field by building the knowledge base about effective strategies for supporting NCPs in their roles as providers for their children. The evaluation’s two main components—the implementation and cost study and the impact study—will yield considerable information about not only whether a program is effective, but also about how it operates, why it may or may not be effective, and the challenges and opportunities it faces.

The purpose of the implementation and cost study is to provide a detailed description of how the programs are implemented, the contexts in which they are operated, their participants, their promising practices, and their costs and benefits. The key data collection activities of the implementation and cost study include: (1) conducting semi-structured interviews with program staff and selected community partner organizations, (2) conducting focus groups with program participants, (3) administering a web-based survey to program staff and community partners, and (4) collecting data on study participant service use, dosage, and duration of enrollment throughout the demonstration using a web-based Management Information System (MIS) to track program participation.

The goal of the impact study is to provide estimates of the effectiveness of the programs offered by the eight CSPED grantees. The evaluation will be based on a randomized controlled trial (experimental) research design in which program applicants who are eligible for CSPED services will be randomly assigned to either a treatment group that is offered CSPED program services or a control group that is not. The study MIS that will document service use for the implementation and cost study will also be used to randomly assign program applicants to the treatment and control groups. The impact study will rely on data collected from surveys of participants as well as administrative records from state and county data systems. Survey data will be collected twice from program applicants: at baseline and 12 months after random assignment.

c. Data Collection Activities Requiring Clearance (Current Request)

This ICR requests clearance for nine data collection protocols: four will be used in the implementation and cost study and five in the impact study.



Implementation and Cost Study

Clearance is requested for the following data collection activities designed to support this effort:

  1. Staff interview topic guide. The topic guide will be used to conduct semi-structured interviews with program staff and selected community partner organizations across the eight grantee sites during site visits conducted during the first and third year of program implementation.

  2. Focus group guide. The focus group guide will be used to conduct focus groups with program participants at each site to gather information about their program experiences.

  3. Program staff survey. The staff survey will be web-based and administered to program staff and staff at partner agencies working with CSPED participants. The survey will be administered twice and will capture broader staff program experiences beyond the information garnered from the semi-structured interviews.

  4. Study MIS to track program participation. The study MIS will be web-based and will be used to track participation in the program. Information about services received by all program participants will be entered into the system by program staff.

Impact Study

Clearance is requested for the following data collection activities designed to support this effort:

  1. Introductory script read by program staff. The script will be used by grantee site staff to introduce applicants to the CSPED program and the study and to address questions they may have about the study.

  2. Introductory script heard by program applicants. Program applicants will hear program staff use the introductory script to introduce them to the CSPED program and the study and to address questions they may have about the study. A list of frequently asked questions (FAQs) is provided in Attachment C.

  3. Baseline survey. The baseline survey will be administered to program applicants using Computer-Assisted Telephone Interviewing (CATI). Grantee staff will provide a telephone for program applicants to use to call an interviewer employed by the contractor. The interviewer will begin by describing the study to the applicant and asking for the applicant’s consent to participate in the study. Once sample intake is complete, a copy of the consent statement will be provided to the sample member (Attachment A). If the applicant agrees to participate in the study, the interviewer will administer the baseline survey (provided as Instrument #7). The CSPED baseline survey draws heavily from the OMB-approved Parents and Children Together (PACT) baseline data collection instrument. Attachment B contains a detailed description of the ways in which the PACT baseline data collection instrument was tailored for the purposes of the CSPED study. A question-by-question justification for the items included in the CSPED baseline survey is also presented in Attachment B.

  4. Study MIS to conduct random assignment. The study MIS will include functions to conduct the random assignment of applicants at all evaluation sites.

  5. Protocol for collecting administrative records. This protocol will be used to extract information from state and county databases on participants’ child support obligations and payments, Temporary Assistance for Needy Families (TANF), Supplemental Nutrition Assistance Program (SNAP), and Medicaid benefits, involvement with the criminal justice system, and earnings and benefit data collected through the Unemployment Insurance (UI) system.

A separate OMB submission will seek clearance for a 12-month follow-up survey of program participants, as well as additional administrative records sources. An overview of the topics covered by that 12-month follow-up survey is provided as Attachment F.

d. Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection. OCSE is undertaking the collection at the discretion of the agency.

2. Purpose and Use of the Information Collection

The data collected through the instruments included in this ICR will be used to learn about the approaches that will be implemented by the eight CSPED grantees to provide employment supports and other services to NCPs who face barriers to employment and experience difficulties in paying child support. The information to be obtained through the CSPED evaluation is critical to understanding the CSPED programs—the services they provide, the experiences of their participants, their effectiveness at improving outcomes for NCPs and their children, and their ability to improve the performance of the Title IV-D Child Support Program. The data collected in the CSPED evaluation can also be used to inform decisions related to policy and programmatic improvements to the Title IV-D Child Support Program. If the information collection requested by this ICR is not conducted, policymakers and providers of employment and child support programs will lack high-quality information on the effects of the programs, as well as descriptive information that can be used later to refine the operation of the programs and better meet child support performance goals. Details on the purpose and use of the information collection for each of these studies are provided below.

a. Implementation and Cost Study

The goal of the implementation and cost study is to provide a detailed description of the programs: how they are implemented, their participants, the contexts in which they are operated, their promising practices, and their costs and benefits. These detailed descriptions will assist in interpreting program impacts, identifying program features and conditions necessary for effective program replication or improvement, and carefully documenting the costs of delivering these services.

  • Staff interview topic guide. This guide will be used to collect information from program staff on the plans and goals for the program, the staffing structure, recruitment and engagement strategies, services offered, costs, enrollment and receipt of services, and characteristics of the community.

  • Focus group guide. The focus groups will explore participants’ perspectives on their motivation for enrolling in the program, and the availability, quality, and value of program services. Of particular interest will be participants’ level of satisfaction with the program and their assessment of the knowledge and skills gained as a result of program participation.

  • Program staff survey. The survey will ask respondents to describe their work activities, work experience, interactions with other staff members, opportunity to receive training and supervision, the supportiveness of the organization hosting the program, costs, and how the program delivers services and makes needed resources available.

  • Study MIS to track program participation. Data collected through the study MIS are critical to the implementation and cost study. The data collected will provide information on program participation (e.g., participant entry into the program, participation in services, and exit from the program). The data will be used for two main purposes:

    1. Collecting information on the services provided by grantee sites and the extent of program participation. Program staff will be asked to report on all services provided to program participants on an ongoing basis. The implementation study will describe the services grantee programs offered and the level of participation in those services. Research indicates that many social service programs find it difficult to engage and retain participants—many individuals either never begin participating after enrollment or leave the program before completing it. Hence, it is important to collect information both on what services the grantee site offers and on what services participants actually receive. This information will also aid in interpreting the impact estimates (by allowing analysis by high or low levels of active participation/dosage).

    2. Monitoring grantee sites during the study period. The information gathered through the study MIS will be used to monitor program performance and provide timely feedback to the grantees to help them identify any areas needing attention.

b. Impact Study

The purpose of the impact study is to provide rigorous estimates of the effectiveness of the eight CSPED programs using an experimental research design. Program applicants who are eligible for CSPED services will be randomly assigned to either a program group that is offered program services or a control group that is not.

  • Introductory script. The grantee staff will use this script to describe the study to the applicant and explain why they will be asking him or her to speak with an interviewer over the telephone.

  • Baseline Survey. Data collected through the baseline survey are crucial for the impact study, and will provide critical information on both study participants served and those who are not served by grantee programs. In particular, these data will be used for the following purposes:

  1. Describing the characteristics of participants. The baseline survey will gather descriptive information on study participants at baseline to make it possible to identify the characteristics of NCPs who apply to grantee programs. In addition to basic demographic information, these data will provide information about the types of challenges faced by NCPs who enroll in grantee programs (for example, education level, employment status, housing stability, etc.). These data will also be used to construct survey nonresponse weights that adjust for potential bias that might arise from follow-up survey nonresponse, to control for baseline characteristics in estimating program impacts, to define groups for subgroup analysis, and to estimate propensity score models for analysis of impacts for those who received larger doses of CSPED services.

  2. Identifying subgroups of interest. Baseline data will be used to identify subgroups for which impacts may differ—for example, it may be that impacts are larger for NCPs who see their children more often than for those who seldom see them—or to identify subgroups which may be informative to qualitative analyses.

  3. Collecting information that can explain variation in outcomes. Impact estimates obtained from the differences between mean outcomes of treatment group members and mean outcomes of control group members are unbiased. However, impact estimates obtained using a regression model with covariates that explain some of the variation in outcomes at follow-up, such as the outcomes assessed at baseline, can improve the precision of the estimates. Hence, the baseline survey includes baseline measures of key outcomes that will be measured again on the follow-up survey. (A brief illustration of this precision gain follows this list.)

  4. Identifying factors that could predict program participation. The primary impact analysis will focus on the estimated effect of offering grantee services to NCPs. Factors at baseline that predict program participation can be used to estimate the impact of receiving different types and intensities of program services (as described in section A16). Hence, the baseline survey asks the respondent about his or her motivation to participate in the program and barriers to participation. Information collected from grantee staff as part of the study’s MIS (described below) will also be used for this purpose.

  5. Checking that the treatment and control groups are on average similar at baseline. Information on the characteristics of study participants can be used to check the similarity of the treatment and control groups. Although random assignment produces similar groups, on average, baseline data will be used to verify program-control equivalence for the full research sample and for the sample of respondents to the follow-up survey.

  6. Identifying and tracking study participants. Identifying information includes the study participant’s complete name, sex, date of birth, mailing address, and Social Security number. This information is needed to match with other administrative data (for example, wage/earnings data, child support data) to assess the impact of the programs on these key outcomes. In addition, personal information along with information on sample members’ telephone numbers, email addresses, social network information, and contact information for up to three relatives or friends is needed to facilitate locating study participants for follow-up survey data collection. Accurate and detailed locating information is essential for achieving high survey response rates.
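
To illustrate the precision argument in item 3 above, the two estimators can be written side by side. This is a generic sketch in our own notation, not an excerpt from the study's analysis plan. With outcome $y_i$ and treatment indicator $T_i$, the difference-in-means estimate is

    \hat{\beta}_{DM} = \bar{y}_{\text{treatment}} - \bar{y}_{\text{control}},

while the regression-adjusted estimate is the coefficient $\beta$ in

    y_i = \alpha + \beta T_i + \gamma' x_i + \varepsilon_i,

where $x_i$ is a vector of baseline covariates, such as the baseline measure of the outcome. Both estimators are unbiased under random assignment, but to the extent that $x_i$ explains variation in $y_i$, the residual variance $\mathrm{Var}(\varepsilon_i)$ is smaller than $\mathrm{Var}(y_i)$, so the standard error of $\hat{\beta}$ shrinks and smaller impacts become detectable.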

  • Study MIS to conduct random assignment. Data collected through the study MIS are critical to the impact study. The data will be used for two main purposes associated with the CSPED impact study:

  1. Conducting random assignment. The CSPED impact evaluation will be a randomized controlled trial (experimental) evaluation. The study MIS, overseen by the evaluation contractor, will conduct random assignment after participants have consented and completed the baseline survey. Random assignment is the core of an experimental impact evaluation because it creates a control group that is similar, on average, to the treatment group on all baseline characteristics. For this reason, an experimental evaluation is often considered the most rigorous form of program evaluation. (An illustrative sketch of a random assignment routine follows this list.)

  2. Estimating the impact of receipt rather than offer of services. The data on program participation obtained with the MIS for the treatment group can be used to estimate the relationship between participant characteristics at baseline (including grantee staff predictions of likely participation) and participation in CSPED program activities. This information can also be used to estimate the impact of receipt of services (as described in Section A16).

  • Protocol for collecting administrative records. Data extracted from administrative records are essential to the impact study because information about study participants’ court records, child support payments, unemployment benefits, and services received outside the CSPED program will be used to demonstrate the contrasts between the CSPED program group (the treatment group) and the comparison (control) group.
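
To make the random assignment step concrete, the following is a minimal, purely illustrative sketch in Python of the check-then-assign logic described above. It is not the contractor's actual MIS code, and all names in it are hypothetical.

    import random

    def randomly_assign(applicant_id, enrolled, assignments, p_treatment=0.5, rng=None):
        """Assign a consenting applicant to the treatment or control group."""
        # Reject duplicates: re-assigning an applicant would compromise
        # the integrity of random assignment.
        if applicant_id in enrolled:
            raise ValueError("Applicant %s has already been randomly assigned." % applicant_id)
        rng = rng or random.Random()
        # A single random draw determines group membership.
        group = "treatment" if rng.random() < p_treatment else "control"
        enrolled.add(applicant_id)
        assignments[applicant_id] = group  # recorded for later impact analysis
        return group

    # Example: assign two hypothetical applicants with equal probability.
    enrolled, assignments = set(), {}
    for applicant in ["NCP-0001", "NCP-0002"]:
        print(applicant, randomly_assign(applicant, enrolled, assignments))

An operational system would also log each assignment and draw from a vetted random number source, but the essential logic is the duplicate check followed by a single random draw per applicant.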

3. Improved Information Technology to Reduce Burden

The CSPED evaluation will use multiple methods to collect study information. Web-based applications will be used for the survey of program staff, the study MIS, and the collection of administrative data. CATI will be used for the baseline survey of participants. Semi-structured interviews and focus groups do not lend themselves to the use of information technology to reduce burden.

a. Implementation and Cost Study

Staff interview topic guide. These semi-structured interviews will be conducted in person by the data collection team, without the use of information technology.

Focus group guide. The focus groups will be facilitated by a member of the CSPED evaluation team, without the use of information technology.

Program Staff Survey. The survey of program staff and community partners will be administered via the web and is expected to take no longer than 30 minutes to complete. The web instrument offers the easiest means of providing data: because it will be programmed to automatically skip questions that are not relevant to the respondent, it will reduce respondent burden and yield high-quality data. The instrument will also allow respondents to complete the survey at a time convenient to them, without the risk of losing a paper questionnaire. If respondents are unable to complete the survey in one sitting, they may save their place and return to the questionnaire at another time, further reducing burden. In addition to the web instrument, participants may request a paper (mail or fax) questionnaire or receive telephone assistance in completing the survey from the contractor’s site liaison.

Study MIS to track program participation. The study MIS will be a web-based application providing easy access while maintaining the security of the data. The web-based application will allow sites to access the MIS without purchasing or installing additional software or changing the configuration of their computers. The system can be accessed from any computer, allowing for ease of entry, while the data are housed on secure servers behind the contractor’s firewall, thereby maintaining data security. The system has been designed with grantee staff in mind, based on experience from prior studies with similar types of service providers. As such, it will be flexible and easy to use, and will include navigational links to relevant fields for each type of entry to reduce burden on grantee site staff and increase the quality and quantity of data collected. The system is designed for multiple users at each organization and will include options for varying levels of system access depending on users’ needs. For example, administrators or supervisors will have the greatest rights within the system, with the ability to create new users, assign program participants to staff members, and review all activity for the organization. Staff providing direct services to study participants will be able to record and review information about participants assigned to their caseloads. The various levels of system access allow for streamlining of information; limiting full system access to a small set of staff members promotes data security and data quality.

b. Impact Study

Introductory script. This script does not lend itself to the use of improved information technology such as computerized interviewing.

Baseline Survey. The baseline survey will be conducted using CATI. CATI is well suited to interviews that involve complex skip patterns, interviewer probes, and large numbers of respondents. CATI reduces respondent burden by automating skip logic and question adaptations and by eliminating delays caused when interviewers must determine the next question to ask. CATI is programmed to accept only valid responses, based on preprogrammed checks for logical consistency across answers. Interviewers are thus able to correct errors during the interview, eliminating the need for burdensome and costly call-backs to respondents.

Study MIS to conduct random assignment. Use of information technology and burden reduction through the study MIS is described in the implementation and cost study description presented in the previous section.

Protocol for collecting administrative records. The CSPED evaluation team will request administrative records from state and county agencies to extract information from their databases to gather outcome data for the impact study. The evaluation team will set up a secure web-based system for transferring these data.

4. Efforts to Identify Duplication and Use of Similar Information

The CSPED Evaluation will not require the collection of information that is available from alternative data sources.

None of the instruments will ask for information that can be reliably obtained through administrative data collection. For example, the baseline survey will ask study participants to provide limited information on formal child support, because administrative data in one state do not consistently capture child support orders in other states. The baseline survey will ask study participants to report on informal contributions (monetary and in-kind support) that would not be reflected in administrative data. In addition, information on quarterly earnings (reported to the state unemployment insurance agency) will be obtained from administrative data; the baseline survey will ask for earnings in the past month to capture more recent earnings and earnings that may not have been reported to the unemployment agency. Though criminal history information within a state is potentially available through administrative sources, that information will be gathered through the baseline survey because not all states allow administrative data access for research purposes, because state records do not always capture criminal activity in other states, and because in some states administrative data lack key information. Nevertheless, participants will be asked to provide consent for the collection of administrative data on criminal background if that is deemed necessary at a later time.

Child support programs do not typically collect all the information that will be gathered by the study MIS. For instance, information required for intake and random assignment is not likely to be available from other sources. Likewise, child support programs often do not have an existing MIS that systematically tracks the information to be included in the service receipt section of the study MIS. However, if a grantee has an existing MIS that tracks information needed for the CSPED Evaluation, we will accept data from their existing MIS.

No program participant or staff member will be asked for the same information more than once. For example, the staff will not be asked during the semi-structured interviews any questions that they are asked on the staff survey.

5. Impact on Small Businesses or Other Small Entities

No small businesses are expected to be involved in data collection. Nonetheless, instruments have been tailored to minimize burden and collect only critical evaluation information.

6. Consequences of Not Collecting Information or Collecting Information Less Frequently

Not collecting information for the CSPED evaluation overall would limit the government’s ability to document the kinds of activities implemented with federal funds and its ability to measure the effectiveness of such activities. In particular, the CSPED evaluation represents an important opportunity for OCSE to learn about how to improve child support program performance and increase the reliable payment of child support through the provision of enhanced child support services and employment programs for NCPs and their children. If the information collection requested by this clearance package is not conducted, policymakers and providers of these programs will lack high-quality information on the impacts of the programs, as well as descriptive information that can be used later to refine the programs.

a. Implementation and Cost Study

Staff interview topic guide. Without interviews with program staff, the study will not have complete information about the implementation of the CSPED programs. We propose collecting data from interviews with CSPED program staff twice: during a site visit conducted early in implementation and another conducted late in implementation. The first visit will focus on understanding program design, while the later visit will focus on implementation experiences. In addition, prior experience (Dion et al. 2010) has shown that service delivery programs similar to those delivered by CSPED grantees modify their implementation approach over time in response to their early experiences, so collecting these data twice will capture those changes as well as staff feedback about the lessons they learned along the way.

Focus group guide. Without focus groups, the participant perspective on the program will not be captured. Program participants will not be asked to participate in multiple focus groups.

Program staff survey. Without this survey, information that would be difficult to explore during semi-structured interviews, such as the quality of staff relationships and the supportiveness of program leadership, would not be collected. The survey will also allow for the collection of data from a broader set of program staff than can be interviewed during an in-person visit. The staff survey will be administered twice (once early in program implementation and once after operations are more established), capturing changes over time in staff composition as well as in staff perceptions of the program.

Study MIS to track program participation. Staff will be asked to enter information about services offered to participants (e.g., individuals assigned to the next parenting workshop) and their actual participation and attendance throughout the period of the study. Without information on service receipt, we will not be able to describe the services offered to participants by CSPED programs and the extent to which program participants received these services. These data are critical to the implementation analysis and to interpreting the findings from the impact analysis.

b. Impact Study

Introductory script. This script is necessary to ensure that program staff provide program applicants with accurate information about the study and explain why the applicant needs to talk with an interviewer employed by the evaluation contractor.

Baseline Survey. Without collection of detailed contact information on study participants at baseline, the ability to track study participants over a 12-month follow-up period would be limited. This would likely lead to a lower response rate and a greater risk that the impact estimates will be biased by nonresponse. The lack of baseline information would also limit the evaluation contractor’s ability to describe the population of CSPED program participants and would limit the analysis of program impacts on subgroups, thereby limiting the ability to determine the groups for which the program is most effective. Without data from the baseline survey, baseline information cannot be included as covariates in the impact analyses, which will render the impact estimates less precise and make small impacts less likely to be detected. Also, adjustments for nonresponse to the follow-up survey would have to be based on administrative data, which are much more limited. In addition, without baseline information on factors that could predict program participation, it would not be possible to measure the impact of programs on receiving services, rather than being offered services (for more detail, see Section A16).

Finally, the baseline survey yields data that are vital for ensuring that random assignment is properly implemented. In particular, without data from the baseline survey, it would not be possible to test whether the program and control groups were equivalent at baseline on many key measures (such as those not covered by administrative data). Baseline surveys will be collected only once; thus, no repetition of effort is planned.

Study MIS to conduct random assignment. Information entered by grantee site staff at intake is collected once, prior to submitting an applicant’s case for random assignment. Without entry of this information, we would not be able to check whether the applicant is already a member of the evaluation sample, a check that ensures the integrity of random assignment. In addition, staff predictions of likely program participation would be missing, making it more difficult to estimate impacts on those who actually participated.

Protocol for collecting administrative records. Without administrative data, information about public benefits received, criminal justice involvement, child support payments history, unemployment benefits information, and employment history both before and after study enrollment would not be collected. This information is crucial to differentiating program effects between treatment and control groups and to identifying the net costs of the program.

7. Special Circumstances

There are no special circumstances for the proposed data collection.

8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

In accordance with the Paperwork Reduction Act of 1995, the public was given an opportunity to review and comment on this collection through a 60-day Federal Register Notice, published on January 11, 2013 (78 FR 2409-2411; document number 2013-00416). The notice provided 60 days for public comment.

9. Explanation of Any Payment or Gift to Respondents

The payments we propose to make for the data collection activities covered by this ICR are summarized in Table A.1.

Table A.1. Respondent Payments Proposed for Data Collection Activities

Data Collection Activity | Length of Activity (minutes) | Respondent Payment (per participant)

Implementation and Cost Study

Focus groups | 90 | $20

Impact Study

Baseline survey | 35 | $10

12-month follow-up survey^a | 45 | $25

a A separate OMB submission will seek clearance for a 12-month follow-up survey of program participants. An overview of the topics covered by that 12-month follow-up survey is provided as Attachment F.


We propose to make respondent payments for three reasons:

  1. To increase response rates. The knowledge that they will be paid is expected to increase respondents’ willingness to spend the time completing the data collection activity. Research has shown that respondent payments are effective at increasing response rates for populations similar to participants in child support and employment programs: people with lower levels of education (Berlin et al. 1992) and low-income and nonwhite populations (James and Bolstein 1990). Singer and Kulka (2002) showed that respondent payments reduced differential response rates and hence the potential for nonresponse bias. The payment proposed for the focus groups is higher than the payment proposed for the baseline survey because of the greater burden of that request and to facilitate recruitment.

  2. To reduce attrition in follow-up data collection. In longitudinal studies, providing an incentive for earlier surveys may contribute to higher response rates for subsequent surveys (Singer et al. 1998). Therefore, providing a modest payment at baseline may reduce attrition for follow-up data collection.

  3. To gain study participants’ cooperation in data collection activities. Providing a modest payment to all study participants, including those who are assigned to the control group, will show participants that their time is valued. The payment proposed for the 12-month follow-up survey is higher than the payment proposed for the baseline survey to maximize the incentive to participate in the follow-up survey.

10. Assurance of Confidentiality Provided to Respondents

The consent statement and all other materials provided to study participants and program staff will include assurances that the research team will protect their privacy to the fullest extent possible under the law. Before the baseline survey is administered, the interviewer will read a consent statement, which includes a pledge that responses will be kept confidential and reported in a manner that will not identify individual respondents (see page ii of instrument #7). In addition, a written consent statement will be distributed to participants by grantee site staff at the time of study enrollment. Consent will be provided verbally by the participant after the consent statement has been read to the participant by the interviewer. The consent statement for the web-administered program staff survey is provided in the “Introduction” section of Instrument #3. This text will be provided on the first page of the web survey after the respondent logs in. Several specific measures will be taken to protect respondent privacy:

  • Training interviewers in confidentiality procedures. The oral consent process and baseline interview will be administered by telephone interviewers at the University of Wisconsin Survey Center (UWSC), who will remotely access Mathematica’s CATI system via secure network connection. Interviewers will be seated in a common supervised area. As part of the telephone interviewers’ introductory comments, study participants will be told that their responses will be protected and that they will have the opportunity to have their questions concerning the study answered by the interviewer. Interviewing staff at the UWSC will receive training that includes general UWSC security and confidentiality procedures as well as project-specific training that includes explanation of the highly confidential nature of this information, instructions not to share this or any other personally identifiable information (PII) with anyone not on the project team, and warnings about the consequences of any violations. After receiving training, these staff members will sign confidentiality and nondisclosure agreements.

  • Using CATI for consent and the baseline survey. Administering consent and the baseline survey via CATI eliminates security risks related to shipping hard-copy forms containing PII to the evaluator. Additionally, UWSC interviewers logging in remotely to the Mathematica network using a secure network connection enables data to be stored on servers behind Mathematica’s secure firewall to minimize security risks.

  • Restricting and logging access to the sample management system (SMS). Some data elements from the baseline survey data will be entered into an SMS to help with locating sample members for the follow-up survey. This is a SQL server database housed on an encrypted server. A hierarchical architecture will be used to assign user rights to specific individuals who will be able to access the system and enter information only at their own location. All activity in the system will be logged.

  • Restricting access to the study MIS. Data collected through the study MIS will be housed on secure servers behind Mathematica’s firewall. Access to the study MIS will be restricted to approved staff members assigned a password with permission from the study director.

  • Using de-identified data for all focus group reports. Any data elements used for recruitment of focus group participants, such as name and telephone number, will be destroyed after completion of the focus groups. Interview transcripts and resultant reports will not identify respondents by name.

Data security. In addition to these study-specific procedures, the evaluator has extensive corporate administrative and security systems to prevent the unauthorized release of personal records, including state-of-the-art hardware and software for encryption that meet federal standards, and other methods of data protection (e.g., requirements for regular password updating), as well as physical security that includes limited key card access and locked data storage areas.

The contractor has a secure server for online data collection (including data collected through the MIS and through the staff web survey), utilizing its existing and continuously tested web survey infrastructure. This infrastructure features the use of HTTPS (secure socket, encrypted) data communication; authentication (login and password); firewalls; and multiple layers of servers, all implemented on a mixture of platforms and systems to minimize vulnerability to security breaches.

Hosting on an HTTPS site ensures that data are transmitted using 128-bit encryption, so that transmissions intercepted by unauthorized users cannot be read as plain text. This security measure is in addition to standard password authentication that precludes unauthorized users from accessing the web application.

The contractor has established data security plans for handling all data during all phases of survey execution and data processing for the surveys it conducts. The contractor’s existing plans meet the requirements of U.S. federal government agencies and are continually reviewed in the light of new government requirements and survey needs. Such security is based on (1) exacting company policy promulgated by the highest corporate officers in consultation with systems staff and outside consultants, (2) a secure systems infrastructure that is continually monitored and evaluated with respect to security risks, and (3) secure work practices of an informed staff that take all necessary precautions when dealing with private data.

11. Justification for Sensitive Questions

a. Implementation and Cost Study

The instruments associated with the implementation study have no sensitive questions. Most focus on the experiences of program and community organization staff with their jobs of delivering services at the program or elsewhere in the community. The program staff and community partner survey asks case managers and partners to rate aspects of their working relationship, but the data from this instrument will be aggregated for analysis and individual responses will not be shared with the other half of the case manager/community partner dyad. Focus groups with program participants will ask about those respondents’ impressions of and experiences with the program, but will not make special requests for personal information.

b. Impact Study

Some sensitive questions are necessary in a study of programs designed to affect personal relationships and employment. Prior to starting the baseline survey, all respondents will be informed that their identities will be kept private and that they do not have to answer questions that make them uncomfortable. Table A.2 describes the justification for the sensitive questions included in the baseline survey. Although these questions are sensitive, they have commonly, and successfully, been asked of respondents similar to those who will be in this study (for example, in the Fragile Families and Child Wellbeing Study, the Building Strong Families Study, the Early Head Start Research Evaluation Project, and the PACT Evaluation).



Table A.2. Justification for Sensitive Questions – Baseline Survey and Study MIS

Respondent Social Security number (baseline survey question A3)

The respondent’s Social Security number is essential for this evaluation for three reasons. First, it will be used to collect administrative data on the respondents. The Social Security number will allow us to obtain important outcome data on the respondent from child support agencies and state and county databases. Second, Social Security numbers will also be used to collect information on the location of the study participant for the follow-up data collection. Third, these numbers will be used as an identifier to link the information collected in the study MIS with the survey data and will allow the study MIS to check whether the person has already been randomly assigned.

Current romantic relationships (baseline survey questions D19-D25)

Information on NCPs’ current romantic relationships is important for understanding the complex structure of their households and the demands on their time and resources. These relationships could influence the time and resources NCPs devote to their children. These questions have been asked successfully on other large-scale survey efforts involving low-income families, such as the Building Strong Families evaluation.

Earnings (baseline survey questions E1-E9)

A key goal of child support programs is to improve NCPs’ economic stability. The outcomes of a parent employed when entering the program may be very different than those of a parent who enters without employment. The survey asks whether the respondent worked in the past month and, if so, the amount he or she earned in the last month from formal and informal jobs. This question has been asked successfully in many surveys including the Building Strong Families survey (Wood et al. 2010).

Barriers to employment (baseline survey question E10)

Noncustodial parents’ barriers to employment are expected to be key predictors of economic stability outcomes at follow-up. The survey asks how much issues such as problems getting to work, not having necessary skills, having to take care of a family member, not having a place to live, problems with alcohol or drugs, trouble getting along with others, physical health problems, and having a criminal record have made it difficult for NCPs to find or keep a job in the past year. This question has been asked successfully in the Fragile Families and Child Well-Being Study (McLanahan 2009) and the Building Strong Families survey (Wood et al. 2010).

Symptoms of depression (baseline survey question F4)

Parental depression has been shown to have adverse consequences for labor market and child outcomes (Alexandre and French 2001, Downey and Coyne 1990, Gelfand and Teti 1990, Marcotte et al. 1999). To measure depressive symptoms, we will use eight items from the Patient Health Questionnaire (PHQ-9), which was designed as a diagnostic instrument for depression but can also be used to measure subthreshold depressive disorder in the general population (Martin et al. 2006). The PHQ-9 has been shown to be reliable and valid in diverse populations and has been used in clinical settings to measure symptom improvement and monitor treatment outcomes (Kroenke, Spitzer, and Williams 2001; Löwe et al. 2004). Findings from telephone administrations of the instrument have been shown to be similar to in-person assessments (Pinto-Meza et al. 2005). The PHQ-8 includes eight of the nine items from the PHQ-9; it has been shown to be a useful measure of depression in population-based studies (Kroenke et al. 2009).

Involvement with the criminal justice system (baseline survey questions F7-F11)

Recent research suggests that a history of incarceration and involvement with the criminal justice system may be fairly common among parents in the CSPED target population (Pearson et al. 2011). Parental incarceration has major negative effects on child and family well-being, reducing the financial support and other types of support the parents can provide to their children and families. Similar questions have been included in other large national studies, such as the Fragile Families and Child Wellbeing Study, the National Job Corps Study, and the Building Strong Families Study. In the Building Strong Families survey, nonresponse was less than 1 percent for these items.


12. Estimates of Annualized Burden Hours and Costs

The estimated reporting burden and cost for the data collection instruments included in this ICR is presented in Tables A.3 through A.5.

We estimate the average hourly wage for staff at the grantee organizations to be the average hourly wage of “social and community service managers” taken from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2010 ($27.86). The average hourly wage of program applicants is estimated to be $7.25, the federal minimum wage.

Implementation and Cost Study

Table A.3 summarizes the proposed burden and cost estimates for the use of the implementation and cost study instruments in the eight evaluation sites. The total estimated cost figures are computed from the total annual burden hours and an average hourly wage for staff (using the $27.86 rate described above) and program applicants (using the $7.25 hourly wage described above). Figures are estimated as follows:

  • Semi-structured interview with grantee staff and community partners. We expect to interview 120 grantee site staff members and community partners (15 per site across 8 sites) twice during the evaluation period (in years 2 and 4) and therefore expect to conduct 240 interviews. We expect these meetings, which will involve a semi-structured interview about experiences with the program, to last approximately 1 hour per interview. Thus, the total burden for grantee site staff and community partners is 240 hours (120 staff members participating in 2 meetings of 1 hour in duration each), and the total annualized burden over three years is 80 hours.

  • Focus groups with program participants. We expect to conduct focus groups with 240 program participants (30 per site across 8 sites). We expect these meetings, which will involve a focus group discussion about experiences with the program, to last approximately 1.5 hours per group. Thus, the total burden for program participants in focus groups is 360 hours (240 program participants each participating in one 1.5-hour group), and the total annualized burden over three years is 120 hours.

  • Program staff survey. We expect to conduct the web survey with 200 grantee site staff and community partners (25 per site across 8 sites) at two points during the evaluation period. We expect the web survey to take approximately 30 minutes to complete per respondent. Thus, the total burden for grantee site staff and community partners participating in the web survey is 200 hours (200 respondents completing two surveys of 0.5 hours each), and the total annualized burden over three years is 66.7 hours.



  • Study MIS to track program participation. This burden is based on the number of computer entries grantee site staff will make as they enroll and track participation by participants. We estimate that there will be 6,000 participants in the CSPED program1 and that 200 staff members—25 in each of eight sites—will collect MIS data on these participants. We estimate that there will be 50 MIS entries per participant to record program participation, which will result in a total of 300,000 MIS entries (6,000 x 50 = 300,000) about participation over the course of three years. Each staff member will make 1,500 entries over the course of 3 years (300,000 ÷ 200 = 1,500) and we estimate that each entry to document program participation will take two minutes (or 1/30 hours) on average2, so the total annualized burden is 3,333.3 hours ([300,000 ÷ 30] ÷ 3).
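
As an arithmetic check, the figures in this bullet can be restated explicitly (these are the bullet's own numbers, not new estimates):

    6,000 participants × 50 entries per participant = 300,000 MIS entries
    300,000 entries × (1/30) hour per entry = 10,000 total burden hours
    10,000 hours ÷ 3 years = 3,333.3 annual burden hours
    3,333.3 hours × $27.86 per hour ≈ $92,866 annualized cost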

Table A.3. Estimate of Burden and Cost for the CSPED Evaluation - Implementation and Cost Study

Instrument | Total Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Annual Number of Respondents^a | Total Annual Burden Hours^a | Average Hourly Wage | Total Annualized Cost

Staff interview topic guide | 120 | 2 | 1 | 240 | 40 | 80 | $27.86 | $2,229

Focus group guide | 240 | 1 | 1.5 | 360 | 80 | 120 | $7.25 | $870

Program staff survey | 200 | 2 | 0.5 | 200 | 66.7 | 66.7 | $27.86 | $1,858

Study MIS to track program participation | 200 | 1,500 | 0.0333 | 10,000 | 66.7 | 3,333.3 | $27.86 | $92,866

Total | | | | | 253.4 | 3,600 | | $97,823

a All burden estimates are annualized over three years.

Impact Study

Table A.4 summarizes the proposed burden and cost estimates for the use of the impact study instruments across the eight evaluation sites. The total estimated cost figures are computed from the total annual burden hours and an average hourly wage for staff (using the $27.86 rate described above) and program applicants (using the $7.25 hourly wage described above).





Table A.4. Estimate of Burden and Cost for the CSPED Evaluation - Impact Study

Instrument | Total Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Annual Number of Respondents^a | Total Annual Burden Hours^a | Average Hourly Wage | Total Annualized Cost

Introductory script: grantee staff | 120 | 105 | 0.1667 | 2,100 | 40 | 700 | $27.86 | $19,502

Introductory script: program applicants^b | 12,600 | 1 | 0.1667 | 2,100 | 4,200 | 700 | $7.25 | $5,075

Baseline survey | 12,000 | 1 | 0.5833 | 7,000 | 4,000 | 2,333.3 | $7.25 | $16,916

12-month follow-up survey^c | 9,600^d | 1 | 0.75 | 7,200 | 3,200 | 2,400 | $7.25 | $17,400

Study MIS to conduct random assignment | 120 | 105 | 0.1667 | 2,100 | 40 | 700 | $27.86 | $19,502

Protocol for collecting administrative records | 32 | 2 | 8 | 512 | 10.7 | 170.7 | $27.86 | $4,756

Total | | | | | 11,490.7 | 7,004 | | $83,151

a All burden estimates are annualized over three years.

b Five percent of program applicants are not expected to agree to participate in the study; thus there are 5% more program applicants than study participants.

c Clearance for the 12-month follow-up survey will be requested in a later OMB submission. The burden estimates for this survey administration may be updated after the instrument has been developed. See Attachment F for more information on plans for this survey.

d 80% of the baseline sample of 12,000 participants (9,600 participants) is expected to participate at follow-up.



Table A.4 figures are estimated as follows:

  • Introductory script read by grantee site staff. We expect 120 staff members (15 in each of the 8 sites) to provide information about the CSPED study; annualized over three years, this is 40 grantee staff per year. Across the study, each staff member will hold 105 meetings (12,600 ÷ 120 = 105). We expect these meetings, which will involve explaining the program services and the fact that the applicant will be randomly assigned to be eligible or not eligible for services, to last approximately 0.1667 hours. Thus, the total annualized burden for grantee site staff is 700 hours (4,200 meetings per year at 0.1667 hours each).

  • Introductory script heard by program applicants. We expect 12,600 program applicants to hear program staff read the introductory script that provides information about the CSPED program. Annualizing the number of program applicants who hear this script results in 4,200 applicants. We expect these meetings, which will involve explaining the program services and the fact that the applicant will be randomly assigned to be eligible or not eligible for services, to last approximately 0.1667 hours. Each program applicant will hear this information once. Thus, the total annualized burden for program applicants will be 700 hours (4,200 applicants listening for 0.1667 hours).

  • Baseline survey for program applicants. We expect 12,600 applicants during the study intake period. It is assumed that about 95 percent of applicants to the program will be found eligible for the study and consent to participate. Thus, 12,000 are expected to complete the baseline survey (1,500 respondents in each of the eight sites), or 4,000 respondents per year when annualized over three years. We expect each survey to last 0.5833 hours (35 minutes), for a total of 2,333.3 annualized burden hours (7,000 total burden hours).

  • 12-month follow-up survey for program enrollees. Twelve-month follow-up surveys will be attempted with all 12,000 sample members expected to complete the baseline survey (1,500 respondents in each of the eight sites). We anticipate an 80 percent response rate, or 9,600 respondents; annualized over three years, this is 3,200 respondents per year. We expect each survey to last 0.75 hours, for a total of 2,400 annualized burden hours (7,200 total burden hours).3 A separate OMB submission will seek clearance for this 12-month follow-up survey.

  • Study MIS to conduct random assignment. This burden is based on the number of computer entries grantee site staff will make as they enroll applicants and conduct random assignment. We anticipate that each of the 12,600 program applicants will have one MIS entry to document intake and conduct random assignment, producing 12,600 MIS entries. We estimate that 120 staff members (15 in each of the 8 sites) will collect MIS data on these applicants, so each staff member will make 105 entries over the course of 3 years (12,600 ÷ 120 = 105). We estimate that each entry will take 0.1667 hours. Annualizing the grantee site staff members (40 grantee staff per year) yields a total annualized burden of 700 hours.

  • Protocol for collecting administrative records. We expect 32 staff members (4 in each of the eight sites) to complete the state and county administrative records protocol twice, with each response taking eight hours. Thus, the total annualized burden for grantee site staff is 170.7 hours (32 staff members × 2 responses × 8 hours = 512 total hours, or 170.7 hours per year). A brief sketch of this burden arithmetic appears after this list.
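The annualized figures above all follow the same pattern: total respondents × responses per respondent × hours per response, divided by the three-year clearance period, and then multiplied by the applicable hourly wage. The short Python sketch below, which is ours and not part of the submission, reproduces this arithmetic for the Table A.4 instruments; small differences from the published figures reflect rounding of the hours.

```python
# Hypothetical cross-check of the Table A.4 burden arithmetic.
YEARS = 3  # burden is annualized over the three-year clearance period

# (total respondents, responses per respondent, hours per response, hourly wage)
instruments = {
    "Introductory script (grantee staff)": (120, 105, 0.1667, 27.86),
    "Introductory script (applicants)":    (12_600, 1, 0.1667, 7.25),
    "Baseline survey":                     (12_000, 1, 0.5833, 7.25),
    "12-month follow-up survey":           (9_600, 1, 0.75, 7.25),
    "Study MIS (random assignment)":       (120, 105, 0.1667, 27.86),
    "Administrative records protocol":     (32, 2, 8.0, 27.86),
}

for name, (respondents, responses, hours, wage) in instruments.items():
    total_hours = respondents * responses * hours   # e.g., 32 * 2 * 8 = 512
    annual_hours = total_hours / YEARS              # e.g., 512 / 3 = 170.7
    annual_cost = annual_hours * wage               # e.g., 170.7 * 27.86 = $4,756
    print(f"{name}: {annual_hours:,.1f} annual hours, ${annual_cost:,.0f}")
```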

Combined Total Burden

Table A.5 summarizes the total estimated reporting burden and costs for the currently requested ICR (semi-structured interviews with program staff and community partners, focus groups with program participants, web survey with program staff and community partners, study MIS to track participation in services, introductory script, baseline survey, study MIS to conduct random assignment, and administrative records protocol). A separate OMB submission will seek clearance for the 12-month follow-up survey. If the current request is approved, 8,204 annual burden hours and an annualized cost of $163,574 would be authorized for the CSPED study.4

Table A.5. Estimate of Burden and Cost for the CSPED Evaluation – TOTAL Burden Request

Activity/Respondent | Annual Number of Respondentsa | Number of Responses per Respondent | Average Burden per Response (hours) | Total Annual Burden Hoursa | Average Hourly Wage | Total Annualized Cost
--- | --- | --- | --- | --- | --- | ---
Implementation and Cost Study | | | | | |
Semi-structured interviews: program staff and community partners | 40 | 2 | 1 | 80 | $27.86 | $2,229
Focus groups: program participants | 80 | 1 | 1.5 | 120 | $7.25 | $870
Staff survey: program staff and community partners | 66.7 | 2 | 0.5 | 66.7 | $27.86 | $1,858
Study MIS: staff tracking program participation | 66.7 | 1,500 | 0.0333 | 3,333.3 | $27.86 | $92,866
Impact Study | | | | | |
Introductory script: grantee staff | 40 | 105 | 0.1667 | 700 | $27.86 | $19,502
Introductory script: program applicants | 4,200 | 1 | 0.1667 | 700 | $7.25 | $5,075
Baseline survey: study participants | 4,000 | 1 | 0.5833 | 2,333.3 | $7.25 | $16,916
12-month follow-up survey: study participants | 3,200 | 1 | 0.75 | 2,400 | $7.25 | $17,400
Study MIS: staff conducting random assignment | 40 | 105 | 0.1667 | 700 | $27.86 | $19,502
Administrative records protocol: staff completing protocol | 10.7 | 2 | 8 | 170.7 | $27.86 | $4,756
Total Estimated for Current Requestb | 8,544.1 | | | 8,204 | | $163,574
Total Estimated for Full CSPED Studyc | 11,744.1 | | | 10,604 | | $180,974

a Burden estimates are annualized over three years.

b The total burden estimates for the current request do not include the estimated burden for the 12-month follow-up survey.

c The total burden estimates for the full CSPED study include the estimated burden for the 12-month follow-up survey.



13. Estimates of Other Total Cost Burden to Respondents and Record Keepers

These information collection activities do not place any additional costs on respondents or record keepers.

14. Cost to the Federal Government

The estimated cost for completion of the CSPED evaluation is $13,486,665 over the five years of the evaluation. The cost over the three years of the requested clearance is $8,091,999 and the annualized cost to the federal government is $2,697,333.

15. Explanation for Program Changes or Adjustments

This is a new submission. There is no request for program changes or adjustments.

16. Plans for Tabulation and Publication and Project Time Schedule

a. Plans for Tabulation

Implementation and Cost Study

The instruments included in this OMB package for the implementation and cost study will yield data that will be analyzed using qualitative and quantitative methods to describe program implementation, assess the program’s overall quality, analyze the factors that appear to be linked to quality, and identify lessons for future practice. A thorough understanding of program implementation will provide context for interpreting program impacts, while a greater understanding of how programs can be implemented with high quality is expected to inform the next generation of programming.

The evaluation contractor will use standard qualitative procedures to analyze and summarize information from staff and partner interviews conducted using the semi-structured staff interview topic guide and the focus group guide. Analysis will involve organization, coding, triangulation, and theme identification. For each qualitative data collection activity, standardized templates will be used to organize and document the information and then code this documentation. Coded text will be searched to gauge consistency and triangulate across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics, themes, and categories (Yin 1994; Coffey et al. 1996), which can then be analyzed to address the study's research questions.

To code the qualitative data for key subtopics and themes, the evaluation team will first develop a coding scheme that builds from the interview or focus group questions. Senior members of the evaluation team will refine the initial coding scheme by reviewing codes and a preliminary set of data output to make adjustments and ensure alignment with the topics that emerge from the data. For each round of coding, two to three project team members will be trained to code the data using a qualitative analysis software package, such as Atlas.ti or NVivo. To ensure reliability across coders, all team members will code an initial document and compare codes to identify and resolve discrepancies. As coding proceeds, the lead team member will review a sample of coded documents from each coder to monitor reliability. Coded data will enable the team to compare responses across respondents within and across grantees by searching on specific codes. The software will also allow the team to retrieve data on particular codes by type of respondent (for example, case manager or parenting services coordinator). To compare information, the evaluation team may retrieve data for subsets of programs, such as those using the same fatherhood curriculum or those located in rural areas.
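To make the reliability check concrete, an agreement statistic such as Cohen's kappa could be computed on the codes that two coders assign to the same segments. The sketch below is a minimal illustration under our own assumptions; the evaluation plan calls for comparing and resolving coding discrepancies but does not prescribe a particular statistic, and the code labels shown are hypothetical.

```python
# Minimal sketch: quantify agreement between two coders on the same segments.
# Cohen's kappa is an assumption here, not a method named in the evaluation plan.
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned by two coders to the same ten text segments
coder_a = ["recruitment", "services", "recruitment", "staffing", "services",
           "recruitment", "context", "services", "staffing", "context"]
coder_b = ["recruitment", "services", "services", "staffing", "services",
           "recruitment", "context", "recruitment", "staffing", "context"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # values near 1.0 indicate strong agreement
```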

Quantitative data will be summarized using basic descriptive methods. Sources of quantitative data include the program staff and community partner survey. Data from each source will follow a common set of steps involving data cleaning, variable construction, and computing descriptive statistics. To facilitate analysis of each data source we will create variables to address the study’s research questions. Construction of these analytic variables will vary depending on a variable’s purpose and the data source being used. Variables may combine several survey responses into a scale, aggregate attendance data from a set time period, or compare responses to identify a level of agreement.
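As one concrete, hypothetical illustration of this variable construction, the sketch below combines three survey items into a single scale; the column names and the 1-4 response coding are ours, not the survey's.

```python
# Minimal sketch: build an analytic scale variable from several survey items.
# Column names and the 1-4 response coding are hypothetical.
import pandas as pd

staff = pd.DataFrame({
    "collab_q1": [4, 3, 2, 4],  # e.g., frequency of contact with partners
    "collab_q2": [3, 3, 1, 4],  # e.g., usefulness of partner referrals
    "collab_q3": [4, 2, 2, 3],  # e.g., extent of shared goals
})

# Average the items into one collaboration scale (higher = stronger collaboration)
staff["collab_scale"] = staff[["collab_q1", "collab_q2", "collab_q3"]].mean(axis=1)
print(staff["collab_scale"].describe())
```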

Study MIS information will also be used for the implementation and cost study. The study will provide summary statistics for key program features:

  • Enrollment patterns. For example, the average number of new applicants each month.

  • Services provided by grantees. For example, the average number of group workshops offered each month, the average number of individual service contacts each month, or the percentage of individual service contacts provided in participants’ homes or in the office.

  • Participation patterns. For example, the number of participants who engage in a group activity within two months of enrollment and the average number of hours of group workshops received by program participants. A minimal tabulation sketch of these statistics appears after this list.
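The sketch below illustrates the first of these tabulations, average new applicants per month, using hypothetical MIS field names; the actual study MIS extracts will differ.

```python
# Minimal sketch: monthly enrollment counts from hypothetical study MIS records.
import pandas as pd

mis = pd.DataFrame({
    "participant_id": [101, 102, 103, 104, 105],
    "enroll_date": pd.to_datetime(
        ["2013-10-03", "2013-10-21", "2013-11-05", "2013-11-18", "2013-12-02"]),
})

# Enrollment pattern: new enrollees per calendar month, then the monthly average
monthly = mis.groupby(mis["enroll_date"].dt.to_period("M")).size()
print(monthly)
print("Average new enrollees per month:", monthly.mean())
```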

We will analyze data from the study MIS for each grantee at two points in time, corresponding to the two implementation reports identified in Table A.6. In each report, we will describe enrollment patterns, services provided, and participation patterns over the previous 12 months. Later analyses may describe how patterns changed over time, such as from the early to late implementation period.

Impact study

Baseline survey data will be used in the impact and implementation analyses. First, baseline survey data will be used to describe the characteristics of CSPED program participants. For each site, we will present tables of frequencies and means for key participant characteristics, including demographic and family information.

Baseline survey data will also be used in the impact analysis to test for baseline equivalence, define subgroups, improve the precision of impact estimates, and estimate factors that predict participation in the program. The goal of the impact analysis is to provide statistically valid and reliable estimates of the effects of CSPED on the outcomes of participants. To do so, we will compare observed outcomes for members of a randomly selected program group—individuals eligible for program services—with outcomes for members of a randomly selected control group that was not offered program services. We will use the experience of the control group as a measure of what would have happened to the program group members in the absence of CSPED. Random assignment of noncustodial parents to either a program (treatment) or a control group ensures that the two groups of noncustodial parents do not initially differ in any systematic way on any characteristic, observed or unobserved. Any observed differences in outcomes between the program and control group of noncustodial parents can therefore be attributed to the program with a known degree of precision.

Though random assignment ensures that noncustodial parents in the program and control groups do not initially differ in any systematic way, there might still be chance differences between groups. To confirm that there were no differences between the program and control groups before random assignment, we will statistically compare key characteristics—including outcome measures—between the groups at baseline. In particular, to establish baseline equivalence, we will conduct t-tests and F-tests to test for differences between the two groups both overall and separately by site. In these comparisons, we will use the analytic sample, which is composed of respondents to the follow-up survey.
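As an illustration, the sketch below applies a two-sample t-test to one hypothetical baseline characteristic; the actual checks will cover the full set of key characteristics, overall and by site, with F-tests for joint differences.

```python
# Minimal sketch: baseline-equivalence t-test for one characteristic.
# The data frame and variable names are hypothetical.
import pandas as pd
from scipy import stats

analytic = pd.DataFrame({
    "treatment":         [1, 1, 1, 0, 0, 0],
    "baseline_earnings": [1200.0, 800.0, 950.0, 1100.0, 700.0, 1000.0],
})

program = analytic.loc[analytic["treatment"] == 1, "baseline_earnings"]
control = analytic.loc[analytic["treatment"] == 0, "baseline_earnings"]

t_stat, p_value = stats.ttest_ind(program, control)  # two-sample t-test
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")  # a large p suggests no imbalance
```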

Baseline survey data will also be analyzed jointly with the follow-up survey data to estimate impacts. Using baseline data in the impact analysis will improve the statistical precision of impact estimates. Differences of means or proportions in outcomes between the program and control group would provide unbiased estimates of the impacts of being offered participation in the CSPED program (referred to as the intent-to-treat effect, or ITT effect). However, the impact analysis will use regression models to estimate the ITT effect, allowing us to control for random differences in the baseline characteristics of program and control group members.
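A minimal sketch of such a regression-adjusted estimate appears below, with hypothetical variable names and a single baseline covariate; the study's models will include a fuller set of baseline characteristics.

```python
# Minimal sketch: regression-adjusted ITT estimate.
# Variable names are hypothetical; the coefficient on the treatment
# indicator is the ITT impact estimate, adjusted for baseline differences.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "earnings_12mo":     [1500.0, 1300.0, 900.0, 1100.0, 800.0, 1000.0],
    "treatment":         [1, 1, 1, 0, 0, 0],
    "baseline_earnings": [1200.0, 800.0, 950.0, 1100.0, 700.0, 1000.0],
})

model = smf.ols("earnings_12mo ~ treatment + baseline_earnings", data=df).fit()
print("ITT estimate:", model.params["treatment"])
```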

In addition, data from the baseline survey will be used to estimate the impact of receiving program services, the effect of treatment on the treated (or the TOT effect). In many settings, the TOT effect can be calculated by adjusting the ITT effect for the difference between the program and control groups in program participation rates. However, in this context, CSPED offers a range of services, and as a result participants may take up only some of those services. For example, program participants might only attend some group meetings or might choose to participate in only some components of the program. Because we are interested in understanding how the impact of the programs varies with the type and intensity of services received, the TOT effect must be calculated using quasi-experimental methods—techniques that do not rely solely on the study’s random assignment design (see Wood et al. 2011 for an application of these methods). To estimate the TOT effect, we will use data from the baseline survey and from the study MIS to predict program participation; possible predictors include motivation to change, barriers to participation, grantee staff predictions of participation, and information on referral source. If participant self-reports and grantee staff assessments are predictive of participation among program group members, we will be able to estimate the TOT effect in addition to the ITT effect.
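For intuition, in the simplest benchmark case, a single well-defined service with no control-group access, the TOT effect is the ITT effect divided by the program-control difference in participation rates (often called the Bloom adjustment). The sketch below illustrates that benchmark with hypothetical numbers; as the text notes, the CSPED analysis requires richer quasi-experimental methods because participation varies in type and intensity.

```python
# Minimal sketch: the Bloom adjustment as a benchmark TOT calculation.
# All figures are hypothetical; CSPED's actual TOT analysis will model the
# type and intensity of services received rather than a single take-up rate.
itt_effect = 120.0     # estimated ITT impact (e.g., dollars of monthly earnings)
takeup_program = 0.70  # share of program group members receiving services
takeup_control = 0.00  # control group members are not offered CSPED services

tot_effect = itt_effect / (takeup_program - takeup_control)
print(f"TOT estimate: {tot_effect:.1f}")  # 120 / 0.70 ≈ 171.4
```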

Information from the study MIS will be used in the impact study to estimate the effect of participating in program services as described above. In particular, information provided at intake by grantee staff, together with predictors of participation from the baseline survey, will be used with service receipt data to estimate the TOT effect.

The administrative records collected for the study (study participants' court records, child support payments, unemployment benefits, and services received outside of the CSPED program) will be used to measure the impacts of the CSPED program by contrasting outcomes for the CSPED program (treatment) group with those of the comparison (control) group. In combination with information from the staff semi-structured interviews and the study MIS, these records will also be used to assess the relative benefits and costs of the CSPED program.

b. Time Schedule and Publications

This study is expected to be conducted over a five-year period beginning on October 1, 2012. This ICR is requesting burden for three years. Baseline data collection is expected to begin in October 2013. Attachment E provides a timeline for the Information Collection Activities covered by this ICR.

In addition to reports on the findings from the impact, implementation, and qualitative studies, CSPED will provide opportunities for analyzing and disseminating additional information through special topics reports and research or issue briefs. We will also provide a public or restricted use data file for others to replicate and extend our analysis.

Table A.6. Schedule for the CSPED Evaluation

Activity | Date
--- | ---
Intake period for impact study | October 2013 – September 2016
Initial implementation and cost study report | April 2015
Final implementation and cost study report | April 2017
Impact and benefit-cost report | August 2017



Short research or policy briefs are an effective and efficient way of disseminating study information and findings. The evaluation team will develop and produce several short research briefs. Topics for these briefs will emerge as the evaluation progresses but could, for example, summarize key implementation, impact, or subgroup findings or describe the study purpose and grantees.

17. Reason(s) Display of OMB Expiration Date Is Inappropriate

All instruments will display the expiration date for OMB approval.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

REFERENCES

Alexandre, Pierre K., and Michael T. French. “Labor Supply of Poor Residents in Metropolitan Miami, Florida: The Role of Depression and the Co-Morbid Effects of Substance Use.” The Journal of Mental Health Policy and Economics, vol. 4, 2001, pp. 161–173.

Berlin, Martha, Leyla Mohadjer, Joseph Waksberg, Andrew Kolstad, Irwin Kirsch, D. Rock, and Kentaro Yamamoto. “An Experiment in Monetary Incentives.” In Proceedings of the Section on Survey Research Methods. Alexandria, VA: American Statistical Association, 1992, pp. 393–398.

Coffey, Amanda, Beverly L. Holbrook, and Paul Atkinson. “Qualitative Data Analysis: Technologies and Representations.” Sociological Research Online, vol. 1, no. 1, 1996. Available at http://www.socresonline.org.uk/index_by_issue.html.

Dion, Robin M., Sarah A. Avellar, and Elizabeth J. Clary. “The Building Strong Families Project: Implementation of Eight Programs to Strengthen Unmarried Parent Families.” Final report submitted to the U.S. Department of Health and Human Services, Office of Planning, Research and Evaluation, Administration for Children and Families. Washington, DC: May 2010.

Downey, G., and J.C. Coyne. “Children of Depressed Parents: An Integrative Review.” Psychological Bulletin, vol. 108, 1990, pp. 50–76.

Gelfand, D.M., and D.M. Teti. “The Effects of Maternal Depression on Children.” Clinical Psychology Reviews, vol. 10, 1990, pp. 329–353.

Heinrich, Carolyn J., Brett C. Burkhardt, and Hilary M. Shager. “Reducing Child Support Debt and Its Consequences: Can Forgiveness Benefit All?” Journal of Policy Analysis and Management, vol. 30, no. 4, 2011, pp. 755–774.

James, Jeannine M., and Richard Bolstein. “The Effect of Monetary Incentives and Follow-up Mailings on the Response Rate and Response Quality in Mail Surveys.” Public Opinion Quarterly, vol. 54, 1990, pp. 346–361.

Kroenke, Kurt, Robert L. Spitzer, and Janet B.W. Williams. “The PHQ-9: Validity of a Brief Depression Severity Measure.” Journal of General Internal Medicine, vol. 16, no. 9, 2001, pp. 606-613.

Kroenke, Kurt, Tara W. Strine, Robert L. Spitzer, Janet B.W. Williams, Joyce T. Berry, and Ali H. Mokdad. “The PHQ-8 as a Measure of Current Depression in the General Population.” Journal of Affective Disorders, vol. 114, no. 1–3, 2009, pp. 163–173.

Löwe, B., J. Unützer, C.M. Callahan, A.J. Perkins, and K. Kroenke, “Monitoring Depression Treatment Outcomes with the Patient Health Questionnaire-9.” Medical Care, vol. 42, no. 12, 2004, pp. 1194-1201.

Marcotte, Dave E., Virginia Wilcox-Gök, and D. Patrick Redmon. “Prevalence and Patterns of Major Depressive Disorder in the United States Labor Force.” The Journal of Mental Health Policy and Economics, vol. 2, 1999, pp. 123–131.

Martin, A., W. Rief, A. Klaiberg, and E. Braehler. “Validity of the Brief Patient Health Questionnaire Mood Scale (PHQ-9) in the General Population.” General Hospital Psychiatry, vol. 28, no. 1, 2006, pp. 71–77.

Manning, W.D., and P.J. Smock. “‘Swapping’ Families: Serial Parenting and Economic Support for Children.” Journal of Marriage and Family, vol. 62, 2000, pp. 111–122.

McLanahan, Sara. “Fragile Families and the Reproduction of Poverty.” Annals of the American Academy of Political and Social Science, vol. 621, no. 1, 2009, pp. 111–131.

Meyer, Daniel R., and Maria Cancian. “Volume 1: Comparative Summary of Quantitative Nonexperimental and Experimental Analyses.” Madison, WI: Institute for Research on Poverty, University of Wisconsin-Madison, 2002.

Pearson, Jessica, Lanae Davis, and Jane Venohr. “Parents to Work!” Denver: Center for Policy Research, February 2011.

Pinto-Meza, A., A. Serrano-Blanco, M.T. Peñarrubia, E. Blanco, and J.M. Haro. “Assessing Depression in Primary Care with the PHQ-9: Can It Be Carried Out over the Telephone?” Journal of General Internal Medicine, vol. 20, no. 9, 2005, pp. 738–742.

Singer, Eleanor, and Richard A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro. Washington, DC: National Academy Press, 2002, pp. 105–127.

Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. “Does the Payment of Incentives Create Expectation Effects?” Public Opinion Quarterly, vol. 62, 1998, pp. 152–164.

U.S. Census Bureau. “Family Structure and Children’s Living Arrangements.” Available at http://www.childstats.gov/americaschildren/famsoc.asp. Accessed March 12, 2013.

Wood, Robert G., Sheena McConnell, Quinn Moore, Andrew Clarkwest, and JoAnn Hsueh. “Strengthening Unmarried Parents’ Relationships: The Early Impacts of Building Strong Families.” Princeton, NJ: Mathematica Policy Research, May 2010.

Wood, Robert G., Quinn Moore, and Andrew Clarkwest. “BSF’s Effects on Couples Who Attended Group Relationship Skills Sessions: A Special Analysis of 15-Month Data.” Princeton, NJ: Mathematica Policy Research, May 2011.

Yin, Robert K. Case Study Research: Design and Methods. 2nd ed. Thousand Oaks, CA: Sage Publications, 1994.

1 We estimate that there will be 750 participants in the CSPED program at each of the eight sites, resulting in 6,000 participants in the CSPED program.

2 Based on the Building Strong Families Study (Wood et al. 2010) and the speed at which data can be entered, we expect each entry to take about two minutes, or 1/30 of an hour (which is rounded to 0.0333 hours).

3 The burden estimates for the 12-month follow-up administration may be revised after the 12-month follow-up survey instrument has been developed.

4 These burden estimates correspond to the currently requested ICR and thus they do not include the annual burden hours and annualized cost of the 12-month follow-up survey of participants.


