PJAC Supporting Statement A

Procedural Justice-Informed Alternatives to Contempt (PJAC)

OMB: 0970-0505




U.S. Department of Health and Human Services

Administration for Children and Families

Procedural Justice-Informed Alternatives to Contempt Demonstration (PJAC)

Office of Child Support Enforcement (OCSE)


New Collection

330 C Street, SW Fifth Floor

Washington, DC 20201

OMB No. 0970-NEW


Project Officer: Elaine Sorensen

Supporting Statement for Implementation, Cost, and Impact Studies


Part A: Justification


July 2017

A1. Circumstances Making the Collection of Information Necessary

The Office of Child Support Enforcement (OCSE) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval to collect information for the Procedural Justice-Informed Alternatives to Contempt (PJAC) evaluation. The study will test the efficacy of incorporating procedural justice principles into child support practices as a cost-effective alternative to contempt. The goal of the demonstration is to employ techniques designed to increase parents’ perception of fairness in the child support process and, as a result, to improve cooperation with the child support program, increase reliable child support payments, and reduce the ineffective use of contempt. Six child support agencies were selected through a competitive grant application process to participate in PJAC. They are located in: Arizona; California; Franklin County, Ohio; Michigan; Stark County, Ohio; and Virginia. The Georgia Department of Human Resources, Division of Child Support Services (Georgia DCSS) was awarded a cooperative agreement to procure and manage the evaluation of PJAC. Georgia DCSS contracted with MDRC for the PJAC evaluation.

A1a. Study Background

Every child support program has a portion of the caseload that is noncompliant with child support orders. One strategy that some jurisdictions use in response to noncompliance is civil contempt proceedings, including the threat of incarceration, to enforce child support (Gardiner 2002). Although standard contempt practices sometimes result in one-time “purge” payments to avoid jail, there is no evidence that these practices result in future compliance with the support order or ongoing support payments that families can count on to make ends meet. In addition, contempt procedures are more expensive than other enforcement remedies (Coffin 2014). Recognizing these realities, some child support programs have redirected their resources away from civil contempt to practices that increase compliance and reduce the build-up of unpaid arrears by working proactively with both parents to address the underlying impediments to payment. Innovative strategies that encourage voluntary compliance with child support orders include enhanced investigation, case conferencing, setting income-based orders, early intervention, timely modification, and employment services. Research suggests that these practices can improve compliance with child support orders, increasing both the amount of child support collected and the consistency of payment (Sorensen and Tannehill 2006; Pearson, Thoennes, and Davis 2007; Ovwigho, Sanders, and Born 2009; Lowry and Potts 2010; Diaz and Chase 2010; Bryant 2012). The PJAC demonstration will add to the evidence base of innovations in child support business practices for noncompliant obligors by evaluating the effectiveness of procedural justice-informed approaches.

A1b. Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection. The Administration for Children and Families (ACF) is undertaking the collection at the discretion of the agency.

A2. Purpose and Use of the Information Collection

A2a. Overview of Purpose and Approach

The overarching purpose of the PJAC evaluation is to assess the efficacy of offering procedural justice-informed child support services as an alternative to the current contempt process. The evaluation includes three study components: (1) an implementation study, (2) an impact study, and (3) a benefit-cost study. Respectively, these study components will generate extensive knowledge regarding how PJAC programs operate, their effects on key outcomes such as reliable child support payments, and whether the benefits of PJAC services exceed their costs. The purpose of each information collection activity is to inform one or more of these study components.

Current Request

The current information collection request includes five different data collection activities. These activities include: (1) staff data entry for random assignment, which will allow the evaluation team to establish program and control groups and from which data will be used for the impact analysis; (2) data entry into a study Management Information System (MIS), which will be used by PJAC caseworkers to track service delivery and by the evaluation team as part of the implementation study and benefit-cost study; (3) staff and community partner interviews and (4) participant interviews, both of which will be used by the evaluation team as part of the implementation study; and (5) participant survey tracking, which will help minimize anticipated survey nonresponse problems, the survey being a key data source for the impact study. The evaluation team will also collect administrative data from state and county child support systems, court records, criminal justice records, and data from the National Directory of New Hires. The collection of these data will not impose additional burden on respondents or record keepers.

Future Request

A future request will include (1) a staff and community partner survey, intended to further inform the implementation study; (2) a participant survey, which, as per above, will inform the impact study (this survey will be administered to a subsample of participants); and (3) a staff time study, which will be key to estimating staff costs for the benefit-cost study.

Information collection for the implementation study will take place from late 2017 through late 2021. PJAC participation and services will be tracked in the study MIS for all participants for at least one year from the time of study enrollment, which is slated to run from late 2017 through late 2020. Staff, partner, and participant interviews, along with the staff/partner survey, will take place in Fall 2018 and Winter 2020. Information collection for the impact study will begin in early 2019, when tracking efforts are initiated for those who are included in the survey subsample. Finally, since MIS participation and service data will be used for the benefit-cost study in addition to the implementation study (to estimate PJAC costs), information collection for the benefit-cost study will also run from late 2017 through late 2021, with the time study occurring in late 2018. Table A.1 summarizes the expected timelines for data collection for the five instruments being submitted under this Information Collection Request (ICR).

Table A.1. Timeframe for Proposed Data Collection Activities

Data Collection Activity                            Expected Timeframe
Staff data entry for random assignment              Late 2017 – Late 2020
Study MIS to track program participation            Late 2017 – Late 2021
Staff and community partner interview topic guide   Fall 2018, Winter 2020
Participant interview topic guide                   Fall 2018, Winter 2020
Participant survey tracking letter                  Early 2019 – Late 2020



A2b. Research Questions

As described in section A1a (Study Background), the PJAC demonstration will add to the evidence base of innovations in child support business practices for noncompliant obligors by addressing the following research questions regarding the use of procedural justice-informed approaches as an alternative to contempt proceedings:

  • How was PJAC designed and operated? (primary research question for the implementation study)

  • What impact did PJAC have on child support payments, enforcement actions, contempt proceedings, jail stays, and employment and earnings relative to what would have happened in the absence of the intervention? (primary research question for the impact study)

  • To what extent do PJAC’s costs differ from those expended on behalf of individuals randomly assigned to a control group that could not receive PJAC program services (net cost)? How does the net cost compare with the net benefits associated with the program’s impacts? (primary research question for the benefit-cost study)

A2c. Study Design

The evaluation of PJAC will employ a random assignment design and, as noted above, will include three study components: an implementation study, an impact study, and a benefit-cost study. Random assignment is the most rigorous method of evaluating large-scale social service interventions; it will provide the strongest possible evidence regarding the efficacy of the PJAC intervention in improving key outcomes, such as reliable child support payments, reduced arrears, and decreased use of contempt proceedings, by allowing for a direct comparison between a program group (noncustodial parents about to enter contempt proceedings who are offered PJAC services) and a control group (noncustodial parents at the same point who are not offered PJAC services and instead proceed down the business-as-usual path of contempt). The sample will be drawn across six sites over three years of enrollment. Each site will target a sample size of 3,000 (though some sites in smaller cities may fall short of this target), for a total evaluation sample of 18,000. Sample members will be representative of the larger population of noncustodial parents who have fallen far enough behind in their child support payments for contempt proceedings to be initiated. The evaluation includes both qualitative and quantitative data collection components, as described in more detail in the next section.

A2d. Universe of Data Collection Efforts

Current Request for Data Collection Instruments

  • Staff data entry for random assignment. A secure website, linked to the study MIS, will provide the results of random assignment and enroll eligible noncustodial parents into the study upon entry of a few key pieces of information. Random assignment will be conducted administratively, after enforcement tools have been exhausted and the noncustodial parent’s case is eligible for contempt. In other words, rather than recruit noncustodial parents to enroll in the study, child support agency staff will automatically initiate the random assignment process for any noncustodial parent who is eligible for contempt based on internal administrative data records. (Note that the evaluation team will obtain a waiver of informed consent for random assignment from the MDRC IRB, as the circumstances of this study meet the criteria in order to qualify for the waiver and, given the extremely hard-to-reach nature of this population, the study would not be practicable without such a waiver.)

  • Study MIS to track program participation. The study MIS will be web-based and will be used to track participation in the PJAC program. Information about services received by all program participants will be entered into the system by program staff. These data will aid staff as they deliver the intervention as well as inform the evaluation. (Note that the inclusion of this data collection instrument is contingent on the award of additional evaluation funding. OCSE has the funding available, has requested approval to award the funding, and has received verbal approval to proceed. It is fully expected that the funding will be formally approved and awarded by 9/30/2017.)

  • Staff and community partner interview topic guide. The topic guide will be used to conduct semi-structured interviews with program staff and selected community partner organizations during site visits conducted beginning in Fall 2018.

  • Participant interview topic guide. This topic guide will be used to conduct semi-structured interviews with study participants across the grantee sites during site visits.

  • Participant survey tracking letter. A survey tracking letter will be distributed at up to three points in time during the year following random assignment. Tracking letters will be sent to the survey subsample only, which will be composed of 3,000 respondents across the six sites. The purpose of these tracking letters is to maintain the best possible contact information for survey sample members to help minimize potential nonresponse bias for the 12-month follow-up survey (described below and part of a future information collection request). The tracking letter will ask participants to call in, mail back, or otherwise communicate updated contact information to the survey firm.

Future Information Collection Requests

  • Staff and community partner survey. The staff and community partner survey will be web-based and administered to program staff and staff at partner agencies working with PJAC participants. The survey will capture staff program experiences beyond the information learned from the semi-structured interviews. It will include a larger number of staff and will focus on quantitative information that cannot be gathered via interviews, such as the use of procedural justice approaches.

  • Staff time study. The time study will ask PJAC program staff at participating child support agency sites to estimate the amount of their time they spend on various tasks. This information will be used to estimate the net child support cost per program group member, which is the difference in costs of child support services provided to the program group versus the control group.

  • Participant 12-month follow-up survey. The evaluation includes a 12-month follow-up survey to measure impacts on outcomes that cannot be captured using administrative records, including sample members’ experiences with child support agencies and courts, perceptions of fairness, relationships with the custodial parent(s) and children, and employment and earnings not covered by administrative data. The follow-up survey will be administered to a subsample of 3,000 individuals across the six participating sites. Different subsampling approaches are under consideration as the evaluation team explores ways to optimize the utility of this survey; a more detailed plan will be provided in a future information collection request, when the full survey instrument is submitted for clearance.

A3. Use of Improved Information Technology and Burden Reduction

The study MIS, which staff will use to conduct random assignment and to track program participation, will be a web-based application providing easy access while maintaining the security of the data.1 Users will be assigned their own user account, including a user ID and password. The web-based application will allow sites to access the MIS without purchasing or installing additional software or changing the configuration of their computers. The system can be accessed from any computer, allowing for ease of entry, while the data are housed on secure servers, thereby maintaining data security.

The system will be designed with site staff in mind, drawing on experience from prior studies with similar types of service providers. As such, it will be flexible and easy to use, and will include helpful tools, reports, and reminders to reduce burden on grantee site staff and increase the quality and quantity of data collected. The system is designed for multiple users at each organization and will offer varying levels of system access depending on users’ needs. For example, administrators or supervisors will have the greatest rights within the system: the ability to create new users, assign program participants to staff members, and review all activity for the organization. Staff providing direct services to study participants will be able to record and review information about participants assigned to their caseloads. Additionally, a portal will be created to allow community partners, who will provide ancillary supportive services, to access the system. Through this portal, child support caseworkers will be able to send referrals to community partners. Community partners, in turn, will be able to receive all necessary referral information and, once they have begun working with referred clients, enter attendance and other relevant information into the system. This two-way communication channel will streamline workloads for both child support staff and partner agency staff, allowing for smoother collaboration, information sharing, and service delivery.

The various levels of system access in the study MIS will allow for streamlining of information; limiting full system access to a small set of staff members promotes increased data security and greater data quality.

A4. Efforts to Identify Duplication and Use of Similar Information

The PJAC evaluation will not require the collection of information that is available from alternative data sources. None of the instruments will ask for information that can be reliably obtained through administrative data collection.2 The study MIS will gather information about program participation not typically collected by child support programs. For instance, information specific to service receipt under the PJAC intervention is unlikely to be collected through other avenues. However, if a grantee has an existing MIS that tracks information needed for the PJAC evaluation, the evaluation team will gather data from that system instead. No program participant or staff member will be asked for the same information more than once. For example, participants will not be asked questions about their child support case during the semi-structured interviews, since that information is available through administrative data.



A5. Impact on Small Businesses or Other Small Entities

No small businesses are expected to be involved in data collection. Nonetheless, instruments have been tailored to minimize burden and collect only critical evaluation information.

A6. Consequences of Collecting the Information Less Frequently

Not collecting information for the PJAC evaluation would limit the government’s ability to document the kinds of activities implemented with federal funds and its ability to measure the effectiveness of such activities. The PJAC evaluation represents an important opportunity for OCSE to learn about how to improve child support program performance and increase the reliable payment of child support through the provision of procedural justice-informed practices as an alternative to contempt. If the information collection requested by this clearance package is not conducted, policymakers and providers of these programs will lack reliable information on the impacts of PJAC, as well as descriptive information that can be used later to replicate, refine, and improve the intervention as needed.

Staff data entry for random assignment. The secure random assignment website is required to implement the study’s random assignment design, allowing for the creation of program and control groups as well as the collection of some basic data elements that will be vital to the impact analysis. Data entry for random assignment will occur only once for each noncustodial parent enrolled into the study.

Study MIS to track program participation. Without the study MIS, the evaluation team would not be able to track elements of the PJAC model delivered to participants, nor would it be able to accurately assess the contrast in services between program group members and control group members. It would not be able to describe the frequency and intensity of services received as part of the PJAC intervention. The MIS enables collection of information in the most efficient manner, occurring as the case activity is happening. Less frequent collection would require more time and burden.

Staff and community partner interview topic guide. Omitting staff interviews from the implementation study’s data collection would leave a gap in understanding of PJAC implementation. Staff perspectives will be invaluable to understanding how the program functions on the ground. Staff and partners will be interviewed at two points in time, since their experiences delivering PJAC and the challenges they face are likely to shift over the course of program implementation. If these interviews were conducted at only one point in time, the evaluation team might not obtain representative data about the nature of PJAC implementation, calling into question the accuracy of any conclusions drawn from those data.

Participant interview topic guide. Without interviews with PJAC participants, the implementation study would miss a key viewpoint, the customer perspective, and would lose the ability to accurately assess the project goal of increasing parents’ perception of fairness. Participant perspectives will be central to understanding which elements of the PJAC model were effective or ineffective and why. Each participant will be interviewed only once.

Participant survey tracking letter. Without the survey tracking letter, the survey firm’s ability to locate participants for the 12-month follow-up survey would be weakened. Case records indicate this population is quite transient, and a survey tracking letter will help ensure that the survey firm has the most updated contact information possible. Given the 12 months that will elapse between when contact information (possibly already outdated) is obtained and survey interviews can commence, it is important to collect this information more than once in that interval so as not to lose the trail on a group that may move often and be difficult to locate. Minimizing nonresponse bias will help ensure that the results of the survey are useful to understanding the effects of PJAC on various outcomes that cannot otherwise be measured as there is no other source for them.

A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances for the proposed data collection.

A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A8a. Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on May 1, 2017 (Vol. 82, No. 82, pp. 20345-20346), and provided a 60-day period for public comment. A copy of this notice is included as Attachment A. During the notice and comment period, ACF received no comments.

A8b. Consultation with Experts Outside of the Study

The following two experts in the area of child support were consulted:

Dr. Dan Meyer

University of Wisconsin-Madison

School of Social Work

1350 University Ave.

Madison, WI 53706


Linda Mellgren

Office of the Assistant Secretary for Planning and Evaluation, HHS (retired)

245 11th Street SE

Washington, DC 20003



A9. Explanation of Any Payment or Gift to Respondents


The gifts we propose to make for the data collection activities covered in this ICR are summarized in Table A.2.

Table A.2. Respondent Gifts Proposed for Data Collection Activities

Data Collection Activity                    Length of Activity (minutes)   Gift Amount
Participant interview                       60                             $15
Participant update of contact information   6                              $2


We propose providing small gifts in appreciation of sample members’ participation in interviews and contact information updates. As part of the implementation study, the research team plans to interview both program and control group members in each site during two separate sets of implementation site visits. To obtain a more representative sample (rather than a convenience sample), a group of study participants from each research group will be randomly selected from among recent study enrollees who have engaged with the child support system at some point following enrollment. To maximize the utility of participant interviews, we hope to minimize nonparticipation among this subsample by offering a $15 gift card to offset transportation and other expenses, as well as the burden of participation. We expect this gift will be particularly helpful in encouraging participation among control group members, who are likely to be disinclined to assist with the research given their experiences in the child support contempt process. Their voices need to be heard in order to understand the usual business practices of the child support agencies and how noncustodial parents respond to these practices versus PJAC services.

Included in this submission is a tracking letter meant to minimize potential nonresponse issues on the 12-month survey.3 This letter will be sent to those selected into the survey subsample of 3,000 across the six sites at up to three different points in time between when they are enrolled into the study sample and when 12 months have passed and interviewers from MDRC’s subcontractor, survey firm Decision Information Resources (DIR), attempt to administer the survey to them. The purpose of these tracking letters is to maintain better contact information for the subsample, knowing that the target population for PJAC tends to be more transient and may move and/or change phone numbers multiple times between when contact information is initially procured from the child support agency (at study enrollment) and their 12-month interview window. A $2 pre-paid gift for the few minutes of time it will take to update contact information should encourage completion of this task.

Research has shown that respondent gifts are effective at increasing response rates for populations similar to participants in child support programs—people with lower educational levels (Berlin et al. 1992) and low-income and nonwhite populations (James and Bolstein 1990). Singer and Kulka (2002) showed that respondent payments reduced differential response rates and, hence, the potential for nonresponse bias. Furthermore, a recent meta-analysis indicated that pre-paid incentives, as opposed to promised incentives, are most effective at eliciting a response to mail surveys (Mercer, Caporaso, Cantor, and Townsend 2015).

A10. Assurance of Confidentiality Provided to Respondents

Procedures will be strictly followed by the evaluation team and grantee staff to ensure the privacy of participant information to the extent permitted by law. The following actions will be taken:

  • Training grantee staff in privacy procedures. Comprehensive, in-person training will be provided to grantee staff on the use of Personally Identifiable Information (PII) at the time of random assignment. We note that child support staff already have significant requirements and training in privacy and security.

  • Training interviewers in privacy procedures. Evaluation team members interviewing staff, program partners, and participants will follow strict privacy procedures to protect data. All evaluation staff must regularly undergo mandatory human subjects research and data security trainings. Whenever possible, notes will be taken on encrypted laptops and saved on secure servers. If handwritten note-taking is required, no PII will be included and notes will be transferred to electronic form to be saved on secure servers as soon as is practicable; paper notes will either be destroyed or kept in locked file cabinets.

  • Restricting access to the study MIS. Data collected through the study MIS will be housed on secure servers. Access to the study MIS will be restricted to approved staff members assigned a login ID and password with permission from the evaluation data manager. Access rights will be assigned strictly on a need-to-know basis.

  • Using de-identified data for all interview participants. Any data elements used for recruitment of participants for semi-structured interviews, such as name and telephone number, will be destroyed after completion of the interview. Interview transcripts and resulting reports will not identify respondents by name.

  • Obtaining informed consent from all interview participants. All interview participants will be informed that their participation in interviews is completely voluntary, that their information will be kept private, and that members of the research team will take notes during the interviews; their names will not be attached to these notes, and the evaluation team will not reveal statements made by individuals to anyone outside of the evaluation team. Staff will be further informed that their participation or non-participation will not affect their employment, while participants will be informed that their participation or non-participation will not affect the services they receive or their child support. The evaluation team will obtain verbal informed consent from staff and community partners and written informed consent from study participants prior to conducting interviews.

  • Enforcing strict data security protocols by subcontractors. As noted earlier, MDRC has subcontracted with Decision Information Resources (DIR) to administer the 12-month survey. DIR is contractually obligated to abide by all of MDRC’s data security standards, including maintaining the privacy of the survey subsample’s information. MDRC’s data security protocols are described in more detail in the following subsection.

Data security. In addition to procedures specific to PJAC, MDRC and its staff have a strong institutional commitment to data security. MDRC has strict confidentiality protocols for all research data collection and usage.

MDRC's computer facilities are designed to be available whenever needed, to be secure, and to meet current and future computing needs while maintaining strict authentication and access controls. Data are protected through physical and virtual security, access controls, separate storage, and, when required, encryption at rest at the directory, sub-directory, file, and binary levels. All data are backed up daily, or more frequently if required (up to five times a day). The internal network has redundant virtual servers with failover and, for specific software, clusters of physical servers to provide added redundancy. Storage area networks and network-attached storage provide redundancy for data storage.

The local and wide area networks (LAN and WAN) are based on enterprise-grade design and equipment from Cisco and other manufacturers. These include firewalls; intrusion detection and prevention systems; specialized appliance and software combinations to encrypt data at rest and in motion; specialty equipment to detect network penetrations beyond the capabilities of intrusion detection and prevention systems; virtual private networks for external access by remote offices and users; encrypted internet-protocol-based teleconference and videoconference services; and other software and equipment for management of the enterprise.

Transmission of data is done securely. Encryption protocols, where required, meet U.S. Government Secret and Top Secret protection requirements, using AES 128- and 256-bit encryption in line with Federal Information Processing Standard (FIPS) 140-2.

IRB approval. The evaluation team will formally submit the PJAC evaluation to MDRC’s accredited IRB in July 2017. (IRB Registration Number 0003522, Federal-wide Assurance (FWA) Number 00003694).

  11. Justification for Sensitive Questions

The required staff data entry for random assignment will involve PJAC caseworkers entering participant Social Security numbers into the study MIS. This is necessary to build a sample with unique identifiers so that administrative data, including National Directory of New Hires records, can later be requested using this identifying information.

Sample members who consent to participate in semi-structured participant interviews will be asked questions about their background, including their employment and earnings history and their criminal history. These two factors are among the most significant barriers to child support payment for most noncustodial parents and are therefore particularly relevant to understanding the experiences of those participating in the PJAC evaluation, whose primary goal is to improve reliable child support payments among parents who have fallen far behind in their payments. By learning more about participants’ backgrounds, the evaluation team will be better able to understand the context of their experiences with PJAC services or with business-as-usual child support practices, including the circumstances that brought them to the point of contempt and caseworkers’ decisions about service provision. As discussed in the previous section, all possible measures will be taken to protect the privacy of participant interviewees.

The other instruments submitted under this information collection request contain no sensitive questions. Tracking program participation in the study MIS focuses on non-sensitive information, including contacts and outreach, location efforts, case management, and supportive services. Staff and partner interviews focus on the experiences of program and community organization staff in delivering services. Similarly, the survey tracking letter involves no sensitive questions.

A separate ICR for the 12-month survey will discuss justification of sensitive questions for that data collection tool, as well as the staff survey and staff time study instruments.

  12. Estimates of Annualized Burden Hours and Costs

The estimated reporting burden and cost for the five information collection instruments included in this ICR are presented in Table A.3. These estimates were determined as described in the following bullets.

  • Staff data entry for random assignment. The evaluation team generously estimated that 20 PJAC caseworkers in each of the six sites will conduct random assignment, though the actual number will likely be lower. With a target sample size of 3,000 per site, this amounts to an average of 150 noncustodial parents randomly assigned by each caseworker. As the attached instrument shows, staff will enter only a small number of data elements to conduct random assignment, estimated to take about three minutes (0.05 hours) per sample member. This works out to 900 total burden hours (120 caseworkers × 150 sample members × 0.05 hours), or 300 burden hours when annualized over three years. We estimate the average hourly wage for staff at the child support agencies to be the average hourly wage of “social and community service managers” from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2016 ($34.07). Multiplying the annual burden hours by this wage yields a total annual cost of $10,221 for this information collection activity.

  • Study MIS to track program participation. The evaluation team made the same assumptions for this activity regarding the number of staff and the number of responses per staff member as explained in the preceding bullet. Over the approximately one year of follow-up, we estimate that PJAC caseworkers will spend about one hour on MIS data entry per sample member, resulting in a total burden of 18,000 hours, annualized to 6,000 hours. Using the same average caseworker wage of $34.07, this amounts to a total annual cost of $204,420.

Table A.3. Annual Burden and Cost Estimates

Instrument | Total Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Total Annual Burden Hours | Average Hourly Wage | Total Annual Cost
Staff data entry for random assignment | 120 | 50 | 0.05 | 900 | 300 | $34.07 | $10,221
Study MIS to track program participation | 120 | 50 | 1.00 | 18,000 | 6,000 | $34.07 | $204,420
Staff and community partner interview topic guide | 150 | 0.67 | 1.00 | 300 | 100 | $34.07 | $3,407
Participant interview topic guide | 180 | 0.33 | 1.00 | 180 | 60 | $8.675 | $520.50
Participant survey tracking letter | 3,000 | 1 | 0.10 | 900 | 300 | $8.675 | $2,602.50

Estimated Total Annual Burden Hours: 6,760

Estimated Total Annual Cost: $221,171


  • Staff and community partner interview guide. The evaluation team plans to interview as many as 25 child support agency staff and community partner staff in each site at two points in time (once during each of two rounds of implementation visits). This amounts to 150 staff members each giving two responses. Each interview is estimated to take one hour, for a total of 300 burden hours, annualized to 100. Multiplied by the estimated average wage for these respondents ($34.07), the total annual cost is an estimated $3,407.

  • Participant interview guide. The evaluation team plans to interview up to 15 unique participants in each of six sites at two points in time, for a total of 180 respondents, each giving an hour of their time to complete the interview. This works out to 180 burden hours, annualized to 60. The average hourly wage estimate for participants was calculated as the average current minimum wage across sites (AZ = $10/hour, CA = $10.50/hour, MI = $8.90/hour, OH = $8.15/$7.75 depending on the size of the business, VA = $7.25), resulting in an estimated hourly wage of $8.675. Multiplying the annual burden hours by this wage yields a total annual cost of $520.50.

  • Participant survey tracking letter. The survey subsample will be composed of 3,000 noncustodial parents. Each will be mailed a tracking letter at up to three points in time and asked to update their contact information, a task we estimate at six minutes (0.10 hours). The total burden calculates to 900 hours, annualized to 300. Multiplied by the estimated hourly wage for noncustodial parents ($8.675), this produces a total annual cost of $2,602.50. (Note that this is likely an overestimate: tracking mailings are known to have very low response rates, generally around 10 percent, so only a small fraction of sample members is likely to complete this task for any given mailing.)
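As a cross-check, the row and total figures above can be reproduced mechanically. The short Python sketch below is illustrative only; it re-applies the arithmetic described in the bullets, using the exact fractions 2/3 and 1/3 where Table A.3 shows the rounded values .67 and .33.

```python
# Recompute the annual burden and cost figures from Table A.3.
# Each row: (instrument, total respondents, annual responses per respondent,
# burden hours per response, average hourly wage).
rows = [
    ("Staff data entry for random assignment", 120, 50, 0.05, 34.07),
    ("Study MIS to track program participation", 120, 50, 1.00, 34.07),
    ("Staff and community partner interview topic guide", 150, 2 / 3, 1.00, 34.07),
    ("Participant interview topic guide", 180, 1 / 3, 1.00, 8.675),
    ("Participant survey tracking letter", 3000, 1, 0.10, 8.675),
]

total_hours = 0.0
total_cost = 0.0
for name, respondents, responses, hours_per_response, wage in rows:
    annual_hours = respondents * responses * hours_per_response
    annual_cost = annual_hours * wage
    total_hours += annual_hours
    total_cost += annual_cost
    print(f"{name}: {annual_hours:,.0f} hours, ${annual_cost:,.2f}")

print(f"Estimated total annual burden hours: {total_hours:,.0f}")  # 6,760
print(f"Estimated total annual cost: ${total_cost:,.2f}")          # $221,171.00
```

The recomputed totals match the figures reported above and in Table A.3.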

  13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

There are no additional costs to respondents.

  14. Annualized Cost to the Federal Government

The total cost for completing the components of the PJAC evaluation under this current request will be $1,451,260. Annual costs to the Federal government will be $483,753.

  15. Explanation for Program Changes or Adjustments

This is a new submission. There is no request for program changes or adjustments.

  16. Plans for Tabulation and Publication and Project Time Schedule

    1. Analysis Plan

The instruments contained in the current ICR will require both qualitative and quantitative analysis methods:

Staff data entry for random assignment will allow program and control groups to be established; these groups are the basis for the evaluation and will factor heavily into each of the three evaluation components: the implementation study, the impact study, and the benefit-cost study. The impact study will compare the outcomes of the program group to the control group. In addition, a few key data elements that will be part of staff data entry for random assignment will be used as covariates in our linear regression models, making impact estimates more precise, and/or as subgroup variables, allowing us to determine whether the effects of the program differ across key groups of noncustodial parents. All quantitative statistical analysis will be conducted on secure servers. For more information about statistical methods, please see Supporting Statement B.
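To illustrate the basic logic of the impact comparison, the sketch below fits an ordinary least squares regression of an outcome on a treatment indicator plus one baseline covariate. This is a stylized example only, using simulated data and hypothetical variable names, not the evaluation's actual model specification.

```python
import random

random.seed(42)

# Simulated sample: y = an outcome (e.g., months with a payment),
# t = random assignment indicator, x = a baseline covariate
# (e.g., prior payment history). The simulated program effect is 1.5.
n = 2000
data = []
for _ in range(n):
    t = random.randint(0, 1)
    x = random.gauss(10, 3)
    y = 2.0 + 1.5 * t + 0.4 * x + random.gauss(0, 2)
    data.append((y, t, x))

def ols(data):
    """OLS via the normal equations (X'X)b = X'y, columns
    [intercept, treatment, covariate], solved by Gaussian elimination."""
    X = [(1.0, t, x) for _, t, x in data]
    y = [row[0] for row in data]
    k = 3
    A = [[sum(xi[i] * xi[j] for xi in X) for j in range(k)] for i in range(k)]
    b = [sum(xi[i] * yi for xi, yi in zip(X, y)) for i in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k  # back substitution
    for r in range(k - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

intercept, effect, beta_x = ols(data)
print(f"Estimated program impact: {effect:.2f}")  # close to the simulated 1.5
```

Because assignment is random, the treatment coefficient is an unbiased estimate of the program's effect; including baseline covariates does not change what is being estimated but tightens the confidence interval, which is why a few elements are captured at the point of random assignment.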

The study MIS to track program participation will be a core component of the implementation study, allowing us to learn what services were delivered, the frequency and intensity of those services, and how the program group services contrast with the status quo. These quantitative data will be processed and analyzed in SAS, with key measures created and presented in various tables, figures, and infographics. These data will also be used in the benefit-cost study to estimate the cost of PJAC services.

The staff and community partner interviews and participant interviews will both be written up into standardized templates that reflect the content in the interview topic guide and then coded by theme and analyzed using Dedoose, specialized software used to analyze qualitative data. These interviews, also core to the implementation study, will be integrated with the MIS data findings in order to gain a fuller understanding of PJAC implementation and address the key implementation-related research questions using mixed methods.

The participant tracking letter itself will be analyzed only to determine the response rates it yields in terms of updated contact information.

    2. Time Schedule and Publication

This study is expected to be conducted over a roughly five-year period beginning in January 2018. This ICR is requesting burden for three years; we will request an extension when appropriate.

Table A.4. Schedule for the PJAC Evaluation

Activity | Date
Intake period | January 2018-December 2020
Implementation analysis plan | February 2018
Policy brief introducing project | May 2018
Impact analysis plan | June 2018 (draft)
Implementation site visits | Fall 2018, Winter 2020
Implementation study briefs with cost analysis | July 2019
Interim impact memo | March 2021
Final impact report | Fall 2022
Final benefit-cost report | Fall 2022

  17. Reason(s) Display of OMB Expiration Date is Inappropriate

All instruments will display the expiration date for OMB approval.

  18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

REFERENCES

Berlin, Martha, Leyla Mohadjer, Joseph Waksberg, Andrew Kolstad, Irwin Kirsch, D. Rock, and Kentaro Yamamoto. 1992. An experiment in monetary incentives. Pp. 393-398 in Proceedings of the Section on Survey Research Methods. Alexandria, VA: American Statistical Association.

Bryant, Cynthia. 2012. Case Conferences: A Better Way to Reach Agreements. Child Support Report.

Coffin, Ann. 2014. Florida's Data Analytics: Compliance of Support Orders. Presentation to the OCSE Strategic Planning Workgroup on Measuring Child Support Performance.

Diaz, José and Richard Chase. 2010. Return on Investment to the FATHER Project, Wilder Research.

Gardiner, Karen. 2002. Administrative and Judicial Processes for Establishing Child Support Orders. Lewin Group.

James, Jeannine M., and Richard Bolstein. 1990. The effect of monetary incentives and follow-up mailings on the response rate and response quality in mail surveys. Public Opinion Quarterly 54: 346-361.

Lowry, Pamela and Diane Potts. 2010. Illinois Update on Using Civil Contempt to Collect Child Support. Chicago, IL: American Bar Association.

Mercer, Andrew, Andrew Caporaso, David Cantor, and Reanne Townsend. 2015. How Much Gets You How Much? Monetary Incentives and Response Rates in Household Surveys. Public Opinion Quarterly 79: 105-129.

Ovwigho, Pamela, Correne Sanders, and Catherine Born. 2009. Early Intervention and Child Support Outcomes: Lessons Learned. MD: University of Maryland, School of Social Work.

Pearson, Jessica, Nancy Thoennes, and Lanae Davis. 2007. Early Intervention in Child Support. Denver, CO: Center for Policy Research.

Singer, Eleanor and Richard A. Kulka. 2002. “Paying respondents for survey participation,” in Studies of Welfare Populations: Data Collection and Research Issues, eds. Michele Ver Ploeg, Robert A. Moffitt and Constance F. Citro, National Academy Press, Washington, D.C., pp. 105-127.

Sorensen, Elaine and Tess Tannehill. 2006. Preventing Child Support Arrears in Texas by Improving Front-End Processes. Washington, DC: The Urban Institute.

U.S. Department of Labor, Bureau of Labor Statistics. 2016. Occupational Employment and Wages. OES Tables. Washington, DC: U.S. Bureau of Labor Statistics.

1 As noted earlier, the inclusion of the study MIS in the evaluation is conditional on the award of additional funding. OCSE expects this funding request to be formally approved and awarded by September 30, 2017.

2 Note that some of the items included in the random assignment data entry protocol (IC #1) are available in administrative child support data. However, since the data elements listed are under consideration for special analytic use in the impact study (as subgroup indicators or covariates for regression models), it is vital that they be captured at the point of random assignment, lest they be changed, filled in, or updated post-random assignment, thus introducing bias. For this reason, the evaluation team selected a small number of key elements and plans to request that these items be entered by staff at the time of study enrollment rather than relying on administrative data provided later.

3 A future OMB submission will seek clearance for a 12-month follow-up survey of sample members.
