
U.S. Department of Health and Human Services


Administration for Children and Families


Office of Child Support Enforcement (OCSE)


Aerospace Building, 4th Floor


901 D Street, SW


Washington DC 20447


Project Officer: Elaine Sorensen

Child Support Noncustodial Parent Employment Demonstration (CSPED)



Extension

OMB No. 0970-0439

OMB Supporting Statement for Implementation, Cost, and Impact Studies

Part B: Statistical Methods

April 12, 2016





1. Respondent Universe and Sampling Methods

In October 2012, the Office of Child Support Enforcement (OCSE) within the Administration for Children and Families (ACF) issued grants to eight state child support agencies to provide employment, parenting, and child support services to noncustodial parents (NCPs) who are having difficulty meeting their child support obligations. The demonstration is called the Child Support Noncustodial Parent Employment Demonstration (CSPED). The overall objective of the CSPED evaluation is to document and evaluate the effectiveness of the approaches taken by the eight grantees. All eight grantees are included in the evaluation’s two key components: (1) the implementation and cost study; and (2) the impact study.

Implementation and Cost Study

During the requested extension period, two components of the implementation and cost study remain: semi-structured interviews with grantee site staff, and continued tracking of program participation and service use in the study management information system (MIS) for all NCPs randomly assigned to the treatment condition.

Staff interview topic guide. Interviews are conducted with child support staff at the state and local levels, as well as with staff at the partner agencies that provide core CSPED services. Respondents are selected purposively, using organizational charts and information on each employee’s role at the host organization and its partner organizations. Purposive selection is appropriate for staff selection because the needed insights and information can come only from individuals with particular roles or knowledge. An estimated 120 child support and partner agency staff are expected to be interviewed in 2016. We interview the grantee director as well as key managers and coordinators. Some grantees are implementing CSPED in multiple counties or communities; for those grantees, we also interview the grantee’s lead staff member in each community. All grantees are required to partner with other organizations to provide employment and fatherhood services, so we interview the lead staff person responsible for grant activities at each partner agency. In addition, we interview a sample of frontline case support, employment, and fatherhood staff. If an agency has dedicated more than one frontline staff person to the demonstration, we randomly select one of them to interview.

Study MIS to track program participation. The study MIS is used by program staff to document services received by all CSPED program participants who are selected into the treatment group during the grant period.

Impact Study

Noncustodial parents (NCPs) are eligible for CSPED if they are not stably employed and are behind in their child support payments, or appear likely to fall behind because of inadequate earnings. Grantees are expected to recruit 500 eligible NCPs into the research sample each year for three years, yielding a total of 1,500 sample members within each site and 12,000 sample members across all eight sites. Half of these NCPs will be randomly assigned to the treatment group and will be offered CSPED services; the other half will be randomly assigned to the control group and will not be offered these services.
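For illustration only, the sketch below shows the basic logic of a 50-50 random assignment within a site. The actual assignment is carried out by the study MIS at the point of intake; the function name and identifiers here are hypothetical.

    import random

    def assign_site_sample(ncp_ids, seed=None):
        """Illustrative 50-50 random assignment of a site's enrolled NCPs.

        This is a simplified batch sketch; the study MIS assigns each NCP
        at intake rather than shuffling a completed roster."""
        rng = random.Random(seed)
        ids = list(ncp_ids)
        rng.shuffle(ids)
        half = len(ids) // 2
        treatment = set(ids[:half])
        return {ncp: ("treatment" if ncp in treatment else "control")
                for ncp in ids}

    # Example: one site's target sample of 1,500 NCPs
    assignments = assign_site_sample(range(1500), seed=2016)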

Because there is likely to be substantial variation in the CSPED programs offered by the eight grantees, each site will be analyzed separately, although some pooled analysis will also be conducted if such analysis is deemed appropriate once more is known about the similarity of the eight CSPED programs. To support site-level analysis, relatively large samples are required within each site to detect policy-relevant program impacts.

We are currently enrolling NCPs into the study, conducting baseline interviews, tracking service receipt, collecting administrative data, and have begun conducting follow-up interviews with NCPs 12 months after their random assignment date. As of January 31, 2016, 8,060 participants had been enrolled and completed the baseline survey, and more than 2,300 12-month follow-up surveys had been completed. When the original data collection approval expires on September 30, 2016, we will still need to conduct follow-up interviews with approximately 1,000 more participants. In addition, grantee site staff will continue to track program participation and service use in the study MIS for all NCPs randomly assigned to the treatment condition, and we will need to complete semi-structured interviews with site staff. We anticipate that all data collection activities will be completed by September 30, 2018.

2. Procedures for Collecting Information

a. Statistical Methodology, Estimation, and Degree of Accuracy

Implementation and Cost Study

In this ICR, clearance is sought for an extension of the following information collection activities associated with the CSPED implementation and cost study. The data collection procedures are described below:

  1. Staff interview topic guide (IC #1). Interviews are being conducted with state and local child support staff and local service provider staff during site visits in the first and third years of CSPED implementation. Interviews are one-on-one or in small groups, depending on the staffing structure, roles, and number of staff in each role. Topics for the first round of interviews include the service model, implementation system, and inputs to implementation; sample recruitment strategies, challenges, and successes; assessment of early program operations and participant responsiveness; experiences of staff, participants, and community partners; and operational challenges and solutions. Topics for the second round of interviews, conducted once service delivery has reached a steady state, include documentation of program operations; assessment of participant experiences and responsiveness; community partnerships; staff experiences; adaptations to the service model; and cost data collection.

  2. Study MIS to track program participation (IC #2). The MIS is used in each site to document service use by program participants. The web-based system allows program staff to document all individual service contacts with participants, referrals to other community services, incentives provided, and participation in group activities such as peer support groups or job readiness workshops.

Information collected for the implementation and cost study will be descriptive; in general, it will not involve formal hypothesis testing.

Impact Study

The data collection procedures for the extension of the CSPED impact study instruments are described below:

  1. Introductory Scripts Read by Program Staff, Introductory Script Heard by Program Applicants, Baseline Survey, and Study MIS to Conduct Random Assignment (IC #3, 4, 5, 6). In the eight evaluation sites, program staff will identify noncustodial parents (NCPs) who are eligible for CSPED services. Intake workers at each program will meet with eligible and interested NCPs in person to enroll them in the study sample. They will use the introductory script to describe the study to potential enrollees. When intake workers are ready to enroll an NCP into the CSPED study, program staff will call the University of Wisconsin Survey Center to connect with a trained interviewer who will administer the consent form and conduct the baseline survey. Once the baseline survey is complete, the interviewer will instruct the NCP to hand the phone back to the intake worker, who then will use the Study MIS to confirm that the NCP is eligible for random assignment and to conduct the random assignment process.

  2. Protocol for Collecting Administrative Records (IC #7). The evaluation team is collecting administrative records pertaining to the activities of study participants. These administrative records include: (1) child support records collected from state child support enforcement agencies, (2) wage and unemployment insurance benefit records from state labor agencies, (3) public assistance benefit records from state human services agencies, and (4) criminal justice records from state and county criminal justice agencies. To acquire these data, project staff coordinate with staff at the relevant agency regarding data availability and transfer protocols. Data are transferred to the CSPED evaluation team following all appropriate confidentiality procedures for handling sensitive data.

  3. 12-Month Follow-Up Survey (IC #8). In addition, a follow-up survey is conducted approximately one year after random assignment. The sample for the one-year follow-up survey includes approximately 1,000 sample members within each site who enroll in the study in the first two years. A response rate of 80 percent is anticipated for the follow-up survey, or approximately 800 survey respondents within each site. Study participants are contacted by mail approximately two weeks prior to the start of data collection to provide notification of the upcoming survey request. To achieve a high response rate, we send reminder emails or text messages to those who have not responded within two weeks and then follow up with reminder postcards. The language for these contact materials is presented in Supplement D. All analysis of follow-up survey data will account for survey nonresponse using nonresponse weights, calculated with standard techniques that estimate the probability of response as a function of baseline characteristics (a sketch of this reweighting step follows this list). Administrative records data on child support outcomes, earnings and employment, TANF, SNAP, and Medicaid benefit receipt, and criminal justice outcomes will be gathered on all sample members within each site.
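As a purely illustrative sketch of the reweighting approach described in item 3, assuming a logistic response-propensity model (the data frame and column names below are hypothetical, not the study's actual variables):

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    def nonresponse_weights(df, covariates, responded_col="responded"):
        """Estimate each sample member's probability of responding to the
        follow-up survey from baseline characteristics, then weight
        respondents by the inverse of that estimated probability."""
        model = LogisticRegression(max_iter=1000)
        model.fit(df[covariates], df[responded_col])
        p_respond = model.predict_proba(df[covariates])[:, 1]
        df = df.assign(weight=1.0 / p_respond)
        # Only respondents carry weights into the analysis sample.
        return df[df[responded_col] == 1]

Weighted impact estimates then use the weight column so that respondents stand in for nonrespondents with similar baseline profiles.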

Table B.1 reports site-level minimum detectable impacts on two illustrative binary outcomes—one with 50 percent prevalence (such as employment within a quarter) and another with 20 percent prevalence (such as whether a child support order was modified over the follow-up period). The table also includes minimum detectable impacts on annual earnings, child support payments, and a continuous father engagement scale. Separate estimates are presented for measures based on follow-up survey data and administrative records data, as well as for site-level and pooled analysis.

Both the survey and administrative records samples yield adequate statistical power for detecting impacts on key outcomes. The primary impact estimates will be based on impacts that are pooled across sites. The survey and administrative record samples are sufficient to allow for some additional subgroup and site-level analysis.

Table B.1. Minimum Detectable Impacts for Key Outcomes

                                       Employed in      Had Child Support    Annual            Monthly Child      Father
                                       the Quarter (%)  Order Modified (%)   Earnings ($)      Support Payments   Engagement Scale
Sample                                 (Mean 50%)       (Mean 20%)           (SD = $14,717)    Made ($)           (Effect Size)
                                                                                               (SD = $217)

Follow-up Survey (80% response rate)
  Pooled, full sample (N=6,400)        2.8              2.2                  $818              $12                0.06
  Pooled, 25% subgroup (N=1,600)       5.6              4.4                  $1,637            $24                0.11
  Site-level, full sample (N=800)      7.9              6.3                  $2,315            $34                0.16

Administrative Records
  Pooled, full sample (N=12,000)       2.0              1.6                  $590              $9                 N/A
  Pooled, 25% subgroup (N=3,000)       4.1              3.2                  $1,195            $18                N/A
  Site-level, full sample (N=1,500)    5.7              4.6                  $1,690            $24                N/A

Note: Figures assume a 50-50 split of sample members into program and control groups, a two-tailed test with a 95-percent confidence level and 80-percent power, and an R-squared in the impact regression of 0.20. N/A = not applicable (measure not available from administrative records). Standard deviations for annual earnings and the amount of monthly child support payments made are drawn from the Building Strong Families evaluation (Wood et al. 2012).
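For reference, minimum detectable impacts of the kind shown in Table B.1 follow the standard approximation MDI = (z_crit + z_power) x sigma x sqrt((1 - R^2) / (N x p(1 - p))), where p is the treatment fraction (0.5 here). Below is a minimal sketch under the note's assumptions, with the test specification left as an explicit input (with these inputs, the one-sided variant reproduces the $818 pooled survey earnings entry):

    from math import sqrt
    from scipy.stats import norm

    def mdi(sigma, n, r2=0.20, p_treat=0.5, alpha=0.05, power=0.80,
            two_sided=True):
        """Minimum detectable impact for a two-arm experiment with a
        regression-adjusted impact estimate (standard approximation)."""
        z_crit = norm.ppf(1 - alpha / 2) if two_sided else norm.ppf(1 - alpha)
        z_power = norm.ppf(power)
        se = sigma * sqrt((1 - r2) / (n * p_treat * (1 - p_treat)))
        return (z_crit + z_power) * se

    # Annual earnings (SD = $14,717), pooled survey sample (N = 6,400):
    print(round(mdi(14717, 6400)))                    # two-sided: ~922
    print(round(mdi(14717, 6400, two_sided=False)))   # one-sided: ~818

The remaining entries scale with the outcome's standard deviation and with sqrt(1/N).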

b. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.

c. Periodic Cycles to Reduce Burden

There will be only one cycle of baseline data collection and one cycle of follow-up data collection.

3. Methods to Maximize Response Rates and Deal with Nonresponse

Implementation and Cost Study

Study MIS to track program participation. To maximize response rates and data reliability for the study MIS, we take these steps:

  • Develop a user-friendly, flexible MIS. The MIS is specifically designed for use by grantee site staff: it is simple to use and flexible enough to meet the needs of each site. Making the system simple and easy to use improves the quality of the data collected. In addition, by providing sites with this system, we standardize the information collected from each site and improve the reliability of the data supporting the implementation and impact components.

  • Include data quality checks in the MIS. The MIS also promotes data reliability through automatic data quality checks. For example, if grantee staff enter odd or unlikely values in a particular field, the system prompts users to check the value. For some fields, the response values are restricted; for others, grantee site staff are able to override the check. (A minimal sketch of this kind of field validation follows this list.)

  • Provide extensive training to grantee site staff. To increase data quality, we provide extensive training to system users prior to initial use. Initial training was conducted on site; follow-up training is conducted via web and telephone conferences. Following training, CSPED team members conduct follow-up site visits to ensure compliance with procedures and remain available by phone and email to assist users.

  • Monitor data quality. We also monitor the data entered by grantee sites and provide feedback to grantees on their data quality. Initially, we monitor data quality on a weekly basis, tapering that gradually to monthly monitoring, as agencies demonstrate their ability to use the system correctly.
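As referenced above, the following is a minimal sketch of the kind of field-level validation the MIS performs. The field names, ranges, and override rules are hypothetical illustrations, not the system's actual configuration.

    # Hypothetical field rules: (minimum, maximum, overridable)
    FIELD_RULES = {
        "session_minutes": (1, 480, True),   # unlikely values prompt a check
        "participant_age": (16, 99, False),  # out-of-range values are rejected
    }

    def check_field(field, value, override=False):
        """Return True if the value may be saved, mimicking the mix of
        hard restrictions and soft, overridable prompts described above."""
        low, high, overridable = FIELD_RULES[field]
        if low <= value <= high:
            return True
        if overridable and override:
            return True  # staff confirmed an unusual but genuine value
        print(f"Please check {field}: {value} is outside the expected "
              f"range {low}-{high}.")
        return False

    check_field("session_minutes", 600)                 # prompts staff
    check_field("session_minutes", 600, override=True)  # accepted on override
    check_field("participant_age", 12)                  # always rejected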

Impact Study

Baseline and 12-Month Follow-Up Surveys. In evaluation sites, to maximize response rates and data reliability for the survey efforts, we take the following steps:

  • Use a tested questionnaire common to all sites. While the CSPED baseline and follow-up surveys have been tailored to the specific circumstances of the CSPED evaluation, both are based closely on the Parents and Children Together (PACT) baseline and follow-up surveys. PACT is an ACF initiative that has received OMB approval (OMB Control Number 0970-0403), was extensively tested, and is currently being fielded. The alignment of the CSPED follow-up survey with the PACT follow-up survey is described in Supplement C.

  • Use a straightforward, undemanding survey. The CSPED baseline and follow-up surveys are designed to be easy to complete. The questions use clear and straightforward language. The average time required for the respondent to complete the baseline survey is estimated at 30 minutes,1 and the average time required for the respondent to complete the follow-up survey is estimated at 45 minutes.

  • Administer the surveys using computer-assisted telephone interviewing (CATI). Administering the baseline and follow-up surveys via CATI maximizes the reliability of the data entered by telephone interviewers through skip-pattern logic and checks for consistency and validity.

  • Use trained interviewers. Respondents are interviewed by trained members of the University of Wisconsin Survey Center’s survey operations staff and members of Mathematica Policy Research’s survey operations center staff, many of whom have significant experience working on similar studies. All survey staff assigned to the study participate in both general training (if they are not already trained) and extensive project-specific training. Interviewers do not work on the study until they have been certified as prepared. The project-specific training includes role playing with scenarios and other techniques to ensure that interviewers are ready to respond effectively to sample members’ questions; it also focuses on developing skills for securing respondents’ cooperation and for averting and converting refusals.

  • Be able to administer the survey in multiple languages. During telephone contact, interviewers identify Spanish-speaking respondents and connect them with a certified Spanish-language interviewer. Both the University of Wisconsin Survey Center (UWSC) and Mathematica Policy Research employ staff who have experience conducting interviews in Spanish. If, once sample enrollment is underway, the evaluation team determines that interviews need to be conducted in languages other than English or Spanish, UWSC will hire interviewers with the necessary language skills.

  • Provide payments to survey participants. We offer a modest $10 payment to baseline survey respondents to increase program applicants’ agreement to participate in the study and to reduce attrition in follow-up data collection. We offer an additional $25 payment to follow-up survey respondents to increase willingness to participate in follow-up data collection efforts. (This is discussed in greater detail in Section A.9.)

  • Collect contact information prior to the 12-month follow-up survey. On the baseline survey, multiple forms of contact information are collected from the respondent in order to maximize the contact options available at the time of follow-up. Additionally, contact information for persons with whom the respondent is in close contact is collected at baseline to assist with locating hard-to-find noncustodial parents at the time of the 12-month follow-up survey.

  • Perform intensive field locating for hard-to-reach respondents. After attempting to contact nonrespondents by telephone and via a nonresponse locating postal mailing (see Supplement D), highly trained field locators from Mathematica Policy Research perform in-person locating of hard-to-reach sample members. Once a sample member is located and agrees to participate, the locator provides a cell phone for the sample member to call in and complete the telephone survey with an interviewer.

We anticipate high response rates to the baseline survey, since only those who complete the baseline survey will be randomly assigned. Based on experience with PACT, Building Strong Families, and other prior research studies, the evaluation team anticipates that 95 percent of those offered the opportunity to enroll in the CSPED study will agree to participate in the evaluation (consent) and that 100 percent of those who do consent will complete the baseline survey as part of the intake process. Also, the evaluation team does not anticipate significant item nonresponse on the baseline survey based on prior experience asking similar questions with similar populations.

We also anticipate achieving an 80 percent response rate to the follow-up survey, based on the amount of contact information collected at baseline, the salience of the interview topic, our intent to conduct intensive field locating of nonrespondents, and our experience with prior research studies (such as Building Strong Families, in which an 80 percent response rate was obtained). The evaluation team does not anticipate significant item nonresponse on the follow-up survey based on prior experience asking similar questions with similar populations.

We will closely monitor the difference in treatment and control response rates, using the attrition boundaries developed by Mathematica Policy Research as part of ACF’s Employment Strategies for Low-Income Adults Evidence Review: Standards and Methods (Mastri, Sama-Miller, and Clarkwest 2015) as our guide. If a response rate lower than 80 percent is achieved, the treatment and control response rates must differ by no more than the conservative boundary for the level of attrition to be considered low and the level of potential bias acceptable. For example, at a 70 percent overall response rate, the treatment and control response rates must differ by no more than 4 percentage points. A sketch of this monitoring check follows.
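The sketch below illustrates the monitoring rule just described. Only the 70-percent entry is taken from the example above; the other boundary values are placeholders, not figures from Mastri et al. (2015).

    # Maximum allowable treatment-control response-rate gap, by overall
    # response rate. Only the 0.70 entry comes from the text's example;
    # the others are hypothetical placeholders.
    CONSERVATIVE_BOUNDARY = {0.80: 0.06, 0.75: 0.05, 0.70: 0.04}

    def attrition_acceptable(rr_treatment, rr_control):
        """Compare the treatment-control response-rate gap against the
        boundary for the nearest tabulated overall response rate."""
        overall = (rr_treatment + rr_control) / 2
        nearest = min(CONSERVATIVE_BOUNDARY, key=lambda r: abs(r - overall))
        gap = abs(rr_treatment - rr_control)
        return gap <= CONSERVATIVE_BOUNDARY[nearest]

    print(attrition_acceptable(0.72, 0.68))  # 70% overall, 4-point gap: True
    print(attrition_acceptable(0.74, 0.66))  # 70% overall, 8-point gap: False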

4. Tests of Procedures or Methods to be Undertaken

Study MIS. All functions of the automated MIS are rigorously tested and evaluated by the development team to ensure proper functionality. Additionally, we consult with practitioners on the usability of the system and engage these practitioners in the testing phase.

Program Staff Survey. The evaluation team carefully tested the web-based version of the instrument to ensure that skip patterns were correctly programmed and that the flow through the instrument worked properly.

Baseline Survey. Most questions on the CSPED baseline survey were drawn from the baseline survey used in the PACT evaluation. In-person cognitive interviews and telephone pretests of the PACT baseline survey were conducted to ensure that questions were understood and were consistent with the concepts they aim to measure; to identify typical instrumentation problems such as question wording and incomplete or inappropriate response categories; to measure the response burden; and to check that there are no unforeseen difficulties in administering the instrument via telephone.

12-Month Follow-Up Survey. As described in Supplement C, most questions on the CSPED 12-month follow-up survey were drawn from the follow-up survey used in the PACT evaluation. Additionally, a pretest with no more than 9 noncustodial parents will be conducted prior to follow-up survey data collection to ensure that all questions are understood by participants, and that questions measure the intended concepts.

5. Individuals Consulted on Statistical Methods

Preliminary input on statistical methods was received from staff in OCSE as well as staff at the University of Wisconsin and Mathematica Policy Research, including the following individuals:

Dr. Dan Meyer

University of Wisconsin-Madison

School of Social Work

1350 University Ave.

Madison, WI 53706

Dr. Robert Wood

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543

In the future, further input on analytic approaches may be sought from additional staff at these organizations, and from outside consultants.

REFERENCES

Mastri, A., E. Sama-Miller, and A. Clarkwest. “Employment Strategies for Low-Income Adults Evidence Review: Standards and Methods.” Report submitted to the U.S. Department of Health and Human Services, Administration for Children and Families, Office of Planning, Research and Evaluation. Washington, DC: Mathematica Policy Research, May 2015.


Wood, R. G., Q. Moore, A. Clarkwest, A. Killewald, and S. Monahan. “The Building Strong Families Project: The Long-Term Effects of Building Strong Families: A Relationship Skills Education Program for Unmarried Parents.” Report submitted to the U.S. Department of Health and Human Services, Administration for Children and Families. Princeton, NJ: Mathematica Policy Research, November 2012.


1 As noted in burden tables, NCPs will be on the telephone with survey interviewers for a total of 35 minutes to account for the approximately 5 minutes it will take for them to complete the consent process before beginning the baseline survey.
