
Parents and Children Together (PACT) Evaluation:


OMB Supporting Statement

for the Data Collection Necessary to Select Grantees for the Study


Part B: Collection of Information Involving Statistical Methods

January 18, 2012


Submitted By:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


7th Floor, West Aerospace Building

370 L’Enfant Promenade, SW

Washington, D.C. 20447


Project Officers:

Nancye Campbell

Seth Chamberlain

This information collection request (ICR) is for clearance to collect information for the Parents and Children Together (PACT) evaluation of a subset of Responsible Fatherhood (RF) and Healthy Marriage (HM) grants authorized under the Claims Resolution Act of 2010 (Public Law 111-291). The Responsible Fatherhood and Healthy Marriage (RFHM) grantees represent the “next generation” of grantees that build on what has been learned by earlier grantees and take a more comprehensive approach to serving families. The evaluation is being undertaken by the U.S. Department of Health and Human Services, Administration for Children and Families (ACF) and is being implemented by Mathematica Policy Research and its partner, ICF International.

Work under PACT will be carried out in stages with different types of information collection in each stage. Thus, requests for clearance will be submitted in stages as work progresses. This first submission requests clearance for “field data collection,” that is, to collect information from grantees and key partners that will inform selection of a subset of grantees for evaluation. The information will be collected via telephone calls and in-person conversations either at grantee meetings or at the grantees’ organizations. The submitted discussion guide, if approved, will be used for this information collection. Subsequent submissions will request clearance for further data collection instruments (e.g., baseline and follow-up instruments for the impact study and interview protocols for the implementation and qualitative studies). These instruments will be developed after we select the grantees to be evaluated and additional design work has been completed.

While this document requests clearance only for the data collection necessary to inform grantee selection, it discusses the entire plan for the study. Because the study is still in its design phase, some of the details of the plan may change, as each stage may influence subsequent stages.

1. Respondent Universe and Sampling Methods

The PACT evaluation will focus on grantee programs purposively selected for the study. Up to 15 grantees are expected to be selected for the impact/implementation component (qualitative data is expected to be collected in these sites in addition to impact/implementation data – see section A1 for more detail) and up to 15 separate grantees will be selected for the qualitative component. The grantees will be selected for their ability to address important research questions, including:

  • What are the experiences, issues and challenges in designing, implementing and operating comprehensive responsible fatherhood and healthy marriage education services for lower-income fathers or couples?

  • What are the net impacts of the programs on relationship quality and stability, parenting attitudes and behaviors, measures of adult and child well-being, and economic outcomes?

  • What are the experiences of fathers and couples who volunteer for the programs?

Additional selection criteria for a grantee to be included in the impact/implementation component are (1) it must be possible to collect the necessary baseline information, to insert random assignment into the program’s intake procedures, and to prevent the control group from receiving the same or similar services offered to the program group; (2) the program must be able to enroll enough participants to meet sample size requirements; and (3) it must be plausible that the program can lead to impacts that are detectable with the planned sample size.

The sample frame for the impact/implementation study includes all eligible applicants to the selected grantee programs who consent to participate in the study. The sample intake period is expected to be about two years. We expect about 400 eligible applicants in each program, with 200 assigned to the program group and 200 to the control group. Baseline and follow-up data will be collected on all sample members.

The sample frame for the qualitative study will be all participants in the selected grantee sites enrolled between September 2011 and January 2013. Sample members will be selected purposively.

With regard to the data collection for which this request is submitted—collecting additional information from grantees—the respondent universe for the grantee discussions includes all 115 RFHM grantees funded in September 2011, as well as their key partner agencies. Applications from all 115 successful RF and HM grantees will be reviewed. More in-depth information will only be collected from grantees and their partner agencies if information from their applications suggests that they may be a candidate for the impact/implementation study or the qualitative study (see selected research questions and additional selection criteria above). As soon as sufficient information has been collected to determine that the grantee is not a potential candidate for inclusion in the study, data collection from the grantee will cease.

2. Procedures for Collecting Information

a. Statistical Methodology, Estimation, and Degree of Accuracy

With regard to the study as a whole, Table B.1 shows that, with a single-site sample of 400 (200 in the program group and 200 in the control group),1 we are confident of detecting impacts on continuous outcomes that have an effect size of 0.20 or larger. This is small enough to detect impacts on fathers’ attitudes toward fatherhood: Cowan et al. (2009) found an effect size of 0.31 for a fatherhood program’s impact on a measure of the extent to which fathers viewed “fatherhood” as one of the main roles in their lives. The sample is also large enough to detect the effect size of 0.21 found in the Oklahoma City Building Strong Families (BSF) program for the impact on couples’ relationship happiness (Wood et al. 2010). It is large enough, too, to detect an impact on employment of 6 percentage points, an impact smaller than the one found in a pilot employment program for parents behind in their child support in four communities in New York (Lippold and Sorensen 2011). Wood et al. (2010) found that the Oklahoma City BSF program increased the percentage of fathers who provide substantial financial support for raising their children by 8 percentage points; this impact would be detectable with a sample size of 400. The sample is also large enough to detect impacts on subgroups if the data are pooled across two or more programs.


Table B.1. Minimum Detectable Impacts for Key Outcomes

Sample Size   Continuous Outcome   Fathers’ Likelihood of        Father Provides Substantial
              (Effect Size)        Employment (Percentage        Financial Support (Percentage
                                   Points), Control = 0.11a      Points), Control = 0.72b

400           0.20                 0.06                          0.09
600           0.16                 0.05                          0.07
800           0.14                 0.04                          0.06

Note: We assume an effective response rate of 80 percent and a 50-50 split of sample members into program and control groups. All calculations assume a 95 percent confidence level, 80 percent power, and a one-tailed test. We assume an R-squared in the impact regression of 0.50.

a. Lippold and Sorensen (2011)

b. Wood et al. (2010)
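
The minimum detectable impacts in Table B.1 follow from the standard formula for a two-group comparison under the assumptions stated in the note. The sketch below is illustrative only; the function name and code are ours rather than part of the study’s analysis plan, but the calculation reproduces the tabled values.

  from scipy.stats import norm

  def minimum_detectable_impact(n_total, control_mean=None, response_rate=0.80,
                                split=0.50, r_squared=0.50, alpha=0.05, power=0.80):
      """Minimum detectable impact for a two-group comparison.

      Returns an effect size (standard deviation units) when control_mean is None,
      or a proportion (percentage points / 100) for a binary outcome otherwise.
      """
      # Effective group sizes after nonresponse and the program/control split
      n_program = n_total * response_rate * split
      n_control = n_total * response_rate * (1 - split)

      # Multiplier for a one-tailed test at the stated confidence level and power
      multiplier = norm.ppf(1 - alpha) + norm.ppf(power)

      # Standard error of the impact estimate, reduced by the regression R-squared
      se = ((1 - r_squared) * (1 / n_program + 1 / n_control)) ** 0.5

      if control_mean is None:
          return multiplier * se
      return multiplier * se * (control_mean * (1 - control_mean)) ** 0.5

  for n in (400, 600, 800):
      print(n,
            round(minimum_detectable_impact(n), 2),                     # 0.20, 0.16, 0.14
            round(minimum_detectable_impact(n, control_mean=0.11), 2),  # 0.06, 0.05, 0.04
            round(minimum_detectable_impact(n, control_mean=0.72), 2))  # 0.09, 0.07, 0.06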


b. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.

c. Periodic Cycles to Reduce Burden

There will be only one cycle of baseline data collection and one cycle of follow-up data collection.

3. Methods to Maximize Response Rates and Data Reliability

With respect to the entire project, we have planned approaches to maximize the response rate at each stage of data collection. We expect response rates of 80 percent for follow-up data collection.

Some eligible program applicants may refuse to consent to participate in the study. We expect that nearly all (95 percent) of those who are asked to participate in the study will consent to do so. Response rates to the baseline data collection for those who consent will be close to 100 percent, as completing this data collection will be part of the intake process.

The following approaches will be used to maximize response rates for the telephone survey and collection of data from diaries: (1) detailed contact information will be collected at baseline; (2) specialized locating services and staff will be used; and (3) experienced and trained interviewers will be used.

At baseline, we propose to collect detailed contact information from fathers for the RF grantees and from both members of couples for the HM grantees. This contact information will include telephone numbers and addresses, as well as email and social media addresses. It will also include contact information for friends and relatives who may know the whereabouts of the sample member at the 12-month follow-up.

Locating sample members who cannot be reached at the telephone number obtained on the baseline form will proceed through the following stages, as necessary:

  • Automated database searches. National locating databases, such as Accurint and the National Change of Address Service, can easily be searched for up-to-date contact information.

  • Searches by specialized in-house staff. Contractor staff will seek new contact information using searchable databases, directory assistance and reverse directories, social networking, and contacts with neighbors and community organizations.

  • Field locating. Trained field locators will begin their searches with known post office box addresses or addresses of friends reported at baseline or discovered through earlier locating efforts. They may approach neighbors at the sample member’s last known address and rely on neighborhood resources such as post offices, churches, schools, recreation centers, past employers, bars, homeless shelters, or community centers as sources of contact information.



Telephone interviewers will be selected based on their past experience and performance on similar studies and on demonstrated skills in communication and refusal conversion. They will be trained on the specific instruments used for the PACT evaluation.

As some nonresponse to follow-up is inevitable, an analysis of nonresponse will be conducted to assess whether the analysis samples are representative of the full sample of fathers and couples. Using the data on characteristics of sample members at baseline, we will conduct statistical tests (chi-squared and t-tests) to gauge whether the program group members who participated in data collection are representative of all the program group members, whether the control group members who participated in data collection are representative of all the control group members, and whether there are differences in the program and control group members who participated in the data collection.
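
As an illustration only, these comparisons could be implemented along the following lines. This is a minimal sketch that assumes the baseline characteristics are held in a pandas DataFrame and that respondents are compared with nonrespondents; the names used are hypothetical, and the function would be run separately within the program group and within the control group.

  import pandas as pd
  from scipy import stats

  def nonresponse_tests(baseline, responded):
      """Test baseline characteristics for differences by response status.

      baseline: DataFrame of baseline characteristics (one row per sample member).
      responded: boolean Series flagging members who completed the follow-up.
      """
      rows = {}
      for col in baseline.columns:
          if baseline[col].nunique() <= 2:
              # Binary characteristic: chi-squared test on the response-by-value table
              table = pd.crosstab(responded, baseline[col])
              stat, p, _, _ = stats.chi2_contingency(table)
          else:
              # Continuous characteristic: two-sample t-test
              stat, p = stats.ttest_ind(baseline.loc[responded, col].dropna(),
                                        baseline.loc[~responded, col].dropna(),
                                        equal_var=False)
          rows[col] = {"statistic": stat, "p_value": p}
      return pd.DataFrame(rows).T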

We will use two approaches to correct for potential nonresponse bias in the estimation of program impacts. First, the regression models described in Part A Section 16 will adjust for observed differences between the characteristics of program and control group respondents. Second, because this regression procedure will not correct for differences between respondents and nonrespondents in each research group, we will construct sample weights so that the weighted baseline characteristics of respondents in the program and control group in each site are similar to the full sample (respondents and nonrespondents). These weights will be constructed using baseline data.
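
One common way to carry out this weighting step, offered here as a sketch rather than as the study’s exact procedure, is to fit a response-propensity model on baseline characteristics and weight each respondent by the inverse of his or her predicted probability of responding; the weights would be constructed separately within each site and research group, and the names below are hypothetical.

  import pandas as pd
  from sklearn.linear_model import LogisticRegression

  def nonresponse_weights(baseline, responded):
      """Inverse-probability-of-response weights estimated from baseline data.

      baseline: DataFrame of numeric baseline characteristics for the full sample.
      responded: boolean Series flagging respondents to the follow-up.
      """
      model = LogisticRegression(max_iter=1000)
      model.fit(baseline, responded)                   # P(respond | baseline characteristics)
      p_respond = model.predict_proba(baseline)[:, 1]
      weights = pd.Series(1.0 / p_respond, index=baseline.index)
      # Weights apply to respondents only; respondents who resemble nonrespondents
      # receive larger weights, so the weighted respondent sample matches the full
      # sample (respondents and nonrespondents) on baseline characteristics.
      return weights[responded]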

With respect to the specific information collection requested through this submission—the field data collection from grantees—we expect a 95 percent response rate to the grantee discussions and to obtain high quality data for three reasons. First, cooperating with the evaluation is a condition of receiving the grant. Second, grantees were told of the importance of the study by funding officials at their first grantee meeting and informed about the potential need for the evaluation contractor to collect additional information. Third, the interviews will be conducted by trained, experienced contractor staff.

4. Tests of Procedures or Methods

Pretests of the data collection instruments will be used: (1) to identify typical instrumentation problems such as question wording and incomplete or inappropriate response categories, and (2) to measure the response burden. Pretests of each instrument will be conducted with nine or fewer respondents. The respondents selected for the pretests will be similar to the sample members. Selected grantees will be asked to suggest program participants who could participate in a pretest before the study intake period begins.

Telephone pre-test interviews will be audio-taped and/or monitored to identify: (1) questions the respondents have difficulty understanding, (2) additional response categories that might be appropriate, and (3) wording changes that might improve the clarity of the question intent. As a result of the pretest, we expect to make minor changes to correct errors and improve the wording of the questions and their sequencing.

The in-home observation instruments will be pretested with a convenience sample selected close to a Mathematica facility. The pretest will be observed by senior project staff. As a result of the pretest, we expect minor changes to procedures to improve the wording of instructions.

The implementation and qualitative study instruments will be pretested in one site and/or with one participant and then revised based on that experience.

A previous version of the discussion guide was pretested with ACF staff. The type and range of items included in the discussion guide are very similar to discussion protocols used successfully in similar studies, such as the Evaluation of Adolescent Pregnancy Prevention Approaches (formerly OMB #0970-0360, now #0990-0382).

5. Individuals Consulted on Statistical Methods

Input to the PACT Discussion Guide was received from staff in the ACF Office of Planning, Research and Evaluation, as well as from staff at Mathematica Policy Research.

Inquiries regarding statistical aspects of the study should be directed to:

Ms. Nancye Campbell

7th Floor West

901 D Street, SW

Washington, DC 20447

Phone: (202) 401 5760

[email protected]

Mr. Seth Chamberlain

7th Floor West

901 D Street, SW

Washington, DC 20447

Phone: (202) 260 2242

[email protected]

Further consultations will be made with statistical experts in the upcoming design phase of the evaluation.




REFERENCES

Cowan, Philip A., Carolyn Pape Cowan, Marsha Kline Pruett, Kyle Pruett, and Jessie Wong. “Promoting Fathers’ Engagement with Children: Preventive Interventions for Low-Income Families.” Journal of Marriage and Family, 71, August 2009, 663-679.

Lippold, Kye, and Elaine Sorensen. "Strengthening Families Through Stronger Fathers: Final Impact Report for the Pilot Employment Programs." The Urban Institute, October 2011.

Wood, Robert G., Sheena McConnell, Quinn Moore, Andrew Clarkwest, and JoAnn Hsueh. “Strengthening Unmarried Parents’ Relationships: The Early Impacts of Building Strong Families.” Report submitted to the U.S. Department of Health and Human Services, Administration for Children and Families. Princeton, NJ: Mathematica Policy Research, May 2010.



1 It is expected that approximately 421 fathers will be invited to participate in the study, per site (as well as mothers associated with those fathers in half the sites), and that 95%, or 400 per site, will accept the opportunity to participate.

