
U.S. Department of Health and Human Services

Administration for Children and Families

Office of Planning, Research, and Evaluation

Aerospace 7th Floor West

901 D Street, SW

Washington, DC 20447

Project Officers: Nancye Campbell and

Seth Chamberlain


Parents and Children Together (PACT) Evaluation (0970-0403):

OMB Supporting Statement for the Healthy Marriage Baseline, Implementation Study, and Qualitative Study Data Collections

Part B: Collection of Information Involving Statistical Methods

February 2013




CONTENTS

1. Respondent Universe and Sampling Methods

2. Procedures for Collecting Information

a. Statistical Methodology, Estimation, and Degree of Accuracy

b. Unusual Problems Requiring Specialized Sampling Procedures

c. Periodic Cycles of Data Collection

3. Methods to Maximize Response Rates and Data Reliability

4. Tests of Procedures or Methods

5. Individuals Consulted on Statistical Methods

REFERENCES



(Please see item A1 for a short description of the impact and implementation/qualitative only evaluations, as well as the data collection instruments already approved and currently requested, which are numbered (1) through (18); throughout this document, numbers in parentheses refer to these instruments.)

1. Respondent Universe and Sampling Methods

Impact Evaluation

Experimental Impact (Healthy Marriage Grantee Evaluation – Responsible Fatherhood Grantee Evaluation discussed in previous ICR). Up to five healthy marriage (HM) grantees will be selected for the impact study. Four selection criteria need to be met for a grantee to be included in the impact evaluation: (a) a study of the grantee must yield policy-relevant information; (b) it must be possible to collect the necessary baseline information, to insert random assignment into the program’s intake procedures, and to prevent the control group from receiving the same or similar services from the grantee that are offered to the program group, although they may seek other services in the community; (c) the program must be able to enroll enough participants to meet sample size requirements; and (d) it must be plausible that the program can lead to impacts that are detectable with the planned sample size.

The sample frame for the newly requested (5) HM baseline survey includes all couples who apply to the selected grantee programs during the study intake period and who satisfy study eligibility criteria. To be eligible for the study, both members of the couple must: (a) be eligible for the program; (b) consent to participate in the study; (c) be 18 years or older; and (d) have a biological or adopted child who is under 18. The child must live with one or both members of the couple. Applicants who do not meet these criteria might still be eligible to receive program services but will not be included in the study.

Implementation – MIS (Healthy Marriage Grantee Evaluation – Responsible Fatherhood Grantee Evaluation discussed in previous ICR). In impact study grantees, the (7) HM study MIS will be used to randomly assign all persons who apply for participation in the study and who are eligible for both the program and the study. The study MIS will be used to collect information on the services received by all program participants in grantees participating in impact studies and potentially also grantees participating in implementation studies.

Implementation – Additional Implementation Data Collection Instruments (Responsible Fatherhood and Healthy Marriage Grantee Evaluation). Up to 30 sites are anticipated. Although the mix may change as we continue to recruit sites, at present burden is calculated for:

  • RF sites: 5 impact, and 10 implementation/qualitative only sites; and

  • HM sites: 5 impact, and 10 implementation/qualitative only sites.

These instruments will be used in both (i) the impact evaluation and (ii) the implementation/qualitative only evaluation.


The sampling approaches for each instrument are:

  1. Semi-structured interview topic guide (for program staff). Respondents will be selected purposively using organizational charts and information on each employee’s role at the host organization and its partner organizations. Purposive selection is appropriate for staff because certain insights and information can come only from individuals with particular roles or knowledge. In selecting staff, we will take into account factors such as each staff member’s (a) position and responsibilities, and (b) amount of daily interaction with participants or prospective participants.

  2. On-line survey (for program staff). All program staff at sites included in the implementation study at the time of survey administration will be asked to complete the survey. We anticipate that in each program 25 staff will be asked to complete the survey.

  3. Telephone interview guide (for program staff at referral organizations). The contractor will conduct a series of telephone interviews with individuals at organizations that either (1) refer fathers or couples to the RF/HM program, or (2) receive referrals from the RF/HM program for support services not available through the RF/HM program. Interviews will occur with up to 5 referral organizations per site. To identify respondents, the contractor will obtain from each RF/HM program a list of organizations that work with the program to provide or receive referrals and contact information for a representative at the referral organization.

  4. On-line Working Alliance Inventory (for program staff and participants). The instrument will be used with all case manager–program participant dyads that enroll in RF/HM programs during a six-month period. We estimate that up to 100 individuals would enroll in this period and be asked to complete a Working Alliance Inventory. The case manager assigned to each individual enrolling in this period would be asked to complete the Working Alliance Inventory about the relationship between the individual and the case manager.

  5. Focus group discussion guide (for program participants). For each focus group, a random sample of 15 participants will be selected from those who meet eligibility requirements. Individuals who have engaged in at least two program activities, or attended a single activity two times beyond the intake interview, will be eligible to participate in the focus groups.

  6. Telephone interview guide (for program dropouts). In each RF/HM program, the contractor will conduct up to 15 telephone calls with individuals identified as “program dropouts.” Program dropouts will be defined as individuals or couples who enrolled in the program but never participated in a group session, or participated only once or twice, and/or received no more than one substantive case management contact. If an RF/HM program has more than 15 fathers or couples identified as program dropouts, the contractor will randomly order the set of program dropouts and attempt to complete interviews with the first 15 on the list, as illustrated in the sketch below.
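
The random-ordering step for selecting dropouts to contact can be implemented as a simple seeded shuffle. The sketch below is illustrative only; the list contents, seed, and constants are placeholders, not part of the study’s actual systems.

```python
import random

# Illustrative sketch: randomly order a site's identified program dropouts
# and attempt interviews with the first 15, as described in item 6 above.
MAX_INTERVIEWS = 15

dropouts = [f"dropout_{i}" for i in range(1, 38)]  # e.g., 37 identified dropouts

rng = random.Random(2013)  # fixed seed so the random ordering is reproducible
rng.shuffle(dropouts)      # random order over the full set of dropouts
interview_list = dropouts[:MAX_INTERVIEWS]  # attempt the first 15 on the list
```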

Qualitative (Responsible Fatherhood Grantee Evaluation). The qualitative study will occur in all RF grantees that are participating in the RF impact study. Qualitative studies may also occur in RF grantees that are participating in implementation/qualitative only studies (including the Hispanic RF sub-study) but not the impact study. Using the (14) in-depth, in-person interview guide, the contractor will interview up to 95 program participants across all the sites. To select these participants, we expect to draw a random sample in each site from a list of program participants who meet a minimum participation threshold – e.g., those who have engaged in at least two program activities, or attended a single activity two times beyond the intake interview, over a four- to five-month period. The (15) check-in call guide will be used to contact the same fathers sampled for the in-person in-depth interviews.

Implementation/Qualitative Only Evaluation: Hispanic RF Sub-study

The Hispanic RF study will occur in up to five RF grantees that serve mostly Hispanic fathers. (These five RF grantees will be a subset of the up to 10 grantees participating in the implementation/qualitative only evaluation but not the impact evaluation). The sampling approach for each instrument is:

(16) Semi-structured interview topic guide (for program staff). Respondents will be selected purposively using organizational charts and information on each employee’s role at the host organization and its partner organizations.

(17) Focus group discussion guide (for program participants). For each focus group, a random sample of 15 participants will be selected from those who meet eligibility requirements. Individuals who are Hispanic and have engaged in at least two program activities, or attended a single activity two times beyond the intake interview, will be eligible.

(18) Questionnaires (for program participants in focus groups). All participants in the Hispanic father focus group will be asked to complete a questionnaire.



2. Procedures for Collecting Information

a. Statistical Methodology, Estimation, and Degree of Accuracy

For the HM impact evaluation, a sample of 400 couples, which we expect to be the site-level sample size, is large enough to detect impacts on several key outcomes.1 As Table B.1 shows, with a single-site sample of 400 couples (200 in the program group and 200 in the control group) with baseline data and 320 couples with anticipated 12-month follow-up data (an 80 percent response rate), we can detect impacts of 0.20 standard deviations or larger on continuous outcomes, such as relationship quality or parenting scales. This sample size is sufficient to detect an impact of the size found on relationship quality (0.31 standard deviations) in the Oklahoma site of Building Strong Families at 15 months.

Table B.1. Minimum Detectable Impacts for Key Outcomes

Sample Size (Couples)    Likelihood of Couple Still Being       Continuous Outcome
(Baseline/Follow-up)     Romantically Involved                  (Effect Size)
                         (Percentage Points); Control = 0.76a

400/320                  8.4                                    0.20
600/480                  6.9                                    0.16
800/640                  5.9                                    0.14
1,800/1,440              4.0                                    0.09

Note: We assume an effective response rate of 80 percent, and a 50-50 split of couples into program and control groups. All calculations assume a 95-percent confidence level, 80-percent power, and a one-tailed test. We assume an R-squared in the impact regression of 0.50.

a. Wood et al. 2012



A sample size of 400 couples per site may not be sufficient to detect impacts on all outcomes or for subgroup analysis at the site level. We anticipate including a set of grantees that offer strong programs and that, combined, will generate at least 1,800 sample couples for the impact evaluation. With a sample of 1,800, we can detect an impact on being romantically involved of 4.0 percentage points, smaller than the 5.1 percentage point impact observed in the Oklahoma site of the Building Strong Families study at 15 months. Past evaluations have demonstrated effect sizes of 0.10 or greater on relationship outcomes (Wood et al. 2012; Hsueh et al. 2012). A sample of 1,800 will position the evaluation to detect impacts of about this size. Furthermore, a sample of 1,800 will permit analyses of subgroups comprising approximately 20 percent of the sample, or about 400 couples.
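
The minimum detectable impacts in Table B.1 can be reproduced with the standard formula for a two-group experimental design under the assumptions stated in the table’s note. The sketch below is a minimal illustration assuming that formula; it is not the contractor’s actual power-calculation code.

```python
import math
from scipy.stats import norm

# Assumptions from the note to Table B.1: a 50-50 program/control split,
# a 95-percent confidence level with a one-tailed test, 80-percent power,
# and an R-squared of 0.50 in the impact regression.
ALPHA = 0.05          # one-tailed test at the 95-percent confidence level
POWER = 0.80
R_SQUARED = 0.50
CONTROL_MEAN = 0.76   # control rate of still being romantically involved

z_factor = norm.ppf(1 - ALPHA) + norm.ppf(POWER)  # ~1.645 + 0.842 = 2.49

def minimum_detectable_impact(n_followup, variance=1.0):
    """Minimum detectable impact for a 50-50 split of n_followup respondents.

    With variance = 1.0 the result is an effect size in standard deviations;
    passing p * (1 - p) gives percentage-point impacts for a binary outcome.
    """
    n_per_group = n_followup / 2
    standard_error = math.sqrt((1 - R_SQUARED) * variance * (2 / n_per_group))
    return z_factor * standard_error

for baseline_n, followup_n in [(400, 320), (600, 480), (800, 640), (1800, 1440)]:
    binary = 100 * minimum_detectable_impact(
        followup_n, CONTROL_MEAN * (1 - CONTROL_MEAN))
    continuous = minimum_detectable_impact(followup_n)
    print(f"{baseline_n}/{followup_n}: {binary:.1f} percentage points, "
          f"effect size {continuous:.2f}")
# Prints 8.4/0.20, 6.9/0.16, 5.9/0.14, and 4.0/0.09, matching Table B.1.
```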

Based on previous experience, we are confident that an 80 percent response rate for the 12-month follow-up data collection can be achieved. The response rate for the 15-month follow-up survey in the Building Strong Families Study was 72 percent for fathers and 83 percent for mothers; at least one member of the couple responded in 87 percent of couples. We expect to achieve a higher response rate for couples than in Building Strong Families for four reasons: (1) we anticipate conducting the follow-up interview at 12 months after random assignment rather than 15 months after random assignment; (2) the baseline survey will be conducted by telephone by a trained interviewer, who can collect more detailed and accurate contact information than the grantee staff members who administered the Building Strong Families baseline survey; (3) the PACT baseline survey will collect both email and social media addresses, which were not collected in the Building Strong Families Study; and (4) a reminder about the study and a request for updated contact information will be texted or emailed to respondents about 6 months after random assignment (included in Appendix K).

b. Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems requiring specialized sampling procedures.

c. Periodic Cycles of Data Collection

Impact Evaluation

For the HM impact study (newly discussed in this ICR), there will be only one cycle of baseline data collection, and we anticipate one cycle of follow-up data collection.

For the implementation study, there will be two rounds of data collection for the (8) semi-structured interview topic guide (for program staff), the (9) on-line survey (for program staff), and the (12) focus group discussion guide (for program participant fathers or couples). The data collection rounds will be spaced twelve to eighteen months apart to allow for program maturation. The remaining implementation study data collections – the (10) telephone interview guide (for program staff at referral organizations), the (11) on-line Working Alliance Inventory (for program staff and participants), and the (13) telephone interview guide (for program dropouts) – will be collected only once.

The qualitative (14) in-person, in-depth interview guide (for program participants) will include three cycles of data collection, spaced approximately one year apart. Check-in calls following the (15) telephone check-in guide will be conducted with all participants twice between the first and second interviews and twice between the second and third interviews.

Implementation/Qualitative Only Evaluation: Hispanic RF Sub-study

The Hispanic RF substudy will include only one round of site visits.

3. Methods to Maximize Response Rates and Data Reliability

Impact Evaluation

Experimental Impact (Healthy Marriage Grantee Evaluation – Responsible Fatherhood Grantee Evaluation discussed in previous ICR). To maximize response rates and data reliability for the (5) HM baseline survey effort, the contractor will take the following steps:

  • Use a straightforward, undemanding instrument. The PACT HM baseline survey is designed to be easy to complete, and its questions use clear, simple language. The average time required for a respondent to complete the survey is estimated at 30 minutes.

  • Administer the survey using computer-assisted telephone interviewing (CATI). Administering the HM baseline survey via CATI will maximize the reliability of the data entered by telephone interviewers through skip-pattern logic and checks for consistency and validity.

  • Use trained interviewers. Respondents will be interviewed by trained members of Mathematica’s survey operations staff, many of whom are experienced in such interviews from their work on previous studies conducted for ACF. All survey staff assigned to the study will participate in both general training (if they are not already trained) and an extensive project-specific training. Interviewers will not work on the study until they have been certified as prepared. The project-specific training will include role playing with scenarios and other techniques to ensure that interviewers are ready to respond effectively to sample members’ questions. Training will also focus on developing skills for securing respondents’ cooperation and for averting and converting refusals.

  • Administer the survey in multiple languages. During telephone contact, interviewers will identify Spanish-speaking respondents and connect them with a bilingual interviewer. When necessary, translators for languages other than Spanish will be used.

  • Provide tokens of appreciation for survey participants. We suggest offering a modest $10 token of appreciation to each HM baseline survey respondent (for a total of $20 per couple) to increase program applicants’ agreement to participate in the study and to reduce attrition for follow-up data collection. (This is discussed in greater detail in Part A.)

We anticipate that at least 95 percent of program applicants will agree to participate in the evaluation (consent) and complete the baseline survey as part of the intake process. This high response rate is based on prior experience asking similar questions with similar populations.

Implementation – MIS (Healthy Marriage Grantee Evaluation – Responsible Fatherhood Grantee Evaluation discussed in previous ICR). To maximize response rates and data reliability for the (7) HM study MIS, we will take these steps:

  • Develop a user-friendly, flexible MIS. The MIS was specifically designed for use by grantee staff. As such, it will be extremely user-friendly and flexible to meet each site’s needs. By providing sites with this system, we standardize the information being collected from each site and improve the reliability of our implementation and impact components.

  • Include data quality checks in the MIS. The MIS will also ensure data reliability by instituting automatic data quality checks. For example, if grantee staff enter odd or unlikely values in a particular field, the system will prompt users to check the value, as illustrated in the sketch following this list. For some fields, the response values will be restricted; for others, grantee staff will be able to override the check.

  • Provide extensive training to grantee staff. To increase data quality, we will provide extensive training to system users prior to initial use. Initial training will be on site; follow-up training will be conducted using web and telephone conferences. Following training, PACT team members will conduct follow-up site visits to ensure compliance with procedures and be available by phone and email to assist users.

  • Monitor data quality. We will also monitor the data entered by grantees and provide feedback to grantees on their data quality. Initially, we will monitor data quality on a weekly basis, tapering that gradually to monthly monitoring as agencies demonstrate their ability to use the system correctly.
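
The following is a minimal sketch of the kind of soft range check described above. The field name, plausible range, and override behavior are illustrative assumptions, not the actual MIS design.

```python
# Illustrative sketch of an MIS data quality check: flag odd or unlikely
# values and either block them or let grantee staff confirm and override.
# The field, range, and behavior are assumptions for illustration only.

PLAUSIBLE_SESSIONS = range(0, 31)  # assumed plausible monthly session counts

def check_sessions_attended(value, confirm_override):
    """Validate a sessions-attended entry before it is saved.

    confirm_override is a callable that prompts the user and returns True
    if staff confirm an out-of-range value is correct.
    """
    if value in PLAUSIBLE_SESSIONS:
        return value
    if confirm_override(f"{value} sessions looks unlikely; is it correct?"):
        return value  # staff confirmed, so accept the overridden value
    raise ValueError(f"Entry rejected: {value} is outside the expected range.")

# Example: 12 is accepted silently; 45 triggers a confirmation prompt.
saved = check_sessions_attended(12, confirm_override=lambda msg: False)
```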

Implementation – Additional Implementation Data Collection Instruments (Responsible Fatherhood and Healthy Marriage Grantee Evaluation).

(8) Semi-structured interviews with program staff. To maximize response rates and data reliability, the contractor will:

  • Conduct interviews during site visits. We anticipate that all grantees selected to participate in the PACT evaluation will agree to participate in these visits. Our past experience indicates that staff participation rates in site visits are typically higher than 90 percent among selected grantees.

  • Identify convenient dates/times for site visits. To help ensure high participation among staff for interviews, the contractor will coordinate with the selected grantees to determine convenient dates for these visits and work with grantees to develop a schedule that accounts for the availability of program staff.

  • Use experienced and trained staff. All contractor staff conducting semi-structured staff interviews will have prior experience conducting semi-structured interviews and will participate in training to maximize data reliability.



(9) On-line survey (for program staff). To maximize response rates, the contractor will:

  • Obtain contact information. The contractor will work with grantees to identify a contact who can provide staff information and, if needed, assist with encouraging staff to complete the survey. From this contact, the contractor will obtain a roster listing all staff with a management or direct service role for the grantee, along with individual contact information, including email address and telephone number.

  • Contact staff. Each staff member will receive an email invitation, and up to two reminder contacts, requesting that they complete the survey (included in Appendix K). Each email communication will include a unique username and password to ensure that a staff member’s responses remain private. These email messages will be sent approximately one week apart.

  • Follow up with staff. Staff members who do not complete the survey after three email contacts will receive a phone call from a member of the contractor’s team to ask whether the email messages were received and to request that the individual complete the survey. Also, after three email attempts, the contractor will ask the grantee contact to check whether staff have received the email invitations and to encourage non-responding staff to complete the survey. The combination of direct email contact with respondents and, as needed, contractor and program staff contact has resulted in high response rates on a prior staff survey conducted by the contractor for the Cross-Site Evaluation of Evidence-Based Home Visiting Programs.

(10) Telephone interview guide (for program staff at referral organizations). To maximize the number of responses by representatives of referral organizations, we will:

  • Train interviewers. All interviewers conducting telephone calls will be experienced telephone interviewers and complete a training specific to this study.

  • Identify referral organizations. The contractor will ask each RF/HM program participating in the implementation study to identify its referral organizations and a representative at each organization, and provide email and telephone contact information for the representative.

  • Contact referral organizations. Using these lists, we will contact the identified representatives through direct calls in which we introduce the purpose of the call and ask the individual to complete the interview at that time or to schedule an alternative time. If the individual asks to schedule the interview for an alternative time, interviewers will work with the representative to identify a suitable time, even during non-standard work hours.

  • Follow up with non-responders. Interviewers will make multiple attempts to reach identified representatives by telephone. If telephone outreach does not work, interviewers will contact representatives by email to ask them to schedule a time for the interview (included in Appendix K).

(11) On-line Working Alliance Inventory (for program staff and participants). Dyads composed of a program participant and the assigned case manager will each be asked to complete the 12-item Working Alliance Inventory. To maximize response to the Working Alliance Inventory, we will:

  • Keep the survey brief and clearly identify the reference individual. The Working Alliance Inventory is a 12-item survey that uses a common 7-point Likert-type scale for each question, so program participants and case managers can complete it quickly. Also, having survey items reference the first name of either the program participant or the case manager, as appropriate for each item, ensures that every item is clear to the respondent.

  • Use a web-based application. Administering the survey through a web-based application will allow both program participants and case managers to access the survey from any computer with an internet connection. Grantees will be asked to allow program participants to use on-site computers to access the survey, if the participant prefers, to minimize non-response due to lack of an accessible computer.

  • Provide unique usernames and passwords to each respondent. Each program participant and case manager will receive a unique username and password when completing the survey to protect the privacy of their responses. While program participants and case managers will be asked to complete the survey at about the same point in time, they will be encouraged not to be together when completing it, to encourage honest responses to the survey items.

  • Provide grantees with tracking tools. Grantees will ask the program participants to complete the survey during an office visit about six months after program enrollment. The grantee will be provided a tracking tool that identifies when the dyad reaches this milestone, so the program knows to invite the program participant to an office visit. The tool will also monitor survey completion, so program and contractor staff can work together to ensure both members of the dyad complete the survey.

(12) Focus groups with participants. To maximize response rates and data reliability in the focus groups, the contractor will:

  • Use multiple modes and reminders to recruit participants. Grantee staff will be asked to provide the selected participants a recruitment flyer for the focus group (included in Appendix K). In addition, an email/letter will be sent to each selected participant (also included in Appendix K). Reminder calls will be made at least once before each focus group is held. To maximize response rates, we will offer a $25 gift in appreciation of each participant’s time.

  • Conduct focus groups on site at a time convenient for participants. All focus groups will be held at the program location during a scheduled site visit. We will coordinate the schedule for each focus group so that it is convenient for participants to attend – for example, just before or after a program group session, or during an evening or weekend.

  • Use experienced focus group moderators. All contractor staff moderating focus groups will have prior experience with focus group moderation and participate in training to increase data reliability.

(13) Telephone interviews with program dropouts. To maximize response rates in conducting the brief telephone interviews with program dropouts, the contractor will:

  • Attempt contact during times when respondents are likely to be home. To accommodate varied schedules, the contractor will make calls to the selected respondents during evening and weekend hours as well as weekdays. The contractor will offer respondents a $15 gift in appreciation of their participation.

  • Use multiple methods to contact respondents. Initial attempts to contact the selected respondents will be by telephone, but the contractor may also send an email/letter requesting the interview (included in Appendix K).

  • Monitor staff. Contractor staff will monitor the telephone interviews with program dropouts to ensure that all interviewers are following the interview guide. All interviewers conducting these calls will participate in training to enhance data reliability.



Qualitative (Responsible Fatherhood Grantee Evaluation). To maximize data reliability and response rates for the (14) in-person, in-depth interviews and (15) telephone check-ins, we will take the following steps:

  • Use multiple methods for recruiting and scheduling interviews. An email/letter will be sent to all randomly selected participants to inform them of the in-person interview (included in Appendix K). Subsequently, trained staff will contact participants to schedule the interview one week in advance. The staff member who will conduct the interview will contact the participant four hours before the interview for confirmation. Trained field staff will be used to locate and recruit participants who were not reached by telephone or who do not show up for the interview at their scheduled time.

  • Schedule interviews at a convenient time and place for respondents. All interviews will be conducted by a single interviewer at a place and time of the father’s choosing, ideally in his own neighborhood. Conducting interviews in public places in participants’ own neighborhoods often enhances their comfort with the interview process, which may improve the quality of their responses, particularly on sensitive topics.

  • Use trained interviewers. Qualitative interviewers will receive intensive and comprehensive training in how to conduct in-depth interviews. During training, interviewers will learn about the goals of the qualitative study, its relation to the larger evaluation, and the research questions that the interviews are intended to address. They will also be trained on the interview guide and on how to probe skillfully with follow-up questions. The training will include role playing and immediate feedback from trainers. A key part of training will focus on how interviewers can present themselves and phrase their questions and probes in culturally sensitive ways. Interviewers will be instructed to avoid both asking leading questions and expressing approval or disapproval, while still maintaining rapport. Trainers will also routinely review randomly selected audio recordings for each interviewer to ensure quality and adherence to interviewing techniques and the in-depth interview guide, and will attempt to observe each interviewer at least once.

  • Conduct all interview waves with the same interviewer. All three waves of interviews will be conducted by a single interviewer for each participant. This continuity will maximize rapport between the interviewer and the respondent, which increases data quality. Allowing respondents to build a relationship with a single interviewer across waves is also expected to increase response rates in the second and third waves.

  • Use open-ended questions with a topic guide. There is no script that interviewers must follow verbatim; instead, they will use an interview guide to help ensure that they systematically cover each topic of interest while preserving the freedom for the in-depth interviews to be primarily respondent-led. The absence of a structured script helps develop rapport between the interviewer and the respondent, which increases data quality.

  • Provide a gift as appreciation for respondent’s time. To maximize response rates and thank participants for their time, the contractor will provide $60 for each completed 2-hour interview. Similar amounts have been used to encourage participation in qualitative interviews in ongoing and past studies of a similar population.2



Implementation/Qualitative Only Evaluation: Hispanic RF Sub-study

The Hispanic RF Sub-study will use instruments for the (16) semi-structured interviews and (17) focus groups similar to the Additional Implementation Data Collection Instruments discussed above. Thus, the methods to maximize response rates and reliability will be the same for these instruments in the Hispanic RF sub-study as in the implementation study.

(18) Questionnaire for focus group participants in the Hispanic RF substudy. To maximize response rates and ensure data reliability, the contractor will:

  • Administer the survey in person directly after the focus group. Participants will be provided with paper and pencil copies of the questionnaire and asked to complete it immediately following the focus group.

  • Provide Spanish language translations of the questionnaire. Some respondents may be more comfortable responding to a written questionnaire in Spanish rather than English; respondents can choose to use either version.

  • Read questions aloud and provide assistance to address any literacy issues. A bilingual focus group moderator will read each question aloud in English and Spanish as respondents complete the questionnaire. A second moderator will be on hand to provide additional individual assistance for any respondents who may have difficulty with literacy.

4. Tests of Procedures or Methods

Impact Evaluation

Experimental Impact (Healthy Marriage Grantee Evaluation – Responsible Fatherhood Grantee Evaluation discussed in previous ICR). Pretests of the (5) HM baseline survey have been conducted with six people. The respondents selected for the telephone pretests were as similar to likely actual sample members as possible (we recruited pretest participants by contacting similar programs). The pretest interviews were conducted by telephone and audio-taped or monitored to identify potential issues. As a result of the telephone pretests, we made changes to the survey instrument to improve the wording of the questions and their sequencing.

Implementation – Additional Implementation Data Collection Instruments (Responsible Fatherhood and Healthy Marriage Grantee Evaluation). Several implementation study instruments build on existing questions and previous experience from similar studies completed by the implementation study team. Consequently, pretesting of previously used instruments or measures has not been planned. Specifically, the (8) and (16) semi-structured interview topic guides (for program staff), the (10) telephone interview guide (for program staff at referral organizations), the (12) and (17) focus group discussion guides (for program participants), and the (13) telephone interview guide (for program dropouts) build on guides used in similar studies such as Building Strong Families and the Cross-Site Evaluation of Evidence-Based Home Visiting Programs. The (11) on-line Working Alliance Inventory (for program staff and participants) also will not be pretested, as this instrument was used on the Cross-Site Evaluation of Evidence-Based Home Visiting Programs.

The (9) on-line survey (for program staff) was pretested on six staff members of an RF program in Texas. The staff members completed a paper-and-pencil version of the survey. Five of the six staff then participated in a debrief interview to discuss their experiences completing the survey. As a result of the pretests, some questions were revised, some response categories were refined, and one question was removed because of redundancy.

Qualitative (Responsible Fatherhood Grantee Evaluation). Because the (14) in-person, in-depth interview guide (for program participants) and the (15) telephone check-in protocol that will guide RF participant interviews build on previous experience from similar studies completed by the qualitative study team, drawing on topics explored in prior studies, they will not be pretested.

Implementation/Qualitative Only Evaluation: Hispanic RF Sub-study

For a discussion of tests of procedures or methods for the (16) semi-structured interview topic guide (for program staff) and the (17) focus group discussion guide (for program participants), please see Implementation – Additional Implementation Data Collection Instruments (Responsible Fatherhood and Healthy Marriage Grantee Evaluation) earlier in this item (item B4).

The (18) questionnaire for focus group participants in the Hispanic sub-study includes questions about demographic information and two well-established scales. The Mexican American Cultural Values scale has shown good psychometric properties (Knight et al. 2010), and the Acculturation Rating Scale for Mexican Americans Version II has demonstrated validity and reliability in prior studies (Barrera et al. 2012; Gonzalez et al. 2001; Berry 1997; Fischer and Corcoran 2007). For these reasons, pretesting of these measures has not been planned.



5. Individuals Consulted on Statistical Methods

Input on statistical methods was received from staff in the ACF Office of Planning, Research, and Evaluation, as well as from staff at Mathematica Policy Research and a limited number of staff external to Mathematica.

Ms. Nancye Campbell

Office of Planning, Research, and Evaluation

Administration for Children and Families

7th Floor West

901 D Street, SW

Washington, DC 20447


Mr. Seth Chamberlain

Office of Planning, Research, and Evaluation

Administration for Children and Families

7th Floor West

901 D Street, SW

Washington, DC 20447


Dr. Sheena McConnell

Mathematica Policy Research

1100 1st Street, NE, 12th floor

Washington, DC 20002-4221


Dr. Robert Wood

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543


Dr. Jane Fortson

Mathematica Policy Research

505 14th Street

Suite 800

Oakland, CA 94612


Dr. Kathryn Edin

John F. Kennedy School of Government
Mailbox 103
79 JFK Street
Cambridge, MA 02138


Dr. Alexandra Killewald
Department of Sociology
Harvard University
33 Kirkland St.
Cambridge, MA 02138


Ms. M. Robin Dion

Mathematica Policy Research

1100 1st Street, NE, 12th floor

Washington, DC 20002-4221


Ms. Heather Zaveri

Mathematica Policy Research

1100 1st Street, NE, 12th floor

Washington, DC 20002-4221


Dr. Amber Tomas

Mathematica Policy Research

1100 1st Street, NE, 12th floor

Washington, DC 20002-4221

REFERENCES

Barrera, M., Jr., D. Toobert, L. Strycker, and D. Osuna. “Effects of Acculturation on a Culturally Adapted Diabetes Intervention for Latinas.” Health Psychology, vol. 31, no. 1, 2012, pp. 51-54.

Berry, J.W. “Immigration, Acculturation, and Adaptation.” Applied Psychology: An International Review, vol. 46, no. 1, 1997, pp. 5-33.

Fischer, J., and K. Corcoran. Measures for Clinical Practice and Research: A Sourcebook, Fourth Edition, Volume 2: Adults. New York: Oxford University Press, 2007.

Gonzalez, H.M., M.N. Haan, and L. Hinton. “Acculturation and the Prevalence of Depression in Older Mexican Americans: Baseline Results of the Sacramento Area Latino Study on Aging.” Journal of the American Geriatrics Society, vol. 49, no. 7, 2001, pp. 948-953.

Hsueh, JoAnn, Desiree Principe Alderson, Erika Lundquist, Charles Michalopoulos, Daniel Gubits, David Fein, and Virginia Knox. “The Supporting Healthy Marriage Evaluation: Early Impacts on Low-Income Families.” OPRE Report 2012-11. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2012.

Knight, G.P., M.W. Roosa, and A.J. Umaña-Taylor, eds. Studying Ethnic Minority and Economically Disadvantaged Populations. Washington, DC: American Psychological Association, 2009.

Wood, Robert G., Sheena McConnell, Quinn Moore, Andrew Clarkwest, and JoAnn Hsueh. “Strengthening Unmarried Parents’ Relationships: The Early Impacts of Building Strong Families.” Princeton, NJ: Mathematica Policy Research, May 2010.

1 Note that a sample of 400 couples means 800 individuals; multiplied across 5 sites, that equals a total burden of 4,000 baseline surveys, which we have requested in A12.

2 For example, in a study for the U.S. Department of Housing and Urban Development (HUD) entitled Moving to Opportunity, a $50 gift was provided for a 60-minute interview with the household head (OMB control number 2528-0161).


