Supporting Justification for OMB Clearance of Evaluation of Pregnancy Prevention Approaches

Part B: Statistical Methods for Follow-up Data Collection


February 2012


Contract Number:

HHSP23320082911YC

Mathematica Reference Number:

06549.303

Submitted to:

Office of Adolescent Health

Division of Policy, Planning & Communication

Department of Health & Human Services

1101 Wootton Parkway, Suite 700

Rockville, MD 20852


Project Officer: Amy Farb

Submitted by:

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Alan Hershey







CONTENTS

Abstract

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and Deal with Nonresponse

B4. Tests of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data







TABLES

B.1. Instruments and Estimated Burden Included in this Submission

B.2. Expected Sample Sizes

B.3. PPA Evaluation Sites: Target Populations and Follow-Up Schedules

B.4. Projected Distribution of Follow-Up Completions by Mode, by PPA Site

B.5. Incentives for Data Collection, by Site


ATTACHMENTS

ATTACHMENT A: Chicago Public Schools: Follow-up Instrument

ATTACHMENT B: OhioHealth: Follow-up Instrument

ATTACHMENT C: Children’s Hospital Los Angeles (CHLA): Follow-up Instrument

ATTACHMENT D: Oklahoma Institute for Child Advocacy (OICA): Follow-up Instrument

ATTACHMENT E: EngenderHealth: Follow-up Instrument

ATTACHMENT F: Live the Life Ministries (LtL): Follow-up Instrument

ATTACHMENT G: Princeton Center for Leadership Training (PCLT): Follow-up Instrument

ATTACHMENT H: Crosswalks of Follow-up and Baseline Items for Each Site

ATTACHMENT I: 60-day Federal Register Notice




Abstract

The U.S. Department of Health and Human Services (HHS) is conducting the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA), an eight-year demonstration designed to study the effectiveness of promising policy-relevant strategies to reduce teen pregnancy. The study was designed to include up to eight evaluation sites, and at this point it appears that there will be seven sites:

  • one site – Chicago Public Schools, implementing the Health Teacher curriculum – has been recruited, and baseline and follow-up surveys have been implemented; and

  • six federally funded grantees have been recruited, and a baseline survey has been implemented in three of those sites.

Approval for outreach discussions with stakeholders, experts in the field, and program developers was received on November 24, 2008 (OMB Control No. 0970-0360). Approval for the baseline survey data collection and the collection of youth participant records was received on July 26, 2010 (OMB Control No. 0970-0360). Emergency clearance for site-specific variants of the baseline survey questionnaire was received on August 22, 2011 (OMB Control No. 0970-0360). Per the conditions of the emergency approval, a request for standard clearance of the site-specific baseline instruments has been submitted and is currently under review.

Similar to the baseline survey effort, a large group of federal staff has collaborated to modify a previously drafted PPA follow-up instrument into a “concordance follow-up instrument” suitable for all HHS pregnancy prevention evaluations, including but not limited to PPA. HHS is trying to maximize consistency across evaluations of federal pregnancy prevention grant programs. In 2010 and 2011, the Administration for Children and Families (ACF) and the Office of Adolescent Health (OAH), in coordination with other HHS offices overseeing pregnancy prevention evaluation, collaborated to consider revisions to the previously drafted PPA instrument. Approval for the first follow-up data collection, the follow-up “concordance” instrument (to be used in Chicago) and one site-specific follow-up questionnaire was received on September 27, 2011 (OMB Control No. 0970-0360). We now seek OMB approval for the remaining site-specific variants of the follow-up instrument1 and the follow-up data collection, including all rounds of follow-up using the instruments submitted for review.

As in the case of baseline data collection, site-specific variation in follow-up data collection instruments is planned because of the differences among the seven PPA sites. As PPA sites were recruited, we found that variations in their target populations and program models make it essential to tailor data collection, at both baseline and follow-up, to analytical priorities in each site. Developing those site-specific instruments involves working closely with the six sites that are federal pregnancy prevention grantees, and with the local evaluators they have engaged as a condition of their grants.

The collaboration with the six grantee sites also involves specifying the exact schedule for follow-up data collection. Across these sites, there is variation in the length of the program being tested, the age of the target population, and the key outcomes on which impacts are of greatest interest, and thus in the most suitable schedule for follow-up surveys. The PPA technical work group (TWG) provided important guidance for the timing of two follow-up surveys: a first follow-up no earlier than 3-6 months after program completion, and a second no later than 18-24 months after program completion. This guidance has been followed quite closely, with well-justified exceptions. In two cases, negotiation with local evaluators led to plans for three follow-ups, with the third follow-up inserted as an early survey. In one case, the final follow-up timing deviates from the TWG guidance because the program lasts 18 months; follow-ups are scheduled at 6, 18, and 30 months after enrollment, which means there will be a follow-up during the intervention, immediately after it ends, and 12 months after it ends.

The process of working out these instruments and survey schedules has now been completed site by site, and the result determines when the first follow-up survey must be administered in each site, and thus determines for which sites approval of follow-up data collection is most urgent. A previous submission focused on first follow-up data collection in the two earliest sites: Chicago and Oklahoma (approval received September 27, 2011; OMB Control No. 0970-0360). The current submission presents follow-up questionnaires and estimated burden for the remaining sites and rounds of follow-up data collection.2 Table B.1 provides a summary of instruments and estimated burden included in this submission.

Table B.1. Instruments and Estimated Burden Included in this Submission

Site | Follow-up Instrument | Burden Estimate: FU1 | Burden Estimate: FU2 | Burden Estimate: Additional Early Follow-Up
Chicago Public Schools | Previously approved, with minor modifications | | ✓ |
OhioHealth | New submission | ✓ | ✓ | ✓
CHLA | New submission | ✓ | ✓ |
Oklahoma Institute for Child Advocacy (OICA) | Previously approved, with no changes | | ✓ |
EngenderHealth | New submission | ✓ | ✓ |
Live the Life (LtL) | New submission | ✓ | ✓ |
Princeton Center for Leadership Training (PCLT) | New submission | ✓ | ✓ |

B1. Respondent Universe and Sampling Methods

In the PPA evaluation, HHS has identified seven study sites that will implement different pregnancy prevention approaches. In three of these sites, the programs to be tested will be school-based, operated in high schools or middle schools. In the other sites, the programs to be tested will be operated in community-based organizations (CBOs). The study will use a sample of approximately 9,000 teens across all sites. In each site, youth will be assigned to a treatment group that receives the program of interest or to a control group that does not. In five sites, random assignment will generally be done at the cluster level (that is, the school or CBO) to ensure that the behavior of control group youth is not affected, or “contaminated,” by interaction with treatment group youth. In the other two sites, random assignment will be done at the individual level, because risks of contamination are low.

A baseline survey will be conducted with both the program and control groups before youth in the program group are exposed to the pregnancy prevention programs. In most sites, pursuant to the TWG guidance, the first follow-up survey will be conducted no sooner than 3-6 months after the end of the scheduled program intervention for each sample member. The final follow-up survey will be conducted with participating youth no later than 18-24 months after the scheduled end of the program. The exact timing of the follow-up surveys has been determined in each site, taking into account the length of the program, the age of the target population, and the priority outcomes of interest.3

The universe of potential respondents will vary across study sites, depending on the type of program in place at each site. Hence, we first describe the possible types of program structures and the corresponding study design.

Of the seven sites in the evaluation, five will involve random assignment at the cluster level (schools or other groupings), and two will involve random assignment at the individual level. Random assignment will occur at the time of sample enrollment (after the baseline survey). At follow-up, we plan to target all youth who were randomly assigned at baseline to the program or control group. In schools with appreciably more students than the target sample size, however, we may subsample students for follow-up after the baseline survey.
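To illustrate the cluster-level design, the following is a minimal sketch of random assignment at the school level. The school names, the 1:1 allocation, and the fixed seed are hypothetical details for illustration only; in practice, assignment may be blocked or stratified within sites.

```python
import random

def assign_clusters(cluster_ids, seed=2012):
    """Randomly split clusters (schools or CBOs) into treatment and control.

    A simple 1:1 allocation; a production procedure would typically
    stratify or block clusters on baseline characteristics first.
    """
    rng = random.Random(seed)      # fixed seed makes the assignment reproducible
    shuffled = list(cluster_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    treatment = set(shuffled[:half])
    return {c: ("treatment" if c in treatment else "control") for c in cluster_ids}

# Hypothetical school-based site with 16 participating schools
schools = ["school_%02d" % i for i in range(1, 17)]
print(assign_clusters(schools))
```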

Table B.2 summarizes our sample size estimates for all seven evaluation sites (burden estimates for the sites are presented in Part A12). Based on our plans to include five sites with cluster random assignment and two with individual-level random assignment, we expect the total sample size will be approximately 9,000.

Table B.2. Expected Sample Sizes

Type of Program | Number of Sites | Average Sample Size Per Site | Total Sample Size by Program Type
Required in-school | 3 | 1,600 | 4,800
Community-based | 2 | 1,100 | 2,200
Clinic/service-based (individual) | 2 | 1,000 | 2,000
Total | 7 | | 9,000


We expect to achieve a response rate of 85 percent on the first follow-up survey and 80 percent on the second follow-up survey.4 These rates are comparable to the response rates achieved in the study of Title V abstinence education programs conducted by Mathematica Policy Research.5 Reasons for projecting these response rates are explained in Section B3.

The proposed sample sizes and response rates for each site provide sufficient statistical power to detect policy-relevant effect sizes of 0.3 or lower with a high (80 percent or better) probability. This conclusion is based on our plan to calculate impact estimates and hypothesis tests separately for each site, with statistical significance assessed at the 95 percent confidence level. We assume regression-adjusted impact estimates, with baseline covariates explaining up to 30 percent of the variance in observed outcomes. For the five sites using a cluster random assignment design, we assume an intra-class correlation (ICC) of up to 0.035. The projected sample sizes for the seven sites sum to 8,988, which we have rounded in Table B.2 to 9,000.
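To show how these design parameters combine, the sketch below computes the minimum detectable effect size (MDE) using the standard normal-approximation formula for a cluster-randomized design. The R-squared of 0.30, ICC of 0.035, 80 percent power, and two-sided 5 percent test come from the text above; the cluster counts and cluster sizes are hypothetical, since the submission does not specify them.

```python
from math import sqrt
from statistics import NormalDist

def mde_cluster(n_clusters, cluster_size, icc=0.035, r2=0.30,
                alpha=0.05, power=0.80, p_treat=0.5):
    """Minimum detectable effect size, in standard deviation units, for a
    cluster-randomized design (two-sided test, normal approximation)."""
    z = NormalDist()
    multiplier = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)  # about 2.80
    # Variance of the standardized impact estimate, with baseline
    # covariates explaining r2 of the outcome variance:
    variance = ((1 - r2) * (icc + (1 - icc) / cluster_size)
                / (p_treat * (1 - p_treat) * n_clusters))
    return multiplier * sqrt(variance)

# Hypothetical school-based site: 16 clusters of 100 youth (N = 1,600)
print(round(mde_cluster(n_clusters=16, cluster_size=100), 2))            # 0.25

# Individual random assignment is the special case of clusters of size 1
print(round(mde_cluster(n_clusters=1000, cluster_size=1, icc=0.0), 2))   # 0.15
```

Under these illustrative assumptions, the cluster-design MDE is about 0.25 standard deviations and the individual-design MDE about 0.15, both below the 0.3 target.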

B2. Procedures for Collection of Information

HHS will collect information in the follow-up surveys on youth behaviors from approximately 9,000 youth across seven sites (see Table B.2 for distribution). Whenever possible, the assignment to treatment (receipt of one of the approaches to reducing teen pregnancy) or control groups (not receiving such treatment) takes place at the site, school, or classroom level in order to minimize contamination between control and treatment group youth. When there are more youth at a site than anticipated, youth may be subsampled at baseline, or in some cases the youth completing the baseline survey may be randomly sampled for follow-up if their numbers substantially exceed sample requirements.

Consent for the duration of the study will be collected prior to baseline data collection. Data collection will occur only if informed consent is provided by a parent or legal guardian if the respondent is a minor, or by respondents themselves if they are 18 or older. No data will be collected from those without consent. We will attempt to collect baseline data for any consented sample member. For those who are absent during the baseline administration, we will attempt to collect their data through a make-up administration. Even with these efforts, we anticipate that a small number of consented youth will not complete the baseline survey. We will collect follow-up data for any sample member with consent, regardless of whether they completed the baseline survey. Participant assent is obtained prior to the administration of each of the surveys.

In three sites, parental consent is not required for all participants. In Oklahoma, some of the youth are under the legal guardianship of the state foster care system, so a caseworker, lawyer, or other identified legal representative will provide consent for these youth to participate. In OhioHealth, some of the youth are 18 and older; parental consent is not required for these participants, so we will obtain active consent directly from these sample members. In CHLA, the IRB has determined that parental consent is not required, so active consent will be collected directly from the CHLA sample members. Parental consent was received for 73 percent of eligible youth in Chicago, and 93 percent of consented youth responded to the baseline survey. Consent and baseline data collection are currently underway in CHLA, Oklahoma, OhioHealth, and Teen PEP.

The general plan for follow-up data collection is to conduct two follow-up surveys in each site. In most sites, a first follow-up will be administered no earlier than 3-6 months after the end of program participation for the program group, and the second follow-up no later than 18-24 months after the end of the program, as recommended by the PPA Technical Work Group. First follow-up data collection has been completed in Chicago, with a response rate of 94 percent. The exact timing for each site (see Table B.3) takes into account the age of the sample population, the length of the intervention, and the period over which detectable impacts on the key priority outcomes could be expected to emerge.

In two sites, an additional early follow-up has been scheduled. In the Oklahoma (OICA) site, an immediate posttest will allow analysis of immediate effects on knowledge and attitudes, using the progression of three follow-up data points to model the role of intermediate outcomes on long-term impacts. In the OhioHealth site, where intervention effects on the short-term contraceptive practice of teen mothers after the birth of their child is an important goal, the plan includes a follow-up six months after enrollment, while the program sample is still active in the program. The addition of this early follow-up should have no effect on the quality of data collected at later follow-ups. The interval between the early follow-up and the next is six months or more; that interval, and even shorter ones, are commonly used in teen pregnancy prevention studies with multiple follow-up surveys. We will work in each of these sites to ensure that the same procedures are used in the early follow-up as in later ones, and to maintain respondent commitment to sustained participation in the study.

Table B.3. PPA Evaluation Sites: Target Populations and Follow-Up Schedules

Site (Grantee) | Target Population/Enrollment Point | Length of Intervention (elapsed time) | Timing of Early Follow-up (from end of program) | Timing of Final Follow-up (from end of program)
Chicago Public Schools (CPS) | 7th grade students/start of 7th grade | 16 weeks (fall 2010-spring 2011) | 5-6 months | 13-14 months
EngenderHealth | 14-16 year old participants in summer youth employment program/start of program | 5 days | 6 months | 18 months
Princeton Center for Leadership Training (PCLT) | 9th grade students/start of 9th grade | 5-16 weeks (depending on school schedule) | 6-7 months | 18-19 months
OhioHealth Research and Innovation Institute | Pregnant/parenting females 15-19 years old/recruited after delivery or during prenatal care | 18 months | Early FU during program (6 months after enrollment); FU at end of intervention (18 months after enrollment) | 12 months (30 months after enrollment)
Oklahoma Institute for Child Advocacy (OICA) | Youth in foster care group homes 15-19 years old/resident at time of study recruitment | 10 weeks | At program completion; 6 months | 12 months
Children’s Hospital Los Angeles (CHLA) | Teen mothers less than 20, with child less than 6 months/recruited through clinics and other programs | 12 weeks | 9 months | 21 months
Live the Life (LtL) | 7th grade students/start of 7th grade | 2 school years (8-day dose each year) | 3-5 months (spring 8th grade) | 15-17 months (spring 9th grade)

In all sites, to varying degrees, locating some sample members for follow-up will be required. Sample members in school-based sites will, at a minimum, have changed classrooms since baseline, and some will have changed schools. In other sites, sample members may have moved. Prior to the follow-up survey data collections, the contractor will work with the site to locate sample members in their new classrooms or schools, or obtain any available updates to contact information. Additionally, information will be collected at various points throughout the study through emails, phone calls, and postcards asking sample members to provide updated contact information. Cases that are particularly difficult to find will be sent to the contractor’s locating staff.

Where the program enrolls students and is delivered in schools, the follow-up data collection will begin with group administration. Contractor staff will work with sites to determine a date and exact venues for conducting group survey administration. Contractor staff will arrive at the site on the survey day, two staff members per survey room. In the survey room(s), contractor staff will use the survey roster to take attendance, determine whether any youth are missing, and exclude any youth not on the survey roster. Any sample members who have moved out of the area will be given the option of completing the follow-up survey via the web or over the telephone.

Contractor staff will hand out pre-identified survey packets to the youth whose names are on the packets and obtain youth assent. Each packet will consist of the PPA paper-and-pencil interview (PAPI) questionnaire and a sealable, blank survey return envelope. The questionnaire and outside envelope will have a label with a unique ID number (no personally identifying information will appear on the questionnaire or return envelope). All youth will complete Questionnaire Part A, which asks for background information and concludes with a single screening question about sexual experience. Youth with sexual experience will complete Part B1, and those without will complete Part B2. Two contractor staff members will monitor each survey room. Upon completion, youth will place questionnaire Parts A, B1, and B2 (both the used and the unused sections) in the return envelope, seal it, and return it to a contractor staff member. Staff will send the completed questionnaires to the contractor’s office, where the questionnaires will be receipted and checked for completeness and scannability. All questionnaires that pass the check will be sent to a scanning vendor to be scanned. All scanned data will be electronically transmitted to the contractor.

Telephone and web-based administration of the follow-up survey will also be used. In sites with group administration, sample members who do not complete the survey in an initial group session or a make-up session will be given the option to complete it by phone or web. In other sites, telephone and web will be the primary modes of data collection, because the sample will be dispersed and assembling groups will not be feasible. In both situations, contractor staff will contact sample members and provide a PIN/password for web completion, or interview them by telephone using the PAPI instrument. After such completions, the same receipting and scanning processes as for PAPI completions will take place. Web instruments will be prepared after OMB approval of the basic hard-copy questionnaires and provided to OMB.

In one site (CHLA), follow-up will be conducted in person with individual respondents. Data collectors will provide laptop computers equipped for audio computer-assisted self-interviewing (ACASI), allowing respondents to self-administer the survey.

Our current projections of completion rates by mode for the first and second follow-ups are shown in Table B.4. For completeness, the table includes the two sites for which first follow-up instruments were previously submitted (and approved) and the other sites for which instruments are currently submitted.

As with any survey that uses different modes of administration, the answers by some respondents to certain questions may differ depending on the mode in which they complete the survey. Any such “mode effects” should not affect the validity of the impact estimates because we anticipate that equal proportions of treatment and control group members will complete the different modes. Indeed, as a means to assure that this is the case, we will follow identical plans for administering the different modes of the survey between the two experimental groups, including using identical methods for locating respondents and more generally maximizing survey response rates (discussed below in Section B.3).

Table B.4. Projected Distribution of Follow-Up Completions by Mode, by PPA Site

Site (Grantee) | Site Type | Target Population/Enrollment Point | Projected Mode of Completion, FU1 | Projected Mode of Completion, FU2
Chicago Public Schools (CPS) | School-Based | 7th grade students/start of 7th grade | 95% group, 0% web, 5% phone | 85% group, 5% web, 10% phone
EngenderHealth | CBO-Based (Individual) | 14-16 year old participants in summer youth employment program/start of program | 0% group, 80% web, 20% phone | 0% group, 80% web, 20% phone
Princeton Center for Leadership Training (PCLT) | School-Based | 9th grade students/start of 9th grade | 90% group, 5% web, 5% phone | 80% group, 10% web, 10% phone
OhioHealth Research and Innovation Institute | Clinic-Based | Pregnant/parenting females 15-19 years old/recruited after delivery or during prenatal care | 0% group, 0% web, 100% phone | 0% group, 0% web, 100% phone
Oklahoma Institute for Child Advocacy (OICA) | CBO (Group-Based) | Youth in foster care group homes 15-19 years old/resident at time of study recruitment | 50% group, 25% web, 25% phone | 40% group, 30% web, 30% phone
Children’s Hospital Los Angeles (CHLA) | Clinic-Based | Teen mothers younger than 20, with child less than 6 months/recruited through clinics and other programs | 100% in-person (ACASI) | 100% in-person (ACASI)
Live the Life (LtL) | School-Based | 7th grade students/start of 7th grade | 90% group, 5% web, 5% phone | 80% group, 10% web, 10% phone


B3. Methods to Maximize Response Rates and Deal with Nonresponse

We expect a response rate of 85 percent or better on the first follow-up surveys.6 We expect to achieve this completion rate for several reasons. Survey administration will occur at most six months after the program end date.7 This timing will ensure that our contact data are quite current, minimizing location problems. In some sites, surveys can be administered to most youth in the location where the baseline survey was conducted and where the program took place (for example, the school). In addition, we expect that obtaining each site’s willing assistance will be very important to maximizing the response rate; we will invest significant effort in gaining sites’ cooperation from the beginning of the study, minimizing burden on sites, and assuring privacy and confidentiality to the youth participants. Sites will be given detailed information about the surveys, how they will be administered and on what schedule, what involvement and time will be required of school staff, and how data will be used and protected. Bringing sites into the process while minimizing burden should help secure site support of the PPA data collection. We do not anticipate differential response rates across sites. Moreover, by applying identical methods for maximizing the response rates of the treatment and control groups, we anticipate no differences in the rates within sites between the two experimental groups.

Prior to survey administration in the school-based sites, we will work closely with our school contacts to locate respondents in their new classrooms. We will ask schools to post reminders and make announcements prior to and on the day of the survey administration to maximize attendance. On the day of the survey administration, contractor staff will take attendance prior to beginning administration and immediately follow up with the school contact regarding any unexpected absentees. As previously noted, sample members who have transferred schools or moved out of the area will be tracked and given the option to complete the survey over the web or by telephone.

In sites where group-based administration is not possible, an advance letter will be sent to sample members, notifying them of the data collection and providing them with the information necessary to complete the survey on the web or over the phone. Additional email and telephone prompts will be conducted as needed.

Additionally, incentives will be provided to respondents to encourage participation in the survey. Table B.5 provides a summary of the incentives to be offered in each of the sites. For the school-based administrations (Chicago, Live the Life, PCLT), no incentive is offered at baseline. In the other sites, baseline incentive amounts vary slightly depending on the mode of administration. For follow-up data collection in the school-based sites, a $10 gift card is provided to participants completing the survey in a group setting; a $25 gift card is provided to those completing the survey by phone or on the web. A higher incentive is offered to these respondents because completion outside of the group administration requires greater initiative and cooperation on the part of the respondent, as well as additional time outside of the school day. In the other sites, the incentive amounts vary slightly based on the mode of administration (and associated burden), the mobility of the population, and the length of time from enrollment. A report evaluating the use of incentives in this study will be provided to OMB.

Despite our expectation that the nonresponse rate will be low in each site, we will nevertheless take steps both to understand the nature of any nonresponse and to account for the threat it may pose to the validity of the study’s impact estimates. Using data from the baseline survey, we will first test for statistically significant differences between respondents and nonrespondents across all of the demographic and baseline outcome variables. We will then control for any such differences by using baseline data as covariates (see Section A.16). In addition, to the extent that nonresponse is higher than anticipated (above 20 percent), we will correct for differences between respondents and nonrespondents by constructing sample weights so that the weighted respondent sample mirrors the characteristics of the full sample. These weights will be used in each of the models used to estimate the program effects (described in Section A.16). A minimal sketch of this weighting step appears below.
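To illustrate the weighting correction, the sketch below estimates each sample member’s probability of responding from baseline covariates and weights respondents by the inverse of that probability, so that the weighted respondents mirror the full randomized sample. The covariates, the use of scikit-learn, and the normalization are illustrative assumptions; this document does not specify the exact weighting procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def nonresponse_weights(x_baseline, responded):
    """Inverse-probability-of-response weights from baseline covariates.

    x_baseline : (n, k) array of baseline characteristics for the full
                 randomized sample.
    responded  : (n,) 0/1 indicator of follow-up survey completion.

    Returns one weight per respondent; weights are normalized so they
    sum to the full sample size.
    """
    model = LogisticRegression(max_iter=1000).fit(x_baseline, responded)
    p_respond = model.predict_proba(x_baseline)[:, 1]   # estimated response propensity
    weights = (1.0 / p_respond)[responded == 1]
    return weights * (len(responded) / weights.sum())

# Hypothetical data: 1,000 sample members, 3 baseline covariates, ~85% response
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 3))
r = rng.binomial(1, 0.85, size=1000)
print(nonresponse_weights(x, r)[:5])
```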


Table B.5. Incentives for Data Collection, by Site

Site (Grantee) | Baseline | First Follow-up | Second Follow-up
Chicago Public Schools | None | $10 group; $25 phone/web* | $10 group; $25 phone/web
OhioHealth | $10 individual PAPI | $10 for 6 months; $25 for 18 months | $50
Children’s Hospital Los Angeles (CHLA) | $20 in-person ACASI | $20 in-person ACASI | $30 in-person ACASI
Oklahoma Institute for Child Advocacy (OICA) | $10 group | $15 for immediate post-test; $25 for 6 months* | $35+
EngenderHealth | $20 group PAPI | $25 web/phone | $25 web/phone
Live the Life (LtL) | None | $10 group; $25 phone/web | $10 group; $25 phone/web
Princeton Center for Leadership Training (PCLT) | None | $10 group; $25 phone/web | $10 group; $25 phone/web

*Incentive was included in previous submission to OMB (approval received September 27, 2011; OMB Control No. 0970-0360).

+ We are pursuing IRB approval to reduce the incentive from the originally approved $50 to $35.

B4. Tests of Procedures or Methods to be Undertaken

We conducted pretests of the follow-up instrument.8 We recruited pretest participants, and study staff talked directly with all interested teens to explain the pretest and the need to obtain parental consent prior to their participation. Letters explaining the study and the purpose of the pretest were sent to parents, along with the active parental consent form. Those with parental consent were invited to participate in one of several pretest administrations, during which small groups of four or five teens completed the self-administered questionnaire in a group setting and then completed a one-hour one-on-one debriefing with a researcher. The pretest sample included youth ages 12-16 from both high and low socioeconomic backgrounds, some of whom were receiving social support services from a community organization.

The administration of the pretest mirrored as closely as possible what will happen during the actual study in school-based, group administrations. The survey administration began with a brief description of the study, an explanation of the purpose of the pretest, and a clear reassurance to respondents of confidentiality. Student assent was then obtained for each respondent, and staff distributed surveys, explaining that each person was to complete Part A, but that they were to complete only one Part B and then put all three Parts (complete and not complete) in the blank return envelope. As will be done in the study, no distinction was made between the two Part Bs; it was simply noted that respondents were not to complete both Part B1 and Part B2 and that they were to follow instructions carefully about which Part B to complete. To the extent possible, respondents were seated as if in a classroom, with at least one seat between each person.

Once they completed the survey, pretest respondents attended a one-on-one debriefing session with a same-sex staff member, where they were asked about questions or terms that may have been unclear or unknown; their thoughts on the survey and how comfortable they would feel responding in class; what they thought of when answering particular questions; and how they came to their answers for particular questions.

This pretest was conducted with a draft of what is now the “concordance” instrument, and site-specific variants of that instrument have been developed. However, all items included in these site instruments have been tested, used already in the PPA baseline context, or derived from other surveys.

Items incorporated in the other site-specific follow-up instruments based on local evaluators’ draft instruments have been pretested: all grantees are required to conduct a pilot of their programs, including their instrumentation, so items taken from local evaluators have been tested under the terms of the grantees’ federal funding. In a few instances, new items were developed by the study team. These items will be piloted with nine or fewer respondents. As needed, revised instruments will be submitted to OMB prior to administration.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The PPA follow-up surveys will be administered by HHS’s contracting organization, Mathematica Policy Research. The same contractor will analyze data with support from evaluation colleagues at Child Trends. Individuals whom HHS consulted on the collection and/or analysis of the follow-up data include those listed below.

Alan Hershey

Mathematica Policy Research, Inc.

P.O. Box 2393

Princeton, NJ 08543

(609) 275-2384



Christopher Trenholm

Mathematica Policy Research, Inc.

P.O. Box 2393

Princeton, NJ 08543

(609) 936-2796



Brian Goesling

Mathematica Policy Research, Inc.

P.O. Box 2393

Princeton, NJ 08543

(609) 945-3355



Kristin Moore

Child Trends

4301 Connecticut Ave. NW

Washington, DC 20008-2333

(202) 362-5580

Melissa Thomas

Mathematica Policy Research, Inc.

P.O. Box 2393

Princeton, NJ 08543

(609) 275-2231



Jennifer Manlove

Child Trends

4301 Connecticut Ave. NW

Washington, DC 20008-2333

(202) 362-5580



Silvie Colman

Mathematica Policy Research, Inc.

P.O. Box 2393

Princeton, NJ 08543

(609) 750-4094





We have also consulted with:

Amy Margolis

HHS Office of Adolescent Health

U.S. Department of Health and Human Services

1101 Wootton Parkway

Rockville, MD 20852

(240) 453-2820



Stan Koutstaal

Family and Youth Services Bureau (prior to FY11)

U.S. Department of Health and Human Services

370 L’Enfant Promenade, SW

Washington, DC 20447

(202) 401-5457




Lisa Trivits

Office of the HHS Assistant Secretary for Planning and Evaluation (ASPE)

U.S. Department of Health and Human Services

370 L’Enfant Promenade, SW

Washington, DC 20447

(202) 205-5750

Inquiries regarding statistical aspects of the study design should be directed to the project officer:

Amy Farb

Office of Adolescent Health

1101 Wootton Parkway

Suite 700

Rockville, MD 20852

(240) 453-2836






1 Within each site, the same instrument will be used for all rounds of follow-up data collection. Minor updates may be needed to adjust references to periods of time or specific dates.

2 Specifically, the current submission includes follow-up instruments for all seven sites (the Chicago first follow-up and the Oklahoma first follow-up (6 months) and additional early follow-up (immediate post-test) were previously approved on September 27, 2011; OMB Control No. 0970-0360; now under OMB Control No. 0990-0382). This submission also includes the estimated burden for the following: first follow-up data collection in the five remaining sites (OhioHealth, CHLA, Teen PEP, EngenderHealth, and Live the Life); an additional early follow-up data collection in OhioHealth; and second follow-up data collection in all seven sites.

3 In two sites, an additional early follow-up has been scheduled. In the Oklahoma (OICA) site, an immediate posttest will allow analysis of immediate effects on knowledge and attitudes, using the progression of three follow-up data points to model the role of intermediate outcomes on long-term impacts. In the OhioHealth site, where intervention effects on the short-term contraceptive practice of teen mothers after the birth of their child is an important goal, the plan includes a follow-up six months after enrollment, while the program sample is still active in the program. The addition of this early follow-up should have no effect on the quality of data collected at later follow-ups. The interval between the early follow-up and the next is six months or more; that interval, and even shorter ones, are commonly used in teen pregnancy prevention studies with multiple follow-up surveys. We will work in each of these sites to ensure that the same procedures are used in the early follow-up as in later ones, and to maintain respondent commitment to sustained participation in the study.



4 In Oklahoma, we are projecting 85 percent completion for both the immediate posttest and the six-month follow-up. In OhioHealth, we are projecting 85 percent for the 6 month follow-up and 82 percent for the 18 month follow-up.

5 Trenholm, Christopher, Barbara Devaney, Kenneth Fortson, Lisa Quay, Justin Wheeler, and Melissa Clark. “Impacts of Four Title V, Section 510 Abstinence Education Programs.” Final report submitted to the U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. Princeton, NJ: Mathematica Policy Research, 2007.

6 For the two sites with additional early follow-ups, we expect a response rate of 85 percent or better for the immediate post-test and the 6 month follow-up in Oklahoma; in OhioHealth, we expect a response rate of 85 percent or better for the 6 month follow-up and 82 percent or better for the 18 month follow-up.

7 The estimated response rate is based on the number of consented individuals and is not conditional on completion of the baseline survey.

8 Within sites, we will use the same instrument for all rounds of follow-up data collection.

