

August 2008


SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION


for


Evaluating the Title XX
Adolescent Family Life (AFL) Program: Prevention Demonstration Projects


Prepared for


Johanna Nestor

Office of Population Affairs/DHHS

1101 Wotton Parkway, Suite 700

Rockville, MD 20852

(240) 453-2808


Prepared by


RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709


RTI Project Number 0211776






___________________

RTI International is a trade name of Research Triangle Institute.

Table of Contents

A. Justification

1. Circumstances Making the Collection of Information Necessary

2. Purpose and Use of the Information Collection

3. Use of Improved Information Technology and Burden Reduction

4. Efforts to Identify Duplication and Use of Similar Information

5. Impact on Small Businesses or Other Small Entities

6. Consequences of Collecting the Information Less Frequently

7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

9. Explanation of Any Payment or Gift to Respondents

10. Protection of Data Security and Participant Privacy

11. Justification for Sensitive Questions

12. Burden Estimate (Total Hours & Wages)

13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

14. Annualized Cost to the Federal Government

15. Explanation of Program Changes or Adjustments

16. Plans for Tabulation and Publication and Project Time Schedule

17. Expiration Date

18. Exceptions to Certification for Paperwork Reduction Act Submissions


B. Collection of Information Employing Statistical Methods

1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information

3. Methods to Maximize Response Rates and Deal with Nonresponse

4. Tests of Procedures or Methods to be Undertaken

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

References


Appendices

A. Statute/Regulation Mandating or Authorizing the Collection of Information

B. Cross-Site Evaluation Data Collection Materials

C. Federal Register Notice to the Public

D. RTI Institutional Review Board Approval Notice

E. Assurances of Confidentiality and Study Descriptions Provided to Respondents

F. Recruiting Materials

G. Cross-site Evaluation Study Protocol


List of Exhibits

Exhibit 1. Cross-Site Evaluation Research Questions

Exhibit 2. Persons Consulted Outside the Agency

Exhibit 3. Studies Involving Child and Adolescent Respondents Receiving $10 Incentives and Corresponding Response Rates

Exhibit 4. Description of Sensitive Questions, Justification for Inclusion, and Use of Data

Exhibit 5. Estimated Annualized Burden Hours

Exhibit 6. Estimated Annualized Cost to Respondents

Exhibit 7. Logic Model for the Cross-Site Evaluation of the AFL Program

Exhibit 8. Study Hypotheses

Exhibit 9. Cross-Site Evaluation Analyses

Exhibit 10. Time Schedule for the Entire Project

Exhibit 11. Longitudinal Response Rates and Numbers of Adolescents

Exhibit 12. Longitudinal Completion and Retention Rates for Prior Studies



This document provides a Supporting Statement to accompany a request for approval of revisions to the Office of Population Affairs (OPA) prevention demonstration project core evaluation instruments (OMB 0990-0291), which are used to collect information to evaluate Prevention demonstration projects funded by the Adolescent Family Life (AFL) Program, administered by OPA. Approval is also requested for the collection of information for the Cross-Site Evaluation of the AFL Program: Prevention Demonstration Projects.


A. Justification

This section provides detailed justification for the request for approval of revised core instruments and of collection of information for the Cross-Site Evaluation of the Title XX AFL Program: Prevention Demonstration Projects.


1. Circumstances Making the Collection of Information Necessary

The AFL Program is administered by the Office of Adolescent Pregnancy Programs (OAPP) within OPA to support Prevention demonstration projects providing abstinence education to adolescents and Care demonstration projects providing services to pregnant and parenting adolescents. The AFL program and cross-site evaluation are authorized by Title XX of the Public Health Service Act (USC 42 Chapter 6A Subchapter XVIII) (Appendix A).

The Title XX statute requires an independent evaluation of all demonstration projects funded through the AFL program. Because these evaluations are independent, the data collected from one project to another vary. Moreover, the independent evaluations do not always necessarily examine questions of particular statutory or policy relevance to the OPA. Thus, the OPA has developed core evaluation instruments for AFL Prevention and Care demonstration projects that reflect Title XX legislative requirements, as well as the A-H definition of abstinence education contained in the Welfare Reform Act of 1996. 

The use of these core evaluation instruments across AFL Prevention and Care projects enables the OPA to better monitor the direction and progress of the program. This is important on at least two counts:

a. These are demonstration projects and are, therefore, developing and implementing new approaches to abstinence education for adolescents and to services for pregnant and parenting adolescents and their families, within the parameters of the Title XX statute. To direct its funding resources appropriately and efficiently, the OPA must be able to assess the success or failure of these approaches.

b. The AFL program was recently evaluated by the OMB Program Assessment Rating Tool (PART). As a result of that evaluation, OMB recommended that the program develop and track a set of performance measures. Measures for AFL demonstration projects have been developed, and the program tracks them using data from the Prevention and Care core instruments (OMB 0990-0290 and OMB 0990-0291, expiring 9/30/2008).

The core evaluation instruments have already been approved by OMB, but recommendations were made for revisions based on pilot testing and feedback from demonstration projects. This submission requests approval for the revised instruments in order to improve the ability of OPA to monitor project performance, to improve the quality of individual demonstration project evaluations, and to facilitate a cross-site evaluation of AFL demonstration projects. The OAPP estimates that 40,000 participants may use the surveys annually.


The specific aim of the cross-site evaluation is to evaluate the impact of AFL demonstration projects. Desired outcomes for Prevention projects providing abstinence education include prevention of and reduction in sexual activity, improved attitudes about abstinence, and increased parent-child communication among adolescents. For the cross-site evaluation, impact evaluation data will be collected by AFL Prevention demonstration projects from 2,661 adolescents aged 9 to 19 who are AFL service recipients or serve as comparison group participants. (These adolescents and projects are a subset of the up to 40,000 participants completing the surveys). The cross-site evaluation will include demonstration projects with strong evaluation designs, namely randomized controlled trials and strong quasi-experimental designs. The research will include three data collection points using the two core evaluation instruments: (1) a baseline Prevention survey and (2) follow-up Prevention surveys to be administered approximately one and two years after baseline. This submission requests approval for both surveys.

The field of adolescent reproductive health is well poised to seize an opportunity for a large-scale evaluation with important public health and policy implications. To date, various studies have shown abstinence education to be an effective approach for prevention and risk reduction (e.g., Blake, Simkin, Ledsky, Perkins, & Calabrese, 2001; Doniger, Riley, Utter, & Adams, 2001; Weed, 2004), although some studies suggest abstinence education may not significantly impact sexual behavior (e.g., Kirby, 2007; Trenholm et al., 2007). The proposed cross-site evaluation is an effort to advance the field of research and respond to calls for improvement in AFL’s program results/accountability (The White House, 2005) and for rigorous evaluation of adolescent reproductive health programs overall (Hoyer, 1998; Kirby, 2002; U.S. Government Accountability Office, 2006).

The AFL cross-site evaluation will be a meta-analysis impact evaluation to compare adolescents targeted by Prevention projects against adolescents not targeted. The cross-site evaluation presents a unique opportunity to evaluate the effectiveness of a multi-site funding program to prevent adolescent sexual activity.

2. Purpose and Use of the Information Collection

The purpose of the data collection and evaluation is to determine the impact of AFL demonstration projects on desired outcomes. Anticipated effects of Prevention projects providing abstinence education include main effects on sexual activity, intentions to have sex, attitudes about abstinence, and parent-child communication; mediating effects of self-efficacy to remain abstinent, beliefs about the future, and attitudes about sexual risk; and moderating effects of prior risk levels and demographic characteristics. Research questions that will be investigated using these instruments will vary from project to project; however, the cross-site evaluation is designed to answer specific research questions across projects. Key research questions for the cross-site evaluation are presented in Exhibit 1. Copies of data collection instruments are attached in Appendix B.

The information obtained from the proposed data collection activities will be used to inform OPA, policy makers, practitioners, and researchers about the effects of the AFL program activities. This information will enable OPA to more effectively address prevention of sexual risk behavior. These findings will inform the application of AFL program funds and priorities and will have policy implications for other mechanisms of providing funding for abstinence education programs.

Exhibit 1. Cross-Site Evaluation Research Questions

  1. Was the program effective in producing the desired outcomes on the targeted mediator variables, including:

    a. Self-efficacy to remain abstinent

    b. Beliefs about the future

    c. Attitudes about perceived risks from sexual activity

  2. Did the program effectively increase parent-child communication?

  3. Did the program effectively improve adolescent attitudes, intentions, and behaviors surrounding sexual activity and abstinence?

  4. Did the effects of the program vary based on moderator variables? Potential moderator variables include:

    a. Pre-program risk level of adolescents (prior behavior, parent-child communication, parent involvement, parent-child relationship, involvement in prosocial activities)

    b. Demographic characteristics (adolescent age, race/ethnicity, gender, urbanicity, region, socioeconomic status, family structure)

  5. Did the program achieve its effects on parent-child communication and abstinence attitudes/behaviors by altering the mediating variables in #1?



3. Use of Improved Information Technology and Burden Reduction

The revised core evaluation instruments will be used in the conduct of the independent evaluations required of all AFL Prevention grantees by statute. Use of information technologies for these independent evaluations will therefore be dependent upon the capacities of specific grantees and their evaluators. 

The cross-site evaluation will rely on paper-and-pencil Teleform questionnaires to be self-administered by adolescents. One alternative method considered was to conduct telephone surveys. However, conducting surveys by telephone would be extremely time-consuming and costly, given the number of students (n=2,661) expected to participate. In addition, response rates for telephone surveys are decreasing as new technology (answering machines, voice mail, caller identification) becomes available (O’Rourke et al., 1998), and non-locate rates in later waves of longitudinal telephone surveys are increasing, likely due to increased use of cellular phones and frequent switching of carrier companies. Further, we believe there would be serious risks to privacy and confidentiality if students were asked to disclose sensitive information regarding reproductive health topics over the telephone. OPA’s contractor for the cross-site evaluation, RTI International, conducted a capacity assessment of Prevention projects to determine the best way to collect data across projects. Many participants do not have reliable access to computers, and using school computers for survey administration would not provide adequate privacy for respondents to feel comfortable answering questions honestly. Even if each classroom had a computer, students would have little privacy and limited opportunity to complete the survey in a timely manner. Most projects also lack the capacity to use technology such as audio computer-assisted self-interview (ACASI). Survey administration with Teleform instruments will therefore minimize burden on AFL demonstration projects while limiting potential biases that might jeopardize our ability to address the evaluation research questions.

4. Efforts to Identify Duplication and Use of Similar Information

The purpose of the core evaluation instruments is to ensure uniform data collection across AFL demonstration projects in areas of particular statutory or policy interest to the OPA. While program evaluations might, in the absence of the core instruments, collect some similar data, the core evaluation instruments ensure that these data are collected consistently. The OPA requires all AFL demonstration grantees funded in FY 2005 and later to incorporate the core instruments into their evaluations.

In designing the proposed data collection activities for the cross-site evaluation, we have taken several steps to ensure that this effort does not duplicate ongoing efforts and that no existing data sets would address the proposed study questions. To ensure that this study is forging new ground in our understanding of the effectiveness of the AFL program, we conducted an extensive review of the literature by examining several large periodical journal databases. We identified published articles or books containing the keywords “adolescent,” “youth,” “abstinence,” and “parent-child communication.” In addition to reviewing published information, we searched for “gray” literature by contacting well-known researchers in the field and by exploring the Internet. Searches were performed using several Internet search engines and databases, including Google, Yahoo, AltaVista, Medline, and Science Direct, with the search terms “adolescent,” “youth,” “abstinence,” and “parent-child communication.”

The results of the literature search and consultation with experts in the field revealed that although a fair amount of research has been conducted on abstinence education, little has been done to conduct a cross-site evaluation in this area or evaluate the effectiveness of a program like AFL. One study evaluated the effectiveness of four Title V abstinence education programs, but the programs randomized adolescents to treatment and control conditions within schools, introducing possible contamination between treatment and control groups, and only targeted adolescents in upper elementary and middle school grades (Trenholm et al., 2007). Another study included three abstinence education programs, but none of the programs were AFL projects (Kirby, 2007); the study was unable to draw substantial conclusions about effectiveness of abstinence education because of the small number of programs included in the study. A recent meta-analysis included four abstinence-oriented programs, but none of the programs were AFL projects (Silva, 2002); the study’s findings were limited by the quality of the primary research, which utilized poor designs that for the most part did not provide conclusive evidence of program effects. To date, no duplication of the proposed effort has been identified.

We have carefully reviewed existing data sets to determine whether any of them are sufficiently similar or could be modified to address OPA’s need for information on the effectiveness of the AFL program with respect to abstinence education. Efforts to avoid duplication include a review of OPA’s administrative agency reporting requirement and of existing studies of OPA’s programs. We investigated the possibility of using existing data to examine our research questions, such as evaluations of past Prevention demonstration projects; individual and local evaluations of abstinence education efforts; surveys by the National Campaign to Prevent Teen Pregnancy (2003); the National Survey on Family Growth (Abma, Martinez, Mosher, & Dawson, 2004; Albert et al., 2005); the National Longitudinal Study of Adolescent Health (1998); the National Survey of Adolescents and Young Adults: Sexual Health Knowledge, Attitudes and Experiences (Henry J. Kaiser Family Foundation, 2003); and the Youth Risk Behavior Survey (Eaton et al., 2006). However, none of these existing data included pre- and post-test data in a rigorous design using standardized instruments across multiple programs to test projects and services like the ones funded by the AFL program.

5. Impact on Small Businesses or Other Small Entities

To the extent that AFL demonstration projects might be considered small businesses or entities, the data to be collected from the core evaluation instruments (Appendix B) would still need to be collected in some form to satisfy the independent evaluation requirement of the AFL statute (Appendix A). Thus, any burden on demonstration projects will be minimal.

6. Consequences of Collecting the Information Less Frequently

While individual AFL demonstration project evaluations may collect similar data in the absence of the core evaluation instruments, the data would not be consistent across projects. This would hamper the OPA’s ability to effectively monitor and manage the direction of the program as a whole, as well as track the performance measures recommended by the OMB. 

If the cross-site evaluation were not conducted, it would be difficult to determine the value or impact of the AFL program on the lives of individuals and families that it is intended to serve. Failure to collect these data could reduce effective use of program resources to benefit adolescents and families.

The cross-site evaluation involves three data collection points: a baseline survey and two follow-up surveys administered approximately one and two school years later for Prevention demonstration projects. Serious consideration has been given to the issue of how frequently to survey respondents for the cross-site evaluation. After consulting with a committee of AFL project staff and young adult clients, an expert workgroup, and other project staff, it was determined that the data collection points would need to be sufficient in number to track and document changes in outcomes within and across individuals, extending from before program exposure to a time point late enough for intervention effects on initiation of sexual activity to be observed among Prevention respondents. In addition, adolescents may experience several developmental changes during this period as some enter puberty, begin noticing the opposite sex, experience peer pressure, and encounter opportunities to engage in sexual activity. Thus, it is important to measure attitudes, behavior, and risk and protective factors at several time points in order to account for changes that may occur because of adolescents’ developmental progression. Less frequent data collection would not allow for measurement of both immediate and long-term program effects. Because of concerns about respondent attrition, RTI determined that the follow-up intervals would need to be narrow enough to enable completion of survey cycles with a given individual over a reasonably short period of time.

7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances that require data collection to be conducted in a manner inconsistent with 5 CFR 1320.5(d)(2).

8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A 60-day Federal Register Notice was published in the Federal Register on June 9, 2008, in Volume 73, Number 111, pages 32583-32584 and provided a 60-day period for public comments (Appendix C). There were no public comments.

A list of consultants on this project, with contact information, is provided in Exhibit 2. Consultants contacted included AFL project staff, AFL young adult clients and former clients, and expert researchers with backgrounds in adolescent reproductive health and program evaluation. The information provided in these discussions was extremely helpful in informing RTI staff about suggested improvements to the core instruments for all grantees, as well as the expected reactions of adolescents who will participate in the evaluations (and of parents of adolescents aged 17 or younger, who will need to provide consent for adolescent participation). This information helped guide the development of both the instruments and the cross-site evaluation study design. Input and recommendations were incorporated into the survey and questionnaire design to the extent possible.

RTI staff consulted with respondent surrogates in connection with pre-tests of the survey instruments (which are currently approved instruments under the collection OMB 0990-0291) as described in Section B.4. A total of 72 self-administered Prevention baseline questionnaires and 73 follow-up questionnaires were completed by adolescents aged 9 to 19 to ensure that these surveys could be completed in approximately 20 minutes. Refinements to the surveys were made as a direct result of these pretests.

9. Explanation of Any Payment or Gift to Respondents

For individual project evaluations that involve the core instruments, payments to respondents may be provided if they are necessary to facilitate participation; the decision to provide payments will be left to each project.

For the cross-site evaluation, incentives will be provided to teachers and to adolescent respondents for return of parent consent forms, and to adolescent respondents for completion of data collection instruments.


Parent consent incentives

Because obtaining active parent consent in schools is a very difficult task, we will provide a $25 gift card to each teacher in a school-based or after-school setting for each classroom in which at least 90% of the parental consent forms are returned, whether or not the parents allow adolescents to participate. Gift cards may be used to purchase classroom supplies or pay for a class party. RTI has conducted several studies in which we have employed parent consent tracking incentives and have achieved parent consent form return rates higher than 75%. Such incentives are now routinely offered in RTI’s school-based grant efforts and have been approved by OMB for use in other studies (OMB 0920-0783).






Exhibit 2. Persons Consulted Outside the Agency

Expert Work Group

Elaine Borawski, Ph.D., Director, Center for Health Promotion Research

Case Western Reserve University

216.368.1024

[email protected]

Jeff Tanner, Ph.D., Associate Dean

Baylor University

254.710.3485

[email protected]

Claire Brindis, Dr.P.H.

Professor of Pediatrics and Health Policy

Associate Director, Institute for Health Policy Studies

Center for Reproductive Health Research and Policy

University of California at San Francisco

415.476.5255

[email protected]

Lynne Tingle, Ph.D., Assistant Professor

University of North Carolina at Greensboro

336.334.3435

lrtingle@uncg.edu

Douglas Kirby, Ph.D., Senior Research Scientist

ETR Associates

831.438.4060

[email protected]

Gina Wingood, Sc.D., Associate Professor and Director, Behavioral and Social Science Core

Center for AIDS Research

Emory University

404.727.0241

[email protected]

Lisa Lieberman, Ph.D., President

CHES

Healthy Concepts, Inc.

845.638.1619

[email protected]

Meredith Kelsey, Ph.D., Research and Policy Analyst

Division of Children and Youth Policy

Office of the Assistant Secretary for Planning and Evaluation

202-690-6652

[email protected]

Dennis McBride, Ph.D., Associate Director for Research

The Washington Institute for Mental Illness Research and Training

University of Washington

253.756.2335

[email protected]

Lisa Trivits, Ph.D., Research and Policy Analyst
Division of Children and Youth Policy

Office of the Assistant Secretary for Planning and Evaluation

202-205-5750

[email protected]

Amy Ong Tsui, Ph.D., Director and Professor

Bloomberg School of Public Health

Johns Hopkins University

410-955-2232

[email protected]


Staff Committee

Anne Badgley, M.Ed., President and CEO
Heritage Community Services

843-863-0508

[email protected]

David MacPhee, Ph.D., Professor

Human Development & Family Studies

Colorado State University

970-491-5503

[email protected]

Leisa Bishop, Director of Neighborhood Services

BETA Center, Inc., Project FAME

407-277-1942 ext. 134

[email protected]

Janet Mapp, Interim Director of Prevention Services

Switchboard of Miami

305-358-1640

[email protected]

Doreen Brown, Director of Outreach Services Healthy Connections

479-243-0279

[email protected]

Ruben Martinez, Ph.D., Evaluator

Decisions For Life of Baptist Child and Family Services

210-458-2654

[email protected]; [email protected]

Carl Christopher, Educator

St. Vincent Mercy Family Care Center

419-251-2341

[email protected]

Alice Skenandore, Executive Director

Wise Women Gathering Place

920-490-0627

[email protected]

Cheri Christopher, Young Adult Representative

St. Vincent Mercy Family Care Center

419-251-2341

[email protected]

Jared Stangenberg, Young Adult Representative

615-683-7106

[email protected]

Christina Diaz, Program Director

Decisions For Life of Baptist Child and Family Services

210-240-8866

[email protected]; [email protected]

Cherie Wooden, R.N., BSN Program Manager

Helping Our Parents to be Educators (HOPE)

607-584-4485

[email protected]

Amy Lewin, Psy.D., Assistant Professor

Center for Health Services and Community Research Healthy Generations Program

Children’s National Medical Center

202-884-3106

[email protected]




We will provide adolescents with a small incentive worth $1.00 (such as tokens they can redeem in the school book store or cafeteria, arm bands, pencils, or mirrors) for returning their signed parental consent form, whether or not they obtain permission to participate in the study. In a school-based study, Blinn-Pike and colleagues (2000) asked teachers to estimate how many students they could obtain parent consent from for baseline data collection. The study was to begin within one month after the start of the school year; however, by mid-October, 6 of the 12 teachers had not secured the signed parental permission forms needed to begin the project. A primary reason cited by teachers was that incentives were needed to motivate students to return the forms. In a recent school-based study that RTI conducted, we attempted to obtain signed parent consent forms without offering a student incentive. Consent return rates were disappointingly low during the first two weeks, so we began offering a small $0.50 token incentive to students, and consent return rates improved dramatically. Small student incentives are now routinely offered in RTI’s school-based grant efforts and have been approved by OMB for use in other studies (OMB 0920-0783).


Survey participation incentives

A $10 gift card incentive will be offered to participants who complete the baseline and follow-up surveys. Adolescents are a difficult cohort to recruit for a 20-minute survey about this sensitive topic without the use of a small incentive. The incentives are intended to recognize the time burden placed on adolescents, encourage their cooperation, and convey appreciation for contributing to this important study. Numerous empirical studies have shown that incentives can significantly increase response rates (e.g., Abreu & Winters, 1999; Shettle & Mooney, 1999; Singer et al., 1999). The decision to use incentives for this study is based on several projects conducted by RTI and others, which found that the use of $10 incentives increased response rates among adolescents and other populations similar to the proposed study population. Exhibit 3 summarizes several such studies and the response rates achieved. Although these studies differ in other respects that could account for some variability in response rates, overall, incentives of at least $10 were generally associated with higher response rates compared with no incentive.

Exhibit 3. Studies Involving Child and Adolescent Respondents Receiving $10 Incentives and Corresponding Response Rates

Healthy Schools/Healthy Communities (2002)
  Population: Adolescents aged 12 to 17 years
  Incentive provided: $10 gift certificate and a baseball cap or calculator
  Response rate achieved: 68%

Georgia Health and Behavior Study (2002)
  Population: Persons aged 9 to 17 years
  Incentive provided: $10 cash for each of two interviews
  Response rate achieved: 76% first interview; 84% second interview

National Survey on Child and Adolescent Well-Being (2002)
  Population: Children aged 6 to 10 years
  Incentive provided: $10 gift certificate for 25-minute interview
  Response rate achieved: 85%

The University of California Irvine Stress and Trauma Study (2001–2004)
  Population: Adolescents aged 13 to 17
  Incentive provided: $10 initial incentive (Pool A); $10 initial incentive plus $10 completion incentive (Pool B)
  Response rate achieved: 83% (Pool A); 79% (Pool B)



The use of modest incentives is expected to enhance survey response rates without biasing responses or coercing respondents to participate. A smaller incentive would not be sufficiently attractive to adolescents. We also believe that the incentives will result in higher data validity as adolescents become more engaged in the survey process. The amount of the incentives was determined through discussions with RTI staff and expert work group members with expertise in conducting adolescent surveys about reproductive health issues. Because not all selected adolescents may be eligible for the study, we want to ensure efficient project spending and will provide substantial incentives only to respondents who are determined to be eligible.

10. Protection of Data Security and Participant Privacy

For individual AFL project evaluations, specific procedures for data collection privacy and security are site specific. Each AFL applicant, however, must submit a signed acceptance of assurances required by Title XX of the Public Health Service Act. These assurances include affirmation that a system for maintaining the security of client records is in place. Compliance is monitored by OPA staff. Unless grantees have obtained a Certificate of Confidentiality, respondents are not assured that their data are confidential.

All procedures for the cross-site evaluation have been developed, in accordance with federal, state, and local guidelines, to ensure that the rights, privacy, and information security of respondents are protected and maintained. The RTI Institutional Review Board (IRB) reviewed all instruments, informed consent and assent materials, and procedures to ensure that the rights of individuals participating in the study are safeguarded. A copy of the RTI IRB approval notice is included as Appendix D. A pilot test of these procedures was conducted, and no problems were identified (see Section B.4 for a summary of the pilot test). RTI will apply for a Certificate of Confidentiality so that respondents can be assured that their information is confidential.

All respondents will be assured that the information they provide will be kept private and will be used only for the purpose of this research. A copy of this assurance provided in writing to respondents is presented in Appendix E. Respondents will be assured that their answers will not be shared with family members and that their names will not be reported with responses provided. Respondents will be told that the information obtained from all of the surveys will be combined into a summary report so that details of individual questionnaires cannot be linked to a specific participant.

All AFL projects will submit the questionnaire to their site IRB prior to initiating data collection. The questionnaire data will be treated as private and maintained in a manner that satisfies the privacy requirements set forth by the site IRB.

Because one adolescent could potentially view another’s survey responses while administration is in progress, adolescents will be spaced around the room whenever the survey is administered to more than one adolescent at a time. After completing the survey, respondents will place their questionnaires in an envelope. AFL staff will seal the envelope, and it will not be unsealed in the presence of respondents. Only evaluation staff will have access to survey information provided by individual respondents.

To ensure data security, all RTI and AFL project staff are required to adhere to strict standards and to sign agreements as a condition of employment on the cross-site evaluation. Survey administrators will be thoroughly educated in methods of maximizing parent and adolescent understanding of the government’s commitment to privacy. Hard-copy data collection forms will be delivered to a locked area for receipt and processing. Individual identifying information will be kept separate from survey responses, and ID numbers will be assigned to participants for identification purposes. RTI and AFL project staff will never leave completed consent/assent forms or questionnaires unattended. All completed consent/assent forms and the list of participant names and ID numbers will be stored in separate locked filing cabinets accessible only to authorized study personnel. Survey responses will be stored on a secure, password-protected computer shared drive. RTI maintains restricted access to all data preparation areas (i.e., receipt, coding, and data entry). All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. Transmission and collection protocols will be in accordance with requirements set forth by RTI’s IRB and the site IRB. No respondent identifiers will be contained in reports, and results will only present data in aggregate form.

We will seek approval and review by the OS Privacy Act Coordinator, Maggie Blackwell.

11. Justification for Sensitive Questions

The major focus of the AFL Prevention demonstration projects is to develop and test innovative approaches to abstinence education. Demonstration project curricula and other materials cover issues around adolescent sexual activity and the benefits of waiting to have sex. Thus, some questions included in the revised core evaluation instruments might be considered sensitive by some respondents. Exhibit 4 identifies the sensitive questions, explains the justification for their inclusion in the surveys, and describes how the data will be used in the cross-site evaluation. The informed consent and assent protocol apprises respondents that these topics will be covered during survey administration. These questions are included in the surveys because of their importance in understanding adolescent attitudes and behaviors surrounding pregnancy prevention and the moderating effect of adolescents’ pre-intervention risk characteristics on the main effects of the AFL program on adolescent attitudes, intentions, and behavior regarding sexual activity and abstinence. As with all information collected, these data will be presented with all identifiers removed.

For all AFL Prevention projects, including those that participate in the cross-site evaluation, the OPA will consider a waiver to administering questions about sexual attitudes, intentions, and behaviors. Waivers will be considered on a case-by-case basis if the Prevention demonstration project can provide adequate justification. For example, a waiver might be given for a very young client population or, in the case of a school-based project, in the face of opposition from a school board or district.

Exhibit 4. Description of Sensitive Questions, Justification for Inclusion, and Use of Data

Feelings about marriage and sex
  Justification for inclusion: Necessary to determine main effects of the AFL program in improving adolescent attitudes about abstinence until marriage
  Use of data: Used as dependent variable for multivariate analysis comparing treatment and comparison adolescents

Sexual activity
  Justification for inclusion: Necessary to determine main effects of the AFL program in preventing or reducing adolescent sexual activity
  Use of data: Used as dependent variable for multivariate analysis comparing treatment and comparison adolescents

Method(s) to prevent pregnancy and sexually transmitted diseases
  Justification for inclusion: Necessary to determine whether unintended consequences of the AFL program include reductions in contraceptive or STD prevention behaviors among adolescents who are sexually active
  Use of data: Used as dependent variable for multivariate analysis comparing treatment and comparison adolescents who are sexually active

Tobacco, alcohol, or other drug use
  Justification for inclusion: Necessary to determine whether adolescents involved in substance use are equally or less likely to benefit from AFL program exposure than those not involved in substance use
  Use of data: Used as moderating variable for multivariate analysis to assess interaction between exposure to the AFL program and substance use as a significant predictor of abstinence among adolescents

12. Burden Estimate (Total Hours & Wages)

The annual response burden is 29,334 hours. This burden is the total response burden per year for all AFL projects, since the cross-site evaluation involves a subset of respondents completing core evaluation instruments. Exhibit 5 provides details about how the burden estimate was calculated. AFL survey respondents will consist of adolescents aged 9 to 19 who are selected by AFL Prevention demonstration projects to participate in treatment or comparison groups. The paper-and-pencil self-administered surveys will be designed to maximize ease of response and thus decrease respondent burden. The annual respondent cost is $176,000 (Exhibit 6). This cost is the total respondent cost per year for all AFL projects, since the cross-site evaluation involves a subset of respondents completing core evaluation instruments. Respondents participate on a purely voluntary basis and, therefore, are subject to no direct costs other than the time to participate; there are no start-up or maintenance costs. Timings were conducted during our pilot test procedures to determine the overall burden per respondent for the core instruments during the cross-site evaluation. Paper-and-pencil data collection is expected to take 22 minutes per respondent. Because it is not known what the wage rate category will be for these selected adolescents (or even whether they will be employed at all), the figure of $6.00 per hour was used as an estimate of the average minimum wage across the country.
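For illustration only, the short calculation below reproduces the burden and cost figures shown in Exhibits 5 and 6 from the estimates given above (40,000 respondents per survey, 22 minutes per administration, and the assumed $6.00 hourly rate); it is a worked check of the arithmetic, not part of the data collection.

```python
# Worked check of the burden and cost estimates in Exhibits 5 and 6.
respondents = 40_000        # adolescents completing each survey annually
minutes_per_survey = 22     # administration time observed in the pilot test
hourly_rate = 6.00          # estimated average minimum wage used for costing

hours_per_survey = round(respondents * minutes_per_survey / 60)   # 14,667 hours per survey wave
total_hours = 2 * hours_per_survey                                # baseline + follow-up = 29,334 hours
total_cost = round(total_hours * hourly_rate, -3)                 # approximately $176,000 per year

print(hours_per_survey, total_hours, total_cost)
```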

Exhibit 5. Estimated Annualized Burden Hours

Adolescents aged 9 to 19, Baseline survey: 40,000 respondents; 1 response per respondent; 22/60 hours per response; 14,667 total burden hours

Adolescents aged 9 to 19, Follow-up survey: 40,000 respondents; 1 response per respondent; 22/60 hours per response; 14,667 total burden hours

TOTAL: 29,334 burden hours



Exhibit 6. Estimated Annualized Cost to Respondents

Adolescents aged 9 to 19, Baseline survey: 14,667 burden hours at $6.00** per hour = $88,000 total respondent costs

Adolescents aged 9 to 19, Follow-up survey: 14,667 burden hours at $6.00** per hour = $88,000 total respondent costs

TOTAL: 29,334 burden hours; $176,000 total respondent costs

**Estimates of average hourly living allowance for participants

13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

Respondents will incur no capital or maintenance costs.

14. Annualized Cost to the Federal Government

There are no costs to the federal government associated with grantee use of the revised core evaluation instruments. The cost estimate for the completion of the cross-site evaluation will be $1,944,778 over 4 years, including a possible one-year no-cost extension for the project. This total cost covers all cross-site evaluation activities and includes information collection and other cross-site evaluation tasks not included in this OMB application. This includes the estimated cost of coordination with the OPA and AFL projects; project plan and schedule development; RTI IRB applications; overseeing of data collection; analysis; reporting; and progress reporting. Annual cost to the federal government is estimated to be $486,195 ($1,944,778/4).

15. Explanation of Program Changes or Adjustments

A previous OMB application for the core evaluation instruments used by AFL projects was approved in 2005 (0990-0291); thus, this is not a new collection, although that approval expires 9/30/2008. No increase in burden is requested because the grantees are already conducting baseline and follow-up data collection as part of their grant funding requirements. Furthermore, we are not creating new documents or increasing the sample size for this data collection. In fact, the instruments have been shortened from a 30-minute administration to a 22-minute administration, reducing burden.

16. Plans for Tabulation and Publication and Project Time Schedule

The OPA requires AFL demonstration projects to provide tabulations of data on basic demographics and selected questions in the core evaluation instruments in their end-of-year reports (OMB 0990-0300). These aggregated data are used to track progress on the performance measures in response to OMB’s recommendation. 

Analysis of the data for the statutorily required independent evaluation of each project will vary, and be determined, by the individual grantees and their evaluators. 

Analyses for the cross-site evaluation will consist of two phases: (1) analyses of baseline data and (2) longitudinal analyses that include multiple waves of survey data.

Baseline Analyses

Baseline analyses will begin once baseline data are available and will consist of definition of the analysis variables (individual variables, compounded scales, latent groups), descriptive statistics (frequencies, means), and basic tests of differences between treatment groups and grantees (cross-tabulations, Student’s t tests, and chi-square tests). This will include single time point tests of association between hypothesized independent, mediating, moderating, and dependent variables (as conceptualized in Exhibit 7).
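As an illustrative sketch of these baseline descriptive statistics and bivariate tests (not the production analysis code), the example below uses pandas and SciPy; the file and variable names (baseline.csv, treatment, abstinence_attitude, ever_had_sex) are placeholders.

```python
import pandas as pd
from scipy import stats

# One row per adolescent at baseline; column names are placeholders.
df = pd.read_csv("baseline.csv")

# Descriptive statistics: frequencies and means by treatment status
print(df["treatment"].value_counts())
print(df.groupby("treatment")["abstinence_attitude"].agg(["mean", "std", "count"]))

# Student's t test comparing treatment and comparison groups on a continuous scale
treat = df.loc[df["treatment"] == 1, "abstinence_attitude"].dropna()
comp = df.loc[df["treatment"] == 0, "abstinence_attitude"].dropna()
t_stat, t_p = stats.ttest_ind(treat, comp)

# Chi-square test of association for a categorical outcome
chi2, chi_p, dof, expected = stats.chi2_contingency(
    pd.crosstab(df["treatment"], df["ever_had_sex"])
)

print(f"t = {t_stat:.2f} (p = {t_p:.3f}); chi-square = {chi2:.2f} (p = {chi_p:.3f})")
```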

Initial data quality analysis will consist of analysis of nonresponse, study dropout, and missingness patterns in the data as well as preparation of the analysis scales and measures. Although we will implement ongoing data quality control, at the end of the study we will examine the overall quality of the data, which will include but not be limited to identifying patterns of nonresponse and data missingness and examining characteristics of the sample to uncover sources of potential biases. For example, each individual risk-related item might have a small number of missing records, but when many items are combined to create a scale, the cumulative number of observations with missing data could be significant. In cases of missing or inconsistent data, we will decide on the best way to correct the data (imputation methods, reassessing the subject records, etc.).

Exhibit 7. Logic Model for the Cross-Site Evaluation of the AFL Program

 

Though missingness due to item nonresponse or invalid responses is expected to be minimal, particularly with the instrument reconfiguration, missing values will be imputed using Proc MI (multiple imputation), a procedure available in SAS (release 9.1). Compared with the more typical single imputation, this procedure offers the advantage of providing valid statistical analyses that properly reflect the uncertainty due to missing data. We will also check for outlying values and identify the best way to deal with them. We will construct the actual measures to be used in the study, such as attitudes about abstinence and parent-child communication about sex. Finally, we will prepare an analytic data set that includes all studied variables and measures.
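The imputation itself will be carried out in SAS with Proc MI, as noted above. Purely as an illustration of the same multiple-imputation-and-combine logic, the sketch below uses the MICE implementation in Python's statsmodels; the formula and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation import mice

# Analysis variables with some item nonresponse; column names are placeholders.
df = pd.read_csv("analysis_file.csv")

imp_data = mice.MICEData(df)      # chained-equations imputation of missing items
analysis = mice.MICE("abstinence_attitude ~ treatment + age + C(gender)", sm.OLS, imp_data)
results = analysis.fit(n_burnin=10, n_imputations=20)   # analyze 20 completed data sets
print(results.summary())          # estimates pooled across the imputations
```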

We will start baseline analysis with the examination of the psychometric properties of the data, with the intention of developing reliable scales that accurately capture attitudes, beliefs, and behaviors targeted by the AFL program. Measures may be examined for appropriateness across various cultural groups, and scalar equivalence may be assessed using procedures described in Knight, Virdin, & Roosa (1994). Basic tests of association will examine the relationships between the demographic characteristics, pre-program risk levels, and outcomes at baseline to create a reference to which these characteristics will be compared at the longitudinal time points. When the construction of a compound scale is not justifiable, we will use structural equation modeling (SEM), which combines a number of variables with similar meaning into latent class variables. These latent class variables could then be related to each other and/or other variables according to the conceptual model of mediating and moderating effects. However, we will use SEM for a simpler analysis of baseline differences between the groups of variables forming major moderating, mediating, and outcome categories. Finally, SEM will be used to evaluate the multivariate relationships of baseline demographics, mediators, and outcomes to attrition from baseline to follow-up. Variables found to differ between follow-up survey responders and nonresponders will be addressed by adjusting sampling weights to the demographic composition and size of grantee populations (gleaned from end-of-year reports) or by controlling for these variables in multivariate analyses.
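As part of the scale-development work described above, item sets can be screened for internal consistency before compound scales are formed. The sketch below computes Cronbach's alpha with pandas for a hypothetical set of abstinence-attitude items; it is illustrative only, the item names are placeholders, and the factor-analytic and SEM work would be done in dedicated software.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items scored in the same direction."""
    items = items.dropna()
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed scale
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

df = pd.read_csv("baseline.csv")                     # placeholder analytic file
attitude_items = df[["att_item1", "att_item2", "att_item3", "att_item4"]]
print(f"Cronbach's alpha = {cronbach_alpha(attitude_items):.2f}")
```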

To test for differences between treatment groups at baseline, we will use a multilevel (hierarchical) linear model (HLM):

$\text{Dependent variable}_{it} = \beta_0 + \beta_1(\text{treatment status}) + W_j + e_{ijt},$   (1)

where indexes $i$ and $j$ correspond to a subject and grantee, $W_j$ is a grantee-specific random effect, and the $\beta$ terms are regression parameters.

For testing the relationship between baseline outcome variables and targeted risk and protective factors, the hierarchical model becomes

$\text{Outcome}_{it} = \beta_0 + \beta_1(\text{treatment status}) + \beta_2(\text{risk or protective factor}) + W_j + e_{ijt}.$   (2)
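Models (1) and (2) are two-level models with a grantee-specific random intercept. For illustration, the sketch below fits that structure with the mixed linear model in Python's statsmodels; the actual estimation may be done in SAS or another multilevel package, and the variable names (abstinence_attitude, treatment, risk_factor, grantee_id) are placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("baseline.csv")   # placeholder analytic file

# Equation (1): outcome regressed on treatment status with a grantee random intercept (W_j)
m1 = smf.mixedlm("abstinence_attitude ~ treatment", data=df, groups="grantee_id")
print(m1.fit().summary())

# Equation (2): add a targeted risk or protective factor as a covariate
m2 = smf.mixedlm("abstinence_attitude ~ treatment + risk_factor", data=df, groups="grantee_id")
print(m2.fit().summary())
```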

Longitudinal Analyses

Main longitudinal analyses will use a repeated-measures regression model that controls for the baseline values and the data collection time point. For example, a model for continuous variables, such as the parent-child communication score, will look like the following:

$\text{Parent-child communication score}_{it} = \beta_0 + \beta_1(\text{program characteristic}) + \beta_2(\text{baseline value}) + \beta_3(\text{first follow-up}) + \beta_4(\text{second follow-up}) + \beta_5(\text{first follow-up})(\text{program characteristic}) + \beta_6(\text{second follow-up})(\text{program characteristic}) + U_i + W_j + e_{ijt},$   (3)

where indexes $i$ and $j$ correspond to a subject and grantee, $U_i$ and $W_j$ are cluster-specific random effects corresponding to a subject and a grantee, respectively, and the $\beta$ terms are regression parameters. An interaction term between time point and treatment status is included to estimate which program characteristics are particularly effective at which time points. Random effects are included to account for correlations within subject ($i$) and within project sites ($j$). The nature of multilevel hierarchical analysis is to account for clustering of responses within the same individual and the same grantee (i.e., individuals within a site are more similar than individuals across sites, and individuals’ responses are correlated across time). This clustering of individuals within grantees and of responses within individuals is estimated through the variance of the random effects.
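Equation (3) contains random effects for both the adolescent ($U_i$) and the grantee ($W_j$). One way to express that nesting in Python's statsmodels is to use the grantee as the grouping factor and a variance component for subjects within grantee, as sketched below; this is an illustration with placeholder column names, not the final analysis program.

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per adolescent per follow-up wave; column names are placeholders.
long_df = pd.read_csv("longitudinal.csv")

model = smf.mixedlm(
    "pc_comm_score ~ program_char * C(wave) + baseline_pc_comm",  # main effects plus wave interactions
    data=long_df,
    groups="grantee_id",                              # W_j: grantee-level random intercept
    vc_formula={"subject": "0 + C(subject_id)"},      # U_i: subjects nested within grantees
)
result = model.fit()
print(result.summary())
```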

The tested hypotheses also involve binary outcome variables, for which we will use a multilevel logistic regression model. To adjust for adolescents’ ages and other demographic characteristics, we will modify Equation 1 by adding the corresponding terms. We will present both adjusted and unadjusted results.

Analyses of Moderating and Mediating Effects

Moderation of program effects at the organizational and individual levels will be examined to assess whether program effectiveness depends on demographic characteristics and baseline levels of the targeted mediators (i.e., baseline by treatment interaction effects) and contextual variables (e.g., involvement in extracurricular activities). To test the moderating effects of individual demographic characteristics and pre-program risk levels, corresponding covariates and their interactions with the intervention effects will be added to the longitudinal regression models described above (Baron & Kenny, 1986; Judd & Kenny, 1981).

To test the potential mediating effects of risk and protective factors, such as attitudes about sexual risk, corresponding interaction terms (e.g., time x attitudes about sexual risk) will be added to the regression models described above. Then, we will conduct two more regression analyses by adding the mediators as covariates to the first regression model so it becomes the following:

$\text{Outcome}_{it} = b_0 + b_1(\text{treatment status})_{it} + \ldots + U_i + W_j + g(\text{attitudes about sexual risk})_{it} + e_{it}.$   (4)

$(\text{Attitudes about sexual risk})_{it} = a_0 + a_1(\text{treatment status})_{it} + U_i + W_j + e_{it}$   (5)

The test for mediation will consist of testing for the product $g a_1$ (Sobel, 1982). The procedures for estimating mediated effects in the context of multilevel analysis, outlined in Krull and MacKinnon (1999), will be used for parameter estimation. Mediation analyses at the organizational and individual levels will be conducted using the Asymmetric Confidence Interval method (MacKinnon, Taborga, & Morgan-Lopez, 2002), a state-of-the-art method of estimating confidence intervals for mediated effects. Mediational effects will be considered at all follow-up points and will be accounted for by including interaction terms for each time point. Multiple waves of data from respondents will be advantageous to these analyses because potential mechanisms of effect can be examined at intermediate time points, rather than concurrently with predictors or outcomes, which strengthens the evidence for causal associations.
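In product-of-coefficients terms, the mediated effect is $g a_1$: the effect of treatment on the mediator ($a_1$) multiplied by the effect of the mediator on the outcome adjusting for treatment ($g$). The sketch below illustrates this product and a first-order Sobel standard error with NumPy, using placeholder coefficient values; the actual analysis will rely on the multilevel and asymmetric-confidence-interval procedures cited above.

```python
import numpy as np
from scipy import stats

# Placeholder estimates standing in for output from the fitted models (Equations 4 and 5).
a1, se_a1 = 0.30, 0.08   # treatment -> attitudes about sexual risk
g, se_g = 0.45, 0.10     # attitudes about sexual risk -> outcome, adjusting for treatment

indirect = g * a1                                       # mediated effect g*a1
se_sobel = np.sqrt(g**2 * se_a1**2 + a1**2 * se_g**2)   # first-order Sobel standard error
z = indirect / se_sobel
p = 2 * (1 - stats.norm.cdf(abs(z)))

print(f"indirect effect = {indirect:.3f}, z = {z:.2f}, p = {p:.4f}")
```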

Use of Sampling Weights

In conducting the analyses, we will use statistical weights that project the sample estimates to the entire population of adolescents targeted by grantees participating in the cross-site outcome evaluation. Sampling or design weights for each observed unit are computed as the inverse of the probability of selection at each stage of selection. Typically, adjustments to these sampling weights are necessary to account for bias related to sample selection, nonresponse and/or attrition, and deficiencies in population coverage. In recent years, RTI statisticians have developed the generalized exponential model (GEM), a response propensity modeling approach for computing nonresponse and coverage adjustments (Folsom & Singh, 2000). The Folsom and Singh modeling approach is a generalization of the constrained logistic models first suggested by Deville and Särndal (1992). This approach has been shown to be more effective than the more commonly used weighting-class approach in correcting for nonresponse bias. The increase in effectiveness comes from the ability to incorporate a greater number of correlates of nonresponse in the modeling approach than would be possible with traditional weighting-class methods and from the ability to control the variability of the adjustments, which in turn decreases the variance of the resulting weights.
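GEM is a specialized weighting procedure developed at RTI; as a simplified illustration of the underlying response-propensity idea only (not the GEM algorithm), the sketch below adjusts baseline design weights for follow-up nonresponse with an ordinary logistic propensity model, using placeholder variable names.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("baseline_with_followup_status.csv")   # placeholder file
# responded = 1 if the adolescent completed the follow-up survey, 0 otherwise

# Model the probability of responding at follow-up from baseline characteristics
propensity = smf.logit("responded ~ age + C(gender) + C(race) + baseline_risk", data=df).fit()
df["p_respond"] = propensity.predict(df)

# Nonresponse-adjusted weight for respondents: design weight / estimated response propensity
resp = df["responded"] == 1
df.loc[resp, "adj_weight"] = df.loc[resp, "design_weight"] / df.loc[resp, "p_respond"]
```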

Exploratory Effect Size Analysis

Based on our expert workgroup’s recommendation from Stage 2, an additional approach to the outcome evaluation, involving calculation of effect sizes for individual projects, will also be used (DeCoster, 2004; Egger & Smith, 1997; Singleton & Straits, 1999). Effect sizes will be computed in the Cohen’s D metric, or standardized mean differences (Cohen, 1988). Cohen’s D values will be calculated both for comparisons of preprogram versus postprogram means for each group of adolescents participating in an AFL project and for comparisons between adolescents receiving AFL project services and those not receiving AFL project services. Separate Cohen’s D values will be computed for each form of comparison. Cohen’s D values will be calculated from means and standard deviations computed from the data set for each grantee. Each effect size will be weighted by the inverse of its variance to provide more efficient estimation of true population effects (Hedges & Olkin, 1985). Findings for both unweighted and weighted effect sizes will be reported in analyses of overall program effects. Only weighted Cohen’s D values will be analyzed and reported in analyses of moderator variables. All effect sizes will be coded such that positive values indicate differences consistent with a favorable effect of the AFL program.
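For illustration of the effect-size computations described above, the sketch below calculates Cohen's D from treatment and comparison summary statistics and forms an inverse-variance-weighted mean across projects; the per-project numbers are hypothetical.

```python
import numpy as np

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = np.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def d_variance(d, n_t, n_c):
    """Approximate sampling variance of a standardized mean difference."""
    return (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))

# Hypothetical per-project summary statistics (one independent sample per project)
projects = [
    dict(mean_t=3.4, mean_c=3.1, sd_t=0.9, sd_c=1.0, n_t=120, n_c=110),
    dict(mean_t=3.2, mean_c=3.1, sd_t=1.1, sd_c=1.0, n_t=200, n_c=180),
]

d = np.array([cohens_d(**p) for p in projects])
w = np.array([1 / d_variance(di, p["n_t"], p["n_c"]) for di, p in zip(d, projects)])
weighted_mean_d = np.sum(w * d) / np.sum(w)   # inverse-variance-weighted mean effect size
print(f"weighted mean Cohen's D = {weighted_mean_d:.3f}")
```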

For the additional meta-analytic approach, each project will be coded on characteristics (based on process evaluation data), such as project features (e.g., abstinence education alone versus abstinence education as part of a multi-component intervention, project goal, geographic location, setting in which project activities occurred, monitoring of implementation, characteristics of project staff, project staff training, inclusion of parental involvement component), characteristics of participating adolescents (gender, race/ethnicity, age, family structure, socioeconomic background, at-risk status), project dosage and exposure (actual frequency of project contact, average length), and timing of assessment. Efforts will be directed towards at least preliminarily determining whether different types (content, themes, and modes) of projects have different outcomes. The independent sample will be the primary unit of analysis. Each project will contribute one independent sample to the analysis.

The analytic strategies described above will provide an optimal design for assessing the overall impacts of the AFL program on adolescent outcomes and will allow for examination of differences in effects as a function of individual characteristics. The study hypotheses for the cross-site evaluation are outlined in Exhibit 8. Additionally, Exhibit 9 summarizes each of our planned analyses using baseline and multiple waves of follow-up data.

As the evaluation questions and hypotheses are addressed, the findings will be summarized and shared with OPA and OPA-identified stakeholders for comment and interpretation. For this study, we expect the findings to be disseminated to a number of audiences. Therefore, the evaluation reports will be written in a way that emphasizes scientific rigor for more technical audiences but is also intuitive, easily understood, and relevant to less technical audiences. The reporting and dissemination mechanism will consist of three primary components: (1) a baseline sample profile, (2) a data summary including preliminary results of follow up data, and (3) peer-reviewed journal articles. The baseline sample profile report will offer descriptive information about adolescent evaluation participant characteristics at baseline, comparisons between these data and national data about similar populations, and comparisons between treatment and comparison groups. The data summary will include data from the baseline and follow-up data collections, including follow-up response rates and characteristics of adolescents who participated in baseline and follow-up data collection, with preliminary findings from cross-tabulations, multivariable regression models, and exploratory and confirmatory factor analyses. This report will also include preliminary data on adolescents’ intermediate outcomes (attitude and belief changes). The results of our study also will be used to develop at least one peer-reviewed journal article (e.g., American Journal of Public Health, Perspectives on Sexual and Reproductive Health, Journal of Research on Adolescence, or Prevention Science) that summarizes findings on the overall effectiveness of the AFL program. With review and approval by OPA, the results of the evaluation will also be used to develop at least one conference presentation.

The key events and reports to be prepared are listed in Exhibit 10.

Exhibit 8. Study Hypotheses

The primary study hypotheses concern the effects of AFL Prevention projects on parent-child communication, attitudes towards abstinence, sexual intentions, and sexual behavior.


Exposure to AFL Prevention projects will result in:

  • Increased parent-child communication

  • Increased positive attitudes towards abstinence

  • Reduced intentions to have sex

  • Reduced sexual behavior


Additionally, Exhibit 7 identifies several secondary hypotheses, which concern relationships between mediating and moderating variables in the model and the interactions among these variables, AFL program exposure, and outcomes:


  • Self-efficacy and positive beliefs about the future mediate the relationship between program exposure and sexual intentions and behavior;

  • Attitudes about sexual risk mediate the relationship between program exposure and sexual intentions and behavior;

  • Pre-program levels of risk, including prior risk behavior and the parent-child relationship, moderate the relationship between program exposure and the primary outcomes;

  • Demographic characteristics moderate the relationship between program exposure and the primary outcomes.



17. Expiration Date

The OMB expiration date will be displayed on all data collection instruments.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.











Exhibit 9. Cross-Site Evaluation Analyses

Baseline

  • What are the sample characteristics of study participants? (Method: adjusted1 means and frequencies)

  • How are mediating/moderating variables associated with presumed outcomes? How are mediating and moderating variables correlated? (Method: adjusted1 cross-tabulations, analysis of variance, and correlations)

  • What are the psychometric properties of the survey data? (Method: exploratory and confirmatory factor analyses)

  • What are the differences between treatment and comparison groups at baseline? (Method: multilevel logistic and linear regression)

Baseline through final follow-up

  • Are there different patterns of change over time in continuous (scaled) outcomes as a function of exposure to AFL Prevention programs? (Method: multilevel growth curve modeling)

  • Are there different rates of dichotomous outcomes as a function of exposure to AFL Prevention programs? (Method: multilevel logistic regression)

  • What is the relationship between exposure conditions and mediating variables? What is the relationship between mediating variables and program outcomes? What are the pathways among exposure conditions, mediators, and outcomes? (Method: multilevel logistic and linear regression)

  • What is the impact of moderating variables on the relationship between exposure conditions and change over time in continuous outcomes? (Method: multilevel growth curve modeling)

  • What is the impact of moderating variables on the relationship between exposure conditions and rates of dichotomous outcomes? (Method: multilevel logistic regression)

  • What variables are linked to attrition? (Method: multilevel logistic regression)

1 Analyses will be adjusted for clustering of individuals within organizations and programs.
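As an illustration of the multilevel growth curve models named in Exhibit 9, the sketch below fits a simplified two-level model (repeated measures of adolescents clustered within sites) with statsmodels. The variable names (parent_communication, time, treatment, site_id) are hypothetical placeholders, and the planned analyses may specify additional levels and covariates.

    import pandas as pd
    import statsmodels.formula.api as smf

    def fit_growth_model(df_long: pd.DataFrame):
        # Long-format data: one row per adolescent per wave; 'time' is coded 0, 1, 2
        # for baseline and the two follow-ups. The time-by-treatment term tests for
        # differential change over time between treatment and comparison adolescents.
        model = smf.mixedlm(
            "parent_communication ~ time * treatment",
            data=df_long,
            groups=df_long["site_id"],  # clustering of adolescents within sites
            re_formula="~time",         # random intercept and time slope by site
        )
        return model.fit()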



Exhibit 10. Time Schedule for the Entire Project

  • Start date: September 26, 2007
  • Develop project plan and schedule: September 26, 2007 to August 12, 2008
  • Design instruments: September 26, 2007 to August 8, 2008
  • Pilot test instruments: October 1, 2007 to January 31, 2008
  • Main study data collection preparation activities: March 1, 2008 to September 30, 2008
  • Collect baseline data: October 1, 2008 to November 24, 2009
  • Collect follow-up data: March 1, 2009 to November 22, 2011a
  • Analyze data: December 1, 2009 to May 15, 2012a
  • Submit baseline sample profile: January 31, 2010
  • Submit data summary: January 31, 2012a
  • Submit manuscript and conference presentation: June 1, 2012a

a This estimated timeline includes a possible no-cost extension for the project.

B. Collection of Information Employing Statistical Methods

Statistical methods are not used in the collection of information for all AFL demonstration projects using the revised core evaluation instruments; therefore, responses to this section apply only to the methods used for the cross-site evaluation of the AFL program.

1. Respondent Universe and Sampling Methods

The cross-site evaluation, which will include a subset of the projects and respondents completing the survey, will include approximately 2,661 adolescents receiving abstinence education. Adolescents served by Title XX Prevention projects and those selected to serve as comparison groups will participate in the cross-site evaluation.

A total of 36 Prevention projects serve adolescents. From these projects, 7 Prevention projects involving 30 schools or after-school sites have been selected to obtain the sample of 2,661 participants for the cross-site evaluation. Prevention projects were selected for participation based on the rigor of their evaluation designs, namely designs with equivalent treatment and comparison groups that avoid contamination of comparison group respondents by the intervention. We also prioritized projects located in different geographic regions, to maximize regional diversity, and projects that employ implementation strategies conducive to rigorous evaluation (including appropriate timing of program delivery). Information about evaluation design rigor, implementation strategies, and project characteristics was obtained by reviewing end-of-year reports submitted to OAPP and through discussions with OAPP project officers. Within each project, adolescents will be assigned by AFL project staff to treatment and comparison groups.

We conducted power analyses to determine the optimal sample size for detecting statistically significant differences between treatment and comparison groups. For the purposes of power calculations, the frequency with which adolescents report they have engaged in communication with their parents about abstinence and related topics serves as the primary outcome measure; responses will be averaged across 15 items scored from 0 (no talk) to 4 (four times or more in the previous 3 months; Miller et al., 1993). Power calculations were based on the comparison between treatment and comparison groups. [Three other outcome measures (attitudes about abstinence, intentions to have sex, and sexual activity) will not be considered in the final power analyses because some projects may obtain waivers to omit these questions among very young respondents (aged 9 to 13) or among respondents targeted through organizations, such as schools, that will not allow collection of such sensitive information.] Several assumptions were made concerning population parameters for power analyses of the parent-child communication outcome. First, we assumed a 0.5 correlation between outcomes measured at baseline and at the 18-month follow-up for the same respondent; although there is little definitive information about the true correlation over 2 years, there is some evidence from 1-year follow-up studies that this correlation is no stronger than assumed here (Sales et al., 2006). Second, we assumed that outcomes for different respondents would be uncorrelated (siblings and adolescents living in the same household as an enrolled study participant will be excluded), with one exception: because adolescents in Prevention projects are clustered within schools, neighborhoods, or communities, we assumed a school- or community-level intraclass correlation coefficient of 0.10, based on pilot data analyses and prior RTI school-based data about adolescent risk behavior. Third, we assumed that adolescents would report a mean score of 1.2 at baseline and that treatment adolescents would report a mean score of 1.7 at the end of the second school year, as reported by Miller et al. (1993). Each of these assumptions is conservative and therefore increases the required sample size; by comparison, Miller et al. (1993) produced effects of this magnitude at 3 months using an extremely low-intensity intervention (a videotape viewed by adolescents and their parents). These assumptions nevertheless allow us to include enough subjects in our evaluation to detect small effects; making a less conservative assumption would create the possibility that the Prevention project interventions are efficacious but that our sample size is not large enough to detect this.
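To make the role of these assumptions concrete, the following simplified, cluster-adjusted power calculation (Python) shows how they enter a standard sample size formula. The pre/post correlation (0.5), intraclass correlation (0.10), and group means (1.2 versus 1.7) come from the assumptions above; the outcome standard deviation and average cluster size are illustrative placeholders, and the actual power analysis accounted for additional design features (such as attrition), so this sketch is not intended to reproduce the study's required sample size.

    import math
    from scipy.stats import norm

    alpha, power = 0.05, 0.80
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)

    rho = 0.5          # assumed baseline/follow-up correlation (from the text)
    icc = 0.10         # assumed site-level intraclass correlation (from the text)
    delta = 1.7 - 1.2  # assumed treatment vs. comparison difference (from the text)
    sigma = 1.0        # illustrative outcome standard deviation (not stated in the text)
    m = 110            # illustrative average number of adolescents per site (not stated)

    # Per-group n for a baseline-adjusted two-group comparison under simple random sampling
    n_srs = 2 * (z_a + z_b) ** 2 * sigma ** 2 * (1 - rho ** 2) / delta ** 2
    # Inflate for clustering of adolescents within schools or after-school sites
    design_effect = 1 + (m - 1) * icc
    print(math.ceil(n_srs * design_effect))

With these placeholder values the result is illustrative only; it demonstrates how the design effect inflates the required sample size rather than matching the totals reported below.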

To achieve power of 0.80, the analyses indicate that a total of 2,661 adolescents from 24 schools or after-school sites will need to complete the baseline survey. The numbers of adolescents in the respondent universe and in each sample are shown in Exhibit 11. The expected response rate at the second school year follow-up is calculated relative to all adolescents who participate at baseline, including those who may refuse to participate in the first school year follow-up data collection.

All assumptions guiding our power analysis were intended to err in favor of a larger sample size, safeguarding against a worst-case scenario in which effects are difficult to detect. These assumptions increase our confidence that effects smaller than those found by previous programs would still be detected with the sample sizes we identified.

As noted, our sample design is based on conservative assumptions about survey response. Thus, the longitudinal retention rates shown in Exhibit 11 should be viewed as worst-case scenarios that, if they hold true, would still ensure sufficient sample sizes to detect small program effects. For Prevention, we estimate that at least 96% of adolescents who are contacted and for whom parent consent is obtained will complete the baseline survey, that at least 85% of adolescents will be retained between the baseline and first follow-up survey, and that at least 80% of treatment adolescents and 70% of comparison adolescents will be retained between the baseline and second follow-up surveys.

Exhibit 11. Longitudinal Response Rates and Numbers of Adolescents

  • Number of subjects to be contacted at baseline: 1,768 treatment; 1,786 comparison; 3,554 total
  • Expected parent consent rate: 81% treatment; 75% comparison
  • Number of subjects with parent consent at baseline: 1,432 treatment; 1,340 comparison; 2,772 total
  • Expected response rate at baseline: 96% treatment; 96% comparison
  • Number of completed baseline surveys: 1,375 treatment; 1,286 comparison; 2,661 total
  • Expected response rate at end of school year: 85% treatment; 85% comparison
  • Number of completed first follow-up surveys: 1,169 treatment; 1,093 comparison; 2,262 total*
  • Expected response rate at end of second school year: 80% treatment; 70% comparison
  • Number of completed second follow-up surveys: 1,100 treatment; 900 comparison; 2,000 total*

*A subset of the original 2,661 baseline respondents.
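The short calculation below reproduces the cascade in Exhibit 11 from the stated consent, response, and retention rates; it is an arithmetic check only, with results rounded to whole adolescents.

    # Arithmetic check of Exhibit 11: apply the assumed rates and round to whole adolescents
    contacted = {"treatment": 1768, "comparison": 1786}
    consent = {"treatment": 0.81, "comparison": 0.75}
    baseline_rate = {"treatment": 0.96, "comparison": 0.96}
    followup1_rate = {"treatment": 0.85, "comparison": 0.85}
    followup2_rate = {"treatment": 0.80, "comparison": 0.70}

    for group in ("treatment", "comparison"):
        consented = round(contacted[group] * consent[group])          # parent consent at baseline
        completed_baseline = round(consented * baseline_rate[group])  # completed baseline surveys
        completed_fu1 = round(completed_baseline * followup1_rate[group])
        completed_fu2 = round(completed_baseline * followup2_rate[group])  # relative to baseline completers
        print(group, consented, completed_baseline, completed_fu1, completed_fu2)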

Exhibit 12 shows longitudinal retention rates for prior studies of various lengths.

Exhibit 12. Longitudinal Completion and Retention Rates for Prior Studies

Evaluation of abstinence-based pregnancy prevention program (Project IMPPACT)
  • Institution/Client: Inwood House/U.S. Department of Health and Human Services
  • Sample: 7th and 8th grade students
  • Survey: Paper and pencil questionnaire
  • Time from baseline: 2 years
  • Follow-up survey completion rate: 75%
  • Baseline to follow-up retention rate: 59%

Child and Family Well-being Study (The Three Cities Study)
  • Institution/Client: Johns Hopkins University/National Institute for Child Health and Human Development
  • Sample: Focal children of poor households
  • Survey: Physical measurements and a CAPI/ACASI questionnaire
  • Time from baseline: Wave 2: 1.5 years; Wave 3: 6 years
  • Follow-up survey completion rate: 82%
  • Baseline to follow-up retention rate: 80%

It should be noted that although attrition will inevitably occur in this study, as it does in any longitudinal study, we do not expect attrition to bias the study’s main findings. In sample surveys, there will almost always be missing data due to attrition (or initial nonresponse) of selected respondents. In longitudinal surveys, this problem is typically exacerbated over time because further attrition may occur at each wave of the survey. Three distinct mechanisms causing missing data can be identified, and the cause of missingness determines the extent to which bias may be introduced into the study estimates. These mechanisms are as follows:

Data are said to be missing completely at random (MCAR) if the probability of attrition is unrelated to the study outcome variables or to the value of any other explanatory variables, including the exposure conditions. Under MCAR, missingness introduces no additional bias into estimates based on the incomplete data, although the reduced data set will typically result in larger standard errors.

Data are said to be missing at random (MAR) if the probability of attrition is unrelated to the study outcome variables after controlling for other explanatory variables. For example, attrition may vary by demographic characteristics: adolescents from lower-income families may be more likely to drop out of the survey than adolescents from higher-income families. In that case, bias would be introduced into an overall outcome estimate for all adolescents but not into income-specific estimates. Thus, under MAR, the potential bias in estimates due to missingness can be eliminated (or substantially reduced) if the appropriate explanatory variables, such as income, are controlled for.

Data are said to be missing not at random (MNAR) if the probability of attrition is related to the study outcome variable itself. For example, suppose that adolescents who indicate lower parent-child communication about sex at baseline are more likely to drop out of the survey than adolescents who report more parent-child communication. In this case, the overall estimate of parent-child communication among all adolescents will be biased upward by attrition.

In practice, all three missingness mechanisms may be at work (i.e., different attriters may drop out according to different mechanisms). If MNAR is not dominant, then reasonably unbiased estimates of study outcomes can be constructed through appropriate modeling. In the case of this study, we do not expect MNAR to be present.
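One common way to adjust for attrition under the MAR assumption, sketched below in Python, is to model the probability of follow-up response from baseline covariates and weight retained respondents by the inverse of that probability. This illustrates the general approach rather than the study’s specific analysis plan; the data frame and variable names are hypothetical.

    import pandas as pd
    import statsmodels.api as sm

    def attrition_weights(df, covariates):
        # Model follow-up response (1 = retained, 0 = attrited) from baseline covariates,
        # consistent with a missing-at-random assumption, then return
        # inverse-probability-of-response weights for the retained respondents.
        X = sm.add_constant(df[covariates])
        fitted = sm.Logit(df["responded_followup"], X).fit(disp=0)
        p_respond = pd.Series(fitted.predict(X), index=df.index)
        return 1.0 / p_respond[df["responded_followup"] == 1]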

2. Procedures for the Collection of Information

To gather sensitive and complex data for the cross-site impact evaluation, AFL demonstration project evaluation staff will administer paper-and-pencil Teleform surveys to treatment and comparison adolescents.

In order for adolescents aged 17 or younger to be included in the cross-site evaluation sample, their parents must be able to read English or Spanish to provide active consent for their adolescent’s participation (either in writing or by telephone with mailed documentation), and all adolescents must be able to read English or Spanish to provide written consent or assent for their own participation in the study. Consent forms and assent forms are included in Appendix E.

All AFL sites will submit the survey instruments to their site IRB prior to initiating data collection. Copies of local site IRB approvals will be submitted to RTI’s IRB. The questionnaire data will be treated as private and maintained in a manner that satisfies the privacy requirements set forth by the site IRB. Any and all transmission of individual or case level data will also be done in accordance with privacy requirements set forth by their site IRB.

Data collection training, monitoring, and ongoing technical assistance will be provided to projects participating in the cross-site evaluation to ensure high-quality data collection procedures. All AFL project staff administering core evaluation instruments will be trained in survey administration, including consent and assent procedures, privacy guidelines, and identifying respondent distress. In addition, the training will emphasize the importance of following the data collection procedures, including mailing procedures, so that the rationale for these procedures is fully understood by those responsible for data collection.

Data collection staff will be encouraged, where possible, to avoid reading all questions aloud to groups of respondents, to reduce the chance that adolescents will see each other’s survey responses. Completed instruments will be sealed in envelopes, and project staff will not unseal envelopes containing completed surveys in the presence of respondents. AFL Prevention project staff with access to identifying information will never view responses about respondents’ sexual activity, so that mandatory reporting requirements in their state are not triggered. The lists of identifiers and identification numbers will be sent to RTI for safekeeping during the cross-site evaluation. Standard procedures will be developed for identification number assignment and linking for the cross-site impact evaluation, with exceptions made if necessary.

Cross-site evaluation baseline data will be collected by Prevention grantees from October 2008 through November 2009.

For the cross-site evaluation, parent consent form return incentives (such as arm bands, pencils, or mirrors) will be provided even if the parent refuses to allow the adolescent to participate. Adolescents will also receive a $10 gift card incentive for baseline data collection because adolescents are a difficult cohort to recruit for a 20-minute survey without a small incentive. The decision to use incentives for this study is based on previous findings in the literature (Abreu & Winters, 1999; Shettle & Mooney, 1999; Singer et al., 1999) and on studies showing that incentives can significantly increase response rates among adolescents. Although these studies differ in other respects that could account for some variability in response rates, incentives of at least $10 were generally associated with higher response rates than no incentive. We expect these modest incentives to enhance survey response rates without biasing responses or coercing respondents to participate, and to improve data validity as adolescents become more engaged in the survey process. Because incentives are geographically and culturally specific, this standardized value will be offered, but individual grantees will determine what is actually provided. A protocol for standardized incentives for the cross-site impact evaluation will be suggested. Additional explanation regarding the use of incentives in this study is provided in Section A9.

Treatment and comparison group adolescents who completed baseline surveys will be surveyed again approximately 1 and 2 years after baseline (from March 2009 through November 2011). A potential threat to the external validity of the proposed longitudinal design is loss to follow-up, or attrition (Biglan et al., 1991); that is, the results of the evaluation may differ between subjects who remain in the study after baseline and those who do not. Potential attrition is an important consideration in the selection of adolescents, particularly because grantees frequently recruit clients in areas with high levels of transience and hard-to-reach populations (such as low-income families without telephones). RTI’s experience suggests that by using mail surveys and tracing and locating services and by obtaining extensive locating information from participants at baseline (e.g., cell phone, e-mail, and contact information for family or friends), it is likely that at least 80% of respondents who completed baseline interviews can be successfully surveyed at follow-up.

All questionnaire hard copies and electronic data will be stored in a secure area designated by the site IRB. AFL project staff will store completed parent consent and adolescent consent/assent forms in separate locked filing cabinets. Completed Prevention instruments for the cross-site evaluation will be sent via Federal Express to the RTI project director within 1 business day of survey administration, marked as confidential and at no expense to participating demonstration projects. No respondent names will be included in the Federal Express package of completed instruments. Assent/consent forms and completed surveys must be shipped to RTI separately and on different days. RTI will be notified and provided a tracking number for each shipment. If shipments do not arrive as scheduled, tracing will immediately be initiated through Federal Express. This process will be monitored, and feedback will be provided to AFL project staff throughout the data collection period. If needed, AFL project staff may be re-trained on mailing procedures.

3. Methods to Maximize Response Rates and Deal with Nonresponse

The following procedures will be used to maximize cooperation and to achieve the desired high response rates for the cross-site evaluation:

  • A $10 gift card will be offered to participants who complete the baseline survey. An additional $10 gift card will be offered to participants who complete each follow-up survey, at the end of the first school year and at the end of the second school year.

  • An attempt will be made to locate participants who leave the study before the end of the cross-site evaluation. These efforts will include mailing refusal conversion materials designed to persuade participants to complete the study. In addition to mailed refusal conversion materials, RTI may also conduct refusal conversion by contacting each attriting participant via telephone.

  • RTI and AFL grantees will provide a toll-free telephone number to all sampled individuals and invite them to call with any questions or concerns about any aspect of the study.

  • AFL grantee data collection staff will work with RTI project staff to address concerns that may arise.

4. Tests of Procedures or Methods to be Undertaken

RTI conducted pilot tests of the core evaluation instruments (OMB 0990-0291) previously approved by OMB with 145 youths in North Carolina. The purpose of the pilot tests was twofold: (1) to assess technical aspects and functionality of the survey instrument and (2) to identify areas of the survey that were unclear or difficult to understand.


Pilot test data collection was conducted from October through December 2007. Eligible participants came from a convenience sample of students aged 9 to 18 in North Carolina who attended schools with low performance in reading and English and who lived in low-income communities. Low performance in reading was measured by the percentage of students at grade level on end-of-grade testing; schools were eligible if 70% or fewer of their students were at grade level for reading. Parents were recruited to give permission for their children to participate in the pilot study through Parent/Teacher Association (PTA) meeting presentations, principal and school involvement, tabling at school events, flyers at libraries, attendance at fall festivals, and word of mouth from parents who had already agreed for their children to participate in the study. To obtain 145 completed surveys, RTI obtained contact information for 188 parents. Parents who expressed interest in having their child(ren) participate in the study received a lead letter from RTI. A screener conducted with parents, or with students aged 18 and older, was used to determine study eligibility. Students self-administered either the baseline or follow-up instrument at local libraries, community facilities, or schools under the supervision of RTI survey administration staff. A total of 72 baseline and 73 follow-up survey instruments were completed, including questions regarding parent-child communication, attitudes and beliefs about abstinence and sexual risks, involvement in positive activities, beliefs about the future, and demographic characteristics. Three participants completed survey instruments in Spanish. Nine participants aged 14 or older also self-administered new items, including questions regarding sexual activity and contraception.


Of the 188 parents contacted by RTI, 3 refused participation, 20 students whose parents agreed to their participation did not attend survey administration, and 2 students were found to be ineligible. An additional 18 parents could not be reached by phone to schedule survey administration. A total of 145 student surveys were completed, for a 77% response rate. Analyses of the pilot test data indicated few significant technical problems with the survey instrument. Many of the respondents in the pilot study put check marks in the response boxes instead of filling them in; RTI has replaced the response boxes with circles to increase the likelihood that responses will be accurately scanned. Many respondents were younger than 13, and some said they skipped questions that referred to “teens” because they did not think such questions applied to them; RTI has changed the term “teens” to “young people” to apply to all youths. Some respondents were unsure about what to answer for their race; RTI has added a response option for “other (describe ______________)” for race. Lastly, a few respondents wrote their names on the front of the surveys, even though RTI asked them not to; RTI has added a note to the front of the survey that clearly asks respondents not to do this.


There were no outlier values, and all response options were labeled correctly. All skip patterns appeared to function correctly except for questions referring to parents: some students responded that they did not have a mother (or father) and then answered questions about that person. RTI has changed the language in the relevant questions to make clear that having a mother (or father) does not necessarily mean living with her (or him), and that not having a mother (or father) means not having one at all. Our findings suggest that there were no logic problems with the survey and that the data were accurately recorded. There were no nonresponse problems with the survey except for a substantial amount of missing data on the question regarding extracurricular activities; RTI has changed this question to an item assessing the overall frequency of participation in extracurricular activities. The average length of the survey was 22 minutes, with a range of 10 to 50 minutes.


Based on the findings of the pilot test, the survey appears to function as intended and is not overly burdensome, sensitive, or difficult to understand. Therefore, few substantive revisions were made to the survey instrument as a result of pilot testing.


5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The agency official responsible for receiving and approving contract deliverables is:

Johanna Nestor
240-453-2808
[email protected]
Office of Population Affairs/DHHS
1101 Wotton Parkway, Suite 700
Rockville, MD 20852

The person who designed the data collection is:

Olivia S. Ashley, Dr.P.H.
919-541-6427
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

The person who will collect the data is:

Karen Morgan, Ph.D.
919-485-7779
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

The persons who will analyze the data are:

Georgiy Bobashev, Ph.D.
919-541-6167
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Michael Penne, M.S.
919-541-5988
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Marni Kan, Ph.D.
919-485-2756
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

References

Abma, J., Martinez, G., Mosher, W., & Dawson, B. (2004). Teenagers in the United States: Sexual activity, contraceptive use, and childbearing, 2002. Vital and Health Statistics, Series 23, No 24. Hyattsville, MD: National Center for Health Statistics.

Abreu, D. A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the survey of income and program participation. In Proceedings of the Survey Research Methods Section of the American Statistical Association. http://www.amstat.org/sections/SRMS/proceedings/. Last updated on May 24, 2007.

Albert, B., Lippman, L., Franzetta, K., Ikramullah, E., Keith, J. D., Shwalb, R., et al. (2005). Freeze frame: A snapshot of America’s teens. Washington, DC: National Campaign to Prevent Teen Pregnancy.

Amin, R., & Sato, T. (2004). Impact of a school-based comprehensive program for pregnant teens on their contraceptive use, future contraceptive intention, and desire for more children. Journal of Community Health Nursing, 21, 39-47.

Barnet, B., Liu, J., Devoe, M., Alperovitz-Bichell, K., & Duggan, A. K. (2007). Home visiting for adolescent mothers: Effects on parenting, maternal life course, and primary care linkage. Annals of Family Medicine, 5, 224-232.

Biglan, A., Hood, D., Brozovsky, P., Ochs, L., Ary, D., & Black, C. (1991). Subject attrition in prevention research. In. W. Bukoski, & K. Leukefeld (Eds.), Drug abuse prevention research: Methodological issues. NIDA Research Monograph (Vol. 107, pp. 213-223). Rockville, MD: National Institute on Drug Abuse.

Black, M., Bentley, M. E., Papas, M. A., Oberlander, S. A., Teti, L. O., McNary, S., Le, K., & O’Connell, M. (2006). Delaying second births among adolescent mothers: A randomized, controlled trial of a home-based mentoring program. Pediatrics, 18, 2005-2318.

Blake, S. M., Simkin, L., Ledsky, R., Perkins, C., & Calabrese, J. M. (2001). Effects of a parent-child communications intervention on young adolescents’ risk for early onset of sexual intercourse. Family Planning Perspectives, 33, 52-61.

Blinn-Pike, L., Berger, T., & Rea-Holloway, M. (2000). Conducting adolescent sexuality research in schools: Lessons learned. Family Planning Perspectives, 32, 246-251, 265.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Erlbaum.

DeCoster, J. (2004). Meta-analysis. In K. Kempf-Leonard (Ed.), The encyclopedia of social measurement (pp. 1-19). San Diego, CA: Academic Press.

Doniger, A. S., Riley, J. S., Utter, C. A., & Adams, E. (2001). Impact evaluation of the “Not Me, Not Now” abstinence-oriented adolescent pregnancy prevention communications program, Monroe County, NY. Journal of Health Communication, 6, 45-60.

Eaton, D. K., Kann, L., Kinchen, S., Ross, J., Harris, W. A., Lowry, R., McManus, T., Chyen, D., Shanklin, S., Lim, C., Grunbaum, J. A., & Wechsler, H. (2006). Youth Risk Behavior Surveillance—United States, 2005. Morbidity & Mortality Weekly Report, 55(SS-5), 1-108.

Egger, M., & Smith, G. D. (1997). Meta-analysis: Potential and promise. British Medical Journal, 315, 1371-1374.

Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.

Hedges, L. V., & Vevea, J. L. (1998). Fixed and random effects models in meta-analysis. Psychological Methods, 3, 486-504.

Henry J. Kaiser Family Foundation. (2003). National Survey of Adolescents and Young Adults: Sexual health knowledge, attitudes and experiences. Menlo Park, CA: Author.

Kirby, D. (2002). Do abstinence-only programs delay the initiation of sex among young people and reduce teen pregnancy? Washington, DC: National Campaign to Prevent Teen Pregnancy.

Kirby, D. (2007). Emerging answers 2007: Research findings on programs to reduce teen pregnancy and sexually transmitted diseases. Washington, DC: National Campaign to Prevent Teen and Unplanned Pregnancy.

Kirby, D., Barth, R. P., Leland, N., & Fetro, J. V. (1991). Reducing the risk: Impact of a new curriculum on sexual risk-taking. Family Planning Perspectives, 23, 253-263.

Knight, G. P., Virdin, L. M., & Roosa, M. (1994). Socialization and family correlates of mental health outcomes among Hispanic and Anglo American children: Consideration of cross-ethnic scalar equivalence. Child Development, 65, 212-224.

Krull, J. L., & MacKinnon, D. P. (1999). Multi-level mediation modeling of group-based intervention studies. Evaluation Review, 23, 418-444.

MacKinnon, D. P., Taborga, M. P., & Morgan-Lopez, A. A. (2002). Mediation designs for tobacco prevention research. Drug and Alcohol Dependence, 68, S69-S83.

Marin, B. V., Coyle, K., Gomez, C., Carvajal, S., & Kirby, D. (2000). Older boyfriends and girlfriends increase risk of sexual initiation in young adolescents. Journal of Adolescent Health, 27, 409-418.

Miller, B. C., Norton, M. C., Jenson, G. O., Lee, T. R., Christopherson, C., & King, P. K. (1993). Impact evaluation of FACTS & feelings: A home-based video sex education curriculum. Family Relations, 42, 392-400.

The National Campaign to Prevent Teen Pregnancy. (2003). With one voice 2003: America's adults and teens sound off about teen pregnancy. Washington, DC: The National Campaign to Prevent Teen Pregnancy.

The National Longitudinal Study of Adolescent Health. (1998). Waves I & II, 1994–1996. Chapel Hill, NC: Carolina Population Center, University of North Carolina at Chapel Hill.

O’Rourke, D., Chapa-Resendez, G., Hamilton, L., Lind, K., Owens, L., & Parker, V. (1998). An inquiry into declining RDD response rates part I: Telephone survey practices. Survey Research, 29, 1-16.

Percy, M. S., & McIntyre, L. (2001). Using Touchpoints to promote parental self-competence in low income, minority, pregnant, and parenting teen mothers. Journal of Pediatric Nursing, 16, 180-186.

Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15, 231-250.

Silva, M. (2002). The effectiveness of school-based sex education programs in the promotion of abstinent behavior: A meta-analysis. Health Education Research, 17, 471-481.

Singer, E., Van Hoewyk, J., Gebler, N., Raghunathan, T., & McGonagle, K. (1999). The effect of incentives in interviewer-mediated surveys. Journal of Official Statistics, 15, 217-230.

Singleton, R., & Straits, B. C. (1999). Approaches to social research. New York: Oxford University Press.

Thomas, D. V., & Looney, S. W. (2004). Effectiveness of a comprehensive psychoeducational intervention with pregnant and parenting adolescents: A pilot study. Journal of Child and Adolescent Psychiatric Nursing, 17, 66-77.

Trenholm, C., Devaney, B., Fortson, K., Quay, L., Wheeler, J., & Clark, M. (2007). Impacts of four Title V, Section 510 abstinence education programs, final report. Princeton, NJ: Mathematica Policy Research, Inc.


U.S. Government Accountability Office. (2006). Abstinence education: Efforts to assess the accuracy and effectiveness of federally funded programs. Washington, DC: Author.

Weed, S. (2004). Choosing the best research results: Executive summary. Washington, DC: U.S. Department of Health and Human Services.

The White House. (2005). Program Assessment Rating Tool: 2006 budget. http://www.whitehouse.gov/omb/budget/fy2006/sheets/part.xls. Last updated July 23, 2005.

Public Law 98-512, 42 U.S.C. 300z-2, as amended.

Appendix A


Statute/Regulation Mandating or Authorizing the Collection of Information

TITLE XX -- ADOLESCENT FAMILY LIFE DEMONSTRATION PROJECTS


§2001. [ 300z] Findings and purposes

(a) The Congress finds that -

(1) in 1978, an estimated one million one hundred thousand teenagers became pregnant, more than five hundred thousand teenagers carried their babies to term, and over one-half of the babies born to such teenagers were born out of wedlock;

(2) adolescents aged seventeen and younger accounted for more than one-half of the out of wedlock births to teenagers;

(3) in a high proportion of cases, the pregnant adolescent is herself the product of an unmarried parenthood during adolescence and is continuing the pattern in her own lifestyle;

(4) it is estimated that approximately 80 per centum of unmarried teenagers who carry their pregnancies to term live with their families before and during their pregnancy and remain with their families after the birth of the child;

(5) pregnancy and childbirth among unmarried adolescents, particularly young adolescents, often results in severe adverse health, social, and economic consequences including: a higher percentage of pregnancy and childbirth complications; a higher incidence of low birth weight babies; a higher infant mortality and morbidity; a greater likelihood that an adolescent marriage will end in divorce; a decreased likelihood of completing schooling; and higher risks of unemployment and welfare dependency; and therefore, education, training, and job research services are important for adolescent parents;

(6) (A) adoption is a positive option for unmarried pregnant adolescents who are unwilling or unable to care for their children since adoption is a means of providing permanent families for such children from available approved couples who are unable or have difficulty in conceiving or carrying children of their own to term; and

(B) at present, only 4 per centum of unmarried pregnant adolescents who carry their babies to term enter into an adoption plan or arrange for their babies to be cared for by relatives or friends;

(7) an unmarried adolescent who becomes pregnant once is likely to experience recurrent pregnancies and childbearing, with increased risks;

(8) (A) the problems of adolescent premarital sexual relations, pregnancy, and parenthood are multiple and complex and are frequently associated with or are a cause of other troublesome situations in the family; and

(B) such problems are best approached through a variety of integrated and essential services provided to adolescents and their families by other family members, religious and charitable organizations, voluntary associations, and other groups in the private sector as well as services provided by publicly sponsored initiatives;

(9) a wide array of educational, health, and supportive services are not available to adolescents with such problems or to their families, or when available frequently are fragmented and thus are of limited effectiveness in discouraging adolescent premarital sexual relations and the consequences of such relations;

(10)(A) prevention of adolescent sexual activity and adolescent pregnancy depends primarily upon developing strong family values and close family ties, and since the family is the basic social unit in which the values and attitudes of adolescents concerning sexuality and pregnancy are formed, programs designed to deal with issues of sexuality and pregnancy will be successful to the extent that such programs encourage and sustain the role of the family in dealing with adolescent sexual activity and adolescent pregnancy;

(B) Federal policy therefore should encourage the development of appropriate health, educational, and social services where such services are now lacking or inadequate, and the better coordination of existing services where they are available; and

(C) services encouraged by the Federal Government should promote the involvement of parents with their adolescent children, and should emphasize the provision of support by other family members, religious and charitable organizations, voluntary associations, and other groups in the private sector in order to help adolescents and their families deal with complex issues of adolescent premarital sexual relations and the consequences of such relations; and

(11)(A) there has been limited research concerning the societal causes and consequences of adolescent pregnancy;

(B) there is limited knowledge concerning which means of intervention are effective in mediating or eliminating adolescent premarital sexual relations and adolescent pregnancy; and

(C) it is necessary to expand and strengthen such knowledge in order to develop an array of approaches to solving the problems of adolescent premarital sexual relations and adolescent pregnancy in both urban and rural settings.


(b) Therefore, the purposes of this subchapter are -

(1) to find effective means, within the context of the family, of reaching adolescents before they become sexually active in order to maximize the guidance and support available to adolescents from parents and other family members, and to promote self discipline and other prudent approaches to the problem of adolescent premarital sexual relations, including adolescent pregnancy;

(2) to promote adoption as an alternative for adolescent parents;

(3) to establish innovative, comprehensive, and integrated approaches to the delivery of care services both for pregnant adolescents, with primary emphasis on unmarried adolescents who are seventeen years of age or under, and for adolescent parents, which shall be based upon an assessment of existing programs and, where appropriate, upon efforts to establish better coordination, integration, and linkages among such existing programs in order to -

(A) enable pregnant adolescents to obtain proper care and assist pregnant adolescents and adolescent parents to become productive independent contributors to family and community life; and

(B) assist families of adolescents to understand and resolve the societal causes which are associated with adolescent pregnancy;

(4) to encourage and support research projects and demonstration projects concerning the societal causes and consequences of adolescent premarital sexual relations, contraceptive use, pregnancy, and child rearing;

(5) to support evaluative research to identify effective services which alleviate, eliminate, or resolve any negative consequences of adolescent premarital sexual relations and adolescent childbearing for the parents, the child, and their families; and

(6) to encourage and provide for the dissemination of results, findings, and information from programs and research projects relating to adolescent premarital sexual relations, pregnancy, and parenthood.






Appendix B


Cross-Site Evaluation Data Collection Materials



Prevention Core Baseline Questionnaire (English/Spanish)

Prevention Core Follow Up Questionnaire (English/Spanish)

[INSERT INSTRUMENTS HERE]















































Appendix C


Federal Register Notice to the Public


[INSERT PDF]



Appendix D


RTI Institutional Review Board Approval Notice


[INSERT PDF]


Appendix E


Assurances of Confidentiality and Study Descriptions Provided to Respondents




  • Prevention Survey Parent/Guardian Informed Consent for Youths Younger than Age 18

  • Prevention Survey Assent for Youths Younger than Age 18

  • Prevention Survey Consent for Youths Age 18 and Older

  • Youth Assent Script for Prevention Youths Younger Than Age 18

  • Youth Consent Script for Prevention Youths Aged 18 or Older









Adolescent Family Life (AFL) Prevention Survey

Parent/Guardian Informed Consent for Youths Younger than Age 18


Protocol Title: AFL Prevention Core Evaluation Instruments

Sponsor: Department of Health and Human Services

Office of Population Affairs

Office of Adolescent Pregnancy Programs


AFL Prevention Program Director: NAME


Introduction

We are inviting your child to be part of a research study to evaluate [PROGRAM NAME] as part of our involvement with the Office of Population Affairs, Adolescent Family Life Prevention Program. Your child was selected because of his/her involvement with [PROGRAM NAME]. This information will be used to help improve programs like ours. Before you decide whether you want your child to take part in this study, you need to read this Informed Consent form so that you understand what the study is about and what your child will be asked to do. This form also tells you who can be in the study, the risks and benefits of the study, how we will protect your information, and who you can call if you have questions. Please call Dr. Olivia Ashley, the researcher responsible for this study, at (800) 334-8571 ext. 6427 (a toll-free number) about anything you don’t understand before you make your decision.


Purpose

This study, sponsored by the Office of Population Affairs (OPA), Department of Health and Human Services (DHHS), is being conducted by RTI International, a research organization located in North Carolina. This national study will involve more than 2,600 youths. The purpose of this national study is to learn about youths who are served by programs like [PROGRAM NAME].


Procedures

If you agree to let your child participate, he or she will be asked to complete a questionnaire. The questions ask about things like their future goals; relationships with friends and family; feelings about marriage and sex; sexual activity; method(s) to prevent pregnancy and sexually transmitted diseases; and tobacco, alcohol, or other drug use. Youths don’t have to use tobacco, alcohol, or other drugs or be sexually active to be in the study. Most of the questions are multiple choice. This is not a test. There are no right or wrong answers. If your child prefers, we can read the questions to him or her.


Study Duration

Completing the questionnaire will take about 20 minutes of your child’s time. There will be two additional surveys, conducted at the end of this school year and at the end of next school year. Each additional survey will take about the same amount of time to complete.


Possible Risks or Discomforts

Some of the questions may seem personal or make your child feel uncomfortable. There is a risk that your child’s answers could be seen by someone other than the project staff, which could create problems among friends or teachers, but we promise to do our best to keep this from happening.


In addition to the risks and discomforts listed here, there may be uncommon or previously unknown risks. You should report any problems to Dr. Olivia Ashley at (800) 334-8571 ext. 6427 (a toll-free number).


Benefits

Your Benefits

There are no direct benefits to your child from participating in this study. However, the survey could help service providers learn about ways to improve your child’s services.


Benefits for Other People

We hope that this research will help us understand more about how to improve programs like [PROGRAM NAME].


Payment for Participation

Your child will receive a $10.00 gift card for trying any part of this questionnaire, and a $10.00 gift card for each follow-up survey at the end of this school year and at the end of next school year.


Privacy

All the questionnaire answers will be kept private. We will not allow anyone outside the program evaluation staff to know which answers are your child’s, except when required by law. There are two exceptions: 1) if your child reveals that he or she is a danger to self or others, or 2) if he or she reveals abuse or neglect committed against a child. In either of these cases, we must report it to the appropriate authorities. This includes suspected abuse or neglect of your child or suspected abuse or neglect of a friend of your child. We may want to share the results of the study with other people who worked on the survey and the funding agency, but no names will be included. Your child’s name will be replaced with a number for the purposes of this study. After all surveys are completed, a summary will be written that contains information from all participants, but no names. It will not be possible to determine who wrote what on a questionnaire.


The Institutional Review Board (IRB) at RTI has reviewed this research. An IRB is a group of people who are responsible for assuring that the rights of participants in research are protected. The IRB may review the records of your child’s participation in this research to assure that proper procedures were followed. A representative of the IRB may contact you for information about your child’s experience with this research. This representative will be given your name, but will not be given any of your child’s private study data. If you wish, you may refuse to answer any questions this person may ask.


Future Contacts

If your child participates in this study, we will contact him or her to participate in the follow-up surveys at the end of this school year and at the end of next school year. If your child does not participate, we will not contact him or her in the future. As part of your child’s participation in [PROGRAM NAME], your child may also be contacted by [PROGRAM NAME], but not about this study.


Your Rights

Your child’s decision to take part in this research study is completely voluntary. You do not have to agree to allow your child to take the questionnaire in order for him or her to get services here or anywhere else. Your child will also be asked if he or she is willing to voluntarily participate in the study. In order for your child to complete the questionnaire, BOTH you and your child must agree to participation. If your child does participate in the study, he or she can skip any questions. If your child feels like the questionnaire is taking too long, gets tired, or for any other reason wants to stop, he or she may do so at any time.


Your Questions

If you have any questions about this study, you can contact the AFL Program Project Director, [PROGRAM DIRECTOR], at [PROGRAM NAME] at [LOCAL NUMBER] or Dr. Olivia Ashley at RTI at (800) 334-8571 ext. 6427 (a toll-free number). If you have any questions about protecting your privacy on this survey, please call [LOCAL IRB LIAISON NAME] at [LOCAL NUMBER]. If you have any questions about your rights as a study participant, you may call RTI’s Office of Research Protection at 1-866-214-2043 (a toll-free number).
































RTI ID:








COMPLETE AND RETURN THIS FORM TO [NAME OF AFL PROGRAM DATA COLLECTOR].


Please read the information below and check one box. Please sign and return this consent form by __________.

[PLEASE PRINT] Child's name: ___________________________________


I have read this form and understand it.


[ ] I GIVE PERMISSION for my child to take part in surveys for this study.

[ ] I DO NOT GIVE PERMISSION for my child to take part in surveys for this study.


[PLEASE PRINT] Parent/Guardian name: ___________________________________


Parent/Guardian signature: ___________________________________


Date: ___________________________________


KEEP THE FIRST THREE PAGES OF THIS CONSENT FORM.




Adolescent Family Life (AFL) Prevention Survey

Assent for Youths Younger than Age 18


Protocol Title: AFL Prevention Core Evaluation Instruments

Sponsor: Department of Health and Human Services

Office of Population Affairs

Office of Adolescent Pregnancy Programs


AFL Prevention Program Director: NAME


Introduction

We are inviting you to be part of a research study about [PROGRAM NAME]. You were chosen because you are part of [PROGRAM NAME]. This information will be used to help improve programs like ours. Before you decide whether you want to take part in this study, you need to read this form so that you know what the study is about and what you will be asked to do. This form also tells you who can be in the study, the risks and benefits of the study, how we will protect your privacy, and who you can call if you have questions. Please call Dr. Olivia Ashley, who is in charge of this study, at (800) 334-8571 ext. 6427 (a toll-free number) about anything you want to ask before you decide.


Purpose

This study is sponsored by the Office of Population Affairs (OPA), Department of Health and Human Services (DHHS). It is being done by RTI International, a research firm in North Carolina. This national study will involve more than 2,600 youths. The purpose of the national study is to learn about young people served by programs like [PROGRAM NAME].


Procedures

If you agree to take part, you will be asked to take a survey. The questions ask about things like your future goals; relationships with friends and family; feelings about marriage and sex; sexual activity; method(s) to prevent pregnancy and sexually transmitted diseases; and tobacco, alcohol, or other drug use. You don’t have to use tobacco, alcohol, or other drugs or be sexually active to be in the study. Most of the questions are multiple choice. This is not a test. There are no right or wrong answers. If you prefer, we can read the questions to you.


Study Duration


Completing the questionnaire will take about 20 minutes. There will be two additional surveys, conducted at the end of this school year and at the end of next school year. Each additional survey will take about the same amount of time to complete.


Risks or Discomforts That May Happen

Some of the questions may seem personal or bother you. There is a risk that your answers could be seen by someone other than the project staff. This could create problems among friends or teachers, but we promise to do our best to prevent this.


In addition to these risks and discomforts, other risks may happen that are not common or that we don’t expect. You should report any problems to Dr. Olivia Ashley at (800) 334-8571 ext. 6427 (a toll-free number).


Benefits

Your Benefits


There are no direct benefits to you from taking part in this study. However, the survey could help staff learn about ways to improve your services.


Benefits for Other People


We hope that this research will help us understand how to improve programs like [PROGRAM NAME].


Payment for Taking Part

You will receive a $10.00 gift card for trying any part of the survey. You will also receive a $10.00 gift card for the survey that you will take at the end of this school year and at the end of next school year.


Privacy

All the survey answers are private. We will not allow people outside the study staff to know which answers are yours, except when required by law. There are two reasons we would do this: 1) if you reveal that you are a danger to yourself or others or 2) if you reveal that a child is being hurt or not taken care of. In either of these cases, we must report it to the authorities. This includes if you are being hurt or not taken care of, or if a friend of yours is being hurt or not taken care of. We will combine your answers with those of other young people. We may share these results with other people who worked on the survey and the funding group, but we will not share names. Your name will be replaced with a number for the purposes of this study. After all surveys are done, we will write a summary that contains answers from all young people. The staff doing the study will not use your name in the report and will keep your answers private. Readers will not be able to tell who wrote what on a survey.


The Institutional Review Board (IRB) at RTI has reviewed this research. An IRB is a group of people who must make sure that the rights of people who take part in research are protected. The IRB may review the records of your taking part in this research to make sure that proper rules were followed.


Future Contacts


We will contact you to take the next survey at the end of this school year and at the end of the next school year.


Your Rights


Taking part in this research study is your choice. You do not have to take the survey in order to get services here or anywhere else. In order for you to take the survey, BOTH you and your parent/guardian must agree that you can take part. If you do take part in the study, you can skip any questions. If you feel like the survey is taking too long, you are getting tired, or for any other reason you want to stop, you may do so at any time.


Your Questions

If you have any questions about this study, you can contact the AFL Program Project Director, [PROGRAM DIRECTOR], at [PROGRAM NAME] at [LOCAL NUMBER] or Dr. Olivia Ashley at RTI at (800) 334-8571 ext. 6427 (a toll-free number). If you have any questions about your privacy on this survey, please call [LOCAL IRB LIAISON NAME] at [LOCAL NUMBER]. If you have any questions about your rights as a person taking part in the study, you may call RTI’s Office of Research Protection at 1-866-214-2043 (a toll-free number).



KEEP PAGES 1-3.





RTI ID:









FILL OUT AND GIVE THIS PAGE TO [Name of AFL Program Data Collector].



By signing this form, you are letting us know that you have read it, have had your questions answered, and have freely decided to try this survey. Signing this form will not affect your receiving services here or anywhere else.



_______________________ _________

Youth’s Signature Date


_______________________ __________ __________________________ ______

Witness Signature Date Witness Printed Name Date



Adolescent Family Life (AFL) Prevention Survey

Consent for Youths Age 18 and Older


Protocol Title: AFL Prevention Core Evaluation Instruments

Sponsor: Department of Health and Human Services

Office of Population Affairs

Office of Adolescent Pregnancy Programs


AFL Prevention Program Director: NAME


Introduction

We are inviting you to be part of a research study to evaluate [PROGRAM NAME] as part of our involvement with the Office of Population Affairs, Adolescent Family Life Prevention Program. You were chosen because you are part of [PROGRAM NAME]. This information will be used to help improve programs like ours. Before you decide whether you want to take part in this study, you need to read this Informed Consent form so that you understand what the study is about and what you will be asked to do. This form also tells you who can be in the study, the risks and benefits of the study, how we will protect your information, and who you can call if you have questions. Please call Dr. Olivia Ashley, the researcher responsible for this study, at (800) 334-8571 ext. 6427 (a toll-free number) about anything you don’t understand before you make your decision.


Purpose

This study, sponsored by the Office of Population Affairs (OPA), Department of Health and Human Services (DHHS), is being conducted by RTI International, a research organization located in North Carolina. This national study will involve more than 2,600 youths. The purpose of the national study is to learn about young people who are served by programs like [PROGRAM NAME].


Procedures

If you agree to participate, you will be asked to complete a questionnaire. The questions ask about things like your future goals; relationships with friends and family; feelings about marriage and sex; sexual activity; method(s) to prevent pregnancy and sexually transmitted diseases; and tobacco, alcohol, or other drug use. You don’t have to use tobacco, alcohol, or other drugs or be sexually active to be in the study. Most of the questions are multiple choice. This is not a test. There are no right or wrong answers. If you prefer, we can read the questions to you.


Study Duration


Completing the questionnaire will take about 20 minutes. There will be two additional surveys, conducted at the end of this school year and at the end of next school year. Each additional survey will take about the same amount of time to complete.


Possible Risks or Discomforts

Some of the questions may seem personal or make you feel uncomfortable. There is a risk that your answers could be seen by someone other than the project staff, which could create problems with friends or teachers, but we promise to do our best to keep this from happening.


In addition to the risks and discomforts listed here, there may be uncommon or previously unknown risks. You should report any problems to Dr. Olivia Ashley at (800) 334-8571 ext. 6427 (a toll-free number).


Benefits

Your Benefits


There are no direct benefits to you from participating in this study. However, the survey could help service providers learn about ways to improve your services.


Benefits for Other People


We hope that this research will help us understand how to improve programs like [PROGRAM NAME].


Payment for Participation

You will receive a $10.00 gift card for trying any part of the survey. You will also receive a $10.00 gift card for each of the surveys that you will take at the end of this school year and at the end of next school year.

Privacy

All the questionnaire answers will be kept private. We will not let anyone outside the program evaluation staff know which answers are yours except when required by law. There are two exceptions: 1) if you reveal that you are a danger to yourself or others, or 2) if you reveal that a child is being hurt or not taken care of. In either of these cases, we must report it to the appropriate authorities. This includes if a friend of yours is being hurt or not taken care of. We may share the results of the study with other people who worked on the survey and the funding agency, but no names will be included. Your name will be replaced with a number for the purposes of this study. After all surveys are completed, a summary will be written that contains information from all participants. The staff doing the study will not use your name in the report and will keep your answers private. It will not be possible to determine who wrote what on a questionnaire.


The Institutional Review Board (IRB) at RTI has reviewed this research. An IRB is a group of people who are responsible for assuring that the rights of participants in research are protected. The IRB may review the records of your participation in this research to assure that proper procedures were followed. A representative of the IRB may contact you for information about your experience with this research. This representative will be given your name, but will not be given any of your private study data. If you wish, you may refuse to answer any questions this person may ask.


Future Contacts


We will contact you to take the next survey at the end of this school year and at the end of the next school year.


Your Rights


Taking part in this research study is your choice. You do not have to agree to take the questionnaire in order to get services here or anywhere else. If you do participate in the study, you can skip any questions. If the questionnaire is taking too long, you are getting tired, or you want to stop for any other reason, you may do so at any time.


Your Questions

If you have any questions about this study, you can contact the AFL Program Project Director, [PROGRAM DIRECTOR], at [PROGRAM NAME] at [LOCAL NUMBER] or Dr. Olivia Ashley at RTI at (800) 334-8571 ext. 6427 (a toll-free number). If you have any questions about protecting your privacy on this survey, please call [LOCAL IRB LIAISON NAME] at [LOCAL NUMBER]. If you have any questions about your rights as a study participant, you may call RTI’s Office of Research Protection at 1-866-214-2043 (a toll-free number).


























RTI ID:









By signing this form, you are letting us know that you have read it, received answers to your questions, and freely decided to try this survey. Signing this form will not affect your receiving services here or anywhere else. Please keep pages 1-3 for your records and return this last page to AFL staff.



_______________________ _________ _________________________ ______

Youth’s Signature Date Youth’s Printed Name Date


_______________________ __________ __________________________ ______

Witness Signature Date Witness Printed Name Date


Youth Assent Script for Prevention Youths Younger Than Age 18


[To be read to youths by survey administrators during youth assent form distribution]

We’re inviting you to be in a research study to evaluate [PROGRAM NAME] as part of your involvement with the Office of Population Affairs, Adolescent Family Life Prevention Program. You were chosen because you are a part of [PROGRAM NAME]. There is a toll-free phone number on the form for Dr. Ashley, who leads the study at RTI in North Carolina, that you can call with any questions.


The study is sponsored by the federal government and is conducted by RTI in North Carolina. The study is to learn about youths who are served by programs like [PROGRAM NAME].


The first survey will take place this fall. There will be two more surveys at the end of this school year and at the end of next school year. If you agree, we will ask you to complete a survey. The form tells you what the questions are about: things like your future goals; relationships with friends and family; feelings about marriage and sex; sexual activity; method(s) to prevent pregnancy and sexually transmitted diseases; and tobacco, alcohol, or other drug use.


Each survey will take about 20 minutes.


Some of the questions might be personal or make you uncomfortable. If anyone who doesn’t work on the study saw your answers, it might create problems for you, so we are going to try very hard to protect your privacy. Dr. Ashley’s toll-free phone number is listed again for you to call if you have any problems.


The study results won’t help you directly but could help service providers learn how to give better services.


We hope that this research will help us understand how to improve programs like [PROGRAM NAME].


We will give you a $10 gift card for each survey if you try to answer any of the questions.


All of your answers will be kept private. We will not let anyone outside the study know your answers except if the law makes us. There are two reasons we would have to do this: 1) if you say you are a danger to yourself or others, or 2) if you say that a child is being hurt or is not being taken care of. In either of these cases, we would have to report it to the authorities. This includes if you are being hurt or not being taken care of, or if a friend of yours is being hurt or not being taken care of. Your answers will be combined with other answers, but we will replace your name with a number. So when a report is written, it will contain information from everyone who took the survey but no names.


There is a group of people at RTI who have reviewed our privacy procedures. This group might review our records about your taking the survey and may be given your name (but not your answers).


[We will contact you to take the next survey at the end of this school year and at the end of the next school year.]


Taking this survey is completely your choice. Your parent has already said that it is okay for you to take the survey. If you do try the survey, you can skip any questions or you can stop at any time.


There are phone numbers you can call with questions about the study or about your privacy and rights.


Keep the copy of this form that explains all of this.


So if you sign the last page of this form, you are saying you’ve read the form, got all your questions answered, and are deciding to try the survey. Signing does not affect your legal rights.

Youth Consent Script for Prevention Youths Aged 18 or Older



[To be read to youths by survey administrators during youth consent form distribution]



We’re inviting you to be in a research study to evaluate [PROGRAM NAME] as part of your involvement with the Office of Population Affairs, Adolescent Family Life Prevention Program. You were chosen because you are a part of [PROGRAM NAME]. There is a toll-free phone number on the form for Dr. Ashley, who leads the study at RTI in North Carolina, that you can call with any questions.


The study is sponsored by the federal government and is conducted by RTI in North Carolina. The study is to learn about youths who are served by programs like [PROGRAM NAME].


The first survey will take place this fall. There will be two more surveys at the end of this school year and at the end of next school year. If you agree, we will ask you to complete a survey. The form tells you what the questions are about: things like your future goals; relationships with friends and family; feelings about marriage and sex; sexual activity; method(s) to prevent pregnancy and sexually transmitted diseases; and tobacco, alcohol, or other drug use.


Each survey will take about 20 minutes.


Some of the questions might be personal or make you uncomfortable. If anyone who doesn’t work on the study saw your answers, it might create problems for you, so we are going to try very hard to protect your privacy. Dr. Ashley’s toll-free phone number is listed again for you to call if you have any problems.


The study results won’t help you directly but could help service providers learn how to give better services.


We hope that this research will help us understand how to improve programs like [PROGRAM NAME].


We will give you a $10 gift card for each survey if you try to answer any of the questions.


All of your answers will be kept private. We will not let anyone outside the study know your answers except if the law makes us. There are two reasons we would have to do this: 1) if you say you are a danger to yourself or others, or 2) if you say that a child is being hurt or is not being taken care of. In either of these cases, we would have to report it to the authorities. This includes if a friend of yours is being hurt or not being taken care of. Your answers will be combined with other answers, but we will replace your name with a number. So when a report is written, it will contain information from everyone who took the survey but no names.


There is a group of people at RTI who have reviewed our privacy procedures. This group might review our records about your taking the survey, may be given your name (but not your answers), and may contact you to ask you about how things went, but you don’t have to answer any of their questions if you don’t want to. It’s up to you.


[We will contact you to take the next survey at the end of this school year and at the end of the next school year.]


Taking this survey is completely your choice. If you do try the survey, you can skip any questions or you can stop at any time.


There are phone numbers you can call with questions about the study or about your privacy and rights.


I’ll make a copy of this form for you to keep.


So if you sign, you are saying you’ve read the form, got all your questions answered, and are deciding to try the survey. Signing does not affect your legal rights.


Appendix F


Recruiting Materials






































OPA Lead Letter














































[OPA LETTERHEAD]



TO: [AFL PROGRAM DIRECTOR]


FROM: Johanna Nestor


CC: Olivia Silber Ashley, RTI International


DATE: [DATE]


SUBJECT: Evaluating the Title XX Adolescent Family Life (AFL) Program


The Office of Population Affairs (OPA) has contracted with RTI International, a not-for-profit organization in Durham, NC, to design a cross-site evaluation of the AFL program. We have selected your project to participate in the cross-site evaluation. Participating in the cross-site evaluation is a condition of your grant funding.

Baseline data collection for the cross-site evaluation will begin in October 2008. RTI will contact you to begin the process of obtaining local Institutional Review Board (IRB) approval for this data collection.

If you have any questions about your participation in the cross-site evaluation, please contact RTI’s Project Director, Dr. Olivia Ashley, at (800)334-8571 ext. 6427 or [email protected] or me at (240) 453-2808 or [email protected].

Thank you for your help as we learn about the impacts of the AFL demonstration projects.

Parent Lead Letters

Youth (aged 18 or older) Lead Letters

Prevention Parent Lead Letter

(PREVENTION PROGRAM LETTERHEAD)


[DATE]


(PARENT NAME)

(PARENT ADDRESS)


Dear (PARENT NAME):


This letter is to invite (YOUTH NAME) to participate in a research study being conducted by RTI International, a not-for-profit research organization in Durham, North Carolina. (YOUTH NAME) was selected for this study because of his/her participation in (PREVENTION PROGRAM NAME). RTI is conducting a national study funded by the Office of Population Affairs in the U.S. Department of Health and Human Services about youths served by programs like (PREVENTION PROGRAM NAME). This national study will involve more than 2,600 youths.


If you and (YOUTH NAME) agree, we would like for (YOUTH NAME) to complete three written questionnaires—one at the beginning of this school year, one at the end of this school year, and one at the end of next school year. Each survey should take about 20 minutes of (YOUTH NAME)'s time.


The questions ask about things like their future goals; relationships with friends and family; feelings about marriage and sex; intentions to have sex; sexual behaviors; method(s) to prevent pregnancy and sexually transmitted diseases; and tobacco, alcohol, or other drug use. Youths don’t have to use tobacco, alcohol, or other drugs or be sexually active to be in the study.


After each questionnaire is completed, we will give a $10 gift card to (YOUTH NAME).


If you have any questions about the current study, please contact me at (LOCAL NUMBER) or the RTI Project Director, Dr. Olivia Ashley, toll-free at (800) 334-8571 ext. 6427. If you have questions about your rights as a study participant, please call (LOCAL IRB) at (LOCAL NUMBER) or RTI's Office of Research Protection toll-free at (866) 214-2043.


Thank you for considering this request.


Sincerely,




(AFL PREVENTION PROGRAM DIRECTOR)

Prevention Youth (18 or older) Lead Letter

(PREVENTION PROGRAM LETTERHEAD)


[DATE]


(YOUTH NAME)

(YOUTH ADDRESS)


Dear (YOUTH NAME):


This letter is to ask you to be part of a research study done by RTI International, a not-for-profit research firm in Durham, North Carolina. We asked you to be in the study because you participate in (PREVENTION PROGRAM NAME). This is a national study paid for by the Office of Population Affairs in the U.S. Department of Health and Human Services about youths served by programs like (PREVENTION PROGRAM NAME). This national study will involve over 2,600 youths.


If you agree to be in the study, we would like you to fill out a survey. It should take about 20 minutes of your time. There will be two additional surveys, conducted at the end of this school year and at the end of next school year. Each additional survey will take about the same amount of time to complete.


The survey asks about things like your future goals; relationships with your friends and family; feelings about marriage and sex; intentions to have sex; sexual behaviors; method(s) to prevent pregnancy and sexually transmitted diseases; and tobacco, alcohol, or other drug use. You do not have to use tobacco, alcohol, or other drugs or be sexually active to be in the study. After the survey is finished, we will give you a $10 gift card.


If you have any questions about the study, please call me at (LOCAL NUMBER) or the RTI Project Director, Dr. Olivia Ashley, toll-free at (800) 334-8571 ext. 6427. If you have questions about your rights as a study participant, please call (LOCAL IRB) at (LOCAL NUMBER) or RTI's Office of Research Protection toll-free at (866) 214-2043.


Thank you,




(AFL PREVENTION PROGRAM DIRECTOR)








Appendix G


Cross-Site Evaluation Study Protocol






































1) Baseline Survey (fall of first school year)

  • Participants assigned to treatment and comparison conditions within AFL projects

  • All participants complete baseline survey

  • Treatment group participants will receive services such as abstinence education programs, parent-youth workshops, parent programs, social marketing campaigns, youth development activities, mentoring, community workshops, and academic assistance

  • Comparison group participants will receive either another abstinence education program or no program

2) 1st Follow-Up Survey (spring of first school year or fall of second school year)

  • All participants (treatment and comparison conditions) complete 1st follow-up survey


3) 2nd Follow-Up Survey (spring of second school year or fall of third school year)

  • All participants (treatment and comparison conditions) complete 2nd follow-up survey
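
The outline above summarizes the three-wave, treatment-versus-comparison design in plain terms. Purely as an illustrative sketch (not part of the approved instruments or the evaluation's specified analysis plan), the example below shows one way wave-level survey records from such a design could be organized and contrasted between conditions. All data values and column names (rti_id, condition, wave, abstinence_intent) are hypothetical assumptions for illustration only.

# Illustrative sketch only: hypothetical wave-level records for the
# baseline / 1st follow-up / 2nd follow-up design described above.
# Column names and values are assumptions, not actual study variables.
import pandas as pd

records = pd.DataFrame(
    {
        "rti_id": [101, 101, 101, 102, 102, 102],              # study IDs (names replaced by numbers)
        "condition": ["treatment"] * 3 + ["comparison"] * 3,    # assigned condition within an AFL project
        "wave": ["baseline", "followup1", "followup2"] * 2,     # the three survey waves
        "abstinence_intent": [3.0, 3.5, 4.0, 3.1, 3.0, 3.2],    # made-up outcome scores
    }
)

# Reshape to wide format so each participant's baseline and follow-up scores line up.
wide = records.pivot_table(
    index=["rti_id", "condition"], columns="wave", values="abstinence_intent"
).reset_index()

# Change from baseline to the 2nd follow-up, then a simple treatment-minus-comparison
# contrast of those changes (a basic difference-in-differences style comparison).
wide["change"] = wide["followup2"] - wide["baseline"]
group_means = wide.groupby("condition")["change"].mean()
print(group_means)
print(f"Treatment vs. comparison difference in change: "
      f"{group_means['treatment'] - group_means['comparison']:.2f}")

In practice, the cross-site analyses would work from the full multi-project dataset rather than a toy example like this, but the same idea applies: baseline and follow-up responses are linked by the participant's study number and compared across the treatment and comparison conditions.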



