

Supporting Statement for

Adolescent Family Life (AFL) Care and Prevention Demonstration Project

End of the Year Report Templates



A. Background and Justification

1. Need and Legal Basis

This document provides a Supporting Statement to accompany a request for approval of revisions to the "Adolescent Family Life Care and Prevention Demonstration Project End of the Year Report Templates" (OMB 0990-0299). The End of Year Report Templates (hereafter EOY Report Templates) are used to collect information describing the activities of Care and Prevention demonstration projects funded by the Adolescent Family Life (AFL) Program. The revised templates serve two major purposes: 1) to assess the progress of the AFL grantees and 2) to conduct a process evaluation of AFL demonstration projects.

The Office of Population Affairs (OPA) administers the AFL Program through the Office of Adolescent Pregnancy Programs (OAPP). The AFL Program is authorized under Title XX of the Public Health Service Act (PHSA), which superseded the adolescent pregnancy program originally authorized under Title VI of the PHSA (P.L. 95-626). The Title XX AFL program supports grants for two types of demonstration projects: care and prevention. Care projects develop programs to provide health, education, and social services to pregnant and parenting adolescents, their infants, male partners, and their families. Prevention projects develop and test programs for pre-adolescents, adolescents, and their families to delay the onset of adolescent sexual activity and thus reduce the incidence of pregnancy and sexually transmitted infections. Both Care and Prevention projects are funded across the country in a variety of settings.

The Title XX statute requires an independent evaluation of all demonstration projects funded through the AFL Program. Because these evaluations are independent, the data collected vary from one project to another. Moreover, the independent evaluations do not necessarily examine questions of particular statutory or policy relevance to the OPA. Thus, the OPA has developed end-of-year reporting templates for AFL Prevention and Care demonstration projects that reflect Title XX legislative requirements, as well as the A-H definition of abstinence education contained in the Welfare Reform Act of 1996.

Grants administered by the OAPP fund demonstration projects and are required by statute to have an evaluation component (§2006. [300z-5] (b)(1)). In addition, each grantee that receives funds for a demonstration project must report on its use of Federal funds (§2006. [300z-5] (c)). Since the inception of the program, AFL has requested that grantees write a year-end report describing both the progress of the program and the results of their evaluation.

In May 2005, OMB approved the EOY Report Template as a means of providing direction to the grantees on report content. The template has been very useful in structuring the end-of-year reports. Currently, the reports are used to assess the progress of the AFL grantees and, by extension, help evaluate the effectiveness of the AFL program. Specifically, the EOY reports are scored and provide OAPP with information regarding how well each grantee's program evaluation is used to inform program planning and to corroborate program results.

This submission requests approval for the revised EOY Report Template in order to 1) streamline the report and thereby reduce the burden on the grantees, 2) provide further guidance regarding areas for discussion in the report that may be used proactively by grantees to improve the quality of their evaluations, and 3) systematically collect comparable data on program implementation across all grantees. These revisions include changes to the narrative section and the addition of a brief survey to the enclosures that accompany the narrative. The OAPP estimates that as many as 66 AFL Care and Prevention project directors may complete the revised template annually.

Revisions to the EOY Report Template narrative were designed both to reduce duplication in the information asked of grantees and to provide better guidance on the information that needs to be included in the report. The currently approved EOY Report Template includes both a Program Progress Report and an Evaluation Report, with some sections included in both reports. The revised template consolidates the program and evaluation reports, thus eliminating duplication in reporting. In addition, information that appears in the enclosures and was previously also described in the narrative will now be presented only in tabular form. The streamlined template contains five major sections: Executive Summary, Detailed Description of the Demonstration Model, Grantee's Financial Sustainability, Grants Management Issues, and Evaluation.

The current EOY Report Template includes three enclosures (A: Program Statistics, B: Performance Measures for AFL Care or Prevention Projects, and C: Efficiency Measure for AFL Care or Prevention Demonstration Projects). A fourth enclosure (D) will contain the new process evaluation survey instrument. This survey will provide important descriptive data about program implementation that will be used for the National Evaluation of the AFL program.

2. Purpose and Use of the Information Collection

The existing EOY Report Template assists AFL Care and Prevention grantees in meeting the guidelines and mandates of legislation and AFL policies. It guides grantees in uniformly tracking certain services, demographic data, and program information and in reporting this information in their year-end reports. In turn, this information helps the AFL program assess the effectiveness of its federal grants and helps inform management actions, budget requests, and legislative proposals directed at achieving results (as requested by OMB).

The requested revisions will further these aims in two ways. First, streamlining the report and guiding grantees to provide the most relevant information is expected to strengthen the OPA's ability to adhere to program management standards. Second, the addition of the process evaluation survey will describe in detail the implementation of AFL demonstration projects, which could inform replication of these projects in other settings. The revised EOY Report Template will provide a way to collect quantitative data about characteristics of program implementation across all grantees, and these data can be linked to effectiveness data among grantees participating in the cross-site outcome evaluation.

The existing sections of the EOY Report Template are: a cover page, the program progress report template, the evaluation report template, Enclosure A (Program Statistics), Enclosure B (Performance Measures for AFL Care and Prevention Projects), and Enclosure C (Efficiency Measure for AFL Care and Prevention Demonstration Projects). The revision combines the program progress report and evaluation report templates to reduce duplication in reporting. The sections of the new template are displayed in Exhibit 1, with an asterisk noting the sections that were consolidated to remove duplication.

Exhibit 1. New Sections of Template


I. Executive Summary (Abstract)*

II. Detailed description of the demonstration model for the previous year

  A. Description of program/intervention for the care demonstration project*

  B. Logic Model*

  C. Description of challenges and proposed solutions

  D. Description of any significant changes in the project

  E. Description of the unique features or accomplishments

III. Grantee's financial sustainability

IV. Describe any grants management issues not otherwise addressed

V. Evaluation

  A. Research Objectives and Hypotheses*

  B. Process Evaluation (Aims, Measures, Dosage,* Modifications)

  C. Outcome Evaluation Research Design (Comparison Strategy, Sampling Strategy, Instrumentation, Data Collection, Management Information Systems, Tracking, Data Analysis, Design Limitations, How Data Are Used)

  D. Results (Tables, Findings Related to Questions, Missing Data, Attrition)

  E. Discussion (Interpretation of Data, Issues Affecting Evaluation, Problems with Implementation and Evaluation, Extent to Which Program Reached Objectives, Implications)

  F. Recommendations

  G. Dissemination

In addition, several narrative sections were eliminated because they duplicated the information in the enclosures. The complete new template is found in Appendix A. Another new feature of the template is a set of probes designed to guide grantees in their reporting. An example of the probes is provided in Exhibit 2.

Exhibit 2. Example of Probes

Template Section: Research Objectives and Hypotheses. Describe the outcome-based objectives, with a clear statement of results or benefits expected (or achieved). Objectives should be specific, measurable, achievable, realistic, and time-framed (SMART).

Probes:

  • The questions/hypotheses that the evaluation is addressing are clearly stated.

  • The questions/hypotheses are closely tied to the SMART objectives.

  • The evaluation goals and objectives are aligned with the activities that are being conducted. The outcomes are reasonable, given the level, type, and intensity of the intervention activities.

  • The objectives are written in SMART (Specific, Measurable, Achievable, Realistic, Time-framed) terminology.

  • The endpoints are behavioral, meaningful, and related to the program's theory of change.

The last section of the EOY Report Template will be the new process evaluation survey instrument enclosure. The process evaluation will both describe the programs and serve as a platform for comparisons across grantees as part of a cross-site evaluation of AFL Care demonstration projects conducted by RTI International under a contract from OPA. The process evaluation presents a unique opportunity to understand the implementation of a multi-site funding program aimed at ameliorating the consequences of adolescent childbearing. Despite what is known about characteristics associated with program effectiveness, the knowledge base regarding implementation of programs for pregnant and parenting adolescents is still emerging (Klerman, 2004). Furthermore, characteristics of the implementation of programs for pregnant and parenting adolescents within the AFL Program are unknown. The proposed process evaluation represents an effort to advance the field of implementation research by providing a rich description of AFL Care project goals, activities, and contexts. The process evaluation data will be combined with data from the outcome evaluation of the AFL projects included in the cross-site evaluation in order to identify the characteristics of AFL Care projects that are associated with program effects on intended outcomes. Data from three time points (the end of the 2008–2009, 2009–2010, and 2010–2011 grant years) will be used to support the process evaluation.

This enclosure, which will follow the other three, is intended to describe the program delivery, content, theoretical orientation, empirical basis, use of best practices, organizational context, innovation, fidelity, and dosage of the AFL Care projects. Key research questions for the process evaluation are presented in Exhibit 3. A copy of the data collection instrument is attached in Appendix B.

Exhibit 3. Process Evaluation Research Questions

  1. What project activities are being delivered?

    a. What is the content of project activities?

    b. To what extent are project activities theoretically based?

    c. Are project activities empirically based?

    d. Do projects utilize best practices in delivering project activities? Best practices may include those identified by Hoyer (1998), Klerman (2004), and Kirby (2007).

    e. What are projects doing that is innovative or demonstrative of something new (versus replicating the status quo)?

    f. Are projects implementing activities with fidelity?

    g. To what extent do care projects provide the 10 core service areas?

    h. What level of dosage are participants exposed to?

    i. Do projects address knowledge, attitude, and/or skill changes?

  2. What are the characteristics of the program setting? For example, are projects school-based, clinic-based, community-based, or in the home?

  3. What are the characteristics of the program atmosphere? For instance, what is the level of institutionalization of the project? (Goodman et al., 1993; Steckler & Linnan, 2002)

  4. What are the characteristics of the program delivery staff?





3. Use of Information Technology and Burden Reduction

The revised forms will continue to be available electronically via the Internet as a Microsoft Word document, a WordPerfect document, and a Portable Document Format (PDF) file, and they can be returned to OAPP electronically. Facsimile transmission is another method of submitting the report. The new process evaluation survey will rely on a self-administered instrument, enclosed in the EOY Report Template, to be filled out by grantees.

4. Efforts to Identify Duplication and Use of Similar Information

The revisions to the EOY Report Template narrative are specifically designed to reduce duplication by streamlining the information obtained across the progress and evaluation reports and by clarifying the information requested of the grantees through probes. Although the narrative does ask about the process evaluation, the nature of the questions and the format of the survey added as Enclosure D are different. Moreover, the addition of the process evaluation survey will ensure that these data are reported consistently across all grantees. In designing the proposed data collection activity for the process evaluation, we have taken several steps to ensure that this effort does not duplicate ongoing efforts and that no existing data sets would address the proposed study questions. To ensure that this study is forging new ground in our understanding of the implementation of the AFL Program, we conducted an extensive review of the published and “gray” literature. The results of the literature search and consultation with experts in the field revealed that although some research has been conducted on programs for pregnant and parenting adolescents (e.g., Baytop, 2006; Corcoran & Pillai, 2007; Hoyer, 1998; Seitz & Apfel, 1999), little has been done to conduct a process evaluation in these areas or evaluate the implementation of a program like AFL. Research is beginning to describe the implementation of programs for pregnant and parenting adolescents (Klerman, 2004), but insufficient data are available to provide a detailed picture of these programs’ strategies, approaches, and contexts, and no literature describes the characteristics of AFL Care and Prevention demonstration projects. To date, no duplication of the proposed effort has been identified. Nor did a review of existing data yield anything that could be used to examine the research questions that we have posed.

5. Impact on Small Businesses or Other Small Entities

To the extent that AFL grantees might be considered small businesses or entities, the data to be collected through the revised EOY Report would still need to be collected in some form to satisfy the independent evaluation requirement in the AFL statute. The revised EOY Report Template is more streamlined, resulting in a reduced burden on grantees. The process evaluation instrument (Appendix B) will be a small addition to the EOY Report Template. Thus, there should be only minimal, if any, additional burden on AFL grantees.

6. Consequences of Not Collecting the Information/Collecting Less Frequently

The use of the current EOY Report Template has facilitated the OAPP's ability to effectively monitor and manage the direction of the program as a whole, as well as track the performance measures recommended by the OMB. Reviewing the end-of-year reports prepared using the EOY Report Template has provided important program metrics. However, there remains variability in the quality of the reports, which may be reduced by providing a more user-friendly template that includes more guidance to grantees.

If the process evaluation survey were not included as an enclosure, it would be difficult to amass the data necessary to understand the implementation of the AFL Program and to potentially replicate the projects in other settings. The process evaluation involves three data collection points: a survey instrument enclosed in the EOY Report Template to be submitted by AFL grantees to OPA at the end of the 2008–2009, 2009–2010, and 2010–2011 grant years. Serious consideration has been given to the issue of how frequently to survey respondents for the process evaluation. After consulting with a committee of AFL project staff and young adult clients, an expert workgroup, and OPA staff, it was determined that data collection would need to be sufficiently frequent to capture AFL projects' changes over time. It is important to collect data annually for three years so that projects can document the characteristics of implementation in each year. Less frequent data collection would not capture this information accurately because respondents could not recall detailed information about the characteristics of their programs from prior years.

7. Special Circumstances

There are no special circumstances associated with this information collection.

8. Federal Register Comments and Persons Consulted Outside the Office of Population Affairs

A 60-day notice was published in the Federal Register on May 22, 2009 (Volume 74, Number 98, page 24013), providing a 60-day period for public comments. There were no public comments.

RTI staff were consulted regarding revisions to the EOY Report template. Suggestions were made based on their review of the EOY reports. In particular, the introduction of probes was recommended as a way to guide grantees to produce reports that would better address the progress made on their evaluations.

A list of consultants who provided input regarding the process evaluation is found in Exhibit 4. Consultants contacted included AFL project staff, AFL young adult clients and former clients, and expert researchers with backgrounds in adolescent reproductive health and program evaluation. The information provided in these discussions was extremely helpful in identifying improvements to the process evaluation instrument and guided its development. Input and recommendations were incorporated into the survey and questionnaire design to the extent possible.

Exhibit 4. Persons Consulted Outside the Agency

Expert Work Group

Elaine Borawski, Ph.D., Director, Center for Health Promotion Research

Case Western Reserve University

216.368.1024

[email protected]

Jeff Tanner, Ph.D., Associate Dean

Baylor University

254.710.3485

[email protected]

Claire Brindis, Dr.P.H.

Professor of Pediatrics and Health Policy

Associate Director, Institute for Health Policy Studies

Center for Reproductive Health Research and Policy

University of California at San Francisco

415.476.5255

[email protected]

Lynne Tingle, Ph.D., Assistant Professor

University of North Carolina at Greensboro

336.334.3435

lrtingle@uncg.edu

Douglas Kirby, Ph.D., Senior Research Scientist

ETR Associates

831.438.4060

[email protected]


Gina Wingood, Sc.D., Associate Professor and Director, Behavioral and Social Science Core

Center for AIDS Research

Emory University

404.727.0241

[email protected]

Lisa Lieberman, Ph.D., CHES, President

Healthy Concepts, Inc.

845.638.1619

[email protected]

Meredith Kelsey, Ph.D., Research and Policy Analyst

Division of Children and Youth Policy

Office of the Assistant Secretary for Planning and Evaluation

202-690-6652

[email protected]

Dennis McBride, Ph.D., Associate Director for Research

The Washington Institute for Mental Illness Research and Training

University of Washington

253.756.2335

[email protected]

Lisa Trivits, Ph.D., Research and Policy Analyst
Division of Children and Youth Policy

Office of the Assistant Secretary for Planning and Evaluation

202-205-5750

[email protected]

Amy Ong Tsui, Ph.D., Director and Professor

Bloomberg School of Public Health

Johns Hopkins University

410-955-2232

[email protected]


Staff Committee

Anne Badgley, M.Ed., President and CEO
Heritage Community Services

843-863-0508

[email protected]


David MacPhee, Ph.D., Professor

Human Development & Family Studies

Colorado State University

970-491-5503

[email protected]

Leisa Bishop, Director of Neighborhood Services

BETA Center, Inc., Project FAME

407-277-1942 ext. 134

[email protected]

Janet Mapp, Interim Director of Prevention Services

Switchboard of Miami

305-358-1640

[email protected]


Doreen Brown, Director of Outreach Services

Healthy Connections

479-243-0279

[email protected]

Ruben Martinez, Ph.D., Evaluator

Decisions For Life of Baptist Child and Family Services

210-458-2654

[email protected]; [email protected]

Carl Christopher, Educator

St. Vincent Mercy Family Care Center

419-251-2341

[email protected]


Mary Lou McCloud, Director of Young Parents Support Services

YWCA

585-368-2248

[email protected]

Cheri Christopher, Young Adult Representative

St. Vincent Mercy Family Care Center

419-251-2341

[email protected]

Charnese McPherson, Young Adult Representative

202-305-0384

Audra Cummings, Young Adult Representative

479-216-0842

Alice Skenandore, Executive Director

Wise Women Gathering Place

920-490-0627

[email protected]

Christina Diaz, Program Director

Decisions For Life of Baptist Child and Family Services

210-240-8866

[email protected]; [email protected]

Jared Stangenberg, Young Adult Representative

615-683-7106

[email protected]

Amy Lewin, Psy.D., Assistant Professor

Center for Health Services and Community Research, Healthy Generations Program

Children’s National Medical Center

202-884-3106

[email protected]

Cherie Wooden, R.N., B.S.N., Program Manager

Helping Our Parents to be Educators (HOPE)

607-584-4485

[email protected]



9. Payments to Respondents

There will be no payments to respondents.

10. Assurance of Confidentiality

This EOY Report template was developed for use in individual AFL Care demonstration reports and, therefore, specific procedures for assuring confidentiality are determined by the grantee. Each AFL applicant, however, must submit a signed acceptance of assurances required by Title XX of the Public Health Service Act. These assurances include affirmation that a system for maintaining the confidentiality of client records is in place. Compliance is monitored by OAPP staff.

The process evaluation data collected will be shared with a contractor, RTI International as part of the National Evaluation of the Title XX – AFL program. RTI received approval for an Institutional Review Board (IRB) exemption since the data will be non-sensitive. A copy of the RTI IRB approval notice is included as Appendix D. A pilot test of these procedures was conducted, and [will describe any problems identified] (see Section B.4 for information on the pilot test).

To ensure data security, all RTI project staff are required to adhere to strict standards and to sign agreements as a condition of employment on the process evaluation. Survey responses will be stored on a secure, password-protected computer shared drive. All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. No respondent identifiers will be contained in reports generated by RTI, and results will only present data in aggregate form.

We will seek review and approval by the OS Privacy Act Coordinator, Maggie Blackwell.

Any data reported to the OAPP will be in aggregated format. Individual identifiers will not be included.

11. Sensitive Questions

The major focus of the AFL Care demonstration program is to build on what is already known about preventing rapid repeat births, unhealthy risk behaviors, and sexually transmitted diseases among parenting adolescents. Care programs also enhance good parenting skills and help to reduce child abuse and neglect. Although the Care Core Instrument does ask about birth control practices, these data are not tracked in the EOY Report template or enclosures. Only the number of subsequent pregnancies, which is also a client outcome measure from the Care Core Instrument, is tracked.

The major focus of the AFL Prevention program is to promote premarital abstinence from sexual activity for adolescents. Most Prevention interventions include the use of curricula, as well as other materials, that cover issues around adolescent sexual activity and the benefits of abstinence. Thus, the EOY Report Template asks grantees to report on a measure of aggregate client outcome data from the Prevention Core Evaluation Instrument addressing the youth participants' strength of intention to remain abstinent from premarital sexual activity subsequent to the intervention.

However, OAPP will consider a waiver of this measure, on a case-by-case basis, if the Prevention demonstration project can provide adequate justification, for example, a very young client population or, in the case of a school-based project, opposition from a school board or district.

In addition, individual respondents will be informed that their participation is voluntary and that they may refuse to answer any or all of the questions in the instrument. They will also be informed of procedures taken to assure the confidentiality of their answers.

In the context of the EOY Report, all responses are reported in aggregate. All AFL grantees administer the Care or Prevention Core Evaluation Instruments, and clients who are comfortable answering the sensitive questions are included in the aggregate data presented in the report. Section V B 3 of the EOY Report Template requests that the evaluator describe any differences between participants served by the project and those included in the outcome objectives and evaluation data.

Because the process evaluation will obtain information from grantees about the program delivery, content, theoretical orientation, empirical basis, use of best practices, organizational context, innovation, fidelity, and dosage, there are no sensitive questions associated with this data collection. These data will be presented with all identifiers removed.

12. Estimate of Hour Burden Including Annualized Hourly Costs

The estimated hour burden for the collection of information using the current EOY template is 65 hours annually per respondent. This estimate was derived by asking 9 OAPP Care and 9 Prevention grantees to estimate, in hours, the time it took them to complete their last end-of-year report. After the highest and lowest time estimates were dropped, the average was 64.6 hours, and the range was 30 to 140 hours. The number of estimated respondents is 66 per year, with each respondent completing this report once per year. Grantees send in an end-of-year report as requested by the OAPP; however, with the revised template the estimated burden will be lessened. Because some narrative portions of the report remain consistent after they are completed in the first year, the time to complete the second through fifth reports is expected to be less than in the first year.
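Stated as arithmetic (a restatement of the figures above, not new data): the 9 Care and 9 Prevention grantees supplied 18 time estimates; dropping the highest and lowest leaves 16 estimates, whose mean of 64.6 hours was rounded to the 65-hour per-respondent figure used in Exhibit 5.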

The estimated annual response burden for the additional process evaluation survey is 11 50/60 hours (11 hours and 50 minutes). This burden is the total response burden per year for all AFL Care and Prevention demonstration project directors. Exhibit 5 provides details about how the burden estimate was calculated. The self-administered survey will be completed electronically in Microsoft Word or using paper and pencil and is designed to maximize ease of response and thus decrease respondent burden. There will be three waves of data collection for the Care demonstration project directors. Wave 1 will include all 66 AFL Care and Prevention grantees: data will be collected from 58 grantees in November and December 2009, 7 grantees in March 2010, and 1 grantee in June 2010. Wave 2 will include 44 grantees in November and December 2010, 7 in March 2011, and 1 in June 2011. Wave 3 will include 30 grantees in December 2011. The annual respondent cost is $308.00 (Exhibit 6). This cost is an average of the total respondent cost per year for all AFL Care and Prevention demonstration project directors. Respondents are subject to no direct costs other than time to participate; there are no start-up or maintenance costs. Timings were conducted during our pilot test procedures to determine the overall burden per respondent for the process evaluation instrument. Data collection is expected to take 30 minutes per respondent. Because the wage rate category for each project director is not known, the figure of $26.00 per hour was used as an estimate, based on an average of selected Care demonstration project director wages.
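As an arithmetic check on the survey figures above: 11 50/60 hours (approximately 11.83 hours) × $26.00 per hour ≈ $307.67, which rounds to the $308.00 annual respondent cost shown in Exhibit 6.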



Exhibit 5. Adolescent Family Life Care and Prevention Template Estimated Annualized Burden Hours

Type of Respondent | Form Name | No. of Respondents | No. of Responses per Respondent | Average Burden per Response (Hours) | Total Burden (Hours)
Care demonstration projects | Adolescent Family Life Care and Prevention Template | 31 | 1 | 65 | 2,015
Prevention demonstration projects | Adolescent Family Life Care and Prevention Template | 35 | 1 | 65 | 2,275

Exhibit 6. Estimated Annualized Cost to Respondents: Years 1–3

Type of Respondent | Form Name | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs
Care demonstration project directors | Adolescent Family Life Care and Prevention Template | 2,015 | $26.00* | $52,390
Prevention demonstration project directors | Adolescent Family Life Care and Prevention Template | 2,275 | $26.00 | $59,150

*Estimate of the average hourly wage rate, derived from a sample of AFL Care demonstration project director salaries.
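The totals in Exhibits 5 and 6 follow directly from the per-respondent figures: 31 respondents × 1 response × 65 hours = 2,015 burden hours, and 2,015 hours × $26.00 = $52,390 for Care projects; likewise, 35 × 1 × 65 = 2,275 burden hours, and 2,275 × $26.00 = $59,150 for Prevention projects.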

13. Estimate of Other Total Annual Cost to Respondents or Recordkeepers

Respondents will incur no capital or maintenance costs. There are no start-up costs to respondents and no additional cost to the Office of Grants Management for generating the information. The template will provide clear guidance on report contents.

14. Annualized Cost to the Federal Government

The total annual costs to the Federal Government associated with this data collection are $101,904.

Category | No. of Respondents | Average Labor Hours | Hourly Wage Rate | Total Costs
Federal Government employees (to assess the status of the AFL grant projects and provide feedback) | 66 | 8 | $25* | $13,200
Contract costs (to assess evaluations of AFL grants and identify technical assistance needs for OAPP) | 66 | 24 | $56* | $88,704
Total | | | | $101,904

*Estimate of the average hourly wage rate is the average of the personnel salaries of those involved in the data collection assessment.
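As an arithmetic check on the table above: 66 respondents × 8 hours × $25 = $13,200 for Federal employee review, and 66 × 24 hours × $56 = $88,704 in contract costs; the two sum to the $101,904 annual total.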

15. Explanation for Program Changes or Adjustments

A previous OMB application was approved in 2005 for the EOY Report template used among AFL Care demonstration projects (0990-0299); thus, this is not a new collection. The grantees are already required to complete the EOY Report template annually as part of the grant funding requirements. The requested change is to revise the reporting mechanism to eliminate duplication in content by combining the progress report and the evaluation report, to provide additional guidance to grantees regarding the information requested, and to collect information on program implementation. There is a modest increase in burden because additional process evaluation questions are being added to the existing template, but this may be counterbalanced by the changes to the narrative portion. We are not increasing the sample size for this data collection. These revisions will be used to facilitate program management.

16. Tabulation of Data and Schedule

The OAPP will require AFL Care and Prevention demonstration projects to provide demographic information for their target population and address the respective performance measures. These aggregated data will be used to track numbers and types of clients served by the AFL Care and Prevention programs, progress on the performance measures, and outcomes of AFL care programs.

Analysis of the data for the statutorily required independent evaluation of each project will vary and will be determined by the individual grantees and their evaluators.

Cross-sectional analyses will be conducted for the process evaluation. Analyses will begin once all three waves of demonstration project EOY reports are received. Process evaluation data will first be analyzed by calculating descriptive statistics about the categories of project activities conducted by grantees and the programmatic approaches used. Descriptive statistics will also be used to assess the fidelity of program implementation; dosage and exposure of adolescents to program activities; and the extent to which projects address changes in knowledge, attitudes, and skills. RTI will supplement process evaluation data from the enclosure with information abstracted from existing EOY report enclosures to consider project-level data about the gender, age, and race/ethnicity of participants served.

Process evaluation analyses will be conducted to address the research questions presented in Exhibit 3. Process evaluation data collected with the revised EOY Report templates will be analyzed to determine the extent to which project activities are theoretically based, whether project activities are evidence-based, and the extent to which grantees are utilizing best practices, and to describe the organizational contexts in which project activities are conducted. We will combine the data across demonstration projects. Descriptive statistics, including means, medians, and modes, will be calculated. Project characteristics (such as the percentage of demonstration projects utilizing mentoring) will be measured.

Quantitative process data at the grantee level (e.g., program fidelity and characteristics of program activities) will be linked with outcome data from the cross-site outcome evaluation at the longitudinal time points. For instance, analyses may be conducted to determine the extent to which program fidelity is associated with overall program impacts, or the extent to which different programmatic approaches or levels of program intensity are more or less effective in improving adolescent outcomes. Based on our expert workgroup's recommendation during preparations for the cross-site evaluation, one approach to linking process evaluation data to program outcomes will involve calculating the effect sizes of individual projects (DeCoster, 2004; Singleton & Straits, 1999). For this approach, each project will be coded on characteristics (based on process evaluation data from the survey enclosure), such as project features (e.g., project goal, geographic location, setting in which project activities occurred, monitoring of implementation, characteristics of project staff, project staff training), characteristics of participating adolescents (gender, race/ethnicity, age), project dosage and exposure (average frequency of project contact, length of program implementation), and timing of assessment. Efforts will also be directed toward determining, at least preliminarily, whether different types (content, themes, and modes) of projects have different outcomes, by considering process evaluation data together with data from the outcome evaluation and/or the existing end of year reporting templates. The independent sample will be the primary unit of analysis; each project will contribute one independent sample to the analysis.
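For reference, one common project-level effect size in the meta-analytic literature cited above is the standardized mean difference; the specific metric used will depend on each project's outcome measures and evaluation design, so the following formulation is illustrative rather than prescriptive:

  d = (M_i − M_c) / SD_pooled, where SD_pooled = sqrt([(n_i − 1)s_i² + (n_c − 1)s_c²] / (n_i + n_c − 2))

Here M, s², and n denote the mean, variance, and sample size for the intervention (i) and comparison (c) groups on a given outcome. A per-project effect size computed this way can then be related to the coded project characteristics (e.g., fidelity, dosage) across the independent samples.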

The analytic strategies described above will provide an optimal design for examining implementation characteristics of Care demonstration projects and will allow for examination of differences in program effects as a function of implementation characteristics. As the evaluation questions are addressed, the findings will be summarized and shared with OPA for comment and interpretation. For this study, we expect the findings to be disseminated to a number of audiences. Therefore, manuscripts will be written in a way that emphasizes scientific rigor for more technical audiences but is also intuitive, easily understood, and relevant to less technical audiences. The reporting and dissemination mechanism will consist of at least one peer-reviewed journal article (e.g., Health Promotion Practice, Health Education and Behavior, Health Education Research, and/or Preventive Medicine) that summarizes findings on the overall implementation of the AFL program. With review and approval by OPA, the results of the evaluation will also be used to develop at least one conference presentation.

The key events and reports to be prepared are listed in Exhibit 7.

Exhibit 7. Time Schedule for the Entire Project

Task/Activity | Start Date | End Date
Start date | July 22, 2008 |
Develop project plan and schedule | July 22, 2008 | August 12, 2008
Design instruments | August 1, 2008 | December 3, 2008
Pilot test instruments | January 12, 2009 | February 3, 2009
Collect data | November 30, 2009 | December 31, 2011
Analyze data | January 2, 2011 | May 31, 2011
Submit data summary | | October 3, 2011 a
Submit manuscript and conference presentation | | June 1, 2012 a

a This estimated timeline includes a possible no-cost extension for the project.



17. Display of Expiration Date for OMB Approval

The expiration date for OMB approval currently is displayed on the EOY Report Template and will continue to be so displayed on the revised template.

18. Exceptions to Certification Statement

There are no exceptions to the certification statement.



B. Collection of Information Employing Statistical Methods

The EOY Report narrative itself does not employ statistical methods, although the required independent evaluation may utilize statistical methods determined by the university-affiliated evaluator for the project. However, the process data collected in Enclosure D will be subjected to statistical analyses.

1. Respondent Universe and Sampling Methods

The process evaluation will include all 31 AFL grantees delivering services to pregnant and parenting adolescents.

2. Procedures for the Collection of Information

To gather data for the process evaluation, AFL demonstration project directors will complete a self-administered instrument attached as an enclosure to the EOY Report template. Completing these templates is a requirement for grant funding. Contact will begin with a pre-notification memorandum from OPA reminding AFL Care project directors that OPA is conducting a process evaluation to describe the activities of Care demonstration projects through the new EOY Report template enclosure (Appendix A).

EOY Reports will be submitted to OPA at the end of the 2008–2009, 2009–2010, and 2010–2011 grant years. AFL Care demonstration project directors will complete the EOY reports electronically using Microsoft Word or using paper and pencil, whichever is easier. Project directors will e-mail the reports or send them via Federal Express to OPA each year within three months of the end of each grant year. Process evaluation data will be collected from Care grantees from November 30, 2009, through December 31, 2011.

3. Methods to Maximize Response Rates and Deal with Non-response

Because end-of-year reporting is a requirement for grant funding, we anticipate a 100% response rate. Contact will begin with a pre-notification memorandum from OPA explaining the new end-of-year reporting template enclosure. OPA project officers will follow up with late or non-responders as they currently do.

4. Tests of Procedures or Methods to be Undertaken

Pilot test data collection was conducted from January 12, 2009, to February 3, 2009. RTI contacted nine Care grantees funded in FY 2002 or FY 2003 to participate in pilot testing of the process evaluation instruments.

The purpose of the pilot test was to elicit comments on the availability, usefulness, and likely accuracy of the data requested; the burden associated with providing the data; the overall instruments; and specific instrument questions.

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The agency official responsible for receiving and approving contract deliverables is:

Alicia Richmond Scott
240-453-2816
[email protected]

Office of Population Affairs/DHHS
1101 Wootton Parkway, Suite 700
Rockville, MD 20852

The persons who designed the data collection are:

Olivia S. Ashley
919-541-6427
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Jennifer C. Gard
919-541-7369
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

The persons who will collect the process evaluation data are:

Stephanie Alexander
240-453-2809
[email protected]

Office of Population Affairs/DHHS
1101 Wootton Parkway, Suite 700
Rockville, MD 20852

Lizzette del Canto
240-453-2804
[email protected]

Office of Population Affairs/DHHS
1101 Wootton Parkway, Suite 700
Rockville, MD 20852

Jacquelyn Crump McCain
240-453-2823
[email protected]

Office of Population Affairs/DHHS
1101 Wootton Parkway, Suite 700
Rockville, MD 20852

Allison Roper
240-453-2806
[email protected]

Office of Population Affairs/DHHS
1101 Wootton Parkway, Suite 700
Rockville, MD 20852

Alicia Richmond Scott
240-453-2816
[email protected]


Office of Population Affairs/DHHS
1101 Wootton Parkway, Suite 700
Rockville, MD 20852

The persons who designed the revised EOY Report template are:

Barri B. Burrus

941-486-0245

[email protected]

RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Ina F. Wallace

919-541-6967

[email protected]

RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Kimberly Leeks

770-234-5024

[email protected]

RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Jonathan Blitstein

919-541-7313

[email protected]

RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

The persons who will analyze the data are:

Georgiy Bobashev
919-541-6167
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Marni Kan
919-485-2756
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Michael Penne
919-541-5988
[email protected]
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

References

Baytop, C. (2006). Evaluating the effectiveness of programs to improve educational attainment of unwed African American teen mothers: A meta-analysis. The Journal of Negro Education, 75, 458-477.

Corcoran, J., & Pillai, V. K. (2007). Effectiveness of secondary pregnancy prevention programs: A meta-analysis. Research on Social Work Practice, 17, 5-18.

DeCoster, J. (2004). Meta-analysis. In K. Kempf-Leonard (Ed.), The encyclopedia of social measurement (pp. 1-19). San Diego, CA: Academic Press.

Goodman, R. M., McLeroy, K. R., Steckler, A. B., & Hoyle, R. H. (1993). Development of Level of Institutionalization scales for health promotion programs. Health Education Quarterly, 20(2), 161-178.

Hoyer, P. J. (1998). Prenatal and parenting programs for adolescent mothers. Annual Review of Nursing Research, 16, 221-249.

Kirby, D. (2007). Emerging answers 2007: Research findings on programs to reduce teen pregnancy and sexually transmitted diseases. Washington, DC: National Campaign to Prevent Teen and Unplanned Pregnancy.

Klerman, L. (2004). Another chance: Preventing additional births to teen mothers. Washington, DC: The National Campaign to Prevent Teen Pregnancy.

Seitz, V., & Apfel, N. H. (1999). Effective interventions for adolescent mothers. Clinical Psychology: Science and Practice, 6, 50-66.

Singleton, R., & Straits, B. C. (1999). Approaches to social research. New York: Oxford University Press.

Steckler, A., & Linnan, L. (2002). Process evaluation for public health interventions and research: An overview. In A. Steckler & L. Linnan (Eds.), Process evaluation for public health interventions and research (pp. 1–23). San Francisco, CA: Jossey-Bass.

Appendix A

Revised End of Year Report Templates





Appendix B


Process Evaluation Survey Instrument




Appendix C


Federal Register Notice to the Public




Appendix D


RTI Institutional Review Board Approval Notice



