Task #5
Formative Evaluation of the National Academic Centers
of Excellence in Youth Violence Prevention
Michele Hoover, Technical Monitor
Centers for Disease Control and Prevention
Division of Violence Prevention
4770 Buford Highway, MS-K60
Atlanta, Georgia
June 15, 2004
Revised
October 3, 2005
Mary Kay Dugan, M.A., Task Leader
Dhooleka Raj, Ph.D.
Andrew Davis, M.A.
Joanne Abed, Ph.D.
BATTELLE
Centers for Public Health Research and Evaluation
1100 Dexter Ave N.
Seattle, WA 98109
The ACE centers are funded to carry out six core activities:
Supporting community surveillance of youth violence
Providing infrastructure for interdisciplinary collaboration on youth violence
Conducting innovative etiological research on youth violence
Developing, testing, implementing, and evaluating violence prevention strategies
Mentoring and training professionals from varying backgrounds on violence prevention
Developing community-based partnerships to address violence
Ten centers have been funded at two stages of development (see Exhibit 1-1). Five comprehensive centers were funded for five years, and five developing centers for three years. The comprehensive centers have established expertise in the area of youth violence; their core activities are to conduct research on risk factors for youth violence and on the effectiveness of interventions. The developing centers focus on developing and implementing community response plans, training health care professionals, and conducting small pilot projects to evaluate youth violence interventions.
Exhibit 1-1 National Academic Centers of Excellence in Youth Violence Prevention Grantees
State | Name of Center | Host Institution | Principal Investigator
Comprehensive Centers
AL | Comprehensive Youth Violence Center | U Alabama/Birmingham | Michael Windle
HI | Asian/Pacific Islander Youth Violence Prevention Center | U Hawaii/Manoa, John A. Burns School of Medicine | Gregory Yee Mark
MA | Harvard Youth Violence Prevention Center | Harvard School of Public Health | David Hemenway
MD | Hopkins Center for the Prevention of Youth Violence | Johns Hopkins U, School of Hygiene and Public Health | Philip Leaf
NY | Center for Violence Research and Prevention | Columbia U, Mailman School of Public Health | Bruce Link
Developing Centers
CA | Southern California Developing Center of Excellence on Youth Violence Prevention | U California/Riverside, Robert Presley Center for Crime and Justice Studies | Nancy Guerra
CA | UCSD Academic Center of Excellence on Youth Violence Prevention | U California/San Diego | Vivian Reznik
MI | Youth Violence Prevention Center | U Michigan, School of Public Health | Marc Zimmerman
PR | Developing Center of Excellence on Youth Violence Prevention | U Puerto Rico, Filius Institute | Brenda Mirabel-Colon
VA | Center for the Study and Prevention of Youth Violence | Virginia Commonwealth U | Robert Cohen
The formative evaluation is designed to inform CDC about how well the centers are able to translate research into effective prevention practices and to guide future funding efforts. Project activities include: (1) development of one or more logic models conceptualizing the centers’ activities and operations; (2) a preliminary examination of performance measures; (3) development of performance measures to assess the centers’ progress in achieving specified goals; and (4) an examination of management approaches used by other grant programs similar in nature to ACE. In this report, we present the findings for activity 4, the examination of management approaches.
Battelle was asked to assist CDC in identifying management approaches used by program managers from federal agencies and/or non-profit organizations to oversee multi-site programs with a high level of coordination and collaboration. CDC’s goals for the task were to: (1) identify management approaches employed by funding agencies to coordinate across grantees/centers; (2) identify best practices in technical assistance (TA) provided by an agency to its grantees/centers; and (3) obtain examples of products of cross-site collaboration. The central questions for this task are:
What management structures and procedures are best suited to assist the centers to function effectively?
What forms of TA support and collaboration are provided to centers by the central organization?
What kinds of cross-center products facilitate and/or result from cross-site collaboration?
CDC will use the information gained from this task to guide and enhance its management approaches for the existing ACE Centers to help them meet their objectives and goals.
This report is intended as an internal document to be shared among relevant CDC ACE staff to support effective coordination. It identifies management approaches, drawing on examples of administrative structures, procedures, and TA offered by federal agencies and non-profit organizations to their grantees or centers. The examination of these management approaches will allow CDC to evaluate its existing approaches to programs with multiple grantees and, with further research and evaluation, could inform a comprehensive management strategy focused on collaboration and coordination. In identifying existing practices, this task is less concerned with evaluating a representative sample of management approaches than with capturing strategies from a range of programs of interest to CDC. Further, our objective is not to prove that certain management approaches are valid or to evaluate a particular program’s performance, but rather to identify management procedures that agencies and organizations have used successfully for program oversight to facilitate cooperation and collaboration.
This executive summary describes the study methodology (Section 2.0) and presents the lessons learned and the program summary matrices (Section 3.0). The matrices describe the main activities of support and collaboration offered by funding agencies to help grantees achieve their goals and objectives across multiple centers. The programs examined are managed by federal agencies and non-profit organizations across the US; CDC programs were excluded from this task.
2.1 Program Identification and Selection Criteria
The identification of programs for the management interviews involved two main tasks: 1) identification of the universe of possible programs using a web search; and 2) screening of the identified websites to find suitable programs and contact individuals for the management approaches interviews. The aim of the web search was to capture a broad range of programs with different topical foci and a mandate of program oversight for multiple grantees/centers. The following inclusion and exclusion criteria were used to identify prospective programs:
Inclusion criteria:
Federal agency or non-profit organization
Multiple grantees/centers managed
Geographically dispersed centers
Focus on technology, violence, behavioral health, or prevention
Exclusion criteria:
Single-source funded centers
Single-site or local centers without geographical dispersion
CDC-funded center/program
In addition to the criteria listed above, three main types of centers were targeted for the web search: programs with 1) a clinical focus, 2) a service orientation (e.g., the Centers of Excellence in Women’s Health), or 3) a community-focused research and translation/implementation mandate. We conducted a search of existing program websites using key search terms such as:
Centers of Excellence
Technology Transfer Grantees/Centers
Technology Diffusion Grantees/Centers
Public Health Research Grantees/Centers
Research Translation Grantees/Centers
Web-based search engines such as firstgov.gov and google.com were used to identify the universe of possible programs. Once the web search was completed, we compiled a list of all identified programs, their sponsoring organizations, relevant website(s), and brief descriptions. This list was submitted to CDC to assist in its selection of appropriate subject programs. Appendix B contains further details on the web search process and a list of all websites identified.
Exhibit 2-1 Priority Programs Interviewed for Management Approaches Task
Priority Number | Date of Interview | Program Name | Agency/Nonprofit Organization | Rationale for Inclusion
1 | 2/24/04 | Excellence Centers to Eliminate Ethnic/Racial Disparities (EXCEED) | Department of Health and Human Services / Agency for Healthcare Research and Quality (HHS/AHRQ) | Broad mission, grant structure, similar mandate to explore causes and contributing factors and apply findings to prevention strategies
2 | 5/7/04 | Regional Educational Laboratories (REL) | Department of Education / Institute of Education Sciences (ED/IES) | Broad scope, similar concept to ACE
3 | 4/2/04 | National Centers of Excellence in Women’s Health (NCEWH) | Department of Health and Human Services / Office on Women’s Health (HHS/OWH) | Overall structure and format are similar; overall management of centers with different goals
4 | 4/29/04 | Area Health Education Centers (AHEC)1 | Department of Health and Human Services / Health Resources and Services Administration (HHS/HRSA) | Interdisciplinary, broad focus, similar mission
5 | 4/27/04 | Roybal Centers for Translational Research in the Behavioral and Social Sciences (Roybal) and Resource Centers for Minority Aging Research (RCMAR) | Department of Health and Human Services / National Institutes of Health / National Institute on Aging (HHS/NIH/NIA) | Similar concepts to ACE
6 | 3/30/04 | Addiction Technology Transfer Center (ATTC) | Department of Health and Human Services / Substance Abuse and Mental Health Services Administration (HHS/SAMHSA) | Regional centers and national office; regional-level efforts focus on meeting the unique needs of their area while also supporting national initiatives; the national office leads the network in implementing national initiatives and concurrently promotes and supports individual regional efforts
7 | 2/25/04 | Health Services Research and Development Service Centers of Excellence (HSR&D COE) | Department of Veterans Affairs / Office of Research and Development (VA/ORD) | Similar structure; works with schools of public health; each center creates a research agenda; focus on innovation, creativity, and support
8 | 2/25/04 | Translating Research Into Practice (TRIP) – II | Department of Health and Human Services / Agency for Healthcare Research and Quality (HHS/AHRQ) | Evidence-based centers working to translate research into practice in different settings
9a | 4/12/04 | NIH/NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | Department of Health and Human Services / National Institutes of Health / National Cancer Institute (HHS/NIH/NCI) | Multiple projects with strong work in a public health approach
9b | 5/5/04 | RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | Robert Wood Johnson Foundation (RWJF) | Supplemental information from a non-profit organization to add to information from priority program 9a above
2.2 Development of Data Collection Protocol and Instrument
A data collection protocol and a semi-structured interview instrument were developed for the study. The data collection protocol contains the contact letter and informed consent information (see Appendices D and E for copies of the recruitment letter and informed consent form, respectively). In addition, the protocol contains a semi-structured interview instrument (see Appendix F for a copy of the final interview guide). The interview instrument contains the following sections:
Interview Introduction. This section provides an introduction to the study and information about the interview and informed consent.
General Program Information. This section asks questions about the program and its mission, as well as about funding.
Support and Coordination. The purpose of this section is to obtain information about the program in terms of management oversight and support.
Evaluation. This section of the instrument asks questions related to monitoring and evaluation of the program including cross-site evaluation.
Program Management. This section of the instrument contains questions related to the program’s overall management approach, in terms of who oversees the centers and any policies or committees that are in place to assist with oversight.
Cross-Center Identity, Activities and Products. This section asks questions related to cross-center identity and infrastructure that may be in place to support coordination between centers.
Outreach and Sustainability. The final section asks questions related to outreach activities, policy briefing activities, and whether the centers will continue past the funding cycle.
An internal review of the data collection instrument was conducted by CDC on August 13, 2003. Based on this review, minor modifications were made to the instrument to clarify questions. In addition, to reduce the burden on respondents, a decision was made to prioritize the information to be collected, ensuring that the most important information was collected first and that interviews lasted no longer than one hour unless a longer session was cleared with respondents. The final interview guide and protocol were submitted to the Battelle Institutional Review Board (IRB) and approved prior to any contact with respondents.
2.3 Data Collection
CDC staff initiated contact with potential programs via email and telephone. The program director or senior manager of each of the nine priority programs was contacted, and the purpose of the interview was described. In each case, CDC identified the person most suitable for the interview and ascertained his or her willingness and availability to be interviewed. Once an individual was identified, CDC sent a formal recruitment letter and informed consent information.
Contact information for the individuals identified for the interviews was provided to Battelle. Battelle staff then contacted each of the individuals recruited by CDC via telephone and email and scheduled interviews. Battelle conducted a semi-structured interview with each of the identified program managers, with interview times ranging from 40 minutes to 1.5 hours. Interview dates for all nine programs are listed in Exhibit 2-1. With one exception (priority program #4, the Area Health Education Centers), interviews were conducted with senior program managers. The AHEC interview was conducted with the grantees and the coordinating center for the program.2 Thus, the information for this program reflects the perspective of grantees who work with the national AHEC organization on cross-site coordination around specific issues.
Sessions were recorded on audiotape, and verbatim transcripts were produced by a professional transcriptionist. (At the start of each interview, permission for audiotaping was sought from respondents.) In addition, the interviewer took detailed handwritten notes in the data notebook for the respective program. Interviews with the nine programs (representing 10 agencies and 11 individuals) were conducted between February and May of 2004.
2.4 Data Analysis
Qualitative content analysis was used to determine what factors and types of support are offered by agencies or non-profit organizations to their grantees/centers. We reviewed all transcripts (and detailed notes) to identify major themes and patterns, particularly as related to issues in managing a multi-site program. Themes were identified around the three main questions of this study:
What management structures and procedures are best suited to assist the centers to function effectively?
What forms of TA support and collaboration are provided to centers by the central organization?
What kinds of cross-center products facilitate and/or result from cross-site collaboration?
Our analytical results are presented in multiple formats in this report. First, the transcripts were reviewed to produce individual program profiles providing a program-by-program description of key features. Second, summary matrices were developed to code key elements of the programs and to allow for comparison across programs. The transcripts were reviewed and used to complete the matrices to the extent possible. In some cases, however, the matrices could not be coded completely from the interview transcript and notes alone (e.g., the respondent was not involved with the program in its early stages, or a particular topic was not discussed). In these instances, other sources of information (e.g., program websites, interpretation of website information, and interviewees’ other responses) were used to fill in the matrices. Finally, the transcripts were reviewed to identify overall themes, which have been compiled into a lessons learned discussion focused on information of particular relevance to the ACE program.
As described above, this task involved a purposive sample of sites deemed innovative or interesting by CDC. Therefore, the programs were not randomly selected. As such, information presented in this report is not necessarily representative of approaches of all federal and non-profit multi-site programs but rather represents a subset of such programs deemed by CDC to have particular relevance to the ACE Program.
In this section, we present the results and lessons learned from this task, based on all of the priority program interviews conducted. This section is divided into two subsections: (1) lessons learned and (2) summary matrices. In the Lessons Learned section we provide a summary of the following topics:
The management structures best suited to assist centers (including overall management structures, staffing, supplemental structures, cross-site structures, and funding mechanisms);
The forms of TA support and coordination provided to the funded grantees (including levels of support, types of support, forms of technical assistance, and approaches to evaluation support);
The kinds of cross-center products that facilitate and/or result from cross-site collaboration (including facilitators and barriers to the development of cross-site identity); and,
Additional factors that affect program management and success (such as the program manager role, the center directors, the RFA, program branding, funding and program expectations, and cross-site collaboration).
The summary matrices give an overview across the nine programs addressing the following key program elements:
Funding and Sustainability (Exhibit 3-2)
Support and Coordination (Exhibit 3-3)
Program Evaluation (Exhibit 3-4)
Progress Monitoring (Exhibit 3-5)
Program Management (Exhibit 3-6)
Cross-Grantee Identity, Activities, and Products (Exhibit 3-7)
Outreach and Policy Activities (Exhibit 3-8)
Document Sharing with CDC (Exhibit 3-9)
The summary matrices were created to supplement the detailed information from the individual program profiles and the lessons learned. The matrices were formatted to cover a selection of important comparison points across programs for each topic area (or “theme”) outlined in the guide for the semi-structured interviews.
3.1 Management Structure Best Suited to Assist Centers
In this section we summarize lessons learned relating to optimal management structures for oversight of multi-site grant programs. In Section 3.1.1 we look at management structures overall, followed by staffing structures (Section 3.1.2), supplemental management structures (Section 3.1.3), oversight and cross-site structures (Section 3.1.4), and funding mechanisms (Section 3.1.5).
3.1.1 Priority Program Management Structures
There were two main forms of management structures for the priority programs:
Agency primarily responsible for all activities
Agency working with a national office to oversee programs
In the first model, the federal agency is responsible for all activities and oversight, using internal resources and staff from across the agency. In the second model, while primary oversight rests with the federal agency, substantive assistance with cross-site collaborations is provided through a national program office (see Section 3.1.3 below).
3.1.2 Staffing Structures
Number of Staff
The majority of programs had a single person responsible for program management, with the title of either program officer or program manager. These program managers had varying degrees of supplemental support. In the case of the OWH (priority program [PP] 3), there were four additional program analysts (at the GS-13 and GS-14 level), as well as support staff (though not dedicated). In a few cases, several program officers were responsible for different grantees (PP 2-REL has four program officers, PP 4-AHEC has five to six, PP 9-TTURC has had three program directors, and PP 8-TRIP has had up to eight depending on the timeframe discussed).
Staff Skills
Interviewees identified some basic skills as necessary for all who are involved in program management and oversight. These include:
Strong analytical skills
Strong writing abilities
Ability to work well under tight deadlines
Contracting and/or grant experience
Additionally, long-time familiarity with the program topic was seen as essential to program success. In cases where the emphasis of the program was on funding research, it was important that staff have strong knowledge of study design and social or clinical interventions, in addition to program management skills. Some respondents emphasized that the staff member should be a strong player in the field, a subject matter expert with either a PhD or an MD degree. When grantees were funded to conduct research, scientific expertise was felt to be necessary to run the program, especially in addressing study design. However, it was recognized that no one person could have the full range of scientific expertise necessary, especially for programs with an interdisciplinary or transdisciplinary focus.
During the interviews, an additional desirable staff trait was mentioned, without being explicitly labeled as a staff skill. This trait involved either: (a) long-time familiarity with other key players who would or could be involved in the program (federal agency staff, grantees), or (b) a strong background in federal management and oversight. With respect to relationships with key players, many of the programs rely on partnerships with people and organizations interested in the program (such as other researchers or stakeholders), and the program managers demonstrated the ability to make multiple connections for the program by:
Promoting the program internally within the agency/organization (for example, by encouraging face-to-face meetings between branch directors or board presidents and grantees)
Promoting the program externally with:
Stakeholders or potential stakeholders
Well-known researchers in the field (by involving them in meetings, the review process, evaluations, and grantee-level advisory boards).
Organizing panels and ensuring program visibility at large national meetings and conferences.
Assisting with dissemination plans and activities (for example, the funders jointly wrote two articles for a special issue published by PP 9-TTURC).
The majority of the interviewees saw their roles in the program as going beyond federal oversight responsibilities, particularly in the cooperative agreement setting: “we’re not just monitors. We’re participants in the process.”
3.1.3 Supplemental Management Structures and Procedures
The interviews revealed that, beyond the skills and experience of program managers, the ability to manage a program effectively generally relied on a number of different supplemental administrative mechanisms. Various combinations of the mechanisms described below were viewed as facilitating program management.
External Evaluation Panel
The review of applications in response to the RFA/RFP involved formal procedures that are generally standard across the agency. This was especially true for agencies such as NIH, whose review procedures and committees are well defined for each grant mechanism. External experts were often involved in review of the applicants’ proposals and were sometimes involved subsequently, as well, in the review of accomplishments of funded programs in a progress monitoring capacity.
Strategic Plan/Concept Paper
Two programs (PP 3-OWH and PP 7-VA) discussed post-award procedures that involved grantee submission of a detailed work plan. The plans outline projected activities over the grant period and must be submitted within a specified period (three months for the VA and six months for the OWH). The plans are reviewed by the federal agency and/or an external group of reviewers. Each is presented in detail below.
Strategic Plan Example – PP 7-VA
In the case of the VA, the grantees’ strategic plan is reviewed by a board of non-VA senior researchers.
“Within three months they have to put together a strategic plan, and they’re supposed to work with their steering committee on the strategic plan.” It outlines the grantee’s priorities. The plan contains the following elements: “executive summary, outline of strategic goals and initiatives, strengths of the center, challenges, goals and initiatives and then infrastructure. Included in the infrastructure is an organizational chart, description of personnel and their roles, identity of the steering committee, core staff allocation, facilities and equipment and projected budget and its allocation.… The strategic plan is really outlining specific goals and objectives that they’re going to be held accountable for.” In discussing the difference between the grantee proposal and the strategic plan, the interviewee clarified “The proposal review is more [to determine] capacity [of the grantee organization]…. The strategic plan is ‘ok, what are they saying they’re going to do?’”
Concept Paper Example – PP 3-OWH
The OWH grantees present a concept paper within six months as a formal deliverable. In explaining the difference between the concept paper and the application, the respondent stated: “What goes in the application for funding are sort of three things, one the past history of what they have done in the areas that are required under the contract or grant, the current activities and programs and activities that are ongoing now, and of course, what they plan to do in the future. Now, in an application or proposal, what you get for the future is sort of we will do this and we will do that. It’s kind of the future dreams and hopes of what they’re hoping to do. What we’ve found is that until they get the award and start talking to everybody and setting up networks and partners and connecting with everybody, they really don’t know what they’re going to do for the future or what they’re going to be able to do. After they get the award, we require this conceptual plan, because at that point they have six months to go around and talk to everybody and really get their program together in each component. Then the conceptual plan is what they really can do.”
Working Groups, National Committees, Steering Committees
Six of the nine priority programs used working groups or national committees, and one interviewee mentioned another program with this feature (the NIA RCMARs, mentioned during the interview on PP 5-Roybal, the Roybal Centers). The committees generally represented the major infrastructural components of the program, and membership was composed of participants from funded centers. Working groups generally met both face-to-face, during annual grantee meetings, and via teleconference. Additionally, many of the working groups or national committees were organized by the national coordinating center. In most cases the members were grantees; in the case of PP 8-TRIP, external partners were also invited to participate. For PP 8-TRIP, participation in such cross-site activities was required by the RFA: “the RFA included instructions on required participation, as well as committee and subcommittee structure.” While the details of the working groups were not always discussed, some of the programs identified the names of their working groups/committees.
The following working groups were identified during the interview regarding PP 3-OWH:
Publication working group – mostly composed of center directors
Coordinator’s working group
Conference planning
Evaluation working group
Racial and ethnic minority and underserved working group
Outreach working group
Research working group
Leadership working group
Professional educators working group
Resource center working group
The following national committees were identified during the interview regarding PP 6-ATTC:
Workforce development
Services improvement
Evaluation workgroup
Grantee-level Steering Committees
In addition to national-level committees, the programs discussed grantee-level committees. For example, the membership of the PP 7-VA grantees’ steering committees was approved by the federal agency. The agency coordinates its oversight activities with annual meetings held by the steering committee.
3.1.4 Structure to Assist with Oversight and Cross-Site Activities – National Coordinating Center
A majority of programs had national program offices (PP 3-OWH, 4-AHEC, 5-Roybal, 6-ATTC, 9b-TTURC/RWJF). As discussed by the priority programs that used them, these cross-site offices were labeled variously, and each had a different funding mechanism; full details are provided in the program profiles. Several funding arrangements were discussed. In one scenario, the federal agency or non-profit (e.g., OWH, RWJF) provided funds, separate from the program funds awarded to the centers, to establish an independent office. In a second scenario, grantee funds were used to pay for the national office (for example, the RCMAR centers discussed as part of the interview on PP 5-Roybal). Finally, in one case, the program office was funded through memberships available to program participants (AHECs). In this last case, the office coordinated with the federal agency but had substantial independence regarding its activities and its assistance with oversight (especially progress reporting).
3.1.5 Funding Mechanisms
The total funding for the priority programs varied widely, from $65 million per annum for five years (PP 2-REL) and $60 million per annum for five years (PP 7-VA), to $70 million in total (NIDA’s and NCI’s combined commitment to PP 9-TTURC), to $7.5 million over the full funding period (for example, NIA’s commitment to PP 5-Roybal and AHRQ’s commitment to PP 8-TRIP). Moreover, the grantee funding cycle varied between three and five years. Average annual funding per grantee ranged from $100,000 per annum (HRSA’s PP 4-AHEC) up to $8.6 million per annum (PP 2-REL).
Additionally, the number of grantees varied between 6 and 46, with per-grantee funding ranging from $100,000 and $156,000 per annum (PP 4-AHEC and PP 3-OWH, respectively), to $2.5 million per annum (PP 9a-TTURC), and up to $8.6 million per annum (PP 2-REL).
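As a rough consistency check on these figures (an illustration only, assuming the $65 million annual REL appropriation is divided across the program’s ten regional laboratories, as described in the REL quotation later in this section):
\[
\frac{\$65 \text{ million per annum}}{10 \text{ laboratories}} \;=\; \$6.5 \text{ million per laboratory per annum, on average,}
\]
which accords with the reported per-grantee maximum of $8.6 million per annum for the largest laboratory.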
The funding level determines the resources available to federal staff, as well as grantees, for TA and support, oversight, and cross-site collaboration activities.
Additional aspects of funding discussed during interviews include: funding mechanism, changes in funding mechanism, leveraging funds, and sustainability. Each of these topics is discussed below.
Funding Mechanism
The three mechanisms discussed by the federal agencies and non-profit organizations for multi-center grant programs were cooperative agreements, contracts, and grants. The case of PP 7-VA was distinct in that the funding mechanism basically involved an intramural transfer of funds, as the program participants were all VA institutions.
Cooperative Agreements. In total, four of the nine programs used cooperative agreements (PP 1-EXCEED, PP 4-AHEC, PP 6-ATTC, and PP 8-TRIP). The funding agencies used this mechanism either because it:
Allowed for greater federal staff involvement to work at a cross-site level and shape program direction (PP 1, PP 6); or
Worked well as a mechanism for academic and community partnerships (PP 4, PP 8).
With respect to allowing more federal agency staff involvement in the program, as stated during one interview (PP 1), “It allows us to have more of a hands-on relationship with grantees, particularly in grantees that might have a wide breadth of experience. Some of them may be more experienced than others. Some may be dealing with different sites. It allows us to have an interaction with them that we wouldn’t normally be able to have.”
Grants. Three of the nine programs utilized a grant mechanism to fund their grantees (PP 5-Roybal, PP 9a/b-TTURC). Not surprisingly, this mechanism was used primarily by the NIH (NIA, NCI, NIDA) and by the Robert Wood Johnson Foundation, a non-profit organization. The grants were used to fund research centers based in academic institutions, with specific allocations to various research projects and developmental research (for example, pilot research projects). NIH center grants also allowed training to be included as part of the research funding, to encourage new investigators in a field.
Contracts. Of the nine programs, two used a contractual relationship to fund their multi-site programs (PP 2-REL, PP 3-OWH). It was felt that this mechanism was useful for increasing agency control and oversight of the program.
Changes in Mechanism
Three of the nine programs (PP 2-REL, PP 5-Roybal, PP 8-TRIP) changed their funding mechanism on the occasion of subsequent RFA reissuance for the program.
R01 to Cooperative Agreement. In the case of PP 8-TRIP, the mechanism changed from an R01 to a cooperative agreement for evaluation purposes: “It was set up as a cooperative agreement to create consistency around certain areas of evaluation among the grantees.”
Grant to Contract. For PP 2-REL, a balance was sought between the need for federal agency control over the program and the congressional mandate that each laboratory be responsive to its region.
“The labs are set up by Congress to respond to the educational needs of their region, and there are ten different regions. The other thing you should know and I realized it right from the beginning, this used to be a grant program. It’s no longer a grant program. It’s now a contract. Each lab is under contract to do the work that they do. … That grew out of a lot of criticism and concern that not enough was happening with the labs in terms of advancing and meeting the needs of the nation’s schools and educators. We felt that we needed more control, but it’s a very awkward situation, because they are supposed to be responsive to their regions. They have governing boards, and the governing boards are made up of representatives that reflect the region’s educational establishment. All the boards will have key state school officers for each of the spaces that they serve. Not all the chiefs are on these boards, on their respective boards where they get served, but there are a number of them. Then other people in the regions that are important to education, so it has obviously, not only a built-in constituency, but I mean these boards are supposed to give direction to these labs. They do biannual assessments of needs and that sort of thing, which are the drives for what goes on in these regions.”
Change in Grant Mechanism. For PP 5-Roybal, the mechanism changed from one type of center funding to another, from a P50 to a P30. The reason for this change was to encourage a different kind of scientific investigation in the centers and to move away from funding major research projects. “[W]e wanted to get away from funding major projects and feature R01 projects within each of the centers. We wanted them to be smaller grants that would have a small administrative core, and a small dissemination core, which is optional by the way, and the rest of it set up as private investigators as kind of seed money.”
Leveraging Funds
A number of the programs spoke of the importance of the grantees’ ability to leverage federal funds to obtain additional resources to carry out their work. In some cases it was acknowledged that the federal funds covered infrastructural aspects of setting up a center (facilities and staff); in others, that the funds provided recognition, which in turn opened the door to additional funding.
Recognition. For PP 3-OWH, the federal funds provided some programs with official recognition of the work that the center was conducting. This allowed the grantee to seek additional funds and resources. “We realized that what’s valuable to the CoEs is not so much the $150,000 or the $160,000. What’s valuable is the designation, the award from the federal government. It’s kind of the Better Housekeeping seal. In fact, we always meet with the deans of the schools of medicine, the CEOs of the hospitals when we go to these CoE site visits to make sure the institutional commitment is there. They have over and over again told us, it’s not so much that they don’t want the money, but what’s valuable to them is that designation.”
Tracking Leverage Amounts
The amounts leveraged varied widely: one program tracked approximately $350 million in leveraged support against an $11 million federal agency commitment (PP 3-OWH), while another reported that, on average, every federal dollar was matched by $22 in leveraged funds across the program (PP 7-VA). It was recognized that the federal contribution could represent less than 10% of the funds a center requires to operate.
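As a simple illustration of the arithmetic behind these figures (a sketch only, using the amounts reported above; the interviews did not specify each program’s exact accounting conventions):
\[
\text{leverage ratio} \;=\; \frac{\text{leveraged funds}}{\text{federal commitment}} \;=\; \frac{\$350 \text{ million}}{\$11 \text{ million}} \;\approx\; 32 \quad \text{(PP 3-OWH)}
\]
\[
\text{federal share of total funds} \;=\; \frac{1}{1 + 22} \;\approx\; 4.3\% \quad \text{(PP 7-VA)}
\]
Both results are consistent with the observation above that the federal contribution can fall below 10% of the funds a center requires to operate.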
Two federal agencies provided details regarding their system of tracking the amounts that their funded centers were leveraging vis à vis federal funds (PP 3-OWH, PP 7-VA). An example from the Office of Women’s Health program is provided below.
Leverage Tracking Example: PP 3-OWH. The OWH tracks three aspects of leveraging:
Internal leveraging (from within the grantee institution)
External funds (through additional grants, foundation monies, etc.)
Amounts that partners of the program have leveraged because of involvement with the program
“We really early on set up this leveraging chart. It has three columns. One column is the amount of money that they have leveraged internally. Many of our CoEs get more space, more staff and little internal grants that their university is giving out because of the CoE designation. The second column is external funds, grants, foundation money, etc. The third column is money that partners of the CoE … have leveraged … because they’re associated with the CoE. Many of these CoEs have partners in satellite sites and affiliates and stuff like that, and so we found that the partners even leverage money, because they say that they’re a part of the CoE program. The last time I looked, we had spent about $11 or $12 million on this program, and we had leveraged over $350 million.”
Sustainability
Programs have various ways to ensure sustainability of funds, including leveraging funds and requiring, as part of the RFA, plans for sustaining research or dissemination (PP 8-TRIP). Other programs rely on matching requirements: HRSA’s AHEC program (PP 4), for example, has a congressional mandate for a one-to-one match of funds within a six-year period, with the matched funds expected to come from state or local government or foundation sponsors. However, another respondent noted that reliance on state funds was a potential liability for a federal program, as state funding has decreased over the last few years because of budget cuts, putting some programs in crisis.
One of the more innovative approaches discussed during the interviews was that of PP 3-OWH. The program RFA builds in sustainability: the federal funds available to grantees are “tapered” every year. As part of the concept paper, the sites are required to submit a plan for ensuring ongoing funding. Also, when conducting site visits, the OWH meets with senior host-institution administrators to press for increased institutional commitment from the academic institutions where the grantees are housed.
3.2 What forms of TA support and coordination are provided to the centers by the central organization?
In this section we discuss the types of support and coordination provided to the centers/grantees by the funder. In particular, we discuss the varying levels of support provided (Section 3.2.1), the forms of support provided (Section 3.2.2), the types of TA provided (Section 3.2.3), and approaches to evaluation (Section 3.2.4).
3.2.1 Differing Levels of Support
The levels of support and technical assistance provided by the nine programs examined in this task vary widely. In some cases, such as PP 5-Roybal or PP 1-EXCEED, very little support is provided to the grantees by the funder. In the case of other programs, such as PP 9-TTURC and PP 3-OWH, a wide range of support on many levels is provided to funded centers. Others were in various types of transition, such as PP 2-REL and PP 9-TTURC (where the RFA has been reissued, but with significant changes, such as the fact that the funding from RWJF will not continue).
3.2.2 Forms of Support
The forms of support for programs vary and include listservs, conference calls, site visits (formal and informal), and meetings. Each of these is described further below.
Listserv
A listserv was used in five of the nine programs (PP 3-OWH, PP 6-ATTC, PP 7-VA, PP 9a-TTURC/NIH, PP 9b-TTURC/RWJF) as a means for funded centers to communicate with the funding agency or organization as well as with other funded centers. Overall, the respondents using listservs for their programs felt they were a useful communication tool. However, one respondent advised that a listserv is most useful and effective in the early stages of program development, especially in the first year as programs try to build their infrastructure and foster relationships. As programs become more highly developed, this interviewee felt, a listserv tends to become more of a burden than a useful tool: the volume of messages simply becomes too overwhelming for the grantees and distracts from other important tasks at hand.
“We did establish a listserv with the idea that if we have a lot of messages we want to send out to the group, and there would be a lot of messages that the groups just wants to send out to each other – like an investigator in Wisconsin might want to ask investigators at all the other centers about something that they’ve encountered. We found that the listserv was … very useful in the developmental stage of the centers, but there did come a point where I felt the folks got overwhelmed with the number of e-mail messages that were going back and forth.”
Conference Calls
Eight of the programs mentioned holding conference calls as a means of communicating regularly with grantees and monitoring progress for their programs. Conference calls between the program staff and the grantees were generally held every month or two. These calls could include all funded grantees on the same call, or just individual grantees and their staff on the line with staff at the funding agency. One participant noted, “I think that regular conversation really enhances the work that’s happening in the centers and the degree to which it’s helped to catalyze new directions and inject some more interesting perspectives on the research that’s being done … . I think having conference calls with the PIs individually and collectively is useful.”
In some cases, conference calls were a required part of the funding application and agreement, not just a post-award arrangement between the program office and the funded centers. One respondent described how each of the “cores” associated with the funding mechanism has a required monthly conference call, and participation by the director (or a surrogate) is obligatory on these and other scheduled director conference calls.
“The PIs have a regular conference call, regularly scheduled Wednesday afternoons at 3:00 once a month. We all sit down and we’re all expected to be there, and I would say 90% of the time we’re all there. If they can’t be there, then they certainly make sure that a substitute is on the line.”
Site Visits
In total, eight of the nine programs mentioned conducting formal or informal site visits (PP 1-EXCEED, PP 3-OWH, PP 4-AHEC, PP 5-Roybal, PP 6-ATTC, PP 7-VA, PP 8-TRIP, PP 9b-TTURC/RWJF) to the grantees’ locations.
Formal site visits. Six of the nine programs (PPs 1, 3, 4, 5, 7, 8) conduct formal site visits. Many of the site visits are either scheduled in the few months prior to a regularly scheduled cross-grantee meeting or are timed to coincide with a scheduled meeting at the grantee’s site.
One respondent told us: “We’ve had site visits between January and May. We have a fall meeting and a May meeting, so we do our site visits. We let Christmas and Thanksgiving go by, and then January to May before our May meeting on the 24th, we do all of our site visits so that they have plenty of time to ask questions….”
Another mentioned that “one aspect of monitoring is that each center is supposed to have a steering committee and that’s from our smallest to our largest. The steering committee is made up of experts that understand the main interest that that center is focused on. They’re supposed to meet once a year to review the progress of the center. When I talk about us doing site visits, we normally attend the sites at the same time so that we can hear what they present to the steering committee and review their progress.”
Informal site visits. Three programs (two of which also conduct formal site visits) mentioned doing informal site visits as part of scheduled meetings at different grantee locations (PP 5-Roybal and PP 6-ATTC) or if staff happened to be close to one of the centers for other work (PP 3-OWH).
“(We have) site visits both formally and informally every year and for every site. There’s a scheduled site visit, and then we often have unscheduled site visits. A lot of us are kind of in these cities anyway, and if we’re going there for some other reason or a member of the (program) staff is going there for some reason, we just drop in and see that everything’s going the way the progress reports say they’re going.”
Meetings
Meetings were held at least annually, with grantees and federal staff participating. In some cases, biannual meetings were held with certain grantee staff (for example, for PP 9b-TTURC/RWJF, the communications directors met twice a year). Another respondent (PP 6-ATTC) described how program staff hold three directors meetings per year, each at a different grantee location. The respondent felt that this arrangement fosters a greater sense of cross-center identity and allows grantees to learn up close how other funded centers operate.
“We have three directors meetings a year, where we bring the directors to a location. I try to rotate them around different regions, so it’s not just a directors meeting where we talk about (program) initiatives that need to be reflected in the work of each region, but it’s also an opportunity if we’re meeting (at a particular grantee location) for the directors from around the country to see up close how another (funded center) is operated and to get ideas about how they’re configured, what they have in their resource room, and how their staff works together. Each time we do that the full staff from the whole state (center) provides a briefing and an in-depth look into that region, so it’s kind of cross-fertilization between all … [the] regions.”
One respondent (PP 2-REL) described how program staff have four meetings with directors per year, two in DC and two others at other locations (e.g., national or regional meetings). “[We] have four meetings with the directors each year, but two of them they come here to Washington. They may come to Washington in any case, but they come to Washington so that we can attend. Actually, it’s our meeting, but often we tell them and they’ll pretty much put the agenda together and what have you. We don’t have the staff to really do it the way it should be done, so we do rely on them a little bit on those kinds of things.”
3.2.3 Types of TA
There are varying levels and degrees of technical assistance provided to grantees by the different programs. Common examples of TA provided to funded centers include, but are not limited to, the following: reviewing applications; assisting with publications and outreach (e.g., discussing findings); and conducting conference calls or using other modes of communication to provide advice on a particular subject, a new research direction, or an application to outside sources for supplemental funding. Some programs provided higher levels of scientific support and oversight for their grantees than others did. For example, one respondent described how the program has a particularly challenging research area focus. For this reason, project officers are clinicians with “fairly well-backed knowledge of study design, intervention, and program management” who essentially provide TA specific to the funded research.
Coordinating Centers
Moreover, two programs have coordinating centers (PP 5-Roybal and PP 9b-TTURC/RWJF), which, according to the program officers, have been effective and worthwhile. One program director (PP 5) developed an innovative approach to providing assistance to all funded centers: extracting a fixed amount from grantee funds (about $25,000 per year from each grantee) to create a coordinating center that provides technical assistance and supports cross-center activities. The coordinating center was competitively awarded and is housed at one of the funded centers. The director of the coordinating center needs to be knowledgeable in the field, as well as have extensive coordinating experience.
“One of the things that I did was I hired out about $25,000 or something like that from each of the RCMAR grants. I had about $150,000 or $200,000 and created a coordinating center. That coordinating center was to me a God send. It competed also. It was competed along with the original RCMAR application. A RCMAR PI could at his/her option write for coordinating centers, and there was just going to be one of them. Also, the coordinating centers were reviewed, and I had the funds available. I went on the site visits. To be honest with you, the site visits were set up, first of all, to determine if they really had an understanding of how the coordination fits, and it’s a very diverse center. Secondly, I thought that I could get along with them, because I lean very heavily on them. If I didn’t have my coordinating center for the RCMAR, which is currently at UCLA, I would spend 25% of my time doing it. As it is, it’s probably down to three to four hours a week. I write to the PI of the coordinating center and say ‘[name deleted], I need…’ She turns her people loose, and within a day I get it. … All of the money is awarded to the PI of the grant, but there’s a subcontract with the group that runs the coordinating center from the PI.”
Another program (PP 9b-TTURC/RWJF), which receives funding from different sources, has an innovative approach to coordination and assistance in place. This program supports a national program office, “an outside entity that helps us manage the program”. The respondent explained that the program office, which is housed at a university, is “basically their [the grantees] main point of contact, so they’ll send us their financial reports so that we can send them a check, but if they’re having difficulties with the work they’re trying to do, they contact the National Program Office … . [T]his office was to oversee our grants, but … what we were really trying to do was to connect all of the grantees together and, of course, you don’t bring the stuff that we fund and not the other stuff. It just doesn’t make any sense. It has to be a whole, and so they started to see how that could really be beneficial to not only them in their oversight, but also in making these connections across the centers.”
Another funder of this program (PP 9a-TTURC/NIH) stressed that program staff ought to have a strong scientific background, given the complexity of the research area(s) being funded. One staff member felt that “it really helps to have a scientific background in one or a couple of the areas that are represented by the various grants that you’re providing oversight for.” This program has a board of scientific advisors, and the respondent explained that “we try to provide scientific monitoring and stewardship to our individual centers that we might have in our portfolio, but also to the centers collectively. Then that is a way that we might exert influence that we have in terms of the processes that I discussed earlier in terms of … typical grant monitoring and grant oversight, but also at the meetings where we’re having people that are preparing data and bring people in from outside of the centers to try to stimulate new directions and comment on what we see.”
One arrangement worth noting is that RWJF pays for a group of external experts who are very well known in the field to attend the annual national cross-site meetings. These experts (an external peer review group, not a standing committee) attend the grantees’ presentations and, for example, provide feedback and advice on innovative research directions. The respondent said, “Well, we did convene a group of, I can’t remember whatever we titled them, but it was a group of experts who were well-known in (the field), many of whom could have been a center themselves, if they had applied. They’re very well-known leaders in (our area of) research. We convened them to basically come to all the meetings and hear about the science and where they’re going, and really be kind of a challenge to the PIs in terms of progress and the way they’re looking at things. They were kind of peer reviewers for them … . That was fairly successful for us.”
Another unique aspect of this wide-ranging program is that a communications director is funded at each site by RWJF to assist the grantees. “We wanted them to bring on a communications director who really could keep track of all that [is] going on within the center so that they could see where there were opportunities to communicate to science. We’ve completed this study, and it’s going to be published in this journal – pretty standard type of communications, but also so that they could work inside the center. A lot of these were scientists who had never worked together before, to help them with their own communications with each other. That was from other things that the foundation had funded with transdisciplinary work. When you’re bringing scientists together who are from very different disciplines, who had never worked together before – they don’t speak the same scientific language. It was very important to have a fairly senior type of communication person who can see some of those kind of disconnects. Also, because they weren’t scientists themselves, the scientists had to communicate to that person in kind of lay language. They kind of turned out to be almost like a translator, with what was going on. In some centers it worked better than in others.”
The communications directors are also particularly helpful as both the research and the program enter the more developed stages. For example, the communications directors provide advice to the scientists on effective strategies for outreach activities and dissemination of findings and instruct the scientists on how to process the information in a standardized format.
“The policy researchers really now are starting to get to a point where there are really good solid results coming out, and so generally I can see … the way RWJF handles policy-related work, particularly in tobacco. I know that probably better than other areas. We have a fairly sophisticated connection between all of our (program area) grantees through which we communicate results that are going to be coming out and … we help them understand what the implications might be for policies at different levels. Then they’re ready to speak on it, so that they understand what the science is saying, and then we have a lot of grantees that particularly their focus is on speaking to policy and implementation of policies.”
For a different program, one respondent described “special groups” created as part of the program to target specific issues in health care. These groups are similar to task forces but are not labeled as such. Each group meets once a year and usually comprises 15 to 20 active participants selected by program staff. The respondent explained that “our Centers of Excellence directors, their involvement would be as needed. They’re primarily the recipients of this information. Basically, it helps the centers and all the researchers to know what direction we need to be heading in this field. Where are areas that are over researched and (where are the) areas that are under researched?” One recent group was convened to examine the issue of long-term care. As the interviewee explained, the discussions and findings from these groups “will generally lead to a solicitation for research.”
(For interesting forms of TA provided for progress reporting and evaluation, please see Section 3.2.4 below.)
3.2.4 Program Evaluation
The majority of programs discussed various types of evaluation that had been conducted, including activities related to grantee monitoring (such as site visits and surveys). Some programs (PP 3-OWH, PP 6-ATTC) also included grantee working groups/committees that addressed evaluation at either the program or grantee level. For the SAMHSA-funded ATTCs, the individual grantee evaluations are compiled into a cross-site analysis by the national office; this cross-site evaluation is submitted to the federal program manager.
Furthermore, discussions of evaluation during the interviews also elicited information on various aspects of evaluation required by federal funding agencies and non-profit organizations. Often discussed was the evaluation process involved in decisions to fund applications and the roles of external reviewers and senior federal staff (for information on this, see Section 3.1.5 above on Funding). Additionally, staff discussed procedural monitoring required of funded grantees, including both formal mechanisms, such as regular submission of progress reports, and informal mechanisms, such as participation in annual grantee meetings. Finally, overall program evaluation was discussed. Below we present some additional information regarding the evaluations conducted.
External Evaluation
Of the programs interviewed for this task, only two had undergone external program evaluations (PP 3-OWH and PP 4-AHEC). In the case of OWH, the evaluation was used to establish a baseline for the program grantees; the data were both qualitative and quantitative but have not yet been published. The AHECs had an external evaluation conducted by the UNC Sheps Center. Not all funded grantees were evaluated, however, as “there had been a long process getting through deciding what would be the programs that would be evaluated based on geography and length of time that they’ve been in existence.” The findings from this evaluation have been published.
Additionally, interviewees for PP 4-AHEC and PP 2-REL referenced federal evaluations. For the latter, a GAO report containing information on all funded grantees was published in 2002, but no formal external evaluation was ever conducted.
Annual Progress Reports
All funding agencies required annual progress reports, and some required more frequent (quarterly) reporting. In some cases (for example, PP 7-VA), grantees submit the required monitoring information online to the federal agency, which then compiles the information for cross-site analysis and program monitoring. In many cases, a formal reporting template was used (PP 7-VA, PP 8-TRIP).
Example of Progress Reporting (PP 2-REL). An innovative approach to progress reporting was outlined for PP 2-REL. The annual plan is forward-looking in its objectives, in that it contains the grantee’s plans for the following funded year. As part of the annual submission, the grantee also provides an evaluation report – an evaluation paid for by the grantee and conducted by an external group to monitor program progress. Every two years, a needs assessment is attached to the same document.
Funded grantees are also required to submit quarterly progress reports and are given overall points for their progress; if reports are submitted late, points are deducted from their overall scores. As described in the interview, “I mean it’s unheard of not to get a quarterly report or something. It may be a day late or something, but even that, they have to be on time or they get points taken off. We have these elaborate, and I have to say elaborate almost regulatory types of things. That’s what it’s been and they’ve been tightly monitored.”
The Department of Education is currently reviewing its evaluation procedures based on findings from the National Academy of Sciences panel report, Scientific Research in Education. These procedures are being developed across the agency. The RELs’ federal agency staff indicated that they would like to collect quantifiable data on this program, in line with the developing cross-agency evaluation perspective.
Cross-Site Evaluation Case (PP 9-TTURC, funded by NIH’s NCI, NIDA, and RWJF). An internal evaluation of the TTURC program was conducted as a result of consultation with the grantees. From inception, the program identified its desired results through substantive grantee participation in the evaluation process, including input on definitions of program success, identification of the measures needed, and identification of data that would satisfy those measures.
The TTURC evaluation also included substantial bibliometric data collection and analysis.3 This involved assessing not only the number of publications, but also their impact and the extent of transdisciplinary research.
Informal Monitoring
The majority of agencies and funding bodies supplemented formal mechanisms (annual reports) with informal program evaluation based on ongoing grantee monitoring, including site visits, focus groups, and attendance at annual grantee meetings. Informal monitoring helps federal agency staff become familiar with individual grantee activities, but it provides less information on cross-site progress or on how the overall program process and outcome goals are being met.
3.3 What kinds of cross-center products facilitate and/or result from cross-site collaboration?
The development of a cross-site identity varied substantially across programs. In some cases (for example, PP 5-Roybal), no cross-center identity developed; in others (for example, the RCMARs, discussed as part of the PP 5 interview), a substantial cross-site identity and set of products developed. Of the programs discussed in the interviews, seven of the nine (including the RCMARs) noted that a cross-center identity had developed.
3.3.1 Facilitators to Development of Cross-Site Identity
The development of a cross-site identity was facilitated by many interrelated factors. During the interviews, we identified the following:
A clear program goal of cross-site collaboration, required of grantees and stated in the RFA (PP 2-REL, PP 5-Roybal RCMARs, PP 9a-TTURC/NIH).
A funding mechanism, such as a cooperative agreement, that stipulates that all grantees must produce a cross-site product (i.e., the money and the funding mechanism make cooperation explicit), as in PP 6-ATTC.
The activities of a national program office to coordinate across centers (PP 4-AHEC, PP 5-Roybal RCMARs).
Grantee working groups focused on developing a product, such as a publication, or on strengthening program infrastructure that affected all grantees, such as evaluation (PP 3-OWH, PP 6-ATTC, PP 9-TTURC).
Program branding activities, such as the name of the center (e.g., PP 6-ATTC) and the development of a cross-site logo (e.g., for the PP 5-Roybal RCMARs, NIA sponsored a competition across grantees to design a RCMAR logo).
Although not explicitly mentioned, the amount of “face time” and the resources committed to face-to-face meetings among grantees also facilitate cross-site collaboration. For example:
Allowing and encouraging separate collaboration-specific time during grantee meetings
Program structure elements related to oversight and monitoring (such as the directors’ retreats discussed by PP 7-VA) often provide opportunities to identify cross-site collaborations
Providing travel funds to visit other grantees (PP 9b-TTURC/RWJF)
Holding meetings at grantee sites
The overall program support structure determines the amount of cross-site interaction as well as the depth of grantee involvement – that is, the extent to which interactions are limited to center directors and federal agency staff or extend to grantee staff involved in the program. Examples of program support structures that involve cross-site collaboration include:
Committees
Task forces
Annual meetings
Regular conference calls
Evaluation working groups
Publication/dissemination working groups
Topic-area interest groups (for example, researchers interested in methodological issues in minority data sets)
Additionally, the extent to which cross-site collaboration is encouraged, tied to money, and built into the funding mechanism is a factor, as is the funding agency’s commitment to developing a cross-center identity – for example, PP 3-OWH is known internationally and is being replicated by other countries. In this case, it is not simply that a cross-site identity has emerged, but that the program concept is sufficiently well developed that it can be replicated and extended.
Examples of Cross-Center Products. Program-wide products include:
Websites (all of the programs)
Joint publications (PP 3-OWH, PP 5-Roybal, PP 6-ATTC, PP 8-TRIP, PP 9-TTURC)
Participation in a listserv (see Section 3.2.2 on Program Support above)
Production of newsletters, which may be done by the national coordinating office (for example, PP 9b-TTURC/RWJF)
Conference presentations (PP 6-ATTC, PP 8-TRIP, PP 9-TTURC)
3.3.2 Barriers to Development of Cross-Site Identity
Barriers to cross-center collaboration include the difficulty of encouraging relationships between groups that compete for funds (PP 7-VA, PP 5-Roybal) or that have different interests and agendas – for example, working across disciplines, or with a medical school when a grantee is accustomed to working with a different department (PP 6-ATTC, PP 9-TTURC). Also, for some programs the centers are run by PIs who are more accustomed to working alone than in collaborative efforts, which require a different type of interaction and commitment, and different timeframes.
3.4 What Additional Factors Affect Program Management and Success?
3.4.1 Program Manager Role – Management and Monitoring
On-site involvement of key funding agency staff with grantees was seen as an important way to manage, oversee, and monitor programs. Interviewees stressed that this could not be done without ongoing face-to-face interactions with grantees, and that conference calls and other means of communication could not be expected to replace face-to-face contact. One respondent (PP 9-TTURC) also noted the importance of frequent meetings (both face-to-face and other correspondence), especially during the early stages of a program.
For example, regarding PP 7-VA, “you need to have management staff go to the facility and attend the steering committee and make sure that they’re focused and that their understanding of what is expected of them is clear from the beginning. We go to these, and I specifically on our newer programs make sure that myself or an associate are there for the first steering committee. With 32 centers you can’t necessarily hit them all every year, but that first one, we’re definitely and sometimes both of us go if there are administrative issues. We normally just observe. It’s a perfect time when you’re observing to develop some understanding of whether they really understand the perspective of it. I don’t know about the specific issues at CDC, but for us who are an intramural program, they need to be doing research on veterans and doing our research. We’re not just funding them so that they can go to NIH. We’d prefer to put our money somewhere else, so that’s essential – you need to be directly involved and visit the sites. Don’t just conference in. Be there and see what’s going on and meet with the staff and develop some kind of interconnections and provide them gentle but direct oversight into the direction that they need to head. Do the same when they’re not heading in the right direction. We’ll do that. We’ll go out to the next steering committee. It’s not on the telephone. You go there and you talk to them behind closed doors and bringing up issues that you’re concerned about. As long as everyone knows what the responsibilities are and what the expectations are, I think it works much better. If you take a hands-off approach and just read the reports and sit in on a conference call from time to time, I’d be surprised if it doesn’t happen that something goes off target.”
3.4.2 Center Directors
The role of the center director was identified as pivotal to overall program success. The director should not only have a strong record of research or subject matter expertise in an area, but also a track record of demonstrated successes showing that they have the skills and networks to make connections with others. Having a strong center director assists in achieving program success. The director may be, as one interviewee whose program addressed research described it, “major players in the field … . Major players at major universities. It’s just a lot easier getting these people to understand good science, because that’s their life.”
In the interview regarding PP 3-OWH: “the person that is selected by the site to be the center director is so critical. I don’t mean that it has to be this really dynamic outspoken pushy person. It has to be a person that’s very influential to get people to do things. I mean when you think about it, we give $150,000 to $160,000 per year, which is nothing, and we’re asking them to do five components and integrate those components with each other. At a community entity that is not very much money to do that. When we go on site visits we meet 60 to 70 people who are involved in each CoE in order to get the requirements of those five or six components met. Of course, with that amount of money you can’t pay 1% of anybody’s salary to help you to do that, so the personality of the leader, the center director is so critical. They have to be able to be respected and trusted, and they have to be able to influence others to do the work of the CoE. We have seen over and over again, we have had five CoEs that have failed. It’s usually because of not being able to sort of convince others to do the requirements of the contract.”
3.4.3 RFA
RFAs generally provide clear overall program goals for applicants. Interviewees also discussed various other aspects, including clearly defined expectations of the funding agency and clearly defined grantee expectations and commitments, such as required participation in meetings, the evaluation process, and committees.
Regarding the Roybal and RCMAR centers (PP 5), it was stated that, “Some people accuse me of overwriting the RFA, but I’m happy that I did it. It was very tight … . It specified what the outcomes were that we wanted. We wanted to see researchers getting grants. We wanted to see the development of measures that were consistent across the various racial and ethnic groups and that sort of thing.”
However, with respect to PP 8-TRIP, the respondent acknowledged the need to balance specifying what is expected of a program against allowing for innovation. They suggested that, in hindsight, narrowing the topical focus of the grant, which in this case involved research, would have assisted the program overall; yet they were wary of being overly prescriptive, in order to allow for research innovation. In the words of the interviewee, “I mean in hindsight for us, I guess [it] would have been maybe [to] more narrowly focus the areas of interest that we were hoping to capture, but then again we might have missed out on some of the interesting things that came in, you know? It’s really a tradeoff. We’re looking for innovation and you don’t want to be too structured in what you’re getting.”
3.4.4 Program Branding
Promotion of the program within the field was seen as important. Even more crucial, however, was ensuring that name changes or program changes are communicated to the field. For example, in discussing PP 6-ATTC, which underwent a name change, it was clear that many in the field no longer recognized the program once its name changed.
“They were ATTC addiction training centers, and then they became addiction technology transfer centers. Being branded and making themselves known to the people working in the addiction field was one of the obstacles that they had to overcome and we’re still working on that. I never fail when I go to a meeting of grantees to say how many of you know of the resources that are available to you through your ATTC, and sometimes I’ll get 80 hands out of a group of a hundred and sometimes I get 3, you know? It always surprises me that there’s always work to do, and pushing for people to know that we exist and to know where to find us on the web and how to contact their own regional ATTC.”
3.4.5 Funding and Program Expectations
As part of establishing a program and making it known in the field, it is important that program managers weigh their resources against expectations for outreach and dissemination activities. For example, in the case of PP 6-ATTC, limited funds prevented ongoing training over the course of a year. “Given our funding levels, we pretty much top out pretty early in the year as to the service that can be provided, and sometimes there’s a waiting. People don’t get turned away, but people are given realistic expectations about when we might be able to offer a training or a workshop or a summer school topic on something. It’s a resource issue.”
3.4.6 Program Cross-Site Collaboration
For those programs that encouraged cross-site collaboration, it was important to communicate this early in program implementation. Also crucial was “buy-in” from grantees, particularly with respect to recognizing cross-site collaboration as a program success and an outcome that was specifically targeted.
In the discussion on PP 9-TTURC, communications regarding cross-site collaboration were identified as an additional element of overall program success. For example, “I think that the semiannual meetings, that level of frequency is really important in the beginning. It continues to develop new collaborative activities. I think identifying or working with the group to help them bind to the idea that collaboration across centers can be an exciting opportunity, not just for themselves, but for their other colleagues that make up the centers and especially the junior colleagues. Something is really going to progress in the next generation by what we’re doing now. I’m probably going to be meeting, talking about opportunities for collaboration and brainstorming opportunities for collaboration. I think that there was a pretty good buy-in into the idea that collaboration would be a terrific outcome that would really be a good thing, and so people began thinking about what they might be collaborating on. There are some folks that were high-level collaborators, and there were others….and part of it was the nature of the scientific community and part of it was the nature of people.… Another was … what I mentioned earlier, which is … a general role that was raised by the program officer, which was to keep one’s eyes open for both science and individuals in different places that might fit together well. I think personally bringing people together to develop new directions or collaborations, partnerships – it’s been one of the things that’s been the most exciting for me in my career.”
Exhibit 3-1: General Program Information

Priority Program | Agency / Nonprofit Organization | Year Program Started | Program Address | Website(s)
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | Dept. of Health & Human Services / Agency for Healthcare Research & Quality (HHS/AHRQ) | 2000 | Center for Outcomes and Evidence, 540 Gaither Road, Rockville, MD 20850 | –
2. Regional Educational Laboratories (REL) | Dept. of Education / Institute of Education Sciences (ED/IES) | 1965 | 555 New Jersey Avenue NW, Room 506 E, Washington, DC 20208 | –
3. National Centers of Excellence in Women’s Health (NCEWH) | Dept. of Health & Human Services / Office on Women’s Health (HHS/OWH) | 1996 | Parklawn Building, 5600 Fishers Lane, Room 16A-55, Rockville, MD 20857 | http://www.4woman.gov/COE/index.htm; http://www.4woman.gov/CCoE/index.htm
4. Area Health Education Centers (AHEC) | Dept. of Health & Human Services / Health Resources & Services Administration (HHS/HRSA) | 1972 | AHEC Branch, Bureau of Health Professions, Division of State, Community and Public Health | –
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | Dept. of Health & Human Services / National Institutes of Health / National Institute on Aging (HHS/NIH/NIA) | 1993 (Roybal) / 1997 (RCMAR) | Individual Behavioral Processes Branch, Behavioral and Social Science Research Program, National Institute on Aging, 7201 Wisconsin Avenue, Suite 533, Bethesda, MD 20892-9205 | –
6. Addiction Technology Transfer Center (ATTC) | Dept. of Health & Human Services / Substance Abuse & Mental Health Services Administration (HHS/SAMHSA) | 1993 | Center for Substance Abuse Treatment, Rockwall II Building, 5600 Fishers Lane, Suite 618, Rockville, MD 20857 | –
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | Dept. of Veterans Affairs / Office of Research & Development (VA/ORD) | 1977 | Health Services Research & Development Service (124-I), 810 Vermont Avenue NW, Washington, DC 20420 | http://www.hsrd.research.va.gov/about/centers/; http://www.hsrd.research.va.gov/about/centers/centers_of_excellence.cfm
8. Translating Research Into Practice (TRIP) – II | Dept. of Health & Human Services / Agency for Healthcare Research & Quality (HHS/AHRQ) | 1997 | Center for Outcomes and Evidence, 540 Gaither Road, Rockville, MD 20850 | http://www.ahrq.gov/research/trip2fac.pdf
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | Dept. of Health & Human Services / National Institutes of Health / National Cancer Institute (HHS/NIH/NCI) | 1999 | Tobacco Control Research Branch, Executive Plaza North, Room 4034, 6130 Executive Blvd., MSC 7337, Bethesda, MD 20892-7337 | –
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | Robert Wood Johnson Foundation (RWJF) | 1999 | Route 1 and College Road East, P.O. Box 2316, Princeton, NJ 08543 | –
Exhibit 3-2: Funding and Sustainability

Priority Program | Total Number of Centers Funded | Funding Cycle (Number of Years) | Overall Funding Level per Program | Funding per Grantee | Funding Mechanism(s) for Program | Reasons for Initial Funding Mechanism Decision (and Any Subsequent Changes) | Additional Funding for Special Projects? | Expected to Continue Past Agency Funding? | Mechanisms in Place to Ensure Program Sustainability?
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | 9 | 5 | $45M total | $250K - $2M/yr | Cooperative Agreements | Greater staff involvement and collaboration with and across grantees | Y | Respondent did not discuss | N
2. Regional Educational Laboratories (REL) | 10 | 5 | $65M/yr | $3.8M - $8.6M/yr | Changed from Grants to Contracts | Changed because more central office control & oversight was needed | Y | Y | Y
3. National Centers of Excellence in Women’s Health (NCEWH) | 19 | 4 | $11-12M total | $156K/yr | Contracts | Greater control & oversight | Y | Y | Y
4. Area Health Education Centers (AHEC) | 46 | 3 | $34-36M/yr | $100K/yr | Cooperative Agreements | Good mechanism for academic / community partnerships | Y | Y | Y
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | 6 | 5 | $7.5M total (Roybal) / $18M total (RCMAR) | $360K/yr (Roybal) / $600K/yr (RCMAR) | Changed from P50 to P30 grants | Changed program focus from funding major projects to funding smaller grants / pilots | N | N | N
6. Addiction Technology Transfer Center (ATTC) | 14 | 5 | $8M total | $500K total | Cooperative Agreements | More opportunities to work cooperatively with grantees to shape program direction – federal staff participants in process, not just monitors | N | Y | Y
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | 13 | 5 | $60M/yr | $760K - $1M/yr | Transfer of Funds | Operational requirement within the VA – no other funding options | Y | Y | Y
8. Translating Research Into Practice (TRIP) – II | 13 | 3 | $7.5M total | $100K - $600K/yr | Changed from R01s to Cooperative Agreements | New direction – very rigorous study designs to real-world applications & community partnerships | Y | Y | Y
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | 7 | 5 | $70M total | $2.5M/yr | P50 Grants | P50 allows training and pilot projects; wanted to fund centers with multiple disciplines to work together | N | Y | Y
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | 7 | 5 | $14M total (started 2001) | $2M total | Grants | Standard RWJF funding mechanism | N | Y | Y
Exhibit 3-3: Support and Coordination

Priority Program | Scientific Support & Oversight? | Programmatic Support & Oversight? | Specific Types of Skills Program Staff Need for Success | Examples of Management Procedures, Structures, and Activities to Support Grantees Effectively
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | Y | Y | Advanced knowledge of study design, clinical interventions, program management | –
2. Regional Educational Laboratories (REL) | N | Y | Strong program management, quantitative skills | –
3. National Centers of Excellence in Women’s Health (NCEWH) | N | Y | Strong writing ability, analytical skills, contract / grant experience | –
4. Area Health Education Centers (AHEC) | Respondent did not discuss | Y | Respondent did not discuss | –
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | Y | N (Roybal) / Y (RCMAR) | PhD or MD in related areas | –
6. Addiction Technology Transfer Center (ATTC) | Respondent did not discuss | Y | Respondent did not discuss | –
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | N | Y | Respondent did not discuss | –
8. Translating Research Into Practice (TRIP) – II | Y | Y | Advanced or general degree with good working knowledge of translational activities | –
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | Y | Program staff should have scientific background in at least one area represented by grantees | –
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | Y | Respondent did not discuss | –
Exhibit 3-4: Program Evaluation

Priority Program | Formal External Program Evaluation(s)? | Any Other Type of Evaluation? | Other Evaluations External (E) or Internal (I)? | Brief Description of Evaluation(s) | Examples of Mechanisms Used | Published Report / White Papers? | Future Program Evaluation(s) Planned or Considered?
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | N | N | N/A | N/A | N/A | N/A | N
2. Regional Educational Laboratories (REL) | Y | Y | E | – | Surveys, interviews, site visits, review of relevant statutes | Y | Y
3. National Centers of Excellence in Women’s Health (NCEWH) | Y | N | N/A | Baseline external program evaluation | Surveys, interviews, site visits | Y | Y
4. Area Health Education Centers (AHEC) | Y | N | N/A | AHEC program evaluation by UNC-CH Sheps Center | Site visits, surveys, focus groups, interviews, past data sets | Y | Respondent did not discuss
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | N | Y | I | Process and outcome measures report – designed with help from grantees | On-line data in standardized format to coordinating center every 6 months | Y | N
6. Addiction Technology Transfer Center (ATTC) | N | Y | I | Internal evaluation only – cross-site end-of-year evaluations by part-time national evaluator | Case studies, internal evaluation instruments | N | N
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | N | Y | I | Comparison of grantees in annual center reports & annual matrix (summary) reports | Grantees enter data on key funding activities into central database | N | Y
8. Translating Research Into Practice (TRIP) – II | N | Y | I | Internal parts evaluation – examine specific topic areas | Grantees enter data into data reporting template | N | Y
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | N | Y | I | Evaluation conducted by concept mapping & evaluation expert temporarily at NCI | Surveys, bibliographic analysis, review of three years of progress reports | N | Y
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | N | N (done by NIH) | N/A | N/A | N/A | N/A | N
Exhibit 3-5: Progress Monitoring

Priority Program | Site Visits? | In-house (I) or Outside (O) Experts on Site Visits? | Grantee Reports Required? | Examples of Reports / Data Submission Required | Standardized Format to Complete Reports / Submit Information Online? | Face-to-Face Grantee Meetings? | How Often are Grantee Director Meetings?
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | Y | I | Y | Progress reports submitted “regularly” | Respondent did not discuss | Y | Respondent did not discuss
2. Regional Educational Laboratories (REL) | N | N/A | Y | – | Y | Y | 4 times/year
3. National Centers of Excellence in Women’s Health (NCEWH) | Y | I & O | Y | – | Respondent did not discuss | Y | 2 times/year
4. Area Health Education Centers (AHEC) | Y | I | Y | – | Y | Y | 1 time/year
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | N (Roybal) / Y (RCMAR) | I | Y | Annual progress | Y | Y | 1 time/year
6. Addiction Technology Transfer Center (ATTC) | Y | I | Y | Biannual progress | Y | Y | 3 times/year (at a different grantee site each time)
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | Y | I | Y | – | Y | Y | Respondent did not discuss
8. Translating Research Into Practice (TRIP) – II | Y | I | Y | Biannual progress | Y | Y | Respondent did not discuss
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | N (done by RWJF staff) | N/A | Y | Annual progress | Y | Y | Respondent did not discuss
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | I | Y | Annual progress | Respondent did not discuss | Y | 2 times/year (with communications directors)
Exhibit 3-6: Program Management

Priority Program | Committees to Shape / Inform Program Management? | Task Forces / Work Groups / Advisory Boards? | Examples of Task Forces / Working Groups / Committees / Advisory Boards | Written Policies & Procedures? | Management Barriers & Facilitators at Initial Program Setup | Advice for Early Stages of Program Setup & Development
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | N | Respondent did not discuss | Respondent did not discuss | Respondent did not discuss | Difficult study designs, issues to study, data collection, implementation | Respondent did not discuss
2. Regional Educational Laboratories (REL) | N | Y | External working groups to monitor lab progress and activities | Y | Respondent did not discuss | Respondent did not discuss
3. National Centers of Excellence in Women’s Health (NCEWH) | N | Y | – | Y | Respondent did not discuss | –
4. Area Health Education Centers (AHEC) | Y | Y | – | Respondent did not discuss | Respondent did not discuss | Respondent did not discuss
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | N (Roybal) / Y (RCMAR) | N (Roybal) / Y (RCMAR) | – | N | P32 training grants (overhead of 8%, no faculty salary) hurt recruiting of major researchers | –
6. Addiction Technology Transfer Center (ATTC) | Y | Y | – | N | – | –
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | Y | Y | – | Y | Dip in performance at newer centers after 1st / 2nd year – building capacity, proposals for additional funds | –
8. Translating Research Into Practice (TRIP) – II | Y | Y | – | Y | – | Think about how narrowly or broadly you want to focus the areas of interest – tradeoffs between structure and innovation, rigor and generalizability
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | Y | – | Respondent did not discuss | – | –
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | Y | Oversight committee formed with well-known leaders in tobacco research | Y | Difficult communicating to PIs what & why interested in funding at centers | –
Exhibit 3-7: Cross-Grantee Identity, Activities and Products

Priority Program | Cross-Grantee Identity Developed? | Materials Produced to Promote Grantees? | Examples of Promotional Materials | National and/or Regional Meetings? | Frequency of National / Regional Meetings? | Cross-Grantee Activities / Events / Products? | Examples of Cross-Grantee Activities / Events / Products & Resources to Promote Collaboration
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | Respondent did not discuss | Y | EXCEED program brief | Respondent did not discuss | Respondent did not discuss | Respondent did not discuss | Respondent did not discuss
2. Regional Educational Laboratories (REL) | Y | Y | REL annual reports with information on all labs | Y | Annually | Y | –
3. National Centers of Excellence in Women’s Health (NCEWH) | Y | Y | – | Y | Respondent did not discuss | Y | –
4. Area Health Education Centers (AHEC) | Y | Y | – | Y | Annually | Y | Cross-grantee small group discussions with federal program representatives at program director meetings; information sharing – program monitor shares interesting activities at specific sites with other grantees
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | N (Roybal) / Y (RCMAR) | Y | – | Y | Annually | Y (RCMAR) | Joint RCMAR production – The Science of Inclusion: Recruiting and Retaining Racial and Ethnic Elders in Health Research
6. Addiction Technology Transfer Center (ATTC) | Y | Y | Website – links to cross-site and regional products and publications | Y | Annually | Y | –
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | N | Y | Each center has a link to its own website on the VA HSR&D site – each center has its own logo, newsletter, and identity | Y | Annually | N | –
8. Translating Research Into Practice (TRIP) – II | Y | Y | Press releases, TRIP-II fact sheet | Y | Annually | Y | –
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | Y | TTURC website – links to cross-center collaborations, individual center information, publications, news releases, etc. | Y | Biannually | Y | –
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | Y | National Program Office produces newsletter about all sites (hard copy), RWJF website | Y | Biannually | Y | –
Exhibit 3-8: Outreach and Policy Activities

Priority Program | Promotional Materials Produced for Outreach and Policy Activities? | Examples of Outreach Activities / Promotional Materials / Policy Briefing Activities / Methods Used to Disseminate Findings
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | Y | –
2. Regional Educational Laboratories (REL) | Y | –
3. National Centers of Excellence in Women’s Health (NCEWH) | Y | –
4. Area Health Education Centers (AHEC) | Y | –
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | Y | –
6. Addiction Technology Transfer Center (ATTC) | Y | –
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | Y | –
8. Translating Research Into Practice (TRIP) – II | Y | Paper published in the International Journal for Quality in Health Care entitled “Translating research into practice: the future ahead” – summary and analysis of TRIP I & II applications funded in 1999 and 2000
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | Two published papers (“Facilitating transdisciplinary research: the experience of the transdisciplinary tobacco use research centers” and “Evaluating transdisciplinary science”) in a special December 2003 issue of Nicotine & Tobacco Research – funding agencies describe program intent, mission, and evaluation findings
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | Y | –
Exhibit 3-9: Document Sharing with CDC

Priority Program | Examples of Available (or Potentially Available) Documents to Share with CDC | Individuals to Contact and/or Websites to Search for Documents to Share with CDC | OK to Contact Interviewee Again?
1. Excellence Centers to Eliminate Ethnic / Racial Disparities (EXCEED) | EXCEED program brief | http://www.ahrq.gov/research/exceed.htm - EXCEED Program Brief; interviewee inherited the program in June 2003 – she said to contact her with specific questions and she will locate the correct people to contact with additional knowledge regarding earlier program stages | Y
2. Regional Educational Laboratories (REL) | 2002 GAO report 02-190, annual reports, scientific standards | http://www.gpoaccess.gov/gaoreports/advanced.html (search for report “GAO-02-190”, Education Should Improve Assessments of R&D Centers, Regional Labs, and Comprehensive Centers [PDF or HTML version]); http://www.relnetwork.org/publications.html - Products & Publications (e.g., Annual Reports); http://www.w-w-c.org/ - What Works Clearinghouse (scientific standards) | Y
3. National Centers of Excellence in Women’s Health (NCEWH) | Many resources available on website (e.g., evaluation reports [executive summaries], brochures, leveraging strategies, pamphlets, newsletters, fact sheets, white papers) | Contact interviewee if unable to locate any documents | Y
4. Area Health Education Centers (AHEC) | 2002 AHEC evaluation by Cecil G. Sheps Center at UNC-Chapel Hill, NAO electronic newsletters, pamphlets, brochures, resource manual (info on state programs), and biannual National AHEC bulletin | http://www.nationalahec.org/update_archive/AHECEvaluativeStudyFinalReport.pdf - evaluation report; http://www.nationalahec.org/News/Archive/ - NAO newsletters; http://bhpr.hrsa.gov/grants/reports.htm - link to Comprehensive Performance Management System (CPMS) and Uniform Progress Reports (UPR) (Word version); pamphlets, brochures, resource manuals, biannual AHEC bulletin – contact Judy Lyle at NAO | Y
5. Roybal Centers for Translational Research in the Behavioral & Social Sciences (Roybal) & Resource Centers for Minority Aging Research (RCMAR) | Annual NIA report, Roybal issue briefs, RCMAR summary reports, RFA | Contact interviewee for copies of annual report & Roybal issue briefs; http://www.rcmar.ucla.edu/model.php - RCMAR model; http://www.rcmar.ucla.edu/cores.php - links to RCMAR summary reports and administrative, measurement / methods, investigator development, and community liaison cores; http://grants1.nih.gov/grants/guide/rfa-files/RFA-AG-04-007.html - RFA | Y
6. Addiction Technology Transfer Center (ATTC) | The Change Book, Blending Initiative, ATTC newsletter, TAP 21 | http://www.nattc.org/pdf/changebook.pdf - The Change Book: A Blueprint for Technology Transfer; http://www.nida.nih.gov/CTN/dissemination.html - information on Blending Initiative; http://www.nattc.org/newsField/networker.html - ATTC Networker newsletter (published 3x/year); TAP 21 (Core Counselor Competencies) – request copy from Mary Beth Johnson (Director, ATTC National Office; email: [email protected]; phone: 816-482-1200) – OK to mention interviewee’s name | Y
7. Health Services Research & Development Service Centers of Excellence (HSR&D COE) | HSR&D policies and procedures manual, internal publications | http://www.hsrd.research.va.gov/for_researchers/resources/policy_documents.cfm - policies and procedures for HSR&D programs; http://www.hsrd.research.va.gov/publications/search_pubs.cfm - internal publications | Y
8. Translating Research Into Practice (TRIP) – II | TRIP-II fact sheet, journal article, RFA | http://www.ahrq.gov/research/trip2fac.pdf - TRIP-II fact sheet; http://intqhc.oupjournals.org/cgi/reprint/14/3/233 - PDF version of “Translating research into practice: the future ahead” (International Journal for Quality in Health Care); http://grants.nih.gov/grants/guide/rfa-files/RFA-HS-00-008.html - RFA | Y
9a. NIH / NCI – Transdisciplinary Tobacco Use Research Centers (TTURC) | Two journal articles, unpublished internal evaluation report, RFA, cross-center collaborations on website | Interviewee mailed journal with two pertinent articles (see the Outreach and Policy Activities matrix, Exhibit 3-8, for details); evaluation report is currently an unpublished internal document – CDC may send the interviewee a request for a public version, which he will forward to the proper individuals; http://grants.nih.gov/grants/guide/rfa-files/RFA-CA-98-029.html - RFA | Y
9b. RWJF – Transdisciplinary Tobacco Use Research Centers (TTURC) | RWJF style guide, newsletters from National Program Office | http://www.rwjf.org/publications/style/I/titlepage.jsp - RWJF Style Guide; http://www.rwjf.org/grantee/connect/index.jhtml - Connect project information; contact interviewee for copies of newsletter from National Program Office | Y
1 Note: Grantees were interviewed for their feedback on this program.
2 It is our understanding that during the screening process, CDC was unable to reach the federal program officer and instead contacted the National AHEC Organization, which was described as “an organization that includes the AHEC program director, the Health Education Training Centers (HETC) program director and all the center directors for HETC.”
3 The full findings for the TTURC program and evaluation were published in the following two articles: (1) Morgan, G.D., Kobus, K., Gerlach, K.K., Neighbors, C., Lerman, C., Abrams, D.B., and Rimer, B.K. 2003. Facilitating transdisciplinary research: the experience of the transdisciplinary tobacco use research centers. Nicotine & Tobacco Research 5(Suppl 1): 11–19; and (2) Stokols, D., Fuqua, J., Gress, J., Harvey, R., Phillips, K., Baezconde-Garbanati, L., Unger, J., Palmer, P., Clark, M.A., Colby, S.M., Morgan, G., and Trochim, W. 2003. Evaluating transdisciplinary science. Nicotine & Tobacco Research 5(Suppl 1): 21–39.