
SUPPORTING STATEMENT


Part B



Assessing the Feasibility of Disseminating Effective Health

Care Products through Educational Activities Planned

and Implemented in Partnership with the

Society for Academic Continuing Medical Education



OMB No. 0935-0194













Version: May 17, 2012






Agency for Healthcare Research and Quality (AHRQ)

















Table of Contents


B. Collections of Information Employing Statistical Methods

1. Respondent Universe and Sampling Methods

2. Information Collection Procedures

3. Methods to Maximize Response Rates

4. Tests of Procedures

5. Statistical Consultants





































B. Collections of Information Employing Statistical Methods


1. Respondent Universe and Sampling Methods


The evaluation activities described in Supporting Statement Part A will be used to gather data from six academic organizations that offer continuing medical education (CME) to clinical practitioners, clinical researchers, or clinical faculty in educational settings. The sampling frame was defined as all 11 institutions that submitted a proposal describing a means of integrating comparative effectiveness research into a CME activity of any format (e.g., live, online, Webinar, academic detailing). Because this is a feasibility study designed to collect pilot data, there was no intent to generalize to a larger universe of CME providers, and probabilistic sampling was therefore unnecessary. Instead, purposeful sampling was used to select projects that could reasonably be pursued with the available support and that employed innovative educational formats and/or curriculum design. Criteria were developed to rank the project submissions on the following: (a) dissemination – the project will potentially reach large numbers of clinicians and other audiences and will provide valuable feedback on dissemination, educational effects, and/or other outcomes; (b) guidance – the project will inform the CME community on ways to incorporate AHRQ products into educational programming; (c) replicability – the project will be scalable/extensible to other CME applications and products; and (d) overall quality.
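
For illustration only, the following minimal Python sketch shows one way such a criterion-based ranking could be operationalized. The equal weighting, the 1-to-5 rating scale, and all scores shown are assumptions made for this sketch; they do not reflect the actual review instrument or ratings.

```python
# Hypothetical illustration of the four-criterion ranking described above.
# The criterion names follow the text; the equal weights and the 1-5
# ratings are invented for this sketch.

CRITERIA = ("dissemination", "guidance", "replicability", "overall_quality")

def composite_score(ratings):
    """Average the 1-5 ratings across the four criteria (equal weights assumed)."""
    return sum(ratings[c] for c in CRITERIA) / len(CRITERIA)

# One entry per submission; the actual sampling frame held 11 in all.
proposals = {
    "Submission A": {"dissemination": 5, "guidance": 4, "replicability": 4, "overall_quality": 5},
    "Submission B": {"dissemination": 3, "guidance": 5, "replicability": 3, "overall_quality": 4},
    "Submission C": {"dissemination": 4, "guidance": 3, "replicability": 5, "overall_quality": 4},
}

# Rank submissions from highest to lowest composite score; the top-ranked
# projects would then be weighed against available support before selection.
for name, ratings in sorted(proposals.items(), key=lambda kv: composite_score(kv[1]), reverse=True):
    print(f"{name}: {composite_score(ratings):.2f}")
```

A weighted average could be substituted if, for example, dissemination reach were judged more important than the other criteria.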


Each institution submitting a proposal was asked to rank the pool of projects according to the defined criteria. These rankings were nonbinding on the Eisenberg Center; the institutions' top six choices matched the Eisenberg Center's selections. The selected organizations include one in the Northeast, three in the Midwest, one in the South, and one on the West Coast. Projects were selected with the intent of acquiring a comprehensive mix of instructional methods: one organization is using peer detailing, another is using moderated learning communities with follow-up case studies, a third has taken a clinical quality improvement approach, a fourth is integrating comparative effectiveness research (CER) findings into live large-group CME programs targeting primary care providers, a fifth is exploring ways of integrating CER findings into different types of CME activities that use either live presentation or enduring materials (e.g., CD-ROMs, audio/video tapes, monographs, or other products that can be used multiple times for learning purposes), and a sixth is incorporating Effective Health Care (EHC) Program topics into multimodal learning activities sponsored by an osteopathic medical school. The content-specific implementation strategies and objectives associated with each project are summarized in Table 1.


TABLE 1: Model Projects and Unique Dissemination Objectives Involving
Society for Academic Continuing Medical Education Member Organizations

Project: Peer Detailing (New York City)
Objective: To use EHC materials to develop sustainable educational networks that can serve as the basis for future cooperative CME, with the aims of practice improvement and improved population health.

Project: Moderated Learning Communities and Follow-up Case Studies (Kentucky)
Objective: To implement presenter-moderated learning communities with the goal of engaging learners in the process of identifying barriers to change regarding EHC evidence.

Project: Physician Performance via Quality Improvement Services (Chicago)
Objective: To assess changes in practice performance resulting from the integration of AHRQ CER findings into clinical practice.

Project: Comparative Effectiveness Research as an Adjunct to, or as the Primary Topic in, Live Annual Large Group CME Seminars (West Coast)
Objective: To educate clinicians on the AHRQ comparative effectiveness research process while presenting findings from comparative effectiveness reviews that are important to a primary care audience.

Project: CER Findings in Annual CME Programs (Cincinnati)
Objective: To assess the impact of CER recommendations adapted to different types of live and enduring material formats.

Project: Incorporation of EHC Topics into Existing UNTHSC-Sponsored Conferences and Related Activities Directed to Osteopathic Physicians in Learning and in Practice (Denton, TX)
Objective: To determine how EHC Program materials can be integrated effectively into multimodal learning activities designed to address the needs of faculty, students, and area practitioners who participate in activities delivered primarily by the primary care faculty of an osteopathic medical school.


Table 2 summarizes the number of CME activities to be carried out by the six collaborating organizations, the CME formats to be used, and the projected number of participants across each organization's full range of activities.


TABLE 2: Numbers and Formats of CME Activities to Be Offered and
Total Estimated Number of Participants in Activities to Be Carried Out by Each Site

Peer Detailing (New York City): 1 activity (1 detailing); est. 200 participants

Moderated Learning Communities and Follow-up Case Studies (Kentucky): 4 activities (2 live; 2 Internet enduring material); est. 40 participants

Physician Performance via Quality Improvement Services (Chicago): 3 activities (2 live; 1 Internet enduring material); est. 120 participants

Comparative Effectiveness Research as an Adjunct to, or as the Primary Topic in, Live Annual Large Group CME Seminars (West Coast): 5 activities (4 live; 1 Internet enduring material); est. >725 participants

CER Findings in Annual CME Programs (Cincinnati): 4 activities (1 live; 2 grand rounds; 1 Internet enduring material); est. 335 participants

Multimodal CME Developed through an Osteopathic Medical School (Denton): 55 activities (4 live; 3 journal club; 36 grand rounds; 8 newsletter topics; 3 Webcasts; 1 enduring material); est. >1,000 participants

Across all six sites: 72 activities, with an estimated total of >2,420 participants.


For all six of these partner initiatives, CME professionals at the primary institutions through which the activities will be planned and carried out will gather data on learner performance in a manner consistent with current methods required for CME activities. These methods include: 1) gathering information on the characteristics of learners (e.g., names, professional credentials, employment settings); 2) measuring changes in topical knowledge before and after participation in CME activities; 3) assessing intentions to change practice behaviors based on participation in a CME activity; and 4) gauging learner perceptions of the quality of the CME activities in which they were engaged and soliciting recommendations for improvement. This approach to CME evaluation is standard practice across the CME community and is required to satisfy the accreditation requirements of the national accrediting body, the Accreditation Council for Continuing Medical Education (ACCME).
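
For illustration of item 2 above, the following minimal sketch shows a conventional paired pre/post analysis of knowledge scores. The scores, the sample size, and the choice of a paired t-test are assumptions for this sketch; the sites' actual instruments and analysis plans may differ.

```python
# A minimal sketch of a pre/post knowledge-change assessment, assuming
# paired percent-correct scores for the same learners. All numbers are
# hypothetical.
from scipy import stats

pre  = [62, 55, 70, 48, 66, 59]   # hypothetical pre-activity scores
post = [78, 71, 82, 65, 80, 74]   # hypothetical post-activity scores (same learners)

mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)

# Paired t-test on within-learner change -- one conventional choice for
# paired pre/post data.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"Mean knowledge gain: {mean_gain:.1f} points (t = {t_stat:.2f}, p = {p_value:.3f})")
```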


Semi-structured interviews will be conducted with the CME faculty responsible for content development and/or presentation to assess the development and implementation of CME activities that integrate EHC Program products or materials into course content. An attempt will be made to include all faculty across the six projects.


The CME providers responsible for directing the six projects and for identifying faculty, content focus, and target audiences will be evaluated through interviews and a focus group. Again, an attempt will be made to include all participants.



2. Information Collection Procedures


All information collections will be conducted in a manner that is consistent with the following guidelines:


  • Participation will be completely voluntary, and non-participation will have no effect on eligibility for or receipt of future AHRQ-sponsored health services research or products.


  • Sample size and selection with respect to CME providers was described above.


  • Information collections will be limited to assessments designed to do the following:

1) Identify critical factors that enhance or impede integration of EHC evidence into CME activities;

2) Assess strategies to remove, overcome, and/or address barriers to implementation of effective CME programming with selected audiences;

3) Confirm approaches that can be used in whole or in part to create and deliver effective CME instruction about products (e.g., clinician guides, consumer guides) associated with the EHC Program; and

4) Review early educational program outcomes associated with integration of EHC evidence into CME activities.


  • Given the voluntary nature of the information collections, efforts will be made to obtain the highest possible response rates. Efforts will also be made to assess non-response bias, to the extent feasible.
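
For illustration of the last point, the following minimal sketch shows how a response rate and a simple non-response bias check could be computed for a single activity. The counts and the comparison characteristic (a learner attribute known for everyone invited) are hypothetical.

```python
# A minimal sketch of a response-rate calculation and a simple
# non-response bias check for one activity. All counts are hypothetical.
from scipy.stats import chi2_contingency

invited, completed = 200, 142
print(f"Response rate: {completed / invited:.1%}")

# Contingency table: rows = respondents / non-respondents,
# columns = physicians / other clinicians (known for everyone invited).
table = [
    [90, 52],  # respondents
    [38, 20],  # non-respondents
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}  (a small p flags a possible composition difference)")
```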


3. Methods to Maximize Response Rates


The design of each information collection will include approaches to maximize response rates, while retaining the voluntary nature of the effort, consistent with appropriate survey methodology. These will include approaches described in Supporting Statement A such as sending one or more follow-up requests for participation in an assessment activity, providing remuneration (i.e., a maximum honorarium of $250 for physician involvement) to encourage participation, and maintaining data confidentiality and, when possible, anonymity to encourage open and honest responses. As noted in Supporting Statement A, all persons will be informed that participation in data collection activities is entirely voluntary; that any information they provide will be combined and summarized with information provided by others; and that no individually identifiable information will be released. In instances where respondent identifiers are needed and anonymity cannot be maintained (e.g., participation in focus groups/interviews, registration for updates), all contacts and data collections will fully comply with the requirements of the Privacy Act.


4. Tests of Procedures


The proposed data collection mechanisms have been informed by hundreds of assessment instruments previously developed and administered to learners and faculty participating in CME activities designed by Baylor College of Medicine, in its role as the Eisenberg Center, and under other AHRQ contracts. Additionally, the instruments proposed here were developed in consultation with a three-member panel of CME thought leaders, serving as an Evaluation Subcommittee, who have high levels of expertise and experience in evaluation methodology and applied research. When feasible, instruments were pretested with groups of fewer than 10 representatives of the target audiences and were then refined and finalized.


5. Statistical Consultants


The Evaluation Subcommittee has provided input throughout the project on evaluation methodology and will continue to provide guidance during the data analytic phase and report generation. No other external entities will be engaged for consultation.

