Comparative Effectiveness Research: Portfolio

OMB: 0990-0381




Supporting Statement for OMB Clearance of Assessment of American Recovery and Reinvestment Act (ARRA) Comparative Effectiveness Research (ACERE)


Section A: Purpose and Use of Information


May 6, 2011


Project Officer:

Kate Goodrich, M.D., MHS

Medical Officer

Office of the Assistant Secretary for Planning and Evaluation

U.S. Department of Health and Human Services

200 Independence Avenue, SW

Washington, DC 20201


CONTENTS

Circumstances Making the Collection of Information Necessary

Purpose and Use of Information Collection

Use of Improved Information Technology and Burden Reduction

Efforts to Identify Duplication and Use of Similar Information

Impact on Small Businesses or Other Small Entities

Consequences of Collecting the Information Less Frequently

Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

Comments in Response to the Federal Register Notice/Outside Consultation

Explanation of Any Payment/Gift to Respondents

Assurance of Confidentiality Provided to Respondents

Justification for Sensitive Questions

Estimates of Annualized Hour and Cost Burden

Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers/Capital Costs

Annualized Cost to Federal Government

Explanation for Program Changes or Adjustments

Plans for Tabulation and Publication and Project Time Schedule

Reason(s) Display of OMB Expiration Date is Inappropriate

Exceptions to Certification for Paperwork Reduction Act Submissions

References




TABLES

A.1. HHS Advisory Panel

A.2. Individuals Consulted by the Contractor on Data Collection

A.3. Estimated Hour Burden, by Data Collection Activity

A.4. Estimated Hour Cost Burden, by Data Collection Activity

A.5. Data Collection Schedule




LIST OF ATTACHMENTS

Attachment A: Authorizing Legislation



Attachment B: PSLA—Web-based survey of PIs and PDs

Attachment C: PSLA—In-depth telephone interviews with PIs and PDs

Attachment D: SSLA—Web-based survey of three key stakeholder groups in two rounds

Attachment E: SSLA—Focus groups with members of the general public in two rounds

Attachment F: SSLA—In-depth telephone interviews with providers

Attachment G: SSLA—In-depth telephone interviews with health care organizations

Attachment H: SSLA—In-depth telephone interviews with patients/consumers

Attachment I: SSLA—In-depth telephone interviews with employers and payers

Attachment J: SSLA—In-depth telephone interviews with researchers

Attachment K: SSLA—In-depth telephone interviews with innovators

Attachment L: Invitation to participate and reminder notices for PIs and PDs

Attachment M: Invitation to participate and reminder notices for stakeholders








Justification

1. Circumstances Making the Collection of Information Necessary

The Assistant Secretary for Planning and Evaluation (ASPE) is requesting approval from the Office of Management and Budget (OMB) for the following data collection activities to support an evaluation and impact assessment of the American Recovery and Reinvestment Act of 2009 (ARRA) comparative effectiveness research (CER) portfolio:

  • A web-based survey of principal investigators (PIs) and project directors (PDs) who have received ARRA funds to conduct CER

  • In-depth telephone interviews with PIs and PDs

  • A web-based survey (conducted in two rounds, one year apart) of three groups of key stakeholders in CER: health care providers, health care administrators, and patients/consumers

  • Focus groups with members of the general public in two rounds, one year apart

  • In-depth telephone interviews with stakeholders of CER, including health care providers, health care administrators, patients/consumers, members of the general public, employers and payers, researchers, and developers of health innovations

Background and Legislative Requirements. Researchers and policymakers have emphasized the need for research on effectiveness of health care interventions under real-world conditions in diverse populations and clinical practice settings, that is, CER. The ARRA expanded federal resources devoted to CER by directing $1.1 billion to the U.S. Department of Health and Human Services (HHS) for such research, with $300 million allocated to the Agency for Healthcare Research and Quality (AHRQ), $400 million to the National Institutes of Health (NIH), and $400 million to the Office of the Secretary.

ARRA required the study of priority setting for CER to assist HHS in allocating these funds. For example, the legislation called for a report on priority CER topics by the Institute of Medicine (IOM). By June 2009, IOM gathered input from the research, professional, and public communities; reviewed the current state of knowledge; convened experts; and produced a report to Congress and the Secretary of HHS that presented priority CER topics and recommendations to support a robust and sustainable CER enterprise. In addition, ARRA established the Federal Coordinating Council on Comparative Effectiveness Research (FCCCER), whose general purpose was to help coordinate and minimize duplicative efforts of federally sponsored CER across multiple agencies and to advise the President and Congress on how to allocate federal CER expenditures. This council created a strategic framework that identified the need for CER investments in four core categories: research in comparative effectiveness, human and scientific capital, data infrastructure, and dissemination and translation. The FCCCER also identified three cross-cutting priority themes for consideration in investment: (1) populations, (2) conditions, and (3) types of interventions. Independently, HHS also created a CER framework that included many of the elements of the FCCCER model, but also incorporated evidence needs identification and stakeholder input and involvement as necessary to a robust CER enterprise.

The hoped-for impact of ARRA CER initiatives is the development of more and better clinical evidence that will foster fundamental change in how evidence is used in clinical practice and promote greater value in the health care system. Through this evaluation, ASPE seeks to understand whether initial investments appear to be accomplishing their goals and to help policymakers set a course that will achieve these long-term benefits.

Overview of the Study Design. This project aims to evaluate and assess the products and outcomes of ARRA-funded CER investments and the impacts of those investments on the priority topics recommended by IOM and on the categories and themes of the FCCCER and HHS frameworks. The evaluation will also gauge the evolution in CER-related knowledge and skills, opinions and attitudes, and behaviors and experiences among stakeholders and society in general and will ultimately draw lessons for future CER funding. The evaluation will incorporate data from new and existing sources and use multiple data collection methods over a two-year period to assess the broad array of CER-relevant federal programs and stakeholder and community perspectives.

The evaluation design consists of a mixed-methods approach for addressing the effectiveness of the ARRA CER portfolio in meeting its programmatic goals. The primary goals of this evaluation are to:

  1. Conduct an initial assessment of the ARRA CER portfolio, cataloguing how CER funding was invested to achieve the vision of the FCCCER, and assessing initial impacts from the perspective of various stakeholders

  2. Lay the groundwork for future CER investments by identifying investment opportunities, evidence gaps, and lessons learned; providing the tools for ongoing assessment; and providing recommendations for future investments

The evaluation consists of two components, each with unique objectives and each involving data collection activities that impose public burden and for which clearance is requested: (1) a Project-Specific Level of Analysis (PSLA) and (2) a Societal/Stakeholder Level of Analysis (SSLA). The PSLA will identify the achievements and lessons learned from specific ARRA-funded CER projects, whereas the SSLA will undertake a broad study of knowledge and attitudes held by stakeholders and members of society in general. Each component is discussed in turn below.

Overview of the PSLA. The PSLA will provide information on the products, outputs, and outcomes of ARRA-funded CER projects, identify the factors that facilitate or limit project success, and determine whether there are systematic gaps in projects’ design, conduct, or dissemination that limit their value to decision makers. This level of analysis includes collecting data from the public via two methods: (1) a web-based survey of principal investigators (PIs) and project directors (PDs), and (2) in-depth telephone interviews with PIs and PDs.

Overview of the SSLA. The SSLA will provide information on CER-relevant knowledge and skills; attitudes and opinions; and behaviors and experiences among key stakeholders and members of the general public. This component of the evaluation will obtain information from several groups, including members of the general public who have no direct involvement in CER, persons or groups who have a vested interest in clinical decisions and the evidence that supports those decisions, and individuals or groups directly involved in CER projects. This component will use three public data collection activities to collect information on knowledge and skills, attitudes and beliefs, and behaviors and experiences to understand attitudes toward CER and the processes that stakeholders use to engage in it. The SSLA activities for collecting data from the public include: (1) a web-based survey in two rounds of three key stakeholder groups: health care providers, health care administrators, and patients/consumers; (2) two rounds of focus groups with members of the general public; and (3) in-depth telephone interviews with stakeholders.

The SSLA survey and focus groups will each be conducted at two points in time in order to evaluate whether there is any change in the knowledge, attitudes, and behaviors of the sample frame populations over time. The first round will occur shortly after OMB provides clearance for data collection (around November 2011), and the second round will occur approximately one year later with different participants, using the same instruments as in the first round.

2. Purpose and Use of Information Collection

The purpose of each data collection activity for which ASPE is seeking approval is as follows:

  • PSLA—Web-based survey of PIs and PDs. The sample frame consists of all ARRA-funded PIs and PDs. The data collected from the survey will provide important details about projects that cannot be obtained through existing HHS documents or from other sources. Data from the survey will bolster the project-level database and inform the evaluation of barriers to CER projects, such as the available research infrastructure or the availability of staff with appropriate expertise to conduct project work. In addition, the survey data will provide information on projects’ interim products, outputs, and outcomes; how these compare with projects’ goals and objectives; whether the resources, time, and supporting infrastructure were sufficient to meet the goals and objectives of the projects; and whether projects were on track to meet their goals.

  • PSLA—In-depth telephone interviews with PIs and PDs. The evaluation will identify and conduct telephone interviews with up to 50 ARRA-funded PIs and PDs. The purpose of this data collection activity is to explore in greater depth the challenges, barriers, and limitations, as well as the successes and promise, of the ARRA CER portfolio from the perspective of PIs and PDs. The interviews will provide more detail than can be obtained from the PI-PD survey. These one-hour telephone interviews will focus on the challenges projects face, whether these challenges have been overcome (and, if so, how), investigators’ views on limitations of their studies and how they might be minimized, and unintended consequences. Investigators will also be asked about the successes of their projects and the long-term sustainability of such work.

  • SSLA—Web-based survey of three key stakeholder groups in two rounds. The key stakeholder survey will provide data to answer questions about the CER-relevant knowledge and skills, attitudes and opinions, and behaviors and experiences of key groups who have a vested interest in clinical decisions and the evidence that supports those decisions. For the three key stakeholder groups, we will collect data on knowledge of, attitudes toward, and experiences with CER both for general CER topics and for specific CER topic areas. The key stakeholder groups that we will survey consist of the following populations:

  1. Health care providers, including primary care physicians, specialist physicians, nurse practitioners, and physician assistants

  2. Health care organizations, including hospitals, health care facilities for outpatient procedures (for example, ambulatory surgery centers), large medical group practices, long-term care facilities, and behavioral health centers

  3. Patients/consumers, namely patients with chronic or acute conditions, parents of children under age 18, and family caregivers of patients with chronic conditions

The purpose of administering the survey at two different times (in November–December 2011 and November–December 2012) is to examine whether there are changes in the knowledge, attitudes, and behaviors of a representative cross-section of the three key stakeholder groups toward CER. This pre-post design will provide two data points with different sample members during the evaluation period, and will serve as a baseline measure for future ASPE evaluations of ARRA-funded research.

  • SSLA—Focus groups with members of the general public in two rounds. The evaluation will include six focus groups in each round, held in three large metropolitan areas to ensure geographic and demographic diversity: Cambridge/Boston, Massachusetts; Oakland/San Francisco, California; and Chicago, Illinois. Large metropolitan areas were selected to ensure that participants are diverse both demographically and in their experiences with the health care system. The focus groups will examine the knowledge and skills, attitudes and opinions, and behaviors and experiences of the general public with respect to CER in two rounds. Because we expect relatively little knowledge or understanding of CER among the general public, a focus group is the most appropriate tool for uncovering public attitudes about CER: it allows participants to be exposed to new material, engage with it, and present their understanding and views of it, which is not possible through a survey. In addition, because knowledge of and attitudes toward CER do not develop in a vacuum but are largely shaped by public discourse, focus groups allow researchers to examine multiple perspectives in a group setting where participants can respond to each other's comments. The first round will begin in November 2011, and the second round will take place one year later. This pre-post design will provide two data points during the evaluation period, establishing a baseline for ASPE's future evaluations of ARRA-funded research and allowing ASPE to discern whether attitudes toward CER have changed among the general public.

  • SSLA—In-depth telephone interviews with stakeholders. The purpose of the in-depth stakeholder telephone interviews is to collect information on the three primary domains of interest—(1) knowledge and skills, (2) attitudes and beliefs, and (3) behaviors and experiences—in an effort to understand attitudes toward CER and the processes stakeholders use to engage in it. The interviews will also follow up on issues raised in earlier data collection activities such as the first round of stakeholder surveys and focus groups with the general public. To examine differences in the knowledge, attitudes, and behaviors of various stakeholder groups, this information will be collected from six stakeholder groups, namely: (1) health care providers, (2) health care organization administrators, (3) patients/consumers, (4) employers and payers, (5) researchers, and (6) developers of health innovations. These interviews will take place during May–August 2012.

ASPE will use the information from the evaluation to identify gaps and barriers to achieving ARRA CER goals; develop metrics to assess short- and long-term impacts of the ARRA CER portfolio; and receive formative feedback on ARRA CER portfolio effects. The evaluation incorporates the multiple data collection methods described above in order to assess the broad array of CER-relevant federal programs and stakeholder and community perspectives. The evaluation will also examine the effectiveness of these investments in building the nation’s CER capacity across the core categories and priority themes of the strategic framework of the FCCCER.

At this time, there are no other data sources that can address the goals of the evaluation described under question 1 above, thereby necessitating these data collection activities.

3. Use of Improved Information Technology and Burden Reduction

For every data collection activity in this evaluation, the contents of each instrument have been compared against the research questions that form the evaluation's goals. This comparison shows which survey, interview, and focus group items are needed to answer the research questions adequately, and it identifies any unnecessary items, which can then be deleted from the instruments. This procedure ensures that each instrument is complete yet collects only the minimum information necessary for the purposes of the project. Furthermore, throughout the evaluation, the contractor will coordinate the data collection activities to ensure that each activity collects unique information. This coordination will be achieved by having research team members overlap across the different components of the evaluation, so that each data collection component is informed by every other component.

The contractor will conduct each data collection activity involving public burden using the data collection mode that (1) is most appropriate for the research questions it is answering and (2) minimizes burden. For the surveys, the primary data collection mode will be a web-based instrument. For the focus groups, the mode is an in-person group discussion. For the in-depth interviews, the mode is a telephone interview.

  • Survey data collection. For both the PSLA survey of PIs and PDs and the SSLA survey of key stakeholders, the primary mode of data collection will be a web-based data collection instrument. The web instrument will offer the easiest means of providing data because it will be programmed to automatically skip questions that are not relevant to the respondent, thereby reducing respondent burden. The instrument will also allow respondents to complete the survey at a time convenient to them, without the risk of losing a paper questionnaire. Because the instrument will automatically skip to the next appropriate question based on a respondent's answers, it will also yield high-quality data. In addition to the web instrument, participants may request a paper (mail or fax) questionnaire or receive telephone assistance from the contractor's facility liaison in completing the survey.

  • Focus group data collection. For the SSLA focus groups with the general public, the focus group mode will be a group discussion to be conducted in person with a focus group moderator. Because we expect relatively little knowledge or understanding of CER among the general public currently, a focus group is the most appropriate tool for uncovering attitudes of the general public about CER. Asking participants to formulate opinions in a survey about material that is likely very new to them would pose an undue burden and not provide useful information for the evaluation. Additionally, conducting a general population survey would require a significantly larger number of participants in order to detect knowledge of and attitudes toward CER. Because the mode is an in-depth in-person discussion, technology will not be used to collect responses to questions; in this instance use of technology would not be practical.

  • In-depth interview data collection. The PSLA component of the evaluation includes 50 in-depth interviews with PIs and PDs. The SSLA component of the evaluation includes 60 in-depth interviews among the following groups of stakeholders (10 interviews per stakeholder group): (1) health care providers; (2) health care organization administrators; (3) patients/consumers; (4) employers and payers; (5) researchers; and (6) developers of health innovations. Because these interviews are in-depth conversations with open-ended questions, a computer-assisted telephone instrument is not practical for this data collection. Having respondents submit responses electronically would also pose significantly greater burden on them. This mode requires an interviewer and a note-taker to conduct the discussion and record the information collected, respectively. After the interviews are complete, responses will be coded electronically using the Atlas.ti program, which will facilitate analysis of the data. To reduce burden on respondents, contractor staff will schedule interviews at times most convenient to the participating sample members.

4. Efforts to Identify Duplication and Use of Similar Information

ASPE recognizes that certain information necessary to conduct a complete evaluation can be obtained from document reviews and other records. The evaluation consists of three rounds of environmental scans of the peer-reviewed literature, grey literature, and news media such as print and health-focused blogs. Each round will occur for a six-week period, with ongoing content analysis and synthesis of findings. The environmental scans will take place before data collection occurs, thereby ensuring that where information can be obtained from existing documents or databases, the evaluation will not duplicate the collection of that information from the public.

The surveys, focus groups, and in-depth interviews will be entirely new, as no data collection pertaining to ARRA CER investments from these populations has been conducted yet. For the PSLA web-based survey of PIs and PDs, the contractor will use existing records to identify projects, determine the resources available to carry out projects, and assess their progress and outputs. Records will include contractor and grantee proposals (redacted) and reports; reports required of all ARRA-funded projects; an inventory to be provided by ASPE at the start of the project; and other federal websites that catalogue federal grants and contracts, such as http://projectreporter.nih.gov or http://www.science.gov. The instruments used to collect information from PIs and PDs will not include any questions for which information is available from another source. Additionally, as described in section A3 above, for every data collection activity that is part of this evaluation, the contents of every instrument have been compared against the research questions that form the evaluation’s goals. This ensures that there is no duplication of data collection activities either among the different components of the evaluation or with data that already exist.

5. Impact on Small Businesses or Other Small Entities

The SSLA component of the CER evaluation will collect data from health care providers and health care organizations that vary greatly in size, from small service delivery operations to large units such as hospitals. To minimize burden on small entities, the questionnaire will be available in a web version that respondents may access at their convenience. Additionally, the instrument will be available in hardcopy form to those who prefer and request this mode. We expect that the organization staff best suited to respond to the survey and interviews will be people such as medical directors and others with administrative responsibilities in their organization. Because the survey does not request accounting or financial data that might require accessing records, it is not likely to put any undue burden on small entities. Rather, the data collection instruments will ask respondents about their knowledge, attitudes, and behaviors regarding CER, and their perceptions of others' knowledge, attitudes, and behaviors. The pilot test includes an assessment of whether these knowledge, attitude, and behavior items are relevant to respondents.

6. Consequences of Collecting the Information Less Frequently

The purpose of this evaluation is to provide ASPE with evidence on whether the CER portfolio has met the objectives established by HHS, and to gauge the evolution in CER-related knowledge and skills, opinions and attitudes, and behaviors and experiences among stakeholders and society in general, as well as lessons for future CER funding. If the proposed data collection activities are not implemented, ASPE will have no direct information with which to evaluate outcomes to date, and to assess the degree to which stated CER objectives are being met.

The frequency of each data collection effort is listed below:

  • PSLA—Web-based survey of principal investigators (PIs) and project directors (PDs). This is a one-time only data collection effort. If data are not collected, ASPE will be unable to make an informed assessment of barriers to CER projects; projects’ interim products, outputs, and outcomes; how these compare with projects’ goals and objectives; whether the resources, time, and supporting infrastructure were sufficient to meet the goals and objectives of the projects; and whether projects were on track to meet their goals or met their stated goals.

  • PSLA—In-depth telephone interviews with PIs and PDs. This is a one-time only data collection effort. If data are not collected, ASPE will be unable to make an informed assessment of the challenges projects face, how (or if) these challenges were overcome, investigators’ views on limitations of their studies and how they might be minimized, unintended consequences, and the long-term sustainability of such work.

  • SSLA—Web-based survey of three key stakeholder groups in two rounds. This data collection effort will occur on two separate occasions. Without these data, ASPE will be unable to assess the knowledge, attitudes, and behaviors of key stakeholders with regard to CER. The purpose of administering the survey at two different times (round 1 in November–December 2011 and round 2 in November–December 2012) is to track changes in these metrics over time. If these data are not collected twice, it will not be possible to make this assessment. The second administration of the survey will be with a new cross-sectional sample, so respondents will not be asked to participate twice.

  • SSLA—Focus groups with members of the general public in two rounds. This data collection effort will occur on two separate occasions. If data are not collected, ASPE will be unable to make an informed assessment of the knowledge and skills, attitudes and opinions, and behaviors and experiences of the general public toward CER. The purpose of administering the focus groups at two points in time (the first round estimated to begin in November–December 2011, one month after the estimated date of receiving OMB approval, and the second round one year later in November–December 2012) is to discern whether attitudes toward CER have changed. If these data are not collected twice, it will not be possible to make this assessment. The second administration of the focus groups will be with a new sample of participants, so individuals will not be asked to participate twice.

  • SSLA—In-depth telephone interviews with stakeholders. This is a one-time only data collection effort. If data are not collected, ASPE will be unable to make an informed assessment of the knowledge and skills, attitudes and beliefs, and behaviors and experiences of stakeholders toward CER and the processes they use to engage in it.

There are no legal obstacles to reducing the burden.

7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This request fully complies with the general information collection guidelines of 5 CFR 1320.5(d)(2). No special circumstances apply.

8. Comments in Response to the Federal Register Notice/Outside Consultation

Consultation with the public. ASPE consulted with the public about this information collection. As required by 5 CFR 1320.8(d), a 60-day notice was published in the Federal Register on February 23, 2011 (volume 76, page 6793). See Attachment H for this text. Below are the public comments received:

  • COMMENT (received 3/7/2011): “Thank you for the info below. In looking back at the 2/24 Federal Register notice, I had a few additional questions I was hoping you could answer.

“First, are the surveys (PI/PD and key stakeholder surveys) or the interview questionnaire (both mentioned in the burden estimate of the notice) available for review? If possible, could you direct me to a website where I might find those questions or send them to me via email?

“Second, in the Burden Table, the listed burden for the PI/PD survey is “20/60” and the burden for the stakeholder survey is “15/60.” Does that mean 20 out of 60 minutes (as in, 1/3 hour) and 15 out of 60 minutes (or ¼ hour)? Any additional clarity you could provide on these estimate numbers would be appreciated.”


  • RESPONSE: ASPE is forwarding the surveys to the requester. The burden table has been revised to clarify the hour burden estimate.


  • COMMENT (received 4/11/2011): The commenter, representing the National Health Council (NHC), notes that the evaluation being undertaken is important to understanding whether the CER investments funded by the American Recovery and Reinvestment Act are achieving their objectives. The commenter also notes that the NHC is prepared to collaborate with the Secretary to promote engagement by patients in the data collection efforts that are part of this evaluation.

    • RESPONSE: ASPE has noted the support of the commenter.

Consultation with individuals. Throughout the evaluation, input is being solicited from the ASPE leadership team and all relevant HHS operating divisions through the ASPE-designated HHS internal advisory group. This group will meet four times to provide feedback on the direction of the evaluation. Some members of HHS divisions (AHRQ and NIH) may also regularly attend evaluation meetings with ASPE. Table A.1 identifies the individuals on the HHS advisory panel who have been consulted on this evaluation.


Table A.1. HHS Advisory Panel

Name | Title | Organizational Affiliation | Phone Number
Peter Briss, MD, MPH | Medical Director, National Center for Chronic Disease Prevention and Health Promotion | CDC | 770-488-8189
Carolyn Clancy, MD | Director | AHRQ | 301-427-1200
Rosaly Correa-de Araujo, MD, MSc, PhD | Deputy Director, Office on Disability | OS | 301-427-1550
Sherry Glied, PhD | Assistant Secretary for Planning and Evaluation | ASPE | 202-690-7858
Lia Hotchkiss, MPH | Comparative Effectiveness Research Portfolio Lead | AHRQ | 301-427-1620
Kathie Kendrick, RN, MS, CS | Deputy Director | AHRQ | 301-427-1200
Richard Kronick, PhD | Deputy Assistant Secretary, Office of Health Policy | ASPE | 202-690-6870
Joel Kupersmith, MD | Chief Research and Development Officer | VA | 202-461-1700
Mike Lauer, MD | Director, Cardiovascular Sciences, National Heart, Lung, and Blood Institute (NHLBI) | NIH | 301-435-0422
Karen Milgate, MPP | Director, Office of Policy | CMS | 202-260-0630
Mike Millman, PhD | Director, Division of Information Analysis | HRSA | 301-443-0368
Alex Ommaya, ScD | Director, Translational Research and Policy | VA | 202-254-0198
Jean Slutsky, PA, MSPH | Director, Center for Outcomes and Evidence | AHRQ | 301-427-1600


In addition, the contractor evaluation team has consulted with individuals about the research and data collection activities for this evaluation. Table A.2 identifies the individuals with whom the contractor consulted.

Table A.2. Individuals Consulted by the Contractor on Data Collection

Name | Title | Organizational Affiliation | Phone Number
Michael Barry, MD | Medical Director | Massachusetts General Hospital | 617-726-4106
Bryan Dowd, PhD | Mayo Professor | University of Minnesota School of Public Health | 612-624-5468
Jean Paul Gagnon, PhD | Consultant; Former Senior Director of Public Policy | Independent; Formerly of Sanofi-Aventis | 908-310-8196
Marjorie Ginsburg, MPH | Executive Director | Center for Healthcare Effectiveness | 916-851-2828
Sheldon Greenfield, MD | Donald Bren Professor of Medicine | University of California, Irvine | 949-824-5430
Debra Lappin, JD | Senior Vice President | B&D Consulting | 202-312-7496
Sanne Magnan, MD, PhD | President and CEO | Institute for Clinical Systems Improvement | 952-814-7075
Mary D. Naylor, PhD, RN | Professor of Gerontology | University of Pennsylvania | 215-898-6088
J. Sanford Schwartz, MD | Professor of Medicine | University of Pennsylvania School of Medicine | 215-898-3563
Teresa Moran Schwartz, JD | Chair | Consumers Union Board of Directors | 202-329-5369
Lisa Simpson, MB, BCh, MPH, FAAP | President and CEO | AcademyHealth | 202-292-6700
Sean Tunis, MD, MSc | Director | Center for Medical Technology Policy | 410-547-2687
Judy Zerzan, MD, MPH | Chief Medical Director and Deputy Director | Colorado Department of Health Care Policy and Financing | 303-724-2244

9. Explanation of Any Payment/Gift to Respondents

For the PSLA survey and in-depth interviews with PIs and PDs, respondents will not be offered an incentive for their participation.

For the SSLA survey with key stakeholders, respondents will receive a $20 incentive for their participation. Incentives have been found to generate a statistically and substantively significant increase in response rates among medical providers (Kellerman and Herold, 2001; Thorpe et al., 2009; Field et al., 2002), organizational representatives (Simsek and Veiga, 2001), and the general public (Goritz, 2006). Many studies have found that providing an incentive yields considerably higher response rates among physicians, organizational representatives, and members of the general public who have no prior relationship to the subject matter than not providing an incentive. However, the exact amount of the incentive varies from study to study, and there is little consensus on the optimal amount that balances an increase in response with costs to the evaluation (see Kellerman and Herold, 2001). Although some research suggests that surveys of physicians require higher incentives than surveys of the general public, the incentive amounts tested have ranged from very low (such as one dollar or a nominal gift of a pen or pencil) to very high ($50 or more). Because this survey's sample frame includes a highly varied population, with providers (including nurses and physician assistants), health care administrators, and consumers/patients, there are also equity issues to consider. Given the lack of consensus on the optimal incentive level across these groups of key stakeholders, especially when one group (consumers/patients) has only tenuous ties to the subject matter and sponsoring agency, providing the same $20 incentive across all the stakeholder groups is appropriate.

For the SSLA focus groups with the general public, participants will receive a $50 incentive for their participation. These participants will contribute more time than respondents in the other forms of data gathering, and they must also travel to the data collection site, unlike survey and interview respondents. Thus, the incentive for focus group participants must be higher to stimulate participation (see Krueger and Casey, 2009; Stewart et al., 2007).

The incentives provided for participation in the data collection activities above are not reimbursements for respondents' time or burden; they are intended to encourage participation in the data collection effort.

10. Assurance of Confidentiality Provided to Respondents

The information collection will comply in all respects with the Privacy Act of 1974. Individuals and agencies will be assured of the privacy of their replies under Section 934(c) of the Public Health Service Act of 1944, 42 USC 299c-3(c). Survey respondents, interviewees, and focus group participants will be told the purposes for which the information is collected and that any identifiable information about them will not be used or disclosed for any other purpose, except under such circumstances as may be required by law. Respondents will be given this assurance during recruitment (in the advance letter or focus group recruitment telephone call), which will also provide assurance that the information being gathered is for research purposes only. Respondents will be informed that participation is voluntary, that they may refuse to answer any question, and that they may stop their participation at any time.

No identifying information will be requested from participants. Names will not be linked to comments or responses. Data will be reported in aggregate form. The contractor will safeguard all data and only authorized users will have access to them. Information gathered for this study will be made available only to researchers authorized to work on the study.

Data Security. The contractor has a secure server for online data collection utilizing its existing and continuously tested web-survey infrastructure. This infrastructure features the use of HTTPS (secure socket, encrypted) data communication; authentication (login and password); firewalls; and multiple layers of servers, all implemented on a mixture of platforms and systems to minimize vulnerability to security breaches.

Hosting on an HTTPS site ensures that data are transmitted using 128-bit encryption, so that transmissions intercepted by unauthorized users cannot be read as plain text. This security measure is in addition to standard password authentication, which precludes unauthorized users from accessing the web application.

The contractor has established data security plans for the handling of all data during all phases of survey execution and data processing for the surveys that it conducts. Its existing plans meet the requirements of U.S. federal government agencies and are continually reviewed in the light of new government requirements and survey needs. Such security is based on (1) exacting company policy promulgated by the highest corporate officers in consultation with systems staff and outside consultants, (2) a secure systems infrastructure that is continually monitored and evaluated with respect to security risks, and (3) secure work practices of an informed staff that take all necessary precautions when dealing with private data.

11. Justification for Sensitive Questions

ASPE is collecting information about race and ethnicity in the SSLA survey of key stakeholders. In its 2009 report, Race, Ethnicity, and Language Data: Standardization for Health Care Quality Improvement, the IOM’s Subcommittee on Standardized Collection of Race/Ethnicity Data for Healthcare Quality Improvement indicated that disparities in access to health care and to quality care persist for specific racial and ethnic population groups. The report notes that collecting data on race and ethnicity is “a fundamental step in identifying which populations are most at risk.” For this reason, ASPE is collecting information on race and ethnicity in its survey of key stakeholders. ASPE is following the recommendation of the IOM Subcommittee and using OMB categories for race and ethnicity.

ASPE is not collecting any other information of a sensitive nature from individuals or organizations.

12a. Estimates of Annualized Hour and Cost Burden

Table A.3 provides estimates of the time burden by data collection activity and for the evaluation overall. The total hour burden of data collection for this evaluation is estimated to be 1,491 hours. All hourly burden estimates for completing the survey questionnaires and for the qualitative interviews are based on pilot testing each instrument and protocol with fewer than 10 respondents from the relevant population. The hourly burden estimate for participating in the focus groups is based on previous experience conducting focus groups with the general public on matters related to health care policy.

  • PSLA web-based survey of PIs and PDs. Based on pilot testing with fewer than 10 PIs and PDs, ASPE estimates that the web survey with PIs and PDs will take respondents approximately 20 minutes (0.33 hours) to complete. The online survey will be fielded with all ARRA-funded PIs and PDs and one additional researcher from each ARRA-funded project, as identified by the principal investigator.

  • PSLA In-depth interviews with PIs and PDs. Based on pilot testing experience, ASPE estimates each of the telephone interviews with PIs and PDs will last approximately 60 minutes (1 hour).

  • SSLA web-based survey of key stakeholders. The three groups defined as key stakeholders are health care providers, health care organizations, and patients/consumers. Because the three key stakeholder groups consist of different types of individuals, the sample frame for each population will be obtained from a different source. The evaluation will collect data from 600 respondents per round from each key stakeholder group (a total of 1,800 per round, or 3,600 across both rounds). The provider sample will be obtained from Medical Marketing Systems, Inc. (MMS). The sample of health care organizations will be obtained from the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). The sample frame for the patient/consumer sample will come from Marketing Systems Group (MSG). These sample frames will include contact information for sample members, but the contractor will obtain email addresses only for the patient/consumer sample; MMS and JCAHO will deliver the email invitations to the samples they provide. The three key stakeholder groups will receive the same survey questionnaire; thus, the pilot test for this instrument included some members from each group, for a total of fewer than 10 pilot interviews. Based on pilot testing experience, ASPE estimates that each web survey will take respondents approximately 15 minutes (0.25 hours).

  • SSLA focus groups with the general public. Based on previous experience conducting focus groups, ASPE estimates each group will last no more than 120 minutes (2 hours).

  • SSLA qualitative interviews with stakeholders. Based on pilot testing experience, ASPE estimates each of the telephone interviews with stakeholders will last approximately 60 minutes (1 hour).


Table A.3. Estimated Hour Burden, by Data Collection Activity

Instrument | Type of Respondent | Number of Respondents | Number of Responses per Respondent | Average Burden (in hours) per Response | Total Hour Burden
Attachment B: Survey (PSLA) | Principal investigators and project directors | 730 | 1 | 20/60 | 243
Attachment C: In-depth interviews (PSLA) | Principal investigators and project directors | 50 | 1 | 1 | 50
Attachment D: Survey (SSLA) | Key stakeholders: health care providers | 1,200 | 2 | 15/60 | 300
Attachment D: Survey (SSLA) | Key stakeholders: health care organization administrators | 1,200 | 2 | 15/60 | 300
Attachment D: Survey (SSLA) | Key stakeholders: patients/consumers | 1,200 | 2 | 15/60 | 300
Attachment E: Focus group (SSLA) | Members of the general public | 120 | 2 | 2 | 240
Attachment F: In-depth interviews (SSLA) | Stakeholders: health care providers | 10 | 1 | 1 | 10
Attachment G: In-depth interviews (SSLA) | Stakeholders: health care organization administrators | 10 | 1 | 1 | 10
Attachment H: In-depth interviews (SSLA) | Stakeholders: patients/consumers | 10 | 1 | 1 | 10
Attachment I: In-depth interviews (SSLA) | Stakeholders: employers and payers | 10 | 1 | 1 | 10
Attachment J: In-depth interviews (SSLA) | Stakeholders: researchers | 10 | 1 | 1 | 10
Attachment K: In-depth interviews (SSLA) | Stakeholders: developers of health innovations | 10 | 1 | 1 | 10
Total | | 4,560 | | | 1,491





Table A.4 provides estimates of the cost burden by data collection activity and for the evaluation overall. The total cost burden is estimated to be $46,915.71. The average hourly wage rate for each respondent type is estimated using Bureau of Labor Statistics (BLS) wage data.




Table A.4. Estimated Hour Cost Burden, by Data Collection Activity

Instrument | Type of Respondent | Number of Respondents | Number of Responses per Respondent | Average Burden (in hours) per Response | Average Hourly Wage | Total Cost Burden
Attachment B: Survey (PSLA) | Principal investigators and project directors | 730 | 1 | 20/60 | $35.31 | $8,509.71
Attachment C: In-depth interviews (PSLA) | Principal investigators and project directors | 50 | 1 | 1 | $35.31 | $1,765.50
Attachment D: Survey (SSLA) | Key stakeholders: health care providers | 1,200 | 2 | 15/60 | $33.51 | $10,053.00
Attachment D: Survey (SSLA) | Key stakeholders: health care organization administrators | 1,200 | 2 | 15/60 | $43.74 | $13,122.00
Attachment D: Survey (SSLA) | Key stakeholders: patients/consumers | 1,200 | 2 | 15/60 | $20.90 | $6,270.00
Attachment E: Focus group (SSLA) | Members of the general public | 120 | 2 | 2 | $20.90 | $5,016.00
Attachment F: In-depth interviews (SSLA) | Stakeholders: health care providers | 10 | 1 | 1 | $33.51 | $335.10
Attachment G: In-depth interviews (SSLA) | Stakeholders: health care organization administrators | 10 | 1 | 1 | $43.74 | $437.40
Attachment H: In-depth interviews (SSLA) | Stakeholders: patients/consumers | 10 | 1 | 1 | $20.90 | $209.00
Attachment I: In-depth interviews (SSLA) | Stakeholders: employers and payers | 10 | 1 | 1 | $43.74 | $437.40
Attachment J: In-depth interviews (SSLA) | Stakeholders: researchers | 10 | 1 | 1 | $35.31 | $353.10
Attachment K: In-depth interviews (SSLA) | Stakeholders: developers of health innovations | 10 | 1 | 1 | $40.75 | $407.50
Total | | 4,560 | | | | $46,915.71
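
As an illustrative cross-check of the figures above, each row's cost burden is the product of the number of respondents, the average burden per response, and the average hourly wage (respondent counts already cover both rounds where applicable). The short Python sketch below simply transcribes the row values from Tables A.3 and A.4; recomputed totals differ slightly from the published 1,491 hours and $46,915.71 because the published estimates round the 20-minute burden to 0.33 hours.

# Row values transcribed from Tables A.3 and A.4:
# (activity, respondents, hours per response, hourly wage)
rows = [
    ("PSLA PI/PD survey",                    730, 20 / 60, 35.31),
    ("PSLA PI/PD interviews",                 50, 1.0,     35.31),
    ("SSLA survey: providers",              1200, 15 / 60, 33.51),
    ("SSLA survey: organization admins",    1200, 15 / 60, 43.74),
    ("SSLA survey: patients/consumers",     1200, 15 / 60, 20.90),
    ("SSLA focus groups",                    120, 2.0,     20.90),
    ("SSLA interviews: providers",            10, 1.0,     33.51),
    ("SSLA interviews: organization admins",  10, 1.0,     43.74),
    ("SSLA interviews: patients/consumers",   10, 1.0,     20.90),
    ("SSLA interviews: employers/payers",     10, 1.0,     43.74),
    ("SSLA interviews: researchers",          10, 1.0,     35.31),
    ("SSLA interviews: innovators",           10, 1.0,     40.75),
]

total_hours = sum(n * h for _, n, h, _ in rows)
total_cost = sum(n * h * w for _, n, h, w in rows)
print(round(total_hours), round(total_cost, 2))  # about 1,493 hours and $46,998 before rounding conventions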



13. Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers/Capital Costs

There are no additional costs to the respondents.

14. Annualized Cost to Federal Government

The annualized cost to the government is calculated to be $1,389,276 over each of the three years of the contract.

15. Explanation for Program Changes or Adjustments

This is a new data collection.

16. Plans for Tabulation and Publication and Project Time Schedule

  1. Time Schedule

ASPE anticipates beginning data collection one month after receiving OMB clearance, which is currently estimated to occur in October 2011. Thus, the estimated data collection start is November 2011, although data collection may begin earlier or later depending on when OMB clearance is received. Table A.5 below presents the anticipated data collection schedule by type of data collection.

Table A.5. Data Collection Schedule

Data Collection Activity | Start of Data Collection | Completion of Data Collection
B: PSLA PI/PD Web Survey | November 2011 | December 2011
C: PSLA PI/PD In-Depth Interviews | November 2011 | December 2011
D: SSLA Key Stakeholder Web Survey – Round 1 | November 2011 | December 2011
E: SSLA Focus Groups – Round 1 | November 2011 | December 2011
F-K: SSLA In-Depth Interviews | June 2012 | July 2012
D: SSLA Key Stakeholder Web Survey – Round 2 | November 2012 | December 2012
E: SSLA Focus Groups – Round 2 | November 2012 | December 2012

  2. Analysis

The analyses that will be conducted for each data collection are described below.

PSLA web-based survey of PIs and PDs. Analysis of survey data will include an analysis of the association of proxies for short- and intermediate-term success with project and investigator characteristics, as well as qualitative analysis of open-ended responses to questions that ask PIs and PDs to report on challenges and barriers to project success.

The survey will collect a number of process metrics, some of which will be used as proxies for project success. In addition to examining all survey data descriptively, the evaluation team will study the association of proxies for success with project and investigator characteristics. As one example, the analysis may explore whether projects' ability to disseminate preliminary findings in a timely way differed across core categories of the strategic framework. Analyses will use the non-response weights described below when appropriate.

The evaluation team will also conduct a qualitative analysis of answers to open-ended questions, such as those that examine barriers and challenges. This will include identifying themes that emerge across all respondents, as well as key differences among subgroups. For example, investigators may identify a common set of barriers to CER projects, or the barriers may vary with the research environment.

The contractor will construct analysis weights that account for unit non-response using relevant characteristics known about both respondents and non-respondents. Using these weights for estimates and analysis will minimize the risk of bias due to differential non-response patterns. Because the survey population is a census, the weights will not need to account for probabilities of selection.
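
A minimal sketch of this weighting step is shown below, assuming a hypothetical frame file ("pi_pd_frame.csv") with one record per PI or PD, a 0/1 response indicator, and illustrative adjustment-cell variables ("agency", "project_type") known for respondents and non-respondents alike; it applies a simple weighting-class adjustment and is not the contractor's actual specification.

import pandas as pd

# Hypothetical frame: one record per PI/PD, with a 0/1 "responded" flag and
# characteristics known for both respondents and non-respondents.
frame = pd.read_csv("pi_pd_frame.csv")
cells = ["agency", "project_type"]  # illustrative weighting-class variables

counts = (frame.groupby(cells)
               .agg(total=("responded", "size"), resp=("responded", "sum"))
               .reset_index())
# Census of PIs/PDs, so the base weight is 1; the adjustment inflates
# respondents to represent non-respondents in the same cell.
counts["nr_weight"] = counts["total"] / counts["resp"]

weighted = frame.merge(counts[cells + ["nr_weight"]], on=cells, how="left")
respondents = weighted.loc[weighted["responded"] == 1]  # analysis file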

PSLA in-depth interviews with PIs and PDs. After all interviews are completed, the contractor staff conducting the interviews will meet to share their findings and conduct qualitative data analysis of the interview notes in Atlas.ti, a qualitative data analysis tool. Codes will be developed based on the most common themes across interviews as identified by the lead interviewers. The primary themes will address the key interview topics, such as barriers and facilitators, successes, missed opportunities, unintended consequences, and investigators' future research plans.

SSLA web-based survey of key stakeholders in two rounds. The evaluation will conduct descriptive, predictive, and impact analyses with the key stakeholder survey data. The descriptive analysis will address two of the research questions: (1) What are the baseline metrics of key stakeholders? and (2) How do these metrics change from round 1 to round 2? The predictive analysis will also address two of the research questions: (1) What knowledge, skills, attitudes, and personal characteristics predict behaviors for each key stakeholder group? and (2) What baseline metrics and characteristics predict change from round 1 to round 2? Finally, the impact analysis will address one research question: What is the impact of CER on key stakeholders? Below is an overview of the three types of analyses.

For the descriptive analysis, the evaluation will summarize (1) baseline metrics (by domain and construct, separately for each stakeholder group), (2) follow-up metrics, (3) changes in metrics from round 1 to round 2, and (4) differences among the stakeholder groups on metrics. This analysis will also include psychometric analysis of baseline metrics to ensure measurement reliability, especially examining multiple-item scales.
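
As one example of the reliability checks described above, Cronbach's alpha for a multiple-item scale can be computed directly from the item responses. The sketch below is illustrative only; the scale and item names are hypothetical.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal-consistency reliability; rows are respondents, columns are scale items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical four-item CER attitude scale from the round 1 stakeholder survey:
# alpha = cronbach_alpha(round1[["att_1", "att_2", "att_3", "att_4"]])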

The predictive analysis has two components. First, it will include a predictive model using regression analysis, predicting behavior from knowledge, skills, and personal characteristics (and health-related decision-making consumer segment, based on relative skills, self-efficacy, and engagement). Second, it will include a regression model of behavior change, defined as the difference between baseline behavior and behavior at round 2 (assessed with two different samples), predicted from baseline metrics and characteristics. The predictive analyses will be done separately for each stakeholder group.
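
A minimal sketch of the first predictive model is shown below, using hypothetical variable and file names; the actual specification will follow the survey's final constructs.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical round 1 analysis file for one stakeholder group.
providers = pd.read_csv("ssla_round1_providers.csv")

# Predict behavior from knowledge, skills, consumer segment, and personal
# characteristics; non-response weights could be applied with smf.wls and
# its weights= argument.
model = smf.ols(
    "behavior_score ~ knowledge_score + skills_score"
    " + C(consumer_segment) + age + C(education)",
    data=providers,
).fit()
print(model.summary())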

For the impact analysis, the evaluation will use a regression model to conduct an interaction analysis (similar to a difference-in-differences analysis) to examine the moderating effect of investment in CER topics whose findings are rolled out between the baseline and follow-up rounds on behavior change (the difference between behavior at round 1 and round 2), separately for each stakeholder group. The topics will include high-priority topics that are likely to vary in salience, drawn from the IOM's list of priority CER topics (Institute of Medicine 2009).
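
A minimal sketch of this interaction model is shown below, assuming a pooled round 1/round 2 file with hypothetical variable names; the coefficient on the interaction term captures the moderating effect described above.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pooled file: both survey rounds for one stakeholder group, with
# a 0/1 "round2" indicator and a measure of exposure to CER topics whose
# findings were released between the rounds.
pooled = pd.read_csv("ssla_providers_pooled.csv")

did = smf.ols(
    "behavior_score ~ round2 * topic_exposure + age + C(education)",
    data=pooled,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(did.params["round2:topic_exposure"])  # the interaction (moderating) effect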

As with the PSLA web survey, ASPE's contractor will construct analysis weights for the SSLA that account for non-response to the web survey using relevant characteristics known about both respondents and non-respondents in each group. These weights will help mitigate the risk of biased estimates due to differential non-response patterns. The weights will first account for probabilities of selection and then be adjusted for non-response. Because the sample of people with chronic diseases is a web panel and is not representative of the general population, we will work with the vendor to post-stratify appropriately to external population counts.
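
The post-stratification step can be sketched as an iterative raking adjustment that aligns weighted margins with external population counts. The margin variables and targets below are hypothetical placeholders for whatever benchmarks are agreed on with the panel vendor.

import pandas as pd

def rake(df: pd.DataFrame, weight_col: str, margins: dict,
         max_iter: int = 50, tol: float = 1e-6) -> pd.Series:
    """Iteratively adjust weights so weighted margins match population targets.
    margins maps a column name to {category: population total}."""
    w = df[weight_col].astype(float).copy()
    for _ in range(max_iter):
        max_shift = 0.0
        for col, targets in margins.items():
            current = w.groupby(df[col]).sum()           # weighted totals by category
            factors = df[col].map(lambda c: targets[c] / current[c])
            w = w * factors                              # scale toward the target margin
            max_shift = max(max_shift, (factors - 1).abs().max())
        if max_shift < tol:
            break
    return w

# Hypothetical usage with external benchmarks for age group and region:
# panel["final_wt"] = rake(panel, "nr_weight",
#                          {"age_group": age_targets, "region": region_targets})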

SSLA focus groups with members of the general public in two rounds. The evaluation will use information learned from the focus groups to form an in-depth profile of the general public with respect to knowledge and skills, attitudes and opinions, and behaviors and experiences as they pertain to CER. After the second round of focus groups, the evaluation will assess any changes in trends with respect to responses to questions targeting the three domains (knowledge and skills, attitudes and opinions, and behaviors and experiences). As presented above, six focus group discussions will be conducted during each of the two rounds (two focus groups in each of three cities per round).

After each round of focus groups, the focus group team will use Atlas.ti to examine and summarize findings across the relevant domains (such as health decision-making behaviors and experiences and knowledge of CER), noting any trends in participant responses and whether responses were common among people sharing certain characteristics, such as education level or chronic condition status.

SSLA in-depth interviews with stakeholders. After interviewers on the evaluation team have completed all 60 stakeholder interviews, a qualitative data analysis of the interview notes will be conducted using Atlas.ti. Findings will be summarized across the relevant domains (such as health decision-making behaviors and experiences and knowledge of CER), noting any trends in participant responses and whether responses were common among people sharing certain characteristics.

  3. Publication

Findings from each data collection activity and analysis will be included in a draft interim report and a final evaluation report. The interim data collection reports for all the in-depth interviews, the survey of PIs and PDs, the first round of the survey of key stakeholders, and the first round of focus groups will be completed in March 2012. Final data collection reports for all data collection activities, including document reviews, will be completed in May 2013.

17. Reason(s) Display of OMB Expiration Date is Inappropriate

ASPE is not seeking an exemption from displaying the expiration date on any data collection instrument. ASPE will display the OMB number and expiration date on the web and paper versions of the two survey questionnaires used for this data collection, the PSLA PI-PD Survey and the SSLA Key Stakeholder Survey.

18. Exceptions to Certification for Paperwork Reduction Act Submissions

ASPE is not seeking an exception.


REFERENCES


Field, T.S., C. Cadoret, M. Brown, et al. (2002). Surveying Physicians: Do Components of the “Total Design Approach” to Optimizing Survey Response Rates Apply to Physicians? Medical Care. 40:596-606.


Goritz, Anja S. (2006). Incentives in Web Studies: Methodological Issues and a Review. International Journal of Internet Science. 1(1): 58-70.


Kellerman, Scott E. and Joan Herold (2001). Physician Response to Surveys: A Review of the Literature. American Journal of Preventive Medicine. 20(1):61-67.


Krueger, Richard A. and Mary Anne Casey (2009). Focus Groups: A Practical Guide for Applied Research (Fourth Edition). Thousand Oaks, CA: Sage Publications.


Simsek, Zeki and John F. Veiga (2001). A Primer on Internet Organizational Surveys. Organizational Research Methods. 4:218-235.


Stewart, David W., Prem N. Shamdasani, and Dennis W. Rook. (2007). Focus Groups: Theory and Practice (Second Edition). Thousand Oaks, CA: Sage Publications.


Thorpe, C., B. Ryan, S. McLean, A. Burt, M. Stewart, J. Brown, G. Reid, and S. Harris. (2009). How to Obtain Excellent Response Rates when Surveying Physicians. Family Practice. 26(1):65-68.

