Supporting Statement Part A -- Pilot Test of HSOPS 2.0 -- 9-2-15


Pilot Test of the Proposed Hospital Survey on Patient Safety Culture Version 2.0

OMB: 0935-0230


SUPPORTING STATEMENT


Part A







Pilot Test of the Proposed Hospital Survey on Patient Safety Culture

Version 2.0





Version: September 2, 2015







Agency for Healthcare Research and Quality (AHRQ)





A. JUSTIFICATION


1. Need for Information


The mission of the Agency for Healthcare Research and Quality (AHRQ) set out in its authorizing legislation, The Healthcare Research and Quality Act of 1999, is to enhance the quality, appropriateness, and effectiveness of health services, and access to such services, through the establishment of a broad base of scientific research and through the promotion of improvements in clinical and health systems practices, including the prevention of diseases and other health conditions. AHRQ shall promote health care quality improvement by conducting and supporting:

1. research that develops and presents scientific evidence regarding all aspects of health care;


2. the synthesis and dissemination of available scientific evidence for use by patients, consumers, practitioners, providers, purchasers, policy makers, and educators; and


3. initiatives to advance private and public efforts to improve health care quality.


Also, AHRQ shall conduct and support research and evaluations, and support demonstration projects, with respect to (A) the delivery of health care in inner-city areas, and in rural areas (including frontier areas); and (B) health care for priority populations, which shall include (1) low-income groups, (2) minority groups, (3) women, (4) children, (5) the elderly, and (6) individuals with special health care needs, including individuals with disabilities and individuals who need chronic care or end-of-life health care.


Ensuring patient safety is critical for improving health care quality. The Healthcare Research and Quality Act of 1999 mandates that AHRQ conduct and support research to identify the causes of preventable health care errors and patient injury in health care delivery; develop, demonstrate, and evaluate strategies for reducing errors and improving patient safety; and disseminate such effective strategies throughout the health care industry [Section 912, (c) (1) (2) and (3) (http://www.ahrq.gov/policymakers/hrqa99.pdf)]. One way in which AHRQ is meeting this mandate is the construction, dissemination, and compilation of results from surveys of patient safety culture in health care facilities. AHRQ defines safety culture as the individual and group values, attitudes, perceptions, competencies, and patterns of behavior that determine the commitment to, and the style and proficiency of, an organization's health and safety management.1

In 2004, AHRQ developed and published a measurement tool to assess the culture of patient safety in hospitals (OMB control no. 0935-0115). The Hospital Survey on Patient Safety Culture2 (HSOPS) is a survey of providers and staff that can be implemented by hospitals to identify strengths and areas for patient safety culture improvement as well as raise awareness about patient safety. When conducted routinely, the survey can be used to examine trends in patient safety culture over time and evaluate the cultural impact of patient safety initiatives and interventions. The data can also be used to make comparisons across hospital units. AHRQ also produced a survey user’s guide to assist hospitals in conducting the survey successfully.3 The guide addresses issues such as which providers and staff should complete the survey, how to select a sample of hospital providers and staff, how to administer the questionnaire, and how to analyze and report on the resulting data.

Since 2004, thousands of hospitals within the U.S. and internationally have implemented the survey. In response to requests for comparative data from other hospitals, AHRQ funded the development of a comparative database on the survey in 2006 (OMB control no. 0935-0162). The database is currently compiled every two years, using the latest data provided by participating hospitals (and retaining submitted data for no more than 2 years). Reports describing the findings from analysis of the database are made available on the AHRQ website to assist hospitals in comparing their results. The 2014 database contains data from 405,281 hospital provider and staff respondents within 653 participating hospitals. The 2014 User Comparative Database Report presents results by hospital characteristics (e.g., number of beds, teaching status, geographic location) and respondent characteristics (e.g., position type, work area/unit).4

The survey constructed in 2004 remains in use today, 10 years after its initial launch. Since the launch of HSOPS, AHRQ has funded development of patient safety culture surveys for other settings.5 In 2008, surveys were published for outpatient medical offices (OMB control no. 0935-0131) and nursing homes (OMB control no. 0935-0132). In 2012, a survey for community pharmacies (OMB control no. 0935-0183) was released. Surveys for each setting built upon the strengths of HSOPS but improved and updated items where appropriate.


Users of HSOPS have provided feedback over the years suggesting that changes to the instrument would be valuable and welcomed. The comparative database registrants provided feedback about potential changes in 2013, and telephone interviews were conducted with 8 current survey users and vendors to gain an in-depth understanding of their thoughts on the current survey and possible changes. As a result of this feedback, the Hospital Survey on Patient Safety Culture Version 2.0 (HSOPS 2.0) is being constructed with the following objectives in mind:


1) Shift to a Just Culture framework for understanding responses to errors. In the original HSOPS, questions around responses to errors were negatively worded to detect a “culture of blame” in organizations. For example, respondents evaluated the extent to which errors were held against them and whether it felt as though the person were being written up rather than the problem. In contrast, the Just Culture framework emphasizes learning from mistakes, providing a safe environment for reporting errors, and utilizing a balanced approach to errors that considers both system and individual behavioral reasons for errors.6 New items will be constructed in HSOPS 2.0 to capture the extent to which positive responses to error consistent with a Just Culture framework are present in an organization. For example, respondents will be asked to evaluate the extent to which the organization tries to understand the factors that lead to patient safety errors.



2) Reduce the number of negatively worded items. The original HSOPS has negatively worded items. For example, respondents are asked whether there are “patient safety problems in this unit” (negatively worded). Using some negatively worded items was intended to reduce social desirability and acquiescence biases and to identify individuals not giving the survey their full attention (e.g., “straight-lining,” or providing the same answer for every item regardless of positive or negative wording). However, many users have indicated that respondents sometimes had difficulty correctly interpreting and responding to the negatively worded items. Therefore, many survey users recommended that the number of negatively worded items be reduced, but they did not recommend removing all of these items, as they felt a mixture of items helps keep respondents engaged.


3) Add a “Does not apply/Don’t know” response option. Analysis of the comparative database found that some respondents select “Neither agree nor disagree” on many items when they really should have answered “Does not apply/Don’t know.” While some portion of respondents will always have neutral feelings about a statement, in some cases a respondent will select a neutral response because they do not have experience in that area or the item does not apply to their position. Adding a “Does not apply/Don’t know” response option should reduce neutral responses in cases where an item is not relevant for a respondent, providing more statistical variability in responses. Recognizing these issues, the other AHRQ Surveys on Patient Safety Culture all include a fifth “Does not apply/Don’t know” response option.


4) Reword unclear or difficult-to-translate items. HSOPS was originally designed for use in U.S. hospitals, but it has since been translated into languages other than English. Some HSOPS items use idiomatic expressions that do not translate well, such as “things fall between the cracks” and “the person is being written up.” Other items have words that are complex or may mean different things to different people, such as “sacrifice” and “overlook.” HSOPS 2.0 uses more universal phrases which can be accurately translated and have more consistent meaning across respondents, some of whom are non-clinical staff. A related change across many items is use of the word “we” rather than “staff.” It may be unclear to respondents whether providers such as physicians, residents, and interns qualify as “staff,” while “we” invites a more inclusive view of those in the hospital or unit.


5) Reword items to be more applicable to physicians and non-clinical staff. Users have indicated that the wording of some of the items makes it awkward for physicians to answer. For example, the section that asks about “Your Supervisor/Manager” does not apply well to physicians who report to a clinical leader but not to a manager per se. In addition, some items were difficult for non-clinical staff to answer. For example, the item “We have patient safety problems in this unit” may not be relevant for staff that do not have direct interaction with patients (e.g., IT staff).


6) Align the HSOPS survey with AHRQ patient safety culture surveys for other settings. The development of patient safety culture surveys for other settings provided opportunities to test new items and refinements of original HSOPS items. Many of these items have performed well for other settings and are relevant to the hospital setting. In addition, standardizing items across the patient safety culture surveys would allow cross-setting comparisons that are not currently possible.


7) Reduce survey length. To increase response rates and reduce the survey administration burden for hospitals, the revised survey is intended to be shorter than the original instrument. Some of the original items have relatively low variability and therefore contribute little to discrimination between positive and negative assessment of patient safety culture. However, the need for careful testing of alternative questions means that the initial draft of the revised or 2.0 survey is slightly longer than the original. Through cognitive interviewing, pilot testing, and expert review, we will identify items that can be deleted, resulting in a shorter final instrument.


8) Investigate supplemental items/composites. Develop a set of supplemental items for the HSOPS 2.0 survey pertaining to Health Information Technology (Health IT).


Attachment A contains a crosswalk of the original and current 2.0 versions of the instrument. The current 2.0 version of the instrument has undergone preliminary cognitive testing with 9 hospital physicians and staff members as well as review by a Technical Expert Panel. Attachment B contains the draft Health IT Patient Safety supplemental items.


This research has the following goals:


  1. Cognitively test with individual respondents the items in a) the draft HSOPS 2.0 survey and b) HSOPS 2.0 supplemental item set assessing Health IT Patient Safety. Cognitive testing will be conducted in English and Spanish.


  2. Conduct data collection as follows:


    a. A combined pilot test and bridge study for the draft HSOPS 2.0 in 40 hospitals and modify the questionnaire as necessary. The pilot test component will entail administering the draft 2.0 version to determine which items to retain. The bridge study component will entail administering the original HSOPS in addition to the draft HSOPS 2.0 version to provide guidance to hospitals in understanding changes in their scores resulting from the new instrument versus changes resulting from true changes in culture.

    b. The pilot testing of the supplemental item set will be conducted with the same hospitals and respondents as the pilot test for the draft HSOPS 2.0. These supplemental items will be added to the draft HSOPS 2.0 survey for pilot testing.


  3. Engage a Technical Expert Panel (TEP) in review of pilot results and finalize the questionnaire and supplemental item set.


  4. Make the final HSOPS 2.0 survey and the supplemental items publicly available.



To achieve these goals, we propose the following activities:


1) Cognitive interviews – The purpose of these interviews is to understand the cognitive processes respondents engage in when answering each item on the survey, which will aid in refining the survey instrument. These interviews will be conducted with a mix of hospital personnel, including physicians, nurses, and other types of staff (from dietitians to housekeepers).

  a) Draft HSOPS 2.0 – Cognitive interviews have already been conducted with 9 respondents to inform development of the current draft HSOPS 2.0 presented in Attachment A. Up to three additional rounds of interviews will be conducted by telephone with a total of 27 respondents (nine respondents each round). The instrument will be translated into Spanish and another round of cognitive interviews will be conducted with nine Spanish-speaking respondents for a total of up to 36 respondents across all four rounds. The cognitive interview guide found in Attachment C will be used for all rounds.

  b) Supplemental Items – Up to three rounds of interviews will be conducted by telephone for a total of 27 respondents (nine respondents each round). The supplemental items will be translated into Spanish and another round of cognitive interviews will be conducted with nine Spanish-speaking respondents for a total of up to 36 respondents across all four rounds. The cognitive interview guide found in Attachment D will be used for all rounds.


Feedback obtained from the first round of interviews for the draft HSOPS 2.0 and the supplemental items will be used to refine the items. The results of Round 1 testing, along with the proposed revisions, will be reviewed with a Technical Expert Panel prior to commencing with Rounds 2 and/or 3 testing. In total, up to 72 cognitive interviews will be conducted to refine the draft HSOPS 2.0 and supplemental items for pilot testing.



2) Pilot test and bridge study – There will be a single data collection effort that will provide data for both the pilot test and the bridge study. The pilot test of the draft HSOPS 2.0 and supplemental items will allow the assessment of the psychometric properties of the items and composites. We will assess the variability, reliability, factor structure, and construct validity of the draft HSOPS 2.0 and supplemental items and composites, allowing for their further refinement (see Part A, Section 16 for analysis plan description). The draft HSOPS 2.0 survey (see Attachment A) and supplemental items (see Attachment B) will be pilot tested with hospital personnel in approximately 40 hospitals to facilitate multilevel analysis of the data. Approximately 500 providers and staff will be sampled from each hospital, with 250 receiving HSOPS 2.0 with supplemental items for the pilot test and 250 receiving the original HSOPS for the bridge study comparisons. A hospital point of contact (POC) will be recruited in each hospital to publicize the survey and assemble a list of sampled providers and staff. Instructions for the POCs are included in Attachment E, and Exhibit 2 includes a burden estimate for the POCs’ time in assisting with the pilot test. Providers and staff will receive notification of the survey and reminders via email, and the web-based survey will be fielded entirely online. The draft pilot test notification and follow-up reminder notice are included in Attachment F.
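The split-sample assignment described above (half of each hospital's roughly 500 sampled providers and staff receiving the original HSOPS, half receiving the draft HSOPS 2.0 with supplemental items) can be sketched as a simple random split. This is an illustrative sketch only; the staff identifiers and fixed seed are hypothetical, not part of the study protocol:

```python
import random

def split_ballot(staff_list, seed=0):
    """Randomly split a hospital's sampled staff list into two equal arms:
    one arm receives the original HSOPS, the other the draft HSOPS 2.0
    (plus supplemental items). Illustrative sketch only."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = staff_list[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical sample of ~500 staff from one hospital
sampled = [f"staff_{i}" for i in range(500)]
original_arm, v2_arm = split_ballot(sampled)
```

Each arm then receives only its assigned instrument, which is what makes the later bridge-study comparisons between versions possible.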


The goal of the bridge study will be to provide users with guidance on how their new results will compare with results from the original HSOPS survey. Although users have requested that the HSOPS survey be revised, they are also concerned about their ability to trend results with data from prior years. A similar bridge study was conducted during the 1994 redesign of the Census Bureau’s Current Population Survey (CPS). In the CPS bridge study, an additional 12,000 households were added to the survey’s monthly rotation schedule between July 1992 and December 1993. The added households received the redesigned version of the instrument.7 Thus, the CPS fielded both the revised and the original versions of the instrument simultaneously. One of the most important results of the CPS bridge study was the development of metrics that allowed estimates of change that were due to the changes in the instrument. These metrics were used to adjust the estimates produced by the revised CPS instrument. As a result of the study, key labor force metrics such as the unemployment rate could be trended accurately after the instrument’s redesign.


We propose to conduct a similarly constructed bridge study in which sampled providers and staff take either the draft HSOPS 2.0 or the original HSOPS. As noted above, a split ballot design will be used in which half of sampled providers and staff in each hospital receive the original HSOPS (N=250) and the other half receive the draft HSOPS 2.0 (N=250). This bridge study is designed to produce metrics of change that are attributable to the changed survey instrument. The number of hospitals and sampled providers and staff for this data collection effort was calculated to ensure the statistical power needed to detect relatively small differences in scores (3 percentage points).
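As a rough illustration of the kind of calculation behind a 3-percentage-point detection target, the sketch below sizes a simple two-proportion comparison. The baseline percent-positive values (65% vs. 68%) are hypothetical, and the normal approximation here ignores the within-hospital clustering that a full design-effect calculation for this study would need to include:

```python
import math

def n_per_arm(p1, p2):
    """Approximate per-arm sample size for a two-sided two-proportion
    z-test at alpha = 0.05 with 80% power. Ignores clustering, so a
    design effect would inflate this for a multi-hospital sample."""
    z_alpha = 1.96  # two-sided alpha = 0.05
    z_beta = 0.84   # power = 0.80
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * var / (p1 - p2) ** 2)

# Hypothetical: detect a shift from 65% to 68% "percent positive"
n = n_per_arm(0.65, 0.68)
```

With roughly 250 respondents per arm in each of 40 hospitals (about 10,000 sampled per arm before nonresponse), a requirement on this order is plausible even after allowing for clustering, which is consistent with the design described above.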


3) Technical Expert Panel (TEP) feedback – A TEP has been assembled to provide input to guide patient safety culture survey product development and has been convened to discuss the proposed changes to the HSOPS survey and supplemental items. Upon completion of the pilot test, results will be reviewed with the TEP and the survey will be finalized. The TEP is discussed in more detail in Section 8b. This TEP activity does not impose a burden on the public and is therefore not included in the burden estimates in Section 12.


4) Dissemination activities – The final HSOPS 2.0 instrument and supplemental items will be made publicly available through the AHRQ Web site. A report from the bridge study will also be made public as a resource to hospitals making the transition to the new survey. This dissemination activity does not impose a burden on the public and is therefore not included in the burden estimates in Section 12.


This work is being conducted by AHRQ through its contractor, Westat, pursuant to AHRQ’s statutory authority to conduct and support research on healthcare and on systems for the delivery of such care, including activities with respect to the quality, effectiveness, efficiency, appropriateness and value of healthcare services and with respect to quality measurement and improvement. 42 U.S.C. 299a(a)(1) and (2).

2. How, by Whom, and for What Purpose Information Will Be Used


The information collected in the pilot test and bridge study data collection effort will be used for two purposes:


  1. Pilot test draft HSOPS 2.0 and supplemental item set. Responses to the draft HSOPS 2.0 survey and Health IT Patient Safety supplemental items will be used by project staff to test and improve the items and composites. Psychometric analysis will be conducted on the HSOPS 2.0 and supplemental item data to examine item nonresponse, item response variability, factor structure, reliability, and construct validity of the items and composites. Because the items are being developed to measure specific aspects or composites of patient safety culture in the hospital setting, the factor structure of the items will be evaluated through multilevel confirmatory factor analysis. On the basis of the data analyses, items or factors may be dropped to create the final HSOPS 2.0 and supplemental item set.


  2. Bridge study of HSOPS 2.0 and original HSOPS. The purpose of the bridge study is to assess differences resulting from hospitals’ use of the items and composites in the final HSOPS 2.0, based on the analyses in the pilot test, compared with the original HSOPS. The bridge analysis will include statistical comparisons of survey items and composites across split ballot samples of respondents answering the original and HSOPS 2.0 instrument versions within 40 hospitals. A report from the bridge study will be made available to the public on AHRQ’s Web site. This information will assist hospitals in understanding and interpreting their hospital’s scores on the HSOPS 2.0 survey compared to the original.
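Among the psychometric checks described above, the internal-consistency reliability of a survey composite is commonly estimated with Cronbach's alpha. A minimal sketch, with purely illustrative 5-point Likert responses for a hypothetical 3-item composite:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """item_scores: one list per respondent, one score per item.
    Cronbach's alpha = k/(k-1) * (1 - sum(item variances)/total variance),
    a standard internal-consistency estimate for survey composites."""
    k = len(item_scores[0])
    item_vars = [variance([row[i] for row in item_scores]) for i in range(k)]
    total_var = variance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Illustrative data only: 5 respondents x 3 items on a 1-5 scale
data = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 4],
    [2, 3, 2],
    [4, 4, 5],
]
alpha = cronbach_alpha(data)  # roughly 0.90 for this toy data
```

In a real analysis this would be computed per composite on the pilot data, alongside the item-variability and factor-structure checks; the multilevel confirmatory factor analysis itself requires specialized software and is not sketched here.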


Hospitals participating in the pilot test/bridge study data collection effort will receive a report of their hospital-specific results. This feedback report serves as an incentive for participation and saves hospitals time and effort in analyzing their own results.


The final HSOPS 2.0 survey instrument and supplemental item set will be made available to the public for use in hospitals to assess their patient safety culture from the perspectives of their staff. The survey and supplemental items can be used by hospitals to identify areas for patient safety culture improvement. Researchers are also likely to use the survey and supplemental items to assess the impact of patient safety culture improvement initiatives. This HSOPS 2.0 survey will replace the current HSOPS survey instrument in AHRQ’s suite of surveys on patient safety culture, which are available on the AHRQ Web site at (http://www.ahrq.gov/professionals/quality-patient-safety/surveys/index.html). These surveys have been used by thousands of hospitals, nursing homes, medical offices, and community pharmacies across the U.S. to assess patient safety culture.


3. Use of Improved Information Technology


The pilot test and bridge study data collection will be conducted using a Web survey because the majority of hospitals are already using Web surveys for their routine administration of HSOPS. In addition to reducing the burden associated with survey administration (printing and tracking paper surveys), a Web-based survey will offer increased security of responses and eliminate the cost of data entry.


4. Efforts to Avoid Duplication


Surveys on Patient Safety Culture. HSOPS has been in use for 10 years, but other surveys are also being used in the U.S. and internationally to assess patient safety culture in hospitals. We reviewed the literature to determine what other surveys are in use, how they compare with HSOPS in terms of psychometric properties, and whether the resulting data from other surveys are being compiled across hospitals to form a comparative database.


A recent analysis identified 12 survey instruments in use for the measurement of patient safety culture in healthcare facilities.8 The two most frequently used worldwide are HSOPS, developed and maintained by AHRQ, and the Safety Attitudes Questionnaire (SAQ), developed and maintained by researchers at the University of Texas.9,10 Both HSOPS and the SAQ were developed to measure safety culture in hospitals, both instruments are made available to the public on the Web, and both have been extensively tested in recent years.11 A recent study by Etchegaray and Thomas10 indicates that the survey instruments have similar reliability and predictive validity, although the only patient safety outcomes studied were those captured by HSOPS.


HSOPS has been more extensively studied internationally than has SAQ, with reported use in several European, Middle Eastern, and Asian countries.9 AHRQ maintains a Spanish translation of the instrument on its Web site, and efforts to revise the instrument for HSOPS 2.0 are sensitive to the need for items in which the meaning can be accurately translated. Although a number of articles with published data from the SAQ exist, only HSOPS regularly makes comparative database reports available to the public using the latest survey data from participating hospitals. The HSOPS 2014 User Comparative Database Report contains 405,281 respondents in 653 hospitals – the largest publicly available database of hospital patient safety culture data in the nation. Additionally, de-identified hospital patient safety culture data from that database are made available to researchers upon request for secondary analysis.


In summary, although several hospital patient safety culture instruments are available, HSOPS has an advantage because of its length in use (nationally and internationally), rigorous testing, inclusion of outcome measures, and maintenance of a comparative database. Because many hospitals depend on it as a critical evaluation tool for patient safety, failing to complete the testing for this update to HSOPS 2.0 could result in hospitals relying on an outdated instrument, failing to measure Just Culture components of patient safety, and misunderstanding their scores on the new instrument in relation to their scores on the prior instrument.


Health IT Patient Safety. Information on Health IT patient safety has not been systematically or rigorously collected and, to our knowledge, is not available through any other sources. There are many surveys available about Health IT adoption or use, as well as self-assessment tools to help assess readiness for Health IT adoption,12,13 but no surveys were found on Health IT patient safety or the culture of patient safety related to Health IT.



5. Involvement of Small Businesses


It is unlikely that any hospitals participating in this pilot test will be small businesses.


6. Consequences if Information Collected Less Frequently


This effort is a one-time pilot test and bridge study.


7. Special Circumstances


The data collection efforts will be consistent with the guidelines at 5 CFR 1320.5(d)(2).


8. Federal Register Notice and Outside Consultations


8.a. Federal Register Notice


As required by 5 CFR 1320.8(d), notice was published in the Federal Register on (date and page number of 60 day notice) for 60 days (see Attachment G).


8.b. Outside Consultations


To guide the development of all patient safety culture survey products, a Technical Expert Panel (TEP) has been assembled. The TEP reviewed drafts of the HSOPS 2.0 survey instrument and will also review feedback from the cognitive interviews and assist in finalizing the survey instrument and supplemental items. The TEP contains 16 members from various parts of the health sector covered by the patient safety culture surveys, including hospitals (6 members), medical offices (2 members), nursing homes (2 members), community pharmacies (2 members), the Department of Defense (2 members), and international representatives (2 members). Attachment H lists the TEP members and their institutional affiliations.


9. Payments/Gifts to Respondents


Cognitive Interview Respondents. To successfully recruit 72 cognitive interview participants, it is appropriate to offer a cash incentive.

  1. Draft HSOPS 2.0 – For the 1.5-hour cognitive interviews, we propose a $150 cash remuneration for 4 attending physicians and hospitalists, a $100 cash remuneration for 8 department managers and registered nurses, and a $75 cash remuneration for 24 other hospital support staff. The total amount for cash incentives is $3,200.

  2. Supplemental Items – For the 1-hour cognitive interviews, we propose a $100 cash remuneration for 4 attending physicians and hospitalists, a $75 cash remuneration for 8 department managers and registered nurses, and a $50 cash remuneration for 24 other hospital support staff. The total amount for cash incentives is $2,200.


The survey research literature uniformly demonstrates that incentives are an effective means of communicating the importance of the study to the respondent. In a meta-analysis, Caporaso, Mercer, Cantor, and Townsen (in press)14 show that incentives follow a dose-response model – the greater the incentive, the greater the level of respondent participation. This is equally true for the incentives offered in cognitive testing. The amounts proposed are based on the average hourly rates for the respondents and the time to complete the cognitive interviews (see Exhibit 2). We strongly believe that if we do not provide incentives equal to at least the average hourly wage for respondents, it will significantly hinder recruiting for this effort.


Because of the time stresses in a clinical setting, cognitive testing with clinical personnel is particularly challenging. To the best of our knowledge, all cognitive testing of questionnaires on clinical staff offers an incentive. Usually it is a monetary incentive, often called an honorarium. Examples of studies in which clinical staff were given a monetary incentive for testing a questionnaire include: Shaw, Talley, Beebe, and Rockwood (2001)15; Cho, Johnson, and VanGeest (2013)16; McLeod, Klabunde, Willis, and Stark (2013)17; and Salinas (2014)18.


Pilot Test and Bridge Study Respondents. No remuneration is proposed for organizations or individuals participating in the pilot test or bridge study. Hospitals will receive customized feedback reports of results, with comparisons to other hospitals, which will assist them in understanding their patient safety culture. Further, hospitals will also receive feedback on how the final HSOPS 2.0 instrument compares with the original HSOPS instrument in their hospital(s). We believe this is sufficient incentive for participation.


10. Assurance of Confidentiality


Individuals and organizations will be assured of the confidentiality of their replies under Section 934(c) of the Public Health Service Act, 42 USC 299c-3(c). They will be told the purposes for which the information is collected and that, in accordance with this statute, any identifiable information about them will not be used or disclosed for any other purpose. Identifiers such as name, email address, and position will be collected to facilitate survey administration and to notify respondents of the survey. Once data collection is complete, personal identifiers will be removed from the data and destroyed.


11. Questions of a Sensitive Nature


We do not believe the survey includes questions of a particularly sensitive nature. However, if sensitivities are discovered during cognitive testing, the affected questions will be modified to remove the sensitivity.


12. Estimates of Annualized Burden Hours and Costs


Exhibit 1 shows the estimated annualized burden hours for the participants’ time to take part in this research. Cognitive interviews for the draft HSOPS 2.0 will be conducted with 36 individuals and will take about 1 hour and 30 minutes to complete. Cognitive interviews for the supplemental items will be conducted with 36 individuals and will take about 1 hour to complete. We will recruit 40 hospitals for the pilot test and bridge study, sampling approximately 500 staff members in each (250 taking the original survey and 250 taking the HSOPS 2.0 and supplemental item set). Because we require such a large sample within each hospital, we will target only hospitals with 49 or more beds. For hospitals with fewer than 500 providers and staff, we will conduct a census in the hospital (assuming an average of 375 providers and staff in these hospitals). This will yield a total of 18,375 sample members if all 40 hospitals participate. Assuming a response rate of 50 percent, this will yield a total of 9,188 completed questionnaires. The total annualized burden is estimated to be 2,387 hours.
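The burden figures above can be reproduced with simple arithmetic (the 18,375 sample total is taken as stated in the text, since the exact mix of census and sampled hospitals is not spelled out):

```python
import math

# Cognitive interview burden (Exhibit 1, rows 1-2)
cognitive_v2 = 36 * 1.5    # HSOPS 2.0 interviews: 54 hours
cognitive_supp = 36 * 1.0  # supplemental-item interviews: 36 hours

# Pilot test / bridge study burden (Exhibit 1, row 3)
total_sampled = 18375                       # stated total across 40 hospitals
completes = math.ceil(total_sampled * 0.50) # 50% response rate -> 9,188
pilot_hours = completes * 0.25              # 15 minutes per questionnaire

total_hours = cognitive_v2 + cognitive_supp + pilot_hours  # 2,387
```

These values match the 54, 36, 2,297, and 2,387 hour entries in Exhibit 1.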


Exhibit 2 shows the estimated annualized cost burden associated with the participants’ time to take part in this research. The total cost burden is estimated to be $84,879.88.


Exhibit 1. Estimated annualized burden hours

Form Name/Activity | Number of respondents | Hours per response | Total burden hours
Cognitive interviews – HSOPS 2.0 | 36 | 1.5 | 54
Cognitive interviews – Supplemental Items | 36 | 1.0 | 36
Pilot test and bridge study (Attachments A and B) | 9,188 | 0.25 | 2,297
Total | 9,260 | n/a | 2,387


Exhibit 2. Estimated annualized cost burden

Form Name/Activity | Total burden hours | Average hourly wage rate* | Total cost burden
Cognitive interviews (HSOPS 2.0 and supplemental items) | 90 | $36.05a | $3,244.50
Pilot test and bridge study | 2,297 | $35.54b | $81,635.38
Total | 2,387 | n/a | $84,879.88

a Based on the weighted average hourly wage in hospitals for one physician (29-1060; $103.54), one registered nurse (29-1141; $30.67), one general and operations manager (11-1021; $54.50), and six clinical lab techs (29-2010; $22.62), whose hourly wage is meant to represent wages for other hospital employees who may participate in cognitive interviews.

b Based on the weighted average hourly wage in hospitals for 7,625 registered nurses, 805 clinical lab techs, 677 physicians and surgeons, and 21 general and operations managers.

*National Industry-Specific Occupational Employment and Wage Estimates, May 2014, from the Bureau of Labor Statistics (available at http://www.bls.gov/oes/current/naics4_621100.htm [for general medical and surgical hospitals, NAICS 622100]).
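
The weighting described in footnote a can be verified directly; the following sketch reproduces the $36.05 figure from the occupation counts and BLS wages quoted above:

```python
# Reproduces the $36.05 weighted hourly wage in footnote a: one physician,
# one registered nurse, one manager, and six lab techs, using the BLS
# wages quoted above.
wages = [(1, 103.54), (1, 30.67), (1, 54.50), (6, 22.62)]
n_workers = sum(n for n, _ in wages)                      # 9 participants
weighted_wage = sum(n * w for n, w in wages) / n_workers
print(round(weighted_wage, 2))  # 36.05
```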





13.  Estimates of Annualized Respondent Capital and Maintenance Costs


Capital and maintenance costs would include the purchase of equipment, computers, computer software or services, or storage facilities for records as a result of complying with this data collection. This collection imposes no such costs; the only cost to respondents is that associated with their time to respond, as shown in Exhibits 1 and 2.


14.  Estimates of Annualized Cost to the Government


Exhibit 3 shows the estimated total and annualized cost for this project.  Although data collection will last for less than one year, the entire project will take about 3 years.  The total cost of the data collection activities includes $7,200 in incentives to the survey respondents. The total cost for this project is approximately $578,000, and the annualized cost is estimated at $192,667.


Exhibit 3. Estimated Total and Annualized Cost

Cost Component | Total Cost | Annualized Cost
Project Development | $33,029 | $11,010
Data Collection Activities | $165,142 | $55,047
Data Processing and Analysis | $123,858 | $41,286
Publication of Results | $49,543 | $16,514
Project Management | $41,286 | $13,762
Overhead | $165,142 | $55,047
Total | $578,000 | $192,667


Exhibit 4. Annual cost to AHRQ for project oversight

Position | Annual Salary | % of Time | Annual Cost
Project Officer – GS-15, Step 5 | $143,079 | 5% | $7,154
Subject Matter Expert – GS-15, Step 5 | $143,079 | 5% | $7,154
Program Specialist – GS-12, Step 5 | | 5% | $4,328
Total | | | $18,636









Salary figures are based on Office of Personnel Management salary tables: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/15Tables/html/DCB.aspx


Note that these oversight costs are included in “Overhead” in Exhibit 3.


15. Change in Burden


This is a new activity.


16. Time Schedule, Publication and Analysis Plan


As soon as OMB approval is received, pilot test and bridge study activities will begin. The estimated time schedule to conduct these activities is shown below:


  1. Up to three rounds of cognitive interviews for the English version of the draft HSOPS 2.0 survey (4 months)

  2. Up to three rounds of cognitive interviews for the English version of the Health IT Patient Safety supplemental item set (4 months)

  3. One round of cognitive interviews for the Spanish version of the draft HSOPS 2.0 survey (1 month)

  4. One round of cognitive interviews for the Spanish version of the supplemental item set (1 month)

  5. Pilot test and bridge study data collection (6 months)

  6. Data analysis, feedback report production, and development of technical reports (6 months)

  7. Final HSOPS 2.0 survey and development of toolkit materials (3 months)

  8. Final Health IT Patient Safety supplemental item set and development of toolkit materials (2 months)


The final version of HSOPS 2.0, Health IT Patient Safety supplemental item set, technical reports, and accompanying toolkit materials will be made publicly available on the AHRQ Web site.


This section outlines the analyses to be conducted on the combined pilot test and bridge study data. The analysis plan is broken out by pilot test analyses and bridge study analyses below.


Pilot Test Psychometric Analysis of Draft HSOPS 2.0 and Supplemental Item Set. The psychometric analyses described here will be conducted on both the draft HSOPS 2.0 data and the Health IT Patient Safety supplemental item set (i.e., survey items). Psychometric analysis will be conducted to examine item nonresponse, item response variability, factor structure, reliability, and construct validity of the items. Because the survey items are being developed to measure specific aspects or composites of patient safety culture, the factor structure of the survey items will be evaluated through multilevel confirmatory factor analysis. On the basis of the data analyses, items or factors may be dropped.


Descriptive Statistics

The means, standard deviations, and response frequencies for the survey items will be examined to ensure that respondents and hospitals exhibit adequate response variability on the survey items. In addition, items will be examined to ensure that they have low rates of missing data (less than 20% missing responses per item). Poorly functioning items will be identified.
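
The item nonresponse screen described above can be sketched as follows; the item names and response data here are hypothetical, not actual HSOPS 2.0 items:

```python
# Hypothetical item nonresponse screen (item names and data are
# illustrative). Items with 20% or more missing responses would be
# flagged for review.
responses = {
    "item_a": [5, 4, 2, 3, 4],
    "item_b": [None, None, 2, None, 4],   # 3 of 5 responses missing
    "item_c": [1, 2, 3, 4, 5],
}
missing_rate = {
    item: sum(v is None for v in values) / len(values)
    for item, values in responses.items()
}
flagged = [item for item, rate in missing_rate.items() if rate >= 0.20]
print(flagged)  # ['item_b']
```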


Individual Level Factor Analysis

A confirmatory individual-level factor analysis will be conducted first to examine whether groups of items intended to measure a specific patient safety composite are interrelated, ignoring the nesting of respondent data within hospitals. An item will be considered to contribute adequately to its a priori composite if the strength of its relationship to that factor (i.e., its factor loading) is 0.40 or greater.


We will also examine overall model fit indices using standard fit statistics: the chi-square, comparative fit index (CFI), and the standardized root mean square residual (SRMR). For chi-square statistics, lower and non-significant values indicate good fit. The factor structure is determined to adequately fit the data if the CFI is at least 0.90. A value of zero for the SRMR indicates perfect fit, but a value less than 0.08 is considered a good fit.


Intraclass Correlations (ICCs) and Design Effects

Intraclass correlations (ICCs) will be computed for each composite. ICCs indicate whether substantial variation exists between groups compared with variation within groups. ICCs above 0.05 (5%) indicate that the between-group variance is greater than expected by chance and imply that nesting within groups affects the responses of individuals.


Because ICCs can be distorted when there are many groups with few individuals each (or few groups with many individuals each), we will also examine design effects, which take within-group sample size into account. A design effect of 2 or more implies that the nesting of individuals within groups affects their responses and that multilevel modeling should therefore be used to account for the multilevel structure of the data.
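
A minimal sketch of the ICC and design-effect computations described above, using synthetic data (not actual survey results) and the standard one-way ANOVA estimator of ICC(1):

```python
import numpy as np

# Sketch of the ICC(1) and design-effect computations using the one-way
# ANOVA estimator; the data here are synthetic, not actual survey results.
rng = np.random.default_rng(0)
n_groups, m = 40, 50                      # e.g., 40 hospitals, 50 staff each
group_effect = rng.normal(0.0, 0.4, n_groups)
data = 3.5 + group_effect[:, None] + rng.normal(0.0, 1.0, (n_groups, m))

grand_mean = data.mean()
msb = m * ((data.mean(axis=1) - grand_mean) ** 2).sum() / (n_groups - 1)
msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n_groups * (m - 1))

icc = (msb - msw) / (msb + (m - 1) * msw)  # between-group share of variance
deff = 1 + (m - 1) * icc                   # design effect for group size m
print(f"ICC = {icc:.3f}, design effect = {deff:.2f}")
```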


Multilevel Confirmatory Factor Analysis (MCFA)

Individuals responding to the survey are located within hospitals. When data are nested in groups like this, results from an individual level confirmatory factor analysis may be biased or incorrect. Therefore, multilevel confirmatory factor analysis will be conducted on the survey’s a priori composites to examine the structure of the factors at the hospital level of analysis, taking into consideration that the data are nested.


An MCFA will be conducted to test the fit of the measurement model for the survey’s patient safety composites, taking into consideration the nested nature of the data at the hospital level of analysis. We will first evaluate the MCFA results by examining the item factor loadings on the composites. Factor loadings should be 0.40 or greater.


As in the individual-level analysis, we will examine overall model fit using the chi-square statistic, the comparative fit index (CFI, at least 0.90), and the standardized root mean square residual (SRMR, less than 0.08).


Reliability Analysis

Reliability analyses will then be performed on the composites to examine whether individuals responded consistently to the items within each composite. Internal consistency reliability will be calculated using Cronbach's alpha. The minimum criterion for acceptable reliability is an alpha of at least 0.70.
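Cronbach's alpha can be computed from the item variances and the variance of the summed scale; a minimal sketch with synthetic data:

```python
import numpy as np

# Minimal Cronbach's alpha sketch; the example matrix is synthetic.
def cronbach_alpha(items):
    """items: respondents-by-items array of scale scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Two perfectly correlated items give the maximum alpha of 1.0.
x = np.array([[1, 1], [2, 2], [3, 3], [4, 4]], dtype=float)
print(cronbach_alpha(x))  # 1.0
```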


Intercorrelations

Intercorrelations among the survey’s patient safety composites, supplemental items, and outcome measures included in the survey (e.g., Recommend the hospital, Overall rating on patient safety) will also be examined, at both the individual and hospital levels of analysis. Although the composites should be correlated, since they all measure aspects of patient safety culture, the intercorrelations should not be extremely high: correlations of 0.80 or higher indicate that composites may not be distinct enough to be considered separate constructs or measures. While there is no standard criterion for acceptable levels of dimension intercorrelations and construct validity, in general such correlations should be below 0.80 for the composites to be considered unique and to avoid problems with multicollinearity.
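
A check for problematic intercorrelations might look like the following sketch; the composite names and scores are illustrative only:

```python
import numpy as np

# Hypothetical screen for composite intercorrelations of 0.80 or higher;
# composite names and scores are illustrative only.
names = ["teamwork", "communication", "reporting"]
scores = np.array([
    [4.2, 4.1, 3.0],
    [3.8, 3.9, 2.5],
    [4.5, 4.6, 3.2],
    [3.1, 3.0, 2.8],
    [4.0, 4.2, 2.6],
])
corr = np.corrcoef(scores, rowvar=False)
high = [(names[i], names[j], round(corr[i, j], 2))
        for i in range(len(names))
        for j in range(i + 1, len(names))
        if corr[i, j] >= 0.80]
print(high)  # flags the teamwork-communication pair
```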


The above analyses will be used to identify poorly functioning items and composites and remove them from the survey, yielding a final set of items and composites with good psychometric properties and reducing the overall length of the final survey. The Technical Expert Panel will be informed of the data analysis results and, when the psychometric results do not provide enough guidance, will be asked to weigh in on which items to retain or drop based on the content value of the items.


The final HSOPS 2.0 survey and supplemental item set will be made publicly available on the AHRQ Web site for use by hospitals and researchers.


Bridge Study of Original HSOPS to Final HSOPS 2.0. The purpose of the bridge study is to help hospitals interpret trends in their original HSOPS scores after implementation of the final HSOPS 2.0. Within hospitals, respondents will be randomly assigned to take either the old or the new instrument. The difference in scores between the original HSOPS and the final HSOPS 2.0 observed during the bridge study estimates the change attributable to the instrument itself; subtracting this difference from a hospital’s observed trend isolates the change attributable to its patient safety culture rather than to the change in survey instrument. The bridge study analyses will be performed on the final HSOPS 2.0 that results from the pilot test analyses of the same data collection effort.


To ensure that this method can be safely extended beyond the specific hospitals participating in the bridge study, the statistical significance of observed differences between HSOPS and HSOPS 2.0 composite scores in the bridge study will be assessed using t-tests of proportions, both at the composite level and for individual items that persisted from HSOPS to HSOPS 2.0. The sampling methodology described in Supporting Statement B was designed to detect a three percentage point change in composite scores with statistical power of at least 0.74, with 10 of the 12 composites having power well above 0.80. In other words, if a true three percentage point difference exists in the population of hospitals, the probability of detecting it will be at least 74 percent; for most composites, it will be above 90 percent.
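
The bridge-study adjustment and the significance testing described above can be illustrated as follows; all scores are hypothetical percent-positive values, and a two-sample z-test of proportions stands in for the t-tests named in the text:

```python
from math import sqrt
from statistics import NormalDist

# Illustration of the bridge-study adjustment; all scores are hypothetical
# percent-positive values, not real results.
old_baseline = 68.0     # original HSOPS score from a prior administration
bridge_old = 70.0       # original HSOPS score during the bridge study
bridge_new = 73.0       # HSOPS 2.0 score during the bridge study
instrument_shift = bridge_new - bridge_old   # change attributable to the survey

later_new = 75.0        # HSOPS 2.0 score from a later administration
adjusted_trend = (later_new - instrument_shift) - old_baseline
print(adjusted_trend)   # 4.0 points attributable to culture, not the instrument

# Significance of the bridge-study difference (250 respondents per arm)
p1, p2, n1, n2 = bridge_old / 100, bridge_new / 100, 250, 250
p_pool = (n1 * p1 + n2 * p2) / (n1 + n2)
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p2 - p1) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(round(z, 2), round(p_value, 3))
```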



17. Exemption for Display of Expiration Date


No exemption is being requested.


List of Attachments


Attachment A – Crosswalk Between the Original AHRQ Hospital Survey on Patient Safety Culture and Draft Version 2.0


Attachment B – Draft Health IT Patient Safety Supplemental Item Set


Attachment C – Draft HSOPS 2.0 Cognitive Interview Guide


Attachment D – Draft Health IT Patient Safety Supplemental Item Set Cognitive Interview Guide


Attachment E – Hospital Point of Contact (POC) Instructions


Attachment F – Survey Invitation and Reminder Notices


Attachment G – Federal Register Notice


Attachment H – Technical Expert Panel Members








3 Sorra JS, Nieva VF. Hospital Survey on Patient Safety Culture. (Prepared by Westat, under Contract No. 290-96-0004). AHRQ Publication No. 04-0041. Rockville, MD: Agency for Healthcare Research and Quality. September 2004. Available at: http://www.ahrq.gov/professionals/quality-patient-safety/patientsafetyculture/hospital/resources/hospcult.pdf.

4 2014 User Comparative Database Report: Hospital Survey on Patient Safety Culture. March 2014. Agency for Healthcare Research and Quality, Rockville, MD. Available at: http://www.ahrq.gov/professionals/quality-patient-safety/patientsafetyculture/hospital/2014/index.html

5 The complete suite of patient safety culture surveys is available at: http://www.ahrq.gov/professionals/quality-patient-safety/patientsafetyculture/index.html

6 Khatri N, Brown GD, Hicks LL. From a blame culture to a just culture in health care. Healthcare Management Review. 2009; 34: 312-322.

7 Polivka, A., and Miller, S. The CPS after the redesign: Refocusing the economic lens. In Haltiwanger, J., Manser, M., and Topel, R. (Eds.), Labor Statistics Measurement Issues (pp. 249-289). Chicago, IL: University of Chicago Press.

8 Halligan M, Zecevic A. Safety culture in healthcare: a review of concepts, dimensions, measures, and progress. BMJ Qual Saf. 2011; 20: 338-343

9 The Health Foundation. Report: Measuring Safety Culture. 2011. Available at: http://www.health.org.uk/public/cms/75/76/313/2600/measuring%20safety%20culture.pdf?realName=p6V3X0.pdf

10 Etchegaray J, Thomas E. Comparing two safety culture surveys: Safety Attitudes Questionnaire and Hospital Survey on Patient Safety. BMJ Qual Saf. 2012.

11 See https://med.uth.edu/chqs/surveys/safety-attitudes-and-safety-climate-questionnaire for the SAQ survey instrument and a bibliography of research on the measure.

12 See, for example, Hsiao, Chun-Ju, PhD, and Hing, Esther, MPH, Use and Characteristics of Electronic Health Record Systems Among Office-based Physician Practices: United States, 2001-2013. NCHS Data Brief, No. 143, January 2014 (see http://www.cdc.gov/nchs/data/databriefs/db143.pdf) or http://www.skainfo.com/press_releases.php?article=122

13 Office of the National Coordinator for Health IT, Report to Congress, Update on the Adoption of Health Information Technology and Related Efforts to Facilitate the Electronic Use and Exchange of Health Information, October 2014  (see: http://www.healthit.gov/sites/default/files/rtc_adoption_and_exchange9302014.pdf)


14 Mercer, A., Caporaso, A., Cantor, D., and Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105-129.

15 Shaw, M., Talley, N., Beebe, T., and Rockwood, T. (2001). Initial validation of a diagnostic questionnaire for gastroesophageal reflux disease. Am J Gastroenterology, 96(1):52-7.

16 Cho, Y., Johnson, T., Vangeest, J. (2013). Enhancing surveys of health care professionals – a meta-analysis of techniques to improve response, Eval Health Prof, 36(3): 382-407

17 McLeod, C., Klabunde, C., Willis, G., Stark, D. (2013). Health Care Provider Surveys in the United States, 2000-2010: A Review, Eval Health Prof,  36: 106-126

18 Salinas, G. D. (2014). Trends in physician preferences for and use of sources of medical information in response to questions arising at the point of care: 2009–2013. Journal of Continuing Education in the Health Professions, 34(S1), S11-S16


