Evaluation of Hospital Preparedness in a Mass Casualty Event (MCE)

Submitted by:
Department of Health and Human Services
Centers for Disease Control and Prevention
National Center for Injury Prevention and Control
Division of Unintentional Injury Prevention
4770 Buford Highway, NE F62
Atlanta, GA 30341-3717

Project Officer: Mark Faul, PhD, MA
Tel: 770.488.1276
Fax: 770.488.1317
Email: [email protected]
Date: July 18, 2014
B. Statistical Methods
1. Respondent Universe and Sampling Methods
2. Procedures for Collection of Information
3. Methods to Maximize Response Rates and Deal with Nonresponse
4. Tests of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
B. Statistical Methods
1. Respondent Universe and Sampling Methods
The respondent universe consists of all adult and pediatric, trauma-level and non-trauma-level hospitals in the United States. The American Hospital Association 2013 Hospital Guide, which lists approximately 6,500 hospitals, will serve as the sample frame. The guide includes hospital name, CEO name, state, address, telephone number, and number of beds. The targeted respondent at each hospital is the emergency preparedness manager/coordinator (EPMC), who is expected to be the person most knowledgeable about the hospital's preparedness activities.
A total of 400 hospitals will be randomly selected from the sample frame using simple random sampling, without stratification or clustering. A database of hospitals will be created, and random selection will be conducted using a random number generator. The CEO of each selected hospital will be mailed or emailed an invitation letter and will receive a follow-up telephone call to introduce the study and request permission to contact the EPMC. If permission is granted, the EPMC will be mailed or emailed an invitation letter that includes login information for the web-based survey. Follow-up telephone or email contacts will direct respondents to the survey website for data collection.
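For illustration only, a minimal sketch of the random selection step is shown below, assuming the sample frame has already been compiled into a list of hospital records; the record fields and seed are placeholders rather than part of the study protocol.

```python
import random

def select_hospitals(frame, n=400, seed=2014):
    """Draw a simple random sample (without replacement) from the hospital frame.

    `frame` is assumed to be a list of hospital records, e.g., dicts holding
    hospital name, CEO name, state, address, telephone number, and bed count.
    The seed is illustrative; it only makes the draw reproducible.
    """
    rng = random.Random(seed)
    return rng.sample(frame, n)

# Illustrative usage with a placeholder frame of 6,416 records.
frame = [{"hospital_id": i} for i in range(6416)]
sampled = select_hospitals(frame)
print(len(sampled))  # 400
```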
A response rate of 80% is targeted, resulting in a sample size of 320 completed surveys. A sample of 320 hospitals allows for generalizability to the study population with a margin of error of approximately +/- 5.5% at the 95% level of confidence. The population and sample sizes are presented in Table B.1.1.
Table B.1.1. Study Population and Sample Size

Component | Size
Respondent Universe (Study Population) | 6,416
Hospitals Sampled | 400
Respondents (EPMCs) | 320
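As an informal check on the stated precision (not part of the study protocol), the sketch below computes the half-width of a 95% confidence interval for a proportion at n = 320 using the conservative p = 0.5, with the finite population correction for N = 6,416 shown as an option.

```python
import math

def margin_of_error(n, N=None, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion.

    Uses the conservative p = 0.5; the finite population correction is
    applied only when the frame size N is supplied.
    """
    moe = z * math.sqrt(p * (1 - p) / n)
    if N is not None:
        moe *= math.sqrt((N - n) / (N - 1))
    return moe

print(round(100 * margin_of_error(320), 1))          # ~5.5 (no correction)
print(round(100 * margin_of_error(320, N=6416), 1))  # ~5.3 (with correction)
```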
2. Procedures for Collection of Information
The survey is a self-administered questionnaire (Attachment F). Data collection will be completed via a web-based platform. Once the 400 hospitals are selected from the sample frame, SciMetrika will send an invitation letter to each hospital's CEO and will follow up with a phone call to determine whether the facility will participate. The appropriate respondents (i.e., EPMCs) will be identified by their respective hospital CEOs. SciMetrika will obtain each CEO's permission to contact and invite the EPMC to participate in the survey. Once participation is confirmed, participants will be provided with a username and password to access and complete the web-based survey. When participants arrive at the website (URL to be determined) over Hypertext Transfer Protocol Secure (HTTPS), they will enter their username and password and will then see information about the study background and informed consent. Respondents will click the "accept" button to begin the survey and the "submit" button at the end to finish.
Statistical analysis will begin with descriptive statistics, including frequency counts and means. Group differences will be assessed using chi-square tests, t-tests, ANOVA, and regression procedures. The degree of accuracy required is a margin of error of no more than +/- 7% at the 95% confidence level. There are no unusual problems requiring specialized sampling procedures.
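For illustration, a minimal analysis sketch follows; the file name and variable names (trauma_level, has_mce_plan, preparedness_score) are hypothetical placeholders, not the actual survey items.

```python
import pandas as pd
from scipy import stats

# Hypothetical analysis file exported from the survey platform.
df = pd.read_csv("mce_survey_responses.csv")

# Descriptive statistics: frequency counts and means.
print(df["trauma_level"].value_counts())
print(df["preparedness_score"].mean())

# Chi-square test of association between two categorical variables.
table = pd.crosstab(df["trauma_level"], df["has_mce_plan"])
chi2, p_chi, dof, _ = stats.chi2_contingency(table)

# t-test comparing a continuous score across trauma vs. non-trauma hospitals.
trauma = df.loc[df["trauma_level"] == "trauma", "preparedness_score"]
non_trauma = df.loc[df["trauma_level"] == "non-trauma", "preparedness_score"]
t_stat, p_t = stats.ttest_ind(trauma, non_trauma, equal_var=False)

print(chi2, p_chi, t_stat, p_t)
```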
Data collection will occur one time only.
The study will employ quality control measures, including monitoring of data collection to ensure that the survey is functioning properly, to minimize missing data, and to incorporate any needed revisions. The survey software will automatically control skip patterns and check for unanswered questions to minimize user error.
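A minimal sketch of the automated completeness check is shown below, assuming the survey platform represents skip logic as per-question applicability rules; the data structures are illustrative, not the platform's actual schema.

```python
def unanswered_required_items(answers, skip_rules):
    """Return the IDs of questions that apply to this respondent but are unanswered.

    `answers` maps question IDs to responses; `skip_rules` maps each question ID
    to a predicate that decides, from the answers so far, whether the item applies.
    """
    missing = []
    for qid, applies in skip_rules.items():
        if applies(answers) and answers.get(qid) in (None, ""):
            missing.append(qid)
    return missing

# Example: Q2 is asked only when Q1 == "yes".
rules = {
    "Q1": lambda a: True,
    "Q2": lambda a: a.get("Q1") == "yes",
}
print(unanswered_required_items({"Q1": "yes", "Q2": ""}, rules))  # ['Q2']
```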
The study is a cross-sectional descriptive design. It does not entail an experiment.
3. Methods to Maximize Response Rates and Deal with Nonresponse
Previous national hospital surveys have typically achieved response rates between 50% and 65%. For example, Ash et al. (2004) surveyed 964 randomly selected hospitals to determine the availability of inpatient computerized physician order entry in U.S. hospitals and the degree to which physicians were using it, and received a 65% response rate.1 Lamb et al. (2003) surveyed 479 hospitals on disclosure of unexpected outcomes to patients and achieved a 51% response rate.2 Jha et al. (2009) achieved a 63.1% response rate in a survey of hospitals regarding the use of electronic health records.3 Lastly, Asch et al. (1997) found that the mean response rate for mail surveys published in medical journals was about 60%.4 This study aims to obtain an 80% response rate. The methods used to maximize the response rate for this survey are most similar to those used by Jha et al. (2009) and include the following procedures:
- Use of an invitation letter on CDC letterhead, signed by a CDC official, sent to hospital CEOs
- Telephone follow-up calls to CEOs to confirm approval and obtain permission to contact the emergency preparedness manager/coordinator
- Follow-up contact with emergency preparedness managers/coordinators
- A web-based survey for ease of response, with completion possible in multiple sessions
- Up to four follow-up telephone calls or emails to non-respondents
If the response rate early in the data collection period is higher than the targeted 80%, correspondingly fewer than 400 hospitals will be contacted to reach the target sample size. Nonresponse bias analyses will be conducted to determine differences between sample and population characteristics (e.g., number of beds, geographic location) and the potential direction of any effects on estimates.
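As an illustration of the planned nonresponse bias analysis, the sketch below compares responding and non-responding hospitals on a frame characteristic such as bed-size category; the file and column names are assumptions for the example.

```python
import pandas as pd
from scipy import stats

# Hypothetical inputs: the sampled hospitals with frame characteristics,
# and the subset that completed the survey.
sampled = pd.read_csv("sampled_hospitals.csv")          # includes hospital_id, bed_size_category
respondents = pd.read_csv("responding_hospitals.csv")   # includes hospital_id

sampled["responded"] = sampled["hospital_id"].isin(respondents["hospital_id"])

# Chi-square test of whether response status is independent of bed-size category;
# a significant association would flag potential nonresponse bias on that dimension.
table = pd.crosstab(sampled["bed_size_category"], sampled["responded"])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(chi2, p)
```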
4. Tests of Procedures or Methods to be Undertaken
The questionnaire has undergone cognitive interview review, completed by nine hospital EPMCs. Questions rated as less clear and/or less responsive were revised for greater clarity and responsiveness. The same procedures to be used for gaining cooperation from CEOs and EPMCs for the survey were used successfully to recruit the cognitive interview participants. The web survey will undergo rigorous testing and debugging before data collection begins.
The survey questionnaire has not been used in prior studies. Although the questionnaire has undergone cognitive interviewing, a pilot test of the instrument and procedures will be conducted with nine hospitals. The pilot test will follow the same procedures as the main study to determine the effectiveness of the recruitment materials and procedures and to test the responsiveness of the survey. Nine hospitals will be randomly selected from the population of hospitals; pilot test hospitals will not be included in the main study. The CEOs will be contacted and their emergency preparedness managers recruited for the web survey. It is anticipated that seven surveys will be completed. Data will be examined for missing data, skip pattern accuracy, and survey completion. If deficiencies are found, revisions to the questionnaire, programming, and recruitment materials will be made prior to the main study.
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Table B.5.1 provides the names and telephone numbers of the individuals consulted on statistical aspects of the design and of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Table B.5.1. Contact Information for Statistical Consultants

Organization | Name | Role | Contact Information
CDC | Mark Faul | Statistical design consulting | 770.488.1276
SciMetrika | Darryl Cooney | Statistical design consulting; analyze information | 919.354.5212 / [email protected]
SciMetrika | Russ Foushee | Collect information | 919.354.5272 / [email protected]
SciMetrika | Dena Elimam | Collect information | 404.325.5002 / [email protected]
SciMetrika | Charles Hallman | Collect information | 919.354.5224 / [email protected]
SciMetrika | Ram Jain | Statistical design consulting; analyze information | 404.325.5002 / [email protected]
1 Ash JS, Gorman PN, Seshadri V, & Hersh WR. Computerized Physician Order Entry in U.S. Hospitals: Results of a 2002 Survey. Journal of the American Medical Informatics Association 11(2), Mar/Apr 2004.
2 Lamb RM, Studdert DM, Bohmer RMJ, Berwick DM, & Brennan TA. Hospital Disclosure Practices: Results of a National Survey. Health Affairs 22(2), 2003, 73-83.
3 Jha AK, DesRoches CM, Campbell EG, Donelan K, Rao SR, Ferris TG, Shields A, Rosenbaum S, & Blumenthal D. Use of Electronic Health Records in U.S. Hospitals. New England Journal of Medicine 360(16), 2009, 1628-38.
4 Asch DA, Jedrziewski MK, & Christakis NA. Response Rates to Mail Surveys Published in Medical Journals. Journal of Clinical Epidemiology 50(10), Oct. 1997, 1129-36.