INFORMATION COLLECTION REQUEST
Centers for Disease Control and Prevention
National Center for Injury Prevention and Control
Evaluation of the Field Triage Decision Scheme: The National Trauma Triage Protocol
Supporting Statement B
October 2009
Kelly Sarmiento, MPH
DHHS/CDC/CCEHIP/NCIPC/DIR
Health Communications Specialist
Chamblee GA 30341-3717
Phone: (770) 488-1384
Fax: (770) 488-4338
E-mail: [email protected]
B. Statistical Methods
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.3 Methods to Maximize Response Rates and Deal with Nonresponse
B.4 Tests of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
B. Collections of Information Employing Statistical Methods
This supporting statement addresses the two data collection procedures related to the evaluation of CDC's Field Triage Decision Scheme: The National Trauma Triage Protocol, namely the focus groups and the EMS professional survey. Only the survey involves the use of statistical methods; therefore, this section describes only the EMS professional survey data collection methods.
B.1. Respondent Universe and Sampling Methods
A survey will be conducted with EMS professionals, a group that includes EMS administrators at the state and local level as well as EMS providers who work in the field. Specifically, the population of interest for this study is EMS professionals who have received and reviewed CDC's Field Triage Decision Scheme materials.
NCIPC is working with multiple professional organizations to distribute the Decision Scheme materials to their members. For evaluation purposes, NCIPC will contact a segment of the individuals who were sent a copy of the Decision Scheme materials through these means. NCIPC will collaborate with the National Registry of Emergency Medical Technicians (NREMT) to promote the survey. There are more than 200,000 individuals on this organization's member lists. NCIPC will work with NREMT to make a random selection of individuals, and NREMT will then send a survey invitation to those randomly selected members who had previously been sent a copy of the Decision Scheme materials.
The e-mail invitation will explain the purpose of the survey and include directions for accessing the survey Web site. NREMT will be asked to send the e-mail invitation so that potential participants receive it from a source they are familiar with. NREMT will also be asked to send one or two reminder e-mails to potential participants to increase the response rate.
The survey will be conducted in several waves. The first wave will be sent to a randomly selected group of 4,000 individuals. The survey will attempt to achieve an 80% response rate; if this proves unattainable, additional sample will be released until 3,000 responses are attained. After the first wave of invitations has been in the field for several weeks, we will evaluate response rates and target sample sizes, and the remaining sample, just enough to complete the survey, will be released soon thereafter. We will use the response rate observed in the first wave to determine the sample size of the second wave, assuming that the response rate will be similar because persons are chosen randomly. If needed to complete the targeted number of surveys, a third wave will be conducted. The survey will be closed after 3,000 individuals respond. Based on an 80% response rate and the 200,000 individuals in the survey population, 3,000 responses will provide estimates with a margin of error of approximately ±2.34 percentage points at the 99% confidence level.
Table B-1. Sampling Strategy
Sample | Sample Frame | Respondent Universe | Target Number of Completed Surveys
EMS Professionals | Random sample via e-mail from one professional organization | 4,000 in wave 1; wave 2 will release just enough sample to complete the targeted number of surveys; a third wave will be sent if necessary | 3,000
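The precision figure cited above can be reproduced with a standard finite-population calculation. The following is a minimal sketch, not part of the approved protocol: it assumes a population of 200,000, 3,000 completed surveys, a 99% confidence level (z ≈ 2.58), and maximal variance (p = 0.5), and it also illustrates how a second-wave sample size could be set from an observed first-wave response rate. The wave counts in the usage example are hypothetical.

```python
# Minimal sketch (not part of the approved protocol) reproducing the precision
# estimate cited above and illustrating second-wave sample sizing.
import math

def margin_of_error(n_completed: int, population: int, z: float = 2.58, p: float = 0.5) -> float:
    """Margin of error for a proportion, with a finite population correction."""
    fpc = math.sqrt((population - n_completed) / (population - 1))
    return z * math.sqrt(p * (1 - p) / n_completed) * fpc

def second_wave_invitations(target: int, completed_so_far: int, observed_rate: float) -> int:
    """Invitations still needed, assuming the first-wave response rate holds."""
    remaining = max(target - completed_so_far, 0)
    return math.ceil(remaining / observed_rate)

# Assumed figures from the sampling plan: 200,000-member universe, 3,000 completes.
print(f"+/- {margin_of_error(3_000, 200_000) * 100:.2f} percentage points")  # ~2.34
# Hypothetical wave-1 outcome: 2,400 completes at a 60% response rate.
print(second_wave_invitations(3_000, 2_400, 0.60))  # 1,000 additional invitations
```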
B.2. Procedures for the Collection of Information
Upon OMB approval, NOVA will commence data collection. Potential survey participants will be sent an initial e-mail invitation and a reminder e-mail in order to increase response rates (Attachment I). The reminder e-mail will be sent 5 weeks after the initial invitation. Both e-mails will include information about the survey and a link to the survey Web site.
NOVA will track survey responses on a daily basis and provide CDC/NCIPC with periodic updates. To facilitate tracking, an administrator’s tool will be developed as part of the online questionnaire that allows NOVA staff to obtain real-time updates on the number of surveys completed.
Data analysis of the EMS professional survey will use quantitative data analysis software, such as Stata. Analysis of quantitative measures will begin with descriptive statistics (e.g., frequencies, means, medians) to characterize the data and answer evaluation questions related to the Field Triage Decision Scheme. These methods will be used to analyze closed-ended questions in the survey. More complex analyses, including causal modeling, may be conducted depending on the quality and quantity of the data. Where appropriate, questions will be presented overall as well as cross-tabulated by significant variables.
Frequencies can be compared across response categories or variables (e.g., responses to items by respondents' demographics, type of work, or region) or on subsets of the population. Frequencies will be depicted in tables, charts, and graphs, and reported to CDC/NCIPC as part of the survey findings. After tabulating frequencies, NOVA will conduct chi-square tests on cross-tabulations of responses by relevant response categories or variables.
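As an illustration only (the contracted analysis will be carried out in a package such as Stata), the sketch below shows the kind of frequency, cross-tabulation, and chi-square output described above. The column names and records are hypothetical and do not reflect the actual survey instrument.

```python
# Illustrative sketch only; column names and records are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

def describe_and_test(df: pd.DataFrame, row_var: str, col_var: str) -> None:
    """Descriptive frequencies, a cross-tabulation, and a chi-square test."""
    print(df[col_var].value_counts(normalize=True))    # descriptive frequencies
    crosstab = pd.crosstab(df[row_var], df[col_var])   # cross-tabulation
    chi2, p_value, dof, _ = chi2_contingency(crosstab)
    print(crosstab)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")

# Example usage with made-up survey records:
records = pd.DataFrame({
    "region": ["South", "West", "South", "Midwest", "West", "South"],
    "used_decision_scheme": ["Yes", "No", "Yes", "Yes", "No", "No"],
})
describe_and_test(records, "region", "used_decision_scheme")
```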
B.3. Methods to Maximize Response Rates and Deal with Nonresponse
High response rates minimize selection bias in survey findings. Several procedures will be implemented to maximize the response rate. Survey response rates are more robust when the research topic is salient to the respondent’s work, when the questionnaire has been designed for maximum ease of administration, and when the data collection protocol is tailored through a variety of accommodations to acknowledge respondents’ cooperation and contribution. The presentation of the survey is also important, so that respondents can differentiate it from other research requests.
CDC/NCIPC will undertake considerable efforts to maximize response rates by using the design of the materials and the survey methodology to make responding as compelling and convenient as possible. The repeated e-mail contacts will remind respondents of the importance of providing feedback on the materials. The ability to take the survey online is appealing for its ease of access and use.
Furthermore, the survey was designed to address the research questions in a clear and succinct manner so that respondents' time burden in completing the survey is kept to a minimum. The estimated completion time is stated in the survey invitation so that respondents know the survey will take only a small amount of time to complete.
The introductory e-mail with the link to the survey will indicate that it is sponsored by CDC. The e-mail will succinctly inform the reader of the importance of the survey, as well as procedures for maintaining the privacy of respondents (i.e., identities of individuals will not be released, identifying information will be stored separately from the survey responses, and all information collected will be analyzed in the aggregate).
Consistent with the response rate calculations approved by the American Association for Public Opinion Research (AAPOR), response rates for this study will be calculated as follows:
Response Rate = Number of Completed Surveys / (Number of Completed Surveys + Number of Nonrespondents)
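A minimal sketch of this calculation follows; the counts are hypothetical, and the 80% result simply mirrors the target response rate stated in Section B.1.

```python
# Hedged illustration of the response rate formula above.
# The counts below are hypothetical examples, not projections for this study.
def response_rate(completed: int, nonrespondents: int) -> float:
    return completed / (completed + nonrespondents)

print(f"{response_rate(3_000, 750):.0%}")  # 3,000 completes, 750 nonrespondents -> 80%
```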
B.4. Tests of Procedures or Methods to be Undertaken
When constructing the survey instrument, items used previously in other surveys by CDC or other organizations were carefully evaluated for inclusion. The survey instrument was tested through cognitive interviews with three respondents who are EMS professionals similar to those who will respond to the survey. Respondents were informed that they were participating in a pretest of the survey and that they would be asked all survey items and then asked to provide feedback on the clarity and appropriateness of the language, the clarity of the survey directions, comprehension of item content, and the need to delete or add items, as well as any other suggestions for improving the survey.
Feedback from the respondents indicated that the language of the survey is appropriate and that the questions are clear. No substantive revisions to the survey were made; only slight wording changes were made, and response categories were added to several items.
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
CDC/NCIPC has contracted with NOVA Research Company to direct and implement all aspects of the study, including the proposed data collection and analysis. NOVA's Dan Eckstein will direct the development of the data analysis plan, including the statistical procedures applied to the collected data. NOVA's Allison Zambon and Fred Snyder will support him in these functions. Several NOVA staff with in-depth statistical training are also available as needed to perform statistical programming, prepare tables and summary statistics for reports, and assist in interpreting the results of the quantitative analysis.
Kelly Sarmiento of CDC/NCIPC will serve as the Technical Monitor and the federal agency staff member responsible for receiving and approving all contract deliverables. Ms. Sarmiento's phone number is 770-488-1384.