B. Collection of Information Employing Statistical Methods.
B.1 Respondent Universe and Sampling Methods
We will collect data from three states with high levels of intimate partner violence (IPV): Nevada, Oklahoma, and South Carolina. Because our objective is to develop an e-learning curriculum that is effective in reducing levels of IPV, piloting in states with high IPV levels tests the procedures under conditions of maximum potential impact. The following table presents the estimated numbers of health care providers, physicians, nurses, and social workers in these states:
Respondent Group | State Population Estimate | State Population Estimate | State Population Estimate | Total Population Estimate
Health Care Providers1 | 48,326 | 22,920 | 59,660 | 130,490
Physicians2 | 7,070 | 3,050 | 5,950 | 16,070
Nurses3 | 39,390 | 19,140 | 51,410 | 109,940
Medical Social Workers | 1,450 | 730 | 2,300 | 4,480
1 All estimates were taken from the U.S. Department of Labor, Bureau of Labor Statistics, accessed on January 30, 2014.
2 The category of physicians includes family and general practitioners, internists, obstetricians/gynecologists, pediatricians, and “other” physicians.
3 The category of nurses includes registered nurses and licensed practical nurses.
A useful article for determining the minimum sample size needed for our pilot study is Hertzog (2008).1 If we are simply estimating a proportion, as we will be when we assess the change in the proportion of practitioners who screen for IPV at posttest and follow-up, Hertzog indicates that the 95% confidence interval of the proportion will be plus or minus 5% with a sample of n = 40 per group (Table 1, p. 182). This recommendation parallels the required sample size estimate derived from Potvin and Schutz (2000).
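To make the precision reasoning easy to check, the following minimal sketch (Python, for illustration only) computes the half-width of a normal-approximation 95% confidence interval for a proportion at candidate sample sizes; the expected proportion of 0.10 and the candidate sample sizes are assumed placeholder values, not figures from Hertzog (2008).

    # Illustrative sketch: precision (half-width) of a Wald 95% CI for a proportion
    # at candidate pilot sample sizes. Placeholder values only; consult Hertzog (2008)
    # for the tables actually used to justify the design.
    import math

    def wald_ci_half_width(p_expected: float, n: int, z: float = 1.96) -> float:
        """Half-width of the normal-approximation (Wald) 95% CI for a proportion."""
        return z * math.sqrt(p_expected * (1 - p_expected) / n)

    for n in (40, 100, 400):
        print(f"n = {n:3d}: 95% CI half-width is approximately ±{wald_ci_half_width(0.10, n):.3f}")

Precision depends strongly on the expected proportion, so the half-width should be recomputed for the screening rates actually anticipated.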
Short, Surprenant, and Harris Jr.2 and Harris et al.3 conducted similar evaluations of online intimate partner violence CME courses with physicians using paper surveys. In these studies, the enrollment response rate was 6%. Other OWH projects related to domestic violence have had state-level response rates of 10% to 30%. We anticipate a response rate of 10% from each association.
Short et al. (2006) and Harris et al. (2002) reported retention rates of 66% after 6 weeks and 84% after 6 months. We expect a retention rate of at least 80%.
We anticipate that the Cronbach’s alphas for the measures will be fairly high. In similar evaluations of online IPV curricula, Cronbach’s alphas for an IPV attitude scale ranged from .73 to .91, and the Cronbach’s alpha reported by Short et al. (2006) for a perceived knowledge scale was also high (α > .95). With our sample of physicians (urgent care and all other physicians), nurses, and social workers, we anticipate the reliabilities of our knowledge and attitude scales to be at least .70 (test-retest correlation), a lower bound for acceptable stability (Nunnally & Bernstein, 1994). With high reliabilities (e.g., r = .80), a sample size of n = 40 per group ensures a 95% confidence interval of .73 to .86 (Hertzog, 2008, Table 4, p. 184).
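As a concrete illustration of how internal-consistency figures like those cited above could be reproduced from collected item responses, the sketch below computes Cronbach’s alpha for a respondent-by-item matrix; the simulated Likert responses are assumptions for demonstration only, not study data.

    # Illustrative sketch: Cronbach's alpha from an array of item responses
    # (rows = respondents, columns = scale items). Simulated data for demonstration only.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_score_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_score_variance)

    rng = np.random.default_rng(0)
    true_score = rng.normal(size=(200, 1))                       # shared trait per respondent
    noise = rng.normal(scale=0.8, size=(200, 10))                # item-specific error
    responses = np.clip(np.round(3 + true_score + noise), 1, 5)  # ten 5-point Likert items
    print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")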
For the within-subjects (pre, post, and follow-up) design we are proposing, Hertzog (2008) indicates that power will be above .80 even for small effects with samples of n = 40 or greater (p. 188), assuming within-subject correlations of r = .60.
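The simplified sketch below shows one way to approximate power for a single paired (pre/post) contrast under an assumed within-subject correlation; the standardized difference of d = 0.50 is an illustrative placeholder, and Hertzog’s repeated-measures tables remain the basis for the full three-wave design.

    # Illustrative sketch: approximate power for one paired pre/post comparison.
    # d (standardized pre/post difference) and r (correlation between repeated
    # measures) are placeholder assumptions; this does not reproduce Hertzog's
    # repeated-measures calculations.
    import math
    from statsmodels.stats.power import TTestPower

    d, r, n = 0.50, 0.60, 40
    d_z = d / math.sqrt(2 * (1 - r))   # effect size of the paired differences
    power = TTestPower().power(effect_size=d_z, nobs=n, alpha=0.05, alternative="two-sided")
    print(f"Approximate power for the paired contrast: {power:.2f}")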
Thus, we will target samples (after attrition) of n = 400 from each of the four respondent groups (two groups of physicians: urgent care physicians and other physicians; nurses; and social workers) across the three states, for a total n = 1,600. If a greater number of participants complete the e-learning module and all three data collections, we will randomly sample from the pool of respondents.
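If oversubscription occurs, the random subsampling could be done along the lines of the minimal sketch below, which assumes the pool of completers is a pandas DataFrame with a 'respondent_group' column; the column name, target, and seed are illustrative assumptions, not specifications.

    # Illustrative sketch: randomly draw n = 400 completers per respondent group
    # when more than enough participants finish all three assessments.
    import pandas as pd

    TARGET_PER_GROUP = 400  # target analysis sample per respondent group

    def draw_analysis_sample(pool: pd.DataFrame, seed: int = 2014) -> pd.DataFrame:
        return (
            pool.groupby("respondent_group", group_keys=False)
            .apply(lambda grp: grp.sample(n=min(TARGET_PER_GROUP, len(grp)), random_state=seed))
        )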
B.2 Procedures for the Collection of Information
The following presents data collection procedures for the evaluation project:
1. Establish agreements and obtain IRB approval (if necessary) from each professional association. GEARS will collect IRB approval letters from all associations participating in the evaluation.
2. Obtain OMB clearance.
3. Finalize all forms by making any changes suggested by OMB. Ensure the OMB clearance number is printed on all forms.
4. The day OMB clearance is received, GEARS will send an email to each association to: 1) inform them that OMB clearance has been obtained, and 2) notify them that they will receive a letter to distribute electronically to their members in the three pilot states within the next 3 days. Participants will be redirected from the e-learning course to an online site to complete all assessments (pre-test, post-test, and 3-month follow-up).
5. Members will have 2 weeks to sign up for the course and take the pre-test. During this 2-week period, the associations will be provided with an email reminder to forward to their members about the e-learning opportunity.
6. Participants will have 2 weeks from the time they take the pre-test to take the course and the post-test. We will send reminders 1 week and 1 day before the deadline during this 2-week period.
7. Participants who have completed the post-test will be sent email reminders to complete the 3-month follow-up assessment 1 month before, 2 weeks before, and on the day of eligibility.
8. Participants will be given 1 week after the day of eligibility to complete the 3-month follow-up assessment. After completing the follow-up assessment, they will be given the option of receiving CEU/CME credits for their participation. (The overall assessment and reminder timeline is sketched below.)
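The minimal sketch below restates the assessment and reminder timeline from the steps above as dates computed from a participant's pre-test and post-test dates; the assumption that the 3-month follow-up becomes due 90 days after the post-test, and the example dates, are illustrative only.

    # Illustrative sketch of the assessment/reminder timeline described above.
    # Assumes the 3-month follow-up becomes due 90 days after the post-test;
    # the offsets restate the schedule in this section and are not an operational system.
    from datetime import date, timedelta

    def schedule(pretest_date: date, posttest_date: date) -> dict:
        posttest_deadline = pretest_date + timedelta(weeks=2)
        followup_due = posttest_date + timedelta(days=90)
        return {
            "post-test reminder (1 week before deadline)": posttest_deadline - timedelta(weeks=1),
            "post-test reminder (1 day before deadline)": posttest_deadline - timedelta(days=1),
            "post-test deadline": posttest_deadline,
            "follow-up reminder (1 month before eligibility)": followup_due - timedelta(days=30),
            "follow-up reminder (2 weeks before eligibility)": followup_due - timedelta(weeks=2),
            "follow-up reminder (day of eligibility)": followup_due,
            "follow-up window closes": followup_due + timedelta(weeks=1),
        }

    for label, when in schedule(date(2014, 6, 2), date(2014, 6, 16)).items():
        print(f"{label}: {when.isoformat()}")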
B.3 Methods to Maximize Response Rates and Deal with Nonresponse
Several reminders will be used from recruitment through follow-up, and CME/CEU credits will be offered to participants to increase response rates for this data collection. During the 2-week recruitment period, associations will send an email reminding their members of the deadline to sign up for the study. GEARS will send reminders to all participants during the 2-week period they have to take the course. GEARS will also send periodic reminders (1 month, 2 weeks, and 1 day) during the 3-month follow-up period. Drafts of communications (letters and email reminders) can be found in Appendix C.
B.4 Test of Procedures or Methods to be Undertaken
Several of the questions and items chosen for this evaluation were selected from standardized instruments in the literature on educating healthcare providers about domestic violence. Questions about perceived knowledge of IPV and clinical practices related to screening, treating, and referring were taken from the Physician Readiness to Manage Intimate Partner Violence (PREMIS) scale.4 This scale is widely referenced in the literature and has been used to evaluate an online CME course; the Cronbach’s alpha for its perceived knowledge scale is fairly high (α > .95). Questions about attitudes toward IPV were taken from the Health Care Provider Survey for Domestic Violence (HCP-DV).5 This survey is also widely known in the literature and has been used to assess an online CME course; Cronbach’s alphas for the HCP-DV scale range from .73 to .91. Additional references, reliability, and validity information for all of the items from existing scales can be found in Appendix D. Questions about the functionality of the course were taken from Wang’s (2003)6 e-Learner Satisfaction (ELS) scale, for which Cronbach’s alphas range from .88 to .90.
All of the questions used to assess knowledge of the 9 course modules were developed by GEARS, which wrote 5-10 questions based on the learning objectives for each module. These questions were piloted by internal GEARS staff and reviewed by our statistician, subcontractor, and OWH.
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Program Development Contact
Adrienne Smith, Ph.D., MS, CHES
Public Health Advisor
U.S. Department of Health and Human Services
Office on Women's Health
202-401-8325
Data Collection/Analysis and Statistical Contact
Deborah Brome, Ph.D.
Vice President and Director of Evaluation & Applied Research
Global Evaluation & Applied Research Solutions (GEARS Inc.)
301-429-5982
Michael Milburn, Ph.D.
Professor of Psychology
University of Massachusetts, Boston
100 Morrissey Boulevard
Boston, Massachusetts
617-287-6386
1 Hertzog identified pilot studies by obtaining abstracts of National Institutes of Health (NIH) National Institute of Nursing Research (NINR) R03 and R15 grants funded from 2002 to 2004 using the CRISP database (National Institutes of Health, 2005), and by searching Medline for articles on pilot studies published in 2004 and indexed under nursing. The sample sizes for studies similar to ours ranged from n = 24 to n = 419, with a median of n = 49.
2 Short, L. M., Surprenant, Z. J., & Harris Jr., J. M. (2006). A community-based trial of an online intimate partner violence CME program. American Journal of Preventive Medicine, 30(2), 181-185. doi:10.1016/j.amepre.2005.10.012
3 Harris Jr., J. M., Kutob, R. M., Surprenant, Z. J., Maiuro, R. D., & Delate, T. A. (2002). Can internet-based education improve physician confidence in dealing with domestic violence? Methods for Continuing Medical Education, 34(4), 287-292.
4 Short, L. M., Alpert, E., Harris, J. M., & Surprenant, Z. J. (2006). A tool for measuring Physician Readiness to Manage Intimate Partner Violence (PREMIS). American Journal of Preventive Medicine, 30(2), 173-180. doi:10.1016/j.amepre.2005.10.009
5 Maiuro, R. D., Vitaliano, P. P., Sugg, N. K., Thompson, D. C., Rivara, F. P., & Thompson, R. S. (2000). Development of a Health Care Provider Survey for Domestic Violence: Psychometric properties. American Journal of Preventive Medicine, 19(4), 245-252. doi:10.1016/S0749-3797(00)00230-0
6 Wang, Y. (2003). Assessment of learner satisfaction with asynchronous electronic learning systems. Information & Management, 41, 75-86. doi:10.1016/S0378-7206(03)00028-4