OMB Control Number: 0990-0422

Supporting Statement for Request for Clearance:


EDUCATION AND TRAINING OF HEALTHCARE PROVIDERS AS A COORDINATED PUBLIC HEALTH RESPONSE TO

VIOLENCE AGAINST WOMEN


Contact Information:


Adrienne Smith, Ph.D., MS, CHES

Office on Women's Health

Office of Public Health Science

U.S. Department of Health and Human Services

200 Independence Avenue, SW, Room 733E

Washington, DC 20201

202-401-8325

202-401-4005 fax

[email protected]



July 1, 2014

SUPPORTING STATEMENT

EDUCATION AND TRAINING OF HEALTHCARE PROVIDERS AS A COORDINATED PUBLIC HEALTH RESPONSE TO

VIOLENCE AGAINST WOMEN



A. JUSTIFICATION


A.1 Need and Legal Basis


According to the 2010 National Intimate Partner and Sexual Violence Survey (NISVS), more than one in three women have experienced physical violence at the hands of an intimate partner, and nearly one in ten women in the United States (9.4%) have been raped by an intimate partner in their lifetime.1 As part of the White House's national strategy on domestic violence and women,2 the Obama administration reauthorized the Family Violence Prevention and Services Act,3 reauthorized the Violence Against Women Act,4 and included provisions for healthcare in the Affordable Care Act.


The Affordable Care Act (PHS Act § 2713)5 requires health insurance plans to cover preventive care and screening for women as defined by the Health Resources and Services Administration (HRSA) Women's Preventive Services Guidelines. These guidelines include screening and counseling for interpersonal and domestic violence.6 In addition, the U.S. Preventive Services Task Force released a recommendation in January 2013 calling for clinicians to “screen women of childbearing age for intimate partner violence.”7


Many health care providers are uncertain about how to handle disclosures of abuse and violence. The U.S. Department of Health and Human Services (DHHS) Office on Women's Health (OWH)8 is seeking to pilot and evaluate an e-learning course designed to educate and train healthcare providers on how to respond to intimate partner violence (IPV). The primary goal of the project is to develop a comprehensive e-learning course that trains healthcare providers on how to screen, assess, treat, and refer female patients who may be victims of domestic violence and/or sexual assault. The curriculum has already been developed and reviewed in conjunction with OWH and a panel of federal partners and national experts. The current data collection is needed to pilot and evaluate this e-learning course. This is a new data collection, and OWH is requesting OMB approval.


OWH will use the evaluation findings to revise the e-learning course and then launch an initiative with federal and state partners to disseminate the curriculum nationwide. The evaluation findings will also help OWH assess changes in knowledge, attitudes, and practice.


A.2 Purpose and Use of Information


The purpose of this data collection is to gather data from healthcare providers who have volunteered to participate in the pilot and evaluation of an e-learning course designed to educate and train healthcare providers on how to respond to intimate partner violence (IPV) against women. Information obtained from this data collection will be used to identify areas for improvement and to measure the effectiveness of the e-learning course in educating healthcare providers about IPV, addressing attitudinal barriers to IPV screening, and increasing IPV screening in clinical practice. These data will also help identify any problems in the navigation and functioning of the e-learning course. The results of this evaluation will assist OWH in revising the course and subsequently coordinating a national launch, making the e-learning course available to healthcare providers across the U.S. All data collection activities will take place within a 4-month time frame.


This evaluation supports the DHHS and OWH overall mission and strategic plan. It supports the DHHS objective of implementing “prevention policies, programming, and interventions to prevent and respond to individuals, families, and communities impacted by domestic violence.”9 It also enhances OWH's capacity to provide healthcare providers with accurate, evidence-based information and to identify innovative educational strategies.10 Furthermore, the results will aid in the planning and development of future OWH and other public- and private-sector initiatives to promote IPV awareness and screening in the healthcare setting. Knowledge gained from the evaluation will inform the federal, public, and private sectors about how IPV knowledge, attitudes, and practices may differ across healthcare providers and healthcare settings.


The Office on Women's Health intends to use the evaluation results of the e-learning course to address the PART deficiencies identified by the Office of Management and Budget in 2004. The evaluation will address several of the objectives for program management, strategic planning, and program results. Additionally, the evaluation results are critical to measuring the efficacy of the use of government funds.


Failure to collect this information would have negative consequences for the implementation of the e-learning course. This data collection is part of an effort to pilot and evaluate the curriculum before disseminating it nationally, and the evaluation findings are essential to determining and ensuring the course's effectiveness. Disseminating the e-learning course without knowing whether it is effective would be inconsistent with OWH's goal of providing health care professionals with accurate, science- and evidence-based health resources. In addition, not collecting this information may result in delays in the nationwide dissemination of an essential resource that will enhance healthcare providers' capacity to screen for IPV.


Overall, the evaluation of OWH's Healthcare Provider IPV e-learning course will assess the effectiveness of the e-learning curriculum in educating health providers about how to respond to intimate partner violence. Data will be collected through pre-, post-, and 3-month follow-up assessments. The evaluation methodology is designed to assess program effectiveness efficiently over a 4-month period. Although the project is designed to reach one respondent type, healthcare providers, OWH is specifically interested in three main professions: physicians (including urgent care physicians), nurses, and medical social workers. Professionals within the three states with the highest levels of domestic and/or sexual violence (South Carolina, Nevada, and Oklahoma) have been selected as the target population for this study.



A.3 Use of Improved Information Technology and Burden Reduction


Healthcare providers within the 3 selected states will use electronic technology to take the course and to complete all data collection activities. All data collection instruments (pre-, post-, and follow-up assessments) and their associated instructions will be hosted on a secure online website. Participants will be redirected from the e-learning course to a secure site where they will take the pre-, post-, and 3-month follow-up assessments online. Participants will take the course and the assessments at their own pace. After taking the pre-test, participants will have 2 weeks to complete the 2 modules and the post-test. Three months after completing the post-test, participants will be prompted to take the 3-month follow-up test. All participant activity associated with this project should occur within 2 quarters. All questions, except for one, are multiple choice, which reduces the time and burden on participants. OWH and GEARS Inc. will have 24-hour access to the data.




A.4 Efforts to Identify Duplication and Use of Similar Information


No similar data collection is being conducted within the agency, and no data collection efforts outside the agency have been made to collect these data. The respondents are participants in a new OWH project, and the data are specific to the evaluation of this e-learning course.


A.5 Impact on Small Businesses or Other Small Entities


This data collection may involve the collection of information from small businesses or other small entities: some of the participating healthcare providers may own private practices that could be considered small businesses. In consideration of respondents' time, GEARS developed the data collection assessments to include the minimum amount of information required to effectively evaluate the e-learning course. In addition, the e-learning format gives participants flexibility in choosing when to take the assessments, which further reduces the burden. After signing up for the course, participants have 2 weeks to complete the pre-test, take the curriculum, and complete the post-assessment. Participants have 1 week after the 3-month eligibility date to complete the follow-up assessment. Because the assessments are online, participants may take them at any time of day.


A.6 Consequences of Collecting the Information Less Frequently


This is a one-time data collection effort with one respondent type, healthcare providers, and 3 categories within this type: physicians, nurses, and medical social workers. Approval is sought for one year.


Data will be collected from participants at 3 different time points: at pre-test before accessing the curriculum, at post-test after participants finish the curriculum, and at follow-up 3 months after post-test.


Data collection at all three time points is needed. The baseline assessment taken at pre-test and the post-test assessment allow OWH to assess the immediate effect of the e-learning course on IPV knowledge, attitudes, and behaviors. The post-test also allows OWH to gather feedback on how well the e-learning course functions. The 3-month follow-up assessment is necessary to determine whether the outcomes of the curriculum endure over time.


If this data collection is not conducted, OWH's ability to accurately measure and evaluate the impact of this program against its stated objectives will be negatively affected. Failure to include these data collection activities as part of the overall evaluation design would limit the validity of the results and of the overall course. There are no legal obstacles to reducing respondent burden.


A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


The proposed evaluation fully complies with all guidelines of 5 CFR 1320.5 (d) (2).


A.8 Comments in Response to the Federal Register Notice/Consultation


The data collection notice for the evaluation of the Education and Training of Healthcare Providers program was published in the Federal Register, volume 79, number 64, pages 18691-18692, on April 3, 2014. A copy of the Federal Register notice is included as Appendix A. No comments were received from the public regarding this data collection.


The DHHS/OWH Project Officer for this data collection is Dr. Adrienne Smith. Additionally, OWH engaged the consulting firm Global Evaluation & Applied Research Solutions (GEARS), Inc. to assist in developing the survey instruments and evaluation methodology for this evaluation. GEARS is experienced in managing and conducting evaluations and provided expertise on issues including the availability of data, frequency of collection, clarity of instructions, record keeping, confidentiality, disclosure of data, reporting format, and necessary data elements. In 2010, GEARS completed the OMB-approved OWH evaluation of its “HIV Prevention Program for Young Women Attending Minority Institutions.” GEARS is also currently conducting an OMB-approved data collection for the Office on Women's Health for the “Girls at Greater Risk for Juvenile Delinquency and HIV Prevention Program.”


A.9 Explanation of Any Payment or Gift to Respondents


There will be no payment, gift, or reimbursement to respondents for time spent.




A.10 Assurance of Confidentiality Provided to Respondents


The evaluation contractor, GEARS, Inc., will not collect any identifying personal data from participants. The pre-, post-, and three-month follow-up assessments (see Appendix B) do not ask for participants' names or other personal identifiers. All information collected from program participants will be de-identified. GEARS uses a secure online site designed to collect data for the evaluation of this e-learning course. The first online page informs participants of the purpose of the assessment, how the information collected will be used, the estimated time to complete it, and that no personal identifiers will be linked to their responses. All information collected will be kept private to the extent permitted by law. The data will be electronically submitted to GEARS and used only for data analytic and evaluation purposes.


A.11 Justification of Sensitive Questions


The items and questions asked in this evaluation are not of a sensitive nature. Moreover, all questionnaires used in the evaluation have been reviewed by an Institutional Review Board to ensure that respondents’ rights are protected.


A.12 Estimates of Annualized Hour and Cost Burden


This evaluation is a one-time effort conducted over one year, with an estimated 2,001 annual burden hours. The evaluation of the pilot curriculum will be targeted to approximately 1,600 health care providers (unduplicated count), consisting of 800 physicians, 400 nurses, and 400 social workers. Exhibit A-1 presents the hourly burden breakdown used to derive the total burden time. Exhibit A-2 presents the annualized hourly costs for respondents.


Program participants will complete three assessments as part of their participation in the project. During the first 2 weeks, participants will complete a pre-test (given before taking the pilot curriculum) and a post-test (given immediately after completing the curriculum). Three months after taking the curriculum, respondents will complete the follow-up assessment. These assessments will be administered electronically through a website and will take 25 minutes each, or 75 minutes in total.




Exhibit A-1 Estimated Annualized Burden Hours


Form Name                    | Number of Respondents | No. Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours
Pre-Assessment               | 1,600                 | 1                            | 25/60                                  | 667
Post-Assessment              | 1,600                 | 1                            | 25/60                                  | 667
3-Month Follow-Up Assessment | 1,600                 | 1                            | 25/60                                  | 667
Total                        |                       |                              |                                        | 2,001
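
The totals in Exhibit A-1 follow directly from the figures above; as a quick arithmetic check:

$$1{,}600 \text{ respondents} \times 1 \text{ response} \times \tfrac{25}{60} \text{ hour} \approx 666.7 \rightarrow 667 \text{ hours per assessment}, \qquad 3 \times 667 = 2{,}001 \text{ total burden hours.}$$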



Exhibit A-2 Estimated Cost Burden


Type of Respondent             | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs
Healthcare Providers:          |                    |                  |
  Pre-Assessment               | 667                | $90.00           | $60,030.00
  Post-Assessment              | 667                | $90.00           | $60,030.00
  3-Month Follow-up Assessment | 667                | $90.00           | $60,030.00
Total                          |                    |                  | $180,090.00
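
Each row's cost in Exhibit A-2 is the burden hours multiplied by the hourly wage rate:

$$667 \text{ hours} \times \$90.00/\text{hour} = \$60{,}030.00 \text{ per assessment}, \qquad 3 \times \$60{,}030.00 = \$180{,}090.00 \text{ total.}$$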

A.13 Estimates of other Total Annual Cost Burden to Respondents or Record-keepers/Capital Costs


There are no additional respondent costs associated with start-up or capital investments. Additionally, there are no operational, maintenance, or equipment costs to respondents associated with continued participation in the evaluation. The total annual cost burden to respondents or record-keepers is $180,090, as presented in Exhibit A-2.


A.14 Annualized Cost to the Federal Government


The evaluation will be conducted over one year. The overall cost of implementing the evaluation is associated with the labor required to conduct the following activities: develop the evaluation design, methodology, and analysis plan; develop data collection forms; design and develop electronic data storage systems; manage data collection activities; develop quarterly reports; conduct and report site visits to funded contractors; train evaluation staff; ensure accurate data are maintained in the data storage systems; and analyze and report evaluation results. Exhibit A-3 presents the cost breakdown by major budget category.





Exhibit A-3 Cost of the Proposed Study


Activity                                                     | Cost
Personnel Costs (GEARS and federal employee)                 | $211,683.64
Other Costs (website, subcontractors, consultants, supplies) | $199,134.06
Total                                                        | $410,817.70


Total annualized costs to conduct this evaluation are $410,817.70.


A.15 Explanation for Program Changes or Adjustments


There are no changes in burden. This is a new project.


A.16 Plans for Tabulation and Publication and Project Time Schedule


Exhibit A-4 Project Time Schedule


Activity                                             | Time Period
Federal Register Notice and OMB Clearance            | April 3, 2014
Recruitment                                          | Once OMB approval is received
Pre-Assessment, e-Learning Course, & Post-Assessment | Within two weeks of receiving OMB approval
3-Month Follow-Up Assessment                         | Three months after completing the post-assessment
Analysis & Reporting                                 | Within three months of the completion of the last follow-up assessment


Publication

Evaluation findings will be summarized in a comprehensive Evaluation Report and Executive Summary developed by GEARS for OWH. The findings from this evaluation will be shared with a panel of federal partners and national experts and presented at regional and/or national conferences.


Analysis Plan

Quantitative and qualitative data will be collected for this evaluation. Data analysis will be supervised by Deborah Brome, Ph.D., Project Director, in consultation with Michael Milburn, Ph.D., Project Statistician. Data entry, file organization, and data access and management will be supervised by Dr. Brome.


  1. Qualitative Data Analysis: We will collect two types of qualitative, open-ended data. At the end of the follow-up assessment, there is one open-ended question soliciting respondents' overall impressions of the course; these responses will be coded, categorized, and summarized, and the findings will be used to guide OWH in modifying the curriculum and e-learning course. The second type of open-ended response is the answers participants write in for “Other” response options.


  2. Quantitative Data Analysis: Quantitative data will consist of measures of IPV knowledge, attitudes about IPV and the role of healthcare providers, IPV screening behavior, and assessment of the functionality of the e-learning course. The data will be exported from the online website to a statistical database and cleaned. Descriptive statistics, such as frequencies, percentages, means, and standard deviations, will be calculated, and reliability analyses for all scales will be conducted. The basic research design for analyzing the data is a mixed design with two between-groups factors (healthcare provider type and state) and one within-group factor (time), with three measurements taken at pre-test, post-test, and 3-month follow-up.
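
To make the design concrete, the sketch below shows how such a mixed analysis could be run with the Python pingouin library. This is an illustration, not the project's actual analysis code; the data file and column names are hypothetical. Because pingouin's mixed_anova accepts one between-groups factor at a time, provider type and state would each be tested in separate calls.

```python
import pandas as pd
import pingouin as pg  # assumed available; any mixed-ANOVA package would do

# Long-format data: one row per participant per time point, with
# hypothetical columns participant_id, provider_type, state,
# time ("pre", "post", "followup"), and knowledge_score.
df = pd.read_csv("ipv_assessments_long.csv")  # hypothetical file

# Mixed ANOVA: "time" is the within-subjects factor and
# "provider_type" the between-groups factor. The Interaction row
# of the output tests whether change over time differs by group.
aov = pg.mixed_anova(data=df, dv="knowledge_score", within="time",
                     subject="participant_id", between="provider_type")
print(aov.round(3))

# The state-by-time interaction is tested the same way:
aov_state = pg.mixed_anova(data=df, dv="knowledge_score", within="time",
                           subject="participant_id", between="state")
print(aov_state.round(3))
```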

 

There are four main outcome areas of interest: IPV knowledge, IPV attitudes, IPV screening, and course functionality. The internal consistency of these measures will be assessed using factor analysis and reliability analyses. Additionally, there is a group of questions that assesses knowledge for each specialty module. Background variables such as age and gender will be entered as covariates. While there may be main-effect differences across healthcare provider types and states, the interactions of these variables with time will reveal differences in outcomes. From a power analysis (Cohen, 1977) using tables for interactions in repeated-measures analysis of variance (Potvin & Schutz, 2000), we can judge the minimum sample size needed to ensure adequate power. Hypothesizing a medium effect (ES = .50), the time (2 df) by state (2 df) interaction has 4 degrees of freedom, and the healthcare provider (3 df) by time (2 df) interaction has 6 degrees of freedom. Because of the high education level of research participants, we anticipate high correlations across time on the within-group measures. Thus, to obtain power = .80 at p = .05, approximately 40 subjects per group are needed (extrapolating from Potvin & Schutz, Table 1, page 352).11 The proposed sample size of 400 per group (see Section B.1) thus ensures quite adequate statistical power.
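
As a rough cross-check on these published-table figures, the sketch below uses statsmodels' one-way ANOVA power solver. This is a deliberate simplification: it ignores the repeated-measures correlation that the Potvin and Schutz tables exploit, which is why it returns a larger per-group n than the 40 cited above. Cohen's f = 0.25 (the conventional "medium" effect for ANOVA, distinct from the d = .50 cited in the text) is our assumption for illustration.

```python
from statsmodels.stats.power import FTestAnovaPower

# Solve for the total N needed in a one-way ANOVA with 3 groups,
# a medium effect (Cohen's f = 0.25), alpha = .05, and power = .80.
solver = FTestAnovaPower()
total_n = solver.solve_power(effect_size=0.25, alpha=0.05,
                             power=0.80, k_groups=3)
print(f"total N ~ {total_n:.0f} ({total_n / 3:.0f} per group)")
# Accounting for high within-subject correlations across the three
# time points lowers the requirement substantially, consistent with
# the ~40 per group figure from Potvin & Schutz (2000).
```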

 

A.17 Reason(s) Display of OMB Expiration Date is Inappropriate


OMB expiration dates will be displayed on all materials.


A.18 Exceptions to Certification for Paperwork Reduction Act Submissions


There are no exceptions to the certification statement identified in item 19 “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.






B. Collection of Information Employing Statistical Methods.


B.1 Respondent Universe and Sampling Methods

We will be collecting data from three states with high levels of IPV: Nevada, Oklahoma, and South Carolina. Since our objective is to develop an e-learning curriculum that will be effective in reducing levels of IPV, piloting in states with high prevalence will test the procedures under conditions of maximum potential impact. The following table presents the number of physicians, nurses, and social workers in these states:

Respondent Group       | Oklahoma | Nevada | South Carolina | Total Population Estimate
Health Care Providers1 | 48,326   | 22,920 | 59,660         | 130,490
Physicians2            | 7,070    | 3,050  | 5,950          | 16,070
Nurses3                | 39,390   | 19,140 | 51,410         | 109,940
Medical Social Workers | 1,450    | 730    | 2,300          | 4,480

1 All estimates were taken from the U.S. Department of Labor, Bureau of Labor Statistics, accessed on January 30, 2014.

2 The category of physicians includes family and general practitioners, internists, obstetricians/gynecologists, pediatricians, and “other” physicians.

3 The category of nurses includes registered nurses and licensed practical nurses.



A useful article for determining the minimum sample size needed for our pilot study is Hertzog (2008).12 If we are simply estimating a proportion, as we will when we assess the change in the proportion of practitioners who screen for IPV at post-test and follow-up, Hertzog indicates that the 95% confidence interval of the proportion will be plus or minus 5% with a sample of n = 40 per group (Table 1, p. 182). This recommendation parallels the required sample size estimate derived from Potvin and Schutz (2000).
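
For reference, this kind of sample-size guidance for a proportion rests on the standard normal-approximation confidence interval (shown here generically; Hertzog's tables may incorporate refinements of it):

$$\hat{p} \pm z_{1-\alpha/2}\sqrt{\frac{\hat{p}(1-\hat{p})}{n}}, \qquad z_{0.975} \approx 1.96,$$

so the half-width of the interval shrinks in proportion to $1/\sqrt{n}$ as the sample grows.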

Short, Surprenant, and Harris Jr.13 and Harris et al.14 conducted similar evaluations of online CME intimate partner violence courses with physicians using paper surveys. In these studies, the enrollment response rate was 6%. Other OWH projects related to domestic violence have had state response rates of 10% to 30%. We are anticipating a response rate of 10% from each association.

Short et al. (2006) and Harris et al. (2002) reported retention rates of 66% after 6 weeks and 84% after 6 months. We expect a retention rate of at least 80%.
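
Taken together, these assumed rates imply a rough recruitment footprint. As a back-of-envelope sketch, using the 10% response and 80% retention figures above and the n = 1,600 post-attrition target described later in this section:

$$\text{sign-ups needed} \approx \frac{1{,}600}{0.80} = 2{,}000, \qquad \text{members contacted} \approx \frac{2{,}000}{0.10} = 20{,}000.$$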

We anticipate that the Cronbach's alphas for the measures will be fairly high. In similar evaluations of online IPV curricula, Cronbach's alphas for an IPV attitude scale ranged from .73 to .91, and the Cronbach's alpha for Short et al.'s (2006) perceived knowledge scale was also high (α > .95). With our sample of physicians (urgent care and all other physicians), nurses, and social workers, we anticipate the reliabilities of our knowledge and attitude scales to be at least .70 (test-retest correlation), a lower bound for acceptable stability (Nunnally & Bernstein, 1994). With high reliabilities (e.g., r = .80), a sample size of n = 40 per group ensures a 95% confidence interval of .73 to .86 (Hertzog, 2008, Table 4, p. 184).
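
Cronbach's alpha itself is straightforward to compute from a respondents-by-items score matrix; a minimal sketch follows (the data are made up for illustration):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
    """
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: 5 respondents x 3 attitude items scored on a 1-5 scale.
scores = np.array([[4, 5, 4],
                   [2, 2, 3],
                   [5, 4, 5],
                   [3, 3, 2],
                   [4, 4, 4]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```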

For the within-subjects (pre, post, and follow-up) design we are proposing, Hertzog (2008) indicates that power will be above .80 even for small effects with samples of n = 40 or greater (p. 188), assuming within-subject correlations of r = .60.

Thus, we will target obtaining samples (after attrition) of n = 400 from each of the four groups (urgent care physicians, other physicians, nurses, and social workers) across the three states, for a total n = 1,600. If a greater number of participants complete the e-learning module and all three data collection points, we will randomly sample from the pool of respondents.

B.2 Procedures for the Collection of Information

The following presents data collection procedures for the evaluation project:

  1. Establish agreements and obtain IRB approval (if necessary) from each professional association. GEARS will collect IRB approval letters from all associations participating in the evaluation.

  2. Obtain OMB clearance.

  3. Finalize all forms by making any changes suggested by OMB. Make sure the OMB clearance number is printed on all forms.

  4. The day OMB clearance is received, GEARS will send an email to each association to: 1) inform them that OMB clearance has been obtained, and 2) notify them that, within the next 3 days, they will receive a letter to distribute electronically to their members in the three pilot states. Participants will be redirected from the e-learning course to an online site to take all assessments (i.e., pre-, post-, and 3-month follow-up).

  5. Members will have 2 weeks to sign up for the course and take the pre-test. During this 2-week period, the associations will be provided with an email reminder to forward to their members about the e-learning opportunity.

  6. Participants will have 2 weeks from the time they take the pre-test to take the course and the post-test. We will send reminders 1 week and 1 day before the end of this 2-week period.

  7. Participants who have completed the posttest will be sent email reminders to complete the 3-month follow-up assessment at 1 month, 2 weeks, and on the day of eligibility.

  8. Participants will be given 1 week after the day of eligibility to complete the 3-month follow-up assessment (the full reminder schedule is sketched below). After completing the follow-up assessment, they will be given the option of receiving CEU/CME credits for their participation.
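
The reminder schedule in steps 5 through 8 reduces to simple date arithmetic; the sketch below illustrates it for one participant. The function and variable names are ours, not the project's, and the 3-month eligibility date is approximated as 90 days from the post-test due date (step 7 actually keys off the date the participant completes the post-test).

```python
from datetime import date, timedelta

def reminder_schedule(pretest_date: date) -> dict:
    """Key dates for one participant, following steps 6-8 above."""
    posttest_due = pretest_date + timedelta(weeks=2)       # step 6 window
    followup_eligible = posttest_due + timedelta(days=90)  # ~3 months (step 7)
    return {
        "post-test reminder (1 week out)":  posttest_due - timedelta(weeks=1),
        "post-test reminder (1 day out)":   posttest_due - timedelta(days=1),
        "follow-up reminder (1 month out)": followup_eligible - timedelta(days=30),
        "follow-up reminder (2 weeks out)": followup_eligible - timedelta(weeks=2),
        "follow-up reminder (day of)":      followup_eligible,
        "follow-up window closes":          followup_eligible + timedelta(weeks=1),
    }

for label, when in reminder_schedule(date(2014, 9, 1)).items():
    print(f"{label}: {when:%Y-%m-%d}")
```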

B.3 Methods to Maximize Response Rates and Deal with Nonresponse


Several reminders will be used from recruitment through follow-up, and CME/CEU credits will be offered to participants to increase response rates for this data collection. During the 2-week recruitment period, associations will send an email reminding their members of the deadline to sign up for the study. GEARS will send reminders to all participants during the 2-week period given to take the course. GEARS will also send periodic reminders (1 month, 2 weeks, and 1 day before the eligibility date) during the 3-month follow-up period. Drafts of communications (letters and email reminders) can be found in Appendix C.


B.4 Test of Procedures or Methods to be Undertaken


Several of the questions and items chosen for this evaluation were selected from standardized instruments from the literature on educating healthcare providers about domestic violence. Questions about perceived knowledge of IPV and clinical practices related to screening, treating, and referring were taken from the Physician Readiness to Manage Intimate Partner Violence (PREMIS) scale.15 This scale is widely referenced in the literature and has been used to evaluate an online CME course; the Cronbach's alpha for its perceived knowledge scale is fairly high (α > .95). Questions about attitudes toward IPV were taken from the Health Care Provider Survey for Domestic Violence (HCP-DV) scale.16 This survey is also widely known in the literature and has been used to assess an online CME course; Cronbach's alphas for the HCP-DV scale range from .73 to .91. Additional references, reliability, and validity information for all of the items from existing scales can be found in Appendix D. Questions about the functionality of the course were taken from Wang's (2003)17 e-Learner Satisfaction (ELS) scale, whose Cronbach's alphas range from .88 to .90.


All of the questions used to assess knowledge of the 9 course modules were developed by GEARS. GEARS developed 5-10 questions based on the learning objectives for each of the 9 modules. These questions were piloted by internal GEARS staff and reviewed by our statistician, subcontractor, and OWH.

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Program Development Contact

Adrienne Smith, Ph.D., MS, CHES

Public Health Advisor

U.S. Department of Health and Human Services

Office on Women's Health

202-401-8325

[email protected]


Data Collection/Analysis and Statistical Contact

Deborah Brome, Ph.D.

Vice President and Director of Evaluation & Applied Research

Global Evaluation & Applied Research Solutions (GEARS Inc.)

301-429-5982

[email protected]


Michael Milburn, Ph.D.

Professor of Psychology

University of Massachusetts, Boston

100 Morrissey Boulevard

Boston, Massachusetts

617-287-6386

[email protected]



1 Black, M. C., Basile, K. C., Breiding, M. J., Smith, S. G., Walters, M. L., Merrick, M. T., Chen, J., & Stevens, M. R. (2011). The National Intimate Partner and Sexual Violence Survey: 2010 Summary Report. Atlanta, GA: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention. Accessed 21 January 2014 at http://www.cdc.gov/violenceprevention/pdf/nisvs_executive_summary-a.pdf

2 The Obama Administration’s Commitment to Combating Violence Against Women. Accessed 21 January 2014 at http://www.whitehouse.gov/1is2many/about/federal-efforts.

3 US DHHS, Office of Women’s Health. Laws on Violence Against Women. Accessed 21 January 2014 at http://womenshealth.gov/violence-against-women/laws-on-violence-against-women/#a.

5 The Patient Protection and Affordable Care Act, Public Law No. 111-148, § 2713 (2010). 75 FR 41726 (July 19, 2010).

6 Health Resources and Services Administration. (2012). Women's Preventive Services: Required Health Plan Coverage Guidelines. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services. Accessed 21 January 2014 at http://www.hrsa.gov/womensguidelines/

7 U.S. Preventive Services Task Force. (2004). Screening for Family and Intimate Partner Violence: Recommendation Statement. Rockville, MD: U.S. Preventive Services Task Force. Accessed 21 January 2014 at http://www.uspreventiveservicestaskforce.org/3rduspstf/famviolence/famviolrs.pdf




8 The Office on Women's Health advances the work of eliminating violence against women in the country by stimulating programmatic and policy activity within DHHS.

9 DHHS. DHHS Strategic Plan & Priorities for FY2010-2015. Accessed 21 January 2014 at http://www.hhs.gov/secretary/about/priorities/strategicplan2010-2015.pdf


10 US DHHS, Office on Women’s Health Strategic Plan FY2014-FY2016. Accessed 21 January 2014 at http://www.womenshealth.gov/about-us/who-we-are/owhstrategicplanforwebsitenov2013508.pdf

11 Potvin, P. J., & Schutz, R. W. (2000). Statistical power for the two-factor repeated measures ANOVA. Behavior Research Methods, Instruments, & Computers, 32, 347-356.


12 Hertzog identified abstracts of pilot studies funded by National Institutes of Health (NIH) National Institute of Nursing Research (NINR) R03 and R15 grants from 2002 to 2004 using the CRISP database (National Institutes of Health, 2005), and searched Medline for articles on pilot studies published in 2004 and indexed under nursing. The sample sizes for studies similar to ours ranged from n = 24 to n = 419, with a median of n = 49.

13 Short, L. M., Surprenant, Z. J., & Harris Jr., J. M. (2006). A community-based trial of an online intimate partner violence CME program. American Journal of Preventive Medicine, 30(2), 181-185. doi:10.1016/j.amepre.2005.10.012

14 Harris Jr., J. M., Kutob, R. M., Surprenant, Z. J., Maiuro, R. D., & Delate, T. A. (2002). Can Internet-based education improve physician confidence in dealing with domestic violence? Methods for Continuing Medical Education, 34(4), 287-292.

15 Short, L. M., Alpert, E., Harris Jr., J. M., & Surprenant, Z. J. (2006). A tool for measuring Physician Readiness to Manage Intimate Partner Violence (PREMIS). American Journal of Preventive Medicine, 30(2), 173-180. doi:10.1016/j.amepre.2005.10.009

16 Maiuro, R. D., Vitaliano, P. P., Sugg, N. K., Thompson, D. C., Rivara, F. P., & Thompson, R. S. (2000). Development of a Health Care Provider Survey for Domestic Violence: Psychometric properties. American Journal of Preventive Medicine, 19(4), 245-252. doi:10.1016/S0749-3797(00)00230-0


17 Wang, Y. (2003). Assessment of learner satisfaction with asynchronous electronic learning systems. Information & Management, 41, 75-86. http://dx.doi.org/10.1016/S0378-7206(03)00028-4

