
LPS2012 Instructions

April 2012


Terms of Clearance: Learner’s Perception Survey, 2900-0691


Learner's Perception Survey Instructions


This document is provided in reference to the non-response bias analysis requested by OMB. OMB made a second request asking the lead statistician to expand on the previous response by providing more detail.




Learners’ Perceptions Survey

Department of Veterans Affairs

LPS2012

Instructions

I. BACKGROUND

The Learners' Perceptions Survey (LPS) is a standardized, scientifically validated survey instrument designed to measure the perceptions of health professions trainees about their clinical training experiences at a specified clinical teaching facility. Clinical teaching facilities include medical centers, hospitals, and outpatient care clinics. The LPS is intended for research, as well as evaluative, government regulatory, policy-making, program administrative, and accreditation purposes.

The United States Department of Veterans Affairs (VA) designed the LPS in 1999 [1] through its Veterans Health Administration Office of Academic Affiliations (OAA) and an expert panel, now called the OAA's National Evaluation Workgroup. Since 2001, OAA has administered the LPS annually to all health professions trainees who rotate through, are assigned to, or spend educational time in a VA Healthcare System facility during an academic year. An academic year begins on July 1 of the prior calendar year and ends on June 30 of the year for which it is named.

Between 2001 and 2012, the LPS went through nine versions. The changes were designed to meet growing VA demands for information about its health professions trainees and to account for changes in the classification of health professions disciplines, specialty training, and academic levels.

To create VA versions of LPS questions, the term "MAIN facility" was replaced with "VA Medical Center," and "patient health record" was replaced with "Computerized Patient Record System" (CPRS), VA's electronic medical record. The VA version of the LPS was administered to VA trainees during academic years 2001 through 2012, as surveys LPS2001, LPS2002, …, LPS2012, respectively. The latest survey, LPS2012, administered during the 2012 academic year, was adapted from the current ninth version of the LPS instrument (LPS_v009).

A detailed description of how the VA's adaptation of the LPS changed, by LPS version number, is given in the table below.




Development of the Department of Veterans Affairs Learners' Perceptions Survey (LPS). Each entry lists the VA LPS survey, the LPS version number, the academic year administered, and comments on changes.

LPS2001 (v001; administered July 1, 2000 - June 30, 2001)
The initial survey was administered to all VA trainees. Questions asked about the respondent's discipline/specialty, academic level, gender, time in training, and percent of training time spent at VA. Facility-level domains included VA and non-VA comparisons, a 100-point numerical score, the overall value of the VA clinical training experience, and whether the respondent would recommend the experience to other trainees and would choose the VA training experience again. Core domains focused separately on Clinical Faculty/Preceptors, Learning Environment, Working Environment, and Physical Plant.

LPS2002 (v002; administered July 1, 2001 - June 30, 2002)
The second version added a listing of Physician Residency Specialties and VA Post-Residency Special Fellowship training programs. The Physical Plant Domain was renamed Physical Environment. The question describing "preparation for an evidence-based clinical practice," previously presented as a separate question, was listed as an element of the Clinical Faculty/Preceptors Domain. Questions asking for the name and address of the Main Medical Facility and the institutions sponsoring the training program were added. Seven items describing characteristics of patients seen were added. Respondent-level questions asking about the year the respondent graduated from medical school and whether the medical school was US or non-US were added.

LPS2003 (v003; administered July 1, 2002 - June 30, 2003)
The single survey was divided into two separate questionnaires, one intended for Associated Health trainees (AH) and the other for Physician Residents, including fellows and medical students (PR). Research Mentoring and Mentoring by Faculty elements were added to the Clinical Faculty/Preceptors Domain. The Personal Experience Domain was added. Patient characteristics described in terms of whether "Treatment will resolve an acute problem," "Treatment will stabilize or improve a chronic condition," and "Treatment will comfort or palliate" were added to the characteristics of patients seen. Clinical Environment, Staff/Service Availability, Staff/Service Quality, and Quality of Care and Patient Safety Domains were added to the PR survey. Questions asking about the Main Medical Facility were deleted.

LPS2004 (v004; administered July 1, 2003 - June 30, 2004)
The Quality of Care and Patient Safety Domain was re-focused to become the Systems and Process Medical Error Domain. A Topic Domain was added to the PR questionnaire describing the overall effect of the 2003 Accreditation Council for Graduate Medical Education (ACGME) duty hours/scheduling standards on training experiences. The facility-level question "Would you consider the VA as a future employment site?" was added. The element "Dealing with terminally ill patients" was removed from the Personal Experience Domain, and "Ownership/personal responsibility for your patients' care" was added to it. "Treatment will resolve an acute problem," "Treatment will stabilize or improve a chronic condition," and "Treatment will comfort or palliate" were removed from the characteristics of patients seen. Questions identifying the sponsoring institution were deleted.

LPS2005 (v005; administered July 1, 2004 - June 30, 2005)
The classification of academic levels for Pharmacy trainees was modified.

LPS2006 (v006; administered July 1, 2005 - June 30, 2006)
Specialty and subspecialty classifications for Physician Residents were expanded.

LPS2007 (v006; administered July 1, 2006 - June 30, 2007)
No change.

LPS2008 (v006; administered July 1, 2007 - June 30, 2008)
No change.

LPS2009 (v006; administered July 1, 2008 - June 30, 2009)
No change.

LPS2010 (v007; administered July 1, 2009 - June 30, 2010)
The Rehabilitation discipline was divided into blind, occupational, physical, and other therapy. The question "Are you currently on Active Duty in the military?" was added among questions describing the characteristics of the respondent.

LPS2011 (v008; administered July 1, 2010 - June 30, 2011)
The ACGME Topic Domain was deleted from the PR questionnaire.

LPS2012 (v009; administered July 1, 2011 - June 30, 2012)
The classifications of Physician Resident Specialty and Advanced Fellowship Programs were revised. Three Topic Domains were added: Psychological Safety, Patient/Family Centered Care, and U.S. Accreditation Council for Graduate Medical Education (ACGME) Duty Hours/Scheduling. Disciplines were divided into Associated Health, Dentistry, and Nursing programs. Separate questions describing Advanced Fellowship Programs were added to the AH questionnaire.


LPS survey responses have been shown to have good internal consistency [2] (coefficient α's ranging from 0.87 to 0.92), and have been validated for discriminant and construct validity across medical students and physician residents [2], medical specialties [3, 4], and dental specialties [5], and for construct validity in longitudinal analyses of physician residents [6]. The data accounting system used for collecting, processing, and analyzing LPS data has also been described [7]; the system was needed to organize data, compute scores, retrieve records, run analyses, and track data processing.
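For orientation, the internal consistency statistic above is, presumably, Cronbach's coefficient alpha, the usual internal-consistency measure for multi-item scales. For a domain with k element items, where \sigma_i^2 denotes the variance of item i and \sigma_X^2 the variance of the domain total score:

    \alpha = \frac{k}{k - 1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2}\right)

The 0.87 to 0.92 range cited above is reported in [2].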

II. STRUCTURE

LPS2012 contains two separate questionnaires designed to measure trainees' perceptions about their clinical training experiences at a "VA" facility. The VA facility is the clinical teaching setting of interest and can be a VA Healthcare System facility, VA medical center, VA hospital, or VA outpatient clinic. "Experience" is operationally defined as the respondent's most recent clinical experience at the VA facility. To capture information concerning the complete experience, LPS questionnaires should be administered near the end of the respondent's rotation through, assignment to, or educational time in the VA facility of interest.

The LPS2012 Physician Fellow, Resident, and Medical Student Questionnaire (LPS2012-PR) is designed to measure the perceptions of medical trainees about their most recent clinical training experience at a VA facility. The LPS2012-PR is intended for medical trainees at all academic levels, including medical school and Graduate Medical Education. The LPS2012 Associated Health, Dental, and Nursing Questionnaire (LPS2012-AH) is designed to measure the perceptions of trainees in Associated Health, Dental, and Nursing programs about their most recent clinical training experience at a VA facility. The LPS2012-AH is intended for trainees at all academic levels, from certificate and diploma programs through postdoctoral and residency training programs (PGY1-3).

The LPS2012-PR and LPS2012-AH questionnaires were designed to work together and contain comparable respondent-, facility-, and environment-level information. The questionnaires differ in how programs, disciplines, specialties, and education backgrounds are defined, and in the inclusion of selected environment-level and topic domains, to accommodate differences between medical physician and other health professions training.

II.A. Respondent-Level Information

Respondent level information includes the following classes of information.

II.A.(i). Current education program, discipline, specialty, sub-specialty, and academic level

The LPS classifies respondents by training program, by discipline within each training program, by specialty within each discipline, and by sub-specialty within each specialty. Respondents are also classified by an academic level specific to the given discipline, specialty, or subspecialty.

Training programs include Medical Physicians, whose trainees are administered the LPS2012-PR, and Associated Health, Dentistry, and Nursing, whose trainees are administered the LPS2012-AH.

Disciplines within Associated Health include, among others, Audiology, Chaplaincy, Chiropractic, Dietetics, Health Information, Health Services Research and Development, Imaging, Laboratory, Medical/Surgical Support, Optometry, Pharmacy, Physician Assistant, Podiatry, Psychology, Rehabilitation, Social Work, Speech Pathology, and Other. Disciplines within Dentistry include Dental Assistants, Hygienists, and Dentists. Nursing includes Licensed Practical Nurses, Registered Nurses, and Nurse Practitioners. There is only the “medicine” discipline within Medical Physicians.

Specialties are specific to disciplines. For example, the Associated Health Rehabilitation discipline includes occupational therapy, physical therapy, and blind rehabilitation, among others, and Dentists specialize in general practice, maxillofacial prosthetics, oral and maxillofacial pathology, radiology, and surgery, oral medicine, orthodontics, and periodontics, among others. Medical physicians include medical, surgical, and psychiatric specialties and subspecialties, as classified by the United States Accreditation Council for Graduate Medical Education. To accommodate training programs in which specialization and sub-specialization are often declared during the course of training, medical students are assigned "undeclared" as a medicine specialty. Because a subspecialty within Internal Medicine is determined upon admission to a fellowship program, residents in internal medicine at PGY1-3 are assigned to "internal medicine" even if the trainee plans to enter a sub-specialty. Once admitted to a sub-specialty fellowship, the trainee is assigned to the subspecialty (e.g., "cardiology").

Academic levels depend on the discipline and specialty and reflect progress toward completing an education goal. For the LPS2012-PR, academic levels include medical student by year, and Graduate Medical Education intern (PGY1), resident (PGY1-5), and fellow (PGY4-8). For the LPS2012-AH, academic levels, which are specific to discipline and specialty, include certificate, diploma, associate degree, baccalaureate, post-baccalaureate, pre-masters, masters, post-masters, pre-doctoral, doctoral, intern, postdoctoral, and residency (PGY1-3).

Both the LPS2012-PR and LPS2012-AH allow the respondent to report a second training program under the rubric of Advanced Special Fellows. Disciplines within Special Fellowships include advanced geriatrics, medical informatics, health services research, health systems engineering, women's health, patient safety, and disease-specific research, among others.

II.A.(ii). Education background

For the LPS2012-PR, education background includes whether the respondent graduated from a U.S. or non-U.S. medical school, the year of graduation, the percent of time in current clinical training spent at the VA facility, and current rotation status. For the LPS2012-AH, education background includes how much time the respondent has completed in the current clinical education program, the time needed to complete the program, and the percent of completed time that was spent at the VA facility.

II.A.(iii). Respondent’s demographic characteristics

Demographic characteristics include the respondent’s gender and current military status.

II.A.(iv). Characteristics of patients seen

The characteristics of patients that the respondent saw during their most recent experience with the VA facility include the percent of patients seen who: (1) were 65 years of age or older, (2) had a chronic medical illness, (3) had a chronic mental illness, (4) had multiple medical illnesses, (5) had alcohol/substance dependence, (6) had low income or socioeconomic status, and (7) did not have social or family support.

II.B. Perceptions Domains

Perceptions are described in terms of facility-level domains, environment-level domains, and elements within each domain. Environment-level domains are grouped into teaching, working, and clinical experiences.

II.B.1. Facility-Level Domains

Respondents are asked to summarize their overall clinical training experience at the VA facility in terms of five facility-level domains.

(i) Numerical Summary: an interval score that describes the overall experience on a 100-point scale, with higher values indicating greater satisfaction and a score of 70 considered "passing."

(ii) Value: an ordinal score that describes the value of the training experience on a five-point Likert scale: "poor," "fair," "adequate," "very good," or "excellent."

(iii) Recommendation: two scales, an ordinal four-point Likert scale indicating whether the respondent would choose the training again ("definitely would not," "probably would not," "probably would," or "definitely would") and a binary scale indicating whether the respondent would recommend the training to others ("yes" or "no").

(iv) Employment: two measures, a five-point Likert scale indicating whether, as a result of the clinical training experience, the respondent would consider the VA facility for future employment ("a lot less likely," "somewhat less likely," "no difference," "somewhat more likely," or "a lot more likely") and a binary scale ("yes" or "no").

(v) Comparison: a domain summary and nine element scales, each on a five-point Likert scale that compares the training experience at the VA facility with an equivalent level of training experience at a NON-VA facility ("a lot worse," "somewhat worse," "about the same," "somewhat better," or "a lot better"). Domain elements include faculty and preceptors, facility staff, learning, working, and physical environments, degree of autonomy and supervision, the quality of care, and the usefulness of what was learned.
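As a concrete illustration of the scale types just described, the following minimal Python sketch encodes the facility-level domains and their response options; the identifiers and dictionary layout are ours, not part of the survey specification:

    # Illustrative encoding of the LPS facility-level domains and their
    # response scales; identifiers are hypothetical, not from the survey.
    FACILITY_DOMAINS = {
        "numerical_summary": {"type": "interval", "range": (0, 100)},  # 70 = "passing"
        "value": {"type": "ordinal",
                  "levels": ["poor", "fair", "adequate", "very good", "excellent"]},
        "recommend_again": {"type": "ordinal",
                            "levels": ["definitely would not", "probably would not",
                                       "probably would", "definitely would"]},
        "recommend_to_others": {"type": "binary", "levels": ["no", "yes"]},
        "employment_likelihood": {"type": "ordinal",
                                  "levels": ["a lot less likely", "somewhat less likely",
                                             "no difference", "somewhat more likely",
                                             "a lot more likely"]},
        "employment_consider": {"type": "binary", "levels": ["no", "yes"]},
        "comparison_vs_nonva": {"type": "ordinal",
                                "levels": ["a lot worse", "somewhat worse",
                                           "about the same", "somewhat better",
                                           "a lot better"]},
    }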

II.B.2. Environment-Level Domains

II.B.2.(1) Core Domains

The core domains are designed to describe the trainee's teaching, working, and clinical experiences during his or her most recent clinical training at the VA facility. Each of the nine core domains is made up of between 6 and 15 item questions, or domain elements, and each domain element question asks the respondent about a different aspect of the domain. After all domain element questions have been answered, the respondent is asked a domain summary question that provides an overall rating of the domain. Each domain element and domain summary question asks respondents to describe their perceptions on an ordinal five-point Likert scale: "very dissatisfied," "somewhat dissatisfied," "neither satisfied or dissatisfied," "somewhat satisfied," or "very satisfied."

Core domains are classified into teaching, working, or clinical experiences.

(i) Teaching Experience is made up of two domains: the Learning Environment (15 elements) and Clinical Faculty and Preceptors (13 elements).

(ii) Working Experience is made up of three domains: Working Environment (13 elements), Physical Environment (12 elements), and Personal Experiences (13 elements).

(iii) Clinical Experience is made up of four domains: Clinical Environment (15 elements), the Availability and Timeliness of Staff and Services on weekdays, nights, and weekends (13 elements), the Quality of Staff and Services whenever such staff or services are available (6 elements), and the Processes of Dealing with Medical Errors (6 elements).

The LPS2012-PR questionnaire asks respondents to answer all nine domains. The LPS2012-AH questionnaire asks respondents to answer only the two teaching and three working experience domains.
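The following minimal Python sketch summarizes this structure; the names and element counts are taken from the text above, while the dictionary layout itself is illustrative, not a survey API:

    # Illustrative summary of the nine LPS core domains, grouped by
    # experience; element counts are from the text above.
    CORE_DOMAINS = {
        "teaching": {"Learning Environment": 15,
                     "Clinical Faculty and Preceptors": 13},
        "working": {"Working Environment": 13, "Physical Environment": 12,
                    "Personal Experiences": 13},
        "clinical": {"Clinical Environment": 15,
                     "Availability and Timeliness of Staff and Services": 13,
                     "Quality of Staff and Services": 6,
                     "Processes of Dealing with Medical Errors": 6},
    }

    # Five-point satisfaction scale used by every element and summary question.
    SATISFACTION = ["very dissatisfied", "somewhat dissatisfied",
                    "neither satisfied or dissatisfied", "somewhat satisfied",
                    "very satisfied"]

    # LPS2012-PR covers all nine domains; LPS2012-AH covers only the
    # teaching and working experience domains.
    PR_DOMAINS = [d for g in ("teaching", "working", "clinical")
                  for d in CORE_DOMAINS[g]]
    AH_DOMAINS = [d for g in ("teaching", "working") for d in CORE_DOMAINS[g]]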

II.B.2.(2) Topic Domains

The LPS includes special topic domains designed to ask respondents about special events, or to focus on aspects of their training experiences not otherwise covered by the core domains. Both the LPS2012-AH and LPS2012-PR questionnaires contain a two-element psychological safety topic domain and a twenty-element patient/family centered care topic domain. In addition, the LPS2012-PR questionnaire contains a seventeen-element duty hours/scheduling topic domain.

To answer psychological safety and patient/family centered care domain element questions, respondents are presented with statements and are asked to report whether they “strongly disagree,” “disagree,” “neither agree or disagree,” “agree,” or “strongly agree.” The patient/family centered care topic domain also includes a domain summary question in which respondents are asked to rate their overall satisfaction with patient and family centered care at the VA facility as: “very dissatisfied,” “somewhat dissatisfied,” “neither satisfied or dissatisfied,” “somewhat satisfied,” or “very satisfied.”

The duty hours/scheduling topic domain is designed to assess the extent to which new duty hour limits and clinical rotation scheduling standards, as implemented by the U.S. Accreditation Council for Graduate Medical Education (ACGME) in 2011, had any impact on the respondent's clinical teaching experiences at the VA facility [6]. The domain consists of seventeen elements, each presenting a hypothetical effect that the standards may have had on the respondent's clinical teaching experience (e.g., supervision, transitions of care, fatigue management, and professionalism). For each domain element and summary question, the respondent is asked to judge whether the 2011 duty standards had a "very negative effect," "somewhat negative effect," "no effect," "somewhat positive effect," or "very positive effect" on their clinical teaching experiences at the VA facility.

Topic domains can be used to assess the impact that implementation of a new standard or regulation has had on trainees' satisfaction with their clinical teaching experiences. Rather than comparing topic domain responses with core domain responses directly, one can compare core domain responses before and after implementation of the standard or regulation, and use responses to the topic domain, administered after implementation, as a differencing variable to control for the confounding influence of time trends. This Robust Differencing Variable Technique has been applied to LPS data, as described elsewhere [6].
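As a simplified illustration of the differencing idea only (one possible reading, not necessarily the published estimator in [6]): let \bar{S}_{pre} and \bar{S}_{post} denote mean core domain scores before and after implementation, and let \bar{S}^{0}_{post} denote the post-implementation mean among respondents whose topic domain responses indicate the standard had no effect. A trend-corrected effect estimate could then take the form

    \widehat{\Delta} = (\bar{S}_{post} - \bar{S}_{pre}) - (\bar{S}^{0}_{post} - \bar{S}_{pre}) = \bar{S}_{post} - \bar{S}^{0}_{post}

where the second difference absorbs the secular time trend shared by all respondents.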

II.B.3. Elements

Each of the nine core environment domains consists of between 6 and 15 domain element questions. The three topic domains contain between 2 and 20 domain element questions. Responses to domain element questions are intended to describe and explain the respondent’s overall rating for the domain.

III. ADMINISTRATION

LPS questionnaires can be administered by interview, written format, or website. The VA-LPS2012 is administered by website. By design, LPS questionnaires should be administered at the end of the respondent’s clinical rotation, assignment, or educational time in the VA facility of interest.

IV. SCORING METHOD

The LPS facility-level, environment-level, and topic domain scores are computed for each respondent using the following strategies. The scoring strategies were designed to meet the different information needs of clinical administrators, training program directors, clinical faculty, trainees, program evaluators, and education researchers.

(1) Domain mean scores. For each domain, the domain mean score is computed by taking the mean of all domain element responses, where "very dissatisfied" is assigned the value one, "somewhat dissatisfied" two, "neither satisfied or dissatisfied" three, "somewhat satisfied" four, and "very satisfied" five. A domain mean score is treated as missing if two or more domain element values are missing.
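A minimal Python sketch of rule (1), assuming element responses arrive as the quoted Likert labels (or None when missing):

    # Domain mean score: mean of coded element responses, treated as
    # missing when two or more element responses are missing.
    LIKERT = {"very dissatisfied": 1, "somewhat dissatisfied": 2,
              "neither satisfied or dissatisfied": 3,
              "somewhat satisfied": 4, "very satisfied": 5}

    def domain_mean(responses):
        """Mean of element responses; None if 2+ elements are missing."""
        if sum(r is None for r in responses) >= 2:
            return None
        values = [LIKERT[r] for r in responses if r is not None]
        return sum(values) / len(values)

For example, domain_mean(["somewhat satisfied", "very satisfied", None]) returns 4.5, while a response list with two or more missing elements returns None.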

(1a) z-scores are computed for each domain and each respondent by subtracting the mean of all domain mean scores from the respondent's domain mean score, and dividing the difference by the standard deviation (the square root of the variance) of all domain mean scores. The mean and standard deviation used to compute z-scores may be taken from a population of responders or from a reference population of trainees.
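In symbols, writing x_{rd} for respondent r's mean score on domain d, and \mu_d and \sigma_d for the mean and standard deviation of domain mean scores in the chosen reference population:

    z_{rd} = \frac{x_{rd} - \mu_d}{\sigma_d}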

(1b) Adjusted z-scores are computed for each domain and for each respondent by taking the respondent’s z-score and subtracting the expected z-score for the given respondent. The expected z-score is the score that is expected from an average respondent who has the same characteristics as the given respondent. Adjusted z-scores permit investigators to compare differences in scores over time, between facilities, across specialties, and over domains when differences in scores are adjusted to reflect differences in respondent-level characteristics.
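In the same notation, the adjusted z-score subtracts a model-based expectation given the respondent's characteristics w_r (how that expectation is estimated, e.g., by regressing z-scores on respondent characteristics, is our assumption; the text does not specify):

    z^{*}_{rd} = z_{rd} - \mathrm{E}\left[ z_{rd} \mid w_r \right]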

(2) Domain ordinal scores. For each domain, the ordinal score is taken directly from responses to the domain summary question.

(3) Domain binary scores. For each domain, binary scores are computed from a restricted range of responses to either the domain summary question or a subset of the domain element questions. Binary scores are designed to reflect different aspects of satisfaction for a given domain, and are represented by two values: zero and one. Rules (3a1) through (3c) below define the specific scores; a code sketch follows item (3c).

(3a1) Positive satisfaction scores. For each domain, a positive satisfaction score is assigned a value of one if the respondent answers “very satisfied” to the domain summary question, and zero otherwise.

(3a2) Negative satisfaction scores. For each domain, the negative satisfaction score is assigned a value of one if the respondent answers “very dissatisfied” to the domain summary question, and zero otherwise.

(3b1) Positive element satisfaction scores. For each domain, the positive element satisfaction score equals one if the respondent answers “very satisfied” to one or more domain element questions, and zero otherwise.

(3b2) Negative element satisfaction scores. For each domain, the negative element satisfaction score equals one if the respondent answers “very dissatisfied” to one or more domain element questions, and zero otherwise.

(3c) Adjusted binary scores. For each domain, the adjusted binary score is computed by taking the respondent's actual binary score and subtracting the expected binary score computed for an average respondent who has the same characteristics as the given respondent. Specifically, the expected binary score is the probability that the binary score would be coded one given the responses expected from an average respondent with those characteristics. Thus, expected binary scores range continuously from zero to one, and adjusted binary scores range from -1 to +1.
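A minimal Python sketch of rules (3a1) through (3c); summary is the answer to the domain summary question, elements the element answers, and expected_p the model-based probability described in (3c), whose estimation (e.g., by a logistic regression on respondent characteristics) is our assumption:

    def positive_satisfaction(summary):            # rule (3a1)
        return 1 if summary == "very satisfied" else 0

    def negative_satisfaction(summary):            # rule (3a2)
        return 1 if summary == "very dissatisfied" else 0

    def positive_element_satisfaction(elements):   # rule (3b1)
        return 1 if any(e == "very satisfied" for e in elements) else 0

    def negative_element_satisfaction(elements):   # rule (3b2)
        return 1 if any(e == "very dissatisfied" for e in elements) else 0

    def adjusted_binary(actual, expected_p):       # rule (3c): ranges -1 to +1
        return actual - expected_p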

(4) Domain element scores. Each domain can be described in terms of the pattern of responses to all of the domain element questions that comprise it. Element scores are ordinal in value. Examining the strength of the association between element scores and the respective domain summary score can be used to assess the overall importance that respondents placed on each domain element [2, 3, 5].
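As one illustrative way to examine such associations (Spearman rank correlation is our choice here; references [2, 3, 5] describe the methods actually used with LPS data):

    # Gauge element importance by rank-correlating each element's scores
    # with the domain summary scores across respondents.
    from scipy.stats import spearmanr

    def element_importance(element_scores, summary_scores):
        """element_scores: dict of element name -> list of ordinal codes (1-5);
        summary_scores: list of ordinal codes, in the same respondent order."""
        return {name: spearmanr(scores, summary_scores).correlation
                for name, scores in element_scores.items()}

Elements with stronger correlations to the summary score would, under this reading, carry more weight in respondents' overall domain ratings.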





[1] Keitz SA, Holland GJ, Melander EH, Bosworth HB, Pincus SH. The Veterans Affairs Learners' Perceptions Survey: The Foundation for Education Quality Improvement. Academic Medicine 2003;78:910-917.

[2] Cannon GW, Keitz SA, Holland GJ, Chang BK, Byrne JM, Tomolo A, Aron DC, Wicker A, Kashner M. Factors determining medical student and resident satisfaction during VA-based training: Results from the VA Learners' Perceptions Survey. Academic Medicine 2008;83(6):611-620.

[3] Kaminetzky CP, Keitz SA, Kashner TM, Aron DC, Byrne JM, Chang BK, Clarke C, Gilman SC, Holland GJ, Wicker A, Cannon GW. Training satisfaction for subspecialty fellows in internal medicine: Findings from the Veterans Affairs (VA) Learners' Perceptions Survey. BMC Medical Education 2011;11(21):1-9.

[4] Cannon GW, Wahlen GE, Kashner TM. Rheumatology fellows report higher satisfaction with VA-based training in comparison to other internal medicine subspecialty fellows. Presented at the National Meeting of the American College of Rheumatology, Philadelphia, PA, October 2009.

[5] Lam H-T, O'Toole TG, Arola PE, Kashner TM, Chang BK. Factors associated with the satisfaction of millennial-generation dental residents with their training experience. Journal of Dental Education, in press.

[6] Kashner TM, Henley SS, Golden RM, Byrne JM, Keitz SA, Cannon GW, Chang BK, Holland GJ, Aron DC, Muchmore EA, Wicker A, White H. Studying effects of ACGME duty hours limits on resident satisfaction: Results from the VA Learners' Perceptions Survey. Academic Medicine 2010;85(7):1130-1139.

[7] Kashner TM, Hinson RS, Holland GJ, Mickey D, Hoffman K, Lind L, Johnson LD, Chang BK, Golden RM, Henley SS. A data accounting system for clinical investigators. Journal of the American Medical Informatics Association 2007;14(4):394-396.
