Learners' Perceptions Survey (LPS)
Instructions Manual for Data Users
version #001, edition #13
OMB: 2900-0691

October 15, 2015

prepared by T. Michael Kashner, Ph.D., J.D., M.P.H.; David S. Bernett, B.A.;
and Annie B. Wicker, B.S.;
representing OAA National Evaluation Workgroup
Sheri A. Keitz, M.D., Ph.D., Chair;
with
David C. Aron, M.D., M.S.; Jemma Ayvazian, D.N.P.; Judy L. Brannen, M.D., M.B.A.;
John M. Byrne, D.O.; Grant W. Cannon, M.D.; Christopher T. Clarke, Ph.D.;
Mary B. Dougherty, Ph.D., M.B.A., M.A., R.N.; Stuart C. Gilman, M.D., M.P.H.;
Debbie L. Hettler, O.D., M.P.H.; Kenneth R. Jones, Ph.D.;
Catherine P. Kaminetzky, M.D., M.P.H.; and Robert A. Zeiss, Ph.D.

Office of Academic Affiliations
Veterans Health Administration
Department of Veterans Affairs
Washington, DC

Learners’ Perceptions Survey (LPS)
Instructions Manual for Data Users
I. INTRODUCTION
I.A. Background
Since 1946, health professions education has been one of the four statutory missions for
VA,1,2,3 along with patient care, research, and clinical backup to the Department of Defense. In
2014, VA medical centers had more than 7,200 affiliation agreements with 1,800 unique
accredited college and university programs and 2,300 graduate medical education programs,
involving 120,700 students representing over 40 professions, including 41,223 physician
residents, 22,931 medical students, 27,275 nursing students and residents, 1,399 dental
students and residents, and 27,265 associated health trainees. Nearly two-thirds of all U.S.
medical students will train in a VA facility prior to their graduation.
The Learners’ Perceptions Survey (LPS) is a standardized, scientifically validated instrument
that has been designed by the Department of Veterans Affairs (VA) to measure how health
professions trainees perceive their clinical training experiences at a VA medical center, hospital,
or outpatient care facility. The LPS is used for research in health professions education, as well
as for program evaluation; regulatory, managerial, and operations oversight; program and policy
analyses; and clinical training program accreditation.
The Department of Veterans Affairs, Veterans Health Administration, Office of Academic
Affiliations (OAA) through its National Evaluation Workgroup first drafted the LPS in 1999.4
Since 2001, OAA has administered the LPS annually during the academic year. The survey is
requested of all health professions education trainees who go through a VA medical center as
part of an accredited health professions education program. VA trainees include students,
practicum participants, externs, interns, residents, or fellows, who rotate through, are assigned
to, or spend educational time in a VA healthcare system facility. For our purposes, the
academic year begins on July 1 of the prior year and runs through June 30 of the current year.
The current academic year, 2016, began on July 1, 2015 and will end on June 30, 2016.
The LPS underwent thirteen version changes from 2001 to 2016, as detailed in Table 1. These
changes were made to meet the growing demands for information about VA's health
professions trainees, to simplify the LPS for administrative purposes, or because a topic had
become out of date. Most changes updated categories of health professions, disciplines,
specialties, subspecialties, special fellows, and academic levels. In rare cases, the items
comprising the questions themselves were changed.
I.B. Administration
LPS questionnaires are administered to trainees centrally through OAA’s Data Management
and Support Center in St. Louis, MO. VA trainees are encouraged to complete the LPS
questionnaire at or near the end of their clinical rotations via OAA’s website. Trainees are
informed about eligibility and the importance of the survey, as well as given instructions on how
to take the LPS through their VA medical center’s Designated Learning Officer (DLO),
Designated Education Officer (DEO), and/or Associate Chief of Staff for Education (ACOS-E).

This information is provided in many ways, directly by face-to-face contacts, through posters
and circulars distributed throughout the VA medical center and education offices, as well as
indirectly through discussions with the trainee’s education program director at their affiliated
educational institution. Trainees also learn about the LPS through direct emails received from
the Office of Academic Affiliations (OAA). Trainees may access the LPS landing page by either
going directly to the page or by the OAA website LPS link. Upon accessing the survey landing
page, the trainee clicks on the appropriate link. There are two choices. Choice 1 is: “All health
professions trainees, students, interns, residents or fellows, except MD or DO.” Choice 2 is:
“MD or DO trainees, medical students, interns, residents, or fellows.” The LPS can also be
accessed through profession- and specialty-specific portals where education programs
communicate with their clinical trainees. These portals provide links that allow trainees to
bypass the landing page and go directly to the appropriate profession-specific LPS survey.
I.C. LPS Data Accounting
All LPS data files are collected, maintained, processed, and analyzed using a three-stage data
accounting system designed for these purposes.5 The purpose of this accounting system is to:
(1) ensure efficient use of, and ease of access to, OAA databases for qualified and approved
investigators for evaluation, performance appraisals, assessments, investigations, research, or
other purposes; (2) document all access, data changes, analyses, and result outputs; (3)
maintain data integrity for reliability and validity; (4) ensure data security to preserve the
confidentiality of respondents' answers to LPS questions; and (5) facilitate cross-training of OAA
staff who can access, process, and analyze LPS datasets.
All data processing and analyses are performed by first preparing code, validating the code for
accuracy (often on test datasets), approving it for use, and finally running the code on actual
datasets. All software used for data processing and analyses is indexed, referenced, and
stored permanently for documentation and possible future use. The accounting system is
flexible: OAA staff can introduce changes to response codes, data values, or index
computations by amending the software code, and then re-running the software to re-populate
datasets and re-construct output tables, charts, graphs, and other results. In this way, OAA can
efficiently introduce changes to improve the data, or reverse such changes as needed.
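As a minimal illustration of this prepare-validate-approve-run cycle (all function names and
data here are hypothetical; OAA's actual software is not published):

```python
def recode_likert(value):
    """Map a raw 1-5 Likert response string to an integer score."""
    mapping = {"1": 1, "2": 2, "3": 3, "4": 4, "5": 5}
    return mapping.get(value)  # None flags an out-of-range response

def validated(step, test_inputs, expected):
    """Run a processing step on a small test dataset before approving it."""
    return [step(v) for v in test_inputs] == expected

# Validate on a test dataset first, then run the approved code on actual data.
assert validated(recode_likert, ["1", "5", "x"], [1, 5, None])
print([recode_likert(v) for v in ["3", "4", "2"]])  # [3, 4, 2]
```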
The accounting system processes the data in three stages.
I.C.1. Raw Databases: (R-files). Construction of the LPS dataset begins with data collection
when trainees click on the LPS and enter responses to questions online. OAA collects the
responses and creates raw data files. Raw data are collected directly from responses and
stored on servers maintained at the OAA’s Data Management and Support Center in St. Louis,
MO. Names for raw variables are designed to point to their origins and/or location in files (e.g.,
by question number). At the end of the academic year, LPS data files are examined, approved,
and then officially closed, or sealed. Once sealed, R-files cannot be opened for further
changes, modifications, deletions, or additional entries.
I.C.2. Research-Ready Databases: (RR-files). Raw databases are then pre-processed into
research-ready, or RR-files. Investigators, programmers, analysts, and researchers access
RR-files, not R-files, for purposes of analyzing LPS data. RR datasets are created by running a
written and reviewed research-ready program that generates RR-files from R-files. RR-files are
held indefinitely until OAA leadership determines they should be changed or deleted. Data can
be changed in the RR-files only by coding the change into the RR software and then re-running
the RR program to re-populate existing RR-files or create new files to include all corrections,
additions, and deletions.

The LPS RR-files are created in three levels.
I.C.2.(i). Level-1: In the first level, the raw data collected each year are re-structured into
datasets based on a file architecture and data format designed to make access easier and
improve programming efficiency and accuracy. These level-1 RR-files
are named as LPS_01_AH and LPS_01_PR for 2001 for the LPS-AH and LPS-PR surveys,
LPS_02_AH and LPS_02_PR for 2002, and so forth. Thus for 2016, the level-1 files are
LPS_16_AH and LPS_16_PR.
RR Level-1 software also computes construct variables and scores indices using predetermined algorithms. Response categories are coded and labeled, and a referent response is
defined (e.g. male / female coded as ‘0’ / ‘1’ where ‘0’ is assigned to male as the referent and ‘1’
as the alternative). Numerical responses are formatted to reflect appropriate significant figures
(e.g. age in years as an integer).
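A minimal sketch of the referent coding convention described above (function and category
names are illustrative only):

```python
def code_referent(value, referent, alternative):
    """Return 0 for the referent response, 1 for the alternative, None otherwise."""
    if value == referent:
        return 0
    if value == alternative:
        return 1
    return None  # unexpected or missing response

print(code_referent("male", referent="male", alternative="female"))    # 0
print(code_referent("female", referent="male", alternative="female"))  # 1
```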
RR level-1 software also gives names to the RR-variables based on naming conventions that
point to the analytic purpose of the variable, not its location or origin. Thus, the software
includes codes to transcribe R-file variables into RR-file research-ready variables. The names
of variables are often chosen so that, when alphabetized, the corresponding variables cluster by
a common purpose. The naming convention is designed to make programming more efficient,
reduce programming errors, and shorten the time needed to prepare final results.
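A toy transcription map illustrating the idea (raw, location-based names on the left;
hypothetical purpose-based RR names on the right; the actual LPS naming scheme is not
reproduced here):

```python
# Hypothetical raw-to-RR variable name map. Purpose-based names share a common
# prefix, so alphabetical sorting clusters related variables together.
RAW_TO_RR = {
    "q12a": "sat_learn_time",    # learning environment: time for learning
    "q12b": "sat_learn_superv",  # learning environment: supervision
    "q14c": "sat_work_space",    # working environment: workspace
}
for raw, rr in sorted(RAW_TO_RR.items(), key=lambda kv: kv[1]):
    print(f"{raw} -> {rr}")
```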
RR level-1 software also amends and corrects raw data as needed. Once completed and
approved, R-files do not change. To amend mistakes after an R-file is sealed, the RR software
will make the necessary changes when generating the respective RR-variable. In this way, the
RR program documents all changes to the RR dataset used in analyses. This process also
permits OAA staff to “reverse” all changes, if necessary. Since all processing and analyses are
executed formally by written programs (rather than through a menu), changes can easily be
introduced by amending the software, and then running the amended software to re-populate
the datasets, create new datasets, and produce corrected outputs as graphs, charts, tables, and
figures.
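The pattern might be sketched as follows (hypothetical identifiers and values): because
corrections live in the RR program itself rather than in the sealed R-file, they are documented,
repeatable on every run, and reversible by deleting the entry and re-running.

```python
# Hypothetical post-seal corrections, keyed by (respondent id, variable name).
# R-files are never edited; the fix is re-applied each time RR-files are built.
CORRECTIONS = {
    ("R00123", "age"): 34,  # raw entry was implausible; documented fix
}

def build_rr_record(raw_record):
    rr = dict(raw_record)
    for (resp_id, var), value in CORRECTIONS.items():
        if rr.get("resp_id") == resp_id and var in rr:
            rr[var] = value
    return rr

print(build_rr_record({"resp_id": "R00123", "age": 340}))  # {'resp_id': 'R00123', 'age': 34}
```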
RR level-1 software also codes critical education program information, including the facility
identity and the responder’s academic discipline, profession, specialty, subspecialty, or special
fellows program, and academic level. In some cases, multiple classification systems exist.
Separate variables are created to reflect each system.
I.C.2.(ii). Level-2: In the second level, level-1 RR datasets created for each academic year are
merged into a single file, both across years and the two PR and AH surveys. The intent is to
create a database that both captures complete information collected in the respective year,
while affording investigators opportunities to make cross year and cross facility comparisons.
Names of the RR-variables that may differ by year are amended to be uniform across academic
years. Prior year data are retrofitted by re-coding responses and computing transcription
algorithms to account for any changes that may occur in data format, response codes,
definitions, and standards. This is achieved by adding response options, merging two or more
options into a single response category, or creating variables that populate values in academic
years when such data were collected and assigned a missing value code in years when data
were not being collected. In addition, new variables are created from existing variables by
collapsing their multiple response options into broader, more inclusive categories.
The level-2 RR-file for 2016 is named: LPS2016. In prior years, level-2 RR-files were LPS2015
for academic year 2015, LPS2014 for academic year 2014, and so forth. Each dataset contains
prior year data retrofitted to fit the LPS format in the final year, with retired questions treated as
missing data in subsequent years, and new questions treated as missing data in prior years.
This strategy allows investigators to perform longitudinal analyses for questions no longer asked
directly on the updated dataset.
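A minimal pandas sketch of this retrofit idea (column names and values are hypothetical): a
question retired after 2014 is carried as missing in 2015, and a question introduced in 2015 is
missing for 2014.

```python
import pandas as pd

# Hypothetical level-1 extracts for two academic years.
lps_14 = pd.DataFrame({"year": [2014, 2014], "q_retired": [4, 5]})
lps_15 = pd.DataFrame({"year": [2015, 2015], "q_new": [3, 2]})

# concat aligns on column names; absent questions become NaN (missing).
merged = pd.concat([lps_14, lps_15], ignore_index=True, sort=False)
print(merged)
```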
I.C.2.(iii). Level-3: In the third level, level-2 data are analyzed to adjust values for selected
variables to reflect a ‘referent’ respondent or facility. These adjustments are created using
generalized linear models, including logistic regression. Adjusted values are useful when
comparing responses among responders from different facilities, across disciplines or academic
levels, or over time while reporting on the same facility.
The level-3 RR-files are: LPS2016-adj for all data through academic year 2016, LPS2015-adj for
all data through academic year 2015, etc. These datasets contain all Level-2 and Level-3
variables for the specified ‘current’ academic year plus values from prior years that were
retrofitted as needed to fit the coding schedules for the defined ‘current’ year.
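One way to produce such adjusted values is sketched below with hypothetical data: fit a
logistic model and predict each facility's satisfaction rate for a common referent respondent.
This illustrates the general approach only, not the exact OAA model specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical respondent-level data: satisfaction (0/1), facility, academic level.
df = pd.DataFrame({
    "satisfied": [1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0],
    "facility":  ["A"] * 6 + ["B"] * 6,
    "level":     ["student", "resident"] * 6,
})

# Fit a logistic model of satisfaction on facility and academic level.
model = smf.logit("satisfied ~ C(facility) + C(level)", data=df).fit(disp=0)

# Predict each facility's rate for a common referent case ('student'), so
# facilities can be compared net of differences in their respondent mix.
referent = pd.DataFrame({"facility": ["A", "B"], "level": ["student", "student"]})
print(model.predict(referent))
```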
I.C.3. Analytic Databases: (A-files). Analytic databases, or A-files, are temporary datasets
created from RR-files for the purposes of conducting specific analyses to create tables, charts,
graphs, statistics, estimated models, and other statistical and computational outputs.
A-files are created from Level-3 RR-files through an analytic program, or A-program.
A-programs have two parts. The first part pre-processes RR-files to create A-files that are
designed for specific analyses. Processing includes merging datasets, re-coding categorical
variables, transforming continuous variables, and constructing new indices. The second part is
the actual analyses of the A-datasets. A-files are temporary and are to be deleted when
analyses are done. However, the A-programs creating the files are indexed and stored for
referral, documentation, and possible re-construction of the analytic dataset when needed.
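In outline, an A-program might look like the following minimal sketch (file and column names
are hypothetical):

```python
import os
import pandas as pd

def a_program(rr: pd.DataFrame) -> None:
    # Part 1: pre-process the RR-file into a temporary, analysis-specific A-file.
    a_file = rr.copy()
    a_file["satisfied"] = (a_file["rating"] >= 4).astype(int)  # re-code 5-point scale
    a_file.to_csv("a_file_tmp.csv", index=False)

    # Part 2: run the actual analysis on the A-dataset.
    print(a_file.groupby("facility")["satisfied"].mean())

    # The A-file is temporary; the indexed A-program itself is what is archived.
    os.remove("a_file_tmp.csv")

a_program(pd.DataFrame({"facility": ["A", "A", "B"], "rating": [5, 3, 4]}))
```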
A-files are created to feed OAA’s data cube, to compute findings for special OAA reports, and to
prepare results for scientific manuscripts.
I.D. Federal Authority to Administer the LPS
The Office of Academic Affiliations (OAA) within Veterans Health Administration (VHA) is
permitted to administer the LPS to VA’s clinical health professions trainees under the Office of
Management and Budget (OMB) license #2900-0691, under VA Form 10-0439 Learners’
Perceptions Survey (LPS). Authority for the license can be found under Federal Law 38 U.S.C.
Part V, Chapter 73, Section 7302, which provides that the Department of Veterans Affairs (VA)
offers education and training to a national cohort of health care trainees each year; VA is thus
required to evaluate such programs on a continuing basis and determine their effectiveness in
achieving program goals (Federal Law, 38 U.S.C. Part I, Chapter 5, Section 527). In addition,
the Government Performance and Results Act (GPRA) of 1993 requires Federal agencies to set
goals, measure performance, and report on their accomplishments.
Health professions training has been one of the Veterans Health Administration's (VHA)
statutory missions since the 1949 Policy Memorandum No. 2 established an association
between Veterans Administration medical centers (VAMC) and schools of medicine. VA
medical centers now affiliate with accredited training programs in undergraduate
and graduate medical education, nursing, dentistry, and associated health including pharmacy,
psychology, social work, podiatry, optometry, chaplaincy, chiropractic, dietetics, rehabilitation
including physical therapy, occupational therapy, vocational therapy, recreation and manual arts
therapy, and blind rehabilitation, marriage and family therapy, and licensed professional mental
health counseling. In addition to performance assessment, LPS survey results are used by VA
affiliate institutions in their applications for program accreditation. Credit for academic work will
satisfy requirements for professional licensing only if such credit issues from accredited
programs.
The need for information collected from the LPS survey is also found in assessing how the
changing healthcare environment may impact VA medical centers. For instance, academic
accrediting bodies, such as the Accreditation Council on Graduate Medical Education (ACGME)
have imposed sweeping changes, including resident duty hour limits and strict supervision
requirements that affect the VA training environments as well as the clinical care environments
where veteran patients’ health care needs are served. Changes in regulations governing
clinical training programs are expected to impact how health professionals are trained in VA.
This changing landscape in the clinical education environment comes at a time when Veterans
Health Administration (VHA) must rise to the challenge of creating “veteran centric” care models
to treat a new generation of veterans, as well as to provide care to veterans, recruit and retain
highly qualified clinical staff, and train the nation's new health professionals to provide care for
the special health needs of US military veterans. The interface between the clinical and
educational arenas necessitates a system for assessing the education environment, one that
identifies both strengths and opportunities for improvement in VA's clinical education
environment and measures the satisfaction of VA clinical trainees who come in direct contact
with our veteran patients and who contribute to their care.
drivers of clinical trainees’ satisfaction so as to develop and implement targeted improvements
that will benefit both learners and patients in VHA. In summary, the LPS is consistent with VA’s
oversight responsibilities and Government Performance and Results Act (GPRA) and
represents a major metric for a statutory VA mission.
More recently, the Veterans Access, Choice, and Accountability Act of 2014 (PL 113-146), as
signed into law on August 7, 2014 (128 Stat. 1754; 38 USC 101; HR3230), was enacted to “…
improve the access of veterans to medical services from the Department of Veterans Affairs…”.
The Act requires the Secretary to: “… ensure that already established medical residency
programs have a sufficient number of residency positions…” (at §301(b)(1) amending 38 U.S.C.
7308(e)(1)); to provide annual reports (directly to both House and Senate Veterans’ Affairs
Committees) detailing its progress towards meeting the 1,500 new GME positions goal (at
§301(b)(3)) and include the number of “positions filled,” “not filled,” “anticipating filling,” (at
§301(b)(3)(B)(i)) provide the resident’s geographic location, academic affiliation (at
§301(b)(3)(B)(ii)), “the policy at each medical facility … with respect to the ratio of medical
residents to staff supervising medical residents” (at §301(b)(3)(B)(iii)), “… the number of
individuals who declined an offer from the Department [of Veterans Affairs] to serve as a
medical resident at a medical facility of the Department and the reason why each such
individual declined such offer” (at §301(b)(3)(B)(iv)), descriptions of “… challenges … faced by
the Department in filling graduate medical education residency positions…” (at
§301(b)(3)(B)(v)(I)), the “actions … taken by the Department to address such challenges…” (at
§301(b)(3)(B)(v)(II)), and finally “… efforts … to recruit and retain medical residents to work for
the Veterans Health Administration as full-time employees.” (at §301(b)(3)(B)(vi)).
I.E. Reporting Requirements and Confidentiality
I.E.1. Learners’ Perceptions Survey [LPS]
The nationally administered Learners’ Perceptions Survey, or LPS, consists of two surveys: the
LPS-AH intended for all health professions trainees including Associated Health, Dentistry, and
Nursing trainees, students, interns, residents, or fellows, except MD or DO trainees, and the
LPS-PR intended for MD and DO trainees, including medical students, interns, residents, or
fellows.

Under license issued by OMB, OAA is responsible for protecting the confidentiality of survey
responses from the “field.” The field covers not only the public at large, but also local VA
medical staff, supervisors, education program administrators and directors, as well as program
directors, educators, administrators, and faculty and staff at VA’s affiliate universities, colleges,
and institutions.
OAA’s responsibility to protect respondent confidentiality also originates from the contract
between responders who take the survey and OAA, which administers the survey and uses the
information to further VA aims and objectives. Specifically, the respondent begins the survey
with the introduction page that includes the statement:
“This is a confidential survey.”
Thus, OAA establishes a contract with each respondent. The respondent agrees to submit
reliable and accurate responses to LPS questions. In exchange, OAA agrees to protect the
identity of the respondent. An agreement is reached when OAA posts the survey on its official
website and the respondent submits a response to at least one survey question to OAA.
To comply with both OMB licensing and respondent contract requirements, OAA releases to the
field only aggregated data (means, medians, modes, frequencies) for reporting units with 10 or
more respondents. In September 2015, this number was reduced to 8 to be consistent with
other agencies and offices within Veterans Health Administration. Note that this minimum is
only a guideline, the purpose of which is to establish a “reasonable” re-identification risk
standard. Neither OAA nor any other agency can provide assurances that respondents will face
absolutely no risk of re-identification when responding to the LPS.
In special cases, LPS survey responses may provide evaluation information that is required by
VA trainees’ education programs from our affiliated institutions. Such evaluation information is
often used in applications to professional associations to accredit the education program. Such
accreditation is necessary for the clinical experiences in VA for trainees enrolled in the
education program to count to satisfy their professional licensing or board certification
requirements. In such cases, whenever the total number of trainees who could potentially take
the LPS from the reporting unit is known, whenever trainees were informed that LPS survey
results were necessary for the program’s application for accreditation, and whenever the
accreditation is of substantial interest to the Government, the program, and the respondents,
then the number of potential respondents, not actual responders, is counted to determine
whether the aggregate minimum of eight respondents was reached, and thus whether a
reasonable re-identification risk standard is met.
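The release rules above can be summarized in a small sketch (a simplified reading of the
policy, not official OAA code):

```python
def releasable(n_respondents, n_potential=None, accreditation_case=False, minimum=8):
    """Simplified reading of the LPS aggregate-release rule described above.

    In qualifying accreditation cases the count of *potential* respondents,
    rather than actual responders, is compared against the minimum.
    """
    count = (n_potential if (accreditation_case and n_potential is not None)
             else n_respondents)
    return count >= minimum

print(releasable(5))                                           # False
print(releasable(5, n_potential=12, accreditation_case=True))  # True
```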
I.E.2. Learners’ Perceptions Survey - Consented version [LPSc]
For FY2016, OAA created the LPSc where “c” indicates that survey responses may be released
to the field containing aggregated responses (means, medians, modes, and frequencies) for
respondent groups of two (2) or more. As with the LPS, the LPSc includes two questionnaires:
the LPS-AHc intended for all health professions trainees, including students, interns, residents,
or fellows in Associated Health, Dentistry, and Nursing except MD or DO, and LPS-PRc
intended for MD and DO trainees including medical students, interns, residents, or fellows. Both
LPS-PRc and LPS-AHc contain the same questions as their LPS-PR and LPS-AH equivalents.
However, LPSc has a different introduction page, as described below.
The LPSc was necessary to allow LPS surveys to be used for investigational studies, requested
reports from Executive and Legislative branches, and for accreditation and certification
applications from VA-affiliated education institutions and programs when the number of actual
or potential trainees involved is between two and seven.
To begin the process, VA medical staff and faculty will approach all respondents whose
responses are to be used for accreditation or other critical purposes for the trainee’s education
program. The staff explains the purpose of the LPSc, its importance, qualifications to take the
LPSc, and offers two options to each qualifying trainee. The trainee may take the LPS survey
whose responses will be aggregated by no fewer than 8 respondents in the reporting unit.
Alternatively, the trainee may take the LPSc survey where responses may be aggregated by no
fewer than 2 reporting respondents in the reporting unit. The trainee is then given the website
where they may link to either the LPS or LPSc surveys. The trainee’s choice will not be known
to VA faculty and staff, or to the trainee’s education program director.
Trainees who access the LPSc survey are given notice on the LPSc introduction page:
This is a confidential survey. Your responses will not be made available to your program
directors, faculty, or clinical staff. However, your responses may be combined with those
from one or more other respondents to compute aggregate information (means,
frequencies). Aggregate information may be released to program directors, faculty, and
clinical staff for program evaluation and administrative purposes. By taking the survey you
have agreed to these terms.
Trainees may discontinue taking the LPSc survey, and re-enter the system to take the LPS
survey. There are no connecting data that can link a trainee’s attempts to enter both LPS and
LPSc, or to identify any respondent who enters the LPS or LPSc website.
I.E.3. Learners’ Perceptions Survey - Identified version [LPSi]
For FY2016, we created the LPSi, where “i” indicates that the respondent is to be identified.
That is, responses can be released to the field containing aggregated responses for respondent
groups of one (1) or more. The LPSi consists of two questionnaires: LPS-PRi intended for MD
and DO trainees including medical students, interns, residents, or fellows, and LPS-AHi
intended for all health professions trainees, including students, interns, residents, or fellows in
Associated Health, Nursing, and Dentistry except MD or DO. The LPS-PRi and LPS-AHi are
the same as the LPS-PR and LPS-AH, respectively, except for an additional question that asks
LPSi respondents to provide their first, middle, and last names. Together with the
facility identifier, investigators are able to match LPSi responses to other databases by the
responder’s name.
The LPSi is accessible only to trainees who were properly consented and signed an informed
consent as part of an IRB-approved research protocol. To access the LPSi, the trainee must be
given the specific website that enables him or her to link to the LPSi survey. That is, LPSi
surveys are intended for responders who are participating in evaluation or research studies and
who have signed informed consent to an IRB approved protocol. OAA’s use of the data, such
as linking survey responses to other datasets, is limited to what is allowed as specified in the
IRB-approved protocol, and is further limited to any terms in the signed informed consent
agreement, and other limitations imposed by applicable data use agreements.

I.E.4. Learners’ Perceptions Survey - Primary Care version [LPS-PC]
In 2012, OAA created a primary care version of the LPS, or LPS-PC2012. The LPS-PC was
designed to capture the experiences of health professions trainees who rotate through or are
assigned to a VA primary care clinic.6 The LPS-PC is applicable to all trainees and thus
contains only one questionnaire.
Historically, LPS-PC2012 was created from the LPS2012 by (i) replacing “VA Medical Center”
with “VA Primary Care Clinic,” (ii) by modifying, deleting, and adding individual questions to fit a
primary care setting (e.g., excluding elements referring to inpatient care), and (iii) changing the
wording to make survey questions applicable to all health professions trainees, and thus no
longer making distinctions in questions pertaining to Associated Health, Dental, Nursing, and
Physician Resident and Medical Student. For the LPS-PC survey, “primary care” was defined
as any clinical setting where patients receive comprehensive, continuity, and primary care, such
as general internal medicine, primary care, or Patient Aligned Care Team (PACT) clinics.
The LPS-PC2012 data were obtained by administering the LPS-PC_v002 (04/02/2012) survey
to a 2% sample of LPS responders who claimed to have rotated through a VA primary care
clinic. The LPS-PC2013 data were obtained from the LPS-PC_v003_ed03 survey (09/06/2012)
that was administered for the 2013 academic year (July 1, 2012 through June 30, 2013).
Updates from LPS-PC2012 to LPS-PC2013 paralleled those updates from LPS2012 to
LPS2013. The LPS-PC2014 data were collected from responses from the 2% sample to the
LPS-PC_v004_ed011 (10/09/2013). Updates to LPS-PC2014 paralleled those applied to the
LPS2014.
The LPS-PC2015 data were obtained from the LPS-PC_v005_ed.003 (08/05/2014), with
mid-year updates to LPS-PC_v006_ed006 (03/30/2015) that included changes in profession
response codes, and again to LPS-PC_v007_ed004 (04/14/2015) and to LPS-PC_v007_ed006
(09/24/2015) to be consistent with changes to the LPS response codes and survey questions,
whenever appropriate. Table 2 shows applicable differences between LPS and LPS-PC.
I.E.5. Learners’ Perceptions Survey - Primary Care Identified version [LPS-PCi]
For FY2016, the LPS-PCi_v007_ed006 was created by adding a question to the LPS-PC that
asks responders to report their first, middle, and last names. That is, the LPS-PC_v007_ed006
and LPS-PCi_v007_ed006 offer the same questions, response codes, and question and
response order, but the LPS-PCi includes an identity question to determine the identity of the
respondent. Like the LPSi, each responder must sign an informed consent under an
IRB-approved protocol. After signing, trainees are given the website by VA faculty and staff to
access the link and enter responses to the LPS-PCi survey. As with the LPSi, the LPS-PCi is
intended for responders who are participating in evaluation or research studies and who have
signed informed consent to an IRB-approved protocol that thus permits investigators to have
access to the responder’s identity, where such information may be used to link with other data
sources in accordance with the IRB-approved protocol, and may be limited further by the
signed, IRB-approved informed consent agreement and all applicable data use agreements.
I.E.6. Learners’ Perceptions Survey - Standard version [LPSs]
OAA maintains a standard version of the LPS that is applicable to both VA and non-VA facilities.
The standard version asks the same questions as the LPS where VA specific language has
been replaced by generic terms. For instance, “VA Medical Center” and “Computerized Patient
Record System” found in LPS questions were replaced by “MAIN” facility and “Patient Health
Record,” respectively, in LPSs questions.

II. DATA STRUCTURE
The current LPS Survey, or LPS2016, is a self-reporting survey designed to measure the
perceptions of health professions trainees about their clinical training experiences at a VA
facility. The progression of LPS surveys since 2001 (LPS2001, LPS2002, … LPS2014,
LPS2015, LPS2016) are described in Table 1. Differences between LPS2013, LPS2014,
LPS2015 and LPS2016 are described in Table 2. A description of differences in LPS surveys
for years prior to 2013 is available in the respective LPS Instructions Manual prepared for the
applicable years.
By definition, a “VA facility” is a VA Healthcare System, Medical Center, Hospital, Outpatient
Clinic, or Outreach Center. “Experience” is operationally defined to be the respondent’s most
recent clinical experience at a given VA facility. LPS surveys are administered near the end of
the respondent’s rotation, assignment, or educational time for the designated VA facility of
interest.
The LPS2016 survey is made up of two separate questionnaires. These questionnaires are
administered separately to trainees from different education programs. The LPS2016 Physician
Resident Questionnaire, or LPS2016_PR, is designed to measure the perceptions of medical
students, physician interns (or PGY-1), residents (PGY2-4), or fellows (PGY4+) in a graduate
medical education program. Questions cover the respondent’s most recent clinical training
experience at a given VA facility.
The LPS2016 Associated Health, Dental, and Nursing Questionnaire, or LPS2016_AH, is
designed to measure the perceptions of trainees in Associated Health, Dentistry, and Nursing
programs about their most recent clinical training experience at a given VA facility. The
LPS2016_AH is intended for all academic levels that range from pre-baccalaureate certificate
and diploma programs through postdoctoral and residency training programs.
The LPS2016_PR and LPS2016_AH questionnaires and response codes were designed to
work together so that responses across professions would be comparable. Both questionnaires
contain facility-level and environment-level domains that describe teaching and working
experiences and the clinical environment. In addition, the LPS2016_PR contains
environment-level domains that capture the respondent’s perceptions about the availability,
timeliness and
quality of staff and services, as well as systems and processes to deal with medical errors.
Table 2 provides a detailed description of how the LPS_PR and LPS_AH questionnaires differ.
II.A. Facility-Level Information
Facility-level information is based on information supplied by the respondent to describe their
reporting facility.
II.A.(i). Reporting Facilities
Each VA facility is classified by its Veterans Integrated Services Network (VISN), its VHA
Service Support Center (VSSC) designation, and a six-digit number (STA6n or STA6ID) that
distinguishes sites down to basic service levels, as identified in VA corporate data sets.
Examples of service levels include a domiciliary unit, nursing home, main hospital, and
outpatient care facility. Service-level facilities are grouped by point of service, indicating a
common physical address, and classified using a five-digit number (STA5). Point-of-service
facilities are, in turn, grouped by common parent facility and classified using a five-digit number
(STA5n).

Facility information is computed at parent facility levels (STA5n). However, computation of the
calibration instrument used to adjust scores (described below in this Manual) relies on point of
service (address-based) facility levels (STA5).
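The relationship among the three levels can be sketched as follows (station numbers are
hypothetical; real identifiers come from VA corporate data sets):

```python
import pandas as pd

# Hypothetical stations: each STA6ID (service level) rolls up to a STA5
# (point of service / street address), which rolls up to a STA5n (parent).
stations = pd.DataFrame({
    "sta6id": ["51200A", "51201B", "51201C"],
    "sta5":   ["51200", "51201", "51201"],
    "sta5n":  ["51200", "51200", "51200"],
    "n_resp": [12, 7, 20],
})

# Facility information is reported at the parent-facility level (STA5n) ...
print(stations.groupby("sta5n")["n_resp"].sum())
# ... while the calibration instrument relies on point-of-service levels (STA5).
print(stations.groupby("sta5")["n_resp"].sum())
```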
The LPS will adopt the new region classification of VA medical centers intended for 2016 when
those rules become finalized.
II.A.(ii). Facility-Level Complexity Codes
For performance measurement, administration pay grade, and research purposes, the Human
Resource Committee of the National Leadership Board, through the Office of Productivity,
Efficiency and Staffing, assigns each parent facility to one of six peer groupings that represent
different degrees of operating complexity.
Veterans Health Administration assigns complexity scores based on a five-point ordinal scale.
These scores are based on a Facility Complexity Model that is approved by the Under Secretary
in an Executive Decision and published for VA use. Scores were published in 2005, 2008,
2011, and 2014. A complexity score is assigned to each facility by year based on the most
recently computed value. Thus, complexity scores for FY2001 through 2007 are based on
schedules published in 2005; complexity scores for 2008 through 2010 are based on schedules
published in 2008; scores for 2011 through 2013 are based on schedules published in 2011 and
scores for 2014 through 2015 are based on schedules published in 2014.
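The most-recently-published rule maps fiscal years onto schedule years as in this small sketch:

```python
import bisect

PUBLICATION_YEARS = [2005, 2008, 2011, 2014]

def complexity_schedule(fiscal_year):
    """Return the schedule year governing a fiscal year (most recent publication)."""
    i = bisect.bisect_right(PUBLICATION_YEARS, fiscal_year) - 1
    return PUBLICATION_YEARS[max(i, 0)]  # FY2001-2004 fall back to the 2005 schedule

print(complexity_schedule(2003))  # 2005 (per the text, FY2001-2007 use the 2005 schedule)
print(complexity_schedule(2009))  # 2008
print(complexity_schedule(2015))  # 2014
```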
The Facility Complexity Model assigns the parent facility (STA5n) to a complexity level based on
seven variables. These variables are as follows:
(i) Patient Volume is calculated as the number of pro-rated patients seen based on the
Veterans Equitable Resource Allocation model (VERA) that classifies patients by level of
treatment and costs incurred.
(ii) Intensive Care Unit and Surgical Operative Complexity Levels are measured on a combined
scale where the highest score is a facility with Level 1 ICU and complex surgery, and the lowest
score is a facility with neither program.
(iii) Patient Risk is computed as the Medicare Relative Risk score calculated from all VA patient
diagnoses based on Diagnostic Cost Groups. Patients with higher risk are considered to have
more complex illnesses that are more difficult to manage.
(iv) Total Resident Slots is determined as the number of paid resident slots that were allocated
to the facility by VA’s Office of Academic Affiliations. More slots indicate greater commitment to
the education mission and are expected to add complexity to facility management.
(v) Herfindahl-Hirschman Index of Resident Slots is computed for each facility by taking the
proportion of the facility’s residents in each academic program, squaring each proportion, and
then summing the squared proportions over all of the facility’s programs (see the sketch after
this list). Scores range from zero to one. Higher scores indicate facilities where residents are
more concentrated in fewer programs. Greater concentration is expected to decrease the
complexity of managing a facility’s education mission.
(vi) Research Dollars is computed as VERA Research Support allocation.
(vii) Complexity of Clinical Programs is computed as the number of complex clinical programs
from a list of 11 such programs that require specialized staff, equipment, or complex academic
affiliations (PGY5-7). These programs include Spinal Cord Injury, Blind Rehab, Cardiac
Surgery, Invasive Cath Lab, Neurosurgery, Transplant, Radiation Oncology, Interventional
Radiology, Polytrauma, Inpatient Acute Mental Health and PTSD, and Mental Health Intensive
Care Management.
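The index in item (v) reduces to a few lines (program names and slot counts are hypothetical):

```python
def hhi(slots_by_program):
    """Herfindahl-Hirschman Index over a facility's resident slots per program."""
    total = sum(slots_by_program.values())
    return sum((n / total) ** 2 for n in slots_by_program.values())

# A facility concentrated in one program scores near 1; an even spread scores lower.
print(hhi({"internal medicine": 30, "surgery": 10}))  # 0.625
print(hhi({"a": 10, "b": 10, "c": 10, "d": 10}))      # 0.25
```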

These seven variables are weighted and combined to assign each parent facility to a high,
medium, or low complexity group. Those assigned to the high complexity group are further
subdivided into three sub-groups.
For 2014, the number of facilities assigned based on Option 2 is as follows:

Complexity
Rating      Description                                                        Facilities (2014)
1a          Largest level of patient volume, patient risk, teaching and              39
            research; largest number / breadth of physician specialists;
            Level 5 ICU unit
1b          Very large patient volume, patient risk, teaching and research;          21
            Level 4 or 5 ICU unit
1c          Large patient volume, patient risk, teaching and research;               24
            Level 4 ICU unit
2           Medium patient volume, patient risk, some teaching and/or                25
            research; Level 3 and 4 ICU unit
3           Smallest patient volume, smallest patient risk, little or no             31
            teaching and/or research; lowest number of physician
            specialists per pro-rated person; Level 1 and 2 ICU units

II.B. Respondent-Level Information
Respondent level information includes the following classes of information.
II.B.1. Specialty and Academic Level
“Specialty” refers to either a discipline, specialty within a discipline, or a subspecialty within a
specialty. These specialties are, in turn, aggregated into one of four health professions
education programs: Associated Health, Dentistry, Nursing, and Physicians.
II.B.1.(i). Discipline, specialty, and subspecialty
II.B.1.(i)(a). Program
Each respondent is asked to identify their health professions education program from a
pull-down menu listing four program choices: Associated Health, Dentistry, Nursing, and
Physicians (MD or DO) (Table 3).
II.B.1.(i)(b). Reported Specialty
Once they have indicated a health professions program, respondents are then asked to select
their specialty from a list specific to each program (Table 3). There are a total of 176 possible
disciplines, specialties, and subspecialties, including 25 specialties listed for associated health,
22 specialties within dental health professions, 23 for nursing, and 106 physician specialties and
subspecialties, including medical student as a classification (Table 3). In addition, respondents
may also report their participation in an advanced special fellowship from a list of 25 fellowship
programs (Table 4).
Respondents are asked to select a specialty that best describes their current educational goal
for their current education program, not their ultimate career goal. For example, a physician
resident entering PGY-1 in internal medicine who intends eventually to enter a cardiology
fellowship upon completing an Internal Medicine program should report “Internal Medicine”
as the specialty of their current education program.
The names of all health specialties, and their assignment into a Specialty Group (Table 3), were
based on input from, and reviewed by, the VA Office of Academic Affiliations’ Director of
Medicine and Dentistry, Director of Associated Health, and Director of Nursing Education.
II.B.1.(i)(c). Specialty Groups
All trainee respondents are assigned by a computer algorithm into one of 26 health professions
specialty groups. While respondents are asked to report their specialty [II.B.1.(i)(b)] and
academic level [II.B.1.(ii)(a)], actual assignments to a specialty group are made by a
pre-programmed computer algorithm that takes the trainee-reported specialty (Table 3) and
compares the trainee-reported academic level (Table 5) against the range of academic levels
that are applicable to the trainee-reported specialty (Table 6).
That is, the assignment algorithm accounts for respondents who may be reporting their ultimate
career objective, rather than their immediate and current education program. So, a respondent
reporting a specialty that maps into a specialty group (Table 3) while reporting an academic
level that is not included among acceptable academic levels for that specialty group (Table 6),
would be assigned to a different specialty grouping, depending on the specialty and academic
level specified. For these assignments, the algorithm assumes that the listing of academic level
is accurate because academic program levels have explicit language that is universal across
professions and trainees are likely to monitor closely their progress through an academic
education program.
For example, a trainee who reports “Psychology” as the health specialty (Table 3) and “Doctoral
Practicum Extern” as the academic level (Table 6) would be assigned to the “Psychology”
specialty group. If the trainee reports being in “Psychology” (Table 3), but at a Baccalaureate
level (not listed as a specialty-specific academic level under “Psychology” in Table 6), the
trainee’s discipline would be assigned to the “Other Associated Health” specialty group. In
another example, a second year resident in Internal Medicine who selects “Cardiology” as their
specialty because it is their ultimate education goal, would be assigned by the computer
algorithm to the “Internal Medicine” specialty group since the trainee would begin a cardiology
program only after completing his or her Internal Medicine program. If the respondent reports
being a medical student but reports a PGY-1 academic level, then the respondent is assigned
to the Internal Medicine group at the PGY-1 level.
All assignments made by the computerized assignment algorithm were reviewed and approved
by the VA Office of Academic Affiliations’ Director of Medicine and Dentistry, Director of
Associated Health, and Director of Nursing Education. Whenever the reported academic level
falls outside the range of allowed academic levels for the selected specialty, the respondent’s
assigned specialty is classified as “other” within the chosen health professions education
program. The reported academic level is then mapped to an assigned specialty-specific
academic level defined by the “other” specialty category.
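A simplified sketch of this assignment logic follows; the specialty and academic-level tables
are abbreviated, hypothetical stand-ins for Tables 3, 5, and 6:

```python
# Abbreviated, hypothetical stand-ins for Tables 3, 5, and 6; the real tables
# cover 176 specialties and their allowable academic levels.
ALLOWED_LEVELS = {
    "Psychology": {"Doctoral Practicum Extern", "Pre-doctoral Intern"},
    "Internal Medicine": {"PGY-1", "PGY-2", "PGY-3"},
    "Cardiology": {"PGY-4", "PGY-5", "PGY-6"},
}
REMAP = {
    ("Psychology", None): "Other Associated Health",
    ("Cardiology", "PGY-2"): "Internal Medicine",  # ultimate goal, not current program
    ("Medical Student", "PGY-1"): "Internal Medicine",
}

def assign_specialty_group(specialty, level):
    """Trust the reported academic level; re-map the specialty when they conflict."""
    if level in ALLOWED_LEVELS.get(specialty, set()):
        return specialty
    return REMAP.get((specialty, level)) or REMAP.get((specialty, None), "Other")

print(assign_specialty_group("Psychology", "Doctoral Practicum Extern"))  # Psychology
print(assign_specialty_group("Psychology", "Baccalaureate"))  # Other Associated Health
print(assign_specialty_group("Cardiology", "PGY-2"))          # Internal Medicine
```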
II.B.1.(i)(d). Special Fellows
All respondents are asked if they participate in one of 25 special fellowship programs that OAA
funds, as listed in Table 4. Response selections do not depend on the respondent’s selection of
a specialty program or an academic level.

II.B.1.(ii). Academic Level
II.B.1.(ii)(a). Reported Academic Level
The “current academic level” is defined as the level that best describes the progress the trainee
is making in the course of their health professions education program. To facilitate responses,
responders are asked to choose an academic level from separate lists of possible academic
levels based on the type of health professions education program they had selected at the
beginning of the LPS survey (Associated Health, Dentistry, Nursing, and Physicians). Table 5
tabulates the list of possible choices by program type (i.e., Associated Health, Dentistry,
Nursing, and Physicians MD and DO). For example, a third year medical student working to
complete a third year of medical school should report “medical school – 3rd year” as the
appropriate academic level, even though completing medical school and entering a physician
residency program may be the overall education goal.
II.B.1.(ii)(b). Academic Year
A metric was constructed to reflect the number of years since high school that the respondent has spent in a
health professions education program. How specific levels are translated into academic years
is defined in Table 5. The purpose of counting years is to provide a measure of academic
progress that is comparable across all health professions with a common reference point.
Academic year is a relatively new measure and not available prior to 2014.
II.B.1.(ii)(c). Academic Level Group
A metric was also constructed to measure academic progress across health professions by
coding respondents into broad academic groups that define their health professions education
level. Assignments are described in Table 5. Academic level group is available on LPS
datasets since 2001.
II.B.2. Education Background
For physicians, the LPS_PR collects information on: (1a) U.S. medical school status, or whether
the respondent graduated from a U.S. or non-U.S. medical school; (1b) year of medical school
graduation, or the year the respondent graduated or will graduate from medical school; (1c) VA
rotation status, or whether the respondent is currently rotating at a VA facility, “yes” or “no”; and
(1d) percent VA, or the percent of the time in their clinical training program that the respondent
also spent at the VA facility.
For Associated Health, Dentistry, and Nursing programs, the LPS_AH collects information on:
(2a) time required in current program, or how much total time in weeks, months, and years, the
trainee expects to spend in their current clinical education program, (2b) time spent in current
program, or how much time in weeks, months, and years, the trainee has spent in their current
clinical education program, and (2c) percent VA, or percent of the time that the respondent
spent in their current clinical education program that was also spent at the VA facility.
II.B.3. Respondent’s Demographic Characteristics
Demographic characteristics include (1) gender, or whether the respondent is “female” or
“male;” and (2) active duty status, or “yes” or “no” to whether the respondent is currently active
in the military.
II.B.4. Mix of Patients Seen by Respondent
The characteristics of patients, or patient mix, that the respondent saw during their most recent
clinical training experience with the VA facility are described in terms of the percent of patients
seen who: (1) were 65 years of age or older, (2) were female, (3) had a chronic medical
illness, (4) had a chronic mental illness, (5) had multiple medical illnesses, (6) had alcohol /
substance dependence, (7) had low income or socioeconomic status, and (8) did not have
social or family support.
II.C. Domains
Perceptions are described in terms of facility-level domains and environment-level domains.
Each domain contains a series of elements that define specific items that collectively comprise
the domain. The nine environment-level domains are also grouped into one of three
experiences. A listing of all domains, their associated element questions, and how domains are
grouped by experience, is summarized below and in Table 2.
II.C.1. Facility-Level Domains
Respondents are asked to summarize their overall clinical training experience at the VA facility
by answering questions that correspond to five facility-level domains.
II.C.1.(i) Likely use again is an ordinal four-point Likert scale that indicates whether
respondents: “definitely would not,” “probably would not,” “probably would,” or “definitely would”
choose their VA training experience again.
II.C.1.(ii) Employment potential comprises two scales.
II.C.1.(ii)(1) Likely Recruitable before, or a five-point Likert scale indicating whether
respondents before their VA clinical training experience were “very likely,” “somewhat likely,”
“had not thought about it,” “somewhat unlikely” or “very unlikely” to consider future employment
opportunities at a VA medical facility.
II.C.1.(ii)(2) Likely Recruitable after, or a five-point Likert scale indicating the change in whether
respondents as a result of their VA clinical training experience are “a lot more likely,” “somewhat
more likely,” “no difference,” “somewhat less likely,” or “a lot less likely” to consider future
employment opportunities at a VA medical facility.
II.C.1.(iii) Patient care quality comprises two scales.
II.C.1.(iii)(1) Quality before, or a five-point Likert scale indicating whether the quality of care at
the VA facility before starting the VA training experience is “excellent,” “very good,” “good,”
“fair,” or “poor.”
II.C.1.(iii)(2) Quality after, or a five-point Likert scale indicating whether the quality of care at
the VA facility based on their actual VA experience is “excellent,” “very good,” “good,” “fair,” or
“poor.”
II.C.2. Environment-Level Domains
II.C.2.(1) Core Domains
(i) Domain Elements: As described in Table 2, there are a total of nine core domains that
describe the trainee’s teaching, working, and clinical experiences during the respondent’s most
recent clinical training experience at a given VA facility. Core domains are made up of 6 to
15 item questions, or domain elements. Each domain element question asks the respondent
about a different aspect of the domain. Each domain element question asks respondents to
describe their perceptions on an ordinal five-point Likert scale: “very satisfied,” “somewhat
satisfied,” “neither satisfied nor dissatisfied,” “somewhat dissatisfied,” or “very dissatisfied.”
(ii) Domain Summary: After all domain element questions have been answered, the
respondent is also asked to respond to a domain summary question. Here, the respondent is
asked to provide an overall rating that summarizes the domain, taking all domain elements
into account. As with domain element questions, domain summary questions ask
respondents to describe their perceptions on an ordinal five-point Likert scale: “very satisfied,”
“somewhat satisfied,” “neither satisfied nor dissatisfied,” “somewhat dissatisfied,” or “very
dissatisfied.”
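As an illustration, element and summary responses might be coded and summarized as below
(the numeric coding direction and names are illustrative; actual scoring methods are described
in Section III):

```python
# Illustrative coding of the five-point satisfaction scale (5 = very satisfied).
SCALE = {
    "very satisfied": 5,
    "somewhat satisfied": 4,
    "neither satisfied nor dissatisfied": 3,
    "somewhat dissatisfied": 2,
    "very dissatisfied": 1,
}

def element_mean(responses):
    """Mean over one respondent's domain element responses; None if all missing."""
    scores = [SCALE[r] for r in responses if r in SCALE]
    return sum(scores) / len(scores) if scores else None

elements = ["very satisfied", "somewhat satisfied", "somewhat satisfied"]
print(element_mean(elements))       # 4.33...
print(SCALE["somewhat satisfied"])  # the separately asked domain summary rating
```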
II.C.2.(1)(a) Teaching Experience is made up of two domains:
II.C.2.(1)(a)(i) Learning Environment domain contains 15 elements that describe the
respondent’s clinical learning environment. Elements include time working with patients,
supervision, autonomy, non-education “scut” work, interdisciplinary approach, preparation for
clinical practice, for future training, and for business aspects of clinical practice, time for
learning, access to specialty expertise, teaching conferences, quality of care, culture of patient
safety, spectrum of patient problems, and diversity of patients seen.
II.C.2.(1)(a)(ii) Clinical Faculty and Preceptors domain contains 13 elements that describe the
relationships with VA clinical faculty and preceptors whom respondents encountered during their
VA clinical training experience. Elements include clinical skills, teaching ability, interest in
teaching, research mentoring, accessibility and availability, approachability and openness,
timeliness of feedback, fairness in evaluation, being a role model, mentoring, patient-oriented,
quality of faculty, and evidence-based clinical practice.
II.C.2.(1)(b) Working Experience is made up of three domains:
II.C.2.(1)(b)(i) Working Environment domain contains 9 elements that describe the respondent’s
VA working environment that had been encountered during their clinical training experience.
Elements include ancillary / support staff morale, laboratory services, radiology services,
ancillary / support staff, call schedule, computerized patient record system, access to online
journals, resources and references, computer access, and workspace.
II.C.2.(1)(b)(ii) Physical Environment domain contains 8 elements that describe the
respondent’s VA physical environment that had been encountered during their clinical training
experience. Elements include convenience of facility location, parking, personal safety,
availability of needed equipment, facility maintenance and upkeep, facility cleanliness and
housekeeping, call rooms, and availability of food at the medical center when on call.
II.C.2.(1)(b)(iii) Personal Experience domain contains 7 elements that describe the
respondent’s VA personal experience that had been encountered during their clinical training
experience. Elements include personal reward from work, balance of personal and professional
life, level of job stress, and of fatigue, continuity of relationship with patients, ownership and
personal responsibility for patients’ care, and enhancement of clinical knowledge and skills.
II.C.2.(1)(c) Clinical Experience is made up of four domains:
The physician LPS survey, or LPS_PR questionnaire, asks respondents to answer all four
clinical experience domains. The associated health, nursing, and dental LPS survey, or
LPS_AH questionnaire, asks respondents to answer only the Clinical Environment domain.
II.C.2.(1)(c)(i) Clinical Environment domain contains 7 elements that describe the respondent’s
clinical environment that had been encountered during their VA clinical training experience.
Elements include hours worked, number of inpatients admitted for care, number of outpatients
and clinic patients seen, how well physicians and nurses work together, how well physicians
and other clinical staff work together, ease of getting patient records, and backup system for
electronic health records. Both physician and non-physician specialties are administered the
clinical environment
domain.
II.C.2.(1)(c)(ii) Availability and Timeliness of Staff & Services domain contains 13 elements,
including outpatient nursing staff on weekdays and, for both weekdays and nights and
weekends, attending and supervisory staff, inpatient nursing staff, ancillary support staff,
pharmacy services, radiology services, and laboratory services.
II.C.2.(1)(c)(iii) Quality of Staff & Services domain (whenever staff or services are available),
contains 6 elements describing quality of attending and supervisory staff, nursing, ancillary,
pharmacy, radiology, and laboratory services.
II.C.2.(1)(c)(iv) Processes of Dealing with Medical Errors domain contains 6 elements, including
systems to prevent / reduce medical errors, assure medication safety, report medical /
medication errors, assure confidentiality of error reporting, facilitate discussion of medical /
medication errors, and facilitate analysis of medical / medication errors as a learning experience.
II.C.2.(2) Topic Domains
As described in Table 2, LPS includes three special topic domains that are designed to ask
respondents about special events or to focus on different aspects of their training experiences
that are not otherwise covered by one or more core domains.
Topic domain questions are often fact-based: respondents are asked to agree or disagree
with a statement about their VA experiences. In some cases, a topic domain includes both
fact-based and corresponding satisfaction-based questions. The intent is to measure the extent
to which an item, factor, condition, or circumstance exists in the respondent’s clinical training
experience. Topic domain responses can be compared with core domain responses to
determine if the presence or absence of a factor has an impact on how respondents rate their
domain satisfaction.
II.C.2.(2)(i) Psychological Safety. The 2 element questions comprising the psychological safety
topic domain ask if respondents “strongly agree,” “agree,” “neither agree nor disagree,”
“disagree,” or “strongly disagree” with, respectively, whether members of the clinical team are
able to “…bring up problems and tough issues,” and if the respondent felt “… free to question
the decisions or actions of those with more authority?”
II.C.2.(2)(ii) Patient / Family Centered Care. (a) The 17 element questions comprising the
patient / family centered care domain ask respondents about whether they “strongly disagree,”
“disagree,” “neither agree nor disagree,” “agree,” or “strongly agree” with specific statements of
facts regarding patient and family centered care at VA. (b) Respondents are also asked a fact-based summary question: “overall, VA practitioners provide patient and family centered care.”
(c) In addition, respondents are asked to rate their overall satisfaction with patient and family
centered care at the VA facility as: “very satisfied,” “somewhat satisfied,” “neither satisfied nor
dissatisfied,” “somewhat dissatisfied,” or “very dissatisfied.”
II.C.2.(2)(iii) Interprofessional Team Care. (a) The 9 element questions comprising the
interprofessional team care domain ask respondents about whether they “strongly disagree,”
“disagree,” “neither agree nor disagree,” “agree,” or “strongly agree” with specific statements of
facts regarding interprofessional team care at VA. (b) Respondents are also asked a fact-based
summary question: “overall, VA practitioners provide interprofessional team care.” (c) In
addition, respondents are asked to rate their overall satisfaction with interprofessional team care
at the VA as: “very satisfied,” “somewhat satisfied,” “neither satisfied nor dissatisfied,”
“somewhat dissatisfied,” or “very dissatisfied.”


III. SCORING METHODS
Several strategies are applied to compute scores that can represent a given respondent’s
overall ratings for facility-level, environment-level, and topic domains. Scoring strategies are
selected to meet the information needs of users of LPS data. Users include, but are not
necessarily limited to: VA education administrators; trainee supervisors and clinical
practitioners; Designated Education Officers, Designated Learning Officers, and Associate
Chiefs of Staff for Education; VA executive leadership at local VA medical centers, VISN
offices, and VA Central Offices; education program directors and executive officers at the local
affiliated universities, colleges, and institutions; national policy makers; program administrators;
education evaluators; program investigators; and medical, epidemiologic, analytic, and health
services researchers.
III.A. Background: Scale Types
III.A.1. Categorical Scores: Categorical scores represent classes or groupings of items that
have no particular order. Examples include ethnicity, gender, education program specialty, or
professional discipline.
III.A.2. Ordinal Scores: Categories that can be ordered according to a value of interest are
said to be ordinal. For example, satisfaction questions ask respondents to consider the
intensity with which they are satisfied (or dissatisfied) with an element of their learning
experiences, and then to classify that intensity into a distinct category, such as “very satisfied,”
“somewhat satisfied,” “neither satisfied nor dissatisfied,” “somewhat dissatisfied,” or “very
dissatisfied.” Responders who report being “very satisfied” have a higher intensity of
satisfaction than responders who report being “somewhat satisfied,” who in turn have a
higher intensity of satisfaction than responders who report being “neither satisfied nor
dissatisfied.” Likewise, responders who “strongly agree” with a given statement are likely to
agree more intensely than responders who only “agree” with the statement.
Ordinal scores, however, do not measure intervals between ordered ratings. We often assign a
consecutive integer value to each response code to keep track of its respective order. So, for
responses to satisfaction questions, we assign 1 to “very dissatisfied,” 2 to “somewhat
dissatisfied,” 3 to “neither satisfied nor dissatisfied,” 4 to “somewhat satisfied,” and 5 to “very
satisfied.” These assigned integer numbers reflect the true order of their respective response
categories. So “very satisfied” (valued at 5) is more satisfied than “somewhat satisfied” (valued
at 4). That is, 5 is greater than 4 and reflects the order of the respective response categories
representing the intensity of satisfaction. However, equal integer differences do not imply
equal differences in intensity: the difference between “very satisfied” and “somewhat satisfied”
(5 - 4 = 1) does not necessarily represent the same difference in satisfaction as the difference
between “somewhat dissatisfied” and “very dissatisfied” (2 - 1 = 1), even though both
differences equal 1 on the ordered scale. That is, differences in integer values on an ordinal
scale are not meaningful.
III.A.3. Interval Scales: Interval scores are ordinal scores with the additional property that
differences between values are defined. For example, the difference in actual satisfaction
intensity between two raters who score a “2” (“somewhat dissatisfied”) and a “3” (“neither
satisfied nor dissatisfied”), respectively, represents the same difference in satisfaction as that
between two raters who score a “3” and a “4” (“somewhat satisfied”). There are strategies to
turn ordinal scales into interval scales; in specific cases, those strategies are applied here.
III.A.4. Ratio Scales: Ratio scores are interval scores for which an absolute zero is defined.
In this way, one can claim that one score is “twice” that of another score. A group score
computed as “percent satisfied” is a ratio scale because “zero” percent is defined as no
respondent being satisfied, even though each respondent’s response is classified as either
“satisfied” or “otherwise,” which is an ordinal scale. For our purposes, we can compute a ratio
score for each respondent by estimating the “likelihood” that the respondent will report being
satisfied. Likelihoods are reported as probabilities that vary from 0 (very unlikely the
respondent will report being satisfied) to 1 (highly likely the respondent will report being satisfied).
III.B. Element Scores
III.B.1. Element o-scores
Responses to element questions are scored on an ordinal scale taken directly from the
selected response options. For satisfaction-based questions, we assign the integer value
of 1 to indicate the order of the “very dissatisfied” response as that with the lowest satisfaction
intensity. Similarly, we assign 2 to “somewhat dissatisfied,” 3 to “neither satisfied nor
dissatisfied,” 4 to “somewhat satisfied” and 5 to “very satisfied” indicating the highest
satisfaction intensity. For fact-based questions, the value of 1 is assigned to indicate “strongly
disagree,” 2 to “disagree,” 3 to “neither agree nor disagree”, 4 to “agree,” and 5 to “strongly
agree.” Element o-scores constitute an ordinal scale.
III.B.2. Element p-scores
Each element may also be scored as a binary value. These are called “p” scores because they
can be used to compute the percent of respondents who are satisfied with a condition, or who
agreed with a statement about their VA training experiences. For element questions with
satisfaction-based responses, 1 (satisfied) is assigned to respondents who reported being “very
satisfied” or “somewhat satisfied” with the given condition, and 0 (otherwise) to respondents
who reported “neither satisfied nor dissatisfied,” “somewhat dissatisfied,” or “very dissatisfied.”
For element questions with agreement-based responses, 1 (agree) indicates the respondent
“strongly agrees” or “agrees” with the given statement, and 0 (otherwise) indicates the respondent
“neither agrees nor disagrees,” “disagrees,” or “strongly disagrees” with the given statement.
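Taken together, the two element-level scorings are simple recodes of the five-point response. A minimal sketch in Python (the label strings and function name are illustrative, not part of the LPS specification):

# Recode one 5-point LPS response into an element o-score (ordinal 1-5)
# and an element p-score (1 = satisfied, 0 = otherwise).
O_SCORE = {
    "very dissatisfied": 1,
    "somewhat dissatisfied": 2,
    "neither satisfied nor dissatisfied": 3,
    "somewhat satisfied": 4,
    "very satisfied": 5,
}

def element_scores(response: str) -> tuple[int, int]:
    """Return (o_score, p_score) for a single element response."""
    o = O_SCORE[response.lower()]
    p = 1 if o >= 4 else 0  # 4 = "somewhat satisfied", 5 = "very satisfied"
    return o, p

# element_scores("somewhat satisfied") returns (4, 1)

The same recode applies to agreement-based items, with “strongly disagree” through “strongly agree” in place of the satisfaction labels.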
III.C. Domain Scores
Domain scores summarize the information contained across elements for each domain.
III.C.1. Summary Domain o-scores
A summary score is determined from the five response options to the summary domain question.
Each summary question is asked at the end of each domain section, following the respective
element questions. By design, o-scores are ordinal scales. Since respondents are asked to
summarize the domain, the response to the summary question reflects how each respondent
considered and weighed each element when determining an overall rating for the entire domain.
That is, summary domain o-scores comprise element responses that are weighted by each
individual respondent. Summary scores reflect how individual respondents value each of the
elements when conceptually rating the overall domain. For example, the ratings of elements on
which a respondent places little value in the context of the domain will have little impact on how
that respondent rates the overall domain.
For satisfaction-based summary domain questions, to reflect how responses are ordered in
terms of satisfaction intensity, we assign the integer value of 1 to the “very dissatisfied”
response, indicating the lowest overall domain satisfaction intensity. Similarly, we assign 2 to
“somewhat dissatisfied,” 3 to “neither satisfied nor dissatisfied,” 4 to “somewhat satisfied,” and
5 to “very satisfied,” indicating the highest satisfaction intensity. For fact-based questions, the
value of 1 is assigned to “strongly disagree,” 2 to “disagree,” 3 to “neither agree nor disagree,”
4 to “agree,” and 5 to “strongly agree.”
III.C.2. Summary Domain p-scores
Summary domain scores are dichotomized to two values. For satisfaction-based summary
domain questions, 1 (‘satisfied’) is assigned to respondents who reported being “very satisfied”
or “somewhat satisfied,” and 0 (‘otherwise’) to respondents who reported “neither satisfied nor
dissatisfied,” “somewhat dissatisfied,” or “very dissatisfied.” For agreement-based summary
domain questions, 1 (‘agree’) indicates the respondent “strongly agrees” or “agrees” with the
given statement, and 0 (‘otherwise’) indicates the respondent “neither agrees nor disagrees,”
“disagrees,” or “strongly disagrees” with the given statement.
III.C.3. Summary Domain adjusted p-score
Adjusted p-scores are computed as likelihoods that an individual respondent will have a
summary domain p-score of 1 (“very satisfied” or “somewhat satisfied” for satisfaction-based
domains; “strongly agree” or “agree” for fact-based domains) versus the 0 alternative (“neither
satisfied nor dissatisfied,” “somewhat dissatisfied,” or “very dissatisfied” for satisfaction-based
domains; “neither agree nor disagree,” “disagree,” or “strongly disagree” for fact-based
domains). Adjusted p-scores are calculated by regressing the summary domain o-score, as the
dependent variable, on the mean element domain score (m-score) plus other covariate
adjustors as the independent variables. M-scores are defined below.
Specifically, adjusted p-scores are calculated by estimating generalized linear models with the
summary domain o-score as an ordinal dependent variable, and the mean element domain
m-score and mean-centered covariate adjustors as independent variables, using a cumulative
logit link function and a multinomial error distribution. P-scores are calculated by summing the
predicted likelihoods that the respondent would have reported being “very satisfied” or “somewhat
satisfied,” based on the respondent’s actual mean element domain m-score and covariates
valued for a referent trainee.
There are two sets of covariate adjustors and referent trainees:
R1: Summary domain adjusted p-scores based on R1 permit comparisons across facilities,
within a facility over time, or across domains. Covariates include the respondent’s specialty or
discipline based on the specialty group variable, academic level based on the academic level
group variable, gender, response bias index, service complexity of reporting facility, and mix of
patients seen. Covariates are mean centered around the ‘referent’, defined to be a trainee in
internal medicine specialty and PGY 1-3 academic level, male, with zero response bias, in a
facility classified as ‘1a’ complexity (highest), and seeing a median mix of patients, for trainees
during academic years 2011-2015.
R3: Summary domain adjusted p-scores based on R3 permit comparisons among respondents
over academic levels and among disciplines, specialties, and subspecialties. The list of
covariates explicitly excludes the respondent’s specialty or discipline and academic level, but includes
gender, response bias index, service complexity of reporting facility, and mix of patients seen.
Covariates are mean centered around the ‘referent’ defined to be a trainee who is male, with
zero response bias, in a facility classified as ‘1a’ complexity (highest), and seeing a median mix
of patients, for trainees during academic years 2011-2015.
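A sketch of this calculation, assuming pandas and statsmodels’ OrderedModel as the cumulative-logit estimator; the column names (o_score, m_score, and the covariate list) are hypothetical stand-ins for the R1 or R3 adjustor sets:

import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

def adjusted_p_scores(df: pd.DataFrame, covariates: list[str]) -> pd.Series:
    """Predicted likelihood of reporting 'somewhat' or 'very' satisfied,
    with covariates held at the referent (zero after mean-centering)."""
    exog = df[["m_score"] + covariates]
    fit = OrderedModel(df["o_score"], exog, distr="logit").fit(
        method="bfgs", disp=False)
    ref = exog.copy()
    ref[covariates] = 0.0  # referent trainee after mean-centering
    probs = fit.predict(exog=ref.to_numpy())  # columns = categories 1..5
    return pd.Series(probs[:, 3] + probs[:, 4], index=df.index)

Summing the two highest-category probabilities mirrors the “very satisfied” plus “somewhat satisfied” definition of the p-score of 1.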
III.C.4. Mean Element Domain m-scores
Mean element scores, or m-scores, are computed by domain for each respondent by taking the
average of non-missing element o-scores. Here, the consecutive integer values (1, 2, 3, 4, and
5 for the 5-item Likert scales) for the element o-scores are treated as actual numbers when
computing their average to calculate the domain’s m-score. Elements not answered are treated
as missing and are not counted. If 20% of the element questions within a given domain are
unanswered by a given respondent, the respondent’s mean element score for that domain is
treated as a missing value.
As described, m-scores are interval scales. This is allowed because element questions
comprising a single domain are generally one-dimensional [see VI.]. Thus, mean element
domain m-scores reflect “latent” domain factors that represent the intensity with which
respondents report their satisfaction or agreement. The simple mean of element
responses thus becomes a sufficient statistic to represent that latent factor.
By design, m-scores are computed as simple averages of element o-scores where integer
values are treated as numbers. Thus, each element o-score is weighted equally across all
elements in computing a score to represent the entire domain. This contrasts with summary
domain o-scores where respondents must weigh the relative importance of each element when
determining how they rate the entire domain (by responding to the domain summary question).
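A minimal sketch of the m-score rule, assuming element o-scores arrive as a pandas DataFrame (rows = respondents, columns = one domain’s elements, NaN = unanswered), with the 20% cutoff applied as stated above:

import pandas as pd

def m_scores(o_scores: pd.DataFrame, max_missing: float = 0.20) -> pd.Series:
    """Equal-weight mean of non-missing element o-scores for one domain;
    set to missing when the share of unanswered elements reaches 20%."""
    frac_missing = o_scores.isna().mean(axis=1)
    m = o_scores.mean(axis=1, skipna=True)  # each element weighted equally
    return m.mask(frac_missing >= max_missing)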
III.C.5. Mean Element Domain z-scores
Mean element domain m-scores can be recomputed as z-scores by subtracting the mean of
m-scores from the respondent’s m-score and dividing the difference by the square root of the
variance of m-scores across all responders.
Z-scores enable investigators to compare ratings across responders, with scores adjusted to a
standard variance of one and a mean of zero. A z-score of zero is benchmarked as the score
the average respondent assigns to an average facility. A negative (or positive) z-score indicates
that the respondent rates a given facility lower (or higher) than the average rater rating an
average facility.
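The standardization is one step; a sketch (m being the Series returned by the m-score function above):

def z_scores(m: pd.Series) -> pd.Series:
    """Standardize m-scores across all responders to mean 0, variance 1."""
    return (m - m.mean()) / m.std()  # std = square root of the variance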
III.C.6. Mean Element Domain Adjusted z-scores
Mean element domain z-scores can also be adjusted to reflect differences in other covariates.
Adjusted z-scores are calculated by regressing the mean element domain z-scores against
covariates in a generalized linear model with an identity linking function and normal error
distribution. The adjusted z-score is computed as the residual: the difference between the
respondent’s actual z-score (based on the respondent’s ratings) and the predicted z-score
obtained from the estimated model. There are two sets of covariate adjustors included in the model.
R1: Mean element domain z-scores adjusted based on R1 permit comparisons across facilities,
within a facility over time, or across domains. Covariates include the respondent’s specialty based
on the specialty group variable, academic level based on the academic level group variable,
gender, response bias index, service complexity of the reporting facility, and mix of patients
seen.
R3: Mean element domain z-scores adjusted based on R3 permit comparisons among
respondents over academic levels and among disciplines, specialties, and subspecialties. The
list of covariates explicitly excludes the respondent’s specialty or discipline and academic level, but
includes gender, response bias index, service complexity of the reporting facility, and mix of
patients seen.
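A sketch of the residual computation, assuming statsmodels OLS (an identity link with normal errors is ordinary least squares) and a hypothetical covariate DataFrame aligned to the z-scores:

import statsmodels.api as sm

def adjusted_z_scores(z: pd.Series, covars: pd.DataFrame) -> pd.Series:
    """Residual z-score after removing covariate effects: the respondent's
    actual z-score minus the model's predicted z-score."""
    fit = sm.OLS(z, sm.add_constant(covars), missing="drop").fit()
    return fit.resid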
III.D. Missing Values
To compute mean element domain scores, we take the mean of only those elements for which
the respondent reported a useable response (not missing or inapplicable). The mean domain
score is treated as a missing value whenever the respondent fails to answer two or more
domain element questions.


To compute adjusted scores, respondents must have described the mix of patients they saw
during their VA clinical encounters along seven dimensions. To account for missing data when
respondents failed to provide a complete set of information on these covariate factors
(approximately 13%), we imputed the values for the given respondent by taking the mean
among all such responders who were in the respondent’s facility, in the same specialty group
and academic level group, and who responded to the survey during the same two-year reporting
period.
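A sketch of this group-mean imputation, with hypothetical grouping column names standing in for the facility, specialty group, academic level group, and two-year reporting period:

def impute_patient_mix(df: pd.DataFrame, mix_cols: list[str]) -> pd.DataFrame:
    """Fill missing patient-mix covariates with the mean among responders
    in the same facility / specialty group / academic level group / period."""
    keys = ["facility", "specialty_group", "academic_level_group", "period"]
    out = df.copy()
    for col in mix_cols:
        group_mean = out.groupby(keys)[col].transform("mean")
        out[col] = out[col].fillna(group_mean)
    return out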

IV. INDEX COMPUTATIONS
Several indices are computed from LPS responses to aid interpretation of findings from the
survey.
IV.A. Element Value, Importance, Attitude Score
The value or importance that a group of respondents places on a domain element, within the
context of that domain, can be computed as the independent association between the element
satisfaction rating and the domain summary satisfaction rating, independent of the effects of all
other elements on the domain summary.6, 7 The element value is essentially the weight that the
respondent applied to that particular element when considering their overall satisfaction for the
domain. Elements with less value are considered relatively unimportant drivers of a
respondent’s satisfaction with their clinical training experience in the context of the given
domain.
IV.B. Response bias index
Respondents are asked to describe their satisfaction with an element or domain by selecting
from among five response choices (“very satisfied,” “somewhat satisfied,” “neither satisfied nor
dissatisfied,” “somewhat dissatisfied,” “very dissatisfied”). In so doing, respondents must define
each category and mentally compute cut points to translate the intensity of their satisfaction or
dissatisfaction into a specific choice from among the five response options. Respondents may
vary in how they define those cut values. For example, a rater who is not highly satisfied may
report “very satisfied” while another respondent feeling the same intensity of satisfaction may
choose to report “somewhat satisfied” on the survey.
This response bias phenomenon can be seen by observing how different trainees report
satisfaction with essentially the same, or common, experience. Such common experiences
include interactions with VA’s computer system, facility-level parking, and the convenience of
the facility’s location among trainees who report on the same facility for the same time period.
Here, variability of responses across responders reflects, in part, differences in how
respondents choose a response option to describe the intensity of their satisfaction.
To account for these responder biases, we developed a response bias index, nicknamed
responder “grumpiness.” The theory behind response bias indexes is that all respondents who
report on the same experience should, at least theoretically, be expected to assign the same
rating. Thus a response bias index could be computed by comparing a respondent’s actual
satisfaction rating with the average among other trainees who reported on the same experience.
The response bias index computed here is taken from three element questions used in two
domains. These “common” elements describe experiences that may vary across facilities, but
do not vary between trainees reporting on the same facility and time period. Listed in Table 2,
these common elements ask respondents to rate their satisfaction with the facility’s
“Computerized Patient Record System (CPRS)” (a Working Environment element), and with the
“convenience of facility location” and “parking” (Physical Environment elements). Responses
are recorded on five-point, ordered, Likert scales, recoded so that “very satisfied” is assigned a
value of five, “somewhat satisfied” four, “neither satisfied nor dissatisfied” three, “somewhat
dissatisfied” two, and “very dissatisfied” one. The mean of the recoded responses over the
three elements is calculated for each respondent. That is, the index is an m-score computed as
the mean of recoded responses across the three common elements.
An index value is computed for each respondent by taking the respondent’s m-score on the
three common elements and subtracting the average m-score among all respondents at the
given facility and calendar year. To ensure that all trainees report on the same experience,
facilities are defined in terms of a 6-digit facility code.
The facility’s trainees include those who took the LPS in the same academic year, or in an
earlier or later academic year. To account for small changes that may have occurred in
computers, convenience, and parking over time, trainee ratings are weighted to reflect the time
that elapsed between when the given respondent completed the LPS and when each facility
trainee completed the LPS. Scores taken from trainees who responded to the LPS in the
reporting year are given a weight of one (1/(1+0)=1). Scores taken from trainees who
responded to the LPS one year later or one year earlier are assigned a weight of 0.50
(1/(1+1)=0.50). Scores two years apart are weighted by 0.33 (1/(1+2)=0.33), and so on, so that
scores up to 10 years out are assigned a weight of 0.09 (1/(1+10)=0.09). The weighted average
is computed by multiplying each trainee rate (the mean of the three element rates) by the
corresponding weight (based on when the trainee took the LPS), summing the weighted rates
over all of the facility’s trainees, and dividing the weighted sum by the sum of the weights over
all of the facility’s trainees. Note that, for a given year, the information used to compute the
response bias index is taken from years both prior to and after the year in which the responder
completed the LPS survey of interest.
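The weighting scheme reduces to weight = 1 / (1 + years apart). A sketch, assuming one row per respondent with hypothetical columns facility (6-digit code), year, and common_m (the respondent’s m-score on the three common elements):

def response_bias_index(df: pd.DataFrame) -> pd.Series:
    """Respondent's common-element m-score minus the time-weighted
    average of common-element m-scores at the same facility."""
    def one_facility(g: pd.DataFrame) -> pd.Series:
        out = pd.Series(index=g.index, dtype=float)
        for i, row in g.iterrows():
            # weight 1/(1 + years apart from this respondent's LPS year)
            w = 1.0 / (1.0 + (g["year"] - row["year"]).abs())
            out[i] = row["common_m"] - (w * g["common_m"]).sum() / w.sum()
        return out
    return df.groupby("facility", group_keys=False).apply(one_facility)

A positive index value flags a responder who rates the common experiences higher than the facility’s time-weighted average (a less “grumpy” responder), and a negative value the reverse.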
IV.C. Differencing Variable
Responses to topic domains can be used to compute differencing variables.12 Differencing
variables are equivalent to moderator variables found in controlled clinical trials that can turn on,
or turn off, the effect of an intervention of interest. Responses to topic domain questions enable
investigators to assess the extent to which the presence or absence of a condition impacted a
respondent’s rating of their VA clinical training experience by core domains. The differencing
variable strategy enables investigators to use LPS data to make inferences about effect sizes of
interventions on core domains using pre-post, before-after, and with-without designs. The
strategy has been explained and applied to LPS data to determine the impact of changes in
ACGME duty hour standards on trainee satisfaction with the VA clinical learning experience.12

V. DISSEMINATION
The Office of Academic Affiliations, within the Department of Veterans Affairs Veterans Health
Administration, provides official findings of the LPS2016 data by means of a series of
standardized and on-going Current Reports, and one-of-a-kind Special Reports. These reports
are designed to provide information about the progress the VA has made towards its education
mission. The intended audience for these reports includes, among others, VA’s Designated
Learning Officers, Education Officers, Associate Chiefs of Staff for Education, and local and
national academic leaders. These reports are intended to help education leaders identify
problems, propose solutions, implement interventions, and evaluate the progress achieved
when those interventions are implemented, in order to offer VA trainees an optimal clinical
learning environment while providing veterans with safe, effective, and high-quality health care.
V.A. Current Reports / Data Cube
The LPS2015 Current Reports provide analytic information calculated from trainee responses to
LPS surveys administered during the academic years from 2010 through and including 2015.
Current Reports allow users to compare scores between facilities, to see how scores have
changed over time for a given facility, and to examine how scores vary across specialties and
academic levels.
While the LPS data contain both LPS and LPSc responses, Current Reports are computed
whenever a minimum of eight respondents is available for a reporting unit.
Current Reports are available through a data cube accessible on OAA’s website at
http://vaww.oaa.med.va.gov/lpsCurrentReports/. Current Reports are formatted so that users
specify the facilities to include in the analyses (one facility, group, VISN, or all facilities
nationwide) and specify specialty (by specialty group) and academic level (by academic level
group). Current Reports will produce charts to display information graphically, and tables to
display statistics numerically. Where multiple domains are applicable, separate charts and
tables are usually produced, a set for each domain.
Current Reports were created and developed using Microsoft Visual Studio (2008) tools, with
the Data Cube constructed on an SQL Server (2008 R2) Analysis Services platform, and
Reports constructed on an SQL Server (2008 R2) Reporting Services platform. Adjusted scores
and pre-processing of the raw data were performed in SPSS version 19. To permit
comparisons of satisfaction ratings across facilities, over time, among specialties, and over
academic levels, constructing these Reports from the original raw survey data required the
equivalent of 26,500 lines of programming, excluding software developed to assess robustness,
construct validity, response biases, or reliability of the final research-ready datasets.
Available Current Reports are as follows:
1. Element reports
1.1. For each of nine domains, describes percent satisfied across a primary group and a
comparison group of facilities, for each of between six and fifteen of the domain’s elements, the
overall domain satisfaction, and the adjusted overall domain satisfaction, for respondents
identified by selected disciplines or specialties and academic level. For example, the primary
group may consist of one facility, with all facilities in the corresponding VISN or all facilities as
the comparison group. The element report also shows trainee survey counts and a HI/LOW table.
1.2. For each of nine domains, describes percent satisfied across the last three reporting years
for each of between six and fifteen of the domain’s elements, the overall domain satisfaction,
and the adjusted overall domain satisfaction, for respondents identified by selected facilities,
disciplines or specialties, and academic levels.
2. Domain Reports
2.1.1. For each of nine domains, describes adjusted percent satisfied across selected
individual facilities for respondents identified by reporting years, disciplines or specialties, and
academic levels.
2.1.2. For each of nine domains, describes adjusted percent satisfied across all VISNs for
respondents identified by reporting years, disciplines or specialties, and academic levels.
2.2. For each of nine domains, describes adjusted percent satisfied across reporting years
2005 through 2015, for respondents identified by reporting facilities, respondents’ disciplines or
specialties, and academic levels.

V.B. Special Reports
LPS2016 data can be used to generate a series of short Special Reports for administrative,
evaluative, regulatory, and research purposes, in response to inquiries from OAA staff; from
ACOS-E, Designated Learning Officers, and interested VA faculty and staff in the field; and
from other government Executive and Legislative branch offices and agencies.
V.C. Publications and Presentations
Information contained in the LPS2016 data will be disseminated through manuscripts published
in peer-reviewed scientific journals, presentations at scientific meetings, formal lectures and
continuing education seminars, and project reports for distribution to the public through the
Office of Academic Affiliations. A brief list of publications and presentations that drew on LPS
datasets follows:
1. Keitz, S.; Holland, G.J.; Melander, E.H.; Bosworth, H.; and Pincus, S.H., for the Learners’
Perceptions Working Group (Gilman, S.C.; Mickey, D.D.; Singh, D.; et al.). “The Veterans
Affairs Learners’ Perceptions Survey: The Foundation for Educational Quality Improvement.”
Academic Medicine, vol. 78, no. 9 (2003), pp. 910-917.
2. Singh, D.K.; Holland, G.J.; Melander, E.H.; Mickey, D.D.; and Pincus, S.H. “VA’s Role in U.S.
Health Professions Workforce Planning.” Proceedings of the 13th Federal Forecasters
Conference of 2003, pp. 127-133, 2004.
3. Singh, D.K.; Golterman, L.; Holland, G.J.; Johnson, L.D.; and Melander, E.H. “Proposed
Forecasting Methodology for Pharmacy Residency Training.” Proceedings of the 15th Federal
Forecasters Conference of 2005, pp. 39-42, 2005.
4. Chang, Barbara K.; Kashner, T. Michael; and Holland, Gloria J. “Evidence-based Expansion
and Realignment of Physician Resident Positions.” Presented at the 3rd Annual Association of
American Medical Colleges Physician Workforce Research Conference, Bethesda, MD, May 24, 2007.
5. Chang, B.K.; Holland, G.J.; Kashner, T.M.; Flynn, T.C.; Gilman, S.C.; Sanders, K.M.; and Cox,
M. “Graduate Medical Education Enhancement in the VA.” Presented at the Association of
American Medical Colleges Group on Resident Affairs Professional Development Meeting,
Small Group Facilitated Discussion, Memphis, TN, April 22-25, 2007.
6. Chang, B.K.; Kashner, T.M.; and Holland, G.J. “Allocation Methods to Enhance Graduate
Medical Education.” Presented at the International Medical Workforce Collaborative,
Vancouver, B.C., Canada, March 21-24, 2007.
7. Cannon, Grant W.; Keitz, Sheri A.; Holland, Gloria J.; Chang, Barbara K.; Byrne, John M.;
Tomolo, Anne; Aron, David C.; Wicker, Annie B.; and Kashner, T. Michael. “Factors
Determining Medical Students’ and Residents’ Satisfaction during VA-Based Training:
Findings from the VA Learners’ Perceptions Survey.” Academic Medicine, vol. 83, no. 6 (June
2008), pp. 611-620.
8. Chang, Barbara K.; Cox, Malcolm; Sanders, Karen M.; Kashner, T. Michael; and Holland,
Gloria J. “Expanding and Redirecting Physician Resident Position by the US Department of
Veterans Affairs.” Presented at the 11th International Medical Workforce Collaborative, Royal
College of Surgeons of Edinburgh, Edinburgh, UK, September 17, 2008.
9. Golden, Richard M.; Henley, Steven S.; White Jr., Halbert L.; and Kashner, T. Michael.
“Correct Statistical Inferences using Misspecified Models with Missing Data with Application
to the Learners’ Perceptions Survey.” Presented at the Joint Annual Convention of the 42nd
Annual Meeting of the Society for Mathematical Psychology and the 40th Annual Conference of
the European Mathematical Psychology Group, Amsterdam, Netherlands, August 1-4, 2009.
10. Kashner, T. Michael; Henley, Steven S.; Golden, Richard M.; Byrne, John M.; Keitz, Sheri A.;
Cannon, Grant W.; Chang, Barbara K.; Holland, Gloria J.; Aron, David C.; Muchmore, Elaine
A.; Wicker, Annie; and White Jr., Halbert L. “Studying the Effects of ACGME Duty Hours
Limits on Resident Satisfaction: Results from VA Learners’ Perceptions Survey.” Academic
Medicine, vol. 85, no. 7 (July 2010), pp. 1130-1139.
11. Golden, Richard M.; Henley, Steven S.; White Jr., Halbert L.; and Kashner, T. Michael.
“Application of a Robust Differencing Variable (RDV) Technique to the Department of
Veterans Affairs Learners’ Perceptions Survey.” Presented at the 43rd Annual Meeting of the
Society for Mathematical Psychology, Portland, OR, August 7-10, 2010.
12. Kaminetzky, Catherine P.; Keitz, Sheri A.; Kashner, Michael; Aron, David C.; Byrne, John M.;
Chang, Barbara K.; Clarke, Christopher; Gilman, Stuart C.; Holland, Gloria J.; Wicker, Annie;
and Cannon, Grant W. “Training Satisfaction for Subspecialty Fellows in Internal Medicine:
Findings from the Veterans Affairs (VA) Learners’ Perceptions Survey.” BMC Medical
Education, vol. 11, no. 21 (2011), pp. 1-9 (http://www.biomedcentral.com/1472-6920/11/21).
13. Kashner, T. Michael; and Chang, Barbara K. “VA Residents Improve Access and Financial
Value.” Presented at the Annual Meeting of the Association of American Medical Colleges,
Denver, CO, November 4-9, 2011.
14. Lam, Hwai-Tai C.; O’Toole, Terry G.; Arola, Patricia E.; Kashner, T. Michael; and Chang,
Barbara K. “Factors Associated with the Satisfaction of Millennial Generation Dental
Residents.” Journal of Dental Education, vol. 76, no. 11 (November 2012), pp. 1416-1426.
15. Byrne, John M.; Chang, Barbara K.; Gilman, Stuart; Keitz, Sheri A.; Kaminetzky, Cathy; Aron,
David; Baz, Sam; Cannon, Grant; Zeiss, Robert A.; and Kashner, T. Michael. “The Primary
Care-Learners’ Perceptions Survey: Assessing Resident Perceptions of Internal Medicine
Continuity Clinics and Patient-Centered Care.” Journal of Graduate Medical Education, vol. 5,
no. 4 (December 2013), pp. 587-593.
16. Chang, Barbara; Muchmore, Elaine; and Kashner, T. Michael. “Taking the Pulse of Your GME
Training Programs.” Presented at the 2014 AAMC Group on Resident Affairs Spring
Meeting, Phoenix, AZ, May 4-7, 2014.
17. Byrne, John M.; Kashner, T. Michael; Gilman, Stuart C.; Wicker, Annie B.; Bernett, David S.;
Aron, David C.; Brannen, Judy L.; Cannon, Grant W.; Chang, Barbara K.; Hettler, Debbie L.;
Kaminetzky, Catherine P.; Keitz, Sheri A.; Zeiss, Robert A.; Golden, Richard M.; Paik,
Dae-Hyun; and Henley, Steven S. “Do Patient Aligned Medical Team Models of Care Impact
VA’s Clinical Learning Environments.” Presented at the 2015 Health Services Research and
Development / Quality Enhancement Research Initiative (HSR&D/QUERI) National
Conference, Philadelphia, PA, July 8-10, 2015.
18. Perez, Elena V.; Byrne, John M.; Durkin, Rob; Wicker, Annie B.; Henley, Steven S.; Golden,
Richard M.; Hoffman, Keith A.; Hinson, Robert S.; Aron, David C.; Baz, Samuel; Loo,
Lawrence K.; Velasco, Erwin D.; McKay, Tracy; and Kashner, T. Michael. “Clinical
Supervision Index: Measuring Supervision of Physician Residents in VA Medical Centers.”
Presented at the 2015 Health Services Research and Development / Quality Enhancement
Research Initiative (HSR&D/QUERI) National Conference, Philadelphia, PA, July 8-10, 2015.
19. Kashner, T. Michael; Hettler, Debbie L.; Zeiss, Robert A.; with Aron, David C.; Brannen, Judy
L.; Byrne, John M.; Cannon, Grant W.; Chang, Barbara K.; Dougherty, Mary B.; Gilman,
Stuart C.; Holland, Gloria J.; Kaminetzky, Catherine P.; Wicker, Annie B.; Bernett, David S.;
and Keitz, Sheri A. “Has Interprofessional Education Changed Learning Preferences? A
National Perspective.” Invited resubmission to Health Services Research.

VI. PSYCHOMETRIC PROPERTIES
Historically, LPS survey responses have shown good internal consistency8 (α’s ranging from
0.87 to 0.92), and have been validated for discriminant and construct validity across medical
students and physician residents,8 medical specialties,9, 10 dental specialties,11 and in
longitudinal analyses for physician residents.12 Empirical analyses with LPS data reveal that
trainee responses that have been subject to scoring, covariate adjustments, response bias
corrections, and calibrations permit investigators to make robust comparisons of satisfaction
ratings across responding trainees representing different disciplines, specialties,
subspecialties, academic levels, and reporting facilities.
Table 7 summarizes the psychometric properties computed for LPS responses obtained during
academic year 2015 (July 1, 2014 through June 30, 2015). Properties examined for each
domain include the number of elements; the mean, standard deviation, and minimum and
maximum values based on mean element domain scores; Cronbach’s alpha across elements
for internal consistency; intraclass correlations for single and average measures; the
eigenvalue of each principal component with an eigenvalue of 0.50 or greater; and the percent
of variance explained by each principal component.
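Of the Table 7 properties, Cronbach’s alpha is the simplest to recompute directly from the data; a sketch from a complete-case matrix of element o-scores (no LPS-specific API implied):

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x elements matrix of o-scores, no missing values.
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)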

TABLES

TABLE 1
Development of the Department of Veterans Affairs
Learners’ Perceptions Survey (LPS)

LPS2001 (administered July 1, 2000 - June 30, 2001; version v001)
The initial survey was administered to all VA trainees. Questions asked about the respondent’s
discipline / specialty, academic level, gender, time in training, and percent of time in training
spent at VA. Facility-level domains include VA and nonVA comparisons, a 100-point numerical
score, overall value of the VA clinical training experience, and whether the respondent would
recommend the experience to other trainees and would choose the VA training experience
again. Core domains focused separately on Clinical Faculty / Preceptors, Learning
Environment, Working Environment, and Physical Plant.

LPS2002 (July 1, 2001 - June 30, 2002; v002)
The second version added a listing of Physician Residency Specialties and VA Post-Residency
Special Fellowship training programs. The name of the Physical Plant Domain changed to
Physical Environment. The question describing “preparation for an evidence-based clinical
practice,” previously presented as a separate question, was listed as an element of the Clinical
Faculty / Preceptors Domain. Questions asking for the name and address of the Main Medical
Facility and the institutions sponsoring the training program were added. Seven items
describing characteristics of patients seen were added. Respondent-level questions asking
about the year graduated from medical school and whether the medical school was US or
non-US were added.

LPS2003 (July 1, 2002 - June 30, 2003; v003)
The single survey was divided into two separate questionnaires, one intended for Associated
Health trainees (AH) and the other for Physician Residents, including fellows and medical
students (PR). Research Mentoring and Mentoring by Faculty elements were added to the
Clinical Faculty / Preceptors Domain. The Personal Experience Domain was added. Patient
characteristics described in terms of whether “Treatment will resolve an acute problem,”
“Treatment will stabilize or improve a chronic condition,” and “Treatment will comfort or palliate”
were added to the characteristics of patients seen. Clinical Environment, Staff / Service
Availability, Staff / Service Quality, and Quality of Care and Patient Safety Domains were added
to the PR survey. Questions asking about the Main Medical Facility were deleted.

LPS2004 (July 1, 2003 - June 30, 2004; v004)
The Quality of Care and Patient Safety Domain was refocused to become the Systems and
Process Medical Error Domain. A Topic Domain was added to the PR questionnaire describing
the overall effect of the 2003 Accreditation Council for Graduate Medical Education (ACGME)
duty hours / scheduling on training experiences. The facility-level question, “Would you
consider the VA as a future employment site?” was added. The element “Dealing with
terminally ill patients” was removed from the Personal Experience Domain, and “Ownership /
personal responsibility for your patients' care” was added to the Personal Experience Domain.
“Treatment will resolve an acute problem,” “Treatment will stabilize or improve a chronic
condition,” and “Treatment will comfort or palliate” were removed from the characteristics of
patients seen. Questions identifying the sponsoring institution were deleted.

LPS2005 (July 1, 2004 - June 30, 2005; v005)
The classification of academic levels for Pharmacy trainees was modified.

LPS2006 (July 1, 2005 - June 30, 2006; v006)
Specialty and subspecialty classifications for Physician Residents were expanded.

LPS2007 (July 1, 2006 - June 30, 2007; v006)
No change.

LPS2008 (July 1, 2007 - June 30, 2008; v006)
No change.

LPS2009 (July 1, 2008 - June 30, 2009; v006)
No change.

LPS2010 (July 1, 2009 - June 30, 2010; v007)
The Rehabilitation discipline was divided into blind, occupational, physical, and other therapy.
The question “Are you currently on Active Duty in the military?” was added among questions
describing the characteristics of the respondent.

LPS2011 (July 1, 2010 - June 30, 2011; v008)
The ACGME Topic Domain was deleted from the PR questionnaire.

LPS2012 (July 1, 2011 - June 30, 2012; v009)
The classifications of Physician Residents Specialty and Advanced Fellowship Programs were
revised. Three Topic Domains were added: Psychological Safety, Patient / Family Centered
Care, and U.S. Accreditation Council for Graduate Medical Education (ACGME) Duty / Hours
Scheduling Domains. Disciplines were divided into Associated Health, Dentistry, and Nursing
programs. Separate questions describing Advanced Fellowship Programs were added to the
AH questionnaire.

LPS2013 (July 1, 2012 - June 30, 2013; v010)
There were major changes in how specialty and academic level data were collected for
Associated Health and Nursing. Consistent with the strategy for Dentistry and Physicians,
Associated Health and Nursing program respondents were asked separate questions to name
their discipline or specialty, and to indicate their academic level, in their current program. The
list of disciplines and specialties for Associated Health Programs was expanded to include
Marriage & Family Counseling, Mental Health Counseling, and Surgical Technician /
Technologist. Nursing disciplines and specialties were also expanded. The listings of
disciplines, specialties, and academic levels for Dentistry were also updated.

LPS2014 (July 1, 2013 - June 30, 2014; v011)
The listings of academic levels and listings of disciplines, specialties, and subspecialties for
each health professions program (Associated Health, Dentistry, Nursing, and Physicians) were
updated. Pre-baccalaureate academic levels “certificate,” “diploma,” and “associate degree”
were clarified to distinguish pre-baccalaureate from post-doctoral certificates. The number of
facility-level domains was reduced based on reported need in the field. The before-after quality
of care assessment was continued for physician residents, and added to the Associated Health
survey for dental, nursing, and associated health programs. The elements comprising teaching
experiences, including the clinical learning environment and faculty & preceptor domains, were
left unchanged from LPS2013. In addition, the number of elements comprising the working
experience domains, including personal experience, working environment, and physical
environment, was reduced based on the contribution each element made to driving variation in
all element scores by domain. For clinical experience, the Staff and Services Timeliness and
Availability, Quality of Staff and Services, and Process Medical Error Domains were left
unchanged from FY2013. However, the number of elements comprising Clinical Environment
was reduced based on the contribution each element made to driving variation in all element
scores for the domain. The ACGME2011 duty hour topic domain was discontinued, as that
study concluded. The Patient-Centered Care topic domain was modified. Specifically, the
domain was divided into an Interprofessional Team Care domain focusing on provider-provider
interactions and a Patient-Centered Care domain focusing on provider-patient interactions.
Both the Interprofessional Team Care and Patient-Centered Care domains had a fact-based
domain summary and a satisfaction-based domain summary.

LPS2015 (July 1, 2014 - June 30, 2015; v012)
The listing of specialties for Physicians was updated to include Adult Reconstructive
Orthopaedics, Advanced Heart Failure and Transplant Cardiology, Blood Banking / Transfusion
Medicine, Brain Injury Medicine, Chemical Pathology, Clinical Informatics, Complex General
Surgical Oncology, Craniofacial Surgery, Cytopathology, Emergency Medical Services,
Epilepsy, Female Pelvic Medicine and Reconstructive Surgery - OB-GYN, Female Pelvic
Medicine and Reconstructive Surgery - Urology, Foot and Ankle Orthopaedics, Forensic
Pathology, Hand Surgery - Orthopaedic, Hand Surgery - Plastic Surgery - Integrated,
Hematology - Internal Medicine, Hematology - Pathology - Anatomic and Clinical, Medical
Biochemical Genetics, Medical Microbiology, Molecular Genetic Pathology, Musculoskeletal
Oncology, Neuromuscular Medicine - Neurology, Neuromuscular Medicine - (PM&R),
Neuropathology, Neurotology, Ophthalmic Plastic and Reconstructive Surgery, Orthopaedic
Sports Medicine, Orthopaedic Surgery of the Spine, Orthopaedic Trauma, Selective Pathology,
and Vascular Neurology. The scale for the question “as a result of this VA clinical training
experience, how likely would you be to consider a future employment opportunity at a VA
medical facility” was revised to very likely, somewhat likely, had not thought about it, somewhat
unlikely, and very unlikely.

LPS2016 (July 1, 2015 - June 30, 2016; v013)
The Associated Health and Nursing Program disciplines were updated. Also, the Advanced
Fellowship programs were expanded to include Addiction Treatment, Clinical Simulation,
Health Professions Education Evaluation and Research, and Psycho-Social Rehab Physicians
Fellow. The question “practitioners from different settings (inpatient, outpatient, and extended
care) communicate with me about my patients and their transitions from one level of care to
another, such as hospital discharge” was deleted from the Patient-Centered Care topic domain;
however, the question remains in the Interprofessional Team Care topic domain.

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires

In the original table, a check marks which questionnaires carry each measure. The
questionnaire columns are the Associated Health (AH) and Physician Resident (PR) forms of
LPS2013, LPS2014, LPS2015, and LPS2016, plus the LPS-PC questionnaire (ver007, ed006).
The measures, grouped as in the table, are:

FACILITY-LEVEL
Numerical score
Value of experience
Choose experience again
Recommend experience
Likely to consider VA future employment before experience
Likely to consider VA future employment after experience
More/Less likely to consider VA future employment as result of VA experience
Consider as a future employer
What level of patient care quality did you expect to find at the VA facility BEFORE starting VA training experience
How do you rate the quality of patient care at the VA facility NOW, based on your actual experience
Compare alternative experiences with: VA clinical faculty and preceptors; VA facility staff; VA learning environment; VA working environment; VA physical environment; degree of autonomy; degree of supervision; time working with patients; quality of care; usefulness of what respondent learned

ENVIRONMENT-LEVEL DOMAINS

TEACHING EXPERIENCES

Learning Environment
Degree of autonomy
Degree of supervision
Amount of non-educational work ("scut")
Interdisciplinary approach
Preparation for clinical practice
Preparation for future training
Preparation for business aspects of clinical practice
Time for learning
Time for teaching others
Access to specialty expertise
Teaching conferences
Clinic related teaching conferences
Access to learning / educational resources
Quality of care
Culture of patient safety
Spectrum of patient problems
Diversity of patients
Limiting interruptions from other patient care responsibilities
OVERALL satisfaction

Clinical Faculty / Preceptors
Clinical skills
Teaching ability
Interest in teaching
Research mentoring
Accessibility / availability
Approachability / openness
Timeliness of feedback
Fairness in evaluation
Being role models
Mentoring by faculty
Patient-oriented
Quality of faculty
Evidence-based clinical practice
OVERALL satisfaction

WORKING EXPERIENCES

Working Environment
Faculty / preceptor morale
Ancillary / support staff morale
Peer group morale
Laboratory services
Radiology services
Social work services
Interpreter services
Ancillary / support staff
Call schedule
Computerized Patient Record System (CPRS)
Patient Record System
Orientation program
Library services
Access to online journals, resources, references
Computer access
Internet access
Workspace
Room availability for seeing patients
Clinic room design
Presence of clinic room supplies
Clinic room equipment
Space for case discussion with faculty
OVERALL satisfaction

Physical Environment
Convenience of facility location
Parking
Personal safety
Availability of phones
Availability of needed equipment
Maintenance of equipment
Facility maintenance / upkeep
Lighting
Heating and air conditioning
Facility cleanliness / housekeeping
Call rooms
Availability of food at the medical center when on call
OVERALL satisfaction

Personal Experience
Personal support from colleagues
Personal reward from work
Relationship with patients
Appreciation of respondent's work by faculty
Appreciation of respondent's work by patients
Appreciation of respondent’s work by other members of the interprofessional healthcare team
Balance of personal and professional life
Enjoyment of respondent's work
Level of job stress
Level of fatigue
Continuity of relationship with patients
Ownership / personal responsibility for respondent's patients' care
Quality of care respondent's patients receive
Enhancement of respondent's clinical knowledge and skills
OVERALL satisfaction

CLINICAL EXPERIENCES

Clinical Environment
Hours at work
Number of inpatients admitted for respondent’s care
Number of outpatients / clinic patients seen
Timely availability of outpatient appointments
Timely availability of appointments for routine follow up visits
Timely availability of appointments for acute care / urgent issues
Timely performance of necessary procedures / surgeries
Time allotted to see patients (appointment length)
How well physicians, nurse practitioners, and physician assistants work together
Admitting patients in a timely fashion
Ability to use emerging therapies / pharmaceuticals
How well physicians and nurses work together
How well primary care practitioners and nursing staff work together
How well physicians and ancillary staff work together
How well physicians and other clinical staff work together
How well primary care practitioners and other health professionals work together
How well primary care practitioners and administrative support staff work together
Getting tests done in a timely fashion on weekdays
Getting tests done in a timely fashion on nights and weekends
Ease of getting patient records
Backup system for electronic health records
Amount of “paper work”
Ability to work within the system to get the best care for respondent’s patients
Nursing support for patient care issues between visits
How well primary care practitioners support patient care for each other’s assigned patients
Management of patient phone calls
OVERALL satisfaction

Staff and Services Timeliness and Availability
Attending / supervisory staff: weekdays
Attending / supervisory staff: nights and weekends
Outpatient nursing staff: weekdays
Inpatient nursing staff: weekdays
Inpatient nursing staff: nights and weekends
Ancillary / support staff: weekdays
Ancillary / support staff: nights and weekends
Pharmacy services: weekdays
Pharmacy services: nights and weekends
Radiology services: weekdays
Radiology services: nights and weekends
Laboratory services: weekdays
Laboratory services: nights and weekends
Physician services: weekdays
Physician services: nights and weekends
OVERALL satisfaction

Staff and Services Quality
Attending / supervisory staff
Nursing staff
Ancillary / support staff
Pharmacy services
Radiology services
Laboratory services
Physician services

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires
LPS2013

MEASURES

AH

OVERALL satisfaction

PR

LPS2014
AH

PR

LPS2015
AH

PR

PR

LPS-PC
ver007
ed006

LPS2016
AH

✔

✔

✔

✔

✔

Prevent / reduce medical errors

✔

✔

✔

✔

✔

Assure medication safety

✔

✔

✔

✔

✔

Report medical / medication errors

✔

✔

✔

✔

✔

Assure confidentiality of error
reporting

✔

✔

✔

✔

✔

Facilitate discussion of medical /
medication errors

✔

✔

✔

✔

✔

Facilitate analysis of medical /
medication errors as a learning
experience

✔

✔

✔

✔

✔

OVERALL satisfaction

✔

✔

✔

✔

✔

✔

✔

Process Medical Errors

TOPIC DOMAIN

Psychological Safety
Members of the clinical team of which

✔

✔

✔

✔

✔

✔

✔

Page 48 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires

AH

PR

AH

PR

AH

PR

AH

PR

LPS-PC
ver007
ed006

✔

✔

✔

✔

✔

✔

✔

✔

✔

Patients and families are treated as
members of the care team

✔

✔

Patient transitions from one level of
care to another, such as hospital
discharge, are well-coordinated

✔

✔

✔

✔

✔

✔

✔

✔

✔

Patients and families are engaged
with clinicians in collaborative goal
setting

✔

✔

✔

✔

✔

✔

✔

✔

✔

Patients and families are listened to,
respected, and treated as partners in
care

✔

✔

✔

✔

✔

✔

✔

✔

✔

LPS2013

MEASURES

LPS2014

LPS2015

LPS2016

respondent was a part are able to
bring up problems and tough issues
Respondent feels free to question the
decisions or actions of those with
more authority
Respondent feels safe to take a risk
in the VA clinical tem

Patient Centered Care

Page 49 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires

AH

PR

AH

PR

AH

PR

AH

PR

LPS-PC
ver007
ed006

Families are actively involved in care
planning and transitions

✔

✔

✔

✔

✔

✔

✔

✔

✔

Web portals provide specific healthrelated, patient education resources
for patients and families

✔

✔

✔

✔

✔

✔

✔

✔

✔

Clinicians use e-mail to communicate
with patients and families

✔

✔

✔

✔

✔

✔

✔

✔

✔

Clinicians use telemedicine or
telehealth technology to evaluate or
interact with patients or other
practitioners who are off-site

✔

✔

✔

✔

✔

✔

✔

✔

✔

Other than e-mail or telemedicine /
telehealth, clinicians use additional
electronic means of communicating
with patients

✔

✔

✔

✔

✔

✔

✔

✔

✔

Educational materials are routinely
provided to patients and families

✔

✔

✔

✔

✔

✔

✔

✔

✔

Assistance is provided for patients
who have difficulty accessing health
care services

✔

✔

✔

✔

✔

✔

✔

✔

✔

Patients have access to their paper /
electronic health records

✔

✔

LPS2013

MEASURES

LPS2014

LPS2015

LPS2016

Page 50 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires
LPS2013

MEASURES

AH

PR

Patients have access to their health
records

AH

PR

AH

PR

AH

PR

LPS-PC
ver007
ed006

✔

✔

✔

✔

✔

✔

✔

LPS2014

LPS2015

LPS2016

Environment encourages family
presence

✔

✔

✔

✔

✔

✔

✔

✔

✔

Families are treated as members of
the treatment team

✔

✔

✔

✔

✔

✔

✔

✔

✔

Respondent participates in regularly
scheduled treatment team meetings
that include physicians and nonphysicians (e.g., nurses,
psychologists, social workers,
pharmacists)

✔

✔

Page 51 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires

AH

PR

AH

PR

AH

PR

LPS-PC
ver007
ed006

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

LPS2013

MEASURES

AH

PR

Respondent participates in regularly
scheduled treatment team meetings
that include physicians and nonphysicians (e.g., nurses,
psychologists, social workers,
pharmacists)

✔

✔

Care is provided using an
interprofessional, collaborative team
approach

✔

✔

Respondent follows a defined panel
of patients longitudinally

✔

Patients or cohorts of patients with
chronic disease(s) are identified who
might benefit from additional
intervention or coordination of care
between clinic visits
For patients with chronic disease
such as diabetes, respondent
reviews lists of patients in
respondent’s primary care clinic or
panel in order to identify and better
manage patients not meeting
treatment goals

LPS2014

LPS2015

LPS2016

Page 52 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires
LPS2013

MEASURES

AH

PR

For patients with chronic disease
such as diabetes or mental illness,
respondent reviews lists of patients in
order to identify and better manage
patients not meeting treatment goals
Practitioners from different settings
(inpatient, outpatient, and extended
care) communicate with respondent
about respondent’s patients and their
transitions from one level of care to
another, such as hospital discharge

✔

✔

OVERALL, VA practitioners provide
patient and family centered care

AH

PR

AH

PR

AH

PR

LPS-PC
ver007
ed006

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

LPS2014

LPS2015

LPS2016

OVERALL, VA practitioners provide
patient and family centered care in
respondent’s VA primary care clinic
OVERALL satisfaction with patient
and family centered care
OVERALL satisfaction with patient
and family centered care in
respondent’s VA primary care clinic

✔

✔

✔

✔

✔

✔

✔

✔

✔
✔

Page 53 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires

AH

PR

AH

PR

AH

PR

LPS-PC
ver007
ed006

Participate regularly in team
meetings (formal or informal) with
members of different professions to
discuss and coordinate care of
patients

✔

✔

✔

✔

✔

✔

✔

Participate regularly in team
meetings (formal or informal) with
members of different professions to
discuss performance improvement

✔

✔

✔

✔

✔

✔

✔

Participate regularly in team
meetings (formal or informal) with
members of different professions to
discuss clinical operational issues

✔

✔

✔

✔

✔

✔

✔

Practitioners from different settings
(inpatient, outpatient, extended care)
communicate about patients and their
transitions from one level of care to
another, such as hospital discharge

✔

✔

✔

✔

✔

✔

✔

VA staff work well together among
primary and specialty care
practitioners

✔

✔

✔

✔

✔

✔

✔

LPS2013

MEASURES

AH

PR

LPS2014

LPS2015

LPS2016

Interprofessional Team Care

Page 54 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires

AH

PR

AH

PR

AH

PR

LPS-PC
ver007
ed006

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

✔

VA staff work well together among
nurses and other health professionals

✔

✔

✔

✔

✔

✔

✔

VA staff work well together among
clinical and administrative support
staff

✔

✔

✔

✔

✔

✔

✔

LPS2013

MEASURES

AH

PR

LPS2014

LPS2015

LPS2016

Primary care practitioners (e.g.,
physicians, nurse practitioners,
physician assistants) work well
together
VA staff work well together among
physicians and nurses
Primary care practitioners and
nursing staff work well together
VA staff work well together among
physicians and other health
professionals (e.g., optometry,
pharmacy, podiatry, psychology,
rehabilitation, social work)
Primary care practitioners and other
health professionals work well
together (e.g., optometry, pharmacy,
podiatry, psychology, rehabilitation,
social work)

Page 55 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires
LPS2013

MEASURES

AH

PR

LPS2014

LPS2015

LPS2016

AH

PR

AH

PR

AH

PR

✔

✔

✔

✔

✔

✔

LPS-PC
ver007
ed006

Primary care practitioners and
administrative support staff work well
together
OVERALL VA practitioners provide
interprofessional team care
OVERALL, respondent’s primary
care clinic provides interprofessional
team care
OVERALL satisfaction with VA
interprofessional team care
OVERALL satisfaction with
interprofessional team care for
respondent’s VA primary care clinic

✔

✔

✔

✔

✔

✔

✔
✔

Page 56 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires
LPS2013

MEASURES

AH

PR

ACGME Duty Hours / Scheduling
Personal support from colleagues

✔

Personal reward from work

✔

Relationship with patients

✔

Appreciation of respondent’s work by
faculty

✔

Supervision of respondent’s work by
attendings and more senior residents

✔

Appreciation of respondent’s work by
patients

✔

Balance of personal and professional
life

✔

Enjoyment of respondent’s work

✔

Level of job stress

✔

Level of fatigue

✔

Continuity of relationship with
patients

✔

Ownership / personal responsibility
for respondent’s patients' care

✔

LPS2014
AH

PR

LPS2015
AH

PR

LPS2016
AH

PR

LPS-PC
ver007
ed006

Page 57 of 82
LPS Instructions Manual
October 15, 2015 (ver#001_ed13)

TABLE 2
Domain Elements By Learners’ Perceptions Survey Questionnaires
LPS2013

MEASURES

AH

PR

Quality of care respondent’s patients
receive

✔

Safety of patient care

✔

Respondent’s personal safety (e.g.,
driving home from work)

✔

Enhancement of respondent’s clinical
knowledge and skills

✔

Ability to transition care of patients to
other members of the treatment team

✔

Overall effect of changes in ACGME
requirements

✔

LPS2014
AH

PR

LPS2015
AH

PR

LPS2016
AH

PR

LPS-PC
ver007
ed006


TABLE 3
Reported Specialties Listed in the LPS Survey,
By Health Professions Education Program and Assigned Specialty Group
(Reported specialties are listed by discipline, specialty, or subspecialty.)

Program: Associated Health
  Audiology: Audiology
  Chaplaincy: Chaplaincy
  Chiropractic: Chiropractic
  Dietetics: Dietetics
  Occupational Therapy: Occupational Therapy
  Optometry: Optometry
  Pharmacy: Pharmacy
  Physical Therapy: Physical Therapy
  Physician Assistant: Physician Assistant
  Podiatry: Podiatry
  Psychology: Psychology
  Rehabilitation: Blind Rehabilitation; Recreation / Manual Arts Therapy; Rehabilitation / Other
  Social Work: Social Work
  Speech Pathology: Speech Pathology
  Technical and Laboratory: Laboratory; Medical Imaging; Medical / Surgical Support Tech; Radiation Therapy; Surgical Technician / Technologist
  Other Associated Health: Licensed Professional Mental Health Counselor; Marriage & Family Therapist; Orthotics / Prosthetics; Other

Program: Dentistry
  Dental Auxiliary: Dental Assistant; Dental Hygiene
  Dentists: Anesthesiology; Craniofacial Special Care Orthodontics; Dentist; Endodontics; General Practice; Maxillofacial Prosthetics; Oral and Maxillofacial Pathology; Oral and Maxillofacial Radiology; Oral and Maxillofacial Surgery; Oral and Maxillofacial Cosmetics; Oral and Maxillofacial Craniofacial; Oral and Maxillofacial Oncology; Oral Medicine; Orthodontics & Dentofacial Orthopedics; Orthodontics / Periodontics; Pediatric; Periodontics; Prosthodontics; Prosthodontics / Maxillofacial Prosthetics; Public Health

Program: Nursing
  Nursing: Nurse Aide / Assistant; Certified Registered Nurse Anesthetist; Clinical Nurse Leader; Clinical Nurse Specialist - Acute Care; Clinical Nurse Specialist - Adult-Gerontology; Clinical Nurse Specialist - Family/Individual Across Lifespan; Clinical Nurse Specialist - Neonatal; Clinical Nurse Specialist - Pediatrics; Clinical Nurse Specialist - Psychiatric-Mental Health; Clinical Nurse Specialist - Women’s Health/Gender-Related; Licensed Practical Nurse; Licensed Vocational Nurse; Nurse Administration; Nurse Educator; Nurse Midwifery; Registered Nurse; Nurse Practitioner - Acute Care; Nurse Practitioner - Adult-Gerontology; Nurse Practitioner - Family/Individual Across Lifespan; Nurse Practitioner - Neonatal; Nurse Practitioner - Pediatrics; Nurse Practitioner - Psychiatric-Mental Health; Nurse Practitioner - Women’s Health / Gender-Related

Program: Physicians
  Medical Student: Medical Student
  Internal Medicine: Internal Medicine; Internal Medicine - Chief Resident; Internal Medicine / Emergency Medicine; Sports Medicine - Internal Medicine
  Int. Med. Subspecialty: Advanced Heart Failure and Transplant Cardiology; Cardiovascular Disease; Clinical Cardiac Electrophysiology; Critical Care Medicine - Internal Medicine; Endocrinology, Diabetes, and Metabolism; Gastroenterology; Geriatric Medicine - Internal Medicine; Hematology - Internal Medicine; Hematology and Oncology; Infectious Disease; Interventional Cardiology; Nephrology; Oncology; Pulmonary Disease; Pulmonary Disease and Critical Care Medicine; Rheumatology; Sleep Medicine (multidisciplinary)
  Medical / Other: Allergy and Immunology; Brain Injury Medicine; Clinical Neurophysiology; Dermatology; Dermatopathology (multidisciplinary); Epilepsy; Family Medicine; Geriatric Medicine - Family Medicine; Hospice and Palliative Medicine (multidisciplinary); Neurology; Neuromuscular Medicine - Neurology; Neuromuscular Medicine - (PM&R); Physical Medicine and Rehabilitation (PM&R); Procedural Dermatology; Spinal Cord Injury Medicine; Sports Medicine - Family Medicine; Sports Medicine - (PM&R); Vascular Neurology; Other
  Hospital-Based: Anesthesiology; Blood Banking / Transfusion Medicine; Chemical Pathology; Clinical Informatics; Clinical Neurophysiology; Critical Care Medicine - Anesthesiology; Cytopathology; Emergency Medical Services; Emergency Medicine; Forensic Pathology; Hematology - Pathology - Anatomic and Clinical; Medical Biochemical Genetics; Medical Genetics; Medical Microbiology; Medical Toxicology - Emergency Medicine; Medical Toxicology - Preventive Medicine; Molecular Genetic Pathology (multidisciplinary); Neuropathology; Neuroradiology; Nuclear Medicine; Nuclear Radiology; Pain Medicine (multidisciplinary); Pathology - Anatomic and Clinical; Preventive Medicine; Radiation Oncology; Radiology - Diagnostic; Selective Pathology; Sports Medicine - Emergency Medicine; Transitional Year; Vascular and Interventional Radiology
  Surgery: Adult Reconstructive Orthopaedics; Colon and Rectal Surgery; Complex General Surgical Oncology; Craniofacial Surgery; Endovascular Surgical Neuroradiology; Female Pelvic Med and Reconstructive Surgery - OB-GYN; Female Pelvic Med and Reconstructive Surgery - Urology; Foot and Ankle Orthopaedics; Hand Surgery - Orthopaedic; Hand Surgery - Plastic Surgery - Integrated; Musculoskeletal Oncology; Neurological Surgery; Neurotology; Obstetrics and Gynecology; Ophthalmic Plastic and Reconstructive Surgery; Ophthalmology; Orthopaedic Sports Medicine; Orthopaedic Surgery; Orthopaedic Surgery of the Spine; Orthopaedic Trauma; Otolaryngology; Plastic Surgery; Plastic Surgery - Integrated; Surgery - General; Surgical Critical Care; Thoracic Surgery; Thoracic Surgery - Integrated; Transplant Hepatology; Urology; Vascular Surgery; Vascular Surgery - Integrated
  Psychiatry: Addiction Psychiatry; Forensic Psychiatry; Geriatric Psychiatry; Psychiatry; Psychosomatic Medicine - Psychiatry
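As an illustration of how a data user might apply this crosswalk, the sketch below keys the lookup on both program and reported specialty, since a reported specialty such as Anesthesiology maps to different assigned groups under the Dentistry and Physicians programs. Only a few Table 3 rows are shown, and all names are hypothetical.

    # Partial crosswalk from (program, reported specialty) to the
    # assigned specialty group, following Table 3.
    SPECIALTY_GROUP = {
        ("Physicians", "Gastroenterology"): "Int. Med. Subspecialty",
        ("Physicians", "Family Medicine"): "Medical / Other",
        ("Physicians", "Anesthesiology"): "Hospital-Based",
        ("Dentistry", "Anesthesiology"): "Dentists",
        ("Physicians", "Urology"): "Surgery",
        ("Physicians", "Geriatric Psychiatry"): "Psychiatry",
        ("Associated Health", "Blind Rehabilitation"): "Rehabilitation",
    }

    def assigned_group(program, reported_specialty):
        """Return the Table 3 assigned specialty group, or None if unlisted."""
        return SPECIALTY_GROUP.get((program, reported_specialty))

    assert assigned_group("Physicians", "Urology") == "Surgery"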


TABLE 4
Special Fellowships Listed in the LPS Survey

Addiction Treatment
Advanced Geriatrics
Clinical Simulation
Dental Research
Geriatric Neurology
Health Professions Education Evaluation and Research
Health Services Research & Development
Health Systems Engineering (1 year practitioner track)
Health Systems Engineering (2 year research track)
Medical Informatics
Mental Illness Research and Treatment (Advanced Psychiatry)
Mental Illness Research and Treatment (Advanced Psychology)
Multiple Sclerosis
Parkinson’s Disease (PADRECC)
Patient Safety
Polytrauma / Traumatic Brain Injury Rehabilitation (1 year clinical track)
Polytrauma / Traumatic Brain Injury Rehabilitation (2 year research track)
Psychiatric Research / Neurosciences
Psycho-Social Rehab Physicians Fellow
Quality Scholars
The Robert Wood Johnson (RWJ) Clinical Scholars
Spinal Cord Injury Research
War Related and Unexplained Illness
Women's Health
Other


TABLE 5
Academic Level, Academic Year, and Academic Level Group, by Program
(Each academic level is shown with its Academic Year code, per Note 1, and its Academic Level Group code, per Note 2.)

Associated Health
  Clinical hours for Certificate (Pre-Baccalaureate): Year 1, Level 1
  Clinical hours for Diploma (Pre-Baccalaureate): Year 2, Level 1
  Clinical hours for Associate Degree: Year 3, Level 1
  Clinical hours for Baccalaureate Degree: Year 4, Level 2
  Post-Baccalaureate clinical hours: Year 5, Level 3
  Clinical hours for Masters Degree or Fellowship: Year 6, Level 3
  Post-Masters clinical hours: Year 7, Level 3
  Predoctoral or Doctoral clinical hours, Externship, or Practicum: Year 9, Level 4
  Predoctoral or Doctoral Internship: Year 10, Level 5
  Postdoctoral Residency or Fellowship Year 1: Year 11, Level 6
  Postdoctoral Residency or Fellowship Year 2: Year 12, Level 6
  Postdoctoral Residency or Fellowship Year 3: Year 13, Level 6
  Postdoctoral Residency or Fellowship Year 4: Year 14, Level 7
  Postdoctoral Residency or Fellowship Year 5: Year 15, Level 7
  Postdoctoral Residency or Fellowship Year 6: Year 16, Level 7

Dentistry
  Certificate (Pre-Baccalaureate): Year 1, Level 1
  Diploma (Pre-Baccalaureate): Year 2, Level 1
  Associate Degree: Year 3, Level 1
  Baccalaureate Degree: Year 4, Level 2
  Post-Baccalaureate Internship: Year 5, Level 3
  Masters Degree: Year 6, Level 3
  Post-Masters Internship or Fellowship: Year 7, Level 3
  Dental Student - 1st Year: Year 7, Level 4
  Dental Student - 2nd Year: Year 8, Level 4
  Dental Student - 3rd Year: Year 9, Level 5
  Dental Student - 4th Year: Year 10, Level 5
  Postdoctoral Residency or Fellowship Year 1: Year 11, Level 6
  Postdoctoral Residency or Fellowship Year 2: Year 12, Level 6
  Postdoctoral Residency or Fellowship Year 3: Year 13, Level 6
  Postdoctoral Residency or Fellowship Year 4: Year 14, Level 7
  Postdoctoral Residency or Fellowship Year 5: Year 15, Level 7
  Postdoctoral Residency or Fellowship Year 6: Year 16, Level 7
  Postdoctoral Residency or Fellowship Year 7: Year 17, Level 7

Nursing
  Certificate (Pre-Baccalaureate): Year 1, Level 1
  Diploma (Pre-Baccalaureate): Year 2, Level 1
  Associate Degree: Year 3, Level 1
  Baccalaureate Degree: Year 4, Level 2
  Post-Baccalaureate Residency: Year 5, Level 3
  Masters Degree: Year 6, Level 3
  Post-Masters: Year 7, Level 3
  Post-Masters Residency: Year 7, Level 3
  Pre-Doctoral Research Fellowship: Year 7, Level 4
  Pre-Doctoral Clinical Fellowship: Year 7, Level 4
  Doctoral / PhD: Year 10, Level 5
  Doctoral / DNS, DNSc: Year 10, Level 5
  Doctoral / DNP: Year 10, Level 5
  Postdoctoral Research Fellowship: Year 11, Level 6
  Postdoctoral Clinical Fellowship: Year 11, Level 6
  Post-Doctoral Residency: Year 11, Level 6

Physicians
  Medical Student - 1st year: Year 7, Level 4
  Medical Student - 2nd year: Year 8, Level 4
  Medical Student - 3rd year: Year 9, Level 5
  Medical Student - 4th year: Year 10, Level 5
  Residency or Fellowship - PGY1: Year 11, Level 6
  Residency or Fellowship - PGY2: Year 12, Level 6
  Residency or Fellowship - PGY3: Year 13, Level 6
  Residency or Fellowship - PGY4: Year 14, Level 7
  Residency or Fellowship - PGY5: Year 15, Level 7
  Residency or Fellowship - PGY6: Year 16, Level 7
  Residency or Fellowship - PGY7: Year 17, Level 7
  Residency or Fellowship - PGY8: Year 18, Level 7
  Residency or Fellowship - PGY9: Year 19, Level 7

Note 1 - Academic Year: Certificate [1]; Diploma [2]; Associate Degree [3]; Baccalaureate [4]; Post-Baccalaureate [5]; Masters [6]; Doctoral First Year, Post-Master, Pre-Doctoral [7]; Doctoral Second Year [8]; Doctoral Third Year, Practicum [9]; Doctoral Fourth Year, Doctoral Intern, Doctoral [10]; Post-Doctoral First Year [11]; Post-Doctoral Second Year [12]; Post-Doctoral Third Year [13]; Post-Doctoral Fourth Year [14]; Post-Doctoral Fifth Year [15]; Post-Doctoral Sixth Year [16]; Post-Doctoral Seventh Year [17]; Post-Doctoral Eighth Year [18]; Post-Doctoral Ninth Year [19].

Note 2 - Academic level group: pre-Baccalaureate [1]; Baccalaureate [2]; Masters [3]; Doctoral First or Second Year [4]; Doctoral Third or Fourth Year [5]; Post-Doctoral First, Second, or Third Year [6]; Post-Doctoral Fourth Year or higher [7].
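A data user applying this coding might represent Table 5 as a lookup from the academic-level response to its Year (Note 1) and Level (Note 2) codes, as in the sketch below. The names are hypothetical, and only a few rows from the Physicians block are shown.

    # Partial lookup from academic level to (Year, Level) codes per Table 5.
    ACADEMIC_CODES = {
        "Medical Student - 1st year": (7, 4),
        "Medical Student - 2nd year": (8, 4),
        "Medical Student - 3rd year": (9, 5),
        "Medical Student - 4th year": (10, 5),
        "Residency or Fellowship - PGY1": (11, 6),
        "Residency or Fellowship - PGY4": (14, 7),
    }

    def recode(academic_level):
        """Return the (Year, Level) code pair for an academic-level response."""
        return ACADEMIC_CODES[academic_level]

    assert recode("Medical Student - 3rd year") == (9, 5)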


TABLE 6
Specialty-Specific Academic Levels, by Specialty Group

Associated Health
  Audiology: Masters; Post-Masters; Doctoral; Postdoctoral
  Chaplaincy: Certificate; Baccalaureate; Masters; Doctoral; Postdoctoral
  Chiropractic: Doctoral; Postdoctoral
  Dietetics: Associate Degree; Baccalaureate; Post-Baccalaureate; Masters; Post-Masters
  Occupational Therapy: Pre-Baccalaureate; Baccalaureate; Masters; Doctoral
  Optometry: Doctoral; Postdoctoral
  Pharmacy: Doctoral PharmD; Postdoctoral
  Physical Therapy: Pre-Baccalaureate; Baccalaureate; Masters; Doctoral
  Physician Assistant: Baccalaureate; Post-Baccalaureate Intern / Fellow; Masters; Post-Masters
  Podiatry: Doctoral; Postdoctoral - PGY1; Postdoctoral - PGY2; Postdoctoral - PGY3; Postdoctoral - PGY4; Postdoctoral - PGY5
  Psychology: Post-Masters; Doctoral Practicum Extern; Doctoral Intern; Postdoctoral
  Rehabilitation: Certificate, Diploma, Associate Degree; Baccalaureate; Masters; Doctoral; Postdoctoral
  Social Work: Baccalaureate; Masters; Doctoral; Postdoctoral
  Speech Pathology: Masters; Post-Masters; Doctoral; Postdoctoral
  Technical and Laboratory: Certificate or Diploma; Associate Degree; Baccalaureate; Masters; Post-Masters
  Other Associated Health: Certificate or Diploma; Associate Degree; Baccalaureate; Post-Baccalaureate; Masters; Post-Masters; Doctoral; Postdoctoral

Dentistry
  Dental Auxiliary: Certificate / Diploma; Associate Degree; Baccalaureate; Post-Baccalaureate Intern; Masters; Post-Masters Intern / Fellow
  Dentists: Doctoral; Intern; Postdoctoral Intern / Fellow; Resident / Fellow - PGY1; Resident / Fellow - PGY2; Resident / Fellow - PGY3; Resident / Fellow - PGY4; Resident / Fellow - PGY5

Nursing
  Nursing: Certificate; Diploma; Associate Degree; Baccalaureate; Post-Baccalaureate Residency; Masters; Post-Masters; Pre-Doctoral Fellowship; Doctoral; Postdoctoral Fellowship; Post-Doctoral Residency

Physicians
  Medical Student: Medical School - year 1; Medical School - year 2; Medical School - year 3; Medical School - year 4
  Medical / Internal Medicine: PGY-1 through PGY-4
  Medical / Internal Medicine Subspecialties: PGY-4 through PGY-9
  Medical / Other: PGY-1 through PGY-9
  Surgery: PGY-1 through PGY-9
  Psychiatry: PGY-1 through PGY-9
  Hospital-Based: PGY-1 through PGY-9


TABLE 7
Psychometric Properties of LPS Domain Satisfaction Ratings
(An asterisk indicates that no value is reported for that component.)

Learning Environment
  Number of elements: 15
  Mean element rating: 4.42 (SD 0.69); range: [1.00, 5.00]
  Cronbach’s alpha: 0.956
  Intraclass correlation, single measure: 0.111 [0.095, 0.127]
  Intraclass correlation, average measure: 0.200 [0.173, 0.225]
  Eigenvalues (First, Second, Third, Fourth components): 9.518, .875, .726, .560
  % of variance: 63.453, 5.833, 4.840, 3.735
  Component loadings (First, Second, Third, Fourth):
    Time working with patients: .747, -.295, .238, .052
    Degree of supervision: .792, -.363, .145, .005
    Degree of autonomy: .739, -.411, .236, .166
    Amount of non-educational work (“scut”): .752, .010, -.241, .222
    Interdisciplinary approach: .826, .018, -.183, -.043
    Preparation for clinical practice: .871, -.173, -.019, -.055
    Preparation for future training: .875, -.159, -.009, -.056
    Preparation for business aspects of clinical practice: .773, .232, -.249, .217
    Time for learning: .829, -.028, -.181, .089
    Access to specialty expertise: .816, .077, -.132, .001
    Teaching conferences: .755, .154, -.218, .212
    Quality of care: .850, .073, -.058, -.398
    Culture of patient safety: .823, .112, -.061, -.443
    Spectrum of patient problems: .773, .341, .399, .004
    Diversity of patients: .707, .461, .417, .135
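For data users who wish to recompute these reliability figures, the standard definitions below are assumed (the manual reports only the resulting values). The average-measure intraclass correlations in Table 7 are also consistent, domain by domain, with the Spearman-Brown projection of the single-measure values using m = 2; for the Learning Environment domain, 2(0.111) / (1 + 0.111) = 0.200, matching the reported value.

    % Cronbach's alpha for a domain of k elements, with element-score
    % variances \sigma^2_{Y_i} and domain total-score variance \sigma^2_X:
    \alpha = \frac{k}{k - 1} \left( 1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X} \right)

    % Spearman-Brown projection from a single-measure intraclass
    % correlation to the average of m measures:
    \mathrm{ICC}_{\mathrm{avg}} = \frac{m \cdot \mathrm{ICC}_{\mathrm{single}}}{1 + (m - 1) \cdot \mathrm{ICC}_{\mathrm{single}}}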

Clinical Faculty / Preceptors
  Number of elements: 13
  Mean element rating: 4.51 (SD 0.70); range: [1.00, 5.00]
  Cronbach’s alpha: 0.968
  Intraclass correlation, single measure: 0.086 [0.070, 0.103]
  Intraclass correlation, average measure: 0.159 [0.131, 0.186]
  Eigenvalues (First, Second, Third, Fourth components): 9.554, .527, *, *
  % of variance: 73.489, 4.051, *, *
  Component loadings (First, Second):
    Clinical skills: .839, .266
    Teaching ability: .893, .028
    Interest in teaching: .877, -.065
    Research mentoring: .755, .008
    Accessibility / availability: .848, -.208
    Approachability / openness: .859, -.300
    Timeliness of feedback: .848, -.255
    Fairness in evaluation: .846, -.230
    Being role models: .910, -.011
    Mentoring by faculty: .889, .019
    Patient-oriented: .862, .189
    Quality of faculty: .872, .225
    Evidence-based clinical practice: .838, .336

Working Environment
  Number of elements: 9
  Mean element rating: 4.19 (SD 0.82); range: [1.00, 5.00]
  Cronbach’s alpha: 0.929
  Intraclass correlation, single measure: 0.169 [0.153, 0.184]
  Intraclass correlation, average measure: 0.288 [0.265, 0.311]
  Eigenvalues (First, Second, Third, Fourth components): 5.787, .919, .539, *
  % of variance: 64.295, 10.210, 5.986, *
  Component loadings (First, Second, Third):
    Ancillary / support staff morale: .817, -.303, -.373
    Laboratory services: .854, -.301, .087
    Radiology services: .830, -.284, .230
    Ancillary / support staff: .853, -.338, -.241
    Call schedule: .788, -.085, .423
    Computerized Patient Record System (CPRS): .764, .268, -.013
    Computer access: .760, .479, -.148
    Workspace: .766, .356, -.178
    Access to online journals, resources, references: .777, .326, .219

Physical Environment
  Number of elements: 8
  Mean element rating: 4.26 (SD 0.76); range: [1.00, 5.00]
  Cronbach’s alpha: 0.896
  Intraclass correlation, single measure: 0.217 [0.201, 0.233]
  Intraclass correlation, average measure: 0.357 [0.335, 0.378]
  Eigenvalues (First, Second, Third, Fourth components): 4.971, .770, .655, *
  % of variance: 62.143, 9.630, 8.190, *
  Component loadings (First, Second, Third):
    Convenience of facility location: .710, .338, -.353
    Parking: .603, .669, .284
    Personal safety: .804, .165, -.248
    Availability of needed equipment: .846, -.141, -.070
    Facility maintenance / upkeep: .885, -.237, -.131
    Facility cleanliness / housekeeping: .866, -.235, -.146
    Call rooms: .824, -.194, .224
    Availability of food at the medical center when on call: .728, -.115, .543

Personal Experience
  Number of elements: 7
  Mean element rating: 4.44 (SD 0.71); range: [1.00, 5.00]
  Cronbach’s alpha: 0.930
  Intraclass correlation, single measure: 0.103 [0.086, 0.119]
  Intraclass correlation, average measure: 0.187 [0.159, 0.213]
  Eigenvalues (First, Second, Third, Fourth components): 4.970, .812, *, *
  % of variance: 70.933, 11.597, *, *
  Component loadings (First, Second):
    Personal reward from work: .847, .196
    Balance of personal and professional life: .857, -.295
    Level of job stress: .861, -.408
    Level of fatigue: .842, -.444
    Continuity of relationship with patients: .808, .262
    Ownership / personal responsibility for respondent’s patients’ care: .842, .370
    Enhancement of respondent’s clinical knowledge and skills: .841, .342

Clinical Environment
  Number of elements: 7
  Mean element rating: 4.35 (SD 0.74); range: [1.00, 5.00]
  Cronbach’s alpha: 0.927
  Intraclass correlation, single measure: 0.142 [0.126, 0.158]
  Intraclass correlation, average measure: 0.249 [0.223, 0.274]
  Eigenvalues (First, Second, Third, Fourth components): 4.887, .652, .576, *
  % of variance: 69.817, 9.309, 8.232, *
  Component loadings (First, Second, Third):
    Hours at work: .815, -.324, .151
    Number of inpatients admitted for respondent’s care: .842, -.365, .131
    Number of outpatients / clinic patients seen: .846, -.294, .074
    How well physicians and nurses work together: .851, .114, -.442
    How well physicians and other clinical staff work together: .857, .104, -.430
    Ease of getting patient records: .808, .408, .292
    Backup system for electronic health records: .829, .369, .256

Staff and Services Timeliness and Availability
  Number of elements: 13
  Mean element rating: 4.14 (SD 0.83); range: [1.00, 5.00]
  Cronbach’s alpha: 0.958
  Intraclass correlation, single measure: 0.261 [0.234, 0.287]
  Intraclass correlation, average measure: 0.414 [0.380, 0.446]
  Eigenvalues (First, Second, Third, Fourth components): 8.790, 1.078, .712, *
  % of variance: 67.616, 8.291, 5.474, *
  Component loadings (First, Second, Third):
    Attending / supervisory staff: weekdays: .705, .624, -.004
    Attending / supervisory staff: nights and weekends: .725, .532, .038
    Outpatient nursing staff: weekdays: .826, .113, -.284
    Inpatient nursing staff: weekdays: .849, -.122, -.402
    Inpatient nursing staff: nights and weekends: .850, -.220, -.332
    Ancillary / support staff: weekdays: .875, -.148, -.192
    Ancillary / support staff: nights and weekends: .824, -.346, -.016
    Pharmacy services: weekdays: .848, .210, .063
    Pharmacy services: nights and weekends: .825, .042, .188
    Radiology services: weekdays: .848, .050, .238
    Radiology services: nights and weekends: .766, -.283, .393
    Laboratory services: weekdays: .880, -.060, .114
    Laboratory services: nights and weekends: .847, -.235, .241

Staff and Services Quality
  Number of elements: 6
  Mean element rating: 4.19 (SD 0.82); range: [1.00, 5.00]
  Cronbach’s alpha: 0.911
  Intraclass correlation, single measure: 0.264 [0.238, 0.290]
  Intraclass correlation, average measure: 0.418 [0.384, 0.450]
  Eigenvalues (First, Second, Third, Fourth components): 4.194, .627, *, *
  % of variance: 69.896, 10.454, *, *
  Component loadings (First, Second):
    Attending / supervisory staff: .712, .571
    Nursing staff: .829, -.385
    Ancillary / support staff: .865, -.323
    Pharmacy services: .849, .207
    Radiology services: .868, .063
    Laboratory services: .882, -.044

Process Medical Errors
  Number of elements: 6
  Mean element rating: 4.17 (SD 0.91); range: [1.00, 5.00]
  Cronbach’s alpha: 0.979
  Intraclass correlation, single measure: 0.182 [0.154, 0.210]
  Intraclass correlation, average measure: 0.308 [0.267, 0.346]
  Eigenvalues (First, Second, Third, Fourth components): 5.434, *, *, *
  % of variance: 90.574, *, *, *
  Component loadings (First):
    Prevent / reduce medical errors: .941
    Assure medication safety: .944
    Report medical / medication errors: .955
    Assure confidentiality of error reporting: .950
    Facilitate discussion of medical / medication errors: .959
    Facilitate analysis of medical / medication errors as a learning experience: .961
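As a sketch of how such statistics could be reproduced from element-level LPS data, the fragment below computes Cronbach's alpha and a principal components decomposition for a simulated six-element domain. The simulated data, variable names, and use of numpy and scikit-learn are illustrative assumptions; actual LPS response data would be needed to reproduce the values reported in Table 7.

    import numpy as np
    from sklearn.decomposition import PCA

    def cronbach_alpha(ratings):
        """Cronbach's alpha for an (n_respondents, k_elements) rating matrix."""
        k = ratings.shape[1]
        element_vars = ratings.var(axis=0, ddof=1)
        total_var = ratings.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - element_vars.sum() / total_var)

    # Simulate 500 respondents rating six elements on the 1-5 LPS scale,
    # with a shared satisfaction factor so the elements are correlated.
    rng = np.random.default_rng(0)
    shared = rng.normal(0.0, 0.8, (500, 1))
    noise = rng.normal(0.0, 0.3, (500, 6))
    ratings = np.clip(np.round(4.17 + shared + noise), 1, 5)

    print("alpha =", round(cronbach_alpha(ratings), 3))

    # Principal components of the standardized ratings: explained_variance_
    # corresponds to the Eigenvalue row, and the scaled component vectors
    # correspond to the per-element loading columns.
    z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0, ddof=1)
    pca = PCA(n_components=2).fit(z)
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    print("first eigenvalue =", round(pca.explained_variance_[0], 3))
    print("first-component loadings:", np.round(loadings[:, 0], 3))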


REFERENCES

1. Gilman SC, Chang BK, Zeiss RA, Dougherty MB, Marks Jr WJ, Ludke DA, Cox M. The Academic Mission of the Department of Veterans Affairs. In: The Praeger Handbook of Veterans’ Health: History, Challenges, Issues, and Developments. Volume 1: History, Eras, and Global Healthcare. TW Miller (editor). Santa Barbara (CA): Praeger, 2012, pp. 53-82.

2. Cohen JJ, for the Blue Ribbon Panel on VA-Medical School Affiliations. The Report of the Blue Ribbon Panel on VA-Medical School Affiliations: Transforming an Historic Partnership for the 21st Century. Department of Veterans Affairs, 2009.

3. Jha AK, Perlin JB, Kizer KW, Dudley RA. Effect of transformation of the Veterans Affairs health care system on the quality of care. New England Journal of Medicine 2003;348:2218-2227.

4. Keitz SA, Holland GJ, Melander EH, Bosworth HB, Pincus SH. The Veterans Affairs Learners’ Perceptions Survey: The Foundation for Education Quality Improvement. Academic Medicine 2003;78:910-917.

5. Kashner TM, Hinson RS, Holland GJ, Mickey D, Hoffman K, Lind L, Johnson LD, Chang BK, Golden RM, Henley SS. A data accounting system for clinical investigators. Journal of the American Medical Informatics Association 2007;14(4):394-396.

6. Byrne JM, Chang BK, Gilman S, Keitz SA, Kaminetzky C, Aron D, Baz S, Cannon G, Zeiss RA, Kashner TM. The Primary Care-Learners’ Perceptions Survey: Assessing resident perceptions of internal medicine continuity clinics and patient-centered care. Journal of Graduate Medical Education 2013;5(4):587-593.

7. Kashner TM, Hettler DL, Zeiss RA, Aron DC, Bernett DS, Brannen JL, Byrne JM, Cannon GW, Chang BK, Dougherty MB, Gilman SC, Holland GJ, Kaminetzky CP, Wicker AB, Keitz SA. Has interprofessional education changed the attitudes of physicians and other health trainees? A national perspective. Manuscript, 2014.

8. Cannon GW, Keitz SA, Holland GJ, Chang BK, Byrne JM, Tomolo A, Aron DC, Wicker A, Kashner M. Factors determining medical student and resident satisfaction during VA-based training: Results from the VA Learners’ Perceptions Survey. Academic Medicine 2008;83(6):611-620.

9. Kaminetzky CP, Keitz SA, Kashner TM, Aron DC, Byrne JM, Chang BK, Clarke C, Gilman SC, Holland GJ, Wicker A, Cannon GW. Training satisfaction for subspecialty fellows in internal medicine: Findings from the Veterans Affairs (VA) Learners’ Perceptions Survey. BMC Medical Education 2011;11(21):1-9.

10. Cannon GW, Wahlen GE, Kashner TM. Rheumatology fellows report higher satisfaction with VA-based training in comparison to other internal medicine subspecialty fellows. Presented at the National Meeting of the American College of Rheumatology, Philadelphia, PA, October 2009.

11. Lam H-T, O’Toole TG, Arola PE, Kashner TM, Chang BK. Factors associated with the satisfaction of millennial-generation dental residents with their training experience. Journal of Dental Education 2012;76(11):1416-1426.

12. Kashner TM, Henley SS, Golden RM, Byrne JM, Keitz SA, Cannon GW, Chang BK, Holland GJ, Aron DC, Muchmore EA, Wicker A, White H. Studying effects of ACGME duty hours limits on resident satisfaction: Results from the VA Learners’ Perceptions Survey. Academic Medicine 2010;85(7):1130-1139.

