SUPPORTING STATEMENT FOR 2900-0691

VA FORM 10-0439, LEARNERS' PERCEPTIONS SURVEY (LPS)

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this has been conducted previously, include actual response rates achieved.
The following proposed changes in the methods for conducting the LPS are provided in response to OMB's terms of clearance at the time of last approval in August of 2006. Specifically, we were requested to address the following:

• The response rates obtained during the current approval;
• The utility of continuing to conduct the survey as a census rather than selecting a probability sample; and
• In consultation with the steering committee, VA shall assess the feasibility of incorporating a non-response bias analysis into the next version of the survey.

These queries are addressed in turn; our proposed, revised methodology is presented largely under the discussion of item 1.2 below.
The OMB Terms of Clearance for the January 2007 approval requested that VA address the utilization of the proposed trainee registration system for this project. In summary, the "Clinical Trainee Registration System of Records" initiative will automate the registration and appointment processes for VHA's clinical trainees (i.e., physician residents, dentists, nurses, and trainees in psychology, optometry, podiatry, pharmacy, and other associated health disciplines) who rotate through VHA medical facilities. Over 100,000 trainees per year rotate through VA facilities, with highly variable durations of appointment and assignment locations. The appointment and processing requirements of such a large number of trainees strain current human resource systems and personnel capacity. The project also supports a consistent system-wide process to implement Executive Order 10450, Security Requirements for Government Employment, for clinical trainees. The "Clinical Trainee Registration System of Records" is in the Business Requirements Development process, but has not yet been guaranteed funding.
1.1. In response to the first item, the response rates to the current LPS are not available at the time of submission of this request. As a result of changes and restructuring of the VA IT Organization, the VA-wide automated trainee registration system that was mentioned in the prior submission and alluded to in OMB's terms of clearance (i.e., "the new system for employee registration will provide significant benefits for surveying and tracking potential respondents") is not yet available. A request for development has been submitted and is under review for approval and funding by OIT. Moreover, due to changes in the OIT project planning and funding process, a truly functional registration system for trainees is unlikely to be available for the next two to three years.
1.2. Justification of continuing to conduct the LP survey as a census rather than a
probability sample - and proposed changes in methods.

At present, using a sampling methodology is not possible. To reduce respondent burden, we intend to make the survey available to trainees at VA training medical facilities throughout the academic year so they can take the survey as part of their out-processing. The 14,000 responses captured via the census approach provide an adequate sample for the intended purpose of this survey, i.e., to evaluate VA performance at the facility and individual program levels accurately while minimizing respondent burden. However, as soon as the Trainee Registration System is available, we will be able to draw a valid probability sample and compute a response rate.

a. Overall design
The rationale for using a census approach is that, until a trainee registration system is available, a sample is not possible. As noted previously, the LPS is available starting in September in order to capture non-academic-year associated health trainees; however, in April we increase efforts within VA facilities, through intensified marketing, to increase the population of trainees at VA medical facilities who take the LPS. This approach addresses concerns that: (1) only a small number of trainees may be enrolled at each VA program level; (2) every VA medical center and program must be evaluated as part of their performance assessment; and (3) trainees are offered an opportunity to provide feedback to faculty preceptors, mentors, and supervising attending physicians. This approach is actually preferred to any type of probability sampling within each program, as such sampling would pose risks to the confidential format that is essential to getting accurate and unbiased responses from trainees. The VA will continue to accept responses from trainees up to four months following March 31.

b. Scope
The LPS survey is designed to provide information at the program level for each VA facility, contingent on the collection of an adequate number of respondents from each training program. The information collected focuses on assessing the current learning environment and providing details on how the facility may improve. The survey results provide (A) a facility-wide summary score on a 100-point scale; (B) specific domain scores that describe aspects of the learning environment; and (C) a breakdown of each domain score into specific elements that detail where the facility failed and succeeded in producing a high-quality learning environment for the trainee. The use of specified elements is in lieu of open-ended questions, which we have found are: (i) unreliable; (ii) difficult to analyze with current resources in the absence of unstructured text analysis technology; (iii) time-consuming for the respondent and thus often left unanswered; and (iv) not amenable to computing scores that permit comparisons across facilities and over time.
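As a minimal illustration of this score hierarchy, the sketch below rolls hypothetical element responses up into domain scores, and domain scores up into a facility summary score on a 100-point scale. The domain names, responses, and equal weighting are assumptions for illustration only; the actual instrument defines its own elements, and item 3.2 describes how element weights are estimated.

```python
# Sketch of the LPS score hierarchy with hypothetical data.
# Domain names and element responses below are illustrative only.
responses = {
    "clinical_environment": [4, 5, 3, 4],  # 5-point Likert elements
    "learning_environment": [3, 4, 5, 4],
    "working_environment":  [5, 4, 4],
}

def domain_score(elements):
    """Mean element rating rescaled from the 1-5 Likert range to 0-100."""
    mean = sum(elements) / len(elements)
    return (mean - 1.0) / 4.0 * 100.0

domain_scores = {name: domain_score(e) for name, e in responses.items()}

# Equal-weight rollup to a facility-wide summary on a 100-point scale.
facility_score = sum(domain_scores.values()) / len(domain_scores)

print(domain_scores)
print(f"facility summary score: {facility_score:.1f} / 100")
```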

c. Content

Two different LP surveys are provided, one for physician residents and one for
other (associated health) trainees. Both surveys include questions identifying the
respondent's host VA medical facility, program status, level of training, and gender. All
respondents are asked to rate their learning experience at the VA medical facility on a
100-point scale.
LPS survey for Physician Residents: The LPS-Physician Resident survey is intended for physician residents. It contains a pool of elements comprising 11 domains. There are 128 elements, ranging from 6 to 15 elements per domain. Each element and each domain are evaluated on a 5-point Likert scale.
LPS survey for Associated Health trainees: The LPS-Associated Health survey is intended for all non-physician health professionals. It contains a pool of elements comprising 7 domains. There are 96 elements, ranging from 9 to 15 elements per domain. Each element and each domain are evaluated on a 5-point Likert scale.

d. Entering trainees
Office of Academic Affiliations (OAA), with the support of VHA leadership, will instruct local facility management to mount a sustained effort starting during the month of April to identify and contact trainees present at their facilities to complete the LPS. Trainees will be identified through the Associate Chief of Staff for Education or equivalent at each of the training VA medical facilities, who will contact the local program coordinators or site directors. The trainees will be contacted by their program coordinators or site directors and asked to respond to questions on a web-based survey. OAA will supplement and support these activities with online, real-time reporting of facility responses and other marketing materials.

1.3. The question regarding incorporation of a non-response bias analysis into the next version of the survey is addressed in item 3 below.

2. Describe the procedures for the collection of information, including:

a. Statistical methodology for stratification and sample selection

The VA will take a census of all trainees present in VA teaching facilities participating in the survey during the academic year.

b. Estimation procedure

c. Degree of accuracy needed

We propose to survey an all-inclusive population, and hence there is no estimation procedure for sampling. The VA will compute simple means and frequencies of each element, domain, and total hospital score, by program and facility, by VISN, and for all VA. This is needed in order to assess the performance and learning environment at the administrative program level.
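A minimal sketch of this tabulation, assuming responses are available as a flat table; the column names below (visn, facility, program, overall_score) are hypothetical, not the actual LPS data dictionary:

```python
import pandas as pd

# Hypothetical flat extract of LPS responses; column names and values
# are illustrative only.
records = pd.DataFrame({
    "visn":          [1, 1, 1, 2],
    "facility":      ["A", "A", "B", "C"],
    "program":       ["medicine", "surgery", "medicine", "pharmacy"],
    "overall_score": [82, 74, 90, 68],  # 100-point overall rating
})

# Simple means at each administrative level, as described above.
by_program  = records.groupby(["facility", "program"])["overall_score"].mean()
by_facility = records.groupby("facility")["overall_score"].mean()
by_visn     = records.groupby("visn")["overall_score"].mean()
all_va      = records["overall_score"].mean()

# Frequencies (respondent counts) at the same levels.
counts = records.groupby(["visn", "facility", "program"]).size()
```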

In order to increase the response rate and better represent the population, reminder e-mails are sent to the Associate Chief of Staff for Education or equivalent. These e-mails will be accompanied by facility-specific response numbers.

d. Unusual problems requiring specialized sampling procedures
There are no unusual problems associated with the selection of the time period for
collection. The census sample obtained from our administration procedures yields
statistically valid and usable results.

e. Any use of less frequent than monthly data collection to reduce burden
Not applicable. This survey is collected annually. By having the survey available for most of the academic year, the overall response burden is reduced by (a) spreading out data collection, (b) making the survey available when trainees are available, and (c) allowing trainees to respond as they out-process from their clinical training rotations.

3. Describe methods to maximize response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

3.1 Response Rate
A response rate will be computed as the percentage of contacted trainees who responded to the survey. The number of trainees contacted to take the survey is available because both local and national contact lists are maintained. To maximize the number of respondents, the Office of Academic Affiliations will send reminder e-mails to the Associate Chief of Staff for Education or equivalent. These e-mails will be accompanied by facility-specific response numbers.
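A trivial sketch of this computation; the 14,000 figure is the census response count cited in item 1.2, while the contact total is a hypothetical placeholder (actual counts come from the maintained contact lists):

```python
def response_rate(respondents: int, contacted: int) -> float:
    """Percent of contacted trainees who completed the survey."""
    return 100.0 * respondents / contacted

# Hypothetical contact total; ~14,000 completions cited in item 1.2.
print(f"{response_rate(14_000, 100_000):.1f}%")  # -> 14.0%
```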

3.2 Response rate bias
The response rate bias will be computed by comparing characteristics of trainees who completed the survey with those of registrants who did not. We will know the program characteristics of those who did not complete the survey from information supplied when trainees first log in to take the survey but fail to complete it. Further, respondent characteristics will be derived and normalized against similar characteristics in the trainee population as a whole and against trainees completing the survey over the last 5 years.
We will compute the (unadjusted) overall hospital score across all respondents. We will then compute an adjusted overall score to reflect the distribution of all trainees, including both respondents and non-respondents. Scores are adjusted based on an estimated regression in which trainee characteristics serve as independent variables. Trainee characteristics include training program, type, and level, among others. These will be entered as mean-centered values, where means are computed over all trainees.
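A sketch of this adjustment under simplifying assumptions (ordinary least squares, a single illustrative characteristic, synthetic data): because the covariate is centered on the full trainee population's mean, the fitted intercept is the score predicted for a trainee with population-average characteristics, i.e., the adjusted score.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: one trainee characteristic (e.g., training level)
# observed for the full population; scores observed for respondents only.
n_resp, n_all = 500, 2_000
level_all  = rng.integers(1, 6, size=n_all).astype(float)
level_resp = level_all[:n_resp]
score_resp = 70.0 + 2.0 * level_resp + rng.normal(0.0, 8.0, n_resp)

# Center the characteristic on the *population* mean, then fit
# score = b0 + b1 * centered_level by ordinary least squares.
x = level_resp - level_all.mean()
X = np.column_stack([np.ones(n_resp), x])
b0, b1 = np.linalg.lstsq(X, score_resp, rcond=None)[0]

unadjusted = score_resp.mean()  # mean over respondents only
adjusted   = b0                 # predicted at population-average level

print(f"unadjusted: {unadjusted:.2f}  adjusted: {adjusted:.2f}")
```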
The response rate bias is computed by taking the difference between the adjusted and unadjusted scores as a percent of the adjusted score. Statistical significance and confidence intervals will be determined by bootstrapping 10,000 samples. We also compute a robust estimate of the response rate bias by taking the mean response rate bias across all bootstrapped samples. Adjusted scores will not be corrected for facility nesting, since we wish to know how the distribution of non-responses across facilities impacts national performance estimates. The robust estimate of the response rate bias measures the potential damage to national performance estimates when registrants fail to respond to the survey.
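A sketch of the bootstrap computation, continuing the synthetic single-covariate example above (10,000 resamples with replacement, as stated; percentile intervals are an assumption for the CI):

```python
import numpy as np

def bootstrap_bias(scores, levels, pop_mean, n_boot=10_000, seed=1):
    """Bootstrap the response rate bias: (adjusted - unadjusted) as a
    percent of the adjusted score, resampling respondents with
    replacement. Returns the robust (mean) estimate and a 95% CI."""
    rng = np.random.default_rng(seed)
    n = len(scores)
    biases = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample respondents
        s, lv = scores[idx], levels[idx]
        X = np.column_stack([np.ones(n), lv - pop_mean])
        adjusted = np.linalg.lstsq(X, s, rcond=None)[0][0]
        biases[b] = 100.0 * (adjusted - s.mean()) / adjusted
    robust = biases.mean()                       # robust bias estimate
    lo, hi = np.percentile(biases, [2.5, 97.5])  # percentile 95% CI
    return robust, (lo, hi)

# Continuing the previous sketch:
# robust, ci = bootstrap_bias(score_resp, level_resp, level_all.mean())
```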
We also plan to present results that control for response biases. This will enable us to compare results across facilities, across disciplines, and within a facility over time. Such comparisons are necessary to evaluate the progress VA is making toward its clinical professional education mission. This will be accomplished with a differencing variable technique, derived from difference-in-differences analyses, that is designed to control for both observed and unobserved differences between the responding sample and the population. Estimates for each domain will be computed by estimating the following regression:

$$y_{ijk} = \beta_0 + \beta_1 (X_i - \bar{X}) + \beta_2 t_k + \beta_3 F_{ij} + \beta_4^{(j)} t_k F_{ij} + u_i + v_j + z_{ijk} \qquad \text{(Eq. 1)}$$

$$y_{ij} = w_0 + w_1 y_{ij1} + w_2 y_{ij2} + \cdots + w_k y_{ijk} + \cdots + w_K y_{ijK} + u^0 \qquad \text{(Eq. 2)}$$

where $y_{ijk}$ is the $k$th element score (for the domain) of respondent $i$ in VA facility $j$, and $y_{ij}$ is the summary domain score (summarizing all $k = 1, \ldots, K$ elements within the domain). $X_i$ is the characteristics of respondent $i$ and $\bar{X}$ is the mean characteristics of the reference respondent sample. $t_k$ is the continuous differencing variable that ranges from zero to one for the $k$th element of the domain, where $t = 0$ indicates the element should be "common" to all VA (e.g., accessing patient records, where little variation in respondent VA experiences is expected with a nationwide uniform electronic medical record system), and $t = 1$ indicates the response should be very specific to the respondent's particular VA experiences (e.g., interaction with a preceptor or clinical supervisor). $F_{ij}$ is equal to one if respondent $i$ is located in facility $j$, and zero otherwise. The $w$'s are coefficients, normalized so that $w_1 + \cdots + w_K$ equals one, describing how much each element loads onto the domain summary score; the $w$'s are used to weight records by facility and respondent for each domain when estimating the parameters of the model in Eq. 1. The terms $u$, $v$, $z$, and $u^0$ are random variates assumed to be IID normally distributed. We can thus compute an adjusted score for the given domain for facility $j$ from $\beta_4^{(j)}$, the interaction term for the given hospital $j$ relative to the reference group of facilities used to standardize the scores. The reference group of facilities will be all VA facilities in the base year 2009. The reference sample (used to compute $\bar{X}$) consists of respondents at the reference group of VA facilities. This will allow comparisons across facilities. We also plan to use responders to the 2005-2006 surveys as a reference. We will then compute $\beta_4^{(j)}$ for each facility by year (2007, 2008, 2009). This will permit us to compare progress within a facility over time for the past three years. A similar strategy is proposed to compute the overall hospital score.
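A simplified sketch of estimating Eq. 1 follows. Ordinary least squares in place of the full weighted mixed model, a scalar trainee characteristic, a single facility indicator, and all numeric values are assumptions of this illustration, not the production analysis:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic element-level records for Eq. 1 (all values illustrative).
n = 3_000
X_i  = rng.normal(size=n)                        # trainee characteristic
t_k  = rng.uniform(0.0, 1.0, size=n)             # differencing variable
F_ij = rng.integers(0, 2, size=n).astype(float)  # 1 if in facility j
y = (3.5 + 0.2 * X_i + 0.3 * t_k + 0.1 * F_ij
     + 0.4 * t_k * F_ij + rng.normal(0.0, 0.5, size=n))

# Design matrix: intercept, population-centered X, t, F, and t*F.
D = np.column_stack([np.ones(n), X_i - X_i.mean(), t_k, F_ij, t_k * F_ij])
beta = np.linalg.lstsq(D, y, rcond=None)[0]
b0, b1, b2, b3, b4 = beta

# b4 estimates beta_4^(j), the facility-specific t*F interaction that
# carries the facility's adjusted score relative to the reference group.
print(f"beta4 (facility j adjustment): {b4:.3f}")
```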

Response bias, computed as a percent difference, may be approximated by

$$\frac{\bar{y}_j - y}{y} \approx \frac{\hat{\beta}_4^{(j)} \, \bar{F}_j}{\hat{\beta}_0} \qquad \text{(Eq. 3)}$$
Finally, we intend to bootstrap samples (10,000 with replacement) to compute confidence intervals of the estimated performance outcomes.

4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.

Not applicable. The survey instrument has been previously fielded and extensively
tested, with findings published in scientific journals. No new questions have been
added in the present year and the survey instrument is the same as was previously
approved by OMB.

5. Provide the name and telephone number of individuals consulted on statistical
aspects of the design and the name of the agency unit, contractor(s), grantee(s),
or other person(s) who will actually collect and/or analyze the information for the
agency.
Name of agency unit collecting and analyzing the information:

Office of Academic Affiliations (14)
Veterans Health Administration
Department of Veterans Affairs Central Office
Washington, DC 20420
(202) 461-9490
Individual overseeing the analysis of information:

T. Michael Kashner, PhD JD
Director of Program Evaluation,
Office of Academic Affiliations, VHA, Department of Veterans Affairs
(214) 648-4608
Professor, Department of Psychiatry
University of Texas Southwestern Medical Center at Dallas
Dallas, TX
Individual consulted on statistical aspects of the design:
Richard M. Golden, PhD, M.S.E.E.,
Professor, Cognitive Science and Engineering,
School of Behavioral and Brain Sciences, GR4.1
800 West Campbell Road,
University of Texas at Dallas,
Richardson, TX 75083-3021
(972) 883-2423

Individual overseeing the collection of information:
Christopher T. Clarke, PhD
Director, OAA Support Services
Office of Academic Affiliations, VHA, Department of Veterans Affairs
(314) 894-5760


