Supporting Statement Part B


Learner's Perception (LP) Survey

OMB: 2900-0691


SUPPORTING STATEMENT FOR LEARNERS’ PERCEPTIONS SURVEY, Cont’d


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS



1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each strata. Indicate expected response rates. If this has been conducted previously include actual response rates achieved.


The following proposed changes in the methods for conducting the LPS are provided in response to OMB’s terms of clearance at the time of last approval in August of 2006. Specifically, we were requested to address the following:

  • The response rates obtained during the current approval;

  • The utility of continuing to conduct the survey as a census rather than selecting a probability sample; and

  • In consultation with the steering committee, VA shall assess the feasibility of incorporating a non-response bias analysis into the next version of the survey.


These queries are addressed in turn below; our proposed revised methodology is presented largely under the discussion of item 1.2.


1.1. In response to the first item, the response rates to the current LPS are not available at the time of submission of this request. The VA-wide automated trainee registration system that was mentioned in the prior submission, and which was alluded to in OMB’s terms of clearance (i.e., “the new system for employees registration will provide significant benefits for surveying and tracking potential respondents”), is not available due to resource and technical limitations. Moreover, a truly functional registration system for trainees is unlikely to be available within the next two to three years.


1.2. Justification for continuing to conduct the LP survey as a census rather than a probability sample, and proposed changes in methods.


A time-bounded, one-month census is being conducted instead of a sample until the new registration system is available; at present, a sampling methodology is not feasible. To reduce respondent burden, we intend to survey trainees at VA training medical facilities only during the month of March. A VA medical facility is classified as a training facility if it is identified by a VA 3-digit facility code and has at least 20 paid residency slots, as determined by the Office of Academic Affiliations (OAA). This March census will capture an estimated 14% of all trainees (about 13,000) who come through VA training facilities during the year. We believe this single-period assessment, assuming a response rate of about 70% (about 9,000 respondents), will enable us to evaluate VA performance at the facility and individual program levels while significantly reducing respondent burden.



a. Overall design

The rationale for a March census survey is that, until a trainee registration system is available, a probability sample is not possible. Further sampling within the population of trainees at VA medical facilities during the month-long census would yield too few responses at any one VA facility to provide the granular information needed to guide quality improvement efforts. Thus, VA proposes to continue to conduct a census of all eligible trainees receiving medical training at VA facilities during a one-month period, namely March, chosen to avoid beginning-of-year stressors and end-of-year fatigue. At the facility level, this approach recognizes that: (1) only a small number of trainees may be enrolled at each VA program level; (2) every VA medical center and program must be evaluated as part of its performance assessment; and (3) all trainees should be offered an opportunity to provide feedback to faculty preceptors, mentors, and supervising attending physicians. Furthermore, sampling within each program would jeopardize the confidential format that is essential to obtaining accurate and unbiased responses from trainees.


b. Scope

The LPS is designed to provide information at the program level for each of the VA’s teaching facilities with 20 or more trainees, totaling about 110 facilities. The information collected focuses on assessing the current learning environment and on identifying how the facility may improve. The survey results provide (A) a facility-wide summary score on a 100-point scale; (B) specific domain scores that describe aspects of the learning environment; and (C) a breakdown of each domain score into specific elements that detail where the facility succeeded or fell short in producing a high-quality learning environment for the trainee. Specified elements are used in lieu of open-ended questions, which we have found are: (i) unreliable; (ii) difficult to analyze with current resources in the absence of unstructured-text analysis technology; (iii) often left unanswered; and (iv) not amenable to computing scores that permit comparisons across facilities and over time.


c. Content

Two different LP surveys are provided, one for physician residents and one for associated health trainees. Both surveys include questions identifying the respondent’s host VA medical facility, program status, level of training, and gender. All respondents are asked to summarize their learning experience at the VA medical facility on a 100-point scale.


LPS survey for Physician Residents: The LPS-Physician Resident survey is intended for physician residents. It contains a pool of 128 elements organized into 11 domains, with 6 to 15 elements per domain. Each element and each domain is evaluated on a 5-point Likert scale.


LPS survey for Associated Health trainees: The LPS-Associated Health survey is intended for all non-physician health professionals. It contains a pool of 96 elements organized into 7 domains, with 9 to 15 elements per domain. Each element and each domain is evaluated on a 5-point Likert scale.

d. Identifying and contacting trainees

The Office of Academic Affiliations (OAA), with the support of VHA leadership, will instruct local facility management to mount a sustained effort during the month of March to identify and contact trainees present at their facilities to complete the LPS. Trainees will be identified through the Associate Chief of Staff for Education, or equivalent, at each VA training facility, who will contact the local program coordinators or site directors. The program coordinators or site directors will then ask trainees to respond to a web-based survey. The Associate Chief of Staff for Education or equivalent will also provide a total count of trainees, by training program type and level, present at the facility during the month of March, to allow determination of the response rate.



1.3. Regarding the feasibility of incorporating a non-response bias analysis into the next version of the survey, see item 3 below.



2. Describe the procedures for the collection of information, including:


a. Statistical methodology for stratification and sample selection

The VA will take a census of all trainees present at participating VA teaching facilities during a one-month period (March). There is no sampling; all trainees present during the month of March will be asked to respond to the survey.


b. Estimation procedure

We propose to survey an all-inclusive population; hence there is no estimation procedure for sampling. The VA will compute simple means and frequencies of each element, each domain, and the total hospital score, by program and facility, by VISN, and for the VA as a whole. These statistics are needed to assess the performance and learning environment at the administrative program level.


c. Degree of accuracy needed

To increase the response rate so that results better represent the population, reminder e-mails are sent to the Associate Chief of Staff for Education or equivalent, accompanied by facility-specific response rates. Assessments should reflect the overall assessments of registered trainees, at the program and facility levels.


d. Unusual problems requiring specialized sampling procedures

There are no unusual problems associated with the selection of the time period for collection. As noted, March avoids beginning-of-academic-year stressors and end-of-year fatigue.


e. Any use of less frequent than monthly data collection to reduce burden


Not applicable. The survey is administered annually, only during the month of March.


3. Describe methods to maximize response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

3.1 Response Rate

The expected response rate is 70%. The response rate will be computed by determining the percent of trainees at the training VA medical facilities who responded to the web-based questionnaire. The program type and level of respondents will be compared to the trainee population. Separate response rates will be determined for physician resident and for associated health trainee respondents. To maximize the response rate, the Office of Academic Affiliations will send reminder e-mails to the Associate Chief of Staff for Education or equivalent. These e-mails will be accompanied by facility-specific response rates.
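The response-rate computation described above is straightforward bookkeeping. As an illustration only (the function name and data layout below are hypothetical, not part of any VA system), the facility- and program-level rates could be computed as:

```python
from collections import defaultdict

def response_rates(roster, respondents):
    """Compute response rates per (facility, program type) group.

    roster      -- one (facility, program_type) tuple per trainee reported
                   by the Associate Chief of Staff for Education
    respondents -- the same tuples, one per completed web survey
    """
    totals = defaultdict(int)
    completed = defaultdict(int)
    for key in roster:
        totals[key] += 1
    for key in respondents:
        completed[key] += 1
    # completed is a defaultdict, so groups with no responses yield 0.0
    return {key: completed[key] / totals[key] for key in totals}
```

Separate rates for physician residents and associated health trainees fall out of the same grouping by keying on program type.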


3.2 Response rate bias

The response rate bias will be computed by comparing characteristics of trainees who completed the survey with those of registrants who did not. The program characteristics of non-respondents will be known from information supplied by the Associate Chief of Staff for Education or equivalent at each VA medical training facility.

Specifically, we will compute the (unadjusted) overall hospital score across all respondents. We will then compute an adjusted overall score that reflects the distribution of all trainees, including both respondents and non-respondents. Scores are adjusted using an estimated regression in which trainee characteristics serve as independent variables. Trainee characteristics include training program type and level, among others. These will be entered as mean-centered values, where means are computed over all trainees. The response rate bias is the difference between the adjusted and unadjusted scores, expressed as a percent of the adjusted score. Statistical significance and confidence intervals will be determined by bootstrapping 10,000 samples. We will also compute a robust estimate of the response rate bias by taking the mean response rate bias across all bootstrapped samples. Adjusted scores will not be corrected for facility nesting, since we wish to know how the distribution of non-responses across facilities impacts national performance estimates. The robust estimate of the response rate bias measures the potential distortion of national performance estimates when registrants fail to respond to the survey.
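The adjustment procedure above can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the evaluation team's actual code: it assumes an ordinary least squares adjustment with numerically coded trainee characteristics, and the function name and data layout are ours. Mean-centering the covariates at the all-trainee means makes the regression intercept the score adjusted to the full-trainee distribution. The text specifies 10,000 bootstrap replicates; the default here is smaller for speed.

```python
import numpy as np

def response_rate_bias(scores, x_resp, x_all, n_boot=1000, seed=0):
    """Adjusted-vs-unadjusted non-response bias, with bootstrap CI.

    scores -- (n_resp,) overall hospital scores for respondents
    x_resp -- (n_resp, k) characteristics (program type, level, ...)
              coded numerically, for respondents
    x_all  -- (n_all, k) the same characteristics for ALL trainees,
              respondents and non-respondents alike
    """
    rng = np.random.default_rng(seed)
    # Mean-center covariates at the all-trainee means: the intercept of
    # the fitted regression then estimates the score adjusted to the
    # full-trainee distribution of characteristics.
    xc = x_resp - x_all.mean(axis=0)
    design = np.column_stack([np.ones(len(scores)), xc])
    beta, *_ = np.linalg.lstsq(design, scores, rcond=None)
    unadjusted = scores.mean()
    adjusted = beta[0]
    bias_pct = 100.0 * (adjusted - unadjusted) / adjusted

    # Bootstrap respondents for a confidence interval and a "robust"
    # bias estimate (mean bias across bootstrap replicates).
    n = len(scores)
    boot = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)
        b, *_ = np.linalg.lstsq(design[idx], scores[idx], rcond=None)
        boot[i] = 100.0 * (b[0] - scores[idx].mean()) / b[0]
    ci = (np.percentile(boot, 2.5), np.percentile(boot, 97.5))
    return bias_pct, boot.mean(), ci
```

When respondents have the same characteristic distribution as all trainees, the centered covariates average to zero, the intercept equals the raw mean, and the estimated bias is zero, which is the intended sanity check for the procedure.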


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


Not applicable. The survey instrument has been previously fielded and extensively tested over the past 5 years. No new questions have been added in the present year and the survey instrument is the same as was previously approved by OMB.



5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Name of agency unit collecting and analyzing the information:


Office of Academic Affiliations (14)

Veterans Health Administration

Department of Veterans Affairs Central Office

Washington, DC 20420

(202) 357-4010


Individual overseeing the analysis of information:


Barbara K. Chang, MD MA

Director, Program Evaluation & Acting Director, Graduate Medical Education

Office of Academic Affiliations, VHA, Department of Veterans Affairs

(505) 256-6425


Individual consulted on statistical aspects of the design:


T. Michael Kashner, PhD JD

Director, Division of Health Services Research and Professor of Psychiatry

Department of Clinical Sciences

University of Texas Southwestern Medical Center at Dallas

Associate Director, Program Evaluation

Office of Academic Affiliations, VHA, Department of Veterans Affairs

(214) 648-4608


Individual overseeing the collection of information:


Christopher T. Clarke, PhD

Director, OAA Support Services

Office of Academic Affiliations, VHA, Department of Veterans Affairs

(314) 894-5760 x66764



