Survey of Veteran Enrollees' Health and Reliance Upon VA

OMB: 2900-0609


SUPPORTING STATEMENT FOR OMB CONTROL NUMBER 2900-0609, CONTINUED


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


The overall survey methodology is described below. VA has worked with its contractor to enhance the coding of case-level outcomes to better understand reasons for non-response and to conduct research that may include tracing Veterans with inadequate contact information, increasing the number of call attempts, and testing strategies for identifying and handling Veterans in institutions. The findings from these studies are discussed in more detail below and have driven enhancements to the survey methodology.


  1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this has been conducted previously include actual response rates achieved.


The Survey of Enrollees universe to be sampled is the population of enrollees specified at a given point in time. For example, the approximately 9.2 million living enrollees as of September 2010 were the population of interest for the 2011 survey.


The final sample of enrollees responding to the Survey of Enrollees must pass through many stages:

  • The September 2010 VHA enrollment file contained 9,216,388 records;

  • After dropping “cost-only,” deceased/cancelled, and declined/not-eligible enrollees, the VHA enrollment file contained 8,431,166 records (the remaining records are known as “current” enrollees);

  • After dropping missing priority, 8E and 8G (income too high), and those with invalid address information, the VHA enrollment file contained 7,895,108 records;

  • To be in the final sample of respondents, an enrollee must be in the sampling frame—meaning that contact information and all stratification variables are available;

  • Then, the enrollee must be sampled via the stratified random selection process;

  • Next, the enrollee’s contact information must be valid and lead to the correct enrollee; and

  • Finally, the enrollee must elect to respond to the survey.


The only stage that is a controlled random process—and, therefore, not subject to potential bias—is the random sample selection. For previous surveys, the sampling frame of enrollees had been stratified into 294 strata based on VISN (21), enrollee type (two: pre or post), and priority group (seven: 1-6, 7/8). The target number of interviews for each combination was 200 for Priority Groups 1 through 6 and 400 in Priority Group 7/8. To increase the data utility for OEF/OIF/OND (Operation Enduring Freedom/Operation Iraqi Freedom/Operation New Dawn) Veterans, VHA added strata based on OEF/OIF/OND status. The stratification and sample allocation have been modified to achieve target OEF/OIF/OND sample sizes for each VISN (200) as well as the target sample sizes for the original 294 combinations of VISN, enrollee type, and priority group.


About 7.6 percent of the survey-eligible enrollee population was not eligible to be in the sampling frame in 2011 due to incomplete telephone information or incomplete stratification information—slightly lower than the 8.8 percent ineligible in 2010. The frame information has improved markedly over time: 25 percent of records were ineligible in 2005, 27 percent in 2007, and only 12 percent in 2008. A telephone number may be missing from the record entirely, missing digits, or lack a valid area code and exchange (prefix) combination. The improved frame had contact information for a higher percentage of enrollees and reduced the risk of bias due to incomplete coverage.

A random sample of approximately 125 enrollees in each of priority categories 1-8, for the pre and post enrollees, in each of 21 health care networks is expected to yield an optimally stratified sample of approximately 42,000. The attached 2011 stratification table shows the potential priority group definitions and stratifications as well as the projected universe and sample size by strata, based on the 2011 Survey of Enrollees.

For the 2011 Survey of Enrollees, the VA contractor obtained a 40 percent response rate with the post-enrollee sample and a 44 percent response rate with the pre-enrollee sample. The cooperation rate in 2011 (defined as the proportion of completed interviews to eligible contacted respondents) was 76 percent with the post-enrollee sample and 75 percent with the pre-enrollee sample. We expect that the implementation of a multi-mode survey strategy will improve both response and cooperation rates.

To achieve the goal of 200 OEF/OIF/OND enrollees per VISN, we stratified each VISN into one of four OEF/OIF/OND size strata. Grouping the VISNs into size strata achieves the sample size targets for each VISN while adding as few new strata as possible (for operational efficiencies). To achieve approximately 200 per VISN, we allocated 4,600 interviews to the OEF/OIF/OND sample. In each survey year, VHA evaluates the need to continue to stratify the sample by pre- and post-enrollees, as well as the need to oversample OEF/OIF/OND enrollees.
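As a concrete illustration of the stratified random selection described above, the following Python sketch draws a fixed target from each stratum without replacement. The stratum labels, frame sizes, and targets are hypothetical; this is a minimal sketch under those assumptions, not the contractor's implementation.

```python
import random
from collections import defaultdict

def stratified_sample(frame, targets, seed=2011):
    """Select a stratified random sample without replacement.

    frame   -- list of (enrollee_id, stratum) pairs
    targets -- dict mapping stratum -> number of enrollees to draw
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    by_stratum = defaultdict(list)
    for enrollee_id, stratum in frame:
        by_stratum[stratum].append(enrollee_id)

    sample = {}
    for stratum, target in targets.items():
        pool = by_stratum[stratum]
        # Draw min(target, pool size) so a small stratum is fully taken.
        sample[stratum] = rng.sample(pool, min(target, len(pool)))
    return sample

# Hypothetical frame: two strata with different targets.
frame = [(i, "VISN1/pre/PG1") for i in range(500)] + \
        [(i + 500, "VISN1/pre/PG7-8") for i in range(1000)]
targets = {"VISN1/pre/PG1": 125, "VISN1/pre/PG7-8": 250}
sample = stratified_sample(frame, targets)
print({s: len(ids) for s, ids in sample.items()})
```

Because each stratum is sampled independently at its own rate, every record within a stratum has the same, known selection probability, which is what makes the design-weight calculation in Section B.3 possible.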


2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose in the proposed justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The stratification and sample selection methodology described below was employed in the 2011 Survey of Enrollees and is expected to be used going forward.


For the 2011 survey, VHA provided its contractor with a sample of 420,011 records from its database of enrollees as follows:

  • VHA extracted the entire universe of enrollees who were listed as of September 30, 2010—this list includes both institutionalized and non-institutionalized Veterans enrolled in VA health care.

  • VHA eliminated all records that are:

    • Lacking a valid address,

    • Not in the U.S. or Puerto Rico, and

    • Missing one of the stratification variables.

  • The file of enrollees was stratified by OEF/OIF/OND status, pre/post-enrollee status, priority group, and VISN.


From this sample, the contractor selected a stratified random sub-sample. For surveys prior to 2008, the sampling frame of enrollees had been stratified into 294 strata based on VISN (21), enrollee type (two: pre or post), and priority group (seven: 1-6, 7/8). The target number of interviews for each combination was 100 for Priority Groups 1 through 6 and 400 in Priority Group 7/8. To increase the data utility for OEF/OIF/OND, VHA added additional strata based on OEF/OIF/OND status in 2008 and repeated this in 2010 and 2011. The stratification and sample allocation was based on achieving target OEF/OIF/OND sample sizes for each VISN (200) and the target sample sizes for the original 294 combinations of VISN, enrollee type, and priority—which were modified as follows:

  • Each of the 42 combinations of VISN (21) by type (two: pre or post) was allocated a sample size of 1,000.

  • Each sample of 1,000 was allocated to the priority groups based on these rules:

    • At least 125 interviews for Priority Groups 1 through 5.

    • At least 250 for Priority Group 7/8.

    • Priority Group 6 was sampled at the same rate as Priority Group 7/8. For analytic purposes, Priority Group 6 was combined with Priority Group 7/8. Having Priority Group 6 in proportion to Priority Group 7/8 generally resulted in higher precision for combined estimates. However, the number of enrollees in Priority Group 6 was much smaller than Priority Group 7/8, so a separate stratum was maintained to ensure this group was proportionally represented.

  • Finally, the remaining unallocated sample after steps one through three was allocated proportionally to Priority Groups 1 through 5.
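The floor-then-proportional allocation rules above can be sketched in Python. The frame counts below are hypothetical and the rounding is simplified, so this only illustrates the logic of the four rules, not the contractor's exact procedure.

```python
def allocate_strata(frame_counts, total=1000, floor_1_5=125, floor_78=250):
    """Allocate a VISN-by-type sample of `total` across priority groups.

    frame_counts -- dict: priority group -> enrollee count in the frame
                    (keys '1'..'5', '6', '7/8'); illustrative numbers only.
    """
    # Rule 1 and 2: minimums for Priority Groups 1-5 and 7/8.
    alloc = {pg: floor_1_5 for pg in ("1", "2", "3", "4", "5")}
    alloc["7/8"] = floor_78
    # Rule 3: Priority Group 6 is sampled at the same rate as 7/8.
    rate_78 = floor_78 / frame_counts["7/8"]
    alloc["6"] = round(rate_78 * frame_counts["6"])
    # Rule 4: remaining sample goes to Priority Groups 1-5 in
    # proportion to their frame counts.
    remaining = total - sum(alloc.values())
    size_1_5 = sum(frame_counts[pg] for pg in ("1", "2", "3", "4", "5"))
    for pg in ("1", "2", "3", "4", "5"):
        alloc[pg] += round(remaining * frame_counts[pg] / size_1_5)
    return alloc

# Hypothetical frame counts for one VISN-by-type combination.
counts = {"1": 20_000, "2": 15_000, "3": 25_000, "4": 5_000,
          "5": 35_000, "6": 4_000, "7/8": 40_000}
alloc = allocate_strata(counts)
print(alloc, sum(alloc.values()))
```

With these illustrative counts, the Priority Group 6 allocation works out to 25 (the 7/8 rate of 250/40,000 applied to 4,000 enrollees), and the 100 remaining interviews are spread over Priority Groups 1-5 proportionally.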


To achieve the goal of 200 OEF/OIF/OND enrollees per VISN, the contractor stratified each VISN into one of four OEF/OIF/OND size strata. Grouping the VISNs into size strata achieved the sample size targets for each VISN while adding as few new strata as possible (for operational efficiencies). To achieve approximately 200 per VISN, ICF International (the contractor that conducts the survey) allocated 4,875 interviews to the OEF/OIF/OND sample.


The stratification and sample allocation is as follows:


Stratification and Sample Allocation, OEF/OIF/OND

  Group   VISNs                        Target
  1       2, 3, 5, 10                   1,100
  2       1, 11, 12, 15, 18, 19, 21     1,525
  3       4, 6, 9, 17, 20, 23           1,350
  4       7, 8, 16, 22                    900
  TOTAL                                 4,875



After stratifying the OEF/OIF/OND enrollees into one of these groups, the contractor calculated the expected distribution of the OEF/OIF/OND sample in each VISN, enrollee type, and priority group strata. Then, to calculate the non-OEF/OIF/OND sample size targets, the contractor subtracted the expected OEF/OIF/OND sample from the overall targets.
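The subtraction step above can be sketched as follows; the stratum labels and counts are hypothetical, and targets are floored at zero in case the expected OEF/OIF/OND yield already exceeds a stratum's overall target.

```python
def non_oef_targets(overall_targets, expected_oef):
    """Subtract the expected OEF/OIF/OND completes from each
    stratum's overall target to get the non-OEF/OIF/OND target."""
    return {s: max(overall_targets[s] - expected_oef.get(s, 0), 0)
            for s in overall_targets}

# Hypothetical stratum labels and counts.
overall = {"VISN1/pre/PG1": 125, "VISN1/pre/PG7-8": 250}
expected = {"VISN1/pre/PG1": 15}
trimmed = non_oef_targets(overall, expected)
print(trimmed)
```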


For a sample size of approximately 42,000, we expect survey estimates based on the total sample to have error margins of approximately +/-0.5 percentage points at the 95 percent confidence level. For each priority level combining pre1 and post enrollees within VISN, with a sample size of approximately 200, we expect survey estimates to have error margins in the range of approximately +/-7 percentage points at the 95 percent confidence level. Confidence interval projections are based on measuring a population percentage equal to 50 percent. These projections do not account for sample design effects, which may increase the actual error margins for the survey estimates. VA will provide the contractor a list of enrollees from which to draw.
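These error-margin projections follow the standard half-width formula for a proportion, z * sqrt(p(1-p)/n), with p = 0.5 and z = 1.96, and ignore design effects as noted above. A short check in Python:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95 percent confidence interval for a
    proportion, ignoring sample design effects."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(42_000) * 100, 2))  # ~0.48 percentage points
print(round(margin_of_error(200) * 100, 1))     # ~6.9 percentage points
```

These reproduce the approximately +/-0.5 and +/-7 percentage-point margins cited in the text; p = 0.5 is the conservative choice, since it maximizes p(1-p).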


3. Describe methods used to maximize the response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


VA’s contractor conducted a non-response bias analysis for the 2005 SoE. One of the recommendations resulting from this analysis was a propensity score weighting adjustment. This weighting adjustment, also used in 2007 and in 2008, corrects for the differential non-response by health utilization and demographic information. To determine the adjustment, the contractor:

  • Estimates an enrollee’s individual propensity (or probability) to respond with a probability model (described below);

  • Groups enrollees into five equal-size classes (or quintiles) with similar estimated probabilities; and

  • Within each class, weights the respondents up to account for the non-respondents.


The propensity score weighting adjustment reduces potential bias to the extent that non-respondents and respondents with similar response probabilities are also similar with respect to the survey statistics of interest. During the 2007 Survey of Enrollees, enrollees were only sampled from a frame of enrollees with telephone numbers. Enrollees without telephone numbers had no chance of selection—thereby introducing coverage error. Therefore, the 2007 survey was susceptible to two forms of bias, coverage and non-response. For that reason, two separate propensity score adjustments were developed: one for non-response and another for frame coverage.


For the 2008 survey, the survey sample of enrollees was selected from a frame of enrollees with and without telephone numbers. Since the sample was selected from a complete frame of enrollees, coverage bias was not a concern. However, non-response (including ineligible contact information) remained a concern. Therefore, a single propensity score adjustment focused on mitigating non-response bias. The 2010 sample was modeled on the 2008 sampling plan. Therefore, the contractor used the same weighting methodology as in 2008. Similar to 2008 and 2010, the sample for 2011 was selected from a frame of enrollees with and without telephone numbers. Therefore, we use one adjustment for non-response.


Design Weights

Prior to calculating the non-response adjustment, the contractor adjusted for differential selection probabilities. Because the sample (n) was selected from the full list of enrollees, the probability of selection, Pr, can be calculated for each record. The inverse of this selection probability is the design weight, w1 = 1/Pr. The design weights were used to calculate the non-response adjustment.
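Under the stratified design described in Section B.2, the selection probability within a stratum is simply the stratum sample size divided by the stratum frame size, so w1 can be computed as below. The stratum sizes are hypothetical.

```python
def design_weight(frame_size, sample_size):
    """Design weight w1 = 1/Pr(selection) for a stratum in which
    sample_size enrollees are drawn from frame_size in the frame."""
    pr_selection = sample_size / frame_size
    return 1.0 / pr_selection

# Hypothetical stratum: 125 sampled out of 40,000 enrollees,
# so each respondent represents 320 enrollees in the frame.
print(design_weight(40_000, 125))  # 320.0
```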


Non-response Adjustment

To calculate the non-response adjustment, each sampled respondent was classified into a non-response category (y) based on whether the interview was a complete or an incomplete interview. Using logistic regression, the contractor estimated the probability that an enrollee completed the interview given his or her characteristics. VHA provided a file based on administrative records that indicated if an enrollee had utilized services in the previous year (the file did not indicate the frequency of use or amount paid for any of these benefits).


The utilization indicators have been used for weighting since the 2007 survey. From 2007-2010, the indicators were sourced from VHA workload files based on bed section and clinic stop. This categorization indicates where a Veteran received care. For the 2011 survey, the indicators were based on service utilization from Health Service Categories (HSCs). The categorization indicates what care a Veteran received. A second change in 2011 was to include institutional and non-institutional long-term care indicators. From 2007-2010, this indicator was a single measure of home health service.


For this modeling, design weights equal to the ratio of the frame total to the sample total in each stratum were used. The outcome of the model is the propensity score: the estimated probability that the enrollee is in the final sample of respondents given his or her characteristics (VISN, priority status, enrollee type, age, gender, and service utilization).

After estimating each sampled enrollee’s probability of completing an interview based on the predictor variables, respondents and non-respondents were grouped into quintiles based on their propensity score. Within each quintile, the respondents were ratio-adjusted to account for the non-respondents. The first quintile represents the enrollees with the lowest propensity scores; this means that these enrollees are less likely to be in the final sample of respondents and thus receive the largest weights. The last quintile represents the enrollees with the highest propensity scores; this means that these enrollees are more likely to be in the final sample of respondents and thus receive the smallest weights.
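The ratio adjustment within each quintile is the weighted total of all sampled enrollees (respondents plus non-respondents) divided by the weighted respondent total. Applied to the 2011 weighted totals reported below, this reproduces the published adjustment factors:

```python
def quintile_adjustments(quintiles):
    """Ratio-adjust respondents within each propensity quintile:
    (respondents + non-respondents) / respondents.

    quintiles -- list of (respondent_total, nonrespondent_total)
                 pairs, ordered from lowest to highest propensity.
    """
    return [round((r + nr) / r, 2) for r, nr in quintiles]

# Weighted totals from the 2011 adjustment table.
table = [(225_258, 1_353_373), (413_250, 1_166_082),
         (535_607, 1_043_176), (725_287, 853_883),
         (820_973, 758_220)]
print(quintile_adjustments(table))  # [7.01, 3.82, 2.95, 2.18, 1.92]
```

As described above, the lowest-propensity quintile receives the largest adjustment (7.01) and the highest-propensity quintile the smallest (1.92).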



  Propensity Quintile     Response     Non-Response   Adjustment
  0-20th percentile        225,258        1,353,373         7.01
  20-40th percentile       413,250        1,166,082         3.82
  40-60th percentile       535,607        1,043,176         2.95
  60-80th percentile       725,287          853,883         2.18
  80-100th percentile      820,973          758,220         1.92


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


Questions on earlier Surveys of Enrollees (1999, 2000, 2002, 2003, 2005, 2007, 2008, 2010, and 2011) have been clarified based upon Veteran and interviewer questions and input. Any proposed survey questions will be pre-tested on fewer than 10 Veterans in order to work out any problems with wording or Veterans’ comprehension of questions before full implementation of the survey. The attached VHA enrollee survey instrument includes all necessary modifications to date.


In 2012, VHA began testing a multi-mode format that retains the current CATI telephone survey method while introducing web-based and mailed survey formats. In this initial year, we conducted a feasibility test of these alternative methods with a small sample of enrollees in order to determine any differences in response patterns. Conducting the feasibility test first both ensures that any changes in the results are attributable to true change rather than to mode effects and allows us to assess the quality and cost-effectiveness of CATI-only versus multi-mode survey administration. We expect that a multi-mode survey format will increase response and cooperation rates by accommodating respondents’ preferred mode of response, and will reduce response bias by reaching enrollees who may have different response patterns than those who reply to telephone surveys.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


ICF International, Contractor for the survey (Tel. 802-863-9600)

Andrew Dyer

Randall ZuWallack

126 College Street

Burlington, VT 05401


Milliman Inc.

Kathi Patterson, FSA, MAAA, (Tel. 206-504-5539)

Principal and Consulting Actuary, Contractor

Ed Jhu, Principal and Consulting Actuary, Contractor (Tel. 206-504-5828)


Marybeth Matthews (Tel. 414-384-2000 ext 42359)

VHA Office of the Assistant Deputy Under Secretary for Health for Policy and Planning,

Department of Veterans Affairs

5000 West National Avenue

Milwaukee WI 53295


Karen Lentz (Tel. 414-384-2000 ext 42365)

VHA Office of the Assistant Deputy Under Secretary for Health for Policy and Planning,

Department of Veterans Affairs

5000 West National Avenue

Milwaukee WI 53295


Laura Bowman (Tel. 202-461-7108)

VHA Office of the Assistant Deputy Under Secretary for Health for Policy and Planning,

Department of Veterans Affairs

810 Vermont Avenue, NW

Washington, DC 20420


Cathy Tomczak (Tel. 202-461-7421)

VHA Office of the Assistant Deputy Under Secretary for Health for Policy and Planning,

Department of Veterans Affairs

810 Vermont Avenue, NW

Washington, DC 20420

1 Pre-enrollees are those Veterans who used VA health care prior to the implementation of eligibility reform in 1999.
