Blind Rehabilitation Services Survey
Sampling Methodology Report
Prepared by
Veteran Experience Office
Version 1 May 2024
Executive Summary
Part I – Introduction
  A. Background
  B. Basic Definitions
  C. Application to Veterans Affairs
Part II – Methodology
  A. Target Population and Frame
  B. Sample Size Determination
  C. Stratification
  D. Data Collection Methods
  E. Reporting
  F. Quality Control
  G. Sample Weighting, Coverage Bias, and Non-Response Bias
  H. Quarantine Rules
Part III – Assumptions and Limitations
The Veterans Health Administration (VHA) provides Blind and Visual Impairment Rehabilitation Services to eligible Veterans and active-duty Service members. VA is the first and only national healthcare system to completely and seamlessly integrate rehabilitation services for patients with vision loss into its health benefits. This ensures that patients receive the finest medical and rehabilitative care, as well as cutting-edge assistive technology. The mission of Blind Rehabilitation Service (BRS) is to assist eligible Veterans and active-duty Service members with a visual impairment in developing the skills needed for personal independence and successful reintegration into the community and family environment.
For this survey, VEO partnered with VHA to measure the satisfaction of Veterans who receive Blind Rehabilitation Services directly through VHA.
The goal of service level measurements is three-fold:
To collect continuous customer experience data from Blind Rehabilitation Services patients
To help field staff and the national office identify areas for improvement
To better understand the reasons Blind Rehabilitation Services patients provide positive or negative feedback
The survey questionnaire is brief and contains general Likert-scale (a scale of 1-5 from Strongly Disagree to Strongly Agree) questions to assess patient satisfaction as well as questions assessing the knowledge, speed, and manner of the interaction. These questions have been mapped to the OMB A-11 Customer Experience drivers. After the survey has been distributed, recipients have two weeks to complete the survey. Invitees will receive a reminder email after one week.
The purpose of this document is to define VA’s sampling methodology for selecting potential survey respondents for this study. This survey is conducted via random sampling. The sample size for the Blind Rehabilitation Services survey was determined so that monthly survey estimates achieve a 3.0% margin of error at a 95% confidence level. This report describes the methodology used to conduct the Blind Rehabilitation Services survey. Information about quality assurance protocols, as well as limitations of the survey methodology, is also included in this report.
The Enterprise Measurement and Design team (EMD) within the Veterans Experience Office (VEO) is tasked with conducting transactional surveys of the customer population to measure their satisfaction with the Department of Veterans Affairs’ (VA) numerous benefit services. Its mission is to empower Veterans by rapidly and discreetly collecting feedback on their interactions with such VA entities as the National Cemetery Administration (NCA), Veterans Health Administration (VHA), and Veterans Benefits Administration (VBA). VEO surveys generally entail probability samples which contact only the minimal number of customers necessary to obtain reliable estimates. This information is subsequently used by internal stakeholders to monitor, evaluate, and improve beneficiary processes. Customers are always able to decline participation and can opt out of future invitations. A quarantine protocol is maintained to limit the number of times a customer may be contacted over a period of time across all VEO surveys, in order to prevent survey fatigue.
Surveys issued by EMD are generally brief in nature and present a low burden to customers. A few targeted questions utilize a human-centered design (HCD) methodology, revolving around the concepts of Trust, Ease, Effectiveness, and Emotion. Questions focus on a specific aspect of a service process. Structured questions directly address the pertinent issues regarding each surveyed line of business. The opportunity to volunteer open-ended text responses is provided within most surveys; this open text has been demonstrated to yield a wealth of information. Machine learning tools are used for text classification, ranking by sentiment scores, and screening for homelessness, depression, and similar concerns. Modern survey theory is used to create sample designs which are representative, statistically sound, and in accordance with OMB guidelines on federal surveys.
VEO has been commissioned by VHA to measure the satisfaction and experience of customers with its Blind Rehabilitation Services. VEO proposes to conduct a brief survey of customers who received Blind Rehabilitation Services through VHA. Randomly sampled customers will be contacted through an invitation email. The email will include a link so the survey may be completed using an online interface customized with customer information. The survey itself will consist of a handful of questions built around human-centered design, focusing on such elements as trust, emotion, effectiveness, and ease with the care they received.
Coverage | The percentage of the population of interest that is included in the sampling frame.
Measurement Error | The difference between the response coded and the true value of the characteristic being studied for a respondent.
Non-Response | Failure of some respondents in the sample to provide responses in the survey.
Transaction | The specific time a customer interacts with the VA that impacts the customer’s journey and their perception of VA’s effectiveness in servicing customers.
Response Rate | The ratio of participating persons to the number of contacted persons. This is one of the basic indicators of survey quality.
Sample | A set of data collected and/or selected from a statistical population by a defined procedure.
Sampling Error | Error due to taking a particular sample instead of measuring every unit in the population.
Sampling Frame | A list of units in the population from which a sample may be selected.
Reliability | The consistency or dependability of a measure; in this report, reliability is expressed in terms of the standard error of survey estimates.
This measurement may bring insights and value to all stakeholders at VA. Front-line VA leaders can resolve individual feedback from customers and take steps to improve the customer experience; meanwhile VA executives can receive real-time updates on systematic trends that allow them to make changes.
To collect continuous patient experience data to monitor the relative success of programs designed to improve patient experience with Blind Rehabilitation Services.
To help field staff and the national office identify the needs of the specific population they serve
To better understand why Blind Rehabilitation Services patients provide positive or negative feedback
The target population of the Blind Rehabilitation Services survey is all Blind Rehabilitation Services patients who receive care directly through VHA, whether in an inpatient, outpatient, or telehealth setting.
For a given margin of error and confidence level, the sample size is calculated as follows (Lohr, 1999). For a large population, the equation below is used to yield a representative sample size for proportions:

$$ n_0 = \frac{z^2 \, p \, q}{e^2} $$

where

z = 1.96, the critical Z score value under the normal distribution when using a 95% confidence level (α = 0.05).

p = the estimated proportion of an attribute that is present in the population, with q = 1 − p. Note that pq attains its maximum when p = 0.5, and this value is often used for a conservative sample size (i.e., large enough for any proportion).

e = the desired level of precision, also referred to as the margin of error (MOE).
For a population that is relatively small, the finite population correction is used to yield a representative sample size for proportions:

$$ n = \frac{n_0}{1 + n_0 / N} $$

where

n_0 = representative sample size for proportions when the population is large (from the equation above).

N = population size.
The margin of error surrounding the baseline proportion is calculated as:

$$ e = z \sqrt{\frac{p \, q}{n} \cdot \frac{N - n}{N - 1}} $$

where

z = 1.96, the critical Z score value under the normal distribution when using a 95% confidence level (α = 0.05).

N = population size.

n = representative sample size.

p = the estimated proportion of an attribute that is present in the population, with q = 1 − p.
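As an illustration, these calculations can be sketched in Python; the function names are illustrative, and the conservative p = 0.5 is assumed.

```python
import math

Z_95 = 1.96  # critical Z score for a 95% confidence level

def sample_size_large(p: float = 0.5, e: float = 0.03, z: float = Z_95) -> float:
    """Representative sample size for a proportion in a large population: n0 = z^2 * p * q / e^2."""
    q = 1 - p
    return (z ** 2 * p * q) / (e ** 2)

def sample_size_fpc(N: int, p: float = 0.5, e: float = 0.03, z: float = Z_95) -> float:
    """Apply the finite population correction: n = n0 / (1 + n0 / N)."""
    n0 = sample_size_large(p, e, z)
    return n0 / (1 + n0 / N)

def margin_of_error(N: int, n: int, p: float = 0.5, z: float = Z_95) -> float:
    """Margin of error with finite population correction: e = z * sqrt((p*q/n) * (N-n)/(N-1))."""
    q = 1 - p
    return z * math.sqrt((p * q / n) * (N - n) / (N - 1))

# Example using the estimated monthly population from Table 2 (N = 12,618):
print(round(sample_size_large()))                    # ~1,067 completes needed for a 3% MOE, large N
print(round(sample_size_fpc(N=12_618)))              # ~984 completes after the finite population correction
print(round(margin_of_error(N=12_618, n=1_052), 3))  # ~0.029 MOE implied by ~1,052 monthly completes
```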
Table 2 depicts the estimated number of unique Blind Rehabilitation Services patients within a month. Preliminary analysis of this population indicates that approximately 69% of qualifying customers have provided an email address to the VA. Due to the variety of modes and the decentralized nature of service delivery, VEO proposes to conduct a census of available patients. With current estimates, this would result in 12,628 completed surveys from 90,201 invitations per year. To account for potential estimation errors, improvements in email collection, or changes in business volume, we are requesting approval for a maximum of 18,000 completes annually.
Table 2. Monthly Population and Survey Figures
Service Type | Est. Population | Est. Email Population | Available for Invite¹ | Est. Returned Surveys | Expected Response Rate
Outpatient | 4,234 | 4,234 | 3,176 | 445 | 14%
Telehealth | 8,244 | 5,692 | 4,269 | 598 | 14%
Inpatient | 141 | 97 | 72 | 10 | 14%
Total | 12,618 | 10,022 | 7,517 | 1,052 | 14%
¹ Excluding estimated quarantined records (25% loss)
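As a rough consistency check on the annual figures cited above, the following Python sketch scales the monthly totals from Table 2 to a year; the variable names are illustrative.

```python
monthly_email_population = 10_022   # Table 2 total with an email address on file
quarantine_loss = 0.25              # footnote 1: estimated quarantined records
response_rate = 0.14                # Table 2 expected response rate

monthly_invites = monthly_email_population * (1 - quarantine_loss)  # ~7,517 available for invite
annual_invites = monthly_invites * 12                               # ~90,200 invitations per year
annual_completes = annual_invites * response_rate                   # ~12,600 completed surveys per year

print(round(annual_invites), round(annual_completes))
```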
Since the proposed sample is a census, stratification will not be used for this survey.
The population for the survey will be drawn from the two existing electronic health record systems. Due to regulations that do not allow us to survey inpatients who have been selected to receive HCAHPS surveys, the inpatient portion of the sample will be integrated with VEO’s inpatient survey, which is conducted in two waves (non-HCAHPS and post-HCAHPS) on a twice-monthly basis. For the other modalities, a sample will be processed twice weekly in order to reduce the effect of quarantine. VEO data analysts will access the data to download the required fields from records that had a qualifying encounter. Any record with a valid email address (less opt-outs and quarantine) will be invited to take the survey. With the exception of inpatient encounters, selected respondents will be contacted within 5 days of their interaction. They will have 14 days to complete the survey and will receive a reminder after 7 days. Estimates will be accessible to data users immediately on the VSignals platform.
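The twice-weekly selection logic for the outpatient and telehealth modalities can be sketched as follows; the field names and helper structure are illustrative assumptions, not the actual extract schema.

```python
from datetime import date, timedelta
from typing import Iterable

def eligible_for_invite(records: Iterable[dict], run_date: date) -> list[dict]:
    """Filter qualifying encounters down to the records that may be invited on this run.

    Assumed rules from the methodology: a valid email address on file, no opt-out,
    not currently quarantined, and contact within 5 days of the interaction.
    """
    selected = []
    for rec in records:
        if not rec.get("email"):                        # no email address on file
            continue
        if rec.get("opted_out") or rec.get("quarantined"):
            continue
        days_since_encounter = (run_date - rec["encounter_date"]).days
        if 0 <= days_since_encounter <= 5:              # contact within 5 days of the interaction
            rec["reminder_date"] = run_date + timedelta(days=7)       # reminder after 7 days
            rec["survey_close_date"] = run_date + timedelta(days=14)  # 14 days to complete
            selected.append(rec)
    return selected
```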
Table 3. Survey Mode
Mode of Data Collection | Recruitment Method | Time After Transaction | Recruitment Period | Collection Days
Outpatient and Telehealth | Email Recruitment | Within 5 days of interaction | 14 Days (Reminder after 7 Days) | Tuesday & Friday
Inpatient Wave 1 | Email Recruitment | Within 30 days from discharge | 14 Days (Reminder after 7 Days) | Friday
Inpatient Wave 2 | Email Recruitment | Within 86 days from discharge | 14 Days (Reminder after 7 Days) | Friday
Researchers will be able to use the VSignals platform for interactive reporting and data visualization. Trust, Ease, Effectiveness, and Emotion scores can each be observed. The scores may be viewed by various subgroups (e.g., gender) in various charts for different perspectives. They are also depicted in time series plots to investigate trends. Finally, filter options are available to assess scores at varying time periods and within the context of other collected variable information.
Additional internal reporting may be provided through the many reports generated by VEO or through other custom dashboards.
To prevent errors and inconsistencies in the data and the analysis, quality control procedures will be instituted at several steps of the survey process. Records will undergo cleaning during population file creation. The quality control steps are as follows (a sketch of the record-cleaning steps appears after this list).
Records will be reviewed for missing data. When records with missing data are discovered, they will be either excluded from the population file when required or coded as missing.
Any duplicate records will be removed from the population file to both maintain the probabilities of selection and prevent the double sampling of the same customer.
Invalid emails will be removed.
The survey sample loading and administration processes will have quality control measures built into them.
The extracted sample will be reviewed for representativeness. A secondary review will be applied to the final respondent sample.
The survey load process will be rigorously tested prior to the launch of the survey to ensure that sampled customers are not inadvertently dropped or sent multiple emails.
The email delivery process is monitored to ensure that bounce-back records do not hold up email delivery.
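A minimal sketch of the population-file cleaning steps above, assuming a pandas extract with illustrative column names:

```python
import pandas as pd

EMAIL_PATTERN = r"^[^@\s]+@[^@\s]+\.[^@\s]+$"  # simple validity check, not RFC-complete

def clean_population_file(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the cleaning steps described above to a raw population extract.

    Column names ('patient_id', 'encounter_date', 'email') are illustrative assumptions.
    """
    # 1. Exclude records missing required fields; other gaps remain coded as missing (NaN).
    df = df.dropna(subset=["patient_id", "encounter_date"])

    # 2. Remove duplicate records so no customer can be sampled twice for the same encounter.
    df = df.drop_duplicates(subset=["patient_id", "encounter_date"])

    # 3. Remove records with missing or invalid email addresses.
    valid_email = df["email"].fillna("").str.match(EMAIL_PATTERN)
    return df[valid_email].reset_index(drop=True)
```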
A final respondent sample should closely resemble the true population in terms of demographic distributions (e.g., age groups). One problem that arises in the survey collection process is nonresponse, defined as the failure of selected persons in the sample to provide responses. This occurs to varying degrees in all surveys, but the resulting estimates can be distorted when some groups are more or less prone to complete the survey; in many applications, younger people are less likely to participate than older persons. Another problem is under-coverage, in which certain groups of interest in the population are not included in the sampling frame at all. They cannot participate because they cannot be contacted: those without an email address are excluded from the sampling frame. These two phenomena may cause some groups to be over- or under-represented. In such cases, when the respondent population does not match the true population, conclusions drawn from the survey data may not be reliable and are said to be biased.
While we are not currently planning to weight the data, survey practitioners recommend the use of sample weighting to improve inference on the population. Weighting would be introduced into the survey process as a tool that helps the respondent sample more closely represent the overall population. Weighting adjustments are commonly applied in surveys to correct for nonresponse bias and coverage bias. Because a business rule will be implemented requiring callers to provide an email address, the coverage bias for this survey is expected to decrease. In many surveys, however, differential response rates are observed across age groups. If some age groups are over-represented in the final respondent sample, the weighting application will yield somewhat smaller weights for those groups; conversely, age groups that are under-represented will receive larger weights. This adjustment is termed non-response bias correction for a single variable. Strictly speaking, we can never know how non-respondents would have answered, but the aforementioned adjustment calibrates the sample to resemble the full population from a demographic perspective. In the presence of non-negligible non-response bias, this may result in a substantial correction in the weighted survey estimates when compared to the unweighted (direct) estimates.
It was reported earlier that the email population comprises 69% of the full Blind Rehabilitation Services population. This is lower than average, considering that 88% of US Veterans use email (National Telecommunications and Information Administration, 2020), and may introduce some bias if the program has not provided the assistive technology and training needed to navigate email.
When implemented, weighting will utilize cell weights computed in real time with each query on the VSignals platform: each respondent’s weight is obtained by dividing the population target for a cell by the number of respondents in that cell. The weighting scheme will include, where possible, all the variables used for explicit stratification. However, cells will be collapsed if the proportion of the population is insufficient to reliably achieve a minimum of 3 completes per month. As a result, weighting cells may be more detailed for larger population segments. For instance, in the VA, women are a smaller proportion of the population; therefore, women will have more collapsed cells than men.
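A minimal sketch of this cell-weighting calculation, assuming pandas data frames with illustrative column names and a simplified single-pass collapsing rule:

```python
import pandas as pd

MIN_COMPLETES = 3  # cells expected to yield fewer completes per month are collapsed

def cell_weights(respondents: pd.DataFrame, population: pd.DataFrame, cells: list[str]) -> pd.Series:
    """Weight = population target for the cell / number of respondents in the cell.

    Both frames carry the cell variables (e.g. age group, gender); names are illustrative.
    """
    targets = population.groupby(cells).size().rename("target")
    counts = respondents.groupby(cells).size().rename("count")
    cell_table = pd.concat([targets, counts], axis=1).fillna(0)
    cell_table["weight"] = cell_table["target"] / cell_table["count"]

    # Simplified collapse: cells with too few respondents fall back to the overall weight.
    overall = len(population) / len(respondents)
    cell_table.loc[cell_table["count"] < MIN_COMPLETES, "weight"] = overall

    # Attach each respondent's cell weight.
    return respondents.merge(cell_table["weight"], left_on=cells, right_index=True, how="left")["weight"]
```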
As part of the weighting validation process, the weights of persons in age and gender groups are summed and verified to match the universe estimates (i.e., population totals). Additionally, we calculate the unequal weighting effect, or UWE (see Kish, 1992; Liu et al., 2002). This statistic indicates the amount of additional variation that may be expected due to the inclusion of weighting. The unequal weighting effect estimates the percent increase in the variance of the final estimate due to the presence of weights and is calculated as:

$$ \mathrm{UWE} = cv^2 = \left( \frac{s}{\bar{w}} \right)^2 $$

where

cv = coefficient of variation of the weights.

s = sample standard deviation of the weights.

$\bar{w}$ = sample mean of the weights $w_i$.
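A minimal sketch of the UWE calculation from a vector of weights; the example weights are illustrative.

```python
import numpy as np

def unequal_weighting_effect(weights: np.ndarray) -> float:
    """UWE = cv^2 = (s / w_bar)^2, the estimated proportional increase in variance due to weighting."""
    w_bar = weights.mean()
    s = weights.std(ddof=1)  # sample standard deviation
    return (s / w_bar) ** 2

weights = np.array([1.2, 0.8, 1.0, 1.5, 0.9])  # illustrative weights
print(f"Estimated variance increase from weighting: {unequal_weighting_effect(weights):.1%}")
```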
VEO seeks to limit contact with customers as much as possible, and only as needed to achieve measurement goals. The quarantine rules below are enacted to prevent excessive recruitment attempts upon VA’s customers. VEO also monitors participation in other surveys to ensure Veterans and other customers do not experience survey fatigue. All VEO surveys offer options for respondents to opt out and ensure they are no longer contacted for a specific survey.
Table 4. Quarantine Protocol
Quarantine Rule | Description | Elapsed Time
Repeated Sampling for Blind Rehabilitation Services | Number of days between receiving/completing an online survey and receiving another email invitation for the Blind Rehabilitation Services Survey | 30 Days
Other VEO Surveys | Number of days between receiving/completing an online survey and becoming eligible for another VEO survey | 30 Days
Opt Outs | Persons indicating their wish to opt out of either a phone or online survey will no longer be contacted. | N/A
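A minimal sketch of the quarantine check implied by Table 4, using illustrative field names and assuming a cross-survey contact history is available:

```python
from datetime import date, timedelta
from typing import Optional

QUARANTINE_DAYS = 30  # applies to repeat BRS sampling and to other VEO surveys

def may_contact(last_veo_contact: Optional[date], opted_out: bool, run_date: date) -> bool:
    """Return True if a customer may receive a new survey invitation on run_date."""
    if opted_out:                  # opt-outs are never contacted again
        return False
    if last_veo_contact is None:   # no prior VEO survey contact on record
        return True
    return run_date - last_veo_contact >= timedelta(days=QUARANTINE_DAYS)
```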
Since the VEO Blind Rehabilitation Services Survey is email only, there is a population of Blind Rehabilitation Services patients who cannot be reached by the survey due to the lack of an email address or limited access to, or ability to use, assistive technology. Veterans who lack access to the internet or do not use email may have different levels of Trust and satisfaction with their service; this is of particular concern for this population, since empowering those with vision loss is an explicit goal of the services. To verify this, VEO plans to execute a coverage bias study to assess the amount of coverage bias and derive adjustment factors in the presence of non-negligible bias.
Choi, N.G., & Dinitto, D.M. (2013). Internet Use Among Older Adults: Association with Health Needs, Psychological Capital, and Social Capital. Journal of Medical Internet Research, 15(5), e97.
Kalton, G., & Flores-Cervantes, I. (2003). Weighting Methods. Journal of Official Statistics, 19(2), 81-97.
Kish, L. (1992). Weighting for unequal Pi. Journal of Official Statistics, 8(2), 183-200.
Kolenikov, S. (2014). Calibrating Survey Data Using Iterative Proportional Fitting (Raking). The Stata Journal, 14(1), 22-59.
Lohr, S. (1999). Sampling: Design and Analysis. Boston, MA: Cengage Learning.
Liu, J., Iannacchione, V., & Byron, M. (2002). Decomposing design effects for stratified sampling. Proceedings of the American Statistical Association, Section on Survey Research Methods.
National Telecommunications and Information Administration (2020). Digital Nation Data Explorer. https://www.ntia.doc.gov/data/digital-nation-data-explorer#sel=emailUser&demo=veteran&pc=prop&disp=chart
Wong, D.W.S. (1992). The Reliability of Using the Iterative Proportional Fitting Procedure. The Professional Geographer, 44(3), 340-348.