Request for Approval under the “Generic Clearance for Improving
Customer Experience (OMB Circular A-11, Section 280
Implementation)” (OMB Control Number: 2900-0876)
TITLE OF INFORMATION COLLECTION: Military Exposure Survey
PURPOSE
The PACT Act is a new law that expands VA health care and benefits for Veterans
exposed to burn pits, Agent Orange, and other toxic substances. The PACT Act adds to
the list of health conditions that we assume (or “presume”) are caused by exposure to
these substances. This law helps provide generations of Veterans—and their survivors—
with the care and benefits they have earned and deserve. An early stage of the program
is a medical screening of potential candidates for toxic exposure, which began in late
2022. VHA wants to monitor how Veterans experience this screening through a patient
experience survey.
DESCRIPTION OF RESPONDENTS:
The target population of the Military Exposure screening survey is any Veteran shown
in the health record system to have completed a toxic exposure screening in the past
week. The sample frame will exclude Veterans without a valid email address, those who
have been invited to take another VEO survey in the past 30 days, those who have opted
out of receiving VEO surveys, and those with incomplete information.
TYPE OF COLLECTION: (Check one)
[ ] Customer Comment Card/Complaint Form
[ ] Usability Testing (e.g., Website or Software)
[ ] Focus Group
[X] Customer Satisfaction Survey
[ ] Small Discussion Group
[ ] Other: ______________________
CERTIFICATION:
I certify the following to be true:
1. The collection is voluntary.
2. The collection is low-burden for respondents and low-cost for the Federal
Government.
3. The collection is non-controversial and does not raise issues of concern to other
federal agencies.
4. Personally identifiable information (PII) is collected only to the extent necessary and
is not retained.
5. Information gathered is intended to be used for general service improvement and
program management purposes.
6. The collection is targeted to the solicitation of opinions from respondents who have
experience with the program or may have experience with the program in the future.
7. All or a subset of information may be released as part of A-11, Section 280
requirements on performance.gov. Additionally, summaries of the data may be
released to the public in communications to Congress, the media and other releases
disseminated by VEO, consistent with the Information Quality Act.
Name: Sergio Gazaryan, Enterprise Measurement Project Manager, Veterans Experience
Office, VA (603) 203-3167
To assist review, please provide answers to the following questions:
Personally Identifiable Information:
1. Will this survey use individualized links, through which VA can identify particular
respondents even if they do not provide their name or other personally identifiable
information on the survey? [X] Yes [] No
2. Is personally identifiable information (PII) collected? [ ] Yes [X] No
3. If Yes, will any information that is collected be included in records that are subject to
the Privacy Act of 1974? [ ] Yes [ ] No [N/A]
4. If Yes, has an up-to-date System of Records Notice (SORN) been published? [ ] Yes
[ ] No [N/A]
Gifts or Payments:
Is an incentive (e.g., money or reimbursement of expenses, token of appreciation)
provided to participants? [ ] Yes [ X] No
BURDEN HOURS

Category of Respondent      No. of Respondents   Participation Time   Burden
Individuals and Households  120,000              5 minutes            10,000 hours
Totals                      120,000              5 minutes            10,000 hours

Please answer the following questions.
1. Are you conducting a focus group, a survey that does not employ random
sampling, user testing, or any data collection method that does not employ
statistical methods?
Yes X
No __
If Yes, please answer questions 1a-1c, 2 and 3.
If No, please answer or attach supporting documentation that answers questions 2-8.
a. Please provide a description of how you plan to identify your potential group
of respondents and how you will select them.
• The target population of the Military Exposure screening survey is any
Veteran shown in the health record system to have completed a toxic
exposure screening in the past week. The sample frame will exclude
Veterans without a valid email address, those who have been invited to take
another VEO survey in the past 30 days, those who have opted out of
receiving VEO surveys, and those with incomplete information.
b. How will you collect the information? (Check all that apply)
[ ] Web-based or other forms of Social Media
[ ] Telephone
[ ] In-person
[ ] Mail
[X] Other- E-mail-based surveys
c. Will interviewers or facilitators be used? [ ] Yes [X] No
2. Please provide an estimated annual cost to the Federal government to conduct this data
collection: __$13,000______
3. Please make sure that all instruments, instructions, and scripts are submitted with the
request. This includes questionnaires, interviewer manuals (if using interviewers or
facilitators), all response options for questions that require respondents to select a
response from a group of options, invitations given to potential respondents,
instructions for completing the data collection or additional follow-up requests for the
data collection.
• Done
4. Describe (including a numerical estimate) the potential respondent universe and any
sampling or other respondent selection methods to be used. Data on the number of
entities (e.g., establishments, State and local government units, households, or
persons) in the universe covered by the collection and in the corresponding sample are
to be provided in tabular form for the universe as a whole and for each of the strata in
the proposed sample. Indicate expected response rates for the collection as a whole. If
the collection had been conducted previously, include the actual response rate
achieved during the last collection.
• Not applicable.
5. Describe the procedures for the collection of information, including:
a. Statistical methodology for stratification and sample selection.
b. Estimation procedure.
c. Degree of accuracy needed for the purpose described in the justification.
d. Unusual problems requiring specialized sampling procedures.
e. Any use of periodic (less frequent than annual) data collection cycles to reduce
burden.
• Not applicable.
6. Describe methods to maximize response rates and to deal with issues of nonresponse.
The accuracy and reliability of information collected must be shown to be adequate for
intended uses. For collections based on sampling, a special justification must be
provided for any collection that will not yield "reliable" data that can be generalized to
the universe studied.
• Not applicable.
7. Describe any tests of procedures or methods to be undertaken. Testing is encouraged
as an effective means of refining collections of information to minimize burden and
improve utility. Tests must be approved if they call for answers to identical questions
from 10 or more respondents. A proposed test or set of tests may be submitted for
approval separately or in combination with the main collection of information.
• Not applicable.
8. Provide the name and telephone number of individuals consulted on statistical aspects
of the design and the name of the agency unit, contractors, grantees, or other person(s)
who will actually collect or analyze the information for the agency.
• Collection and Analysis:
o Evan Albert, Dir. of Measurement and Data Analytics, Veterans
Experience Office, VA (202) 875-9478
o Sergio Gazaryan, Enterprise Measurement Project Manager,
Veterans Experience Office, VA (603) 203-3167
o Lisa McAndrew, Health Science Specialist, WRII, 973-676-1000
o Joe Salvatore, Executive Assistant, MyVA Task Force, OEI, (202) 280-8403
Military Toxic Exposure Care Survey
Sampling Methodology Report
Prepared by
Veteran Experience Office
Version 1, March 2023
Contents
Executive Summary
Part I – Introduction
  A. Background
  B. Basic Definitions
  C. Application to Veterans Affairs
Part II – Methodology
  A. Target Population and Frame
  B. Sample Size Determination
  C. Stratification
  D. Data Collection Methods
  E. Reporting
  F. Quality Control
  G. Sample Weighting, Coverage Bias, and Non-Response Bias
  H. Quarantine Rules
Part III – Assumptions and Limitations
  A. Coverage Bias
References
Executive Summary
In 2022, the PACT Act was passed and signed into law, expanding Veterans'
benefits for military personnel who suffered toxic exposure while serving. An early
stage of the program is a medical screening of potential candidates for toxic exposure,
which began in late 2022. VHA wants to monitor how Veterans experience this
screening through a patient experience survey.
This report describes the methodology used to conduct the Military Toxic
Exposure Care survey. Information about quality assurance protocols, as well as
limitations of the survey methodology, is also included in this report.
Part I – Introduction
A. Background
The Enterprise Measurement and Design team (EMD) within the Veterans
Experience Office (VEO) is tasked with conducting transactional surveys of the customer
population to measure their satisfaction with the Department of Veterans Affairs' (VA)
numerous benefit services. Its mission is to empower Veterans by rapidly and
discreetly collecting feedback on their interactions with such VA entities as National
Cemetery Administration (NCA), Veterans Health Administration (VHA), and Veterans
Benefits Administration (VBA). VEO surveys generally entail probability samples which
only contact minimal numbers of participants necessary to obtain reliable estimates. This
information is subsequently used by internal stakeholders to monitor, evaluate, and
improve processes. Participants are always able to decline participation and can opt out of
future invitations. A quarantine protocol is maintained to limit the number of times a
customer may be contacted over a period of time across all VEO surveys, in order to
prevent survey fatigue.
Surveys issued by EMD are generally brief in nature and present a low amount of
burden on participants. Structured questions directly address the pertinent issues regarding
the surveyed population. The opportunity to volunteer open-ended text responses is
provided within most surveys. This open text has been demonstrated to yield a wealth of
information. Machine learning tools are used for text classification, ranking by sentiment
scores, and screening for homelessness, depression, etc. Modern survey theory is used to
create sample designs which are representative, statistically sound, and in accordance with
OMB guidelines on federal surveys.
The VHA has developed a screening protocol to ensure that Veterans can be
fairly assessed for toxic exposure and has offered training throughout the VHA health
network to ensure consistent implementation of the protocol. The VHA has asked EMD
to help develop and implement a patient experience survey specifically for this screening
appointment.
Veterans will be selected to participate in the survey via an invitation email. A link
is enclosed so the survey may be completed using an online interface, with customized
participant information. The data will be collected on a weekly basis. The purpose of this
document is to outline the planned sample design and provide a description of the data
collection and sample sizes necessary for proper reporting.
The survey questionnaire is brief. After the survey has been distributed, recipients
have two weeks to complete the survey. Invitees will receive a reminder email after one
week.
B. Basic Definitions

Coverage           The percentage of the population of interest that is included in
                   the sampling frame.
Measurement Error  The difference between the response coded and the true value
                   of the characteristic being studied for a respondent.
Non-Response       Failure of some respondents in the sample to provide responses
                   in the survey.
Transaction        A transaction refers to the specific time a customer interacts
                   with the VA that impacts the customer’s journey and their
                   perception of VA’s effectiveness in servicing participants.
Response Rate      The ratio of participating persons to the number of contacted
                   persons. This is one of the basic indicators of survey quality.
Sample             In statistics, a data sample is a set of data collected and/or
                   selected from a statistical population by a defined procedure.
Sampling Error     Error due to taking a particular sample instead of measuring
                   every unit in the population.
Sampling Frame     A list of units in the population from which a sample may be
                   selected.
Reliability        The consistency or dependability of a measure. Also referred
                   to as standard error.
C. Application to Veterans Affairs
This measurement may bring insights and value to all stakeholders at VA. Front-line
VA staff can resolve individual feedback from participants and take steps to improve
their experience; meanwhile, VA executives can receive real-time updates on systematic
trends that allow them to make changes. The goals of this measurement are:
1) To collect continuous participant experience data to monitor the relative
success of programs designed to improve Military Exposure screening
2) To help field staff and the national office identify the needs of the specific
populations they serve
3) To better understand why veterans provide positive or negative feedback about
Military Exposure screening
Part II – Methodology
A. Target Population and Frame
The target population of the Military Exposure screening survey is any
Veteran shown in the health record system to have completed a toxic exposure
screening in the past week. The sample frame will exclude Veterans without a valid
email address, those who have been invited to take another VEO survey in the past 30
days, those who have opted out of receiving VEO surveys, and those with incomplete
information.
VEO staff will access the data directly from the VHA patient databases, including
the legacy CDW and the onboarding Cerner EHR.
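The frame exclusions above amount to a simple record filter. The sketch below is a hypothetical illustration of that logic; the field names (email, last_veo_invite_days, opted_out, complete) are illustrative and do not reflect the actual VHA database schema.

```python
# Hypothetical sketch of the sample-frame exclusions; field names are
# illustrative assumptions, not the real VHA schema.
def in_sample_frame(record):
    """Return True if a screened Veteran's record stays in the sample frame."""
    if not record.get("email"):                # no valid email address
        return False
    if record.get("opted_out", False):         # opted out of VEO surveys
        return False
    last = record.get("last_veo_invite_days")  # days since last VEO invitation
    if last is not None and last < 30:         # invited in the past 30 days
        return False
    if not record.get("complete", True):       # incomplete information
        return False
    return True
```

Each rule removes a record for exactly one of the stated reasons, so the order of the checks does not affect the resulting frame.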
B. Sample Size Determination
For a given margin of error and confidence level, the sample size is calculated as
below (Lohr, 1999). For population that is large, the equation below is used to yield a
representative sample for proportions:
$$n_0 = \frac{z_{\alpha/2}^2 \, pq}{e^2}$$
where
• z_{α/2} = 1.96, which is the critical Z score value under the normal distribution when
using a 95% confidence level (α = 0.05).
• p = the estimated proportion of an attribute that is present in the population, with
q = 1 - p.
  o Note that pq attains its maximum value when p = 0.5, and this is often used
  for a conservative sample size (i.e., large enough for any proportion).
• e = the desired level of precision; in the current case, the margin of error e = 0.03,
or 3%. Also referred to as MOE.
For a population that is relatively small, the finite population correction is used to
yield a representative sample for proportions:
$$n = \frac{n_0}{1 + n_0/N}$$
where
• n_0 = representative sample size for proportions when the population is large.
• N = population size.
The margin of error surrounding the baseline proportion is calculated as:
$$\text{Margin of error} = z_{\alpha/2} \sqrt{\frac{N-n}{N-1}} \sqrt{\frac{p(1-p)}{n}}$$
where
• z_{α/2} = 1.96, which is the critical Z score value under the normal distribution when
using a 95% confidence level (α = 0.05).
• N = population size.
• n = representative sample size.
• p = the estimated proportion of an attribute that is present in the population, with
q = 1 - p.
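The three formulas above can be computed directly. This is a minimal sketch, using the conservative p = 0.5 and the survey's stated z = 1.96 and e = 0.03 as defaults; the function names are illustrative.

```python
import math

def sample_size_large(p=0.5, e=0.03, z=1.96):
    """n0 = z^2 * p * (1 - p) / e^2, for a large population."""
    return z**2 * p * (1 - p) / e**2

def sample_size_fpc(n0, N):
    """Finite population correction: n = n0 / (1 + n0 / N)."""
    return n0 / (1 + n0 / N)

def margin_of_error(n, N, p=0.5, z=1.96):
    """MOE = z * sqrt((N - n) / (N - 1)) * sqrt(p * (1 - p) / n)."""
    return z * math.sqrt((N - n) / (N - 1)) * math.sqrt(p * (1 - p) / n)
```

At p = 0.5, z = 1.96, and e = 0.03 the large-population size is roughly 1,067 (often rounded up to 1,068); applying the correction for a small population, e.g. N = 4,077 as in the smallest VISN, shrinks the required sample, and plugging the corrected n back into the MOE formula recovers approximately the 3% margin.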
Table 2a depicts the estimated population available and the proposed sample
nationally and for the smallest VISN in the VHA health system. The total sample size
needed was determined by estimating the sample size needed for the smallest VISN to
assure that it achieves a monthly +/- 4% MOE at an 80% confidence level from a
national sample proportionately distributed across VISNs. The resulting sample is
estimated to require 710,000 invites annually, resulting in 117,000 completed surveys.
To account for uncertainty in the response rate, we are requesting clearance for 120,000
completed surveys.
Table 2a. Estimated Monthly Population and Survey Figures

                        Total     Smallest VISN
Total Population        247,755   6,615
Email Population        190,893   5,097
Available Population    152,714   4,077
Target Completes        9,750     260
Estimated Return Rate   16.5%     16.5%
Sample Needed           59,091    1,576
MOE                     +/-1%     +/-4%
Confidence              95%       80%
Sample Rate             38.7%     38.7%
C. Stratification
Stratification is used to ensure that the sample matches the population, to the extent
possible, across sub-populations. For this survey we will rely on implicit stratification,
or balancing, to assure that the targets remain proportional if the population distribution
fluctuates. Balancing variables will include age, gender, and location.
D. Data Collection Methods
The population for the survey will be extracted by VEO every week. Any record with
a valid email address will be included in the sample frame. Email invitations are delivered
to all selected participants. Selected respondents will be contacted within 8 days of their
screening appointment. They will have 14 days to complete the survey. Estimates will be
accessible to data users instantly on the VSignals platform.
Table 3. Survey Mode

Mode of Data Collection   Recruitment Method   Recruitment Period                Collection Days
Online Survey             Email Recruitment    14 Days (Reminder after 7 Days)   Tuesday
E. Reporting
Researchers will be able to use the VSignals platform for interactive reporting and
data visualization. The results may be viewed by various subgroups across a variety of
charts for different perspectives. They are also depicted within time series plots to
investigate trends. Finally, filter options are available to assess scores at varying time
periods and within the context of other collected variable information.
Recruitment is continuous (weekly) but the results from several weeks may be
combined into a monthly, quarterly, or annual estimate for more precise estimates.
F. Quality Control
To ensure the prevention of errors and inconsistencies in the data and the analysis,
quality control procedures will be instituted in several steps of the survey process. Records
will undergo a cleaning during the population file creation. The quality control steps are
as follows.
1. Records will be reviewed for missing data. When records with missing data are
discovered, they will be either excluded from the population file when required or
coded as missing.
2. Any duplicate records will be removed from the population file to both maintain
the probabilities of selection and prevent the double sampling of the same
customer.
3. Invalid emails will be removed.
The survey sample loading and administration processes will have quality control
measures built into them.
1. The extracted sample will be reviewed for representativeness. A secondary review
will be applied to the final respondent sample.
2. The survey load process will be rigorously tested prior to the launch of the
survey to ensure that sampled participants are not inadvertently dropped or sent
multiple emails.
3. The email delivery process is monitored to ensure that bounce-back records will
not hold up the email delivery process.
G. Sample Weighting, Coverage Bias, and Non-Response Bias
A final respondent sample should closely resemble the true population, in terms of
the demographic distributions (e.g. age groups). One problem that arises in the survey
collection process is nonresponse, which is defined as systematic failure of selected
persons in the sample to provide responses. This occurs in various degrees to all surveys,
but the resulting estimates can be distorted when some groups are more or less prone to
complete the survey. In many applications, younger people are less likely to participate
than older persons. Another problem is under-coverage, which is the event that certain
groups of interest in the population are not even included in the sampling frame. They
cannot participate because they cannot be contacted: those without an email address will
be excluded from sample frame. These two phenomena may cause some groups to be
over- or under-represented. In such cases, when the respondent population does not match
the true population, conclusions drawn from the survey data may not be reliable and are
said to be biased.
Weighting adjustments are commonly applied in surveys to correct for
nonresponse bias and coverage bias. In many surveys, however, differential response rates
may be observed across age groups. In the event that some age groups are more
represented in the final respondent sample, the weighting application will yield
somewhat smaller weights for these age groups. Conversely, age groups that are
underrepresented will receive larger weights. This phenomenon is termed non-response
bias correction for a
single variable. Strictly speaking, we can never know how non-respondents would have
really answered the question, but the aforementioned adjustment calibrates the sample to
resemble the full population – from the perspective of demographics. This may result in
a substantial correction in the resulting weighted survey estimates when compared to
direct estimates in the presence of non-negligible non-response bias.
If implemented, weighting will utilize cell weights computed in real time with each
query on the VSignals platform: the weight for each respondent is the target for a cell
divided by the number of respondents in the cell. The weighting scheme will include,
where possible, all the variables used for stratification; however, cells will be collapsed
if the proportion of the population is insufficient to reliably achieve a minimum of 3
completes per month. As a result, weights may be more granular for larger population
segments. For instance, in the VA, women are a smaller proportion of the population.
Therefore, women will have more collapsed cells than men.
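The cell-weighting rule described above (target divided by respondent count per cell) can be sketched as follows. This is an illustrative sketch only; the cell labels and dictionary shapes are assumptions, and the real VSignals computation may differ.

```python
from collections import Counter

def cell_weights(respondent_cells, cell_targets):
    """Weight for each respondent = population target for their cell
    divided by the number of respondents observed in that cell."""
    counts = Counter(respondent_cells)
    return [cell_targets[cell] / counts[cell] for cell in respondent_cells]
```

For example, with targets {"F 18-29": 120, "M 30-44": 300} and respondents in cells ["F 18-29", "M 30-44", "M 30-44"], the weights are [120.0, 150.0, 150.0], and the weighted total (420) reproduces the combined population target, which is the calibration property the validation step checks.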
As part of the weighting validation process, the weights of persons in age and
gender groups are summed and verified that they match the universe estimates (i.e.,
population totals). Additionally, we calculate the unequal weighting effect, or UWE (see
Kish, 1992; Liu et al., 2002). This statistic is an indication of the amount of variation that
may be expected due to the inclusion of weighting. The unequal weighting effect
estimates the percent increase in the variance of the final estimate due to the presence of
weights and is calculated as:
$$\mathrm{UWE} = 1 + cv_{\text{weights}}^2 = 1 + \left(\frac{s}{\bar{w}}\right)^2$$
where
• cv = coefficient of variation of all weights w_ij.
• s = sample standard deviation of the weights.
• w̄ = sample mean of the weights, w̄ = (1/n) Σ_ij w_ij.
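The UWE statistic above is straightforward to compute from a list of weights; a minimal sketch using Python's standard library:

```python
import statistics

def unequal_weighting_effect(weights):
    """UWE = 1 + cv^2 = 1 + (s / w_bar)^2, where s is the sample standard
    deviation of the weights and w_bar is their mean."""
    w_bar = statistics.mean(weights)
    s = statistics.stdev(weights)  # sample (n - 1) standard deviation
    return 1 + (s / w_bar) ** 2
```

Equal weights give UWE = 1 (no variance inflation); the more unequal the weights, the larger the UWE, i.e., the larger the expected percent increase in the variance of the final estimate.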
H. Quarantine Rules
VEO seeks to limit contact with participants as much as possible, and only as
needed to achieve measurement goals. These rules are enacted to prevent excessive
recruitment attempts upon VA’s participants. All VEO surveys offer options for
respondents to opt out, and ensure they are no longer contacted for a specific survey. VEO
also monitors participation within other VEO surveys, to ensure participants do not
experience survey fatigue.
Table 4. Quarantine Protocol

Quarantine Rule        Description                                               Elapsed Time
Repeated Sampling for  Number of days between receiving/completing online        30 Days
Military Exposure      survey, prior to receiving email invitation for Military
Survey                 Exposure Survey
Other VEO Surveys      Number of days between receiving/completing online        30 Days
                       survey and becoming eligible for another VEO survey
Opt Outs               Persons indicating their wish to opt out of either        N/A
                       phone or online survey will no longer be contacted.
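The quarantine rules in Table 4 reduce to a short eligibility check. The sketch below is a hypothetical illustration; the function and parameter names are assumptions, not part of the actual VEO sampling system.

```python
from datetime import date, timedelta

QUARANTINE = timedelta(days=30)  # 30-day rule from Table 4

def eligible_for_invite(last_veo_survey, opted_out, today):
    """Apply the quarantine rules: opt-outs are never contacted; otherwise
    a Veteran is eligible only if at least 30 days have passed since the
    last VEO survey contact (or there was no prior contact)."""
    if opted_out:
        return False
    if last_veo_survey is None:
        return True
    return today - last_veo_survey >= QUARANTINE
```

A Veteran last contacted 45 days ago is eligible again, one contacted 14 days ago is not, and an opt-out is excluded regardless of elapsed time.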
Part III – Assumptions and Limitations
A. Coverage Bias
Since the Military Exposure Survey is email only, there is a substantial population
of qualifying Veterans that cannot be reached by the survey. Veterans who lack access
to the internet or do not use email may have different levels of trust in, and satisfaction
with, their service. For the purposes of this survey, however, it is assumed that Veterans
in this category do not harbor any tangible differences from other program participants
who do share their information.
References

Choi, N. G., & Dinitto, D. M. (2013). Internet use among older adults: Association with
health needs, psychological capital, and social capital. Journal of Medical
Internet Research, 15(5), e97.
Kalton, G., & Flores-Cervantes, I. (2003). Weighting methods. Journal of Official
Statistics, 19(2), 81-97.
Kish, L. (1992). Weighting for unequal Pi. Journal of Official Statistics, 8(2), 183-200.
Kolenikov, S. (2014). Calibrating survey data using iterative proportional fitting
(raking). The Stata Journal, 14(1), 22-59.
Lohr, S. (1999). Sampling: Design and Analysis. Boston, MA: Cengage Learning.
Liu, J., Iannacchione, V., & Byron, M. (2002). Decomposing design effects for stratified
sampling. Proceedings of the American Statistical Association's Section on Survey
Research Methods.
National Telecommunications and Information Administration (2020). Digital Nation
Data Explorer. https://www.ntia.doc.gov/data/digital-nation-data-explorer#sel=emailUser&demo=veteran&pc=prop&disp=chart
Wong, D. W. S. (1992). The reliability of using the iterative proportional fitting
procedure. The Professional Geographer, 44(3), 340-348.
File Type | application/pdf |
File Modified | 2023-03-09 |
File Created | 2023-03-09 |