A-11 Requirements Document
Clearance for A-11 Section 280 Improving Customer Experience Information Collection
OMB: 2900-0876

Request for Approval under the “Generic Clearance for Improving Customer
Experience (OMB Circular A-11, Section 280 Implementation)” (OMB
Control Number: 2900-0876)
TITLE OF INFORMATION COLLECTION: Clinical Contact Center Surveys
PURPOSE
The ECCC (Enterprise Contact Center Council) Clinical Contact Center Survey is designed to measure
customer experience after contacting one of the clinical care contact centers. Contact Center staff are
often the first point of contact for Veterans – our virtual red coats. There are over 200 Contact Centers
in VA answering over 140M calls each year. Currently the Clinical Contact Center leadership team
does not have a clear understanding of the customer experience in most of these centers or the ability
to compare key elements across the call centers.
The Enterprise Contact Center Council is providing more and better tools for VA’s contact centers
through implementation of the Enterprise Contact Center Modernization Operating Plan. Currently
VSignals is used at three contact centers: the NCA Scheduling Office, the VBA National Call Center,
and the White House / VA Hotline.
Veterans experience data for this survey is collected using an online transactional survey
disseminated via an invitation email sent to randomly selected beneficiaries. The data collection occurs
once per week, with invitations being sent out within 8 days of calling the ECCC Clinical Contact
Center. The questionnaire is brief and contains general Likert-scale (a scale of 1-5 from Strongly
Disagree to Strongly Agree) questions to assess customer satisfaction as well as questions assessing the
knowledge, speed, and manner of the interaction. After the survey has been distributed, recipients have
two weeks to complete the survey and will receive a reminder email after one week.
DESCRIPTION OF RESPONDENTS:
The target population of the ECCC Clinical Contact Center Survey is defined as any Veteran
who has received telephonic care through the ECCC Clinical Contact Center in the past weeks.
The sample frame is prepared by extracting population information directly from VHA’s
Corporate Data Warehouse. These extracts are also used to obtain universe figures for the
sample weighting process. The Veteran is the primary sampling unit and is randomly selected
from the population according to a stratified design. The primary stratification will be the type
of contact, which falls into three strata—nurse triage, pharmacy, and Licensed Independent
Provider (LIP). The survey will also utilize implicit stratification, or balancing, by age, gender,
and location.
TYPE OF COLLECTION: (Check one)
[ ] Customer Comment Card/Complaint Form
[X] Customer Satisfaction Survey
[ ] Usability Testing (e.g., Website or Software)
[ ] Small Discussion Group
[ ] Focus Group
[ ] Other: ______________________

CERTIFICATION:
I certify the following to be true:
1. The collection is voluntary.
2. The collection is low-burden for respondents and low-cost for the Federal Government.
3. The collection is non-controversial and does not raise issues of concern to other federal
agencies.
4. Personally identifiable information (PII) is collected only to the extent necessary and is not
retained.
5. Information gathered is intended to be used for general service improvement and program
management purposes.
6. The collection is targeted to the solicitation of opinions from respondents who have
experience with the program or may have experience with the program in the future.
7. All or a subset of information may be released as part of A-11, Section 280 requirements on
performance.gov. Additionally, summaries of the data may be released to the public in
communications to Congress, the media and other releases disseminated by VEO, consistent
with the Information Quality Act.
Name: Evan Albert, Director of Measurement and Data Analytics (Acting), Veterans Experience
Office, [email protected], (202) 875-9478
To assist review, please provide answers to the following question:
Personally Identifiable Information:
1. Will this survey use individualized links, through which VA can identify particular
respondents even if they do not provide their name or other personally identifiable
information on the survey? [ X ] Yes [] No
2. Is personally identifiable information (PII) collected? [ ] Yes [X] No
3. If Yes, will any information that is collected be included in records that are subject to the
Privacy Act of 1974? [ ] Yes [ ] No [N/A]
4. If Yes, has an up-to-date System of Records Notice (SORN) been published? [ ] Yes [ ] No
[N/A]

Gifts or Payments:
Is an incentive (e.g., money or reimbursement of expenses, token of appreciation) provided to
participants? [ ] Yes [ X] No
BURDEN HOURS

| Category of Respondent | VA Form (if applicable) | No. of Respondents | Participation Time (minutes) | Burden (hours, ÷ 60) |
| Individuals & Households | | 8,688 annually | 3 | 434.4 |
| Totals | | 8,688 annually | 3 | 434.4 |

Please answer the following questions.
1. Are you conducting a focus group, a survey that does not employ random sampling,
user testing or any data collection method that does not employ statistical methods?
[ ] Yes
[X] No
If Yes, please answer questions 1a-1c, 2 and 3.
If No, please answer or attach supporting documentation that answers questions 2-8.
a. Please provide a description of how you plan to identify your potential group of
respondents and how you will select them.
b. How will you collect the information? (Check all that apply)
[ ] Web-based or other forms of Social Media
[ ] Telephone
[ ] In-person
[ ] Mail
[X] Other- E-mail-based surveys
c. Will interviewers or facilitators be used? [ ] Yes [ X ] No
2. Please provide an estimated annual cost to the Federal government to conduct this data
collection: __$13,000______
3. Please make sure that all instruments, instructions, and scripts are submitted with the request.
This includes questionnaires, interviewer manuals (if using interviewers or facilitators), all
response options for questions that require respondents to select a response from a group of
options, invitations given to potential respondents, instructions for completing the data
collection or additional follow-up requests for the data collection.
- Done
4. Describe (including a numerical estimate) the potential respondent universe and any
sampling or other respondent selection methods to be used. Data on the number of entities
(e.g., establishments, State and local government units, households, or persons) in the
universe covered by the collection and in the corresponding sample are to be provided in
tabular form for the universe as a whole and for each of the strata in the proposed sample.
Indicate expected response rates for the collection as a whole. If the collection had been
conducted previously, include the actual response rate achieved during the last collection.
- Please see Statistical Sample Plan in the Appendix.
5. Describe the procedures for the collection of information, including:
a. Statistical methodology for stratification and sample selection.
b. Estimation procedure.
c. Degree of accuracy needed for the purpose described in the justification.
d. Unusual problems requiring specialized sampling procedures.
e. Any use of periodic (less frequent than annual) data collection cycles to reduce
burden.
- Please see Statistical Sample Plan in the Appendix.
6. Describe methods to maximize response rates and to deal with issues of nonresponse. The
accuracy and reliability of information collected must be shown to be adequate for intended
uses. For collections based on sampling, a special justification must be provided for any
collection that will not yield "reliable" data that can be generalized to the universe studied.
Please see Statistical Sample Plan in the Appendix.
7. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an
effective means of refining collections of information to minimize burden and improve
utility. Tests must be approved if they call for answers to identical questions from 10 or more
respondents. A proposed test or set of tests may be submitted for approval separately or in
combination with the main collection of information.
Please see Statistical Sample Plan in the Appendix.
8. Provide the name and telephone number of individuals consulted on statistical aspects of the
design and the name of the agency unit, contractors, grantees, or other person(s) who will
actually collect or analyze the information for the agency.
Statistical Aspects:
Mark Andrews, Statistician, Veterans Experience Office, VA, (703) 483-5305
Collection and Analysis:
Evan Albert, Director of Measurement and Data Analytics, Veterans Experience Office, VA, (202) 875-9478
Suzanne Klinker, VISN 8 Deputy Network Director for Clinical Contact Centers, (727) 575-8056
Brian Cieluch, VISN 23, Director, Telephone Call Center, (651) 405-5672

APPENDIX: STATISTICAL SAMPLE PLAN

Service Level Measurements – ECCC Clinical Contact Center Survey
Sampling Methodology Report
Prepared by the Veteran Experience Office
Version 1, June 2020

Executive Summary
The ECCC Clinical Contact Center Survey is designed to measure customer experience
after contacting one of the clinical care contact centers.
Veterans experience data is collected using an online transactional survey
disseminated via an invitation email sent to randomly selected beneficiaries. The data collection
occurs once per week, with invitations being sent out within 8 days of calling the ECCC Clinical
Contact Center. The questionnaire is brief and contains general Likert-scale (a scale of 1-5 from
Strongly Disagree to Strongly Agree) questions to assess customer satisfaction as well as
questions assessing the knowledge, speed, and manner of the interaction. After the survey has
been distributed, recipients have two weeks to complete the survey and will receive a reminder
email after one week.
The overall sample size for the ECCC Clinical Contact Center Survey population is
selected to optimize the reliability of the monthly survey estimates for each survey type given the
amount of available sample, while being conscious of the burden placed on the Veteran. In this
case the largest cohort (Nurse Triage) is targeted to achieve a +/-4% margin of error at a 95%
confidence level. The survey will be sent to a representative sample of Veterans. Once data
collection is completed, the participant responses in the online survey will be weighted.
This report describes the methodology used to conduct the ECCC Clinical Contact
Center Survey. Information about quality assurance protocols, as well as limitations of the survey
methodology, is also included in this report.


Part I – Introduction
A. Background
The Enterprise Measurement and Design team (EMD) is part of the Insights
and Analytics (I&A) division within the Veterans Experience Office (VEO). The EMD
team is tasked with conducting transactional surveys of the Veteran population to measure
their satisfaction with the Department of Veterans Affairs (VA) numerous benefit services.
Thus, their mission is to empower Veterans by rapidly and discreetly collecting feedback
on their interactions with such VA entities as NCA, VHA, and VBA. VEO surveys
generally entail probability samples which only contact minimal numbers of Veterans
necessary to obtain reliable estimates. This information is subsequently used by internal
stakeholders to monitor, evaluate, and improve beneficiary processes. Veterans are always
able to decline participation and have the ability to opt out of future invitations. A
quarantine protocol is maintained to limit the number of times a Veteran may be
contacted, in order to prevent survey fatigue, across all VEO surveys.
The VEO team designed two surveys related to the ECCC Clinical Contact Center.
The standard survey will be administered to beneficiaries who receive services
directly through the Clinical Contact Center (for RN Triage and Pharmacy). The Licensed
Independent Provider (LIP) Survey will be administered to beneficiaries who receive
telephonic care through contracted clinical care specialists.
In order to continue to provide quality services to Veterans, VEO has been commissioned
to measure the satisfaction with the ECCC Clinical Contact Center. To complete this goal,
VEO proposed to conduct a brief transactional survey with selected Veterans who had
received telephonic care through the ECCC Clinical Contact Center. The core survey
consists of eight questions developed using a human-centered design, focusing on
Veterans’ experience with regard to their recent encounter and centered on the factors of
Trust, Ease, Effectiveness, Helpfulness, Quality, and Emotion. These Likert-scale (a scale
of 1-5) questions are designed through extensive Veteran input and recommendations from
subject matter experts in the VA. Veterans also have an opportunity to provide a free-text
response about their experience.¹
Veterans are selected to participate in the survey via an invitation email. A link is
enclosed so the survey may be completed using an online interface, with customized
participant information. The data is collected on a weekly basis and the survey is reported
on a monthly basis. The purpose of this document is to outline the planned sample design
and provide a description of the data collection and sample sizes necessary for proper
reporting.

¹ The LIP Survey differs from the core questionnaire with the exception of a universal Trust question. It
contains an additional Likert-scale question and a question regarding the reason for seeking telehealth
services.

B. Basic Definitions
Coverage – The percentage of the population of interest that is included in the sampling frame.

Measurement Error – The difference between the response coded and the true value of the characteristic being studied for a respondent.

Non-Response – Failure of some respondents in the sample to provide responses in the survey.

Transaction – The specific time a Veteran interacts with the VA that impacts the Veteran’s journey and their perception of VA’s effectiveness in caring for Veterans.

Response Rate – The ratio of participating persons to the number of contacted persons. This is one of the basic indicators of survey quality.

Sample – A set of data collected and/or selected from a statistical population by a defined procedure.

Sampling Error – Error due to taking a particular sample instead of measuring every unit in the population.

Sampling Frame – A list of units in the population from which a sample may be selected.

Reliability – The consistency or dependability of a measure. Also referred to as standard error.

C. Application to Veterans Affairs
Customer experience and satisfaction are usually measured at three levels to: 1)
provide enterprises the ability to track, monitor, and incentivize service quality; 2) provide
service level monitoring and insights; and 3) give direct point-of-service feedback. This
measurement may bring insights and value to all stakeholders at VA. Front-line VA leaders
can resolve individual feedback from Veterans and take steps to improve the customer
experience; meanwhile VA executives can receive real-time updates on systematic trends
that allow them to make changes. The goals of this survey are:
1) To collect continuous customer experience data.
2) To help field staff and the national office identify areas of improvement.
3) To understand emerging drivers and detractors of customer experience.


Part II – Methodology
A. Target Population, Frame, and Stratification
The target population of the ECCC Clinical Contact Center Survey is defined as
any Veteran who has received telephonic care through the ECCC Clinical Contact Center
in the past weeks. The sample frame is prepared by extracting population information
directly from VHA’s Corporate Data Warehouse. These extracts are also used to obtain
universe figures for the sample weighting process. The Veteran is the primary sampling
unit and is randomly selected from the population according to a stratified design. The
primary stratification will be the type of contact, which falls into three strata—nurse triage,
pharmacy, and Licensed Independent Provider (LIP). The survey will also utilize implicit
stratification, or balancing, by age, gender, and location.

B. Sample Size Determination

To achieve a certain level of reliability, the sample size for a given level of
reliability is calculated below (Lohr, 1999):
For a population that is large, the equation below is used to yield a representative
sample for proportions:

n0 = (Z_{α/2}^2 × p × q) / e^2

where
• Z_{α/2} = the critical Z score, which is 1.96 under the normal distribution when using
a 95% confidence level (α = 0.05).
• p = the estimated proportion of an attribute that is present in the population, with
q = 1 − p.
  o Note that pq attains its maximum value when p = 0.5, or 50%. This is what is
  typically reported in surveys where multiple measures are of interest. When
  examining measures closer to 100% or 0%, less sample is needed to achieve
  the same margin of error.
• e = the desired level of precision or margin of error. For example, for the ECCC
Clinical Contact Center Survey the targeted margin of error is e = 0.04, or +/-4%.
For a population that is relatively small, the finite population correction is used to
yield a representative sample for proportions:
n = n0 / (1 + n0 / N)

where
• n0 = representative sample for proportions when the population is large.
• N = population size.

The margin of error surrounding the baseline proportion is calculated as:
Margin of Error = Z_{α/2} × sqrt( ((N − n) / (N − 1)) × (p(1 − p) / n) )

where
• Z_{α/2} = 1.96, which is the critical Z score value under the normal distribution when
using a 95% confidence level (α = 0.05).
• N = population size.
• n = representative sample.
• p = the estimated proportion of an attribute that is present in the population, with
q = 1 − p.
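The three formulas above can be checked numerically. The following Python sketch is illustrative only (not part of the submission); it computes the large-population sample size, applies the finite population correction, and evaluates the margin of error:

```python
import math

def sample_size_large(p: float, e: float, z: float = 1.96) -> float:
    """n0 = Z^2 * p * q / e^2 for a large population."""
    return (z ** 2) * p * (1 - p) / e ** 2

def sample_size_fpc(n0: float, N: int) -> float:
    """Finite population correction: n = n0 / (1 + n0 / N)."""
    return n0 / (1 + n0 / N)

def margin_of_error(n: float, N: int, p: float = 0.5, z: float = 1.96) -> float:
    """MOE = Z * sqrt(((N - n) / (N - 1)) * p * (1 - p) / n)."""
    return z * math.sqrt((N - n) / (N - 1) * p * (1 - p) / n)

# Worst-case proportion p = 0.5 and the survey's +/-4% target:
n0 = sample_size_large(p=0.5, e=0.04)   # about 600 completes before correction
```

With a finite monthly population the requirement shrinks slightly; for example, `sample_size_fpc(n0, 14508)` is roughly 576 completes.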
Estimates from the population files drawn for the first 5 months of 2020 indicate
that in the average month 16,508 calls are made to the ECCC Clinical Contact Center by
Veterans. The proposed sample plan is designed to achieve an MOE of +/-4% at a 95%
confidence level for the Nurse Triage stratum. For the LIP and Pharmacy strata, there is
insufficient sample to achieve this level of accuracy. The plan, therefore, is to achieve an
MOE of +/-5% at 80% confidence for LIP by using most, if not all, of the available
sample. Pharmacy calls have the least amount of available sample. To address this, the
plan recommends using all available sample.
Table 1A indicates the population figures based on numbers from that period, as
well as estimated population with email addresses on file and the proportion that is likely
to be usable after removing duplicates and quarantine rules across VEO surveys.
Table 1A. Target Population Figures, Sample Size, and Email Contacts

| Stratum | Estimated Monthly Callers | Estimated Monthly Callers w/ Email Addresses | Estimated Monthly Callers w/ Email Addresses Available After Exclusion Rules and Deduplication | Target MOE² | Confidence | Minimum Monthly Responses Needed | Response Rates | Minimum Monthly Sample Needed |
| Nurse Triage | 14,508 | 8,175 | 6,540 | 4.00% | 95% | 560 | 18% | 3,107 |
| LIP | 1,677 | 1,082 | 866 | 5.00% | 80% | 143 | 18% | 791 |
| Pharmacy | 323 | 147 | 117 | 13.00% | 80% | 21 | 18% | 117 |

Table 1B shows the estimated sample frame and minimum target sample size on a weekly
basis. Minimum targets are rounded upward to assure the prescribed accuracy is achieved.

² MOE measures assume that non-response to the survey is randomly distributed.

Table 1B. Weekly sample availability and sample needs

| Stratum | Estimated Weekly Callers w/ Email Addresses Available After Exclusion Rules and Deduplication | Minimum Weekly Responses Needed | Rounded Weekly Sample Targets | Sampling Rate |
| Nurse Triage | 1,505 | 129 | 720 | 47.8% |
| LIP | 199 | 33 | 185 | 93.0% |
| Pharmacy | 27 | 5 | 27 | 100.0% |

The sample will be drawn using a systematic sampling methodology. This statistically valid
approach allows the team to balance the sample across several variables such as age,
gender, and location. These balancing variables are often referred to as implicit strata. This
approach has been shown to stabilize trends and improve the accuracy of estimates.
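Systematic sampling with implicit stratification can be sketched as follows: sort the frame by the balancing variables, then step through it at a fixed interval from a random start, which spreads the sample proportionally across the implicit strata. Field names here are illustrative, not the actual frame schema:

```python
import random

def systematic_sample(frame, balance_key, n):
    """Draw a systematic sample of size n: sort by the implicit-stratification
    (balancing) variables, then take every k-th record from a random start."""
    ordered = sorted(frame, key=balance_key)
    k = len(ordered) / n                    # fractional sampling interval
    start = random.uniform(0, k)            # random start in [0, k)
    return [ordered[int(start + i * k)] for i in range(n)]

# Hypothetical frame records, balanced by location, then gender, then age:
frame = [{"id": i, "age": 20 + i % 60, "gender": i % 2, "location": i % 10}
         for i in range(1505)]
sample = systematic_sample(
    frame, lambda r: (r["location"], r["gender"], r["age"]), 720)
```

Because the interval k exceeds 1 whenever the frame is larger than the target, each record is selected at most once.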
Email addresses will be acquired by matching Veteran ID numbers to the VBA’s
Enterprise Data Warehouse (EDW) and the VHA’s Corporate Data Warehouse (CDW).
The CDW will be prioritized if the two sources produce different and valid email
addresses. Each email address encountered is validated in several ways:
• Validation that the email address has a valid structure
• Comparison with a database of bad domains
• Correction of common domain misspellings
• Comparison with a database of bad emails, including:
  o Opt outs
  o Emails held by multiple Veterans
• Comparison to a database of valid TLDs (e.g., “.com”, “.edu”)

C. Data Collection Methods

Invitations will be sent out each week to assure that initial invites are sent within
eight days of the call to the ECCC Clinical Contact Center. Caller information will be
regularly extracted from the VHA’s Corporate Data Warehouse (CDW). The extraction
process will be executed and validated by the Office of Performance Improvement and
Assessment (PA&I), with the population extracts sent to VEO twice a week. Invitations
will be sent on Mondays. Invitees that have not completed the survey will receive a
reminder after one week. The survey will remain open for a total of two weeks. Survey
responses are available within VSignals as soon as feedback is submitted.

D. Reporting

Researchers will be able to use the Veteran Signals (VSignals) system for
interactive reporting and data visualization. VA employees with a PIV card may access
the system at https://va.voice.medallia.com/sso/va/. The scores may be viewed by Age
Group, Gender, and Race/Ethnicity in various charts for different perspectives. They are
also depicted within time series plots to investigate trends. Finally, filter options are
available to assess scores at varying time periods and within the context of other collected
variable information.
Recruitment is continuous, but the results should be combined into a monthly data
file for more precise estimates at the call center level. Short-interval estimates are less
reliable for small domains (i.e., VAMC-level) and should only be considered for
aggregated populations. Monthly estimates will have larger sample sizes, and therefore
higher reliability. Estimates over longer periods are the most precise but will take the
greatest amount of time to obtain and are less dynamic, in that trends and short-term
fluctuations in service delivery may be missed. Users examining subpopulations should be
particularly diligent in assuring that insights stem from analysis with sufficient sample in
the subpopulations being examined or compared.

E. Quality Control

To ensure the prevention of errors and inconsistencies in the data and the analysis,
quality control procedures will be instituted in several steps of the survey process. Records
will undergo a cleaning during the population file creation. The quality control steps are
as follows.
1. Records will be reviewed for missing sampling and weighting variable data. When
records with missing data are discovered, they will be either excluded from the
population file or put into separate strata upon discussion with subject matter
experts.
2. Any duplicate records will be removed from the population file to both maintain
the probabilities of selection and prevent the double sampling of the same Veteran.
3. Invalid emails will be removed.
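The three population-file cleaning steps can be sketched as a single pass over the extract. This is an illustrative simplification (record fields and the email pattern are hypothetical, and step 1 here only excludes records rather than moving them to separate strata):

```python
import re

# Minimal structural email check (illustrative, not VEO's actual validation).
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def clean_population(records, required=("age", "gender", "location")):
    """Steps 1-3: drop records missing sampling/weighting variables,
    deduplicate by Veteran ID, and remove structurally invalid emails."""
    seen, out = set(), []
    for r in records:
        if any(r.get(k) is None for k in required):
            continue                       # step 1: missing strata/weight vars
        if r["veteran_id"] in seen:
            continue                       # step 2: duplicate record
        if not EMAIL_RE.match(r.get("email") or ""):
            continue                       # step 3: invalid email
        seen.add(r["veteran_id"])
        out.append(r)
    return out
```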
The survey sample loading and administration processes will have quality control
measures built into them.
1. The survey load process will be rigorously tested prior to the induction of the
survey to ensure that sampled customers are not inadvertently dropped or sent
multiple emails.
2. The email delivery process is monitored to ensure that bounce-back records will
not hold up the email delivery process.
The weighting and data management quality control checks are as follows:


1. The sum of the weighted respondents will be compared to the overall population
count to confirm that the records are being properly weighted. When the sum does
not match the population count, weighting classes will be collapsed to correct this
issue.
2. The unequal weighting effect will be used to identify potential issues in the
weighting process. Large unequal weighting effects indicate a problem with the
weighting classes, such as a record receiving a large weight to compensate for
nonresponse or coverage bias.

F. Sample Weighting, Coverage Bias, and Non-Response Bias

Weighting is commonly applied in surveys to adjust for nonresponse bias and/or
coverage bias. Nonresponse is defined as failure of selected persons in the sample to
provide responses. This is observed in virtually all surveys, in that some groups are more
or less prone to complete the survey. The nonresponse issue may cause some groups to be
over- or under-represented. Coverage bias is another common survey problem in which
certain groups of interest in the population are not included in the sampling frame. These
Veterans cannot participate because they cannot be contacted (no email address available).
In both cases, the exclusion of these portions of Veterans from the survey contributes to
the measurement error. The extent to which the final survey estimates are skewed depends
on the nature of the data collection processes within an individual line of business and the
potential alignment between Veteran sentiment and their likelihood to respond.
Survey practitioners recommend the use of sample weighting to improve inference
on the population so that the final respondent sample more closely resembles the true
population. It is likely that differential response rates may be observed across different age
and gender groups. Weighting can help adjust for the demographic representation by
assigning larger weights to underrepresented groups and smaller weights to overrepresented
groups. Stratification can also be used to adjust for nonresponse by oversampling the
subgroups with lower response rates. In both ways of adjustment, weighting may result in
substantial correction in the final survey estimates when compared to direct estimates in
the presence of non-negligible sample error.
Weights are updated live within the VSignals reporting platform.³ Proportions are
set based on the monthly distribution of the previous month.⁴
If we let wij denote the sample weight for the ith person in group j (j=1, 2, and 3),
then the CW formula is:

³ Real-time weighting may cause some distortions at the beginning of each cycle due to empty cells or
random variance in small sample distributions.
⁴ Using the previous month’s data is a design option for handling the problem of setting targets prior to
fielding each month. An alternative design is to set targets off annualized estimates to create more stability
month to month. If the population is known to fluctuate from month to month, past-month population
estimates may not be the optimal solution.

w_ij = (% Veterans in population in group j) / (# Veterans in group j in the sample)
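A minimal sketch of this cell-weighting computation (group labels are illustrative): each respondent in group j receives the group's population share divided by the group's respondent count, so the weights within group j sum to that share, which is exactly the validation check described below.

```python
def cell_weights(pop_counts, resp_counts):
    """w_j = (population share of group j) / (respondents in group j)."""
    total = sum(pop_counts.values())
    return {j: (pop_counts[j] / total) / resp_counts[j] for j in pop_counts}

# Hypothetical age/gender cells:
pop = {"F<65": 3000, "F65+": 1000, "M<65": 8000, "M65+": 4000}
resp = {"F<65": 40, "F65+": 20, "M<65": 90, "M65+": 50}
w = cell_weights(pop, resp)
# Summed weights in each cell reproduce the cell's population proportion.
assert abs(w["F<65"] * resp["F<65"] - 3000 / 16000) < 1e-12
```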

As part of the weighting validation process, the weights of persons in an age and
gender group are summed and verified that they match the universe estimates (i.e.,
population proportion). Additionally, we calculate the unequal weighting effect, or UWE
(see Kish, 1992; Liu et al., 2002). This statistic is an indication of the amount of variation
that may be expected due to the inclusion of weighting. The unequal weighting effect
estimates the percent increase in the variance of the final estimate due to the presence of
weights and is calculated as:
UWE = 1 + cv_weights^2 = 1 + (s / w̄)^2

where
• cv = coefficient of variation of all weights w_ij.
• s = sample standard deviation of the weights.
• w̄ = sample mean of the weights, w̄ = (1/n) Σ_ij w_ij.
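The UWE calculation can be sketched directly with Python's statistics module (illustrative only):

```python
import statistics

def unequal_weighting_effect(weights):
    """UWE = 1 + cv^2, where cv = s / mean(weights) (Kish, 1992)."""
    w_bar = statistics.fmean(weights)
    s = statistics.stdev(weights)          # sample standard deviation
    return 1 + (s / w_bar) ** 2

# Equal weights carry no penalty; dispersed weights inflate variance.
assert unequal_weighting_effect([2.0, 2.0, 2.0]) == 1.0
```

A UWE of 1.5, for instance, indicates roughly a 50% increase in the variance of the final estimate attributable to the weights.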

G. Quarantine Rules

VEO seeks to limit contact with Veterans as much as possible, and only as needed
to achieve measurement goals. These rules are enacted to prevent excessive recruitment
attempts upon Veterans. VEO also monitors Veteran participation within other surveys, to
ensure Veterans do not experience survey fatigue. All VEO surveys offer options for
respondents to opt out, and ensure they are no longer contacted for a specific survey.
Table 5. Proposed Quarantine Protocol

| Quarantine Rule | Description | Elapsed Time |
| Past Waves | Number of days between completing any online VEO survey and receiving another invitation. | 30 Days |
| Active Waves | Number of days between receiving an invitation to a VEO survey and receiving another invitation. | 14 Days |
| Anonymous | Callers explicitly wishing to remain anonymous will not be contacted. | N/A |
| Opt Outs | Persons indicating their wish to opt out of either phone or online surveys will no longer be contacted. | N/A |
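The protocol in Table 5 amounts to a per-record eligibility check before a caller is added to an invite list. A sketch (field names are hypothetical, not VEO's actual schema):

```python
from datetime import date, timedelta

def quarantine_eligible(record: dict, today: date) -> bool:
    """Apply the Table 5 quarantine rules to one candidate record."""
    if record.get("anonymous") or record.get("opted_out"):
        return False                                   # Anonymous / Opt Outs
    done = record.get("last_completed")                # any VEO survey
    if done and (today - done) < timedelta(days=30):
        return False                                   # Past Waves: 30 days
    invited = record.get("last_invited")
    if invited and (today - invited) < timedelta(days=14):
        return False                                   # Active Waves: 14 days
    return True
```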

Part III – Assumptions and Limitations
A) Population Estimation Error
The population estimates for this survey include some uncertainty due to 1)
fluctuation in call volumes due to the current pandemic (COVID-19); 2) an increase over
time in the use of telemedicine; and 3) potential policy shifts (e.g., a shift to more reliance
on contractors or LIPs). The estimates attempt to account for these factors. Nonetheless,
a large amount of uncertainty exists. To address this risk, we recommend evaluating the
sample plan over time to determine how well the estimates hold up.

B) Coverage Bias due to Email-Only Data Collection
Since the ECCC Clinical Contact Center Survey is email-only, there is a segment
of the population of ECCC Clinical Contact Center callers that cannot be reached by the
survey. This corresponds to persons who lack access to the internet, those who do not
have an email address, and those who elect not to share their email address with the VA.
Such beneficiaries may have different levels of general satisfaction with the service they
received.

Index 1. Survey Questions

| Standard Survey Questions | A-11 Customer Experience Domains |
| I waited a reasonable amount of time to speak to an agent. | Efficiency/Speed |
| It was easy to reach the right person about my need. | Ease/Simplicity |
| The agent took a reasonable amount of time to address my need. | Efficiency/Speed |
| I understood the information provided by the [contact center agent]. | Employee Helpfulness |
| The agent I interacted with was helpful. | Employee Helpfulness |
| The issue that I contacted [contact center] about on [date/today] was resolved. | Quality |
| I am satisfied with the service I received from [contact center]. | Satisfaction |
| I trust VA to fulfill our country's commitment to Veterans. | Confidence/Trust |

| LIP (Telehealth) Survey | A-11 Customer Experience Domains |
| I am satisfied with the care I received during this interaction / appointment. | Satisfaction |
| It was easy to reach the right person about my need. | Ease/Simplicity |
| The issue that I contacted [contact center] about on [date/today] was addressed. | Quality |
| The VA telehealth provider I interacted with was helpful. | Employee Helpfulness |
| I understood the information provided by the VA telehealth provider. | Employee Helpfulness |
| After my virtual visit, I knew what I needed to do next. | Ease/Simplicity |
| I waited a reasonable amount of time to speak to a VA telehealth provider. | Efficiency/Speed |
| The VA telehealth provider took a reasonable amount of time to address my need. | Efficiency/Speed |
| If you sought a telehealth appointment at VA, tell us the main reason why: (I did not want to go to VA in person because of COVID-19 concerns / I was given the option to have a telehealth appointment because it would meet my care needs / Other) | N/A |
| I trust VA to fulfill our country's commitment to Veterans. | Confidence/Trust |

Index 2. References
Choi, N.G., & DiNitto, D.M. (2013). Internet use among older adults: Association with
health needs, psychological capital, and social capital. Journal of Medical
Internet Research, 15(5), e97.
Kish, L. (1992). Weighting for unequal Pi. Journal of Official Statistics, 8(2), 183-200.
Lohr, S. (1999). Sampling: Design and Analysis. Boston, MA: Cengage Learning.
Liu, J., Iannacchione, V., & Byron, M. (2002). Decomposing design effects for stratified
sampling. Proceedings of the American Statistical Association’s Section on Survey
Research Methods.

