A-11 Requirements Document


Clearance for A-11 Section 280 Improving Customer Experience Information Collection


OMB: 2900-0876

Request for Approval under the “Generic Clearance for Improving Customer
Experience (OMB Circular A-11, Section 280 Implementation)” (OMB
Control Number: 2900-0876)
TITLE OF INFORMATION COLLECTION: VBA Contact Center Survey
PURPOSE
The VBA Call Center survey is designed to measure customer experience after contacting the National
Call Center (NCC), Education Call Center (ECC), or Insurance Call Center (ICC) to inquire about and
conduct business regarding VA benefits and services. For the NCC and ECC surveys, only Veterans
who recently called these call centers and communicated with a call representative may be invited to
participate in a brief online survey. For the ICC, other beneficiaries and representatives may be
contacted when an email address is available. The purpose of this report is to document the survey
methodology and sampling plan of the VBA Call Center survey.
Customer experience data is collected by using an online transactional survey disseminated via an
invitation email sent to randomly selected beneficiaries. For the NCC and ECC surveys, the data
collection occurs two times a week within 5 days after callers have interacted with the call center. The
ICC survey is conducted weekly with a two-week lag added out of respect for beneficiaries who have
recently lost a loved one. The questionnaire is brief and contains general Likert-scale (a scale of 1-5 from
Strongly Disagree to Strongly Agree) questions to assess customer satisfaction as well as questions
assessing the knowledge, speed, and manner of the interaction. Selected respondents will have 14 days
to complete the online survey, with an email reminder after 7 days if the survey has not been completed.
DESCRIPTION OF RESPONDENTS:
The target population of these surveys is all Veterans¹ who contacted the National Call Center,
or Education Call Center, within the past week. For these surveys data collection will occur
twice a week, to reduce the time between the interaction with the call center and the time of the
initial survey contact. This will help to improve cognitive recall and thus improve the survey
measurement (reduce measurement error). Due to the sensitive nature of the Insurance Line of
Business, the ICC data collection will occur once per week and the data itself will be two weeks
old at the time of delivery. Therefore, the ICC invitations will be sent two weeks after the
transaction occurs. Note only persons that have shared their email address will be included in
the sample frame and thus are able to participate. Selected respondents will have 14 days to
complete the online survey, with an email reminder after 7 days if the survey has not been
completed. The random sampling will be conducted independently by call type (NCC,
Education, and Insurance).

¹ The original plan was to include dependents in this study. It was determined, however, that the email addresses
recorded for dependents provided too little coverage and were often unreliable (e.g. matching the email on record for
the veteran instead of a dependent).


TYPE OF COLLECTION: (Check one)
[ ] Customer Comment Card/Complaint Form
[X] Customer Satisfaction Survey
[ ] Usability Testing (e.g., Website or Software)
[ ] Small Discussion Group
[ ] Focus Group
[ ] Other: ______________________

CERTIFICATION:
I certify the following to be true:
1. The collection is voluntary.
2. The collection is low-burden for respondents and low-cost for the Federal Government.
3. The collection is non-controversial and does not raise issues of concern to other federal
agencies.
4. Personally identifiable information (PII) is collected only to the extent necessary and is not
retained.
5. Information gathered is intended to be used for general service improvement and program
management purposes.
6. The collection is targeted to the solicitation of opinions from respondents who have
experience with the program or may have experience with the program in the future.
7. All or a subset of information may be released as part of A-11, Section 280 requirements on
performance.gov. Additionally, summaries of the data may be released to the public in
communications to Congress, the media and other releases disseminated by VEO, consistent
with the Information Quality Act.
Name: Evan Albert, Director of Measurement and Data Analytics (Acting), Veterans Experience
Office, [email protected], (202) 875-9478
To assist review, please provide answers to the following question:
Personally Identifiable Information:
1. Will this survey use individualized links, through which VA can identify particular
respondents even if they do not provide their name or other personally identifiable
information on the survey? [ X ] Yes [] No
2. Is personally identifiable information (PII) collected? [ ] Yes [X] No
3. If Yes, will any information that is collected be included in records that are subject to the
Privacy Act of 1974? [ ] Yes [ ] No [N/A]
4. If Yes, has an up-to-date System of Records Notice (SORN) been published? [ ] Yes [ ] No
[N/A]
Gifts or Payments:
Is an incentive (e.g., money or reimbursement of expenses, token of appreciation) provided to
participants? [ ] Yes [ X] No
BURDEN HOURS

| Category of Respondent | VA Form (if applicable) | No. of Respondents | Participation Time (minutes) | Burden (hours, ÷ 60) |
| Individuals & Households | | 8,688 annually | 3 | 434.4 |
| Totals | | 8,688 annually | 3 | 434.4 |
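The burden figure follows directly from the table's arithmetic (respondents × minutes ÷ 60); a quick check:

```python
respondents = 8_688            # annual responses
minutes_per_response = 3       # participation time in minutes
burden_hours = respondents * minutes_per_response / 60
assert abs(burden_hours - 434.4) < 1e-9   # matches the table's 434.4
```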

Please answer the following questions.
1. Are you conducting a focus group, a survey that does not employ random sampling,
user testing or any data collection method that does not employ statistical methods?
[ ] Yes [X] No
If Yes, please answer questions 1a-1c, 2 and 3.
If No, please answer or attach supporting documentation that answers questions 2-8.
a. Please provide a description of how you plan to identify your potential group of
respondents and how you will select them.


b. How will you collect the information? (Check all that apply)
[ ] Web-based or other forms of Social Media
[ ] Telephone
[ ] In-person
[ ] Mail
[X] Other- E-mail-based surveys
c. Will interviewers or facilitators be used? [ ] Yes [ X ] No
2. Please provide an estimated annual cost to the Federal government to conduct this data
collection: __$13,000______
3. Please make sure that all instruments, instructions, and scripts are submitted with the request.
This includes questionnaires, interviewer manuals (if using interviewers or facilitators), all
response options for questions that require respondents to select a response from a group of
options, invitations given to potential respondents, instructions for completing the data
collection or additional follow-up requests for the data collection.
-Done
4. Describe (including a numerical estimate) the potential respondent universe and any
sampling or other respondent selection methods to be used. Data on the number of entities
(e.g., establishments, State and local government units, households, or persons) in the

universe covered by the collection and in the corresponding sample are to be provided in
tabular form for the universe as a whole and for each of the strata in the proposed sample.
Indicate expected response rates for the collection as a whole. If the collection had been
conducted previously, include the actual response rate achieved during the last collection.
- Please see Statistical Sample Plan in the Appendix.
5. Describe the procedures for the collection of information, including:
a. Statistical methodology for stratification and sample selection.
b. Estimation procedure.
c. Degree of accuracy needed for the purpose described in the justification.
d. Unusual problems requiring specialized sampling procedures.
e. Any use of periodic (less frequent than annual) data collection cycles to reduce
burden.
- Please see Statistical Sample Plan in the Appendix.
6. Describe methods to maximize response rates and to deal with issues of nonresponse. The
accuracy and reliability of information collected must be shown to be adequate for intended
uses. For collections based on sampling, a special justification must be provided for any
collection that will not yield "reliable" data that can be generalized to the universe studied.
Please see Statistical Sample Plan in the Appendix.
7. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an
effective means of refining collections of information to minimize burden and improve
utility. Tests must be approved if they call for answers to identical questions from 10 or more
respondents. A proposed test or set of tests may be submitted for approval separately or in
combination with the main collection of information.
Please see Statistical Sample Plan in the Appendix.
8. Provide the name and telephone number of individuals consulted on statistical aspects of the
design and the name of the agency unit, contractors, grantees, or other person(s) who will
actually collect or analyze the information for the agency.
Statistical Aspects: Mark Andrews, Statistician, Veterans Experience Office, VA, (703) 483-5305
Collection and Analysis: Evan Albert, Director of Measurement and Data Analytics, Veterans Experience Office, VA, (202) 875-9478
Amanda Tepfer, Senior Customer Experience Strategist, Veterans Benefits Administration, (612) 803-1819


APPENDIX: STATISTICAL SAMPLE PLAN

VBA Call Center Survey
Sampling Methodology Report
Prepared by
Veteran Experience Office
Version 1.4
June 2020


Contents

Executive Summary
Part I – Introduction
    A. Background
    B. Basic Definitions
    C. Application to Veterans Affairs
Part II – Methodology
    A. Target Population, Frame, and Stratification
    B. Sample Size Determination
    C. Data Collection Methods
    D. Reporting
    E. Quality Control
    F. Sample Weighting, Coverage Bias, and Non-Response Bias
    G. Quarantine Rules
Part III – Assumptions and Limitations
    A. Veterans Only
    B. Coverage Bias due to Email-Only Data Collection
    C. Call Characteristics: Length of Call & Number of Calls
Appendix 1. List of Data Extraction Variables
Appendix 2. Survey Questions
Appendix 3. References


Executive Summary
The VBA Call Center survey is designed to measure customer experience after contacting
the National Call Center (NCC), Education Call Center (ECC), or Insurance Call Center (ICC) to
inquire about and conduct business regarding VA benefits and services. For the NCC and ECC
surveys, only Veterans who recently called these call centers and communicated with a call
representative may be invited to participate in a brief online survey. For the ICC, other
beneficiaries and representatives may be contacted when an email address is available. The
purpose of this report is to document the survey methodology and sampling plan of the VBA
Call Center survey.
Customer experience data is collected by using an online transactional survey
disseminated via an invitation email sent to randomly selected beneficiaries. For the NCC and
ECC surveys, the data collection occurs two times a week within 5 days after callers have
interacted with the call center. The ICC survey is conducted weekly with a two-week lag added
out of respect for beneficiaries who have recently lost a loved one. The questionnaire is brief and
contains general Likert-scale (a scale of 1-5 from Strongly Disagree to Strongly Agree) questions
to assess customer satisfaction as well as questions assessing the knowledge, speed, and manner
of the interaction. Selected respondents will have 14 days to complete the online survey, with an
email reminder after 7 days if the survey has not been completed.
The overall sample size for the ECC population is determined so that the reliability of the
monthly survey estimate is at a 3% margin of error at a 95% confidence level. For the NCC, the
monthly precision target is a 4% margin of error at a 95% confidence level for the five
highest-volume call centers. Lower-volume call centers have insufficient sample to produce this
level of reliability and therefore have graduated targets that are less rigorous for the call centers
with the lowest call volume. The ICC sample is limited by call volume and the availability of
email addresses; for this reason, the survey will be sent to all qualified Veterans and
beneficiaries for whom email addresses are available. Once data collection is completed, the
participant responses in the online survey will be weighted so the samples more closely
represent the actual call volume of each call center.
This report describes the methodology used to conduct the VBA Call Center Surveys for
the NCC, ECC, and ICC. Information about quality assurance protocols, as well as limitations of
the survey methodology, is also included in this report.
Part I – Introduction
A. Background
The Enterprise Measurement and Design team (EMD) is part of the Veterans
Experience Office (VEO). The EMD team is tasked with conducting transactional surveys of the
Veteran and Beneficiary population to measure their satisfaction with the Department of
Veterans Affairs (VA) numerous benefit services. Thus, their mission is to empower Veterans by
rapidly and discreetly collecting feedback on their interactions with such VA entities as NCA,
VHA, and VBA. VEO surveys generally entail probability samples that contact only the minimal
number of beneficiaries necessary to obtain reliable estimates. This information is subsequently
used by internal stakeholders to monitor, evaluate, and improve beneficiary processes.
Beneficiaries are always able to decline participation and have the ability to opt out of future

invitations. A quarantine protocol is maintained across all VEO surveys to limit the number of
times a beneficiary may be contacted, in order to prevent survey fatigue.
The VBA oversees numerous government programs supporting Veterans and their
families. VBA call centers are one of the primary ways in which VBA engages these
beneficiaries. VBA engaged VEO to measure the customer satisfaction of persons
contacting the following call centers: NCC, ECC, and ICC. A sample of recent callers to these
call centers will be contacted via email invitation to complete a brief transactional online survey.
The purpose of this document is to outline the planned sample design and provide a description
of the data collection and sample sizes necessary for proper reporting.
B. Basic Definitions
Coverage: The percentage of the population of interest that is included in the sampling frame.
Measurement Error: The difference between the response coded and the true value of the characteristic being studied for a respondent.
Non-Response: Failure of some respondents in the sample to provide responses in the survey.
Transaction: The specific time a Veteran interacts with the VA that impacts the Veteran's journey and their perception of VA's effectiveness in caring for Veterans.
Response Rate: The ratio of participating persons to the number of contacted persons. This is one of the basic indicators of survey quality.
Sample: In statistics, a data sample is a set of data collected and/or selected from a statistical population by a defined procedure.
Sampling Error: Error due to taking a particular sample instead of measuring every unit in the population.
Sampling Frame: A list of units in the population from which a sample may be selected.
Reliability: The consistency or dependability of a measure. Also referred to as standard error.
C. Application to Veterans Affairs

Customer experience and satisfaction are usually measured at three levels to: 1) provide
enterprises the ability to track, monitor, and incentivize service quality; 2) provide service level
monitoring and insights; and 3) give direct point-of-service feedback. This measurement may
bring insights and value to all stakeholders at VA. Front-line VA leaders can resolve individual
feedback from Veterans and take steps to improve the customer experience; meanwhile VA
executives can receive real-time updates on systematic trends that allow them to make changes.


Part II – Methodology
A. Target Population, Frame, and Stratification
The target population of these surveys is all Veterans² who contacted the National
Call Center, or Education Call Center, within the past week. For these surveys data collection
will occur twice a week, to reduce the time between the interaction with the call center and the
time of the initial survey contact. This will help to improve cognitive recall and thus improve the
survey measurement (reduce measurement error). Due to the sensitive nature of the Insurance
Line of Business, the ICC data collection will occur once per week and the data itself will be two
weeks old at the time of delivery. Therefore, the ICC invitations will be sent two weeks after the
transaction occurs. Note only persons that have shared their email address will be included in
the sample frame and thus are able to participate. Selected respondents will have 14 days to
complete the online survey, with an email reminder after 7 days if the survey has not been
completed. The random sampling will be conducted independently by call type (NCC,
Education, and Insurance).
Figure 1A. Measurement Goals and Survey Mode

| Survey Type | Mode of Data Collection | Recruitment Method | Preferred Time After Transaction for Initial Invitation | Recruitment Window |
| Education Call Center | Online Survey | Email Recruitment | Within 5 business days after completing a call | 2 emails over a 2-week period |
| National Call Center | Online Survey | Email Recruitment | Within 5 business days after completing a call | 2 emails over a 2-week period |
| Insurance Call Center | Online Survey | Email Recruitment | Within 2 weeks after completing a call | 2 emails over a 2-week period |

² The original plan was to include dependents in this study. It was determined, however, that the email addresses
recorded for dependents provided too little coverage and were often unreliable (e.g., matching the email on record for
the Veteran instead of a dependent).


Figure 1B. Stratification Variables

| Explicit Strata | Implicit Strata |
| Call Center | Gender and Age (NCC and ECC); Gender, Age, and Veteran Status (ICC)³ |

B. Sample Size Determination
To achieve a given level of reliability, the required sample size is calculated as follows (Lohr, 1999).

For a population that is large, the equation below is used to yield a representative sample for proportions:

    n0 = (z_(α/2))^2 * p * q / e^2

where
• z_(α/2) = the critical Z score, which is 1.96 under the normal distribution when using a 95% confidence level (α = 0.05).
• p = the estimated proportion of an attribute that is present in the population, with q = 1 - p. Note that pq attains its maximum when p = 0.5, or 50%; this is what is typically reported in surveys where multiple measures are of interest. When examining measures closer to 100% or 0%, less sample is needed to achieve the same margin of error.
• e = the desired level of precision or margin of error. For example, for the ECC survey the targeted margin of error is e = 0.03, or +/-3%.

For a population that is relatively small, the finite population correction is used to yield a representative sample for proportions:

    n = n0 / (1 + n0/N)

where
• n0 = representative sample for proportions when the population is large.
• N = population size.

The margin of error surrounding the baseline proportion is calculated as:

    Margin of Error = z_(α/2) * sqrt( ((N - n)/(N - 1)) * (p(1 - p)/n) )

where
• z_(α/2) = 1.96, the critical Z score under the normal distribution when using a 95% confidence level (α = 0.05).
• N = population size.
• n = representative sample.
• p = the estimated proportion of an attribute that is present in the population, with q = 1 - p.

³ Age and gender are not available for non-Veteran callers.
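As an illustration, the three formulas above can be combined into a short calculation. This is a sketch rather than VEO production code; the ECC inputs (e = 0.03, N = 28,050) are taken from Table 2A, and p = 0.5 is the conservative default described above.

```python
import math

Z_95 = 1.96  # critical Z score at a 95% confidence level (alpha = 0.05)

def sample_size_large(e, p=0.5, z=Z_95):
    """n0 = z^2 * p * q / e^2, the large-population sample size."""
    q = 1.0 - p
    return math.ceil(z * z * p * q / (e * e))

def sample_size_fpc(n0, N):
    """Finite population correction: n = n0 / (1 + n0/N)."""
    return math.ceil(n0 / (1.0 + n0 / N))

def margin_of_error(n, N, p=0.5, z=Z_95):
    """MOE = z * sqrt(((N - n)/(N - 1)) * p * (1 - p) / n)."""
    return z * math.sqrt(((N - n) / (N - 1)) * p * (1.0 - p) / n)

# ECC example: +/-3% MOE against a monthly frame of roughly 28,050 callers.
n0 = sample_size_large(e=0.03)      # 1,068 before the correction
n = sample_size_fpc(n0, N=28_050)   # 1,029 after the correction
moe = margin_of_error(n, N=28_050)  # back-checks to about 0.03
```

The result is close to the 1,034 minimum monthly responses listed for the ECC in Table 2A; small differences reflect the rounding-up policies described with Table 2B.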

Estimates from the last 4 months of population files drawn indicate that in the average month
(defined as 4 weeks) nearly a quarter million Veterans make at least one call to at least one of the call
centers. Table 2A indicates the population figures based on numbers from that period, as well as
estimated population with email addresses on file and the proportion that is likely to be usable
after removing duplicates and exclusion rules across VEO surveys. The call centers vary
substantially by caller volume with more active call centers capable of achieving greater
accuracy. Given the volume of sample available, each call center is assigned a target margin of
error (MOE) at a selected margin. As described above, these inputs are used to calculate the
minimum number of respondents needed to achieve the prescribed level of accuracy.
Furthermore, current response rates are used to calculate the minimum sample needed to achieve
the desired result.
The ECC survey uses a target MOE of +/-3% at a 95% confidence level within a month.
The five highest-volume NCCs were assigned sample targets to meet or exceed a precision of
+/-4% at 95% confidence. Call volumes in three of the call centers do not allow for this level of
precision without overextending the sample. For these call centers, precision targets were relaxed
in order to manage the burden on Veterans and the sustainability of the survey effort.⁴ For the
Philadelphia Call Center, the target was reduced to +/-4.25% at 95% confidence; for the Columbia
Call Center, to +/-4.5% at 95% confidence; and for the San Juan Call Center, to +/-5% at 80%
confidence.
The ICC sample is a census. For such a sample, an MOE can only be calculated if one
assumes the non-response patterns are randomly distributed. If so, the ICC survey will have an
estimated MOE of +/-5.2% for estimates of the population with valid email addresses.
Table 2B provides the minimum sample targets and the minimum number of callers to be
contacted for each call center. Based on response rates from prior VEO surveys (13.0% for NCC
and 6.5% for ECC), it is estimated that VEO will need to initiate contact with a minimum of
15,903 callers for ECC and 30,803 callers for NCC to achieve the sample targets. The number of
calls may fluctuate with monthly changes in the population.

⁴ Sustainability of a tracking survey can be undermined by oversampling the population. This is due to two effects:
1) the sample reduces the viability of future waves because of the exclusion rules in place that limit the frequency
with which Veterans can be surveyed; 2) the oversample leads to lower response rates due to the high frequency with
which Veterans are asked to participate. The sampling rate figures in Table 2B show that a maximum target of around
55% was used for small call centers. The San Juan Call Center and ECC were allowed to go to 60% since those
populations are less likely to contact other call centers.


Table 2A. Target Population Figures, Sample Size, and Email Contacts

| Call Center | Estimated Monthly Callers | Estimated Monthly Callers w/ Email Addresses | Estimated Monthly Callers w/ Email Addresses Available After Exclusion Rules and Deduplication | Target MOE | Confidence | Minimum Monthly Responses Needed | Response Rate | Minimum Monthly Sample Needed |
| Education Call Center | 64,064 | 33,000 | 28,050 | 3.00% | 95% | 1,034 | 6.5% | 15,903 |
| National Call Centers | 180,056 | 117,184 | 94,396 | | | 4,004 | 13.0% | 30,803 |
| Cleveland Call Center | 19,424 | 12,456 | 9,965 | 4.00% | 95% | 573 | 13.0% | 4,405 |
| Columbia Call Center | 12,392 | 8,088 | 6,875 | 4.50% | 95% | 448 | 13.0% | 3,447 |
| Nashville Call Center | 33,616 | 21,232 | 16,986 | 4.00% | 95% | 584 | 13.0% | 4,491 |
| Philadelphia Call Center | 14,972 | 9,560 | 7,648 | 4.25% | 95% | 504 | 13.0% | 3,875 |
| Phoenix Call Center | 51,248 | 34,104 | 27,283 | 4.00% | 95% | 590 | 13.0% | 4,538 |
| Salt Lake City Call Center | 24,400 | 16,560 | 13,248 | 4.00% | 95% | 579 | 13.0% | 4,456 |
| San Juan Call Center | 4,020 | 2,444 | 2,200 | 5.00% | 80% | 154 | 13.0% | 1,182 |
| St. Louis Call Center | 19,984 | 12,740 | 10,192 | 4.00% | 95% | 573 | 13.0% | 4,410 |
| Insurance Call Center | 16,910 | 4,570 | 3,884 | 3.8% | 95% | 583 | 15.0% | 3,884 |
| Total | 261,030 | 154,754 | 126,331 | | | 5,622 | | 50,591 |

Table 2B shows the estimated sample frame and minimum target sample size on a weekly basis.
Minimum targets are rounded upward to assure the prescribed accuracy is achieved. The NCC
targets were increased by a minimum of 10%. ECC targets were rounded up by only 5.6% due to
the lower statistical error between the estimated returns and likely outcomes, as well as to
manage an already high sampling rate at this call center. The rounded numbers also make
management and quality control easier. The sampling rate is provided to show the extent to which
the prescribed accuracy for low-volume call centers maximizes the sample usage at around 60%.
See footnote 4 for explanation.



Table 2B. Weekly sample availability and sample needs

| Call Center | Estimated Weekly Callers w/ Email Addresses Available After Exclusion Rules and Deduplication | Minimum Weekly Sample Needed | Rounded Weekly Sample Targets | Sampling Rate |
| Education Call Center | 7,013 | 3,976 | 4,200 | 59.9% |
| National Call Centers | 23,599 | 7,701 | 8,800 | 37.3% |
| Cleveland Call Center | 2,491 | 1,101 | 1,250 | 50.2% |
| Columbia Call Center | 1,719 | 862 | 950 | 55.3% |
| Nashville Call Center | 4,246 | 1,123 | 1,300 | 30.6% |
| Philadelphia Call Center | 1,912 | 969 | 1,070 | 56.0% |
| Phoenix Call Center | 6,821 | 1,134 | 1,350 | 19.8% |
| Salt Lake City Call Center | 3,312 | 1,114 | 1,300 | 39.3% |
| San Juan Call Center | 550 | 295 | 330 | 60.0% |
| St. Louis Call Center | 2,548 | 1,102 | 1,250 | 49.1% |
| Insurance Call Center | 919 | 919 | 919 | 100.0% |
| Total | 31,531 | 12,005 | 13,328 | 42.3% |

The sample will be drawn using a systematic sampling methodology. This statistically valid
approach allows the team to balance the sample across several variables such as age, gender, and
Veteran status. These balancing variables are often referred to as implicit strata. In the coming
wave, the VEO team will begin to leverage this capability because, though the effect on margin
of error is difficult to measure, this methodology has been shown to improve the accuracy of
estimates, stabilize weights, and reduce the variability that makes trends difficult to interpret.
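As a sketch of the approach (the record layout, frame, and function names are illustrative, not VEO's actual implementation), systematic sampling over a frame sorted by implicit strata can be written as:

```python
import random

def systematic_sample(frame, n_target, sort_key):
    """Systematic sampling: sort the frame by the implicit strata, then
    take every k-th record starting from a random start point."""
    ordered = sorted(frame, key=sort_key)
    step = len(ordered) / n_target      # fractional skip interval
    start = random.random() * step      # random start in [0, step)
    return [ordered[int(start + i * step)] for i in range(n_target)]

# Illustrative frame carrying the implicit strata named in Figure 1B.
frame = [{"id": i, "age": 20 + i % 60, "gender": "MF"[i % 2]}
         for i in range(1000)]
sample = systematic_sample(frame, 100, lambda r: (r["gender"], r["age"]))
```

Because the frame is ordered before the skip interval is applied, the draw is spread proportionally across the gender and age groups, which is what stabilizes the resulting weights.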
Each email encountered is validated in several ways:
• Validation that the email address has a valid structure
• Comparison with a database of bad domains
• Correction of common domain misspellings
• Comparison against a database of bad emails, including:
  o Opt-outs
  o Emails held by multiple Veterans
• Comparison to a database of valid TLDs (e.g., ".com", ".edu")
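A minimal sketch of these checks follows; the reference lists here are illustrative placeholders, not the actual VEO databases.

```python
import re

# Placeholder reference lists; the real databases are maintained by VEO.
BAD_DOMAINS = {"bounces.example"}
BAD_EMAILS = {"optout@example.com"}         # opt-outs, shared addresses
DOMAIN_FIXES = {"gmial.com": "gmail.com"}   # common domain misspellings
VALID_TLDS = {"com", "edu", "gov", "org", "net", "mil"}

def validate_email(addr):
    """Apply the checks listed above; return the cleaned address or None."""
    addr = addr.strip().lower()
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", addr):
        return None                            # structural check
    local, domain = addr.rsplit("@", 1)
    domain = DOMAIN_FIXES.get(domain, domain)  # fix common misspellings
    addr = f"{local}@{domain}"
    if domain in BAD_DOMAINS or addr in BAD_EMAILS:
        return None                            # bad-domain / bad-email lists
    if domain.rsplit(".", 1)[-1] not in VALID_TLDS:
        return None                            # valid-TLD check
    return addr
```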

Veteran email addresses come from one of three sources, prioritized in the following order.
Validated email addresses provided in the sample file are prioritized first. If no valid email
address is available, the second source is VBA's Enterprise Data Warehouse (EDW),
followed by VHA's Corporate Data Warehouse (CDW). The CDW is the sole source of
demographic data for Veterans (age and gender). For the ICC, non-Veteran emails must come
from the original sample files.
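The prioritization amounts to a simple coalesce, sketched below; the function and argument names are hypothetical, and `is_valid` stands in for the validation routine described above.

```python
def best_email(sample_file_email, edw_email, cdw_email, is_veteran=True):
    """Return the first usable address in priority order: sample file,
    then VBA EDW, then VHA CDW. For ICC non-Veterans, only the original
    sample file may supply the address."""
    def is_valid(e):               # placeholder for the real validation
        return bool(e) and "@" in e
    sources = [sample_file_email]
    if is_veteran:
        sources += [edw_email, cdw_email]
    return next((e for e in sources if is_valid(e)), None)
```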


C. Data Collection Methods
To improve cognitive recall of customer experience, the ECC and NCC surveys will send
invitations two times a week within 2-5 days after callers have interacted with the call center.
Caller information will be regularly extracted from the Call Center CRM database and delivered to
VEO via data extracts. The files will be delivered twice a week for NCC and ECC and once a
week for ICC. Invitations will be sent on Tuesdays and Fridays, with the weekly sample split
across those days. Tuesday collections will correspond to callers from the Thursday and Friday
from the previous week, while Friday collections will pertain to calls received on Monday,
Tuesday, or Wednesday of the same week. Caller responses are immediately available within
VSignals as soon as feedback is submitted. See Table 4 for specific data collection times:
Table 4. Data Collection Times for ECC and NCC

| Day Call Received | Day of Initial Contact | Number of Days Since Interaction |
| Monday | Friday | 4 |
| Tuesday | Friday | 3 |
| Wednesday | Friday | 2 |
| Thursday | [Next] Tuesday | 5 |
| Friday | [Next] Tuesday | 4 |
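The Table 4 schedule can be expressed as a simple lookup (a sketch; the function name is illustrative):

```python
# Tuesday/Friday send schedule from Table 4: call day -> (send day, lag).
SCHEDULE = {
    "Monday":    ("Friday", 4),
    "Tuesday":   ("Friday", 3),
    "Wednesday": ("Friday", 2),
    "Thursday":  ("[Next] Tuesday", 5),
    "Friday":    ("[Next] Tuesday", 4),
}

def initial_contact(day_call_received):
    """Return (day of initial contact, days since interaction)."""
    return SCHEDULE[day_call_received]
```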

Out of respect for family members who recently lost their loved one, the ICC survey will
be conducted once per week (Tuesdays) with an additional one-week delay before sending the
invitations.

D. Reporting
Researchers will be able to use the Veteran Signals (VSignals) system for interactive
reporting and data visualization. VA employees with a PIV card may access the system at
https://va.voice.medallia.com/sso/va/. The scores may be viewed by Age Group, Gender, and
Race/Ethnicity in various charts for different perspectives. They are also depicted within time
series plots to investigate trends. Finally, filter options are available to assess scores at varying
time periods and within the context of other collected variable information.
Recruitment is continuous, but the results should be combined into a monthly data file for
more precise estimates at the call center level. Short-interval estimates are less reliable for small
domains (i.e., VAMC-level) and should only be considered for aggregated
populations. Monthly estimates will have larger sample sizes, and therefore higher reliability.
Estimates over longer periods are the most precise but take the greatest amount of time to
obtain and are less dynamic, in that trends and short-term fluctuations in service delivery may be
missed. Users examining subpopulations should be particularly diligent in assuring that insights
stem from analyses with sufficient sample in the subpopulations being examined or compared.


E. Quality Control
To prevent errors and inconsistencies in the data and the analysis, quality control
procedures will be instituted at several steps of the survey process. Records will undergo
cleaning during the population file creation. The quality control steps are as follows.
1. Records will be reviewed for missing sampling and weighting variable data. When
records with missing data are discovered, they will be either excluded from the
population file or put into separate strata upon discussion with subject matter experts.
2. Any duplicate records will be removed from the population file to both maintain the
probabilities of selection and prevent the double sampling of the same Veteran.
3. Invalid emails will be removed.
The survey sample loading and administration processes will have quality control
measures built into them.
1. The survey load process will be rigorously tested prior to the induction of the survey to
ensure that sampled customers are not inadvertently dropped or sent multiple emails.
2. The email delivery process is monitored to ensure that bounce-back records will not hold
up the email delivery process.
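The population-file cleaning steps above (missing data, duplicates, invalid emails) can be sketched as a single pass over the records. The field names (`veteran_id`, `email`, etc.) are hypothetical placeholders, and the sketch simply drops records with missing sampling variables rather than routing them to separate strata as the document allows.

```python
import re

# Loose illustrative email check; the production validation rules differ.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
REQUIRED = ("veteran_id", "call_center", "age", "gender")

def clean_population_file(records):
    seen, cleaned = set(), []
    for rec in records:
        if any(not rec.get(k) for k in REQUIRED):
            continue  # step 1: missing sampling/weighting variables
        if rec["veteran_id"] in seen:
            continue  # step 2: duplicate record for the same Veteran
        if not EMAIL_RE.match(rec.get("email", "")):
            continue  # step 3: invalid email address
        seen.add(rec["veteran_id"])
        cleaned.append(rec)
    return cleaned
```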
The weighting and data management quality control checks are as follows:
1. The sum of the weighted respondents will be compared to the overall population count to
confirm that the records are being properly weighted. When the sum does not match the
population count, weighting classes will be collapsed to correct this issue.
2. The unequal weighting effect will be used to identify potential issues in the weighting
process. Large unequal weighting effects indicate a problem with the weighting classes,
such as a record receiving a large weight to compensate for nonresponse or coverage
bias.
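Check 1 above (comparing the weighted respondent total to the population count) reduces to a simple tolerance test; a sketch follows with illustrative numbers, not actual VBA figures.

```python
def weights_match_population(weights, population_count, tol=1e-6):
    """Return True when the weighted respondent total matches the population count."""
    return abs(sum(weights) - population_count) <= tol

# 4 respondents each weighted 250 represent a population of 1,000.
print(weights_match_population([250.0] * 4, 1000))  # True
```

When this check fails, the document's remedy is to collapse weighting classes until the totals reconcile.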

F. Sample Weighting, Coverage Bias, and Non-Response Bias
Weighting is commonly applied in surveys to adjust for nonresponse bias and/or
coverage bias. Nonresponse is defined as the failure of selected persons in the sample to provide
responses. This occurs in virtually all surveys, in that some groups are more or less prone to
complete the survey. Nonresponse may cause some groups to be over- or underrepresented.
Coverage bias is another common survey problem in which certain groups of interest in the
population are not included in the sampling frame. Here, these Veterans cannot participate
because they cannot be contacted (no email address is available). In both cases, the exclusion of
these portions of Veterans from the survey contributes to measurement error. The extent to which
the final survey estimates are skewed depends on the nature of the data collection processes
within an individual line of business and the potential alignment between Veteran sentiment and
their likelihood to respond.
Survey practitioners recommend the use of sample weighting to improve inference on the
population so that the final respondent sample more closely resembles the true population. It is
likely that differential response rates may be observed across different age and gender groups.


Weighting can help adjust the demographic representation by assigning larger weights to
underrepresented groups and smaller weights to overrepresented groups. Stratification can also be
used to adjust for nonresponse by oversampling the subgroups with lower response rates. With
either adjustment, weighting may result in a substantial correction to the final survey estimates
when compared to direct estimates in the presence of non-negligible sampling error.
The VBA survey currently relies only on what are often referred to as design weights—
weights that correct for disproportional sampling where respondents have different probabilities
of selection. Therefore, the weights are applied to make the explicit strata (the call centers)
proportional to the number of veterans that contact each call center.
Weights are updated live within the VSignals reporting platform (footnote 5). Proportions are set
based on the monthly distribution of the previous month (footnote 6).
If we let w_ij denote the sample weight for the ith person in group j (j = 1, 2, and 3), then the
CW formula is:

    w_ij = (% Veterans in population in group j) / (# Veterans in group j in the sample)
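The formula above can be sketched directly: every respondent in group j receives the group's population percentage divided by the group's sample count. The group labels and counts below are illustrative, not actual VBA figures.

```python
def design_weights(pop_pct, sample_counts):
    """pop_pct: {group j: % of Veterans in the population};
    sample_counts: {group j: # of Veterans in group j in the sample}."""
    return {j: pop_pct[j] / sample_counts[j] for j in pop_pct}

w = design_weights({"NCC": 70.0, "ECC": 20.0, "ICC": 10.0},
                   {"NCC": 700, "ECC": 200, "ICC": 100})
print(w["NCC"])  # 0.1 -- every sampled NCC caller shares the same weight
```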

As part of the weighting validation process, the weights of persons in each age and gender
group are summed and verified to match the universe estimates (i.e., population
proportions). Additionally, we calculate the unequal weighting effect, or UWE (see Kish, 1992;
Liu et al., 2002). This statistic indicates the amount of variation that may be expected
due to the inclusion of weighting. The unequal weighting effect estimates the percent increase in
the variance of the final estimate due to the presence of weights and is calculated as:
    UWE = 1 + cv_weights² = 1 + (s / w̄)²

where

• cv = coefficient of variation of all weights w_ij
• s = sample standard deviation of the weights
• w̄ = sample mean of the weights, w̄ = (1/n) Σ_ij w_ij
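The UWE computation above is small enough to sketch with the standard library; the weights passed in are illustrative.

```python
import statistics

def unequal_weighting_effect(weights):
    """UWE = 1 + cv^2, where cv = s / w-bar (per Kish, 1992)."""
    s = statistics.stdev(weights)      # sample standard deviation of weights
    w_bar = statistics.mean(weights)   # sample mean of weights
    return 1 + (s / w_bar) ** 2

# Equal weights carry no variance inflation: UWE = 1.
print(unequal_weighting_effect([2.0, 2.0, 2.0, 2.0]))  # 1.0
```

A UWE of, say, 1.5 estimates a 50% increase in the variance of the final estimate attributable to unequal weights.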

Footnote 5: Realtime weighting may cause some distortions at the beginning of each cycle due to empty
cells or random variance in small sample distributions.
Footnote 6: Using the previous month's data is a design option for handling the problem of setting targets
prior to fielding each month. An alternative design is to set targets off annualized estimates to create
more stability month to month. If the population is known to fluctuate from month to month, past-month
population estimates may not be the optimal solution.

G. Quarantine Rules
VEO seeks to limit contact with Veterans as much as possible, and only as needed to
achieve measurement goals. These rules are enacted to prevent excessive recruitment attempts
upon Veterans. VEO also monitors Veteran participation within other surveys to ensure
Veterans do not experience survey fatigue. All VEO surveys offer options for respondents to opt
out and ensure they are no longer contacted for a specific survey.

Table 5. Proposed Quarantine Protocol

Quarantine Rule                  Description                                                  Elapsed Time
Repeated Sampling for the        Number of days between completing the online survey          2 Months or 60 Days
VBA Call Center Survey           and receiving another VBA Call Center online survey.
Other Surveys                    Veterans who have recently completed other VEO               30 Days
                                 surveys will not be selected for 30 days.
Anonymous                        Callers explicitly wishing to remain anonymous will          N/A
                                 not be contacted.
Opt Outs                         Persons indicating their wish to opt out of either the       N/A
                                 phone or online survey will no longer be contacted.
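The eligibility logic implied by Table 5 can be sketched as a single filter. The field names (`anonymous`, `last_vba_call_center_survey`, etc.) are hypothetical; the production system tracks these flags and dates differently.

```python
from datetime import date

def eligible_for_sampling(person, today):
    """Apply the Table 5 quarantine rules to one candidate record (sketch)."""
    if person.get("anonymous") or person.get("opted_out"):
        return False  # anonymous callers and opt-outs are never contacted
    last_vba = person.get("last_vba_call_center_survey")
    if last_vba and (today - last_vba).days < 60:
        return False  # 60-day rule for repeated VBA Call Center sampling
    last_other = person.get("last_other_veo_survey")
    if last_other and (today - last_other).days < 30:
        return False  # 30-day rule for other VEO surveys
    return True

today = date(2020, 6, 26)
# Surveyed 25 days ago by the VBA Call Center survey: still quarantined.
print(eligible_for_sampling({"last_vba_call_center_survey": date(2020, 6, 1)}, today))  # False
```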

Part III – Assumptions and Limitations
A. Veterans Only
At the onset of the VBA Call Center surveys, email addresses are only available for Veterans and
not their dependents. Since Veteran attitudes may differ from those of non-Veterans, the exclusion of
non-Veterans from the survey may contribute bias to the survey estimates. VEO will continue to work
with VBA to acquire contact information for all callers to benefits services, and this information will be
used in future releases to address the entire target population.

B. Coverage Bias due to Email-Only Data Collection
Since the VBA Call Center Survey is email-only, there is a segment of the population of
VBA recipients that cannot be reached by the survey. This corresponds to persons who lack
access to the internet, do not have an email address, or elect not to share their
email address with VBA. Such beneficiaries may have different levels of general satisfaction
with the service they received. Moreover, email addresses are currently obtained from VHA
health records, and this process may also contribute to coverage bias because only Veterans who
have accessed VA Healthcare in the past can be contacted.
C. Call Characteristics: Length of Call & Number of Calls
There is a possibility that the length of a call to a VBA call center may be a predictor of
customer satisfaction. Longer calls may produce higher or lower levels of satisfaction, perhaps
either due to long waiting times or because of the increased level of assistance provided by the
call center representative. The data extraction process, at the time of this version of the
sampling documentation, does not include call length. VEO will work to obtain this possibly
relevant information; at such time, consideration will be given to incorporating call length into
the sampling and weighting procedures.


Appendix 1. List of Data Extraction Variables
Survey Variables
Survey Person ID
Agent ID
Date Time Call
Call Center
Phone Number
Coach
Full Name
Service Request Action
Caller Relation to Veteran
Has eBenefit Account
Credit Level
Call Type
Sub Type
NCC Start Date
Age
Gender
Period of Service
Veterans Email
Veteran ID # (MVI)

Appendix 2. Survey Questions

1. The information provided by the phone representative was explained in terms I could understand.
2. The length of time it took to get connected to a phone representative was reasonable.
3. The phone representative answered my question on the issue I recently called about.
4. The phone representative treated me with courtesy and respect.
5. The information provided during the call helped me feel that I have a better understanding of my
   issue and next steps.
6. I am satisfied with the service I received from the VA Call Center.
7. I trust VA to fulfill our country's commitment to Veterans.

Appendix 3. References
Choi, N.G. & Dinitto, D.M. (2013). Internet Use Among Older Adults: Association with Health
Needs, Psychological Capital, and Social Capital. Journal of Medical Internet Research,
15(5), e97
Kish, L. (1992). Weighting for unequal P. Journal of Official Statistics, 8(2), 183-200.
Lohr, S. (1999). Sampling: Design and Analysis. Boston, MA: Cengage Learning.


Liu, J., Iannacchione, V., & Byron, M. (2002). Decomposing design effects for stratified
sampling. Proceedings of the American Statistical Association’s Section on Survey
Research Methods.
