FY15 Non-Response Bias Reports - Pension

Voice of the Veteran (VOV) Continuous Measurement Surveys

OMB: 2900-0782

Voice of the Veteran Line of Business Tracking Study
Pension Service
Fiscal Year 2015 Non-Response Bias Analysis

VETERANS BENEFITS ADMINISTRATION

[FY15 REPORT]

Table of Contents
Executive Summary .................................................................................................................................. 4
Introduction ............................................................................................................................................. 6
Methodology ............................................................................................................................................ 7
2.1 J.D. Power Index Model ................................................................................................................... 7
2.2 Sampling......................................................................................................................................... 10
2.3 Data Collection ............................................................................................................................... 11
Non-Response Bias Analysis................................................................................................................... 12
3.1 Survey Yield .................................................................................................................................... 17
3.2 Missing Data Patterns and Mechanisms ........................................................................................ 24
3.3 Margin of Error .............................................................................................................................. 25
3.3.1 Sampling Distribution ....................................................................................................... 27
3.3.2 Distribution of Overall Satisfaction Index Scores.............................................................. 28
3.3.3 Analysis for Demographic Differences .............................................................................. 29
3.3.4 Data Imputation Analysis for Demographic Differences .................................................. 32
Findings .................................................................................................................................................. 35
Conclusion .............................................................................................................................................. 36
References ............................................................................................................................................. 37
List of Appendices
Appendix A Missing Data Patterns and Mechanisms .................................................................... 39
Appendix B Item Response Rates .................................................................................................. 40
Appendix C Study Overview ........................................................................................................... 44
1.1 Study Background .................................................................................................................. 44
1.2 Methodology ......................................................................................................................... 44
1.3 Data Cleaning ......................................................................................................................... 45
1.4 Order generation and fulfillment process ............................................................................. 46
1.5 Reporting ............................................................................................................................... 47
Sample Plan Overview ................................................................................................................... 48
2.1 Sample Criteria ...................................................................................................................... 48
2.2 Fielding/Sampling Frequency ................................................................................................ 48
2.3 Data Transfer ......................................................................................................................... 48
2.4 Sample Cleaning Rules Glossary ............................................................................................ 50
2.5 Sample Selection.................................................................................................................... 50
2.6 Data Collection....................................................................................................................... 51
Appendix D Approaches to Effects of Non-Response Bias and Improving Response Rates........... 52

1.1 Approach 1: Strategies to Maximize Response Rates ........................................................... 52
1.2 Approach 2: Correcting Unit Non-response Bias with Sample Weighting and Survey Raking53
Strategies to Improve Response Rate ............................................................................................ 54
Appendix E Impact of FAR 8.8 ........................................................................................................ 55
1.1 Impact .................................................................................................................................... 55
Appendix F Survey Questionnaire.................................................................................................. 56
Appendix G List of Acronyms ......................................................................................................... 73

Executive Summary
The Voice of the Veteran (VOV) Line of Business Tracking Satisfaction Research Study was
developed to establish continuous satisfaction measurement and incorporate direct Veteran
feedback in the decision-making process in order to improve the level of service to
Servicemembers, Veterans, and their beneficiaries.
As part of this study, two surveys were fielded in Fiscal Year 2015 (FY15) for the Department of
Veterans Affairs (VA), Veterans’ Benefits Administration (VBA) Pension Service. One survey was
based on access to the benefit and the other on the ongoing servicing of the benefit. The Access survey yielded a response rate of 26.31% (an increase of 2.16 percentage points from FY14) and the Servicing survey yielded a response rate of 18.49% (a decrease of 2.56 percentage points from FY14). These rates were lower than the estimated response rate submitted with the information collection request (ICR), as well as lower than the Office of Management and Budget’s standard of 80% for the overall unit response rate.
OMB’s “Standards and Guidelines for Statistical Surveys,” Section 3.2, Guideline 3.2.9, notes
that a non-response analysis should be conducted for surveys with an overall unit response rate
of less than 80%. Therefore, J.D. Power (JDP) conducted the necessary statistical tests in
accordance with OMB’s guidelines in order to verify the validity of Pension Service’s survey
results for FY15.
The analyses for these reports were done in consultation with Dr. Don Dillman, a professor at Washington State University who is regarded as a key survey methods expert on non-response bias research, and the report conforms to sound statistical research practices in accordance with OMB standards. The analysis performed also includes an iterative survey raking procedure to derive sample weightings based on a simultaneous balancing analysis of the demographic differences. More detail is provided in section 3.2, Missing Data Patterns and Mechanisms.
After adjusting for non-response bias in age, race, census region, and service era, the statistical tests performed on the collected Pension Service survey responses illustrate that no differences were found in the Servicing Overall Satisfaction Index Score and Advocacy ratings (likelihood to inform others about VA benefits).
The sample for the Access population was defined as Veterans and beneficiaries who received a decision for their application for Pension benefits within the past 30 days. The Access Overall Satisfaction Index Score (652) and Advocacy rating (likelihood to inform others about VA benefits; Mail 3.41, on a rating scale of 1-4 points) are not impacted in any meaningful way by non-response bias.
The sample for the Servicing population was defined as Veterans and beneficiaries who have been receiving Pension benefits for at least 6 months. The Servicing Overall Satisfaction Index score (716) and Advocacy rating (likelihood to inform others about VA benefits; Mail 3.37, also on a scale of 1-4 points) are not impacted in any meaningful way by non-response bias.
This analysis confirms that the data collected during Fiscal Year 2015 is valid for use by VBA.

Introduction
In an effort to achieve the highest possible level of customer service, VBA partnered with J.D.
Power to conduct Veteran satisfaction research on its behalf. VBA’s Voice of the Veteran (VOV)
Satisfaction Initiative was established to continuously measure and improve the level of service
to Servicemembers, Veterans, and their beneficiaries.
The intent of this initiative is to:

• Reinstate VBA’s customer satisfaction research program in order to incorporate Veteran feedback into the decision-making process
• Identify the critical factors to Veterans’ satisfaction with benefits and services provided by VBA
• Provide continuous feedback to validate effectiveness of new initiatives and process changes
• Provide decision-makers and stakeholders with timely and actionable feedback on a continuous basis
• Identify and document best practices, and act as a vehicle to celebrate successful interactions and experiences

The VOV Line of Business Tracking Satisfaction Research Study was developed to continuously
field customer satisfaction survey instruments to provide Veteran and beneficiary feedback on
the following VBA lines of business and benefit programs: Compensation, Pension, Education,
Vocational Rehabilitation and Employment, and Loan Guaranty (including Specially Adapted
Housing). In support of this effort, in FY15, JDP fielded a survey instrument regarding the Access
and Servicing process on behalf of the Pension program. The purpose of the Access and
Servicing process surveys was to identify the factors critical to Veteran satisfaction with the
access and receipt of benefits issued by VBA and to improve the level of services provided.
The survey instruments for Servicing and the Access process were developed in collaboration
with VA’s Pension Service, and in accordance with OMB’s guidelines concerning statistical
collection procedures and methods. After the initial survey instrument was designed, cognitive
labs using the “think aloud” method were conducted to evaluate user experience when
completing the survey. Prior to the FY15 fielding of the Servicing and Access process survey, a
Benchmark (pilot) study was conducted from October 2012 through January 2013 to further
assess the effectiveness of the methodology and conformance to OMB’s standards. The study was subsequently fielded in FY14, making the FY15 fielding the third iteration.

Methodology
2.1 J.D. Power Index Model
J.D. Power defines customer satisfaction as a measure of how well product or service
experiences fit the expectations of customers. All JDP index models assume a two-tiered
regression model involving factors and attributes. Each customer experience is influenced by
several factors (i.e., first tier), which in turn, are influenced by several attributes or drivers (i.e.,
second tier). A diagram of the index model follows on the subsequent page.
In order to begin the index model calculation, each set of attributes within a factor is used to
predict the Overall Satisfaction Index score (sub-OSAT) for that factor. An importance weight is
assigned to each attribute, where the weight of “importance” of each attribute is defined as the
ability of that attribute to predict Overall Satisfaction. A multiple regression model is used to
estimate the attribute weights. This model produces the “bottom-level” weights and is
computed for each factor separately. The bottom-level weights are rescaled so that they add up
to 1 point within each subcategory. As a result, the percentage of total explained variation in
the sub-OSAT that is due to a particular attribute constitutes that attribute’s importance weight
within its respective factor.
Following the calculation of attribute (i.e., bottom- level) weights, the factor (i.e., top-level)
weights are calculated. Factor scores are calculated by taking the sum of the product of the
attribute rating scores and the attribute importance weights. This model produces the top-level
weights, which are rescaled so that they add up to 1 point. Thus, the percentage of the total
explained variation in the Overall Satisfaction rating that is due to a particular sub-OSAT
constitutes that factor’s importance weight.
After all factor scores are computed, they are weighted so that some contribute more to
Overall Satisfaction than others, based on the index importance weights. The index score is
subsequently calculated by taking the sum of the product of all of the factor scores and the
factor importance weights. Finally, both the index and factor scores are multiplied by 100 so
that the range of each is 100 (if all attributes were rated 1 point) to 1,000 (if all attributes were
rated 10 points).
By applying the importance weights derived from the two-tiered modeling approach, JDP
creates a weighted index score that ranges from a low of 100 points to a high of 1,000 points.
This index approach has the benefit of being highly reliable and valid and provides increased
ability to discriminate the performance levels of companies.
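To make the two-tier calculation concrete, the following minimal sketch (in Python, using illustrative data and a single factor; the actual JDP factors, attributes, and weights are proprietary) walks through the bottom-level regression, the rescaling of weights to sum to 1, the factor score, and the final scaling to the 100-1,000 range:

    # Illustrative two-tier index calculation (hypothetical data, one factor).
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 500
    # Bottom tier: three attribute ratings on a 1-10 scale for one factor.
    attributes = rng.integers(1, 11, size=(n, 3)).astype(float)
    # Overall satisfaction driven by the attributes plus noise (toy model).
    osat = attributes @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 1, n)

    # Bottom-level weights: regression coefficients rescaled to sum to 1
    # within the factor, so each weight is that attribute's share of the
    # explained variation in the sub-OSAT.
    beta = LinearRegression().fit(attributes, osat).coef_
    attr_weights = beta / beta.sum()

    # Factor score: weighted sum of attribute ratings (still on the 1-10 scale).
    factor_score = attributes @ attr_weights

    # Top level: with several factors, the factor scores would in turn be
    # weighted and rescaled to sum to 1; with one factor the weight is 1.0.
    index_score = 100 * factor_score * 1.0

    # Index ranges from 100 (all attributes rated 1) to 1,000 (all rated 10).
    print(round(index_score.mean()))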

Pension Access and Servicing Process Index Weights
Working with Pension’s subject matter experts and leadership, JDP designed the survey to encompass the factors and attributes outlined in the tables below. The factors
(Benefit Information, Contact with VA, Benefit Application, and Benefit Entitlement) and
attributes (Ease of Accessing Information, Availability of Information, etc.) represent Access and
Servicing Index Models in FY15. The corresponding weights for each factor and attribute are the
weights based on the above index model calculation. The weights are derived from the relative
importance of each factor or attribute to the respondents.

Table 2.0. Access: Index Model Weights

Access Index Model Weights       Effective Weight
Benefit Information                        18.85%
Contact with VA                            11.00%
Benefit Application                        28.31%
Clarity of Info on Appeal                   2.46%
Benefit Entitlement                        39.38%

Table 2.1. Access: Weights by Attribute

Access Weights by Attribute                             Effective Weight
Benefit Information
  Ease of accessing information                                    3.61%
  Availability of information                                      2.25%
  Clarity of information                                           3.30%
  Usefulness of information                                        4.18%
  Frequency of information                                         5.51%
Benefit Application
  Ease of completing the application                               8.37%
  Timeliness of eligibility notification                          11.63%
  Flexibility of application methods                               8.31%
Contact with VA                                                   11.00%
Clarity of Info on Appeal                                          2.46%
Benefit Entitlement (Timeliness of receiving benefit)             39.38%

Table 2.2. Servicing: Index Model Weights

Servicing Index Model Weights    Effective Weight
Benefit Information                        26.46%
Contact with VA                            10.76%
Benefit Entitlement                        62.78%

Table 2.3. Servicing: Weights by Attribute

Servicing Weights by Attribute                          Effective Weight
Benefit Information
  Ease of accessing information                                    5.58%
  Availability of information                                      2.99%
  Clarity of information                                           4.52%
  Usefulness of information                                        5.44%
  Frequency of information                                         7.92%
Contact with VA                                                   10.76%
Benefit Entitlement (Timeliness of receiving benefit)             62.78%

2.2 Sampling
The Access survey was fielded to Veterans and beneficiaries who received a decision for their
application for pension benefits within the past 30 days. These individuals may include those
who were found ineligible on a new claim and those who have been denied and are not
appealing the decision. The Servicing survey was fielded to Veterans and beneficiaries who
received a decision or are receiving benefit payments.
J.D. Power mailed approximately 10,000 surveys for the Access survey and 10,000 for the
Servicing survey to Veterans (and surviving spouses) across the nation in FY15. The target
number of completed surveys was 3,000 each for both the Access and Servicing surveys. The
actual number of completed surveys received for Access was 2,987 and for Servicing was 2,164.

The samples used in this study were provided by the Office of Performance Analysis and
Integrity (PA&I) on behalf of Pension and delivered to JDP on a monthly basis. They represent a
random sample from the available records provided in the sample file. See Appendix D, Sample
Plan Overview for further detail on sampling.
Survey Instrument    Methodology    Fielding Frequency    Total Mail-outs in FY15
Access               Mail Only      Monthly                                10,000
Servicing            Mail Only      Annually                               10,000

2.3 Data Collection
During the survey fielding period, self-administered paper surveys were collected. While
verbatim responses are recorded by a live survey processor, responses from paper surveys are
scanned through automated imaging software. Survey returns undergo quality assurance to
validate the accuracy of responses captured.
Respondents who completed each survey on paper received two separate mailings:

• 1st Mailing: Survey Package, which included a cover letter introducing the study to the respondent, a paper survey, and a business reply envelope
• 2nd Mailing: Survey Package, which included a cover letter, a paper survey, and a business reply envelope

Each time the surveys were deployed, the survey packages were subject to a proof approval
process that utilized three levels of approvals by J.D. Power, Benefits Assistance Service (BAS),
and VA Publications Services Division (VAPSD). After the print vendor mailed the survey
packages, mail receipts were sent to VBA.
During the survey fielding period, JDP provided a toll-free survey hotline and dedicated e-mail
address to answer survey-related inquiries and to provide assistance to respondents for
completing the surveys. The telephone and e-mail helpdesk was staffed by three JDP
employees who answered inquiries during regular business hours (8:00am-5:00pm PST, Monday through Friday). A voice message system was available to receive phone messages so
after-hours calls could be responded to the following business day. An automatically generated
e-mail response was sent to all e-mail inquiries informing respondents that their e-mail was
received and they would receive a response within 24 hours.

JDP helpdesk representatives logged each survey-related inquiry in a password-protected spreadsheet documenting the reason for the inquiry, the resolution provided, and the contact information of each caller. At the end of each month, a log containing all the inquiries was provided to the Contracting Officer Representative (COR) for review. If non-survey-related, high-severity benefit inquiries were received, J.D. Power contacted the COR immediately with the respondent’s contact information.
Throughout the course of the program, weekly status meetings were held between JDP and
BAS to discuss survey administration. Biweekly status meetings were held between the
Government Printing Office print vendor, JDP, BAS and VAPSD to discuss the printing and
mailing of the survey materials.

Non-Response Bias Analysis
The purpose of the non-response bias analysis is to ascertain the possible causes of variance in
response rates among different respondent demographics and/or determine if any bias has
been introduced with a low response rate. Given that the Voice of the Veteran Pension Access
survey had an overall unit response rate of approximately 26% and the Voice of the Veteran
Pension Servicing survey had an overall unit response rate of 18% in FY 2015, the following
section examines whether a low response rate or other factors may have caused respondent
bias to occur.
The Office of Management and Budget’s “Questions and Answers When Designing Surveys for Information Collections,” dated January 2006, and “Standards and Guidelines for Statistical Surveys,” dated September 2006 (see References), provide guidelines on acceptable survey design and response rates. OMB guidelines recommend a non-response bias evaluation for surveys with an overall unit response rate of less than 80%.
In addition to the above referenced documents prepared by OMB, J.D. Power assessed other
source documents that were written and published by the Federal Committee on Statistical
Methodology, “Statistical Policy Working Paper 17, Survey Coverage” (1990) and “Statistical
Policy Working Paper 31, Measuring and Reporting Sources of Error in Surveys” (2001).
While high response rates are always desirable in surveys, JDP finds an 80% response rate is not achievable for most voluntary, satisfaction-based survey research studies (Malhotra & Birks, 2007). In particular, survey research studies that do not provide an incentive are subject to not achieving an 80% response rate. To better illustrate this point, consider the Dillman Method for survey fielding, discussed in Dillman (2014, p. 22), which details the effort required to attain an 80% response rate.

A survey instrument was fielded to 600 students at the University of Washington, the same institution that sponsored the study. After five attempts to solicit a response in a closed university setting, as well as offering a monetary incentive to complete the survey, the researchers failed to achieve the 80% response rate, garnering only a 77% response rate. The JDP team met with the VA Contracting Officer Representative to discuss current trends and realistic response rates. As noted, JDP does not believe that an 80% response rate is achievable, and this concern was shared with the Benefits Assistance Service team.
JDP conducted the following non-response bias analysis to determine if the respondents (i.e., those who completed the survey) were different in a meaningful way from the non-respondents (i.e., those who were sent a survey but did not complete it). Chi-squared analyses consist of comparisons between respondents and non-respondents on available demographic variables such as gender, age, race, geographical region, war participation (service era), and military service branch. The U.S. states were converted to standard U.S. census regions (Midwest, Northeast, South, and West) in order to aggregate the data and enhance regional comparisons.
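As an illustration of the chi-squared comparison described above, the short sketch below (Python; the counts are hypothetical, not the study data) tests whether respondents and non-respondents are distributed differently across census regions:

    # Illustrative chi-squared test: respondents vs. non-respondents by region.
    from scipy.stats import chi2_contingency

    # Rows: survey respondents, non-respondents.
    # Columns: Midwest, Northeast, South, West (hypothetical counts).
    counts = [[650, 270, 1020, 550],
              [1600, 700, 3150, 1540]]

    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"Chi-Square = {chi2:.2f}, DF = {dof}, Prob = {p:.4f}")
    # Per the report's convention, the difference is significant only if p < .05.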
J.D. Power research indicates that there is an absence of systematic statistical differences of
respondents’ overall satisfaction on the mail and online survey results. Research does suggest
differences can occur between mixed mode survey methodologies (mail, online, and phone),
but these are primarily related to (a) social desirability and interviewer bias associated with
phone surveys (see Baum, Chandonnet, Fentress, and Rasinowich, 2012, p. 2, for a review) and
(b) that older respondents tend to respond by mail more often than online.
Throughout this report, statistical analyses are conducted to compare survey respondents and non-respondents. Frequently used statistical tests include the t-test, Chi-Square, and Analysis of Variance (ANOVA). These tests generate the relevant t-statistics, Chi-Square statistics, or F statistics that are reported. The magnitude of the statistic’s value (either positive or negative) measures the size of the difference relative to the variation in the data. If the statistic is not large enough to generate a probability (p-value) of less than .05, then it falls below the accepted standard probability cut-off level that indicates a statistically significant difference. If a difference is not significant, statisticians regard the results as part of the normal sample variation that occurs within the same population. Throughout this report, the p-value standard of “must be less than .05 to be significant” is used for all statistics reported.
The VA Pension surveys for Access and Servicing were conducted by mail only. Therefore, a comparison of mail vs. online methodology is not needed for these surveys.
For the Pension Access survey, there were considerable missing values (50%-80%) for demographic variables. As a consequence, JDP was only able to use those demographics with enough non-missing values to make legitimate statistical comparisons. Gender, Region, and Age Generation were used for the analysis of demographic differences between survey respondents and non-respondents.
For the Access sample, no significant gender differences were found between survey
respondents and non-respondents.
Table 3a.e. Access: Comparing Gender for Respondents and Non-Respondents

Gender by Respondent Type (%)

          Survey Respondents    Non-Respondents    Total
Female                    49                 48       49
Male                      51                 52       52

Statistic     DF    Value     Prob
Chi-Square     1    0.3598    0.5486

For the Access sample, significant differences were found with the population based on age generation, such that a larger number of older Veterans and a smaller number of younger Veterans completed the survey. While the ages of respondents in each generational group are shown in this study by current age ranges, JDP further clarifies age by birth year: Pre-Boomer includes individuals born prior to 1946; Boomers were born 1946-1964; and Generations X, Y, and Z were born 1964-2004.
Table 3b.e. Access: Comparing Age Generation for Respondents and Non-Respondents

Age Generation by Respondent Type (%)

                                Survey Respondents    Non-Respondents    Total
Baby-Boomer (ages 50-68)                        24                 23       23
Generations XYZ (ages 18-49)                     4                  5        5
Pre-Boomer (ages 69+)                           73                 71       72

Statistic     DF    Value    Prob
Chi-Square     2    10.92    <.004

For the Access survey, significant differences were found with the population based on
geographical census region such that there were more survey respondents from the Midwest
and fewer from the South region:
Table 3c.e. Access: Comparing Region for Respondents and Non-Respondents

U.S. Census Region by Respondent Type (%)

             Survey Respondents    Non-Respondents    Total
Midwest                      26                 23       24
Northeast                    11                 10       11
South                        41                 45       44
West                         22                 22       22

Statistic     DF    Value    Prob
Chi-Square     3    16.56    <.001

For the Pension Servicing sample, there were also some missing values for demographics in the dataset, especially for race (87% missing). As a consequence, JDP was only able to use those demographics with enough non-missing values to make legitimate statistical comparisons. Gender, Age Generation, Region, Branch of Service, Service Discharge, Award Level, Benefit Type, and War Period were used for the analysis of demographic differences between survey respondents and non-respondents.

For the Pension Servicing sample, differences in Gender approached significance (probability of .097), as shown in Table 3a.s, such that there were more female respondents than non-respondents.
Table 3a.s. Pension Servicing: Comparing Gender for Respondents and Non-Respondents

Gender by Respondent Type (%)

          Survey Respondents    Non-Respondents    Total
Female                    39                 37       37
Male                      61                 63       63

Statistic     DF    Value    Prob
Chi-Square     1     2.75    .097

For the Servicing sample, age generation differences also approached significance at probability
of .094 such that there were more baby-boomer respondents than non-respondents:
Table 3b.s. Servicing: Comparing Age Generation for Respondents and Non-Respondents

Age Generation by Respondent Type (%)

                                Survey Respondents    Non-Respondents    Total
Baby-Boomer (ages 50-68)                        34                 33       33
Generations XYZ (ages 18-49)                     3                  4        4
Pre-Boomer (ages 69+)                           63                 63       63

Statistic     DF    Value    Prob
Chi-Square     2     4.72    .094

For the Servicing survey, significant differences were found with the population based on
geographical census region such that there were more survey respondents from the West and
Midwest and fewer from the South region:
Table 3c.s. Servicing: Comparing Region for Respondents and Non-Respondents

U.S. Census Region by Respondent Type (%)

             Survey Respondents    Non-Respondents    Total
Midwest                      30                 28       29
Northeast                    16                 17       17
South                        30                 32       32
West                         25                 22       23

Statistic     DF    Value    Prob
Chi-Square     3     9.44    0.024

For the Servicing sample, no significant differences were found with the population based on
branch of service:
Table 3d.s. Servicing: Comparing Military Service Branch for Respondents and Non-Respondents

Military Service Branch by Respondent Type (%)

            Survey Respondents    Non-Respondents    Total
Air Force                    9                  9        9
Army                        58                 60       60
Marines                      8                  8        8
Navy                        24                 21       21
Other                        2                  2        2

Statistic     DF    Value    Prob
Chi-Square     4     6.17    .187

For the Servicing survey, significant differences were found in Service Discharge with fewer
surveys returned by Veterans who were discharged under “other than honorable” conditions:
Table 3e.s. Servicing: Comparing Service Discharge for Respondents and Non-Respondents

Service Discharge by Respondent Type (%)

            Survey Respondents    Non-Respondents    Total
Honorable                   98                 95       96
Other                        2                  5        4

Statistic     DF    Value    Prob
Chi-Square     1    20.12    <.0001

For the Servicing sample, significant differences were found with the population based on
Benefit Award. More surveys were completed by Veterans who receive a $1,001-$1,500
benefit and fewer by Veterans receiving $1,000 or less:
Table 3f.s. Servicing: Comparing Benefit Award for Respondents and Non-Respondents

Benefit Award by Respondent Type (%)

                  Survey Respondents    Non-Respondents    Total
$1,000 or less                    57                 61       60
$1,001-$1,500                     40                 36       37
$1,501 or more                     3                  3        3

Statistic     DF    Value    Prob
Chi-Square     2    10.01    <.0007

For the Servicing survey, no differences were found in Benefit Type between Respondents and
Non-Respondents:

Table 3g.s. Servicing: Comparing Benefit Type for Respondents and Non-Respondents

Benefit Type by Respondent Type (%)

                 Survey Respondents    Non-Respondents    Total
Death Pension                    40                 42       41
Pension                          60                 58       59

Statistic     DF    Value    Prob
Chi-Square     1     1.00    .316

For the Servicing sample, a Chi-square test showed war period differences such that a larger number of Vietnam Veterans and a smaller number of World War I and World War II Veterans completed the Servicing survey:
Table 3h.s. Servicing: Comparing War Period for Respondents and Non-Respondents

War Period by Respondent Type (%)

                      Survey Respondents    Non-Respondents    Total
Gulf War                               2                  3        3
Korean Conflict                       18                 17       17
Vietnam Era                           39                 37       38
World War I and II                    41                 43       43

Statistic     DF    Value      Prob
Chi-Square     3    10.6626    <.02

3.1 Survey Yield
In accordance with OMB “Standards and Guidelines for Statistical Surveys,” an agency must
appropriately measure, adjust for, report, and analyze unit and item non-response when the
intended response for a targeted population is not met.1 In assessing Pension’s data in
accordance with Section 3.2, and Guidelines 3.2.1-3.2.3, the unweighted unit response rate was
calculated as the ratio of the number of completed cases to the number of in-scope sample
cases (Ellis, 2000; AAPOR, 2000).

[1] As defined by OMB and FCSM, unit non-response occurs when a respondent fails to respond to all required response items (i.e., fails to fill out or return a data collection instrument); item non-response occurs when a respondent fails to respond to one or more relevant item(s) on a survey.

Table 3.1a.e below shows the sample distribution and response rate for the Pension Access
target population:
Table 3.1a.e. Sample Distribution and Response Rates for Pension Access Population

Total Pension Access Population FY2015
Total Records Received                                     134,694
Duplicate records in sample file                            10,497
Duplicate record history                                     2,286
Invalid address                                             19,997
Invalid values                                                   0
Blanks                                                           0
Do not contact                                                 863
Total Records Available after Cleaning [2]                 101,071
Total Records Selected                                      10,000
Undeliverable addresses                                         28
Total Mailed (excludes undeliverable)                        9,972
Total completed mail surveys                                 2,987
Total completed online surveys                                 N/A
Total Completed Surveys                                      2,987
Total Completed Surveys with Overall Index Score [3]         2,631
Total Sample Response Rate [4]                              26.31%
Eligible Sample Response Rate [5]                           29.95%

[2] Glossary of sample cleaning rules included in Appendix E.
[3] Findings in the report are based on the “Total Completed Surveys with Overall Index Score” (N=2,631).
[4] Response rate calculation per OMB “Standards and Guidelines for Statistical Surveys,” Section 3.2, Guideline 3.2.9 (includes undeliverables as number of non-contacted sample units known to be eligible).
[5] Response rate calculation per Council of American Survey Research Organizations (CASRO) (includes number of completed interviews with reporting units/number of eligible reporting units in sample). The American Association for Public Opinion Research (AAPOR) also uses this method for calculation and cites CASRO (AAPOR Standard Definitions, 2008, pp. 34).
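As a check on the table above, the following sketch reproduces both Access response rates, assuming (a reading of footnotes [4] and [5], not an official formula) that the Total Sample Response Rate divides completes with an Overall Index score by the total records selected (undeliverables counted as eligible), and the Eligible Sample Response Rate divides total completed surveys by total surveys mailed:

    # Reproducing the Access response rates in Table 3.1a.e.
    records_selected = 10_000        # includes the 28 undeliverables
    total_mailed = 9_972             # excludes undeliverables
    completed_surveys = 2_987
    completed_with_index = 2_631

    total_sample_rate = completed_with_index / records_selected
    eligible_sample_rate = completed_surveys / total_mailed

    print(f"Total Sample Response Rate:    {total_sample_rate:.2%}")    # 26.31%
    print(f"Eligible Sample Response Rate: {eligible_sample_rate:.2%}") # 29.95%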

Table 3.1a.s below shows the sample distribution and response rate for Pension Servicing target
population:
Table 3.1a.s. Sample Distribution and Response Rates for Pension Servicing Population

Total Pension Servicing Population FY2015
Total Records Received                                     245,214
Duplicate records in sample file                               893
Duplicate record history                                     2,948
Invalid address                                             25,858
Invalid values                                                   0
Blanks                                                           0
Do not contact                                                   7
Total Records Available after Cleaning [6]                 215,508
Total Records Selected                                      10,000
Undeliverable addresses                                      1,001
Total Mailed (excludes undeliverable)                        8,999
Total completed mail surveys                                 2,164
Total completed online surveys                                 N/A
Total Completed Surveys                                      2,164
Total Completed Surveys with Overall Index Score [7]         1,849
Total Sample Response Rate [8]                              18.49%
Eligible Sample Response Rate [9]                           24.05%

Of the 134,694 total records received for Access, 33,623 records were purged from the sample due to cleaning rules such as duplicate records, invalid addresses and values, blanks, and opt-outs for do not contact. Of the 33,623 records purged, 2,286 records were cleaned out due to duplicate records across VBA’s other business line surveys (i.e., duplicate record history).

[6] Glossary of sample cleaning rules included in Appendix E.
[7] Findings in the report are based on the “Total Completed Surveys with Overall Index Score” (N=1,849).
[8] Response rate calculation per OMB “Standards and Guidelines for Statistical Surveys,” Section 3.2, Guideline 3.2.9 (includes undeliverables as number of non-contacted sample units known to be eligible).
[9] Response rate calculation per Council of American Survey Research Organizations (CASRO) (includes number of completed interviews with reporting units/number of eligible reporting units in sample). The American Association for Public Opinion Research (AAPOR) also uses this method for calculation and cites CASRO (AAPOR Standard Definitions, 2008, pp. 34).

In Servicing, a total of 245,214 records were received, and 29,706 records were purged from the sample due to cleaning rules such as duplicate records, invalid addresses and values, blanks, and opt-outs for do not contact. Of the 29,706 records purged, 2,948 records were cleaned due to duplicate records across other business lines.
The purpose of the cleaning rules is to prevent respondents from being re-contacted if they
were previously selected to participate in any of VBA’s business line surveys in the past 12
months. The cleaning rule is a JDP and survey research best practice and is intended to promote
proper conduct in market research. About 25% of the total records provided for Access and
about 12% of the total records provided for Servicing were removed from the sample due to
these cleaning rules. It is unlikely that the cleaning rules impacted the unit non-response and
we were able to secure the designated number (10,000) of records for both Servicing and
Access.
Table 3.1b.e. Pension Access: Weight/Person for Completed Surveys per Population

Completed Surveys    Pension Access 2015 Population    Weight/Person
            2,987                           134,694               45

In Table 3.1b.e the 45 in the Weight/Person column means that every survey completed and
returned represents the views of 45 Veterans using Pension Access benefits, which is an
acceptable sampling representativeness. This was calculated by dividing the number of
completed surveys into the population number.
Table 3.1b.s. Pension Servicing: Weight/Person for Completed Surveys per Population

Completed Surveys    Pension Servicing 2015 Population    Weight/Person
            2,164                              245,214              113

In Table 3.1b.s the 113 in the Weight/Person column means that every survey completed and
returned represents the views of 113 Veterans using Pension Servicing benefits, which is an
acceptable sampling representativeness. This was calculated by dividing the number of
completed surveys into the population number.
To confirm the sample’s representativeness for both Access and Servicing, a comparison was
conducted among the total records provided and the records available after cleaning. The
intent of this analysis was to determine whether the cleaning rules caused the remaining
sample to vary in a meaningful way from the original sampling frame.

Table 3.1c.e (Access) and Table 3.1c.s (Servicing) indicate characteristics such as Gender, Age
Generation, and Geographical Region are similar among the total records provided and the
records available after cleaning. Regional comparisons by state yield differences that are
mostly less than 1.5 percentage points. Overall, these comparisons suggest the cleaning rules did not
significantly alter the proportion of respondent characteristics provided in the original sampling
frame.
Table 3.1c.e. Access: Comparing Gender, Age Generation, and U.S. States to Total Population

                   Total Population (%)    Records Available (%)    % Point Difference
Gender
Female                            53.48                    52.15                 -1.33
Male                              46.52                    47.85                  1.33
Generation
Pre-Boomer                        70.49                    70.42                 -0.08
Baby-Boomer                       23.96                    24.35                  0.39
Generations XYZ                    5.55                     5.24                 -0.31
U.S. State
AK                                 0.10                     0.11                     0
AL                                 3.80                     4.05                  0.25
AR                                 1.28                     1.27                 -0.01
AZ                                 2.49                     2.55                  0.06
CA                                10.92                    11.59                  0.68
CO                                 1.50                     1.56                  0.06
CT                                 0.54                     0.46                 -0.07
DC                                 0.07                     0.07                     0
DE                                 0.19                     0.18                 -0.01
FL                                 5.97                     5.67                 -0.3
GA                                 2.82                     2.8                  -0.02
HI                                 0.16                     0.18                  0.02
IA                                 1.58                     1.58                     0
ID                                 0.68                     0.71                  0.02
IL                                 3.44                     3.5                   0.06
IN                                 2.33                     2.39                  0.06
KS                                 1.22                     1.25                  0.03
KY                                 1.66                     1.72                  0.06
LA                                 2.21                     2.31                  0.1
MA                                 0.95                     0.83                 -0.12
MD                                 0.98                     0.94                 -0.04
ME                                 0.23                     0.22                 -0.01
MI                                 3.95                     3.96                  0.02
MN                                 1.60                     1.5                  -0.09

Table 3.1c.e. Access: Comparing Gender, Age Generation, and U.S. States to Total Population (Continued)

      Total Population (%)    Records Available (%)    % Point Difference
MO                   2.85                     2.76                 -0.09
MS                   1.49                     1.53                  0.05
MT                   0.43                     0.47                  0.04
NC                   2.65                     2.48                 -0.17
ND                   0.21                     0.21                     0
NE                   0.51                     0.52                  0.01
NH                   0.25                     0.22                 -0.03
NJ                   1.17                     1.11                 -0.06
NM                   0.79                     0.83                  0.05
NV                   1.10                     1.12                  0.02
NY                   2.91                     2.42                 -0.49
OH                   4.73                     4.71                 -0.01
OK                   1.70                     1.71                  0.01
OR                   1.61                     1.72                  0.11
PA                   3.72                     3.62                 -0.1
RI                   0.18                     0.16                 -0.02
SC                   1.81                     1.78                 -0.02
SD                   0.36                     0.36                  0.01
TN                   3.41                     3.49                  0.08
TX                   8.46                     9.12                  0.66
UT                   0.96                     0.96                 -0.01
VA                   1.61                     1.52                 -0.09
VT                   0.06                     0.06                     0
WA                   2.56                     2.63                  0.07
WI                   1.85                     1.53                 -0.31
WV                   0.46                     0.45                 -0.01
WY                   0.20                     0.22                  0.02

Table 3.1c.s. Servicing: Comparing Gender, Generation, and U.S. States to Total Population

                   Total Population (%)    Records Available (%)    % Point Difference
Gender
Female                            38.82                    37.71                 -1.1
Male                              61.18                    62.29                  1.1
Generation
Pre-Boomer                        64.68                    63.30                 -1.38
Baby-Boomer                       31.84                    33.26                  1.42
Generations XYZ                    3.48                     3.44                 -0.05
U.S. State
AK                                 0.11                     0.11                     0
AL                                 5.88                     5.98                  0.09
AR                                 2.03                     2.08                  0.04
AZ                                 2.59                     2.62                  0.03
CA                                13.12                    13.27                  0.15
CO                                 1.73                     1.72                 -0.01
CT                                 0.04                     0.04                 -0.01
DC                                 0.01                     0.01                     0
DE                                 0.01                     0.01                     0
FL                                 0.72                     0.7                  -0.03
GA                                 0.36                     0.35                 -0.01
HI                                 0.26                     0.27                  0.01
IA                                 1.78                     1.7                  -0.09
ID                                 0.66                     0.66                     0
IL                                 4.33                     4.3                  -0.03
IN                                 2.36                     2.25                 -0.11
KS                                 1.38                     1.34                 -0.04
KY                                 2.81                     2.83                  0.01
LA                                 3.79                     3.89                  0.1
MA                                 0.06                     0.05                 -0.01
MD                                 0.09                     0.08                 -0.01
ME                                 0.02                     0.02                     0
MI                                 5.38                     5.35                 -0.03
MN                                 2.3                      2.19                 -0.11
MO                                 4.2                      4.08                 -0.12
MS                                 2.4                      2.43                  0.03
MT                                 0.64                     0.65                  0.02
NC                                 0.21                     0.21                     0
ND                                 0.32                     0.32                     0
NE                                 0.69                     0.69                     0
NH                                 0.02                     0.02                     0

Table 3.1c.s. Servicing: Comparing Gender, Age Generation, and U.S. States to Total Population (Continued)

      Total Population (%)    Records Available (%)    % Point Difference
NJ                   0.06                     0.05                     0
NM                   1.08                     1.13                  0.05
NV                   1.27                     1.32                  0.04
NY                   0.19                     0.16                 -0.03
OH                   7.72                     7.58                 -0.14
OK                   2.34                     2.42                  0.08
OR                   2.26                     2.35                  0.09
PA                   0.97                     0.95                 -0.02
RI                   0.01                     0.01                     0
SC                   0.13                     0.13                     0
SD                   0.58                     0.58                     0
TN                   4.47                     4.54                  0.07
TX                  11.42                    11.76                  0.34
UT                   0.82                     0.81                 -0.01
VA                   0.16                     0.16                     0
VT                   0.01                     0.01                     0
WA                   2.64                     2.66                  0.02
WI                   3.21                     2.86                 -0.35
WV                   0.08                     0.08                     0
WY                   0.18                     0.19                  0.01

3.2 Missing Data Patterns and Mechanisms
In accordance with the OMB “Standards and Guidelines for Statistical Surveys” Guidelines 3.2.9
and 3.2.11, an investigation of missing data patterns was performed on the 2,987 total surveys
received for Access and the 2,164 total surveys received for Servicing. In order to assess the
distribution of missing data, a procedure was performed to process missing values involving iterative multiple imputation chains using expectation-maximization and Markov chain Monte Carlo (MCMC) algorithms, dividing the data into distribution interval groupings (Pierchala, 2001). This was done on
the key measures of the Overall Satisfaction Index (see Appendix A for calculation) and
Advocacy ratings related to Veterans’ likelihood to recommend VA benefits.
As shown in Tables 3.2.e and 3.2.s for Access and Servicing, respectively, there were no
indications of unusual patterns for missing data. For more discussion of missing data
mechanisms (MCAR, MAR, and MNAR), please see Appendix A.
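A minimal sketch of this kind of pattern analysis (Python/pandas with hypothetical data; the report's actual procedure used iterative imputation chains) groups records by which key measures are present and compares group frequencies and means, as in Tables 3.2.e and 3.2.s below:

    # Group records by missing-data pattern (0 = missing, 1 = data) and
    # compare group sizes and means (hypothetical data).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "osat": rng.normal(650, 215, 300),
        "advocacy": rng.integers(1, 5, 300).astype(float),
        "age": rng.integers(30, 95, 300).astype(float),
    })
    # Inject missing values at random for illustration.
    df.loc[rng.choice(300, 20, replace=False), "osat"] = np.nan
    df.loc[rng.choice(300, 30, replace=False), "advocacy"] = np.nan

    # 0/1 presence indicators, as in the tables' pattern columns.
    pattern = df[["osat", "advocacy"]].notna().astype(int)
    groups = df.groupby([pattern["osat"], pattern["advocacy"]])
    print(groups.agg(freq=("age", "size"), mean_age=("age", "mean")))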

Table 3.2.e. Access: Missing Data Patterns in Satisfaction and Advocacy (0 = missing, 1 = data)

Group    Overall Satisfaction    Likelihood to Inform Others    Freq    Percent    OSAT Index    Age    % Male
1                           0                              0      60         2%           630     78       44%
2                           0                              1     143         6%           646     80       42%
3                           1                              0      32         1%           578     73       48%
4                           1                              1    2287        91%           655     78       52%

(OSAT Index, Age, and % Male are group means.)

Table 3.2.s. Servicing: Missing Data Patterns in Satisfaction and Advocacy (0 = missing, 1 = data)

Group    Overall Satisfaction    Likelihood to Inform Others    Freq    Percent    OSAT Index    Age    % Male
1                           0                              0      11         1%           692     80       45%
2                           0                              1      60         3%           750     83       43%
3                           1                              0      33         2%           686     74       77%
4                           1                              1    1745        94%           716     77       61%

(OSAT Index, Age, and % Male are group means.)

3.3 Margin of Error
The margin of error expresses the maximum expected difference between the true population
parameter and a sample estimate of that parameter. It is often used to indicate the accuracy of
survey results. The larger the margin of error around an estimated value, the less accurate the
estimated value will be. Larger samples are more likely to yield results close to the true
population quantity and thus have smaller margins of error than smaller samples.
Based on a sample of 2,987 Veterans, the Overall Satisfaction Index for the Access study is 652
index points on a 1,000 point scale and has a margin of error of 8 index points, at the 95%
confidence level. This indicates that if the survey were repeated many times with different samples, the sample estimate would fall within 8 index points of the true mean Overall Satisfaction Index 95% of the time.
Table 3.3.e below demonstrates relative decreases in margin of error as the study sample size
increases. A 20% response rate (1,994 completes) would be associated with a margin of error of
10 index points, similar to the margin of error for a 30% response rate (2,992 completes).
Results from this analysis indicate the Overall Satisfaction Index (OSAT) calculated from the
Access study is an accurate measurement of the true population mean.

Table 3.3.e. Access: Margin of Error for Larger Sample Sizes

Sample    Response Rate    Completes (N)    OSAT (mean)    Standard Deviation    Standard Error    Margin of Error (95% CI)
 9,972           29.95%            2,987            652                   217               4.0                           8
 9,972              20%            1,994            652                   217               4.9                          10
 9,972              30%            2,992            652                   217               4.0                           8
 9,972              40%            3,989            652                   217               3.4                           7
 9,972              50%            4,986            652                   217               3.1                           6
 9,972              60%            5,983            652                   217               2.8                           6
 9,972              80%            7,978            652                   217               2.4                           5
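A minimal check of the figures above, assuming the conventional normal-approximation formula for a 95% confidence interval (plus or minus 1.96 standard errors):

    # Margin of error for the Access OSAT (values from Table 3.3.e).
    import math

    sd, n = 217, 2_987                 # standard deviation, completes
    se = sd / math.sqrt(n)             # standard error of the mean
    moe = 1.96 * se                    # 95% margin of error

    print(f"standard error  = {se:.1f} index points")   # ~4.0
    print(f"margin of error = {moe:.0f} index points")  # ~8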

Based on a sample of 2,164 completed surveys, the Overall Satisfaction Index for the Servicing study is 716 on a 1,000 point scale and has a margin of error of 9 index points at the 95% confidence level. This indicates that if the survey were repeated many times with different samples, the sample estimate would fall within 9 index points of the true mean Overall Satisfaction Index 95% of the time.
Table 3.3.s below demonstrates relative decreases in margin of error as the study sample size
increases. A 30% response rate (2,700 completes) would be associated with a margin of error
of 8 index points, similar to the margin of error for a 40% response rate (3,600 completes).
Results from this analysis indicate the Overall Satisfaction Index (OSAT) calculated from the
Servicing study is an accurate measurement of the true population mean.
Table 3.3.s. Servicing: Margin of Error for Larger Sample Sizes

Sample    Response Rate    Completes (N)    OSAT (mean)    Standard Deviation    Standard Error    Margin of Error (95% CI)
 8,999           24.05%            2,164            716                   209               4.5                           9
 8,999              20%            1,800            716                   209               4.9                          10
 8,999              30%            2,700            716                   209               4.0                           8
 8,999              40%            3,600            716                   209               3.5                           7
 8,999              50%            4,500            716                   209               3.1                           6
 8,999              60%            5,399            716                   209               2.8                           6
 8,999              80%            7,199            716                   209               2.5                           5

In the margin of error analysis noted above and in subsequent analyses included in this report, the Overall Satisfaction Index score is the main dependent variable and is the basis for the analysis. The Overall Satisfaction Index score is the survey metric that VBA utilizes to measure customer satisfaction and benchmark performance against other industries. It is the primary measurement in all JDP studies. The Overall Satisfaction Index encompasses all aspects of the customer experience[10] and can therefore be used as a reliable indicator for the presence or absence of respondent bias in the survey results as a whole. For these reasons, the Overall Satisfaction Index score is used as the main dependent variable in the margin of error analysis and subsequent t-test analyses included in this report.

3.3.1 Sampling Distribution
Respondent characteristics such as gender and age were compared to those of the total sample to determine whether respondents and non-respondents differed on key variables of interest.
Compared with the population of all eligible respondents (Access 10,000, Servicing 10,000),
survey respondents demonstrate the same gender characteristics. For Access, Table 3.3.1.e
below illustrates 49% of survey respondents were female and 51% were male, similar to the
total sample population. The distribution of age shows that survey respondents tend to be
older.
Table 3.3.1.e. Access: Comparing Gender and Age of Survey Respondents to Total Sample

                   Respondents (%)    Sample Size (N)    Total Sample (%)    Sample Size (N)    % Point Difference
Gender
Female                          49              1,284                  49              4,386                     0
Male                            51              1,328                  51              4,643                     0
Age Generation
Baby-Boomer                     23                663                  23              2,339                     0
Generations XYZ                  3                 95                   5                462                     2
Pre-Boomer                      74              2,105                  72              7,199                    -2

For Servicing, Table 3.3.1.s below illustrates 40% of survey respondents were female and 60%
were male, similar to the total sample population. The distribution of age shows that survey
respondents tend to be older.

[10] Explanation of J.D. Power Index Model calculation included in Methodology.

Table 3.3.1.s. Servicing: Comparing Gender and Age of Survey Respondents to the Total Sample

                   Respondents (%)    Sample Size (N)    Total Sample (%)    Sample Size (N)    % Point Difference
Gender
Female                          40                770                  37              3,242                    -3
Male                            60              1,167                  63              5,411                     3
Age Generation
Baby-Boomer                     33                712                  33              3,304                     0
Generations XYZ                  3                 71                   4                412                    -1
Pre-Boomer                      64              1,381                  63              6,284                     1

3.3.2 Distribution of Overall Satisfaction Index Scores
Following the comparison of sampling distributions, a comparison of Overall Satisfaction Index
scores was conducted to determine whether differences in age and gender among respondents
correlate with differences in Overall Satisfaction.
For Access, Table 3.3.2.e below indicates no differences in Overall Satisfaction Index scores
between gender groups (654 vs. 655). Comparing age groups reveals that Generations XYZ had
lower overall satisfaction compared with Pre- and Baby Boomers, although the sample size is
small for Generations XYZ (N=89) and may not be as representative.
Table 3.3.2.e. Access: Overall Satisfaction Scores for Gender and Age Groups

Characteristics       OSAT (mean)    Standard Deviation    Sample Size (N)
Gender
Female                        654                   219              1,132
Male                          655                   212              1,175
Age Generation
Baby-Boomer                   652                   227                593
Generations XYZ               606                   204                 89
Pre-Boomer                    655                   213              1,840

For Servicing, Table 3.3.2.s below indicates differences in Overall Satisfaction Index scores are
notable between genders. On average, females tend to rate their experience 29 index points
higher than males (735 vs. 706). Comparing age groups reveals that Pre-Boomers had the
highest overall satisfaction, with Baby-Boomers having much lower satisfaction.

Table 3.3.2.s. Servicing: Overall Satisfaction Scores for Gender and Age Groups

Characteristics       OSAT (mean)    Standard Deviation    Sample Size (N)
Gender
Female                        735                   202                649
Male                          706                   216              1,015
Age Generation
Baby-Boomer                   682                   225                626
Generations XYZ               725                   196                 60
Pre-Boomer                    734                   198              1,163

3.3.3 Analysis for Demographic Differences
T-test analyses were conducted to determine whether differences in demographic groups
produced statistical differences in Overall Satisfaction (OSAT) scores. T-tests are typically used
to determine whether or not the difference between two groups’ averages most likely reflect a
meaningful difference in the population from which the groups were sampled.
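As an illustration, the sketch below runs a two-sample t-test from summary statistics of the kind reported in this section, using the rounded Access gender figures from Table 3.3.2.e; because the inputs are rounded, the result only approximates the t-statistic the report computed on the raw data:

    # Two-group t-test on OSAT from summary statistics (rounded inputs).
    from scipy.stats import ttest_ind_from_stats

    t, p = ttest_ind_from_stats(mean1=654, std1=219, nobs1=1132,   # Female
                                mean2=655, std2=212, nobs2=1175)   # Male
    print(f"t = {t:.2f}, p = {p:.3f}")  # not significant (p >= .05)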
For Access, gender differences in Overall Satisfaction were not statistically significant. Demographics with too many missing values were excluded because they could not be used to conduct a meaningful statistical analysis (e.g., service discharge - 51% missing, benefit type - 59% missing).
Table 3.3.3a.e. Access: T-Test Analysis for Pairs of Characteristics in Veterans’ Satisfaction

Characteristics               T-Test Statistic    Statistical Difference (95% confidence level)
Gender: Female vs. Male                  -0.05    No

For Servicing, the differences for gender and benefit type were both statistically significant, such that females and recipients of the Pension benefit type had higher satisfaction, whereas Service Discharge showed no differences:
Table 3.3.3a.s. Servicing: T-Test Analysis for Pairs of Characteristics in Veterans’ Satisfaction

Characteristics                               T-Test Statistic    Statistical Difference (95% confidence level)
Gender: Female vs. Male                                   2.76    Yes
Service Discharge: Honorable vs. Other                    0.59    No
Benefit Type: Pension vs. Death Pension                   2.55    Yes

Analyses of Variance (ANOVA) were conducted to determine whether differences in
demographic groups produced statistical differences in Overall Satisfaction Index scores.
ANOVA analyses are typically used to determine whether or not the difference among three or
more groups’ averages most likely reflect a meaningful difference in the population from which
the groups were sampled.
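The sketch below illustrates the same idea with a one-way ANOVA on synthetic groups drawn to match the Access age-generation means, standard deviations, and sample sizes reported below; the resulting F statistic will only approximate the report's value:

    # One-way ANOVA across three groups (synthetic draws, not the raw data).
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(2)
    pre_boomer  = rng.normal(655, 213, 1840)
    baby_boomer = rng.normal(652, 227, 593)
    gen_xyz     = rng.normal(606, 204, 89)

    f_stat, p = f_oneway(pre_boomer, baby_boomer, gen_xyz)
    print(f"F = {f_stat:.2f}, p = {p:.3f}")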
For Access, differences in Overall Satisfaction Index score by age generation were not significant
(F = 2.24, p-value < .11).
Table 3.3.3b.e. Access: Overall Satisfaction for Age Generation

Generation         OSAT (mean)    Sample Size (N)
Pre-Boomer                 655              1,840
Baby Boomer                652                593
Generations XYZ            606                 89

For Access, differences in Overall Satisfaction Index score by region were significant (F = 4.25, p-value < .006), such that Northeast satisfaction was lower than in the other regions.
Table 3.3.3c.e. Access: Overall Satisfaction by Region

Regions      OSAT (mean)    Sample Size (N)
Midwest              667                660
Northeast            615                269
South                647              1,014
West                 661                552

For Servicing, differences in Overall Satisfaction Index score by age generation were significant
(F = 13.21, p-value = .0001) such that Baby-Boomer respondents had the lowest satisfaction:
Table 3.3.3b.s. Servicing: Overall Satisfaction for Generation

Generation         OSAT (mean)    Sample Size (N)
Pre-Boomer                 734              1,163
Baby Boomer                682                626
Generations XYZ            725                 60

For Servicing, differences in Overall Satisfaction Index score by region were significant (F = 4.38,
p-value < .005) such that Midwest Veterans had the highest satisfaction.

Table 3.3.3c.s. Servicing: Overall Satisfaction by Region

Regions      OSAT (mean)    Sample Size (N)
Midwest              742                562
Northeast            713                287
South                707                547
West                 698                452

For Servicing, differences in Overall Satisfaction Index score by branch of service were not
significant (F = 1.58, p-value = .178).
Table 3.3.3e.s. Servicing: Overall Satisfaction for Military Service Branch

Military Service    OSAT (mean)    Sample Size (N)
Army                        704                158
Air Force                   723              1,078
Marines                     728                143
Navy                        698                435
Other                       749                 35

For Servicing, differences in Overall Satisfaction Index score by benefit award level were significant (F = 10.59, p-value < .0001), such that those receiving $1,000 or less had the lowest satisfaction:
Table 3.3.3f.s. Servicing: Overall Satisfaction for Benefit Award Level

Benefit Award      OSAT (mean)    Sample Size (N)
$1,000 or less             697              1,053
$1,001-$1,500              742                734
$1,501 or more             744                 62

For Servicing, differences in Overall Satisfaction Index score by war period were significant (F =
10.12, p-value < .0001), such that Vietnam Era Veterans had the lowest satisfaction level:
Table 3.3.3d.s. Servicing: Overall Satisfaction by War Period

War Period          OSAT (mean)    Sample Size (N)
Gulf War                    702                 33
Korean Conflict             722                329
Vietnam Era                 685                722
World War I & II            744                765

3.3.4 Data Imputation Analysis for Demographic Differences
A pairwise comparison t-test analysis was conducted to evaluate whether data imputation for
missing values across significant demographic differences shown in section 3.3.3 would impact
Overall Satisfaction Index scores. This analysis included survey raking across demographic
differences as one level of comparison.
The results (Tables 3.3.4a.e and 3.3.4a.s) show that there were no significant differences between the non-imputed mean and the imputed mean of the Overall Satisfaction Index across demographics, sample sizes, and survey-raked values. We want to highlight that after statistical adjustment for the differences found between respondents and non-respondents reported earlier, there were no differences in overall satisfaction levels. These results support the conclusion that the survey findings for Veterans’ overall satisfaction ratings are accurate.
Table 3.3.4a.e. Access: Comparison of Imputed vs. Non-Imputed on Veterans’ Satisfaction

T-Tests on Imputed vs. Non-Imputed for Age, Gender, and Region

Overall Satisfaction Index (100 - 1000 range)                  Mean (imputed)    Mean (non-imputed)    t-statistic    p-value
Imputed demographics (2,522 final sample size)                         652.63                652.55          -0.01       0.99
Imputed survey-raked demographics (2,522 final sample size)            649.69                649.79          -0.02       0.99
Imputed survey-raked demographics (2,987 total respondents)            652.46                649.68          -0.47       0.63

Note: Non-imputed is based on the 2,522 final cleaned sample size used in this report.

Table 3.3.4a.s. Servicing: Comparison of Imputed vs. Non-Imputed on Veterans’ Satisfaction

T-Tests on Imputed vs. Non-Imputed for Age, Gender, Region, War Service, Benefit Award and Type

Overall Satisfaction Index (100 - 1000 range)                  Mean (imputed)    Mean (non-imputed)    t-statistic    p-value
Imputed demographics (1,849 final sample size)                         715.59                716.26           0.10       0.92
Imputed survey-raked demographics (1,849 final sample size)            712.61                713.45           0.12       0.90
Imputed survey-raked demographics (2,164 total respondents)            715.44                713.30          -0.33       0.74

Note: Non-imputed is based on the 1,849 final cleaned sample size used in this report.

Survey Raking for Sample Weights to Adjust for Differences and Compare Overall Satisfaction
and Advocacy Ratings
The procedure known as “raking” adjusts a set of data so that its marginal totals match specified control totals on a specified set of variables. The term suggests an analogy with the process of smoothing the soil in a garden plot by alternately working it back and forth with a rake in two perpendicular directions (Izrael and Battaglia, 2004).
Survey raking is an iterative sample-balancing algorithm-based technique that provides sample
weighting convergence across multiple variables and multiple categories (see Battaglia, Izrael,
Hoaglin, and Frankel (2009)).
In keeping with OMB “Standards and Guidelines for Statistical Surveys” Guidelines 3.2.12 and
3.2.13, JDP selected the best statistical method to simultaneously adjust for multiple
differences between groups by applying a survey raking procedure (see Anderson, L., and R.D.
Fricker, Jr. (2015)).
The JDP raking procedure is proprietary, representing an improved version based on the
excellent methods initially developed by Izrael and Battaglia (2000, 2004) and Battaglia, Izrael,
Hoaglin, and Frankel (2004). JDP raking improvements are primarily related to better handling
of low cell values during iterative convergence processing. For this analysis, 50 iterations were
set (although fewer were needed) to converge on the best sample weights (.2 estimation
margin) to simultaneously adjust for non-response bias in age, race, region, and war (service
era) demographic categories. For additional background on survey raking methodologies, see
Wallace and Rust (1996).
The estimated population distributions are used as convergence targets. In this case, the
dataset of all eligible respondents for Access (10,000) and Servicing (10,000) was used as the
estimated population to derive sample weightings for the Access survey respondents (2,987)
and the Servicing survey respondents (1,849).
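
To make the mechanics concrete, the sketch below shows a generic iterative proportional fitting (raking) loop of the kind described above. It is a minimal illustration, not JDP's proprietary procedure: the function, the two example margins (age group and region), and the toy inputs are assumptions for demonstration, with the iteration cap and 0.2 tolerance taken from the settings noted above.

```python
import pandas as pd

def rake(df, margins, weight_col="weight", max_iter=50, tol=0.2):
    """Generic raking (iterative proportional fitting) sketch.

    `margins` maps a demographic column name to a pandas Series of target
    population counts for each category of that column. Weights are rescaled
    one margin at a time until every weighted margin matches its target
    within `tol`.
    """
    df = df.copy()
    df[weight_col] = 1.0  # start from uniform weights
    for _ in range(max_iter):
        for col, targets in margins.items():
            current = df.groupby(col)[weight_col].sum()      # weighted totals now
            df[weight_col] *= df[col].map(targets / current)  # rescale to target
        # after a full pass, check how far each margin still is from its target
        gaps = [(t - df.groupby(c)[weight_col].sum()).abs().max()
                for c, t in margins.items()]
        if max(gaps) < tol:
            break
    return df

# Hypothetical example: balance a tiny sample to assumed population margins.
sample = pd.DataFrame({
    "age_group": ["<65", "65+", "65+", "<65", "65+"],
    "region":    ["East", "West", "East", "West", "East"],
})
targets = {
    "age_group": pd.Series({"<65": 40.0, "65+": 60.0}),
    "region":    pd.Series({"East": 55.0, "West": 45.0}),
}
print(rake(sample, targets))
```

Because each pass rescales one margin at a time, the most recently adjusted margin matches exactly while earlier ones drift slightly; iterating until all gaps fall under the tolerance is what produces a single weight per respondent that satisfies every margin simultaneously.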
In accordance with OMB “Standards and Guidelines for Statistical Surveys” Guideline 3.2.13, a
series of t-tests were conducted to determine whether non-response bias in demographic areas
produced statistical differences in overall satisfaction scores and advocacy ratings. Typically,
t-tests are used to determine whether differences between the averages and variances of two
groups reflect a meaningful difference in the population. The sample weightings derived from
the survey raking procedure were included in the t-tests to equalize the survey respondent
differences with non-respondents.
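
As an illustrative sketch of how raked weights can enter such a test, the function below computes a Welch-style t-test from weighted means and variances, using Kish effective sample sizes in place of raw counts. The function name and toy inputs are assumptions; it stands in for, rather than reproduces, the exact tests JDP conducted.

```python
import numpy as np
from scipy import stats

def weighted_welch_ttest(x1, w1, x2, w2):
    """Welch-style t-test on two samples with survey weights (sketch only)."""
    def wstats(x, w):
        m = np.average(x, weights=w)              # weighted mean
        v = np.average((x - m) ** 2, weights=w)   # weighted variance
        n_eff = w.sum() ** 2 / (w ** 2).sum()     # Kish effective sample size
        return m, v, n_eff

    m1, v1, n1 = wstats(x1, w1)
    m2, v2, n2 = wstats(x2, w2)
    se = np.sqrt(v1 / n1 + v2 / n2)
    t = (m1 - m2) / se
    # Welch-Satterthwaite degrees of freedom
    df = se ** 4 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    p = 2 * stats.t.sf(abs(t), df)                # two-sided p-value
    return t, p

# Toy comparison of unweighted vs. raked-weight scores on the same sample.
rng = np.random.default_rng(0)
scores = rng.normal(653, 216, size=500)   # toy scores on the 100-1000 index scale
raked_w = rng.uniform(0.5, 2.0, size=500)  # stand-in raked weights
print(weighted_welch_ttest(scores, np.ones(500), scores, raked_w))
```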
For Access, there were no significant differences in Overall Satisfaction Index score or Advocacy
levels when the data was adjusted for demographic differences between survey respondents
and non-respondents. These results support the conclusion that the survey findings for
Veterans’ overall satisfaction ratings are accurate:


Table 3.3.4b.e. Access: Overall Satisfaction and Advocacy for Survey Respondents (Unweighted and Weighted)
Analysis of Survey Respondent Scores with Weighted Adjustment for Non-Response Bias

Rating Measure                                  Mean          Mean        Std. Dev.     Std. Dev.   t-statistic   p-value
                                                (Unweighted)  (Weighted)  (Unweighted)  (Weighted)
Overall Satisfaction Index (100 - 1000 range)   653           650         216           218          0.45         0.65
Likelihood to inform others about
VA benefits (rating 1 - 4)                      3.41          3.42        0.70          0.69        -0.54         0.59

For Servicing, there were no significant differences in Overall Satisfaction Index score or
Advocacy levels when the data was adjusted for demographic differences between survey
respondents and non-respondents. These results support the conclusion that the survey
findings for Veterans’ overall satisfaction ratings are accurate:
Table 3.3.4b.s. Servicing: Overall Satisfaction and Advocacy for Survey Respondents (Unweighted and Weighted)
Analysis of Survey Respondent Scores with Weighted Adjustment for Non-Response Bias

Rating Measure                                  Mean          Mean        Std. Dev.     Std. Dev.   t-statistic   p-value
                                                (Unweighted)  (Weighted)  (Unweighted)  (Weighted)
Overall Satisfaction Index (100 - 1000 range)   716           713         209           209          0.41         0.68
Likelihood to inform others about
VA benefits (rating 1 - 4)                      3.37          3.38        0.72          0.70        -0.84         0.40


Findings
Results of the non-response bias analysis indicate that the Overall Customer Satisfaction Index
score and the Advocacy ratings in the Pension Access and Pension Servicing studies reflect the
experience of all Veterans and beneficiaries who received a decision for their application for
pension benefits and those who have been receiving pension benefits.

Sample Cleaning: Initial comparisons of Age, Gender, and Region characteristics between
the total records provided and the records available after cleaning (see Survey Yield, Section
3.1) suggest the sample utilized in the study exhibits characteristics similar to those of the total
sample. Additional comparisons (see Margin of Error and Sampling Distribution, Section 3.3)
suggest the sample cleaning rules did not impact the sample’s representativeness and did not
bias the results.

Non-Response Bias Analysis: Results of the non-response bias analysis did show group
differences for Age, Gender, Region, War Service, Benefit Award and Type between survey
respondents and non-respondents. After correcting for these differences using a
recommended sample-balancing survey raking method to derive sample weights (see Section
3.3.4, Data Imputation Analysis for Demographic Differences), there were no differences found
in Veterans’ overall satisfaction and advocacy (likelihood to inform others about VA benefits)
between weighted and unweighted survey respondents.

Item Response Rate Calculations: Results from the survey item response rate
calculations indicate high item response rates, with none falling below OMB guidelines (see
Appendix B for Item Response Rates). According to OMB Guideline 3.2.10, given that neither
study had an item response rate lower than 70%, a non-response bias analysis was not
necessary at the item level.
The research and approach taken by JDP are in accordance with sound market research and
current best practices from the American Association for Public Opinion Research (AAPOR)
regarding response rate recommendations: “Results that show the least bias have turned out,
in some cases, to come from surveys with less than optimal response rates. Experimental
comparisons have also revealed few significant differences between estimates from surveys
with low response rates and short field periods and surveys with high response rates and long
field periods.” See AAPOR “Response Rates – An Overview” (2015) and Special Issue of Public
Opinion Quarterly "Nonresponse Bias in Household Surveys" (Singer, 2006).


Conclusion
The Overall Customer Satisfaction Index score and Advocacy rating (likelihood to inform others
about VA benefits) are not impacted in any meaningful way by non-response bias. This analysis
confirms that the data collected during FY15 is valid.
The FY15 Voice of the Veteran Line of Business Tracking Satisfaction Study data for both the
Pension Access survey and the Pension Servicing survey can be used to infer reliable Overall
Customer Satisfaction Index scores and Advocacy ratings. The Overall Customer Satisfaction
Index score reflects the experience of all Veterans and beneficiaries who received a decision for
their application for pension benefits and those who have been receiving pension benefits.
The sample utilized in the study exhibits similar characteristics for Age, Gender, and Region as
the total sample provided by Pension Service. This indicates the sample cleaning rules did not
impact the sample’s representativeness.
While the results from the non-response bias analysis did show group differences in
demographic characteristics between survey respondents and non-respondents, there were no
differences found in Veterans’ overall satisfaction and advocacy ratings between weighted and
unweighted survey respondents once those differences were corrected using a recommended
sample-balancing survey raking method to derive sample weights. JDP conducted all necessary
statistical tests in accordance with OMB standards.
J.D. Power certifies the results contained within this report.


References
Anderson, L., and R.D. Fricker, Jr. (2015). “Raking: An Important and Often Overlooked Survey Analysis
Tool.” Phalanx. http://faculty.nps.edu/rdfricke/docs/Analysis%20process_v4.pdf
American Association for Public Opinion Research (2008). “Standard Definitions: Final Disposition of
Case Codes and Outcome Rates for Surveys.” Ann Arbor, Michigan: AAPOR.
(http://www.aapor.org/AAPORKentico/AAPOR_Main/media/MainSiteFiles/Standard_Definitions_07
_08_Final.pdf)
American Association for Public Opinion Research (2015). “Response Rates – An Overview.”
http://www.aapor.org/AAPORKentico/Education-Resources/For-Researchers/Poll-SurveyFAQ/Response-Rates-An-Overview.aspx
Battaglia, Michael P., Izrael, David, Hoaglin, David C., and Frankel, Martin R. (2004). “To Rake or Not To
Rake Is Not the Question Anymore with the Enhanced Raking Macro.” Proceedings of the 29th Annual
SAS Users Group International Conference, Paper 207.
Battaglia, Michael P., Izrael, David, Hoaglin, David C., and Frankel, Martin R. (2009). “Practical
Considerations in Raking Survey Data.” Survey Practice, Vol 2, No. 5.
Baum, Herbert M., Chandonnet, Anna, Fentress, Jack, and Rasinowich, Colleen (2012). “Mixed-Mode
Methods for Conducting Survey Research.” Data Recognition Corporation.
http://www.datarecognitioncorp.com/survey-services/Documents/Mixed-Mode-Methods-forConducting-Survey-Research.pdf
Dillman, D. A. and J.D. Power (2015). Conference call discussion on non-response bias, avoidance
methods, and post-hoc sample weighting. Conference call between Dr. Dillman and JDP (Greg Truex,
Jay Meyers, Ph.D., Lee Quintanar, Ph.D.), May 20, 2015 (2pm PDT).
Dillman, D. A. (2014). Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method.
Fourth Edition. John Wiley & Sons, Inc: New York.
Ellis, J. M. (2000). “Estimating the Number of Eligible Respondents for a Telephone Survey of
Low-Incidence Households.” Paper presented at the annual meeting of the American Association for
Public Opinion Research, Portland, OR, May 21, 2000.
Federal Committee on Statistical Methodology. “Statistical Policy Working Paper 31, Measuring and
Reporting Sources of Error in Surveys” (2001). Washington, D.C.
Izrael, David, Hoaglin, David C., and Battaglia, Michael P. (2000). “A SAS Macro for Balancing a Weighted
Sample.” Proceedings of the Twenty-Fifth Annual SAS Users Group International Conference, Cary,
NC. Paper 275.
Izrael, David, Hoaglin, David C., and Battaglia, Michael P. (2004). “Tips and Tricks for Raking Survey Data
(a.k.a. Sample Balancing).” Proceedings of the 2004 American Association for Public Opinion
Research (AAPOR) Conference, Cambridge, MA.


Malhotra, N.K., and Birks, D.F. (2007). “Marketing Research: An Applied Approach,” 3rd edition. Prentice
Hall/Financial Times: England.
Pierchala, Carl E. (2001). “PROC MI® as the Basis for a Macro for the Study of Patterns of Missing Data.”
Northeast SAS Users Group. http://www.lexjansen.com/nesug/nesug03/st/st009.pdf
Singer, E. (2006). “Special Issue: Nonresponse Bias in Household Surveys.” Public Opinion Quarterly, Vol
70, Issue 5.
U.S. Office of Management and Budget (1990). “Survey Coverage.” Statistical Policy Working Paper 17,
Washington, D.C.
U.S. Office of Management and Budget Publication (January 2006). “When Designing Surveys for
Information Collections.” The Office of Management and Budget, 725 17th Street, NW. Washington,
D.C. 20503.
U.S. Office of Management and Budget Publication (September 2006). “Standards and Guidelines for
Statistical Surveys.” The Office of Management and Budget, 725 17th Street, NW. Washington, D.C.
20503.
U.S. Office of Management and Budget Publication (2008). VBA Pension OMB - Part B Supporting
statement for “Collections of Information Employing Statistical Methods.” Washington, D.C.
Vogt, W. Paul, Vogt, Elaine R., Gardner, Dianne C., and Haeffele, Lynne M. (2014). “Selecting the Right
Analyses for Your Data: Quantitative, Qualitative, and Mixed Methods.” Guilford Press, New York, NY.
Wallace, Leslie and Rust, Keith (1996). “A Comparison of Raking and Poststratification Using 1994 NAEP
Data.” Leslie Wallace, West Inc., 584-589.


Appendix A
Missing Data Patterns and Mechanisms
An excellent discussion of missing data patterns, mechanisms, and research analysis methods is
provided in Vogt, W. Paul, Vogt, Elaine R., Gardner, Dianne C., and Haeffele, Lynne M. (2014).
An overview of the missing data types and issues is described below:
Understanding the reasons why data is missing can help with analyzing the remaining data. If
values are missing at random, the data sample may still be representative of the population.
However, if the values are missing systematically, analysis may be harder.

• Missing completely at random (MCAR). Values in a data set are MCAR if the events that
  lead to any particular data item being missing are independent both of observable variables
  and of unobservable parameters of interest, and occur entirely at random. When data are
  MCAR, the analyses performed on the data are unbiased; however, data are rarely MCAR.
• Missing at random (MAR). MAR occurs when the missingness is related to a particular
  variable, but it is not related to the value of the variable that has missing data. An example
  of this is accidentally omitting an answer on a questionnaire.
• Missing not at random (MNAR). MNAR data are missing for a specific reason (i.e., the value
  of the variable that is missing is related to the reason it is missing). An example of this is
  when a certain question on a questionnaire tends to be skipped deliberately by participants
  with certain characteristics. Graphical models can be used to describe the missing data
  mechanism in detail.

While it is clear that MNAR can introduce statistical bias, there is no definitive test for it (see
Vogt et al., 2014). It is also clear that MCAR is rarely evident in research data, and most tests of
it will fail. However, MAR is fully acceptable for valid statistical analyses (Vogt et al., 2014).
MAR is essentially “missing partially at random,” whereby the intra-group missingness remains
random despite some differences between group tendencies. Graphical data representations
are the typical assessment tool, as described above and in Pierchala (2001).
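
To illustrate why the mechanism matters, the short simulation below (a hypothetical sketch; the score distribution and missingness probabilities are invented for demonstration) drops values under MCAR and MNAR assumptions and compares the observed means: MCAR leaves the mean essentially unchanged, while MNAR biases it.

```python
import numpy as np

rng = np.random.default_rng(42)
scores = rng.normal(700, 200, size=10_000)  # toy satisfaction scores

# MCAR: every value has the same 30% chance of being missing
mcar_mask = rng.random(10_000) < 0.30

# MNAR: dissatisfied respondents (low scores) are far more likely to skip
mnar_mask = rng.random(10_000) < np.where(scores < 600, 0.55, 0.10)

print(f"Full mean:           {scores.mean():.0f}")
print(f"Observed under MCAR: {scores[~mcar_mask].mean():.0f}")  # close to the full mean
print(f"Observed under MNAR: {scores[~mnar_mask].mean():.0f}")  # biased upward
```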
See Section 3.2 Missing Data Patterns and Mechanisms for findings specific to Pension’s data.


Appendix B
Item Response Rates
In accordance with OMB “Standards and Guidelines for Statistical Surveys,” Section 3.2,
Guidelines 3.2.6-3.2.7, the item response rate was calculated as the ratio of the number of
respondents for whom an in-scope response was obtained to the number of respondents who
were asked to answer that item. The number asked to answer an item is the number of
unit-level respondents minus the number of respondents with a valid skip pattern. In addition
to the item response rate, the total item response rate was calculated as the product of the
overall unit response rate and the item response rate for each item. The purpose of these
calculations is to assess item non-response, which occurs when one or more survey items are
left blank in an otherwise completed questionnaire. Tables B1.e and B1.s display the item and
total item response rates for these surveys.
The OMB “Standards and Guidelines for Statistical Surveys” Guideline 3.2.10 states an item
non-response analysis should be conducted for items with an item response rate of less than
70%. Since none of the survey item response rates falls below 70% for either Access or
Servicing, an item-level analysis of non-response bias was not necessary. The Access item
response rates range from 84% to 100%, with a 95% average, while Servicing item response
rates range from 83% to 100%, with a 96% average.
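
As a worked sketch of the two formulas above (the counts are invented, chosen to roughly mirror question 1 of the Access table):

```python
# Hypothetical counts for one survey item, for illustration only.
unit_respondents = 2987    # usable unit-level returns
valid_skips = 150          # respondents routed past the item by a valid skip pattern
in_scope_answers = 2412    # respondents who gave an in-scope answer
unit_response_rate = 0.26  # overall unit response rate for the survey

asked = unit_respondents - valid_skips              # number asked to answer the item
item_rate = in_scope_answers / asked                # item response rate (about 85%)
total_item_rate = unit_response_rate * item_rate    # total item response rate (about 22%)
print(f"Item response rate: {item_rate:.0%}, total item response rate: {total_item_rate:.0%}")
```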

Table B1.e. Access Item and Total Item Response Rate (see note below)

Question   Item Response   Total Item
Number     Rate            Response Rate
1          85%             22%
2          100%            26%
3          98%             26%
4          99%             26%
5a         98%             26%
5b         98%             26%
5c         97%             26%
5d         96%             25%
5e         94%             25%
5f         97%             25%
6          100%            26%

Note: The open capture question for additional comments about experience and the e-mail opt-in
questions display “N/A” and were not included in item and total item response rate calculations.


Table B1.e. Access Item and Total Item Response Rate (Continued)

Question   Item Response   Total Item
Number     Rate            Response Rate
7          94%             25%
8          98%             26%
9          92%             24%
10         100%            26%
11         93%             24%
12         96%             25%
13         100%            26%
14         96%             25%
15         87%             23%
16         100%            26%
17         99%             26%
18         100%            26%
19         100%            26%
20         96%             25%
21         97%             26%
22         100%            26%
23         100%            26%
24         89%             23%
25a        95%             25%
25b        94%             25%
25c        90%             24%
25d        96%             25%
26         N/A             N/A
27         100%            26%
28         95%             25%
29a        85%             22%
29b        84%             22%
29c        87%             23%
30         96%             25%
31         92%             24%
32         96%             25%
33         N/A             N/A
34         N/A             N/A
35         N/A             N/A


Table B1.s. Servicing Item and Total Item Response Rate (see note below)

Question   Item Response   Total Item
Number     Rate            Response Rate
1          83%             15%
2          99%             18%
3          99%             18%
4          98%             18%
5a         99%             18%
5b         99%             18%
5c         98%             18%
5d         97%             18%
5e         97%             18%
5f         98%             18%
6          100%            18%
7          93%             17%
8          97%             18%
9          93%             17%
10         96%             18%
11         98%             18%
12         95%             18%
13         99%             18%
14         93%             17%
15         92%             17%
16         94%             17%
17         97%             18%
18         100%            18%
19         N/A             N/A
20         N/A             N/A
21         94%             17%
22         98%             18%
23         96%             18%
24a        94%             17%
24b        85%             16%
24c        93%             17%
25         98%             18%
26         96%             18%
27         98%             18%
28         N/A             N/A

Note: Open capture questions for additional comments about experience, items unclear in the letter,
and e-mail opt-in questions display “N/A” and were not included in item and total item response rate
calculations.


Table B1.s. Servicing Item and Total Item Response Rate (Continued)

Question   Item Response   Total Item
Number     Rate            Response Rate
29         N/A             N/A
30         N/A             N/A

In the item response rate calculations above, JDP counted blanks as non-response for mail
returns; for online returns, both blanks and “don’t know” selections were counted as
non-response, since respondents are forced to select a response in the online survey.
Similarly, “N/A” responses were also counted as non-response for rating questions in online
returns. Respondents taking the survey online must answer each question before proceeding
to the next question, so “Not Applicable” or “N/A” could mean either that the respondent was
answering “N/A” to the question or that the respondent did not wish to answer it. Therefore,
this response option was included as non-response.


Appendix C
Study Overview
1.1 Study Background
The Voice of the Veteran Satisfaction Initiative tracks Veteran satisfaction with the benefits and
services received from VBA. The VOV Line of Business Tracking Satisfaction Research Study is
ongoing survey research that tracks Veteran satisfaction with the following VBA lines of
business: Compensation, Pension, Education, Vocational Rehabilitation & Employment (VR&E),
and Loan Guaranty (LGY).
As part of Executive Order 13571, Streamlining Service Delivery and Improving Customer
Service, agencies that provide significant services directly to the public are required to identify
and survey customers, establish service standards and track performance against those
standards, and benchmark customer service against the best in business. This program enables
VBA to understand what is important to Veterans relative to benefits received and services
provided, and it provides timely and actionable Veteran feedback on how well VBA is delivering
services. Insights from this program identify opportunities for improvement, measure the
impact of improvement initiatives, and continuously measure performance outcomes.
The Pension survey instrument measures Veteran satisfaction with access and receipt of
benefits issued by VBA. In FY15, fielding occurred continuously on a monthly basis for Access
and annually for Servicing. Surveys remained open in field until the end of each quarter. If any
surveys were received after a quarter closed field, then those returns were counted in the next
quarter’s number of returns.

Survey      Methodology   Fielding Frequency   Total Mailouts Per Year   Target Number of Completes
Access      Mail Only     Monthly              10,000                    3,000
Servicing   Mail Only     Annually             10,000                    3,000

1.2 Methodology
Respondents had the option to complete the survey via paper instrument. Respondents were
sent a survey packet containing a cover letter, survey, and a business reply envelope.
Approximately 3 weeks after deployment of the first survey packet, a second survey packet was
mailed; the second mailing list was cleaned to exclude anyone who had completed the survey
at least 1 week prior to the second survey packet mailing.


Sample Population Definition
The targeted populations were identified by Pension Service. For Access, the target population
is defined as Veterans and beneficiaries who received a decision on a pension benefit claim
within the past 30 days. These individuals may include those who were found ineligible on a
new claim and those who had been denied and are not appealing the decision.
For Servicing, the target population is defined as Veterans and beneficiaries who received a
decision or are receiving benefit payments.
Sample File Generation
• Pension generates the sample files based upon the sampling definitions and submits sample
  files directly to BAS.
• BAS receives the sample files and sends them to VADIR for processing.
• VADIR processes the sample files (to remove SSN and append demographics/EDIPI) and
  returns them to BAS.
• BAS transfers sample files (via the EDX platform) to JDP and notifies JDP via email that
  sample files are ready for deployment.
• JDP cleans the sample file and selects the sample.
• Sample is transferred to the Government Printing Office (GPO) print vendor (via the EDX
  platform) for printing and mailing of the survey packages.

Sample is transferred in accordance with the attached production schedule:
VOV_LOB Tracking_Production Schedule_10.06.15.pdf

1.3 Data Cleaning
JDP processed the sample according to the following cleaning rules:
1. Eliminate duplicate records within each business line and across surveys based on the
unique identifier (EDI_PI or VA_ID) for each record. Note: EDIPI is Electronic Data
Interchange Personal Identifier.
a) Exception: For Pension Access (v1) and Pension Servicing (v8), eliminate duplicate
records based on EDI_PI and Claim Number.
b) When each new sample file is received, JDP cleans it against all sample selected from
every sample batch delivered in the prior 12 months to ensure a respondent does not
receive a VA line of business survey more than once in a 12-month period. In the case of
duplicates occurring within the same sample month, priority is assigned to business lines
with the lowest number of sample records.
2. Clean out records present on the JDP Do Not Contact list and clean against the National
Change of Address (NCOA) list.
3. Clean out records for any respondents who do not have any EDI_PI or VA_ID included in
their sample record.
a) Exception: For Pension Access (v1) and Pension Servicing (v8), clean out records with
blank EDI_PI and Claim Number.

4. Clean out any respondents not specified as a dependent/spouse who have a date of death
(DOD) in their sample record.
5. Clean out any respondents who do not have any address included in their sample record.
6. Assign and maintain unique sampling identifiers to each sample record in order to track
history of sampling. Exclude records that have been sampled in the past 12 months to
ensure no respondent is mailed surveys more than once in a 12-month time frame. This rule
may not apply to those who completed a survey.
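
A condensed sketch of rules 1 through 5 in pandas follows. The column names are drawn from the Pension file layout shown later in this appendix, but the function and its list inputs are hypothetical stand-ins, not the actual JDP pipeline.

```python
import pandas as pd

def clean_sample(df, do_not_contact, sampled_last_12mo):
    """Illustrative application of the cleaning rules (hypothetical columns)."""
    df = df.drop_duplicates(subset=["EDI_PI"])              # Rule 1: duplicate records
    df = df[~df["EDI_PI"].isin(sampled_last_12mo)]          # Rules 1b/6: sampled in prior 12 months
    df = df[~df["EDI_PI"].isin(do_not_contact)]             # Rule 2: Do Not Contact list
    df = df[df["EDI_PI"].notna() | df["VA_ID"].notna()]     # Rule 3: keep only records with an identifier
    df = df[df["DATE_OF_DEATH"].isna()                      # Rule 4: drop deceased respondents,
            | (df["BENEFICIARY_TYPE"] == "DEPENDENT/SPOUSE")]  #     unless a dependent/spouse record
    return df[df["ADDRESS_1"].notna()]                      # Rule 5: drop records with no address
```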

1.4 Order generation and fulfillment process
Federal Acquisition Regulation (FAR) 8.8 mandates that government agencies solicit all printing
requirements through the Government Printing Office (GPO). GPO utilizes print vendors to
fulfill orders. A Data Transfer Agreement (DTA) must be in place with the print vendor and
contractor before BAS can obligate funds or transfer sample files to the print vendor and
contractor.
Prior to mailing the mail surveys, print orders must be generated for each survey. The entire
process may take up to 2-4 weeks from inception of the print order to the mailing of the survey
package or postcard. Below are the steps involved in order generation and order fulfillment.
Order generation
• After sample is received by JDP, the sample files are cleaned and selected. Letter Work
  Orders (LWOs) are then created to provide the print vendor with the necessary information
  to match the sample files to the correct survey instrument. (1 day)
• JDP creates the print order and sends it to the BAS Contracting Officer’s Representative
  (COR). (Same day as above step)
• The COR reviews, authorizes, and submits the print order. (1 day)
• The BAS Publication Officer and/or COR submits the orders to the VA Publications Services
  Division (VAPSD). (Same day as above step)
• The order is issued a control number by a VBA Management Analyst, Publications. (Variable
  timing)
• Once the control number is assigned, the order goes to the VA Publications Services Division
  liaison to forward to the GPO Contracting Officer. (Variable timing) Note: the amount of
  time an order is with VAPSD varies greatly and could range from 3 to 20 days.
• The GPO Contracting Officer sends the printing and mailing order to the print vendor.

Order fulfillment
• Once the order is placed, the GPO print vendor is allotted 9 business days to fulfill the order
  (2 days to generate proofs, 2 days for proof review/correction, and 5 days to print and mail).
• Upon receipt of the proofs from the print vendor, JDP reviews and approves; then BAS
  reviews and approves; then VAPSD reviews and approves.
• After the orders have been mailed, the print vendor provides the mail receipts to the
  contractor, BAS, and VAPSD.
• Upon order completion, VAPSD provides actual costs to BAS.

1.5 Reporting
Reporting occurs four times yearly for the Access Process survey.
On a quarterly basis, the following deliverables are provided:
• Scorecard
• Data matrices
• Data loaded to the VOV reporting site
• Open-ended comments (verbatims)
On a semiannual (twice yearly) basis, the following deliverable is provided:
• Data and Analysis Presentation
Reporting occurs once annually for the Servicing Process survey.
On an annual basis, the following deliverables are provided:
• Scorecard
• Data matrices
• Data loaded to the VOV reporting site
• Open-ended comments (verbatims)
• Data and Analysis Presentation


Sample Plan Overview
2.1 Sample Criteria
VBA was responsible for providing sample to JDP that meets the following sampling criteria:

Sample Population: Access Survey
Inclusion Criteria: For Access, the target population is defined as Veterans and beneficiaries
who received a decision on a pension benefit claim within the past 30 days. These individuals
may include those who were found ineligible on a new claim and those who have been denied
and are not appealing the decision.
Frequency of Data Request: Monthly

Sample Population: Servicing Survey
Inclusion Criteria: For Servicing, the target population is defined as Veterans and beneficiaries
who received a decision or are receiving benefit payments.
Frequency of Data Request: Annually

2.2 Fielding/Sampling Frequency

Survey             Methodology   Total Survey   Targeted Number   Number of Postcards   Number of Mail   Fielding
Instrument                       Instruments    of Completes      (eSurvey)             Packages         Frequency
Access Survey      Mail Only     10,000         3,000             N/A                   10,000           Monthly
Servicing Survey   Mail Only     10,000         3,000             N/A                   10,000           Annually

2.3 Data Transfer
The sample was posted by BAS once a month within the sampling folder on the VOV EDX site.
Sample was provided in a file layout consistent with the file layout provided for the study as
outlined below.


Pension File Layout
ADDRESS_1
ADDRESS_2
AGE
AID_ATTENDANCE_HOUSEBOUND
AMOUNT_AWARDED
BENEFICIARY_TYPE
BENEFIT_TYPE
BRANCH_OF_SERVICE
CHARACTER_OF_DISCHARGE
City
CLAIM_NUMBER
CLOTHING_ALLOWANCE
CURRENT_CLAIM_STATUS
DATE_OF_APPLICATION
DATE_OF_BIRTH
DATE_OF_DEATH
EMAIL_ADDRS_TXT
ENTITLEMENT_CODE
ENTITLEMENT_DATE
EOD
EVALUATION
FIRST_NAME

GENDER
HOMELESS
INDIVIDUAL_UNEMPLOYABILITY
LAST NAME
LATEST_END_PRODUCT
NUM_DISABILITIES_CLAIMED
NUMBER_OF_DEPENDENTS
PAYEE_CODE
PERIOD_OF_SERVICE
PHONE
RAD
REASON_CODE
REGIONAL_OFFICE_CODE
SERVICE_REPRESENTATIVE
SSN
STATE
ZIP


Pension File Layout (Continued)
DATE AWARD
METHOD OF APPLICATIONS
NUMBER OF APPLICATIONS
DEVELOPMENT INITIATED
NUMBER OF APPEALS
COMPENSATION AWARDED
PENSION AWARDED
PRIOR EDUCATION LEVEL
SVC_CD

OEF_OIF_IND
RACE_CD
CHAR_SVC_CD
DPV_Code

2.4 Sample Cleaning Rules Glossary
Duplicate records in sample file – The record is cleaned out if there is more than one record
within the same sample file for the same respondent.
Duplicate record history – The record is cleaned out if the record has been selected within the
past 12 months for any of VBA’s business line surveys (i.e., Compensation, Pension, Education,
Home Loan Guaranty, and Vocational Rehabilitation), regardless of whether the respondent
completed the survey.
Invalid address – The record is cleaned out if JDP’s address verification software indicates an
invalid address code.
Invalid values – The record is cleaned out if the “VA_ID” field is blank.
Blanks – The record is cleaned out if the “Name” field corresponding to the record is blank.
Do not contact – The record is cleaned out if the individual is listed on JDP’s Do Not Contact list.

2.5 Sample Selection
JDP selected sample records following the completion of the sample cleaning process. The
following guidelines are referenced when selecting sample:
1. Total Sampling Targets: The table below summarizes the total sampling target per RO per
fielding period; a sketch of this allocation logic follows the list. The “Sampling Target per RO”
column indicates the minimum number of sample records that should be selected per RO for
each survey. If this minimum target number cannot be reached for a particular RO, sample
from a different RO will be selected to make up the difference.

Survey             Frequency   Total Sampling   Sampling Target   Sampling Target   Number of
                               Target           Per Time Period   Per RO            ROs
Access Survey      Monthly     10,000           833               278               3
Servicing Survey   Annually    10,000           10,000            3,333             3

Note: JDP did not receive Regional Office information in the Compensation sample files.

2. The same record cannot be selected for multiple surveys during the same wave.
Respondents who have completed a survey within the past 12 months cannot be selected.
Survey priority is based on the number of records in each sample file. The survey with the
smallest number of records is given first priority.
3. Following sample selection, the JDP project team receives an automated report confirming
the number of records selected for each survey version. The JDP project team verifies that
the sample selection quantities reflect the sample targets and approves the sample file for
fielding.
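
As referenced in guideline 1 above, a simplified sketch of the per-RO allocation logic is shown below. The function, the RO labels, and the availability counts are hypothetical, and the real selection also enforces the 12-month and cross-survey rules described in guideline 2.

```python
def allocate_by_ro(available_by_ro, target_per_ro):
    """Take up to target_per_ro records from each RO; cover any shortfall
    from ROs with surplus records (illustrative only)."""
    take = {ro: min(n, target_per_ro) for ro, n in available_by_ro.items()}
    shortfall = sum(target_per_ro - t for t in take.values())
    for ro, n in available_by_ro.items():
        if shortfall <= 0:
            break
        extra = min(n - take[ro], shortfall)  # surplus records at this RO
        take[ro] += extra
        shortfall -= extra
    return take

# Hypothetical monthly Access pull: minimum 278 records per RO.
print(allocate_by_ro({"RO-1": 500, "RO-2": 150, "RO-3": 400}, 278))
# -> {'RO-1': 406, 'RO-2': 150, 'RO-3': 278}; RO-2's shortfall of 128 is covered by RO-1
```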

2.6 Data Collection
During the survey fielding period, both online survey returns and paper surveys are collected as
they are received and posted on a secure EDX site. Responses from paper surveys are scanned
through automated imaging software while verbatim responses are recorded by a live survey
processor. Survey returns must have all pages intact in order to be processed and counted as a
return. Surveys with missing pages are counted as unusable. Returns are also considered
unusable if there is an indication that the individual completing the survey is not the individual
selected from the sample file (i.e., the respondent name and/or address on the survey is
replaced with a different name and/or address). During each day of fielding, a subset of survey
returns undergoes quality assurance to validate the accuracy of responses captured. If duplicate
surveys are returned (as identified by the unique sampling identifier assigned to each sample
record), the original survey return is processed while the duplicate survey is removed. In the
case of duplicate survey returns from mixed methodology surveys, the date the survey was
received is used to identify the original return while the subsequent return is removed post
fielding.


APPENDIX D
Approaches to Mitigating the Effect of Non-Response Bias and Strategies to Improve the
Response Rate
The following section outlines two approaches used in FY15 to mitigate the potential of
non-response bias. As mentioned earlier in the report, J.D. Power affirms that while high
response rates are always desirable in surveys, an 80% response rate is typically not achievable
for a voluntary customer-satisfaction survey instrument (Malhotra & Birks, 2007), particularly
one that does not provide an incentive (not recommended for this program). To illustrate this
point, the Dillman Method for survey fielding, discussed in Dillman (2014), utilized a survey
instrument fielded to 600 students at the University of Washington. After five attempts to
solicit a response in a closed university setting, as well as offering a monetary incentive to
complete the survey, only a 77% response rate was achieved.
The first approach to minimize non-response occurs before and during data collection and
involves introducing measures to maximize survey response rates. The second approach is to
make statistical adjustments after the data is collected.

1.1 Approach 1: Strategies to Maximize Response Rates
Prior to, and during, fielding the survey, JDP implemented the following measures to reduce the
chances of non-response:
• Respondents were provided with the promise of confidentiality on the survey cover letter
  and postcard, and assured that their survey responses would not impact their current or
  future eligibility for benefits.
• Following the first mailing, non-respondents were sent an additional survey mailing.
• Respondents were provided with a toll-free telephone number and dedicated e-mail
  address to contact JDP about survey-related inquiries (e.g., how to interpret questions and
  response items, the purpose of the survey, how to get another copy of the survey if their
  copy has been lost/damaged, etc.). Telephone calls and emails are responded to within 24
  hours and answered during regular business hours (8:00am-5:00pm PT, Monday through
  Friday).
• JDP ensured the online surveys were accessible to Veterans with disabilities by maintaining
  508-compliant standards. These standards include:
  - Keyboard navigation rather than mouse or other pointing devices
  - Customization options for color, size, and style of text displayed
  - Compatibility with screen readers to translate items displayed on the survey into audible
    output and/or Braille displays
  - Customer support and technical support through the JDP Help Desk toll-free phone
    number and email address
  - Exclusion of non-text elements, image maps, animation, and flashing or blinking text
• The survey fielding period was extended to offer opportunities to respond for subgroups
  having a propensity to respond late (e.g., males, young, full-time employed).
• The survey was developed and reviewed in order to enhance respondent understanding of
  the survey materials and to improve the relevancy of the data collected:
  - Prior to fielding the Benchmark study, a series of cognitive labs was conducted with test
    users to ensure the survey questions were easily understood and correctly interpreted.
    Revisions were made to the survey based on test user feedback. (As per OMB Guideline
    1.4.1)
  - After the Benchmark study and prior to fielding the first year of the Tracking study,
    Compensation Service and JDP conducted a review of the survey instruments and
    modified the surveys to improve the relevancy of data collected. (As per OMB Guideline
    1.4.2)

1.2 Approach 2: Correcting Unit Non-Response Bias with Sample Weighting and
Survey Raking
As stated above, the two approaches to tackling non-response bias are implementing measures
to maximize response rates during the fielding period and making post hoc statistical
adjustments to the survey results. The following section discusses the statistical adjustment
approach, which includes weighting the data or imputing scores to correct for non-response
bias. An example of this approach is the survey raking procedure described earlier in this
report; see the associated references in the “Survey Raking Procedure for Sample Weightings”
section for more information.
The procedure known as “raking” adjusts a set of data so that its marginal totals match
specified control totals on a specified set of variables. The term suggests an analogy with the
process of smoothing the soil in a garden plot by alternately working it back and forth with a
rake in two perpendicular directions (Izrael and Battaglia, 2004).
If non-response bias is identified in the survey data, it can be corrected mathematically with a
post-stratification survey weight. JDP would weight the survey data based on certain
demographics (such as age, gender, region, etc.) of the total sample so that the weighted
survey data conform more closely to the demographics of the total sample. The implicit
assumption in this approach is that the distributions of characteristics of the non-respondents
within an adjustment class (such as an age group) are the same, on average, as those of the
respondents within the same adjustment class.
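
For example, a minimal post-stratification weight under this assumption is the population share of an adjustment class divided by its respondent share; the cell counts below are invented for illustration.

```python
# Hypothetical counts: total sample vs. respondents in two age classes.
population = {"<65": 2_000, "65+": 8_000}   # full sample frame
respondents = {"<65": 300, "65+": 2_200}    # survey returns

pop_total = sum(population.values())
resp_total = sum(respondents.values())

# weight = (population share) / (respondent share) for each adjustment class
weights = {
    cls: (population[cls] / pop_total) / (respondents[cls] / resp_total)
    for cls in population
}
print(weights)  # under-represented classes receive weights greater than 1
```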
See Appendix B for the item response rate for each question in the survey. Since no item
response rate was lower than 70%, as per OMB standards, imputation of data is not necessary.


In the case that a particular item-level response rate was less than 70%, JDP would recommend
conducting additional analysis to determine the potential for other factors (e.g., missing or skip
patterns in the survey instrument) to be the cause of non-response.

Strategies to Improve Response Rate
In addition to the strategies listed above, JDP recommends considering the following strategies
to improve response rates going forward:
• Issue ongoing public communications (e.g., press releases, posting information on the VA
  website) to spread awareness and confirm the legitimacy of the VA Pension study.
• Educate VA employees and VSOs about the survey to encourage participation. Provide a list
  of frequently asked questions and answers to VSOs and VA employees to equip them to
  answer Veterans’ questions regarding the survey.
• Send email invitations to Veterans rather than mailing postcards to make it easier for them
  to complete the survey online.
• Reduce the length of the survey to improve Veterans’ willingness to respond:
  - Reduce the overall number of questions and the number of response options for each
    question.
• Increase the number of contacts available to respondents with additional reminders about
  the survey to encourage participation:
  - Provide respondents with an additional paper survey questionnaire.
• Reduce the frequency of mailings to reduce the opportunities for delays and errors in the
  GPO print process.
• Revise the cover letter and postcard to express the importance of participation in the
  survey.
• Provide sample from the 30-day period immediately prior to the mailing, rather than
  sample from 90 days prior, to improve the recency of experience with the Pension benefit
  (which improves both participation and recollection).
• Change the location of the sequence number to directly follow the survey link on the
  postcard and cover letter.
• Alter the responsibility for sample file generation from Pension to PA&I. A PA&I data pull
  will increase consistency.
• Alter formatting on the postcard and cover letter to include color print to make materials
  more readable, which may increase participation.


Appendix E
Impact of FAR 8.8
Federal Acquisition Regulation (FAR) 8.8 requires that printing be conducted through the
Government Printing Office (GPO). The following section outlines limiting factors of the VOV
Line of Business Tracking Satisfaction Research Study that occurred as a result of the FAR
requirement.
Through the utilization of the GPO print vendor, the following occurred in FY15:
• Quality issues, including survey instruments that were printed and mailed:
  - Utilizing the sample population from one survey but receiving a different survey (e.g.,
    potential respondents from the pool of one business line received the survey for a
    different business line).
  - Using a version of the instrument that was outdated and did not contain the current
    questions or responses being fielded.
  - Mixing content between survey versions.
  - Using shells from one survey printed with a different survey.
• Ongoing timeliness delays with each set of orders placed, as the order fulfillment process
  took a minimum of 2-4 weeks.

1.1 Impact
The project experienced ongoing delays in the printing and mailing of postcards and survey
packets for VBA’s lines of business. The delays affected the critical processes required to
execute the VOV program to its fullest potential.
A multitude of quality issues throughout FY15 negatively impacted VOV program response
rates. These issues affected access to the online survey, the readability of mail materials, the
level of effort required by respondents to take the survey, the relevancy of the survey, and the
brands (VA/JDP) associated with poor-quality materials.


Appendix F
NOTE: The questionnaire is not shown in the formatted version that respondents used to
complete the survey.

Survey Questionnaires
[DO NOT DISPLAY/IDENTIFY SECTION HEADERS. DISPLAY SINGLE QUESTION PER PAGE.]
[RESPONSE CODES APPEAR IN BRACKETS AT THE END OF EACH RESPONSE FOR SINGLE
RESPONSES AND IN THE PROGRAMMING INSTRUCTIONS FOR MULTIPLE RESPONSES.]

Servicing Questionnaire
Benefit Information
1. How did you FIRST learn about VA benefit programs? (Mark only one) If you are
unsure, please indicate the first way you remember learning about VA benefit
programs. [RADIO BUTTONS. SINGLE RESPONSE.]
a. VA website [1]
b. VetSuccess.gov [2]
c. eBenefits.va.gov [3]
d. Social media websites (e.g., Facebook, Twitter, etc.) [11]
e. Internet (excluding VA and social media sites) [14]
f. Mail (from VA) [4]
g. VA phone number (800-827-1000) [5]
h. In person with a VA representative (e.g., VA medical center, VA Vet center,
Regional Office, etc.) [8]
i. Transition Assistance Program/Disabled Transition Assistance Program
briefings [6]
j. Veterans Service Organizations (e.g., Amer. Legion, DAV, VFW, PVA,
MOPH, etc.) (Specify) ______________ [TEXT BOX. FORCE TEXT IF RESPONSE IS
SELECTED. 50 CHARACTER MAX.] [7]
k. Other Veterans [13]
l. Friends or family [15]
m. Other publications (e.g., Army Times, local newspaper, etc.) [16]
n. Other (Specify) ___________________ [TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.] [97]
o. Don’t know or not sure [99]

2. What method(s) do you MOST FREQUENTLY use to obtain general information
about VA’s benefits or services? (Mark all that apply) [CHECK BOXES.
MULTIPLE RESPONSE. CODE EACH RESPONSE AS 0 IF UNCHECKED OR
1 IF CHECKED]
a. VA website

b. VetSuccess.gov
c. eBenefits.va.gov
d. Social media websites (e.g., Facebook, Twitter, etc.)
e. Other websites (excluding VA or social media sites)
f. Phone
g. Mail
h. E-mail
i. In person with a VA representative (e.g., VA medical center, VA Vet center,
Regional Office, etc.)
j. Veterans Service Organizations (e.g., Amer. Legion, DAV, VFW, PVA,
MOPH, etc.) (Specify) ___________________ [TEXT BOX, FORCE TEXT
IF RESPONSE IS SELECTED, 50 CHARACTER MAX.]
k. Disabled Veterans’ Outreach Program
l. Friends or family
m. Other publications (e.g., Army Times, local newspaper, etc.)
n. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.]
o. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE]
p. None of the above [MUTUALLY EXCLUSIVE RESPONSE]

3. How frequently would you like to receive communications (e.g., emails, letters,
newsletters, etc.) about VA benefits or services? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE.]
a. Weekly [1]
b. Monthly [2]
c. Quarterly (every 3 months) [3]
d. Semiannually (twice per year) [4]
e. Annually (once per year) [5]
f. Never [6]
g. Don’t know or not sure [99]
4. How would you like to receive information from VA about benefits or services?
(Mark all that apply) [CHECK BOXES. MULTIPLE RESPONSE. CODE EACH
RESPONSE AS 0 IF UNCHECKED OR 1 IF CHECKED]
a. Phone
b. Mail
c. Email
d. VA website
e. Social media websites (e.g., Facebook, Twitter, etc.)
f. In person at a Regional Office
g. Veterans Service Organizations(e.g., Amer. Legion, DAV, VFW, PVA,
MOPH, etc.) (Specify) ___________________ [TEXT BOX, FORCE TEXT
IF RESPONSE IS SELECTED, 50 CHARACTER MAX.]
h. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.]
i. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE.]


The following question asks you to rate various aspects of your experience with Pension
using a scale of 1 to 10, where 1 is Unacceptable, 10 is Outstanding, and 5 is Average.
[SHOW ON SAME PAGE AS THE QUESTION THAT FOLLOWS.]
5. Please rate your experience in obtaining information about your benefit on the
following items: (Mark only one per row) [SHOW RESPONSES IN GRID WITH
10-POINT SCALE IN COLUMNS AND ATTRIBUTES/RESPONSES IN ROWS
(SEE JDP CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF
LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS, ALTERNATE
SHADES IN ROWS. SINGLE RESPONSE PER ROW. RANDOMIZE ALL
ATTRIBUTES EXCEPT THE LAST ONE.]
a. Ease of accessing information [ALLOW N/A RESPONSE][1-10, N/A=99]
b. Availability of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Clarity of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
d. Usefulness of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
e. Frequency of information provided by VA [ALLOW N/A RESPONSE] [1-10,
N/A=99]
f. Overall rating of information [1-10]
Contact with VA
6. During the past 6 months, did you contact anyone from VA about your benefit?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
(Ask Q7-Q12 if Q6 is yes, otherwise go to Q13)

7. Which of the following best describes the reason for your most recent contact?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Resolve a problem [1]
b. Ask a question [2]
c. Request a change to your records/provide information [3]
8. Can you briefly describe the nature of your most recent contact? (Mark all that
apply) [CHECK BOXES. MULTIPLE RESPONSE. CODE EACH RESPONSE
AS 0 IF UNCHECKED OR 1 IF CHECKED.]
a. Update your dependency status
b. Change your address or direct deposit information
c. Provide verification documents required for payment (e.g., income
verification, medical records, etc.)
d. Report the death of an individual who received VA benefits
e. Report that you did not receive your VA check or direct deposit
f. Resolve a problem with your benefits
g. Find out about a late benefit payment
h. Report a problem with a VA customer service representative
i. Ask a general question

j. Obtain information about submitting/re-opening a claim
k. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.]
9. Thinking about your most recent contact, how did you contact VA? (Mark only
one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Phone [1]
b. Fax [8]
c. Website [6]
d. Email [7]
e. Mail [9]
f. In person [3]
g. eBenefits.va.gov [10]
h. Online Chat
10. Was your most recent issue resolved? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
(Ask Q11 if Q10 is No, otherwise go to Q12)
11. Why wasn’t your most recent issue resolved? [CHECK BOXES. MULTIPLE
RESPONSE. CODE EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF
CHECKED.]
a. Did not receive all of the information required
b. Received incorrect information
c. Was referred to the incorrect office/person
d. Waiting for follow-up from VA
e. Other (Specify) ____________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.]
f. Don't know or not sure [MUTUALLY EXCLUSIVE RESPONSE.]

12. Thinking of your most recent contact with the VA, how would you rate your
overall customer service experience with the VA or VA representatives, using a
scale of 1 to 10 where 1 is Unacceptable, 10 is Outstanding, and 5 is
Average? [SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN
COLUMNS AND SINGLE ROW (SEE JDPA CONVENTIONS DOCUMENT
PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY SPACED RADIO
BUTTONS/COLUMNS, SINGLE RESPONSE PER ROW.][1-10]

Benefit Entitlement


13. Have you submitted a claim for an Aid and Attendance or Housebound benefit in
the past 6 months? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
(Ask Q14-18 if Q13 is Yes, otherwise go to Q19)
14. What is your preferred method to submit a claim? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE]
a. Mail [1]
b. In person at a Regional Office [2]
c. In person at a Veterans Service Organization(e.g., Amer. Legion, DAV,
VFW, PVA, MOPH, etc.) [3]
d. Online [5] (SKIP TO Q16)
e. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.] [97]
f. Don’t know or not sure [99]
(Ask Q15 if Q14 ≠ Online, otherwise go to Q16)
15. Would you be willing and able to submit your claim online if the VA was able to
process your claim quicker (possibly within 2-14 days)?
a. Yes [1]
b. No [0]
c. I do not have access to a computer/Internet [96]
d. Don’t know or not sure [99]
16. Did VA require you to provide additional medical evidence after you submitted
your claim? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or unsure [99]
(Ask Q17 if Q16 is Yes, otherwise go to Q19)
17. Were you required to undergo a VA medical evaluation as a result of your claim?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
d. Not applicable [96]
(Ask Q18 if Q17 is Yes, otherwise go to Q19)
18. Did the exam seem appropriate and/or address your claimed condition(s)?
[RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]


19. If you were previously found ineligible for VA pension benefits, did you
understand why you were found ineligible? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
d. Not applicable [96]
(Ask Q20 if Q19 is “No”, otherwise go to Q21)
20. What did you find unclear/didn’t understand about your ineligibility decision?
(Open Capture) [OPEN-END. TEXT BOX. 1000 CHARACTERS MAX. ALLOW
NO COMMENT, MUTUALLY EXCLUSIVE CHECK BOX. CODE NO COMMENT
AS 0 IF UNCHECKED AND 1 IF CHECKED.]
21. In the past 6 months, have you submitted any documentation required to verify
your eligibility for benefits (e.g., income verification, marriage certificate, medical
records, dependent information, etc.)? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
(Ask Q22 if Q21 is Yes, otherwise go to Q24)
22. Was there any change (increase or decrease) to your pension benefits based on
the verification of the documents submitted? [RADIO BUTTONS. SINGLE
RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
(Ask Q23 if Yes to Q22, otherwise go to Q24)
23. Were you informed as to the reason why your benefit payment changed? (Mark
only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
The following question asks you to rate various aspects of your experience with
benefits, using a scale of 1 to 10 where 1 is Unacceptable, 10 is Outstanding, and 5 is
Average. [SHOW ON SAME PAGE AS THE QUESTION THAT FOLLOWS]
24. Please rate your pension benefit on the following items: (Mark only one per row)
[SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND
ATTRIBUTES/RESPONSES IN ROWS (SEE JDPA CONVENTIONS
DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY SPACED
RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN ROWS. SINGLE
RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES EXCEPT THE LAST
ONE.]
a. Amount of pension benefit payment [ALLOW N/A RESPONSE] [1-10,
N/A=99]
b. Timeliness of receiving benefit payment [ALLOW N/A RESPONSE] [1-10,
N/A=99]
c. Overall rating of your benefit [1-10]

Overall Experience with Benefit
25. Thinking about ALL aspects of your experience with your pension benefits,
please rate VA overall, using a scale of 1 to 10 where 1 is Unacceptable, 10 is
Outstanding, and 5 is Average. (Mark only one) [SHOW RESPONSES IN GRID
WITH 10-POINT SCALE IN COLUMNS AND SINGLE ROW (SEE JDPA
CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT).
EVENLY SPACED RADIO BUTTONS/COLUMNS, SINGLE RESPONSE PER
ROW.] [1-10]

Overall Experience with VA
26. Taking into consideration all of the non-medical benefits (e.g., education,
compensation, pension, home loan guaranty, vocational rehabilitation and
employment, insurance, etc.) you have applied for or currently receive, please
rate your experience with VA overall, using a scale of 1 to 10 where 1 is
Unacceptable, 10 is Outstanding, and 5 is Average. (Mark only one) [SHOW
RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND SINGLE
ROW (SEE JDPA CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC
DETAILS OF LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS,
SINGLE RESPONSE PER ROW.] [1-10]

27. How likely are you to inform other Veterans or beneficiaries about your
experience with VA benefits or services? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE]
a. Definitely will not [1]
b. Probably will not [2]
c. Probably will [3]
d. Definitely will [4]
28. Do you have any other comments or concerns about your experience? (Open
Capture) [OPEN-END. TEXT BOX. 1000 CHARACTERS MAX. ALLOW NO
COMMENT, MUTUALLY EXCLUSIVE CHECK BOX. CODE NO COMMENT
AS 0 IF UNCHECKED AND 1 IF CHECKED.]

Additional Questions

As a reminder, your responses will be kept completely confidential and your email
address will not be sent to VA with any responses on this survey.
29. Would you like to provide an email address so VA can contact you with general
information about VA benefits and services? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE]
a. Yes [1]
b. No [0]
c. I do not have an email address [96]
d. Prefer not to answer [99]
(Ask Q30 if Q29 is Yes)
30. Please enter your preferred email address where you would like to be contacted:
(Open Capture)
a. E-mail: [TEXT BOX. 100 CHARACTER MAX.]


Access Questionnaire
Benefit Information
1. How did you FIRST learn about VA benefit programs? (Mark only one) If you are
unsure, please indicate the first way you remember learning about VA benefit
programs. [RADIO BUTTONS. SINGLE RESPONSE.]
a. VA website [1]
b. VetSuccess.gov [2]
c. eBenefits.va.gov [3]
d. Social media websites (e.g., Facebook, Twitter, etc.) [11]
e. Internet (excluding VA and social media sites) [14]
f. Mail (from VA) [4]
g. VA phone number (800-827-1000) [5]
h. In person with a VA representative (e.g., VA medical center, VA Vet center,
Regional Office, etc.) [8]
i. Transition Assistance Program/Disabled Transition Assistance Program
briefings [6]
j. Veterans Service Organizations (e.g., Amer. Legion, DAV, VFW, PVA, MOPH,
etc.) (Specify) ______________ [TEXT BOX. FORCE TEXT IF RESPONSE IS
SELECTED. 50 CHARACTER MAX.] [7]
k. Other Veterans [13]
l. Friends or family [15]
m. Other publications (e.g., Army Times, local newspaper, etc.) [16]
n. Other (Specify) ___________________ [TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.] [97]
o. Don’t know or not sure [99]

2. What method(s) do you MOST FREQUENTLY use to obtain general information
about VA’s benefits or services? (Mark all that apply) [CHECK BOXES.
MULTIPLE RESPONSE. CODE EACH RESPONSE AS 0 IF UNCHECKED OR
1 IF CHECKED]
a. VA website
b. VetSuccess.gov
c. eBenefits.va.gov
d. Social media websites (e.g., Facebook, Twitter, etc.)
e. Other websites (excluding VA or social media sites)
f. Phone
g. Mail
h. Email
i. In person with a VA representative (e.g., VA medical center, VA Vet center,
Regional Office, etc.)

j. Veterans Service Organizations(e.g., Amer. Legion, DAV, VFW, PVA, MOPH,
etc.) (Specify) _________________[TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.]
k. Disabled Veterans’ Outreach Program
l. Friends or family
m. Other publications (e.g., Army Times, local newspaper, etc.)
n. Other (Specify) ___________________[TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.]
o. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE.]
p. None of the above [MUTUALLY EXCLUSIVE RESPONSE.]
3. How frequently would you like to receive communications (e.g., emails, letters,
newsletters, etc.) about VA benefits or services? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE.]
a. Weekly [1]
b. Monthly [2]
c. Quarterly (every 3 months) [3]
d. Semiannually (twice per year) [4]
e. Annually (once per year) [5]
f. Never [6]
g. Don’t know or not sure [99]
4. How would you like to receive information from VA about applying for VA benefits
or services? (Mark all that apply) [CHECK BOXES. MULTIPLE RESPONSE.
CODE EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF CHECKED]
a. Phone
b. Mail
c. Email
d. VA website
e. Social media websites (e.g., Facebook, Twitter, etc.)
f. In person at a Regional Office
g. Veterans Service Organizations( e.g., Amer. Legion, DAV, VFW, PVA,
MOPH, etc.) (Specify) ___________________[TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.]
h. Other (Specify) ___________________[TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.]
i. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE.]

The following question asks you to rate various aspects of your experience with
Pension, using a scale of 1 to 10, where 1 is Unacceptable, 10 is Outstanding, and 5 is
Average. [SHOW ON SAME PAGE AS THE QUESTION THAT FOLLOWS]
5. When thinking about your most frequently used methods of communication,
please rate your experience in obtaining information about your benefit
application on the following items: (Mark only one per row) [SHOW
RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND
ATTRIBUTES/RESPONSES IN ROWS (SEE JDP CONVENTIONS
DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY SPACED
RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN ROWS. SINGLE
RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES EXCEPT THE LAST
ONE.]
a. Ease of accessing information [ALLOW N/A RESPONSE][1-10, N/A=99]
b. Availability of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Clarity of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
d. Usefulness of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
e. Frequency of information provided by VA [ALLOW N/A RESPONSE] [1-10,
N/A=99]
f. Overall rating of information [1-10]
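
The grid instruction above ("RANDOMIZE ALL ATTRIBUTES EXCEPT THE LAST ONE") rotates the diagnostic rows per respondent while pinning the overall rating to the bottom, so order effects fall on the individual items rather than the summary. A minimal sketch of that display rule, assuming a hypothetical display_order() helper and optional seed that are illustrative only:

```python
# Hypothetical sketch of "RANDOMIZE ALL ATTRIBUTES EXCEPT THE LAST ONE":
# shuffle every rating row except the final overall-rating row.
import random

Q5_ATTRIBUTES = [
    "Ease of accessing information",
    "Availability of information",
    "Clarity of information",
    "Usefulness of information",
    "Frequency of information provided by VA",
    "Overall rating of information",  # always shown last
]

def display_order(attributes: list[str], seed: int | None = None) -> list[str]:
    """Return the grid rows with all but the last attribute shuffled."""
    rows = attributes[:-1]
    random.Random(seed).shuffle(rows)  # per-respondent shuffle
    return rows + attributes[-1:]

print(display_order(Q5_ATTRIBUTES, seed=42))
```
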

Contact with VA
6. During the past 6 months, did you contact anyone from VA about the benefit
application process? (Mark only one) [RADIO BUTTONS. SINGLE
RESPONSE.]
a. Yes [1]
b. No [0]

(Ask Q7-Q12 if Q6 is yes, otherwise go to Q13)
7. Which of the following best describes the reason for your most recent contact?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Resolve a problem [1]
b. Ask a question [2]
c. Request a change to your records/provide information [3]

8. Can you briefly describe the nature of your most recent contact? (Mark all that
apply) [CHECK BOXES. MULTIPLE RESPONSE. CODE EACH RESPONSE
AS 0 IF UNCHECKED OR 1 IF CHECKED]
a. Change your address or direct deposit information
b. Report the death of an individual who received VA benefits
c. Report that you did not receive your VA check or direct deposit
d. Report a problem with a VA customer service representative
e. Ask a general question
f. Obtain information about submitting/re-opening a claim
g. Check on the status of a claim
h. Other (Specify) ___________________[TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.]

9. Thinking about your most recent contact, how did you contact VA? (Mark only
one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Phone [1]
b. Fax [8]
c. eBenefits.va.gov [10]
d. Website [6]
e. Email [7]
f. Mail [9]
g. In person [3]
h. Online Chat

10. Was your most recent issue resolved? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
(Ask Q11 if Q10 is No, otherwise go to Q12)
11. Why wasn’t your most recent issue resolved? [CHECK BOXES. MULTIPLE
RESPONSE. CODE EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF
CHECKED]
a. Did not receive all of the information required
b. Received incorrect information
c. Was referred to the incorrect office/person
d. Waiting for follow-up from VA
e. Other (Specify) ____________________ [TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.]
f. Don't know or not sure

12. Thinking of your most recent contact with the VA, how would you rate your
overall customer service experience with the VA or VA representatives, using a
scale of 1 to 10, where 1 is Unacceptable, 10 is Outstanding, and 5 is Average?
[SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND
SINGLE ROW (SEE JDP CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC
DETAILS OF LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS,
SINGLE RESPONSE PER ROW.][1-10]

Benefit Eligibility and Application Process
13. Thinking about your most recent application, did someone from VA (e.g., call
center representative, regional office representative, etc.) provide you with
information about the benefit application process? [RADIO BUTTONS. SINGLE
RESPONSE]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
14. Thinking about your most recent benefit application, what method did you use to
apply for your benefit? (Mark only one) [RADIO BUTTONS. SINGLE
RESPONSE]
a. Online [1] (SKIP TO Q16)
b. Mail [2]
c. In person at a Regional Office [3]
d. In person at a Veterans Service Organization (e.g., Amer. Legion, DAV, VFW,
PVA , MOPH, etc.) [4]
e. Other (Specify) ___________________ [TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.] [97]
f. Don’t know or not sure [99]
(Ask Q15 if Q14 ≠ Online, otherwise go to Q16)
15. Would you be willing and able to submit applications online if the VA was able to
process your claim quicker (possibly within 2-14 days)?
a. Yes [1]
b. No [0]
c. I do not have access to a computer/Internet [96]
d. Don’t know or not sure [99]
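
Routing notes such as "(Ask Q15 if Q14 ≠ Online, otherwise go to Q16)" are branch-on-response rules. A minimal sketch of how such a routing table could be expressed follows; the answer encodings and the next_question() helper are hypothetical, not taken from the survey programming:

```python
# Hypothetical sketch of the questionnaire's branch-on-response routing.
def next_question(current: str, answers: dict[str, object]) -> str:
    """Return the ID of the next question to ask, given answers so far."""
    if current == "Q6":
        # Q7-Q12 are asked only if the respondent contacted VA (Q6 = Yes).
        return "Q7" if answers.get("Q6") == 1 else "Q13"
    if current == "Q14":
        # Online applicants skip the willingness-to-apply-online follow-up.
        return "Q16" if answers.get("Q14") == "online" else "Q15"
    if current == "Q16":
        # Q17-Q22 are asked only if a confirmation was received (Q16 = Yes).
        return "Q17" if answers.get("Q16") == 1 else "Q23"
    raise KeyError(f"no routing rule defined for {current}")

print(next_question("Q14", {"Q14": "mail"}))  # -> Q15
```
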
16. After you submitted your application, did you receive a notification/confirmation
from VA that your claim was received? [RADIO BUTTONS. SINGLE
RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
(Ask Q17-22 if Q16 is Yes, otherwise go to Q23)
17. Thinking about the notification/confirmation from VA, was it clear and easy to
understand? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Not at all clear [1]
b. Somewhat clear [2]
c. Completely clear [3]
d. Don’t know or not sure [99]
e. I did not read the letter [96]
18. Did you contact VA to obtain clarification about any of the
notification(s)/confirmation(s) you received? [RADIO BUTTONS. SINGLE
RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
19. Did you provide VA with the documentation that was requested in the
notification(s)/confirmation(s)? (Mark only one) [RADIO BUTTONS. SINGLE
RESPONSE.]
a. Yes [1]
b. No [0]
c. Nothing was requested [96]
d. Don’t know or not sure [99]
(Ask Q20-Q21 if Q19 is yes, otherwise go to Q22)
20. How did you submit the documentation to VA that was requested in the
notification/confirmation? (Mark only one) [RADIO BUTTONS. SINGLE
RESPONSE.]
a. Online
b. In person at a Regional Office [2]
c. Mail [5]
d. Through a Veterans Service Organization (e.g., Amer. Legion, DAV, VFW,
PVA, MOPH, etc.) [3]
e. Other (Specify) ___________________[TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.] [97]
f. Don’t know or not sure [99]
21. What is your preferred method to submit the documentation to VA that was
requested in the notification/confirmation? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Online [1]
b. In person at a Regional Office [2]
c. Mail [3]
d. Through a Veterans Service Organization (e.g., Amer. Legion, DAV, VFW,
PVA, MOPH, etc.) [4]
e. Other (Specify) ___________________[TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.] [97]
f. Don’t know or not sure [99]
22. Did you receive a subsequent notification requesting information in support of
your claim from VA? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
23. During the application process, did you have to provide the same information
more than once? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]

(Ask Q24 if Q23 is Yes, otherwise go to Q25)
24. What information did you have to provide more than once? (Mark all that apply)
[CHECK BOXES. MULTIPLE RESPONSE. CODE EACH RESPONSE AS 0 IF
UNCHECKED OR 1 IF CHECKED]
a. Discharge papers (DD214)
b. Service treatment records
c. Private medical records
d. Proof of dependency (e.g., marriage license, birth certificate, etc.)
e. Other (Specify) ___________________[TEXT BOX. FORCE TEXT IF
RESPONSE IS SELECTED. 50 CHARACTER MAX.]
f. Don’t know or not sure

The following question asks you to rate various aspects of your experience with your
benefit application, using a scale of 1 to 10, where 1 is Unacceptable, 10 is
Outstanding, and 5 is Average. [SHOW ON SAME PAGE AS THE QUESTION THAT
FOLLOWS]
25. Please rate your experience with the benefit application process on the following
items: (Mark only one per row) [SHOW RESPONSES IN GRID WITH 10-POINT
SCALE IN COLUMNS AND ATTRIBUTES/RESPONSES IN ROWS (SEE JDP
CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT).
EVENLY SPACED RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN
ROWS. SINGLE RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES
EXCEPT THE LAST ONE.]
a. Ease of completing the application [ALLOW N/A RESPONSE][1-10, N/A=99]
b. Timeliness of eligibility/entitlement notification [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Flexibility of application methods [ALLOW N/A RESPONSE] [1-10, N/A=99]
d. Overall rating of application process [1-10]
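
Because the N/A response shares the rating grid with the 1-10 scale but is coded 99, any summary of these items has to treat 99 as missing rather than as a rating. A minimal sketch of that screening step, assuming a simple mean; this is illustrative only and is not the report's index-scoring method:

```python
# Hypothetical sketch: exclude the N/A sentinel (99) before averaging ratings.
NA_CODE = 99

def mean_rating(values: list[int]) -> float | None:
    """Average 1-10 ratings, treating the 99 sentinel as missing."""
    valid = [v for v in values if v != NA_CODE and 1 <= v <= 10]
    return sum(valid) / len(valid) if valid else None

# Five hypothetical responses to "Ease of completing the application".
print(mean_rating([8, 99, 6, 10, 99]))  # -> 8.0
```
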
(Paper Only Instruction: Ask Q26-Q28 if previously found ineligible for VA benefit
payments, otherwise go to Q29)
26. If you were previously found ineligible for VA benefit payments, did you
understand why you were found ineligible? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
d. Not applicable, never been found ineligible (Online Only Response) [96]
(Online Instruction: Ask Q27-Q28 if Q26 is yes, otherwise go to Q29)
27. Were you provided information about how to appeal your decision? (Mark only
one) [RADIO BUTTONS. SINGLE RESPONSE]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
28. Using a scale of 1 to 10, where 1 is Unacceptable, 10 is Outstanding, and 5 is
Average, please rate the clarity of the information you were provided about
appealing your decision. [SHOW RESPONSES IN GRID WITH 10-POINT
SCALE IN COLUMNS AND SINGLE ROW (SEE JDP CONVENTIONS
DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY SPACED
RADIO BUTTONS/COLUMNS, SINGLE RESPONSE PER ROW.][1-10]
Benefit Entitlement
The following question asks you to rate various aspects of your experience with your
benefit payment, using a scale of 1 to 10, where 1 is Unacceptable, 10 is Outstanding,
and 5 is Average. [SHOW ON SAME PAGE AS THE QUESTION THAT FOLLOWS]
29. Please rate your benefit payment on the following items: (Mark only one per row)
[SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND
ATTRIBUTES/RESPONSES IN ROWS (SEE JDP CONVENTIONS
DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY SPACED
RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN ROWS. SINGLE
RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES EXCEPT THE LAST
ONE.]
a. Amount of benefit payment [ALLOW N/A RESPONSE][1-10, N/A=99]
b. Timeliness of receiving initial benefit payment [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Overall rating of your benefit payment [1-10]

Overall Application Experience
30. Thinking about ALL aspects of your experience applying for your pension benefit,
please rate VA overall, using a scale of 1 to 10 where 1 is Unacceptable, 10 is
Outstanding, and 5 is Average. (Mark only one) [SHOW RESPONSES IN GRID
WITH 10-POINT SCALE IN COLUMNS AND SINGLE ROW (SEE JDP
CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT).
EVENLY SPACED RADIO BUTTONS/COLUMNS, SINGLE RESPONSE PER
ROW.] [1-10]

Overall Experience with VA
31. Taking into consideration all of the non-medical benefits (e.g., education,
compensation, pension, home loan guaranty, vocational rehabilitation and
employment, insurance, etc.) you have applied for or currently receive, please
rate your experience with VA overall, using a scale of 1 to 10 where 1 is
Unacceptable, 10 is Outstanding, and 5 is Average. (Mark only one) [SHOW
RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND SINGLE
ROW (SEE JDP CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS
OF LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS, SINGLE
RESPONSE PER ROW.] [1-10]

32. How likely are you to inform other Veterans or beneficiaries about your
experience with VA benefits or services? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Definitely will not [1]
b. Probably will not [2]
c. Probably will [3]
d. Definitely will [4]

33. Do you have any other comments or concerns about your experience? (Open
Capture) [OPEN-END. TEXT BOX. 1000 CHARACTERS MAX. ALLOW NO
COMMENT, MUTUALLY EXCLUSIVE CHECK BOX. CODE NO COMMENT AS
0 IF UNCHECKED AND 1 IF CHECKED]
____________________________________________________
Additional Questions

As a reminder, your responses will be kept completely confidential and your e-mail
address will not be sent to VA with any responses on this survey. [SHOW ON THE
SAME PAGE AS THE QUESTION THAT FOLLOWS.]
34. Would you like to provide an email address so VA can contact you with general
information about VA benefits and services? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. I do not have an email address [96]
d. Prefer not to answer [98]
(Ask Q35 if Q34 is Yes)
35. Please enter your preferred email address where you would like to be contacted:
(Open Capture)
a. E-mail: [TEXT BOX. 100 CHARACTER MAX.]

Appendix G
List of Acronyms
AAPOR American Association for Public Opinion Research
ANOVA Analysis of Variance
BAS Benefits Assistance Service
BPA Blanket Purchase Agreement
BRE Business Reply Envelope
CAPS Centralized Account Processing System
COR Contracting Officer’s Representative
DTA Data Transfer Agreement
EDIPI Electronic Data Interchange Personal Identifier
EDX Enterprise Data Exchange
FAR Federal Acquisition Regulations
FY Fiscal Year
GPO Government Printing Office
ICR Information Collection Request
JDP J.D. Power
LGY Loan Guaranty Service
LWO Letter Work Order
MAR Missing At Random
MCAR Missing Completely At Random
MCMC Markov chain Monte Carlo algorithm
MNAR Missing Not At Random
NPC NPC, Inc. Integrated Print and Digital Solutions
OEF Operation Enduring Freedom
OIF Operation Iraqi Freedom
OMB Office of Management and Budget
OSAT Overall Satisfaction Index
RO Regional Office
SSN Social Security Number
US United States
USA United States of America
VA Department of Veterans Affairs
VADIR VA DoD Identity Repository
VAPSD VA Publications Services Division
VBA Veterans Benefits Administration
VOV Voice of the Veteran
VR&E Vocational Rehabilitation and Employment Service
VSO Veterans Service Organizations
