
Voice of the Veteran (VOV) Continuous Measurement Surveys

FY15 Non-Response Bias Report - LGY

OMB: 2900-0782

Voice of the Veteran Line of Business Tracking Study
Loan Guaranty Service
Fiscal Year 2015 Non-Response Bias Analysis

VETERANS BENEFITS ADMINISTRATION

[FY15 REPORT]

Table of Contents

Executive Summary
Introduction
Methodology
  2.1 J.D. Power Index Model
  2.2 Sampling
  2.3 Data Collection
Non-Response Bias Analysis
  3.1 Survey Yield
  3.2 Missing Data Patterns and Mechanisms
  3.3 Margin of Error
    3.3.1 Sampling Distribution
    3.3.2 Distribution of Overall Satisfaction Index Scores
    3.3.3 Analysis for Demographic Differences
    3.3.4 Data Imputation Analysis for Demographic Differences
Findings
Conclusion
References

List of Appendices

Appendix A Missing Data Patterns and Mechanisms
Appendix B Item Response Rates
Appendix C Study Overview
  1.1 Study Background
  1.2 Methodology
  1.3 Data Cleaning
  1.4 Order Generation and Fulfillment Process
  1.5 Reporting
  Sample Plan Overview
  2.1 Sample Criteria
  2.2 Fielding/Sampling Frequency
  2.3 Data Transfer
  2.4 Sample Cleaning Rules Glossary
  2.5 Sample Selection
  2.6 Data Collection
Appendix D Approaches to Effects of Non-Response Bias and Improving Response Rates
  1.1 Approach 1: Strategies to Maximize Response Rates
  1.2 Approach 2: Correcting Unit Non-response Bias with Sample Weighting and Survey Raking
  Strategies to Improve Response Rate
Appendix E Impact of FAR 8.8
  1.1 Impact
Appendix F Survey Questionnaire
Appendix G List of Acronyms


Executive Summary
The Voice of the Veteran (VOV) Line of Business Tracking Satisfaction Research Study was
developed to establish continuous satisfaction measurement and incorporate direct Veteran
feedback in the decision-making process in order to improve the level of service to
Servicemembers, Veterans, and their beneficiaries.
As part of this study, a survey was fielded in Fiscal Year 2015 (FY15) for the Department of
Veterans Affairs (VA), Veterans’ Benefits Administration (VBA) Loan Guaranty (LGY) Division.
This survey is fielded annually on behalf of the LGY Service Program. The survey yielded a
response rate of 9.69% (an increase of 3.23%), which was lower than the estimated response
rate submitted with the information collection request (ICR) as well as lower than the Office of
Management and Budget’s standard of 80% (at the overall unit response rate).
OMB’s “Standards and Guidelines for Statistical Surveys,” Section 3.2, Guideline 3.2.9, notes
that a non-response analysis should be conducted for surveys with an overall unit response rate
of less than 80%. Therefore, J.D. Power (JDP) conducted the necessary statistical tests in
accordance with OMB’s guidelines to verify the validity of LGY’s survey results for FY15.
The statistical tests performed on the survey illustrate that no differences were found in the Overall Satisfaction Index score and Advocacy ratings for LGY in FY15 after adjusting for non-response bias in race, region, active days of service, and service discharge.
The sample for LGY's population was defined as individuals who closed a VA home loan during a 30-day period, 90 days prior to the fielding period, including those who closed on purchase loans, those who received loans for interest rate reductions, and those who obtained cash-out or other refinancing.
The initial 2015 analyses for these reports were done in consultation with Dr. Don Dillman, a
professor at Washington State University. Dr. Dillman is regarded as a key survey method
expert on non-response bias research and the report conforms to sound statistical research
practices in accordance with OMB standards. The analysis performed also includes an iterative
survey raking procedure to derive sample weightings based on a simultaneous balancing
analysis of the demographic differences.
The Overall Satisfaction Index score (819) and Advocacy ratings (likelihood to recommend VA benefits, 3.77 on a 1-4 scale, and likelihood to inform others about those benefits, 3.67 on a 1-4 scale) are not impacted in any meaningful way by non-response bias. This analysis confirms that the data collected during FY15 is valid for use by VBA.


Introduction
In an effort to achieve top-level customer service, VBA partnered with J.D. Power to conduct
Veteran satisfaction research on its behalf. VBA’s Voice of the Veteran (VOV) Satisfaction
Initiative was established to continuously measure and improve the level of service to
Servicemembers, Veterans, and their beneficiaries.
The intent of this initiative is to:

• Reinstate VBA's customer satisfaction research program in order to incorporate Veteran feedback into the decision-making process
• Identify the critical factors to Veterans' satisfaction with benefits and services provided by VBA
• Provide continuous feedback to validate effectiveness of new initiatives and process changes
• Provide decision-makers and stakeholders with timely and actionable feedback on a continuous basis
• Identify and document best practices, and act as a vehicle to celebrate successful interactions and experiences

VBA’s VOV Line of Business Tracking Satisfaction Research Study was developed to continuously
field customer satisfaction survey instruments to provide Veteran and beneficiary feedback on
the following VBA lines of business and benefit programs: Compensation, Pension, Education,
Vocational Rehabilitation and Employment, and Loan Guaranty. In support of this effort, in
FY15, JDP fielded a survey instrument regarding the home loan process on behalf of Loan
Guaranty Service (LGY). The purpose of the home loan process survey was to determine ways
LGY can improve the level of service provided to Veterans for home loan guaranty benefits and
services.
The survey instrument for the LGY home loan process was developed in collaboration with Loan
Guaranty Service and in accordance with OMB’s guidelines concerning statistical collection
procedures and methods. After the initial survey instrument was designed, cognitive labs using
the “think aloud” method were conducted to evaluate user experience when filling out the
survey. Prior to the FY15 fielding of the Loan Guaranty home loan process survey, a benchmark
(pilot) study was conducted from October 2012 through January 2013 to further assess the
effectiveness of the methodology and conformance to OMB's standards. The study was also fielded in FY14, making the FY15 fielding the third iteration.


Methodology
2.1 J.D. Power Index Model
J.D. Power defines customer satisfaction as a measure of how well product or service
experiences fit the expectations of customers. All JDP index models assume a two-tiered
regression model involving factors and attributes. Each customer experience is influenced by
several factors (i.e., first tier), which, in turn, are influenced by several attributes or drivers (i.e., second tier).
To begin the index model calculation, each set of attributes within a factor are used to predict
the overall satisfaction rating (sub-OSAT) for that factor. An importance weight is assigned to
each attribute, where the weight of “importance” of each attribute is defined as the ability of
that attribute to predict overall satisfaction. A multiple regression model is used to estimate the
attribute weights. This model produces the bottom-level weights, which are computed for each
factor separately. The bottom-level weights are rescaled so that they add up to one within each
subcategory. As a result, the percentage of total explained variation in the sub-OSAT rating that
is due to a particular attribute constitutes that attribute’s importance weight within its
respective factor.
Following the calculation of attribute (i.e., bottom level) weights, the factor (i.e., top level)
weights are calculated. Factor scores are calculated by taking the sum of the product of the
attribute rating scores and the attribute importance weights. This model produces the top-level
weights and these weights are rescaled so that they add up to one. Thus, the percentage of the
total explained variation in the overall satisfaction rating that is due to a particular sub-OSAT
constitutes that factor’s importance weight.
After all factor scores are computed, they are weighted so that some contribute more to overall
satisfaction than others, based on the index importance weights. The index score is
subsequently calculated by taking the sum of the product of all of the factor scores and the
factor importance weights. Finally, both the index and factor scores are multiplied by 100 so
that the range of each is 100 (if all attributes were rated 1) to 1,000 (if all attributes were rated
10).
By applying the importance weights derived from the two-tiered modeling approach, JDP
creates a weighted index score that ranges from a low of 100 to a high of 1,000. This index
approach has the benefit of being highly reliable and valid and provides increased ability to
discriminate the performance levels of companies.
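For illustration, the two-tier calculation described above can be sketched in Python. This is a simplified reconstruction, not J.D. Power's proprietary model: the decomposition of explained variation in `attribute_weights` is a stand-in, and all function names and data shapes are assumptions.

```python
import numpy as np

def attribute_weights(attr_ratings, sub_osat):
    """Bottom-level weights: each attribute's share of the explained variation
    in its factor's overall rating (sub-OSAT), rescaled to sum to one.
    `attr_ratings` is an (n_respondents, n_attributes) ndarray."""
    X = np.column_stack([np.ones(len(sub_osat)), attr_ratings])
    beta, *_ = np.linalg.lstsq(X, sub_osat, rcond=None)   # multiple regression
    # Simplified stand-in for "share of explained variation":
    # |coefficient| scaled by the attribute's spread.
    contrib = np.abs(beta[1:]) * attr_ratings.std(axis=0)
    return contrib / contrib.sum()

def index_score(factor_attr_ratings, factor_attr_weights, factor_weights):
    """Top level: each factor score is the weighted sum of its attribute
    ratings (1-10 scale); the index is the weighted sum of factor scores
    times 100, giving the 100 (all 1s) to 1,000 (all 10s) range."""
    factor_scores = [r @ w for r, w in zip(factor_attr_ratings, factor_attr_weights)]
    return 100 * sum(fs * fw for fs, fw in zip(factor_scores, factor_weights))
```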


Loan Guaranty Home Loan Process Index Weights
In working with LGY’s subject matter experts and leadership, the design of its survey
encompasses the factors and attributes as outlined in the tables on the next page. The factors
(Benefit Information, Contact with VA, Benefit Application, and Benefit Entitlement) and
attributes (Ease of Accessing Information, Availability of Information, etc.) represent LGY’s
Index Model in FY15. The corresponding weights for each factor and attribute are the weights
based on the above index model calculation. The weights are derived from the relative
importance of each factor or attribute to the respondents.


Table 2.0. Index Model Weights

Factor                  Effective Weight
Benefit Information           21.92%
Contact with VA                1.91%
Benefit Application           23.76%
Benefit Entitlement           52.41%

Table 2.1. Weights by Attribute

Factor / Attribute                                       Effective Weight
Benefit Information
  Ease of accessing information                                5.05%
  Availability of information                                  2.17%
  Clarity of information                                       3.50%
  Usefulness of information                                    5.23%
  Frequency of information                                     5.96%
Benefit Eligibility and Application
  Ease of completing the application                           8.31%
  Timeliness of eligibility notification                       8.17%
  Flexibility of application methods                           7.28%
Contact with VA                                                1.91%
Benefit Entitlement (Timeliness of receiving benefits)        52.41%

2.2 Sampling
The LGY survey was fielded to individuals who closed a home loan during a 30-day period, 90 days prior to the fielding period. This includes purchase loans, interest rate reductions, cash out, and other refinancing through the VA Home Loan Guaranty program.
J.D. Power mailed approximately 40,000 surveys to Veterans across the nation in FY15. The
target number of completed surveys was 12,000. The actual number of completed surveys
received was 3,821. At the onset of the study, a target of 1,000 was set for each regional loan
center; however, the sample provided by LGY did not include regional loan center identifiers.
Therefore, data was reported at the national level only and not by regional loan center.
Survey Instrument:      LGY Home Loan Process
Methodology:            Mixed (Paper Survey and Postcard w/eSurvey)
Fielding Frequency:     Monthly
Total Mailouts in FY15: 40,000

2.3 Data Collection
During the survey fielding period, both self-administered online survey returns and self-administered paper surveys were collected. While verbatim responses are recorded by a live survey processor, responses from paper surveys are scanned through automated imaging software. Survey returns undergo quality assurance to validate the accuracy of responses captured.
Respondents received two separate mailings and had the option of completing the survey on paper or online:

• 1st Mailing: Postcard introducing the study to the respondent, which included an online survey link and a unique access code login for the online survey.

• 2nd Mailing: Survey package, which included a cover letter with the online survey link, a paper survey, and a business reply envelope.

Please note that VBA and J.D. Power will be switching the order of the mailings in FY16 in order
to increase response rates.
Each time the surveys were deployed, the postcards and survey packages were subject to a proof approval process that utilized three levels of approvals by J.D. Power, Benefits Assistance Service (BAS), and VA Publications Services Division (VAPSD). After the print vendor mailed the survey packages, mail receipts were sent to VBA. Fielding of the survey instruments continues to experience ongoing delays due to FAR 8.8, which mandates that all printing occur through the Government Printing Office. See Appendix E for more detail.

During the survey fielding period, JDP provided a toll-free survey hotline and dedicated e-mail
address to answer survey-related inquiries and to provide assistance to respondents for
completing the surveys. The telephone and e-mail helpdesk was staffed with three JDP
employees who answered inquiries during regular business hours (8:00am-5:00pm PST,
Monday through Friday). A voice message system was available to receive phone messages so
after-hours calls could be responded to the following business day. An automatically generated
e-mail response was sent to all e-mail inquiries informing respondents that their e-mail was
received and they would receive a response within 24 hours. JDP helpdesk representatives
logged each survey-related inquiry in a password-protected spreadsheet documenting the reason for the inquiry, the resolution provided, and the contact information of each caller.
the end of each month, a log containing all inquiries was provided to the Contracting Officer
Representative (COR) for review. If non-survey related high-severity benefit inquiries were
received, J.D. Power contacted the COR immediately with the respondent’s contact
information.
Throughout the course of the program, weekly status meetings were held between JDP and
BAS to discuss survey administration. Biweekly status meetings were held between the
Government Printing Office print vendor, JDP, BAS, and VAPSD to discuss the printing and
mailing of the survey materials.

Non-Response Bias Analysis
The purpose of the non-response bias analysis is to ascertain the possible causes of variance in
response rates among different respondent demographics and/or determine if any bias has
been introduced with a low response rate. Given that the Voice of the Veteran Loan Guaranty
Home Loan Process Study had an overall unit response rate of 9% in FY2015, the following
section examines whether a low response rate or other factors may have caused respondent
bias to occur.
The Office of Management and Budget’s Questions and Answers, “When Designing Surveys for
Information Collections” dated January 2006, and “Standards and Guidelines for Statistical
Surveys” dated September 2006 (see References) provide guidelines on acceptable survey
design and response rates. OMB guidelines recommend a non-response bias evaluation for
surveys with an overall unit response rate of less than 80%.
In addition to the above referenced documents prepared by OMB, J.D. Power assessed other
source documents that were written and published by the Federal Committee on Statistical
Methodology, “Statistical Policy Working Paper 17, Survey Coverage” (1990) and “Statistical
Policy Working Paper 31, Measuring and Reporting Sources of Error in Surveys” (2001).


While high response rates are always desirable in surveys, JDP finds an 80% response rate is not
achievable for most voluntary, satisfaction-based survey research studies (Malhotra & Birks,
2007). In particular, survey research studies that do not provide an incentive are subject to not
achieving an 80% response rate. To illustrate this point, the Dillman Method for survey fielding, discussed in Dillman (2014, p. 22), details the effort required to attain an 80% response rate.
A survey instrument was fielded to 600 students at the University of Washington, the same university that sponsored the study. After five attempts to solicit a response in a closed university setting, as well as offering a monetary incentive to complete the study, the researchers failed to achieve an 80% response rate, garnering only a 77% response rate. The JDP team met with the VA Contracting Officer Representative to discuss current trends and realistic response rates. As noted, JDP does not believe that an 80% response rate is achievable, and this concern was shared with the Benefits Assistance Service team.
JDP conducted the following non-response bias analysis to determine whether the respondents (i.e., those who completed the survey) were different in a meaningful way from the non-respondents (i.e., those who were sent a survey, but did not complete it). Chi-squared analyses consist of comparisons between respondents and non-respondents on available demographic variables, such as gender, age, race, geographical region, war participation (service era), and military service branch. The U.S. states were converted to standard U.S. Census regions (Midwest, Northeast, South, and West) in order to aggregate the data and enhance regional comparisons.
J.D. Power research indicates that there is an absence of systematic statistical differences of
respondents’ overall satisfaction on the mail and online survey results. Research does suggest
that differences can occur between mixed mode survey methodologies (mail, online, and
phone), but these are primarily related to (a) social desirability and interviewer bias associated
with phone surveys (see Baum, Chandonnet, Fentress, and Rasinowich, 2012, p. 2, for a review)
and (b) that older respondents tend to respond by mail more often than online.
Throughout this report, we conduct statistical analyses to compare survey respondents and non-respondents. Frequently used statistical tests include the t-test, Chi-Square, and Analysis of Variance (ANOVA). These tests generate the relevant t-statistics, Chi-Squares, or F statistics that are reported. The magnitude of the statistic's value (either positive or negative) measures the size of the difference relative to the variation in the data. If the statistic is not large enough to generate a probability (p-value) less than .05, then it falls below the accepted standard probability cut-off level that indicates whether a statistical difference is significant. If a difference is not significant, statisticians regard these results as part of the normal sample variation that occurs within the same population. Throughout this report, the probability p-value standard of "must be less than .05 to be significant" is used for all statistics reported.
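As a concrete illustration of the Chi-Square comparison used in the tables that follow, the test might be run as below. The cell counts are hypothetical (back-filled from the published percentages), since the report publishes only percentages.

```python
from scipy import stats

# Hypothetical counts of respondents vs. non-respondents by U.S. Census region.
#                  Midwest  Northeast  South   West
respondents     = [   656,      232,   1758,  1146]
non_respondents = [  5352,     2498,  17487, 10707]

chi2, p, dof, _ = stats.chi2_contingency([respondents, non_respondents])
print(f"Chi-Square = {chi2:.0f}, DF = {dof}, p = {p:.4f}")
# A p-value below .05 indicates the respondent and non-respondent
# distributions differ beyond normal sampling variation.
```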


Table 3a shows there were no statistical differences found between the mail and online
methodologies for overall satisfaction or advocacy:
Table 3a. T-Test Analysis of Mail vs. Online Survey Results

Rating Measure                                          Mail   Online   t-statistic   p-value
Overall Satisfaction Index (100 - 1000 range)            817      824        -1.11        .27
Likelihood to inform others about VA benefits (1 - 4)   3.68     3.66         0.79        .43
Likelihood to recommend benefits to Veterans (1 - 4)    3.77     3.77          .23        .82

No significant differences were found between the survey respondent and non-respondent
samples on gender (Table 3b):
Table 3b. Comparing Genders: Respondents vs. Non-Respondents

Gender by Respondent Type (%)
          Survey Respondents   Non-Respondents   Total
Female            10                  11           11
Male              90                  89           89

Statistic    DF   Value   Prob
Chi-Square    1     .45     .50

Significant differences were found with the population based on age generation, as shown in Table 3c, such that more older Veterans and fewer Gen X and Y Veterans completed the survey.
Table 3c. Comparing Age Generation: Respondents vs. Non-Respondents

Age Generation by Respondent Type (%)
                             Survey Respondents   Non-Respondents   Total
Baby-Boomer (ages 50-68)             49                  30           32
Generation X (ages 37-49)            22                  29           28
Generation YZ (ages 18-36)           13                  35           33
Pre-Boomer (ages 69+)                16                   7            7

Statistic    DF   Value   Prob
Chi-Square    3    1168   <.0001

Significant differences were found based on Race, as shown in Table 3d, such that fewer White and more "Other" respondents completed the survey compared to non-respondents.


Table 3d. Comparing Race: Respondents vs. Non-Respondents

Race by Respondent Type (%)
         Survey Respondents   Non-Respondents   Total
White            54                  63           63
Asian             2                   3            3
Black             9                  11           11
Other            35                  22           22

Statistic    DF   Value   Prob
Chi-Square    3     276   <.0001

Significant differences were found between the survey respondent and non-respondent
samples on census region, such that more Midwestern and fewer Southern respondents
completed the survey (Table 3e).
Table 3e. Comparing Census Regions: Respondents vs. Non-Respondents

U.S. Census Region by Respondent Type (%)
            Survey Respondents   Non-Respondents   Total
Midwest             17                  15           15
Northeast            6                   7            7
South               46                  49           49
West                30                  30           30

Statistic    DF   Value   Prob
Chi-Square    3      24   <.0001

Significant differences were found with the population based on Military Service Branch, as
shown in Table 3f, such that a larger proportion of “Other” Veterans responded to the survey
compared to the population.
Table 3f. Comparing Military Service Branches: Respondents vs. Non-Respondents

Military Service Branch by Respondent Type (%)
            Survey Respondents   Non-Respondents   Total
Air Force           19                  20           20
Army                33                  36           36
Marines              8                  11           11
Navy                17                  18           18
Other               24                  14           15

Statistic    DF   Value   Prob
Chi-Square    4     224   <.0001


Significant differences were found with the population based on War Participation in Operation Iraqi Freedom (OIF) and Operation Enduring Freedom (OEF), as shown in Table 3g, such that a larger number of Veterans from wars before OEF/OIF completed the survey compared to non-respondents. This may be reflective of age differences as well.
Table 3g. Comparing War Participation in OIF and OEF: Respondents vs. Non-Respondents

War Participation in OIF/OEF by Respondent Type (%)
                 Survey Respondents   Non-Respondents   Total
All other wars           85                  76           77
OEF/OIF                  15                  24           23

Statistic    DF   Value   Prob
Chi-Square    1     112   <.0001

Note: OIF is Operation Iraqi Freedom and OEF is Operation Enduring Freedom.

Significant differences were found with the population based on days of active service, as
shown in Table 3h, such that survey respondents were more likely to have served 1,000 or
fewer days and less likely to have served 1,001 to 2,000 days compared to the population.
Table 3h. Comparing Days of Active Service: Respondents vs. Non-Respondents

Days of Active Service by Respondent Type (%)
                      Survey Respondents   Non-Respondents   Total
1000 days or less             41                  29           30
1001-2000 days                18                  28           27
2001-4000 days                14                  19           19
4001 days or more             28                  23           24

Statistic    DF   Value   Prob
Chi-Square    3     329   <.0001

Significant differences were found based on service discharge, as shown in Table 3i, such that
fewer honorable discharge and more unknown status respondents completed the survey than
non-respondents.
Table 3i. Comparing Service Discharge: Respondents vs. Non-Respondents

Service Discharge by Respondent Type (%)
                       Survey Respondents   Non-Respondents   Total
Honorable                      44                  53           52
Other than Honorable            1                   1            1
Unknown                        55                  47           47

Statistic    DF   Value   Prob
Chi-Square    2      95   <.0001


3.1 Survey Yield
In accordance with OMB "Standards and Guidelines for Statistical Surveys," an agency must appropriately measure, adjust for, report, and analyze unit and item non-response when the intended response for a targeted population is not met.[1] In assessing the survey data in accordance with Section 3.2 and Guidelines 3.2.1-3.2.3, the unweighted unit response rate was calculated as the ratio of the number of completed cases to the number of in-scope sample cases (Ellis, 2000; AAPOR, 2000).
Table 3.1a below shows the sample distribution and response rate for Loan Guaranty’s target
population:
Table 3.1a. Sample Distribution and Response Rates for Loan Guaranty Service Population

Loan Guaranty Service Population FY15
Total records received                                   534,101
Duplicate records in sample file                           1,730
Duplicate record history                                  21,662
Invalid Address                                           27,538
Invalid Values                                            20,322
Blanks                                                         0
Do Not Contact                                               546
Total records available after cleaning [2]               462,303
Total records selected                                    40,000
Undeliverable addresses                                      559
Total mailed (excludes undeliverable)                     39,441
Total completed mail surveys                               2,838
Total completed online surveys                               983
Total completed surveys                                    3,821
Total completed surveys with Overall Index Score [3]       3,599
Total Sample Response Rate [4]                             9.00%
Eligible Sample Response Rate [5]                          9.69%

[1] As defined by OMB and FCSM, unit non-response occurs when a respondent fails to respond to all required response items (i.e., fails to fill out or return a data collection instrument); item non-response occurs when a respondent fails to respond to one or more relevant item(s) on a survey.
[2] A glossary of sample cleaning rules is included in Appendix C.
[3] Findings in the report are based on the "Total completed surveys with Overall Index Score" (N=3,599).
[4] Response rate calculation per OMB Standards and Guidelines for Statistical Surveys, Section 3.2, Guideline 3.2.9 (includes undeliverables as the number of noncontacted sample units known to be eligible).
[5] Response rate calculation per the Council of American Survey Research Organizations (CASRO) (number of completed interviews with reporting units / number of eligible reporting units in sample). The American Association for Public Opinion Research (AAPOR) also uses this method for calculation and cites CASRO (AAPOR Standard Definitions, 2008, p. 34).
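A minimal sketch of the two footnoted response-rate formulas, using the counts in Table 3.1a. The pairing of numerators with denominators below is inferred from the reported 9.00% and 9.69% figures, not stated explicitly in the report.

```python
records_selected    = 40_000  # total records selected
undeliverable       = 559
total_mailed        = records_selected - undeliverable  # 39,441
completed_surveys   = 3_821                             # mail + online completes
completed_with_osat = 3_599                             # completes usable for the index

# OMB Guideline 3.2.9: undeliverables are treated as noncontacted sample units
# known to be eligible, so they remain in the denominator.
total_sample_rate = completed_with_osat / records_selected   # ~9.00%

# CASRO/AAPOR: completed interviews over eligible reporting units in the sample.
eligible_sample_rate = completed_surveys / total_mailed      # ~9.69%

print(f"Total Sample Response Rate:    {total_sample_rate:.2%}")
print(f"Eligible Sample Response Rate: {eligible_sample_rate:.2%}")
```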


Of the 534,101 total records received from LGY, 71,798 records were purged from the sample
due to cleaning rules such as duplicate records, invalid addresses and values, blanks, and do not
contact opt-outs. From the 71,798 records purged, 21,662 records were cleaned out due to
duplicate records across VBA’s other business line surveys (i.e., duplicate record history). The
purpose of this cleaning rule is to prevent respondents from being re-contacted if they were
previously selected to participate in any of VBA’s business line surveys within the past 12
months. The cleaning rule is a JDP and survey research best practice and is intended to promote
proper conduct in market research. About 13% of the total records provided by LGY were
removed from the sample due to these cleaning rules. It is unlikely that these cleaning rules
impacted the unit non-response since the target number of records for the survey was secured.
Table 3.1b. Weight/Person for Completed Surveys per Population

Completed Surveys   2015 Population   Weight/Person
      3,821             534,101            140

In Table 3.1b, the Weight/Person value of 140 means that every survey completed and returned represents the views of 140 Veterans using LGY benefits. This was calculated by dividing the population number by the number of completed surveys.
To confirm the sample’s representativeness, a comparison was conducted among the total
records provided (534,101) and the records available after cleaning (462,303). The intent of this
analysis was to determine whether the cleaning rules caused the remaining sample to vary in a
meaningful way from the original sampling frame.
Table 3.1c indicates that such characteristics as gender, age, and geographical region are similar
among the total records provided and the records available after cleaning. Regional state
comparisons yield differences that are less than 2 percentage points. These comparisons
suggest the cleaning rules did not alter the proportion of respondent characteristics provided in
the original sampling frame.
Table 3.1c. Comparing Gender, Generation, and States to Total Population

                    Total Population (%)   Records Available (%)   % Point Difference
Gender
  Female                   10.45                  10.66                  0.22
  Male                     89.55                  89.34                 -0.22
Generation
  Baby Boomer              32.67                  32.40                 -0.27
  Generation X             27.00                  28.02                  1.03
  Generation YZ            31.10                  32.22                  1.12
  Pre-Boomer                9.23                   7.36                 -1.88
U.S. State
  AK                        0.55                   0.55                  0.00
  AL                        1.89                   1.90                  0.01
  AR                        0.87                   0.87                  0.00
  AZ                        3.74                   3.65                 -0.09
  CA                       11.08                  10.85                 -0.23
  CO                        4.03                   4.01                 -0.02
  CT                        0.47                   0.46                 -0.01
  DC                        0.11                   0.11                  0.00
  DE                        0.39                   0.38                 -0.01
  FL                        7.54                   7.61                  0.07
  GA                        4.33                   4.37                  0.03
  HI                        0.43                   0.42                 -0.01
  IA                        0.65                   0.64                 -0.01
  ID                        0.79                   0.77                 -0.02
  IL                        1.82                   1.84                  0.02
  IN                        1.61                   1.59                 -0.02
  KS                        0.92                   0.92                  0.00
  KY                        1.16                   1.16                  0.01
  LA                        1.17                   1.19                  0.01
  MA                        0.85                   0.83                 -0.01
  MD                        2.79                   2.79                 -0.01
  ME                        0.32                   0.32                  0.00
  MI                        1.79                   1.73                 -0.06
  MN                        1.32                   1.31                 -0.01
  MO                        1.82                   1.81                 -0.01
  MS                        0.70                   0.71                  0.01
  MT                        0.43                   0.42                  0.00
  NC                        4.62                   4.71                  0.09
  ND                        0.27                   0.28                  0.00
  NE                        0.68                   0.69                  0.01
  NH                        0.35                   0.34                  0.00
  NJ                        0.94                   0.91                 -0.03
  NM                        0.84                   0.83                  0.00
  NV                        1.75                   1.72                 -0.03
  NY                        1.28                   1.29                  0.01
  OH                        2.48                   2.48                  0.00
  OK                        1.40                   1.41                  0.01
  OR                        1.74                   1.68                 -0.05
  PA                        2.16                   2.15                 -0.01
  RI                        0.17                   0.17                  0.00
  SC                        2.28                   2.28                  0.01
  SD                        0.31                   0.31                  0.00
  TN                        2.70                   2.70                  0.00
  TX                        8.91                   9.12                  0.21
  UT                        1.05                   1.03                 -0.02
  VA                        6.66                   6.83                  0.17
  VT                        0.09                   0.09                  0.00
  WA                        4.05                   4.04                  0.00
  WI                        0.96                   0.97                  0.01
  WV                        0.38                   0.38                  0.00
  WY                        0.30                   0.30                  0.00

3.2 Missing Data Patterns and Mechanisms
In accordance with the OMB “Standards and Guidelines for Statistical Surveys” Guidelines 3.2.9
and 3.2.11, an investigation of missing data patterns was performed on the 3,821 total surveys
received. To assess the distribution of missing data, a procedure was performed to process
missing values involving iterative multiple imputation chains using expectation–maximization
(MCMC) algorithms and divide these into distribution interval groupings. See Pierchala, Carl E.
(2001). This was done on the key measures of the overall satisfaction index (see Appendix A for
calculation) and advocacy ratings related to Veterans’ likelihood to recommend VA benefits.
As shown in Table 3.2, there were no indications of unusual patterns for missing data. For more
discussion of missing data mechanisms (MCAR, MAR, and MNAR), please see Appendix A.
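The pattern grouping in Table 3.2 can be reproduced mechanically: build a 0/1 missingness indicator for each key measure and tabulate the distinct patterns. A sketch with a hypothetical pandas frame (column names assumed):

```python
import pandas as pd

# Hypothetical respondent-level data; None marks a missing answer.
df = pd.DataFrame({
    "osat":      [810, None, 790, 850, None, 820],
    "recommend": [4,   3,    None, 4,  None, 4],
    "inform":    [4,   None, 4,   4,   3,    None],
})

# 0/1 indicator matrix (1 = data present), then one row per missingness pattern.
indicators = df.notna().astype(int)
patterns = (indicators.groupby(["osat", "recommend", "inform"])
                      .size().rename("freq").reset_index())
patterns["percent"] = 100 * patterns["freq"] / len(df)
print(patterns)  # analogous to the Group rows of Table 3.2
```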


Table 3.2. Missing Data Patterns in Satisfaction and Advocacy Ratings (0 = missing, 1 = data)

Group Means
Group   Overall        Likelihood     Likelihood to    Freq   Percent   OSAT    Age   % Male
        Satisfaction   to recommend   inform others                     Index
1            0              0               0            219    6.6%     818     58     89%
2            0              1               0              9    0.3%     770     55     78%
3            0              0               1              3    0.1%     839     60    100%
4            0              1               1             55    1.7%     848     57     91%
5            1              0               0             24    0.7%     784     61    100%
6            1              1               0             30    0.9%     874     57     97%
7            1              0               1             93    2.8%     828     58     88%
8            1              1               1           2861   86.9%     818     55     90%

3.3 Margin of Error
The margin of error expresses the maximum expected difference between the true population
parameter and a sample estimate of that parameter. It is often used to indicate the accuracy of
survey results. The larger the margin of error around an estimated value, the less accurate the
estimated value will be. Larger samples are more likely to yield results close to the true
population quantity and, thus, have smaller margins of error than smaller samples.
Based on a sample of 3,599 Veterans (those who answered enough questions to be counted in the index model), the FY15 Overall Satisfaction Index for the LGY Study is 819 index points on a 1,000-point scale and has a margin of error of 5 index points at the 95% confidence level. This indicates that if the survey were repeated many times with different samples, the true mean Overall Satisfaction Index would fall within 5 index points of the sample estimate 95% of the time.
Table 3.3 below demonstrates relative decreases in the margin of error as the study sample size
increases. A 20% response rate (7,888 completes) would be associated with a margin of error of
4 index points, similar to the margin of error for a 30% response rate (11,832 completes).
Results from this analysis indicate the Overall Satisfaction Index (OSAT) calculated from the
Loan Guaranty Study is an accurate measurement of the true population mean.


Table 3.3. Margin of Error for Larger Sample Sizes

Sample   Response   Completes   OSAT     Standard    Standard   Margin of Error
         Rate       (N)         (mean)   Deviation   Error      (95% confidence interval)
39,441    9.69%       3,821      819        164        2.7            5
39,441     20%        7,888      819        164        1.8            4
39,441     30%       11,832      819        164        1.5            3
39,441     40%       15,776      819        164        1.3            3
39,441     50%       19,721      819        164        1.2            2
39,441     60%       23,665      819        164        1.1            2
39,441     80%       31,553      819        164        0.9            2
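The margin-of-error column of Table 3.3 follows from the usual normal-approximation formula; the short sketch below reproduces the reported values (no finite population correction appears to be applied).

```python
import math

sd = 164    # standard deviation of the OSAT scores from the FY15 data
z = 1.96    # multiplier for a 95% confidence interval

for completes in (3_821, 7_888, 11_832, 15_776, 19_721, 23_665, 31_553):
    se = sd / math.sqrt(completes)   # standard error of the mean
    moe = z * se                     # margin of error, in index points
    print(f"N = {completes:>6,}: SE = {se:.1f}, MOE = {moe:.0f}")
```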

In the margin of error analysis noted above and in subsequent analyses included in this report, the Overall Satisfaction Index score is the main dependent variable and is the basis for the analysis. The Overall Satisfaction Index score is the survey metric that VBA utilizes to measure customer satisfaction and benchmark performance against other industries. It is the primary measurement in all reports. The Overall Satisfaction Index encompasses all aspects of the customer experience,[6] and can therefore be used as a reliable indicator for the presence or absence of respondent bias in the survey results as a whole. For these reasons, the Overall Satisfaction Index score is used as the main dependent variable in the margin of error analysis and subsequent t-test analyses included in this report.

3.3.1 Sampling Distribution
Respondent characteristics such as gender and age were compared to those of the total sample to determine whether respondents and non-respondents differed on key variables of interest. Compared to the population of all eligible respondents (40,000), the survey respondents demonstrate the same gender characteristics. Table 3.1.1 below illustrates that 10% of survey respondents were female and 90% were male, mirroring the total sample population. The distribution of age shows that survey respondents tend to be older.

[6] An explanation of the J.D. Power Index Model calculation is included in the Methodology section.


Table 3.1.1. Comparing Gender and Age of Survey Respondents to the Total Sample

                  Respondents   Sample     Total        Sample     % Point
                  (%)           Size (N)   Sample (%)   Size (N)   Difference
Gender
  Female               10           356         10        4,195       -0.3
  Male                 90         3,134         90       35,801        0.3
Age Generation
  Baby Boomer          49         1,721         32       12,722         17
  Generation X         22           781         28       11,266         -6
  Generation YZ        12           423         33       13,034        -20
  Pre-Boomer           16           566          7        2,977          9

3.3.2 Distribution of Overall Satisfaction Index Scores
Following the comparison of sampling distributions, a comparison of Overall Satisfaction scores
was conducted to determine whether differences in age and gender among respondents
correlate with differences in overall satisfaction.
Table 3.3.2 below indicates minor differences in Overall Satisfaction scores based on gender and age. On average, females tend to rate their experience 4 index points higher than males (822 vs. 818, respectively). Among the generations, Baby Boomers (820) scored higher than Pre-Boomers, who had the lowest scores (812).
Table 3.3.2. Overall Satisfaction Scores by Gender and Age Group

Gender and Age     OSAT (mean)   Standard Deviation   Sample Size (N)
Gender
  Female               822              165                 334
  Male                 818              163               2,959
Age Generation
  Baby Boomer          820              164               1,624
  Generation X         818              159                 740
  Generation YZ        822              157                 416
  Pre-Boomer           812              171                 514

3.3.3 Analysis for Demographic Differences
T-test analyses were conducted to determine whether differences in demographic groups
produced statistical differences in Overall Satisfaction scores. T-tests are typically used to
determine whether or not the difference between two groups’ averages most likely reflect a
meaningful difference in the population from which the groups were sampled.


Both gender and war participation demonstrated no differences in Overall Satisfaction scores,
as shown in Table 3.3.3a.
Table 3.3.3a. T-Test Analysis for Gender and War Service in Veterans' Overall Satisfaction

Gender and War Service           T-Test Statistic   p-value   Statistical Difference
                                                              (95% confidence level)
Gender
  Female vs. Male                       .42             .68    No
War Participation
  OEF/OIF vs. All other wars           -.58             .56    No

Analyses of Variance (ANOVA) were conducted to determine whether differences in
demographic groups produced statistical differences in overall satisfaction scores. ANOVAs are
typically used to determine whether or not the difference between three or more groups’
averages most likely reflect a meaningful difference in the population from which the groups
were sampled.
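Both tests can be illustrated with scipy. The respondent-level data is not published, so the groups below are simulated from the reported means, standard deviations, and sample sizes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Two groups (gender): Welch's t-test on simulated OSAT scores.
female = rng.normal(822, 165, 334)
male   = rng.normal(818, 163, 2959)
t, p = stats.ttest_ind(female, male, equal_var=False)

# Three or more groups (age generation): one-way ANOVA.
boomer, genx, genyz, pre = (rng.normal(m, s, n) for m, s, n in
                            [(820, 164, 1624), (818, 159, 740),
                             (822, 157, 416), (812, 171, 514)])
f, p_anova = stats.f_oneway(boomer, genx, genyz, pre)
print(f"t = {t:.2f} (p = {p:.2f}); F = {f:.2f} (p = {p_anova:.3f})")
```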
There were no differences in overall satisfaction across Age generation (F = .34, p-value = .798).
Table 3.3.3b. Overall Satisfaction by Age Generation

Generation     OSAT (mean)   Sample Size (N)
Baby Boomer        820            1,624
Gen-X              818              740
Gen-YZ             822              416
Pre-Boomer         812              514

Differences in overall satisfaction by Region were significant (F = 3.90, p-value = .009) such that
respondents from the South had the highest satisfaction.
Table 3.3.3c. Overall Satisfaction by Regions

Region      OSAT (mean)   Sample Size (N)
Midwest         800              574
Northeast       812              212
South           826            1,513
West            819              991

Differences in overall satisfaction by Race were significant (F = 7.03, p-value = .0001) such that
Black respondents had the highest satisfaction.


Table 3.3.3d. Overall Satisfaction by Race

Race    OSAT (mean)   Sample Size (N)
Asian       830               80
Black       857              296
Other       809            1,141
White       818            1,777

There were no differences in overall satisfaction by Branch of Service (F = 1.49, p-value = .202).
Table 3.3.3e. Overall Satisfaction by Military Service Branch

Military Branch   OSAT (mean)   Sample Size (N)
Air Force             830              615
Army                  821            1,087
Marines               814              263
Navy                  814              546
Other                 811              783

Differences in overall satisfaction by Days of Active Service were found to be significant (F = 5.25, p-value = .001) such that respondents with 4001 days or more of active service had the highest satisfaction.

Table 3.3.3f. Overall Satisfaction by Days of Active Service

Days of Active Service   OSAT (mean)   Sample Size (N)
1000 days or less            808            1,350
1001-2000 days               818              581
2001-4000 days               818              448
4001 days or more            835              915

There were differences in overall satisfaction by Service Discharge (F = 4.28, p-value = .01) such
that respondents with an Unknown category of discharge had the lowest levels of satisfaction.
Table 3.3.3g. Overall Satisfaction by Military Service Discharge

Service Discharge      OSAT (mean)   Sample Size (N)
Honorable                  828            1,449
Other than Honorable       828               20
Unknown                    811            1,825


3.3.4 Data Imputation Analysis for Demographic Differences
A pairwise comparison T-Test analysis was conducted to evaluate whether data imputation for
missing values across age, race, region, and other significant demographics for the final cleaned
sample size of 3,294 and the 3,821 total survey respondents generated any changes in the
overall satisfaction index score. This analysis also included survey raking across demographic
differences as one level of comparison.
The results below show that there were no significant differences between the non-imputed mean and the imputed mean of the satisfaction index across demographics, sample sizes, or survey-raked values. These results support the conclusion that the survey's findings for Veterans' overall satisfaction ratings are accurate.
Table 3.3.4a. T-Tests of Imputed vs. Non-Imputed on Veterans' Overall Satisfaction Scores

T-Test Analysis on Imputed vs. Non-Imputed for Race, Region, Active Service, and Discharge

Overall Satisfaction Index (100 - 1,000 range)                 Mean        Mean            t-statistic   p-value
                                                               (imputed)   (non-imputed)
Imputed demographics (3,294 final sample size)                   818.55       818.52          -0.01       0.9946
Imputed survey-raked demographics (3,294 final sample size)      819.17       819.12          -0.01       0.9885
Imputed survey-raked demographics (3,491 total respondents)      819.84       819.01          -0.21       0.8312

Note: Non-imputed is based on the 288 final cleaned sample size used in this report.
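The pairwise (paired) t-test behind Table 3.3.4a can be sketched as follows; the per-respondent scores are simulated, since only the group means are published, and the perturbation standing in for imputation is an assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical per-respondent index scores without and with imputation of
# missing demographic fields (a small perturbation stands in for imputation).
non_imputed = rng.normal(818.5, 163, 3294)
imputed = non_imputed + rng.normal(0, 5, 3294)

t, p = stats.ttest_rel(imputed, non_imputed)  # pairwise (paired) comparison
print(f"t = {t:.2f}, p = {p:.4f}")            # a large p-value suggests that
                                              # imputation did not shift the index
```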

Survey Raking for Sample Weights to Adjust for Differences and Compare Overall Satisfaction
and Advocacy Ratings
The procedure known as “raking” adjusts a set of data so that its marginal totals match
specified control totals on a specified set of variables. The term “raking” suggests an analogy
with the process of smoothing the soil in a garden plot by alternately working it back and forth
with a rake in two perpendicular directions. See Izrael and Battaglia (2004).
Survey raking is an iterative sample-balancing algorithm-based technique that provides sample
weighting convergence across multiple variables and multiple categories. See Battaglia, Izrael,
Hoaglin, and Frankel (2009).
In keeping with OMB “Standards and Guidelines for Statistical Surveys” Guidelines 3.2.12 and
3.2.13, JDP selected the best statistical method to simultaneously adjust for multiple
differences between groups by applying a survey raking procedure. See Anderson, L., and R.D.
Fricker, Jr. (2015).


The JDP raking procedure is a proprietary and improved version based on the methods initially developed by Izrael and Battaglia (2000, 2004) and Battaglia, Izrael, Hoaglin, and Frankel (2004). JDP raking improvements are primarily related to better handling of low cell values during iterative convergence processing. For this analysis, 50 iterations were set (although fewer were needed) to converge on the best sample weights (.2 estimation margin) to simultaneously adjust for non-response bias in age, race, region, and war (service era) demographic categories. For additional background about survey-raking methodologies, see Wallace and Rust (1996).
The estimated population distributions are used as convergence targets. In this case, the data
set of all eligible respondents (40,000) was used as the estimated population to derive sample
weightings for the 3,821 survey respondents.
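JDP's implementation is proprietary, but the core iterative proportional fitting loop that raking builds on can be sketched generically, consistent with the Izrael and Battaglia descriptions cited above. All names and the toy data are illustrative; the .2 tolerance mirrors the estimation margin mentioned in the text.

```python
import numpy as np

def rake(weights, categories, targets, max_iter=50, tol=0.2):
    """Iterative proportional fitting: cycle through the raking variables,
    scaling the weights so each category's weighted total matches its
    population target, until every margin is within `tol`."""
    w = np.asarray(weights, dtype=float).copy()
    for _ in range(max_iter):
        for var, codes in categories.items():
            for cat, target in targets[var].items():
                current = w[codes == cat].sum()
                if current > 0:
                    w[codes == cat] *= target / current  # match this margin
        # Converged when every margin is within `tol` of its target.
        gaps = [abs(w[codes == cat].sum() - target)
                for var, codes in categories.items()
                for cat, target in targets[var].items()]
        if max(gaps) < tol:
            break
    return w

# Toy usage: weight 6 respondents to hypothetical population margins.
region = np.array(["S", "S", "MW", "W", "NE", "S"])
age = np.array(["BB", "GX", "BB", "YZ", "BB", "GX"])
w = rake(np.ones(6),
         categories={"region": region, "age": age},
         targets={"region": {"S": 3.0, "MW": 1.0, "W": 1.0, "NE": 1.0},
                  "age": {"BB": 2.5, "GX": 2.0, "YZ": 1.5}})
```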
In accordance with OMB "Standards and Guidelines for Statistical Surveys" Guideline 3.2.13, a series of t-tests were conducted to determine whether non-response bias in demographic areas produced statistical differences in overall satisfaction scores and advocacy ratings. Typically, t-tests are used to determine whether differences between two groups' averages and variances reflect a meaningful difference in the population. The sample weightings derived from the survey-raking procedure were included in the t-tests to equalize the survey respondent differences with non-respondents.
There were no significant differences in Overall Satisfaction or advocacy levels when the data
was adjusted for demographic differences between survey respondents and non-respondents.
The results below support the conclusion that the survey’s findings for Veterans’ overall
satisfaction ratings are accurate.
Table 3.3.4b. Overall Satisfaction and Advocacy for Respondents Unweighted and Weighted

Analysis of Survey Respondent Scores with Weighted Adjustment for Non-Response Bias

Rating Measure                                   Mean           Mean         Std. Dev.      Std. Dev.    t-statistic   p-value
                                                 (Unweighted)   (Weighted)   (Unweighted)   (Weighted)
Overall Satisfaction Index (100 - 1000 range)       818.52         819.12       163            162          -0.15       0.8823
Likelihood to inform about VA benefits (1 - 4)        3.67           3.68         0.53           0.52        -0.61       0.5447
Likelihood to recommend benefits (1 - 4)              3.77           3.78         0.47           0.46        -0.46       0.6459


Findings
Results from the non-response bias analysis indicate that the Overall Satisfaction Index score
and the Advocacy ratings from the Home Loan Guaranty Study reflect the experience of all
Veterans who originated a purchase, interest rate reduction, cash out, or other refinancing
through the VA Home Loan Guaranty program.

• Sample Cleaning: Initial comparisons on age, gender, and geographical characteristics between the total records provided and the records available after cleaning suggest the sample utilized in the study exhibits similar characteristics as the total sample provided by LGY. The tests (see Margin of Error and Sampling Distribution, Section 3.3) suggest the sample cleaning rules did not impact the sample's representativeness and the results are conclusive.

• Non-Response Bias Analysis: Results from the non-response bias analysis did show group differences in race, region, active days of service, and service discharge between survey respondents and non-respondents. After correcting for these differences using a recommended sample-balancing survey-raking method to derive sample weights (see Section 3.3.4, Data Imputation Analysis for Demographic Differences), there were no differences found in Veterans' overall satisfaction and advocacy ratings between weighted and unweighted survey respondents.

• Item Response Rate Calculations: Results from the survey item response rate calculations show high item response rates, with none falling below 70% (see Appendix B for Item Response Rates). According to OMB Guideline 3.2.10, given this high item response rate, a non-response bias analysis was not necessary at the item level.
The research and approach taken by JDP are in accordance with sound market research and
current best practices from the American Association for Public Opinion Research (AAPOR)
regarding response rate recommendations: “Results that show the least bias have turned out,
in some cases, to come from surveys with less than optimal response rates. Experimental
comparisons have also revealed few significant differences between estimates from surveys
with low response rates and short field periods and surveys with high response rates and long
field periods.” See AAPOR “Response Rates – An Overview” (2015) and Special Issue of Public
Opinion Quarterly "Nonresponse Bias in Household Surveys" (Singer, 2006).


Conclusion
The Overall Satisfaction Index score and Advocacy ratings are not impacted in any meaningful
way by non-response bias. This analysis confirms that the data collected during Fiscal Year 2015
is valid.
The FY15 Voice of the Veteran Line of Business Tracking Satisfaction Study data for the Loan
Guaranty survey can be used to infer reliable overall satisfaction scores and advocacy ratings.
The Overall Satisfaction Index score reflects the experience of all Veterans who originated a
purchase, interest rate reduction, cash out, or other refinancing through the VA Home Loan
Guaranty program.
The sample utilized in the study exhibits similar characteristics for age, gender, and geography
as the total sample provided by the Loan Guaranty Service. This indicates the sample cleaning
rules did not impact the sample’s representativeness.
While the results from the non-response bias analysis did show group differences in race, region, active days of service, and service discharge between survey respondents and non-respondents, there were no differences found in Veterans' overall satisfaction and advocacy ratings between weighted and unweighted survey respondents. This was after correcting for these differences using a recommended sample-balancing survey-raking method to derive sample weights. JDP conducted all necessary statistical tests in accordance with OMB standards.
J.D. Power certifies the results contained within this report.


References
Anderson, L., and R.D. Fricker, Jr. (2015). Raking: An Important and Often Overlooked Survey Analysis Tool. Phalanx. http://faculty.nps.edu/rdfricke/docs/Analysis%20process_v4.pdf
American Association for Public Opinion Research (2008). Standard Definitions: Final Disposition of Case Codes and Outcome Rates for Surveys. Ann Arbor, Michigan: AAPOR. http://www.aapor.org/AAPORKentico/AAPOR_Main/media/MainSiteFiles/Standard_Definitions_07_08_Final.pdf
American Association for Public Opinion Research (2015). "Response Rates – An Overview." http://www.aapor.org/AAPORKentico/Education-Resources/For-Researchers/Poll-Survey-FAQ/Response-Rates-An-Overview.aspx
Battaglia, Michael P., Izrael, David, Hoaglin, David C., and Frankel, Martin R. (2004). “To Rake or Not To
Rake Is Not the Question Anymore with the Enhanced Raking Macro.” Proceedings of the 29th Annual
SAS Users Group International Conference, Paper 207.
Battaglia, Michael P., Izrael, David, Hoaglin, David C., and Frankel, Martin R. (2009). Practical
Considerations in Raking Survey Data. Survey Practice, Vol 2, No. 5.
Baum, Herbert M., Ph.D.; Chandonnet, Anna, M.A.; Fentress, Jack, M.S., M.B.A.; and Rasinowich, Colleen, B.A. (2012). "Mixed-Mode Methods for Conducting Survey Research." Data Recognition Corporation. http://www.datarecognitioncorp.com/survey-services/Documents/Mixed-Mode-Methods-for-Conducting-Survey-Research.pdf
Dillman, D. A. and JDP (2015). “Conference call discussion on non-response bias, avoidance methods, and
post-hoc sample weighting.” Conference call between Dr. Dillman and JDP (Greg Truex, Jay Meyers,
PhD, Lee Quintanar, PhD), May 20, 2015 (2pm PDT).
Dillman, D. A. (2014). Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method.
Fourth Edition. John Wiley & Sons, Inc: New York.
Economic Systems Inc. (2004). Evaluation of VA’s Home Loan Guaranty Program, Final Report “Appendix
A Sampling and Nonresponse Analysis.” ORC Macro, Hay Group, Philadelphia, PA.
Ellis, J. M. (2000). Estimating the Number of Eligible Respondents for a Telephone Survey of Low-Incidence Households. Paper presented at the annual meeting of the American Association for Public Opinion Research, Portland, OR, May 21.
Federal Committee on Statistical Methodology’s Statistical Policy Working Paper 31, Measuring and
Reporting Sources of Error in Surveys (2001). Washington, D.C.
Izrael, David, Hoaglin, David C., and Battaglia, Michael P. (2000). “A SAS Macro for Balancing a Weighted
Sample.” Proceedings of the Twenty-Fifth Annual SAS Users Group International Conference, Paper
275.


Izrael, David, Hoaglin, David C., and Battaglia, Michael P. (2004). “Tips and Tricks for Raking Survey Data
(a.k.a. Sample Balancing).” Proceedings of the 2004 American Association for Public Opinion
Research (AAPOR) Conference.
Malhotra, N.K, and Birks, D.F. (2007). Marketing Research: An Applied Approach, 3rd edition. Prentice
Hall/Financial Times: England.
Pierchala, Carl E. (2001). PROC MI® as the Basis for a Macro for the Study of Patterns of Missing Data.
Northeast SAS Users Group. http://www.lexjansen.com/nesug/nesug03/st/st009.pdf
Singer, E. (2006). Special Issue: Nonresponse Bias in Household Surveys. Public Opinion Quarterly, Vol
70, Issue 5.
U.S. Office of Management and Budget (1990). "Survey Coverage," Statistical Policy Working Paper 17,
Washington, D.C.
U.S. Office of Management and Budget Publication (January 2006). “When Designing Surveys for
Information Collections.” The Office of Management and Budget, 725 17th Street, NW, Washington,
D.C., 20503 USA.
U.S. Office of Management and Budget Publication (September 2006). “Standards and Guidelines for
Statistical Surveys.” The Office of Management and Budget, 725 17th Street, NW, Washington, D.C.,
20503 USA.
U.S. Office of Management and Budget Publication (2008). VBA LGY OMB - Part B Supporting statement
for “Collections of Information Employing Statistical Methods.” Washington, D.C.
Vogt, W. Paul, Vogt, Elaine R., Gardner, Dianne C., and Haeffele, Lynne M. (2014). Selecting the Right
Analyses for Your Data - Quantitative, Qualitative, and Mixed Method. Guilford Press, New York, NY.
Wallace, Leslie and Rust, Keith (1996). A Comparison of Raking and Poststratification Using 1994 NAEP
Data. Leslie Wallace, West Inc., 584-589.


Appendix A
Missing Data Patterns and Mechanisms
An excellent discussion of missing data patterns, mechanisms, and research analysis methods is
provided in Vogt, W. Paul, Vogt, Elaine R., Gardner, Dianne C., and Haeffele, Lynne M. (2014).
An overview of the missing data types and issues is described below:
Understanding the reasons why data is missing can help with analyzing the remaining data. If values are missing at random, the data sample may still be representative of the population. But if the values are missing systematically, analysis may be harder.

• Missing completely at random. Values in a data set are missing completely at random (MCAR) if the events that lead to any particular data-item being missing are independent both of observable variables and of unobservable parameters of interest, and occur entirely at random. When data are MCAR, the analyses performed on the data are unbiased; however, data are rarely MCAR.

• Missing at random. Missing at random (MAR) is an alternative, and occurs when the missing value is related to a particular variable, but it is not related to the value of the variable that has missing data. An example of this is accidentally omitting an answer on a questionnaire.

• Missing not at random. Missing not at random (MNAR) is data that is missing for a specific reason (i.e., the value of the variable that's missing is related to the reason it's missing). An example of this is if a certain question on a questionnaire tends to be skipped deliberately by participants with certain characteristics. Graphical models can be used to describe the missing data mechanism in detail.

While it is clear that MNAR can introduce statistical bias, there is no definitive test for it; see Vogt et al. (2014). It is also clear that MCAR is rarely evident in research data and most tests of it will fail. However, MAR is fully acceptable for valid statistical analyses (Vogt et al., 2014). MAR is essentially “missing partially at random,” whereby the intra-group missingness remains random despite some differences between group tendencies. Graphical data representations are the typical assessment tool, as described above and in Pierchala (2001).
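To make the three mechanisms concrete, the following minimal sketch (illustrative only, and not part of the study’s analysis; all variables and rates are hypothetical) simulates MCAR, MAR, and MNAR missingness on a satisfaction score and shows how the complete-case mean is distorted:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    age = rng.integers(20, 90, size=n)                       # hypothetical observed covariate
    score = 7.5 + 0.02 * (age - 55) + rng.normal(0, 1.5, n)  # hypothetical satisfaction rating

    # MCAR: missingness is independent of everything.
    mcar = rng.random(n) < 0.10
    # MAR: missingness depends only on the observed covariate (younger respondents skip more).
    mar = rng.random(n) < np.where(age < 40, 0.30, 0.05)
    # MNAR: missingness depends on the unobserved value itself (low scorers skip).
    mnar = rng.random(n) < np.where(score < 6, 0.40, 0.02)

    for name, mask in [("MCAR", mcar), ("MAR", mar), ("MNAR", mnar)]:
        print(f"{name}: observed mean = {score[~mask].mean():.2f} "
              f"(true mean = {score.mean():.2f})")

Under MAR, any bias in the raw complete-case mean can be removed by conditioning on (or weighting by) the observed covariate, which is why MAR remains acceptable for valid analyses; under MNAR, no adjustment based on observed data alone is guaranteed to remove it.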
See Section 3.2 Missing Data Patterns and Mechanisms for findings specific to LGY’s data.

Appendix B
Item Response Rates
In accordance with OMB “Standards and Guidelines for Statistical Surveys,” Section 3.2, Guidelines 3.2.6-3.2.7, the item response rate was calculated as the ratio of the number of respondents for whom an in-scope response was obtained to the number of respondents who were asked to answer that item. The number asked to answer an item is the number of unit-level respondents minus the number of respondents with a valid skip pattern. In addition to the item response rate, the total item response rate was calculated as the product of the overall unit response rate and the item response rate for each item. The purpose of these calculations is to assess item non-response, which occurs when one or more survey items are left blank in an otherwise completed questionnaire. Table B1 displays the item and total item response rates for this survey.
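The calculation can be illustrated with a short sketch; the counts below are hypothetical and are used only to show how the two rates defined above fit together:

    # Hypothetical counts for a single survey item.
    unit_respondents = 3_600    # usable returned surveys
    valid_skips = 400           # respondents routed past the item by a valid skip pattern
    in_scope_responses = 2_880  # respondents who gave an in-scope answer to the item

    asked = unit_respondents - valid_skips             # respondents asked to answer the item
    item_response_rate = in_scope_responses / asked    # 2,880 / 3,200 = 90%

    unit_response_rate = 0.09                          # overall unit response rate (hypothetical)
    total_item_response_rate = unit_response_rate * item_response_rate  # 8.1%

    print(f"Item response rate: {item_response_rate:.0%}")
    print(f"Total item response rate: {total_item_response_rate:.1%}")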
The OMB “Standards and Guidelines for Statistical Surveys” (Guideline 3.2.10) states that an item non-response analysis should be conducted for items with an item response rate of less than 70%. Since none of the survey item response rates fall below 70%, an item-level analysis of non-response bias was not necessary. Results from the item response rate calculation suggest the item response rate for the Loan Guaranty Study is strong, ranging from 83% to 100% with a 93% average. The questions that comprise the multivariate regression for the Overall Satisfaction Index are within the range of 89%-99%, and all other statistics reported are descriptive in nature.
Table B1. Comparing Survey Item Response Rates⁷

Question    Item Response Rate    Total Item Response Rate
1           89%                   8%
2           99%                   9%
3           99%                   9%
4           93%                   8%
5           93%                   8%
6           92%                   8%
7a          99%                   9%
7b          99%                   9%
7c          99%                   9%
7d          99%                   9%
7e          95%                   9%
7f          99%                   9%
8           95%                   9%
9           95%                   9%
10          98%                   9%
11          83%                   7%
12          99%                   9%
13          100%                  9%
14          97%                   9%
15          96%                   9%
16          94%                   8%
17          92%                   8%
18          86%                   8%
19          83%                   7%
20a         93%                   8%
20b         91%                   8%
20c         89%                   8%
20d         96%                   9%
21          94%                   8%
22          97%                   9%
23          95%                   9%
24          94%                   8%
25          88%                   8%
26a         90%                   8%
26b         91%                   8%
26c         96%                   9%
27          96%                   9%
28          90%                   8%
29          92%                   8%
30          92%                   8%
31          98%                   9%
32          89%                   8%
33          84%                   8%
34          87%                   8%
35          93%                   8%
36          97%                   9%
37          94%                   8%
38          88%                   8%
39a         89%                   8%
39b         92%                   8%
39c         93%                   8%
39d         93%                   8%
39e         93%                   8%
39f         91%                   8%
39g         93%                   8%
40a         91%                   8%
40b         91%                   8%
40c         91%                   8%
40d         91%                   8%
40e         90%                   8%
40f         92%                   8%
41a         93%                   8%
41b         93%                   8%
41c         93%                   8%
41d         90%                   8%
41e         93%                   8%
42          95%                   9%
43a         97%                   9%
43b         97%                   9%
43c         97%                   9%
43d         97%                   9%
43e         98%                   9%
44          85%                   8%
45a         97%                   9%
45b         96%                   9%
45c         96%                   9%
45d         96%                   9%
45e         97%                   9%
46          83%                   8%
47          94%                   8%
48          95%                   9%
49          95%                   9%
50          99%                   9%
51          83%                   7%
52          94%                   8%
53          98%                   9%
54          90%                   8%
55          N/A                   N/A
56          N/A                   N/A
57          N/A                   N/A

⁷ Email opt-in and additional comments about your experience (open capture) questions display “N/A” and were not included in item and total item response rate calculations.

In the item response rate calculation above, JDP considered blanks as non-response for mail returns, and both “don’t know” selections and blanks as non-response for online returns. “Don’t know” selections are included as non-response for online returns because respondents are forced to select a response in the online survey.
Similarly, “N/A” responses were also included as non-response for rating questions in online returns. Because respondents taking the survey online must answer each question before proceeding to the next, a “Not Applicable” or “N/A” selection could mean either that the respondent was answering “N/A” to the question or that the respondent did not wish to answer it. Therefore, this response option was included as non-response.


Appendix C
Study Overview
1.1 Study Background
The Voice of the Veteran Satisfaction Initiative tracks Veteran satisfaction with the benefits and
services received from VBA. The VOV Tracking Satisfaction Research Study is ongoing survey
research tracking Veteran satisfaction with VBA’s lines of business: Compensation, Pension,
Education, Vocational Rehabilitation & Employment (VR&E), and Loan Guaranty (LGY).
As part of Executive Order 13571 Streamlining Service Delivery and Improving Customer
Service, agencies that provide significant services directly to the public are to identify and
survey customers, establish service standards and track performance against those standards,
and benchmark customer service against the best in business. This program enables VBA to
understand what is important to Veterans relative to benefits received and services provided.
This program provides timely and actionable Veteran feedback on how well VBA is providing
services. Insights from this program identify opportunities for improvement and measure the
impact of improvement initiatives, as well as continuously measure performance outcomes.
Loan Guaranty’s survey instrument addresses Veteran satisfaction with the Home Loan Process.

Survey: Loan Guaranty
Methodology: Mixed (Paper Survey and Postcard w/ eSurvey)
Fielding Frequency: Monthly
Total Mailouts Per Year: 40,000
Target Number of Completes: 12,000

1.2 Methodology
Respondents had the option of completing a paper survey or an online survey. Respondents
were first sent a postcard with a link to the eSurvey to complete the survey online. Each
respondent was issued a unique sequence number that is entered online prior to beginning the
eSurvey. Three weeks after deployment of the postcard, a survey packet containing a cover
letter, survey instrument, and Business Reply Envelope (BRE) was sent to non-responders (to
the postcard mailing). The sample for mailings of the survey packet was cleaned to exclude
anyone who completed the survey at least one week prior to the cleaning.
Sample Population Definition
The targeted population was identified by LGY. The target population is defined as individuals
from a 30-day period who closed a VA home loan within the 90 days prior to the fielding period.
The sample included (1) those who closed on purchase loans; (2) those who received loans for
interest rate reductions; and (3) those who obtained cash out or other refinancing.

Sample File Generation
• LGY generates the sample files based upon the sampling definition and submits sample files directly to BAS.
• BAS receives the sample files and sends them to VADIR for processing.
• VADIR processes the sample files (to remove SSN and append demographics/EDIPI) and returns them to BAS.
• BAS transfers the sample files (via the EDX platform) to JDP and notifies JDP via email that the sample files are ready for deployment.
• JDP cleans the sample file and selects the sample.
• Sample is transferred to the Government Printing Office (GPO) print vendor (via the EDX platform) for printing and mailing of the postcards and survey packages.

Sample is transferred in accordance with the following schedule:
[Embedded attachment: VOV_LOB Tracking_Production Schedule_10.06.15.pdf]

1.3 Data Cleaning
JDP processed the sample according to the following cleaning rules (a minimal code sketch of several of these rules follows the list):
1. De-duplicate records within each business line and across surveys based on the unique identifier (EDI_PI or VA_ID) for each record. Note: EDIPI is the Electronic Data Interchange Personal Identifier.
   a) Exception: For Pension Enrollment (v1) and Pension Servicing (v8), de-duplicate records based on EDI_PI and Claim Number.
   b) When each new sample file is received, JDP cleans it against all sample selected from every sample batch delivered in the prior 12 months to ensure a respondent does not receive a VA line of business survey more than once in a 12-month period. In the case of duplicates occurring within the same sample month, priority is assigned to business lines with the lowest number of sample records.
2. Clean out records that are present on the JDP “do not contact” list and clean against the National Change of Address (NCOA) list.
3. Clean out any respondents who do not have an EDI_PI or VA_ID included in their sample record.
   a) Exception: For Pension Enrollment (v1) and Pension Servicing (v8), clean out records with blank EDI_PI and Claim Number.
4. Clean out any respondents not specified as a dependent/spouse who have a date of death (DOD) in their sample record.
5. Clean out any respondents who do not have an address included in their sample record.
6. Assign and maintain unique sampling identifiers for each sample record in order to track sampling history. Exclude records that have been sampled in the past 12 months to ensure no respondent is mailed surveys more than once in a 12-month time frame. This rule may not apply to those who completed a survey.
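The de-duplication and exclusion rules above can be expressed as a minimal sketch. The file names and the DOD and IS_DEPENDENT columns are hypothetical (VA_ID and ADDRESS_LINE_ONE appear in the LGY file layout), and the sketch covers only rules 1 and 3-6:

    import pandas as pd

    sample = pd.read_csv("lgy_sample.csv")          # hypothetical monthly sample file
    history = pd.read_csv("sampled_last_12mo.csv")  # records sampled in the prior 12 months

    # Rule 1: de-duplicate within the file on the unique identifier.
    sample = sample.drop_duplicates(subset="VA_ID")
    # Rule 3: drop records with a blank identifier.
    sample = sample.dropna(subset=["VA_ID"])
    # Rule 4: drop records with a date of death unless flagged as dependent/spouse
    # (DOD and IS_DEPENDENT are hypothetical column names).
    sample = sample[sample["DOD"].isna() | (sample["IS_DEPENDENT"] == 1)]
    # Rule 5: drop records with no mailing address.
    sample = sample.dropna(subset=["ADDRESS_LINE_ONE"])
    # Rule 6: exclude anyone sampled in the past 12 months.
    sample = sample[~sample["VA_ID"].isin(history["VA_ID"])]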

1.4 Order Generation and Fulfillment Process
Federal Acquisition Regulation (FAR) 8.8 mandates that government agencies solicit all printing requirements through the Government Printing Office. GPO utilizes print vendors to fulfill orders. A Data Transfer Agreement (DTA) must be in place with the print vendor and contractor before BAS can obligate funds or transfer sample files to the print vendor and contractor.
Prior to mailing the postcards and mail surveys, print orders must be generated for each survey. The entire process may take 2-4 weeks from inception of the print order to the mailing of the survey package or postcard. Below are the steps involved in order generation and order fulfillment.
Order generation
• After sample is received by JDP, the sample files are cleaned and selected. Then Letter Work Orders (LWOs) are created to provide the print vendor with the necessary information to match the sample files to the correct survey instrument. (1 day)
• JDP creates the print order and sends it to the BAS Contracting Officer’s Representative (COR). (Same day as above step)
• The COR then reviews, authorizes, and submits the print order. (1 day)
• The BAS Publication Officer and/or COR submits the orders to the VA Publications Services Division (VAPSD). (Same day as above step)
• The order is issued a control number by a VBA Management Analyst, Publications. (Variable timing)
• Once the control number is assigned, the order goes to the VAPSD liaison to forward to the GPO Contracting Officer. (Variable timing) Note: the amount of time an order is with VAPSD varies greatly, ranging from 3 days up to 20 days.
• The GPO Contracting Officer sends the printing and mailing order to the print vendor.

Order fulfillment
• Once the order is placed, the GPO print vendor is allotted 9 business days to fulfill the order (2 days to generate proofs, 2 days for proof review/correction, and 5 days to print and mail).
• Upon receipt of the proofs from the print vendor, JDP reviews and approves; then BAS reviews and approves; then VAPSD reviews and approves.
• After the orders have been mailed, the print vendor provides the mail receipts to the contractor, BAS, and VAPSD.
• Upon order completion, VAPSD provides actual costs to BAS.


1.5 Reporting
Reporting occurs four times yearly for the LGY Home Loan Process survey.
On a quarterly basis, the following deliverables are provided:
• Scorecard
• Data Matrices
• Data loaded to the VOV reporting site
• Open-ended comments (verbatims)
On a semiannual (twice yearly) basis, the following deliverable is provided:
• Data and Analysis Presentation


Sample Plan Overview
2.1 Sample Criteria
VBA was responsible for providing sample to JDP that meets the following sampling criteria:
Sample Population: Loan Guaranty
Inclusion Criteria: The targeted population will include individuals from a 30-day period who closed a VA home loan in the 90 days prior to the fielding period (includes those that closed on purchase loans, those who received loans for interest rate reductions, and those who obtained cash out or other refinancing).
Frequency of Data Request: Monthly

2.2 Fielding/Sampling Frequency
Survey Instrument: Loan Guaranty
Methodology: Mixed (Paper Survey and Postcard w/ eSurvey)
Total Survey Instruments: 40,000
Targeted Number of Completes: 12,000
Number of Postcards (eSurvey): 40,000
Number of Mail Packages: 40,000
Fielding Frequency: Monthly

2.3 Data Transfer
The sample was posted by BAS once a month in the sampling folder on the VOV EDX site.
Sample should be provided in a file layout consistent with the file layout provided for the study
as outlined below.

LGY File Layout
ADDRESS_LINE_ONE
ADDRESS_LINE_TWO
ADDRESS_LINE1
ADDRESS_LINE2
BIRTH_DATE
CHAR_SVC_CD
CITY_NAME
COUNTY_NAME
DATE_OF_BIRTH
DAYS_OF_ACTIVE_SERVICE
EDU_LVL_CD
FIRST_NAME
GENDER
LAST_NAME
LGY_SSN
LIN
OEF_OIF_IND
ORIGINATION_DATE_BINNED
PN_1ST_NM
PN_BRTH_DT
PN_LST_NM
PN_SEX_CD
PNL_BGN_DT
RACE_CD
STATE_CODE
STREET_NUMBER
SVC_CD
TERM_DT
VA_ID
ZIP
ZIP_SUFFIX
City
State
Before_Address_1
Before_City
Before_Zip
DPV_Code

2.4 Sample Cleaning Rules Glossary
Duplicate records in sample file—the record is cleaned out if there is more than one record
within the same sample file for the same respondent
Duplicate record history—the record is cleaned out if the record has been selected within the
past 12 months for any of VBA’s business line surveys (i.e., Compensation, Pension, Education,
Home Loan Guaranty, and Vocational Rehabilitation) regardless of whether the respondent
completed the survey

40

Invalid address—the record is cleaned out if JDP’s address verification software indicates an
invalid address code
Invalid values—the record is cleaned out if the “VA_ID” field is blank
Blanks—the record is cleaned out if the “Name” field corresponding to the record is blank
Do not contact—the record is cleaned out if the individual is listed on JDP’s “do not contact” list

2.5 Sample Selection
JDP selected sample records following the completion of the sample cleaning process. The
following guidelines are referenced when selecting sample:
1. Total Sampling Targets: The table below summarizes the total sampling target per RO per fielding period. The “Sampling Target per RO” column indicates the minimum number of sample records that should be selected per RO for each survey. If this minimum target number cannot be reached for a particular RO, sample from a different RO will be selected to make up the difference.
Survey: Loan Guaranty
Frequency: Monthly
Total Sampling Target: 40,000
Sampling Target Per Time Period: 3,333
Sampling Target Per RO: N/A
Number of ROs: N/A

2. The same record cannot be selected for multiple surveys during the same fielding period. Respondents who have completed a survey within the past 12 months cannot be selected. Survey priority is based on the number of records in each sample file; the survey with the smallest number of records is given first priority.
3. Following sample selection, the JDP project team receives an automated report confirming the number of records selected for each survey version. The JDP project team verifies that the sample selection quantities reflect the sample targets and approves the sample file for fielding.


2.6 Data Collection
During the survey fielding period, both online survey returns and paper surveys are collected as
they are received and posted on a secure EDX site. Responses from paper surveys are scanned
through automated imaging software, while verbatim responses are recorded by a live survey
processor. Survey returns must have all pages intact in order to be processed and counted as a
return. Surveys with missing pages are counted as unusable. Returns are also considered
unusable if there is an indication that the individual completing the survey is not the individual
selected from the sample file (i.e., the respondent name and/or address on the survey is
replaced with a different name and/or address). During each day of fielding, a subset of survey
returns undergoes quality assurance to validate the accuracy of responses captured. If duplicate
surveys are returned (as identified by the unique sampling identifier assigned to each sample
record), the original survey returned is processed and the duplicate survey is removed. In the
case of duplicate survey returns from mixed methodology surveys, the date the survey was
received is used to identify the original return, while the subsequent return is removed post-fielding.


Appendix D
Approaches to Mitigating the Effect of Non-Response Bias and Strategies to Improve the Response Rate
The following section outlines two approaches used in FY15 to mitigate the potential of non-response bias. As mentioned earlier in the report, J.D. Power affirms that while high response rates are always desirable in surveys, an 80% response rate is typically not achievable for a voluntary customer satisfaction survey instrument (Malhotra & Birks, 2007), particularly one that does not provide an incentive (not recommended for this program). To illustrate this point, Dillman (2014) discusses the Dillman Method for survey fielding: a survey instrument was fielded to 600 students at the University of Washington, and after five attempts to solicit a response, as well as a monetary incentive to complete the study, a 77% response rate was reported.
The first approach to minimize non-response occurs before and during data collection and
involves introducing measures to maximize survey response rates. The second approach is to
make statistical adjustments after the data is collected.

1.1 Approach 1: Strategies to Maximize Response Rates
Prior to and during the fielding of the survey, JDP implemented the following measures to reduce the chances of non-response:
• Respondents were provided a promise of confidentiality on the survey cover letter and postcard, and were assured that their survey responses would not impact their current or future eligibility for benefits.
• Following the first mailing, non-respondents were sent an additional survey mailing.
• Respondents were provided with a toll-free telephone number and dedicated email address to contact JDP about survey-related inquiries (e.g., how to interpret questions and response items, the purpose of the survey, how to get another copy of the survey if their copy has been lost/damaged, etc.). Telephone calls and emails are responded to within 24 hours and answered during regular business hours (8:00 a.m. - 5:00 p.m. PT).
• JDP ensured the Web-based surveys were accessible to people with disabilities by maintaining 508-compliant standards. These standards include:
  - Keyboard navigation rather than mouse or other pointing devices
  - Customization options for color, size, and style of text displayed
  - Compatibility with screen readers to translate items displayed on the survey into audible output and/or Braille displays
  - Customer support and technical support through the JDP help desk toll-free phone number and email address
  - Exclusion of non-text elements, image maps, animation, and flashing or blinking text
• The survey fielding period was extended to offer opportunities to respond for subgroups having a propensity to respond late (e.g., males, young, full-time employed).
• The survey was developed and reviewed to enhance respondent understanding of the survey materials and to improve the relevancy of the data collected:
  - Prior to fielding the Benchmark study, a series of cognitive labs was conducted with test users to ensure the survey questions were easily understood and correctly interpreted. Revisions were made to the survey based on feedback from test users (per OMB Guideline 1.4.1).
  - After the Benchmark study and prior to fielding the second year of the Tracking study in FY15, Loan Guaranty Service and JDP conducted a review of the survey instrument and modified the survey to improve the relevancy of data collected (per OMB Guideline 1.4.2).

1.2 Approach 2: Correcting Unit Non-Response Bias with Sample Weighting and
Survey Raking
As stated above, the two approaches to tackling non-response bias are implementing measures to maximize response rates during the fielding period and making post hoc statistical adjustments to the survey results afterwards. The following section discusses the statistical adjustments approach, which includes weighting the data or imputing scores to correct for non-response bias. An example of this approach is the survey raking procedure described earlier in this report; see the associated references in the “Survey Raking Procedure for Sample Weightings” section for more information.
The procedure known as “raking” adjusts a set of data so that its marginal totals match specified control totals on a specified set of variables. The term “raking” suggests an analogy with the process of smoothing the soil in a garden plot by alternately working it back and forth with a rake in two perpendicular directions. See Izrael, Hoaglin, and Battaglia (2004).
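As an illustration, raking can be implemented with a few lines of iterative proportional fitting. The cell counts and control totals below are hypothetical, and a production implementation (such as the SAS macro of Izrael, Hoaglin, and Battaglia) would add convergence checks and weight trimming:

    import numpy as np

    # Respondent counts in age group (rows) x gender (columns) cells.
    cell_counts = np.array([[120.0,  80.0],
                            [200.0, 100.0]])
    row_targets = np.array([250.0, 250.0])   # control totals for age groups (hypothetical)
    col_targets = np.array([260.0, 240.0])   # control totals for gender (hypothetical)

    weights = np.ones_like(cell_counts)
    for _ in range(100):
        # Scale rows to match the age-group control totals...
        weights *= (row_targets / (weights * cell_counts).sum(axis=1))[:, None]
        # ...then scale columns to match the gender control totals, and repeat.
        weights *= (col_targets / (weights * cell_counts).sum(axis=0))[None, :]

    weighted = weights * cell_counts
    print(weighted.sum(axis=1), weighted.sum(axis=0))  # both now match the targets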
If non-response bias were identified in the survey data, it could be corrected mathematically with a post-stratification survey weight. JDP would weight the survey data based on certain demographics (such as age, gender, and region) of the total sample so that the weighted survey data would conform more closely to the demographics of the total sample. The implicit assumption in this approach is that the distribution of characteristics of the non-respondents within an adjustment class (such as an age group) is the same, on average, as that of the respondents within the same adjustment class.
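A minimal sketch of such a post-stratification adjustment follows; the demographic shares and column names are hypothetical:

    import pandas as pd

    respondents = pd.DataFrame({"age_group": ["<40", "<40", "40-59", "60+", "60+", "60+"]})
    sample_share = {"<40": 0.30, "40-59": 0.30, "60+": 0.40}  # full-sample proportions

    # Weight = (share in the total sample) / (share among respondents).
    resp_share = respondents["age_group"].value_counts(normalize=True)
    respondents["weight"] = respondents["age_group"].map(sample_share) / \
        respondents["age_group"].map(resp_share)

    # The weighted age distribution now matches the full sample.
    print(respondents.groupby("age_group")["weight"].sum() / len(respondents))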
See Appendix B for the item response rate for each question in the survey. Because no item response rate was lower than 70%, per OMB standards, imputation of data is not necessary. In the case that a particular item-level response rate fell below 70%, JDP would recommend conducting additional analyses to determine the potential for other factors (e.g., missing or skip patterns in the survey instrument) to be the cause of non-response.

Strategies to Improve Response Rate
In addition to the strategies listed above, JDP recommends considering the following strategies to improve response rates going forward:
• Issue ongoing public communications (e.g., press releases, information posts on the VA website) to spread awareness and confirm the legitimacy of the VA LGY Study.
• Educate VA employees and VSOs about the survey to encourage participation. Provide a list of frequently asked questions and answers to VSOs and VA employees to enable them to answer Veterans’ questions regarding the survey.
• Send email invitations to Veterans rather than mailing postcards to make it easier for Veterans to complete the survey online.
• Reduce the length of the survey to improve respondents’ willingness to respond:
  - Reduce the overall number of questions and the number of response options for each question.
• Increase the number of contacts to respondents with additional reminders about the survey to encourage participation:
  - Provide respondents with an additional paper survey questionnaire.
• Reduce the frequency of mailings to minimize the opportunities for delays and errors in the GPO print process.
• Revise the cover letter and postcard to express the importance of participation in the survey.
• Provide sample from the 30-day period immediately prior to the mailing rather than sample from 90 days prior to improve the recency of respondents’ experience with the LGY benefit, which improves both participation and recollection.
• Alter the responsibility of sample file generation from Loan Guaranty to PA&I; a PA&I data pull will increase consistency.
• Change the location of the sequence number to directly follow the survey link on the postcard and cover letter.
• Alter the formatting of the postcard and cover letter to include color, making the materials more readable and increasing participation.


Appendix E
Impact of FAR 8.8
Federal Acquisition Regulation (FAR) 8.8 requires that printing be conducted through the Government Printing Office (GPO). The following section outlines limiting factors of the VOV Line of Business Tracking Satisfaction Research Study that occurred as a result of the FAR requirement.
Through the utilization of the GPO print vendor, the following occurred in FY15:
• Quality issues: survey instruments were printed and mailed
  - Utilizing the sample population from one survey, but receiving a different survey (e.g., potential respondents from the pool of one business line received the survey for a different business line)
  - Using a version of the instrument that was outdated; this version did not contain the current questions or responses that were being fielded
  - Mixing content between survey versions
  - Using shells from one survey printed with a different survey
• Ongoing timeliness delays occurred with each set of orders placed, as the order fulfillment process took a minimum of 2-4 weeks

1.1 Impact

The project experienced ongoing delays in the printing and mailing of its postcards and survey
packets for VBA’s lines of business. The delays affected the critical processes required to
execute the VOV Program to its fullest potential.
A multitude of quality issues experienced throughout FY15 negatively impacted the VOV Program response rates. These issues affected access to the online survey, the readability of mail materials, the level of effort required of respondents to take the survey, and the relevancy of the survey, and they diminished the brands (VA/JDP) associated with the poor-quality materials.


Appendix F
Survey Questionnaire
NOTE: The questionnaire below is not shown in the formatted version that respondents used to complete the survey.
[DO NOT DISPLAY/IDENTIFY SECTION HEADERS. DISPLAY SINGLE QUESTION PER PAGE.]
[RESPONSE CODES APPEAR IN BRACKETS AT THE END OF EACH RESPONSE FOR SINGLE
RESPONSES AND IN THE PROGRAMMING INSTRUCTIONS FOR MULTIPLE RESPONSES.]
Sampling definition: The targeted population will include individuals who closed a VA
home loan in the 90 days prior to the fielding period. The sample will be stratified as
follows: (1) those that closed on purchase loans, (2) those who received loans for
interest rate reductions, and (3) those who obtained cash out or other refinancing.
Benefit Information

1. How did you FIRST learn about the VA Home Loan Program? (Mark only one)
If you are unsure, please indicate the first way you remember learning about the VA
Home Loan Program [RADIO BUTTONS. SINGLE RESPONSE.]
a. VA website [1]
b. VetSuccess.gov [2]
c. eBenefits.va.gov [3]
d. Social media websites (e.g., Facebook, Twitter, etc.)
e. Internet (excluding VA and social media sites)
f. Mail (from VA) [4]
g. VA phone number (800-827-1000) [5]
h. In person with a VA representative (e.g., VA medical center, VA Vet Center, Regional Office, etc.) [8]
i. Transition Assistance Program/Disabled Transition Assistance Program briefings [6]
j. Veterans Service Organizations (e.g., Amer. Legion, DAV, VFW, PVA, MOPH, etc.)
k. Information came with notification/ratings letter [16]
l. Other Veterans [13]
m. Friends or family [15]
n. Lender [17]
o. Real estate agent
p. Home builder
q. Other publications (e.g., Army Times, local newspapers, etc.) [18]
r. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF RESPONSE IS SELECTED, 50 CHARACTER MAX.] [97]
s. Don’t know or not sure [99]
2. What method(s) do you MOST FREQUENTLY use to obtain general information
about the VA Home Loan Program? (Mark all that apply) [CHECK BOXES.
MULTIPLE RESPONSE. CODE EACH RESPONSE AS 0 IF UNCHECKED OR
1 IF CHECKED]
a. VA website
b. VetSuccess.gov
c. eBenefits.va.gov
d. Social media websites (e.g., Facebook, Twitter, etc.)
e. Other websites (excluding VA or social media sites)
f. Phone
g. Mail
h. E-mail
i. In person with a VA representative (e.g., VA medical center, VA Vet
Center, Regional Office, etc.)
j. Veterans Service Organizations (e.g., Amer. Legion, DAV, VFW, PVA,
MOPH, etc.)
k. Disabled Veterans’ Outreach Program
l. Friends or family
m. Lender
n. Real estate agent
o. Home builder
p. Other publications (e.g., Army Times, local newspapers, etc.)
q. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF RESPONSE IS SELECTED, 50 CHARACTER MAX.]
r. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE]
s. None of the above [MUTUALLY EXCLUSIVE RESPONSE]

3. How were you informed about the application process for your most recent
certificate of eligibility (COE)? (Mark all that apply) [CHECK BOXES. MULTIPLE
RESPONSE. CODE EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF
CHECKED]
a. Transition Assistance Program/Disabled Transition Assistance Program
briefings
b. Phone
c. Mail
d. E-mail
e. Pamphlets/brochures
f. VA website
g. In person with a VA representative (e.g., VA medical center, VA Vet
Center, Regional Office, etc.)
h. Veterans Service Organizations (e.g., Amer. Legion, DAV, VFW, PVA,
MOPH, etc.)
i. Disabled Veterans’ Outreach Program
j. Other Veterans
k. Friends or family
l. Lender
m. Real estate agent
n. Home builder
o. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF RESPONSE IS SELECTED, 50 CHARACTER MAX.]
p. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE]
q. Did not receive information about application process [MUTUALLY
EXCLUSIVE RESPONSE]
4. How would you like to receive information from VA about applying for home loan
benefits? (Mark all that apply) [CHECK BOXES. MULTIPLE RESPONSE. CODE
EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF CHECKED]
a. Phone
b. Mail
c. E-mail
d. VA website
e. Social media websites (e.g., Facebook, Twitter, etc.)
f. In person at a Regional Office
g. Veterans Service Organizations (e.g., Amer. Legion, DAV, VFW, PVA,
MOPH, etc.)
h. Lender
i. Real estate agent
j. Home builder
5. Prior to receiving this survey, which of the following home loan benefits were you
aware of? (Mark all that apply) [CHECK BOXES. MULTIPLE RESPONSE.
CODE EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF CHECKED]
a. Purchase of a new home
b. Home equity refinance (cash-out)
c. Streamlined refinance (interest-rate reduction)
d. Funding fee waiver for eligible disabled veterans
e. No down payment
f. Loan default/foreclosure avoidance assistance
g. None of the above [MUTUALLY EXCLUSIVE RESPONSE]
6. To the best of your knowledge, was all of the information that VA provided to you
about home loan benefit programs correct? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
The following question asks you to rate various aspects of your experience with VA
home loans using a scale of 1 to 10 where 1 is Unacceptable, 10 is Outstanding, and 5
is Average. [SHOW ON SAME PAGE AS THE QUESTION THAT FOLLOWS]

7. When thinking about your most frequently used methods of communication,
please rate your experience in obtaining information about your certificate of
eligibility (COE) application on the following items: (Mark only one per row)
[SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND
ATTRIBUTES/RESPONSES IN ROWS (SEE JDPA CONVENTIONS
DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY
SPACED RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN ROWS.
SINGLE RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES EXCEPT
THE LAST ONE.]
a. Ease of accessing information [ALLOW N/A RESPONSE] [1-10, N/A=99]
b. Availability of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Clarity of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
d. Usefulness of information [ALLOW N/A RESPONSE] [1-10, N/A=99]
e. Frequency of information provided by VA [ALLOW N/A RESPONSE] [1-10, N/A=99]
f. Overall rating of information [1-10]
Contact with VA
8. During the past 6 months, did you contact anyone from VA about the home loan
process? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
(Ask Q9-15 if Q8 is Yes, otherwise go to Q16)
9. Which of the following best describes the reason for your most recent contact?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Resolve a problem [1]
b. Ask a question [2]
c. Request a change to your records/provide information [3]
10. Can you briefly describe the nature of your most recent contact? (Mark all that
apply) [CHECK BOXES. MULTIPLE RESPONSE. CODE EACH RESPONSE
AS 0 IF UNCHECKED OR 1 IF CHECKED.]
a. Report a problem with your realtor
b. Report a problem with your broker
c. Report a problem with your lender
d. Report a problem with your home builder
e. Report a problem with your appraiser
f. Report a problem with the appraisal process
g. Report a problem with a VA customer service representative
h. Ask a general question
i. Obtain information about submitting/re-opening a claim
j. Submit a new application for COE
k. Check on the status of a COE application

l. Appeal an eligibility decision
m. Question or problem about a pending COE application
n. Question or problem about an eligibility decision
o. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF RESPONSE IS SELECTED, 50 CHARACTER MAX.]

11. Thinking about your most recent contact, how did you contact VA?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Phone [1]
b. Website [6]
c. E-mail [7]
d. Mail [9]
e. In person [3]
f. Online Chat

(Ask Q12 if Q11 is Phone, otherwise go to Q13)
12. Which phone number did you use to contact VA? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE.]
a. VA toll-free number (1-800-827-1000) [1]
b. VA Home Loan Guaranty number (1-877-827-3702) [2]
c. VA Regional Loan Center [3]
d. Other (Specify) [97] _____________
e. Don’t know or not sure [99]
13. Was your most recent issue resolved? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
(Ask Q14 if Q13 is No, otherwise go to Q15)
14. Why wasn’t your most recent issue resolved? [CHECK BOXES. MULTIPLE
RESPONSE. CODE EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF
CHECKED.]
a. Did not receive all of the information required
b. Received incorrect information
c. Was referred to the incorrect office/person
d. Waiting for follow-up from VA
e. Other (Specify) ____________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.]
f. Don't know or not sure [MUTUALLY EXCLUSIVE RESPONSE]
15. Thinking of your most recent contact with VA, how would you rate your overall
customer service experience with VA or VA representatives using a scale of 1
to 10 where 1 is Unacceptable, 10 is Outstanding, and 5 is Average? [SHOW
RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND SINGLE
ROW (SEE JDPA CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC
DETAILS OF LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS,
SINGLE RESPONSE PER ROW.][1-10]

Benefit Eligibility and Application Process
Please answer the following questions based on your most recent home-buying
experience. [SHOW ON THE SAME PAGE AS THE FOLLOWING QUESTION]
16. At the time your loan closed, were you a(n): (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Discharged Veteran of the U.S. Armed Forces [1]
b. Active duty service member in the U.S. Armed Forces [2]
c. Surviving spouse [3]
d. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.][97]

17. What method did you use to apply for your COE (i.e., a form that indicated you were eligible for a VA home loan, e.g., VA Form 26-1880, VA Form 26-1870, etc.)? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Obtained on my own through eBenefits.va.gov
b. Obtained through my lender [1]
c. Through the mail from VA [2]
d. In person at a Regional Loan Center [3]
e. VA website [4]
f. Don’t know or not sure [99]
18. After your application was submitted for a COE, did VA contact you or your
lender to request additional information for your application (e.g., character of
service, length of service documents, etc.)? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
19. From the time your COE application was submitted, how long did it take to
receive your COE? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Immediately [1]
b. Less than 3 business days [2]
c. 3 to 5 business days [3]
d. More than 5 business days
e. Don’t know or not sure [99]


The following question asks you to rate various aspects of your experience with VA
home loans using a scale of 1 to 10, where 1 is Unacceptable, 10 is Outstanding, and 5
is Average. [SHOW ON SAME PAGE AS THE QUESTION THAT FOLLOWS]
20. Please rate your experience with the VA COE application process on the
following items: (Mark only one per row) [SHOW RESPONSES IN GRID WITH
10-POINT SCALE IN COLUMNS AND ATTRIBUTES/RESPONSES IN ROWS
(SEE JDPA CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS
OF LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS,
ALTERNATE SHADES IN ROWS. SINGLE RESPONSE PER ROW.
RANDOMIZE ALL ATTRIBUTES EXCEPT THE LAST ONE.]
a. Ease of completing the application [ALLOW N/A RESPONSE] [1-10, N/A=99]
b. Timeliness of receiving COE [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Flexibility of application methods [ALLOW N/A RESPONSE] [1-10, N/A=99]
d. Overall rating of application process [1-10]
Benefit Entitlement
As a reminder, your responses will be kept completely confidential and will not affect
any current or future benefits you may receive. [SHOW ON THE SAME PAGE AS THE
QUESTION THAT FOLLOWS]
21. When you obtained your current mortgage, was it to…? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Purchase a new or existing home [1]
b. Refinance an existing loan [2]
(Ask Q22 if Q21 is refinance, otherwise go to Q23)
22. What type of loan refinancing did you obtain? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE.]
a. Streamlined (interest-rate reduction) [1]
b. Home equity (cash-out) [2]
c. Don’t know or not sure [99]
23. Did you make a down payment on your VA home loan? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
(Ask Q24 if Q23 is yes, otherwise go to Q25)
24. Why did you make a down payment on your VA home loan? (Mark all that apply)
[CHECK BOXES. MULTIPLE RESPONSE. CODE EACH RESPONSE AS 0 IF
UNCHECKED OR 1 IF CHECKED]
a. Home price was too high
b. Appraisal value was lower than purchase price

c. Low credit score
d. Lender requirement
e. Desire to establish equity
f. Lower monthly payment
g. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE]

25. Did you pay a funding fee for your VA home loan? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
The following question asks you to rate various aspects of your experience with VA
home loans using a scale of 1 to 10 where 1 is Unacceptable, 10 is Outstanding, and 5
is Average. [SHOW ON SAME PAGE AS THE QUESTION THAT FOLLOWS]
26. Please rate your home loan benefit on the following items: (Mark only one per
row) [SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS
AND ATTRIBUTES/RESPONSES IN ROWS (SEE JDPA CONVENTIONS
DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY
SPACED RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN ROWS.
SINGLE RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES EXCEPT
THE LAST ONE.]
a. Amount of guaranty [ALLOW N/A RESPONSE] [1-10, N/A=99]
b. Timeliness of receiving benefits [ALLOW N/A RESPONSE] [1-10,
N/A=99]
c. Overall rating of benefit [1-10]

Overall Application Experience
27. Thinking about ALL aspects of your experience in obtaining a VA home loan,
please rate the VA Home Loan Program overall, using a scale of 1 to 10 where
1 is Unacceptable, 10 is Outstanding, and 5 is Average. (Mark only one)
[SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND
SINGLE ROW (SEE JDPA CONVENTIONS DOCUMENT PG. 1 FOR
SPECIFIC DETAILS OF LAYOUT). EVENLY SPACED RADIO
BUTTONS/COLUMNS, SINGLE RESPONSE PER ROW.] [1-10]

28. Based on your experience with the VA Home Loan Program overall, how likely
are you to recommend it to other Veterans? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Definitely will not [1]
b. Probably will not [2]
c. Probably will [3]
d. Definitely will [4]

Overall Experience with VA
29. Taking into consideration all of the non-medical benefits (e.g., education,
compensation, pension, home loan guaranty, vocational rehabilitation and
employment, insurance, etc.) you have applied for or currently receive, please
rate your experience with VA overall, using a scale of 1 to 10 where 1 is
Unacceptable, 10 is Outstanding, and 5 is Average. (Mark only one) [SHOW
RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND SINGLE
ROW (SEE JDPA CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC
DETAILS OF LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS,
SINGLE RESPONSE PER ROW.] [1-10]

30. How likely are you to inform other Veterans about your experiences with VA
benefits or services? (Mark only one) [RADIO BUTTONS. SINGLE
RESPONSE.]
a. Definitely will not [1]
b. Probably will not [2]
c. Probably will [3]
d. Definitely will [4]
Loan Process
31. Did any of the following people discourage you from using your VA home loan
benefit? (Mark all that apply) [CHECK BOXES. MULTIPLE RESPONSE.
CODE EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF CHECKED]
a. Realtor
b. Lender
c. Broker
d. Builder affiliated lender
e. Home builder
f. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.]
g. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE]
h. I was not discouraged [MUTUALLY EXCLUSIVE RESPONSE]
i. Not applicable [MUTUALLY EXCLUSIVE RESPONSE]
(Ask Q32-34 if Q31 is realtor, lender, broker, builder affiliated lender, home builder, or Other; otherwise go to Q35)
32. Why did they discourage you from using your VA home loan benefit? (Mark all
that apply) [CHECK BOXES. MULTIPLE RESPONSE. CODE EACH
RESPONSE AS 0 IF UNCHECKED OR 1 IF CHECKED]
a. Would be easier or cheaper to obtain a conventional or FHA loan
b. Process for obtaining a VA home loan would take too long
c. Seller would not sell home to VA-finance borrower

d. The VA eligibility process would take too long or is too complex
e. Home did not meet VA property requirements
f. Other (Specify) ___________________ [TEXT BOX, FORCE TEXT IF
RESPONSE IS SELECTED, 50 CHARACTER MAX.]
g. Don’t know or not sure
33. Did they discourage you from using your VA home loan benefit on your…? (Mark
only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Most recent home loan [1]
b. Previous home loan [2]
c. Don’t know or not sure [99]
34. When you were discouraged from using your VA home loan benefit, was the loan you were applying for intended to…? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Purchase a new or existing home [1]
b. Refinance an existing loan [2]
c. Don’t know or not sure [99]
35. Did you receive any of the following during the home loan guaranty application
process? (Mark all that apply) [CHECK BOXES. MULTIPLE RESPONSE.
CODE EACH RESPONSE AS 0 IF UNCHECKED OR 1 IF CHECKED]
a. Copy of the appraisal
b. Notice of Value document from lender
c. Copy of your VA COE
d. None [MUTUALLY EXCLUSIVE RESPONSE]
e. Don’t know or not sure [MUTUALLY EXCLUSIVE RESPONSE]

(Ask Q36 if received a copy of the appraisal in Q35, otherwise go to Q37)
36. Relative to your closing date, when did you receive a copy of your appraisal?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Prior to the closing date [1]
b. Same day as the closing date [2]
c. After the closing date [3]
d. Don’t know or not sure [99]
(Ask Q37 if received a Notice of Value Document in Q35, otherwise go to Q38)
37. Relative to your closing date, when did you receive a Notice of Value document
(e.g., an estimate of the home’s reasonable value) from your lender? (Mark
only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Prior to the closing date [1]
b. Same day as the closing date [2]
c. After the closing date [3]
d. Don’t know or not sure [99]


38. How many times have you obtained a loan using the VA Home Loan Program?
(Open Capture)
a. Number of times (0-99)_______________ [NUMERIC TEXT BOX.
ACCEPTABLE RANGE 0-99]
b. Don’t know or not sure [CHECK BOX. MUTUALLY EXCLUSIVE
RESPONSE.] [CODE AS 0 IF UNCHECKED AND 1 IF CHECKED]
39. Please rate your experience with your lender regarding the home loan application
and approval process, using a scale of 1 to 10 where 1 is Unacceptable, 10 is
Outstanding, and 5 is Average. (Mark only one per row) [SHOW RESPONSES
IN GRID WITH 10-POINT SCALE IN COLUMNS AND
ATTRIBUTES/RESPONSES IN ROWS (SEE JDPA CONVENTIONS
DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY
SPACED RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN ROWS.
SINGLE RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES EXCEPT
THE LAST ONE.]
a. Variety of loan options to choose from [ALLOW N/A RESPONSE] [1-10, N/A=99]
b. Competitiveness of interest rates offered [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Ease of completing loan application [ALLOW N/A RESPONSE] [1-10, N/A=99]
d. Length of time from loan application to final approval [ALLOW N/A RESPONSE] [1-10, N/A=99]
e. Reasonableness of the amount of supporting documentation required [ALLOW N/A RESPONSE] [1-10, N/A=99]
f. Reasonableness of all fees paid at application [ALLOW N/A RESPONSE] [1-10, N/A=99]
g. Overall rating of application/approval process [1-10]
40. Please rate your experience with your loan officer/representative regarding the
home loan/refinance process on the following items, using a scale of 1 to 10
where 1 is Unacceptable, 10 is Outstanding, and 5 is Average. (Mark only one
per row) [SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN
COLUMNS AND ATTRIBUTES/RESPONSES IN ROWS (SEE JDPA
CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT).
EVENLY SPACED RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN
ROWS. SINGLE RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES
EXCEPT THE LAST ONE.]
a. Knowledge of loan officer/representative [ALLOW N/A RESPONSE] [1-10, N/A=99]
b. Courtesy of loan officer/representative [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Representative’s responsiveness to questions [ALLOW N/A RESPONSE] [1-10, N/A=99]
d. Representative’s concern for your needs [ALLOW N/A RESPONSE] [1-10, N/A=99]
e. Clarity of explanation of loan options [ALLOW N/A RESPONSE] [1-10, N/A=99]
f. Overall rating of loan officer/representative [1-10]
41. Please rate your experience with your home loan closing on the following items, using a scale of 1 to 10 where 1 is Unacceptable, 10 is Outstanding, and 5 is Average. (Mark only one per row) [SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND ATTRIBUTES/RESPONSES IN ROWS (SEE JDPA CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN ROWS. SINGLE RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES EXCEPT THE LAST ONE.]
a. Ease of understanding closing documents [ALLOW N/A RESPONSE] [1-10, N/A=99]
b. Convenience of closing [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Length of time from final loan approval to closing [ALLOW N/A RESPONSE] [1-10, N/A=99]
d. Reasonableness of closing costs [ALLOW N/A RESPONSE] [1-10, N/A=99]
e. Overall rating of home loan closing [1-10]

42. Did you use the services of a realtor/real estate agent when buying/refinancing your home loan? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
(Ask Q43 if used services in Q42, otherwise go to Q44)
43. Please rate your experience with your realtor/real estate agent regarding the
home loan application process on the following items, using a scale of 1 to 10
where 1 is Unacceptable, 10 is Outstanding, and 5 is Average. (Mark only one
per row) [SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN
COLUMNS AND ATTRIBUTES/RESPONSES IN ROWS (SEE JDPA
CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT).
EVENLY SPACED RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN
ROWS. SINGLE RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES
EXCEPT THE LAST ONE.]
a. Knowledge of realtor/ real estate agent [ALLOW N/A RESPONSE] [1-10, N/A=99]
b. Courtesy of realtor/real estate agent [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Realtor/ real estate agent’s responsiveness to questions [ALLOW N/A RESPONSE]
[1-10, N/A=99]
d. Realtor/ real estate agent’s concern for your needs [ALLOW N/A RESPONSE] [1-10,
N/A=99]
e. Overall rating of realtor/ real estate agent [1-10]
44. Did you use the services of a home builder when buying/refinancing your home
loan? (Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]

b. No [0]
(Ask Q45 if used services in Q44, otherwise go to Q46)
45. Please rate your experience with your home builder regarding the home loan
application process on the following items, using a scale of 1 to 10 where 1 is
Unacceptable, 10 is Outstanding, and 5 is Average. (Mark only one per row)
[SHOW RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND
ATTRIBUTES/RESPONSES IN ROWS (SEE JDPA CONVENTIONS
DOCUMENT PG. 1 FOR SPECIFIC DETAILS OF LAYOUT). EVENLY
SPACED RADIO BUTTONS/COLUMNS, ALTERNATE SHADES IN ROWS.
SINGLE RESPONSE PER ROW. RANDOMIZE ALL ATTRIBUTES EXCEPT
THE LAST ONE.]
a. Knowledge of home builder [ALLOW N/A RESPONSE] [1-10, N/A=99]
b. Courtesy of home builder [ALLOW N/A RESPONSE] [1-10, N/A=99]
c. Home builder’s responsiveness to questions [ALLOW N/A RESPONSE] [1-10,
N/A=99]
d. Home builder’s concern for your needs [ALLOW N/A RESPONSE] [1-10, N/A=99]
e. Overall rating of home builder [1-10]

About You
46. Prior to completing the VA home loan application process, how much did you
understand the VA Home Loan Program? (Mark only one)
a. Completely
b. Mostly
c. Somewhat
d. Only a little
e. Not at all
47. After completing the VA home loan application process, how much do you
understand the VA Home Loan Program? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Completely [5]
b. Mostly [4]
c. Somewhat [3]
d. Only a little [2]
e. Not at all [1]
48. Was this your first home loan of any type? (Mark only one) [RADIO BUTTONS.
SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
49. For this most recent loan, did you consider another type of home loan?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]

a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
(Ask Q50 if considered another type of home loan in Q49, otherwise go to Q51)
50. What other type(s) of home loans did you consider? (Mark all that apply)
[CHECK BOXES. MULTIPLE RESPONSE. CODE EACH RESPONSE AS 0
IF UNCHECKED OR 1 IF CHECKED]
a. Conventional
b. Federal Housing Administration
c. Other
51. What is the primary reason you applied for a VA home loan, as opposed to a
Federal Housing Administration loan or other type of loan? (Mark only one)
[RADIO BUTTONS. SINGLE RESPONSE.]
a. The VA loan program is offered only to US Veterans [1]
b. No down payment required [2]
c. Convenience [3]
d. No mortgage insurance required [4]
e. Loan more likely to be approved [5]
f. VA's assistance to avoid foreclosure [6]
g. Previous experience with the VA loan program [7]
h. Funding fee exemption for service-connected disability
i. Other [97]
52. Have you ever obtained either a conventional or a Federal Housing
Administration home loan?
(Mark only one) [RADIO BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]

(Ask Q53 if Yes in Q52, otherwise go to Q54)
53. Thinking about ALL aspects of your experience in obtaining your last
conventional or Federal Housing Administration loan (including the application
process, eligibility requirements and loan amount, loan information, contacting
your lender, etc.), please rate your loan experience overall, using a scale of 1
to 10 where 1 is Unacceptable, 10 is Outstanding, and 5 is Average. [SHOW
RESPONSES IN GRID WITH 10-POINT SCALE IN COLUMNS AND SINGLE
ROW (SEE JDPA CONVENTIONS DOCUMENT PG. 1 FOR SPECIFIC
DETAILS OF LAYOUT). EVENLY SPACED RADIO BUTTONS/COLUMNS,
SINGLE RESPONSE PER ROW.] [1-10]
54. If you had not received a VA guaranteed home loan, would you have been able
to purchase your home at this time? [RADIO BUTTONS. SINGLE
RESPONSE.]

a. Yes [1]
b. No [0]
c. Don’t know or not sure [99]
55. Do you have any other comments or concerns about your experience? (Open
Capture) [OPEN-END. TEXT BOX. 1000 CHARACTER MAX. ALLOW NO
COMMENT, MUTUALLY EXCLUSIVE CHECK BOX. CODE NO COMMENT AS
0 IF UNCHECKED AND 1 IF CHECKED.]
____________________________________________________
As a reminder, your responses will be kept completely confidential and your e-mail address will not be sent to VA with any responses on this survey. [SHOW ON THE SAME PAGE AS THE QUESTION THAT FOLLOWS]
56. Would you like to provide an e-mail address so VA can contact you with general
information about VA benefits and services? (Mark only one) [RADIO
BUTTONS. SINGLE RESPONSE.]
a. Yes [1]
b. No [0]
c. I do not have an e-mail address [96]
d. Prefer not to answer [98]
(Ask Q57 if Yes in Q56)
57. Please enter your preferred e-mail address where you would like to be contacted:
(Open Capture)
a. E-mail: [OPEN CAPTURE. 100 CHARACTER MAX.]


Appendix G
List of Acronyms
AAPOR     American Association for Public Opinion Research
ANOVA     Analysis of Variance
BAS       Benefits Assistance Service
BPA       Blanket Purchase Agreement
BRE       Business Reply Envelope
CAPS      Centralized Account Processing System
COR       Contracting Officer’s Representative
DTA       Data Transfer Agreement
EDIPI     Electronic Data Interchange Personal Identifier
EDX       Enterprise Data Exchange
FAR       Federal Acquisition Regulations
FY        Fiscal Year
GPO       Government Printing Office
ICR       Information Collection Request
JDP       J.D. Power
LGY       Loan Guaranty Service
LWO       Letter Work Order
MAR       Missing At Random
MCAR      Missing Completely At Random
MCMC      Markov chain Monte Carlo algorithm
MNAR      Missing Not At Random
NPC       NPC, Inc. Integrated Print and Digital Solutions
OIF       Operation Iraqi Freedom
OEF       Operation Enduring Freedom
OMB       Office of Management and Budget
OSAT      Overall Satisfaction Index
RO        Regional Office
SSN       Social Security Number
US        United States
USA       United States of America
VA        Department of Veterans Affairs
VADIR     VA DoD Identity Repository
VAPSD     VA Publications Services Division
VBA       Veterans Benefits Administration
VOV       Voice of the Veteran
VR&E      Vocational Rehabilitation and Employment Service
VSO       Veterans Service Organizations
