Supporting Statement B (7-9-2014)

Military Compensation and Retirement Survey

OMB: 3260-0001

B. Collections of information employing statistical methods

1. Description of universe
Universe. The universe for the study is the population of current U.S. military retirees
(the target population). The sampling frame comprises all military retirees with email
addresses on file with the Defense Manpower Data Center (DMDC), drawn from its
latest update to the Defense Enrollment Eligibility Reporting System (DEERS). Each
record within the sampling frame will be stratified by the following four variables
considered important in the exploration and development of compensation options for
the Military Compensation and Retirement Modernization Commission (MCRMC): 1)
current age group (three groups: under 55 years, 55–64 years, 65 years and older), 2) rank
group before retirement (two groups: officer, enlisted), 3) duty status before retirement
(two groups: active duty, guard or reserve), and 4) current family status (four groups:
single with children, single without children, married with children, married without children).
The complete cross classification of these variables results in 48 strata for the sampling
frame. As of February 2014, the target population was made up of about 2.1 million
military retirees. Because the mode of data collection will be a web survey where the
military retirees will be notified through email to participate in the survey, the sampling
frame will be limited to the approximately 1.0 million retirees with email addresses on file
with DMDC. Tables B.1 and B.2 present the number of cases and their
distribution/percentages in the target population and in the sampling frame limited to
those with email addresses (about one-half of the target population) broken down by
sampling strata. A preliminary examination of the distribution of retirees in the sampling
frame versus that in the target population showed small differences in some sampling
strata, as shown in Table B.1 (frequencies) and Table B.2 (percentages). Such differences
will be corrected later through post-stratification weighting.
Please see Tables B.1 and B.2.
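The cross classification described above can be sketched as follows. This is purely an illustrative enumeration of the 48 strata; the group labels are paraphrased and are not field values from the DMDC frame.

```python
from itertools import product

# Stratification variables described above (labels paraphrased).
age_groups = ["<55", "55-64", "65+"]
rank_groups = ["officer", "enlisted"]
duty_statuses = ["active duty", "guard/reserve"]
family_statuses = ["single w/ children", "single w/o children",
                   "married w/ children", "married w/o children"]

# Full cross classification: 3 x 2 x 2 x 4 = 48 sampling strata.
strata = list(product(age_groups, rank_groups, duty_statuses, family_statuses))
print(len(strata))  # 48
```

Each element of `strata` is one stratum, e.g. `("<55", "officer", "active duty", "single w/ children")`.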
Response rate. Experience has shown that the population of military personnel,
including reserve personnel and retirees, is a difficult one to survey. The response rate for
surveying this population varies depending on the survey topics and the modes of data
collection.1 For planning purposes, MCRMC is using a 25 percent response rate (survey
completion rate) of the retiree sampling frame contacted. This rate is based on last year’s
customer satisfaction survey conducted by the Defense Finance and Accounting Services
(DFAS). While it is likely that survey response rates will vary across the 48 strata in the
sampling frame, there are no response data available from other surveys to use as a basis
for making these estimates at the stratum level.2

1 In a pilot survey on civilian health insurance conducted by RAND (2007), 59.7 percent of the sampled military
retirees responded to the telephone survey (advance notification letters followed by CATI). Similarly, in a study on
the views of the American public and U.S. foreign affairs experts on China policies conducted by the Pew Research
Center (2012), only 25 percent of sampled military retirees responded; this survey relied on a combination of web
and telephone surveys (mailed advance letters, emails, and phone follow-ups). The quarterly Health Care Survey of
DOD Beneficiaries, sponsored by TRICARE Management Activity, has response rates of around 13.5 percent to its
web surveys (mailed advance letters with the URL and password for the surveys).
2. Statistical Methodology
Sample selection. As previously noted, the sampling frame consists of all military
retirees with email addresses on file with DMDC. While statistical methods governing
sample selection become moot in this context, the design of the sampling frame from an
analysis and reporting standpoint remains an important step. As described in Section B.1,
we will implement a stratification of the sampling frame based on age group (current),
rank group (before retirement), duty status (before retirement), and current family status
resulting in 48 strata. The size of the target population relative to the resulting completed
surveys for each of the 48 strata will require that we statistically weight each completed
survey record. The statistical methods used to weight the completed survey records are
discussed elsewhere in this section.
In addition to the variables used to stratify the sampling frame, we may use three other
variables for analysis and reporting purposes: 1) service branch upon retirement (Army,
Navy, Marine Corps, Air Force, and Coast Guard); 2) gender (male, female); and 3)
current retirement type (disability retired, non-disability retired).
Estimation, precision, and sample size. The reporting domain for estimation and data
analysis will be individual sampling stratum/cell. In most cases the MCRMC survey asks
the respondent to indicate the extent to which they agree with a statement based on a
sliding scale scored from 0 to 100. As a result, the sample size for each sampling stratum
is determined based on a consistent precision requirement across each of the 48 sampling
cells equal to a margin of error of five score points in the 95 percent two-sided
confidence interval. To be able to calculate the sample size for each sampling stratum, we
utilized information on the estimates of population variances for a subset of key survey
items from a past survey collecting similar information. Because this survey was based
on a relatively small sample in many cases, we were not able to produce reliable
population variance estimates at the stratum level for all strata, and in such cases, we
used the average of the population variance across the strata overall or for a broader
subgroup to compute the stratum sample size requirements.
Note that because we intend to survey all military retirees with email addresses on file
with DMDC (a sampling frame of approximately one million records), it is likely that we
will achieve sufficient counts of completed surveys in each of the 48 strata to satisfy the
established criteria for statistical precision (five score points margin of error in the 95
percent two-sided confidence interval).
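As a rough illustration of the precision requirement above, the completed-survey count needed in a stratum can be computed from an assumed score variance. The standard deviation used below is hypothetical, not an estimate from the past survey referenced in the text.

```python
import math

def required_sample_size(sigma, moe=5.0, z=1.96, population=None):
    """Sample size needed so a 95% two-sided confidence interval for a
    mean 0-100 score has the given margin of error (in score points)."""
    n0 = (z * sigma / moe) ** 2
    if population is not None:
        # Finite population correction for a small stratum.
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

# Hypothetical stratum with an assumed score SD of 30 points:
print(required_sample_size(sigma=30))  # 139 completed surveys
```

With roughly one million frame records and a planning response rate of 25 percent, counts of this order per stratum are plausible, which is the point made in the paragraph above.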

2 DMDC maintains extensive data on survey response rates by subgroups for the active and selected reserves, but
has no experience surveying military retirees. DFAS does conduct customer satisfaction surveys to include military
retirees, but does not retain any data on response rates for subgroups. For last year's customer satisfaction survey,
DFAS estimated a 22.6 percent response rate for those military retirees contacted by email. Because of the nature of
the content in its survey, MCRMC increased the average response rate slightly to 25 percent for military retirees.

3. Statistical Reliability
Method to maximize response rate. MCRMC is taking extensive steps to publicize its
survey prior to launch. For example, we conducted an interview in the May 13, 2014
editions of the Army and Navy Times, informing their subscribers, who include
retirees, of the survey.3 We will also advertise the survey through Military and Veterans
Service Organizations, such as the Military Officers Association of America and the
Veterans of Foreign Wars, since they have existing and effective distribution to retirees.
In addition, we will conduct a media roundtable at the end of May in conjunction with the
release of the Commission's interim report and include the survey roll-out in the course of
the discussion. Finally, reminder emails will be sent after one week to survey nonrespondents, then every two weeks thereafter throughout the 6-8 week duration of the
survey.
Dealing with non-response. When the response rate falls below the cut-off rate of 80
percent, the rate required by OMB, we will perform a nonresponse bias analysis. The
goals of this analysis are to evaluate whether there is a potential nonresponse bias when
survey estimation is computed based on respondents only (without any nonresponse
adjustment), and to assess variables appropriate for nonresponse adjustment procedures.
In the nonresponse bias analyses, we will look at the following:


 Response rates by sampling frame characteristics; for example, differences between the response rates by age group may indicate a potential nonresponse bias because the sample composition may no longer be similar to the original (full) sample with regard to age group.

 Distribution of sampling frame characteristics of the full sample compared to the distribution of characteristics of respondents only; for example, a significant difference in the proportion of active duty retirees in the full sample from that among respondents may also be an indication of nonresponse bias.

 Distribution of sampling frame characteristics of the full sample compared to the distribution of characteristics of respondents only after applying the nonresponse weight adjustments and the post-stratification that follows; this, in turn, will demonstrate that the potential for nonresponse bias observed above has been eliminated to the extent possible by the adjustment process. We will also examine the degree to which the nonresponse and post-stratification adjustments affect the survey estimates.

For these analyses, we will utilize the sampling frame and other auxiliary variables
available for both respondents and non-respondents; for example, we will use both
sampling stratification and sorting variables.
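The first of the checks listed above can be sketched as a simple comparison of response rates by a frame characteristic. The data below are invented for illustration, and the field name `age` stands in for any frame variable.

```python
from collections import Counter

def response_rates_by_group(frame, respondent_ids, key):
    """Response rate within each level of a sampling frame characteristic."""
    frame_counts = Counter(rec[key] for rec in frame.values())
    resp_counts = Counter(frame[i][key] for i in respondent_ids)
    return {g: resp_counts.get(g, 0) / n for g, n in frame_counts.items()}

# Invented example: 100 frame records keyed by ID, with an age-group field.
frame = {i: {"age": "<55" if i < 40 else "55-64" if i < 80 else "65+"}
         for i in range(100)}
respondent_ids = list(range(0, 10)) + list(range(40, 60)) + list(range(80, 90))

rates = response_rates_by_group(frame, respondent_ids, "age")
print(rates)  # {'<55': 0.25, '55-64': 0.5, '65+': 0.5}
```

A markedly lower rate in one group, as for the under-55 group here, is the kind of pattern that would flag potential nonresponse bias.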

3 Please see http://www.armytimes.com/article/20140513/BENEFITS/305130045/Panel-prepares-launch-pay-benefits-survey

Weighting. Retirees in this study are selected under a sampling design that implements
stratification and oversampling of certain groups, resulting in unequal probabilities of
selection and, consequently, differential sampling weights across sample members.
For this reason, the data analysis should be weighted. The basic sampling weights can be
used in the analysis when all sampled members respond to the survey, so that analyses with
these weights provide design-unbiased estimates. When survey nonresponse exists, as
in this study, analyses based on respondents using only the basic sampling weights
may no longer produce unbiased estimates. Using the nonresponse bias analyses
discussed above, we will determine whether and how best to adjust the weights for
nonresponse. The nonresponse-adjusted weight is designed to account for differences in
the propensity to respond to a survey, as well as potential differences in survey outcomes
between respondents and non-respondents. Data analyses then can be conducted using the
nonresponse-adjusted weights.
The method for nonresponse adjustment depends on the response mechanism underlying
the study population. When the response mechanism is assumed to be missing at random,
nonresponse adjustments are typically implemented independently within weighting
classes/cells, under the assumption that the probability of response is homogeneous
among units in the same class and that the survey variable(s) are homogeneous within
the class. The weighting cells will usually be constructed based on characteristics
directly or indirectly related to the survey variables; it is reasonable to use the same
variables employed during the sampling stage to construct the weighting cells.
To calculate the nonresponse-adjusted weights, within each cell the basic sampling
weight will be multiplied by the inverse of the response rate, which serves as the
nonresponse adjustment factor for the cell.
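A minimal sketch of that cell-level adjustment, using a weighted response rate within each cell; the record layout (`cell`, `weight`, `responded`) is illustrative, not the actual file format.

```python
from collections import defaultdict

def nonresponse_adjust(records):
    """Within each weighting cell, multiply each respondent's basic
    sampling weight by the inverse of the cell's weighted response
    rate; nonrespondents receive no adjusted weight."""
    total_w = defaultdict(float)   # sum of basic weights, all sampled
    resp_w = defaultdict(float)    # sum of basic weights, respondents only
    for r in records:
        total_w[r["cell"]] += r["weight"]
        if r["responded"]:
            resp_w[r["cell"]] += r["weight"]
    adjusted = []
    for r in records:
        if r["responded"]:
            factor = total_w[r["cell"]] / resp_w[r["cell"]]  # 1 / response rate
            adjusted.append({**r, "weight": r["weight"] * factor})
    return adjusted

# Toy cell: 4 sampled, 2 responded -> factor 2, weights double, and the
# adjusted weights still sum to the original cell total of 4.0.
records = [{"cell": "A", "weight": 1.0, "responded": i < 2} for i in range(4)]
print(sum(r["weight"] for r in nonresponse_adjust(records)))  # 4.0
```

Note that the adjustment preserves the total weight in each cell, which is what lets the respondents stand in for the full sampled cell.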
Post-stratification. To address possible coverage bias arising because email addresses
are unavailable for some retirees, so that samples are drawn only from retirees with email
addresses, we will perform post-stratification. Post-stratification is a ratio adjustment
technique that forms a mutually exclusive set of post-strata (or post-stratification cells)
and adjusts the weights within each post-stratum so that weighted counts equal control
totals; in this case, the control totals will be the numbers of retirees in the target
population, including those without email addresses. These control totals are important
because the post-stratification, as a ratio-based adjustment, will force the weighted
distribution to reflect the population of retirees overall and will reduce the variability of
survey estimates. As a result, once the nonresponse adjustments and the post-stratification
are completed, the distribution of the nonresponse-adjusted respondent data
will match the population profile as obtained from the target population database,
reducing the potential for nonresponse bias in the final study results.
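The ratio adjustment described here can be sketched as follows. The control totals and weights are invented, and the single `stratum` key stands in for the full 48-cell cross classification.

```python
from collections import defaultdict

def poststratify(respondents, control_totals):
    """Ratio-adjust weights so that weighted counts within each
    post-stratum equal the control total for that stratum (here, the
    target-population count, including retirees without email)."""
    weighted = defaultdict(float)
    for r in respondents:
        weighted[r["stratum"]] += r["weight"]
    out = []
    for r in respondents:
        factor = control_totals[r["stratum"]] / weighted[r["stratum"]]
        out.append({**r, "weight": r["weight"] * factor})
    return out

# Invented example: two respondents carrying weight 10 each in a stratum
# whose target-population control total is 100 -> each weight becomes 50.
resp = [{"stratum": "A", "weight": 10.0}, {"stratum": "A", "weight": 10.0}]
out = poststratify(resp, {"A": 100.0})
print([r["weight"] for r in out])  # [50.0, 50.0]
```

After this step the weighted stratum counts reproduce the control totals exactly, which is the "force the weighted distribution" behavior described above.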


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
In addition to internal testing within MCRMC staff, we assembled a small group (5-9) of
retired military volunteers to beta test the survey interface. Testing evaluated the
functionality of the software, the intuitiveness of the interface, and the understanding of the
question items. These volunteers performed this activity in a remote environment (home
or work), unprompted by any monitor or facilitator.
Each volunteer received an email invitation to participate and was then directed to a
website to exercise the survey interface and a second website to capture feedback on the
system.
For each tester, we tracked:
 The time taken to advance through each screen; and,
 The total time taken to complete the application.
The testers were asked to answer the following questions with specific feedback:
CONTENT
 Were the questions easy to understand?
o If not, which question(s) was/were confusing? Why?
 Did you use the rollovers or help text?
o If so, did you find them clear and helpful?
DESIGN
 Was the design (layout and images) of the survey engaging?
 Did the pages, images, and text load at an appropriate speed?
 Did you have trouble with any aspect of the experience?
GENERAL
 We are trying to understand how retirees value different aspects of their retirement
pay and benefits. Do you believe that the questions we’ve asked are reasonable and
relevant to this goal?
 How do you feel about the length of the survey?
 Do you have any other feedback for us?
With this feedback, MCRMC is working with the contractor to make the necessary
adjustments.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Amang Sukasih
Senior Statistician
Mathematica Policy Research
1100 1st Street, NE, 12th Floor
Washington, DC 20002-4221
Phone: 202-484-3286
