Supporting Statement – Part B
Collection of Information Employing Statistical Methods
Medicare.gov/CMS.gov Websites
1. The results of this follow-up study will measure the progress achieved in improving the areas of the
websites identified in the previous studies.
As in the previous studies, this project involves surveying (intercepting) website visitors as they
are visiting the CMS.gov and Medicare.gov websites. Visitors to these sites will be randomly
selected and then asked to respond to questions about the sites’ navigability, content,
interactivity, performance, available publications, and privacy policy. Using the same method,
the previous studies provided benchmark measurements on the attributes of each of the above
websites. The results of the previous studies identified where website improvements should be
made in order to have the largest impact on visitor experience. The follow-up study proposed in
this submission will measure the effect of these website improvements on enhancing visitor
experiences.
The potential respondent universe will be the universe of visitors perusing selected pages of the
websites during a specified time period. This universe will be sampled on a probabilistic basis.
Based on historic visitor counts, an estimated sample size designed to achieve the precision
requirements of the study will be determined. Using this expected sample size along with a predetermined “window” for data collection, a probability-based sample of visitors will be
randomly selected via JavaScript from the specified web pages. Survey invitations will be set to
trigger for visitors who have visited at least four different pages during their visit. Based on the
initial study, the sampling rate is expected to be approximately 30 percent of visitors; that is,
roughly one in three visitors using a JavaScript-enabled browser will receive the survey invitation.
For browsers that do not have JavaScript enabled, a fixed-link invitation will be visible.
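For illustration only, the following TypeScript sketch shows the intercept logic described above; the function and constant names are assumptions and do not reflect the production JavaScript.

    // Hypothetical sketch of the intercept-invitation logic described above.
    // The names (SAMPLING_RATE, MIN_PAGES_VIEWED, maybeInviteVisitor) are
    // illustrative assumptions, not the production implementation.
    const SAMPLING_RATE = 0.30;   // target sampling rate discussed above
    const MIN_PAGES_VIEWED = 4;   // invitation triggers after at least four pages

    function maybeInviteVisitor(pagesViewed: Set<string>, alreadyInvited: boolean): boolean {
      if (alreadyInvited) return false;                 // contact each visitor at most once
      if (pagesViewed.size < MIN_PAGES_VIEWED) return false;
      return Math.random() < SAMPLING_RATE;             // probabilistic selection
    }

    // Example: a visitor who has viewed four distinct pages has roughly a 30%
    // chance of receiving the survey invitation.
    const invited = maybeInviteVisitor(
      new Set(["/home", "/plans", "/coverage", "/contact"]),
      false,
    );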
2. For the purposes of this study, the sample design is simple random sampling. There is no
stratification involved. The estimation procedures are designed to produce simple, standard
descriptive statistics consistent with a sampling scheme based on simple random sampling.
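As an illustrative sketch only (assumed names; not the agency's production estimation code), a mean index score and its standard error under simple random sampling could be computed as follows.

    // Illustrative sketch: mean and standard error of an index score under
    // simple random sampling.
    function meanAndStandardError(scores: number[]): { mean: number; se: number } {
      const n = scores.length;
      const mean = scores.reduce((sum, x) => sum + x, 0) / n;
      // Sample variance with the usual n - 1 denominator.
      const variance =
        scores.reduce((sum, x) => sum + Math.pow(x - mean, 2), 0) / (n - 1);
      return { mean, se: Math.sqrt(variance / n) };
    }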
As with the prior studies, we estimate the number of completed interviews for this study to be
approximately 7,000 visitors. Accordingly, we expect estimates produced from this survey to
have very small levels of sampling error and that small differences (one point or less) can be
detected between rounds of the survey.
We expect to be able to measure differences in index scores (e.g., average score for questions
about site characteristics such as “navigability” or “content” based on a scale of 1 to 100)
between two rounds of the survey, with differences as small as 0.90 points (i.e., less than one
point between rounds) using a two-tailed test at the .05 level of significance.
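For context, a 0.90-point figure is consistent with the standard two-sample minimum detectable difference formula; the standard deviation and statistical power in the sketch below are illustrative assumptions, not figures taken from the study.

    // Rough two-sample minimum detectable difference for comparing mean index
    // scores between two survey rounds. The standard deviation (SIGMA) and the
    // power term (Z_BETA) are assumed for illustration only.
    const N_PER_ROUND = 7000;   // expected completed interviews per round
    const SIGMA = 19;           // assumed standard deviation of a 1-100 index score
    const Z_ALPHA = 1.96;       // two-tailed test at the .05 significance level
    const Z_BETA = 0.84;        // assumed 80 percent power

    // MDD = (z_alpha + z_beta) * sigma * sqrt(2 / n)
    const mdd = (Z_ALPHA + Z_BETA) * SIGMA * Math.sqrt(2 / N_PER_ROUND);
    console.log(mdd.toFixed(2)); // approximately 0.9 points under these assumptions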
We do not expect any unusual problems requiring specialized sampling procedures or any use of
periodic data collection cycles.
3. By its nature, this survey differs substantially from traditional surveys. One major
difference is that no attempt will be made to convert initial non-respondents. Traditional surveys
often have extensive callback efforts and/or reminder letters for converting initial refusals.
However, due to the nature of intercepting website visitors in a probabilistic manner, the initial
attempt to induce response will be the only contact ever made with the potential respondent.
Once the respondent refuses the initial survey invitation, no other contact is made; we do not
want to unduly interfere with the sampled visitor's original reason for visiting the website in
the first place. In surveys of this type, the data are used to point to specific problems (or
sources of dissatisfaction) visitors have when using either site. We consider a 30 percent
response rate to be good for this type of survey.
4. No tests of procedures or methods are being proposed for this study. This study will use
methods that have been demonstrated to be an effective means of surveying the CMS.gov and
Medicare.gov websites. Please note that the Medicare.gov website was tested in 2002,
2003, and 2004 using this same methodology. Since the 2001 Medicare.gov research was
the first time this type of web intercept survey was used to test a CMS website, a pretest of the
survey methodology was conducted. Pretest results were used to refine the intercept survey prior
to the 2002 Medicare.gov website testing. The same survey was used again for Medicare.gov in
2003 and in 2007.
5. Please contact either of the following individuals at CMS regarding the statistical and
methodological aspects of the design or for agency information:
Frank Funderburk
Centers for Medicare & Medicaid Services
CMS/OC
7500 Security Blvd. S1-13-01
Baltimore, MD 21244-1850
(410) 786-1820
Fax: (410) 786-2097
[email protected]
Kymeiria Ingram
Centers for Medicare & Medicaid Services
CMS/OC
7500 Security Blvd. S1-11-06
Baltimore, MD 21244-1850
(410) 786-8431
Fax: (410) 786-0006
[email protected]