SUPPORTING STATEMENT – PART B
SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSIONS
HEALTHCARE.GOV SITE-WIDE ONLINE SURVEY
COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 Respondent universe and sample
The target population for the site-wide HealthCare.gov survey is a subset of users of the HealthCare.gov website. Although anyone can use the website, the target is consumers eligible to purchase Marketplace insurance. The survey asks the respondent’s role so that this set of users can be identified. The size of the target consumer population varies by month: during open enrollment (e.g., the three months from November 1, 2015 through January 31, 2016 for the current year), the number of visits has historically been in the millions; in the remaining months it has been significantly lower. From this very large population, website users are randomly intercepted, and any intercepted user who meets the screening criteria is eligible to respond to the survey.
Sampling Plan
The population-based site-wide survey will use a sample of all website users. Screening questions will be asked to determine survey eligibility for each intercepted user so that users can be identified who are part of the appropriate target audience (consumers).
Each respondent will answer questions about the task they performed on the website immediately before taking the survey. These survey sections, or tasks, include looking for and reading information, creating an account, filling out the application, shopping for and comparing health plans, and enrolling in a plan. The website contains five trigger pages: Contact Us, Get Answers, Eligibility Results, Plan Compare To-Do List, and Congratulations Enroll To-Do List. A consumer who elects to take the survey will be directed to the appropriate questions based on their response to the question, “What did you do most recently on Healthcare.gov today?”
We estimate that randomly sampling two percent of the user population will be sufficient to gather 1,000 respondents per week during open enrollment for each of the six survey sections. The data collected “off-season” will be minimal and is anticipated to be aggregated monthly. The size of the open enrollment sample is expected to be sufficient to meaningfully track change week-to-week and to examine subgroups of respondents such as new enrollees versus re-enrollees.
Sampling Plan based on Open Enrollment 2 (2014-2015)
| Weeks during Open Enrollment 2               | 16      |
| Total Users                                  | 670,050 |
| Users per week                               | 41,878  |
| Respondent Quota per week                    | 1,000   |
| Percent of users to receive survey intercept | 2%      |
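As a hypothetical back-of-envelope check of the figures in the table above (the figures are from the table; the calculation itself is illustrative and not part of the sampling plan), the intercept rate needed to meet the weekly quota can be derived directly:

```python
# Illustrative check of the sampling-plan arithmetic (figures from the table above).
total_users = 670_050    # unique users during Open Enrollment 2
weeks = 16               # length of Open Enrollment 2
quota_per_week = 1_000   # target respondents per week

users_per_week = total_users / weeks             # ~41,878
# Intercept rate needed if every intercepted user completed the survey:
required_rate = quota_per_week / users_per_week  # ~2.4%

print(f"Users per week: {users_per_week:,.0f}")
print(f"Required intercept rate at 100% completion: {required_rate:.1%}")
```

The result, roughly 2.4%, is consistent with the 2% intercept rate stated in the table.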
Our goal in this research is to enhance our understanding of how well the interactive tools and information provided on Healthcare.gov are performing. This continuously running survey will identify content that may be confusing for consumers, tasks that are difficult for consumers to complete, and other measures such as overall satisfaction using the website. The data collected and reported will be descriptive in nature and diagnostic for those charged with enhancing the website. The data will not be used to develop or publish formal official statistics.
B.2 Information collection procedures
The data collection will consist of an online survey launched from specific pages on HealthCare.gov. As discussed above, the survey will be offered to a random sample of users, who may then opt in to take it.
B.3 Methods to maximize response rates and non-response analysis plan
The survey is brief: respondents will be asked at most 25 closed-ended questions, and completion takes about four minutes. A strong completion rate is expected for a survey of this length.
Survey response rates express completed interviews as a percentage of the estimated eligible units. In the case of the HealthCare.gov site-wide survey, the response rate will be the number of respondents over the number of unique page visits during a specified period of time (e.g., one week). The numbers of unique visits for pages of interest in 2015 are shown in the table below.
Number of Unique Users during Open Enrollment 2 (2014-2015)
| Get Answers page         | 670,050   |
| Contact Us page          | 2,530,609 |
| Eligibility Results page | 4,956,469 |
| Plan Compare page        | 415,170   |
| Enrollment page          | 3,353,846 |
As the table above indicates, usage varies widely across pages. No data are available to suggest what the response rate will be for this unique interactive website. We estimate that a sampling rate of 2% of all users will yield a sample sufficient for week-to-week tracking of results and for comparing subgroups such as those mentioned above.
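The response-rate definition given above (completed surveys divided by unique page visits in a specified period) can be sketched as follows. The function name and the example figures are illustrative assumptions, not actual survey results; the traffic figure is the Get Answers page total from the table averaged over the 16-week open enrollment period.

```python
# Hypothetical response-rate calculation following the definition above:
# response rate = number of respondents / number of unique page visits
# over a specified period (e.g., one week).

def response_rate(respondents: int, unique_visits: int) -> float:
    """Completed surveys as a fraction of unique page visits in the period."""
    if unique_visits <= 0:
        raise ValueError("unique_visits must be positive")
    return respondents / unique_visits

# Example: 1,000 weekly respondents against the Get Answers page's
# average weekly traffic (670,050 unique users over 16 weeks).
weekly_visits = round(670_050 / 16)
print(f"{response_rate(1_000, weekly_visits):.1%}")  # about 2.4%
```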
Non-response bias will be assessed indirectly, using other data sources such as clickstream analysis, completion rates of tasks on HealthCare.gov, demographics of successful HealthCare.gov applicants, and the number of new enrollees versus re-enrollees reaching each of the trigger pages. While the response rate itself can be calculated, the means and proportions for non-respondents, and thus the effect of non-response on the resulting user experience and perception data, remain unknown.
Since the purpose of this survey is to identify opportunities for improvement on a continuous basis, we do not plan to weight the data to external demographics or other characteristics as it would be difficult if not impossible to identify an existing set that would be representative of the HealthCare.gov user. Instead, we will describe the composition of the survey respondent pool and interpret the results in that context.
B.4 Tests of procedures or methods
The online instrument was tested internally for functionality, language, and skip patterns and flow. It will be tested live, with small samples to verify general comprehension among the target audience, when the survey initially launches. In addition, many of the questions in this instrument have been used successfully in other surveys and for screening and classification of Marketplace consumer audiences. This type of pretesting with consumers serves to reduce the potential for confusion. We do not anticipate a need to develop any new questions as part of this work. However, if new questions were required, they would also be tested using cognitive testing methods prior to data collection. The data collection procedures have been well tested in surveys of US consumers and business leaders.
B.5 Statistical and questionnaire design consultants
Frank Funderburk (410-786-1820) or Diane Field (410-786-5978)
Division of Research
Office of Communications
Centers for Medicare and Medicaid Services
The project officer who will receive the deliverables from the supplement is: Diane Field, Ph.D.
Division of Research
Office of Communications
Strategic Marketing Group
Centers for Medicare and Medicaid Services
410-786-5978