Supporting Statement B for Request for Clearance:
NATIONAL CENTER FOR HEALTH STATISTICS
RAPID SURVEYS SYSTEM
OMB No. 0920-XXXX
Expiration Date: XX/XX/XXXX
Contact Information:
Stephen Blumberg, Ph.D.
Director
Division of Health Interview Statistics
National Center for Health Statistics/CDC
3311 Toledo Road
Hyattsville, MD 20782
301-458-4107
May 24, 2023
Table of Contents
1. Response Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates and Deal with Nonresponse
4. Tests of Procedures or Methods to be Undertaken
Questionnaire Design and Evaluation
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
LIST OF ATTACHMENTS
Attachment A – Public Health Service Act
Attachment B – Published 60-Day FRN
Attachment B1 – Comments
Attachment C – Guidance for sponsors
Attachment D – RSS Round 1 (2023) Burden Estimate and Questionnaire
Attachment E – RSS Round 1 (2023) Content Justification from Sponsors
Attachment F – 30-day Federal Register Notice for the RSS and RSS Round 1 (2023)
Attachment G – NCHS Ethics Review Board Determination Notice
Attachment H – Publications on RANDS Estimation and Calibration Research
NCHS Rapid Surveys System
As described in Supporting Statement A, the Rapid Surveys System (RSS) is being designed to complement the current household survey systems at the National Center for Health Statistics (NCHS). By utilizing online panel-based data collection, Rapid Surveys will be used to produce time-sensitive estimates of new and emerging public health topics, attitudes, and behaviors. This program includes ongoing methods development and evaluation of the data collection and weighting approaches.
Activities and statistical methods that will be used to operationalize this data system include:
Questionnaire design: Methods such as cognitive interviewing, web probing, and experimental design will be used to evaluate and validate questions on emerging health topics.
Use of multiple panels: Data from two commercial survey panels will be collected and combined for estimation. The performance of both panel surveys will be assessed in terms of coverage, subgroup estimation, nonresponse, and other survey features. Research will be performed to evaluate statistical approaches for combining the two panels, including methods for weighting, mean estimation, and variance estimation.
Calibration weighting: Sample weights from the online commercial panels will be combined and calibrated to the National Health Interview Survey (NHIS) based on previous research findings using NCHS’ Research and Development Survey (RANDS) and ongoing development and evaluations using the Rapid Surveys System. The weighting approach may change over time based on the findings of the ongoing evaluations and will be documented.
Data quality evaluation: The similarities, differences, and opportunities for further weighting calibration between Rapid Surveys and other NCHS and federal surveys will be evaluated, including the NHIS, National Health and Nutrition Examination Survey (NHANES), National Survey of Family Growth (NSFG), Behavioral Risk Factor Surveillance System (BRFSS), and the Census Bureau's Household Pulse Survey. In addition, data quality assessments such as sampling metrics, item nonresponse, and consistency between responses in the two panel surveys will be used for evaluating and reporting data quality.
Further details are provided below.
1. Response Universe and Sampling Methods

NCHS plans to conduct a series of surveys under the Rapid Surveys System to produce estimates that inform understanding of the US adult population on emerging health topics, as well as topics not suitable for current household surveys, such as attitudes and behaviors. The data collections will be performed using the AmeriSpeak (conducted by NORC) and KnowledgePanel (conducted by Ipsos Public Affairs) web-based panels, with surveys conducted on a quarterly basis. NCHS plans to obtain 2,000 survey responses from the KnowledgePanel and 4,000 survey responses from the AmeriSpeak Panel for the first four surveys. For subsequent data collections, NCHS plans to obtain at least 2,000 completed responses from each of the two vendors.
The Rapid Surveys System will utilize pre-established commercial survey panels, which allow the program to launch quickly without the time and infrastructure needed to recruit sampled adults into a new panel. Both AmeriSpeak and KnowledgePanel are established and reputable panels, and NORC and Ipsos have experience recruiting, maintaining, and surveying their panel members. Both panel providers maintain probability-sampled commercial panels that are designed to be representative of the US adult population and use primarily web-based administration to rapidly assess emerging health topics.
While the Rapid Surveys System is a new program, it is based upon NCHS research with data collection using commercial panel surveys via the RANDS platform (OMB numbers 0920-0222, 0920-1298, and 0920-1323). RANDS used NORC’s AmeriSpeak Panel to conduct RANDS 3-8 and RANDS during COVID-19 Rounds 1-3. For these rounds, survey completion rates have ranged from 62.2% (RANDS 3) to 78.5% (RANDS during COVID-19 Round 1), and cumulative response rates have ranged from 11.1% (RANDS 5) to 23.0% (RANDS during COVID-19 Round 1).
The response universe and sampling methods for each panel provider are described in more detail below. Both panels are probability-based; unlike opt-in panels, the recruitment process starts with a random sample of households from a specified frame. As a result, it is possible to derive the selection probability, and hence the sampling weight, for each respondent on the panel.
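As a simple illustration of how those base weights arise, the sketch below computes a panel base weight as the inverse of a household's known selection probability; the probabilities shown are hypothetical, not either vendor's actual values.

```python
# Minimal sketch: a panel base weight is the inverse of the household's known
# selection probability. Probabilities below are hypothetical, not vendor values.
selection_probs = [0.0002, 0.00035, 0.00015]        # P(household selected into panel)
base_weights = [1.0 / p for p in selection_probs]   # each respondent represents 1/p households
print([round(w) for w in base_weights])             # [5000, 2857, 6667]
```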
The AmeriSpeak Panel is a probability-based panel operated by NORC at the University of Chicago. U.S. households are randomly selected with a known, non-zero probability from the NORC National Frame as well as address-based sample (ABS) frames. Sampled households are recruited to join AmeriSpeak by mail, by telephone, and face-to-face by field interviewers, in both English and Spanish. During the initial recruitment, AmeriSpeak panelists are offered an opportunity to choose their preferred mode (web or phone) for future participation in AmeriSpeak surveys. Most panelists complete AmeriSpeak surveys using NORC's web interface, though about 10% of the panel have indicated they prefer to participate in telephone surveys.
For the Rapid Surveys program, a stratified sample of AmeriSpeak panelists will be contacted. The sample design will use the panel information that NORC holds about AmeriSpeak members to create strata based on age, race and Hispanic origin, education, household income, and gender (96 sampling strata in total). Sample selection will account for the expected differential survey completion rates across the sampling strata: the size of the selected sample per stratum is determined such that the distribution of completed surveys across the strata matches that of the target population as represented by census data.
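A minimal sketch of this type of allocation is shown below, assuming illustrative strata, population shares, and expected completion rates (the actual design uses 96 strata and NORC's own rates):

```python
# Hypothetical sketch of stratum allocation that accounts for differential
# completion rates: invite enough panelists per stratum so that the expected
# completed surveys match the population distribution.

def allocate_sample(target_completes, pop_shares, completion_rates):
    """Return the number of panelists to invite per stratum."""
    invites = {}
    for stratum, share in pop_shares.items():
        needed_completes = target_completes * share              # completes needed to match census share
        invites[stratum] = round(needed_completes / completion_rates[stratum])
    return invites

# Illustrative three-stratum example (values are assumptions, not NORC's).
pop_shares = {"A": 0.50, "B": 0.30, "C": 0.20}
completion_rates = {"A": 0.80, "B": 0.65, "C": 0.50}
print(allocate_sample(4000, pop_shares, completion_rates))
# {'A': 2500, 'B': 1846, 'C': 1600}
```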
The KnowledgePanel is a probability-based panel operated by Ipsos Public Affairs since 1998. Ipsos currently recruits US households through ABS methodology, which involves probability-based sampling of addresses from the U.S. Postal Service's Delivery Sequence File (DSF). An estimated 97% of households are covered in the DSF. Prior to using ABS recruitment (April 2009), list-assisted RDD sampling techniques were used on a sample frame consisting of the entire U.S. residential telephone population. Approximately 20% of current panel members were recruited through RDD methodology, while 80% were recruited using the ABS methodology. Invited households can join the KnowledgePanel via a mail-back survey form, a recruitment interview completed by phone, or an online recruitment form, and can participate in surveys in English and Spanish. KnowledgePanel surveys are conducted using online administration. Panelists without internet access at home are provided a tablet with a mobile data plan.
For the Rapid Surveys program, Ipsos will select panelists using probability proportional to size (PPS) sampling with a measure of size (MOS) based on panel sample selection weights that are benchmarked by: age by gender by race and Hispanic origin, census region by race and Hispanic origin, metropolitan status by race and Hispanic origin, education by race and Hispanic origin, household income by race and Hispanic origin, rental status by race and Hispanic origin, Hispanic origin by census region, household size by race and Hispanic origin, and language dominance by census region. The benchmarks are selected using population control totals from the US Census Bureau’s American Community Survey (ACS) and the March Supplement of the Current Population Survey (CPS).
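The sketch below illustrates one common way to carry out PPS selection, systematic sampling on the cumulated measure of size; the MOS values are simulated and do not reflect Ipsos' actual panel selection weights:

```python
import numpy as np

# Minimal sketch of systematic probability-proportional-to-size (PPS) selection,
# where the measure of size (MOS) is each panelist's benchmark-adjusted
# selection weight. MOS values are simulated, not Ipsos' actual weights, and the
# sketch assumes no single MOS exceeds the sampling interval.
rng = np.random.default_rng(42)
mos = rng.uniform(0.5, 3.0, size=1000)       # hypothetical measures of size
n = 100                                      # target number of panelists to select

cum = np.cumsum(mos)
step = cum[-1] / n                           # sampling interval
start = rng.uniform(0, step)                 # random start within the first interval
points = start + step * np.arange(n)         # systematic selection points
selected = np.searchsorted(cum, points)      # index of the panelist hit by each point
print(selected[:10])
```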
2. Procedures for the Collection of Information

The basic Rapid Surveys will be conducted using the AmeriSpeak Panel via web and phone administration and the KnowledgePanel via web administration. NCHS has also exercised an optional task for each contract that will implement custom enhancements to improve sample representativeness. Each round's questionnaire will consist of four main components: (1) basic demographic information on respondents to be used as covariates in analyses, (2) new, emerging, or supplemental content proposed by programs in NCHS, other CDC Centers, Institutes, and Offices, and other HHS agencies, (3) questions used for calibrating the survey weights, and (4) additional content selected by NCHS to evaluate against relevant benchmarks. The questionnaire for each round will vary. The questions in each of the four components above will be used for various purposes including:
Estimation: The externally sponsored content from CDC/HHS programs on emerging health topics will be used for producing estimates of these selected topics. Data from the two survey panels will be combined using a single sample weight created from the panel-provided Rapid Surveys sample weights. Weighted estimates will be calculated using the combined sample weight calibrated to the NHIS and will be produced for all adults and for identified population subgroups of interest. Variables included for estimation will vary by round of Rapid Surveys.
Calibration: For the purpose of generating reliable estimates on emerging health topics, the Rapid Survey sample weights provided by NORC and Ipsos will be calibrated to the NHIS using a set of questions fielded on both surveys. Questions identified for calibration are based on NCHS research and recommendations from the NCHS Board of Scientific Counselors (BSC) and are those that adjust for possible differences in demographic characteristics (e.g., education and marital status), underlying health between the samples (e.g., diagnosed asthma), and characteristics of panel respondents including internet access and use, volunteerism, and civic engagement.
Evaluation: To evaluate Rapid Surveys and to permit comparison, we will include some questions from other sources, primarily the NHIS but also the CPS and the Census Bureau's Household Pulse Survey. Comparing estimates from the Rapid Surveys panel surveys, after statistical adjustment, to those from the NHIS using questions fielded on both sources continues to be an important tool for NCHS to gauge the effectiveness of its statistical methods. The evaluation variables will be used to independently assess the quality of the data from NORC and Ipsos as well as the weighting approaches used for the two panels and by NCHS.
Measurement Research: In addition, NCHS plans to use Rapid Surveys for methodological work related to measurement error and question design. This work will rely largely on a set of cognitive probes and experimental designs. This measurement research will be used to inform question design for the selected emerging health topics being evaluated, including understanding how survey respondents interpret and understand questions and concepts related to these topics. This information will be used to inform not only current and future NCHS and CDC surveys, but also the design and analysis of other Federal and non-Federal surveys and questionnaires.
Research Combining Probability-Based Panels: As part of this program, the data from two separate probability-based panels will be combined. This program will evaluate the limitations and advantages of combining data from two panels. In addition, NCHS will perform research on appropriate statistical methods for generating weights, mean estimates, and variance estimates from the combined panel data.
The evaluation variables will be used to continually assess the performance of the weighting adjustments and will be used for reporting the quality of the Rapid Surveys System estimates. These evaluation variables will also support continued research on statistical methods for combining information from multiple sources for generating estimates through calibration and other model-based approaches. In addition to statistical methods for combining data from a commercial web panel with NCHS core surveys, evaluation variables will be used to evaluate and advance methods for combining data from two panels.
Assessments of data quality will be performed for each quarterly data collection and will include evaluations of sample size, response rates, metrics evaluating the weights, and calculations of bias for the selected evaluation variables.
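As a simple illustration of the kind of bias metric that could be reported, the sketch below compares a weighted Rapid Surveys estimate for an evaluation variable against an external benchmark; all values are hypothetical:

```python
import numpy as np

# Sketch of a bias calculation for an evaluation variable: the weighted Rapid
# Surveys estimate compared with a benchmark from another survey (e.g., NHIS).
# All numbers below are hypothetical.
responses = np.array([1, 0, 1, 1, 0, 0, 1, 0])            # indicator for an evaluation variable
weights = np.array([950, 1200, 800, 1100, 1000, 900, 1050, 1300])

rss_estimate = np.average(responses, weights=weights)      # weighted prevalence from Rapid Surveys
benchmark = 0.42                                           # hypothetical benchmark estimate
bias = rss_estimate - benchmark
relative_bias = bias / benchmark
print(f"estimate={rss_estimate:.3f}, bias={bias:+.3f}, relative bias={relative_bias:+.1%}")
```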
3. Methods to Maximize Response Rates and Deal with Nonresponse

Both Ipsos and NORC employ several approaches for maintaining participation and maximizing responses to surveys fielded using their panels (described in more detail below). There are two levels of nonresponse that need to be accounted for when using survey panels: nonresponse (and coverage bias) during the panel recruitment phase, and non-participation in the survey that is actually administered. The first of these two levels of nonresponse and coverage bias is one of the greatest limitations of using commercially available panels compared to federal surveys. To address this bias, NCHS incorporates into the calibration weighting approach relevant covariates that account for differences between web panel respondents and the general population. In addition, the two panel providers for Rapid Surveys will provide information on nonrespondents so that the quality of the responses as well as potential nonresponse bias in both panels can be evaluated. NCHS and RTI International, a third contractor that will provide support for the Rapid Surveys program, will independently evaluate the panel and data quality. Using information collected for all panel participants (e.g., socio-economic and occupational status, age, gender, race and Hispanic origin, and geographic characteristics), which is available on the final file, NCHS will conduct a nonresponse bias analysis of the Rapid Surveys sample. The results of these analyses will permit NCHS to further refine its model-based estimation techniques and create calibrated survey weights that adjust for nonresponse bias.
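The sketch below illustrates one elementary nonresponse bias check of this kind, comparing the distribution of a frame characteristic between respondents and the full invited sample; the data are hypothetical and the production analysis will be more extensive:

```python
import pandas as pd

# Sketch of a simple nonresponse bias check: compare the distribution of a
# characteristic known for all sampled panelists between respondents and the
# full invited sample. Data below are hypothetical.
sample = pd.DataFrame({
    "age_group": ["18-34", "18-34", "35-64", "35-64", "65+", "65+", "18-34", "35-64"],
    "responded": [0, 1, 1, 1, 1, 0, 0, 1],
})

full = sample["age_group"].value_counts(normalize=True)                          # invited sample
resp = sample.loc[sample["responded"] == 1, "age_group"].value_counts(normalize=True)  # respondents
print(pd.DataFrame({"invited": full, "respondents": resp, "difference": resp - full}))
```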
NORC uses a well-maintained ABS frame that is constantly updated as NORC conducts its field surveys for panel recruitment. The annual AmeriSpeak Panel retention rate is about 85%. In addition, NORC regularly invites new households into the panel using a variety of recruitment methods. Notably, NORC has a large nonresponse follow-up (NRFU) effort during panel recruitment, which utilizes face-to-face interviews (as opposed to the initial mail-back recruitment survey). NRFU recruitment significantly improves the representation of the panel with respect to demographic segments that are under-represented among the respondents to the initial recruitment, including young adults (persons 18 to 34 years of age), non-Hispanic Black adults, Hispanic adults, lower-income households, renters, cellphone-only households, and persons with lower educational attainment (e.g., no college degree). AmeriSpeak panel weights (non-study-specific weights) also include statistical nonresponse adjustments for the panel recruitment.
NORC employs dedicated staff to maintain panelists' participation in AmeriSpeak and limits the number of surveys any one panelist may be asked to complete per month. This process minimizes the number of surveys any one panelist is exposed to, maximizes the rotation of all panelists among AmeriSpeak surveys, and leads to a relatively high participation rate. To maximize response rates, NORC sends multiple email and text reminders to web-preference respondents and calls phone-preference respondents. These follow-ups can be targeted to groups with lower response rates than expected. NORC also uses longer field periods to improve the completion rate. Panelists are also offered an incentive upon completion of the surveys. Upon completion of a particular study, NORC produces final statistical weights that have been adjusted to address survey nonresponse through a weighting class method.
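A minimal sketch of a weighting-class nonresponse adjustment is shown below; the classes, weights, and response indicators are illustrative and do not represent NORC's actual adjustment cells:

```python
import pandas as pd

# Sketch of a weighting-class nonresponse adjustment: within each class, inflate
# respondents' base weights by (weighted invited total) / (weighted respondent
# total). Classes and weights are illustrative assumptions.
df = pd.DataFrame({
    "wclass":      ["A", "A", "A", "B", "B", "B"],
    "base_weight": [1000, 1200, 800, 900, 1100, 1000],
    "responded":   [1, 0, 1, 1, 1, 0],
})

invited_total = df.groupby("wclass")["base_weight"].transform("sum")
resp_total = (df["base_weight"] * df["responded"]).groupby(df["wclass"]).transform("sum")
df["nr_adjusted_weight"] = df["base_weight"] * invited_total / resp_total
df.loc[df["responded"] == 0, "nr_adjusted_weight"] = 0    # nonrespondents carry no weight
print(df)                                                 # adjusted weights still sum to each class total
```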
Ipsos utilizes the U.S. Postal Service's DSF for the ABS sample frame, which has high coverage of the US population, with an estimated 97% of US households covered in the DSF. In addition, Ipsos regularly recruits to maintain its panel, with four to six recruitment waves per year. Ipsos adjusts the stratification methodology used to sample households based on the panel's geodemographic composition, fluctuating response, and attrition rates. Currently, Ipsos oversamples groups including households likely to have a young adult; households likely to have a Latino resident; households in Census Block Groups with a high concentration of residents with a high school education or less; households in Census Block Groups with low internet penetration; and households in rural areas, to better represent the US population. For households for which a corresponding valid telephone number can be matched to the sampled address, nonresponse telephone refusal-conversion calls are made to improve panel participation. Selected households can join the panel using mail, phone, or web options.
Ipsos also implements a number of strategies to address nonresponse and maximize survey-level completion rates. The KnowledgePanel has a loyalty policy for all panel members, including providing extra attention to the populations that are more likely to leave the panel. Higher levels of incentives, surprise incentives that are not tied to surveys, and the ability to take a break from surveys for a few months encourage ongoing participation on the panel. The standard KnowledgePanel data collection period is between 2 and 3 weeks, which provides sufficient time for all respondents who want to participate in a survey to do so and encourages responses from hard-to-reach populations, which typically take longer to complete a survey. Ipsos also sends multiple reminders to non-respondents to encourage participation. In addition, to ease the process of taking the survey, Ipsos' survey software automatically detects the type of device, operating system, and web browser, and adapts to a template optimized for the respondent's device. KnowledgePanel samples typically result in high completion rates, and the final weighting approach for the data is designed to address commonly observed patterns of differential nonresponse.
4. Tests of Procedures or Methods to be Undertaken

As a new program, the Rapid Surveys System will be used to perform research on and evaluate methods related to questionnaire design and evaluation, calibration weighting, combining two panel surveys, and estimation.
Questionnaire Design and Evaluation

For the questionnaire design and evaluation, activities include performing cognitive interviews with members of the public to understand how respondents answer the survey questions and to provide data users with relevant information for interpreting survey results. In addition, web probes may be designed for inclusion on the questionnaires to provide quantitative and qualitative findings regarding respondents' interpretations of selected survey items. RTI will conduct in-person and virtual cognitive interviews to test questions. Participants who complete the interview will receive cash incentives for their participation. The results will be used to make recommendations for questions to be revised, included, or not included in subsequent NCHS surveys. Procedures for these cognitive interviews will conform to those laid out in the generic clearance for NCHS' Collaborating Center for Questionnaire Design and Evaluation Research (OMB Control # 0920-2222, Expiration 9/30/24).
The two data collection contractors, Ipsos and NORC, will provide NCHS final weights for each of their samples. The weights account for the probability of selection into the respective panels as well as selection into Rapid Surveys. Each contractor applies its own adjustment approaches, including nonresponse adjustment and alignment to population benchmarks on specified variables. A brief description of each contractor's weighting approach is provided below:
AmeriSpeak Panel (NORC): NORC develops final study-specific weights that account for the panel member's probability of selection into the panel, adjustments for panel nonresponse, the probability of selection into the study sample from the panel under the sample design, and final adjustments. The final study weights address survey nonresponse through a weighting class method. Then, raking adjustments are applied to the nonresponse-adjusted weights to align the survey sample to population benchmarks on specified demographic variables including age, gender, census division, race and Hispanic origin, and education from the CPS (a brief raking sketch follows the contractor descriptions below). Raking and re-raking are done during the weighting process such that the weighted demographic distribution of the completed surveys resembles the demographic distribution in the target population.
KnowledgePanel (Ipsos): Ipsos’ weighting process begins with design weights for all panelists sampled and invited to a given study. The base weights for KnowledgePanel respondents are adjusted using raking on specified demographic variables including: gender, race and Hispanic origin, census region by metropolitan status, education, and household income using benchmarks from the CPS. For surveys that include Spanish survey takers, an additional adjustment for language proficiency is included using benchmarks from the ACS.
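Both contractors' raking steps follow the same basic logic of iterative proportional fitting. The sketch below rakes a toy sample to two hypothetical margins; the actual vendor adjustments use more raking dimensions and official CPS and ACS benchmarks:

```python
import pandas as pd

# Minimal sketch of raking (iterative proportional fitting) to two margins.
# Margins, categories, and starting weights are hypothetical.
df = pd.DataFrame({
    "gender": ["F", "F", "M", "M", "F", "M"],
    "educ":   ["HS", "BA", "HS", "BA", "BA", "HS"],
    "weight": [1.0, 1.0, 1.0, 1.0, 1.0, 1.0],
})
targets = {
    "gender": {"F": 0.52, "M": 0.48},
    "educ":   {"HS": 0.60, "BA": 0.40},
}

for _ in range(50):                                        # iterate until margins converge
    for var, margin in targets.items():
        current = df.groupby(var)["weight"].sum() / df["weight"].sum()
        factors = {k: margin[k] / current[k] for k in margin}
        df["weight"] *= df[var].map(factors)

print(df.groupby("gender")["weight"].sum() / df["weight"].sum())   # ~0.52 / 0.48
print(df.groupby("educ")["weight"].sum() / df["weight"].sum())     # ~0.60 / 0.40
```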
After receiving the final data and weights from Ipsos and NORC, calibration weighting approaches will be used to adjust the contractor provided weights prior to estimation to reduce potential bias in the estimates. Calibrated survey weights will be developed using several questions included verbatim on both the NHIS and the Rapid Surveys questionnaire for the purpose of adjusting the weights.
For this step, research findings from NCHS' RANDS program as well as findings from NCHS' BSC Workgroup (https://www.cdc.gov/nchs/data/bsc/Letter-to-NCHS-Director-Recommendations-on-Use-of-Panel-Survey-Data-by-NCHS-June-6-2022Redacted-with-Findings.pdf) will be used to guide the selection of calibration variables for adjusting the Rapid Surveys sample weights to the NHIS. Selected calibration variables will include demographic, health, and web survey participation-related factors. In addition, new research will be performed to continue guiding decisions for the calibration weighting, including the evaluation of potential calibration variables added to both the Rapid Surveys and NHIS questionnaires. For initial rounds of Rapid Surveys, the new calibration variables under consideration include questions about Internet access, the use of health information technology, volunteering and civic engagement, life satisfaction, and social and work limitations.
This adjustment is intended to control for possible differences between the NHIS and Rapid Surveys samples due to different response propensities, coverage and/or sample variability. Based on NCHS’ past and ongoing research using the RANDS platform, additional calibrations to the weights provided for panel surveys can reduce differences between RANDS and NHIS estimates, though the effects may differ among subgroups and can depend on the outcomes of interest, calibration variables, and the relationships between the two. In particular, inclusion of health-related variables in weight calibration has shown improvement in the health outcome estimates through reduced bias. Underlying health may affect participation in Rapid Surveys (and other panel surveys) and response to certain health-related questions for some participants.
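As a simplified illustration of a single calibration step, the sketch below post-stratifies combined weights so that weighted totals for one calibration variable match NHIS-based control totals; all figures are hypothetical, and the production approach may calibrate to many variables jointly using raking or related methods:

```python
import pandas as pd

# Sketch of one simple calibration step: post-stratify Rapid Surveys weights so
# that weighted totals for a calibration variable (here, diagnosed asthma status)
# match NHIS-based control totals. All values are hypothetical.
rss = pd.DataFrame({
    "asthma": ["yes", "no", "no", "yes", "no", "no"],
    "weight": [21000, 35000, 40000, 18000, 52000, 34000],
})
ADULT_TOTAL = 258_000_000                                        # hypothetical adult population total
nhis_controls = {"yes": 0.085 * ADULT_TOTAL, "no": 0.915 * ADULT_TOTAL}  # hypothetical NHIS shares

cell_totals = rss.groupby("asthma")["weight"].sum()
rss["cal_weight"] = rss.apply(
    lambda r: r["weight"] * nhis_controls[r["asthma"]] / cell_totals[r["asthma"]], axis=1
)
print(rss.groupby("asthma")["cal_weight"].sum())                 # matches the NHIS control totals
```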
The effectiveness of the model-assisted approach has been documented in statistical literature and has also been explored in our research on previous RANDS recruited panel survey data. Articles illustrating the research performed at NCHS are presented in Attachment H.
Moreover, the Rapid Surveys System will utilize two separate commercial panels and can be used for evaluating the use of multiple panels, as recommended by the BSC. One key feature of this study will be an evaluation to understand differences and limitations of the panels, as well as appropriate methods for creating combined estimates from the two panel collections. Initial assessments will compare the data structure, variable and value labels, and response options to ensure alignment between the two surveys. In addition, the two panels will be compared on data quality metrics such as representativeness and item missingness. Another important component of the study will be identifying similarities and differences in the data collection methods, frame development, sample selection, and weighting, as these can have a large impact on the comparability of estimates from different panels using the same questionnaire. Based on the understanding of how each panel is constructed, statistical methods for combining the two surveys will be evaluated. This includes developing a combined set of weights and determining appropriate methods for producing mean and variance estimates for the combined panel data.
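One simple compositing approach that could be evaluated is sketched below: each panel's weights are scaled by a factor proportional to its effective sample size so that the combined weights represent the population only once. The weights, population total, and compositing rule are illustrative assumptions, not NCHS's adopted method:

```python
import numpy as np

# Sketch of combining weights from two probability panels with a compositing
# factor proportional to each panel's Kish effective sample size. All inputs
# are simulated; the production method may differ.
def effective_n(w):
    return w.sum() ** 2 / (w ** 2).sum()                 # Kish effective sample size

rng = np.random.default_rng(1)
w_norc = rng.uniform(800, 1500, size=4000)               # hypothetical AmeriSpeak weights
w_ipsos = rng.uniform(900, 1600, size=2000)              # hypothetical KnowledgePanel weights

POP_TOTAL = 258_000_000                                  # hypothetical adult population total
w_norc *= POP_TOTAL / w_norc.sum()                       # each panel's weights sum to the total
w_ipsos *= POP_TOTAL / w_ipsos.sum()

lam = effective_n(w_norc) / (effective_n(w_norc) + effective_n(w_ipsos))
combined = np.concatenate([lam * w_norc, (1 - lam) * w_ipsos])   # combined weights sum to POP_TOTAL

y = rng.binomial(1, 0.3, size=combined.size)             # hypothetical outcome for both panels
print(np.average(y, weights=combined))                   # combined weighted estimate
```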
To generate health outcome estimates, Rapid Surveys will use estimation procedures developed by NCHS. This approach is necessary given the major difference in sample quality between Rapid Surveys and NCHS' household surveys that produce official statistics (such as the National Health Interview Survey, or NHIS, and National Health and Nutrition Examination Survey, or NHANES). Rapid Surveys will use both traditional survey estimation procedures and model-assisted methods. Estimates for a limited set of variables, including the sponsored content, will be publicly released, and technical notes will accompany the estimates that describe the methods and their limitations. BSC findings will be used to guide estimation decisions, including avoiding estimates that can be produced by NCHS' existing household surveys, avoiding estimates that are related to willingness to participate, and carefully assessing subdomain estimates prior to release.
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The persons with overall responsibility for the methodological and technical aspects of the described activities are:
Paul Scanlon, Ph.D.
Senior Methodologist
National Center for Health Statistics
3311 Toledo Road
Hyattsville, Maryland
(301)-458-4649
Katherine Irimata, Ph.D.
Mathematical Statistician
National Center for Health Statistics
3311 Toledo Road
Hyattsville, Maryland
(301)-458-4075
James Dahlhamer, Ph.D.
Survey Statistician
National Center for Health Statistics
3311 Toledo Road
Hyattsville, Maryland
(301) 458-4403
Data collection for the Rapid Surveys program will be conducted by Ipsos and NORC. The persons responsible for the data collection aspects of Rapid Surveys are:
Ipsos
Frances Barlas, Ph.D.
Ipsos Public Affairs, LLC
2020 K Street, NW, Suite 410
Washington, DC 20002
202-203-0379
NORC
Margrethe Montgomery, M.A.
NORC at the University of Chicago
55 E Monroe, 31st Floor
Chicago, IL 60603
312-325-2533
RTI will be conducting data quality checks, combining weights and creating variance estimation variables based on NCHS specifications. The person responsible for the aspects of Rapid Surveys from RTI is:
Sarah Lessem, Ph.D.
RTI International
701 13th St NW
Washington, DC 20005
202.816.3981
Email: [email protected]