B. Statistical Methods
The sampling frame for the exploratory study is Ipsos Public Affairs’ (Ipsos’) online consumer panel, KnowledgePanel.1 Unlike opt-in web-based panels that use convenience sampling, the KnowledgePanel is a probability-based panel that is designed to be representative of the U.S. adult population. This representation is achieved through address-based sampling (ABS), where every U.S. adult with an address (including those who do not have a landline phone number) has an equal probability of being selected for participation on the panel. Selected panelists without Internet access are provided with Internet access and a tablet computer, if needed.
The KnowledgePanel has some limitations that should be considered when interpreting survey results. The low recruitment rate for panel participation, panel attrition, and nonresponse among panelists selected to complete a particular survey lead to a very low overall response rate (7%), which may result in nonresponse bias if nonrespondents are systematically different from respondents (Tourangeau, Conrad, & Couper, 2013). Other potential limitations include sampling and coverage issues, nonresponse from breakoffs (i.e., not completing the survey), and measurement error (Tourangeau, Conrad, & Couper, 2013). Because of these limitations, it is not appropriate to make inferences to the U.S. population using results from a probability-based panel survey.
The KnowledgePanel consists of approximately 55,000 adults who were randomly selected and invited to participate as panelists. Panelists typically serve for 2 to 3 years and complete two to three surveys per month during their tenure. As previously noted, panelists from non-Internet households are equipped with access to the Internet and a tablet computer if needed. The panel also includes households that have listed and unlisted telephone numbers and those without a landline telephone.
Panelists are recruited using random sampling strategies, which is distinct from other online panels that use opt-in recruitment methods and allow individuals to volunteer as panelists. Random digit dialing (RDD) was originally the primary sampling strategy; it was supplemented with ABS starting in 2008, and RDD sampling was fully phased out in late 2009. Individuals selected for the panel can enroll in one of three ways:
1. completing a paper enrollment form and returning it in a postage-paid envelope
2. calling a toll-free hotline and completing an enrollment interview
3. visiting a secure web site and completing an online enrollment form
After individuals accept the invitation to become panel members, they are instructed to log on to a secure web site and complete individual profile questionnaires. The questionnaires capture essential demographic information (e.g., sex, age, race, income, education) and health information (e.g., diagnoses, health status, health behaviors). These profiles enable prescreening of potential participants for eligibility, eliminate the need to repeat demographic questions on every survey, and can also be used in nonresponse bias analyses. Panelists update their profiles annually.
Between 2009 and 2011, Hispanic and non-Hispanic households with adults aged 18 to 29 were oversampled to help ensure sufficient representation of these groups. In 2012, this oversampling was repeated and extended to include adults aged 30 years and older. The panel’s weighting procedures adjust for the oversampling carried out to improve the demographic composition of the panel.
For selection of general population samples (i.e., adults aged 18 and older) from the panel, a patented methodology is used to ensure the resulting sample represents an equal probability of selection method sample. First, the entire panel (i.e., the sampling frame) is weighted to the latest Current Population Survey (CPS) (2017) benchmarks to compensate for any minor misalignments that may result from differential attrition rates among hard-to-reach subgroups. This step helps ensure that the weighted distribution of the KnowledgePanel aligns with the U.S. population of adults. The geo-demographic dimensions used for weighting the entire panel include the following:
gender (male/female)
age (18 to 29, 30 to 44, 45 to 59, and 60+)
race/Hispanic ethnicity (white/non-Hispanic, black/non-Hispanic, other/non-Hispanic, 2+ races/non-Hispanic, Hispanic)
education (less than high school, high school, some college, bachelor’s and beyond)
Census region (Northeast, Midwest, South, West)
household income (under $10,000, $10,000 to <$25,000, $25,000 to <$50,000, $50,000 to <$75,000, $75,000+)
home ownership status (own, rent/other)
metropolitan area (yes, no)
Internet access (yes, no)
Using the above weights as the measure of size (MOS) for each panel member, a probability proportional to size (PPS) procedure is then used to select study-specific samples. Applying this PPS methodology with the above MOS values produces fully self-weighting samples from the KnowledgePanel. For this study, an equal probability of selection method sample of 4,400 panel members will be selected to achieve 2,400 completed surveys.
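The selection step can be illustrated with a minimal sketch of systematic PPS sampling, using the panel weights as MOS values. This is not Ipsos’s patented procedure; the frame size and weight distribution below are illustrative only.

```python
import numpy as np

def pps_systematic_sample(mos, n_sample, rng=None):
    """Select n_sample units with probability proportional to size (MOS)
    using systematic PPS sampling on a randomly ordered frame."""
    rng = np.random.default_rng(rng)
    order = rng.permutation(len(mos))          # randomize frame order
    sizes = np.asarray(mos, dtype=float)[order]
    cum = np.cumsum(sizes)                     # cumulative MOS totals
    interval = cum[-1] / n_sample              # fixed sampling interval
    start = rng.uniform(0, interval)           # random start point
    points = start + interval * np.arange(n_sample)
    idx = np.searchsorted(cum, points)         # unit hit by each point
    return order[idx]

# Illustrative frame: 55,000 panelists with hypothetical panel weights as MOS
weights = np.random.default_rng(1).gamma(2.0, 1.0, size=55_000)
sample = pps_systematic_sample(weights, 4_400, rng=1)
```

Because each panelist’s selection probability is proportional to the panel weight, the selected cases carry approximately equal final probabilities of selection, which is what makes the resulting sample (approximately) self-weighting.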
Once the study sample has been fielded and all the survey data are cleaned, the design weights will be adjusted to compensate for any differential nonresponse that may have occurred during the data collection process. Final analysis weights will be produced using an iterative proportional fitting (raking) procedure to ensure that the resulting weights are aligned with respect to all study benchmark distributions simultaneously. In the last step, calculated weights will be examined to identify and, if necessary, trim outliers at the extreme upper and lower tails of the weight distribution. The resulting weights will then be scaled to sum to the total sample size of all eligible respondents.
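The raking step can be sketched with a minimal iterative proportional fitting routine; the variable names, benchmark totals, and toy data below are hypothetical, not the study’s actual benchmarks.

```python
import numpy as np
import pandas as pd

def rake(df, design_wt, margins, max_iter=50, tol=1e-6):
    """Iterative proportional fitting: cycle through the raking variables,
    scaling weights so each weighted marginal total matches its benchmark."""
    w = df[design_wt].to_numpy(dtype=float)
    for _ in range(max_iter):
        max_change = 0.0
        for var, targets in margins.items():
            totals = pd.Series(w).groupby(df[var].to_numpy()).sum()
            for cat, target in targets.items():
                mask = (df[var] == cat).to_numpy()
                factor = target / totals[cat]   # scale cells in this category
                w[mask] *= factor
                max_change = max(max_change, abs(factor - 1))
        if max_change < tol:                    # all margins matched
            break
    return w

# Toy example with two raking dimensions (benchmark totals are illustrative)
df = pd.DataFrame({"sex": ["M", "F", "M", "F"],
                   "own": ["Y", "Y", "N", "N"],
                   "design_wt": [1.0, 1.0, 1.0, 1.0]})
margins = {"sex": {"M": 1.9, "F": 2.1}, "own": {"Y": 2.6, "N": 1.4}}
w = rake(df, "design_wt", margins)
```

In practice the procedure would start from the nonresponse-adjusted design weights, rake over all of the benchmark dimensions simultaneously, and then trim and rescale the result as described above.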
B.2. Procedures for the Collection of Information
Once the sample for this study is randomly selected, sampled panelists will receive an e-mail invitation to participate in the study. The invitation will contain a unique survey link specific to that panelist. After clicking the link, panelists will be directed to the online instrument. On the first screen, panelists will be provided with informed consent information and asked whether they would like to proceed with the survey (Appendix B). Panelists who decline will be categorized as nonrespondents. Panelists who accept will be asked four questions to determine eligibility. Selected panel members will be eligible to complete the survey if they
are adults (18+);
have at least one child (0 to 17 years) living in the household;
cook and prepare meals at home at least four times a week; and
have prepared a meal with meat or poultry within the past 30 days.
Panelists who are deemed not eligible to complete the survey will be categorized as ineligible.
We estimate that it will take respondents no more than 20 minutes to complete the online survey (Appendix A provides a copy of the survey questionnaire). Data collection will take 2 to 3 weeks to complete.
Based on experience conducting 20-minute online surveys with general population samples (i.e., adults 18 and older), we estimate that about 55% of the randomly selected panelists will be eligible and complete the online survey. A maximum of three automatic e-mail reminders will be sent to nonresponding panelists during data collection. Panelists who do not complete the survey will be categorized as nonrespondents.
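The arithmetic behind the 4,400-member sample can be checked directly from the figures stated above (target of 2,400 completes at an expected 55% eligible-and-complete rate):

```python
import math

target_completes = 2_400   # required completed surveys
completion_rate = 0.55     # expected eligible-and-complete rate

# Minimum invitations needed to hit the target at the expected rate
invites_needed = math.ceil(target_completes / completion_rate)  # 4,364

# The study invites 4,400, giving an expected yield with a small cushion
expected_yield = 4_400 * completion_rate  # about 2,420 completes
```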
To maximize participation, the study team conducted cognitive interviews and pretests to help improve the understandability and usability of the questionnaire, reduce participant burden, and enhance administration.
In addition, to encourage participation, each e-mail invitation and reminder will state the study purpose, identify USDA as the study sponsor, and provide panelists with the e-mail address and toll-free number for the RTI team lead so they can obtain additional information about the study or verify its authenticity. The contractor will monitor all phases of sampling and data collection and resolve any problems immediately throughout the course of the study.
Nonresponse bias
Because we anticipate that the response rate will be less than 80%, a nonresponse bias analysis will be conducted to evaluate the potential for bias due to nonresponse. Nonresponse may cause bias in survey estimates if sample members who chose not to respond would have provided answers that differ systematically from answers provided by sample members who chose to respond.
The demographic characteristics (i.e., age, gender, race/ethnicity, educational attainment) of respondents and nonrespondents among all sampled panel members will be compared with U.S. Census benchmarks, and the bias due to nonresponse will be estimated. This analysis will be conducted using the final weights. The results of these analyses will be reported alongside summary statistics and other information resulting from this proposed data collection.
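One form this comparison can take is sketched below: the weighted demographic distribution of respondents is compared with a benchmark distribution, and the differences (in percentage points) serve as a simple indicator of nonresponse bias. The weights, age groups, and benchmark shares here are hypothetical, not actual Census figures.

```python
import pandas as pd

def nonresponse_bias(sample, benchmark):
    """Compare the weighted demographic distribution of respondents with a
    benchmark distribution; return the gap in percentage points."""
    resp = sample[sample["responded"]]
    dist = resp.groupby("age_group")["weight"].sum()
    dist = dist / dist.sum() * 100           # weighted % among respondents
    bench = pd.Series(benchmark)             # benchmark % (e.g., Census)
    return (dist - bench).rename("bias_pct_points")

# Illustrative sample file: one row per sampled panelist
sample = pd.DataFrame({
    "age_group": ["18-29", "30-44", "45-59", "60+"] * 2,
    "responded": [True] * 4 + [False] * 4,
    "weight": [0.8, 1.1, 1.0, 1.3, 1.2, 0.9, 1.0, 0.7],
})
benchmark = {"18-29": 21.0, "30-44": 25.0, "45-59": 25.0, "60+": 29.0}
bias = nonresponse_bias(sample, benchmark)
```

Large positive or negative entries would flag demographic groups that are over- or underrepresented among respondents after weighting.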
RTI conducted cognitive interviews with nine adults to identify any survey questions and response items that were confusing or difficult to understand. Based on the cognitive interview findings, we refined the programmed instrument. Specifically, to help minimize respondent burden, we removed or revised several questions that were not being answered as intended (e.g., foodborne illness outbreak questions) or that were not a priority for this year’s research (e.g., Food Safe Families logo). Additionally, we tested two versions of the same question to see which version respondents preferred and used the version that was easier for respondents to answer. To improve understanding and readability, we reformatted and simplified the instructions for some questions and, in some cases, added an introductory question with skip logic. The cognitive interviews confirmed the estimated average burden of 20 minutes per response.
To ensure that the programming logic, sample distribution and fulfillment, and data compilation are functioning correctly, we will conduct a pretest with a total of 80 randomly selected panelists. If necessary, the online instrument will be refined based on the pretest findings. Data collection for the pretest will not commence until OMB approval is obtained. If changes are made to the survey instrument based on the pretest findings, FSIS will submit a revised survey instrument to OMB for review and approval.
Sheryl Cates is the RTI Project Director and will manage the study and will work with Jenna Brophy to manage the data collection activities. Christopher Bernstein, an FSIS employee, will review the results of the web survey.
References:
Tourangeau, R., Conrad, F. G., & Couper, M. P. (2013). The science of web surveys. New York, NY: Oxford University Press.
1 The KnowledgePanel was formerly owned and maintained by GfK.