NOTE TO THE REVIEWER OF: Request for Information Collection under ERS Generic Clearance OMB Control No. 0536-0073, Exp. 4/30/2025

FROM: Sandra Hoffmann, USDA Economic Research Service

SUBJECT: Submission of Materials for “Foodborne Illness Valuation Research”
1. Justification
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
Need for this information collection
This Information Collection request under ERS’ Generic Clearance is to pretest a survey. The survey is designed to improve estimates of the value of health improvements from food safety regulation in the United States. The original survey was developed and administered in the United Kingdom and Australia. Pretesting is needed to adapt the survey for use in the United States.
Need for the survey
The USDA’s Economic Research Service (ERS) is responsible for providing timely research and analysis to public and private decision makers on topics related to agriculture, food, the environment, and rural America. In this role it develops research relevant to U.S. Department of Agriculture (USDA) programs. The USDA Food Safety and Inspection Service (FSIS) is responsible for developing and enforcing federal regulation to help ensure the safety of meat, poultry, egg products and catfish in the U.S. Most safety issues involve foodborne pathogens.
FSIS uses cost-benefit analysis to evaluate the efficacy of safety regulations in protecting human health. One tool for this evaluation has been ERS research findings on the cost of foodborne illnesses.
Office of Management and Budget guidance directs agencies to value regulatory benefits using estimates of the public’s willingness to pay (WTP) for them.1 This guidance is consistent with standard theory and practice in the field of public economics.2
Willingness to pay to reduce risk of illness is the sum of:
medical expenditures needed to treat resulting illness,
lost wages and the value of other time spent ill instead of engaging in normal activities,
willingness to pay to avoid pain and suffering from the illness,
willingness to pay to reduce risk of death,
expenditures individuals currently make to reduce their own health risk.
(Harrington and Portney 1987).3
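As a purely illustrative sketch, the decomposition above can be written as a simple sum. The dollar figures below are hypothetical placeholders for exposition only, not ERS estimates:

```python
# Hypothetical per-case components of willingness to pay (WTP) to reduce
# risk of a foodborne illness -- illustrative figures only, not ERS data.
components = {
    "medical_expenditures": 850.0,    # cost of treating the resulting illness
    "lost_time_value": 600.0,         # lost wages and value of time spent ill
    "pain_and_suffering_wtp": 400.0,  # WTP to avoid pain and suffering
    "mortality_risk_wtp": 1200.0,     # WTP to reduce risk of death
    "averting_expenditures": 150.0,   # current spending to reduce own risk
}

# Total WTP is the sum of all five components.
total_wtp = sum(components.values())
print(f"Total WTP per case (hypothetical): ${total_wtp:,.2f}")
```

The current ERS estimates described below cover three of these five terms; the survey being pretested supplies the pain-and-suffering term.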
Current ERS cost of foodborne illness estimates4 include all the above elements except “willingness to pay to avoid pain and suffering from the illness” and “expenditures individuals currently make to reduce risk of foodborne illness.”5 FSIS uses the current ERS estimates in its regulatory impact analyses. The survey being pretested is designed to improve the accuracy of estimates of WTP to reduce risk of foodborne illness by providing estimates of WTP to avoid pain and suffering from illness.
The purpose of the survey being pretested is to provide estimates of the U.S. population’s “willingness to pay to avoid the pain and suffering from foodborne illnesses.” This survey will allow FSIS to provide a more accurate assessment of the American public’s willingness to pay for the benefits of FSIS food safety regulations.
ERS is not alone in omitting “willingness to pay to avoid pain and suffering from illness”; the research needed to estimate this component of willingness to pay to reduce risk of illness simply has not been done.6 The survey we are adapting is designed to help fill that gap. It is designed to estimate individuals’ “willingness to pay to avoid or reduce risk of pain and suffering” from both acute foodborne illnesses and their long-term/chronic outcomes. It also characterizes the impact of foodborne disease using the EQ-5D, a widely used generic scale for characterizing health outcomes. This will allow results from this study to be applied to newly emerging foodborne illnesses and other illnesses.
Authority: Why USDA ERS should conduct the survey
The Secretary of Agriculture is charged with the collection of statistics and other appropriate means within his power (7 USC Sec. 2204). This authority is delegated to the Under Secretary for Research, Education and Economics (7 CFR Sec. 2.21). As Chief Scientist of the USDA, the Under Secretary for Research, Education and Economics is directed to “coordinate the research programs and activities of the Department,” which specifically includes research on “food safety, nutrition, and health” (7 USC 6971). The USDA Economic Research Service is responsible for collecting data and conducting research related to USDA mission areas including food and food safety (7 CFR Sec. 2.21(a)(8)).7
Why use a survey and not market data?
There are two possible methods for estimating willingness to pay for any outcome. One is econometric analysis of data from market transactions. The other is to use surveys that elicit willingness to pay either directly, from respondents’ statements about what they would be willing to pay, or indirectly, from stated choices that can then be used to estimate willingness to pay.
Food producers do not market food based on microbial food safety. There is a tacit agreement among marketers that to do so would only raise questions in consumers’ minds about the safety of products. They typically focus instead on meeting or exceeding food safety standards. As a result, there is no market data that can be used to estimate consumer willingness to pay to reduce risk of foodborne infections.
In the absence of such market data, researchers have used choice experiment surveys, such as that to be pretested in the proposed data collection.8 Choice experiments have an advantage over other stated preference methods in using choices between alternative products, rather than asking a respondent to accept or reject a proposed price for a single product. This more closely mimics decisions commonly made between alternatives in markets.9
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the use the agency has made of the information received from the current or existing data collection.
This is a Generic Clearance request to pretest an existing survey instrument. The pretesting will use cognitive interviews, focus groups, and pilot tests of the survey to help assure that the survey can be clearly understood by a wide range of Americans and that the survey results provide statistically valid estimates. Cognitive interviews and focus groups will be used to improve the clarity of the survey instrument, reduce its burden, and identify initial prices/costs for the choices about avoiding or reducing risk of pain and suffering from foodborne illness presented in the survey. Pilot tests of the survey will be used to finalize the prices/costs and to assure that the programmed survey is functioning as intended. This approach follows standard practice for survey development.10
The results of this pretesting will only be available to the research team for the purposes of improving the survey instrument. The research team is made up of Drs. Sandra Hoffmann and Kar Lim of the USDA Economic Research Service, and Dr. Alan Krupnick of Resources for the Future.
Goals for Pretesting
The overall goals of this pretesting are to adapt the survey instrument to U.S. conditions, improve its clarity, reduce its burden, and set final price/cost levels. To achieve these goals, the following objectives will be met.
Adapt substantive materials that are U.K.-specific, e.g., those reflecting differences in medical care systems or income levels, to the U.S. context, and pretest to make certain the new language is clear.
Adapt British English usage to American English, and pretest to make certain new language is clear.
Assess whether respondents understand the questions and information treatments, e.g., the disease outcome descriptions, as intended and refine them until they do.
Take steps to determine whether survey responses are meeting internal and external validity criteria needed to assure that the choice experiment survey is credibly measuring willingness-to-pay.
Redesign the chronic illness choices to be choices about health outcomes that occur with a particular probability rather than with certainty. Pretest these changes to be sure they are clear to respondents.
The UK survey asks respondents to assume that they will face chronic health outcomes with certainty as a result of getting a foodborne infection. Since these outcomes are rare, though serious, it is more realistic to tell respondents that they face a risk of the chronic outcomes after a foodborne infection. This redesign will make the survey represent foodborne illnesses more accurately, but it needs pretesting. The redesign builds on a long history of stated preference surveys estimating WTP to reduce risk of disease together with the lessons these efforts have taught about how to communicate probability and risk to a broad audience.
Continue to simplify the survey, as possible, to reduce respondent burden.
Pilot test the survey to determine price/cost ranges that appropriately capture the range of respondent preferences.
Ensure that the programmed survey is functioning as intended.
Drafts of the acute and chronic versions of the survey reflecting preliminary adaptations to the UK surveys are attached as Appendices 1 and 2.
Pretesting Process:
Survey development methods recommend using a combination of cognitive interviews and focus groups to refine the survey and pilot testing to finalize questions and test that the survey is functioning as expected.11 Several Federal statistical agencies, as well as private research organizations, including Research Triangle Institute (RTI) and NORC at the University of Chicago, routinely use cognitive interviewing to refine survey questionnaires.12
Cognitive Interviews and Focus Groups:
Cognitive interviews and focus groups are qualitative research methods that explicitly focus on the cognitive processes that subjects use to answer survey questions. Due to the small number of subjects involved and the type of information gathered, it is the interviewer’s “clinical judgment,” rather than quantitative analysis, that must determine when a question is clearly communicating the intended meaning.13 The method is also used to make certain that the survey itself has a flow and logic that is intelligible and easy for survey respondents to comprehend. “The capacity of the interviewing and questionnaire design staff for applying judgment, adjustment, and subjective corrections is basic to the practice of cognitive interviewing.”14 All members of the research team are experienced in the development and pretesting of stated preference/choice experiment surveys.15
We use pretesting to refer to the use of focus groups and cognitive interviewing to refine the wording and logical flow of the survey. We will use both cognitive interviewing and focus groups in this pretesting. Cognitive interviews involve two major methods: “think-aloud” interviewing, and “verbal probing” techniques.16 Pretesting is an iterative process whose focus shifts as the survey is improved using the information gained from the interviews and focus groups.17
This survey has been extensively tested for use in the UK. Debriefing questions, a tutorial on understanding how the survey visually presents risk levels, and the structure of choices about uncertain outcomes are all taken from other surveys that have been extensively pretested and successfully fielded.18 As a result, we anticipate that the survey will generally need less pretesting than a newly developed survey.
Focus groups.
The surveys include choice sets that describe disease outcomes of varying types, severities, and durations. Early in the process we will conduct 2 focus groups on descriptions of acute and chronic health outcomes that will be included in these choice sets. Preliminary drafts of health outcome descriptions will be based on descriptions from the original UK survey, modified to better align with U.S. disease modeling as developed for the ERS cost of foodborne illness estimates, and reviewed by foodborne disease experts from U.S. CDC. The purpose of these focus groups is to explore how clear these disease descriptions are to participants and to identify areas where improved communication is needed. We will also do preliminary exploration of price/cost levels used in the choice options. No focus group will exceed 2 hours, including a break.
Cognitive Interviews.
The first 2-3 cognitive interviewees will be asked to read through the survey talking aloud about their thoughts as they do so. The purpose of these interviews is to identify places where the survey is difficult to understand, unnecessarily burdensome, or where the logical flow of the survey is unclear.
Once areas of the survey and disease descriptions needing improvement are identified, individual questions or areas of the survey (e.g., debriefing questions) will be worked on through cognitive interviews. Appendix 4, Pretest Plan, provides more detail and interview guides for each survey question. No cognitive interview will last more than 90 minutes.
Following standard survey development practice and OMB guidance on cognitive interviewing, an iterative approach to improvement will be taken. Areas needing improvement will be identified in discussion with respondents about the survey. Brainstorming about possible alternatives will also be done with the respondents during the focus groups and cognitive interviews. Notes and individual interview reports will be maintained. Following each round of interviews, researchers will meet to analyze information that has been collected during interviewing. Researchers will examine data within interviews, across interviews (by question) and across survey sections to identify thematic patterns in question interpretations and response error.
Per OMB guidelines, “ongoing analysis of the data determines when ‘saturation’ has been reached (i.e., the point at which little new information is being collected from each participant) and, therefore, informs when interviewing may cease.” If “saturation” cannot be reached with the number of respondents requested under this ICR, we will return to OMB to request permission for additional pretesting.
As the researchers determine that interviews are reaching a point of “saturation” and multiple participants find the survey is functioning well, we will ask additional pretest participants to do “think-aloud” interviews through the entire survey. We will also ask some interviewees to take the full survey to assess how long it takes to complete. If needed, adjustments will be made to shorten the survey. Our goal is to have the survey take no more than 20-30 minutes to complete, which is typical of prior successful stated preference surveys.19
The debriefing questions that we are adding to the survey are taken from prior willingness-to-pay surveys that have been extensively pretested and fielded in the United States.20 We will conduct cognitive interviews on these debriefing questions, but given the extensive prior pretesting, we do not expect that they will need much refinement.
Based on prior pretesting experience, we estimate that it will require about 24 pretests to revise and finalize the acute illness instrument and about 30 pretests for the chronic instrument. All parts of the acute version have been extensively pretested in prior pretesting and fielding. Because the chronic illness version of the survey will have more changes, we anticipate that it will need more pretesting. The goal of this pretesting is to produce a clear survey that communicates ideas as intended, minimizes respondent burden, and assures responses will provide the information needed to accurately estimate respondents’ WTP.
Pilot Testing:
To finalize the survey instrument, it is also necessary to conduct pilot tests of both the acute and chronic versions of the survey. Pilot testing is conducted under conditions identical to those to be used to field the survey. It provides a quality control step and, for WTP surveys, is needed to set final cost/price levels.
Each version of the survey (acute and chronic) will be fielded on 100 respondents (200 respondents total). These respondents will be drawn from the NORC AmeriSpeak panel, described further below and in Appendix 5. Prior to pilot testing, the survey will be programmed by staff at NORC at the University of Chicago (NORC), the survey firm that we plan to have field the final survey. Pilot testing will provide an opportunity to conduct quality control checks on the programmed survey. Importantly, it also provides the statistical power needed to assure that the price/cost levels used in the disease outcome choices give respondents the range of options needed to capture demand for the health outcomes across the population.
Each pilot test will be stopped periodically to allow for preliminary analysis of results and adjustment of prices if needed. The survey will first be administered to roughly 30 respondents. Responses will be analyzed by looking at the distribution of choices made by respondents and at results from econometric regression models. For example, a large concentration of respondents opting not to choose the risk reduction would indicate that the prices for that risk reduction are set too high. Or, a positive regression coefficient on price, showing demand increasing as price increases, would be a basic violation of standard economic demand theory, indicating a need to adjust the survey. After prices are adjusted in response to these results, the survey will be administered to another 30 respondents and the results again analyzed. Prior research has shown that pilot tests on 100 respondents generally provide enough statistical power to allow the iterative adjustments needed to finalize a well-functioning survey.
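As an illustrative sketch of the simpler of the two checks described above, the following uses simulated pilot choices (a hypothetical price level and whether the respondent chose the risk reduction; all values invented) to compute the share choosing the reduction at each price and to flag any price step where that share rises, which would violate standard demand theory. A real pilot analysis would also fit an econometric choice model, which this sketch omits:

```python
# Minimal sketch of the distributional check on pilot-test responses.
# Each record is (price level in dollars, chose the risk reduction?);
# the data here are simulated for illustration only.
from collections import defaultdict

choices = [
    (5, True), (5, True), (5, False),
    (15, True), (15, False), (15, False),
    (30, True), (30, False), (30, False),
]

by_price = defaultdict(list)
for price, chose in choices:
    by_price[price].append(chose)

# Share of respondents choosing the risk reduction at each price level.
shares = {p: sum(c) / len(c) for p, c in sorted(by_price.items())}

# Demand theory: the share choosing the reduction should not rise with price.
prices = sorted(shares)
violations = [
    (lo, hi) for lo, hi in zip(prices, prices[1:]) if shares[hi] > shares[lo]
]
print(shares)      # share choosing the reduction at each price level
print(violations)  # an empty list means no obvious demand-theory violation
```

If the share choosing the reduction collapses toward zero at every price, the prices are likely set too high; if a higher price attracts a larger share, the design needs adjustment before the next batch of respondents.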
Recruiting
Respondents
NORC will provide pretest and pilot testing subjects drawn from its probability-based AmeriSpeak Panel. NORC recruits AmeriSpeak panel participants through address-based sampling, following up with 40 percent of those who do not respond to multiple mailings through in-person, face-to-face field recruiting. This follow-up strategy secures respondents who are more likely to be low income, young, non-white, and less educated, groups that are underrepresented in most surveys. The result is a sample of Americans that reflects the U.S. Census within 1 to 2 percentage points on key demographics (Appendix 5).
For the focus groups and cognitive interviews, the number of subjects is too small to allow statistical analysis or to make representative sampling feasible. OMB guidance advises that recruitment for focus groups and cognitive interviews should be done in a “purposeful way” rather than through random sampling. Respondents should be identified based on their relationship to or experiences with the “key characteristics of the study.” The key characteristics of this study are that respondents are capable of contracting a foodborne illness and have financial responsibility for themselves. NORC will recruit adult subjects who roughly reflect the US population in terms of age, gender, race/ethnicity, and education.
In contrast, the pilot testing is conducted on a larger sample and can be statistically representative of the American population. NORC will randomly draw pilot testing subjects from the AmeriSpeak panel using the same methods that will be used to recruit respondents for the actual survey.
Use of Results
Results from both the pretesting and pilot testing will be used solely for the purpose of developing and testing the survey instrument. None of the results will be published. All information will be kept confidential using protocols and data environments that meet CIPSEA requirements.21
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.
The cognitive interviews and focus groups will be conducted remotely using Zoom, a widely used, user-friendly remote meeting software. Conducting pretests remotely will reduce respondent burden by making it possible for participants to take part without having to travel. It will also reduce the cost of conducting pretesting with participants from across the United States, better capturing regional differences in language. Finally, it will reduce respondent burden by reducing COVID safety risks to pretest participants.
NORC will recruit pretesting subjects for focus groups and cognitive interviews and will set up the Zoom calls to be used in conducting pretests (Appendix 3). The Zoom calls will be conducted on NORC’s secure web-based system. NORC will obtain the subjects’ informed consent and permission to video record sessions. The research team will not be provided personally identifying information on the subjects and will not collect such information during pretesting.
Each pretest will involve two of the research team members: one as a facilitator/interviewer and one as a note taker. All researchers will be CIPSEA agents with proper CIPSEA compliance training and all will work on ERS computers. Researchers’ notes will be stored in a secure folder on the ERS server with access limited to the researchers. This file storage system is CIPSEA compliant.
Should recordings of the pretest sessions be needed, they will be transferred from NORC to the researchers at ERS via Kiteworks. Kiteworks is an online service which facilitates the secure transfer of data between ERS staff and other collaborators. Kiteworks is FedRAMP certified, and it is the only ERS-approved secure data transfer service.
NORC will administer the pilot testing using methods that they plan to use to administer the final survey. NORC uses multi-mode data collection that enables panelists to complete the survey online or by phone. NORC’s support team provides customer service to panelists and assures their privacy and secure storage of their responses.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.
This Generic Clearance is to pretest a choice experiment survey on willingness to pay to reduce foodborne illness that was developed for use in the United Kingdom. This survey has not been administered in the United States.
Pretesting of this instrument has not been conducted in the United States. Pretesting conducted in other countries is not adequate to assure that the survey will function properly in the United States due to differences in language, social context, demographics, and economic conditions (incomes and cost of living) that affect the preventive expenditures included in the choice sets.
Through formal literature review and discussions with other researchers around the world we have found no other U.S. surveys designed to estimate willingness to pay to reduce risk of pain and suffering from foodborne illness. Such surveys have been conducted in the United Kingdom and in Australia. Public economics theory and practice advises that assessment of benefits for cost benefit analysis should be based on valuation estimates from the affected population, in this case the U.S. public.22
5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.
This is a survey of individuals. This is not a survey of small businesses or other small entities.
6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
This Generic Clearance request is to pretest and pilot test an existing survey on willingness to pay to reduce pain and suffering from foodborne illness using cognitive interviews, focus groups, and pilot tests. The purpose of this pretesting is to make certain that respondents will clearly understand the final survey instrument and to minimize the burden of taking the survey. The cognitive interviews and focus groups will also provide initial starting points for the cost attribute of the choices respondents will make, which are used to estimate willingness to pay. Pilot testing will be used to finalize the cost/price levels used in the survey and to assure that the programmed survey is functioning properly. If the pretesting and pilot testing are not conducted, the survey instrument will not perform well.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary, trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
This information collection (pretesting) does not involve any special circumstances.
8. If applicable, provide a copy and identify the date and page number of publications in the Federal Register of the agency's notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
This Generic Clearance request is for pre-testing of the survey instrument and does not require publication of a Federal Register notice.
9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
AmeriSpeak is a pre-enrolled household survey panel designed to be representative of the U.S. population that includes panelists age 13 and older. As part of their enrollment, every AmeriSpeak panelist is promised that they will be awarded points for each survey or pretest they complete. Respondents redeem these points for cash, Amazon gift codes, virtual Mastercards, or physical goods via the AmeriSpeak Panel member web portal or by calling the AmeriSpeak support toll-free telephone number. For the focus groups and cognitive interviews, 25,000 AmeriSpeak points will be awarded to each participant (equivalent to $25); 5,000 AmeriSpeak points will be awarded to each respondent completing the pilot survey (equivalent to $5).
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
The ERS Privacy and Confidentiality Review Officer has assessed this package for applicability of 5 U.S.C. § 552a and has determined that the Privacy Act applies to the information collection (Appendix 3). Resources for the Future is contracting with NORC to provide subjects for pretesting and to collect quantitative pilot testing data on the finalized survey instrument. NORC maintains PII, such as name, address, telephone numbers, and email address as part of their maintenance of the AmeriSpeak online panel. This PII was collected in order to contact individuals for participation and reimbursement and is maintained according to privacy regulations. ERS will not receive nor have access to AmeriSpeak panel members' PII. ERS will not collect PII during pretesting.
To maintain the confidentiality of the participants, any data files or recordings shared with USDA will be stripped of any PII. All data files will be transferred from NORC to the researchers via Kiteworks. Kiteworks is an online service which facilitates the secure transfer of data between ERS staff and other collaborators. Kiteworks is FedRAMP certified, and it is the only ERS-approved secure data transfer service.
Key safeguards have been put in place to assure respondents that their responses will be treated in a secure and private manner. Prior to the start of the pretests or pilot survey, NORC staff will ask prospective respondents for their informed consent (Appendix 3).
Every AmeriSpeak panelist is provided a CIPSEA approved Privacy Statement at the time of recruiting to the AmeriSpeak panel (Appendix 3). Because each panel member is asked to provide key demographic data such as age, gender, race/ethnicity, state of residence, household income, and more, the Privacy Statement also tells panel members how they can verify the accuracy of their PII and how they can request that the information be deleted or updated.
The AmeriSpeak Privacy Statement includes the following:
A promise to treat all AmeriSpeak panelists and their information with respect.
The assurance that participation in any AmeriSpeak study is completely voluntary and that panel members may choose not to answer any questions that they do not wish to answer. Furthermore, panel members may withdraw their participation in AmeriSpeak at any time.
AmeriSpeak will never try to sell the panel member anything or ask for donations.
AmeriSpeak will not share the personally identifying information with any clients unless panel members have given explicit permission to do so. Only survey responses will be shared with clients.
Personal information will never be shared with telemarketers or others who would try to sell panel members anything.
AmeriSpeak has established security measures to protect the security and confidentiality of its panel members.
Panel members control their personal information and have the right to view their personal information or ask AmeriSpeak to delete it.
The recruitment documents contain the CIPSEA pledge (Appendix 3, p. 13):
Assurance of Confidentiality: All information which would permit identification of an individual, a practice, or an establishment will be held confidential, will be used for statistical purposes only, will be used only by contractors authorized by USDA ERS to perform statistical activities only when required and with necessary controls, and will not be disclosed or released to other persons without the consent of the individual or establishment in accordance with the Confidential Information Protection and Statistical Efficiency Act (PL-107-347). By law, every employee as well as every agent has taken an oath and is subject to a jail term of up to five years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you or your company.
AmeriSpeak’s systems are CIPSEA compliant. NORC will use the NIST 800-53 framework and controls to protect Confidential Information. NORC will encrypt all data in storage and in transit.
In addition, for each pre-test or pilot survey, invitees are provided the information required for informed consent (purpose of the pretest/pilot survey, length of time, and AmeriSpeak awards to be given) during recruitment (Appendix 3).
The NORC infrastructure framework is compliant with the Federal Information Security Management Act (FISMA) to ensure that all data, operations, and assets are protected from security threats. As such, NORC follows the standards and guidelines set by the National Institute of Standards and Technology (NIST) Special Publication 800-53 rev 4 (Recommended Security Controls for Federal Information Systems and Organizations) at the moderate level and the Federal Information Processing Standards (FIPS). NORC also strictly follows the policies established in OMB Circular A-130, Management of Federal Resources, Appendix III. All personnel maintaining the systems are trained according to the policies set by each project to comply with the data security requirements and manage the usage of data, including personally identifiable information (PII).
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
We do not believe any of the questions in the survey are sensitive or involve matters that most people would view as private. In the survey we will ask about recent experience of gastroenteric symptoms (e.g., diarrhea), a common experience among Americans, as a way of reminding the respondent of what it is like to have a foodborne illness. This is standard practice in health choice experiment studies: it strengthens the validity of responses by helping respondents remember that they have likely experienced the outcome we are asking about. Because 1 in 6 Americans gets a foodborne infection each year, we think this question is unlikely to be seen as highly sensitive.
12. Provide estimates of the hour burden of the collection of information. Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated.
A) Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
Recruiting and participating in the focus groups or cognitive interviews involves three steps: reading the email invitation to the screener questionnaire, completing the screener questionnaire, and participating in the interview or focus group. NORC will invite potential respondents to take a screener survey to confirm that they have Zoom and a web camera, obtain respondent consent, and let respondents schedule themselves for the interview or focus group (Appendix 3). This typically takes 5 minutes. NORC estimates that approximately a third of invited respondents accept the invitation. NORC sends a confirmation email and two reminder emails or a text to those accepting the pretest invitation; NORC estimates these take 3 minutes to read and address. Half of those accepting a pretest invitation end up participating in the pretests. For pilot surveys there is no need for reminder emails or scheduling, and a third of those invited take the survey.
Recruiting for and participating in the pilot survey is a two-step process: reading the invitation and taking the survey. NORC estimates that it takes 5 minutes to read the invitation and decide about participating in the survey. The survey will be designed to take an average of 30 minutes to complete.
Table 3. Reporting Burden

| Steps | Sample Size | Respondents | Min/Resp | Burden Hours | Nonrespondents | Min/Resp | Burden Hours | Total Burden Hours |
|---|---|---|---|---|---|---|---|---|
| Pretest: Recruitment and participation | | | | | | | | |
| 1. Reading and responding to the invitation for interview or focus group screener survey | 420 | 140 | 5 | 12 | 280 | 5 | 23 | 35 |
| 2. Responding to confirmation and reminder emails | 140 | 70 | 3 | 3.5 | 70 | 3 | 3.5 | 7 |
| 3. Participating in the cognitive interviews | 54 | 54 | 90 | 81 | N/A | N/A | N/A | 81 |
| 4. Participating in the focus groups | 16 | 16 | 120 | 32 | N/A | N/A | N/A | 32 |
| Subtotal | N/A | N/A | N/A | 132 | 360 | N/A | 28 | 155 |
| Pilot test: Recruitment and participation | | | | | | | | |
| 1. Reading and responding to the invitation for the pilot test survey | 600 | 200 | 5 | 17 | 400 | 5 | 33 | 50 |
| 2. Completing the pilot test survey | 200 | 200 | 30 | 100 | N/A | 0 | 0 | 100 |
| Subtotal | | | N/A | 117 | N/A | N/A | 33 | 150 |
| Total | 1020 | 340 | N/A | 245 | 680 | N/A | 60 | 305 |
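The burden-hour entries above follow from multiplying the number of people by the minutes per response and converting to hours. A minimal Python sketch of this arithmetic for the pilot test rows (values taken from Table 3; the function name is illustrative, not part of any agency system):

```python
def burden_hours(people: int, minutes_each: float) -> float:
    """Total burden in hours for a group spending a given number of minutes each."""
    return people * minutes_each / 60

# Pilot test, step 1: 200 respondents and 400 nonrespondents, 5 minutes each.
resp_invite = round(burden_hours(200, 5))      # ~17 hours
nonresp_invite = round(burden_hours(400, 5))   # ~33 hours

# Pilot test, step 2: 200 respondents complete a 30-minute survey.
survey = burden_hours(200, 30)                 # 100 hours

print(resp_invite, nonresp_invite, survey)
```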
B) Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.
In December 2022, the average U.S. hourly wage rate was $32.82.23 We estimate that the pretesting and pilot testing will involve 305 hours of respondents’ time. This implies a total cost to respondents of approximately $10,010.
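The cost estimate is the total burden hours multiplied by the average wage; a short sketch of the calculation (values from item 12 and Table 3):

```python
wage = 32.82       # average U.S. hourly wage, December 2022 (BLS Table B-3)
total_hours = 305  # total respondent burden hours from Table 3

cost = wage * total_hours
print(f"${cost:,.2f}")  # prints "$10,010.10", i.e. approximately $10,010
```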
13. Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information, (do not include the cost of any hour burden shown in items 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.
Respondents have no capital or start-up costs or operation, maintenance or purchase of services associated with the pretests or pilot tests we are requesting.
The only equipment used by researchers in these pretests are computers that have already been purchased by ERS and NORC for general purposes.
14. Provide estimates of annualized cost to the Federal government. Provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.
The cost to the federal government to conduct these pretests (cognitive interviews and focus groups) is approximately $120,000. We estimate that 2 federal researchers will spend about 40% of their time for 4 months on this pretesting, equating to roughly $37,000. Dr. Krupnick of Resources for the Future will also work on this pretesting at a cost of $21,000 under a cooperative agreement with ERS. Resources for the Future has contracted with NORC to support pretesting and pilot testing by providing subjects, a web-based interviewing environment, conducting pilot testing, and managing data security at a cost of $62,000.
15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-1.
There are no program changes or adjustments that needed to be reported in Items 13 or 14 of the OMB Form 83-1.
16. For collections of information whose results are planned to be published, outline plans for tabulation and publication.
This Generic Clearance request is for pre-testing of the survey instrument (cognitive interviews, focus groups, and pilot testing of the survey instrument). Nothing will be published from this pretesting.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
We are not seeking approval to not display the expiration date for OMB approval of the information collection.
18. Explain each exception to the certification statement identified in Item 19 "Certification for Paperwork Reduction Act."
A. Certification Statement
The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-1.
1 Office of Management and Budget. Circular A-4 September 17, 2003.
2 Boardman, A. E., Greenberg, D. H., Vining, A. R., & Weimer, D. L. (2017). Cost-benefit analysis: concepts and practice. Cambridge University Press.
3 Harrington, Winston, and Paul R. Portney. "Valuing the benefits of health and safety regulation." Journal of Urban Economics 22, no. 1 (1987): 101-112.
5 USDA Economic Research Service, Cost Estimates of Foodborne Illnesses, accessed at ERS.usda.gov/data-products/cost-estimates-of-foodborne-illnesses/ Jan. 7, 2023.
6 Cameron, T. "Valuing morbidity in environmental benefit-cost analysis." Annu. Rev. Resource. Econ. 6, no. 1 (2014): 249-272.
7 7 CFR Sec. 2.21(a)(8)(ii) Delegations of authority to the Under Secretary for Research, Education, and Economics
(8) Related to economic research and statistical reporting.
(ii) Conduct economic and social science research and analyses relating to: (A) Food and agriculture situation and outlook; (B) production, marketing, and distribution of food … including studies of the performance of the food and agricultural sector of the economy in meeting needs and wants of consumers
(C) basic and long-range, worldwide, economic analyses and research on supply, demand, and trade in food, and the effects on the U.S. food and agricultural system.
…
(iv) improve statistics in the Department; maintain liaison with OMB and other Federal agencies for coordination of statistical methods and techniques
8 List, John A., Paramita Sinha, and Michael H. Taylor. "Using choice experiments to value non-market goods and services: evidence from field experiments." The BE Journal of Economic Analysis & Policy 6, no. 2 (2006).
9 Kriström, B., Johansson, P.O. and Wohl, E., 2015. Economic valuation methods for non-market goods or services. Environmental Science, pp.9780199363445-0044.
10 Fowler, Floyd. 2014. Survey Research Methods. 5th Ed. Sage Publications. Thousand Oaks, California.
11 Fowler, Floyd. 2014. Survey Research Methods. 5th Ed. Sage Publications. Thousand Oaks, California.
12 Willis, Gordon. 2005. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Sage Publications. Thousand Oaks, California. Personal communication with Michael Dennis, NORC at University of Chicago Feb. 6, 2023.
13 Willis, Gordon. 2005. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Sage Publications. Thousand Oaks, California. Personal communication with Michael Dennis, NORC at University of Chicago Feb. 6, 2023.
14 Willis, Gordon, 1999. Cognitive Interviewing: A “How to” Guide. Short Course presented at the 1999 Meeting of the American Statistical Association.
15 Including, but not limited to: Hoffmann, Sandra, Ping Qin, Alan Krupnick, Burmaajav Badrakh, Suvd Batbaatar, Enkhjargal Altangerel, and Lodoysamba Sereeter. "The willingness to pay for mortality risk reductions in Mongolia." Resource and Energy Economics 34, no. 4 (2012): 493-513; Alberini, Anna, Maureen Cropper, Alan Krupnick, and Nathalie B. Simon. "Does the value of a statistical life vary with age and health status? Evidence from the US and Canada." Journal of Environmental Economics and Management 48, no. 1 (2004): 769-792.; Adamowicz, Wiktor, Diane Dupont, Alan Krupnick, and Jing Zhang. "Valuation of cancer and microbial disease risk reductions in municipal drinking water: An analysis of risk context using multiple valuation methods." Journal of Environmental Economics and Management 61, no. 2 (2011): 213-226.; Itaoka, Kenshi, Alan Krupnick, Makoto Akai, Anna Alberini, Maureen Cropper, and Nathalie Simon. "Age, health, and the willingness to pay for mortality risk reductions: a contingent valuation survey of Shizuoka, Japan, residents." Environmental economics and policy studies 8 (2007): 211-237.
16 Willis, Gordon. 2005. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Sage Publications. Thousand Oaks, California. Personal communication with Michael Dennis, NORC at University of Chicago Feb. 6, 2023.
17 Willis, Gordon. 2005. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Sage Publications. Thousand Oaks, California.
18 Willis, Gordon. 2005. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Sage Publications. Thousand Oaks, California.
19 Hoffmann, Sandra, Alan Krupnick, and Ping Qin. "Building a set of internationally comparable value of statistical life studies: estimates of Chinese willingness to pay to reduce mortality risk." Journal of Benefit-Cost Analysis 8, no. 2 (2017): 251-289; Hoffmann, Sandra, Ping Qin, Alan Krupnick, Burmaajav Badrakh, Suvd Batbaatar, Enkhjargal Altangerel, and Lodoysamba Sereeter. "The willingness to pay for mortality risk reductions in Mongolia." Resource and Energy Economics 34, no. 4 (2012): 493-513; Alberini, Anna, Maureen Cropper, Alan Krupnick, and Nathalie B. Simon. "Does the value of a statistical life vary with age and health status? Evidence from the US and Canada." Journal of Environmental Economics and Management 48, no. 1 (2004): 769-792; Organization of Economic Cooperation and Development, Surveys on Willingness-to-Pay to Avoid Negative Chemicals-Related Health Impacts. Accessed at oecd.org/chemicalsafety/costs-benefits-chemicals-regulation.htm on Jan. 11, 2023.
20 Organization of Economic Cooperation and Development, Surveys on Willingness-to-Pay to Avoid Negative Chemicals-Related Health Impacts. Accessed at oecd.org/chemicalsafety/costs-benefits-chemicals-regulation.htm on Jan. 11, 2023.
21 The following assurance of confidentiality language will be used. “Assurance of Confidentiality: The information you provide will be used for statistical purposes only. Your response will be kept confidential and any person who willfully discloses ANY identifiable information about you or your operation is subject to a jail term, a fine, or both. This survey is conducted in accordance with the Confidential Information Protection and Statistical Efficiency Act of 2018, Title III of Pub. L. No. 115-435, codified in 44 U.S.C. Ch. 35 and other applicable Federal laws.”
22 Boardman, A. et al. (2017). Cost-benefit Analysis Concepts and Practice. Cambridge University Press.
23 U.S. Bureau of Labor Statistics, Establishment Data, Table B-3 Average hourly and weekly earnings of all employees on private nonfarm payrolls by industry sector, seasonally adjusted. https://www.bls.gov/news.release/empsit.t19.htm