Supporting Statement - Part A (revised)


Data Collection on Public Understanding of Science

OMB: 3145-0231


Supporting Statement

Request for Clearance for Data Collection on Public Understanding of Science

A. JUSTIFICATION

A1. Necessity of Information

This request is for clearance to conduct experimentation with survey questions measuring factual knowledge of science, using an Internet panel of survey respondents.


The National Science Foundation reports information about public understanding of science in Science and Engineering Indicators (SEI), a publication of the National Science Board (NSB). The quality of the reported information is dependent on the validity of the knowledge questions as measures of basic factual knowledge of science, the wording of the questions, and the question format.


NSB members have raised concerns about two true-false survey items that have been part of the SEI basic science knowledge scale for many years: “Human beings, as we know them today, developed from earlier species of animals” and “The universe began with a huge explosion.” One concern, particularly with the first item, is that it may confound religious belief with understanding of the relevant science. For example, some respondents with certain religious beliefs would answer “false” or “don’t know” to these statements even if they are aware of the scientific consensus about evolution. A second concern, mainly with the second item, is that it presents a seriously flawed summary of relevant aspects of the accepted scientific account of the origins of the universe.


In response to these concerns, NSF’s Directorate for Social, Behavioral and Economic Sciences (SBE) held two workshops to consider the conceptualization and measurement of adult science knowledge. In outlining a revised framework for indicators of public understanding of science, the workshops urged further research on public understanding of the various dimensions of knowledge and acceptance of evolution and inquiry into how well the public understands the norms and practices characteristic of science as an institution.


The workshops also encouraged research on the effects of question format on the measurement of science knowledge. One format issue that may affect the measurement of science knowledge is the extent to which survey respondents are encouraged or discouraged from giving “don’t know” responses. On the one hand, it is argued that “don’t know” responses conceal partial knowledge and that the presence of an explicit “don’t know” option leads to an underestimate of knowledge levels (Krosnick, 2002; Mondak and Davis, 2001). On the other hand, it is argued that omitting a “don’t know” option encourages guessing by respondents with inadequate levels of knowledge (Sturgis, Allum, and Smith, 2008). Although research on the effects of a “don’t know” option has been conducted in the area of political knowledge, similar research has not been undertaken for science knowledge.


NSF factual science knowledge questions have always been asked in true-false format, which may encourage guessing. An alternative that may yield more accurate measurement is a forced-choice format in which the respondent is presented with two or more plausible response options (Harris and Changas, 1994).


In light of the workshop recommendations, NSF is undertaking exploratory and experimental work centered on public scientific knowledge related to the topics covered by the two survey items. The primary purpose of this work is to aid in interpreting responses to the existing items that are used in SEI to track trends in factual knowledge. NSF expects to use these data as the basis of an essentially methodological sidebar1 in SEI 2016. In a few paragraphs, the sidebar would describe the research experiments, explain their implications for the interpretation of survey data on factual knowledge of science, show (we expect) how apparently small variations in question wording and content can produce sizable differences in measures of public knowledge of evolution, and, if warranted, suggest promising directions for future indicators development in this area. NSF envisions that this research may, in the long run, contribute to developing alternate items that might replace the current ones, but this project is not primarily designed to produce new survey questions for administration beyond the experiments performed or for continued use. NSF also expects to make the data from the survey experiments publicly available for analysis and publication in scientific journals.


NCSES, NSF’s National Center for Science and Engineering Statistics, has contracted with the survey research firm Westat to design survey-based experiments that address the research questions listed below.


  • To what degree and in what ways do responses to questions testing knowledge of human evolution differ from responses to questions about the theory of evolution that do not involve specific reference to humans (e.g., questions about natural selection, questions about inheritance processes)?

  • How is respondents’ belief in/acceptance of the theory of evolution related to their likelihood of correctly responding to questions at different levels of difficulty that test knowledge of/familiarity with tenets of the theory of evolution?

  • For questions about the origins of the universe, to what degree and in what way do question variants (e.g., mention of the age of the universe; focus on a continuous process of expansion and cooling since the universe originated, rather than on an originating event such as a “big bang” or “huge explosion”) affect response patterns?

  • Are there question format variations that demonstrably affect substantive results or measurement quality for the two true-false items in question and for other true-false science knowledge items?











A2. Needs and Uses

The primary immediate use of the data will be as the basis of the essentially methodological sidebar in SEI 2016 described in Section A1: a few paragraphs describing the research experiments, explaining their implications for the interpretation of survey data on factual knowledge of science, and, if warranted, suggesting promising directions for future indicators development in this area. Findings may also be presented in professional publications and presentations.


In addition, the NSF Office of Legislative and Public Affairs may use the data in preparing speeches and testimony for the NSF Director and Deputy Director. The NSB and its staff may use the data for similar purposes.


SEI is used extensively in the U.S. science and technology policy community. Users include policy makers and program managers in the Executive Branch; Congressional representatives and their staff; state and local government officials; researchers and administrators in industry and academia; and representatives of professional and trade associations. The report is also used by international organizations and by researchers in other countries.


A3. Use of Technology


We will be collecting data via the Internet from a probability sample representative of the U.S. population. Internet administration lets respondents choose when to answer questions, thereby reducing burden, while also reducing cost and increasing the speed of survey administration relative to paper-and-pencil collection. Internet surveys further minimize the time burden on respondents by automating skip patterns in the questionnaire and simplifying data entry. Previous work suggests that Web administration of these types of basic knowledge questions can yield results at least as valid as those obtained by telephone (Fricker et al., 2005).


A4. Efforts to Identify Duplication

Data produced by the NSF survey are unduplicated by any other source, and no other data can be used or modified to achieve the same purpose. NSF staff continually track research on public attitudes toward and understanding of science and technology and are aware of no other statistically valid data that could substitute for the information in the NSF time series data. Although other organizations (all outside the government) have performed survey work in this area, the NSF time series remains unique in its content, continuity over time, and statistical validity.

A5. Impact on Small Businesses

This project does not involve small businesses.



A6. Consequences of Less Frequent Data Collection

This is a one-time collection.

A7. Special Circumstances Influencing Collection

No special circumstances to note.

A8. Federal Register Publication and Outside Consultation

The research under this clearance is consistent with the guidelines in 5 CFR 1320.6. The 60-day notice was published in the Federal Register on June 26, 2013 (78 FR 38410). No comments were received in response to the 60-day notice. The 30-day notice was published on or about September 4, 2013.

A9. Payment or Gift to Respondents

The survey firm Knowledge Networks, which maintains the KnowledgePanel (described below), operates a panel relations program to encourage participation and member loyalty. As part of its program, members can enter special raffles or be entered into special sweepstakes with both cash and other prizes to be won. No incentives will be provided to the survey respondents that are specifically targeted to encourage them to complete this project.


A10. Assurance of Confidentiality


Confidentiality procedures follow the standard Knowledge Networks protocol for the KnowledgePanel. Informed consent is obtained from participants at the time they are recruited to the Internet panel. The information given to KnowledgePanel participants during the recruitment process is shown in Appendix A. Respondents are informed about the privacy and confidentiality of their survey responses, and that their participation is voluntary. Consent to receive survey invitations from KnowledgePanel is obtained during the recruitment process, when respondents provide their email addresses or shipping addresses to receive a computer. They are then asked to complete the “Core Profile Survey,” which collects basic demographic information. Respondents receive invitations to client surveys only after completing the “Core Profile Survey.” The data file provided to NSF will contain no personal identifiers.


A11. Justification of Sensitive Questions

The collection of some information about respondents’ religious beliefs is required for this project, which focuses on the interaction between religious belief and understanding of the relevant science. A concern with items such as “Human beings, as we know them today, developed from earlier species of animals” is that the item may confound religious belief with understanding of the relevant science. For example, some respondents with certain religious beliefs would answer “false” or “don’t know” to these statements even if they are aware of the scientific consensus about evolution; that is, they object to this explanation of human development on religious grounds. It is therefore important to measure aspects of religious belief in order to assess whether alternative question wordings succeed in separating religious belief from knowledge of the relevant science.

A12. Estimate of Hour Burden

NSF estimates that, on average, 15 minutes per respondent will be required to complete the survey. The annual respondent burden for completing the survey is therefore estimated at 875 hours, based on 3,500 respondents.
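The burden figure reduces to simple arithmetic; a minimal sketch making the calculation explicit (all numbers are taken from the estimate above):

```python
# Burden-hour arithmetic for A12: respondents × minutes per response ÷ 60.
respondents = 3_500        # total survey respondents
minutes_per_response = 15  # average completion time, in minutes

burden_hours = respondents * minutes_per_response / 60
print(burden_hours)  # → 875.0
```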

A13. Estimate of Respondent Cost Burden

The expected burden cost associated with the estimated hours is $21,009 and is based on the average hourly earnings of $24.01 per hour for private nonfarm payrolls.2
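The cost-burden figure follows from the A12 burden hours and the BLS hourly earnings rate cited above; a minimal sketch of the calculation (the rounding to whole dollars is an assumption consistent with the stated total):

```python
# Respondent cost burden for A13: burden hours × average hourly earnings.
burden_hours = 875       # from A12
hourly_earnings = 24.01  # BLS average hourly earnings, private nonfarm payrolls

cost = round(burden_hours * hourly_earnings)  # 21,008.75 → nearest dollar
print(cost)  # → 21009
```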

A14. Estimated Cost to Federal Government

The estimated cost to the Federal Government consists of two components: contract costs for survey development and implementation, totaling $204,528, and NSF staff costs for monitoring performance under the contract.

1. Costs associated with survey development and implementation

  • Survey planning, question design, experimental design $ 20,550

  • Data collection $123,228

  • Data analysis, delivery and project summary reporting $ 60,750

Total: $204,528

2. We estimate it will cost NSF $33,000 in staff time and associated overhead costs to monitor performance under the contract.
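The component figures above can be checked with simple arithmetic; a minimal sketch (the combined figure in the last line is not stated in the text and is shown only as the sum of the two components):

```python
# Government cost components for A14.
development = {
    "Survey planning, question design, experimental design": 20_550,
    "Data collection": 123_228,
    "Data analysis, delivery and project summary reporting": 60_750,
}
contract_total = sum(development.values())
monitoring = 33_000  # NSF staff time and associated overhead

print(contract_total)               # → 204528, matching the stated contract total
print(contract_total + monitoring)  # → 237528 (sum of the two components)
```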

A15. Reasons for Change in Burden

Not applicable.

A16. Plans for Publication

As described in Sections A1 and A2, NSF plans to use these data as the basis of an essentially methodological sidebar in SEI 2016 describing the research experiments, their implications for the interpretation of survey data on factual knowledge of science, and, if warranted, promising directions for future indicators development in this area. Findings will also be presented in professional publications and presentations.

A17. Expiration Date Approval

The OMB Control Number and any other required language will be presented to respondents at the introduction to the survey on the website.

A18. Exceptions to the Certification Statement

There are no exceptions to the certification statement.



1 “Sidebars discuss interesting recent developments in the field, more speculative information than is presented in regular chapter text, or other special topics.” (SEI, p. xiii)

2 June 2013 Economic news release national occupational employment and wage estimate for average hourly earnings of all employees on private nonfarm payrolls, by industry sector, seasonally adjusted (Source: http://www.bls.gov/oes/current/oes259099.htm)

File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Plimpton, Suzanne H.
File Modified: 0000-00-00
File Created: 2021-01-28
