PRA Application Supporting Statement


Generic Clearance for Decision Science Data Collections

OMB CONTROL NO. 0693-0089

Expiration Date 9/30/2024


“Public Perceptions of Resilience, Climate Adaptation, Sustainability, and Equity (RASE) as Planning Concepts and Goals”



1. Explain who will be surveyed and why the group is appropriate to survey, describing the specific goals and purposes of the study as well as the specific research questions that the study will address. Describe whether this study will be used strictly as feedback for internal programmatic use only, or whether it will provide performance measures for Congress or OMB, inform policy, inform agency rulemaking, or be published as an agency report or a report to Congress. Include a discussion of the strengths and weaknesses of the proposed design and its suitability for the intended uses.


Currently, communities in the US are striving to improve their hazard and climate change preparedness, environmental sustainability, and social equity, alongside addressing other pressing issues. Community planning methods to achieve these goals are described in community plans, often using key terms such as “resilience”, “climate adaptation”, “sustainability”, and/or “equity” (hereafter called “RASE”). These terms encompass a variety of somewhat complementary goals and approaches to community planning, which in themselves have diverse and evolving meanings in the academic and practitioner communities (Moser et al. 2019; Woodruff et al. 2018; Malecha, Clavin, and Walpole in review). However, less is currently known about how the public perceives these terms as they relate to planning, and how these perceptions and other factors may influence public support of such efforts.


NIST’s Community Resilience Program develops guidance, tools, and methods that are employed by professionals to enhance the built environment, the success of which relies in part upon public support for and adoption of recommended actions resulting from these tools. Previous work has found that the public associates certain concepts and goals with terms such as community resilience and climate adaptation planning, which influences levels of planning support (Meerow and Neuner 2021), perceptions of planning efficacy, and motivation to personally prepare for changing conditions (Wong-Parodi et al. 2015). More information on underlying perceptions of RASE planning approaches and how they intersect could inform better methods of integrating and communicating about these planning topics. Specifically, it could identify concepts that already overlap in the minds of the public and underlie support (which could be emphasized in community plans and other communications), versus sources of conceptual distinction or lack of support (which will likely require improved communication when presenting integrated plans).


This online survey will use four randomized groups to answer the following research questions:


  1. How do members of the public perceive the concepts and goals of planning efforts when they are associated with the terms resilience, climate adaptation, sustainability, or equity? Which concepts or goals overlap across these four terms, and which are specific to certain terms?

  2. How do factors such as demographics, geographic location, perceived efficacy, and hazards and other issues of concern predict support for RASE planning efforts overall?

  3. How do perceptions of the efficacy of, and support for, RASE efforts differ based on the terminology used alone?


As members of the public of all backgrounds, ages, and incomes can influence planning processes, it is important that our survey be answered by a representative portion of the US population who provide complete and accurate responses. For this reason, we have contracted with a professional sampling service (Qualtrics) to host the survey and provide a sample of 1,000 adult (18+ year old) participants curated to be demographically representative of the US population, approximating the most recent available Census data in terms of age, gender, income, region, and race. The contractor will also be responsible for checking the validity, completeness, and quality of participant responses. As a result, we are confident that we will be able to obtain high-quality data from the sample of interest.


As a part of this process, participants asked to take the survey by the sampling service will be incentivized to participate with $1 - $5 payments. The reasons for this are twofold. First, regarding data quality and reliability, the use of incentives is justified by the burden and length of the survey (over 10 minutes), which is expected to experience significant participant drop-off if incentives are not used to compensate participants for their time. Second, survey methods research strongly supports the use of incentives to reduce nonresponse bias in self-administered survey data collection. In our case, incentives will help to ensure that our sample is representative of the US public and not only obtained from those willing to respond for little to no compensation. The use of different incentive levels will be determined by the contractor on a flexible basis and is intended to obtain responses from certain groups that prove more difficult to reach but are nevertheless important members of our target demographic. Without such incentives, study goals would not be met due to a high proportion of incomplete responses from a non-representative sample of the US public.


As community planning is inherently a local and community-specific activity, this effort will not be able to provide specific feedback (i.e., on a case-study level) to individual communities. However, it will provide national-level data on general trends, which will be useful to communities that do not have the resources or ability to study the perceptions of their own populations, and it will identify broad trends and recommendations likely to be helpful to a variety of communities across the nation.



2. Explain how the survey was developed including consultation with interested parties, pretesting, and responses to suggestions for improvement.


The survey instrument was created by a team of social scientific researchers with expertise in decision science and psychology, with detailed input from experts in city and regional planning. It was informed first by a literature review focused on previous work in plan quality evaluation and public perceptions of planning efforts. This included two highly relevant articles that specifically tested the effects of message framing (i.e., what a planning effort is called) on public support and action. Where possible, this previous work and other established scales (e.g., Gallup polls) informed the question text and response options used.


The draft survey instrument was shared with several collaborators within the Community Resilience and Applied Economics Group at NIST, as well as selected topic area experts outside of NIST. Their invaluable feedback was incorporated to better answer our research questions and to make it easier for members of the public to understand and respond to questions. This included changes to the information presented in the experimental treatments (message prompts), the wording of dependent variables, and the addition of new items such as prior knowledge of planning efforts and trust in local government.



3. Explain how the survey will be conducted, how customers will be sampled if fewer than all customers will be surveyed, expected response rate, and actions your agency plans to take to improve the response rate.


The survey will be hosted on and conducted online via the survey platform Qualtrics. The NIST PI will be responsible for developing the final survey instrument, obtaining appropriate IRB and PRA approvals, and conducting analysis of the final collected dataset. For data collection itself, participants will be identified, contacted, and compensated for their time by Qualtrics professionals through their national panel services. Via this contract agreement, we are guaranteed 1,000 complete responses from the targeted population.


Qualtrics will recruit a sample of participants for us by contacting members of their existing participant pool. As data collection progresses, the contractor will monitor incoming data and participant demographics and adjust new invitations to ensure that the final sample of 1,000 is demographically representative.


Selected panel members are sent an email invitation or are prompted on the survey platform to proceed with a given survey. The typical survey invitation is very simple: it provides a hyperlink that takes the respondent to the survey and mentions the incentive offered (when participants are invited to take a survey, they are informed of what they will be compensated). Data collection will be closed after 1,000 complete responses are obtained. The survey is expected to take 20 minutes to complete, resulting in 333 total burden hours:

1,000 (respondents) × 20 (minutes) ÷ 60 (minutes in an hour) ≈ 333 burden hours.


Interested participants will be routed to our survey, where they will be provided with more information and allowed to consent to proceed with the survey or decline. They will then be presented with our survey questions, beginning with a set of basic questions about their own communities and awareness of any planning efforts, followed by random assignment into one of four question sets asking more specific questions about the terms resilience, climate adaptation, sustainability, or social equity, respectively. Note that the questions and the time to take the survey will be otherwise identical across these four groups; all that will differ is which term the questions in this section refer to. While participants will also be asked basic demographic questions such as age, income, and zip code, no PII will be collected or shared with the researchers. After completing the survey, participants will be compensated for their time in the agreed-upon amount by Qualtrics. Information will not be saved in a Privacy Act System of Records; therefore, no SORN or Privacy Act Statement is required.


A power analysis was conducted using effect sizes from similar work, which determined that group sizes of 250 should be sufficient to detect the expected effects (specifically, effect sizes in Meerow and Neuner (2021) ranged from .32 to .55; a slightly conservative .25 was used for our estimates). As a result, a total sample size of 1,000 (250 in each of the 4 groups) is being requested.
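
As an illustrative check only, a group size of this order can be reproduced with a standard off-the-shelf power calculation. The minimal sketch below is not the calculation the team actually ran; it assumes a two-sided, two-sample t-test between any two of the four groups at alpha = .05 with the conservative effect size (d = .25) noted above, which may differ from the exact test originally used:

    # Minimal sketch of a power check for the design described above
    # (illustrative only; assumes a two-sided two-sample t-test at
    # alpha = .05, which may differ from the original power analysis).
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Power achieved with 250 respondents per group at effect size d = .25
    power = analysis.solve_power(effect_size=0.25, nobs1=250, alpha=0.05)
    print(f"Power at n = 250 per group: {power:.2f}")  # ~0.80

    # Per-group n needed to reach the conventional 80% power at d = .25
    n_needed = analysis.solve_power(effect_size=0.25, power=0.80, alpha=0.05)
    print(f"n per group for 80% power: {n_needed:.0f}")  # ~253

Under these assumptions, 250 respondents per group yields roughly the conventional 80% power, consistent with the requested sample size.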



4. Describe how the results of the survey will be analyzed and used to generalize the results to the entire customer population. Also, will this information be used by other Federal agencies? If so, for what purposes? Are there any privacy concerns related to this information sharing? If so, how have these been addressed?


Data received from the contractor will be uploaded into the statistical analysis program SPSS (by IBM) and analyzed in accordance with our research questions stated above. Descriptive statistics will be used to answer research question 1 by exploring the general trends we see in responses across the four groups. Regression analyses will then be used to answer research questions 2 and 3, testing which terms have the highest support overall and whether any significant differences in these perceptions are driven by other key factors such as region of residence.
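
For illustration only, a regression of the kind described might be structured as in the sketch below. The analysis itself will be run in SPSS, and the file and variable names here (survey_responses.csv, support, term_group, and so on) are hypothetical placeholders rather than the study's actual codebook:

    # Hypothetical sketch of the RQ2/RQ3 regression described above; the real
    # analysis will be conducted in SPSS, and all names below are placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey_responses.csv")  # hypothetical data export

    # Predict support for a planning effort from the randomized term
    # assignment (resilience / climate adaptation / sustainability / equity)
    # plus covariates such as region, demographics, and perceived efficacy.
    model = smf.ols(
        "support ~ C(term_group) + C(region) + age + C(income_bracket) + efficacy",
        data=df,
    ).fit()
    print(model.summary())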


While there are no current plans to share these results with other Federal agencies, the resulting recommendations will be broadly informative for community resilience, climate adaptation, and sustainability planning efforts. We do not plan to share our data outside of NIST and do not have privacy concerns related to information sharing. Any results that are shared (via resulting reports and publications) will be aggregated and will not personally identify any individuals.



5. Peer Review: If there is a reasonable likelihood that the results of this information collection will constitute “influential scientific information” under the Information Quality Bulletin for Peer Review, has NIST developed a peer review plan that will be posted on its peer review agenda?


The findings and recommendations resulting from this research do not constitute influential scientific information but will nevertheless be subject to standard peer review processes at NIST prior to publication. This includes WERB review for any NIST internal products (e.g., technical notes) as well as planned journal submissions. This process will include review by a non-coauthor area expert within the Division (731), one from outside the Division, the group leader, and the division chief. The manuscript must receive approval from all of these parties before it can be published, presented, or submitted to a journal.


For journal articles, the manuscript will then be subject to a second round of peer review based on the policies of the specific journal, following its typical review process before being published (if accepted).


Statement to be added to survey:


OMB Control #0693-0089

Expiration date: 9/30/2024

This collection of information contains Paperwork Reduction Act (PRA) requirements approved by the Office of Management and Budget (OMB). Notwithstanding any other provisions of the law, no person is required to respond to, nor shall any person be subject to a penalty for failure to comply with, a collection of information subject to the requirements of the PRA unless that collection of information displays a currently valid OMB control number. Public reporting burden for this collection is estimated to be approximately 20 minutes per response. Send comments regarding this burden estimate or any aspect of this collection of information, including suggestions for reducing this burden, to the National Institute of Standards and Technology, Attn: Emily Walpole, NIST, 100 Bureau Drive, MS 8603, Gaithersburg, MD 20899-1710, telephone 301-975-2617, or via email to [email protected].







Citations Used:

Malecha, M., C. Clavin, and E. Walpole. “How well are U.S. communities planning for resilience, climate adaptation, and sustainability—and what’s missing? Results of a national survey of local staff and officials.” Submitted to Urban Studies.

Meerow, S., and F. Neuner. “Positively resilient? How framing local action affects public opinion.” Urban Affairs Review 57, no. 1 (January 2021): 70-103.

Meerow, S., and M. Stults. “Comparing Conceptualizations of Urban Climate Resilience in Theory and Practice.” Sustainability 8, no. 7 (July 2016). https://doi.org/10.3390/su8070701.

Moser, S., S. Meerow, S. Arnott, and E. Jack-Scott. “The Turbulent World of Resilience: Interpretations and Themes for Transdisciplinary Dialogue.” Climatic Change 153, no. 1–2 (March 2019): 21–40. https://doi.org/10.1007/s10584-018-2358-0.

Romero-Lankao, P., D. M. Gnatz, O. Wilhelmi, and M. Hayden. “Urban Sustainability and Resilience: From Theory to Practice.” Sustainability 8, no. 12 (December 2016). https://doi.org/10.3390/su8121224.

Wong-Parodi, G., B. Fischhoff, and B. Strauss. “Resilience vs. Adaptation: Framing and action.” Climate Risk Management 1, no. 10 (January 2015): 1-7.

Woodruff, S.C., S. Meerow, M. Stults, and C. Wilkins. “Adaptation to resilience planning: Alternative pathways to prepare for climate change.” Journal of Planning Education and Research 42, no. 1 (August 2018): 64-75.
