
Supporting Statement A


Water Resources Management – Institutional Resilience, Hazards Planning, and Data Delivery Needs


OMB Control Number 1028-0131


Terms of Clearance:


This information collection request is approved as a one-year pilot study. If the agency wishes to continue either of the component studies, the Data Delivery Needs study or the Institutional Resilience study, the agency must submit the following to OMB: (1) non-response rates for the screener and for the interview and (2) item non-response rates for individual questions in both screener and interview. The agency may seek to extend the Data Delivery Needs study for a further two years by providing these response rates and submitting a Change request to extend the expiration date. If the agency wishes to extend or expand the Institutional Resilience study, they must prepare a Supporting Statement B and submit either a Revision request or a New information request. The agency may not release the results of this pilot study publicly or use the results to inform policy making.


Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.


Water information is fundamental to national and local economic well-being, protection of life and property, and effective management of the Nation’s water resources. The USGS works with partners to monitor, assess, conduct targeted research, and deliver information on a wide range of water resources and conditions, including streamflow, groundwater, water quality, and water use and availability. This information collection will provide the USGS Water Resources Mission Area with information needed to understand the resilience of water management institutions (e.g., State Engineers, Community Water Systems, Irrigation Providers, State Fisheries Managers, State Department of Natural Resource Managers, Hydroelectric Power Providers, Dam and Reservoir Operators, and Board Members for Conservancy Districts) and to determine how best to deliver data to water data users. The Organic Act of March 3, 1879, authorizes the USGS to conduct this research, and Section 9 of the SECURE Water Act directs the Secretary of the Interior to consult with the USGS and ensure that strategies are developed to address potential water shortages, conflicts, and other impacts to water users and the environment of each service area; this information collection supports that work.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Be specific. If this collection is a form or a questionnaire, every question needs to be justified.


This information is being collected in support of strategic priorities for the USGS Water Mission Area (WMA): to understand socioeconomic factors that affect water availability and to provide data and decision support tools to partners and stakeholders in practical ways. In addition to congressional mandates through the SECURE Water Act, this collection responds to recommendations made by the National Academies of Sciences, Engineering, and Medicine in their 2018 report to the USGS WMA, Future Water Priorities for the Nation (NASEM, 2018). Specific recommendations included: enhance the development and delivery of integrated and dynamic models encompassing the full water cycle, increase focus on the relationships between human activities and water, and answer the question of how institutions, governance, and institutional resilience impact water quality and quantity (NASEM, 2018).


To those ends, responses to the Institutional Resilience component of this ICR will be used by the WMA Social and Economic Drivers (SED) Program to:

  • Identify the organizational, environmental, and socio-political conditions that enhance or impair resilience in water management institutions through elicitation of participants’ tacit knowledge and professional expertise

  • Evaluate the utility of established metrics from engineering and safety research for assessing resilience in water management institutions

  • Explore repeated themes emerging from interviews for novel metrics or indices that may help illuminate organizational and management aspects of water insecurity

  • Understand how decision-making processes, across various levels of water governance, change in response to environmental or socio-political events, and the impact of those changes (or lack thereof) on institutional resilience

  • Develop metrics of resilience in water resource management institutions that can be integrated into WMA national and regional assessments of water security


Information collected to date for the Institutional Resilience component of this ICR has been used by the WMA SED Program to:

  • Expand and improve bureau and mission area understanding of the complex water governance landscapes in both the Delaware River Basin and the Upper Colorado River Basin

  • Understand how decision-makers in water management institutions conceptualize and operationalize resilience at both the organizational and the system levels

  • Validate existing resilience metrics from other management realms for use in the water sector and identify emergent themes for future exploration


Responses to the Data Delivery component of this ICR will be used by the Integrated Water Availability Assessment (IWAA) Program within WMA to:

  • Improve agency understanding of data delivery needs and preferences for users of water data including (but not limited to) spatial and temporal scale, update frequency, key variables of interest, data formats, access pipelines, and degree of interpretive or visualized content

  • Understand which characteristics of short- and long-term forecasts are critical to partners and stakeholders who use USGS water data for decision-making

  • Identify data gaps that could be filled with integrated water quantity, quality, use, and ecosystem models

  • Understand the utility and usability of data delivery prototypes, and necessary technological infrastructure


Information collected to date for the Data Delivery component of this ICR has been used by the IWAA program to:

  • Shape initial product prototypes of the National Water Census online delivery system including early versions of a web-interface, data portal, and model and data dictionaries

  • Inform development of metadata and metadata standards for modeled water data

  • Communicate with modeling teams within WMA about user needs in relation to variables of interest, temporal and spatial scale, update frequency, model uncertainty, and potential postprocessing steps to improve usability

  • Identify unmet partner needs for data that could inform the direction of future work within WMA


Four questionnaires have been used to collect information in support of the goals listed above. The full instruments, including individual question justifications, are available in a separate document accompanying this request. This request is for full approval of all four instruments from the original pilot study ICR, with no changes.



3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets GPEA requirements.


The Institutional Resilience pre-interview survey was administered electronically using Microsoft Forms, and the Data Delivery pre-interview survey was administered electronically using the survey software Qualtrics. These survey programs facilitated ease of administration for the research team and reduced the burden of response for respondents. The Institutional Resilience and Data Delivery interviews to date have been conducted using an online video-conferencing platform, over the phone, or in person, depending on the interview participant’s access to, and comfort with, technology. We have provided the option to conduct an interview over the phone if the interviewee is more comfortable participating that way, or in person if the interviewee is comfortable meeting in person and this is feasible for the research team. If full approval is granted, these collection methods will remain unchanged.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


Institutional Resilience Data Collection

A thorough literature review on the topic of institutional resilience and water resource management was conducted prior to our initial ICR. This review was intended to provide necessary background information and to ensure that our research is novel. It consisted of a nonsystematic, scoping review of three general bodies of literature: water management, water resilience, and resilience engineering (RE). The review also included several books on these topics. From initial search results, we performed forward and backward tracing (i.e., identifying papers that cited them and papers they cited) to locate other foundational sources. This review enabled us to explore existing themes in the literature related to resilience, water management resilience, institutional resilience, and water management decision-making. A sample of sources included in the review can be found in Appendix A at the end of this document.


We found that several contemporary literature reviews have recently examined both the definition and application of the term resilience in social, ecological, and infrastructure contexts related to water management, conservation, and use (see, for example, Wang & Blackmore 2009; Pande & Sivapalan 2017; Rodina 2018; Shin et al. 2018; Dewulf et al. 2019; Lawson et al. 2020; Mottahedi et al. 2021). These reviews explore components of the total system and identify metrics of resilience related to measurable hydrologic parameters, the age or operating capacity of water delivery systems, community demographic data, ecological indicators, or hydropower generation and revenue. They reveal a gap in our understanding of system resilience, however, in the lack of operational or organizational metrics. Specifically, little attention is given to the ways in which decisions made by water managers and water management institutions contribute to, or impair, the ability of those organizations to maintain critical functions when conditions occur outside of those expected as “normal” or “baseline.” Given the nature of socio-technical systems (STS), in which human cognition and decision-making play a key role in system performance, this omission is critical.


In contrast, there are rich bodies of existing research attributed to the fields of cognitive architecture, resilience engineering (RE), and decision science that explore how the concept of resilience can be operationalized in complex systems across transportation industries, infrastructure, and private industry (see, for example: Hollnagel 2011; Lee et al. 2013; Lay et al. 2015; Ganin et al. 2016), and how decision-making at the individual or organizational level can help or hinder the ability of an institution to [anticipate or] “cope with a hazardous event or trend or disturbance, responding or reorganizing in ways that maintain essential function, identity and structure, while also maintaining the capacity for adaptation, learning, and transformation” (IPCC 2014, p.5; the word “anticipate” was added by the study authors to better align the IPCC definition of resilience with the RE approach to proactive intervention and ongoing system engagement). Specifically, a framework known as the “Four Cornerstones of Resilience” has emerged in the field of RE as a way of thinking about the capacities required for system resilience (Hollnagel 2011). Those capacities are:


  1. Responding – the knowledge and capacity to influence the system toward a desired outcome in response to disturbances or opportunities

  2. Monitoring – the knowledge and capacity to look for, and identify, elements of the system that change, or may change in the near term, in ways that require a response

  3. Anticipating – the knowledge and capacity to envision and plan for future developments, disruptions, threats, and opportunities

  4. Learning – the capacity and willingness to understand both successes and failures in the past, and correctly identify the salient lesson to inform future system performance


These four cornerstones are common to resilient systems across different industries including transportation, aviation, and healthcare. In contrast with system performance metrics, these characteristics are operational in nature and can be closely correlated with how institutional decisions are made. The “Four Cornerstones of Resilience” framework seems to provide a clear method for evaluating how the behavior of decision-makers and institutions impacts the way the system responds to challenges or disturbances but has been relatively unexplored in the field of water resource management. Given the nature of their tightly coupled ecological and human components, water institutions may share other important organizational indices of resilience that would complement this framework but have yet to be documented.


Given the findings of our literature review, we are confident that this work represents a new area of research that is nonduplicative in the realm of water management. Additionally, outreach across the USGS-Water Resources Mission Area, Water Science Centers, and USGS Regional offices has been conducted to ensure duplicative efforts have not been planned elsewhere.



Data Delivery Data Collection

This information collection is intended to inform the development of a web-based platform to house modeled water availability data for the National Water Census. Because this is a new product for the USGS and the WMA, the information collected is unique and nonduplicative of any other USGS user-centered research. Our sampling methodology was informed by a previous, broad WMA effort to understand who the users of USGS data are and what kinds of decisions they need to make with the data we provide (Restrepo-Osorio et al., 2022). This groundwork allowed us to optimize our questions and sampling strategies to maximize information gains while avoiding redundancy with recent work.



5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The collection of information does not impact small businesses.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


This information collection is required to fulfill the objectives and technical requirements of two Water Resources Mission Area programs – the Social & Economic Drivers Program and the Integrated Water Availability Assessments Program – both part of the Congressionally sub-allocated Water Availability and Use Science Program budget. If the collection is not conducted, or is conducted less frequently, we will be unable to fulfill the objectives of the projects within these programs and will not meet the program-specific technical requirements.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


There are no circumstances that require us to collect the information in a manner inconsistent with OMB guidelines.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and in response to the PRA statement associated with the collection over the past three years, and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every three years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


We published a 60-day Federal Register notice (88 FR 32237) on May 19, 2023. We did not receive any comments in response to that notice.

Outreach


In addition to soliciting comments through the FRN, we reached out directly to nine individuals who participated in our pilot collection and asked for their voluntary responses to the following four questions:


1. Was the collection of information necessary for the proper performance of the functions of the agency, including whether or not the information will have practical utility;

2. Was the estimate of burden hours accurate for this collection of information, including the validity of the methodology and assumptions used;

3. Are there ways to enhance the quality, utility, and clarity of the information to be collected; and

4. How might the agency minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of response.


Their responses are summarized below:


Response 1: Resource Planning Specialist, State Agency

Q1: The collection of information about water management seems appropriate and could have practical utility for implementing water management rules in the Pinelands Area.

Q2: The estimated time seems accurate.

Q3: I don’t have enough information on how the information has been processed to respond to this question.

Q4: I do not have ideas about how the agency can further minimize the burden of collecting this information.

Action needed/taken: No response necessary.

Response 2: Scientist, Non-profit Organization

Q1: Partnership for the Delaware Estuary (PDE), as a National Estuary Program, supports the creation of the Delaware Estuary’s Comprehensive Conservation Management Plan (CCMP). Collecting specific information about water quality and quantity is an essential task for understanding how well the CCMP is being implemented and what other tasks need to be done to continue to work towards goals (e.g., the overarching goal of a healthy Delaware Estuary). PDE also works closely with the Delaware River Basin Commission (DRBC) to carry out many tasks related to water quality and quantity. Although PDE itself does not necessarily rely on USGS data for our proper performance (i.e., carrying out our routine tasks), DRBC may, and by association, such data collection has great utility for PDE’s performance as a National Estuary Program. Therefore, obtaining information about PDE’s use of USGS data was pertinent, and we hope that these conversations continue so that our ability to track the health of the Delaware Estuary continues.

Q2: The time estimate was accurate, and the methods/assumptions were appropriate.

Q3: It has been some time since I engaged in this conversation, but I might suggest a website that has a quick synopsis of the main goal(s) of this information collection effort, just so folks can independently review background information. That might also be a good place to house other relevant information (e.g., plans, descriptions of other existing efforts, reports) that would allow interviewees to formulate answers to surveys before the survey is sent.

Q4: I imagine keeping conversations relatively short, but regular, may help reduce burdens. For instance, rather than one long survey, it might be easier to split it into two parts separated by a short (one-page) report out of the first survey to keep participants engaged.

Action needed/taken: With regard to the requested synopsis of information collected, the goal of the researchers is to provide a report of our findings from the pilot study to all our participants once full OMB approval is granted for the Information Collection. With regard to the comments on question 4, we appreciate the suggestions and are open to considering different ways to minimize burden to participants in the future. If additional information is collected, we will ask participants, at the time of our invitation, whether they prefer to respond to one long instrument or several shorter ones. The content of the instruments will not change.

Response 3: General Manager, District

Q1: With regard to other agencies, I have no response to the question. With regard to RWCD, I do not believe the information is necessary for our proper performance. However, I do believe the collected information does have practical utility insomuch as it presents an opportunity to read and understand how other agencies operate in this arena.

Q2: I do not recall the time taken during the interview; I do recall it was a good conversation and well worth the time. I believe the time to complete the survey was around 15-20 minutes.

Q3: Other than site visits, I am not certain of a better way in which to obtain the information requested.

Q4: I hesitate to characterize providing the information as a burden. It was not a mandatory information/data submittal, so those who participated chose to do so. Likewise, I believe the process used to collect the data was done using appropriate and available techniques.

Action needed/taken: No response necessary.

Response 4: Assistant General Manager, District

Q1: The information will have practical utility for the agency’s mission and the management of the nation’s natural resources.

Q2: The time commitment estimate was accurate.

Q3: No recommendation – appreciated the virtual interview and direct dialogue with researchers.

Q4: No comment.

Action needed/taken: No response necessary.

Response 5: Professor/Researcher, University

Q1: Yes, this data is invaluable for the scientific mission of the agency. In particular, insight into which aspects of agency data products are most relevant for public health scientists and environmental scientists who use the data to better understand environmental risks can inform the design of agency outputs.

Q2: This was a minimally burdensome effort. The agency estimate was conservative.

Q3: It was quite well done.

Q4: The burden is already minimal. The modality is appropriate.

Action needed/taken: No response necessary.

Response 6: Principal Engineer, Private Sector

Q1: The National Water Census is 100% necessary for my job and many jobs in the water industry. Understanding where and how water is used across the US is fundamental information and must be regularly and accurately reported.

Q2: Accuracy is important, but so is regularity. The most important aspect of the USGS water use reporting is the trends – the changes in use over time. Is water use going up? Going down? Staying the same? How is it changing?

Q3: Continual improvement. Explain why the data are better this time than last time. Perfection is impossible, but improvement is necessary. As long as we keep doing better and better with our national water use assessment, we are on the right track. Letting the program languish without producing a report is the worst option.

Q4: USGS must distill the most critical information required and then set up a straightforward way to report this information. Consistency is essential. USGS should have staff to assist in the reporting process. In some cases USGS staff will need to hold hands to make sure it happens. If a consistent, regular reporting regime is created, the system can improve over time.

Action needed/taken: Participant did not address the actual Information Collection – No action taken.


We consulted with a wide variety of subject matter experts across the U.S. Geological Survey to ensure the completeness, understandability, and conciseness of all four information collection instruments. The titles of those individuals and a summary of the feedback received are provided in the table below.


Commenters on the survey or announcement:

U.S. Geological Survey, Water Mission Area – Program Manager and Research Hydrologist, Denver, Colorado
Feedback: Reviewed data delivery survey and interview questions. Provided feedback on the specifics of questions asked in the data delivery survey and interview question guide. Suggested changes to the wording of some questions, which were accepted.

U.S. Geological Survey, Water Mission Area – Research Social Scientist, Denver, Colorado
Feedback: Reviewed data delivery interview questions. Provided comments related to the flow of the interviews. Specifically, recommended wording changes to make the interview more conversational, which were accepted.

U.S. Geological Survey, Rocky Mountain Region – Senior Scientist, Denver, Colorado
Feedback: Provided additional questions for the institutional resilience pre-interview survey and changed the wording of some questions to make them more accurate. These wording suggestions were accepted.

U.S. Geological Survey, Northeast Climate Adaptation Science Center – Acting Deputy Director, Amherst, Massachusetts
Feedback: Reviewed the institutional resilience pre-interview survey and interview questions for content as well as policy and scientific validity. Expressed that they did not have any policy or science concerns. Suggested a slight wording change to the survey, which was accepted.

U.S. Geological Survey, Rocky Mountain Region – Research Social Scientist
Feedback: Reviewed institutional resilience interview questions. Provided suggestions to clarify the wording of some questions. These clarifications were accepted.

Department of the Interior, Office of Policy Analysis – Economist
Feedback: Reviewed the statistical methodology for the ICR Supporting Statement B. Provided suggestions to improve clarity regarding methods and to ensure an appropriate amount of detail. These suggestions were all accepted.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


We will not provide payments or gifts to respondents.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


The USGS does not provide an assurance of confidentiality; however, respondents’ identities will not be shared beyond the research team. Assurance will be provided in the form of an informed consent document presented to respondents before information is collected, as well as through display of the Privacy Act Statement and System of Records Notice identified as [DOI Social Networks (Interior/USGS-8), published at 76 FR 44033, 7/22/2011] on all written materials (questionnaire and informed consent document) and stated verbally as part of interviews.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


Respondents will not be asked questions of a sensitive nature.


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here.


We are using the Bureau of Labor Statistics Employer Costs for Employee Compensation release (USDL-22-049, published on 03/17/2023) to determine the dollar value of burden hours. The value used is $40.23 per hour for public respondents (private industry) and $57.60 per hour for State, local, and Tribal government respondents.


Table 2. Respondent burden

Participant / Activity | Number of Responses | Minutes per Response | Burden Hours | Dollar Value of Burden Hours
Public individual reads announcement or instructions for Institutional Resilience survey and completes survey | 15 | 10 | 3 | $120.69
Public individual reads announcement or instructions for Institutional Resilience interview and participates in interview (subset of individuals that took the survey, thus not additive) | 10 | 60 | 10 | $402.30
Public individual reads announcement or instructions for Data Delivery survey and completes survey | 100 | 15 | 25 | $1,005.75
Public individual reads announcement or instructions for Data Delivery interview and participates in interview (subset of individuals that took the survey, thus not additive) | 30 | 60 | 30 | $1,206.90
Subtotal (public) | 115 | 205 | 68 | $2,735.64
State, local, Tribal government reads announcement or instructions and completes Institutional Resilience survey | 75 | 10 | 13 | $748.80
State, local, Tribal government reads announcement or instructions and completes Institutional Resilience interview (subset of individuals that took the survey, thus not additive) | 25 | 60 | 25 | $1,440.00
State, local, Tribal government reads announcement or instructions and completes Data Delivery survey | 75 | 15 | 19 | $1,094.40
State, local, Tribal government reads announcement or instructions and completes Data Delivery interview (subset of individuals that took the survey, thus not additive) | 30 | 60 | 30 | $1,728.00
Subtotal (State, local, Tribal government) | 150 | 145 | 87 | $5,011.20
Total | 265 | 350 | 155 | $7,746.84
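For reviewers who wish to reproduce the burden figures above, the following short Python sketch recomputes the burden hours and dollar values from the response counts, minutes per response, and the hourly rates cited in this item ($40.23 for public respondents, $57.60 for government respondents). It assumes only that partial hours are rounded up to the next whole hour, which matches the tabled values; it is illustrative and not part of the approved collection materials.

```python
import math

# Hourly dollar values from the BLS Employer Costs for Employee Compensation figures cited above.
RATES = {"public": 40.23, "government": 57.60}

# (respondent group, activity, number of responses, minutes per response) -- values copied from Table 2.
ACTIVITIES = [
    ("public", "Institutional Resilience survey", 15, 10),
    ("public", "Institutional Resilience interview", 10, 60),
    ("public", "Data Delivery survey", 100, 15),
    ("public", "Data Delivery interview", 30, 60),
    ("government", "Institutional Resilience survey", 75, 10),
    ("government", "Institutional Resilience interview", 25, 60),
    ("government", "Data Delivery survey", 75, 15),
    ("government", "Data Delivery interview", 30, 60),
]

total_hours = 0
total_dollars = 0.0
for group, activity, responses, minutes in ACTIVITIES:
    hours = math.ceil(responses * minutes / 60)  # assumed: burden rounded up to whole hours
    dollars = hours * RATES[group]
    total_hours += hours
    total_dollars += dollars
    print(f"{group:10s} {activity:38s} {hours:3d} h  ${dollars:,.2f}")

print(f"Total burden: {total_hours} hours, ${total_dollars:,.2f}")
# Expected totals: 155 hours and $7,746.84, matching Table 2.
```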


13. Provide an estimate of the total annual non-hour cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected in item 12.)

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information (including filing fees paid for form processing). Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


We have not identified any non-hour cost burden associated with this collection. 


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.


We used the Office of Personnel Management Salary Table 2023-GS to determine the hourly wage rate for all personnel who will be involved in administering surveys, conducting interviews, and analyzing and interpreting the resulting data. To account for benefits, we multiplied each hourly rate by 1.6 to obtain a fully loaded rate.



Table 3. Federal Government Expenses

Position | Grade/Step | Hourly Rate | Annual Hours | Fully Loaded Hourly Rate | Total Labor Value
Research Social Scientist (Denver locality pay) | 13/2 | $54.02 | 40 | $86.43 | $3,457.28
Biologist/Decision Analyst (Rest of US) | 12/3 | $42.33 | 120 | $67.73 | $8,127.36
Student Trainee, Geography (Rest of US) | 7/3 | $23.87 | 20 | $38.19 | $763.84
Physical Scientist (Rest of US) | 12/2 | $41.01 | 40 | $65.62 | $2,624.64
Geographer (Rest of US) | 9/2 | $28.28 | 40 | $45.25 | $1,809.92
Physical Scientist (Rest of US) | 12/2 | $41.01 | 40 | $65.62 | $2,624.64


Table 4. Other Federal Government Expenses

Journal publication costs | $1,000
Conference registration | $1,000
Transcription services | $6,720


The total cost to the government for this information collection is $28,127.68.
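As a cross-check of the total above, the following minimal Python sketch recomputes the federal labor cost from the hourly rates and hours in Table 3 (applying the 1.6 benefits multiplier described in this item) and adds the other expenses in Table 4. The figures are copied from the tables; the script itself is illustrative only.

```python
# (position, hourly GS rate, hours charged to this collection) -- values copied from Table 3.
LABOR = [
    ("Research Social Scientist (Denver)", 54.02, 40),
    ("Biologist/Decision Analyst", 42.33, 120),
    ("Student Trainee, Geography", 23.87, 20),
    ("Physical Scientist", 41.01, 40),
    ("Geographer", 28.28, 40),
    ("Physical Scientist", 41.01, 40),
]

# Other expenses copied from Table 4.
OTHER_EXPENSES = {
    "Journal publication costs": 1000,
    "Conference registration": 1000,
    "Transcription services": 6720,
}

BENEFITS_MULTIPLIER = 1.6  # fully loaded hourly rate = base hourly rate x 1.6

labor_total = sum(rate * BENEFITS_MULTIPLIER * hours for _, rate, hours in LABOR)
other_total = sum(OTHER_EXPENSES.values())
grand_total = labor_total + other_total

print(f"Labor: ${labor_total:,.2f}")  # $19,407.68
print(f"Other: ${other_total:,.2f}")  # $8,720.00
print(f"Total: ${grand_total:,.2f}")  # $28,127.68
```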


15. Explain the reasons for any program changes or adjustments in hour or cost burden.


This is a renewal request for an approved pilot-study information collection; our initial approval was granted on 8/31/2022. We updated our cost burden estimates based on new compensation data from the U.S. Bureau of Labor Statistics. For employee salary expenses, we updated the grade/step for each individual and used OPM’s 2023 GS pay-scale tables.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


We anticipate publishing the findings of this information collection as a scientific journal article with an accompanying U.S. Geological Survey data release via the ScienceBase online platform, a summary report for participating stakeholders, and presentations at scientific conferences (e.g., the American Geophysical Union [AGU]).


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


We will display the OMB Control Number and expiration date on appropriate materials. 


18. Explain each exception to the topics of the certification statement identified in "Certification for Paperwork Reduction Act Submissions."

There are no exceptions to the certification statement. 



Literature cited within form responses above:


Dewulf, A., Karpouzoglou, T., Warner, J., Wesselink, A., Mao, F., Vos, J., Tamas, P., Groot, A. E., Heijmans, A., Ahmed, F., Hoang, L., Vij, S., & Buytaert, W. (2019). The power to define resilience in social–hydrological systems: Toward a power‐sensitive resilience framework. WIREs Water, 6(6), 1–14. https://doi.org/10.1002/wat2.1377


Ganin, A.A., Massaro, E., Gutfraind, A., Steen, N., Keisler, J.M., Kott, A., Mangoubi, R., Linkov, I. (2016). Operational Resilience: concepts, design and analysis. Scientific Reports, 6, 19540. https://doi.org/10.1038/srep19540.


Hollnagel, E. (2011). RAG – Resilience Analysis Grid. In: Hollnagel, E., Paries, J., Woods, D., & Wreathall, J. (Eds), Resilience Engineering in Practice: A Guidebook. CRC Press, Taylor & Francis Group. pp 275–295.


IPCC. (2014). Climate Change 2014: Impacts, Adaptation, and Vulnerability. Part A: Global and Sectoral Aspects. Contribution of Working Group II to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press Cambridge, United Kingdom and New York, NY, USA, 1132 pp.


Lawson, E., Farmani, R., Woodley, E., & Butler, D. (2020). A resilient and sustainable water sector: Barriers to the operationalisation of resilience. Sustainability (Switzerland), 12(5), 1–21. https://doi.org/10.3390/su12051797


Lay, E., Branlat, M., Woods, Z. (2015). A practitioner’s experiences operationalizing Resilience Engineering. Reliability Engineering and System Safety, 141, 63-73. https://doi.org/10.1016/j.ress.2015.03.015.


Lee, A.V., Vargo, J., Seville, E. (2013). Developing a tool to measure and compare organizations’ resilience. Natural Hazards Review, 14(1), 29-41


Mottahedi, A., Sereshki, F., Ataei, M., Qarahasanlou, A. N., & Barabadi, A. (2021). The resilience of critical infrastructure systems: A systematic literature review. In Energies (Vol. 14, Issue 6). https://doi.org/10.3390/en14061571


National Academies of Sciences, Engineering, and Medicine (NASEM). (2018). Future Water Priorities for the Nation: Directions for the U.S. Geological Survey Water Mission Area. Washington, DC: The National Academies Press. https://doi.org/10.17226/25134.


Pande, S., & Sivapalan, M. (2017). Progress in socio-hydrology: a meta-analysis of challenges and opportunities. WIREs Water, 4:e1193. http://doi.org/10.1002/wat2.1193.


Restrepo-Osorio, D.L., Stoltz, A.D., & Herman-Mercer, N.M. (2022). Stakeholder Engagement to Guide Decision-Relevant Water Data Delivery. Journal of the American Water Resources Association, 1–14. https://doi.org/10.1111/1752-1688.13055


Rodina, L. (2018). Defining “Water Resilience”: Debates, concepts, approaches, and gaps. WIREs Water, 6(2), 1–18. https://doi.org/10.1002/wat2.1334


Shin, S., Lee, S., Judi, D.R., Parvania, M., Goharian, E., McPherson, T., & Burian, S.J. (2018). A systematic review of quantitative resilience measures for water infrastructure systems. Water 10,164. https://doi.org/10.3390/w10020164.


Wang, C., & Blackmore, J. M. (2009). Resilience Concepts for Water Resource Systems. Journal of Water Resources Planning and Management, 135(6), 528–536. https://doi.org/10.1061/(asce)0733-9496(2009)135:6(528)



Appendix A: Sample of sources included in the nonsystematic, scoping review for the Institutional Resilience project.


Appelbaum, S. H. (1997). Socio-technical systems theory: an intervention strategy for organizational development. Management Decision, 35(6), 452–463. https://doi.org/10.1108/00251749710173823

Assad, A., Moselhi, O., & Zayed, T. (2019). A New Metric for Assessing Resilience of Water Distribution Networks. Water, 2019-June.

Baker, K., Tang, S., Sweetapple, C., Ward, S., Staddon, C., Bishop, T., Bulmer, P., & Butler, D. (2018). Resilience Learning for Water Sector Culture Change. The 6th Joint EWA/JSWA/WEF Conference: Resilience of the Water Sector.

Burnham, M., Ma, Z., Endter-Wada, J., & Bardsley, T. (2016). Water Management Decision Making in the Face of Multiple Forms of Uncertainty and Risk. Journal of the American Water Resources Association, 52(6), 1366–1384. https://doi.org/10.1111/1752-1688.12459

Dewulf, A., Karpouzoglou, T., Warner, J., Wesselink, A., Mao, F., Vos, J., Tamas, P., Groot, A. E., Heijmans, A., Ahmed, F., Hoang, L., Vij, S., & Buytaert, W. (2019). The power to define resilience in social–hydrological systems: Toward a power‐sensitive resilience framework. WIREs Water, 6(6), 1–14. https://doi.org/10.1002/wat2.1377

Hollnagel, E. (2011). RAG – Resilience Analysis Grid. In: Hollnagel, E., Paries, J., Woods, D., & Wreathall, J. (Eds), Resilience Engineering in Practice: A Guidebook. CRC Press, Taylor & Francis Group. pp 275–295.

Hollnagel, E., Woods, D.D., & Leveson, N. (Eds). Resilience engineering: Concepts and precepts. (2006). Ashgate. https://doi.org/10.1136/qshc.2006.018390

Hollnagel, E., Paries, J., Woods, D., & Wreathall, J. (Eds). Resilience engineering in practice: A Guidebook. (2011). CRC Press, Taylor & Francis Group. https://doi.org/10.1201/9781315605708

Hossain, F., Arnold, J., Beighley, E., Brown, C., Burian, S., Chen, J., Mitra, A., Niyogi, D., Pielke, R., Tidwell, V., & Wegner, D. (2015). What do experienced water managers think of water resources of our nation and its management infrastructure? PLoS ONE, 10(11), 1–10. https://doi.org/10.1371/journal.pone.0142073

Hossain, F., & Science, A. (2020). Resilience of Large Water Management Infrastructure. In Resilience of Large Water Management Infrastructure. https://doi.org/10.1007/978-3-030-26432-1

Johannessen, Å., & Wamsler, C. (2017). What does resilience mean for urban water services? Ecology and Society, 22(1). https://doi.org/10.5751/ES-08870-220101

Lawson, E., Farmani, R., Woodley, E., & Butler, D. (2020). A resilient and sustainable water sector: Barriers to the operationalisation of resilience. Sustainability (Switzerland), 12(5), 1–21. https://doi.org/10.3390/su12051797

Linkov, I., & Trump, B. D. (2019). The Science and Practice of Resilience. Springer Nature. https://doi.org/10.1136/jrnms-14-216

Liu, J., Shao, Z., & Wang, W. (2021). Resilience assessment and critical point identification for urban water supply systems under uncertain scenarios. Water (Switzerland), 13(20). https://doi.org/10.3390/w13202939

Mankad, J., Borse, D., Das, L., Padhiyar, N., & Srinivasan, B. (2020). Development of Operational Resilience Metrics for Water Distribution Systems. 19–41. https://doi.org/10.1007/978-981-15-4668-6_2

Milly, P. C. D., Betancourt, J., Falkenmark, M., Hirsch, R. M., Kundzewicz, Z. W., Lettenmaier, D. P., & Stouffer, R. J. (2008). Stationarity is dead: Whither water management? Science, 319(5863), 573–574. https://doi.org/10.1126/science.1151915

Milman, A., & Short, A. (2008). Incorporating resilience into sustainability indicators: An example for the urban water sector. Global Environmental Change, 18(4), 758–767. https://doi.org/10.1016/j.gloenvcha.2008.08.002

Mottahedi, A., Sereshki, F., Ataei, M., Qarahasanlou, A. N., & Barabadi, A. (2021). The resilience of critical infrastructure systems: A systematic literature review. In Energies (Vol. 14, Issue 6). https://doi.org/10.3390/en14061571

Mysiak, J., Henrikson, H. J., Sullivan, C., Bromley, J., & Pahl-Wostl, C. (2013). The adaptive water resource management handbook. In The Adaptive Water Resource Management Handbook. https://doi.org/10.4324/9781315065984

Penny, G., & Goddard, J. J. (2018). Resilience principles in socio-hydrology: A case-study review. Water Security, 45(December), 37–43. https://doi.org/10.1016/j.wasec.2018.11.003

Richter, B. D., Baumgartner, J. V., Wigington, R., & Braun, D. P. (1997). How much water does a river need? Freshwater Biology, 37(1), 231–249. https://doi.org/10.1046/j.1365-2427.1997.00153.x

Rodina, L. (2018). Defining “water resilience”: Debates, concepts, approaches, and gaps. WIREs Water, 6(2), 1–18. https://doi.org/10.1002/wat2.1334

Rodina, L., & Chan, K. M. A. (2019). Expert views on strategies to increase water resilience: evidence from a global survey. Ecology and Society, 24(4). https://doi.org/10.5751/ES-11302-240428

Trist, E. (1981). The evolution of socio-technical systems: A conceptual framework and an action research program. In A. H. Van de Ven & W. Joyce (Eds.), Perspectives on Organizational Design and Behavior. Wiley-Interscience.

Wang, C., & Blackmore, J. M. (2009). Resilience Concepts for Water Resource Systems. Journal of Water Resources Planning and Management, 135(6), 528–536. https://doi.org/10.1061/(asce)0733-9496(2009)135:6(528)

Woods, D. D. (2015). Four concepts for resilience and the implications for the future of resilience engineering. Reliability Engineering and System Safety, 141(April 2015), 5–9. https://doi.org/10.1016/j.ress.2015.03.018

