
Supporting Statement A


iCoast—Did the Coast Change?


OMB Control Number 1028-0109


Terms of Clearance: None


Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.


The U.S. Geological Survey (USGS) and its collaborators (including the National Aeronautics and Space Administration, the U.S. Army Corps of Engineers, and university researchers) conduct sustained investigations of coastal hazards associated with major hurricane landfall. USGS hurricane research and response activities include collection of storm-surge water levels, aerial photography, and laser altimetry (lidar) surveys of pre- and post-storm beach conditions. These efforts document the nature, magnitude, and variability of coastal changes such as beach erosion, overwash deposition, island breaching, and destruction of infrastructure. Predictive models and assessments of severe storm impacts are developed and evaluated, and probabilistic assessments are distributed to the public and to local, State, and Federal agencies. The assessments and observations provide information needed to understand, prepare for, and respond to coastal disasters. These ongoing analyses are authorized by the Disaster Relief Act of 1974, 42 U.S.C. 5201 et seq., Section 202(a).1


In support of this research, the USGS has been taking oblique aerial photographs of the coast before and after each major storm since 1996 and has amassed a database of over 190,000 photographs of the Gulf and Atlantic Coasts. Computers cannot yet automatically analyze these data because classifying this photography requires understanding the diversity of forms that even a small set of primary features (shore, beach, dune, marsh, built environment) can take. Human intelligence is needed, and USGS does not have the personnel or the capacity for this. These oblique aerial photographs are currently used for broad overviews of damage, and selected photo pairs have been shared on the Internet with the public after storms. The intense public interest in the pre- and post-storm USGS photographic pairs, and the increasing use of citizen science and crowdsourcing by Federal Government agencies, suggest that a significant segment of the public might volunteer to serve as our “eyes on the coast.”

The iCoast—Did the Coast Change? website (hereafter referred to as iCoast) posts a suite of pre- and post-storm photographs from a major storm, and citizen scientists can compare photographs and classify the changes they see with predefined tags or by appending comments. Citizen scientists also identify coastal landforms, determine the storm impacts to coastal infrastructure and landforms, and indicate other changes, including response and recovery efforts. These data can be used by USGS scientists to ground truth and fine-tune their models of coastal change. These mathematical models predict the likely interaction between coastal features, such as beaches and dunes, and storm surge. They are based on pre-storm dune height, measured by lidar, and on predicted wave behavior derived from National Oceanic and Atmospheric Administration data; they are not based on ground truth observations. A body of citizen observations will allow for more accurate predictions of vulnerability. These model predictions are typically shared with Federal, State, and local authorities both before and after storms. The project will also result in greater citizen awareness of the probabilities for coastal change, and will be a resource for teachers and students pursuing science, technology, engineering, and mathematics (STEM).


Other laws that support the use of citizen science observations for coastal change are:

  • The Federal Land Policy and Management Act of 1976 (FLPMA), 43 U.S.C. 1701 et seq. (see 43 U.S.C. 1737), authorizing the Secretary of the Interior to conduct investigations, studies, and experiments involving the management, protection, development, acquisition, and conveying of public lands; and to prepare and maintain inventories of all public lands and resources.

  • The Coastal Zone Management Act of 1976, 46 U.S.C. 31(a) and (b), providing that each department, agency, and instrumentality of the Executive Branch of the Federal Government may assist the Secretary (of Commerce), on a reimbursable basis or otherwise, in carrying out research and technical assistance for coastal zone management.



2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Be specific. If this collection is a form or a questionnaire, every question needs to be justified.


iCoast has been collecting citizen scientist photographic classifications since October 2014. Since its initial launch with the photographic data collected after Hurricane Sandy made landfall, over 1,500 users have logged onto iCoast. Of those, 794 users have completed at least one classification, resulting in nearly 48,000 completed classifications. All of the Hurricane Sandy photographs have had at least one classification completed, and many photographs have been classified by multiple users. In October 2015, photographs for Hurricane Joaquin were added to iCoast. To date, 264 users have completed over 10,500 classifications of 6,393 photographs (73.5% of the photographs for Hurricane Joaquin).


The data generated by these collections are being analyzed to determine how iCoast is being used by the citizen scientist crowd. Two publications are currently in progress. The first examines the type of crowd iCoast has attracted; the second describes how iCoast works and uses Bayesian statistical analysis to compare the classifications of a select group of users considered “experts” (those with a background in coastal studies and coastal management issues) with those of the general public. Once these analyses are complete, we hope to begin comparing user classifications to predictive storm impact models to verify and refine them. The users’ observations of changes in beach morphology and damage to infrastructure can confirm or refute model predictions and provide human feedback that will help refine predictions for future storms.



The USGS has designed iCoast to appeal to the following categories of users:


  • Coastal and Marine Scientists — iCoast uses coastal and marine scientists to establish an “expert” crowd for verifying classifications. Being familiar with coastal processes, these experts will provide USGS with confirmation of model predictions and will serve as a standard against which to verify the classifications of the general crowd.

  • Coastal Planners and Managers — Coastal planners and managers can use iCoast to promote stewardship, protect significant coastal resources such as national, state, and local shoreline parks, revitalize working waterfronts, and oversee land use planning in coastal areas. This group is interested in short-term coastal vulnerabilities to extreme storms as well as long-term predictions of coastal change due to sea-level rise.

  • Coastal Residents — Since 1996, the USGS has published matched pairs of pre- and post-storm aerial photographs of areas with extreme coastal damage and/or change. Coastal residents have shown great interest in these photographs. Coastal residents can use iCoast to examine a much broader region of the coast and compare pre- and post-storm conditions in areas other than those highlighted on the National Assessment of Storm-Induced Coastal Change Hazards website (https://coastal.er.usgs.gov/hurricanes/). We hope that this group of users will also gain a better understanding of their vulnerability to future storms.

  • Digital Crisis Volunteers — Extreme storms and other disasters have attracted various communities of volunteers who use digital tools including social media and online maps to aid affected communities and emergency responders. Deployment of iCoast for future storms will provide a platform for these volunteer opportunities.

  • Emergency Managers — Before storms, iCoast will allow emergency managers to review impacts from previous storms and stage equipment and personnel in areas that are particularly vulnerable; after storms, it will help them assess damage.

  • Interested Public — Judging by the success of citizen science projects such as Galaxy Zoo (www.galaxyzoo.org), which has attracted many participants to classify galaxies from the Sloan sky survey, the general public is quite interested in participating in citizen science projects that classify photographs of phenomena, even when they were not present at the scene. We anticipate similar interest from the general public, particularly after large storms.

  • Marine Science Students — iCoast provides a valuable resource for students new to marine science, as well as students who are furthering their studies. By classifying matched pairs, students are presented with real-world examples of the impact extreme storms have on our coastlines. The USGS can also use this class of user to augment the “expert” class of coastal and marine scientists. Additionally, primary school teachers and their students can use iCoast as a teaching and learning tool to familiarize themselves with coastal processes and actively participate in coastal issues.

  • Policy Makers — iCoast provides a broad look at the variability of the impacts of extreme storms along our coastlines. This perspective can help officials craft sound policy for their constituents by providing examples of how various types of infrastructure and coastal landforms respond to storms.

  • Watersport Enthusiasts — These users are considered coastal stewards interested in preserving the coast for recreational opportunities. They often have a unique perspective on coastal issues and are familiar with changes observed after storms.

  • “Other” Crowd Users — iCoast provides the opportunity for users to self-identify a crowd type not listed among the nine pre-set crowd types above. To date, seven other crowd types have self-identified, as well as many more that cannot be grouped into a single crowd. The “other” crowd currently includes students (other than marine science students), GIS professionals, other scientists, statisticians, mappers and geographers, teachers, and photographers.


The tasks that the users of iCoast will perform are listed below, and screenshots are shown in the survey instrument:


  • Select Photograph to Classify. The user is presented with a post-storm image to classify. The user can tag that photograph, select a different photograph at random, or pick a photograph from a map (Figure 1).


  • Match Coastal Aerial Photographs. The first step in the iCoast tagging process is to match a pre-storm photograph to the post-storm photograph presented to the user for classification. A matching pair of photographs shows the same natural and/or man-made features, though the photographs may be taken from a slightly different angle or zoom level. The matches are generated by an algorithm that uses geographic coordinates to determine the nearest pre-storm photograph to the post-storm photograph the user has selected, as sketched below. If the user decides the computer match is not accurate, they may choose a better match from three additional sequentially located photographs along the coast to either side of the post-storm photograph (Figure 2). The user can tag that photograph; if still unsatisfied with the match options presented, the user can mark the photograph as “No Match Found” or “Flag as Unsuitable,” then select a different photograph at random, pick a photograph from a map, or traverse the coast in either direction to find a better location for a match.
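The matching step is described above only in general terms. The following is a minimal sketch of a nearest-photograph match by geographic coordinates; it is not the iCoast production code, and the data structure, field names (lat, lon), and the haversine_km helper are illustrative assumptions.

```python
# Minimal sketch of a nearest-photograph matching step (illustrative only;
# not the iCoast production code).
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import List, Tuple

@dataclass
class Photo:
    photo_id: str
    lat: float  # latitude of the photo location, decimal degrees (assumed field)
    lon: float  # longitude of the photo location, decimal degrees (assumed field)

def haversine_km(a: Photo, b: Photo) -> float:
    """Great-circle distance between two photo locations, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (a.lat, a.lon, b.lat, b.lon))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def match_pre_storm(post: Photo, pre_storm: List[Photo],
                    n_alternates: int = 3) -> Tuple[Photo, List[Photo]]:
    """Return the nearest pre-storm photo and up to n_alternates neighbors on
    either side (pre_storm is assumed to be ordered along the coast)."""
    nearest_idx = min(range(len(pre_storm)),
                      key=lambda i: haversine_km(post, pre_storm[i]))
    lo = max(0, nearest_idx - n_alternates)
    hi = min(len(pre_storm), nearest_idx + n_alternates + 1)
    alternates = pre_storm[lo:nearest_idx] + pre_storm[nearest_idx + 1:hi]
    return pre_storm[nearest_idx], alternates
```

With n_alternates set to 3, the sketch returns the nearest pre-storm photograph plus up to three sequential candidates on either side, mirroring the alternatives offered to the user.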


Once the user is satisfied with the match, they are presented with four tagging tasks. For each available tag, the user can hover the mouse over the term to reveal a sample photograph of the landscape and a description of the feature to be tagged. Descriptions of the tag text can be found in Appendix A. Each task has two parts:


  • Task 1: Identify coastal landscape. The user selects the type of coastline (e.g., barrier island, mainland, etc.), then the level of development (e.g., undeveloped, moderate development, or heavy development) characterizing the geomorphic and human modifications to the shoreline (Figure 3).


  • Task 2: Determine impacts to coastal infrastructure. The user is asked to identify what infrastructure (e.g., seawall, roadway, buildings, etc.) is visible in the pre-storm photograph, then indicate which infrastructure elements are damaged in the post-storm photograph (Figure 4).


  • Task 3: Specify changes to coastal landforms and dominant process. Coastal scientists at the USGS have established a storm-impact scale that classifies storm damage into four regimes (see http://coastal.er.usgs.gov/hurricanes/impact-scale/). The user first identifies the impact to the coast by selecting tags that indicate evidence of one of these regimes:


  • beach erosion (i.e., less sand, dark sand, or beach scarps),

  • dune erosion (i.e., dune scarps, leveled dunes, and less vegetation),

  • overwash (i.e., sand inland, sand on roads, and sand in marshes), and

  • inundation (i.e., breaches, standing water inland, and dead vegetation).


For each coastal change regime category, the tags are nested by color to aid the user in understanding the relationship between the tags and the coastal change regime categories (Figure 5); a simple representation of this nesting is sketched after the task list below. For example, dark sand on the beach may indicate beach erosion, while sand on roads may indicate overwash. Lastly, the user is asked to identify the dominant coastal change process that explains the changes observed in the post-storm photograph.


  • Task 4: Identify Other Changes. The final task asks the user to identify other changes to the coast that may or may not have been storm related as well as post-storm recovery efforts (Figure 6). The user can also indicate other coastal changes or additional information they would like to share through a comment box.
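The nesting of evidence tags under the four coastal change regimes in Task 3 can also be expressed as a simple mapping. The sketch below is illustrative only; the tag labels are taken from the list in Task 3, and the structure is an assumption rather than the iCoast implementation.

```python
# Illustrative mapping of the four coastal change regimes to their nested
# evidence tags (labels taken from Task 3; the structure is an assumption,
# not the iCoast implementation).
from typing import Optional

COASTAL_CHANGE_REGIMES = {
    "beach erosion": ["less sand", "dark sand", "beach scarps"],
    "dune erosion": ["dune scarps", "leveled dunes", "less vegetation"],
    "overwash": ["sand inland", "sand on roads", "sand in marshes"],
    "inundation": ["breaches", "standing water inland", "dead vegetation"],
}

def regime_for_tag(tag: str) -> Optional[str]:
    """Return the coastal change regime that a given evidence tag belongs to."""
    for regime, tags in COASTAL_CHANGE_REGIMES.items():
        if tag in tags:
            return regime
    return None

# Example: "sand on roads" is evidence of the overwash regime.
assert regime_for_tag("sand on roads") == "overwash"
```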


The information sought in iCoast will be non-proprietary and contain no personally identifiable information as defined under the Privacy Act of 1974.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets Government Paperwork Elimination Act (GPEA) requirements.


This information collection is conducted entirely on the Internet on a website hosted by USGS. It involves automated interactive mapping and visualizations that would be impossible to duplicate with paper technology. This collection is entirely voluntary and meets GPEA requirements.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


USGS identified and evaluated several online citizen science projects in which volunteers were asked to analyze photography after disasters. These projects were deployed by a Federal agency, a commercial firm, and a non-profit organization.

  • The Civil Air Patrol (CAP), a volunteer arm of the Air Force, photographs areas affected by disasters. These photographs are shared with the Federal Emergency Management Agency (FEMA). After Hurricane Sandy (2012), 35,000 of these photos were put online using the MapMill system (http://www.mapmill.org) and evaluated by more than 6,000 citizen volunteers. These photographs were used for damage assessment and to target areas where resources were needed.2

  • The commercial satellite firm Digital Globe runs the Tomnod application (http://www.tomnod.com), an online system where volunteers are invited to tag satellite photographs. Tomnod was used after the Moore, Oklahoma tornado (2013) and Super Typhoon Haiyan (2013).

  • The Grassroots Mapping project (http://grassrootsmapping.org) was conceived by Jeffrey Warren. The website shares information about how to construct low-cost aerial imaging systems from cameras tethered to balloons and has a platform that can be used to stitch these photographs together. Various community groups have used these technologies to photograph disasters such as the Deepwater Horizon oil spill (2010).


Each of these projects was deemed insufficient for use by the USGS in constructing a system for volunteers to classify coastal damage from storms. The CAP photographs, while oblique, covered a much larger area than the USGS photographs and used an extremely simplified set of categories (little to no damage, medium damage, heavy damage). These photographs did not contain enough detail on specific damage to the immediate coastal dune and beach systems. The Tomnod application is proprietary; moreover, the photography is not as detailed as the aerial photographs taken by the USGS, and the vantage point of the Digital Globe photographs is directly overhead, making them less useful for detecting the types of changes that can be seen in the USGS oblique aerial photographs. The volunteer methods described by the Grassroots Mapping project would not scale to cover the geographic extent of a large storm and would likely not be timely.


The information to be derived from iCoast is unobtainable elsewhere. USGS has a scientifically valuable archive of photographs stretching back more than 20 years. Analysis of these photographs by interested citizen scientists will increase the value of the data. Many of the storms recorded in this archive have affected the same stretches of coast, making comparative historical studies possible. This constitutes a long-term record of the response of the Gulf and Atlantic Coasts to extreme storms. The three applications mentioned above are aimed at situational awareness for first responders and other emergency personnel. They do not inform scientific predictions of coastal response to future storms. Additionally, iCoast has demonstrated expandability: future storms can be entered into the system to allow continued interaction with the public and additional benefit to the USGS in its data analysis.


5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


There is no impact on small businesses or other small entities.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Failure to collect this information will result in:

  • Reduced ability to understand the impacts of severe storms and changing climate on coastal resources of concern;

  • Reduced ability to share predictions and vulnerabilities with other Federal, State, county and municipal agencies and with the public;

  • Reduced ability for coastal planners and managers from all levels of government to strategically allocate resources to areas of highest need. This would leave the Secretary of the Interior with reduced ability to fulfill his or her legal obligations under the acts mentioned in section 2.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


There are no special circumstances associated with the proposed collection activity that would require it to be conducted in any of the manners described.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and in response to the PRA statement associated with the collection over the past three years, and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


On February 28, 2014, a 60-day Federal Register notice (79 FR 11461) was published announcing that USGS would submit this information collection to OMB for approval. Public comments were solicited for 60 days ending April 22, 2014. No comments were received from this Federal Register notice.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


We consulted with the individuals listed in Table 1 to obtain their views on the information above. Several changes to the format and design of the application were suggested during the testing period, and these have been incorporated. Reviewers also checked the website for ease of use, layout, navigation, and information content. Their suggestions have been incorporated into the website.



Table 1. Individuals (non-USGS) consulted in the construction of the iCoast website

Reviewers:

  • The National Socio-Environmental Synthesis Center
  • Coastal resident, photographer, private individual
  • Interactive Multimedia Developer, private individual


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every three years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


There will be no payment offered to respondents.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


No assurance of confidentiality is given to respondents as no information of a confidential nature is solicited.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


The collection does not include sensitive or private questions.


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here.


We examined the records of Internet activity through 2017. Judging by the domain names of visitors, we estimate that approximately 1% of those visitors are from the “.gov” domain.


We estimate that the annual dollar value of burden hours for Federal employees* is approximately $500, and the annual dollar value of burden hours for private users is approximately $104,144. The numbers in Table 2 are based on statistics generated by iCoast for 2017; a worked example of the calculation follows the table.


Estimates are based on complete classifications only; there is no reliable way to estimate the time spent on incomplete classifications. These figures may increase greatly in years when additional severe storms (category 4 or 5) covering wide geographic areas make landfall, or may decrease if no new storms make landfall. The potential variability in visitors is impossible to predict at this time.


The hourly rate for all civilian employees is drawn from the Bureau of Labor Statistics report of March 2017 (USDL-17-0321). The burden estimate does not include registration or reading the introductory and explanatory material (see Table 2); we do not believe these add a significant amount of time once the user has become familiar with the site interface.






Table 2. Estimated non-Federal dollar value of annual burden hours*


Participant / Activity                                Number of Responses   Minutes per Response   Burden Hours   Burden Value

Private individuals: complete matching of one
pair of photos (one response)                                      63,581                      3          3,179       $104,144

State and local government: complete one response                     630                      3             32         $1,531

Total                                                              64,211                                 3,211       $105,675


* Table 2 was created using information from Bureau of Labor Statistics USDL-17-0321, Employer Costs for Employee Compensation, published March 17, 2017. BLS reported that employee compensation for private industry averaged $32.76 per hour and for state and local government employees averaged $47.85 per hour. These values include benefits and overtime.
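For clarity, the Table 2 figures follow from a simple calculation: burden hours equal the number of responses multiplied by minutes per response and divided by 60, and the burden value equals burden hours multiplied by the BLS hourly compensation rate cited in the footnote. The following is a minimal worked sketch using only values already shown in Table 2.

```python
# Worked reconstruction of the Table 2 burden arithmetic (responses, minutes
# per response, and hourly rates are taken directly from Table 2 and its footnote).
def burden(responses: int, minutes_per_response: float, hourly_rate: float):
    hours = round(responses * minutes_per_response / 60)  # total burden hours
    return hours, round(hours * hourly_rate)              # hours, dollar value

private_hours, private_value = burden(63_581, 3, 32.76)  # -> 3,179 hours, $104,144
gov_hours, gov_value = burden(630, 3, 47.85)             # -> 32 hours, $1,531
print(private_hours, private_value, gov_hours, gov_value)
```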






13. Provide an estimate of the total annual non-hour cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected in item 12.)

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information (including filing fees paid for form processing). Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


There is no non-hour cost burden to respondents resulting from this collection. There are no fees associated with the application process, or with collection requirements or methods.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.


The estimated annualized costs to the Federal Government are shown in Table 3. These costs were calculated from actual costs incurred in FY 2013 and FY 2014 and predicted annual costs thereafter. The annual cost to the Federal Government is estimated at $36,357. Due to the episodic nature of severe storms and the unknown extent of citizen interest in the iCoast application, it is impossible to estimate costs for future years with precision. The USGS intends to operate the iCoast site indefinitely.



Table 3: Federal Government Annual Costs


Employee                              Grade/Step   Hours Worked   Hourly Rate3   Full Rate   Personnel Costs

Computer Programmer (contractor)      Contract               na             na          na           $12,000

Geologist                             GS 11/10              416            $32      $51.12           $21,266

Research Geographer                   GS 14/6                40            $49      $48.30            $3,091

Total                                                                                                $36,357





15. Explain the reasons for any program changes or adjustments in hour or cost burden.


iCoast collects data without constant management by researchers, and system improvements and monitoring are minimal. However, the commitment of individuals working on iCoast is highly variable, depending on the frequency of storms during a given season or year as well as on available personnel. In addition, estimates of who will work on iCoast and how much of their time will be dedicated to maintenance, adding new projects (storms), and improving the system have changed significantly since the previous Supporting Statement A.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


All raw data, and limited summative data, are available on demand to iCoast administrators in an electronic format on the project website. Data used in publications will be made available as data releases. Any PII (such as email addresses) will be removed from published data. Summary reports will be published in scientific journals or other USGS outlets (e.g., open-file reports, data releases); published reports comply with USGS Fundamental Science Practices and are produced at periodic intervals, depending on storm activity. Presentations are made at scientific conferences as appropriate. Two reports are currently in the publication process and logged in the Information Product Data System (IPDS).


Time schedule: iCoast has been in operation since October 2014 and will continue indefinitely as long as it is supported by the USGS Coastal and Marine Geology Program. To date, photographs from two storms (Hurricane Sandy and Hurricane Joaquin) have been made available to the public for classification. All of the photographs from Hurricane Sandy, and 73.5% of the photographs from Hurricane Joaquin, have at least one complete classification. We plan to open a new classification project for Hurricane Matthew (October 2016) in the near future.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


Not applicable for this request.


18. Explain each exception to the topics of the certification statement identified in "Certification for Paperwork Reduction Act Submissions."


There are no exceptions to the certification statement.


1 The Disaster Relief Act states that "The President shall insure that all appropriate Federal agencies are prepared to issue warnings of disasters to State and local officials." In addition, Section 202(b) states that "The President shall direct appropriate Federal agencies to provide technical assistance to State and local governments to insure that timely and effective disaster warning is provided."

* State and Local entities participated in 2014 and are not reported here.

3 Hourly rates in Table 3 are drawn from https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/2017/general-schedule/. Operational costs are drawn from rates at individual USGS science centers supporting this application.


