Surveys to Collect Data on Use of the NOAA National Weather Service Cone of Uncertainty

OMB: 0648-0791

SUPPORTING STATEMENT

U.S. Department of Commerce

National Oceanic & Atmospheric Administration

Surveys to Collect Data on Use of the NOAA National Weather Service

Cone of Uncertainty

OMB Control No. 0648-xxxx



A. JUSTIFICATION


1. Explain the circumstances that make the collection of information necessary.


The NOAA National Weather Service (NWS) National Hurricane Center (NHC) produces several forecast graphics in advance of a tropical cyclone to help its partners and members of the public make appropriate planning and preparedness decisions. The Tropical Cyclone Forecast Track Graphic, commonly referred to as the “Cone of Uncertainty” (or the Cone), is designed to convey the forecast uncertainty of the center of a tropical cyclone’s track. The cone shape outlines the 67th percentile of NHC’s average track error over the previous five years at each forecast time. This means that the size of the Cone is not dynamic on a storm-by-storm, or even forecast-by-forecast, basis; it reflects the amount of error (forecast vs. actual path) averaged across all events over the previous five years.


The NWS seeks to understand how its user base interprets and uses the information provided by the Cone. To that end, it has conducted a comprehensive literature review of the Cone. The literature shows that the Cone’s visual features have come under scrutiny, with many studies and reports pointing to misunderstandings about what the graphic represents (NOAA, 2006; Broad et al., 2007; Meyer et al., 2013; Wu et al., 2014; Liu et al., 2015; Losee et al., 2017; Milch et al., 2018; Bostrom et al., 2018; Padilla et al., 2017; Ruginski et al., 2016; NTSB, 2017; Sherman-Morris and Del Valle-Martinez, 2017).1 These misunderstandings are further documented in the extensive literature review conducted last year, which is attached to this ICR package.


The literature (see attached literature review) has largely focused on how emergency management and the public use and interpret the Cone, as well as how these interpretations affect decision-making, particularly concerning evacuations. However, much less is known about the Cone’s wider user base—particularly those industry sectors that have a role in protecting public safety or ensuring national security and economic welfare. The literature lacks research into who uses the Cone, how they interpret the graphic, what decisions they make based on the graphic, and whether the graphic is helping to inform these decisions.


To better understand its broader user base and these users’ needs for hurricane forecast information and decision support services, the NWS proposes to survey individuals employed in key “Core Partner” entities, as defined in NWS Policy Directive 10-24. There are four criteria for determining whether an entity is a Core Partner: 1) there is a legal mandate to support the entity (e.g., Executive Order, statute), or support is a matter of national security; 2) the entity exercises a large degree of authority or influence, relative to other Core Partners, over public safety or the management of the nation’s water resources for the public good; 3) the entity serves a population or entity particularly vulnerable to the impacts of weather, water, or climate hazards; or 4) the entity acts as a force multiplier, helping to amplify NWS messages to other partners. The Policy Directive also acknowledges that while a large number of individuals are critical to building a Weather-Ready Nation, Core Partners have a unique need for direct access to NWS information based on the critical public services they provide or to facilitate their role in supporting the NWS mission.


Based on the criteria described above, the NWS chose four key industries for the proposed survey:2


  • Transportation, including air, water, transit, rail, and truck transportation; pipeline transportation; scenic and sightseeing transportation; couriers and messengers; warehousing and storage; etc. Meets Core Partner criteria 1, 2, and 3: Hurricanes pose significant disruptions to supply chains and the flow of freight and goods; transportation authorities also play critical roles in ensuring the safe movement of people in advance of a hurricane.


  • Marine, including shipping, water transportation, boat building, support for oil and gas operations, commercial fishing, etc. Meets Core Partner criteria 1, 2, and 3: Hurricanes have been the cause of many maritime disasters, and the marine sector is a key player in facilitating the movement of U.S. goods and services.


  • Energy/utilities, including oil and gas production, pipeline and refining, electricity or natural gas facilities, water and wastewater utilities, etc. Meets Core Partner criteria 1, 2, and 3: Hurricanes have caused billions of dollars of damage to energy/utility infrastructure over the years, impacting production and causing fluctuations in energy prices, as well as creating disruptions to housing and threats to public health, particularly if electric, water, or wastewater utilities are affected.


  • Tourism, including travel, food and hospitality, sports, hotels and motels, other accommodations, scenic and sightseeing services, etc. Meets Core Partner criteria 3 and 4: The tourism industry not only plays a vital role in helping to convey and amplify NWS messages in advance of a hurricane, but the sector can also face significant losses as a result of hurricanes.


The specific target audience for the survey is individuals within these four sectors who use the Cone of Uncertainty to make decisions within their organizations, or who have knowledge of how the Cone is used to do so. These individuals may include representatives from energy facilities/utilities, hotels, airlines, cruise lines, and cargo shipping companies, among others.


The survey seeks to discern the degree to which respondents use the Cone of Uncertainty to make operational decisions within their organizations, as well as whether the product helps or hinders their decision-making. The survey supports Section 104 of the 2017 Weather Research and Forecasting Innovation Act (Pub. L. 115-25, H.R. 353), which calls for NOAA to improve hurricane forecasting and communication, as well as NWS Policy Directive 10-24, which calls for the NWS to provide detailed, accurate, and timely impact-based decision support services, particularly in association with high-impact events. The survey will improve the NWS’s understanding of how and when the Cone graphic is used by Core Partner organizations and whether the graphic is serving Core Partners’ decision-making needs, or whether changes are needed to the product to better serve the user base.


2. Explain how, by whom, how frequently, and for what purpose the information will be used. If the information collected will be disseminated to the public or used to support information that will be disseminated to the public, then explain how the collection complies with all applicable Information Quality Guidelines.


NOAA’s NWS is conducting a web-based survey of four key stakeholder groups. The survey will be deployed by Eastern Research Group, Inc. (ERG), NOAA’s contractor for this work, and will be conducted using the Qualtrics, Inc., online web data collection platform, through which all responses will be collected.


The survey will be a one-time data collection.


NOAA is interested in collecting data about the interpretation and use of the Cone of Uncertainty in the decision-making of key sectors that are at significant risk during a hurricane: energy/utilities, tourism, transportation, and marine. NOAA will use the data from the survey to determine how embedded the Cone of Uncertainty is in these key stakeholders’ decision-making, as well as to determine what those decisions and implications look like (life and safety, loss reduction, other). It is vitally important for the NHC to understand this information before making any changes to the Cone graphic (e.g., visualization changes with the intent of improving understanding).


NOAA will collect the following information:

  • Organizational past experience with hurricanes. This will help NOAA understand what types of decisions partners make related to hurricane preparedness.

  • Extent of damage and/or adverse effects organizations have experienced from past hurricanes. This will help NOAA determine the economic and societal outcomes partners have experienced as a result of hurricanes, so the agency can think about forecast and/or Cone improvements to contribute to positive future outcomes.

  • Preparedness actions and decision-making in advance of a hurricane. This will allow NOAA to understand whether partners have institutionalized procedures for hurricane preparation.

  • Use of information resources and tools, including those provided by NOAA. This will allow NOAA to identify whether NOAA products are embedded in partners’ decision-making and, if so, how deeply they are embedded.

  • Types of forecast parameters needed (e.g., storm track, storm intensity, and storm size). This will help NOAA identify whether the Cone currently conveys all the parameters essential to partner decision-making, or whether partners rely on/need a variety of forecast products. NOAA can use this information to determine whether it should find a better way to present its current hurricane products suite or whether it should create a more consolidated hurricane product in the future.

  • Ability to understand and interpret the information provided by the Cone graphic.

  • Feedback on ease of use and clarity of the Cone graphic. This will help NOAA decide whether Cone refinements are needed.

  • Suggestions for enhancing the content or design of the Cone graphic. This information will help NOAA determine what types of refinements are most desirable to key stakeholders.


It is anticipated that the information collected will be disseminated to the public or used to support publicly disseminated information. NOAA NWS will retain control over the information and safeguard it from improper access, modification, and destruction, consistent with NOAA standards for confidentiality, privacy, and electronic information. See response to Question 10 of this Supporting Statement for more information on confidentiality and privacy. The information collection is designed to yield data that meet all applicable information quality guidelines. Prior to dissemination, the information will be subjected to quality control measures and a pre-dissemination review pursuant to Section 515 of Public Law 106-554.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological techniques or other forms of information technology.


The survey will be conducted using the Qualtrics, Inc., online web data collection platform, and all responses will be collected via that platform. No paper forms are involved; only the online survey instrument will be used, and the instrument cannot be rendered as a printable form that respondents could easily fill out. Respondents will need only internet access to complete the survey; no additional software or specific operating system is required.


The results will be made available to the public in the form of a report with raw data tabulations provided in an appendix.


4. Describe efforts to identify duplication.


To the best of our knowledge, no other studies or data collection efforts have focused on this type of information about the Cone. As part of this project, ERG, NOAA’s contractor for this work, conducted an extensive literature review of nearly 60 scientific studies and reports and found no studies in the public domain that assess how these sectors use the Cone in decision-making.


The literature review revealed several types of misunderstandings of the Cone, primarily documented among members of the public or nonexperts. These misunderstandings arise both from the way people comprehend the map’s salient visual features (e.g., track line, cone boundary, legend) and from the way they perceive and process this information. Common misunderstandings documented in the literature include misinterpreting the Cone as the storm impact area, equating the size of the Cone with the size of the hurricane, and focusing on the center track line while failing to recognize that hurricanes do not always follow the forecast track or even fall within the Cone. Much of the existing research focuses on how these misinterpretations and errors in judgment have important consequences for decision-making, such as whether to shelter in place or evacuate.


A gap in the literature is that these studies are typically conducted with nonexperts (and often students), while the much wider user base of hurricane forecast information is not captured in this research. Furthermore, study participants are not typically given any information about the conventions of the display they are evaluating (e.g., the basis for the probability distributions or which data outliers may or may not be captured) or any social information (such as what the media is saying about a forecast or what friends and family are doing). Another limitation of the research is that study participants frequently make judgments based on a single piece of information in a controlled environment. In a real-world setting, people get information from multiple sources, and this information changes as a storm approaches land.


This survey, with its focus on the less-studied user base that is making critical decisions affecting people’s lives, national security, and economic welfare, fills an important gap. This survey will help the NWS understand how these users are interpreting the Cone, what other types of information these users are accessing, what decisions they are making based on this information, and where and how the Cone fits into this broader landscape of forecast information.


5. If the collection of information involves small businesses or other small entities, describe the methods used to minimize burden.


We expect that the majority of respondents will represent medium- to large-sized organizations, including utilities and oil and gas operations, or government organizations such as port authorities. However, a portion of the respondents would likely be classified as small businesses, such as those in scenic/sightseeing industries. To reduce the burden on those entities, we have designed a simple online survey consisting mostly of closed-ended questions. Respondents should be able to complete the survey in 20 minutes or less. The survey does not ask them to review records or search for other information.


6. Describe the consequences to the Federal program or policy activities if the collection is not conducted or is conducted less frequently.


If these data are not collected, NOAA will continue to lack insight into how important partners use the Cone of Uncertainty and other hurricane forecast information, how embedded these products are in their preparedness decision-making, and the benefits they derive from the information. NOAA will also lack information on how easy the Cone graphic and other NOAA hurricane forecast products are to use and interpret, and whether there are any unmet information needs. Without these data, NOAA will be unable to improve its forecast communication or develop informational products that better meet the needs of its stakeholders.


7. Explain any special circumstances that require the collection to be conducted in a manner inconsistent with OMB guidelines.


There are no inconsistencies with OMB guidelines.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB.

Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.

Specifically address comments received on cost and hour burden.

Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


A Federal Register Notice published on May 16, 2019 (84 FR 22115) solicited public comments. No comments were received.


9. Explain any decisions to provide payments or gifts to respondents, other than remuneration of contractors or grantees.


There are no gifts or payments being made to respondents.


10. Describe any assurance of confidentiality provided to respondents and the basis for assurance in statute, regulation, or agency policy.


NOAA will ensure the data are kept confidential to the extent allowed by law. While NOAA will distribute the survey to potential respondents via a web link, NOAA’s contractor, Eastern Research Group, Inc. (ERG), will collect and analyze the data. Prior to providing any data to NOAA, ERG will remove any identifying information from the survey responses.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.


The survey does not involve any sensitive questions.


12. Provide an estimate in hours of the burden of the collection of information.


Table 1 provides an estimate of the respondent breakdown for the survey, and Table 2 presents a burden estimate based on these respondents. Respondents should be able to complete the survey in 20 minutes or less. The survey does not ask them to review records or search for other information, and the majority of the questions are closed-ended.


The survey respondents can be broken into two broad categories:


  • Respondents on NWS Weather Forecast Office (WFO) lists. The NHC has identified 55 NWS WFOs in the geographic areas of interest (i.e., locations that experienced hurricanes in recent years): 30 inland and 25 coastal. Each WFO has developed its own partner list based on its Core Partners and the guidance provided under NWS Policy Directive 10-24 for providing impact-based decision support services. Based on information from the WFOs, we assume the average WFO list contains about 240 partner names and that, on average, about one-fourth (60) of the names on each list fall within the sectors of interest. For the 30 inland WFOs, we assume the lists contain no marine sector names and that the 60 names on each list are evenly divided among the other three sectors. For the 25 coastal WFOs, we expect the marine sector to make up one-half of the names on each list (30 names per WFO), with the remaining 30 names evenly divided among the other three sectors. The survey will involve collecting data from all names compiled through this process.


  • Respondents on targeted lists. The NWS has targeted lists of partners from the NHC (marine sector), which contain 232 marine partners’ names. The NWS also maintains a list of participants in its Weather-Ready Nation (WRN) Ambassador initiative, which works to build partnerships that strengthen partner and community resilience to extreme weather and water events. As of January 24, 2019, there were 10,561 ambassadors in the program. Based on discussions with the NOAA staff who manage the WRN initiative, we assume that one-tenth of the ambassadors (1,056) fall within the four sectors of interest and that those names are evenly divided among the sectors (264 names in each). In addition, there are 100 ambassadors in the WRN Aviation Ambassador program, all of whom fall within the transportation sector.


Based on past experience, we expect that 30 percent of the individuals who receive the survey will complete it. As noted above, Table 2 provides an estimate of the total respondent burden. Overall, we expect 1,406 people to respond to the survey; assuming 20 minutes per response yields an estimate of 468.5 burden hours and an estimated cost of $22,731.3
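As a cross-check, the totals above can be reproduced row by row from Table 1’s potential-respondent counts (an illustrative calculation only; the row labels and variable names below are ours, not survey fields):

```python
# Reproduce Table 2 from Table 1's row counts: 30% response rate, 20 minutes
# per response, $48.52/hour (BLS median wage for general/operations managers).
potential = {                        # (sector, source) -> potential respondents
    "Marine/NHC": 232, "Marine/WFO": 750, "Marine/WRN": 264,
    "Energy/WFO": 850, "Energy/WRN": 264,        # 850 = 600 inland + 250 coastal
    "Tourism/WFO": 850, "Tourism/WRN": 264,
    "Transport/WFO": 850, "Transport/WRN": 364,  # 364 = 264 + 100 aviation
}
responses = {k: round(n * 0.30) for k, n in potential.items()}
hours = {k: round(r * 20 / 60, 1) for k, r in responses.items()}  # per-row rounding
costs = {k: round(h * 48.52) for k, h in hours.items()}

total_responses = sum(responses.values())     # 1,406
total_hours = round(sum(hours.values()), 1)   # 468.5
total_cost = sum(costs.values())              # $22,731
```

Note that each row’s hours are rounded to one decimal place before summing, which is why the total is 468.5 hours rather than 1,406 × 20 ÷ 60 ≈ 468.7.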


We also expect that 1) the Office of the Federal Coordinator for Meteorology (OFCM) and the U.S. Departments of Energy and Transportation will share the survey link with their partners, and 2) that some respondents will share the link with a colleague or other member of their profession, but we have not made an attempt to estimate those numbers.


Table 1. Potential Number of Respondents for the Survey

| Respondent Source | Marine | Transport | Energy/Utilities | Tourism |
| --- | --- | --- | --- | --- |
| WFO Partner Lists (30 Inland) [a] | — | 600 | 600 | 600 |
| WFO Partner Lists (25 Coastal) [b] | 750 | 250 | 250 | 250 |
| NHC Marine Partner Lists [c] | 232 | — | — | — |
| WRN Ambassadors [d] | 264 | 364 [e] | 264 | 264 |
| Subtotals | 1,246 | 1,214 | 1,114 | 1,114 |

Total Potential Respondents: 4,688
Expected Response (30 Percent of Total): 1,406

[a] The average WFO list contains about 60 partner names across the sectors of interest. There are 30 inland WFOs, and these lists contain no marine sector names. The 60 names on each list are evenly divided among the remaining three sectors (20 names/sector x 30 WFOs = 600 names per sector).

[b] There are 55 NWS WFOs in the geographic areas of interest (i.e., locations that experienced hurricanes within recent years), each with its own partner list. The average WFO list contains about 60 partner names across the sectors of interest. There are 25 coastal WFOs, and the marine sector makes up half of the names on each list (30 names x 25 WFOs = 750 names). The remaining 30 names on each list are evenly divided among the other three sectors (10 names x 25 WFOs = 250 names per sector).

[c] The NHC maintains lists of marine partners; there are 232 names on these lists.

[d] As of January 24, 2019, there were 10,561 ambassadors in the program as a whole. Based on discussions with NOAA staff that manage the WRN initiative, we assume that one-tenth of the ambassadors (1,056) fall within the four sectors of interest and that those names are evenly divided among the four sectors (264).

[e] In addition to the 264 names in the transportation list that are part of the general WRN Ambassador program, there are another 100 ambassadors in the WRN Aviation Ambassador program.
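As a check, the footnote assumptions above reproduce Table 1’s subtotals exactly (an illustrative sketch; the variable names are ours):

```python
# Rebuild Table 1's sector subtotals from footnotes [a]-[e].
sectors = ["Marine", "Transport", "Energy/Utilities", "Tourism"]
counts = dict.fromkeys(sectors, 0)

# [a] 30 inland WFO lists: no marine names; 60 names split evenly
#     across the other three sectors (20 names/sector/WFO).
for s in sectors[1:]:
    counts[s] += 30 * 20                    # 600 per sector

# [b] 25 coastal WFO lists: marine is half of each 60-name list (30/WFO);
#     the remaining 30 names split evenly across the other three sectors.
counts["Marine"] += 25 * 30                 # 750
for s in sectors[1:]:
    counts[s] += 25 * 10                    # 250 per sector

counts["Marine"] += 232                     # [c] NHC marine partner lists

# [d] One-tenth of the 10,561 WRN Ambassadors fall in the four sectors,
#     split evenly (264 each); [e] plus 100 aviation ambassadors.
for s in sectors:
    counts[s] += 10_561 // 10 // 4          # 264 per sector
counts["Transport"] += 100                  # [e]

total = sum(counts.values())                # 4,688 potential respondents
expected = int(total * 0.30)                # 1,406 expected responses
```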

Table 2. Time Burden Estimates for the Survey

| Sector | Group | Total Responses [a] | Time per Response | Total Burden Hours | Total Annual Cost [b] |
| --- | --- | --- | --- | --- | --- |
| Marine | NHC Contacts | 70 | 20 minutes | 23.3 | $1,131 |
| Marine | WFO Lists | 225 | 20 minutes | 75.0 | $3,639 |
| Marine | WRN Ambassadors | 79 | 20 minutes | 26.3 | $1,276 |
| Energy/Utilities | WFO Lists | 255 | 20 minutes | 85.0 | $4,124 |
| Energy/Utilities | WRN Ambassadors | 79 | 20 minutes | 26.3 | $1,276 |
| Tourism | WFO Lists | 255 | 20 minutes | 85.0 | $4,124 |
| Tourism | WRN Ambassadors | 79 | 20 minutes | 26.3 | $1,276 |
| Transportation | WFO Lists | 255 | 20 minutes | 85.0 | $4,124 |
| Transportation | WRN Ambassadors | 109 | 20 minutes | 36.3 | $1,761 |
| Totals | | 1,406 | | 468.5 | $22,731 |

[a] Total responses based on a 30 percent response rate. Each respondent will provide only one response per year.

[b] Cost was calculated using the median hourly rate ($48.52) for general and operations managers from the Bureau of Labor Statistics: https://www.bls.gov/oes/current/oes111021.htm.

13. Provide an estimate of the total annual cost burden to the respondents or record-keepers resulting from the collection (excluding the value of the burden hours in Question 12 above).


There are no recordkeeping costs associated with this collection effort.


14. Provide estimates of annualized cost to the Federal government.


The total cost to the federal government is $169,013. As detailed in Table 3, this includes the cost of the contract under which the survey will be developed, administered, analyzed, and reported ($140,455) and the federal labor cost for pre-solicitation of the contract ($5,886); reviewing proposals ($2,834), deliverables ($6,540), and the final presentation ($654); preparing the final Federal Register Notice ($872); and coordinating the overall contract process ($11,772).

Table 3. Estimated Annualized Cost to the Federal Government

| Cost Description | Grade/Step | Loaded Labor Cost (Hourly) | Effort (Hours) | Cost to Government [a] |
| --- | --- | --- | --- | --- |
| Federal Oversight | | | | |
| Initial scoping, draft SOW, pre-award calls | GS-14/GS-15 | $109.00 | 54 | $5,886.00 |
| Review proposals, conduct review calls, debrief non-selected bidders | GS-14/GS-15 | $109.00 | 26 | $2,834.00 |
| Review deliverables | GS-14/GS-15 | $109.00 | 60 | $6,540.00 |
| Attend and comment on final presentation | GS-14/GS-15 | $109.00 | 6 | $654.00 |
| Prepare Federal Register Notice | GS-14/GS-15 | $109.00 | 8 | $872.00 |
| Monthly coordination calls | GS-14/GS-15 | $109.00 | 108 | $11,772.00 |
| Total Federal Oversight Cost | | | | $28,558.00 |
| Total Contract Cost | | | | $140,455.00 |
| Total Cost to Government | | | 262 | $169,013 [b] |

[a] Federal oversight was estimated by multiplying the fully loaded hourly wage of $109 (column 3) for federal employees (at high GS-14 and low GS-15) by the hours (column 4) spent on each task.

[b] Total cost to government equals sum of federal oversight costs ($28,558) plus contract cost ($140,455).
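The figures in Table 3 follow directly from the $109 loaded rate and the fixed contract cost (an illustrative arithmetic check; task labels are abbreviated):

```python
# Verify Table 3: federal oversight = task hours x $109/hour loaded rate,
# then add the fixed contract cost.
task_hours = {
    "scoping/SOW/pre-award": 54,
    "proposal review": 26,
    "deliverable review": 60,
    "final presentation": 6,
    "Federal Register Notice": 8,
    "coordination calls": 108,
}
oversight_hours = sum(task_hours.values())      # 262 hours
oversight_cost = oversight_hours * 109.00       # $28,558
total_cost = oversight_cost + 140_455.00        # + contract = $169,013
```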


15. Explain the reasons for any program changes or adjustments.


This is a new program.


16. For collections whose results will be published, outline the plans for tabulation and publication.


NOAA will develop tabulations and cross-tabulations of the collected data as needed. NOAA will collect the data over the 1 to 2 months following approval. The information will be published via the NWS website.


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate.


NOAA will display the expiration date.


18. Explain each exception to the certification statement.


There are no exceptions for compliance with provisions in the certification statement.


1 NOAA. (2006). Service Assessment: Hurricane Charley, August 9–15, 2004. National Oceanic and Atmospheric Administration, National Weather Service. Silver Spring, MD. https://www.weather.gov/media/publications/assessments/Charley06.pdf

Broad, K., Leiserowitz, A., Weinkle, J., and Steketee, M. (2007). Misinterpretations of the “Cone of Uncertainty” in Florida during the 2004 Hurricane Season. Bulletin of the American Meteorological Society, 88(5): 651–667. https://journals.ametsoc.org/doi/pdf/10.1175/BAMS-88-5-651

Meyer, R., Broad, K., Orlove, B., and Petrovic, N. (2013). Dynamic Simulation as an Approach to Understanding Hurricane Risk Response: Insights from the Stormview Lab. Risk Analysis, 33(8): 1532–1552. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1539-6924.2012.01935.x

Wu, H.-C., Lindell, M.K., Prater, C.S., and Samuelson, C.D. (2014). Effects of Track and Threat Information on Judgments of Hurricane Strike Probability. Risk Analysis, 34(6): 1025–1039. https://onlinelibrary.wiley.com/doi/epdf/10.1111/risa.12128

Liu, L.Y., Mirzargar, M., Kirby, R.M., Whitaker, R.T., and House, D.H. (2015). Visualizing Time-Specific Hurricane Predictions, with Uncertainty, from Storm Path Ensembles. Computer Graphics Forum, 34(3): 371–380. https://doi.org/10.1111/cgf.12649

Losee, J.E., Naufel, K.Z., Locker, L., and Webster, G.D. (2017). Weather Warning Uncertainty: High Severity Influences Judgment Bias. Weather, Climate, and Society, 9: 441–454. https://doi.org/10.1175/WCAS-D-16-0071.1

Milch, K., Broad, K., Orlove, B., and Meyer, R. (2018). Decision Science Perspectives on Hurricane Vulnerability: Evidence from the 2010–2012 Atlantic Hurricane Seasons. Atmosphere, 9(1): [32]. http://www.mdpi.com/2073-4433/9/1/32

Bostrom, A., Morss, R.E., Lazo, J.K., Demuth, J., and Lazrus, H. (2018). Eyeing the Storm: How Residents of Coastal Florida See Hurricane Forecasts and Warnings. International Journal of Disaster Risk Reduction. https://www.sciencedirect.com/science/article/pii/S221242091830219X

Padilla, L.M., Ruginski, I.T., and Creem-Regehr, S.H. (2017). Effects of Ensemble and Summary Displays on Interpretations of Geospatial Uncertainty Data. Cognitive Research: Principles and Implications, 2(1): 40. https://link.springer.com/content/pdf/10.1186%2Fs41235-017-0076-1.pdf

Ruginski, I.T., Boone, A.P., Padilla, L.M., Liu, L., Heydari, N., Kramer, H.S., Hegarty, M., Thompson, W.B., House, D.H., and Creem-Regehr, S.H. (2016). Non-Expert Interpretations of Hurricane Forecast Uncertainty Visualizations. Spatial Cognition & Computation, 16(2): 154–172. https://www.tandfonline.com/doi/full/10.1080/13875868.2015.1137577

NTSB. (2017). Safety Recommendation Report: Tropical Cyclone Information for Mariners (accident number DCA16MM001). National Transportation Safety Board. NTSB/MSR-17/02. https://www.ntsb.gov/investigations/AccidentReports/Reports/MSR1702.pdf

Sherman-Morris, K., and Del Valle-Martinez, I. (2017). Optimistic Bias and the Consistency of Hurricane Track Forecasts. Natural Hazards, 88(3): 1523–1543. https://doi.org/10.1007/s11069-017-2931-2

2 The sectors selected have some overlap and some entities may need to decide which sector best describes their operations. For example, a charter boat operator could reasonably place him- or herself in the “marine” or “tourism” sectors. The survey allows respondents to select which sector best describes their operations. During the survey design process, the NWS determined that allowing respondents to select their sector would be preferable to providing detailed definitions and several options to select from.

3 Calculation: 1,406 respondents x 20 minutes = 28,120 minutes; 28,120 ÷ 60 minutes/hour ≈ 468.7 hours. Table 2 reports 468.5 hours because each row’s hours are rounded to one decimal place before summing. 468.5 hours x $48.52 = $22,731. Note that $48.52 is the median hourly rate for general and operations managers from the Bureau of Labor Statistics: https://www.bls.gov/oes/current/oes111021.htm.




