FHWA Roadway Safety Data Program
State Roadway Safety Data Capability Assessment Questionnaire
State
Date
Area 1: Data Collection/Technical Standards
Other Data for Safety Performance Management
Other Data Supporting Safety Performance Management
Area 2: Data Analysis Tools and Uses
Element 2A: Network Screening (Data)
Element 2A: Network Screening (Method)
Element 2A: Network Screening (Coverage)
Element 2C: Countermeasure Selection
Element 2E: Accessibility
Road Inventory Data Accessibility
Other Safety Performance Data Accessibility
Area 3: Data Management and Governance
Element 3A: Roles and Responsibilities
Traffic Records Coordinating Committee
Element 4B: Expandability
Element 4C: Spatial Data Integration
Road Inventory Data Linkages in Support of Data Integration
Other Safety Performance Data Linkages
Area 5: Safety Performance Management
Element 5A: Performance-Based Planning and Programming
Element 5B: Interagency Coordination
Appendix A: For IT Professionals
Area 3: Data Management and Governance
Element 3A: Roles and Responsibilities
AASHTOWare SafetyAnalyst™ |
A set of software tools that use SPFs for screening roadway locations and that contain over 100 SPFs for various roadway segment types. SafetyAnalyst includes modules for identifying locations with potential for safety improvement (network screening), diagnosis and countermeasure selection, economic appraisal and priority ranking, and evaluation of implemented improvements. |
Accessibility |
A measure of how easily legitimate users can retrieve and manipulate data in a system, particularly users other than the data system owner. |
Accuracy |
How closely the data match the true value of each internal and external roadway inventory element. The external accuracy of roadway inventory data can only be verified by direct observation – survey, photo or video log, aerial photos, etc. Internal accuracy concerns whether legitimate data values are present and can be monitored through computerized checks. |
Application Developers |
IT personnel who design and write code for software. |
ARNOLD |
All Roads Network of Linear Referenced Data (ARNOLD) is the FHWA-sponsored effort to encourage States to develop an enterprise-wide linear referencing system for all public roads. |
Automated enforcement device |
An electronic citation issuance device related to speed, red-light running or other enforcement. |
Basemap |
In a GIS, the overlay of locations onto spatial coordinates to represent the physical environment, including roadways. |
Before-after study |
The evaluation of implemented safety treatments, accomplished by comparing expected frequency or severity of crashes before and after implementation. There are several different types of before-after studies. These studies often develop crash modification factors for a particular treatment or group of treatments. Also known as BA studies. |
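For illustration only (not part of the definition), the simplest before-after estimate of a treatment's crash modification factor compares crashes observed after implementation with the crashes expected had the treatment not been made; EB-based studies refine the expected value to correct for regression-to-the-mean:

```latex
% Illustrative sketch: naive before-after CMF estimate
\mathrm{CMF} \approx \frac{N_{\text{observed,after}}}{N_{\text{expected,after}}}
```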
Business Rule |
A formally stated constraint governing the characteristic or behavior of an object or the relationship between objects (entities) used to control the complexity of the activities of an enterprise. Example: the standard width of an Interstate lane is 12 feet. |
Business Users |
Application software users from among stakeholder agencies. |
Centralized IT |
An organizational structure in which IT is a State agency separate from the DOT (though the DOT may still have an internal IT group), as opposed to an IT department within the DOT with no statewide central authority over all State government IT programs. |
Collision diagram |
A two-dimensional plan-view representation of the crashes that have occurred at a site within a time period. It simplifies the visualization of crash patterns. Clusters of crashes by collision type that might otherwise be overlooked may become evident on the diagram. |
Completeness |
The extent to which the all-public-roads inventory data that could potentially be collected and stored are actually present in the final electronic data file. |
Computerized internal checks |
For coded elements (e.g., pavement type, shoulder type), the entered value would be compared to legitimate codes and flagged and corrected if not legitimate. Reasonable ranges might include lane width between 8 and 13 feet or AADT for two-lane rural roads that are non-zero and less than some reasonable maximum value. “Agreement with related variables” might mean a shoulder width of zero when a curb is present. |
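For illustration only, a minimal sketch of such checks in Python, using hypothetical field names and the example rules above (the codes and ranges are illustrative, not a State standard):

```python
# Illustrative internal checks for one roadway inventory record (hypothetical field names).
VALID_PAVEMENT_CODES = {1, 2, 3, 4}  # e.g., concrete, asphalt, gravel, unpaved

def check_record(rec):
    """Return a list of quality flags for one inventory record."""
    flags = []
    # Legitimate-code check for a coded element.
    if rec["pavement_type"] not in VALID_PAVEMENT_CODES:
        flags.append("pavement_type: not a legitimate code")
    # Reasonable-range check (lane width between 8 and 13 feet).
    if not 8 <= rec["lane_width_ft"] <= 13:
        flags.append("lane_width_ft: outside reasonable range (8-13 ft)")
    # Agreement with a related variable: curb present implies zero shoulder width.
    if rec["curb_present"] and rec["shoulder_width_ft"] != 0:
        flags.append("shoulder_width_ft: expected 0 where a curb is present")
    return flags

print(check_record({"pavement_type": 9, "lane_width_ft": 14,
                    "curb_present": True, "shoulder_width_ft": 4}))
```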
Condition diagram |
Similar to a collision diagram, but instead of crash information, it provides detailed site characteristics and information on the surrounding land uses. |
Count Program |
Traffic counting programs may be described by type of count requested and the nature of the counting hardware used. For example:
Permanent counts are typically year-round and involve installation of sensors in the roadbed (or overhead for video-based sensors) connected to a digital counting station. |
Critical rate |
A method in which the observed crash rate at each site is compared to a calculated critical crash rate that is unique to each site. |
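For illustration only, one common form of the critical rate calculation (following the Highway Safety Manual formulation) is:

```latex
% Sketch of the HSM critical-rate form
R_c = R_a + z\sqrt{\frac{R_a}{\mathrm{MEV}}} + \frac{1}{2\,\mathrm{MEV}}
```

where R_a is the average crash rate for the site's reference population, z is a confidence-level coefficient, and MEV is the site's exposure (e.g., million entering vehicles for intersections); sites whose observed rate exceeds R_c are flagged for further study.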
Cross-functional Teams |
Any multi-agency/multi-business area group brought together in an advisory capacity. In safety data governance, the cross-functional teams may serve as the data governance group/board. |
Data Administrators |
Personnel in charge of the data system and who oversee its operation. |
Data Business Plan |
A plan that lists tasks and activities within each task and is aimed at improving data completeness, timeliness, accuracy, uniformity, integration, and accessibility. It will cover data collection and planned data uses. |
Data Custodians |
Individuals responsible for the technical support of the data applications, which may include activities such as data loading, maintaining data dictionaries, data models, and back-up and recovery procedures for databases. |
Data Governance Board or Council |
A high-level data governance structure in the organization that typically includes senior managers. Responsibilities may include identifying priorities for data governance policies, projects, or system enhancements, and the authorization, implementation and enforcement of data governance policies and standards. |
Data Governance Plan |
A plan establishing accountability for the management of an organization’s data assets to achieve its business purposes and to comply with any relevant legislation, regulation, and business practice. |
Data Owners |
People or groups with decision-making authority for initiating or discontinuing the data program and who determine what data are collected. |
Data Quality Performance Measures and Metrics |
Defined measurements made to assess the quality of data (e.g., time between roadway modification and modification of roadway inventory data in the official inventory file). For safety data (including inventory data), one or more measurements are defined for each of six data-quality criteria – accuracy, timeliness, completeness, uniformity, accessibility and integration. Performance metrics are the goals associated with each data quality performance measure. |
Data Quality Standards |
The operational definitions established through data governance processes that describe how the data are to be collected, the QA/QC processes for managing data quality, and, ultimately, the numeric targets to be achieved in completeness, timeliness, accuracy, uniformity, integration, and accessibility. |
Data Stewards |
People who are accountable for the quality, value and appropriate use of the data. |
Data Stewardship |
The formal, specifically assigned and entrusted accountability for business (as opposed to information technology) responsibilities ensuring effective control and use of data and information assets. |
Database architecture |
The design of data structures within a system and the relationships among the various data tables. |
Decentralized DOT |
An organizational structure in which the State DOT headquarters provides policies, tools, and oversight while Districts play a large role in developing, designing, and constructing projects. |
Diagnosis |
The identification of factors that may contribute to a crash. |
Dynamic Segmentation |
A method of defining roadway sections based on a change in features, attributes, or events to trigger the start of a new segment. |
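For illustration only, a minimal Python sketch of the idea with hypothetical mileposted attribute data (production implementations operate against the LRS):

```python
# Illustrative dynamic segmentation: start a new segment wherever an attribute changes.
# Each event is (milepost, attribute_value) along one route, sorted by milepost.
events = [(0.0, 2), (1.3, 2), (2.1, 4), (3.8, 4), (5.0, 2)]  # e.g., lanes per direction

def segment(events, route_end):
    segments, start, current = [], events[0][0], events[0][1]
    for mp, value in events[1:]:
        if value != current:  # an attribute change triggers a new segment
            segments.append((start, mp, current))
            start, current = mp, value
    segments.append((start, route_end, current))  # close out the final segment
    return segments

print(segment(events, route_end=6.2))
# [(0.0, 2.1, 2), (2.1, 5.0, 4), (5.0, 6.2, 2)]
```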
EMS |
Emergency Medical Services. In this study, EMS refers to the data source consisting of run reports from ambulance services. |
Enterprise, enterprise-wide |
Term used to describe data systems that span the full range of a department’s areas of responsibility. A single, comprehensive and all-encompassing system. |
Equivalent property damage only (EPDO) average crash frequency with EB adjustments |
Rather than looking at crash severities separately, this measure combines all crashes using a weighted average. Specifically, it converts all crashes to property-damage-only (PDO) crashes, so an injury crash represents X PDO crashes and a fatal crash represents Y PDO crashes. These multipliers X and Y are typically calculated based on accepted crash costs by crash severity. |
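For illustration only, with hypothetical weights (actual multipliers come from accepted crash-cost figures), a site with 1 fatal, 3 injury, and 10 PDO crashes would score:

```latex
% Illustrative EPDO score with hypothetical weights Y = 542 (fatal), X = 11 (injury)
\mathrm{EPDO} = Y\,N_{\text{fatal}} + X\,N_{\text{injury}} + N_{\text{PDO}}
             = 542(1) + 11(3) + 10 = 585
```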
Estimated AADT |
A data source listing the estimated annual average daily traffic on local roads and rural minor collector roads and intersections as defined in the roadway inventory. |
Excess expected average crash frequency with EB adjustments |
The observed average crash frequency and the predicted crash frequency from a safety performance function are weighted together using the Empirical Bayes (EB) method to calculate an expected average crash frequency. The resulting expected average crash frequency is compared to the predicted average crash frequency from an SPF. The difference between the EB-adjusted average crash frequency and the predicted average crash frequency from an SPF is the excess expected average crash frequency. |
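For illustration only, in the HSM's EB formulation (k is the overdispersion parameter of the SPF):

```latex
% Sketch of the EB weighting and the excess measure
N_{\text{expected}} = w\,N_{\text{predicted}} + (1-w)\,N_{\text{observed}},
\qquad w = \frac{1}{1 + k \sum N_{\text{predicted}}},
\qquad \text{Excess} = N_{\text{expected}} - N_{\text{predicted}}
```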
Excess predicted average crash frequency |
Method in which sites are ranked according to the difference between the observed crash frequency and the predicted crash frequency based on a safety performance function. |
Expected crashes |
An estimate of the long-range average number of crashes per year for a particular type of roadway or intersection. |
Federal-aid Highways/System |
All public roads other than rural minor collectors, rural local roads, and urban local roads (by functional classification). |
GIS |
Geographic Information System is an electronic mapping system using spatial coordinates (latitude/longitude) to associate data with specific locations on a base map. |
HSIP |
Highway Safety Improvement Program – a core Federal-aid program aimed at achieving a significant reduction in traffic fatalities and serious injuries on all public roads, including non-State-owned public roads and roads on tribal lands. The HSIP requires a data-driven, strategic approach to improving highway safety on all public roads that focuses on performance. |
HSP |
Highway Safety Plan – a document submitted by States each fiscal year that describes the strategies and projects the State plans to implement and the resources from all sources it plans to use to achieve its highway safety performance targets. The HSP is coordinated with the State strategic highway safety plan as defined in 23 U.S.C. 148(a). |
In-patient billing |
A statewide data source that provides information on all patients admitted to the hospital. The billing information includes causes of injury (e.g., motor vehicle crashes), types and severity of injuries, treatments, and charges. |
Integration |
The ability to merge data from different systems and/or different agencies into a single analytic resource. |
Interactive Highway Safety Design Model |
The IHSDM – HSM Predictive Method 2016 Release includes six evaluation modules: Crash Prediction, Policy Review, Design Consistency, Traffic Analysis, Driver/Vehicle, and Intersection Review. To the extent possible, the Crash Prediction Module (CPM) faithfully implements Part C (Predictive Method) of AASHTO's 1st Edition Highway Safety Manual for evaluating rural 2-lane highways, rural multilane highways, and urban/suburban arterials, as well as HSM 2014 Supplement materials on freeway segments and freeway ramps/interchanges. |
Level of service of safety |
The ranking of sites according to their predicted and expected crash frequency for the entire population, where the degree of deviation is then assigned to one of four level-of-service classes. |
Local and regional jurisdictions |
Local (county and municipal) governments/agencies as well as entities such as Metropolitan Planning Organizations and Regional Planning Organizations that coordinate efforts among these agencies and entities. |
LRTP |
Long Range Transportation Plan |
Metadata |
A set of data that describes and gives information about other data. |
Method of moments |
Method in which a site’s observed accident frequency is adjusted based on the variance in the crash data and average crash counts for the site’s reference population. |
MIRE |
Model Inventory of Roadway Elements – a guideline to help agencies improve their roadway and traffic inventories. |
MPA |
Metropolitan Planning Area – the area in which the metropolitan transportation planning process is carried out. The MPA is made up of the census-defined Urbanized Area (UZA) plus the contiguous area expected to become urbanized within the next 20 to 25 years. |
MPO |
Metropolitan Planning Organization – Federal transportation laws and regulations require the establishment of an MPO in every urbanized area of the U.S. with a population over 50,000. MPOs are responsible for meeting the federal metropolitan planning regulations for transportation. |
Network screening |
Process by which State or local agencies identify sites with “safety issues”. This is an initial cut at identifying sites with potential for treatment. Further studies are necessary (diagnosis) to determine specific issues and appropriate treatments. |
Non-public roadways |
Roads that are not functionally classified, e.g., private roads, roads on military installations, and National Park Service roads. |
Performance Measure |
An expression based on a metric, used to establish targets and to assess progress toward achieving the established target. |
Performance Metric |
A quantifiable indicator of performance or condition. |
Performance Target |
A quantifiable level of performance or condition, as a value for a measure, to be achieved within a specified period. |
Performance threshold |
A numerical value used to establish a threshold for the expected number of crashes (i.e., safety performance) at sites under consideration. |
Predicted crashes |
The estimate of long-term average crash frequency which is forecast to occur at a site using a predictive model or safety performance function suitable to the roadway type under consideration. |
Publicly-owned non-State-maintained roadways |
Includes county and local/municipality roads that are not maintained by the State. |
RDIP |
Roadway Data Improvement Program (RDIP) offers expert review and recommendations on a State’s roadway data management and data quality. |
Regression-to-the-mean |
When a period with a comparatively high crash frequency is observed, it is statistically probable that a lower crash frequency will be observed in the following period. This tendency is known as regression-to-the-mean. |
Relational Database |
A modern database structure characterized by defined data tables related to each other by keys in a hierarchical manner that avoids duplicating data elements. Examples of relational database software environments in transportation include Oracle and SQL Server. |
Relative severity index |
An average crash cost calculated from the crash types and severities at each site and compared to an average crash cost for sites with similar characteristics, to identify sites that have a higher-than-average crash cost. The crash costs can include direct costs only (the economic costs of the crashes) or both direct and indirect costs. |
Roadway inventory types |
In this document, primary inventory types surveyed include roadway segments (e.g., number of lanes, shoulder width, AADT), intersections (e.g., type, traffic control, crossing street AADT), interchanges (e.g., type of interchange, lighting), ramps (e.g., ramp AADT, length, type), curves (e.g., length, degree of curve), and grades (e.g., percent grade, up or downgrade). Supplemental information is also collected on roadside object inventories, sign inventories, speed data inventories, and safety improvement inventories. |
Roadway Segment |
A portion of roadway as defined in the State’s roadway inventory system. The method of defining when new segments start varies among States and sometimes within a State depending on the roadway type under consideration. Usually segments are defined to be homogenous with respect to key features such as pavement width, number of lanes, median type, AADT, etc. |
Rolling Average |
A method for smoothing time series data by averaging data points for a fixed number of consecutive terms. Each data point of the series is sequentially included in the averaging, while the oldest data point in the span of the average is removed. |
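For illustration only, the n-term rolling average at time t is:

```latex
\bar{x}_t = \frac{1}{n}\sum_{i=0}^{n-1} x_{t-i}
```

so a 5-year rolling average of fatalities for 2016 averages the 2012–2016 counts; the 2017 value then drops 2012 and adds 2017.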
RPO/RPC/RPA |
Regional Planning Organization/Regional Planning Commission/Regional Planning Agency |
Safety performance functions |
An equation used to estimate or predict the average crash frequency per year at a location as a function of traffic volume and, in some cases, roadway or intersection characteristics (e.g., number of lanes, traffic control, or median type). |
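For illustration only, a common functional form for roadway-segment SPFs (coefficients are estimated from each jurisdiction's data) is:

```latex
% Sketch of a typical segment SPF; beta_0 and beta_1 are regression coefficients
N_{\text{predicted}} = e^{\beta_0}\,\mathrm{AADT}^{\beta_1}\,L
```

where L is segment length in miles; intersection SPFs typically use major- and minor-road AADT in place of a single volume.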
Scenario Analysis |
A technique used to compare different transportation investment options based on their predicted impacts on transportation system performance. |
Service Level Agreements |
Interagency agreements for developing data elements at a specified quality. |
SHSO |
State Highway Safety Office - lead coordinator for traffic safety programs in each State, led by the Governor’s designated Highway Safety Representative. |
Site address |
Roadway system screening to identify sites for potential treatment requires multiple years of crash data. Changes to the roadway system (e.g., lengthening of a curve; realignment of a roadway section, opening a new intersection thus creating two segments and one new intersection) can change the “site address” (route/milepost) for modified and downstream locations before and after the change. |
State-maintained roadway |
Any roadway segment, regardless of ownership, that is maintained by the State DOT. |
State-system |
The roadway network under the control of (“owned by”) the State DOT. The remainder of the public mileage in a State (i.e., “non-State public roads”) is predominantly owned by local governments (e.g., county or municipal) or the federal government (e.g., national park roads). |
STIP |
State Transportation Improvement Program or MPO Transportation Improvement Program |
Supplemental Databases |
In this data capability assessment, supplemental data refers to information related to safety that is beyond the typical inventory files maintained by a State DOT. The list includes estimated AADT, roadside fixed objects, signs, speed data, automated enforcement devices, safety-related land use, bridges/structures, railroad grade crossings, safety improvements and others. |
Temporal trends |
Time-related factors that influence reported crashes and can change throughout a given study period, including crash reporting thresholds, weather conditions, etc. |
TIP |
Transportation Improvement Program is a capital improvement program developed cooperatively by local and state transportation entities. |
Trauma care |
A data source usually stored in a Trauma Registry reporting care provided by designated trauma centers. The data source typically includes information on cause of injury (including motor vehicle crashes), the extent of injuries, and the treatments provided. Linking data from crashes and trauma registries can be used to improve the accuracy of data on crash injuries, medical outcomes, and the economic cost of motor vehicle crashes. |
UZA |
Urbanized Area – an urban area, as identified by the Census Bureau, with 50,000 or more people. |
VMT |
Vehicle Miles of Travel – the product of traffic volume and the length of the road segment(s) within a defined scope of interest during the period of interest. |
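For illustration only (hypothetical numbers), a 2.5-mile segment carrying an AADT of 10,000 vehicles generates, over one year:

```latex
\mathrm{VMT} = 10{,}000 \times 2.5 \times 365 = 9{,}125{,}000 \ \text{vehicle-miles}
```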
How is the State roadway basemap constructed? (Select one)
|
|
|
|
Does the State have an ARNOLD-compliant LRS/network? (Select one)
|
|
How is the inventory data stored (e.g., in GIS tables on Oracle server)?
What roadways are covered in the basemap? (Check all that apply)
State-owned (percent):
State-maintained (percent):
Non-State public (local, tribal, federal) (percent):
Non-public (percent):
Do you have defined quantitative data quality performance measures and metrics (goals) for the set of roadway inventory data elements you collect? (Check all that apply.) If so, please provide any documentation on what they are, how they are measured, and how they are tracked.
|
|
|
|
|
|
|
Descriptions of these terms are available in the NHTSA Performance Measures White Paper,
https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/811441
Up-to-date data quality performance measures are available in the NHTSA Traffic Records Assessment Advisory, https://www.transportation.gov/government/traffic-records/nhtsa-traffic-records-program-assessment-advisory
Does the State take an approach in which all business units can use, and integrate their data into, one LRS/map? (Select one)
|
|
Are there any Service Level Agreements (SLAs) (i.e., interagency formal agreements or memoranda of understanding assigning responsibilities) in place for maintaining the basemap that include components for completeness, timeliness, accuracy, uniformity, accessibility, and integration? (Select one)
|
|
General
Which of the following statements best describes the status of your agency’s efforts to identify and address gaps in data needed for safety performance management? (Select one)
Gaps in meeting safety performance data requirements are not yet well understood (1)
Safety data gaps have been identified and work is underway to identify actions required to fill these gaps (2)
There is an established plan for filling safety data gaps but implementation has not yet begun (3)
There is an established plan, and several data improvement initiatives underway to fill safety data gaps (4)
There is an established plan, essential safety data gaps have been filled, and there is a continuous improvement process to sustain and improve data quality (5)
Which of the following practices do you employ to manage the quality of your crash data? (Check all that apply)
|
|
|
|
|
|
|
|
|
What percent of your state’s public road mileage is State owned or maintained?
State-owned (HPMS item 6 = 1): _________
State-maintained (HPMS item 68 = 1): _______
What percent of your state’s public road system has roadway inventory data that are maintained electronically?
State-owned (percent): ____
State-maintained (percent): _____
Non-State public (local, tribal, federal) (percent): ____
Non-public (percent): ____
How does the State collect data on local roads? (Check all that apply)
|
|
|
|
NOTE: If more than one answer applies in question 3, please describe the circumstances in which each of the employed data collection methods is used.
Please describe any data sharing agreements and/or practices you have with local, tribal, federal, and other agencies that maintain roads in the State.
What percentage of the MIRE elements are in the State road inventory file, what percent of public roadways are they collected on, and what elements are planned for future collection?
Worksheet can be found in Appendix B. Additional information on MIRE can be found at http://safety.fhwa.dot.gov/rsdp/mire.aspx.
MIRE Elements (# of elements) |
Is this collected? (Y/N) |
Percent of Elements |
Percent of Roadways Collected On |
Planned Future Collection |
|||
State % |
Local/Federal % |
State (All = 100%, Most > 50%, Some < 50%, None = 0%) |
Local/Federal/Tribal (All = 100%, Most > 50%, Some < 50%, None = 0%) |
State (Y/N) |
Local/Federal (Y/N) |
||
I. Roadway Segment (109) - Page 92 Appendix B |
|
|
|
|
|
|
|
II. At-Grade Intersection/Junctions (18) – Page 98 Appendix B |
|
|
|
|
|
|
|
III. Intersection Leg (Each Approach) (40) – Page 99 Appendix B |
|
|
|
|
|
|
|
IV. Interchange/Ramp (25) – Page 102 Appendix B |
|
|
|
|
|
|
|
V. Horizontal Curve (8) - Page 104 Appendix B |
|
|
|
|
|
|
|
VI. Vertical Grade (5) – Page 104 Appendix B |
|
|
|
|
|
|
|
What supplemental datasets are included in your roadway inventory system? If collected, please briefly describe how the data are being collected and maintained.
NOTE: For this assessment, it is recognized that States may differ in what they collect in each of these systems. A description of the contents of each of the systems is not required. If there is any doubt whether a particular system meets the sense of this assessment, the State may provide a data element list to show the contents of each of the supplemental datasets. Alternatively, the State and/or the assessors can make brief notes describing a system’s contents if the State desires.
Data collection techniques/technologies may include: As-built plans (AB), Field survey (FS), Instrumented vehicle (IV), Aerial Photos (AP), and Other, please describe (O, description).
State-Owned Roadway Network
Supplemental Data |
Collected State (All = 100%, Most > 50%, Some < 50%, None = 0%) |
How is it Collected |
How is it linked to other location-based data? |
How is it Stored |
Access management |
|
|
|
|
Automated enforcement devices |
|
|
|
|
Curve inventories |
|
|
|
|
Grade inventories |
|
|
|
|
Guard rails |
|
|
|
|
ITS Devices |
|
|
|
|
Lighting |
|
|
|
|
National Bridge Inventory |
|
|
|
|
Pedestrian (e.g., counts, sidewalks, trails, injuries) |
|
|
|
|
Bicycle (e.g., counts, sidewalks, trails, injuries) |
|
|
|
|
Pavement condition (IRI) |
|
|
|
|
Pavement markings |
|
|
|
|
FRA Highway-Rail Crossing Inventory |
|
|
|
|
Fixed Object on Side of Roadway |
|
|
|
|
Safety improvements (planned/programmed? Completed/work history? HSIP annual report requirements?) |
|
|
|
|
Signs |
|
|
|
|
Speed (please describe in notes) |
|
|
|
|
Other (please describe).
|
|
|
|
|
Locally-owned roadway network
Supplemental Data |
Collected Local/Federal/Tribal (All = 100%, Most > 50%, Some < 50%, None = 0%) |
How is it Collected |
How is it linked to other location-based data? |
How is it Stored |
Access management |
|
|
|
|
Automated enforcement devices |
|
|
|
|
Curve inventories |
|
|
|
|
Grade inventories |
|
|
|
|
Guard rails |
|
|
|
|
ITS Devices |
|
|
|
|
Lighting |
|
|
|
|
National Bridge Inventory |
|
|
|
|
Pedestrian (e.g., counts, sidewalks, trails, injuries) |
|
|
|
|
Bicycle (e.g., counts, sidewalks, trails, injuries) |
|
|
|
|
Pavement condition (IRI) |
|
|
|
|
Pavement markings |
|
|
|
|
FRA Highway-Rail Crossing Inventory |
|
|
|
|
Fixed Object on Side of Roadway |
|
|
|
|
Safety improvements (planned/programmed? Completed/work history? HSIP annual report requirements?) |
|
|
|
|
Signs |
|
|
|
|
Speed Limits/Speed Zones |
|
|
|
|
Other (please describe).
|
|
|
|
|
How would the State prioritize, in rank order, improving the collection of the following items for safety purposes? If resources/grants (cost, labor, etc.) were available to assist with a larger data collection effort, where would the State want to spend those resources? (Rank the items with 1 as top priority; ties are acceptable.) NOTE: Items for which the State feels it already has adequate data to support its programs and decision making would receive a lower priority in this ranking because they would not need improvement. Low priority should also be given to those items for which the State has no interest in improving or expanding data collection. Highest priority should be given to those items that need improving and for which the State sees a clear need for better data.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
What is the status of the State’s efforts to fulfill the requirement of providing a MIRE FDE dataset beyond providing a plan in the July 1, 2017 Traffic Records Strategic Plan?
Crash Data
Which calendar years of fatal and serious injury crash data do you currently have available for reporting and analysis? (Source: HSIP Report)
From Calendar Year: _____________________
To Calendar Year: _______________________
Which word below best characterizes the availability of serious injury data for State maintained public roads? (Select one)
Good – complete data for 90% or more of the state-maintained system
Fair – complete data for 70% or more of the state-maintained system
Poor – complete data for less than 70% of the state-maintained system
If you answered Fair or Poor to the above, which of the following factors contribute to the lack of complete data? (Check all that apply.)
a. Poor/incomplete road network data for locating crashes
b. Poor/incomplete crash data attributes
c. Under-reporting of crashes
d. Other, please specify:
Which word below best characterizes the availability of serious injury data for non-state maintained public roads? (Select one)
Good – complete data for 90% or more of the non-state-maintained system
Fair – complete data for 70% or more of the non-state-maintained system
Poor – complete data for less than 70% of the non-state-maintained system
Boundary Data
Does your agency maintain spatial data on current approved Metropolitan Planning Area (MPA) and Urbanized Area (UZA) boundaries? (Select one)
No
Yes
Does your agency maintain historical spatial data on the MPA and UZA boundaries that were in effect in past calendar years? (Select one)
No – we just have current boundary information
Yes – we have data sets for boundaries in effect for prior years (please specify how far back your boundary data goes): ________________________________
Not applicable – boundaries haven’t changed
VMT Data
Are your state’s MPOs currently able to produce a VMT estimate for all public roads in their MPA? (Select one)
None of our MPOs can do this
All of our MPOs can do this
Some of our MPOs can do this (Which MPOs cannot currently do this?) __________________________________________________________
Using the descriptions I-VI below the table, what type of procedure(s) do you have to update each of your state-owned inventory data types? What is the time lapse between the “open to traffic” dates of a new roadway or roadway modification and when the revised data are included in each inventory file? (Complete all that apply.)
Inventory Type |
Update Procedure |
Typical Time Lapse |
Roadway Segment |
|
|
Traffic volume (MIRE FDE) |
|
|
Intersection |
|
|
Interchange |
|
|
Ramp |
|
|
Curve |
|
|
Grade |
|
|
Supplemental Data |
|
|
Other (please describe) |
|
|
Example procedures might include the following:
Ad hoc procedure – no standardized procedure, but changes to the file are made when they come to the attention of the file maintainer.
Annual (or less frequent) survey of all or part of the roadway system (e.g., the roadway system is re-inventoried over a five-year period).
On-going “as roadway is modified” process where descriptions or “as-built” plans are submitted to the file maintainer each time a change is made to the roadway or a new roadway segment is opened to traffic. The data for the affected section or location are then updated.
Updates vary by data type.
Other (please describe).
Example time lapses might include the following:
There is no systematic updating process, thus the time varies greatly.
More than one year.
6 – 12 months.
1-5 months.
1 month or less.
What are the constraints to updating data elements in a timely manner in terms of (a) collecting and/or (b) entering them in the databases (e.g., funding, staff, technology)?
Are there plans to improve the timeliness for adding new data elements to the roadway files? If so, please describe these plans.
Are there plans to regularly update the all-public-roads database so that it continues to meet the ARNOLD requirements? If so, please describe these plans.
Does project completion trigger a change in the corresponding data in the system where applicable?
Do you indicate the following items in your inventory files, and if so, how? (Check all that apply)
|
|
Does the State track the date of the physical change(s) to roadway locations in a database? (Select one)
|
|
Is information about the change history available in existing query tools and reports? If yes, are these reports available to external users? (Select one)
|
|
|
What is the average number of reporting days for fatal and serious injury crashes in your agency? (Average number of reporting days is defined as the number of days between when the crash occurred – as noted on the crash form – and when the crash information is entered into the electronic crash database.) (Select one)
Less than 30 days
30 – 89 days
90-179 days
180 days or more
What percentage of local jurisdictions in your State can provide crash reports for fatal and serious injury crashes within 3 months of crash occurrence? (Select one)
Over 90 percent
70 – 90 percent
Less than 70 percent
If the average elapsed time from a fatal or serious injury crash to database entry is 1 month or greater, is your agency planning any initiatives to improve the timeliness of crash data reporting? Please describe these plans and the target improvement that you hope to achieve: ____________________________________________________________________________________________________________________________________________________________
Do you have a systematic process where all or some part of your inventory data are re-measured or otherwise verified in the field (e.g., from surveys, video logs, aerial photos, etc.)? If so, please describe it. (If the answer differs for different roadway inventory types, please answer for each type.)
For the data that you already maintain (this may apply only to State-maintained roadways if that is what the State’s database currently contains), and based on past accuracy verifications, indicate which groups of data elements can be considered accurate for 90% of the records; if accuracy has not been verified, indicate NA.
Roadway inventory elements (approximate percent): ______
Traffic inventory elements (approximate percent): ______
Curve inventory elements (if present) (approximate percent): ______
Grade inventory elements (if present) (approximate percent): ______
Intersection inventory elements (if present) (approximate percent): ______
Interchange inventory elements (if present) (approximate percent): ______
Ramp inventory elements (if present) (approximate percent): ______
Other – please describe (approximate percent): ______
Do you have and continually use a series of computerized internal checks (beyond data type edits) to verify that data values are legitimate codes or in reasonable ranges and agree with values for related data variables (i.e., checks for logical agreement among data elements)?
If yes, please provide data from accuracy measurements (i.e., pass/fail ratios). If possible, please provide a copy of the edit check definitions; if that is not possible, please describe the extent of the accuracy checks by telling us what percentage of roadway data elements are subjected to these validation and logical agreement checks.
Do you monitor crash data accuracy using formally defined performance measurements? (Select one)
Yes
No
If yes, please provide crash data accuracy reports.
Do you perform validity and consistency checks in the crash data capture and data entry processes? (This includes electronic data collection as well as paper-based processes) (Select one)
Yes
No
Do you provide feedback to reporting agencies regarding the accuracy of the crash reports they submit? (Select one)
Yes
No
If yes, please state how often.
For all types of existing inventory data on State-system roadways (e.g., roadway segments, intersections, curves, etc.), are element definitions and coding consistent across all highway divisions/regions? If not, describe what differences exist.
For State-system and local roadways in the statewide roadway inventory system, are data element definitions consistent for all roadway locations (i.e., does the database documentation apply the same data definitions in all locales, or are different definitions used for local versus State-maintained roadways)? If not, describe the differences for each data type included (e.g., roadway segments, intersections, curves, etc.).
For any locally-maintained roadway inventory databases, do the local systems all adopt the same data element definitions as the State? If not, please describe the differences.
If your inventory data system contains multiple years of data, what procedures are in place to ensure that coding for each variable (or critical variables) is consistent across years? (if no procedures exist, answer “none”).
How are deliberate changes in data definitions (e.g., to meet new requirements) accommodated while balancing consistency across years? Are timestamps provided? Is documentation provided to users to inform them of the changes?
What is the specific system (e.g., Roads and Highways) used to ensure that the same “site address” (e.g., route milepost) in the crash location and roadway inventory files describes the same “site” across multiple years?
Do you provide feedback to reporting agencies regarding the uniformity of the crash reports they submit? (Select one)
Yes
No
How would you characterize the consistency of coding Serious Injury Data across jurisdictions? (Select one)
Good – consistent definitions and coding for 90% or more of the jurisdictions
Fair – consistent definitions and coding for 70% or more of the jurisdictions
Poor – consistent definitions and coding for less than 70% of the jurisdictions
Do your current criteria (as documented in your manuals/data dictionary) for identifying a serious injury match with the definition of a “Suspected Serious Injury” in the MMUCC 4th Edition? (Source: HSIP Report) (Select one)
Yes
No
Which of the following actions have you taken to ensure consistency with the MMUCC 4th edition definition of “Suspected Serious Injury”? (Check all that apply)
a. Included the verbatim definition from MMUCC in the Police Crash Report Form
b. Included the verbatim definition from MMUCC in the State Crash Report User Manual
c. Included the verbatim definition from MMUCC in the State Crash Database Data Dictionary
d. Made certain that the attributes of Suspected Serious Injury A1 are reportable and not included in other coded values of injury status.
Do you anticipate compliance with the MMUCC 4th edition definition of Suspected Serious Injury by April 15th, 2019? SI Data – Timeliness (Select one)
We are already in compliance
We are not yet in compliance but anticipate that we will be by April 15th, 2019
We do not anticipate being in compliance by April 15th, 2019 – our current estimated date for compliance is: ____________________________
Which of the following ways do you plan to implement serious injury designation in your crash database? SI Data-Uniformity (Select one)
We include (or plan to include) an Injury Status data element that utilizes the full set of codes specified in MMUCC version 4
We include (or plan to include) a data element that indicates only the “A-Suspected Serious Injury” coding (as opposed to the full set of Injury Status data codes in MMUCC version 4)
Other (Describe): _____________________________________________________
Does your State crash database allow you to identify injuries involving pedestrians or pedalcyclists as defined in ANSI D16.1-2007? (Select one)
Yes
No: If no, please explain what is needed to achieve consistency with ANSI D16.1-2007 (e.g. convert existing data, change data collection practices, etc.)
Have available MPA VMT estimates been verified to (1) make use of data reported in HPMS and (2) be consistent with the state’s HPMS data for the MPO’s UZA(s) and (if applicable) National Ambient Air Quality Standards (NAAQS) areas? MPO VMT Data – Uniformity (Select one)
None of the MPA VMT estimates have been verified to meet these criteria
All of the MPA VMT estimates have been verified to meet these criteria
Some of the MPA VMT estimates have been verified to meet these criteria
Are available MPA VMT estimates based on “a documented statistically valid procedure based on monitored traffic,” as described in the Traffic Monitoring Guide and other available guidance for air quality conformity analysis and HPMS? MPO VMT Data – Uniformity (Select one)
None of the MPA VMT estimates have been verified to meet this criterion
All of the MPA VMT estimates have been verified to meet this criterion
Some of the MPA VMT estimates have been verified to meet this criterion
For which of the following roadways are crash data available for analysis? (Check all that apply)
All State-maintained roadways.
All publicly-owned non-State-maintained roadways (includes county and local/municipality roads).
A portion of State-maintained roadways.
A portion of roadways owned by local government agencies.
A portion of roadways on federal lands.
A portion of roadways on Tribal lands.
Does the State have a formal road safety management process? (Select one)
|
|
How often does the State conduct a network screening? (Select one)
|
|
|
Please indicate in the table below what data analysis tools and resources you are using for network screening, countermeasure selection, and evaluation. (Check all that apply)
Data Analysis Tools |
DATA ANALYSIS PROCEDURES |
||
Network Screening |
Countermeasure Selection |
Evaluation |
|
Agile Assets Safety Analyst |
|
|
|
Highway Safety Manual |
|
|
|
AASHTOWare Safety Analyst |
|
|
|
Interactive Highway Safety Design Model* |
|
|
|
CMF Clearinghouse |
|
|
|
FHWA Systemic Safety Project Selection Tool |
|
|
|
HPMS/ARNOLD |
|
|
|
Numetric |
|
|
|
usRAP |
|
|
|
Other (please specify)
|
|
|
|
Please indicate in the table below which roadway characteristics are available for mainline road segments, and for which roadway types they are available.
*Indicate if the statistics are available for all, most, some, or none of the roadway types in the columns.
DATA TYPE |
ROADWAY TYPE |
|
State (All, Most, Some, None) |
Local/Federal/Tribal (All, Most, Some, None) |
|
Number of lanes per direction |
|
|
Location of access points (e.g., drives) |
|
|
Median type |
|
|
Median width |
|
|
Posted speed |
|
|
Area type (e.g., urban, suburban, rural) |
|
|
Adjacent land use |
|
|
Curvature (e.g., curve/tangent, degree of curve) |
|
|
Grade |
|
|
Traffic volume (MIRE FDE) |
|
|
ITS Devices |
|
|
Please indicate in the table below which roadway statistics are available for intersections, and for which roadway types they are available.
*Indicate if the intersection statistics are available for all, most, some, or none of the roadway types in the columns.
DATA TYPE |
ROADWAY TYPE |
|
State (All, Most, Some, None) |
Local/Federal/Tribal (All, Most, Some, None) |
|
Traffic control (e.g., signalized, two-way stop control, yield, etc.) |
|
|
Number of approaches (e.g., 3-legged or 4-legged) |
|
|
Cross-section by approach (e.g., number of through/turn lanes) |
|
|
Functional classification |
|
|
Area type |
|
|
Traffic volume (MIRE FDE) |
|
|
Turning movement counts |
|
|
Posted speed |
|
|
Location of access points (e.g., drives) |
|
|
Adjacent land use |
|
|
Median presence |
|
|
Observed crash rates |
|
|
Average crash rates |
|
|
Please indicate in the table below which roadway statistics are available for ramps and ramp terminals, and for which roadway types they are available.
*Indicate if the ramp and ramp terminal statistics are available for all, most, some, or none of the roadway types in the columns.
DATA TYPE |
ROADWAY TYPE |
|
State (All, Most, Some, None) |
Local/Federal/Tribal (All, Most, Some, None) |
|
Interchange type (e.g., diamond, cloverleaf, etc.) |
|
|
Traffic control at ramp terminal (e.g., signalized, two-way stop control, yield, etc.) |
|
|
Cross-section |
|
|
Functional classification |
|
|
Area type |
|
|
Adjacent land use |
|
|
Posted speed |
|
|
Traffic volume |
|
|
Curvature |
|
|
Grade |
|
|
Please indicate in the table below which roadway statistics are available for at-grade rail crossings, and for which roadway types they are available.
*Indicate if the at-grade rail crossing statistics are available for all, most, some, or none of the roadway types in the columns.
DATA TYPE |
ROADWAY TYPE |
|
State (All, Most, Some, None) |
Local/Federal/Tribal (All, Most, Some, None) |
|
Type of crossing (e.g., active or passive) |
|
|
Number of roadway lanes |
|
|
Number of tracks |
|
|
Functional classification of roadway |
|
|
Area type |
|
|
Adjacent land use |
|
|
Traffic volume on roadway |
|
|
Number of trains per day |
|
|
Roadway curvature |
|
|
Roadway grade |
|
|
Which of the following data are available on all public roads tied to individual road segments or intersections for network screening? (Check all that apply)
|
|
|
|
|
|
|
|
Which of the following best describes the network screening process? (Select one)
|
|
|
|
Which stakeholders provide input during the network screening process? (Check all that apply)
|
|
|
|
|
|
Which of the following “networks” can be incorporated into a statewide query by linking crash data and having sufficient data on site characteristics to compare to other sites statewide?
(Check all that apply)
|
|
|
|
|
|
|
|
|
|
|
Are you able to generate performance metrics for peer groups for comparison (e.g., average crash rates for certain intersection types)? (Select one)
No.
Yes. Please explain.
Which of the following performance measures are used in the network screening process? (Check all that apply)
|
|
|
|
|
|
|
|
|
|
|
|
Descriptions of these terms are available in the AASHTO Highway Safety Manual, Part 2, Chapter 4 as well as on page 60 (2-20) of the FHWA HSIP Manual, http://safety.fhwa.dot.gov/hsip/resources/fhwasa09029/fhwasa09029.pdf.
|
Considering the performance measures selected in the previous question, what percentage of roadways does your network screening analysis have the ability to cover? Please complete the table below for each roadway type.
Roadway |
Covered (All, Most, Some, None) |
State-maintained |
|
Publicly-owned non-State-maintained1 |
|
1 Includes county, federal, Tribal, and local municipality roads. |
|
|
How extensively is your State using analytical tools for network screening? This is not limited to advanced tools such as AASHTOWare Safety Analyst but may also include any analytical tools that conduct screening, including any developed by the State. (Select one)
Not using them at all.
Using them partially for some State-owned roadways.
Using them for all State roads.
Using them for all State and non-State local roads.
Please indicate below which crash statistics can be generated for a specific site or corridor and for which roadway types they are available by filling “all” (every location), “some”, or “none” (not available for any location) in each box.
DATA TYPE |
ROADWAY TYPE |
|
State-maintained roadways |
Publicly-owned non-State-maintained roadways |
|
Crash type |
|
|
Crash severity |
|
|
Time of crash |
|
|
Day of crash |
|
|
Date of crash |
|
|
Road condition (dry, wet, etc.) |
|
|
Lighting condition (light, dark-lit, etc.) |
|
|
Weather condition (clear, rain, snow) |
|
|
Contributing factors |
|
|
Driver impairment |
|
|
Driver age |
|
|
Pedestrian impairment |
|
|
Pedestrian age |
|
|
Bicyclist impairment |
|
|
Bicyclist age |
|
|
Motorcyclist impairment |
|
|
Motorcyclist age |
|
|
Unique location identifier |
|
|
Object hit |
|
|
Direction of travel |
|
|
Specific location of crash (e.g., within an intersection, on east approach, off the right roadside, etc.) |
|
|
Does the State have the ability to automatically generate a collision diagram? (Select one)
|
|
Does the State have the ability to automatically generate a condition diagram? (Select one)
|
|
Is your State using analytical tools for diagnosis? (Select one; if yes, please describe)
No.
Yes, partially for some State-owned roadways.
Yes, for all State roads.
Yes, for all State and non-State local roads.
Is a test of proportions used for overrepresented crashes (e.g., comparing an agency’s proportion of serious crashes to the statewide proportion)? (Select one)
No.
Yes.
There are no further questions under this section. Relevant information for this section can be obtained from responses to questions in the “Background” section.
Does the State maintain a central database of records for roadway safety improvement projects implemented on State roads with locations, dates, and what improvements were implemented? (Select one)
|
|
|
|
What types of information are available for completed safety improvement projects? (Check all that apply)
|
|
|
|
|
|
|
|
|
How long is the information kept and what form (single database, spreadsheet, paper records, etc.) is the information in?
Retention period:____________
Form:
Does the State have the ability to link crash data to the safety improvement project site(s) of interest? If so, how many years of historical crash data are available? (Select one)
|
|
Does the State have the ability to link annual traffic data (ADT or AADT) to the safety improvement project site(s) of interest? If so, how many years of reliable historical traffic volume data are available? (Select one)
|
|
Are crash data and annual traffic data (ADT or AADT) available for the same set of years? (Select one)
|
|
Does the State have the ability to define specific reference or comparison groups (e.g., rural, four-legged, signalized intersections) with associated traffic volume and crash data to be used as non-treatment sites such as an inventory of all other similar sites in the State that did not receive the safety treatment? (Select one)
|
|
Does the State have the ability to identify non-treatment sites for any/all portions of the network? (Select one)
|
|
Does the State have a method of knowing which sites have had a project within a specific timeframe? (Select one)
No.
Yes. Specify:
Does the State have the ability to summarize crash/performance statistics based on various criteria defining the comparison site? (Select one)
No.
Yes. Specify:
What tools and analytic methods does the State use for before-after studies (e.g., the EB method)?
Is the information described in the questions within this element 2D being used to impact decision making? (Select one)
No.
Yes, Specify:
What is the internal process for obtaining roadway inventory data? (Select one)
|
|
What is the external process for obtaining roadway inventory data? (Select one)
|
|
Does the State have visualization tools (e.g., internal GIS tool or public GIS portal) for geospatial roadway inventory data? (Select one)
|
|
Which of the following safety partners have (or may be allowed with agency approval) direct access (i.e., via internal computer network or internet) to inventory data? (Check all that apply)
|
|
|
|
|
|
Are there different levels of access to inventory data for these safety partners?
For those not checked in Question 4 (i.e., those that do not have direct access), which of the following safety partners can request and receive access to or information from the State roadway inventory database? (Check all that apply)
|
|
|
|
|
|
Are there different levels of access to inventory data for these safety partners?
Does the State have a defined timeline for filling data requests? If so, which of the following best describes the State’s policy for filling data requests? (Select one)
|
|
|
|
|
|
Does the State measure users’ satisfaction with data accessibility (e-mail, online satisfaction survey, etc.)? (Select one)
|
|
Is all necessary documentation available to facilitate data access (e.g., data dictionaries and information about data quality processes)? (Select one)
|
|
Has a communication plan (informal or formal) been established to make internal and external partners aware of what data DOT is collecting and what additional data and services DOT would be able to provide as data efforts expand? (Select one)
|
|
Do you make available subsets of crash data for particular MPAs? (Select one)
Yes, to internal agency users of our data systems
Yes, to both internal agency users and our MPO partners
We have the capability to produce this data on request but have not yet done so
We do not currently have the capability to produce this data
Which of the following systems operate using relational database management or an LRS? (Check all that apply)
|
Relational Database Management |
LRS |
Roadway Inventory Data. |
|
|
Traffic Data. |
|
|
Crash Data. |
|
|
Citation / Adjudication Data. |
|
|
Injury Data. |
|
|
Driver Data. |
|
|
Vehicle Data. |
|
|
ITS Data. |
|
|
Which of the following systems are currently being upgraded, or are planned to be created or upgraded in the next two years, to use relational database management or an LRS? (Check all that apply.)
|
Relational Database Management |
LRS |
Roadway Inventory Data. |
|
|
Traffic Data. |
|
|
Crash Data. |
|
|
Citation / Adjudication Data. |
|
|
Injury Data. |
|
|
Driver Data. |
|
|
Vehicle Data. |
|
|
ITS Data. |
|
|
For coordinating data needs among internal agencies, which of the following numbers best describes your efforts? (Select one)
1 |
Agency-Wide: Most data collection efforts in the agency are independent—there has been little or no effort to coordinate across business units. The agency does not have information about the extent of data duplication. Program Specific: There have been no efforts to coordinate data collection or management activities with other business units. |
2 |
Agency-Wide: The agency has assessed the extent to which there is duplication across data sets within the agency. Opportunities for coordinating data collection and management across business units (e.g., safety and asset management) are periodically discussed, but limited progress has been made. Program Specific: Opportunities for coordinating data collection and/or management activities with other business units have been discussed, but no action has been taken. |
3 |
Agency-Wide: The agency has implemented a data collection effort involving coordination of more than one business unit (e.g., use of video imagery from pavement data collection to extract data on other assets). The agency has defined metrics to track improvements in data collection and storage efficiency. Program Specific: A specific opportunity for coordinated data collection has been identified and is being pursued. |
4 |
Agency-Wide: Agency business data owners are encouraged and incentivized to share their data with a broader audience within the agency (where appropriate). Agency business data owners are encouraged and incentivized to plan new data collection initiatives in partnerships with other business units where information needs of multiple units can be simultaneously addressed. The agency monitors progress of efforts to reduce data duplication. Program Specific: Data collection is routinely coordinated with one or more other business units. |
5 |
Agency-Wide: The agency periodically reviews its data collection programs to identify opportunities to leverage new technologies and externally available data sources. The agency regularly seeks opportunities to minimize or reduce redundancy in data collection, storage, and processing. Program Specific: New internal agency partnerships on data collection and management are actively sought to achieve economies of scale and make best use of limited staff and budget. |
Reference: NCHRP 814, Element 4 Data Collaboration
For coordination of safety data needs among external agencies that share an interest in or need for common sets of data, which of the following numbers best describes your efforts? (Select one)
1 |
Agency-Wide: Individual business units obtain and use publicly available data from external entities as needs and opportunities arise. The agency has acquired single "point-in-time" data sets from external entities. Program Specific: Publicly available data from external entities is obtained and used as needs and opportunities arise. |
2 |
Agency-Wide: The agency is exploring partnerships with other public- and private-sector organizations to share data on an ongoing basis. Program Specific: Partnerships with other public- and private-sector organizations are being explored to share data on an ongoing basis. |
3 |
Agency-Wide: The agency has data-sharing agreements with external entities. The agency provides "self-serve" access to data sets of value to external users. Program Specific: Data-sharing agreements are in place with external entities. "Self-serve" access is provided to data sets of value to external users. |
4 |
Agency-Wide: The agency has sustained partnerships with external entities involving regular update cycles. Program Specific: Data-sharing agreements with external entities have been sustained over time (2+ years) and through multiple data update cycles. |
5 |
Agency-Wide: The agency routinely seeks new opportunities for data partnerships with external entities. They have designated staff liaison responsibilities for managing the external partnerships. Program Specific: New opportunities for data partnerships with external entities are actively sought. Staff liaison responsibilities for managing these external partnerships have been designated. |
Reference: NCHRP 814, Element 4 Data Collaboration
Please describe the effectiveness of the process to coordinate safety data needs among agencies. What improvements (if any) do you think are needed?
Using the Roles and Responsibilities matrix below, indicate the level that best describes your data management organizational framework at 1) the State DOT and 2) a multi-agency group: (Select one for each column)
Describe State DOT program(s) assessed in this question:
Describe Multi-agency group(s) assessed in this question:
|
|
State DOT |
Multi-agency group |
1 |
Agency-Wide and Program Specific: Accountability for the quality, value, and appropriate use of data has not been clearly established. |
|
|
2 |
Agency-Wide: One or more individuals have been identified to lead agency-wide data governance activities. A business lead or point person has been designated for each major data set or application but the responsibilities of the role have not been spelled out. Program Specific: A business lead or point person has been designated for each major data set or application but the responsibilities of their role have not been spelled out. |
|
|
3 |
Agency-Wide: An agency-wide data governance body has been established with representation from IT and business functions and has defined its charter. Objectives and performance metrics for data governance and stewardship have been defined and documented. Role(s) have been designated to identify points of accountability for data quality, value, and appropriate use—for priority data programs or data subject categories. Decision-making authority has been defined for collection/acquisition of new data, discontinuation of current data collection, and significant changes to the content of existing data. Capabilities and skills for data management are included in staff position descriptions, agency recruiting, and staff development efforts. Program Specific: Role(s) have been designated to identify points of accountability for data quality, value, and appropriate use—for priority data programs or data subject categories. Decision-making authority has been defined for collection/acquisition of new data, discontinuation of current data collection, and significant changes to the content of existing data. Capabilities and skills for data management are included in staff position descriptions, agency recruiting, and staff development efforts. |
|
|
4 |
Agency-Wide: An agency-wide data governance body is active and achieving results recognized as valuable. The agency is successfully identifying and resolving situations where individual business unit interests are in conflict with agency-wide interests related to data collection and management. Staff with responsibility for data stewardship and management have sufficient time and training to carry out these responsibilities. Staff with responsibility for data stewardship and management play an active role in defining data improvements and periodically produce reports of progress to their managers. Program Specific: Staff with responsibility for data stewardship and management have sufficient time and training to carry out these responsibilities. Staff with responsibility for data stewardship and management play an active role in defining data improvements and periodically produce reports of progress to their managers. |
|
|
5 |
Agency-Wide: A charter for agency-wide data governance body is reviewed periodically and updated based on experience. Stewardship roles are periodically reviewed and refined to reflect new or changing data requirements and implementation of new data systems. Staff with responsibility for data stewardship and management are coordinating with their peers in the agency and with external data partners to deliver best value for resources invested. Data management-related metrics are routinely considered in employee performance reviews. Program Specific: Stewardship roles are periodically reviewed and refined to reflect new or changing data requirements and implementation of new data systems. Staff with responsibility for data stewardship and management are coordinating with their peers in the agency and with external data partners to deliver best value for resources invested. Data management-related metrics are routinely considered in employee performance reviews. |
|
|
Reference: NCHRP 814, Element 1 Data Strategy and Governance
Is there awareness at the executive level at all agencies of the need for an institutional arrangement or organizational structure to support data governance? (Select one)
|
|
Does the data governance body include safety-related data representatives? (Select one)
|
|
|
|
Is there a data committee within the DOT that would report to (or coordinate with) the data governance board on safety data? (Select one)
|
|
|
|
|
Does the State TRCC serve as the data governance committee? (Select one)
|
|
What is the role/responsibility of the TRCC in the State with respect to roadway inventory, traffic volume, and crash data? What is its role/responsibility with respect to other safety data sources? In other words, is the TRCC responsible for the data sets as a committee or does the TRCC defer to specific agencies to manage the data sets?
How often does the TRCC meet?
Which data subjects are represented on the executive leadership level (if any) and the working level of the TRCC? (Check all that apply)
|
|
|
|
|
|
|
|
|
Have action (or implementation) plans been developed from the Strategic Plan for Traffic Records or any other efforts such as the Crash Data Improvement Program, Roadway Data Improvement Program, NCHRP 814? Please describe.
Within any of the safety-data plans, is there a clear strategic vision articulated as an organizing principle for all public roads safety data projects/programs covering crash, roadway, and traffic data? (Select one)
|
|
|
|
If yes, please specify the plan (or plans) and provide the relevant vision statement(s)
Is there a formal program established to evaluate data sets and identify which data elements are critical for safety analysis? (Select one)
No.
Yes, describe:
Is there a formal program to evaluate data systems and applications for inclusion in the safety data program? (Select one)
No.
Yes, describe:
Data ownership/stewardship within our organization is assigned by: (Check all that apply)
a. An enterprise data steward or individual who assigns all responsibilities across our organization |
b. A data management committee that assigns responsibilities across the organization |
c. Individuals within the organization who have responsibility for stewardship as part of their everyday work and who then inform managers of the decisions and policies that have been put in place |
d. Management or a committee that makes a final decision as to what policies are put in place after individuals within the organization make recommendations |
e. Other (please specify)
|
How active are data stewards in the State DOT related to safety data management? (Select one)
|
|
|
|
How active are the majority of data stewards at agencies outside of the State DOT related to safety data management? (Select one)
|
|
|
|
Are GIS, IT, and/or business intelligence personnel included in the data governance working groups to address data integration and spatial data components? (Select one)
No.
Yes.
Specify:
Does the DOT share standards with local agencies to facilitate data integration from local agencies into DOT systems? (Select one)
No.
Yes.
Is there a program or process at the State level to improve the management of safety data? Which number below best describes your program or process? (Select one)
1 |
No formal policies and procedures have been defined. |
2 |
Executive leadership has endorsed basic data principles. |
3 |
The scope of data governance has been established. Data classifications have been defined based on importance or need for cross business unit integration. A limited set of data management policies have been adopted for priority data categories. There is a documented procedure and decision-making process for requesting and evaluating new data collection or acquisition requests. |
4 |
A comprehensive set of data management policies has been adopted based on collaboration, including IT, business units, and records management. Processes are in place to monitor and enforce compliance with policies. There is a documented and implemented procedure for requesting and evaluating new data collection or acquisition requests (i.e., the documented procedure is routinely followed). |
5 |
Policies are regularly reviewed and updated based on factors such as awareness/reach, effectiveness, and cost burden. |
NCHRP 814: Element 1 Data Strategy and Governance 1.3
What is the current commitment and strategic planning initiative to maximize the value of data to meet agency goals? Choose the number that best describes this commitment. (Select one)
1 |
State-wide: Data collection and management is performed by individual business units with little or no direction or coordination. Data improvements are not systematically or regularly identified—they are implemented reactively or opportunistically. Program Specific: Data improvements are not systematically or regularly identified—they are implemented on a reactive or opportunistic basis. |
2 |
State-wide: Efforts to implement data governance or assess data needs are being discussed or planned. Data improvement needs are identified and communicated to management informally. Program Specific: Data improvement needs are identified and communicated to management informally. |
3 |
State-wide: Executive leadership has communicated the expectation that business units and IT functions should collaborate on identifying and implementing data improvements of benefit. Data business plans or equivalent planning tools have been prepared to identify short- and longer-term data collection and management strategies that align with business objectives. Data improvement needs have been systematically reviewed, assessed, and documented. Program Specific: Data business plans or equivalent planning tools have been prepared to identify short- and longer-term data collection and management strategies that align with business objectives. Data improvement needs have been systematically reviewed, assessed, and documented. |
4 |
State-wide: Leadership regularly communicates and demonstrates active support for data improvements that will lead to improved effectiveness and efficiency. Leadership actively works to facilitate collaboration across business units on data improvements and maintain strong partnerships between IT and business unit managers. Data business plans or equivalent planning tools are regularly updated. A regular process of data needs assessment is in place and is used to drive budgeting decisions. Program Specific: Data business plans or equivalent planning tools are regularly updated. A regular process of data needs assessment is in place and is used to drive budgeting decisions. |
5 |
Data governance and planning activities are continually refined to focus on key risks and opportunities and eliminate activities without demonstrated payoff. Data governance and planning activities have a high probability of continuing through changes in executive leadership. |
NCHRP 814: Element 1 Data Strategy and Governance 1.1
Is there a data catalog with data definitions, standards, policies, and procedures for the collection and use of data available electronically in the organization and is it accessible to users? (Select one)
|
|
Has a data business plan or equivalent formal plan been developed to guide management of strategic safety data programs? (Select one)
|
|
Are safety data stakeholders consulted when the annual data collection plan is being updated? (Select one)
No.
Yes.
N/A: no such plan exists.
Is there coordination between the data governance group, data stewards, and other stakeholders, as needed? (Select one)
No.
Yes.
Is ownership/stewardship within our organization tied to the data based on the type of data being managed? (Select one)
No.
Yes.
Where one is designated, the data steward (or data owner) has control over: (check all that apply)
a. Master Data |
b. Transactional Data |
c. Reference Data |
d. Metadata |
e. Historical data |
f. Temporal data |
g. Other (please specify) |
Which data systems have one or more standard business rules? (Check all that apply)
a. Roadway Inventory Data. |
b. Traffic Data. |
c. Crash Data. |
d. Citation / Adjudication Data. |
e. Injury Data. |
f. Driver Data. |
g. Vehicle Data. |
h. Other (please specify). ______ |
Which of the following types of data have one or more coded business rules that can be automatically applied to check data quality, restrict data access, and perform consistent calculations and transformations? (Check all that apply) (An illustrative sketch of a coded rule follows this list.)
a. Roadway Inventory Data. |
b. Traffic Data. |
c. Crash Data. |
d. Citation / Adjudication Data. |
e. Injury Data. |
f. Driver Data. |
g. Vehicle Data. |
h. ITS Data. |
i. Other (please specify). ______ |
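To make "coded business rules" concrete: a coded rule is a machine-executable check applied to every record as it is loaded or edited, rather than a rule that exists only in documentation. A minimal sketch, with hypothetical field names and threshold values:

```python
# Sketch of coded business rules applied automatically at load time.
# Each rule pairs a message with a predicate; records failing any rule
# are flagged for review rather than silently accepted.
# Field names and thresholds are hypothetical.

RULES = [
    ("AADT must be a non-negative integer",
     lambda r: isinstance(r.get("aadt"), int) and r["aadt"] >= 0),
    ("Number of through lanes must be between 1 and 12",
     lambda r: 1 <= r.get("through_lanes", 0) <= 12),
    ("Segment length must be positive",
     lambda r: r.get("segment_length", 0) > 0),
]

def check(record):
    """Return the list of rule violations for one inventory record."""
    return [msg for msg, ok in RULES if not ok(record)]

record = {"aadt": -40, "through_lanes": 2, "segment_length": 0.8}
print(check(record))  # -> ['AADT must be a non-negative integer']
```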
Which standards, documentation, and communication protocols are applied to the roadway inventory data? (Check all that apply)
a. Data Definitions. |
b. Data file structures or database schemas. |
c. Formats used for data exchange. |
d. Frequency of publication of data updates. |
e. Processes to secure the transmission of confidential data and information. |
Which number best describes your maintenance cycle for processing and providing access to roadway data? (Select one)
Describe program(s) assessed in this question:
1 |
Agency-Wide and Program Specific: Data updating cycles and business rules for data updates have not been defined. |
2 |
Agency-Wide and Program Specific: Updating cycles have been established but have not been documented. |
3 |
Agency-Wide and Program Specific: Updating cycles have been documented. Business rules have been defined for how key data entities are added, updated, and deleted. |
4 |
Agency-Wide and Program Specific: Updating cycles are being consistently followed. Business rules for data updating are embedded in and enforced by applications (where applicable). |
5 |
Agency-Wide and Program Specific: Data updating methods are periodically reviewed to identify opportunities for improved efficiencies. |
NCHRP 814 Element 2: Data Life Cycle Management 2.1
Which number in the list below best describes guidelines and procedures for protecting data assets? (Select one)
Describe program(s) assessed in this question:
1 |
Agency-Wide: There may be important data sets managed using desktop applications within individual business units, but these have not been systematically identified. Each business unit is responsible for ensuring that its data sets are backed up and periodically archived to enable future retrieval and use. Program Specific: Backups of data sets are made ad hoc. |
2 |
Agency-Wide: Several of the agency's important data sets are managed using desktop applications (e.g., spreadsheets) but plans are in process to bring these into enterprise databases. Data owners receive informal (unwritten) guidance regarding frequency and storage locations for backups and archive copies. Program Specific: Backups of data sets are made regularly, but there are no written procedures on backup frequency or storage locations. Archive copies of data sets exist, but there are no written procedures on how to create these and how to retrieve them. |
3 |
Agency-Wide: Most of the agency's important data sets are managed within enterprise databases and regular backups are made. Data owners have written guidance on the frequency and storage locations for backups and archive copies. Program Specific: Backups of data sets are made regularly following written procedures on backup frequency and storage locations. Archive copies of data sets exist, with written procedures on how to create and retrieve them. |
4 |
Agency-Wide: All of the agency's important data sets are managed within enterprise databases (e.g., Oracle, SQL Server) and regular backups are made. Backup procedures are consistently followed. Archiving procedures are consistently followed. Backup procedures have been fully tested. Archiving procedures have been fully tested. Program Specific: Backup procedures are consistently followed. Archiving procedures are consistently followed. Backup procedures have been fully tested. Archiving procedures have been fully tested. |
5 |
Agency-Wide and Program Specific: Data managers and stewards periodically review existing data backup and archiving procedures and update them as appropriate to reflect user feedback or changing needs.
|
NCHRP 814 Element 2: Data Life Cycle Management 2.4
Which number in the list below best describes management of geospatial data? (Select one)
1 |
The agency does not provide enterprise-wide planning and support for management and integration of geospatial data. Management of geospatial data is not integrated with other agency data management and IT functions. |
2 |
The agency has designated responsibilities for enterprise-wide planning and support for managing geospatial data. The agency manages a collection of spatial data sets and makes them available for internal use. |
3 |
The agency has written policies and standards defining how geospatial data is to be collected, stored, managed, shared, and integrated with non-spatial data attributes. The agency considers spatial data in their IT strategic plan (or equivalent) that identifies investment needs and priorities for hardware, software, and data. The agency has identified data entities that should have standard location referencing. |
4 |
The agency has a well-understood and functioning process for collecting, adding, and updating geospatial data sets. The agency has a standard approach to assigning spatial location to key data entities (e.g., construction projects and assets). Training and support is provided to ensure adherence to adopted policies and standards for geospatial data collection and management and to build skills in spatial data analysis. (A geometric sketch of location assignment follows this list.) |
5 |
Spatial data collection, management, and visualization requirements are fully integrated within the agency's IT and data management planning and operational functions. The agency periodically reevaluates and updates its approach to geospatial data management to reflect changes in technology, data availability and cost, and user requirements. |
NCHRP 814 Element 3: Data Architecture and Integration 3.2
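The "standard approach to assigning spatial location" in level 4 often reduces to snapping a coordinate onto a route's geometry and reading off a linear measure. The sketch below shows only the underlying geometry for one straight segment in planar coordinates; a production system would use a GIS library, projected coordinates, and real route geometry, and all names here are assumptions:

```python
# Sketch of assigning a linear measure to a point: project the point
# onto a (straight, planar) route segment and return the distance along
# it. Real systems use projected coordinates and a GIS library.
import math

def measure_along(ax, ay, bx, by, px, py):
    """Distance from (ax, ay) to the projection of (px, py) onto
    segment A-B, clamped to the segment's extent."""
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return 0.0                     # degenerate segment
    t = ((px - ax) * dx + (py - ay) * dy) / seg_len2
    t = max(0.0, min(1.0, t))          # clamp to the segment
    return t * math.sqrt(seg_len2)     # measure along the segment

# A 1000 m due-east segment; the point sits 300 m along and 20 m off-line.
print(round(measure_along(0, 0, 1000, 0, 300, 20), 1))  # -> 300.0
```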
Describe procedures and policies in place to manage sensitive data:
Does the agency use any software tools to calculate safety performance measures and compare to safety performance metrics/targets? (Select one)
|
|
Do both internal (agency) and external safety stakeholders share access to the same safety performance data and displays (maps, etc.)? (Select one)
|
|
|
|
Which of the following tools is the agency most interested in developing (or expanding) to improve safety data management and analysis? Please do not check existing tools that are satisfactory as is. (Check all that apply)
a. Enterprise geospatially enabled data warehouse. |
b. Business intelligence (including GIS). |
c. Data repositories. |
d. Data dictionaries. |
e. Data cleansing / Data standardization. |
f. None of the above. |
What role does the Information Technology (IT) staff/department play in selecting or improving the various tools for safety data management and analysis? (Select one)
|
|
For which of the following do you find it challenging to manage and track change? (check all that apply)
a. Data security |
b. Data models |
c. Data definitions |
d. Data structures |
e. Data transfers |
f. The structure of metadata repositories |
g. Type of metadata included in a metadata repository |
h. Stewardship responsibilities |
i. None of the above |
Which best describes your State’s ability to support an analysis associating roadway attributes (geometrics) with crash frequency? (Select one)
|
|
|
|
|
Which best describes your State’s ability to support analysis linking crash types to roadway features (e.g., identifying locations with a propensity for rollovers, run-off-road crashes, or other) and then analyzing the network for similar locations based on similarity of roadway attributes? (Select one)
|
|
|
|
|
|
Which best describes your State’s ability to support an analysis of the consequences of crashes from a health perspective, incorporating information on roadway locations, types of roadways/attributes, and medical treatments (EMS, trauma care, in-patient billing, etc.)? (Select one)
|
|
|
|
|
|
Which best describes your State’s ability to support analysis linking the safety-related enforcement activities (citations/arrests) and crashes with roadway features/attributes or locations by type? (Select one)
|
|
|
|
|
|
|
Which best describes your State’s ability to support analysis linking driver characteristics and safety/conviction history with roadway features/attributes or locations by type? (Select one) Please explain:
|
|
|
|
|
|
Are there efforts to establish relationships with local agencies to determine what data are currently in existence? (Select one)
Yes
No
Which best describes your State’s current level of roadway data integration? (Select one)
|
|
|
|
|
Which best describes your State’s interactions with local or regional jurisdictions in the State with respect to roadway inventory data? (Select one)
|
|
|
|
|
|
What would it take to add a new data element (e.g., a new roadway inventory attribute not previously included in the database) across State and local roadway data systems?
Which best describes your State’s ability to modify the data structure of statewide databases maintained by the DOT? (Select one)
|
|
|
|
|
We have business rules for how to request and implement changes. (Select one)
Yes
No
Which best describes your State’s current ability to conduct spatial analysis? (Select one)
|
|
|
|
Which best describes your State’s ability to conduct spatial analysis one year from now? (Select one)
|
|
|
|
Which best describes your data systems’ abilities to support state-of-the-art analyses as described in the Highway Safety Manual and using tools listed here (https://www.fhwa.dot.gov/innovation/everydaycounts/edc_4/ddsa_resources/)? (Select one)
|
|
|
|
|
How are locations coded for State-maintained and local roadways in the roadway inventory? (Select one)
|
|
|
|
|
|
What percentage of all crash report locations are assigned a valid location code (after all automated and manual processes) that is identical to the location code used in the roadway inventory file? (Select one) (A computation sketch follows these questions.)
|
|
|
|
|
What percentage of crash location codes are assigned automatically? (Select one)
|
|
|
|
|
What percentage of automatically assigned codes require manual location adjustment due to errors? (Select one)
|
|
|
|
|
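The three percentage questions above correspond to rates that can be computed directly by comparing the crash file against the roadway inventory file. A minimal sketch, with hypothetical field names and illustration values:

```python
# Sketch of the crash-location match rates asked about above: the share
# of crash records whose assigned location code exists in the roadway
# inventory, and the share coded automatically. All values are
# hypothetical.

inventory_codes = {"SR-100:2.10", "SR-100:2.20", "US-9:14.80"}

crashes = [
    {"id": 1, "location_code": "SR-100:2.10", "coded_automatically": True},
    {"id": 2, "location_code": "US-9:14.80",  "coded_automatically": False},
    {"id": 3, "location_code": None,          "coded_automatically": False},
]

located = [c for c in crashes if c["location_code"] in inventory_codes]
auto = [c for c in crashes if c["coded_automatically"]]

print(f"valid location rate: {len(located) / len(crashes):.0%}")  # -> 67%
print(f"auto-coded rate:     {len(auto) / len(crashes):.0%}")     # -> 33%
```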
Has a formal data collection plan been developed that includes prioritization of the data elements required for enhanced safety analysis? (Select one)
Note: these data elements may include sign inventory, signal inventory, lighting, guardrail, speed limit, and driveway density.
Yes
No
Are you able to produce extracts of crash data for a particular MPA or UZA as of a specific year in the past? (Select one)
Yes
No
Partially (please explain:)_____________________________________________
Which of the following best describes your agency’s current capability for establishing evidence-based, data driven safety performance targets? (Select one):
An approach to establishing evidence-based safety performance targets is currently under development.
We have developed an approach involving review of baseline and trend data, examination of external factors that may influence future target achievement, and feasibility analysis – but have not yet applied this approach.
We have applied our evidence-based approach to establish targets and have gained internal agency approval for these targets. We have reviewed our targets against actual performance achieved and can evaluate factors contributing to target achievement (or lack thereof).
Our evidence-based approach has been successfully applied for multiple cycles of target setting and is periodically reviewed and refined to reflect influencing factors and risks.
Which of the following analysis capabilities do you currently have or plan to develop in support of performance target setting (check all that apply):
|
Current Capability |
Under Development |
Planned Future Development |
a. Capability to graph trends in 5-year rolling averages for the five national safety performance measures (number of fatalities, fatality rate, number of serious injuries, serious injury rate, number of non-motorized fatalities and serious injuries) (a computation sketch follows this table) |
|
|
|
b. Capability to analyze trends to diagnose causal factors behind the trends |
|
|
|
c. Capability to access and review pertinent data on external factors likely to impact future safety performance, including but not limited to: socioeconomic data (population, demographics, jobs, etc.), VMT, revenues. |
|
|
|
d. Capability to predict the impact of planned and programmed HSIP projects on future safety performance. |
|
|
|
e. Capability to predict the impact of planned and all programmed STIP and/or TIP projects (other than those in the HSIP) on future safety performance. |
|
|
|
f. Basic scenario analysis capability – ability to estimate future safety performance for different sets of projects |
|
|
|
g. Advanced scenario analysis capability – ability to estimate future safety performance for different sets of projects and program elements plus varying assumptions about external factors |
|
|
|
h. Other (Describe):
|
|
|
|
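For item (a), the computation itself is a simple five-year moving mean of each annual measure. A minimal sketch (the fatality counts are made-up illustration values):

```python
# Sketch of the 5-year rolling average used for the national safety
# performance measures. The annual counts below are made-up values.

fatalities = {2010: 812, 2011: 795, 2012: 830, 2013: 801, 2014: 788,
              2015: 774, 2016: 809}

def rolling_5yr(series, end_year):
    """Mean of the five annual values ending in end_year."""
    return sum(series[y] for y in range(end_year - 4, end_year + 1)) / 5

for year in (2014, 2015, 2016):
    print(year, round(rolling_5yr(fatalities, year), 1))
# 2014 805.2, 2015 797.6, 2016 800.4
```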
Which of the following practices do you currently employ or plan to implement to strengthen performance-based planning and programming decisions? (Check all that apply)
|
Current Practice |
Implementation in Progress |
Planned Future Practice |
a. Use estimates of projected impacts on safety performance measures to prioritize/select HSIP projects |
|
|
|
b. Use estimates of projected impacts on safety performance measures to prioritize/select HSIP elements |
|
|
|
c. Use estimates of projected impacts on safety performance measures to prioritize/select STIP and/or TIP projects |
|
|
|
d. Use scenario analysis to set targets and allocate available funds across different program areas based on estimated impacts of varying investment levels on safety performance |
|
|
|
e. Use results of Before/After analysis to improve future projections of safety performance impacts for candidate projects (a simplified sketch follows this table) |
|
|
|
f. Involve safety stakeholders representing the 4E’s (education, enforcement, engineering and emergency response) in evaluation of the LRTP and STIP and/or TIP |
|
|
|
g. Other (Describe):
|
|
|
|
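For item (e), a defensible before/after evaluation uses the Empirical Bayes (EB) expected crash frequency so that regression to the mean does not inflate the estimated benefit. The sketch below follows the general Highway Safety Manual approach in simplified form; the SPF predictions and overdispersion parameter are made-up illustration inputs, not calibrated values:

```python
# Simplified Empirical Bayes before/after sketch. All inputs are
# made-up illustration values, not calibrated results.

observed_before = 14.0       # crashes observed in the before period
spf_predicted_before = 9.0   # SPF-predicted crashes, before period
overdispersion_k = 0.4       # SPF overdispersion parameter

# EB weight and expected crashes in the before period.
w = 1.0 / (1.0 + overdispersion_k * spf_predicted_before)
expected_before = w * spf_predicted_before + (1.0 - w) * observed_before

# Scale to the after period by the ratio of SPF predictions (this
# captures traffic growth), then compare with what was observed.
spf_predicted_after = 10.0
expected_after_no_treatment = expected_before * (
    spf_predicted_after / spf_predicted_before)
observed_after = 8.0

effectiveness = observed_after / expected_after_no_treatment
print(f"estimated CMF ~ {effectiveness:.2f}")  # below 1.0 => crash reduction
```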
How far along is your State on implementing a coordinated approach for safety performance measure target setting and reporting? (Select one)
The need for coordination across agencies with respect to target setting and reporting is not yet well understood; discussions on coordination have not yet been initiated.
Discussions among partner agencies on coordination of targets and reporting are underway.
Basic decisions have been made about whether MPOs will set separate targets, and whether there will be separate targets for urban versus rural areas. There is understanding among partner agencies on what type of coordination is needed, and what coordination processes will be followed for target setting and reporting, but no formal agreements are in place.
Processes for coordinated target setting and performance reporting have been formally documented and agreed to.
Processes for coordinated target setting and performance reporting have been refined or improved based on initial experience.
In what ways do you plan to coordinate on collection, management and reporting of safety performance data? (check all that apply)
|
Current Practice |
Implementation in Progress |
Planned Future Practice |
a. State DOT and SHSO share a single source database for all required data elements for HSIP and HSP |
|
|
|
b. State DOT and SHSO use the same consultant or data analysts for production of reports for HSIP and HSP |
|
|
|
c. State DOT provides access to data system and reports that MPOs can use to produce reports |
|
|
|
d. State DOT provides data export to MPOs for use in producing reports |
|
|
|
e. Other (Describe):
|
|
|
|
Do you agree that the level assigned to your State is consistent with your ability?
The CMM defines five capability maturity levels as follows:
(1) Initial / Ad hoc: The organization does not possess a stable implementation environment, and the safety data collection, management (entering/coding, processing, and evaluating), and maintenance process is ‘ad hoc’ with no interconnection within the organization. Interoperability and expandability are not planned.
(2) Repeatable: Activities are based on the results of previous projects and the demands of the current one. Decisions are considered during individual projects.
(3) Defined: The process is documented throughout the organization rather than on a per-project basis. Projects are carried out under guidance of the organization's standards and are tied to an adopted strategy.
(4) Managed: Projects are started and supervised by process management. Through performance management, processes are predictable and the organization can develop rules and conditions regarding the quality of the products and processes.
(5) Optimizing: The whole organization focuses on continuous improvement. The organization possesses the means to detect weaknesses and to strengthen areas of concern proactively.
To what level would you assign your State (overall and for each Area) if the questions represented your actual practices rather than desired practice?
Overall:
Roadway Data Collection/Technical Standards:
Data Analysis Tools and Uses:
Data Management and Governance:
Data Integration and Expandability:
What level would you like to be at overall and for each area?
Overall:
Roadway Data Collection/Technical Standards:
Data Analysis Tools and Uses:
Data Management and Governance:
Data Integration and Expandability:
What are the non-financial challenges/barriers preventing you from reaching that level?
What kinds of assistance (webinar, in-person training, web links, etc.) should FHWA be providing to stakeholders to assist with the collection, use, and expansion of roadway safety data and data capabilities? (Check all that apply)
Assistance for Collection of Data (Specify):
Assistance for Use of Data (Specify):
Assistance for Expansion of Data (Specify):
What kinds of problems are you having with policies or processes at the State or Federal level that make it difficult to collect, use or expand roadway safety data and data capabilities? (Check all that apply)
Problems with Collection of Data (Specify):
Problems with Use of Data (Specify):
Problems with Expansion of Data (Specify):
What non-financial resources such as tools, guidance, training etc., would be beneficial to you to collect, use, or expand roadway safety data and data capabilities? (Check all that apply)
Resources for Collection of Data (Specify):
Resources for Use of Data (Specify):
Resources for Expansion of Data (Specify):
Is there anything else you would like to share with FHWA, or the highway safety community that you think would be beneficial to improving the collection, usage, or expansion of roadway safety data and data capabilities?
Appendix A is an extended list of Area 3 questions directed at IT professionals.
Are communities of interest (both internal working groups and external stakeholders) defined and active? (Select one)
|
|
Are business users active in data strategies and delivery? (Select one)
|
|
Does a data quality group work directly with safety data stewards, application developers, and database administrators to address quality issues and/or concerns? (Select one)
|
|
Do data stewards work directly with cross-functional teams to enact data quality standards? (Select one)
|
|
Are data stewards identified so that all organizational staff know who they are? (Select one)
No.
Yes.
Are data quality and data integration tools standardized across the organization? (Select one)
No.
Yes.
Do all aspects of the organization use standard business rules created and maintained by designated data stewards? (Select one)
No.
Yes.
Are there any rules or processes built for data governance? (Select one)
|
|
Is there a Data Business Plan to support the data management of the following databases? (Check all that apply)
a. Roadway Inventory Data. |
b. Traffic Data. |
c. Crash Data. |
d. Citation / Adjudication Data. |
e. Injury Data. |
f. Driver Data. |
g. Vehicle Data. |
h. Provide a link to your data business plan: |
Are new initiatives only approved after careful consideration of how the initiatives will affect the existing data infrastructure? (Select one)
|
|
If yes to 3, is there a formal business process to request a change to the Data Business Plan? (Select one)
No.
Yes. Please specify:
Are policies in place to ensure that data remains consistent, accurate, and reliable throughout the enterprise? (Select one)
|
|
Have data management goals shifted from problem correction to prevention? (Select one)
|
|
|
|
Are real-time preventive data quality rules and processes in place? (Select one)
|
|
Do data quality metrics provide insight into areas needing improvement? (Select one)
|
|
Has the State DOT developed and published a Data Governance manual or handbook that identifies the roles and responsibilities of staff in the State DOT to support data governance operations? (Select one)
|
|
Has the State IT department developed and published a Data Governance manual or handbook that identifies the roles and responsibilities of staff to support data governance operations? (Select one)
|
|
Are most data management processes short-range and focused on recently discovered problems? (Select one)
|
|
What planned long-range processes for data governance are being pursued?
Are data continuously inspected? (Select one)
No.
Yes.
If yes, are deviations from standards resolved immediately? (Select one)
|
|
What are some of the data checks (e.g., manual procedures, electronic files)?
Are there other business rules that dictate a formal data governance framework? (Select one)
No.
Yes. Please specify:
Does ongoing data monitoring help the data stewards maintain data integrity? (Select one)
|
|
Do the data models capture the business meaning and technical details of all corporate data elements? (Select one)
|
|
Are data definitions, business rules, and data quality standards available to authorized users of the data? (Select one)
No.
Yes.
MIRE Element |
Collected: State % |
Collected: Local % |
Indicate the Inventory/Database it is stored in |
Comments |
I. Roadway Segment |
||||
1. County Name |
|
|
|
|
2. County Code |
|
|
|
|
3. Highway District |
|
|
|
|
4. Type of Governmental Ownership FDE |
|
|
|
|
5. Specific Governmental Ownership |
|
|
|
|
6. City/Local Jurisdiction Name |
|
|
|
|
7. City/Local Jurisdiction Urban Code |
|
|
|
|
8. Route Number FDE |
|
|
|
|
9. Route/Street Name FDE |
|
|
|
|
10. Begin Point Segment Descriptor FDE |
|
|
|
|
11. End point Segment Descriptor FDE |
|
|
|
|
12. Segment Identifier FDE |
|
|
|
|
13. Segment Length FDE |
|
|
|
|
14. Route Signing |
|
|
|
|
15. Route Signing Qualifier |
|
|
|
|
16. Coinciding Route Indicator |
|
|
|
|
17. Coinciding Route – Minor Route Information |
|
|
|
|
18. Direction of Inventory FDE |
|
|
|
|
19. Functional Class FDE |
|
|
|
|
20. Rural/Urban Designation FDE |
|
|
|
|
21. Federal Aid FDE |
|
|
|
|
22. Route Type FDE |
|
|
|
|
23. Access Control FDE |
|
|
|
|
24. Surface Type FDE |
|
|
|
|
25. Total Paved Surface Width |
|
|
|
|
26. Surface Friction |
|
|
|
|
27. Surface Friction Date |
|
|
|
|
28. International Roughness Index (IRI) |
|
|
|
|
29. Pavement Roughness Date |
|
|
|
|
30. Pavement Condition (Present Serviceability Rating) |
|
|
|
|
31. Pavement Condition (PSR) Date |
|
|
|
|
32. Number of Through Lanes FDE |
|
|
|
|
33. Outside Through Lane Width |
|
|
|
|
34. Inside Through Lane Width |
|
|
|
|
35. Cross Slope |
|
|
|
|
36. Auxiliary Lane Presence/Type |
|
|
|
|
37. Auxiliary Lane Length |
|
|
|
|
38. HOV Lane Presence/Type |
|
|
|
|
39. HOV Lanes |
|
|
|
|
40. Reversible Lanes |
|
|
|
|
41. Presence/Type of Bicycle Facility |
|
|
|
|
42. Width of Bicycle Facility |
|
|
|
|
43. Number of Peak Period Through Lanes |
|
|
|
|
44. Right Shoulder Type |
|
|
|
|
45. Right Shoulder Total Width |
|
|
|
|
46. Right Paved Shoulder Width |
|
|
|
|
47. Right Shoulder Rumble Strip Presence/Type |
|
|
|
|
48. Left Shoulder Type |
|
|
|
|
49. Left Shoulder Total Width |
|
|
|
|
50. Left Paved Shoulder Width |
|
|
|
|
51. Left Shoulder Rumble Strip Presence/Type |
|
|
|
|
52. Sidewalk Presence |
|
|
|
|
53. Curb Presence |
|
|
|
|
54. Curb Type |
|
|
|
|
55. Median Type FDE |
|
|
|
|
56. Median Width |
|
|
|
|
57. Median Barrier Presence/Type |
|
|
|
|
58. Median (Inner) Paved Shoulder Width |
|
|
|
|
59. Median Shoulder Rumble Strip Presence/Type |
|
|
|
|
60. Median Sideslope |
|
|
|
|
61. Median Sideslope Width |
|
|
|
|
62. Median Crossover/Left Turn Lane Type |
|
|
|
|
63. Roadside Clearzone Width |
|
|
|
|
64. Right Sideslope |
|
|
|
|
65. Right Sideslope Width |
|
|
|
|
66. Left Sideslope |
|
|
|
|
67. Left Sideslope Width |
|
|
|
|
68. Roadside Rating |
|
|
|
|
69. Tapered Edge |
|
|
|
|
70. Major Commercial Driveway Count |
|
|
|
|
71. Minor Commercial Driveway Count |
|
|
|
|
72. Major Residential Driveway Count |
|
|
|
|
73. Minor Residential Driveway Count |
|
|
|
|
74. Major Industrial/Institutional Driveway Count |
|
|
|
|
75. Minor Industrial/Institutional Driveway Count |
|
|
|
|
76. Other Driveway Count |
|
|
|
|
77. Terrain Type |
|
|
|
|
78. Number of Signalized Intersections in Segment |
|
|
|
|
79. Number of Stop-Controlled Intersections in Segment |
|
|
|
|
80. Number of Uncontrolled/Other Intersections in Segment |
|
|
|
|
81. Annual Average Daily Traffic (AADT) FDE |
|
|
|
|
82. AADT Year FDE |
|
|
|
|
83. AADT Annual Escalation Percentage |
|
|
|
|
84. Percent Single Unit Trucks or Single Truck AADT |
|
|
|
|
85. Percent Combination Trucks or Combination Truck AADT |
|
|
|
|
86. Percentage Trucks or Truck AADT |
|
|
|
|
87. Total Daily Two-Way Pedestrian Count/Exposure |
|
|
|
|
88. Bicycle Count/Exposure |
|
|
|
|
89. Motorcycle Count or Percentage |
|
|
|
|
90. Hourly Traffic Volumes (or Peak and Off peak AADT) |
|
|
|
|
91. K-Factor |
|
|
|
|
92. Directional Factor |
|
|
|
|
93. One/Two-Way Operations FDE |
|
|
|
|
94. Speed Limit |
|
|
|
|
95. Truck Speed Limit |
|
|
|
|
96. Nighttime Speed Limit |
|
|
|
|
97. 85th Percentile Speed |
|
|
|
|
98. Mean Speed |
|
|
|
|
99. School Zone Indicator |
|
|
|
|
100. On-Street Parking Presence |
|
|
|
|
101. On-Street Parking Type |
|
|
|
|
102. Roadway Lighting |
|
|
|
|
103. Toll Charged |
|
|
|
|
104. Toll Type |
|
|
|
|
105. Edgeline Presence/Width |
|
|
|
|
106. Centerline Presence/Width |
|
|
|
|
107. Centerline Rumble Strip Presence/Type |
|
|
|
|
108. Passing Zone Percentage |
|
|
|
|
109. Bridge Numbers for Bridges in Segment |
|
|
|
|
II. At-Grade Intersection/Junctions |
||||
110. Unique Junction Identifier FDE |
|
|
|
|
111. Type of Intersection/Junction |
|
|
|
|
112. Location Identifier for Road 1 Crossing Point FDE |
|
|
|
|
113. Location Identifier for Road 2 Crossing Point FDE |
|
|
|
|
114. Location Identifier for Additional Road Crossing Points |
|
|
|
|
115. Intersection/Junction Number of Legs |
|
|
|
|
116. Intersection/Junction Geometry FDE |
|
|
|
|
117. School Zone Indicator |
|
|
|
|
118. Railroad Crossing Number |
|
|
|
|
119. Intersecting Angle |
|
|
|
|
120. Intersection/Junction Offset Distance |
|
|
|
|
121. Intersection/Junction Traffic Control FDE |
|
|
|
|
122. Signalization Presence/Type |
|
|
|
|
123. Intersection/Junction Lighting |
|
|
|
|
124. Circular Intersection - Number of Circulatory Lanes |
|
|
|
|
125. Circular Intersection - Circulatory Lane Width |
|
|
|
|
126. Circular Intersection - Inscribed Diameter |
|
|
|
|
127. Circular Intersection - Bicycle Facility |
|
|
|
|
III. Intersection Leg (Each Approach) |
||||
128. Intersection Identifier for this Approach |
|
|
|
|
129. Unique Approach Identifier FDE |
|
|
|
|
130. Approach AADT |
|
|
|
|
131. Approach AADT Year |
|
|
|
|
132. Approach Mode |
|
|
|
|
133. Approach Directional Flow |
|
|
|
|
134. Number of Approach Through Lanes |
|
|
|
|
135. Left Turn Lane Type |
|
|
|
|
136. Number of Exclusive Left Turn Lanes |
|
|
|
|
137. Amount of Left turn Lane Offset |
|
|
|
|
138. Right Turn Channelization |
|
|
|
|
139. Traffic Control of Exclusive Right Turn Lanes |
|
|
|
|
140. Number of Exclusive Right Turn Lanes |
|
|
|
|
141. Length of Exclusive Left Turn Lanes |
|
|
|
|
142. Length of Exclusive Right Turn Lanes |
|
|
|
|
143. Median Type at Intersection |
|
|
|
|
144. Approach Traffic Control |
|
|
|
|
145. Approach Left Turn Protection |
|
|
|
|
146. Signal Progression |
|
|
|
|
147. Crosswalk Presence/Type |
|
|
|
|
148. Pedestrian Signalization Type |
|
|
|
|
149. Pedestrian Signal Presence/Type |
|
|
|
|
150. Crossing Pedestrian Count/Exposure |
|
|
|
|
151. Left/Right Turn Prohibitions |
|
|
|
|
152. Right Turn-On-Red Prohibitions |
|
|
|
|
153. Left Turn Counts/Percent |
|
|
|
|
154. Year of Left Turn Counts/Percent |
|
|
|
|
155. Right Turn Counts/Percent |
|
|
|
|
156. Year of Right Turn Counts/Percent |
|
|
|
|
157. Transverse Rumble Strip Presence |
|
|
|
|
158. Circular Intersection - Entry Width |
|
|
|
|
159. Circular Intersection - Number of Entry Lanes |
|
|
|
|
160. Circular Intersection – Presence/Type of Exclusive Right Turn Lane |
|
|
|
|
161. Circular Intersection - Entry Radius |
|
|
|
|
162. Circular Intersection - Exit Width |
|
|
|
|
163. Circular Intersection - Number of Exit Lanes |
|
|
|
|
164. Circular Intersection - Exit Radius |
|
|
|
|
165. Circular Intersection - Pedestrian Facility |
|
|
|
|
166. Circular Intersection - Crosswalk Location |
|
|
|
|
167. Circular Intersection – Island Width |
|
|
|
|
IV. Interchange/Ramp |
||||
168. Unique Interchange Identifier FDE |
|
|
|
|
169. Location Identifier for Road 1 Crossing Point |
|
|
|
|
170. Location Identifier for Road 2 Crossing Point |
|
|
|
|
171. Location Identifier for Additional Road Crossing Points |
|
|
|
|
172. Interchange Type FDE |
|
|
|
|
173. Interchange Lighting |
|
|
|
|
174. Interchange Entering Volume |
|
|
|
|
175. Interchange Identifier for this Ramp |
|
|
|
|
176. Unique Ramp Identifier |
|
|
|
|
177. Ramp Length FDE |
|
|
|
|
178. Ramp Acceleration Lane Length |
|
|
|
|
179. Ramp Deceleration Lane Length |
|
|
|
|
180. Ramp Number of Lanes |
|
|
|
|
181. Ramp AADT FDE |
|
|
|
|
182. Year of Ramp AADT FDE |
|
|
|
|
183. Ramp Metering |
|
|
|
|
184. Ramp Advisory Speed Limit |
|
|
|
|
185. Roadway Type at Beginning Ramp Terminal FDE |
|
|
|
|
186. Roadway Feature at Beginning Ramp Terminal |
|
|
|
|
187. Location Identifier for Roadway at Beginning Ramp Terminal FDE |
|
|
|
|
188. Location of Beginning Ramp Terminal Relative to Mainline Flow |
|
|
|
|
189. Roadway Type at Ending Ramp Terminal FDE |
|
|
|
|
190. Roadway Feature at Ending Ramp Terminal |
|
|
|
|
191. Location Identifier for Roadway at Ending Ramp Terminal FDE |
|
|
|
|
192. Location of Ending Ramp Terminal Relative to Mainline Flow |
|
|
|
|
V. Horizontal Curve |
||||
193. Curve Identifiers |
|
|
|
|
194. Curve Feature Type |
|
|
|
|
195. Horizontal Curve Degree or Radius |
|
|
|
|
196. Horizontal Curve Length |
|
|
|
|
197. Curve Superelevation |
|
|
|
|
198. Horizontal Transition/Spiral Curve Presence |
|
|
|
|
199. Horizontal Curve Intersection/Deflection Angle |
|
|
|
|
200. Horizontal Curve Direction |
|
|
|
|
VI. Vertical Grade |
||||
201. Grade Identifiers and Linkage Elements |
|
|
|
|
202. Vertical Alignment Feature Type |
|
|
|
|
203. Percent of Gradient |
|
|
|
|
204. Grade Length |
|
|
|
|
205. Vertical Curve Length |
|
|
|
|
MIRE FDEs for non-local (based on functional classification) paved roads.
MIRE Name (MIRE Number)(4) |
|
Roadway Segment |
Intersection |
Segment Identifier (12) |
Unique Junction Identifier (120) |
Route Number (8)* |
Location Identifier for Road 1 Crossing Point (122) |
Route/street Name (9)* |
Location Identifier for Road 2 Crossing Point (123) |
Federal Aid/ Route Type (21)* |
Intersection/Junction Geometry (126) |
Rural/Urban Designation (20)* |
Intersection/Junction Traffic Control (131) |
Surface Type (23)* |
AADT (79) [for Each Intersecting Road] |
Begin Point Segment Descriptor (10)* |
AADT Year (80) [for Each Intersecting Road] |
End Point Segment Descriptor (11)* |
Unique Approach Identifier (139) |
Segment Length (13)* |
|
Direction of Inventory (18) |
Interchange/Ramp |
Functional Class (19)* |
Unique Interchange Identifier (178) |
Median Type (54) |
Location Identifier for Roadway at Beginning Ramp Terminal (197) |
Access Control (22)* |
Location Identifier for Roadway at Ending Ramp Terminal (201) |
One/Two-Way Operations (91)* |
Ramp Length (187) |
Number of Through Lanes (31)* |
Roadway Type at Beginning Ramp Terminal (195) |
Annual Average Daily Traffic (79)* |
Roadway Type at Ending Ramp Terminal (199) |
AADT Year (80)* |
Interchange Type (182) |
Type of Governmental Ownership (4)* |
Ramp AADT (191)* |
|
Year of Ramp AADT (192)* |
|
Functional Class (19)* |
|
Type of Governmental Ownership (4)* |
MIRE FDEs for local paved roads.
MIRE Name (MIRE Number)(4) |
Roadway Segment |
Segment Identifier (12) |
Functional Class (19)* |
Surface Type (23)* |
Type of Governmental Ownership (4)* |
Number of Through Lanes (31)* |
Annual Average Daily Traffic (79)* |
Begin Point Segment Descriptor (10)* |
End Point Segment Descriptor (11)* |
Rural/Urban Designation (20)* |
MIRE FDEs for unpaved roads.
MIRE Name (MIRE Number)(4) |
Roadway Segment |
Segment Identifier (12) |
Functional Class (19)* |
Type of Governmental Ownership (4)* |
Begin Point Segment Descriptor (10)* |
End Point Segment Descriptor (11)* |
1 Severe laceration resulting in exposure of underlying tissues/muscle/organs or resulting in significant loss of blood; Broken or distorted extremity (arm or leg); Crush injuries; Suspected skull, chest or abdominal injury other than bruises or minor lacerations; Significant burns (second and third degree burns over 10% or more of the body); Unconsciousness when taken from the crash scene; Paralysis
2 See information at: https://www.fhwa.dot.gov/planning/processes/tools/technical_guidance/index.cfm.
3 https://www.fhwa.dot.gov/policyinformation/tmguide/
4 In some States, the dividing line for different treatment of locations is not based on ownership / jurisdiction but on whether the location is Federal Aid eligible or not. If that is the case for your State, please make a note that the answers in the table correspond to FA-eligible versus non-FA-eligible. This note may apply to questions 5-9, or a subset thereof.