SUPPORTING STATEMENT
ASSESSMENT OF TRANSPORTATION PLANNING
AGENCY NEEDS, CAPABILITIES AND CAPACITY
INTRODUCTION
This is to request the Office of Management and Budget’s (OMB) three-year approval for the information collection entitled, Assessment of Transportation Performance Management Needs, Capabilities and Capacity.
Describe potential respondent universe and any sampling selection method to be used.
The data collection effort will include:
A census (100 percent sample) of 52 State Departments of Transportation (DOTs),
A census (100 percent sample) of urbanized areas from which metropolitan planning organizations (MPOs) will be drawn, and
Follow-up data collection with the same respondent organizations in 2017 or 2018, or later.
No attempt will be made to draw inferences to any population other than the set of units that responded to the data collection effort.
Describe procedures for collecting information, including statistical methodology for stratification and sample selection, estimation procedures, degree of accuracy needed, and less than annual periodic data cycles.
State DOT Data Collection: Because the assessment will seek to include all State DOTs, no formal sampling strategy is required for this respondent group. A recent preliminary assessment of the state transportation agencies by FHWA had full participation, so we expect a high response rate of 80 percent or more. With a response rate of 80 percent (42 agencies), the 90 percent confidence level margin of error for population proportion estimates would be at most plus or minus 6 percent. With a response rate of 90 percent (47 agencies), the margin of error would be less than plus or minus 4 percent. We believe this minimum response would enable FHWA to identify and quantify state transportation agency levels of readiness, areas of concern, and training and resource needs.
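The margin-of-error figures above can be reproduced with the standard formula for a proportion estimate with a finite population correction. A minimal sketch, assuming the most conservative proportion (p = 0.5) and the 90 percent critical value (z = 1.645):

```python
import math

def moe_fpc(n, N, p=0.5, z=1.645):
    """Margin of error for a proportion with finite population correction.

    n: responses received; N: population size; p: assumed proportion
    (0.5 is most conservative); z: critical value (1.645 -> 90% CI).
    """
    se = z * math.sqrt(p * (1 - p) / n)   # simple random sample margin
    fpc = math.sqrt((N - n) / (N - 1))    # finite population correction
    return se * fpc

# State DOT census of 52 agencies:
print(round(moe_fpc(42, 52), 3))  # 80% response -> 0.056, i.e. within +/- 6%
print(round(moe_fpc(47, 52), 3))  # 90% response -> 0.038, i.e. within +/- 4%
```

The same function applies to the MPO estimates later in this statement by substituting the MPO population and response counts.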
MPO Data Collection: Because the assessment will seek to include all MPOs, no formal sampling strategy is required for this respondent group. The MPOs will be sorted into urbanized area strata based on the represented metropolitan areas’ population, air quality characteristics (including ozone non-attainment), and planning organization representation. Because many regulatory requirement thresholds are tied to area population and Environmental Protection Agency (EPA) air quality conformity assessments, these thresholds are likely to reflect differences in the surveyed agencies’ level of sophistication and exposure to performance management-based planning concepts.
The urbanized area strata will include:
Stratum 1: Areas of more than one million population;
Stratum 2: Areas of less than one million population that have air quality non-attainment issues;
Stratum 3: Areas of between 200,000 and one million population that do not have air quality non-attainment issues;
Stratum 4: Areas represented by MPOs with less than 200,000 population that do not have air quality non-attainment issues.
The final strata will be refined by combining several available federal databases:
Census Bureau Urbanized Area List;
the MPO database maintained by FHWA; and
EPA Green Book, which records air quality conformity issues by region.
Based on our preliminary processing of these sources, the approximate numbers of urbanized areas by stratum are:
Stratum 1 – 50 regions;
Stratum 2 – 63 regions;
Stratum 3 – 112 regions; and
Stratum 4 – 183 regions.
These population estimates will be reviewed and corrected to ensure that the assignment of regions by type is accurate. We propose to include MPO participants as follows:
Stratum 1 – include all 50 regions
Stratum 2 – include all 63 regions
Stratum 3 – include 112 regions
Stratum 4 – include 183 regions
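The stratum definitions above amount to a simple classification on area population and non-attainment status. A sketch of that rule, with boundary handling at 200,000 and one million chosen as a plausible reading of the definitions (the actual database processing may resolve boundaries differently):

```python
def assign_stratum(population, nonattainment):
    """Assign an urbanized area to a survey stratum.

    Boundary treatment (> 1M, >= 200k) is an illustrative assumption.
    """
    if population > 1_000_000:
        return 1              # all large areas, regardless of air quality
    if nonattainment:
        return 2              # smaller areas with non-attainment issues
    if population >= 200_000:
        return 3              # mid-size attainment areas
    return 4                  # small attainment areas

print(assign_stratum(2_500_000, False))  # 1
print(assign_stratum(500_000, True))     # 2
print(assign_stratum(500_000, False))    # 3
print(assign_stratum(150_000, False))    # 4
```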
The MPOs from the selected regions in strata 1 to 4 will be contacted to complete the survey. A recent web-based survey of Census data specialists at MPOs conducted for AASHTO yielded a response rate of 27 percent. Another recent survey of MPOs conducted for FHWA regarding the organizational structure of the agencies had a response rate of 36 percent. The National TPM Implementation Assessment is expected to have a comparatively strong response rate, because of the importance of the data collection topic to the mission of the MPOs and because of the full range of survey design measures that will be employed to minimize non-response bias that are described in later sections below, most notably:
The survey topic will be of greater importance to the target respondents, the agencies’ Executive Directors, as the topic will affect many of the agencies’ business practices;
The survey invitation will come from a prominent FHWA sender;
We will seek pre-notification of the data collection effort, and ideally its endorsement, by national planning organizations, such as the National Association of Regional Councils (NARC) and the Association of Metropolitan Planning Organizations (AMPO), and by State DOTs;
The survey pre-notification and follow-up protocols will be robust and will include both email and telephone contact.
Because of these survey data collection features, we expect the MPO survey response rate to be in the 35 to 45 percent range. For planning, we assume a response rate of 35 percent, though we will seek to achieve the highest possible rate. The 35 percent rate would yield about 117 valid responses.
At this level of return, the 90 percent confidence level margin-of-error for population proportion estimates would be at most plus or minus 6 percent. We believe this minimum response would adequately enable FHWA to identify and quantify MPO levels of readiness, areas of concern, and training and resource needs.
Follow-Up Data Collection: A follow-up survey of the same partner organizations will be conducted in 2017 or 2018, or later. Respondents from the initial State DOT and MPO assessments will be re-contacted for the follow-up assessments. When organizations that complete the initial assessment do not respond to the follow-up assessment, we will seek to identify and recruit similar organizations that did not participate in the initial data collection (either because they were not sampled or because they refused to be included in the initial effort) to participate in the follow-up. The resulting follow-up assessment sample will allow for longitudinal analyses (with attrition replacement).
Respondent Selection within Partner Organizations: One of the important challenges of the National TPM Implementation Assessment will be to identify the best people within the agencies from whom to collect information. The initial State DOT contacts will be the individuals previously identified by FHWA for the earlier preliminary assessments.
The default MPO principal points-of-contact will be the Executive Directors. We will also ask for input from AMPO.
Each of the partner organization assessments will be seeking information that may reside with multiple staff members at the State DOTs and MPOs. Consequently, a survey strategy that involves multiple points of contact will be required. The approach envisioned is to send the main survey invitation to the key points-of-contact, described above, and allow them to complete the subsections of the survey themselves or to identify others in the Agency or Department that should complete the program topic area specific subsections of the survey.
Advantages of this multiple points-of-contact approach include:
More likely to capture data from the staff members who are knowledgeable of specific Agency or Department capabilities
Multiple perspectives from each Agency or Department can better identify specific issues and concerns
Increased interest in TPM implementation and in the Assessment effort throughout the Agencies and Departments
Potential drawbacks include:
Potential biases may be introduced by letting the primary respondents select the subsection respondents
Multiple perspectives from each Agency or Department could be contradictory
Potential difficulty in gaining perspectives on prioritization between different roles and responsibilities to implement TPM requirements within program areas
In our view, the benefit of reaching the most knowledgeable staff outweighs any potential biases introduced by having the main respondents select the subsection respondents. The multiple perspective approach also reflects the fact that TPM touches on many disciplines within an Agency or State DOT. To address prioritization across the many roles and responsibilities associated with TPM requirements within the system performance areas, the survey will include general prioritization questions for the main respondent to answer, and more specific subsection questions for other sub-respondents.
Describe methods to maximize response rate.
The data collection effort will consist of the following steps:
FHWA Office of TPM and Division Office staff will alert State DOT Point of Contact (POC) that a web-based survey is being developed that will help with determining needs and priorities for TPM training, guidance resources, and technical assistance
The FHWA will develop an invitation email with a link to the State DOT survey. The FHWA TPM Director will send the email invitation with a link to the main survey to the State DOT POCs
If no response is received after seven days, an automated reminder email invitation with a link to the survey will be sent to the State DOT contacts
After seven more days, a second automated reminder email invitation with a link to the survey will be sent to non-respondents
If still no response is received, the project team will place a telephone call reminder asking the State DOT contact to either complete the web-based survey or to set up an appointment to complete it by phone
As part of the main survey, the State DOT POC will be given the option to identify the best person within their agency to complete each subsection of the survey, based on the State DOT’s anticipated TPM roles
The survey software will then automatically email the referenced people invitations to complete surveys with the identified survey subsections
The same follow-up protocols will be followed for the subsection survey respondents as for the main surveys
Only the single State DOT POC can review, edit, and submit final approved responses to FHWA.
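The contact cadence above (invitation, automated reminders at seven-day intervals, then a telephone follow-up) can be sketched as a simple schedule. The timing of the telephone step is an illustrative assumption, since the text does not fix a date for it:

```python
from datetime import date, timedelta

def reminder_schedule(invite_date):
    """Contact cadence for non-respondents: invitation, reminder at +7 days,
    second reminder at +14 days, then a telephone follow-up (timing assumed
    here as +21 days; the actual call date is at the project team's discretion)."""
    return [
        ("email invitation", invite_date),
        ("automated reminder 1", invite_date + timedelta(days=7)),
        ("automated reminder 2", invite_date + timedelta(days=14)),
        ("telephone follow-up", invite_date + timedelta(days=21)),
    ]

for step, when in reminder_schedule(date(2016, 3, 1)):
    print(step, when.isoformat())
```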
FHWA Office of TPM and Division Office staff will alert the MPO POCs of an upcoming web-based assessment. If an MPO contact cannot be identified, the MPO Executive Director will be the point-of-contact
The FHWA will develop invitation emails with links to the MPO survey. The FHWA TPM Director will send the email invitation with a survey link to the MPO contacts
If no response is received after seven days, an automated reminder email invitation with a link to the survey will be sent to the MPO contacts
After seven more days, a second automated reminder email invitation with a link to the survey will be sent to non-respondents
As part of the main surveys, the MPO contacts will be given the option to identify the best person within their agencies to complete each of the subsections of the survey, which will be based on their agencies’ anticipated TPM roles
The survey software will then automatically email the referenced people invitations to complete surveys with the identified survey subsections
The same follow-up protocols will be followed for the subsection survey respondents as for the main surveys
Only the single MPO POC can delegate, review, edit, and submit final approved responses to FHWA.
Describe tests of procedures or methods.
The development of the survey instrument will be an iterative process, beginning with FHWA review and editing of the data elements listed above. As data elements are settled, specific question wording will be developed. Each question and its associated response categories will be evaluated along the following dimensions:
Lack of focus
Bias
Fatigue
Miscommunication
It will be important to limit the amount of time needed for respondents to complete the National TPM Implementation Assessment. Fatigue and loss of interest affect survey completion rates, data quality, and the completeness and thoughtfulness of open-ended responses. We will seek to limit the entire survey process to no more than 20 burden hours. The burden hours estimate acknowledges and accounts for the time often needed to collect information; coordinate and input responses; and review and approve final responses.
Where possible, response category orders will be randomized to limit bias.
Survey page timers (not visible to respondents) will be used to identify potential understanding problems (unusually long page dwell times) and potential loss-of-interest problems (unusually short page dwell times)
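The dwell-time screening described above can be implemented as a simple outlier flag relative to the typical time spent on each page. The thresholds below are illustrative assumptions, not FHWA-specified values:

```python
from statistics import median

def flag_dwell_times(seconds, low_factor=0.25, high_factor=4.0):
    """Flag page dwell times that are unusually short (possible loss of
    interest) or unusually long (possible comprehension problems),
    relative to the median dwell time for that page."""
    m = median(seconds)
    flags = []
    for t in seconds:
        if t < low_factor * m:
            flags.append("too_short")
        elif t > high_factor * m:
            flags.append("too_long")
        else:
            flags.append("ok")
    return flags

# Dwell times (seconds) for one survey page across five respondents:
print(flag_dwell_times([40, 35, 5, 200, 38]))
# -> ['ok', 'ok', 'too_short', 'too_long', 'ok']
```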
Survey instrument diagnostics: Survey software includes built-in capabilities to evaluate the web-based survey instrument:
Fatigue / survey timing scores
Language and graphics accessibility scores
Generation of survey test data: Once the survey is drafted, we will generate hypothetical synthetic output datasets. This will enable us to correct response category problems and to ensure that the output data will support the tabulations and analyses we expect to perform on the actual data set.
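Generating synthetic output data, as described above, can be as simple as drawing random answers from each question's allowed response categories and exporting them in the planned output format. The question schema here is a made-up illustration, not the actual instrument:

```python
import csv
import io
import random

# Hypothetical question schema: question id -> allowed response categories.
SCHEMA = {
    "q1_readiness": ["low", "medium", "high"],
    "q2_staff_fte": list(range(0, 51)),
    "q3_has_tpm_plan": ["yes", "no"],
}

def synthetic_responses(n, seed=0):
    """Generate n fake survey records to exercise tabulation code."""
    rng = random.Random(seed)  # seeded for reproducible test datasets
    return [{q: rng.choice(cats) for q, cats in SCHEMA.items()} for _ in range(n)]

# Export the synthetic records as CSV, mirroring the real output pipeline.
rows = synthetic_responses(5)
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=SCHEMA)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```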
Office pretest: Prior to engaging the Partner Organizations, we will generate an email invitation link to a test survey and distribute it to Spy Pond Partners and FHWA staff that are knowledgeable of the survey topics but that were not involved in the survey development. We will seek their input on the survey questions and identify potential improvements to the survey.
Field pretest: Because the National TPM Implementation Assessment will be distributed to all State DOTs and most MPOs, a full dry-run survey field pretest cannot be used.
Instead, we will schedule about five of the FHWA Partner agencies (State DOT and/or MPOs) assessments to be delivered earlier than the rest of the assessments. We will review results of the early assessments as they are completed to evaluate comprehension and cooperation levels. We will contact early respondents by phone to ask if they had any specific issues that could be fixed. We will make necessary changes for the full assessment release, and if necessary re-contact early respondents to collect any data elements that were not in the early survey.
The proposed survey software platform is Survey Gizmo, in addition to the TPM Capability Maturity Model self-assessment tool.
Specific advantages of Survey Gizmo compared to other online survey data collection options include:
Wider range of question types than most online survey options, including group questions, matrix questions, and experimental design choice exercises
Custom scripting capabilities
Flexible page and question logic and skipping
Style themes by device type
Email campaign tools
Response tracking, reporting, and multiple data export formats (CSV, Excel, SPSS, etc.)
Greater range of respondent access controls than other online products
Allowance of save-and-continue
Duplicate protection
Anonymous responses
Quota setting
Restrictions on going backward
Section navigation
Greater range of administrator roles and collaboration features than other products
The Section Navigator is particularly critical for the Partner Organization assessment because it will enable the primary points-of-contact to separate the assessment into sections, making it easy for different respondents to complete different parts without interrupting or overwriting one another. Simply stated, the Section Navigator enables one Partner Organization’s assessment to be completed by multiple people.
[Figure: Section Navigator example. Source: Survey Gizmo documentation, 2014.]
An example of a recent survey conducted in Survey Gizmo:
http://www.surveygizmo.com/s3/1775738/AASHTO-CTPP-Survey-a
As the data are collected, we will review responses for validity, including:
Survey response patterns (such as straightlining, etc.)
Page completion times
Completion of closed-ended and open-ended survey responses
Internal consistency checks
Data outlier review
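Straightlining, the first validity check listed above, can be flagged by checking whether a respondent gave the identical answer to every item in a grid question. A minimal sketch:

```python
def is_straightlining(grid_answers):
    """True when a respondent gave the identical answer to every item in a
    matrix/grid question, a common sign of inattentive responding. A single
    answer is never flagged."""
    return len(set(grid_answers)) == 1 and len(grid_answers) > 1

print(is_straightlining([3, 3, 3, 3, 3]))  # True
print(is_straightlining([3, 2, 4, 3, 5]))  # False
```

In practice a flag like this would be combined with the page-timing and consistency checks before treating a response as invalid.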
Planned analyses of the assessment data include:
Topline results
Cross-tabulations
Cluster analysis to group partner organizations by similarities, if feasible
MaxDiff priority measurement
Gap analysis (training needs versus capabilities)
Open response coding
(Follow-up assessment only) Longitudinal (before-after) comparisons of initial assessments and follow-up assessments
The MaxDiff priority measurement approach is a discrete choice data collection and analysis method in which respondents are asked to select the most important and least important priorities from several experimentally designed lists. The respondent selections will be used to model the relative prioritization of roles and responsibilities, as well as potential capacity building strategies. More direct rating scale questions have the appeal of being easily understood by respondents, but the ratings are commonly affected by response effects, such as respondents scoring many potential responses as the highest priority. In addition, responses to scales can vary from person to person. Consequently, relying only on scale questions can be problematic. Choice exercises, such as MaxDiff, help to alleviate many of these problems.
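A common first-cut MaxDiff analysis uses best-minus-worst counts: each item's score is the number of times it was chosen as most important minus the number of times it was chosen as least important, normalized by how often the item was shown. This counting approach is a standard approximation to a full choice model; the item names below are illustrative, not the actual assessment items:

```python
from collections import Counter

def maxdiff_counts(tasks):
    """tasks: list of (shown_items, best_item, worst_item) tuples.
    Returns each item's best-minus-worst score, normalized by the number
    of times the item appeared in a choice task."""
    best, worst, shown = Counter(), Counter(), Counter()
    for items, b, w in tasks:
        shown.update(items)
        best[b] += 1
        worst[w] += 1
    return {i: (best[i] - worst[i]) / shown[i] for i in shown}

# Hypothetical choice tasks (items shown, most important, least important):
tasks = [
    (["training", "guidance", "funding", "tools"], "funding", "tools"),
    (["training", "funding", "tools", "peer exchange"], "funding", "peer exchange"),
    (["guidance", "tools", "peer exchange", "training"], "training", "tools"),
]
scores = maxdiff_counts(tasks)
print(max(scores, key=scores.get))  # prints "funding"
```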
Access to the assessment results will be given to FHWA staff to support additional data analysis and summary efforts. Through this access, FHWA staff will be able to provide individual responses upon request.
The raw assessment data will also be submitted to FHWA in a simple Excel workbook.
Provide name and telephone number of individuals who were consulted on statistical aspects of the IC and who will actually collect and/or analyze the information.
Michael D. Nesbitt
Senior Transportation Specialist
Office of Transportation Performance Management
Federal Highway Administration
(202)366-1179
Kevin Tierney
Bird's Hill Research
617-839-0938
Scott Richrath
Spy Pond Partners, in Denver:
2334 S Fenton Dr
Lakewood, CO 80227
(720) 737-5671 (cell)