Supporting Statement B- Assessment of Transportation - 052824


Assessment of Transportation Performance Management Needs, Capabilities and Capacity

OMB: 2125-0655


DEPARTMENT OF TRANSPORTATION

FEDERAL HIGHWAY ADMINISTRATION (FHWA)

SUPPORTING STATEMENT B

OMB CONTROL NO. 2125-0655


ASSESSMENT OF TRANSPORTATION PLANNING

AGENCY NEEDS, CAPABILITIES, AND CAPACITY

Part B. Collections of Information Employing Statistical Methods

1. Describe potential respondent universe and any sampling selection method to be used.


  • A census (100 percent sample) of the 50 State Departments of Transportation (DOTs), the District of Columbia, and Puerto Rico will be drawn,

  • A census (100 percent sample) of 409 metropolitan planning organizations (MPOs) will be drawn, and

  • No attempt will be made to draw inferences to any population other than the set of units that responded to the data collection effort.


2. Describe procedures for collecting information, including statistical methodology for stratification and sample selection, estimation procedures, degree of accuracy needed, and less than annual periodic data cycles.


  • State DOT Data Collection: As the assessment will seek to include all State DOTs, no formal sampling strategy will be required for this respondent group. With a response rate of 80 percent (42 agencies), the 90 percent confidence level margin-of-error for population proportion estimates would be at most plus or minus 6 percent. With a response rate of 90 percent (47 agencies), the 90 percent confidence level margin-of-error for population proportion estimates would be less than plus or minus 4 percent. We believe this minimum response would adequately enable FHWA to identify and quantify state transportation agency levels of readiness, areas of concern, and training and resource needs.

  • MPO Data Collection: As the assessment will seek to include all MPOs, no formal sampling strategy will be required for this respondent group. The MPOs will be sorted into urbanized area strata based on the represented metropolitan areas’ population, air quality characteristics (including ozone non-attainment), and planning organization representation. Since many regulatory requirement thresholds are tied to area population and Environmental Protection Agency (EPA) air quality conformity assessments, these thresholds are likely to reflect differences in the surveyed agencies’ level of sophistication and exposure to performance-management-based planning concepts.

  • The urbanized area strata will include:

    • Strata 1: Areas of more than one million population;

    • Strata 2: Areas of less than one million population that have air quality non-attainment issues;

    • Strata 3: Areas of between 200,000 and one million population that do not have air quality non-attainment issues;

    • Strata 4: Areas represented by MPOs with less-than-200,000 population that do not have air quality non-attainment issues.
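The classification rules above can be sketched as a small helper. This is an illustrative reading of the strata definitions, not part of the assessment protocol; the function name and inputs are assumptions.

```python
def assign_stratum(population: int, nonattainment: bool) -> int:
    """Assign an urbanized area to one of the four strata defined above,
    given its population and whether it has air quality non-attainment
    issues. (Illustrative helper; inputs are assumed field names.)"""
    if population > 1_000_000:
        return 1  # Strata 1: more than one million population
    if nonattainment:
        return 2  # Strata 2: under one million, with non-attainment issues
    if population >= 200_000:
        return 3  # Strata 3: 200,000 to one million, no non-attainment issues
    return 4      # Strata 4: under 200,000, no non-attainment issues
```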

  • The final strata will be refined through the combination of several available federal databases:

    • Census Bureau Urbanized Area List;

    • the MPO database maintained by FHWA; and

    • the EPA Green Book, which records air quality conformity issues by region.

  • Based on our preliminary processing of these sources, the approximate counts of urbanized areas by stratum are:

    • Strata 1 – 50 regions;

    • Strata 2 – 63 regions;

    • Strata 3 – 112 regions; and

    • Strata 4 – 183 regions.

  • The MPOs in strata 1 to 4 will be contacted to complete the survey. The assessment is expected to have a comparatively strong response rate, because of the importance of the data collection topic to the mission of the MPOs:

    • The survey topic will be of direct importance to the target respondents, the agencies’ Executive Directors, as it will affect many of the agencies’ business practices;

    • The survey invitation will come from a prominent FHWA sender;

    • We will seek to have pre-notification, and hopefully endorsement, of the data collection effort be provided by national planning organizations, such as the National Association for Regional Councils (NARC) and the Association of Metropolitan Planning Organizations (AMPO), and by State DOTs;

    • The survey pre-notification and follow-up protocols will be robust and will include both email and telephone contact.

  • For planning purposes, we assume a response rate of 35 percent, though we will seek to achieve the highest possible rate. A 35 percent rate would yield about 161 (= (52 + 409) × 0.35) valid responses.

  • At this level of return, the 90 percent confidence level margin-of-error for population proportion estimates would be at most plus or minus 6 percent. We believe this minimum response would adequately enable FHWA to identify and quantify MPO levels of readiness, areas of concern, and training and resource needs.
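The margin-of-error figures cited above can be reproduced with the standard proportion formula plus a finite population correction. The sketch below assumes a worst-case proportion of 0.5 and the two-sided 90 percent critical value z = 1.645; the document does not state the exact formula used, so this is a plausible reconstruction.

```python
import math

def moe_90(n: int, N: int, p: float = 0.5) -> float:
    """90 percent confidence margin of error for a population proportion,
    with a finite population correction, appropriate when the frame is a
    census of N units and n of them respond."""
    z = 1.645  # two-sided 90 percent normal critical value
    fpc = (N - n) / (N - 1)
    return z * math.sqrt(p * (1 - p) / n * fpc)

# State DOT group: N = 52 (50 states + DC + Puerto Rico)
print(f"n=42:  +/-{moe_90(42, 52):.1%}")   # about +/-5.6% (at most 6%)
print(f"n=47:  +/-{moe_90(47, 52):.1%}")   # about +/-3.8% (under 4%)
# Combined planning assumption: n = 161 of N = 461 (52 DOTs + 409 MPOs)
print(f"n=161: +/-{moe_90(161, 461):.1%}") # about +/-5.2% (at most 6%)
```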

  • Follow-Up Data Collection: A follow-up survey of the same partner organizations (State DOTs and MPOs) will be conducted in 2024 or 2025 and later. Respondents from the State DOT and MPO assessments will be re-contacted for the follow-up assessments to determine if the baseline has changed. Should an organization that completed the initial assessment not respond to the follow-up assessment, we will seek to identify and recruit similar organizations that did not participate in the initial data collection.

  • Respondent Identification within Partner Organizations: One of the important challenges of the assessment will be to identify the best people within the agencies from whom to collect information. The initial State DOT contacts will be the individuals previously identified by FHWA.

  • The default MPO principal points-of-contact will be the Executive Directors. We will also ask for input from AMPO.

  • Each of the partner organization assessments will be seeking information that may reside with multiple staff members at the State DOTs and MPOs. Consequently, a survey strategy that involves multiple points of contact will be required. The approach envisioned is to send the main survey invitation to the key points-of-contact (e.g., MPO Executive Directors) and allow them either to complete the subsections of the survey themselves or to identify others in the Agency or Department who should complete the program-topic-specific subsections of the survey.

  • Advantages of this approach:

    • More likely to capture data from the staff members who are knowledgeable of specific Agency or Department capabilities

    • Multiple perspectives from each Agency or Department can better identify specific issues and concerns

    • Increased interest in the implementation and assessment effort throughout an Agency/ Department

  • Disadvantages of this approach:

    • Potential biases may be introduced by letting the primary respondents select the subsection respondents

    • Multiple perspectives from each Agency or Department could be contradictory

    • Potential difficulty in gaining perspectives on prioritization between different roles and responsibilities to implement planning requirements

  • In our view, the benefit of reaching the most knowledgeable staff outweighs any potential biases introduced by having the main respondents select the subsection respondents. The multiple perspective approach also reflects the fact that planning touches on many disciplines within an Agency or State DOT. To address prioritization across the many roles and responsibilities associated with the planning requirements within the system performance areas, the survey will include general prioritization questions for the main respondent to answer, and more specific subsection questions for other sub-respondents.


3. Describe methods to maximize response rate.

  • The data collection effort will consist of the following steps:

    • State DOT Assessment:

      • FHWA will alert State DOT Point of Contact (POC) that a web-based survey is being developed that will help with determining needs and priorities for training, guidance resources, and technical assistance

      • FHWA will develop an invitation email with a link to the State DOT survey. FHWA will send the email invitation with a link to the main survey to the State DOT POCs

      • If no response is received after seven days, an automated reminder email invitation with a link to the survey will be sent to the State DOT contacts

      • After seven more days, a second automated reminder email invitation with a link to the survey will be sent to non-respondents

      • If still no response is received, the project team will place a telephone call reminder asking the State DOT contact to either complete the web-based survey or to set up an appointment to complete it by phone

      • As part of the main survey, the State DOT POC will be given the option to identify the best person within their agency to complete each of the seven subsections of the survey, which will be based on the State DOT’s anticipated roles

      • The survey software will then automatically email the referenced people invitations to complete surveys with the identified survey subsections

      • The same follow-up protocols will be followed for the subsection survey respondents as for the main surveys

      • Only the single State DOT POC can review, edit, and submit final approved responses to FHWA

    • MPO Assessment:

      • FHWA staff will alert the MPO POCs of an upcoming web-based assessment. If an MPO POC cannot be identified, the MPO Executive Director will be the POC.

      • FHWA will develop invitation emails with links to the MPO survey. The FHWA Office of Planning Director will send the email invitation with a link to the surveys to the MPO POC.

      • If no response is received after seven days, an automated reminder email invitation with a link to the survey will be sent to the MPO contacts

      • After seven more days, a second automated reminder email invitation with a link to the survey will be sent to non-respondents

      • As part of the main surveys, the MPO contacts will be given the option to identify the best person within their agencies to complete each of the subsections of the survey, which will be based on their agencies’ anticipated roles

      • The survey software will then automatically email the referenced people invitations to complete surveys with the identified survey subsections

      • The same follow-up protocols will be followed for the subsection survey respondents as for the main surveys

      • Only the single MPO POC can delegate, review, edit, and submit final approved responses to FHWA.


4. Describe tests of procedures or methods.

  • Survey Question Construction

    • The development of the survey instrument will be an iterative process, beginning with FHWA review and editing of the data elements. As data elements are settled, specific question wording will be developed. Each question and its associated response categories will be evaluated for the following potential problems:

      • Lack of focus

      • Bias

      • Fatigue

      • Miscommunication

    • Bias limitation and detection

      • It will be important to limit the amount of time needed for respondents to complete the assessment. Fatigue and loss of interest affect survey completion rates, data quality, and the completeness and thoughtfulness of open-ended responses. We will seek to limit the entire survey process to no more than 3 burden hours. The burden-hours estimate acknowledges and accounts for the time often needed to collect information; coordinate and input responses; and review and approve final responses.

      • Where possible, response category orders will be randomized to limit bias.

      • Survey page timers (not visible to respondents) will be used to identify potential comprehension problems (unusually long page dwell times) and potential loss-of-interest problems (unusually short page dwell times).
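One simple way to operationalize the timer check described above is to flag dwell times that fall far from a page's mean, for example more than two standard deviations away. The threshold and function below are illustrative assumptions, not part of the survey software's built-in tooling.

```python
import statistics

def flag_dwell_times(seconds_by_page: dict[str, list[float]],
                     k: float = 2.0) -> dict[str, list[float]]:
    """Flag dwell times more than k standard deviations from each page's
    mean: unusually long dwells suggest comprehension problems, unusually
    short dwells suggest loss of interest. (Illustrative threshold.)"""
    flags = {}
    for page, times in seconds_by_page.items():
        mean = statistics.mean(times)
        sd = statistics.stdev(times) if len(times) > 1 else 0.0
        flags[page] = [t for t in times if sd and abs(t - mean) > k * sd]
    return flags
```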

    • Testing the Draft Survey

      • Survey instrument diagnostics: Survey software includes built-in capabilities to evaluate the web-based survey instrument:

        • Fatigue / survey timing scores

        • Language and graphics accessibility scores

    • Generation of survey test data: Once the survey is drafted, we will generate hypothetical synthetic output datasets. This will enable us to correct response category problems and to ensure that the output data will support the tabulations and analyses we expect to perform on the actual data set.
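One minimal way to generate such synthetic output data is to draw uniformly from each question's response categories; the question names below are placeholders, and real test data would also need to exercise skip logic.

```python
import random

def synthetic_responses(n: int, questions: dict[str, list[str]],
                        seed: int = 0) -> list[dict[str, str]]:
    """Generate n hypothetical survey records by drawing uniformly from
    each question's response categories, to exercise tabulation and
    analysis code before real data arrive. (Sketch only.)"""
    rng = random.Random(seed)  # fixed seed for reproducible test data
    return [{q: rng.choice(cats) for q, cats in questions.items()}
            for _ in range(n)]
```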

    • Office pretest: Prior to engaging the Partner Organizations, we will generate an email invitation link to a test survey and distribute it to staff who are knowledgeable of the survey topics but who were not involved in the survey development. We will seek their input on the survey questions and identify potential improvements to the survey.

    • Field pretest: Because the assessment will be distributed to all State DOTs and MPOs, a full dry-run survey field pretest cannot be used. Instead, we will schedule about five of the FHWA Partner agencies (State DOT and/or MPOs) assessments to be delivered earlier than the rest of the assessments. We will review results of the early assessments as they are completed to evaluate comprehension and cooperation levels. We will contact early respondents by phone to ask if they had any specific issues that could be fixed. We will make necessary changes for the full assessment release, and if necessary re-contact early respondents to collect any data elements that were not in the early survey.

  • Selection of survey data collection software

    • A proposed survey software platform is Survey Gizmo (http://www.surveygizmo.com/)

      • Specific advantages of Survey Gizmo compared to other online survey data collection options:

        • Wider range of question types than most online survey options, including group questions, matrix questions, and experimental design choice exercises

        • Custom scripting capabilities

        • Flexible page and question logic and skipping

        • Style themes by device type

        • Email campaign tools

        • Response tracking, reporting, and multiple data export formats (CSV, Excel, SPSS, etc.)

        • Greater range of respondent access controls than other online products

          • Allowance of save-and-continue

          • Duplicate protection

          • Anonymous responses

          • Quota setting

          • Restrictions on going backward

          • Section navigation

        • Greater range of administrator roles and collaboration features than other products

    • The Section Navigator is particularly critical for the assessment because it will enable the primary points-of-contact to separate the assessment into sections, making it easy for different respondents to complete different parts without interrupting or overwriting one another. Simply stated, the Section Navigator enables one Partner Organization’s assessment to be completed by multiple people.

[Section Navigator screenshot omitted. Source: Survey Gizmo documentation.]

An example of a recent survey conducted in Survey Gizmo:


http://www.surveygizmo.com/s3/1775738/AASHTO-CTPP-Survey-a


    • Analysis of Results

    • Data review

      • As the data are collected, we will review responses for validity, including:

      • Survey response patterns (such as straightlining)

      • Page completion times

      • Completion of closed-ended and open-ended survey responses

      • Internal consistency checks

      • Data outlier review
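As one example of the validity checks above, straightlining in grid questions can be flagged with a simple rule. This is an illustrative screen; actual review would weigh timing, consistency, and outlier evidence together.

```python
def is_straightlined(grid_answers: list[int], min_items: int = 5) -> bool:
    """Flag a matrix/grid response where every item received the identical
    rating, a common sign of inattentive responding. Short grids are not
    flagged, since identical answers there can be legitimate."""
    return len(grid_answers) >= min_items and len(set(grid_answers)) == 1
```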

    • Tabulations

      • Topline results

      • Cross-tabulations

      • Cluster analysis to group partner organizations by similarities, if feasible

    • Analyses

      • MaxDiff priority measurement

      • Gap analysis (training needs versus capabilities)

      • Open response coding

      • (Follow-up assessment only) Longitudinal (before-after) comparisons of initial assessments and follow-up assessments

    • The MaxDiff priority measurement approach is a discrete choice data collection and analysis method in which respondents are asked to select the most important and least important priorities from several experimentally designed lists. The respondent selections will be used to model the relative prioritization of roles and responsibilities, as well as potential capacity-building strategies. More direct rating scale questions have the appeal of being easily understood by respondents, but the ratings are commonly affected by response effects, such as respondents scoring many potential responses as the highest priority. In addition, responses to scales can vary from person to person. Consequently, relying only on scale questions can be problematic. Choice exercises, such as MaxDiff, help to alleviate many of these problems.
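A minimal count-based scoring of MaxDiff selections looks like the following. This is a sketch only: production MaxDiff analyses typically fit a multinomial logit or hierarchical Bayes model rather than relying on raw best-minus-worst counts.

```python
from collections import defaultdict

def best_worst_scores(tasks: list[tuple[str, str, list[str]]]) -> dict[str, float]:
    """Count-based MaxDiff scores: (times chosen best - times chosen worst)
    divided by times shown. Each task is (best_item, worst_item, shown_items).
    Scores range from -1 (always worst) to +1 (always best)."""
    best, worst, shown = defaultdict(int), defaultdict(int), defaultdict(int)
    for b, w, items in tasks:
        best[b] += 1
        worst[w] += 1
        for item in items:
            shown[item] += 1
    return {item: (best[item] - worst[item]) / shown[item] for item in shown}
```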

    • Survey Data Files and Tabulation

      • Access to the assessment results will be given to FHWA staff to support additional data analysis and summary efforts. Through this access, FHWA staff will be able to provide individual responses upon request.

      • The raw assessment data will also be submitted to FHWA in a simple Excel workbook.


5. Provide name and telephone number of individuals who were consulted on statistical aspects of the IC and who will actually collect and/or analyze the information.


Kenneth Petty

Director, Office of Planning

Federal Highway Administration

(202) 366-6654

[email protected]

File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Lewis, Jazmyne (FHWA)
File Created: 2024-07-27
