National Household Travel Survey (NHTS)

OMB: 2125-0545


SUPPORTING STATEMENT

2015 National Household Travel Survey


This is a request for Office of Management and Budget (OMB) approval of the reinstatement of a periodic information collection entitled “National Household Travel Survey” (NHTS).


Part B. Statistical Methods



While many elements of the survey design for the 2015 NHTS are based on the design used in the 2009, 2001, and 1995 surveys, the sample design for the 2015 NHTS uses address-based sampling (ABS). Given the continuing decline in landline telephone use and a resulting RDD sampling frame that covers less than 60 percent of households [1] (down from 75 percent in 2009), the change to an ABS design, which uses a frame that covers virtually all households, will provide higher coverage. Although these changes may affect trend analysis, it is imperative to address overall declining response rates, with significant focus placed on the respondent experience. For further information about the survey design, contact Adella Santos at FHWA, 202-366-5021, [email protected].


  1. Describe potential respondent universe and any sampling selection method to be used:

The NHTS is a study conducted by the U.S. Department of Transportation (DOT) Federal Highway Administration (FHWA) that obtains data on key aspects of travel by the American public. The survey is focused on the household as the basic unit of observation. 


Respondent Universe

The population of inferential interest for the 2015 NHTS is defined as households living in the U.S., excluding group quarters, during the post-pretest data collection period (November 2015 – December 2016). The universe for sample selection is households within the 50 states and the District of Columbia.


The National 2015 NHTS will comprise a sample of addresses selected from the ABS frame maintained by Marketing Systems Group (MSG) using stratified systematic sampling. MSG’s ABS frame originates from the U.S. Postal Service (USPS) Computerized Delivery Sequence file (CDS) and is updated on a monthly basis. The vendor, MSG, has taken great strides to evaluate and enhance the standard CDS-based list. For example, MSG has augmented simplified addresses that lack a specific street address, making it possible to match auxiliary variables to a set of sampled addresses. In addition, the ABS frame allows characteristics such as Hispanic surname indicators to be matched to about 90 percent of addresses. Other characteristics of addresses (e.g., race/ethnicity, education, household income) may also be matched to some addresses. However, all of these items are subject to error (see Roth, Han & Montaquila, 2013 [2]), but may still prove useful for data collection purposes. Another benefit is that MSG addresses are geocoded, so Decennial Census and American Community Survey characteristics at the block, block group, and tract level can be appended, which is very useful for sampling and/or data collection purposes.


Most studies of coverage of ABS frames have focused on in-person surveys that require locating the physical address in a specific geography (giving rise to problems with rural route addresses and PO boxes, as well as coverage issues resulting from geocoding errors). For a mail survey, household coverage rates are much higher, since the USPS delivers mail to almost all households, and households with mailing addresses that do not correspond to a physical location are covered. As reported by Iannacchione (2011) [3], ABS frames that are derived from the USPS CDS file and its associated No-Stat file offer nearly complete coverage of households [4]. The sampling frame for the 2015 NHTS will be derived from only the CDS file, since including the No-Stat file (a file of mostly inactive addresses) is estimated to increase coverage by less than one percentage point (Shook-Sa et al. [5]). Thus, the ABS sample used for the 2015 NHTS is expected to have substantially higher coverage than the 2009 NHTS landline RDD sample. (In 2009, about one-fourth of households did not have landline telephones, and it has been estimated that an additional 5 to 20 percent of landline households were excluded from landline RDD frames; see Blumberg and Luke (2009) [6] for telephone service statistics, and Boyle et al. (2009) [7], Fahimi, Kulp, and Brick (2009) [8], Barron and Zhao (2010) [9], and Barron et al. (2013) [10] for discussion of landline RDD frame coverage of landline households.)



Sample Selection

The 2015 NHTS will be conducted using an Address Based Sampling frame as discussed above. There will be a nationwide sample with a target of 26,000 responding households. Because of the need to produce some state-level estimates with adequate precision, state will be used for stratification. Additionally, the following four groups will be used to sub-stratify within each primary stratum:


  • Counties in Metropolitan Statistical Areas (MSAs) of at least 1 million people and containing heavy rail for transit use (14 such MSAs exist in the U.S.);

  • Counties in MSAs of at least 1 million people and not containing heavy rail for transit use;

  • Counties in MSAs of less than 1 million people; and

  • Counties not in MSAs.


The sample size (specified in terms of responding households) will then be initially allocated among the strata according to the proportion of addresses falling in the stratum (i.e., proportional to stratum size, as determined by the counts of addresses from the ABS frame). A minimum allocation of 250 responding households per state will be used; states with initial allocations of fewer than 250 households will be increased to 250, and the remainder of the National sample will be re-allocated proportionally to the strata associated with the remaining states.


Once the sample of responding households has been allocated in the manner described above, these sample sizes will be inflated to account for expected losses due to ineligible addresses (i.e., Postmaster non-deliverable mail, assumed to be 11 percent of addresses based on other national ABS mail studies conducted by Westat), non-response to the recruitment effort (an assumed non-response rate of 70 percent of eligible addresses), and non-response to the retrieval effort (an assumed non-response rate of 35 percent of recruited households). The departures from proportional allocation of responding households in the National sample will result in a sample of addresses selected with variable sampling rates. These variations in sampling rates will be properly accounted for in the computation of the survey weights.
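
To make the allocation and inflation arithmetic concrete, the sketch below (Python, illustrative only; the state names, frame counts, and function names are placeholders rather than the ABS totals in Table B-1) applies proportional allocation with the 250-household state minimum and then inflates responding-household targets to mailed addresses using the assumed 89 percent residency, 30 percent recruitment, and 65 percent retrieval rates:

```python
# Illustrative sketch only: proportional allocation of responding households
# with a per-state minimum, then inflation of completes to mailed addresses
# using the assumed rates described in Part B.
TARGET_COMPLETES = 26000
RESIDENCY_RATE = 0.89      # assumed share of sampled addresses that are deliverable/occupied
RECRUIT_RATE = 0.30        # assumed recruitment response rate
RETRIEVAL_RATE = 0.65      # assumed retrieval response rate
MIN_PER_STATE = 250

def allocate(frame_counts, target=TARGET_COMPLETES, minimum=MIN_PER_STATE):
    """Proportional allocation with a per-state minimum: states that fall below
    the minimum are fixed at the minimum and the remainder of the target is
    re-allocated proportionally among the remaining states."""
    alloc = {}
    remaining_target = target
    remaining = dict(frame_counts)
    while True:
        if not remaining:
            return alloc
        total = sum(remaining.values())
        prop = {s: remaining_target * n / total for s, n in remaining.items()}
        small = {s for s, v in prop.items() if v < minimum}
        if not small:
            alloc.update({s: round(v) for s, v in prop.items()})
            return alloc
        for s in small:                       # fix these states at the minimum
            alloc[s] = minimum
            remaining_target -= minimum
            remaining.pop(s)

def addresses_needed(completes):
    """Inflate responding households to sampled (mailed) addresses."""
    return completes / (RESIDENCY_RATE * RECRUIT_RATE * RETRIEVAL_RATE)

# Example with hypothetical frame counts (not the actual ABS frame totals):
example_frame = {"State A": 2_000_000, "State B": 250_000, "State C": 10_000_000}
completes = allocate(example_frame, target=1_000)
mailed = {s: round(addresses_needed(c)) for s, c in completes.items()}
print(completes, mailed)
```

With these assumed rates, each responding household corresponds to roughly 1/(0.89 × 0.30 × 0.65) ≈ 5.8 sampled addresses, which is consistent with the state-level ratios shown in Table B-1.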


Within each substratum, the ABS frame will be sorted in a prescribed manner prior to sample selection. The sort used by MSG is geographic in nature (within each state, the file is sorted by ZIP+4), and addresses are sampled systematically using the geographic sort. Thus, no important issues arise in the definition of areas with an ABS sample design that relies on mail for data collection.
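
The following sketch (Python, illustrative only; the record layout and the zip4 field are assumed for the example rather than taken from the MSG file specification) shows the kind of systematic selection from a geographically sorted substratum frame described above:

```python
import random

def systematic_sample(frame, n):
    """Select n records systematically from a frame that is already sorted by a
    geographic key (e.g., ZIP+4 within state), so the selections are spread
    across the sorted order of the substratum."""
    if n <= 0 or n > len(frame):
        raise ValueError("n must be between 1 and the frame size")
    interval = len(frame) / n              # sampling interval (may be fractional)
    start = random.random() * interval     # random start within the first interval
    picks = [min(len(frame) - 1, int(start + k * interval)) for k in range(n)]
    return [frame[i] for i in picks]

# Hypothetical substratum: 10,000 addresses sorted by an assumed zip4 field.
frame = sorted(
    ({"address_id": i, "zip4": f"{20000 + (i % 900):05d}-{i % 10000:04d}"} for i in range(10_000)),
    key=lambda rec: rec["zip4"],
)
sample = systematic_sample(frame, 120)
print(len(sample), sample[0]["zip4"])
```

Because the frame is sorted geographically before selection, the systematic sample is implicitly spread across ZIP areas within the substratum, which is the property the paragraph above relies on.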


Table B-1 below presents the estimated sample sizes needed by stratum for the national sample to yield 26,000 completed household interviews using the allocation described above.


Table B-2 presents the estimated sample sizes aggregated over the sub-strata described above.


Table B-1. Expected Sample Sizes for the National Sample by State

STATE | ABS 12/14 occupied housing units* | ABS address sample needed assuming 89% residency rate | Recruitment completes assuming 30% response | Retrieval completes assuming 65% response
Alabama | 2,099,930 | 2,198 | 587 | 382
Alaska | 245,754 | 1,441 | 385 | 250
Arizona | 2,704,563 | 2,831 | 756 | 491
Arkansas | 1,235,085 | 1,441 | 385 | 250
California | 13,315,204 | 13,939 | 3,722 | 2,419
Colorado | 2,104,380 | 2,203 | 588 | 382
Connecticut | 1,459,660 | 1,528 | 408 | 265
Delaware | 387,757 | 1,441 | 385 | 250
District of Columbia | 301,320 | 1,441 | 385 | 250
Florida | 8,797,039 | 9,209 | 2,459 | 1,598
Georgia | 3,959,849 | 4,145 | 1,107 | 719
Hawaii | 466,121 | 1,441 | 385 | 250
Idaho | 598,298 | 1,441 | 385 | 250
Illinois | 5,176,017 | 5,419 | 1,447 | 940
Indiana | 2,761,832 | 2,891 | 772 | 502
Iowa | 1,252,609 | 1,441 | 385 | 250
Kansas | 1,175,772 | 1,441 | 385 | 250
Kentucky | 1,843,639 | 1,930 | 515 | 335
Louisiana | 1,931,525 | 2,022 | 540 | 351
Maine | 575,676 | 1,441 | 385 | 250
Maryland | 2,349,233 | 2,459 | 657 | 427
Massachusetts | 2,774,255 | 2,904 | 775 | 504
Michigan | 4,263,440 | 4,463 | 1,192 | 775
Minnesota | 2,210,703 | 2,314 | 618 | 402
Mississippi | 1,236,148 | 1,441 | 385 | 250
Missouri | 2,575,052 | 2,696 | 720 | 468
Montana | 389,691 | 1,441 | 385 | 250
Nebraska | 743,212 | 1,441 | 385 | 250
Nevada | 1,115,129 | 1,441 | 385 | 250
New Hampshire | 530,210 | 1,441 | 385 | 250
New Jersey | 3,514,537 | 3,679 | 982 | 639
New Mexico | 775,084 | 1,441 | 385 | 250
New York | 7,798,454 | 8,164 | 2,180 | 1,417
North Carolina | 4,130,585 | 4,324 | 1,155 | 750
North Dakota | 286,886 | 1,441 | 385 | 250
Ohio | 5,082,240 | 5,320 | 1,421 | 923
Oklahoma | 1,579,355 | 1,653 | 441 | 287
Oregon | 1,576,258 | 1,650 | 441 | 286
Pennsylvania | 5,425,758 | 5,680 | 1,517 | 986
Rhode Island | 458,760 | 1,441 | 385 | 250
South Carolina | 2,044,349 | 2,140 | 571 | 371
South Dakota | 316,742 | 1,441 | 385 | 250
Tennessee | 2,769,775 | 2,900 | 774 | 503
Texas | 10,058,020 | 10,529 | 2,811 | 1,827
Utah | 945,196 | 1,441 | 385 | 250
Vermont | 250,552 | 1,441 | 385 | 250
Virginia | 3,286,115 | 3,440 | 919 | 597
Washington | 2,785,131 | 2,916 | 778 | 506
West Virginia | 780,380 | 1,441 | 385 | 250
Wisconsin | 2,457,776 | 2,573 | 687 | 447
Wyoming | 214,990 | 1,441 | 385 | 250
TOTAL | 127,116,046 | 149,821 | 40,000 | 26,000


Table B-2. Expected Sample Sizes for the National Sample by Primary Stratum

National sample stratum | ABS 12/14 occupied housing units | Proportion | Stratum sample size
Counties within MSAs > 1 million and heavy rail | 31,070,705 | 24.4% | 5,996
Counties within MSAs > 1 million and not heavy rail | 38,036,786 | 29.9% | 7,157
Counties within MSAs < 1 million | 39,573,540 | 31.1% | 8,509
Not in an MSA | 18,435,015 | 14.5% | 4,338
TOTAL | 127,116,046 |  | 26,000

NOTE: The stratum sample size column indicates the expected number of completed household surveys.


The sample release process will control the balance of travel days by month. Sampled addresses will be assigned a travel day, with assignments distributed equally across all days of the week to ensure a balanced day-of-week distribution. This is a proven approach to travel day assignment that has been used in all of our travel surveys.
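
As an illustration of balanced travel day assignment (Python, a sketch under the assumption that travel days are released month by month; the address identifiers are hypothetical), each released address can be rotated through the seven days of the week and then through the specific dates within the month:

```python
import itertools
from datetime import date, timedelta

def dates_in_month(start, end):
    """All calendar dates from start to end, inclusive."""
    out, d = [], start
    while d <= end:
        out.append(d)
        d += timedelta(days=1)
    return out

def assign_travel_days(addresses, month_start, month_end):
    """Round-robin over the seven days of the week, then over the specific
    dates falling on that weekday, so the released sample is balanced by day
    of week and spread across the month."""
    days = dates_in_month(month_start, month_end)
    by_weekday = {wd: itertools.cycle([d for d in days if d.weekday() == wd])
                  for wd in range(7)}
    weekday_cycle = itertools.cycle(range(7))
    return {addr: next(by_weekday[next(weekday_cycle)]) for addr in addresses}

# Hypothetical address IDs released for November 2015.
assignments = assign_travel_days([f"ADDR{i:05d}" for i in range(100)],
                                 date(2015, 11, 1), date(2015, 11, 30))
print(assignments["ADDR00000"], assignments["ADDR00006"])
```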


Response Rates

The NHTS has several unique characteristics that have historically made it difficult to achieve robust response rate targets. These survey characteristics include:


Two-stage data collection rather than one-stage - In 1995, the NHTS series switched from a one-stage survey to a two-stage, which greatly improved the quality of the trip data. The trip data constitutes the heart of the survey subject matter. The proposed 2015 survey will constitute the eighth in the NPTS/NHTS series, with the first National Personal Transportation Survey (NPTS) taking place in 1969. By the 1990 survey, the pace of our lives and the complexity of daily travel had changed so much that a one-stage retrospective survey was found to be lacking in producing a reliable account of a specific day’s travel. Starting in 1995, the survey adopted the current two-stage approach along with a travel diary or log for recruited respondents to use as a memory jogger.


All household members interviewed rather than a randomly selected person - In addition to obtaining the amount, type and characteristics of daily travel, we also need to understand how the household, as a unit, functions to meet the travel needs and desires of its members. Thus, we need to interview each household member for travel on the same designated day. The requirement to interview all household members places an additional and significant operations burden on the survey. To reduce respondent burden, once a previously interviewed household member reports a trip taken with a particular respondent, that trip is populated on the second respondent’s travel day record and the second respondent is only asked to verify if they were on that trip.


A relatively short window of opportunity for data collection rather than extending data collection until a household is completed - Because the nature of the data being collected on a typical assigned day is usually unremarkable, we have found that a time limit needs to be enforced or else one specific assigned day blends into all others. We have resolved this issue by allowing daily trip collection within a 7-day window after the assigned day. That guideline, while improving data quality, creates another challenge to full survey participation.


The proposed 2015 NHTS design is a two-stage survey with an ABS frame, mail out/mail back recruit stage and web-based travel day retrieval, using multiple modes for reminders at key points, telephone retrieval as an option when necessary, and cash incentives at key stages. An additional option of using mail-back paper questionnaires was initially considered. This option was rejected because paper questionnaires cannot control for the completion of all data elements, or the range of responses. Such internal data checks are possible in web-based and telephone data retrieval modes.


Most regional household travel surveys have adopted an approach that includes an ABS frame, mailed letters that encourage recipients to complete recruitment online, multiple reminder postcard mailings, and varying levels of telephone follow-up. That approach is well suited to studies with compressed data collection schedules that cannot accommodate the longer recruitment effort planned for the NHTS. The NHTS, with its significantly longer field period, can accommodate the unique strategy presented in the design of the 2015 survey. Based on recent experience with the proposed recruitment methodology, a 30 percent recruitment rate and a 65 percent retrieval rate are projected for the 2015 NHTS.



The 2009 NHTS resulted in a 23 percent recruit response rate and an 80 percent retrieval response rate, for an overall survey response rate of 18.4 percent. The design proposed for the 2015 survey is expected to produce a stronger recruit rate of 30 percent and a retrieval rate of 65 percent. Table B-3 gives a description and the sizes of the population, sample, and respondents.
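
Because the two stages are sequential, the overall response rate is the product of the stage-level rates (a simple calculation, consistent with how the 18.4 percent figure above follows from the 2009 stage rates):

\[
r_{\text{overall}} = r_{\text{recruit}} \times r_{\text{retrieval}}, \qquad
0.23 \times 0.80 = 0.184 \;(2009), \qquad
0.30 \times 0.65 \approx 0.195 \;(2015,\ \text{projected}).
\]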


Recruit Stage

Switching from telephone recruitment to a mail-out/mail-back recruitment is considered to be more palatable to U.S. households. As described earlier, the recruitment survey will be short and contain some engaging questions about the respondent’s daily travel experiences. Those features, combined with a promised completion incentive, are designed to improve on the previous recruitment response rate.


Retrieval Stage

Experience in conducting the NHTS series has shown that, once recruited, households tend to follow through and complete the survey. In 2001 the retrieval response was 71% and in 2009 it was 80%. The more conservative estimate of 65% for the 2015 survey is projected because of the change to a web-based retrieval instrument, which has not been previously used for this survey. Although regional household travel surveys have used the web-based retrieval instrument, the NHTS content is more comprehensive and collects additional information. Thus a 65% retrieval rate is projected.



Table B-3. Description and sizes of population, sample, and respondents

Group | Description | Size
Universe (population) | U.S. households | 116 million
Sample | Sample of addresses from ABS frame | 149,813
Respondents | Households completing all retrieval instruments | 26,000



  2. Describe procedures for collecting information, including statistical methodology for stratification and sample selection, estimation procedures, degree of accuracy needed, and less than annual periodic data cycles:

Data Collection Procedures

As in previous cycles of the NHTS, the 2015 NHTS will maintain a two-phase design, which includes a household recruitment mail-out/mail-back survey (phase 1) and a trip-level retrieval survey (phase 2) completed with all eligible household members ages 5 and older. As described in the sections above, the data collection will use an ABS frame to randomly invite households to participate in the 2015 NHTS. These addresses will be contacted by mail and asked to complete a brief, household-level survey instrument and return that survey to the contractor. Members of recruited households will be asked to keep track of all the places they go for one day and to report that information to the contractor either online or by telephone. A description of the materials used in the conduct of the 2015 NHTS follows. 2015 NHTS modifications to the questions used in the 2009 NHTS are summarized in Appendix 6.

Recruitment Materials

Invitation Letter/Survey Packet. The proposed methodology includes an invitation letter (Appendix 7) that will be mailed to each sampled address in the format City Resident, Address. The letter will introduce the NHTS, discuss the importance of participation, include the first incentive ($2), and describe the incentive structure. The letter will request that the enclosed recruitment survey (Appendix 8) be completed and returned in the Business Reply Envelope (Appendix 9). To increase the perceived legitimacy of the study, the letter will include the DOT logo and be signed by a USDOT official. Including a $2 cash “primer” incentive in the invitation letter is expected to encourage households to begin participation.


A toll-free number on the survey materials will be provided and the participant will have the option to complete the recruitment survey by web. All data collected in the recruitment survey will be used to populate the household record in the retrieval survey database. If respondents mail, call, or use the web to complete the recruitment survey, their responses are collected in the same survey database as the retrieval.


The initial recruitment tool will be designed to capture the interest of sampled households by targeting issues likely to resonate with them and engage them in further participation. It will be designed as a scannable document and will be processed using standard software for intelligent data capture and image processing. All processing will be conducted in a secure data facility. This software will be used in the design of the recruitment survey, to scan and extract responses, conduct validation tests, and store data. Best practices for security and protecting print materials that contain identifying or confidential information will be administered.


Reminder Postcard 1. Reminder postcards, addressed to City Resident, and designed with the project logo to capture interest, will be sent to each sampled address seven days after the invitation letter (Appendix 10).


Invitation Letter/Survey Packet. A second survey packet will be mailed to non-responding addresses 21 days after the first mailing. The letter will be slightly different (Appendix 11); the recruitment survey will be the same as in the initial mailings. This packet will not include another incentive.


Reminder Postcard 2. Forty-four days after the first mailing, each non-responding address will be mailed a second postcard that will include an invitation to participate by web (Appendix 12). This mailing will not include another incentive, but will remind the household that a small gift will be provided for the completion of the survey. This invitation will provide an opportunity for households that are more likely to participate online to do so. The web recruitment survey content will be identical to the paper recruitment survey.


Project Website. A project website will be created and maintained for the 2015 NHTS. This public website will be dynamic and able to provide information about the study and contain Frequently Asked Questions. The website will also serve as the portal through which survey respondents access the online survey, or contact the consultant.

Retrieval/Travel Log Materials

Travel Log Packet. The travel log packet will include a letter (Appendix 13), an exemplar log on one side and personalized travel logs for each household member ages 5 and older (Appendix 4), and will be sent using first-class postage in a 9 1/2” x 12” envelope (Appendix 14). The envelopes will be branded to match the letterhead used for the invitation letter. The second respondent incentive will be included with the travel logs. This $5 cash incentive, appended to the letter, is expected to serve as a “good faith” incentive to encourage completion of the retrieval survey.


Travel Log Letter. A household letter will be included in the travel log packet. The letter will thank participants for agreeing to participate, further familiarize the participants with the travel recording stage, identify the households’ travel date and provide instructions about when and how to complete the retrieval survey. The letter will also include a reminder note about the final $20 household incentive. Similar to the invitation letter, the travel log letter will include the DOT logo.


Travel Logs. A personalized travel log for each household member (ages 5 and older) will be sent. Depending upon how each household member was identified (i.e., name or initials) in the recruitment survey, a label with that identification will be affixed to each household member’s travel log. The logs are intended to be a memory jogger to guide accurate data collection and aid in the reporting of each place visited on the travel day.


Exemplar Log. An exemplar log with the instructions for recording the details about the places visited on the travel day will be provided to the household. The exemplar log will have colorful icons and directional arrows to show the general public how their travel should be reported.


Final Incentive Payment. Once a household has completed the survey, a $20 check will be mailed to the household in a #10 envelope (Appendix 16).


Reminder Contacts. Electronic and telephone reminders will be used to engage participants, remind them about their travel date and to report their travel. An overview of the reminder schedule is found in Figure 3. The first reminder will be sent when the travel log packet is mailed. Reminders are generally sent in late afternoon on the day before the assigned travel day. Although there is no email “receipt” required for these emails, the consultant’s database system generates summary reports of the reminders sent each day, and the email addresses that failed. The consultants’ Telephone Research Center staff is fully trained on the reminder scripts to be used when phoning participants.


Statistical Methodology for Stratification


As discussed above, the 2015 NHTS will be conducted using an Address Based Sampling frame. There will be a National sample with a target of 26,000 responding households. Because of the need to produce state-level estimates with adequate precision, state will be used for stratification. Additionally, the following four groups will be used to sub-stratify within each primary stratum:


  • Counties in Metropolitan Statistical Areas (MSAs) of at least 1 million people and containing heavy rail for transit use (14 such MSAs exist in the U.S.);

  • Counties in MSAs of at least 1 million people and not containing heavy rail for transit use;

  • Counties in MSAs of less than 1 million people; and

  • Counties not in MSAs.


The sample size (specified in terms of responding households) will then be initially allocated among the strata according to the proportion of addresses falling in the stratum (i.e., proportional to stratum size, as determined by the counts of addresses from the ABS frame). A minimum allocation of 250 responding households per state will be used; states with initial allocations of fewer than 250 households will be increased to 250, and the remainder of the National sample will be re-allocated proportionally to the strata associated with the remaining states.


Once the sample of responding households has been allocated in the manner described above, these sample sizes will be inflated to account for expected losses due to ineligible addresses (i.e., Postmaster non-deliverable mail, assumed to be 11 percent of addresses based on other national ABS mail studies conducted by Westat), non-response to the recruitment effort (an assumed non-response rate of 70 percent of eligible addresses), and non-response to the retrieval effort (an assumed non-response rate of 35 percent of recruited households). The departures from proportional allocation of responding households in the National sample will result in a sample of addresses selected with variable sampling rates. These variations in sampling rates will be properly accounted for in the computation of the survey weights.


Within each substratum, the ABS frame will be sorted in a prescribed manner prior to sample selection. The sort used by MSG is geographic in nature (within each state, the file is sorted by ZIP+4), and addresses are sampled systematically using the geographic sort. Thus, no important issues arise in the definition of areas with an ABS sample design that relies on mail for data collection.


Table B-1 (above) presents the estimated sample sizes needed by stratum for the national sample to yield 26,000 completed household interviews using the allocation described above.




Estimation Procedures


Sample weights for the 2015 NHTS will be developed and assigned to the records representing various analytic units associated with the study such as responding households and persons within responding households, vehicles, and trips. The weights will be designed to permit inference to the corresponding target populations. The weights will be developed so that the expanded survey can be analyzed for each state separately, and for the nation. Sample weights will be designed to permit data users to calculate representative estimates of the population of interest from the collected data. Replicate weights will allow users to compute standard errors for the estimates from the collected data.


The process of weighting has three major components. The first is the provision of a base weight, which is the inverse of the overall selection probability of each selected unit. Base weights are needed to account for the varying probabilities of selection among sample units in any given sample. Probabilities vary purposefully to some extent. Since no subsampling within households will take place, household- and person-level base weights will be identical. That is, the base weight for each person in a household is identical to their corresponding household base weight.
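
In standard survey-sampling notation (a general statement, not language taken from the weighting specifications), the base weight for selected unit i is the inverse of its selection probability:

\[
w^{\text{base}}_{i} = \frac{1}{\pi_i},
\]

where \(\pi_i\) is the probability that address i was selected from its stratum; because no within-household subsampling is done, every person in household i carries the same base weight \(w^{\text{base}}_{i}\).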


The second component of the weighting process consists of adjustments for nonresponse. Nonresponse can occur at the household level at the recruitment stage when a household declines to participate or at the retrieval stage when a household initially participates in the recruitment survey but does not actually participate in the retrieval survey. In either case, selected households who should have participated (according to the sampling scheme) ultimately did not participate. Results for the nonparticipating households are not reflected by the use of base weights, and therefore nonresponse adjustments to these weights are made in an effort to represent the full household population by adjusting the weights of the actual participating sample to account for the nonparticipating households.
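
One common way to carry out such an adjustment, shown here only as an illustration of the general idea rather than the specific method that will be used, is a weighting-class adjustment in which respondents in an adjustment cell c absorb the weight of the eligible sample in that cell:

\[
a_c \;=\; \frac{\sum_{i \in c\ (\text{eligible})} w^{\text{base}}_{i}}{\sum_{i \in c\ (\text{respondents})} w^{\text{base}}_{i}},
\qquad
w^{\text{adj}}_{i} = a_c \, w^{\text{base}}_{i} \quad \text{for respondents } i \in c.
\]

The actual adjustment cells and method (e.g., response-propensity modeling) would be determined during the weighting process.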


The third major component of the weighting is the application of additional adjustments, such as weight trimming and calibration. Unlike the previous two components, trimming is not directed primarily at minimizing the bias of survey estimates, but rather at reducing their variance, permitting more precise statements to be made about the travel patterns and behavior and comparisons among subgroups and over time. Trimming consists of reducing the weights for each participating sample unit whose weight, as a result of the calculation of base weights and the application of nonresponse adjustments, makes an unduly large relative contribution to the total weighted data set. Calibration consists of adjusting the weights of sample units in population subgroups so that the sum is equal to an independent, relatively reliable benchmark estimate of the size of that population subgroup. Calibration will be done at both the household and person levels by using raking procedures. Characteristics such as age, sex, race, ethnicity, Census Division, MSA size, household size, number of household vehicles, and worker status will be considered for calibration.  We will implement the weight trimming procedure iteratively with the raking process so that the trimmed portions of the weights will be redistributed across all the remaining weights. This will ensure that the final weights will achieve consistency with the external population distributions without any excessively large survey weights.
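
To illustrate the raking step described above, the sketch below (Python, illustrative only; the variable names, categories, and control totals are hypothetical, and production weighting would rely on the contractor’s established procedures) adjusts a set of weights to match two marginal control totals:

```python
def rake(records, base_weights, margins, max_iter=50, tol=1e-6):
    """Minimal raking (iterative proportional fitting) sketch: repeatedly scale
    the weights so weighted totals match each set of control totals until all
    scaling factors are close to 1.  `margins` maps a variable name to
    {category: control total}; each record is a dict of category values."""
    weights = list(base_weights)
    for _ in range(max_iter):
        worst = 0.0
        for var, targets in margins.items():
            # current weighted total per category of this variable
            current = {cat: 0.0 for cat in targets}
            for rec, w in zip(records, weights):
                current[rec[var]] += w
            # scale weights so this margin matches its control totals
            factors = {cat: targets[cat] / current[cat] for cat in targets}
            weights = [w * factors[rec[var]] for rec, w in zip(records, weights)]
            worst = max(worst, max(abs(f - 1.0) for f in factors.values()))
        if worst < tol:
            break
    return weights

# Hypothetical example: four households, two control margins whose totals agree.
records = [{"hhsize": "1", "division": "South"},
           {"hhsize": "2+", "division": "South"},
           {"hhsize": "1", "division": "West"},
           {"hhsize": "2+", "division": "West"}]
base_weights = [100.0, 120.0, 80.0, 90.0]
margins = {"hhsize": {"1": 200.0, "2+": 250.0},
           "division": {"South": 240.0, "West": 210.0}}
print([round(w, 1) for w in rake(records, base_weights, margins)])
```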


Deriving these components of survey weights and combining them to produce a final weight is standard practice in complex sample surveys with some nonresponse. The underlying principles and standard practices are described in texts on survey sampling (e.g., Sarndal, Swensson, and Wretman, 1992; Lohr, 1999; Heeringa et al., 2010) and are reflected in Westat’s standard weighting procedures (which are proprietary SAS macros that were developed and have been used for decades to produce weights for numerous surveys).


In addition to the full-sample weights used to generate estimates from the survey, a set of replicate weights will also be created to allow users to compute variances of survey estimates and to conduct inferential statistical analyses. Replication methods work by dividing the sample into subsamples (also referred to as replicates) that mirror the sample design. A weight is calculated for each replicate using the same procedures as used for the full-sample weight (as described above). That is, the nonresponse, trimming, and calibration adjustments will be replicated so the replicate variance estimator correctly accounts for the effects of these adjustments. The variation among the replicate estimates is then used to estimate the variance for the survey estimates. Replicate weights for the NHTS sample will be generated using the jackknife procedure, in which sampled households are formed into groups reflecting the sample design and each replicate corresponds to dropping one group. The replicate weights can be used with a software package, such as WesVar, SUDAAN, STATA or SAS, to produce consistent variance estimators for totals, means, ratios, linear and logistic regression coefficients, etc.
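
As a sketch of how replicate weights are used once replicate estimates exist (Python, illustrative only; the replicate estimates shown are hypothetical, and the (G − 1)/G factor is the usual constant for a delete-one-group jackknife, with other variants using different constants), the variance and standard error of an estimate can be computed as follows:

```python
def jackknife_variance(full_estimate, replicate_estimates):
    """Delete-one-group jackknife variance: each replicate estimate is computed
    with one group of households removed and the replicate weights re-adjusted;
    (G - 1) / G is the usual JK1 constant (other jackknife variants differ)."""
    g = len(replicate_estimates)
    return (g - 1) / g * sum((r - full_estimate) ** 2 for r in replicate_estimates)

# Hypothetical replicate estimates of average daily trips per household.
theta_full = 9.5
theta_reps = [9.48, 9.53, 9.46, 9.55, 9.51, 9.49, 9.52, 9.47]
variance = jackknife_variance(theta_full, theta_reps)
print(variance, variance ** 0.5)   # variance and standard error
```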


Degree of Accuracy Needed


Because the NHTS provides data on a broad spectrum of household travel characteristics, it is impossible to estimate the precision for all possible transportation parameters of interest. Table B-4 provides estimates of the expected precision for selected domains of interest to the primary users of the data. The 2009 NHTS precision estimates are used because the 2015 NHTS, with the same sample size, is expected to yield similar precision. Despite design changes, the 2009 data remain the best indicator of the precision expected in 2015.
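
For reference, the relative standard error shown in Table B-4 expresses the standard error as a percentage of the estimate (a general definition, not text taken from the table documentation):

\[
\mathrm{RSE}(\hat\theta) \;=\; \frac{\mathrm{SE}(\hat\theta)}{\hat\theta}\times 100\%,
\]

so, for example, an estimate of 19,850 annual vehicle miles of travel per household with a standard error of 270.87 corresponds to an RSE of about 1.36 percent.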



Table B-4. Precision Estimates for Key Variables

Domain Description | 2009 Estimate | 2009 Number of observations | 2009 Standard Error | 2009 Relative Standard Error (%)
Average number of daily trips made by a household | 9.5 trips | 150,147 | 0.05 | +/-0.56
Average annual vehicle miles of travel per household | 19,850 miles | 150,147 | 270.87 | +/-1.36
Average number of daily household shopping trips | 3.7 trips | 150,147 | 0.03 | +/-0.89
Percent of households that own an SUV | 31.28% | 150,147 | 0.18 | +/-0.93
Average vehicle miles of travel per day for females aged 16-20 | 31.0 miles | 6,495 | 1.43 | +/-4.62
Number of people who do not drive a vehicle | 30.31 million | 24,590 | 505,548 | +/-1.67
Average distance to work | 5.50 miles | 308,901 | 0.06 | +/-1.04
Average vehicle occupancy for trips of 11-15 miles | 1.67 people | 741,173 | 0.01 | +/-0.89
Average time spent driving to a shop in an urban area | 14.8 minutes | 164,622 | 0.31 | +/-2.11
Average time spent driving on any trip (nationwide) | 19.91 minutes | 1,164,458 | 0.14 | +/-0.72
Average trip duration for motorcycle riders | 25 minutes | 3,207 | 1.55 | +/-6.31
Average daily vehicle miles driven for people not born in the United States | 30.9 miles | 25,990 | 1.62 | +/-5.24


Less Than Annual Periodic Data Cycles

The NHTS has been conducted at approximately 5 year intervals since 1969. Fielding a survey to net about 26,000 households every five years is sufficient for the Department of Transportation, States, MPOs, and other users of NHTS data.


Non-response Follow-up Methodology

A well-designed plan for follow-up of non-respondents during data collection is critical to maximizing response rates. Because the information available to contact sampled households differs across the two phases of data collection, non-response follow-up will be phase-specific. In the recruitment phase, a sequence of contacts based on design principles for mail surveys laid out by Dillman, Smyth, and Christian (2009) will be used. Seven days after mailing the initial recruitment package, a postcard reminder will be sent to all sampled addresses that have not yet returned their recruitment survey. Twenty-one days after the initial recruitment invitation, a second recruitment package will be mailed, without an incentive, to all households that have not returned their recruitment questionnaires.


For the retrieval phase, the primary mode of data collection is web. Reminders and non-response follow-up will make use of other modes. Since the recruitment survey collects telephone numbers and email addresses, an attempt to contact non-respondents will be conducted by email, text, and telephone. During the recruitment survey phase, respondents are asked to state their preferred mode for receiving reminders. These particular data collection modes have proven to be beneficial in Westat projects in boosting response rates both overall and for subgroups such as those less likely to have access to the web, those with privacy concerns associated with transmitting data through the Internet, and those who simply prefer the telephone over the computer as a means of communication.


Non-response may bias survey estimates if the characteristics of respondents differ from those of non-respondents. Traditionally, the size of the bias has been viewed as a deterministic function of the size of the response difference and the response rate (see, for example, Särndal & Lundström, 2005 [11], for discussion). More recently, the emphasis has shifted toward a stochastic perspective that characterizes non-response bias by examining the relationship between the key variable and the response propensity (Groves et al., 2006 [12]; Montaquila et al., 2007 [13]); for example, if the propensity to respond is related to the number of trips an adult takes (e.g., adults taking more trips are less likely to respond), then estimates of the total number of trips may be subject to non-response bias. Adjustments to the survey weights aim to reduce bias due to non-response. However, even with such adjustments, it is important to have a plan to evaluate the potential for non-response bias, as described below.
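
Under the stochastic view cited above, the bias of an unadjusted respondent mean is often approximated (a standard textbook expression, not language from this plan) as

\[
\operatorname{bias}(\bar y_r) \;\approx\; \frac{\operatorname{cov}(p_i, y_i)}{\bar p},
\]

where \(p_i\) is the response propensity of unit i, \(y_i\) is the survey variable, and \(\bar p\) is the average propensity: if adults who take more trips are less likely to respond, the covariance is negative and unadjusted trip estimates would be biased downward.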


With respect to non-response, the largest concern for the 2015 NHTS is non-response in the recruitment phase. There are a variety of methods available to assess non-response bias. Previous research has suggested that each of these methods has strengths and weaknesses; thus, a multi-method approach is recommended for a comprehensive evaluation of non-response bias. In previous NHTS administrations, the recruitment phase has historically yielded the bulk of the non-response, due in part to the length and complexity of the required response. That is expected to be the case for the 2015 survey as well, even with the shift to a mail-based ABS approach. However, the ABS approach affords the opportunity to link in covariates at both the aggregate level (e.g., tract-level characteristics from the American Community Survey) and the address level, for use in non-response adjustment and in bias analyses. In addition, the recruitment survey has been shortened considerably in order to reduce the initial respondent burden.


The recruitment survey contains several variables (e.g., number of household members, gender and ethnicity) that may be associated with non-response to the retrieval phase and are associated with key survey outcome variables. Having this rich set of variables will be very useful for non-response adjustment and non-response bias analysis at the retrieval phase.


  3. Describe methods to maximize response rate:

In recent experience, the consultant has found the best combination of low-cost and effective data collection modes to be mail and web. Telephone can be used to provide minor improvements to participation rates for a few respondents who will not participate by mail or web. Engaging respondents using user-friendly instruments is an essential ingredient to the success of any mode. Incentives are effective, but large incentives for low-cost alternatives are not necessary or even desirable [14]. The survey design must appeal to the importance and relevance of the research, and rely on social exchange theory [15] as a way of generating responses. These principles have helped structure the research design.


The research approach is heavily influenced by the findings of previous research, especially Messer and Dillman (2011) [16] and Millar and Dillman (2011) [17], who explored ways of pushing sampled persons to the web to respond. Although the requirements for their surveys were very different from those of the NHTS, their findings do generalize in several ways to this work. Other work in this area is also informative, but the lack of adequate experimental design in many cases makes it more difficult to generalize those results [18, 19].


A flowchart depicting the recruitment process is presented in Figure 2 above. The initial recruitment package will include a letter signed by an appropriate Department of Transportation official, a $2 incentive, the survey and a Business Reply Envelope to be used to return the completed survey. A reminder postcard will be sent to each sampled address 7 days after the initial mailing. Three weeks after the initial mailing all non-responding addresses will be sent a second survey packet (without incentive).


The final request for participation will be sent 44 days after the first mailing to those who have not responded by mail. This postcard will provide an invitation to complete the recruitment survey on the web. We do not plan to attempt to call non-responding households at the recruitment stage because previous Westat studies have shown very little success in supplementing the primary recruitment response mode using telephone outreach. Similarly, we plan to wait to offer a web recruitment response mode until the final postcard, as previous research (e.g., Smyth et al., 2010 [20]) has shown that offering more than one initial response mode reduces participation. The 44-day period allows the primary methodology to be fully executed before engaging the non-response protocol.


Figure 3 above provides the flow diagram overview for the retrieval process, including the multiple reminder contacts built into the data collection plan to facilitate accurate tracking and early reporting of all the places visited on the travel day. The retrieval stage begins with a travel log package sent to each recruited household. This package will contain a letter that explains the next steps, identifies the assigned travel date, discusses the incentive for completion and provides information about how to contact the consultant for questions or assistance in completing the survey. Help desk staff will be trained in all aspects of the study to enable them to efficiently respond to frequently asked questions, technology issues, and entering survey data onto the web. Also included in the package will be personalized travel logs for each household member ages 5 and older and a $5 cash incentive.


The contractor will begin the reminder process when the travel log package is mailed and will continue to prompt non-responding households using email and telephone follow-up. Most households are expected to respond to the retrieval survey via web. Those that indicate web as their preference will not be contacted by an interviewer during the first 2 days following their assigned travel day. This retrieval response follow-up will only be available for households that either provide a telephone number in the recruitment instrument or have telephone numbers matched to the original ABS frame.


Consistent with past administrations of the NHTS, the consultant will implement the 7-day retrieval data collection window, in which all attempts to initiate contact with participating households will end after seven days. Most regional household travel surveys, especially those in regions using advanced travel demand models, have adopted a definition of a “completed” household that requires 100 percent of eligible household members to complete the survey. Using this approach increases the usability of the NHTS data across the data user community. FHWA is adopting this definition for the 2015 NHTS.


The 2015 NHTS sampled households will be drawn in two waves. Response rates will be monitored across regions during the initial data collection period, and the targets for the second sample selection will be adjusted to better target low response regions.


In order to facilitate responses from those with disabilities, the contractor will ensure that all web and CATI instruments meet Section 508 compliance using the rules specified in sections 1194.22, ‘Web-based intranet and internet information and applications,’ and 1194.23, ‘Telecommunications products.’


All NHTS 2015 materials will be available in both English and Spanish language forms to help maximize participation by Hispanic households. Since the 2001 survey, the contractor has also offered bi-lingual interviewers. An evaluation of previous surveys indicated that interviewing in Spanish was an important factor in gaining the cooperation of Hispanic respondents, a rapidly increasing proportion of the overall population. The Spanish translations of survey materials were developed using industry standards including reverse-translation protocols.


  4. Describe tests of procedures or methods:


Mixed Mode Surveys

Experiments conducted by Westat [21, 22], and confirmed by others [23, 24], using mixed-mode collection methods have found that telephone is not the best mode for conducting interviews with address-based samples. This does not imply that telephone data collection cannot be used to enhance response rates, but that it is not an effective choice as the primary mode. So when pairing a data collection mode with an ABS frame, researchers have begun to look beyond the traditional telephone-based approach. It is this research that provides the foundation for the methodological approach to be used for the 2015 NHTS.


The mail back recruitment approach described here has been tested and found to be successful in several surveys funded by the Federal Government (e.g., the National Crime Victimization Survey); these surveys have proven this method can be implemented with large sample sizes covering vast geographic regions. For these reasons, this approach has been selected in response to declining recruitment rates. In recent studies the consultant has observed response rates ranging from 28 to 38 percent with the proposed method.


Testing and Review

All field materials were subject to a series of tests to assess their utility and validity as survey instruments. These tests are described below.


Expert Review

The first step in developing a revised questionnaire and field materials for the NHTS was to conduct an expert review of existing materials to identify potential improvements. In the expert review process, a questionnaire design expert conducted a thorough appraisal of the questionnaire, providing guidance on refinement and further development activities factoring in the transition from telephone- to web-based data collection modes. The expert reviewer judged features of the instrument such as ease and efficiency of completion, avoidance of errors, ability to correct mistakes, and how the content is presented (overall appearance and layout of the instrument). The review process also included an appraisal of all other survey materials, including letters, travel logs, instructions, and reminders.

Cognitive Testing


After the questionnaire and materials were revised, the consultant conducted cognitive testing using in-depth, semi-structured interviews to gather insights into the cognitive sources of potential misinterpretation (Willis, 2005). This qualitative methodology focuses on examining the respondents’ thought processes, allowing survey researchers to identify and refine question wordings that are either misunderstood or understood differently by different respondents; instructions that are insufficient, overlooked, or misinterpreted; ambiguous instructions; and confusing response options. In addition to question wording, cognitive testing can reveal any issues with the visual presentation of the survey and materials. Visual design elements such as color, contrast, or spacing are crucial to correctly guide respondents through an instrument. Visual design elements were also reviewed by the FHWA Data Visualization Center for ease of instructions, placement, font size, and visual graphics.


In keeping with OMB regulations, the cognitive testing was limited to nine respondents per instrument (recruitment survey, travel log, web-based survey, and printed materials). The consultant conducted two sets of 9 interviews with adults ages 18 and older:

  • 9 interviews to test materials to be used for the NHTS and

  • 9 interviews to test the recruitment questionnaire for the NHTS.


Respondents were recruited using Craigslist advertisements posted in the local Washington, D.C. area. Recruitment was purposive and sought adults with a variety of day-to-day travel experiences (e.g., public transit commuters, car commuters, carpoolers). Interviews were conducted with 9 women and 9 men, ranging in age from 22 to 58. In addition, respondents represented a mix of races and education levels. The interviews were completed in person at the consultant’s cognitive testing lab. All cognitive interviews were conducted by senior survey methodologists with extensive experience using cognitive interview methods to test survey questionnaires and materials. Results were compiled and presented to FHWA for consideration.


Survey Pretest

The pretest includes all facets of data collection and will be designed to complete interviews with up to 500 households drawn from two regions. The Pretest provides an opportunity to employ and evaluate all planned protocols and materials, and to determine whether changes are needed before the main survey is conducted. If significant issues are revealed through the conduct of the pretest, it is possible that the effort will need to be repeated.


At the conclusion of the Survey Pretest, the consultant will update the Survey Plan to include changes resulting from the pretest. The final Survey Plan will provide the FHWA with complete documentation for how the 2015 NHTS will be conducted. Details will include survey methodology, questionnaire design, sampling, non-response analysis, estimation procedures, and procedures and protocols used to conduct the study.




  5. Provide name and telephone number of individuals who were consulted on statistical aspects of the IC and who will collect and/or analyze the information:

J. Michael Brick, Ph.D. 301-294-2004

Jill Montaquila 301-517-4046

Shelly Brock 301-517-8042




Appendices:


  1. National Household Travel Survey, Compendium of Uses, January 2014 - June 2014 (http://nhts.ornl.gov/2009/pub/Compendium_2014.pdf)

  2. Federal Register 60 Day Notice

  3. Federal Register 30 Day Notice

  4. Travel Log and Exemplar log.

  5. Odometer Sheet

  6. Summary of 2015 NHTS changes in questions from the 2009 NHTS

  7. Invitation Letter

  8. Recruitment Survey Instrument

  9. Business Reply Envelope

  10. Reminder Postcard 1

  11. 2nd Package Letter to Non-respondents

  12. Reminder Postcard 2 with Web Invitation

  13. Travel Package Letter

  14. Business Reply Envelope

  15. Retrieval Survey

  16. # 10 Incentive Envelope

  17. Summary of 2015 NHTS changes in questions from the 2009 NHTS

  18. Example of Web Retrieval survey

  19. Transportation Research Circular, No. E-C178, October 2013 (http://onlinepubs.trb.org/onlinepubs/circulars/ec178.pdf)


1 Blumberg, S.J., and Luke, J.V. (2014). Wireless substitution: Early release of estimates from the National Health Interview Survey, July – December 2013. National Center for Health Statistics. Available from http://www.cdc.gov/nchs/nhis.htm.

2 Roth, S.B., Han, D. and Montaquila, J.M. (2013). The ABS Frame: Quality and considerations. Survey Practice, 6(4). Available from http://surveypractice.org/index.php/SurveyPractice/article/view/73/pdf.

3 Iannacchione, Vincent G. The changing role of address-based sampling in survey research. Public Opinion Quarterly 75.3 (2011): 556-575.

4 Iannacchione (2011) showed that the number of addresses in the CDS and its No-Stat file combined is virtually identical to the Census Bureau’s estimate of the number of housing units; the author noted that while there is not a one-to-one association between these, this offers evidence of virtually complete coverage.

5 Shook-Sa, B. E., Currivan, D. B., McMichael, J. P., and Iannacchione, V. G. (2013). Extending the Coverage of Address-Based Sampling Frames Beyond the USPS Computerized Delivery Sequence File. Public opinion quarterly, 77(4), 994-1005.

6 Blumberg, S.J., and Luke, J.V. (2010). Wireless substitution: Early release of estimates from the National Health Interview Survey, July-December 2009. National Center for Health Statistics. May 2010. Available from http://www.cdc.gov/nchs/nhis.htm.

7 Boyle, J., Bucuvalas, M., Piekarski, L., and Weiss, A. (2009). Zero banks: Coverage error and bias in RDD samples based on hundred banks with listed numbers. Public Opinion Quarterly, 73(4), 729-750.

8 Fahimi, M., Kulp, D., and Brick, J.M. (2009). A reassessment of list-assisted RDD methodology. Public Opinion Quarterly, 73(4), 751-760.

9 Barron, M. and Zhao, Z. (2010). Measuring undercoverage of landline telephone population in 1+ 100 bank surveys. Presented at the 65th Annual Conference of the American Association for Public Opinion Research.

10 Barron, M., Barron, M., Kelly, J., Montgomery, R., Singleton, J., Shin, H. C., Skalland, B., Tao, X., and Wolter, K. (2013). More on the extent of undercoverage in RDD telephone surveys due to the omission of 0-banks. Survey Practice, 3(2).

11 Särndal, C. E., and Lundström, S. (2005). Estimation in surveys with nonresponse. John Wiley & Sons.

12 Groves, R. M., Couper, M. P., Presser, S., Singer, E., Tourangeau, R., Acosta, G. P., and Nelson, L. (2006). Experiments in producing nonresponse bias. Public Opinion Quarterly, 70(5), 720-736.

13 Montaquila, J. M., Brick, J. M., Hagedorn, M. C., Kennedy, C., and Keeter, S. (2007). Aspects of nonresponse bias in RDD telephone surveys. In Advances in telephone survey methodology (pp. 561-586). NJ: Wiley & Sons.

14 Han, D., Montaquila, J., and Brick, J.M. (2013). An evaluation of incentive experiments in a two-phase address-based sample mail survey. Survey Research Methods, 7, 207-218.

15 Dillman, D., Smyth, J., and Christian, L. (2009). Internet, mail, and mixed-mode surveys: The tailored design method. Hoboken, NJ: John Wiley & Sons.

16 Messer, B.L., and Dillman, D. (2011). Surveying the general public over the internet using address-based sampling and mail contact procedures. Public Opinion Quarterly, 75(3), 429-457.

17 Millar, M., and Dillman, D. (2011). Improving response to web and mixed-mode surveys. Public Opinion Quarterly, 75, 249-269.

18 Ormond, B.A., Triplett, T., Long, S.K., Dutwin, D., and Rapoport, R. (2010). 2009 District of Columbia Health Insurance Survey: Methodology report. Available from http://www.urban.org/publications/1001376.html.

19 Ormond, B.A., Triplett, T., Long, S.K., Dutwin, D., and Rapoport, R. (2010). 2009 District of Columbia Health Insurance Survey: Methodology report. Available from http://www.urban.org/publications/1001376.html.

20 Smyth, J.D., Dillman, D.A., Christian, L.M., and O’Neill, A. (2010). Using the Internet to survey small towns and communities: Limitations and possibilities in the early 21st century. American Behavioral Scientist, 53, 1423-1448.

21 Brick, J.M., Williams, D., and Montaquila, J. (2011). Address-based sampling for subpopulation surveys. Public Opinion Quarterly, 75, 409-428.

22 Edwards, S., Brick, J.M., and Lohr, S. (2013). Sample performance and cost in a two-stage ABS design with telephone interviewing. Paper presented at AAPOR Conference, Boston, MA.

23 Leslyn, H., ZuWallack, R., and Eggers, F. (2012). Fair market rent survey design: Results of methodological experiment. U.S. Department of Housing and Urban Development. Available from http://www.huduser.org/portal/datasets/fmr/fmr2013p/FMR_Surveys_Experiment_Report_rev.pdf

24 Johnson, P., and Williams, D. (2010). Comparing ABS vs landline RDD sampling frames on the phone mode. Survey Practice, 3(3). Available from http://surveypractice.org/index.php/SurveyPractice/article/viewFile/251/pdf





