APPENDIX A
METHODOLOGY FOR THE 2010 GENERAL AVIATION
AND PART 135 ACTIVITY SURVEY
Purpose of Survey
The General Aviation and Part 135 Activity Survey (GA Survey) provides the Federal Aviation
Administration (FAA) with information on general aviation and on-demand Part 135 aircraft
activity. The survey enables the FAA to monitor the general aviation fleet so that it can
anticipate and meet demand for National Airspace System (NAS) facilities and services, assess
the impact of regulatory changes on the fleet, and implement measures to assure the safe
operation of all aircraft in the NAS. The data are also used by other government agencies, the
general aviation industry, trade associations, and private businesses to identify safety problems
and to form the basis for research and analysis of general aviation issues.
Background and History
Prior to the first implementation of the annual GA Survey in 1978, the FAA used the Aircraft
Registration Eligibility, Identification, and Activity Report (AC Form 8050-73) to collect data on
general aviation activity. The form was sent annually to all owners of civil aircraft in the United
States and served two purposes: a) Part 1 was the mandatory aircraft registration revalidation
form; and b) Part 2 was voluntary and applied to general aviation aircraft only, asking questions
on the owner-discretionary characteristics of the aircraft such as flight hours, avionics
equipment, base location, and use. The FAA used this information to estimate aircraft activity.
In 1978, the FAA replaced AC Form 8050-73 with a new system. Part 1 was replaced by a
triennial registration program. In January 1978, the FAA implemented a new procedure, known
as triennial revalidation, for maintaining its master file. Instead of requiring all aircraft owners to
revalidate and update their aircraft registration annually, the FAA only required revalidation for
those aircraft owners who had not contacted the FAA Registry for three years. This less
frequent updating affected the accuracy and representativeness of the master file: a) the accuracy of
information about current owners and their addresses deteriorated; and b) the master file
retained information on aircraft that would have been re-registered or purged from the file under
the previous revalidation system.
Part 2 of AC Form 8050-73 was replaced by the General Aviation Activity Survey. Conducted
annually, the survey was based on a statistically selected sample of aircraft, and it requested
the same type of information as Part 2 of AC Form 8050-73. The first survey took place in 1978
and collected data on the 1977 general aviation fleet.
In 1993, the name of the survey was changed to the General Aviation and Air Taxi Activity
Survey to reflect that the survey included air taxi (that is, on-demand Part 135) aircraft. Starting
in 1999, information about avionics equipment, which had been collected only every other year,
was requested every year. As a result, the survey’s name was changed to the General Aviation
and Air Taxi Activity and Avionics Survey. In 2006, “Part 135” replaced the term “Air Taxi” in the
survey title, the word “Avionics” was removed (though avionics data were still collected
annually), and the survey was named the General Aviation and Part 135 Activity Survey. This is
the name under which the 2010 survey was conducted. The 2010 statistics in this report were
derived from the thirty-third GA Survey, which was implemented in 2011.
A-1
Appendix A: Methodology for the 2010 General Aviation
and Part 135 Activity Survey
The GA Survey has undergone periodic revisions to content, implementation, and definition of
the GA population in order to remain current with regulations, activity patterns, and aviation
technology. Tables A.1 through A.3 summarize changes in the form or content of the survey, the
protocol for collecting data, and the sample design, including changes in how the survey
population is defined.
Table A.1: Changes in Form or Content of Survey Questionnaire, by Survey Year

Year  Change in form or content of survey questionnaire

1993  Added sightseeing and external load to use categories

1996  Added public use (i.e., flights for the purpose of fulfilling a government function) to use categories
      Significant re-design of the entire survey form to reduce item non-response, add new content, and be
      compatible with optical scanning

1999  Added air medical services to use categories
      Discontinued the use of a catch-all “other” category as used in previous years
      Began collecting avionics data every year, rather than every other year

2000  “Public use” asked as a separate question, independent of other use categories (e.g., business
      transportation), because it was not mutually exclusive with respect to other flight activity

2002  Use categories refined to be mutually exclusive and exhaustive and match definitions used by National
      Transportation Safety Board (NTSB) for accident reporting

2004  Air medical services was divided into two separate types to capture air medical flights under Part 135 and
      air medical flights not covered by Part 135
      A more clearly defined “other” category was reintroduced
      Fractional ownership question was changed from yes/no to a percentage

2005  Reduced the number of fuel type response categories by removing obsolete options
      Average fuel consumption (in gallons per hour) was added
      Revised questions about avionics equipment by adding and rearranging items
      Location of aircraft revised to ask the state or territory in which the aircraft was “primarily flown” during
      the survey year rather than where it was “based” as of December 31st of the survey year
      Percentage of hours flown in Alaska was added

2007  Questions on percentage of hours flown under different flight plans, flight conditions, and day/night were
      revised into a single tabular format
      Number of types of landing gear systems was expanded
      Ice protection equipment was revised and prohibition from flight in icing conditions was added
      Questions about avionics equipment were revised to reflect changes in technology

2009  Two questions about avionics equipment were revised:
      “Air Bag/Ballistic Parachute” was asked as two items–“Air Bag” and “Ballistic Parachute”
      “ADS-B (Mode S)” was separated into two questions–“ADS-B (Mode S) Transmit Only (Out)” and
      “ADS-B (Mode S) Transmit and Receive (In)”

2010  Removed the skip instruction in the mail survey based on responses to Part 121/129 operations
      Added “Specify” option if reason not flown was “Other”
Table A.2: Changes in Data Collection Methodology, by Survey Year

Year  Change in data collection methodology

1999  Non-respondent telephone survey conducted to adjust active aircraft and hours flown estimates (1)

2000  Discontinued non-respondent telephone survey because of the variability of telephone non-respondent
      factors
      Added Internet response option

2003  Added a reminder/thank-you postcard between the first and second mailings

2004  Introduced “large fleet” summary form to allow owners/operators of multiple aircraft to report aggregate
      data for their entire fleet on a single form
      Initiated telephone follow-up effort to contact owners/operators of multiple aircraft who had not responded.
      (Protocol encourages and facilitates participation by providing alternate forms and offering technical
      assistance but survey is not conducted by telephone.)

2009  Initiated telephone follow-up effort to contact owners/operators of single aircraft who completed a partial
      survey. (Protocol encourages and facilitates participation by offering technical assistance but survey is not
      conducted by telephone.)
      Mailed end-of-field-period follow-up postcard to owners/operators of single aircraft that participated the
      previous survey year but had not yet completed the current year’s survey

2010  Discontinued telephone follow-up with owners/operators of single aircraft who completed a partial survey
      due to low effectiveness
      Telephone follow-up efforts with owners/operators of multiple aircraft included some collection of key
      variables by telephone and were not limited to encouraging participation.
Table A.3: Changes in Sample Design or Definition of Survey Population, by Survey Year

Year  Change in sample design or survey population

1993  Number of aircraft types classified by the sample was expanded from 13 to 19

1999  Sample design revised to stratify by aircraft type (19 categories) and FAA region (9 categories) (2)

2003  Aircraft with known incorrect addresses and identified as “Postmaster Return” status on the Registry were
      retained in the definition of the survey population and eligible for sample selection
      Aircraft listed on the Registry as “registration pending” or “sold” (if sold status less than 5 years ago) were
      retained in the definition of the survey population and eligible for sample selection

2004  Sample design revised to stratify by aircraft type (19 categories), FAA region (9 categories), and whether
      the aircraft is owned by an entity certified to fly Part 135 (2 categories)
      Introduced 100% samples of turbine aircraft, rotorcraft, on-demand Part 135, and Alaska-based aircraft

2005  Introduced light-sport aircraft as a 20th aircraft type sampled at 100 percent. Light-sport included aircraft
      with special or experimental airworthiness as well as aircraft for which airworthiness was not yet final.

2006  Sample design simplified to 14 aircraft types (removed distinctions based on number of seats and
      eliminated “Other” subcategories of piston, turboprop, and turbojet aircraft)
      Sample design included 100 percent sample of aircraft manufactured in the past five years

2008  100 percent sample of light-sport aircraft was limited to special light-sport aircraft. Experimental light-sport
      and light-sport without completed airworthiness sampled at a rate less than 1.0. Results in sample design
      with 15 aircraft types.

2010  Aircraft excluded from the survey population if “sale reported” or “registration pending” more than 12
      months. These aircraft no longer eligible for sample selection.
(1)  Telephone surveys of non-respondents also were conducted in 1977, 1978, 1979, 1997, and 1998. Please refer to
the 1999 GA Survey report for a full discussion of the telephone survey of non-respondents.
(2)  Before 1999, the sample was stratified by aircraft type (19 categories) and state/territory (54 categories).
Survey Population and Survey Sample
The survey population for the 2010 General Aviation and Part 135 Activity Survey includes all
civil aircraft registered with the FAA that are based in the US or US territories and that were in
existence and potentially active between January 1 and December 31, 2010. This includes
aircraft operating under:
•  Part 91: General operating and flight rules
•  Part 125: Certification and operations: Airplanes having a seating capacity of 20 or more passengers or a
   maximum payload capacity of 6,000 pounds or more (but not for hire)
•  Part 133: Rotorcraft external load operations
•  Part 135: On-demand (air taxi) and commuter operations not covered by Part 121
•  Part 137: Agricultural aircraft operations.
Aircraft operating under Part 121 as defined in Part 119 are excluded from the survey
population. Foreign air carriers, which operate under Part 129, are also not part of the survey
population. Civil aircraft that are known not to be potentially active during the survey year are
excluded from the population (e.g., aircraft on static display, destroyed prior to January 1, 2010).
The Aircraft Registration Master File, maintained by the FAA’s Mike Monroney Aeronautical
Center in Oklahoma City, Oklahoma, serves as the sample frame or list of cases from which a
sample of civil aircraft is selected. The Registration Master File (“Registry”) is the official record
of registered civil aircraft in the United States. For the purpose of defining the 2010 survey
population, we used the Registry’s list of aircraft as of December 31, 2010.
The Registry, like many sample frames, is an imperfect representation of the survey population.
While it may exclude a small number of aircraft that operate under the FAA regulations
governing the operation of general aviation and on-demand Part 135 aircraft, it also includes
aircraft that are not part of the survey population. Prior to sample selection, several steps are
taken to remove ineligible aircraft from the sample frame. Specifically, this includes removing
the following:
•  aircraft whose registration has been cancelled or revoked
•  aircraft based in Europe or registered to a foreign company that have not returned flight hour reports
•  aircraft that operate under Part 121
•  aircraft destroyed or moved to static display prior to January 1, 2010
•  aircraft that are flagged Postmaster Return (known to have incorrect address information) prior to 2001
   (10 years prior to survey year)
•  aircraft that lack information necessary to execute the sample design (i.e., aircraft type, FAA region)3
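The rules above amount to a record-by-record filter applied to the Registry extract before sample selection. The following is a minimal sketch of such a filter; the field names (registration_status, operates_part_121, and so on) are hypothetical stand-ins, not the Registry's actual layout.

```python
from datetime import date

SURVEY_YEAR_START = date(2010, 1, 1)

def eligible_for_frame(rec: dict) -> bool:
    """Return True if a Registry record survives the frame-cleaning rules above.
    Field names are hypothetical stand-ins for the Registry's actual layout."""
    if rec["registration_status"] in {"cancelled", "revoked"}:
        return False
    if rec["based_abroad_without_flight_hour_reports"]:
        return False
    if rec["operates_part_121"]:
        return False
    destroyed = rec.get("destroyed_or_static_display_date")
    if destroyed is not None and destroyed < SURVEY_YEAR_START:
        return False
    pmr_year = rec.get("postmaster_return_year")
    if pmr_year is not None and pmr_year < 2001:   # flagged 10+ years before the survey year
        return False
    if rec["aircraft_type"] is None or rec["faa_region"] is None:
        return False                               # cannot be placed in the sample design
    return True
```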
Two criteria for defining the population were added or revised in 2010 to be consistent with the
FAA’s re-registration rule that became effective October 1, 2010.4 The re-registration rule
requires all US civilian aircraft to re-register every three years. The rule also limits the length of
time that an aircraft can hold a temporary status, such as “sale reported” or “registration
pending.” The re-registration component of the rule will be phased-in over a three-year period
from 2011 through 2013 and therefore does not affect the definition of the 2010 population.
However, the restrictions on temporary statuses became effective October 1, 2010. As a result,
the 2010 population excluded the following aircraft:
•  aircraft listed as “sale reported” more than 12 months (prior to January 1, 2010)
•  aircraft listed as “registration pending” more than 12 months (prior to January 1, 2010)
Since the 2004 survey year, the survey population retained aircraft listed as “sale reported”
within the previous five years and all aircraft with “registration pending,” regardless of duration.
The time limits applied in the 2010 survey are consistent with a 6-month restriction on sale
reported and a 12-month restriction on registration pending under the new rule.5
The Registry included 373,896 aircraft as of December 31, 2010. This represents a decrease of
less than one-half percent (0.13 percent) from the 2009 Registry file (374,373 records). After
excluding the aircraft described above, 304,334 records remain, which is 81.4 percent of the
Registry as of December 31, 2010. The 2010 survey population of 304,334 represents a
decrease of 1.8 percent from 2009 (309,811).
The 2010 GA Survey Sample
The 2010 survey sample design is the same as that for the 2009 survey year. The sample is
stratified by aircraft type (15 categories), FAA region in which the aircraft is registered (9
categories), whether the aircraft operates under a Part 135 certificate (2 categories), and
whether the aircraft was manufactured in the past 5 years (2 categories). Aircraft operated
under a Part 135 certificate were identified using the FAA’s Operations Specifications
Subsystem (OPSS) database that was merged with the Registry by N-number. The four
stratifying variables yield a matrix of 540 cells.
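As a rough illustration of how the four stratifying variables define the design matrix, the snippet below crosses the 15 aircraft types of Table A.4 with 9 FAA regions, the Part 135 indicator, and the recent-manufacture indicator. The region labels and boolean encodings are placeholders, not the FAA's internal codes.

```python
from itertools import product

# The 15 aircraft-type labels follow Table A.4; other categories are simple stand-ins.
AIRCRAFT_TYPES = [
    "FW piston (1 eng)", "FW piston (2 eng)", "FW turboprop (1 eng)",
    "FW turboprop (2 eng)", "FW turbojet", "Rotorcraft (piston)",
    "Rotorcraft (turbine, 1 eng)", "Rotorcraft (turbine, multi-eng)",
    "Glider", "Lighter-than-air", "Experimental (amateur)",
    "Experimental (exhibition)", "Experimental (other)",
    "Light-sport (experimental)", "Light-sport (special)",
]
FAA_REGIONS = [f"Region {i}" for i in range(1, 10)]   # 9 FAA regions
PART_135 = [True, False]                              # operates under a Part 135 certificate
RECENT_MFG = [True, False]                            # manufactured in the past 5 years

# Cross the four stratifying variables to enumerate the design matrix.
strata = list(product(AIRCRAFT_TYPES, FAA_REGIONS, PART_135, RECENT_MFG))
assert len(strata) == 15 * 9 * 2 * 2 == 540           # 540 cells, as stated in the text
```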
We define 15 aircraft types to execute the sample design as shown in Table A.4. The
classification distinguishes among fixed wing aircraft, rotorcraft, experimental aircraft, light-sport,
and other aircraft. Within the major categories of fixed wing and rotorcraft, we differentiate
aircraft by type and number of engines (e.g., piston, turboprop, turbojet, turbine, single- and
two-engine). Experimental aircraft are subdivided by amateur-built status and airworthiness
certification, and we classify “other” aircraft as gliders or lighter-than-air. Light-sport is
subdivided into special and experimental based on airworthiness certification. Light-sport aircraft
for which airworthiness certificates are not yet final are included with experimental light-sport.

3  The number of aircraft missing this information is typically very small.
4  “Re-Registration and Renewal of Aircraft Registration,” Department of Transportation, Federal Aviation
Administration. Federal Register Vol. 75, No. 138, Tuesday July 20, 2010.
5  Although the re-registration rule states that registrations will be cancelled if aircraft are listed as “sale reported” more
than six months without filing new registration materials, we allow 12 months before excluding aircraft from the
population. These aircraft could have valid registrations part of the year and potentially be operating.
Prior to the 2006 survey year, we defined 20 aircraft types and distinguished aircraft by size as
well as by type and number of engines. We eliminated subcategories based on number of seats
to increase the efficiency of the sample. We also eliminated three “other” categories, because
improvements in the Registry have left few aircraft in these residual categories.6
Although the sample design uses 15 aircraft types, statistical estimates are reported for 18
types, further differentiating aircraft by number of engines and number of seats. Starting in
2009, estimates were reported separately for experimental- and special light-sport.
Table A.4: Aircraft Types Used for Sample Design and for Reporting Survey Results

Aircraft Types in the Sample Design (15 categories):
   Fixed wing piston (1 engine)
   Fixed wing piston (2 engines)
   Fixed wing turboprop (1 engine)
   Fixed wing turboprop (2 engines)
   Fixed wing turbojet
   Rotorcraft (Piston)
   Rotorcraft (Turbine, 1 engine)
   Rotorcraft (Turbine, multi-engine)
   Glider
   Lighter-than-air
   Experimental (Amateur)
   Experimental (Exhibition)
   Experimental (Other)
   Light-sport (Experimental)
   Light-sport (Special)

Aircraft Types for Reporting Results (18 categories):
   Fixed wing piston (1 engine, 1-3 seats)
   Fixed wing piston (1 engine, 4 or more seats)
   Fixed wing piston (2 engines, 1-6 seats)
   Fixed wing piston (2 engines, 7 or more seats)
   Fixed wing turboprop (1 engine)
   Fixed wing turboprop (2 engines, 1-12 seats)
   Fixed wing turboprop (2 engines, 13 or more seats)
   Fixed wing turbojet
   Rotorcraft (Piston)
   Rotorcraft (Turbine, 1 engine)
   Rotorcraft (Turbine, multi-engine)
   Glider
   Lighter-than-air
   Experimental (Amateur)
   Experimental (Exhibition)
   Experimental (Other)
   Light-sport (Experimental)
   Light-sport (Special)

6  The following three categories were eliminated: Fixed wing piston–other, fixed wing turboprop–other, and fixed wing
turbojet–other. The few aircraft in the major category that cannot be classified are assigned to the modal category for
that group (e.g., unclassifiable fixed wing turboprops are assigned to fixed wing turboprop–2 engines, 1-12 seats).
Aircraft Sampled at 100 Percent
The 2010 survey sample included several types of aircraft that were sampled at a rate of 1.0.
Because of the FAA’s interest in better understanding the operation of these aircraft, all such
aircraft listed in the Registry were included in the survey sample to ensure a sufficient number of
responses to support analysis and provide more precise estimates of fleet size and aircraft
activity. These include:
•  100 percent sample of turbine aircraft (turboprops and turbojets)
•  100 percent sample of rotorcraft
•  100 percent sample of special light-sport aircraft
•  100 percent sample of aircraft operating on-demand Part 135
•  100 percent sample of aircraft based in Alaska7
•  100 percent sample of aircraft manufactured within the past 5 years (since 2006 inclusive).
Since 2004, the survey design has included 100 percent samples of turbine aircraft, rotorcraft,
aircraft certificated to operate under Part 135, and Alaska-based aircraft. In 2005, we added the
100 percent sample of light-sport aircraft. In 2006, we added the 100 percent sample of
recently-manufactured aircraft. In 2008, we revised the 100 percent sample of light-sport aircraft
to include only special light-sport aircraft. Experimental light-sport and those without final
airworthiness documentation are sampled at less than 100 percent but in sufficient numbers to
support statistical estimates of flight activity. Altogether the aircraft sampled at 100 percent
contributed 61,445 observations to the 2010 survey sample.
Aircraft Sampled at Less than 100 Percent
Aircraft that are not part of a 100 percent sample are subject to selection based on sampling
fractions defined for each cell in the sample design matrix. Annual flight hours is the primary
measure needed by the FAA to address survey goals. Sample fractions for each stratum
are defined to optimize sample size to obtain a desired level of precision for an estimate of flight
activity. Data from the previous survey year on average hours flown, variability in hours flown by
region and aircraft type, and response rates are used to set precision levels and identify the
optimal sample size for each stratum. Aircraft are randomly selected from each cell in the matrix,
subject to the desired sample size. Strata that yield a very small sample size are examined and
adjusted to include all observations in the stratum if necessary. In 2010, an additional 23,537
aircraft were sampled at a rate of less than 1.0.
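The report does not publish its allocation formula, but the optimization it describes (using prior-year variability in hours flown and expected response to hit a precision target) resembles a Neyman-style allocation. The sketch below is offered only in that spirit; the stratum labels, figures, and the response-rate inflation step are assumptions, not the survey's actual procedure.

```python
import math

def neyman_allocation(strata, total_n):
    """Allocate a total sample size across strata in proportion to N_h * S_h
    (population size times standard deviation of hours flown), inflate each
    allocation by the expected response rate, and cap it at the stratum
    population. A textbook Neyman-style allocation, shown only to illustrate
    the kind of optimization described in the text.
    `strata` maps a stratum id to (N_h, S_h, expected_response_rate)."""
    shares = {h: N * S for h, (N, S, _) in strata.items()}
    total_share = sum(shares.values())
    alloc = {}
    for h, (N, S, rr) in strata.items():
        n_h = total_n * shares[h] / total_share   # Neyman share of the sample
        n_h = math.ceil(n_h / rr)                 # inflate for expected non-response
        alloc[h] = min(n_h, N)                    # never exceed the stratum population
    return alloc

# Hypothetical strata: (population, std. dev. of hours flown, expected response rate)
example = {
    "FW piston, 1 eng, Region 3, non-135, older": (12_000, 85.0, 0.45),
    "FW piston, 2 eng, Region 5, non-135, older": (3_500, 140.0, 0.40),
}
print(neyman_allocation(example, total_n=1_500))
```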
The 2010 GA Survey sample included 84,982 aircraft. Table A.5 summarizes the population
counts and sample sizes by aircraft type.
7  Alaska-based aircraft are identified by the state listed in the Registry file.
Table A.5: Population and Survey Sample Counts by Aircraft Type

Aircraft Type                 Population    Sample Size    Sample as Percent of Population
Fixed Wing - Piston              207,024         30,586        14.8
  1 engine, 1-3 seats             60,300          6,786        11.3
  1 engine, 4+ seats             125,479         15,744        12.5
  2 engines, 1-6 seats            14,447          5,225        36.2
  2 engines, 7+ seats              6,798          2,831        41.6
Fixed Wing - Turboprop            10,253         10,253       100.0
  1 engine                         4,508          4,508       100.0
  2 engines, 1-12 seats            4,566          4,566       100.0
  2 engines, 13+ seats             1,179          1,179       100.0
Fixed Wing - Turbojet             12,566         12,566       100.0
  2 engines                       12,566         12,566       100.0
Rotorcraft                        12,615         12,615       100.0
  Piston                           5,125          5,125       100.0
  Turbine (1 engine)               5,799          5,799       100.0
  Turbine (multi-engine)           1,691          1,691       100.0
Other Aircraft                     9,482          4,261        44.9
  Glider                           3,059          1,763        57.6
  Lighter-than-air                 6,423          2,498        38.9
Experimental                      41,994         10,518        25.0
  Amateur                         36,641          6,543        17.9
  Exhibition                       3,115          1,994        64.0
  Other                            2,238          1,981        88.5
Light-sport                       10,400          4,183        40.2
  Experimental*                    8,626          2,409        27.9
  Special                          1,774          1,774       100.0
Total                            304,334         84,982        27.9

* Includes light-sport aircraft with experimental airworthiness and light-sport aircraft for which
airworthiness certification is not final
Weighting the Survey Data
Data from completed surveys are weighted to reflect population characteristics. The weights
reflect the proportion of aircraft sampled from the population in each sample strata and
differential response as well as adjustment for aircraft that are not part of the survey population.
Initially, each aircraft for which we receive a completed survey is given a weight that reflects
sampling fraction and differential response. That is:
WEIGHT = (Population N_ijkl / Sample N_ijkl) * (Sample N_ijkl / N Respondents_ijkl)
where i, j, k, and l represent the four sample strata of aircraft type, FAA region, Part 135 status,
and whether an aircraft was manufactured in the past 5 years.
The weight is subsequently adjusted to reflect new information about non-general aviation
aircraft. That is, survey responses that identify an aircraft as not being part of the survey
population—e.g., destroyed prior to January 1, 2010; displayed in a museum; or operated
primarily as an air carrier under Part 121 or 129—are used to remove aircraft proportionally from
the sample and from the population. This adjustment is done at the level of the 15 aircraft types.
The procedure assumes that ineligible aircraft occur in the same proportion among survey
respondents and non-respondents. To the extent that ineligible aircraft are less likely to receive
and complete a survey, this approach will underestimate the adjustment for aircraft that are not
part of the general aviation population.
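A minimal sketch of these two steps follows, assuming the stratum weight reduces to population over respondents and that the ineligibility rate observed among respondents for an aircraft type is applied to that type's population count. The counts used here are hypothetical.

```python
def base_weight(pop_n, sample_n, respondents_n):
    """Stratum weight reflecting sampling fraction and differential response:
    (Population/Sample) * (Sample/Respondents) = Population/Respondents."""
    return (pop_n / sample_n) * (sample_n / respondents_n)

def eligible_population(pop_n, respondents_n, ineligible_n):
    """Scale a population count down by the proportion of respondents found to
    be out of scope, assuming ineligibles occur at the same rate among
    non-respondents (the assumption stated in the text)."""
    return pop_n * (1.0 - ineligible_n / respondents_n)

# Hypothetical stratum: 4,200 aircraft, 640 sampled, 280 completed surveys.
w = base_weight(4_200, 640, 280)                 # 15.0: each respondent stands for 15 aircraft
# If 7 of the 280 respondents report the aircraft was destroyed before 2010:
n_eligible = eligible_population(4_200, 280, 7)  # about 4,095 aircraft assumed in scope
print(w, round(n_eligible))
```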
Errors in Survey Data
Errors associated with survey data can be classified into two types—sampling and non-sampling errors. Sampling errors occur because the estimates are based on a sample of aircraft
rather than the entire population and we can expect, by chance alone, that some aircraft
selected into the sample differ from aircraft that were not selected.
Non-sampling errors can be further subdivided into a) errors that arise from difficulties in the
execution of the sample (e.g., failing to obtain completed interviews with all sample units), and
b) errors caused by other factors, such as misinterpretation of questions, inability or
unwillingness to provide accurate answers, or mistakes in recording or coding data.
Sampling Error
The true sampling error is never known, but in a designed survey we can estimate the potential
magnitude of error due to sampling. This estimate is the standard error. The standard error
measures the variation that would occur among the estimates from all possible samples of the
same design from the same population.
This publication reports a standard error for each estimate based on survey sample data. An
estimate and its standard error can be used to construct an interval estimate (“confidence
interval”) with a prescribed level of confidence that the interval contains the true population
figure. In general, as standard errors decrease in size we say the estimate has greater precision
(the confidence interval is narrower), while as standard errors increase in size the estimate is
less precise (the confidence interval is wider). Table A.6 shows selected interval widths and
their corresponding confidence.
Table A.6: Confidence Interval Estimates

Width of interval       Approximate confidence that interval includes true population value
1 Standard error        68%
2 Standard errors       95%
3 Standard errors       99%
This report presents a “percent standard error” for each estimate, which is the standard error
relative to the mean. The percent standard error is the ratio of the standard error to its estimate
multiplied by 100. For example, if the estimate is 4,376 and the standard error is 30.632, then
the percent standard error is (30.632/4,376) x 100 = 0.7. Reporting percent standard errors
makes it possible to compare the precision of estimates across categories.
Estimates and percent standard errors reported in Table 2.1 in Chapter 2 ("Population Size,
Active Aircraft, Total Flight Hours, and Average Flight Hours by Aircraft Type") provide an
example of how to compute and interpret confidence intervals. To obtain a 95 percent
confidence interval for the estimated number of total hours flown for twin-engine turboprops in
2010, where the total hours flown is estimated to be 1,238,407 and the percent standard error of
the estimate is 2.7, the following computation applies:
Lower confidence limit: 1,238,407 – 1.96(2.7/100)(1,238,407) = 1,172,871
Upper confidence limit: 1,238,407 + 1.96(2.7/100)(1,238,407) = 1,303,943
In other words, if we drew repeated samples of the same design, 95 percent of the estimates of
the total hours flown by twin-engine turboprops would fall between 1,172,871 and 1,303,943.
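The percent standard error and confidence limits follow directly from the definitions above; the short sketch below reproduces the two worked examples from the text (the 1.96 multiplier corresponds to the 95 percent level in Table A.6).

```python
def percent_standard_error(estimate, standard_error):
    """Standard error expressed as a percentage of its estimate."""
    return standard_error / estimate * 100

def confidence_interval(estimate, pct_se, z=1.96):
    """Approximate 95 percent confidence interval from an estimate and its
    percent standard error, mirroring the worked example in the text."""
    half_width = z * (pct_se / 100) * estimate
    return estimate - half_width, estimate + half_width

print(round(percent_standard_error(4_376, 30.632), 1))   # 0.7, as in the text

# Total hours flown by twin-engine turboprops: 1,238,407 with a 2.7 percent standard error.
low, high = confidence_interval(1_238_407, 2.7)
print(round(low), round(high))   # approximately 1,172,871 and 1,303,943
```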
Non-sampling Error
Sampling error is estimable and can be reduced through survey design (e.g., by increasing
sample size), but it is difficult, if not impossible, to quantify the amount of non-sampling error.
Although extensive efforts are undertaken to minimize non-sampling error, the success of these
measures cannot be quantified.
Steps taken to reduce non-sampling error include strategies to reduce non-response and efforts
to minimize measurement and coding errors. To this end, implementation and design of the
2010 GA Survey incorporated the following steps to maximize cooperation among sample
members:
•  Two modes of administration to facilitate access to the survey—a postcard invitation to complete the
   survey on the Internet followed by a mail survey to be completed by pen or pencil.
•  Three mailings of the survey to individuals who had not yet responded, as well as a reminder/thank-you
   postcard and, for single-aircraft owners/operators, an end-of-field-period follow-up postcard.
•  Cover letters accompanying each survey mailing clearly explained the purpose of the survey and carried
   the endorsement (organizational logos) of several aviation associations.
•  Cover letters assured owners of the confidentiality of their responses and informed them: “Names of
   individuals are never associated with responses. There is an identification number on your survey only
   so [survey contractor] knows who should receive the survey.”
•  Use of additional sources to obtain updated contact information and help ensure the mail survey reaches
   the sample member (e.g., National Change of Address, updates from aviation associations).
•  Use of a toll-free telephone number and e-mail address to respond to questions.
•  Collaboration with aviation organizations and industry groups to encourage cooperation of
   owners/operators of multiple aircraft.
•  Telephone follow-up to owners/operators of multiple aircraft who had not yet responded.
The survey efforts also minimize measurement error by increasing the likelihood that
respondents share a common understanding of survey questions and reducing errors in data
coding. These efforts include:
•  Close collaboration with the FAA, other federal agencies, and aviation groups to refine and clarify
   question wording as well as definitions to questions. The questionnaire is re-examined each year to
   identify ambiguities or revisions necessary to remain consistent with aviation regulations and definitions.
•  The questionnaire has been re-designed periodically (see “Background” section of this report), and
   significant revisions are pre-tested with a sample of aircraft owners/operators.
•  Comprehensive editing and verification procedures to ensure the accuracy of data transcription to
   machine-readable form as well as internal consistency of responses.
We undertake extensive effort to reduce measurement error, particularly where we can
anticipate systematic or repeated error on the part of survey respondents, but it is impossible to
eliminate all measurement error. Survey participants may misunderstand questions or misreport
flight activity in ways that cannot be anticipated or prevented through survey or questionnaire
design. Where survey reports appear nonsensical or contradict FAA regulations (e.g., lighter-than-air aircraft providing air medical services), we manually verify that the data were processed
accurately. Instances in which a small number of illogical reports occur may be suppressed and
are indicated in table notes. No additional steps are taken to “cleanse” the data of apparently
illogical reports or assign them to other categories. To do so would introduce additional and
systematic error that would be misleading and would affect other uses of the data.
Imputation of Missing Data
Imputation of missing data is very important for stabilizing the estimates of aircraft activity and
equipment. Values are imputed for variables if the survey response is incomplete, the survey
form did not include the question, or the Registry data field is blank. Table A.7 lists the variables
for which values are imputed, describes the imputation procedure, and shows the percentage of
cases with imputed data. The table shows rates of imputation among aircraft that received the
full survey form (first column of numbers) as well as rates of imputation for all survey responses,
including those that returned a short form (last column). It is important to recognize that the
latter figures will have inflated imputation rates: data for many items are structurally missing
because the questions were not asked on the short form.
Table A.7: Variables with Imputed Values, Imputation Procedure, and Percentage Imputed

                                                                                      Percent imputed    Percent imputed
Variable                                 Imputation procedure                         (full survey       (incl. short
                                                                                      form only)         form)
Percent of hours by use (e.g.,           Mean values by aircraft type                       1.1               2.0
  personal, business transport)
Percent of hours fractional ownership    Nearest neighbor by aircraft type by               0.5               0.9
                                           engine manufacture model
Percent of hours rented/leased *         Nearest neighbor by aircraft type by               1.9              23.8
                                           engine manufacture model
Percent of hours public use              Nearest neighbor by aircraft type by               1.9               2.6
                                           engine manufacture model
Percent of hours by flight plans/        Mean values by aircraft type                       1.8              23.8
  flight conditions *
Airframe hours *                         Nearest neighbor by aircraft type by age           2.5              24.3
Number of landings                       Nearest neighbor by aircraft type by               3.0               4.3
                                           engine manufacture model by age
Landing gear *                           Nearest neighbor by aircraft type by               2.2              24.1
                                           engine manufacture model
Fuel type *                              Nearest neighbor by aircraft type by               2.4              24.2
                                           engine manufacture model
Fuel burn rate                           Nearest neighbor by aircraft type by               2.1               3.7
                                           engine manufacture model
Avionics equipment *                     Nearest neighbor by aircraft type by               3.5              29.5
                                           engine manufacture model by age
State primarily flown                    Assign state of registration from                 22.9              25.1
                                           Registry Master
Year of manufacture                      Nearest neighbor by aircraft type by               8.6               9.7
  (Registry data field)                    engine manufacture model

Percentages are based on unweighted survey responses (total 37,215).
* Question not asked on the abbreviated survey form administered to owners/operators of multiple aircraft.
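The procedures in Table A.7 are hot-deck style imputations. The sketch below illustrates the two flavors (means within aircraft type, and a nearest-neighbor donor within a donor class) on a toy pandas frame; the column names and the age-based notion of "nearest" are simplifying assumptions, not the survey contractor's actual algorithm.

```python
import pandas as pd

def impute_mean_by_type(df: pd.DataFrame, col: str) -> pd.Series:
    """Fill missing values with the mean among reporting aircraft of the same
    type (Table A.7 uses this for hours by use and by flight condition)."""
    return df[col].fillna(df.groupby("aircraft_type")[col].transform("mean"))

def impute_nearest_by_age(df: pd.DataFrame, col: str, donor_class: list) -> pd.Series:
    """Rough stand-in for a nearest-neighbor hot deck: within each donor class
    (e.g., aircraft type by engine manufacture model), sort by aircraft age and
    borrow the closest preceding reported value, falling back to the next one."""
    def fill(group: pd.DataFrame) -> pd.Series:
        ordered = group.sort_values("age")[col]
        return ordered.ffill().fillna(ordered.bfill())
    filled = df.groupby(donor_class, group_keys=False).apply(fill)
    return filled.reindex(df.index)

# Toy example with missing airframe-hours reports.
fleet = pd.DataFrame({
    "aircraft_type": ["FW piston (1 engine)"] * 4,
    "engine_model": ["hypothetical-360"] * 4,
    "age": [5, 12, 20, 33],
    "airframe_hours": [900.0, None, 3400.0, None],
})
fleet["airframe_hours"] = impute_nearest_by_age(
    fleet, "airframe_hours", ["aircraft_type", "engine_model"])
print(fleet["airframe_hours"].tolist())   # [900.0, 900.0, 3400.0, 3400.0]
```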
In 2010, rates of imputation are typically less than two percent for sampled aircraft that
completed the full survey form. Item non-response on key activity variables is consistently
low—hours flown by use (1.1 percent), fractional ownership hours (0.5 percent), rented or
leased hours (1.9 percent), public use hours (1.9 percent), and hours by flight conditions (1.8
percent). Other variables have slightly higher imputation rates but are still well below four
percent (airframe hours, landings, fuel consumption, and avionics equipment). The state in
which an aircraft is primarily flown is the only variable with markedly higher rates of imputation
(23 percent). In fact, data on this variable are seldom missing, but many answers cannot be
coded to a single state because respondents list more than one state, describe a region, or
simply indicate “US.”
Over the past ten years, the survey questionnaire has undergone re-design efforts and data
collection methods have been developed to reduce item non-response:
•  The layout of the questionnaire was made more user-friendly by increasing font size and space between
   questions.
•  Instructions were simplified or added based on respondent feedback from pretests.
•  Confidentiality of survey results has been emphasized to reduce concerns that data will be used
   inappropriately.
•  Respondents have been encouraged to report their best guess if they do not have exact information.
•  Questions were revised to simplify the computations performed by respondents and eliminate the need
   to refer to previous answers.
•  Instructions to enter a zero, rather than leave an item blank, have minimized the frequency of
   ambiguous answers.
Survey Content
The 2010 GA Survey questionnaire requests the aircraft owner or operator to provide
information on flight activity, flight conditions, where the aircraft was flown, and aircraft
characteristics. Key variables derived from the survey responses include:
•  number of total hours flown in 2010, hours flown by use, and total lifetime airframe hours
•  the state in which the aircraft was primarily flown and hours flown in the state of Alaska
•  hours flown by flight plan and flight conditions, including flight under Instrument Meteorological
   Conditions (IMC) and Visual Meteorological Conditions (VMC) during the day and night
•  hours flown as part of a fractional ownership program, rented or leased, or used to fulfill a government
   function
•  type of landing gear and number of landings in 2010
•  fuel type and average fuel burn rate
•  avionics equipment installed in the aircraft.
Data Collection Methods
Collecting Data from Owners/Operators of a Single Aircraft
Appendix B presents the materials used to conduct the 2010 survey. The survey form
administered to owners/operators of a single aircraft is shown in Figure B.1. The postcard
invitation to the Internet component and the reminder/thank-you postcard are shown in Figures
B.2 and B.3. Surveys sent to aircraft owners who started, but did not complete, an Internet
survey included a special insert (Figure B.4). Surveys mailed to Alaskan addresses included an
insert with the endorsement of Alaska aviation associations encouraging owners to participate
(Figure B.5). Each of the three mailings for the survey was accompanied by a cover letter,
shown respectively in Figures B.6, B.7, and B.8. The data collection effort for the 2010 survey
also included an “end of field period” postcard (Figure B.9).
The protocol for the 2010 survey is similar to that used since the 2000 survey. The survey data
were collected from owners and operators of the sampled aircraft through two venues—the
Internet and mailings of the questionnaire. We implemented the Internet component before the
mailing portion to maximize the number of responses collected electronically. We first sent the
owners/operators of sampled aircraft a postcard inviting them to complete the survey on the
Internet (mailed on April 8, 2011). All single-aircraft surveys received through September 11
(on-line or by mail) were processed and included in analysis.
We mailed survey questionnaires to owners/operators of sampled aircraft three times during the
field period as well as a reminder/thank you postcard between the first and second mailings and
an end of field period follow-up postcard. With the exception of the final follow-up postcard, each
mailing was sent to owners/operators that had not yet responded to the survey at that time or
had not been assigned a final disposition (e.g., refused, respondent deceased, undeliverable
with no new address). The final postcard was sent only to owners/operators that had
participated the previous year but had not yet completed a 2010 survey by the end of July. We
mailed the first questionnaire on May 12, 2011, followed by the reminder/thank you postcard on
June 3, 2011. The second and third mailings were sent June 24, 2011 and July 22, 2011,
respectively. The end of field follow-up postcard was mailed on August 8, 2011.
Collecting Data from Owners/Operators of Multiple Aircraft
The 2010 GA Survey continued the effort initiated in 2004 to increase cooperation among
respondents who own or operate multiple aircraft. The 2010 survey employed data collection
tools and methods similar to those introduced in 2004, including extensive effort to contact
owners/operators of multiple aircraft by telephone to encourage participation among non-responders after the first mailing. The survey forms, cover letters, and reminder letter are
presented in Appendix B, Figures B.10–B.14.
The responses of multiple-aircraft owners/operators are important for accurately estimating
general aviation activity. Because of the increased burden of reporting for multiple aircraft, there
was a concern that these operators were less likely to respond to the survey. After selecting the
sample, we identify groups of aircraft belonging to the same operator using several resources:
FAA’s Operations Specifications Subsystem (OPSS), databases available from aviation
associations, and the Civil Aviation Registry’s Master file.
Owners/operators of multiple aircraft receive an abbreviated survey form to minimize the
reporting burden. The form, developed in cooperation with several aircraft operators and
aviation associations, allows an operator to report a summary of activity for a group of aircraft of
a similar type instead of requiring the operator to complete a separate and longer questionnaire
for each individual aircraft. This survey form (Figure B.10) collects data on key variables for
major classes of aircraft (e.g., hours flown, how flown, fuel consumption, fractional ownership,
and number of landings). The form does not collect data on flight conditions, fuel type, landing
gear, or avionics.
Data collection for multiple-aircraft owners/operators followed a schedule similar to that used for
owners/operators of single aircraft. We programmed an Internet survey that matched the hard-copy survey form, and the on-line survey remained open throughout the field period. We mailed
survey questionnaires three times during the field period as well as a reminder letter between
the first and second mailings. Each mailing was sent to owners/operators of multiple aircraft that
had not yet responded to the survey at that time and had not been assigned a final disposition.
The first survey mailing was sent April 15, 2011 followed by a reminder letter on May 12, 2011.
The second and third mailings were sent June 3, 2011 and July 22, 2011, respectively. The field
period for collecting responses from multiple-aircraft owners/operators was two weeks longer
than the data collection period for single-aircraft owners. All large fleet surveys received through
September 25 were processed and included in analysis.
To maximize survey response, we placed follow-up telephone calls to all multiple-aircraft
owners/operators who had not responded. The telephone effort, which was prioritized by fleet
size, began June 9, 2011 and continued through the field period. The calling effort focused on
encouraging survey participation as well as ensuring that survey mailings were reaching the
appropriate person in the operator’s organization. In some instances, the caller was able to
collect key information by telephone (e.g., flown/not flown and hours flown) when the
owner/operator would be unable to return the full survey form before data collection ended.
The alternate survey form for owners/operators of multiple aircraft has reduced respondent
burden and improved representation of activity among high-end and high-use aircraft. The
alternate data collection track for owners/operators of multiple aircraft accounts for about 20
percent or more of all aircraft responding to the survey. In 2010, 22.6 percent of all completed
surveys followed this data collection track (average 22.4 percent for GA Surveys 2006-2010).
Response Rate
The response rate is calculated conservatively following guidelines published by the American
Association for Public Opinion Research (AAPOR), a professional association that establishes
standards, “best practice” guidelines, and a code of ethics for professional survey researchers
and research firms.8 Specifically, the response rate is computed as the number of completed
and partial surveys returned divided by the total number of eligible aircraft in the sample using
the following formula.
RR = (C + P) / [(C + P) + NR + INS + REF + PMR + UNK]
Where

RR = Response Rate
C = Completed survey
P = Partial survey
NR = No response
INS = Insufficient complete; a partial survey that is not sufficient to count as a complete
REF = Refused
PMR = Postmaster Return, no new address
UNK = Unknown eligibility

8  The American Association for Public Opinion Research. 2000. Standard Definitions: Final Dispositions of Case
Codes and Outcome Rates for Surveys. Ann Arbor, MI: AAPOR.
The numerator is comprised of completed surveys and partial surveys that provide enough
information to be used for analysis. Partial surveys must include information on hours flown to
be included in the numerator.
In addition to completed and partial surveys, the denominator includes cases for which no
response was received, insufficiently completed surveys (i.e., no data reported for hours flown),
refusals, surveys returned as undeliverable by the USPS, and cases of unknown eligibility. The
last category includes aircraft for which the owner cannot be identified or cannot report about
aircraft activity (e.g., owner is deceased and the survivors cannot report on the aircraft activity,
survey recipient does not own the aircraft listed).
The denominator includes aircraft that were sold or destroyed during the survey year. The
survey collects data on flight activity for the portion of the year the aircraft was eligible to fly, and
data collection efforts attempt to identify and mail surveys to new owners.
The denominator excludes aircraft known not to be part of the general aviation fleet or known
not to be eligible to fly during the survey year. These are aircraft that were destroyed prior to the
survey year, displayed in a museum, operated primarily as an air carrier, operated outside of the
US, or exported overseas.
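A minimal sketch of the rate as defined above; the counts used here are hypothetical, not figures from Table A.8 or A.10.

```python
def response_rate(c, p, nr, ins, ref, pmr, unk):
    """Completed plus usable partial surveys over all eligible sampled aircraft,
    following the AAPOR-style formula above. Aircraft identified as out of
    scope (the 'invalid sample') are removed before these counts are tallied."""
    return (c + p) / (c + p + nr + ins + ref + pmr + unk)

# Hypothetical counts for one aircraft type.
rr = response_rate(c=3_000, p=75, nr=3_300, ins=40, ref=60, pmr=300, unk=25)
print(f"{rr:.1%}")   # about 45.2%
```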
Table A.8 shows the final response rate by mailing and overall, along with the number of
completed surveys. The number of completed surveys shown here excludes duplicate surveys
after cleaning the returned survey data to retain the form with the most complete information.
The overall response rate for the 2010 GA Survey was 44.2 percent. Almost 60 percent of
responses were received on the Internet and slightly more than one-quarter were received from
the first mailing. The second and third mailings had lower response, but these rates are
calculated conservatively. For example, the Mail 3 response rate is the proportion of sampled
aircraft that returned that hard-copy survey. If a third mailing was sent but the survey was later
completed on-line, the response is recorded as “Internet.”
Table A.8: Response Rate by Mailing

Mailing          Completes    Response Rate    % Total Response
Internet            21,641        25.6%             58.1%
1st Mailing          9,713        11.5%             26.1%
2nd Mailing          3,047         3.6%              8.2%
3rd Mailing          2,814         3.3%              7.6%
Overall             37,215        44.2%            100.0%
As noted above, the response rate is calculated conservatively and retains all non-responding
surveys, sampled units with bad addresses, and sampled aircraft of unknown eligibility in the
denominator. In the 2010 survey, 4,223 surveys were returned undeliverable and we were
unable to obtain updated address information. In addition, the survey sample itself included
about 800 aircraft that could not be contacted because their status was “Sale Reported,”
“Registration Pending” or the address was already known to be incorrect (i.e., Postmaster
Return status on the Registry).9 Applying guidelines for defining the GA population developed
with the FAA and Registry personnel, these aircraft are deemed potentially active and therefore
eligible for selection into the survey.
Table A.9 illustrates the steady increase in the Internet response as a percentage of all returned
surveys from 2000 to 2010. Almost 60 percent of survey responses were received by Internet in
2010, and the share of Internet response was very similar to the previous survey year. Since the
on-line survey form was introduced in 2000, participation via the Internet has increased by more
than 75 percent: initially about one-third of all completed surveys were submitted on-line, while
almost 60 percent of responding aircraft now answer the survey on-line. Internet response grew steadily over the
past ten years with larger increases occurring in the 2007 and 2008 survey years. The share of
Internet responses appears to have stabilized since 2008.
Table A.9: Percentage of All Completed Surveys Responding by Internet

Year    Total Sample Size    Total Completes    Internet Completes    Internet % of Total
2000          31,039              15,689               5,144                32.8%
2001          30,886              16,432               5,954                36.2%
2002          30,817              15,254               5,304                34.8%
2003          31,996              14,471               6,059                41.9%
2004          75,659              32,056              13,441                41.9%
2005          77,403              34,248              14,555                42.5%
2006          84,486              38,973              17,266                44.3%
2007          84,570              38,920              19,268                49.5%
2008          82,277              35,607              20,611                57.9%
2009          85,086              36,222              20,985                57.9%
2010          84,982              37,215              21,641                58.2%
Table A.10 shows response rates by aircraft type. The overall response rate in 2010 increased
slightly more than one percentage point over the previous year, from 42.9 percent to 44.2
percent. In 2010, participation is highest among experimental-amateur aircraft (60 percent), and
response rates for light-sport aircraft and gliders are also above 50 percent. Response by single-engine fixed
wing piston aircraft roughly matched the overall rate in 2010, but increased noticeably over the previous
survey year. In 2009, response rates for single-engine fixed wing piston aircraft were about 40
percent, while the rates in 2010 are 45 percent. Piston rotorcraft also had higher response in
2010—increasing from 29.4 percent in 2009 to 32.1 percent in 2010. Rates for most other
aircraft types are roughly similar to the previous survey year or slightly lower.
9  Surveys were sent to aircraft listed as “Registration Pending” or “Sale Reported” at Mail 2 and Mail 3 if the Registry
has updated address information. Less than 20 percent of these surveys were subsequently completed in 2010.
Table A.10: Response Rate by Aircraft Type

Aircraft Type               Sample    Invalid Sample (10)    Completes    Response Rate
Fixed Wing - Piston
  1 engine, 1-3 seats        6,786            25                3,075         45.5%
  1 engine, 4+ seats        15,744            37                7,086         45.1%
  2 engines, 1-6 seats       5,225            49                2,158         41.7%
  2 engines, 7+ seats        2,831            82                1,090         39.7%
Fixed Wing - Turboprop
  1 engine                   4,508            23                2,046         45.6%
  2 engines, 1-12 seats      4,566            36                1,734         38.3%
  2 engines, 13+ seats       1,179            22                  375         32.4%
Fixed Wing - Turbojet
  2 engines                 12,566           157                5,070         40.9%
Rotorcraft
  Piston                     5,125            43                1,630         32.1%
  Turbine: 1 engine          5,799            42                2,605         45.2%
  Turbine: Multi-engine      1,691            51                  780         47.6%
Other Aircraft
  Glider                     1,763            17                  900         51.5%
  Lighter-than-air           2,498            26                  985         39.8%
Experimental
  Amateur                    6,543            48                3,926         60.4%
  Exhibition                 1,994            29                  902         45.9%
  Other                      1,981            23                  704         36.0%
Light-sport                  4,183            15                2,149         51.6%
Total                       84,982           725               37,215         44.2%

(10)  Even though efforts are made to remove ineligible aircraft from the population before the sample is selected, a
small number of surveys are returned indicating that the aircraft should not be part of the survey population (e.g., the
aircraft was used primarily as a Part 121 air carrier, or was a museum piece the entire survey year). The Invalid
Sample represents such aircraft, which are excluded from response rate calculations.