2008 Post-Election Voting Survey
of Local Election Officials
Statistical Methodology Report

OMB: 0704-0125

Additional copies of this report may be obtained from:
Defense Technical Information Center
ATTN: DTIC-BRR
8725 John J. Kingman Rd., Suite #0944
Ft. Belvoir, VA 22060-6218
Or from:
http://www.dtic.mil/dtic/order.html
Ask for report by ADA504028

DMDC Report No. 2009-053
August 2009

2008 POST-ELECTION VOTING SURVEY OF
LOCAL ELECTION OFFICIALS:
STATISTICAL METHODOLOGY REPORT

Defense Manpower Data Center
Human Resources Strategic Assessment Program
1600 Wilson Boulevard, Suite 400, Arlington, VA 22209-2593

Acknowledgments
Defense Manpower Data Center (DMDC) is indebted to numerous people for their
assistance with the 2008 Post-Election Voting Survey of Local Election Officials, which was conducted
on behalf of the Office of the Under Secretary of Defense for Personnel and Readiness
(OUSD[P&R]). DMDC’s survey program is conducted under the leadership of Timothy Elig,
Chief of the Human Resource Strategic Assessment Program.
Policy officials contributing to the development of this survey include Erin St. Pierre and
Scott Wiedmann (Federal Voting Assistance Program).
DMDC’s Program Evaluation Branch, under the guidance of Brian Lappin, former
Branch Chief, and Kristin Williams, current Branch Chief, is responsible for the development of
questionnaires. Lead analyst was Robert Tinney.
DMDC’s Personnel Survey Branch, under the guidance of Jean Fowler, former Branch
Chief, and David McGrath, current Branch Chief, is responsible for HRSAP survey sampling,
weighting, database construction, and archiving. Lead operation analyst on this survey was Lisa
Howard Davis, SRA International, Inc., supported by John Freimuth, Consortium Research
Fellow. The lead statistician on this survey was Mark Gorsak, supported by Katrina Hsen,
Consortium Research Fellow. Jean Fowler performed the sampling procedures.
DMDC’s Survey Technology Branch, under the guidance of Fred Licari, Branch Chief, is
responsible for the distribution of datasets outside of DMDC and maintaining records on
compliance with the Privacy Act and 32 CFR 219.


2008 POST-ELECTION VOTING SURVEY OF
LOCAL ELECTION OFFICIALS
Executive Summary
The Uniformed and Overseas Citizens Absentee Voting Act of 1986 (UOCAVA), 42
USC 1973ff, permits members of the Uniformed Services and Merchant Marine, their
eligible family members, and all citizens residing outside the United States and its
territories to vote in the general election for federal offices. These groups include:

• Members of the Uniformed Services (including Army, Navy, Air Force, Marine Corps, Coast Guard),

• U.S. citizens employed by the federal Government residing outside the U.S., and

• All other private U.S. citizens residing outside the U.S.

The Federal Voting Assistance Program (FVAP), under the guidance of USD(P&R), is
charged with implementing the UOCAVA and evaluating the effectiveness of its programs. The
FVAP Office asked DMDC to design, administer, and analyze post-election surveys on
Uniformed Services voter participation, overseas nonmilitary voter participation, and local
election officials. Without such surveys, the Department will not be able to assess and improve
voter access. In addition, such surveys fulfill Executive Order 12642 (1988), which names the
Secretary of Defense as the “Presidential designee” for administering the UOCAVA and requires
surveys to evaluate the effectiveness of the program in presidential election years.
The objectives of the 2008 post-election surveys are: (1) to gauge participation in the
electoral process by citizens covered by UOCAVA, (2) to assess the impact of the FVAP’s
efforts to simplify and ease the process of voting absentee, (3) to evaluate other progress made to
facilitate voting participation, and (4) to identify any remaining obstacles to voting by these
citizens. Surveys were done of military members, federal civilian employees overseas, other
U.S. citizens overseas, voting assistance personnel, and local election officials in the U.S.
This report focuses on the 2008 Post-Election Voting Survey of Local Election Officials
(2008 LEO), which was designed to capture the attitudes and behaviors of local election
officials, as well as voting information within each voting jurisdiction, concentrating on the
absentee vote.
This report describes the sampling and weighting methodologies used in the 2008 LEO.
Calculation of response rates is described in the final section.
The population of interest for the 2008 LEO consisted of the local election officials from
the voting jurisdictions in the United States and the four territories. There were 7,886 voting
jurisdictions covering the United States and the four territories.


The 2008 LEO survey was a sample of voting jurisdictions, with the LEO as the
respondent. The total sample size was 2,598. The survey administration period lasted from
November 5, 2008, to January 7, 2009. There were 1,376 usable questionnaires.
After eligibility for the survey and completion of a survey were determined, analytic
weights were created to account for varying response rates among population subgroups. First,
the base (sampling) weights, the inverse of the selection probabilities, were computed. Second,
the base weights were adjusted to account for survey nonresponse.
Location, completion, and response rates are provided in the final section of this report
for both the full sample and for population subgroups. These rates were computed according to
the recommendations of the Council of American Survey Research Organizations (1982). The
location, completion, and response rates were 81%, 68%, and 55%, respectively.


Table of Contents

Introduction
    Sample Design and Selection
        Target Population
        Sampling Frame
        Sample Design
        Survey Allocation
        Sample Selection
    Survey Administration
        Sample Contact Information
        Survey Administration
        Web Survey Administration
        Mail Survey Administration
    Survey Administration Issues
        Selection for Multiple Election Surveys
        Local Election Officials Not in Sample
    Weighting
        Case Dispositions
        Base Weight
        Adjustments to Base Weights
        Nonresponse Adjustments and Final Weight
        Variance Estimation
    Location, Completion, and Response Rates
        Ineligibility Rate
        Estimated Ineligible Postal Non-Deliverable/Not Located Rate
        Estimated Ineligible Nonresponse
        Adjusted Location Rate
        Adjusted Completion Rate
        Adjusted Response Rate
    Edit and Imputation Processes
        Edit Process
        Imputation Process
References

List of Tables

1.  Total Voting Jurisdictions and Registered Voter Counts by Sampling Stratum
2.  Sample Counts and Probability of Selection for Voting Jurisdictions by Sampling Stratum
3.  Initial and Final Sample Counts by Sampling Stratum
4.  E-Mail Distribution to Local Election Officials
5.  Case Disposition Resolutions
6.  Sample Size by Case Disposition Categories
7.  Complete Eligible Cases by Sampling Stratum
8.  Base Weights by Sampling Stratum
9.  Final Weights by Sampling Stratum
10. Disposition Codes for CASRO Response Rates
11. Rates for Full Sample and Stratification Levels


2008 POST-ELECTION VOTING SURVEY OF LOCAL
ELECTION OFFICIALS:
STATISTICAL METHODOLOGY REPORT
Introduction
The Uniformed and Overseas Citizens Absentee Voting Act of 1986 (UOCAVA), 42
USC 1973ff, permits members of the Uniformed Services and Merchant Marine, their
eligible family members, and all citizens residing outside the United States and its
territories to vote in the general election for federal offices. These groups include:

• Members of the Uniformed Services (including Army, Navy, Air Force, Marine Corps, Coast Guard),

• U.S. citizens employed by the federal Government residing outside the U.S., and

• All other private U.S. citizens residing outside the U.S.

The Federal Voting Assistance Program (FVAP), under the guidance of USD(P&R), is
charged with implementing the UOCAVA and evaluating the effectiveness of its programs. The
FVAP Office asked DMDC to design, administer, and analyze post-election surveys on
Uniformed Services voter participation, overseas nonmilitary voter participation, and local
election officials. Without such surveys, the Department will not be able to assess and improve
voter access. In addition, such surveys fulfill Executive Order 12642 (1988), which names the
Secretary of Defense as the “Presidential designee” for administering the UOCAVA and requires
surveys to evaluate the effectiveness of the program in presidential election years.
The objectives of the 2008 post-election surveys are: (1) to gauge participation in the
electoral process by citizens covered by UOCAVA, (2) to assess the impact of the FVAP’s
efforts to simplify and ease the process of voting absentee, (3) to evaluate other progress made to
facilitate voting participation, and (4) to identify any remaining obstacles to voting by these
citizens. Surveys were done of military members, federal civilian employees overseas, other
U.S. citizens overseas, voting assistance personnel, and local election officials in the U.S.
This report describes sampling and weighting methodologies for the 2008 LEO. The first
section describes the design and selection of the sample. The second section describes the
survey administration. The third section describes weighting and variance estimation. The final
section describes the calculation of response rates, location rates, and completion rates for the
full sample and for population subgroups. Tabulated results of the survey are reported by
DMDC (2009).


Sample Design and Selection
Target Population
The 2008 LEO was designed to represent all local election officials from the voting
jurisdictions in the United States and the four territories. The 2004 survey sampled about 1,000
local election officials compared with 2,598 for the 2008 survey.
Sampling Frame
The sampling frame was built from two sources. A file from the Election Administration
database of Election Data Services, Inc. (EDS) was initially used to develop the frame.
The EDS file contained 10,729 voting jurisdiction records. There were duplicate records for
many voting jurisdictions within the EDS file. After removing the duplicate records, there were
10,051 voting jurisdiction level records.
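The report gives only the before-and-after record counts (10,729 raw records, 10,051 after deduplication). A minimal sketch of such a deduplication step follows; the field names are hypothetical, since the report does not describe the EDS record layout.

```python
# Illustrative sketch of the frame deduplication step. The record fields
# (state and jurisdiction name) are hypothetical; the report gives only
# the before/after counts, not the EDS record layout.

def dedupe_frame(records):
    """Keep the first occurrence of each jurisdiction-level record."""
    seen = set()
    unique = []
    for rec in records:
        key = (rec["state"], rec["jurisdiction"])  # assumed identifying key
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

raw = [
    {"state": "WI", "jurisdiction": "Milwaukee"},
    {"state": "WI", "jurisdiction": "Milwaukee"},  # duplicate record
    {"state": "MI", "jurisdiction": "Detroit"},
]
frame = dedupe_frame(raw)
print(len(frame))  # 2 unique jurisdiction-level records
```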
After contacting a sample of jurisdictions, modifications to the frame were needed due to
the following:
• Minnesota has county-level voting jurisdictions. Jurisdictions at a geographic level below the county did not process the voting information needed for the survey. The initial EDS frame had county and sub-county voting jurisdictions for Minnesota.

• Wisconsin and Michigan have municipality- and city-level voting jurisdictions. Jurisdictions at the county level did not process the voting information needed for the survey. The initial EDS frame had county and sub-county voting jurisdictions for Wisconsin and Michigan.

• Kalawao County, Hawaii uses the governmental services from Maui County for voting purposes.

• Ferdinand, Vermont is an unincorporated town that does not have governmental voting services.

• West Windsor, Vermont was a duplicate not originally removed from the frame file.

Kalawao County, Hawaii; Ferdinand, Vermont; and one record for West Windsor,
Vermont became ineligible for the frame. All 87 counties and 373 sub-county jurisdictions in
Minnesota were included in the original frame; the 373 city- and township-level jurisdictions
were then removed, and the base weights were adjusted according to the sampling stratum. All
county-level jurisdictions in Wisconsin and Michigan (41 and 54 counties, respectively) were
likewise removed from the frame, and the base weights were adjusted according to the sampling
stratum. For Wisconsin and Michigan, the frame also did not contain all jurisdictions below the
county level. To add these jurisdictions, listings from the National Association of Counties
(NACo) were used; the NACo listing includes the cities, towns, villages, and boroughs as per the
Census Bureau definition. An additional nine voting jurisdictions were added to the sample for
Wisconsin and 51 for Michigan.
The final sampling frame consisted of 7,886 voting jurisdictions. The registered voter
counts were from the EDS Election Administration database. Table 1 shows the total number of
voting jurisdictions and registered voters by sampling stratum.

Table 1.
Total Voting Jurisdictions and Registered Voter Counts by Sampling Stratum

                                                             Jurisdiction    Registered
Sampling Stratum                                                    Count   Voter Count
Total jurisdictions                                                 7,886   187,857,248
Any jurisdiction with 200,001–360,000 registered voters (a)           101    25,838,816
Any jurisdiction with more than 360,000 registered voters              79    64,109,108
County/City jurisdictions with registered voters
  100,001–200,000                                                     146    20,078,655
  75,001–100,000                                                      109     8,968,523
  40,001–75,000                                                       279    14,357,107
  10,001–40,000                                                     1,263    22,967,130
  5,001–10,000                                                        624     3,925,735
  Less than 5,001                                                     791     1,365,417
Town/Township/Village jurisdictions with registered voters
  10,001–200,000                                                      331    17,695,367
  5,001–10,000                                                        302     2,987,674
  Less than 5,001                                                   3,861     5,563,716

(a) This stratum also contains the largest jurisdiction for states or territories with only
jurisdictions of fewer than 200,001 registered voters.

Sample Design
The 2008 LEO used a single-stage stratified design. The two strata of jurisdictions
with more than 200,000 registered voters were included in the sample with certainty. For states
or territories whose jurisdictions all had fewer than 200,001 registered voters, the largest
jurisdiction from that state or territory was included in the sample with certainty. Thus, the
sample included at least one jurisdiction from each of the 50 states, the District of Columbia,
and the four territories.

Within each remaining stratum, voting jurisdictions were selected with equal probability
without replacement using simple random sampling. Because the allocation of the sample was
not proportional to the size of the strata, selection probabilities varied among strata, and
jurisdictions were not selected with equal probability overall. Nonproportional allocation was
used to achieve adequate sample sizes for relatively small subpopulations of analytic interest.
The primary domain of interest is jurisdiction size by type of jurisdiction.

Table 2.
Sample Counts and Probability of Selection for Voting Jurisdictions by Sampling Stratum

                                                              Sample   Probability
Sampling Stratum                                               Count  of Selection
Total jurisdictions                                            2,598           n/a
Any jurisdiction with 200,001–360,000 registered voters (a)      101          1.00
Any jurisdiction with more than 360,000 registered voters         79          1.00
County/City jurisdictions with registered voters
  100,001–200,000                                                120          0.82
  75,001–100,000                                                  77          0.71
  40,001–75,000                                                  194          0.70
  10,001–40,000                                                  702          0.56
  5,001–10,000                                                   336          0.54
  Less than 5,001                                                335          0.42
Town/Township/Village jurisdictions with registered voters
  10,001–200,000                                                  72          0.22
  5,001–10,000                                                    64          0.21
  Less than 5,001                                                518          0.13

(a) This stratum also contains the largest jurisdiction for states or territories with only
jurisdictions of fewer than 200,001 registered voters.

Survey Allocation
The sample was allocated in proportion to the number of registered voters: sampling
strata with larger registered-voter populations were allocated proportionally more sample than
strata with smaller registered-voter populations. Table 2 shows the probabilities of selection
for each sampling stratum.
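As a consistency check on the published figures, a non-certainty stratum's probability of selection is simply its sample count divided by its frame count, and its base weight is the reciprocal. A minimal sketch using the County/City 100,001–200,000 stratum counts from Tables 1 and 2:

```python
# Selection probability and base weight for one non-certainty stratum,
# using the published counts for County/City jurisdictions with
# 100,001-200,000 registered voters (Tables 1 and 2).

frame_count = 146   # jurisdictions in the stratum (Table 1)
sample_count = 120  # jurisdictions sampled from it (Table 2)

prob_selection = sample_count / frame_count   # ~0.82, as in Table 2
base_weight = frame_count / sample_count      # ~1.22, as in Table 8

print(round(prob_selection, 2), round(base_weight, 2))
```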
Sample Selection
Initially, the frame was stratified by the sampling stratum and separate simple random
samples were drawn within each sampling stratum. The initial sample was notified about the
survey. From that contact, modifications to the frame were needed. After correcting the frame
to account for the missing jurisdictions below the county level for Wisconsin and Michigan, an
additional sample was selected using a simple random sample. The additional sample was drawn
from the population of missing jurisdictions in Wisconsin and Michigan.
After the removal of county-level voting jurisdictions in Wisconsin and Michigan and
the voting jurisdictions below the county level in Minnesota, the requirement to include the
largest voting jurisdiction in each state was no longer satisfied for Wisconsin. As a result, the
city of Milwaukee was included in the sample with certainty. The number of registered voters on
the EDS Election Administration database for the city of Milwaukee was 172,676. Detroit,
Michigan and Hennepin County, Minnesota were already in the sample with certainty; their
registered voter counts were 639,053 and 703,453, respectively. No adjustment was necessary
for Michigan or Minnesota to satisfy the requirement for the largest voting jurisdiction in each
state.

Table 3.
Initial and Final Sample Counts by Sampling Stratum

                                                              Initial        Final
Sampling Stratum                                         Sample Count Sample Count
Total jurisdictions                                             3,004        2,598
Any jurisdiction with 200,001–360,000 registered voters (a)       115          101
Any jurisdiction with more than 360,000 registered voters          85           79
County/City jurisdictions with registered voters
  100,001–200,000                                                 131          120
  75,001–100,000                                                   81           77
  40,001–75,000                                                   212          194
  10,001–40,000                                                   726          702
  5,001–10,000                                                    338          336
  Less than 5,001                                                 291          335
Town/Township/Village jurisdictions with registered voters
  10,001–200,000                                                   74           72
  5,001–10,000                                                     77           64
  Less than 5,001                                                 874          518

(a) This stratum also contains the largest jurisdiction for states or territories with only
jurisdictions of fewer than 200,001 registered voters.

Survey Administration
Sample Contact Information
The sample contact information came from the EDS Election Administration database,
and the initial sample was notified using that information.

After modifications to the frame, additional contact information was needed for the
voting jurisdictions added to the sample. A Web search was conducted for contact information
for the local election officials in Wisconsin and Michigan, with a phone follow-up to confirm
that the information was correct.


Survey Administration
Survey pre-administration activities began on April 17, 2008, with an additional mailing
on July 28. Survey administration of the 2008 LEO began on November 5, 2008, and continued
through January 7, 2009. The survey was administered in mixed modes, in both Web and paper
formats. Please see DMDC (In preparation) for further information on survey administration.
The administration plan called for three types of communications with sampled local
election officials: notification, survey invitation, and thank you/reminder.
The first communication was a notification of the sampled jurisdictions that they would
be surveyed at the time of the November general election. The jurisdictions were provided a
spreadsheet (both paper and an Excel file) that could be used to track numbers during the year
that would be needed for the survey. It was during this pre-administration process that the frame
and sample were cleaned, as described above.
The second communication, the “survey invitation,” contained the paper survey for
postal recipients or a link to the survey for Web recipients. The cover letter also stated that the
2008 LEO survey was different from the Election Day survey conducted by the United States
Election Assistance Commission.
Finally, the third type of communication was a “thank you/reminder,” sent after a
specified period following the survey invitation. Its main purpose was to remind sampled
individuals of the survey and ask them to complete and return it.
Web Survey Administration
Survey invitation and thank you/reminder e-mails were sent to sample members with
known e-mail addresses provided during the sample verification process. The e-mail contact was
under the signature of Polli Brunelli, Director of the Federal Voting Assistance Program
(FVAP). Table 4 shows the dates for the e-mail distribution.

Table 4.
E-Mail Distribution to Local Election Officials

Type of E-Mail          Date
Survey Invitation       11/5/08
Thank you/reminder:
  First                 11/14/08
  Second                11/28/08
  Third                 12/12/08


All e-mail notifications included the link to the survey Web site and a unique Ticket
Number for logging on to the survey. Thank you/reminders were sent to all sample members
excluding the following:
• Those who had submitted a Web survey or returned a paper survey;

• Those who had requested a paper survey; and

• Those who had been assigned a case disposition code indicating a refusal or survey ineligibility (e.g., a disposition code for deceased or no longer employed with the agency).

Mail Survey Administration
The paper survey was formatted and prepared for printing. A unique Ticket Number and
the URL for accessing the Web version of the survey were included on the cover of the paper
survey. Instructions were included stating that sample members had the option of completing
either the Web or paper versions of the survey.
Printed survey materials were assembled into survey packets. Each packet included a
survey cover letter (under the signature of Polli Brunelli, Director of FVAP), the survey, an
envelope to return the survey, and an outer mailing envelope.
Survey Administration Issues
Selection for Multiple Election Surveys
During the administration of the 2008 LEO survey, local election officials received
requests for information from other organizations. Questions about the 2008 LEO survey were
directed to the FVAP toll-free number and the FVAP Web site.
Local Election Officials Not in Sample
Some local election officials not in the 2008 LEO sample received word about the survey.
These LEOs inquired through the FVAP toll-free number and the FVAP Web site whether they
could participate. They were notified that the 2008 LEO was a scientific sample representing the
United States and that their participation was not needed.
Weighting
The analytic weights for the 2008 LEO were created to allow estimation of population
values from eligible survey respondents. The weights reflect the differential sampling rates in
the 11 population subgroups shown in Tables 2 and 3 and the differential rates of response in
each of these subgroups.


Case Dispositions
Final case dispositions for weighting were determined using information from field
operations (the Survey Control System, or SCS) and from returned surveys. No single source of
information is both complete and correct; inconsistencies among these sources were resolved
according to the order of precedence shown in Table 5. Both execution of the weighting process
and computation of response rates depend on this classification.

Table 5.
Case Disposition Resolutions

Case Disposition       Information Source   Conditions
Ineligible             SCS                  Frame ineligible.
Eligible, complete     Item response rate   Item response is at least 50%.
  response
Eligible, incomplete   Item response rate   Return is not blank but item response is
  response                                  less than 50%.
Active refusal         SCS                  Reason for refusal is “any;” ineligible
                                            reason is “other;” reason survey is blank
                                            is “refused-too long,” “ineligible-other,”
                                            “unreachable at this address,” “refused by
                                            current resident,” or “concerned about
                                            security/confidentiality.”
Blank return           SCS                  No reason given.
PND                    SCS                  Postal non-delivery or original
                                            non-locatable.
Nonrespondent          Remainder            Remainder.

This order is critical to resolving case dispositions. For example, suppose a sample
person refused the survey because it was too long; in the absence of any other information, the
disposition would be “eligible nonrespondent.” If, however, the SCS indicated that the survey
came from an ineligible jurisdiction, the disposition would be “ineligible.”
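The precedence rule can be sketched as a first-match scan down the Table 5 rows; the flag names below are hypothetical, since the report describes the logic rather than an implementation.

```python
# Sketch of the Table 5 order of precedence: the first condition that
# matches determines the final case disposition. Flag names are
# hypothetical; the report describes the logic, not an implementation.

DISPOSITION_RULES = [
    ("ineligible",          lambda c: c.get("frame_ineligible", False)),
    ("complete response",   lambda c: c.get("item_response_rate", 0.0) >= 0.50),
    ("incomplete response", lambda c: c.get("returned", False)
                                      and c.get("item_response_rate", 0.0) > 0.0),
    ("active refusal",      lambda c: c.get("refused", False)),
    ("blank return",        lambda c: c.get("returned", False)),
    ("PND",                 lambda c: c.get("postal_nondelivery", False)),
]

def resolve(case):
    for label, rule in DISPOSITION_RULES:
        if rule(case):
            return label
    return "nonrespondent"  # remainder

# A refusal from a jurisdiction the SCS flags as frame-ineligible resolves
# to "ineligible", because that rule is checked first.
print(resolve({"refused": True, "frame_ineligible": True}))  # ineligible
print(resolve({"refused": True}))                            # active refusal
```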
Final case dispositions for the 2008 LEO are shown in Table 6. The total number of
eligible cases for weighting is shown in Table 7.


Table 6.
Sample Size by Case Disposition Categories

Case Disposition Category and (Code Value)    Sample Size
Total                                               2,598
Record Ineligible (1)                                   3
Eligible Response
  Complete (4)                                      1,376
  Incomplete (5)                                      136
Refused/Other (8)                                     117
Blank (9)                                               3
Postal Non-Delivery (10)                              397
Non-respondents (11)                                  566

Table 7.
Complete Eligible Cases by Sampling Stratum

                                                              Complete
Sampling Stratum                                        Eligible Cases
Total jurisdictions                                              1,376
Any jurisdiction with 200,001–360,000 registered voters (a)         50
Any jurisdiction with more than 360,000 registered voters           34
County/City jurisdictions with registered voters
  100,001–200,000                                                   59
  75,001–100,000                                                    37
  40,001–75,000                                                     95
  10,001–40,000                                                    364
  5,001–10,000                                                     174
  Less than 5,001                                                  194
Town/Township/Village jurisdictions with registered voters
  10,001–200,000                                                    37
  5,001–10,000                                                      34
  Less than 5,001                                                  298

(a) This stratum also contains the largest jurisdiction for states or territories with only
jurisdictions of fewer than 200,001 registered voters.


Base Weight
The 2008 LEO sample was a stratified random sample where separate samples were
selected from each of the 11 frame strata (Table 1). Within each stratum, a simple random
sample was drawn (Table 3). The base or sampling weight is the ratio of the frame count to the
sample count for each stratum. Table 8 shows the base weights for each stratum.

Table 8.
Base Weights by Sampling Stratum

Sampling Stratum                                                Base Weight
Four territories                                                    1.00
Any jurisdiction with less than 250,000 registered voters(a)        1.00
Any jurisdiction with 250,001–360,000 registered voters             1.00
Any jurisdiction with 360,001–1,000,000 registered voters           1.00
Any jurisdiction with more than 1,000,000 registered voters         1.00
County/City jurisdictions with registered voters
  100,001–200,000                                                   1.22
  75,001–100,000                                                    1.42
  40,001–75,000                                                     1.43
  10,001–40,000                                                     1.80
  5,001–10,000                                                      1.85
  Less than 5,001                                                   2.37
Town/Township/Village jurisdictions with registered voters
  10,001–200,000                                                    4.60
  5,001–10,000                                                      4.72
  Less than 5,001                                                   7.48

(a) This stratum also contains the largest jurisdiction for states or territories with only jurisdictions less than 200,001 registered voters.
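The base-weight computation reduces to the one-line ratio described above. The sketch below illustrates it for two strata; the certainty-stratum count of 101 matches Table 11, but the frame count of 3,875 for the smallest town stratum is a placeholder chosen only to reproduce the 7.48 weight in Table 8 (the actual frame counts appear in Tables 1 and 3, outside this excerpt).

```python
# Base weight per stratum: ratio of frame count to sample count.
# Counts below are illustrative placeholders, not the survey's frame sizes.

def base_weight(frame_count: int, sample_count: int) -> float:
    """Base (sampling) weight for a stratum under simple random sampling."""
    return frame_count / sample_count

# (stratum label, frame count N_h, sample count n_h)
strata = [
    ("certainty stratum",        101, 101),   # sampled with certainty -> 1.00
    ("town/village < 5,001",    3875, 518),   # hypothetical frame count
]

for name, N_h, n_h in strata:
    print(f"{name}: base weight = {base_weight(N_h, n_h):.2f}")
```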

Adjustments to Base Weights
After case dispositions were resolved, the sampling weights were adjusted for
nonresponse. The eligibility-adjusted weights for eligible respondents (value 4) were adjusted to
account for eligible sample members who had not returned a completed survey (value 5).
Nonresponse Adjustments and Final Weight
Once base weights were adjusted, final weights were calculated within each stratum by
dividing the sum of the base weights of all eligible cases by the number of eligible respondents
in that stratum. A final weight greater than zero was assigned to every eligible case with a
completed response.
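The per-stratum adjustment can be sketched as below, assuming each case record carries its stratum, base weight, and Table 6 disposition code. The toy data mirror the four certainty-sampled territories, two of which responded, yielding the final weight of 2.00 reported later in this section.

```python
# Minimal sketch of the nonresponse adjustment: per stratum, the final
# weight is the sum of base weights over eligible cases divided by the
# number of eligible respondents (disposition code 4 from Table 6).
from collections import defaultdict

ELIGIBLE_CODES = {4, 5, 8, 9, 10, 11}

def final_weights(cases):
    """cases: iterable of (stratum, base_weight, disposition_code)."""
    eligible_sum = defaultdict(float)
    respondents = defaultdict(int)
    for stratum, bw, code in cases:
        if code in ELIGIBLE_CODES:
            eligible_sum[stratum] += bw
        if code == 4:
            respondents[stratum] += 1
    return {s: eligible_sum[s] / respondents[s] for s in respondents}

# Toy stratum: four certainty cases (base weight 1.0), two completed surveys
cases = [("territories", 1.0, 4), ("territories", 1.0, 4),
         ("territories", 1.0, 11), ("territories", 1.0, 11)]
print(final_weights(cases))  # final weight 2.0 for the territories
```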


The four territories were sampled with certainty and assigned to the sampling stratum
for any jurisdiction with 200,001–360,000 registered voters. To calculate an estimated number
of votes comparable to the general population, the territories were weighted separately from the
jurisdictions within the United States. Two of the four territories responded, so their final
weight is 2.00. Weighting for the certainty strata used cutoffs of 250,000, 360,000, and one
million registered voters. Table 9 shows the final weights by sampling stratum.

Table 9.
Final Weights by Sampling Stratum

Sampling Stratum                                                Final Weight
Four territories                                                    2.00
Any jurisdiction with less than 250,000 registered voters(a)        2.20
Any jurisdiction with 250,001–360,000 registered voters             1.89
Any jurisdiction with 360,001–1,000,000 registered voters           2.29
Any jurisdiction with more than 1,000,000 registered voters         2.50
County/City jurisdictions with registered voters
  100,001–200,000                                                   2.47
  75,001–100,000                                                    2.95
  40,001–75,000                                                     2.94
  10,001–40,000                                                     3.47
  5,001–10,000                                                      3.59
  Less than 5,001                                                   4.07
Town/Township/Village jurisdictions with registered voters
  10,001–200,000                                                    8.95
  5,001–10,000                                                      8.88
  Less than 5,001                                                  12.96

(a) This stratum also contains the largest jurisdiction for states or territories with only jurisdictions less than 200,001 registered voters.

Variance Estimation
Analysis of the 2008 LEO data requires a variance estimation procedure that accounts for
the complex sample design. The final step of the weighting process was to define strata for
variance estimation by Taylor series linearization. The 2008 LEO variance estimation strata
correspond to the sampling strata shown in Table 7. Eleven variance estimation strata were
defined for the 2008 LEO.
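For a weighted total, which is a linear statistic, Taylor series linearization under stratified simple random sampling reduces to the textbook stratified variance formula. The sketch below shows that calculation with the finite population correction; it is an illustration of the technique, not DMDC's production estimator, and the strata counts and values are invented.

```python
# Variance of an estimated total under stratified SRS. With one variance
# stratum per sampling stratum, this is what Taylor linearization yields
# for a total. Illustrative only; data below are hypothetical.
import numpy as np

def stratified_total_variance(strata):
    """strata: list of (N_h, y_h), with N_h the stratum frame count and
    y_h the respondent values; applies the finite population correction."""
    var = 0.0
    for N_h, y_h in strata:
        y_h = np.asarray(y_h, dtype=float)
        n_h = y_h.size
        s2_h = y_h.var(ddof=1)                        # sample variance in stratum h
        var += N_h**2 * (1 - n_h / N_h) * s2_h / n_h  # fpc-adjusted contribution
    return var

# Two hypothetical strata of jurisdiction-level values
v = stratified_total_variance([(100, [10.0, 12.0, 8.0, 11.0]),
                               (50,  [3.0, 4.0, 5.0])])
print(round(v, 1))
```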
Location, Completion, and Response Rates
Location, completion, and response rates were calculated in accordance with guidelines
established by The Council of American Survey Research Organizations (CASRO). The
procedure is based on recommendations for Sample Type II response rates (Council of American
Survey Research Organizations, 1982). This definition corresponds to The American
Association for Public Opinion Research (AAPOR) RR3 (AAPOR, 2000), which estimates the
proportion of eligible cases among cases of unknown eligibility.
Location, completion, and response rates were computed for the 2008 LEO as follows.
The location rate (LR) is defined as

    LR = adjusted located sample / adjusted eligible sample = NL / NE.

The completion rate (CR) is defined as

    CR = usable responses / adjusted located sample = NR / NL.

The response rate (RR) is defined as

    RR = usable responses / adjusted eligible sample = NR / NE,

where

• NL = Adjusted located sample
• NE = Adjusted eligible sample
• NR = Usable responses.

To identify the cases that contribute to the components of LR, CR, and RR, the
disposition codes were grouped as shown in Table 10.

Table 10.
Disposition Codes for CASRO Response Rates

Case Disposition Category     Code Value(a)
Eligible Sample               4, 5, 8, 9, 10, 11
Located Sample                4, 5, 8, 9, 11
Eligible Response             4
No Return                     11
Eligibility Determined        2, 3, 4, 5, 8, 9
Self Report Ineligible(b)     2, 3

(a) Code values are from Table 6.
(b) There were no self-report ineligibles for the survey.
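Using the unweighted disposition counts from Table 6 and the groupings in Table 10, the three rates can be computed directly. Note that Table 11 reports weighted rates, so the unweighted values below will not match it.

```python
# Unweighted CASRO-style rates from the Table 6 disposition counts.
# (Table 11 reports weighted rates, so these values differ from it.)

counts = {4: 1376, 5: 136, 8: 117, 9: 3, 10: 397, 11: 566}   # Table 6

eligible = sum(counts[c] for c in (4, 5, 8, 9, 10, 11))  # N_E
located  = sum(counts[c] for c in (4, 5, 8, 9, 11))      # N_L (no postal non-delivery)
usable   = counts[4]                                     # N_R

LR = located / eligible   # location rate
CR = usable / located     # completion rate
RR = usable / eligible    # response rate
print(f"LR={LR:.1%}  CR={CR:.1%}  RR={RR:.1%}")
```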


Ineligibility Rate

The ineligibility rate (IR) is defined as

    IR = self report ineligible cases / eligibility determined cases.

Estimated Ineligible Postal Non-Deliverable/Not Located Rate

The estimated ineligible postal non-deliverable/not located rate (IPNDR) is defined as

    IPNDR = (Eligible Sample − Located Sample) × IR.

Estimated Ineligible Nonresponse

The estimated ineligible nonresponse (EINR) is defined as

    EINR = (Not Returned) × IR.

Adjusted Location Rate

The adjusted location rate (ALR) is defined as

    ALR = (Located Sample − EINR) / (Eligible Sample − IPNDR − EINR).

Adjusted Completion Rate

The adjusted completion rate (ACR) is defined as

    ACR = (Eligible Response) / (Located Sample − EINR).

Adjusted Response Rate

The adjusted response rate (ARR) is defined as

    ARR = (Eligible Response) / (Eligible Sample − IPNDR − EINR).
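The adjusted-rate formulas translate directly into code, as sketched below. Because this survey had no self-reported ineligibles (Table 10, note b), IR = 0 and the adjusted rates collapse to the unadjusted LR, CR, and RR; the counts passed in the example come from Table 6.

```python
# The adjusted CASRO rate formulas as a function. With IR = 0 (no
# self-report ineligibles in this survey) the adjustments are no-ops.

def adjusted_rates(eligible, located, responses, not_returned,
                   self_ineligible, determined):
    IR = self_ineligible / determined         # ineligibility rate
    IPNDR = (eligible - located) * IR         # est. ineligible among not located
    EINR = not_returned * IR                  # est. ineligible among nonrespondents
    ALR = (located - EINR) / (eligible - IPNDR - EINR)
    ACR = responses / (located - EINR)
    ARR = responses / (eligible - IPNDR - EINR)
    return ALR, ACR, ARR

# Table 6 counts: 2,595 eligible, 2,198 located, 1,376 responses,
# 566 not returned, 0 self-report ineligibles, 1,632 eligibility determined
alr, acr, arr = adjusted_rates(2595, 2198, 1376, 566, 0, 1632)
print(f"ALR={alr:.1%}  ACR={acr:.1%}  ARR={arr:.1%}")
```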

Weighted location, completion, and response rates for the full sample and by sampling
stratum for the 2008 LEO are shown in Table 11.


Table 11.
Rates for Full Sample and Stratification Levels

                                    Sample    Usable    Sum of   Location  Completion  Response
Domain                                Size  Responses  Weights   Rate (%)   Rate (%)   Rate (%)
Sample                               2,598     1,376     7,886      81.3       67.8       55.1
Jurisdiction by registered voters
  All with 200,001–360,000             101        50       101      86.1       57.5       49.5
  All with more than 360,000            79        34        79      81.0       53.1       43.0
County/City jurisdiction
with registered voters
  100,001–200,000                      120        59       146      88.3       55.7       49.2
  75,001–100,000                        77        37       109      92.2       52.1       48.1
  40,001–75,000                        194        95       279      87.1       56.2       49.0
  10,001–40,000                        702       364     1,263      86.9       59.7       51.9
  5,001–10,000                         336       174       624      88.7       58.4       51.8
  Less than 5,001                      335       194       791      87.1       66.7       58.1
Town/Township/Village jurisdiction
with registered voters
  10,001–200,000                        72        37       331      83.3       61.7       51.4
  5,001–10,000                          64        34       302      78.1       68.0       53.1
  Less than 5,001                      518       298     3,861      76.0       76.0       57.8

Edit and Imputation Processes

To calculate estimated totals from the survey data, edit and imputation processes were
developed for the items with missing data. Without an edit and imputation process, the
estimated totals would underrepresent the actual totals. The edit process is the inspection of
the collected data prior to statistical analysis; its goal is to verify that the data have the
properties intended by the original design. An imputation process places an estimated answer
into a data field for a record that previously had no data or had incorrect or implausible data.
Edit Process

Two edits were performed prior to statistical analysis. The first edit was specific to
Question 3, the total number of votes for the local jurisdiction, and was performed for every
eligible respondent. When the total number of votes reported for the jurisdiction did not
closely correspond to the expected number of votes used during the sample design, a Web search
was conducted to find the jurisdiction's total number of votes through the FVAP Web
site. Question 3 was used during the imputation process.
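A check of this kind can be sketched as a simple relative-difference flag. The 50% tolerance below is an assumption for illustration only; the report does not state what threshold triggered the Web search.

```python
# Hypothetical sketch of the Question 3 consistency check: flag a
# jurisdiction when its reported vote total strays too far from the
# expected total used at sample design. The tolerance is assumed,
# not taken from the report.

def needs_review(reported: int, expected: int, tolerance: float = 0.5) -> bool:
    """True if reported differs from expected by more than `tolerance`
    as a fraction of the expected total."""
    if expected == 0:
        return reported != 0
    return abs(reported - expected) / expected > tolerance

print(needs_review(12_000, 10_000))  # within tolerance, no follow-up
print(needs_review(40_000, 10_000))  # flagged for a Web search follow-up
```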
The second edit, called the common denominator edit, was used for questions with
multiple parts or sub-items. The questions pertaining to count data had three sub-items:

• Military in the U.S.
• Military overseas (usually designated by an APO/FPO address)
• Civilians overseas

The common denominator edit was performed on all complete and incomplete eligible
cases. When one or more sub-items had valid responses, the missing values for the remaining
sub-items were set to zero.
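The common denominator edit can be sketched as follows; the dictionary keys standing in for the three sub-items are illustrative names, not the survey's actual variable names.

```python
# Common denominator edit: if any sub-item of a count question has a
# valid response, remaining missing sub-items are set to zero. A record
# with all sub-items missing is left untouched (it goes to imputation).

SUB_ITEMS = ["military_us", "military_overseas", "civilians_overseas"]

def common_denominator_edit(record: dict) -> dict:
    if any(record.get(k) is not None for k in SUB_ITEMS):
        for k in SUB_ITEMS:
            if record.get(k) is None:
                record[k] = 0
    return record

edited = common_denominator_edit(
    {"military_us": 25, "military_overseas": None, "civilians_overseas": None})
print(edited)  # the two missing sub-items become 0
```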
Imputation Process

After the edit process, the imputation process started. The imputation process used the 11
sampling strata as the subgroups for the donors. To qualify as a donor, a case had to be a
complete eligible case with no missing data at the data-item level. The recipients were the
cases whose count-data questions had missing values for the three sub-items.
Using a simple random sample, a donor was drawn from the recipient's sampling stratum;
no donor was used more than once. The donor provided a ratio: the donor's value for the data
item needing imputation divided by the donor's total number of votes. The imputed value was
this ratio multiplied by the recipient's total number of votes.
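The donor-ratio step can be sketched as below. The field names are illustrative, the donor pool is reduced to a single case so the draw is deterministic, and rounding the imputed count to a whole number is an assumption the report does not state.

```python
# Sketch of the ratio imputation: a donor is drawn at random (without
# replacement) from complete cases in the recipient's sampling stratum;
# the donor's item/total ratio is applied to the recipient's vote total
# (Question 3). Field names are illustrative; rounding is assumed.
import random

def impute_item(recipient: dict, donors: list, item: str, rng=random) -> dict:
    donor = rng.choice(donors)
    donors.remove(donor)                 # each donor used at most once
    ratio = donor[item] / donor["total_votes"]
    recipient[item] = round(ratio * recipient["total_votes"])
    return recipient

donors = [{"total_votes": 10_000, "military_overseas": 200}]
recipient = {"total_votes": 5_000, "military_overseas": None}
impute_item(recipient, donors, "military_overseas")
print(recipient["military_overseas"])  # donor ratio 0.02 times 5,000 -> 100
```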


References
American Association for Public Opinion Research. (2008). Standard definitions: Final
dispositions of case codes and outcome rates for surveys (5th ed.). Lenexa, KS: AAPOR.
Council of American Survey Research Organizations. (1982). On the definition of response
rates (special report of the CASRO task force on completion rates, Lester R. Frankel, Chair).
Port Jefferson, NY: Author.
DMDC. (In preparation). 2008 Post-Election Voting Survey of Local Election Officials:
Administration, datasets, and codebook (Report No. 2009-052). Arlington, VA: Author.
DMDC. (2009). 2008 Post-Election Voting Survey of Local Election Officials: Tabulations of
responses (Report No. 2009-051). Arlington, VA: Author.


Report Documentation Page (SF 298)

Report Date: 14-08-09. Report Type: Final Report. Dates Covered: October-December 2008.
Title and Subtitle: 2008 Post-Election Survey of Local Election Officials - Statistical
Methodology Report.
Author: Defense Manpower Data Center (DMDC).
Performing Organization: Defense Manpower Data Center, 1600 Wilson Boulevard, Suite 400,
Arlington, VA 22209-2593. Report Number: DMDC Report 2009-053.
Distribution/Availability Statement: Approved for Public Release; distribution unlimited.
Abstract: This report describes sample design, sample selection, weighting, and variance
estimation procedures for the 2008 Post-Election Voting Survey of Local Election Officials. The
first section of this report describes the design and selection of the sample. The second
section provides information on weighting and variance estimation. The final section describes
the calculation of response rates, location rates, and completion rates for the full sample and
for population subgroups.
Subject Terms: Weighting, Response Rates, Sampling Design, Estimation Procedures, Variance
Estimation, UOCAVA, Voting, and Local Election Officials.
Security Classification: U (report, abstract, this page). Limitation of Abstract: UU.
Number of Pages: 30.
Name of Responsible Person: David E. McGrath. Telephone Number: (703) 696-2675.