National Voluntary Laboratory Accreditation Program (NVLAP) Information Collection System
VST - Voting System Standards of 2002 - VSS-2002
OMB: 0693-0003
Voting Systems Performance and Test Standards:
An Overview
This document provides an overview of the Voting System Standards (the “Standards”),
developed by the Federal Election Commission (FEC). This overview serves as a companion document
for understanding and interpreting both Volume I, the performance provisions of the Standards, and
Volume II, the testing specifications.

Background
The program to develop and implement performance and test Standards for electronic voting
equipment is over 25 years old. However, national interest in this program has been renewed as a result
of the 2000 Presidential election.
In 1975, the National Bureau of Standards (now the National Institute of Standards and
Technology) and the Office of Federal Elections (the Office of Election Administration’s predecessor
at the General Accounting Office) produced a joint report, Effective Use of Computing Technology in
Vote Tallying. This report concluded that a basic cause of computer-related election problems was the
lack of appropriate technical skills at the state and local level to develop or implement sophisticated
Standards against which voting system hardware and software could be tested. A subsequent
Congressionally-authorized study produced by the FEC and the National Bureau of Standards cited a
significant number of technical and managerial problems affecting the integrity of the vote counting
process. The report detailed the need for a federal agency to develop national performance Standards that
could be used as a tool by state and local election officials in the testing, certification, and procurement of
computer-based voting systems.
In 1984, Congress appropriated funds for the FEC to develop voluntary national Standards for
computer-based voting systems. During this developmental period more than 130 participants, including
state and local election officials, independent technical experts, election system vendors, Congressional
staff, and other interested parties, attended numerous public hearings and reviewed the proposed criteria
for the draft Standards. Prior to final issuance, the FEC published the draft Standards in the Federal
Register and requested that all interested parties submit formal comments. After reviewing all responses
and incorporating corrections and suitable suggestions, the FEC formally approved the Performance and
Test Standards for Punchcard, Marksense and Direct Recording Electronic Voting Systems [1] in January
1990.

[1] This document is generally referred to as the Voting Systems Standards.


The national testing effort is overseen by the National Association of State Election Directors’
(NASED) Voting Systems Board, which is composed of election officials and independent technical
advisors (see attachment).[2] NASED has established a process for vendors to submit their equipment to
an Independent Test Authority (ITA) for evaluation against the Standards. To date, Wyle Laboratories,
Inc., CIBER, Inc., and SysTest Labs are certified by NASED to serve as program ITAs for the testing of
hardware and the examination of software.[3]
Since NASED’s testing program was initiated in 1994, more than 30 voting systems or
components of voting systems have gone through the NASED testing and qualification process. In
addition, many systems have subsequently been certified at the state level using the Standards in
conjunction with functional and technical requirements developed by state and local policymakers to
address the specific needs of their jurisdictions.
As the qualification process matured and as qualified systems were used in the field, the Voting
Systems Board, in consultation with the ITAs, was able to identify certain testing issues that needed to be
resolved. Moreover, rapid advancements in information and personal computer technologies have
introduced new voting system development and implementation scenarios not contemplated by the 1990
Standards.
In 1997, NASED briefed the FEC on the necessity for continued FEC involvement, citing the
importance of keeping the Standards current so that they reflect the modern and emerging technologies
employed by voting system vendors. Following a Requirements Analysis released in 1999, the
Commission authorized the Office of Election Administration to revise the Standards to reflect the
contemporary needs of the elections community.

Issues Addressed by the Revised Standards
The primary goal of the Standards is to provide a mechanism for state and local election officials
to assure the public of the integrity of computer-based election systems; this has remained unchanged
since 1990. However, the methods for achieving this goal have broadened over the last decade.
The revised Standards provide a common set of requirements across all voting technologies,
using technology-specific requirements only where essential to address the specified technology’s impact
on voting accuracy, integrity, and reliability. The original Standards classified systems as either
Punchcard and Marksense (P&M) or Direct Recording Electronic (DRE) and defined separate Standards
for each technology. The new document revises this terminology to specify standards for two separate
categories: paper-based voting systems and DRE voting systems.
Paper-based systems encompass both punchcards and optically scanned ballots. Electronic
systems include a broad range of DRE systems, such as those that use touch screens and/or keyboards to
record votes. In addition, voting systems that use electronic ballots and transmit official vote data from
the polling place to another location over a public network are now designated as Public Network DRE
Voting Systems and are subject to the standards applicable to other DRE systems, and to requirements
specific to systems that use public network telecommunications.

[2] The FEC’s Director of the Office of Election Administration and representatives from IEEE, Wyle Laboratories,
SysTest, and CIBER serve as ex-officio members.
[3] NASED also continues to encourage other qualified testing facilities to request certification as Independent Test
Authorities.


Revised Performance Features
The revised Standards provide new or expanded coverage of the following functional and
technical system capabilities:
• Election Management Functions: Performance requirements are specified for components that
define, develop and maintain election databases; perform election definition and setup functions;
format ballots; count votes; consolidate and report results; and maintain audit trails.

• Feedback to Voter: Performance requirements are defined for DRE systems and for paper-based
precinct-count systems in order to provide direct feedback to the voter that indicates when an
undervote or overvote is detected.

• Accessibility: Performance requirements are defined for voting systems so that a system can
meet the specific needs of voters with disabilities. These requirements were developed by the
Access Board, a federal agency responsible for developing accessibility standards. The
requirements are based on the accessibility standards for electronic and information technology
established in 36 CFR Part 1194 - Electronic and Information Technology Accessibility
Standards, which implement Section 508 of the Rehabilitation Act Amendments of 1998. The
requirements provide common standards that must be met by all voting devices claiming
accessibility and specific standards related to various types of DRE voting systems.

• Audit Trails: Performance requirements for audit trails are strengthened to address the full range
of election management functions, including such functions as ballot definition and election
programming.

• Telecommunications: Performance requirements are defined for hardware and software
components of voting systems that transmit voting-related information using public
telecommunications components. These requirements apply to systems where data is carried
between devices at a single site, and to systems where data is carried between devices in two
geographically distinct locations. Systems must be designed to provide the secure transfer of
many distinct types of vote data, including lists of eligible voters, voter authentication
information, ballot definition information, and vote transmission and tabulation information. Due
to the limits of existing technology to prevent unauthorized use of data, the Standards include
some blanket prohibitions against the communication or transfer of certain types of data via
telecommunications under any circumstances.

• Broadcasting of Unofficial Results: Performance requirements are defined for the content and
labeling of data provided to the media and other organizations (in reports, data files, or postings
to official Web sites) prior to the canvass and certification of election results.

Revised Test Features
The revised Standards also provide a restructured and expanded description of the tests performed
by ITAs:
• Expanded Testing Standards: Additional tests are defined to address the expanded functional
and technical requirements for voting systems.

• Stages in the Test Process: The test process is redefined in terms of pre-testing, testing, and
post-testing activities.


• Distinction Between Initial Tests and Testing of Modifications to Previously Tested
Systems: A voting system remains qualified as long as no modifications are made. Any changes
to a system must be submitted to the appropriate ITA. The proper course of action for evaluating
the implications of a modification, including the possibility of requiring additional testing,
depends on the nature of the changes made by the vendor. Some criteria for determining the
scope of testing for modifications are defined in the Standards, but the ITA has full discretion to
evaluate these criteria against the modifications made to the system.

• Documentation Submitted by Vendors: The description of documentation provided by vendors
as part of the Technical Data Package (TDP) is refined to support the collection of all information
required by the ITAs to conduct the expanded testing.

Revised Organizational Features
The Standards have been reorganized and edited to better suit the needs of different user groups
and to improve readability. These changes include:
• Multiple Volumes: While the original Standards were published as a single document, the
revision is divided into two distinct volumes. Volume I, Voting System Performance Standards,
provides an introduction to the Standards. It describes the functional and technical requirements
for voting systems, and provides a summary of the ITA’s testing process. This volume is intended
for a general audience including the public, the press, state and local election officials, and
prospective vendors, as well as the ITAs and current vendors already familiar with the Standards
and the testing process. Volume II, Voting System Test Standards, is written specifically for
jurisdictions purchasing a new system, vendors, and ITAs. This volume provides details of the
test process, including the information to be submitted by the vendor to support testing, the
development of test plans by the ITAs for initial system testing, the testing of modifications to the
system, the conduct of system qualification tests by the ITAs, and the test reports generated by
the ITAs.

• Standards, Guidelines and Fundamental System Development Techniques: The revised
Standards clearly identify individual elements as mandatory requirements or recommended
guidelines. Such requirements are designated in the Standards by the term “shall.” The Standards
no longer provide descriptions of basic professional system development and management
techniques, which were included in the 1990 version of the Standards. However, they do provide
references to common industry practices, and require vendors to submit documentation of their
processes for some topics, such as quality assurance and configuration management.

• Human Interface and Usability Standards: Recent controversy over the design of the
Presidential ballot in certain jurisdictions has highlighted the importance of ballot design and
system usability for both election officials and the general public. Human interface and usability
issues are addressed in Appendix C to Volume I. This appendix provides guidelines to vendors
and election officials to aid in the design and procurement of systems that are easy for the
general public to use. Additionally, the FEC has begun developing testable human factors
standards that will be incorporated into the Standards upon completion.

• Inclusion of Selected Test Procedure Details: Volume II of the Standards specifies the
procedures for certain hardware tests of voting devices and vote counting devices. However,
many tests of the hardware and software in a voting system cannot be developed without
examining the design and configuration of the specific system seeking qualification. Because of
this, the Standards give the ITAs wide latitude to develop and perform appropriate tests to fully
evaluate a system against the Standards.

Issues Not Addressed by the Revised Standards
The revised Standards do not provide sufficient guidance on a number of important
issues. Some of these issues are outside the scope of the Standards, some are only partially addressed by
the Standards, and some will be addressed in future modules of the Standards. These issues include:

• Administrative Functions: The revised Standards do not address administrative and managerial
practices outside the direct control of the vendor. Election officials have long recognized that
adequate Standards and test criteria are only part of the formula for ensuring that votes are cast
and counted in an accurate manner. The other key component, often overlooked in the rush to
embrace technological solutions to election problems, is efficient and consistent administration
and management. Effective administration at the local level requires the adoption and
implementation of consistent and effective procedures for acquiring, securing, operating and
maintaining a voting system. Although the Standards mandate that vendors document many
components of optimal managerial practices, the execution of such procedures is not included in
a Standards document that focuses on the system itself.

• Integration with the Voter Registration Database: Local and statewide automated voter
registration databases have become more common in recent years as election officials throughout
the country attempt to harness innovations in network computing to address the needs of
increasingly complex voter registration information requirements. In some instances, a voter
registration database will contain many data fields common to other election administration
applications. These applications include campaign finance recording, election worker
management, and the reporting of election results. Although many of these applications are
co-dependent, the testing of the design and interface between the voting system and the voter
registration database has been specifically excluded from this update of the Standards for
practical reasons. First, because there is such a variety of databases and interfaces in use
among the various states and within the localities of each individual state, there is no practical
and systematic way to test a voting system against all possible combinations and configurations.
Second, many of the voting systems being used today still do not include an electronic interface
with the voter registration database. At such time as the majority of voting systems and voter
registration databases become more seamlessly integrated, a module will be added to the
Standards covering their performance, functionality, and testing.

• Commercial Off-the-Shelf (COTS) Products: Some voting systems use one or more readily
available COTS hardware devices (such as card readers, printers, or personal computers) or
software products (such as operating systems, programming language compilers, or database
management systems). These devices and software are exempted from certain portions of the
qualification testing process so long as such products are not modified in any manner for use in a
voting system.

• Internet Voting: A recent report [4] conducted by the Internet Policy Institute and sponsored by the
National Science Foundation in cooperation with the University of Maryland stated:

“Remote Internet voting systems pose significant risk to the integrity of the
voting process and should not be fielded for use in public elections until
substantial technical and social science issues have been addressed. The
security risks associated with these systems are both numerous and
pervasive and, in many cases, cannot be resolved using even today’s
most sophisticated technology.”

[4] “Report of the National Workshop on Internet Voting: Issues and Research Agenda,” Internet Policy Institute,
March 2001.
The findings of this and other studies on Internet voting have led the FEC and NASED to
conclude that controls cannot be developed at the present time to make remote Internet voting
sufficiently risk-resistant to be confidently used by election officials and the voting public.
Therefore, Standards cannot be written at this time for the testing and qualification of these
systems for widespread use in general elections. However, the Standards do not prohibit the
development and use of these systems for special populations, such as military and civilian
government employees based outside the United States. In addition to Federal Voting Assistance
Program use of Internet voting, States are encouraged to conduct pilot tests and demonstration
projects in accordance with applicable state regulations.
The Standards contemplate the development of systems that integrate public telecommunications
networks other than the Internet at the poll site setting. These voting systems are considered
public network direct recording electronic (DRE) voting systems and must meet the same revised
Standards for security, accuracy, and reliability as other similarly defined voting systems. Such
systems must additionally meet requirements specific to systems that integrate certain
telecommunications components.
• Human Error Rate vs. System Error Rate: In the Standards, the term “error rate” applies to
errors introduced by the system, not by a voter’s actions, such as the failure to mark a ballot in
accordance with instructions. The updated accuracy standard is defined as a ballot position error
rate. The error rate applies to specific system functions, such as recording a vote, storing a vote,
and consolidating votes into vote totals. Each location on a paper ballot card or electronic ballot
image where a vote may be entered represents a ballot position. The Standards set two error
rates:
1. Target error rate: a maximum of one error in 10,000,000 ballot positions; and
2. Testing error rate: a maximum acceptable rate in the test process of one error in
500,000 ballot positions.
This system error rate applies to data that is entered into the system in conformance with the
applicable instructions and specifications. Further research on human interface and usability
issues is needed to enable the development of Standards for error rates that account for human
error.
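
For illustration only, the arithmetic behind these two thresholds can be sketched in a few lines of
Python. The function name and the bare pass/fail comparisons below are assumptions made for this
example; the actual acceptance procedure is the one defined in the Standards’ test specifications.

    # Illustrative sketch of the ballot position error rate arithmetic.
    # The names and simple threshold comparisons are hypothetical.
    TARGET_RATE = 1 / 10_000_000  # one error per 10,000,000 ballot positions
    TESTING_RATE = 1 / 500_000    # maximum acceptable rate during testing

    def ballot_position_error_rate(errors: int, positions: int) -> float:
        """System-introduced errors divided by ballot positions processed."""
        if positions <= 0:
            raise ValueError("at least one ballot position must be processed")
        return errors / positions

    # Example: one error observed while processing 1,500,000 ballot positions.
    rate = ballot_position_error_rate(1, 1_500_000)
    print(f"observed rate: {rate:.2e}")  # 6.67e-07
    print(rate <= TESTING_RATE)          # True:  6.67e-07 <= 2.00e-06
    print(rate <= TARGET_RATE)           # False: 6.67e-07 >  1.00e-07

The example simply compares an observed rate against both thresholds; in the Standards, the
testing error rate is the one applied to results observed during the qualification test process.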

Summary of Content of Volume I
Volume I contains performance standards for electronic components of voting systems. In
addition to containing a glossary (Appendix A) and applicable references (Appendix B), Volume I is
divided into nine sections:
• Section 1 - Introduction: This section provides an introduction to the Standards, addressing the
following topics:
• Objectives and usage of the Standards;
• Development history for initial Standards;
• Update of the Standards;
• Accessibility for individuals with disabilities;
• Definitions of key terms;
• Application of the Standards and test specifications; and
• Outline of contents.

• Section 2 - Functional Capabilities: This section contains Standards detailing the functional
capabilities required of a voting system: it sets out precisely what a voting system is required to
do, and it sets forth the minimum actions a voting system must be able to perform to be eligible
for qualification. For organizational purposes, functional capabilities are categorized by the phase
of election activity in which they are required:

• Overall Capabilities: These functional capabilities apply throughout the election
process. They include security, accuracy, integrity, system auditability, election
management system, vote tabulation, ballot counters, telecommunications, and data
retention.

• Pre-voting Capabilities: These functional capabilities are used to prepare the voting
system for voting. They include ballot preparation, the preparation of election-specific
software (including firmware), the production of ballots or ballot pages, the
installation of ballots and ballot counting software (including firmware), and system
and equipment tests.

• Voting Capabilities: These functional capabilities include all operations conducted at
the polling place by voters and officials, including the generation of status messages.

• Post-voting Capabilities: These functional capabilities apply after all votes have been
cast. They include closing the polling place; obtaining reports by voting machine,
polling place, and precinct; obtaining consolidated reports; and obtaining reports of
audit trails.

• Maintenance, Transportation and Storage Capabilities: These capabilities are
necessary to maintain, transport, and store voting system equipment.

For each functional capability, common standards are specified. In recognition of the diversity of
voting systems, some of the standards have additional requirements that apply only if the system
incorporates certain functions (for example, voting systems employing telecommunications to
transmit voting data) or configurations (for example, a central count component). Where
system-specific standards are appropriate, common standards are followed by standards applicable
to specific technologies (i.e., paper-based or DRE) or intended use (i.e., central or precinct count).

The requirement that voting systems provide access to individuals with disabilities is one of the
most significant additions to the Standards. The FEC has incorporated specifications that were
developed by the Access Board and are based on the accessibility Standards for electronic and
information technology established in 36 CFR Part 1194 - Electronic and Information
Technology Accessibility Standards, which implement Section 508 of the Rehabilitation Act
Amendments of 1998.

• Section 3 - Hardware Standards: This section describes the performance requirements, physical
characteristics, and design, construction, and maintenance characteristics of the hardware and
related components of a voting system. This section focuses on a broad range of devices used in
the design and manufacture of voting systems, such as:

• For paper ballots: printers, cards, boxes, transfer boxes, and readers;
• For electronic systems: ballot displays, ballot recorders, precinct vote control units;
• For voting devices: punching and marking devices and electronic recording devices;
• Voting booths and enclosures;
• Equipment used to prepare ballots, program elections, consolidate and report votes,
and perform other elections management activities;
• Fixed servers and removable electronic data storage media; and
• Printers.

The Standards specify the minimum values for the relevant attributes of hardware, such as:

• Accuracy;
• Reliability;
• Stability under normal environmental operating conditions and when equipment is in
storage and transit;
• Power requirements and ability to respond to interruptions of power supply;
• Susceptibility to interference from static electricity and magnetic fields;
• Product marking; and
• Safety.

• Section 4 - Software Standards: This section describes the design and performance
characteristics of the software embodied in voting systems, addressing both system-level software
and voting system application software, whether COTS or proprietary. The requirements of this
section are intended to ensure that the overall objectives of accuracy, logical correctness, privacy,
system integrity, and reliability are achieved. Although this section emphasizes software, the
software standards may influence hardware design in some voting systems.

The requirements of this section apply to all software developed for use in voting systems,
including:

• Software provided by the voting system vendor and its component suppliers; and
• Software furnished by an external provider where the software is potentially used in
any way during voting system operation.

The general standards in this section apply to software used to support the broad range of voting
system activities, including pre-voting, voting and post-voting activities. System-specific
Standards are defined for ballot counting, vote processing, the creation of an unalterable audit
trail, and the generation of output reports and files. Voting system software is also subject to the
security requirements of Section 6.
• Section 5 - Telecommunications Standards: This section describes the requirements for the
telecommunications components of voting systems. Additionally, it defines the acceptable levels
of performance against these characteristics. For the purpose of the Standards,
telecommunications is defined as the capability to transmit and receive data electronically,
regardless of whether the transmission is localized within the polling place or the data is
transmitted to a geographically distinct location. The requirements in this section represent
functional and performance requirements for the transmission of data that is used to operate the
system and report official election results. Where applicable, this section specifies minimum
values for critical performance and functional attributes involving telecommunications hardware
and software components.

This section addresses telecommunications hardware and software across a broad range of
technologies, such as dial-up communications technologies, high-speed telecommunications lines
(public and private), cabling technologies, communications routers, modems, modem drivers,
channel service units (CSU)/data service units (DSU), and dial-up networking applications
software.

Additionally, this section applies to voting-related transmissions over public networks, such as
those provided by regional telephone companies and long distance carriers. This section also
applies to private networks, regardless of whether the network is owned and operated by the
election jurisdiction. For systems that transmit data over public networks, this section applies to
telecommunications components installed and operated at settings supervised by election
officials, such as polling places or central offices.

• Section 6 - Security Standards: This section describes the essential security capabilities for a
voting system, encompassing the system’s hardware, software, communications, and
documentation. The requirements of this section recognize that no predefined set of security
Standards will address and defeat all conceivable or theoretical threats. However, the Standards
articulate requirements to achieve acceptable levels of integrity, reliability, and inviolability.
Ultimately, the objectives of the security Standards for voting systems are to:

• Establish and maintain controls that can ensure that accidents, inadvertent mistakes,
and errors are minimized;
• Protect the system from intentional manipulation and fraud;
• Protect the system from malicious mischief;
• Identify fraudulent or erroneous changes to the system; and
• Protect secrecy in the voting process.

These Standards are intended to address a broad range of risks to the integrity of a voting system.
While it is not possible to identify all potential risks, the Standards identify several types of risk
that must be addressed, including:

• Unauthorized changes to system capabilities for defining ballot formats, casting and
recording votes, calculating vote totals consistent with defined ballot formats, and
reporting vote totals;
• Alteration of voting system audit trails;
• Altering a legitimately cast vote;
• Preventing the recording of a legitimately cast vote;
• Introducing data for a vote not cast by a registered voter;
• Changing calculated vote totals;
• Access to vote data, including individual votes and vote totals, by unauthorized
individuals; and
• Access to voter identification data and to data for votes cast by the voter such that an
individual can determine the content of specific votes cast by the voter.

• Section 7 - Quality Assurance: In the Standards, quality assurance is a vendor function with
associated practices that confirms, throughout the system development and maintenance life-cycle,
that a voting system conforms with the Standards and other requirements of state and local
jurisdictions. Quality assurance focuses on building quality into a system and reducing
dependence on system tests at the end of the life-cycle to detect deficiencies.

This section describes the responsibilities of the voting system vendor for designing and
implementing a quality assurance program to ensure that the design, workmanship, and
performance requirements of the Standards are achieved in all delivered systems and components.
These responsibilities include:

• Development of procedures for identifying and procuring parts and raw materials of
the requisite quality, and for their inspection, acceptance, and control;
• Documentation of hardware and software development processes;
• Identification and enforcement of all requirements for in-process inspection and
testing that the manufacturer deems necessary to ensure proper fabrication and
assembly of hardware, as well as installation and operation of software or firmware; and
• Procedures for maintaining all data and records required to document and verify the
quality inspections and tests.

• Section 8 - Configuration Management: This section contains specific requirements for
configuration management of voting systems. For the purposes of the Standards, configuration
management is defined as a set of activities and associated practices that assures full knowledge
and control of the components of a system, beginning with its initial development, progressing
throughout its development and construction, and continuing with its ongoing maintenance and
enhancement. This section describes activities in terms of their purpose and outcomes. It does
not describe specific procedures or steps to be employed to accomplish them; these are left to the
vendor to select.

The requirements of this section address a broad set of record keeping, audit, and reporting
activities that include:

• Identifying discrete system components;
• Creating records of formal baselines of all components;
• Creating records of later versions of components;
• Controlling changes made to the system and its components;
• Submitting new versions of the system to ITAs;
• Releasing new versions of the system to customers;
• Auditing the system, including its documentation, against configuration management
records;
• Controlling interfaces to other systems; and
• Identifying tools used to build and maintain the system.

Vendors are required to submit documentation of these procedures to the ITA as part of the
Technical Data Package for system qualification testing. Additionally, as articulated in state or
local election laws, regulations, or contractual agreements with vendors, authorized election
officials or their representatives reserve the right to inspect vendor facilities and operations to
determine conformance with the vendor’s reported configuration management procedures. (An
illustrative sketch of one possible record format follows below.)
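
The sketch below shows one possible shape for a configuration baseline record of the kind
described above. The Standards deliberately leave record formats and procedures to the vendor,
so every field name here is an assumption made for this example, not a format the Standards
prescribe.

    # Hypothetical configuration baseline record. The Standards do not
    # prescribe this (or any) format; field names are illustrative only.
    from dataclasses import dataclass
    from datetime import date

    @dataclass(frozen=True)
    class BaselineRecord:
        component_id: str   # discrete system component being tracked
        version: str        # version promoted into the formal baseline
        checksum: str       # fingerprint used when auditing against records
        baselined_on: date  # date this version entered the baseline
        released: bool      # whether this version was released to customers

    record = BaselineRecord(
        component_id="ballot-counter-firmware",
        version="2.1.0",
        checksum="sha256:0d5c9e",
        baselined_on=date(2002, 1, 15),
        released=False,
    )
    print(record.component_id, record.version)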
• Section 9 - Overview of Qualification Tests: This section provides an overview of the
qualification testing of voting systems. Qualification testing is the process by which a voting
system is shown to comply with the requirements of the Standards and the requirements of its
own design and performance specifications. The testing also evaluates the completeness of the
vendor’s developmental test program, including the sufficiency of vendor tests conducted to
demonstrate compliance with stated system design and performance specifications, and the
vendor’s documented quality assurance and configuration management practices.

The qualification test process is intended to discover errors that, should they occur in actual
election use, could result in failure to complete election operations in a satisfactory manner. This
section describes the scope of qualification testing, its applicability to voting system components,
the documentation that must be submitted by the vendor, and the flow of the test process. This
section also describes differences between the test process for initial qualification testing of a
system and the testing for modifications and re-qualification after a qualified system has been
modified.

Since 1994, the testing described in this section has been performed by an ITA that is certified by
NASED. The testing may be conducted by one or more ITAs for a given system, depending on
the nature of the tests to be conducted and the expertise of the certified ITA. The testing process
involves the assessment of:

• Absolute correctness of all ballot processing software, for which no margin for error
exists;
• Operational accuracy in the recording and processing of voting data, as measured by
the error rate articulated in Volume I, Section 3;
• Operational failure, or the number of unrecoverable failures under conditions
simulating the intended storage, operation, transportation, and maintenance
environments for voting systems, using an actual time-based period of processing test
ballots;
• System performance and function under normal and abnormal conditions; and
• Completeness and accuracy of the system documentation and configuration
management records, to enable purchasing jurisdictions to effectively install, test, and
operate the system.

Summary of Volume II Content

• Section 1 - Introduction: This section provides an overview of Volume II, addressing the
following topics:
• The objectives of Volume II;
• The general contents of Volume II;
• The qualification testing focus;
• The qualification testing sequence;
• The evolution of testing; and
• The outline of contents.

• Section 2 - Technical Data Package: This section contains a description of the vendor
documentation relating to the voting system that shall be submitted with the system as a
precondition for qualification testing. These items are necessary to define the product and its
method of operation; to provide the vendor’s technical and test data supporting its claims of
the system’s functional capabilities and performance levels; and to document instructions and
procedures governing system operation and field maintenance.

The Technical Data Package (TDP) must contain a complete description of the following
information about the system:

• Overall system design, including subsystems, modules, and interfaces;
• Specific functional capabilities;
• Performance and design specifications;
• Design constraints and compatibility requirements;
• Personnel, equipment, and facilities necessary for system operation, maintenance,
and logistical support;
• Vendor practices for assuring system quality during the system’s development and
subsequent maintenance; and
• Vendor practices for managing the configuration of the system during development
and for modifications to the system throughout its life-cycle.

• Section 3 - Functionality Testing: This section contains a description of the testing to be
performed by the ITA to confirm the functional capabilities of a voting system submitted for
qualification testing. It describes the scope and basis for functional testing and the general
sequence of tests within the overall test process, and provides guidance on testing for
accessibility. It also discusses testing the functionality of systems that operate on personal
computers.

• Section 4 - Hardware Testing: This section contains a description of the testing to be performed
by the ITAs to confirm the proper functioning of the hardware components of a voting system
submitted for qualification testing. This section requires ITAs to design and perform procedures
that test the voting system hardware under both operating and non-operating environmental
conditions.

Hardware testing begins with non-operating tests that require the use of an environmental test
facility. These are followed by operating tests that are performed partly in an environmental
facility and partly in a standard test laboratory or shop environment. The non-operating tests are
intended to evaluate the ability of the system hardware to withstand exposure to various
environmental conditions incidental to voting system storage, maintenance, and transportation.
The procedures are based on test methods contained in Military Standard (MIL-STD) 810D,
modified where appropriate, and include such tests as bench handling, vibration, low and high
temperature, and humidity.

The operating tests involve running the system for an extended period of time under varying
temperatures and voltages. This ensures that the hardware meets or exceeds the minimum
requirements for reliability, data reading, and processing accuracy contained in Section 3 of
Volume I. Although the procedure emphasizes equipment operability and data accuracy, it is not
an exhaustive evaluation of all system functions. Moreover, the severity of the test conditions has
in most cases been reduced from that specified in the Military Standards to reflect commercial,
rather than military, practice.

• Section 5 - Software Testing: This section contains a description of the testing to be performed
by the ITAs to confirm the proper functioning of the software components of a voting system
submitted for qualification testing. It describes the scope and basis for software testing, the initial
review of documentation to support software testing, and the review of voting system source
code.

The software qualification tests encompass a number of interrelated examinations, including a
selective review of the source code for conformance with the vendor’s stated standards and with
other system documentation provided by the vendor. The code inspection is complemented by a
series of functional tests to verify the proper performance of all system functions controlled by
the software.

• Section 6 - System Level Integration Testing: This section contains a description of the testing
conducted by the ITAs to confirm the proper functioning of the fully integrated components of a
voting system submitted for qualification testing. It describes the scope and basis for integration
testing; testing of internal and external system interfaces; testing of security capabilities; testing of
accessibility features; and the configuration audits, including the evaluation of claims made in the
system documentation.

System-level qualification tests address the integrated operation of hardware, software, and
telecommunications capabilities (where applicable) to assess the system’s response to a range of
both normal and abnormal conditions in an attempt to compromise the system.
• Section 7 - Examination of Vendor Practices for Configuration Management and Quality
Assurance: This section contains a description of examinations conducted by the ITAs to
evaluate the extent to which vendors meet the requirements for configuration management and
quality assurance. It describes the scope and basis for the examinations and the general sequence
of the examinations. It also provides guidance on the substantive focus of the examinations.
In reviewing configuration management practices, the ITAs examine the vendor’s:
• configuration management policy;
• configuration identification policy;
• baseline, promotion and demotion procedures;
• configuration control procedures;
• release process and procedures; and
• configuration audit procedures.
In reviewing quality assurance practices, the ITAs examine the vendor’s:
• quality assurance policy;
• parts and materials tests and examinations;
• quality conformance plans, procedures and inspection results; and
• voting system documentation.

Conclusion
Almost eighty percent of the States have adopted the Standards. The Commission recommends
that individual States continue to decide how best to adopt and implement the Standards to aid in the
procurement of electronic voting systems. States are also encouraged to develop and implement
individual certification processes to make sure that qualified voting systems can meet the unique and
particular demands of the purchasing jurisdiction.
As a whole, implementation of the original Standards, combined with NASED’s national testing
program, has allowed election officials to be more confident than ever that the voting systems they
procure will work accurately and reliably. Although the requirements for voting systems and the
technologies used to build them have evolved over the past decade, the revised Standards close the
gaps in system performance and testing requirements. To prevent technology gaps in the
future, the FEC and NASED are committed to making the Standards a living document capable of being
updated in an expedited manner in response to constantly evolving technology. Such technological
innovation should be embraced in order to maintain a sophisticated and robust voting systems industry.


NATIONAL ASSOCIATION OF STATE ELECTION DIRECTORS
VOTING SYSTEMS BOARD
Thomas R. Wilkey, Chair
Executive Director
New York State Board of Elections
Albany, New York
Denise Lamb, Vice Chair
Director
State Bureau of Elections
Santa Fe, New Mexico

Brit Williams, Professor Emeritus
CSIS Dept, Kennesaw State College
Kennesaw, Georgia

David Elliott, Asst. Director of Elections
Office of the Secretary of State
Olympia, Washington
Sandy Steinbach, Director of Elections
Office of Secretary of State
Des Moines, Iowa
Donetta Davidson
Secretary of State
Denver, Colorado
Connie Schmidt, Commissioner
Johnson County Election Commission
Olathe, Kansas

Paul Craft, Computer Audit Analyst
Florida State Division of Elections
Tallahassee, Florida
Jay W. Nispel, Senior Principal Engineer
Computer Sciences Corporation
Annapolis Junction, Maryland
Steve Freeman, Software Consultant
League City, Texas
Robert Naegele, President
Granite Creek Technology
Pacific Grove, California

Yvonne Smith (Member Emeritus)
Former Assistant to the Executive Director
Illinois State Board of Elections
Chicago, Illinois

Ex Officios:
Penelope Bonsall, Director
Office of Election Administration
Federal Election Commission
Washington, D.C.
Jim Dearman
Wyle Laboratories
Huntsville, Alabama

Stephen Berger, Chair
IEEE Voting Equipment Standard
Working Group
Georgetown, Texas
Jennifer Price
Shawn Southworth
CIBER, Inc.
Huntsville, Alabama

Committee Secretariat:
The Election Center
R. Doug Lewis, Executive Director
Houston, Texas
Tele: 281-293-0101
Fax: 281-293-0453
email: [email protected]

Carolyn Coggins
SysTest Labs
Denver, Colorado

Volume I, Section 1
Table of Contents

1 Introduction ................................................................................ 1-1
1.1 Objectives and Usage of the Voting System Standards ..................... 1-1
1.2 Development History for Initial Standards ................................ 1-2
1.3 Update of the Standards .................................................. 1-3
1.4 Accessibility for Individuals with Disabilities .......................... 1-3
1.5 Definitions .............................................................. 1-4
1.5.1 Voting System .......................................................... 1-4
1.5.2 Paper-Based Voting System .............................................. 1-5
1.5.3 Direct Record Electronic (DRE) Voting System ........................... 1-5
1.5.4 Public Network Direct Record Electronic (DRE) Voting System ............ 1-6
1.5.5 Precinct Count Voting System ........................................... 1-6
1.5.6 Central Count Voting System ............................................ 1-6
1.6 Application of the Standards and Test Specifications ..................... 1-7
1.6.1 Qualification Tests .................................................... 1-8
1.6.2 Certification Tests .................................................... 1-9
1.6.3 Acceptance Tests ....................................................... 1-10
1.7 Outline of Contents ...................................................... 1-10

Volume I – Section 1

1 Introduction

1.1 Objectives and Usage of the Voting System Standards

State and local officials today are confronted with increasingly complex voting system
technology and an increased risk of voting system failure. Responding to calls for
assistance from the states, the United States Congress authorized the Federal Election
Commission (FEC) to develop voluntary national voting systems standards for
computer-based systems. The resulting FEC Voting System Standards (“the
Standards”) seek to aid state and local election officials in ensuring that new voting
systems are designed to function accurately and reliably, thus ensuring the system’s
integrity. States are free to adopt the Standards in whole or in part. States may also
choose to enact stricter performance requirements for systems used in their
jurisdictions.
The Standards specify minimum functional requirements, performance characteristics,
documentation requirements, and test evaluation criteria. For the most part, the
Standards address what a voting system should reliably do, not how system
components should be configured to meet these requirements. It is not the intent of the
Standards to impede the design and development of new, innovative equipment by
vendors. Furthermore, the Standards balance risk and cost by requiring voting systems
to have essential, but not excessive, capabilities.
The Standards are not intended to define appropriate election administration practices.
However, the total integrity of the election process can only be ensured if
implementation of the Standards is coupled with effective election administration
practices.
The Standards are intended for use by multiple audiences to support their respective
roles in the development, testing, and acquisition of voting systems:
♦ Authorities responsible for the analysis and testing of such systems in support
of qualification and/or certification of systems for purchase within a
designated jurisdiction;
♦ State and local agencies evaluating voting systems to be procured within their
jurisdictions; and


♦ Designers and manufacturers of voting systems.

1.2 Development History for Initial Standards

Much of the groundwork for the Standards’ development was laid by a national study
conducted in 1975 by the National Bureau of Standards, now known as the National
Institute of Standards and Technology (NIST). This study was requested by the
predecessor of the FEC’s Office of Election Administration, the Office of Federal
Elections of the General Accounting Office. The report, “Effective Use of Computing Technology in
Vote-Tallying,” made a number of recommendations bearing directly on the Standards
project. After analyzing computer-related election problems encountered in the past,
the report concluded that one of the basic causes for these difficulties was the lack of
appropriate technical skill at the state and local level for developing or implementing
sophisticated and complex standards against which voting system hardware and
software could be tested.
Following the release of this report, Congress mandated that the FEC, with the
cooperation and assistance of the National Bureau of Standards, study and report on
the feasibility of developing “voluntary engineering and procedural performance
standards for voting systems used in the United States.” (2 U.S.C. §431 Note) The
resulting 1983 study cited a substantial number of technical and managerial problems
that affected the integrity of the vote counting process. It also asserted the need for a
federal agency to develop national performance standards that could be used as a tool
by state and local election officials in the testing, certification, and procurement of
computer-based voting systems. In 1984, Congress approved initial funding for the
Standards.
The FEC held a series of public hearings in developing the initial Standards. State and
local election officials, election system vendors, technical consultants, and others
reviewed drafts of the proposed criteria. The FEC considered their many comments
and made appropriate revisions. Before final issuance, the FEC publicly announced
the availability of the latest draft of the Standards in the Federal Register and
requested that all interested parties submit final comments. The FEC meticulously
reviewed all responses to the notice and incorporated corrections and suitable
suggestions. Ultimately, the final product was the result of considerable deliberation,
close consultation with election officials, and careful consideration of comments from
all interested parties.
In January 1990, the FEC issued the performance standards and testing procedures for
punchcard, marksense, and direct recording electronic (DRE) voting systems. The
Standards did not cover paper ballot and mechanical lever systems because paper
ballots are sufficiently self-explanatory not to require technical standards and
mechanical lever systems are no longer manufactured or sold in the United States. The
FEC also did not incorporate requirements for mainframe computer hardware because
it was reasonable to assume that sufficient engineering and performance criteria


already governed the operation of mainframe computers. However, vote tally software
installed on mainframes is covered by the Standards.

1.3 Update of the Standards

Today, over two-thirds of the States have adopted the Standards in whole or in part.
As a result, the voting systems marketed today are dramatically improved. Election
officials are better assured that the voting systems they procure will work accurately
and reliably. Voting system failures are declining, and now primarily involve
pre-Standards equipment, untested equipment configurations, or the mismanagement of
tested equipment. Overall, systems integrity and the election processes have improved
markedly.
However, advances in voting technology, legislative changes, and the proliferation of
electronic voting systems make an update of the Standards necessary. The industry
has been marked by widespread integration of personal computer technology and
non-mainframe servers into DRE voting systems.
In addition, voting systems need to be responsive to the Americans with Disabilities
Act (ADA) of 1990 and guidelines developed to assist in implementing the ADA.

1.4 Accessibility for Individuals with Disabilities

Voters and election officials who use voting systems represent a broad spectrum of the
population, and include individuals with disabilities who may have difficulty using
traditional voting systems. In developing accessibility provisions for the Standards,
the FEC requested assistance from the Access Board, the federal agency in the
forefront of promulgating accessibility provisions. The Access Board submitted
technical standards designed to meet the diverse needs of voters with a broad range of
disabilities. The FEC has adopted the entirety of the Access Board’s
recommendations and incorporated them into the Standards. These recommendations
comprise the bulk of the accessibility provisions found in Section 2.2.7.
Implementing these provisions, however, will not entirely eliminate the need to
accommodate the needs of some disabled voters by human interface.
The FEC anticipates that during the lifetime of this version of the Standards increased
obligations will be placed upon election officials at every jurisdictional level to
provide voting equipment tailored to meet the needs of voters with disabilities. To
facilitate jurisdictions in meeting accessibility needs, the Standards mandate that every
voting system incorporate some accessible voting capabilities. The Standards also
mandate that systems incorporating a DRE component meet specific technological
requirements. To do so, it is anticipated that a vendor will have to either configure all


of the system’s voting stations to meet the accessibility specifications or will have to
design a unique station that conforms to the accessibility requirements and is part of
the overall voting system configuration.
Under no circumstances should compliance with requirements for accessibility be
viewed as mutually exclusive from compliance with any other provision of the
Standards. If a voting system contains a machine uniquely designed to meet the
accessibility requirements, such a machine will be tested for compliance with the
accessibility requirements, as well as for compliance with all of the DRE standards, in
order to ensure that an accessible machine does not unintentionally abrogate the
mandates of the Standards.

1.5 Definitions

The Standards contain terms describing function, design, documentation, and testing
attributes of equipment and computer programs. Unless otherwise specified, the
intended sense of technical terms is that which is commonly used by the information
technology industry. In some cases terminology is specific to elections or voting
systems, and a glossary of those terms is contained in Appendix A. Non-technical
terms not listed in Appendix A shall be interpreted according to their standard
dictionary definitions.
Additionally, the following terms are defined below:
♦ Voting system;
♦ Paper-based voting system;
♦ Direct record electronic (DRE) voting system;
♦ Public network direct record electronic (DRE) voting systems;
♦ Precinct count voting system; and
♦ Central count voting system.

1.5.1 Voting System

A voting system is a combination of mechanical, electromechanical, or electronic
equipment. It includes the software required to program, control, and support the
equipment that is used to define ballots; to cast and count votes; to report and/or
display election results; and to maintain and produce all audit trail information. A
voting system may also include the transmission of results over telecommunication
networks.


Additionally, a voting system includes the associated documentation used to operate
the system, maintain the system, identify system components and their versions, test
the system during its development and maintenance, maintain records of system errors
and defects, and determine specific changes made after system qualification. By
definition, this includes all documentation required in Section 9.4.
Traditionally, a voting system has been defined by the mechanism the system uses to
cast votes and further categorized by the location where the system tabulates ballots.
However, the Standards recognize that as the industry develops unique solutions to
various challenges and as voting systems become more responsive to the needs of
election officials and voters, the rigid dichotomies between voting system types may
be blurred. Innovations that blend traditional system types can greatly
improve the voting system industry, but only if controls are in place to preserve
integrity through the proper evaluation of the system brought for qualification.
As such, vendors that submit a system that integrates components from more than one
traditional system type, or that includes components not addressed in the
Standards, shall submit the results of all beta tests of the new system. Vendors also shall
submit a proposed test plan to the appropriate independent test authority recognized
by the National Association of State Election Directors (NASED) to conduct national
qualification testing of voting systems. The Standards permit vendors to produce or
utilize interoperable components of a voting system that are tested within the full
voting system configuration.

1.5.2 Paper-Based Voting System

A Paper-Based Voting System (referred to in the initial Standards as a Punchcard and
Marksense [P&M] Voting System) records votes, counts votes, and produces a
tabulation of the vote count from votes cast on paper cards or sheets. A punchcard
voting system allows a voter to record votes by means of holes punched in designated
voting response locations. A marksense voting system allows a voter to record votes
by means of marks made by the voter directly on the ballot, usually in voting response
locations. Additionally, a paper-based system may record votes using other
approaches whereby the voter’s selections are indicated by marks made on a paper
ballot by an electronic input device, as long as such an input device does not
independently record, store, or tabulate the voter’s selections.

1.5.3 Direct Record Electronic (DRE) Voting System

A Direct Record Electronic (DRE) Voting System records votes by means of a ballot
display provided with mechanical or electro-optical components that can be activated
by the voter; that processes data by means of a computer program; and that records
voting data and ballot images in memory components. It produces a tabulation of the
voting data stored in a removable memory component and as printed copy. The
system may also provide a means for transmitting individual ballots or vote totals to a
central location for consolidating and reporting results from precincts at the central
location.

1.5.4 Public Network Direct Record Electronic (DRE) Voting System

A Public Network Direct Record Electronic (DRE) Voting System is an election
system that uses electronic ballots and transmits vote data from the polling place to
another location over a public network as defined in Section 5.1.2. Vote data may be
transmitted as individual ballots as they are cast, periodically as batches of ballots
throughout the election day, or as one batch at the close of voting. For purposes of the
Standards, Public Network DRE Voting Systems are considered a form of DRE
Voting System and are subject to the standards applicable to DRE Voting Systems.
However, because transmitting vote data over public networks relies on equipment
beyond the control of the election authority, the system is subject to additional threats
to system integrity and availability. Therefore, additional requirements discussed in
Sections 5 and 6 apply.
The use of public networks for transmitting vote data must provide the same level of
integrity as other forms of voting systems, and must be accomplished in a manner that
precludes three risks to the election process: automated casting of fraudulent votes,
automated manipulation of vote counts, and disruption of the voting process such that
the system is unavailable to voters during the time period authorized for system use.

1.5.5 Precinct Count Voting System

A Precinct Count Voting System is a voting system that tabulates ballots at the polling
place. These systems typically tabulate ballots as they are cast, and print the results
after the close of polling. For DREs, and for some paper-based systems, these systems
provide electronic storage of the vote count and may transmit results to a central
location over public telecommunication networks.

1.5.6 Central Count Voting System

A Central Count Voting System is a voting system that tabulates ballots from multiple
precincts at a central location. Voted ballots are typically placed into secure storage at
the polling place. Stored ballots are transported or transmitted to a central counting
place. The systems produce a printed report of the vote count, and may produce a
report stored on electronic media.

1.6 Application of the Standards and Test Specifications

The Standards apply to all system hardware, software, telecommunications, and
documentation intended for use to:
♦ Prepare the voting system for use in an election;
♦ Produce the appropriate ballot formats;
♦ Test that the voting system and ballot materials have been properly prepared
and are ready for use;
♦ Record and count votes;
♦ Consolidate and report results;
♦ Display results on-site or remotely; and
♦ Maintain and produce all audit trail information.
In general, the Standards define functional requirements and performance
characteristics that can be assessed by a series of defined tests. Standards are
mandatory requirements and are designated by use of the term “shall.”
Some voting systems use one or more readily available commercial off-the-shelf
(COTS) devices (such as card readers, printers, or personal computers) or software
products (such as operating systems, programming language compilers, or database
management systems). COTS devices and software are exempted from certain
portions of the qualification testing process as defined herein, as long as such products
are not modified for use in a voting system.
Generally, voting systems are subject to the following three testing phases prior to
being purchased or leased:
♦ Qualification tests;
♦ State certification tests; and
♦ State and/or local acceptance tests.

1.6.1 Qualification Tests

Qualification tests validate that a voting system meets the requirements of the
Standards and performs according to the vendor’s specifications for the system. Such
tests encompass the examination of software; the inspection and evaluation of system
documentation; tests of hardware under conditions simulating the intended storage,
operation, transportation, and maintenance environments; operational tests to validate
system performance and function under normal and abnormal conditions; and
examination of the vendor’s system development, testing, quality assurance, and
configuration management practices. Qualification tests address individual system
components or elements, as well as the integrated system as a whole.
Since 1994, qualification tests for voting systems have been performed by
Independent Test Authorities (ITAs) certified by the National Association of State
Election Directors (NASED). NASED has certified an ITA for either the full scope of
qualification testing or a distinct subset of the total scope of testing. To date, ITAs
have been certified only for distinct subsets of testing. Upon the successful
completion of testing by an ITA, the ITA issues a Qualification Test Report to the
vendor and NASED. The qualification test report remains valid for as long as the
voting system remains unchanged.
Upon receipt of test reports that address the full scope of testing, NASED issues a
Qualification Number that indicates the system has been tested by certified ITAs for
compliance with the Standards and qualifies for the certification process of states that
have adopted the Standards. The Qualification Number applies to the system as a
whole, and does not apply to individual system components or untested
configurations.
After a system has completed qualification testing, further examination of a system is
required if modifications are made to hardware, software, or telecommunications,
including the installation of software on different hardware. Vendors request review
of modifications by the appropriate ITA based on the nature and scope of changes
made and the scope of the ITA’s role in NASED qualification. The ITA will
determine the extent to which the modified system should be resubmitted for
qualification testing and the extent of testing to be conducted.
Generally, a voting system remains qualified under the standards against which it was
tested, as long as no modifications not approved by an ITA are made to the system.
However, if a new threat to a particular voting system is discovered, it is the
prerogative of NASED to determine which qualified voting systems are vulnerable,
whether those systems need to be retested, and the specific tests to be conducted. In
addition, when new standards supersede the standards under which the system was
qualified, it is the prerogative of NASED to determine when systems that were
qualified under the earlier standards will lose their qualification, unless they are tested
to meet current standards.

Among other things, qualification testing complements and evaluates the vendor's
developmental testing and beta testing. The ITA is expected to evaluate the
completeness of the vendor's developmental test program, including the sufficiency of
vendor tests conducted to demonstrate compliance with the Standards as well as the
system’s performance specifications. The ITA undertakes sample testing of the
vendor's test modules and also designs independent system-level tests to supplement
and check those designed by the vendor. Although some of the qualification tests are
based on those prescribed in the Military Standards, in most cases the test conditions
are less stringent, reflecting commercial, rather than military, practice.

1.6.2 Certification Tests

Certification tests are performed by individual states, with or without the assistance of
outside consultants, to:
♦ Confirm that the voting system presented is the same as the one qualified
through the Standards;
♦ Test for the proper implementation of state-specific requirements;
♦ Establish a baseline for future evaluations or tests of the system, such as
acceptance testing or state review after modifications have been made; and
♦ Define acceptance tests.
Precise certification test scripts are not included in the Standards, as they must be
defined by the state, with its laws, election practices, and needs in mind. However, it
is recommended that they not duplicate qualification tests, but instead focus on
functional tests and qualitative assessment to ensure that the system operates in a
manner that is acceptable under state law. If a voting system is modified after state
certification, it is recommended that states reevaluate the system to determine whether
further certification testing is warranted.
Certification tests performed by individual states typically rely on information
contained in documentation provided by the vendor for system design, installation,
operations, required facilities and supplies, personnel support and other aspects of the
voting system. States and jurisdictions may define information and documentation
requirements additional to those defined in the Standards. By design, the Standards,
and qualification testing of voting systems for compliance with the Standards, do not
address these additional requirements. However, qualification testing addresses all
capabilities of a voting system stated by the vendor in the system documentation
submitted to an ITA, including additional capabilities that are not required by the
Standards.

1.6.3 Acceptance Tests

Acceptance tests are performed at the state or local jurisdiction level upon system
delivery by the vendor to:
♦ Confirm that the system delivered is the specific system qualified by NASED
and, when applicable, certified by the state;
♦ Evaluate the degree to which delivered units conform to both the system
characteristics specified in the procurement documentation, and those
demonstrated in the qualification and certification tests; and
♦ Establish a baseline for any future required audits of the system.
Some of the operational tests conducted during qualification may be repeated during
acceptance testing.

1.7 Outline of Contents

The organization of the Standards has been simplified to facilitate its use. Volume I,
Voting System Performance Standards, is intended for use by the broadest audience,
including voting system developers, equipment manufacturers and suppliers,
independent test authorities, local agencies that purchase and deploy voting systems,
state organizations that certify a system prior to procurement by a local jurisdiction,
and public interest organizations that have an interest in voting systems and voting
systems standards.
♦ Section 2 describes the functional capabilities required of voting systems.
♦ Sections 3 through 6 describe specific performance standards for election
system hardware, software, telecommunications and security, respectively.
♦ Sections 7 and 8 describe practices for quality assurance and configuration
management, respectively, to be used by vendors, and required information
about vendor practices that will be reviewed in concert with system
qualification and certification test processes and system purchase decisions.
♦ Section 9 provides an overview of the test and measurement process used by
test authorities for qualification and re-qualification of voting systems.
♦ Appendix A provides a glossary of important terms used in Volume I.
♦ Appendix B lists the publications that were used for guidance in the
preparation of the Standards. These publications contain information that is
useful in interpreting and complying with the requirements of the Standards.
♦ Appendix C addresses issues of usability of voting systems, commonly
referred to as “human factors.” This appendix does not represent mandates
that voting systems will be tested against, but rather contains recommendations
and best practices on usability issues, designed to provide vendors and election
officials with guidance on designing and procuring systems that are easy and
intuitive for voters to use.
Volume II, Voting System Qualification Testing Standards, describes the standards for
the technical information submitted by the vendor to support testing; the development
of test plans by the ITA for initial system testing and testing of system modifications;
the conduct of system qualification tests by the ITA; and the test reports generated by
the ITA. This volume complements the content of Volume I and is intended
primarily for use by ITAs, state organizations that certify a system, and vendors.

Volume I, Section 2
Table of Contents

2 Functional Capabilities
  2.1 Scope
  2.2 Overall System Capabilities
    2.2.1 Security
    2.2.2 Accuracy
      2.2.2.1 Common Standards
      2.2.2.2 DRE System Standards
    2.2.3 Error Recovery
    2.2.4 Integrity
      2.2.4.1 Common Standards
      2.2.4.2 DRE Systems Standards
    2.2.5 System Audit
      2.2.5.1 System Audit Purpose and Context
      2.2.5.2 Operational Requirements
        2.2.5.2.1 Time, Sequence, and Preservation of Audit Records
        2.2.5.2.2 Error Messages
        2.2.5.2.3 Status Messages
      2.2.5.3 COTS General Purpose Computer System Requirements
    2.2.6 Election Management System
    2.2.7 Accessibility
      2.2.7.1 Common Standards
      2.2.7.2 DRE Standards
    2.2.8 Vote Tabulating Program
      2.2.8.1 Functions
      2.2.8.2 Voting Variations
    2.2.9 Ballot Counter
    2.2.10 Telecommunications
    2.2.11 Data Retention
  2.3 Pre-voting Functions
    2.3.1 Ballot Preparation
      2.3.1.1 General Capabilities
        2.3.1.1.1 Common Standards
        2.3.1.1.2 Paper-Based System Standards
      2.3.1.2 Ballot Formatting
      2.3.1.3 Ballot Production
        2.3.1.3.1 Common Standards
        2.3.1.3.2 Paper-Based System Standards
    2.3.2 Election Programming
    2.3.3 Ballot and Program Installation and Control
    2.3.4 Readiness Testing
      2.3.4.1 Common Standards
      2.3.4.2 Paper-Based Systems
    2.3.5 Verification at the Polling Place
    2.3.6 Verification at the Central Location
  2.4 Voting Functions
    2.4.1 Opening the Polls
      2.4.1.1 Opening the Polling Place (Precinct Count Systems)
      2.4.1.2 Paper-Based System Standards
        2.4.1.2.1 All Paper-Based Systems
        2.4.1.2.2 Precinct Count Paper-Based Systems
      2.4.1.3 DRE System Standards
    2.4.2 Activating the Ballot (DRE Systems)
    2.4.3 Casting a Ballot
      2.4.3.1 Common Standards
      2.4.3.2 Paper-Based Systems Standards
        2.4.3.2.1 All Paper-Based Systems
        2.4.3.2.2 Precinct Count Paper-Based Systems
      2.4.3.3 DRE Systems Standards
  2.5 Post-Voting Functions
    2.5.1 Closing the Polling Place (Precinct Count)
    2.5.2 Consolidating Vote Data
    2.5.3 Producing Reports
      2.5.3.1 Common Standards
      2.5.3.2 Precinct Count Systems
    2.5.4 Broadcasting Results
  2.6 Maintenance, Transportation, and Storage

2 Functional Capabilities

2.1 Scope

This section contains standards detailing the functional capabilities required of a
voting system. It sets out precisely what a voting system is required to do, and it
establishes the minimum actions a voting system must be able to perform to be
eligible for qualification.
For organizational purposes, functional capabilities are categorized by the phase of
election activity in which they are required:
♦ Overall Capabilities: These functional capabilities apply throughout the
election process. They include security, accuracy, integrity, system
auditability, election management system, vote tabulation, ballot counters,
telecommunications, and data retention.
♦ Pre-voting Capabilities: These functional capabilities are used to prepare the
voting system for voting. They include ballot preparation, the preparation of
election-specific software (including firmware), the production of ballots or
ballot pages, the installation of ballots and ballot counting software (including
firmware), and system and equipment tests.
♦ Voting Capabilities: These functional capabilities include all operations
conducted at the polling place by voters and officials including the generation
of status messages.
♦ Post-voting Capabilities: These functional capabilities apply after all votes
have been cast. They include closing the polling place; obtaining reports by
voting machine, polling place, and precinct; obtaining consolidated reports;
and obtaining reports of audit trails.
♦ Maintenance, Transportation and Storage Capabilities: These capabilities are
necessary to maintain, transport, and store voting system equipment.

In recognition of the diversity of voting systems, the Standards apply specific
requirements to specific technologies. Some of the Standards apply only if the system
incorporates certain optional functions (for example, voting systems employing
telecommunications to transmit voting data). For each functional capability, common
standards are specified. Where necessary, common standards are followed by
standards applicable to specific technologies (i.e., paper-based or DRE) or intended
use (i.e., central or precinct count).

2.2 Overall System Capabilities

This section defines required functional capabilities that are system-wide in nature
and not unique to pre-voting, voting, and post-voting operations. All voting systems
shall provide the following functional capabilities:
♦ Security;
♦ Accuracy;
♦ Error recovery;
♦ Integrity;
♦ System auditability;
♦ Election management system;
♦ Accessibility;
♦ Vote tabulating;
♦ Ballot counters; and
♦ Data Retention.
Voting systems may also include telecommunications components. Technical
standards for these capabilities are described in Sections 3 through 6 of the Standards.

2.2.1 Security

System security is achieved through a combination of technical capabilities and sound
administrative practices. To ensure security, all systems shall:
a. Provide security access controls that limit or detect access to critical system
components to guard against loss of system integrity, availability,
confidentiality, and accountability.
b. Provide system functions that are executable only in the intended manner and
order, and only under the intended conditions.
c. Use the system's control logic to prevent a system function from executing if
any preconditions to the function have not been met.
d. Provide safeguards to protect against tampering during system repair, or
interventions in system operations, in response to system failure.
e. Provide security provisions that are compatible with the procedures and
administrative tasks involved in equipment preparation, testing, and operation.
f. If access to a system function is to be restricted or controlled, the system shall
incorporate a means of implementing this capability.
g. Provide documentation of mandatory administrative procedures for effective
system security.

2.2.2 Accuracy

Memory hardware, such as semiconductor devices and magnetic storage media, must
be accurate. The design of equipment in all voting systems shall provide for the
highest possible levels of protection against mechanical, thermal, and electromagnetic
stresses that impact system accuracy. Section 3 provides additional information on
susceptibility requirements.

2.2.2.1 Common Standards

To ensure vote accuracy, all systems shall:
a. Record the election contests, candidates, and issues exactly as defined by
election officials;
b. Record the appropriate options for casting and recording votes;
c. Record each vote precisely as indicated by the voter and be able to produce an
accurate report of all votes cast;
d. Include control logic and data processing methods incorporating parity and
check-sums (or equivalent error detection and correction methods) to
demonstrate that the system has been designed for accuracy; and
e. Provide software that monitors the overall quality of data read-write and
transfer quality status, checking the number and types of errors that occur in
any of the relevant operations on data and how they were corrected.
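
Items d and e above call for error detection on stored and transferred vote data. For
illustration only (the Standards do not prescribe source code), the following minimal
Python sketch shows a checksum seal of the kind item d contemplates; the function
names and record layout are hypothetical:

    import zlib

    def seal_record(payload: bytes) -> bytes:
        # Append a CRC-32 checksum so corruption of a stored vote record is detectable.
        crc = zlib.crc32(payload)
        return payload + crc.to_bytes(4, "big")

    def verify_record(sealed: bytes) -> bytes:
        # Recompute the checksum on every read; any mismatch indicates a storage error.
        payload, stored = sealed[:-4], int.from_bytes(sealed[-4:], "big")
        if zlib.crc32(payload) != stored:
            raise IOError("stored vote record failed its checksum")
        return payload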

2.2.2.2 DRE System Standards

As an additional means of ensuring accuracy in DRE systems, voting devices shall
record and retain redundant copies of the original ballot image. A ballot image is an
electronic record of all votes cast by the voter, including undervotes.
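
As a non-normative illustration of this redundancy requirement, the sketch below
(Python; the file locations and function name are hypothetical) writes each ballot
image to two independent storage locations and confirms that the retained copies
agree:

    import hashlib, json, pathlib

    def store_ballot_image(ballot: dict, primary: pathlib.Path, backup: pathlib.Path) -> None:
        # Serialize the ballot image (all selections, including undervotes).
        record = json.dumps(ballot, sort_keys=True).encode()
        for location in (primary, backup):      # two independent storage locations
            location.parent.mkdir(parents=True, exist_ok=True)
            location.write_bytes(record)
        # Confirm the redundant copies are identical before accepting the ballot.
        if (hashlib.sha256(primary.read_bytes()).digest()
                != hashlib.sha256(backup.read_bytes()).digest()):
            raise IOError("redundant ballot image copies disagree")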

2.2.3 Error Recovery

To recover from a non-catastrophic failure of a device, or from any error or
malfunction that is within the operator's ability to correct, the system shall provide the
following capabilities:
a. Restoration of the device to the operating condition existing immediately
prior to the error or failure, without loss or corruption of voting data
previously stored in the device;
b. Resumption of normal operation following the correction of a failure in a
memory component, or in a data processing component, including the central
processing unit; and
c. Recovery from any other external condition that causes equipment to become
inoperable, provided that catastrophic electrical or mechanical damage due to
external phenomena has not occurred.
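
One way a design might support item a, restoring a device without loss of previously
stored voting data, is to commit each record to stable storage before acknowledging it
and to rebuild operating state from those records on restart. A minimal sketch,
assuming a hypothetical append-only journal file:

    import json, os

    JOURNAL = "votes.journal"          # hypothetical on-device journal file

    def record_vote(entry: dict) -> None:
        # Append and force the entry to stable storage before acknowledging it,
        # so a later failure cannot lose or corrupt it.
        with open(JOURNAL, "a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")
            f.flush()
            os.fsync(f.fileno())

    def recover_state() -> list:
        # After a non-catastrophic failure, rebuild operating state from the journal.
        if not os.path.exists(JOURNAL):
            return []
        with open(JOURNAL, encoding="utf-8") as f:
            return [json.loads(line) for line in f if line.strip()]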

2.2.4 Integrity

Integrity measures ensure the physical stability and function of the vote recording and
counting processes.

2.2.4.1 Common Standards

To ensure system integrity, all systems shall:
a. Protect, by a means compatible with these Standards, against a single point of
failure that would prevent further voting at the polling place;
b. Protect against the interruption of electronic power;
c. Protect against generated or induced electromagnetic radiation;
d. Protect against ambient temperature and humidity fluctuations;
e. Protect against the failure of any data input or storage device;
f. Protect against any attempt at improper data entry or retrieval;
g. Record and report the date and time of normal and abnormal events;
h. Maintain a permanent record of all original audit data that cannot be
modified or overridden but may be augmented by designated authorized
officials in order to adjust for errors or omissions (e.g., during the canvassing
process);
i. Detect and record every event, including the occurrence of an error condition
that the system cannot overcome, and time-dependent or programmed events
that occur without the intervention of the voter or a polling place operator;
and
j. Include built-in measurement, self-test, and diagnostic software and hardware
for detecting and reporting the system's status and degree of operability.

2.2.4.2 DRE Systems Standards

In addition to the common standards, DRE systems shall:
a. Maintain a record of each ballot cast using a process and storage location that
differs from the main vote detection, interpretation, processing, and reporting
path; and
b. Provide a capability to retrieve ballot images in a form readable by humans.

2.2.5 System Audit

This section describes the context and purpose of voting system audits and sets forth
specific functional requirements. Additional technical audit requirements are set forth
in Section 4.

2.2.5.1 System Audit Purpose and Context

Election audit trails provide the supporting documentation for verifying the
correctness of reported election results. They present a concrete, indestructible
archival record of all system activity related to the vote tally, and are essential for
public confidence in the accuracy of the tally, for recounts, and for evidence in the
event of criminal or civil litigation.

The following audit trail requirements are based on the premise that system-generated
creation and maintenance of audit records reduces the chance of error associated with
manually generated audit records. Because most audit capability is automatic, the
system operator has less information to track and record, and is less likely to make
mistakes or omissions.
The sections that follow present operational requirements critical to acceptable
performance and reconstruction of an election. Requirements for the content of audit
records are described in Section 4 of the Standards.
The requirements for all system types, both precinct and central count, are described
in generic language. Because the actual implementation of specific characteristics may
vary from system to system, it is the responsibility of the vendor to describe each
system's characteristics in sufficient detail that ITAs and system users can evaluate the
adequacy of the system's audit trail. This description shall be incorporated in the
System Operating Manual, which is part of the Technical Data Package (TDP).
Documentation of items such as paper ballots delivered and collected, administrative
procedures for system security, and maintenance performed on voting equipment are
also part of the election audit trail, but are not covered in these technical standards.
Future volumes of the Standards will address these and other system operations
practices. In the interim, useful guidance is provided by the Innovations in Election
Administration #10, Ballot Security and Accountability, available from the FEC’s
Office of Election Administration.

2.2.5.2 Operational Requirements

Audit records shall be prepared for all phases of elections operations performed using
devices controlled by the jurisdiction or its contractors. These records rely upon
automated audit data acquisition and machine-generated reports, with manual input of
some information. These records shall address the ballot preparation and election
definition phase, system readiness tests, and voting and ballot-counting operations.
The software shall activate the logging and reporting of audit data as described in the
following sections.

2.2.5.2.1 Time, Sequence, and Preservation of Audit Records

The timing and sequence of audit record entries is as important as the data contained
in the record. All voting systems shall meet the following requirements for time,
sequence and preservation of audit records:
a. Except where noted, systems shall provide the capability to create and
maintain a real-time audit record. This capability records and provides the
operator or precinct official with continuous updates on machine status. This
information allows effective operator identification of an error condition
requiring intervention, and contributes to the reconstruction of election-related
events necessary for recounts or litigation.
b. All systems shall include a real-time clock as part of the system’s hardware.
The system shall maintain an absolute record of the time and date or a record
relative to some event whose time and date are known and recorded.
c. All audit record entries shall include the time-and-date stamp.
d. The audit record shall be active whenever the system is in an operating mode.
This record shall be available at all times, though it need not be continually
visible.
e. The generation of audit record entries shall not be terminated or altered by
program control, or by the intervention of any person. The physical security
and integrity of the record shall be maintained at all times.
f. Once the system has been activated for any function, the system shall preserve
the contents of the audit record during any interruption of power to the system
until processing and data reporting have been completed.
g. The system shall be capable of printing a copy of the audit record. A separate
printer is not required for the audit record, and the record may be produced on
the standard system printer if all the following conditions are met:
1) The generation of audit trail records does not interfere with the production
of output reports;
2) The entries can be identified so as to facilitate their recognition,
segregation, and retention; and
3) The audit record entries are kept physically secure.
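
For illustration of items b, c, and e (the names are hypothetical; a conforming system
would also protect the record physically and resist termination by program control),
an audit record writer might time-and-date stamp every entry and operate strictly in
append mode:

    import datetime

    class AuditLog:
        def __init__(self, path: str):
            self._path = path

        def append(self, event: str) -> None:
            # Requirement c: every entry carries a time-and-date stamp.
            stamp = datetime.datetime.now().isoformat(timespec="seconds")
            # Open in append mode only; the application never rewrites prior entries.
            with open(self._path, "a", encoding="utf-8") as f:
                f.write(f"{stamp}  {event}\n")

For example, AuditLog("audit.log").append("polls opened") would record the event
with its time and date.
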
2.2.5.2.2 Error Messages

All voting systems shall meet the following requirements for error messages:
a. The system shall generate, store, and report to the user all error messages as
they occur;
b. All error messages requiring intervention by an operator or precinct official
shall be displayed or printed unambiguously in easily understood language
text, or by means of other suitable visual indicators;
c. When the system uses numerical error codes for trained technician
maintenance or repair, the text corresponding to the code shall be
self-contained, or affixed inside the unit device. This is intended to reduce
inappropriate reactions to error conditions, and to allow for ready and
effective problem correction;
d. All error messages for which correction impacts vote recording or vote
processing shall be written in a manner that is understandable to an election
official who possesses training on system use and operation, but does not
possess technical training on system servicing and repair;
e. The message cue for all systems shall clearly state the action to be performed
in the event that voter or operator response is required;
f. System design shall ensure that erroneous responses will not lead to
irreversible error; and
g. Nested error conditions shall be corrected in a controlled sequence such that
system status shall be restored to the initial state existing before the first error
occurred.
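
Items b through d can be illustrated with a simple lookup that resolves numeric
maintenance codes to self-contained text and states the action the operator must take.
The codes and messages below are hypothetical:

    # Hypothetical error table: code -> (plain-language text, action cue for the operator).
    ERROR_TABLE = {
        101: ("Ballot feed jam.", "Open the feed door, remove the ballot, then press RESUME."),
        204: ("Memory pack not detected.", "Reseat the memory pack and restart the device."),
    }

    def report_error(code: int) -> str:
        # Unknown codes still yield understandable text and a clear action cue.
        text, action = ERROR_TABLE.get(code, ("Unknown error.", "Contact election support staff."))
        return f"ERROR {code}: {text} {action}"
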
2.2.5.2.3 Status Messages

The Standards provide latitude in software design so that vendors can consider various
user processing and reporting needs. The jurisdiction may require some status and
information messages to be displayed and reported in real-time. Messages that do not
require operator intervention may be stored in memory to be recovered after ballot
processing has been completed.
The system shall display and report critical status messages using unambiguous
indicators or English language text. The system need not display non-critical status
messages at the time of occurrence. Systems may display non-critical status messages
(i.e., those that do not require operator intervention) by means of numerical codes for
subsequent interpretation and reporting as unambiguous text.
Systems shall provide a capability for the status messages to become part of the
real-time audit record. The system shall provide a capability for a jurisdiction to
designate critical status messages.

2.2.5.3 COTS General Purpose Computer System Requirements

Further requirements must be applied to COTS operating systems to ensure
completeness and integrity of audit data for election software. These operating
systems are capable of executing multiple application programs simultaneously. These
systems include both servers and workstations (or “PCs”), including the many
varieties of UNIX and Linux, and those offered by Microsoft and Apple. Election
software running on these COTS systems is vulnerable to unintended effects from
other user sessions, applications, and utilities, executing on the same platform at the
same time as the election software.
“Simultaneous processes” of concern include unauthorized network connections,
unplanned user logins, and unintended execution or termination of operating system
processes. An unauthorized network connection or unplanned user login can host
unintended processes and user actions, such as the termination of operating system
audit, the termination of election software processes, or the deletion of election
software audit and logging data. An unintended operating system process might be, for
example, a full system scan run at a time when it would adversely affect the election
software processes; an improperly terminated process might be the system audit or
malicious code detection service.
To counter these vulnerabilities, three operating system protections are required on all
such systems on which election software is hosted. First, authentication shall be
configured on the local terminal (display screen and keyboard) and on all external
connection devices (“network cards” and “ports”). This ensures that only authorized
and identified users affect the system while election software is running.
Second, operating system audit shall be enabled for all session openings and closings,
for all connection openings and closings, for all process executions and terminations,
and for the alteration or deletion of any memory or file object. This ensures the
accuracy and completeness of election data stored on the system. It also ensures the
existence of an audit record of any person or process altering or deleting system data
or election data.
Third, the system shall be configured to execute only intended and necessary
processes during the execution of election software. The system shall also be
configured to halt election software processes upon the termination of any critical
system process (such as system audit) during the execution of election software.
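
As a non-normative sketch of the third protection (a UNIX-style host and the process
names are assumptions), a supervisor might poll the critical system processes and halt
the election software if any of them stops:

    import subprocess, time

    CRITICAL = ["auditd"]                  # hypothetical critical OS processes

    def critical_processes_alive() -> bool:
        # 'pgrep -x' exits non-zero when no process with that exact name exists.
        return all(subprocess.run(["pgrep", "-x", name],
                                  capture_output=True).returncode == 0
                   for name in CRITICAL)

    def supervise(election_software: subprocess.Popen) -> None:
        while election_software.poll() is None:
            if not critical_processes_alive():
                # Halt the election software upon loss of a critical process.
                election_software.terminate()
                raise RuntimeError("critical system process stopped; election software halted")
            time.sleep(5)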

2.2.6 Election Management System

The Election Management System (EMS) is used to prepare ballots and programs for
use in casting and counting votes, and to consolidate, report, and display election
results. An EMS shall generate and maintain a database, or one or more interactive
databases, that enables election officials or their designees to perform the following
functions:
a. Define political subdivision boundaries and multiple election districts as
indicated in the system documentation;
b. Identify contests, candidates, and issues;
c. Define ballot formats and appropriate voting options;
d. Generate ballots and election-specific programs for vote recording and vote
counting equipment;
e. Install ballots and election-specific programs;
f. Test that ballots and programs have been properly prepared and installed;
g. Accumulate vote totals at multiple reporting levels as indicated in the system
documentation;
h. Generate the post-voting reports required by Section 2.5; and
i. Process and produce audit reports of the data indicated in Section 4.5.
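
For illustration only, the sketch below models the kind of interactive database an
EMS maintains, using SQLite; the table and column names are hypothetical and are
not prescribed by the Standards:

    import sqlite3

    def create_ems_db(path: str) -> sqlite3.Connection:
        db = sqlite3.connect(path)
        db.executescript("""
            CREATE TABLE district (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
            CREATE TABLE contest  (id INTEGER PRIMARY KEY,
                                   district_id INTEGER REFERENCES district(id),
                                   title TEXT NOT NULL,
                                   votes_allowed INTEGER NOT NULL DEFAULT 1);
            CREATE TABLE candidate(id INTEGER PRIMARY KEY,
                                   contest_id INTEGER REFERENCES contest(id),
                                   name TEXT NOT NULL, party TEXT);
            CREATE TABLE ballot_format(id INTEGER PRIMARY KEY,
                                   district_id INTEGER REFERENCES district(id),
                                   layout TEXT NOT NULL);  -- rendered ballot definition
        """)
        return db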

2.2.7 Accessibility

The Standards provide requirements for voting systems to meet the accessibility needs
of a broad range of voters with disabilities. To do so, it is anticipated that a vendor
will have to either configure all of the system’s voting stations to meet the
accessibility specifications or will have to design a unique station that conforms to the
accessibility requirements and is part of the overall voting system configuration.
Efforts to meet the accessibility requirements shall not violate the privacy, secrecy,
and integrity demands of the Standards.

2.2.7.1 Common Standards

To facilitate accessibility, all voting systems shall be capable of meeting the
following conditions, as illustrated in Figures 2-1 through 2-4:
a. Where clear floor space only allows forward approach to an object, the maximum
high forward reach allowed shall be 48 inches. The minimum low forward reach
is 15 inches.
b. Where forward reach is over an obstruction with knee space below, the maximum
level forward reach is 25 inches. When the obstruction is less than 20 inches
deep, the maximum high forward reach is 48 inches. When the obstruction
projects 20 to 25 inches, the maximum high forward reach is 44 inches.
c. The position of any operable control is determined with respect to a vertical plane
that is 48 inches in length, centered on the operable control, and at the maximum
protrusion of the product within the 48-inch length;
d. Where any operable control is 10 inches or less behind the reference plane, have a
height that is between 15 inches and 54 inches above the floor;
e. Where any operable control is more than 10 inches and not more than 24 inches
behind the reference plane, have a height between 15 inches and 46 inches above
the floor; and
f. Have operable controls that are not more than 24 inches behind the reference
plane.
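
Items c through f reduce to a simple conformance test on the position of each
operable control. A minimal sketch (units in inches; the function name is
hypothetical):

    def control_position_conforms(height_in: float, depth_behind_plane_in: float) -> bool:
        # Item f: no operable control may be more than 24 inches behind the plane.
        if depth_behind_plane_in > 24:
            return False
        # Items d and e: the allowed height band depends on depth behind the plane.
        max_height = 54 if depth_behind_plane_in <= 10 else 46
        return 15 <= height_in <= max_height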

[Figures 2-1 through 2-4, not reproduced here, illustrate the reach ranges specified
above, including the 15 in (380 mm) minimum and the 54 in (1370 mm) and 46 in
(1168 mm) maximum heights for operable controls.]

2.2.7.2 DRE Standards

DRE voting systems shall provide, as part of their configuration, the capability to
provide access to voters with a broad range of disabilities. This capability shall:
a. Not require the voter to bring their own assistive technology to a polling
place;
b. Provide audio information and stimulus that:
1) Communicates to the voter the complete content of the ballot;
2) Provides instruction to the voter in operation of the voting device;
3) Provides instruction so that the voter has the same vote capabilities and
options as those provided by the system to individuals who are not using
audio technology;
4) For a system that supports write-in voting, enables the voter to review the
voter’s write-in input, edit that input, and confirm that the edits meet the
voter’s intent;
5) Enables the voter to request repetition of any information provided by the
system;
6) Supports the use of headphones provided by the system that may be
discarded after each use;
7) Provides the audio signal through an industry standard connector for
private listening using a 1/8 inch stereo headphone jack to allow
individual voters to supply personal headsets; and
8) Provides a volume control with an adjustable amplification up to a
maximum of 105 dB that automatically resets to the default for each
voter;
c. Provide, in conformance with FCC Part 68, a wireless coupling for assistive
devices used by people who are hard of hearing when a system utilizes a
telephone style handset to provide audio information;
d. Meet the requirements of ANSI C63.19-2001 Category 4 to avoid
electromagnetic interference with assistive hearing devices;
e. For electronic image displays, permit the voter to:
1) Adjust the contrast settings;
2) Adjust color settings, when color is used; and
3) Adjust the size of the text so that the height of capital letters varies over a
range of 3 to 6.3 millimeters;
f. For a device with touchscreen or contact-sensitive controls, provide an input
method using mechanically operated controls or keys that shall:
1) Be tactilely discernible without activating the controls or keys;
2) Be operable with one hand and not require tight grasping, pinching, or
twisting of the wrist;
3) Require a force less than 5 lbs (22.2 N) to operate; and
4) Provide no key repeat function;
g. For a system that requires a response by a voter in a specific period of time,
alert the voter before this time period has expired and allow the voter
additional time to indicate that more time is needed;
h. For a system that provides sound cues as a method to alert the voter about a
certain condition, such as the occurrence of an error, or a confirmation, the
tone shall be accompanied by a visual cue for users who cannot hear the audio
prompt; and
i. Provide a secondary means of voter identification or authentication when the
primary means of doing so uses biometric measures that require a voter to
possess particular biological characteristics.
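
Requirement b.8 above can be illustrated directly: amplification is adjustable but
clamped at the 105 dB ceiling, and the level returns to the default when a new voter
session begins. The default level chosen below is a hypothetical value:

    DEFAULT_DB = 60.0      # hypothetical default listening level
    MAX_DB = 105.0         # maximum amplification allowed by the Standards

    class AudioControl:
        def __init__(self):
            self.level_db = DEFAULT_DB

        def adjust(self, requested_db: float) -> float:
            # Allow adjustment, but never beyond the 105 dB maximum.
            self.level_db = max(0.0, min(requested_db, MAX_DB))
            return self.level_db

        def start_new_voter_session(self) -> None:
            # Automatically reset to the default for each voter.
            self.level_db = DEFAULT_DB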

2.2.8 Vote Tabulating Program

Each voting system shall have a vote tabulation program that will meet specific
functional requirements.

2.2.8.1 Functions

The vote tabulating program software resident in each voting device, vote count
server, or other devices shall include all software modules required to:
a. Monitor system status and generate machine-level audit reports;
b. Accommodate device control functions performed by polling place officials
and maintenance personnel;
c. Register and accumulate votes; and
d. Accommodate variations in ballot counting logic.
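
For illustration of item c (register and accumulate votes), a minimal sketch with
hypothetical names; a real tabulating program would also satisfy items a, b, and d:

    from collections import Counter

    class VoteRegister:
        def __init__(self):
            self._totals = Counter()      # (contest, candidate) -> accumulated count

        def register(self, ballot: dict) -> None:
            # 'ballot' maps each contest to the list of selections made by the voter.
            for contest, selections in ballot.items():
                for choice in selections:
                    self._totals[(contest, choice)] += 1

        def accumulate(self) -> dict:
            return dict(self._totals)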

2.2.8.2 Voting Variations

There are significant variations among the election laws of the 50 states with respect
to permissible ballot contents, voting options, and the associated ballot counting logic.
The TDP accompanying the system shall specifically identify which of the following
items can and cannot be supported by the system, as well as how the system can
implement the items supported:
a. Closed primaries;
b. Open primaries;
c. Partisan offices;
d. Non-partisan offices;
e. Write-in voting;
f. Primary presidential delegation nominations;
g. Ballot rotation;
h. Straight party voting;
i. Cross-party endorsement;
j. Split precincts;
k. Vote for N of M;
l. Recall issues, with options;
m. Cumulative voting;
n. Ranked order voting; and
o. Provisional or challenged ballots.
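
The ballot counting logic differs for each variation the system supports. As one
non-normative example, item k (“Vote for N of M”) reduces to a simple classification
rule (the function name is hypothetical):

    def classify_n_of_m(selections: list, n: int) -> str:
        # A voter may select up to n of the m choices offered in the contest.
        if len(selections) != len(set(selections)):
            return "invalid: duplicate selection"
        if len(selections) > n:
            return "overvote"     # typically no selection in the contest is counted
        if len(selections) < n:
            return "undervote"    # counted, but fewer than n choices were made
        return "valid"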

2.2.9 Ballot Counter

For all voting systems, each device that tabulates ballots shall provide a counter that:
a. Can be set to zero before any ballots are submitted for tally;
b. Records the number of ballots cast during a particular test cycle or election;
c. Increases the count only by the input of a ballot;
d. Prevents or disables the resetting of the counter by any person other than
authorized persons at authorized points; and
e. Is visible to designated election officials.
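
A minimal sketch of these five requirements (the authorization check is hypothetical;
a real device would tie it to physical keys or credentials):

    class BallotCounter:
        def __init__(self):
            self._count = 0

        def ballot_input(self) -> None:
            self._count += 1              # item c: only a ballot increases the count

        def reset(self, authorized: bool) -> None:
            # Items a and d: may be set to zero, but only by authorized persons.
            if not authorized:
                raise PermissionError("counter reset requires an authorized official")
            self._count = 0

        @property
        def count(self) -> int:
            return self._count            # item e: visible to election officials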

2.2.10 Telecommunications

For all voting systems that use telecommunications for the transmission of data during
pre-voting, voting or post-voting activities, capabilities shall be provided that ensure
data are transmitted with no alteration or unauthorized disclosure during transmission.
Such transmissions shall not violate the privacy, secrecy, and integrity demands of the
Standards. Section 5 of the Standards describes telecommunications standards that
apply to, at a minimum, the following types of data transmissions:
♦ Voter Authentication: Coded information that confirms the identity of a voter
for security purposes for systems that transmit votes individually over a
public network;
♦ Ballot Definition: Information that describes to a voting machine the content
and appearance of the ballots to be used in an election;
♦ Vote Transmission to Central Site: For systems that transmit votes individually
over a public network, the transmission of a single vote to the county (or
contractor) for consolidation with other county vote data;
♦ Vote Count: Information representing the tabulation of votes at any one of
several levels: polling place, precinct, or central count; and
♦ List of Voters: A listing of the individual voters who have cast ballots in a
specific election.
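
Section 5 specifies the applicable mechanisms; purely as illustration, a keyed message
authentication code is one way a receiving site can detect alteration of transmitted
vote data. Key management is omitted and the function names are hypothetical:

    import hashlib, hmac

    def seal_for_transmission(vote_data: bytes, key: bytes) -> bytes:
        # Compute a keyed tag over the payload and transmit the two together.
        tag = hmac.new(key, vote_data, hashlib.sha256).digest()
        return tag + vote_data

    def verify_received(message: bytes, key: bytes) -> bytes:
        tag, vote_data = message[:32], message[32:]
        expected = hmac.new(key, vote_data, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("vote data was altered in transmission")
        return vote_data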

2.2.11 Data Retention

United States Code Title 42, Sections 1974 through 1974e, states that election
administrators shall preserve for 22 months “all records and paper that came into
(their) possession relating to an application, registration, payment of poll tax, or other
act requisite to voting.” This retention requirement applies to systems that will be used
at anytime for voting of candidates for Federal offices (e.g., Member of Congress,
United States Senator, and/or Presidential Elector). Therefore, all systems shall
provide for maintaining the integrity of voting and audit data during an election and
for a period of at least 22 months thereafter.
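
For illustration (the function name is hypothetical), the end of the 22-month retention
period can be computed directly from the election date:

    import calendar
    from datetime import date

    def retention_deadline(election_day: date, months: int = 22) -> date:
        # Advance by whole months, clamping to the last day of the target month.
        total = election_day.month - 1 + months
        year, month = election_day.year + total // 12, total % 12 + 1
        day = min(election_day.day, calendar.monthrange(year, month)[1])
        return date(year, month, day)

    # Example: records from a November 5, 2002 election must be kept at least
    # through September 5, 2004.
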
Because the purpose of this law is to assist the Federal government in discharging its
law enforcement responsibilities in connection with civil rights and elections crimes,
its scope must be interpreted in keeping with that objective. The appropriate state or
local authority must preserve all records that may be relevant to the detection and
prosecution of federal civil rights or election crimes for the 22-month federal retention
period, if the records were generated in connection with an election that was held in
whole or in part to select federal candidates. It is important to note that Section 1974
does not require that election officials generate any specific type or classification of
election record. However, if a record is generated, Section 1974 comes into force and
the appropriate authority must retain the records for 22 months.
For 22-month document retention, the general rule is that all printed copy records
produced by the election database and ballot processing systems shall be so labeled
and archived. Regardless of system type, all audit trail information spelled out in
subsection 4.5 of the Standards shall be retained in its original format, whether that be
real-time logs generated by the system, or manual logs maintained by election
personnel. The election audit trail includes not only in-process logs of election-night
counting (and of the subsequent processing of absentee or provisional ballots), but also
time logs of baseline ballot definition formats, and system readiness and testing results.
In many voting systems, the source of election-specific data (and ballot formats) is a
database or file. In precinct count systems, this data is used to program each machine,
establish ballot layout, and generate tallying files. It is not necessary to retain this
information on electronic media if there is an official, authenticatable printed copy of
all final database information. However, it is recommended that the state or local
jurisdiction also retain electronic records of the aggregate data for each device so that
reconstruction of an election is possible without data re-entry. The same requirement
and recommendation applies to vote results generated by each precinct device or
system.

2.3 Pre-voting Functions

This section defines capabilities required to support functions performed prior to the
opening of polls. All voting systems shall provide capabilities to support:
♦ Ballot preparation;
♦ Election programming;
♦ Ballot and program installation and control;
♦ Readiness testing;
♦ Verification at the polling place; and
♦ Verification at the central counting place.
The standards also include requirements to ensure compatible interfaces with the
ballot definition process and the reporting of election results.

2.3.1 Ballot Preparation

Ballot preparation is the process of using election databases to define the specific
contests, questions, and related instructions to be contained in ballots and to produce
all permissible ballot layouts. Ballot preparation requirements include:
♦ General capabilities for ballot preparation;
♦ Ballot formatting; and
♦ Ballot production.

2.3.1.1 General Capabilities

All systems shall provide the following general capabilities for ballot preparation.

2.3.1.1.1 Common Standards

All systems shall be capable of:
a. Enabling the automatic formatting of ballots in accordance with the
requirements for offices, candidates, and measures qualified to be placed on
the ballot for each political subdivision and election district;
b. Collecting and maintaining the following data:
1) Offices and their associated labels and instructions;
2) Candidate names and their associated labels; and
3) Issues or measures and their associated text;
c. Supporting the maximum number of potentially active voting positions as
indicated in the system documentation;
d. For a primary election, generating ballots that segregate the choices in
partisan races by party affiliation;
e. Generating ballots that contain identifying codes or marks uniquely associated
with each format; and
f. Ensuring that vote response fields, selection buttons, or switches properly
align with the specific candidate names and/or issues printed on the ballot
display, ballot card or sheet, or separate ballot pages.
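
Item (b) above amounts to a small data model. As a rough illustration only (the
Standards do not prescribe any particular representation, and all names below are
hypothetical), the collected data might be organized as follows in Python:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Office:
        label: str                    # office label, e.g. "Governor"
        instructions: str             # associated instructions, e.g. "Vote for one"
        candidates: List[str] = field(default_factory=list)

    @dataclass
    class Measure:
        label: str                    # measure label
        text: str                     # associated text of the issue or measure

    @dataclass
    class BallotFormat:
        format_code: str              # code uniquely associated with this format (item e)
        offices: List[Office] = field(default_factory=list)
        measures: List[Measure] = field(default_factory=list)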

2.3.1.1.2 Paper-Based System Standards

In addition to the common standards, paper-based systems shall meet the following
standards applicable to the technology used:
a. Enable voters to make selections by punching a hole or by making a mark in
areas designated for this purpose upon each ballot card or sheet;
b. For punchcard systems, ensure that the vote response fields can be properly
aligned with punching devices used to record votes; and
c. For marksense systems, ensure that the timing marks align properly with the
vote response fields.

2.3.1.2 Ballot Formatting

Ballot formatting is the process by which election officials or their designees use
election databases and vendor system software to define the specific contests and
related instructions contained on the ballot and present them in a layout permitted by
state law. All systems shall provide a capability for:
a. Creation of newly defined elections;
b. Rapid and error-free definition of elections and their associated ballot layouts;
c. Uniform allocation of space and fonts used for each office, candidate, and
contest such that the voter perceives no active voting position to be preferred
to any other;
d. Simultaneous display of the maximum number of choices for a single contest
as indicated by the vendor in the system documentation;
e. Retention of previously defined formats for an election;
f. Prevention of unauthorized modification of any ballot formats; and
g. Modification by authorized persons of a previously defined ballot format for
use in a subsequent election.

2.3.1.3 Ballot Production

Ballot production is the process of converting ballot formats to media ready for use
in physical ballot production or electronic presentation.

2.3.1.3.1 Common Standards

The voting system shall provide a means of printing or otherwise generating a ballot
display that can be installed in all system voting devices for which it is intended. All
systems shall provide a capability to ensure:
a. The electronic display or printed document on which the user views the ballot
is capable of rendering an image of the ballot in any of the languages required
by The Voting Rights Act of 1965, as amended;
b. The electronic display or printed document on which the user views the ballot
does not show any advertising or commercial logos of any kind, whether
public service, commercial, or political, unless specifically provided for in
State law. Electronic displays shall not provide connection to such material
through hyperlink; and
c. The ballot conforms to vendor specifications for type of paper stock, weight,
size, shape, size and location of punch or mark field used to record votes,
folding, bleed through, and ink for printing if paper ballot documents or paper
displays are part of the system.

2.3.1.3.2 Paper-Based System Standards

In addition to the common standards, vendor documentation for marksense systems
shall include specifications for ballot materials to ensure that vote selections are read
from only a single ballot at a time, without detection of marks from multiple ballots
concurrently (e.g., reading of bleed-through from other ballots).

2.3.2 Election Programming

Election programming is the process by which election officials or their designees use
election databases and vendor system software to logically define the voter choices
associated with the contents of the ballots. All systems shall provide for the:
a. Logical definition of the ballot, including the definition of the number of
allowable choices for each office and contest;
b. Logical definition of political and administrative subdivisions, where the list
of candidates or contests varies between polling places;
c. Exclusion of any contest on the ballot in which the voter is prohibited from
casting a ballot because of place of residence, or other such administrative or
geographical criteria;

d. Ability to select from a range of voting options to conform to the laws of the
jurisdiction in which the system will be used; and
e. Generation of all required master and distributed copies of the voting
program, in conformance with the definition of the ballots for each voting
device and polling place, and for each tabulating device.
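
Item (c) above is essentially a filtering rule. The sketch below is a hypothetical
illustration rather than anything prescribed by the Standards; it shows one way a
system might exclude contests a voter is not entitled to vote on, based on the
districts associated with a precinct:

    def contests_for_precinct(precinct_districts, all_contests):
        """Keep only the contests whose district applies to this precinct."""
        return [c for c in all_contests if c["district"] in precinct_districts]

    ballot = contests_for_precinct(
        precinct_districts={"state", "county-3", "ward-2"},
        all_contests=[
            {"office": "Governor", "district": "state", "choices": 1},
            {"office": "City Council", "district": "ward-5", "choices": 2},  # excluded
        ],
    )
    print([c["office"] for c in ballot])   # ['Governor']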

2.3.3 Ballot and Program Installation and Control

All systems shall provide a means of installing ballots and programs on each piece of
polling place or central count equipment in accordance with the ballot requirements of
the election and the requirements of the jurisdiction in which the equipment will be
used.
All systems shall include the following at the time of ballot and program installation:
a. A detailed work plan or other documentation providing a schedule and steps
for the software and ballot installation, which includes a table outlining the
key dates, events and deliverables;
b. A capability for automatically verifying that the software has been properly
selected and installed in the equipment or in programmable memory devices,
and for indicating errors; and
c. A capability for automatically validating that software correctly matches the
ballot formats that it is intended to process, for detecting errors, and for
immediately notifying an election official of detected errors.
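
The Standards require the verification capabilities in items (b) and (c) but do not
mandate a mechanism. One common approach, sketched below purely as an assumption,
is to compare a cryptographic digest of each installed file against a reference
value recorded when the software was qualified:

    import hashlib

    def verify_installed_file(path: str, reference_digest: str) -> bool:
        """Recompute the SHA-256 digest of an installed file and compare it
        to the trusted reference; report any mismatch as an error."""
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest != reference_digest:
            print(f"ERROR: {path} does not match the approved release")
            return False
        return True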

2.3.4 Readiness Testing

Election personnel conduct equipment and system readiness tests prior to the start of
an election to ensure that the voting system functions properly, to confirm that system
equipment has been properly integrated, and to obtain equipment status reports.

2.3.4.1 Common Standards

All systems shall provide the capabilities to:
a. Verify that voting machines or vote recording and data processing equipment,
precinct count equipment, and central count equipment are properly prepared
for an election, and collect data that verifies equipment readiness;
b. Obtain status and data reports from each set of equipment;

c. Verify the correct installation and interface of all system equipment;
d. Verify that hardware and software function correctly;
e. Generate consolidated data reports at the polling place and higher
jurisdictional levels; and
f. Segregate test data from actual voting data, either procedurally or by
hardware/software features.

Resident test software, external devices, and special purpose test software connected
to or installed in voting devices to simulate operator and voter functions may be used
for these tests provided that the following standards are met:
a. These elements shall be capable of being tested separately, and shall be
proven to be reliable verification tools prior to their use; and
b. These elements shall be incapable of altering or introducing any residual
effect on the intended operation of the voting device during any succeeding
test and operational phase.
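
Item (f) of the first list above allows test data to be segregated either
procedurally or in software. A minimal software sketch, under the assumption that
every record is tagged with its mode at creation time (the names are illustrative):

    TEST, OFFICIAL = "test", "official"

    def record_ballot(store, selections, mode):
        """Tag each captured record so test data can never mix with votes."""
        store.append({"mode": mode, "selections": selections})

    def official_records(store):
        """Return only records captured during official voting."""
        return [r for r in store if r["mode"] == OFFICIAL]

    store = []
    record_ballot(store, ("A",), TEST)      # readiness-test ballot
    record_ballot(store, ("B",), OFFICIAL)  # real ballot
    assert len(official_records(store)) == 1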

2.3.4.2 Paper-Based Systems

Paper-based systems shall:
a. Support conversion testing that uses all potential ballot positions as active
positions; and
b. Support conversion testing of ballots with active position density for systems
without pre-designated ballot positions.

2.3.5 Verification at the Polling Place

Election officials perform verification at the polling place to ensure that all voting
systems and equipment function properly before and during an election. All systems
shall provide a formal record of the following, in any media, upon verification of the
authenticity of the command source:
a. The election's identification data;
b. The identification of all equipment units;
c. The identification of the polling place;
d. The identification of all ballot formats;

e. The contents of each active candidate register by office and of each active
measure register at all storage locations (showing that they contain only
zeros);
f. A list of all ballot fields that can be used to invoke special voting options; and
g. Other information needed to confirm the readiness of the equipment, and to
accommodate administrative reporting requirements.
To prepare voting devices to accept voted ballots, all voting systems shall provide the
capability to test each device prior to opening to verify that each is operating
correctly. At a minimum, the tests shall include:
a. Confirmation that there are no hardware or software failures; and
b. Confirmation that the device is ready to be activated for accepting votes.
If a precinct count system includes equipment for the consolidation of polling place
data at one or more central counting places, it shall have means to verify the correct
extraction of voting data from transportable memory devices, or to verify the
transmission of secure data over secure communication links.
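
Item (e) above is the familiar "zeros report." As a hedged illustration (the
register layout below is an assumption, not something the Standards define), the
readiness check reduces to:

    def registers_all_zero(candidate_registers, measure_registers) -> bool:
        """True only if every active candidate and measure register holds zero."""
        return (all(v == 0 for v in candidate_registers.values())
                and all(v == 0 for v in measure_registers.values()))

    print(registers_all_zero({"Candidate A": 0, "Candidate B": 0},
                             {"Measure 1 - Yes": 0, "Measure 1 - No": 0}))  # True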

2.3.6 Verification at the Central Location

Election officials perform verification at the central location to ensure that vote
counting and vote consolidation equipment and software function properly before and
after an election. Upon verification of the authenticity of the command source, any
system used in a central count environment shall provide a printed record of the
following:
a. The election's identification data;
b. The contents of each active candidate register by office and of each active
measure register at all storage locations (showing that they contain all zeros);
and
c. Other information needed to ensure the readiness of the equipment and to
accommodate administrative reporting requirements.

2.4 Voting Functions

All systems shall support:
♦ Opening the polls; and

♦ Casting a ballot.
Additionally, all DRE systems shall support:
♦ Activating the ballot;
♦ Augmenting the election counter; and
♦ Augmenting the life-cycle counter.

2.4.1 Opening the Polls

The capabilities required for opening the polls are specific to individual voting system
technologies. At a minimum, the systems shall provide the functional capabilities
indicated below.

2.4.1.1 Opening the Polling Place (Precinct Count Systems)

To allow voting devices to be activated for voting, the system shall provide:
a. An internal test or diagnostic capability to verify that all of the polling place
tests specified in Section 2.3.5 have been successfully completed; and
b. Automatic disabling of any device that has not been tested until it has been
tested.

2.4.1.2 Paper-Based System Standards

The standards for opening the polling place for paper-based systems consist of
common standards and additional standards that apply to precinct count paper-based
systems.

2.4.1.2.1 All Paper-Based Systems

To facilitate opening the polls, all paper-based systems shall include:
a. A means of verifying that ballot punching or marking devices are properly
prepared and ready to use;
b. A voting booth or similar facility, in which the voter may punch or mark the
ballot in privacy; and

c. Secure receptacles for holding voted ballots.

2.4.1.2.2 Precinct Count Paper-Based Systems

In addition to the above requirements, all paper-based precinct count equipment shall
include a means of:
a. Activating the ballot counting device;
b. Verifying that the device has been correctly activated and is functioning
properly; and
c. Identifying device failure and corrective action needed.

2.4.1.3 DRE System Standards

To facilitate opening the polls, all DRE systems shall include:
a. A security seal, a password, or a data code recognition capability to prevent
the inadvertent or unauthorized actuation of the poll-opening function;
b. A means of enforcing the execution of steps in the proper sequence if more
than one step is required;
c. A means of verifying the system has been activated correctly; and
d. A means of identifying system failure and any corrective action needed.

2.4.2 Activating the Ballot (DRE Systems)

To activate the ballot, all DRE systems shall:
a. Enable election officials to control the content of the ballot presented to the
voter, whether presented in printed form or electronic display, such that each
voter is permitted to record votes only in contests in which that voter is
authorized to vote;
b. Allow each eligible voter to cast a ballot;
c. Prevent a voter from voting on a ballot to which he or she is not entitled; and
d. Prevent a voter from casting more than one ballot in the same election;
e. Activate the casting of a ballot in a general election;
f. Enable the selection of the ballot that is appropriate to the party affiliation
declared by the voter in a primary election;

g. Activate all portions of the ballot upon which the voter is entitled to vote; and
h. Disable all portions of the ballot upon which the voter is not entitled to vote.

2.4.3 Casting a Ballot

Some required capabilities for casting a ballot are common to all systems. Others are
specific to individual voting technologies or intended use. Systems must provide
additional functional capabilities that enable accessibility to disabled voters as defined
in Section 2.2.7 of the Standards.

2.4.3.1 Common Standards

To facilitate casting a ballot, all systems shall:
a. Provide text that is at least 3 millimeters high and provide the capability to
adjust or magnify the text to an apparent size of 6.3 millimeters;
b. Protect the secrecy of the vote such that the system cannot reveal any
information about how a particular voter voted, except as otherwise required
by individual State law;
c. Record the selection and non-selection of individual vote choices for each
contest and ballot measure;
d. Record the voter’s selection of candidates whose names do not appear on the
ballot, if permitted under State law, and record as many write-in votes as the
number of candidates the voter is allowed to select;
e. In the event of a failure of the main power supply external to the voting
system, provide the capability for any voter who is voting at the time to
complete casting a ballot, allow for the graceful shutdown of the voting
system without loss or degradation of the voting and audit data, and allow
voters to resume voting once the voting system has reverted to back-up
power; and
f. Provide the capability for voters to continue casting ballots in the event of a
failure of a telecommunications connection within the polling place or
between the polling place and any other location.

2.4.3.2 Paper-Based Systems Standards

The standards for casting a ballot for paper-based systems consist of common
standards and additional standards that apply to precinct count paper-based systems.

2.4.3.2.1 All Paper-Based Systems

All paper-based systems shall:
a. Allow the voter to easily identify the voting field that is associated with each
candidate or ballot measure response;
b. Allow the voter to punch or mark the ballot to register a vote;
c. Allow either the voter or the appropriate election official to place the voted
ballot into the ballot counting device (for precinct count systems) or into a
secure receptacle (for central count systems); and
d. Protect the secrecy of the vote throughout the process.

2.4.3.2.2 Precinct Count Paper-Based Systems

In addition to the above requirements, all paper-based precinct count systems shall:
a. Provide feedback to the voter that identifies specific contests or ballot issues
for which an overvote or undervote is detected;
b. Allow the voter, at the voter’s choice, to vote a new ballot or submit the ballot
‘as is’ without correction; and
c. Allow an authorized election official to turn off the capabilities defined in ‘a’
and ‘b’ above.
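
Detecting the conditions in item (a) is a simple comparison of the number of
selections in a contest against the number permitted. A sketch, with hypothetical
names:

    def classify_contest(selections: int, allowed: int) -> str:
        """Classify one contest on a voted ballot for voter feedback."""
        if selections > allowed:
            return "overvote"
        if selections < allowed:
            return "undervote"    # includes a blank contest
        return "ok"

    print(classify_contest(2, 1))   # overvote
    print(classify_contest(0, 1))   # undervote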

2.4.3.3 DRE Systems Standards

In addition to the above common requirements, DRE systems shall:
a. Prohibit the voter from accessing or viewing any information on the display
screen that has not been authorized by election officials and preprogrammed
into the voting system (i.e., no potential for display of external information or
linking to other information sources);
b. Enable the voter to easily identify the selection button or switch, or the active
area of the ballot display that is associated with each candidate or ballot
measure response;
c. Allow the voter to select his or her preferences on the ballot in any legal
number and combination;
d. Indicate that a selection has been made or canceled;
e. Indicate to the voter when no selection, or an insufficient number of
selections, has been made in a contest;

f. Prevent the voter from overvoting;

g. Notify the voter when the selection of candidates and measures is completed;
h. Allow the voter, before the ballot is cast, to review his or her choices and, if
the voter desires, to delete or change his or her choices before the ballot is
cast;
i. For electronic image displays, prompt the voter to confirm the voter's choices
before casting his or her ballot, signifying to the voter that casting the ballot is
irrevocable and directing the voter to confirm the voter’s intention to cast the
ballot;

j. Notify the voter after the vote has been stored successfully that the ballot has
been cast;

k. Notify the voter that the ballot has not been cast successfully if it is not stored
successfully, including storage of the ballot image, and provide clear
instruction as to the steps the voter should take to cast his or her ballot should
this event occur;
l. Provide sufficient computational performance to provide responses back to
each voter entry in no more than three seconds;

m. Ensure that the votes stored accurately represent the actual votes cast;
n. Prevent modification of the voter’s vote after the ballot is cast;
o. Provide a capability to retrieve ballot images in a form readable by humans
(in accordance with the requirements of Section 2.2.2.2 and 2.2.4.2);
p. Increment the proper ballot position registers or counters;
q. Protect the secrecy of the vote throughout the voting process;
r. Prohibit access to voted ballots until after the close of polls;

s. Provide the ability for election officials to submit test ballots for use in
verifying the end-to-end integrity of the system; and

t. Isolate test ballots such that they are accounted for accurately in vote counts
and are not reflected in official vote counts for specific candidates or measures.
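
Items (i) through (k) above describe a confirm-store-notify sequence. The following
sketch compresses that flow into a single decision function; the inputs merely stand
in for device behavior the Standards do not specify:

    def cast_ballot(voter_confirmed: bool, stored_ok: bool) -> str:
        """Illustrative decision flow for items i through k."""
        if not voter_confirmed:                      # item i: confirmation required
            return "return voter to the review screen"
        if not stored_ok:                            # item k: storage (incl. image) failed
            return "notify: ballot NOT cast; ask a poll worker for help"
        return "notify: ballot has been cast"        # item j

    print(cast_ballot(voter_confirmed=True, stored_ok=True))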

2.5 Post-Voting Functions

All systems shall provide capabilities to accumulate and report results for the
jurisdiction and to generate audit trails. In addition, precinct count systems must
provide a means to close the polling place including generating appropriate reports. If
the system provides the capability to broadcast results, additional standards apply.

2.5.1 Closing the Polling Place (Precinct Count)

These standards for closing the polling place are specific to precinct count systems.
The system shall provide the means for:
a. Preventing the further casting of ballots once the polling place has closed;
b. Providing an internal test that verifies that the prescribed closing procedure
has been followed, and that the device status is normal;
c. Incorporating a visible indication of system status;
d. Producing a diagnostic test record that verifies the sequence of events, and
indicates that the extraction of voting data has been activated; and
e. Precluding the unauthorized reopening of the polls once the poll closing has
been completed for that election.

2.5.2 Consolidating Vote Data

All systems shall provide a means to consolidate vote data from all polling places, and
optionally from other sources such as absentee ballots, provisional ballots, and voted
ballots requiring human review (e.g., write-in votes).

2.5.3 Producing Reports

All systems shall be able to create reports summarizing the data on multiple levels.

2.5.3.1 Common Standards

All systems shall provide capabilities to:
a. Support geographic reporting, which requires the reporting of all results for
each contest at the precinct level and additional jurisdictional levels;
b. Produce a printed report of the number of ballots counted by each tabulator;
c. Produce a printed report for each tabulator of the results of each contest that
includes the votes cast for each selection, the count of undervotes, and the
count of overvotes;

d. Produce a consolidated printed report of the results for each contest of all
votes cast (including the count of ballots from other sources supported by the
system as specified by the vendor) that includes the votes cast for each
selection, the count of undervotes, and the count of overvotes;
e. Be capable of producing a consolidated printed report of the combination of
overvotes for any contest that is selected by an authorized official (e.g., the
number of overvotes in a given contest combining candidate A and candidate
B, combining candidate A and candidate C, etc.);
f. Produce all system audit information required in Section 4.5 in the form of
printed reports, or in electronic memory for printing centrally; and

g. Prevent data from being altered or destroyed by report generation, or by the
transmission of results over telecommunications lines.
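
Items (c) and (d) above imply a straightforward consolidation step: per-tabulator
contest results, including undervote and overvote counts, are summed into a single
report. A minimal sketch, assuming each report is a simple mapping:

    from collections import Counter

    def consolidate(tabulator_reports):
        """Sum per-tabulator results for one contest into one report."""
        total = Counter()
        for report in tabulator_reports:
            total.update(report)
        return dict(total)

    print(consolidate([
        {"Candidate A": 120, "Candidate B": 98, "undervotes": 4, "overvotes": 1},
        {"Candidate A": 75,  "Candidate B": 81, "undervotes": 2, "overvotes": 0},
    ]))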

2.5.3.2 Precinct Count Systems

In addition to the common reporting requirements, all precinct count voting systems
shall:
a. Prevent the printing of reports and the unauthorized extraction of data prior to
the official close of the polling place;
b. Provide a means to extract information from a transportable programmable
memory device or data storage medium for vote consolidation;
c. Consolidate the data contained in each unit into a single report for the polling
place when more than one voting machine or precinct tabulator is used; and
d. Prevent data in transportable memory from being altered or destroyed by
report generation, or by the transmission of results over telecommunications
lines.

2.5.4 Broadcasting Results

Some voting systems offer the capability to make unofficial results available to
external organizations such as the news media, political party officials, and others.
Although this capability is not required, systems that make unofficial results available
shall:
a. Provide only aggregated results, and not data from individual ballots;
b. Provide no access path from unofficial electronic reports or files to the storage
devices for official data; and
c. Clearly indicate on each report or file that the results it contains are unofficial.

2.6 Maintenance, Transportation, and Storage

All systems shall be designed and manufactured to facilitate preventive and corrective
maintenance, conforming to the hardware standards described in Section 3.
All vote casting and tally equipment designated for storage between elections shall:
a. Function without degradation in capabilities after transit to and from the place
of use, as demonstrated by meeting the performance standards described in
Section 3; and
b. Function without degradation in capabilities after storage between elections,
as demonstrated by meeting the performance standards described in Section 3.


Volume I, Section 3
Table of Contents

3 Hardware Standards
3.1 Scope
3.1.1 Hardware Sources
3.1.2 Organization of this Section
3.2 Performance Requirements
3.2.1 Accuracy Requirements
3.2.2 Environmental Requirements
3.2.2.1 Shelter Requirements
3.2.2.2 Space Requirements
3.2.2.3 Furnishings and Fixtures
3.2.2.4 Electrical Supply
3.2.2.5 Electrical Power Disturbance
3.2.2.6 Electrical Fast Transient
3.2.2.7 Lightning Surge
3.2.2.8 Electrostatic Disruption
3.2.2.9 Electromagnetic Radiation
3.2.2.10 Electromagnetic Susceptibility
3.2.2.11 Conducted RF Immunity
3.2.2.12 Magnetic Fields Immunity
3.2.2.13 Environmental Control - Operating Environment
3.2.2.14 Environmental Control - Transit and Storage
3.2.2.15 Data Network Requirements
3.2.3 Election Management System (EMS) Requirements
3.2.3.1 Recording Requirements
3.2.3.2 Memory Stability
3.2.4 Vote Recording Requirements
3.2.4.1 Common Standards
3.2.4.2 Paper-Based Recording Standards
3.2.4.2.1 Paper Ballot Standards
3.2.4.2.2 Punching Devices
3.2.4.2.3 Marking Devices
3.2.4.2.4 Frames or Fixtures for Punchcard Ballots
3.2.4.2.5 Frames or Fixtures for Printed Ballots
3.2.4.2.6 Ballot Boxes and Ballot Transfer Boxes
3.2.4.3 DRE Systems Recording Requirements
3.2.4.3.1 Activity Indicator
3.2.4.3.2 DRE System Vote Recording
3.2.4.3.3 Recording Accuracy
3.2.4.3.4 Recording Reliability
3.2.5 Paper-based Conversion Requirements
3.2.5.1 Ballot Handling
3.2.5.1.1 Capacity (Central Count)
3.2.5.1.2 Exception Handling (Central Count)
3.2.5.1.3 Exception Handling (Precinct Count)
3.2.5.1.4 Multiple Feed Prevention
3.2.5.2 Ballot Reading Accuracy
3.2.6 Processing Requirements
3.2.6.1 Paper-Based System Processing Requirements
3.2.6.1.1 Processing Accuracy
3.2.6.1.2 Memory Stability
3.2.6.2 DRE System Processing Requirements
3.2.6.2.1 Processing Speed
3.2.6.2.2 Processing Accuracy
3.2.6.2.3 Memory Stability
3.2.7 Reporting Requirements
3.2.7.1 Removable Storage Media
3.2.7.2 Printers
3.2.8 Vote Data Management Requirements
3.2.8.1 Data File Management
3.2.8.2 Data Report Generation
3.3 Physical Characteristics
3.3.1 Size
3.3.2 Weight
3.3.3 Transport and Storage of Precinct Systems
3.4 Design, Construction, and Maintenance Characteristics
3.4.1 Materials, Processes, and Parts
3.4.2 Durability
3.4.3 Reliability
3.4.4 Maintainability
3.4.4.1 Physical Attributes
3.4.4.2 Additional Attributes
3.4.5 Availability
3.4.6 Product Marking
3.4.7 Workmanship
3.4.8 Safety
3.4.9 Human Engineering—Controls and Displays

3 Hardware Standards

3.1 Scope

This section contains the requirements for the machines and manufactured devices
that are part of a voting system. It specifies minimum values for certain performance
characteristics; physical characteristics; and design, construction, and maintenance
characteristics for the hardware and selected related components of all voting systems,
such as:
♦ Ballot printers;
♦ Ballot cards and sheets;
♦ Ballot displays;
♦ Voting devices, including punching and marking devices and DRE recording
devices;
♦ Voting booths and enclosures;
♦ Ballot boxes and ballot transfer boxes;
♦ Ballot readers;
♦ Computers used to prepare ballots, program elections, consolidate and report
votes, and perform other elections management activities;
♦ Electronic ballot recorders;
♦ Electronic precinct vote control units;
♦ Removable electronic data storage media;
♦ Servers; and
♦ Printers.
This section applies to the combination of software and hardware to accomplish
specific performance and system control requirements. Standards that are specific to
software alone are provided in Section 4 of the Standards.

3.1.1 Hardware Sources

The requirements of this section apply generally to all hardware used in voting
systems, including:
a. Hardware provided by the voting system vendor and its suppliers;
b. Hardware furnished by an external provider (for example, providers of
commercial off-the-shelf (COTS) machines and devices) where the hardware
may be used in any way during voting system operation; and
c. Hardware provided by the voting jurisdiction.

3.1.2 Organization of this Section

The standards presented in this section are organized as follows:
♦ Performance Requirements: These requirements address the combined
operational capabilities of the voting system’s hardware and software across a
broad range of parameters;
♦ Physical Requirements: These requirements address the size, weight and
transportability of the voting system; and
♦ Design, Construction, and Maintenance Requirements: These requirements
address the reliability and durability of materials, product marking, quality of
system workmanship, safety, and other attributes to ensure smooth system
operation in the voting environment.

3.2 Performance Requirements

The performance requirements address a broad range of parameters, encompassing:
a. Accuracy requirements, where requirements are specified for distinct
processing functions of paper-based and DRE systems;
b. Environmental requirements, where no distinction is made between
requirements for paper-based and DRE systems, but requirements for precinct
and central count are described;
c. Vote data management requirements, where no differentiation is made
between requirements for paper-based and DRE systems;

d. Vote recording requirements, where separate and distinct requirements are
delineated for paper-based and DRE systems;
e. Conversion requirements, which apply only to paper-based systems;
f. Processing requirements, where separate and distinct requirements are
delineated for paper-based and DRE systems; and

g. Reporting requirements, where no distinction is made between requirements
for paper-based and DRE systems, but where differences between precinct
and central count systems are readily apparent, reflecting differences in their
reporting.
The performance requirements include such attributes as ballot reading and handling
requirements; system accuracy; memory stability; and the ability to withstand
specified environmental conditions. These characteristics also encompass system-wide
requirements for shelter, electrical supply, and compatibility with data networks.
Performance requirements for voting systems represent the combined operational
capability of both system hardware and software. Accuracy, as measured by data error
rate, and operational failure are treated as distinct attributes in performance testing.
All systems shall meet the performance requirements under operating conditions and
after storage under non-operating conditions.

3.2.1 Accuracy Requirements

Voting system accuracy addresses the accuracy of data for each of the individual
ballot positions that could be selected by a voter, including the positions that are not
selected. For a voting system, accuracy is defined as the ability of the system to
capture, record, store, consolidate and report the specific selections and absence of
selections made by the voter for each ballot position without error. Required accuracy
is defined in terms of an error rate that for testing purposes represents the maximum
number of errors allowed while processing a specified volume of data. This rate is set
at a sufficiently stringent level such that the likelihood of voting system errors
affecting the outcome of an election is exceptionally remote even in the closest of
elections.
The error rate is defined using a convention that recognizes differences in how vote
data is processed by different types of voting systems. Paper-based and DRE systems
have different processing steps. Some differences also exist between precinct count
and central count systems. Therefore, the acceptable error rate applies separately and
distinctly to each of the following functions:
a. For all paper-based systems:
1) Scanning ballot positions on paper ballots to detect selections for
individual candidates and contests;

2) Conversion of selections detected on paper ballots into digital data;
b. For all DRE systems:
1) Recording the voter selections of candidates and contests into voting data
storage; and
2) Independently from voting data storage, recording voter selections of
candidates and contests into ballot image storage.
c. For precinct-count systems (paper-based and DRE):
Consolidation of vote selection data from multiple precinct-based systems to
generate jurisdiction-wide vote counts, including storage and reporting of the
consolidated vote data; and
d. For central-count systems (paper-based and DRE):
Consolidation of vote selection data from multiple counting devices to
generate jurisdiction-wide vote counts, including storage and reporting of the
consolidated vote data.
For testing purposes, the acceptable error rate is defined using two parameters: the
desired error rate to be achieved, and the maximum error rate that should be accepted
by the test process.
For each processing function indicated above, the system shall achieve a target error
rate of no more than one in 10,000,000 ballot positions, with a maximum acceptable
error rate in the test process of one in 500,000 ballot positions.
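
The detailed test protocol is specified in Volume II; the following back-of-the-
envelope calculation, which assumes a Poisson error model and an arbitrary test
volume purely for illustration, shows why the two rates are statistically
distinguishable:

    from math import exp

    target_rate = 1 / 10_000_000   # desired error rate
    max_rate    = 1 / 500_000      # worst error rate the test should accept

    # Probability of observing zero errors in n ballot positions is exp(-rate * n).
    n = 1_000_000
    print(exp(-target_rate * n))   # ~0.90: a compliant system usually shows no errors
    print(exp(-max_rate * n))      # ~0.14: a system at the threshold rarely does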

3.2.2 Environmental Requirements

The environmental requirements for voting systems include shelter, space, furnishings
and fixtures, supplied energy, environmental control, and external
telecommunications services. Environmental conditions applicable to the design and
operation of voting systems consist of the following categories:
♦ Natural environment, including temperature, humidity, and atmospheric
pressure;
♦ Induced environment, including proper and improper operation and handling
of the system and its components during the election processes;
♦ Transportation and storage; and
♦ Electromagnetic signal environment, including exposure to and generation of
radio frequency energy.

All voting systems shall be designed to withstand the environmental conditions
contained in the appropriate test procedures of the Standards. These procedures will
be applied to all devices for casting, scanning and counting ballots, except those that
constitute COTS devices that have not been modified in any manner to support their
use as part of a voting system and that have a documented record of performance
under conditions defined in the Standards.
The TDP supplied by the vendor shall include a statement of all requirements and
restrictions regarding environmental protection, electrical service, recommended
auxiliary power, telecommunications service, and any other facility or resource
required for the proper installation and operation of the system.

3.2.2.1 Shelter Requirements

All precinct count systems shall be designed for storage and operation in any enclosed
facility ordinarily used as a warehouse or polling place, with prominent instructions as
to any special storage requirements.

3.2.2.2 Space Requirements

There is no restriction on space allowed for the installation of voting systems, except
that the arrangement of these systems shall not impede performance of their duties by
polling place officials, the orderly flow of voters through the polling place, or the
ability for the voter to vote in private.

3.2.2.3 Furnishings and Fixtures

Any furnishings or fixtures provided as a part of voting systems, and any components
provided by the vendor that are not a part of the system but that are used to support its
storage, transportation, or operation, shall comply with the design and safety
requirements of Subsection 3.4.8.

3.2.2.4 Electrical Supply

Components of voting systems that require an electrical supply shall meet the
following standards:
a. Precinct count systems shall operate with the electrical supply ordinarily
found in polling places (120 VAC/60 Hz/single phase);
b. Central count systems shall operate with the electrical supply ordinarily found
in central tabulation facilities or computer room facilities (120 VAC/60 Hz/
single phase, 208 VAC/60 Hz/three phase, or 240 VAC/60 Hz/two phase); and

c. All systems shall also be capable of operating for a period of at least 2 hours
on backup power, such that no voting data is lost or corrupted, nor normal
operations interrupted. When backup power is exhausted the system shall
retain the contents of all memories intact.
The backup power capability is not required to provide lighting of the voting area.

3.2.2.5 Electrical Power Disturbance

Vote scanning and counting equipment for paper-based systems, and all DRE
equipment, shall be able to withstand, without disruption of normal operation or loss
of data:
a. Surges of 30% dip @10 ms;
b. Surges of 60% dip @100 ms & 1 sec;
c. Surges of >95% interrupt @5 sec;
d. Surges of ±15% line variations of nominal line voltage; and
e. Electric power increases of 7.5% and reductions of 12.5% of nominal
specified power supply for a period of up to four hours at each power level.

3.2.2.6 Electrical Fast Transient

Vote scanning and counting equipment for paper-based systems, and all DRE
equipment, shall be able to withstand, without disruption of normal operation or loss
of data, electrical fast transients of:
a. ±2 kV on AC and DC external power lines;
b. ±1 kV on all external wires longer than 3 m (non-control); and
c. ±2 kV on all external control wires.

3.2.2.7 Lightning Surge

Vote scanning and counting equipment for paper-based systems, and all DRE
equipment, shall be able to withstand, without disruption of normal operation or loss
of data, surges of:
a. ±2 kV AC line to line;
b. ±2 kV AC line to earth;
c. ±0.5 kV DC line to line (>10 m);
d. ±0.5 kV DC line to earth (>10 m); and
e. ±1 kV I/O signal/control (>30 m).

3.2.2.8 Electrostatic Disruption

Vote scanning and counting equipment for paper-based systems, and all DRE
equipment, shall be able to withstand ±15 kV air discharge and ±8 kV contact
discharge without damage or loss of data. The equipment may reset or have
momentary interruption so long as normal operation is resumed without human
intervention or loss of data. Here, loss of data means loss of votes that have been
completed and confirmed to the voter.

3.2.2.9 Electromagnetic Radiation

Vote scanning and counting equipment for paper-based systems, and all DRE
equipment, shall comply with the Rules and Regulations of the Federal
Communications Commission, Part 15, Class B requirements for both radiated and
conducted emissions.

3.2.2.10 Electromagnetic Susceptibility

Vote scanning and counting equipment for paper-based systems, and all DRE
equipment, shall be able to withstand an electromagnetic field of 10 V/m modulated
by a 1 kHz 80% AM modulation over the frequency range of 80 MHz to 1000 MHz,
without disruption of normal operation or loss of data.

3.2.2.11 Conducted RF Immunity

Vote scanning and counting equipment for paper-based systems, and all DRE
equipment, shall be able to withstand, without disruption of normal operation or loss
of data, conducted RF energy of:
a. 10 V on AC and DC power lines; and
b. 10 V on signal/control lines longer than 3 m.

3.2.2.12 Magnetic Fields Immunity

Vote scanning and counting equipment for paper-based systems, and all DRE
equipment, shall be able to withstand, without disruption of normal operation or loss
of data, AC magnetic fields of 30 A/m at 60 Hz.

3.2.2.13 Environmental Control - Operating Environment

Equipment used for election management activities or vote counting (including both
precinct and central count systems) shall be capable of operation in temperatures
ranging from 50 to 95 degrees Fahrenheit.

3.2.2.14 Environmental Control - Transit and Storage

Equipment used for vote casting, or for counting votes in a precinct count system,
shall meet specific minimum performance standards that simulate exposure to
physical shock and vibration associated with handling and transportation by surface
and air common carriers, and to temperature conditions associated with delivery and
storage in an uncontrolled warehouse environment:
a. High and low storage temperatures ranging from -4 to +140 degrees
Fahrenheit, equivalent to MIL-STD-810D, Methods 501.2 and 502.2,
Procedure I-Storage;
b. Bench handling equivalent to the procedure of MIL-STD-810D, Method
516.3, Procedure VI;
c. Vibration equivalent to the procedure of MIL-STD-810D, Method 514.3,
Category 1- Basic Transportation, Common Carrier; and
d. Uncontrolled humidity equivalent to the procedure of MIL-STD-810D,
Method 507.2, Procedure I-Natural Hot-Humid.

3.2.2.15 Data Network Requirements

Voting systems may use a local or remote data network. If such a network is used,
then all components of the network shall comply with the telecommunications
requirements described in Section 5 of the Standards and the Security requirements
described in Section 6.

3.2.3 Election Management System (EMS) Requirements

The EMS requirements address electronic hardware and software used to conduct the
pre-voting functions defined in Section 2 with regard to ballot preparation, election
programming, ballot and program installation, readiness testing, verification at the
polling place, and verification at the central location.

3.2.3.1 Recording Requirements

Voting systems shall accurately record all election management data entered by the
user, including election officials or their designees. For recording accuracy, all
systems shall:
a. Record every entry made by the user;
b. Add permissible voter selections correctly to the memory components of the
device;
c. Verify the correctness of detection of the user selections and the addition of
the selections correctly to memory;
d. Add various forms of data entered directly by the election official or designee,
such as text, line art, logos, and images;
e. Verify the correctness of detection of data entered directly by the user and the
addition of the selections correctly to memory;
f. Preserve the integrity of election management data stored in memory against
corruption by stray electromagnetic emissions, and internally generated
spurious electrical signals; and

g. Log data errors corrected by the system.
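
Items (c), (e), and (g) above combine write-verification with error logging. A toy
sketch of the pattern (the storage interface here is an assumption):

    def store_and_verify(memory: dict, key: str, value, error_log: list) -> bool:
        """Write an entry, read it back, and log any correction made."""
        memory[key] = value
        if memory.get(key) != value:        # read-back verification
            error_log.append(f"corrected mismatch at {key}")
            memory[key] = value             # rewrite once and re-check
        return memory.get(key) == value

    log: list = []
    assert store_and_verify({}, "contest-1/candidate-A", "Jane Doe", log)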

3.2.3.2 Memory Stability

Electronic system memory devices, used to retain election management data, shall
have demonstrated error-free data retention for a period of 22 months.

3.2.4 Vote Recording Requirements

The vote recording requirements address the enclosure, equipment, and supplies used
by voters to vote.

3.2.4.1 Common Standards

All systems shall provide voting booths or enclosures for poll site use. Such booths or
enclosures may be integral to the voting system or supplied as components of the
voting system, and shall:
a. Be integral to, or make provision for the installation of, the voting device;
b. Ensure by its structure stability against movement or overturning during
entry, occupancy, and exit by the voter;
c. Provide privacy for the voter, and be designed in such a way as to prevent
observation of the ballot by any person other than the voter; and
d. Be capable of meeting the accessibility requirements of Section 2.2.7.1.

3.2.4.2 Paper-Based Recording Standards

The paper-based recording requirements govern:
♦ Ballot cards or sheets, and pages or assemblies of pages containing ballot field
identification data;
♦ Punching devices;
♦ Marking devices;
♦ Frames or fixtures to hold the ballot while it is being punched;
♦ Compartments or booths where voters record selections; and
♦ Secure containers for the collection of voted ballots.

3.2.4.2.1 Paper Ballot Standards

Paper ballots used by paper-based voting systems shall meet the following standards:
a. Punches or marks that identify the unique ballot format, in accordance with
Section 2.3.1.1.1.c., shall be outside the area in which votes are recorded, so
as to minimize the likelihood that these punches or marks will be mistaken for
vote responses and the likelihood that recorded votes will obliterate these
punches or marks;
b. If printed or punched alignment marks are used to locate the vote response
fields on the ballot, these marks shall be outside the area in which votes are
recorded, so as to minimize the likelihood that these marks will be mistaken
for vote responses and the likelihood that recorded votes will obliterate these
marks; and

c. The TDP shall specify the required paper stock, size, shape, opacity, color,
watermarks, field layout, orientation, size and style of printing, size and
location of punch or mark fields used for vote response fields and to identify
unique ballot formats, placement of alignment marks, ink for printing, and
folding and bleed-through limitations for preparation of ballots that are
compatible with the system.

3.2.4.2.2 Punching Devices

Punching devices used by voting systems shall:
a. Be suitable for the type of ballot card specified;
b. Facilitate the clear and accurate recording of each vote intended by the voter;
c. Be designed to avoid excessive damage to vote recorder components; and
d. Incorporate features to ensure that the chad (debris) is completely removed,
without damage to other parts of the ballot card.

3.2.4.2.3 Marking Devices

The TDP shall specify marking devices (such as pens or pencils) that, if used to make
the prescribed form of mark, produce readable marked ballots such that the system
meets the performance requirements for accuracy specified previously. These
specifications shall identify:
a. Specific characteristics of marking devices that affect readability of marked
ballots;
b. Performance capabilities with regard to each characteristic; and
c. For marking devices manufactured by multiple external sources, a listing of
sources and model numbers that are compatible with the system.

3.2.4.2.4 Frames or Fixtures for Punchcard Ballots

The frame or fixture for punchcards shall:
a. Hold the ballot card securely in its proper location and orientation for voting;
b. When contests are not printed directly on the ballot card or sheet, incorporate
an assembly of ballot label pages that identify the offices and issues
corresponding to the proper ballot format for the polling place where it is used
and that are aligned with the voting fields assigned to them; and

c. Incorporate a template to preclude perforation of the card except in the
specified voting fields; a mask to allow punches only in fields designated by
the format of the ballot; and a backing plate for the capture and removal of
chad. This requirement may be satisfied by equipment of a different design as
long as it achieves the same result as the Standards with regard to:
1) Positioning the card;
2) Association of ballot label information with corresponding punch fields;
3) Enabling of only those voting fields that correspond to the format of the
ballot; and
4) Punching the fields and the positive removal of chad.

3.2.4.2.5 Frames or Fixtures for Printed Ballots

A frame or fixture for printed ballot cards is optional. However, if such a device is
provided, it shall:
a. Be of any size and shape consistent with its intended use;
b. Position the card properly;
c. Hold the ballot card securely in its proper location and orientation for voting;
and
d. Comply with the requirements for design and construction contained in
Section 3.4.

3.2.4.2.6 Ballot Boxes and Ballot Transfer Boxes

Ballot boxes and ballot transfer boxes, which serve as secure containers for the
storage and transportation of voted ballots, shall:
a. Be of any size, shape, and weight commensurate with their intended use;
b. Incorporate locks or seals, the specifications of which are described in the
system documentation;
c. Provide specific points where ballots are inserted, with all other points on the
box constructed in a manner that prevents ballot insertion; and
d. For precinct count systems, contain separate compartments for the segregation
of unread ballots, ballots containing write-in votes, or any irregularities that
may require special handling or processing. In lieu of compartments, the
conversion processing may mark such ballots with an identifying spot or
stripe to facilitate manual segregation.

3.2.4.3 DRE Systems Recording Requirements

The DRE systems recording requirements address the detection and recording of
votes, including the logic and data processing functions required to determine the
validity of voter selections, to accept and record valid selections, and to reject invalid
selections. The requirements also address the physical environment in which ballots
are cast.

3.2.4.3.1 Activity Indicator

DRE systems shall include an audible or visible activity indicator providing the status
of each voting device. This indicator shall:
a. Indicate whether the device has been activated for voting; and
b. Indicate whether the device is in use.

3.2.4.3.2 DRE System Vote Recording

To ensure vote recording accuracy and integrity while protecting the anonymity of the
voter, all DRE systems shall:
a. Contain all mechanical, electromechanical, and electronic components;
software; and controls required to detect and record the activation of
selections made by the voter in the process of voting and casting a ballot;
b. Incorporate redundant memories to detect and allow correction of errors
caused by the failure of any of the individual memories;
c. Provide at least two processes that record the voter’s selections and that:
1) To the extent possible, are isolated from each other;
2) Designate one process and associated storage location as the main vote
detection, interpretation, processing and reporting path; and
3) Use a different process to store ballot images, for which the method of
recording may include any appropriate encoding or data compression
procedure consistent with the regeneration of an unequivocal record of
the ballot as cast by the voter.
d. Provide a capability to retrieve ballot images in a form readable by humans;
and
e. Ensure that all processing and storage protects the anonymity of the voter.
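
Item (c) above calls for two isolated recording paths, with the ballot-image path
free to use any encoding from which the ballot can be regenerated exactly. A
compressed Python sketch of that idea (JSON stands in for whatever encoding a real
system would use):

    import json

    def encode(selections):
        """Any encoding is acceptable if the ballot can be regenerated exactly."""
        return json.dumps(sorted(selections))

    def decode(image):
        return tuple(json.loads(image))

    def record_vote(vote_store, image_store, selections):
        """Record selections along two paths and cross-check them."""
        vote_store.append(tuple(sorted(selections)))   # main vote-data path
        image_store.append(encode(selections))         # separate ballot-image path
        assert decode(image_store[-1]) == vote_store[-1]

    votes, images = [], []
    record_vote(votes, images, ["Governor: Jane Doe", "Measure 1: Yes"])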

3.2.4.3.3 Recording Accuracy

DRE systems shall meet the following requirements for recording accurately each
vote and ballot cast:
a. Detect every selection made by the voter;
b. Correctly add permissible selections to the memory components of the device;
c. Verify the correctness of the detection of the voter selections and the addition
of the selections to memory;
d. Achieve an error rate not to exceed the requirement indicated in Section 3.2.1;
e. Preserve the integrity of voting data and ballot images (for DRE machines)
stored in memory for the official vote count and audit trail purposes against
corruption by stray electromagnetic emissions, and internally generated
spurious electrical signals; and
f. Maintain a log of corrected data.

3.2.4.3.4 Recording Reliability

Recording reliability refers to the ability of the DRE system to record votes accurately
at its maximum rated processing volume for a specified period of time. The DRE
system shall record votes reliably in accordance with the requirements of Section
3.4.3.

3.2.5 Paper-based Conversion Requirements

The paper-based conversion requirements address the ability of the system to read the
ballot card and to translate its pattern of punches or marks into electronic signals for
later processing. These capabilities may be built into the voting system in an
integrated fashion, or may be provided by one or more components that are not unique
to the system, such as a general-purpose data processing card reader or read head
suitably interfaced to the system. These requirements address two major functions:
ballot handling and ballot reading.

3.2.5.1 Ballot Handling

Ballot handling consists of a ballot card’s acceptance, movement through the read
station, and transfer into a collection station or receptacle.

3.2.5.1.1 Capacity (Central Count)

The capacity to convert the punches or marks on individual ballots into signals is
uniquely important to central count systems. The capacity for a central count system
shall be documented by the vendor. This documentation shall include the capacity for
individual components that impact the overall capacity.

3.2.5.1.2 Exception Handling (Central Count)

This requirement refers to the handling of ballots for a central count system when they
are unreadable or when some condition is detected requiring that the cards be
segregated from normally processed ballots for human review. In response to an
unreadable ballot or a write-in vote, all central count paper-based systems shall:
a. Outstack the ballot, or
b. Stop the ballot reader and display a message prompting the election official or
designee to remove the ballot, or
c. Mark the ballot with an identifying mark to facilitate its later identification.
Additionally, the system shall provide a capability that can be activated by an
authorized election official to identify ballots containing overvotes, blank ballots, and
ballots containing undervotes in a designated race. If enabled, these capabilities shall
perform one of the above actions in response to the indicated condition.

3.2.5.1.3 Exception Handling (Precinct Count)

This requirement refers to the handling of ballots for a precinct count system when
they are unreadable or when some condition is detected requiring that the cards be
segregated from normally processed ballots for human review. All paper based
precinct count systems shall:
a. In response to an unreadable or blank ballot, return the ballot and provide a
message prompting the voter to examine the ballot;
b. In response to a ballot with a write-in vote, segregate the ballot or mark the
ballot with an identifying mark to facilitate its later identification;
c. In response to a ballot with an overvote the system shall:
1) Provide a capability to identify an overvoted ballot;
2) Return the ballot;
3) Provide an indication prompting the voter to examine the ballot;
4) Allow the voter to submit the ballot with the overvote; and
5) Provide a means for an authorized election official to deactivate this
capability entirely and by contest; and
d. In response to a ballot with an undervote the system shall:
1) Provide a capability to identify an undervoted ballot;
2) Return the ballot;
3) Provide an indication prompting the voter to examine the ballot;
4) Allow the voter to submit the ballot with the undervote; and
5) Provide a means for an authorized election official to deactivate this
capability.
3.2.5.1.4 Multiple Feed Prevention

Multiple feed refers to the situation arising when a ballot reader attempts to read more
than one ballot at a time. The requirements govern the ability of a ballot reader to
prevent multiple feed or to detect and provide an alarm indicating multiple feed.
a. If multiple feed is detected, the card reader shall halt in a manner that permits
the operator to remove the unread cards causing the error, and reinsert them in
the card input hopper.
b. The frequency of multiple feeds with ballots intended for use with the system
shall not exceed 1 in 10,000.

3.2.5.2 Ballot Reading Accuracy

This paper-based system requirement governs the conversion of the physical ballot
into electronic data. Reading accuracy for ballot conversion refers to the ability to:
♦ Recognize vote punches or marks, or the absence thereof, for each possible
selection on the ballot;
♦ Discriminate between valid punches or marks and extraneous perforations,
smudges, and folds; and
♦ Convert the vote punches or marks, or the absence thereof, for each possible
selection on the ballot into digital signals.
To ensure accuracy, paper-based systems shall:
a. Detect punches or marks that conform to vendor specifications with an error
rate not exceeding the requirement indicated in Section 3.2.1;
b. Ignore, and not record, extraneous perforations, smudges, and folds; and


c. Reject ballots that meet all vendor specifications at a rate not to exceed 2
percent.

3.2.6 Processing Requirements

Processing requirements apply to the hardware and software required to accumulate
voting data for all candidates and measures within voting machines and polling places,
and to consolidate the voting data at a central level or multiple levels. These
requirements also address the generation and maintenance of audit records, the
detection and disabling of improper use or operation of the system, and the monitoring
of overall system status. Separate and distinct requirements for paper-based and DRE
voting systems are presented below.

3.2.6.1 Paper-Based System Processing Requirements

The paper-based processing requirements address all mechanical devices,
electromechanical devices, electronic devices, and software required to perform the
logical and numerical functions of interpreting the electronic image of the voted
ballot, and assigning votes to the proper memory registers.

3.2.6.1.1 Processing Accuracy

Processing accuracy refers to the ability of the system to receive electronic signals
produced by punches for punchcard systems and vote marks and timing information
for marksense systems; perform logical and numerical operations upon these data; and
reproduce the contents of memory when required, without error. Specific
requirements are detailed below:
a. Processing accuracy shall be measured by vote selection error rate, the ratio of
uncorrected vote selection errors to the total number of ballot positions that
could be recorded across all ballots when the system is operated at its nominal
or design rate of processing;
b. The vote selection error rate shall include data that denotes ballot style or
precinct as well as data denoting a vote in a specific contest or ballot
proposition;
c. The vote selection error rate shall include all errors from any source; and
d. The vote selection error rate shall not exceed the requirement indicated in
Section 3.2.1.
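
As a simple arithmetic illustration of items (a) through (d), the vote selection error rate can be computed directly from its definition. The figures below are hypothetical; the acceptance threshold itself comes from Section 3.2.1.

    def vote_selection_error_rate(uncorrected_errors, ballots, positions_per_ballot):
        # Ratio of uncorrected vote selection errors to the total number of
        # ballot positions that could be recorded across all ballots.
        return uncorrected_errors / (ballots * positions_per_ballot)

    # Hypothetical run: 2 uncorrected errors over 10,000 ballots of 50
    # positions each gives 2 / 500,000 = 4e-06.
    rate = vote_selection_error_rate(2, 10_000, 50)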

3.2.6.1.2 Memory Stability

Paper-based system memory devices, used to retain control programs and data, shall
have demonstrated error-free data retention for a period of 22 months, under the
environmental conditions for operation and non-operation (i.e., storage).

3.2.6.2 DRE System Processing Requirements

The DRE system processing requirements address all mechanical devices,
electromechanical devices, electronic devices, and software required to process voting
data after the polling places are closed.

3.2.6.2.1 Processing Speed

DRE voting systems shall meet the following requirements for processing speed:
a. Operate at a speed sufficient to respond to any operator and voter input
without perceptible delay (no more than three seconds); and
b. If the consolidation of polling place data is done locally, perform this
consolidation in a time not to exceed five minutes for each device in the
polling place.
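
A test harness might verify both limits by timing each operation, along the following lines. This is a hypothetical sketch, not a prescribed test method; only the three-second and five-minute limits are taken from the requirement.

    import time

    MAX_RESPONSE_SECONDS = 3.0          # requirement (a)
    MAX_CONSOLIDATION_SECONDS = 300.0   # requirement (b): five minutes per device

    def meets_limit(operation, limit_seconds):
        # Run the operation and report the elapsed time and pass/fail.
        start = time.monotonic()
        operation()
        elapsed = time.monotonic() - start
        return elapsed, elapsed <= limit_seconds
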
3.2.6.2.2 Processing Accuracy

Processing accuracy is defined as the ability of the system to process voting data
stored in DRE voting devices, or in removable memory modules installed in such
devices. Processing includes all operations to consolidate voting data after the polling
places have been closed. DRE voting systems shall:
a. Produce reports that are completely consistent, with no discrepancy among
reports of voting device data produced at any level; and
b. Produce consolidated reports containing absentee, provisional, or other voting
data that are similarly error-free. Any discrepancy, regardless of source, is
resolvable to a procedural error, to the failure of a non-memory device, or to
an external cause.
3.2.6.2.3 Memory Stability

DRE system memory devices used to retain control programs and data shall have
demonstrated error-free data retention for a period of 22 months. Error-free retention
may be achieved by the use of redundant memory elements, provided that the
capability for conflict resolution or correction among elements is included.

3.2.7 Reporting Requirements

The reporting requirements govern all mechanical, electromechanical, and electronic
devices required for voting systems to print audit record entries and results of the
tabulation. These requirements also address data storage media for transportation of
data to other sites.

3.2.7.1 Removable Storage Media

In voting systems that use storage media that can be removed from the system and
transported to another location for readout and report generation, these media shall use
devices with demonstrated error-free retention for a period of 22 months under the
environmental conditions for operation and non-operation contained in Section 3.2.2.
Examples of removable storage media include: programmable read-only memory
(PROM), random access memory (RAM) with battery backup, magnetic media, or
optical media.

3.2.7.2 Printers

All printers used to produce reports of the vote count shall be capable of producing:
a. Alphanumeric headers;
b. Election, office and issue labels; and
c. Alphanumeric entries generated as part of the audit record.

3.2.8 Vote Data Management Requirements

The vote data management requirements for all systems address capabilities that
manage, process, and report voting data after the data has been consolidated at the
polling place or other intermediate levels. These capabilities allow the system to:
a. Consolidate voting data from polling place data memory or transfer devices;
b. Report polling place summaries; and
c. Process absentee ballots, data entered manually, and administrative ballot
definition data.
The requirements address all hardware and software required to generate output
reports in the various formats required by the using jurisdiction.

3.2.8.1 Data File Management

All voting systems shall provide the capability to:
a. Integrate voting data files with ballot definition files;
b. Verify file compatibility; and
c. Edit and update files as required.

3.2.8.2 Data Report Generation

All voting systems shall include report generators for producing output reports at the
device, polling place, and summary level, with provisions for administrative and
judicial subdivisions as required by the using jurisdiction.

3.3 Physical Characteristics

This section covers physical characteristics of all voting systems and components that
affect their general utility and suitability for election operations.

3.3.1 Size

There is no numerical limitation on the size of any voting system equipment, but the
size of each device should be compatible with its intended use and the location at
which the equipment is to be used.

3.3.2 Weight

There is no numerical limitation on the weight of any voting system equipment, but
the weight of each device should be compatible with its intended use and the location
at which the equipment is to be used.

3.3.3 Transport and Storage of Precinct Systems

All precinct systems shall:
a. Provide a means to safely and easily handle, transport, and install polling
place equipment, such as wheels or a handle or handles; and
b. Be capable of using, or be provided with, a protective enclosure rendering the
equipment capable of withstanding:
1) Impact, shock and vibration loads accompanying surface and air
transportation; and
2) Stacking loads accompanying storage.

3.4 Design, Construction, and Maintenance Characteristics

This section covers voting system materials, construction workmanship, and specific
design characteristics important to the successful operation and efficient maintenance
of the system.

3.4.1 Materials, Processes, and Parts

The approach to system design is unrestricted, and may incorporate any form or
variant of technology capable of meeting the voting systems requirements and
standards.
Precinct count systems shall be designed in accordance with best commercial practice
for microcomputers, process controllers, and their peripheral components. Central
count voting systems and equipment used in a central tabulating environment shall be
designed in accordance with best commercial and industrial practice.


All voting systems shall:
a. Be designed and constructed so that the frequency of equipment malfunctions
and maintenance requirements are reduced to the lowest level consistent with
cost constraints;
b. Include, as part of the accompanying TDP, an approved parts list; and
c. Exclude parts or components not included in the approved parts list.

3.4.2 Durability

All voting systems shall be designed to withstand normal use without deterioration
and without excessive maintenance cost for a period of ten years.

3.4.3 Reliability

The reliability of voting system devices shall be measured as mean time between
failure (MTBF) for the system submitted for testing. MTBF is defined as the value of
the ratio of operating time to the number of failures which have occurred in the
specified time interval. A typical system operations scenario consists of approximately
45 hours of equipment operation, consisting of 30 hours of equipment set-up and
readiness testing and 15 hours of elections operations. For the purpose of
demonstrating compliance with this requirement, a failure is defined as any event
which results in either the:
a. Loss of one or more functions; or
b. Degradation of performance such that the device is unable to perform its
intended function for longer than 10 seconds.
The MTBF demonstrated during qualification testing shall be at least 163 hours.
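
The MTBF computation implied by this definition is straightforward; the sketch below is illustrative only, with a hypothetical test run, and the 163-hour threshold is the figure stated above.

    def mtbf(operating_hours, failures):
        # MTBF: ratio of operating time to the number of failures observed.
        if failures == 0:
            return float("inf")   # no failures observed in the interval
        return operating_hours / failures

    # Hypothetical qualification run: one failure in 180 hours of operation.
    assert mtbf(180.0, 1) >= 163.0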

3.4.4 Maintainability

Maintainability represents the ease with which maintenance actions can be performed
based on the design characteristics of equipment and software and the processes the
vendor and election officials have in place for preventing failures and for reacting to
failures. Maintainability includes the ability of equipment and software to
self-diagnose problems and make non-technical election workers aware of a problem.
Maintainability addresses all scheduled and unscheduled maintenance events that are
performed to:

• Determine the operational status of the system or a component;
• Adjust, align, tune, or service components;
• Repair or replace a component having a specified operating life or
replacement interval;
• Repair or replace a component that exhibits an undesirable predetermined
physical condition or performance degradation;
• Repair or replace a component that has failed; and
• Verify the restoration of a component, or the system, to operational status.

Maintainability shall be determined based on the presence of specific physical
attributes that aid system maintenance activities, and the ease with which system
maintenance tasks can be performed by the ITA. Although a more quantitative basis
for assessing maintainability, such as the mean time to repair the system, is desirable,
the qualification of a system is conducted before it is approved for sale and thus before
a broader base of maintenance experience can be obtained.

3.4.4.1 Physical Attributes

The following physical attributes will be examined to assess maintainability:
a. Presence of labels and the identification of test points;
b. Provision of built-in test and diagnostic circuitry or physical indicators of
condition;
c. Presence of labels and alarms related to failures; and
d. Presence of features that allow non-technicians to perform routine
maintenance tasks (such as update of the system database).

3.4.4.2 Additional Attributes

The following additional attributes will be considered to assess system
maintainability:
a. Ease of detecting that equipment has failed by a non-technician;
b. Ease of diagnosing problems by a trained technician;
c. Low false alarm rates (i.e., indications of problems that do not exist);
d. Ease of access to components for replacement;
e. Ease with which adjustment and alignment can be performed;

f. Ease with which database updates can be performed by a non-technician; and

g. Ease with which components can be adjusted, aligned, tuned, or serviced.

3.4.5 Availability

The availability of a voting system is defined as the probability that the equipment
(and supporting software) needed to perform designated voting functions will respond
to operational commands and accomplish the function. The voting system shall meet
the availability standard for each of the following voting functions:
a. For all paper-based systems:
1) Recording voter selections (such as by ballot marking or punch); and
2) Scanning the punches or marks on paper ballots and converting them into
digital data;
b. For all DRE systems, recording and storing the voter’s ballot selections;
c. For precinct-count systems (paper-based and DRE), consolidation of vote
selection data from multiple precinct-based systems to generate
jurisdiction-wide vote counts, including storage and reporting of the
consolidated vote data; and
d. For central-count systems (paper-based and DRE), consolidation of vote
selection data from multiple counting devices to generate jurisdiction-wide
vote counts, including storage and reporting of the consolidated vote data.
System availability is measured as the ratio of the time during which the system is
operational (up time) to the total time period of operation (up time plus down time).
Inherent availability (Ai) is the fraction of time a system is functional, based upon
Mean Time Between Failure (MTBF) and Mean Time to Repair (MTTR), that is:
Ai = (MTBF)/(MTBF + MTTR)
Mean Time to Repair (MTTR) is the average time required to perform a corrective
maintenance task during periods of system operation. Corrective maintenance task
time is active repair time, plus the time attributable to other factors that could lead to
logistic or administrative delays, such as notification of qualified maintenance
personnel and travel time for such personnel to arrive at the appropriate site.
Corrective maintenance may consist of substitution of the complete device or one of
its components, as in the case of precinct count and some central count systems, or it
may consist of on-site repair.
The voting system shall achieve at least ninety-nine percent availability during normal
operation for the functions indicated above. This standard encompasses, for each
function, the combination of all devices and components that support the function,
including their MTTR and MTBF attributes.
Vendors shall specify the typical system configuration that is to be used to assess
availability, and any assumptions made with regard to any parameters that impact the
MTTR. These factors shall include at a minimum:
a. Recommended number and locations of spare devices or components to be
kept on hand for repair purposes during periods of system operation;
b. Recommended number and locations of qualified maintenance personnel who
need to be available to support repair calls during system operation; and
c. Organizational affiliation (i.e., jurisdiction, vendor) of qualified maintenance
personnel.

3.4.6 Product Marking

All voting systems shall:
a. Identify all devices by means of a permanently affixed nameplate or label
containing the name of the manufacturer or vendor, the name of the device, its
part or model number, its revision letter, its serial number, and if applicable,
its power requirements;
b. Display on each device a separate data plate containing a schedule for and list
of operations required to service or to perform preventive maintenance; and
c. Display advisory caution and warning instructions to ensure safe operation of
the equipment and to avoid exposure to hazardous electrical voltages and
moving parts at all locations where operation or exposure may occur.

3.4.7 Workmanship

To help ensure proper workmanship, all manufacturers of voting systems shall:
a. Adopt and adhere to practices and procedures to ensure that their products are
free from damage or defect that could make them unsatisfactory for their
intended purpose; and
b. Ensure that components provided by external suppliers are free from damage
or defect that could make them unsatisfactory for their intended purpose.

3.4.8 Safety

All voting systems shall meet the following requirements for safety:
a. All voting systems and their components shall be designed so as to eliminate
hazards to personnel, or to the equipment itself;
b. Defects in design and construction that can result in personal injury or
equipment damage must be detected and corrected before voting systems and
components are placed into service; and
c. Equipment design for personnel safety shall be equal to or better than the
appropriate requirements of the Occupational Safety and Health Act (OSHA),
as identified in Title 29, part 1910, of the Code of Federal Regulations.

3.4.9 Human Engineering—Controls and Displays

All voting systems and components shall be designed and constructed so as to
simplify and facilitate the functions required, and to eliminate the likelihood of
erroneous stimuli and responses on the part of the voter or operator. Other specific
requirements for controls and displays are described below. In addition, specific
functional requirements for system use by voters with disabilities are described in
Section 2.2.7 of the Standards. Appendix C provides additional advisory guidance on
the application of human engineering principles to the interface between the voter and
the voting system.
All voting systems shall meet the following requirements for controls and displays:
a. In all systems, controls used by the voter or equipment operator shall be
conveniently located, shall use designs that are consistent with their functions,
and shall be clearly labeled. Instruction plates shall be provided, if they are
necessary to avoid ambiguity or incorrect actuation;
b. Information or data displays shall be large enough to be readable by voters
and operators with no disabilities and by voters with disabilities consistent
with the requirements defined in Section 2.2.7 of the Standards;
c. Status displays shall meet the same requirements as data displays, and they
shall also follow conventional industrial practice with respect to color:
1) Green, blue, or white displays shall be used for indications of normal
status;
2) Amber indicators shall be used to indicate warnings or marginal status;
and
3) Red indicators shall be used to indicate error conditions or equipment
states that may result in damage, or in hazards to personnel; and unless
the equipment is designed to halt under conditions of incipient damage or
hazard, an audible alarm shall also be provided.
d. Color coding shall be selected so as to assure correct perception by voters and
operators with color blindness; and shall not be used as the only means of
conveying information, indicating an action, prompting a response, or
distinguishing a visual element (see Appendix B for suggested references);
and
e. The system’s display shall not use flashing or blinking text, objects, or other
elements having a flash or blink frequency greater than 2 Hz and lower than
55 Hz.

Volume I, Section 4
Table of Contents

4 Software Standards
4.1 Scope
4.1.1 Software Sources
4.1.2 Location and Control of Software and Hardware on Which it Operates
4.1.3 Exclusions
4.2 Software Design and Coding Standards
4.2.1 Selection of Programming Languages
4.2.2 Software Integrity
4.2.3 Software Modularity and Programming
4.2.4 Control Constructs
4.2.5 Naming Conventions
4.2.6 Coding Conventions
4.2.7 Comment Conventions
4.3 Data and Document Retention
4.4 Audit Record Data
4.4.1 Pre-election Audit Records
4.4.2 System Readiness Audit Records
4.4.3 In-Process Audit Records
4.4.4 Vote Tally Data
4.5 Vote Secrecy (DRE Systems)

4 Software Standards

4.1 Scope

This section describes essential design and performance characteristics of the software
used in voting systems, addressing both system-level software, such as operating
systems, and voting system application software, including firmware. The
requirements of this section are intended to ensure that voting system software is
reliable, robust, testable, and maintainable. The standards in this section also support
system accuracy, logical correctness, privacy, security and integrity.
The general requirements of this section apply to software used to support the entire
range of voting system activities described in Section 2. More specific requirements
are defined for ballot counting, vote processing, creating an audit trail, and generating
output reports and files. Although this section emphasizes software, the standards
described also influence hardware design considerations.
This section recognizes that there is no best way to design software. Many
programming languages are available for which modern programming practices are
applicable, such as the use of rigorous program and data structures, data typing, and
naming conventions. Other programming languages exist for which such practices are
not easily applied.
The Standards are intended to guide the design of software written in any of the
programming languages commonly used for mainframe, mini-computer, and
microprocessor systems. They are not intended to preclude the use of other languages
or environments, such as those that exhibit “declarative” structure, “object-oriented”
languages, “functional” programming languages, or any other combination of
language and implementation that provides appropriate levels of performance,
testability, reliability, and security. The vendor makes specific software selections.
However, the use of widely recognized and proven software design methods will
facilitate the analysis and testing of voting system software in the qualification
process.

4.1.1 Software Sources

The requirements of this section apply generally to all software used in voting
systems, including:
♦ Software provided by the voting system vendor and its component suppliers;
♦ Software furnished by an external provider (for example, providers of COTS
operating systems and web browsers) where the software may be used in any
way during voting system operation; and
♦ Software developed by the voting jurisdiction.
Compliance with the requirements of the software standards is assessed by several
formal tests, including code examination. Unmodified software is not subject to code
examination; however, source code generated by a package and embedded in software
modules for compilation or interpretation shall be provided in human readable form to
the ITA. The ITA may inspect source code units to determine testing requirements or
to verify that the code is unmodified and that the default configuration options have
not been changed.
Configuration of software, both operating systems and applications, is critical to
proper system functioning. Correct test design and sufficient test execution must
account for the intended and proper configuration of all system components.
Therefore, the vendors shall submit to the ITA, in the TDP, a record of all user
selections made during software installation. The vendor shall also submit a record of
all configuration changes made to the software following its installation. The ITA
shall confirm the propriety and correctness of these user selections and configuration
changes.

4.1.2 Location and Control of Software and Hardware on Which it Operates

The requirements of this section apply to all software used in any manner to support
any voting-related activities, regardless of the ownership of the software or the
ownership and location of the hardware on which the software is installed or operates.
These requirements apply to:
♦ Software that operates on voting devices and vote counting devices installed
at polling places under the control of the voting jurisdiction;
♦ Software that operates on ballot printers, vote counting devices, and other
hardware typically installed at central or precinct locations (including
contractor facilities); and
♦ Election management software.


However, some requirements apply only in specific situations indicated in this section.
In addition to the requirements of this section, all software used in any manner to
support any voting-related activities shall meet the requirements for security described
in Section 6 of the Standards.

4.1.3 Exclusions

Some voting systems use equipment, such as personal computers, that may be used for
other purposes and have resident on the equipment general purpose software such as
operating systems, programming language compilers, database management systems,
and Web browsers. Such software is governed by the Standards unless:
♦ The software provides no support of voting system capabilities;
♦ The software is removable, disconnectable, or switchable such that it cannot
function while voting system functions are enabled; and
♦ Procedures are provided that confirm that the software has been removed,
disconnected, or switched.

4.2 Software Design and Coding Standards

The software used by voting systems is selected by the vendor and not prescribed by
the Standards. This section provides standards for voting system software with regard
to:
♦ Selection of programming languages;
♦ Software integrity;
♦ Software modularity and programming;
♦ Control constructs;
♦ Naming conventions;
♦ Coding conventions; and
♦ Comment conventions.

4.2.1 Selection of Programming Languages

Software associated with the logical and numerical operations of vote data shall use a
high-level programming language, such as Pascal, Visual Basic, Java, C, and C++.
The requirement for the use of high-level language for logical operations does not
preclude the use of assembly language for hardware-related segments, such as device
controllers and handler programs. Also, operating system software may be designed in
assembly language.

4.2.2 Software Integrity

Self-modifying, dynamically loaded, or interpreted code is prohibited, except under
the security provisions outlined in Section 6.4.e. This prohibition is to ensure that the
software tested and approved during the qualification process remains unchanged and
retains its integrity. External modification of code during execution shall be
prohibited. Where the development environment (programming language and
development tools) includes the following features, the software shall provide controls
to prevent accidental or deliberate attempts to replace executable code:
♦ Unbounded arrays or strings (includes buffers used to move data);
♦ Pointer variables; and
♦ Dynamic memory allocation and management.
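
As an illustration of the kind of control intended, a fixed-capacity buffer can refuse writes beyond its bound. This sketch is hypothetical and uses Python, which manages memory automatically; in a language with unbounded buffers or pointers, the same check would guard the raw operation.

    class BoundedBuffer:
        # Fixed-capacity buffer that rejects writes beyond its bound.

        def __init__(self, capacity):
            self.capacity = capacity
            self._data = bytearray()

        def append(self, chunk):
            if len(self._data) + len(chunk) > self.capacity:
                raise OverflowError("write would exceed fixed buffer capacity")
            self._data.extend(chunk)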

4.2.3 Software Modularity and Programming

Voting system application software, including COTS software, shall be designed in a
modular fashion. However, COTS software is not required to be inspected for
compliance with this requirement. For the purpose of this requirement¹, “modules”
may be compiled or interpreted independently. Modules may also be nested. The
modularity rules described here apply to the component sub-modules of a library. The
principal concept is that the module contains all the elements to compile or interpret
successfully and has limited access to data in other modules. The design concept is
simple replacement with another module whose interfaces match the original module.
A module is designed in accordance with the following rules:

¹ Some software languages and development environments use a different definition of module, but this principle still applies.

a. Each module shall have a specific function that can be tested and verified
independently of the remainder of the code. In practice, some additional
modules (such as library modules) may be needed to compile the module
under test, but the modular construction allows the supporting modules to be
replaced by special test versions that support test objectives;
b. Each module shall be uniquely and mnemonically named, using names that
differ by more than a single character. In addition to the unique name, the
modules shall include a set of header comments identifying the module’s
purpose, design, conditions, and version history, followed by the operational
code. Headers are optional for modules of fewer than ten executable lines
where the subject module is embedded in a larger module that has a header
containing the header information. Library modules shall also have a header
comment describing the purpose of the library and version information;
c. All required resources, such as data accessed by the module, should either be
contained within the module or explicitly identified as input or output to the
module. Within the constraints of the programming language, such resources
shall be placed at the lowest level where shared access is needed. If that
shared access level is across multiple modules, the definitions should be
defined in a single file (called header files in some languages, such as C)
where any changes can be applied once and the change automatically applies
to all modules upon compilation or activation;
d. A module is small enough to be easy to follow and understand. Program
logic visible on a single page is easy to follow and correct. Volume II,
Section 5 provides testing guidelines for the ITA to identify large modules
subject to review under this requirement;
e. Each module shall have a single entry point, and a single exit point, for
normal process flow. For library modules or languages such as the
object-oriented languages, the entry point is to the individual contained
module or method invoked. The single exit point is the point where control is returned.
At that point, the data that is expected as output must be appropriately set.
The exception for the exit point is where a problem is so severe that execution
cannot be resumed. In this case, the design must explicitly protect all
recorded votes and audit log information and must implement formal
exception handlers provided by the language; and
f. Process flow within the modules shall be restricted to combinations of the
control structures defined in Volume II, Section 5. These structures support
the modular concept, especially the single entry/exit rule above. They apply
to any language feature where program control passes from one activity to the
next, such as control scripts, object methods, or sets of executable statements,
even though the language itself is not procedural. A minimal illustrative
sketch of these rules follows this list.
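
A minimal sketch of these rules, in Python and with invented names, might look as follows: a header comment per rule (b), all resources passed explicitly per rule (c), and a single entry and exit point per rule (e). It is illustrative only and makes no claim about any actual voting system module.

    """Module: tally_contest (hypothetical example)
    Purpose: compute per-candidate totals for one contest.
    Inputs:  ballots -- iterable of selection lists; candidates -- valid names.
    Output:  dict mapping each candidate name to a vote count.
    Version: 0.1, illustrative only.
    """

    def tally_contest(ballots, candidates):
        # Single entry point; every resource is an explicit parameter.
        totals = {name: 0 for name in candidates}
        for selections in ballots:
            for choice in selections:
                if choice in totals:
                    totals[choice] += 1
        # Single exit point, with the expected output fully set.
        return totals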

4.2.4 Control Constructs

Voting system software shall use the control constructs identified in Volume II,
Section 5:
a. Acceptable constructs are Sequence, If-Then-Else, Do-While, Do-Until, Case,
and the General loop (including the special case for loop);
b. If the programming language used does not provide these control constructs,
the vendor shall provide them (that is, comparable control structure logic).
The constructs shall be used consistently throughout the code. No other
constructs shall be used to control program logic and execution;
c. While some programming languages do not create programs as linear
processes, stepping from an initial condition, through changes, to a
conclusion, the program components nonetheless contain procedures (such as
“methods” in object-oriented languages). Even in these programming
languages, the procedures must execute through these control constructs (or
their equivalents, as defined and provided by the vendor); and
d. Operator intervention or logic that evaluates received or stored data shall not
re-direct program control within a program routine. Program control may be
re-directed within a routine by calling subroutines, procedures, and functions,
and by interrupt service routines and exception handlers (due to abnormal
error conditions). Do-While (False) constructs and intentional exceptions
(used as GoTos) are prohibited.
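
Expressed in Python (3.10 or later for the Case construct), the acceptable constructs of item (a) look as follows. This is an illustrative sketch with invented names, not code from any voting system.

    def describe_status(code):
        match code:                    # Case construct
            case "ok":
                label = "normal"
            case "warn":
                label = "marginal"
            case _:
                label = "error"
        return label

    def count_valid(marks, threshold):
        total = 0                      # Sequence
        for mark in marks:             # General loop (special-case for loop)
            if mark >= threshold:      # If-Then-Else
                total += 1
        remaining = 3
        while remaining > 0:           # Do-While
            remaining -= 1
        return total                   # single exit for normal flow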

4.2.5 Naming Conventions

Voting system software shall use the following naming conventions:
a. Object, function, procedure, and variable names shall be chosen so as to
enhance the readability and intelligibility of the program. Insofar as possible,
names shall be selected so that their parts of speech represent their use, such
as nouns to represent objects, verbs to represent functions, etc.;
b. Names used in code and in documentation shall be consistent;
c. Names shall be unique within an application. Names shall differ by more than
a single character. All single-character names are forbidden except those for
variables used as loop indexes. In large systems where subsystems tend to be
developed independently, duplicate names may be used where the scope of
the name is unique within the application. Names should always be unique
where modules are shared; and


d. Language keywords shall not be used as names of objects, functions,
procedures, variables, or in any manner not consistent with the design of the
language.
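
A short illustration of rules (a) through (c), with invented names: nouns name objects, verb phrases name functions, and single-character names appear only as loop indexes.

    precinct_totals = {}                                # noun: an object

    def accumulate_precinct_totals(precinct_id, votes): # verb phrase: a function
        # The name is unique in the application and matches documentation.
        precinct_totals[precinct_id] = precinct_totals.get(precinct_id, 0) + votes

    for i in range(5):   # single-character name permitted only as a loop index
        accumulate_precinct_totals("P-%d" % i, 0)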

4.2.6 Coding Conventions

Voting system software shall adhere to basic coding conventions. The coding
conventions used shall meet one of the following conditions:
a. The vendors shall identify the published, reviewed, and industry-accepted
coding conventions used and the ITAs shall test for compliance; or
b. The ITAs shall evaluate the code using the coding convention requirements
specified in Volume II, Section 5.
These standards reference conventions that protect the integrity and security of the
code, which may be language-specific, and language-independent conventions that
significantly contribute to readability and maintainability. Specific style conventions
that support economical testing are not binding unless adopted by the vendor.

4.2.7 Comment Conventions

Voting system software shall use the following comment conventions:
a. All modules shall contain headers. For small modules of 10 lines or less, the
header may be limited to identification of unit and revision information. Other
header information should be included in the small unit headers if not clear from
the actual lines of code. Header comments shall provide the following
information:
1) The purpose of the unit and how it works;
2) Other units called and the calling sequence;
3) A description of input parameters and outputs;
4) File references by name and method of access (read, write, modify,
append, etc.);
5) Global variables used; and
6) Date of creation and a revision record;
b. Descriptive comments shall be provided to identify objects and data types. All
variables shall have comments at the point of declaration clearly explaining
their use. Where multiple variables that share the same meaning are required,
the variables may share the same comment;
c. In-line comments shall be provided to facilitate interpretation of functional
operations, tests, and branching;
d. Assembly code shall contain descriptive and informative comments such that
its executable lines can be clearly understood; and
e. All comments shall be formatted in a uniform manner that makes it easy to
distinguish them from executable code.
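
A header carrying the six items of convention (a) might look like the following hypothetical template; the unit and its details are invented for illustration.

    """Unit:    consolidate_returns (hypothetical example)
    Purpose:  sums precinct subtotals into one jurisdiction total (how it works).
    Calls:    none; called by report generation as consolidate_returns(subtotals).
    Inputs:   subtotals -- list of integer vote counts. Output: their sum.
    Files:    none read, written, modified, or appended.
    Globals:  none used.
    Created:  2002-04-01; revision record: 1.0, initial illustrative version.
    """

    def consolidate_returns(subtotals):
        # In-line comment per convention (c): simple summation, no branching.
        return sum(subtotals)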

4.3 Data and Document Retention

All systems shall:
a. Maintain the integrity of voting and audit data during an election, and for at
least 22 months thereafter, a period sufficient to resolve most contested
elections and to support other activities related to the reconstruction and
investigation of a contested election; and
b. Protect against the failure of any data input or storage device at a location
controlled by the jurisdiction or its contractors, and against any attempt at
improper data entry or retrieval.

4.4 Audit Record Data

Audit trails are essential to ensure the integrity of a voting system. Operational
requirements for audit trails are described in Section 2.2.5.2 of the Standards. Audit
record data are generated by these procedures. The audit record data in the following
subsections are essential to the complete recording of election operations and
reporting of the vote tally. This list of audit records may not reflect the design
constructs of some systems. Therefore, vendors shall supplement it with information
relevant to the operation of their specific systems.

4.4.1 Pre-election Audit Records

During election definition and ballot preparation, the system shall audit the
preparation of the baseline ballot formats and modifications to them, a description of
these modifications, and corresponding dates. The log shall include:

4-8

Volume I – Section 4
Software Standards

a. The allowable number of selections for an office or issue;
b. The combinations of voting patterns permitted or required by the jurisdiction;
c. The inclusion or exclusion of offices or issues as the result of multiple
districting within the polling place;
d. Any other characteristics that may be peculiar to the jurisdiction, the election,
or the polling place's location;
e. Manual data maintained by election personnel;
f. Samples of all final ballot formats; and

g. Ballot preparation edit listings.

4.4.2 System Readiness Audit Records

The following minimum requirements apply to system readiness audit records:
a. Prior to the start of ballot counting, a system process shall verify hardware
and software status and generate a readiness audit record. This record shall
include the identification of the software release, the identification of the
election to be processed, and the results of software and hardware diagnostic
tests;
b. In the case of systems used at the polling place, the record shall include the
polling place's identification;
c. The ballot interpretation logic shall test and record the correct installation of
ballot formats on voting devices;
d. The software shall check and record the status of all data paths and memory
locations to be used in vote recording to protect against contamination of
voting data;
e. Upon the conclusion of the tests, the software shall provide evidence in the
audit record that the test data have been expunged;
f. If required and provided, the ballot reader and arithmetic-logic unit shall be
evaluated for accuracy, and the system shall record the results. It shall allow
the processing, or simulated processing, of sufficient test ballots to provide a
statistical estimate of processing accuracy; and

g. For systems that use a public network, provide a report of test ballots that
includes:
1) Number of ballots sent;
2) When each ballot was sent;
3) Machine from which each ballot was sent; and
4) Specific votes or selections contained in the ballot.

4.4.3 In-Process Audit Records

In-process audit records document system operations during diagnostic routines and
the casting and tallying of ballots. At a minimum, the in-process audit records shall
contain:
a. Machine generated error and exception messages to demonstrate successful
recovery. Examples include, but are not necessarily limited to:
1) The source and disposition of system interrupts resulting in entry into
exception handling routines;
2) All messages generated by exception handlers;
3) The identification code and number of occurrences for each hardware and
software error or failure;
4) Notification of system login or access errors, file access errors, and
physical violations of security as they occur, and a summary record of
these events after processing;
5) Other exception events such as power failures, failure of critical hardware
components, data transmission errors, or other type of operating anomaly;
b. Critical system status messages other than informational messages displayed
by the system during the course of normal operations. These items include,
but are not limited to:
1) Diagnostic and status messages upon startup;
2) The “zero totals” check conducted before opening the polling place or
counting a precinct centrally;
3) For paper-based systems, the initiation or termination of card reader and
communications equipment operation; and
4) For DRE machines at controlled voting locations, the event (and time, if
available) of activating and casting each ballot (i.e., each voter's
transaction as an event). This data can be compared with the public
counter for reconciliation purposes;
c. Non-critical status messages that are generated by the machine's data quality
monitor or by software and hardware condition monitors; and
d. System generated log of all normal process activity and system events that
require operator intervention, so that each operator access can be monitored
and access sequence can be constructed.
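
The mechanics of appending such records can be sketched as follows. This is a hypothetical illustration; a real system must also protect the log against alteration and loss, which this sketch does not attempt.

    import json
    import time

    def append_audit_record(log_path, event_type, detail):
        # Append one timestamped, machine-generated audit entry.
        record = {
            "time": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "type": event_type,   # e.g., exception, status, access
            "detail": detail,
        }
        with open(log_path, "a", encoding="utf-8") as log:
            log.write(json.dumps(record) + "\n")

    # Hypothetical usage: log the zero-totals check at poll opening.
    append_audit_record("audit.log", "status", "zero totals check passed")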

4-10

Volume I – Section 4
Software Standards

4.4.4 Vote Tally Data

In addition to the audit requirements described above, other election-related data is
essential for reporting results to interested parties, the press, and the voting public, and
is vital to verifying an accurate count.
Voting systems shall meet these reporting requirements by providing software capable
of obtaining data concerning various aspects of vote counting and producing reports
of them on a printer. At a minimum, vote tally data shall include:
a. Number of ballots cast, using each ballot configuration, by tabulator, by
precinct, and by political subdivision;
b. Candidate and measure vote totals for each contest, by tabulator;
c. The number of ballots read within each precinct and for additional
jurisdictional levels, by configuration, including separate totals for each party
in primary elections;
d. Separate accumulation of overvotes and undervotes for each contest, by
tabulator, precinct and for additional jurisdictional levels (no overvotes would
be indicated for DRE voting devices); and
e. For paper-based systems only, the total number of ballots both processed and
unprocessable; and if there are multiple card ballots, the total number of cards
read.
For systems that produce an electronic file containing vote tally data, the contents of
the file shall include the same minimum data cited above for printed vote tally reports.

4.5 Vote Secrecy (DRE Systems)

All DRE systems shall ensure vote secrecy by:
a. Immediately after the voter chooses to cast his or her ballot, recording the
voter’s selections in the memory to be used for vote counting and audit data
(including ballot images), and erasing the selections from the display,
memory, and all other storage, including all forms of temporary storage; and
b. Immediately after the voter chooses to cancel his or her ballot, erasing the
selections from the display and all other storage, including buffers and other
temporary storage.
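
The cast and cancel behaviors can be sketched as follows; this is an illustrative Python fragment with invented names, where the two stores stand in for the vote-counting and audit memories and no voter identity is recorded anywhere.

    class VoteSession:
        # Holds one voter's in-progress selections in temporary storage.

        def __init__(self, count_store, audit_store):
            self.count_store = count_store
            self.audit_store = audit_store
            self.selections = {}          # display/temporary storage

        def cast(self):
            # Record to counting memory and audit data (including ballot image)...
            self.count_store.append(dict(self.selections))
            self.audit_store.append(dict(self.selections))
            # ...then erase the display and all temporary storage.
            self.selections.clear()

        def cancel(self):
            # Erase everything; record nothing.
            self.selections.clear()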

Volume I, Section 5
Table of Contents

5 Telecommunications
5.1 Scope
5.1.1 Types of Components
5.1.2 Telecommunications Operations and Providers
5.1.3 Data Transmissions
5.2 Design, Construction, and Maintenance Requirements
5.2.1 Accuracy
5.2.2 Durability
5.2.3 Reliability
5.2.4 Maintainability
5.2.5 Availability
5.2.6 Integrity
5.2.7 Confirmation

5 Telecommunications

5.1 Scope

This section contains the performance, design, and maintenance characteristics of the
telecommunications components of voting systems and the acceptable levels of
performance against these characteristics. For the purpose of the Standards,
telecommunications is defined as the capability to transmit and receive data
electronically using hardware and software components over distances both within
and external to a polling place.
The requirements in this section represent acceptable levels of combined
telecommunications hardware and software function and performance for the
transmission of data that is used to operate the system and report election results.
Where applicable, this section specifies minimum values for critical performance and
functional attributes involving telecommunications hardware and software
components.
This section does not apply to other means of moving data, such as the physical
transport of data recorded on paper-based media, or the transport of physical devices,
such as memory cards, that store data in electronic form.
Voting systems may include network hardware and software to transfer data among
systems. Major network components are local area networks (LANs), wide area
networks (WANs), workstations (desktop computers), servers, data, and applications.
Workstations include voting stations, precinct tabulation systems, and voting
supervisory terminals. Servers include systems that provide registration forms and
ballots and accumulate and process voter registrations and cast ballots.
Desirable network characteristics include simplicity, flexibility (especially in routing,
to maintain good response times) and maintainability (including availability, provided
primarily through redundancy of resources and connections, particularly of
connections to public infrastructure).
A wide area network (WAN) public telecommunications component consists of the
hardware and software to transport information over shared, public (i.e., commercial
or governmental) circuitry, or among private systems. For voting systems, the

5-1

Volume I – Section 5
Telecommunications

telecommunications boundaries are defined as the transport circuitry, on one side of
which exists the public telecommunications infrastructure, outside the control of
voting system supervisors. On the other side of the transport circuitry are the local
area network (LAN) resources, workstations, servers, data and applications controlled
by voting system supervisors.
Local area network (LAN) components consist of the hardware and software
infrastructure used to transport information between users in a local environment,
typically a building or group of buildings. Typically a LAN connects workstations,
perhaps with a local server.
An application may be a single program or a group of programs that work together to
provide a function to an end user, who may be a voter or an election administrator.
Voter programs may include voter registration, balloting, and status checking.
Administrator programs may include ballot preparation, registration for preparation,
registration approval, ballot vetting, ballot processing, and election processing.
This Section is intended to complement the network security requirements found in
Volume I, Section 6, which include requirements for voter and administrator access,
availability of network service, data confidentiality, and data integrity. Most
importantly, security services will restrict access to local election system components
from public resources, and these services will also restrict access to voting system data
while it is in transit across public resources. (This is a corollary to voting supervisors
controlling local election systems and not assuming control over public resources.)

5.1.1 Types of Components

This section addresses telecommunications hardware and software across a broad
range of technologies including, but not limited to:
♦ Dial-up communications technologies:
   • Standard landline;
   • Wireless;
   • Microwave;
   • Very Small Aperture Terminal (VSAT);
   • Integrated Services Digital Network (ISDN); and
   • Digital Subscriber Line (DSL);
♦ High-speed telecommunications lines (public and private):
   • FT-1, T-1, T-3;
   • Frame Relay; and
   • Private line;
♦ Cabling technologies:
   • Unshielded Twisted Pair (UTP) cable (CAT 5 or higher);
   • Ethernet hub/switch; and
   • Wireless connections (Radio Frequency (RF) and Infrared);
♦ Communications routers;
♦ Modems, whether internal or external to personal computers, computer
servers, and other voting system components (whether installed at the polling
place or central count location);
♦ Modem drivers, dial-up networking software;
♦ Channel service units (CSU)/Data service units (DSU) (whether installed at
the polling place or central count location); and
♦ Dial-up networking applications software.

5.1.2 Telecommunications Operations and Providers

This section applies to voting-related transmissions over public networks, such as
those provided by regional telephone companies and long distance carriers. This
section also applies to private networks regardless of whether the network is owned
and operated by the election jurisdiction.
For systems that transmit official data over public networks, this Section applies to
telecommunications components installed and operated at settings supervised by
election officials, such as polling places or central offices. These standards apply to:
♦ Components acquired by the jurisdiction for the purpose of voting, including
components installed at the poll site or a central office (including central site
facilities operated by vendors or contractors); and
♦ Components acquired by others (such as school systems, libraries, military
installations and other public organizations) that are used at settings
supervised by election officials, including minimum configuration
components required by the vendor but that the vendor permits to be acquired
from third party sources not under the vendor’s control (e.g., router or modem
card manufacturer or supplier).


5.1.3 Data Transmissions

These requirements apply to the use of telecommunications to transmit data for the
preparation of the system for an election, the execution of an election, and the
preservation of the system data and audit trails during and following an election.
While this section does not assume a specific model of voting system operations and
does not assume a specific model for the use of telecommunications to support such
operations, it does address the following types of data, where applicable:
♦ Voter Authentication: Coded information that confirms the identity of a
voter for security purposes for a system that transmits votes individually over
a public network;
♦ Ballot Definition: Information that describes to a voting machine the content
and appearance of the ballots to be used in an election;
♦ Vote Transmission: For systems that transmit votes individually over a
public network, the transmission of a single vote within a network at a polling
place and to the county (or contractor) for consolidation with other county
vote data;
♦ Vote Count: Information representing the tabulation of votes at any level
within the control of the jurisdiction, such as the polling place, precinct, or
central count; and
♦ List of Voters: A listing of the individual voters who have cast ballots in a
specific election.
Additional data transmissions used to operate a voting system in the conduct of an
election, but not explicitly listed above, are also subject to the standards of this
section.
For systems that transmit data using public networks, this section applies to
telecommunications hardware and software for transmissions within and among all
combinations of senders and receivers indicated below:
♦ Polling places;
♦ Precinct count facilities; and
♦ Central count facilities (whether operated by the jurisdiction or a contractor).


5.2 Design, Construction, and Maintenance Requirements

Design, construction, and maintenance requirements for telecommunications represent
the operational capability of both system hardware and software. These capabilities
shall be considered basic to all data transmissions.

5.2.1 Accuracy

The telecommunications components of all voting systems shall meet the accuracy
requirements of Section 3.2.1.

5.2.2 Durability

The telecommunications components of all voting systems shall meet the durability
requirements of Section 3.4.2.

5.2.3 Reliability

The telecommunications components of all voting systems shall meet the reliability
requirements of Section 3.4.3.

5.2.4 Maintainability

The telecommunications components of all voting systems shall meet the
maintainability requirements of Section 3.4.4.

5.2.5 Availability

The telecommunications components of all voting systems shall meet the availability
requirements of Section 3.4.5.


5.2.6 Integrity

For WANs using public telecommunications, boundary definition and implementation
shall meet the following requirements.
a. Outside service providers and subscribers of such providers shall not be given
direct access or control of any resource inside the boundary;
b. Voting system administrators shall not require any type of control of
resources outside this boundary. Typically, an end point of a
telecommunications circuit will be a subscriber termination on a Data
Service Unit/Channel Service Unit (DSU/CSU) (though the precise
technology may vary, being such things as cable modems or routers).
Regardless of the technology used, the boundary point must ensure that
everything on one side is locally configured and controlled while everything
on the other side is controlled by an outside service provider; and
c. The system shall be designed and configured such that it is not vulnerable to a
single point of failure in the connection to the public network causing total
loss of voting capabilities at any polling place.

5.2.7 Confirmation

Confirmation occurs when the system notifies the user of the successful or
unsuccessful completion of the data transmission, where successful completion is
defined as accurate receipt of the transmitted data. To provide confirmation, the
telecommunications components of a voting system shall:
a. Notify the user of the successful or unsuccessful completion of the data
transmission; and
b. In the event of unsuccessful transmission, notify the user of the action to be
taken.
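
By way of illustration only, the following Python sketch shows one way a transmission
component might satisfy both confirmation requirements; the Standards do not prescribe
an implementation, and the transport functions and operator messages here are
hypothetical.

    # Illustrative sketch; 'send' and 'receive_ack' stand in for the
    # system's actual transport layer.
    import hashlib

    def transmit_with_confirmation(payload, send, receive_ack):
        digest = hashlib.sha256(payload).hexdigest()
        send(payload)
        ack = receive_ack()  # receiver echoes a digest of the data it accepted
        if ack == digest:
            # Requirement (a): notify the user of successful completion.
            return "Transmission completed successfully."
        # Requirement (b): on failure, notify the user of the action to take.
        return ("Transmission FAILED: received data did not verify. "
                "Retry the transmission or follow jurisdiction procedures "
                "for manual delivery.")

Here successful completion is equated with accurate receipt, as defined above, by
having the receiving station acknowledge a digest of the data it actually received.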


Volume I, Section 6
Table of Contents

6 Security Standards
6.1 Scope
6.1.1 System Components and Sources
6.1.2 Location and Control of Software and Hardware on Which it Operates
6.1.3 Elements of Security Outside Vendor Control
6.1.4 Organization of this Section
6.2 Access Control
6.2.1 Access Control Policy
6.2.1.1 General Access Control Policy
6.2.1.2 Individual Access Privileges
6.2.2 Access Control Measures
6.3 Physical Security Measures
6.3.1 Polling Place Security
6.3.2 Central Count Location Security
6.4 Software Security
6.4.1 Software and Firmware Installation
6.4.2 Protection Against Malicious Software
6.5 Telecommunications and Data Transmission
6.5.1 Access Control
6.5.2 Data Integrity
6.5.3 Data Interception Prevention
6.5.4 Protection Against External Threats
6.5.4.1 Identification of COTS Products
6.5.4.2 Use of Protective Software
6.5.4.3 Monitoring and Responding to External Threats
6.5.5 Shared Operating Environment
6.5.6 Access to Incomplete Election Returns and Interactive Queries
6.6 Security for Transmission of Official Data Over Public Communications Networks
6.6.1 General Security Requirements for Systems Transmitting Data Over Public Networks
6.6.2 Voting Process Security for Casting Individual Ballots over a Public Telecommunications Network
6.6.2.1 Documentation of Mandatory Security Activities
6.6.2.2 Capabilities to Operate During Interruption of Telecommunications Capabilities

6 Security Standards

6.1 Scope

This section describes essential security capabilities for a voting system,
encompassing the system’s hardware, software, communications, and documentation.
The Standards recognize that no predefined set of security standards will address and
defeat all conceivable or theoretical threats. However, the Standards articulate
requirements to achieve acceptable levels of integrity, reliability, and inviolability.
Ultimately, the objectives of the security standards for voting systems are:
♦ To establish and maintain controls that can ensure that accidents, inadvertent
mistakes, and errors are minimized;
♦ To protect the system from intentional manipulation and fraud, and from
malicious mischief;
♦ To identify fraudulent or erroneous changes to the system; and
♦ To protect secrecy in the voting process.
The Standards are intended to address a broad range of risks to the integrity of a
voting system. While it is not possible to identify all potential risks, the Standards
identify several types of risk that must be addressed by a voting system. These
include:
♦ Unauthorized changes to system capabilities for:
   • Defining ballot formats;
   • Casting and recording votes;
   • Calculating vote totals consistent with defined ballot formats; and
   • Reporting vote totals;
♦ Alteration of voting system audit trails;
♦ Changing, or preventing the recording of, a vote;
♦ Introducing data for a vote not cast by a registered voter;


♦ Changing calculated vote totals;
♦ Preventing access to vote data, including individual votes and vote totals, by
unauthorized individuals; and
♦ Preventing access to voter identification data and data for votes cast by the
voter such that an individual can determine the content of specific votes cast
by the voter.
This section describes specific capabilities that vendors shall integrate into a voting
system in order to address the risks listed above.

6.1.1 System Components and Sources

The requirements of this section apply to the broad range of hardware, software,
communications components, and documentation that comprises a voting system.
These requirements apply to components:
♦ Provided by the voting system vendor and the vendor’s suppliers;
♦ Furnished by an external provider (for example providers of personal
computers and commercial off-the-shelf (COTS) operating systems) where
the components are capable of being used during voting system operation;
and
♦ Developed by a voting jurisdiction.

6.1.2 Location and Control of Software and Hardware on Which it Operates

The requirements of this section apply to all software used in any manner to support
any voting-related activity, regardless of the ownership of the software or the
ownership and location of the hardware on which the software is installed or operated.
These requirements apply to software that operates on:
♦ Voting devices and vote counting devices installed at polling places under the
control or authority of the voting jurisdiction; and
♦ Ballot printers, vote counting devices, and other hardware typically installed
at central or precinct locations (including contractor facilities).
However, some requirements are applicable only in circumstances specified by this
section.


6.1.3 Elements of Security Outside Vendor Control

The requirements of this section apply to the capabilities of a voting system provided
by the vendor. The Standards recognize that effective security requires safeguards
beyond those provided by the vendor. Effective security demands diligent security
practices by the purchasing jurisdiction and the jurisdiction's representatives. These
practices include:
♦ Administrative and management controls for the voting system and election
management, including access controls;
♦ Internal security procedures;
♦ Adherence to, and enforcement of, operational procedures (e.g., effective
password management);
♦ Security of physical facilities; and
♦ Organizational responsibilities and personnel screening.
Because specific standards for these elements are not under the direct control of the
vendor, they will be addressed in forthcoming Operational Guidelines that address
best practices for jurisdictions conducting elections and managing the operation of
voting systems.

6.1.4 Organization of this Section

The standards presented in this section are organized as follows:
♦ Access Control: These standards address procedures and system
capabilities that limit or detect access to critical system components in order
to guard against loss of system integrity, availability, confidentiality, and
accountability.
♦ Equipment and Data Security: These standards address physical security
measures and procedures that prevent disruption of the voting process at the
poll site and corruption of voting data.
♦ Software Security: These standards address the installation of software,
including firmware, in the voting system and the protection against malicious
software.
♦ Telecommunications and Data Transmission: These standards address
security for the electronic transmission of data between system components or
locations over both private and public networks.


♦ Security for Transmission of Official Data Over Public Communications
Networks: These standards address security for systems that communicate
individual votes or vote totals over public communications networks.
It should be noted that computer-generated audit controls facilitate system security
and are an integral part of software capability. These audit requirements are presented
in Section 4.

6.2 Access Control

Access controls are procedures and system capabilities that detect or limit access to
system components in order to guard against loss of system integrity, availability,
confidentiality, and accountability. Access controls provide reasonable assurance that
system resources such as data files, application programs, and computer-related
facilities and equipment are protected against unauthorized operation, modification,
disclosure, loss, or impairment. Unauthorized operations include modification of
compiled or interpreted code, run-time alteration of flow control logic or of data, and
abstraction of raw or processed voting data in any form other than a standard output
report by an authorized operator.
Access controls may include physical controls, such as keeping computers in locked
rooms to limit physical access, and technical controls, such as security software
programs designed to prevent or detect unauthorized access to sensitive files. The
access controls contained in this section of the Standards are limited to those controls
required of system vendors. Access controls required of jurisdictions will be
addressed in future documents detailing operational guidelines for jurisdictions.

6.2.1 Access Control Policy

The vendor shall specify the general features and capabilities of the access control
policy recommended to provide effective voting system security.

6.2.1.1 General Access Control Policy

Although the jurisdiction in which the voting system is operated is responsible for
determining the access policies applying to each election, the vendor shall provide a
description of recommended policies for:
a. Software access controls;


b. Hardware access controls;
c. Communications;
d. Effective password management;
e. Protection abilities of a particular operating system;
f. General characteristics of supervisory access privileges;

g. Segregation of duties; and
h. Any additional relevant characteristics.

6.2.1.2 Individual Access Privileges

Voting system vendors shall:
a. Identify each person to whom access is granted, and the specific functions and
data to which each person holds authorized access;
b. Specify whether an individual’s authorization is limited to a specific time,
time interval, or phase of the voting or counting operations; and
c. Permit the voter to cast a ballot expeditiously, but preclude voter access to all
other aspects of the vote-counting processes.

6.2.2 Access Control Measures

Vendors shall provide a detailed description of all system access control measures
designed to permit authorized access to the system and prevent unauthorized access.
Examples of such measures include:
a. Use of data and user authorization;
b. Program unit ownership and other regional boundaries;
c. One-end or two-end port protection devices;
d. Security kernels;
e. Computer-generated password keys;
f. Special protocols;

g. Message encryption; and
h. Controlled access security.


Vendors also shall define and provide a detailed description of the methods used to
prevent unauthorized access to the access control capabilities of the system itself.
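
As an informal illustration of the individual access privileges described in Section
6.2.1.2 (not a required design), the Python sketch below authorizes each person only
for specific functions during a specific phase of operations; all names are
hypothetical.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Grant:
        person: str
        function: str   # e.g., "define_ballot", "report_totals"
        phase: str      # e.g., "pre_election", "election_day"

    # Each person is identified along with the functions and phases
    # for which access is authorized.
    GRANTS = {
        Grant("election_admin", "define_ballot", "pre_election"),
        Grant("chief_judge", "open_polls", "election_day"),
    }

    def is_authorized(person, function, phase):
        return Grant(person, function, phase) in GRANTS

    assert is_authorized("election_admin", "define_ballot", "pre_election")
    assert not is_authorized("election_admin", "define_ballot", "election_day")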

6.3 Physical Security Measures

A voting system’s sensitivity to disruption or corruption of data depends, in part, on
the physical location of equipment and data media, and on the establishment of secure
telecommunications among various locations. Most often, the disruption of voting and
vote counting results from a physical violation of one or more areas of the system
thought to be protected. Therefore, security procedures shall address physical threats
and the corresponding means to defeat them.

6.3.1 Polling Place Security

For polling place operations, vendors shall develop and provide detailed
documentation of measures to anticipate and counteract vandalism, civil disobedience,
and similar occurrences. The measures shall:
a. Allow the immediate detection of tampering with vote casting devices and
precinct ballot counters; and
b. Control physical access to a telecommunications link if such a link is used.

6.3.2 Central Count Location Security

Vendors shall develop and document in detail the measures to be taken in a central
counting environment. These measures shall include physical and procedural controls
related to the:
a. Handling of ballot boxes;
b. Preparing of ballots for counting;
c. Counting operations; and
d. Reporting data.


6.4 Software Security

Voting systems shall meet specific security requirements for the installation of
software and for protection against malicious software.

6.4.1 Software and Firmware Installation

The system shall meet the following requirements for installation of software,
including hardware with embedded firmware:
a. If software is resident in the system as firmware, the vendor shall require and
state in the system documentation that every device is to be retested to
validate each ROM prior to the start of elections operations;
b. To prevent alteration of executable code, no software shall be permanently
installed or resident in the system unless the system documentation states that
the jurisdiction must provide a secure physical and procedural environment
for the storage, handling, preparation, and transportation of the system
hardware;
c. The system bootstrap, monitor, and device-controller software may be
resident permanently as firmware, provided that this firmware has been shown
to be inaccessible to activation or control by any means other than by the
authorized initiation and execution of the vote-counting program, and its
associated exception handlers;
d. The election-specific programming may be installed and resident as firmware,
provided that such firmware is installed on a component (such as computer
chip) other than the component on which the operating system resides; and
e. After initiation of election day testing, no source code or compilers or
assemblers shall be resident or accessible.
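
As an illustration of the retest called for in item (a) (one possible approach, not a
mandated one), a device could validate resident firmware by comparing a digest of the
ROM image against the digest recorded for the qualified version; the file path and
digest below are hypothetical.

    import hashlib

    def validate_rom(image_path, expected_sha256):
        # True only if the resident firmware matches the qualified image.
        with open(image_path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest() == expected_sha256

    # Performed for every device prior to the start of election operations:
    # if not validate_rom("/firmware/device.rom", QUALIFIED_DIGEST):
    #     raise SystemExit("Firmware does not match the qualified version.")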

6.4.2 Protection Against Malicious Software

Voting systems shall deploy protection against the many forms of threats to which
they may be exposed such as file and macro viruses, worms, Trojan horses, and logic
bombs. Vendors shall develop and document the procedures to be followed to ensure
that such protection is maintained in a current status.


6.5 Telecommunications and Data Transmission

There are four areas that must be addressed by telecommunications and data
transmission security capabilities:
♦ Access control for telecommunications capabilities;
♦ Data integrity;
♦ Detection and prevention of data interception; and
♦ Protection against external threats to which commercial products used by a
voting system may be susceptible.

6.5.1 Access Control

Voting systems that use telecommunications to communicate between system
components and locations are subject to the same security requirements governing
access to any other system hardware, software, and data function.

6.5.2 Data Integrity

Voting systems that use electrical or optical transmission of data shall ensure that
the receipt of valid vote records is verified at the receiving station. This should include
standard transmission error detection and correction methods such as checksums or
message digest hashes. Verification of correct transmission shall occur at the voting
system application level and ensure that the correct data is recorded on all relevant
components consolidated within the polling place prior to the voter completing
casting of his or her ballot.
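
The following Python sketch illustrates one form of the application-level verification
described above: a message digest accompanies each vote record, and the receiving
station recomputes it before accepting the record. This is an example only; the record
contents are hypothetical.

    import hashlib

    def make_record(vote_data):
        return vote_data, hashlib.sha256(vote_data).hexdigest()

    def verify_record(vote_data, digest):
        # Application-level check at the receiving station.
        return hashlib.sha256(vote_data).hexdigest() == digest

    record, digest = make_record(b"precinct-7 ballot image ...")
    assert verify_record(record, digest)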

6.5.3 Data Interception Prevention

Voting systems that use telecommunications as defined in Section 5 to communicate
between system components and locations before the poll site is officially closed
shall:
a. Implement an encryption standard currently documented and validated for use
by an agency of the U.S. Federal Government; and


b. Provide a means to detect the presence of an intrusive process, such as an
Intrusion Detection System.
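
For illustration, the sketch below encrypts data in transit with AES (FIPS 197), one
example of an encryption standard documented and validated for U.S. federal use. It
assumes the third-party Python "cryptography" package; key provisioning, key
management, and the intrusion detection required by item (b) are outside the scope of
the sketch.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # provisioned before polls open
    aead = AESGCM(key)

    def protect(plaintext):
        nonce = os.urandom(12)                 # unique per transmission
        return nonce + aead.encrypt(nonce, plaintext, None)

    def unprotect(blob):
        return aead.decrypt(blob[:12], blob[12:], None)

    assert unprotect(protect(b"results packet")) == b"results packet"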

6.5.4 Protection Against External Threats

Voting systems that use public telecommunications networks shall implement
protections against external threats to which commercial products used in the system
may be susceptible.

6.5.4.1 Identification of COTS Products

Voting systems that use public telecommunications networks shall provide system
documentation that clearly identifies all COTS hardware and software products and
communications services used in the development and/or operation of the voting
system, including:
a. Operating systems;
b. Communications routers;
c. Modem drivers; and
d. Dial-up networking software.
Such documentation shall identify the name, vendor, and version used for each such
component.

6.5.4.2 Use of Protective Software

Voting systems that use public telecommunications networks shall use protective
software at the receiving-end of all communications paths to:
a. Detect the presence of a threat in a transmission;
b. Remove the threat from infected files/data;
c. Prevent storage of the threat anywhere on the receiving device;
d. Provide the capability to confirm that no threats are stored in system memory
and in connected storage media; and
e. Provide data to the system audit log indicating the detection of a threat and
the processing performed.


Vendors shall use multiple forms of protective software as needed to provide
capabilities for the full range of products used by the voting system.
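
Purely as an illustration (a production system would rely on commercial protective
software), the required behaviors can be seen in miniature in the sketch below: detect
a threat in a transmission, prevent its storage on the receiving device, and record the
detection in the audit log. The signature set is hypothetical.

    KNOWN_BAD_SIGNATURES = {b"MALWARE-TEST-MARKER"}

    def screen_transmission(data, audit_log):
        for sig in KNOWN_BAD_SIGNATURES:
            if sig in data:
                # Requirement (e): record the detection and action taken.
                audit_log.append("threat detected in transmission; discarded")
                return None   # requirement (c): never stored on the device
        return data           # clean data passed on for processing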

6.5.4.3 Monitoring and Responding to External Threats

Voting systems that use public telecommunications networks may become vulnerable,
by virtue of their system components, to external threats to the accuracy and integrity
of vote recording, vote counting, and vote consolidation and reporting processes.
Therefore, vendors of such systems shall document how they plan to monitor and
respond to known threats to which their voting systems are vulnerable. This
documentation shall provide a detailed description, including scheduling information,
of the procedures the vendor will use to:
a. Monitor threats, such as through the review of assessments, advisories, and
alerts for COTS components issued by the Computer Emergency Response
Team (CERT), for which a current listing can be found at http://www.cert.org,
the National Infrastructure Protection Center (NIPC), for which a current
listing can be found at http://www.nipc.gov/warnings/warnings.htm, and the
Federal Computer Incident Response Capability (FedCIRC), for which
additional information can be found at http://www.fedcirc.gov/;
b. Evaluate the threats and, if any, proposed responses;
c. Develop responsive updates to the system and/or corrective procedures;
d. Submit the proposed response to the ITAs and appropriate states for approval,
identifying the exact changes and whether or not they are temporary or
permanent;
e. After implementation of the proposed response is approved by the state, assist
clients, either directly or through detailed written procedures, to update
their systems and/or to implement the corrective procedures no later than one
month before an election; and
f. Address threats emerging too late to correct the system at least one month
before the election, including:
1) Providing prompt, emergency notification to the ITAs and the affected
states and user jurisdictions;
2) Assisting client jurisdictions directly, or advising them through detailed
written procedures, to disable the public telecommunications mode of the
system; and
3) After the election, modifying the system to address the threat; submitting
the modified system to an ITA and appropriate state certification authority
for approval, and assisting client jurisdictions directly, or advising them
through detailed written procedures, to update their systems and/or to
implement the corrective procedures after approval.


6.5.5 Shared Operating Environment

Ballot recording and vote counting can be performed in either a dedicated or
non-dedicated environment. If ballot recording and vote counting operations are performed
in an environment that is shared with other data processing functions, both hardware
and software features shall be present to protect the integrity of vote counting and of
vote data. Systems that use a shared operating environment shall:
a. Use security procedures and logging records to control access to system
functions;
b. Partition or compartmentalize voting system functions from other concurrent
functions at least logically, and preferably physically as well;
c. Control system access by means of passwords, and restrict account
access to necessary functions only; and
d. Have capabilities in place to control the flow of information, precluding data
leakage through shared system resources.

6.5.6 Access to Incomplete Election Returns and Interactive Queries

If the voting system provides access to incomplete election returns and interactive
inquiries before the completion of the official count, the system shall:
a. For equipment that operates in a central counting environment, be designed
to provide external access to incomplete election returns only if that access
for these purposes is authorized by the statutes and regulations of the using
agency. This requirement applies as well to polling place equipment that
contains a removable memory module, or that may be removed in its entirety
to a central place for the consolidation of polling place returns.
b. Use voting system software and its security environment designed such that
data accessible to interactive queries resides in an external file, or database,
that is created and maintained by the elections software under the restrictions
applying to any other output report, namely, that:
1) The output file or database has no provision for write-access back to the
system.
2) Persons whose only authorized access is to the file or database are
denied write-access, both to the file or database, and to the system.


6.6 Security for Transmission of Official Data Over Public Communications Networks

DRE systems that transmit data over public telecommunications networks face
security risks that are not present in other DRE systems. This section describes
standards applicable to DRE systems that use public telecommunications networks.

6.6.1 General Security Requirements for Systems Transmitting Data Over Public Networks

All systems that transmit data over public telecommunications networks shall:
a. Preserve the secrecy of a voter’s ballot choices, and prevent anyone from
violating ballot privacy;
b. Employ digital signatures for all communications between the vote server and
other devices that communicate with the server over the network; and
c. Require that at least two authorized election officials activate any critical
operation regarding the processing of ballots transmitted over a public
communications network; i.e., the passwords or cryptographic keys
of at least two employees are required to perform processing of votes.
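
The sign-and-verify pattern required by item (b) is illustrated below using the
third-party Python "cryptography" package; the algorithm shown (Ed25519) and all names
are illustrative, not mandated by the Standards, and the two-person control of item (c)
is not shown.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    server_key = Ed25519PrivateKey.generate()   # held by the vote server
    server_pub = server_key.public_key()        # distributed to devices

    message = b"consolidated totals, precinct 12"
    signature = server_key.sign(message)

    try:
        server_pub.verify(signature, message)   # raises if altered in transit
    except InvalidSignature:
        raise SystemExit("Message rejected: signature did not verify.")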

6.6.2 Voting Process Security for Casting Individual Ballots over a Public Telecommunications Network

Systems designed for the transmission of individual ballots over public networks shall
meet security standards that address the security risks attendant with the casting of
ballots from poll sites controlled by election officials, using voting devices configured
and installed by election officials and/or their vendor or contractor, and using
in-person authentication of individual voters.

6.6.2.1 Documentation of Mandatory Security Activities

Vendors of systems that cast individual ballots over a public telecommunications
network shall provide detailed descriptions of:
a. All activities mandatory to ensuring effective system security to be performed
in setting up the system for operation, including testing of security before an
election; and


b. All activities that should be prohibited during system setup and during the
time frame for voting operations, including both the hours when polls are
open and when polls are closed.

6.6.2.2 Capabilities to Operate During Interruption of Telecommunications Capabilities

These systems shall provide the following capabilities to provide resistance to
interruptions of telecommunications service that prevent voting devices at the poll site
from communicating with external components via telecommunications:
a. Detect the occurrence of a telecommunications interruption at the poll site and
switch to an alternative mode of operation that is not dependent on the
connection between poll site voting devices and external system components;
b. Provide an alternate mode of operation that includes the functionality of a
conventional DRE machine without losing any single vote;
c. Create and preserve an audit trail of every vote cast during the period of
interrupted communication and system operation in conventional DRE system
mode;
d. Upon reestablishment of communications, transmit and process votes
accumulated while operating in conventional DRE system mode with all
security safeguards in effect; and
e. Ensure that all safeguards related to voter identification and authentication are
not affected by the procedures employed by the system to counteract potential
interruptions of telecommunications capabilities.
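
The control flow these capabilities describe is sketched below for illustration only:
detect an interruption, fall back to a conventional DRE mode that preserves every vote
with an audit trail, and retransmit with safeguards intact once service is restored.
All names are hypothetical.

    stored_votes, audit_log = [], []

    def cast_vote(vote, link_up, transmit):
        if link_up:
            transmit(vote)                  # normal networked operation
        else:
            stored_votes.append(vote)       # conventional DRE mode
            audit_log.append("vote preserved during interruption")

    def on_link_restored(transmit):
        while stored_votes:
            transmit(stored_votes.pop(0))   # same security safeguards apply
            audit_log.append("stored vote transmitted after restoration")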


Volume I, Section 7
Table of Contents

7 Quality Assurance
7.1 Scope
7.2 General Requirements
7.3 Components from Third Parties
7.4 Responsibility for Tests
7.5 Parts & Materials Special Tests and Examinations
7.6 Quality Conformance Inspections
7.7 Documentation


7 Quality Assurance

7.1 Scope

Quality Assurance provides continuous confirmation that a voting system conforms
to the Standards and to the requirements of state and local jurisdictions. Quality
Assurance is a vendor function with associated practices that is initiated prior to
system development and continues throughout the maintenance life cycle of the
voting system. Quality Assurance focuses on building quality into a system and
reducing dependence on system tests at the end of the life cycle to detect deficiencies,
thus helping ensure the system:
♦ Meets stated requirements and objectives;
♦ Adheres to established standards and conventions;
♦ Functions consistent with related components and meets dependencies for use
within the jurisdiction; and
♦ Reflects all changes approved during its initial development, internal testing,
qualification, and, if applicable, additional certification processes.

7.2 General Requirements

The voting system vendor is responsible for designing and implementing a quality
assurance program to ensure that the design, workmanship, and performance
requirements of this standard are achieved in all delivered systems and components.
At a minimum, this program shall:
a. Include procedures for specifying, procuring, inspecting, accepting, and
controlling parts and raw materials of the requisite quality;
b. Require the documentation of the hardware and software development
process;
c. Identify and enforce all requirements for:


1) In-process inspection and testing that the manufacturer deems necessary
to ensure proper fabrication and assembly of hardware, and
2) Installation and operation of software (including firmware).
d. Include plans and procedures for post-production environmental screening
and acceptance test; and
e. Include a procedure for maintaining all data and records required to document
and verify the quality inspections and tests.

7.3 Components from Third Parties

A vendor who does not manufacture all the components of its voting system, but
instead procures components as standard commercial items for assembly and
integration into a voting system, should verify that the suppliers follow
documented quality assurance procedures that are at least as stringent as those used
internally by the voting system vendor.

7.4 Responsibility for Tests

The manufacturer or vendor shall be responsible for:
a. Performing all quality assurance tests;
b. Acquiring and documenting test data; and
c. Providing test reports for review by the ITA, and to the purchaser upon
request.

7.5 Parts & Materials Special Tests and Examinations

In order to ensure that voting system parts and materials function properly, vendors
shall:
a. Select parts and materials to be used in voting systems and components
according to their suitability for the intended application. Suitability may be
determined by similarity of this application to existing standard practice, or by
means of special tests;


b. Design special tests, if needed, to evaluate the part or material under
conditions accurately simulating the actual operating environment; and
c. Maintain the resulting test data as part of the quality assurance program
documentation.

7.6 Quality Conformance Inspections

The vendor performs conformance inspections to ensure the overall quality of the
voting system and components delivered to the ITA for testing and to the jurisdiction
for implementation. To meet the conformance inspection requirements the vendor or
manufacturer shall:
a. Inspect and test each voting system or component to verify that it meets all
inspection and test requirements for the system; and
b. Deliver a record of tests, or a certificate of satisfactory completion, with each
system or component.

7.7 Documentation

Vendors are required to produce documentation to support the development and
formal testing of voting systems. To meet documentation requirements, vendors shall
provide complete product documentation with each voting system or component, as
described in Volume II, Section 2 for the TDP. This documentation shall:
a. Be sufficient to serve the needs of the ITA, voters, election officials, and
maintenance technicians;
b. Be prepared and published in accordance with standard industrial practice for
information technology and electronic and mechanical equipment; and
c. Consist, at a minimum, of the following:
1) System overview;
2) System functionality description;
3) System hardware specification;
4) Software design and specifications;
5) System security specification;
6) System test and verification specification;
7) System operations procedures;


8) System maintenance procedures;
9) Personnel deployment and training requirements;
10) Configuration management plan;
11) Quality assurance program; and
12) System Change Notes.


Volume I, Section 8
Table of Contents

8 Configuration Management
8.1 Scope
8.1.1 Configuration Management Requirements
8.1.2 Organization of Configuration Management Standards
8.1.3 Application of Configuration Management Requirements
8.2 Configuration Management Policy
8.3 Configuration Identification
8.3.1 Structuring and Naming Configuration Items
8.3.2 Versioning Conventions
8.4 Baseline, Promotion, and Demotion Procedures
8.5 Configuration Control Procedures
8.6 Release Process
8.7 Configuration Audits
8.7.1 Physical Configuration Audit
8.7.2 Functional Configuration Audit
8.8 Configuration Management Resources


8 Configuration Management

8.1 Scope

This section contains specific requirements for configuration management of voting
systems. For the purpose of the Standards, configuration management is defined as a
set of activities and associated practices that ensures full knowledge and control of the
components of a system, starting with its initial development and progressing through
its ongoing maintenance and enhancement. This section describes activities in terms
of their purposes and outcomes. It does not describe specific procedures or steps to be
employed to accomplish them. Specific steps and procedures are left to the vendor to
select.
Vendors are required to submit these procedures to the Independent Test Authority
(ITA) as part of the Technical Data Package (TDP) for system qualifications described
in Volume II, Voting Systems Qualification Testing Standards, for review against the
requirements of this section. Additionally, state or local election legislation,
regulations, or contractual agreements may require the vendor to conform to
additional standards for configuration management or to adopt specific required
procedures. Further, authorized election officials or their representatives reserve the
right to inspect vendor facilities and operations to determine conformance with the
vendor’s reported procedures and with any additional requirements.

8.1.1 Configuration Management Requirements

Configuration management addresses a broad set of record keeping, audit, and
reporting activities that contribute to full knowledge and control of a system and its
components. These activities include:
♦ Identifying discrete system components;
♦ Creating records of a formal baseline and later versions of components;
♦ Controlling changes made to the system and its components;
♦ Releasing new versions of the system to ITAs;


♦ Releasing new versions of the system to customers;
♦ Auditing the system, including its documentation, against configuration
management records;
♦ Controlling interfaces to other systems; and
♦ Identifying tools used to build and maintain the system.

8.1.2 Organization of Configuration Management Standards

The standards for configuration management presented in this section include:
♦ Application of configuration management requirements;
♦ Configuration management policy;
♦ Configuration identification;
♦ Baseline, promotion, and demotion procedures;
♦ Configuration control procedures;
♦ Release process;
♦ Configuration audits; and
♦ Configuration management resources.

8.1.3 Application of Configuration Management Requirements

Requirements for configuration management apply to all voting systems subject to
the Standards, regardless of the specific technologies employed. These requirements
apply to the following system components:
a. Software components;
b. Hardware components;
c. Communications components;
d. Documentation;
e. Identification and naming conventions (including changes to these
conventions) for software programs and data files;


f. Development and testing artifacts such as test data and scripts; and

g. File archiving and data repositories.

8.2 Configuration Management Policy

The vendor shall describe its policies for configuration management in the TDP. This
description shall address the following elements:
a. Scope and nature of configuration management program activities; and
b. Breadth of application of the vendor’s policies and practices to the voting
system (i.e., extent to which policies and practices apply to the total system,
and extent to which policies and practices of suppliers apply to particular
components, subsystems, or other defined system elements).

8.3 Configuration Identification

Configuration identification is the process of identifying, naming, and acquiring
configuration items. Configuration identification encompasses all system components.

8.3.1 Structuring and Naming Configuration Items

The vendor shall describe the procedures and conventions used to:
a. Classify configuration items into categories and subcategories;
b. Uniquely number or otherwise identify configuration items; and
c. Name configuration items.

8.3.2 Versioning Conventions

Where versions of system components are used to identify higher-level system
elements, the vendor shall describe the conventions used to:


a. Identify the specific versions of individual configuration items and sets of
items that are used by the vendor to identify higher level system elements
such as subsystems;
b. Uniquely number or otherwise identify versions; and
c. Name versions.

8.4 Baseline, Promotion, and Demotion Procedures

The vendor shall establish formal procedures and conventions, and shall provide a
complete description of those procedures and related conventions used to:
a. Establish a particular instance of a component as the starting baseline;
b. Promote subsequent instances of a component to baseline status as
development progresses through to completion of the initial completed
version released to the ITAs for qualification testing; and
c. Promote subsequent instances of a component to baseline status as the
component is maintained throughout its life cycle until system retirement (i.e.,
the system is no longer sold or maintained by the vendor).

8.5 Configuration Control Procedures

Configuration control is the process of approving and implementing changes to a
configuration item to prevent unauthorized additions, changes, or deletions. The
vendor shall establish such procedures and related conventions, providing a complete
description of those procedures used to:
a. Develop and maintain internally developed items;
b. Acquire and maintain third-party items;
c. Resolve internally identified defects for items regardless of their origin; and
d. Resolve externally identified and reported defects (i.e., by customers and
ITAs).


8.6 Release Process

The release process is the means by which the vendor installs, transfers, or migrates
the system to the ITAs and, eventually, to its customers. The vendor shall establish
such procedures and related conventions, providing a complete description of those
used to:
a. Perform a first release of the system to an ITA;
b. Perform a subsequent maintenance or upgrade release of the system, or a
particular component, to an ITA;
c. Perform the initial delivery and installation of the system to a customer,
including confirmation that the installed version of the system matches
exactly the qualified system version; and
d. Perform a subsequent maintenance or upgrade release of the system, or a
particular component, to a customer, including confirmation that the installed
version of the system matches exactly the qualified system version.

8.7 Configuration Audits

The Standards require two types of configuration audits: Physical Configuration
Audits (PCA) and Functional Configuration Audits (FCA).

8.7.1 Physical Configuration Audit

The PCA is conducted by the ITA to compare the voting system components
submitted for qualification to the vendor’s technical documentation. For the PCA, a
vendor shall provide:
a. Identification of all items that are to be a part of the software release;
b. Specification of compiler (or choice of compilers) to be used to generate
executable programs;
c. Identification of all hardware that interfaces with the software;
d. Configuration baseline data for all hardware that is unique to the system;
e. Copies of all software documentation intended for distribution to users,
including program listings, specifications, operations manual, voter manual,
and maintenance manual;


f. User acceptance test procedures and acceptance criteria;

g. Identification of any changes between the physical configuration of the
system submitted for the PCA and that submitted for the FCA, with a
certification that any differences do not degrade the functional characteristics;
and
h. Complete descriptions of its procedures and related conventions used to
support this audit by:
1) Establishing a configuration baseline of the software and hardware to be tested; and
2) Confirming whether the system documentation matches the corresponding system components.
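
As one way (among many) to support the baseline confirmation in item (h), a digest
manifest recorded when the configuration baseline is established allows the audit to
check that submitted components match the documented configuration; the sketch below
is illustrative and the file names hypothetical.

    import hashlib

    def file_digest(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def build_manifest(paths):
        # Recorded when the configuration baseline is established.
        return {path: file_digest(path) for path in paths}

    def audit_against_manifest(manifest):
        # Items whose current digest differs from the recorded baseline.
        return [p for p, d in manifest.items() if file_digest(p) != d]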

8.7.2 Functional Configuration Audit

The FCA is conducted by the ITA to verify that the system performs all the functions
described in the system documentation. The vendor shall:
a. Completely describe its procedures and related conventions used to support
this audit for all system components;
b. Provide the following information to support this audit:
1) Copies of all procedures used for module or unit testing, integration testing, and system testing;
2) Copies of all test cases generated for each module and integration test, and sample ballot formats or other test cases used for system tests; and
3) Records of all tests performed by the procedures listed above, including error corrections and retests.

In addition to such audits performed by ITAs during the system qualification process,
elements of this audit may also be performed by state election organizations during
the system certification process, and individual jurisdictions during system acceptance
testing.

8.8 Configuration Management Resources

Often, configuration management activities are performed with the aid of automated
tools. Assuring that such tools are available throughout the system life cycle,
including if the vendor is acquired by or merged with another organization, is critical
to effective configuration management. Vendors may choose the specific tools they


use to perform the record keeping, audit, and reporting activities of the configuration
management standards. The resources documentation standard provided below focuses
on assuring that procedures are in place to record information about the tools to help
ensure that they, and the data they contain, can be transferred effectively and promptly
to a third party should the need arise. Within this context, a vendor is required to
develop and provide a complete description of the procedures and related practices for
maintaining information about:
a. Specific tools used, current version, and operating environment;
b. Physical location of the tools, including designation of computer directories
and files; and
c. Procedures and training materials for using the tools.


Volume I, Section 9
Table of Contents

9 Overview of Qualification Tests
9.1 Scope
9.2 Documentation Submitted by Vendor
9.3 Voting Equipment Submitted by Vendor
9.4 Testing Scope
9.4.1 Test Categories
9.4.1.1 Focus of Functionality Tests
9.4.1.2 Focus of Hardware Tests
9.4.1.3 Focus of Software Evaluation
9.4.1.4 Focus of System-Level Integration Tests
9.4.1.5 Focus of Vendor Documentation Examination
9.4.2 Sequence of Tests and Audits
9.5 Test Applicability
9.5.1 General Applicability
9.5.1.1 Hardware
9.5.1.2 Software
9.5.2 Modifications to Qualified Systems
9.5.2.1 General Requirements for Modifications
9.5.2.2 Basis for Limited Testing Determinations
9.6 Qualification Test Process
9.6.1 Pre-test Activities
9.6.1.1 Initiation of Testing
9.6.1.2 Pre-test Preparation
9.6.2 Qualification Testing
9.6.2.1 Qualification Test Plan
9.6.2.2 Qualification Test Conditions
9.6.2.3 Qualification Test Fixtures
9.6.2.4 Witness of System Build and Installation
9.6.2.5 Qualification Test Data Requirements
9.6.2.6 Qualification Test Practices
9.6.3 Qualification Report Issuance and Post-test Activities
9.6.4 Resolution of Testing Issues

9 Overview of Qualification Tests

9.1 Scope

This section provides an overview of the qualification testing process for voting
systems. Qualification testing is the process by which a voting system is shown
to comply with the requirements of the Standards and the requirements of its own
design and performance specifications.
Qualification testing encompasses the examination of software; tests of hardware
under conditions simulating the intended storage, operation, transportation, and
maintenance environments; the inspection and evaluation of system documentation;
and operational tests to validate system performance and function under normal and
abnormal conditions. The testing also evaluates the completeness of the vendor's
developmental test program, including the sufficiency of vendor tests conducted to
demonstrate compliance with stated system design and performance specifications,
and the vendor’s documented quality assurance and configuration management
practices. The tests address individual system components or elements, as well as the
integrated system as a whole. Since 1994, qualification tests for voting systems have
been performed by Independent Test Authorities (ITAs) certified by the National
Association of State Election Directors (NASED). NASED has certified an ITA for
either the full scope of qualification testing or a distinct subset of the total scope of
testing. The test process described in this section may be conducted by one or more
ITAs, depending on the nature of tests to be conducted and the expertise of the
certified ITAs.
Qualification testing is distinct from all other forms of testing, including
developmental testing by the vendor, certification testing by a state election
organization, and system acceptance testing by a purchasing jurisdiction:
♦ Qualification testing follows the vendor’s developmental testing;
♦ Qualification testing provides an assurance to state election officials and local
jurisdictions of the conformance of a voting system to the Standards as input
to state certification of a voting system and acceptance testing by a purchasing
jurisdiction; and
♦ Qualification testing may precede state certification testing, or may be
conducted in parallel as established by the certification program of individual
states.


Generally, a voting system remains qualified under the standards against which it was
tested, as long as all modifications made to the system are evaluated and passed by a
certified ITA. The qualification test report remains valid for as long as the voting
system remains unchanged from the last tested configuration. However, if a new threat
to a particular voting system is discovered, it is the prerogative of NASED to
determine which qualified voting systems are vulnerable, whether those systems need
to be retested, and the specific tests to be conducted. In addition, when new standards
supersede the standards under which the system was qualified, it is the prerogative of
NASED to determine when systems that were qualified under the earlier standards
will lose their qualification, unless they are tested to meet current standards.
The remainder of this section describes the documentation and equipment required to
be submitted by the vendor, the scope of qualification testing, the applicability to
voting system components, and the flow of the test process.

9.2 Documentation Submitted by Vendor

The vendor shall submit to the ITA documentation necessary for the identification of
the full system configuration submitted for evaluation and for the development of an
appropriate test plan by the ITA for system qualification testing.
One element of the documentation is the Technical Data Package (TDP). The TDP
contains information that defines the voting system design, method of operation,
and related resources. It provides a system overview and documents the system’s
functionality, hardware, software, security, test and verification specifications,
operations procedures, maintenance procedures, and personnel deployment and
training requirements. It also documents the vendor’s configuration management plan
and quality assurance program. If the system was previously qualified, the TDP also
includes the system change notes.
This documentation is used by the ITA in constructing the qualification testing plan
and is particularly important in constructing plans for the re-testing of systems that
have been qualified previously. Re-testing of systems submitted by vendors that
consistently adhere to particularly strong and well documented quality assurance and
configuration management practices will generally be more efficient than for systems
developed and maintained using less rigorous or less well documented practices.
Volume II provides a detailed description of the documentation required for the
vendor’s quality assurance and configuration management practices used for the
system submitted for qualification testing.


9.3 Voting Equipment Submitted by Vendor

Vendors may seek to market a complete voting system or an interoperable component
of a voting system. Nevertheless, vendors shall submit for testing the specific system
configuration that is to be offered to jurisdictions or that comprises the component to
be marketed plus the other components with which the vendor recommends that
component be used. The system submitted for testing shall meet the following
requirements:
a. The hardware submitted for qualification testing shall be equivalent, in form
and function, to the actual production versions of the hardware units or the
COTS hardware specified for use in the TDP;
b. The software submitted for qualification testing shall be the exact software
that will be used in production units;
c. Engineering or developmental prototypes are not acceptable, unless the
vendor can show that the equipment to be tested is equivalent to standard
production units in both performance and construction; and
d. Benchmark directory listings shall be submitted for all software/firmware
elements (and associated documentation) included in the vendor’s release as
they would normally be installed upon setup and installation.
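
The benchmark listing requirement in item d lends itself to automation. The following is a minimal sketch, assuming the Python standard library and a SHA-256 digest as the file fingerprint (the Standards do not prescribe a particular fingerprinting method or listing format):

    import hashlib
    import os

    def benchmark_listing(root):
        """Walk an installed release tree and record, for every file, its
        path relative to the root, its size in bytes, and a SHA-256 digest,
        so a later installation can be compared against the benchmark."""
        listing = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in sorted(filenames):
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
                listing.append((os.path.relpath(path, root),
                                os.path.getsize(path), digest))
        return sorted(listing)

A fresh installation whose listing differs from the submitted benchmark would then warrant investigation before testing proceeds.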

9.4 Testing Scope

The qualification test process is intended to discover vulnerabilities that, should they
appear in actual election use, could result in failure to complete election operations in
a satisfactory manner.
Four focuses guide the overall qualification testing process (the headline
accuracy arithmetic is illustrated in the sketch following this list):
♦ Operational accuracy in the recording and processing of voting data, as
measured by the target error rate, for which the maximum acceptable error rate
is no more than one in ten million ballot positions, with a maximum acceptable
error rate in the test process of one in 500,000 ballot positions (an error
rate of zero would be desirable, but proving it by test would require a test of
infinite duration);
♦ Operational failures or the number of unrecoverable failures under conditions
simulating the intended storage, operation, transportation, and maintenance
environments for voting systems, using an actual time-based period of
processing test ballots;
♦ System performance and function under normal and abnormal conditions; and


♦ Completeness and accuracy of the system documentation and configuration
management records to enable purchasing jurisdictions to effectively install,
test, and operate the system.
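
The two accuracy figures quoted in the first bullet imply simple pass/fail arithmetic for a test run. The sketch below is illustrative only; Volume II defines the statistical test that actually governs:

    # Rates quoted above: system target of one error in ten million ballot
    # positions; at most one error in 500,000 positions during testing.
    TARGET_RATE = 1 / 10_000_000
    TEST_RATE = 1 / 500_000

    def passes_naive_check(positions_read, errors_observed):
        """Naive check: the observed error rate in a test run must not
        exceed the one-in-500,000 test-process threshold."""
        return errors_observed / positions_read <= TEST_RATE

    # A run of 1,500,000 positions with zero errors passes; the same run
    # with four errors fails (4/1,500,000 > 1/500,000). Demonstrating a
    # true rate of zero would require an unbounded test, hence the
    # nonzero threshold.
    print(passes_naive_check(1_500_000, 0))   # True
    print(passes_naive_check(1_500_000, 4))   # False
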
Qualification testing complements and evaluates the vendor's developmental testing,
including any beta testing. The ITA evaluates the completeness of the vendor's
developmental test program, including the sufficiency of vendor tests conducted to
demonstrate compliance with the Standards as well as the system’s performance
specifications. The ITA undertakes sample testing of the vendor's test modules and
also designs independent system-level tests to supplement and check those designed
by the vendor. Although some of the qualification tests are based on those prescribed
in the Military Standards, in most cases the test conditions are less stringent, reflecting
commercial, rather than military, practice. The ITA may use automated software
testing tools to assist in this process if they are available for the software under
examination.
The procedure for disposition of system deficiencies discovered during qualification
testing is described in Volume II of the Standards. This procedure recognizes that
some but not necessarily all operational malfunctions (apart from software logic
defects) may result in rejection. Basically, any defect that results in or may result in
the loss or corruption of voting data, whether through failure of system hardware,
software, or communication, through procedural deficiency, or through deficiencies in
security and audit provisions, shall be cause for rejection. Otherwise, malfunctions
that result from failure to comply fully with other requirements of this standard will
not in every case warrant rejection. Specific failure definition and scoring criteria are
also contained in Volume II.

9.4.1 Test Categories

The qualification test procedure is presented in several parts:
♦ Functionality testing;
♦ Hardware testing;
♦ Software evaluation;
♦ System-level integration tests, including audits; and
♦ Examination of documented vendor practices for quality assurance and for
configuration management.
In practice, there may be concurrent indications of hardware and software function, or
failure to function, during certain examinations and tests. Operating tests of hardware
partially exercise the software as well and therefore supplement software
qualification. Security tests exercise hardware, software and communications


capabilities. Documentation review conducted during software qualification
supplements the review undertaken for system-level testing.
The qualification test procedures are presented in these categories because test
authorities frequently focus separately on each. The following subsections provide
information that test authorities need to conduct testing.
Not all systems being tested are required to complete all categories of testing. For
example, if a previously-qualified system has had hardware modifications, the system
may be subject only to non-operating environmental stress testing of the modified
component, and a partial system-level test. If a system consisting of general purpose
COTS hardware or one that was previously qualified has had modifications to its
software, the system is subject only to software qualification and system-level tests,
not hardware testing. However, in all cases the system documentation and
configuration management records will be examined to confirm that they completely
and accurately reflect the components and component versions that comprise the
voting system.

9.4.1.1 Focus of Functionality Tests

Functionality testing is performed to confirm the functional capabilities of a voting
system submitted for qualification. The ITA designs and performs procedures to test a
voting system against the requirements outlined in Section 2. To best
complement the diversity of the voting systems industry, this part of the qualification
testing process is not rigidly defined. Although there are basic functionality testing
requirements, additions or variations in testing are appropriate depending on the
system’s use of specific technologies and configurations, the system capabilities, and
the outcomes of previous testing.

9.4.1.2 Focus of Hardware Tests

Hardware testing begins with non-operating tests that require the use of an
environmental test facility. These are followed by operating tests that are performed
partly in an environmental facility and partly in a standard test laboratory or shop
environment.
The non-operating tests are intended to evaluate the ability of the system hardware to
withstand exposure to the various environmental conditions incidental to voting
system storage, maintenance, and transportation. The procedures are based on test
methods contained in Military Standards (MIL-STD) 810D, modified where
appropriate, and include such tests as: bench handling, vibration, low and high
temperature, and humidity.


The operating tests involve running the system for an extended period of time under
varying temperatures and voltages. This period of operation provides confidence
that the hardware meets or exceeds the minimum requirements for reliability, data
reading, and processing accuracy contained in Section 3. The procedure emphasizes
equipment operability and data accuracy; it is not an exhaustive evaluation of all
system functions. Moreover, the severity of the test conditions, in most cases, has
been reduced from that specified in the Military Standards to reflect commercial and
industrial, rather than military and aerospace, practice.

9.4.1.3 Focus of Software Evaluation

The software qualification tests encompass a number of interrelated examinations,
involving assessment of application source code for its compliance with the
requirements spelled out in Volume I, Section 4. Essentially, the ITA examines the
code for completeness, consistency, correctness, modifiability, structure,
and traceability, as well as its modularity and construction. The code inspection will
be followed by a series of functional tests to verify the proper performance of all
system functions controlled by the software.
The ITA may inspect COTS-generated software source code in the preparation of test
plans and may perform some minimal scanning or sampling to check for embedded code
or unauthorized changes. Otherwise, the COTS source code is not subject to the full
code review and testing. For purposes of code analysis, the COTS units shall be
treated as unexpanded macros.
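
As a rough illustration of scoping a review this way, the sketch below partitions a release tree into vendor source slated for full review and COTS units that are only fingerprinted and sampled. The directory layout and the choice of C source are assumptions of the example:

    import hashlib
    from pathlib import Path

    def review_scope(release_root):
        """Partition a release tree (hypothetical layout: vendor code under
        src/, unmodified COTS under cots/) into files for full code review
        and COTS units treated as opaque, fingerprinted wholes."""
        root = Path(release_root)
        full_review = sorted(root.joinpath("src").rglob("*.c"))
        cots_fingerprints = {
            str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in root.joinpath("cots").rglob("*") if p.is_file()
        }
        return full_review, cots_fingerprints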

9.4.1.4 Focus of System-Level Integration Tests

The functionality, hardware, and software qualification tests supplement a fuller
evaluation performed by the system-level integration tests. System-level tests focus on
these aspects jointly, throughout the full range of system operations. They include
tests of fully integrated system components, internal and external system interfaces,
usability and accessibility, and security. During this process election management
functions, ballot-counting logic, and system capacity are exercised. The process also
includes the Physical Configuration Audit (PCA) and the Functional Configuration
Audit (FCA).
The ITA tests the interface of all system modules and subsystems with each other
against the vendor’s specifications. Some, but not all, systems use telecommunications
capabilities as defined in Section 5. For those systems that do use such capabilities,
components that are located at the poll site or separate vote counting site are tested for
effective interface, accurate vote transmission, failure detection, and failure recovery.
For voting systems that use telecommunications lines or networks that are not under
the control of the vendor (e.g., public telephone networks), the ITA tests the interface


of vendor-supplied components with these external components for effective
interface, vote transmission, failure detection, and failure recovery.
The security tests focus on the ability of the system to detect, prevent, log, and recover
from a broad range of security risks as identified in Section 6. The range of risks
tested is determined by the design of the system and potential exposure to risk.
Regardless of system design and risk profile, all systems are tested for effective access
control and physical data security. For systems that use public telecommunications
networks, to transmit election management data or official election results (such as
ballots or tabulated results), security tests are conducted to ensure that the system
provides the necessary identity-proofing, confidentiality, and integrity of transmitted
data. The tests determine if the system is capable of detecting, logging, preventing,
and recovering from types of attacks known at the time the system is submitted for
qualification. The ITA may meet these testing requirements by confirming the proper
implementation of proven commercial security software.
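
The Standards do not mandate a particular mechanism for protecting transmitted data. As one hedged illustration of the integrity requirement, a results message could carry a keyed digest that the receiving site verifies before accepting the data; the sketch below uses Python's standard hmac module, and the message framing and key handling are assumptions of the example:

    import hashlib
    import hmac

    def seal(results: bytes, key: bytes) -> bytes:
        """Append a keyed SHA-256 digest so the receiving site can detect
        tampering or corruption of results data in transit."""
        return results + hmac.new(key, results, hashlib.sha256).digest()

    def unseal(message: bytes, key: bytes) -> bytes:
        """Verify and strip the digest, rejecting the message on mismatch."""
        body, tag = message[:-32], message[-32:]
        expected = hmac.new(key, body, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("integrity check failed; do not accept the results")
        return body
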
The interface between the voting system and its users, both voters and election
officials, is a key element of effective system operation and confidence in the system.
At this time, general standards for the usability of voting systems by the average voter
and election officials have not been defined, but are to be addressed in the next update
of the Standards. However, standards for usability by individual voters with
disabilities have been defined in Section 2.7 based on Section 508 of the
Rehabilitation Act Amendments of 1998. Voting systems are tested to ensure that an
accessible voting station is included in the system configuration and that its design
and operation conform to these standards.
The Physical Configuration Audit (PCA) compares the voting system components
submitted for qualification to the vendor’s technical documentation and confirms that
the documentation submitted meets the requirements of the Standards. As part of the
PCA, the ITA also witnesses the build of the executable system to ensure that the
qualified executable release is built from the tested components.
The Functional Configuration Audit (FCA) is an exhaustive verification of every
system function and combination of functions cited in the vendors' documentation.
Through use, the FCA verifies the accuracy and completeness of the system's TDP.
The various options of software counting logic that are claimed in the vendor’s
documentation shall be tested during the system-level FCA. Generic test ballots or test
entry data for DRE systems, representing particular sequences of ballot-counting
events, will test the counting logic during this audit.
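
A generic test deck of this kind pairs a sequence of ballot-counting events with the totals they must produce. A minimal harness sketch, in which the deck format and the counting interface are hypothetical stand-ins for a real system's interfaces:

    def verify_counting_logic(count_ballots, deck, expected_totals):
        """Feed a test deck through the ballot-counting logic under test
        and compare the produced totals against the deck's known totals.
        count_ballots takes a list of ballots (contest -> selections) and
        returns contest -> candidate -> vote count."""
        actual = count_ballots(deck)
        assert actual == expected_totals, f"totals mismatch: {actual}"

    # Example deck exercising an overvote rule: the second ballot
    # overvotes the mayor contest and must contribute no votes to it.
    deck = [{"mayor": ["A"]}, {"mayor": ["A", "B"]}]
    expected = {"mayor": {"A": 1, "B": 0}}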

9.4.1.5 Focus of Vendor Documentation Examination

The ITA reviews the documentation submitted by the vendor to evaluate the extent to
which it conforms to the requirements outlined in Sections 7 and 8 for vendor
configuration and quality assurance practices. The ITA also evaluates the


conformance of other documentation and information provided by the vendor with the
vendor’s documented practices for quality assurance and configuration management.
The Standards do not require on-site examination of the vendor’s quality assurance
and configuration management practices during the system development process.
However, while at the vendor site to witness the system build, the ITA conducts
several activities that enable it to assess the vendor's quality assurance and
configuration management practices and the vendor's conformance with them. These include
surveys, interviews with individuals at all levels of the development team, and
examination of selected internal work products such as system change requests and
problem tracking logs.

9.4.2 Sequence of Tests and Audits

There is no required sequence for performing the system qualification tests and audits.
For a new system, not previously qualified, a test using the generic test ballot decks
might be performed before undertaking any of the more lengthy and expensive tests or
documentation review. The ITA or vendor may, however, schedule the PCA, FCA, or
other tests in any convenient order, provided that the prerequisite conditions for each
test have been met before it is initiated.

9.5 Test Applicability

Qualification tests are conducted for new systems seeking initial qualification as well
as for systems that are modified after qualification.

9.5.1 General Applicability

Voting system hardware, software, communications and documentation are examined
and tested to determine suitability for elections use. Examination and testing addresses
the broad range of system functionality and components, including system
functionality for pre-voting, voting, and post-voting functions described in Section 2.
All products custom designed for election use shall be tested in accordance with the
applicable procedures contained in this section. COTS hardware, system software and
communications components with proven performance in commercial applications
other than elections, however, are exempted from certain portions of the test as long
as such products are not modified for use in a voting system. Compatibility of these
products with all other components of the voting system shall be determined through
functional tests integrating these products with the remainder of the system.


9.5.1.1 Hardware

Specifically, the hardware test requirements shall apply in full to all equipment used
in a voting system with the exception of the following:
a. Commercially available models of general purpose information technology
equipment that have been designed to an ANSI or IEEE standard, have a
documented history of successful performance for relevant requirements of
the standards, and have demonstrated compatibility with the voting system
components with which they interface;
b. Production models of special purpose information technology equipment that
have a documented history of successful performance under conditions
equivalent to election use for relevant requirements of the standards and that
have demonstrated compatibility with the voting system components with
which they interface; and
c. Any ancillary devices that do not perform ballot definition, election database
maintenance, ballot reading, ballot data processing, or the production of an
official output report; and that do not interact with these system functions
(e.g., modems used to broadcast results to the press, printers used to generate
unofficial reports, or CRTs used to monitor the vote counting process).
This equipment shall be subject to functional and operating tests performed during
software evaluation and system-level testing. However, it need not undergo hardware
non-operating tests. If the system is composed entirely of off-the-shelf hardware, then
the system also shall not be subject to the 48-hour environmental chamber segment of
the hardware operating tests.

9.5.1.2 Software

Software qualification is applicable to the following:
a. Application programs that control and carry out ballot processing, commencing with the definition of a ballot, and including processing of the ballot
image (either from physical ballots or electronically activated images), and
ending with the system's access to memory for the generation of output
reports;
b. Specialized compilers and specialized operating systems associated with
ballot processing; and
c. Standard compilers and operating systems that have been modified for use in
the vote counting process.
Specialized software for ballot preparation, election programming, vote recording,
vote tabulation, vote consolidation and reporting, and audit trail production shall be
subjected to code inspection. Functional testing of all these programs during software


evaluation and system-level testing shall exercise any specially tailored software offline from the ballot counting process (e.g., software for preparing ballots and
broadcasting results).

9.5.2 Modifications to Qualified Systems

Changes introduced after the system has completed qualification under these
Standards or earlier versions of the national Voting System Standards will necessitate
further review.

9.5.2.1 General Requirements for Modifications

The ITA will determine the tests necessary to qualify the modified system based on a
review of the nature and scope of changes, and other submitted information including
the system documentation, vendor test documentation, configuration management
records, and quality assurance information. Based on this review, the ITA may:
a. Determine that a review of all change documentation against the baseline
materials is sufficient for recommendation for qualification; or
b. Determine that all changes must be retested against the previously qualified
version (this will include review of changes to source code, review of all
updates to the TDP, and performance of system-level and functional tests);
or
c. Determine that the scope of the changes is substantial and will require a
complete retest of the hardware, software, and/or telecommunications.

9.5.2.2 Basis for Limited Testing Determinations

The ITA may determine that a modified system will be subject only to limited
qualification testing if the vendor demonstrates that the change does not affect
demonstrated compliance with these Standards for:
a. Performance of voting system functions;
b. Voting system security and privacy;
c. Overall flow of system control; and
d. The manner in which ballots are defined and interpreted, or voting data are
processed.


Limited qualification testing is intended to facilitate the correction of defects, the
incorporation of improvements, the enhancement of portability and flexibility, and the
integration of vote-counting software with other systems and election software.

9.6 Qualification Test Process

The qualification test process may be performed by one or more ITAs that together
perform the full scope of tests required by the Standards. Where multiple ITAs are
involved, testing shall be conducted first for the voting system hardware, firmware,
and related documentation; then for the system software and communications; and
finally for the integrated system as a whole. Voting system hardware and firmware
testing may be performed by one ITA independently of the other testing performed by
other ITAs. Testing may be coordinated across ITAs so that hardware/firmware tested
by one ITA can be used in the overall system tests performed by another ITA.
Whether one or more ITAs are used, the testing generally consists of three phases:
♦ Pre-test Activities;
♦ Qualification Testing; and
♦ Qualification Report Issuance and Post-test Activities.

9.6.1 Pre-test Activities

Pre-test activities include the request for initiation of testing and the pre-test
preparation.

9.6.1.1 Initiation of Testing

Qualification testing shall be conducted at the request of the vendor, consistent with
the provisions of the Standards. The vendor shall:
a. Request the performance of qualification testing from among the certified
ITAs,
b. Enter into formal agreement with the ITAs for the performance of testing, and
c. Prepare and submit materials required for testing consistent with the
requirements of the Standards.


Qualification testing shall be conducted for the initial version of a voting system as
well as for all subsequent changes to the system prior to release for sale or for
installation. As described in Section 9.5.2, the nature and scope of testing for system
changes or new versions shall be determined by the ITA based on the nature and
scope of the modifications to the system and on the quality of system documentation
and configuration management records submitted by the vendor.

9.6.1.2 Pre-test Preparation

Pre-test preparation encompasses the following activities:
a. The vendor shall prepare and submit a complete TDP to the ITA. The TDP
should consist of the items listed in Section 9.2 and specified in greater detail
in Standards Volume II;
b. The ITA shall perform an initial review of the TDP for completeness and
clarity and request additional information as required;
c. The vendor shall provide additional information, if requested by the ITA;
d. The vendor and ITA shall enter into an agreement for the testing to be
performed by the ITA in exchange for payment by the vendor; and
e. The vendor shall deliver to the ITA all hardware and software needed to
perform testing.

9.6.2 Qualification Testing

Qualification testing encompasses the preparation of a test plan, the establishment of
the appropriate test conditions, the use of appropriate test fixtures, the witness of the
system build and installation, the maintenance of qualification test data, and the
evaluation of the data resulting from tests and examinations.

9.6.2.1 Qualification Test Plan

The ITA shall prepare a Qualification Test Plan to define all tests and procedures
required to demonstrate compliance with the Standards, including:
a. Verifying or checking equipment operational status by means of manufacturer
operating procedures;
b. Establishing the test environment or the special environment required to
perform the test;


c. Initiating and completing operating modes or conditions necessary to evaluate
the specific performance characteristic under test;
d. Measuring and recording the value or range of values for the characteristic to
be tested, demonstrating expected performance levels;
e. Verifying, as above, that the equipment is still in normal condition and status
after all required measurements have been obtained;
f. Confirming that documentation submitted by the vendor corresponds to the
actual configuration and operation of the system; and

g. Confirming that documented vendor practices for quality assurance and
configuration management comply with the Standards.
A recommended outline for the test plan and the details of required testing are
contained in Standards Volume II.
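
Items a through g map naturally onto a per-system checklist that a test authority can instantiate. A skeleton sketch (the field names are illustrative; Volume II gives the recommended outline):

    from dataclasses import dataclass, field

    @dataclass
    class QualificationTestPlan:
        """Illustrative skeleton covering plan items a-g above."""
        system_version: str
        status_checks: list = field(default_factory=list)            # item a
        test_environment: str = "standard laboratory conditions"     # item b
        operating_modes: list = field(default_factory=list)          # item c
        measurements: list = field(default_factory=list)             # item d
        post_test_status_checks: list = field(default_factory=list)  # item e
        documentation_checks: list = field(default_factory=list)     # item f
        qa_cm_practice_checks: list = field(default_factory=list)    # item g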

9.6.2.2 Qualification Test Conditions

The ITA may perform qualification tests in any facility capable of supporting the test
environment. The following practices shall be employed:
a. Preparations for testing, arrangement of equipment, verification of equipment
status, and the execution of procedures shall be witnessed by at least one
independent, qualified observer, who shall certify that all test and data
acquisition requirements have been satisfied;
b. When a test is to be performed at “standard” or “ambient” conditions, this
requirement shall refer to a nominal laboratory or office environment, with a
temperature in the range of 68 to 75 degrees Fahrenheit, and prevailing
atmospheric pressure and relative humidity; and
c. Otherwise, all tests shall be performed at the required temperature and
electrical supply voltage, regulated within the following tolerances:
1) Temperature: +/- 4 degrees Fahrenheit; and
2) Electrical supply voltage: +/- 2 VAC.
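
A monitoring check for these tolerances is straightforward; the sketch below assumes periodic chamber readings and applies the thresholds quoted above:

    def within_tolerance(measured_temp_f, nominal_temp_f,
                         measured_volts, nominal_volts):
        """True when test conditions satisfy the required tolerances:
        +/- 4 degrees Fahrenheit and +/- 2 VAC around the nominal values."""
        return (abs(measured_temp_f - nominal_temp_f) <= 4.0
                and abs(measured_volts - nominal_volts) <= 2.0)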

9.6.2.3 Qualification Test Fixtures

ITAs may use test fixtures or ancillary devices to facilitate qualification testing. These
fixtures and devices may include arrangements for automating the operation of voting
devices and the acquisition of test data:
a. For systems that use a light source as a means of detecting voter selections,
the generation of a suitable optical signal by an external device is acceptable.


For systems that rely on the physical activation of a switch, a mechanical
fixture with suitable motion generators is acceptable;
b. ITAs may use a simulation device, and appropriate software, to speed up the
process of testing and eliminate human error in casting test ballots (see the software sketch following this list), provided
that the simulation covers all voting data detection and control paths that are
used in casting an actual ballot. In the event that only partial simulation is
achieved, then an independent method and test procedure shall be used to
validate the proper operation of those portions of the system not tested by the
simulator; and
c. If the vendor provides a means of simulating the casting of ballots, the
simulation device is subject to the same performance, reliability, and quality
requirements that apply to the voting device itself.
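
A simulation device of the kind described in items b and c drives the same detection and control paths a voter would and keeps its own expected tally for later comparison against the device's report. A minimal software sketch, in which the device interface is hypothetical:

    import random

    def simulate_casting(device, ballot_styles, n_ballots, seed=0):
        """Cast n_ballots pseudo-random test ballots through the voting
        device's normal entry path (device.cast is a hypothetical stand-in
        for the fixture's interface) and return the expected tally."""
        rng = random.Random(seed)   # seeded for reproducible test runs
        expected = {}
        for _ in range(n_ballots):
            ballot = rng.choice(ballot_styles)
            device.cast(ballot)
            for contest, choice in ballot.items():
                contest_tally = expected.setdefault(contest, {})
                contest_tally[choice] = contest_tally.get(choice, 0) + 1
        return expected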

9.6.2.4 Witness of System Build and Installation

Although most testing is conducted at facilities operated by the ITA, a key element of
voting system testing shall be conducted at the vendor site. The ITA responsible for
testing voting system software, telecommunications, and integrated system operation
(i.e., system wide testing) shall witness the final system build, encompassing
hardware, software and communications, and the version of associated records and
documentation. The system elements witnessed, including their specific versions,
shall become the specific system version that is recommended for qualification.

9.6.2.5 Qualification Test Data Requirements

The following qualification test data practices shall be employed:
a. A test log of the procedure shall be maintained. This log shall identify the
system and equipment by model and serial number;
b. Test environment conditions shall be noted; and
c. All operating steps, the identity and quantity of simulated ballots, annotations
of output reports, the elapsed time for each procedure step, and observations
of equipment performance and, in the case of non-operating hardware tests,
the condition of the equipment shall be recorded.
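
A log record satisfying items a through c might look like the following sketch (the field names are illustrative, not prescribed by the Standards):

    from dataclasses import dataclass

    @dataclass
    class TestLogEntry:
        """One step of a qualification test run, covering the data
        required above."""
        system_model: str
        serial_number: str
        environment: str          # e.g., "73 F, 45% RH, 120 VAC"
        operating_step: str
        simulated_ballots: int
        elapsed_seconds: float
        observations: str         # output annotations, equipment condition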

9.6.2.6 Qualification Test Practices

The ITA shall conduct the examinations and tests defined in the Test Plan such that all
applicable tests identified in Standards Volume II are executed to determine
compliance with the requirements in Sections 2-8 of the Standards. The ITA shall


evaluate data resulting from examinations and tests, employing the following
practices:
a. If any malfunction or data error is detected that would be classified as a
relevant failure using the criteria in Volume II, its occurrence, and the
duration of operating time preceding it, shall be recorded for inclusion in the
analysis of data obtained from the test, and the test shall be interrupted;
b. If a malfunction is due to a defect in software, then the test shall be terminated
and the system returned to the vendor for correction;
c. If the malfunction is other than a software defect, and if corrective action is
taken to restore the equipment to a fully operational condition within 8 hours,
then the test may be resumed at the point of suspension;
d. If the test is suspended for an extended period of time, the ITA shall maintain
a record of the procedures that have been satisfactorily completed. When
testing is resumed at a later date, repetition of the successfully completed
procedures may be waived, provided that no design or manufacturing change
has been made that would invalidate the earlier test results;
e. Any and all failures that occurred as a result of a deficiency shall be classified
as purged, and test results shall be evaluated as though the failure or failures
had not occurred, if the:
1) Vendor submits a design, manufacturing, or packaging change notice to
correct the deficiency, together with test data to verify the adequacy of the
change;
2) Examiner of the equipment agrees that the proposed change will correct
the deficiency; and
3) Vendor certifies that the change will be incorporated into all existing and
future production units; and
f. If corrective action cannot be successfully taken as defined above, then the
test shall be terminated, and the equipment shall be rejected.
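
Read as a procedure, items a through f amount to a small decision rule for a single observed malfunction. A hedged sketch, with the classification inputs standing in for the Volume II failure criteria (the purging decision in item e is documentation-driven and is not modeled):

    def disposition(software_defect, repaired_within_8h):
        """Sketch of the disposition rules for a malfunction already
        recorded as a relevant failure under the Volume II criteria
        (item a: record it and interrupt the test, then decide)."""
        if software_defect:
            return "terminate test; return system to vendor"   # item b
        if repaired_within_8h:
            return "resume test at point of suspension"        # item c
        return ("suspend and keep records; resume later only if no "
                "invalidating change; otherwise reject")       # items d, f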

9.6.3 Qualification Report Issuance and Post-test Activities

Qualification report issuance and post-test activities encompass the activities
described below:
a. The ITA may issue interim reports to the vendor, informing the vendor of the
testing status, findings to date, and other information. Such reports do not
constitute official test reports for voting system qualification;
b. The ITA shall prepare a Qualification Test Report that confirms that the voting
system has passed the testing conducted by the ITA. The ITA shall include in the


Qualification Test Report the date testing was completed, the specific system
version addressed by the report, the version numbers of all system elements
separately identified with a version number by the vendor, and the scope of
tests conducted. A recommended outline for the test report is contained in
Volume II;
c. Where a system is tested by multiple ITAs, each ITA shall prepare a
Qualification Test Report;
d. The ITA shall deliver the Qualification Test Report to the vendor and to
NASED;
e. NASED shall issue a single Qualification Number for the system to the
vendor and to the ITAs. The issuance of a Qualification Number indicates that
the system has been tested by certified ITAs for compliance with the national
test standards and qualifies for the certification process of states that have
adopted the national standards;
f. This number applies to the system as a whole only for the configuration and
versions of the system elements tested by the ITAs and identified in the
Qualification Test Reports. The Qualification Number does not apply to
individual system components or untested configurations; and

g. The Qualification Number is intended for use by the states and their
jurisdictions to support state and jurisdiction processes concerning voting
systems. States and their jurisdictions shall request ITA Qualification Test
Reports, based on the Qualification Number, as part of their certification and
procurement processes for voting systems that rely on the Standards.

9.6.4 Resolution of Testing Issues

The NASED Voting Systems Board (the Board) is responsible for resolving questions
about the application of the Standards in the testing of voting systems. The Secretariat
for the Board will relay its decisions to the NASED certified ITAs and voting system
vendors. The Federal Election Commission will monitor these decisions in order to
determine which of them, if any, should be reflected in a subsequent version of the
standards.


Volume I, Appendix A
Table of Contents

Glossary

A Glossary

Absentee Ballot

A ballot cast by a voter unable to vote in person at his or her polling place on
election day.

Acceptance Test

The examination of a voting system and its components by the purchasing
election authority (usually in a simulated-use environment) to validate
performance of delivered units in accordance with procurement requirements,
and to validate that the delivered system is, in fact, the certified or qualified
system purchased. Testing to validate performance may be less broad than that
involved with qualification testing and successful performance for multiple units
(precinct count systems) may be inferred from a sample test.

Ballot Configuration

The combination of contests, ballot measures, or both that is unique to a
particular political subdivision, precinct, or portion of a precinct (for split
precincts) in a particular election. Typically, in primary elections, there are
separate ballot configurations for each participating political party and for
nonpartisan races and ballot issues. Depending on state law and practice,
contests for federal, state, and local office may be presented in separate ballot
configurations or combined into a single ballot configuration.

Ballot Counter

A counter in a voting device that counts the ballots cast in a single election or
election test. Previously known as public counter.

Ballot Counting Logic

The software logic that defines the combinations of voter choices that are valid
and invalid on a given ballot and that determines how the vote choices are
totaled in a given election. States differ from each other in the way they define
valid and invalid votes and in their vote counting procedures. For example,
voters in some States are permitted to both select the straight party option and
vote “by exception” for candidates from a different political party. In other
States, voters who choose the straight party option and also vote for candidates
from a different party in some contests will be considered to have overvoted in
those contests.

Ballot Format

One of any number of specific ballot configurations issued to the appropriate
precinct. At minimum, ballot formats differ from one another in content. They
may also differ in size of type, in language used, or in method of presentation
(e.g., visual or audio). Also referred to as “ballot style.”

Ballot Image

An electronically produced record of all votes cast by a single voter. (Also
referred to as “ballot set”).


Ballot Preparation

The process of using election databases to select the specific contests and
questions to be contained in a ballot format and related instructions; preparing
election specific software containing these selections; producing all possible
ballot formats (or styles); and validating the correctness of ballot materials and
software containing these selections for an upcoming election.

Ballot Production

The process of converting the ballot format to a medium ready for use in
physical ballot production or electronic presentation.

Ballot Rotation

The process of varying the order of the candidate names within a given contest
to reduce the impact of voter bias towards the candidate(s) listed first. States
that require ballot rotation may do so for primary elections, general elections, or
both. States may rotate the names according to a number of different formulas
including by political subdivision, by election district, by precinct, or by ballot
displays or voting machines.

Ballot Set

See “Ballot Image.”

Ballot Scanner

A device used to read the data from a marksense ballot.

Ballot Style

One of any number of specific ballot configurations issued to the appropriate
precinct. At minimum, ballot styles differ from one another in content. They may
also differ in size of type, in language used, or in method of presentation (e.g.,
visual or audio). Also referred to as “ballot format.”

Baseline

A product configuration that has been formally submitted for review against the
Standards, which thereafter serves as the basis for further development and can
be changed and offered to jurisdictions only through formal change control and
requalification procedures (and/or recertification procedures where applicable).
(Patterned after IEEE Std. 610.12-1990)

Candidate Register

The record that reflects the total votes cast for the candidate. This record is
augmented as each ballot is cast on a DRE or as digital signals from the
conversion of voted paper ballots are logically interpreted and recorded.

Canvass

A compilation of election returns and validation of the outcome that form the
basis of the official results.

Catastrophic System Failure

A total loss of function or functions, such as the loss or unrecoverable
corruption of voting data, or the failure of an on-board battery for volatile
memory.

Certification Testing

The state examination, and possibly testing, of a voting system to determine its
compliance with state laws, regulations, and rules and any other state
requirements for voting systems.

Challenged Ballot

A ballot provided to individuals whose eligibility to vote has been challenged.
Once voted, such ballots are not included in the tabulation until after the voter’s
eligibility is confirmed.

Closed Primary

A primary election in which voters receive a ballot listing only those candidates
running for office in the political party with which the voters are affiliated, along
with nonpartisan offices and ballot issues presented at the same election.
Usually, unaffiliated voters are permitted to vote only on nonpartisan offices and
ballot issues. In some cases, one or more political parties within a state may
allow unaffiliated voters to choose to vote in their party’s primary.


Commercial Off-the-Shelf (COTS)

Commercial, readily-available hardware devices (such as card readers,
printers, or personal computers) or software products (such as operating
systems, programming language compilers, or database management
systems). These devices and software are exempted from certain portions of
the qualification testing process so long as such products are not modified in
any manner for use in the voting system.

Component

Individual elements or items that collectively comprise a device. Examples
include circuit boards, internal modems, processors, disk drives, and computer
memory.

Configuration Identification

An element of configuration management, consisting of selecting the
configuration items for a system and recording their functional and physical
characteristics in technical documentation. (Patterned after IEEE Std. 610.12-1990)

Configuration Item

An aggregation of hardware, software, or both that is designated for
configuration management and treated as a single entity in the configuration
management process. (Patterned after IEEE Std. 610.12-1990)

Configuration Management

A discipline applying technical and administrative direction and surveillance to:
identify and document functional and physical characteristics of a configuration
item, control changes to these characteristics, record and report change
processing and implementation status, and verify compliance with specified
requirements. (Patterned after IEEE Std. 610.12-1990)

Configuration Status Accounting

An element of configuration management, consisting of the recording and
reporting of information needed to manage a configuration effectively. This
information includes a listing of the approved configuration identification, the
status of proposed changes to the configuration, and the implementation status
of approved changes. (Patterned after IEEE Std. 610.12-1990)

COTS

See “Commercial Off-the-Shelf.”

Count

The process of totaling votes.

Cross-party Endorsement

The endorsement of a single candidate or slate of candidates by more than one
political party. The candidate or slate appears on the ballot representing each
endorsing political party. State requirements vary for how votes are recorded
when a voter selects the same candidate or slate more than once. Also referred
to as “cross filing.”

Cumulative Voting

A practice where voters are permitted to cast as many votes as there are seats
to be filled. Voters are not limited to giving only one vote to a candidate.
Instead, they can put multiple votes on one or more candidates. (For additional
information, access the Center for Voting and Democracy’s web site at
http://www.fairvote.org/contents.htm#irv.)

Data Accuracy

The system's ability to process voting data absent internal errors generated by
the system. It is distinguished from data integrity, which encompasses errors
introduced by an outside source.

Data Integrity

The invulnerability of the system to accidental intervention or deliberate,
fraudulent manipulation that would result in errors in the processing of data. It is
distinguished from data accuracy, which encompasses internal, system-generated
errors.

Device

A functional unit that performs its assigned tasks as an integrated whole.


Direct Record Electronic (DRE) Voting System

A voting system that records votes by means of a ballot display provided with
mechanical or electro-optical components that can be actuated by the voter;
that processes the data by means of a computer program; and that records
voting data and ballot images in internal and/or external memory components. It
produces a tabulation of the voting data stored in a removable memory
component and in printed copy.

Election Coding

See “Election Programming.”

Election Databases

A data file or set of files that contains geographic information about political
subdivisions and boundaries; all contests and questions to be included in an
election; and the candidates for each contest.

Election District

A contiguous geographic area represented by a public official who is elected by
voters residing within the district boundaries. The district may cover an entire
state or political subdivision, may be a portion of the state or political
subdivision, or may include portions of more than one political subdivision.

Election Management System

A set of processing functions and databases within a Voting System that define,
develop and maintain election databases; perform election definition and setup
functions; format ballots; count votes; consolidate and report results; and
maintain audit trails.

Election Programming

The process by which election officials or their designees use voting system
software to logically define the ballot for a specific election. Also referred to as
“election coding.”

FEC

An acronym for the Federal Election Commission.

Firmware

Computer programs (software) stored in read-only memory (ROM) devices
embedded in the system and not capable of being altered during system
operation. For purposes of applying the Standards, firmware is considered a
form of software.

Functional Configuration Audit (FCA)

An exhaustive verification of every system function and combination of
functions cited in the vendors' documentation. Through use, the FCA verifies
the accuracy and completeness of the system's Voter Manual, Operations
Procedures, Maintenance Procedures, and Diagnostic Testing Procedures.

Functional Test

A test performed to verify or validate the accomplishment of a function or a
series of functions.

General Election

An election in which voters, regardless of party affiliation, are permitted to
select persons to fill public office and vote on ballot issues. Where the public
office may be filled by a candidate affiliated with a political party, voters choose
among the nominees of political parties and, as permitted by state law,
unaffiliated candidates.

ITA

An acronym for Independent Test Authority.

Logical Correctness

A condition signifying that, for a given input, a computer program will satisfy the
program specification (produce the required output).

Marksense Voting System

A system by which votes are recorded by means of marks made in voting
response fields designated on one or both faces of a ballot card or series of
cards. Marksense systems use a ballot scanner to read the ballots.


Measure Register

The record that reflects the total votes cast for and against a specific ballot
issue. This record is augmented as each ballot is cast on a DRE or as digital
signals from the conversion of voted paper ballots are logically interpreted and
recorded.

Non-partisan Office

An elected office for which candidates run independent of political party
affiliation.

Nonvolatile Memory

Memory in which information can be stored indefinitely with no power applied.
ROMs and EPROMs are examples of nonvolatile memory.

Open Primary

A primary election in which voters, regardless of political affiliation, may choose
in which party’s primary they will vote. Some states require voters to publicly
declare their choice of party ballot at the polling place, after which the poll
worker provides or activates the appropriate ballot. Other states allow the
voters to make their choice of party ballot within the privacy of the voting booth.
Voters also are permitted to vote on nonpartisan offices and ballot issues that
are presented at the same election.

Overvotes

The generally prohibited practice of voting for more than the allotted number of
candidates for the office being contested.

Paper-Based Voting System

A voting system referred to in the 1990 Standards as a Punchcard and
Marksense (P&M) Voting System that records votes, counts votes, and
produces a tabulation of the vote count, using one or more ballot cards.

Partisan Office

An elected office for which candidates run as representatives of a political party.

Physical Configuration Audit (PCA)

An inspection that compares the voting system components submitted for
qualification to the vendor’s technical documentation and confirms that the
documentation submitted meets the requirements of the Standards. As part of
the PCA, the ITA also witnesses the building of the executable system to
ensure that the qualified executable release is built from the tested
components.

Political Subdivision

Any unit of government, such as counties and cities but often excepting school
districts, having authority to hold elections for public offices or on ballot issues.

Polling Location

The physical address of a polling place.

Polling Place

The area within the polling location where voters cast ballots.

Precinct

An administrative division representing a contiguous geographic area in which
voters cast ballots at the same polling place. Voters casting absentee ballots
may also be combined into one or more administrative absentee precincts for
purposes of tabulating and reporting votes. Generally, voters in a polling place
precinct are eligible to vote in a general election using the same ballot format.
In some jurisdictions, however, the ballot formats may be different due to split
precincts or required ballot rotations within the precinct.

Primary Election

In most cases, an election held to determine which candidate will represent a
political party in the general election. During presidential election years, voters
in primary elections may also select delegates to presidential nominating
conventions. Some states have an “open primary”, while others have a “closed
primary”. Sometimes elections for nonpartisan offices and ballot issues are
held during primary elections.


Primary Presidential Delegation Nominations

A primary election in which voters choose the delegates to the Presidential
nominating conventions allotted to their state by the national party committees.

Provisional Ballot

A ballot provided to individuals who claim they are eligible to vote but whose
eligibility cannot be confirmed when they present themselves to vote. Once
voted, such ballots are not included in the tabulation until after the voter’s
eligibility is confirmed.

Public Network Direct Record Electronic (DRE) Voting System

A form of DRE voting system that uses electronic ballots and transmits official
vote data from the polling place to another location (such as a central count
facility) over a public network beyond the control of the election authority. These
networks include public telephone lines and the Internet.

Punchcard Voting System

A voting system where votes are recorded by means of punches made in voting
response fields designated on one or both faces of a ballot card or series of
cards.

Qualification Number

A number issued by NASED to a system that has been tested by certified
Independent Test Authorities for compliance with the qualification test
standards. The issuance of a Qualification Number indicates that the system
qualifies for the certification process of states that have adopted the Standards.

Qualification Test Report

A report of results of independent testing of a voting system by an Independent
Test Authority indicating the date testing was completed, the specific system
version tested, and the scope of tests conducted.

Qualification Testing

The examination and testing of a computerized voting system by an
Independent Test Authority using qualification test standards to determine if the
system complies with the qualification performance and test standards and with
its own specifications. This process occurs prior to state certification.

Ranked Order Voting

A practice that allows voters to rank candidates in a contest in order of choice:
1, 2, 3, and so on. A majority is required to win. If any candidate receives a
majority of the first-choice votes, that candidate wins the election. If not, the
last-place candidate is deleted, and all ballots are counted again, but this time
each ballot cast for the deleted candidate counts for the next-choice candidate
listed on the ballot. The process of eliminating the last-place candidate and
recounting the ballots continues until one candidate receives a majority of the
vote. The practice is also known as instant runoff voting, preference or
preferential voting, or choice voting. (For additional information, access the
Center for Voting and Democracy's web site at
http://www.fairvote.org/contents.htm#irv.)
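
The elimination-and-recount cycle can be made concrete with a small worked sketch. The following Python fragment is illustrative only: the ballot representation (a list of names in order of preference), the tally() helper, and the absence of tie handling are assumptions made for this overview, not part of the Standards.

    # Illustrative sketch of a ranked order (instant runoff) tally.
    # Assumption: each ballot is a list of candidate names, first choice first.
    from collections import Counter

    def tally(ballots):
        """Return the winner once a candidate holds a majority of counted ballots."""
        remaining = {name for ballot in ballots for name in ballot}
        while True:
            counts = Counter()
            for ballot in ballots:
                # Each ballot counts for its highest-ranked remaining candidate.
                for name in ballot:
                    if name in remaining:
                        counts[name] += 1
                        break
            leader, votes = counts.most_common(1)[0]
            if votes * 2 > sum(counts.values()):
                return leader
            # No majority: delete the last-place candidate and recount.
            remaining.discard(min(counts, key=counts.get))

    ballots = [["A", "B"], ["B", "A"], ["B"], ["C", "A"], ["C", "A"]]
    print(tally(ballots))  # prints "B": "A" is eliminated and the A>B ballot transfers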

Recall Issues (with Options)

The process that allows voters to remove their elected representatives from
office prior to the expiration of their terms of office. Often, the recall involves
not only the question of whether a particular officer should be removed from
office, but also the question of naming a successor in the event that there is an
affirmative vote for the recall. There are no provisions for the recall of federal
office holders.

Recertification

The state examination, and possibly the retesting, of a voting system that was
modified subsequent to receiving state certification. The object of this process
is to determine whether the modified system still functions properly in
accordance with state requirements.

Runoff Election

An election to select a winner following a primary, or sometimes a general
election, in which no candidate in the contest received the required minimum
percentage of the votes cast. The two candidates receiving the most votes for
the race in question proceed to the runoff election.

Split Precinct

A precinct containing more than one ballot format in order to accommodate a
contiguous geographic area served by the precinct that contains more than one
election district.

Straight Party Voting

A mechanism by which voters are permitted to cast a vote indicating the
selection of all candidates on the ballot for a single political party.

Support Software

Software that aids in the development or maintenance of other software, for
example compilers, loaders and other utilities. (Patterned after IEEE Std.
610.12-1990)

Tabulation

See “Count.”

Undervotes

The practice of voting for fewer than the total number of election contests listed
on the ballot, or of voting for fewer than the number of positions to be filled for
a single office (e.g., a voter would undervote if a contest required the selection
of three out of a given number of candidates and the voter chose only two).

Validation

The process of evaluating a system or component during or at the end of the
development process to determine whether it satisfies specified requirements.
(Patterned after IEEE Std. 610.12-1990)

Verification

The process of evaluating a system or component to determine whether the
products of a given development phase satisfy the conditions (such as
specifications) imposed at the start of that phase. (Patterned after IEEE Std.
610.12-1990)

Vote for N of M

A ballot choice in which voters are required to vote for a limited number of
candidates for a single office from a larger field of candidates. For example, in
an election for six open city council seats, voters may be told that they can vote
for six out of twelve candidates actually listed on the ballot.

Voter Registration System

A set of processing functions and data storage that maintains records of eligible
voters. This system generally is not considered a part of a Voting System
subject to the Standards.

Voting Position

Specific response fields on a ballot where the voter indicates the selection of a
candidate or ballot proposition.

Voting Station

A location within the polling place where voters may record their votes. A
voting station includes the voting booth or enclosure and the vote-recording
device.

Write-in Voting

A means to cast a vote for an individual not listed on the ballot. Voters may do
this by using a marking device to physically write their choice on the ballot or
they may use a keypad, touchscreen or other electronic means to indicate their
choice.

Volume I, Appendix B
Table of Contents

B Appendix – Applicable Documents
B.1 Documents Incorporated in the Standards
B.2 Standards Development Documents
B.3 Guidance Documents

B Appendix – Applicable Documents

B.1 Documents Incorporated in the Standards

The following publications have been incorporated into the Standards. Where specific
provisions from these publications have been incorporated, references are made in the
body of the Standards.
Federal Regulations

Code of Federal Regulations, Title 29, Part 1910, Occupational Safety and Health Act
Code of Federal Regulations, Title 36, Part 1194, Architectural and Transportation
Barriers Compliance Board, Electronic and Information Technology Standards - Final
Rule
Code of Federal Regulations, Title 47, Parts 15 and 18, Rules and Regulations of the
Federal Communications Commission
Code of Federal Regulations, Title 47, Part 15, “Radio Frequency Devices”, Subpart
J, “Computing Devices”, Rules and Regulations of the Federal Communications
Commission

American National Standards Institute (ANSI)

ANSI C63.4

Methods of Measurement of Radio-Noise Emissions from Low-Voltage
Electrical and Electronic Equipment in the Range of 9 kHz to
40 GHz

ANSI C63.19

American National Standard for Methods of Measurement of
Compatibility between Wireless Communication Devices and
Hearing Aids

International Electrotechnical Commission (IEC)

IEC 61000-4-2 (1995-01)

Electromagnetic Compatibility (EMC) Part 4: Testing and
Measurement Techniques. Section 2 Electrostatic Discharge
Immunity Test (Basic EMC publication).

IEC 61000-4-3 (1996)

Electromagnetic Compatibility (EMC) Part 4: Testing and
Measurement Techniques. Section 3 Radiated Radio-Frequency
Electromagnetic Field Immunity Test.

IEC 61000-4-4 (1995-01)

Electromagnetic Compatibility (EMC) Part 4: Testing and
Measurement Techniques. Section 4 Electrical Fast Transient/Burst
Immunity Test.

IEC 61000-4-5 (1995-02)

Electromagnetic Compatibility (EMC) Part 4: Testing and
Measurement Techniques. Section 5 Surge Immunity Test.

IEC 61000-4-6 (1996-04)

Electromagnetic Compatibility (EMC) Part 4: Testing and
Measurement Techniques. Section 6 Immunity to Conducted
Disturbances Induced by Radio-Frequency Fields.

IEC 61000-4-8 (1993-06)

Electromagnetic Compatibility (EMC) Part 4: Testing and
Measurement Techniques. Section 8 Power-Frequency Magnetic
Field Immunity Test. (Basic EMC publication).

IEC 61000-4-11 (1994-06)

Electromagnetic Compatibility (EMC) Part 4: Testing and
Measurement Techniques. Section 11. Voltage Dips, Short
Interruptions and Voltage Variations Immunity Tests.

IEC 61000-5-7 Ed. 1.0 b:2001

Electromagnetic compatibility (EMC) Part 5-7: Installation and
mitigation guidelines—Degrees of protection provided by enclosures
against electromagnetic disturbances

National Institute of Standards and Technology

FIPS 140-1

Security Requirements for Cryptographic Modules

FIPS 180-1

Secure Hash Standard

FIPS 188

Standard Security Label for Information Transfer

FIPS 196

Entity Authentication Using Public Key Cryptography

FIPS (number TBD)

Advanced Encryption Standard (AES) (Expected to become official
December 2001)

Military Standards

MIL-STD-498

Software Development and Documentation Standard, 1989

MIL-STD-810D (2)

Environmental Test Methods and Engineering Guidelines, 19 July
1983

B.2 Standards Development Documents

The following publications have been used for guidance in the revision of the
Standards.

American National Standards Institute (ANSI)

ANSI/ISO/IEC TR 9294:1990

Information Technology Guidelines for the Management of Software
Documentation

International Organization for Standardization (ISO)
International Electrotechnical Commission (IEC)

ISO/IEC TR 13335-4:2000

Information technology—Guidelines for the management of IT
Security—Part 4: Selection of safeguards

ISO/IEC TR 13335-3:1998

Information technology—Guidelines for the management of IT
Security—Part 3 Techniques for the management of IT security

ISO/IEC TR 13335-2:1997

Information technology—Guidelines for the management of IT
Security—Part 2: Managing and planning IT security

ISO/IEC TR 13335-1:1996

Information technology—Guidelines for the management of IT
Security—Part 1: Concepts and models for IT security

ISO 10007:1995

Quality Mgmt. Guidelines for Configuration Management

ISO 10005-1995

Quality Mgmt. Guidelines for Quality Plans

ANSI/ISO/ASQC
Q9000-3-1997

QM and QA standards Part 3: Guidelines for the application of
ANSI/ISO/ASQC Q9001-1994 to the Development, Supply,
Installation, and Maintenance of Computer Software

Electronic Industries Alliance Standards

MB2, MB5, MB9

Maintainability Bulletins

EIA 157

Quality Bulletin

EIA QB2-QB5

Quality Bulletins

EIA RB9

Failure Mode and Effect Analysis, Revision 71

EIA SEB1-SEB4

Safety Engineering Bulletins

RS-232-C

Interface Between Data Terminal Equipment and Data
Communications Equipment Employing Serial Binary Data Interchange

RS-366-A

Interface Between Data Terminal Equipment and Automatic Calling
Equipment for Data Communication

RS-404

Standard for Start-Stop Signal Quality Between Data Terminal
Equipment and Non-synchronous Data Communication Equipment

National Institute of Standards and Technology

NISTIR 4909

Software Quality Assurance: Documentation and Reviews

Institute of Electrical and Electronics Engineers

610.12-1990

IEEE Standard Glossary of Software Engineering Terminology

730-1998

IEEE Standard for Software Quality Assurance Plans

828-1998

IEEE Standard for Software Configuration Management Plans

829-1998

IEEE Standard for Software Test Documentation

830-1998

IEEE Recommended Practice for Software Requirements
Specifications

Military Standards

MIL-STD-498

Software Development and Documentation, 27 May 1998

B.3 Guidance Documents

The following publications contain information that is useful in understanding and
complying with the Standards.

American National Standards Institute (ANSI)

ANSI/ISO/IEC TR 10176:1998

Information Technology Guidelines for the Preparation of
Programming Language Standards

ANSI/ISO/IEC 6592:2000

Information Technology Guidelines for the Documentation of
Computer Based Application Systems

ANSI/ISO/ASQC
Q9000-3-1997

Quality management and quality assurance standards Part 3:
Guidelines for the application of ANSI/ISO/ASQC Q9001-1994 to
the development, supply, installation, and maintenance of computer
software

ANSI/ISO/ASQC
Q9000-1-1994

Quality Management and Quality Assurance Standards—Guidelines
for Selection and Use

ANSI/ISO/ASQC
Q10007-1995

Quality Management Guidelines for Configuration Management

National Institute of Standards and Technology

FIPS 102

Guideline for Computer Security Certification and Accreditation

FIPS 112

Password Usage (3)

FIPS 113

Computer Data Authentication

Institute of Electrical and Electronics Engineers

488-1987

IEEE Standard Digital Interface for Programmable Instrumentation

796-1983

IEEE Standard Microcomputer System Bus

IEEE/ANSI Software Engineering Standards

730.1-1995

IEEE Guide for Software Quality Assurance Planning

1008-1987

IEEE Standard for Software Unit Testing

1016-1998

IEEE Recommended Practice for Software Design Descriptions

1012-1998

IEEE Guide for Software Verification and Validation Plans

Military Standards

MIL-HDBK-454

Standard General Requirements for Electronic Equipment

MIL-HDBK-470

Maintainability Program for Systems & Equipment

MIL-STD-882

Systems Safety Program Requirements

MIL-STD-1472

Human Engineering Design Criteria for Military Systems, Equipment
and Facilities

MIL-STD-973

Configuration Management, 30 September 2000

Other References
Designing for the Color-Challenged: A Challenge, by Thomas G. Wolfmaier
(March 1999);
http://www.sandia.gov/itg/newsletter/mar99/accessibility_color_challenged.html

Effective Color Contrast: Designing for People with Partial Sight and Color
Deficiencies, by Aries Arditi, Ph.D.;
http://www.lighthouse.org/color_contrast.htm

Volume I, Appendix C
Table of Contents

C Appendix – Usability
C.1 Scope
C.2 General Principles
C.3 Overall Design and Layout of the Voter Workspace
C.4 Ballot Legibility and Understandability
C.5 Information Grouping
C.6 Voting Input Fields
C.7 Navigation and Manipulation of Ballots
C.8 Preventing and Minimizing Voter Errors
C.9 Help and System Failure
C.10 Voter Familiarization and Training


C Appendix – Usability

C.1 Scope

This appendix addresses the design of the voting system to meet the needs of the
voters, that is, to develop the interfaces between the voter and the system that are easy
to use and that minimize voter errors due to poor interface design. Depending on the
voting technology employed, the main elements of this interface are:
♦ Information displays, e.g., presentations of contests, candidates, propositions,
and instructions
♦ Vote input fields, e.g., the location where the voter indicates his or her
selection; and
♦ Navigation aids, e.g., the way that voters "move" from one part of the system
to another.
The most effective interfaces are almost transparent to the voter. They enable the
voter to devote his or her complete attention to the task at hand - voting for the
candidates and propositions of their choice. A good voter-voting system interface
guides the voter to appropriate behavior. It should be obvious to the voter what he or
she should do, and importantly, what seems obvious to the voter should be correct.
To the extent that the design confuses the voter or causes the voter to stop and think,
for example "where on this ballot do I place my vote" or "how do I change my vote,"
attention is directed away from the voter's main task and to the interface. At best, this
can lead to voter frustration. The voter must shift attention away from voting to
figuring out how to use the voting system. At worst, it can lead to errors such as
failing to vote for a contest, improperly indicating the vote so it is not counted, or
voting for more than the required number of candidates.
Designing effective and usable interfaces between the voter and the voting system
involves a number of activities. First, voter task requirements should be identified.
The requirements reflect the fact that the way that voters interact with the system is
different depending on the voting system technology. For example, the way a voter
casts a vote and navigates through a ballot will be quite different for a paper ballot
when compared to a computer-based voting system. These tasks need to be carefully
analyzed and addressed in the design.
Second, human factors design guidelines should be used to guide the interface design.
These guidelines have evolved from scientific research on the human performance
aspects of system design and from many years of application in the design of systems.
Their application can help to ensure that the design of voting systems is consistent
with and compatible with the physical and cognitive characteristics of the voting
public.
Third, usability tests and evaluations should be conducted to ensure the voting system
has achieved its design goals. In part, these tests can help verify that the design
features recommended in this Appendix are successfully implemented in the final
voting system design. Usability tests are based on the feedback and performance of
samples of voters and can help identify aspect of the design that may be unclear to
voters. Results from the tests and evaluations can be used to correct any design
deficiencies before the system are actually used for voting.
While all three activities are important, this appendix mainly addresses the second
activity discussed above. It provides guidance on the design of usable voter-voting
system interfaces based on human factors principles.

C.2 General Principles

The equipment used by voters to cast ballots should meet the following general
principles:
♦ The design should support voter tasks by providing alerts, information,
instructions, and controls when and where they are needed;
♦ The design should ensure compatibility with human physiological and
cognitive characteristics and limitations, including: visual and auditory
perception, information processing and memory, anthropometry and
biomechanics;
♦ The design should ensure voter safety. The potential for hazards such as
sharp edges, falling objects, pinch points, and electrical shock should be
anticipated and eliminated as much as possible from the design;
♦ Design conventions should be established to provide consistency and
standardization of the voter's interface with the system;
♦ The system should be the simplest design needed to meet its intended
function;

♦ The design should provide guidance to the voter through the balloting
process;
♦ The design should minimize voter inputs, e.g., do not add unnecessary steps,
and minimize the need to turn pages and navigate displays;
♦ The design should minimize attention shifts and interruptions, e.g., all
information necessary to cast a vote for a single race should be in one place,
without the need to turn pages or page to other screens; and
♦ Provisions should be made to accommodate the unique demands of all voters.
Additional criteria for accessibility that are mandated by the Standards are
discussed in Section 2.2.7.

C.3 Overall Design and Layout of the Voter Workspace

The workspace is the booth, workstation, or other location provided by the election
district where the voter goes to use the paper, mechanical, or electronic systems
provided for voting.
The ballot and supporting system elements should be properly located so the displays
are directly within the voter's visual field and comfortably within the voter's reach.
This may require making the voter-system interface adjustable so voters can adapt it
to their unique demands. Alternatively, different workspaces can be provided where
equipment is positioned to accommodate voters of different heights, voters who use
wheelchairs, and voters who must sit.
All displays and controls should be located to avoid parallax. Parallax refers to the
apparent change in the relative positions of objects depending on the position of the
viewer. Error will be minimized if the distance between the displays and controls is
small, and if the ballot is located so that it can be viewed "straight-on," i.e., with the
observer's line of sight perpendicular to the plane of the ballot.
The ambient lighting provided should be consistent with the balloting technology
used. More lighting should be provided for paper ballots than for electronic ballots.
When visual display units (VDUs) are used, ensure that lighting does not produce
glare or reflections on screens. Where both VDUs and paper must be used, task
lighting for reading paper should be provided.
When ballots extend to more than one page (paper or electronic), the same general
organizational layout should be used for all pages, i.e., the location of page
identifiers, page numbers, items to be voted on, navigation aids, etc.

C.4 Ballot Legibility and Understandability

In order to facilitate usability, voting system designers should pay close attention to
design elements that affect the voter's ability to clearly read and easily understand the
information provided. The following guidance addresses these design features:
a. The font size and style used should ensure that written material can be easily
and unambiguously read. Special provisions may be needed for visually-impaired voters:
1) Text (except labels) should be presented using upper and lower case
characters. Reading text is easier and faster when capitalization is used
conventionally to start sentences and to indicate proper nouns and
acronyms;
2) A clearly legible font should be utilized. Fonts should have true ascenders
and descenders, uniform stroke width, and uniform aspect ratio.
Preference should be given to simple styles. Script and other highly
stylized fonts should be avoided;
3) For a given font, it should be possible to clearly distinguish between the
following characters: X and K, T and Y, I and L, I and 1, O and Q, O and
0, S and 5, and U and V;
4) Character size should be large enough that a person with normal corrected
vision can easily read the text from the normal sitting or standing position
without squinting or leaning forward (guidance for character size is
provided in Table C-1; a lookup sketch follows the table);
Table C-1 Approximate Point Sizes For Different Viewing Distances

Viewing Distance    Minimum    Preferred
25                  9          12
30                  10         14
36                  12         17
42                  14         20
48                  16         22

Note: Point sizes refer to the size of letters when printed. When
viewed on monitors, the rendered size is not exactly the same.
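
Where ballot presentation is generated programmatically, Table C-1 can be applied directly. The sketch below is illustrative only: the table data are transcribed from above, and the point_size() helper and the choice to round up to the next tabulated distance are assumptions, not requirements of the Standards.

    # Illustrative sketch: select a point size from Table C-1 for a viewing distance.
    TABLE_C1 = {25: (9, 12), 30: (10, 14), 36: (12, 17), 42: (14, 20), 48: (16, 22)}

    def point_size(viewing_distance, preferred=True):
        # Round up to the next tabulated distance and return its size.
        for distance in sorted(TABLE_C1):
            if viewing_distance <= distance:
                minimum, pref = TABLE_C1[distance]
                return pref if preferred else minimum
        raise ValueError("viewing distance beyond the range of Table C-1")

    print(point_size(33))         # 17 (preferred size at the 36 row)
    print(point_size(33, False))  # 12 (minimum size at the 36 row)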

b. Instructions should be concise and designed to communicate information
clearly and unambiguously, so that they can be easily understood and
interpreted without error:

1) Instructions should be available in the voter's preferred language (as
required by the Voting Rights Act of 1965);
2) Instructions should be written as short sentences with short, simple words;
3) Instruction steps should be written in active voice as positive commands
(focusing on what to do, not what not to do);
4) Punctuation should conform to standard usage of the language used;
5) Words, phrases, and names used in instructions should be used
consistently within and among instructions and all other voting system
components;
6) Abbreviations and acronyms should be avoided or, if necessary, limited to
those well known to the voters;
7) If instructions include numbered lists, Arabic numerals should be used.
Numbers that are spelled out should be spelled consistently under the
same conditions;
8) The instructions should specify any conditions that must be met before an
action can be undertaken. Information about preconditions should be
located so that voters read the information before acting. Information
given in other locations may be overlooked, or require additional actions
to retrieve it, which may be distracting and time consuming. Further, if
conditions are implied, voters may easily miss or misinterpret them;
9) Applicable cautions or warnings should be displayed when the relevant
instructions are in view of the voter. Displaying warnings and cautions at
the same time as their associated instructions will help ensure that voters
read the information. Information provided elsewhere may be overlooked,
or may require retrieval by distracting and time-consuming actions;
10) Cautions or warnings should be uniquely presented, so that they are easily
distinguished from each other and from other display elements; and
11) All supplementary information (such as explanatory figures) required for
a procedure step should be shown concurrently with the step;
c. Graphics should be simple and have an obvious meaning that is consistent
with population stereotypes (unless well-known graphics are used, the
meaning of graphics should be tested in advance to ensure that they
communicate the intended message). Voter understanding of graphics can be
enhanced when the graphics are accompanied by instructional labels. For
example, an arrow used to indicate where to vote may be more clearly
understood if it is accompanied by the text "To register your vote, click here";

d. If the information is communicated by means of visual coding, such as by
color or shape, the following principles should be followed:
1) A limited number of codes should be used;
2) The meaning of code levels should be clearly presented to the voter;
3) Voters should be able to easily discriminate between the levels of the
code, e.g., the different colors; and
4) If the information being coded differs in importance, the code levels
should be mapped for salience, e.g., the most salient display characteristics
should direct the voter's attention to the most important information;
e. Decorative features with no information content should be minimized since
they can create distractions; and
f. All information (e.g., contest labels, candidate names, instructions, graphics,
and coding) should have good contrast against the background.

C.5 Information Grouping

Proper use of information is facilitated by the application of grouping principles.
When information is presented in a display, people have a tendency to group elements
in the display based on how they are presented. It is far preferable to intentionally
design the voting system display for proper grouping than to leave it to chance. The
guidance in this section addresses these design considerations.
a. Information on the ballot should be grouped. A group should include the
following:
1) Candidates for a given office;
2) The office for which a group of candidates are running; and
3) Vote response fields;
b. Any applicable instructions pertinent to the specific vote, such as an
indication of the number of candidates to vote for;
c. Information on the ballot should reflect principles of grouping:

1) A group should be visually distinct, e.g., techniques that can be used to
visually set apart a group include borders and demarcations, background
color, and textures;
2) The office for which a group of candidates are running should be
prominently labeled;
3) The names should be grouped so they appear together (not on separate
areas of the display or separate pages); and
4) There should be clear separations between groups. The separations
between the groups should be greater than the separation between the
items of information within a group.

C.6 Voting Input Fields

The design of the voting input field is as important as the presentation of information
itself. The design should make it clear where and how to vote and the system should
provide feedback that the vote was accepted by the system. The guidance in this
section addresses these design features.
a. The ballot should clearly indicate the action voters must take to cast a vote
and where that action must be taken in order to vote for specific candidates;
b. There should be a consistent relationship between the names of the candidates
and where to cast a vote. For example, if the response field where voters
indicate their selection is located to the right of a candidate's name, it should
always be located to the right of all candidates' names and never to the left or
some alternative position. The reason for this is that people are active
information processors and will abstract rules about the relationships between
information elements in the display. The rules then guide their subsequent
behavior. If the design is inconsistent, applying the rule leads to error.
Consistency, therefore, will help establish voter expectancy with balloting
systems and minimize errors;
c. The ballot should clearly indicate how many candidates are to be voted for;
d. The design should support the ability of the voter to remain in visual contact
with the current options when in the act of casting a vote. That is, the design
should minimize, as much as possible, the occlusion of the current item being
voted for by the voting action, such as when the movement of the voter's
hand to cast a vote blocks information on the display;
e. Feedback on the voter's selection should be provided. It should be clearly
obvious to voters what they voted for. In paper ballots, this is supported by
clear grouping principles. In electronic systems, an informative feedback
message should be provided;
f. Voters should be able to review all their votes prior to final submission.
While this is easy to implement in electronic systems, it can be more difficult
with some paper ballots, like punch cards. In such cases, where possible, it is
desirable to provide voters with easy access to a punch card reader or similar
device to check that their votes were cast as intended;

g. Voters should be able to modify their votes at any time before finalizing their
voting session;
h. In electronic and computer-based systems, fields where voters have to enter
identifying information, if any, should be clearly labeled and the place where
the information is to go should be clearly visible;
i. In computer-based systems, the cursor should be automatically positioned in
the first data entry field, and when the voter hits the "enter/return" key, the
cursor should automatically move to the next data entry field;
j. Voters should be able to correct the information if mistakes are made; and
k. In electronic and computer-based systems, voters should not have to input
identifying information more than once, e.g., if voters input their names at the
beginning of the voting session, they should not have to repeat the input on
subsequent pages.

C.7 Navigation and Manipulation of Ballots

As noted earlier, navigation and manipulation of ballots can be a distracting task that
shifts the voter's attention away from the voting task and can therefore increase the
probability of error. Careful attention has to be paid to the design of these aspects of
the voting system. The guidance provided in this section is intended to minimize the
demands of these activities.
a. Voters should be able to control the pace and sequence of their use of the
ballot. Voters should be able to freely move back and forth;
b. The means by which voters navigate through the system should be simple and
not require complex or complicated actions (e.g., clicking on a "Next Page"
button rather than scrolling);
c. The display should provide orientation and landmark features to support the
voter in determining where they are in the ballot;

d. Navigation features should be provided that are distinct and should be clearly
separated from voting response fields;
e. Any cursors should be visually distinct and should not move beyond the
boundaries of the screen (become invisible);
f. The input device (such as a mouse) and cursor response to voter movements
should be as precise as needed to reliably enter a vote; and

g. The system should respond to user inputs in less than a second; if processing
takes longer, feedback should be provided indicating that the system is
processing the voter's input (a timing sketch follows this list).
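
As referenced in item g above, the one-second rule can be enforced with a simple timer around input processing. The following Python sketch is illustrative only; the process_input() and show_busy_indicator() callables are hypothetical placeholders, not part of the Standards.

    # Illustrative sketch: show a busy indicator if processing exceeds one second.
    # process_input and show_busy_indicator are hypothetical placeholders.
    import threading

    def respond(process_input, show_busy_indicator):
        timer = threading.Timer(1.0, show_busy_indicator)  # fires only if still busy
        timer.start()
        try:
            return process_input()
        finally:
            timer.cancel()  # no indicator if processing finished within a second

    result = respond(lambda: sum(range(1000)), lambda: print("Processing..."))
    print(result)  # 499500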

C.8 Preventing and Minimizing Voter Errors

During the design of voting systems, it is important to anticipate the types of errors
voters may make so that features can be designed to minimize voter errors and to
provide the means for voters to realize their errors and correct them. The guidance in
this section addresses these design considerations.
a. The system should provide clear and explicit instructions on what procedures
the user should follow throughout the voting process.
b. The system should check user inputs for acceptability, e.g., check for inputs
that seem to be in error (such as putting an Arabic numeral in a name field)
and alert the voter when such a situation exists;
c. When feasible, interlocks should prevent voters from voting for more
candidates than permitted or from providing other types of unacceptable
voter inputs. When this occurs, voters should be alerted as to what is
incorrect (a minimal sketch of such a check follows this list);
d. The system should inform voters of items on the ballot that they have not
voted for. This should be done before the voter leaves the system. Voters
should be given the opportunity to complete their vote if they choose to, or
they should be able to exit without voting for the items they omitted; and
e. A means for correcting a vote response should be readily available. For
non-paper-based systems, this should be built into the design of the system.
For paper-based systems, procedures for undoing votes should be available,
voters should be explicitly told in advance what they are, and that information
should be posted close to where they will use the ballot.
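
As referenced in item c above, an interlock for a "vote for N of M" contest amounts to a simple check of the running selection count. The following Python sketch is illustrative only; the check_contest() helper and its message wording are hypothetical, not prescribed by the Standards.

    # Illustrative sketch of an overvote interlock and undervote notice for a
    # "vote for N of M" contest. Function name and messages are hypothetical.
    def check_contest(selections, allowed):
        """Return (accepted, message) for the candidates selected so far."""
        if len(selections) > allowed:
            return False, f"Overvote: select no more than {allowed} candidates."
        if len(selections) < allowed:
            return True, f"You may select {allowed - len(selections)} more candidate(s)."
        return True, "Selections complete."

    print(check_contest(["Smith", "Jones"], allowed=3))
    # (True, 'You may select 1 more candidate(s).')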

C.9 Help and System Failure

The availability of useful help features can support voting system usability. Similarly,
the voter should be alerted to any system failures that may impact the proper
recording of votes or personal safety. The guidance in this section addresses these
design considerations.
a. Help should be available to support users with specific questions;
b. The system should provide voters with information on what to do if the
instructions provided are not understood;
c. Acceptable voter behavior should be clearly identified (e.g., whether the voter
can leave the booth, open a curtain, remove a ballot, etc.);
d. System messages should be informative and in "plain English" and should not
contain technical or jargon terms;
e. Alarms should be provided to alert voters to system failures. The alarms
should be accompanied by instructions informing voters of the actions to take;
and
f. Status and alarm displays should follow conventional practice with respect to
color (a color-mapping sketch follows this list):
1) Green, blue, or white displays shall be used for indications of normal
status;
2) Amber indicators shall be used to indicate warnings or marginal status;
3) Red indicators shall be used to indicate error conditions or equipment
states that may result in damage, or in hazards to personnel; and
4) Unless the equipment is designed to halt under conditions of incipient
damage or hazard, an audible alarm shall also be provided.
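
As referenced in item f above, drawing every status and alarm display from one shared color mapping helps keep the conventions consistent across the system. This Python sketch is illustrative only; the StatusLevel names are assumptions, not part of the Standards.

    # Illustrative sketch: a single mapping from status level to display color,
    # following the conventions in item f. Level names are hypothetical.
    from enum import Enum

    class StatusLevel(Enum):
        NORMAL = "green"   # green, blue, or white indicate normal status
        WARNING = "amber"  # amber indicates warnings or marginal status
        ERROR = "red"      # red indicates errors or hazardous equipment states

    def indicator_color(level):
        return level.value

    print(indicator_color(StatusLevel.WARNING))  # amber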

C.10 Voter Familiarization and Training

Successful use of any voting system is supported by the availability of means for
voters to become familiar with voting system operation. Being able to use and become
familiar with the system prior to voting will minimize confusion and errors. The
guidance in this section addresses these design considerations.

a. Voters should have access to sample ballots and all instructions before they
have to vote;
b. Voters should have an opportunity to practice before they vote, especially if
using electronic systems. On-line support can be provided, e.g., web-based
access to all ballot information; and
c. Voters should have access to knowledgeable personnel to resolve any
questions.


Volume II, Section 1
Table of Contents

1 Introduction
1.1 Objectives and Usage of Volume II of the Voting Systems Standards
1.2 General Contents of Volume II
1.3 Qualification Testing Focus
1.4 Qualification Testing Sequence
1.5 Evolution of Testing
1.6 Outline of Contents

1 Introduction

1.1 Objectives and Usage of Volume II of the Voting Systems Standards

Volume II, Voting System Qualification Testing Standards, is a complementary
document to Volume I, Voting System Performance Standards. While Section 9 of
Volume I provides an overview of the qualification testing process performed by the
Independent Test Authorities (ITAs), Volume II provides specific detail about the
process that is necessary for ITAs, vendors, and election officials participating in the
qualification process. The Standards envision a diverse set of users for Volume II,
including:
♦ Vendors: Voting system vendors will use Volume II to guide the design,
construction, documentation, internal testing, and maintenance of voting
systems to ensure conformance with the Standards. Vendors will also use
Volume II to help define the obligations of organizations that support the
vendor’s system, such as suppliers, testers, and consultants.
♦ Independent Testing Authorities: Testing authorities certified to qualify
systems will use Volume II to guide the testing of voting systems and
preparation of test reports. Laboratories and other parties interested in
becoming ITAs can use Volume II to understand the requirements and
obligations placed on the ITAs involved in the process.
♦ Election officials: Voting officials in many jurisdictions will use Volume II
to guide system certification, procurement and acceptance requirements and
processes, which may include additional requirements and adjustments to
those requirements included in the Standards.

1.2 General Contents of Volume II

To support these primary users of the Standards, Volume II provides:

a. A discussion of the general sequencing of tests performed by the ITAs:
Volume II identifies the tests where sequencing is important and provides
such required sequences. Volume II also indicates other tests that may be
conducted in parallel.
b. A detailed description of the information required to be submitted by
voting system vendors in the Technical Data Package (TDP): The TDP is
a comprehensive set of documents that describe system design specifications,
operating procedures, system testing information, facility and resource
requirements for system operations, system maintenance instructions for
jurisdictions, and vendor practices for quality assurance and configuration
management that underlie the development and update of the system. The
TDP focuses predominantly on the required documentation contents,
providing flexibility to vendors to determine the best format for meeting the
content requirements.
c. Delineation of specific system tests to be conducted by the ITAs:
Volume II identifies specific tests that are to be conducted relating to system
components and to the integrated system as a whole. Tests are defined for
system functionality, hardware, software, telecommunications, and security
that address the performance standards delineated in Volume I.
d. Delineation of specific examinations of other information provided by
the vendor: Volume II identifies the criteria to be used by the ITAs in
conducting examinations of the information submitted in the TDP. These
criteria address the documentation provided in the TDP, including
documentation of the system and related operational procedures as well as
vendor practices for quality assurance and configuration management.
e. Description of process for handling failures: A system may fail to pass one
or more of the tests and examinations performed by the ITAs. Volume II
describes the practices to be used by the ITAs when the system or its
documentation fails a test or examination, including the nature and depth of
re-testing required for corrections submitted by the vendor.
f. Outline of Qualification Test Report: Volume II provides an outline of the
report issued by the ITAs at the conclusion of testing, providing the specific
requirements for this report.

1.3 Qualification Testing Focus

Qualification tests focus on multiple aspects of the voting system and the process for
development and maintenance. Although multiple ITAs may conduct qualification
testing, with each ITA conducting tests in its areas of expertise, the focus of their
combined activities remains the same. Overall, qualification testing focuses on:

a. The functional capabilities of the system to support specific election
activities performed by system users, including election officials and voters,
as defined in Volume I, Section 2 of the Standards;
b. The performance capabilities of the system that ensure accuracy, integrity,
and reliability of system operations and the election activities that rely on
them, as defined in Volume I, Sections 3, 4, 5 and 6 of the Standards;
c. The system development and maintenance processes and related quality
assurance activities performed by the vendor to ensure system quality, as
addressed in Volume I, Section 7 of the Standards;
d. The configuration management activities used to control the development
and modification of the system and its individual components, and maintain
accurate information about the version and status of the system and its
components throughout the system life cycle, as addressed in Volume I,
Section 8 of the Standards; and
e. The documentation developed and maintained by the vendor to support
system development, testing, installation, maintenance and operation, as
addressed by the TDP described in Volume II, Section 2.

1.4 Qualification Testing Sequence

The overall qualification test process progresses through several stages involving
pre-testing, testing, and post-testing activities as described in Volume I, Section 9 of the
Standards. Whereas Volume I describes the flow of the overall process, Volume II
focuses on the details of activities conducted by the ITA and activities conducted by
the vendor to facilitate testing and respond to errors, anomalies, and other findings of
concern during the test process.
Qualification testing involves a series of physical tests and other examinations that are
conducted in a particular sequence. This sequence is intended to maximize overall
testing effectiveness, as well as conduct testing in as efficient a manner as possible.
The ITA follows the general sequence of activities indicated below. Note that test
errors and anomalies are communicated to the vendor throughout the process.
a. Initial examination of the system and TDP provided by the vendor to ensure
that all components and documentation needed to conduct testing have been
submitted, and to help determine the scope and level of effort of testing
needed;
b. Development of a detailed system test plan that reflects the scope and
complexity of the system, and the status of system qualification (i.e., initial
qualification or re-qualification);
c. Operational testing of hardware components, including environmental tests, to
ensure that operational performance requirements are achieved;

d. Functional and performance testing of hardware components;
e. Examination of the vendor’s Quality Assurance Program and Configuration
Management Plan;
f. Code review for selected software components;

g. Functional and performance testing of software components;
h. System installation testing and testing of related documentation for system
installation and diagnostic testing;
i. Functional and performance testing of the integrated system, including testing
of the full scope of system functionality, performance tests for
telecommunications and security, and examination and testing of the System
Operations Manual;
j. Examination of the System Maintenance Manual;

k. Witnessing of a system ‘build’ conducted by the vendor to conclusively
establish the system version and components being tested; and
l. Preparation of the Qualification Test Report.

1.5 Evolution of Testing

The ITA will conduct extensive tests on a voting system to evaluate it against the
requirements of the Standards. Taking advantage of the experience gained in
examining other voting systems, ITAs will design tests specifically for the system
design, configuration, and documentation provided by the vendor. Additionally, new
threats may be identified that are not directly addressed by the Standards or the
system. As new threats to a voting system are discovered, either during the system’s
operation or during the operation of other computer-based systems that use
technologies comparable to those of another voting system, ITAs shall expand the
tests used for system security to address the threats that are applicable to a particular
design of voting system.

1.6 Outline of Contents

Volume II of the Voting Systems Standards is organized as follows:
♦ Section 2 describes the requirements for the Technical Data Package;
♦ Section 3 describes functionality testing;

♦ Sections 4 and 5 describe specific testing standards for hardware and
software;
♦ Section 6 describes standards for testing the fully integrated system, including
telecommunications and security capabilities, and the documentation used to
operate the system;
♦ Section 7 describes the standards for examining the documentation of vendor
practices for quality assurance and configuration management;
♦ Appendix A provides an outline for the Qualification Test Plan;
♦ Appendix B provides an outline for the Qualification Test Report; and
♦ Appendix C describes the guiding principles used to design the voting system
qualification testing process performed by ITAs.


Volume II, Section 2
Table of Contents

2

Technical Data Package .......................................................................................................................2-1
2.1

Scope ..........................................................................................................................................2-1
2.1.1

Content and Format..........................................................................................................2-1
2.1.1.1 Required Content for Initial Qualification..................................................................2-2
2.1.1.2 Required Content for System Changes and Re-qualification...................................2-2
2.1.1.3 Format

2.2

.................................................................................................................2-3

2.1.2

Other Uses for Documentation .........................................................................................2-3

2.1.3

Protection of Proprietary Information ................................................................................2-3

System Overview ...........................................................................................................................2-3
2.2.1

System Description...........................................................................................................2-4

2.2.2

System Performance ........................................................................................................2-5

2.3

System Functionality Description ...................................................................................................2-5

2.4

System Hardware Specification .....................................................................................................2-6

2.5

2.4.1

System Hardware Characteristics ....................................................................................2-6

2.4.2

Design and Construction ..................................................................................................2-7

Software Design and Specification ................................................................................................2-7
2.5.1

Purpose and Scope ..........................................................................................................2-7

2.5.2

Applicable Documents ......................................................................................................2-8

2.5.3

Software Overview............................................................................................................2-8

2.5.4

Software Standards and Conventions ..............................................................................2-8

2.5.5

Software Operating Environment......................................................................................2-9
2.5.5.1 Hardware Environment and Constraints ..................................................................2-9
2.5.5.2 Software Environment ............................................................................................2-10

2.5.6

Software Functional Specification...................................................................................2-10
2.5.6.1 Configurations and Operating Modes.....................................................................2-10
2.5.6.2 Software Functions.................................................................................................2-10

2.5.7

Programming Specifications ...........................................................................................2-11
2.5.7.1 Programming Specifications Overview...................................................................2-11
2.5.7.2 Programming Specifications Details.......................................................................2-11

2.5.8

System Database ...........................................................................................................2-12

2.5.9

Interfaces ........................................................................................................................2-13

i

Volume II – Section 2
Technical Data Package

2.5.9.1 Interface Identification ............................................................................................2-13
2.5.9.2 Interface Description ..............................................................................................2-13
2.5.10 Appendices .....................................................................................................................2-15
2.6

2.7

2.8

2.9

System Security Specification......................................................................................................2-15
2.6.1

Access Control Policy .....................................................................................................2-16

2.6.2

Access Control Measures...............................................................................................2-16

2.6.3

Equipment and Data Security .........................................................................................2-16

2.6.4

Software Installation .......................................................................................................2-16

2.6.5

Telecommunications and Data Transmission Security...................................................2-17

2.6.6

Other Elements of an Effective Security Program ..........................................................2-17

System Test and Verification Specification ..................................................................................2-18
2.7.1

Development Test Specifications ...................................................................................2-18

2.7.2

Qualification Test Specifications.....................................................................................2-19

System Operations Procedures ...................................................................................................2-19
2.8.1

Introduction .....................................................................................................................2-19

2.8.2

Operational Environment ................................................................................................2-20

2.8.3

System Installation and Test Specification .....................................................................2-20

2.8.4

Operational Features ......................................................................................................2-20

2.8.5

Operating Procedures.....................................................................................................2-21

2.8.6

Operations Support.........................................................................................................2-22

2.8.7

Appendices .....................................................................................................................2-22

System Maintenance Procedures ................................................................................................2-22
2.9.1

Introduction .....................................................................................................................2-23

2.9.2

Maintenance Procedures................................................................................................2-23
2.9.2.1 Preventive Maintenance Procedures .....................................................................2-23
2.9.2.2 Corrective Maintenance Procedures ......................................................................2-24

2.9.3

Maintenance Equipment .................................................................................................2-24

2.9.4

Parts and Materials.........................................................................................................2-24
2.9.4.1 Common Standards ...............................................................................................2-25
2.9.4.2 Paper-Based Systems............................................................................................2-25

2.9.5

Maintenance Facilities and Support................................................................................2-25

2.9.6

Appendices .....................................................................................................................2-26

2.10 Personnel Deployment and Training Requirements ....................................................................2-26
2.10.1 Personnel........................................................................................................................2-26
2.10.2 Training...........................................................................................................................2-27
2.11 Configuration Management Plan..................................................................................................2-27
2.11.1 Configuration Management Policy..................................................................................2-27
2.11.2 Configuration Identification .............................................................................................2-28

ii

Volume II – Section 2
Technical Data Package

2.11.3 Baseline, Promotion, and Demotion Procedures ............................................................2-28
2.11.4 Configuration Control Procedures ..................................................................................2-28
2.11.5 Release Process.............................................................................................................2-29
2.11.6 Configuration Audits .......................................................................................................2-29
2.11.7 Configuration Management Resources ..........................................................................2-29
2.12 Quality Assurance Program .........................................................................................................2-30
2.12.1 Quality Assurance Policy ................................................................................................2-30
2.12.2 Parts & Materials Special Tests and Examinations ........................................................2-30
2.12.3 Quality Conformance Inspections...................................................................................2-30
2.12.4 Documentation................................................................................................................2-31
2.13 System Change Notes .................................................................................................................2-31


2 Technical Data Package

2.1 Scope

This section contains a description of vendor documentation relating to the voting
system that shall be submitted with the system as a precondition of qualification
testing. These items are necessary to define the product and its method of operation; to
provide technical and test data supporting the vendor's claims of the system's
functional capabilities and performance levels; and to document instructions and
procedures governing system operation and field maintenance. Other items relevant to
the system evaluation shall be submitted along with this documentation (such as disks,
tapes, source code, object code, and sample output report formats).
Both formal documentation and notes of the vendor's system development process
shall be submitted for qualification tests. Documentation outlining system
development permits assessment of the vendor's systematic efforts to test the system
and correct defects. Inspection of this process also enables the design of a more
precise qualification test plan. If the vendor's developmental test data is incomplete,
the test agency shall design and conduct the appropriate tests.

2.1.1 Content and Format

The content of the Technical Data Package (TDP) is intended to collect clear,
complete descriptions of the following information about the system:
♦ Overall system design, including subsystems, modules and the interfaces
among them;
♦ Specific functional capabilities provided by the system;
♦ Performance and design specifications;
♦ Design constraints, applicable standards, and compatibility requirements;
♦ Personnel, equipment, and facility requirements for system operation,
maintenance, and logistical support;


♦ Vendor practices for assuring system quality during the system’s development
and subsequent maintenance; and
♦ Vendor practices for managing the configuration of the system during
development and for modifications to the system throughout its life cycle.
The vendor shall list all documents controlling the design, construction, operation, and
maintenance of the system. Documents shall be listed in order of precedence.

2.1.1.1 Required Content for Initial Qualification

At minimum, the TDP shall contain the following documentation:
a. System configuration overview;
b. System functionality description;
c. System hardware specifications;
d. Software design and specifications;
e. System test and verification specifications;
f. System security specifications;
g. User/system operations procedures;
h. System maintenance procedures;
i. Personnel deployment and training requirements;
j. Configuration management plan;
k. Quality assurance program; and
l. System change notes.

2.1.1.2 Required Content for System Changes and Re-qualification

For systems seeking re-qualification, vendors shall submit System Change Notes as
described in Section 2.13, as well as current versions of all documents that have been
updated to reflect system changes.
Systems in existence at the time the revised standards are released may not have all
required developmental documentation. When such a system is subject to evaluation
as a result of system modification, the vendor shall provide whatever documentation
is available. Vendors may also submit other information relevant to the evaluation of
the system, such as documentation of tests performed by other independent test
authorities and records of the system's performance history, if any.


2.1.1.3 Format

The requirements for formatting the TDP are general in nature; specific format details
are of the vendor’s choosing. Other items submitted by the vendor, such as
documentation of tests conducted by other test authorities, performance history,
failure analysis, and corrective action may be provided in a format of the vendor's
choosing.
The TDP shall include a detailed table of contents for the required documents, an
abstract of each document and a listing of each of the informational sections and
appendices presented. A cross-index shall be provided indicating the portions of the
documents that are responsive to documentation requirements for any item presented
using the vendor's format.

2.1.2 Other Uses for Documentation

Although all of the TDP documentation is required for qualification testing, some of
these same items may also be required during the state certification process and local
level acceptance testing. Therefore, it is recommended that the technical
documentation required for certification and acceptance testing be deposited in
escrow.

2.1.3 Protection of Proprietary Information

The vendor shall identify all documents, or portions of documents, containing
proprietary information not approved for public release. Any person or test agency
receiving proprietary information shall agree to use it solely for the purpose of
analyzing and testing the system, and shall agree to refrain from otherwise using the
proprietary information or disclosing it to any other person or agency without the
prior written consent of the vendor, unless disclosure is legally compelled.

2.2 System Overview

In the system overview, the vendor shall provide information that enables the test
authority to identify the functional and physical components of the system, how the
components are structured, and the interfaces between them.


2.2.1 System Description

The system description shall include written descriptions, drawings and diagrams that
present:
a. A description of the functional components (or subsystems) as defined by the
vendor (e.g., environment, election management and control, vote recording,
vote conversion, reporting, and their interconnection);
b. A description of the operational environment of the system that provides an
overview of the hardware, software, and communications structure;
c. A theory of operation that explains each system function, and how the
function is achieved in the design;
d. Descriptions of the functional and physical interfaces between subsystems and
components;
e. Identification of all COTS hardware and software products and
communications services used in the development and/or operation of the
voting system, identifying the name, vendor and version used for each such
component, including:
1) Operating systems;
2) Database software;
3) Communications routers;
4) Modem drivers; and
5) Dial-up networking software;
f. Interfaces among internal components, and interfaces with external systems.
For components that interface with other components for which multiple
products may be used, the TDP shall provide an identification of:
1) File specifications, data objects, or other means used for information
exchange; and
2) The public standard used for such file specifications, data objects, or
other means; and
g. Benchmark directory listings for all software (including firmware elements)
and associated documentation included in the vendor's release, listed in the
order in which each piece of software would normally be installed during
setup and installation. A sketch of such a listing appears below.
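By way of non-normative illustration of items (e) and (g), a benchmark directory listing can be generated mechanically and retained with the TDP. The following is a minimal sketch, assuming a hypothetical release/ directory; the Standards prescribe neither a tool nor a digest algorithm, so the use of SHA-256 here is an illustrative assumption only.

# Minimal sketch of a benchmark directory listing generator; the
# "release/" path and the choice of SHA-256 are illustrative assumptions,
# not requirements of the Standards.
import hashlib
import os

def benchmark_listing(root):
    """Walk a release tree in a stable order and report each file's
    path, size in bytes, and SHA-256 digest."""
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames.sort()                      # stable traversal order
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            yield path, os.path.getsize(path), digest

if __name__ == "__main__":
    for path, size, digest in benchmark_listing("release/"):
        print(f"{digest}  {size:>10}  {path}")

A listing of this form also gives the test authority a fixed reference point for later verifying that installed software matches the qualified release.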


2.2.2 System Performance

The vendor shall provide system performance information that includes descriptions
of:
a. The performance characteristics of each operating mode and function in terms
of expected and maximum speed, throughput capacity, maximum volume
(maximum number of voting positions and maximum number of ballot styles
supported), and processing frequency;
b. Quality attributes such as reliability, maintainability, availability, usability,
and portability;
c. Provisions for safety, security, privacy, and continuity of operation; and
d. Design constraints, applicable standards, and compatibility requirements.

2.3 System Functionality Description

The vendor shall declare the scope of the system’s functional capabilities, thereby
establishing the performance, design, test, manufacture, and acceptance context for the
system.
The vendor shall provide a listing of the system’s functional processing capabilities,
encompassing capabilities required by the Standards and any additional capabilities
provided by the system. This listing shall provide a simple description of each
capability. Detailed specifications shall be provided in other documentation required
for the TDP as indicated by the standards for that documentation.
a. The vendor shall organize the presentation of required capabilities in a
manner that corresponds to the structure and sequence of functional
capabilities indicated in Volume I, Section 2 of the Standards. The contents of
Volume I Section 2 may be used as the basis for a checklist whereby the
vendor indicates the specific functions provided and those not provided by the
system;
b. Additional capabilities shall be clearly indicated. They may be presented
using the same structure as that used for required capabilities (i.e., overall
system capabilities, pre-voting functions, voting functions, post-voting
functions), or may be presented in another format of the vendor’s choosing;
c. Required capabilities that may be bypassed or deactivated during installation
or operation by the user shall be clearly indicated;


d. Additional capabilities that function only when activated during installation or
operation by the user shall be clearly indicated; and
e. Additional capabilities that normally are active but may be bypassed or
deactivated during installation or operation by the user shall be clearly
indicated.

2.4 System Hardware Specification

The vendor shall expand on the system overview by providing detailed specifications
of the hardware components of the system, including specifications of hardware used
to support the telecommunications capabilities of the system, if applicable.

2.4.1 System Hardware Characteristics

The vendor shall provide a detailed discussion of the characteristics of the system,
indicating how the hardware meets individual requirements defined in Volume I,
Sections 3, 4, 5 and 6 of the Standards, including:
a. Performance characteristics: This discussion addresses basic system
performance attributes and operational scenarios that describe the manner in
which system functions are invoked, describe environmental capabilities,
describe life expectancy, and describe any other essential aspects of system
performance;
b. Physical characteristics: This discussion addresses suitability for intended use,
requirements for transportation and storage, health and safety criteria, security
criteria, and vulnerability to adverse environmental factors;
c. Reliability: This discussion addresses system and component reliability stated
in terms of the systems operating functions, and identification of items that
require special handling or operation to sustain system reliability;
d. Maintainability: Maintainability represents the ease with which maintenance
actions can be performed based on the design characteristics of equipment and
software and the processes the vendor and election officials have in place for
preventing failures and for reacting to failures. Maintainability includes the
ability of equipment and software to self-diagnose problems and make
non-technical election workers aware of a problem. Maintainability also addresses
a range of scheduled and unscheduled events; and
e. Environmental conditions: This discussion addresses the ability of the system
to withstand natural environments, and operational constraints in normal and
test environments, including all requirements and restrictions regarding


electrical service, telecommunications services, environmental protection, and
any additional facilities or resources required to install and operate the
system.

2.4.2 Design and Construction

The vendor shall provide sufficient data, or references to data, to identify
unequivocally the details of the system configuration submitted for qualification
testing. The vendor shall provide a list of materials and components used in the
system and a description of their assembly into major system components and the
system as a whole. Paragraphs and diagrams shall be provided that describe:
a. Materials, processes, and parts used in the system, their assembly, and the
configuration control measures to ensure compliance with the system
specification;
b. The electromagnetic environment generated by the system;
c. Operator and voter safety considerations, and any constraints on system
operations or the use environment; and
d. Human engineering considerations, including provisions for access by
disabled voters.

2.5 Software Design and Specification

The vendor shall expand on the system overview by providing detailed specifications
of the software components of the system, including software used to support the
telecommunications capabilities of the system, if applicable.

2.5.1 Purpose and Scope

The vendor shall describe the function or functions that are performed by the software
programs that comprise the system, including software used to support the
telecommunications capabilities of the system, if applicable.


2.5.2 Applicable Documents

The vendor shall list all documents controlling the development of the software and its
specifications. Documents shall be listed in order of precedence.

2.5.3 Software Overview

The vendor shall provide an overview of the software that includes the following
items:
a. A description of the software system concept, including specific software
design objectives, and the logic structure and algorithms used to accomplish
these objectives;
b. The general design, operational considerations, and constraints influencing
the design of the software;
c. Identification of all software items, indicating items that were:
1) Written in-house;
2) Procured and not modified; and
3) Procured and modified, including descriptions of the modifications to the
software and to the default configuration options;
d. Additional information for each item that includes:
1) Item identification;
2) General description;
3) Software requirements performed by the item;
4) Identification of interfaces with other items that provide data to, or
receive data from, the item; and
5) Concept of execution for the item.
The vendor shall also include a certification that procured software items were
obtained directly from the manufacturer or a licensed dealer or distributor.

2.5.4 Software Standards and Conventions

The vendor shall provide information that can be used by an ITA or state certification
board to support software analysis and test design. The information shall address


standards and conventions developed internally by the vendor as well as published
industry standards that have been applied by the vendor. The vendor shall provide
information that addresses the following standards and conventions:
a. System development methodology;
b. Software design standards, including internal vendor procedures;
c. Software specification standards, including internal vendor procedures;
d. Software coding standards, including internal vendor procedures;
e. Software testing and verification standards, including internal vendor
procedures, that can assist in determining the program's correctness and
ACCEPT/REJECT criteria; and
f. Quality assurance standards or other documents that can be used by the ITA
to examine and test the software. These documents include standards for
program flow and control charts, program documentation, test planning, and
test data acquisition and reporting.

2.5.5 Software Operating Environment

This section shall describe or make reference to all operating environment factors that
influence the software design.

2.5.5.1 Hardware Environment and Constraints

The vendor shall identify and describe the hardware characteristics that influence the
design of the software, such as:
a. The logic and arithmetic capability of the processor;
b. Memory read-write characteristics;
c. External memory device characteristics;
d. Peripheral device interface hardware;
e. Data input/output device protocols; and
f. Operator controls, indicators, and displays.


2.5.5.2 Software Environment

The vendor shall identify the compilers or assemblers used in the generation of
executable code, and describe the operating system or system monitor.

2.5.6 Software Functional Specification

The vendor shall provide a description of the operating modes of the system and of
software capabilities to perform specific functions.

2.5.6.1 Configurations and Operating Modes

The vendor shall describe all software configurations and operating modes of the
system, such as ballot preparation, election programming, preparation for opening the
polling place, recording votes and/or counting ballots, closing the polling place, and
generating reports. For each software function or operating mode, the vendor shall
provide:
a. A definition of the inputs to the function or mode (with characteristics,
tolerances or acceptable ranges, as applicable);
b. An explanation of how the inputs are processed; and
c. A definition of the outputs produced (again, with characteristics, tolerances,
or acceptable ranges as applicable).
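By way of non-normative illustration, the fragment below sketches one way a vendor might record a single operating mode's inputs, processing, and outputs. The mode name, field names, and tolerances are hypothetical; the Standards do not mandate any particular representation.

# Hypothetical record of one operating mode; all names and tolerances
# below are illustrative assumptions, not taken from any real system.
from dataclasses import dataclass

@dataclass
class OperatingModeSpec:
    mode: str         # operating mode or software function
    inputs: dict      # input -> characteristics / acceptable ranges
    processing: str   # explanation of how the inputs are processed
    outputs: dict     # output -> characteristics / acceptable ranges

close_polls = OperatingModeSpec(
    mode="closing the polling place",
    inputs={"poll-close command": "accepted from the supervisor console only",
            "public counter": "integer, 0 to rated ballot capacity"},
    processing="lock out further voting, then compute and record totals",
    outputs={"precinct totals report": "one line per contest and choice",
             "audit log entry": "time-stamped poll-close event"},
)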

2.5.6.2 Software Functions

The vendor shall describe the software's capabilities or methods for detecting or
handling:
a. Exception conditions;
b. System failures;
c. Data input/output errors;
d. Error logging for audit record generation (see the sketch following this list);
e. Production of statistical ballot data;
f. Data quality assessment; and
g. Security monitoring and control.
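For example, the error-logging capability in item (d) might be documented with a fragment such as the following minimal sketch; the logger name, file name, and record format are illustrative assumptions rather than requirements of the Standards.

# Minimal sketch of error logging for audit record generation; the file
# name and record format are illustrative assumptions.
import logging

logging.basicConfig(filename="audit.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
audit = logging.getLogger("audit")

def record_exception(condition, detail):
    """Append an exception condition to the audit trail instead of
    discarding it, so the event can be reconstructed after the election."""
    audit.error("%s: %s", condition, detail)

try:
    raise IOError("ballot reader timeout")   # simulated input/output error
except IOError as exc:
    record_exception("data input/output error", exc)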


2.5.7 Programming Specifications

The vendor shall provide in this section an overview of the software design, its
structure, and implementation algorithms, as well as detailed specifications for
individual software modules.

2.5.7.1 Programming Specifications Overview

This overview shall include such items as flowcharts, HIPOs, data flow diagrams, and
other graphical techniques that facilitate understanding of the programming
specifications. This section shall be prepared to facilitate understanding of the internal
functioning of the individual software modules. Implementation of the functions shall
be described in terms of the software architecture, algorithms, and data structures.

2.5.7.2 Programming Specifications Details

The programming specifications shall describe individual software modules and their
component units, if applicable. For each module and unit, the vendor shall provide the
following information:
a. Module and unit design decisions, if any, such as algorithms used;
b. Any constraints, limitations, or unusual features in the design of the software
module or unit;
c. The programming language to be used and rationale for its use if other than
the specified module or unit language;
d. If the software module or unit consists of or contains procedural commands
(such as menu selections in a database management system (DBMS) for
defining forms and reports, on-line DBMS queries for database access and
manipulation, input to a graphical user interface (GUI) builder for automated
code generation, commands to the operating system, or shell scripts), a list of
the procedural commands and reference to user manuals or other documents
that explain them;
e. If the software module or unit contains, receives, or outputs data, a description
of its inputs, outputs, and other data elements as applicable. (Section 2.5.9
describes the requirements for documenting system interfaces.) Data local to
the software module or unit shall be described separately from data input to or
output from the software module or unit;
f. If the software module or unit contains logic, the logic to be used by the
software unit, including, as applicable:


1) Conditions in effect within the software module or unit when its
execution is initiated;
2) Conditions under which control is passed to other software modules or
units;
3) Response and response time to each input, including data conversion,
renaming, and data transfer operations;
4) Sequence of operations and dynamically controlled sequencing during the
software module’s or unit’s operation, including:
i) The method for sequence control;

ii) The logic and input conditions of that method, such as timing
variations, priority assignments;
iii) Data transfer in and out of memory; and
iv) The sensing of discrete input signals, and timing relationships
between interrupt operations within the software module or unit; and
5) Exception and error handling; and
g. If the software module is a database, provide the information described in
Volume II, Section 2.5.8.

2.5.8 System Database

The vendor shall identify and provide a diagram and narrative description of the
system’s databases, and any external files used for data input or output. The
information provided shall include for each database or external file:
a. The number of levels of design and the names of those levels (such as
conceptual, internal, logical, and physical);
b. Design conventions and standards (which may be incorporated by references)
needed to understand the design;
c. Identification and description of all database entities and how they are
implemented physically (e.g., tables, files, etc.);
d. Entity relationship diagram and description of relationships; and
e. Details of table, record or file contents (as applicable) to include individual
data elements and their specifications, including:
1) Names/identifiers;
2) Data type (alphanumeric, integer, etc.);
3) Size and format (such as length and punctuation of a character string);


4) Units of measurement (such as meters, dollars, nanoseconds);
5) Range or enumeration of possible values (such as 0-99);
6) Accuracy (how correct) and precision (number of significant digits);
7) Priority, timing, frequency, volume, sequencing, and other constraints,
such as whether the data element may be updated and whether business
rules apply;
8) Security and privacy constraints; and
9) Sources (setting/sending entities) and recipients (using/receiving entities);
and
f. For external files, a description of the procedures for file maintenance,
management of access privileges, and security.
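To illustrate the level of detail intended by item (e), a single data-dictionary entry might be recorded as in the following sketch; the element and all of its attribute values are hypothetical.

# Hypothetical data-dictionary entry for one database element; every
# value below is an illustrative assumption.
ballots_cast = {
    "name": "ballots_cast",
    "data_type": "integer",
    "size_format": "unsigned, 32-bit",
    "units": "ballots",
    "range": "0 to 999,999",
    "accuracy_precision": "exact count",
    "constraints": "may only increase while the polls are open",
    "security_privacy": "read-only except in supervisor mode",
    "source": "ballot counting subsystem",
    "recipients": ["precinct totals report", "audit log"],
}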

2.5.9 Interfaces

The vendor shall identify and provide a complete description of all internal and
external interfaces, using a combination of text and diagrams.

2.5.9.1 Interface Identification

For each interface identified in the system overview, the vendor shall:
a. Provide a unique identifier assigned to the interface;
b. Identify the interfacing entities (systems, configuration items, users, etc.) by
name, number, version, and documentation references, as applicable; and
c. Identify which entities have fixed interface characteristics (and therefore
impose interface requirements on interfacing entities) and which are being
developed or modified (thus having interface requirements imposed on them).
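A single identification entry covering items (a) through (c) might be recorded as in the sketch below; the identifier scheme, entity names, and version numbers are hypothetical.

# Hypothetical interface identification entry; the identifier scheme,
# entity names, and versions are illustrative assumptions.
interface_07 = {
    "identifier": "IF-07",
    "entities": [
        {"name": "precinct tabulator firmware", "version": "4.1",
         "characteristics": "fixed (imposes interface requirements)"},
        {"name": "election management system", "version": "2.0",
         "characteristics": "under development (requirements imposed on it)"},
    ],
    "documentation": "System Overview, interface diagram for IF-07",
}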

2.5.9.2 Interface Description

For each interface identified in the system overview, the vendor shall provide
information that describes:
a. The type of interface (such as real-time data transfer, storage-and-retrieval of
data, etc.) to be implemented;
b. Characteristics of individual data elements that the interfacing entity(ies) will
provide, store, send, access, receive, etc., such as:


1) Names/identifiers;
2) Data type (alphanumeric, integer, etc.);
3) Size and format (such as length and punctuation of a character string);
4) Units of measurement (such as meters, dollars, nanoseconds);
5) Range or enumeration of possible values (such as 0-99);
6) Accuracy (how correct) and precision (number of significant digits);
7) Priority, timing, frequency, volume, sequencing, and other constraints,
such as whether the data element may be updated and whether business
rules apply;
8) Security and privacy constraints; and
9) Sources (setting/sending entities) and recipients (using/receiving entities);
c. Characteristics of communication methods that the interfacing entity(ies) will use
for the interface, such as:
1) Communication links/bands/frequencies/media and their characteristics;
2) Message formatting;
3) Flow control (such as sequence numbering and buffer allocation);
4) Data transfer rate, whether periodic/aperiodic, and interval between
transfers;
5) Routing, addressing, and naming conventions;
6) Transmission services, including priority and grade; and
7) Safety/security/privacy considerations, such as encryption, user
authentication, compartmentalization, and auditing;
d. Characteristics of protocols the interfacing entity(ies) will use for the interface,
such as:
1) Priority/layer of the protocol;
2) Packeting, including fragmentation and reassembly, routing, and
addressing;
3) Legality checks, error control, and recovery procedures;
4) Synchronization, including connection establishment, maintenance,
termination; and
5) Status, identification, and any other reporting features; and


e. Other characteristics, such as physical compatibility of the interfacing entity(ies)
(dimensions, tolerances, loads, voltages, plug compatibility, etc.).

2.5.10 Appendices

The vendor may provide descriptive material and data supplementing the various
sections of the body of the Software Specifications. The content and arrangement of
appendices shall be at the discretion of the vendor. Topics recommended for
amplification or treatment in appendix form include:
a. Glossary: A listing and brief definition of all software module names and
variable names, with reference to their locations in the software structure.
Abbreviations, acronyms, and terms should be included if they are either
uncommon in data processing and software development or are used in an
unorthodox sense;
b. References: A list of references to all related vendor documents, data,
standards, and technical sources used in software development and testing;
and
c. Program Analysis: The results of software configuration analysis, algorithm
analysis and selection, timing studies, and hardware interface studies that are
reflected in the final software design and coding.

2.6 System Security Specification

Vendors shall submit a system security specification that addresses the security
requirements of Volume I, Section 6 of the Standards. This specification shall
describe the level of security provided by the system in terms of the specific security
risks addressed by the system, the means by which each risk is addressed, the process
used to test and verify the effective operation of security capabilities and, for systems
that use public telecommunications networks as defined in Volume I, Section 5, the
means used to keep the security capabilities of the system current to respond to the
evolving threats against these systems.
Information provided by the vendor in this section of the TDP may duplicate
information required by other sections. Vendors may cross-reference information
provided in other sections, provided that the reference gives a clear mapping to the
requirements of this section.
Information submitted by the vendor shall be used by the test authority to assist in
developing and executing the system qualification test plan. The Security
Specification shall contain the sections identified below.


2.6.1 Access Control Policy

The vendor shall specify the features and capabilities of the access control policy
recommended to purchasing jurisdictions to provide effective voting system security
to meet the specific requirements of Volume I, Section 6.2.1. The access control
policy shall address the general features and capabilities and individual access
privileges indicated in Volume I, Section 6.2.1.

2.6.2 Access Control Measures

The vendor shall provide a detailed description of all system access control measures
and mandatory procedures designed to permit access to system states in accordance
with the access policy, and to prevent all other types of access to meet the specific
requirements of Volume I, Section 6.2.2.
The vendor also shall define and provide a detailed description of the methods used to
preclude unauthorized access to the access control capabilities of the system itself.

2.6.3 Equipment and Data Security

The vendor shall provide a detailed description of system capabilities and mandatory
procedures for purchasing jurisdictions to prevent disruption of the voting process and
corruption of voting data to meet the specific requirements of Volume I, Section 6.3
of the Standards. This information shall address measures for polling place security
and central count location security.

2.6.4 Software Installation

The vendor shall provide a detailed description of the system capabilities and
mandatory procedures for purchasing jurisdictions to ensure secure software
(including firmware) installation to meet the specific requirements of Volume I,
Section 6.4 of the Standards. This information shall address software installation for
all system components.


2.6.5 Telecommunications and Data Transmission Security

The vendor shall provide a detailed description of the system capabilities and
mandatory procedures for purchasing jurisdictions to ensure secure data transmission
to meet the specific requirements of Volume I, Section 6.5:
a. For all systems, this information shall address access control, and prevention
of data interception; and
b. For systems that use public communications networks as defined in Volume I
Section 5, this information shall also include:
1) Capabilities used to provide protection against threats to third party
products and services;
2) Policies and processes used by the vendor to ensure that such protection is
updated to remain effective over time;
3) Policies and procedures used by the vendor to ensure that current versions
of such capabilities are distributed to user jurisdictions and are installed
effectively by the jurisdiction;
4) A detailed description of the system capabilities and procedures to be
employed by the jurisdiction to diagnose the occurrence of a denial of
service attack, to use an alternate method of voting, to determine when it
is appropriate to resume voting over the network, and to consolidate votes
cast using the alternate method;
5) A detailed description of all activities to be performed in setting up the
system for operation that are mandatory to ensure effective system
security, including testing of security before an election; and
6) A detailed description of all activities that should be prohibited during
system setup and during the timeframe for voting operations, including
both the hours when polls are open and when polls are closed.

2.6.6 Other Elements of an Effective Security Program

The vendor shall provide a detailed description of the following additional procedures
required for use by the purchasing jurisdiction:
a. Administrative and management controls for the voting system and election
management, including access controls;
b. Internal security procedures, including operating procedures for maintaining
the security of the software for each system function and operating mode;


c. Adherence to, and enforcement of, operational procedures (e.g., effective
password management);
d. Physical facilities and arrangements; and
e. Organizational responsibilities and personnel screening.
This documentation shall be prepared such that these requirements can be integrated
by the jurisdiction into local administrative and operating procedures.

2.7 System Test and Verification Specification

The vendor shall provide test and verification specifications for:
a. Development test specifications; and
b. Qualification test specifications.

2.7.1 Development Test Specifications

The vendor shall describe the plans, procedures, and data used during software
development and system integration to verify system logic correctness, data quality,
and security. This description shall include:
a. Test identification and design, including:
1) Test structure;
2) Test sequence or progression; and
3) Test conditions;
b. Standard test procedures, including any assumptions or constraints;
c. Special purpose test procedures, including any assumptions or constraints;
d. Test data, including the data source, whether it is real or simulated, and how
test data is controlled;
e. Expected test results; and
f. Criteria for evaluating test results.
Additional details for these requirements are provided by MIL-STD-498, Software
Test Plan (STP) and Software Test Description (STD). In the event that test data is not
available, the ITA shall design test cases and procedures equivalent to those ordinarily
used during product verification.
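By way of illustration, one development test case covering these elements might be recorded along the following lines; the identifier, deck name, and criteria are hypothetical, and neither the Standards nor MIL-STD-498 mandates this representation.

# Hypothetical record of one development test case; all identifiers,
# deck names, and counts are illustrative assumptions.
test_case = {
    "id": "DT-012",
    "design": "ballot interpretation, straight-party voting",
    "conditions": "simulated precinct count, polls open",
    "data": "simulated 100-ballot test deck, version-controlled",
    "procedure": ["load the election definition",
                  "feed the test deck",
                  "print the precinct totals report"],
    "expected_results": "totals match the predetermined deck counts",
    "evaluation_criteria": "zero discrepancies yields ACCEPT, else REJECT",
}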


2.7.2 Qualification Test Specifications

The vendor shall provide specifications for verification and validation of overall
software performance. These specifications shall cover:
a. Control and data input/output;
b. Acceptance criteria;
c. Processing accuracy;
d. Data quality assessment and maintenance;
e. Ballot interpretation logic;
f. Exception handling;
g. Security; and
h. Production of audit trails and statistical data.
The specifications shall identify procedures for assessing and demonstrating the
suitability of the software for elections use.

2.8 System Operations Procedures

This documentation shall provide all information necessary for system use by all
personnel who support pre-election and election preparation, polling place activities
and central counting activities, as applicable, with regard to all system functions and
operations identified in Section 2.3 above. The nature of the instructions for operating
personnel will depend upon the overall system design and required skill level of
system operations support personnel.
The system operations procedures shall contain all information that is required for the
preparation of detailed system operating procedures, and for operator training,
including the sections listed below:

2.8.1 Introduction

The vendor shall provide a summary of system operating functions and modes, in
sufficient detail to permit understanding of the system's capabilities and constraints.
The roles of operating personnel shall be identified and related to the operating modes
of the system. Decision criteria and conditional operator functions (such as error and
failure recovery actions) shall be described.


The vendor shall also list all reference and supporting documents pertaining to the use
of the system during election operations.

2.8.2 Operational Environment

The vendor shall describe the system environment, and the interface between the user
or operator and the system. The vendor shall identify all facilities, furnishings,
fixtures, and utilities that will be required for equipment operations, including
equipment that operates at the:
a. Polling place;
b. Central count facility; and
c. Other locations.

2.8.3 System Installation and Test Specification

The vendor shall provide specifications for validation of system installation,
acceptance, and readiness. These specifications shall address all components of the
system and all locations of installation (e.g., polling place, central count facility), and
shall address all elements of system functionality and operations identified in Section
2.3 above, including:
a. Pre-voting functions;
b. Voting functions;
c. Post-voting functions; and
d. General capabilities.
These specifications also serve to provide guidance to the procuring agency in
developing its acceptance test plan and procedure according to the agency's contract
provisions, and the election laws of the state.

2.8.4 Operational Features

The vendor shall provide documentation of system operating features that meets the
following requirements:


a. Provides a detailed description of all input, output, control, and display
features accessible to the operator or voter;
b. Provides examples of simulated interactions in order to facilitate
understanding of the system and its capabilities;
c. Provides sample data formats and output reports; and
d. Illustrates and describes all status indicators and information messages.

2.8.5 Operating Procedures

The vendor shall provide documentation of system operating procedures that meets
the following requirements:
a. Provides a detailed description of procedures required to initiate, control, and
verify proper system operation;
b. Provides procedures that clearly enable the operator to assess the correct flow
of system functions (as evidenced by system-generated status and information
messages);
c. Provides procedures that clearly enable the operator to intervene in system
operations to recover from an abnormal system state;
d. Defines and illustrates the procedures and system prompts for situations
where operator intervention is required to load, initialize, and start the system;
e. Defines and illustrates procedures to enable and control the external interface
to the system operating environment if supporting hardware and software are
involved (such information shall be provided for the interaction of the system
with other data processing systems or data interchange protocols as well);
f. Provides administrative procedures and off-line operator duties (if any) if they
relate to the initiation or termination of system operations, to the assessment
of system status, or to the development of an audit trail;
g. To support successful ballot and program installation and control by election
officials, provides a detailed work plan or other form of documentation
providing a schedule and steps for the software and ballot installation,
including a table outlining the key dates, events, and deliverables; and
h. To support diagnostic testing, specifies diagnostic tests that may be employed
to identify problems in the system, verify the correction of maintenance
problems, and isolate and diagnose faults from various system states.


2.8.6 Operations Support

The vendor shall provide documentation of system operating procedures that meets
the following requirements:
a. Defines the procedures required to support system acquisition, installation,
and readiness testing (these procedures may be provided by reference, if they
are contained either in the system hardware specifications, or in other vendor
documentation provided to the ITA and to system users); and
b. Describes procedures for providing technical support, system maintenance
and correction of defects, and for incorporating hardware upgrades and new
software releases.

2.8.7 Appendices

The vendor may provide descriptive material and data supplementing the various
sections of the body of the System Operations Manual. The content and arrangement
of appendices shall be at the discretion of the vendor. Topics recommended for
discussion include:
a. Glossary: A listing and brief definition of all terms that may be unfamiliar to
persons not trained in either voting systems or computer operations;
b. References: A list of references to all vendor documents and to other
sources related to operation of the system;
c. Detailed Examples: Detailed scenarios that outline correct system responses
to faulty operator input. Alternative procedures may be specified depending
on the system state; and
d. Manufacturer's Recommended Security Procedures: This appendix shall
contain the security procedures that are to be executed by the system operator.

2.9 System Maintenance Procedures

The system maintenance procedures shall provide information in sufficient detail to
support election workers, data personnel, or maintenance personnel in the adjustment
or removal and replacement of components or modules in the field. Technical
documentation needed solely to support the repair of defective components or
modules ordinarily done by the manufacturer or software developer is not required.


Recommended service actions to correct malfunctions or problems shall be discussed,
along with personnel and expertise required to repair and maintain the system; and
equipment, materials, and facilities needed for proper maintenance. This manual shall
include the sections listed below.

2.9.1 Introduction

The vendor shall describe the structure and function of the equipment (and related
software) for election preparation, programming, vote recording, tabulation, and
reporting in sufficient detail to provide an overview of the system for maintenance,
and for identification of faulty hardware or software. The description shall include a
theory of operation that fully describes such items as:
a. The electrical and mechanical functions of the equipment;
b. How the processes of ballot handling and reading are performed (paper-based
systems);
c. How vote selection and casting of the ballot are performed (DRE systems);
d. How transmission of data over a network is performed (DRE systems, where
applicable);
e. How data are handled in the processor and memory units;
f. How data output is initiated and controlled;

g. How power is converted or conditioned; and
h. How test and diagnostic information is acquired and used.

2.9.2 Maintenance Procedures

The vendor shall describe preventive and corrective maintenance procedures for
hardware and software.

2.9.2.1 Preventive Maintenance Procedures

The vendor shall identify and describe:
a. All required and recommended preventive maintenance tasks, including
software tasks such as software backup, database performance analysis, and
database tuning;


b. Number and skill levels of personnel required for each task;
c. Parts, supplies, special maintenance equipment, software tools, or other
resources needed for maintenance; and
d. Any maintenance tasks that must be coordinated with the vendor or a third
party (such as coordination that may be needed for off-the-shelf items used in
the system).

2.9.2.2 Corrective Maintenance Procedures

The vendor shall provide fault detection, fault isolation, correction procedures, and
logic diagrams for all operational abnormalities identified by design analysis and
operating experience.
The vendor shall identify specific procedures to be used in diagnosing and correcting
problems in the system hardware (or user-controlled software). Descriptions shall
include:
a. Steps to replace failed or deficient equipment;
b. Steps to correct deficiencies or faulty operations in software;
c. Modifications that are necessary to coordinate any modified or upgraded
software with other software modules;
d. The number and skill levels of personnel needed to accomplish each
procedure;
e. Special maintenance equipment, parts, supplies, or other resources needed to
accomplish each procedure; and
f. Any coordination required with the vendor, or other party, for off-the-shelf
items.

2.9.3 Maintenance Equipment

The vendor shall identify and describe any special purpose tests or maintenance
equipment recommended for fault isolation and diagnostic purposes.

2.9.4 Parts and Materials

Vendors shall provide detailed documentation of parts and materials needed to operate
and maintain the system. Additional requirements apply for paper-based systems.


2.9.4.1 Common Standards

The vendor shall provide a complete list of approved parts and materials needed for
maintenance. This list shall contain sufficient descriptive information to identify all
parts by:
a. Type;
b. Size;
c. Value or range;
d. Manufacturer's designation;
e. Individual quantities needed; and
f. Sources from which they may be obtained.

2.9.4.2 Paper-Based Systems

For marking devices manufactured by multiple external sources, the vendor shall
provide a listing of sources and model numbers that are compatible with the system.
The TDP shall specify the required paper stock, size, shape, opacity, color,
watermarks, field layout, orientation, size and style of printing, size and location of
punch or mark fields used for vote response fields and to identify unique ballot
formats, placement of alignment marks, ink for printing, and folding and
bleed-through limitations for the preparation of ballots that are compatible with the
system.

2.9.5 Maintenance Facilities and Support

The vendor shall identify all facilities, furnishings, fixtures, and utilities that will be
required for equipment maintenance. In addition, vendors shall specify the
assumptions made with regard to any parameters that impact the mean time to repair.
These factors shall include at a minimum:
a. Recommended number and locations of spare devices or components to be
kept on hand for repair purposes during periods of system operation;
b. Recommended number and locations of qualified maintenance personnel who
need to be available to support repair calls during system operation; and
c. Organizational affiliation (i.e., jurisdiction, vendor) of qualified maintenance
personnel.


2.9.6 Appendices

The vendor may provide descriptive material and data supplementing the various
sections of the body of the System Maintenance Manual. The content and arrangement
of appendices shall be at the discretion of the vendor. Topics recommended for
amplification or treatment in appendix form include:
a. Glossary: A listing and brief definition of all terms that may be unfamiliar to
persons not trained in either voting systems or computer maintenance;
b. References: A list of references to all vendor documents and other sources
related to maintenance of the system;
c. Detailed Examples: Detailed scenarios that outline correct system responses to
every conceivable faulty operator input. Alternative procedures may be
specified depending on the system state; and
d. Maintenance and Security Procedures: This appendix shall contain technical
illustrations and schematic representations of electronic circuits unique to the
system.

2.10 Personnel Deployment and Training Requirements

The vendor shall describe the personnel resources and training required for a
jurisdiction to operate and maintain the system.

2.10.1 Personnel

The vendor shall specify the number of personnel and skill level required to perform
each of the following functions:
a. Pre-election or election preparation functions (e.g., entering an election, race,
and candidate information; designing a ballot; generating pre-election reports);
b. System operations for voting system functions performed at the polling place;
c. System operations for voting system functions performed at the central count
facility;
d. Preventive maintenance tasks;
e. Diagnosis of faulty hardware or software;


f. Corrective maintenance tasks; and
g. Testing to verify the correction of problems.
A description shall be presented of the functions that may be carried out by user
personnel and those that must be performed by vendor personnel.

2.10.2 Training

The vendor shall specify requirements for the orientation and training of the following
personnel:
a. Poll workers supporting polling place operations;
b. System support personnel involved in election programming;
c. User system maintenance technicians;
d. Network/system administration personnel (if a network is used);
e. Data personnel; and
f. Vendor personnel.

2.11 Configuration Management Plan

Vendors shall submit a Configuration Management Plan that addresses the
configuration management requirements of Volume I, Section 8 of the Standards. This
plan shall describe all policies, processes and procedures employed by the vendor to
carry out these requirements. Information submitted by the vendor shall be used by
the test authority to assist in developing and executing the system qualification test
plan. This information is particularly important to support the design of test plans for
system modifications. A well-organized, robust and detailed Configuration
Management Plan will enable the test authority to more readily determine the nature
and scope of tests needed to fully test the modifications. The Configuration
Management Plan shall contain the sections identified below.

2.11.1 Configuration Management Policy

The vendor shall provide a description of its organizational policies for configuration
management, addressing the specific requirements of Volume I, Section 8.3 of the
Standards. These requirements pertain to:


a. Scope and nature of configuration management program activities; and
b. Breadth of application of vendor’s policy and practices to the voting system.

2.11.2 Configuration Identification

The vendor shall provide a description of the procedures and naming conventions
used to address the specific requirements of Volume I, Section 8.4. These
requirements pertain to:
a. Classifying configuration items into categories and subcategories;
b. Uniquely numbering or otherwise identifying configuration items; and
c. Naming configuration items.
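For instance, a vendor's convention might encode category, subcategory, item number, and revision in a single identifier, as in the sketch below; the pattern itself is a hypothetical example, not a requirement of Volume I, Section 8.4.

# Hypothetical configuration-item naming convention encoding category,
# subcategory, item number, and revision (e.g., "SW-TAB-014-r3").
import re

CI_PATTERN = re.compile(r"^(SW|HW|DOC)-[A-Z]{3}-\d{3}-r\d+$")

def is_valid_ci(identifier):
    """Return True if the identifier follows the naming convention."""
    return CI_PATTERN.match(identifier) is not None

assert is_valid_ci("SW-TAB-014-r3")        # tabulator software, revision 3
assert not is_valid_ci("tabulator v3")     # free-form names are rejected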

2.11.3 Baseline, Promotion, and Demotion Procedures

The vendor shall provide a description of the procedures and naming conventions
used to address the specific requirements of Volume I, Section 8.5 of the Standards.
These requirements pertain to:
a. Establishing a particular instance of a system component as the starting
baseline;
b. Promoting subsequent instances of a component to baseline throughout the
system development process for the first complete version of the system
submitted for qualification testing; and
c. Promoting subsequent instances of a component to baseline status as the
component is maintained throughout its life cycle.

2.11.4 Configuration Control Procedures

The vendor shall provide a description of the procedures used by the vendor to
approve and implement changes to a configuration item to prevent unauthorized
additions, changes, or deletions to address the specific requirements of Volume I,
Section 8.6 of the Standards. These requirements pertain to:
a. Developing and maintaining internally developed items;


b. Developing and maintaining third-party items;
c. Resolving internally identified defects; and
d. Resolving externally identified and reported defects.

2.11.5 Release Process

The vendor shall provide a description of the contents of a system release, and the
procedures and related conventions by which the vendor installs, transfers, or migrates
the system to ITAs and customers to address the specific requirements of Volume I,
Section 8.7 of the Standards. These requirements pertain to:
a. A first release of the system to an ITA;
b. A subsequent maintenance or upgrade release of a system, or particular
components, to an ITA;
c. The initial delivery and installation of the system to a customer; and
d. A subsequent maintenance or upgrade release of a system, or particular
components, to a customer.

2.11.6 Configuration Audits

The vendor shall provide a description of the procedures and related conventions for
the two audits required by Volume I, Section 8.8 of the Standards. These requirements
pertain to:
a. Physical configuration audit, which verifies the voting system components
submitted for qualification against the vendor's technical documentation; and
b. Functional configuration audit, which verifies that the system performs all the
functions described in the system documentation.

2.11.7 Configuration Management Resources

The vendor shall provide a description of the procedures and related conventions for
maintaining information about the configuration management tools required by
Volume I, Section 8.9 of the Standards. These requirements pertain to information
regarding:
a. Specific tools used, current version, and operating environment;


b. Physical location of the tools, including designation of computer directories
and files; and
c. Procedures and training materials for using the tools.

2.12 Quality Assurance Program

Vendors shall submit a Quality Assurance Program that addresses the quality
assurance requirements of Volume I, Section 7. This plan shall describe all policies,
processes and procedures employed by the vendor to ensure the overall quality of the
system for its initial development and release and for subsequent modifications and
releases. This information is particularly important to support the design of test plans
by the test authority. A well-organized, robust and detailed Quality Assurance
Program will enable the test authority to more readily determine the nature and scope
of tests needed to test the system appropriately. The Quality Assurance Program shall,
at a minimum, address the topics indicated below.

2.12.1 Quality Assurance Policy

The vendor shall provide a description of its organizational policies for quality
assurance, including:
a. Scope and nature of QA activities; and
b. Breadth of application of vendor’s policy and practices to the voting system.

2.12.2 Parts & Materials Special Tests and Examinations

The vendor shall provide a description of its practices for parts and materials tests and
examinations that meet the requirements of Volume I, Section 7.3 of the Standards.

2.12.3 Quality Conformance Inspections

The vendor shall provide a description of its practices for quality conformance
inspections that meet the requirements of Volume I, Section 7.4 of the Standards. For
each test performed, the record of tests provided shall include:
a. Test location;


b. Test date;
c. Individual who conducted the test; and
d. Test outcomes.

2.12.4 Documentation

The vendor shall provide a description of its practices for documentation of the system
and system development process that meet the requirements of Volume I, Section 7.5
of the Standards.

2.13 System Change Notes

Vendors submitting a system for testing that has been tested previously by the test
authority and issued a qualification number shall submit system change notes. These
will be used by the test authority to assist in developing and executing the test plan for
the modified system. The system change notes shall include the following
information:
a. Summary description of the nature and scope of the changes, and the reason
for each change;
b. A listing of the specific changes made, citing the specific system
configuration items changed and providing detailed references to the sections
of documentation changed;
c. The specific sections of the documentation that are changed (or complete
revised documents, if more suitable to address a large number of changes);
and
d. Documentation of the test plan and procedures executed by the vendor for
testing the individual changes and the system as a whole, and records of test
results.


Volume II, Section 3
Table of Contents

3 Functionality Testing ............................................................................................................................3-1
3.1 Scope ..........................................................................................................................................3-1
3.2 Breadth of Functionality Testing .....................................................................................................3-1
3.2.1 Basic Functionality Testing Requirements ........................................................................3-1
3.2.2 Variation of System Functionality Testing to Reflect Voting System Technologies and Configurations .....................................................................3-2
3.2.3 Variation of System Functionality Testing to Reflect Additional Voting System Capabilities ..........................................................................................3-2
3.2.4 Variation of System Functionality Testing to Reflect Voting Systems that Incorporate Previously Tested Functionality .....................................................3-3
3.3 General Test Sequence .................................................................................................................3-3
3.3.1 Functionality Testing in Parallel with Hardware Testing for Precinct Count Systems .................................................................................................3-4
3.3.2 Functionality Testing in Parallel with Hardware Testing for Central Count Systems .................................................................................................3-5
3.4 Functionality Testing for Accessibility ............................................................................................3-6
3.5 Functionality Testing for Systems that Operate on Personal Computers.......................................3-6


3 Functionality Testing

3.1 Scope

This section contains a description of the testing to be performed by the ITAs to
confirm the functional capabilities of a voting system submitted for qualification. It
describes the scope and basis for functionality testing, outlines the general sequence
of tests within the overall test process, and provides guidance on testing for
accessibility.

3.2 Breadth of Functionality Testing

In order to best complement the diversity of the voting systems industry, the
qualification testing process is not rigidly defined. Although there are basic
functionality testing requirements, additions or variations in testing are appropriate in
order to complement the system’s use of specific technologies and configurations, the
system’s capabilities, and the outcomes of previous testing.

3.2.1 Basic Functionality Testing Requirements

ITAs shall design and perform procedures to test a voting system against the
functional requirements outlined in Volume I, Section 2. Test procedures shall be
designed and performed by the ITA that address:
a. Overall system capabilities;
b. Pre-voting functions;
c. Voting functions;
d. Post-voting functions;
e. System maintenance; and


f. Transportation and storage.

The specific procedures to be used shall be identified in the Qualification Test Plan
prepared by the ITA. These procedures may replicate testing performed by the vendor
and documented in the vendor’s TDP, but shall not rely on vendor testing as a
substitute for functionality testing performed by the ITA.
Recognizing variations in system design and the technologies employed by different
vendors, the ITAs shall design test procedures that account for such variations and
reflect the system-specific functional capabilities in Volume I, Section 2.

3.2.2 Variation of System Functionality Testing to Reflect Voting System Technologies and Configurations

Voting systems are not designed according to a standard design template. Instead,
system design reflects the vendor’s selections from a variety of technologies and
design configurations. Such variation is recognized in the definitions of voting
systems in Volume I, Section 1, and serves as the basis for delineating various
functional capability requirements.
Functional capabilities will vary according to the relative complexity of a system and
the manner in which the system integrates various technologies. Therefore, the testing
procedure designed and performed by the ITA for a particular system shall reflect the
specific technologies and design configurations used by that system.

3.2.3 Variation of System Functionality Testing to Reflect Additional Voting System Capabilities

The requirements for voting system functionality provided by Volume I, Section 2
reflect a minimum set of capabilities. Vendors may, and often do, provide additional
capabilities in systems that are submitted for qualification testing in order to respond
to the requirements of individual states. These additional capabilities shall be
identified by the vendor within the TDP as described in Volume II, Section 2. Based
on this information, ITAs shall design and perform system functionality testing for
additional functional capabilities as well as the capabilities required by Volume I,
Section 2 of the Standards.


3.2.4 Variation of System Functionality Testing to Reflect Voting Systems that Incorporate Previously Tested Functionality

The required functional capabilities of voting systems defined in Volume I, Section 2
reflect a broad range of system functionality needed to support the full life cycle of an
election, including post election activities. Many systems submitted for qualification
testing are designed to address this scope, and are tested accordingly.
However, some new systems seek qualification using a combination of new
subsystems or system components interfaced with the components of a previously
qualified system. For example, a vendor can submit a voting system for qualification
testing that has a new DRE voting device, but that integrates the election management
component from a previously qualified system.
In this situation, the vendor is strongly encouraged to identify in its TDP the
functional capabilities supported by new subsystems/components and those supported
by subsystems/components taken from a previously qualified system. The vendor is
also encouraged to indicate in its system design documentation and configuration
management records the scope and nature of any modifications made to the reused
subsystems or components. Following these suggestions will assist the ITA in
developing efficient test procedures that rely in part on the results of testing of the
previously qualified subsystems or components.
In this situation the ITA may design and perform a test procedure that draws on the
results of testing performed previously on reused subsystems or components.
However, the scope of testing shall include, irrespective of previous testing, certain
functionality tests:
a. All functionality performed by new subsystems/modules;
b. All functionality performed by modified subsystems/modules;
c. Functionality that is accomplished using any interfaces to new modules, or
that shares inputs or outputs from new modules;
d. All functionality related to vote tabulation and election results reporting; and
e. All functionality related to audit trail maintenance.

3.3 General Test Sequence

There is no required sequence for performing the system qualification tests. For a
system not previously qualified, the ITA may perform tests using generic test ballots,
and schedule the tests in a convenient order, provided that prerequisite conditions for
each test have been satisfied before the test is initiated.


Regardless of the sequence of testing used, the full qualification testing process shall
include functionality testing for all system functions of a voting system, apart from the
exceptions noted in Section 3.2. Generally, in-depth functionality testing will follow
testing of the system’s hardware and the source code review of the system’s software.
ITAs will usually conduct functionality testing as an integral element of the system-level
integration testing described in Volume II, Section 6.
Some functionality tests for the voting functions defined in Volume I, Section 2.4 and
2.5 may be performed as an integral part of hardware testing, enabling a more
efficient testing process. Ballots processed and counted during hardware operating
tests for precinct count and central count systems may serve to satisfy part of the
functionality testing provided that the ballots were cast using a test procedure that is
equivalent to the procedures indicated below.

3.3.1 Functionality Testing in Parallel with Hardware Testing for Precinct Count Systems

For testing voting functions defined in Volume I, Sections 2.4 and 2.5, the following
procedures shall be performed during the functionality tests of voting equipment and
precinct counting equipment.
a. The procedure to prepare election programs shall:
1) Verify resident firmware, if any;
2) Prepare software (including firmware) to simulate all ballot format and
logic options for which the system will be used;
3) Verify program memory device content; and
4) Obtain and design test ballots with formats and voting patterns sufficient
to verify performance of the test election programs.
b. The procedures to program precinct ballot counters shall:
1) Install program and data memory devices, or verify presence if resident;
and
2) Verify operational status of hardware as in Volume II, Section 4.
c. The procedures to simulate opening of the polls shall:
1) Perform procedures required to prepare hardware for election operations;
2) Obtain "zero" printout or other evidence that data memory has been
cleared;
3) Verify audit record of pre-election operations; and
4) Perform procedure required to open the polling place and enable ballot
counting.


d. The procedure to simulate counting ballots shall cast test ballots in a number
sufficient to demonstrate proper processing, error handling, and generation of
audit data as specified in Volume I, Sections 2 and 4.
e. The procedure to simulate closing of polls shall:
1) Perform hardware operations required to disable ballot counting and close
the polls;
2) Obtain data reports and verify correctness; and
3) Obtain audit log and verify correctness.
These procedures need not be performed in the sequence listed, provided the necessary
precondition of each procedure has been met.

3.3.2 Functionality Testing in Parallel with Hardware Testing for Central Count Systems

For testing voting functions defined in Volume I, Sections 2.4 and 2.5, the following
procedures shall be performed during the functional tests.
a. The procedure to prepare election programs shall:
1) Verify resident firmware, if any;
2) Prepare software (including firmware) to simulate all ballot format and
logic options for which the system will be used, and to enable simulation
of counting ballots from at least 10 polling places or precincts;
3) Verify program memory device content; and
4) Procure test ballots with formats, voting patterns, and format
identifications sufficient to verify performance of the test election
programs;
b. The procedure to simulate counting ballots shall count test ballots in a number
sufficient to demonstrate proper processing, error handling, and generation of
audit data as specified in Volume I, Sections 2 and 4; and
c. The procedure to simulate election reports shall:
1) Obtain reports at polling places or precinct level;
2) Obtain consolidated reports;
3) Provide query access, if this is a feature of the system;
4) Verify correctness of all reports and queries; and
5) Obtain audit log and verify correctness.


These procedures need not be performed in the sequence listed, provided the necessary
preconditions of each procedure have been met.

3.4 Functionality Testing for Accessibility

As indicated in Volume I, Section 2.2.7, voting systems shall provide accessibility to
individuals with disabilities, meeting the specific requirements of this Section. ITAs
shall design and perform test procedures that verify conformance with each of these
requirements.

3.5 Functionality Testing for Systems that Operate on Personal Computers

For systems intended to use non-standard voting devices, such as a personal computer,
provided by the local jurisdiction, ITAs shall conduct functionality tests using
hardware provided by the vendor that meets the minimum configuration specifications
defined by the vendor.
Volume II, Section 4, provides additional information on hardware to be used to
conduct functionality testing of such voting devices, as well as hardware to be used to
conduct security testing and other forms of testing.


Volume II, Section 4
Table of Contents

4 Hardware Testing ...................................................................................................................4-1
4.1 Scope ...............................................................................................................................4-1
4.2 Basis of Hardware Testing ...............................................................................................4-1
4.2.1 Testing Focus and Applicability ........................................................................4-1
4.2.2 Hardware Provided by Vendor..........................................................................4-2
4.3 Test Conditions ................................................................................................................4-2
4.4 Test Log Data Requirements ...........................................................................................4-3
4.5 Test Fixtures ....................................................................................................................4-3
4.6 Non-operating Environmental Tests .................................................................................4-4
4.6.1 General .............................................................................................................4-4
4.6.1.1 Pretest Data .............................................................................................4-5
4.6.1.2 Preparation for Test..................................................................................4-5
4.6.1.3 Mechanical Inspection and Repair ...........................................................4-5
4.6.1.4 Electrical Inspection and Adjustment .......................................................4-5
4.6.1.5 Operational Status Check ........................................................................4-5
4.6.1.6 Failure Criteria..........................................................................................4-6
4.6.2 Bench Handling Test ........................................................................................4-6
4.6.2.1 Applicability ..............................................................................................4-6
4.6.2.2 Procedure.................................................................................................4-6
4.6.3 Vibration Test....................................................................................................4-7
4.6.3.1 Applicability ..............................................................................................4-7
4.6.3.2 Procedure.................................................................................................4-7
4.6.4 Low Temperature Test......................................................................................4-8
4.6.4.1 Applicability ..............................................................................................4-8
4.6.4.2 Procedure.................................................................................................4-8
4.6.5 High Temperature Test.....................................................................................4-9
4.6.5.1 Applicability ..............................................................................................4-9
4.6.5.2 Procedure.................................................................................................4-9
4.6.6 Humidity Test....................................................................................................4-9
4.6.6.1 Applicability ............................................................................................4-10
4.6.6.2 Procedure...............................................................................................4-10
4.7 Environmental Tests, Operating.....................................................................................4-11
4.7.1 Temperature and Power Variation Tests ........................................................4-11
4.7.1.1 Data Accuracy ........................................................................................4-12
4.7.2 Maintainability Test .........................................................................................4-13
4.7.3 Reliability Test ................................................................................................4-13
4.7.4 Availability Test...............................................................................................4-14
4.8 Other Environmental Tests.............................................................................................4-14
4.8.1 Power Disturbance .........................................................................................4-15
4.8.2 Electromagnetic Radiation..............................................................................4-15
4.8.3 Electrostatic Disruption ...................................................................................4-15
4.8.4 Electromagnetic Susceptibility ........................................................................4-15
4.8.5 Electrical Fast Transient .................................................................................4-15
4.8.6 Lightning Surge...............................................................................................4-15
4.8.7 Conducted RF Immunity .................................................................................4-16
4.8.8 Magnetic Fields Immunity ...............................................................................4-16


4 Hardware Testing

4.1 Scope

This section contains a description of the testing to be performed by the ITAs to
confirm the proper functioning of the hardware components of a voting system
submitted for qualification testing. It describes the scope and basis for hardware
testing, required test conditions for conducting hardware testing, guidance for the use
of test fixtures, test log data requirements, and test practices for specific non-operating
and operating environmental tests.

4.2 Basis of Hardware Testing

This section addresses the focus and applicability of hardware testing, and specifies
the vendor’s obligations to produce hardware to conduct such tests.

4.2.1 Testing Focus and Applicability

ITAs shall design and perform procedures that test the voting system hardware
requirements identified in Volume I, Section 3. Test procedures shall be designed and
performed by the ITA for both operating and non-operating environmental tests:
♦ Operating environmental tests apply to the entire system, including hardware
components that are used as part of the voting system telecommunications
capability; and
♦ Non-operating tests apply to those elements of the system that are intended
for use at poll site voting locations, such as voting machines and precinct
counters. These tests address environmental conditions that may be
encountered by the voting system hardware at the voting location itself, or
while in storage or transit to or from the poll site.


Additionally, compatibility of this equipment with the voting system environment
shall be determined through functional tests integrating the standard product with the
remainder of the system.
All hardware components custom-designed for election use shall be tested in
accordance with the applicable procedures contained in this section. Unmodified
COTS hardware will not be subject to all tests. Generally, such equipment has been
designed to rigorous industrial standards and has been in wide use, permitting an
evaluation of its performance history. To enable reduced testing of such equipment,
vendors shall provide the manufacturer’s specifications and evidence that the
equipment has been tested to the equivalent of the Standards.
The specific testing procedures to be used shall be identified in the Qualification Test
Plan prepared by the ITA. These procedures may replicate testing performed by the
vendor and documented in the vendor’s TDP, but shall not rely on vendor testing as a
substitute for hardware testing performed by the ITA.

4.2.2 Hardware Provided by Vendor

The hardware submitted for qualification testing shall be equivalent, in form and
function, to the actual production versions of the hardware units. Engineering or
developmental prototypes are not acceptable unless the vendor can show that the
equipment to be tested is equivalent to standard production units in both performance
and construction.

4.3 Test Conditions

Qualification tests may be performed in any facility capable of supporting the test
environment. Preparation for testing, arrangement of equipment, verification of
equipment status, and the execution of procedures shall be witnessed by at least one
independent, qualified observer who shall certify that all test and data acquisition
requirements have been satisfied.
When a test is to be performed at “standard” or “ambient” conditions, this requirement
shall refer to a nominal laboratory environment at prevailing atmospheric pressure and
relative humidity.
Otherwise, all tests shall be performed at the required temperature and electrical
supply voltage, regulated within the following tolerances:
a. Temperature of +/- 4 degrees F; and


b. Electrical supply voltage +/- 2 VAC.
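
For illustration only, the following C sketch (not part of the Standards; names and sample readings are hypothetical) shows how a logged test-environment reading might be checked against these tolerances:

#include <math.h>
#include <stdio.h>

/* Illustrative check of a logged reading against the tolerances above:
 * temperature +/- 4 degrees F and supply voltage +/- 2 VAC around the
 * required set points. */
static int within_tolerance(double reading, double set_point, double tolerance)
{
    return fabs(reading - set_point) <= tolerance;
}

int main(void)
{
    /* Hypothetical sample readings. */
    double temp_set = 50.0, temp_read = 52.5;   /* degrees F */
    double volt_set = 117.0, volt_read = 118.4; /* VAC */

    printf("temperature: %s\n",
           within_tolerance(temp_read, temp_set, 4.0) ? "within tolerance"
                                                      : "out of tolerance");
    printf("voltage: %s\n",
           within_tolerance(volt_read, volt_set, 2.0) ? "within tolerance"
                                                      : "out of tolerance");
    return 0;
}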

4.4 Test Log Data Requirements

The ITA shall maintain a test log of the procedure employed. This log shall identify
the system and equipment by model and serial number. Test environment conditions
shall be noted.
In the event that the ITA deems it necessary to deviate from requirements pertaining
to the test environment, the equipment arrangement and method of operation, the
specified test procedure, or the provision of test instrumentation and facilities, the
deviation shall be recorded in the test log. A discussion of the reasons for the
deviation and the effect of the deviation on the validity of the test procedure shall also
be provided.

4.5 Test Fixtures

The use of test fixtures or ancillary devices to facilitate hardware qualification testing
is encouraged. These fixtures and devices may include arrangements for automating
the operation of voting devices and the acquisition of test data.
The use of a fixture to ensure correctness in casting ballots by hand is recommended.
Such a fixture may consist of a template, with apertures in the desired locations, so that
selections may be made rapidly. Such a template will eliminate or greatly reduce
errors in activating test ballot patterns, while reducing the amount of time required to
cast a test ballot.
For systems that use a light source as a means of detecting voter selections, the
generation of a suitable optical signal by an external device is acceptable. For systems
that rely on the physical activation of a switch, a mechanical fixture with suitable
motion generators is acceptable.
To speed up the process of testing and to eliminate human error in casting test ballots,
the tests may use a simulation device with appropriate software. Such simulation is
recommended if it covers all voting data detection and control paths that are used in
casting an actual ballot. In the event that only partial simulation is achieved, an
independent method and test procedure must be used to validate the proper operation
of those portions of the system not tested by the simulator.


If the vendor provides a means of simulating the casting of ballots, the simulation
device is subject to the same performance, reliability, and quality requirements that
apply to the voting device itself so as not to contribute errors to the test processes.

4.6 Non-operating Environmental Tests

This section addresses a range of tests for voting machines and precinct counters, as
such devices are stored between elections and are transported between the storage
facility and polling site.

4.6.1 General

Environmental tests of non-operating equipment are intended to simulate exposure to
physical shock and vibration associated with handling and transportation of voting
equipment and precinct counters between a jurisdiction’s storage facility and precinct
polling site. These tests additionally simulate the temperature and humidity conditions
that may be encountered during storage in an uncontrolled warehouse environment or
precinct environment. The procedures and conditions of these tests correspond
generally to those of MIL-STD-810D, “Environmental Test Methods and Engineering
Guidelines,” 19 July 1983. In most cases, the severity of the test conditions has been
reduced to reflect commercial, rather than military, practice.
Systems designed exclusively with system-level COTS hardware whose configuration
has not been modified in any manner are not subjected to this segment of
hardware testing. Systems made up of individual COTS components, such as hard
drives, motherboards, and monitors, that have been packaged to build a voting
machine or other device will be required to undergo the hardware testing.
Prior to each test, the equipment shall be shown to be operational by means of the
procedure contained in Subsection 4.6.1.5. The equipment may then be prepared as if
for actual transportation or storage, and subjected to the appropriate test procedures
outlined in this section. After each procedure has been completed, the equipment
status will again be verified as in Subsection 4.6.1.5.
The following requirements for equipment preparation, functional tests, and
inspections shall apply to each of the non-operating test procedures.


4.6.1.1 Pretest Data

The test technician shall verify that the equipment is capable of normal operation.
Equipment identification, environmental conditions, equipment configuration, test
instrumentation, operator tasks, time-of-day or test time, and test results shall be
recorded.

4.6.1.2 Preparation for Test

The equipment shall be prepared as for the expected non-operating use, as noted
below. When preparation for transport between the storage site and the polling place
is required, the equipment shall be prepared with any protective enclosures or internal
restraints that the vendor specifies for such transport. When preparation for storage is
required, the equipment shall be prepared using any protective enclosures or internal
restraints that the vendor specifies for storage.

4.6.1.3 Mechanical Inspection and Repair

After the test has been completed, the devices shall be removed from their containers,
and any internal restraints shall be removed. The exterior and interior of the devices
shall be inspected for evidence of mechanical damage, failure, or dislocation of
internal components. Devices shall be adjusted or repaired, if necessary.

4.6.1.4 Electrical Inspection and Adjustment

After completion of the mechanical inspection and repair, routine electrical
maintenance and adjustment may be performed, according to the manufacturer's
standard procedure.

4.6.1.5 Operational Status Check

When all tests, inspections, repairs, and adjustments have been completed, normal
operation shall be verified by conducting an operational status check.
During this process, all equipment shall be operated in a manner, and under
environmental conditions, that simulate election use in order to verify the functional
status of the system. Prior to the conduct of each of the non-operating environmental
hardware tests, a supplemental test shall be made to determine that the operational
state of the equipment is within acceptable performance limits.


The following procedures shall be followed to verify the equipment status:
Step 1: Arrange the system for normal operation.
Step 2: Turn on power, and allow the system to reach recommended operating temperature.
Step 3: Perform any servicing, and make any adjustments necessary, to achieve operational status.
Step 4: Operate the equipment in all modes, demonstrating all functions and features that would be used during election operations.
Step 5: Verify that all system functions have been correctly executed.

4.6.1.6 Failure Criteria

Upon completion of each non-operating test, the system hardware shall be subject to
functional testing to verify continued operability. If any portion of the voting machine
or precinct counter hardware fails to remain fully functional, the testing will be
suspended until the failure is identified and corrected by the vendor. The system will
then be subject to a retest.

4.6.2 Bench Handling Test

The bench handling test simulates stresses faced during maintenance and repair of
voting machines and ballot counters.

4.6.2.1 Applicability

All systems and components, regardless of type, shall meet the requirements of this
test. This test is equivalent to the procedure of MIL-STD-810D, Method 516.3,
Procedure VI.

4.6.2.2 Procedure

Step 1: Place each piece of equipment on a level floor or table, as for normal operation or servicing.


Step 2: Make provision, if necessary, to restrain lateral movement of the equipment or its supports at one edge of the device. Vertical rotation about that edge shall not be restrained.
Step 3: Using that edge as a pivot, raise the opposite edge to an angle of 45 degrees, to a height of four inches above the surface, or until the point of balance has been reached, whichever occurs first.
Step 4: Release the elevated edge so that it may drop to the test surface without restraint.
Step 5: Repeat steps 3 and 4 for a total of six events.
Step 6: Repeat steps 2, 3, and 4 for the other base edges, for a total of 24 drops for each device.

4.6.3 Vibration Test

The vibration test simulates stresses faced during transport of voting machines and
ballot counters between storage locations and polling places.

4.6.3.1 Applicability

All systems and components, regardless of type, shall meet the requirements of this
test. This test is equivalent to the procedure of MIL-STD-810D, Method 514.3,
Category 1- Basic Transportation, Common Carrier.

4.6.3.2 Procedure

Step 1: Install the test item in its transit or combination case as prepared for transport.
Step 2: Attach instrumentation as required to measure the applied excitation.
Step 3: Mount the equipment on a vibration table with the axis of excitation along the vertical axis of the equipment.
Step 4: Apply excitation as shown in MIL-STD-810D, Method 514.3-1, “Basic transportation, common carrier, vertical axis”, with low frequency excitation cutoff at 10 Hz, for a period of 30 minutes.


Step 5: Repeat steps 2 and 3 for the transverse and longitudinal axes of the equipment with the excitation profiles shown in Figures 514.3-2 and 514.3-3, respectively. (Note: The total excitation period equals 90 minutes, with 30 minutes excitation along each axis.)
Step 6: Remove the test item from its transit or combination case and verify its continued operability.

4.6.4 Low Temperature Test

The low temperature test simulates stresses faced during storage of voting machines
and ballot counters.

4.6.4.1 Applicability

All systems and components, regardless of type, shall meet the requirements of this
test. This test is equivalent to the procedure of MIL-STD-810D, Method 502.2,
Procedure I-Storage. The minimum temperature shall be -4 degrees F.

4.6.4.2 Procedure

Step 1: Arrange the equipment as for storage. Install it in the test chamber.
Step 2: Lower the internal temperature of the chamber at any convenient rate, but not so rapidly as to cause condensation in the chamber, and in any case no more rapidly than 10 degrees F per minute, until an internal temperature of -4 degrees F has been reached.
Step 3: Allow the chamber temperature to stabilize. Maintain this temperature for a period of 4 hours after stabilization.
Step 4: Allow the internal temperature of the chamber to return to standard laboratory conditions, at a rate not exceeding 10 degrees F per minute.
Step 5: Allow the internal temperature of the equipment to stabilize at laboratory conditions before removing it from the chamber.
Step 6: Remove the equipment from the chamber and from its containers, and inspect the equipment for evidence of damage.
Step 7: Verify continued operability of the equipment.


4.6.5 High Temperature Test

The high temperature test simulates stresses faced during storage of voting machines
and ballot counters.

4.6.5.1 Applicability

All systems and components, regardless of type, shall meet the requirements of this
test. This test is equivalent to the procedure of MIL-STD-810D, Method 501.2,
Procedure I-Storage. The maximum temperature shall be 140 degrees F.

4.6.5.2 Procedure

Step 1: Arrange the equipment as for storage. Install it in the test chamber.
Step 2: Raise the internal temperature of the chamber at any convenient rate, but in any case no more rapidly than 10 degrees F per minute, until an internal temperature of 140 degrees F has been reached.
Step 3: Allow the chamber temperature to stabilize. Maintain this temperature for a period of 4 hours after stabilization.
Step 4: Allow the internal temperature of the chamber to return to standard laboratory conditions, at a rate not exceeding 10 degrees F per minute.
Step 5: Allow the internal temperature of the equipment to stabilize at laboratory conditions before removing it from the chamber.
Step 6: Remove the equipment from the chamber and from its containers, and inspect the equipment for evidence of damage.
Step 7: Verify continued operability of the equipment.

4.6.6 Humidity Test

The humidity test simulates stresses faced during storage of voting machines and
ballot counters.


4.6.6.1 Applicability

All systems and components regardless of type shall meet the requirements of this
test. This test is similar to the procedure of MIL-STD-810D, Method 507.2,
Procedure I-Natural Hot-Humid. It is intended to evaluate the ability of the
equipment to survive exposure to an uncontrolled temperature and humidity
environment during storage. This test lasts for ten days.

4.6.6.2 Procedure

Step 1: Arrange the equipment as for storage. Install it in the test chamber.
Step 2: Adjust the chamber conditions to those given in MIL-STD-810D, Table 507.2-I, for time 0000 of the Hot-Humid cycle (Cycle 1).
Step 3: Perform a 24-hour cycle with the time and temperature-humidity values specified in Figure 507.2-1, Cycle 1.
Step 4: Repeat Step 3 until five 24-hour cycles have been completed.
Step 5: Continue with the test, commencing with the conditions specified for time = 0000 hours.
Step 6: At any convenient time in the interval between time = 120 hours and time = 124 hours, place the equipment in an operational configuration, and perform a complete operational status check as defined in Subsection 4.6.1.5.
Step 7: If the equipment satisfactorily completes the status check, continue with the sixth 24-hour cycle.
Step 8: Perform 4 additional 24-hour cycles, terminating the test at time = 240 hours.
Step 9: Remove the equipment from the test chamber and inspect it for any evidence of damage.
Step 10: Verify continued operability of the equipment.


4.7 Environmental Tests, Operating

This section addresses a range of tests for all voting system equipment, including
equipment for both precinct count and central count systems.

4.7.1 Temperature and Power Variation Tests

This test is similar to the low temperature and high temperature tests of MIL-STD-810D,
Method 502.2 and Method 501.2, with test conditions that correspond to the
requirements of the performance standards. This procedure tests system operation
under various environmental conditions for at least 163 hours. During 48 hours of this
operating time, the device shall be in a test chamber. For the remaining hours, the
equipment shall be operated at room temperature. The system shall be powered for the
entire period of this test; the power may be disconnected only if necessary for removal
of the system from the test chamber.
Operation shall consist of ballot-counting cycles, which vary with system type. An
output report need not be generated after each counting cycle; the interval between
reports, however, should be no more than 4 hours, to keep to a practical minimum the
time between the occurrence of a failure or data error and its detection.
Test Ballots per Counting Cycle
Precinct count systems ........ 100 ballots/hour
Central count systems ......... 300 ballots/hour

The recommended pattern of votes is one chosen to facilitate visual recognition of the
reported totals; this pattern shall exercise all possible voting locations. System
features such as data quality tests, error logging, and audit reports shall be enabled
during the test.
Each operating cycle shall consist of processing the number of ballots indicated in the
preceding chart.
Step 1: Arrange the equipment in the test chamber. Connect as required and provide for power, control, and data service through the enclosure wall.
Step 2: Set the supply voltage at 117 VAC.
Step 3: Power the equipment, and perform an operational status check as in Subsection 4.6.1.5.


Step 4: Set the chamber temperature to 50 degrees F, observing precautions against thermal shock and condensation.
Step 5: Begin the 24-hour cycle.
Step 6: At T=4 hrs, lower the supply voltage to 105 VAC.
Step 7: At T=8 hrs, raise the supply voltage to 129 VAC.
Step 8: At T=11:30 hrs, return the supply voltage to 117 VAC and return the chamber temperature to lab ambient, observing precautions against thermal shock and condensation.
Step 9: At T=12:00 hrs, raise the chamber temperature to 95 degrees F.
Step 10: Repeat Steps 5 through 8, with temperature at 95 degrees F, completing at T=24 hrs.
Step 11: Set the chamber temperature at 50 degrees F as in Step 4.
Step 12: Repeat the 24-hour cycle as in Steps 5-10, completing at T=48 hrs.
Step 13: After completing the second 24-hour cycle, disconnect power from the system and remove it from the chamber if needed.
Step 14: Reconnect the system as in Step 2, and continue testing for the remaining period of operating time required until the ACCEPT/REJECT criteria of Subsection 4.7.1.1 have been met.

4.7.1.1 Data Accuracy

As indicated in Volume I, Section 3, data accuracy is defined in terms of ballot
position error rate. This rate applies to the voting functions and supporting equipment
that capture, record, store, consolidate and report the specific selections, and absence
of selections, made by the voter for each ballot position. Volume I, Section 3.2.1
identifies the specific functions to be tested.
For each processing function, the system shall achieve a target error rate of no more
than one in 10,000,000 ballot positions, with a maximum acceptable error rate in the
test process of one in 500,000 ballot positions. This error rate includes errors from any
source while testing a specific processing function and its related equipment.
This error rate is used to determine the vote position processing volume used to test
system accuracy for each function:


♦ If the system makes one error before counting 26,997 consecutive ballot
positions correctly, it will be rejected. The vendor is then required to improve
the system.
♦ If the system reads at least 1,549,703 consecutive ballot positions correctly, it
will be accepted.
♦ If the system correctly reads more than 26,997 ballot positions but less than
1,549,703 when the first error occurs, the testing will have to be continued
until another 1,576,701 consecutive ballot positions are counted without error
(a total of 3,126,404 with one error).
Volume II, Appendix C, Section C.5 provides further details of the calculation for this
testing volume.

4.7.2 Maintainability Test

The ITA shall test for maintainability based on the provisions of Volume I, Section 3,
including both physical attributes and additional attributes regarding the ease of
performing maintenance activities. These tests include:
a. Examining the physical attributes of the system to determine whether significant
impediments exist for the performance of those maintenance activities that are
to be performed by the jurisdiction. These activities shall be identified by the
vendor in the system maintenance procedures (part of the TDP); and
b. Performing activities designated as maintenance activities for the jurisdiction
in the TDP, in accordance with the instructions provided by the vendor in the
system maintenance procedures, noting any difficulties encountered.
Should significant impediments or difficulties be encountered that are not remedied by
the vendor, the ITA shall include such findings in the test results of the qualification
test report.

4.7.3 Reliability Test

The ITA shall test for reliability based on the provisions of Volume I, Section 3 for
the acceptable mean time between failures (MTBF). The MTBF shall be measured
during the conduct of other system performance tests specified in this section, and
shall be at least 163 hours. Volume II, Appendix C, Section C.4 provides further
details of the calculation for this testing period.
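
As a simple illustration (not part of the Standards), the observed MTBF is the ratio of accumulated operating time to the number of failures, which can then be compared with the 163-hour threshold:

#include <stdio.h>

/* Illustrative MTBF estimate from qualification-test logs. */
static double observed_mtbf(double operating_hours, int failures)
{
    /* With no failures the estimate is unbounded; report the hours
     * accumulated so far as a lower bound. */
    return failures > 0 ? operating_hours / failures : operating_hours;
}

int main(void)
{
    double mtbf = observed_mtbf(326.0, 2);   /* hypothetical log data */
    printf("observed MTBF = %.1f hours (%s)\n", mtbf,
           mtbf >= 163.0 ? "meets 163-hour minimum"
                         : "below 163-hour minimum");
    return 0;
}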


4.7.4 Availability Test

The ITA shall assess the adequacy of system availability based on the provisions of
Volume I, Section 3. As described in this section, availability of voting system
equipment is determined as a function of reliability, and the mean time to repair the
system in the event of failure.
Availability cannot be tested directly before the voting system is deployed in
jurisdictions, but can be modeled mathematically to predict availability for a defined
system configuration. This model shall be prepared by the vendor, and shall be
validated by the ITA.
The model shall reflect the equipment used for a typical system configuration to
perform the following system functions:
a. For all paper-based systems:
1) Recording voter selections (such as by ballot marking or punch);
2) Scanning the punches or marks on paper ballots and converting them into
digital data;
b. For all DRE systems:
1) Recording and storing the voter’s ballot selections;
c. For precinct-count systems (paper-based and DRE):
1) Consolidation of vote selection data from multiple precinct-based systems
to generate jurisdiction-wide vote counts, including storage and reporting
of the consolidated vote data; and
d. For central-count systems (paper-based and DRE):
1) Consolidation of vote selection data from multiple counting devices to
generate jurisdiction-wide vote counts, including storage and reporting of
the consolidated vote data.
The model shall demonstrate the predicted availability of the equipment that supports
each function. This demonstration shall reflect the equipment reliability, mean time to
repair and assumptions concerning equipment availability and deployment of
maintenance personnel stated by the vendor in the TDP.

4.8 Other Environmental Tests


4.8.1 Power Disturbance

The test for power disturbance shall be conducted in compliance with the test
specified in IEC 61000-4-11 (1994-06).

4.8.2 Electromagnetic Radiation

The test for electromagnetic radiation shall be conducted in compliance with the FCC
Part 15 Class B requirements by testing per ANSI C63.4.

4.8.3 Electrostatic Disruption

The test for electrostatic disruption shall be conducted in compliance with the test
specified in IEC 61000-4-2 (1995-01).

4.8.4 Electromagnetic Susceptibility

The test for electromagnetic susceptibility shall be conducted in compliance with the
test specified in IEC 61000-4-3 (1996).

4.8.5 Electrical Fast Transient

The test for electrical fast transient protection shall be conducted in compliance with
the test specified in IEC 61000-4-4 (1995-01).

4.8.6 Lightning Surge

The test for lightning surge protection shall be conducted in compliance with the test
specified in IEC 61000-4-5 (1995-02).


4.8.7 Conducted RF Immunity

The test for conducted RF immunity shall be conducted in compliance with the test
specified in IEC 61000-4-6 (1996-04).

4.8.8 Magnetic Fields Immunity

The test for AC magnetic fields immunity shall be conducted in compliance with
the test specified in IEC 61000-4-8 (1993-06).


Volume II, Section 5
Table of Contents

5 Software Testing ....................................................................................................................5-1
5.1 Scope ...............................................................................................................................5-1
5.2 Basis of Software Testing.................................................................................................5-1
5.3 Initial Review of Documentation .......................................................................................5-2
5.4 Source Code Review........................................................................................................5-2
5.4.1 Control Constructs ...............................................................................................5-3
5.4.1.1 Replacement Rule....................................................................................................5-3
5.4.1.2 Figures .....................................................................................................................5-4

5.4.2 Assessment of Coding Conventions ....................................................5-8


5 Software Testing

5.1 Scope

This section contains a description of the testing to be performed by the ITA to
confirm the proper functioning of the software components of a voting system
submitted for qualification testing. It describes the scope and basis for software
testing, the initial review of documentation to support software testing, and the review
of the voting system source code. Further testing of the voting system software is
addressed in the following sections:
a. Volume II, Section 3, for specific tests of voting system functionality; and
b. Volume II, Section 6, for testing voting system security and for testing the
operation of the voting system software together with other voting system
components.

5.2 Basis of Software Testing

ITAs shall design and perform procedures that test the voting system software
requirements identified in Volume I. All software components designed or modified
for election use shall be tested in accordance with the applicable procedures contained
in this section.
Unmodified, general purpose COTS non-voting software (e.g., operating systems,
programming language compilers, data base management systems, and Web browsers)
is not subject to the detailed examinations specified in this section. However, the ITA
shall examine such software to verify the specific version of software being used
against the design specification and to confirm that the software has not been modified.
Portions of COTS software that have been modified by the vendor in any manner are
subject to review.
Unmodified COTS software is not subject to code examination. However, source code
generated by a COTS package and embedded in software modules for compilation or
interpretation shall be provided in human readable form to the ITA. The ITA may


inspect COTS source code units to determine testing requirements or to verify the
code is unmodified.
The ITA may inspect the COTS generated software source code in preparing test
plans and may perform some minimal scanning or sampling to check for embedded code
or unauthorized changes. Otherwise, the COTS source code is not subject to the full
code review and testing. For purposes of code analysis, the COTS units shall be
treated as unexpanded macros.
Compatibility of the voting system software components or subsystems with one
another, and with other components of the voting system environment, shall be
determined through functional tests integrating the voting system software with the
remainder of the system.
The specific procedures to be used shall be identified in the Qualification Test Plan
prepared by the ITA. These procedures may replicate testing performed by the vendor
and documented in the vendor’s TDP, but shall not rely on vendor testing as a
substitute for software testing performed by the ITA.
Recognizing variations in system design and the technologies employed by different
vendors, the ITAs shall design test procedures that account for these variations.

5.3 Initial Review of Documentation

Prior to initiating the software review, the ITA shall verify that the documentation
submitted by the vendor in the TDP is sufficient to enable:
a. Review of the source code; and
b. Design and conduct of tests at every level of the software structure to
verify that the software meets the vendor's design specifications and the
requirements of the performance standards.

5.4 Source Code Review

The ITA shall compare the source code to the vendor's software design documentation
to ascertain how completely the software conforms to the vendor's specifications.
Source code inspection shall also assess the extent to which the code adheres to the
requirements in Volume I, Section 4.


5.4.1 Control Constructs

Voting system software shall use the control constructs identified in this section as
follows:
a. If the programming language used does not provide these control constructs,
the vendor shall provide them (that is, comparable control structure logic).
The constructs shall be used consistently throughout the code. No other
constructs shall be used to control program logic and execution;
b. While some programming languages do not create programs as linear
processes, stepping from an initial condition, through changes, to a
conclusion, the program components nonetheless contain procedures (such as
“methods” in object-oriented languages). Even in these programming
languages, the procedures must execute through these control constructs (or
their equivalents, as defined and provided by the vendor); and
c. Operator intervention or logic that evaluates received or stored data shall not
re-direct program control within a program routine. Program control may be
re-directed within a routine by calling subroutines, procedures, and functions,
and by interrupt service routines and exception handlers (due to abnormal
error conditions). Do-While (False) constructs and intentional exceptions
(used as GoTos) are prohibited.
Illustrations of control construct techniques are provided in Figures 4-1 through 4-6.
♦ Fig. 4-1 Sequence
♦ Fig. 4-2 If-Then-Else
♦ Fig. 4-3 Do-While
♦ Fig. 4-4 Do-Until
♦ Fig. 4-5 Case
♦ Fig. 4-6 General loop, including the special case FOR loop

5.4.1.1 Replacement Rule

In the constructs shown, any ‘process’ may be replaced by a simple statement, a
subroutine or function call, or any of the control constructs. In Fig. 4-1, for example,
“Process A” may be a simple statement and “Process B” another Sequence construct.


5.4.1.2 Figures

Control flows from “Process A” to the next in sequence, “Process B.”
Figure 4-1, “SEQUENCE”

Using the replacement rule to replace one or both of the processes in the Sequence
construct with other Sequence constructs, a large block of sequential code may be
formed. The entire chain is recognized as a Sequence construct and is sometimes
called a BLOCK construct. In many languages, a Sequence may need to be marked
with special symbols or punctuation to delimit where it starts and where it ends. For
example, a “BEGIN” and “END” may be used. This allows the scope of a Sequence
used as “Process C” in the IF-THEN-ELSE (Fig. 4-2) to be recognized as completing
the IF-THEN-ELSE rather than as part of a higher level Sequence that included the
IF-THEN-ELSE as a component.

Figure 4-2, “IF-THEN-ELSE”

In Figure 4-2, flow of control will skip a process depending on the condition of “A.”


Figure 4-3, “DO-WHILE”

In Figure 4-3, condition “A” is evaluated. If found to be true, then control is passed to
Process “B” and condition “A” is reevaluated. If condition “A” is found to be false,
then control is passed out of the loop. Note that, if B is a BLOCK, the “DO” may be
recognized as the opening symbol. A terminating symbol is needed from the language
used.

Figure 4-4, “DO-UNTIL”

Figure 4-4 is similar to a DO-WHILE, except that the test of condition A is performed
after “Process B” has executed, and the DO is performed upon a false “A” condition.
If condition “A” is true, control is passed out of the loop.


Figure 4-5, “CASE”

Control is passed to a Process based on the value of i.


Figure 4-6, “General LOOP”

Optional process A is executed. Condition B is then evaluated. If found to be false,
optional process C is executed and control is passed to process A. Condition B is then
evaluated again. If condition B is true, then control is passed out of the loop.
A special case of the GENERAL LOOP is the FOR loop. The FOR is not strictly
essential as it can be programmed as a DO-WHILE loop. The FOR loop executes on a
counter. The control FOR statement defines a counter variable or variables, a test for
ending the loop, and a standard method of changing the variable(s) on each pass such
as incrementing or decrementing. For example,
“FOR c = 0; c < 10; c + 1
DO Process A;”

The counter is initialized to zero; while the counter test (c < 10) remains true, the DO
process is executed and the counter is incremented (or decremented). Once the counter
test is false, control exits from the loop without further incrementing the counter. The
implementation of the FOR loop in many languages, however, can be error prone.
The use of the FOR loop shall include strictly enforced coding conventions to avoid
common errors such as a loop that never ends.
The GENERAL LOOP should not be used where one of the other loop structures will
serve. It too is error prone, and it may not be supported in many languages without
using GOTO-type redirections. However, if defined in the language, it may be useful
in defining some loops where the exit needs to occur in the middle. Also, in other
languages the GENERAL LOOP logic can be used to simulate the other control
constructs. Like the special case, the use of the GENERAL LOOP shall require the
strict enforcement of coding conventions to avoid problems.
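
Where the language supports it, the mid-exit structure can be written without
GOTO-type redirections; the following non-normative C sketch uses hypothetical
processes A and C and condition B:

    #include <stdio.h>

    int main(void)
    {
        int b = 0;

        for (;;)                     /* GENERAL LOOP (Fig. 4-6) */
        {
            b++;                     /* optional process A */
            if (b >= 3)              /* condition B: true exits the loop */
                break;               /* exit occurs in the middle */
            printf("pass %d\n", b);  /* optional process C */
        }
        return 0;
    }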

5.4.2 Assessment of Coding Conventions

The ITA shall test for compliance with the coding conventions specified by the
vendor. If the vendor does not identify an appropriate set of coding conventions in
accordance with the provisions of Volume I, Section 4.2.6.a, the ITA shall review the
code to ensure that it:
a. Uses uniform calling sequences. All parameters shall either be validated for
type and range on entry into each unit or the unit comments shall explicitly
identify the type and range for the reference of the programmer and tester.
Validation may be performed implicitly by the compiler or explicitly by the
programmer;
b. For C-based languages and others to which this applies, has the return
explicitly defined for callable units such as functions or procedures (i.e., they
do not drop through by default) and, in the case of functions, has the return
value explicitly assigned. Where the return is only expected to return a
successful value, the C convention of returning zero shall be used or the use
of another code justified in the comments. If an uncorrected error occurs such
that the unit must return without correctly completing its objective, a non-zero
return value shall be given even if there is no expectation of testing the return.
An exception may be made where the return value of the function has a data
range including zero;
c. Does not use macros that contain returns or pass control beyond the next
statement;
d. For those languages with unbounded arrays, provides controls to prevent
writing beyond the array, string, or buffer boundaries;
e. For those languages with pointers or which provide for specifying absolute
memory locations, provides controls that prevent the pointer or address from
being used to overwrite executable instructions or to access inappropriate
areas where vote counts or audit records are stored;
f. For those languages supporting case statements, has a default choice explicitly
defined to catch values not included in the case list;

g. Provides controls to prevent any vote counter from overflowing. Assuming
the counter size is large enough such that the value will never be reached is
not adequate;
h. Is indented consistently and clearly to indicate logical levels;
i. Excluding code generated by commercial code generators, is written in small
and easily identifiable modules, with no more than 50% of all modules
exceeding 60 lines in length, no more than 5% of all modules exceeding 120
lines in length, and no modules exceeding 240 lines in length. “Lines,” in this
context, are defined as executable statements or flow control statements with
suitable formatting and comments. The reviewer should consider the use of
formatting, such as blocking into readable units, that supports the intent of
this requirement where the module itself exceeds the limits. The vendor shall
justify any module lengths exceeding this standard;

j. Where code generators are used, the source file segments provided by the
code generators should be marked as such with comments defining the logic
invoked and, if possible, a copy of the source code provided to the ITA with
the generated source code replaced by an unexpanded macro call or its
equivalent;

k. Has no line of code exceeding 80 columns in width (including comments and
tab expansions) without justification;
l. Contains no more than one executable statement and no more than one flow
control statement for each line of source code;

m. In languages where embedded executable statements are permitted in
conditional expressions, the single embedded statement may be considered a
part of the conditional expression. Any additional executable statements
should be split out to other lines;
n. Avoids mixed-mode operations. If mixed mode usage is necessary, then all
uses shall be identified and clearly explained by comments;
o. Upon exit() at any point, presents a message to the user indicating the reason
for the exit();
p. Uses separate and consistent formats to distinguish between normal status and
error or exception messages. All messages shall be self-explanatory and shall
not require the operator to perform any look-up to interpret them, except for
error messages that require resolution by a trained technician;
q. References variables by fewer than five levels of indirection (e.g., a.b.c.d or
a[b].c->d);
r. Has functions with fewer than six levels of indented scope, counted as
follows:

    int function()
    {
        if (a == true)
        {                           /* level 1 */
            if (b == true)
            {                       /* level 2 */
                if (c == true)
                {                   /* level 3 */
                    if (d == true)
                    {               /* level 4 */
                        while (e > 0)
                        {           /* level 5 */
                            code
                        }
                    }
                }
            }
        }
    }
s. Initializes every variable upon declaration where permitted;

t. Specifies explicit comparisons in all if() and while() conditions. For
instance,
    i. if(flag)
is prohibited, and shall be written in the format
    ii. if (flag == TRUE)
for both single and multiple conditions;

u. Has all constants other than 0 and 1 defined or enumerated, or shall have a
comment which clearly explains what each constant means in the context of
its use. Where “0” and “1” have multiple meanings in the code unit, even
they should be identified. Example: “0” may be used as FALSE, initializing
a counter to zero, or as a special flag in a non-binary category.
v. Only contains the minimum implementation of the “a = b ? c : d” syntax.
Expansions such as “j=a?(b?c:d):e;” are prohibited.
w. Has all assert() statements coded such that they are absent from a production
compilation. Such coding may be implemented by #ifdefs that remove them
from, or include them in, the compilation. If asserts are implemented, the
initial program identification at setup should indicate that assert() is enabled
and active as a test version.
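
Several of these conventions can be seen together in the following non-normative C
sketch; the counter structure, limits, and error codes are hypothetical and illustrate
only items b, f, g, t, u, and w above:

    #include <assert.h>
    #include <limits.h>
    #include <stdio.h>

    #define MAX_CANDIDATES 4      /* convention u: constants defined, not bare */
    #define SUCCESS        0      /* convention b: zero return on success      */
    #define ERR_OVERFLOW   1
    #define ERR_BAD_INDEX  2

    static long counters[MAX_CANDIDATES];

    /* Add one vote to a counter; returns SUCCESS or a non-zero error code. */
    static int add_vote(int candidate)
    {
        if (candidate < 0 || candidate >= MAX_CANDIDATES)
            return ERR_BAD_INDEX;

        /* convention g: guard the counter rather than assume it
           can never overflow                                     */
        if (counters[candidate] == LONG_MAX)
            return ERR_OVERFLOW;

        counters[candidate] = counters[candidate] + 1;
        return SUCCESS;           /* convention b: explicit return value */
    }

    int main(void)
    {
        int status = add_vote(2);

        if (status != SUCCESS)    /* convention t: explicit comparison */
            printf("error: add_vote failed, code %d\n", status);

        switch (status)           /* convention f: default catches any
                                     value not in the case list        */
        {
        case SUCCESS:       printf("vote recorded\n"); break;
        case ERR_OVERFLOW:  printf("counter full\n");  break;
        case ERR_BAD_INDEX: printf("bad index\n");     break;
        default:            printf("unexpected status %d\n", status); break;
        }

        /* convention w: assert() is compiled out of production builds
           when NDEBUG is defined                                      */
        assert(counters[2] == 1);
        return 0;
    }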

Volume II, Section 6
Table of Contents

6 System Level Integration Testing
6.1 Scope
6.2 Basis of Integration Testing
6.2.1 Testing Breadth
6.2.2 System Baseline for Testing
6.2.3 Testing Volume
6.3 Testing Interfaces of System Components
6.4 Security Testing
6.4.1 Access Control
6.4.2 Data Interception and Disruption
6.5 Accessibility Testing
6.6 Physical Configuration Audit
6.7 Functional Configuration Audit

6 System Level Integration Testing

6.1 Scope

This section contains a description of the testing to be performed by the ITAs to
confirm the proper functioning of the fully integrated components of a voting system
submitted for qualification testing. It describes the scope and basis for integration
testing, testing of internal and external system interfaces, testing of security
capabilities, and the configuration audits, including the testing of system
documentation.
System-level qualification tests address the integrated operation of both hardware and
software, along with any telecommunications capabilities. The system-level
qualification tests shall include the tests (functionality, volume, stress, usability,
security, performance, and recovery) indicated in the ITAs’ Qualification Test Plan,
described in Appendix A. These tests assess the system's response to a range of both
normal and abnormal conditions initiated in an attempt to compromise the system.
These tests may be part of the audit of the system's functional attributes, or may be
conducted separately.
The system integration tests include two audits: a Physical Configuration Audit that
focuses on physical attributes of the system, and a Functional Configuration Audit
that focuses on the system’s functional attributes, including attributes that go beyond
the specific requirements of the Standards.

6.2 Basis of Integration Testing

This subsection addresses the basis for integration testing, the system baseline for
testing, and data volumes for testing.

6.2.1 Testing Breadth

ITAs shall design and perform procedures that test the voting system capabilities for
the system as a whole. These procedures follow the testing of the system's hardware
and software, and address voting system requirements defined in Volume I, Sections
2, 5, 6 and 8.
These procedures shall also address the requirements for testing system functionality
provided in Volume II, Section 3. Where practical, the ITA will perform coverage
reporting of the software branches executed in the functional testing. The selection of
the baseline test cases will follow an operational profile of the common procedures,
sequencing, and options among the shared state requirements and those that are
specifically recognized and supported by the vendor. The ITA will use the coverage
report to identify any portions of the source code that were not covered and determine:
a. The additional functional tests that are needed;
b. Where more detailed source code review is needed; or
c. Both of the above.
The specific procedures to be used shall be identified in the Qualification Test Plan
prepared by the ITA. These procedures may replicate testing performed by the vendor
and documented in the vendor’s TDP, but shall not rely on vendor testing as a
substitute for testing performed by the ITA.
Recognizing variations in system design and the technologies employed by different
vendors, the ITAs shall design test procedures that account for these variations.

6.2.2 System Baseline for Testing

The system level qualification tests are conducted using the version of the system as it
is intended to be sold by the vendor and delivered to jurisdictions. To ensure that the
system version tested is the correct version, the ITA shall witness the build of the
executable version of the system immediately prior to or as part of the physical
configuration audit. Additionally, should components of the system be modified or
replaced during the qualification testing process, the ITA shall require that the vendor
conduct a new “build” of the system to ensure that the qualified executable release of
the system is built from tested components.

6.2.3 Testing Volume

For all systems, the total number of ballots to be processed by each precinct counting
device during these tests shall reflect the maximum number of active voting positions
and the maximum number of ballot styles that the TDP claims the system can support.

6.3 Testing Interfaces of System Components

The ITA shall design and perform test procedures that test the interfaces of all system
modules and subsystems with each other against the vendor’s specifications. These
tests shall be documented in the ITA’s Qualification Test Plan, and shall include the
full range of system functionality provided by the vendor’s specifications, including
functionality that exceeds the specific requirements of the Standards.
Some voting systems may use components or subsystems from previously tested and
qualified systems, such as ballot preparation. For these scenarios, the ITA shall, at a
minimum,
a. Confirm that the version of previously approved components and subsystems
are unchanged; and
b. Test all interfaces between previously approved modules/subsystems and all
other system modules and subsystems. Where a component is expected to
interface with several different products, especially from different
manufacturers, the vendor shall provide a public data specification of files or
data objects used to exchange information.
Some systems use telecommunications capabilities as defined in Section 5. For those
systems that do use such capabilities, components that are located at the poll site or
separate vote counting site shall be tested for effective interface, accurate vote
transmission, failure detection, and failure recovery. For voting systems that use
telecommunications lines or networks that are not under the control of the vendor
(e.g., public telephone networks), the ITA shall test the interface of vendor-supplied
components with these external components for effective interface, vote transmission,
failure detection, and failure recovery.

6.4 Security Testing

The ITA shall design and perform test procedures that test the security capabilities of
the voting system against the requirements defined in Volume I, Section 6. These
procedures shall focus on the ability of the system to detect, prevent, log, and recover

from a broad range of security risks identified in Volume I, Section 6, as well as
system capabilities and safeguards claimed by the vendor in its TDP that go beyond
those risks and threats.
The range of risks tested is determined by the design of the system and potential
exposure to risk. Regardless of system design and risk profile, all systems are tested
for effective access control and physical data security.
For systems that use public telecommunications networks, including the Internet, to
transmit election management data or official election results (such as ballots or
tabulated results), the ITAs shall conduct tests to ensure that the system provides the
necessary identity-proofing, confidentiality, and integrity of transmitted data. These
tests shall be designed to confirm that the system is capable of detecting, logging,
preventing, and recovering from types of attacks known at the time the system is
submitted for qualification.
The ITA may meet these testing requirements by confirming proper implementation
of proven commercial security software. In this case, the vendor must provide the
published standards and methods used by the US Government to test and accept this
software, or it may provide references to free, publicly available publications of these
standards and methods, such as government web sites.
At its discretion, the ITA may conduct or simulate attacks on the system to confirm
the effectiveness of the system's security capabilities, employing test procedures
approved by the NASED Voting Systems Board.

6.4.1 Access Control

The ITA shall conduct tests of system capabilities and review the access control
policies and procedures submitted by the vendor to identify and verify the access
control features implemented as a function of the system. For those access control
features built in as components of the voting system, the ITA shall design tests to
confirm that these security elements work as specified.
Specific activities to be conducted by the ITA shall include:
a. A review of the vendor’s access control policies, procedures and system
capabilities to confirm that all requirements of Volume I, Section 6.2 have
been addressed completely; and
b. Specific tests designed by the ITA to verify the correct operation of all
documented access control procedures and capabilities, including tests
designed to circumvent controls provided by the vendor. These tests shall
include:

1) Performing the activities that the jurisdiction will perform in specific
accordance with the vendor’s access control policy and procedures to
create a secure system, including procedures for software (including
firmware) installation (as described in Volume I, Section 6.4); and
2) Performing tests intended to bypass or otherwise defeat the resulting
security environment. These tests shall include simulation of attempts to
physically destroy components of the voting system in order to validate
the correct operation of system redundancy and backup capabilities.
This review applies to the full scope of system functionality. It includes functionality
for defining the ballot and other pre-voting functions, as well as functions for casting
and storing votes, vote canvassing, vote reporting, and maintenance of the system’s
audit trail.

6.4.2 Data Interception and Disruption

For systems that use telecommunications to transmit official voting data, the ITA shall
review, and conduct tests of, the data interception and prevention safeguards specified
by the vendor in its TDP. The ITA shall evaluate safeguards provided by the vendor
to ensure their proper operation, including the proper response to the detection of
efforts to monitor data or otherwise compromise the system.
For systems that use public communications networks the ITA shall also review the
vendor’s documented procedures for maintaining protection against newly discovered
external threats to the telecommunications network. This review shall assess the
adequacy of such procedures in terms of:
a. Identification of new threats and their impact;
b. Development or acquisition of effective countermeasures;
c. System testing to ensure the effectiveness of the countermeasures;
d. Notification of client jurisdictions that use the system of the threat and the
actions that should be taken;
e. Distribution of new system releases or updates to current system users; and
f. Confirmation of proper installation of new system releases.

6.5 Accessibility Testing

The ITA shall design and perform procedures that test the capability of the voting
system to assist voters with disabilities. ITA test procedures shall confirm that:

a. Voting machines intended for use by voters with disabilities provide the
capabilities required by Volume I, Section 2.2.7;
b. Voting machines intended for use by voters with disabilities operate
consistent with vendor specifications and documentation; and
c. Voting machines intended for use by voters with disabilities meet all other
applicable functional requirements of Volume I, Section 2.

6.6 Physical Configuration Audit

The Physical Configuration Audit compares the voting system components submitted
for qualification to the vendor's technical documentation, and shall include the
following activities:
a. The audit shall establish a configuration baseline of the software and
hardware to be tested. It shall also confirm whether the vendor's
documentation is sufficient for the user to install, validate, operate, and
maintain the voting system. MIL-STD-1521 can be used as a guide when
conducting this audit;
b. The test agency shall examine the vendor's source code against the submitted
documentation during the Physical Configuration Audit to verify that the
software conforms to the vendor's specifications. This review shall include an
inspection of all records of the vendor's release control system. If changes
have been made to the baseline version, the test agency shall verify that the
vendor's engineering and test data are for the software version submitted for
qualification;
c. If the software is to be run on any equipment other than a COTS mainframe
data processing system, minicomputer, or microcomputer, the Physical
Configuration Audit shall also include a review of all drawings,
specifications, technical data, and test data associated with the system
hardware. This examination shall establish the system hardware baseline
associated with the software baseline;
d. To assess the adequacy of user acceptance test procedures and data, vendor
documents containing this information shall be reviewed against the system's
functional specifications. Any discrepancy or inadequacy in the vendor's plan
or data shall be resolved prior to beginning the system-level functional and
performance tests; and
e. All subsequent changes to the baseline software configuration made during
the course of qualification testing shall be subject to reexamination. All
changes to the system hardware that may produce a change in software
operation shall also be subject to reexamination.

The vendor shall provide a list of all documentation and data to be audited,
cross-referenced to the contents of the TDP. Vendor technical personnel shall be
available to assist in the performance of the Physical Configuration Audit.

6.7 Functional Configuration Audit

The Functional Configuration Audit encompasses an examination of vendor tests, and
the conduct of additional tests, to verify that the system hardware and software
perform all the functions described in the vendor's documentation submitted for the
TDP. It includes a test of system operations in the sequence in which they would
normally be performed, and shall include the following activities (MIL-STD-1521
may be used as a guide when conducting this audit.):
a. The test agency shall review the vendor's test procedures and test results to
determine if the vendor's specified functional requirements have been
adequately tested. This examination shall include an assessment of the
adequacy of the vendor's test cases and input data to exercise all system
functions, and to detect program logic and data processing errors, if any are
present; and
b. The test agency shall perform or supervise the performance of additional tests
to verify nominal system performance in all operating modes, and to verify on
a sampling basis the vendor's test data reports. If vendor developmental test
data is incomplete, the ITA shall design and conduct all appropriate module
and integrated functional tests. The functional configuration audit may be
performed in the facility either of the test agency or of the vendor, and shall
use and verify the accuracy and completeness of the System Operations,
Maintenance, and Diagnostic Testing Manuals.
The vendor shall provide a list of all documentation and data to be audited,
cross-referenced to the contents of the TDP. Vendor technical personnel shall be
available to assist in the performance of the Functional Configuration Audit.

Volume II, Section 7
Table of Contents

7 Examination of Vendor Practices for Configuration Management and Quality Assurance
7.1 Scope
7.2 Basis of Examinations
7.3 General Examinations Sequence
7.3.1 Examination of Vendor Practices in Parallel with Other Qualification Testing
7.3.2 Performance of Functional Configuration Audit as an Element of Integrated System Testing
7.4 Examination of Configuration Management Practices
7.4.1 Configuration Management Policy
7.4.2 Configuration Identification
7.4.3 Baseline, Promotion, and Demotion Procedures
7.4.4 Configuration Control Procedures
7.4.5 Release Process
7.4.6 Configuration Audits
7.4.7 Configuration Management Resources
7.5 Examination of Quality Assurance Practices
7.5.1 Quality Assurance Policy
7.5.2 Parts & Materials Special Tests and Examinations
7.5.3 Quality Conformance Inspections
7.5.4 Documentation

7 Examination of Vendor Practices for Configuration Management and Quality Assurance

7.1 Scope

This section contains a description of the examination performed by the ITAs to
confirm conformance with the requirements for configuration management and
quality assurance of voting systems. It describes the scope and basis for the
examinations, the general sequence of the examinations within the overall test
process, and provides guidance on the substantive focus of the examinations.

7.2 Basis of Examinations

ITAs shall design and perform procedures that examine documented vendor practices
for quality assurance and configuration management as addressed by Volume I,
Sections 7 and 8, and complemented by Volume II, Section 2.
The ITA shall design and perform examination procedures that address:
a. Conformance with the requirements to provide information on vendor
practices required by the Standards; and
b. Conformance of system documentation and other information provided by the
vendor with the documented practices for quality assurance and configuration
management.
The Standards do not require on-site examination of the vendor’s quality assurance
and configuration management practices during the system development process.
However, the ITAs conduct several activities while at the vendor site to witness the
system build that enable assessment of the vendor’s quality assurance and
configuration management practices and conformance with them. These include

surveys, interviews with individuals at all levels of the development team, and
examination of selected internal work products such as system change requests and
problem tracking logs.
It is recognized that examinations of vendor practices, and determinations of
conformance, entail a significant degree of professional judgement. These standards
for vendor practices identify specific areas of focus for the ITAs, while at the same
time relying on their expertise and professional judgement, as evaluated in the
certification of the ITAs.
The specific procedures used by the ITA shall be identified in the Qualification Test
Plan. Recognizing variations in vendors’ quality assurance and configuration
management practices and procedures, the ITAs shall design examination procedures
that account for these variations.

7.3 General Examinations Sequence

There is no required sequence for performing the examinations of quality assurance
and configuration management practices. No other testing within the overall
qualification testing process is dependent on the performance and results of these
examinations. However, examinations pertaining to configuration management, in
particular those pertaining to configuration identification, will generally be useful in
understanding the conventions used to define and document the components of the
system and will assist other elements of the qualification test process.

7.3.1 Examination of Vendor Practices in Parallel with Other Qualification Testing

While not required, ITAs are encouraged to initiate the examinations of quality
assurance and configuration management practices early in the overall qualification
testing sequence, and conduct them in parallel with other testing of the voting system.
Conducting these examinations in parallel is recommended to minimize the overall
duration of the qualification process.

7.3.2 Performance of Functional Configuration Audit as an Element of Integrated System Testing

As described in Volume I, Section 8, the functional configuration audit verifies that

the voting system performs all the functions described in the system documentation.
To help ensure an efficient test process, this audit shall be conducted by ITAs as an
element of integrated system testing that confirms the proper functioning of the
system as a whole. Integrated system testing is described in more detail in Volume II,
Section 6.

7.4 Examination of Configuration Management Practices

The examination of configuration management practices shall address the full scope
of requirements described in Volume I, Section 8, and the documentation
requirements described in Volume II, Section 2. In addition to confirming that all
required information has been submitted, the ITAs shall determine the vendor’s
conformance with the documented configuration management practices.

7.4.1 Configuration Management Policy

The ITAs shall examine the vendor’s documented configuration management policy
to confirm that it:
a. Addresses the full scope of the system, including components provided by
external suppliers; and
b. Addresses the full breadth of system documentation.

7.4.2 Configuration Identification

The ITAs shall examine the vendor's documented configuration identification
practices to confirm that they:
a. Describe clearly the basis for classifying configuration items into categories
and subcategories, for numbering of configuration items, and for naming of
configuration items; and
b. Describe clearly the conventions used to identify the version of the system as
a whole and the versions of any lower level elements (e.g., subsystems,
individual elements) if such lower level version designations are used.

7.4.3 Baseline, Promotion, and Demotion Procedures

The ITA shall examine the vendor’s documented baseline, promotion and demotion
procedures to confirm that they:
a. Provide a clear, controlled process that promotes components to baseline
status when specific criteria defined by the vendor are met; and
b. Provide a clear, controlled process for demoting a component from baseline
status when specific criteria defined by the vendor are met.

7.4.4 Configuration Control Procedures

The ITA shall examine the vendor’s configuration control procedures to confirm that
they:
a. Are capable of providing effective control of internally developed system
components; and
b. Are capable of providing effective control of components developed or
supplied by third parties.

7.4.5 Release Process

The ITA shall examine the vendor’s release process to confirm that it:
a. Provides clear accountability for moving forward with the release of the initial
system version and subsequent releases;
b. Provides the means for clear identification of the system version being
replaced;
c. Confirms that all required internal vendor tests and audits prior to release
have been completed successfully;
d. Confirms that each system version released to customers has been qualified
by the appropriate ITA prior to release;
e. Confirms that each system release has been received by the customer; and

f. Confirms that each system release has been installed successfully by the
customer.

7.4.6 Configuration Audits

The ITA shall examine the vendor’s configuration audit procedures to confirm that
they:
a. Are sufficiently broad in scope to address the entire system, including system
documentation;
b. Are conducted with appropriate timing to enable effective control of system
versions; and
c. Are sufficiently rigorous to confirm that all system documentation prepared
and maintained by the vendor indeed matches the actual system functionality,
design, operation and maintenance requirements.

7.4.7 Configuration Management Resources

The ITA shall examine the configuration management resource information submitted
by the vendor to determine whether sufficient information has been provided to enable
another organization to clearly identify the resources used and acquire them for use.
This examination is intended to ensure that in the event the vendor concludes business
operations, sufficient information has been provided to enable an in-depth audit of the
system should such an audit be required by election officials and/or a law enforcement
organization.

7.5 Examination of Quality Assurance Practices

The examination of quality assurance practices shall address the full scope of
requirements described in Volume I, Section 7, and the documentation requirements
described in Volume II, Section 2. The ITA shall confirm that all required information
has been submitted, and assess whether the vendor’s quality assurance program
provides for:
a. Clearly measurable quality standards;
b. An effective testing program throughout the system development life cycle;

c. Application of the quality assurance program to external providers of system
components and supplies;
d. Comprehensive monitoring of system performance in the field and diagnosis
of system failures;
e. Effective record keeping of system failures to support analysis of failure
patterns and potential causes; and
f. Effective processes for notifying customers of system failures and corrective
measures that need to be taken, and for confirming that such measures are
taken.

In addition to the general examinations described above, the ITA shall focus on the
specific elements of the vendor’s quality assurance program indicated below.

7.5.1 Quality Assurance Policy

The ITA shall examine the vendor’s quality assurance policy to confirm that it:
a. Addresses the full scope of the voting system;
b. Clearly designates a senior level individual accountable for implementation
and oversight of quality assurance activities;
c. Clearly designates the individuals, by position within the vendor’s
organization, who are to conduct each quality assurance activity; and
d. Provides procedures that determine compliance with, and correct deviations
from, the quality assurance program at a minimum annually.

7.5.2 Parts & Materials Special Tests and Examinations

The ITA shall examine the vendor’s parts and materials special tests and examinations
to confirm that they:
a. Identify appropriate criteria that are used to determine the specific system
components for which special tests are required to confirm their suitability for
use in a voting system;
b. Are designed in a manner appropriate to determine suitability; and
c. Have been conducted and documented for all applicable parts and materials.

7.5.3 Quality Conformance Inspections

The ITAs shall examine the vendor’s quality conformance plans, procedures and
inspection results to confirm that:
a. All components have been tested according to the test requirements defined
by the vendor;
b. All components have passed the requisite tests; and
c. For each test, the test documentation identifies:
1) Test location;
2) Test date;
3) Individual who conducted the test; and
4) Test outcome.

7.5.4 Documentation

The ITAs shall examine the vendor's voting system documentation to confirm that it
meets the content requirements of Volume I, Section 7.5, and Volume I, Section 2,
and is written in a manner suitable for use by purchasing jurisdictions.

Volume II, Appendix A
Table of Contents

A Qualification Test Plan
A.1 Scope
A.1.1 References
A.1.2 Terms and Abbreviations
A.2 Prequalification Tests
A.3 Materials Required for Testing
A.3.1 Software
A.3.2 Equipment
A.3.3 Test Materials
A.3.4 Deliverable Materials
A.3.5 Proprietary Data
A.4 Test Specifications
A.4.1 Hardware Configuration and Design
A.4.2 Software System Functions
A.4.3 Test Case Design
A.4.3.1 Hardware Qualitative Examination Design
A.4.3.2 Hardware Environmental Test Case Design
A.4.3.3 Software Module Test Case Design and Data
A.4.3.4 Software Functional Test Case Design
A.4.3.5 System-level Test Case Design
A.5 Test Data
A.5.1 Data Recording
A.5.2 Test Data Criteria
A.5.3 Test Data Reduction
A.6 Test Procedure and Conditions
A.6.1 Facility Requirements
A.6.2 Test Set-up
A.6.3 Test Sequence
A.6.4 Test Operations Procedures

A Qualification Test Plan

A.1 Scope

This Appendix contains a recommended outline for the Qualification Test Plan, which
is to be prepared by the test agency. The primary purpose of the test plan is to
document the test agency's development of the complete or partial qualification test. A
sample outline of a Qualification Test Plan is illustrated in Figure A-1 at the end of
this Appendix.
It is intended that the test agency use this Appendix as a guide in preparing a detailed
test plan, and that the scope and detail of the requirements for qualification be tailored
to the type of hardware, and the design and complexity of the software being tested.
Required hardware tests are defined in Section 4, whereas software and system-level
tests must be developed based on the vendor prequalification tests and information
available on the specific software's physical and functional configuration.
Prior to development of any test plan, the test agency must obtain the Technical Data
Package (TDP) from the vendor submitting the voting system for qualification. The
TDP contains information necessary to the development of a Qualification Test Plan,
such as the vendor's Hardware Specifications, Software Specifications, System
Operating Manual and System Maintenance Manual.
It is foreseen that vendors may submit some voting systems that are in use at the time
the standards are issued for partial qualification tests. It is also specified by the
standards that voting systems incorporating the vendor's software and COTS hardware
need only be submitted for software and system-level tests. Requalification of systems
with modified software or hardware is also anticipated. The test agency shall alter the
test plan outline as required by these situations.
The following sections describe the individual sections of the recommended
Qualification Test Plan.
The test agency shall include the identification, and a brief description of, the
hardware and software to be tested, and any special considerations that affect the test
design and procedure.

A.1.1 References

The test agency shall list all documents that contain material used in preparing the test
plan. This list shall include specific reference to applicable portions of the standards,
and to the vendor's TDP.

A.1.2 Terms and Abbreviations

The test agency shall list and define all terms and phrases relevant to the hardware, the
software, or the test plan.

A.2 Prequalification Tests

The test agency shall evaluate vendor tests, or other agency tests, in determining the
scope of testing required for system qualification. Prequalification test activities may
be particularly useful in designing software functional test cases and tests of system
security.
The ITA shall summarize prequalification test results that support the discussion of
the preceding section.

A.3 Materials Required for Testing

The following materials must be presented to the ITA in order to facilitate testing of
the voting system:
♦ Software;
♦ Equipment;
♦ Test materials;
♦ Deliverable materials; and
♦ Proprietary Data.

A.3.1 Software

The ITA shall list all software required for the performance of hardware, software,
telecommunications, security and integrated system tests. If the test environment
requires supporting software such as operating systems, compilers, assemblers, or
database managers, then this software shall also be listed.

A.3.2 Equipment

The ITA shall list all equipment required for the performance of the hardware,
software, telecommunications, security and integrated system tests. This list shall
include system hardware, general purpose data processing and communications
equipment, and test instrumentation, as required.

A.3.3 Test Materials

The ITA shall list all test materials required in the performance of the test including,
as applicable, test ballot layout and generation materials, test ballot sheets, test ballot
cards and control cards, standard and optional output data report formats, and any
other materials used to simulate preparation for and conduct of elections.

A.3.4 Deliverable Materials

The ITA shall list all documents and materials to be delivered as a part of the system,
such as:
♦ Hardware specification;
♦ Software specification;
♦ Voter, operator, and hardware and software maintenance manuals;
♦ Program listings, facsimile ballots, tapes; and
♦ Sample output report formats.

A.3.5 Proprietary Data

The ITA shall list and describe all documentation and data that are the private
property of the vendor, and hence are subject to restrictions with respect to ITA use,
release, or disclosure.

A.4 Test Specifications

The ITA shall cite the pertinent hardware qualitative examinations and quantitative
tests that follow from Volume I, Sections 3 and 9. The ITA shall also describe the
specific test requirements that follow from the design of the software and
telecommunications capabilities under test.
The qualification test shall include ITA consideration of hardware, software, and
telecommunications design, and ITA development and conduct of all tests to
demonstrate satisfactory performance. Environmental, non-operating tests shall be
performed in the categories of simulated environmental conditions specified by the
vendor or user requesting the tests. Environmental operating tests shall be performed
under varying temperatures. Other functional tests shall be conducted in an
environment that simulates, as nearly as possible, the intended use environment.
Test hardware and software shall be identical to that designed to be used together in
the voting system, except that software intended for use with general-purpose
off-the-shelf hardware may be tested using any equivalent equipment capable of
supporting its operation and functions.

A.4.1 Hardware Configuration and Design

The ITA shall document the hardware configuration and design in detail sufficient to
identify the specific equipment being tested. This document shall provide a basis for
the specific test design and include a brief description of the intended use of the
hardware.

A.4.2 Software System Functions

The ITA shall describe the software functions in sufficient detail to provide a
foundation for selecting the test case designs and conditions contained in Subsections

A.4.3.3, A.4.3.4, and A.4.3.5, below. On the basis of this test case design, the ITA
shall prepare a table delineating software functions and how each shall be tested.

A.4.3 Test Case Design

The ITA shall examine the test case design of the following aspects of the voting
system:
♦ Hardware Qualitative Examination Design;
♦ Hardware Environmental Test Case Design;
♦ Software Module Test Case Design and Data;
♦ Software Functional Test Case Design; and
♦ System-level Test Case Design.

A.4.3.1 Hardware Qualitative Examination Design

The ITA shall review the results, submitted by the vendor, of any previous
examinations of the equipment to be tested. The results of these examinations shall be
compared to the performance characteristics specified by Section 2 of the standards
concerning the requirements for:
♦ Overall system capabilities;
♦ Pre-voting functions;
♦ Voting functions; and
♦ Post-voting functions.
In the event that a review of the results of previous examinations indicates problem
areas, the test agency shall provide a description of further examinations required
prior to conducting the environmental and system-level tests. If no previous
examinations have been performed, or records of these tests are not available, the test
agency shall specify the appropriate tests to be used in the examination.

A.4.3.2 Hardware Environmental Test Case Design

The ITA shall review the documentation, submitted by the vendor, of the results and
design of any previous environmental tests of the equipment submitted for testing.
The test design and results shall be compared to the qualification tests described in

Volume I, Section 9 of the standards. The test agency shall cite any additional tests
required, based on this review and those tests requested by the vendor or the state. The
test agency shall also cite any environmental tests of Section 9 that are not to be
conducted, and note the reasons why.
For complete qualification, environmental tests shall include the following tests,
depending upon the design and intended use of the hardware.
a. Non-operating tests, including the:
1) Bench handling test;
2) Vibration test;
3) Low temperature test;
4) High temperature test; and
5) Humidity test; and
b. Operating tests involving a series of procedures that test system reliability and
accuracy under various temperatures and voltages relevant to election use.

A.4.3.3 Software Module Test Case Design and Data

The test agency shall review the vendor's program analysis, documentation, and, if
available, module test case design. The test agency shall evaluate the test cases for
each module, with respect to flow control parameters and data on both entry and exit.
All discrepancies between the Software Specifications and the test case design shall be
corrected by the vendor prior to initiation of the qualification test.
If the vendor's module test case design does not provide conclusive coverage of all
program paths, then the test agency shall perform an independent analysis to assess
the frequency and consequence of error of the untested paths. The ITA shall design
additional module test cases, as required, to provide coverage of all modules
containing untested paths with potential for untrapped errors.
The test agency shall also review the vendor's module test data in order to verify that
the requirements of the Software Specifications have been demonstrated by the data.
In the event that the vendor's module test data are insufficient, the test agency shall
provide a description of additional module tests, prerequisite to the initiation of
functional tests.

A.4.3.4 Software Functional Test Case Design

The test agency shall review the vendor's test plans and data to verify that the
individual performance requirements described in Volume II, Section 2, Subsection
2.5.3.5, are reflected in the software.
As a part of this process, the test agency shall review the vendor's functional test case
designs. The test agency shall prepare a detailed matrix of system functions and the
test cases that exercise them. The test agency shall also prepare a test procedure
describing all test ballots, operator procedures, and the data content of output reports.
Abnormal input data and operator actions shall be defined. Test cases shall also be
designed to verify that the system is able to handle and recover from these abnormal
conditions.
The vendor's test case design may be evaluated by any standard or special method
appropriate; however, emphasis shall be placed on those functions where the vendor
data on module development reflects significant debugging problems, and on
functional tests that resulted in disproportionately high error rates.
The test agency shall define ACCEPT/REJECT criteria for qualification using the
Software Specifications and, if the software runs on special hardware, the associated
Hardware Specifications to determine acceptable ranges of performance.
The test agency shall describe the functional tests to be performed. Depending upon
the design and intended use of the voting system, all or part of the functions listed
below shall be tested.
a. Ballot preparation subsystem;
b. Test operations performed prior to, during, and after processing of ballots,
including:
1) Logic tests to verify interpretation of ballot styles, and recognition of
precincts to be processed;
2) Accuracy tests to verify ballot reading accuracy;
3) Status tests to verify equipment statement and memory contents;
4) Report generation to produce test output data; and
5) Report generation to produce audit data records;
c. Procedures applicable to equipment used in the polling place for:
1) Opening the polling place and enabling the acceptance of ballots;
2) Maintaining a count of processed ballots;
3) Monitoring equipment status;
4) Verifying equipment response to operator input commands;
5) Generating real-time audit messages;
6) Closing the polling place and disabling the acceptance of ballots;
7) Generating election data reports;
8) Transfer of ballot counting equipment, or a detachable memory module,
to a central counting location; and
9) Electronic transmission of election data to a central counting location; and
d. Procedures applicable to equipment used in a central counting place:
1) Initiating the processing of a ballot deck or PMD for one or more precincts;
2) Monitoring equipment status;
3) Verifying equipment response to operator input commands;
4) Verifying interaction with peripheral equipment, or other data processing
systems;
5) Generating real-time audit messages;
6) Generating precinct-level election data reports;
7) Generating summary election data reports;
8) Transfer of a detachable memory module to other processing equipment;
9) Electronic transmission of data to other processing equipment; and
10) Producing output data for interrogation by external display devices.

A.4.3.5 System-level Test Case Design

The test agency shall provide a description of system tests of both the software and
hardware. For software, these tests shall be designed according to the stated design
objectives, without consideration of the functional specification. The test agency shall
independently prepare the system test cases to assess the response of the hardware and
software to a range of conditions, such as:
♦ Volume tests: These tests investigate the system's response to processing more
than the expected number of ballots/voters per precinct, to processing more
than the expected number of precincts, or to any other similar conditions that
tend to overload the system's capacity to process, store, and report data;
♦ Stress tests: These tests investigate the system's response to transient overload
conditions. Polling place devices shall be subjected to ballot processing at the
high volume rates at which the equipment can be operated to evaluate
software response to hardware-generated interrupts and wait states. Central
counting systems shall be subjected to similar overloads, including, for

systems that support more than one card reader, continuous processing
through all readers simultaneously;
♦ Usability tests: These tests are designed to exercise characteristics of the
software such as response to input control or text syntax errors, error message
content, audit message content, and other features contained in the software
design objectives but not directly related to a functional specification;
♦ Accessibility tests: These tests are designed to exercise system capabilities and
features intended for use by voters with disabilities in accordance with
Volume I, Section 2.2.5;
♦ Security tests: These tests are designed to defeat the security provisions of the
system, including modification or disruption of pre-voting, voting, and
post-voting processing; unauthorized access to, deletion of, or modification of
data, including audit trail data; and modification or elimination of security
mechanisms;
♦ Performance tests: These tests verify accuracy, processing rate, ballot format
handling capability, and other performance attributes claimed by the vendor;
and
♦ Recovery tests: These tests verify the ability of the system to recover from
hardware and data errors.
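The categories above lend themselves to a simple tabular representation when the test
agency prepares its independent test cases. The following is a minimal sketch in
Python; the category names mirror this section, but the field names and the sample
cases are illustrative assumptions, not requirements of the Standards.

    from dataclasses import dataclass
    from enum import Enum

    class Category(Enum):
        # One value per system-level test type named in this section
        VOLUME = "volume"
        STRESS = "stress"
        USABILITY = "usability"
        ACCESSIBILITY = "accessibility"
        SECURITY = "security"
        PERFORMANCE = "performance"
        RECOVERY = "recovery"

    @dataclass
    class SystemTestCase:
        category: Category
        condition: str   # the condition the system is driven into
        expected: str    # the response required for the case to pass

    # Hypothetical examples of independently prepared cases:
    cases = [
        SystemTestCase(Category.VOLUME,
                       "process more ballots per precinct than the stated capacity",
                       "all ballots processed; stored and reported data intact"),
        SystemTestCase(Category.RECOVERY,
                       "interrupt power during ballot processing",
                       "system recovers without loss or corruption of vote data"),
    ]
    for c in cases:
        print(f"[{c.category.value}] {c.condition} -> {c.expected}")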

A.5 Test Data

A.5.1 Data Recording

The test agency shall identify all data recording requirements (e.g., what is to be
measured, how tests and results are to be recorded). The test agency shall also design
or approve the design of forms or other recording media to be employed. The test
agency shall supply any special instrumentation (such as a pulse measuring device)
needed to satisfy the data requirements.

A.5.2 Test Data Criteria

The test agency shall describe the criteria against which test results will be evaluated,
such as the following:


♦ Tolerances: These criteria define the acceptable range for system
performance. These tolerances shall be derived from the applicable hardware
performance requirements contained in Volume I, Section 3, Hardware
Standards.
♦ Samples: These criteria define the minimum number of combinations or
alternatives of input and output conditions that can be exercised to constitute
an acceptable test of the parameters involved.
♦ Events: These criteria define the maximum number of interrupts, halts or other
system breaks that may occur due to nontest conditions. This count shall not
include events from which recovery occurs automatically or where a relevant
status message is displayed.

A.5.3 Test Data Reduction

The test agency shall describe the techniques to be used for processing test data. These
techniques may include manual, semi-automatic, or fully automatic reduction
procedures. However, semi-automatic and automatic procedures shall have been
shown to be capable of handling the test data accurately and properly. They shall also
produce an item-by-item comparison of the data and the embedded acceptance criteria
as output.
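As an illustration of fully automatic reduction, the sketch below compares each
recorded measurement against its acceptance range and emits the item-by-item
comparison as output. The field names and the sample criterion are hypothetical;
they are not drawn from the Standards.

    from dataclasses import dataclass

    @dataclass
    class Criterion:
        name: str     # what was measured
        low: float    # lower acceptance bound
        high: float   # upper acceptance bound

    def reduce_test_data(measurements: dict[str, float],
                         criteria: list[Criterion]) -> list[str]:
        """Produce an item-by-item comparison of data vs. acceptance criteria."""
        report = []
        for c in criteria:
            value = measurements[c.name]
            verdict = "PASS" if c.low <= value <= c.high else "FAIL"
            report.append(f"{c.name}: measured={value} "
                          f"allowed=[{c.low}, {c.high}] -> {verdict}")
        return report

    if __name__ == "__main__":
        criteria = [Criterion("ballots_read_per_hour", 280.0, 400.0)]
        for line in reduce_test_data({"ballots_read_per_hour": 312.0}, criteria):
            print(line)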

A.6 Test Procedure and Conditions

The test agency shall describe the test conditions and procedures for performing the
tests. If tests are not to be performed in random order, this section shall contain the
rationale for the required sequence, and the criteria that must be met before the
sequence can be continued. This section shall also describe the procedure for setting
up the equipment in which the software will be tested, for system initialization, and
for performing the tests. Each of the following sections that contain a description of a
test procedure shall also contain a statement of the criteria by which readiness and
successful completion shall be indicated and measured.

A.6.1 Facility Requirements

The test agency shall describe the space, equipment, instrumentation, utilities,
manpower, and other resources required to support the test program.


A.6.2 Test Set-up

The test agency shall describe the procedure for arranging and connecting the system
hardware with the supporting hardware and telecommunications equipment, if
applicable. It shall also describe the procedure required to initialize the system, and to
verify that it is ready to be tested.

A.6.3 Test Sequence

The test agency shall state any restrictions on the grouping or sequence of tests in this
section.

A.6.4 Test Operations Procedures

The test agency shall provide the step-by-step procedures for each test case to be
conducted. Each step shall be assigned a test step number and this number, along with
critical test data and test procedures information, shall be tabulated onto a test report
form for test control and the recording of test results.
In this section, the test agency shall also identify all test operations personnel, and
their respective duties. In the event that the operator procedure is not defined in the
vendor's operations or user manual, the test agency shall also provide a description of
the procedures to be followed by the test personnel.

A-11

Volume II – Appendix A
Qualification Testing Plan

Figure A-1

Test Plan Outline

1      Introduction
1.1    References
1.2    Terms and Abbreviations

2      Prequalification Tests
2.1    Prequalification Test Activity
2.2    Prequalification Test Results

3      Materials Required for Testing
3.1    Software
3.2    Equipment
3.3    Test Materials
3.4    Deliverable Materials
3.5    Proprietary Data

4      Test Specification
4.1    Requirements
4.2    Hardware Configuration and Design
4.3    Software System Functions
4.4    Test Case Design
4.4.1  Hardware Qualitative Examination Design
4.4.2  Hardware Environmental Test Case Design
4.4.3  Software Module Test Case Design and Data
4.4.4  Software Functional Test Case Design and Data
4.4.5  System-level Test Case Design

5      Test Data
5.1    Data Recording
5.2    Test Data Criteria
5.3    Test Data Reduction

6      Test Procedure and Conditions
6.1    Facility Requirements
6.2    Test Set-up
6.3    Test Sequence
6.4    Test Operations Procedures


Volume II, Appendix B
Table of Contents

B Qualification Test Report .................................................................................................................... B-1
B.1 Scope ............................................................................................................................................ B-1
B.1.1 New Voting System Qualification Test Report ............................................................ B-1
B.1.2 Changes to Previously Qualified Voting System Qualification Test Report ................ B-1

B.2 Qualification Test Background ...................................................................................................... B-2
B.3 System Identification ..................................................................................................................... B-2
B.4 System Overview .......................................................................................................................... B-2
B.5 Qualification Test Results and Recommendation ......................................................................... B-3
B.6 Appendix - Test Operations and Findings..................................................................................... B-3
B.7 Appendix - Test Data Analysis ...................................................................................................... B-4


B Qualification Test Report

B.1 Scope

This Appendix contains a recommended outline for the Qualification Test Report to
be prepared by the test agency. The test report shall be organized so as to facilitate the
presentation of conclusions and recommendations regarding system acceptability, a
summary of the test operations, a summary of the test results, the test data records, and
the analyses that support the conclusions and recommendations. The content of the
report may vary based on the scope of review conducted.

B.1.1 New Voting System Qualification Test Report

A full report is prepared for the initial qualification testing of a voting system. This
document consists of five main sections: Introduction, Qualification Test Background,
System Identification, System Overview, and Qualification Test Results.
Detailed information about the test operations and findings, and about the test data, is
included as appendices to the report.
Sections B.2 through B.7 describe the contents of the individual sections of this
report.

B.1.2 Changes to Previously Qualified Voting System Qualification Test Report

This report addresses a wide range of scenarios. After a preliminary review of the
submitted changes, the test agency may determine that:
a. A review of all change documentation against the baseline materials was
sufficient for recommendation for qualification; or


b. All changes must be retested against the previously qualified baseline; or
c. The scope of the changes is substantial enough that a complete retest of
the software is required.
The format of this report varies based on the type of review that was performed. If
only a review of change documentation against the baseline materials was performed,
the report is quite simple. It consists of an Introduction, a Version Description, the
Testing Approach, and a Results Summary. A more extensive report is prepared for
changes that have extensive impact on the system design and/or operations.

B.2 Qualification Test Background

This section contains the following information:
a. General information about the qualification test process; and
b. A list and definition of all terms and nomenclature peculiar to the hardware,
the software, or the test report.

B.3 System Identification

This section gives information about the tested software and supporting hardware,
including:
a. System name and major subsystems (or equivalent);
b. System Version;
c. Test Support Hardware; and
d. Specific documentation provided in the vendor's TDP used to support testing.

B.4 System Overview

This section describes the voting system in terms of its overall design structure,
technologies used, processing capacity claimed by the vendor for system components
(such as ballot counters, voting machines, vote consolidation equipment) and mode of
operation. It may also identify other products that interface with the voting system.


B.5 Qualification Test Results and Recommendation

This section provides a summary of the results of the testing process, and indicates
any special considerations that affect the conclusions derived from the test results.
This summary includes:
a. The acceptability of the system design and construction based on the
performance of the system hardware, software and communications, and on
the source code inspection;
b. The degree to which the hardware and software meet the vendor's
specifications and the standards, and the acceptability of the vendor's
technical and user documentation;
c. General findings on the maintainability of the system including, where
applicable, notation of specific maintenance activities that are determined to
be difficult to perform;
d. Identification and description of any deficiencies that remain uncorrected after
completion of the qualification test and that have caused or are judged to be
capable of causing the loss or corruption of voting data, providing sufficient
detail to support a recommendation to reject the system being tested.
(Similarly, any deficiency in compliance with the security, accuracy, data
retention, and audit requirements is fully described); and
e. A specific recommendation to the NASED ITA Committee for approval or
rejection.
Of note, any uncorrected deficiency that does not involve the loss or corruption of
voting data shall not necessarily be cause for rejection. Deficiencies of this type may
include failure to fully achieve the levels of performance specified in Volume I,
Sections 3 and 4 of the Standards, or failure to fully implement formal programs for
quality assurance and configuration management described in Volume I, Sections 7
and 8. The nature of the deficiency is described in detail sufficient to support the
recommendation either to accept or to reject the system, and the recommendation is
based on consideration of the probable effect the deficiency will have on safe and
efficient system operation during all phases of election use.

B.6 Appendix - Test Operations and Findings

This appendix provides additional detail to enable understanding of the test results
and the recommendation. This information is organized in a
manner that reflects the Qualification Test Plan. Summaries of the results of hardware
examinations, operating and non-operating hardware tests, software module tests,
software function tests, and system-level tests (including security and
telecommunications tests, and the results of the Physical and Functional Configuration
Audits) are provided.

B.7 Appendix - Test Data Analysis

This appendix provides summary records of the test data and the details of the
analysis. The analysis includes a comparison of the vendor's hardware and software
specifications to the test data, together with any mathematical or statistical procedure
used for data reduction and processing.


Volume II, Appendix C
Table of Contents

C Appendix C: Qualification Test Design Criteria.................................................................. C-1
C.1 Scope ........................................................................................................................... C-1
C.2 Approach to Test Design............................................................................................................... C-1
C.3 Probability Ratio Sequential Test (PRST) ..................................................................................... C-2
C.4 Time-based Failure Testing Criteria.............................................................................................. C-3
C.5 Accuracy Testing Criteria ............................................................................................. C-6


C Appendix C: Qualification Test Design Criteria

C.1 Scope

This appendix describes the guiding principles used to design the voting system
qualification testing process conducted by ITAs.
Qualification tests are designed to demonstrate that the system meets or exceeds the
requirements of the Standards. The tests are also used to demonstrate compliance with
other levels of performance claimed by the manufacturer.
Qualification tests must satisfy two separate and possibly conflicting sets of
considerations. The first is the need to produce enough test data to provide confidence
in the validity of the test and its apparent outcome. The second is the need to achieve a
meaningful test at a reasonable cost, and cost varies with the difficulty of simulating
expected real-world operating conditions and with test duration. It is the test
designer's job to achieve an acceptable balance of these constraints.
The rationale and statistical methods of the test designs contained in the Standards are
discussed below. Technical descriptions of their design can be found in any of several
books on testing and statistical analysis.

C.2 Approach to Test Design

The qualification tests specified in the Standards are primarily concerned with
assessing the magnitude of random errors. They are also, however, capable of
detecting bias errors that would result in the rejection of the system.
Test data typically produce two results. The first is an estimate of the true value of
some system attribute such as speed, error rate, etc. The second is the degree of
certainty that the estimate is a correct one. The estimate of an attribute's value may or
may not be greatly affected by the duration of the test. Test duration, however, is very
important to the degree of certainty; as the length of the test increases, the level of
uncertainty decreases. An efficient test design will produce enough data over a
sufficient period of time to enable an estimate at the desired level of confidence.
There are several ways to design tests. One approach involves the preselection of
some test parameter, such as the number of failures or other detectable factor. The
essential element of this type of design is that the number of observations is
independent of their results. The test may be designed to terminate after 1,000 hours
or 10 days, or when 5 failures have been observed. The number of failures is
important because the confidence interval (uncertainty band) decreases rapidly as the
number of failures increases. However, if the system is highly reliable or very
accurate, the length of time required to produce a predetermined number of failures or
errors using this method may be unachievably long.
Another approach is to determine that the actual value of some attribute need not be
learned by testing, provided that the value can be shown to be better than some level.
The test would not be designed to produce an estimate of the true value of the attribute
but instead to show, for example, that reliability is at least 123 hours or the error rate
is no greater than one in ten million characters.
The latter design approach, which was chosen for the Standards, uses what is called
Sequential Analysis. Instead of the test duration being fixed, it varies depending on
the outcome of a series of observations. The test is terminated as soon as a statistically
valid decision can be reached that the factor being tested is at least as good as or no
worse than the predetermined target value. A sequential analysis test design called the
"Wald Probability Ratio Test" is used for reliability and accuracy testing.

C.3 Probability Ratio Sequential Test (PRST)

The design of a Probability Ratio Sequential Test (PRST) requires that four
parameters be specified:
H0, the null hypothesis
H1, the alternate hypothesis
α, the Producer's risk
β, the Consumer's risk
The Standards anticipate using the PRST for testing both time-based and event-based
failures.
This test design provides decision criteria for accepting or rejecting one of two test
hypotheses: the null hypothesis, which is the Nominal Specification Value (NSV), or
the alternate hypothesis, which is the MAV. The MAV could be either the Minimum
Acceptable Value or the Maximum Acceptable Value depending upon what is being
tested. (Performance may be specified by means of a single value or by two values.
When a single value is specified, it shall be interpreted as an upper or lower single-
sided 90 percent confidence limit. When two values are specified, they shall be
interpreted as a two-sided 90 percent confidence interval, consisting of the NSV and
the MAV.)
In the case of Mean Time Between Failure (MTBF), for example, the null hypothesis
is that the true MTBF is at least as great as the desired value (NSV), while the
alternate hypothesis is that the true value of the MTBF is less than some lower value
(Minimum Acceptable Value). In the case of error rate, the null hypothesis is that the
true error rate is less than some very small desired value (NSV), while the alternate
hypothesis is that the true error rate is greater than some larger value that is the upper
limit for acceptable error (Maximum Acceptable Value).

C.4 Time-based Failure Testing Criteria

An equivalence between a number of events and a time period can be established
when the operating scenarios of a system can be determined with precision. Many of
the performance test criteria of Volume II, Section 4, Hardware Testing, use
this equivalence.
System acceptance or rejection can be determined by observing the number of
relevant failures that occur during equipment operation. The probability ratio for this
test is derived from the Exponential probability distribution. This distribution implies
a constant hazard rate. Therefore, two or more systems may be tested simultaneously
to accumulate the required number of test hours, and the validity of the data is not
affected by the number of operating hours on a particular unit of equipment. However,
for environmental operating hardware tests, no unit shall be subjected to fewer than two
complete 24-hour test cycles in a test chamber as required by Volume II, Subsection
4.7.2. of the Standards.
In this case, the null hypothesis is that the Mean Time Between Failure (MTBF), as
defined in Subsection 3.4.3 of the Standards, is at least as great as some value, here
the Nominal Specification Value. The alternate hypothesis is that the MTBF is no
better than some value, here the Minimum Acceptable Value.
For example, a typical system operations scenario for environmental operating
hardware tests will consist of approximately 45 hours of equipment operation. Broken
down, this time allotment involves 30 hours of equipment set-up and readiness testing
and 15 hours of elections operations. If the Minimum Acceptable Value is defined as
45 hours, and a test discrimination ratio of 3 is used (in order to produce an acceptably
short expected time of decision), then the Nominal Specification Value equals 135
hours.
With a value of decision risk equal to 10 percent, there is no more than a 10 percent
chance that a system would be rejected when, in fact, with a true MTBF of at least 135
hours, the system would be acceptable. It also means that there is no more than a 10
percent chance that a system would be accepted with a true MTBF lower than 45
hours when it should have been rejected.
Therefore,
H0: MTBF = 135 hours
H1: MTBF = 45 hours
α = 0.10
β = 0.10

and the minimum time to accept (on zero failures) is 163 hours.
It follows, then, that the test is terminated and an ACCEPT decision is reached when
the cumulative number of equipment hours in the second column of the following
table has been reached, and the number of failures is equal to or less than the number
shown in the first column. The test is terminated and a REJECT decision is reached
when the number of failures occurs in less than the number of hours specified in the
third column. In the event that no decision has been reached by the times shown in the
last table entries, the test is terminated, and the decision is declared as indicated.
Number of     Accept if Time     Reject if Time
Failures      Greater Than       Less Than

0             163                Continue test
1             245                Continue test
2             327                Continue test
3             409 (1)            82
4             1635               245 (2)

(1) Terminate and ACCEPT
(2) Terminate and REJECT
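For reference, classical (untruncated) Wald boundaries for such a test can be computed
directly from H0, H1, α, and β. The sketch below shows that textbook construction in
Python using the example values above. The raw boundary times it prints (about 148
hours to accept on zero failures, increasing by roughly 74 hours per failure) are close
to, but not identical with, the tabulated values, which come from a truncated test plan.

    import math

    def mtbf_prst_boundaries(nsv, mav, alpha, beta, max_failures):
        """Untruncated Wald boundaries for a time-based MTBF test.

        For each failure count r, yields the operating time above which
        the test accepts and the time below which it rejects.
        """
        ln_a = math.log((1 - beta) / alpha)    # reject H0 when the LLR reaches this
        ln_b = math.log(beta / (1 - alpha))    # accept H0 when the LLR falls to this
        slope = 1.0 / mav - 1.0 / nsv          # LLR change per operating hour
        step = math.log(nsv / mav)             # LLR jump per observed failure
        for r in range(max_failures + 1):
            accept_t = (r * step - ln_b) / slope
            reject_t = max((r * step - ln_a) / slope, 0.0)
            yield r, accept_t, reject_t

    for r, acc, rej in mtbf_prst_boundaries(135.0, 45.0, 0.10, 0.10, 4):
        rej_txt = f"reject if T < {rej:.0f} h" if rej > 0 else "continue test"
        print(f"{r} failures: accept if T > {acc:.0f} h; {rej_txt}")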
The ACCEPT/REJECT criteria of this time-based test accommodate the inclusion of
partial failures in the following manner. A graph is drawn, consisting of two parallel
lines through the sets of numbers of failures and time values shown in the table. These
lines are plotted against the total number of failures on the vertical axis, and the
elapsed time on the horizontal axis. They become "ACCEPT" and "REJECT"
boundaries. As an illustration, Figure C-1 below has been constructed using the values
from the previous table.


Figure C-1  (Graph of the parallel ACCEPT and REJECT boundary lines: cumulative failure score versus elapsed operating time, constructed from the table values.)

As operating time is accrued, the horizontal line is extended from the origin to the
current value of time. If a total or partial failure occurs, the value of the cumulative
failure score is plotted at the time when the failure occurred. A vertical line is drawn
between this point and the horizontal trace. The test is resumed and the horizontal
trace is continued at the level of the cumulative failure score.
The test is terminated and the equipment is accepted whenever this horizontal line
intersects the lower of the two parallel lines. If the vertical line drawn to connect the
horizontal trace to the new cumulative failure score intersects the upper of the two
parallel lines, the test is terminated and the equipment rejected.
The test is terminated and the equipment is rejected if a total score of 5.0 or more is
reached. If after 409 hours of operation the cumulative failure score is less than 5.0,
then the equipment is accepted.
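Taken together, the tabulated boundaries and the two truncation rules yield a simple
decision procedure. The sketch below handles whole-number failure counts (the
graphical construction described above extends this to fractional failure scores); the
truncation rules from the preceding paragraph are applied first.

    # Boundary values copied from the table in this section:
    # (failures, accept if hours greater than, reject if hours less than;
    #  a reject bound of 0.0 stands for "Continue test")
    BOUNDARIES = [
        (0, 163.0, 0.0),
        (1, 245.0, 0.0),
        (2, 327.0, 0.0),
        (3, 409.0, 82.0),
        (4, 1635.0, 245.0),
    ]

    def time_based_decision(failures: int, elapsed_hours: float) -> str:
        """Return ACCEPT, REJECT, or CONTINUE for the running MTBF test."""
        if failures >= 5:                 # failure score of 5.0 or more
            return "REJECT"
        if elapsed_hours >= 409.0:        # truncation: fewer than 5 failures
            return "ACCEPT"
        _, accept_after, reject_before = BOUNDARIES[failures]
        if elapsed_hours > accept_after:
            return "ACCEPT"
        if elapsed_hours < reject_before:
            return "REJECT"
        return "CONTINUE"

    print(time_based_decision(0, 170.0))  # ACCEPT
    print(time_based_decision(3, 60.0))   # REJECT
    print(time_based_decision(2, 300.0))  # CONTINUE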


C.5 Accuracy Testing Criteria

Some voting system performance attributes are tested by inducing an event or series
of events, where the relative or absolute time intervals between repetitions of the event
have no significance. Although an equivalence between a number of events and a time
period can be established when the operating scenarios of a system can be determined
with precision, another type of test is required when such equivalence cannot be
established. It uses event-based failure frequencies to arrive at ACCEPT/REJECT
criteria. This test may be performed simultaneously with time-based tests.
For example, the failure of a device is usually dependent on the processing volume
that it is required to perform. The elapsed time over which a certain number of
actuation cycles occurs is, under most circumstances, not important. Another example
of such an attribute is the frequency of errors in reading, recording, and processing
vote data.
The error frequency, called “ballot position error rate,” applies to such functions as
the detection of the presence or absence of a voting punch or mark, or the closure
of a switch corresponding to the selection of a candidate.
Qualification and acceptance test procedures that accommodate event-based failures
are, therefore, based on a discrete, rather than a continuous probability distribution. A
Probability Ratio Sequential Test using the binomial distribution is recommended. In
the case of ballot position error rate, the calculation for a specific device (and the
processing function that relies on that device) is based on:
H0: Desired error rate = 1 in 10,000,000
H1: Maximum acceptable error rate = 1 in 500,000
α = 0.05
β = 0.05
and the minimum error-free sample size to accept for qualification tests is 1,549,703
votes.
The nature of the problem may be illustrated by the following example, using the
criteria contained in the Standards for system error rate. A target for the desired
accuracy is established at a very low error rate. A threshold for the worst error rate
that can be accepted is then fixed at a somewhat higher error rate. Next, the decision
risk is chosen, that is, the risk that the test results may not be a true indicator of either
the system's acceptability or unacceptability. The process is as follows:


♦ The desired accuracy of the voting system, whatever its true error rate (which
may be far better), is established as no more than one error in every ten
million characters (including the null character).
♦ If it can be shown that the system's true error rate does not exceed one in
every five hundred thousand votes counted, it will be considered acceptable.
(This is more than accurate enough to declare the winner correctly in almost
every election.)
♦ A decision risk of 5 percent is chosen, to be 95 percent sure that the test data
will not indicate that the system is bad when it is good or good when it is bad.
This results in the following decision criteria:
♦ If the system makes one error before counting 26,997 consecutive ballot
positions correctly, it will be rejected. The vendor is then required to improve
the system;
♦ If the system reads at least 1,549,703 consecutive ballot positions correctly, it
will be accepted; and
♦ If the system correctly reads more than 26,997 ballot positions but less than
1,549,703 when the first error occurs, the testing will have to be continued
until another 1,576,701 consecutive ballot positions are counted without error
(a total of 3,126,404 with one error).
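The decision points quoted above can be reproduced, to within small rounding
differences, from the standard Wald sequential probability ratio algebra on the
binomial distribution, using the H0, H1, α, and β values given earlier in this section:

    import math

    p0 = 1 / 10_000_000   # H0: desired error rate
    p1 = 1 / 500_000      # H1: maximum acceptable error rate
    alpha = beta = 0.05   # decision risks

    ln_accept = math.log(beta / (1 - alpha))   # accept H0 at or below this LLR
    ln_reject = math.log((1 - beta) / alpha)   # reject H0 at or above this LLR
    step_error = math.log(p1 / p0)             # LLR gain per observed error
    step_ok = math.log((1 - p1) / (1 - p0))    # LLR change per correct read (< 0)

    # Consecutive error-free ballot positions needed to accept:
    accept_n = math.ceil(ln_accept / step_ok)
    # Correct reads that must precede the first error to avoid rejection:
    reject_n = math.floor((step_error - ln_reject) / -step_ok) + 1

    print(f"accept after {accept_n:,} error-free ballot positions")
    print(f"reject if the first error occurs before {reject_n:,} correct reads")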


