Post-Election Survey of Local Election Officials
OMB Control Number: 0704-0125

SUPPORTING STATEMENT


2010 Post-Election Voting Survey of Local Election Officials


Section A. JUSTIFICATION

A.1. Need for Information Collection

Primary objectives. The primary objective of the 2010 Post-Election Voting Survey of Local Election Officials is to identify areas where the electoral process can be improved by providing an accurate picture of the absentee voting process. Additionally, the data will permit an ongoing evaluation of the extent to which recent legislative changes have succeeded in removing barriers to absentee voting, and will identify any remaining obstacles to voting by the populations covered by the Uniformed and Overseas Citizens Absentee Voting Act of 1986 (UOCAVA).

The Federal Voting Assistance Program (FVAP) post-election surveys request information to determine participation in the electoral process by those populations covered by UOCAVA. The information collection will help determine: 1) whether voting materials are being distributed in a timely manner and whether voting assistance is being made available; 2) the types of obstacles voters encounter when attempting to vote absentee; 3) the impact of FVAP’s efforts to simplify and ease the process of voting absentee; and 4) any other problems existing for an absentee voter. FVAP will use the information to prepare a report to the President and Congress as required by the National Defense Authorization Act. Prior to 2010, the voting surveys were administered every four (4) years; i.e., immediately after each presidential election. Beginning in 2010, the surveys will be administered every two years, i.e., immediately after each national election. A detailed content summary can be found in Attachment 1a. This summary highlights the relationships between the research objectives and the specific items found on the questionnaire.

Legal authorities. The President of the United States designated the Secretary of Defense to administer the Uniformed and Overseas Citizens Absentee Voting Act of 1986 (UOCAVA), 42 USC 1973ff (Attachment 1b), as modified by the National Defense Authorization Act of FY 2010 (Attachment 1c). The Act permits members of the Uniformed Services and Merchant Marine and their eligible family members who are absent from the United States and its territories, as well as all citizens residing outside the United States, to vote in general elections for Federal offices. The 1988 Executive Order 12642 (Attachment 1d) names the Secretary of Defense as the “Presidential designee” for administering UOCAVA. The Secretary of Defense in turn delegated this responsibility to the Director of the Federal Voting Assistance Program in Department of Defense (DoD) Directive 1000.4, Federal Voting Assistance Program (FVAP) (Attachment 1e), which defines the program’s responsibilities and authority.

A.2. Purpose and Use of Information

How and for what purpose information will be used. The respondents for this specific information collection are the Local Election Officials (LEOs) who are responsible for administering elections in counties, cities, parishes, townships, and other jurisdictions within the United States. LEOs process voter registration and absentee ballot applications, send absentee ballots to voters, and receive and process voted absentee ballots. As in prior administrations of surveys of UOCAVA populations, these survey data will identify areas where the electoral process can be improved by providing an accurate picture of the absentee voting process from the perspective of the LEOs. DMDC designed the survey questions to capture self-reported attitudes and behaviors of the LEOs as well as information about the voting jurisdiction, concentrating on the absentee vote.

By whom information will be used. The sponsor of the 2010 Post-Election Voting Survey of Local Election Officials is the Office of the Under Secretary of Defense (Personnel and Readiness) (OUSD[P&R]), and the users of the data will be the FVAP, the Office of the Secretary of Defense (OSD), other DoD senior staff and administrators, and Defense Manpower Data Center (DMDC).

A.3. Improved Information Technology

To minimize respondent burden and to capitalize on computer-assisted survey administration technology, LEOs will be able to complete the survey on the Web by logging onto the survey operations contractor’s secure Web site. Furthermore, to capitalize on e-mail communication capability, e-mails will be sent to all LEOs for whom a valid e-mail address has been obtained explaining the purpose of the survey and inviting the LEOs to participate (further procedural details are provided in Section B.2.). To access the Web site, LEOs will be provided with an individual access code (i.e., a unique Ticket number they will need to enter to gain access to the survey application) in both the e-mail and postal communications.

A paper-and-pen questionnaire option will also be available, since it is expected that some LEO offices lack the Web access needed to complete an online survey. LEOs opting to complete a paper-and-pen survey will have available a hard copy instrument with a unique identifier located on both the front and back of the questionnaire, i.e., a lithocode that identifies the LEO office, similar to the unique Ticket number used for Web-based data collection. The purpose of the unique lithocode is to determine whether a completed paper version of the instrument has been submitted by a particular LEO office. Also, if multiple hard copy questionnaires need to be sent to any given LEO office, the lithocode will link the multiple copies to that LEO office. For the 2008 survey, 63% of eligible respondents utilized the paper option and 37% used the electronic option. Similar rates are expected for the current data collection effort.

A.4. Efforts to Identify Duplication

There is no other Federal agency tasked with collecting information specific to all the populations covered by UOCAVA and designed to evaluate and report on FVAP’s efforts to simplify and ease the process of voting absentee. The Secretary of Defense, as the “Presidential designee” under 42 USC 1973ff, designated the Director of FVAP to administer and oversee the Federal responsibilities of the Act. Presently, the only information of a similar nature available is information collected by FVAP from surveys of prior elections, with 2008 being the most recent federal election. This information is no longer current and cannot be used to extrapolate to the upcoming federal election. Without current information, FVAP cannot perform its responsibilities under the Act.

The U.S. Election Assistance Commission (EAC), an independent bipartisan commission established by the Help America Vote Act of 2002, will be administering the 2010 Election Administration and Voting Survey. The focus of the EAC survey is quite different from that of the 2010 Post-Election Voting Survey of Local Election Officials, and there is little overlap in the content of the questions. Where there is unavoidable overlap, for example, questions on absentee ballots, the various sections on absentee ballots in the proposed survey (e.g., the sections titled Non-Federal Post Card Application [non-FPCA] Absentee Ballot Requests and Transmission of Regular UOCAVA Absentee Ballots) ask for this information broken down by 1) Uniformed Services Members and 2) Overseas Civilians. This information is important to the purpose of the proposed survey and would not be available otherwise. In addition, the proposed survey has been designed to collect data on the various reasons why jurisdictions may have been unable to process the FPCA and non-FPCA requests they received, as well as the various reasons why the ballots that were returned by UOCAVA voters (e-mailed, faxed, or Federal Write-In Absentee Ballots, or FWABs) and received by the jurisdictions may have been rejected. Finally, the proposed survey asks a number of questions specifically about the role of FVAP that are absent from the EAC survey, for example, “Overall, how useful was the voting information or assistance you received from the Federal Voting Assistance Program’s (FVAP) toll-free telephone service during the 2010 election year?” The proposed survey is the only source of information available to FVAP to assist in its efforts to improve the absentee voting process.

Looking forward to the 2012 LEO surveys, the Directors of FVAP and the EAC have begun collaborating with the shared goal of reducing redundant or duplicative survey and data collection requests to the LEOs. The 2010 timeframe did not allow for such collaboration, but the agreed-upon goal for the future is to reduce the survey requests imposed on LEOs by the two government agencies.[1]

A.5. Methods Used to Minimize Burden on Small Entities

The survey respondents for this data collection are the LEOs, who are government election officials rather than small entities. No data collection is being conducted with businesses or other establishments.

A.6. Consequences of Not Collecting the Information

The UOCAVA requires a statistical analysis of absentee voter participation, which includes uniformed services and overseas nonmilitary populations. To obtain the required information under UOCAVA to conduct this analysis, surveys need to be administered to LEOs. FVAP is then required to prepare a report to Congress no later than the end of the year after a federal election. If surveys were not administered, the DoD would not be in compliance with the law.

A.7. Special Circumstances

There are no special circumstances. This collection will be conducted in a manner consistent with guidelines contained in 5 CFR 1320.5(d)(2).

A.8. Agency 60-Day Federal Register Notice and Consultations Outside the Agency

Received comments. The agency 60-Day Federal Register Notice was published on Monday, February 8, 2010 (Vol. 75, No. 25, pages 6184-6185), as required by 5 CFR 1320.8(d). A copy of the 60-Day Federal Register Notice is included in Attachment 2a. No public comments were received in response to the notice. FVAP corresponds regularly with interested citizens and State and local government officials. Any comments received throughout the approved license period are taken into consideration.

Coordinations were obtained from Ms. Cindy Allard, OSD/JS Privacy Office, WHS/ESD, 703.588.2386 (Attachment 2b), and Ms. Andrea Zucker, Exempt Determination Official for the Office of the Deputy Under Secretary of Defense (Program Integration), Human Research Protection Program, DHRA, 703.696.7178 (Attachment 2c).

The Defense Manpower Data Center (DMDC), the survey research arm of the Under Secretary of Defense for Personnel and Readiness, will manage the data collection in 2010 for FVAP.

A.9. Payments to Respondents

No payments or gifts will be provided to LEOs for completing the survey.

A.10. Assurance of Confidentiality

The information collection does not ask respondents to submit proprietary or trade secret information to DoD. Though DMDC cannot promise confidentiality to this population, respondents will be told that the information they provide will be kept private to the extent permitted by law.

A.11. Sensitive Questions

The data collection instrument contains no questions of a sensitive nature. The survey will be non-intrusive and respondents will be informed that their participation is voluntary. The survey does not collect personally identifiable information and survey responses are not retrieved by personal identifier. Therefore, the information collected is not subject to the Privacy Act of 1974, as amended. DMDC will only report results in the aggregate; that is, in the form of statistical summaries.

A.12. Estimates of Annual Response Burden and Labor Cost for Hour Burden to the Respondent for Collection of Information – Main Survey.

a. Response burden.

Total Annual Respondents:   4,013      (assumes a .55 completion rate)
Frequency of Response:      1
Total Annual Responses:     4,013
Burden Per Response:        90 minutes on average
Total Burden Minutes:       361,170
Total Burden Hours:         6,020

b. Explanation of how burden was estimated. The estimated number of annual respondents is based on the number of Local Election Officials in the population (N = 7,296) multiplied by the 2008 response rate of 55% (7,296 × 0.55 ≈ 4,013). The estimated burden per response (90 minutes) is based on in-house practice, or mock, administrations and guidance from FVAP.

c. Labor cost to respondent.

Total annual respondents:   4,013      (assumes a .55 completion rate)
Frequency of response:      1
Total annual responses:     4,013
Burden per response:        90 minutes on average
Average cost per response:  $33.86     (90 minutes × the GS-9/5 hourly rate of $22.57)
Total respondent cost:      $135,880   ($33.86 average cost per response × 4,013 expected responses)

d. Explanation of How Labor Cost to Respondent was Estimated. The annual salaries of Local Election Officials undoubtedly vary greatly across jurisdictions, from small rural jurisdictions to very large urban and/or county-wide jurisdictions, but the overall estimated hourly wage used to calculate the average cost per response is the 2010 GS-9/5 hourly rate of $22.57, excluding any locality adjustment.
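The figures in parts a. through d. follow directly from the stated inputs; the same arithmetic, with 30 minutes per response and a .65 completion rate, yields the non-response bias survey figures in parts e. and f. below. A minimal sketch of the calculation (variable names are illustrative, not from the source):

```python
# Burden and labor-cost arithmetic for the main LEO survey (illustrative
# sketch; the inputs are the figures stated in Section A.12).
population = 7296              # Local Election Officials in the frame
completion_rate = 0.55         # 2008 response rate, assumed for 2010
minutes_per_response = 90      # estimated burden per response
gs9_hourly_rate = 22.57        # 2010 GS-9/5 hourly rate, no locality adjustment

respondents = round(population * completion_rate)     # 7,296 * 0.55 = 4,012.8 -> 4,013
total_minutes = respondents * minutes_per_response    # 361,170 minutes
total_hours = total_minutes / 60                      # 6,019.5, reported as 6,020 hours
cost_per_response = 33.86                             # 1.5 hours * $22.57 = $33.855, rounded to cents
total_cost = round(cost_per_response * respondents)   # $135,880 (135,880.18 rounded)

print(respondents, total_minutes, total_hours, cost_per_response, total_cost)
```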


Estimates of Annual Response Burden and Labor Cost to the Respondent for Collection of Information – Non-Response Bias Survey.

e. Response burden for non-response bias survey.

Total Annual Respondents:   325        (assumes a .65 completion rate)
Frequency of Response:      1
Total Annual Responses:     325
Burden Per Response:        30 minutes on average
Total Burden Minutes:       9,750
Total Burden Hours:         162.5

f. Labor cost to respondents for non-response bias survey.

Total annual respondents:   325        (assumes a .65 completion rate)
Frequency of response:      1
Total annual responses:     325
Burden per response:        30 minutes on average
Average cost per response:  $11.29     (30 minutes × the GS-9/5 hourly rate of $22.57)
Total respondent cost:      $3,668     ($11.29 average cost per response × 325 expected responses)

g. Explanation of How Labor Cost to Respondent was Estimated. As in part d. above, the annual salaries of Local Election Officials undoubtedly vary greatly across jurisdictions, but the overall estimated hourly wage used to calculate the average cost per response is the 2010 GS-9/5 hourly rate of $22.57, excluding any locality adjustment.


A.13. Estimates of Other Cost Burden for the Respondent for Collection of Information

Total Capital and Start-up Cost. There are no capital or start-up costs.

Operation and Maintenance Cost. There are no operation and maintenance costs. No outside resources, consultations or record retrieval are required to answer the survey questions. Any computer costs borne by the establishment will be minimal.


A.14. Estimates of Cost to the Federal Government.

a. DMDC Staffing Costs


GS Grade/Step   Annual Rate   Annual Rate + 25% Fringe   Monthly Rate   FTE Months   Cost
12/1            $74,872       $93,590                    $7,799         6            $46,795
13/1            $89,033       $111,291                   $9,274         4            $37,097
14/1            $105,211      $131,514                   $10,959        3            $32,878
15/1            $123,758      $154,698                   $12,891        3            $38,674
Total Cost                                                                           $155,444


b. Explanation of How Cost was Estimated. Federal labor costs were estimated using the GS Salary Table for 2010-DCB, which includes a locality payment of 24.22% for the Washington-Baltimore-Northern Virginia area. An additional estimated 25% fringe benefit cost was added based on research available from the U.S. Bureau of Labor Statistics.[2]
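The table in part a. can be reproduced directly from the annual rates: add 25% fringe, divide by 12 for the monthly loaded rate, and prorate by FTE months. A minimal sketch (figures taken from the table above; whole-dollar truncation is an assumption that matches the published totals):

```python
# Reproducing the DMDC staffing cost table: annual locality-adjusted rate,
# plus 25% fringe, prorated by FTE months and truncated to whole dollars.
FRINGE = 0.25
staff = [  # (GS grade/step, annual rate, FTE months)
    ("12/1", 74872, 6),
    ("13/1", 89033, 4),
    ("14/1", 105211, 3),
    ("15/1", 123758, 3),
]

total = 0
for grade, annual, months in staff:
    with_fringe = annual * (1 + FRINGE)    # e.g., $74,872 -> $93,590
    cost = int(with_fringe / 12 * months)  # monthly rate times FTE months
    total += cost
    print(f"{grade}: ${cost:,}")
print(f"Total Cost: ${total:,}")           # $155,444
```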

c. Additional Costs

Survey Contractor Operations and Maintenance Cost $176,634

(Includes contractor labor to produce surveys and letters, screening and reminder telephone calls, conducting the non-response bias study, all data collection costs, materials and freight, data storage, and postage.)

On-Site Contractor Support $178,184

(Costs for support contracts are based on negotiated rates for similar services.)

Government Staffing Cost $155,444

(Includes sampling and weighting, analysis of the basic data set, creation of the tabulation volume, statistical methods reports, contractor technical oversight, contract administration, consultations with FVAP, and preparation of all final internal documents.)

Total Cost $510,262


A.15. Changes in Burden

The change in burden is due to re-estimation of the number of respondents. In particular, for the LEO survey, a sample of LEOs was drawn in 2008, whereas the 2010 survey effort will be a census of all LEOs.

A.16. Published Reports and Project Schedule

Published reports. There are currently no plans to publish the results outside the DoD.

Project schedule.

Activity                             Anticipated Date
Data collection begins               November 19, 2010
Data collection ends                 January 25, 2011
Publish methods report               May 1, 2011
Post reports to FVAP Web site        July 1, 2011
Post tabulations to FVAP Web site    August 11, 2011
Post briefings to FVAP Web site      August 24, 2011


A.17. Approval Not to Display Expiration Date

This approval is not being requested.

A.18. Exceptions to the Certification Statement

No exceptions to the Certification Statement are being requested.


Section B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1. Description of Potential Respondents

Target Population. The 2010 Post-Election Voting Survey of Local Election Officials is designed to represent all local election officials from the voting jurisdictions in the United States and the four territories. Eligible respondents are the individuals at LEO offices who are responsible for approving voter registrations, assigning and sending ballots to voters, and accepting voted ballots. For 2010, DMDC plans to select all 7,296 voting jurisdictions, in comparison to the 2,598 voting jurisdictions sampled in 2008.

Sampling frame. The sampling unit for this study is the local election voting jurisdiction, which is the county in most states but is defined differently from state to state. For example, the states of Alaska and Maine are each considered a single voting jurisdiction for UOCAVA purposes, whereas Michigan, Wisconsin, and the New England states define voting jurisdictions by individual townships. The remaining states define voting jurisdictions as counties, with the exception of Virginia, which defines voting jurisdictions by counties as well as some cities. DMDC developed the sampling frame from three sources: (1) a file provided by FVAP, (2) research on state election Web sites, and (3) research on the Web site of the Overseas Vote Foundation (OVF). In total, 7,296 unique voting jurisdictions were identified.

Sample design. For the 2010 Post-Election Voting Survey of Local Election Officials (LEO survey), DMDC will take a census of all 7,296 local election voting jurisdictions, with the intent of making state-level estimates for states with sufficient numbers of responding LEOs. Because voting legislation and rules vary by state (e.g., whether a voter must return the absentee ballot by mail or may instead use fax or e-mail), individual state estimates are preferred. This information will allow FVAP to target poor-performing states in future training and outreach programs.

Expected response rate. Based on the 2008 study, DMDC expects the response rate to be approximately 55%. DMDC will produce statistical estimates of overall percentages and, for states with sufficient numbers of responding jurisdictions, of state-level percentages, and will examine potential bias due to nonresponse (see Section B.3). Although this collection is a census, DMDC may calculate “margins of error” to reflect uncertainty in the published point estimates. However, given that the 2010 survey field methods will include extensive telephone screening calls as well as targeted reminder calls (neither of which was conducted during the 2008 survey), the final response rate may exceed 55%. If that is the case, similar methods will be considered for the 2012 LEO survey.

Response rates will be calculated using AAPOR Response Rate 3 (RR3):

RR3 = Completed Surveys / (Completed Surveys + Eligible Non-Respondents + e × Cases of Unknown Eligibility)

where a completed survey is one that has at least 50% of the applicable questions completed, Cases of Unknown Eligibility is the number of non-responding jurisdictions with an unknown eligibility status, and e is the estimated proportion of those cases that are eligible. For the 2010 Post-Election Voting Survey of Local Election Officials, we expect all jurisdictions to be eligible.
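A minimal sketch of the RR3 computation under the AAPOR definitions; the disposition counts below simply echo the expected figures above (7,296 jurisdictions, all assumed eligible, with roughly 4,013 completes) and are illustrative:

```python
# AAPOR Response Rate 3 (RR3), sketched with illustrative disposition counts.
# A complete is a return with at least 50% of applicable questions answered;
# e is the estimated proportion of unknown-eligibility cases that are eligible.
def rr3(completes, eligible_nonrespondents, unknown_eligibility, e=1.0):
    return completes / (completes + eligible_nonrespondents
                        + e * unknown_eligibility)

# Census of 7,296 jurisdictions, all assumed eligible, no unknown-eligibility cases.
print(rr3(completes=4013, eligible_nonrespondents=3283,
          unknown_eligibility=0))                      # ~0.55
```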

To handle unit survey non-response, DMDC will compute nonresponse adjustment factors within weighting classes defined by state, jurisdiction type (county versus minor civil division), and size class (number of registered voters). DMDC will collapse cells with fewer than 30 jurisdictions. This weighting adjustment reduces bias from differential response rates across these three characteristics. A sketch of the adjustment follows.
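A minimal sketch of the weighting-class adjustment, assuming a simple in-memory representation (field names are illustrative; the collapsing of cells with fewer than 30 jurisdictions is noted but not implemented here):

```python
# Weighting-class nonresponse adjustment (illustrative sketch). Each cell is a
# (state, jurisdiction type, size class) tuple; cells with fewer than 30
# jurisdictions would first be collapsed with a neighboring cell.
from collections import defaultdict

def nonresponse_adjust(jurisdictions):
    """jurisdictions: dicts with 'cell', 'base_weight' (1.0 in a census),
    and 'responded' (bool). Adds 'adjusted_weight' for respondents."""
    total = defaultdict(float)      # sum of base weights per cell
    resp = defaultdict(float)       # sum of responding base weights per cell
    for j in jurisdictions:
        total[j["cell"]] += j["base_weight"]
        if j["responded"]:
            resp[j["cell"]] += j["base_weight"]
    for j in jurisdictions:
        if j["responded"]:
            # Respondents carry the weight of their cell's nonrespondents.
            j["adjusted_weight"] = j["base_weight"] * total[j["cell"]] / resp[j["cell"]]
    return jurisdictions
```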


B.2. Procedures for the Collection of Information

Participant recruitment. DMDC will obtain names, postal mailing addresses, and e-mail addresses of the current LEOs from FVAP and from the Web sites of the individual LEO offices. DMDC will compare and clean these two lists to make sure contact information is as accurate as possible. All LEOs will be notified of the upcoming survey by postal mail (Attachment 3a) approximately two weeks prior to the start of data collection. The letter will describe the purpose of the survey and will include the sponsor of the survey, a toll-free phone number to call with questions, a unique Ticket number, and a Web site the LEOs can log onto to update their contact information. The notification will be followed by an announcement e-mail sent the day after Election Day (Wednesday, November 3, 2010) informing the LEOs that data collection has started (Attachment 3b). An announcement postal letter (Attachment 3c) and accompanying hard copy questionnaire will be mailed the same day. Throughout the field period, non-responding LEOs will be sent reminder e-mails (Attachment 3d) and reminder postal letters (Attachment 3e). The survey operations contractor will maintain a control system, updated daily with Web receipts and paper receipts, to determine non-respondents. The field period will last approximately six (6) weeks.

The data collection procedures are not expected to involve any risk to participants. Names are used only in communicating with LEOs. These names are kept securely by the survey operations contractor and are not linked to response data. The datasets sent to DMDC contain no names or addresses. Access to full detailed confidential data is limited to DMDC analysts and contractors under their direct supervision.

Data collection. Data collection will be dual-mode: LEOs can choose to complete the survey on a hard copy paper-and-pen questionnaire (Attachment 4a) mailed to their offices or on the Web.[3] For the paper-and-pen survey instrument, the Agency Disclosure Notice (ADN) and privacy notice/informed consent text are included in the survey packet. The ADN is located on the inside cover immediately before the Privacy Advisory Statement. The Privacy Advisory Statement is followed by the informed consent information on the inside front cover of the survey document. The informed consent information includes the instruction "Returning this survey indicates your agreement to participate in this research."

For the Web version, when respondents log on to the Web site using their unique Ticket number to complete the online version of the instrument, they will be directed to a set of Web screens, starting with the “Welcome” screen (Attachment 5a). From there they can view the “Frequently Asked Questions” screen (Attachment 5b) and the “Security Protection Advisory” screen (Attachment 5c).[4] The Advisory for the survey program informs visitors to the Web site that no information on the person's computer or Internet connection is collected in a way that can be associated with the person or the survey responses. Respondents are then directed to the “Agency Disclosure Notice” (ADN; Attachment 5d) and finally the “Privacy Advisory and Informed Consent Statement” screen (Attachment 5e). On the Web version of the instrument, the ADN must be read before continuing to the Privacy Advisory and Informed Consent Statement screen. The informed consent screen includes the instruction "Click 'Continue' if you agree to do the survey"; informed consent is indicated by clicking the "Continue" button and answering the survey questions.

LEOs completing a paper-and-pen version of the survey will use an accompanying postage-paid envelope to mail the completed survey to the survey operations contractor. The survey operations contractor will log the received instruments into the control system and perform any necessary cleaning and editing to prepare for data entry. The cleaning process includes looking for errors in skip logic and cleaning difficult-to-read handwritten entries. The hard copies are scanned into the database, and the data are then merged with the Web-based survey response data. The operations contractor converts raw data files to SAS datasets according to specifications written by DMDC. Datasets are then transmitted to DMDC via secure file transfer protocol. DMDC then creates a report for FVAP of the survey responses in subgroup categories and percentages (weighted to reflect the population). FVAP then further analyzes the data and prepares a report to the President and the Congress.

Data security. This survey does not collect or use personally identifiable information and data are not retrieved by personal identifier. Therefore, the information collected is not subject to the Privacy Act of 1974, as amended. Only aggregate data will be reported in the form of statistical summaries.

The network sites for both DMDC and DMDC’s survey contractor, Data Recognition Corporation (DRC), are secure and password protected. Security is strictly enforced through physical and software access restrictions. All servers are physically located in locked rooms with access permitted only to Technical Services staff through the use of a security card system. Access to the network is allowed only through a login account and password. In addition, employees use password-protected screen savers at workstations to protect their systems while they are away from their desks. At DMDC, the network is accessed through the use of Common Access Card (CAC) readers and utilizes Public Key Infrastructure (PKI) security. Logging on to the network requires both physical possession of the CAC and a separately issued Personal Identification Number. All computer systems comply with current Federal Information Security Management Act security standards.

DRC makes daily backup tapes that are stored for five years in fireproof vaults located within a security-card-protected area, and all provisions dealing with the protection of human subjects and data security remain in force for as long as the contractor retains any protected data. DRC meets the Department of Defense Information Assurance Certification and Accreditation Process (DIACAP) requirements and conducts all necessary updates on an ongoing basis to continue to meet DIACAP requirements. DRC’s facilities have an Authority to Operate (ATO) issued by DMDC.

Weighting. The analytic weights for the 2010 post-election voting survey of LEOs will be created to allow for the estimation of population values from eligible survey respondents. DMDC will create survey weights to reflect the initial selection probabilities as well as adjustments for differential response rates. Since the plan is to take a census, all base weights, which are the ratio of the frame count to the sample count, will be 1. After the survey has been conducted and the case dispositions are resolved, the sampling weights will be adjusted for non-response. The eligibility-adjusted weights for eligible respondents will be adjusted to account for eligible jurisdictions that were non-respondents. For this survey, we expect all sample jurisdictions to be eligible.

Edit and imputation processes. To calculate estimated totals from the survey data, edit and imputation processes similar to those used in 2008 will be developed for items with missing data. Without an edit and imputation process, the estimated totals would underrepresent the actual totals. The edit process is the inspection of collected data prior to statistical analysis. The goal of editing is to verify that the data fall within expected ranges and that relationships among variables indicate respondents understood the question concepts. An imputation process places an estimated answer into a data field for a record that previously had no data or had incorrect or implausible data.

Data Editing. Two edits will be performed prior to statistical analysis. The first edit is specific to Question 4, the total number of UOCAVA voters for the local jurisdiction, and will be performed for every jurisdiction that is an eligible respondent. When the total number of voters reported for the jurisdiction does not closely correspond to the expected number of votes, DMDC will conduct a Web search to find the total number of votes for the jurisdiction through the FVAP Web site. Question 4 will be used during the imputation process.

The second edit, called the common denominator edit, will be used for questions with multiple parts or sub-items. The questions pertaining to count data have two sub-items: (1) Uniformed Services members (domestic or foreign) and (2) overseas civilians.

The common denominator edit will be performed on all complete and incomplete eligible cases. When one or more sub-items have valid responses, the missing value for the remaining sub-item will be set to zero.
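A minimal sketch of this edit (the field names are illustrative, not from the source):

```python
# Common denominator edit (illustrative sketch): if at least one sub-item of a
# count question has a valid response, missing sub-items are set to zero.
def common_denominator_edit(sub_items):
    """sub_items: e.g., {'uniformed_services': 120, 'overseas_civilians': None}."""
    if any(v is not None for v in sub_items.values()):
        return {k: (0 if v is None else v) for k, v in sub_items.items()}
    return sub_items  # all sub-items missing: leave the case for imputation
```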

Imputation Process. After the edit process, DMDC will implement a hot-deck imputation. Hot-deck imputation uses similarly sized jurisdictions as ‘donors’ to replace missing data. To serve as a donor, a jurisdiction must be a complete eligible case with response data for the question being donated. For each voting jurisdiction with missing information, a donor will be selected by simple random sampling, preferably from within the same state and relatively close in size. No donor will be used more than once. The donor provides the ratio of the item needing imputation to its number of UOCAVA voters; this ratio will then be multiplied by the total number of registered voters of the recipient case.
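A minimal sketch of the ratio hot-deck step described above (field names are illustrative, and the “relatively close in size” donor restriction is reduced to a same-state preference for brevity):

```python
# Ratio hot-deck imputation (illustrative sketch). A donor's ratio of the
# missing item to its UOCAVA voter count is applied to the recipient.
import random

def hot_deck_impute(recipient, donors, item):
    """recipient: case with recipient[item] missing; donors: complete eligible
    cases with both the item and 'uocava_voters' reported."""
    same_state = [d for d in donors if d["state"] == recipient["state"]]
    pool = same_state or donors          # prefer a donor from the same state
    donor = random.choice(pool)          # simple random selection of the donor
    donors.remove(donor)                 # no donor is used more than once
    ratio = donor[item] / donor["uocava_voters"]
    recipient[item] = round(ratio * recipient["registered_voters"])
    return recipient
```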

Unusual problems requiring specialized sampling. No unusual problems that would require specialized sampling are anticipated.

Use of periodic data collection cycles to reduce burden. This request is for a single data collection; the burden therefore cannot be reduced through less frequent administration.


B.3. Non-response, Maximization of Response Rates, and Accuracy and Reliability

Encouraging response. To maximize response rates, DMDC will notify the LEOs of the upcoming survey by a postal notification letter approximately two weeks prior to the start of data collection. The letter explains the purpose of the survey and asks for voluntary completion. The notification will be followed by an announcement e-mail sent the day after Election Day (Wednesday, November 3, 2010), informing the LEOs that data collection has started. An announcement postal letter and accompanying hard copy questionnaire will be mailed the same day. Throughout the field period, non-responding LEOs will be sent reminder e-mails and reminder postal letters. The reminder communications will tell LEOs how to obtain another copy of the hard copy instrument if they have misplaced their copy, never received one, or prefer completing a hard copy instrument over the Web-based instrument.

In addition to sending reminder e-mails and letters and addressing LEOs at the Election Center conference, FVAP will conduct telephone reminder calls to large non-responding LEO jurisdictions to encourage response. Specifically, about halfway through the field period, FVAP will place calls to the 200 largest non-responding LEO jurisdictions. (This type of telephone follow-up was not conducted as part of the 2008 LEO survey effort; it is new for the 2010 survey.) A targeted nonresponse telephone follow-up of this kind, focused on large jurisdictions and intended to increase their response rate, should greatly reduce the effect of nonresponse on the 2010 estimates: for surveys with differentially sized sample units (e.g., Los Angeles County versus a small township in Vermont), most of the variance comes from the largest units. In 2008, there were only approximately 200 jurisdictions with at least 100,000 registered voters, and we expect the distribution to be about the same in 2010.

It is very important for a data collection effort of this kind to obtain buy-in from the population of interest to the extent possible. To that end, Dr. Timothy Elig, DMDC Survey Division Chief, along with representatives of FVAP, attended the August 2010 Election Center national conference. The Election Center is composed of the Local Election Officials, and the conference provided an excellent opportunity to both publicize and legitimize the survey. A presentation was made at the conference on all of the post-election voting surveys, and in this forum the LEOs had an opportunity to hear and ask questions about the upcoming LEO survey. Dr. Elig was able to address questions the LEOs had about the survey instrument, its methodology, how and when they will be contacted to complete the survey, and how the FVAP LEO survey differs from the EAC LEO survey. Dr. Elig was also able to give examples of how DMDC and FVAP used the 2008 survey results to help design the 2010 survey; that is, questions that did not seem to be generating reliable or valid data, or that appeared too confusing or difficult for LEOs to complete in 2008, were either dropped or revised accordingly. Dr. Elig had copies of the draft instrument in the event any of the LEOs wanted to see the kinds of information being asked for, the formats in which the data are being collected, etc. FVAP and DMDC believe that by actively participating in this conference, the LEOs will have a better understanding of the need for the survey and will be more willing to participate and provide accurate data.

Analysis of survey non-response. Both DMDC and OMB are concerned with generally declining response rates in the survey industry. To address this concern, OMB issued standards and guidelines for federal statistical surveys requiring that, for any survey with a response rate below 80 percent, survey agencies conduct a non-response analysis. For the proposed survey, DMDC will conduct a nonresponse bias study. This study consists of contacting a random sample of survey non-respondents by telephone and asking a subset of key survey questions. The sample size for the nonresponse bias study was developed to detect differences of five percent on key variables of interest at 90 percent confidence. For instance, in 2008 the average ‘total number of absentee ballots issued’ was 94. If the difference between estimates from the original survey and the nonresponse study is greater than 5 (about 5 percent), we require enough sample that the probability that this difference is due to chance is less than 10 percent. Examining across the key survey items for the nonresponse study, a sample of approximately 500 jurisdictions is required.
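One simplified way to arrive at a figure of roughly 500 is a normal-approximation precision calculation. The standard deviation below is an assumed value chosen for illustration only, not a figure from the source; the actual calculation would use item-level variances from the 2008 data:

```python
# Sketch of a sample-size calculation for detecting a difference of 5 on a key
# item at 90 percent confidence. sigma is an assumed, illustrative value.
import math

z = 1.645     # two-sided z for 90% confidence
delta = 5     # smallest difference to detect (about 5% of the 2008 mean of 94)
sigma = 68    # assumed standard deviation of the item (illustrative only)

n = math.ceil((z * sigma / delta) ** 2)
print(n)      # ~500 jurisdictions under these assumptions
```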

The set of questions that will be used for the nonresponse bias study is submitted with this revised Supporting Statement. We anticipate it will take approximately 30 minutes to complete these questions, and we expect about a 65% completion rate. The burden calculations have been added to Section A.12, parts e. and f.

To assess nonresponse bias, DMDC will compare responses from initial survey respondents to those of survey non-respondents converted to response by the more expensive telephone mode. Because LEO survey questions are technically complex, DMDC would expect unacceptably large mode effects if the nonresponse survey were conducted entirely by telephone. To counter this anticipated problem, DRC’s telephone staff will be trained to contact the respondents and then, while on the telephone, guide them to a Web-based version of the abbreviated nonresponse survey instrument. In this way, respondents will receive the stimulus and essential survey conditions most similar to those of the original respondents. Although the respondent will be viewing a Web version of the questionnaire (or a hard copy version if they prefer) to formulate their answers, DRC’s telephone staff will capture and enter the information over the phone. In this manner, the non-response study will differ from a strict interviewer-administered CATI interview. DMDC will analyze results from this study, including response rates to the nonresponse follow-up study and substantive, statistically significant differences in the estimates of key analysis variables, to estimate the level of nonresponse bias in LEO estimates.

DMDC will compute response rates for each of the weighting-class groups described above (state, jurisdiction type, and size class). If response rates differ within a group, DMDC will consider using the identified variable for post-survey statistical adjustments, including non-response adjustments to survey weights. In addition, DMDC will analyze the responses to the survey by these groups and by survey mode (Web versus paper). These analyses and subsequent statistical adjustments should reduce potential non-response bias in the post-election voting survey of LEOs and inform the DoD on methods for improving future surveys of this nature, including maximizing survey response through the use of telephone follow-up.

B.4. Tests of Procedures

DMDC utilizes best practices in its design of Web-based surveys (e.g., visual presentation of questions and response options, usability and interactive elements, use of color, font style and size, and screen layout). These features of Web-based data collection, in addition to automated skip logic, serve to ease the burden on respondents, increase data quality, and minimize response error. Similarly, DMDC has incorporated a number of randomized experiments into its surveys over the past several years, testing such conditions as the number and wording of letters and e-mails, timing of respondent contacts, presence or absence of a brochure, sponsorship, and subject line text. Because of these experiments, DMDC has developed field procedures grounded in experience and empirical findings. In addition to these tests and experiments, DMDC conducted the 2008 Post-Election Voting Survey of Local Election Officials, and as stated above, based on those returns, revisions were made to questions that did not seem to be generating reliable or valid data, appeared confusing, or were formatted in a way that proved cumbersome to respondents. For example, to improve data quality since the 2008 survey and to avoid generating data from which poorly informed policy decisions might be made, throughout the survey respondents can now check a “zero” box to designate “none” rather than entering the number “0,” which can often be confused with the number “6.” Respondents can also select a box for “Data Not Available” if the jurisdiction does not keep records in the way the question describes. Finally, in some cases, problematic questions from 2008 were dropped completely.

B.5. Individuals Consulted on Statistical Aspects and Individuals Consulting and/or Analyzing Data

FVAP Principal Investigator: Robert Carey

Director, Federal Voting Assistance Program

W: 703-588-8118

C: 703-485-5022

[email protected]


FVAP Deputy Director for Research and Assessments: Paul Drugan

W: 703-588-8124

[email protected]


FVAP Program Analyst: Allan White

W: 703-588-8112

[email protected]


DMDC Principal Investigator: Dr. Timothy Elig (703-696-5858)


DMDC Survey Statisticians:

  • David McGrath (703-696-2675)

  • Dr. Fawzi Al Nassir (703-696-5825)

  • Eric Falk (703-696-8960)

  • Owen Hung (703-696-1343)

  • Dorothy Kester Jackman (703-696-5839)


DMDC Analysts:

  • Ryan Tully (703-696-6339)

  • Elizabeth Davis (703-588-0228)

  • Laverne Wright (703-696-5833)

  • Kristin Williams (703-696-8106)


DMDC Survey Reviewer: Dr. Robert Simmons (703-696-8961)


Survey Operations Contractor: Data Recognition Corporation (DRC) (800-826-2368): Contact person: Valerie Waller (763.268.2166)

[1] Section A.4. addresses one of the two Terms of Clearance items that came out of the 2008 OMB review: “In addition, DOD will coordinate efforts to identify overlap with their ‘Post-Election Survey of Overseas and Post-Election Survey of Local Election Officials,’ and the Election Assistance Commission’s subsequent ‘Election Administration & Voting Survey’ to avoid duplicative efforts.” The 2008 Statistical Methodology Report is submitted as part of this Supporting Statement per the second Terms of Clearance issue from the 2008 OMB review.

[2] An estimate of 25% as the cost of fringe benefits was taken from a review of two papers available from the U.S. Bureau of Labor Statistics Web site: (1) Report on the American Workforce, U.S. Department of Labor, Elaine L. Chao, Secretary, 2001; and (2) “The Growth of Fringe Benefits: Implications for Social Security” by Yung-Ping Chen.

[3] The final Web-based survey instrument has not yet been programmed, but the Web-based survey will contain the same text as the hard copy version.

[4] Because the data are collected on a Web site, the Web site is required to include Security Protection Advisory information according to the Office of the Secretary of Defense Policy for Establishing and Maintaining a Publicly Accessible Department of Defense Web Information Service (dated July 18, 1997; updated January 9, 1998).


