OMB Control Number: 0704-0125


SUPPORTING STATEMENT


2012 Post-Election Voting Survey of State and Local Election Officials


Section A. JUSTIFICATION

A.1. Need for Information Collection

Primary objectives. The primary objective of the 2012 Post-Election Voting Survey of State and Local Election Officials, conducted on behalf of the Federal Voting Assistance Program (FVAP), an agency of the Department of Defense, is to identify areas where the electoral process can be improved by providing an accurate picture of the absentee voting process. This investigation will, in turn, permit an ongoing evaluation of the extent to which FVAP is achieving its mission and what actions FVAP might take in the future to improve the process. In addition, the data will assist FVAP in determining whether legislative changes have been successful in removing barriers to absentee voting and in identifying any remaining obstacles to voting by those populations covered by the Uniformed and Overseas Citizens Absentee Voting Act of 1986 (UOCAVA). To obtain the necessary information, the 2012 Post-Election Voting Survey of State and Local Election Officials, which comprises three (3) component surveys, will be administered to the voting jurisdictions in the United States, the U.S. territories, and the District of Columbia, with respondents being either the State Election Officials or the Local Election Officials.

Taken together, these components will, for example, help determine: 1) whether voting materials are being distributed in a timely manner and whether voting assistance is being made available; 2) the types of obstacles voters encounter when attempting to vote absentee; 3) the impact of FVAP’s efforts to simplify and ease the process of voting absentee; and 4) any other problems existing for an absentee voter as determined by the responding election officials. FVAP will use the information to prepare a report to the President and Congress as required by the National Defense Authorization Act. Prior to 2010, the voting surveys were administered every four (4) years; i.e., immediately after each presidential election. Beginning in 2010, the surveys are scheduled to be administered every two years, i.e., immediately after each federal election.

Legal authorities. The President of the United States designated the Secretary of Defense to administer the Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA), as modified by the Military and Overseas Voter Empowerment Act, 42 USC 1973ff (Attachment 1a). The Act permits members of the Uniformed Services and the Merchant Marine, their eligible family members, and all citizens residing outside the United States who are absent from the United States and its territories to vote in the general elections for Federal offices. The 1988 Executive Order 12642 (Attachment 1b) names the Secretary of Defense as the “Presidential designee” for administering UOCAVA. In Department of Defense Directive 1000.04, Federal Voting Assistance Program (FVAP) (Attachment 1c), the Secretary of Defense delegated UOCAVA-related responsibilities first to the Under Secretary of Defense for Personnel and Readiness and then, in turn, to the Director of the Federal Voting Assistance Program. DoD Directive 1000.04 also updates the policy and responsibilities for FVAP under Executive Order 12642.


A.2. Purpose and Use of Information

Quantitative and Qualitative Survey Data Collection. In contrast to past administrations, including the 2010 data collection, in which a single survey instrument was used, the 2012 information collection effort has been split into two surveys to better focus the goals of the overall data collection effort. These two surveys comprise two of the three component parts of the overall survey effort. One survey instrument is designed to collect the quantitative information and the second is designed to collect the qualitative information. In brief, and as also described in Section B.1, Description of Potential Respondents, the quantitative survey will be administered to 1) the State Election Officials (SEOs) in those states, as identified by FVAP, with reliable state-wide databases, whereby the SEO will provide the quantitative data for selected jurisdictions using the state’s database as the source, and 2) a sample of the Local Election Officials (LEOs) in those states without such state-wide databases and where the data reside more reliably at the local level. The quantitative survey asks questions such as the number of registered/eligible voters in the jurisdiction; the number of post card applications received/rejected; the number of absentee ballot requests received; the number of Federal Write-in Absentee Ballots (FWABs) received from UOCAVA voters; etc.

The qualitative survey will be administered only to a sample of the LEOs. These survey questions capture self-reported attitudes and behaviors of the LEOs, asking, for example: “What was the main reason why you or your staff did not visit the Federal Voting Assistance Program's (FVAP) Web site in 2012?”; “Overall, how useful was the voting information or assistance that you received from the Federal Voting Assistance Program's (FVAP) electronic fax and e-mail conversion service during the 2012 election year?”; and “Did you or anyone else on your staff use the FVAP Local Election Official online training module?”

The data from both the quantitative and the qualitative surveys will identify areas where the electoral process can be improved by providing an accurate picture of the absentee voting process. In addition, because some of the survey questions have been asked after prior federal elections, changes over time in some of the areas of interest to FVAP can be assessed.

Quantitative Data Validation Survey. The third component part of the overall survey effort is designed to serve as a validation check of the quantitative data collection: In those states where the SEOs are the respondents for the quantitative survey, a small (n = 327) randomly selected group of LEOs will be contacted and asked to complete the quantitative survey for their own jurisdictions. The data provided by each LEO will be compared with the data provided by his/her SEO for the LEO’s jurisdiction to assess the degree to which the data match or are discrepant.

The decision to conduct this validation survey was made by DMDC and FVAP because in previous years all the quantitative data were collected from the LEOs rather than from the SEOs. In 2012, however, both to reduce the number of individual respondents compared to the 2010 administration and to take advantage of the state-wide databases, the quantitative data for the majority of the voting jurisdictions will be coming from the SEOs. The validation survey was developed to investigate the unlikely possibility that the change in the respondent population, as well as the change in the format of the data collection instrument itself (from a Web-based survey to an Excel spreadsheet, as described in more detail in Sections A.3 and B.2), could produce an unanticipated effect in the data.

However, given that the state-wide databases maintained by the SEOs are populated with the information provided by the states’ individual LEOs, FVAP and DMDC do not expect that any significant systematic differences will be identified between the information coming from these two sources. If discrepancies are found, no resolution of the data will be attempted; that is, neither the SEO nor the LEO will be contacted to reconcile the differences. Rather, the possible origins of any discrepancies will be investigated, with the goal of using the information as a guide for improving data collection in 2014. For example, if discrepancies are found in the data from a given question, the properties of that question (e.g., clarity, length, complexity, format, wording, response options, question order, presence or absence of any question-specific respondent instruction) would be investigated as possible causes, as would the possibility that the question itself might be interpreted differently by the SEOs than by the LEOs.
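To make the planned comparison concrete, the sketch below flags question-level discrepancies between matched SEO and LEO reports for the same jurisdictions. It is a minimal illustration only: the jurisdiction identifiers, question names, and values are hypothetical, and no such code is prescribed by the survey specification.

```python
import pandas as pd

# Hypothetical matched data for three jurisdictions; in practice the SEO
# values come from the compiled quantitative survey and the LEO values
# from the validation survey spreadsheets.
seo = pd.DataFrame({
    "jurisdiction_id": [101, 102, 103],
    "ballots_transmitted": [250, 80, 1_340],
    "fwabs_received": [12, 3, 41],
})
leo = pd.DataFrame({
    "jurisdiction_id": [101, 102, 103],
    "ballots_transmitted": [250, 75, 1_340],
    "fwabs_received": [12, 3, 40],
})

# Align the two sources on the jurisdiction identifier.
merged = seo.merge(leo, on="jurisdiction_id", suffixes=("_seo", "_leo"))

# Flag each question where the two reported counts differ; no reconciliation
# is attempted, matching the plan described above.
for q in ["ballots_transmitted", "fwabs_received"]:
    merged[f"{q}_discrepant"] = merged[f"{q}_seo"] != merged[f"{q}_leo"]

# Discrepancy rates by question would guide the review of question properties
# (wording, format, instructions) ahead of the 2014 collection.
print(merged.filter(like="_discrepant").mean())
```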

By whom information will be used. The sponsor of the 2012 Post-Election Voting Survey of State and Local Election Officials is the Office of the Under Secretary of Defense (Personnel and Readiness) (OUSD[P&R]), and the users of the data will be FVAP, the Office of the Secretary of Defense (OSD), other DoD senior staff and administrators, and the Defense Manpower Data Center (DMDC).


A.3. Improved Information Technology

Digitally signed e-mails, electronic files, and Web-based technology will be used for respondent communications and for all data collection. Specifically, for the quantitative survey of SEOs and LEOs, as well as for the quantitative data validation survey, Excel spreadsheets and a survey instruction booklet will be hosted by DMDC’s survey operations contractor. Respondents will be able to log onto the contractor’s site using their unique ticket number, access and download the survey instruction booklet and their specific spreadsheet directly onto their own computer, and enter their data directly into the spreadsheet. They can also print out the spreadsheet, enter their answers onto the hard copy if they wish, and then transfer their data into the electronic spreadsheet. The spreadsheet will be unique to each state; for example, each SEO will download a unique spreadsheet in which the rows are labeled with the names of that state’s sampled jurisdictions and each column is labeled with a specific question number. Once the spreadsheet has been completed, the respondent will log back in using their unique ticket number, click on an “Upload” button, and upload the completed spreadsheet back onto the survey operations contractor’s site. Once data collection is complete, DMDC will access the site, compile all the spreadsheets, and convert the data into two SAS databases, one for the quantitative survey and another for the quantitative data validation survey.
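The compilation step can be illustrated with a brief sketch. This is a minimal illustration, assuming a hypothetical upload directory and a common column layout across spreadsheets; DMDC’s actual processing converts the compiled data to SAS datasets.

```python
from pathlib import Path
import pandas as pd

# Hypothetical upload directory; in the actual workflow the completed
# spreadsheets are retrieved from the survey contractor's site.
uploads = Path("uploads/quantitative")

frames = []
for f in sorted(uploads.glob("*.xlsx")):
    df = pd.read_excel(f)          # one row per jurisdiction, one column per question
    df["source_file"] = f.name     # retain provenance for cleaning and editing
    frames.append(df)

if frames:
    # Combine all states into a single jurisdiction-level file; the
    # jurisdiction is the unit of analysis regardless of source.
    combined = pd.concat(frames, ignore_index=True)
    # DMDC converts the compiled data to SAS datasets; CSV keeps this
    # sketch self-contained.
    combined.to_csv("quantitative_combined.csv", index=False)
```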

For the qualitative survey of LEOs, to minimize respondent burden and to capitalize on computer-assisted survey administration technology, the respondents will be completing the survey on the Web by logging onto the survey operations contractor’s secure Web site. No hard copy survey data collection is planned for the qualitative survey. E-mails will be sent to all LEOs for whom a valid e-mail address has been obtained explaining the purpose of the survey and inviting the LEOs to participate (See Section B.2 for more information). To access the Web site, LEOs will be provided with an individual access code (i.e., a unique Ticket number included in both the e-mail and postal communications) which they will need to enter to gain access to the survey application.

Finally, to capitalize on the most current e-mail communication technology, DMDC’s survey operations contractor now has the capability of digitally signing all the e-mails it sends out, which enhances the legitimacy and security of all e-mail communication.


A.4. Efforts to Identify Duplication

There is no other federal agency tasked with collecting information specific to all the populations covered by UOCAVA and designed to evaluate and report on FVAP’s efforts to simplify and ease the process of voting absentee. The Secretary of Defense, as the “Presidential designee” under 42 USC 1973ff, designated the Director of FVAP to administer and oversee the Federal responsibilities of the Act. At present, the only available information of a similar nature is information collected by FVAP from surveys of prior elections, with 2010 being the most recent. This information is no longer current, however, and cannot be used to extrapolate to the upcoming federal election. Without information collected specifically about the 2012 federal election, FVAP cannot perform its responsibilities under the Act.

The U.S. Election Assistance Commission (EAC), an independent bipartisan commission established by the Help America Vote Act of 2002, will be administering the 2012 Election Administration and Voting Survey. The focus of the EAC survey is quite different from that of the components of FVAP’s 2012 Post-Election Voting Survey of State and Local Election Officials, and whatever overlap exists in the content of the questions is minimal, as described briefly below. Where there is unavoidable overlap, for example, a question on the number of UOCAVA absentee ballots received by the jurisdiction, the FVAP question asks for information broken down by 1) Uniformed Service voters with non-U.S. addresses; 2) Uniformed Service voters with domestic U.S. addresses; and 3) overseas civilians. This information is important for meeting the goals of the FVAP report to Congress and would not be available otherwise. In addition, the proposed qualitative survey component of the 2012 Post-Election Voting Survey of State and Local Election Officials asks questions specifically about the role of FVAP that are entirely absent from the EAC survey, for example, “Overall, how useful was the voting information or assistance that you received from the FVAP ETS during the 2012 election year?” The qualitative survey is the only source of information available to FVAP to assist in its efforts to improve these features of the absentee voting process.

While planning for the 2012 surveys, the Director of FVAP and the Director and board members of the EAC began working toward the negotiated objective of eliminating redundancy or duplication of data collection by having only a single 2012 data collection effort administered by one or the other of the two agencies. However, the timeframe for the development of the 2012 survey efforts did not allow for the complete resolution of such a collaboration. The prolonged delay in the funding status of the EAC, for example, required FVAP to continue to move forward with its planning independently, and the unique needs of the two agencies differed significantly: 1) FVAP’s need to collect the qualitative data directly from the LEOs was something the EAC was not prepared to undertake, and 2) FVAP’s need for more detailed quantitative information than the EAC normally collects in its survey was problematic. The timing of the two surveys also posed problems that could not be resolved in a timely way; that is, the date by which Congress requires FVAP to submit its 2012 Congressional report was moved up considerably compared to the 2008 and 2010 dates, and is much earlier than the finalized EAC data would be available.

However, a Memorandum of Understanding (MOU) (Attachment 1d) has been drafted between FVAP, the EAC, and the National Association of State Election Directors (NASED) on the subject of reducing the data collection burden imposed by the biennial surveys of the EAC and FVAP. This agreement between these three key organizations, as stated in the MOU, includes shortening the length of the FVAP survey that collects quantitative data, and also significantly reduces the number of local jurisdictions, or LEOs, who will be contacted for data collection in 2012 compared to 2010. Finally, and most importantly in terms of reducing burden and removing duplication of effort, the MOU states that the EAC and FVAP will work together to produce a single survey to be administered in 2014 following the federal election taking place that year. The MOU is currently en route to signature by the three organizations, two of which have already provided their signatures.

In addition, an effort by FVAP and DMDC to minimize burden on the LEO respondents selected for the qualitative survey resulted in simplifying and substantially shortening the qualitative survey compared to the 2010 survey and compared to the first OMB submission for the current 2012 survey. Only questions considered essential and actionable have been retained.

Another method by which the FVAP survey will reduce burden for the SEOs who complete FVAP’s quantitative survey is in the design and formatting of the questionnaire itself; that is, the quantitative survey has been designed to resemble the design of the EAC survey. The EAC designed a spreadsheet document into which SEOs enter their answers electronically, and a questionnaire document, or road map, that fully describes each question and provides whatever definitions or respondent instructions the SEOs will need. FVAP and DMDC are modeling the FVAP quantitative survey after this example so that SEOs can use the same format for both data collections rather than use a spreadsheet format for the EAC survey and either a hard copy or a Web-based survey for the FVAP survey.


A.5. Methods Used to Minimize Burden on Small Entities

The survey respondents for this data collection are either the State Election Officials (quantitative survey only) or the Local Election Officials (quantitative and qualitative surveys for sample of LEOs). No data collection is being conducted with other businesses or establishments.


A.6. Consequences of Not Collecting the Information

The UOCAVA requires a statistical analysis of absentee voter participation which includes uniformed services and overseas nonmilitary populations. To obtain the required information under UOCAVA to conduct this analysis, survey data need to be collected at the level of the voting jurisdiction, and for 2012 those data will be collected from the State Election Officials and from the Local Election Officials. FVAP is then required to prepare a report to Congress. If surveys were not administered, the DoD would not be in compliance with the law and would not be able to report to Congress.


A.7. Special Circumstances

There are no special circumstances. This collection will be conducted in a manner consistent with guidelines contained in 5 CFR 1320.5(d)(2).


A.8. Agency 60-Day Federal Register Notice and Consultations Outside the Agency

Received comments. An agency 60-Day Federal Register Notice was published in the Federal Register on Monday, March 26, 2012 (Vol. 77, No. 58, pages 17460-17461), as required by 5 CFR 1320.8(d). A copy of the 60-Day Federal Register Notice is included in Attachment 2a. No public comments were received in response to the notice. FVAP corresponds regularly with interested citizens and State and local government officials, e.g., State Board of Elections Directors and State Election Officials. Any comments received from these stakeholders regarding survey content were taken into consideration.

Coordinations were obtained from Ms. Cindy Allard, OSD/JS Privacy Office, WHS/ESD, 571.372.0461 (Attachment 2b), and Dr. Jane Styer, Exempt Determination and Secondary Review Official, OUSD(P&R) Human Research Protection Program, DHRA, DMDC, 831.583.4076 (Attachment 2c). Approval of the FVAP surveys was provided by Ms. Sharon Cooper, Director, Defense Human Resource Activity (DHRA) (Attachment 2d).

The Defense Manpower Data Center (DMDC), the survey research arm of the Under Secretary of Defense for Personnel and Readiness, will manage the data collection in 2012 for FVAP.


A.9. Payments to Respondents

No payments or gifts will be provided to LEOs for completing the survey.


A.10. Assurance of Confidentiality

The information collection does not ask respondents to submit proprietary or trade secret information to DoD. Though DMDC cannot promise confidentiality to this population, respondents will be told that the information they provide will be kept private to the extent permitted by law.


A.11. Sensitive Questions

The data collection instrument contains no questions of a sensitive nature. The surveys will be non-intrusive and respondents will be informed that their participation is voluntary. The survey does not collect personally identifiable information and survey responses are not retrieved by personal identifier. Therefore, the information collected is not subject to the Privacy Act of 1974, as amended. DMDC will only report results in the aggregate; that is, in the form of statistical summaries.


A.12. Estimates of Respondent Burden and Labor Costs

Estimates for Quantitative Survey and Quantitative Data Validation Survey.

a. Response Burden and Labor Costs for Quantitative Surveys. Original estimates from the first OMB submission:

Response Burden            SEOs      LEOs       LEOs (Validation)   Total
Number                     45        4,101      500                 4,646
Response rate              0.90      0.60       0.60
Completes                  41        2,461      300                 2,801
Average length (minutes)   150       45         45
Total length (minutes)     6,075     110,727    13,500              130,302
Average length (hours)     2.50      0.75       0.75
Total length (hours)       101       1,845      225                 2,172

Labor Costs                SEOs      LEOs       LEOs (Validation)   Total
Completes                  41        2,461      300                 2,801
GS-8/5 hourly rate         $20.43    $20.43     $20.43
Labor cost per response    $51.08    $15.32     $15.32
Total labor cost           $2,069    $37,703    $4,597              $44,368


Revised estimates, calculated after the reduction in sample size:

Response Burden            SEOs      LEOs       LEOs (Validation)   Total
Number                     41        548        327                 916
Response rate              0.90      0.60       0.60
Completes                  37        329        196                 562
Average length (minutes)   150       45         45
Total length (minutes)     5,535     14,796     8,829               29,160
Average length (hours)     2.50      0.75       0.75
Total length (hours)       92        247        147                 486

Labor Costs                SEOs      LEOs       LEOs (Validation)   Total
Completes                  37        329        196                 562
GS-8/5 hourly rate         $20.43    $20.43     $20.43
Labor cost per response    $51.08    $15.32     $15.32
Total labor cost           $1,885    $5,038     $3,006              $9,929


b. Explanation of How Labor Cost and Burden to Respondent Were Estimated. It is anticipated that 37 of the 41 SEOs, or 90%, will submit a quantitative survey (that is, will submit the Excel spreadsheet, regardless of how thoroughly the spreadsheet has been filled out), given the past experience of the U.S. Election Assistance Commission in conducting its surveys of SEOs. It is also estimated that the average length of time for an SEO (and/or members of the SEO’s staff) to complete the quantitative survey is approximately 2.5 hours. Once the SEOs have located within their databases the quantitative data being asked for in the survey, or once the SEOs have programmed their systems to generate the requested information, they will be able to generate the information for all of their jurisdictions fairly efficiently; the SEOs will report on an average of approximately 48 sampled jurisdictions each (1,952 jurisdictions across 41 SEOs). Overall, it is anticipated that identifying the information for all the jurisdictions and entering it into the spreadsheet will require approximately 2.5 hours on average.

For both the quantitative survey and the quantitative data validation survey in which the LEO is the respondent, a 60% response rate is anticipated, which is somewhat higher than the 53% response rate obtained in the 2010 FVAP survey of Local Election Officials. The anticipated increase is due to the shortening of the 2012 survey compared to 2010,1 the added effort being put into the field work in the form of reminder telephone calls, and FVAP’s efforts throughout the past year to keep the various stakeholders informed of the surveys. Similarly, because the 2012 survey is shorter, it is anticipated that the average length of time for an LEO to complete either of the quantitative surveys will be approximately 45 minutes.

The annual salaries of the State and Local Election Officials across all states and all jurisdictions included in the quantitative surveys, from the small and more rural jurisdictions to the very large urban and/or county-wide jurisdictions, undoubtedly vary greatly, but the overall estimated hourly wage used to calculate the average cost per response is the 2012 GS-8/5 hourly rate of $20.43 (frozen at 2010 levels), excluding any locality adjustment. While it is assumed that the State Election Officials will, on average, earn a higher annual wage than the Local Election Officials, the vast majority of the respondents in this data collection will be Local Election Officials.
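The burden and labor cost figures in the revised table follow directly from the sample sizes, response rates, average completion times, and the GS-8/5 hourly rate. The short calculation below reproduces them as an illustrative check; intermediate values are left unrounded before totaling, mirroring the table.

```python
# Reproduces the revised burden and labor cost estimates from Section A.12.a.
GS_8_5_HOURLY = 20.43  # 2012 GS-8/5 hourly rate, frozen to 2010 levels

groups = {
    # name: (number contacted, response rate, average minutes per response)
    "SEO": (41, 0.90, 150),
    "LEO": (548, 0.60, 45),
    "LEO validation": (327, 0.60, 45),
}

total_hours = total_cost = 0.0
for name, (n, rate, minutes) in groups.items():
    completes = n * rate                  # left unrounded, as in the table
    hours = completes * minutes / 60
    cost = hours * GS_8_5_HOURLY
    total_hours += hours
    total_cost += cost
    print(f"{name}: {round(completes)} completes, {round(hours)} hours, ${round(cost):,}")

print(f"Total: {round(total_hours)} hours, ${round(total_cost):,}")  # 486 hours, $9,929
```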


Estimates for Qualitative Survey (estimates reduced as a result of the shortening of the qualitative survey).

a. Response burden for Qualitative Survey of Local Election Officials.

Number of sampled LEOs:      1,500
Frequency of response:       1
Total annual responses:      900       (assumes a .60 completion rate)
Burden per response:         15        minutes on average
Total burden minutes:        13,500
Total burden hours:          225


b. Labor cost for Qualitative Survey of Local Election Officials.

Total annual respondents:    900       (assumes a .60 completion rate)
Frequency of response:       1
Total annual responses:      900
Burden per response:         15        minutes on average
Average cost per response:   $5.11     (burden per response (15 minutes, or 0.25 hour) × GS-8/5 hourly rate ($20.43))
Total respondent cost:       $4,597    (average cost per response ($5.11) × expected number of responses (900))

c. Explanation of How Labor Cost to Respondent Was Estimated. The annual salaries of the Local Election Officials across all jurisdictions, from the small and more rural jurisdictions to the very large urban and/or county-wide jurisdictions, undoubtedly vary greatly, but the overall estimated hourly wage used to calculate the average cost per response is the 2012 GS-8/5 hourly rate of $20.43 (frozen at 2010 levels), excluding any locality adjustment.


A.13. Estimates of Respondents Costs Other Than Burden Hour Costs

Total Capital and Start-up Cost. There are no capital/startup costs.

Operation and Maintenance Cost. There are no operation and maintenance costs. No outside resources, consultations or record retrieval are required to answer the survey questions. Any computer costs borne by the establishment will be minimal.


A.14. Estimates of Cost to the Federal Government

a. FVAP and DMDC Staffing Costs

GS Grade/Step   Annual Rate   Annual Rate + 25% Fringe   Monthly    FTE Months   Cost
12/5            $84,855       $106,069                   $8,839     5            $44,195
13/5            $100,904      $126,130                   $10,511    4            $42,043
14/5            $119,238      $149,048                   $12,421    5            $62,103
15/3            $132,009      $165,011                   $13,751    4            $55,004
Total Cost                                                                       $203,345


b. Explanation of How Cost was Estimated. Federal labor costs were estimated using the GS Salary Table for 2012-DCB (frozen at 2010 levels) which includes a locality payment of 24.22% for the Washington, Baltimore, and Northern Virginia area. An additional estimated 25% fringe benefit cost was added based on research available from the U.S. Bureau of Labor Statistics.2
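The staffing cost table can be reproduced from the annual rates, the 25% fringe rate, and the FTE months, as the short calculation below shows. This is an illustrative check only; line items are rounded to whole dollars before totaling, as in the table.

```python
# Reproduces the FVAP/DMDC staffing cost estimates in Section A.14.a.
FRINGE = 0.25  # estimated fringe benefit rate, per the explanation in A.14.b

staff = [
    # (GS grade/step, annual rate in dollars, FTE months)
    ("12/5", 84_855, 5),
    ("13/5", 100_904, 4),
    ("14/5", 119_238, 5),
    ("15/3", 132_009, 4),
]

total = 0
for grade, annual, months in staff:
    loaded = annual * (1 + FRINGE)        # annual rate plus 25% fringe
    monthly = loaded / 12
    cost = round(monthly * months)        # line items rounded to whole dollars
    total += cost
    print(f"GS-{grade}: ${monthly:,.0f}/month x {months} FTE months = ${cost:,}")

print(f"Total staffing cost: ${total:,}")  # $203,345
```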

c. Additional Costs

DMDC Survey Contractor Operations and Maintenance Cost $200,000

Includes contractor labor to produce surveys and letters, screening and reminder telephone calls, conducting the non-response bias study, all data collection costs, materials and freight, data storage, and postage.

DMDC On-Site Contractor Support $180,000

Costs for support contracts are based on negotiated rates for similar services.

FVAP Call Center Costs $11,000

Government Staffing Cost $203,345

Includes senior management, project management, instrument development and testing, monitoring of data collection, contractor technical oversight, sampling and weighting, analysis of the basic data set, data cleaning, creation of the tabulations volume, writing of the statistical methods report, contract administration, consultations with FVAP, ad-hoc analyses, and preparation and review of all final internal documents.

Total Cost $594,345

A.15. Changes in Burden

The Supporting Statement for the 2012 Post-Election Voting Survey of State and Local Election Officials is part of the submission package for a new collection rather than a renewal of an existing collection, so there is no change in burden in that regard. However, there is a change in burden compared to what was stated in the 60-day FRN, due to re-estimation of the number of respondents and the decision to include the quantitative data validation survey as part of the data collection effort. Specifically, the 60-day FRN stated the number of respondents would be 3,000, but the estimated number of respondents for all three (3) components of the survey, including the quantitative data validation survey, is approximately 1,462. Similarly, the total annual burden hours have decreased substantially from the 2,000 hours stated in the 60-day FRN to approximately 711.


A.16. Published Reports and Project Schedule

Published reports. There are currently no plans to publish the results outside the DoD.

Project schedule – QUANTITATIVE SURVEYS.

Activity                                        Anticipated Date
Data collection begins                          January 02, 2013
Data collection ends                            February 15, 2013
Final tabulation volumes produced               May 02, 2013
2012 Post-Election Survey Report to Congress    June 12, 2013
Tabulation volumes posted on FVAP Web site      August 10, 2013



Project schedule – QUALITATIVE SURVEY

Activity                                        Anticipated Date
Data collection begins                          December 03, 2012
Data collection ends                            January 03, 2013
Final tabulation volumes produced               March 11, 2013
2012 Post-Election Survey Report to Congress    June 12, 2013
Tabulation volumes posted on FVAP Web site      August 10, 2013


A.17. Approval Not to Display Expiration Date

This approval is not being requested.


A.18. Exceptions to the Certification Statement

No exceptions to the Certification Statement are being requested.


Section B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1. Description of Potential Respondents

Sampling frame. The sampling unit, or unit of analysis, for all three components of the 2012 Post-Election Voting Survey of State and Local Election Officials is the local election voting jurisdiction, which for most states is the county but is defined differently from state to state. For example, the states of Alaska and Maine are considered single voting jurisdictions for UOCAVA purposes, whereas Michigan, Wisconsin, and the New England states define the voting jurisdiction as the individual township. The remaining states define voting jurisdictions as counties, with the exception of Illinois and Virginia; these two states define the voting jurisdiction mainly by county but also include some cities among their voting jurisdictions. DMDC developed the sampling frame from three sources: (1) a file provided to DMDC by FVAP, (2) the Web sites of the individual state election offices, and (3) the Web site of the Overseas Vote Foundation. When compiled, the frame contains 7,303 unique voting jurisdictions in the United States and the four territories.

Quantitative Survey: Sample Design, Target Population, Expected Response Rate. The quantitative survey component of the 2012 Post-Election Voting Survey of State and Local Election Officials will use a sample of 2,500 voting jurisdictions designed to represent all 7,303 unique voting jurisdictions in the United States and the four U.S. territories, with the intent of making state-level estimates for those states with sufficient numbers of responding jurisdictions. Because voting legislation and rules vary by state (e.g., whether voters are required to mail in absentee ballots, or whether fax or e-mail return is an option), individual state estimates are preferred but may not be possible for any or all states. This information, where available, will allow FVAP to target poor-performing states during future training and outreach programs. At a minimum, however, an overall national estimate will be calculated.

In order to create a sample that will represent all jurisdictions, the 7,303 jurisdictions will be stratified into six groups based on the number of UOCAVA ballots transmitted in 2008. This number will be taken from the Election Assistance Commission’s 2008 survey; because 2008 and 2012 are both presidential election years, the 2008 counts will likely be more in line with the 2012 counts than the 2010 counts would be. For those jurisdictions with administrative values for both total registered voters and UOCAVA ballots transmitted, a ratio of these two numbers will be taken. If a jurisdiction does not have a UOCAVA-ballots-transmitted value in the EAC file, its value will be estimated by multiplying the median of these ratios by that jurisdiction’s number of total registered voters.

The Cumulative Square Root Frequency method was used to determine the boundaries of the strata. The strata are: 1) 0-11 UOCAVA ballots transmitted, 2) 12-32 UOCAVA ballots transmitted, 3) 33-53 UOCAVA ballots transmitted, 4) 54-104 UOCAVA ballots transmitted, 5) 105-189 UOCAVA ballots transmitted, and 6) 190 or more UOCAVA ballots transmitted. The jurisdictions in the largest strata will be included with certainty, with the rest of the 2,500-jurisdiction sample being distributed via Neyman allocation, as sketched after the table below.


Stratum   Label                                       Population   Sample Size
1         Fewer than 12 UOCAVA transmitted ballots    4,067        632
2         12-32 UOCAVA transmitted ballots            1,124        324
3         33-53 UOCAVA transmitted ballots            556          162
4         54-104 UOCAVA transmitted ballots           575          401
5         105-189 UOCAVA transmitted ballots          319          319
6         190 or more UOCAVA transmitted ballots      662          662
Total                                                 7,303        2,500
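The following sketch illustrates the allocation mechanics: the two largest strata are taken with certainty, and the remaining sample is distributed across strata 1 through 4 in proportion to N_h × S_h (Neyman allocation). The per-stratum standard deviations (S_h) shown are placeholders, since the values DMDC used are not given in this document, so the resulting allocations are illustrative rather than a reproduction of the table above.

```python
# Minimal sketch of Neyman allocation across the non-certainty strata of the
# quantitative survey sample. Stratum population counts are taken from the
# table above; the per-stratum standard deviations (S_h) are placeholders.
TOTAL_SAMPLE = 2_500
certainty = {5: 319, 6: 662}                      # take-all strata
populations = {1: 4_067, 2: 1_124, 3: 556, 4: 575}
std_devs = {1: 5.0, 2: 9.0, 3: 9.0, 4: 22.0}      # hypothetical S_h values

remaining = TOTAL_SAMPLE - sum(certainty.values())  # 1,519 left to allocate

# Neyman allocation: n_h is proportional to N_h * S_h, which minimizes the
# variance of the stratified estimate for a fixed total sample size.
weights = {h: populations[h] * std_devs[h] for h in populations}
total_weight = sum(weights.values())
allocation = {h: round(remaining * w / total_weight) for h, w in weights.items()}

for h in sorted(allocation):
    n = min(allocation[h], populations[h])        # cap at stratum population
    print(f"Stratum {h}: sample {n} of {populations[h]} jurisdictions")
for h, n in sorted(certainty.items()):
    print(f"Stratum {h}: sample {n} of {n} jurisdictions (certainty)")
```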


Eligible respondents for the quantitative survey will be both Local Election Officials (LEO) and State Election Officials (SEO). LEOs are responsible for administering elections within their jurisdictions (e.g., counties, cities, parishes, townships, and other jurisdictions within the United States), as well as for overseeing the voter registration and absentee ballot application processes, for sending absentee ballots to voters, and for receiving and processing voted absentee ballots. SEOs are responsible for overseeing the election process at the state level and for compiling all the required information on all jurisdictions in their states.

FVAP communicated with all the SEOs to determine which states have reliable centralized voter registration databases. As described in Section A.2, for states that have these databases in place, and therefore have the most complete and accurate data for all jurisdictions in those states, the respondents will be SEOs. The District of Columbia and the four territories are considered SEOs in this context. When these state-wide databases do not exist, the respondents will be LEOs. For 2012, DMDC plans to contact 41 SEOs, which include the District of Columbia and two of the four territories, to collect data for 1,952 jurisdictions, and a sample of 548 voting jurisdictions from the remaining 12 states for a total of 589 individuals. This is in comparison to the census of all 7,296 voting jurisdictions that were included in the 2010 data collection effort where data collection was attempted with all LEOs.

As stated above in Section A.12b., it is estimated that 37 of the 41 SEOs (90%) will submit a quantitative survey (spreadsheet). When the SEO is the respondent and is entering data for all of his/her state’s jurisdictions, DMDC will consider each row of the spreadsheet as a separate case or unit of analysis because each row of the spreadsheet constitutes a jurisdiction. It is also estimated that when these 37 submitted spreadsheets are compiled, approximately 90% of the rows, or jurisdictions, will be sufficiently completed to count as a completed case, where a completed case is defined as a row where at least 50 percent of the applicable questions have been completed. Among the 548 sampled LEOs in those states where the SEOs are not contacted, it is expected that 60% of the LEOs will submit a completed spreadsheet where at least 50 percent of the applicable questions have been completed.

Taken together, when the SEO and LEO returns are combined, it is estimated that an overall response rate of approximately 76% will be obtained for the entire quantitative survey. This figure is derived as follows: of the 2,500 jurisdictions in the sample, LEOs will be asked to report on 548 and SEOs will be asked to report on the remaining 1,952. It is anticipated that 90% of the SEOs will return a spreadsheet and that, across these spreadsheets, 90% of the rows will be sufficiently complete, for a total of 1,581 completed jurisdictions supplied by the SEOs. Similarly, 60% of the 548 LEOs will return a sufficiently complete spreadsheet, for a total of 329 completed jurisdictions supplied by the LEOs. Overall, it is expected that data will be received for 1,910 of the 2,500 jurisdictions, for an overall response rate of approximately 76%. DMDC will calculate margins of error to reflect uncertainty in the point estimates due to sampling and nonresponse variance.
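The derivation of the expected 76% overall response rate can be reproduced in a few lines, using the figures from this section.

```python
# Reproduces the expected overall response rate for the quantitative survey.
TOTAL_JURISDICTIONS = 2_500
seo_jurisdictions = 1_952   # reported on by SEOs
leo_jurisdictions = 548     # reported on by sampled LEOs

# SEOs: 90% submit a spreadsheet, and 90% of the submitted rows
# (jurisdictions) are sufficiently complete.
seo_complete = seo_jurisdictions * 0.90 * 0.90

# LEOs: 60% submit a sufficiently complete spreadsheet.
leo_complete = leo_jurisdictions * 0.60

rate = (seo_complete + leo_complete) / TOTAL_JURISDICTIONS
print(f"Expected completed jurisdictions: {seo_complete + leo_complete:.0f}")  # ~1,910
print(f"Expected overall response rate: {rate:.1%}")                           # ~76.4%
```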

Qualitative Survey: Sample Design, Target Population, and Expected Response Rate. For collection of the qualitative information, as a component of the 2012 Post-Election Voting Survey of State and Local Election Officials, 1,500 LEOs will be sampled and contacted. The sample of 1,500 will be designed to represent all 7,303 voting jurisdictions. The population will be stratified into six strata, as shown in the table below: 1) jurisdictions in Michigan or Wisconsin with fewer than 1,000 registered voters, 2) jurisdictions from all other states with fewer than 1,000 registered voters, 3) jurisdictions in Michigan or Wisconsin with at least 1,000 but fewer than 5,000 registered voters, 4) jurisdictions from all other states with at least 1,000 but fewer than 5,000 registered voters, 5) jurisdictions with at least 5,000 but fewer than 25,000 registered voters, and 6) jurisdictions with more than 25,000 registered voters. Michigan and Wisconsin are separated because together they have 3,355 jurisdictions and DMDC does not want the sample to contain too many jurisdictions from these states. DMDC will select 250 jurisdictions from each of the four largest strata, 350 from the stratum of small jurisdictions in Michigan and Wisconsin, and 150 from the stratum of small jurisdictions in the other states. It is anticipated that 60% of the contacted LEOs will complete the Web-based survey, answering at least 50% of the applicable questions.


Stratum   Label                                                    Population   Sample Size
1         MI & WI, fewer than 1,000 registered voters              2,310        350
2         All other states, fewer than 1,000 registered voters     296          150
3         MI & WI, 1,000-4,999 registered voters                   780          250
4         All other states, 1,000-4,999 registered voters          820          250
5         All states, 5,000-25,000 registered voters               1,944        250
6         All states, more than 25,000 registered voters           1,153        250
Total                                                              7,303        1,500

Quantitative Data Validation Survey: Sample Design, Target Population, and Expected Response Rate. To perform the data validation study, DMDC will also select 327 LEOs from the states where the SEOs are the respondents, restricted to jurisdictions that are included in the quantitative sample and whose LEOs were not selected for the qualitative survey (DMDC did not want to burden any jurisdiction with three LEO surveys). The selections will comprise the LEOs from the 5 largest jurisdictions and a random sample of 5 other jurisdictions in each of these states. In some states, such as Delaware and Hawaii, there are fewer than 10 jurisdictions included in the quantitative sample that are not also included in the qualitative sample; in these states, a census of all such jurisdictions is part of the validation study. Similar to the assumption made above about the LEOs participating in the quantitative survey, and given that both the spreadsheet itself and the data collection methodology will be identical to those of the quantitative survey, it is again expected that 60% of the LEOs will submit a completed spreadsheet in which at least 50 percent of the applicable questions have been completed.

Response Rate Calculations. For all three component surveys, response rates will be calculated using AAPOR Response Rate 3 (RR3), whereby a completed survey is defined as one in which the respondent has completed at least 50 percent of the applicable questions, and e is an estimate of the proportion of non-responding jurisdictions of unknown eligibility status that are eligible. However, for the 2012 surveys, all jurisdictions are considered to be eligible.

RR3 = Completed / [(Completed + Partial Interviews) + (Non-interviews) + e × (Cases of Unknown Eligibility)]


To handle unit non-response, DMDC will compute nonresponse adjustment factors within weighting classes defined by state, jurisdiction type (county versus minor civil division), and size class (number of registered UOCAVA voters). DMDC will collapse cells with too few jurisdictions. This weighting adjustment reduces bias from differential response rates on these three characteristics.

DMDC will produce statistical estimates of overall percentages and, for states with sufficient numbers of responding jurisdictions, state-level percentages, and will conduct basic checks of potential bias due to nonresponse, which will be noted in the final Statistics and Methods report produced for this survey.


B.2. Procedures for the Collection of Information

Participant recruitment. DMDC will obtain names, postal mailing addresses, e-mail addresses, and telephone numbers of all the current LEOs and eligible SEOs from FVAP’s administrative staff, from the FVAP Web site, or from the Web sites of the individual SEO and LEO offices.

All prospective respondents will be notified of the upcoming surveys by postal mail approximately one to two weeks prior to the start of data collection. The letters, all of which will include the signature of FVAP’s Acting Director, will describe the purpose of the survey and will include the name of the sponsor, a toll-free phone number to call with questions about how to participate, a unique ticket number, and the URL for the Web site the LEOs and SEOs can log onto to update their contact information if desired.

An announcement e-mail will be sent to all prospective respondents immediately prior to the start of data collection informing the LEOs and SEOs that data collection has begun; it will include their unique ticket number and instructions on how to log onto the survey Web site to either complete the Web-based qualitative survey (LEOs) or download the Excel spreadsheet and the Instruction Booklet for the quantitative survey (SEOs and LEOs).

Throughout the course of the field period, the survey operations contractor will maintain a case control system which will be updated daily for Web receipts and spreadsheet uploads to determine both LEO and SEO non-respondents. Only those LEOs and SEOs who have not responded to their respective data collections will be sent any reminder contacts: up to five (5) reminder e-mails depending on the survey, and two (2) reminder postal letters. The field period will last approximately four (4) weeks for the qualitative survey and approximately six (6) weeks for the quantitative survey.

The postal letters and emails are included as the following attachments:

Quantitative Survey of SEOs/LEOs:

Attachment 3a: Quantitative Survey Postal Letters – SEOs
Attachment 3b: Quantitative Survey Emails – SEOs
Attachment 3c: Quantitative Survey Postal Letters – LEOs
Attachment 3d: Quantitative Survey Emails – LEOs

Qualitative Survey of LEOs:

Attachment 3e: Qualitative Survey Postal Letters – LEOs
Attachment 3f: Qualitative Survey Emails – LEOs


The data collection procedures are not expected to involve any risk to participants. Names are used only in communicating with LEOs and SEOs. These names are kept securely by the survey operations contractor and are not linked to response data. The datasets sent to DMDC contain no names or addresses.

Data collection: Quantitative Surveys of SEOs and LEOs. As stated in the above section on participant recruitment, the data collection instrument for the quantitative surveys will be an Excel spreadsheet. Each SEO will access and download a spreadsheet that is customized to their state, with their sampled jurisdictions already populated in the spreadsheet, and each LEO will download a generic spreadsheet. The letters and e-mails each respondent receives will include their unique ticket number and the URL for accessing the Web site. Once the respondent has logged on and entered the ticket number, they will be routed first to a Welcome Screen from which they can access their spreadsheet, with their jurisdiction(s) constituting the row(s) and the individual questions constituting the columns. Along with the spreadsheet, which functions as an answer sheet, the respondent will also access, again via the Welcome Screen, a downloadable Instruction Booklet which contains all the questions and any necessary respondent instructions. This booklet will function as a “road map” that clearly guides the respondent from question to question, and from column to column in the spreadsheet/answer sheet. The booklet will also include all the additional requisite information such as the privacy notice and informed consent information.3 The Welcome Screen is included as Attachment 4a. The Instruction Booklet, which the SEOs and LEOs will access/download, is included as Attachment 4b, and the version of the Excel spreadsheet that the LEOs will see, that is, the version that does not include any jurisdiction names, is included as Attachment 4c.

The SEOs and LEOs will be instructed to download the booklet and the spreadsheet. In this manner they can print out one or both of the documents, can gather the necessary information to answer the questions without needing to stay logged on, and can complete the hard copy version of the spreadsheet if they wish before entering their answers into the electronic version of the spreadsheet. Another benefit of downloading the spreadsheet and Instruction Booklet is that respondents, especially SEOs, may want to enlist the assistance of other staff members to complete the various sections of the spreadsheet. Then the responses to all the questions, and for all the jurisdictions, can be compiled and the electronic spreadsheet can be fully populated.

Once the spreadsheet has been completed, the respondent will log back on to the Web site, enter their ticket number, and follow the instructions for uploading and submitting the completed spreadsheet. The operations contractor will be able to monitor the status of each spreadsheet, e.g., whether it has been accessed and downloaded, and whether it has been uploaded. In this way, targeted postal and e-mail reminders can be sent, and reminder phone calls can be made (see Section B.3). As the spreadsheets are completed, the operations contractor will either send them to DMDC or DMDC will download them from the contractor’s Web site and perform any necessary cleaning and editing. Once the field period closes, DMDC will create the final spreadsheets and convert the files to SAS datasets for analysis.

Quantitative data will be collected in this manner for both the quantitative survey of SEOs and LEOs and for the quantitative data validation survey of LEOs. At the end of data collection, two final datasets will be developed: 1) all the spreadsheets submitted by the SEOs and the LEOs eligible for the quantitative survey will be combined into a single data file, because the jurisdiction is the final unit of analysis regardless of the source of the information; and 2) the second data file will be comprised only of those spreadsheets submitted by the LEOs who were sampled to be part of the quantitative data validation survey (n = 327).

Data collection: Qualitative Survey of LEOs. Data collection for the qualitative survey of LEOs will consist of a Web-based instrument that selected LEOs will access by navigating to the survey operations contractor’s Web site and entering their unique ticket numbers, which have been provided to them in the postal letters and e-mails they have received. Respondents will be directed to a set of Web screens, starting with the “Welcome” screen. From there they can view the “Frequently Asked Questions” screen, the “Security Protection Advisory”4 screen, and the “OMB Number” screen. The Advisory informs respondents that no information on the individual’s computer or Internet connection is collected in a way that can be associated with the individual or his/her survey responses. Respondents are then directed to the “Agency Disclosure Notice and Privacy / Informed Consent” screen. This last screen includes the instruction “Click 'Continue' if you agree to do the survey,” and informed consent is indicated by clicking the “Continue” button and by answering the survey questions and submitting the survey. The Attachments for the Web-based qualitative survey of LEOs are the following: Attachment 5a: Questionnaire; Attachment 5b: Welcome screen; Attachment 5c: Frequently Asked Questions screen; Attachment 5d: OMB Number screen; Attachment 5e: Security Protection Advisory screen; Attachment 5f: Agency Disclosure Notice and Privacy/Informed Consent screen.

The qualitative survey functions as a Web-based instrument whereby respondents are guided through it screen-by-screen via a program that includes any appropriate skip logic. At the close of the field period, the qualitative data will be compiled into a single SAS dataset, after appropriate cleaning, by the data collection contractor. Datasets are then transmitted to DMDC via secure file transfer protocol.

Weighting. The analytic weights for the quantitative survey of SEOs and LEOs will be created to allow for the estimation of population values from eligible survey respondents. DMDC will create survey weights to reflect the initial selection probabilities as well as adjustments for differential response rates. Base weights, which are the ratio of the frame count to the sample count, will be calculated by stratum and assigned to each jurisdiction in the sample. LEOs receiving the qualitative survey will receive base weights equal to the total number of jurisdictions, 7,303, divided by the number of jurisdictions in the sample, 1,500. As in the quantitative survey, the base weights will vary over the strata. The quantitative data validation study will not produce estimates, as it will be used only for data accuracy purposes, and will therefore not include base weights. After the surveys have been conducted and the case dispositions are resolved, the sampling weights for the quantitative and qualitative surveys will be adjusted for non-response. The eligibility-adjusted weights for eligible respondents will be adjusted to account for eligible jurisdictions that were non-respondents. For this survey, all sample jurisdictions are expected to be eligible.
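The two weighting steps described above, base weights as the frame-to-sample ratio within each stratum followed by a nonresponse adjustment within weighting classes, can be sketched as follows. The miniature sample file and weighting classes are hypothetical (the frame counts echo strata 1-3 of the quantitative table for illustration); the actual inputs are DMDC's sample and case disposition files.

```python
import pandas as pd

# Hypothetical miniature sample file with stratum, weighting-class, and
# respondent-status fields.
sample = pd.DataFrame({
    "stratum":   [1, 1, 2, 2, 2, 3],
    "wt_class":  ["A", "A", "A", "B", "B", "B"],
    "responded": [True, False, True, True, False, True],
})
frame_counts  = {1: 4_067, 2: 1_124, 3: 556}               # N_h from the frame
sample_counts = sample["stratum"].value_counts().to_dict()  # n_h

# Step 1: base weight = frame count / sample count within each stratum.
sample["base_wt"] = sample["stratum"].map(
    lambda h: frame_counts[h] / sample_counts[h])

# Step 2: nonresponse adjustment within weighting classes -- respondents
# absorb the base weight of nonrespondents in the same class.
class_total = sample.groupby("wt_class")["base_wt"].transform("sum")
resp_total = (sample["base_wt"] * sample["responded"]).groupby(
    sample["wt_class"]).transform("sum")
sample["final_wt"] = sample["base_wt"] * class_total / resp_total
sample.loc[~sample["responded"], "final_wt"] = 0.0  # nonrespondents get no weight

print(sample)
```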

Edit and imputation processes. To calculate estimated totals from the quantitative survey data, edit and imputation processes similar to those used in 2010 will be developed for the items with missing data. Without an edit and imputation process, the estimated totals would under-represent the actual totals. The edit process is the inspection of collected data prior to statistical analysis; the goal of editing is to verify that the data fall within expected ranges and that relationships among variables indicate respondents understood the question concepts. An imputation process places an estimated answer into a data field for a record that previously had no data or had incorrect or implausible data.

Data Editing. DMDC will develop a thorough set of data quality checks against which to perform data editing on the quantitative survey databases, flagging any data that fall outside of specified tolerances. DMDC will use a combination of historical reporting for the jurisdictions (e.g., some of the specifications will be derived from lessons learned from the 2010 data collection) and jurisdiction size and type to set edit parameters. If data fall outside these parameters, DMDC will contact the jurisdiction or the state by telephone to verify the reported values.

Imputation Process. After the edit process, DMDC will implement a multiple hot-deck imputation for the quantitative survey. Hot-deck imputation uses similarly sized jurisdictions as ‘donors’ to replace missing data. To become a donor, a jurisdiction must be a complete eligible case that has response data for the donor question. For each voting jurisdiction with missing information, a donor will be selected by simple random sampling, preferably from within the same state and relatively close in size. No donor will be used more than once, and the donor will provide a value for the question with missing data needing imputation. Five fully imputed data sets will be created and the values will be averaged to create point estimates. The multiple data sets will inflate variance estimates by introducing randomness into uncertain values; in other words, different values will be imputed into each of the five data sets, and the variance from these different values will be incorporated into the overall variance estimates. Five data sets is the industry standard, as that number allows for an accurate amount of variance inflation, while more data sets would not improve the estimates substantially.
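The donor-selection rules described above can be sketched as follows for a single imputed data set; in the actual process this would be repeated five times to create the multiply imputed data sets. The miniature data file, size-similarity tolerance, and helper function are hypothetical illustrations, not DMDC's production code.

```python
import pandas as pd

# Hypothetical miniature jurisdiction file; "value" stands in for one
# survey item with missing data.
df = pd.DataFrame({
    "jurisdiction": ["A", "B", "C", "D", "E"],
    "state":        ["WI", "WI", "WI", "MI", "MI"],
    "size":         [1_200, 1_350, 5_000, 1_300, 900],
    "value":        [40.0, None, 210.0, 55.0, None],
})

used_donors = set()  # each donor may be used at most once

def pick_donor(row, data, size_tolerance=0.5):
    """Pick a donor: a complete case, same state preferred, similar in size."""
    donors = data[data["value"].notna() & ~data.index.isin(used_donors)]
    same_state = donors[donors["state"] == row["state"]]
    pool = same_state if not same_state.empty else donors
    # Prefer donors whose size is within the tolerance band of the recipient.
    close = pool[(pool["size"] - row["size"]).abs() <= size_tolerance * row["size"]]
    pool = close if not close.empty else pool
    donor = pool.sample(1).iloc[0]   # simple random selection from the pool
    used_donors.add(donor.name)
    return donor["value"]

for idx, row in df[df["value"].isna()].iterrows():
    df.loc[idx, "value"] = pick_donor(row, df)

print(df)
```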

Data security. The 2012 Post-Election Voting Survey of State and Local Election Officials does not collect or use personally identifiable information and data are not retrieved by personal identifier. Therefore, the information collected is not subject to the Privacy Act of 1974, as amended. Only aggregate data will be reported in the form of statistical summaries.

The network sites for both DMDC and DMDC’s survey contractor, Data Recognition Corporation (DRC), are secure and password protected. Security is strictly enabled by using physical and software access restrictions. All servers are physically located in locked rooms with access permitted only to Technical Services staff through the use of a security card system. Access to the network is allowed only through a login account and password. In addition, employees use password protected screen savers at workstations to protect their systems while they are away from their desks. At DMDC, the network is accessed through the use of Common Access Card (CAC) readers and utilizes Public Key Infrastructure (PKI) security. Logging on to the network requires both physical possession of the CAC and a separately issued Personal Identification Number. All computer systems comply with current Federal Information Security Management Act security standards.

DRC makes daily backup tapes that are stored for five years in fire proof vaults located within a security-card protected area, and all provisions dealing with the protection of human subjects and data security are in force for as long as the contractor retains any protected data. DRC meets the Department of Defense Information Assurance Certification and Accreditation Process (DIACAP) requirements and conducts all necessary updates on an on-going basis to continue to meet DIACAP requirements. DRC’s facilities have an Authority to Operate (ATO) issued by DMDC.

Unusual problems requiring specialized sampling. No unusual problems that would require specialized sampling for any of the component parts of the 2012 Post-Election Voting Survey of State and Local Election Officials are anticipated.

Use of periodic data collection cycles to reduce burden. This request is for all data collection to take place after the 2012 federal election, and therefore it cannot be reduced further.


B.3. Non-response, Maximization of Response Rates, and Accuracy and Reliability

Encouraging response. To maximize response rates, DMDC has developed a set of survey-specific postal letters and e-mails that will be sent out to all selected survey participants (see Section B.2 on participant recruitment) throughout the field period. The letters will be on FVAP letterhead, signed by FVAP’s Acting Director, and whenever possible will be personalized; that is, addressed to the specific SEOs and LEOs identified by FVAP as the contact persons for their state or jurisdiction. The selected individuals will be notified of the upcoming survey by a postal notification letter approximately one to two weeks prior to the start of data collection. The letter explains the purpose of the survey, includes a telephone number the individual can call to confirm the legitimacy of the survey, and informs the individual that participation is voluntary. The notification letter will be followed by an announcement e-mail, sent immediately prior to the start of data collection, informing the LEOs and SEOs that data collection has started and providing clear instructions on how to participate. Throughout the field period, the non-responding LEOs and SEOs will be sent reminder e-mails and reminder postal letters with the same information. DRC now has the capability of sending out e-mails that are digitally signed on behalf of the Department of Defense, and it is expected that this feature will add to the legitimacy of the surveys.

In addition to sending reminder e-mails and postal letters, the FVAP Call Center, an independent professional call center with a dedicated set of managers and telephone staff, will conduct reminder telephone calls to non-responding Local Election Officials to encourage response and to answer questions they may have about the survey. To make the phone calls as effective as possible, the DMDC survey operations contractor, DRC, will design a CAPI-based Web interface through which the FVAP Call Center staff can connect directly to the Case Control system maintained by DRC and updated regularly to exclude completed cases. In this way, the respondent’s contact information, as well as other case-specific information such as the respondent’s ticket number, can be accessed directly by the Call Center staff. Beyond reminding the LEO about the survey, the Call Center staff can also enter information directly into the control system, such as a new e-mail or postal address for the respondent, whether the respondent would like another copy of the FVAP letter sent to them, and whether the respondent needs the URL for the survey Web site or their individual ticket number. The caller can also provide the respondent with the phone number for DRC, if the respondent needs technical assistance, or for FVAP, if the respondent has specific questions about a given survey item. In short, the Call Center will provide any assistance it can to help facilitate data collection and encourage response.
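To illustrate the kind of record the Call Center interface would read and update, the following is a minimal sketch in Python. The class, field, and function names are hypothetical illustrations invented for this sketch; they do not describe DRC’s actual Case Control system.

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Hypothetical illustration of a case control record of the kind the
    # Call Center interface might read and update; all names are invented
    # for this sketch and do not reflect DRC's actual system.
    @dataclass
    class CaseRecord:
        ticket_number: str            # respondent's individual ticket number
        jurisdiction: str
        email: str
        postal_address: str
        completed: bool = False       # completed cases are excluded from reminders
        resend_fvap_letter: bool = False
        notes: List[str] = field(default_factory=list)

    def log_reminder_call(case: CaseRecord,
                          new_email: Optional[str] = None,
                          wants_letter: bool = False,
                          note: str = "") -> None:
        """Record the outcome of a reminder call directly in the control system."""
        if case.completed:
            return                    # completed cases receive no further reminders
        if new_email:
            case.email = new_email    # update contact information on the spot
        case.resend_fvap_letter = wants_letter
        if note:
            case.notes.append(note)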

Telephone scripts are being developed by DMDC and Call Center managers for handling the reminder telephone calls, and DMDC will train the Call Center staff on the use of the Web interface and how to use the reminder call script. As part of their usual practice, the Call Center managers will be periodically monitoring their staffs’ calls, and DMDC will meet with the managers to assess how the phone calls are proceeding and whether any changes are needed to make the calls more effective. The Call Center has sufficient capacity to conduct reminder calls to all non-responding LEOs throughout the field period. The reminder telephone call flow chart is included as Attachment 4d.

To capitalize on the professional working relationships FVAP has established with the SEOs, FVAP will contact the non-responding SEOs during the six-week field period to remind them of the survey, answer any questions, and address any concerns they may have about the data collection.

It is very important for a data collection effort of this kind to obtain buy-in from the key stakeholders and populations of interest to the extent possible. To that end, Robert Carey, the FVAP Director at that time, attended the January 2012 conference of the National Association of State Election Directors (NASED). NASED comprises all the SEOs, and the conference provided an excellent opportunity to both publicize and legitimize the survey. Mr. Carey made a presentation at the conference on the 2012 surveys, and in this forum the SEOs and several LEOs had an opportunity to hear and ask questions about the upcoming surveys. Mr. Carey addressed questions about the quantitative and qualitative survey instruments, their methodology, how and when potential respondents would be contacted, and how the quantitative FVAP survey differs from the EAC survey. In addition, Mr. Carey informed the conference participants that both the quantitative and qualitative survey instruments would be posted on the FVAP Web site to allow stakeholder input. Prior to this conference, a Memorandum of Understanding (MOU), described in Section A.4., Efforts to Identify Duplication, was developed among the three main organizations (FVAP, EAC, and NASED); it discusses the survey effort for 2012 and states that FVAP and EAC will produce a single survey in 2014.

Analysis of survey non-response. DMDC and OMB are concerned with the general decline in response rates in the survey industry overall, and in DoD-sponsored surveys in particular, and with what impact, if any, such a decline may have on the validity of the reported estimates. To address this concern, OMB issued standards and guidelines for federal statistical surveys requiring that, for any survey with a response rate below 80 percent, the survey agency conduct a non-response analysis.
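As a point of reference, the response rate invoked by that standard is, in its simplest unweighted form, completed cases divided by eligible sampled cases; operational definitions of eligibility follow detailed case dispositions not reproduced here. A minimal sketch, with purely hypothetical counts, follows:

    def response_rate(completes: int, eligible_sampled: int) -> float:
        """Unweighted response rate: completed cases over eligible sampled cases."""
        return completes / eligible_sampled

    # Hypothetical counts for illustration only.
    rr = response_rate(completes=850, eligible_sampled=1200)
    if rr < 0.80:  # the OMB threshold that triggers a non-response analysis
        print(f"Response rate {rr:.1%} is below 80%; conduct a non-response analysis.")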

As part of the OMB approval for the 2010 Post-Election Survey of Local Election Officials, and listed under Terms of Clearance5, was the instruction to provide OMB with the results of a non-response bias analysis of the 2010 data. Submitted as Attachment 6 of this Supporting Statement is a Survey Note entitled, “2010 Post-Election Voting Survey of Local Election Officials: Nonresponse Bias Study” which reports the results of that analysis. The study included telephone data collection of non-respondents using an abbreviated form of the 2010 survey. In general, differences in the means of the unweighted data between the actual 2010 survey and the non-response telephone survey suggest that non-response bias is present in some of the 2010 questions, but that this varies by jurisdiction. The Survey Note suggests a strategy for the 2012 survey whereby the nonresponse bias can be accounted for in a more systematic fashion for all the questions. Please see Attachment 6 for a detailed discussion of the nonresponse bias analysis.
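A standard way to quantify bias of this kind is to multiply the nonrespondent share by the difference between the respondent and nonrespondent means, with the nonrespondent values coming from the abbreviated telephone follow-up described above. The sketch below illustrates that decomposition; all numbers are hypothetical and are not taken from Attachment 6.

    from statistics import mean

    def nonresponse_bias(respondent_values, nonrespondent_values):
        """
        Estimate nonresponse bias in the respondent mean via the standard
        decomposition: bias = (nonrespondent share) x (respondent mean
        minus nonrespondent mean).
        """
        n_r, n_nr = len(respondent_values), len(nonrespondent_values)
        share_nr = n_nr / (n_r + n_nr)
        return share_nr * (mean(respondent_values) - mean(nonrespondent_values))

    # Hypothetical unweighted item values for illustration only.
    bias = nonresponse_bias([120, 95, 130, 110], [80, 70, 90])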

To monitor bias of this sort in the 2012 survey, DMDC will analyze response rates and item missing-data rates by stratum and state to determine whether there are patterns that could indicate nonresponse bias, and will compare responses to previous iterations of the survey to determine the influence of nonresponse on estimates. During the weighting and imputation process, DMDC will also divide the population into strata based on jurisdiction size and type, which should increase the probability that data not provided by jurisdictions are missing at random. Creating these more homogeneous groups should allow respondents to more readily represent the other jurisdictions within the same stratum. To determine whether data are missing at random, DMDC will use capture-recapture methods for a closed population to combine data from the 2008, 2010, and 2012 iterations of the survey, as presented by Fritz Scheuren and Eric Falk in their June 19, 2012, Washington Statistical Society seminar. Although the total number of registered voters is not strongly correlated with the number of UOCAVA voters, stratification by size and type should still ensure that donor jurisdictions are similar to the recipient jurisdictions, so imputed values should be subject to minimal nonresponse bias.
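For readers unfamiliar with capture-recapture, the simplest two-sample version is the Lincoln-Petersen estimator; the sketch below applies it to jurisdictions responding in two survey iterations. The counts are hypothetical, and the Scheuren-Falk approach presented at the seminar may differ in its details.

    def lincoln_petersen(n1: int, n2: int, m: int) -> float:
        """
        Two-sample capture-recapture estimate of a closed population size:
        n1 units respond in the first iteration, n2 in the second, and m
        respond in both. The estimate is N-hat = n1 * n2 / m.
        """
        if m == 0:
            raise ValueError("No overlap between samples; estimator is undefined.")
        return n1 * n2 / m

    # Hypothetical: 4,000 jurisdictions respond in 2010, 4,200 in 2012,
    # and 3,500 respond in both, giving an estimated 4,800 in total.
    n_hat = lincoln_petersen(4000, 4200, 3500)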


B.4. Tests of Procedures

DMDC utilizes best practices in its design of Web-based surveys (e.g., visual presentation of questions and response options, usability and interactive elements, use of color, font style and size, and screen layout). These features of Web-based data collection, in addition to automated skip logic, serve to ease the burden on respondents, increase data quality, and minimize response error. Similarly, DMDC has incorporated a number of randomized experiments into its surveys over the past several years, testing such conditions as the number and wording of letters and e-mails, the timing of respondent contacts, the presence or absence of a brochure, sponsorship, and subject-line text. As a result of these experiments, DMDC has developed field procedures grounded in experience and empirical findings. In addition to these tests and experiments, DMDC conducted the 2010 Post-Election Voting Survey of Local Election Officials, and, as stated above, based on those returns, revisions were made to questions that did not seem to be generating reliable or valid data, appeared confusing, or were formatted in a way that proved cumbersome to respondents. For example, to improve data quality since the 2008 survey and to avoid generating data from which poorly informed policy decisions might be made, respondents can use a drop-down Yes/No box where a No selection automatically greys out the cells for questions the respondent may skip. Respondents can also select a “No Data” box if the state or jurisdiction does not keep records in the way the question describes. Finally, problematic questions from 2010 were dropped completely.
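The gating behavior described above can be summarized in a few lines. The following is a purely illustrative sketch of that skip logic; the production instrument implements this behavior in its Web survey software, not in code like this.

    # Purely illustrative sketch of gating dependent items on a Yes/No answer.
    def dependent_items_enabled(gate_answer: str) -> bool:
        """A 'No' on the gate question greys out (disables) the follow-up cells."""
        return gate_answer == "Yes"

    def record_response(gate_answer: str, follow_up_value, no_data: bool):
        if no_data:
            return "NO_DATA"   # jurisdiction does not keep records this way
        if not dependent_items_enabled(gate_answer):
            return "SKIPPED"   # cells greyed out; item legitimately skipped
        return follow_up_value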


B.5. Individuals Consulted on Statistical Aspects and Individuals Consulting and/or Analyzing Data

FVAP Principal Investigator: Pamela Mitchell

Acting Director, Federal Voting Assistance Program

571-371-0727

[email protected]

FVAP Deputy Director for Research and Assessments: Paul Drugan

571-372-0751

[email protected]

FVAP Deputy Director of Election Official Assistance: Paddy McGuire

571-372-0739

[email protected]

FVAP Technology Programs: Samantha Walker

571-372-0741

[email protected]

DMDC Program Manager/Team Lead: Fred Licari

571-372-1102

[email protected]

DMDC Principal Investigator: Kristin Williams

571-372-1033

[email protected]

DMDC Survey Statisticians:

  • David McGrath - 571-372-0983

  • Eric Falk - 571-372-1098

  • Timothy Markham - 571-372-1126

  • Fawzi Al Nassir - 571-372-1114

DMDC Analysts:

  • Elizabeth Davis - 571-372-1105

  • Margaret Coffey - 571-372-1115


Survey Operations Contractor: Data Recognition Corporation (DRC) (800-826-2368): Contact person: Valerie Waller (763-268-2166)

1 In 2010, State Election Officials were not contacted, but instead the Local Election Officials were asked to answer questions designed to collect both the quantitative as well as the qualitative information.

2 An estimate of 25% as the cost of fringe benefits was taken from a review of two papers available from the U.S. Bureau of Labor Statistics website: 1) Report on the American Workforce, U.S. Department of Labor, Elaine L. Chao, Secretary, 2001; and 2) “The Growth of fringe benefits: implications for social security” by Yung-Ping Chen.

3 EAC conducts its voting surveys using this method (i.e., spreadsheet and Instruction Booklet), and DMDC is modeling its data collection methods on this process, a process with which SEOs are already familiar.


4 Because the data are collected on a Web site, the Web site is required to include Security Protection Advisory information according to the Office of the Secretary of Defense Policy for Establishing and Maintaining a Publicly Accessible Department of Defense Web Information Service (dated July 18, 1997; updated January 9, 1998).

5 See ICR Reference Number: 201011-0704-001; OMB Control Number: 0704-0125
