OMB: 0607-0992
2020 Census Operational Plan
A New Design for the 21st Century
Issued November 2015
Version 1.1

U.S. Department of Commerce
Economics and Statistics Administration
U.S. CENSUS BUREAU

census.gov


TABLE OF CONTENTS
1.	 Introduction  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	1
1.1	Purpose  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	1
1.2	 Design Approach  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	1
1.3	 Document Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	2
1.4	 Document Development Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .	3
1.5	 Document Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	4
2.	 The 2020 Census Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	5
2.1	 Purpose, Goal, and Challenge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	5
2.2	 Uses of Decennial Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	5
2.3	 The Changing Environment and Escalating Costs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	6
2.4	 Four Key Innovation Areas  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	8
2.5	 A New Design for the 21st Century  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	8
2.6	 The 2020 Census Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	9
3.	 The Four Key Innovation Areas  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	15
3.1	 Reengineering Address Canvassing  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	16
3.2	 Optimizing Self-Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	18
3.3	 Utilizing Administrative Records and Third-Party Data  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	22
3.4	 Reengineering Field Operations  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	26
3.5	 Summary of Innovations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	29
4.	 Key Tests, Milestones, and Production Dates  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	33
4.1	 Tests to Inform the Operational Design and Prepare for Conducting the Census  . . . . . . . . . 	33
	4.1.1	 Tests in 2012–2014 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	34
	4.1.2	 Tests in 2015  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	40
	4.1.3	 Tests in 2016  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	45
	4.1.4	 Tests in 2017  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	47
	4.1.5	 Tests in 2018  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	48
	4.1.6	 Tests in 2019  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	49
4.2	 Key Decision Points and Milestones . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	49
4.3	 2020 Census Production Operational Schedule . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	51
5.	 The 2020 Census Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	55
5.1	 Operations Overview  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	55
	5.1.1	 Frame . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	58
	5.1.2	 Response Data  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	58
	5.1.3	 Publish Data  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	58
5.2	 Program Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	59
	5.2.1	 Program Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	59
5.3	 Census/Survey Engineering  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	64
	5.3.1	 Systems Engineering and Integration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	64
	5.3.2	 Security, Privacy, and Confidentiality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	67
	5.3.3	 Content and Forms Design  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	70
	5.3.4	 Language Services  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	73
5.4	 Frame  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	76
	5.4.1	 Geographic Programs  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	76
	5.4.2	 Local Update of Census Addresses  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	82
	5.4.3	 Address Canvassing  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	86


5.5	 Response Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	89
	5.5.1	 Forms Printing and Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	90
	5.5.2	 Paper Data Capture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	92
	5.5.3	 Integrated Partnership and Communications  . . . . . . . . . . . . . . . . . . . . . . 	95
	5.5.4	 Internet Self-Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	97
	5.5.5	 Non-ID Processing  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	102
	5.5.6	 Update Enumerate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	105
	5.5.7	 Group Quarters  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	107
	5.5.8	 Enumeration at Transitory Locations  . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	110
	5.5.9	 Census Questionnaire Assistance  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	112
	5.5.10	Nonresponse Followup  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	115
	5.5.11	Response Processing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	122
	5.5.12	Federally Affiliated Americans Count Overseas  . . . . . . . . . . . . . . . . . . . . . 	126
5.6	 Publish Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	127
	5.6.1	 Data Products and Dissemination  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	127
	5.6.2	 Redistricting Data Program  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	129
	5.6.3	 Count Review  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	131
	5.6.4	 Count Question Resolution  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	133
	5.6.5	 Archiving  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	135
5.7	 Other Censuses  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	136
	5.7.1	 Island Areas Censuses  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	136
5.8	 Test and Evaluation  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	138
	5.8.1	 Coverage Measurement Design and Estimation  . . . . . . . . . . . . . . . . . . . . 	138
	5.8.2	 Coverage Measurement Matching . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	140
	5.8.3	 Coverage Measurement Field Operations . . . . . . . . . . . . . . . . . . . . . . . . . . 	142
	5.8.4	 Evaluations and Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	144
5.9	 Infrastructure  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	146
	5.9.1	 Decennial Service Center . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	147
	5.9.2	 Field Infrastructure  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	149
	5.9.3	 Decennial Logistics Management  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	152
	5.9.4	 IT Infrastructure  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	154

6.	 Key Program-Level Risks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	159
6.1	 Funding Requests Not Realized  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	160
6.2	 Reengineering Address Canvassing Operation  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	160
6.3	 Administrative Records and Third-Party Data—External Factors  . . . . . . . . . . . . . . . . . . . . . . 	160
6.4	 Public Perception of Ability to Safeguard Response Data  . . . . . . . . . . . . . . . . . . . . . . . . . . . 	161
6.5	 CyberSecurity Incidents  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	161
6.6	 Enterprise IT Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	162
6.7	 Technological Innovations Surfacing After Design Is Finalized  . . . . . . . . . . . . . . . . . . . . . . . 	162
6.8	 Data Quality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	162
6.9	 Late Operational Design Changes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	163
6.10	Administrative Records and Third-Party Data—Access and Constraints . . . . . . . . . . . . . . . . . 	163
6.11	Policy Impacts  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	164
7.	 Quality Analysis  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	165
7.1	 Reengineered Address Canvassing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	165
7.2	 Optimizing Self-Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	169
7.3	 Utilizing Administrative Records and Third-Party Data for Nonresponse Followup  . . . . . . . . 	170
7.4	 Future Plans . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	177

ii 2020 Census Operational Plan—Version 1.1	

U.S. Census Bureau

8.	 Life-Cycle Cost Estimate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	179
9.	 Approval Signature  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	181
10.	Document Logs  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	183
10.1	Sensitivity Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	183
10.2	Review and Approvals  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	183
10.3	Version History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	183

Appendix: List of Acronyms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	185

LIST OF FIGURES
Figure 1: Approach to the Operational Design  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	1
Figure 2: 2020 Census Program Documentation Structure  . . . . . . . . . . . . . . . . . . . . . . . . . 	2
Figure 3: Organizations and Governance Boards for the 2020 Census Operational Plan . . . 	3
Figure 4: 2020 Census Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	6
Figure 5: Costs—Traditional vs Innovative 2020 Census . . . . . . . . . . . . . . . . . . . . . . . . . . . 	7
Figure 6: The 2020 Census—A New Design for the 21st Century . . . . . . . . . . . . . . . . . . . . 	9
Figure 7: Operations by Work Breakdown Structure  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	14
Figure 8: Summary of Reengineering Address Canvassing  . . . . . . . . . . . . . . . . . . . . . . . . . 	16
Figure 9: Operations That Contribute to Reengineering Address Canvassing  . . . . . . . . . . . 	17
Figure 10: Summary of Optimizing Self-Response . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	19
Figure 11: Operations That Contribute to Optimizing Self-Response . . . . . . . . . . . . . . . . . . 	20
Figure 12: Summary of Utilizing Administrative Records and Third-Party Data . . . . . . . . . . 	22
Figure 13: Operations That Contribute to Utilizing Administrative Records and Third-Party Data . . . . 	24
Figure 14: Summary of Reengineering Field Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . 	26
Figure 15: Operations That Contribute to Reengineering Field Operations . . . . . . . . . . . . . 	27
Figure 16: Operations With Significant Innovations Since the 2010 Census  . . . . . . . . . . . . 	30
Figure 17: High-Level View of Tests  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	33
Figure 18: Tests in 2012–2014 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	34
Figure 19: Tests and Key Decisions in 2015 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	40
Figure 20: Tests Planned in 2016 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	45
Figure 21: Schedule for the 2017 Census Test  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	47
Figure 22: Schedule for the 2018 Census End-to-End Test  . . . . . . . . . . . . . . . . . . . . . . . . . 	48
Figure 23: Defect Resolution and Performance Tests in 2019 . . . . . . . . . . . . . . . . . . . . . . . 	49
Figure 24: Key Decision Points and Milestones . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	50
Figure 25: 2020 Census Operations—Production Timeline . . . . . . . . . . . . . . . . . . . . . . . . . 	52
Figure 26: High-Level Integrated Schedule  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	53
Figure 27: Operational Overview by Work Breakdown Schedule . . . . . . . . . . . . . . . . . . . . . 	56
Figure 28: High-Level Integration of Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	57
Figure 29: Program Management Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	60
Figure 30: Summary of Geographic Programs Components  . . . . . . . . . . . . . . . . . . . . . . . . 	77
Figure 31: Geographic Programs Timeline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .	82
Figure 32: Paper Data Capture Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	93
Figure 33: Response Processing Operation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	124
Figure 34: 2020 Census Program-Level Risk Matrix  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	159
Figure 35: The Hybrid Administrative Records Use . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	172
Figure 36: Simulated Household Population Distribution  . . . . . . . . . . . . . . . . . . . . . . . . . . 	173
Figure 37: Simulated Occupied and Vacant Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . 	174
Figure 38: Simulated Distribution of 2010 NRFU Fieldwork Cases  . . . . . . . . . . . . . . . . . . . 	175

Figure 39: Simulated 2010 Number of Contacts  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	175
Figure 40: Both Race and Hispanic Origin Not Reported  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	177

LIST OF TABLES
Table 1: Operations and Purpose. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	10
Table 2: Description of Operations That Contribute to Reengineering Address Canvassing . . . . . . . . . 	18
Table 3: Description of Operations That Contribute to Optimizing Self-Response . . . . . . . . . . . . . . . . 	21
Table 4: Description of Operations That Contribute to Utilizing Administrative Records
and Third-Party Data  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	25
Table 5: Description of Operations That Contribute to Reengineering Field Operations  . . . . . . . . . . . 	28
Table 6: Summary of Key Innovations by Operation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	31
Table 7: Operational Tests  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	34
Table 8: Estimated Missed Adds and Missed Deletes by Percentage of In-Field Address Canvassing . . 	166
Table 9: Estimated Missed Adds and Missed Deletes by In-Office Address Canvassing Success Rate . . 	167
Table 10: Estimated Missed Adds and Missed Deletes by Percentage of Added and
Deleted Addresses in the Initial Frame . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	168
Table 11: 2010 Census Correct Enumerations by Operation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	169
Table 12: Estimated Correct Person Enumerations  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	170
Table 13: Resolution Status Projection  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 	176


1. Introduction
1.1	PURPOSE
The U.S. Census Bureau’s 2020 Census Operational
Plan documents the current design for conducting the 2020 Census. As the initial version of an
emerging concept of operations, it reflects and
supports evidence-based decision making by
describing design concepts and their rationale,
identifying decisions still to be made, and describing significant issues and risks related to the implementation of the Operational Plan.

1.2	 DESIGN APPROACH
As shown in Figure 1, the operational design comprises a set of design decisions that drive how the
2020 Census will be conducted. These design decisions are informed through research, testing, and
analysis of the cost and quality impacts of different
design options. The operational design also drives
the requirements for Information Technology (IT)
capabilities and acquisitions.


The 2020 Census is being designed and developed
on a rolling schedule. Accordingly, this process is
iterative. Preliminary design decisions have been
made based on early research, testing, and analysis, and these have been used to determine initial
requirements for capabilities and acquisitions. As
the design matures and more decisions are finalized, the requirements will be updated to reflect
the revised design.
An important aspect of the design approach for the
2020 Census is an increased reliance on enterprise standards and solutions. Specifically, the
design of all information technology capabilities
adheres to the Enterprise Systems Development
Life Cycle (eSDLC) and IT Guiding Principles.
Furthermore, the 2020 Census Program’s budget,
schedule, and work activities align with the eSDLC/
Mission Enabling and Support Work Breakdown
Structure (WBS). The 2020 Census design also
leverages enterprise-shared services, including the

[Figure 1: Approach to the Operational Design. Research and testing results, together with analysis of the quality and cost impacts of different design options, provide the data necessary to inform the design decisions; the design decisions, shaped by needs and trade-offs, in turn drive the operations design requirements and the requirements for IT capabilities and acquisitions.]
[Figure 2: 2020 Census Program Documentation Structure. The diagram shows five tracks of the 2020 Census operational design—Operational Design; Research, Testing, and Analysis; Program Management; Systems Engineering and Integration (SE&I); and IT Solutions—and the documentation each produces: the 2020 Census Operational Plan Executive Summary, the 2020 Census Operational Plan (shaded in yellow), detailed operational plans, other studies and reports, operational plan briefing materials, plans, management plans, results, artifacts, architectures, capabilities, designs, life-cycle cost estimates, and the rebaselined schedule.]

Census Enterprise Data Collection and Processing
(CEDCaP) solution and the Center for Enterprise
Dissemination Services and Consumer Innovation
(CEDSCI) solution.1 These two initiatives provide the
technology solutions required to support significant
portions of the innovations for the 2020 Census.

1.3 DOCUMENT SCOPE
This document is the initial baseline version of the
2020 Census Operational design and covers all
operations required to execute the 2020 Census,
starting with precensus address and geographic
feature updates, and ending once census data
products are disseminated and coverage and
quality are measured. It describes what will be
done during the 2020 Census and, at a high level, how the work will be conducted. Additional
specifics of how each operation will be performed
are documented in individual detailed operational
plans, which are being created on a rolling schedule. These detailed plans will include the business
process models and requirements that have been
developed for each operation.

1
 Throughout this document, references are made to specific
systems that are part of CEDCaP. These are the systems being
used to support the early 2020 Census tests; however, final decisions on production systems have not been made.

While this document is a comprehensive plan, the
initial research and testing phase focused on those
areas that provided the greatest opportunity for
cost savings. The maturity level of the plan varies
by operation. For each operation, the plan presents the decisions made to date and the decisions
that still need to be made. Research and testing
to refine and improve all operations will continue
through the end-to-end test in 2018.
As shown in Figure 2, this Operational Plan, shaded
in yellow, is part of a broader set of documentation
for the 2020 Census Program that will be developed as the Program matures. Those items outlined
in dark blue (i.e., the 2020 Census Operational Plan
Executive Summary, the Operational Plan Briefing
Materials, the Life-cycle Cost Estimates, and the
Rebaselined Schedule) are being completed in conjunction with this Plan.
[Figure 3: Organizations and Governance Boards for the 2020 Census Operational Plan. The 2020 Census Executive Steering Committee and the Decennial Leadership Group govern the 2020 Census Operational Plan and the 2020 Census Detailed Operational Plans (one for each production operation). Contributing organizations include the Decennial Census Management Division (Budget Formulation and Schedule Branches; Testing Branch), the Decennial Statistical Studies Division Quality Analysis Team, and cross-organizational groups: Operational Integrated Project Teams (IPTs), the Operational Plan Team, and Operational Subject Matter Experts. The cross-organizational Operational Plan Team draws from the Communications, Field, Information Technology, Research and Methodology, and Decennial Census Programs Directorates.]

1.4 DOCUMENT DEVELOPMENT PROCESS
Many organizations across the Decennial Census
Directorate and the Census Bureau have been
involved in developing the 2020 Census operational
design. Figure 3 illustrates these organizations. The
Operational Plan Team consists of subject matter
experts from the key Census Bureau organizations with significant roles in supporting the 2020
Census. This team, supplemented with additional
subject matter experts from across the Census
Bureau, plays a key role in identifying research
needs, preparing for and analyzing the results of
tests, and recommending design decisions. The
Decennial Census Management Division is leading
the development of the schedule, life-cycle cost
analysis, and the testing. The Decennial Statistical
Studies Division is leading the quality analysis. The
2020 Census Operational Plan has been reviewed
and approved by both the Decennial Leadership
Group and the 2020 Census Executive Steering
Committee. Over the next 2 years, Operational
Integrated Project Teams are developing detailed
Operational Plans for each production operation.


1.5 DOCUMENT ORGANIZATION
This document is organized into eight sections:
1.	 Introduction
2.	 The 2020 Census Overview
3.	 The Four Key Innovation Areas
4.	 Key Tests, Milestones, and Production Dates
5.	 The 2020 Census Operations
6.	 Key Program-Level Risks
7.	 Quality Analysis
8.	 Life-Cycle Cost Estimate

Section 5 describes each of the 34 census operations and constitutes the bulk of this Operational
Plan.


2. The 2020 Census Overview
2.1 PURPOSE, GOAL, AND CHALLENGE
The purpose of the 2020 Census is to conduct a
census of population and housing and disseminate
the results to the President, the states, and the
American people. The goal of the 2020 Census
is to count everyone once, only once, and in the
right place, and the challenge is to conduct a 2020
Census at a lower cost per household (adjusted for
inflation) than the 2010 Census, while maintaining
high quality results.

2.2 USES OF DECENNIAL DATA
As the operational design of the 2020 Census is
finalized, it is important to keep in mind the purpose
of the 2020 Census and how the data will be used.
The primary requirement of the decennial census is
the apportionment of seats allocated to the states
for the House of Representatives. This requirement
is mandated in the U.S. Constitution:
Article I, Section 2, “The actual enumeration
shall be made within three years after the first
meeting of the Congress of the United States,
and within every subsequent Term of ten Years”
Fourteenth Amendment, Section 2,
“Representatives shall be apportioned among
the several States according to their respective
numbers, counting the whole number of persons in each State”
Decennial data at the census block level are used
by governmental entities for redistricting, i.e.,
defining the representative boundaries for congressional districts, state legislative districts, school
districts, and voting precincts. Additionally, decennial data are used to enforce voting rights and civil
rights legislation.

U.S. Census Bureau

	

The Census Bureau also uses the decennial census
results to determine the statistical sampling frames
for the American Community Survey (ACS), which
replaced the long form in the decennial census and
is part of the Decennial Program, and the dozens of current surveys conducted by the Census
Bureau. The results of these surveys are used to
support important government functions, such as
appropriating federal funds to local communities
(an estimated $400 billion annually); producing
monthly unemployment, crime, and poverty rates;
and publishing health and education data.
Finally, decennial data play an increasingly important role in U.S. commerce and the economy. As
people expand their use of data to make decisions
at the local and national levels, they increasingly
depend on data from the Census Bureau to make
these decisions. Today, local businesses look at
data provided by the Census Bureau on topics
like population growth and income levels to make
decisions about whether or where to locate their
restaurants or stores. Similarly, a real estate investor, who is considering investing significant funds
to develop a piece of land in the community, relies
on Census Bureau data to measure the demand for
housing, predict future need, and review aggregate trends. Big businesses also rely heavily on
Census Bureau data to make critical decisions that
impact their success and shape the economy at
the national level. As noted above, the decennial
census is the foundation for the Census Bureau’s
demographic survey data.
The decennial data must meet high quality standards to ensure good decision-making and to
continue building confidence in the government,
society, and the economy. Studying the balance
between cost and quality is an increasing focus of
the census design in the 2016–2018 years.

2020 Census Operational Plan—Version 1.1 5

Figure 4: 2020 Census Environment

The 2020 Census design must be able to incorporate advancements in technology in a way that minimizes risk and ensures an accurate count. The figure depicts the environmental factors surrounding the 2020 Census: a mobile population; informal, complex living arrangements; an increasingly diverse population; a constrained fiscal environment; declining response rates; rapidly changing use of technology; an information explosion; and distrust in government.

2.3 THE CHANGING ENVIRONMENT
AND ESCALATING COSTS
The 2020 Census challenge is exacerbated by multiple environmental factors that have the potential
to impact its success. The Census Bureau is committed to proactively addressing the challenges
that follow (see Figure 4):
•• Constrained fiscal environment: Budget
deficits place significant pressure on funding
available for the research, testing, design, and
development work required for successful
innovation.
•• Rapidly changing use of technology:
Stakeholders expect the decennial census to
use technology innovation, yet the rapid pace of
change makes it challenging to plan for and adequately test the use of these technologies before
they become obsolete.
•• Information explosion: Rapid changes in
information technology create stakeholder
expectations for how the Census Bureau
interacts with the public to obtain and
disseminate data products.

•• Distrust in government: Concerns continue
to grow about information security and privacy,
the confidentiality of information given to the
government, and how government programs
will use the information it collects. This makes it
more difficult to collect important demographic
survey information.
•• Declining response rates: Response rates for
Census Bureau surveys, and for surveys and censuses in general, have declined as citizens are
overloaded with requests for information and
become increasingly concerned about sharing
information.
•• Increasingly diverse population: The demographic and cultural make-up of the United
States continues to increase in complexity,
resulting in a growing number of households
and individuals who do not speak English as
their native language, who have a wide variety
of cultural traditions and mores, and who may
have varying levels of comfort with government
involvement.

Figure 5: Costs—Traditional vs Innovative 2020 Census

Census year: Housing units (millions) / Total cost (2020 constant dollars)
1970: 69.5 / $1.1 billion
1980: 89.5 / $3.0 billion
1990: 103.5 / $4.7 billion
2000: 117.5 / $9.4 billion
2010: 133.5 / $12.3 billion
2020 (projected): 142.9 / $17.8 billion* (traditional design) or $12.5 billion (innovative design)

Cost of repeating the 2010 Census in 2020 = $124 per housing unit.
Expected 2020 Census cost = $88 per housing unit.

* Estimate based on 2020 Census life-cycle cost if the 2010 approach is used.
Note: Figures through 2020 shown in 2020 constant dollars.
Source: U.S. Census Bureau.

•• Informal, complex living arrangements:
Households are becoming more diverse and
dynamic, making it a challenge to associate an
identified person to a single location. For example, blended families may include children who
have two primary residences. Additionally, some
households include multiple relationships and
generations.
•• A mobile population: The United States continues to be a highly mobile nation as about 12
percent of the population moves in a given year,
based on results from the ACS conducted in
2012–2013 and 2013–2014. Continued growth
in the use of cellular telephone technology and
an associated reduction in landline telephones
tied to physical locations may also complicate
enumeration.


Several of the societal, demographic, and technological trends listed above can result in a population
that is harder and more expensive to enumerate. As it becomes more challenging to locate
individuals and solicit their participation through
traditional methods, the Census Bureau must
spend more money each decade simply to
maintain the same level of accuracy as in previous
censuses. As shown in Figure 5, on average, the
total costs—in constant dollars—of conducting the
decennial census have increased significantly each
decade. Initial estimates for expected total costs
for the 2020 Census are $17.8 billion if the Census
Bureau repeats the 2010 Census design and
methods. With the innovations described in this
Operational Plan, the Census Bureau estimates that
it can conduct the 2020 Census for $12.5 billion.
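The per-housing-unit figures in Figure 5 follow from these life-cycle totals and the projected housing unit count. The minimal sketch below illustrates the arithmetic; note that the Census Bureau's published per-unit values ($124 and $88) were derived from unrounded life-cycle estimates, so the rounded totals shown here reproduce them only approximately.

```python
# Rough check of the per-housing-unit costs in Figure 5 (2020 constant dollars).
# Inputs are the rounded figures stated in this plan.
housing_units_2020 = 142.9e6          # projected 2020 housing units (Figure 5)

traditional_total = 17.8e9            # repeating the 2010 design in 2020
innovative_total = 12.5e9             # planned innovative 2020 design

per_unit_traditional = traditional_total / housing_units_2020  # ~ $124.6
per_unit_innovative = innovative_total / housing_units_2020    # ~ $87.5

# Difference of the rounded totals is ~ $5.3 billion; the plan cites
# approximately $5.2 billion, reflecting unrounded inputs.
savings = traditional_total - innovative_total
```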


2.4	 FOUR KEY INNOVATION AREAS
With cost reductions in mind, the 2020 Census team focused on four Key Innovation Areas:

•• Reengineering Address Canvassing
•• Optimizing Self-Response
•• Utilizing Administrative Records and Third-Party Data
•• Reengineering Field Operations

Field costs associated with Address Canvassing and
Nonresponse Followup operations are the most
expensive parts of the 2020 Census. All four
innovation areas are aimed at reducing the costs
of fieldwork. A reengineered Address Canvassing
operation is expected to reduce the field workload
for address updating by 75 percent. Self-response
innovations, which are aimed at generating the
largest possible self-response rate, coupled with
the use of administrative records and third-party
data, are intended to reduce the field workload
associated with Nonresponse Followup. Finally,
the reengineered field operations are intended to
increase the efficiency of those operations, allowing managers and fieldworkers to be more productive and effective.
Each innovation area is described further in
Section 3.

2.5 A NEW DESIGN FOR THE 21ST
CENTURY
Figure 6 describes at a high level how the 2020
Census will be conducted. This design reflects a
flexible approach that takes advantage of new technologies and data sources while minimizing risk.
The first step in conducting the 2020 Census is to
identify all of the addresses where people could
live, or Establish Where to Count. An accurate
address list helps ensure that everyone is counted.
For the 2020 Census, the Census Bureau will begin
an in-office review of 100 percent of the nation’s
addresses in September 2015 and continually
update the address list based on data from multiple sources, including the U.S. Postal Service, tribal,
state, and local governments, satellite imagery, and
third-party data providers. This office work will
also determine which parts of the country require
fieldwork to verify address information. While
fieldwork will begin in 2016 on a small scale for
address coverage measurement, the bulk of the
In-Field Address Canvassing will begin in 2019 and
is anticipated to cover approximately 25 percent of
all addresses, a significant reduction from the 100
percent that were reviewed in the field during the
2010 Census.
As noted on page 6, response rates to surveys and
censuses have been declining. To Motivate People
to Respond, the 2020 Census will include a nationwide communications and partnership campaign.
This campaign is focused on getting people to
respond on their own (self-respond) as it costs
significantly less to process a response provided via
the Internet or through a paper form than it does
to send a fieldworker to someone’s home to collect
their response. Advertising will make heavy use of
digital media, tailoring the message to the audience.
The Census Bureau Counts the Population
by collecting information from all households,
including those residing in group or unique living
arrangements. The Census Bureau wants to make
it easy for people to respond anytime and anywhere. To this end, the 2020 Census will offer and
encourage people to respond via the Internet and
will not require people to enter a unique Census
identification with their response. Online responses
will be accurate, secure, and convenient. If people
are at the bus stop, waiting at the doctor’s office,
or watching TV and do not have their Census ID
handy, they can provide their address instead.
For those who do not respond, the Census Bureau
will use the most cost-effective strategy for contacting and counting people. The goal for the 2020
Census is to reduce the average number of visits
by using available data from government administrative records and third-party sources. These
data may be used to identify vacant households,
determine the best time of day to visit a particular
household, or to count the people and fill in the
responses with existing high-quality data from
trusted sources. A reduced number of visits will
lead to significant cost savings. It can also allow
the Census Bureau to focus its field resources to
achieve consistent response rates across geographic
areas and demographic groups.
Additional cost savings are expected to result from
the use of automation to streamline in-field census
taking. Fieldworkers will use handheld devices for
collecting the data. Operations such as recruiting,
training, and payroll will be automated, reducing
the time required for these activities. New operational
control centers will rely on automation to
manage the work, enabling more efficient case
assignment, automatic determination of optimal
travel routes, and reduction of the number of
physical offices. In general, a streamlined operation
and management structure is expected to increase
productivity and save costs.

Figure 6: The 2020 Census—A New Design for the 21st Century

The 2020 Census Operational Overview: Count everyone once, in the right place.

ESTABLISH WHERE TO COUNT: Identify all addresses where people could live. Conduct a 100-percent review and update of the nation’s address list; minimize in-field work with in-office updating; use multiple data sources to identify areas with address changes; get local government input.

MOTIVATE PEOPLE TO RESPOND: Conduct a nationwide communications and partnership campaign. Work with trusted sources to inspire participation; maximize outreach using traditional and new media; target advertisements to specific audiences.

COUNT THE POPULATION: Collect data from all households, including group and unique living arrangements. Make it easy for people to respond anytime, anywhere; encourage people to use the new online response option; use the most cost-effective strategy to contact and count nonrespondents; streamline in-field census-taking; knock on doors only when necessary.

RELEASE CENSUS RESULTS: Process and provide Census data. Deliver apportionment counts to the President by December 31, 2020; release counts for redistricting by April 1, 2021; make it easier for the public to get information.
The last step in the 2020 Census is to Release the
2020 Census Results. The 2020 Census data will
be processed and sent to the President (for apportionment) by December 31, 2020, to the states
(for redistricting) by March 31, 2021, and to the
public beginning in December 2021.

2.6 THE 2020 CENSUS OPERATIONS
The 2020 Census includes 34 operations that are
organized into eight major areas that correspond
with the Census Bureau's standard work breakdown
structure (WBS). The term
operation refers to both support and business
functions. For example, Program Management
is considered a support function, and Address
Canvassing is considered a business function.
Table 1 provides a high-level purpose statement for
each operation.


Table 1: Operations and Purpose
Operations

Purpose

Program Management
Program Management

Define and implement program management policies, processes, and the control
functions for planning and implementing the 2020 Census.

Census/Survey Engineering
Systems Engineering and Integration
(SE&I)

Manage the delivery of a system of systems that meets the 2020 Census Program
business and capability requirements. Implement and manage the full eSDLC for
systems supporting the 2020 Census.

Security, Privacy, and Confidentiality

Ensure that all operations and systems used in the 2020 Census adhere to the
appropriate systems and data security, respondent, and employee privacy and
confidentiality policies, and regulations.

Content and Forms Design

Identify, research, and finalize content and design of questionnaires and other
nonquestionnaire materials, ensure consistency across data collection modes and
operations, and promote high response rates and accurate and consistent responses
across modes.

Language Services

Assess and support language needs of non-English speaking populations for all modes
and other mailing and field materials, determine the number of languages and level
of support required, optimize non-English content, and ensure cultural relevancy and
meaningful translation of non-English materials.

Frame
Geographic Programs

Provide the geographic foundation in support of the 2020 Census data collection and
tabulation activities, including delineation of boundaries in the Master Address File
(MAF)/Topologically Integrated Geographic Encoding and Referencing (TIGER) System,
delivery of address and spatial extracts from the MAF/TIGER System, and updates to
the MAF/TIGER System.

Local Update of Census Addresses
(LUCA)

Provide an opportunity for tribal, federal, state, and local governments to review and
improve the address lists and maps used to conduct the 2020 Census as required by
Public Law (P.L.) 103-430.

Address Canvassing

Deliver a complete and accurate address list and spatial database for enumeration and
determine the type and address characteristics for each living quarter.


Response Data
Forms Printing and Distribution

Print and distribute Internet invitations, reminder postcards, and questionnaire mailing
packages to support the 2020 Census mailing strategy and enumeration of the
population.

Paper Data Capture

Capture and convert data from the 2020 Census paper questionnaires, including
document preparation, scanning, Optical Character Recognition, Optical Mark
Recognition, Key From Image, editing, and checkout.

Integrated Partnership and
Communications (IPC)

Communicate the importance of participating in the 2020 Census to the entire
population of the 50 states, the District of Columbia, and Puerto Rico. Motivate people to
self-respond, preferably via the Internet, and raise and keep awareness high throughout
the entire 2020 Census.

Internet Self-Response

Collect response data via the Internet to reduce paper and Nonresponse Followup and
maximize online response to the 2020 Census via contact strategies and improved
access for respondents.

Non-ID Processing

Make it easy for people to respond anytime, anywhere to increase self-response rates
by providing response options that do not require a unique Census ID.

Update Enumerate (UE)

Update the address and feature data and enumerate housing units in certain designated
geographic areas with special enumeration needs (e.g., areas that do not have city-style
addresses and areas with unique challenges associated with accessibility).

Group Quarters (GQ)

Enumerate people living or staying in group quarters, people experiencing
homelessness, and people receiving service at service-based locations.

Enumeration at Transitory Locations
(ETL)

Enumerate individuals in occupied units at transitory locations, such as recreational
vehicle parks, campgrounds, tent cities, racetracks, circuses, carnivals, marinas, hotels,
and motels, who do not have a usual home elsewhere.

Census Questionnaire Assistance
(CQA)

Provide questionnaire assistance for respondents by answering questions about specific
items on the census form or other frequently asked questions about the 2020 Census
and provide an option for callers to complete a census interview over the telephone.

Nonresponse Followup (NRFU)

Determine housing unit status for nonresponding addresses and enumerate housing
units for which a census response was not received.

Response Processing

Establish the initial 2020 Census universe, assign the specific enumeration strategy for
each census case based on case status and associated paradata, create and distribute
workload files required for enumeration operations, track case enumeration status, and
run postdata collection processing actions in preparation for producing the final 2020
Census results.

Federally Affiliated Americans Count
Overseas

Obtain counts by home state of U.S. military and federal civilian employees stationed or
deployed overseas and their dependents living with them.



Publish Data
Data Products and Dissemination

Prepare and deliver the 2020 Census population counts to the President of the United
States for Congressional apportionment, tabulate and disseminate 2020 Census data
products for use by the states for redistricting, and tabulate and disseminate 2020
Census data for use by the public.

Redistricting Data

Provide to each state the legally required P.L. 94-171 redistricting data tabulations by the
mandated deadline of 1 year from Census Day: April 1, 2021.

Count Review

Enhance the accuracy of the 2020 Census by implementing an efficient and equitable
process for Federal State Cooperative Population Estimates members to identify missing
housing units and missing or geographically misallocated large group quarters.

Count Question Resolution (CQR)

Provide a mechanism for governmental units to challenge their official 2020 Census
results.

Archiving

Provide 2020 Census records deemed permanent, including files containing individual
responses, to the National Archives and Records Administration for archiving and to
the National Processing Center (NPC) to use as source materials to conduct the Age
Search Service.

Other Censuses
Island Areas Censuses (IA)

Update and enumerate all living quarters in the Pacific Island Area of American Samoa,
the Commonwealth of the Northern Mariana Islands, Guam, and the U.S. Virgin Islands,
collectively known as the Island Areas.

Test and Evaluation
Coverage Measurement Design and
Estimation

Develop the survey design and sample for the postenumeration survey for the 2020
Census. Produce coverage error estimates and independent assessment of coverage
via demographic analysis.

Coverage Measurement Matching

Identify matches and nonmatches between the 2020 Census and the Census Coverage
Measurement Survey for the enumerated housing units and people.

Coverage Measurement Field
Operations

Collect person and housing unit information (independent from the 2020 Census
operations) for the sample of housing units in the Census Coverage Measurement
Survey.

Evaluations and Experiments

Measure the success of critical 2020 Census operations. Formulate and execute an
experimentation program to support early planning and inform the transition and design
of the 2030 Census.

Infrastructure
Decennial Service Center (DSC)

Support 2020 Census Field Operations and handle all service requests initiated by field
staff.

Field Infrastructure

Coordinate space acquisition for and lease management of the Regional Census
Centers and field offices and provide the administrative infrastructure for data collection
operations covering the 50 states, the District of Columbia, and Puerto Rico.

Decennial Logistics Management

Provide logistics management services to include procuring warehouse space,
warehousing, inventory management, kit assembly, deployment of materials, and
receiving and accessing materials.

IT Infrastructure

Provide the IT Infrastructure to support the 2020 Census, including enterprise systems
and applications, 2020 Census-specific applications, field IT infrastructure, and mobile
computing.


Figure 7 presents a graphic representation of
the 34 operations organized into the eight
areas described above. A separate area, Other
Censuses, was added to account for the Island
Areas Censuses operation, which is unique to the
Decennial Census Programs. See Section 5 for
details about the design and decisions for each of
these operations.


Figure 7: Operations by Work Breakdown Structure

SUPPORT
Program Management: Program Management.
Census/Survey Engineering: Systems Engineering and Integration; Security, Privacy and Confidentiality; Content and Forms Design; Language Services.
Infrastructure: Decennial Service Center; Field Infrastructure; Decennial Logistics Management; IT Infrastructure.

FRAME: Geographic Programs; Local Update of Census Addresses; Address Canvassing.

RESPONSE DATA: Forms Printing and Distribution; Paper Data Capture; Integrated Partnership and Communications; Internet Self-Response; Non-ID Processing; Update Enumerate; Group Quarters; Enumeration at Transitory Locations; Census Questionnaire Assistance; Nonresponse Followup; Response Processing; Federally Affiliated Americans Count Overseas.

PUBLISH DATA: Data Products and Dissemination; Redistricting Data; Count Review; Count Question Resolution; Archiving.

OTHER CENSUSES: Island Areas Censuses.

TEST AND EVALUATION: Coverage Measurement Design and Estimation; Coverage Measurement Matching; Coverage Measurement Field Operations; Evaluations and Experiments.

3. The Four Key Innovation Areas
The 2020 Census is designed to cost less per
housing unit than the 2010 Census (when adjusted
for inflation), while continuing to maintain high
quality. The Census Bureau plans to achieve this
by conducting the most automated, modern, and
dynamic decennial census in history. The 2020
Census includes sweeping design changes in four
key areas, including new methodologies to conduct
Address Canvassing, innovative ways of optimizing
self-response, the use of administrative records
and third-party data to reduce the Nonresponse
Followup (NRFU) workload, and the use of technology to reduce the manual effort and improve
productivity of field operations. The primary goal is
to achieve dramatic cost savings by:
•• Adding new addresses to the Census Bureau’s
address frame using geographic information
systems and aerial imagery instead of sending
Census employees to walk and physically check
11 million census blocks.


•• Encouraging the population to respond to the
2020 Census using the Internet, reducing the
need for more expensive paper data capture.
•• Using data the public has already provided to
the government and data available from commercial sources, allowing realized savings to
focus additional visits in areas that have been
traditionally hard to enumerate.
•• Using sophisticated operational control systems
to send Census employees to follow up with
nonresponding housing units and to track daily
progress.
The Census Bureau estimates that conducting a
2020 Census that includes these major cost-saving
innovations has the potential to save approximately $5.2 billion compared with repeating the
2010 design in the 2020 Census.


3.1 REENGINEERING ADDRESS
CANVASSING
The goal of Reengineering Address Canvassing
is to eliminate the need to canvass every block.
Instead, the Census Bureau is developing innovative methodologies for updating the MAF/TIGER
System throughout the decade. Figure 8 highlights
the key concepts in the Reengineering Address
Canvassing approach.
Continual research and updating will be conducted
through an In-Office Address Canvassing operation
that will begin in September 2015 and continue
through the 2020 Census. Clerks will start with
the 2015 Census address list and update it based
on new information from the United States Postal
Service (USPS), and data from tribal, state, and local

governments and third parties (i.e., commercial
vendors). Clerks will review satellite imagery to
determine where changes in addresses are occurring, and based on these changes, the Census
Bureau will develop a plan for capturing those
changes. This plan will include an In-Field Address
Canvassing operation where address updates cannot be obtained or verified or in areas undergoing
rapid change. The number of addresses requiring
In-Field Canvassing is expected to be approximately
25 percent of the total number of addresses. These
design changes have the potential to save the
Census Bureau an estimated $900 million.
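The 25 percent in-field target is consistent with the 75 percent reduction in the field workload for address updating cited in Section 2.4. A minimal sketch of that arithmetic, assuming the share of addresses canvassed in the field is a direct proxy for field workload:

```python
# In 2010, 100 percent of addresses were canvassed in the field; the 2020
# plan targets roughly 25 percent. Treating the in-field share as a proxy
# for field workload yields the 75 percent reduction cited in Section 2.4.
in_field_share_2010 = 1.00
in_field_share_2020 = 0.25

workload_reduction = 1 - in_field_share_2020 / in_field_share_2010
print(workload_reduction)  # 0.75
```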
The operations shaded in darker blue in Figure
9 include innovations related to Reengineering
Address Canvassing.

Figure 8: Summary of Reengineering Address Canvassing

A 2015–2020 timeline showing:
•• Continual In-Office Canvassing: update and verify the MAF using aerial imagery, administrative records, and commercial data.
•• Limited In-Field Canvassing in 2019 for those areas where address updates cannot be obtained or verified or are undergoing rapid change.
•• Master Address File (MAF) Coverage Study: ongoing fieldwork to measure coverage, validate in-office procedures, and improve in-field data collection methodologies.
•• 2020 Census begins: the updated MAF is used to conduct the 2020 Census.

Figure 9: Operations That Contribute to Reengineering Address Canvassing (the Work Breakdown Structure chart from Figure 7, with the contributing operations shaded in darker blue)

Documented below are brief descriptions of how each operation contributes to the Reengineering Address
Canvassing innovation area:

Table 2: Description of Operations That Contribute to Reengineering Address
Canvassing
Operation

Contributions

Geographic
Programs

Simplified collection geography.
Simplified Type of Enumeration Area delineation.
More data sources to validate and augment the frame.
More frequent engagement with partners to improve quality of the MAF/TIGER System.

Local Update of Census Addresses

LUCA submissions validated as part of In-Office or In-Field Address Canvassing.

Address Canvassing

100 percent address canvassing conducted in-office.
Target 25 percent of living quarters for In-Field Address Canvassing.
Ongoing in-office and in-field improvement process.
Classification of living quarter types during in-office review.
Increased productivity of field staff due to automated case assignment and route
optimization.

Update Enumerate

Geography in UE areas not included in the in-field workloads.

Field Infrastructure

Reduced office infrastructure needed for In-Field Address Canvassing.
Automated administrative functions.

IT Infrastructure

Listing applications for In-Field Address Canvassing with flexibility to support
government-furnished equipment, personally owned devices, and Device as a Service.
Enterprise solutions with flexible architecture.
Additional IT infrastructure to support In-Office Address Canvassing.

Additional operations that contribute to
Reengineering Address Canvassing include:
Decennial Service Center; Security, Privacy,
and Confidentiality; Integrated Partnership
and Communications (IPC); and Systems
Engineering and Integration (SE&I).

3.2 OPTIMIZING SELF-RESPONSE
The goal of this innovation area is to communicate the importance of the 2020 Census to the U.S.
population and generate the largest possible self-response, reducing the need to conduct expensive
in-person follow-up with those households.


As shown in Figure 10, the Census Bureau plans to
motivate people to respond by using technology
and administrative records and third-party data to
target advertisements and tailor contact strategies
to different demographic groups and geographic
areas. The Census Bureau also plans to utilize
its partnership program, providing information
to government agencies and hosting events at
community, recreation, and faith-based organizations. Communication and contact strategies will
encourage the use of the Internet as the primary
response mode through a sequence of invitations
and postcard mailings. In addition, when Census
fieldworkers visit a house and no one is home, the
notice of visit will encourage self-response.

Figure 10: Summary of Optimizing Self-Response

Goal: Generate the largest possible self-response, reducing the number of households requiring follow-up.
•• Motivate people to respond and assure that data are secure: micro-targeted advertising; tailored contact strategy; partnership program; notices encouraging self-response.
•• Make it easy to respond from any location at any time: multiple modes and devices; preassigned ID not required*; online forms in multiple languages.

* Validate respondent addresses for those without a Census ID and prevent fraudulent submissions.

A second key aspect of Optimizing Self-Response
is to make it easy for people to respond from any
location at any time. This is done in several ways:
•• By enabling people to respond via multiple
modes (Internet, paper, or telephone if they
call the Census Questionnaire Assistance [CQA]
Center).
•• By allowing respondents to submit a questionnaire without a unique identification code.
•• By providing online forms in multiple languages.
For these innovations to be successful, respondents must know that their personal information is
protected. Thus, a key element of this innovation
area is to assure respondents that their data are
secure and treated as confidential.
These design changes have the potential to save
the Census Bureau an estimated $400 million.
The operations shaded in darker blue in Figure 11
include innovations related to Optimizing
Self-Response.

Figure 11: Operations That Contribute to Optimizing Self-Response (the Work Breakdown Structure chart from Figure 7, with the contributing operations shaded in darker blue)

Documented below are brief descriptions of how each operation contributes to the Optimizing
Self-Response innovation area:

Table 3: Description of Operations That Contribute to Optimizing Self-Response

Content and Forms Design: Questionnaire designed for multiple modes and devices.

Language Services: Non-English questionnaires available across modes. Non-English content development of contact materials (e.g., invitation letters and postcards).

Forms Printing and Distribution: Census mailing that encourages people to respond via the Internet.

Paper Data Capture: Paper available as a response mode.

Integrated Partnership and Communications: Micro-targeted advertising. Multi-channel outreach. Integrated Partnership and Communications program adjusted based on customer response, behavior, and feedback. National and local partnerships promoting self-response. Educational awareness campaign via traditional and new media sources (e.g., social media).

Internet Self-Response: Multi-mode contact approach (e.g., postcard, e-mail, phone, and text). Optimized for mobile devices. Multiple languages available. Contact approach tailored to demographic and geographic areas based on administrative records, third-party data, and paradata analysis. Real-time edits for Internet Self-Response to improve quality.

Non-ID Processing: Public can respond anytime, anywhere without a unique Census ID. Real-time geocoding of responses. Real-time validation of responses without a unique Census ID. Real-time soft edits and checks for addresses. Administrative records and third-party data used to validate identity and validate and augment address data.

Census Questionnaire Assistance: Flexible and adaptive language support. Web chat. Respondent-initiated telephone response collection.

Response Processing: Single operational control system that tracks case status across all modes.

IT Infrastructure: Infrastructure built and sized to meet demand and ensure adequate performance for Internet Self-Response. Secure Internet response capability.
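The real-time edits noted for Internet Self-Response can be illustrated with a toy soft-edit routine that checks a questionnaire as it is completed and returns messages the respondent can fix before submitting. The specific rules and field names below are hypothetical simplifications, not the actual edit specifications.

```python
# Hypothetical sketch of a real-time soft edit for an online questionnaire:
# flag internal inconsistencies (e.g., roster size vs. reported count) before
# the respondent submits. Rules and field names are illustrative only.

def soft_edits(response):
    """Return a list of edit messages; an empty list means the form passes."""
    messages = []
    pop = response.get("population_count")
    roster = response.get("roster", [])
    if pop is None or pop < 1:
        messages.append("Household population count must be at least 1.")
    elif len(roster) != pop:
        messages.append(f"Roster lists {len(roster)} people but count is {pop}.")
    for person in roster:
        if not person.get("age") and not person.get("date_of_birth"):
            messages.append(f"Missing age for {person.get('name', 'a person')}.")
    return messages

good = {"population_count": 2,
        "roster": [{"name": "A", "age": 34}, {"name": "B", "age": 31}]}
bad = {"population_count": 3,
       "roster": [{"name": "A", "age": 34}, {"name": "B"}]}

ok_msgs = soft_edits(good)
bad_msgs = soft_edits(bad)
```

A passing form returns no messages; the inconsistent form above triggers both a roster-count mismatch and a missing-age message.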


In addition, the Security, Privacy and Confidentiality
operation and the Systems Engineering and
Integration operation contribute to the Optimizing
Self-Response innovation area.

3.3 UTILIZING ADMINISTRATIVE
RECORDS AND THIRD-PARTY DATA
The goal of this innovation area is to use information people have already provided to improve the
efficiency and effectiveness of the 2020 Census,
and in particular reduce expensive in-person
follow-up activities. Administrative record data
refers to information from federal and state governments. Third-party data refers to information
from commercial sources. As shown in Figure
12, data from both sources can help improve the
quality of the address list (frame), increase the
effectiveness of advertising and contact strategies, validate respondent submissions, and
reduce field workload for follow-up activities.


As has been done in prior decades, administrative
data from the U.S. Postal Service and other government records are used to update the address
frame and reflect changes that occur over time.
Additional administrative records sources, as well
as third-party data from commercial companies, will
also be used for this purpose. In addition, these
data sources will be used to validate incoming data
from tribal, federal, state, and local governments.
To increase the effectiveness of advertising and
contact strategies, the Census Bureau will use
demographic and geographic information from
various administrative record and third-party data
sources to help target the advertising to specific
populations. These data will also be used to create
a contact frame that includes e-mail addresses
and telephone numbers. A contact frame with this
additional information enables the Census Bureau
to expand its contact methods beyond traditional
postal mail.
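The contact-frame idea can be sketched in a few lines: contact points from several sources are merged onto the address frame so that e-mail and telephone contact methods become available alongside postal mail. Everything here (field names, source layouts, matching by an address ID) is a hypothetical simplification, not the Census Bureau's actual design.

```python
# Hypothetical sketch: build a contact frame by attaching e-mail addresses and
# phone numbers from administrative-record and third-party sources to the
# address frame, enabling contact methods beyond traditional postal mail.
# All field names and sources are illustrative, not the Bureau's real schema.

def build_contact_frame(address_frame, contact_sources):
    """Merge contact points from multiple sources, keyed by address ID."""
    frame = {addr_id: {"address": addr, "emails": set(), "phones": set()}
             for addr_id, addr in address_frame.items()}
    for source in contact_sources:
        for rec in source:
            entry = frame.get(rec["addr_id"])
            if entry is None:
                continue  # contact point does not match any frame address
            if rec.get("email"):
                entry["emails"].add(rec["email"].lower())
            if rec.get("phone"):
                entry["phones"].add(rec["phone"])
    return frame

# Illustrative data: one federal source and one commercial source.
addresses = {1: "12 Oak St", 2: "34 Elm Ave"}
federal = [{"addr_id": 1, "phone": "301-555-0101"}]
commercial = [{"addr_id": 1, "email": "RES@example.com"},
              {"addr_id": 2, "phone": "301-555-0199"},
              {"addr_id": 9, "email": "no.match@example.com"}]  # no frame match

contact_frame = build_contact_frame(addresses, [federal, commercial])
```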

Figure 12: Summary of Utilizing Administrative Records and Third-Party Data

Use information people have already provided to reduce expensive in-person follow-up:
•	Improve the quality of the address list: update the address list; validate incoming data from tribal, federal, state, and local governments.
•	Increase effectiveness of advertising and contact strategies: support the micro-targeted advertising campaign; create the contact frame (e.g., e-mail addresses and telephone numbers).
•	Validate respondent submissions: validate respondent addresses for those without a Census ID and prevent fraudulent submissions.
•	Reduce field workload for follow-up activities: remove vacant and nonresponding occupied housing units from the Nonresponse Followup workload; optimize the number of contact attempts.


Administrative records and third-party data will
also be used to validate respondent addresses for
those who respond without providing a unique
Census ID. This will help prevent fraudulent and
erroneous submissions.
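A minimal sketch of this validation flow, under invented matching rules: a response arriving without a Census ID is matched against the address frame in real time, with administrative records used to augment an incomplete address before it is rejected. The normalization step and lookup structures are illustrative only.

```python
# Hypothetical sketch of non-ID processing: a response without a Census ID is
# matched against the Master Address File (MAF) in real time; administrative
# records may augment an incomplete address before matching. The normalization
# and the augmentation rule are illustrative simplifications.

def normalize(addr):
    return " ".join(addr.upper().split())

def match_non_id_response(response_addr, maf, admin_records):
    """Return (status, maf_id): 'matched', 'augmented', or 'unmatched'."""
    key = normalize(response_addr)
    if key in maf:
        return "matched", maf[key]
    # Try augmenting from administrative records (e.g., add a missing ZIP).
    augmented = admin_records.get(key)
    if augmented and normalize(augmented) in maf:
        return "augmented", maf[normalize(augmented)]
    return "unmatched", None

maf = {"100 MAIN ST SPRINGFIELD 62701": "MAF-0001"}
admin_records = {"100 MAIN ST SPRINGFIELD": "100 Main St Springfield 62701"}

status1, _ = match_non_id_response("100 Main St  Springfield 62701", maf, admin_records)
status2, _ = match_non_id_response("100 Main St Springfield", maf, admin_records)
status3, _ = match_non_id_response("9 Nowhere Rd", maf, admin_records)
```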
Finally, a primary use of administrative records is
to reduce field workload for follow-up activities.
To this end, the Census Bureau will use data from
internal and external sources, such as the 2010
Census, the USPS, the Internal Revenue Service, and
the Centers for Medicare and Medicaid Services to
identify vacant and nonresponding occupied housing units and remove them from the Nonresponse
Followup workload. The Census Bureau plans to
continue acquiring and testing data from other
sources, including the National Directory of New
Hires, the Supplemental Nutrition Assistance
Program, and state-administered programs such
as Temporary Assistance for Needy Families to
better understand how these data sources can help
reduce follow-up field workload.
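The workload-reduction idea can be sketched as a filter over the nonrespondent list: administrative-record signals classify an address as vacant or record-enumerable, and only the remainder goes to the field. The signal names, thresholds, and decision rules below are invented for illustration and are not the Bureau's actual criteria.

```python
# Hypothetical sketch: use administrative-record signals (e.g., USPS
# undeliverable-as-addressed flags, IRS/CMS activity) to classify addresses as
# vacant or occupied and remove them from the Nonresponse Followup (NRFU)
# workload. Record layouts and rules are illustrative only.

def trim_nrfu_workload(nonrespondents, admin_signals):
    """Split nonresponding addresses into field workload vs. record-resolved."""
    field_workload, resolved = [], {}
    for addr_id in nonrespondents:
        sig = admin_signals.get(addr_id, {})
        if sig.get("usps_vacant"):
            resolved[addr_id] = "vacant"      # remove: counted as vacant
        elif sig.get("irs_return") and sig.get("person_count"):
            resolved[addr_id] = "occupied"    # remove: enumerate from records
        else:
            field_workload.append(addr_id)    # keep: send an enumerator
    return field_workload, resolved

signals = {
    101: {"usps_vacant": True},
    102: {"irs_return": True, "person_count": 3},
    103: {},  # no usable records; stays in the field workload
}
workload, resolved = trim_nrfu_workload([101, 102, 103, 104], signals)
```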
These design changes have the potential to
save the Census Bureau an estimated $1.4 billion. The operations shaded in darker blue in
Figure 13 include innovations related to Utilizing
Administrative Records and Third-Party Data.


Figure 13: Operations That Contribute to Utilizing Administrative Records and Third-Party Data

[Diagram of all 2020 Census operations, grouped into the Support, Frame, Response Data, Publish Data, Other Censuses, and Test and Evaluation areas, with the operations that contribute to Utilizing Administrative Records and Third-Party Data shaded in darker blue.]

Documented below are brief descriptions of how each operation contributes to the Utilizing Administrative
Records and Third-Party Data innovation area:

Table 4: Description of Operations That Contribute to Utilizing Administrative Records and Third-Party Data

Security, Privacy, and Confidentiality: Ongoing monitoring of public perception on decennial application of administrative records and third-party data.

IT Infrastructure: Use of administrative records requires that systems be Title 13 and Title 26 compliant.

Geographic Programs: Administrative records and third-party data used to determine types of enumeration areas, basic collection units, and geographic boundaries.

Local Update of Census Addresses: Administrative records and third-party data used to validate incoming data from tribal, federal, state, and local governments.

Address Canvassing: Additional sources of administrative records and third-party data used to update the frame.

Integrated Partnership and Communications: Expanded use of administrative records and third-party data to support the micro-targeted Integrated Partnership and Communications program.

Internet Self-Response: Administrative records and third-party data used to create the contact frame and to tailor the contact strategy.

Non-ID Processing: Administrative records and third-party data used to validate and augment address data and validate identity for submissions missing a unique Census ID.

Group Quarters: Electronic transfer and expanded use of administrative records and third-party data to enumerate group quarters where possible.

Enumeration at Transitory Locations: Administrative records and third-party data used to update addresses of transitory locations.

Nonresponse Followup: Expanded use of administrative records and third-party data to remove vacant and nonresponding occupied housing units from the NRFU workload. Administrative records and third-party data used to reduce the number of contact attempts made and to tailor work assignments based on language and "best time of day" for contact.

Response Processing: Increased use of administrative records and third-party data to impute response data (in whole or in part). Increased use of libraries from past surveys and censuses to support editing and coding. Increased use of administrative records and third-party data to enhance libraries for the primary selection algorithm and Invalid Return Detection.

Count Question Resolution: Administrative records and third-party data used to resolve CQR challenges.

Coverage Measurement Design and Estimation: Administrative records and third-party data used for demographic analysis, estimation, and sample design.

Coverage Measurement Field Operations: Administrative records and third-party data used to reduce the number of contact attempts made and to tailor work assignments based on language and "best time of day" for contact.

Island Areas Censuses: Administrative records and third-party data used where appropriate to support both listing and enumeration.

Additional operations that contribute to Utilizing
Administrative Records and Third-Party Data
include Field Infrastructure, Federally Affiliated
Americans Count Overseas, and Systems
Engineering and Integration.

3.4 REENGINEERING FIELD
OPERATIONS
The goal of this innovation area is to use technology to efficiently and effectively manage the 2020
Census fieldwork, and as a result, reduce the staffing, infrastructure, and brick and mortar footprint
required for the 2020 Census. Figure 14 shows the
three main components of the reengineered field
operations: streamlined office and staffing structure, increased use of technology, and increased
management and staff productivity.
The 2020 Census field operations will rely heavily
on automation. For example, the Census Bureau
plans to provide fieldworkers with the capability to
work completely remotely and perform all administrative and data collection tasks directly from a
handheld device. Supervisors will also be able to
work remotely and communicate with their staff
via these devices. These enhanced capabilities
significantly reduce the number of offices required
to support 2020 Census fieldwork. In the 2010
Census, the Census Bureau established 12 regional
Census centers and nearly 500 area Census offices.
The agency hired over 516,000 enumerators to
conduct Nonresponse Followup activities. The
new design for the 2020 Census field operations
includes six regional census centers with up to 250
Administrative Support Operation Centers.
In addition, automation enables significant changes
to how cases are assigned and the supervision of
field staff. By making it easier for supervisors to
monitor and manage their workers, the ratio of
workers to supervisor can be increased, reducing
the number of supervisors required. This streamlines the staffing structure. Other design changes
include optimized case assignment and routing.
All administrative functions associated with field
staff will be automated, including recruiting, hiring,
training, time and attendance, and payroll. Finally,
the new capabilities allow for quality to be infused
into the process through alerts to supervisors when
there is an anomaly in an enumerator’s performance
(e.g., the Global Positioning System indicator on a
fieldworker's handheld device indicates that she
or he is not at the assigned address) and real-time
edits on data collection. Accordingly, the quality
assurance process used in the 2010 Census is being
reengineered to account for changes in technology.
In total, these design changes have the potential to
save the Census Bureau an estimated $2.5 billion.
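The supervisor-alert idea described above can be sketched as a simple location check: compare the GPS fix from a fieldworker's device with the assigned address and flag the case when the enumerator appears not to be at that address. The 200-meter threshold and the haversine distance are illustrative choices, not the actual quality assurance rule.

```python
# Hypothetical sketch: flag an anomaly when a fieldworker's device reports a
# GPS fix far from the assigned address. Threshold and distance formula are
# illustrative only.
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def location_alert(device_fix, assigned_location, threshold_m=200):
    """Return True if the device is farther than threshold_m from the case."""
    return haversine_m(*device_fix, *assigned_location) > threshold_m

assigned = (38.8977, -77.0365)   # assigned address (illustrative coordinates)
at_door = (38.8978, -77.0366)    # roughly 15 m away: no alert
elsewhere = (38.9072, -77.0369)  # roughly 1 km away: alert the supervisor

ok = location_alert(at_door, assigned)
flagged = location_alert(elsewhere, assigned)
```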

The operations shaded in darker blue in Figure 15
include innovations related to Reengineering Field
Operations.

Figure 14: Summary of Reengineering Field Operations

Use technology to more efficiently and effectively manage the 2020 Census fieldwork:
•	Streamlined Office and Staffing Structure: Area Manager of Operations, Census Field Managers, Census Field Supervisors, and Listers and Enumerators.
•	Increased Use of Technology: automated and optimized work assignments; automated recruiting, training, payroll, and expense reporting; ability to conduct address updates and enumeration on the same device; reduced paper and manual processing.
•	Increased Management and Staff Productivity: increased visibility into case status for improved workforce management; redesigned quality assurance operations; improved communications.


Figure 15: Operations That Contribute to Reengineering Field Operations

[Diagram of all 2020 Census operations, grouped into the Support, Frame, Response Data, Publish Data, Other Censuses, and Test and Evaluation areas, with the operations that contribute to Reengineering Field Operations shaded in darker blue.]

Documented below are brief descriptions of how each operation contributes to the Reengineering Field
Operations innovation area. The field data collection operations are grouped together as they all contribute similarly.

Table 5: Description of Operations That Contribute to Reengineering Field Operations

Field Infrastructure: Streamlined staffing structure. Automated use of real-time data by the field operations control system to enable better management of the field workforce. Automated training for field staff. Automated administrative functions, including recruiting and payroll. Supervisory support for fieldworkers available during all hours worked.

IT Infrastructure: Enterprise solutions with flexible architecture. Listing and enumeration applications with flexibility to run on government-furnished equipment, personally owned devices, and Device as a Service.

Integrated Partnership and Communications: Enhanced communications to support field recruitment. Rapid reclassification of living quarter type (under review).

Field Data Collection Operations (Address Canvassing, Update Enumerate, Group Quarters, Enumeration at Transitory Locations, Nonresponse Followup, Coverage Measurement Field Operations): Reduced paper; number of attempts tailored based on the availability of administrative records, third-party data, and paradata. Reduced field workload as measured by cases and attempts. Near real-time case status updates. Automated and optimized assignment of work. Declaration of work availability and case assignments. Flexibility built into the work assignment process based on in-field feedback or observations. Data on household language and "best time of day to contact" standardized and available at a central location for work assignments. Redesigned quality assurance process. Ability to update the address list and enumerate on a single device with a suite of integrated applications. Ability for addresses not identified during Address Canvassing to be enumerated when identified.
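The automated and optimized assignment of work mentioned for the field data collection operations can be illustrated with a toy greedy assignment: each case goes to the nearest available enumerator with remaining capacity. Real assignment and routing would use far richer optimization and travel data; everything here is an invented simplification.

```python
# Hypothetical sketch of automated work assignment: greedily assign each case
# to the nearest enumerator who still has capacity. Coordinates, capacity, and
# the greedy rule are illustrative only.

def assign_cases(cases, enumerators, capacity=2):
    """cases: {case_id: (x, y)}; enumerators: {name: (x, y)}.
    Returns {case_id: enumerator_name} under a simple per-person cap."""
    load = {name: 0 for name in enumerators}
    assignment = {}
    for case_id, (cx, cy) in sorted(cases.items()):
        best, best_d = None, None
        for name, (ex, ey) in enumerators.items():
            if load[name] >= capacity:
                continue  # this enumerator's day is already full
            d = (cx - ex) ** 2 + (cy - ey) ** 2  # squared distance suffices
            if best is None or d < best_d:
                best, best_d = name, d
        if best is not None:
            assignment[case_id] = best
            load[best] += 1
    return assignment

cases = {"c1": (0, 0), "c2": (0, 1), "c3": (10, 10)}
crew = {"ann": (0, 0), "bob": (10, 10)}
plan = assign_cases(cases, crew)
```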


Additional operations that contribute to
Reengineering Field Operations include the
Decennial Service Center, Island Areas Censuses,
and Systems Engineering and Integration.

3.5 SUMMARY OF INNOVATIONS
This section summarizes the key innovations
planned for the 2020 Census. Innovations are
considered significant changes to the operational
design as compared with the 2010 Census.
The operations shaded in darker blue in Figure
16 are those that have the most significant
innovations.


Figure 16: Operations With Significant Innovations Since the 2010 Census

[Diagram of all 2020 Census operations, grouped into the Support, Frame, Response Data, Publish Data, Other Censuses, and Test and Evaluation areas, with the operations that have the most significant innovations shaded in darker blue.]

The specific innovations planned for each of these operations are listed in Table 6 below. Note that these
innovations are dependent upon funding and decisions on the final design.

Table 6: Summary of Key Innovations by Operation

Local Update of Census Addresses: Reduced complexity. Elimination of the full address list submission options to improve quality and reduce burden and cost.

Address Canvassing: Use of a combination of in-office and in-field methods to achieve 100 percent Address Canvassing (target of 25 percent of addresses going to the field). Use of automation and data (imagery, administrative records, and third-party data) for In-Office Address Canvassing. Ongoing fieldwork (Master Address File [MAF] Coverage Study) to validate in-office procedures, measure coverage, and improve in-field data collection methodologies. Use of a reengineered field management structure and approach to managing fieldwork, including a new field office structure and new staff positions.

Integrated Partnership and Communications: Micro-targeted messages and placement for digital advertising, especially for hard-to-count populations. Advertising and partnership campaign adjusted based on respondent actions. Expanded predictive modeling to determine propensity to respond by geographic area. Expanded use of social media.

Internet Self-Response: Internet data capture, providing real-time edits, the ability to capture unlimited household size entries, and multi-access methods across different technologies (e.g., computers, phones, tablets, kiosks). Online questionnaires available in multiple languages and non-Roman alphabets. Multimode contact approach tailored to demographic or geographic area. A contact frame, including e-mail addresses and phone numbers, developed from administrative records and third-party data to allow for follow-up if required (e.g., missing or illegible information and reinterview for quality assurance).

Non-ID Processing: Ability for the public to respond anytime, anywhere. Real-time matching and geocoding of responses. Validation of non-ID response data. Use of administrative records and third-party data to validate identity and to validate and augment address data for non-ID submissions.

Update Enumerate (planned innovations dependent on funding of this operation): The 2010 Census Update Leave and Update Enumerate operations combined into a single operation. Single visit with enumeration or push to Internet Self-Response. Use of a single device for both listing and enumeration. Use of a reengineered field management structure and approach to managing fieldwork, including a new field office structure and new staff positions.


Nonresponse Followup: Use of administrative records and third-party data to remove vacant housing units from the NRFU workload. Use of administrative records and third-party data to remove nonresponding occupied housing units from the NRFU workload. Use of a reengineered field management structure and approach to managing fieldwork. Use of a variable contact strategy and stopping rules to control the number of attempts made for each address (based on paradata). Assignment and route optimization. Automated training for field staff. Automation of field data collection. Automation of administrative functions such as recruiting, onboarding, and payroll. Reengineered quality assurance approach.

Field Infrastructure: Reduced number of Regional Census Centers (RCCs) managing a reduced number of Area Census Offices tasked with managing field operations and support activities. Automated job application and recruiting processes, payroll submission and approval process, and other administrative processes, resulting in reduced staffing requirements. Automated training. Reduced number of enumerators and supervisors due to the reengineered design for field operations.

Decennial Logistics Management: Implementation of an online, real-time Enterprise Resource Planning system with extended access for the RCCs and field offices. Implementation of a wireless network and bar code technology that will automate inventory transactions.

IT Infrastructure: Early development of the solutions architecture. Use of enterprise solutions as appropriate. Iterative deployment of infrastructure aligned with and based on testing. Implementation of alternatives to providing government-furnished equipment, such as Bring Your Own Device or Device as a Service. Use of demand models to help predict Internet response volume, Census Questionnaire Assistance center staffing, etc. Scalable design. Agile development of applications.
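The demand-modeling idea noted under IT Infrastructure can be illustrated with a toy capacity calculation: project daily Internet response volume from an assumed response curve and size the infrastructure to the peak day with headroom. Every number below is invented for illustration; actual demand models would be far more detailed.

```python
# Hypothetical sketch: a toy demand model for sizing Internet self-response
# infrastructure. Given total housing units, an assumed Internet take-up rate,
# and an assumed share of responses arriving each day, estimate the peak daily
# volume and a provisioning target with headroom. All figures are invented.

def peak_daily_volume(housing_units, internet_rate, daily_shares):
    """Peak expected responses in a single day under the assumed curve."""
    total_internet = housing_units * internet_rate
    return max(share * total_internet for share in daily_shares)

def provisioning_target(peak, headroom=0.5):
    """Capacity to provision: peak demand plus a safety margin."""
    return peak * (1 + headroom)

# Illustrative inputs: 140M housing units, 45 percent Internet response, with
# 12 percent of all Internet responses arriving on the busiest day.
daily_shares = [0.04, 0.07, 0.12, 0.09, 0.05]
peak = peak_daily_volume(140_000_000, 0.45, daily_shares)
target = provisioning_target(peak)
```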


4. Key Tests, Milestones, and Production Dates
The 2020 Census has multiple decision points,
milestones, and production dates that must be met
to deliver the final apportionment and redistricting
data. Informing the decision points is a series
of tests. These tests are documented in the 2020
Census Research and Testing Management Plan,
which provides the overarching management and
analysis framework for executing research and
testing projects and integrating the results across
projects. More detailed information about each test
is captured in formal research and test plan documents and in an integrated master schedule, facilitating the integration and coordination of activities
across tests and operations. Refer to Figure 2 in
Section 1.2 for how this documentation fits into
the broader set of documentation for the 2020
Census program. Detailed test plans and results are
available for review upon request.
The first part of this section describes the tests
used to inform the operational design and prepare
for conducting the Census. The second section
highlights key decision points and milestones
beginning with the research and testing phase in
Fiscal Year (FY) 2012 through the completion of

the Census in 2023. The third section provides the
planned production timeline for the primary 2020
Census operations, and the final section shows an
integrated schedule of the tests, milestones, and
production operations.

4.1 TESTS TO INFORM THE
OPERATIONAL DESIGN AND PREPARE
FOR CONDUCTING THE CENSUS
As shown in Figure 17, the tests conducted early
in the decade (2012–2015) are aimed at answering
specific research questions (objectives) needed to
make decisions on important aspects of the operational design for the four key innovation areas.
Starting in 2016, the focus shifts to validating
and refining the design by testing the interactions
across operations and determining the proposed
methodology for the operations. In addition,
testing of production systems begins during this
time frame and continues through 2018, with final
performance testing to ensure scalability occurring
in 2019. An end-to-end test in 2018 will test the
integration of all major operations and systems.

Figure 17: High-Level View of Tests

[Timeline of tests from 2012 to 2020. Operational design tests first test innovations from the four key innovation areas individually (2012–2015), then test the integration of self-response and nonresponse (2016), then test full integration and CQA, UE, GQ, and QC (2017). Production systems are developed and tested through 2018, followed by an end-to-end test of systems and operations in 2018 and performance testing in 2019. Milestones: October 2015, 2020 Census Operational Plan with preliminary design decisions issued; October 2018, 2020 Census operational design decisions finalized.]


Table 7 lists the operational tests executed or
planned for the 2020 Census.

Table 7: Operational Tests

2012: Public-Opinion Polling (ongoing as needed throughout the decade); 2012 National Census Test.
2013: 2013 National Census Contact Test; 2013 Census Test.
2014: 2014 Census Test; Continuous Small-Scale Testing (ongoing as needed throughout the decade); LUCA Focus Groups; 2014 Human-in-the-Loop Test.
2015: Address Validation Test (starts in late 2014); 2015 Optimizing Self-Response Test; 2015 Census Test; 2015 National Content Test.
2016: 2016 Census Test; 2016 Address Canvassing Test.
2017: 2017 Census Test.
2018: 2018 Census End-to-End Test.
2019: Post End-to-End Testing.

The following sections describe the tests listed above. Tests for Calendar Years 2012 through 2014 are combined into one section. For the past and current tests, a short description of the purpose, scope, and timing is presented, followed by a table with objectives of the tests, findings, and, where applicable, design implications based on these findings. For future tests, only the purpose, scope, timing, and objective are provided. These may change, since future test plans are based on the availability of funding as well as the analysis of prior tests.

4.1.1 Tests in 2012–2014

As shown in Figure 18, eight tests were conducted between 2012 and 2014.

Figure 18: Tests in 2012–2014

[Timeline of the eight tests conducted between 2012 and 2014.]

4.1.1.1 Public Opinion Polling

The Public Opinion Polling Test is a public opinion survey of attitudes toward statistics produced by the federal government that focuses on trust in the federal statistical system, the credibility of federal statistics, and attitudes toward and knowledge of the statistical uses of administrative records and third-party data. The Census Bureau is using the Nightly Gallup Polling for this survey and collects data by telephone from 850 nationally representative housing units per week. Data collection started in February 2012 and will continue on an ongoing basis as needed.


Public-Opinion Polling Test

Objectives:
•	Determine if the public's perception of the Census Bureau's commitment and ability to protect privacy and confidentiality are impacted if administrative records are used in the 2020 Census design.
•	Determine what the public is most concerned about with regard to privacy and confidentiality, in general and as related to government data collection.

Findings:
•	Reported belief in the credibility of statistics predicts reported trust in federal statistics.
•	Questions regarding administrative record and third-party data use have shown that, when framed to indicate that the use of records can save the government money or provide a social good, respondents are more likely to favor using administrative records and third-party data.

Design Implications:
•	Continue to pursue research and testing related to the use of administrative records and third-party data.

4.1.1.2 2012 National Census Test

The 2012 National Census Test studied overall self-response rates and Internet self-response rates. The test was conducted from August 2012 to October 2012 and included 80,000 nationally representative housing units.

2012 National Census Test

Objectives:
•	Assess relative self-response rates and Internet self-response rates.
•	Evaluate the performance of combined race and origin questions on the Internet.
•	Assess the Telephone Questionnaire Assistance operation.

Findings:
•	Total self-response rate was 56.9 percent, and the Internet self-response rate was 36.5 percent.
•	An advance letter resulted in no significant difference in overall response rate as compared with no advance letter.
•	Providing a telephone number in the initial mailing resulted in no significant difference in overall response, but an increase in telephone interviews.
•	A second reminder to answer the 2012 National Census Test performed well.
•	Tailoring the content of the reminder materials resulted in no significant difference in overall response.
•	Response distributions of the combined race and origin questions were similar across the two question versions.
•	Results did not indicate the expected benefit of enhanced reporting of detailed race and origin groups.
•	Of the calls to the Telephone Questionnaire Assistance operation, 69 percent were because the respondent did not have a computer or Internet access.

Design Implications:
•	Continue tests to determine response rates and optimal contact strategies.
•	Further study of the collection of detailed race and origin groups in a national mailout test.
•	The 2020 Census Questionnaire Assistance operation must account for increased call volumes.


4.1.1.3 2013 National Census Contact Test
The 2013 National Census Contact Test studied
two key areas related to strategies for contacting
respondents: the quality of the Contact Frame (a
list of supplemental contact information such as
e-mail addresses and phone numbers, built from
third-party data sources) and automated processing of census responses lacking a preassigned
census identification number (Non-ID Processing).
The study included 39,999 nationally representative addresses.

2013 National Census Contact Test
Objectives

Evaluate the quality of phone and e-mail contact information acquired from third-party data sources.
Test proposed enhancements to automated processing of responses lacking a preassigned Census identification number.

Findings

Respondents were not able to validate contact information for other household members.
The use of administrative records and third-party data was effective in enhancing non-ID addresses to allow for a match to the MAF.

Design Implications

Continue testing the quality of the Contact Frame.
Continue enhancing the functionality associated with Non-ID Processing.

4.1.1.4 2013 Census Test
The 2013 Census Test was an operational study of Nonresponse Followup procedures. This test was conducted in late 2013 and involved 2,077 housing units in Philadelphia, Pennsylvania.

2013 Census Test
Objectives

Evaluate the use of administrative records and third-party data to enumerate nonresponding occupied housing units to reduce the NRFU workload.
Test an adaptive design approach for cases not enumerated with administrative records and third-party data.
Test methods for reducing the number of enumeration contact attempts as compared with the 2010 Census.
Evaluate the use of administrative records and third-party data to identify vacant housing units and remove them from the NRFU workload.
Test the use of the telephone to make initial enumeration contact attempts.

Findings

Successfully used administrative records and third-party data to identify vacant and occupied housing units and removed cases from the NRFU workload.
Successfully used administrative records and third-party data as part of an adaptive design approach to designate cases for one to three contact attempts.
Adaptive design strategies as implemented did not work.
Design added complexity to training of enumerators.

Design Implications

Continue refinement of adaptive design methods and administrative records and third-party data usage.
Continue refinement of training methods.


4.1.1.5 2014 Census Test
The 2014 Census Test was an operational study of self-response and Nonresponse Followup procedures.
For this test, Census Day was assumed to be July 1, 2014. The test involved 192,500 housing units in
portions of Montgomery County, Maryland, and Washington, DC.
2014 Census Test
Objectives

Test various self-response modes, including the Internet, CQA, and paper, as well as the ability to respond without a
preassigned Census identifier.
Evaluate the value of a preregistration option using “Notify Me” (a Web site that allows respondents to
indicate a preferred mode of contact for the 2020 Census).
Test the use of mobile devices for Nonresponse Followup enumeration in the field.
Test the use of Bring Your Own Device to conduct enumeration in the field.
Continue evaluating the use of administrative records and third-party data to remove cases (vacant and
nonresponding occupied housing units) from the Nonresponse Followup workload.
Test the effectiveness of applying adaptive design methodologies in managing the way field enumerators
are assigned their work.
Examine reactions to the alternate contacts, response options, administrative record use, and privacy or
confidentiality concerns (including how the Census Bureau might address these concerns through micro- or
macro-messaging) through focus groups.

Findings

The total self-response rate was 65.9 percent; the Internet self-response rate was 50.6 percent.
E-mail contact attempts did not work due to a large number of incorrect e-mail addresses (bounce-backs).
The address collection interface in the Internet instrument yielded a much greater proportion of higher
quality address data from respondents without a unique Census ID than in 2010.
Use of administrative records and third-party data matching improved the overall address matching rate.
“Notify Me” had low participation with only about 3 percent of the sample choosing to preregister.
Higher-than-projected inbound phone workloads resulted from respondent questions and issues primarily related
to Internet access.
Problems with coordinating contact with gated communities resulted in inefficient enumeration.
Need to strengthen training and procedures on contacting nonresponding housing units, specifically as
related to proxy interviews.
Need improved business rules and improved rule-based models for administrative records and third-party
data.

Design Implications

Conduct another test of “Notify Me” to determine if more people use this capability when advertising is used
to inform the public about the 2020 Census, and specifically about the “Notify Me” option.
Determine optimal use of adaptive design and administrative records and third-party data.
Further explore the use of Bring Your Own Device.


4.1.1.6 Continuous Small-Scale Testing
The Continuous Small-Scale Testing is a study of
respondent and nonrespondent reactions to new
modes of decennial census contact and response.
The study focuses on reactions related to privacy
and confidentiality of these modes. This study
started in January 2014 and is ongoing as needed.
It included e-mails to 1,000–2,200 housing units
sampled from an opt-in frame.

Continuous Small-Scale Testing
Objectives

Determine how new contact and
response modes will impact the public’s
perception of the Census Bureau’s
commitment and ability to protect
privacy and confidentiality.
Determine how the public feels
about each new mode being tested,
specifically with regard to privacy and
confidentiality.

Findings

A text-based e-mail outperformed
graphical e-mails.
Longer e-mail content with a "Dear
Resident" greeting and the signature of
the Director outperformed a shorter
e-mail invitation without the greeting
and signature.
Respondents reported preferring to
respond online to a decennial census
through a mailed invitation with a link
over all other options.

Design Implications

Continue to monitor respondent and
nonrespondent reactions to various
contact and response modes.

4.1.1.7 LUCA Focus Groups
The LUCA Focus Groups collected input on potential LUCA models for the 2020 Census. Focus groups
consisted of eligible LUCA participants representing various sizes and types of governments across the
nation. Forty-six governmental entities participated. The focus groups were conducted from March 2014
through June 2014.
LUCA Focus Groups
Objectives

Obtain feedback on potential LUCA models for the 2020 Census through a series of focus groups with 2010
Census LUCA participants.

Findings

Continue the 2010 Census LUCA Program improvements that were successful.
•• Continue to provide a 120-day review time for participants.
•• Continue the 6-month advance notice about the LUCA program registration.
•• Continue a comprehensive communication program with participants.
•• Continue to provide a variety of LUCA media types.
•• Continue to improve the partnership software application.
•• Continue state participation in the LUCA program.
Eliminate the full address list submission options that were available in 2010 LUCA (Options 2
and 3). This will:
•• Reduce the number of deleted LUCA records in field verification activities.
•• Reduce the burden and cost of processing addresses and LUCA address validation.
Reduce the complexity of the LUCA Program.
Include census housing unit location coordinates in the census address list and allow partners to return their
housing unit location coordinates as part of their submission.
Provide ungeocoded United States Postal Service (USPS) Delivery Sequence File (DSF) addresses to
state and county partners.
Provide the address list in more standard formats.
Conduct an in-office validation of LUCA submitted addresses.
Utilize Geographic Support System Initiative data and tools to validate LUCA submissions.
Encourage governments at the lowest level to work with higher-level governments to consolidate their
submissions.
Eliminate the Block Count Challenge.
Eliminate the use of the asterisk (*) designation for multiunits submitted without unit designations.
Encourage LUCA participants to identify addresses used for mailing, location, or both.

Design Implications

Develop in-office validation processes, procedures, and tools.
Define relationship between Address Canvassing and LUCA, taking into consideration the timing of LUCA
feedback and the appeals operation.
Determine the feasibility of technical recommendations for the 2020 Census LUCA operation.
•• Use of background imagery on paper maps.
•• Ability to provide structure locations within LUCA materials.
•• Feasibility of web-based registration.
Determine feasibility of using areas where the Census Bureau has planned field activities to validate LUCA
records.


4.1.1.8 2014 Human-in-the-Loop Test

The 2014 Human-in-the-Loop Test consisted of a simulation of reengineered field operations using an Operational Control Center and the enhanced operational control system. The purpose was to test proposed devices, systems, and the field structure for staff and management processes. The Simulation Experiment (SIMEX) occurred in November 2014. Eighty-seven field and office staff tested real-time field operations and the field management structure in this test.

Human-in-the-Loop Test
Objectives

Exercise field reengineering methods (staffing ratios and enhanced operational control system) in a simulated environment.
Refine methods and get input from field staff to improve business processes prior to the 2015 Census Test.

Findings

The new design for managing field operations was successful, including the use of an Operational Control Center and operational control system to manage the Nonresponse Followup workload.
The ratio of enumerators to supervisors can be increased from the 2010 Census.
Instant notification to enumerators and supervisors is feasible and serves as a successful means of communication.

Design Implications

Employ the new design for reengineered field operations during the 2015 Census Test.
Increase the ratio of enumerators to supervisors.

4.1.2 Tests in 2015

A key milestone in October 2015 is the release of the preliminary operational design for the 2020 Census as documented in this plan and supporting materials. This design is informed by tests conducted from 2012 through 2015. Future tests will be used to refine the design.
Figure 19 shows the schedule for the four tests in 2015 and the 2020 Census Operational Plan milestone. Each test is described below.

4.1.2.1 Address Validation Test

The Address Validation Test was conducted to assess the performance of methods and models that will help develop the 2020 Census address list, and to estimate the In-Field Address Canvassing workloads for the 2020 Census. The test contained two components, the MAF Model Validation Test (MMVT) and the Partial Block Canvassing (PBC) Test.

MAF Model Validation Test
The MMVT evaluated methods that are part of the reengineered Address Canvassing innovation area. The test was conducted from September 2014 to December 2014 and included 10,100 nationally representative blocks (100 blocks with no addresses), which included approximately 1.04 million addresses in the sample blocks.

[Figure 19: Tests and Key Decisions in 2015. Timeline (mid-2014 through early 2016) of the 2015 Address Validation Test (MAF Model Validation, 9/14 to 12/14; Partial Block Canvassing, 12/14 to 2/15), the 2015 Optimizing Self-Response Test, the 2015 Census Test, the 2015 National Content Test, and the 2020 Census Operational Plan milestone (10/15).]


MAF Model Validation Test
Objectives

Test In-Office and In-Field Address Canvassing procedures.
Determine the ability to ensure an accurate MAF.
Assess the ability of two sets of statistical models to predict blocks that have experienced address changes.

Findings

In-Office Address Canvassing was effective.
Statistical models were not effective at identifying blocks with changes.
Statistical models were not effective at predicting national coverage errors.

Design Implications

Statistical models are not being pursued for determining blocks with change or MAF coverage.
Continue with In-Office and In-Field Address Canvassing approaches.

Partial Block Canvassing
The PBC Test evaluated the feasibility of canvassing portions of blocks, rather than entire blocks, using both in-office and in-field methods. This test was conducted from December 2014 to February 2015. The staff conducted an interactive review of aerial imagery over time and geographic quality indicators. Six hundred and fifteen blocks with national distribution were listed by 35 professional staff.

Partial Block Canvassing
Objectives

Determine the ability to accurately canvass partial blocks.
Evaluate an interactive review of various materials—primarily aerial imagery over time and geographic quality indicators.
Measure unrecorded changes in blocks and identify portions of blocks where change is likely.

Findings

Operationally feasible to canvass portions of blocks.
In-office imagery review of blocks has utility.

Design Implications

Continue PBC work in conjunction with in-office review.
Determine the cost benefit of partial-block methods against full-block canvass.

4.1.2.2 2015 Optimizing Self-Response Test

The 2015 Optimizing Self-Response Test was an operational study of self-response procedures. For this test, Census Day was April 1, 2015. In the Savannah, Georgia, media market, 407,000 housing units were included in this test, with 120,000 sampled self-responding housing units.


2015 Optimizing Self-Response Test
Objectives

Determine use of digital and targeted
advertising, promotion, and outreach to
engage and motivate respondents.
Test value of “Notify Me” when
partnerships and traditional and
targeted advertising are used to
promote early engagement of
respondents.

Offer the opportunity to respond without a
Census ID (Non-ID Processing) and
determine operational feasibility and
potential workloads around real-time
non-ID processing.
Determine self-response and Internet
response rates.

Findings

The total response rate was 47.5 percent, and the Internet response rate was 33.4 percent.
An additional 35,249 Internet responses were received from housing units not selected in mail panels as a result of advertising and promotional efforts.
Continued low participation in "Notify Me."
Successful implementation of real-time non-ID processing, matching 98.5 percent of cases.
A new postcard panel, designed to test how housing units not originally included in the sample would respond to an invitation after being exposed to advertising, generated a response of approximately 8 percent.

Design Implications

Discontinue "Notify Me."
Continue testing related to partnerships, advertising, and promotional efforts.
Continue offering a non-ID response option to respondents.

4.1.2.3 2015 Census Test

The 2015 Census Test was an operational study of Nonresponse Followup procedures. Census Day was assumed to be April 1, 2015. This test included 165,000 sampled housing units in Maricopa County, Arizona.

2015 Census Test
Objectives

Continue testing full utilization of a field operations management system that leverages planned automation
and available real-time data, as well as data households have already provided to the government, to
transform the efficiency and effectiveness of data collection operations.
Begin examining how regional offices can remotely manage local office operations in an automated
environment, the extent to which enumerator and manager interactions can occur without daily face-to-face
meetings, and revised field staffing ratios.
Reduce Nonresponse Followup workload and increase productivity with use of administrative records and
third-party data, field reengineering, and adaptive design.
Test operational implementation of Bring Your Own Device.
Explore reactions to the Nonresponse Followup contact methods, administrative records and third-party
data use, and privacy or confidentiality concerns.

Findings

Total self-response rate was 54.9 percent; Internet self-response rate was 39.7 percent.
Field Staff Training.
•• Combination of online and classroom training provided standardization of the information, provided
tracking capabilities, and offered various learning methods.
•• Reduced training hours compared with the 2010 Census Nonresponse Followup enumerator training, from
32 hours to 18 hours.
•• Deployment of YouTube videos to quickly and efficiently provide supplemental training to enumerators.
•• Identified topics requiring additional training in future tests.
Field Reengineering.
•• Area Operations Support Center and staffing of the Area Operations Support Center successful.
•• Electronic payroll successful.
•• Entry of availability for work and workload optimization were effective.
•• Operational Control System alerts effective in bringing attention to situations that required follow-up and
possible corrective action.
•• Optimized routing was overall successful, but uncovered where modifications to the routing algorithm are
needed.
Census Operations Mobile Platform for Adaptive Services and Solutions (COMPASS) (application used for
enumerating nonresponding housing units).
•• Application was easy to use.
•• Experienced crashes and freezes of the application; further investigation into root causes is needed.
•• Coverage questions added to respondent burden.
Field Test Procedures.
•• Work needed to define a coordinated approach to enumeration within multiunits and gated communities.
•• Refinement to data collection application "pathing" needed to better assist enumerators in cases of
proxy responses and noninterviews.
Bring Your Own Device.
•• Training was fairly labor intensive.
•• Based on observations, no adverse respondent reactions to the device being used for data collection.

Design Implications

Employ the use of automated training.
Continue to test the use of administrative records and third-party data in reducing the Nonresponse
Followup workload.
Optimize the number of visits and phone contacts for enumeration of nonresponders.
Make at least one contact for nonresponding housing units.
Continue to test field procedures for contacting nonresponding housing units.
Test the use of Device as a Service as a possible alternative or supplement to government-furnished
equipment and Bring Your Own Device.


4.1.2.4 2015 National Content Test
The 2015 National Content Test evaluated and
compared different census questionnaire content. It
assumed a Census Day of September 1, 2015. The
test included 1.2 million nationally representative
households, including 20,000 households in Puerto
Rico and 100,000 reinterviews.

2015 National Content Test
Objectives

Evaluate and compare different census
questionnaire content, including
questions on Race and Hispanic
origin (e.g., combining Race and
Hispanic origin into a single question
versus using separate questions,
and introducing a Middle Eastern
North African category), relationship
(introducing same-sex relationship
categories), and within-household
coverage (streamlined approach for
ensuring accurate within-household
coverage).
Refine estimates of national self-response and Internet response rates.
Continue to test self-response modes
and contact strategies (see 2014
Census Test objectives).
Reinterview a subsample of
respondents to further assess the
accuracy and reliability of the question
alternatives for race, Hispanic origin,
and within-household
coverage.

Findings and Design Implications: To be determined once the test is completed.


4.1.3 Tests in 2016
In 2016, the Census Bureau plans to move from small-scale individual tests using proof-of-concept and
prototype systems to more refined tests and the building of systems that will support the 2020 Census.
These tests are dependent on funding.
Specifically, as shown in Figure 20, two tests are planned for 2016. The 2016 Census Test focuses on the
integration of Self-Response and Nonresponse Followup operations. The 2016 Address Canvassing Test
expands early address canvassing tests to refine the in-office and in-field methods. Each test is described
below.
The following operations and systems will be tested in 2016 through these two tests:
Key Innovation Area: Operations (Systems)

Reengineering Address Canvassing
•• Address listing (Enterprise Listing and Mapping System/Listing and Mapping Instrument).

Optimizing Self-Response
•• Internet Response (PRIMUS Prototype, using cloud infrastructure).
•• Telephone Response (Census Bureau Call Centers).
•• Paper Response (iCADE, Integrated Capture and Data Entry).
•• Non-ID Processing (real-time non-ID processing using cloud infrastructure).

Utilizing Administrative Records and Third-Party Data
•• Identification of vacant and occupied units (Headquarters' servers).
•• Removal from the NRFU workload (Control and Response Processing Data System).

Reengineering Field Operations
•• Workload Control (MOJO, the in-field operational control system prototype, begins interfacing with the Multi-mode Operational Control System).
•• Enumeration and Quality Assurance (COMPASS Prototype).

[Figure 20: Tests Planned in 2016. Timeline of proposed tests: the 2016 Census Test (3/16 to 7/16) and the 2016 Address Canvassing Test (In-Field, 9/16 to 11/16).]


4.1.3.1 2016 Census Test
The 2016 Census Test is planned to be an operational study of both self-response and Nonresponse
Followup procedures. It will have a Census Day of April 1, 2016, and will include approximately 225,000
housing units per site in Los Angeles County, California, and Harris County, Texas.
2016 Census Test
Objectives

Self-Response.
•• Test provision of language support to Limited English Proficient populations through partnerships and
bilingual questionnaires.
•• Test ability to reach demographically diverse populations.
•• Refine Real-Time Non-ID Processing methods, including respondent validation.
•• Test cloud-based infrastructure for self-response and Non-ID Processing.
Nonresponse followup.
•• Refine the reengineered field operations.
•• Refine the field management staffing structure.
•• Test enhancements to the Operational Control System and COMPASS.
•• Refine the path in COMPASS to conduct proxy interviews.
•• Test improved procedures for multiunit accessibility and contact.
Reengineered quality assurance.
•• Evaluate the use of paradata and Global Positioning Satellite points collected during interview.
•• Test reinterview functionality.
Measure the systems’ abilities to manage a significant number of concurrent users during self-response.
Test a combination of Bring Your Own Device, government-furnished equipment, and Device as a Service
strategies for supplying enumerators with hardware devices.
Test scalability of Internet and non-ID processing during self-response using enterprise solutions.

Findings and Design Implications: To be completed once the test is conducted.


4.1.3.2 2016 Address Canvassing Test

The 2016 Address Canvassing Test is planned to be an operational study of In-Office and In-Field Address Canvassing procedures. It will begin in the fall of 2016 and will continue into 2017. This test will cover various sites across the nation with a specific focus on areas required to support the 2017 Census Test (i.e., one urban area, two American Indian Reservations, and Puerto Rico).

2016 Address Canvassing Test
Objectives

Test new In-Office and In-Field Address Canvassing methods.
Test the use of the Listing and Mapping Instrument.
Test the use of the Basic Collection Unit instead of traditional collection geography.
Test updates to the MAF/TIGER system with address and spatial data.
Test reengineered methods for quality assurance.
Test address updating and matching software for Puerto Rico.

Findings and Design Implications: To be completed once the test is conducted.

4.1.4 Tests in 2017

The 2017 Census Test is the single test to be conducted in 2017. This test comprises multiple operations, including Group Quarters (GQ), Update Enumerate (UE), Internet Self-Response, and NRFU. This test and its scope are dependent on funding. The planned schedule for testing each of these operations as part of the 2017 Census Test is shown in Figure 21.

4.1.4.1 2017 Census Test

The 2017 Census Test is planned to be an operational study of address canvassing, self-response, and Nonresponse Followup procedures. It will have a Census Day of April 1, 2017, and will cover various geographic areas across the nation with a specific focus on one urban area, two American Indian Reservations, and Puerto Rico.

[Figure 21: Schedule for the 2017 Census Test. Timeline (October 2016 through December 2017) for the Group Quarters (GQ), Update Enumerate (UE), Internet Self-Response, and Nonresponse Followup operations.]


2017 Census Test
Objectives

Evaluate cross-operation impacts of innovations including Address Canvassing, Communications, Self-Response, Non-ID Processing, GQ, UE, and NRFU.
Test communications for promoting recruiting and language support.
Test UE and GQ operations.
Test UE using an integrated set of tools (i.e., using the same mobile device to list addresses and enumerate nonresponding housing units).
Determine the use of administrative records and third-party data as applied to group quarters populations.
Test the reengineered Quality Control process for field operations.
Measure workloads and improve workload models.
Test the performance management dashboard.

Findings and Design Implications: To be completed once the test is conducted.

4.1.5 Tests in 2018

One test is planned for 2018: the 2018 Census End-to-End Test. The goal is to have the entire operational design for seven major operations (see below) ready for production—from a systems, operational, and architectural perspective. This test and its scope are dependent on funding. The 2018 Census End-to-End Test will include significant field data collection components, and the timing of the field operations will mimic the 2020 Census (see Figure 22).
This test is in the early planning stages. Findings from the 2016 Address Canvassing Test and the 2017 Census Test will be used to develop the test plans. Other efforts in preparation for this test include introducing CEDCaP systems that were not in place for earlier tests, expanding and enhancing those systems already in use, and expanding and enhancing the systems using cloud technologies.
Any problems found during the 2018 Census End-to-End Test will be addressed using careful regression testing and change control procedures in 2019.

[Figure 22: Schedule for the 2018 Census End-to-End Test. Timeline (2017 through mid-2019) for the Address Canvassing, Group Quarters, Internet Self-Response, Update Enumerate, Nonresponse Followup, Census Coverage Measurement, and Post Processing and Products (End-to-End Functional Test) threads.]


4.1.5.1 2018 Census End-to-End Test

The 2018 Census End-to-End Test is planned to test seven major threads that cover the vast majority of the 2020 Census requirements: Address List Development, Self-Response, UE, NRFU, GQ, Census Coverage Measurement, and Post Processing. This test will have a Census Day of April 1, 2018, with the Address Canvassing operation to be conducted in the prior calendar year. It will include a sample that represents urban and rural areas, Puerto Rico, and group quarters.

2018 Census End-to-End Test
Objectives

Test critical systems and operations together to ensure proper integration and conformance with the requirements.
Test all systems and operations to ensure readiness.

Findings and Design Implications: To be completed once the test is conducted.

4.1.6 Tests in 2019

As shown in Figure 23, two types of tests are planned for 2019: Defect Resolution Testing and Post End-to-End Performance Testing. These tests and their scope are dependent on funding. Defect Resolution Testing will ensure that any changes made to fix defects in the systems tested in the 2018 End-to-End Test are correctly resolved. The final performance testing in 2019 minimizes the risk of system crashes and delays in processing respondent Internet submissions and phone calls. Components of performance testing will be done according to best practices.

4.2 KEY DECISION POINTS AND MILESTONES

Figure 24 shows the key decision points and milestones for the full life cycle of the 2020 Census. Milestones include public-facing milestones, such as launching the communications campaign, as well as delivery of 2020 Census products to the President, states, and the public.

[Figure 23: Defect Resolution and Performance Tests in 2019. Timeline (July 2018 through early 2020) of Defect Resolution Testing and Post End-to-End Performance Testing for the Address Canvassing, Internet Self-Response, Update Enumerate, and Nonresponse Followup threads.]


[Figure 24: Key Decision Points and Milestones. Life-cycle timeline (2011 through 2023) of milestones, including Begin 2020 Census (11/11), Launch 2020 Census Web site (1/15), 2020 Census Operational Plan (10/15), Award Census Questionnaire Assistance Contract, Award Communications Contract, Census Topics to Congress, Deliver Final Residence Rules, Open Regional Census Centers, Census Questions to Congress, Open Field Offices, Group Quarters Operations Begin, 2020 Census Day (4/20), NRFU Complete, Count Review Complete, Deliver Counts to the President (12/20), Deliver Redistricting Counts to States (3/21), Complete LUCA, Release Final 2020 Data Products, and Complete 2020 Census.]

4.3 2020 CENSUS PRODUCTION
OPERATIONAL SCHEDULE
Figure 25 describes the planned timing for the
major production field operations for the 2020
Census. This schedule represents the current baseline and may change based on available funding
and final design decisions.

Figure 26 provides an integrated schedule for the
tests, key milestones, and production operations in
one chart. Different types of tests (research, readiness, performance, end-to-end, and post end-to-end)
are shown in different colors as noted in the
legend. Key milestones, including the baseline of
the 2020 Census Operational Plan, the delivery of
topics and questions to Congress, Census Day, and
the delivery of apportionment counts and redistricting data, are also shown.


[Figure 25: 2020 Census Operations—Production Timeline. Timeline graphic (2015–2021) showing the production schedule for In-Office Address Canvassing (beginning 9/15), In-Field Address Canvassing, Group Quarters (GQ), Census Questionnaire Assistance, Internet Self-Response, Update Enumerate (UE), Nonresponse Followup (NRFU), and Response Processing to Deliver Apportionment, along with Census Day (4/1/20), delivery of apportionment counts (12/31/20), and delivery of redistricting data (3/31/21).]

[Figure 26: High-Level Integrated Schedule. Graphic integrating on one timeline: the early 2012–2014 tests (public opinion polling, continuous small-scale testing, 2012 National Census Test, 2013 National Census Contact Test, 2013 Census Test, 2014 Census Test, LUCA focus groups, 2014 Human-in-the-Loop Test, and MAF model validation); the tests and key decisions in 2015 (Partial Block Canvassing, 2015 Optimizing Self-Response Test, 2015 Census Test, 2015 National Content Test, and the 2020 Census Operational Plan); the proposed tests planned in 2016 (2016 Census Test and 2016 Address Canvassing Test (In-Field)); the key decisions and proposed tests in 2017; the 2018 Census End-to-End Test; the performance and defect resolution testing planned in 2019 (including post end-to-end performance testing); and the 2020 production operation schedule, through Census Day, delivery of apportionment counts, and delivery of redistricting data.]


5. The 2020 Census Operations
This section of the document provides the current
state of the operational design. An overview of
the 34 operations is presented, followed by more
detailed descriptions of each operation that include
the following:
•• Purpose: A concise description of the
operation.
•• Lessons Learned: Selected lessons learned
from the 2010 Census or tests or studies that
have occurred since the 2010 Census.2
•• Opportunities to Innovate: Major planned
innovations for this operation.
•• Description of Operation: A basic description
of the operation.
•• Research and Design Decisions: Research
completed through the Research and Testing
phase of the 2020 Census Program (2012–2015)
and the major findings from and decisions made
based on this research.
•• Design Issues to Be Resolved: Outstanding
design components and when and how they will
be addressed.
•• Cost and Quality: The expected cost and
quality impacts of the proposed design (or
alternative designs) for this operation. Only cost
impacts of $100 million or more are reflected.
•• Risks3: The top risks associated with this
operation.
•• Milestones: Important dates associated with
this operation to include decision points and
production dates.
For support and similar operations that do not
require a research-based design, the sections on
research and design decisions focus on work completed or to be completed and general decisions
and issues.

2 The Knowledge Management Database contains the lessons
learned from the 2010 Census and is available for review upon
request.
3 Each operation has its own project-level risk register, which
includes the full list of project risks. These are available upon
request.
Note that throughout this document references are
made to specific systems that are part of CEDCaP.
These are the systems being used to support the
early census tests; however, final decisions on production systems have not been made.

5.1 OPERATIONS OVERVIEW
Figure 27 illustrates all 34 operations organized
by the 2020 Census WBS elements. As noted by
the shading on the diagram, the amount of detailed
planning conducted to date, and thus the maturity
of the operational design, varies across the 34
operations.
Although each operation is presented separately,
the operations must work together to achieve a
successful census. Information flows among the
operations as the census proceeds from frame
development through collection of response data
to the publishing and release of the data.
The integration of these business operations
requires integration of the information technology
systems that support them. This is a significant
effort and is underway. Not all of the interfaces
for the 2020 Census are fully defined at this time;
however, the Systems Engineering and Integration
operation will detail those interfaces as the
Research and Testing phase ends and systems are
built for production.
The information flows among the primary business
operations are highlighted in the diagram below
(Figure 28). Major interactions and flows are shown
via the arrows in the diagram and the key external
interfaces are depicted in blue text.

[Figure 27: Operational Overview by Work Breakdown Structure. Diagram organizing the 34 operations into WBS areas, shaded by detailed planning status (underway, recently begun, or not started): SUPPORT (Program Management; Census/Survey Engineering: Systems Engineering and Integration, Security, Privacy and Confidentiality, Content and Forms Design, Language Services; Infrastructure: Field Infrastructure, Decennial Logistics Management, IT Infrastructure, Decennial Service Center); FRAME (Geographic Programs; Local Update of Census Addresses; Address Canvassing); RESPONSE DATA (Forms Printing and Distribution; Paper Data Capture; Integrated Partnership and Communications; Non-ID Processing; Update Enumerate; Group Quarters; Internet Self-Response; Census Questionnaire Assistance; Nonresponse Followup; Response Processing; Enumeration at Transitory Locations; Federally Affiliated Americans Count Overseas); PUBLISH DATA (Data Products and Dissemination; Redistricting Data; Count Review; Count Question Resolution; Archiving); OTHER CENSUSES (Island Areas Censuses); TEST AND EVALUATION (Coverage Measurement Design and Estimation; Coverage Measurement Matching; Coverage Measurement Field Operations; Evaluations and Experiments).]

[Figure 28: High-Level Integration of Operations. Diagram of information flows among the primary business operations, with key external interfaces shown in blue text. Addresses, spatial data, and boundaries flow from external data sources, tribal, state, and local governments, and the LUCA program through Geographic Programs to Address Canvassing (in-office and in-field). Contact materials and awareness reach respondents via Forms Printing and Distribution and Integrated Partnership and Communications. Paper and electronic responses flow from respondents, field workers, Census Questionnaire Assistance, and special-population operations (Group Quarters, Enumeration at Transitory Locations, Federally Affiliated Americans Count Overseas) through Paper Data Capture, Internet Self-Response, Non-ID Processing (real-time and batch), Update Enumerate, and Nonresponse Followup into Response Processing. Post-processed response data flow to Data Products and Dissemination, Count Review, the Redistricting Data Program (state legislatures), Count Question Resolution, and Archiving (NARA), with apportionment counts delivered to the White House.]

5.1.1 Frame
As shown in Figure 28 from the previous page, the
basic flow of information begins in the frame area
with the Geographic Programs operation that
receives addresses, spatial data, and boundary
information from tribal, federal, state, and local
governments. An additional method for updating
the frame is the review of the address and boundary information through the Local Update of
Census Addresses (LUCA) program. Updates
through Geographic Programs and LUCA typically
include adding missing living quarters, deleting
erroneous living quarters, and modifying or correcting existing records. The most current address
list is provided to the Address Canvassing
operation where staff make updates to the list via
in-office and in-field procedures. These updates
are processed on an ongoing basis throughout the
decade. Once the frame updates are complete, the
initial universe of living quarters is used for enumeration operations in the Response Data area.
The Geographic Programs operation allocates the
universe of addresses into different methods and
modes for the following operations:
•• Enumeration at Transitory Locations:
Enumerate individuals in occupied units at
transitory locations, such as recreational vehicle
parks, campgrounds, tent cities, racetracks,
circuses, carnivals, marinas, hotels, and motels,
who do not have a usual home elsewhere.
•• Update Enumerate: Update the address and
feature data and enumerate housing units in
certain designated geographic areas with special
enumeration needs (e.g., areas that do not
have city-style addresses and areas with unique
challenges associated with accessibility). (This
operation crosses Frame and Response Data
Collection in the graphic and in the WBS).
•• Group Quarters: Enumerate people living or
staying in group quarters, people experiencing
homelessness, and people receiving service at
service-based locations.
•• Federally Affiliated Americans Count
Overseas: Obtain counts by home state of
U.S. military and federal civilian employees stationed or deployed overseas and their dependents living with them.
All responses from these operations are collected
electronically. Some of these operations (e.g., UE
or ETL) may find addresses that were not in the
initial universe. Address updates collected during
these operations are sent back to the Geographic
Programs operation for processing.
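The frame-update cycle described above (adding missing living quarters, deleting erroneous ones, and correcting existing records) can be sketched as a simple keyed merge. This is only an illustrative sketch: the field names and action codes are assumptions for illustration, not the actual MAF/TIGER schema or processing rules.

```python
# Illustrative sketch of the frame updates described above: adds,
# deletes, and corrections applied to an address list keyed by ID.
# Field names and action codes are assumptions, not the MAF schema.

def apply_updates(address_list, updates):
    """Return a new address frame with add/delete/modify updates applied."""
    frame = dict(address_list)
    for u in updates:
        if u["action"] == "add":
            frame[u["id"]] = u["record"]
        elif u["action"] == "delete":
            frame.pop(u["id"], None)  # ignore deletes for unknown IDs
        elif u["action"] == "modify":
            # Merge corrections into the existing record without
            # mutating the caller's data.
            frame[u["id"]] = {**frame.get(u["id"], {}), **u["record"]}
    return frame
```

Updates arriving from Geographic Programs, LUCA, and Address Canvassing would all funnel through a merge of this general shape before the initial enumeration universe is cut.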

5.1.2 Response Data
A key goal for the 2020 Census is to optimize
self-response. Integrated Partnership and
Communications and Forms Printing and
Distribution create awareness for and send contact
materials to the respondents, directing them to the
online questionnaire or to a paper questionnaire.
During Internet Self-Response, some respondents will not have a Census ID; for these respondents, the Census Bureau
will do real-time (during the interview) processing
to identify the correct block for the respondent's
address using methods in the Non-ID Processing
operation. Respondents who do not respond via
the Internet will be given the opportunity to respond
via Paper Data Capture. Some respondents will
call with questions, and the Census Bureau will offer
to collect their responses by telephone through
the Census Questionnaire Assistance operation. All the responses from each of the Response
Data Collection operations will go to the Response
Processing operation, which manages the status
of cases across the universe. Addresses for which
the Census Bureau did not receive a response are
sent to the Nonresponse Followup operation,
which determines the most cost-effective way of
enumerating those households (personal visit, use
of administrative records and third-party data, proxy
responses, or imputation). Any new addresses identified during Nonresponse Followup are sent to the
Geographic Programs operation for processing.
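The case-management flow described above can be sketched as a simple routing function. The operation names come from this plan, but the routing logic, thresholds, and field names are illustrative assumptions, not the Census Bureau's production rules.

```python
# Illustrative sketch of the response-data case flow described above.
# Operation names come from the plan; the routing conditions and
# field names are simplified assumptions for illustration only.

def route_case(case):
    """Return the next operation for an address record."""
    if case.get("responded"):
        # Internet responses without a Census ID need real-time
        # geocoding to identify the correct block.
        if case.get("mode") == "internet" and not case.get("census_id"):
            return "Non-ID Processing"
        return "Response Processing"
    # No response received: Nonresponse Followup chooses the most
    # cost-effective enumeration method for the household.
    if case.get("admin_records_usable"):
        return "NRFU: administrative records"
    return "NRFU: personal visit"
```

In practice Response Processing manages this status across the whole universe of cases; the sketch only shows the branching logic for a single address.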

5.1.3 Publish Data
Response Processing delivers the edited data to the
Data Products and Dissemination operation to
prepare the final 2020 Census data products. This
operation delivers:
•• Apportionment counts to the White House and
statistical data to the public.
•• State counts to the Redistricting Data
Program for dissemination to the state legislatures so state governments can define the
geographic boundaries for Congressional and
legislative districts.


•• Final counts to the Count Review operation for
Federal-State Cooperative Population Estimates
(FSCPE) members to ensure the accuracy of the
2020 Census.
•• Final counts to the Count Question
Resolution (CQR) operation so challenges to
census counts can be resolved.
•• Every questionnaire to the Archiving operation
for public release 72 years after the census.
This description of all 34 operations and the basic
integration only depicts high-level data flow and
interaction. The detailed Business Process Models
(BPM) found in the Detailed Operational Plans for
each operation show how information flows within
operations.

5.2 PROGRAM MANAGEMENT
5.2.1	 Program Management
Detailed Planning Status: Underway

Purpose
The Program Management operation defines and
implements program management policies, processes, and the control functions for planning and
implementing the 2020 Census in order to ensure
an efficient and well-managed program.
Lessons Learned
Based on lessons learned from the 2010 Census
and other reviews, the following recommendations
were made:
•• Develop a life-cycle schedule for the 2020
Census, and complete it earlier in the decade.
•• Place more emphasis and resources on updating
cost and schedule estimates throughout the life
cycle.
•• Obtain independent cost estimates and use
them to validate cost estimates (that include
contingency reserves) developed by stakeholder
organizations.
•• Improve strategic planning and early
implementation of the 2020 Census Risk
Management process.
•• Align system development schedules with operational deadlines to allow adequate time to test
systems before their deployment.


•• Reevaluate the practice of frontloading and
develop a better process for developing workload and cost assumptions.
•• Rethink and rework stakeholder engagement,
education, and management. The Census
Bureau needs to better define, and then clearly
articulate, its expectations with regard to roles
and responsibilities among the Census Bureau,
contractors, and stakeholder groups.
•• Set a clear and publicly announced goal to
reduce the inflation-adjusted per housing unit
cost relative to 2010 Census totals.
Opportunities to Innovate
Following an analysis and review of the 2010
Census program management practices, the 2020
Census Program improved its program management capabilities and defined program management processes
earlier in the decade to support 2020 Census
Research and Testing activities. New and improved
program management practices integrated into
the 2020 Census that were not part of the 2010
Census include the following:
•• Iterative operational planning to allow for periodic design refinements based on findings from
research and testing, as well as external changes
in legislation and technology.
•• Evidence-based decision-making to ensure that
operational designs are based on solid evidence
from research, testing, analysis, and prior survey and census experience.
•• Integration of schedule, scope, and budget using
a common WBS.
•• An integrated life-cycle master schedule that
uses best practices based on the Government
Accountability Office (GAO) schedule assessment guide.
•• Cost and schedule estimates updated throughout the 2020 Census life cycle based on GAO
best practices:
ºº Publication GAO-09-3SP Cost Estimating
and Assessment Guide: Best Practices for
Developing and Managing Capital Program
Costs.
ºº Publication GAO-12-120G Schedule
Assessment Guide: Best Practices for Project
Schedules.

[Figure 29: Program Management Framework. Diagram showing 12 program management processes (1. Governance; 2. Strategic Communications; 3. Strategic Management; 4. Document Management; 5. Change Management; 6. Knowledge Management; 7. Acquisition and Sourcing Management; 8. Budget Management; 9. Schedule Management; 10. Performance Management; 11. Human Capital Management; 12. Risk/Issue Management) surrounding the project life cycle (Initiation, Planning, Execution, Closeout/Evaluation), supported by Project Management and Project Control and Earned Value Management.]

•• A Knowledge Management process and database
for lessons learned from the 2010 Census, 2020
Census Research and Testing Program, advisory
committees, and audit and oversight reports.
•• Performance Management focused on key cost
drivers.
•• Alignment with the Census Bureau's approach
to implement activity-based management and
earned value management techniques.
•• Formal risk management kicked off earlier in
the decade (2012) and occurs at both the program
level and project level.
•• Increased transparency and collaboration with
internal and external stakeholders about the
2020 Census.
•• Increased international stakeholder communications to leverage learnings of other countries'
census efforts and to share the Census Bureau's
best practices and challenges.
•• Governance that bridges organizational silos.
•• Workforce that is appropriately skilled and trained.

Description of Operation
The Program Management operation is responsible
for the planning and implementation of the 2020
Census. Specifically, this operation defines the
overall 2020 Census program and project management policies, framework, and control processes
used across the entire 2020 Census and all projects
established within the program.
The established Program Management framework
is shown in Figure 29.
General activities are required to manage multiple, ongoing, interdependent projects in order to
fulfill the 2020 Census mission and objectives.
The Program Management operation defines and
manages the following 12 program management
processes:
1.	Governance: The overall management
structure, decision-making authority, priority
setting, resource utilization, and performance
verification at each level of the program.
2.	Strategic Communications: The engagement with internal and external stakeholders,
including Congress and the general public, in
the planning, research and analysis, progress,
and decisions related to the 2020 Census.
This activity also includes collaboration with
international organizations, particularly the
International Census Forum and the United
Nations Statistics Division (for the global view
of censuses) and the United Nations Economic
Commission for Europe (for the regional view).
3.	Strategic Management: The process for
determining and documenting the 2020
Census strategic direction regarding strategies, goals, objectives, performance, and
investments.
4.	Document Management: Activities for
consistent and centralized management of
program documentation produced in support
of the 2020 Census program.
5.	Change Management: Activities for managing and controlling the 2020 Census strategic
baseline, including control of charters, process
plans, design documents, operational plans,
project plans, requirements, and schedules.
6.	Knowledge Management: Practices used
to identify, create, represent, distribute, and
enable adoption of insights and experiences.
7.	Acquisition and Sourcing Management:
Activities to provide and support acquisition
principles and guidelines.
8.	Budget Management: Activities used to
establish and manage future-year budget
formulations, current-year budget execution,
and cost estimating and cost modeling.
9.	Schedule Management: Activities used
to identify and schedule activities required
to produce program deliverables, identify
interdependencies between activities, and
determine activity resource requirements
and duration.
10.	Performance Measurement and
Management: Practices used to monitor the
progress of the 2020 Census in order to identify variances, assign corrective actions, and
make timely changes.
11.	Human Capital Management: Activities to
ensure that human competencies and skills
are present and available to the organization.
12.	Risk and Issue Management: Activities to
facilitate the identification, analysis, mitigation, and contingency planning for risks and
issues related to achieving the program's
objectives.
Each component of the framework is documented
in detail in a separate process plan. Program
Management process plans are revised as the program evolves, based primarily on lessons learned
and other feedback received from process owners
and users.
Work Completed
The following work has been completed for this
operation:
The program management processes listed above
were approved in 2011, funded, established, and
utilized during the 2020 Census Research and
Testing Phase. They will continue to be used for the
remaining phases of the 2020 Census.
Decisions Made
The following decisions related to the 2020 Census
Program Management operation have been made:
✓ Strategies for each program management element were defined and approved in 2011 and
formed the basis for the management of the
2020 Census Program.
✓ The 2020 Census will be managed by using a fully
integrated master schedule designed and built
using best practices based on the GAO schedule
assessment guide (GAO-12-120G, May 2012).
✓ The 2020 Census will follow the Enterprise
Systems Development Life Cycle process for all
decennial IT projects. The Census Bureau Project
Life Cycle will be followed for all projects (IT and
non-IT projects).
✓ The 2020 Census will manage program-level
risks at the Portfolio Management Governing
Board level and project-level risks at the project
team level.
✓ The program will have a finalized and integrated
governance and performance measurement
reporting mechanism.
✓ The risk management plan includes both the
program- and project-level processes.
✓ A formal memorandum series will be used to
document significant program decisions.
✓ The program will actively engage with stakeholders and advisors on major aspects of the
2020 Census.
✓ Quarterly 2020 Census Program Management
Reviews will be conducted, including a live
Webcast, so stakeholders can watch live or on
demand later.
✓ The 2020 Census Monthly Status Reports will be
delivered to key oversight entities.
✓ A Decennial Policy Team will be developed and
managed to ensure interdisciplinary, interdirectorate communication in regard to legal,
policy, and IT security sensitivities.
✓ The 2020 Census Web site will be developed and
supported.
✓ Frequently Asked Questions about the test
program will be developed along with other
supporting materials.
✓ Talking Points for customer assistance for internal phone and correspondence support centers
will be developed.
✓ A directorate representative to the Census Bureau's
International Collaboration Steering Committee
will be appointed to communicate and coordinate
international collaboration across the agency.
✓ The Census Bureau will actively participate with
international and national statistical and geographic organizations for key learnings and to
share the Census Bureau's experiences.

Issues to Be Resolved
The Program Management operation needs to continue to establish and refine particular areas. Key
activities include the following:
•• Maturing and ensuring full utilization of performance management to better facilitate early
identification and correction of problems.
•• Maturing change management processes to better ensure impact assessment.
•• Maturing human capital management to better
plan, facilitate, and monitor a workforce that has
required competencies and skills.
•• Maturing the schedule tool approach for using
Primavera for the program and MS Project interaction for the enterprise.
•• Maturing integration of 2020 Census schedules
with enterprise efforts and enterprise schedules.
•• Defining the role and processes for using
SharePoint in performance management.
•• Defining the detailed earned-value management
methodology.
•• Defining methods to link risk mitigation actions
to the master integrated schedule.

Cost and Quality
Strong program management ensures an efficient
2020 Census. Specific examples are noted below.
A down arrow indicates a reduction in cost and an
up arrow indicates an increase in cost as compared
with the 2010 Census through:
↓ Investment in establishing a robust and formal
program management office that develops and
manages processes that minimize potential negative cost, schedule, and scope impacts.
↓ Ongoing stakeholder engagement reduces the
likelihood of unplanned design changes late in
the decade, which can prevent additional costs.
Program management does not directly impact the
quality of the 2020 Census results.

Risks
The Program Management operation identifies and
manages all program-level risks. The risks listed
below are specific to this operation.
Commitment by the 2020 Census senior managers to improve the program management process used for the 2010 Census program requires
resources and staff with certain skill sets. IF the
skilled resources are not available and funded to
implement program management, THEN critical
functions such as schedule, budget, scope, and
risk management will be jeopardized, leading to
negative impacts to cost and schedule.
As part of the 2020 Census Program Management
operation, a framework of program management
processes has been developed to ensure the
implementation of consistent and thorough
program management controls. IF staff working
on the 2020 Census operations do not follow
the program management processes, THEN the
2020 Census projects may lack sufficient scope,
schedule, and budget controls and risk management,
increasing the likelihood of negative impacts to
cost and schedule.
Performance measurement is a critical function
needed by managers to track the status of planning,
development, and implementation of the 2020
Census program and operations. IF performance
measures are inadequately defined and/or monitored, THEN managers will have difficulty assessing
and reporting accurate cost and progress status.

Milestones

September 2010: Baseline the initial 2020 Census Strategic Plan.
June 2011: Baseline the initial 2020 Census Life Cycle Rough Order of Magnitude Cost Estimate.
September 2011: Develop and gain approval for 2020 Census Program Management Process Strategies for each component described in this operation.
September 2012: Baseline the initial 2020 Census Program-Project Management Plans for each component described in this section.
December 2012: Begin the quarterly 2020 Census Program Management Reviews.
May 2013: Baseline the initial 2020 Census Mission-level Requirements.
April 2014: Baseline the initial 2020 Census Life Cycle Integrated Schedule.
October 2015: Issue the Baseline of the 2020 Census Operational Plan.
October 2015–September 2018*: Baseline the Detailed 2020 Census Operational Plans (one for each operation).
Annually: Refresh and reissue strategic program documentation and the 2020 Census Operational Plan based on lessons learned, test results, and other feedback.
Annually: Conduct project management process training for process users.

* The dates for each of the Detailed Operational Plans vary depending on the timing of the operation. For example, the Detailed Operational
Plan for the Address Canvassing operation is due in October 2015 and
the Detailed Operational Plan for the Archiving operation is due in 2018.


5.3 CENSUS/SURVEY ENGINEERING
The support operations in this area provide the
foundation for conducting the 2020 Census.
This area consists of four operations: Systems
Engineering and Integration; Security, Privacy, and
Confidentiality; Content and Forms Design; and
Language Services. Each is described below.

5.3.1	 Systems Engineering and
Integration
Detailed Planning Status: Underway

Purpose
•• Manages the delivery of a system of systems
that meets 2020 Census Program business and
capability requirements.
•• Implements and manages the full Enterprise
Systems Development Life Cycle for systems
supporting the 2020 Census.
Lessons Learned
Based on lessons learned from the 2010 Census
and other reviews, the following recommendations
were made:
•• Need to have a well-documented plan that
describes the development of the business
architecture and the solution architecture. The
architecture plan must have buy-in and adoption
by all stakeholders.
•• Consider greater flexibility for requirements
configuration management in the early design
and development processes to help minimize
the necessity to make subsequent corrections,
potentially saving resources and costs associated with unplanned resource needs.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Application of the Census Bureau’s Enterprise
Systems Development Life Cycle.
•• Integration with the Census Bureau’s Enterprise
Architecture.
•• Implementation of performance measurement.
•• Integration with Enterprise systems, as
appropriate.


•• Dedicated resources from the IT Directorate for
key positions, including Chief Architect, Chief
Systems Engineer, and Chief IT Security Engineer.
Description of Operation
The scope of the Systems Engineering and Integration (SE&I) operation is to implement and manage the full Enterprise Systems Development Life Cycle (eSDLC) for the 2020
Census. There are five major components of SE&I,
including: Requirements Engineering, Solution
Architecture, Solution Development and Technical
Integration, Test and Evaluation, and Deployment
and Operations and Maintenance. As part of
all of these efforts, SE&I will utilize the following standard program management concepts to
manage these tasks: Schedule Management, Risk
Management, Issue Management, Configuration
Management, and Quality Assurance.
Requirements Engineering
Based on the design of the 2020 Census and plans
documented in the 2020 Census Operational Plan,
the SE&I operation defines and executes a requirements engineering approach for the 2015–2018
Census Tests and 2020 Census that aligns with
the Census Bureau’s eSDLC, meets agency and
Department of Commerce standards and guidelines, and emphasizes consistency in approach
across the portfolio of 2020 Census projects. The
scope of the Requirements Engineering effort
includes the following:
•• Ensure the controlled and consistent application
of a standardized approach to requirements
engineering throughout the program and project
life cycles.
•• Conduct early and more frequent user testing
and engagement, employing prototypes, models, and simulations wherever practicable and avoiding an “over the fence” approach
to requirements engineering.
•• Establish the requirements engineering methodology and tools that must be applied across the
decennial and supporting programs:
ºº Develop Business Process Models (BPM) in
concert with subject matter experts for each
operation for each of the 2015–2018 Census
Tests and the 2020 Census as a tool to begin
the requirements elicitation process.
ºº Extract Project-Level Business Requirements
(PLBR) and draft Capability Requirements
(CAP) from the BPM and review with subject
matter experts to finalize the initial baseline
of PLBR and CAP.
•• Facilitate broad program and project level
understanding of needs for all phases of the
2020 Census.
•• Develop 2015–2018 Census Tests and 2020
Census Workload Demand Models, which will aid
the 2020 Census Operational Integrated Project
Teams in identifying the nonfunctional performance PLBR and CAP.
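A workload demand model of the kind described above can be illustrated with a deliberately simplified sketch. All figures, function names, and parameters below are hypothetical illustrations, not values from the plan; the Census Bureau's actual models are far more detailed:

```python
# Illustrative sketch of a workload demand model (hypothetical figures only).

def nrfu_workload(housing_units: int, self_response_rate: float) -> int:
    """Housing units remaining for in-person Nonresponse Followup (NRFU)."""
    return round(housing_units * (1 - self_response_rate))

def peak_concurrent_sessions(daily_responses: int, peak_hour_share: float,
                             avg_session_minutes: float) -> int:
    """Rough peak concurrent Internet sessions a response system must sustain."""
    peak_hour_responses = daily_responses * peak_hour_share
    return round(peak_hour_responses * avg_session_minutes / 60)

if __name__ == "__main__":
    # Hypothetical inputs: 140M housing units, 60.5% self-response.
    print(nrfu_workload(140_000_000, 0.605))
    # Hypothetical inputs: 6M responses/day, 15% in the peak hour,
    # 10-minute average session.
    print(peak_concurrent_sessions(6_000_000, 0.15, 10.0))
```

Nonfunctional performance requirements (the PLBR and CAP mentioned above) would then be stated against outputs like these, e.g., sustained concurrent-session capacity.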
As the incremental baselines of the PLBR and CAP
for 2015–2018 Census Tests and 2020 Census are
complete, they will be allocated to the projects for
decomposition down to the detailed solution and
specification levels. At this point in the process, the
role of the SE&I operation is to provide technical
oversight and monitoring to ensure that solutions
appropriately address the business requirements
and specifications.
Solution Architecture
The SE&I operation develops the 2020 Census
Solution Architecture and the Application and Interface
Inventory. Development of the solution architecture comprises the following:
•• Build upon lessons learned from the 2010
Census, as well as the results and findings of the
2020 Census Research and Testing phase.
•• Review and revise the BPM developed as part of the
requirements engineering effort to create the
Business Architecture.
•• Create the Solution Architecture document,
including the Application and Interface Inventory,
based on the “to be” business processes and
capabilities.
•• Provide technical oversight of the 2020 Census
IT Project Portfolio to ensure conformance to the
prescribed solution architecture.
•• Refine and deliver subsequent baselines of
the 2020 Census Solution Architecture and
Application and Interface Inventory.
•• Mediate gaps in capabilities between solution
providers and operations representatives where
required, and subsequently refine the architecture to
represent the output of mediation.


Solution Development and Technical Integration
During solution development, the requirements,
architecture, and low-level technical design are
used to develop the end-product and required
interfaces. As part of Solution Development and
Technical Integration, the SE&I operation performs
the following activities:
•• Provide support as it relates to interpretation of
PLBR and CAP.
•• Ensure development is completed within the
structure of the solution architecture.
•• Oversee the Solution Development process to
ensure that the overall solution is developed
within cost and schedule constraints in compliance with the Census Bureau’s eSDLC process.
•• Conduct a weekly systems integration meeting
to ensure progress (teams for each system
report status, issues, and risks).
•• Oversee Interface Working Groups to ensure the
systems as developed will function cohesively
when they are exercised in an end-to-end fashion.
•• Work with the CEDCaP and CEDSCI programs to
ensure that they are meeting the 2020 Census
time, budget, and functional requirements.
Test and Evaluation
As part of the Test and Evaluation area, SE&I will perform the following:
•• Oversee tests of programs that are composed of
multiple projects (CEDCaP, CEDSCI, etc.).
•• Oversee tests of individual projects that are not
part of a larger enterprise program or collection
of projects.
•• Conduct Integration and Test activities across
programs and independent projects to ensure
the 2020 Census system, as a whole, performs
as expected. This level of testing could comprise
many different types of tests to include: Cross
Program and Project Integration, Data Quality,
and System Performance.
•• Document measures for acceptance in the Test
and Evaluation Master Plan and document end-to-end system readiness in a Test Report.


Deployment and O&M
The SE&I operation provides oversight and structure around the deployment of systems as well
as O&M activities. As part of the Deployment and
O&M activities, the SE&I operation will perform the
following activities:
•• Provide oversight to ensure that all systems
are deployed and ready to support 2015–2018
Census Tests and 2020 Census activities.
•• Provide oversight to ensure all supporting organizations are deployed and ready to support all
operational activities.
Work Completed
The following work has been completed for this
operation:
•• Business process models and business and
capability requirements are baselined for all
applicable business operations (does not include
certain support operations, such as Program
Management and Security and IT Infrastructure).
•• Solutions for the 2015 Optimizing Self-Response
Test, 2015 Census Test, and the 2015 National
Content Test were delivered.
•• The solution architecture for the 2016 Census
Test is baselined.
•• The eSDLC Phase Gate Review process is
being used.
Decisions Made
The following decisions have been made for this
operation:
✓ Key IT Directorate roles, such as the 2020
Census Chief Architect, Chief Systems Engineer,
and the Chief IT Security Engineer, will be
funded by and matrixed to the 2020 Census
Program.
✓ The 2020 Census Program will leverage the
enterprise infrastructure and enterprise solutions as appropriate.
Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
What tools and test materials are required to
support the integrated tests (Performance Test
Services, Representative Test Data, etc.)?
•• Approach: Look at the functional and nonfunctional capabilities, examine the solution
architectures, and perform an analysis of alternatives. Align the decennial architecture with the
enterprise architecture as appropriate.
•• Decision by: September 2016
What is the sourcing approach for each capability
supporting the 2020 Census?
•• Approach: Conduct an analysis of alternatives
for each capability.
•• Decision by: June 2016
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:
Given the complexity of the 2020 Census, SE&I
activities are critical to a successful census. Because
so many of the innovations aimed at reducing
the cost of the census rely on information technology solutions, the effectiveness of this
operation could impact the cost of the census as
compared with the 2010 Census.
Quality impacts of this operation on overall 2020
Census quality include the following:
↑ Increase quality by setting up robust processes
for system development.
Risks
The risks listed below are specific to this operation.
Testing of the systems supporting the 2020 Census
requires adequate resources (i.e., staffing, budget,
and documentation) in order to be properly conducted. IF there are insufficient resources to support
the integrated test efforts, THEN system defects
may not be identified and fixed in time for 2020
Census production.
The systems supporting the 2020 Census need to
be scalable enough to adjust to unexpected peaks
in the workload. IF system scalability is not tested
and validated, THEN systems may not function as
required or meet the performance requirements
needed to support the volumes expected for the
2020 Census.
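The scalability risk above is, at bottom, a capacity-planning inequality: provisioned capacity must cover the expected peak multiplied by a surge factor, plus headroom. A minimal sketch of that check, with all figures and parameter names hypothetical rather than drawn from the plan:

```python
# Hypothetical scalability check (illustrative only): does provisioned
# capacity absorb an unexpected surge above the expected peak?

def required_capacity(expected_peak: float, surge_factor: float,
                      headroom: float = 0.25) -> float:
    """Capacity needed for a surge above the expected peak, with headroom."""
    return expected_peak * surge_factor * (1 + headroom)

def is_scalable(provisioned: float, expected_peak: float,
                surge_factor: float) -> bool:
    """True if provisioned capacity meets the surge requirement."""
    return provisioned >= required_capacity(expected_peak, surge_factor)

if __name__ == "__main__":
    # 150,000 expected concurrent sessions, validated against a 2x surge.
    print(is_scalable(provisioned=400_000, expected_peak=150_000,
                      surge_factor=2.0))
```

Load and performance testing then validates that systems actually sustain the surged figure, not merely the expected peak.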


Milestones
Date	Activity
2012	Baseline the initial 2020 Census Systems Engineering and Integration Plans for each component described in this section.
2013	Create Architecture and requirements artifacts for the 2014 Census Tests.
2014	Initial Baseline Project-Level Business Requirements and Capability Requirements (to be updated as design matures).
2015	Establish Baseline 1 of Solution Architecture.
	Establish Baseline 1 of PLBR and CAP, which includes requirements for the 2016 Census Test.
	Determine the approach for conducting integrated tests for the 2016, 2017, and 2018 Census Tests (Design Decision 1).
	Determine tools and test materials required to support the integrated tests (Performance Test Services, Representative Test Data, etc.) (Design Decision 2).
2016	Conduct Integrated Test for 2016.
	Establish Baseline 2 of Solution Architecture.
	Establish Baseline 2 of PLBR and CAP, which includes requirements for the 2017 Census Test.
2017	Conduct Integrated Test for 2017.
	Establish Baseline 3 of Solution Architecture.
	Establish Baseline 3 of PLBR and CAP, which includes requirements for the 2018 Census End-to-End Test.
2018	Conduct Integrated Test for 2018.
	Establish Baseline 4 of Solution Architecture.
	Establish Baseline 4 of PLBR and CAP, which includes Lessons Learned from the 2018 Census End-to-End Test.
2019	Deploy production systems.
	Conduct Final Performance Testing.
2020	Develop final, as-built, and Operated Solution Architecture.
Annually	Refresh and reissue strategic program documentation and the 2020 Census Operational Plan based on lessons learned, test results, and other feedback.

5.3.2	Security, Privacy, and Confidentiality
Detailed Planning Status:	Underway
Purpose
The Security, Privacy, and Confidentiality operation
ensures that all operations and systems used in the
2020 Census adhere to the following policies and
regulations:
•• Appropriate systems and data security.
•• Respondent and employee privacy and
confidentiality.
Lessons Learned
Based on lessons learned from the 2010 Census
and other reviews, the following recommendations
were made:
•• Ensure IT systems and applications supporting
the 2020 Census have the proper security authorization prior to start of operations.
•• Ensure all 2020 Census accepted IT security
risks are in alignment with the Census Bureau’s
security program policies.
•• Ensure all of the 2020 Census IT system security
risks are monitored by the 2020 Census Risk
Review Board, as well as an Information System
Security Officer and the Office of Information
Security.
•• Embed an Office of Information Security security
engineer in the 2020 Census Program to ensure
compliance with the IT security program and
integration with the Census Bureau’s Enterprise
environments.
•• Ensure all employees supporting IT security are
certified in accordance with the Census Bureau’s
IT security program.


Opportunities to Innovate
Opportunities to innovate include the following:
•• Implement an IT Security Program Risk
Management Framework in accordance with
National Institute of Standards and Technology
guidelines.
•• Hire a 2020 Census Chief IT Security Engineer to
support application development, mobile computing, and enterprise systems.
•• Increase staff in the Census Bureau Office of
Information Security to provide penetration testing services and more extensive scanning for
vulnerabilities and configuration management.
Description of Operation
The Security, Privacy, and Confidentiality operation
ensures that all operations and systems used in the
2020 Census adhere to the appropriate systems
and data security, respondent and employee privacy and confidentiality policies, and regulations.
Specific requirements are outlined below.
Security
Ensure compliance with the following laws and
Census Bureau policies:
•• IT Security Program Policy: Ensure all 2020
Census systems meet federal, Department
of Commerce, and Census Bureau IT security
policy requirements as identified in the Census
Bureau IT Security Program Policy and relevant
National Institute of Standards and Technology
documentation.
•• Data Stewardship Policies: Ensure that the 2020
Census complies with the Census Bureau’s Data
Stewardship (DS) policies, including:
ºº Control of Personally Identifiable Information
(DS-007).
ºº Respondent Identification (DS-016).
ºº Privacy Impact Assessments (DS-019).
ºº Data Breaches (DS-022).
ºº Record Linkage (DS-014).
•• Ensure that the 2020 Census only collects information necessary for complying with the 2020
Census mission and legal requirements.
•• Ensure all 2020 Census systems have an
Authority to Operate.
•• Ensure each system has a designated
Information System Security Officer.
•• Ensure all 2020 Census Program systems are
covered by the Risk Management Framework,
which includes processes to ensure systems
undergo a security review prior to testing and
a full security assessment prior to obtaining an
Authority to Operate.
•• Ensure Appropriate Suitability Screening
Processes are in place.
Privacy and Confidentiality
•• Align all Privacy Impact Assessments and Privacy
Threshold Assessments to the System Security
Plans.
•• Ensure decennial Privacy Impact Assessments
and Privacy Threshold Analyses are current.
•• Ensure that each system of record has an appropriate System of Record Notice published in the
Federal Register.
•• Establish a System of Record Notice for Bring
Your Own Device (BYOD) and Device as a Service
technology to be used in the 2020 Census.
•• Align the Privacy Impact Assessments and
Privacy Threshold Assessments to security plans
as part of the accreditation process; work with
training operations to ensure 2020 Census
managers and staff are prepared to notify
respondents about the purpose and planned
statistical uses of the information collected.
•• Ensure that all people handling or reviewing
Title 13 and Title 26 materials are Special Sworn
Status certified.
•• Ensure the Personally Identifiable Information
Incident Handling process is operational.
Work Completed
The following work has been completed for this
operation:
Encryption
•• Researched securely managing data on mobile
devices using a Mobile Application Manager (MAM)
software solution.
Cloud Technology
•• Adopted the “Cloud First” strategy.

•• Examined the requirements of the applications
and underlying infrastructure from a security
compliance perspective.
•• Examined the requirements for hybrid cloud
capabilities to allow flexibility in leveraging
cloud technology to meet future program
requirements.
•• Enabled the deployment of cloud-based services.
BYOD Technology
•• Obtained a waiver to allow sensitive personal
data to be collected and stored on personally
owned devices to be used in the 2014, 2015,
and 2016 Census Tests.
•• Established a BYOD Acceptable Use Policy for
2020 Census testing purposes.
•• Implemented a MAM solution for securing data
residing on personally owned devices.
•• Granted authorization to test applications and
technologies prior to a full authorization to
operate.
Decisions Made
The following decision has been made for this
operation:
✓ The 2020 Census will access Title 13 and Title
26 data, including administrative records and
third-party data, remotely using the Virtual
Desktop Infrastructure.
Issues to Be Resolved
Additional work is required to make a decision on
the following question:
Will a MAM solution be used in lieu of Mobile
Device Management to support mobile data
collection?
•• Approach: Researched during the 2015 Census
Test.
•• Decision by: March 2016
Cost and Quality
The investment in Security, Privacy, and
Confidentiality will have minimal⁴ impacts on the
cost and quality of the 2020 Census as compared
with the 2010 Census.
Risks
The risk listed below is specific to this operation.
In accordance with the Census Bureau’s security
policy, all IT systems must undergo an independent
security assessment and acquire the authorization
to operate prior to operating in the production
environment. In addition, all systems must meet
the Census Bureau’s Risk Management Framework
continuous monitoring requirements. IF an IT
system supporting the 2020 Census encounters an
unexpected configuration change that affects the
system’s security posture, THEN additional security
assessments are required, which may result in an
increase in security support costs, an increase in the
system security risk rating, and schedule delays.

⁴ Minimal impact means that this operation does not directly
impact the life-cycle cost of the 2020 Census (as compared
with the 2010 Census) by more than $100 million, based
on the current life-cycle cost estimate.


Milestones
Date	Security Activity
April 2015	Monitor security of systems used in the 2015 Census Test.
January 2016	Conduct security reviews and assessments on system releases for the 2016 Census Test.
September 2016	Release the Security, Privacy, and Confidentiality Detailed Operational Plan.
October 2016	Conduct security reviews and assessments on system releases for the 2017 Census Test.
October 2017	Conduct security reviews and assessments on system releases for the 2018 Census End-to-End Test.
October 2018	Conduct security reviews and assessments on system releases for the defect resolution testing and post end-to-end performance testing in 2019.

5.3.3	Content and Forms Design
Detailed Planning Status:	Underway
Purpose
The Content and Forms Design operation performs
the following activities:
•• Identify, research, and finalize content and
design of questionnaires and other nonquestionnaire materials.
•• Ensure consistency across data collection modes
and operations, including (but not limited to)
questionnaire content, help text, mailing materials, and field materials.
•• Provide the optimal design and content of the
questionnaires to encourage high response rates.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Develop an enterprise repository that includes
questionnaire content and design elements for
questionnaires and nonquestionnaire materials.
•• Ensure sufficient time for testing the questionnaire content. Also include testing of nonquestionnaire materials.
•• Consider forms design elements (size, color,
spacing implications, etc.), mode, and language
when finalizing questionnaire content and
design. Also test for successful data capture
before implementation.

•• Conduct comprehensive testing of optimized
content in the usability lab and in a field test to
prevent unanticipated negative impacts on data
quality.
•• Determine if a bilingual initial or replacement
questionnaire in selected bilingual tracts is
beneficial.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Create a central, electronic repository of 2020
Census content (questionnaire and nonquestionnaire materials).
•• Create consistent content for automated data
collection instruments needed for Self-Response
and Nonresponse Followup.
•• Redesign the bilingual paper questionnaires
from swim lane to flip-style.
•• Create questionnaires and nonquestionnaire
materials in languages beyond English and
Spanish.
Description of Operation
Content and Forms Design is responsible for creating, refining, and finalizing instrument specifications for all data collection modes: Internet, paper,
Census Questionnaire Assistance (CQA, the telephone channel),
and NRFU (in-person interview). This is a significant
departure from the 2010 Census, which relied
primarily on paper for data collection.
The goal is to finalize the content of the questionnaire and other mailing and field materials for the
2020 Census so that the 2020 Census topics can be
submitted to Congress by April 2017, with the final
questionnaire wording submitted by April 2018.
To meet important deadlines, key elements of the
Content and Forms Design operation include the
following:
•• Developing instrument specifications for all data
collection modes: Internet, NRFU, CQA, and Paper.
•• Pretesting questionnaire content (e.g., cognitive
testing, focus groups) prior to making final decisions on questionnaire topics and wording.


•• Finalizing content development and design of
questionnaires across all modes: Internet, CQA,
Paper, and NRFU.
•• Finalizing content development and design of
nonquestionnaire materials deployed during
self-response and NRFU operations, including postcards, letters, field materials, and
envelopes.
•• Developing print specifications for all questionnaire and nonquestionnaire materials, including
postcards, letters, and envelopes.
•• Optimizing questionnaire designs for each mode
and all supporting materials, in alignment with
systems specifications.
•• Ensuring questionnaire content and supporting
materials are accurate, appropriate, consistent,
inviting, and easy to understand across self and
nonresponse data collection modes.
Research Completed
The following research has been completed for this
operation:
•• Qualitative Research on Content:
ºº Conducted qualitative research on alternative questionnaire wording for the following
topics: Race and Hispanic origin, Relationship,
Within-Household coverage.
•• Findings: Informed questionnaire wording
(for content variations) being tested in
the 2015 National Content Test and other
Research and Testing Phase testing.
ºº Conducted expert review of paper questionnaire design and inclusion of write-in fields
for all race categories.
•• Findings: Informed layout of paper questionnaire design for the 2015 National
Content Test.
•• Usability and Systems Testing:
ºº Conducted usability testing of automated
data collection instruments (Internet, NRFU).
•• Findings: Informed final instrument layout and navigation for 2014, 2015, and
2016 Census Tests and the 2015 National
Content Test.
ºº Conducted testing on data capture of paper
questionnaire responses.


•• Findings: Informed paper questionnaire
layout for the 2014, 2015, and 2016
Census Tests and the 2015 National
Content Test.
ºº Conducted 2014 Census Test (relationship
response categories).
•• Findings: Continue testing new relationship
response categories.
ºº Conducted 2015 Census Tests (content and
questionnaire design).
•• Findings: Coverage questions added to
respondent burden (based on observations
of field operations and respondents’ reactions to questionnaire content).
•• The 2015 National Content Test (content and
questionnaire design).
ºº Finalized content to be tested during the
2015 National Content Test.
ºº Test panels and analysis plans for 2015
National Content Test.
ºº Internet data collection instrument specifications for the 2015 National Content Test.
ºº English and Spanish bilingual paper questionnaires (10 versions: 8 for stateside, 2 for
Puerto Rico).
ºº Computer Assisted Telephone Interview
instrument specifications for the 2015
National Content Test Race and Coverage
Reinterview.
ºº Implementation of the 2015 National
Content Test.
Decisions Made
The following decisions have been made for this
operation:
✓ Flip-style bilingual paper questionnaires will be
used for household enumeration.
✓ Coverage questions will be streamlined to
reduce respondent burden while maintaining
data quality (based on 2014 and 2015 Census
Test field observations).
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:


What are optimal designs of questionnaires (including size and page layout) and nonquestionnaire
materials for the 2020 Census?
•• Approach: Based on results of field tests, other
ongoing research, and input from advisory
committees.
•• Decision by: Initial October 2017; Final August
2018
What are the final content topics for the 2020
Census?
•• Approach: Based on results of the 2015
National Content Test, other ongoing research,
and input from advisory committees.
•• Decision by: December 2016
•• Delivered to Congress by: April 2017
What is the final questionnaire wording for the
2020 Census?
•• Approach: Based on results of the 2015
National Content Test, other ongoing research,
and input from advisory committees.
•• Decision by: December 2017
•• Delivered to Congress by: April 2018
What is the paper questionnaire layout for respondents living in residences other than households
(e.g., group quarters and transitory locations)?
•• Approach: Coordinate with the operations and
gather the content; test in the 2016 and 2017
Census Tests.
•• Decision by: September 2017
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:
Investment in Content and Forms Design will have
minimal impact on the cost of the 2020 Census, as
compared with the 2010 Census.
Quality impacts of this operation on overall 2020
Census quality include the following:
↑ Internet questionnaire design is anticipated to
improve the quality of self-response.
↑ Automated NRFU instrument is anticipated to
improve the quality of response (under review).
Risks
The risk listed below is specific to this operation.
Changes in the content of the 2020 Census questionnaire may be requested after the content has
been finalized in 2017. IF changes are approved
for the final 2020 Census questionnaire content in
2017 or later, THEN the English and non-English
material will need to be redesigned and reprinted,
requiring additional time in the schedule and
potentially delaying deliverables.


Milestones
Date	Activity
May 2015	Complete cognitive testing of paper questionnaire content for the 2015 National Content Test (English, Spanish).
	Complete cognitive testing of paper questionnaire content and nonquestionnaire materials in multiple languages.
August 2015	Complete cognitive testing of Internet questionnaire content for the 2015 National Content Test for English and Spanish.
	Start conducting the 2015 National Content Test.
October 2015	Complete the 2015 National Content Test (data collection).
	Final questionnaire content for the 2016 Census Test: Race, Relationship, Coverage.
	Baselined instrument specifications for the 2016 Census Test.
February 2016	Complete cognitive and usability testing of Chinese and Korean Internet and NRFU instruments and nonquestionnaire materials.
June 2016	Receive analysis of 2015 National Content Test results.
	Cognitive testing of possible additional topics (e.g., tribal enrollment).
August 2016	Receive results from cognitive test of possible additional topics (e.g., tribal enrollment).
September 2016	Release the Content and Forms Design Detailed Operational Plan.
October 2016	Analysis of the 2016 Census Test results.
	Finalize questionnaire content for the 2017 Census Test.
	Baselined instrument specifications for the 2017 Census Test.
April 2017	Submit 2020 Census topics to Congress.
October 2017	Finalize questionnaire content for the 2018 Census End-to-End Test.
	Baselined instrument specifications for the 2018 Census End-to-End Test.
April 2018	Submit 2020 Census question wording to Congress.
October 2018	Analysis of the 2017 Census Test results.
May 2019	Finalize 2020 Census paper questionnaires for print.
	Finalize 2020 Census questionnaire design and layout across all modes.
March 2020	Deploy 2020 Census questionnaires across all modes.

5.3.4	Language Services
Detailed Planning Status:	Underway
Purpose
The Language Services operation performs the
following activities:
•• Assess and support language needs of non-English-speaking populations.
•• Determine the number of non-English languages and
level of support for the 2020 Census.
•• Optimize non-English content of questionnaires
and nonquestionnaire materials across data collection modes and operations.
•• Ensure cultural relevancy and meaningful translation of questionnaires and nonquestionnaire
materials.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Conduct further research on language selection
criteria.
•• Conduct cognitive testing earlier in the decade
to allow for high-quality translation of questionnaires and nonquestionnaire materials.
•• Optimize non-English materials to ensure cultural relevance for intended audiences.
•• Allow Internet responses in English and other
languages.
•• Test a Spanish version of the questionnaire on
the Internet.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Automated data collection instruments available
for Internet Self-Response and NRFU in multiple
non-English languages.

Description of Operation
The Language Services operation supports the goal
of an accurate and cost-effective census by creating
awareness and facilitating participation of respondents with Limited English Proficiency. It also identifies ways to reduce barriers to enumeration of
this hard-to-count population. The use of multiple
languages is an important part of creating a census
climate that facilitates goodwill and cooperation
among census partners and the public at large,
thereby increasing self-response, saving money,
and increasing quality.
To achieve the goal of providing multiple modes
of collecting information from non-English-speaking
respondents, the Language Services operation
conducts research on language needs and trends
and relies on sociolinguistic and psycholinguistic
approaches to provide language operations and
assistance and to identify, create, and refine
non-English materials for Limited English Proficiency
respondents. The operation also includes a National
Advisory Committee Language Working Group, in
which National Advisory Committee members and
subject matter experts jointly strategize on language
operations for the 2020 Census.
With language testing planned for 2016 and 2017,
this operation identifies ways to encourage completion of the questionnaire online in multiple
non-English languages. In addition, it provides
accessible, alternative means of response for those
without access to the Internet.
Specific activities include the following:
•• Optimizing the census questionnaire for
each mode as appropriate for Limited English
Proficiency populations.
•• Ensuring culturally and functionally appropriate
questionnaire design and content across translations (e.g., through pretesting).
•• Enabling language research by maximizing the
ability for data collection systems to incorporate
non-English languages.
•• Analyzing ACS data to determine whether
trends in language need have changed.
•• Optimizing mailing strategies to: (1)
ensure non-English speakers receive the same
message as English speakers prior to going
online; (2) determine whether non-English
speakers respond differently to the number and
ordering of contacts than English speakers; and
(3) determine whether or not adding multilanguage Public Use Forms increases participation by non-English speakers.


•• Conducting usability testing of how questionnaires can be best adapted for use in multiple
modes in non-English languages and the types
of challenges that occur when adapting translated questionnaires to new modes.
•• Determining alternative response methods for
the visually impaired.
•• Expanding previously used tools, such as the
Language Reference Dictionary, and providing
them earlier in the decade so that partnership
and regional staff accurately reflect census
terminology.
•• Determining the number of non-English languages and level of support during the 2020
Census.
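The decision on the number of languages and level of support is, at its core, a ranking-and-threshold exercise over ACS language-need estimates. The sketch below is purely illustrative: the population counts and the cutoff value are hypothetical placeholders, not Census Bureau criteria or actual ACS figures.

```python
# Illustrative sketch: rank candidate languages by estimated Limited English
# Proficiency (LEP) population and keep those meeting a support threshold.
# The counts and the 60,000-person cutoff below are hypothetical, not official.

def select_supported_languages(lep_estimates, threshold):
    """Return languages whose estimated LEP population meets the threshold,
    ordered from largest to smallest estimate."""
    ranked = sorted(lep_estimates.items(), key=lambda kv: kv[1], reverse=True)
    return [lang for lang, count in ranked if count >= threshold]

# Hypothetical ACS-style tabulation of LEP speakers by language.
estimates = {
    "Spanish": 16_300_000,
    "Chinese": 1_700_000,
    "Vietnamese": 850_000,
    "Korean": 620_000,
    "Russian": 410_000,
    "Arabic": 290_000,
    "Polish": 55_000,
}

supported = select_supported_languages(estimates, threshold=60_000)
print(supported)
```

In practice the threshold would be only one input; advisory committee guidance and the infrastructure and IT requirements noted below also shape the final language set.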
Research Completed
The following research has been completed for this
operation:
•• Qualitative Research on Non-English Content:
ºº Tested for accuracy and cultural appropriateness of translated questionnaire content for
the following languages: Spanish, Chinese,
Korean, Vietnamese, Russian, Arabic.
•• Findings: Informed questionnaire wording
for 2015 National Content Test and other
mid-decade testing.
•• In-House Review of Materials:
ºº Conducted expert review of field materials in
non-English languages.
•• Findings: Informed translated content of
Notice of Visit for the 2015 Census Test;
Revised Language Identification Flashcard
to include Chinese-spoken dialects.
•• Language Needs Assessment:
ºº Assessed current language needs using
ACS data.
•• Findings: Informed non-English support
for 2015 and 2016 Census Tests and 2015
National Content Test.
•• Research on Translation Technology:
ºº Conducted research on translation machines
(e.g., Google Translate).
•• Findings: Machine translations generally
show severe structural, grammatical, and
contextual errors and should not replace
human translations.

U.S. Census Bureau

•• Usability and Systems Testing:
ºº Conducted usability testing of Spanish automated data collection instruments (Internet,
NRFU).
•• Findings: Informed final instrument layout
and navigation for the 2014, 2015, and
2016 Census Tests and the 2015 National
Content Test.
ºº Conducted testing on data capture of Spanish
paper questionnaire responses.
•• Findings: Informed paper questionnaire
layout for the 2014, 2015, and 2016
Census Tests and the 2015 National
Content Test.
Decisions Made
The following decisions have been made for this
operation:
99 Flip-style bilingual paper questionnaires will be
used instead of the swim lane style.
99 The Language Services Operation will utilize a
National Advisory Committee Language Working
Group for early engagement on language assistance plans for the 2020 Census.
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
What number of non-English languages and what level of support are needed for the 2020 Census?
•• Approach: Based on an assessment of language
needs and input from advisory committees. Also
based on the determination of infrastructure and
IT requirements to provide language support.
Results of the 2016 and 2017 Census Tests will
inform this decision.

•• Decision by: September 2017
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:
Investment in Language Services is expected to
have minimal cost impacts on the 2020 Census, as
compared with the 2010 Census.
Quality impacts of this operation on overall 2020
Census quality include the following:
ÏÏ Automated data collection instruments in
non-English languages anticipated to improve
quality of responses from non-English speaking
respondents.
ÏÏ Culturally appropriate, translated questionnaires
and nonquestionnaire materials anticipated to
improve quality of responses of non-English
speaking respondents.
Risks
The Internet data collection instrument used for the
census tests is currently only available for use by
English and Spanish speakers. IF the Internet data
collection instrument is not designed for languages beyond English and Spanish, THEN there will
be no online self-response option for respondents who speak languages other than English or
Spanish in the 2020 Census.
Any content changes made after the English questionnaire for the 2020 Census is finalized will have
to be replicated for the non-English questionnaires.
IF the final English content changes after April
2018, THEN there will not be adequate time in the
schedule to translate, design, and produce non-English questionnaires for the 2020 Census.



Milestones
March 2016: Deploy Internet and NRFU instruments in Spanish, Chinese, and Korean for the
2016 Census Test. Deploy bilingual paper questionnaires and associated nonquestionnaire
materials in Spanish, Chinese, and Korean for the 2016 Census Test.
September 2016: Release the Language Services Detailed Operational Plan.
March 2017: Deploy Internet and NRFU instruments in Spanish, Chinese, Korean, Vietnamese,
and additional non-English language(s) (to be determined) for the 2017 Census Test. Deploy
bilingual paper questionnaires and associated nonquestionnaire materials in Spanish, Chinese,
Korean, Vietnamese, and additional non-English language(s) (to be determined) for the 2017
Census Test.
September 2017: Determine the number of non-English languages and level of support for the
2020 Census.
2016–2019 (ongoing): Conduct qualitative research on data collection instruments and
materials in additional languages.
March 2020: Deploy 2020 Census non-English data collection instruments and materials.

5.4 FRAME
The operations in this area have the goal of developing a high-quality geospatial frame that serves
as the universe for the enumeration activities.
This area consists of three operations: Geographic
Programs, Local Update of Census Addresses
(LUCA), and Address Canvassing. Each is described
below.

5.4.1	 Geographic Programs
Detailed Planning Status: Underway

Purpose
The Geographic Programs operation provides the
geographic foundation in support of the 2020
Census data collection and tabulation activities
within the Master Address File/Topologically
Integrated Geographic Encoding and Referencing
(MAF/TIGER) System. The MAF/TIGER System
(software applications and databases) serves as
the national repository for all of the spatial, geographic, and residential address data needed for
census and survey data collection, data tabulation, data dissemination, geocoding services, and
map production.


Components of this operation include:
•• Geographic Delineations.
•• Geographic Partnership Programs.
•• Geographic Data Processing.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Consider consolidating field operations and the Type of Enumeration Area (TEA) values used to
support field operations.
•• To the greatest extent possible, conduct geographic reconciliation of boundaries on
an ongoing basis throughout the decade.
•• To the greatest extent possible, geographic
extracts and updates should be made in an
electronic form to reduce the production, shipping, and handling of paper maps and paper
listings by the Census Bureau and its program
participants.
•• Update the MAF through partnership programs
in order to increase the Census Bureau’s ability
to geocode addresses from the USPS Delivery
Sequence File (DSF).
Opportunities to Innovate
Opportunities to innovate include the following:
•• Use of varied data sources (e.g., imagery and
third-party data) to validate and augment the
MAF/TIGER System throughout the decade:
ºº As part of the Geographic Support System
Initiative (GSS-I) the Census Bureau has
obtained address and road center-line data
from state and local partnerships and has
updated the MAF/TIGER System with these
data since 2013.
ºº Ongoing investigation of potential use of
third-party data sources.
•• Development of a modular, multimode,
Geographic Update Partnership Software (GUPS)
to streamline partners’ participation.


•• Delineation of Basic Collection Units to:
ºº Eliminate operation-specific Assignment Area
delineations.
ºº Incorporate data and information not previously used in delineation such as predominant housing unit characteristics (e.g., single
unit, group quarters, mobile homes).
Description of Operation 
The Geographic Programs operation includes components of the 2020 Census that are geographic
in nature. The components of the Geographic
Programs project fall into three general categories
as shown in Figure 30:
•• Geographic Delineations.
•• Geographic Partnership Programs.
•• Geographic Data Processing.
Geographic Delineations
The Geographic Delineation component of the
Geographic Programs determines, delineates,
and updates the geographic area boundaries for
2020 Census data collection and data tabulation.
Census data collection relies on the delineation
of various geographic areas, known as “collection
geography,” to support the capture of data during
Census activities. This includes both the delineation of the methods used to enumerate households
and the definition of field management areas. The
following collection geography is delineated during
the 2020 Census:
•• Type of Enumeration Area: In an effort to
ensure the most cost-effective and efficient process to enumerate households, every block in
the United States is assigned to one specific type
of enumeration area, or TEA. The TEA reflects the
methodology used to enumerate the households
within the block. The TEA assignment utilizes a
variety of information to identify the most cost-effective
enumeration approach for all of the
United States, the District of Columbia, Puerto Rico,
and the Island Areas.
•• Basic Collection Unit (BCU): BCU serves as the
smallest unit of collection geography for all 2020
Census listing operations. The BCU replaces both
the collection block and assignment area geographies used for the 2010 Census.
•• Special Land Use Area: A key component of
collection geography is the delineation of land
areas that may require unique field treatment
or tabulation. This includes military areas,
group quarter areas (e.g., correctional facilities and colleges and universities), and public
lands. The main purpose of the special land
use delineation is to improve tabulation block
boundaries, to allow field operations to manage
special land use areas in the field effectively,
to assist in maintaining the GQ address list,
to allow for public lands to be removed from
In-Field Address Canvassing (see Section 5.5.3)
and other field operations, and to maintain
relationships between these areas and other
geographic entities such as incorporated places
and American Indian Areas.

Figure 30: Summary of Geographic Programs Components (Geographic Delineations;
Geographic Partnership Programs; Geographic Data Processing)
•• Field Management Area Delineation: This
component of collection geography includes
delineation of geographic areas, other than BCU
and TEA, which are necessary to manage and
accomplish fieldwork for the 2020 Census. In
past censuses this has included Crew Leader
Districts, Field Operation Supervisor Districts,
and Area Census Office boundaries.
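The block-level TEA assignment described above is essentially a classification: every block receives exactly one enumeration methodology based on its characteristics. The sketch below illustrates the mechanics only; the rule set, attribute names, and TEA labels are simplified placeholders, not the Census Bureau's actual delineation criteria.

```python
# Illustrative sketch of assigning each block exactly one Type of Enumeration
# Area (TEA). The rules and attribute names are hypothetical placeholders;
# real TEA criteria are defined by the Census Bureau.

def assign_tea(block):
    """Map a block's attributes to a single TEA code (first matching rule wins)."""
    if block.get("remote_alaska"):
        return "Remote Alaska"
    if not block.get("city_style_addresses"):
        return "Update Enumerate"   # addresses unsuitable for mail contact
    return "Self-Response"          # default: mail contact, self-response

blocks = [
    {"id": "B1", "city_style_addresses": True},
    {"id": "B2", "city_style_addresses": False},
    {"id": "B3", "remote_alaska": True},
]

teas = {b["id"]: assign_tea(b) for b in blocks}
print(teas)
```

The ordering of the rules matters: more specialized treatments are checked before the default, mirroring the idea that the TEA identifies the most cost-effective approach for each block.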
Census results are dependent on the delineation
of various geographic areas to both tabulate and
report person and household statistics. The delineation of these geographic areas, known as "tabulation geography," is based on input from partnership
programs (such as the Participant Statistical Areas
Program/Tribal Statistical Areas Program [PSAP/
TSAP] program), or internally defined tabulation
criteria, such as the Urbanized Area delineation.
After rules are defined or tabulation geographies
are proposed by partners, the tabulation geography is delineated in the MAF/TIGER System through
a series of batch and interactive delineations, followed by a series of data integrity validations, renumbering, and certification steps. Once
the tabulation geographic areas are certified, they
are loaded into the MAF/TIGER database and used
for the tabulation of statistical data and as the base
for various geographic data products that support
the 2020 Census. Tabulation geography planned
for the 2020 Census includes:
•• American Indian Areas
•• Metropolitan and Micropolitan Statistical Areas
and Related Statistical Areas
•• Counties
•• County Subdivisions
•• Census Designated Places
•• Census Tracts
•• Block Groups
•• Blocks
•• Congressional Districts
•• State Legislative Districts
•• Voting Districts
•• School Districts

•• Traffic Analysis Zones
•• Zone Improvement Plan Code Tabulation Areas
•• Urban Areas
These geographies are used to tabulate and
disseminate data from the Decennial Census, the
ACS, and other censuses and surveys, and are used
outside of the Census Bureau by other government
agencies in program administration and in determining program eligibility and fund allocation.
Geographic Partnership Programs
Prior to the 2020 Census, the Census Bureau
conducts geographic partnership programs to
make the address list as up-to-date as possible
and ensure complete coverage of all housing units.
The Partnership Programs also help define statistical geographic area boundaries that will provide
meaningful data from the 2020 Census. Following
are the 2020 Census Geographic Partnership
Programs:5
•• Boundary and Annexation Survey: An ongoing survey for collecting and maintaining an inventory of the legal boundaries of, and the legal actions affecting the boundaries of, counties and equivalent governments,
incorporated places, Minor Civil Divisions,
Consolidated Cities, Urban Growth Areas,
Census Areas of Alaska, Hawaiian Homelands,
and federally recognized legal American Indian
and Alaska Native areas (including the Alaska
Native Regional Corporations). This information
provides an accurate identification and depiction
of geographic areas for the Census Bureau to
use in conducting the decennial and economic
censuses and ongoing surveys such as the ACS.
•• Participant Statistical Areas Program/
Tribal Statistical Areas Program: Programs
that allow designated participants, following
Census Bureau guidelines, to review and suggest
modifications to the boundaries of block groups,
census tracts, Census County Divisions, and
Census Designated Places. Participants can also
propose new Census Designated Places based
on specific criteria. The 2020 Census PSAP
includes all tribal statistical boundaries, which
were administered through the TSAP in the 2010
Census, combining the two programs. The TSAP
5
Components of the Redistricting Data Program and the
Local Update of Census Addresses are also Geographic Program
Partnership Programs, but they are covered in other sections of
this document.


geographies are Oklahoma Tribal Statistical
Areas, Tribal Designated Statistical Areas, State
Designated Tribal Statistical Areas, tribal census
tracts, tribal block groups, statistical tribal subdivisions, Alaska Native Village Statistical Areas,
and for administrative purposes, one legal area,
state reservations.
•• Boundary Validation Program: The intent of
the Boundary Validation Program is to provide
the Highest Elected Official a last opportunity
to review the entity boundary, and any address
range breaks where the boundary of their jurisdiction intersects a road, before the tabulation
of census data.
•• Public Use Microdata Areas: Geographic units
used for providing statistical and demographic
information. Public Use Microdata Areas do not
overlap, and are contained within a single state.
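The two constraints stated for Public Use Microdata Areas (no overlap, single-state containment) can be checked mechanically. The sketch below assumes a deliberately simplified representation in which each PUMA is a set of "state:tract" identifier strings; this data shape is an illustration, not the actual MAF/TIGER model.

```python
# Illustrative check of the two PUMA constraints: PUMAs do not overlap, and
# each PUMA is contained within a single state. Tract IDs here are simplified
# "state:tract" strings; the real geographic model is far richer.

def validate_pumas(pumas):
    """pumas: mapping of PUMA id -> set of 'state:tract' identifiers."""
    seen = {}  # tract id -> PUMA that claimed it
    for puma_id, tracts in pumas.items():
        states = {t.split(":")[0] for t in tracts}
        if len(states) != 1:
            return False, f"{puma_id} spans multiple states"
        for t in tracts:
            if t in seen:
                return False, f"{t} is in both {seen[t]} and {puma_id}"
            seen[t] = puma_id
    return True, "ok"

ok, msg = validate_pumas({
    "PUMA-0100": {"01:000100", "01:000200"},
    "PUMA-0200": {"01:000300"},
})
print(ok, msg)
```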
Geographic Data Processing
The Geographic Data Processing component of
Geographic Programs includes all activities that
relate to the extract, update, and maintenance of
the features, boundaries and addresses in the MAF/
TIGER System. Geographic data captured as part
of the 2020 Census, including address updates,
structure coordinate locations, boundaries, and
roads data will be processed to ensure that the
MAF/TIGER System is up to date. Following are the
major geographic data processing activities that
will occur in the 2020 Census:
•• Frame Development includes the receipt and
processing of various address records from
sources such as the USPS, state and local governments, and third-party data sources. These
data help ensure accurate address coverage
within the 2020 Census Frame.
•• MAF/TIGER Extract Support includes activities related to preparing extracts or services
enabling 2020 Census systems access to
addresses from the MAF/TIGER System, as well
as activities related to the production of spatial
extracts or services for use in various field data
collection instruments and control systems and
printing of paper.
•• Geographic Data Processing includes activities related to extracts from, and updates to, the features, boundaries, and addresses within
the MAF/TIGER System. The MAF/TIGER updates
include any changes to the features, addresses,
or boundaries that result from 2020 Census data
collection operations, or geographic partnership programs. The geographic data processing
activities establish benchmarks from the MAF/
TIGER System by taking a snapshot of the database at various points during the decade. Each
benchmark becomes the foundation on which
future updates are applied. These benchmarks
support the collection, tabulation, and dissemination of census and survey information and
the provision of geocoding services and geospatial
data products.
•• Geographic Area Reconciliation Program
includes editing and reconciliation of boundaries
within the MAF/TIGER System. This reconciliation resolves boundary and feature discrepancies provided by separate partnership programs
at different points in time or updates prior to
release of 2020 Census tabulation products.
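The benchmark mechanism described above (snapshot the database at a point in time, then apply later updates on top without disturbing the snapshot) can be sketched as follows. The record layout and update format are invented for illustration and do not reflect the actual MAF/TIGER schema.

```python
# Illustrative sketch of MAF/TIGER-style benchmarking: a benchmark is an
# immutable snapshot of the live data, and subsequent updates are applied
# on top of it without mutating the snapshot. Record layout is hypothetical.

import copy

class GeoDatabase:
    def __init__(self):
        self.records = {}      # live data: record id -> attributes
        self.benchmarks = {}   # benchmark name -> frozen snapshot

    def update(self, rec_id, **attrs):
        self.records.setdefault(rec_id, {}).update(attrs)

    def take_benchmark(self, name):
        # deep copy so later updates cannot mutate the snapshot
        self.benchmarks[name] = copy.deepcopy(self.records)

db = GeoDatabase()
db.update("addr-1", street="Main St", geocoded=False)
db.take_benchmark("2018-01")           # frozen foundation for later work
db.update("addr-1", geocoded=True)     # post-benchmark field update

print(db.benchmarks["2018-01"]["addr-1"]["geocoded"])  # snapshot unchanged
print(db.records["addr-1"]["geocoded"])                # live data updated
```

The deep copy is the key design choice: it makes each benchmark a stable foundation that later collection, tabulation, and dissemination work can rely on, exactly as described above.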
Research Completed
The following research has been completed for this
operation:
•• Research conducted and completed within the
initial phases of the GSS-I program:
ºº Findings: Demonstrated that administrative records from local governments are
a valuable source of address and spatial
information.
•• Research on use of public lands data:
ºº Findings: Demonstrated that public lands
data will be useful in the delineation of 2020
Census TEAs and collection geography.
•• Post Census analysis of 2010 Assignment Area
definitions.
ºº Findings: Helped lay the foundation for
establishing a consistent assignment unit—
the BCU.
Decisions Made
The following decisions have been made for this
operation:
Geographic Delineations:
99 BCUs will be used beginning in the 2016
Address Canvassing Test.
99 Special Land Use Areas and Public Lands
will be used in the delineation of collection
geographies.

99 The Statistical Areas Program (PSAP/TSAP) will be used in the delineation of 2020 Census
tabulation geography.
99 The 2020 Census will include delineation of:
ºº Tabulation geography (Blocks, Block Groups, Tracts, etc.).
ºº Zone Improvement Plan Code Tabulation Areas.
ºº Traffic Analysis Zones.
ºº Urban Areas as defined by the 2020 Census Urban Area Delineation Program.
Geographic Partnership Programs:
99 The geographic programs conducted in the 2010 Census will occur in the 2020 Census (the
approach for adding new construction is yet to be determined).
99 The GUPS will support:
ºº All geographic partnership programs (i.e., Boundary and Annexation Survey [BAS],
PSAP/TSAP, Boundary Validation Program, and Public Use Microdata Areas).
ºº Redistricting Data Program (RDP).
ºº Local Update of Census Addresses.
ºº Count Question Resolution.
99 Partnership programs will offer limited paper materials.
99 Data received from partnership programs will be processed from a central location.
Geographic Data Processing:
99 Enterprise solutions will be used to capture relevant geographic data.
99 Imagery will be available as a backdrop in field listing and field enumeration instruments.
99 The MAF/TIGER System will leverage a Service Oriented Architecture for dissemination
products and tools.
99 The USPS DSF will continue to be used as the primary source of address updates for the
MAF/TIGER System.
99 Frame development will include the receipt and processing of administrative records and
third-party data sources.
99 Boundary reconciliation within the MAF/TIGER System will be ongoing.

Design Issues to Be Resolved
Additional work is required to make decisions on the following questions:
Geographic Delineations
How will the MAF/TIGER System be used in support of reengineered field operations? For
example, what are the data input and output processing and timing requirements and the work
flows needed to support field data collection operations?
•• Approach: Resolve when planning the 2016 Address Canvassing Test and 2017 Census Test.
•• Decision by: October 2017
What types of TEA are required for the 2020 Census?
•• Approach: Resolve using results from the 2016 Address Canvassing Test and 2017 Census Test.
•• Decision by: October 2017
Geographic Partnership Programs
Will there be a separate New Construction Program or will the GSS-I Program continue to collect
new construction addresses for the 2020 Census?
•• Approach: Resolve as part of GSS-I Planning and the Geographic Partnership Program Planning.
•• Decision by: June 2017
Geographic Data Processing
How will the MAF/TIGER System interact with other 2020 Census systems to support 2020
Census operations?
•• Approach: Resolve as part of 2016 Address Canvassing Test and 2017 Census Test Planning.
•• Decision by: January 2016

In what 2020 Census operations will addresses
and features be updated and added? What are the
expectations for the capture and availability of field
updates? Available in real time? Available within the
timeframe of the operations? Available for the next
operation? Available for the final tabulation?
•• Approach: Resolve as part of 2016 Address
Canvassing Test and 2017 Census Test Planning.
•• Decision by: August 2017
What is the source data (TIGER, commercial, or
both) for map displays in the 2020 Census data
collection and field management applications?
•• Approach: Research during the 2016 Address
Canvassing Test and the 2017 Census Test.
•• Decision by: March 2017 (Preliminary);
October 2017 (Final)
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:
Investment in Geographic Programs will have minimal impact on cost to the 2020 Census as compared with the 2010 Census.
Quality impacts of this operation on overall 2020
Census quality include the following:

ÏÏ Address and spatial data in the MAF/TIGER System are updated continuously and are
more current.
ÏÏ Address and spatial data in the MAF/TIGER System are validated using multiple data sources.
ÏÏ Ongoing reconciliation of boundaries across programs, such as the BAS and the Redistricting
Data Program, will result in higher quality tabulation boundaries.
Risks
A timely decision on the final 2020 Census operations will help in keeping the type of TEA delineation on schedule. IF there is a significant delay in
finalizing the 2020 Census operations and requirements, THEN the TEA delineation may be delayed.
Using attribution in Basic Collection Units increases
their benefits and usefulness. IF attribution related
to address coverage risk, optimal contact and enumeration strategy, and production rate and workload cannot be applied to the Basic Collection Unit,
THEN the ability of the Basic Collection Unit to act as
a planning tool and to be dynamically assigned in
the field is limited.
The GUPS contract states there will be a Web-based
and stand-alone version of GUPS. IF a Web-based
version of GUPS is not developed, THEN it will significantly add to the resources required to update
partnership programs for the 2020 Census.


Milestones
Figure 31 below depicts the high-level timing of each component within the Geographic
Programs operation.

Figure 31: Geographic Programs Timeline (2015–2022). Partnership Programs: BAS 17–BAS 21,
PUMA, BVP, PSAP/TSAP, LUCA,* Redistricting (BBSP).* Delineation Programs: BCU and TEA
delineation, delineation of field offices, Redistricting (VTD),* delineation of field management
areas, ZCTA, tabulation block criteria, 2020 tabulation blocks, urban areas. Geographic Data
Processing: GARP, geographic data processing, administrative records frame development.
Support Programs: GUPS, help desk, mapping.
*LUCA and Redistricting are Partnership Programs that are managed outside of the Geographic
Programs Team.

5.4.2 Local Update of Census Addresses
Detailed Planning Status: Underway

Purpose
The Local Update of Census Addresses operation provides an opportunity for tribal, federal,
state, and local governments to review and improve the address lists and maps used to conduct
the 2020 Census. This operation is required by the Census Address List Improvement Act of
1994 (Public Law (P.L.) 103-430).

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following
recommendations were made:
•• Provide program materials (i.e., address lists and maps) in standard, off-the-shelf commercial
software formats.
•• Simplify the process for small (6,000 or fewer housing units), lower-level governments (such
as minor civil divisions and places).
•• Explain the definition and use of addresses and housing units better, so that participants will
understand why post office boxes and rural route numbers are not in scope for the Census
Bureau's LUCA Program.

Opportunities to Innovate
Considering recommendations from the 2010
Census and the 2020 Census Research and Testing
Phase, and the design of a reengineered 2020
Census, opportunities to innovate include the
following:
•• Reduce the complexity of the LUCA Program as
compared with the 2010 program.
•• Eliminate the full address list submission
options that were available in 2010 Census
LUCA in order to:
ºº Reduce the number of deleted LUCA records
during verification activities.
ºº Reduce the burden and cost of processing
addresses and LUCA address validation.
Description of Operation
The LUCA provides the opportunity for tribal, federal, state, and local governments to review and
comment on the Census Bureau’s address list and
maps to ensure an accurate and complete enumeration of their communities. The Census Address
List Improvement Act of 1994 (P.L. 103-430)
authorized the Census Bureau to provide individual addresses to designated local officials of tribal,
federal, state, and local governments who agreed
to conditions of confidentiality in order to review
and comment on the Census Bureau’s address list
and maps prior to the decennial census. The basic
process for LUCA includes:

•• The LUCA Program Improvement Project completed their recommendations for the 2020
Census LUCA operation. The research focused
on improving the LUCA operation with research
by the following four research areas (2020
Census LUCA Program Recommendations
4/13/2015):
ºº Looking back at previous LUCA and related
programs.
•• Findings: Simplify the 2020 Census LUCA
program as the 2010 Census LUCA program was too complicated.
ºº Validating LUCA records without Address
Canvassing.
•• Findings: It is possible to validate LUCA
addresses in an office environment.
ºº Utilizing GSS-I for LUCA.
•• Findings: Data and tools used for the GSS-I
should be used and repurposed for the
LUCA program.
ºº Focus Groups.
•• Findings: Focus group participants agreed
with the proposal to remove the full
address list submission options for the
2020 Census LUCA program.

•• Governmental entities review and add, delete, or
change address records or features.

•• As part of the 2020 Census R&D efforts staff
evaluated the 2010 LUCA and 2010 lessons
learned and conducted a series of focus groups
with former LUCA participants. This effort
resulted in 12 major recommendations for the
2020 Census LUCA operation. (Note: These
recommendations are described in more detail
in the 2020 Census Local Update of Census
Addresses Project Improvement Report):

•• Census Bureau incorporates the updates to MAF/
TIGER System.

1.	 Continue the 2010 Census LUCA Program
improvements that were successful:

•• Census Bureau validates the updates via Address
Canvassing.

ºº Continue to provide a 120-day review
time for participants.

•• Census Bureau provides feedback to the governmental entities.

ºº Continue the 6-month advance notice
about the LUCA program registration.	

•• Governmental entities can appeal the Address
Canvassing validation outcomes.

ºº Continue a comprehensive communication program with participants.	

Research Completed

ºº Continue to provide a variety of LUCA
media types.

•• Census Bureau provides address list and maps
to the governmental entities.

The following research has been completed for this
operation:

U.S. Census Bureau

	

ºº Continue to improve the Partnership
Software application.

2020 Census Operational Plan—Version 1.1 83

º Continue state participation in the LUCA
program.
2. Eliminate the full address list submission
options that were available in 2010 LUCA.
This will:
º Reduce the number of deleted LUCA
records in field verification activities.
º Reduce the burden and cost of processing addresses and LUCA address validation.

8.	 Utilize and modify existing GSS-I tools and
data to validate LUCA submission.	
3.	Reduce the complexity of the LUCA Program as compared with the 2010 Census program.
4.	Include census structure coordinates in the census address list and allow partners to return their structure coordinates as part of their submission:
º Benefits participants and the Census Bureau in the review of materials because it enables more information about each address to be considered in both the participants' review and the Census Bureau's validation of the submitted addresses.
5.	Provide ungeocoded USPS DSF addresses to state and county partners in LUCA materials.6
6.	Provide the address list in more standard file formats so that lists are easier to load into common software packages.
7.	Include an in-house verification of LUCA-submitted addresses to align with the reengineered Address Canvassing.
9.	Encourage governments at the lowest level to work with larger governments to consolidate their submissions.
10.	Eliminate the Block Count Challenge, as previously this did not result in useful information for the Census Bureau to determine specifically what addresses were missing from a block.
11.	Eliminate the option for participants to use an asterisk (*) for multiunits submitted without unit designations.
12.	Encourage LUCA participants to identify E-911 addresses used for mailing, location, or both so that the Census Bureau has more information available during MAF update.

Decisions Made
The following decisions have been made for this operation:
✓	Conduct a comprehensive communication program with LUCA participants.
✓	Include census structure coordinates in the census address list and allow partners to return their structure coordinates as part of their submission.
º Provides more complete data for participants to review.
✓	Provide ungeocoded addresses to state and county partners in LUCA materials.
º May result in participants being able to geocode previously ungeocoded addresses for the Census Bureau.
✓	Provide the address list in more standard file formats so that lists are easier to load into common software packages.
º Should reduce the number of duplicate addresses submitted by LUCA participants.
✓	Encourage governments at the lowest level to work with larger governments to consolidate their submissions.

6 This component is under legal and policy review and is subject to change.

84 2020 Census Operational Plan—Version 1.1	

✓	Provide a variety of LUCA media types.
✓	Simplify the 2020 Census LUCA program and make it compatible with the GSS-I and Address Canvassing.
✓	Utilize administrative records and third-party data to improve the validation process.
✓	Use the GUPS to support automated exchange of information for LUCA participants.

U.S. Census Bureau

✓	Validation of LUCA submissions will occur primarily during In-Office Address Canvassing, with minimal validation occurring early in the In-Field Address Canvassing operation.

Design Issues to Be Resolved
Additional work is required to make decisions on the following questions:
How will the Census Bureau register LUCA participants over the Internet, and are there opportunities to use Title 13 e-signature capability so that it can be done online?
•• Approach: Investigate whether existing systems can meet the need and, if not, evaluate options (e.g., mail/paper or a new automated solution).
•• Decision by: December 2015
To what extent does LUCA need to capture the use of the address (i.e., mailing, location, or both)?
•• Approach: Determine the requirements for LUCA submissions.
•• Decision by: July 2016
What is the strategy for communicating late-decade GSS-I activities during LUCA?
•• Approach: Work with GSS-I to resolve.
•• Decision by: October 2016
What is the 2020 Census LUCA Appeals process?
•• Approach: Work with the Office of Management and Budget to develop a 2020 Census LUCA Appeals process; defining the appropriate appeals office will largely depend on the design of LUCA.
•• Decision by: October 2016
To what extent can administrative records and third-party data be used to validate addresses submitted by LUCA participants?
•• Approach: Study the feasibility of using administrative records and third-party data, as well as CARRA administrative records and third-party data.
•• Decision by: June 2017


Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:
Investment in LUCA will have minimal impact on
cost to the 2020 Census as compared with the
2010 Census.
Quality impacts of this operation on overall 2020
Census quality include the following:
↑	Removing the full address list submission options, thereby reducing the number of addresses that need to be validated.
↑	Use of administrative records and third-party data to independently validate addresses submitted by tribal, federal, state, and local governments prior to adding them to the MAF.
Risks
To protect Title 13 data on computer-readable
materials, all local government LUCA liaisons
and LUCA reviewers are required to sign a
Confidentiality Agreement and abide by the Census
Bureau’s security guidelines. However, lessons
learned from previous censuses show that not all
stakeholders reviewing the Title 13 materials possess the skills necessary to meet IT requirements.
IF participants are required to take additional
efforts to meet the Census Bureau’s IT Title 13
requirements, THEN there needs to be adequate
support in a help desk environment for responding
to IT Title 13 issues. 


The Census Bureau needs to work with the Office
of Management and Budget to determine the
requirements for the LUCA Appeals Office. IF the
LUCA Appeals Office is not planned in coordination
with the Office of Management and Budget by the
summer of 2016, THEN the Census Bureau will be
required to play a larger role in the development of
the LUCA Appeals Office.

Milestones
September 2016: Release the LUCA Detailed Operational Plan.
January 2017: Mail Advance Notice Package.
July 2017: Mail Invitation Package.
October 2017: Mail Review Materials.
August 2018: Complete Initial Processing of LUCA submissions for delivery to Address Canvassing.
June 2019: Complete Address Canvassing validation of LUCA addresses.
August 2019: Deliver Feedback Materials.
March 2020: Complete the processing of LUCA Appeal addresses.
September 2021: Complete LUCA.

5.4.3	Address Canvassing
Detailed Planning Status: Underway

Purpose
The Address Canvassing operation serves two purposes:
•• Deliver a complete and accurate address list and spatial database for enumeration.
•• Determine the type and address characteristics for each living quarter.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• Continuously update the maps and address lists throughout the decade, supplementing these activities with Address Canvassing at the end of the decade.
•• Allow more time in the schedule to fully develop and test the listing instrument.
•• Improve the Address Canvassing training to emphasize working from the ground to the Handheld Computer.

Opportunities to Innovate
Opportunities to innovate include the following:
•• 100-percent Address Canvassing conducted In-Office.
•• Target 25 percent of living quarters for In-Field Address Canvassing.
•• Use of automation and data (imagery, administrative records, and third-party data) for In-Office Address Canvassing.
•• Ongoing MAF Coverage Study to validate In-Office Address Canvassing procedures, measure coverage, and improve In-Field Address Canvassing data collection methodologies.
•• Use of reengineered field management structure and approach to managing fieldwork, including new field office structure and new staff positions.
Description of Operation
The Census Bureau needs the address and physical
location of each living quarter in the United States
to conduct the census. During Address Canvassing,
the Census Bureau verifies that its master address
list and maps are accurate so the tabulation for all
housing units and GQ is correct. A complete and
accurate address list is the cornerstone of a successful census.
The Census Bureau has determined that while there
will be a full Address Canvassing of the nation in
2020, a full In-Field Address Canvassing of the
nation is no longer necessary. Advancements in
technology have enabled continual address and
spatial updates to occur throughout the decade
as part of the In-Office Address Canvassing effort.
This has made it possible to limit In-Field Address
Canvassing to only the most challenging areas. The
scope of the Address Canvassing operation for the
2020 Census includes:


•• In-Office Address Canvassing: Process of using
empirical geographic evidence (e.g., imagery,
comparison of the Census Bureau’s address list
to partner provided lists) to assess the current
address list. Also removes geographic areas
from the In-Field Address Canvassing workload
based on the availability of administrative data
sets (e.g., military lands, national forests) and
the method of enumeration planned for the
2020 Census (e.g., UE). Detects and identifies
change from high quality administrative and
third-party data sources to reduce the In-Field
Address Canvassing workload. Determines the
In-Field Address Canvassing universe.
•• In-Field Address Canvassing: Process of doing a
dependent listing in the field to identify where
people live, stay, or could live or stay.
•• Quality Assurance: Process of reviewing the
work of field and office staff. Both In-Field
Address Canvassing and In-Office Address
Canvassing work will be validated using quality
assurance techniques.
•• MAF Coverage Study: An ongoing field activity
that validates In-Office procedures, measures
coverage, improves In-Field data collection
methodologies, and updates the MAF on a continuous basis.
Research Completed
The following research has been completed for this
operation:
•• September 2014: Released the Address
Canvassing Recommendation Report.
ºº Findings: A recommendation was made to
not walk every block and implement the
reengineered Address Canvassing (In-Field
and In-Office).
•• February 2015: Completed the 2015 Address
Validation Test, which consists of the MMVT and
the PBC Test.
ºº Findings:
•• The statistical models were not effective at
identifying specific blocks with many adds
or deletes.
•• The statistical models were not effective
at predicting national totals of MAF coverage errors.


•• PBC was successfully implemented as an
alternative field data collection methodology; future work will determine how the
PBC method impacts cost and quality.
•• Imagery Review successfully identified
areas requiring updates; future research
is needed to refine the process and determine impacts on quality.
Decisions Made
The following decisions have been made for this operation:
✓	The Address Canvassing operation consists of:
º In-Office Address Canvassing.
º In-Field Address Canvassing.
º Quality Assurance.
º MAF Coverage Study.
✓	Administrative records and third-party data sources will be used to validate addresses within each block.
✓	GQ will be identified and classified during Address Canvassing.
✓	Geographic areas (e.g., living quarters and features) that are included in downstream operations will no longer have to be canvassed in the field (e.g., UE and Remote Alaska).
✓	At most 25 percent of the living quarters will be canvassed in the field.
º Target as of September 2015; continued study through additional testing.
✓	Production Address Canvassing begins September 2015.
✓	Address Canvassing provides training for both production and quality assurance processes for in-office work.
✓	Address Canvassing relies on automated training for production and quality assurance processes for in-field work.
✓	Address Canvassing updates the Census Bureau's address list using a dependent canvass (from ground to list).
✓	Address Canvassing validates and collects coordinates for every structure with a living quarter.
✓	The MAF Coverage Study will be conducted throughout the decade.


✓	In-Office Address Canvassing creates the universe for In-Field Address Canvassing.
✓	In-Office Address Canvassing will review public lands.
✓	Geographic areas designated for In-Office Address Canvassing can move to the In-Field Address Canvassing universe and vice versa.
✓	In-Field Address Canvassing can identify additional in-field work.
✓	Statistical modeling will not be used in Address Canvassing.
✓	Imagery will be available on the Listing and Mapping Instrument to use during In-Field Address Canvassing.
✓	Address Canvassing will validate LUCA submissions.
✓	Validation of LUCA submissions will occur primarily during In-Office Address Canvassing, with minimal validation occurring early in the In-Field Address Canvassing operation.

Design Issues to Be Resolved
Additional work is required to make decisions on the following questions:
Is PBC more cost-effective than Full Block Canvassing?
•• Approach: Researched in 2016 Address Canvassing Test.
•• Decision by: January 2017
How will the field reengineering concepts tested for NRFU be used for In-Field Address Canvassing?
•• Approach: Researched in 2016 Address Canvassing Test.
•• Decision by: January 2017
How will Quality Assurance be handled?
•• Approach: Researched in 2016 MAF Coverage Study and 2016 Address Canvassing Test.
•• Decision by: January 2017

What are the business processes for handling
Transitory Locations7 during Address Canvassing?
•• Approach: Researched in 2016 Address
Canvassing Test.
•• Decision by: January 2017
Will the Census Bureau be able to meet the
25-percent In-Field Address Canvassing goal
without sacrificing quality?
•• Approach: Researched in 2016 MAF Coverage
Study and 2016 Address Canvassing Test.
•• Decision by: January 2017
How will ungeocoded addresses be resolved as
part of Address Canvassing?
•• Approach: Researched in 2016 Address
Canvassing Test.
•• Decision by: March 2017
What is the business process to meet spatial
accuracy requirements for capturing features and
living quarter coordinates during In-Field Address
Canvassing if the devices are unable to meet these
requirements?
•• Approach: Research during the 2016 Address
Canvassing Test.
•• Decision by: March 2017
What feature data, if any, should be collected
during an In-Field Address Canvassing?
•• Approach: Researched in 2016 MAF Coverage
Study and 2016 Address Canvassing Test.
•• Decision by: March 2017
7 Transitory Locations are recreational vehicle parks, campgrounds, hotels, motels, marinas, racetracks, circuses, and carnivals.


Cost and Quality
Cost impacts of this operation on overall 2020 Census costs include the following:
Investment in Address Canvassing will reduce the cost of the 2020 Census as compared with the 2010 Census through:
↓	Reduction in the amount of In-Field Address Canvassing and associated infrastructure by implementing In-Office Address Canvassing.
↓	Use of additional sources of administrative records and third-party data to validate the frame.
↓	Partial block canvass (under review).
In addition:
↑	Address Canvassing is expected to require additional people, process activities, data, technology, and facilities to support In-Office Address Canvassing and the resolution of ungeocoded responses.
Quality impacts of this operation on overall 2020 Census quality include the following:
↑	Continuous in-field improvement process to:
º Test in-field methodologies.
º Verify in-office methodologies.
º Update MAF with results.
↑	Use of additional sources of administrative records and third-party data to validate the frame.
↓	Missed changes in the address list resulting from the new Address Canvassing approach.
Risks
In-Office Address Canvassing is a new approach for the 2020 Census, and there are concerns that some local governments may believe In-Field Address Canvassing yields a greater "quality" canvassing than In-Office Address Canvassing, and they may be concerned about the lack of census jobs within their jurisdiction because of decreased In-Field Address Canvassing. IF the Census Bureau is unable to gain stakeholder acceptance for the proposed Address Canvassing methodology, THEN the workload for In-Field Address Canvassing may increase dramatically.
The LUCA program provides addresses to the Address Canvassing workload that need to be validated. The redesigned LUCA program is intended to resolve more addresses and lessen the potential for increased In-Field Address Canvassing work. IF LUCA provides addresses to In-Office Address Canvassing that are unresolvable at a higher than expected rate, THEN there will be an increased workload for In-Field Address Canvassing.
Milestones
August 2015: Release Address Validation Test Results.
September 2015: Release Address Canvassing Detailed Operational Plan.
September 2015: Begin 2020 Census Address Canvassing (In-Office).
April 2016: Begin MAF Coverage Study (In-Field).
September 2016: Begin 2016 Address Canvassing Test (In-Field). Release the LUCA Detailed Operational Plan.
September 2017: Begin In-Field Address Canvassing for 2018 Census End-to-End Test.
August 2019: Begin In-Field Address Canvassing for 2020 Census.

5.5 RESPONSE DATA
The Response Data area includes all operations associated with the collection of responses, management of the cases, and initial processing of the data. This area consists of 12 operations that are described in the following sections:
1.	Forms Printing and Distribution
2.	Paper Data Capture
3.	Integrated Partnership and Communications
4.	Internet Self-Response
5.	Non-ID Processing
6.	Update Enumerate
7.	Group Quarters
8.	Enumeration at Transitory Locations
9.	Census Questionnaire Assistance
10.	Nonresponse Followup
11.	Response Processing
12.	Federally Affiliated Americans Count Overseas

5.5.1	Forms Printing and Distribution
Detailed Planning Status: Underway

Purpose
The Forms Printing and Distribution operation prints and distributes the following paper forms to support the 2020 Census mailing strategy and enumeration of the population:
•• Internet invitation letters.
•• Reminder postcards.
•• Questionnaire mailing packages.
•• Materials for other special operations, as required.
Other materials required to support field operations are handled in the Decennial Logistics Management or Field Infrastructure operations.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• Use USPS tracing data to monitor large-scale inbound and outbound census mailings.
•• Provide a comprehensive 2020 Census forms list to be used by the contractor for printing planning.
•• Identify an owner for every field on the questionnaires.

Opportunities to Innovate
Opportunities to innovate include the following:
•• Shifting from paper questionnaires to the Internet as the primary response mode to the Census, thus reducing the amount of paper printing and distribution required as compared with 2010.
•• Paper questionnaires will be used mainly for the enumeration of Internet nonrespondents and targeted areas or populations with low Internet usage.
Description of Operation
The Forms Printing and Distribution operation is responsible for the printing and distribution of mailed Internet invitations, reminder postcards, and questionnaire mail packages in multiple languages as determined by the Language Services operation.
•• The contact strategy will include printing and
mailing of paper invitations and postcards.
•• Paper questionnaires will be printed and mailed
to some portion of the population.
•• Printing and mailing will be contracted through
the Government Publishing Office.
•• A serialized barcode will be printed on each
sheet of a questionnaire to ensure all pages for a
household are properly captured.
•• All or most of the mailing items or packages
will be addressed in near real time to minimize
distribution to households who have engaged in
the digital or other nonpaper response channels.
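The near-real-time addressing described above amounts to suppressing mailpieces for households that have already responded through a nonpaper channel. A minimal sketch of that filter, with hypothetical IDs and field names that do not reflect any actual Census Bureau system:

```python
def build_mailing_wave(candidates, responded_ids):
    """Return the (census_id, address) pairs still needing a paper mailing.

    candidates: pairs slated for the next mailing wave.
    responded_ids: IDs that have already responded via the Internet or
    another nonpaper channel, so their mailpiece is suppressed.
    """
    responded = set(responded_ids)  # set gives O(1) membership checks
    return [(cid, addr) for cid, addr in candidates if cid not in responded]


wave = [("0001", "12 Elm St"), ("0002", "8 Oak Ave"), ("0003", "3 Pine Rd")]
print(build_mailing_wave(wave, {"0002"}))  # household 0002 is dropped from the wave
```

In practice the suppression list would be refreshed continuously between print runs, which is what makes the addressing "near real time."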
Research Completed
The following research has been completed for this operation:
•• Multiple studies on the use of USPS tracing:
ºº 2010 Census Paper: Optimizing Integrated
Technologies and Multimode Response to
achieve a Dynamic Census, February 29,
2012.
ºº 2010 Census Assessment: 2010 Census
Postal Tracking Assessment, April 2, 2012.


ºº Cost assessment for the paper data capture
check-in operation.
•• Findings:
ºº USPS tracing data are cost-effective and
accurate.
ºº Postal tracing services are deemed reliable and can be used on a nationwide
scale in lieu of check-in.

2015 Census Test, the 2015 National Content
Test, and the 2016 Census Test.
•• Decision by: October 2016
What is the “on demand” printing process?
•• Approach: Researched in the 2014 Census Test,
the 2015 Optimizing Self-Response Test, the
2015 Census Test, the 2015 National Content
Test, and the 2016 Census Test.

Decisions Made

•• Decision by: October 2016

The following decisions have been made for this
operation:

What other census operations have paper printing requirements (e.g., UE, Puerto Rico and Island
Areas Censuses, GQ enumeration)?

99 Paper questionnaires, in at least English and
Spanish, will be printed and mailed to some
portions of the population as part of the initial
contact strategy.
99 Printing and mailing of 2020 Census invitation
letters, reminder postcards, and questionnaires
will be contracted out through Government
Publishing Office.

•• Approach: Based on UE, Puerto Rico, Island
Areas, and GQ operational design.
•• Decision by: October 2017
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:

99 USPS barcodes will be used for various postal
services, such as tracing and identification of
vacant or other undeliverable addresses.

The investment in Forms Printing and Distribution
will have minimal impact on the cost of the 2020
Census as compared with the 2010 Census.8

Design Issues to Be Resolved

Quality impacts of this operation on overall 2020
Census quality include the following:

Additional work is required to make decisions on
the following questions:
What is the printing and mailing workload as part
of the Optimizing Self-Response contact strategy
and NRFU Operation?
•• Approach: Researched in the 2014 Census Test,
the 2015 Optimizing Self-Response Test, the
2015 Census Test, the 2015 National Content
Test, and the 2016 Census Test.
•• Decision by: Initial workload projection
October 2015 and final October 2016
What is the timing for the various mailings?
•• Approach: Researched in the 2014 Census Test,
the 2015 Optimizing Self-Response Test, the

U.S. Census Bureau

	

ÏÏ Robust printing quality assurance measures
have a direct positive impact on the quality of
data from paper data capture.
Risks
The printing products and address files needed
to support the 2020 Census need to be finalized
in time so that subsequent planning and development for the printing operation can take place. IF
printing products and address files are not finalized on schedule, THEN the printing operation will
be unable to plan print contracts and production in
the most fiscally responsible way, resulting in extra
mailing costs and schedule delays.
8 Printing costs may increase from the 2010 Census due to the requirement for increased on-demand addressing and mailing.


The final design for the 2020 Census paper questionnaire needs to be within the established USPS
thresholds in order to take advantage of mailing
discounts. IF the final 2020 Census questionnaire
design pushes the weight, size, or shape of a
mailing piece over established USPS thresholds,
THEN the Census Bureau will be unable to maximize use of USPS mailing discounts, adding extra
mailing costs.

Milestones
September 2016: Release the Forms Printing and Distribution Detailed Operational Plan.
October 2016: Receive final contact strategies from the Internet Self-Response operation. Receive questionnaire designs from the Content and Forms Design operation. Define the printing and mailing workload estimates.
October 2018: Refine the printing and mailing workload estimates.
March 2018–March 2019: Start print contract planning.
June 2019–April 2020: Implement printing, addressing, and mailing of paper questionnaire packages. Start USPS mailing planning.

5.5.2	Paper Data Capture
Detailed Planning Status: Underway

Purpose
The Paper Data Capture operation captures and converts data from 2020 Census paper questionnaires. This operation includes:
•• Document preparation.
•• Scanning.
•• Optical Character Recognition (OCR).
•• Optical Mark Recognition (OMR).
•• Key From Image (KFI).
•• Editing and checkout.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• A timely and comprehensive forms list is required.
•• Every field on a questionnaire must have an owner.
•• Realistic and timely contingency planning is essential in order to properly estimate the paper data capture workload.
•• Use postal tracing to monitor large-scale inbound and outbound mailings.
•• Barcode serialization is an essential automated quality component to data capture operations.

Opportunities to Innovate
Opportunities to innovate include the following:
•• Significant reduction in paper data capture operations and associated infrastructure due to Internet Self-Response and automated field operations.
•• Use of the in-house Integrated Capture and Data Entry (iCADE) system for paper data capture.
•• USPS tracing data used to identify questionnaires prior to arrival.
Description of Operation
The Paper Data Capture Operation is responsible
for the capture and conversion of data from paper
questionnaires. Paper forms delivered by the USPS
are processed by the National Processing Center
(NPC). Questionnaires go through several steps
as shown in Figure 32. Note that questionnaire
images are archived. The paper questionnaires
themselves are stored until verification that data
are received by Headquarters and then they are
destroyed per security regulations.
[Figure 32. Paper Data Capture Flow: receive questionnaires via the United States Postal Service (USPS); remove paper questionnaires from envelopes; prepare questionnaires for scanning; scan questionnaires to create images; use software to perform OCR and OMR on images; perform KFI operations; send data to Headquarters; send images to archive; send paper questionnaires to destruction.]

The Paper Data Capture operation is driven largely by the timing of the questionnaire mail out, volume of forms received, timing of the nonresponse workload universe cut, and any priority capture requirements needed for the 2020 Census. Data are captured from the paper forms in the most efficient manner possible, and both data and images of the forms are maintained. The data are sent to the Response Processing operation area for further work. The images are sent to the Archiving operation.
Mail returns are identified using USPS postal tracing to indicate that a form is en route to the processing office. Upon receipt at the processing office, mail return questionnaires will be processed in First-In-First-Out order, unless otherwise specified.
The document preparation area removes mail
returns from the envelopes and prepares them for
scanning. Damaged forms are transcribed to new
forms of the same type and a new barcode label
(same ID) is affixed to the new form. Booklet forms
have the binding (spine) removed.
The questionnaires are delivered to scanning to
begin the data capture process. All questionnaires
are scanned by iCADE (no key from paper). Once
scanned, the physical paper forms move on to the
checkout operation. Forms await confirmation that


the data have been received at Headquarters (see
Response Processing in Section 5.6.12).
Scanned images are sent forward for further
processing using the iCADE system where OMR
and OCR are performed. Data fields with low
confidence OMR and OCR results are sent to the
KFI process. Both data and images are maintained
(data are sent to response processing and images
are archived). Once all data have been received at
Headquarters, the questionnaires can be checked
out to ensure each form has been fully captured.
These forms are then eligible for destruction.
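The routing of low-confidence OCR/OMR results to KFI can be sketched as a simple threshold split. The threshold value and field layout below are illustrative assumptions, not actual iCADE parameters:

```python
CONFIDENCE_THRESHOLD = 0.90  # illustrative cutoff, not the system's real value

def route_fields(fields, threshold=CONFIDENCE_THRESHOLD):
    """Split OCR/OMR results into auto-accepted fields and a KFI queue.

    fields: dicts with 'name', 'value', and 'confidence' (0.0-1.0) keys,
    standing in for the per-field output of the recognition engine.
    """
    accepted, kfi_queue = [], []
    for field in fields:
        if field["confidence"] >= threshold:
            accepted.append(field)   # high confidence: keep the machine value
        else:
            kfi_queue.append(field)  # low confidence: route the image to keyers
    return accepted, kfi_queue
```

Fields landing in the KFI queue would be presented to operators alongside the scanned image, and the keyed values would replace the low-confidence machine reads before the data are sent to Headquarters.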
Research Completed
The following research has been completed for this
operation:
•• Conducted Improving Operational Efficiency
technical evaluation project:
ºº Expanding the use of iCADE system to support the 2020 Census.
•• Findings: iCADE has the capability to be
the paper capture solution for the 2020
Census.


•• Multiple studies on the use of USPS tracing:
ºº 2010 Census Paper: Optimizing Integrated
Technologies and Multimode Response to
achieve a Dynamic Census, February 29, 2012.
ºº 2010 Census Assessment: 2010 Census
Postal Tracking Assessment, April 2, 2012.
ºº Cost assessment for the paper data capture
check-in operation.
•• Findings: USPS tracing data are a cost-effective and accurate alternative to a check-in operation for the 2020 Census.
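Using tracing scans in lieu of a physical check-in can be sketched as matching inbound scan events against expected questionnaire IDs; the data shapes here are hypothetical:

```python
def flag_en_route(expected_ids, tracing_events):
    """Return the expected questionnaire IDs seen in USPS tracing scans.

    tracing_events: (questionnaire_id, scan_timestamp) pairs from carrier
    barcode scans; matching them against the expected IDs lets the
    processing office anticipate arrivals without a check-in step.
    """
    seen = {qid for qid, _ in tracing_events}
    return set(expected_ids) & seen


events = [("Q1", "2020-04-01T09:00"), ("Q9", "2020-04-01T09:05")]
print(flag_en_route({"Q1", "Q2"}, events))  # only Q1 is flagged as en route
```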
Decisions Made
The following decisions have been made for this operation:
✓	iCADE is the planned paper capture system for the 2020 Census.
º Dev 8 Assessment, submitted February 2014 and updated January 2015.
º iCADE is part of CEDCaP.
✓	Paper questionnaires will be mailed to targeted areas or populations with low Internet usage as part of the initial contact strategy and to Internet nonrespondents.
✓	All questionnaires are booklets that require separation.
✓	USPS tracing data will be used to identify questionnaires prior to arrival (no laser sorter check-in operation).
✓	All questionnaires will be scanned by iCADE (no key from paper).
✓	The 2010 Census target quality levels will be used for OMR (99 percent), OCR (97 percent), and KFI (99 percent).
✓	There will be two paper data capture centers.
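The stated targets (99 percent for OMR, 97 percent for OCR, 99 percent for KFI) could be checked against sampled capture results with a small helper like the following; the sampling scheme and names are purely illustrative:

```python
QUALITY_TARGETS = {"OMR": 0.99, "OCR": 0.97, "KFI": 0.99}  # from the decisions above

def meets_target(mode, correct_fields, sampled_fields, targets=QUALITY_TARGETS):
    """Compare a sampled capture-accuracy rate for one mode to its target."""
    if sampled_fields <= 0:
        raise ValueError("need at least one sampled field")
    return correct_fields / sampled_fields >= targets[mode]


print(meets_target("OCR", 975, 1000))  # 97.5% meets the 97% OCR target: True
print(meets_target("OMR", 975, 1000))  # 97.5% misses the 99% OMR target: False
```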
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
What are the 2020 Census paper capture workload and the questionnaire size and shape?
•• Approach: Researched during the 2015 Optimizing Self-Response Test, the 2015 National Content Test, and the results of the demand model.
•• Decision by: October 2016
What does the reengineered NRFU operation require
from Paper Data Capture? Will there be priority
capture requirements for Nonresponse Followup? Is
the universe cut schedule different?
•• Approach: Based on decisions for the detailed
design of the NRFU Operation.
•• Decision by: October 2016
Which operations will use paper questionnaires as a contingency in the event that the Internet Self-Response, NRFU, and other operations cannot be executed as planned?
•• Approach: Based on risk analysis of alternative
options for each relevant operation.
•• Decision by: October 2016
What other operations have paper data capture
requirements (e.g., UE, Puerto Rico, and GQ)?
•• Approach: Based on decisions on paper data
capture requirements for other operations.
•• Decision by: October 2017
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:
The investment in Paper Data Capture will reduce
the cost of the 2020 Census as compared with the
2010 Census through:
↓	The use of an enterprise solution, iCADE, for paper data capture.
↓	The provision of a low-cost response mode (other than the Internet) to increase self-response rates.
Quality impacts of this operation on overall 2020 Census quality include the following:
↔	Plan to maintain the same quality level as the 2010 Census for OCR, OMR, and KFI.
Risks
In order to make informed decisions regarding
paper capture facilities and equipment, timely
guidance must be provided on the workloads

for questionnaire capture. IF guidance regarding questionnaire capture workloads is not provided on time, THEN paper capture facility and equipment decisions will be negatively impacted.
The size of the final 2020 Census questionnaire affects the cost of processing paper forms, as it determines the number of form faces that must be managed. IF the final 2020 Census questionnaire is in a booklet format, THEN additional equipment and storage space may be needed to accommodate the format, adding time, cost, and complexity to the paper data capture process.
The Census Bureau is considering significant innovations to conduct the 2020 Census. These innovations (e.g., enterprise IT solutions, data collection via the Internet and mobile devices) are expected to drastically reduce the need for paper for many of the operations. IF the innovations being developed to reduce the use of paper for the 2020 Census do not get implemented as planned, THEN operations may need to be fully or partially paper-based, which will require a more robust solution than currently planned, resulting at a minimum in additional cost and schedule delays.
Milestones
September 2016: Release the Paper Data Capture Detailed Operational Plan.
October 2016: Develop paper data capture Nonresponse Followup plan. Develop paper data capture contingency planning guidance.
October 2017: Design other operations that may require paper data capture.
March–August 2020: Conduct Paper Data Capture operation.

5.5.3	Integrated Partnership and Communications
Detailed Planning Status: Underway

Purpose
The Integrated Partnership and Communications operation communicates the importance of participating in the 2020 Census to the entire population of the 50 states, the District of Columbia, and Puerto Rico to:
•• Engage and motivate people to self-respond, preferably via the Internet.
•• Raise and keep awareness high throughout the entire 2020 Census to encourage response.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• Integrate Census Bureau subject matter experts into all phases of the 2020 Census Integrated Partnership and Communications Program.
•• Improve coordination of communications among the Decennial, Field, and Communications Directorates and others.
•• Align timing, funding, and design decisions between the development of the Integrated Partnership and Communications Program Plan and the Census Bureau's operational milestones to effectively support all phases of the 2020 Census.
•• Establish more specific program metrics for the Integrated Partnership and Communications Program to assist in evaluation and assessment.

Opportunities to Innovate
Opportunities to innovate include the following:
•• Microtargeted messages and placement for digital advertising, especially for hard-to-count populations.
•• Advertising and partnership campaign adjusted based on respondent performance.
•• Texting and e-mailing to motivate self-response.
•• Expanded predictive modeling to determine the propensity to respond.
•• Expanded use of social media to encourage response.
•• Localized advertising to encourage response.

Description of Operation
Inspiring every household in the country to complete the census is an enormous, increasingly complex, and unparalleled challenge. What was once a widely accepted civic exercise has become much

more difficult over the past several decades. With
an increasingly diverse population and a drop in
public participation, an effective communications
strategy is critical to the success of the census.
The Integrated Partnership and Communications
Program must reach every household in the nation,
delivering the right messages to the right audiences
at the right time. It must allocate messages and
resources efficiently, taking care not to over-allocate
resources to reach households that will readily
respond, and to dedicate additional appropriate
resources to reach households that need more
encouragement. It is a delicate balance that must
be informed by timely research. Critical to this is ensuring consistent messaging, as well as a consistent look and feel, across all public-facing materials, communication efforts, and operations.
An Integrated Partnership and Communications
Program contractor will be engaged to support the
2020 Census Program from recruitment through
data dissemination. At a minimum, the Program
will offer the following components:
•• Partnership including both Regional and National
efforts.
•• Advertising using print, radio, digital, television, and out-of-home media.
•• Social Media to include blogs, Facebook, Twitter, etc.
•• Statistics in Schools.
•• Rapid Response.
•• Earned Media.
•• Thank you campaign.
•• Public Relations.
Together these eight major components of the
Integrated Partnership and Communications operation will communicate the importance of participating in the 2020 Census to the entire population.
Research Completed
The following research has been completed for this
operation:
•• The 2015 Optimizing Self-Response Test:
ºº Promote “Notify Me,” allowing individuals to
provide contact information to receive future
e-mail and text message notifications when it
is time to participate in the test.
•• Findings: “Notify Me” is not a successful contact strategy as designed and tested, with a very low percentage of the mail panel responding.
ºº Test microtargeted digital advertising on
response rates associated with “Notify Me”
and survey completion.
•• Findings: 2015 Optimizing Self-Response
Test Report due December 2016.
ºº Test multiple communications elements,
including earned media, social media, partnership and outreach, and telephone, radio,
print, billboards, and digital advertising; as
well as automated telephone messaging by
local influencers.
•• Findings: 2015 Optimizing Self-Response
Test Report due December 2016.
ºº Test of effectiveness of partnerships in motivating self-response.
•• Findings: Partnerships were effective.
Decisions Made
The following decisions have been made for this
operation:
99 The 2020 Census will use partnerships to communicate the importance of the 2020 Census to
the U.S. population and encourage self-response.
99 The 2020 Census will use digital advertising and
social media targeting.
99 The 2020 Census will use texting and e-mailing
to motivate self-response.
99 The 2020 Census will use traditional advertising
methods, including the use of local advertising.
99 An online portal will be developed that will allow
for posting and downloading materials, providing online fulfillment, and sharing experiences.
99 Integrated Partnership and Communications
Internet kiosks will be made available in public
spaces for respondents to complete their Census questionnaire online.
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
What are the components and materials required
for implementing the Integrated Partnership and
Communication (IPC) operation?
•• Approach: Census will work with IPC operation
contractor upon contract award in September
2016 to develop an IPC Program Plan.
•• Decision by: March 2017
What is the approach for audience and market segmentation models?
•• Approach: Census will work with IPC operation
contractor upon contract award in September
2016 to determine the appropriate approach
based on best practices and available data.
•• Decision by: April 2017
What metrics will be used to evaluate the success
of the IPC operation as well as each individual
component? Microtargeted digital advertising?
Automated telephone messaging by local influencers? Providing donated thank you incentives to
respondents? Social media? E-mail?
•• Approach: Depends on when the Independent Evaluation Contract for the IPC operation is awarded; that contractor will work with the Census Bureau and the IPC operation contractor to determine metrics.
•• Decision by: April 2017
Cost and Quality
Cost impacts of this operation on overall 2020 Census costs include the following:
ÐÐ A campaign aimed at promoting the Internet
as the primary response option reduces census
data collection costs.
Quality impacts of this operation on overall 2020
Census quality include the following:
ÏÏ Increase in overall self-response rates.
ÏÏ Potential increase in self-response from traditionally hard-to-count populations.
ÏÏ Ability to adjust advertising using real-time metrics.
Risks
The Integrated Partnership and Communications operation may not be able to use newly emerged communication channels, as it may be too late to incorporate these new technologies. In addition, internal policies may not be flexible enough to accommodate new communication channels. IF the Integrated Partnership and Communications operation is unable to leverage new communication channels to encourage the public to complete the 2020 Census, THEN messages may not get to some segments of the population, resulting in lower self-response rates.
Milestones
Date	Activity
January 2015	Launch the 2020 Census Web site.
August 2016	Award the Integrated Partnership and Communications contract.
September 2016	Release the Integrated Partnership and Communications Detailed Operational Plan.
October 2016	Kick off the Integrated Partnership and Communications contract.
June 2017	Start the 2020 Census Partnership program.
June 2017	Start the 2020 Census recruiting campaign.

5.5.4	Internet Self-Response
Detailed Planning Status: Underway
Purpose
The Internet Self-Response operation performs the following functions:
•• Maximize online response to the 2020 Census via contact strategies and improved access for respondents.
•• Collect response data via the Internet to reduce paper and Nonresponse Followup.

Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Determine optimal contact strategies for eliciting responses to the 2020 Census for Internet and other response modes.
•• Optimize the instrument for mobile devices
to provide for better user experiences and to
improve overall response rates.
•• Determine if a bilingual initial or replacement questionnaire in selected bilingual tracts is beneficial.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Internet Data Capture:
ºº Real-time edits.
ºº Ability to capture unlimited household size
entries.
ºº Multiaccess methods across different technologies (e.g., computers, smart phones, tablets,
kiosks).
ºº Online questionnaires available in multiple
languages and non-Roman alphabets.
•• Multimode contact approach (e.g., postcard,
e-mail, phone, and text) tailored to demographic
or geographic area, designed to encourage
Internet self-response, and tied to the messaging from the Integrated Partnership and
Communication operation.
•• A contact frame, including e-mail and phone
numbers, created from administrative records
and third-party data.
Description of Operation
Two significant pieces of the program reside in
this operation: Internet Self-Response and Contact
Strategies.
Internet Self-Response
The Census Bureau has set a goal of 55 percent of
U.S. households responding to the 2020 Census
via the Internet. High Internet response is critical
for cost savings and major efforts are underway to
minimize the amount of self-response via telephone, paper questionnaire, and in-person visits.
Internet response was not available in previous
decennial censuses and therefore represents a substantial innovation for the enterprise. The Census
Bureau recognizes that the Internet response option
is not feasible or acceptable to the entire population. Therefore, alternate modes will be provided
for respondents to complete their 2020 Census
such as the paper methods used in the past.
Planning and development activities to support
self-response have focused on two primary areas:
optimizing the respondent experience and maximizing data quality. Each is discussed below.
Ensuring a positive experience for users is one way
to facilitate high rates of Internet self-response.
The overall experience includes factors such as
usability, convenience, speed, and the general
“look and feel” of a Web site. To meet this broad
range of expectations, respondents will be offered
multiple avenues to complete their census online.
The questionnaire Web site will be optimized for
use on mobile devices. This provides a higher level of convenience and ensures the broadest possible access for those without traditional Internet service.
Internet questionnaire screens must be easy
to complete and responses must be processed
quickly to eliminate wait time between screens.
Additionally, all systems developed to support
Internet Self-Response must have the capacity to
handle the anticipated response loads and provide
security protections for Title 13 data.
The option to respond online must be available to
those without personal Internet access. Through
the Census Bureau’s planned partnership and other
community-level efforts, free-standing or mobile
devices will be available for use by the public, and
assistance will be provided to those who cannot
complete the form themselves. Additional information on the Census Bureau’s Integrated Partnership and Communications Campaign is described in section 5.5.3.
Similarly, language needs must be addressed. The
Census questionnaire will be available for Internet
completion in English, Spanish and other languages
as determined by the prevalence of the need.
Additional information on the Language Services
program is described in section 5.4.4.

Internet Self-Response should also lead to improvement in overall data quality. The data collection
systems will include preprogrammed edit checks
to identify user error prior to submission. Real-time
or post hoc respondent validation checks are also
possible with Internet respondents.
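The preprogrammed edit checks described above can be illustrated with a small sketch. This is a hypothetical example, not the Census Bureau's actual instrument logic; the field names, rules, and messages are invented for illustration.

```python
# Hypothetical illustration of preprogrammed edit checks; the field
# names, rules, and messages are invented for this sketch and are not
# the Census Bureau's actual instrument logic.

def edit_check(response):
    """Return a list of edit messages to show the respondent before submission."""
    problems = []
    age = response.get("age")
    if age is None:
        problems.append("Age is missing.")          # completeness check
    elif not 0 <= age <= 130:
        problems.append("Age is out of range.")     # range check
    people = response.get("people", [])
    if response.get("household_size") != len(people):
        # consistency check across answers
        problems.append("Household count does not match the people listed.")
    return problems

# Flag an inconsistent response in real time, before it is submitted.
flags = edit_check({"age": 34, "household_size": 3, "people": [{"name": "A"}]})
```

In this style of "soft edit," the respondent sees the messages and can correct the answers before the questionnaire is submitted, which is the quality benefit the text attributes to Internet collection.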
To further improve data quality, assistance will be
available to respondents who are having difficulty
completing their 2020 Census online through
Census Questionnaire Assistance agents who will
facilitate successful submissions of questionnaires
and reduce the number of incoming telephone
calls for assistance. Additional guidance will be
available in static form on the Census Bureau or
2020 Census Web site, including step-by-step
guides and Frequently Asked Questions for completing the Census.
Contact Strategies
All attempts by the Census Bureau to make direct
contact with individual households are referred to
as “contact strategies.” These are complementary
but distinct from the community-level outreach
described under the Integrated Partnership and
Communications operation. Types of contact
strategies include invitation letters, postcards, and
questionnaires mailed to households; electronic
correspondence (both e-mail and text messages);
and telephone calls:
•• Mailings have traditionally been sent via the
USPS. The Census Bureau is also exploring
supplementing deliveries through additional
options, including express delivery or private
couriers, if they achieve rates of response high
enough to justify added costs and operational
complexity.
•• The Census Bureau is exploring different
options for individual-level contact, including
the use of e-mail and text messaging to cell
phone numbers.
•• The Census Bureau is exploring the use of
telephone contacts to landline or cell phone
numbers to encourage self-response.
Each type or mode of contact may be used for multiple purposes: advance notification of an upcoming contact, an invitation to participate in the 2020 Census, a reminder prompt to nonresponders, or a request to complete the questionnaire in an alternative mode.

Research conducted prior to the 2010 Census yielded distinct attitudinal segments, or messaging mindsets. That research continues to be refined through cluster analysis of mail return rates from the 2010 Census and the ACS, combined with demographic, housing, and economic variables, to understand and plan for response propensities. A primary objective of the 2020 Census is for a majority
of respondents to complete their Census questionnaire online. Communication of this objective to
individual households is the purpose of the Census
Bureau’s contact strategies. The Census Bureau is
looking to develop a contact approach that produces an “actionable” response on the part of the
respondent. For example, receipt of an e-mail with
a hyperlink to the Census Web site should lead
respondents to click on the link and complete the
questionnaire.
One approach termed “Internet push” has been
developed to encourage respondents to use the
Internet. Currently this model includes the mailing
of a letter inviting respondents to complete the
questionnaire online, two follow-up reminders via
mailed postcard, and if necessary, a mailed hardcopy questionnaire. All correspondence will contain
a telephone number for respondents to call to complete the questionnaire over the telephone.
This approach, however, may not be appropriate for all respondent types, and the Census Bureau is actively working to understand the optimal contact strategies for different segments of the population, exploring the effects of the timing, mode, and frequency of contacts on response. For instance,
some respondents may be less likely to react to
mailings, but will notice an e-mail invitation or text
message sent to a cell phone. Research is underway to understand whether these nontraditional
methods of contact are acceptable and produce the
intended results.
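The propensity research described above can be caricatured in a few lines: score an area's likelihood of self-response, then pick a contact strategy. The variables and coefficients below are invented for illustration; a real model would be fit to census test and ACS data.

```python
# Illustrative sketch of propensity-based tailoring: score an area's
# likelihood of self-response, then pick a contact strategy. The
# variables and coefficients are invented; a real model would be fit
# to census test and ACS data.

import math

def response_propensity(internet_access, prior_mail_return, renter_share):
    """Logistic score in (0, 1); higher means more likely to self-respond."""
    z = -0.5 + 1.2 * internet_access + 2.0 * prior_mail_return - 0.8 * renter_share
    return 1.0 / (1.0 + math.exp(-z))

def contact_strategy(propensity):
    """Low-propensity areas get a paper questionnaire alongside the invitation."""
    return "internet_push" if propensity >= 0.5 else "internet_choice_with_paper"

p = response_propensity(internet_access=0.9, prior_mail_return=0.8, renter_share=0.3)
```

The strategy names echo the plan's own terms: "Internet push" for most housing units, and "Internet Choice" (questionnaire plus Internet invitation) where access or usage argues for paper.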
Research Completed
The following research has been completed for this
operation:
•• ACS Internet Self-Response Research.
ºº Findings:
•• People living in areas with lower Internet usage and accessibility require paper and/or telephone questionnaire assistance.


•• Certain messaging strategies are more
effective in motivating self-response.
•• 2012 National Census Test tested contact strategy and Internet option.
ºº Findings:
•• Initial contact to invite participation,
followed by two reminder prompts as
needed, and subsequent mailing of a paper
questionnaire was a promising strategy
(Internet push).
•• Advance letter was not shown to improve
response rates.
•• Telephone assistance needed for respondents without Internet access.
•• 2014 Census Test tested “Notify Me” mailed invitation, contact strategies, and Internet option.
ºº Findings:
•• Neither e-mail nor automated voice
messages showed a significant impact on
response rates.
•• Low participation rate for “Notify Me” component, but high questionnaire completion
rate among those who preregistered.
•• The 2015 Optimizing Self-Response Test offered
an Internet response option, including real-time
non-ID processing, and again tested the “Notify
Me” option, along with advertising and partnerships support.
ºº Findings:
•• The total response rate was 47.5 percent,
and the Internet response rate was 33.0
percent.
•• An additional 35,249 Internet responses were received from housing units not selected in mail panels, as a result of advertising and promotional efforts.
•• “Notify Me” again had low participation.
•• A new postcard panel, designed to test how housing units not originally included in the sample would respond to an invitation after being exposed to advertising, generated a response rate of approximately 8 percent.
•• Small-scale opt-in e-mail testing experimented
with e-mail messaging, including subject lines,
timing of delivery, and look and feel.

ºº Findings:
•• A text-based e-mail outperformed graphical e-mails.
•• Short e-mail subject lines that include the
“10-minute” burden and the “U.S. Census
Bureau” name seem to perform better than
other subject lines, especially those including the word “Help” as the first word in the
subject line.
•• Longer e-mail content with a “Dear Resident” greeting and the Director’s signature outperformed a shorter e-mail invitation without the greeting and signature.
•• Response rates did not differ by link type
(whether the full Uniform Resource Locator
(URL) or “Click here”) with this population.
•• The time of day the e-mail is sent did
not appear to have a big impact on the
response rate.
•• Respondents prefer a mailed invitation,
including a link to respond over all other
options.
Decisions Made
The following decisions have been made for this
operation:
Internet Self-Response:
99 An Internet self-response option will be provided
for the 2020 Census.
99 Invitation letters and mailed materials will
encourage people to respond using a unique
Census identifier; however, the 2020 Census
will allow people to respond without a unique
Census ID.
99 The Census Bureau will offer Internet questionnaires in a small number of languages other
than English and Spanish, including those
requiring non-Roman alphabets. The languages
selected will be based on national prevalence
rates of low-English proficiency households and
the available technology.
Contact Strategy:
99 An advance letter will not be used; the first letter will be an Internet push letter inviting a response to the Census from most housing units. We will provide a paper questionnaire (including bilingual forms) for populations where Internet access and usage prompts us to
offer Internet Choice (questionnaire and Internet
invitation) and for whom language assistance
optimizes self-response.
99 The 2020 Census will offer alternative response
options to respondents without Internet access.
99 Messaging will be coordinated with the
Integrated Partnership and Communications
Campaign.

99 A formal “Notify Me” option will not be offered.
99 Respondents will receive direct contacts inviting their participation in the Census. Contacts may include some or all of the following: postcard mailings, letter mailings, e-mails, text messages, prerecorded telephone messages, questionnaire mailings, and in-person visits by an enumerator.
Other Self-Response:
99 Text messaging will not be used as a data collection mode.
99 Housing units from whom an Internet questionnaire is not received will be mailed a paper questionnaire.
Design Issues to Be Resolved
Additional work is required to make decisions on the following questions:
Internet Self-Response:
Will the Census Bureau provide a mobile application for Internet Self-Response?
•• Approach: Based on technical research and cost and benefit analysis.
•• Decision by: January 2016
In what languages will Internet self-response be available?
•• Approach: Determined in conjunction with Language Services using ACS data and input from advisory committees, taking into consideration Census Enterprise Data Collection and Processing capabilities.
•• Decision by: September 2017
What type of Internet form design will facilitate high-quality self-response data collection in GQ?
•• Approach: Researched in the 2016 and 2017 Census Tests.
•• Decision by: October 2017
Contact Strategy:
What is the optimal combination of individual (e.g., housing unit) level contact strategies used in the 2020 Census, and how will these be tailored based on demographic and geographic areas?
•• Approach: Researched in the 2014, 2015, and 2016 Census Tests.
•• Decision by: October 2016
How can USPS barcode technology be used to optimize respondent access to the Internet in mail materials?
•• Approach: USPS/Census Bureau Interagency Working Group 2015–2017.
•• Decision by: October 2017
What are the benefits and risks associated with using the Census contact frame to reach respondents via e-mail and text messages?
•• Approach: Research as part of the 2016 and 2017 Census Tests and coordinate with the 2020 Census Integrated Partnership and Communications design.
•• Decision by: October 2017
Other Self-Response:
What are the response rate projections for all
self-response modes?
•• Approach: 2015 Census Test, 2015 National
Content Test, 2016 Census Test, and external
demand model projection for Internet use.
•• Decision by: October 2017
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:
The investment in Internet Self-Response will
reduce the cost of the 2020 Census as compared
with the 2010 Census through:
ÐÐ Reduced amount of self-response via paper
questionnaire and the infrastructure for paper
data capture.
ÐÐ Increased self-response, which will decrease the
Nonresponse Followup workload, thereby reducing field costs.


In addition:
ÏÏ Internet Self-Response is expected to increase the workload for Census Questionnaire Assistance.
Quality impacts of this operation on overall 2020 Census quality include the following:
ÏÏ Increase in overall self-response rates.
ÏÏ Real-time edits to respondent data.
ÏÏ More complete self-response for large households.
ÏÏ Potential increase in self-response from traditionally hard-to-count populations.
Risks
Data collection for the 2020 Census will include Internet data submission from respondents for the first time on a large scale. IF the business rules, requirements, and assumptions for the data collection instrument, including usability of the user interface, are not correctly defined, developed, and tested, THEN there could be a failure in our ability to successfully conduct cost-effective self-response enumeration in the 2020 Census.
Milestones
Date	Activity
January 2016	Decide on the use of mobile applications as a self-response mode.
March 2016	Begin the 2016 Census Test.
September 2016	Release the Internet Self-Response Detailed Operational Plan.
March 2017	Develop the strategy to optimize self-response for those living in group quarters.
March 2017	Begin the 2017 Census Test.
March 2020	Begin 2020 Census Internet Self-Response data collection.
September 2020	End 2020 Census Internet Self-Response data collection.

5.5.5	Non-ID Processing
Detailed Planning Status: Underway
Purpose
The Non-ID Processing operation is focused on making it easy for people to respond anytime, anywhere to increase self-response rates. The operation accomplishes this by:
•• Providing response options that do not require a unique Census ID.
•• Maximizing real-time matching of non-ID respondent addresses to the Census MAF.
•• Accurately assigning nonmatching addresses to census blocks.
•• Conducting validation of all non-ID responses.
Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• The automated and manual non-ID processes should be planned and developed in parallel, rather than sequentially, as was done when preparing for the 2010 Census Non-ID Processing operation.
•• Involve NPC throughout the life cycle of the 2020 Census Non-ID Process.
•• The delivery of addresses from non-ID processing that require independent verification should occur on a flow basis during self-response and Nonresponse Followup rather than at the end of these operations.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Public can respond anytime, anywhere without a unique Census ID.
•• Mechanism to increase self-response from traditionally hard-to-count populations.
•• Real-time matching and geocoding of responses.
•• Use of administrative records and third-party data to validate non-ID responses.
•• Use of administrative records and third-party data to validate and augment respondent-provided address data.
Description of Operation
During the self-response phase, the Non-ID
Operation will allow respondents to complete a
questionnaire without a Census identification code
(non-ID). By collecting the address from the respondent and then matching it real-time to the MAF/
TIGER System, the Census Bureau will attempt to
get the ID and confirm the geographic information with the respondent. The address collection interface facilitates obtaining complete and accurate data from a non-ID response.
Key capabilities of non-ID are:
•• Address standardization and a feedback loop with the respondent to confirm the address data they provide.
•• Automated address matching during the response.
•• Automated address geocoding during the response.
•• Respondent address geocoding in real time via a map interface.
•• Response validation, both during the response and via back-end processing.
•• For non-ID cases not matched in real time, use of administrative records and third-party data to confirm or supplement respondent-provided address data, followed by an additional address matching attempt.
•• Manual matching and geocoding when automated Non-ID Processing has not determined an acceptable match or geocode.
Research Completed
The following research has been completed for this operation:
•• 2013 National Census Contact Test:
ºº Findings: The use of administrative records and third-party data was effective in enhancing non-ID addresses to allow for a match to the MAF/TIGER System.
•• 2014 Census Test on Non-ID Processing:
ºº Findings:
•• The address collection interface in the Internet instrument yielded a much greater proportion of higher-quality address data from non-ID responses than in 2010.
•• Use of administrative records and third-party data matching improved the overall address matching rate.
•• There was no significant benefit to applying the administrative record matching process to all non-ID responses. Therefore, the use of administrative records and third-party data matching should follow an initial matching attempt using the MAF/TIGER System.
Decisions Made
The following decisions have been made for this operation:
99 The 2020 Census will offer a non-ID option for self-response and telephone agent-assisted response.
99 The 2020 Census Internet self-response instrument and the Census Questionnaire Assistance interviewer instrument will utilize capabilities and requirements for the address collection interface as specified for non-ID responses, as used in the 2014 and 2015 Census Tests.
99 The non-ID work flow will include real-time matching and geocoding; post-real-time processing that will utilize administrative records and third-party data; and manual (interactive) matching and geocoding.
Design Issues to Be Resolved
Additional work is required to make decisions on the following questions:
How can non-ID respondents help confirm the location of their living quarters?
•• Approach: Informed by the 2015 Optimizing Self-Response Test, the 2016 Census Test, and Carnegie Mellon research.
•• Decision by: September 2016 (Initial recommendations; evaluation will continue through
2018 testing)
What methodology will be used to conduct non-ID
response validation?
•• Approach: Currently researching a solution that
utilizes commercial and federal data sources;
the Census Bureau will test alternate methods in
the 2016 and 2017 Census Tests to determine
methods to be used in the 2018 Census End-toEnd Test.
•• Decision by: September 2016 (Initial recommendations; evaluation will continue through
2018 testing)
How will administrative records and third-party
data be used to improve matching in Non-ID
Processing?


•• Approach: Continue to refine methods in the
2016 and 2017 Census Tests in preparation for
the 2018 Census End-to-End Test.
•• Decision by: September 2017
At what proportion did office resolution confirm
the existence and location of nonmatching
addresses?
•• Approach: Currently conducting office-based
address verification for eligible records from
the 2014 and 2015 Census Tests. The Census
Bureau will continue to test methods in the 2016
and 2017 Census Tests to determine specific
methods to be used in the 2018 Census
End-to-End Test.
•• Decision by: September 2017
If the proportion of non-ID responses increases in
the 2020 Census, can the Census Bureau
accommodate the corresponding increase in
workload for downstream operations such as
manual matching and geocoding or address
verification (office and field-based)?
•• Approach: Contributing to workload modeling
efforts for upcoming tests, as well as for the
2020 Census. Initial model available September
2015, but to be revisited each year following the
2016 and 2017 Census Tests, as well as after
the 2018 End-to-End Test.
•• Decision by: September 2018
What is the expected scale of the 2020 Census
Non-ID workload?
•• Approach: Contributing to workload modeling
efforts for upcoming tests, as well as for the
2020 Census. Initial model available September
2015, but to be revisited each year following the
2016 and 2017 Census Tests, as well as after
the 2018 End-to-End Test.
•• Decision by: September 2018
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following. The investment
in Non-ID Processing will reduce the cost of the
2020 Census as compared with the 2010 Census
through:
⇩ Increased self-response rates.
⇩ Improved coverage through self-response.
Quality impacts of this operation on overall 2020
Census quality include the following:
⇧ May increase self-response from traditionally
hard-to-count populations.
⇧ May increase overall self-response rates, which
can contribute to higher quality for the overall
census.
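The workload questions above come down to a capacity calculation: how many non-ID responses fall through real-time matching into the manual matching/geocoding queue. A minimal sketch of that calculation, using purely illustrative figures rather than official Census Bureau planning numbers:

```python
def manual_matching_workload(total_responses, non_id_share, realtime_match_rate):
    """Estimate how many non-ID responses fall through to manual
    matching/geocoding when real-time matching cannot resolve them.

    All inputs are illustrative assumptions, not official figures.
    """
    non_id_cases = total_responses * non_id_share
    return non_id_cases * (1 - realtime_match_rate)

# Illustrative scenario: 100M self-responses, 10% arrive without a
# Census ID, and real-time matching resolves 90% of those.
workload = manual_matching_workload(100_000_000, 0.10, 0.90)
print(f"{workload:,.0f} cases require manual matching")
# 1,000,000 cases require manual matching
```

A model of this shape makes the sensitivity visible: each percentage point the real-time match rate drops adds a fixed slice of the non-ID volume to the downstream manual workload.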
Risks
The primary reason for conducting real-time Non-ID
Processing is to provide respondents the opportunity during the response to resolve non-ID cases
that are not matched and/or not geocoded. Any
non-ID case that is successfully matched to a valid
record in the census address inventory and is geocoded can be considered a complete response. In
other words, it would not be necessary to manually
match/geocode the respondent address or to send
an enumerator to the housing unit if the non-ID
case can be fully resolved during the response. IF
the IT infrastructure is not adequately scaled to
support real-time Non-ID Processing, THEN fewer
addresses from non-ID responses will be matched
in real time, negatively affecting the speed at which
cases are removed from the manual processing
workload or NRFU workload.
The option of submitting a non-ID response via
the Internet instrument could potentially lead
to an increase in fraudulent responses. A final
solution that will implement identity validation
during self-response has not been determined. IF
the 2020 Census program is unable to determine
prior to the 2020 Census an acceptable means to
confirm that the identity of a respondent without
a unique Census ID is valid during the Internet
self-response, THEN the non-ID Internet self-­
response option will not be made available to the
large segment of the population it is anticipated
would choose to use it.
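No validation method has been selected, but whatever mechanism is chosen will likely need to flag anomalies such as an implausible number of non-ID submissions resolving to a single address. A purely illustrative sketch of that kind of screen (the threshold and field name are invented, not policy):

```python
from collections import Counter

def flag_suspicious(responses, max_per_address=3):
    """Flag address IDs receiving more non-ID submissions than a
    plausibility threshold. Threshold is an invented example."""
    counts = Counter(r["matched_address_id"] for r in responses)
    return [addr for addr, n in counts.items() if n > max_per_address]

responses = [{"matched_address_id": "A1"}] * 5 + [{"matched_address_id": "A2"}]
print(flag_suspicious(responses))  # ['A1']
```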

U.S. Census Bureau

Milestones
Date            Activity
April 2015      Deliver real-time address matching and geocoding for the 2015 OSR Test.
April 2016      Deliver real-time processing in the cloud, manual matching and geocoding at the NPC, and utilize multiple respondent validation methods for the 2016 Census Test.
September 2016  Release the Non-ID Processing Detailed Operational Plan.
April 2017      Deliver all components for the 2017 Census Test, and include functionality for Puerto Rico addresses.
April 2018      Conduct the 2018 Census End-to-End Test.
April–July 2020 Conduct the 2020 Census Non-ID Processing.
August 2021     Complete the 2020 Census Non-ID Assessment Report.

5.5.6	 Update Enumerate
Detailed Planning Status: Not Started

Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau’s preliminary thoughts as of the
release of this document.
Purpose
The Update Enumerate operation updates the
address and feature data and enumerates the
following:
•• Areas that do not have city-style addresses.
•• Areas that do not receive mail through city-style
addresses.
•• Areas that receive mail at post office boxes.
•• Areas with city-style addresses where mail is delivered to another drop point.
•• Areas affected by natural disasters.
•• Areas with high concentrations of seasonally
vacant housing.
•• Some American Indian Reservations.
•• Settlements along the Mexican-American border
(Colonias).
•• Other areas with unique challenges associated
with accessibility.
•• Communities with populations ranging from
several hundred people to just a few.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations
were made:
•• Determine ways to closely track the fieldwork
during the Update and Leave field operation in
order to monitor any falsification or procedural
issues that may arise during production.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Combine enumeration methodologies from the
2010 Update Leave, Remote Update/Enumerate,
and UE Operations.
•• Single attempt to enumerate with push to
Internet Self-Response (a notice of visit form
is left when no one is home, which invites a
respondent to go online with an ID to complete the 2020 Census or call the Census
Questionnaire Assistance Center).
•• Use of single device for both listing and
enumeration.
•• Use of reengineered field management structure
and approach to managing fieldwork, including new field office structure and new staff
positions.
Description of Operation
The UE operation combines three operations
from the 2010 Census: Update/Leave, Update/
Enumerate, and Remote Update/Enumerate. As
noted above, detailed planning for the UE operation has not yet started; however, the current
plans for this operation (which will be tested in
the 2017 Census Test) are that the UE fieldworker
will update the address list and map and attempt
to conduct an interview for each housing unit. If
no one is home, the fieldworker will leave a notice
of visit form inviting a respondent for each household to go online with an ID to complete the 2020
Census Questionnaire. The design does not currently include a return personal visit or telephone
call back; however, this will be tested in the 2017
Census Test. The expectation is that nonresponding units become part of the NRFU workload.
The UE operation will take full advantage of all of
the innovations associated with the reengineered
field operations, including use of a handheld device
to collect the data, automated training, automated

administrative processes, the operational control
system, and streamlined staffing structures.
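The single-visit flow described above (interview if someone answers, otherwise leave a notice of visit and push to self-response, with nonresponding units falling to NRFU) can be sketched as a simple case-status resolution. The status names and function are illustrative, not the operational control system's actual vocabulary:

```python
from enum import Enum

class CaseStatus(Enum):
    ENUMERATED = "enumerated"  # interview or later self-response completed
    NRFU = "nrfu"              # nonresponding; joins the NRFU workload

def resolve_ue_case(someone_home: bool, self_responded_later: bool) -> CaseStatus:
    """One visit, push to Internet self-response, no return visit or
    telephone callback (per the current design)."""
    if someone_home or self_responded_later:
        return CaseStatus.ENUMERATED
    return CaseStatus.NRFU

print(resolve_ue_case(someone_home=False, self_responded_later=False))
# CaseStatus.NRFU
```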

The 2020 Census UE operation includes a quality
assurance component. The details of this component
have yet to be defined but could include a
combination of methodologies, such as the use of
alerts, paradata, administrative records and
third-party data, and fieldwork.
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed.
Assumptions Made
Based on planning of other operations, the
following assumptions have been made:
•• UE consists of production and quality assurance
components.
•• UE utilizes a reengineered field management
structure.
•• UE utilizes integrated automated listing and
enumeration tools and systems to facilitate data
collection.
•• UE collects coordinates (latitude and longitude)
for each structure with a living quarter.
•• UE utilizes automated systems and logistics to
monitor cost and progress.
•• No In-Field Address Canvassing is performed
for UE areas.
•• The notice of visit form will provide both the
2020 Census URL and the phone number for CQA.
•• UE will employ real-time or near-real-time data
processing.
•• There will be validation of vacant living quarters
during UE.
Design Issues to Be Resolved
In addition to validating the assumptions above, the
following decisions need to be made to design this
operation, test it in the 2017 Census Test, and refine
the design in the 2018 Census End-to-End Test:
What automated instruments do the enumerators
need to access if group quarters are enumerated
during UE?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: December 2015
What automated instruments do the enumerators
need to access if transitory units are enumerated
during UE?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: December 2015
How are Census IDs from the address list associated
with or linked to the notice of visit forms?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: December 2015
How are Census IDs generated or assigned to newly
identified units not found on the address list?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: December 2015
Are there any geographic areas where a paper
questionnaire should be left in lieu of the notice of
visit form (e.g., Puerto Rico)?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: March 2016
Will UE contact living quarters through mail and
other contact strategies?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: March 2016
What is the content on the notice of visit form?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: June 2016
What actions are taken on the address list at the
time of update (i.e., moves across blocks or into a
different Type of Enumeration Area)?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: June 2016


Does the UE operation enumerate group quarters
or are they provided to a different 2020 Census
operation for enumeration?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: June 2016
Does the UE operation enumerate transitory units
found at transitory locations or are they provided to a different 2020 Census operation for
enumeration?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: June 2016
Will enumerators leave an invitation at known UE
addresses asking the household to update their
address online?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: September 2017
At what time of day is the operation actually
performed (i.e., during business hours or when most
people are likely to be home)?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: October 2017
Can administrative records and third-party data be
used to validate units in Quality Control?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: October 2017
What is the cost/benefit to only visiting the living
quarter once?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: October 2017
Is there a benefit of doing a phone call in UE versus
NRFU doing the follow-up?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: October 2017
How will Remote Alaska be handled?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: December 2017
Cost and Quality
Investment in UE will have minimal impact on the
cost and quality of the 2020 Census as compared
with the 2010 Census.
Risks
By this point in the decade, planning for all major
2020 Census operations should be underway.
Budget reductions in FY 2013 through FY 2015
delayed planning for this operation. IF planning
efforts are not initiated at the start of FY 2016,
THEN there may not be sufficient time to implement
innovations related to this operation.
Milestones
Date            Activity
October 2015    Begin detailed planning for Update Enumerate.
March 2017      Begin UE for the 2017 Census Test.
September 2017  Release the UE Detailed Operational Plan.
March 2018      Begin UE for the 2018 Census End-to-End Test.
January 2020    Begin UE for the 2020 Census in Remote Alaska.
March 2020      Begin UE for the 2020 Census.
July 2020       End UE for the 2020 Census.

5.5.7	 Group Quarters
Detailed Planning Status: Recently Begun
Purpose
The Group Quarters operation enumerates people
living or staying in group quarters, people
experiencing homelessness, and people receiving
service at service-based locations.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations
were made:
•• Integrate GQ frame validation and enumeration
data collection methodologies.

•• Research and test automation to collect GQ
data to reduce data capture and processing
time, which incorporates tracking and linkage
capabilities (eliminates manual transcription of
administrative records and third-party data onto
paper instrument).
•• Explore ways to reduce the number of visits on
military installations. (Research and test the enumeration of military personnel through the use
of administrative records and third-party data.)
•• Maintain consistent answer categories
regarding the question on having a usual
home elsewhere on all census data collection
instruments, the Individual Census Report, and
Shipboard Census Report.
•• Conduct outreach to professional organizations
such as education, health care, and tribal organizations as part of the 2020 Census GQ planning.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Use of an integrated approach including
administrative records and third-party data and
Address Canvassing (In-Field and In-Office) to
improve the GQ frame.
•• Use of multiple modes of enumeration that
include: electronic exchange of group quarters and client-level administrative records and
third-party data; Internet self-response; and
automated field listing and enumeration.
•• Integration of Group Quarters Validation and
enumeration in all field operations that allow for
accurate classification of living quarters.
•• Staff will be trained in multiple operations for
increased efficiency.
•• Use of both in-office and in-field methods for
enumeration.
Description of Operation
Before group quarters can be enumerated, the
Census Bureau must validate the GQ frame. This
validation activity is part of the 2020 Census
Address Canvassing operation.
The 2020 Census GQ operation consists of two
components:
•• Group Quarters Advance Contact (known
as Group Quarters Advance Visit in the
2010 Census): For the 2020 Census, this will

primarily be an in-office function (although some
in-field work may be required in limited areas),
which includes:
ºº Verifying the group quarters' name, address
information, contact name, and phone number,
and obtaining an agreed-upon date and time
to conduct the enumeration.
ºº Collecting an expected Census Day population
count, and addressing concerns related to
privacy, confidentiality, and security.
ºº Inquiring about whether the group quarters
has an administrative record or third-party
data file that can be transmitted electronically
to the Census Bureau.
•• Group Quarters Enumeration: This includes
enumeration of all group quarters through
in-field visits or via administrative records and
third-party data.
The residence rules for the 2020 Census will determine what is considered a group quarters. The
Federal Register Notice has been published and the
Census Bureau is reviewing comments. Final residence rules will be determined in late 2017.
Pending a final determination on residence rules,
the following types of enumeration will be included
in this operation:
•• General Group Quarters Enumeration:
Enumeration of people living in group living
arrangements that are owned or managed by
an entity or organization providing housing or
services for the residents (e.g., college residence
halls, residential treatment centers, skilled
nursing facilities, group homes, correctional
facilities, workers’ dormitories, and domestic
violence shelters).
•• Service-Based Enumeration: Enumeration of
people experiencing homelessness or utilizing
transitional shelters, soup kitchens, regularly
scheduled mobile food vans, and targeted nonsheltered outdoor locations.
•• Military Group Quarters Enumeration:
Enumeration of people living in GQs on military
installations, defined as a fenced, secured area
used for military purposes.
•• Military and Maritime Vessel (Shipboard)
Enumeration: Enumeration of people residing
on U.S. military ships or on U.S. maritime vessels
in operation at the time of the 2020 Census.

Research Completed
•• Issued Federal Register Notice on May 20, 2015,
requesting public comment on the 2020 Census
residence rule and residence situations. Expect
to publish the final 2020 Census residence rule
and residence situations in late 2017.
•• Ongoing partnership with the Department of
Defense’s Defense Manpower Data Center to
discuss 2020 Census goals and objectives for
enumerating personnel living on stateside military installations.
ºº Findings:
•• The Census Bureau received a sample of
administrative records from one military
installation.
•• Defense Manpower Data Center identified
military installations for administrative
record testing.
Design Decisions
The following decisions have been made for this
operation:
✓ The GQ frame development and validation will
be integrated with the Address Canvassing
operation.
✓ The GQ operation will allow an individual to
self-respond and self-identify the group quarters
type for the facility in which he or she resides.
✓ An electronic data exchange of group quarters
and client-level administrative records or
third-party data will be part of the GQ
methodology.
✓ The Census Bureau will design a standardized
system that will accept electronically transmitted
administrative records or third-party data in
multiple formats.
✓ During field enumeration operations, newly
identified group quarters will be validated and
enumerated using a combination of in-office and
in-field methodologies.
✓ Current goals for various types of group quarters
include the following:
ºº Enumerate 75 to 80 percent of people
residing in group quarters through in-office
methodologies (i.e., electronic transfer of
administrative records or third-party data and
Internet self-response) and the remainder in
the field.
ºº Enumerate military group quarters using
administrative records and third-party data.
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
What varying computing capabilities and multiple
formats for administrative records and third-party
data can be integrated into a standardized Census
Bureau system for processing?
•• Approach: Researched in 2016.
•• Decision by: June 2016
What is the optimal linkage methodology to ensure
self-response data are linked to the correct group
quarters?
•• Approach: Researched in the 2016 Census Test.
•• Decision by: October 2017
How will varying administrative records or
third-party data formats be processed?
•• Approach: Conduct a survey to determine
which group quarters will participate in the
automatic transfer of administrative records and
third-party data and what type of data or
systems they have. Build the appropriate data
transfer systems to test in the 2017 Census Test.
•• Decision by: December 2017
How much in-field Group Quarters Enumeration will
be required?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: December 2017
How will quality assurance be handled?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: December 2017
How will field reengineering concepts be used for
integrating group quarters with multiple housing
unit enumeration operations (e.g., NRFU and UE)?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: December 2017
What administrative records and third-party data
files exist for service-based locations, such as soup
kitchens and regularly scheduled mobile food vans?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: December 2017

What is the impact on quality and productivity of
field staff if they are required to conduct multiple
operations?
•• Approach: Researched in the 2017 Census Test.
•• Decision by: December 2017
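The standardized-system decision above implies normalizing whatever format each facility transmits into one internal schema. A minimal sketch of mapping two hypothetical incoming formats (CSV and JSON, with invented field names) onto a common person record:

```python
import csv
import io
import json

def to_person(name, dob, gq_id):
    """Hypothetical common schema for a GQ person record."""
    return {"name": name, "dob": dob, "gq_id": gq_id}

def from_csv(text, gq_id):
    """Facility sends CSV with 'resident_name' and 'birth_date' columns."""
    return [to_person(r["resident_name"], r["birth_date"], gq_id)
            for r in csv.DictReader(io.StringIO(text))]

def from_json(text, gq_id):
    """Facility sends a JSON list with 'fullName' and 'dateOfBirth' keys."""
    return [to_person(p["fullName"], p["dateOfBirth"], gq_id)
            for p in json.loads(text)]

csv_feed = "resident_name,birth_date\nJane Doe,1950-01-01\n"
json_feed = '[{"fullName": "John Roe", "dateOfBirth": "1945-05-05"}]'
records = from_csv(csv_feed, "GQ123") + from_json(json_feed, "GQ123")
print(records[0]["name"], records[1]["name"])  # Jane Doe John Roe
```

The point of the pattern is that downstream processing sees only the common schema, so each new facility format costs one adapter function rather than changes throughout the pipeline.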
Cost and Quality
The investment in GQ will have minimal impact
on cost of the 2020 Census as compared with the
2010 Census.
Quality impacts of this operation on overall 2020
Census quality include the following:
⇧ Electronic transfer of administrative records and
third-party data reduces transcription errors.
⇧ Administrative records and third-party data
may provide more comprehensive demographic
information.
⇩ Administrative records and third-party data may
provide less current data than data received
through Internet Self-Response or in-field
enumeration.
Risks
Converting the Group Quarters Enumeration
Questionnaire from paper to an automated version
is a resource-intensive process that requires a great
deal of programming and testing. IF the content
of the GQ paper questionnaire is not successfully
replicated on the enterprise data collection device,
THEN the GQ field operations will have to be performed entirely using a paper form.
The enterprise data collection device for listing and
enumerating housing units should also be capable
of listing and enumerating group quarters. IF housing unit and group quarters functionality is not
integrated on the enterprise data collection device,
THEN field staff may require more than one visit to
certain group quarters.
The person record of the group quarters must be
linked to the address of the group quarters. IF a
link between each person record and the group
quarters at the same address cannot be achieved,
THEN the count of people residing at each group
quarters would not be accurate.
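The linkage risk above, that every person record must tie back to the group quarters where it was collected, can be illustrated as a keyed join plus a completeness check. The IDs and record structures are hypothetical:

```python
# Hypothetical GQ address frame and collected person records.
gq_frame = {"GQ001": "100 College Hall", "GQ002": "200 Barracks Rd"}
person_records = [
    {"person_id": "P1", "gq_id": "GQ001"},
    {"person_id": "P2", "gq_id": "GQ001"},
    {"person_id": "P3", "gq_id": "GQ999"},  # broken link: unknown GQ
]

def link_counts(frame, persons):
    """Count persons per group quarters and collect records whose
    gq_id has no match; unlinked records would distort the count
    of people residing at each GQ."""
    counts = {gq: 0 for gq in frame}
    unlinked = []
    for p in persons:
        if p["gq_id"] in counts:
            counts[p["gq_id"]] += 1
        else:
            unlinked.append(p["person_id"])
    return counts, unlinked

counts, unlinked = link_counts(gq_frame, person_records)
print(counts, unlinked)  # {'GQ001': 2, 'GQ002': 0} ['P3']
```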

Milestones
Date            Activity
December 2015   Conduct Electronic Transfer Capability Survey—Stateside.
December 2015   Conduct Electronic Transfer Capability Survey—Puerto Rico.
February 2017   Conduct the 2017 Census Test (Conduct GQ Advance Contact).
March 2017      Conduct the 2017 Census Test (Conduct Service-Based Enumeration).
April 2017      Conduct the 2017 Census Test (Conduct Group Quarters Enumeration).
September 2017  Release the GQ Detailed Operational Plan.
April 2018      Conduct the 2018 Census End-to-End Test.
February 2020   Conduct GQ Advance Contact.
March 2020      Conduct Service-Based Enumeration.
April 2020      Conduct Group Quarters Enumeration.

5.5.8	 Enumeration at Transitory
Locations
Detailed Planning Status: Not Started

Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau’s preliminary thoughts as of the
release of this document.
Purpose
The Enumeration at Transitory Locations operation enumerates individuals in occupied units at
transitory locations who do not have a usual home
elsewhere. Transitory locations include recreational
vehicle parks, campgrounds, tent cities, racetracks,
circuses, carnivals, marinas, hotels, and motels.
Lessons Learned
Based on lessons learned from the 2010 Census,
the following recommendations were made:
•• Automate the questionnaire and all related
sources of paradata used to record contact
details during an interview.
•• Learn more about the living situations of people
counted in the ETL operation.
•• Clearly define and identify transitory locations,
as well as procedures on how to list transitory


units appropriately in operations that feed the
ETL universe.
•• Conduct intercensal testing of the ETL
population.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Use of reengineered field management structure,
staff positions, and approach to managing
fieldwork.
•• Use of automation and technology for data
collection.
Description of Operation
The operational description provided below is
based primarily on the operational design of the
2010 Census ETL Program. The goal of the ETL
program is the enumeration of individuals in
occupied units at transitory locations who do not
have a usual home elsewhere.
The ETL operation will:
•• Use automation to facilitate data collection and
streamline operations.
•• Use reengineered staffing and management of
the field operation.
•• Use in-person enumeration as the primary mode
of data collection.
•• Have quality assurance infused throughout
workload management and data collection.
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed.
Assumptions Made
Based on planning of other operations, the
following assumptions have been made:
•• Establish the 2020 Census ETL Integrated
Project Teams in FY 2016.
•• The 2020 Census ETL operation will include a
Quality Assurance function.
•• The 2020 Census ETL operation will utilize
automated tools and systems to facilitate the
enumeration of transitory locations.
•• The 2020 Census ETL operation will leverage
the approaches to field office structure and
management of field assignments resulting from
the Field Reengineering efforts.
•• The 2020 Census ETL operation will use
adaptive design (routing and dynamic case
management) to allocate resources efficiently.
Although no specific decisions for the design of
the 2020 Census ETL Program have been made, the
operational design of the ETL operation is
dependent on understanding the operational design
and timing for other operations, such as Address
Canvassing, LUCA, and Field Infrastructure (e.g.,
the number of field offices, staffing structures).
Design Issues to Be Resolved
The following decisions need to be made for this
operation:
What are the objectives and scope of the 2020
Census ETL Program?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: September 2017
What does success for the 2020 Census ETL
Program look like and how is it measured?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: September 2017
Given other aspects of the 2020 Census design,
what is the operational timing for the 2020 Census
ETL Program?
•• Approach: Coordination and integration with
other relevant operations.
•• Decision by: September 2017
What will the quality assurance approach for the
Enumeration at Transitory Locations Program
involve (in-field, use of paradata, etc.)?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: September 2017
What is the impact of self-response via the Internet
and Non-ID Processing on ETL?
•• Approach: Researched during the 2017 Census
Test.
•• Decision by: September 2017

U.S. Census Bureau

	

2020 Census Operational Plan—Version 1.1 111

Are there administrative records or third-party data
sources that could be used for the frame development by type?

ºº Tier 2: Provide real-time assistance over the
telephone or other electronic channels (Web
chat and e-mail) via CQA agents.

• Approach: Researched during the 2017 Census
Test.

•• Provide an option for respondents to complete a
Census interview over the telephone.

• Decision by: September 2017

Lessons Learned

Cost and Quality

Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:

Investments in the ETL Program will have minimal
impact on the cost and quality of the 2020 Census
as compared with the 2010 Census.
Risks
By this point in the decade, planning for all major
2020 Census operations should be underway.
Budget reductions in FY 2013 through FY 2015
delayed planning for this operation. IF planning
efforts are not initiated at the start of FY 2016,
THEN there may not be sufficient time to implement innovations related to this operation.

•• CQA operation requires very specialized contact
center personnel throughout the development
and operational cycles.
•• CQA operations needs to be synchronized with
the Integrated Partnership and Communications
Program.
•• Agent desktop applications need to have the
ability to easily update FAQ content so that all
relevant information is in one place.
Opportunities to Innovate

Milestones

Opportunities to innovate include the following:

Date

Activity

October 2015

Initiate the 2020 Census ETL Integrated
Product Team.

March 2017

Begin ETL for 2017 Census Test.

September
2017

Release the ETL Detailed Operational Plan.

March 2020

Begin 2020 Census ETL enumeration.

April 2020

Conclude 2020 Census ETL enumeration.

April 2021

Issue 2020 Census ETL operational
assessment.

5.5.9	 Census Questionnaire Assistance
Detailed Planning Status:

Underway

Purpose
The Census Questionnaire Assistance operation has
two primary functions:
•• Provide questionnaire assistance for respondents by answering questions about specific
items on the Census form or other frequently
asked questions about the Census;
ºº Tier 1: Provide telephone assistance via an
Interactive Voice Response (IVR).

112 2020 Census Operational Plan—Version 1.1	

•• Integration with the Internet questionnaire
development team to deliver assistance via Web
chat and e-mail.
•• Speech and text analytics to determine what is
trending in real-time across CQA.
Description of Operation
The main objectives of CQA are to assist Internet
and paper self-respondents by answering questions
coming from telephone, Web chat and e-mail. CQA
will provide support for:
•• A toll free telephone number for respondents
to call for help completing the 2020 Census
questionnaire.
•• IVR to resolve basic questions from respondents
calling on the telephone to limit the need for
additional agents.
•• Respondent questions on the Internet via realtime Web chat functionality.
•• Callers (inbound) to complete the 2020 Census
questionnaire over the telephone (with and without a unique Census ID).
•• IVR capability for the 2020 Census Jobs Line.

U.S. Census Bureau

•• Outbound telephone calls made by agents
to respondents for quality follow-up (under
review).
•• Outbound telephone calls made by agents to
respondents for NRFU quality assurance component (under review).
Scope and Timing of 2020 Census CQA includes:
•• Multichannel contact center with a central command functionality.
ºº Voice channel (telephone via IVR and agents).
ºº Nonvoice channels (Web chat and e-mail).
•• Staffing of contact center.
•• Training of contact center staff.
•• Assistance in multiple languages.

specify small business goals within the
Request for Proposal and allow the contact
center service providers and system integrators to determine how to best meet the
small business goals.
• Call Workload Modeling:
º Looked at call data from the 2010 Census,
the ACS, the 2014 Census Test, and the 2015
Optimizing Self-Response Census Test to
assist in forecasting workload for the 2020
Census.
• Findings: The mailing strategy of pushing
respondents to answer the Census on the
Internet has created an increase in assistance calls, specifically related to lack of
Internet access and technical issues.

•• Assistance for individuals with special needs
(visual or hearing impaired).

Decisions Made

•• Assistance for individuals in Puerto Rico.

The following decisions have been made for this
operation:

•• Assistance for individuals receiving experimental forms.
•• Utilization of an IVR system.
•• Integration with the Internet questionnaire
development team to deliver assistance.
•• Integration with the hiring and recruiting team
to determine contact center roles.
•• Determination of expected call volumes
(inbound and outbound), Web chat, and e-mail—
including timing of peak volumes and a rollover
plan for unanticipated volumes.
Research Completed
The following research has been completed for this
operation:
•• Market Research:
ºº Conducted vendor meetings to benchmark
contact center industry and identify best
practices.
ºº Released a Request for Information to identify
industry capabilities.
•• Findings: Most large contact center
providers have the capacity to provide
all services identified in the Request for
Information. Small businesses do not have
the facilities, staff, or experience to meet
the full range of services and size required
by CQA. However, the Census Bureau will
U.S. Census Bureau

	

9 CQA will use an acquisition with the Request for
Proposal release date of November 2015.
9 CQA will complete interviews by telephone.
9 CQA will provide respondent assistance relating
to specific items on the questionnaire.
9 CQA will handle calls relating to general questions on 2020 Census processes and frequently
asked questions.
9 CQA telephone number will be provided in
selected materials.
9 The contractor will be required to provide an
adaptive infrastructure (e.g., staffing levels
and communications capabilities) that can be
adjusted on demand as data collection occurs.
9 The contract will include options to provide
flexibility to support future operations and or
capabilities that have not yet been fully defined.
9 The 2020 Census CQA will utilize and integrate
nonvoice channels, such as Web chat, e-mail,
and texting to support in-bound questions.
9 The Request for Proposal will require the vendor
to develop the application that the agents use
to respond to calls, including the data collection
instrument to complete the questionnaire.
9 CQA will not mail paper questionnaires to people who call to request them, but they will refer
people to materials on the Web site or collect the
interview.
2020 Census Operational Plan—Version 1.1 113

✓ CQA agents will be available to provide assistance and complete 2020 Census questionnaires for all specified languages.
✓ CQA will assist individuals with special needs (visual- or hearing-impaired).
✓ CQA will not collect 2020 Census questionnaire information via text, e-mail text, or Web chat.
✓ CQA will not accept e-mails with PDF attachments, faxes, or Internet uploads of completed 2020 Census questionnaires. Respondents will be directed to mail their responses.
✓ CQA will not support centralized outbound calling for NRFU production cases. (The NRFU quality assurance component is still under consideration.)
✓ CQA will include the ability to offer respondents an option to check on the status of the questionnaire they submitted.
✓ CQA will handle calls about technical issues (e.g., Internet problems, lack of access to the Internet) by offering to complete the 2020 Census questionnaire instead of offering technical assistance to respondents.
✓ The CQA will offer a Web chat functionality to provide assistance to respondents while completing their questionnaire online.

Design Issues to Be Resolved

Additional work is required to make decisions on the following questions:

What are the specific service level agreements for the contractor?
•• Approach: Based on internal research, 2010 Census past experiences, and researching industry standards.
•• Decision by: November 2015
What are the assumptions for the language requirements that will be specified within the Request for Proposal?
•• Approach: Based on requirements provided by the Language Services operation.
•• Decision by: November 2015
Will the 2020 Census CQA utilize IVR as a data collection mode (full or partial) to complete questionnaire items?
•• Approach: Based on 2017 Census Test results.
•• Decision by: April 2016
Will CQA include a Quality Outbound Operation?
•• Approach: Based on decisions related to the 2017 Census Test.
•• Decision by: June 2016
What languages will be supported by the CQA?
•• Approach: Based on 2014–2017 Census Test results.
•• Decision by: June 2016 (initial decision for contract; revised if necessary based on the 2017 Census Test)
Will CQA handle centralized outbound calling for the NRFU quality assurance component?
•• Approach: Based on decisions related to the 2016 Census Test for the NRFU quality assurance component.
•• Decision by: September 2016
When and how will the CQA as a response mode be
communicated to the public?
•• Approach: Based on the Integrated Partnership
and Communications operation design.
•• Decision by: April 2017
What is the impact of the mailing strategy on CQA
workload?
•• Approach: Based on data from 2015 Census
Test, 2016 Census Test, and 2017 Census Test.
•• Decision by: November 2017
When do CQA operations start and end? By
component?
•• Approach: Based on requirements for field
operations and Internet Self-Response.
•• Decision by: January 2018
Will CQA take calls to support field enumerators
who are having language issues?
•• Approach: Based on cost/benefit analysis.
•• Decision by: January 2018
Cost and Quality

Cost impacts of this operation on overall 2020 Census costs include the following:

The investment in Census Questionnaire Assistance will reduce costs as compared with the 2010 Census through:

↓ Increased self-response rates.
↓ Decreased Nonresponse Followup workload, thereby reducing field costs.
↓ Reduced amounts of paper questionnaires, thereby reducing the infrastructure for paper data capture.

In addition:

↑ Internet Self-Response is expected to increase the workload for Census Questionnaire Assistance.

Quality impacts of this operation on overall 2020 Census quality include the following:

↑ Increase in overall self-response rates.
↑ Real-time edits to respondent data.

Milestones

Program milestone dates for 2020 Census CQA will be determined after contract award. For acquisition purposes, the major milestone dates are:

Date	Activity
November 2015	Release Request for Proposal for 2020 Census Questionnaire Assistance acquisition.
June 2016	Award contract for 2020 Census Questionnaire Assistance.
September 2016	Release the Census Questionnaire Assistance Detailed Operational Plan.
April 2017	Participate in 2017 Census Test (under review).
April 2018	Participate in 2018 Census End-to-End Test.
January–September 2020	Conduct CQA operations.
To Be Determined	Other CQA milestone dates will be determined after the contract has been awarded.
Risks
Adequate staffing is required in order to properly
manage the contract supporting the CQA operation. IF approval for funding of program management staff is not in place, THEN the contract may
not be managed properly due to the scope and
complexity of the project.
In order to participate in the 2017 Census Test,
the systems involved need to be approved by
security oversight and receive certification. The
contractors working on the CQA operation cannot be brought on board until the approval has
been given. IF approval and certification from
security oversight is not received or takes longer
than anticipated for multiple IT systems, THEN the
contractor may miss the opportunity to participate
in the 2017 Census Test.
The staff working on the CQA operation must
undergo a security background check before they
can be brought on board. IF the Census Bureau
is unable to process a large number of contact
center agents and support staff through security
background checks in a short time frame for CQA,
THEN the contractor may not be appropriately
staffed to handle the anticipated workload.

5.5.10	 Nonresponse Followup
Detailed Planning Status: Underway

Purpose
The NRFU Operation serves two purposes:
•• Determine housing unit status for nonresponding addresses.
•• Enumerate housing units for which a 2020
Census response was not received.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Traditional enumeration and management of workload, as implemented in the 2010 Census, is no longer viable in an era of an ever-evolving, demographically, culturally, and technologically diverse nation.
•• Reduce the maximum number of NRFU contact
attempts.


•• Include the use of a handheld enumeration
device that would have the ability to track when
an enumerator opens a case.
•• Explore additional sources and criteria for
inferring occupancy status and population size
of housing units from administrative records or
third-party data.
•• Avoid having to add late-planned operations and
procedures.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Use of administrative records and third-party
data to remove vacant housing units from the
NRFU workload.
•• Use of administrative records and third-party
data to remove occupied housing units from the
NRFU workload.
•• Use of a reengineered field management structure and approach to managing fieldwork.
•• Use of a variable contact strategy and stopping
rules to control the number of attempts made
for each address (based on paradata).
•• Assignment and route optimization.
•• Automated training for field staff.
•• Automation of the field data collection.
•• Automation of administrative functions, such as
recruiting, onboarding, and payroll.
•• Implementation of alternatives to providing government-furnished equipment, such as BYOD or
Device as a Service.
•• Reengineered quality assurance approach.
Description of Operation

For the 2020 Census, the NRFU operation will be dramatically different from the NRFU operation conducted in the 2010 Census. The Census Bureau will implement a NRFU operational design that utilizes a combination of the following:

•• Administrative records and third-party data usage to reduce the workload.
•• Reengineering of staffing and management of field operations.
•• Use of adaptive design methodologies.
•• Automation to facilitate data collection.

After giving the population an opportunity to self-respond to the 2020 Census, addresses for which the Census Bureau did not receive a self-response will form the initial universe of addresses for the NRFU operation. Prior to any fieldwork, vacant addresses will be removed from the NRFU workload using administrative records. Undeliverable-As-Addressed information from the USPS will provide the primary administrative records source for the identification of vacant units. Addresses will also be removed from the workload, throughout the course of the NRFU operation, as late self-responses are received. Addresses may be added to the NRFU workload from other census operations, such as addresses from the LUCA appeals process and addresses received through the non-ID operation that require a field visit for final resolution.

After an initial attempt to contact nonresponding housing units, the NRFU workload will be further reduced through the removal of cases where administrative records and third-party data are available and usable to enumerate the occupied housing units. The NRFU operational design will use administrative records and third-party data to enumerate occupied housing units where it makes sense and is feasible. Examples of sources of administrative records and third-party data used to enumerate occupied housing units include Internal Revenue Service Individual Tax Returns, Internal Revenue Service Information Returns, and the Center for Medicare and Medicaid Statistics Medicare Enrollment Database. A more comprehensive list of administrative records and third-party data sources is provided in the Decisions Made section below.

Addresses removed from the NRFU workload as either vacant or occupied will receive a final mailing that encourages occupants to self-respond to the 2020 Census. After each phase of the administrative records modeling, those addresses that are determined to be vacant will immediately be mailed a final letter encouraging self-response; for those addresses that are determined to be occupied and are incomplete after one personal visit attempt, a final letter encouraging self-response will be mailed after 7 days.
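The workload-reduction flow described above can be sketched in a few lines of Python. This is a hypothetical illustration only, not Census Bureau software: the record layout (an `address_id` field, a set of UAA-flagged vacant addresses, a late-response feed) is invented for the example.

```python
# Hypothetical sketch of NRFU universe reduction. Field names
# (address_id, the uaa_vacant_ids set) are invented for illustration.

def initial_nrfu_universe(addresses, self_responses):
    """Addresses with no self-response form the initial NRFU workload."""
    responded = {r["address_id"] for r in self_responses}
    return [a for a in addresses if a["address_id"] not in responded]

def remove_vacants(universe, uaa_vacant_ids):
    """Drop addresses flagged vacant by USPS Undeliverable-As-Addressed data.

    Removed addresses still receive a final mailing encouraging self-response.
    """
    workload, removed = [], []
    for a in universe:
        (removed if a["address_id"] in uaa_vacant_ids else workload).append(a)
    return workload, removed

def remove_late_responses(universe, late_response_ids):
    """Late self-responses are removed throughout the operation."""
    return [a for a in universe if a["address_id"] not in late_response_ids]

addresses = [{"address_id": i} for i in range(1, 7)]
responses = [{"address_id": 2}, {"address_id": 5}]
universe = initial_nrfu_universe(addresses, responses)        # ids 1, 3, 4, 6
universe, vacants = remove_vacants(universe, uaa_vacant_ids={3})
universe = remove_late_responses(universe, late_response_ids={6})
print([a["address_id"] for a in universe])  # [1, 4]
```

Additions from LUCA appeals or non-ID cases would simply append to the workload list in the same fashion.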

The NRFU operation will use a reengineered field
management structure and approach to managing
fieldwork, which includes:
•• Using automation for:
ºº Optimization of daily enumerator
assignments.
ºº Near real-time operations information for
decision making.
ºº Enhanced operational control system.
ºº Payroll submission and approval processing.
ºº Training of enumerators and managers.
•• New field structure, including field staff roles,
work schedule, and staffing ratios.
On a daily basis, based on an enumerator’s home
location, his or her availability to work, and the
availability and location of NRFU workload, the
enumerator will be assigned addresses and will
work the addresses in a prescribed order to determine the Census Day status of the housing unit,
and when occupied, enumerate the housing unit.
Enumerators will use an automated data collection
application on a handheld device, to record the
Census Day housing unit status and to enumerate
occupied housing units. If a respondent is not at
home, a notice of visit will be left directing the
respondent to the Internet or Census Questionnaire
Assistance to self-respond.
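The daily assignment step described above resembles a capacity-limited assignment plus routing problem. The sketch below is a deliberately simplified, hypothetical illustration (greedy nearest-next routing, invented coordinates, and a notional half hour per case); the plan does not specify the actual optimization method used by the operational control system.

```python
# Hypothetical sketch of daily assignment optimization: cases near an
# enumerator's home are assigned up to that day's availability, then ordered
# by a greedy nearest-next route. Coordinates and hours-per-case are invented.
import math

def assign_and_route(home, availability_hours, cases, hours_per_case=0.5):
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Assign the closest cases that fit within today's availability.
    capacity = int(availability_hours / hours_per_case)
    assigned = sorted(cases, key=lambda c: dist(home, c["loc"]))[:capacity]

    # Order the assigned cases greedily: always visit the nearest next case.
    route, current, remaining = [], home, assigned[:]
    while remaining:
        nxt = min(remaining, key=lambda c: dist(current, c["loc"]))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt["loc"]
    return route

cases = [{"id": "A", "loc": (1, 1)}, {"id": "B", "loc": (5, 5)},
         {"id": "C", "loc": (1, 2)}, {"id": "D", "loc": (9, 9)}]
route = assign_and_route(home=(0, 0), availability_hours=1.5, cases=cases)
print([c["id"] for c in route])  # ['A', 'C', 'B']
```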
Unlike the 2010 Census, the 2020 Census NRFU
operation will use an adaptive design, which
includes a variable contact strategy and stopping
rules to control the number of attempts made for
each address. The number of contacts will vary by
geographic area. Fewer attempts will be made in
some geographic areas, whereas more attempts
will be made in others, with the goal of achieving a consistent response rate across all geographic areas (and within geographic areas for key demographic characteristics). Decisions about when proxy responses are acceptable will also be made as part of the variable contact strategy.
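A variable contact strategy with stopping rules can be thought of as a small decision function evaluated per case. The rules below are invented placeholders (per-area attempt caps and a proxy threshold); as the Design Issues section notes, the real rules were still to be decided when this plan was issued.

```python
# Hypothetical sketch of a variable contact strategy with stopping rules.
# The per-area caps, proxy threshold, and case fields are invented.

def next_action(case, max_attempts_by_area, proxy_after):
    """Decide the next step for a NRFU case based on simple stopping rules."""
    cap = max_attempts_by_area.get(case["area"], 3)
    if case["resolved"]:
        return "done"
    if case["attempts"] >= cap:
        return "stop"                      # stopping rule: attempt cap reached
    if case["attempts"] >= proxy_after:
        return "attempt-proxy-allowed"     # proxy response now acceptable
    return "attempt-household-only"

# More attempts allowed where response rates lag, fewer where they are high.
caps = {"low-response-tract": 6, "high-response-tract": 3}
case = {"area": "high-response-tract", "attempts": 2, "resolved": False}
print(next_action(case, caps, proxy_after=2))  # attempt-proxy-allowed
```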
The 2020 Census NRFU operational design will infuse quality throughout the workload management and data collection processes. Examples of aspects of the NRFU operation designed to maintain or improve quality include:
•• Use of real-time paradata and editing capabilities to increase accuracy and quality-check data.

•• Use of a Best Time to Contact model (used for
the first time in the 2015 Census Test) in the
making of assignments will increase the likelihood of finding respondents at home.
•• Capabilities available through an enhanced operational control system will provide early opportunities to identify and take corrective action in
defined situations.
In addition, the NRFU operation will include a
reinterview component designed to deter and
detect enumerator falsification. The details of this
component are in development and could include a
combination of approaches such as use of paradata
and fieldwork.
Research Completed
The following research has been completed for this
operation:
•• The 2013 Census Test (Philadelphia, PA)
explored methods for using administrative
records and third-party data to reduce the NRFU
workload:
ºº Findings:
•• The Census Bureau was able to remove
approximately 8 percent of vacant units
and 31 percent of occupied units prior to
NRFU using administrative records and
third-party data.
•• The use of administrative records and thirdparty data and the implementation of an
adaptive design case management approach
have the potential to reduce costs.
•• The 2014 Census Test (Montgomery County,
MD and Northwest Washington, DC) built upon
the results of the 2013 Census Test specific
to administrative records and third-party data
usage to reduce the NRFU workload:
ºº Findings: A high self-response rate of 65.7
percent resulted in a NRFU universe of 46,247
housing units. The Census Bureau was able to
identify approximately 4 percent of the NRFU
cases as vacant and 55 percent of NRFU cases
as occupied based on administrative records
and third-party data.
•• The 2014 Human-in-the-Loop SIMEX.
ºº Findings:
•• The field management structure can be streamlined and ratios increased.
•• Messaging and alerts within the operational control system provide real-time and consistent communication.
•• The enhanced operational control system, or MOJO, is intuitive: users were able to use the system with a small amount of up-front training.
•• Smartphones were usable by all people; even those with little technology experience were able to adjust and adapt.
•• The 2015 Census Test (Maricopa County, AZ) explored the reengineering of the roles, responsibilities, and infrastructure for conducting field data collection. It also tested the feasibility of fully utilizing the advantages of planned automation and available real-time data to transform the efficiency and effectiveness of data collection operations. The test continued to explore the use of administrative records and third-party data to reduce the NRFU workload and tested the technical implementation of a BYOD option.
ºº Findings:
•• A high self-response rate of 54.9 percent resulted in a NRFU universe of 72,072 housing units. The Census Bureau was able to identify approximately 12 percent of the NRFU cases as vacant and 20 percent of NRFU cases as occupied based on administrative records and third-party data.
•• Successfully removed vacant housing units and enumerated occupied housing units using administrative records and third-party data.
•• A combination of automated online training and classroom training enabled a reduction in the overall number of training hours, compared with the 2010 Census NRFU operation, from 32 to 18 hours.
•• Management of the field data collection utilizing new roles, responsibilities, and staffing ratios was successfully implemented.
•• Entry of enumerator work availability, workload optimization, and electronic payroll were effective and efficient.
•• Use of a BYOD option did not generate any observable concerns from respondents.

Decisions Made

The following decisions have been made for this operation:

✓ The NRFU operation will consist of production and quality assurance components.
✓ The NRFU operation will utilize automated tools and systems for:
ºº Recruiting, onboarding, and training.
ºº Time and attendance and payroll.
ºº Case load management.
ºº Data collection.
ºº Cost and progress monitoring.
✓ The NRFU operation will utilize a reengineered field management and staffing structure.
✓ Administrative records and third-party data will be used to identify vacant units.
✓ Administrative records and third-party data will be used to enumerate nonresponding housing units, as appropriate.
✓ A contact attempt will be made prior to using administrative records or third-party data for enumeration of occupied units.
✓ A final letter, encouraging self-response, will be mailed to NRFU cases that are removed from the workload based on the administrative records modeling.
✓ Telephone contact attempts from a central location (i.e., Census Questionnaire Assistance) will not be part of the initial NRFU contact strategy.
✓ All administrative records and third-party data will be used in compliance with data use agreements.
✓ The core set of administrative records and third-party data to support the 2020 Census NRFU operations includes the following:
ºº Internal Revenue Service Individual Tax Returns.
ºº Internal Revenue Service Information Returns.
ºº Center for Medicare and Medicaid Statistics Medicare Enrollment Database.
ºº Indian Health Service Patient Database.
ºº Social Security Number Identification File.
ºº USPS DSF.
ºº USPS Undeliverable-As-Addressed Information.
ºº Targus Federal Consumer File.
ºº 2010 Census Data.
ºº ACS Data.
Design Issues to Be Resolved
For each of the core administrative record and third-party datasets, what is the allowable use, required timing, and acquisition approach for the data?
•• Approach: Analysis and research of policies.
•• Decision by: September 2016
To what extent can the Census Bureau minimize the error associated with the use of administrative records and third-party data for the removal of vacant and occupied housing units?
•• Approach: Research conducted in the 2013,
2014, 2015, and 2016 Census Tests.
•• Decision by: September 2016
What is the approach for ingest, initial processing,
use, post processing, and tabulation associated
with administrative records or third-party data for
enumeration?
•• Approach: Research conducted in the 2013,
2014, 2015, and 2016 Census Tests and the
2014 and 2015 SIMEX.
•• Decision by: September 2016
Will statistical modeling, a rules-based approach, or
a combination be used for determination of housing unit status?
•• Approach: Research conducted in the 2013,
2014, and 2015 Census Tests.
•• Decision by: September 2016
When are proxy responses used in the NRFU
operation?
•• Approach: Research conducted in the 2014,
2015, and 2016 Census Tests.
•• Decision by: September 2016
What is the final field management staffing structure (including staffing ratios) for the NRFU operation?
•• Approach: Research conducted in the 2015 and 2016 Census Tests and the 2014 SIMEX; refinements may result from tests conducted in 2017.
•• Decision by: September 2016
What is the final approach for the use of variable contact strategies and stopping rules to balance the goal of reducing the number of attempts against having consistent response rates across demographic groups and geographic areas?
•• Approach: Research conducted in the 2013, 2014, 2015, and 2016 Census Tests, and the analysis of cost and quality trade-offs of different options.
•• Decision by: September 2016
Should decentralized telephoning (i.e., attempts made by an enumerator) and appointments be incorporated into the Nonresponse Followup contact strategy?
•• Approach: Research conducted as part of the 2016 Census Test.
•• Decision by: September 2016
What is the best approach for coordinating enumeration of nonresponding addresses in multiunits and gated communities?
•• Approach: Research conducted in the 2016 Census Test.
•• Decision by: September 2016
How will any field verification of unmatched but geocoded non-ID responses be integrated into the NRFU operation?
•• Approach: Research conducted in the 2017 Census Test.
•• Decision by: September 2017
Given the potential for infusing quality throughout the Nonresponse Followup systems and procedures, what is the operational design for the NRFU quality assurance component?
•• Approach: Research conducted as part of the 2016 and 2017 Census Tests.
•• Decision by: September 2017
To what extent and how will vacant addresses and addresses found to not exist, discovered during In-Field Nonresponse Followup, be verified?
•• Approach: Research conducted as part of the 2017 Census Test.
•• Decision by: September 2017

To what extent and how can a last-resort data collection be implemented within the controlled environment that exists with the reengineered workload optimization and management capabilities?
•• Approach: Research conducted as part of the 2017 Census Test.
•• Decision by: September 2017
Will fieldworkers enumerate adds found during Nonresponse Followup, and if so, how does the Census Bureau incorporate real-time non-ID into the process?
•• Approach: Research conducted as part of the 2017 Census Test.
•• Decision by: September 2017
What are the business rules for optimizing case assignments?
•• Approach: Research conducted as part of the 2015, 2016, and 2017 Census Tests.
•• Decision by: September 2017
Given other aspects of the 2020 Census operational design, what is the operational timing for the 2020 Census NRFU operation?
•• Approach: Coordination and integration with other relevant operations.
•• Decision by: September 2017
What are the sources that contribute to the NRFU universe (e.g., LUCA Appeals, late DSF adds, and nonresponding UE addresses)?
•• Approach: Coordination and integration with other relevant operations.
•• Decision by: September 2017
What are the best enumerator performance indicators?
•• Approach: Review of existing indicators built into the operational control system to determine the need for additional performance alerts.
•• Decision by: September 2017
What is the final set of administrative records and third-party data (including state-level data sources) that are necessary to support the 2020 Census NRFU operation?
•• Approach: Research conducted in the 2013, 2014, 2015, 2016, 2017, and 2018 Census Tests, building upon other research.
•• Decision by: September 2018
For each of the final administrative record and third-party datasets, what is the allowable use, required timing, and acquisition approach for the data?
•• Approach: Analysis and research of policies and due diligence.
•• Decision by: September 2018

Cost and Quality

Cost impacts of this operation on overall 2020 Census costs include the following:

The investment in NRFU, which includes administrative records and third-party data usage and field reengineering, will reduce the cost of the 2020 Census as compared with the 2010 Census through:

↓ Reduced field workload by:
ºº Using administrative records and third-party data to remove vacant living quarters from the Nonresponse Followup workload.
ºº Using administrative records and third-party data to reduce the number of contact attempts.
ºº Using administrative records and third-party data to enumerate nonresponding housing units.
ºº Removal of late self-responses.
↓ Improved productivity of field staff by:
ºº Streamlining the staffing structure through the use of automation.
ºº Automating and optimizing the assignment process.
ºº Using language information from the planning database to determine work assignments.
ºº Using administrative records and third-party data to determine the best time of day for contact attempts.
↓ Reduced reinterview workload through a reengineered quality assurance approach.
↓ Reduced number of hours devoted to training through the use of automation.

Quality impacts of this operation on overall 2020 Census quality include the following:

↑ Use of an improved contact strategy to increase the likelihood of self-response.

↑ Use of an automated data collection application for conducting NRFU.
↑ Use of real-time paradata and editing capabilities to sanitize and quality-check data.
↑ Use of the Best Time to Contact model in the assignment optimization to increase the likelihood of finding respondents at home.
↑ Use of the Notice of Visit to push to self-response.
↑ Use of a follow-up postcard mailing to push to self-response in the case of administrative records and third-party data vacant and occupied removals.
↓ Using administrative records and third-party data to remove vacant and occupied housing units from the NRFU workload may impact housing unit coverage.
↓ Using administrative records and third-party data to reduce the number of contact attempts may decrease the quality of responses.
↔ Use of new or revised methodologies will change results in ways not yet determined.
↔ Use of adaptive design protocols and proxy rules may impact the quality of response data in ways not yet determined.
Risks
Many aspects related to the Nonresponse Followup
operational design and the infrastructure necessary
to support it are based on workload assumptions.
A key input to those workload assumptions is the
self-response rate. IF the 2020 Census self-response
rate falls below expectations, THEN the initial NRFU
workload will be higher than expected and the infrastructure to support an increased field data collection volume may be insufficient.
Natural disasters in the form of hurricanes, floods,
epidemics, etc., are uncontrolled events that could affect the population's willingness and ability to
participate in the decennial census, as well as
having detrimental impact on the Census Bureau’s
ability to conduct the NRFU operation. IF a natural disaster occurs at or around the time of the
2020 Census, THEN it will be difficult to conduct
NRFU in the impacted geographic areas due to the
problems gaining access to the populations living
in those areas.
The NRFU workload will be impacted by other
operations that are striving to develop and improve
the coverage and quality of the address frame used
for the 2020 Census. IF there is an increase in the
NRFU operational workload due to the results of
the up-stream address frame operations, THEN the
expected cost savings from the NRFU operation
may not be realized.
Technical innovations such as assignment optimization and Bring Your Own Device are key elements
to the operational design for conducting NRFU. IF
any aspect of the planned technical innovations
does not perform as expected, THEN the operational design for NRFU may fail.
Technical innovations are expected to reduce the
cost of the NRFU operation, but the cost of the
operation can be greatly impacted by economic
conditions beyond the Census Bureau’s control. IF
economic conditions are not favorable at the time
of the 2020 Census, THEN the costs to implement
the NRFU operation may prevent the expected cost
savings from being realized.
The utilization of administrative records and
third-party data to reduce the NRFU workload is a
foundational tenet on which the 2020 Census program expects to realize cost savings. IF the Census
Bureau is unable to use administrative records and
third-party data as planned, THEN increased costs
will be incurred to conduct NRFU.


Milestones

Date	Activity
November 2013	Begin NRFU for 2013 Census Test.
August 2014	Begin NRFU for 2014 Census Test.
November 2014	Conduct 2014 SIMEX.
May 2015	Begin NRFU for the 2015 Census Test.
September 2015	Determine preliminary NRFU Design.
December 2015	Conduct 2015 SIMEX.
May 2016	Begin NRFU for 2016 Census Test.
September 2016	Determine strategy for use of administrative records and third-party data in NRFU. Release the Nonresponse Followup Detailed Operational Plan.
May 2017	Begin NRFU for 2017 Census Test.
May 2018	Begin NRFU for 2018 Census End-to-End Test.
April 2020	Begin NRFU data collection for the 2020 Census.
August 2020	End NRFU data collection for the 2020 Census.
August 2021	Issue operational assessment of the 2020 Census NRFU operation.

5.5.11	Response Processing
Detailed Planning Status: Underway

Purpose

This operation supports the three major components of the 2020 Census: predata collection activities, data collection activities, and post-data collection activities. Specifically, it includes the following:

•• Establish the initial 2020 Census universe of living quarters.
•• Assign the specific enumeration strategy (i.e., contact strategy and follow-up approach) for each living quarter based on case status and associated paradata.
•• Create and distribute workload files required for enumeration operations.
•• Track case enumeration status.
•• Run post-data collection processing actions in preparation for producing the final census results.

Lessons Learned

Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• Make response data available as soon as possible to the data review teams in order to facilitate
a more thorough review.
•• Include more staff members from more areas in
the Primary Selection Algorithm determination
process. This will result in broader expertise for
design planning, rather than limiting to a small
team of mathematical statisticians or analysts.
•• Make user testing of the Quality Control
program component part of the schedule for
residual coding, to facilitate development of procedures and training of data coding staff.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Use of enterprise-developed tools to facilitate
intelligent business decisions prior to and
during data collection:
ºº Conduct mode-level case assignment
activities.
ºº Interface with all printing systems for production of paper products.
ºº Serve as the overall integration “manager” of
response data collection, including Internet,
telephone, and paper data capture.
ºº Create models based on established business rules to determine the appropriate course of enumeration action for cases (e.g., person visit, use of administrative records and third-party data, or imputation) and assign each case to the specific mode for data collection.
•• Expanded use of administrative records and
third-party data in post-data collection processing activities to support improved data
coverage.
Description of Operation
Predata Collection Activities
During predata collection activities, the Response Processing operation applies criteria to create the initial 2020 Census universe used to support early census operations, assigns and manages specific contact strategies for each living quarter based on defined criteria, and creates and distributes universe files required for various enumeration operations.
Data Collection Activities
For data collection activities, the Response
Processing operation starts with receiving and managing updates to the initial 2020 Census universe.
These updates come from various address frame
update operations including Address Canvassing,
LUCA, UE, and some Geographic Programs activities. The results from the address updates establish
a revised 2020 Census self-response universe. The
Response Processing operation uses this universe
to control and track questionnaire response data.
Modeling techniques using established business
rules determine the appropriate course of enumeration action for cases and assign the cases to the
specific modes for processing (adaptive design). As
responses are received, cases containing a Census
ID are removed from the self-response universe.
Cases returned without Census IDs are sent to
the Non-ID Processing operation for matching and
geocoding. All cases are returned to the Response
Processing operation, and those that were successfully resolved are removed from the enumeration
universe.
For nonresponding cases, the Response Processing
operation supports the NRFU operation by determining the most effective enumeration strategy,
including removing cases from the workload based
on established “stopping rules.”
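The adaptive-design routing and "stopping rules" described above can be sketched as a simple rule engine. The rules, thresholds, mode names, and the visit cap below are illustrative assumptions for exposition only, not the Census Bureau's actual business rules:

```python
from dataclasses import dataclass

# Hypothetical stopping rule: cap on in-person contact attempts
# before a case is removed from the field workload.
MAX_VISITS = 6

@dataclass
class Case:
    case_id: str
    has_admin_records: bool   # usable administrative/third-party data?
    visits_attempted: int     # in-person contact attempts so far
    responded: bool

def route_case(case: Case) -> str:
    """Return the next enumeration action for a case (illustrative rules)."""
    if case.responded:
        return "remove_from_universe"
    if case.has_admin_records:
        return "enumerate_with_admin_records"
    if case.visits_attempted >= MAX_VISITS:
        return "impute"          # stopping rule reached
    return "person_visit"

workload = [
    Case("A1", has_admin_records=True, visits_attempted=0, responded=False),
    Case("B2", has_admin_records=False, visits_attempted=6, responded=False),
    Case("C3", has_admin_records=False, visits_attempted=2, responded=False),
]
for c in workload:
    print(c.case_id, route_case(c))
```

In a production setting the routing decision would come from the modeled business rules rather than fixed thresholds; the structure, not the specific rules, is the point here.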
Post-Data Collection Activities
The Response Processing operation supports post-data collection activities by preparing the data

for tabulation. As the data are received, write-in
responses (i.e., hand-written responses provided
when respondents do not select an option from
the questionnaire) are coded for tabulation purposes. Coding is conducted by both automated and
computer-assisted manual processes. In addition, checks are run to detect invalid (fraudulent)
returns. Response Processing applies computer-based person matching software to unduplicate
multiple responses for the same person across census records. Then, a Primary Selection Algorithm
is run to establish the single enumeration record
for a case when multiple responses are received.
Following the Primary Selection Algorithm, imputations are applied and missing data resolved to
fix discrepancies between household population
counts and person data. This output is called the
Census Unedited File. The Census Unedited File is
used as a data source for coverage measurement
operations and a final independent count review
operation. Finally, the Census Unedited File is the
source used to produce the apportionment counts
delivered to the President of the United States via
the Data Products and Dissemination operation.
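The unduplication and primary-selection step above can be illustrated with a minimal sketch. The scoring rule used here (prefer the return with more complete person data, then the earlier receipt) is a hypothetical stand-in for the actual Primary Selection Algorithm criteria, which are defined in the detailed operational plan:

```python
# Simplified illustration of selecting a single "return of record" when a
# housing unit has multiple returns. The scoring rule is a hypothetical
# stand-in for the actual Primary Selection Algorithm criteria.
def select_return_of_record(returns: list[dict]) -> dict:
    def score(r: dict):
        # Count persons with both name and age reported; break ties by
        # preferring the earlier receipt.
        completeness = sum(1 for p in r["persons"] if p.get("name") and p.get("age"))
        return (completeness, -r["receipt_order"])
    return max(returns, key=score)

returns = [
    {"receipt_order": 1, "persons": [{"name": "A", "age": 30}]},
    {"receipt_order": 2, "persons": [{"name": "A", "age": 30}, {"name": "B", "age": 5}]},
]
print(select_return_of_record(returns)["receipt_order"])  # prints 2 (the fuller return)
```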
The next steps are to perform preliminary and
complex consistency edits, apply Disclosure
Avoidance techniques, and produce a Hundred
Percent Detail File for delivery to the Data Products
and Dissemination operation, which is then used for creation of the P.L. 94-171 Census Redistricting Data
File and dissemination of data to the public. As part
of a final closeout, Response Processing prepares
census response data for delivery by the Archiving
operation to the National Archives and Records
Administration (NARA) for the Title 13 prescribed
72-year secured storage.
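Because write-in responses may contain accented or otherwise nonstandard characters, capture pipelines typically normalize text before coding. A minimal sketch, assuming UTF-8 capture and NFC normalization (neither is a stated 2020 Census requirement):

```python
import unicodedata

def normalize_write_in(raw: bytes) -> str:
    """Decode a captured write-in response and normalize it for coding.

    Assumes UTF-8 capture; undecodable bytes are replaced rather than
    allowed to corrupt downstream processing.
    """
    text = raw.decode("utf-8", errors="replace")
    text = unicodedata.normalize("NFC", text)   # compose accented characters
    return " ".join(text.split())               # collapse stray whitespace

print(normalize_write_in("Peña".encode("utf-8")))   # prints Peña
print(normalize_write_in(b"Sa\xcc\x83o  Paulo"))    # prints São Paulo (composed)
```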

Predata Collection Activities
•• Receive address and geographic input data for all known living quarters.
•• Apply criteria to create the initial 2020 Census enumeration universe.
•• Assign the specific contact strategy for each living quarters based on defined criteria.

Data Collection Activities
•• Receive updates to the initial 2020 Census universe.
•• Create the 2020 Census self-response universe.
•• Create and distribute workloads to data collection modes based on modeling results or specification criteria.
•• Record response data and enumeration case status.
•• Deliver response data to postdata collection activities.

Postdata Collection Activities
•• Apply data codes to write-in responses to facilitate data tabulation.
•• Identify potential invalid returns.
•• Resolve potential duplicate responses.
•• Identify the return of record for housing units with multiple returns.
•• Repair missing or conflicting data.
•• Provide final census results.

Figure 33: Response Processing Operation

Figure 33 summarizes the Response Processing
operation by component.
Research Completed
The following research has been completed for this
operation:
•• The 2014 Census Test evaluated the interface
between the response processing system and
the matching and geocoding system. In addition,
it tested the data file exchange.
ºº Findings: The tests concluded with no major
system or workload-related issues.
•• The 2015 Optimizing Self-Response Test and the
2015 Census Test included processing of non-ID
cases in real time (during response collection for
Internet and telephone data collection modes).
ºº Findings: The tests concluded with no major
system or workload-related issues.
Decisions Made
The following decisions have been made for this
operation:
99 The Response Processing operation will use the
enterprise-developed system solutions (Control
and Response Data System and Multimode
Operational Control System) for universe creation, data collection control and management,
and final data processing.
99 The enterprise-developed Concurrent Analysis
and Estimation System and its modeling output
will use established business rules to determine
the appropriate course of enumeration action for
cases and assign the case to the specific mode
for data collection to improve efficiency and
reduce cost.
99 Administrative records and third-party data
will be used to improve post-data collection
activities, such as coding and editing, primary
selection algorithm, Invalid Return Detection
(IRD), and imputation.
99 The Response Processing operation will comply
with Title 13 and Title 26 security requirements.
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
What are the methodologies, processes, and systems needed for Residual Coding, Primary Selection
Algorithm, IRD, Editing/Imputation, Edit Review
System, and Hundred Percent Detail File?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: December 2015
How will administrative records and third-party
data be specifically used with response processing
operations?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: December 2015
What character set(s) will be supported for write-in
responses?
•• Approach: Design of the Language Services
and Content and Forms Design operations.
•• Decision by: December 2015
What are the number of write-in questions, the final
coding dictionary entries and rules, the maximum
field lengths for write-ins, and the required
character set (if there is potential expansion to
include special characters or multilingual language
characters) that will be used for the purposes of
developing the response file layout?
•• Approach: Research in the 2016 and 2017
Census Tests.
•• Decision by: May 2016
What will be the estimated workload of post-capture Non-ID Processing?
•• Approach: Researched in the 2014 Census Test, all
2015 Tests, the 2016 Census Test, and the 2017
Census Test.
•• Decision by: September 2017

Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:
Investment in Response Processing will decrease
the cost of the 2020 Census as compared with the
2010 Census through:
ÐÐ Universe adjusted in “real-time” based on
response status and use of administrative
records and third-party data.
ÐÐ Flexible, rule-based decisions on the most cost-effective approach for collecting responses
(expected to reduce in-field workloads).
Quality impacts of this operation on overall 2020
Census quality include the following:
ÏÏ Use of administrative records and third-party
data to improve imputation, editing and coding,
primary selection algorithm, and IRD processing.

Risks
Special characters may present difficulty in automated data processing. IF it is not defined how
special characters will be handled during automated data processing, THEN individual systems
and system interfaces may not support final character sets, allowing for corruption of nonstandard
characters and loss of data and/or data context.

Milestones
March 2015: Establish the develop, test, beta, staging, and production environments for Response Processing.
December 2015: Go live to support the 2016 Census Test universe creation and response tracking.
September 2016: Release the Response Processing Detailed Operational Plan.
December 2016: Go live for the 2017 Census Test.
January 2017: Deliver revised 2020 Census business requirements for Response Processing.
September 2018: Deliver final 2020 Census business requirements for Response Processing.
October 2019: Create the initial 2020 Census enumeration universe for early census operations.
January 2020: Create the 2020 Census self-enumeration universe.
January 2020: Begin the 2020 Census Response Processing operation.
November 2020: Deliver the 2020 Census Unedited File for apportionment counts.
March 2021: Deliver the 2020 Census Microdata Detail File for tabulation.


5.5.12	 Federally Affiliated Americans
Count Overseas
Detailed Planning Status: Recently Begun

Purpose
The Federally Affiliated Americans Count
Overseas operation obtains counts by home state
of U.S. military and federal civilian employees
stationed or deployed overseas and their dependents living with them.

Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Explore new technology, including an Internet
option for collecting data on the federally affiliated population living overseas.
•• Fully automate this operation.
•• Consider new data fields to identify the residency of the military personnel living overseas.
•• Maintain a strong relationship with the
Department of Defense.

Opportunities to Innovate
The primary opportunity to innovate for this operation is to create a secure interactive database for
the Department of Defense to submit its enumeration counts.

Description of Operation
For the 2020 Census, overseas is defined as
anywhere outside the 50 states and the District of
Columbia. Counts are obtained from administrative
records and are used to allocate the federally affiliated population living overseas.
The Federally Affiliated Americans Count Overseas
operation performs the following activities:
•• Compile an address list of federal agencies with
personnel overseas.
•• Prepare enumeration materials.
•• Request the name of a contact person for
each agency.
•• Obtain agencies’ overseas counts by state.
•• Submit final counts for inclusion in the apportionment counts.

Research and Design Decisions Completed to Date
Research Completed:
•• Market Research:
ºº Met with the Defense Manpower Data Center
in March 2014 to discuss any suggested
updates from the 2010 Census enumeration.
•• Finding: The U.S. Air Force is again using
the Home of Record field for its military
personnel.
Because detailed planning for this operation has
recently started, research that directly supports this
operation has not yet been completed. However,
based on the design from previous censuses, the
following assumptions have been made:
•• Continuously engage and communicate the
Census Bureau’s methodology and procedures
with the Defense Manpower Data Center.
•• Establish an online site for communicating with
participating federal agencies and for collecting responses on a form that can be completed
electronically.
•• Use data from the Department of Defense
Personnel System to enumerate the military and
their dependents and Department of Defense
federal civilian employees overseas in the following order: Home of Record, Legal Residence,
and Last Duty Station.
•• Use the Defense Enrollment Eligibility Reporting
System as an additional source of data to enumerate the military and their dependents and
Department of Defense federal civilian employees overseas.
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
What other data sources are available for tabulating
the overseas counts?
•• Approach: Based on ongoing discussions with
federal agencies.
•• Decision by: January 2018
How will the Census Bureau use electronic transmissions to obtain the data?
•• Approach: Based on ongoing discussions with
federal agencies.
•• Decision by: January 2018

Cost and Quality
Investment in the Federally Affiliated Americans
Count Overseas operation will have minimal impact
on the cost and quality of the 2020 Census as compared
with the 2010 Census.

Risks
The Federally Affiliated Americans Count Overseas
operation will add new data sources to improve
data collection for the 2020 Census overseas
count. IF new ways of collecting data are not
researched and tested prior to implementation for
the 2020 Census, THEN there may be a negative
impact on data quality.
The Federally Affiliated Americans Count Overseas
operation plans to use an external-facing portal
as an automated collection system for the 2020
Census overseas count. IF the external-facing portal does not meet the Census Bureau’s IT security
requirements and cannot be used for the automated collection system, THEN collection methods
used for the 2010 Census may have to be reused
for the 2020 Census overseas count.

Milestones
February 2014: Establish contact with the Defense Manpower Data Center.
February 2017: Review final guidelines for counting federally affiliated Americans living overseas.
September 2017: Release the Federally Affiliated Americans Count Overseas Detailed Operational Plan.
March 2018: Obtain Office of Management and Budget clearance.
May 2018–February 2020: Design, prepare, and send contact letters, count letters and instructions, and a follow-up count request.
September 2019: Obtain from the Office of Personnel Management the most recent Federal Civilian Workforce Statistics publication.
July 2020: Prepare and review overseas counts.
August 2020: Deliver overseas counts for inclusion in the apportionment count.

5.6	 PUBLISH DATA
Response Processing delivers the edited data to the
Data Products and Dissemination operation to
prepare the final 2020 Census data products. This
operation delivers:
•• Input to the Count Review operation to ensure
the counts appear correct.
•• Apportionment counts to the President of the
United States.
•• State counts to the RDP for dissemination to
the state legislatures so state governments
can define the geographic boundaries for
Congressional and legislative districts.
•• Final counts to the Count Question
Resolution operation so challenges to census
counts can be resolved.
•• All response data to the Archiving operation for
public release 72 years after the census.

5.6.1	 Data Products and Dissemination
Detailed Planning Status: Not Started

Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau’s preliminary thoughts as of the
release of this document.
Purpose
The Data Products and Dissemination operation performs three primary functions:
•• Prepare and deliver the 2020 Census apportionment data for the President of the United States
to provide to Congress by December 31, 2020.
•• Tabulate 2020 Census data products for use by
the states for redistricting.
•• Tabulate and disseminate 2020 Census data for
use by the public.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:

•• Provide an approach to restructure and enhance
data dissemination activities across the entire
agency.
•• Improve customer satisfaction.
•• Expand the Census Bureau’s audience and customer base.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Use of enterprise solutions for preparing the
2020 Census data products and disseminating
the information to the public.
•• Enhancements to existing tabulation systems to
support 2020 Census tabulation as an enterprise
solution.
•• Leveraging new solutions to allow data users
greater flexibility in using 2020 Census data for
research, analytics, application development,
etc. The focus is on user-centric capabilities and
dissemination functionality.

Description of Operation
The Data Products and Dissemination operation
takes the processed response data, tabulates it,
applies the necessary Disclosure Avoidance procedures, and prepares it for delivery to the President,
the states, and the public.
A set of enterprise-level systems will provide
access to data via an interactive Web site, allowing users to access prepackaged data products,
application programming interfaces, and metadata
documentation. These include:
•• CEDSCI dissemination platform.
•• Tabulation System.
•• Customer Experience Management System.

Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed.

Assumptions Made
Based on planning of other operations, the following assumptions have been made:
•• The apportionment for the 2020 Census will be
calculated using the method of equal proportions, according to the provisions of Title 2,
U.S. Code. Congress decides the method used
to calculate the apportionment. This method
has been used in every census since the 1940
census.
•• This operation will:
ºº Define data products.
ºº Define metadata.
ºº Generate metadata and mapping for
Application Programming Interfaces.
ºº Generate data products (Apportionment
and Redistricting) and associated data
documentation.

Design Issues to Be Resolved
The following decisions need to be made for this
operation:
How will the Census Bureau develop the 2020
Census data user interface through CEDSCI?
•• Approach: Requirements for a P.L. 94-171
Redistricting Data Prototype will be included as
a use case in system and user interface development starting with the release of the CEDSCI
Alpha prototype.
•• Decision by: November 2015
Which system will provide the 2020 Census
Tabulation solution?
•• Approach: ACS testing and a feasibility
recommendation for 2020 Census tabulation
processing.
•• Decision by: July 2016
What will be the mix or array of standardized data
products?
•• Approach: Design and propose the standardized data products for public comment through
.
•• Decision by: March 2017 (Tentative)
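The method of equal proportions noted under Assumptions Made is well defined in Title 2, U.S. Code: each state first receives one seat, and the remaining seats are awarded in descending order of the priority value pop/sqrt(n(n-1)) for each state's nth seat. A sketch using toy populations (not census figures):

```python
import heapq
from math import sqrt

def apportion(populations: dict[str, int], seats: int) -> dict[str, int]:
    """Method of equal proportions (Huntington-Hill): every state gets one
    seat, then each remaining seat goes to the state with the highest
    priority value pop / sqrt(n * (n - 1)) for its next (nth) seat."""
    counts = {s: 1 for s in populations}
    # Max-heap via negated priority values, seeded with each state's 2nd seat.
    heap = [(-pop / sqrt(2 * 1), s) for s, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(seats - len(populations)):
        _, state = heapq.heappop(heap)
        counts[state] += 1
        n = counts[state] + 1  # priority value for this state's next seat
        heapq.heappush(heap, (-populations[state] / sqrt(n * (n - 1)), state))
    return counts

# Toy example: 10 seats among three states.
print(apportion({"A": 6_000_000, "B": 3_000_000, "C": 1_000_000}, 10))
# → {'A': 6, 'B': 3, 'C': 1}
```

For the actual House apportionment, seats = 435 and the populations are the state apportionment counts derived from the Census Unedited File.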
Cost and Quality
Investments in the Data Products and
Dissemination operation will have minimal impact
on the cost and quality of the 2020 Census as compared with the 2010 Census.
Risks
The scope of CEDSCI includes providing tabulation
services for the 2020 Census program starting in
2018. IF the 2020 Census depends on CEDSCI
to provide tabulation services prior to 2018, THEN
the scope of CEDSCI will be larger than what is
feasible to accomplish.
The 2020 Census program is dependent on CEDSCI
to develop and deliver a data dissemination system. IF CEDSCI is unable to deliver a dissemination
system for the 2020 Census, THEN a new data
dissemination system will not be available and traditional systems will have to be explored for reuse.
Milestones
March 2014: Release the concept of operations for a more customer-centric, streamlined, and flexible enterprise solution for data dissemination.
July 2014: Establish the Center for Enterprise Dissemination Services and Consumer Innovation.
September 2017: Release the Data Products and Dissemination Detailed Operational Plan.
September 2018: Deliver final 2020 Census business requirements to support the 2020 Census Data Product Plan.
December 2018–April 1, 2019: Deploy tabulation system and deploy dissemination platform for production and release of the P.L. 94-171 Redistricting Data Prototype.
December 2020: Provide apportionment counts to the President of the United States.
By April 1, 2021: Complete the release of the P.L. 94-171 Redistricting Data to the states, the District of Columbia, and Puerto Rico.
May 2021–September 2022: Deliver 2020 Census statistical data to the enterprise data dissemination platform for the release of quick tables and application programming interfaces.
April 2023: Release final data products.

5.6.2	 Redistricting Data Program
Detailed Planning Status: Underway

Purpose
The purpose of the RDP operation is to provide to
each state the legally required P.L. 94-171 redistricting data tabulations by the mandated deadline
of 1 year from Census Day: April 1, 2021.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Provision of a prototype product is necessary.
•• The ability to provide legal boundary updates
is needed.
•• Delivery of the data prior to public release is
necessary.
Opportunities to Innovate
Opportunities to innovate include the following:
•• Separation of the program’s Block Boundary
Suggestion Project from the Voting District
Project to allow greater external participation.
•• Inclusion of a BAS component to capture and
improve underlying geography.
•• Processing at Headquarters and the NPC to
provide states with consistent guidance, to
enhance coordination between BAS and RDP,
and to reduce burden on the Geographic Area
Reconciliation Program.
•• State legislative district updates captured at time
of collection of Congressional district updates
reducing the need for multiple efforts.
Description of Operation
The RDP Operation provides the 50 states, the
District of Columbia, and Puerto Rico with the
opportunity to identify, delineate, and update
geographic boundaries for data tabulation. It
also allows for continuous process improvement
through an evaluation of the program, with recommendations for the next cycle published in an official
publication called “The View From the States.”


The five major components in the 2020 Census
RDP include:
•• Phase 1—Block Boundary Suggestion Project.
•• Phase 2—Voting District Project.
•• Phase 3—P.L. 94-171 data and geographic support products design and delivery.
•• Phase 4—Collection of changes to Congressional
and State Legislative Districts.
•• Phase 5—Evaluation of the 2020 Census RDP
and recommendations for the 2030 RDP.

Research Completed
The following research has been completed for this
operation:
•• January 2015: Released the Designing P.L.
94-171 Redistricting Data for the Year 2020
Census—The View From the States.
ºº Findings:
•• Need for a “one number” Census.
•• Need for a prototype data product.
•• Need for data delivery prior to public
release.
•• Need for GQ data.
•• Need for support products using most
current (2020) geography.
•• Need for tabulation block and voting district data.
•• Need for states to have the option to use
their resident GIS systems for program
participation.

Decisions Made
The following decisions have been made for this
operation:
99 Prototype P.L. 94-171 redistricting data tabulations
and geographic support products from the 2018
Census End-to-End Test will be generated and distributed to official liaisons by April 1, 2019.
99 Use the GUPS as one of the methods for interaction with and collection of partner updates.
99 GQ tabulations by race for the seven main group
quarters types will be included as part of the
official P.L. 94-171 redistricting data file.

Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
Can the Census Bureau produce 2010 Census to
2020 Census block, block group, and tract relationship files for release on the same schedule as the
P.L. 94-171 redistricting data?
•• Approach: Research and test using the Block
Boundary Suggestion Program verification prototype blocks produced in December 2016.
•• Decision by: April 2017
What changes, if any, to the structure of the P.L.
94-171 redistricting data file may result from
research on changing the separate race and
ethnicity questions to a single question and the
possible inclusion of a Middle Eastern North African
category?
•• Approach: Research using the outcomes of the
2015 National Content Test results.
•• Decision by: June 2017
Can the Census Bureau produce Citizen Voting Age
Population by Race tabulations in early 2021 using
the new 2020 Census tabulation geography?
•• Approach: Research and test using the 2013–
2017 ACS 5-year estimates run using the
2018 geographies for simulated release by
February 1, 2019.
•• Decision by: March 2019
What IT capabilities and data distribution methodology will be used (including maps)?
•• Approach: Research through prototype delivery
in March 2019.
•• Decision by: June 2019
Cost and Quality
Cost impacts of this operation on overall 2020
Census costs include the following:

•• The investment in Redistricting Data Program
will have minimal impact on the cost of the 2020
Census as compared with the 2010 Census.
Quality impacts of this operation on overall 2020
Census quality include the following:
ÏÏ Consistent messaging and guidance to
participants.
ÏÏ Consistent processing of incoming files.
ÏÏ Improvement of incoming file quality due to
expanded participation timeline.
ÏÏ Improvement in underlying geography through
iterated update cycles—update, apply, view,
refine, update.
Risks
The GUPS being developed is a critical tool in
ensuring that all states can participate in the program regardless of their ownership of Geographic
Information System software. IF the GUPS modules
are not ready for use by the start date of each
stage of the RDP update project, THEN participants
will have unequal opportunities for participation,
violating the principles of P.L. 94-171.
As part of its mission to provide the states with the
small area tabulations needed to conduct legislative redistricting and to deliver that product within
1 year of Census Day, the Census Bureau produces
a full prototype product and delivers that product
within the same time constraints. This prototype
and process is used to validate both the product
and the processing. IF the systems for producing products from the 2018 Census End-to-End
Test are not ready, THEN a P.L. 94-171 prototype
product will not be generated within the timeframe
required (before April 1, 2019), and stakeholders
will not be able to review and provide feedback as
to the acceptability of the product in meeting the
Census Bureau’s legal mandate.

Milestones
July 2014: Submit Federal Register Notice proposing the 2020 Census Redistricting Data Program.
January 2015: Publish “Designing P.L. 94-171 Redistricting Data for the Year 2020 Census—The View From the States.”
December 2015–May 2017: Conduct Phase 1: Block Boundary Suggestion Project.
September 2016: Release the Redistricting Data Program Detailed Operational Plan.
October 2017: Finalize the P.L. 94-171 Prototype Products Design.
December 2017–May 2019: Conduct Phase 2: The Voting District Project.
March 2019: Deliver P.L. 94-171 Prototype Products.
November 2020–March 2021: Conduct Phase 3: Data Delivery for the 2020 Census Redistricting Data Program.
April 1, 2021: Deliver the P.L. 94-171 data (legal deadline).

5.6.3	 Count Review
Detailed Planning Status: Not Started

Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau’s preliminary thoughts as of the
release of this document.
Purpose
The Count Review operation enhances the accuracy
of the 2020 Census by:
•• Implementing an efficient and equitable process
to identify missing housing units.
•• Identifying and correcting missing or geographically misallocated large group quarters.

Lessons Learned
Based on lessons learned from the 2010 Census,
the following recommendations were made:
•• Planning for the Count Review Program needs to
begin earlier in the decennial planning cycle to
be more easily and fully integrated with decennial census operations.
•• Address-level precision is essential to an effective count review program.
•• Consider working with E911 system, tax
assessor records, and other federal agencies to
develop a common format and address updating
protocol.
•• Have both group quarters and housing unit
address information available during the review.
Opportunities to Innovate
No specific opportunities to innovate have been
identified to date for this operation.
Description of Operation
The operational description provided below is
based primarily on the operational design of the
2010 Census Count Review operation. When the
2020 Census Count Review operation is funded,
a primary focus should be on determining what
the objectives of the 2020 Census Count Review
operation should be based on other aspects of the
2020 Census operational design. The focus should
be on defining the Count Review operation for the
2020 Census that is integrated with other census
operations, fully tested, and designed to resolve
count issues identified by the program. Under the
joint-partnership authority, an FSCPE and 2020
Census Working Group was established to explore
opportunities to leverage the knowledge and experience of the FSCPE network to the benefit of the
2020 Census Program. Membership of the working
group includes representatives from the FSCPE
Steering Committee, as well as Census Bureau subject matter experts.
The Count Review operation consists of the
following:
•• A partnership with the FSCPE members for a
housing unit count review.
•• A partnership with the FSCPE members for a GQ
count review focusing on large group quarters
(missing and misallocated).
•• Review of the following for systematic or large
anomalies in population and housing units:
ºº Census Unedited File.
ºº Census Edited File.
ºº Microdata Detail File.
The design and schedule for the Count Review
Program will consider the necessary inputs and
outputs to ensure a smooth transition to downstream operations, such as the Count Question
Resolution operation.
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed. However,
discussions are underway as part of the scope of
the FSCPE and 2020 Census Working Group.
Assumptions Made
Based on planning of other operations, the following assumptions have been made:
•• The Count Review operation will leverage
the knowledge and experience of the FSCPE
network.
•• The Census Count Review operation will leverage existing software and systems to accomplish its goals and objectives.
•• The FSCPE and Census Bureau staff will review
population, housing unit, and group quarters
counts.
Design Issues to Be Resolved
The following decisions need to be made for this
operation:
How will the 2020 Census Count Review Program
leverage the knowledge and experience of the
FSCPE network for conducting housing unit, group
quarters, and population count review?
•• Approach: On-going discussions with the FSCPE
and 2020 Census Working Group.
•• Decision by: end of FY 2016
What are the objectives, scope, and operational
timeline of the 2020 Census Count Review
Program?
•• Approach: Determined through the development of the Detailed Operational Plan.

•• Decision by: September 2017
What does success for the 2020 Census Count
Review Program look like, and how is it measured?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2017
What is the appropriate level of geography for conducting housing unit, group quarters, and population count review?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2017
What is the timing of the Count Review? Can the
Census Bureau conduct the Count Review in time
to impact the counts?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2017
How can Count Review improve the GQ universe
before enumeration?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2017
What approach will be used for validating missing
housing units provided by FSCPEs? For example,
fieldwork? Aerial imagery?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2017
What approaches will be used for validating group
quarters count discrepancies?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2017

Cost and Quality
Investments in the Count Review Program will have
minimal impact on the cost and quality of the 2020
Census, as compared with the 2010 Census.
Risks
No risks have been identified to date for this
operation.
Milestones
•• October 2015: Initiate the 2020 Census Count
Review Program Integrated Product Team.
•• September 2017: Release the Count Review
Detailed Operational Plan.
•• February 2020: Conduct 2020 Census Housing
Unit Count Review.
•• August 2020: Conduct 2020 Census GQ Count
Review.
•• November 2020: Conduct 2020 Census Review of
the Census Unedited File, Census Edited File, and
Microdata Detail File.
•• August 2021: Issue 2020 Census Count Review
Program Operational Assessment.

5.6.4 Count Question Resolution
Detailed Planning Status: Not Started
Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau's preliminary thoughts as of the
release of this document.
Purpose
The Count Question Resolution (CQR) operation
provides a mechanism for governmental units to
challenge their official 2020 Census results.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations
were made:
•• Create a milestone schedule and ensure it is
followed.
•• Meet early and often so that all stakeholders
involved make decisions up front, before
beginning to program control systems or write
procedures.
•• Make sure planning tasks are completed on time
and everyone is aware of key decisions.
Opportunities to Innovate
No specific opportunities to innovate have been
identified to date for this operation.

Description of Operation
The CQR operation provides a mechanism for
governmental units to challenge the accuracy of
their final 2020 Census counts.
The CQR operation includes the following
activities:
•• Draft the proposed process and rules and publish
them in the Federal Register.
•• Finalize the process and rules and publish them
in the Federal Register.
•• Identify staffing needs and make temporary
appointments and reassignments.
•• Receive, investigate, and respond to all
challenges, including correcting errors found
within the established guidelines of the program.
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed.
Assumptions Made
Based on initial discussions, the following
assumption has been made:
•• This program will be conducted in a similar
manner to both the 2000 and 2010 Censuses.
Design Issues to Be Resolved
The following decisions need to be made for this
operation:
What is the approach for addressing unexpected
issues related to count or geographic discrepancies?
For example, in the 2010 Census, there were
some very specific issues with the way the Census
Bureau geocoded Navy ships in U.S. harbors.
•• Approach: Incorporate 2010 Census lessons
learned into the CQR Detailed Operational Plan.
Establish and monitor a CQR Risk Register, which
includes mitigation and contingency planning
activities.
•• Decision by: September 2018
Will the Census Bureau require challenging
governments to provide location information for
each housing unit they provide on their list?
•• Approach: Evaluate the 2020 Census frame-building
processes, including the frequency and quality of
location information provided by governmental units.
•• Decision by: September 2018
What documents and systems will be needed to
research and respond to challenges?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: September 2018
What types of challenges will be in scope?
•• Approach: Interdivisional teams will meet and
make recommendations on these matters. Federal
Register input will help determine final decisions
on the types of challenges that will be in scope.
•• Decision by: Publish the initial Federal Register
Notice in 2020 and the final Federal Register
Notice in 2021, so that challenges can be
accepted as soon as state and substate data are
published (approximately June 2021).
Cost and Quality
Investment in Count Question Resolution will have
minimal impact on the cost and quality of the 2020
Census as compared with the 2010 Census.
Risks
No risks have been identified to date for this
operation.

Milestones
•• January 2017: Begin planning and development
of the program schedule, process, and initial
Federal Register Notice.
•• September 2018: Release the Count Question
Resolution Detailed Operational Plan.
•• May 2020: Publish the initial Federal Register
Notice identifying the process and types of
challenges to be considered.
•• March 2021: Publish the final Federal Register
Notice to establish the process, timing, and types
of challenges in scope for the program.
•• June 2021: Begin accepting challenges from
governmental units.
•• 2021–2023: Issue revised certified counts as
appropriate and make them available on census.gov
through American FactFinder (or a similar
dissemination system).
•• June 2023: Deadline for governmental units to
submit challenges.
•• September 2023: End the program and issue the
assessment and lessons learned report.

5.6.5 Archiving
Detailed Planning Status: Not Started
Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau's preliminary thoughts as of the
release of this document.
Purpose
The Archiving operation performs the following
functions:
•• Provide records deemed permanent, including
files containing the individual responses to the
2020 Census, to NARA.
•• Provide similar files to the NPC to use as source
materials to conduct the Age Search Service.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations
were made:
•• Make sure staff are regularly reminded of their
records management responsibilities. They need
to understand the distinction between permanent
and temporary records, and the Census Bureau's
legal obligation to archive permanent records.
•• Start archiving planning (with an interdivisional
team) earlier in the life cycle; FY 2018 at the
latest is suggested.
•• Keep a log or spreadsheet of the materials
that the records schedule requires to be sent to
NARA, how they will be sent, the dates promised,
and the actual transfer dates.
Opportunities to Innovate
No specific opportunities to innovate have been
identified to date for this operation.
Description of Operation
The Census Bureau must provide copies of the
individual responses to the 2020 Census (including
names and addresses) to NARA. The specific
format, media, and timing for the delivery will be
negotiated between the Census Bureau and NARA
later in the decade. Because the primary use of
this information is for genealogical searches (to
be released no sooner than 72 years after Census
Day), the Census Bureau must also provide a
linkage between the individual response data and
the copies of questionnaires on paper, microfilm,
or electronic images. This operation also provides
similar data to support the Census Bureau Age
Search Program at NPC.
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed.
Decisions Made
No decisions have been finalized for this operation.
Design Issues to Be Resolved
The following decisions need to be made for this
operation:
What are the format, media, and timing for the
delivery of individual responses to NARA?
•• Approach: The Census Bureau will work with
NARA to review records and make determinations
of permanent records.
•• Decision by: July 2021

Cost and Quality
Investment in Archiving will have minimal impact
on the cost and quality of the 2020 Census as
compared with the 2010 Census.
Risks
No risks have been identified to date for this
operation.
Milestones
•• Annually, beginning in 2016: Update the official
records plan, performed by the Records Manager
for each participating division.
•• June 2018: Begin identification and review of all
records that will be generated by or for the 2020
Census.
•• September 2018: Release the Archiving Detailed
Operational Plan.
•• April 2019: Begin negotiations with NARA to
make preliminary determinations of which records
will be deemed permanent and so must be
archived.
•• April 2021: Develop the final records schedule
with NARA and submit it for approval by the
Archivist.
•• July 2022: Begin transfer of permanent records
to NARA.
•• January 2023: Complete transfer of all permanent
records to NARA. Complete destruction of all
temporary records no longer needed by the
Census Bureau.

5.7 OTHER CENSUSES
Other Censuses comprises all functions associated
with the decennial censuses of American Samoa,
the Commonwealth of the Northern Mariana
Islands, Guam, and the U.S. Virgin Islands,
collectively known as the Island Areas. There is one
operation in this area: Island Areas Censuses.

5.7.1 Island Areas Censuses
Detailed Planning Status: Recently Begun
Purpose
The purpose of the Island Areas Censuses operation
is to update and enumerate all living quarters
in American Samoa, the Commonwealth of the
Northern Mariana Islands, Guam, and the U.S. Virgin
Islands, collectively known as the Island Areas (IA).
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations
were made:
•• The contracts with the IA local governments
need to stipulate the roles and responsibilities of
the census office managers, the onsite Census
Advisors, the officials of the local governments,
and the officials at Census Bureau headquarters.
•• The IA data collection operations and data
processing need to be more in line with stateside
operations and data processing.
•• The planning phase of the IA censuses should
involve data processing staff who can help create
testing strategies.
Opportunities to Innovate
•• Use of enterprise solutions optimized for the
2020 Census and the ACS for preparing 2020
Census IA data products and disseminating the
information to the public.
Description of Operation
The Census Bureau will conduct the 2020 Census
of the Island Areas through partnerships with
local government agencies in American Samoa,
Commonwealth of the Northern Mariana Islands,
Guam, and the U.S. Virgin Islands. The Census
Bureau will provide the materials and guidance
to the local government agencies that are then
responsible for recruiting and hiring the staff to
conduct the data collection phase. The data collection phase will consist of:
•• Opening and closing LCOs.
•• Address Canvassing.
•• Enumerating residents.
•• Follow-up operations.
•• Local Count Review.
•• Shipping completed materials to data processing
sites.
A contract agreed upon by the parties will outline
the specific responsibilities of the Census Bureau
and the local government agencies. Following the
completion of the data collection phase, Census
Bureau staff will prepare the data and disseminate
the information to the public.


Research Completed
Because detailed planning for this operation has
recently started, research that directly supports this
operation has not yet been completed.
Decisions Made
The following design decisions are based on the
2010 Census design and the planning of other
operations:
✓ Continuously engage and communicate the
Census Bureau's plans with liaisons in the local
IA governments, and with the Office of Insular
Affairs in the Department of the Interior.
✓ Revise maps with geospatial updates from the
2010 Census data, local data, site visits, and
satellite imagery.
✓ Establish contracts with the local IA governments
to conduct the census data collection.
✓ Establish five local census offices: two in the
U.S. Virgin Islands and one in each of the other
Island Areas.
✓ Use a "long-form like" questionnaire.
Changes that will be made for the design of
this operation for the 2020 Census include the
following:
✓ Build and maintain a first-ever MAF for each of
the IA for use in the 2020 Census and in
subsequent censuses.
✓ Use the ACS form with minor wording changes
to accommodate time reference differences,
incorporating the final 2020 Census questions.
✓ Use stateside systems whenever possible; some
modifications may be needed.
✓ Deploy Census Advisors to the local census
offices in 2019 to provide guidance throughout
the data collection process and to report back
to headquarters: one advisor for each of the
Pacific Island Areas (American Samoa, the
Commonwealth of the Northern Mariana Islands,
and Guam), and two advisors for the U.S. Virgin
Islands (one for St. Thomas and St. John, and
one for St. Croix).
Design Issues to Be Resolved
The following decisions need to be made for this
operation:
Will the IA TEA use an Update/Enumerate strategy?
•• Approach: Based on the Census Bureau's ability
to create and update a MAF for the IA.
•• Decision by: September 2017
To what degree will online self-response be
available to IA respondents?
•• Approach: Investigate feasibility based on
analysis of Internet access and speed and the
applicability of existing Internet and non-ID
capabilities to support unique IA addresses.
•• Decision by: September 2017
Which enterprise systems can be used to support
the IA Censuses operation, and what modifications
are needed to these systems?
•• Approach: Research during a separate proposed
test of IA operations in FY 2016. Based on test
results, work with the IT staff to incorporate
Island Area requirements into existing systems.
•• Decision by: September 2017
How will the IA questionnaire differ from the
then-current ACS form?
•• Approach: Work with internal and external
stakeholders in fiscal years 2015 through
2017 to determine the final content of the
questionnaire.
•• Decision by: December 2017
Cost and Quality
Investment in the 2020 Census of the IA will have
minimal impact on the cost and quality of the
2020 Census as compared with the 2010 Census.
Risks
The goal for the IA 2020 Censuses is to implement
a UE operation, which requires an existing address
frame in the form of a MAF. IF adequate resources
are not allocated to develop the IA MAF, THEN the
design of the field listing and enumeration
methodologies cannot be finalized in time to meet
the milestone of releasing the IA Censuses Detailed
Operational Plan.


The IA Censuses operation has many unique
requirements and may not be able to leverage
enterprise solutions without significant modifications. IF the IA team cannot identify a stateside
system capable of processing Island Areas data
by September 2017, THEN the IA team will have
to find an alternate data processing system and
resources, which will increase the cost and affect
the schedule of the IA 2020 Censuses.
Milestones
•• September 2013: Establish quarterly contact with
IA government officials.
•• September 2017: Release the IA Censuses
Detailed Operational Plan.
•• March 2018: Decide what, if any, stateside
systems can be used for the 2020 IA census
operations.
•• March 2018: Obtain Office of Management and
Budget clearance for data collection materials.
•• June 2018: Finalize plans for the IA census
operations.
•• September 2018: Award contracts with the IA
governments.
•• June 2019: Open Area Census Offices in American
Samoa, the Commonwealth of the Northern
Mariana Islands, Guam, and St. Thomas and
St. Croix of the U.S. Virgin Islands.
•• September 2020: Close the Area Census Offices
in the IA and end their contracts.
•• December 2020: Publish the IA population counts.
•• September 2023: Complete the IA detail data
publications.

5.8 TEST AND EVALUATION
The Test and Evaluation area performs two primary
functions:
•• Evaluate the quality of the 2020 Census.
•• Prepare for the 2030 Census.
This area includes four operations:
•• Coverage Measurement Design and
Estimation: Designs the post-enumeration
survey, including sampling and estimation, and
demographic analysis.
•• Coverage Measurement Matching: Identifies
matches and nonmatches between the 2020
Census and the Census Coverage Measurement
(CCM) Survey for the enumerated housing units
and people.
•• Coverage Measurement Field Operations:
Collects person and housing unit information
(independent from the 2020 Census operations)
for the sample of housing units in the CCM
Survey.
•• Evaluations and Experiments: Measures the
success of critical 2020 Census operations and
formulates and executes an experimentation
program to support early planning and inform the
transition and design of the 2030 Census.
Each operation is described below.

5.8.1 Coverage Measurement Design and Estimation
Detailed Planning Status: Not Started
Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau's preliminary thoughts as of the
release of this document.
Purpose
The Coverage Measurement Design and Estimation
operation develops the survey design and sample
for the post-enumeration survey for the 2020
Census. It also produces coverage error estimates
and an independent assessment of coverage via
demographic analysis.
Lessons Learned

Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations
were made:
•• Simplify the sampling operations, the data
collection, the matching operations, and the
estimation by eliminating the creation and use of
block clusters, provided the basic collection unit
concept is similar to the 2010 block cluster.
•• Follow best practices from the 2010 Census
Coverage Measurement operations, where the
Census Bureau anticipated potential changes
in implementing the sample design, allowing
changes to sample design requirements to
be easily handled given the implementation
approach.
•• Use the Planning Database for designing the
Census Coverage Measurement sample.

Opportunities to Innovate
No specific opportunities to innovate have been
identified to date for this operation.
Description of Operation
The description below is based primarily on the
operational design of the 2010 Census Coverage
Measurement Program.
The Coverage Measurement Design and Estimation
operation performs the following functions:
•• Develop the survey design for the post-enumeration
survey for the 2020 Census.
•• Design and implement the sample to support
the estimation of census coverage in the
2020 Census for the United States and Puerto
Rico, excluding Remote Alaska.
•• Produce estimates of net coverage error and the
components of census coverage for housing
units and persons living in housing units for the
United States and Puerto Rico, excluding Remote
Alaska.
•• Produce independent assessments of census
coverage via demographic analysis, using population and housing unit benchmarks in support
of the 2020 Census and the evaluation of the
2020 Census.
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed.
Assumptions Made
Based on the 2010 Census design and planning of
other operations for the 2020 Census, the following assumptions have been made:
•• Use the capture-recapture, dual-system estimation methodology, similar to the 2010
CCM approach, to measure the 2020 Census
coverage.
•• Maintain the independence of the Coverage
Measurement Survey operations from the 2020
Census operations.
•• Automate all Coverage Measurement Survey
data collection instruments.
•• Take advantage of directorate and enterprise
automation processes.

•• Continue to use Demographic Analysis as an
input to coverage measurement estimation as in
the 2010 Census.
•• The Demographic Analysis program will be the
primary source for administrative records-based
estimates of the total population by age, sex,
and the Demographic Analysis race categories
for comparison with the 2020 Census counts.
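The capture-recapture (dual-system estimation) methodology assumed above can be illustrated with the basic Lincoln-Petersen estimator: the census and the independent CCM survey each enumerate the population, matched persons link the two lists, and the estimated total is the product of the two counts divided by the number of matches. This is a simplified sketch with invented numbers; the Census Bureau's production estimators additionally adjust for erroneous enumerations and other factors.

```python
def dual_system_estimate(census_count, survey_count, matched_count):
    """Lincoln-Petersen estimator: N_hat = (N1 * N2) / M.

    census_count: persons enumerated in the census (N1)
    survey_count: persons enumerated in the independent survey (N2)
    matched_count: persons found on both lists (M)
    """
    if matched_count == 0:
        raise ValueError("no matches: estimate undefined")
    return census_count * survey_count / matched_count

# Invented example: 900 census persons, 800 survey persons, 720 matched.
n_hat = dual_system_estimate(900, 800, 720)
print(n_hat)        # → 1000.0, the estimated true population
print(n_hat - 900)  # → 100.0, the implied net undercount in the census
```

The independence assumption stressed in the bullets above is what makes this estimator valid: if inclusion in the survey were correlated with inclusion in the census, the match rate would misstate census coverage.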
Design Issues to Be Resolved
The following decisions need to be made for this
operation:
Are estimates of component error a goal for
the 2020 Coverage Measurement Design and
Estimation Program?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: December 2015
What are the effects on estimates of potential operational and systems changes?
•• Approach: Research with 2010 Census data and
conduct operational simulations and tests.
•• Decision by: March 2016
When should the Coverage Measurement Design
and Estimation Operation estimates be released?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2016
How can vital statistics be better used, or combined with other data sources to improve the
Demographic Analysis estimates by age and
sex, and to better estimate or expand the race
and Hispanic origin categories for which the
Demographic Analysis estimates are produced?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2016
What is the optimal sampling plan that balances
estimation plans and operational considerations?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2016
At what level of geography will the Coverage
Measurement Design and Estimation operation
produce estimates?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: September 2016
How will the Census Bureau ensure independence
between the coverage measurement survey and the
census?
•• Approach: Determined through the development
of the Detailed Operational Plan.
•• Decision by: September 2016
When will the first test of the 2020 Census
Coverage Measurement Design and Estimation
operation be conducted?
•• Approach: Researched in the 2017 Census Test
(proposed).
•• Decision by: September 2017
Cost and Quality
The Coverage Measurement Design and Estimation
operation will have minimal impact on the cost and
quality of the 2020 Census as compared with the
2010 Census.
Risks
By this point in the decade, planning for all major
2020 Census operations should be underway.
Budget reductions in FY 2013 through FY 2015
delayed planning for this operation. IF planning
efforts are not initiated at the start of FY 2016,
THEN there may not be sufficient time to implement
innovations related to this operation.
Milestones
•• October 2016: Start Coverage Measurement
Design and Estimation.
•• September 2017: Release the Coverage
Measurement Design and Estimation Detailed
Operational Plan.
•• August–September 2019: Start 2020 Census
Coverage Measurement Design and Estimation
Sample Design.
•• February–April 2019: Select Coverage
Measurement Design and Estimation Sample BCUs.
•• January 2020: Conduct Small BCUs Subsampling.
•• March–April 2020: Identify Coverage Measurement
Design and Estimation Person Interview Sample.
•• December 2020–January 2021: Generate Coverage
Measurement Design and Estimation Person
Estimates.
•• January–February 2021: Generate Coverage
Measurement Design and Estimation Housing Unit
Estimates.
•• January–March 2021: Produce Estimation Reports.
•• April 2021: Release Estimation Reports.
•• End Coverage Measurement Design and
Estimation.

5.8.2	 Coverage Measurement Matching
Detailed Planning Status:

Not Started

Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau’s preliminary thoughts as of the
release of this document.


Purpose
The purpose of this operation is to identify
matches and nonmatches between the 2020
Census and the Census Coverage Measurement
Survey, for both housing units and people, including computer and clerical components.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Simplify the Coverage Measurement clerical
matching tasks.
•• Rely more on the automated matching systems
than the clerical matchers.
•• Move housing unit matching and follow-up operations closer to the listing operation.
•• Automate the assignment of status codes and
address information where possible.
Opportunities to Innovate
No specific opportunities to innovate have been
identified to date for this operation.
Description of Operation
The description below is based primarily on the
operational design of the 2010 CCM Matching
Program.
The Coverage Measurement Matching operation
includes:
•• Housing Unit Matching: Links the housing unit
addresses in the sample and the initial census
addresses in the MAF using automated computer
matching and clerical matching techniques.
•• Person Matching: Links the persons in the sample and the census using automated computer
and clerical matching techniques.
•• Final Housing Unit Matching: Links the housing
unit addresses in the sample and the final
census addresses using automated computer
matching and clerical matching techniques.
Housing Unit, Person, and Final Housing Unit
Matching utilize two different methods:
•• Computer matching of addresses or persons
is conducted using software that assigns a
probability that the addresses or people match.
A threshold is identified to indicate cases that

are definite matches, another to indicate cases
that are definite nonmatches, and the cases in
between these points are considered possible
matches. When the intent is to identify duplicates, a similar process is used, resulting in a
set of duplicate cases, nonduplicate cases, and
possible duplicate cases.
•• Clerical matching is conducted by clerical matchers utilizing the matching software. The software displays the results of computer matching
and allows the matchers to review and correct
any results. Matchers must review and code all
the possible matches or duplicates and can also
correct cases determined as linked or nonlinked
by the computer matcher. In addition, clerical
matchers must geocode new addresses collected
that are not computer geocoded and assign
residence status codes and housing unit status
codes. The clerical matchers are provided the
actual respondent information from follow-up
activities, so they can review a whole household
composition and any interviewer notes about
the cases to help with their analysis.
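The two-threshold scheme described above partitions computer-scored record pairs into definite matches, definite nonmatches, and an in-between band that is routed to clerical matchers. The sketch below illustrates only the partitioning step; the scores and cutoffs are invented, and real matching systems derive both from probabilistic (Fellegi-Sunter-style) models rather than hard-coding them.

```python
def classify_pairs(scored_pairs, lower=0.30, upper=0.85):
    """Partition (pair_id, match_probability) records by two thresholds.

    probability >= upper -> definite match
    probability <= lower -> definite nonmatch
    in between           -> possible match, sent to clerical review
    Thresholds here are illustrative, not operational values.
    """
    matches, nonmatches, possibles = [], [], []
    for pair_id, p in scored_pairs:
        if p >= upper:
            matches.append(pair_id)
        elif p <= lower:
            nonmatches.append(pair_id)
        else:
            possibles.append(pair_id)
    return matches, nonmatches, possibles

pairs = [("A", 0.95), ("B", 0.10), ("C", 0.55), ("D", 0.88)]
m, n, p = classify_pairs(pairs)
print(m, n, p)  # → ['A', 'D'] ['B'] ['C']
```

The same partitioning logic applies when the goal is duplicate detection: pairs fall into duplicates, nonduplicates, and possible duplicates for clerical review.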
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed.
Decisions Made
No decisions have been finalized for this operation.
Design Issues to Be Resolved
The following decisions need to be made for this
operation:
What computer matching and clerical matching
systems will be used for CCM?
•• Approach: Determined through the development of the Detailed Operational Plan.
•• Decision by: September 2016
When will the first test of the 2020 Census
Coverage Measurement Operations be conducted?
•• Approach: Researched in the 2017 Census Test
(proposed).
•• Decision by: September 2017


Cost and Quality
Investment in Coverage Measurement Matching will
have minimal impact on the cost and quality of the
2020 Census as compared with the 2010 Census.
Risks
By this point in the decade, planning for all major
2020 Census operations should be underway.
Budget reductions in FY 2013 through FY 2015
delayed planning for this operation. IF planning
efforts are not initiated at the start of FY 2016,
THEN there may not be sufficient time to implement innovations related to this operation.
Milestones
•• September 2017: Release the Coverage
Measurement Matching Detailed Operational Plan.
•• January–February 2020: Conduct Initial Housing
Unit Computer Matching.
•• February–April 2020: Conduct Initial Housing Unit
Clerical Matching.
•• August–September 2020: Conduct Person
Computer Matching.
•• September–November 2020: Conduct Person
Clerical Matching.
•• November 2020: Conduct Final Housing Unit
Computer Processing.
•• November–December 2020: Conduct Final
Housing Unit Clerical Matching.

5.8.3	 Coverage Measurement Field
Operations
Detailed Planning Status:

Not Started

Detailed planning for this operation has not
started. The narrative that follows represents the
Census Bureau’s preliminary thoughts as of the
release of this document.
Purpose
The Coverage Measurement Field Operations
collect person and housing unit information
(independent from the 2020 Census operations) for
the sample of CCM Survey housing units. CCM
collects the same data as the 2020 Census for both
housing units and persons, as well as additional
information to help the Census Bureau understand
coverage and detect erroneous enumerations.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Automate all Coverage Measurement data collection instruments.
•• To ensure more accurate data, minimize the time
lag between the follow-up operations where
beneficial.
•• Consider including an early telephone phase
prior to personal visit for the Person Interview
operation.
Opportunities to Innovate
Opportunities to innovate include the following:
•• To the extent feasible and practical, the CCM
Field Data Collection Operation will leverage
the use of automation and the field reengineering concepts under development for In-Field
Address Canvassing and Nonresponse Followup
operations.
Description of Operation
This operation collects person and housing unit
information for the sample of CCM Survey housing
units. The description below is based primarily
on the operational design of the 2010 Census
Coverage Measurement Program. When detailed
planning begins, it will focus on determining the
objectives of the 2020 CCM Program, taking into
consideration other aspects of the 2020 Census
operational design. The focus should be on defining the CCM Program for the 2020 Census that
is integrated with other census operations, fully
tested, and is designed to resolve issues identified
by the Program.
Based on the 2010 Coverage Measurement Program
design, this operation includes the following five
CCM Survey field data collection sub-operations:
• Independent Listing: In this operation, listers walk all areas of the sample BCUs and list all the housing units in the sample area from scratch; that is, no MAF information is used in this operation. This is an independent listing. Listers knock on all housing units to inquire whether there is more than one housing unit at the addresses listed (such as a basement or garage apartment), and any additional units are listed separately.
• Initial Housing Unit Follow-Up: The list of CCM housing unit addresses in the sample is matched to the initial census MAF list of addresses in the same sample areas to identify matches and possible matches between the two lists, duplicates and possible duplicates in either list, and nonmatches in either list. The cases (addresses) that are in one list and not the other (nonmatches) and those identified as possible matches or possible duplicates are sent back for an Initial Housing Unit Follow-Up interview. Following this operation, an additional matching using the follow-up results is conducted. The results identify the list of housing units in the CCM sample to be included in the CCM person operations.
• Person Interview: Collects person information for the CCM Survey sample housing units by performing in-person interviews using a computer-assisted data collection instrument. The enumerators collect data similar to those collected in the 2020 Census, as well as additional data about persons in the household to determine if any of these people may have been counted at other addresses on Census Day.
• Person Follow-Up: Collects additional information in a follow-up operation when there is insufficient information for estimation. The list of people in the CCM sample housing units is matched to the people listed in the census in the same sample areas to identify matches and possible matches between the two lists, duplicates and possible duplicates in either list, and nonmatches in either list. The nonmatched persons (those in only one list and not the other) and those identified as possible matches or possible duplicates are sent back for the Person Follow-Up interview to obtain additional information. The collected information is used after follow-up matching to resolve the cases, and the results are used in the estimation of person coverage.
• Final Housing Unit Follow-Up: After completion of census operations, the updated MAF list of addresses is matched to the CCM list of addresses to identify additional matches, nonmatches, or duplicates. Unresolved cases are sent back to the field for the Final Housing Unit Follow-Up operation. The resulting data are sent to Final Housing Unit Matching and then used in the housing unit coverage estimation.
As the Census Bureau designs this operation, it will
consider whether any of the address listing can be
done using in-office methods (similar to In-Office
Address Canvassing) and whether administrative
records and third-party data can be used to support person interviews, recognizing that the same
administrative records and third-party data sources
used during Nonresponse Followup cannot be used
for CCM to ensure an independent evaluation.
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports this
operation has not yet been completed; however,
the Coverage Measurement Field Operations will
leverage research conducted to support other field
operations such as In-Field Address Canvassing and
Nonresponse Followup.
Assumptions Made
Based on planning of other operations, the following assumptions have been made:
• CCM housing unit data collection will use the Listing and Mapping Instrument enterprise solution instrument.
• The CCM Survey operations will be maintained independently of the 2020 Census.
• All CCM Survey data collection will be automated and will leverage systems and tools used in other field operations where feasible.
• Directorate and enterprise automation processes will be leveraged whenever possible.
Design Issues to Be Resolved
The following decisions need to be made for this
operation:
Will the CCM person data collection instruments need a larger form factor (possibly a tablet) for automated instruments instead of a smartphone?
• Approach: Operation design.
• Decision by: September 2016

Will there be an additional telephone operation needed before the CCM Person Interview?
• Approach: Operation design.
• Decision by: September 2016

When will the first test of the 2020 Census Coverage Measurement Operations be conducted?
• Approach: Researched in the 2017 Census Test (proposed).
• Decision by: September 2017
Cost and Quality
Investment in Coverage Measurement Field
Operations will have minimal impact on the cost
and quality of the 2020 Census as compared with
the 2010 Census.
Risks
By this point in the decade, planning for all major
2020 Census operations should be underway.
Budget reductions in FY 2013 through FY 2015
delayed planning for this operation. IF planning
efforts are not initiated at the start of FY 2016,
THEN there may not be sufficient time to implement innovations related to this operation.
Milestones

Date                     Activity
September 2017           Release the Coverage Measurement Field Operations Detailed Operational Plan.
October–December 2019    Conduct CCM Independent Listing and Quality Control.
March–April 2020         Conduct Initial Housing Unit Follow-Up and Quality Control.
May–June 2020            Conduct CCM Person Interview and Quality Control.
October–November 2020    Conduct CCM Person Follow-Up and Quality Control.
November–December 2020   Conduct Final Housing Unit Follow-Up and Quality Control.

5.8.4	 Evaluations and Experiments
Detailed Planning Status: Not Started

Detailed planning for this operation has not started. The 2020 Census Evaluations and Experiments operation is unlike other 2020 Census operations in that, at its start, the Census Bureau will follow a process to establish and reach consensus on the set of evaluations and experiments to be conducted as part of the 2020 Census Program. The details that follow address various aspects of the planning process more so than the detailed scope of the 2020 Census evaluations and experiments themselves. The detailed scope of evaluations and experiments will result from the formulation process. The initial planning, the formation of governing bodies, the solicitation of input, and the agreement on scope of the 2020 Census Evaluations and Experiments operation are dependent upon funding.

Purpose
Evaluations document how well the 2020 Census was conducted; evaluations analyze, interpret, and synthesize the effectiveness of census components and their impact on data quality and/or coverage. Experiments identify potential designs for early 2030 Census life-cycle research and testing; experiments are quantitative or qualitative studies that must occur during a decennial census in order to have meaningful results to inform planning of future decennial censuses. In general, experiments involve response comparisons between tests, new or modified methods, or procedures against 2020 Census production methods or procedures.

The Evaluations and Experiments operation performs the following functions:
• Measures the success of critical 2020 Census operations and processes.
• Formulates a 2020 Census experimental program that will further refine 2030 Census operational design options.
• Contributes to the formulation of the 2030 Census Research and Testing phase objectives.
• Develops a transition plan and appropriate organizational structures to establish 2030 Census life-cycle planning.
• Initiates other early planning activities for the 2030 Census, including the monitoring of policy concerns and technological, societal, and public cooperation trends.
Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations associated with the development and management of the 2020 Census Evaluations and Experiments operation were made:
• Deployment of a Knowledge Management database to capture and track 2010 Census recommendations, recommendations from oversight bodies, and early 2020 Census research and testing results would be valuable for connecting past experiences and research to future research and planning objectives.
• Dedicated resources are needed earlier in the 2020 Census life cycle to initiate 2030 Census life-cycle planning efforts to enable a smooth transition from the 2020 Census implementation to the 2030 Census research.
Opportunities to Innovate
At its core, the scope of the 2020 Census Evaluations and Experiments operation will focus on aspects of the 2020 Census design that could lead to 2030 Census innovations. As the 2020 Census operational design solidifies, the Evaluations and Experiments operational process will define the 2020 Census Evaluations and Experiments, identify data requirements, and document methods to address research objectives.

To date, opportunities to innovate, as documented below, focus primarily on aspects of the planning and scope definition process. These opportunities to innovate include the following:
• Implementing a Knowledge Management system and application for the 2020 Census Directorate.
• Formulating 2020 Census evaluations and experiments that are more formally guided by the decisions on the 2020 Census operational design and the 2030 Census planning and objectives.
• Formulating Fiscal Years 2022–2024 Research and Testing objectives that are more formally guided by 2030 planning and objectives.
• Formulating 2030 Census life-cycle budget simulations that are more formally aligned with strategic planning and research objectives.

Description of Operation
To initiate the formulation of the 2020 Census Evaluations and Experiments operation, an understanding of the 2020 Census operational design is necessary. In general, what is in scope for the 2020 Census operations sets the landscape from which evaluations will be identified. The 2020 Census design options made out of scope provide the initial canvass for potential experiments. The formulation phase involves:
• Executive Staff guidance on strategic principles and high-level research targets.
• Feedback from internal Program Managers, operational subject matter experts, and Senior Researchers/Methodologists.
• Feedback from oversight groups, advisory committees, the international collaboration consortium, the National Academy of Sciences, and other external experts.
• Recommendations from census research and testing, as captured in the Knowledge Management application.
• Establishing parameters (e.g., cost, quality, risks, and visibility) and criteria for selecting evaluation and experiment proposals.

Following formulation of the 2020 Census Evaluations and Experiments operation are development, implementation, program control, closeout, and coordination activities. These phases of the operation will be clearly described in future versions of the operational plan.

Research Completed
Because detailed planning for this operation has not yet started, research that directly supports this operation has not yet been completed.

Decisions Made
No decisions have been finalized for this operation.

Design Issues to Be Resolved
The following decisions need to be made for this operation:

What are the strategic principles and high-level research targets for guiding formulation of evaluations and experiments during the 2020 Census?
• Approach: Addressed once the working group is chartered and the plan is developed and approved by the Executive Staff.
• Decision by: December 2016

What are the parameters (cost, quality, risks, visibility, etc.) and criteria for selecting and prioritizing evaluation and experimentation proposals?
• Approach: Addressed once the working group is chartered and the plan is developed and approved by the Executive Staff.
• Decision by: December 2016

Given the strategic principles for guiding formulation of evaluations and experiments and the parameters and criteria for selecting and prioritizing evaluation and experimentation proposals, what is the defined set of 2020 Census Evaluations and 2020 Census Experiments?
• Approach: Solicitation of feedback and application of principles, parameters, and criteria to define the scope of the Evaluations and Experiments.
• Decision by: December 2018

Cost and Quality
Investment in Evaluations and Experiments will have minimal impact on the cost and quality of the 2020 Census as compared with the 2010 Census.

Risks
The Evaluations and Experiments operation for the 2010 Census was launched in October 2006 (FY 2007) with the establishment of a governing Executive Steering Committee to provide guidance on key research objectives for the 2010 Census program and seek out feedback from external stakeholders. IF the 2020 Census Evaluations and Experiments operation is not established and funded in FY 2016, THEN the decreased lead time to formulate experiments and evaluations, including getting feedback earlier from external stakeholders, will jeopardize having a robust and meaningful operation to inform research and testing beyond the 2020 Census.

Opportunities to evaluate the 2020 Census and conduct experiments to inform the design of the 2030 Census are extensive, requiring a governing body to establish scope. IF an Executive Steering Committee is not established to govern evaluations and experiments for the 2020 Census program, THEN the program will lack the necessary guidance on scope, have reduced external visibility, and affect overall program endorsement.

Milestones

Date             Activity
December 2016    Baseline research plans for 2020 Census Experiments.*
September 2017   Release the Evaluations and Experiments Detailed Operational Plan.
October 2018     Receive Office of Management and Budget clearances for 2020 Census Evaluations.
December 2018    Baseline research plans for 2020 Census Evaluations.*
July 2019        Begin issuing results for 2020 Census Evaluations.
October 2019     Receive Office of Management and Budget clearances for 2020 Census Experiments.
July 2020        Baseline 2030 Census alternative design options for research.
August 2020      Begin issuing results for 2020 Census Experiments.
October 2020     Finalize objectives for the 2030 Census research and testing phase.
October 2021     Begin the 2030 Census research and testing phase.
July 2022        Finalize research results for 2020 Census Experiments.
April 2023       Finalize research results for 2020 Census Evaluations.

*Research plans pertain to detailed study plans for individual evaluations and experiments. The Detailed Operational Plan for Evaluations and Experiments pertains to high-level research objectives, the Business Process Model, systems, locations, and staffing strategy to support and implement the program.

5.9 INFRASTRUCTURE
The following four operations support the infrastructure of the 2020 Census:
• Decennial Service Center: Supports 2020 Census field operations and handles all service requests initiated by field staff.
• Field Infrastructure: Coordinates space acquisition for and lease management of the Regional Census Centers (RCC) and field offices and provides the administrative infrastructure for data collection operations covering the 50 states, the District of Columbia, and Puerto Rico.
• Decennial Logistics Management: Provides logistics management services, including procuring warehouse space, warehousing, inventory management, kit assembly, deployment of materials, and receiving and excessing materials.
• IT Infrastructure: Provides the IT infrastructure to support the 2020 Census, including enterprise systems and applications, 2020 Census-specific applications, field IT infrastructure, and mobile computing.
Each operation is described below.

5.9.1	 Decennial Service Center
Detailed Planning Status: Underway

Purpose
The Decennial Service Center (DSC) will support 2020 Census field operations and handle all service requests initiated by field staff.

Lessons Learned
Based on lessons learned from the 2014 and 2015 Census Tests, the following recommendations are made:
• Having the Service Center open during annual Census tests provides insight into potential issues that may arise during full 2020 Census operations.
• Having Service Center staff involved in User Acceptance Tests helps them gain a better understanding of possible issues that may occur in the field.
• Fund support staff from the beginning of testing through 2020 Census production; otherwise, there is no knowledge transfer from one test to the next. The DSC is only funded on a year-to-year basis, so all contractors are dismissed at the end of the contract. Training of Service Center staff absorbs a significant amount of time and resources that are lost if the Service Center is closed during periods when field operations are not under way.

Opportunities to Innovate
Opportunities to innovate include the following:
• Centralized service center system to provide a call management system and an incident and service management system supporting decentralized Service Center technicians (e.g., technicians based in Area Census Offices answering any call to the DSC).
• Online service center technician training: Provide online training for service center technicians as opposed to classroom training. Online training is more accessible than classroom training.
• Cloud technology for call management and incident management: Cloud technology will support the centralized service center system.
• Introduction of additional means for requesting support:
º Online live chat: DSC customers will be able to report problems via online live chat.
º Texting: DSC customers will be able to report problems via text.
º Smartphone applications: Field staff will be able to report problems via smartphone applications.

Description of Operation
The overall goal of the 2020 Census DSC operation is the design and deployment of an integrated service center, which will support field operations and handle all help or service requests initiated by field staff during the 2020 Census. These services include the following:
• Password resets for all 2020 Census applications, including LUCA.9
• Resolution of software and hardware issues from field offices and field staff, such as those experienced by users of the Decennial Applicant Payroll and Personnel System and mobile devices.
• Security incident management, such as petty theft, injuries, and stolen equipment.
• Communications to and from field offices to address such things as outages or software releases.

9 DSC is only providing password reset for LUCA; no further DSC support is anticipated for LUCA.

Major functions of the DSC include the following:
• Provide three major functions supporting 2020 Census Field Operations:
º Receive requests for service.
º Respond to requests for service.
º Report on requests for service.
• Provide Tier-1 support during the 2020 Census Tests.
º Tier-1 support will consist of resolving simple issues from the field in a specified period of time, such as password resets.
• Provide Tier-1 and Tier-2 support during the 2020 Census field operations.
º In addition to the Tier-1 support described above, Tier-2 support will consist of more complex issues requiring troubleshooting by specially trained staff with expertise in 2020 Census applications, such as MOJO, COMPASS, and the Listing and Mapping Instrument.
• Implement service-level agreements with Tier-3 support based on current operational standards of practice.
• Serve in a coordination and communication role in the event that a field office executes a Continuity of Operations Plan.
• Archive electronic records generated by the DSC in accordance with Census Bureau archiving policies.
Work Completed
The following research has been completed for this operation:
• Tested DSC use as part of the 2014 and 2015 Census Tests.
º Findings:
• Changes to PIN and password configurations for enumerators have reduced the number of calls expected for password resets.
• As the fingerprint vendor, USPS needs to be prepared to cover the expected call volume.10
• There was a lower-than-expected call volume for online training-related issues.

10 DSC is not planning to support this function for the 2020 Census.

Decisions Made
The following decisions related to the 2020 Census DSC operation have been made:
✓ The DSC will be limited to providing service center support for 2020 Census staff with technical issues related to 2020 Census enterprise organization applications.
✓ The DSC will provide support to field staff for the 2020 Census systems and applications.
✓ The DSC will provide support for various types of mobile devices and mobile operating systems.11
Design Issues to Be Resolved
Additional work is required to make decisions on the following questions:

What is the impact of automated training on call volume and call types?
• Approach: Researched in the 2015 Census Test.
• Decision by: February 2016

What is the impact on call volume of not having on-site IT support staff available during the Control Panel enumerator training?
• Approach: Researched in the 2015 Census Test.
• Decision by: February 2016

What new contracts will need to be awarded for the 2020 Census?
• Approach: Based on analysis of support operations during the annual Census Tests (2014–2017).
• Decision by: January 2017

What is the optimal service center staffing structure for the 2020 Census? Centralized or decentralized? Optimal staff ratios? Type of technical support needed in local field offices? Impact on services rendered of the number of field offices deployed and the number of field staff hired? Impact on services rendered of using wireless connectivity in the field offices?
• Approach: Based on comparison of annual test data (2014–2017) with data from the 2010 Census and an assessment of data from each of the annual Census Tests (2014, 2015, 2016, 2017).
• Decision by: January 2017

11 For BYOD, DSC will provide support for 2020 Census applications installed on personally owned devices; however, DSC will not support the personal device itself.

What methods will be available for contacting the DSC (e.g., live online chat, texting, and smartphone applications)?
• Approach: Based on pilot tests of new technologies during the annual Census Tests (2014–2017).
• Decision by: January 2017

Cost and Quality
Investment in the DSC will have minimal impact on the cost of the 2020 Census as compared with the 2010 Census (under review).

Quality impacts of this operation on overall 2020 Census quality include the following:
↑ Providing an efficient DSC operation will enhance the quality of data collection by enumerators during the 2020 Census.

Risks
The number of staff hired for the DSC will be heavily based on the expected volume of calls received. IF call volumes are not accurately forecast, THEN staffing levels for the DSC may be inaccurate.

Adjustments to DSC staffing levels and roles are based on the schedule and scope of the 2020 Census field operations. IF late or frequent changes to the 2020 Census field operations schedule or scope occur, THEN there may not be sufficient time to hire and train additional DSC staff as needed.

Milestones

Date             Activity
September 2015   Open DSC to support the 2016 Census Test.
September 2016   Start support for the 2017 Census Tests.
January 2017     Release the DSC Detailed Operational Plan.
                 Award the 2020 Census DSC contract.
September 2017   Start support for the 2018 Census End-to-End Test.
December 2017    Start support for the 2020 Census RCC.
January 2019     Start support for the 2020 Census Area Census Offices.
June 2021        Close the DSC.

5.9.2	 Field Infrastructure
Detailed Planning Status: Underway

Purpose
The Field Infrastructure operation performs the following functions:
• Coordinate space acquisition for, and lease management of, the RCC and Area Census Offices.
• Provide the administrative infrastructure for data collection covering the 50 states, the District of Columbia, and Puerto Rico, including:
º Recruiting.
º Personnel and payroll administration.
º Management and supervision.
º Clerical support.
º Hiring and onboarding.
º Training.
º Partnership support.
º Materials supply.
º Printing and plotting.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
• Establish an interagency working group to identify and develop effective strategies for space acquisition and build communication among stakeholders.
• Opening some field offices earlier than the others allowed for a "test" run of implementation in the space acquisition effort and improved the process for opening the remaining (majority of) offices.
• Streamline and automate the job application process to replace the paper-based recruitment and testing process.

Opportunities to Innovate
Opportunities to innovate include the following:
• Streamlined field management structure through the use of automation and technology to manage the Nonresponse Followup caseload.
• Automation of the job application and recruiting processes, the payroll submission and approval process, and other administrative processes to streamline personnel processes and reduce staffing requirements and related costs.
• Use of automation for training, including providing newly hired staff with electronic training modules.

Description of Operation
Field Infrastructure includes:
• Space acquisition or leasing, provisioning (specifications, schemas, designs, etc.), building out, and supplying the RCC and field offices that will open to support field operations.
• Providing human resources and personnel management support functions, including recruiting, hiring and onboarding (i.e., suitability and background checks), training, payroll, and out-processing (i.e., separation management).

Research Completed
The following research has been completed for this operation:
• Review of other countries' census field infrastructure.
º Findings: Best practices include consolidation of support functions in the field, specifically payroll, recruiting, and other administrative functions.
• Develop a new concept of operations for field infrastructure and test it in the 2015 Census Test.
º Findings: Field Staff Training:
• A combination of online and classroom training provided standardization of the information, provided tracking capabilities, and offered various learning methods.
• Reduced training hours compared with the 2010 Census Nonresponse Followup enumerator training, from 32 to 18 hours.
• Deployment of YouTube videos to quickly and efficiently provide targeted training to enumerators.
• Identified topics requiring additional training in future tests.
º Findings: Field Reengineering:
• Area Operations Support Center and staffing of the Area Operations Support Center successful.
• Electronic payroll successful.

Decisions Made
The following decisions related to the 2020 Census Field Infrastructure operation have been made:
✓ The 2020 Census field office infrastructure will include six RCCs.
✓ The RCCs will be located in the same metropolitan areas as the Regional Offices.
✓ Separate office space will be needed in the RCCs to support and manage Census Coverage Measurement operations.
✓ The preliminary RCC staffing model is as follows:
º General Management: one Regional Director and one Deputy Regional Director.
º Data Collection: two Assistant Regional Census Managers and one Regional Manager for Operations, who oversees five Census Operations Managers located in different field offices.
º Administrative Functions: one Assistant Regional Census Manager, one Recruiting Coordinator, two Administrative Coordinators, one Space, Leasing, and Supplies Coordinator, and one Lead Technical Support Coordinator (under review).
º Geography, Partnership, and Quality: one Assistant Regional Census Manager, one Regional Manager for Quality Assurance, two Partnership Coordinators, and one Geographic Coordinator.
✓ The 2020 Census field office infrastructure will include up to 250 field offices, a small subset of which will open a few months early to support early census operations, including In-Field Address Canvassing.
✓ The preliminary field office staffing model is as follows:
º General Management: one Census Operations Manager (reporting to the Regional Manager for Operations at the RCC), one Manager for Support Operations, and one Manager for Field Operations.
º Data Collection: multiple Field Managers for Operations, Local Supervisors for Operations, Trainers, and Enumerators; specific numbers based on workload; supervisory ratios to be determined.
✓ In-Field Address Canvassing will be managed out of the field offices.
✓ Recruiting activities will be automated.
✓ The job application and assessment (testing) process will be automated.
✓ Field staff training will employ the use of online training capabilities.
✓ The training pay rate will be lower than the production pay rate.
✓ The time and expense recording and approval process for data collection field staff will be automated for field operations.
✓ As part of the solution, the USPS will assist with onboarding processing for field staff.
Design Issues to Be Resolved
Additional work is required to make decisions on the following questions:

What is the approach for the recruiting and onboarding process?
• Approach: Research based on the 2015 and 2016 Census Tests.
• Decision by: January 2017

Where will the field offices be located?
• Approach: Based on analysis of the estimated In-Field Address Canvassing and Nonresponse Followup workloads.
• Decision by: January 2017
Cost and Quality
Cost impacts of this operation on overall 2020 Census costs include the following:

The investment in Field Infrastructure will have considerable impacts on the cost of the 2020 Census as compared with the 2010 Census through:
↓ Reduced office infrastructure for In-Field Address Canvassing and Nonresponse Followup operations.
↓ Increased efficiencies due to automated administrative functions, including recruiting, onboarding, training, and payroll.
↓ Increased cost savings due to reduced field staffing.

Quality impacts of this operation on overall 2020 Census quality include the following:
↑ Fewer enumerator errors resulting from use of automation to improve training methodology and supervision capabilities.
↑ Automated Job Application and Employment Assessment Testing.
↑ Automated Personnel and Payroll Administration (e.g., Time and Attendance Submission).
Risks
The infrastructure put in place to support the 2020
Census field operations is expected to manage
the workload regardless of how large it may be. IF
the field infrastructure is not sufficient to support
the work for the 2020 Census, THEN there is
significant risk of not being able to effectively or
efficiently manage the associated field workload,
which could have an impact on cost and data
quality.
The number of offices and staffing levels are heavily based on the expected workload for the field
operations that support the 2020 Census. IF late
design changes occur that impact the workload for
the field operations, THEN the number of offices
and staffing levels may need to increase.	

Milestones

Date		Activity
December 2015	Approve final field staff recruiting and training approaches.
March 2016	Finalize RCC space requirements. Finalize number of field offices.
September 2016	Release the Field Infrastructure Detailed Operational Plan.
January 2017	Finalize locations of field offices.
December 2017	Finalize field office space requirements. Begin opening RCCs.
January 2019	Begin opening field offices.
December 2020	Complete closing of all field offices.
June 2021	Complete closing of all RCCs.

5.9.3	 Decennial Logistics Management
Detailed Planning Status: Underway

Purpose
Decennial Logistics Management will provide logistics management services, including procuring warehouse space, warehousing, inventory management, kit assembly, deployment of materials, and receiving and excessing materials.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
• Purchase and deploy an Integrated Logistics Management System to gain cost benefits generated from bulk purchasing and significantly improve inventory control.
• Utilize barcode technology entirely, in conjunction with an Integrated Logistics Management System, to improve inventory control and reduce costs.
• Conduct training at local offices for inventory control, in conjunction with use of an Integrated Logistics Management System.
• Continue the belt-driven kit assembly line process.

Opportunities to Innovate
Opportunities to innovate include the following:
• Implementation of an online, real-time Enterprise Resource Planning (ERP) system.
• Implementation of a wireless network and bar code technology to automate inventory transactions.
• Policy and procedure to require full material and supply inventory accounting throughout the census using the ERP system.
• Extended implementation of, and access to, the ERP system for RCCs and field offices.

Description of Operation
The Decennial Logistics Management operation for the 2020 Census consists of:
• Setting up a warehouse and office to support RCC and field office deployments.
• Recruiting, hiring, and training human resources to support NPC logistics operations.
• Providing the means to provision RCCs, field offices, and field staff with supplies.
• Providing the RCCs and field offices with operating materials, supplies, and equipment.
• Providing other support functions (e.g., printing, shipping, kitting, non-IT equipment).
Work Completed
The following research has been completed for this
operation:
• Study of current literature regarding Third-Party Logistics Organizations.
  ◦ Findings: Given deadlines imposed by Third-Party Logistics Organizations, this approach is not consistent with the iterative development of 2020 Census requirements.
• Study of current literature on other logistics support models that may fit the characteristics of the 2020 Census.
  ◦ Findings: No new logistics models align with the major characteristics of the 2020 Census: limited and short duration, high variety and high mix of Operating Materials and Supplies per operation, and evolving data availability regarding quantities of Operating Materials and Supplies. Distributed warehousing will likely not work for the 2020 Census; it strongly implies that whatever is needed in each warehouse is well known ahead of time, which is not characteristic of a decennial census.
• The National Processing Center has implemented the first phase of the Integrated Logistics Management System project, including inventory management. The product, Syteline, is operational. The contractor and the Office of Information Security continue working to complete requirements for full Authority to Operate, anticipated by the end of September 2015.
Decisions Made
The following decisions related to the 2020 Census
Decennial Logistics Management operation have
been made:
✓ Logistics support for procurement, assembly, receiving, and deployment of non-IT operating materials, supplies, and equipment will be conducted by the NPC.
✓ Field Logistics support conducted by the NPC will occur at an off-site location due to space limitations within the current facility.
Design Issues to Be Resolved
What are the preliminary plans for the Operating
Materials and Supplies required to support the
2020 Census Operational design?
• Approach: Assess impact of operational design
for other operations on Operating Materials and
Supplies requirements through document review
and conversations with operation team leads.
• Decision by: December 2015
What are the preliminary plans for quantities of
Operating Materials and Supplies required to support operations?
• Approach: Assess impact of operational design
for other operations on Operating Materials and
Supplies requirements through document review
and conversations with operation team leads.
• Decision by: December 2015
What role will NPC have in IT deployments?

• Approach: Develop a list of logistical responsibilities for NPC by operation.
• Decision by: March 2016
Cost and Quality
The investment in Logistics improvements will have a considerable impact on the cost of the 2020 Census as compared with the 2010 Census through:
↓ Online, real-time inventory transaction updates.
  ◦ Better, up-to-date information for decision-making regarding ongoing procurement activities.
↓ Material requirements planning and resource requirements planning.
  ◦ Generates better information about the space and staff required to manage inventory and support field operations.
↓ Production planning and scheduling of logistics activities via proven, automated system features instead of manual processes.
  ◦ Reduces the reliance on spreadsheet management by providing automated planning and scheduling capabilities for this volatile census environment.
Risks
NPC will deliver baselined space requirements for the logistics operation to GSA by April 1, 2016, to accommodate an 18-month lead time before occupancy. Major changes to these requirements could create issues with space availability or require additional space to meet the changed material requirements. IF the NPC receives significant changes to requirements for Operating Materials and Supplies after the requirements for warehousing logistics have been baselined, THEN space requirements may change, necessitating additional warehousing space or resulting in underutilization of space already leased.
Receiving more information about operational requirements early in the planning and development stages tends to mitigate the need for, and the magnitude of, additional resources and costs. IF the NPC receives changes to operational requirements as the 2020 Census work progresses, THEN the cost of logistics operational support may change due to the need to add staff or implement overtime to avoid schedule delays.
Milestones

Date		Activity
April 2016	Initiate search and build out activities for Logistics Space.
September 2016	Release the Decennial Logistics Management Detailed Operational Plan.
March 2017	Initiate Equipment Leases for Logistics Functions.
October 2017	Occupy Logistics Space: installations complete and ready to operate.
May 2021	Close down Logistics Operations.

5.9.4	 IT Infrastructure
Detailed Planning Status: Underway

Purpose
The purpose of the IT Infrastructure operation is to provide the information technology-related infrastructure support to the 2020 Census, including:
• Enterprise systems and applications.
• Decennial specific systems, applications, and interfaces.
• Field IT infrastructure (RCC and field offices).
• Mobile computing.

Lessons Learned
Based on lessons learned from the 2010 Census, as well as the 2014 and 2015 Census Tests, the following recommendations are made:
• Provide nonfunctional and functional requirements that drive the design of the infrastructure (e.g., performance, availability, information about the users, monitoring, printing, reporting, and security).
• Use of prototypes and a test local census office helps validate the design of the IT infrastructure.
• Opening some field offices earlier than the others allowed for a "test" run of the deployment of the IT infrastructure, including the equipment and the telecommunications.
• The IT Infrastructure Readiness preparation and assessment process for the 2015 Census Test was instrumental and should continue to be used to improve remaining tests for the 2020 Census.
• Improvements are needed in assessing and approving requested changes to business and technical requirements.

Opportunities to Innovate
Opportunities to innovate include the following:
• Early development of solutions architecture.
• Use of enterprise solutions.
• Iterative deployment of infrastructure aligned with and based on testing.
• Use of workload demand models to size IT solutions appropriately.
• Scalable solutions.
• Agile development of applications.

Description of Operation
Each component of the IT Infrastructure operation is described below.

Enterprise Systems and Applications: This support area includes the planning and implementation of all hardware and software to support operations for the 2020 Census, as well as the management and monitoring of those systems, including but not limited to:
• CEDCaP Systems.
• CEDSCI Systems.
• Shared Services (Virtual Desktop Infrastructure, etc.).

Decennial Specific Applications: This support area includes the planning and implementation of all hardware and software to support operations for the 2020 Census, as well as the management and monitoring of those systems, including but not limited to:
• Recruiting, hiring, and on-boarding tools (including training).
• Personnel and payroll applications (e.g., Decennial Applicant Personnel Payroll System).
• Census Hiring and Employment Check and fingerprinting.

RCC and Field Office IT Infrastructure: This support area covers the deployment of IT capabilities in the form of office automation services to any RCC, field office, facility, or work location opened as part of the 2020 Census operations. It includes support for all field data collection operations through automated recruiting, hiring, staffing, training, fingerprinting, and mobile device support, including:
• Definition of functional and nonfunctional solution requirements for field offices.
• Development of the IT computing environment design.
• Procurement of circuits and IT equipment for the census field offices.
• Shipping, configuration, testing, and staging of IT equipment for the census field offices.
• Tear-down and disposition of IT equipment and circuits at the conclusion of the 2020 Census activities.

Field IT infrastructure requirements will provide, at a minimum, for the following:
• Decennial Service Center.
• National Processing Center.
• Regional Census Centers.
• Area Census Offices.
• Partnerships, if needed.
• Mobile offices and vehicles, if needed.
• Offices for outlying areas (Island Areas).
• Regional technicians.

Mobile Computing: By leveraging technology innovations such as MAM programs and secure applications provided via BYOD or Device as a Service, the Census Bureau will implement a flexible and efficient acquisition strategy to procure mobile devices and services for fieldworkers.

Work Completed
The following work has been completed for this operation:
• Established the Field IT infrastructure for the 2014 Census Test, 2014 SIMEX, and 2015 Census Test.
• Established the Headquarters IT infrastructure to support the 2014 Census Test, 2014 SIMEX, and 2015 Census Test. Mapped the IT infrastructure to each operational component being tested to evaluate and ensure readiness.
• Used a Mobile Device Management solution and a MAM solution to push and securely manage mobile applications on mobile devices.
• Provided infrastructure to support testing of:
  ◦ Internet Data Collection.
  ◦ Real-Time Non-ID Processing.

Decisions Made
The following decisions related to the 2020 Census IT Infrastructure operation have been made:
✓ An incremental approach will be used to define, deploy, and test the IT Infrastructure.
✓ Mobile devices will be used for field data collection.
✓ Whenever technically feasible and cost effective, enterprise solutions will be used in support of the 2020 Census.
✓ A hybrid cloud design will be used for all 2020 Census systems requiring scaling wherever possible.
✓ Virtual Desktop Infrastructure will be used for all RCC and field office staff.
Design Issues to Be Resolved
What cloud services are required to support the 2020 Census operational design (to include CEDCaP and non-CEDCaP)?
• Approach: Testing in FY 2016 with some key 2020 Census systems; acquisitions being put in place to meet these needs and those beyond 2016. Output of demand models will be used to develop performance requirements.
• Decision by: June 2016
What is the projected demand that the IT infrastructure and systems need to accommodate?
• Approach: External and internal demand models being developed and matured through testing.
• Decision by: June 2016 (to support acquisition of cloud computing services)
What is the solutions architecture (applications, data, infrastructure, security, monitoring, and service management) for the 2020 Census, including use of enterprise solutions?
• Approach: Maturation of the business architecture and solutions architecture in line with the refinements of the Operational Plan and test results.
• Decision by: September 2016
To what extent will BYOD and Device as a Service be used to support field operations?
• Approach: Testing in FY 2015 and FY 2016 will provide key insights into use of BYOD for the 2020 Census.
• Decision by: September 2016
What is the plan for the use of mobile devices for the 2020 Census, including the security platform for mobile devices (BYOD and Device as a Service), a BYOD acceptable use policy, and a BYOD reimbursement policy?
• Approach: Based on analysis of the performance of solutions fielded and tested.
• Decision by: October 2017
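The projected-demand question above is what the workload demand models are meant to answer. As a rough illustration only (the figures and formula below are hypothetical assumptions, not Census Bureau estimates or the Bureau's actual models), a demand model might translate an expected internet self-response workload into the peak concurrency an IT system must absorb:

```python
# Toy workload demand model, for illustration only.
# All inputs and the sizing formula are hypothetical assumptions.

def peak_concurrent_sessions(households: int,
                             internet_share: float,
                             collection_days: int,
                             peak_day_factor: float,
                             peak_hour_share: float,
                             avg_session_minutes: float) -> int:
    """Rough peak concurrent sessions an internet self-response system must absorb."""
    responses = households * internet_share
    avg_daily = responses / collection_days
    peak_day = avg_daily * peak_day_factor      # busiest day vs. an average day
    peak_hour = peak_day * peak_hour_share      # share of that day arriving in the busiest hour
    # Concurrency: arrivals in the peak hour times session length in hours.
    return round(peak_hour * (avg_session_minutes / 60.0))

# Hypothetical inputs: 140M households, 45% internet response over 60 days,
# busiest day 5x average, 15% of that day's traffic in one hour, 10-minute sessions.
print(peak_concurrent_sessions(140_000_000, 0.45, 60, 5.0, 0.15, 10.0))  # 131250
```

A model of this shape makes explicit why cloud elasticity matters: the peak is driven by the busiest hour of the busiest day, not by average load.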
Cost and Quality
Cost impacts of this operation on overall 2020 Census costs include the following:
The investment in IT Infrastructure will decrease the cost of the 2020 Census through:
↓ Leveraging enterprise solutions.
↓ Leveraging cloud computing to address peak performance requirements.
Quality impacts of this operation on overall 2020 Census quality include the following:
↑ Use of automation to collect real-time data, enabling better monitoring and management of the data collection activities.
↑ Automated Training and Knowledge Base.
↑ Sufficient mobile and networking infrastructure to effectively support field operations.
↑ Sufficient IT infrastructure to provide necessary levels of performance, to include acceptable interactions by the public, partners, and others.
↑ Robust processes for system development.

Risks
The IT infrastructure built to support the 2020 Census operations is dependent on the Business Process Models and Business Requirements developed by each operation. IF there are gaps in business representation in the development of Business Process Models and Business Requirements, THEN the appropriate IT infrastructure may not be in place to support the 2020 Census operations.
The technical solutions that will support the 2020 Census operations depend on the business requirements developed by each operation being further broken down into detailed solution-level technical and performance requirements. IF business requirements are not appropriately decomposed into solution-level technical and performance requirements, THEN technical solutions may not be designed and built in time to support the 2020 Census operations.

Milestones

IT Infrastructure Milestones
Date		Activity
September 2016	Release the IT Infrastructure Detailed Operational Plan. Finalize Definition of Field IT Infrastructure Solution Requirements. Award Contract for Field IT Infrastructure.
December 2016	Finalize Field Office IT Infrastructure Design.
November 2017	Begin Installation of IT Infrastructure for the Regional Census Centers.
June 2019	Begin Installation of IT Infrastructure for the Area Census Offices.

Cloud Testing and Readiness Milestones
Date		Activity
January 2015	Identify cloud computing as the assumed technical solution in support of the CEDCaP Decennial Infrastructure Scale-Up Project.
June 2015	Conduct initial testing of Internet Self-Response using cloud computing services.
September 2015	Acquire cloud computing services to support the 2016 Census Tests. Deliver initial output from the 2020 Census workload demand models, including Internet Response.
December 2015	Deliver initial baseline of decomposed 2020 Census solution-level performance requirements provided by 2020 Census Integrated Project Teams.
March 2016	Complete 2020 Census technical solution-level requirements, including performance requirements.
June 2016	Deliver analyses of alternatives and recommended solutions architecture, to include cloud computing as a solution alternative, in support of technical solution-level requirements.
September 2016	Acquire cloud computing services to support the 2017 Census Test and future Census Tests.
June 2017	Leverage cloud computing in support of the 2017 Census Test and analyze test results. Modify workload demand models and technical solution architecture.
June 2018	Leverage cloud computing in support of the 2018 Census End-to-End Test and analyze test results. Modify workload demand models and technical solution architecture.
September 2019	Ensure readiness of final cloud computing solution for the 2020 Census.


6. Key Program-Level Risks
The 2020 Census Risk Management process
consists of activities performed to reduce the
probability and consequences of events that could
negatively affect the 2020 Census Program’s
ability to meet its objectives. The goal of the risk
management process is to ensure a common,
systematic, and repeatable assessment approach
at both the program- and project-level so that
risks can be effectively identified and managed,
as well as clearly communicated to management, stakeholders, and executive-level decision-makers. Risk management is iterative and designed to be performed continuously throughout the 2020 Census Program's Research and Testing, Development, and Implementation phases.
Figure 34 shows the current risk matrix for all risks
in the 2020 Census Program Risk Register, as of
August 31, 2015.

[Figure 34 is a 5-by-5 risk matrix plotting Probability (vertical axis, rated 1 to 5) against Impact (horizontal axis, rated 1 to 5), with the count of risks from the 2020 Census Program Risk Register falling in each cell.]

Figure 34: 2020 Census Program-Level Risk Matrix

From the 2020 Census Risk Register, 11 key
program-level risks are highlighted in the sections
below. These risks were selected from the risk
register because members of the 2020 Census Risk
Review Board agreed these 11 key risks represent
the major concerns that could affect the design or
the successful implementation of the 2020 Census.


Along with the risk statement, the probability
rating, the impact rating, the risk exposure, and
the risk color are provided for each risk. Mitigation
strategies are also provided. For information about
all the program-level risks, the full program risk
register is available upon request.
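As a rough illustration of how a probability rating and an impact rating combine into a risk exposure, the sketch below classifies a risk the way the examples in this chapter are labeled. The scoring rule and thresholds are assumptions chosen to be consistent with the ratings shown in Sections 6.1 through 6.10 (e.g., Probability 3 with Impact 5 is HIGH, while Probability 3 with Impact 4 is MEDIUM); the Census Bureau's actual scoring methodology is not specified in this plan.

```python
# Illustrative sketch only: thresholds are assumptions matched to the
# example risks in this chapter, not the Census Bureau's published rules.

def risk_exposure(probability: int, impact: int) -> str:
    """Classify a risk given 1-5 probability and impact ratings."""
    if not (1 <= probability <= 5 and 1 <= impact <= 5):
        raise ValueError("ratings must be integers from 1 to 5")
    score = probability * impact
    if score >= 15:
        return "HIGH"
    if score >= 8:
        return "MEDIUM"
    return "LOW"

# Example: the funding risk in Section 6.1 (Probability 4, Impact 5).
print(risk_exposure(4, 5))  # HIGH
```

Under this rule, the four rating combinations that appear in this chapter reproduce their stated exposures: (4, 5) and (3, 5) are HIGH, while (3, 4) and (2, 5) are MEDIUM.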


6.1 FUNDING REQUESTS NOT
REALIZED
To execute a 2020 Census that reduces cost while
maintaining quality, the Census Bureau requires
appropriate funding during the entire life cycle of
the program.
IF the funding appropriated for each fiscal year
during the 2020 Census life cycle is less than
requested or not provided at the start of each fiscal
year, THEN the Census Bureau will have to reprioritize the projects, which may affect the ability to
reengineer the systems and operations supporting
the 2020 Census.
Probability 4 (Likely); Impact 5 (Major impact); Risk exposure: HIGH

Mitigation Strategies include the following:
• Formulate and submit robust cost estimates (including contingencies for known and unknown risks) for planned 2020 Census activities per fiscal year.
• Develop strong budget justifications that demonstrate the negative impact of insufficient funds for 2020 Census activities per fiscal year.
• Prioritize research, testing, and implementation activities per fiscal year to focus on those areas that can significantly impact cost and quality, and develop contingency plans to quickly respond to budget cuts.

6.2 REENGINEERING ADDRESS
CANVASSING OPERATION
For the 2010 Census, a near 100-percent Address
Canvassing operation in the field was used to
update and validate a complete and accurate inventory of addresses, which forms the basis for the
census enumeration. For the 2020 Census, a variety of in-office techniques are being tested for use
in updating and validating the completeness of the
address inventory. These in-office techniques are
expected to reduce the areas that require fieldwork
while achieving an equal or greater result, thereby
reducing costs and improving quality for the overall 2020 Census program.


IF the established threshold of addresses to update
and validate through in-office techniques is not
achieved with the expected level of quality and
cost, THEN the 2020 Census program objectives
may not be met.
Probability 3 (Moderately likely); Impact 5 (Major impact); Risk exposure: HIGH

Mitigation Strategies include the following:
• Establish the objectives for In-Office Address Canvassing through the development of the 2020 Census Operational Plan.
• Baseline the techniques for In-Office Address Canvassing through the development of the Address Canvassing Detailed Operational Plan.
• Test the techniques by conducting In-Office Address Canvassing beginning in September 2015.
• Evaluate In-Office Address Canvassing techniques and results through the MAF Coverage Study, which is a continuous field activity beginning in April 2016.
• Update, as necessary, the In-Office Address Canvassing techniques from lessons learned and recommendations.

6.3 ADMINISTRATIVE RECORDS AND
THIRD-PARTY DATA—EXTERNAL
FACTORS
The Census Bureau is planning to use administrative records and third-party data to reduce the need to follow up with nonrespondents through the identification of vacant and deleted housing units (those that do not meet the Census Bureau's definition of a housing unit) and the enumeration of nonresponding housing units.
IF external factors or policies prevent the Census
Bureau from utilizing administrative records and
third-party data as planned, THEN the Census
Bureau may not be able to fully meet the strategic
goal of containing the overall cost of the 2020
Census.


Probability 3 (Moderately likely); Impact 5 (Major impact); Risk exposure: HIGH

Mitigation Strategies include the following:
• Identify external stakeholders that have an interest in Census Bureau policies regarding administrative record and third-party data usage.
• Develop a stakeholder communications plan for identified external stakeholders.
• Regularly communicate to and seek feedback from identified external stakeholders on design decisions and research and testing results related to the use of administrative records and third-party data for the 2020 Census.
• Assess impacts of any changes to the design based on feedback from external stakeholders and update plans accordingly.
• Monitor external factors and policies that may impact the Census Bureau's planned use of administrative records and third-party data for the 2020 Census.

6.4 PUBLIC PERCEPTION OF ABILITY
TO SAFEGUARD RESPONSE DATA
The accuracy and usefulness of the data collected
for the 2020 Census are dependent upon the ability to obtain information from the public, which is
influenced partly by the public’s perception of how
well their privacy and confidentiality concerns are
being addressed.
IF a substantial segment of the public is not convinced that the Census Bureau can safeguard their
response data against data breaches and unauthorized use, THEN response rates may be lower
than projected, leading to an increase in cases for
follow-up and cost increases.
Probability 3 (Moderately likely); Impact 5 (Major impact); Risk exposure: HIGH

Mitigation Strategies include the following:
• Develop a communications strategy to build and maintain the public's confidence in the Census Bureau's ability to keep their data safe.
• Research other Census Bureau divisions, other government agencies, and the private sector to understand how they effectively mitigate the issue of public trust and IT security.
• Continually monitor the public's confidence in data security in order to stay abreast of their probable acceptance of the Census Bureau's methods for enumeration.
• Prepare for rapid response to mitigate public concerns regarding any incidents that could affect public perception of the Census Bureau's ability to safeguard response data (e.g., breach of data from another government agency).

6.5 CYBERSECURITY INCIDENTS
Security breaches could happen to the Census
Bureau’s Internet data collection instrument, mobile
devices used for fieldwork, and data processing
and storage systems. IT security controls will be
put in place to block attempts from outside infiltration, as well as to prevent any negative impacts to
services or data, such as network disruption (denial
of services), technical malfunctions, and stolen or
corrupted data.
IF a cybersecurity incident (i.e., breach) occurs to
the systems or devices being utilized for the 2020
Census, THEN additional technological efforts
will be required to repair or replace the systems
and devices affected in order to maintain secure
services and data.
Probability 3 (Moderately likely); Impact 5 (Major impact); Risk exposure: HIGH

Mitigation Strategies include the following:
• Monitor system development efforts to ensure the proper security guidelines are followed during the system development phase.
• Research other Census Bureau programs, other government agencies, and the private sector to understand how they effectively mitigate cybersecurity incidents.
• Audit systems and check logs to help in detecting and tracing an outside infiltration.
• Contract with third-party testers to perform threat and vulnerability analysis.
• Prepare for rapid response to address any detected cybersecurity incidents.


6.6 ENTERPRISE IT SOLUTIONS
The Census Bureau, wherever feasible, will leverage cross-program IT solutions and has begun
the work necessary to ensure this is achieved.
However, enterprise solutions may not address all
of the 2020 Census requirements or late changes
may be required after key development milestones.
In these cases, impacts must be identified and
proper actions taken to resolve the situation.
IF enterprise IT solutions cannot meet the 2020
Census requirements or late changes are required,
THEN existing systems may require substantial
modifications or entirely new systems may have to
be developed, adding complexity and increasing
risk for a timely and successful 2020 Census.
Probability 3 (Moderately likely); Impact 4 (Substantial impact); Risk exposure: MEDIUM

Mitigation Strategies include the following:
• Engage with enterprise efforts to ensure that solutions architectures align, and provide continued support for 2020 Census requirements development and management.
• Participate in agency-wide solution development (i.e., avoid custom solutions where enterprise or off-the-shelf solutions will suffice) and ensure that contingencies (i.e., off-ramps) are developed early and exercised when necessary.
• Determine the extent to which existing systems from the 2010 Census can be modified and reused if necessary.
• Ensure IT solutions are sufficiently scalable to adjust to unexpected peaks in the workload.
• Design IT solutions that are flexible enough to incorporate design changes.
• Establish a change control management process to assess impacts of change requests and facilitate decision-making.
• Prepare for rapid response to implement change based on the results of the change control process.

6.7 TECHNOLOGICAL INNOVATIONS
SURFACING AFTER DESIGN IS
FINALIZED
Technological innovations inevitably surface, but
the 2020 Census program must move forward
toward building the operational design, which will
be finalized and put into production for the 2018
Census End-to-End Test.
IF technological innovations surface after the
design for the 2020 Census has been finalized,
THEN development and testing life-cycle phases
must be compressed if the innovations are
adopted, resulting in less time to mature innovations in census methodologies and systems.
Probability 3 (Moderately likely); Impact 4 (Substantial impact); Risk exposure: MEDIUM

Mitigation Strategies include the following:
• Build versatile operations and systems design.
• Keep team members and management aware of evolving technological innovations.
• Assign dedicated resources to track and communicate innovations.
• Dedicate funds to incorporate innovations into the design.

6.8 DATA QUALITY
The planned innovations for the design of the 2020
Census aspire to save significant taxpayer dollars
by making data collection and field operations
more efficient.
IF the innovations implemented to meet the 2020
Census cost goals result in unanticipated negative impacts to data quality, THEN additional
unplanned efforts may be necessary in order to
increase the quality of the census data.
Probability 3 (Moderately likely); Impact 4 (Substantial impact); Risk exposure: MEDIUM

Mitigation Strategies include the following:
• Perform cost and quality trade-off analysis on data collected during field tests.
• Review results of the cost and quality trade-off analysis and determine the most cost-effective methods, if any, for increasing quality without sacrificing cost savings.

6.9 LATE OPERATIONAL DESIGN
CHANGES
After key planning and development milestones
are completed, stakeholders may disagree with
the planned innovations behind the 2020 Census
and decide to modify the design, resulting in late
operational design changes.
IF operational design changes are required following the completion of key planning and development milestones, THEN the 2020 Census program
may have to implement costly design changes,
increasing the risk for a timely and successful
2020 Census.
Probability 3 (Moderately likely); Impact 4 (Substantial impact); Risk exposure: MEDIUM

Mitigation Strategies include the following:
• Identify external stakeholders that have an interest in the 2020 Census operational design.
• Develop a stakeholder communications plan for identified external stakeholders.
• Regularly communicate to and seek feedback from identified external stakeholders on design decisions and research and testing results.
• Assess impacts of any changes to the design based on feedback from external stakeholders and update plans accordingly.
• Monitor external factors and policies that may impact the Census Bureau's planned innovations for the 2020 Census operational design.
• Establish a change control management process to assess impacts of change requests and facilitate decision-making.
• Prepare for rapid response to implement change based on the results of the change control process.

6.10 ADMINISTRATIVE RECORDS
AND THIRD-PARTY DATA—ACCESS
AND CONSTRAINTS
The Census Bureau is planning to use administrative records and third-party data to reduce the
need to follow up with nonrespondents through
the identification of vacant and deleted housing
units (those that do not meet the Census Bureau’s
definition of a housing unit) and the enumeration
of nonresponding occupied housing units. The
use of administrative records data requires special
handling and security protocols that affect the
development of the systems and infrastructure
supporting the 2020 Census.
IF the Census Bureau does not have timely and
continual access to administrative records and
third-party data, or the data providers place
constraints on the use of the data that conflict
with planned 2020 Census operations, THEN the
Census Bureau may not be able to fully meet the
strategic goal of containing the overall cost of the
2020 Census.
Probability: 2 (Not likely)
Impact: 5 (Major impact)
Risk level: MEDIUM

Mitigation Strategies include the following:
•• Identify all required administrative records
and third-party data sets needed for the 2020
Census program, including data providers and
points-of-contact.
•• Review data sharing agreements/contracts in
order to understand all the conditions assigned
to the administrative records and third-party data sets and to ensure conditions are
appropriate.
•• Ensure requirements for administrative records
and third-party data usage are developed and
documented.
•• Inform data providers that data agreements/contracts need to be updated.


•• Disseminate updated data agreements/contracts
to internal stakeholders.
•• Negotiate with the source providers to ensure
required administrative records and third-party
data are available when needed.
•• Ensure the build-out for all systems supporting
the 2020 Census takes into account the handling
of administrative records and third-party data.
•• Ensure the security requirements, including
physical security, for all systems supporting the
2020 Census cover the handling of administrative records and third-party data.
•• Ensure staff has been trained in the proper
handling of administrative records and third-party data.

6.11 POLICY IMPACTS
The Census Bureau is introducing significant innovations to conduct the 2020 Census. Some of these
innovations may be contingent upon interpretation
of current policies or the development of new policies where gaps exist.
IF policies prevent the 2020 Census program from
implementing the proposed innovations, THEN the
2020 Census program may not be able to meet the
strategic goals and objectives of the program.
Probability: 2 (Not likely)
Impact: 3 (Moderate impact)
Risk level: LOW

Mitigation Strategies include the following:
•• Actively engage key internal and external stakeholders to build support for the use of new or
modified activities and operations for enumeration in the 2020 Census.
•• Determine if current or new policies, both internal and external, will affect the implementation
of the proposed innovations.


7. Quality Analysis
As the Census Bureau continues to evaluate the
2020 Census operational design, an analysis of
the impact on the quality of the census results is
required to ensure that innovations designed to
reduce cost do not have an unacceptable impact
on quality. This section describes the analysis
performed to date on the quality impacts of three
of the four key innovation areas: Reengineering
Address Canvassing, Optimizing Self-Response,
and Utilizing Administrative Records and Third-Party Data. The analysis related to administrative
records and third-party data focuses on the impact
of these innovations on Nonresponse Followup
(NRFU) as that operation is where the innovations
are expected to provide the greatest cost savings.
Accordingly, this section is organized as follows:
•• Quality impacts for Reengineering Address
Canvassing
•• Quality impacts for Optimizing Self-Response
•• Quality impacts of Utilizing Administrative
Records and Third-Party Data for NRFU
•• Future plans to assess quality impacts of 2020
Census innovations
The quality of the 2010 Census was measured
using the Census Coverage Measurement (CCM) Survey.12
The CCM was a post-enumeration survey designed
to assess the coverage of the census for housing
units and persons, including estimates of omissions
and erroneous enumerations. The CCM estimated a
net over-count of 0.01 percent, or 36,000 persons.
There were an estimated 10.0 million erroneous
enumerations for the household population and
16.0 million omissions. To identify the potential
cost and quality implications of the 2020 Census,
the Census Bureau does not yet have the benefit of
a post-enumeration survey. However, this analysis uses some findings from the CCM survey to
make assumptions about what to expect given the
2020 Census design plans. In addition, census test
results and simulations with 2010 Census data are
used to assess potential cost and quality effects.

12 The scope of the 2010 CCM survey excluded people living in group quarters and in Remote Alaska.


7.1 REENGINEERED ADDRESS
CANVASSING
The primary question being examined related to
the quality of the reengineered Address Canvassing
operation is:
•• What are the quality impacts of the use of the
reengineered Address Canvassing innovations
that use in-office methods for updating the
majority of addresses?
The quality of the 2010 Address Canvassing operation compared well to that of the 2000 operation,
but as expected, there were some errors. Johnson
and Kephart13 evaluated the accuracy of the
address frame after the Address Canvassing operation. Using the 2010 CCM results,
they estimated the percentage of housing units
correctly added (or added in error) and correctly
deleted (or deleted in error) by census operations.
Of the addresses added in the 2010 Address
Canvassing operation, 16.4 percent were added
erroneously. This represented approximately 1.2
percent of the records processed in the Address
Canvassing operation. Meanwhile, of the addresses
that the Address Canvassing operation deleted, 4.3
percent were deleted or identified as duplicated
erroneously. This represented approximately 0.5
percent of the records processed in the Address
Canvassing operation. The evaluation concluded
that the higher added-in-error rate was due to listers being encouraged to add addresses even when
there was doubt about their status.
The impact of In-Office Address Canvassing innovations on the overall quality of the operation is
still uncertain. The following analysis assesses the
potential implications of conducting 17.5 percent, 25.0 percent, and 32.0 percent of Address
Canvassing in the field, and the remainder in the
office only. There are currently two key quality metrics for this operation: missed adds and
missed deletes.
To measure the rate of missed adds and missed
deletes for the 2010 Census, the Census Bureau
13 Johnson, N., and Kephart, K., 2010 Census Evaluation of Address Frame Accuracy and Quality, 2010 Census Planning Memoranda Series No. 252, 2013.


used the results of the 2010 CCM Initial Housing
Unit Matching Operation. This operation matched
the addresses of the CCM Independent List to census addresses after Address Canvassing. Because
it excluded updates from census enumeration
operations, it was a good representation of the correctness of the frame after the Address Canvassing
operation. Using the match rate as a proxy for the
success rate for capturing adds, the rate of missed
adds in 2010 was 3.5 percent. Using the percent
of correct enumerations as a proxy for the success rate of identifying deletes, the rate of missed
deletes was 3.7 percent.
The Census Bureau has estimated missed adds
and missed deletes based on the plan for the 2020
Census, with the combination of in-office and
in-field work. This required some assumptions

about success rates for in-office and in-field
procedures, and the proportion of add and delete
actions in the initial address frame. These assumptions were developed using the results of the
2015 Address Validation Test, 2010 CCM survey
results, and expert opinion. (See Table 8 below.)
It shows missed adds and deletes, given 32.0
percent, 25.0 percent, or 17.5 percent In-Field
Address Canvassing in late 2019. For example, if the Census Bureau limits in-field work to
approximately 25 percent of the total addresses,
Address Canvassing may fail to add an estimated
1.4 million addresses, or 0.96 percent of the total
addresses. The Address Canvassing operation
may also fail to identify 2.9 million addresses that
should be deleted, or 1.94 percent of the total
addresses. These estimated error rates increase
with decreased In-Field Address Canvassing.

Table 8: Estimated Missed Adds and Missed Deletes by Percentage of
In-Field Address Canvassing

 Amount of In-Field    Percentage of In-Field    Error Rates
 Address Canvassing    Address Canvassing        Missed Adds          Missed Deletes
 ≈ 32.0%               31.89                     693,129 (0.47%)      2,102,976 (1.43%)
 ≈ 25.0%               24.89                     1,415,541 (0.96%)    2,856,198 (1.94%)
 ≈ 17.5%               17.56                     2,145,944 (1.46%)    3,624,283 (2.47%)
The estimated rates of missed adds and deletes
would also vary given the In-Office Address
Canvassing success rate of capturing adds and
deletes. (In Table 8 above, success rates of 95 percent for capturing adds and deletes are assumed
for each of the three scenarios.) Table 9 shows that
with 24.89 percent In-Field Address Canvassing, if
the in-office procedures are 100-percent successful,
rates of missed adds and deletes may be limited
to 0.75 percent and 1.72 percent, respectively.
However, if in-office procedures are only 85 percent
successful, the rate of missed adds and deletes
may increase to 1.38 percent and 2.39 percent,
respectively. As expected, this also varies given the
percentage of In-Field Address Canvassing. Given
a 17.5 percent In-Field Address Canvassing rate, if
the in-office procedures are 100-percent successful,
rates of missed adds and deletes may be limited
to 1.25 percent and 2.24 percent, respectively.
If the in-office procedures are merely 85-percent
successful, these rates of missed adds and deletes
may increase to 1.88 percent and 2.92 percent,
respectively.


Table 9: Estimated Missed Adds and Missed Deletes by In-Office Address
Canvassing Success Rate

 In-Office       Percentage of In-Field    Error Rates
 Success Rate    Address Canvassing        Missed Adds          Missed Deletes

 32.0% In-Field Address Canvassing
 100.0%          31.89                     387,065 (0.26%)      1,776,686 (1.21%)
 95.0%           31.89                     693,129 (0.47%)      2,102,976 (1.43%)
 90.0%           31.89                     999,192 (0.68%)      2,429,267 (1.65%)
 85.0%           31.89                     1,305,256 (0.89%)    2,755,557 (1.87%)

 25.0% In-Field Address Canvassing
 100.0%          24.89                     1,107,721 (0.75%)    2,526,845 (1.72%)
 95.0%           24.89                     1,415,541 (0.96%)    2,856,198 (1.94%)
 90.0%           24.89                     1,723,361 (1.17%)    3,185,551 (2.17%)
 85.0%           24.89                     2,031,181 (1.38%)    3,514,905 (2.39%)

 17.5% In-Field Address Canvassing
 100.0%          17.56                     1,836,384 (1.25%)    3,291,848 (2.24%)
 95.0%           17.56                     2,145,944 (1.46%)    3,624,283 (2.47%)
 90.0%           17.56                     2,455,503 (1.67%)    3,956,718 (2.69%)
 85.0%           17.56                     2,765,063 (1.88%)    4,289,153 (2.92%)

The estimated rates of missed adds and missed
deletes are also affected by the percentage of
address add and delete actions expected in the
initial frame, that is, the frame at the beginning of
fiscal year 2016, as described in Table 10. Given
25 percent In-Field Address Canvassing, if 4.0
percent of the actions are adds and 5.4 percent are
deletes, there may be an estimated 0.96 percent
final rate of missed adds and 1.94 percent final
rate of missed deletes. (Table 8 and Table 9 assume
4.0 percent adds and 5.4 percent deletes. These
numbers are based on results from the Address
Validation Test.) However, these errors decrease
to 0.60 percent for missed adds and 1.44 percent
for missed deletes if the proportions of add and
delete actions are only 3.0 percent and 4.0 percent,
respectively.


Table 10: Estimated Missed Adds and Missed Deletes by Percentage of Added and
Deleted Addresses in the Initial Frame

 Initial Frame              Percentage of In-Field    Error Rates
                            Address Canvassing        Missed Adds          Missed Deletes

 32.0% In-Field Address Canvassing
 3.0% Adds / 4.0% Deletes   31.89                     438,777 (0.30%)      1,526,927 (1.04%)
 4.0% Adds / 5.4% Deletes   31.89                     693,129 (0.47%)      2,102,976 (1.43%)
 5.0% Adds / 6.8% Deletes   31.89                     2,122,883 (1.44%)    4,104,632 (2.79%)

 25.0% In-Field Address Canvassing
 3.0% Adds / 4.0% Deletes   24.89                     888,923 (0.60%)      2,110,553 (1.44%)
 4.0% Adds / 5.4% Deletes   24.89                     1,415,541 (0.96%)    2,856,198 (1.94%)
 5.0% Adds / 6.8% Deletes   24.89                     2,846,495 (1.94%)    4,859,533 (3.31%)

 17.5% In-Field Address Canvassing
 3.0% Adds / 4.0% Deletes   17.56                     1,686,682 (1.15%)    2,808,592 (1.91%)
 4.0% Adds / 5.4% Deletes   17.56                     2,145,944 (1.46%)    3,624,283 (2.47%)
 5.0% Adds / 6.8% Deletes   17.56                     3,577,958 (2.43%)    5,629,103 (3.83%)

This analysis suggests that missed adds for the
2020 Census may range from 0.60 percent to 1.94
percent, given a 25 percent rate of In-Field Address
Canvassing. Missed deletes may range from 1.44
percent to 3.31 percent. Given the 2010 Census
proxy missed add and delete rates of 3.5 and 3.7 percent,
respectively, these estimates suggest that the 2020
Census Address Canvassing operation may maintain the level of quality of the 2010 Census operation as defined by these metrics.
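The direction of these scenario estimates can be sketched with a simple two-mode error decomposition (an illustrative model only: the function and parameter values below are hypothetical stand-ins, and the published figures come from a more detailed model incorporating Address Validation Test results and expert judgment):

```python
def missed_share(action_share, infield_frac, infield_success, inoffice_success):
    """Estimated share of all addresses whose add (or delete) action is missed.

    action_share     -- fraction of the frame expected to need the action
                        (e.g., 0.04 adds or 0.054 deletes in the initial frame)
    infield_frac     -- fraction of addresses canvassed in the field
    infield_success  -- probability the in-field operation captures the action
    inoffice_success -- probability the in-office review captures the action
    """
    missed_in_field = infield_frac * (1.0 - infield_success)
    missed_in_office = (1.0 - infield_frac) * (1.0 - inoffice_success)
    return action_share * (missed_in_field + missed_in_office)

# Illustrative only: roughly 25 percent in-field, with in-field work assumed
# more reliable than in-office review at capturing add actions.
rate = missed_share(action_share=0.04, infield_frac=0.2489,
                    infield_success=0.99, inoffice_success=0.95)
```

Under this decomposition, the overall miss rate grows as work shifts from the higher-success mode to the lower-success mode, which matches the pattern in the tables: estimated error rates rise as the In-Field Address Canvassing share or the in-office success rate falls.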
The above analysis will be refined to include
results from tests conducted over the next several
years. The next step is to analyze the results of the
2015 Address Validation Test and incorporate these
findings into the error estimates.
Address Canvassing Downstream Impacts
The cost and quality implications of planned
changes to the Address Canvassing operation have
been assessed separately from the other major
operations, but Address Canvassing could have
important impacts on other operations downstream. If there are more missed adds, i.e., Address
Canvassing methods fail to identify additional
addresses, there would be a negative impact on
the coverage of the census if other operations
fail to identify them as well. In the 2010 Census,
more than 800,000 housing units were added
to the address frame from the NRFU and Vacant
Delete Check operations. However, a reengineered
approach to NRFU field operations may limit the
Census Bureau’s ability to detect new addresses. In
the 2010 Census, NRFU enumerators were assigned
areas to contact and were instructed to get interviews at any addresses that appeared to be missing
from their list. In the 2020 Census, enumerators will
be given a specific list of addresses to visit. This
issue will need to be addressed. If additional missed
adds can be captured through NRFU or other field
operations, the census would maintain good coverage, but there could be negative cost implications.


Missed adds may also be captured through non-ID
responses. In the 2010 Census, non-ID responses
were limited to “Be Counted” forms and some
Telephone Questionnaire Assistance responses.
But the introduction of the Internet is expected
to increase the number of non-ID responses. Any
addresses missed by the Address Canvassing
operation that are captured through non-ID would
maintain or increase coverage. The addition of
these addresses through non-ID could have negative cost implications, however, by increasing the
workload for manual matching, manual geocoding,
or address verification.

There are potential cost and quality implications
downstream for missed deletes as well. If the
Address Canvassing operation fails to identify
addresses that should be deleted, there is the
potential for over-coverage. However, similar to
missed adds, it is likely that missed deletes will be
identified in other operations. Missed deletes could
be identified in NRFU or other field operations,
resulting in proper coverage, but with a negative
impact on costs. However, missed deletes may
also be identified with administrative records and
third-party data and removed from the NRFU workload. If administrative records and third-party data
correctly identify an address as a delete, rather
than vacant or occupied, the census could maintain
coverage without negative cost implications.

7.2 OPTIMIZING SELF-RESPONSE
The primary question being examined related to
the quality of the Optimizing Self-Response operation is:
•• What are the quality impacts of the widespread
use of the Internet for self-response?
For the 2010 Census, the primary method of
self-response was a paper-based questionnaire.
The final national mail response rate, defined as
the number of unduplicated nonblank mail returns
divided by the number of housing units in the
mailback universe, was 66.5 percent. The use of
the Internet may improve the quality of responses
and is expected to generate significant cost savings relative to paper questionnaires. The use of
the Internet will also include real-time processing.
This will allow late self-responses to be removed
from the NRFU workload, eliminating unnecessary and expensive contact attempts. Internet
self-response may also increase the percentage
of telephone responses, as well as the related
costs. Respondents who are unsuccessful submitting their information online may contact Census
Questionnaire Assistance (CQA) and provide information
over the phone. Responses from paper, Internet,
and CQA are all considered self-response for the
purposes of this report.
The 2010 CCM survey estimated that 284.7 million
enumerations, or 94.7 percent of the 300.7 million
census enumerations, were correct. The percentage
of correct enumerations varied by certain characteristics, including whether the response was
obtained through self-response (mail) or a NRFU
operation. (See Table 11 below.) Note that self-response enumerations have a higher percentage
of correct enumerations than NRFU enumerations.
Among NRFU enumerations, those provided by a
member of the household have the highest percentage correct.

Table 11: 2010 Census Correct Enumerations by Operation

                                                    Portion of the Census    Total Correct
                                                    Person Count             Enumerations
 Self-Response                                      0.729                    97.3%
 NRFU Field Operation, Householder Response         0.204                    93.4%
 NRFU Field Operation, Proxy Response               0.054                    70.1%
 NRFU Field Operation, Unknown Respondent Type      0.002                    68.2%
 NRFU, Other                                        0.011                    69.7%

Source: 2010 Census Coverage Measurement Estimation Report: Components of Census Coverage for the Household Population in the United States.


As this will be the first census with widespread use
of the Internet as a response option, the Census
Bureau has limited information regarding what
the response rates will be in 2020. One source of
information is the ACS, a national survey that has
been using Internet responses since 2013. While
the ACS provides information regarding the use
of the Internet, it is different from the census in
several key ways. The ACS questionnaire is much
longer than the 2020 Census questionnaire, there
is no advertising for ACS, and it is an ongoing
survey rather than a once-per-decade event. All
of these factors may lead to somewhat different
expectations for the response rates. In addition
to ACS results, the Census Bureau has census test
results that have been used as the basis of projections (see the Internet Self-Response section 5.5.4
of the Operational Plan for more information). The
results of the 2015 National Content Test will allow
additional refinements next year.

There is also limited information regarding the
quality of the Internet responses themselves. There
are reasons to expect that Internet responses will
be of greater quality than paper responses, such
as the use of real-time edits. However, there is the
potential that respondents may break off and not
complete the online form, which could also impact
quality.
To assess the potential impact of Internet in 2020,
the percentage of correct enumerations is estimated based on possible response rates. For
the purposes of this analysis, it is assumed that
responses by Internet, CQA, and mail are all of
equal quality. The potential use of administrative
records and third-party data in NRFU is not considered here. If the Internet increases the response
rate, this will increase the proportion of responses
from self-response rather than NRFU, and increase
the percentage of correct enumerations overall.
(See Table 12 below.)

Table 12: Estimated Correct Person Enumerations

                                        Self-Response    Total Correct
                                                         Enumerations
 2010 Census Self-Response Rate         66.5%            94.7%
 2020 Baseline Self-Response Rate       66.5%            94.7%
 2020 with +5% Self-Response Rate       71.5%            95.2%
 2020 with +10% Self-Response Rate      76.5%            95.6%

Source: 2010 Census Coverage Measurement Estimation Report: Components of Census Coverage for the Household Population in the United States; 2010 Census Mail Response/Return Rates Assessment Report.

Upcoming tests in 2016 and 2017 will better measure expected impacts of innovations like non-ID
processing on self-response rates.
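The Table 12 estimates can be approximately reproduced by mixing the mode-level correct-enumeration rates from Table 11, with the self-response share of the person count scaled in proportion to the self-response rate (a sketch under that scaling assumption; the +10 percent scenario comes out near 95.7 rather than the published 95.6 percent):

```python
# Shares of the 2010 census person count and correct-enumeration rates (Table 11)
SELF_SHARE = 0.729
SELF_CORRECT = 0.973
NRFU_ROWS = [            # (share of person count, correct-enumeration rate)
    (0.204, 0.934),      # householder response
    (0.054, 0.701),      # proxy response
    (0.002, 0.682),      # unknown respondent type
    (0.011, 0.697),      # other
]
NRFU_SHARE = sum(w for w, _ in NRFU_ROWS)
NRFU_CORRECT = sum(w * r for w, r in NRFU_ROWS) / NRFU_SHARE  # about 0.876

def correct_rate(self_response_rate, base_rate=0.665):
    """Overall percent of correct enumerations, scaling the self-response
    share of the person count with the self-response rate."""
    self_share = SELF_SHARE * self_response_rate / base_rate
    return 100 * (self_share * SELF_CORRECT + (1 - self_share) * NRFU_CORRECT)

print(round(correct_rate(0.665), 1))  # 94.7, the 2010 baseline
print(round(correct_rate(0.715), 1))  # 95.2, the +5 percent scenario
```

The mix model captures the mechanism in the text: every point of self-response gained shifts person enumerations from the roughly 88-percent-correct NRFU modes to the 97.3-percent-correct self-response mode.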

7.3 UTILIZING ADMINISTRATIVE
RECORDS AND THIRD-PARTY DATA
FOR NONRESPONSE FOLLOWUP
The primary question being examined related to
the quality of the NRFU operation is:
•• What are the quality impacts of the use of
administrative records and third-party data on
NRFU?
To assess the quality implications of using administrative records and third-party data for the NRFU
Operation, the Census Bureau has simulated
their use as applied to data and tabulations from
the 2010 Census. This simulation focuses on
self-response and NRFU. It uses the 2010 Census
universe reflecting full Address Canvassing as
completed in the 2010 Census. It does not reflect
potential differences in fieldwork or quality based
on innovations in the Address Canvassing operation for the 2020 Census. The results of this simulation are compared to the results from the 2010
Census and the 2010 CCM. To identify impacts at
the national level, the following metrics are generated: the size of the household population, the
number of occupied housing units, the number
of vacant housing units, the total number of units
in the NRFU workload, the number of household
visits, and the resolution of cases.


The Census Bureau is also interested in the implications of administrative records and third-party data
usage for the identification of race and Hispanic
origin. The following simulation projects the potential
percentage of the household population missing both
race and Hispanic origin characteristics. For the
potential range of occupied units, comparisons are
made only to the 2010 Census results, as there are
no comparable CCM estimates for this group.
Simulation Design Assumptions
The following assumptions are used in the 2020
Census cost estimate and are based on census
tests and research that uses 2010 Census data. The
simulation work assumes a self-response rate of
63.5 percent to determine the universe eligible for
NRFU, with a possible range extending
from a minimum of 58.5 percent to a maximum of
68.5 percent. This reflects the use of Internet push,
reminder mailings, and paper questionnaires to
encourage people to respond. This implies that on
average 36.5 percent of the census universe will be
eligible for NRFU.
The simulation also incorporates assumptions
about the use of administrative records and third-party data and how field visits will be conducted
during the 2020 Census NRFU. These assumptions
are based on research with administrative records
and third-party data to reduce contacts for units


suspected to be vacant or occupied. It also reflects
a possible contact strategy based on research from
the 2015 Census Test that used the field operations
system.
The research has considered three basic
approaches for using administrative records and
third-party data to reduce NRFU contacts. This
simulation reflects the decision to focus on the
“hybrid administrative record use” approach. Figure
35 shows its NRFU work flow. First, administrative
records and third-party data are used to identify
units that are likely to be vacant before the start of
the NRFU operation. These cases receive no visits
during the NRFU operation. Second, administrative
records and third-party data are used to identify
units that are likely to be occupied, and to develop
a roster of persons from administrative records
sources. These addresses are kept in the NRFU
fieldwork and receive one visit during the NRFU
operation. During the visit, the unit can (1) respond
by completing the interview with the enumerator,
or (2) use the information on the Notice of Visit
left on their doorstep to go online or call CQA to
respond. If the unit does not respond in one of
these ways, the administrative record information
is used. The simulation reflects this approach for
using administrative records and third-party data
to identify vacant and occupied units.
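In outline, the case routing just described can be expressed as follows (a schematic only; the vacant and occupied flags come from statistical models of administrative records and third-party data that are not shown here, and the labels are illustrative):

```python
def hybrid_contact_plan(likely_vacant, likely_occupied):
    """Schematic case routing for one NRFU address under the hybrid approach.

    likely_vacant / likely_occupied -- flags derived upstream from
    administrative records and third-party data (models not shown here).
    Returns (maximum field visits, enumeration fallback if unresolved).
    """
    if likely_vacant:
        return 0, "administrative records (vacant)"   # removed before NRFU starts
    if likely_occupied:
        return 1, "administrative records roster"     # one visit, then adrec data
    return "up to n", "proxy or imputation"           # full contact strategy
```

The design choice this encodes is that administrative records are used to truncate fieldwork, not to replace it: likely-vacant cases receive no visits, likely-occupied cases receive exactly one visit with the roster as fallback, and all other cases follow the full contact strategy.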


[Flowchart: NRFU housing units → use administrative records and third-party data to determine vacant status → attempt one interview for remaining units → use administrative records and third-party data to determine population in occupied units → additional contacts (0 to n) for remaining unresolved cases]

Figure 35: The Hybrid Administrative Records Use

The following assumptions on administrative
records use were made for this simulation. The
methodology developed for the 2015 Census
Test is applied to the 2010 Census universe.
This national application provided the amount of
administrative records identification for simulation
purposes.
•• Of the universe eligible for NRFU, 10.7 percent
could be identified as vacant based on information from administrative records and third-party
data. This value ranged from 9.9 percent to 11.6
percent in the simulation.
•• Of the universe eligible for NRFU, about 15.0
percent could be identified as occupied based
on administrative records and third-party data.
After the removal of the administrative records
vacant cases, this resulted in about 16.3 percent
of the NRFU workload. The simulation drew from
a binomial distribution with this percentage.
Similar to the census testing, these cases did not
receive proxy interviews to resolve their occupancy status.
Increased productivity from the reengineered field
operations and the number of proxy and unresolved enumerations observed in the 2015 Census
Test led to the consideration of a larger number
of contacts in 2020 than was used in the 2015
Census Test.
The 2015 adaptive design approach was modified to increase the average number of maximum
contacts allowed. The adaptive design approach
attempted to minimize the variability of the proxy
reporting based on an input average number of
desired contacts. For the 2015 Census Test, an
average of three visits was used. Based on the
increase in productivity, the allocation of the maximum number of visits was increased to an average
of four. This increased the maximum allowable
contacts for non-administrative-record cases, before
a proxy interview is conducted, beyond the 2015
approach. This design still reflects conducting a
proxy enumeration only on the last visit.
For each address, the simulation allowed five
potential outcomes when an attempt was made:
1. Occupied with a household member
2. Occupied with a proxy respondent (building manager, neighbor, etc.)
3. Vacant
4. Delete
5. Unresolved
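A toy version of one address passing through this contact loop might look like the following (the per-visit resolution probabilities are hypothetical illustrative values; the production simulation draws its parameters from the 2015 Census Test results):

```python
import random

# Hypothetical per-visit resolution probabilities (illustrative values only;
# the plan quotes per-attempt resolution parameters as ranges, e.g., 0.2-0.3).
P_RESOLVE = {"occupied": 0.25, "vacant": 0.25, "delete": 0.25}
P_PROXY_LAST_VISIT = 0.5  # chance a proxy resolves an occupied unit at the end

def simulate_address(true_status, max_contacts=6, rng=random):
    """Simulate NRFU contact attempts for one address of known true status.

    Each visit resolves the case with the per-status probability; for
    occupied units a proxy respondent (building manager, neighbor, etc.)
    is tried only on the final visit. Returns 'resolved', 'proxy', or
    'unresolved'.
    """
    for visit in range(1, max_contacts + 1):
        if rng.random() < P_RESOLVE[true_status]:
            return "resolved"
        if (true_status == "occupied" and visit == max_contacts
                and rng.random() < P_PROXY_LAST_VISIT):
            return "proxy"
    return "unresolved"

# Outcome distribution over many simulated occupied units
rng = random.Random(2020)
outcomes = [simulate_address("occupied", rng=rng) for _ in range(100_000)]
```

Aggregating outcomes over many simulated addresses, with true statuses drawn from the assumed vacancy and occupancy rates, yields distributions like those summarized in the figures and tables that follow.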


The probabilities of resolution (any of 1 through 4),
and of completing an interview on a given contact,
were based on initial overall results of the resolution per attempt from cases in the experimental
and control panels of the 2015 Census Test. The
control panel was one of the panels that used the
Research and Testing Operational Control System
and the 2010 Census field management approach
to implement the address in that panel.

The following parameters for resolution results
were used in the simulation runs:
•• Occupied resolution with a household member
was set to vary between 0.2 and 0.3.
•• Occupied resolution with a proxy member varied
between 0.25 and 0.75.
•• Vacant resolution varied between 0.2 and 0.3.
•• Delete resolution varied between 0.2 and 0.3.

An additional part of the simulation accounts for
unresolved addresses after all NRFU contacts had
been attempted. This simulation implemented a
simplified version of the count imputation procedure. If a specified number of addresses were still
unresolved after the sixth contact, the simulation
used the results of the last contact to impute the
number of addresses that were occupied, vacant,
and delete.

Simulation Results

Figure 36 provides in a boxplot the simulated distribution of the total household population under
the “hybrid administrative record use” approach.
The simulation shows an average household population of 298,841,000, about 1.9 million below
both the count from the 2010 Census and the estimate from the 2010 CCM. The bars show the 90
percent interval over the simulation. (Due to CCM
constraints, this universe does not include people
living in group quarters or in Remote Alaska in the
2010 Census.)

[Boxplot: simulated average of 298,841,000 against reference lines for the 2010 Census count and 2010 CCM estimate; bars show 90 percent intervals; CCM does not include Remote Alaska household population.]

Figure 36: Simulated Household Population Distribution


[Two boxplots. Occupied units: simulated average of 116,522,000 against reference lines for the 2010 Census and 2010 CCM. Vacant units: simulated average of 15,751,000 against reference lines for the 2010 Census and 2010 CCM. Bars show 90 percent intervals; CCM estimates do not include Remote Alaska occupied units.]

Figure 37: Simulated Occupied and Vacant Distribution

Figure 37 depicts the projected numbers of occupied and vacant housing units, respectively. The
scenario projected an average of 116,522,000
occupied units, about 200,000 lower than the 2010
Census and the 2010 CCM results. The scenario
projected an average of 15,751,000 vacant units.
The interval is higher than the census count of
15.0 million vacant units but covers the CCM estimate of 15.7 million vacant units.


Figure 38 shows the projected workload of NRFU
cases as applied to the 2010 Census. The scenario
projected an average of 44,605,000 NRFU fieldwork cases. This is about 5.3 million fewer than
the actual 2010 Census NRFU workload. Using
administrative records and third-party data to identify vacant units could reduce the NRFU fieldwork
by about 5.3 million addresses.


[Boxplot: simulated average of 44,605,000 NRFU fieldwork cases against a reference line for the 2010 Census workload; bars show 90 percent intervals.]

Figure 38: Simulated Distribution of 2010 NRFU Fieldwork Cases

Figure 39 provides the projected number of household contacts in the 2010 NRFU under the simulated scenario. The simulation projects an average
of 112,851,000 household visits with a standard
deviation of 10.7 million. The 90 percent interval

ranges from 95 million to 130 million visits. This
range includes the number of household visits
recorded in the 2010 Census—about 110 million.
This includes personal and telephone contacts.

[Boxplot: simulated average of 112,851,000 contacts against a reference line for the 2010 Census (about 110 million); bars show 90 percent intervals.]

Figure 39: Simulated 2010 Number of Contacts


Table 13 shows the distribution of resolution
status. Self-response had an average of 86.677
million with a 90 percent interval from 80 million
to 93 million. There was an average of 5.3 million
cases resolved using administrative records and

third-party data after an unsuccessful resolution
on the first contact. The simulation projects 4.475
million unresolved units after the completion of all
of the fieldwork.

Table 13: Resolution Status Projection

Level              Simulated Average    Standard Deviation
Self-Response      86,677,000           3,875,000
Adrec Vacant       5,318,000            388,000
Adrec Occupied     5,313,000            504,000
NRFU Occ HH        18,722,000           2,815,000
NRFU Occ Proxy     2,906,000            562,000
NRFU Vacant        9,257,000            503,000
NRFU Delete        3,931,000            127,000
NRFU Unresolved    4,475,000            1,961,000

Figure 40 shows the percentage of time that both race and Hispanic origin are not reported for an enumeration. For this analysis, the two characteristics were combined to see if a person reported either one during their enumeration. This analysis includes self-responses as well. For persons enumerated via administrative records, it was determined whether the past census, government sources of administrative records, or third-party sources could provide race or Hispanic origin for the person. On average over the simulations, both characteristics were not reported 5.2 percent of the time. The 90 percent interval covers the observed 3.6 percent in the 2010 Census.


Figure 40: Both Race and Hispanic Origin Not Reported
[Chart omitted: the Hybrid scenario's simulated average of 5.2 percent plotted against the 3.6 percent observed in the 2010 Census. Bars show 90 percent intervals.]

Additional Design Features
The previous results use data from the 2010 Census and summarize potential quality implications for the 2020 Census. Other design features being proposed or considered have potential implications for the quality results shown here.
One possibility is that units designated as vacant or occupied via administrative records and third-party data will receive an additional mailing during the NRFU operation. This mailing would notify those addresses that, while a field enumerator may not be coming to visit, they can still respond to the census by Internet or mail. This self-response might lessen some of the observed undercounts.
A second possibility is the Census Bureau’s use of
records from other potential administrative records
and third-party data sources. The Supplemental
Nutrition Assistance Program could help address
the potential under-coverage observed in the
simulations. Another possible source to augment
census coverage could be the National Directory of
New Hires.

A third facet is the projected rate of resolving occupied units by conducting an interview with a householder. The simulation applied a value between 0.2 and 0.3, which produced about 4.5 million unresolved addresses. This resolution rate might be higher in the actual census implementation than that observed during a census test; any increase should lead to fewer unresolved units.
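As a rough illustration of how a per-contact resolution rate in this range compounds over repeated visits, consider the toy calculation below. The six-visit cap and independence assumption are illustrative, and the actual simulation also resolves cases through proxy interviews and administrative records, so this sketch is not expected to reproduce the 4.5 million figure.

```python
def expected_unresolved(workload: int, rate: float, max_visits: int) -> float:
    """Expected cases still unresolved after max_visits attempts,
    each succeeding independently with probability `rate`."""
    return workload * (1 - rate) ** max_visits

workload = 44_605_000  # simulated NRFU fieldwork cases (Figure 38)
for rate in (0.2, 0.25, 0.3):
    print(f"rate={rate}: {expected_unresolved(workload, rate, 6):,.0f} unresolved")
```

A higher per-contact rate shrinks the residual workload geometrically, which is why even a modest improvement in resolution should reduce unresolved units.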
The results shown were based on the productivity
improvements observed in the 2015 Census Test. If
the productivity were less than that observed in the
test, then changes in the fieldwork procedures, as
applied in this simulation, could be needed.
Another consideration is the further use of administrative records and third-party data to identify vacant and occupied units beyond those used in this simulation, such as during the NRFU operation, at its end, or even after the NRFU operation. For example, one might use administrative records and third-party data as an alternative to count imputation. This might reduce the number of unresolved cases and the amount of missing race and Hispanic origin characteristics. Additional research has shown that possibly 31 percent of the 4.5 million unresolved cases could be resolved by additional usage of administrative records.
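At the cited rate, the implied number of additionally resolvable cases is straightforward arithmetic:

```python
unresolved = 4_475_000   # simulated unresolved cases (Table 13)
share_resolvable = 0.31  # share cited by the additional research
print(round(unresolved * share_resolvable))  # roughly 1.4 million
```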

A final possible change is in the imputation procedures to account for unresolved addresses and person characteristics. Research continues on how to adapt the procedures to account for the missing data, using administrative records and third-party data.

7.4 FUTURE PLANS

The Census Bureau will continue to identify opportunities to improve the data used to assess the trade-offs between cost and quality. As a result, the 2020 Census design will be refined. The next steps include the following:
•• Analyze 2015 Test Data. The Census Bureau will complete the analysis of the 2015 Census Test in Savannah, GA, and Maricopa, AZ, and the 2015 Address Validation Test results. There may be additional findings that inform methodologies for the 2020 Census. For example, additional analysis of the Address Validation Test results may help us better predict the effectiveness of In-Office Address Canvassing methods compared to In-Field Address Canvassing.
•• Fall 2015 and 2016 Tests. The Census Bureau will identify findings from the 2015 National Content Test and 2016 tests next fiscal year. This will allow the Census Bureau to enhance methodologies, such as mailing strategies for self-response.
•• 2016 Master Address File (MAF) Coverage Study. The Census Bureau will use the results of the MAF Coverage Study to improve the error rate estimates for In-Field Address Canvassing methods.
•• Local Update of Census Addresses. The Census Bureau will investigate methods to assess the effectiveness of the LUCA program.
•• Identify Metrics. The Census Bureau will continue to identify and evaluate additional cost
and quality metrics as needed.
•• Quantify Downstream Impacts. In the future,
Address Canvassing will be linked with the other
major operations in order to better assess the
potential downstream impacts. This will allow
the Census Bureau to measure how any changes
in the number of adds or deletes missed in the
Address Canvassing operation may influence
NRFU workloads.
•• Analysis of Other Operations. As procedural
plans are developed for census operations
beyond the main ones studied in this section,
the Census Bureau will need to assess the cost
and quality implications of them as well.


8.	 Life-Cycle Cost Estimate
The 2020 Census Life-Cycle Cost Estimate is pending clearance. This section will be populated
at a later date.


9.	 Approval Signature

Lisa M. Blumerman (signed) 				

October 1, 2015

Lisa M. Blumerman					Date
Associate Director for Decennial Census Programs


10. Document Logs
10.1 SENSITIVITY ASSESSMENT
This table specifies whether or not the document contains any administratively restricted information.
Verification of Document Content
This document does not contain any:
•• Title 5, Title 13, or Title 26 protected information
•• Procurement information
•• Budgetary information
•• Personally identifiable information

10.2 REVIEW AND APPROVALS
This 2020 Operational Plan document has been reviewed and approved for use.
This table documents the necessary approvals leading up to the point of baselining.
Document Review and Approval Tier: Operational Plan

Name                                      Area Represented                                              Date
Ann G. Wittenauer                         2020 Census Operational Plan Team                             9/8/2015

2020 Census Operational Plan Team Leadership Group:
Lisa M. Blumerman                         Associate Director for Decennial Census Programs              9/8/2015
Shirin A. Ahmed                           Assistant Associate Director for Decennial Census Programs    9/8/2015
Deirdre D. Bishop                         Chief, Decennial Census Management Division                   9/8/2015
Patrick J. Cantwell                       Chief, Decennial Statistical Studies Division                 9/8/2015
Timothy F. Trainor                        Chief, Geography Division                                     9/8/2015
Phani-Kumar A. Kallori                    Chief, Decennial IT Division                                  9/8/2015

Decennial Leadership Group                                                                              9/8/2015
2020 Census Executive Steering Committee                                                                9/8/2015

10.3 VERSION HISTORY
The document version history recorded in this section provides the revision number,
the version number, the date it was issued, and a brief description of the changes since the
previous release. Baseline releases are also noted.
Rev #    Version    Date                 Description
Final    V 1.0      October 1, 2015      Original baseline
Final    V 1.1      November 6, 2015     Conversion of Operational Plan content into Communications
                                         Directorate Desktop Publisher. Converted all figures and
                                         updated figures 8 and 28. Also added Section 8—Lifecycle
                                         Cost Estimate and Appendices.


Appendix: List of Acronyms

Acronym     Definition
ACS         American Community Survey
BAS         Boundary and Annexation Survey
BCU         Basic Collection Unit
BPM         Business Process Models
BYOD        Bring Your Own Device
CAP         Capability Requirements
CCM         Census Coverage Measurement
CEDCaP      Census Enterprise Data Collection and Processing
CEDSCI      Center for Enterprise Dissemination Services and Customer Innovation
CM          Coverage Measurement
COMPASS     Census Operations Mobile Platform for Adaptive Services and Solutions
CQA         Census Questionnaire Assistance
CQR         Count Question Resolution
DS          Data Stewardship
DSC         Decennial Service Center
eSDLC       Enterprise Systems Development Life Cycle
ETL         Enumeration at Transitory Locations
FSCPE       Federal-State Cooperative Population Estimate
FTE         Full Time Equivalent
FY          Fiscal Year
GAO         Government Accountability Office
GQ          Group Quarters
GSS-I       Geographic Support System Initiative
GUPS        Geographic Update Partnership Software
IA          Island Areas Censuses
iCADE       Integrated Capture and Data Entry
IPC         Integrated Partnership and Communications
IT          Information Technology
IVR         Interactive Voice Response
KFI         Key From Image
LUCA        Local Update of Census Addresses
MAF         Master Address File
MAM         Mobile Application Manager
MMVT        MAF Model Validation Test
MOJO        In-field operational control system
NARA        National Archives and Records Administration
NPC         National Processing Center
NRFU        Nonresponse Followup
OCR         Optical Character Recognition
OIG         Office of Inspector General
OMR         Optical Mark Recognition
PBC         Partial Block Canvassing
PL          Public Law
PLBR        Project-Level Business Requirements
PSAP        Participant Statistical Areas Program
RCC         Regional Census Center
RDP         Redistricting Data Program
SE&I        Systems Engineering and Integration
SIMEX       Simulation Experiment
TEA         Type of Enumeration Area
TIGER       Topologically Integrated Geographic Encoding and Referencing System
TSAP        Tribal Statistical Areas Program
UE          Update Enumerate
URL         Uniform Resource Locator
USPS        United States Postal Service
WBS         Work Breakdown Structure

