2020 Census Operational Plan
A New Design for the 21st Century
Issued December 2018
Version 4.0
Note to Reader:
Please note that the 2020 Census Operational Plan v4.0 reflects the operational design for
the 2020 Census as of October 31, 2018, unless noted otherwise.
TABLE OF CONTENTS
1. Introduction
1.1 Purpose
1.2 Design Approach
1.3 Document Scope
1.4 Document Development Process
1.5 Document Organization
2. The 2020 Census Overview
2.1 Purpose and Goal
2.2 Uses of Decennial Census Data
2.3 The Changing Environment and Escalating Costs
2.4 Four Key Innovation Areas
2.5 A New Design for the 21st Century
2.6 What the Public Can Expect
2.7 The 2020 Census Operations
3. The Four Key Innovation Areas
3.1 Reengineering Address Canvassing
3.2 Optimizing Self-Response
3.3 Utilizing Administrative Records and Third-Party Data
3.4 Reengineering Field Operations
3.5 Summary of Innovations
4. Key Tests, Milestones, and Production Dates
4.1 Tests to Inform the Operational Design and Prepare for Conducting the Census
4.1.1 Tests in 2012–2014
4.1.2 Tests in 2015
4.1.3 Tests in 2016
4.1.4 Tests in 2017
4.1.5 Tests in 2018
4.1.6 Tests Between the 2018 End-to-End Census Test and the 2020 Census
4.2 Key Decision Points and Milestones
4.3 2020 Census Production Operational Schedule
5. The 2020 Census Operations
5.1 Operations Overview
5.1.1 Frame
5.1.2 Response Data
5.1.3 Publish Data
5.2 Program Management
5.2.1 Program Management
5.3 Census/Survey Engineering
5.3.1 Systems Engineering and Integration
5.3.2 Security, Privacy, and Confidentiality
5.3.3 Content and Forms Design
5.3.4 Language Services
5.4 Frame
5.4.1 Geographic Programs
5.4.2 Local Update of Census Addresses
5.4.3 Address Canvassing
5.5 Response Data
5.5.1 Forms Printing and Distribution
5.5.2 Paper Data Capture
5.5.3 Integrated Partnership and Communications
5.5.4 Internet Self-Response
5.5.5 Non-ID Processing
5.5.6 Update Enumerate
5.5.7 Group Quarters
5.5.8 Enumeration at Transitory Locations
5.5.9 Census Questionnaire Assistance
5.5.10 Nonresponse Followup
5.5.11 Response Processing
5.5.12 Federally Affiliated Count Overseas
5.5.13 Update Leave
5.6 Publish Data
5.6.1 Data Products and Dissemination
5.6.2 Redistricting Data Program
5.6.3 Count Review
5.6.4 Count Question Resolution
5.6.5 Archiving
5.7 Other Censuses
5.7.1 Island Areas Censuses
5.8 Test and Evaluation
5.8.1 Coverage Measurement Design and Estimation
5.8.2 Coverage Measurement Matching
5.8.3 Coverage Measurement Field Operations
5.8.4 Evaluations and Experiments
5.9 Infrastructure
5.9.1 Decennial Service Center
5.9.2 Field Infrastructure
5.9.3 Decennial Logistics Management
5.9.4 IT Infrastructure
6. Key 2020 Census Risks
6.1 Public Perception of Ability to Safeguard Response Data
6.2 Cybersecurity Incidents
6.3 Administrative Records and Third-Party Data—External Factors
6.4 Operations and Systems Integration
6.5 Late Operational Design Changes
6.6 Insufficient Levels of Staff With Subject-Matter Skill Sets
6.7 Ability of IT Solutions to Support the 2020 Census
6.8 Administrative Records and Third-Party Data—Access and Constraints
6.9 Internet Self-Response Instrument
6.10 Systems Scalability
7. Quality Analysis
7.1 Reengineering Address Canvassing
7.2 Optimizing Self-Response
7.3 Using Administrative Records
7.4 Reengineering Field Operations
8. Approval Signature
9. Document Logs
9.1 Sensitivity Assessment
9.2 Review and Approvals
9.3 Version History
Appendix A. List of Acronyms
Appendix B. 2020 Census Operational Design: An Integrated Design for Hard-to-Count Populations
LIST OF FIGURES
Figure 1: Approach to the Operational Design
Figure 2: 2020 Census Program Documentation Structure
Figure 3: Organizations and Governance Boards for the 2020 Census Operational Plan
Figure 4: 2020 Census Environment
Figure 5: The 2020 Census—A New Design for the Twenty-First Century
Figure 6: Operations by Work Breakdown Structure
Figure 7: Summary of Reengineering Address Canvassing
Figure 8: Operations That Contribute to Reengineering Address Canvassing
Figure 9: Summary of Optimizing Self-Response
Figure 10: Operations That Contribute to Optimizing Self-Response
Figure 11: Summary of Utilizing Administrative Records and Third-Party Data
Figure 12: Operations That Contribute to Utilizing Administrative Records and Third-Party Data
Figure 13: Summary of Reengineering Field Operations
Figure 14: Operations That Contribute to Reengineering Field Operations
Figure 15: Operations With Significant Innovations Since the 2010 Census
Figure 16: High-Level View of Tests
Figure 17: Tests in 2012–2014
Figure 18: Tests and Key Decisions in 2015
Figure 19: Tests Planned in 2016
Figure 20: Schedule for the 2017 Census Test
Figure 21: Schedule for the 2018 End-to-End Census Test
Figure 22: 2020 Census Performance and Scalability Testing
Figure 23: Key Decision Points and Milestones
Figure 24: 2020 Census Operations—Production Timeline
Figure 25: High-Level Integrated Schedule
Figure 26: The 2020 Census Operations by Test
Figure 27: Operational Overview by Work Breakdown Schedule
Figure 28: High-Level Integration of Operations
Figure 29: Program Management Framework
Figure 30: Summary of Geographic Program Components
Figure 31: Response Processing Operation
Figure 32: 2020 Census Risk Matrix
Appendix B
Figure 1: 2020 Census Hard-to-Count Framework
Figure 2: 2020 Census Operational Placemat
Figure 3: 2020 Census Race Question
Figure 4: 2020 Census Hispanic Origin Question
Figure 5: 2020 Census Relationship Question
Figure 6: 2020 Census Undercount Question
Figure 7: Screenshot From the Response Outreach Area Mapper
LIST OF TABLES
Table 1: Operations and Purpose
Table 2: Description of Operations That Contribute to Reengineering Address Canvassing
Table 3: Description of Operations That Contribute to Optimizing Self-Response
Table 4: Description of Operations That Contribute to Utilizing Administrative Records and Third-Party Data
Table 5: Description of Operations That Contribute to Reengineering Field Operations
Table 6: Summary of Key Innovations by Operation
Table 7: Operational Tests
Table 8: Summary of Quality Parameters Collected for Reengineering Address Canvassing
Table 9: Summary of Quality Parameters Collected for Initial Frame
Table 10: Summary of Key Quality Parameters Collected for the In-Office Address Canvassing and MAF Coverage Study
Table 11: Geographic Programs Quality Parameters
Table 12: Summary of Quality Parameters Collected for Enumeration
Table 13: Summary of Self-Response Workloads for Housing Units
Table 14: Summary of Key Quality Parameters Collected for Self-Response Person Error
Table 15: Summary of Key Quality Parameters Collected for Using Administrative Records Error for Persons
Table 16: Summary of Key Quality Parameters Collected for Update Leave and Update Enumerate for Person Error
1. Introduction
1.1 PURPOSE
The U.S. Census Bureau’s 2020 Census Operational
Plan documents the design for conducting
the 2020 Census. It reflects and supports evidence-based decision-making by describing
design concepts and their rationale, identifying
decisions made and remaining decisions, and
describing remaining risks related to the implementation of the 2020 Census Operational Plan.
1.2 DESIGN APPROACH
As shown in Figure 1, the operational design
comprises a set of design decisions that drive how
the 2020 Census will be conducted. These design
decisions have been informed through research,
testing, and analysis conducted from 2012
through 2018. The operational design also drives
the requirements for Information Technology (IT)
capabilities and acquisitions.
The 2020 Census has been designed and developed in an iterative fashion, incorporating results
from various tests conducted over the decade.
Most design decisions have been made and are
reflected in this document. Adjustments to the
design may be required based on analysis and
final tests conducted in 2018, in particular the
2018 End-to-End Census Test.
An important aspect of the design approach
for the 2020 Census is an increased reliance on
enterprise standards and solutions. Specifically,
the design of all IT capabilities adheres to the
Enterprise Systems Development Life Cycle
(eSDLC) and IT Guiding Principles. Furthermore,
the 2020 Census Program’s budget, schedule,
and work activities align with the eSDLC/Mission
Enabling and Support Work Breakdown Structure.
The 2020 Census design also leverages enterprise
shared services, including the Census Enterprise
Data Collection and Processing (CEDCaP) solution and the Center for Enterprise Dissemination Services and Consumer Innovation (CEDSCI) solution.1 These two initiatives provide the technology solutions required to support significant portions of the innovations for the 2020 Census.

1 Throughout this document, references are made to specific CEDCaP systems (i.e., MOJO, PRIMUS, and COMPASS Census operations) that were only used to support the early 2020 Census tests.

Figure 1: Approach to the Operational Design

Figure 2: 2020 Census Program Documentation Structure

1.3 DOCUMENT SCOPE

This document is version 4.0 of the 2020 Census operational design and covers all operations required to execute the 2020 Census, starting with precensus address and geographic feature updates, and ending once census data products are disseminated and coverage and quality are measured.

It describes what will be done during the 2020 Census and, at a high level, how the work will be conducted. Additional specifics of how each
operation will be performed are documented in
individual 2020 Census Detailed Operational Plans,
which are created on a rolling schedule, as noted
in Chapter 5. These detailed plans include the
business process models that have been developed for each operation. They also identify the set
of activities that comprise the operation and the
interactions among related operations.
As shown in Figure 2, this 2020 Census
Operational Plan, shaded in yellow, is part of a
broader set of documentation for the 2020 Census
Program. Particular design documents related to
this 2020 Census Operational Plan are shown in
outline, while general categories of supporting
documents are shaded blue. Operational interactions are described in a set of Operational Design Integration artifacts, which are new in this version and included in Appendix C.
Figure 3: Organizations and Governance Boards for the 2020 Census Operational Plan
1.4 DOCUMENT DEVELOPMENT PROCESS
Many organizations across the Decennial Census
Programs Directorate and the Census Bureau
have worked together to develop the 2020 Census
operational design. Figure 3 illustrates these organizations. The development of the 2020 Census
Operational Plan is led by the Decennial Census
Management Division (DCMD), in particular a set of
Assistant Division Chiefs responsible for the 2020
Census operations. These Assistant Division Chiefs
are supported by several DCMD functional areas,
including program management and scheduling
and performance management. The DCMD operational design work also relies on operational subject
matter experts from throughout the Census Bureau
and the quality analysis staff within the Decennial
Statistical Studies Division. DCMD also has a team
responsible for overseeing operational testing and
reporting on test results, which inform operational
design decisions. The Decennial Budget Office
provides the life-cycle cost estimate, which reflects
the latest version of the design. They also analyze
the cost impacts of alternative designs.
The 2020 Census Operational Plan has been
reviewed and approved by the 2020 Census
Portfolio Management Governing Board and
the 2020 Census Executive Steering Committee.
Operational Integrated Project Teams develop
the Detailed Operational Plans. These teams are
composed of subject matter experts from across
the Census Bureau, including the IT and Field
Directorates.
1.5 DOCUMENT ORGANIZATION
This document is organized into seven sections:
1. Introduction
2. The 2020 Census Overview
3. The Four Key Innovation Areas
4. Key Tests, Milestones, and Production Dates
5. The 2020 Census Operations
6. Key Program-Level Risks
7. Quality Analysis
Section 5 describes each of the 35 census operations and constitutes the bulk of this 2020 Census
Operational Plan. All decisions in this section are
current as of October 31, 2018.
In addition, two sections have been added to this
version. The first is an appendix with a description of operations and efforts that contribute to
improving the count of populations that could be
classified as hard-to-count. The second provides
some of the key integration artifacts that have
been developed up to the date of publication of
this document. Some of these artifacts need to
be rendered at larger than page size and will be
included in a separate supplement document.
2. The 2020 Census Overview
2.1 PURPOSE AND GOAL
The purpose of the 2020 Census is to conduct a
census of population and housing and disseminate
the results to the President, the states, and the
American people. The goal of the 2020 Census
is to count everyone once, only once, and in the
right place.
2.2 USES OF DECENNIAL CENSUS DATA
As the 2020 Census draws near, it is important to
keep in mind the purpose of the census and how
the data will be used.
The primary requirement served by the decennial
census is the apportionment of seats allocated to
the states for the House of Representatives. This
requirement is mandated in the U.S. Constitution:
Article I, Section 2:
The actual Enumeration shall be made within
three Years after the first Meeting of the
Congress of the United States, and within
every subsequent Term of ten Years
Fourteenth Amendment, Section 2:
Representatives shall be apportioned among
the several States according to their respective numbers, counting the whole number of
persons in each State
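Although this plan does not describe the apportionment computation itself, the Census Bureau has allocated House seats from the census counts using the method of equal proportions (Huntington–Hill) since 1941: every state starts with one seat, and each remaining seat goes to the state with the highest priority value. A minimal sketch, using made-up populations and a 10-seat house for illustration only:

```python
import heapq
import math

def apportion(populations, total_seats=435):
    """Method of equal proportions (Huntington-Hill).

    Every state starts with one seat; each remaining seat goes to the
    state with the highest priority value P / sqrt(n * (n + 1)), where
    P is the state's population and n its current seat count.
    """
    seats = {state: 1 for state in populations}
    # Max-heap via negated priorities; initial n = 1 for every state.
    heap = [(-pop / math.sqrt(1 * 2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(total_seats - len(populations)):
        _, state = heapq.heappop(heap)
        seats[state] += 1
        n = seats[state]
        heapq.heappush(heap, (-populations[state] / math.sqrt(n * (n + 1)), state))
    return seats

# Toy example: three hypothetical states, a 10-seat house.
demo = {"A": 4_000_000, "B": 2_500_000, "C": 500_000}
print(apportion(demo, total_seats=10))  # → {'A': 6, 'B': 3, 'C': 1}
```

Note that every state keeps at least one seat regardless of population, which is why state C receives a seat even though a purely proportional split would round it to zero.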
Decennial census data at the census block level
are used by governmental entities for redistricting,
i.e., defining the representative boundaries for
congressional districts, state legislative districts,
school districts, and voting precincts. Additionally,
decennial data are used to enforce voting rights
and civil rights legislation.
The Census Bureau also uses the decennial census results to determine the statistical sampling
frames for the American Community Survey
(ACS), which replaced the long form in the
decennial census and is part of the Decennial
Program, and the dozens of current surveys
conducted by the Census Bureau. The results of
these surveys are used to support important government functions, such as appropriating federal
funds to local communities (an estimated $675
billion annually)1; calculating monthly unemployment, crime, and poverty rates; and publishing
health and education data.
Finally, decennial census data play an increasingly important role in U.S. commerce and the
economy. As people expand their use of data to
make decisions at the local and national levels,
they increasingly depend on data from the Census
Bureau to make these decisions. Today, local
businesses look at data provided by the Census
Bureau on topics like population growth and
income levels to make decisions about whether
or where to locate their restaurants or stores.
Similarly, a real estate investor who is considering
investing significant funds to develop a piece of
land in the community relies on Census Bureau
data to measure the demand for housing, predict
future need, and review aggregate trends. Big
businesses also rely heavily on Census Bureau
data to make critical decisions that impact their
success and shape the economy at the national
level. As noted above, the decennial census is the
foundation for the Census Bureau’s demographic
survey data.
The decennial census data must meet high quality
standards to ensure good decision-making and to
continue building confidence in the government,
society, and the economy. Studying the balance
between cost and quality is an important focus of
the census design.
1 "Uses of Census Bureau Data in Federal Funds Distribution," prepared by Marisa Hotchkiss and Jessica Phelan, U.S. Census Bureau, Washington, DC, September 2017.
Figure 4: 2020 Census Environment
[Figure: the 2020 Census at the center of eight environmental factors: a mobile population; informal, complex living arrangements; an increasingly diverse population; a constrained fiscal environment; declining response rates; rapidly changing use of technology; an information explosion; and distrust in government.]
2.3 THE CHANGING ENVIRONMENT
AND ESCALATING COSTS
The 2020 Census challenge is exacerbated by multiple environmental factors that have the potential
to impact its success. The Census Bureau is committed to proactively addressing the challenges
that follow (see Figure 4):
•• Constrained fiscal environment: Budget
deficits place significant pressure on funding
available for the research, testing, design, and
development work required for successful
innovation.
•• Rapidly changing use of technology:
Stakeholders expect the decennial census to
use technology innovation, yet the rapid pace
of change makes it challenging to plan for and
adequately test the use of these technologies
before they become obsolete.
•• Information explosion: Rapid changes in
information technology (IT) create stakeholder
expectations for how the Census Bureau interacts with the public to collect data and disseminate data products.
•• Distrust in government: The public’s concerns
about information security and privacy, the
confidentiality of information given to the government, and how government programs use
the information they collect continue to grow.
This impacts response rates and could make it
more difficult for government agencies to collect important demographic survey information.
•• Declining response rates: Response rates for
Census Bureau surveys, and for surveys and
censuses in general, have declined as citizens
are overloaded with requests for information
and become increasingly concerned about
sharing information.
•• Increasingly diverse population: The demographic and cultural make-up of the United States continues to increase in complexity, including a growing number of households and individuals of Limited English Proficiency, who may experience language barriers to enumeration and who may have varying levels of comfort with government involvement.

•• A mobile population: The United States continues to be a highly mobile nation, as about 14.3 percent of the population moves in a given year, based on results from the ACS conducted in 2017. Continued growth in the use of cellular telephone technology and an associated reduction in landline telephones tied to physical locations may also complicate enumeration.

•• Informal, complex living arrangements: Households are becoming more diverse and dynamic, making it a challenge to associate an identified person with a single location. For example, blended families may include children who have two primary residences. Additionally, some households include multiple relationships and generations.

Several of the societal, demographic, and technological trends listed above can result in a population that is harder and more expensive to enumerate. As it becomes more challenging to locate individuals and solicit their participation through traditional methods, the Census Bureau must spend more money decade after decade simply to maintain the same level of accuracy as in previous censuses. With the innovations described in the 2020 Census Operational Plan, the Census Bureau estimates that billions of dollars can be saved relative to replicating a design similar to that of the 2010 Census.
2.4 FOUR KEY INNOVATION AREAS
The 2020 Census team focused on four key innovation areas in redesigning the census:
•• Reengineering Address Canvassing
•• Optimizing Self-Response
•• Utilizing Administrative Records and Third-Party Data
•• Reengineering Field Operations
Field costs associated with the Address Canvassing
(ADC) and Nonresponse Followup (NRFU) operations are the most expensive parts of the
2020 Census. All four innovation areas are aimed
at reducing the costs of fieldwork. A reengineered
ADC Operation is expected to reduce the field
workload for address updating.
Self-response innovations, which are aimed at
generating the largest possible self-response rate,
coupled with the use of administrative records
and third-party data, are intended to reduce the
field workload associated with NRFU. Finally,
the reengineered field operations are intended
to increase the efficiency of those operations,
allowing managers and fieldworkers to be more
productive and effective.
Each innovation area is described further in
Section 3.
2.5 A NEW DESIGN FOR THE 21ST
CENTURY
Figure 5 describes at a high level how the 2020
Census will be conducted. This design reflects a
flexible approach that takes advantage of new
technologies and data sources while minimizing
risk.
The first step in conducting the 2020 Census is to
identify all of the addresses where people could
live, or Establish Where to Count. An accurate address list helps ensure that everyone is
counted. For the 2020 Census, the Census Bureau
began an in-office review of 100 percent of the
nation’s addresses in September 2015 and continually updates the address list based on data
from multiple sources, including the U.S. Postal
Service; tribal, state, and local governments;
satellite imagery; and third-party data providers.
This office work will also determine which parts
of the country require fieldwork to verify address
information. In-Field ADC will begin in 2019 and
is anticipated to cover approximately 38 percent
of all addresses, a significant reduction from the
nearly 100 percent that were reviewed in the field
during the 2010 Census.
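The in-office address-list maintenance described above amounts to continually merging updates from many sources into one master list. The sketch below illustrates such a merge with an invented source-priority rule; all names, priorities, and fields are hypothetical and are not the Census Bureau's actual MAF/TIGER processing:

```python
from dataclasses import dataclass

# Illustrative trust ordering only: higher-priority sources win on conflict.
SOURCE_PRIORITY = {"usps": 3, "local_government": 2, "third_party": 1}

@dataclass
class AddressRecord:
    address: str    # normalized street address
    block_id: str   # census block geocode
    source: str

def merge_address_lists(current: dict, updates: list[AddressRecord]) -> dict:
    """Fold new records into the master list, keyed by normalized address.

    A record replaces an existing entry only when it comes from a
    source of equal or higher priority.
    """
    merged = dict(current)
    for rec in updates:
        key = rec.address.upper().strip()
        existing = merged.get(key)
        if existing is None or (
            SOURCE_PRIORITY[rec.source] >= SOURCE_PRIORITY[existing.source]
        ):
            merged[key] = rec
    return merged

master = merge_address_lists({}, [
    AddressRecord("101 Main St", "block-0042", "third_party"),
    AddressRecord("101 Main St", "block-0043", "usps"),  # higher priority wins
    AddressRecord("7 Elm Ave", "block-0042", "local_government"),
])
print(len(master))                     # 2
print(master["101 MAIN ST"].block_id)  # block-0043
```

The point of the sketch is the dedup-with-precedence pattern: every source contributes candidate addresses, but conflicting updates are resolved by a fixed trust ordering rather than by last-write-wins.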
As noted on page 6, response rates to surveys and
censuses have been declining. To Motivate People
to Respond, the 2020 Census will include a nationwide communications and partnership campaign.
This campaign is focused on getting people to
respond on their own (self-respond). It costs significantly less to process a response provided via
the Internet or through a paper form than it does to
send a fieldworker to someone’s home to collect a
response. Advertising will make heavy use of digital
media, tailoring the message to the audience.
The Census Bureau Counts the Population by collecting information from all households, including
those residing in group or unique living arrangements. The Census Bureau wants to make it easy
for people to respond anytime and anywhere. To
this end, the 2020 Census will offer the opportunity and encourage people to respond via the
Internet and will encourage, but not require, people to enter a unique Census identification with
their response. Online responses will be accurate,
secure, and convenient.
For those who do not respond, the Census Bureau
will use the most cost-effective strategy for
contacting and counting people. The goal for the
2020 Census is to reduce the average number of
visits to nonresponding households by using available data from government administrative records
and third-party sources. These data will be used
to identify vacant households; to predict the best
time of day to visit a particular household; for
households that do not respond and cannot be
interviewed after multiple attempts; and to count and provide characteristics for the people in the household using existing high-quality data from trusted sources. A reduced number of visits will lead to significant cost savings.

In addition, the majority of fieldworkers will use mobile devices for collecting the data. Operations such as recruiting, training, and payroll will be automated, reducing the time required for these activities. New operational control centers will rely on automation to manage most of the fieldwork, enabling more efficient case assignment, automatic determination of optimal travel routes, and reduction of the number of physical offices. In general, a streamlined operation and management structure is expected to increase productivity and save costs.

The 2020 Census will be easy to respond to—at any time and from anywhere.

Figure 5: The 2020 Census—A New Design for the Twenty-First Century
[Figure: an operational overview built around the goal "Count everyone once, only once, and in the right place," with four phases:
•• Establish Where to Count: identify all addresses where people could live; conduct a 100-percent review and update of the nation's address list; minimize in-field work with in-office updating; use multiple data sources to identify areas with address changes; get local government input.
•• Motivate People to Respond: conduct a nationwide communications and partnership campaign; work with trusted sources to increase participation; maximize outreach using traditional and new media; target advertisements to specific audiences.
•• Count the Population: collect data from all households, including group and unique living arrangements; make it easy for people to respond anytime, anywhere; encourage people to use the online response option; use the most cost-effective strategy to contact and count nonrespondents; streamline in-field census taking; knock on doors only when necessary.
•• Release Census Results: process and provide Census data; deliver apportionment counts to the President by December 31, 2020; release counts for redistricting by April 1, 2021; make it easier for the public to get information.]

The last step in the 2020 Census is to Release the 2020 Census Results. The 2020 Census data will be processed and sent to the President (for apportionment) by December 31, 2020, to the states (for redistricting) by April 1, 2021, and to the public beginning in December 2021.
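The record-assisted followup strategy described earlier in this section, skipping likely-vacant units, timing visits, and falling back to existing high-quality data after repeated attempts, can be sketched as a simple decision rule. All field names, the visit cap, and the proxy-interview fallback are illustrative assumptions, not the actual 2020 Census procedure:

```python
from dataclasses import dataclass

MAX_VISITS = 3  # illustrative cap, not the actual 2020 Census rule

@dataclass
class Household:
    address: str
    likely_vacant: bool            # inferred from administrative records
    best_visit_hour: int           # predicted from admin/third-party data
    failed_visits: int = 0
    has_quality_admin_data: bool = False

def next_action(hh: Household) -> str:
    """Choose the next step for a nonresponding household."""
    if hh.likely_vacant:
        return "resolve-as-vacant"       # no fieldworker sent
    if hh.failed_visits >= MAX_VISITS and hh.has_quality_admin_data:
        return "enumerate-from-records"  # count using existing trusted data
    if hh.failed_visits >= MAX_VISITS:
        return "proxy-interview"         # hypothetical fallback for this sketch
    return f"visit-at-{hh.best_visit_hour:02d}00"

print(next_action(Household("7 Elm Ave", likely_vacant=True, best_visit_hour=18)))
# resolve-as-vacant
print(next_action(Household("9 Oak St", False, 18, failed_visits=3,
                            has_quality_admin_data=True)))
# enumerate-from-records
```

The design point is the ordering of the checks: the cheapest resolutions (vacant, records-based) are tried before another costly in-person visit is scheduled.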
2.6 WHAT THE PUBLIC CAN EXPECT
Most households in the continental United States
will receive a mailed invitation from the U.S.
Census Bureau asking residents to complete the
census questionnaire online. The questionnaire
asks just a few questions and takes about 10
minutes to answer. Respondents will be able to
respond using one of a number of devices, including a desktop computer, a laptop, a tablet, or a
smartphone.
For areas of the country with low Internet connectivity or other characteristics that make it less
likely that respondents will complete the census
questionnaire online, we will send a paper
questionnaire. For other areas of the country designated for self-response, a paper questionnaire
will be mailed to nonresponding households in the
fourth mailing. Paper questionnaires can be mailed
back postage-free. Respondents can also call a
toll-free number for assistance or to give their
information to a call center representative. If an
area has predominantly noncity-style addresses,
such as rural route numbers, a census worker will
deliver a paper questionnaire to the door. In more
remote areas, a census taker will enumerate at
the households in person. For household questionnaires and responses, everyone who usually
lives and sleeps in the home—including babies and
small children—should be included.
During this time, there will be many different
kinds of advertisements encouraging response to
the 2020 Census. There could be public service
announcements, social media ads, print ads, and
television commercials. Many community organizations, small businesses, and large corporations
are also pledging support to spread the word
about the importance of responding and being
counted in the census. In addition, many communities may organize local efforts to help their residents respond, such as providing help at libraries
and other community centers.
We have designed a secure Web site to make
it safer and easier than ever to be counted. All
of our IT systems are certified and accredited in
accordance with federal IT security requirements.
Census answers are protected by law. It’s against
the law for the Census Bureau to publicly release
responses in any way that could identify any person or household. The census is secure. We take
strong precautions to keep responses safe from
hacking and other cyberthreats.
The online questionnaire will be available in 12
non-English languages, and call center help will
also be available in those same languages. The
12 non-English languages are Spanish, Chinese,
Vietnamese, Korean, Russian, Arabic, Tagalog,
Polish, French, Haitian Creole, Portuguese, and
Japanese. (The online questionnaire will be available in simplified Chinese; the over-the-phone
help will be available in Mandarin and Cantonese.)
It’s important that the public responds as soon
as possible so the Census Bureau won’t need to
expend additional effort to receive the responses,
thereby saving taxpayer dollars. After a certain
amount of time, census takers (also known as
enumerators) will visit nonresponding households to do the important work of collecting the
information.
People who do not live in traditional housing units, such as those in group quarters (e.g., dormitories, prisons, nursing homes), and people experiencing homelessness will be counted in other census data collection operations. All 2020 Census operations are
described in the next section.
2.7 THE 2020 CENSUS OPERATIONS
The 2020 Census includes 35 operations that are organized into eight major areas that correspond with
the Census Bureau’s standard Work Breakdown Structure. The term operation refers to both support
and business functions. For example, Program Management is considered a support function, and ADC
is considered a business function. Table 1 provides a high-level purpose statement for each operation.
Table 1: Operations and Purpose
Program Management
Program Management (PM)
Define and implement program management policies, processes, and the
control functions for planning and implementing the 2020 Census in order to
ensure an efficient and well-managed program.
Census/Survey Engineering
Systems Engineering and
Integration (SEI)
Manage the delivery of a System of Systems that meets the 2020 Census
Program business and capability requirements.
Security, Privacy, and
Confidentiality (SPC)
Ensure that all operations and systems used in the 2020 Census adhere to
laws, policies, and regulations that ensure appropriate systems and data
security, and protect respondent and employee privacy and confidentiality.
Content and Forms Design (CFD)
Identify and finalize content and design of questionnaires and other
associated nonquestionnaire materials, ensure consistency across data
collection modes and operations, and provide the optimal design and content
of the questionnaires to encourage high response rates.
Language Services (LNG)
Assess and support language needs of non-English speaking populations,
determine the number of non-English languages and level of support for
the 2020 Census, optimize the non-English content of questionnaires and
associated nonquestionnaire materials across data collection modes and
operations, and ensure cultural relevancy and meaningful translation of 2020
Census questionnaires and associated nonquestionnaire materials.
Frame
Geographic Programs (GEOP)
Provide the geographic foundation in support of the 2020 Census data
collection and tabulation activities, within the Master Address File (MAF)/
Topologically Integrated Geographic Encoding and Referencing (TIGER)
System. The MAF/TIGER System (software applications and databases) serves
as the national repository for all of the spatial, geographic, and residential
address data needed for census and survey data collection, data tabulation,
data dissemination, geocoding services, and map production. Components
of this operation include Geographic Delineations, Geographic Partnership
Programs, and Geographic Data Processing.
Local Update of Census Addresses
(LUCA)
Provide an opportunity for tribal, state, and local governments to review and
improve the address lists and maps used to conduct the 2020 Census as
required by Public Law (P.L.) 103-430.
Address Canvassing (ADC)
Deliver a complete and accurate address list and spatial database for
enumeration and determine the type and address characteristics for each
living quarter.
Response Data
Forms Printing and Distribution
(FPD)
Print and distribute Internet invitation letters, reminder cards or letters
or both, questionnaire mailing packages, and materials for other special
operations, as required. Other materials required to support field operations
are handled in the Decennial Logistics Management Operation.
Paper Data Capture (PDC)
Capture and convert data from the 2020 Census paper questionnaires,
including mail receipt, document preparation, scanning, Optical Character
Recognition, Optical Mark Recognition, Key From Image, data delivery,
checkout, and form destruction.
Integrated Partnership and
Communications (IPC)
Communicate the importance of participating in the 2020 Census to the
entire population of the 50 states, the District of Columbia, and Puerto Rico
to engage and motivate people to self-respond (preferably via the Internet),
raise and keep awareness high throughout the entire 2020 Census to
encourage response, support field recruitment efforts, and effectively support
dissemination of census data to stakeholders and the public.
Internet Self-Response (ISR)
Maximize online response to the 2020 Census via contact strategies and
improved access for respondents and collect response data via the Internet to
reduce paper and the Nonresponse Followup Operation workload.
Non-ID Processing (NID)
Make it easy for people to respond anytime, anywhere to increase self-response rates by providing response options that do not require a unique
Census ID, maximizing real-time matching of non-ID respondent addresses
to the census living quarters address inventory, and accurately assigning
nonmatching addresses to census blocks.
Update Enumerate (UE)
Update the address and feature data and enumerate respondents in person.
UE is designated to occur in areas where the initial visit requires enumerating
while updating the address frame, in particular in remote geographic areas
that have unique challenges associated with accessibility.
Update Leave (UL)
Update the address and feature data and leave a choice questionnaire
package at every housing unit identified to allow the household to self-respond. UL occurs in areas where the majority of housing units do not have a
city-style address to receive mail.
Group Quarters (GQ)
Enumerate people living or staying in group quarters and provide an
opportunity for people experiencing homelessness and receiving service at
service-based locations, such as soup kitchens, to be counted in the census.
Enumeration at Transitory
Locations (ETL)
Enumerate individuals in occupied units at transitory locations who do not
have a usual home elsewhere. Transitory locations include recreational vehicle
parks, campgrounds, racetracks, circuses, carnivals, marinas, hotels, and
motels.
Census Questionnaire Assistance
(CQA)
Provide questionnaire assistance for respondents by answering questions
about specific items on the census form or other frequently asked questions
about the 2020 Census and provide an option for respondents to complete a
census interview over the telephone. Also provide outbound calling support of
Coverage Improvement.
Nonresponse Followup (NRFU)
Determine housing unit status for addresses that do not self-respond to the 2020 Census and enumerate households that are determined
to have a housing unit status of occupied.
Response Processing (RPO)
Create and distribute the initial 2020 Census enumeration universe, assign
the specific enumeration strategy for each living quarter based on case
status and associated paradata, create and distribute workload files required
for enumeration operations, track case enumeration status, run post-data
collection processing actions in preparation for producing the final 2020
Census results, and check for suspicious returns.
Federally Affiliated Count Overseas
(FACO)
Obtain counts by home state of U.S. military and federal civilian employees
stationed or assigned overseas and their dependents living with them.
Publish Data
Data Products and Dissemination
(DPD)
Prepare and deliver the 2020 Census apportionment data to the President of
the United States to provide to Congress, tabulate 2020 Census data products
for use by the states for redistricting, and tabulate and disseminate 2020
Census data for use by the public.
Redistricting Data (RDP)
Provide to each state the legally required P.L. 94-171 redistricting data
tabulations by the mandated deadline of April 1, 2021, 1 year from Census Day.
Count Review (CRO)
Enhance the accuracy of the 2020 Census through remediating potential gaps
in coverage by implementing an efficient and equitable process to identify
and incorporate housing units that are missing from the Census Bureau Master
Address File, identify and include or correct large group quarters that are
missing from the Master Address File or geographically misallocated, and
position unresolved cases for a smooth transition to the Count Question
Resolution Operation.
Count Question Resolution (CQR)
Provide a mechanism for governmental units to challenge their official 2020
Census results.
Archiving (ARC)
Coordinate storage of materials and data and provide 2020 Census records
deemed permanent, including files containing individual responses, to the
National Archives and Records Administration and provide similar files to the
National Processing Center to use as source materials to conduct the Age
Search Service. Also store data to cover in-house needs.
Other Censuses
Island Areas Censuses (IAC)
Enumerate all residents of American Samoa, the Commonwealth of the
Northern Mariana Islands, Guam, and the U.S. Virgin Islands; process and
tabulate the collected data; and disseminate data products to the public.
Test and Evaluation
Coverage Measurement Design and
Estimation (CMDE)
Develop the survey design and sample for the Post-Enumeration Survey of the
2020 Census and produce estimates of census coverage based on the Post-Enumeration Survey.
Coverage Measurement Matching
(CMM)
Identify matches, nonmatches, and discrepancies between the 2020 Census and
the Post-Enumeration Survey for both housing units and people in the same
areas. Both computer and clerical components of matching are conducted.
Coverage Measurement Field
Operations (CMFO)
Collect person and housing unit information (independent from the 2020 Census
operations) for the sample of housing units in the Post-Enumeration Survey
to provide estimates of census net coverage error and components of census
coverage for the United States and Puerto Rico, excluding Remote Alaska.
Evaluations and Experiments (EAE)
Document how well the 2020 Census was conducted, and analyze, interpret,
and synthesize the effectiveness of census components and their impact on
data quality, coverage, or both. Assess the 2020 Census operations. Formulate
and execute an experimentation program to support early planning and inform
the transition and design of the 2030 Census and produce an independent
assessment of population and housing unit coverage.
Infrastructure
Decennial Service Center (DSC)
Support 2020 Census Field operations for decennial staff (i.e., Headquarters,
paper data capture centers, Regional Census Center, Area Census Office,
Island Areas Censuses, remote workers, and listers/enumerators).
Field Infrastructure (FLDI)
Provide the administrative infrastructure for data collection operations
covering the 50 states, the District of Columbia, and Puerto Rico.
Decennial Logistics Management
(DLM)
Coordinate space acquisition and lease management for the regional census
centers, area census offices, and the Puerto Rico area office; provide logistics
management support services (e.g., kit assembly and supplies and interfaces
to field staff).
IT Infrastructure (ITIN)
Provide the IT-related Infrastructure support to the 2020 Census, including
enterprise systems and applications, 2020 Census-specific applications, Field
IT infrastructure, mobile computing, and cloud computing.
Figure 6 presents a graphic representation of the 35 operations organized into the eight areas described
above. See Section 5 for details about the design and decisions for each of these operations.
Figure 6: Operations by Work Breakdown Structure
[Figure: the 35 numbered operations arranged in a chart by the eight areas described above: Program Management; Census/Survey Engineering; Frame; Response Data; Publish Data; Other Censuses; Test and Evaluation; and Infrastructure.]
3. The Four Key Innovation Areas
The Census Bureau plans to conduct the most
automated, modern, and dynamic decennial
census in history. The 2020 Census includes
design changes in four key areas, including new
methodologies to conduct Address Canvassing,
innovative ways of optimizing self-response, the
use of administrative records and third-party data
to reduce the Nonresponse Followup (NRFU)
workload, and the use of technology to reduce
the manual effort and improve the productivity
of field operations. The primary goal is to achieve
efficiency by:
•• Using data the public has already provided to the government and data available from commercial sources, allowing realized savings to focus additional visits in areas that have traditionally been hard to enumerate.

•• Adding new addresses to the Census Bureau's address frame using geographic information systems and aerial imagery instead of sending Census Bureau employees to walk and physically check 11 million census blocks.

•• Encouraging the population to respond to the 2020 Census using the Internet, reducing the need for more expensive paper data capture.

•• Using sophisticated operational control systems to send Census Bureau employees to follow up with nonresponding housing units and to track daily progress.

3.1 REENGINEERING ADDRESS CANVASSING

The goal of Reengineering Address Canvassing is to eliminate the need to canvass every census block. Instead, the Census Bureau has developed innovative methodologies for updating the Master Address File/Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) System throughout the decade. Figure 7 highlights the key concepts in the Reengineering Address Canvassing approach.
Figure 7: Summary of Reengineering Address Canvassing
[Figure: a timeline from 2015 to the start of the 2020 Census. Continual In-Office Canvassing updates and verifies the MAF using aerial imagery, administrative records, and commercial data. Limited In-Field Address Canvassing occurs in 2019 for those areas where address updates cannot be obtained or verified or areas that are undergoing rapid change. The Master Address File (MAF) Coverage Study* provides ongoing fieldwork to measure coverage, validate in-office procedures, and improve in-field data collection methodologies. The updated MAF is used to conduct the 2020 Census.]
* The MAF Coverage Study was paused in FY2017 due to budget considerations.
Continual research and updating through In-Office
Address Canvassing began in September 2015 and
will continue through 2019 with the establishment
of the frame for the 2020 Census. Every year clerks
will start with the most recent Census Bureau
address list and update it based on new information from the United States Postal Service (USPS)
and data from tribal, state, and local governments
and third parties (i.e., commercial vendors). Clerks
will review satellite imagery to determine where
changes in addresses are occurring, and based on
these changes, the Census Bureau will develop a
plan for capturing those changes. This plan will
include In-Field Address Canvassing where address
updates cannot be obtained or verified or in areas
undergoing rapid change. Based on continued
testing and refinement of the Address Canvassing
Operation, the Census Bureau estimates that
approximately 38 percent of the housing units
in the Self-Response Type of Enumeration Area
(TEA) will require In-Field Address Canvassing.
This new estimate is greater than previous estimates primarily because of the need to ensure a
high-quality address frame. In addition, operational smoothing of TEAs to identify reasonable
listing workloads and collection geography areas
resulted in additions to the In-Field Address
Canvassing workload. Note that areas designated
for a response methodology that includes a frame
update at the time of the census will not also be
included in the In-Field Address Canvassing.
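The triage described above, in which a block stays in office unless updates cannot be obtained or verified, the area is changing rapidly, or the area's frame is already updated during enumeration, can be pictured as a per-block classification rule. The sketch below is a loose illustration; the assessment inputs and their names are invented for this example:

```python
from dataclasses import dataclass

@dataclass
class BlockAssessment:
    block_id: str
    updates_verifiable: bool      # can clerks confirm changes from imagery/records?
    rapid_change: bool            # e.g., new construction between image vintages
    frame_update_at_census: bool  # area's frame is updated during enumeration itself

def canvass_mode(b: BlockAssessment) -> str:
    """Assign a census block to in-field or in-office address canvassing."""
    if b.frame_update_at_census:
        return "no-in-field"  # excluded: frame is updated at census time anyway
    if not b.updates_verifiable or b.rapid_change:
        return "in-field"
    return "in-office"

blocks = [
    BlockAssessment("0001", updates_verifiable=True,  rapid_change=False,
                    frame_update_at_census=False),
    BlockAssessment("0002", updates_verifiable=False, rapid_change=False,
                    frame_update_at_census=False),
    BlockAssessment("0003", updates_verifiable=True,  rapid_change=True,
                    frame_update_at_census=True),
]
share_in_field = sum(canvass_mode(b) == "in-field" for b in blocks) / len(blocks)
print(share_in_field)  # one of three illustrative blocks needs fieldwork
```

Applied nationally, a rule of this shape is what yields a fieldwork share (roughly 38 percent of self-response-area housing units, per the estimate above) far below the near-100 percent canvassed in 2010.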
The operations shaded in darker blue in Figure
8 include innovations related to Reengineering
Address Canvassing.
Figure 8: Operations That Contribute to Reengineering Address Canvassing
(The figure displays all 35 census operations grouped by area: Program Management; Census/Survey Engineering, spanning the Frame, Response Data, Publish Data, Other Censuses, and Test and Evaluation groups; and Infrastructure, spanning the Support operations. Operations contributing to this innovation area are shaded in darker blue.)
Documented below are brief descriptions of how each operation contributes to the Reengineering
Address Canvassing innovation area:
Table 2: Description of Operations That Contribute to Reengineering Address Canvassing
Operation
Contributions
Geographic Programs (GEOP)
• Simplified collection geography.
• Simplified Type of Enumeration Area delineation.
• More data sources to validate and augment the frame.
• More frequent engagement with partners to improve quality of the MAF/TIGER System.
Local Update of Census Addresses (LUCA)
Local Update of Census Addresses submissions validated as part of In-Office Address Canvassing.
Address Canvassing (ADC)
• 100 percent of addresses canvassed in In-Office Address Canvassing.
• Target 38 percent of living quarters for In-Field Address Canvassing.
• Ongoing in-office and in-field improvement process.
• Classification of living quarter types during in-office review.
• Increased productivity of field staff due to automated case assignment and route optimization.
Update Enumerate (UE)
Reduced In-Field Address Canvassing workload because Update Enumerate
areas excluded.
Update Leave (UL)
Reduced In-Field Address Canvassing workload because Update Leave areas
excluded.
Field Infrastructure (FLDI)
Automated administrative functions.
Decennial Logistics Management
(DLM)
Reduced office infrastructure needed for In-Field Address Canvassing.
IT Infrastructure (ITIN)
• Listing applications for In-Field Address Canvassing supported by decennial
Device as a Service (dDaaS).
• Enterprise solutions with flexible architecture.
• Additional IT infrastructure to support In-Office Address Canvassing.
3.2 OPTIMIZING SELF-RESPONSE
The goal of this innovation area is to make it
as easy and efficient as possible for people to
respond to the 2020 Census by offering response
options through the Internet and telephone, in
addition to paper questionnaires that are returned
through the mail. Self-response reduces the need
to conduct expensive in-person follow-up for the
enumeration. The importance of responding to the
2020 Census will be communicated in a variety of
ways.
As shown in Figure 9, the Census Bureau will
motivate people to respond by using technology and administrative records and third-party
data to target advertisements and tailor contact
strategies to different demographic groups and
geographic areas. The Census Bureau also will use
its partnership program, providing information
to government agencies and hosting events at
community, recreation, and faith-based organizations. Communication and contact strategies will
encourage the use of the Internet as the primary
response mode through a sequence of invitations
and postcard mailings. In addition, Census Bureau
enumerators will leave materials to encourage
self-response. Response by telephone is being
encouraged for the first time, especially for those
with low Internet connectivity and those less likely
to use the Internet. Paper questionnaires will be
mailed to some households in the first mailing and
to all nonresponding households in self-response
areas in the fourth mailing.
Figure 9: Summary of Optimizing Self-Response
(The figure pairs two aims with their supporting elements: motivate people to respond and assure that data are secure, via micro-targeted advertising, a tailored contact strategy, the partnership program, and notices encouraging self-response; and make it easy to respond from any location at any time, via multiple modes and devices, no preassigned ID required,* and online forms in multiple languages.)
* Validate all Internet respondent addresses for quality.
A second key aspect of Optimizing Self-Response
is to make it easy for people to respond from any
location at any time. This is done in several ways:
•• By enabling people to respond via multiple
modes (Internet, paper, or telephone if they call
the Census Questionnaire Assistance Center).
•• By allowing respondents to submit a questionnaire without a unique census identification
code.
•• By providing online forms in multiple languages.
For these innovations to be successful, respondents must know that their personal information is
protected. Thus, a key element of this innovation
area is to assure respondents that their data are
secure and treated as confidential.
The operations shaded in darker blue in Figure
10 include innovations related to Optimizing
Self-Response.
Figure 10: Operations That Contribute to Optimizing Self-Response
(The figure repeats the 35-operation layout of Figure 8, with the operations contributing to Optimizing Self-Response shaded in darker blue.)
Documented below are brief descriptions of how each operation contributes to the Optimizing Self-Response innovation area:
Table 3: Description of Operations That Contribute to Optimizing Self-Response
Operation
Contributions
Content and Forms Design (CFD)
Questionnaire designed for multiple modes and devices.
Language Services (LNG)
• Non-English questionnaires available across modes.
• Non-English content development of contact materials (e.g., invitation letters
and postcards).
Forms Printing and Distribution
(FPD)
Census mailings that encourage people to respond via the Internet.
Paper Data Capture (PDC)
Paper available as a response mode.
Integrated Partnership and
Communications (IPC)
• Micro-targeted advertising.
• Multichannel outreach.
• Integrated Partnership and Communications Program adjusted based on
customer response, behavior, and feedback.
• National and local partnerships promoting self-response.
• Educational awareness campaign through traditional and new media sources
(e.g., social media).
Internet Self-Response (ISR)
• Internet instrument optimized for mobile devices.
• Multiple languages available.
• Contact approach tailored based on prior response rates, Internet access
data, and demographics (up to five self-response mailings).
• Real-time edit checks for Internet Self-Response to improve quality.
Non-ID Processing (NID)
• Public can respond anytime, anywhere without a unique Census ID.
• Real-time geocoding of responses.
• Real-time validation of responses without a unique Census ID.
• Real-time soft edits and checks for addresses.
• Administrative records and third-party data used to validate identity and validate and augment address data.
Census Questionnaire Assistance (CQA)
• Flexible and adaptive language support.
• Respondent-initiated telephone response collection.
Response Processing (RPO)
• Single operational control system that tracks case status across all modes.
• Cases can be removed from NRFU whenever responses are received and processed.
Update Leave (UL)
• Paper forms left at housing units to encourage self-response.
IT Infrastructure (ITIN)
• Infrastructure built and sized to meet demand and ensure adequate performance for Internet Self-Response.
• Secure Internet response capability.
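As an illustration of the real-time soft edits noted above for Internet Self-Response, the sketch below flags inconsistent entries before submission. The field names and rules here are hypothetical, not the actual 2020 Census edit specification.

```python
# Hypothetical sketch of real-time "soft edit" checks on an Internet
# self-response; the rules and field names are illustrative only.
def soft_edit_checks(response: dict) -> list:
    """Return a list of warnings; an empty list means the response passes."""
    warnings = []
    pop_count = response.get("pop_count")
    if pop_count is None or pop_count < 1:
        warnings.append("Household population count is missing or zero.")
    people = response.get("people", [])
    if pop_count is not None and pop_count >= 1 and len(people) != pop_count:
        warnings.append(
            f"Reported count ({pop_count}) does not match the "
            f"{len(people)} person record(s) entered."
        )
    for i, person in enumerate(people, start=1):
        if not person.get("age") and not person.get("date_of_birth"):
            warnings.append(f"Person {i}: age or date of birth is needed.")
    return warnings

# Example: a two-person household with only one person record entered
# produces a warning the respondent can fix before submitting.
print(soft_edit_checks({"pop_count": 2,
                        "people": [{"name": "A", "age": 34}]}))
```

In practice such checks would run in the browser or instrument as the respondent types, so inconsistencies are resolved while the respondent is still available.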
3.3 UTILIZING ADMINISTRATIVE RECORDS AND THIRD-PARTY DATA
The goal of this innovation area is to use information that people have already provided to improve
the efficiency and effectiveness of the 2020 Census, and in particular, reduce expensive in-person
follow-up activities. Administrative record data refers to information from federal and state governments. Third-party data refers to information from commercial sources. As shown in Figure 11, data
from both sources help to improve the quality of the address list (frame), increase the effectiveness
of advertising and contact strategies, validate respondent submissions, and reduce field workload for
follow-up activities.
As has been done in prior decades, administrative data from the USPS and other government
records are used to update the address frame
and reflect changes to the housing stock that
occur over time. Additional administrative records
sources, as well as third-party data from commercial companies, will also be used for this purpose.
In addition, these data sources will be used to
validate incoming data from tribal, federal, state,
and local governments.
To increase the effectiveness of advertising and
contact strategies, the Census Bureau will use
demographic and geographic information from
various administrative record and third-party data
sources to help target the advertising to specific
populations.
Administrative records and third-party data will
also be used to validate respondent addresses for
all Internet responses.
Finally, a primary use of administrative records is to reduce the field workload for follow-up activities. To this end, the Census Bureau will use data from internal and external sources, such as the 2010 Census, the USPS, the Internal Revenue Service, the Social Security Administration, the Centers for Medicare and Medicaid Services, and the American Community Survey, to identify when nonresponding housing units should be classified as occupied, vacant, or nonexistent, and when high-quality administrative data can be used for the enumeration. These units will be visited one time in NRFU and, if not enumerated during that visit, will be mailed a postcard encouraging self-response and removed from the NRFU workload for all subsequent activity. Data from these sources will also be used to tailor work assignments related to the best time of day to contact a household.

The operations shaded in darker blue in Figure 12 include innovations related to Utilizing Administrative Records and Third-Party Data.
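The administrative-records decision rule for nonresponding housing units can be sketched as follows. The status labels, the 0.9 quality threshold, and the return strings are illustrative assumptions, not the Census Bureau's production criteria.

```python
# A minimal sketch of the administrative-records rule for nonresponding
# housing units; thresholds and labels are illustrative assumptions.
def nrfu_disposition(admin_status: str, admin_quality: float,
                     enumerated_on_first_visit: bool) -> str:
    """Decide how a nonresponding housing unit flows through NRFU."""
    if admin_status in ("vacant", "nonexistent"):
        # Units determined vacant or nonexistent are removed up front.
        return "remove from NRFU workload"
    if admin_status == "occupied" and admin_quality >= 0.9:
        if enumerated_on_first_visit:
            return "enumerated in field"
        # One visit, then a postcard and removal from further follow-up.
        return "mail postcard, remove from workload, enumerate with records"
    # Otherwise the unit stays in the regular NRFU contact strategy.
    return "full NRFU contact strategy"

print(nrfu_disposition("vacant", 0.8, False))  # -> remove from NRFU workload
```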
Figure 11: Summary of Utilizing Administrative Records and Third-Party Data
(The figure pairs each goal with its supporting activities:
•• Improve the quality of the address list: update the address list; validate incoming data from tribal, federal, state, and local governments.
•• Increase effectiveness of advertising and contact strategies: support the micro-targeted advertising campaign.
•• Validate respondent submissions: assess self-responses through a rigorous Self-Response Quality Assurance process.
•• Reduce field workload for follow-up activities: remove vacant and nonresponding occupied housing units from the NRFU workload under specific criteria; tailor work assignments with predicted best time of day to contact.)
Figure 12: Operations That Contribute to Utilizing Administrative Records and Third-Party Data
(The figure repeats the 35-operation layout of Figure 8, with the operations contributing to Utilizing Administrative Records and Third-Party Data shaded in darker blue.)
Documented below are brief descriptions of how each operation contributes to the Utilizing
Administrative Records and Third-Party Data innovation area:
Table 4: Description of Operations That Contribute to Utilizing Administrative Records and
Third-Party Data
Operation
Contributions
Geographic Programs (GEOP)
Administrative records and third-party data used to determine types of
enumeration areas, basic collection units, and geographic boundaries.
Local Update of Census
Addresses (LUCA)
Administrative records and third-party data used to validate incoming data
from tribal, federal, state, and local governments.
Address Canvassing (ADC)
Additional sources of administrative records and third-party data used to
update the address frame in lieu of fieldwork.
Integrated Partnership and
Communications (IPC)
Expanded use of administrative records and third-party data to support the micro-targeted Integrated Partnership and Communications Program.
Internet Self-Response (ISR)
Administrative records and third-party data used to tailor the contact strategy.
Non-ID Processing (NID)
For NID cases not matched in real time, use of administrative records and
third-party data in an attempt to augment respondent-provided address data,
followed by an additional address matching attempt.
Group Quarters (GQ)
Electronic transfer and expanded use of administrative records to enumerate
group quarters where possible.
Nonresponse Followup (NRFU)
• Expanded use of administrative records and third-party data to reduce the
Nonresponse Followup workload.
• Administrative records and third-party data used to reduce the number of
contact attempts made.
• Administrative records and third-party data used to tailor work assignments
based on language and “best time of day” for contact.
Response Processing (RPO)
• Increased use of administrative records and third-party data to impute
response data (in whole or in part).
• Increased use of libraries from past surveys and censuses to support editing
and coding.
• Increased use of administrative records and third-party data to enhance
libraries for Primary Selection Algorithm and Invalid Return Detection.
Federally Affiliated Count
Overseas (FACO)
Administrative records used to complete this count of federally affiliated
persons overseas.
Count Question Resolution (CQR)
Administrative records and third-party data used to resolve Count Question Resolution challenges.
Coverage Measurement Design
and Estimation (CMDE)
• Administrative records and third-party data used for estimation.
• Administrative records and third-party data used for sample design.
Coverage Measurement Field
Operations (CMFO)
• Administrative records and third-party data used to reduce the number of
contact attempts made.
• Administrative records and third-party data used to tailor work assignments
based on “best time of day” for contact.
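The Non-ID Processing flow in the table (attempt a real-time address match; if it fails, augment the address with administrative-record data and match once more) can be sketched as below. The matcher and augmenter are toy stand-ins for the real services.

```python
# Illustrative sketch of the Non-ID Processing flow: match, augment
# with administrative records on failure, then one additional match
# attempt. The address file and functions are hypothetical stand-ins.
MASTER_ADDRESS_FILE = {("123 MAIN ST", "20233"): "MAF-0001"}

def match_address(street, zip_code):
    """Toy matcher: exact lookup against a tiny address file."""
    return MASTER_ADDRESS_FILE.get((street.upper(), zip_code))

def augment_with_admin_records(street, zip_code):
    # Assumption: administrative records can supply a missing ZIP code.
    return street, zip_code or "20233"

def process_non_id_case(street, zip_code):
    match = match_address(street, zip_code)
    if match is None:
        street, zip_code = augment_with_admin_records(street, zip_code)
        match = match_address(street, zip_code)  # one additional attempt
    return match

print(process_non_id_case("123 Main St", ""))  # -> MAF-0001
```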
3.4 REENGINEERING FIELD
OPERATIONS
The goal of this innovation area is to use technology to manage the 2020
Census fieldwork efficiently and effectively and, as a result, to reduce the staffing, infrastructure, and brick-and-mortar footprint
required for the 2020 Census. Figure 13 shows the
three main components of the reengineered field
operations: streamlined office and staffing structure, increased use of technology, and increased
management and staff productivity.
The 2020 Census field operations will rely heavily
on automation. For example, the Census Bureau
plans to provide most listers and enumerators
with the capability to work completely remotely
and perform all administrative and data collection
tasks directly from a mobile device. Supervisors
will also be able to work remotely from the
field and communicate with their staff via these
devices. These enhanced capabilities significantly
reduce the number of offices required to support
2020 Census fieldwork. In the 2010 Census, the
Census Bureau established 12 Regional Census
Centers (RCCs) and nearly 500 Area Census
Offices (ACOs). The agency hired more than
516,000 enumerators to conduct NRFU activities.
The new design for the 2020 Census field operations includes six RCCs with 248 ACOs.
In addition, automation enables significant changes to how cases are assigned and how field staff are supervised. Because supervisors can monitor and manage their workers more easily, the ratio of workers to supervisors can be increased, reducing the number of supervisors required and streamlining the staffing structure. Other design changes include optimized case assignment and routing.
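The idea of optimized case assignment and routing can be illustrated with a minimal nearest-neighbor sketch; production systems would use far more sophisticated optimization over travel times, staff availability, and workloads.

```python
import math

# A minimal nearest-neighbor routing sketch: repeatedly visit the
# closest remaining case. Coordinates are simple (x, y) stand-ins.
def order_cases(start, cases):
    """Order case locations by repeatedly visiting the nearest one."""
    route, here, remaining = [], start, list(cases)
    while remaining:
        nxt = min(remaining, key=lambda c: math.dist(here, c))
        remaining.remove(nxt)
        route.append(nxt)
        here = nxt
    return route

print(order_cases((0, 0), [(5, 5), (1, 0), (2, 1)]))
# -> [(1, 0), (2, 1), (5, 5)]
```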
All administrative functions associated with most
field staff will be automated, including recruiting,
hiring, training, time and attendance, and payroll.
Finally, the new capabilities allow for quality to be
infused into the process through alerts to supervisors when there is an anomaly in an enumerator’s
performance (e.g., the Global Positioning System
indicator on an enumerator’s handheld device
indicates that she or he is not near the assigned
location) and real-time edits on data collection.
Accordingly, the quality assurance process used in
the 2010 Census has been reengineered to account
for changes in technology.
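The GPS-based anomaly alert described above can be sketched as a simple distance check between the device location and the assigned case. The haversine formula and the 1 km threshold here are illustrative assumptions.

```python
import math

# Hedged sketch of a GPS anomaly check: flag an enumerator whose
# device location is far from the assigned address. The 1 km
# threshold is an illustrative assumption, not a production value.
def location_anomaly(device, assigned, threshold_km=1.0):
    """True if the great-circle (haversine) distance between two
    (lat, lon) points, in degrees, exceeds the threshold."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*device, *assigned))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    dist_km = 2 * 6371 * math.asin(math.sqrt(a))  # Earth radius ~6371 km
    return dist_km > threshold_km

# An enumerator roughly 5 km from the assigned address triggers an alert.
print(location_anomaly((38.85, -77.05), (38.895, -77.05)))  # -> True
```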
The operations shaded in darker blue in Figure 14 include innovations related to Reengineering Field
Operations.
Figure 13: Summary of Reengineering Field Operations
(The figure shows the three main components:
•• Streamlined office and staffing structure: Area Census Office Manager, Census Field Managers, Census Field Supervisors, and Listers and Enumerators.
•• Increased use of technology: automated applications for address canvassing and enumeration on mobile devices; improved communications.
•• Increased management and staff productivity: automated and optimized work assignments; increased visibility into case status for improved workforce management; automated recruiting, training, payroll, and expense reporting; redesigned quality assurance operations.)
Figure 14: Operations That Contribute to Reengineering Field Operations
(The figure repeats the 35-operation layout of Figure 8, with the operations contributing to Reengineering Field Operations shaded in darker blue.)
Documented below are brief descriptions of how each operation contributes to the Reengineering Field Operations innovation area. The field data collection operations are grouped together because they all contribute similarly.
Table 5: Description of Operations That Contribute to Reengineering Field Operations
Operation
Contributions
Field Infrastructure (FLDI)
• Streamlined staffing structure.
• Automated use of real-time data by the field operations control system to
enable better management of the field workforce.
• Automated training for field staff.
• Automated administrative functions, including recruiting and payroll.
• Supervisory support for listers and enumerators available during all hours
worked.
Decennial Logistics Management
(DLM)
Reduced office infrastructure.
IT Infrastructure (ITIN)
• Enterprise solutions with flexible architecture.
• Listing and enumeration applications using dDaaS.
Integrated Partnership and
Communications (IPC)
Enhanced communications to support field recruitment.
Field Data Collection Operations:
• Address Canvassing
• Update Leave
• Nonresponse Followup
• Reduced paper through automated online training, field data collection,
time and expense, etc.
• Reduced field workload as measured by cases and attempts.
• Near real-time case status updates.
• Automated and optimized assignment of work.
• Fieldwork assigned based on field staff members’ declaration of work availability and geographic location along with existing case assignments.
• Flexibility built into work assignment process based on in-field feedback or
observations.
• Data on household language and “best time of day to contact” standardized
and available at central location for work assignments.
• Redesigned quality assurance process.
• Automated applications for address canvassing and enumeration on mobile
devices.
3.5 SUMMARY OF INNOVATIONS
This section summarizes the key innovations being implemented for the 2020 Census. Innovations are
considered significant changes to the operational design as compared with the 2010 Census. The operations shaded in darker blue in Figure 15 indicate those that reflect major innovations as compared with
the 2010 Census.
Figure 15: Operations With Significant Innovations Since the 2010 Census
(The figure repeats the 35-operation layout of Figure 8, with the operations reflecting major innovations since the 2010 Census shaded in darker blue.)
Documented below are brief descriptions of the specific innovations for each operation. Note that these innovations are dependent upon funding.
Table 6: Summary of Key Innovations by Operation—Con.
Operation
Contributions
Local Update of Census Addresses (LUCA)
• Reduced complexity for participants.
• Elimination of the full address list submission options to improve quality and reduce burden and cost.
Address Canvassing (ADC)
• Use of a combination of in-office and in-field methods with 100 percent
In-Office Address Canvassing and an estimated 38 percent of addresses
going to the field.
• Use of automation and data (imagery, administrative records, and third-party
data) for In-Office Address Canvassing.
• Ongoing fieldwork (Master Address File Coverage Study) to validate in-office procedures, measure coverage, and improve in-field data collection
methodologies.
• Use of reengineered field management structure and approach to managing
fieldwork, including new field office structure and new staff positions.
Integrated Partnership and
Communications (IPC)
• Microtargeted messages and placement for digital advertising, especially for
hard-to-count populations.
• Advertising and partnership campaign adjusted based on respondent
actions.
• Expanded predictive modeling to determine propensity to respond by geographic areas.
• Expanded use of social media.
Internet Self-Response (ISR)
• Internet data capture, providing real-time edits, ability to capture unlimited
household size entries, and multiaccess methods across different technologies (e.g., computers, phones, tablets).
• Online questionnaires available in multiple languages.
• Contact approach tailored based on prior response rates, Internet access
data, and demographics (up to five self-response mailings).
• Validation of Internet responses.
Non-ID Processing (NID)
• Ability for public to respond anytime, anywhere.
• Real-time matching and geocoding of responses.
• Use of administrative records and third-party data to validate identity and
validate and augment address data for non-ID submissions.
Nonresponse Followup (NRFU)
• Use of administrative records and third-party data to reduce nonresponse
follow-up contacts for vacant housing units.
• Use of administrative records and third-party data to reduce nonresponse
follow-up contacts for occupied housing units.
• Use of reengineered field management structure and approach to managing
fieldwork.
• Use of a variable contact strategy and stopping rules to control the number
of attempts made for each address.
• Assignment and route optimization.
• Automated training for field staff.
• Automation of the field data collection.
• Automation of administrative functions such as recruiting, onboarding, and
payroll.
• Reengineered quality assurance approach.
Field Infrastructure (FLDI)
• Automated job application and recruiting processes, payroll submission and
approval process, and other administrative processes resulting in reduced
staffing requirements.
• Automated training.
• Reduced number of listers, enumerators, and supervisors due to reengineered design for field operations.
Decennial Logistics Management
(DLM)
• Reduced number of RCCs managing a reduced number of ACOs tasked with
managing field operations and support activities.
• Implementation of an online, real-time Enterprise Resource Planning system
with extended access for the RCCs and field offices.
• Implementation of a wireless network and bar code technology that will
automate inventory transactions.
IT Infrastructure (ITIN)
• Early development of solutions architecture.
• Use of enterprise solutions as appropriate.
• Iterative deployment of infrastructure aligned with and based on testing.
• Implementation of dDaaS.
• Use of demand models to help predict Internet response volume, Census Questionnaire Assistance center staffing, etc.
• Scalable design.
• Agile development of applications.
4. Key Tests, Milestones, and Production Dates
4.1 TESTS TO INFORM THE
OPERATIONAL DESIGN AND PREPARE
FOR CONDUCTING THE CENSUS
The 2020 Census has multiple decision points,
milestones, and production dates that must be
met to deliver the final apportionment and redistricting data. Informing the decision points are a
series of tests. More detailed information about
each test is captured in formal research and test
plan documents. An integrated master schedule
facilitates the integration and coordination of
activities across tests and operations. Refer to
Figure 2 in Section 1.3 for how this documentation
fits into the broader set of documentation for the
2020 Census Program. Test reports on specific
research topics are available at the Census Bureau
Web site.
As shown in Figure 16, the tests conducted
early in the decade (2012–2015) were aimed at
answering specific research questions (objectives)
needed to make decisions on important aspects
of the operational design for the four key innovation areas. In 2016, the focus shifted to validating
and refining the design by testing the interactions
across operations and determining the proposed
methodology for the operations. Testing of production systems began in 2017 and continued
through 2018, with final performance testing to
ensure scalability occurring in 2019. The End-to-End Census Test in 2018 tested the integration of
major operations and systems. High-level lessons
learned from the 2018 End-to-End Census Test
appear in this version of the document.
The first part of this section describes the tests
used to inform the operational design and prepare for conducting the 2020 Census. The second
part highlights key decision points and milestones beginning with the research and testing
phase in late 2011 through the completion of the
2020 Census in 2023. The third part provides the
planned production timeline for primary 2020
Census operations, and the final section shows an
integrated schedule of the tests, milestones, and
production operations.
In May 2016, the Census Bureau announced that the 2020 Census Program will use a commercial off-the-shelf platform for the data collection component of the 2020 Census.1 Prototype systems (e.g., in-field operational control system [MOJO], Census Operations Mobile Platform for Adaptive Services and Solutions [COMPASS], and PRIMUS) were used for the 2020 Census tests in 2014–2016. Beginning in 2017, 2020 Census tests included systems per the revised Business Solution Architecture.

1 Blumerman, L., 2020 Census Business Solution Architecture, 2020 Census Program Memorandum Series: 2016.06.

Figure 16: High-Level View of Tests (2012–2019 timeline: operational design tests of innovations from the four key innovation areas, individually and then integrating self-response and nonresponse; integration of all self-response modes; development and testing of production systems; end-to-end tests of operations and systems; and performance testing. Milestones mark the 2020 Census Operational Plan with initial design and with final design.)
Table 7 lists the operational tests executed or planned for the 2020 Census.

Table 7: Operational Tests

Calendar Year 2012
•• Census Barriers, Attitudes, and Motivators Study.
•• Public-Opinion Polling (ongoing as needed throughout the decade).
•• 2012 National Census Test.

Calendar Year 2013
•• 2013 National Census Contact Test.
•• 2013 Census Test.

Calendar Year 2014
•• 2014 Census Test.
•• Continuous Small-Scale Testing (ongoing as needed throughout the decade).
•• Local Update of Census Addresses Focus Groups.
•• 2014 Human-in-the-Loop Test.
•• Address Validation Test (started in late 2014).

Calendar Year 2015
•• 2015 Optimizing Self-Response Test.
•• 2015 Census Test.
•• 2015 National Content Test.
•• 2015 Group Quarters Electronic Capability Test Survey.

Calendar Year 2016
•• 2016 Census Test.
•• Address Canvassing Test.
•• 2016 eResponse Data Transfer Test.
•• 2016 Service-Based Enumeration Census Test.

Calendar Year 2017
•• 2017 Census Test.
•• 2017 eResponse Data Transfer Test.

Calendar Year 2018
•• 2018 End-to-End Census Test.

Calendar Year 2019
•• Post End-to-End Testing.

The following sections describe the tests listed above. Tests for calendar years 2012 through 2014 (the Research and Testing Phase) are combined into one section. For each test, a short description of the purpose, scope, and timing is presented, followed by a table with objectives of the tests, findings, and, where applicable, design implications based on these findings.

4.1.1 Tests in 2012–2014

As shown in Figure 17, eight tests were conducted between 2012 and 2014.

Figure 17: Tests in 2012–2014 (timeline of Public Opinion Polling, Continuous Small-Scale Testing, the 2012 National Census Test, the 2013 National Census Contact Test, the 2013 Census Test, the LUCA Focus Groups, the 2014 Census Test, and the 2014 Human-in-the-Loop Test)
4.1.1.1 Public Opinion Polling
The Public Opinion Polling Test is a public opinion survey of attitudes toward statistics produced by the
federal government that focuses on trust in the federal statistical system, the credibility of federal statistics, and attitudes toward and knowledge of the statistical uses of administrative records and third-party
data. The Census Bureau used the Nightly Gallup Polling for this survey and collected data by telephone
from a sample of nationally representative housing units daily. Data collection started in February 2012
and ended in July 2018. Findings from this survey are being used to inform approaches to communication about administrative records, privacy, and confidentiality.
Public-Opinion Polling Test
Objectives
•• Determine if the public’s perception of the Census Bureau’s commitment and ability to protect privacy and confidentiality is impacted if administrative records are used in the 2020 Census design.
•• Determine what the public is most concerned about with regard to privacy and confidentiality, in general and as related to government data collection.
•• Determine if attitudes toward the federal statistical system have changed over time or in
relation to specific events.
•• Collect data on hiring practices for the decennial census and on awareness of the Office of
Personnel Management (OPM) data breach.
Findings
•• Reported belief in the credibility of statistics predicts reported trust in federal statistics.
•• Respondents are more likely to favor using administrative records and third-party data when
questions regarding administrative records and third-party data are framed to indicate that
the use of records can save the government money or provide a social good.
•• Respondents are more likely to favor combining government data sets when questions
regarding combining data sources are framed to indicate that combined data would make
better use of funds and improve government services.
•• Respondents who report being concerned with responding to the census online cite concerns with security or hacking and access to the Internet or a computer.
•• Respondents who report knowledge of the statistical system, using data, or believing that
data are relevant, that data are kept confidential, and that agencies respect privacy also
report increased trust in statistical products.
•• Continue to see declines in reported trust of federal statistics and in the belief that federal
statistical agencies keep data confidential.
•• Awareness of the OPM data breach negatively influences respondents’ trust in federal
statistics.
•• Hiring people with criminal backgrounds for 2020 Census jobs has the potential to erode
trust for many and would hardly ever earn trust.
Design Implications
•• Continue to pursue research and testing related to the use of administrative records and thirdparty data.
•• Continue efforts to use partnership and communications activities to increase trust.
•• Suggest efforts to increase knowledge about the statistical system and increase data users,
which could help by increasing confidence in the federal statistical system.
•• Continue plans for a rapid response team to use communications to mitigate negative
impacts on trust from any data breaches or similar events.
•• Do not hire people with criminal backgrounds for the 2020 Census.
4.1.1.2 2012 National Census Test

The 2012 National Census Test studied overall self-response rates and Internet self-response rates. The test was conducted from August 2012 to October 2012 and included 80,000 nationally representative housing units.

2012 National Census Test
Objectives
•• Assess relative self-response rates and Internet self-response rates.
•• Evaluate the performance of combined race and origin questions on the Internet.
•• Assess the Telephone Questionnaire Assistance Operation.
Findings
•• Total self-response rate was 56.9 percent, and the Internet self-response rate was 36.5 percent.
•• An advance letter resulted in no significant difference in overall response rate as compared with no advance letter.
•• Providing a telephone number in the initial mailing resulted in no significant difference in overall response, but did result in an increase of telephone interviews.
•• A second reminder to answer the 2012 National Census Test performed well.
•• Tailoring the content of the reminder materials resulted in no significant difference in overall response.
•• Of the calls to the Telephone Questionnaire Assistance Operation, 69 percent were because the respondent did not have a computer or Internet access.
•• Response distributions of the combined race and origin questions were similar across the two question versions.
•• Results did not indicate expected benefit of enhanced reporting of detailed race and origin groups.
Design Implications
•• Further study of the collection of detailed race and origin groups in a national mailout test.
•• Continue tests to determine response rates and optimal contact strategies.
•• The 2020 Census Questionnaire Assistance Operation must account for increased call volumes.

4.1.1.3 2013 National Census Contact Test

The 2013 National Census Contact Test studied two key areas related to strategies for contacting respondents: the quality of the Contact Frame (a list of supplemental contact information, such as email addresses and phone numbers, built from third-party data sources) and automated processing of census responses lacking a preassigned Census identification number (Non-ID Processing). The study included 39,999 nationally representative addresses.

2013 National Census Contact Test
Objectives
•• Evaluate the quality of phone and email contact information acquired from third-party data sources.
•• Test proposed enhancements to automated processing of responses lacking a preassigned Census identification number.
Findings
•• Respondents were not able to validate contact information for other household members.
•• The use of administrative records and third-party data was effective in enhancing non-ID addresses to allow for a match to the MAF.
Design Implications
•• Continue testing the quality of the Contact Frame.
•• Continue enhancing the functionality associated with Non-ID Processing.
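Non-ID Processing, discussed above, depends on matching a respondent-supplied address to a MAF record, with administrative records and third-party data used to enhance incomplete addresses before rematching. The toy sketch below illustrates that flow only; the in-memory "MAF", abbreviation table, and admin-records lookup are invented stand-ins, not Census Bureau systems.

```python
import re
from typing import Optional

# Toy stand-in for the Master Address File: normalized address -> MAF ID.
MAF = {
    "123 MAIN ST 20233": "MAFID-0001",
    "45 OAK AVE APT 2 19104": "MAFID-0002",
}

# Hypothetical administrative-records lookup used to enhance an
# incomplete non-ID response (here, filling in a missing ZIP code).
ADMIN_RECORDS_ZIP = {"123 MAIN ST": "20233"}

ABBREVIATIONS = {"STREET": "ST", "AVENUE": "AVE", "APARTMENT": "APT"}

def normalize(address: str) -> str:
    """Uppercase, strip punctuation, and standardize common suffixes."""
    tokens = re.sub(r"[^\w\s]", " ", address.upper()).split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def match_to_maf(address: str) -> Optional[str]:
    norm = normalize(address)
    if norm in MAF:
        return MAF[norm]
    # Enhancement step: append a ZIP code from administrative records,
    # then try the match again.
    zip_code = ADMIN_RECORDS_ZIP.get(norm)
    if zip_code is not None:
        return MAF.get(f"{norm} {zip_code}")
    return None

print(match_to_maf("123 Main Street"))   # matched only after enhancement
print(match_to_maf("99 Unknown Rd"))     # no match
```

A production matcher handles far more variation (directionals, unit designators, geocoding), but the enhance-then-rematch pattern mirrors the finding above.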
4.1.1.4 2013 Census Test
The 2013 Census Test was an operational study of Nonresponse Followup (NRFU) procedures. This test
was conducted in late 2013 and involved 2,077 housing units in Philadelphia, PA.
2013 Census Test
Objectives
•• Evaluate the use of administrative records and third-party data to identify vacant housing
units and remove them from the NRFU workload.
•• Evaluate the use of administrative records and third-party data to enumerate nonresponding
occupied housing units to reduce the NRFU workload.
•• Test an adaptive design approach for cases not enumerated with administrative records and
third-party data.
•• Test methods for reducing the number of enumeration contact attempts as compared with
the 2010 Census.
•• Test the use of the telephone to make initial enumeration contact attempts.
Findings
•• Successfully used administrative records and third-party data to identify vacant and occupied housing units and removed cases from the NRFU workload.
•• Successfully used administrative records and third-party data as part of an adaptive design
approach to designate cases for one to three contact attempts.
•• Adaptive design strategies as implemented did not work.
•• Design added complexity to training of enumerators.
Design Implications
•• Continue refinement of adaptive design methods and administrative records and third-party
data usage.
•• Continue refinement of training methods.
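The adaptive-design idea tested here, designating cases for one to three contact attempts based on modeled scores, can be illustrated with a simple thresholding rule. The thresholds and inputs below are hypothetical and are not the rules used in the test.

```python
def contact_attempts_allowed(response_propensity: float,
                             admin_records_quality: float) -> int:
    """Assign a NRFU case 0-3 in-person contact attempts.

    response_propensity: modeled probability the household resolves on an
        early attempt (0-1). admin_records_quality: confidence that
        administrative records could enumerate the unit instead (0-1).
    All thresholds are hypothetical.
    """
    if admin_records_quality >= 0.9:
        return 0  # enumerate from administrative records; skip fieldwork
    if response_propensity >= 0.6:
        return 1  # likely to resolve on the first visit
    if response_propensity >= 0.3:
        return 2
    return 3      # hardest cases get the full three attempts

workload = [(0.8, 0.2), (0.4, 0.5), (0.1, 0.95), (0.2, 0.1)]
print([contact_attempts_allowed(p, q) for p, q in workload])  # [1, 2, 0, 3]
```

Capping attempts this way is what reduces the NRFU workload; the training complexity noted in the findings came from enumerators having to apply case-specific caps.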
4.1.1.5 2014 Census Test
The 2014 Census Test was an operational study of self-response and NRFU procedures. For this test,
Census Day was July 1, 2014. The test involved 192,500 housing units in portions of Montgomery
County, MD, and Washington, DC.
2014 Census Test
Objectives
•• Test various self-response modes, including the Internet, Census Questionnaire Assistance,
and paper, and response without a preassigned Census identifier.
•• Evaluate the value of a preregistration option using “Notify Me” (a Web site that allows
respondents to indicate a preferred mode of contact for the 2020 Census).
•• Test the use of mobile devices for NRFU enumeration in the field.
•• Test the use of Bring Your Own Device (BYOD) to conduct enumeration in the field.
•• Continue evaluating the use of administrative records and third-party data to remove cases
(vacant and nonresponding occupied housing units) from the NRFU workload.
•• Test the effectiveness of applying adaptive design methodologies in managing the way field
enumerators are assigned their work.
•• Examine reactions to the alternate contacts, response options, administrative record use, and
privacy or confidentiality concerns (including how the Census Bureau might address these
concerns through micro- or macro-messaging) through focus groups.
2014 Census Test
Findings
•• Total self-response rate was 65.9 percent, and the Internet self-response rate was 50.6
percent.
•• Email contact attempts did not work due to large number of incorrect email addresses
(bounce-backs).
•• The address collection interface in the Internet instrument yielded a much greater proportion
of higher quality address data from respondents without a unique Census ID than in 2010.
•• Use of administrative records and third-party data matching improved the overall address
matching rate.
•• “Notify Me” had low participation, with only about 3 percent of the sample choosing to
preregister.
•• Higher than projected in-bound phone workloads due to respondent questions and issues
primarily related to Internet access.
•• Problems with coordinating contact with gated communities resulting in inefficient
enumeration.
•• Need to strengthen training and procedures on contacting nonresponding housing units,
specifically as related to proxy interviews.
•• Need improved business rules and improved rule-based models for administrative records
and third-party data.
Design Implications
•• Conduct another test of “Notify Me” to determine if more people use this capability when
advertising is used to inform the public about the 2020 Census, and specifically about the
“Notify Me” option.
•• Determine optimal use of adaptive design and administrative records and third-party data.
•• Further explore the use of BYOD.
4.1.1.6 Continuous Small-Scale Testing
The Continuous Small-Scale Testing is a study of respondent and nonrespondent reactions to new
modes of decennial census contact and response. The study focuses on reactions related to privacy
and confidentiality of these modes. This study started in January 2014 and is ongoing as needed. It has
included emails to 1,000–2,200 housing units sampled from an opt-in frame.
Continuous Small-Scale Testing
Objectives
•• Determine how new contact and response modes will impact the public’s perception of the
Census Bureau’s commitment and ability to protect privacy and confidentiality.
•• Determine how the public feels about each new mode being tested, specifically with regard
to privacy and confidentiality.
Findings
•• A text-based email outperformed graphical emails.
•• Longer email content with “Dear Resident” and signature of the Director outperformed a
shorter email invitation without the greeting and signature.
•• Respondents report preferring reporting online to a decennial census with a mailed invitation
with the link over all other options.
•• In an experiment with an idea for publicity for the 2020 Census, very few respondents (less than 4 percent) forwarded a survey request to friends and family.
•• In an experiment with Non-ID Processing, asking an explicit question about collecting location data in addition to the smartphone’s own question appeared to increase the percentage
of people who allowed their mobile phone’s location to be accessed compared to when only
the phone’s own location message appeared.
•• The source of the administrative data has more impact on a favorable opinion toward its use
than any other attribute, including the amount of time saved by the respondent if administrative data are used instead of a survey response.
•• Data use statements are more important to respondents than other messages contained in
the survey invitation.
Design Implications
Continue to monitor respondent and nonrespondent reactions to various contact and response
modes.
4.1.1.7 LUCA Focus Groups
The Local Update of Census Addresses (LUCA) Focus Groups collected input on potential LUCA models
for the 2020 Census. Focus groups consisted of eligible LUCA participants representing various sizes
and types of governments across the nation. Forty-six governmental entities participated. The focus
groups were conducted from March 2014 through June 2014.
LUCA Focus Groups
Objectives
Obtain feedback on potential LUCA models for the 2020 Census through a series of focus
groups with 2010 Census LUCA participants.
Findings
•• Continue the 2010 Census LUCA Operation improvements that were successful:
°° 120-day review time for participants.
°° 6-month advance notice about the LUCA Operation registration.
°° Comprehensive communication program with participants.
°° Provide a variety of LUCA media types.
°° Improve the partnership software application.
°° State participation in the LUCA Operation.
•• Eliminate the full address list submission options that were available in 2010 LUCA (Options 2
and 3). This will:
°° Reduce the number of deleted LUCA addresses in field verification activities.
°° Reduce the burden and cost of processing addresses and LUCA address validation.
°° Reduce the complexity of the LUCA Operation.
•• Include census housing unit location coordinates in the census address list and allow partners
to return their housing unit location coordinates as part of their submission.
•• Provide any ungeocoded United States Postal Service Delivery Sequence File address to
state and county partners.
•• Provide the address list in more standard formats.
•• Conduct an in-office validation of LUCA-submitted addresses.
•• Utilize Geographic Support System data and tools to validate LUCA submissions.
•• Encourage governments at the lowest level to work with higher level governments to consolidate their submission.
•• Eliminate the Block Count Challenge.
•• Eliminate the use of the asterisk (*) designation for multiunits submitted without unit
designations.
•• Encourage LUCA participants to identify addresses used for mailing, location, or both.
Design Implications
•• Develop in-office validation processes, procedures, and tools.
•• Define relationship between Address Canvassing and LUCA, taking into consideration the
timing of LUCA feedback and the appeals operation.
•• Determine the feasibility of technical recommendations for the 2020 Census LUCA
Operation:
°° Use of background imagery on paper maps.
°° Ability to provide structure locations within LUCA materials.
°° Feasibility of Web-based registration.
•• Determine feasibility of using areas where the Census Bureau has planned field activities to
validate LUCA addresses.
4.1.1.8 2014 Human-in-the-Loop Test

The 2014 Human-in-the-Loop Test consisted of a simulation of reengineered field operations using an Operational Control Center and the enhanced operational control system. The purpose was to test proposed devices, systems, and the field structure for staff and management processes. The Simulation Experiment occurred in November 2014. In this test, real-time field operations and field management structure were tested by 87 field and office staff members.

2014 Human-in-the-Loop Test
Objectives
•• Exercise field reengineering methods (staffing ratios and enhanced operational control system) in a simulated environment.
•• Refine methods and get input from field staff to improve business processes prior to the 2015 Census Test.
Findings
•• The new design for managing field operations was successful, including the use of an Operational Control Center and operational control system to manage the NRFU workload.
•• The ratio of enumerators to supervisors can be increased from the 2010 Census.
•• Instant notification to enumerators and supervisors is feasible and serves as a successful means of communication.
Design Implications
•• Employ the new design for reengineered field operations during the 2015 Census Test.
•• Increase the ratio of enumerators to supervisors—further testing required.

4.1.2 Tests in 2015

A key milestone in October 2015 was the release of the preliminary operational design for the 2020 Census as documented in version 1.1 of this plan and supporting materials. This original design was informed by tests conducted from 2012 through 2015.

Figure 18 shows the schedule for the four tests in 2015 and the 2020 Census Operational Plan milestone. Each test is described below.

4.1.2.1 Address Validation Test

The Address Validation Test was conducted to assess the performance of methods and models to help develop the 2020 Census address list and to estimate the In-Field Address Canvassing workloads for the 2020 Census. The test contained two components, the Master Address File (MAF) Model Validation Test (MMVT) and the Partial-Block Canvassing (PBC) Test.
Figure 18: Tests and Key Decisions in 2015 (timeline: MAF Model Validation, 9/14–12/14; Partial-Block Canvassing, 12/14–2/15; 2015 Optimizing Self-Response Test, 2/15–6/15; 2015 Census Test, 3/15–6/15; 2015 National Content Test, 8/15–12/15; 2020 Census Operational Plan, 10/15)
MAF Model Validation Test

The MMVT evaluated methods that are part of the reengineered Address Canvassing innovation area. The test was conducted from September 2014 to December 2014 and included 10,100 nationally representative blocks (100 blocks with no addresses), which included approximately 1.04 million addresses in the sample blocks.

MAF Model Validation Test
Objectives
•• Test In-Office and In-Field Address Canvassing procedures.
•• Determine the ability to ensure an accurate MAF.
•• Assess the ability of two sets of statistical models to predict blocks that have experienced address changes.
Findings
•• Statistical models were not effective at predicting national coverage errors.
•• Statistical models were not effective at identifying blocks with changes.
•• In-Office Address Canvassing was effective.
Design Implications
•• Statistical models are not being pursued for determining blocks with changes or MAF coverage.
•• Continue with In-Office and In-Field Address Canvassing approaches.

Partial-Block Canvassing

The PBC Test evaluated the feasibility of canvassing portions of blocks, rather than entire blocks, using both in-office and in-field methods. This test was conducted from December 2014 to February 2015. The staff conducted an interactive review of aerial imagery over time and geographic quality indicators. Six hundred fifteen blocks with national distribution were listed by 35 professional staff.

Partial-Block Canvassing
Objectives
•• Determine ability to accurately canvass partial blocks.
•• Evaluate an interactive review of various materials—primarily aerial imagery over time and geographic quality indicators.
•• Measure unrecorded changes in blocks and identify portions of blocks where change is likely.
Findings
•• Operationally feasible to canvass portions of blocks.
•• In-office imagery review of blocks has utility.
Design Implications
Continue to evaluate risks vs. benefits of PBC approach. (Note: subsequent to this test, a decision was made to do only full-block address canvassing. See the Address Canvassing Operation section for more information.)
4.1.2.2 2015 Optimizing Self-Response Test
The 2015 Optimizing Self-Response Test was an operational study of self-response procedures. For this
test, Census Day was April 1, 2015. In the Savannah, GA, media market, 407,000 housing units were
included in this test, with 120,000 sampled self-responding housing units.
2015 Optimizing Self-Response Test
Objectives
•• Determine use of digital and targeted advertising, promotion, and outreach to engage and
motivate respondents.
•• Test value of “Notify Me” when partnerships and traditional and targeted advertising are
used to promote early engagement of respondents.
•• Offer opportunity to respond without a Census ID (Non-ID Processing) and determine operational feasibility and potential workloads around real-time Non-ID Processing.
•• Determine self-response and Internet response rates.
Findings
•• The total response rate was 47.5 percent, and the Internet response rate was 33.4 percent.
•• An additional 35,249 Internet responses came from housing units not selected in mail panels
as a result of advertising and promotional efforts.
•• Continued low participation in “Notify Me.”
•• Successful implementation of real-time Non-ID Processing, matching 98.5 percent of cases.
•• A new postcard panel, designed to test how housing units not originally included in the sample would respond to an invitation after being exposed to advertising, generated a response
of approximately 8 percent.
Design Implications
•• Discontinue “Notify Me.”
•• Continue testing related to partnerships, advertising, and promotional efforts.
•• Continue offering the non-ID response option to respondents.
4.1.2.3 2015 Census Test
The 2015 Census Test was an operational study of NRFU procedures. Census Day was April 1, 2015.
This test included 165,000 sampled housing units in Maricopa County, AZ.
2015 Census Test
Objectives
•• Continue testing of a fully utilized field operations management system that leverages planned automation and available real-time data, as well as data households have already provided to the government, to transform the efficiency and effectiveness of data collection operations.
•• Begin examining how regional offices can remotely manage local office operations in an
automated environment, the extent to which enumerator and manager interactions can occur
without daily face-to-face meetings, and revised field staffing ratios.
•• Reduce NRFU workload and increase productivity with the use of administrative records and
third-party data, field reengineering, and adaptive design.
•• Test operational implementation of BYOD.
•• Explore reactions to the NRFU contact methods, administrative records and third-party data
use, and privacy or confidentiality concerns.
2015 Census Test
Findings
•• The total self-response rate was 54.9 percent and the Internet self-response rate was 39.7
percent.
°° Coverage questions increased respondent burden.
•• Field Staff Training
°° Combination of online and classroom training provided standardization of the information, provided tracking capabilities, and offered various learning methods.
°° Reduced training hours compared with the 2010 Census NRFU enumerator training from
32 hours to 18 hours.
°° Deployment of YouTube videos efficiently provided supplemental training to enumerators.
°° Topics requiring additional training in future tests were identified.
•• Field Reengineering
°° Area Operations Support Center and staffing of the Area Operations Support Center were
successful.
°° Electronic payroll was successful.
°° Enumerator entry of availability for work and office operational system workload optimization were effective.
°° Operational Control System alerts were effective in bringing attention to situations that
required follow-up and possible corrective action.
°° Optimized routing was successful overall, but uncovered need for modifications to the
routing algorithm.
•• COMPASS was effectively used as the application for enumerating nonresponding housing
units.
°° COMPASS application was easy to use.
°° COMPASS application experienced crashes and freezes; further investigation into root
causes is needed.
•• Field Test Procedures
°° Work needed to define a coordinated approach to enumeration within multiunits and
gated communities.
°° Refinement to data collection application “pathing” needed to better assist enumerators
in cases on proxy responses and noninterviews.
•• BYOD
°° Training was fairly labor-intensive.
°° Based on observations, no adverse respondent reactions to the device being used for
data collection.
°° A variety of logistical and security risks related to implementation of BYOD were
identified.
•• Administrative Records use in NRFU.
°° The team successfully implemented the usage of predictive models and optimization
approaches to identify the administrative record vacant and administrative record occupied units. These approaches show promise and will continue to be researched in future
census tests.
°° The team successfully used Internal Revenue Service information from the current tax
year in the administrative record identification. The Census Bureau successfully processed
the administrative record sources before the start of the NRFU field operation.
°° The delivery and processing of monthly Internal Revenue Service information was
successful, so an additional identification of administrative record occupied units in the
middle of NRFU has been added.
Design Implications
•• Employ the use of automated training.
•• Continue to test the use of administrative records and third-party data in reducing the NRFU
workload.
•• Optimize the number of visits and phone contacts for nonresponding housing units.
•• Make at least one contact for nonresponding housing units.
•• Continue to test field procedures for contacting nonresponding housing units.
•• The decision to stop testing BYOD and move forward with decennial Device as a Service
(dDaaS) was made in January 2016 because of the risks related to BYOD. The decision
discussion and risks are documented in the “2020 Census Program Memorandum Series:
2016.01.”
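The predictive-model approach reported in the findings (score each nonresponding unit on the likelihood that it is vacant, then remove high-confidence cases from the NRFU workload) can be sketched with a minimal hand-rolled logistic model. The coefficients, feature names, and cutoff below are invented for illustration and are not the Census Bureau's actual model.

```python
import math

# Hypothetical coefficients for a vacancy model; in practice these would
# be estimated from administrative records and third-party data.
INTERCEPT = -2.0
COEFFS = {
    "no_irs_return_filed": 1.8,
    "usps_undeliverable": 2.2,
    "utility_account_inactive": 1.5,
}

def vacancy_probability(unit_features: dict) -> float:
    """Logistic model score: estimated probability the unit is vacant."""
    z = INTERCEPT + sum(COEFFS[k] * v for k, v in unit_features.items())
    return 1.0 / (1.0 + math.exp(-z))

def remove_from_nrfu(units: dict, cutoff: float = 0.85) -> list:
    """IDs of units confidently predicted vacant (dropped from workload)."""
    return [uid for uid, feats in units.items()
            if vacancy_probability(feats) >= cutoff]

units = {
    "unit-A": {"no_irs_return_filed": 1, "usps_undeliverable": 1,
               "utility_account_inactive": 1},
    "unit-B": {"no_irs_return_filed": 1, "usps_undeliverable": 0,
               "utility_account_inactive": 0},
}
print(remove_from_nrfu(units))  # ['unit-A']
```

The cutoff controls the trade-off the findings describe: a higher cutoff removes fewer cases but lowers the risk of wrongly skipping an occupied unit.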
4.1.2.4 2015 National Content Test
The 2015 National Content Test evaluated and compared different census questionnaire content. It assumed a Census Day of September 1, 2015. The test included 1.2 million nationally representative households, including 20,000 households in Puerto Rico and 100,000 reinterviews. Two major reports of results from this test have been released publicly: the 2015 National Content Test Relationship Question Experiment Analysis, at 2020 Census Program Memorandum Series 2017.07, and the 2015 National Content Test Race and Ethnicity Analysis Report, at 2020 Census Program Memorandum Series 2017.08.
2015 National Content Test
Objectives
•• Evaluate and compare different census questionnaire content, including questions on race and
Hispanic origin (e.g., combining race and Hispanic origin into a single question versus using
separate questions, and introducing a Middle Eastern North African category), relationship
(introducing same-sex relationship categories), and within-household coverage (streamlined
approach for ensuring accurate within-household coverage).
•• Refine estimates of national self-response and Internet response rates.
•• Continue to test self-response modes and contact strategies (see 2014 Census Test objectives).
•• Reinterview a subsample of respondents to further assess the accuracy and reliability of the
question alternatives for race, Hispanic origin, and within-household coverage.
Findings
•• The total self-response rate was 51.9 percent, and the Internet self-response rate was 35.6
percent.
•• Adding a final mailing, a reminder sent after the paper questionnaire, significantly increased
response rates.
•• Sending the final reminder sooner by a few days prompted quicker responses, thus reducing
the size of the third mailing.
•• In low response areas, the “choice” strategy of sending a paper questionnaire in the final mailing
is effective.
•• Providing the letters in English and Spanish, rather than just English with a Spanish sentence,
elicits more Spanish language responses.
•• The new relationship question (with same-sex and opposite-sex categories) showed the same
distributions as the old relationship question.
•• Analysis of the race and ethnicity questions appears in an external report.
Design
Implications
•• Send a fifth mailing to nonrespondents.
•• Send the final reminder mailing a few days sooner.
•• Provide more language support in the mail materials.
•• Continue research on identifying which areas should receive the paper questionnaire in the first
mailing.
•• Use the new relationship categories.
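The mailing-cadence implications above (a fifth mailing to nonrespondents, an earlier final reminder, and paper in the first mailing for low response areas) can be illustrated with a small scheduling sketch. This is a hypothetical illustration only; the day offsets and cohort logic are invented and are not the Census Bureau's production rules.

```python
# Hypothetical sketch of a five-mailing contact strategy shaped by the
# 2015 National Content Test design implications. Day offsets are invented;
# only the cadence's shape follows the findings: a fifth mailing to
# nonrespondents and a final reminder sent a few days sooner.

def mailing_schedule(low_response_area: bool) -> list[tuple[int, str]]:
    """Return (day, mailing) pairs for one panel of households."""
    first = "paper questionnaire" if low_response_area else "invitation letter"
    return [
        (0, first),                   # "choice" areas get paper up front
        (8, "reminder letter"),
        (15, "reminder postcard"),
        (22, "paper questionnaire"),  # sent to nonrespondents only
        (26, "final reminder"),       # moved a few days earlier
    ]

def next_mailing(schedule, day, has_responded):
    """Pick the next pending mailing; responders drop out of the cadence."""
    if has_responded:
        return None
    return next((m for d, m in schedule if d > day), None)
```

For example, a nonresponding household in a low response area on day 16 would next receive the fourth mailing, the paper questionnaire.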
42 2020 Census Operational Plan—Version 4.0
U.S. Census Bureau
4.1.3 Tests in 2016
In 2016, the Census Bureau moved from small-scale individual tests using proof of concept and prototype systems to more refined tests and the building and integration of systems that will support the
2020 Census. As shown in Figure 19, two major tests were completed in 2016. The 2016 Census Test
focused on the integration of self-response and NRFU. The Address Canvassing Test expanded early
address canvassing tests to refine the in-office and in-field methods. Each test is described below.
The following operations and systems were tested in 2016 through these two tests:
Key Innovation Area: Operations (Systems)
•• Reengineering Address Canvassing: Address listing (Enterprise Listing and Mapping System/Listing and Mapping Instrument).
•• Optimizing Self-Response: Internet Response (PRIMUS Prototype); Telephone Response (Census Bureau Call Centers); Paper Response (Integrated Capture and Data Entry); Non-ID Processing (Real-time Non-ID Processing); Language Service.
•• Utilizing Administrative Records and Third-Party Data: Identification of vacant and occupied units, and removal of cases with high-quality data from other sources from the NRFU workload (Headquarters’ servers; Control and Response Processing Data System).
•• Reengineering Field Operations: Workload Control (MOJO, the in-field operational control system prototype, which began interfacing with the Multimode Operational Control System); Enumeration and Quality Assurance (COMPASS Prototype).
Figure 19: Tests Planned in 2016 (2016 Census Test, March–July 2016; 2016 Address Canvassing Test (In-Field), September–November 2016)
4.1.3.1 2016 Census Test
The 2016 Census Test was an operational study of both self-response and NRFU procedures. It had
a Census Day of April 1, 2016, and included a planned 250,000 housing units per site in Los Angeles
County, CA, and Harris County, TX.
2016 Census Test
Objectives
•• Self-response
°° Test provision of language support to Limited English Proficient populations through partnerships and bilingual questionnaires.
°° Test ability to reach demographically diverse populations.
°° Test deployment of non-English data collection instruments and contact strategies.
°° Refine Real-Time Non-ID Processing methods, including respondent validation.
•• NRFU
°° Refine the reengineered field operations.
°° Refine the field management staffing structure.
°° Test enhancements to the Operational Control System and COMPASS.
°° Refine the path in COMPASS to conduct proxy interviews.
°° Test improved procedures for multiunit accessibility and contact.
•• Reengineered quality assurance
°° Evaluate the use of paradata and Global Positioning System points collected during
interview.
°° Test reinterview functionality.
•• Measure the systems’ abilities to manage a significant number of concurrent users during
self-response.
•• Test a combination of government-furnished equipment and decennial Device as a Service (dDaaS) strategies for supplying enumerators with hardware devices.
•• Test scalability of Internet and Non-ID Processing during self-response using enterprise
solutions.
Findings
and Design
Implications
Findings:
•• Sending a letter instead of a postcard as the first reminder improves response rates.
•• Sending a paper questionnaire in the first mailing (Internet choice) to areas expected to have
lower Internet usage is beneficial.
•• Further understanding gained about how to connect with people and encourage
self-response.
•• Continuing to utilize and refine approach to using administrative records and third-party data
to reduce the NRFU workload is productive.
•• Progressing along path to leverage automation across the 2020 Census Program.
°° Balance emphasis on controlling and managing attempts with emphasis on completing
interviews.
°° Routing rules need to be refined.
Design implications:
•• In the 2020 Census, about 20 percent of the country will receive the Internet-choice methodology, and all areas that have not yet responded will receive a paper questionnaire in the fourth mailing.
•• Priority focus areas for continuing development:
°° Operational/management reports.
°° Operational Control System/Optimizer enhancements.
°° Training.
°° Closeout procedures.
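The Internet-choice design implication above can be sketched as a simple assignment rule: rank areas by expected Internet usage and give the lowest roughly 20 percent a paper questionnaire in the first mailing. This is an illustrative sketch only; the scoring input and threshold mechanics are assumptions, not the actual 2020 Census methodology.

```python
# Hypothetical sketch: assign each area an initial contact strategy. The
# lowest ~20 percent of areas by predicted internet usage get a paper
# questionnaire in the first mailing ("Internet choice"); all others are
# invited to respond online first ("Internet first"). The 20 percent share
# follows the design implication above; the scoring itself is invented.

def assign_strategies(internet_usage_by_area: dict[str, float],
                      choice_share: float = 0.20) -> dict[str, str]:
    ranked = sorted(internet_usage_by_area, key=internet_usage_by_area.get)
    n_choice = round(len(ranked) * choice_share)
    return {area: ("Internet choice" if i < n_choice else "Internet first")
            for i, area in enumerate(ranked)}
```

With five areas and a 20 percent share, only the area with the lowest predicted usage receives the Internet-choice mailing.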
4.1.3.2 Address Canvassing Test
The primary objective of the Address Canvassing
Test was to examine the effectiveness of the
In-Office Address Canvassing through the results
of the In-Field Address Canvassing. In addition,
the test provided the opportunity to measure the
effectiveness of integrated systems, field staff
training, and the use of new collection geography in the field. The Address Canvassing Test
occurred in Buncombe County, NC, and the city of St. Louis, MO. Both Address Canvassing components,
In-Office and In-Field, were conducted for all areas
of the test sites. All data collection activities for
the test occurred from August through December
of 2016, with In-Office Address Canvassing data
collection from August through October of 2016,
In-Field Address Canvassing data collection from
October through mid-November of 2016, and
In-Field Relisting from mid-November through
mid-December of 2016.
Address Canvassing Test
Objectives
•• Implement all In-Office Address
Canvassing processes, including
Interactive Review, Active Block
Resolution, MAF Updating and
Identification of the In-Field
Address Canvassing workload.
•• Evaluate the effectiveness of online training for
Field Supervisors and Field
Representatives.
•• Measure the effectiveness of
In-Office Address Canvassing
through In-Field Address
Canvassing.
•• Integrate multiple information
technology applications to
create one seamless operational
data collection, control and
management system.
Findings
and Design
Implications
•• The Census Bureau should
continue pursuing the use of
In-Office Address Canvassing
processes for identification of
the In-Field Address Canvassing
workload.
•• In-Office Address Canvassing
methods are generally effective
in detecting where the MAF
has remained accurate, where
it is keeping pace with changes
on the ground, and where
fieldwork is needed to acquire
address updates.
•• Assumptions about situations
that pose challenges to detect
change through imagery analysis are generally correct.
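The in-office/in-field split described above can be sketched as a simple triage: blocks where review shows the MAF is keeping pace stay in office, while blocks with detected change, or where imagery cannot tell, go to the In-Field Address Canvassing workload. The block attributes below are hypothetical stand-ins for the Interactive Review outcomes, not actual Census Bureau data fields.

```python
# Hypothetical triage sketch for Reengineered Address Canvassing: keep
# "stable" blocks in office and send changed or hard-to-review blocks to
# the In-Field Address Canvassing workload. The attributes are invented
# stand-ins for the In-Office Interactive Review outcomes described above.

from dataclasses import dataclass

@dataclass
class Block:
    geoid: str
    change_detected: bool      # imagery vs. MAF comparison found change
    imagery_reviewable: bool   # False where imagery analysis is challenged

def in_field_workload(blocks: list[Block]) -> list[str]:
    """Blocks needing fieldwork: detected change, or imagery cannot tell."""
    return [b.geoid for b in blocks
            if b.change_detected or not b.imagery_reviewable]
```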
4.1.3.3 Group Quarters Tests
Group Quarters were not included in any of
the testing that occurred early in the decade.
However, there were three small-scale tests in
2015 and 2016 that were primarily efforts to test
automation in collection of response data through
electronic means. These tests were the 2015
Group Quarters Electronic Capability Test Survey,
the 2016 eResponse Data Transfer Test, and the
2016 Service-Based Enumeration Census Test.
Group Quarters Electronic Capability Test
Survey
Purpose: Explore GQ administrators’ ability and
willingness to send resident-level data electronically to the Census Bureau.
Findings: Of 260 GQ umbrella organizations and
agencies contacted, 40 percent responded. Of
those that responded, the majority reported: the
availability of eResponse data records, the ability
to transmit eResponse data records to the Census
Bureau, the ability to provide an electronic file
in an Excel format, and willingness to participate
in the Group Quarters Electronic Response Data
Transfer Test (or eResponse Test).
eResponse Data Transfer Test
Purpose: Explore the capabilities for enumeration using electronic response data; evaluate the ability of GQ administrators to link residents/clients to the correct GQ address.
Findings: All data files were successfully uploaded; all administrators were able to use the Excel spreadsheet provided by the Census Bureau; files linked residents/clients to the correct GQ address, as specified; some data parsing was required for formatting submitted data into usable enumeration data.
Lesson Learned: Test with a standardized template and test across multiple GQ types.
Service-Based Enumeration Census Test
Purpose: Explore the feasibility of enumerating the service-based population using an automated instrument; explore the availability of administrative records for enumerating at service-based locations; determine the staffing needs when using a mix of enumeration instruments at these locations.
Findings: Almost 100 percent of participants were enumerated using the automated instrument and provided the required data items; the automation and administrative record enumeration effort was strongly supported by GQ administrators. Only shelters have the capability to provide listings containing client-level information. A different staffing ratio is required depending on the type of service-based location and whether automation is used for enumeration.
Design Implications: Although the 2016 Service-Based Enumeration Census Test revealed that automation worked well for this population, additional research was determined to be necessary before automation could be used across all group quarters types. After factoring in budget constraints, the Census Bureau decided to continue the use of paper for service-based locations and also to allow facility administrators the option to use eResponse.
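The parsing step noted in the eResponse findings, formatting submitted files into usable enumeration records linked to the correct GQ address, can be sketched as below. The column names are a hypothetical standardized template; the lesson learned above is precisely that such a template across GQ types would reduce this parsing work.

```python
# Hypothetical sketch of normalizing a GQ eResponse submission into
# enumeration records. The template columns ("gq_id", "last", "first",
# "dob") are invented for illustration; real submissions varied, which is
# why a standardized template was a lesson learned.

import csv
import io

def parse_eresponse(csv_text: str, valid_gq_ids: set[str]) -> list[dict]:
    """Link each resident row to its GQ; drop rows with unknown GQ IDs."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        gq_id = row["gq_id"].strip()
        if gq_id in valid_gq_ids:
            records.append({"gq_id": gq_id,
                            "name": f'{row["first"].strip()} {row["last"].strip()}',
                            "dob": row["dob"].strip()})
    return records
```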
4.1.4 Tests in 2017
One major test was completed in 2017: the 2017 Census Test. This test is described below and shown in Figure 20.
4.1.4.1 2017 Census Test
The 2017 Census Test was a nationwide self-response test of 80,000 households, testing the integration of operations and systems for self-response. In particular, Internet Self-Response was tested in the cloud, and Census Questionnaire Assistance was tested at two call centers. The test oversampled areas with relatively high populations of American Indians and Alaska Natives as a mechanism for testing potential use of tribal enrollment questions nationwide. This test had a Census Day of April 1, 2017.
2017 Census Test
Objectives
•• Test the integration of operations and systems for self-response.
•• Test the feasibility of collecting tribal enrollment information.
Findings and Design Implications
•• Successfully fielded the public-facing production systems for the 2020 Census, which included:
°° Internet Self-Response: Rapid response mechanisms (e.g., escalation procedures, issue resolution points of contact) were refined during the test and will be further refined and documented.
°° Census Questionnaire Assistance (two call centers): The primary call processing system components were proven and will be used in 2018.
•• The tribal enrollment question was not proposed for the 2020 Census. Analysis of the tribal enrollment questions will appear in an external report.
Figure 20: Schedule for the 2017 Census Test (Internet Self-Response, March–August 2017; Census Questionnaire Assistance, March–May 2017)
4.1.5 Tests in 2018
In addition to the 2018 End-to-End Census Test
that tested major systems and operations in an
integrated fashion, the Integrated Partnerships
and Communications Operation did a survey to
understand public opinion about participating in
the 2020 Census, known as the Census Barriers,
Attitudes, and Motivators Study.
One major test was planned for 2018, the 2018 End-to-End Census Test. The goal was to have the entire operational design for the major operations ready for production—from a systems, operational, and architectural perspective. The 2018 End-to-End Census Test included significant field data collection components, and the timing of the field operations mimicked the 2020 Census (see Figure 21).
4.1.5.1 2018 End-to-End Census Test
The 2018 End-to-End Census Test tested and
validated 2020 Census operations, procedures,
systems, and infrastructure together. This test had
a Census Day of April 1, 2018, and was conducted
in Providence County, RI. The Address Canvassing
Operation was conducted in the prior calendar
year because this operation is responsible for
producing the census frame, which has to be done
before the data collection. Address canvassing
was performed in three areas: Pierce County, WA;
Providence County, RI; and Bluefield-Beckley-Oak
Hill, WV.
Findings and lessons from prior tests were used
to develop the test plans to the extent possible.
Other efforts in preparation of this test included
introducing enterprise systems that were not in
place for earlier tests, expanding and enhancing systems already in use, and expanding and
enhancing the systems using cloud technologies.
Any problems found during the 2018 End-to-End Census Test will be addressed using careful regression testing and change control procedures in 2019.
Figure 21: Schedule for the 2018 End-to-End Census Test (Address Canvassing beginning August 2017; peak operations, including Group Quarters, Update Leave, Internet Self-Response, and Nonresponse Followup, March–July 2018; Post Processing and Products—End-to-End Functional Test through March 2019)
2018 End-to-End Census Test
Objectives
•• Test and validate 2020 Census operations, procedures, systems, and field infrastructure
together to ensure proper integration and conformance with functional and nonfunctional
requirements.
•• Pilot test Self-Response Kiosk and Informed Delivery to explore the utilization of the United
States Postal Service-Census Bureau partnership to increase Internet self-response by providing additional methods of accessing the Internet self-response Web site.
•• Produce a prototype of geographic and data products.
Findings
and Design
Implications
Successes:
Integration of most of the systems that will be used in the 2020 Census.
Address Canvassing:
•• Implemented 2020 Census Address Canvassing approach to listing using Listing and
Mapping Application (LiMA).
Printing and Mailing:
•• Implemented a staggered mail strategy to multiple cohorts with conditional mailings to nonresponding households.
Self-Response:
•• Deployed multiple modes of self-response: Internet, paper, and telephone.
•• Fielded questions via Census Questionnaire Assistance, offering respondents an opportunity
to provide their responses to customer service representatives on the telephone.
Update/Leave:
•• Implemented the operation using the LiMA for address list updates.
Census Questionnaire Assistance:
•• Supported inbound telephone calls for assistance and self-response, outbound calls for the
Coverage Improvement activity, and outbound calling for the NRFU Reinterview.
•• Supported calls in English, Spanish, Mandarin, Cantonese, Russian, Arabic, Tagalog, Korean,
and Vietnamese.
NRFU:
•• Implemented a field data collection enumeration application to conduct interviews.
•• Identified Administrative Records Occupied cases and removed cases from the workload
after one contact attempt.
Group Quarters:
•• Demonstrated the integration of systems supporting a paper-based operation.
•• Conducted the Service-Based Enumeration using two methods (in-person interviews and paper listings from the facility).
•• Demonstrated use of a variety of methods for enumeration at group quarters.
Recruiting and Hiring:
•• Deployed and used the online job application and assessment for peak operations (not for
Address Canvassing).
USPS testing:
•• Self-response kiosks were installed in 30 post offices across Providence County, RI. Due to
limited usage, self-response kiosks will not be implemented as part of the 2020 Census.
•• The USPS’ Informed Delivery service provided free, advance notification of Census Bureau mail via e-mail to subscribers, with a direct link to the Internet Self-Response Web site. This service will be implemented as part of the 2020 Census.
Geographic and Data Products: In progress
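The NRFU finding above, removing Administrative Records Occupied cases from the workload after one contact attempt, can be sketched as a simple workload rule. The case fields below are hypothetical; this is an illustration of the rule's shape, not the production case-management logic.

```python
# Hypothetical sketch of the NRFU workload rule exercised in the 2018
# End-to-End Census Test: a case identified as occupied from high-quality
# administrative records leaves the interviewer workload after one
# unsuccessful contact attempt. Field names are invented for illustration.

def remaining_workload(cases: list[dict]) -> list[str]:
    keep = []
    for c in cases:
        if c["resolved"]:
            continue                      # interview already completed
        if c["admin_rec_occupied"] and c["attempts"] >= 1:
            continue                      # close via administrative records
        keep.append(c["case_id"])
    return keep
```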
4.1.5.2 Census Barriers, Attitudes, and Motivators
Study
For the 2020 Census, the Census Bureau has
placed a premium on outreach and communications to reach every person throughout the
United States, no matter where they live or what
language they speak. The 2020 Census Integrated
Communications Campaign is instrumental to
accomplishing that goal.
This effort included the Census Barriers, Attitudes,
and Motivators Study (CBAMS), which had the
objective of understanding the attitudes, barriers, and motivators toward the census, with the
goal of designing outreach and communications
that increase the census response rate. The 2020 CBAMS provides a much stronger and more robust foundation for our communications strategy than was available in prior decennial censuses. It included a national mailout survey from February to April 2018, with oversampling of Asian, Black, Hispanic, and other groups with small sample sizes. In addition, focus groups were conducted with various audiences in a number of sites, in five languages, in March and April of 2018.
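Oversampling, as used in the CBAMS mailout survey, requires design weights at the estimation stage: each group's weight is the inverse of its selection probability, so that national estimates are not skewed toward the oversampled groups. A minimal sketch, with invented sampling rates:

```python
# Minimal sketch of design weighting for an oversampled survey such as
# CBAMS: oversampled groups get smaller weights (the inverse of their
# selection probability) so national estimates remain unbiased. The
# sampling rates below are invented for illustration.

def design_weight(base_rate: float, oversample_factor: float) -> float:
    """Weight = 1 / selection probability."""
    return 1.0 / (base_rate * oversample_factor)

# A group sampled at twice the base rate counts half as much per response.
w_base = design_weight(0.001, 1.0)
w_over = design_weight(0.001, 2.0)
```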
2018 Census Barriers, Attitudes,
and Motivators Study
Objectives
•• Complete research that informs
the 2020 Census Integrated
Communications Campaign.
Findings
•• Overall trust in government has
been declining for decades, and
there has also been an erosion
of trust in major institutions.
•• There are persistent knowledge
gaps about the 2020 Census
scope, purpose, and constitutional foundation.
•• Connecting census participation
to a better future for communities is a powerful motivator.
Design
Implications
•• Engaging trusted voices
for census communications
addresses trust-based concerns,
especially among the most
skeptical and disaffected.
•• Informing the public on the
census’ scope, purpose, and
process addresses privacy and
confidentiality concerns and
fear of repercussions.
•• Connecting census participation
to support for local communities addresses apathy and lack
of efficacy.
4.1.6 Tests Between the 2018 End-to-End Census Test and the 2020 Census
For the period between the 2018 End-to-End Census Test and the 2020 Census, performance and scalability testing will occur, consisting of Defect Resolution Testing and Post End-to-End Performance Testing. The Defect Resolution Testing will ensure that any changes made to correct defects identified in the 2018 End-to-End Census Test are correct. The objective of Performance and Scalability Testing is to ensure that systems will scale to meet the workloads, or volumes, of the 2020 Census. Dates shown in Figure 22 represent testing for the field operations.
The Census Bureau is planning a test in the summer of 2019 to better plan for the Nonresponse Followup Operation and communications strategies for the 2020 Census. The randomized control test will ask a nationally representative sample of households to answer the planned 2020 Census questions—the number of people, tenure, sex, age, date of birth, Hispanic origin, race, citizenship, relationship, and coverage questions. The test is designed to measure the operational effect of including a citizenship question on self-response rates. This test has a Census Day of July 1, 2019.
Figure 22: 2020 Census Performance and Scalability Testing (Defect Resolution Testing, July 2018–January 2020; Post End-to-End Performance Testing of Address Canvassing, Internet Self-Response, Update Leave, and Nonresponse Followup, 2019 into January 2020)
4.2 KEY DECISION POINTS AND MILESTONES
Figure 23 shows the key planning and preparation milestones for the full life cycle of the 2020 Census. Milestones include public-facing milestones, such as launching the 2020 Census Web site and delivering topics and questions to Congress, as well as delivery of 2020 Census products to the President, states, and the public.
4.3 2020 CENSUS PRODUCTION OPERATIONAL SCHEDULE
Figure 24 describes the planned timing for the major production field operations for the 2020 Census.
Figure 25 provides an integrated schedule for the tests, key milestones, and production operations in one chart. Different types of tests (research, readiness, performance, end-to-end, and post end-to-end) are shown in different colors as noted in the legend. Key milestones, including Census Day and the delivery of apportionment counts and redistricting data, are also shown.
Figure 26 provides a consolidated view of which operations were tested in which tests, as either a focus of the test or a supporting operation for the test. Early tests focused on research on individual operations, while tests from 2016 on added integration of operations and systems.
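The 2019 randomized control test described in Section 4.1.6 compares self-response rates between households that do and do not receive a citizenship question. A standard tool for assessing such a difference is a two-sample proportion z-test; the sketch below uses invented counts purely for illustration and is not Census Bureau analysis code.

```python
# Minimal sketch of a two-sample proportion z-test, the standard tool for
# a randomized test of whether a questionnaire change (e.g., adding a
# citizenship question) shifts the self-response rate. Counts are invented.

import math

def two_prop_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Control vs. treatment panels with invented response counts
# (52.0 percent vs. 50.0 percent self-response):
z = two_prop_z(5200, 10000, 5000, 10000)
```

A |z| above roughly 1.96 would indicate a difference unlikely to arise by chance at the 5 percent level.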
Figure 23: Key Decision Points and Milestones (2011–2023), from the start of 2020 Census planning (November 2011) and the four versions of the 2020 Census Operational Plan, through delivery of the 2020 Census questions to Congress (March 2018), Census Day (April 1, 2020), delivery of apportionment counts to the President (December 31, 2020), delivery of redistricting counts to the states (March 31, 2021), and completion of the 2020 Census (September 30, 2023)
Figure 24: 2020 Census Operations—Production Timeline (2011–2023), covering In-Office Address Canvassing, recruiting and office openings, In-Field Address Canvassing and QC, the advertising campaign launch (November 1, 2019), in-home mailings, self-response and field enumeration operations, Census Day (April 1, 2020), response processing, delivery of apportionment counts to the President (December 31, 2020), delivery of redistricting counts to the states (March 31, 2021), and the closing of all area census offices and regional census centers
Figure 25: High-Level Integrated Schedule (2011–2021), combining the early 2012–2014 tests, the 2015 tests (2015 Census Test, 2015 National Content Test, 2015 Optimizing Self-Response Test, Address Validation), the tests in 2016, the 2017 Census Test, the 2018 End-to-End Census Test, the CBAMS, defect resolution and post end-to-end performance testing, the 2019 Census Response Test (Census Day July 1, 2019), and the 2020 production operation schedule, together with key milestones such as the advertising campaign launch (November 1, 2019), Census Day (April 1, 2020), delivery of apportionment counts (December 31, 2020), and delivery of redistricting counts (March 31, 2021)
Figure 26: The 2020 Census Operations by Test. For each operation, the figure indicates whether the operation was a focus of the test (F), was required to support the test (S), or was not included in the test (N), across the 2012 National Census Test, 2013 National Census Contact Test, 2013 Census Test, 2014 Census Test, 2015 Address Validation Test, 2015 Census Test, 2015 National Content Test, 2015 Optimizing Self-Response Test, 2016 Census Test, 2016 Group Quarters Tests, Address Canvassing Test, 2017 Census Test, and 2018 End-to-End Census Test.
LEGEND
F Focus of the Test
S Required to Support the Test
N Not Included in the Test
5. The 2020 Census Operations
This section of the document provides the current state of the operational design. An overview
of the 35 operations is presented, followed by
more detailed descriptions of each operation that
include the following:
•• Purpose: A concise description of the
operation.
•• Changes Made Since Version 3.0 Operational
Plan Release: A brief summary of significant
changes made for this operation.
•• Lessons Learned: Selected lessons learned
from the 2010 Census or tests or studies that
have occurred since the 2010 Census.2
•• Operational Innovations: Major innovations
expected to be implemented for this operation.
•• Description of Operation: A basic description
of the operation.
•• Research Completed: Research completed and
the major findings from this research.
•• Decisions Made: A list of the design decisions
made based on research completed.
•• Design Issues to Be Resolved: A list of the
outstanding design decisions and the date by
which they are expected to be made.
•• Cost and Quality: The expected cost and quality impacts of the design for this operation on
the overall 2020 Census.
•• Risks:3 The top risks associated with this
operation.
•• Milestones: Important dates associated with
this operation, to include decision points and
production dates.
For support and similar operations that do not
require a research-based design, the research and
decision sections focus on work completed.
2 The Knowledge Management Database contains the lessons learned from the 2010 Census and is available for review upon request.
3 Each operation has its own program-level risk register, which includes the full list of program and project risks for each operation.
5.1 OPERATIONS OVERVIEW
Figure 27 illustrates all 35 operations organized
by the 2020 Census Work Breakdown Structure
(WBS) elements. As noted by the shading on the
diagram, the degree to which detailed planning
has been conducted for each operation varies.
Detailed Operational Plans (DOPs) are being produced for most of the 35 operations. The development of the DOPs is not only further refining the
design for those individual operations, but also
helping clarify scope, boundaries, and interaction
points among operations.
Integrated Operations Diagrams (IODs) have been
developed to describe how a group of related
operations work together to perform key functions of the 2020 Census (e.g., frame development, response data collection, and data products
and dissemination). The IODs are included in the
relevant DOPs. Additional operational integration artifacts are being developed, as time and
funding allow, to ensure a full understanding of
the integrated operational design. Other specific
operational integration artifacts are included in
Appendix C.
The operations must work together to achieve a
successful census. Information flows among the
operations as the census proceeds from frame
development through collection of response data
to the publishing and release of the data. Key
information flows among the primary business
operations are highlighted in Figure 28. Major
interactions and flows are shown with the arrows
in the diagram, and the key external interfaces are
depicted in blue text.
The integration of these business operations
requires integration of the IT systems that support them. This significant effort is underway.
The Systems Engineering and Integration (SEI)
Operation will complete the 2020 Census Solution
Architecture based upon Capability Requirements.
2020 Census Operational Plan—Version 4.0 57
Figure 26: Operational Overview by Work Breakdown Structure

The figure organizes all 35 operations by WBS element; shading indicates each operation's detailed planning status (not started, recently begun, underway, or in production):
Program Management: 1. Program Management (PM).
Census/Survey Engineering (Support): 2. Systems Engineering and Integration (SEI); 3. Security, Privacy, and Confidentiality (SPC); 4. Content and Forms Design (CFD); 5. Language Services (LNG).
Frame: 6. Geographic Programs (GEOP); 7. Local Update of Census Addresses (LUCA); 8. Address Canvassing (ADC).
Response Data: 9. Forms Printing and Distribution (FPD); 10. Paper Data Capture (PDC); 11. Integrated Partnership and Communications (IPC); 12. Internet Self-Response (ISR); 13. Non-ID Processing (NID); 14. Update Enumerate (UE); 15. Group Quarters (GQ); 16. Enumeration at Transitory Locations (ETL); 17. Census Questionnaire Assistance (CQA); 18. Nonresponse Followup (NRFU); 19. Response Processing (RPO); 20. Federally Affiliated Count Overseas (FACO); 35. Update Leave (UL).
Publish Data: 21. Data Products and Dissemination (DPD); 22. Redistricting Data Program (RDP); 23. Count Review (CRO); 24. Count Question Resolution (CQR); 25. Archiving (ARC).
Other Censuses: 26. Island Areas Censuses (IAC).
Test and Evaluation: 27. Coverage Measurement Design and Estimation (CMDE); 28. Coverage Measurement Matching (CMM); 29. Coverage Measurement Field Operations (CMFO); 30. Evaluations and Experiments (EAE).
Infrastructure (Support): 31. Decennial Service Center (DSC); 32. Field Infrastructure (FLDI); 33. Decennial Logistics Management (DLM); 34. IT Infrastructure (ITIN).
Figure 27: High-Level Integration of Operations

The figure traces the primary and secondary information flows among the business operations, from the Frame area (Geographic Programs, Local Update of Census Addresses, Address Canvassing, and Phases 1 and 2 of the Redistricting Data Program), through the Response Data area (self-response data collection via Internet Self-Response, Census Questionnaire Assistance, Non-ID Processing, Forms Printing and Distribution, and Paper Data Capture; Nonresponse Followup; Response Processing; and data collection for special locations and populations via Group Quarters, Update Enumerate, Update Leave, Enumeration at Transitory Locations, and Federally Affiliated Count Overseas), to the Publish Data area (Data Products and Dissemination, Count Review, Count Question Resolution, Phases 3 and 4 of the Redistricting Data Program, and Archiving). Key external interfaces include respondents; tribes, partners, the USPS, and other sources; state and local governments; the White House, which receives the apportionment counts; and the public. A circled "A" connector links geographic updates from the Publish Data area back to the Geographic Programs Operation in the Frame area.
5.1.1 Frame
As shown in Figure 27 from the previous page, the
basic flow of information begins in the frame area
with the Geographic Programs (GEOP) Operation,
which maintains the Master Address File (MAF)
and spatial and boundary data used to create the
frame for the 2020 Census. Data from the United
States Postal Service (USPS) and other administrative records and third-party data are used to
maintain the MAF and spatial data. State and local
governments provide address updates to GEOP
during the Local Update of Census Addresses
(LUCA) program. These governments also provide updates to GEOP on block boundaries and
voting districts during the first two phases of the
Redistricting Data Program (RDP). GEOP provides the most current address list to the Address
Canvassing (ADC) Operation, where staff make
updates to the list via in-office and in-field procedures. These updates are processed by GEOP on
an ongoing basis throughout the decade. Once
the frame updates are complete, GEOP provides
the address and spatial data to the Response
Processing Operation, which creates the initial
universe of Basic Collection Units for listing operations and cases for self-response and enumeration
operations.
5.1.2 Response Data
Enumeration at Transitory Locations (ETL)
Operation: Enumerates individuals in occupied
units at transitory locations (TLs) who do not have
a Usual Home Elsewhere. TLs include recreational
vehicle parks, campgrounds, racetracks, circuses,
carnivals, marinas, hotels, and motels.
Update Enumerate (UE) Operation: Updates
the address and feature data and enumerates
at housing units (HUs) in areas where the initial
visit requires enumerating while updating the
address frame (primarily remote geographic
areas that have unique challenges associated with
accessibility).
Update Leave (UL) Operation: Updates the address and feature data for the area assigned and leaves a choice questionnaire package (contact materials) at every HU identified to allow the household to self-respond. UL occurs in areas where the majority of HUs do not have a city-style address to receive mail.
Group Quarters (GQ) Operation: Enumerates people living or staying in GQs, people experiencing homelessness, people receiving service at service-based locations, people living on maritime vessels, and people living on military bases.
Federally Affiliated Count Overseas (FACO)
Operation: Obtains counts by home state of U.S.
military and federal civilian employees stationed
or assigned overseas and their dependents living
with them.
Most responses from these operations are collected on paper questionnaires, provided by the
Forms Printing and Distribution (FPD) Operation.
Responses for the military and some GQs and
the count of federally affiliated persons overseas
are provided in electronic files. Paper questionnaires are sent to the Paper Data Capture (PDC)
Operation, where they are scanned and imaged
before being sent electronically to the Response
Processing (RPO) Operation.
Address updates collected during these operations
are sent to RPO, which sends the data back to
GEOP.
A key goal for the 2020 Census is to optimize
self-response. The Integrated Partnership and
Communications (IPC) Operation helps do this
by creating awareness and educating the public
about the importance of the 2020 Census. FPD
serves as the primary mechanism for mailing the
materials (letters, postcards, and paper questionnaires) needed for self-response. The Internet
Self-Response (ISR) Operation collects respondent
information through an online questionnaire. For
those Internet respondents who do not provide a
Census ID, the U.S. Census Bureau conducts realtime (during the interview) processing to identify
the correct block for the respondent’s address
using methods in the Non-ID Processing (NID)
Operation. Households that do not respond on
the Internet are given the opportunity to respond
with paper questionnaires, which are mailed to and
processed by PDC. Some people will call with questions. The Census Questionnaire Assistance (CQA)
Operation responds to these questions and, if
appropriate, offers to collect the responses through
a telephone interview. All Internet and electronically captured paper questionnaire responses are
sent to the RPO, which manages the status of cases
across the universe. HU addresses in self-response
or Update Leave Type of Enumeration Areas for
which the Census Bureau did not receive a self-response are sent to the Nonresponse Followup
(NRFU) Operation, which determines the most
cost-effective way of enumerating those households (personal visit, use of administrative records
and third-party data, or proxy responses). Any new
addresses identified during NRFU are sent to RPO,
which in turn sends them back to GEOP.
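At its core, the real-time Non-ID matching step described above is an address-to-geography lookup against the MAF. The sketch below is purely illustrative: the function, the `maf_index` structure, and the exact-match normalization are assumptions for exposition, not the Census Bureau's actual matching methodology, which involves far richer address standardization and probabilistic matching.

```python
def geocode_non_id(address, maf_index):
    """Toy sketch of Non-ID Processing: match a respondent-supplied
    address against an address index to recover its census block.
    `maf_index` (hypothetical) maps a normalized address string to a
    block code."""
    # Crude normalization: uppercase and collapse whitespace.
    key = " ".join(address.upper().split())
    block = maf_index.get(key)
    if block is not None:
        return block  # matched in real time: case proceeds with this block
    return "MANUAL_REVIEW"  # unmatched: route to follow-up resolution
```

A matched address lets the case continue during the interview; an unmatched one would fall to slower resolution paths.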
5.1.3 Publish Data
Preliminary counts created near the end of data
collection are sent from RPO to the Count Review
(CRO) Operation, which provides Federal-State
Cooperative for Population Estimates (FSCPE)
members the opportunity to review the counts
to ensure the accuracy of the 2020 Census. Any
geographic updates resulting from CRO are sent
to GEOP (as shown by the circled A connector in
Figure 27). Once all of the data processing is complete, RPO delivers the processed data to the Data
Products and Dissemination (DPD) Operation to
prepare the final 2020 Census data products. This
operation creates and delivers:
•• Apportionment counts to the President and
statistical data to the public.
•• Redistricting data to the state legislatures (in
coordination with the RDP) so state governments can define the geographic boundaries
for Congressional and legislative districts.
•• Final counts to the Count Question Resolution (CQR) Operation so challenges to 2020 Census counts can be resolved.
•• Data products to the Archiving (ARC)
Operation.
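The apportionment counts in the first deliverable feed the method of equal proportions (Huntington-Hill), the statutory formula for dividing the 435 House seats among the states. As a generic illustration of that method with hypothetical populations, not census data: each state receives one seat automatically, and each remaining seat goes to the state with the highest priority value, pop/√(n(n+1)).

```python
import heapq
import math

def apportion(populations, seats):
    """Method of equal proportions (Huntington-Hill): after each state's
    automatic first seat, repeatedly award the next seat to the state
    with the highest priority value pop / sqrt(n * (n + 1)), where n is
    the state's current seat count."""
    counts = {state: 1 for state in populations}
    # Max-heap via negated priorities (heapq is a min-heap).
    heap = [(-pop / math.sqrt(2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(seats - len(populations)):
        _, state = heapq.heappop(heap)
        counts[state] += 1
        n = counts[state]
        heapq.heappush(heap, (-populations[state] / math.sqrt(n * (n + 1)), state))
    return counts
```

With hypothetical populations of 1,000, 2,000, and 3,000 and six seats, the method awards one, two, and three seats, respectively.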
ARC also receives responses from RPO and image
data from PDC, as well as other archival data from
other operations, including time and expense data
and paradata. By law, decennial census results are
archived and released to the public 72 years after
the census.
This integrated view only depicts high-level data
flows and interactions. The IODs in the DOPs and
other diagrams in Appendix D provide more detail
related to operational integration.
5.2 PROGRAM MANAGEMENT
5.2.1 Program Management
Detailed Planning
Status:
In Production
Purpose
The Program Management (PM) Operation defines
and implements program management policies,
processes, and the control functions for planning
and implementing the 2020 Census in order to
ensure an efficient and well-managed program.
Changes Made Since Version 3.0 Operational Plan
Release: There have been no major changes to
this operation.
Lessons Learned
Based on lessons learned from the 2010 Census
and other reviews, the following recommendations
were made:
•• Develop a life cycle schedule for the 2020
Census, and complete it earlier in the decade.
•• Place more emphasis and resources on updating cost and schedule estimates throughout the
life cycle.
•• Obtain independent cost estimates and use
them to validate cost estimates (that include
contingency reserves) developed by stakeholder organizations.
•• Improve strategic planning and early
implementation of the 2020 Census Risk
Management process.
•• Align system development schedules with operational deadlines to allow adequate time to test
systems before their deployment.
•• Reevaluate the practice of frontloading and
develop a better process for developing workload and cost assumptions.
•• Rethink and rework stakeholder engagement,
education, and management. The Census
Bureau needs to better define, and then clearly
articulate, its expectations with regard to
roles and responsibilities between the Census
Bureau, contractors, and stakeholder groups.
•• Set a clear and publicly announced goal to
reduce the inflation-adjusted per-housing-unit
cost relative to 2010 Census totals.
Figure 28: Program Management Framework

The framework places the project life cycle (Initiation, Planning, Execution, and Closeout/Evaluation) and project management and project control, supported by Earned Value Management, at the center of 12 program management processes: 1. Governance; 2. Strategic Communications; 3. Strategic Management; 4. Document Management; 5. Change Management; 6. Knowledge Management; 7. Acquisition and Sourcing Management; 8. Budget Management; 9. Schedule Management; 10. Performance Management; 11. Human Capital Management; and 12. Risk/Issue Management.
Operational Innovations
Following an analysis and review of the 2010
Census program management practices, the
2020 Census improved its program management
capabilities and defined program management
processes earlier in the decade to support 2020
Census Research and Testing activities. New and
improved program management practices integrated into the 2020 Census that were not part of
the 2010 Census include the following:
•• Iterative operational planning to allow for
periodic design refinements based on findings
from research and testing, as well as external
changes in legislation and technology.
•• Evidence-based decision-making to ensure that
operational designs are based on solid evidence
from research, testing, analysis, and previous
survey and census experience.
•• Integration of schedule, scope, and budget
using a common WBS.
•• An integrated life cycle master schedule that
uses best practices based on the Government
Accountability Office (GAO) schedule assessment guide.
•• Cost and schedule estimates updated throughout the 2020 Census life cycle based on GAO
best practices:
ºº Publication GAO-09-3SP Cost Estimating
and Assessment Guide: Best Practices for
Developing and Managing Capital Program
Costs.
ºº Publication GAO-12-120G Schedule
Assessment Guide: Best Practices for Project
Schedules.
•• A Knowledge Management process and database for lessons learned from the 2010 Census,
2020 Census Research and Testing Program,
advisory committees, and audit and oversight
reports.
•• Alignment with the Census Bureau’s approach
to implement activity-based management and
earned-value management techniques.
•• Formal risk management kicked off earlier in
decade (2012) and occurs at both the program
level and project level.
•• Increased transparency and collaboration with
internal and external stakeholders about the
2020 Census.
•• Increased international stakeholder communications to leverage learnings of other countries’
census efforts and to share the Census Bureau’s
best practices and challenges.
•• Governance that bridges organizational silos.
•• Performance Management includes a focus on key cost drivers.
•• Workforce that is appropriately skilled and trained.
Description of Operation
The PM Operation is responsible for the planning and implementation of the 2020 Census. Specifically, this operation defines the overall 2020 Census program and project management policies, framework, and control processes used across the entire 2020 Census and all projects established within the program.
The established PM framework is shown in Figure 28.
General activities are required to manage multiple, ongoing, interdependent projects to fulfill the 2020 Census mission and objectives. The PM Operation defines and manages the following 12 program management processes:
1. Governance: The overall management structure, decision-making authority, priority setting, resource utilization, and performance verification at each level of the program.
2. Strategic Communications: The engagement with internal and external stakeholders, including Congress and the general public, in the planning, research and analysis, progress, and decisions related to the 2020 Census. This activity also includes collaboration with international organizations, particularly the International Census Forum and the United Nations Statistics Division (for the global view of censuses) and the United Nations Economic Commission for Europe (for the regional view).
3. Strategic Management: The process for determining and documenting the 2020 Census strategic direction regarding strategies, goals, objectives, performance, and investments.
4. Document Management: Activities for consistent and centralized management of program documentation produced in support of the 2020 Census.
5. Change Management: Activities for managing and controlling the 2020 Census strategic baseline, including control of charters, process plans, design documents, operational plans, project plans, requirements, and schedules.
6. Knowledge Management: Practices used to identify, create, represent, distribute, and enable adoption of insights and experiences.
7. Acquisition and Sourcing Management: Activities to provide and support acquisition principles and guidelines.
8. Budget Management: Activities used to establish and manage future-year budget formulations, current-year budget execution, and cost estimating and cost modeling.
9. Schedule Management: Activities used to identify and schedule activities required to produce program deliverables, identify interdependencies among activities, and determine activity resource requirements and duration.
10. Performance Management: Practices used to monitor the progress of the 2020 Census to identify variances, assign corrective actions, and make timely changes.
11. Human Capital Management: Activities to ensure that human competencies and skills are present and available to the organization.
12. Risk and Issue Management: Activities to facilitate the identification, analysis, mitigation, and contingency planning for risks and issues related to achieving the program's objectives.
Each component of the framework is documented in detail in a separate process plan. The PM process plans are revised as the program evolves, based primarily on lessons learned and other feedback received from process owners and users.
Work Completed
The following work has been completed for this
operation:
The program management processes listed above were approved in 2011, funded, established, and utilized during the 2020 Census Research and Testing Phase. Program Management has now successfully transitioned to the implementation phase of the 2020 Census with the completion of the following:
•• Baselined 2020 Census Integrated Master Schedule (December 2017).
•• Baselined 2020 Census Life-Cycle Cost Estimate (December 2017).
•• Completion of the Department of Commerce Scalable Project Management Framework Milestone Review 3—granting implementation approval (July 2018).
•• Release of the 2020 Census Operational Plan v4.0 (December 2018).
Decisions Made
The following decisions have been made for this operation:
✓ Strategies for each program management element were defined and approved in 2011 and formed the basis for the management of the 2020 Census Program.
✓ The 2020 Census will be managed by using a fully integrated master schedule designed and built using best practices based on the GAO schedule assessment guide (GAO-12-120G, May 2012).
✓ The 2020 Census will follow the Enterprise Systems Development Life Cycle (eSDLC) process for all decennial IT projects. The Census Bureau Project Life Cycle will be followed for all projects (IT and non-IT projects).
✓ The 2020 Census will manage portfolio-level risks via assigned risk owners and risk monitors, who will report to a risk review board. The 2020 Census will brief the Portfolio Management Governing Board regularly on the top portfolio risks and escalate risks and issues as necessary for guidance. The 2020 Census program-level risks will be managed by the Integrated Project Teams that were formed for each operation supporting the 2020 Census.
✓ The risk management plan includes both the portfolio and program-level processes.
✓ The program will have a finalized and integrated governance and performance measurement reporting mechanism.
✓ Quarterly 2020 Census Program Management Reviews will be conducted by live Webcast, so stakeholders can watch live or on demand later.
✓ A formal memorandum series will be used to document significant program decisions.
✓ The program will actively engage with stakeholders and advisors on major aspects of the 2020 Census.
✓ The 2020 Census Monthly Status Reports will be delivered to key oversight entities.
✓ A Decennial Policy Team will be developed and managed to ensure interdisciplinary, interdirectorate communication in regard to legal, policy, and IT security sensitivities.
✓ The 2020 Census Web site will be developed and supported.
✓ Frequently Asked Questions about the test program will be developed along with other supporting materials.
✓ Talking Points for customer assistance for internal phone and correspondence support centers will be developed.
✓ A directorate representative to the Census Bureau's International Collaboration Steering Committee will be appointed to communicate and coordinate international collaboration across the agency.
✓ The Census Bureau will actively participate with international and national statistical and geographic organizations for key learnings and to share the Census Bureau's experiences.
✓ The Census Bureau will ensure the full utilization of performance management to better facilitate early identification and correction of problems.
✓ The Census Bureau will use change management processes to better ensure impact assessment.
✓ The Census Bureau will use human capital management outlined in the 2020 Census Human Capital Management Plan to better plan, facilitate, and monitor a workforce that has the required competencies and skills.
✓ The Census Bureau will mature the use of the Primavera scheduling tool for the program and Microsoft Project interaction for the enterprise.
✓ The Census Bureau will ensure the integration of 2020 Census schedules with enterprise efforts and enterprise schedules as outlined in the 2020 Census Schedule Management Plan.
✓ The Performance Measurement Branch will manage a SharePoint site that holds all performance report requirement documents, as well as links to all cost and progress reports.
✓ The 2020 Census Program has used Earned Value Management (EVM) to track and monitor its IT projects for a number of years. EVM data for IT projects are shared on a monthly basis with the Department of Commerce, Office of Management and Budget, and other oversight entities through the Capital Planning and Investment Control process. The 2020 Census Program has a standard work breakdown structure (WBS), and the budget is applied at level 3 of the WBS, which are the planning packages. The 2020 Census employs a progressive elaboration process to continuously improve and detail the plan on a fiscal year basis.
✓ The 2020 Census Program has defined criteria for creating new work packages within the Integrated Master Schedule (IMS) for the implementation of major risk mitigation plans:
ºº If the dollar value of the mitigation plan is greater than $1,000,000; and
ºº If the effort is not already accounted for in the schedule.
When a mitigation plan meets these criteria, the baseline IMS will be modified by a change request to incorporate the plan. However, if the mitigation plan does not meet the criteria, a specific link between the IMS and the risk is not created.
Design Issues to Be Resolved
There are no remaining design issues to be resolved for this operation.
Cost and Quality
Investment in PM helps ensure an efficient 2020 Census, which is projected to influence (reduce ↓ or increase ↑) the 2020 Census overall costs. Specific examples are noted below.
↓ Investment in establishing a robust and formal program management office that develops and manages processes that minimize potential negative cost, schedule, and scope impacts.
↓ Ongoing stakeholder engagement reduces the likelihood of unplanned design changes late in the decade, which can prevent additional costs.
The PM Operation does not directly impact the quality of the 2020 Census results.
Risks
The PM Operation identifies and manages all portfolio-level risks. The risks listed below are specific to this operation.
As part of the 2020 Census PM Operation, a framework of various program management processes has been developed to ensure the implementation of consistent and thorough program management controls. IF staff working on the 2020 Census operations do not follow the program management processes, THEN the 2020 Census operations may not be able to properly meet the objectives and goals of the program.
Commitment by the 2020 Census senior managers to mature the program management processes used for the 2010 Census Program requires dedicated resources, including staff with certain skill sets. IF the dedicated resources are not available and funded to implement program management processes, THEN critical support functions, such as schedule, budget, scope, and risk management, will be jeopardized.
Performance measurement is a critical function needed by managers to track the status of the planning, development, and implementation of the operations supporting the 2020 Census. IF performance measures are inadequately defined or monitored or both, THEN managers will have difficulty assessing and reporting accurate cost and progress status.
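The EVM reporting described above rests on a handful of standard earned-value quantities: planned value (PV), earned value (EV), and actual cost (AC). As a generic illustration of those textbook indicators (the figures below are invented, not 2020 Census data):

```python
def evm_metrics(pv, ev, ac):
    """Standard earned-value management indicators.
    pv: planned value, ev: earned value, ac: actual cost (same units)."""
    return {
        "cost_variance": ev - ac,       # CV > 0 means under budget
        "schedule_variance": ev - pv,   # SV > 0 means ahead of schedule
        "cpi": ev / ac,                 # cost performance index
        "spi": ev / pv,                 # schedule performance index
    }

# Invented example: $90K of work earned against $100K planned, $80K spent.
m = evm_metrics(pv=100_000, ev=90_000, ac=80_000)
```

In this made-up case the work is under budget (CPI > 1) but behind schedule (SPI < 1), the kind of variance the monthly oversight reporting is meant to surface early.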
Milestones

September 2010: Baseline the initial 2020 Census Strategic Plan.
June 2011: Baseline the initial 2020 Census Life Cycle Rough Order of Magnitude Cost Estimation (or Estimate).
September 2011: Develop and gain approval for 2020 Census Program Management Process Strategies for each component described in this operation.
September 2012: Baseline the initial 2020 Census Program-Project Management Plans for each component described in this section.
December 2012: Begin the quarterly 2020 Census Program Management Reviews.
May 2013: Baseline the initial 2020 Census Mission-Level Requirements.
April 2014: Baseline the initial 2020 Census Life Cycle Integrated Schedule.
October 2015: Issue the Baseline of the 2020 Census Operational Plan.
October 2015–summer 2019*: Baseline the 2020 Census Detailed Operational Plans (DOPs). There is one for each operation, except PM and Systems Engineering and Integration. Geographic Programs published a DOP for each operation subcategory. Operations with significant overlap produce combined DOPs (i.e., Field Infrastructure and Decennial Logistics Management; all three Coverage Measurement Operations).
Periodically/Annually: Issue the updated/current version of the 2020 Census Operational Plan.
Annually: Refresh and reissue strategic program documentation and the 2020 Census Operational Plan based on lessons learned, test results, and other feedback.
Annually: Conduct project management process training to process users.
* The dates for each of the DOPs vary depending on the timing of the operation. For example, the DOP for the Address Canvassing Operation v1.0 was produced in December 2015, and the DOP for the Count Question Resolution Operation is due in 2019.
5.3 CENSUS/SURVEY ENGINEERING
The support operations in this area provide the
foundation for conducting the 2020 Census.
This area consists of four operations: Systems
Engineering and Integration (SEI); Security,
Privacy, and Confidentiality (SPC); Content and
Forms Design (CFD); and Language Services
(LNG). Each is described below.
5.3.1 Systems Engineering and Integration
Detailed Planning
Status:
In Production
Purpose
The Systems Engineering and Integration
Operation (SEI) is an IT operation that manages
the delivery of a System of Systems that meets
2020 Census Program business and capability
requirements.
Changes Made Since Version 3.0 Operational Plan Release: The Integration and Implementation Plan (IIP) was enhanced to better align with 2020 Census operations. The new IIP comprises 16 operational deliveries.
Lessons Learned
Based on lessons learned from the 2010 Census,
2018 End-to-End Census Test, and other reviews,
the following recommendations were made:
•• Need to have a well-documented plan that
describes the development of the business
architecture and the solution architecture. The
architecture plan must have buy-in and adoption by all stakeholders.
•• Consider greater flexibility for requirements
configuration management in the early design
and development processes to help minimize
the necessity to make subsequent corrections,
potentially saving resources and costs associated with unplanned resource needs.
•• Implement a plan for achieving data lineage for
key data categories and operations.
•• Perform accounting through service-oriented
architecture, Enterprise Service Bus capabilities,
and software as needed.
•• Expand the framework of the testing program to include additional test phases.
•• Build soft launches for every significant 2020 Census operation.
Operational Innovations
Operational innovations include the following:
•• Application of the Census Bureau's enterprise Systems Development Life Cycle (eSDLC).
•• Alignment with the Census Bureau's Enterprise Architecture.
•• Definition and implementation of performance measurement (performance metrics and reporting).
•• Integration with Enterprise systems, as appropriate.
•• Dedicated resources for key positions, including Chief Architect, Chief Engineer, Chief IT Security Engineer, and Performance Engineer.
•• Early and incremental development of solutions architecture.
Description of Operation
The SEI Program Area serves as a centralized functional group within the Decennial Census Programs Directorate to manage development of the System of Systems that meets 2020 Census business and capability requirements. SEI has five major components: Requirements Engineering, Solution Architecture, Technical Integration and Solution Development Oversight, Test and Evaluation, and Deployment Operations and Maintenance. As part of all of these efforts, SEI will use the following standard program management concepts to manage these tasks: Schedule Management, Risk Management, Issue Management, Configuration Management, and Quality Assurance.
Requirements Engineering
Based on the design of the 2020 Census and plans documented in the 2020 Census Operational Plan, the SEI Operation defines and executes a requirements engineering approach for the 2015–2018 Census Tests and 2020 Census that aligns with the Census Bureau's eSDLC, meets agency and Department of Commerce standards and guidelines, and emphasizes a consistent approach across the portfolio of 2020 Census projects. The scope of the Requirements Engineering effort includes the following:
•• Ensuring the controlled and consistent application of a standardized approach to requirements engineering throughout the program and project life cycles.
•• Creating the Requirements Engineering Management Plan.
•• Establishing the requirements engineering
methodology and tools that must be applied
across the decennial and supporting programs:
ºº Developing Business Process Models (BPMs)
in concert with subject matter experts for
each operation for each of the 2015–2018
Census Tests and the 2020 Census as a tool
to begin the requirements elicitation process.
ºº Extracting Project-Level Business
Requirements (PLBR) and drafting capability requirements (CAP) from the BPM and
reviewing with subject matter experts to
finalize the initial baseline of PLBR and CAP.
•• Facilitating broad program- and project-level
understanding of needs for all phases of the
2020 Census.
•• Developing 2015–2018 Census Tests and 2020
Census Workload Demand Models, which will
aid the 2020 Census Operational Integrated
Project Teams in identifying the nonfunctional
performance PLBR and CAP. This supports
the scalability of the System of Systems by
enabling the identification of infrastructure
requirements.
•• Conducting Program Systems Requirements
Reviews (SRRs) for each major census test and
2020 Census.
ºº Providing technical oversight and monitoring to ensure that solutions appropriately
address the business requirements and
specifications.
Solution Architecture
The SEI Operation is responsible for the 2020 Census Solution Architecture and Systems, including Interfaces. The development of the solution architecture comprises the following:
•• Building upon lessons learned from the 2010
Census, as well as the results and findings of
the 2020 Census Research and Testing phase.
•• Reviewing and revising BPMs developed as part
of the requirements engineering effort to create
the Business Architecture.
•• Analyzing the CAP and BPMs, identifying IT
solutions, and allocating requirements to these
IT solutions in order to support 2020 Census
operations.
•• Designing the workflows with data exchange
between systems, based on the business
requirements and using collaboration diagrams.
•• Identifying the deployment environments for the 2020 Census System of Systems, including on-premises and cloud, and using Infrastructure as a Service, Platform as a Service, and Software as a Service.
•• Creating the Solution Architecture document
including the Systems and Interface Inventory
based on the “to be” business processes
and capabilities, as well as the Architecture
Transition Plan and Systems Engineering
Management Plan.
•• Providing technical oversight of the 2020
Census IT Project Portfolio to ensure conformance to the prescribed solution architecture.
•• Conducting Program Critical Design Reviews
for each major census test and 2020 Census.
•• Developing the scalability plan for the overall
solution architecture to meet the demand models and high availability requirements of the
2020 Census.
•• Refining and delivering subsequent baselines
of the 2020 Census Solution Architecture and
Systems and Interface Inventories.
•• Mediating gaps in capabilities between solution
providers and operations representatives where
required, and subsequently refining architecture
to represent output of mediation.
•• Establishing data integration and lineage by
developing a tool that can provide clear insight
and traceability regarding data elements’ collection, transformation, and processing through
the System of Systems data workflow.
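The data lineage capability described in the last bullet can be pictured as an event log keyed by data element: each system that collects, transforms, or processes an element records a touch, and the path can then be queried end to end. The sketch below is a hypothetical illustration of the idea, not the actual tool; system and element names are invented:

```python
# Illustrative sketch of the data-lineage idea: record each touch of a
# data element as it moves through the System of Systems, then query
# its ordered path. Names are hypothetical, not actual Census systems.
from dataclasses import dataclass

@dataclass
class LineageEvent:
    element: str   # data element, e.g. a response field
    system: str    # system that touched it
    action: str    # collected / transformed / processed

class LineageLog:
    def __init__(self):
        self.events: list[LineageEvent] = []

    def record(self, element: str, system: str, action: str) -> None:
        self.events.append(LineageEvent(element, system, action))

    def trace(self, element: str) -> list[str]:
        """Return the ordered path of systems that touched an element."""
        return [f"{e.system}:{e.action}" for e in self.events
                if e.element == element]

log = LineageLog()
log.record("tenure", "internet-instrument", "collected")
log.record("tenure", "response-repository", "processed")
print(log.trace("tenure"))
# ['internet-instrument:collected', 'response-repository:processed']
```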
Technical Integration and Solution Development Direction

During solution development, the requirements, architecture, and technical design are used to develop the end-product System of Systems and required interfaces. As part of Technical Integration and Solution Development, the SEI Operation performs the following activities:
•• Develops and tracks progress against the IIP.
•• Provides support as it relates to interpretation of PLBR, CAP, and BPM.
•• Ensures development is completed according to the 2020 Census Solution.
•• Oversees the Solution Development process to ensure that the overall solution is developed within cost and schedule constraints in compliance with the Census Bureau's eSDLC process.
•• Conducts weekly systems integration meetings with system providers to ensure progress (teams for each system report status, issues, and risks).
•• Oversees Interface Working Groups to ensure the systems as developed will function cohesively when exercised in an end-to-end fashion.
•• Works with enterprise programs (such as the Census Enterprise Data Collection and Processing system [CEDCaP] and the Center for Enterprise Dissemination Services and Customer Innovation [CEDSCI]) to ensure that they meet the 2020 Census schedule and functional requirements.

Test and Evaluation

As part of the Test and Evaluation area, SEI will perform the following:
•• Oversee integration tests of programs that are composed of multiple projects (CEDCaP, CEDSCI, etc.).
•• Oversee integration tests of projects that are not part of a larger enterprise program or collection of projects.
•• Conduct Test Readiness Reviews (TRRs) for each program release and operational delivery.
•• Conduct Integration and Test activities across programs and independent projects to ensure the 2020 Census System of Systems, as a whole, performs as expected. This level of testing could comprise many different types of tests as defined in the Test and Evaluation Management Plan.
•• Conduct Performance and Scalability Test activities that start at the systems level and end with integration across all the major systems and operations running concurrently to ensure the 2020 Census System of Systems, as a whole, performs as expected. There is a four-phased approach to testing defined in the Performance and Scalability Test Plan.
•• Conduct Section 508 testing to demonstrate system compliance with Section 508 standards and with the Web Content Accessibility Guidelines (WCAG) 2.0. Section 508 standards contain scoping and technical requirements for information and communication technology to ensure accessibility and usability by individuals with disabilities. Compliance with these standards is mandatory for federal agencies subject to Section 508 of the Rehabilitation Act of 1973, as amended (29 U.S.C. 794d). Section 508 requires WCAG alignment; WCAG covers a wide range of recommendations for making Web content more accessible.
•• Document System Integration, Performance
and Scalability, and Section 508 test readiness
in a Test Analysis Report.
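Some WCAG checks can be automated. As a hedged illustration of one such check (WCAG 2.0 success criterion 1.1.1, text alternatives), the sketch below flags `img` tags without an `alt` attribute using only the Python standard library; real Section 508 testing covers far more than this single rule:

```python
# Illustrative sketch of one automatable WCAG 2.0 check: flag <img>
# tags that lack an alt attribute. Not an actual Census Bureau tool.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt += 1

checker = AltTextChecker()
checker.feed('<img src="seal.png" alt="Census Bureau seal">'
             '<img src="chart.png">')
print(checker.missing_alt)  # 1
```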
Deployment and Operations and Maintenance
(O&M)
The SEI Operation provides oversight and structure around the deployment of systems as well as
O&M processes. As part of the Deployment and
O&M activities, the SEI Operation will perform the
following:
•• Provide oversight to ensure that all systems
are deployed and ready to support 2015–2018
Census Tests and 2020 Census activities.
•• Conduct Production Readiness Reviews (PRRs)
for each program release.
•• Provide oversight to ensure all supporting organizations are set up and ready to support all
operational activities.
Work Completed
The following work has been completed for this operation:
2017 Census Test:
•• Two Systems Requirements Reviews (SRRs), one Critical Design Review (CDR), and eight TRRs were conducted.
2018 End-to-End Census Test:
•• Utilized commercial Cloud Service Provider.
•• Integrated 44 of 52 2020 Census systems; approximately 87 percent have completed development.
•• Supported operations, including new operations such as Address Canvassing (ADC), Coverage Improvement (CI), Update Leave (UL), Nonresponse Followup (NRFU), and Group Quarters (GQs).
•• Utilized optimized assignment for the NRFU Operation.
•• Utilized a centralized repository of response data.
•• Utilized mobile devices for the NRFU Operation.
•• Stood up an Internet-based application for recruiting applicants.
•• Stood up a Network Operations Center and a Security Operations Center.
•• Stood up a software deployment process and testing framework.
•• Stood up a service-oriented architecture system for transacting data between systems.
•• Captured responses from all modes: paper, Internet, and telephony.
•• The following releases were deployed to Production:
ºº In-Field Address Canvassing.
ºº Self-Response, which included printing, mailing, workload, and Census Questionnaire Assistance (CQA), Self-Response, GQs Advanced Contact, all GQs training, and CQA training.
ºº Field Enumeration, which included UL, GQs, Service-Based Enumeration, NRFU, and CI.
ºº Tabulation, Dissemination, and Self-Response Quality Assurance.
•• Conducted Printing and Mailing Release TRRs.
•• Conducted business operations for the In-Office ADC Operation, Geographic Programs Operation, and Local Update of Census Addresses Operation.
•• BPMs and business and capability requirements are baselined for all business operations.
•• Solutions for the 2015 Optimizing Self-Response Test, 2015 Census Test, 2015 National Content Test, 2016 Census Test, 2017 Census Test, and 2018 End-to-End Census Test were delivered.
•• The solution architectures for the 2016 Census Test, the ADC Test, the 2017 Census Test, and the 2018 End-to-End Census Test were baselined.
•• Developed the 2020 Census backlog to identify, manage, and track all remaining development activities for the 2020 Census System of Systems.
•• The IIP was developed and several key IIP Reviews were held, including:
ºº SRRs for the 2017 Census Test and 2018 End-to-End Census Test.
ºº CDRs for the ADC Test, 2017 Census Test, and 2018 End-to-End Census Test.
ºº TRRs for the 2016 Census Test, ADC releases, 2017 Census Test, and 2018 End-to-End Census Test.
ºº PRRs for the 2016 Census Test releases, 2017 Census Test, and 2018 End-to-End Census Test.
•• In August 2018, the 2020 Census IIP was enhanced from the original four releases into 16 Operational Deliveries (ODs). This enhancement provides additional development and testing time along with better alignment to the dates that each system is used by an operation.
•• The Early Operations Preparation (OD 1.0) TRR was held in July 2018.
•• A new Initial Baseline Review milestone was added before TRR to mark formal scope definition and requirements allocations to each OD.
2020 Census:
•• Conducted Self-Response TRRs.
•• Four SRRs, four CDRs, and one TRR for the 2020 Census have been conducted.

Decisions Made
The following decisions have been made for this operation:
✓ The following key roles were filled to support the SEI Operation:
ºº Chief Program Engineer.
ºº Lead Program Engineer.
ºº Chief Program Architect.
ºº Chief Data Architect.
ºº Chief IT Security Engineer.
ºº 2020 Systems Coordinator.
ºº Business Integration Manager.
ºº Release Manager.
ºº Integration Test Lead.
ºº Mobile Test Lead.
ºº Mobile Engineering Lead.
ºº IT Infrastructure Lead.
ºº Performance and Scalability Lead.
ºº Performance Engineer.
✓ The sourcing approach of the 2020 Census CAPs has been determined by a combination of factors. The BPMs and requirements that support a given operation serve as input to the requirements allocation process. The sourcing approach maximizes the utilization of Enterprise programs and existing IT solutions and services. Acquired services are also considered for sourcing capabilities. The Technical Integrator Architecture Team Lead and the Chief Architect finalize and approve draft allocations.
✓ The 2020 Census Program will leverage the enterprise infrastructure and enterprise solutions as appropriate.
✓ Specific tools to support program-level testing have been selected that will cover testing needs for functional, accessibility, mobility, scalability, and performance aspects of the decennial supporting systems. Test materials from project-level testing and simulated data will be used during program-level testing.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
SEI activities have a critical impact on the 2020 Census. Because many of the innovations aimed at reducing the cost of the census rely on IT solutions, the effectiveness of this operation could have an effect on the overall cost of the 2020 Census.
Impacts of this operation on overall 2020 Census quality include the following:
⇑ Increase quality by setting up robust processes for system development:
ºº Integration Test.
ºº Performance and Scalability.
ºº Design.
ºº Architecture.
ºº Testing.

Risks
Major concerns for the SEI Operation are covered by the IT-related 2020 Census Program risks listed in Chapter 6.

Milestones
2012: Baseline the initial 2020 Census SEI Plans for each component described in this section.
2013: Create architecture and requirements artifacts for the 2014 Census Tests.
2014: Initial Baseline PLBR and CAP (to be updated as design matures).
2015: Establish Baseline 1 of Solution Architecture.
2015: Establish Baseline 1 of PLBR and CAP, which includes requirements for the 2016 Census Test.
2015: Determine the approach for conducting integrated tests for the 2016, 2017, and 2018 Census Tests (Design Decision 1).
2015: Determine tools and test materials required to support the integrated tests (Performance, Test Services, Representative Test Data, etc.) (Design Decision 2).
April 2016: Complete a deployment of systems supporting the 2016 Census Test.
July 2016: Conduct CDR and TRR for Address Canvassing Test.
July 2016: Conduct SRR and CDR for 2017 Census Test and establish Baseline 2 of PLBR, CAP, and Solution Architecture.
August 2016: Complete TRR for 2018 End-to-End Census Test ADC Recruiting.
August 2016: Conduct SRR and CDR for 2018 End-to-End Census Test and establish Baseline 3 of PLBR, CAP, and Solution Architecture.
August 2016: Conduct PRR and complete deployment of systems supporting Address Canvassing Test.
October 2016: Complete a deployment of systems supporting the 2016 Census Test.
November 2016: Complete TRR for 2017 Census Test.
December 2016: Complete PRR for 2018 End-to-End Census Test ADC Recruiting/Deployment.
January 2017: Complete PRR and deployment of systems supporting 2017 Census Test recruiting, training, and self-response releases.
March 2017: Complete TRR for 2018 End-to-End Census Test ADC Training.
May 2017: Complete SRR for 2020 Census Release 1.
May 2017: Complete TRR for 2018 End-to-End Census Test In-Field ADC.
June 2017: Conduct Initial SRR for 2020 Census.
June 2017: Complete CDR for 2020 Census Release 1.
June 2017: Conduct TRR for 2018 End-to-End Census Test Peak Operation Recruiting.
June 2017: Complete PRR for 2018 End-to-End Census Test ADC Training.
July 2017: Complete SRR for 2020 Census Release 2.
July 2017: Complete deployment for 2018 End-to-End Census Test ADC Training and PRR for 2018 End-to-End Census Test In-Field ADC.
July 2017: Conduct 2018 End-to-End Census Test PRR for Peak Operation Recruiting.
August 2017: Complete CDR for 2020 Census Release 2.
August 2017: Complete deployment for 2018 End-to-End Census Test In-Field ADC and Peak Operation Recruiting.
October 2017: Complete second SRR and CDR for 2020 Census.
October 2017: Complete PRRs and deployment of systems supporting the first four releases of the 2018 End-to-End Census Test.
October 2017: Conduct TRR for systems supporting self-response and field enumeration releases of the 2018 End-to-End Census Test.
October 2017: Complete TRR for systems supporting 2018 End-to-End Census Test self-response and field enumeration training.
November 2017: Complete SRR for 2020 Census Release 3.
December 2017: Complete TRR for systems supporting 2018 End-to-End Census Test field enumeration.
January 2018: Complete PRR and deployment of systems supporting self-response and field enumeration releases of the 2018 End-to-End Census Test.
January 2018: Complete SRR for 2020 Census Release 4.
February 2018: Complete CDR for 2020 Census Release 4.
February 2018: Complete PRR for 2018 End-to-End Census Test field enumeration.
April 2018: Complete TRR for tabulation/dissemination release of 2018 End-to-End Census Test.
July 2018: Complete TRR for Early Operations Recruiting, Selection, Hiring, and Training.
August 2018: Complete PRR for Early Operations Recruiting.
October 2018: Complete PRR and deployment of systems supporting tabulation and dissemination for the 2018 End-to-End Census Test.
October 2018: Complete PRR for Early Operations Selection, Hiring, and Training.
March 2021: Release final, as-built, and operated Solution Architecture.
Annually: Refresh and reissue strategic program documentation based on lessons learned, test results, and other feedback.

5.3.2 Security, Privacy, and Confidentiality
Detailed Planning Status: In Production; DOP published in FY 2017

Purpose
The Security, Privacy, and Confidentiality (SPC) Operation ensures that all operations and systems used in the 2020 Census adhere to laws, policies, and regulations that:
•• Ensure appropriate systems and data security.
•• Protect respondent and employee privacy and confidentiality.
•• Ensure all 2020 Census accepted IT security risks are in alignment with the Census Bureau's security program policies.
•• Ensure all of the 2020 Census IT system security risks are monitored by the 2020 Census Risk Review Board, as well as an Information System Security Officer and the Office of Information Security.
•• Embed an Office of Information Security security engineer in the 2020 Census Program to ensure compliance with the IT security program and integration with the Census Bureau's Enterprise environments.
•• Ensure all employees supporting IT security are certified in accordance with the Census Bureau's IT security program.

Changes Made Since Version 3.0 Operational Plan Release: There have been no major changes to this operation.

Lessons Learned
Based on lessons learned from the 2010 Census and other reviews, the following recommendations were made:
•• Ensure IT systems and applications supporting the 2020 Census have the proper security authorization prior to start of operations.

Operational Innovations
Operational innovations include the following:
•• Implement an IT Security Program Risk
Management Framework in accordance with
National Institute of Standards and Technology
guidelines.
•• Hire a 2020 Census Chief IT Security Engineer
to support application development, mobile
computing, and enterprise systems.
•• Increase staff in the Census Bureau Office of
Information Security to provide penetration
testing services and more extensive scanning for vulnerabilities and configuration
management.
•• Align all Privacy Impact Assessments and
Privacy Threshold Analyses to the System
Security Plans.
Description of Operation
The SPC Operation ensures that all operations and systems used in the 2020 Census adhere to the appropriate systems and data security and respondent and employee privacy and confidentiality policies, laws, and regulations. Specific requirements are outlined below.
Security
Ensure compliance with the following laws and Census Bureau policies:
•• IT Security Program Policy: Ensure all 2020
Census systems meet federal, Department of
Commerce, and Census Bureau IT security
policy requirements as identified in the Census
Bureau IT Security Program Policy and relevant
National Institute of Standards and Technology
documentation.
•• Ensure that the 2020 Census only collects information necessary for complying with the 2020
Census mission and legal requirements.
•• Ensure all 2020 Census systems have an
Authority to Operate.
•• Ensure each system has a designated
Information System Security Officer.
•• Ensure all 2020 Census Program systems are
covered by the Risk Management Framework,
which includes processes to ensure systems
undergo a security review before testing and
a full security assessment before obtaining an
Authority to Operate (ATO).
•• Ensure that all employees and temporary
staff working with data protected by Title 26
U.S.C. have completed the necessary Title 26
Awareness training and that their access is
controlled and tracked per Internal Revenue
Service standards.
•• Ensure that the 2020 Census complies with
the Census Bureau’s data stewardship policies
including:
ºº The Census Bureau’s Privacy Principles.
ºº Controlling Nonemployee Access to Title 13
Data Policy (DS-006).
ºº Safeguarding and Managing Information
Policy (DS-007).
ºº Data Linkage Policy (DS-014).
ºº Administrative Data Acquisition, Access, and
Use Policy (DS-001).
ºº Respondent Identification and Sensitive
Topics in Dependent Interviewing Policy
(DS-016).
ºº Control of Access to Personally Identified
Survey and Decennial Census Data:
Unauthorized Browsing Policy (DS-018).
ºº Policy On Conducting Privacy Impact
Assessments (DS-019).
ºº Data Breach Policy (DS-022).
•• Ensure appropriate suitability screening processes are in place.
•• Ensure Decennial Privacy Impact Assessments
and Privacy Threshold Analyses are current.
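Several of the checks above (every system has an Authority to Operate, every system has a designated Information System Security Officer) lend themselves to mechanical tracking against a system inventory. The sketch below is a hypothetical illustration of that idea, not an actual Census Bureau tool; the inventory entries are invented:

```python
# Illustrative sketch: flag systems in a (hypothetical) inventory that
# lack an Authority to Operate (ATO) or a designated Information
# System Security Officer (ISSO). Names and data are invented.
systems = [
    {"name": "system-a", "ato": True,  "isso": "J. Smith"},
    {"name": "system-b", "ato": False, "isso": "A. Lee"},
    {"name": "system-c", "ato": True,  "isso": None},
]

def compliance_gaps(inventory):
    """Return (system, issue) pairs for every unmet requirement."""
    gaps = []
    for s in inventory:
        if not s["ato"]:
            gaps.append((s["name"], "missing ATO"))
        if not s["isso"]:
            gaps.append((s["name"], "no ISSO designated"))
    return gaps

print(compliance_gaps(systems))
# [('system-b', 'missing ATO'), ('system-c', 'no ISSO designated')]
```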
Privacy and Confidentiality
•• Ensure that each system of record has an
appropriate System of Record Notice published
in the Federal Register.
•• Ensure that the Census Bureau meets its legal
obligations to protect privacy and confidentiality as prescribed by Title 5 and the Privacy
Act of 1974, the E-Government Act of 2002,
the Census Act (Title 13 United States Code
[U.S.C.]) and the Internal Revenue Code (Title
26 U.S.C.); and adheres to the data stewardship
policies that support these legal obligations.
•• Ensure that all employees of the Census Bureau and temporary staff authorized under Title 13 U.S.C. §23(c) supporting 2020 Census operations have sworn to uphold the confidentiality provisions of Title 13 U.S.C. §9. These individuals must:
ºº Sign an affidavit of nondisclosure and receive Special Sworn Status.
ºº Complete the necessary Data Stewardship and IT Security Awareness training.
•• Establish a System of Record Notice for Device as a Service technology to be used in the 2020 Census.
•• Ensure a privacy notice is available on all Census Bureau social media sites and other third-party Web sites operated by or on behalf of the agency.
•• Align the Privacy Impact Assessments and Privacy Threshold Analyses to security plans as part of the accreditation process; work with training operations to ensure 2020 Census managers and staff are prepared to notify the respondents about the purpose and planned statistical uses of the information collected.
•• Ensure that the Privacy Act statement is present on all documents, forms, and questionnaires (paper and electronic) that collect personally identifiable information.
•• Support the Paperwork Reduction Act submission process for decennial collections.
•• Ensure Personally Identifiable Information
Incident Handling process is operational.
•• Recharter the decennial policy team as a subcommittee of the Data Stewardship Executive Policy Committee to serve as an interdisciplinary body of experts in security, privacy, confidentiality, law, policy, methodology, information technology, and communications to advise the 2020 Census Program on policy issues that arise during the planning and execution of the program.
Work Completed
The following work has been completed for this
operation:
Encryption
•• Researched securely managing data on mobile devices using a Mobile Application Management (MAM) software solution.
Cloud Technology
•• Adopted the “Cloud First” strategy.
•• Examined the requirements of the applications
and underlying infrastructure from a security
compliance perspective.
•• Examined the requirements for hybrid cloud
capabilities to allow flexibility in leveraging
cloud technology to meet future program
requirements.
•• Enabled the deployment of cloud-based
services.
Documentation
•• Worked with the Office of Information Security (OIS) and the Decennial Contracts Execution Office (DCEO) to develop the ATO schedules for future integration releases.
•• Worked with OIS, the Systems Engineering and Integration Operation, and the DCEO/Technical Integrator on a plan that supports the mitigation and monitoring of Plans of Action and Milestones to reduce the security risk of the systems supporting the 2020 Census.
Decisions Made
The following decisions have been made for this
operation:
✓ The 2020 Census will access Title 13 and Title 26 data, including administrative records and third-party data, remotely using the Virtual Desktop Infrastructure, Virtual Private Network, and other secure infrastructure.
✓ In Decision Memorandum 2016.01, the Census Bureau decided to implement the Device as a Service strategy for provisioning equipment to enumerators in the 2020 Census. The mobile devices provisioned through this strategy will be managed by an Enterprise Mobility Management4 (EMM) solution that offers Mobile Device Management (MDM) and MAM capabilities.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in SPC is projected to have minimal
influence on the overall cost and quality of the
2020 Census.
Risks
In accordance with the Census Bureau’s security policy, all IT systems must undergo an independent security assessment and acquire the
authorization to operate prior to operating in the
production environment. In addition, all systems
must meet the Census Bureau’s Risk Management
Framework continuous monitoring requirements.
IF an IT system supporting the 2020 Census
encounters an unexpected configuration change,
which affects the system’s security posture, THEN
additional security assessments are required,
which may result in an increase in security support costs, an increase in the system security risk
rating, and schedule delays.
4 Both MDM and MAM fall under the umbrella term of EMM, but each performs different functions: MDM manages device functions such as connectivity and device policies, while MAM typically provides a secure workspace to manage and protect mobile applications and their data.
Milestones
April 2015: Monitored security of systems used in the 2015 Census Test.
January 2016: Conducted security reviews and assessments on system releases for the 2016 Census Test.
October 2016: Conducted security reviews and assessments on system releases for the 2017 Census Test.
March 2017: Released the SPC Detailed Operational Plan.
October 2017: Conducted security reviews and assessments on system releases for the 2018 End-to-End Census Test.
October 2018: Conduct security reviews and assessments on system releases for the defect resolution testing and post end-to-end performance testing in 2019.
5.3.3 Content and Forms Design
Detailed Planning Status: In Production; DOP published in FY 2016
Purpose
The Content and Forms Design (CFD) Operation
performs the following activities:
•• Identify and finalize content and design of
questionnaires and associated nonquestionnaire
materials such as letters, postcards, inserts,
envelopes, and field enumeration materials.
•• Ensure consistency across data collection
modes and operations, including (but not limited to) questionnaire content, help text, mailing
materials, and field enumeration materials.
•• Provide the optimal design and content of the
questionnaires to encourage high response
rates.
Changes Made Since Version 3.0 Operational
Plan Release: Although there have been no major
changes to this operation, in this fiscal year, the
Census Bureau finalized the questions planned for
the 2020 Census and submitted this documentation to Congress.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Ensure sufficient time for testing the questionnaire content. Also include testing of associated
nonquestionnaire materials.
•• Consider forms design elements (size, color,
spacing implications, etc.), mode, and language
when finalizing questionnaire content and
design. Also test for successful data capture
before implementation.
•• Conduct comprehensive testing of optimized
content in the usability lab and in a field test to
prevent unanticipated negative impacts on data
quality.
•• Determine if a bilingual initial or replacement questionnaire in selected bilingual tracts is beneficial.
Operational Innovations
Operational innovations include the following:
•• Create consistent content for automated data collection instruments needed for Internet Self-Response and Nonresponse Followup (NRFU) Operations.
•• Redesign the bilingual paper questionnaires to a flip-style design.
Description of Operation
The CFD Operation is responsible for identifying
and finalizing the content and design of questionnaires and associated nonquestionnaire materials.
To support the 2020 Census, the CFD Operation
ensures content consistency across data collection modes and operations, as wording may vary
depending on mode of data collection. The CFD
Operation is responsible for creating, refining, and
finalizing instrument specifications for all data
collection modes—Internet, phone, paper, and field
enumeration. This is a significant departure from
the 2010 Census, which relied on paper for data
collection.
Specific activities of the CFD Operation include
the following:
•• Developing content specifications for all data
collection modes: Internet, phone, paper, and
field enumeration.
•• Pretesting questionnaire content (e.g., cognitive
testing, focus groups).
•• Finalizing content development and design
of questionnaires across all modes: Internet,
phone, paper, and field enumeration.
•• Finalizing content development and design of
associated nonquestionnaire materials including
letters, postcards, inserts, envelopes, notice of
visit, and confidentiality notice.
•• Optimizing questionnaire designs for each
mode and all supporting materials, in alignment
with systems specifications.
•• Ensuring questionnaire content and supporting
materials are accurate, appropriate, consistent,
inviting, and easy to understand across self-response and nonresponse data collection modes.
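Keeping question wording consistent across Internet, phone, paper, and field enumeration is easiest to see as a single content specification that every mode renders from, with mode-specific overrides only where a mode genuinely requires them. The sketch below is a hypothetical illustration of that pattern; the question wording is invented and is not the actual 2020 Census text:

```python
# Illustrative sketch: one content specification drives every
# collection mode, so wording stays consistent while presentation
# varies by mode. Wording here is invented, not actual census text.
from dataclasses import dataclass, field

@dataclass
class QuestionSpec:
    item: str
    stem: str                       # canonical question wording
    mode_overrides: dict = field(default_factory=dict)

    def wording(self, mode: str) -> str:
        """Return the wording a given mode should present."""
        return self.mode_overrides.get(mode, self.stem)

tenure = QuestionSpec(
    item="tenure",
    stem="Is this house, apartment, or mobile home owned or rented?",
    mode_overrides={
        # Interviewer-administered modes may need a spoken lead-in.
        "phone": "I'm going to ask about this residence. Is it owned or rented?",
    },
)

print(tenure.wording("internet") == tenure.wording("paper"))  # True
```

The benefit of the single-source pattern is that a content change propagates to every instrument specification instead of being edited once per mode.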
Research Completed
The following research has been completed for
this operation:
•• Qualitative Research on Content:
ºº Conducted qualitative research on alternative questionnaire wording for the following topics: race and Hispanic origin, relationship, and within-household coverage.
•• Findings: Informed questionnaire wording
(for content variations) tested in the 2015
National Content Test and other Research
and Testing Phase testing.
ºº Conducted expert review of paper questionnaire design and inclusion of write-in fields
for all race categories.
•• Findings: Informed layout of paper questionnaire design for the 2015 National
Content Test.
•• Usability and Systems Testing:
ºº Conducted usability testing of automated
data collection instruments (Internet, field
enumeration).
•• Findings: Informed final instrument layout and navigation for 2014, 2015, and
2016 Census Tests and the 2015 National
Content Test.
ºº Conducted testing on data capture of paper
questionnaire responses.
•• Findings: Informed paper questionnaire
layout for the 2014, 2015, and 2016 Census
Tests and the 2015 National Content Test.
ºº Conducted 2014 Census Test (relationship
response categories).
•• Findings: Continue testing new relationship response categories.
ºº Conducted 2015 Census Tests (content and
questionnaire design).
•• Findings: Coverage questions added to
respondent burden (based on observations of field operations and respondents’
reactions to questionnaire content).
•• 2015 National Content Test (content and questionnaire design):
ºº Finalized content to be tested during the
2015 National Content Test.
ºº Developed content specifications for
Internet data collection instrument.
ºº Developed English and Spanish bilingual
paper questionnaires (10 versions: eight for
stateside, two for Puerto Rico).
ºº Developed Computer-Assisted Telephone
Interview instrument specifications for
the 2015 National Content Test Race and
Coverage Reinterview.
•• 2016 Census Test (content and questionnaire
design):
ºº Finalized content to be tested during the
2016 Census Test.
•• 2017 Census Test (content and questionnaire
design).
ºº Finalized content to be tested during the
2017 Census Test.
•• 2018 End-to-End Census Test (content and
questionnaire design).
ºº Finalized content for the 2018 End-to-End
Census Test.
Decisions Made
The following decisions have been made for this
operation:
99 Flip-style bilingual paper questionnaires will be
used for household enumeration.
99 Coverage questions will be streamlined to
reduce respondent burden while maintaining
data quality (based on 2014 and 2015 Census
Test field observations).
U.S. Census Bureau
99 The subjects planned for the 2020 Census
were submitted to Congress on March 28,
2017. The subjects planned for the 2020
Census include age, gender, race/ethnicity,
relationship to householder, and tenure of
occupied housing unit.
99 The paper questionnaire layout for the respondents living in residences other than households, such as group quarters and transitory
locations, will be 9 x 11 inches in size, similar to
the housing unit questionnaires.
99 The Questions Planned for the 2020 Census
and American Community Survey document
was submitted to Congress on March 29, 2018.
99 The 2020 Census questionnaires will be 9 x 11 inches in size. Questionnaire layout will be person-based, and bilingual questionnaires will use a flip-style design. Letters, language assistance
sheets, and information sheets will be 8.5 x 11
inches in size. Inserts, confidentiality notices,
and notices of visit will be a half sheet, 8.5 x 5.5
inches in size. The language identification card
will feature an accordion-fold of 8.5-x-11-inch
pages.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in CFD is projected to have minimal
influence on the overall cost of the 2020 Census.
Impacts of this operation on overall 2020 Census
quality include the following:
ÏÏ Internet questionnaire design is anticipated to improve the quality of self-response.
ÏÏ The automated NRFU instrument is anticipated to improve the quality of responses.
Risks
The questions planned for the 2020 Census were submitted to Congress in March 2018. Any
additional changes to this content will substantially impact scope, schedule, and resources for
the 2020 Census. These impacts include, but are
not limited to, the following:
•• Automated data collection instrument content specifications will need to be updated, including any necessary residual updates to the flow of the instrument.
•• Updated automated data collection instruments
will need to be programmed, usability-tested,
and redeployed.
•• Updated automated data collection instruments
will need to be tested to ensure responses are
accurately captured and processed.
•• Paper questionnaires will need to be updated.
•• Updated paper questionnaires will need to be
tested to ensure responses can be accurately
captured.
•• Questionnaire print files will need to be updated, and some questionnaires may need to be reprinted.
•• Non-English automated data collection instrument content specifications in 12 languages will
need to be updated, and those translations will
need to be reviewed and pretested.
•• Non-English updated automated data collection instruments will need to be programmed,
usability-tested, and redeployed.
•• Language assistance guides in 59 non-English languages will need to be updated and
reviewed.
•• Content-related training of Census
Questionnaire Assistance agents and enumerators will need to be updated, reviewed, and
redeployed.
•• Content-related nonquestionnaire materials will
need to be updated and reviewed.
•• Downstream systems that capture data or receive responses, process data, and create reports and data products will need to be updated.
IF there are any additional changes to the content,
THEN the resulting updates will require additional
time in the schedule, potentially delaying deliverables and increasing cost.
Milestones

May 2015: Complete cognitive testing of paper questionnaire content for the 2015 National Content Test (English, Spanish). Complete cognitive testing of paper questionnaire content and associated nonquestionnaire materials in multiple languages.
August 2015: Complete cognitive testing of Internet questionnaire content for the 2015 National Content Test for English and Spanish. Start conducting the 2015 National Content Test.
October 2015: Complete the 2015 National Content Test (data collection).
February 2016: Final questionnaire content for the 2016 Census Test (Race, Relationship, Coverage). Baselined instrument specifications for the 2016 Census Test.
June 2016: Complete cognitive and usability testing of Chinese and Korean Internet and NRFU instruments and associated nonquestionnaire materials. Receive analysis of 2015 National Content Test results. Cognitive testing of possible additional topics (e.g., tribal enrollment).
August 2016: Receive results from cognitive test of possible additional topics (e.g., tribal enrollment).
September 2016: Release the CFD Detailed Operational Plan.
October 2016: Analysis of the 2016 Census Test results. Finalize questionnaire content for the 2017 Census Test. Baselined instrument specifications for the 2017 Census Test.
April 2017: Submit 2020 Census topics to Congress.
October 2017: Finalize questionnaire content for the 2018 End-to-End Census Test. Baselined instrument specifications for the 2018 End-to-End Census Test.
April 2018: Submit 2020 Census question wording to Congress.
May 2019: Finalize 2020 Census paper questionnaires for print.
March 2020: Finalize 2020 Census questionnaire design and layout across all modes. Deploy 2020 Census self-response data collection instruments and materials.
5.3.4 Language Services
Detailed Planning Status: In Production; DOP published in FY 2016
Purpose
The Language Services (LNG) Operation performs
the following activities:
•• Assess and support language needs of non-English-speaking populations.
•• Determine the number of non-English languages and level of support for the 2020
Census.
•• Optimize the non-English content of questionnaires and associated nonquestionnaire
materials across data collection modes and
operations.
•• Ensure cultural relevancy and meaningful
translation of 2020 Census questionnaires and
associated nonquestionnaire materials.
Changes Made Since Version 3.0 Operational
Plan Release: The Census Bureau issued the 2020
Census Non-English Language Support memorandum in February 2018. The memorandum documents the 2020 Census Program decision regarding the number of non-English languages that will
be supported, and the level of support, during the
2020 Census operations.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Conduct further research on language selection criteria.
•• Conduct cognitive testing earlier in the decade to allow for high-quality translation of questionnaires and nonquestionnaire materials.
•• Optimize non-English materials to ensure cultural relevance for intended audiences.
•• Allow Internet responses in English and other languages.
•• Test a Spanish version of the questionnaire on the Internet.
Operational Innovations
Automated data collection instruments available
in non-English languages.
Description of Operation
The LNG Operation is responsible for assessing the language needs of the nation and identifying ways to reduce language barriers to enumeration for respondents within limited-English-speaking households. To support the 2020 Census, the LNG Operation will determine the number of non-English languages and level of support and optimize the non-English content of questionnaires and associated nonquestionnaire materials. The operation will ensure cultural relevancy and meaningful translation of these materials across data collection modes and operations.

To achieve the goal of reducing language barriers to enumeration, the LNG Operation supports the 2020 Census operations by providing data collection instruments in non-English languages, optimizing the format of bilingual paper questionnaires, and enhancing the content of all non-English mailing and field materials (such as questionnaires, letters, postcards, the notice of visit, and the confidentiality notice) through pretesting to ensure question wording and messages are consistent and culturally relevant.

Specific activities of the LNG Operation include the following:
•• Determining the number of non-English languages and level of support during the 2020 Census.
•• Optimizing the content of non-English questionnaires for each data collection mode, as appropriate, for Limited English Proficiency (LEP) populations.
•• Ensuring culturally and functionally appropriate questionnaire design and content across translations (e.g., through pretesting).
•• Optimizing non-English content of mailing materials to: (1) ensure non-English speakers receive the same message as English speakers prior to going online; (2) determine whether non-English speakers respond differently to the number and ordering of contacts than English speakers; and (3) determine whether adding multilanguage public-use forms increases participation by non-English speakers.
•• Providing language guides in multiple languages, including American Sign Language, large print, and braille.

Research Completed
To achieve the goals of assisting and creating multiple modes of collecting information from non-English-speaking respondents, the LNG Operation conducts research on language needs and trends and relies on sociolinguistic approaches to provide language operations and assistance and to identify, create, and refine non-English materials for LEP respondents. The operation also includes a National Advisory Committee Language Working Group for National Advisory Committee members and subject-matter experts to jointly strategize on language operations for the 2020 Census.

The following research has been completed for this operation:
•• Qualitative Research on Non-English Content:
ºº Tested for accuracy and cultural appropriateness of translated questionnaire content for
the following languages: Spanish, Chinese,
Korean, Vietnamese, Russian, Arabic.
•• Findings: Informed questionnaire wording
for 2015 National Content Test and other
mid-decade testing.
ºº Conducted joint cognitive and usability
testing of non-English Internet Self-Response
(ISR) instrument in Chinese, Vietnamese,
Korean, Russian, Arabic, Tagalog, Polish,
French, Haitian Creole, Portuguese, and
Japanese.
•• Findings: Informed the 2020 Census ISR non-English wording.
•• In-House Review of Materials:
ºº Conducted expert review of field materials in
non-English languages.
•• Findings: Informed translated content of
Notice of Visit for the 2015 Census Test;
Revised Language Identification Card.
ºº Conducted expert review of 2020 Census
materials, including Language Assistance
Sheet, scripts for Census Questionnaire
Assistance (CQA) agents, and language
guide templates.
•• Findings: Informed translated content of
2020 Census materials.
•• Language Needs Assessment:
ºº Assessed current language needs using
American Community Survey (ACS) data.
•• Findings: Informed non-English support for the 2015, 2016, and 2017 Census Tests, the End-to-End Census Test, and the 2015 National Content Test.
ºº Analyzed 2016 ACS 5-year estimates to identify language groups with at least 2,000 limited-English-speaking households, and assessed these languages for translation feasibility.
•• Findings: Informed 2020 Census Non-English Language Support.
•• Research on Translation Technology:
ºº Conducted research on translation machines.
•• Findings: Machine translations generally
show severe structural, grammatical, and
contextual errors and should not replace
human translations.
•• Usability and Systems Testing:
ºº Conducted usability testing of Spanish automated data-collection instruments (Internet,
field enumeration).
•• Findings: Informed final instrument layout
and navigation for the 2014, 2015, 2016,
and 2017 Census Tests and the 2015
National Content Test.
ºº Conducted usability testing of Chinese and
Korean automated data-collection instruments (Internet, field enumeration) for the
2016 Census Test.
•• Findings: Informed final instrument layout
and navigation for the 2016 Census Test.
ºº Conducted testing on data capture for
mid-decade testing.
•• Findings: Informed paper questionnaire
layout for the 2014, 2015, 2016, and 2017
Census Tests and the 2015 National
Content Test.
•• Field Testing of Non-English Instruments and
Materials:
ºº Conducted testing of data collection instruments (Internet, phone, paper, field enumeration) and mailing/field materials in Spanish,
Chinese, and Korean.
•• Findings: Informed final instrument layout
and navigation for the 2016 Census Test.
ºº Conducted testing of non-English data
collection through CQA in Spanish, Chinese,
Vietnamese, Korean, Russian, Arabic, and
Tagalog.
•• Findings: Informed 2020 Census CQA
non-English content.
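The language needs assessment above applied a concrete selection rule: identify language groups with at least 2,000 limited-English-speaking households in ACS 5-year estimates. A minimal sketch of that rule follows; the figures and field names are invented for illustration and are not actual ACS data:

```python
# Illustrative sketch of the language-selection rule: keep language groups
# with at least 2,000 limited-English-speaking households (per ACS 5-year
# estimates). The figures below are hypothetical, for demonstration only.
THRESHOLD = 2_000

# language -> estimated limited-English-speaking households (hypothetical)
acs_estimates = {
    "Spanish": 3_500_000,
    "Chinese": 950_000,
    "Korean": 180_000,
    "ExampleLanguage": 1_200,   # below threshold -> not selected
}

def select_languages(estimates, threshold=THRESHOLD):
    """Return (language, households) pairs meeting the threshold, largest first."""
    selected = [(lang, n) for lang, n in estimates.items() if n >= threshold]
    return sorted(selected, key=lambda pair: pair[1], reverse=True)

for lang, households in select_languages(acs_estimates):
    print(f"{lang}: {households:,} limited-English-speaking households")
```

The actual assessment also weighed translation feasibility, which this sketch omits.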
Decisions Made
The following decisions have been made for this
operation:
99 Flip-style bilingual paper questionnaires will be
used instead of the swimlane style.
99 The LNG Operation utilized a National Advisory
Committee Language Working Group for early
engagement on language assistance plans for
the 2020 Census.
99 During the 2020 Census, CQA will support
English, Spanish, Chinese (Mandarin and
Cantonese), Vietnamese, Korean, Russian,
Arabic, Tagalog, Polish, French, Haitian Creole,
Portuguese, and Japanese.
99 During the 2020 Census, the ISR instrument
and CQA will be provided in 12 non-English languages. In addition, language guides, language
glossaries, and language identification cards
will be provided in 59 non-English languages.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in LNG is projected to have minimal
influence on the overall cost of the 2020 Census.
Impacts of this operation on overall 2020 Census
quality include the following:
ÏÏ Automated data collection instruments in non-English languages are anticipated to improve the quality of responses from non-English-speaking respondents.
ÏÏ Culturally appropriate, translated questionnaires and associated nonquestionnaire materials are anticipated to improve the quality of responses from non-English-speaking respondents.
Risks
Any changes to the 2020 Census English ISR content specification will impact all non-English content specifications. IF final English content changes less than 15 working days before the Release 3 TRR, THEN there will not be adequate time in the schedule to translate, develop, pretest, and produce non-English Internet questionnaires for the 2020 Census before that TRR date.

Milestones

March 2016: Deploy Internet and NRFU instruments in Spanish, Chinese, and Korean for the 2016 Census Test. Deploy bilingual paper questionnaire and associated nonquestionnaire materials in Spanish, Chinese, and Korean for the 2016 Census Test.
September 2016: Release the LNG Detailed Operational Plan.
2016–2019 (ongoing): Conduct qualitative research on data collection instruments and materials in additional languages.
September 2017: Determine number of non-English languages and level of support for the 2020 Census.
February 2018: Issue 2020 Census Non-English Language Support through the 2020 Census Memorandum series.
March 2018–2019: Continue development of Internet instrument in additional non-English languages for 2020.
March 2020: Deploy 2020 Census non-English self-response data collection instruments and materials.

5.4 FRAME
The operations in this area have the goal of developing a high-quality geospatial frame that serves as the universe for the enumeration activities. This area consists of three operations: Geographic Programs (GEOP), Local Update of Census Addresses (LUCA), and Address Canvassing (ADC). Each is described below.

5.4.1 Geographic Programs
Detailed Planning Status: In Production; DOPs published in FY 2016

Purpose
The Geographic Programs (GEOP) Operation provides the geographic foundation in support of the 2020 Census data collection and tabulation activities within the Master Address File/Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) System. The MAF/TIGER System (software applications and databases) serves as the national repository for all of the spatial, geographic, and residential address data needed for census and survey data collection, data tabulation, data dissemination, geocoding services, and map production.

Components of this operation include:
•• Geographic Delineations.
•• Geographic Partnership Programs.
•• Geographic Data Processing.

Changes Made Since Version 3.0 Operational Plan Release: There have been no major changes to this operation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Consider consolidation of field operations and the Type of Enumeration Area (TEA) values used to support field operations.
•• To the greatest extent possible, conduct geographic reconciliation of boundaries on an ongoing basis throughout the decade.
•• To the greatest extent possible, geographic
extracts and updates should be made in an
electronic form to reduce the production, shipping, and handling of paper maps and paper
listings by the Census Bureau and its program
participants.
•• Update the MAF through partnership programs
in order to increase the Census Bureau’s ability
to geocode addresses from the United States
Postal Service (USPS) Delivery Sequence File (DSF).

Operational Innovations
Operational innovations include the following:
•• Use of varied data sources (e.g., imagery and third-party data) to validate and augment the MAF/TIGER System throughout the decade:
ºº As part of the Geographic Support System Initiative (GSS-I), the Census Bureau has obtained address and road centerline data from state and local partnerships and has updated the MAF/TIGER System with these data since 2013.
ºº Ongoing investigation of potential use of third-party data sources.
•• Development of a modular, multimode Geographic Update Partnership Software (GUPS) to streamline partners' participation.
•• Delineation of Basic Collection Units (BCUs) to:
ºº Eliminate operation-specific Assignment Area delineations.
ºº Incorporate data and information not previously used in delineation, such as predominant housing unit (HU) characteristics [e.g., single unit, group quarters (GQs), and mobile homes].

Description of Operation
The GEOP Operation includes components of the 2020 Census that are geographic in nature. The components of the GEOP project fall into three general categories, as shown in Figure 30:
•• Geographic Delineations.
•• Geographic Partnership Programs.
•• Geographic Data Processing.

Figure 30: Summary of Geographic Program Components
•• Geographic Delineations: Type of Enumeration Area (TEA) development and delineation; Basic Collection Unit (BCU) development and testing; Delineation of Special Land-Use Areas; Field management area delineation.
•• Geographic Partnership Programs: Boundary and Annexation Survey (BAS); Participant Statistical Areas Program/Tribal Statistical Areas Program (PSAP/TSAP); Boundary Validation Program (BVP); Public-Use Microdata Areas (PUMAs).
•• Geographic Data Processing: Augmentation of the MAF/TIGER System with addresses from administrative records and third-party data; MAF/TIGER Extract Support; Geographic Data Processing; Geographic Area Reconciliation Program.
Geographic Delineations
The Geographic Delineation component of the
GEOP determines, delineates, and updates the
geographic area boundaries for 2020 Census
data collection and data tabulation. Census data
collection relies on the delineation of various
geographic areas, known as “collection geography,” to support the capture of data during census
activities. This includes both the delineation of the
methods used to enumerate households and the
definition of field management areas. The following collection geography is delineated during the
2020 Census:
•• TEA: In an effort to ensure the most cost-effective and efficient process to enumerate households, every BCU in the United States is assigned to one specific TEA. The TEA reflects the methodology used to enumerate the households within the block. The TEA assignment utilizes a variety of information to identify the most cost-effective enumeration approach for all of the United States, the District of Columbia, Puerto Rico, and the Island Areas.
•• BCU: The BCU serves as the smallest unit of collection geography for all 2020 Census listing operations. The BCU replaces both the collection block and assignment area geographies used for the 2010 Census.
•• Special Land-Use Area: A key component of
collection geography is the delineation of land
areas that may require unique field treatment
or tabulation. This includes military areas, GQ
areas (e.g., correctional facilities and colleges
and universities), and public lands. The main
purpose of the special land use delineation is to
improve tabulation block boundaries, to allow
field operations to manage special land use
areas in the field effectively, to assist in maintaining the GQ address list, to allow for public
lands to be removed from In-Field Address
Canvassing (ADC) (see Section 5.4.3) and other
field operations, and to maintain relationships
between these areas and other geographic entities such as incorporated places and American
Indian Areas.
•• Field Management Area Delineation: This
component of collection geography includes
delineation of geographic areas, other than
BCUs and TEA, which are necessary to manage
and accomplish fieldwork for the 2020 Census.
In past censuses, this has included Crew Leader
Districts, Field Operation Supervisor Districts,
and Area Census Office boundaries. For the
2020 Census, this will consist of the Area
Census Office boundaries, the Census Field
Management areas, and on an operation-by-operation basis, Census Field Supervisor areas.
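The TEA assignment described above maps every BCU to exactly one enumeration methodology. A minimal sketch of such a first-match assignment follows, using the six TEA codes listed in the Decisions Made section; the BCU attributes and decision rules here are invented for illustration, as the actual criteria are more involved:

```python
# Hypothetical sketch: assign each BCU exactly one Type of Enumeration Area
# (TEA). Attribute names and decision rules are invented for illustration.
from dataclasses import dataclass

# TEA codes as listed in the operational plan.
TEA = {1: "Self-Response", 2: "Update Enumerate", 3: "Island Areas",
       4: "Remote Alaska", 5: "Military", 6: "Update Leave"}

@dataclass
class BCU:
    bcu_id: str
    in_island_areas: bool = False
    in_remote_alaska: bool = False
    is_military: bool = False
    has_city_style_addresses: bool = True  # mail deliverable to street address

def assign_tea(bcu: BCU) -> int:
    """Return a single TEA code for a BCU; the first matching rule wins."""
    if bcu.in_island_areas:
        return 3
    if bcu.in_remote_alaska:
        return 4
    if bcu.is_military:
        return 5
    if not bcu.has_city_style_addresses:
        return 6  # Update Leave where mail delivery is unreliable
    return 1      # default: self-response

print(assign_tea(BCU("0001")), TEA[assign_tea(BCU("0001"))])
```

The point of the sketch is the invariant, not the rules: each BCU receives one and only one TEA.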
Census results are dependent on the delineation of various geographic areas to both tabulate and report person and household statistics. The delineation of these geographic areas, known as “tabulation geography,” is based on input from partnership programs (such as the Participant Statistical Areas Program/Tribal Statistical Areas Program [PSAP/TSAP]) or on internally defined tabulation criteria, such as the Urbanized Area delineation. After rules are defined or tabulation geographies are proposed by partners, the tabulation geography is delineated in the MAF/TIGER System through a series of batch and interactive delineations, followed by a series of data integrity validations, renumbering, and certification steps. Once the tabulation geographic areas
are certified, they are loaded into the MAF/TIGER
database and used for the tabulation of statistical
data and as the base for various geographic data
products that support the 2020 Census. Tabulation
geography planned for the 2020 Census includes:
ºº American Indian Areas.
ºº Metropolitan and Micropolitan Statistical
Areas and Related Statistical Areas.
ºº Counties.
ºº County Subdivisions.
ºº Census Designated Places.
ºº Census Tracts.
ºº Block Groups.
ºº Blocks.
ºº Congressional Districts.
ºº State Legislative Districts.
ºº Voting Districts.
ºº Zone Improvement Plan Code Tabulation
Areas.
ºº Urban Areas.
These geographies are used to tabulate and
disseminate data from the decennial census, the
American Community Survey (ACS), and other
censuses and surveys, and are used outside of the
Census Bureau by other government agencies in
program administration and in determining program eligibility and fund allocation.
Geographic Partnership Programs
Prior to the 2020 Census, the Census Bureau will
conduct geographic partnership programs to
make the address list as up-to-date as possible
and ensure complete coverage of all HUs. The
geographic partnership programs also help define
statistical geographic area boundaries that will
provide meaningful data from the 2020 Census.
Following are the 2020 Census Geographic Partnership Programs (components of the Redistricting Data Program [RDP] and the Local Update of Census Addresses [LUCA] are also geographic partnership programs, but they are covered in other sections of this document):

•• Boundary and Annexation Survey (BAS): An ongoing survey for collecting and maintaining information about the inventory of the legal boundaries for, and the legal actions affecting the boundaries of, counties and equivalent governments, incorporated places, Minor Civil Divisions, Consolidated Cities, Urban Growth Areas, Census Areas of Alaska, Hawaiian Homelands, and federally recognized legal American Indian and Alaska Native areas (including the Alaska Native Regional Corporations). This information provides an accurate identification and depiction of geographic areas for the Census Bureau to use in conducting the decennial and economic censuses and ongoing surveys such as the ACS.
•• PSAP/TSAP: Programs that allow designated
participants, following Census Bureau guidelines, to review and suggest modifications
to the boundaries of block groups, census
tracts, Census County Divisions, and Census
Designated Places. Participants can also propose new Census Designated Places based
on specific criteria. The 2020 Census PSAP
includes all tribal statistical boundaries, which
were administered through the TSAP in the
2010 Census, combining the two programs.
The TSAP geographies are Oklahoma Tribal
Statistical Areas, Tribal Designated Statistical
Areas, State Designated Tribal Statistical Areas,
tribal census tracts, tribal block groups, statistical tribal subdivisions, Alaska Native Village
Statistical Areas, and for administrative purposes, one legal area, state reservations.
•• Boundary Validation Program (BVP): The
intent of the BVP is to provide the Highest
Elected Official a last opportunity to review the
entity boundary, and any address range breaks
where the boundary of their jurisdiction intersects a road, before the tabulation of census
data.
•• Public-Use Microdata Areas (PUMA): PUMAs
are statistical geographic areas defined for
the tabulation and dissemination of decennial
census and ACS Public Use Microdata Sample
(PUMS) data. PUMS data use PUMAs in publications for the decennial census, and the ACS
uses PUMAs for period estimate publications.
The delineation of PUMAs occurs in the United
States, Puerto Rico, and the Island Areas that
meet criteria guidelines. (The Commonwealth
of the Northern Mariana Islands and American
Samoa do not meet population criteria for
PUMS and therefore are not delineated in the
PUMA program.) The PUMA Program provides
participating State Data Centers an opportunity
to delineate PUMAs with input from regional,
state, local, and tribal organizations and
agencies.
Geographic Data Processing
The Geographic Data Processing component
of GEOP includes all activities that relate to the
extract, update, and maintenance of the features,
boundaries, and addresses in the MAF/TIGER
System. Geographic data captured as part of the
2020 Census, including address updates, structure
coordinate locations, boundaries, and roads data
will be processed to ensure that the MAF/TIGER
System is up to date. Following are the major geographic data processing activities that will occur in
the 2020 Census:
•• Frame Development includes the receipt and
processing of various address records from
sources such as the USPS, state and local governments, and third-party data sources. These
data help ensure accurate address coverage
within the 2020 Census Frame.
•• MAF/TIGER Extract Support includes activities related to preparing extracts or services
enabling 2020 Census systems access to
addresses from the MAF/TIGER System, as well
as activities related to the production of spatial
extracts or services for use in various field
data-collection instruments and control systems
and printing of paper.
•• Geographic Data Processing includes activities related to extracting from and updating the features, boundaries, and addresses within the MAF/TIGER System. The MAF/TIGER updates
include any changes to the features, addresses,
or boundaries that result from 2020 Census
data collection operations or geographic
partnership programs. The geographic data
processing activities establish benchmarks from
the MAF/TIGER System by taking a snapshot
of the database at various points during the
decade. Each benchmark becomes the foundation on which future updates are applied. These
benchmarks support the collection, tabulation, and dissemination of census and survey information, as well as geocoding services and geospatial data products.
•• Geographic Area Reconciliation Program (GARP) includes editing and reconciliation of boundaries within the MAF/TIGER System. This reconciliation resolves boundary and feature discrepancies, provided by separate partnership programs at different points in time or through updates, prior to the release of 2020 Census tabulation products.
•• Paper Map Creation and Plotting/Printing
includes the creation of large- and small-format
maps for use electronically and potentially for
plotting or printing.
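The benchmarking approach described above (periodic snapshots of the MAF/TIGER System that become the base for subsequent updates) follows a common data-management pattern. The toy model below illustrates that pattern only; it is not the MAF/TIGER implementation, and its class and field names are invented:

```python
# Illustration of the benchmark pattern described for MAF/TIGER: take a
# snapshot of the database at a point in time, then apply later updates on
# top of that frozen benchmark. A toy model, not the actual system.
import copy

class GeoDatabase:
    def __init__(self):
        self.addresses = {}      # address id -> record
        self.benchmarks = []     # frozen snapshots taken during the decade

    def update(self, addr_id, record):
        self.addresses[addr_id] = record

    def take_benchmark(self):
        """Freeze the current state; future updates build on this base."""
        snapshot = copy.deepcopy(self.addresses)
        self.benchmarks.append(snapshot)
        return snapshot

db = GeoDatabase()
db.update("A1", {"street": "100 Main St"})
base = db.take_benchmark()                   # benchmark before an operation
db.update("A2", {"street": "102 Main St"})   # field update applied afterward
print(len(base), len(db.addresses))          # 1 2
```

The deep copy is what makes the benchmark a stable foundation: later field updates change the live database but never the snapshot.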
Research Completed
The following research has been completed for this operation:
•• Research conducted and completed within the initial phases of the GSS-I program:
ºº Findings: Demonstrated that commercial spatial data are a valuable additional source when a local spatial file is not available or the local spatial file does not meet our feature minimum guidelines.
•• Research on use of public lands data:
ºº Findings: Demonstrated that public lands data will be useful in the delineation of 2020 Census TEAs and collection geography.
•• Post-census analysis of 2010 Census Assignment Area definitions:
ºº Findings: Helped lay the foundation for establishing a consistent assignment unit, the BCU.

Decisions Made
The following decisions have been made for this operation:

Geographic Delineations:
99 The following are the TEAs required for the 2020 Census:
ºº TEA 1 = Self-Response.
ºº TEA 2 = Update Enumerate (UE).
ºº TEA 3 = Island Areas.
ºº TEA 4 = Remote Alaska.
ºº TEA 5 = Military.
ºº TEA 6 = Update Leave (UL).
99 Urban Areas will be delineated as defined by the 2020 Census Urban Area Delineation Program.
Geographic Partnership Programs:
99 The geographic programs conducted in the 2010 Census will occur in the 2020 Census. New Construction will be a part of the 2020 Census and will be conducted similarly to the LUCA operation.
99 The GUPS will support:
ºº All geographic partnership programs (i.e.,
BAS, PSAP/TSAP, BVP, and PUMAs).
ºº RDP
ºº LUCA
ºº Count Question Resolution
ºº GARP
ºº New Construction
ºº Count Review
ºº Address and spatial updates from the paperbased UE (including Remote Alaska) and
Island Areas operations.
99 BCUs have been used for tests since the 2016
ADC Test.
99 Partnership programs will offer limited paper
materials.
99 Special Land-Use Areas and public lands will
be used in the delineation of collection geographies. The current focus is on military and
national park lands.
99 Data received from partnership programs will
be processed at the National Processing Center
(NPC).
99 The Statistical Areas programs (PSAP/TSAP)
will be used in the delineation of 2020 Census
tabulation geography.
99 The 2020 Census will include delineation of:
ºº Tabulation geography (Blocks, Block Groups,
Tracts, etc.).
ºº Zone Improvement Plan Code Tabulation
Areas.
Geographic Data Processing:
99 Enterprise solutions will be used to capture
relevant geographic data.
99 Imagery will be available as a backdrop in field
listing and field enumeration instruments.
99 The MAF/TIGER System will leverage a service-oriented architecture for dissemination products and tools.
99 The USPS DSF will continue to be used as the
primary source of address updates for the
MAF/TIGER System.
99 Frame development will include the receipt and processing of administrative records and third-party data sources.
99 Boundary reconciliation within the MAF/TIGER
System will be ongoing.
99 MAF/TIGER will interact with other systems
using service-oriented architecture.
99 MAF/TIGER is the source for all data collection
and field management applications.
99 The MAF/TIGER systems contribute to all
reengineered field operations covering both
In-Office and In-Field ADC components. We
continue to evaluate the unique deadlines of
delivery and ingestion of frame-related data for
each field operation, and timing for all internal
benchmarking to assure that processes align.
The current proposal involves completing
In-Office ADC for TEA 1 areas by mid-April 2019
to prepare the In-Field ADC workload for delivery by the end of June 2019.
99 The Census Bureau expects to update structure coordinates during the In-Field ADC, UL, GQ, Nonresponse Followup, and Enumeration at Transitory Locations Operations. The Census Bureau also expects to update structure coordinates, and potentially street features, from small-format paper maps annotated during the Remote Alaska, UE, and Island Areas Censuses Operations. The field updates will not be available in real time, but will be available within the time frame of the operations, in time for the next operation and for final tabulation.
ÏÏ Address and spatial data in the MAF/TIGER
System are updated continuously.
ÏÏ Ongoing reconciliation of boundaries across
programs, such as the BAS and the RDP, will
result in higher quality tabulation boundaries.
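The continuous-update model noted above, in which each new DSF delivery and partner file is folded into the MAF/TIGER System, amounts at its simplest to diffing two vintages of an address list. A minimal sketch (the record layout and function are invented for illustration, not part of the actual MAF/TIGER processing):

```python
# Toy illustration of vintage-to-vintage address updates: diff an older
# address list against a newer DSF-style delivery to find adds and drops
# before folding them into the frame. Invented layout; the real MAF/TIGER
# update process carries far more detail per record.

def diff_vintages(previous, current):
    """Return (added, dropped) address sets between two list vintages."""
    prev_set, curr_set = set(previous), set(current)
    return curr_set - prev_set, prev_set - curr_set

added, dropped = diff_vintages(
    previous=["101 MAIN ST", "7 OAK AVE"],
    current=["101 MAIN ST", "7 OAK AVE", "12 ELM RD"],
)
```

In practice each add or drop would also carry geocodes, source codes, and quality flags rather than a bare address string.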
Risks
The Census Bureau is submitting drafts of
the PSAP respondent guides to the Office of
Management and Budget (OMB), but the final
baselined version of respondent guides will have
to be submitted via a nonsubstantive change
(NSC) addendum that must route through the
OMB/NSC approval process. IF the final respondent guides are not approved by the PSAP
baselined schedule finish date, THEN they may
not receive OMB approval by the PSAP mailout
deadline.
Design Issues to Be Resolved

There are no remaining design issues to be resolved for this operation.

Cost and Quality

Investment in GEOP is projected to have minimal influence on the overall cost of the 2020 Census.

Impacts of this operation on overall 2020 Census quality include the following:

ÏÏ Address and spatial data in the MAF/TIGER System are validated using multiple data sources.

Milestones

Geographic Delineation Programs

April 2014: Initiate Development of Tabulation Block Criteria.
March 2016: Initiate Conducting Initial BCU Delineation.
June 2016: Initiate Conducting Initial TEA Delineation.
August 2016: Initiate Delineation of Field Offices.
September 2016: Release the Geographic Delineation Programs Detailed Operational Plan, version 1.0.
January 2017: Complete Delineation of Field Offices.
December 2017: Initiate Delineation of Field Management Areas.
December 2018: Release the Geographic Delineation Programs Detailed Operational Plan, version 2.0 (delayed).
April 2019: Update and Finalize BCUs.
July 2019: Update and Finalize 2020 TEA Delineation.
September 2020: Complete Delineation of Field Management Areas.
Geographic Partnership Programs

December 2015: Initiate Delivery and Maintenance of GUPS.
September 2016: Release the Geographic Partnership Programs Detailed Operational Plan.
October 2016: Open Geographic Partnership Support Desk.
August 2017: Complete 2017 BAS.
August 2018: Complete 2018 BAS.
May 2019: Complete PSAP Delineation.
August 2019: Complete 2019 BAS.
February 2020: Complete PSAP Verification.
August 2020: Complete 2020 BAS.
August 2022: Complete BVP.
September 2022: Complete Public Use Microdata Area.
September 2022: Complete Delivery and Maintenance of GUPS.
September 2022: Close Geographic Partnership Support Desk.

Geographic Data Processing

December 2015: Initiate Geographic Data Processing.
September 2016: Release the Geographic Data Processing Detailed Operational Plan.
June 2019: Deliver Address Canvassing In-Field Universe.
January 2020: Deliver 2020 Census Enumeration Universe (Internet Self-Response, UE).
June 2020: Initiate GARP.
August 2020: Complete 2020 Census Field Operations Updates (Addresses, Mapspots, and Features).
November 2020: Deliver Final Tabulation Geographic Products.
September 2022: Complete Geographic Data Processing.

5.4.2 Local Update of Census Addresses

Detailed Planning
Status:
In Production
DOP published in FY 2016

Purpose

The Local Update of Census Addresses (LUCA) Operation provides an opportunity for tribal, state, and local governments to review and improve the address lists and maps used to conduct the 2020 Census. Similarly, the New Construction (NC) Program utilizes the expertise of tribal, state, and local governments to submit city-style addresses for newly built housing units (HUs) in self-response areas. Following LUCA and Address Canvassing, the NC serves as a conclusive effort to complete the update of the 2020 Census Address List. LUCA and NC satisfy the Census Address List Improvement Act of 1994 (Public Law (P.L.) 103-430).

Changes Made Since Version 3.0 Operational Plan Release: The 2020 Census NC Program is conducted within the LUCA Operation.

Lessons Learned

Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:

•• Provide program materials (i.e., address lists and maps) in standard, off-the-shelf commercial software formats.
•• Simplify the process for small (6,000 or fewer HUs), lower-level governments (i.e., minor civil divisions and places).
•• Explain the definition and use of addresses and HUs better, so that participants understand why post office boxes and rural route numbers are not in scope for the Census Bureau's LUCA Operation and the NC Program.
Operational Innovations
Considering recommendations from the 2010
Census and the 2020 Census Research and
Testing Phase, and the design of a reengineered
2020 Census, operational innovations include the
following:
•• Reduce the complexity of the LUCA Program as
compared with the 2010 Census program.
•• Eliminate the full address list submission
options that were available in 2010 Census
LUCA in order to:
ºº Reduce the number of deleted LUCA records
during verification activities.
ºº Reduce the burden and cost of processing
addresses and LUCA address validation.
ºº Provide early access to the address count
list, detailing the count of every address in
each block.
ºº Provide partners with automated tools for
geocoding and reviewing their address list.
Description of Operation

The LUCA Operation provides the opportunity for tribal, state, and local governments to review and comment on the Census Bureau's address list and maps to ensure an accurate and complete enumeration of their communities. The Census Address List Improvement Act of 1994 (P.L. 103-430) authorized the Census Bureau to provide individual addresses to designated local officials of tribal, state, and local governments who agreed to conditions of confidentiality in order to review and comment on the Census Bureau's address list and maps prior to the decennial census. The basic process for LUCA includes:

•• Census Bureau provides address list and maps to the governmental entities.
•• Governmental entities review and add, delete, or change address records or features.
•• Census Bureau incorporates the updates to the Master Address File/Topologically Integrated Geographic Encoding and Referencing System (MAF/TIGER).
•• Census Bureau validates the updates through a clerical review, automated address matching, and the Address Canvassing (ADC) Operation.
•• Census Bureau provides feedback to the governmental entities.
•• Governmental entities can appeal the ADC validation outcomes.

The NC Program utilizes the expertise of tribal, state, and local governments to improve the accuracy and completeness of the address list used for the 2020 Census. NC participants submit city-style addresses for newly built HUs in self-response areas, including units under construction that will be complete on or before Census Day (April 1, 2020). NC is the conclusive effort to catch any new address not previously reported, or not yet constructed, during the ADC and LUCA operations. The basic process for NC includes:

•• Census Bureau provides address list template and eligibility maps to the governmental entities.
•• Governmental entities add new HU records.
•• Census Bureau incorporates the updates to the MAF/TIGER System.

Research Completed

The following research has been completed for this operation:

•• The LUCA Program Improvement Project completed its recommendations for the 2020 Census LUCA Operation. The research focused on improving the LUCA Operation in the following four research areas (2020 Census LUCA Operation Recommendations, 4/13/2015):
ºº Looking back at previous LUCA and related
programs.
•• Findings: Simplify the 2020 Census LUCA
Operation as the 2010 Census LUCA program was too complicated.
ºº Validating LUCA records without using a
field operation, such as ADC, as was done for
the 2010 Census.
•• Findings: It is possible to validate LUCA
addresses in an office environment.
ºº Utilizing the Geographic Support System
(GSS) for LUCA.
•• Findings: Data and tools used for GSS
activities should be used and repurposed
for the LUCA Operation.
ºº Focus Groups.
•• Findings: Focus group participants agreed
with the proposal to remove the full
address list submission options for the
2020 Census LUCA Operation.
•• As part of the 2020 Census Research and
Development efforts, staff evaluated the 2010
LUCA lessons learned and conducted a series
of focus groups with former LUCA participants.
This effort resulted in 12 major recommendations for the 2020 Census LUCA Operation.
(Note: These recommendations are described
in more detail in the 2020 Census LUCA Project
Improvement Report):
1. Continue the 2010 Census LUCA Program improvements that were successful:
ºº Continue to provide a 120-day review time for participants.
ºº Continue the 6-month advance notice about the LUCA program registration.
ºº Continue a comprehensive communication program with participants.
ºº Continue to provide a variety of LUCA media types.
ºº Continue to improve the Partnership Software application.
ºº Continue state participation in the LUCA program.
2. Eliminate the full address list submission options that were available in 2010 LUCA. This will:
ºº Reduce the number of deleted LUCA records in field verification activities.
ºº Reduce the burden and cost of processing addresses and LUCA address validation.
3. Reduce the complexity of the LUCA Operation as compared with the 2010 Census program.
4. Include census structure coordinates in the census address list and allow partners to return their structure coordinates as part of their submission:
ºº This benefits both participants and the Census Bureau in the review of materials because it enables more information about each address to be considered in the participants' review and in the Census Bureau's validation of the submitted addresses.
5. Provide ungeocoded U.S. Postal Service Delivery Sequence File addresses to state and county partners in LUCA materials.
6. Provide the address list in more standard file formats so that lists are easier to load into common software packages.
7. Include an in-house verification of LUCA-submitted addresses to align with ADC.
8. Utilize and modify existing GSS tools and data to validate LUCA submissions.
9. Encourage governments at the lowest level to work with larger governments to consolidate their submissions.
10. Eliminate the Block Count Challenge, as it previously did not yield information the Census Bureau could use to determine specifically which addresses were missing from a block.
11. Eliminate the option for participants to use an asterisk (*) for multiunits submitted without unit designations.
12. Encourage LUCA participants to identify E911 addresses used for mailing, location, or both, so that the Census Bureau has more information available during the MAF update.
Decisions Made
The following decisions have been made for this
operation:
99 Conduct a comprehensive communication program with LUCA participants.
99 Include census structure coordinates in the
census address list and allow partners to return
their structure coordinates as part of their
submission.
99 Provide ungeocoded addresses to state and county partners in LUCA materials.
ºº Provides more complete data for participants to review.
ºº May result in participants being able to geocode previously ungeocoded addresses for the census.
99 Provide the address list in more standard file formats so that lists are easier to load into common software packages.
99 Encourage governments at the lowest level to
work with larger governments to consolidate
their submissions.
ºº Should reduce the number of duplicate addresses submitted by LUCA
participants.
99 Provide a variety of LUCA media types.
99 Simplify the 2020 Census LUCA Operation and
make it compatible with the GSS and ADC.
99 Utilize administrative records and third-party
data to improve validation process.
99 Use the Geographic Update Partnership
Software (GUPS) to support automated
exchange of information for LUCA participants.
99 Validation of LUCA submissions will occur primarily by matching to existing MAF, GSS, and
administrative records. Those LUCA addresses
needing further validation will go to In-Office
Address Canvassing. There will be no In-Field
Address Canvassing validation for LUCA
submissions.
99 The Census Bureau will provide an option for
partners to access registration materials online
and return them by email. Scanned signatures
will be accepted, but not E-signatures.
99 LUCA will instruct participants to provide mailing address, location address, or both. All data
will be used to match to the Census Bureau’s
MAF.
99 The strategy for late decade GSS activities
during LUCA is to continue GSS partner file
activities through the 2020 Census and beyond.
GSS is an ongoing program.
99 There will be a separate New Construction
Program for the 2020 Census.
99 The Census Bureau and the Office of
Management and Budget (OMB) are committed to implementing an appeals process that is
substantially similar to the 2010 LUCA appeals
process. While the Census Bureau and OMB
meet regularly to define the process, by law,
OMB is ultimately responsible for implementing
the independent appeals process.
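The tiered validation decision above (match LUCA submissions first against the existing MAF, GSS data, and administrative records, and route only the remainder to In-Office Address Canvassing) can be sketched as a simple partition. The normalization and set-matching below are deliberately crude stand-ins for the Census Bureau's production address matching:

```python
# Hypothetical sketch of the tiered LUCA validation flow: match submissions
# against existing reference lists first; only unmatched records are routed
# to In-Office Address Canvassing. Real MAF matching is far more elaborate.

def normalize(addr: str) -> str:
    """Crude normalization: uppercase, drop periods, standardize suffixes."""
    tokens = addr.upper().replace(".", "").split()
    suffixes = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD"}
    return " ".join(suffixes.get(t, t) for t in tokens)

def validate_submissions(submissions, maf, gss, admin_records):
    """Partition LUCA submissions into matched records and an in-office queue."""
    reference = {normalize(a) for source in (maf, gss, admin_records) for a in source}
    matched, office_review = [], []
    for addr in submissions:
        (matched if normalize(addr) in reference else office_review).append(addr)
    return matched, office_review

matched, queue = validate_submissions(
    ["101 Main Street", "7 Oak Ave", "99 Unknown Rd"],
    maf=["101 MAIN ST"], gss=["7 OAK AVE"], admin_records=[],
)
```

The key design point carried over from the operation is that no In-Field Address Canvassing is triggered by this path; unmatched records go to an office queue, not a field listing.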
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality

Investment in LUCA is projected to have minimal influence on the 2020 Census overall costs in the following ways:

ÐÐ Removing the full address list submission options, thereby reducing the number of addresses that need to be validated.

Impacts of this operation on overall 2020 Census quality include the following:

ÏÏ Use of administrative records and third-party data to independently validate incoming addresses from tribal, federal, state, and local governments prior to adding them to the MAF.

Risks

The feedback module for GUPS is not complete yet. IF the feedback module is not completed on time, THEN this will delay LUCA's ability to provide feedback to partners using GUPS.

There is a limited window to create and QC feedback materials. IF materials are not created on time, THEN this will delay LUCA's ability to provide feedback to partners in a timely manner.

Milestones

2020 Census LUCA Operation

September 2016: Release the LUCA Detailed Operational Plan, version 1.0.
February 2017: Mail Advance Notice Package.
July 2017: Mail Invitation Package.
February 2018: Mail Participant Review Materials.
September 2018: Release the LUCA Detailed Operational Plan, version 2.0.
October 2018: Complete Initial Processing of LUCA submissions for delivery to ADC.
March 2019: Complete ADC validation of LUCA addresses.
August 2019: Deliver Feedback Materials.
March 2020: Complete the processing of LUCA Appeal addresses.
September 2021: Complete LUCA.

2020 Census NC Program

September 2018: Release the LUCA Detailed Operational Plan, version 2.0, to include the NC Program.
April 2019: Mail NC Invitation Package.
September 2019: Mail NC Participant Review Materials.
September 2020: Process NC Submissions.
5.4.3 Address Canvassing
Detailed Planning
Status:
In Production
DOP published in FY 2016
Purpose
The Address Canvassing (ADC) Operation serves
two purposes:
•• Deliver a complete and accurate address list
and spatial database for enumeration.
•• Determine the type and address characteristics
for each living quarter (LQ).
Changes Made Since Version 3.0 Operational Plan
Release: To support the 2020 Census, In-Field
ADC will begin 2 weeks early in select Area
Census Offices (ACO) in each of the six regions.
The early start will begin with Census Field
Supervisor training and will be a full start to the
operation in those select ACOs. All other ACOs
will begin activities as scheduled.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Continuously update the maps and address lists
throughout the decade, supplementing these
activities with ADC at the end of the decade.
•• Allow more time in the schedule to fully
develop and test the listing instrument.
•• Improve the ADC training to emphasize working from the ground to the Handheld Computer.
Operational Innovations
Operational Innovations include the following:
•• Conduct In-Office ADC for the entire nation.
•• Select an estimated 38 percent of LQs in the
self-response areas for In-Field ADC.
•• Use automation and data (imagery, administrative records, and third-party data) for In-Office
ADC.
•• Implement Master Address File (MAF) Coverage
Study to validate In-Office ADC procedures,
measure coverage, and improve In-Field ADC
data collection methodologies.
•• Use reengineered field management structure
and approach to managing fieldwork, including
U.S. Census Bureau
new field office structure and new staff
positions.
Description of Operation
The Census Bureau needs the address and physical location of each LQ in the United States to
conduct the census. During ADC, the Census
Bureau verifies that its master address list and
maps are accurate so the tabulation for all housing
units (HUs), group quarters (GQs), and transitory
locations (TLs) is correct. A complete and accurate address list is the cornerstone of a successful
census.
The Census Bureau has determined that while
there will be a full ADC of the nation in 2020, a full
In-Field ADC of the nation is no longer necessary.
Advancements in technology have enabled continual address and spatial updates to occur throughout the decade as part of the In-Office ADC effort.
This has made it possible to limit In-Field ADC to
only the most challenging areas. The scope of the
ADC Operation for the 2020 Census includes:
•• In-Office ADC: Process of using empirical geographic evidence (e.g., imagery, comparison of
the Census Bureau’s address list to partner-provided lists) to assess the current address list.
This process also removes geographic areas
from the In-Field ADC workload based on the
availability of administrative data sets (e.g., military lands, national forests) and the method of
enumeration planned for the 2020 Census (e.g.,
areas that will be subject to Update Leave (UL)
or Update Enumerate (UE) Operations, which
will not be part of In-Field ADC). This process
detects and identifies change from high-quality
administrative and third-party data sources to
reduce the In-Field ADC workload. This process
determines the In-Field ADC universe.
ºº In-Office ADC assesses the extent to which
the number of addresses—both HUs and
GQs—in the census address list is consistent with the number of addresses visible in
current imagery. This process is known as
Interactive Review.
ºº A follow-up process seeks to research and
update areas identified with growth, decline,
undercoverage of addresses, or overcoverage of addresses from the comparison of
the two different vintages of imagery and
counts of addresses in the MAF. This process
2020 Census Operational Plan—Version 4.0 91
is known as Active Block Resolution (ABR).
ABR was suspended in support of the 2020
Census in early 2017. All other In-Office ADC
processes are fully operational.
ºº In-Office ADC also includes three additional components that review address-level
records:
•• Ungeocoded Resolution geocodes
addresses in the Master Address File/
Topologically Integrated Geographic
Encoding and Referencing (MAF/TIGER)
System that are not currently assigned to
a specific block.
•• In-Office ADC GQ reviews and updates
GQ and TL addresses and their associated
information.
•• Local Update of Census Addresses
(LUCA) Operation Address Validation
confirms the existence of the LUCA
address submissions by tribal, federal,
state, and local governments.
•• In-Field ADC: Process of doing a dependent
listing in the field to identify where people live,
stay, or could live or stay. Field staff compare
what they see on the ground to the existing
census address list and either verify or correct
the address and location information, adding
addresses to the list as necessary. Field staff
also classify each LQ as a HU or GQ.
•• Quality Assurance: Process of reviewing the
work of field and office staff. Both In-Field ADC
and In-Office ADC work will be validated using
quality assurance techniques.
•• MAF Coverage Study: A field activity that validates In-Office procedures, measures coverage,
improves In-Field data collection methodologies, and updates the MAF on a continuous
basis.
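The Interactive Review component described above rests on a simple comparison: for each block, does the number of addresses on the census list agree with the number visible in current imagery? The sketch below illustrates that triage idea with an invented tolerance; the actual review is an interactive clerical process, not an automated threshold:

```python
# Illustrative sketch of the Interactive Review triage idea: compare the
# number of addresses on the census list with the number visible in imagery
# for each block, and flag blocks whose discrepancy exceeds a tolerance for
# follow-up (growth, decline, undercoverage, or overcoverage). The 10
# percent tolerance is invented for this example.

def triage_blocks(maf_counts, imagery_counts, tolerance=0.1):
    """Return sorted block IDs whose list/imagery counts disagree by > tolerance."""
    flagged = []
    for block, listed in maf_counts.items():
        seen = imagery_counts.get(block, 0)
        baseline = max(listed, seen, 1)  # avoid division by zero on empty blocks
        if abs(listed - seen) / baseline > tolerance:
            flagged.append(block)
    return sorted(flagged)

flags = triage_blocks(
    maf_counts={"B1": 100, "B2": 50, "B3": 80},
    imagery_counts={"B1": 102, "B2": 75, "B3": 40},
)
```

Here B1 agrees closely and passes, while B2 (apparent growth) and B3 (apparent decline or overcoverage) would be flagged for follow-up.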
Research Completed
The following research has been completed for
this operation:
•• September 2014: Released the Address
Canvassing Recommendation Report.
ºº Findings: A recommendation was made to
not walk every block and to implement the
reengineered ADC (In-Field and In-Office).
•• February 2015: Completed the 2015 Address
Validation Test, which consists of the MAF
Model Validation Test and the Partial Block
Canvassing (PBC) Test.
ºº Findings:
•• The statistical models were not effective
at identifying specific blocks with many
adds or deletes.
•• The statistical models were not effective
at predicting national totals of MAF coverage errors.
•• PBC was successfully implemented as an
alternative field data collection methodology; future work will determine how the
PBC method impacts cost and quality.
•• Imagery Review successfully identified
areas requiring updates; future research
is needed to refine the process and determine impacts on quality.
•• November 2016: Completed the ADC Test,
which included the Buncombe County, North
Carolina, and the St. Louis, Missouri, test sites.
ºº Findings:
•• The Census Bureau should continue pursuing the use of In-Office ADC methods to
reduce the workload for In-Field ADC.
•• In-Office ADC methods are generally
effective in detecting where the MAF has
remained accurate, where it is keeping
pace with changes on the ground, and
where fieldwork is needed to acquire
address updates.
•• Assumptions about situations that pose
challenges to detecting change through
imagery analysis are generally correct.
•• December 2016: Completed the 2016 MAF
Coverage Study.
ºº Findings:
•• For the census frame, the national estimate of overcoverage is 5.5 percent and
the national estimate of undercoverage is
6.6 percent.
•• The MAF Coverage Study estimated that
there were 7.4 million addresses in the
census frame that are deletes, duplicates,
or nonresidential.
•• The MAF Coverage Study estimated that
the census frame and the MAF were missing 3.3 million new addresses.
•• October 2017: Completed the ADC Operation
of the 2018 End-to-End Census Test.
ºº Findings:
•• In-Office ADC successfully identified and
created the In-Field ADC workload, which
allowed In-Field ADC to perform targeted
fieldwork.
•• Census successfully implemented and
tested an automated In-Field listing
Quality Control process.
•• A number of systems challenges and technical issues, due to a lack of coordinated
system integration and testing, contributed to listing challenges during In-Field
ADC. The Census Bureau plans to implement a more rigorous testing program in
advance of the 2020 operation to identify
systems anomalies ahead of the start of
the ADC Operation.
Decisions Made
The following decisions have been made for this
operation:
99 The ADC Operation consists of:
ºº In-Office ADC.
ºº In-Field ADC.
ºº MAF Coverage Study.
ºº Quality Assurance.
99 The current estimate is that 38 percent of the LQs in the Self-Response Type of Enumeration Area will be canvassed during In-Field ADC.
99 Production ADC began in September 2015.
99 ADC provides training for both production and quality assurance processes for in-office work.
99 ADC relies on automated training for production and quality assurance processes for in-field work.
99 ADC updates the Census Bureau's address list using a dependent canvass (from ground to list).
99 ADC validates and collects coordinates for every structure with LQs.
99 The MAF Coverage Study is planned for implementation throughout the decade. The Census Bureau completed the first MAF Coverage Study during FY 2016. Based on funding uncertainty and reprioritization of critical components of the 2020 Census, the Census Bureau completed the first half of the 2017 MAF Coverage Study but paused it on April 1, 2017.
99 In-Office ADC creates the universe for In-Field ADC.
99 In-Office ADC will review public lands.
99 Results from In-Office ADC can add and remove Basic Collection Units (BCUs) into and from the In-Field ADC universe.
99 Administrative records and third-party data sources will be used to validate addresses within each block.
99 All BCUs in the In-Field ADC universe will be identified prior to the start of In-Field ADC.
99 GQs will be identified and classified during ADC.
99 Imagery will be available on the Listing and Mapping Instrument to use during In-Field ADC.
99 Geographic areas (e.g., LQs and features) covered by enumeration operations that include a listing component (e.g., UE, UL, and Remote Alaska areas) will no longer be canvassed by In-Field ADC.
99 ADC will validate LUCA submissions.
99 Based on funding uncertainty and reprioritization of critical components of the 2020 Census, the Census Bureau will not be able to meet the 25 percent In-Field ADC goal. ABR was discontinued in the winter of 2017 in order to evaluate and redesign the operation to streamline production and improve quality control. The discontinuation of ABR will result in a larger workload being sent to In-Field ADC.
99 Statistical modeling will not be used in ADC.
99 Validation of LUCA submissions will occur during In-Office ADC.
99 The Census Bureau will canvass the whole block (or BCU) during In-Field ADC.
99 ADC will leverage the same capabilities developed for the Nonresponse Followup Operation for In-Field ADC, including automated payroll, routing to assignments, and various alerts.
99 Ungeocoded addresses will be worked via the
In-Office ADC Operation. See the 2020 Detailed
Operational Plan for the ADC Operation for
details on the process.
99 Coordinates captured for features and LQs
will be collected using available technology.
Metadata will be collected and provided for use
in improving the spatial accuracy if deemed
necessary.
99 Spatial feature data will not be captured in the
field. Field staff will identify where features are
missing and report that back to Headquarters
(HQ) for processing.
99 In-Field ADC Quality Control will be conducted
in the field, with specific BCUs selected primarily based on their characteristics.
99 The business processes that the Census Bureau
will use to handle TLs during In-Field ADC are
conceptually based on the 2010 Census. Field
staff will attempt to verify the address, name,
and contact information for the TL while canvassing. TLs will also be handled as part of the
In-Office ADC GQ Review project.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality

Investment in ADC is projected to influence (reduce or increase) the 2020 Census overall costs in the following ways:
ÐÐ Reduction in the amount of In-Field ADC and
associated infrastructure by implementing
In-Office ADC.
ÐÐ Use of additional sources of administrative
records and third-party data to validate the
frame.
In addition:
ÏÏ ADC is expected to require additional people,
process activities, data, technology, and facilities to support In-Office ADC, including the
resolution of ungeocoded addresses, review of
GQ and TL addresses, and validation of LUCA
submissions.
Impacts of this operation on overall 2020 Census
quality include the following:
ÏÏ The MAF Coverage Study will provide a continuous improvement process to:
ºº Test In-Field ADC methodologies.
ºº Verify in-office methodologies.
ºº Update the MAF with results.
ÏÏ Better detection of changes in the address list
resulting from new ADC approach.
Risks
External data sources, including Geographic
Information System viewers, will be used in the
2020 In-Office ADC Ungeocoded Resolution (UR)
project as a source to update the MAF/TIGER
database, where needed. Ungeocoded records
that are not spatially linked to a block location are
not included in the 2020 Census address frame for
enumeration. IF sufficient external data sources
are not available for use in UR, THEN the ungeocoded records within the areas lacking local data
sources may not be resolved and therefore, not
included as part of the address frame for enumeration during 2020 Census operations.
In-Office ADC is a new approach for the 2020
Census, and there are concerns that some higher
levels of government (i.e., state and federal)
believe an In-Field ADC may yield a greater “quality” canvassing than In-Office ADC and they may
be concerned about the lack of census jobs within
their jurisdiction because of a decreased In-Field
ADC. IF the Census Bureau is required to conduct
an increased In-Field ADC effort due to pressure
from higher levels of government, THEN the workload for In-Field ADC will increase dramatically.
Milestones

August 2015: Release Address Validation Test Results.
September 2015: Begin 2020 Census ADC (In-Office Interactive Review).
December 2015: Release ADC Detailed Operational Plan, version 1.0.
April 2016: Begin MAF Coverage Study (In-Field). MAF Coverage Study was paused on April 1, 2017.
April 2016: Begin ABR. ABR work was paused in February 2017.
October 2016: Begin ADC Test (In-Field).
April 2017: Begin ungeocoded resolution.
June 2017: Complete first review of all blocks by the Interactive Review project of In-Office ADC.
August 2017: Begin In-Field ADC for 2018 End-to-End Census Test.
September 2017: Begin In-Office ADC GQ Review.
December 2017: Release ADC Detailed Operational Plan, version 2.0.
May 2018: Begin LUCA Address Validation process.
March 2019: Define universe of addresses to be sent for In-Field ADC.
August 2019: Begin In-Field ADC for 2020 Census.
5.5 RESPONSE DATA

The Response Data area includes all operations associated with the collection of responses, management of the cases, and initial processing of the data. This area consists of 13 operations that are described in the following sections:

1. Forms Printing and Distribution
2. Paper Data Capture
3. Integrated Partnership and Communications
4. Internet Self-Response
5. Non-ID Processing
6. Update Enumerate
7. Group Quarters
8. Enumeration at Transitory Locations
9. Census Questionnaire Assistance
10. Nonresponse Followup
11. Response Processing
12. Federally Affiliated Count Overseas
13. Update Leave

5.5.1 Forms Printing and Distribution

Detailed Planning
Status:
In Production

Purpose

The Forms Printing and Distribution (FPD) Operation prints and distributes the following paper forms to support the 2020 Census mailing strategy and enumeration of the population:

•• Internet invitation letters.
•• Reminder cards or letters or both.
•• Questionnaire mailing packages.
•• Materials for other special operations, as required.

Other materials required to support field operations are handled in the Decennial Logistics Management (DLM) Operation.

Changes Made Since Version 3.0 Operational Plan Release: There have been no major changes to this operation.

Lessons Learned

Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:

•• Use United States Postal Service (USPS) tracing data to monitor large-scale inbound and outbound census mailings.
•• Provide a comprehensive 2020 Census forms list to be used by the contractor for printing planning.

Operational Innovations

Operational Innovations include the following:

•• Shifting from paper questionnaires to the Internet as the primary response mode to the 2020 Census, thus reducing the number of questionnaires that will be processed through the PDC Operation.
•• Using paper questionnaires for the enumeration of Internet nonrespondents and targeted areas or populations with low Internet usage.

Description of Operation

The FPD Operation is responsible for the printing and distribution of mailed Internet invitations, reminder cards or letters, and questionnaire mail packages in multiple languages as determined by the Language Services Operation.

•• The contact strategy will include printing and mailing of paper invitations and reminder cards or letters.
•• Paper questionnaires will be printed and mailed initially to a portion of the population. Nonresponding households in self-response Type of Enumeration Area 1 will also receive paper questionnaires.
•• Printing and mailing will be acquired through the Government Publishing Office.
•• The print requirements will include the capability to produce and deliver conditional mailings to nonresponding households.
•• A serialized barcode will be printed on each sheet of a questionnaire to ensure all pages for a household are properly captured.
•• The questionnaires for nonresponding households will be addressed in near real time to
minimize distribution to households who have
engaged in the digital or other nonpaper
response channels.
Research Completed
The following research has been completed for
this operation:
•• Multiple studies on the use of USPS tracing:
ºº 2010 Census Paper: Optimizing Integrated
Technologies and Multimode Response to
achieve a Dynamic Census, February 29,
2012.
ºº 2010 Census Assessment: 2010 Census
Postal Tracking Assessment, April 2, 2012.
ºº Cost assessment for the PDC check-in
operation.
•• Findings:
ºº USPS tracing data are cost-effective
and accurate.
ºº Postal tracing services are deemed reliable and could be used on a nationwide
scale.
Decisions Made
The following decisions have been made for this
operation:
99 Paper questionnaires, which will be available
in English and bilingual English/Spanish, will
be printed and mailed to some portions of
the population as part of the initial contact
strategy.
99 Printing and mailing of 2020 Census invitation
letters, reminder postcards, questionnaires, and
other self-response materials (questionnaires
96 2020 Census Operational Plan—Version 4.0
for Group Quarters (GQ) Operation, Island
Area Censuses (IAC) Operation, and other
operations) will be contracted out through the
Government Publishing Office.
99 USPS barcodes will be used for various postal
services, such as tracing and identification of
vacant or other undeliverable addresses.
99 Print contract requirements are written to
enable addressing mail packages that are preassembled before address files are available.
Successful vendor(s) must have demonstrated
ability and capacity to adhere to the Census
mailings schedule.
99 In addition to supporting self-response, which
includes Puerto Rico, the FPD Operation
will print materials for the Update Leave,
Update Enumerate, Enumeration at Transitory
Locations, GQ, Nonresponse Followup, and IAC
Operations.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in FPD is projected to have minimal
influence on the overall cost of the 2020 Census.
Impacts of this operation on the overall 2020
Census quality include the following:
ÏÏ Robust printing quality assurance measures
have a direct positive impact on the quality of
data from PDC.
Risks
The 2020 Census mailout materials will be printed
and assembled at multiple secure facilities. IF an
event (natural or otherwise) prevents or impedes
the timely printing and assembly of materials for
the 2020 Census without an alternative, THEN the
mailout of materials could be delayed.
Milestones
Date
Activity
October
2016
Receive final contact strategies from the
ISR Operation.
Receive design concepts for
questionnaires and other mailing
materials from the CFD Operation.
Define the printing and mailing workload
estimates.
U.S. Census Bureau
Date
Activity
June 2017
Release the FPD Detailed Operational
Plan, version 1.0 (delayed).
October
2018
Refine the printing and mailing workload
estimates.
January
2017–March
2019
Start print contract planning.
June 2019–
April 2020
Implement printing, addressing, and
mailing of Internet invitations, reminder
cards or letters, and paper questionnaire
packages.
Start USPS mailing planning.
5.5.2 Paper Data Capture
Detailed Planning
Status:
In Production
DOP published in FY 2017
Purpose
The Paper Data Capture (PDC) Operation captures and converts data from 2020 Census paper
questionnaires. This operation includes:
•• Mail receipt
•• Document preparation
•• Scanning
•• Optical Character Recognition (OCR)
•• Optical Mark Recognition (OMR)
•• Key from Image (KFI)
•• Data delivery
•• Checkout
•• Form destruction
Changes Made Since Version 3.0 Operational Plan
Release: There have been no major changes to
this operation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• A timely and comprehensive forms list is
required.
•• Every field on a questionnaire must have an
owner.
•• Realistic and timely contingency planning is
essential in order to properly estimate the PDC
workload.
U.S. Census Bureau
•• Postal tracing monitors inbound and outbound
mailings.
•• Barcode serialization retains the integrity of
separated booklets, i.e., single sheets, within
batches and offers an essential automated
data component to data capture and batching
processes.
Operational Innovations
Operational innovations include:
•• A reduction in PDC operational workloads and
associated infrastructure by using Internet Self
Response and automating field operations.
•• Using an in-house system, i.e., the integrated
Capture-Assisted Data Entry (iCADE), for PDC.
•• Using United States Postal Service (USPS) tracing data to identify questionnaires prior to arrival
at the processing centers. This information may
be used to reduce follow-up workloads.
Description of Operation
The PDC Operation is responsible for the capture
and conversion of data from self-response and
personal visit paper questionnaires. Papers delivered by the USPS are processed by the National
Processing Center at one of two PDC sites. For the
2020 Census, there will be a site in Jeffersonville,
Indiana, and a site in Phoenix, Arizona.
Questionnaires go through several steps described
in the Detailed Operational Plan (DOP) for PDC.
Note that questionnaire images are archived.
The paper questionnaires themselves are stored
until verification that data are received by
Headquarters (HQ) and then they are destroyed
per security regulations.
The PDC Operation is largely driven by the timing
of the questionnaire mailout, volume of forms
received, timing of the nonresponse workload
universe cut, and any priority capture requirements needed for the 2020 Census. Data are captured from the paper forms in the most efficient
manner possible, and both data and images of
the forms are maintained. The data are sent to
the Response Processing Operation area for further work. The images are sent to the Archiving
Operation.
Mail returns are identified using USPS postal
tracing to indicate that a form is en route to the
processing office. Upon receipt at the processing
2020 Census Operational Plan—Version 4.0 97
office, mail return questionnaires will be processed in First-In-First-Out order, unless otherwise
specified.
The document preparation area removes mail
returns from the envelopes and prepares them for
scanning. Booklet forms have the binding (spine)
removed.
The questionnaires are delivered to scanning to
begin the data capture process. All questionnaires
are scanned by iCADE. There is no key from paper.
Once scanned, the questionnaires are physically
moved to the checkout operation. There, questionnaires await confirmation that questionnaire
data are deemed valid responses (see Response
Processing in Section 5.5.11).
Scanned images are sent forward for further
processing using the iCADE system where OMR
and OCR are performed. Data fields with low
confidence OMR and OCR results are sent to the
KFI process. Both data and images are maintained. Data are sent to response processing and
images are archived locally. Once data have been
received at HQ, questionnaires will be checked out
to ensure the data from each questionnaire have
been captured. Once confirmation is received,
questionnaires are then eligible for destruction per
security regulations.
Research Completed
The following research has been completed for
this operation:
•• Conducted Improving Operational Efficiency
technical evaluation project:
ºº Expanding the use of iCADE system to support the 2020 Census.
•• Findings:
ºº iCADE has the capability to be the
paper capture solution for the 2020
Census.
ºº Additional testing will be conducted to
determine scalability.
•• Multiple studies on the use of USPS tracing:
ºº 2010 Census Paper: Optimizing Integrated
Technologies and Multimode Response to
achieve a Dynamic Census, February 29, 2012.
ºº 2010 Census Assessment: 2010 Census
Postal Tracking Assessment, April 2, 2012.
98 2020 Census Operational Plan—Version 4.0
ºº Cost assessment for the PDC check-in
operation.
•• Findings:
ºº USPS tracing data are cost-effective
and accurate.
ºº Postal tracing services are deemed reliable and could be used on a nationwide
scale.
•• New equipment testing:
ºº NPC investigated the use of extractors for
potential use in improving mail processing
operations.
ºº Finding:
ºº Extractors did not perform as expected.
Decisions Made
The following decisions have been made for this
operation:
99 iCADE is the planned paper capture system for
the 2020 Census.
99 Paper questionnaires will be mailed to targeted
areas or populations with low Internet usage
as part of the initial contact strategy and to
Internet nonrespondents.
99 All housing unit questionnaires are booklets
that require separation.
99 USPS tracing data will be used to identify questionnaires in the mail stream prior to arrival at
the PDC centers.
99 All questionnaires will be scanned by iCADE.
99 The 2010 Census target quality levels will be
used for OMR (99 percent), OCR (97 percent),
and KFI (99 percent).
99 There will be two PDC centers.
99 Contingency planning is underway and will continue to mature in the coming year.
99 The 2020 Census paper questionnaire will be
a booklet format with dimensions of 9 by 11
inches. The PDC workloads are identified in the
PDC DOP, and will be further refined before the
2020 Census.
99 In addition to the operations supporting self-response, the following 2020 Census operations
require data capture of paper questionnaires by
the PDC operation:
ºº Update Enumerate
U.S. Census Bureau
ºº Update Leave
ºº Group Quarters
Milestones
Date
Activity
October 2016
Develop PDC NRFU plan.
March 2017
Release the PDC Detailed Operational
Plan, version 1.0.
There are no remaining design issues to be
resolved for this operation.
December
2017
Determine which other operations
require PDC.
Cost and Quality
December
2018
Release the PDC Detailed Operational
Plan, version 2.0 (delayed).
March–
August 2020
Conduct PDC Operation.
ºº Island Areas Censuses
ºº Enumeration at Transitory Locations
Design Issues to Be Resolved
Investment in PDC is projected to influence
(reduce or increase ) the 2020 Census overall
costs in the following ways:
ÐÐ Use of an enterprise solution iCADE for PDC.
ÐÐ Provision of a low-cost response mode (other
than the Internet) to increase self-response
rates.
ÐÐ Use of postal tracing to reduce field operation
follow-up workloads for Nonresponse Followup
(NRFU).
Impacts of this operation on overall 2020 Census
quality include the following:
•• Plan to maintain the same quality level as the
2010 Census for OCR, OMR, and KFI.
Risks
The expected workload per Paper Data Capture
Center (PDCC) determines the equipment and
staff needed to meet 2020 Census paper processing Service Level Agreements (SLAs). IF the
volume of paper responses exceeds the design
capacity of 15 million forms per site, THEN capture processing SLAs would be impacted.
The expected workload per PDCC indicates the
storage requirements for paper questionnaires
until they can be destroyed. IF the volume of
paper responses exceeds the design capacity of
15 million forms per site, THEN the PDCC storage footprint may not be sufficient to meet the
increased physical storage capacity needed.
Each of the two PDCCs will need to be staffed in
a relatively short timeframe to meet 2020 Census
production requirements. IF NPC staffing requirements at each PDCC are not met, THEN 2020
Census operations will be significantly impacted.
U.S. Census Bureau
5.5.3 Integrated Partnership and
Communications
Detailed Planning
Status:
In Production
DOP published in FY 2016
Purpose
The Integrated Partnership and Communications
(IPC) Operation communicates the importance of
participating in the 2020 Census to the entire population of the 50 states, the District of Columbia,
and Puerto Rico to:
•• Engage and motivate the public to selfrespond, preferably via the Internet.
•• Raise and keep awareness high throughout the
entire 2020 Census to encourage response.
•• Support field recruitment efforts for a diverse,
qualified census workforce.
•• Effectively support dissemination of census
data to stakeholders and the public.
Changes Made Since Version 3.0 Operational Plan
Release: There will not be an online portal for the
sharing of partner materials. Instead a partnership Web site will provide partners with access
to materials for promoting the census in their
communities.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Integrate Census Bureau subject-matter experts
into all phases of the 2020 Census IPC Program.
2020 Census Operational Plan—Version 4.0 99
•• Improve coordination of communications among the Decennial, Field, and
Communications Directorates and others.
•• Align timing, funding, and design decisions
between the development of the IPC Program
Plan and the Census Bureau’s operational milestones to effectively support all phases of the
2020 Census.
•• Establish more specific program metrics for
the IPC Program to assist in evaluation and
assessment.
Based on the lessons learned from the 2015
Census Test studies and reviews, the following
recommendations were made:
•• Prioritize minimizing break-offs from the landing page of the online survey instrument.
•• Create tailored, customizable, and changeable
landing pages in the online survey instrument
for each audience that also captures the “look
and feel” of advertisements.
•• Use digital advertisements to push decennial
census response and raise awareness.
•• Use digital advertisements and communications and the Internet specifically to reach and
increase response from young, single mobiles.
•• Perform additional research and testing to
determine the appropriate balance between
advertisements for a general audience and
hard-to-survey audiences.
•• Expanded predictive modeling to determine the
propensity to respond.
•• Expanded use of social media to encourage
response.
•• Localized advertising to encourage response.
•• Promotion of a teamwork environment
among partners through the Census Solutions
Workshops.
Description of Operation
Inspiring every household in the country to
complete the census is an enormous, increasingly complex, and unparalleled challenge. With
an increasingly diverse population and a drop in
public participation, an effective communications
strategy is critical to the success of the census.
The IPC Program must reach every household in
the nation, delivering the right messages to the
right audiences at the right time. It must allocate
messages and resources efficiently, ensuring consistent messaging, as well as look and feel, across
all public-facing materials across communication
efforts as well as operations.
An IPC Program contractor has been engaged to
support the 2020 Census Program from recruitment through data dissemination. The program
will offer the following components:
•• Advertising, using print, radio, digital, television,
and out-of-home.
•• Integrate the “look and feel” of mail materials with other communications including
advertisements.
•• Earned media and public relations.
•• Perform additional research to test which
communication channels and messages most
increase awareness.
•• Social media, to include blogs and messages on
platforms such as Facebook, Twitter, Instagram,
Snapchat, etc.
•• Perform additional research to test the use
of messages targeted to specific audiences
via addressable media outlets, such as digital
advertising.
•• Statistics in Schools.
Operational Innovations
These and other potential components of the
IPC Operation will communicate the importance
of participating in the 2020 Census to the entire
population.
Operational innovations include the following:
•• Microtargeted messages and placement for
digital advertising, especially for hard-to-count
populations at a census-tract level.
•• Advertising and partnership campaign adjusted
based on respondent performance.
100 2020 Census Operational Plan—Version 4.0
•• Partnerships, including both regional and
national efforts.
•• Rapid Response.
•• Web site.
Research Completed
The following research has been completed for
this operation:
U.S. Census Bureau
•• Notify Me:
ºº Promote “Notify Me,” allowing individuals to
provide contact information to receive future
email and text message notifications when it
is time to participate in the test.
ºº Measured the effects of different mailing
contact strategies including mail that encouraged potential respondents to preregister for
reminder emails or texts and a postcard sent
to residents who had yet to submit a form.
•• Findings: “Notify Me” is not a successful
contact strategy as designed and tested
with a very low percentage of mail panel
responding.
•• The 2015 Census Test:
ºº Measured the effects of digital advertising
and communications techniques on increasing self-response rates. The test assessed
various levels and types of digital advertising (e.g., social media ads, keyword search
ads, and display ads), as well as the use of
recorded influencer phone calls on increasing
self-response.
ºº Simulated a decennial census environment
through traditional advertising (e.g., television, radio, and print ads) and included
a partnership program for outreach and
information dissemination through the entire
Designated Market Area.
•• Findings: Results from this test show
considerable promise for the use of digital and targeted digital advertising as
a primary means to increase awareness
about the 2020 Census, motivate respondents and connect them directly to the
online response instruments, and to reach
hard-to-survey populations. Finally, the
influencer phone calls were less successful
at encouraging response, and attempting
to use prominent local figures to deliver
the messages had no affect either. Overall,
partnership activities were successful.
•• Census Barriers Attitudes and Motivators Study
(CBAMS)
ºº This study combined qualitative focus
groups and a quantitative survey that
together will provide us with hypotheses
for which messages resonate (and don’t
resonate) with each audience, as well as
U.S. Census Bureau
where these audiences are located. These
hypotheses will be tested during creative
development, which will rely on pretesting to
refine messages.
ºº Quantitative Component: This survey was
conducted with a mail and Internet option
in two languages: English and Spanish. The
study over-sampled difficult Low Response
Score tracts and tracts with a high percentage of Hispanic, Black, and Asian populations. This is a similar sampling strategy
to the 2015 National Content Test where
50,000 households were sampled and we
expected a response rate of 30 percent. The
actual response rate was 36.6 percent. The
survey tested messaging frames that will be
used to shape the campaign platform rather
than the actual messages themselves. The
following topics were covered:
•• Census familiarity, importance, and like
lihood to participate.
•• Internet and addressable media use.
•• Basic demographics.
•• Other civic participation such as voting.
•• Attitudes toward:
ºº State, local, and federal government.
ºº Data confidentiality.
ºº Basic messaging frames.
ºº Qualitative Component: The focus groups
provide better reach for small and hard-tocount communities. They will provide deeper
insights that will further inform message
development and creation. The 42 CBAMS
focus groups were comprised of six to eight
participants per group.
We conducted the following English language
focus groups:
•• Focus groups with rural, economically disadvantaged individuals.
•• Focus groups with low Internet proficiency
individuals.
•• Focus groups with Black/African Americans
with a hard-to-count focus.
•• Focus groups with American Indian and Alaska
Native individuals. They will be in Alaska and in
the continental United States.
2020 Census Operational Plan—Version 4.0 101
•• Focus groups with Middle East and North
African individuals.
•• Focus groups with Native Hawaiian and Pacific
Islander individuals.
•• Focus groups with young, single, mobile individuals with mixed race/ethnicity.
And the following non-English-speaking focus
groups:
•• Focus groups with Spanish-speaking individuals
who live on the U.S. mainland.
•• Focus groups with Spanish-speaking individuals
in Puerto Rico.
•• Focus groups with Chinese-speaking
individuals.
•• Focus groups with Vietnamese-speaking
individuals.
CBAMS results will be available early in FY 2019
and will be used in the development of creative
materials in support of the 2020 Census.
Decisions Made
The following decisions have been made for this
operation:
99 The Census Bureau will use partnerships to
communicate the importance of the 2020
Census to the entire population of the 50
states, the District of Columbia, and Puerto Rico
to encourage self-response.
99 The 2020 Census will use digital advertising
and social media.
99 The 2020 Census will use a variety of modes
of communication to motivate self-response.
Research into the most appropriate methods to
reach and motivate self-response among different audiences, especially hard-to-count areas,
were conducted in late 2017 to early 2018.
The results will be incorporated into the 2020
Census Integrated Communications Campaign
Plan v2.0, which is expected to be released in
the summer of 2019.
99 The 2020 Census will use traditional advertising
methods, including the use of local advertising.
99 A partnership Web site will be developed
that will allow for downloading partnership
materials.
102 2020 Census Operational Plan—Version 4.0
99 Where available, the partnership specialists and
local partners will provide an Internet connection through the use of tablets or laptops
that will be made available in public spaces for
respondents to complete their census questionnaire online.
99 The IPC Operation encompasses an integrated
communications campaign with multiple components. The main components of the operation are advertising, earned media and public
relations, partnerships, Statistics in Schools,
social media, rapid response, and a Web site.
99 The segmentation scheme will enable the
Census Bureau to develop messaging that
will most resonate with each group, purchase
media by group in the appropriate channels,
and to monitor performance by segment during
campaign execution. The Census Bureau will
develop self-response propensity models
to determine each households’ likelihood to
respond, when, and by which mode.
99 The IPC will be communicating the possibility
of answering the 2020 Census using the Census
Questionnaire Assistance (CQA) Operation in
several areas of the communications campaign, such as paid advertising, partnerships,
social media, information on our Web site and
through interviews landed through media outreach. Most of these tactics will start to be used
during the Motivation Phase of the IPC which
will occur between March and April 2020.
However, communications efforts will start
during the Strategic Early Education Phase (for
hard-to-count populations during the whole
year in 2019) and Awareness Phase (January–
February 2020), and respondents may start
hearing about CQA during those phases.
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
Question
Expected Date
What metrics will be used
to evaluate the success of
the IPC Operation as well as
each individual component?
Microtargeted digital advertising?
Automated telephone messaging
by local influencers? Providing
donated thank you incentives to
respondents? Social media? Email?
March 2019
U.S. Census Bureau
Cost and Quality
Milestones
Investment in IPC is projected to influence (reduce
or increase ) the 2020 Census overall costs in
the following ways:
Date
Activity
August
2016
Award the IPC contract.
ÐÐ A campaign aimed at promoting self-response
may reduce census data collection costs.
September
2016
Release the IPC Detailed Operational Plan,
version 1.0.
Impacts of this operation on overall 2020 Census
quality include the following:
October
2016
Kick off the IPC contract.
ÏÏ Increase in overall self-response rates.
October
2016
Release the 2020 Census Community
Partnership and Engagement Program
Plan.
June 2017
Start the 2020 Census Partnership
Program.
July 2017
Release the 2020 IPC Plan, version 1.0.
ÏÏ Potential increase in self-response from traditional hard-to-count populations.
ÏÏ Ability to adjust advertising using real-time
metrics to focus advertising in low response
areas.
Risks
Adequate staffing is needed to implement the
National Partnership Program, but the program
was significantly understaffed through FY 2018.
IF the National Partnership Program is not adequately staffed in FY 2019, THEN it will be difficult
to secure national partners, preventing the implementation of innovative outreach approaches to
raise awareness to the 2020 Census and promote
self-response.
There is a lot of press coverage surrounding the
2020 Census questionnaire. IF the Census Bureau
is given negative media coverage about the 2020
Census, THEN national partners may choose not
to partner for the 2020 Census.
The Census Bureau has faced challenges hiring
enough partnership specialists needed to promote
Census Awareness and increase the self-response
rate. Those challenges include having adequate
human resources staff in place to process applications for potential partnership specialists and
the Census Investigative Services Branch being
unable to process background checks in a timely
manner due to a backlog of applications. IF there
are continued delays in hiring partnership specialists, THEN the Regional Census Centers will have a
difficult time fully onboarding partnership specialists for FY 2019 that are prepared to carry out
outreach efforts, motivating the public to respond
to the 2020 Census.
U.S. Census Bureau
March 2019 Release the IPC Detailed Operational Plan,
version 2.0.
August
2019
Release the 2020 IPC Plan, version 2.0
5.5.4 Internet Self-Response
Detailed Planning
Status:
In Production
DOP published in FY 2018
Purpose
The Internet Self-Response (ISR) Operation performs the following functions:
•• Maximize online response to the 2020 Census
via contact strategies and improved access for
respondents.
•• Collect response data via the Internet to reduce
paper and Nonresponse Followup (NRFU).
Changes Made Since Version 3.0 Operational Plan
Release: There have been no major changes to this
operation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Determine optimal contact strategies for eliciting responses to the 2020 Census for Internet
and other response modes.
2020 Census Operational Plan—Version 4.0 103
•• Optimize the instrument for mobile devices
to provide for better user experiences and to
improve overall response rates.
•• Determine if a bilingual initial or replacement
questionnaire in bilingual selected tracts is
beneficial.
Operational Innovations
Operational innovations include the following:
•• Internet Data Capture:
ºº Real-time edits.
ºº Ability to capture larger households than is
possible in a traditional paper-based survey.
ºº Develop and deploy an application that
can be used across most modern Internet
devices and browsers.
ºº Develop an application user interface that
is available in English and non-English languages identified by the Language (LNG)
Operation.
ºº Self-response mail contact strategy:
•• Tailored to demographic or geographic
area.
•• Designed to encourage Internet
self-response.
•• Integrated messaging with the Integrated
Partnership and Communications (IPC)
Operation.
Description of Operation
Two significant pieces of the program reside in
this operation: Internet Self-Response and Contact
Strategies.
Internet Self-Response
High Internet response is critical for cost savings
and major efforts are underway to minimize the
amount of self-response via telephone, paper
questionnaire, and in-person visits. Internet
response was not available in previous decennial
censuses and, therefore, represents a substantial
innovation for the enterprise. The Census Bureau
recognizes that the Internet response option is
not feasible or acceptable to the entire population. Therefore, alternate modes will be provided
for respondents to complete their 2020 Census
questionnaire, such as the paper methods used in
the past.
104 2020 Census Operational Plan—Version 4.0
Planning and development activities to support
ISR are centered around four organizing principles: (1) providing a responsively designed application, (2) providing the best user experience
possible, (3) utilizing the Internet to increase
data quality, and (4) ensuring that the ISR systems have the capacity to support anticipated
volumes of responses and other systems usage,
while following the most robust procedures for
ensuring data security. Each is discussed below.
The first way to maximize ISR is to design and
develop a Web application that can be used
across multiple Internet devices and browsers.
The ISR application will be responsibly designed
so that it is convenient and easy to use on most
modern Internet devices (from desktop to mobile
devices) and on most modern Web browsers.
A responsibly designed Web application makes
response more convenient, a user can respond
anywhere, at any time, provided they have a connected Internet device.
Secondly, designing and developing an ISR
application that is centered on the best possible
user experience also facilitates higher rates of
Internet response. The overall user experience
includes such factors as a person’s perception of
the system aspects (i.e., utility, ease of use, and
efficiency). The survey questions and response
options will be displayed within the ISR user interface such that they are as intuitive and straightforward as possible. The user experience will be
further enhanced by including non-English user
interfaces in the application. Additional information on the LNG program, which determines the
languages in which the ISR instrument will be
available, is described in section 5.3.4.
Thirdly, the Internet as a medium for data collection lends itself to improvements in data quality.
For example, the ISR application will include
built-in data validation checks to identify user
error as the user is inputting responses and
progressing through the survey. These checks
will include messaging to respondents indicating
missing or incomplete data, as well as messages
alerting respondents when incorrect or inconsistent information is entered. These functionalities
will help ensure high-quality data in the 2020
Census. To further improve data quality, users
will be able to contact a Census Questionnaire
Assistance (CQA) agent for assistance while
completing their questionnaire online.
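The built-in edit checks described above can be illustrated with a minimal sketch. The field names, rules, and messages below are hypothetical illustrations, not the Census Bureau's actual edit specifications:

```python
# Minimal sketch of real-time edit checks of the kind described above.
# Field names, rules, and messages are hypothetical illustrations only.

def validate_person(person: dict) -> list:
    """Return respondent-facing messages for missing or inconsistent data."""
    messages = []
    if not person.get("name"):
        messages.append("Please provide a name for this person.")
    age = person.get("age")
    birth_year = person.get("birth_year")
    if age is None:
        messages.append("Age appears to be missing.")
    elif not 0 <= age <= 130:
        messages.append("The age entered looks out of range; please check it.")
    # Consistency check: reported age should agree with reported year of birth.
    if age is not None and birth_year is not None:
        implied = 2020 - birth_year
        if abs(implied - age) > 1:
            messages.append("Age and year of birth do not appear to match.")
    return messages
```

In an application of this kind, a nonempty message list would be displayed to the respondent before the next screen, which is how missing, incomplete, or inconsistent entries are surfaced while the respondent is still in the questionnaire.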
Lastly, the ISR application and all support systems will be designed to handle the volume of
responses that are expected. It is imperative that
the ISR application and other systems are built
to service the scale of the operation in order to
ensure that users do not experience delays when
completing the survey and that the application
does not become unavailable during the self-response operation.
Also of note, in order to increase self-response
through Internet response, the ISR application
and other associated systems are being developed to adhere to the highest standards of data
security. All respondent data are encrypted
throughout the data collection process, and all
encrypted data are made inaccessible as soon
as possible. Every effort is made to ensure that
any data provided by the respondent is secure
and confidential throughout the data collection
process.
Contact Strategies
All attempts by the Census Bureau to make direct
contact with individual households by mail are
referred to as “contact strategies.” These are complementary but distinct from the community-level
outreach described under the IPC Operation.
Types of contact strategies include invitation
letters, postcards, and questionnaires mailed to
households.
Prior to the 2010 Census, research yielded distinct attitudinal segments or messaging mindsets.
A primary objective of the 2020 Census is for a
majority of respondents to complete their census
questionnaire online. Achievement of this objective is the purpose of the Census Bureau’s mail
contact strategies.
One approach, termed “Internet First,” has been
developed to encourage respondents to use the
Internet. Currently, this model includes the mailing of a letter inviting respondents to complete
the questionnaire online, two follow-up reminders
and, if necessary, a mailed paper questionnaire
followed by a final reminder. All correspondence
will contain a telephone number respondents
may use to complete the questionnaire over the
telephone.
This approach, however, may not be appropriate
for all respondent types. The “Internet Choice”
contact strategy will be utilized in areas with low
Internet connectivity or other characteristics that
make it less likely the respondents will complete
the census questionnaire online. In Internet Choice
areas, a paper questionnaire is provided on the
first contact in addition to the information about
how to respond online or by phone.
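The two mailing sequences described above can be summarized in a short sketch. The mailing labels and ordering follow the text; the exact contents and timing of the 2020 Census mailings were specified separately by the Census Bureau, so this is illustrative only:

```python
# Sketch of the two mailing panels described above. Labels are illustrative;
# actual 2020 Census mailing contents and dates were specified separately.

def mailing_sequence(panel: str) -> list:
    """Return the ordered mail contacts for a given panel."""
    if panel == "internet_first":
        return [
            "Invitation letter (respond online)",
            "Reminder 1",
            "Reminder 2",
            "Paper questionnaire (if no response yet)",
            "Final reminder",
        ]
    if panel == "internet_choice":
        return [
            "Invitation letter + paper questionnaire",
            "Reminder 1",
            "Reminder 2",
            "Second paper questionnaire (if no response yet)",
            "Final reminder",
        ]
    raise ValueError("unknown panel: " + panel)
```

The distinguishing design choice is visible in the first element: Internet Choice areas receive the paper questionnaire at first contact, while Internet First areas receive it only after earlier mailings go unanswered.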
Research Completed
The following research has been completed for
this operation:
• American Community Survey (ACS) ISR Research.
  ◦ Findings:
    • People living in areas with lower Internet usage and accessibility require paper or telephone questionnaire assistance or both.
    • Certain messaging strategies are more effective in motivating self-response.
• 2012 National Census Test tested contact strategy and Internet option.
  ◦ Findings:
    • Initial contact to invite participation, followed by two reminder prompts as needed, and subsequent mailing of a paper questionnaire was a promising strategy (Internet First).
    • An advance letter was not shown to improve response rates.
    • Telephone assistance needed for respondents without Internet access.
• 2014 Census Test tested "Notify Me" mailed invitation, contact strategies, and Internet option.
  ◦ Findings:
    • Neither email nor automated voice messages showed a significant impact on response rates.
    • Low participation rate for "Notify Me" component, but high questionnaire completion rate among those who preregistered.
• The 2015 Optimizing Self-Response (OSR) Test offered an Internet response option, including real-time NID processing, and again tested the "Notify Me" option, along with advertising and partnerships support.
  ◦ Findings:
    • Longer email content with "Dear Resident" and the signature of the Director outperformed a shorter email invitation without the greeting and signature.
    • The total response rate was 47.5 percent, and the Internet response rate was 33.4 percent.
    • Response rates did not differ by link type (i.e., the full Uniform Resource Locator or "Click here") with this population.
    • An additional 35,249 Internet responses were received from housing units (HUs) not selected in mail panels, as a result of advertising and promotional efforts.
    • The time of day the email was sent did not appear to have a big impact on the response rate.
    • "Notify Me" again had low participation.
    • A new postcard panel, designed to test how HUs not originally included in the sample would respond to an invitation after being exposed to advertising, generated a response of approximately 8 percent.
• 2015 National Content Test.
  ◦ Findings:
    • The total self-response rate was 51.9 percent, and the Internet response rate was 35.6 percent.
    • Adding a fifth mailing, a reminder sent after the paper questionnaire, significantly increased response rates.
    • Sending the first reminder sooner by a few days prompted quick responses, thus reducing the size of the third mailing.
    • In low response areas, the "choice" strategy of sending a paper questionnaire in the first mailing is effective.
    • Providing the letters in English and Spanish, rather than just English with a Spanish sentence, elicits more Spanish-language responses.
• Small-scale, opt-in email testing experimented with email messaging, including subject lines, timing of delivery, and look and feel.
  ◦ Findings:
    • A text-based email outperformed graphical emails.
    • Short email subject lines that include the "10-minute" burden and the "U.S. Census Bureau" name seem to perform better than other subject lines, especially those including the word "Help" as the first word in the subject line.
    • Respondents prefer a mailed invitation including a link to respond over all other options.
• 2016 Census Test.
  ◦ Findings:
    • The total self-response rate was 53.4 percent, and the Internet response rate was 31.4 percent at the Los Angeles County, California, test site.
    • The total self-response rate was 39.6 percent, and the Internet response rate was 27.4 percent at the Harris County, Texas, test site.
    • Continued the mail strategy deployed in the 2015 National Content Test.
    • Building on the success of providing some mail material in English and Spanish in the 2015 National Content Test, all mail materials were available in English and Spanish.
    • ISR application user interface was made available in four languages (English, Spanish, Korean, and Chinese).
• 2017 Census Test.
  ◦ Findings:
    • The total weighted self-response rate was 50.3 percent, and the Internet self-response rate was 31.7 percent.
    • Utilized a commercial off-the-shelf product for development of the ISR application.
    • ISR application was deployed in the cloud for the first time.
    • ISR application was developed using Web design best practices and guidelines developed by U.S. Digital Standards.
    • Continued the mail strategy used in the previous two tests.
• 2018 End-to-End Census Test.
  ◦ Findings:
    • The total self-response rate was 52.3 percent, and the Internet response rate was 32.6 percent.
    • Utilized the same commercial off-the-shelf product that was used in the 2017 Census Test.
    • ISR application was deployed in the cloud.
    • Refined the stratified mail strategy to include one Internet Choice cohort and three Internet First cohorts.
Decisions Made
The following decisions have been made for this
operation:
ISR:
✓ An ISR option will be provided for the 2020 Census.
✓ Invitation letters and mailed materials will encourage people to respond using a unique census identifier (ID); however, the 2020 Census will allow people to respond without a unique Census ID.
✓ The Census Bureau will offer an Internet response option in 12 non-English languages: Spanish, Chinese, Vietnamese, Korean, Russian, Arabic, Tagalog, Polish, French, Haitian Creole, Portuguese, and Japanese. The languages selected were based on national estimates of limited-English-speaking households.
✓ The Census Bureau will not provide a mobile application for ISR.

Contact Strategy:
✓ An advance letter will not be used; most HUs will receive a letter inviting online response to the census. The Census Bureau will provide a paper questionnaire (including bilingual questionnaires) for populations where Internet access and usage prompts us to offer Internet Choice (questionnaire and Internet invitation) and for whom language assistance optimizes self-response.
✓ The 2020 Census will offer alternative response options to respondents without Internet access.
✓ Messaging will be coordinated with the IPC Campaign.
✓ A formal "Notify Me" option will not be offered.
✓ Respondents will receive direct contacts inviting their participation in the census. Contacts may include some or all of the following: postcard mailings, letter mailings, questionnaire mailings, and in-person visits by an enumerator.
✓ Respondents more likely to respond online will receive the "Internet First" mailing strategy, where they will receive invitations to respond online. Those who do not respond online will receive reminders to respond, and a paper questionnaire before NRFU begins. Respondents least likely to respond online (as determined by modeling response likelihood, using ACS data in the planning database tool and Federal Communications Commission Internet connectivity data) will receive the "Internet Choice" mailing strategy. The Choice strategy consists of receiving an invitation to respond online, but with a paper questionnaire in the first mailing. Respondents will then receive reminders to respond either online or via the questionnaire they received earlier. Those who do not respond will receive another paper questionnaire before NRFU begins. Anyone who does not respond online, by CQA, or with a paper return will be sent a final reminder to respond before NRFU begins.
✓ The Census Bureau will not use United States Postal Service barcode scanning technology to optimize respondent access to the Internet.
✓ The Census Bureau looked into the benefits and risks associated with using a contact frame and will not be using it to reach respondents via email and text message.
✓ ISR will use a mailed contact approach to invite multimode self-response (responding by Internet, completing a paper questionnaire, or by calling CQA). The primary purpose of the stratified self-response contact strategy is to inform and invite the public to respond to the census, and to remind nonresponders to respond. The mail strategy that the ISR operation will deploy in the 2020 Census will mail invitations, reminders, and questionnaires over the course of approximately 6 weeks. These mailings are divided into two panels: the "Internet First" panel and the "Internet Choice" panel.
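The panel assignment decision above can be sketched as a simple scoring rule. The threshold, variable names, and equal weighting below are hypothetical; the Census Bureau's actual models drew on ACS planning-database variables and FCC Internet connectivity data:

```python
# Illustrative sketch of assigning an area to a mailing panel. The threshold
# and the equal weighting of the two inputs are hypothetical assumptions,
# not the Census Bureau's actual response-likelihood model.

def assign_panel(predicted_online_rate: float, broadband_share: float,
                 threshold: float = 0.4) -> str:
    """Return 'internet_choice' for areas judged unlikely to respond online."""
    score = 0.5 * predicted_online_rate + 0.5 * broadband_share
    return "internet_choice" if score < threshold else "internet_first"
```

For example, an area with a 20 percent predicted online response rate and 30 percent broadband availability would fall below the threshold and be assigned to the Internet Choice panel.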
Other Self-Response:
✓ Text messaging will not be used as a data collection mode.
✓ HUs from whom an Internet questionnaire is not received will be mailed a paper questionnaire.
✓ ISR will not be part of the Group Quarters (GQ) Operation enumeration. While GQ enumeration cannot prevent GQ residents from responding via the Internet, this method of data collection is not part of 2020 GQ enumeration plans.
✓ The 2020 Census printing and mailing workload as part of the OSR strategy is identified in the Life Cycle Cost Estimate.
✓ The response rate projections for all self-response modes are in the Life Cycle Cost Estimate that was released in December 2017 and will be updated for release in early 2019.

Design Issues to Be Resolved
There are no remaining design issues to be resolved for this operation.

Cost and Quality
Investment in ISR is projected to influence (reduce ↓ or increase ↑) the 2020 Census overall costs in the following ways:
↓ Reduced amount of self-response through paper questionnaire.
↓ Increased self-response, which will decrease the NRFU workload, thereby reducing field costs.
In addition:
↑ ISR is expected to increase the workload for CQA.
Impacts of this operation on overall 2020 Census quality include the following:
↑ Increase in overall self-response rates.
↑ Real-time edits to respondent data.
↑ More complete self-response for large households.
↑ Potential increase in self-response from traditionally hard-to-count populations.

Risks
Major concerns for the ISR Operation are covered by the 2020 Census risks listed in Chapter 6.

Milestones
Date          Activity
March 2016    Begin the 2016 Census Test.
March 2017    Begin the 2017 Census Test.
June 2017     Release the ISR Detailed Operational Plan.
March 2018    Begin the 2018 End-to-End Census Test.
March 2020    Begin 2020 Census ISR data collection.
July 2020     End 2020 Census ISR data collection.

5.5.5 Non-ID Processing

Detailed Planning Status: In Production (DOP published in FY 2016)

Purpose
The Non-ID Processing (NID) Operation is focused on making it easy for people to respond anytime, anywhere, to increase self-response rates. The operation accomplishes this by:
• Providing response options that do not require a unique Census Identifier (ID).
• Maximizing real-time matching of NID respondent addresses to the census living quarters (LQs) address inventory.
• Accurately assigning nonmatching addresses to census basic collection units.

Changes Made Since Version 3.0 Operational Plan Release: There have been no major changes to this operation.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
• The automated and manual NID processes should be planned and developed in parallel, rather than sequentially, as was done when preparing for the 2010 Census NID Operation.
• Involve the National Processing Center (NPC) throughout the life cycle of the 2020 Census NID Process to help prepare for the Clerical Processing component of the operation.
• The delivery of addresses from NID processing that require independent verification should occur on a flow basis during the Internet Self-Response (ISR) and Nonresponse Followup (NRFU) Operations, rather than at the end of these operations.
Operational Innovations
Operational innovations include the following:
• Public can respond to the census anytime, anywhere without a unique Census ID.
• Real-time matching and geocoding of responses.
• Use of administrative records and third-party data in an attempt to augment respondent-provided address data.
• Use of available geographic reference sources, such as aerial imagery and local government address records, to verify the existence and location of addresses provided by NID respondents that do not match to the census LQ address inventory, significantly reducing the need for fieldwork to perform this task.
• Mechanism that could increase self-response from traditionally hard-to-count populations.

Description of Operation
During the self-response phase, the NID Operation will allow respondents to complete a questionnaire without a Census ID. By collecting the address from the respondent and then matching it in real time to the census LQ address inventory, the Census Bureau will attempt to associate a Census ID with the response. The address collection interface facilitates obtaining complete and accurate data from a NID response.

Key capabilities of NID are:
• Address standardization and a feedback loop with the respondent to confirm the address data they provide.
• Automated address matching during the response.
• Automated address geocoding during the response.
• For NID cases not matched in real time, use of administrative records and third-party data in an attempt to augment respondent-provided address data, followed by an additional address matching attempt.
• Interactive matching and geocoding of respondent-provided addresses.
• Office-based address verification for nonmatching addresses.
• Manual matching and geocoding when automated NID Processing has not determined an acceptable match or geocode.

Research Completed
The following research has been completed for this operation:
• 2013 National Census Contact Test:
  ◦ Findings: The use of administrative records and third-party data was effective in enhancing NID addresses to allow for a match to the MAF/TIGER System.
• 2014 Census Test:
  ◦ Findings:
    • The address collection interface in the Internet instrument yielded a much greater proportion of higher quality address data from NID responses than in 2010.
    • Use of administrative records and third-party data matching improved the overall address matching rate.
    • There was no significant benefit to applying the administrative record matching process to all NID responses. Therefore, the use of administrative records and third-party data matching should follow an initial matching attempt using the MAF/TIGER System.
• 2015 OSR Test:
  ◦ Findings:
    • When a NID respondent address matches a record in the census address inventory, rules can be applied for accepting the geocode or subjecting it to further verification. These rules can account for the source of the geocode and whether or not coordinates were collected in the field for the address location.
    • Respondents geocoded themselves accurately only about one third of the time. However, before making a recommendation on the use of the map interface during self-response, results from 2015 testing will be compared with those from the 2016 Census Test.
    • Use of administrative records and third-party data continued to result in an increase in the match rate for NID cases compared to the census LQ address inventory during automated processing.
• 2016 Census Test:
  ◦ Findings:
    • Respondents geocoded themselves accurately only about one quarter of the time. Based on this information, results from 2015 testing, and research conducted on the collection of users' location, a decision was made to remove the map interface during NID self-response.
    • Use of administrative records and third-party data continued to increase the match rate during automated processing for NID cases compared to the census LQ address inventory.

Decisions Made
The following decisions have been made for this operation:
✓ The 2020 Census will offer a NID option for self-response and telephone agent-assisted response.
✓ All self-response will be subject to a quality assurance process, including NID responses.
✓ The 2020 Census ISR instrument and the Census Questionnaire Assistance Operation interviewer instrument will utilize capabilities and requirements for the address collection interface as specified for NID responses, as used in the 2014, 2015, and 2016 Census Tests.
✓ The NID work flow will include real-time matching and geocoding, post-real-time processing that will utilize administrative records and third-party data, and manual (interactive) matching and geocoding.
✓ NID respondents can help confirm the location of their LQ via descriptive information (i.e., cross streets) provided to the NID Operation. This method, which was used in the 2000 Census and 2010 Census, has also been tested during the 2015 and 2016 Census Tests. Clerks from NPC call NID respondents when they are unable to match or geocode the respondent-provided address using available geographic reference materials. This will enable the Census Bureau to associate the respondent's address with the correct block for tabulation purposes.
✓ Administrative records and third-party data will be used to attempt to enhance the respondent's address data if the initial attempt to match to a MAF record was not successful.
✓ Testing of Office-Based Address Verification (OBAV) during 2015 and 2016 indicated that this approach should be used for 2020. However, each test had varying results, and only 2017 had 100 percent of the OBAV cases worked, so the specific proportions of addresses verified have been different for each test.
✓ Based on data from census testing between 2015 and 2017, the 2020 Census will produce an estimated 1.05 to 2.1 million cases for Clerical NID Processing. This is based on a range of 5 to 10 percent of 2020 Census self-response being NID cases, and follows the earlier assumption of 20 percent of the NID responses requiring manual matching and geocoding and another 10 percent requiring office-based address verification.
✓ The level of impact on downstream operations will depend on the order of magnitude of NID response to the 2020 Census. It is possible that a backlog will be created during self-response that will impact the Clerical component of the NID Processing Operation, which could also lead to an impact on NRFU. However, a project-level risk is being tracked, and a contingency plan is in place that can be followed to reduce the impact on NRFU (e.g., minimize the number of cases sent out to NRFU because Clerical NID Processing has not yet matched the response to an address in the census inventory).
✓ The expected scale of the 2020 Census NID workload is estimated to be between 5 percent and 20 percent of self-response, or approximately 3.75 million to 15 million responses. The NID team has collaborated with the staff working on the optimizing self-response efforts, and the agreed-upon point estimate is 10 percent. IT infrastructure planners have been made aware of the workload estimates and are also informed by the response modeling work that projects response by day, peak hours, and so forth. They are planning accordingly.
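The automated portion of the NID flow (real-time matching, administrative-record augmentation, then referral to clerical processing) can be sketched as follows. The inventory and augmentation lookups are hypothetical stand-ins for the MAF/TIGER System and administrative-record sources, and the standardization step is deliberately simplified:

```python
# Hypothetical sketch of the automated NID matching flow. The dictionaries
# stand in for the MAF/TIGER address inventory and administrative-record
# sources; real standardization and matching are far more sophisticated.

def match_nid_response(address, inventory, admin_records):
    """Return (disposition, census_id) for a respondent-provided address."""
    key = " ".join(address.upper().split())   # simplistic standardization
    if key in inventory:                      # real-time match to inventory
        return ("matched", inventory[key])
    augmented = admin_records.get(key)        # augment address, then rematch
    if augmented and augmented in inventory:
        return ("matched_after_augmentation", inventory[augmented])
    return ("clerical_review", None)          # manual matching and geocoding
```

A case that fails both the real-time match and the post-augmentation rematch would join the clerical workload discussed above, which is why the size of that residual pool drives the Clerical NID Processing estimates.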
Design Issues to Be Resolved
There are no remaining design issues to be resolved for this operation.

Cost and Quality
Investment in NID is projected to influence (reduce ↓ or increase ↑) the 2020 Census overall costs in the following ways:
↓ Increased self-response rates.
↓ Improved coverage through self-response may decrease the NRFU workload.
Impacts of this operation on overall 2020 Census quality include the following:
↑ May increase self-response from traditionally hard-to-count populations.
↑ May increase overall self-response rates, which can contribute to higher quality for the overall census.
Risks
Significant delays in processing of the clerical workload could lead to duplication of effort in contacting households that may have already responded via NID, and to an inability to send assignments to Field Verification before the cutoff when Office-Based Address Verification could not verify the address. There are two areas of concern regarding the ability of the NID Processing Operation to keep up with the backlog of NID cases not resolved during automated processing: the amount of clerical workload and the average production rate for clerks. The clerical workload for 2020 NID Processing will be driven by several factors: the amount of marketing and promotion of the option to respond without a Census ID, the proportion of self-response that will lack Census IDs, and the number of addresses from NID responses that match to a Census Master Address File record during automated processing. The marketing and promotional efforts for the 2020 Census are still being planned, and regardless of that outcome, the impact of promotion of the NID response option is unknown.
In other words, there is no way to know whether people will opt to use their Census ID. Regarding the proportion of self-response that will lack a Census ID, while there are workload models based on self-response from census tests conducted from 2013 to 2017 that can help estimate this, there are limits to extrapolating to a national level what occurs during site tests or national tests with small samples. Similarly, while NID has data from the census tests on the typical match rate during automated processing, the test sites and sample addresses may not be representative of the match rate at a national level. In addition to concerns about overall NID workload, the production rate for clerks is also largely unknown. There has not been sufficient opportunity to test clerical processing; therefore, there are insufficient data to serve as a basis for estimating average production rates for 2020. As a result, it is difficult to estimate how long it will take to eliminate the backlog of cases from automated processing, even given a reasonable estimate of overall workload. IF the 2020 NID Processing workload exceeds the high end of the estimated workload range, THEN there may be duplication of effort in contacting households and an inability to send all assignments to Field Verification prior to the cutoff.
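The clerical workload estimates discussed in this operation follow directly from the stated percentage assumptions. As an illustrative check, assuming a self-response base of roughly 70 million responses (an inferred figure used here only to reproduce the published 1.05 to 2.1 million range, not a number stated in this section):

```python
# Worked arithmetic behind the Clerical NID Processing estimate: NID cases
# are a share of total self-response, and a portion of those need manual
# matching/geocoding (20%) or office-based address verification (10%).
# The 70 million self-response base is an inferred, illustrative assumption.

def clerical_cases(total_self_response, nid_share,
                   manual_share=0.20, obav_share=0.10):
    """NID cases needing manual matching/geocoding or office-based verification."""
    nid_cases = total_self_response * nid_share
    return nid_cases * (manual_share + obav_share)

low = clerical_cases(70_000_000, 0.05)   # ~1.05 million clerical cases
high = clerical_cases(70_000_000, 0.10)  # ~2.1 million clerical cases
```

This arithmetic makes explicit why the risk above is sensitive to the NID share: the clerical workload scales linearly with the proportion of self-responses lacking a Census ID.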
Milestones
Date             Activity
April 2015       Deliver real-time address matching and geocoding for the 2015 Optimizing Self-Response Test.
April 2016       Utilize multiple respondent validation methods for the 2016 Census Test.
September 2016   Release the NID Detailed Operational Plan, version 1.0.
April 2017       Deliver real-time processing in the cloud for the 2017 Census Test.
March 2018       Release the NID Detailed Operational Plan, version 2.0.
May–June 2018    Conduct manual matching and geocoding at the NPC for the 2018 End-to-End Census Test.
April–July 2020  Conduct the 2020 Census NID Processing.
August 2021      Complete the 2020 Census NID Assessment Report.
5.5.6 Update Enumerate

Detailed Planning Status: Underway (DOP published in FY 2019)

Purpose
The Update Enumerate (UE) Operation is designed to occur in areas where the initial visit requires enumerating while updating the address frame. The majority of the operation will occur in remote geographic areas that have unique challenges associated with accessibility. This operation includes both the UE Type of Enumeration Area (TEA) and the Remote Alaska TEA. In the UE Operation, field staff update the address and feature data and enumerate respondents in person.

The primary functions of UE include:
• Verifying and updating the address list and feature data for tabulation of the 2020 Census.
• Determining the type and address characteristics for each living quarters (LQ).
• Enumerating respondents at housing units (HUs) within the UE TEA. HUs, group quarters (GQs), and transitory locations (TLs) will be enumerated in the Remote Alaska TEA during this operation.

UE can occur in the following geographic areas:
• Remote Alaska.
• Areas that were a part of the 2010 Census Remote UE Operation, such as northern parts of Maine and southeast Alaska.
• Select American Indian areas that request to be enumerated in person during the initial visit.

Note that some areas included in the 2010 Remote UE operations might be delineated into TEA 1 or TEA 6 for the 2020 Census, based on changes in address type or mailability.

Changes Made Since Version 3.0 Operational Plan Release: There have been changes to the quality control process for the operation.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
• Develop a robust method to enumerate GQs in UE areas.
• Do not require a 100-percent certification of vacant and deleted addresses.

Operational Innovations
Operational innovations include the following:
• Combine methodologies from the 2010 UL, Remote Update/Enumerate, Remote Alaska, and UE Operations.
• Use a reengineered field management structure and approach to managing fieldwork, including a new field office structure and new staff positions.
• Reuse processes and procedures from the In-Field Address Canvassing and Nonresponse Followup (NRFU) Operations to the extent feasible.

In addition, the following operational design assumptions result in an innovative UE Operation:
• UE utilizes a reengineered field management structure.
• UE areas will not have In-Field Address Canvassing.
• UE will be able to assign a final HU status of vacant.

Description of Operation
The UE Operation is comprised of the following components: UE Production, UE Listing QC, and UE Reinterview.

UE Production
The UE enumerators visit every place where people could live or stay, comparing what they see on the ground to the existing census address list and either verifying or correcting the address and location information. Much like ADC, enumerators classify each LQ as a HU, a GQ, a TL, or as nonresidential. If the LQ is not classified as a HU in the UE TEA, it is either reassigned to the appropriate enumeration operation or removed from the list for enumeration. (In the Remote Alaska TEA, an enumerator will attempt to conduct an interview at all LQ types, including HUs, GQs, and TLs.)

For both the UE TEA and the Remote Alaska TEA, an enumerator will attempt to conduct an interview at each HU. If someone answers the door, the enumerator will provide a Confidentiality Notice and ask about the address in order to verify or update the information, as appropriate. The enumerator will then ask if there are any additional LQs in the structure or on the property and collect or update that information, as appropriate. The enumerator will then interview the respondent using a paper questionnaire. If no one is home, the enumerator will return to the nonresponding HU for two additional attempts. If there is still no response, the enumerator can contact a proxy to complete the interview.

UE Listing QC
The listing operation is conducted in pairs to ensure quality. After a block is completed, the supervisor reviews the address list for completeness. If the listing is incomplete or incorrect, then a full canvass of the block is conducted.

UE Reinterview
The UE Operation is conducted in pairs to ensure quality. As questionnaires are completed, the supervisor reviews the data for any signs of potential falsification. If falsification is identified, interviews are recollected for any affected households.

Research Completed
Research that directly supports this operation has not yet been completed.

Decisions Made
The following decisions have been made for this operation:
✓ Based on funding uncertainty and reprioritization of critical components of the 2020 Census, the Census Bureau replanned the UE Operation and will deploy a paper-based solution for UE. The Census Bureau will use paper-based solutions to enumerate GQs for Remote Alaska.
✓ Based on funding uncertainty and reprioritization of critical components of the 2020 Census, the Census Bureau replanned the UE Operation and will deploy a paper-based solution for UE. The Census Bureau will use paper-based solutions to enumerate TLs for Remote Alaska.
✓ The UE Operation will not leave a notice-of-visit form. If no one is home during a contact attempt, the Census Bureau will leave a record of visit and return to complete the enumeration at a later time.
✓ UE address and map updating will occur during daylight hours. If, during daylight hours, a respondent is home and willing to respond, the enumerator will capture that data at that time. If no one is home, the follow-up enumeration will occur using some of the same business rules established for NRFU.
99 There will be a universe of processing identifiers
(IDs) created before the UE Operation begins.
Enumerators will select a unique processing ID for each newly identified LQ.
99 The UE Operation will not attempt to contact
respondents by mail.
U.S. Census Bureau
99 The UE Operation will use the same business
rules implemented for the ADC Operation. For
example, UE will add, delete, verify, move, etc.
99 UE enumerators will conduct all follow-up enumeration in person. The UE Operation will not
make outbound phone calls.
99 The UE Operation will not leave a notice-of-visit
form at a HU for a household to self-respond.
An enumerator will conduct the enumeration at
every HU during the operation.
99 Administrative records and third-party data will
not be used in UE areas to validate units’ occupancy status.
99 UE is often conducted in remote areas with
logistical constraints. As a result, UE enumerators will travel to these areas once to conduct
all listing and enumeration activities. During
the visit, the enumerators will make multiple
attempts to each housing unit to ensure a complete enumeration.
99 The operation will have a unique Census ID for
each HU. It is preprinted on the address list and
the questionnaire. It will be the unique ID for
the HU throughout the census.
99 Remote Alaska will use the same listing and
enumeration methodologies as in the 2010
Census.
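The processing-ID decision above (a universe of IDs created before the operation begins, with enumerators selecting an unused ID for each newly identified LQ) can be sketched in code. This is an illustrative sketch only; the class, ID format, and names are hypothetical and do not come from the plan:

```python
# Illustrative sketch (not Census Bureau software): a pre-created universe of
# processing IDs, from which an enumerator takes an unused ID for each newly
# identified living quarters (LQ).

class ProcessingIDPool:
    """Pre-generated pool of processing IDs; each ID is handed out once."""

    def __init__(self, ids):
        self._available = list(ids)   # IDs created before the operation begins
        self._assigned = {}           # processing ID -> LQ description

    def assign(self, lq_description):
        """Give the next unused processing ID to a newly identified LQ."""
        if not self._available:
            raise RuntimeError("processing ID universe exhausted")
        pid = self._available.pop(0)
        self._assigned[pid] = lq_description
        return pid

    def assignments(self):
        return dict(self._assigned)

# Hypothetical ID format for illustration only.
pool = ProcessingIDPool([f"UE{n:06d}" for n in range(1, 1001)])
new_lq = pool.assign("newly identified LQ behind 12 Main St")
```

Because every ID exists before fieldwork starts and is removed from the pool when assigned, no two newly identified LQs can ever share a processing ID.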
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in UE is projected to have minimal
influence on the overall cost and quality of the
2020 Census.
Risks
It is essential that in-field production assignments
for UE be closed out on time at the end of each
assignment period so that the schedule stays on
course with minimal delays in completing the
Master Address File (MAF) update process and
all other future activities. IF there are significant delays in completing the in-field production
assignments for UE, THEN this will affect the
start date of the MAF update process, which may
contribute to substantial delays in future schedule
activities and downstream activities.
The UE Operation was descoped from the 2018
End-to-End Census Test. Subsequently on May 16,
2017, the 2020 Census Executive Steering
Committee approved a proposal to redesign the
UE Operation. As a result of that redesign, the
plan is to use methodologies similar to those used
for 2010 Remote Alaska and remote UE. IF the UE
Operation deviates from the methods used in 2010
or cannot utilize other operational methods being
tested in 2018, THEN the UE Operation will have
difficulty completing the goals and objectives for
the 2020 Census.
Milestones

Date             Activity
April 2016       Begin detailed planning of UE.
May 2017         UE redesign approved.
December 2017    Release the UE Detailed Operational Plan.
January 2020     Begin UE for the 2020 Census in Remote Alaska.
March 2020       Begin UE for the 2020 Census.
July 2020        End UE for the 2020 Census.

5.5.7 Group Quarters

Detailed Planning Status: In Production
DOP published in FY 2017

Purpose
The 2020 Census Group Quarters (GQ) Operation will:
•• Enumerate people living or staying in GQs.
•• Provide an opportunity for people experiencing homelessness and receiving service at a service-based location, such as a soup kitchen, to be counted in the census.
Changes Made Since Version 3.0 Operational Plan Release: The Advance Contact will have two components: In-Office Advance Contact and In-Field Advance Contact.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• Integrate GQ frame validation and enumeration data collection methodologies.
•• Research and test automation to collect GQ data to reduce data capture and processing time, which incorporates tracking and linkage capabilities (eliminates manual transcription of administrative records and third-party data onto paper instruments).
•• Explore ways to reduce the number of visits on military installations. (Research and test the enumeration of military personnel through the use of administrative records and third-party data.)
•• Maintain consistent answer categories regarding the question on having a UHE on all census data collection instruments, the Individual Census Report, and the Shipboard Census Report (now referred to as the Individual Census Questionnaire [ICQ] and Maritime Vessel Questionnaire).
•• Conduct outreach to professional organizations, such as education, health care, and tribal organizations, as part of the 2020 Census GQ planning.

Operational Innovations
Operational innovations include the following:
•• Use of an integrated approach including administrative records, third-party data, and Address Canvassing (ADC) (In-Field and In-Office) to improve the GQ frame.
•• Use of multiple modes of enumeration, including electronic exchange of client-level data, and automated field listing and enumeration.
•• Integration of GQ Validation and enumeration in all field operations that allow for accurate classification of living quarters (LQ).
•• Staff trained in multiple operational steps to increase efficiency in completing the operation.
•• Use of both in-office and in-field methods for enumeration.

Description of Operation
Before the enumeration at GQs can occur, the Census Bureau must validate the GQ frame. This validation activity is part of the 2020 Census ADC Operation.
The 2020 Census GQ Operation consists of these components:
•• GQ Advance Contact (known as GQ Advance Visit in the 2010 Census): For the 2020 Census, this will be both an in-office and an in-field function. In cases in which staff are unable to make contact or to resolve issues during the In-Office Advance Contact, Field Supervisors will make an in-field visit to the GQs. The functions of GQ Advance Contact include:
ºº Verifying the GQ name, address information, contact name, and phone number.
ºº Collecting an expected Census Day population count and addressing concerns related to privacy, confidentiality, and security.
ºº Collecting a preferred enumeration method, including whether the GQs have a data file that can be transmitted electronically to the Census Bureau for enumeration.
ºº Obtaining an agreed-upon date and time to conduct the enumeration.
•• GQ Enumeration: This includes enumeration of all group quarters through in-field visits or administrative records data.
The Residence Criteria and Residence Situations for the 2020 Census will determine what are considered GQs. The following types of enumeration will be included in the GQ Enumeration Operation:
•• General GQ Enumeration: Enumeration of people living in group living arrangements that are owned or managed by an entity or organization providing housing or services for the residents (e.g., college/university student housing, residential treatment centers, nursing/skilled nursing facilities, group homes, correctional facilities, workers' dormitories, and domestic violence shelters).
Planned data collection modes for GQ Enumeration include:
ºº In-Office GQ Enumeration mode:
•• Electronic Response Data Transfer (eResponse) enumeration: The eResponse involves the electronic transfer of client-level data from systems maintained by GQ administrators to a standardized Census Bureau system that accepts electronically transmitted data in a standardized template.
ºº In-Field GQ Enumeration modes include:
•• In-Person interview using a paper ICQ.
•• Facility Self-Enumeration—This method
will be offered only to medical facilities
and correctional facilities. A GQ administrator or point of contact is sworn in and
trained to collect the response data from
the GQ residents/clients using paper ICQs.
•• Drop Off/Pick up paper questionnaires.
•• Paper listings—Field staff pick up a
paper listing from the GQ administrator.
Paper listings are keyed at the National
Processing Center (NPC).
•• Service-Based Enumeration: Enumeration of
people experiencing homelessness or utilizing
transitional shelters, soup kitchens, regularly
scheduled mobile food vans, and targeted nonsheltered outdoor locations.
ºº The planned modes of data collection for
Service-Based Enumeration are:
•• In-person interview using paper ICQs.
•• Pick up paper roster listing to be used as
a supplemental tool to ensure data collection of the entire facility on Census Day—
transitional shelters only.
•• Military GQ Enumeration: Enumeration of
people living in a GQ or housing unit (HU)
on military installations, defined as a fenced,
secured area used for military purposes and the
enumeration of people residing on U.S. military
ships at the time of the 2020 Census. A military
vessel is defined as a United States Navy or
United States Coast Guard vessel assigned to a
homeport in the United States.
ºº The mode of enumeration for military data
collection is similar to general GQ data collection, with the exception of the deployed
civilian and military population.
•• Maritime Vessel (Shipboard) Enumeration:
Enumeration of people living on U.S. maritime vessels in operation at the time of the
2020 Census. A maritime vessel is defined as
a U.S. Flag vessel that is a commercial vessel
registered and operated under the laws of the
United States, owned and operated by U.S.
citizens, and used in the commercial trade of
the United States.
ºº Data collection will be managed by staff at
the NPC using 2010 Census procedures.
Research Completed
•• Issued Federal Register Notice on May 20, 2015,
requesting public comment on the 2020 Census
Residence Rule and Residence Situations.
Published the final 2020 Census Residence Rule
and Residence Situations in late 2017.
•• Ongoing partnership with the Department of
Defense’s Defense Manpower Data Center to
discuss 2020 Census goals and objectives for
enumerating personnel living on stateside military installations.
ºº Findings:
•• Census Bureau received a sample of
administrative records from one military
installation.
•• Defense Manpower Data Center identified
military installations for administrative
record testing.
•• Conducted a small-scale data collection test at
several service-based locations (soup kitchens,
regularly scheduled mobile food van stops, and
transitional shelters).
ºº Findings:
•• An automated data collection device successfully replicated the content of the GQ
paper questionnaire.
•• The challenges associated with using an automated instrument to enumerate persons at service-based locations (soup kitchens, regularly scheduled mobile food vans, transitional shelters) are minimal and comparable to those of using a paper data collection instrument. However, the Census Bureau has decided to use paper data collection instruments for all In-Field GQ operations.
Decisions Made
The following decisions have been made for this
operation:
99 The GQ frame development and validation will
be integrated with the ADC Operation.
99 The GQ Operation will allow GQ administrators
to self-identify the GQ type for the facility.
99 An electronic data exchange of GQs and client-level response data records will be part of
the GQ methodology.
99 The Census Bureau will design a standardized
system that will accept electronically transmitted response data files in a standardized
template.
99 During field enumeration operations, newly
identified GQs will be validated and enumerated using a combination of methodologies.
99 Current goals for various types of GQs include
the following:
ºº Enumerate 25 to 30 percent of people residing in GQs through in-office methodologies
(i.e., electronic transfer of response data
files) and the remainder in the field using
paper.
ºº Enumerate military GQs using general GQ
data collection modes.
99 Administrative records will be pursued from
Federal State Cooperative Population Estimate
and Internet research for frame building but not
for enumeration.
99 A standardized template will be used to receive
data from GQ administrators. This will be tested
in the 2017 independent eResponse data transfer test.
99 The current assumption is that approximately
70 to 75 percent of GQ enumeration will be
in-field.
99 Cases for quality control interviews are sampled
by the Sampling, Matching, Reviewing, and
Coding System according to requirements provided by subject-matter experts.
99 There are multiple ways administrative records
will be processed in the GQ Operation. The
Census Bureau plans to enumerate personnel
residing in GQs and HUs on the military installations using administrative records received from
the Defense Manpower Data Center (DMDC) of the Department of Defense. For
all GQs, the response data will be processed
based on the format of the response data. If
the response is via eResponse, the data will be
received in Centurion and sent to the CDL for
processing. If it is a paper response, then the
data will be captured in the NPC Data Capture
Tracking System, tracked through NPC’s
Automated Tracking System, and then sent to
the CDL.
99 The GQ enumerators will be dedicated to the
GQ Operation during the GQ data collection
period and will not conduct multiple operations.
99 The Census Bureau established a method using
the field operational control system to link
self-response data to the correct GQs. This
method was tested in the 2018 End-to-End
Census Test and it successfully linked self-response data to the parent GQ facility. The
Census Bureau will use the same method for
the 2020 Census.
99 The Census Bureau will use a Browse Living Quarters capability to determine whether a GQ facility is already in the GQ universe for data collection. If a GQ facility is found during one of the multiple HU enumeration operations, the enumerator will be able to determine whether the GQ is already captured in the GQ Operation. If it was not captured in the GQ Operation, the enumerator will have the opportunity to send the case to the GQ Operation or to enumerate it.
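As a rough illustration of the standardized-template decision above, the following sketch validates a client-level eResponse file against a required-field list before accepting it. The field names and the CSV format are hypothetical assumptions for illustration, not the Bureau's actual template:

```python
import csv
import io

# Hypothetical required fields for illustration only; the real 2020 Census
# eResponse template is defined by the Census Bureau.
REQUIRED_FIELDS = ["gq_id", "last_name", "first_name", "date_of_birth", "sex"]

def validate_eresponse(file_obj):
    """Return (ok_rows, errors) for a client-level response file."""
    reader = csv.DictReader(file_obj)
    missing = [f for f in REQUIRED_FIELDS if f not in (reader.fieldnames or [])]
    if missing:
        return [], ["missing columns: " + ", ".join(missing)]
    ok, errors = [], []
    for i, row in enumerate(reader, start=2):  # row 1 is the header line
        blank = [f for f in REQUIRED_FIELDS if not row.get(f, "").strip()]
        if blank:
            errors.append(f"row {i}: blank fields: " + ", ".join(blank))
        else:
            ok.append(row)
    return ok, errors

# Hypothetical sample file: one complete record, one with a blank field.
sample = io.StringIO(
    "gq_id,last_name,first_name,date_of_birth,sex\n"
    "123456,Doe,Jane,1990-01-15,F\n"
    "123456,Roe,,1985-07-02,M\n"
)
rows, errs = validate_eresponse(sample)
```

Validating against a fixed template at intake is what lets one standardized system accept files from many different GQ administrator systems.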
Design Issues to Be Resolved
There are no remaining design issues to be
resolved.
Cost and Quality
Investment in GQs is projected to have minimal
influence on the overall cost of the 2020 Census.
Impacts of this operation on overall 2020 Census
quality include the following:
ÏÏ Electronic transfer of administrative records
and response data reduces transcription errors.
ÏÏ Administrative records and response data may
provide more comprehensive demographic
information.
ÐÐ Administrative records and response data may
provide less current data than data received
through in-field enumeration.
Risks
Each person enumerated at a GQ must be linked to a GQ address as part of the GQ enumeration process. IF the linkage to electronically associate each ICQ with its GQ identification number does not work, THEN this negatively impacts the ability to tabulate GQ residents in their correct geography.
The Memorandum of Agreement (a data sharing agreement) between the Census Bureau, Commander, Navy Installations, and the DMDC
will allow the Census Bureau to receive a file twice
yearly until 2020 that includes data elements as of
April 1 and October 1. These files will be sent to
the Census Bureau as soon as possible after the
files are available and can be compiled by DMDC.
The Census Bureau plans to use this administrative data as the enumeration methodology for the
military. The Military Enumeration Operation needs
to process and test data in the files to ensure it
meets quality standards for enumeration. IF the
files are not received from the DMDC in the timeframe expected, THEN the Census Bureau may not
be able to fully process and test the administrative
records military enumeration methodology and
provide detailed specifications for carrying out
this enumeration methodology in a timely manner
for the 2020 Census.
Milestones

Date                   Activity
June 2015              Conduct Electronic Transfer Capability Survey—Puerto Rico.
December 2015          Conduct Electronic Transfer Capability Survey—Stateside.
May 2016               Conduct 2016 Service-Based Enumeration Census.
December 2016          Conduct Electronic Response Independent Census Test.
July 2017              Conduct 2017 Electronic Response Data Independent Census Test.
September 2017         Release the GQ Detailed Operational Plan.
June 2018–August 2018  Conduct the 2018 End-to-End Census Test.
February 2020          Conduct GQ Advance Contact.
April 2020             Conduct GQ Enumeration.

5.5.8 Enumeration at Transitory Locations

Detailed Planning Status: Underway
DOP published in FY 2018

Purpose
The 2020 Census Enumeration at Transitory Locations (ETL) Operation enumerates individuals in occupied units at Transitory Locations (TLs) who do not have a Usual Home Elsewhere (UHE). TLs include Recreational Vehicle (RV) parks, campgrounds, racetracks, circuses, carnivals, marinas, hotels, and motels.
The goal of the ETL Operation is the enumeration of individuals in occupied units at TLs who do not have a UHE.
Changes Made Since Version 3.0 Operational Plan Release: There have been no major changes to this operation.

Lessons Learned
Based on lessons learned from the 2010 Census, the following recommendations were made:
•• Automate the questionnaire and all related sources of paradata used to record contact details during an interview.
•• Learn more about the living situations of people counted in the ETL Operation.
•• Clearly define and identify TLs, as well as procedures on how to list transitory units appropriately in operations that feed the ETL universe.
•• Conduct intercensal testing of the TL population.

Operational Innovations
Operational innovations include the following:
•• Use of reengineered field management structure, staff positions, and approach to managing fieldwork.
•• Use of automation and technology for data collection.

Description of Operation
The ETL Operation will:
•• Use automation, where possible, to facilitate data collection and streamline operations.
•• Use reengineered staffing and management of the field operation.
•• Use in-person enumeration as the primary mode of data collection.
•• Have Quality Assurance infused throughout workload management and data collection.

Research Completed
The 2020 Census ETL Operation will implement a design and methodologies similar to those used in the 2010 Census. While enhancements will be pursued, the planning and design of the 2020 Census ETL Operation is about operational implementation rather than research into new or different methodologies. Automated solutions were pursued this decade, but this operation will remain a paper-based operation for the 2020 Census.

Decisions Made
The following decisions have been made for this operation:
99 The goal and objective of the ETL field operation is to enumerate individuals at occupied units at TLs who do not have a UHE. The ETL Operation is designed to enumerate eligible populations that inhabit TLs, such as RV parks, campgrounds, hotels, motels, marinas, racetracks, circuses, and carnivals.
99 The Census Bureau will follow an approach similar to the approaches used in other operations, which will involve a comprehensive approach to quality. All cases will be subject to edits and checks within the Operational Control System and, as needed, the chance of being selected for a "reinterview" involving telephone contact in the Area Census Offices.
99 The success of the 2020 Census ETL Operation will be the ability to identify the TL location and enumerate the units at the TL on the day of the enumerator's visit.
99 The Census Bureau plans to conduct ETL advance contact from February 24, 2020, to March 21, 2020. The ETL data collection operation will be conducted from April 9, 2020, to May 4, 2020. Respondents who do not have a UHE are counted where they are enumerated at the time of enumeration, not where they are on April 1, 2020. ETL reinterview will be conducted from April 13, 2020, to May 6, 2020. These dates could change slightly as we get closer to 2020.
99 The Census Bureau is considering leaving a notice-of-visit form at housing units that are not available at the time of field enumeration. The Census Bureau is evaluating whether the enumerator will leave a questionnaire and then follow up, or direct the household to respond through the Internet.
99 The Census Bureau is not using administrative records or third-party data sources for ETL frame development; however, the ETL frame development is a multiple-phase approach of updating and verifying TL addresses. The Census Bureau plans to use several methods to update the frame: (1) Address Canvassing In-Office Group Quarters review; (2) Address Canvassing In-Field; (3) ETL carnival research; (4) Census Field Manager/Enumerator updates; and (5) the Local Update of Census Addresses Operation and New Construction Program.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in ETL is projected to have minimal
influence on the overall cost and quality of the
2020 Census.
Risks
One of the lessons learned from the 2010 Census
ETL Operation is the importance of field testing.
IF field testing of the ETL Operation is not conducted before the 2020 Census, THEN the operation may encounter unforeseen operational issues,
potentially increasing cost and reducing data
quality.
A complete and accurate address frame is required
to implement an efficient ETL Operation. The ETL
frame development and validation will be integrated with the Address Canvassing Operation
along with efforts from ongoing geographic
update operations and other 2020 Census operations. IF the address frame does not contain all the
instances of the types of living quarters (LQs) covered by the ETL Operation, THEN some LQs may
not get enumerated by the ETL Operation and the
people living at those TLs may not get included in
the final 2020 Census population count.
Milestones
Date            Activity
October 2015    Initiated the 2020 Census ETL Integrated Project Team.
September 2018  Release the ETL Detailed Operational Plan.
April 2020      Begin 2020 Census ETL enumeration.
May 2020        Conclude 2020 Census ETL enumeration.
April 2021      Issue 2020 Census ETL operational assessment.
5.5.9 Census Questionnaire Assistance
Detailed Planning
Status:
Underway
DOP published in FY 2016
Purpose
The Census Questionnaire Assistance (CQA)
Operation has three primary functions:
•• Provide questionnaire assistance for respondents by answering questions about specific
items on the census form or other frequently
asked questions about the census:
ºº Tier 1: Provide telephone assistance via
an automated menu (Interactive Voice
Response, or IVR).
ºº Tier 2: Provide real-time assistance by CQA
agents via the telephone.
•• Provide an option for respondents to complete
a census interview over the telephone.
•• Provide outbound calling in support of
Coverage Improvement.
Changes Made Since Version 3.0 Operational Plan Release:
•• Web chat, as a formal communication channel, was eliminated based on logistics and feasibility.
•• Support for callers requesting information about 2020 Census jobs was moved out of the CQA Operation and into the Field Infrastructure Operation.
•• Outbound telephone calling to collect Nonresponse Followup (NRFU) Reinterview data for NRFU quality control was eliminated. Findings in the 2018 End-to-End Census Test led to the decision.

Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• CQA Operation requires very specialized contact center personnel throughout the development and operational cycles.
•• CQA Operation needs to be synchronized with the Integrated Partnership and Communications Program.
•• Agent desktop applications need to have the ability to easily update Frequently Asked Questions content so that all relevant information is in one place.

Operational Innovations
Operational innovations include the following:
•• Speech and text analytics to determine what is trending in real time across CQA.

Description of Operation
The main objective of CQA is to assist Internet and paper self-respondents by answering questions coming from telephone calls. CQA will provide support through the following:
•• A toll-free telephone number for respondents to call for help completing the 2020 Census questionnaire.
•• IVR to resolve basic questions from respondents calling on the telephone to minimize the number of agents required.
•• Real-time assistance to callers (inbound) in completing the 2020 Census questionnaire (with and without a unique Census identification number).
•• Outbound telephone calls made by agents to respondents for quality follow-up (coverage improvement).
Scope of 2020 Census CQA includes:
•• Managing the day-to-day CQA Operations and analyzing data trends to improve the customer experience and ensure the most efficient and accurate assistance is provided.
•• Opening multichannel contact centers with a central command functionality:
ºº Voice channel (telephone via IVR).
ºº Voice channel (telephone via agents).
•• Staffing of contact center.
•• Training of contact center staff.
•• Assistance in multiple languages.
•• Assistance for individuals with special needs (visual- or hearing-impaired).
•• Assistance for individuals receiving experimental forms.
•• Assistance for individuals in Puerto Rico.
•• Integration with the Internet questionnaire development team to deliver assistance.
•• Determination of expected call volumes (inbound and outbound), including timing of peak volumes and a rollover plan for unanticipated volumes.

Research Completed
The following research has been completed for this operation:
•• Market Research:
ºº Conducted vendor meetings to benchmark the contact center industry and identify best practices.
ºº Released a Request for Information to identify industry capabilities and a Request for Proposal (RFP) to evaluate and hire expertise suitable to assist the Census Bureau with developing adequate systems and call center operations.
•• Findings: Most large contact center providers have the capacity to provide all services identified in the Request for Information. Small businesses do not have the facilities, staff, or experience to meet the full range of services and size required by CQA. However, the Census Bureau did specify small business goals within the RFP and allow the contact center service providers and system integrators to determine how to best meet the small business goals.
•• Call Workload Modeling:
ºº Looked at call data from the 2014 Census Test, the 2015 Census Test, the 2015 Optimizing Self Response Test, the 2015 National Content Test, the 2016 Census Test, the 2017 Census Test, and the 2018 End-to-End Census Test to assist in forecasting workload for the 2020 Census.
•• Findings:
ºº The mailing strategy of pushing respondents to answer the Census on the Internet has created an increase in assistance calls, specifically related to lack of Internet access and technical issues.
ºº Experienced call volume increases at the beginning of the NRFU Operation due to Census enumerators beginning to visit nonrespondents.
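Call workload forecasts like those above feed contact-center staffing decisions. As a generic illustration of how such forecasts are typically turned into agent counts (this is the standard Erlang C formula from queueing theory, not the Bureau's actual model, and all numbers are hypothetical):

```python
import math

# Standard Erlang C staffing arithmetic, shown only as a generic illustration
# of call-workload modeling; the figures below are hypothetical.

def erlang_c(agents, offered_erlangs):
    """Probability that an arriving call must wait (Erlang C formula)."""
    a, n = offered_erlangs, agents
    if a >= n:
        return 1.0  # unstable queue: every call waits
    top = a ** n / math.factorial(n) * n / (n - a)
    bottom = sum(a ** k / math.factorial(k) for k in range(n)) + top
    return top / bottom

def service_level(agents, calls_per_hour, aht_seconds, target_seconds):
    """Fraction of calls answered within target_seconds."""
    a = calls_per_hour * aht_seconds / 3600.0  # offered load in erlangs
    pw = erlang_c(agents, a)
    return 1.0 - pw * math.exp(-(agents - a) * target_seconds / aht_seconds)

def agents_needed(calls_per_hour, aht_seconds, target_seconds, goal=0.80):
    """Smallest agent count meeting the service-level goal."""
    n = 1
    while service_level(n, calls_per_hour, aht_seconds, target_seconds) < goal:
        n += 1
    return n

# Hypothetical peak hour: 1,200 calls/hour, 300-second average handle time,
# with a goal of answering 80 percent of calls within 30 seconds.
staff = agents_needed(1200, 300, 30)
```

The point of the exercise is the sensitivity: because the offered load here is 100 erlangs, small errors in the call-volume forecast translate directly into dozens of agents, which is why the plan emphasizes peak timing and a rollover plan for unanticipated volumes.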
Decisions Made
The following decisions have been made for this
operation:
99 CQA will use an acquisition with the RFP
release date of November 2015.
99 CQA will complete interviews by telephone.
99 CQA will provide respondent assistance relating
to specific items on the questionnaire.
99 CQA will handle calls relating to general questions on 2020 Census processes and frequently
asked questions.
99 CQA telephone number will be provided in
selected materials.
99 The contractor will be required to provide an
adaptive infrastructure (e.g., staffing levels
and communications capabilities) that can be
adjusted on demand as data collection occurs.
99 The contract will include options to provide
flexibility to support future operations or capabilities that have not yet been fully defined.
99 The RFP will require the vendor to develop the
application that the agents use to respond to
calls, including the data collection instrument to
complete the questionnaire.
99 CQA agents will be available to provide assistance and complete 2020 Census questionnaires for all specified languages.
99 CQA will assist individuals with special needs
(visual- or hearing-impaired).
99 CQA will handle calls from respondents experiencing technical issues (e.g., Internet problems,
lack of access to Internet) and will have the
capability to complete a 2020 Census questionnaire, if requested.
99 The contract includes service-level agreements
and quality standards.
99 The 2020 Census language requirements are in
the CQA contract.
99 CQA will support the Coverage Improvement
Operation by providing a centralized quality
outbound calling operation.
99 The 2020 Census Self Response Contact
Strategy model was included in the CQA
contract.
99 CQA Operations will go live with the IVR
only on February 1, 2020. For inbound calling
operations using live CQA agents, operational
dates are aligned with the Internet Self-Response Operation and are documented in
the 2020 Census Integrated Master Schedule.
For outbound calling in support of Coverage
Improvement, operational dates are aligned
with the Coverage Improvement period documented in the 2020 Census Integrated Master
Schedule.
99 CQA will not mail paper questionnaires to people who call to request them, but will refer people to materials on the 2020 Census Web site or collect the interview over the phone. As part of the overall contact strategy, all nonresponding housing units in self-response areas will receive a paper questionnaire in the fourth mailing.
99 CQA will not utilize and integrate a nonvoice
communication channel, such as Web chat, to
support in-bound census questions.
99 CQA will not collect 2020 Census questionnaire
information via texting or Web chat communication channels.
99 CQA will not accept emails, faxes, or Internet uploads of completed 2020 Census questionnaires. Respondents will be directed to mail their responses.
99 CQA will not support centralized outbound
calling for NRFU production cases.
99 CQA will not include the ability to offer respondents an option to check on the status of the
questionnaire they submitted.
99 CQA will not offer a Web chat functionality to
provide assistance to respondents while completing their questionnaire online.
99 CQA will not take calls directly from census
field enumerators encountering language
barriers. Instead, during NRFU, enumerators
will show a language identification card to the
respondents to identify the language(s) they
speak. If respondents identify a language supported by CQA, census field enumerators will
leave a paper Language Assistance Sheet with
the respondents, which contains instructions for
reaching the supported CQA language lines.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in CQA is projected to influence
(reduce or increase ) the 2020 Census overall
costs in the following ways:
ÐÐ Increased self-response rates resulting from
increased promotion of telephone response
decreases the NRFU workload, eliminating
costly in-person enumerations by census
enumerators.
ÐÐ Reduced quantities of paper questionnaires because of increased self-response by
telephone.
In addition:
ÏÏ Internet Self-Response is expected to increase
the workload for CQA.
Impacts of this operation on overall 2020 Census
quality include the following:
ÏÏ Real-time edits to respondent data collected
over the phone.
ÏÏ More complete self-response for large
households.
ÏÏ Potential increase in self-response from traditionally hard-to-count populations.
ÏÏ Increase in overall self-response rates.
These four items are also included in the ISR section (5.5.4).
Risks
Multiple Census Bureau and other stakeholders
will be involved throughout the CQA Operation.
Emerging requirements, lessons learned, and
changing conditions have the potential to alter
the requirements of the operation in order for
the larger 2020 Census Program to be successful.
IF CQA stakeholders request new requirements,
THEN the CQA Operation may need to accept
higher costs, greater risks to quality, or both.
Due to the rapidly tightening labor market, it is
becoming more difficult to recruit for the non-English,
non-Spanish customer service representative and
Quality Assurance Monitor positions at the call
centers. Competitors at other contact centers and
retail markets have increased hourly wages and
are offering $1,000 signing bonuses to find similarly skilled entry-level resources. IF CQA is unable
to recruit and hire enough staff for the non-English, non-Spanish positions, THEN calls on some
language lines may not be answered in a timely
fashion for the 2020 Census.
The CQA Operation spans multiple fiscal years and
is incrementally funded. Adequate funding for the
CQA Operation is not guaranteed and fluctuates
each fiscal year. IF adequate incremental funding
is not provided on a timely basis to support the
CQA Operation, THEN contractor work may be
suspended during critical scale-up activities.
Milestones
For acquisition purposes, the major milestone
dates are:
January 2016: Release RFP for 2020 CQA acquisition.
July 2016: Award contract for 2020 CQA.
September 2016: Release the CQA Detailed Operational Plan, version 1.0.
U.S. Census Bureau
April 2017: Participate in the 2017 Census Test.
December 2017: Release CQA Detailed Operational Plan, version 2.0.
March 2018: Participate in the 2018 End-to-End Census Test.
February 2020: Go live with the CQA IVR system.
March–July 2020: Conduct CQA operations with live customer service representatives.
September 2020–June 2021: Conduct CQA Post Production Analysis and Close-out.
5.5.10 Nonresponse Followup
Detailed Planning Status: In Production; DOP Published in FY 2018
Purpose
The Nonresponse Followup (NRFU) Operation
serves two primary purposes:
•• Determines or resolves housing unit (HU) status
for addresses included in the NRFU workload.
•• Enumerates HUs that are determined to have a
HU status of occupied.
The NRFU workload includes nonresponding
addresses in both the self-response Type of
Enumeration Areas (TEA) and the Update Leave
(UL) TEA.
In addition, NRFU data collection will:
•• Verify the existence and location of addresses
received through the 2020 Census Non-ID
Processing (NID) Operation that could not
be reconciled through automated or clerical
methods. These are known as Field Verification
cases.
•• Resolve potential erroneous enumerations and
omissions from the initial self-response and
field enumeration data collection. These are
known as Coverage Improvement cases.
•• Resolve the status of some categories of
self-responses, such as self-reported vacants,
paper returns that were returned blank, etc.,
and enumerate, as appropriate.
•• Re-collect the census responses in select
instances to ensure the accuracy of
self-response census returns. These are known
as Self-Response Quality Assurance (SRQA)
cases.
Documentation of the various types of cases
included in the NRFU workload is found below.
Changes Made Since Version 3.0 Operational
Plan Release: The operational scope of NRFU was
modified to remove phone contact attempts for
reinterview cases. All reinterview cases will be
completed by field staff.
Addresses determined to be vacant or nonexistent from the administrative records modeling
will receive at least one visit during NRFU. If the
one NRFU visit indicates that the HU is occupied,
that address will remain in the NRFU workload for
subsequent visits, as necessary. If the NRFU visit
indicates that the HU is indeed vacant or nonexistent, the address will be removed from the NRFU
workload and will not receive any additional visits.
The contact strategy for NRFU will include three
phases: Phase 1, Phase 2, and Closeout. Phase 1
will use fully automated routing and assignment
of cases with the emphasis on making contact
with all NRFU households early in the operation.
Phase 2 will restrict assignments to high-performing interviewers with the emphasis on successfully completing interviews with all households.
The final phase, Closeout, will continue to restrict
assignments to high-performing interviewers, but
the focus will change to getting minimal information for all households to achieve an orderly
completion of the operation.
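The phase gating described above can be sketched as a simple eligibility rule. This is an illustrative sketch only, not Census Bureau production logic; the function name and the boolean "high performer" flag are hypothetical simplifications of how later phases restrict assignments.

```python
# Hypothetical sketch of the three-phase NRFU assignment gating.
# Phase names come from the plan; the eligibility rule is an
# illustrative simplification.

def eligible_for_assignment(phase: str, high_performer: bool) -> bool:
    if phase == "Phase 1":
        # Fully automated routing and assignment; all interviewers
        # receive work, with emphasis on early contact.
        return True
    if phase in ("Phase 2", "Closeout"):
        # Later phases restrict assignments to high-performing
        # interviewers.
        return high_performer
    raise ValueError(f"unknown phase: {phase}")
```

In this sketch the only difference between Phase 2 and Closeout is outside the gating rule itself: the plan shifts the goal from completing full interviews to collecting minimal information for an orderly close-out.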
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Traditional enumeration and management of
workload, as implemented in the 2010 Census,
is no longer viable in an era of an ever-evolving,
demographically, culturally, and technologically
diverse nation.
•• Reduce the maximum number of NRFU contact
attempts.
•• Include the use of a handheld enumeration
device that would have the ability to track when
an enumerator opens a case.
•• Explore additional sources and criteria for inferring occupancy status and population size of
HUs from administrative records or third-party
data.
•• Avoid having to add late-planned operations
and procedures.
Operational Innovations
Operational innovations include the following:
•• Use of administrative records and third-party
data to reduce NRFU contacts for vacant HUs.
•• Use of administrative records and third-party
data to reduce NRFU contacts for occupied
HUs when sufficient information about the HU
and its members is known.
•• Automation of administrative functions such as
recruiting, onboarding, and payroll.
•• Automated training for field staff.
•• Automation of the field data collection.
•• Use of a reengineered field management structure and approach to managing fieldwork.
•• Assignment and route optimization.
•• Use of Device-as-a-Service as an alternative to
traditional procurement methods for smartphone and tablet devices used in the operation.
•• Use of a manager interview for multiunit dwellings to determine vacancy status of units before
the enumerator interview attempts at nonresponding addresses.
•• Reengineered quality assurance approach.
Description of Operation
For the 2020 Census, the NRFU Operation will be
different from the NRFU Operation conducted in
the 2010 Census. The Census Bureau will implement a NRFU operational design that utilizes a
combination of the following:
•• Administrative records and third-party data
usage to reduce the workload by removing
occupied, vacant, and nonexistent HUs from
additional follow-up when high-quality information about the HU and its occupants is known.
•• Reengineering of staffing and management of
field operations.
•• A Best-Time-to-Contact model to increase the
likelihood of making contact attempts when an
enumerator will find people at home.
•• Automation to facilitate data collection, including: automation of administrative functions,
such as recruiting, onboarding, and payroll;
automated training for field staff; automation of
the field data collection; and automated assignment and route optimization.
•• Use of Device-as-a-Service as an alternative to
traditional procurement methods for smartphone and tablet devices used in the operation.
•• Use of a manager interview for multiunit
dwellings to determine vacancy status of units
before the enumerator interview attempts at
nonresponding addresses.
•• Reengineered quality assurance approach.
The NRFU workload comprises addresses
from a number of sources, including:
•• Nonresponding addresses in the self-response
and UL TEAs.
•• Blank mail returns or mail returns otherwise
deemed to be insufficient.
•• Addresses from operations such as New
Construction and HU Count Review, the spring
2020 United States Postal Service (USPS)
Delivery Sequence File, addresses upheld
in the Local Update of Census Addresses
(LUCA) appeals process, and potentially other
addresses, determined after the initial enumeration universe is established.
•• Self-reported vacants from Internet
Self-Response.
•• Field Verification cases.
•• Coverage Improvement cases.
•• SRQA cases.
After giving the population an opportunity to
self-respond to the 2020 Census, the Census
Bureau will use the most cost-effective strategy
for contacting and counting people to ensure fair
representation of every person in the United States
and Puerto Rico. Once we know the addresses that
did not respond through Internet, telephone, or
paper, we will use administrative records to identify vacant addresses and addresses that do not
exist to reduce the number of contact attempts
at those addresses. Undeliverable-as-Addressed
(UAA) information from the USPS will provide the
primary administrative records source for the identification of vacant addresses and addresses that
do not exist.
For the addresses in the initial NRFU workload,
enumerators will make an in-person contact
attempt to determine the status of the address
(vacant, occupied, does not exist) and, when
occupied, collect the census response. If the
contact attempt was unsuccessful and the administrative records and UAA information identified the address as vacant or not a HU, then the
address will be resolved as vacant or not a HU
and no further contact will be attempted. If the
contact attempt was unsuccessful and the address
is believed to be occupied, and where the Census
Bureau has high-quality administrative records
from trusted sources, administrative records
will be used as the response data for the household and no further contacts will be attempted.
Examples of sources of administrative records and
third-party data used to enumerate occupied HUs
include Internal Revenue Service Individual Tax
Returns, Internal Revenue Service Information
Returns, and Center for Medicare and Medicaid
Statistics Medicare Enrollment Database.
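The resolution rule described above can be sketched as a small decision function. This is an illustrative sketch, not production logic: the function name, status labels, and the single "high quality" flag are hypothetical stand-ins for the richer modeling and trusted-source criteria the Census Bureau actually applies.

```python
# Illustrative sketch of the NRFU administrative-records resolution
# rule described above. All names and status labels are hypothetical.

def resolve_after_attempt(contact_successful: bool,
                          ar_status: str,
                          ar_quality_high: bool) -> str:
    """Disposition of a NRFU case after one in-person contact attempt.

    ar_status: what administrative records / USPS UAA information
    suggest about the address: "vacant", "nonexistent", "occupied",
    or "unknown".
    """
    if contact_successful:
        return "enumerated_in_person"
    if ar_status in ("vacant", "nonexistent"):
        # UAA information identified the address as vacant or not a
        # HU: resolve it and attempt no further contact.
        return f"resolved_{ar_status}"
    if ar_status == "occupied" and ar_quality_high:
        # High-quality records from trusted sources serve as the
        # response data for the household.
        return "enumerated_from_admin_records"
    # Otherwise the case remains in the workload for further visits.
    return "remains_in_workload"
```

For example, an unsuccessful visit to an address that UAA information flags as vacant resolves the case with no further attempts, while an unsuccessful visit with no usable records keeps the case in the workload.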
Addresses removed from the NRFU workload
as administrative records vacant, administrative
records nonexistent, or administrative records
occupied will receive a final mailing that encourages occupants to self-respond to the 2020
Census. Those addresses that are determined to
be administrative records vacant or administrative
records nonexistent will immediately be mailed
a final postcard encouraging self-response; for
those addresses that are determined to be occupied and are incomplete after one personal visit
attempt, a final postcard encouraging self-response will be mailed within 7 days.
Addresses will also be removed from the workload
throughout the course of the NRFU Operation as
self-responses are received. Addresses may be
added to the NRFU workload from other census
operations, such as addresses from the LUCA
Appeals process and addresses received through
the NID Processing Operation that require a field
visit for final resolution. See other sources contributing to the NRFU workload, listed above.
The NRFU Operation will use a reengineered field
management structure and approach to managing
fieldwork, which includes:
•• Using a new field structure, including field staff
roles and staffing ratios.
•• Using automation for:
ºº Training of enumerators and managers.
ºº Enhanced operational control system.
ºº Optimization of daily enumerator
assignments.
ºº Near real-time operational information for
decision making.
ºº Payroll submission and approval processing.
A foundational innovation of the NRFU operational
design is the optimization of enumerator assignments. On a daily basis, based on an enumerator’s
home location, work availability, the availability
and location of NRFU workload, and other operational business rules, the enumerator will be
assigned NRFU addresses. The enumerator will
work the addresses in a prescribed order to determine the Census Day status of the HUs, and when
occupied, enumerate the HUs. Enumerators will
use an automated data collection application on a
handheld device to record the Census Day HU status and to enumerate occupied HUs. If a respondent is not at home, a notice of visit will be left
directing the respondent to the Internet or Census
Questionnaire Assistance (CQA) to self-respond.
The assignment and completion of the NRFU
workload are also governed by:
•• Best-Time-to-Contact probabilities that are
considered in making assignments and are used
to increase the likelihood of finding people at
home.
•• Business rules that prescribe the number of
contact attempts for an address and when a
proxy response is acceptable. A proxy response
is a response provided by a knowledgeable
source, such as a neighbor.
•• Modifications to business rules as the end of
the operation approaches to ensure an efficient
and successful operational close-out.
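The interplay of enumerator location, work availability, and Best-Time-to-Contact probabilities can be illustrated with a toy scoring function. This is a hypothetical sketch under simplifying assumptions (straight-line distance, a per-hour contact probability table); the actual assignment optimization also handles routing, workload balancing, and other business rules.

```python
# Hypothetical sketch of daily NRFU assignment scoring: each
# enumerator is offered nearby open cases whose Best-Time-to-Contact
# probability is highest during the hours that enumerator can work.
import math

def score_case(enum_home, enum_hours, case):
    """Higher score = better candidate case for this enumerator today.

    case: dict with "location" as (x, y) coordinates and "bttc"
    mapping hour-of-day -> probability of finding someone at home.
    """
    dx = case["location"][0] - enum_home[0]
    dy = case["location"][1] - enum_home[1]
    distance = math.hypot(dx, dy)
    # Best contact probability during the enumerator's working hours.
    best_p = max(case["bttc"].get(h, 0.0) for h in enum_hours)
    # Favor likely-at-home cases close to the enumerator.
    return best_p / (1.0 + distance)

def assign(enum_home, enum_hours, cases, max_cases):
    ranked = sorted(cases,
                    key=lambda c: score_case(enum_home, enum_hours, c),
                    reverse=True)
    return ranked[:max_cases]
```

Used this way, two cases with equal contact probability are ranked by proximity to the enumerator's home location, which mirrors the plan's emphasis on optimizing daily assignments around where and when enumerators are available.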
Operational efficiencies are also gained through
the use of manager visits. When a number of
NRFU addresses share the same street address,
such as an apartment or condominium building,
the cases will be grouped together for a manager
visit. In the manager visit interview, the enumerator will ask the building manager to identify
which units were occupied, vacant, or not a HU
on Census Day. Units identified as vacant or not
a HU will be enumerated as such, reducing the
number of enumerator visits and respondent
burden. Addresses the building manager identifies
as occupied are subject to contact attempts by an
enumerator to collect their census response.
The 2020 Census NRFU operational design will
infuse quality throughout the workload management and data collection processes. Examples of
aspects of the NRFU Operation designed to maintain or improve quality include:
•• Use of real-time paradata and editing capabilities to increase accuracy and data quality.
•• Use of a Best-Time-to-Contact model in assigning work to increase the likelihood of finding
respondents at home.
•• Capabilities available through an enhanced
operational control system with real-time
supervisory alerts to provide early opportunities to identify and take corrective action in
defined situations.
In addition, the NRFU Operation will include a
reinterview component designed to deter and
detect enumerator falsification. Reinterviews will
be completed by field staff. Administrative records
will also be used as a quality control tool to validate data collected during interviews. A roster
match between the NRFU case and the administrative records does not need to be verified by
field reinterview.
Since the October 2016 release of version 2.0
of the 2020 Census Operational Plan, a decision
was reached to include a coverage improvement
operation component. The coverage improvement
component resolves erroneous enumerations
and omissions from the initial census self-response
data collection operation and all field enumeration operations, to achieve the overall objective
to count all people and HUs once and in the
right place. Coverage improvement cases will be
completed through a telephone follow-up interview. The interview is conducted with a household
respondent to determine whether changes should
be made to the household roster as reported on
the initial census return. The interview probes
determine whether people were missed or were
counted in error because they should be counted
at a different address.
In the past, the coverage improvement interview
collected missing demographic data for everyone
in the household, especially for those on the continuation roster in large households. For 2020, the
proposed coverage improvement design will:
•• Only send large households to coverage
improvement if another coverage issue is
identified.
•• Only collect demographic items for people added to the roster during the coverage
improvement telephone operation, and only
collect a subset of demographic items.
•• Not collect missing demographics for people
on the initial response roster.
•• For electronic modes, including the Internet,
collect initial census response data only. The
Census Bureau hopes to use the capabilities of
electronic modes to allow the respondent to
resolve count discrepancies during the initial
census response, thus reducing the number of
count discrepancy cases sent to coverage
improvement. The Census Bureau therefore
expects far fewer count discrepancies to come
from electronic modes.
•• Not use administrative records to identify cases
with potential coverage issues or for telephone
number lookup.
Research Completed
The following research has been completed for
this operation:
•• The 2013 Census Test (Philadelphia, PA)
explored methods for using administrative
records and third-party data to reduce the
NRFU workload:
ºº Findings:
•• The Census Bureau was able to remove
approximately 8 percent of vacant units
and 31 percent of occupied units prior to
NRFU using administrative records and
third-party data.
•• The use of administrative records and
third-party data and the implementation
of an adaptive design case management
approach have the potential to reduce
costs.
•• The 2014 Census Test (Montgomery County,
MD, and Washington, DC) built upon the results
of the 2013 Census Test specific to administrative records and third-party data usage to
reduce the NRFU workload:
ºº Findings: A high self-response rate of 65.9
percent resulted in a NRFU universe of
46,247 HUs. The Census Bureau was able
to identify approximately 4 percent of the
NRFU cases as vacant and 55 percent of
NRFU cases as occupied based on administrative records and third-party data.
•• The 2014 Human-in-the-Loop Simulation
Experiment (SIMEX).
ºº Findings:
•• The field management structure can be
streamlined and supervisor-to-enumerator
ratios increased.
•• Messaging and alerts within the operational control system provide real-time
and consistent communication.
•• The enhanced operational control system,
or MOJO, is intuitive—users were able to
use the system with a small amount of
up-front training.
•• Smartphones were usable by all people—
even those with little technology experience were able to adjust and adapt.
•• The 2015 Census Test (Maricopa County, AZ)
explored the reengineering of the roles, responsibilities, and infrastructure for conducting field
data collection. It also tested the feasibility
of fully utilizing the advantages of planned
automation and available real-time data to
transform the efficiency and effectiveness of
data collection operations. The test continued
to explore the use of administrative records and
third-party data to reduce the NRFU workload
and tested the technical implementation of a
Bring Your Own Device (BYOD) option.
ºº Findings:
•• A high self-response rate of 54.9 percent
resulted in a NRFU universe of 72,072 HUs.
The Census Bureau was able to identify
approximately 12 percent of the NRFU
cases as vacant and 20 percent of NRFU
cases as occupied based on administrative
records and third-party data.
•• Successfully removed vacant HUs
and enumerated occupied HUs using
administrative records and third-party
data.
•• A combination of automated online
training and classroom training enabled a
reduction in the overall number of training
hours, compared with the 2010 Census
NRFU Operation, from 32 to 18 hours.
•• Management of the field data collection utilizing new roles, responsibilities,
and staffing ratios were successfully
implemented.
•• Entry of enumerator work availability,
workload optimization, and electronic
payroll were effective and efficient.
•• Use of a BYOD option did not generate
any observable concerns from respondents. (See the Decisions Made section.)
•• The 2016 Census Test (portions of Los Angeles
County, CA, and Harris County, TX) was the
first opportunity to operationally test the new
manager visit procedures for enumeration of
multiunit structures. Also tested were different
supervisor to enumerator staffing ratios, and
incremental improvements and updates to the
field data collection software that guided an
enumerator through the interviews. Finally, this
test allowed the continued evaluation of the use
of administrative records to reduce the NRFU
workload, with the new addition of a postcard
mailout to any cases removed from the NRFU
workload in this way. Analysis of the findings is
underway, and results will be forthcoming.
•• The 2018 End-to-End Census Test (Providence,
RI) focused on the system and operational integration needed to support the NRFU Operation.
Nearly all 2020 system solutions supporting the
NRFU Operation were deployed allowing for
the first test of key functionality like reinterview
computer matching and operation management in the Field Operational Control System.
This test also allowed the continued evaluation
of the NRFU contact strategy and close-out
procedures.
Decisions Made
The following decisions have been made for this
operation:
99 The NRFU Operation will consist of production
and quality assurance components.
99 The NRFU Operation will utilize automated
tools and systems for:
ºº Recruiting, onboarding, and training.
ºº Time and attendance and payroll.
ºº Case load management.
ºº Data collection.
ºº Cost and progress monitoring.
99 The NRFU Operation will utilize a reengineered
field management and staffing structure.
99 Administrative records and third-party data will
be used to identify vacant units.
99 Administrative records and third-party data will
be used to enumerate nonresponding HUs, as
appropriate.
99 A contact attempt will be made prior to using
administrative records or third-party data for
determination of vacant/nonexistent status and
enumeration of occupied units.
99 A final postcard, encouraging self-response,
will be mailed to NRFU cases that are removed
from the workload based on the administrative
records modeling.
99 Telephone contact attempts from a central
location (i.e., CQA) will not be part of the initial
NRFU contact strategy.
99 All administrative records and third-party
data will be used in compliance with data-use
agreements.
99 The core set of administrative records and
third-party data to support the 2020 Census
NRFU operations includes the following:
ºº Internal Revenue Service Individual Tax
Returns.
ºº Internal Revenue Service Information
Returns.
ºº Center for Medicare and Medicaid Statistics
Medicare Enrollment Database.
ºº Indian Health Service Patient Database.
ºº Social Security Number Identification File.
ºº USPS DSF.
ºº USPS Undeliverable-as-Addressed
Information.
ºº Targus Federal Consumer File.
ºº 2010 Census Data.
ºº American Community Survey Data.
99 Detailed agreements with each data provider for the core administrative record and
third-party data sources are established. The
agreements document details, such as delivery
cycles, duration of agreements, renewal cycles,
etc. Each agreement includes text that allows
the data to be used by the Census Bureau for
statistical purposes including activities that
support the Decennial Census Program.
99 The Census Bureau will pursue multiple avenues
to minimize error associated with administrative records usage. The use of USPS UAA
information is the core source in our usage of
administrative records determination of vacant
addresses and addresses that do not exist. To
minimize error in the use of the USPS UAA
information in determining the status of an
address, the Census Bureau is partnering with
the USPS to understand the procedures and
steps used by letter carriers when assigning
specific reasons to mail pieces that are UAA.
The Census Bureau observed the USPS assignment of the UAA reasons/codes and participated in focus groups with USPS carriers to
discuss their process.
99 The Census Bureau will apply specific, preidentified criteria for each HU to make a determination regarding its status.
99 Enumerators will not make specific appointments with respondents to conduct interviews.
99 Administrative records and third-party data
will be stored and accessed through a repository known as Production Environment for
Administrative Record Staging, Integration, and
Storage (PEARSIS). The Current Analysis and
Estimation System will access data in PEARSIS
to support administrative records modeling for
the NRFU Operation. The Decennial Response
Processing System will provide the response
processing capabilities to identify and ingest
administrative records and third-party data
for the purposes of providing case status
(vacant and nonexistent) and census responses
(occupied).
99 The Census Bureau will build upon the
approach used in the 2016 Census Test involving an upfront Manager Visit to ascertain the
unit status for nonresponding addresses in the
NRFU workload.
99 Proxy responses are used in the NRFU
Operation when a resident of the nonresponding address is not available or cannot be found.
Proxy responses will be allowed after the
third unsuccessful contact attempt to reach
a resident of a nonresponding address. Proxy
responses are allowable on the first unsuccessful contact attempt for addresses deemed to be
vacant or not meeting the definition of a HU.
99 Based on results from the 2016 Census Test,
the following staffing structure will be used:
Census Field Manager, Census Field Supervisor,
and enumerator. The ratio of Census Field
Supervisor to enumerator will be 1:20.
99 Administrative records and third-party data
will be used for the identification of addresses
in the NRFU workload deemed to be vacant
or delete to reduce contact attempts to
those addresses before any contact attempts.
Administrative records and third-party data will
be used for the identification and enumeration
of addresses deemed to be occupied after one
unsuccessful attempt at in-person enumeration.
All other addresses in the NRFU workload will
be subject to up to six contact attempts with
cases becoming proxy eligible after the third
unsuccessful attempt. Refinement of this contact strategy (e.g., additional contact attempts)
may be possible if necessary to ensure an efficient and successful operational close-out.
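The contact strategy stated above can be expressed as a small rule for deciding what to do with a case after a given number of unsuccessful attempts. This sketch is illustrative only; the function name and action labels are hypothetical, and the actual business rules carry additional conditions and close-out refinements.

```python
# Sketch of the NRFU contact-attempt business rules described above:
# up to six attempts, proxy responses allowed after the third
# unsuccessful attempt, and immediately for addresses deemed vacant
# or not meeting the definition of a HU. Names are hypothetical.

MAX_ATTEMPTS = 6
PROXY_AFTER = 3

def next_action(unsuccessful_attempts: int,
                deemed_vacant_or_delete: bool) -> str:
    if deemed_vacant_or_delete and unsuccessful_attempts >= 1:
        # Proxy responses are allowable on the first unsuccessful
        # attempt for vacant/delete addresses.
        return "proxy_eligible"
    if unsuccessful_attempts >= MAX_ATTEMPTS:
        return "closeout"
    if unsuccessful_attempts >= PROXY_AFTER:
        # After the third unsuccessful attempt, a knowledgeable
        # proxy (e.g., a neighbor) may provide the response.
        return "attempt_or_proxy"
    return "attempt_resident_only"
```
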
99 Field verification will be conducted using NRFU
enumerators with case assignment interspersed with their NRFU assignments. For Field
Verification, enumerators will be expected to
locate the problem address and collect GPS
coordinates for the HU using the automated
instrument. Contact with the HUs is not required.
99 The operational design for the NRFU quality
assurance component includes the following:
ºº Use of an improved contact strategy to
increase the likelihood of self-response.
ºº Use of an automated data collection application for conducting NRFU.
ºº Use of real-time paradata and editing capabilities to validate and ensure data quality.
ºº Use of Best-Time-to-Contact model in the
assignment optimization to increase the likelihood of finding respondents at home.
ºº Use of Notices of Visit to push to
self-response.
ºº Use of follow-up postcard mailings to
encourage self-response in the case of
administrative records and third-party data
vacant/nonexistent removal and occupied
removal.
ºº A reinterview component designed to deter
and detect enumerator falsification.
99 All units identified as vacant or delete will be
verified by either a proxy response or a second
enumerator. Vacants from self-response will be
verified by an enumerator.
99 The Census Bureau will have the capability to
keep cases active throughout the enumeration
process to aid in obtaining adequate response
rates.
99 Enumerators, as part of their normal work on
NRFU assignments, will not be looking for
missing addresses and adding them to their
workload, but they will have the capability to
add addresses and enumerate those HUs if
appropriate. Staff in the Area Census Offices
will also have the capability to add addresses
to the NRFU workload that have been deemed
to be missing from the address list and require
enumeration.
99 Case assignments are optimized based on the
location of enumerators, the location of the
NRFU cases, the hours the enumerators are
available to work, and Best-Time-to-Contact
probabilities associated with the NRFU cases.
99 The NRFU field data collection will occur from
early-April 2020 through the end of July 2020.
Field work in preidentified geographic areas
surrounding colleges or universities with concentrations of off-campus housing will begin in
early April. This is necessary in areas where
the spring semester will conclude before
mid-May, when the bulk of the NRFU workload
begins.
99 NRFU will receive supplemental addresses
from sources such as LUCA appeals, Count
Review, New Construction, and a refresh from
the spring 2020 Delivery Sequence File from
the Postal Service. Other sources of cases contributing to the NRFU workload include, but are
not limited to, Reverse Check-ins, SRQA cases,
and Self-Responding Vacant cases.
99 Enumerator performance indicators include:
ºº Number of completed interviews.
ºº Completed interview rate.
ºº Number of attempts.
ºº Resolved cases per attempt.
ºº Hours worked.
ºº Resolved cases per hour.
ºº Number of alerts triggered.
ºº Number of refusal conversions.
ºº Results of reinterviews.
99 Priority capture of paper responses will not be
required in support of the NRFU Operation.
Postal tracking information will be used to
inform the cut for the NRFU workload, resulting
in the removal of cases for which postal tracking indicates a paper response is on its way
back to the Census Bureau. If, upon data capture, it is determined that the paper response
has insufficient data to be considered a complete response, those cases will be added back
into the NRFU workload.
99 The final set of administrative records for use
in the NRFU Operation consists of a combination of federal files from the Internal Revenue
Service, Social Security Administration,
Center for Medicare Services, Housing and
Urban Development, Indian Health Service,
Selective Service, and the U.S. Postal Service.
In addition, we are using third-party datasets
including Veterans Service Group of Illinois, and
CoreLogic or comparable files.
99 Federal and state administrative records agreements are in place with source agencies to
provide timely delivery of data and to describe
the allowable uses, including modeling, validation, and, where feasible, direct substitution.
The third-party datasets are acquired through
purchase agreements and are generally used at
the government's discretion. Detailed information concerning each administrative record or
third-party dataset used in the NRFU Operation
is documented in the Risk Register for Data
Sources used in the 2020 Census, along with
links to the individual agreements or purchase
information.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
The investment in NRFU, which includes the use
of administrative records and third-party data and
reengineered field operations, is projected to influence (reduce or increase) the 2020 Census
overall costs in the following ways:
ÐÐ Reducing field workload by:
ºº Using administrative records and thirdparty data to reduce the number of contact
attempts.
ºº Using administrative records and third-party
data to enumerate nonresponding HUs.
ºº Removing self-responses on a near-real-time
basis.
ºº Interviewing managers of multiunit buildings
to identify and remove vacant units from the
NRFU workload.
ÐÐ Improving productivity of field staff by:
ºº Streamlining staffing structure through the
use of automation.
ºº Automating and optimizing the assignment
process.
ºº Using language information from the
planning database to match enumerator
language skills to neighborhood language
needs.
ºº Using administrative records and third-party
data to determine the best time of day for
contact attempts.
ÐÐ Reducing the reinterview workload through a
reengineered quality assurance approach.
ÐÐ Reducing the number of hours devoted to
classroom training through the use of online
training.
Impacts of this operation on overall 2020 Census quality include the following:
↑ Use of an improved contact strategy to increase the likelihood of self-response.
↑ Use of an automated data collection application for conducting NRFU.
↑ Use of real-time paradata and editing capabilities to validate and ensure data quality.
↑ Use of the Best-Time-to-Contact model in the assignment optimization to increase the likelihood of finding respondents at home.
↑ Use of the Notice of Visit to encourage self-response.
↑ Use of a follow-up postcard mailing to encourage self-response in the case of administrative records and third-party data vacant/delete removal and occupied removal.
↓ Use of administrative records and third-party data to reduce contact attempts for vacant, nonexistent, and occupied HUs may impact HU coverage.
↔ Use of new or revised methodologies will change results in ways not yet determined.
↔ Use of adaptive design protocol and proxy rules may impact the quality of response data in ways not yet determined.
Risks
For the 2018 End-to-End Census Test, the limited resources within the Sampling, Matching, Review, and Coding System team focused on bug fixes in production and on development for the GQ Operation. As a result, system development and testing were delayed and reduced for NRFU Reinterview, and the NRFU QC program had less than full functionality in place for the 2018 End-to-End Census Test. IF the NRFU QC program is not fully tested prior to the 2020 Census, THEN its viability cannot be ensured and quality control for the 2020 Census may be compromised.
A key component of the NRFU Operation is the QC program. The program consists of sampling a percentage of NRFU cases for Reinterview, matching the results of that Reinterview to the production case, reviewing that match, providing a resolution for the case, and taking corrective actions as necessary. In previous census tests, only pieces of the QC program could be fully tested. Without fully testing the QC program in the decade leading up to the 2020 Census, particularly as an integrated component of the NRFU Operation, the potential impact on quality is significant: the NRFU QC program is at a much higher risk of not identifying large-scale data quality issues, and the Census Bureau needs confidence in NRFU's ability to identify poor-performing enumerators in a timely fashion to minimize data errors and possible rework. IF the NRFU QC program is not fully tested prior to the 2020 Census, THEN the quality of the NRFU Operation may be compromised.
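The Reinterview workflow can be sketched as follows (a minimal illustration; the sampling rate, field names, and match criteria are assumptions, not the production QC design):

```python
import random

def select_reinterview_sample(completed_cases, rate=0.05, seed=2020):
    """Sample a percentage of completed NRFU cases for Reinterview QC."""
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    return [c for c in completed_cases if rng.random() < rate]

def resolve_case(production_response, reinterview_response):
    """Match a Reinterview result to the production case and resolve it.

    Returns "pass" when the illustrative key fields agree, otherwise
    "fail", which would trigger review and corrective action (e.g.,
    rework of the enumerator's other cases).
    """
    keys = ("status", "population_count")
    agree = all(production_response.get(k) == reinterview_response.get(k)
                for k in keys)
    return "pass" if agree else "fail"
```

A "fail" resolution is what would feed the review and corrective-action steps described above.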
Personnel directives, such as rules governing premium pay (night differential for evening hours and Sunday work), may affect data collection costs. IF it is determined that premium pay is required, THEN the cost of enumerating during premium times will increase, which increases the overall NRFU budget.
Enumerator hiring is estimated to be a 60-day process. IF the NRFU Operation needs to hire more enumerators during field data collection, THEN the time required for hiring would make it difficult to complete the operation in the time allotted; to mitigate this, NRFU may need to recruit and obtain clearance for more potential enumerators than will be hired, which will increase the cost of recruitment.
The general public may be reluctant to respond
to the 2020 Census because of a general mistrust
of the government, which may result in lower
response rates, an increase in the NRFU workload,
and an increase in operational costs. It may also
result in more refusals during NRFU interviews.
IF the 2020 Census self-response rate decreases,
THEN the NRFU workload will increase, as will the
cost of the operation.
Many aspects related to the NRFU operational
design and the infrastructure necessary to support it are based on workload assumptions. A key
input to those workload assumptions is the self-
response rate. IF the 2020 Census self-response
rate falls below expectations, THEN the initial
NRFU workload will be higher than expected
and the infrastructure and staffing to support an
increased field data collection volume may be
insufficient.
Milestones

Date            Activity
November 2013   Begin NRFU for 2013 Census Test.
August 2014     Begin NRFU for 2014 Census Test.
November 2014   Conduct 2014 SIMEX.
May 2015        Begin NRFU for the 2015 Census Test.
September 2015  Determine preliminary NRFU design.
May 2016        Begin NRFU for 2016 Census Test.
September 2016  Determine strategy for use of administrative records and third-party data in NRFU.
June 2017       Release the NRFU Detailed Operational Plan, version 1.0.
May 2018        Begin NRFU for 2018 End-to-End Census Test.
March 2019      Release the NRFU Detailed Operational Plan, version 2.0.
April 2020      Begin NRFU data collection for the 2020 Census.
July 2020       End NRFU data collection for the 2020 Census.
August 2021     Issue operational assessment of the 2020 Census NRFU Operation.
5.5.11 Response Processing
Detailed Planning
Status:
In Production
DOP published in FY 2017
Purpose
The Response Processing Operation (RPO) supports the three major components of the 2020 Census: pre-data collection activities, data collection activities, and post-data collection activities. Specifically, the operation supports the following:
•• Create and distribute the initial 2020 Census
enumeration universe of living quarters (LQs).
•• Assign the specific enumeration strategy for
each LQ based on case status and associated
paradata.
•• Create and distribute workload files required for
enumeration operations.
•• Track case enumeration status.
•• Run post-data collection processing actions
in preparation for producing the final 2020
Census results.
•• Check for suspicious returns.
Changes Made Since Version 3.0 Operational Plan
Release: The scope of the operation no longer
includes the creation and delivery of the Microdata
Detail File. This process is now being managed by
the Data Products and Dissemination Operation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Make response data available as soon as possible to the data review teams in order to facilitate a more thorough review.
Figure 31: Response Processing Operation. [The figure summarizes RPO activities by phase. Pre-data collection activities: receive address and geographical input data for all known living quarters; apply criteria to create the initial 2020 Census enumeration universe; assign the specific contact strategy for each living quarters based on defined criteria. Data collection activities: receive updates to the initial 2020 Census universe; create the 2020 Census self-response universe; create and distribute workloads to data collection modes based on modeling results or specification criteria; identify potential fraudulent returns from self-responses and record final fraud investigation disposition; record response data and enumeration case status; deliver response data to post-data collection activities. Post-data collection activities: apply data codes to write-in responses to facilitate data tabulation; resolve potential duplicate responses; identify the return of record for housing units with multiple returns; repair missing or conflicting data; provide final census results.]

•• Include more staff members from more areas in the Primary Selection Algorithm determination process. This will result in broader expertise for design planning, rather than limiting the work to a small team of mathematical statisticians or analysts.
•• Make user testing of the Quality Control
program component part of the schedule for
residual coding, to facilitate development of
procedures and training of data coding staff.
Operational Innovations
Operational innovations include the following:
•• Use of enterprise-developed tools to facilitate intelligent business decisions prior to and during data collection:
  ◦ Interface with all printing systems for production of paper products.
  ◦ Serve as the overall integration "manager" of response data collection, including Internet, telephone, and paper.
  ◦ Create models based on established business rules to determine the appropriate course of enumeration action for cases (e.g., person visit, use of administrative records and third-party data, or imputation) and assign each case to the specific mode for data collection.
•• Expanded use of administrative records and
third-party data in post-data collection processing activities to support improved data
coverage.
•• Expanded use of automated technology, communications monitoring, and improved computational modeling and data analytic techniques to provide early warnings of potentially suspicious returns.
Description of Operation
Pre-Data Collection Activities
The pre-data collection phase of RPO creates and populates a respondent data collection universe of LQs for use during the later data collection and post-data collection phases of RPO. This universe contains census addresses and geographic attributes for all known housing units (HUs), group quarters (GQs), and transitory locations within the boundaries of the United States and Puerto Rico. Each known LQ in the universe is populated with address information, a Census identifier (ID), geocoding information, case management information, and a contact strategy. The Census ID
Data Collection Activities
For data collection activities, the RPO starts with
receiving and managing updates to the initial
2020 Census universe. These updates come from
various address frame update operations including Local Update of Census Addresses and some
Geographic Programs activities. The results from
the address updates establish a revised 2020
Census enumeration universe. The RPO uses
this universe to control and track questionnaire
response data. As responses are received, cases
containing a Census ID are designated as received
in the universe. Cases returned without Census IDs
are sent to the non-ID (NID) Processing Operation
for matching and geocoding. All cases are
returned to the RPO and those that were successfully resolved are removed from further enumeration follow-up.
For nonresponding cases, the RPO supports
the Nonresponse Followup (NRFU) Operation
by facilitating administrative records modeling
techniques to determine the most effective and
efficient enumeration strategy, including removal
of vacant and deleted cases before follow-up,
provision of a “best time to contact” recommendation to be used by the operational control
system, and removal of cases from the workload
based on established “stopping rules” to maximize
efficiency in the NRFU Operation.
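The modeling outputs described above can be sketched as a case-level decision (the thresholds, field names, and ordering of rules here are illustrative assumptions, not the Census Bureau's production business rules):

```python
def nrfu_strategy(case):
    """Decide an illustrative enumeration strategy for a nonresponding case.

    `case` is a dict of hypothetical administrative-records model outputs:
      p_vacant   - modeled probability the unit is vacant or nonexistent
      p_occupied - modeled probability the unit is occupied with usable
                   administrative records
      attempts   - number of in-person contact attempts made so far
    """
    MAX_ATTEMPTS = 6  # illustrative "stopping rule" threshold

    if case["p_vacant"] >= 0.95:
        return "remove_vacant_delete"          # removed before follow-up
    if case["attempts"] >= MAX_ATTEMPTS:
        return "stop_and_resolve"              # stopping rule reached
    if case["p_occupied"] >= 0.90 and case["attempts"] >= 1:
        return "enumerate_with_admin_records"  # count using records
    return "personal_visit"                    # continue field follow-up
```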
Additionally, the RPO provides response collection support to Update Leave, Update Enumerate,
GQ, and Enumeration of Transitory Locations
Operations. In general, the activities include creating and managing the enumeration workloads and
follow-up universes, as well as the enumeration
and, as applicable, address listing quality control
functions.
Post-Data Collection Activities
The RPO supports post-data collection activities by preparing the data for tabulation. The Self-Response Quality Assurance (SRQA) process will perform automated and interactive checks to identify potentially suspicious returns from self-response that require analyst investigation and/or field follow-up to ensure quality. It will also ensure a final disposition for each response processed by SRQA is recorded prior to tabulation. As the data
are received, write-in responses (i.e., alpha characters for race and ethnicity responses) are coded
for tabulation purposes. Coding is conducted by
both automated and computer-assisted manual
processes. RPO also applies computer-based
person matching software to unduplicate multiple responses for the same person across census
records. Then, a Primary Selection Algorithm is
run to establish the single enumeration record
for a case when multiple responses are received.
Following the Primary Selection Algorithm,
count imputations are applied and missing data
resolved to fix discrepancies in household population counts and the status of HUs. This output
is called the Census Unedited File. The Census
Unedited File is used as a data source for coverage measurement operations and a final independent Count Review Operation. Finally, the Census
Unedited File is the source used to produce the
apportionment counts delivered to the President
of the United States via the Data Products and
Dissemination Operation.
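The selection step can be sketched as follows (an illustrative reduction of the Primary Selection Algorithm; the real algorithm's criteria are more involved, and the field names here are assumptions):

```python
def primary_selection(returns):
    """Pick a single enumeration record when multiple returns exist.

    Illustrative rule only: prefer complete returns, then earlier receipt.
    `returns` is a list of dicts with hypothetical fields `mode`,
    `complete`, and `received` (an orderable timestamp or day number).
    """
    if not returns:
        raise ValueError("a case must have at least one return")
    # Sort key: complete returns first (False < True), then earliest receipt.
    return min(returns, key=lambda r: (not r["complete"], r["received"]))
```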
The next steps are to perform preliminary and complex consistency edits and impute missing values (imputation and allocation), which results in the Census Edited File. As part of a final closeout, RPO prepares census response data for delivery by the Archiving Operation to the National Archives and Records Administration for the Title 13 prescribed 72-year secured storage.
Figure 31 summarizes the RPO by component.
Research Completed
The following research has been completed for
this operation:
•• The 2014 Census Test evaluated the interface
between the response processing system and
the matching and geocoding system. In addition, it tested the data file exchange.
ºº Findings: The tests concluded with no major
system or workload-related issues.
Decisions Made
The following decisions have been made for this operation:
✓ The RPO will use the enterprise-developed system solution Control and Response Data System for universe creation, the Enterprise Census and Survey Enabling Operational Control System for data collection control and management, and the Decennial Response Processing System for final data processing.
✓ The enterprise-developed Concurrent Analysis and Estimation System and its modeling output will use established business rules to determine the appropriate course of enumeration action for cases and assign each case to the specific mode for data collection to improve efficiency and reduce cost.
✓ Administrative records and third-party data will be used to improve post-data collection activities, such as SRQA, coding and editing, Primary Selection Algorithm, and imputation.
✓ The RPO will comply with Title 13 and Title 26 security requirements.
✓ Methodology, processes, and systems have been defined. Methodology will continue to be adjusted as operational development, integration, and demand models are refined through conducting and evaluating results from the 2017 and 2018 tests.
✓ The specific use of administrative records and third-party data in support of reducing the field workload associated with the NRFU Operation is known and has been effectively utilized during past census tests. In addition, usage of the records is known regarding address enhancement to improve matching NID responses through the asynchronous NID process. Finally, SRQA's (including response validation) use of administrative records and third-party data has been defined. However, the Census Bureau will continue to adjust as integrated operations and demand models are refined through conducting and evaluating the results from the 2017 and 2018 tests.
✓ Character sets have been defined and will continue to be adjusted as integrated operations, language options, and data architecture are refined through conducting and evaluating results from the 2017 and 2018 tests.
99 Inputs to the response file layout have been
defined and will continue to be adjusted as
integrated operations and the data architecture
are refined throughout conducting and evaluating results from the 2017 and 2018 tests.
Milestones

Date            Activity
March 2015      Establish the development, test, beta, staging, and production environments for RPO.
December 2015   Go live to support the 2016 Census Test universe creation and response tracking.
December 2016   Go live for the 2017 Census Test.
January 2017    Deliver revised 2020 Census business requirements for RPO.
April 2017      Release the RPO Detailed Operational Plan, version 1.0. (Note: This initial release reflects the state of the RPO as of January 13, 2017. In addition, the post-data collection phase of this operation is not fully presented, as some details about the process continue to be worked out.)
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in RPO is projected to influence (reduce or increase) the 2020 Census overall costs through:
↓ Real-time adjustment of the universe based on response status.
↓ Use of administrative records and third-party data (see NRFU).
↓ Flexible, rule-based decisions on the most cost-effective approach for collecting responses (expected to reduce in-field workloads).
Impacts of this operation on overall 2020 Census quality include the following:
↑ Use of administrative records and third-party data to improve imputation, editing and coding, Primary Selection Algorithm, and SRQA processing.
Risks
Changes to the 2020 Census specifications are expected after completion of the 2018 End-to-End Census Test. IF these changes are not incorporated and delivered to the systems in time for system development, THEN there may be delays or changes in scope needed to complete these systems on time for the 2020 Census.
The RPO is responsible for any administrative record (AdRec) enumerations needed for occupied, vacant, and deleted HUs. RPO systems must develop AdRec processing techniques to create enumeration records for nonresponding HUs unresolved by field operations at the end of the data collection phase. IF RPO systems do not develop AdRec processing techniques for unresolved nonresponding HUs, THEN necessary enumeration records, including status, person records (if occupied), and characteristics, will not be accurately reflected in the final 2018 End-to-End Census Test results.
Milestones (continued)

Date            Activity
June 2018       Release the updated RPO Detailed Operational Plan, version 2.0 (delayed).
September 2018  Deliver final 2020 Census business requirements for RPO.
October 2019    Create the initial 2020 Census enumeration universe for early census operations.
January 2020    Create the 2020 Census enumeration universe. Begin the 2020 Census RPO.
November 2020   Deliver the 2020 Census Unedited File for apportionment counts.
February 2021   Deliver the 2020 Census Edited File for Differential Privacy protection.
5.5.12 Federally Affiliated Count Overseas
Detailed Planning
Status:
Underway
DOP to be published in FY 2019
Purpose
The Federally Affiliated Count Overseas (FACO) Operation obtains counts by home state of U.S. military and federal civilian employees stationed or assigned overseas and their dependents living with them.
Changes Made Since Version 3.0 Operational Plan Release: There have been no major changes to this operation.
Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• Explore new technology, including an Internet option for collecting data on the federally affiliated population living overseas.
•• Automate this operation fully.
•• Consider new data fields to identify the residency of the military personnel living overseas.
•• Maintain a strong relationship with the Department of Defense.
Operational Innovations
Operational innovations include creating a secure file transfer protocol for the Department of Defense to submit their enumeration counts to the Census Bureau, and the development of a Web server application for departments and agencies to submit their counts to the Census Bureau.
Description of Operation
For the 2020 Census, overseas is defined as anywhere outside the 50 states and the District of Columbia. Counts are obtained from administrative records and are used to allocate the federally affiliated population living overseas.
FACO performs the following activities:
•• Compiles an address list of federal agencies with personnel overseas.
•• Prepares letters and data collection materials.
•• Engages and communicates the Census Bureau's methodology and procedures with the Defense Manpower Data Center.
•• Requests the name of a contact person for each agency.
•• Obtains agencies' overseas counts by state.
•• Uses the tabulated data file from the Department of Defense to obtain total counts of military and civilian personnel stationed or assigned overseas and their dependents living with them.
•• Submits final counts in the apportionment counts.
Decisions Made
✓ Overseas counts will be gathered in several ways. The Defense Manpower Data Center will send counts for military and civilian personnel stationed or assigned overseas, under the authority of the Memorandum of Understanding between the Census Bureau and the Department of Defense, to a secure census Web server. Federal agencies will send counts for civilian personnel to a secure census Web application.
✓ The data sharing agreement between the Department of Defense and the Census Bureau stipulated that the Department of Defense will electronically send the Census Bureau a secure tabulated file that includes total counts of its military and civilian personnel who are stationed or assigned overseas (and their dependents living with them at the overseas post/duty station).
Design Issues to Be Resolved
There are no remaining design issues to be resolved for this operation.
Cost and Quality
Investment in the FACO Operation is projected to have minimal influence on the overall cost and quality of the 2020 Census.
Risks
The FACO Operation plans to use an external-facing portal as an automated data collection system for the 2020 Census overseas count. IF there is a cybersecurity incident with the external-facing portal, THEN the information collected for the FACO Operation may be compromised.
Milestones

Date                Activity
February 2014       Established contact with Defense Manpower Data Center.
February 2017       Reviewed final guidelines for counting federally affiliated population living overseas.
May 2018–July 2020  Design and prepare agency/department head contact letters, agency/department point of contact count request letters and instructions, and follow-up letters for nonresponding agency/department heads and agency/department points of contact.
December 2018       Release the FACO Detailed Operational Plan (delayed).
September 2019      Review and finalize list of departments and agencies to receive participation letter.
April–May 2020      Receive overseas counts.
September 2020      Review and reconcile overseas counts.
November 2020       Deliver overseas counts to Population Division. Population Division will include these counts in the apportionment count. These counts will be allocated to a home state for the purposes of apportioning seats in the U.S. House of Representatives.

5.5.13 Update Leave
Detailed Planning
Status:
In Production
DOP published in FY 2018
Purpose
The Update Leave (UL) Operation is designed to occur in areas where the majority of housing units (HUs) either do not have mail delivered to the physical location of the HU or the mail delivery information for the HU cannot be verified. The purpose of the operation is to update the address and feature data for the area assigned and to leave a choice questionnaire package at every HU identified to allow the household to self-respond. Occupants will be offered three different ways to complete the questionnaire: Internet, telephone, or mailing back a completed paper questionnaire.
UL can occur in geographic areas that:
•• Do not have city-style addresses.
•• Do not receive mail through city-style addresses.
•• Receive mail at post office boxes.
•• Have high concentrations of seasonally vacant housing.
•• Have been affected by major disasters.
Changes Made Since Version 3.0 Operational Plan Release:
UL was introduced as a new operation in late May 2017. A new Type of Enumeration Area (TEA) was created for the UL Operation (TEA 6). The majority of the HUs originally delineated in the Update Enumerate TEA (TEA 2) were redelineated to the new UL TEA in the final TEA delineation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, it has been recommended
to determine ways to closely track the fieldwork
during the UL field operation in order to monitor
any falsification or procedural issues that may
arise during production.
Operational Innovations
•• Use a reengineered field management structure
and approach to managing fieldwork, including a new field office structure and new staff
positions.
•• Reuse processes and procedures from In-Field
Address Canvassing (ADC) Operation to the
extent feasible.
•• Use software to update the address list and collect feature data to provide updates in real time
and reduce back-end paper processing.
•• Have the ability to link questionnaires to
addresses at the time of the update so a
response is later linked to the correct address.
The primary functions of UL include:
•• Verifying and updating the address list and feature data for tabulation of the 2020 Census.
•• Determining the type and address characteristics for each living quarter (LQ).
•• Leaving a questionnaire package at every HU for the household to respond to the census.

Description of Operation
During the UL Operation, enumerators will compare address information on the ground to their address list, and verify, correct, delete, or add addresses. UL will utilize software on a device for an automated listing process. The UL Operation will use the same business rules implemented for the ADC Operation.
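The verify/correct/delete/add reconciliation can be sketched as follows (a simplified illustration under assumed data structures; the production ADC business rules are more detailed, and "correct" actions are omitted here):

```python
def reconcile_address_list(address_list, observed):
    """Reconcile a census address list against addresses seen on the ground.

    `address_list` maps a census ID to an address string; `observed` is the
    set of address strings the enumerator finds in the assignment area.
    Returns an action per case: verify, delete, or add. In production the
    system would create a real ID for adds in real time; here we only tag them.
    """
    actions = {}
    for census_id, addr in address_list.items():
        if addr in observed:
            actions[census_id] = "verify"   # listed address confirmed
        else:
            actions[census_id] = "delete"   # listed address not found
    known = set(address_list.values())
    for addr in sorted(observed - known):
        actions[f"new:{addr}"] = "add"      # new unit needs an ID
    return actions
```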
After updating the address information, enumerators will link a paper questionnaire to a HU, and
then leave the questionnaire package at the HU
for the household to self-respond. UL will leave a
choice questionnaire package at every HU. When
the enumerator adds a new address, the system
will create an Identifier (ID) in real time. (This was
tested in the 2018 End-to-End Census Test.)
Occupants will be offered three different ways to
complete the questionnaire, including Internet,
phone, or by mailing back a completed paper
questionnaire.
UL instructs enumerators to visit each HU only
once. If the household does not respond using
the information left by the enumerator, the HU
will be visited again by an enumerator during the
Nonresponse Followup Operation.
Research Completed
The UL Operation was researched and tested
during the 2018 End-to-End Census Test in
Providence County, Rhode Island. UL Operation
results from the 2018 End-to-End Census Test will
be published in the 2018 End-to-End Census Test:
Update Leave Operational Assessment Study Plan.
Decisions Made
The following decisions have been made for this operation:
•• Group quarters (GQ) will not be enumerated during the UL Operation. Those cases will be enumerated via the GQ Operation.
•• Transitory locations will not be enumerated during the UL Operation. Those cases will be enumerated via the Enumeration at Transitory Locations Operation.
Design Issues to Be Resolved
There are no remaining design issues to be resolved for this operation.
Cost and Quality
Investment in UL is projected to have minimal influence on the overall cost and quality of the 2020 Census.
Risks
Major disasters in the form of hurricanes, floods, epidemics, etc., are uncontrolled events that could affect the willingness and ability of the population in TEA 1 designated areas to participate in the 2020 Census. IF a major disaster occurs in a TEA 1 designated area at or around the time of the 2020 Census, and the decision is made to redesignate the area as TEA 6, transferring the workload to the UL Operation, THEN UL will need the ability to be expanded in time to provide full coverage of the impacted geographic area.
Remedy tickets opened during production fieldwork must be prioritized, routed properly, and resolved in a timely manner once submitted. During the 2018 End-to-End Census Test UL fieldwork, field staff were unable to use the You Are Here Indicator (YAHI) to capture GPS mapspots. A ticket was submitted to the Decennial Service Center for research and issue resolution. However, the problem was not addressed or properly investigated (still pending review and investigation as of May 8) and, as a result, there was no opportunity to correct the YAHI issue and minimize its impact on production. IF any technical problems occur during the 2020 UL Operation and the process for addressing technical anomalies has not been greatly improved beyond that of the 2018 End-to-End Census Test, THEN technical problems occurring in 2020 may not be resolved in a timely manner, potentially affecting the accuracy of data collected and resulting in a delay in the completion of the operation.
Milestones

Date            Activity
May 2017        Create the UL Operation.
September 2017  Release the UL Detailed Operational Plan, version 1.0.
March 2018      Begin UL for 2018 End-to-End Census Test.
April 2018      End UL for 2018 End-to-End Census Test.
March 2020      Begin UL for 2020 Census.
April 2020      End UL for 2020 Census.
5.6 PUBLISH DATA
The Response Processing (RPO) Operation provides to the Count Review Operation preliminary
counts before census operations are completed
so that Federal-State Cooperative Population
Estimate members have an opportunity to ensure
the counts appear correct.
After census operations are completed through
the RPO post-data collection phase, RPO delivers
the data to the Data Products and Dissemination
(DPD) Operation to prepare the final 2020 Census
data products, including apportionment counts,
redistricting data, and other data products for
the public. DPD coordinates the dissemination of
the redistricting data with the Redistricting Data
Program Operation. DPD also delivers final counts
to the Count Question Resolution Operation so
challenges to Census Counts can be resolved.
All data products and response data are sent
to the Archiving Operation for public release 72
years after the census.
5.6.1 Data Products and Dissemination
Detailed Planning
Status:
In Production
DOP to be published in FY 2019
Purpose
The Data Products and Dissemination (DPD) Operation performs three primary functions:
•• Prepare and deliver the 2020 Census apportionment data for the President of the United States to provide to Congress by December 31, 2020.
•• Tabulate 2020 Census data products for use by the states for redistricting.
•• Tabulate and disseminate 2020 Census data for use by the public.
Changes Made Since Version 3.0 Operational Plan Release: DPD now includes the application of the 2020 disclosure avoidance methodology to the microdata in order to produce the Microdata Detail File. This process was previously part of the Response Processing Operation (RPO). The disclosure avoidance methodology that will be implemented for the 2020 Census is known as differential privacy. Differential privacy is the scientific term for a method that adds "statistical noise" to published data tables in a way that protects each respondent's identity.
The implementation of this new privacy protection method created the need to conduct a comprehensive review of how data products from prior decennial censuses were used, to help determine plans for the 2020 Census data products. Public feedback is essential for this comprehensive review and will assist the Census Bureau in prioritizing products from the 2020 Census. Feedback was solicited in the late summer and early fall of 2018 via the Federal Register Notice "Soliciting Feedback From Users on 2020 Census Data Products" (83 FR 34111) and the reopening of that notice (83 FR 50636).
Lessons Learned
Based on lessons learned from the 2010 Census assessments and reviews, the following recommendations were made:
•• Provide an approach to restructure and enhance data dissemination activities across the entire agency.
•• Improve customer satisfaction.
•• Expand the Census Bureau's audience and customer base.
•• Utilize an enhanced confidentiality system based on differential privacy, in which the amount of noise injection added to the data can be precisely controlled and tailored.
Operational Innovations
Operational innovations include the following:
•• Use of enterprise solutions for preparing the 2020 Census data products and disseminating the information to the public.
•• Enhancements to existing tabulation systems to support 2020 Census tabulation as an enterprise solution.
•• Leveraging new solutions to allow data users greater flexibility in using 2020 Census data for research, analytics, application development, etc. The focus is on user-centric capabilities and dissemination functionality.
•• Move to a new, advanced, and far more powerful confidentiality protection system that precisely controls the amount of statistical noise
added to data products using sophisticated
mathematical formulas. This allows the Census
Bureau to assure enough noise is added to protect privacy, but not so much as to damage the
statistical validity of our data products.
Description of Operation
The DPD Operation covers the application of differential privacy protection to the response data; the aggregation and tabulation of the processed response data, incorporating any additional guidance from the Disclosure Review Board; and the preparation of these data for delivery to the President, the states, and the public.
As the leader in statistical data protection, we
take steps to prevent any outside entity from identifying individuals or businesses in statistics we
publish. Historical methods to protect data cannot
completely defend against the threats posed by
today’s technology. When data are tabulated, a
process called “noise injection” is implemented
to help prevent tracking statistics back to specific respondents. It has been a key feature of the
Census Bureau’s confidentiality protection systems
for decades and has been done in every decennial
census since 1980, as well as many other Census
Bureau data products. This process is a delicate balancing act: enough noise must be added to protect confidentiality, but too much noise could damage the statistic's fitness-for-use.
Growth in computing power, advances in mathematics, and easy access to large public databases pose a significant threat to confidentiality. These forces have made it possible for sophisticated users to ferret out common data points between databases using only our published statistics. To combat this threat, the Census Bureau is
implementing the new differential privacy method.
This new methodology will be tested and implemented with the 2018 End-to-End Census Test.
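The noise-injection idea can be illustrated with a minimal sketch. The actual 2020 Disclosure Avoidance System is far more sophisticated (it protects entire tabulations jointly), so the function names, the per-cell Laplace mechanism, and the toy epsilon below are illustrative assumptions, not the Bureau's implementation:

```python
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    # Inverse-CDF sample from a Laplace(0, scale) distribution.
    u = rng.random() - 0.5          # uniform on (-0.5, 0.5)
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def privatize_counts(counts: dict, epsilon: float, seed: int = 0) -> dict:
    """Add Laplace(1/epsilon) noise to each cell of a table of counts.

    For a counting query with sensitivity 1, scale = 1/epsilon gives
    epsilon-differential privacy. Smaller epsilon means more noise:
    more privacy protection, but less accuracy -- the balancing act
    described above.
    """
    rng = random.Random(seed)
    scale = 1.0 / epsilon
    return {cell: n + laplace_noise(scale, rng) for cell, n in counts.items()}

noisy = privatize_counts({"block_1": 42, "block_2": 7}, epsilon=1.0)
```

The released values are the noisy counts; no published cell reveals any single respondent's exact contribution.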
To complete the implementation of the new
disclosure methodology for 2020 Production, the
entire 2020 Census data products suite must be
defined in advance. The process for determining
the 2020 Census data products began with the
announcement of the Federal Register Notice to
solicit data user feedback. The assessment phase
is now underway with a final determination of
2020 Census products expected to be completed
in early 2019. The Census Bureau anticipates publishing the plans for the 2020 Census data products in a future notice.
For the 2018 End-to-End Census Test, the prototype Public Law (P.L.) 94-171 (Redistricting data)
is the only data product that will be tabulated and
released by April 1, 2019.
An enterprise-level dissemination system,
Center for Enterprise Dissemination Services and Consumer Innovation (CEDSCI), will provide
access to prepackaged data products via an
interactive Web site. Data users will have access
to the prepackaged data products, application
programming interfaces (API), and metadata documentation. This system is the replacement for the
previous dissemination system known as American
FactFinder.
Research Completed
Research was conducted to test the feasibility of
using the American Community Survey (ACS) tabulation system as the solution for the 2020 Census.
In late summer of 2016, testing of the tabulation
system using select 2010 Census data products
proved the feasibility of scaling to 2020 Census
production.
Assumptions Made
Based on planning of other operations, the following assumptions have been made:
•• The apportionment for the 2020 Census will
be calculated using the method of equal proportions, according to the provisions of Title 2,
U.S. Code. Congress decides the method used
to calculate the apportionment. This method
has been used in every census since the 1940
census.
•• This operation will:
ºº Define data products.
ºº Define metadata.
ºº Implement differential privacy methodology.
ºº Generate metadata and mapping for API.
ºº Generate data products (Apportionment,
Redistricting data, and all other products)
and associated data documentation.
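The method of equal proportions itself is well defined: after each state receives one seat, remaining seats are awarded one at a time to the state with the highest priority value pop/√(n(n+1)), where n is the state's current seat count. A compact sketch (the state names and seat totals are toy values, not census figures):

```python
import heapq
import math

def equal_proportions(populations: dict, total_seats: int) -> dict:
    """Apportion seats by the method of equal proportions (Huntington-Hill).

    Each state starts with one seat; every remaining seat goes to the
    state with the highest priority value pop / sqrt(n * (n + 1)),
    where n is the number of seats that state currently holds.
    """
    seats = {state: 1 for state in populations}
    # Max-heap emulated with negated priorities.
    heap = [(-pop / math.sqrt(1 * 2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(total_seats - len(populations)):
        _, state = heapq.heappop(heap)
        seats[state] += 1
        n = seats[state]
        heapq.heappush(heap, (-populations[state] / math.sqrt(n * (n + 1)), state))
    return seats

# Toy example: three "states" sharing 10 seats.
result = equal_proportions({"A": 6_000_000, "B": 3_000_000, "C": 1_000_000}, 10)
# → {"A": 6, "B": 3, "C": 1}
```

In production the same procedure runs over 435 seats and the 50 state populations, as Title 2 prescribes.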
Decisions Made
99 The CEDSCI data user interface will be developed and released in a 40-day cadence called
“Program Increments” based on prioritized
functionality identified at the enterprise level
and guided by CEDSCI’s Integration and
Implementation Plan.
U.S. Census Bureau
99 The tabulation system supporting the ACS will
be generalized and enhanced to support both
the ACS and the 2020 Census. The generalized
system will be scaled to support both ACS and
decennial tabulation needs during the 2020
Census production.
99 The 2020 Census data products will be determined following the analysis of the feedback
from the Federal Register Notice “Soliciting
Feedback from Users on 2020 Census Data
Products” and its extension and consultation
with the Data Stewardship Executive Policy. For
the 2018 End-to-End Census Test, the prototype
P.L. 94-171 is the only data product that will be
tabulated and released by April 1, 2019.
Design Issues to Be Resolved

There are no remaining design issues that need to be resolved for this operation.

Cost and Quality

Investment in DPD is projected to have minimal influence on the overall cost and quality of the 2020 Census.

Risks

The 2018 and 2020 Disclosure Avoidance System (DAS) algorithms are highly complex, under active development, address subtle though precisely stated mathematical privacy issues, and solve genuinely novel scientific problems. All of these factors create an environment in which only rigorous software engineering standards, code review, and formal external auditing of internal code can reasonably be expected to remove major bugs from the code. In a highly unlikely scenario, a coding error could lead to noise not being infused in some subset of census geographies, meaning that each datum released in those geographies in the final Microdata Detail File would constitute a Title 13 violation. IF there are errors in the implementation of the DAS software, THEN Title 13 data could be improperly disclosed.

Congress establishes the method of calculating apportionment. Legislation has also been proposed (but not passed) in recent decades that would have provided statehood (or at least a seat in the U.S. House of Representatives) to the District of Columbia or Puerto Rico. IF any bill that affects either the apportionment calculation method or the allocation of U.S. House seats is passed before the legal apportionment results deadline of December 31, 2020, THEN the Census Bureau will need to change the 2020 Apportionment calculation programs to accommodate the change; and if that happens at the last minute, the 2020 Apportionment schedule and workflow must be condensed and/or refactored, potentially jeopardizing the quality assurance of the 2020 Apportionment results.

Milestones

March 2014: Release the concept of operations for a more customer-centric, streamlined, and flexible enterprise solution for data dissemination.

July 2014: Establish the Center for Enterprise Dissemination Services and Consumer Innovation.

October 2017: Release the DPD Detailed Operational Plan (delayed).

December 2018–April 1, 2019: Deploy DAS tabulation system and dissemination platform for production and release of the P.L. 94-171 Redistricting Data Prototype.

February 2019: Complete comprehensive review of data products and finalize the 2020 Census Data Products Suite.

December 2020: Provide apportionment counts to the President of the United States.

By April 1, 2021: Complete the release of the P.L. 94-171 Redistricting Data to the states, the District of Columbia, and Puerto Rico.

May 2021–September 2022: Deliver 2020 Census statistical data to the enterprise data dissemination platform for the release of quick tables and API.

April 2023: Complete release of 2020 Census data products.
5.6.2 Redistricting Data Program
Detailed Planning
Status:
In Production
DOP published in FY 2016
Purpose
The purpose of the 2020 Census Redistricting
Data Program (RDP) is to provide to each state
the legally required Public Law (P.L.) 94-171 redistricting data tabulations by the mandated deadline of 1 year from Census Day: April 1, 2021.
Changes Made Since Version 3.0 Operational Plan
Release:
There have been no major changes to this
operation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Provision of a prototype product is necessary.
•• The ability to provide legal boundary updates
is needed.
•• Delivery of the data prior to public release is
necessary.
Operational Innovations
Operational innovations include the following:
•• Separation of the program’s Block Boundary
Suggestion Project from the Voting District
Project to allow greater external participation.
•• Inclusion of a Boundary and Annexation Survey (BAS) component to capture and improve underlying geography.
•• Processing at headquarters and the National
Processing Center to provide states with
consistent guidance, to enhance coordination between BAS and RDP, and to reduce
burden on the Geographic Area Reconciliation
Program.
•• State legislative district updates captured at
the time of collection of Congressional district
updates to reduce the need for multiple efforts.
Description of Operation
The RDP Operation provides the 50 states, the
District of Columbia, and Puerto Rico with the
opportunity to identify, delineate, and update
geographic boundaries for data tabulation. It
also allows for continuous process improvement
through an evaluation of the program with recommendations for the next cycle in an official publication called “The View From the States.”
The five major components in the 2020 Census
RDP include:
•• Phase 1—Block Boundary Suggestion Project.
•• Phase 2—Voting District Project.
•• Phase 3—P.L. 94-171 data and geographic support products design and delivery.
•• Phase 4—Collection of changes to
Congressional and State Legislative Districts.
•• Phase 5—Evaluation of the 2020 Census RDP
and recommendations for the 2030 RDP.
Research Completed
The following research has been completed for
this operation:
•• January 2015: Released the Designing P.L.
94-171 Redistricting Data for the Year 2020
Census—The View From the States.
ºº Findings:
•• Need for a “one number” census.
•• Need for a prototype data product.
•• Need for data delivery prior to public
release.
•• Need for Group Quarters (GQs) Operation
data.
•• Need for support products using the most
current (2020) geography.
•• Need for tabulation block and voting district data.
•• Need for states to have the option to use
their resident Geographic Information
Systems for program participation.
Decisions Made
The following decisions have been made for this
operation:
99 Prototype P.L. 94-171 redistricting data tabulations and geographic support products
from the 2018 Census End-to-End Test will be
generated and distributed to official liaisons by
April 1, 2019.
99 Use Geographic Update Partnership Software
as one of the methods for interaction with and
collection of partner updates.
99 GQ tabulations by race for the seven main GQ
types will be included as part of the official P.L.
94-171 redistricting data file for total population
only.
99 The Block, Block Group, and Tract crosswalk
files can be released prior to the April 1, 2021,
P.L. 94-171 redistricting data file deadline.
99 The race and ethnicity questions will continue
to be separate and not changed to a combined
question for the 2018 End-to-End Census Test
or the 2020 Census. Nor will a Middle Eastern/
North African category be added as a minimum
reporting category or ethnicity. Therefore, no
changes will be necessary to the structure of
the P.L. 94-171 Redistricting Data File from the
2010 file design, other than the unrelated and
documented decision to add a table for GQs
Population by GQs Type.
99 The Citizen Voting Age Population by Race
and Ethnicity tabulation from the American
Community Survey that is planned for production in early 2021 can be produced using the
new 2020 Census tabulation block and tract
geography. However, if tabulations are already
available for citizenship by voting age by race
and ethnicity at the block group and tract level
as part of planned 2020 Census products, then
this special tabulation will remain on the 2010
block group and tract geography for release in
2021.
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
Question: What IT capabilities and data distribution methodology will be used for the 2020 Census (including maps)?
Expected Date: June 2019
Cost and Quality
Investment in RDP is projected to have minimal
influence on the overall cost of the 2020 Census.
Impacts of this operation on overall 2020 Census
quality include the following:
ÏÏ Improvements in underlying geography through
iterated update cycles.
ÏÏ Improvements in the P.L. 94-171 data product
design, including the addition of a GQ table,
will better meet the needs of the states for
small area tabulations to conduct legislative
redistricting.
Risks
As part of its mission to provide the states with
the small area tabulations needed to conduct
legislative redistricting and to deliver that product
within 1 year of Census Day, the Census Bureau
produces a full suite of prototype products. IF
the P.L. 94-171 prototype data products for
small-area population totals are not accessible for
stakeholders through the Center for Enterprise
Dissemination Services and Consumer Innovation
after the 2018 End-to-End Census Test, THEN
the delivery will be delayed and/or the Census
Redistricting and Voting Rights Data Office will
have to expend resources to develop an alternative method to disseminate data to its stakeholders, such as providing products on removable
media.
A lack of institutional knowledge and new data
disclosure avoidance methodology for P.L. 94-171
data tabulation and review could prevent a timely
release of data prior to the April 1, 2021, official deadline. IF institutional knowledge is not
acquired by the start of the review of the P.L.
94-171 data tabulation, THEN processing of P.L.
94-171 data and creation of products could take
longer, causing the RDP to miss the legal deadline
for P.L. 94-171 data release of April 1, 2021.
Milestones
July 2014: Submit Federal Register Notice proposing the 2020 Census RDP.

January 2015: Publish "Designing P.L. 94-171 Redistricting Data for the Year 2020 Census—The View From the States."

December 2015–May 2017: Conduct Phase 1: Block Boundary Suggestion Project.

September 2016: Release the RDP Detailed Operational Plan, version 1.0.

October 2017: Finalize the P.L. 94-171 prototype products design.

December 2017–May 2019: Conduct Phase 2: The Voting District Project.

March 2018: Release the RDP Detailed Operational Plan, version 2.0.

March 2019: Deliver P.L. 94-171 prototype products.

November 2020–March 2021: Conduct Phase 3: Data Delivery for the 2020 Census RDP.

April 1, 2021: Deliver the P.L. 94-171 data (legal deadline).
5.6.3 Count Review
Detailed Planning
Status:
Underway
DOP to be published in FY 2019
Purpose
The 2020 Census Count Review Operation (CRO)
enhances the accuracy of the 2020 Census
through remediating potential gaps in coverage
by:
•• Implementing an efficient and equitable process to identify and incorporate housing units
(HUs) that are missing from the Census Master
Address File (MAF).
•• Identifying and including or correcting large
group quarters (GQs), such as college/university student housing, that are missing from the
MAF or geographically misallocated.
•• Positioning unresolved cases for a smooth transition to the Census Question Resolution (CQR)
Operation.
Changes Made Since Version 3.0 Operational Plan Release: The Data Stewardship Executive
Policy Committee approved a remote review
option for both Federal-State Cooperative for
Population Estimates (FSCPE) Count Review
events. FSCPE members will have the opportunity to conduct reviews remotely or travel to
Census Bureau headquarters to conduct them.
Special Sworn Status and Title 13 training will
be provided to the participants, as well as an IT
account to access the data, Geographic Update
Partnership Software (GUPS) and other tools
that will help them to conduct the review and
communicate with the Census Bureau liaisons.
Lessons Learned
Based on lessons learned from the 2010 Census,
the following recommendations were made:
•• Planning for the CRO Program needs to begin
earlier in the decennial planning cycle to be
more easily and fully integrated with decennial
census operations.
•• Address-level precision is essential to an effective count review program.
•• Consider working with the Emergency Services
data (E911) system, tax assessor records, and
other federal agencies to develop a common
format and address updating protocol.
•• Have both GQ and HU address information
available during the review.
Operational Innovations
For the 2020 Census, the CRO will be timed such
that the results of the reviews are fully integrated
with the other operations. For example, the review
of HUs will be conducted in time to include any
changes resulting from the review into the supplemental universe for potential mailings and for
nonresponse followup (NRFU). It will also allow
time to follow up or to conduct data collection at
misallocated or missed GQs.
Description of Operation
The operational description provided below is
based primarily on the operational design of
the 2010 Census CRO. As was the case in past
censuses, the 2020 Census CRO relies heavily on
participation from members of the FSCPE. For
the 2020 Census CRO, the FSCPE/2020 Census
Working Group was established to explore
opportunities to leverage the knowledge and
experience of the FSCPE members to benefit the
overall 2020 Census Program. The working group
includes representatives from the FSCPE Steering
Committee, as well as Census Bureau subject-
matter experts from Field Division, Decennial
Census Management Division, Geography Division
(GEO), and Population Division.
The CRO consists of the following:
•• A pre-2020 Census review of HU and GQ
addresses to identify HUs that are potentially
missing from the MAF, or large GQs that are
potentially missing from the MAF or geographically misallocated to the wrong census block.
Missing HU addresses that meet CRO requirements will be forwarded to NRFU. Missing or
misallocated GQ addresses that meet CRO
requirements will be transmitted to GQ enumeration. Members of the FSCPE from all 50 states,
the District of Columbia, and Puerto Rico are
invited to participate in this event.
The FSCPE participants will obtain address and
coordinate data from various sources, with the
historically most common sources being tax
assessor records and E911 data. State participants will be required to provide their addresses
and Global Positioning System (GPS) coordinate
data in a specified digital format so that these
data can be used in an application that enables
a review and comparison of the state-provided
data to Census Bureau data.
Census Bureau staff will perform quality checks
on the data, ensuring that all records have state
and county codes, GPS coordinates, etc. The
data will then be sent to GEO to be standardized and matched against the MAF. The output
from this matching process will be the input
files for the review tool (GUPS).
This application will be made available to the
FSCPE reviewers, and will provide information showing the differences between tallies
of the Census Bureau and FSCPE HU and GQ
addresses in a given county, tract, or block.
The prescribed review process will focus the
reviewers on the geographies where the FSCPE
counts of HU or GQ address were higher than
those of the Census Bureau.
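The match-and-compare step above can be sketched as follows. The record layout, field names, and crude normalization are illustrative assumptions; the actual MAF matching performed by GEO uses far richer address standardization:

```python
from collections import Counter

def normalize(addr: str) -> str:
    # Crude standardization stand-in: uppercase, collapse whitespace.
    return " ".join(addr.upper().split())

def review_differences(state_records, census_records):
    """Match state-provided (tract, address) pairs against the census
    list and tally, by tract, the addresses the census file lacks --
    the geographies a reviewer would focus on.

    Each record is a hypothetical (tract_id, address) pair.
    """
    census_keys = {(t, normalize(a)) for t, a in census_records}
    missing = [(t, a) for t, a in state_records
               if (t, normalize(a)) not in census_keys]
    return Counter(t for t, _ in missing)

state = [("0101", "12 Main St"), ("0101", "14 Main St"), ("0102", "9 Oak Ave")]
census = [("0101", "12 MAIN ST"), ("0102", "9 OAK AVE")]
# "14 Main St" has no census match, so tract 0101 is flagged for review.
```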
Similar to the 2010 Census CRO, the GQ types in scope for the review are expected to include nursing/skilled-nursing facilities, college/university student housing, military barracks, adult correctional facilities, workers' dormitories, and Job Corps centers with populations of 50 or more. These GQ types will be selected for review primarily because they represent more than 80 percent of the nation's GQ population and comprise the majority of large GQs. After a potentially missing GQ
is identified, a second research step will be
conducted to determine if the GQ record was
under another GQ type code that was ineligible
for the review.
•• A post-enumeration review of GQ addresses will
occur to identify those addresses that are still
potentially missing from the MAF or geographically misallocated to the wrong census block.
The GQ post-enumeration review data will come
from a file of the GQ records enumerated in the
2020 Census available at the time of the review.
The purpose of the GQ post-enumeration
review is to identify any missing or misallocated GQ addresses while GQ enumeration
is still underway. Both GQ reviews (early and
post-enumeration) allow the Census Bureau to
collect demographic characteristics data of the
population living in those GQs. As with the first
event, members of the FSCPE from all 50 states,
the District of Columbia, and Puerto Rico are
invited to participate in this review.
•• A review by Census Bureau staff of the following census files for systematic or large anomalies in population, HU, and GQ counts (Census Count and File Review [CCFR]):
ºº Decennial Response Files 1 and 2
ºº Census Unedited File
ºº Census Edited File (CEF)
ºº Microdata Detail File.
The objective of the CCFR is to determine how
reasonable the results of our decennial census
data collection efforts appear to be at several
levels of geography compared to multiple sets
of benchmark data. If CCFR finds anomalies or unexpected results, it reports the findings to the Response Processing Operation, which then reviews its processing steps to determine whether the issues are correctable.
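As a rough illustration of this benchmark comparison, a review tool might flag any geography whose census count deviates from a benchmark by more than some threshold. The names and the 10 percent tolerance below are hypothetical, not the CCFR's actual review criteria:

```python
def flag_anomalies(census_counts, benchmark_counts, tolerance=0.10):
    """Compare census tallies against one benchmark set and return the
    geographies whose count deviates by more than `tolerance`
    (a fraction of the benchmark), paired as (observed, benchmark).
    """
    flagged = {}
    for geo, bench in benchmark_counts.items():
        observed = census_counts.get(geo, 0)
        if bench and abs(observed - bench) / bench > tolerance:
            flagged[geo] = (observed, bench)
    return flagged

census = {"county_A": 1000, "county_B": 480}
benchmark = {"county_A": 1020, "county_B": 600}
# county_A deviates about 2 percent (within tolerance);
# county_B deviates 20 percent and is flagged.
```

The same check would run at each level of geography named above (nation, state, county, tract, block), once per benchmark set.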
CCFR also includes the edit review process, which verifies that each person and HU record on the CEF has valid values in the person and housing items and ensures consistency among characteristics. Edit review focuses on path tracers (tallies) and hot deck matrix reports, and verifies that the edit and imputation processes have been applied correctly.
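Hot deck imputation, named in the paragraph above, fills a missing item from a recently processed "donor" record. A minimal sequential sketch (real decennial imputation conditions donors on many characteristics via the hot deck matrices, so this is only the bare idea):

```python
def hot_deck_impute(records, field):
    """Sequential hot deck sketch: a missing value for `field` is filled
    from the most recently seen valid value (the 'hot deck'), or left as
    None if no donor has been seen yet. Each record gains a flag noting
    whether its value was imputed.
    """
    donor = None
    imputed = []
    for rec in records:
        rec = dict(rec)                    # leave the input untouched
        if rec.get(field) is None:
            rec[field] = donor             # borrow from the last valid record
            rec[f"{field}_imputed"] = True
        else:
            donor = rec[field]             # update the hot deck
            rec[f"{field}_imputed"] = False
        imputed.append(rec)
    return imputed

rows = [{"age": 34}, {"age": None}, {"age": 52}]
out = hot_deck_impute(rows, "age")
# The middle record's missing age is filled with 34, the preceding donor.
```

The edit review described above checks, via the hot deck matrix reports, that such imputations were applied in the intended cells and nowhere else.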
Work Completed
•• Developed the Business Process Model and
requirements.
•• Selected FSCPE early participants from
Colorado, New York, Pennsylvania, and
Washington.
•• Engaged with FSCPE early participants in
biweekly meetings with the subject-matter
experts.
•• Completed development of the GUPS for the
FSCPE review process (currently undergoing
testing).
•• Awarded contract to one of the four FSCPE
early participant states (currently in the security
clearance portion of the onboarding process).
•• Assigned Census Data Lake to develop review
tool for the CCFR.
Decisions Made
The following decisions have been made for this
operation:
99 Similar to the approach used in the 2010
Census CRO, there will be two distinct opportunities for FSCPE knowledge and experience to
remediate potential gaps in coverage associated with missing HU and missing or misallocated GQ addresses. FSCPE representatives
will leverage information from their respective
states along with MAF data and GUPS software
provided by the Census Bureau to identify clusters of missing HUs and missing or misallocated
GQs.
99 The 2020 Census CRO will be deemed successful if a majority of states participate and
complete their review on time, and the Census
Bureau is able to investigate and resolve the
majority of identified address issues before the
2020 Census is complete.
99 The planned levels of geography for conducting
HU and GQ address count review are counties,
tracts, and blocks. The planned levels of geography for the CCFR are nation, states, counties,
tracts, and blocks.
99 In 2020, the Census Bureau will host two GQ
reviews. The early GQ review will take place
concurrently with the HU review from January
to February 2020, after address canvassing.
FSCPEs will identify missing and misallocated
GQs, which will be sent to GQ enumeration.
The post-enumeration GQ review will happen
when GQ enumeration is nearly completed (mid
to late June 2020) to ensure that there are no
missing GQs. Any missing GQs will be sent to
GQ enumeration.
99 The approaches that will be used for validating
missing HUs provided by FSCPEs will be aerial
imagery and alternative sources (property tax
files, etc.).
99 The approaches that will be used for validating missing and misallocated GQs provided by
FSCPEs will be aerial imagery and alternative
sources (property tax files, etc.).
99 Current plans are for FSCPE stakeholders
to review the HU and early GQ counts from
January to February 2020, and the post-enumeration GQ counts in the summer 2020.
99 The objectives and scope will be included in the Detailed Operational Plan scheduled to be released in March 2019. A decision was recently made to develop the GUPS for the 2020 Census CRO.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in the CRO is projected to have
minimal influence on the overall cost of the 2020
Census, but will contribute to improving the quality of the results.
Risks
Contracts with FSCPE state representatives are
required for their participation in 2020 Census
CRO. IF the contracts for FSCPE state representatives are not funded, THEN the 2020 Census CRO will be completed without the needed local expertise, and the review may be less precise and efficient.
It has been recommended that FSCPE personnel
have the capability to conduct the data review
remotely. IF FSCPE representatives are not able
to review data remotely, THEN the review may be
delayed and will incur additional costs.
Milestones
October 2015: Initiate the 2020 Census CRO Integrated Project Team.

March 2019: Release the CRO Detailed Operational Plan.

January to February 2020: Conduct first 2020 Census CRO review event: HU and GQ addresses.

June 2020: Conduct second 2020 Census CRO review event: census post-enumeration GQ addresses.

September 2020 to March 2021: Conduct 2020 Census Count and Files Review of Decennial Response Files 1 and 2, Census Unedited File, Census Edited File, and Microdata Detail File.

August 2021: Issue 2020 Census Count Review Program Operational Assessment.
5.6.4 Count Question Resolution
Detailed Planning
Status:
Recently Begun
DOP to be published in FY 2019
Purpose
The Count Question Resolution (CQR) Operation
provides a mechanism for governmental units to
challenge their official 2020 Census results.
Changes Made Since Version 3.0 Operational Plan
Release: There have been no major changes to
this operation.
Lessons Learned
Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:
•• Create a milestone schedule and ensure it is
followed.
•• Meet early and often so that all stakeholders involved make decisions up front, before
beginning to program control systems or write
procedures.
•• Make sure planning tasks are completed on
time and everyone is aware of key decisions.
Operational Innovations
No specific operational innovations have been
identified for this operation.
Description of Operation
The CQR Operation provides a mechanism for
governmental units to challenge the accuracy of
their final 2020 Census counts.
The CQR Operation includes the following
activities:
•• Draft proposed process and rules and publish in
the Federal Register.
•• Finalize process and rules and publish in the
Federal Register.
•• Identify staffing needs and make temporary
appointments and reassignments.
•• Receive, investigate, and respond to all challenges, including correcting errors found within
the established guidelines of the program.
•• Certify revised population and housing counts
for governmental unit(s).
Research Completed
Because detailed planning for this operation has
not yet started, research that directly supports
this operation has not yet been completed.
Assumptions Made
Based on initial discussions, the following assumption has been made:
•• This program will be conducted in a similar
manner to both the 2000 and 2010 Censuses.
Decisions Made
No decisions have been finalized for this
operation.
Design Issues to Be Resolved
Additional work is required to make decisions on
the following questions:
Question: What is the approach for addressing unexpected issues related to count or geographic discrepancies? For example, in the 2010 Census, there were some very specific issues with the way the Census Bureau geocoded Navy ships in U.S. harbors.
Expected Date: June 2019

Question: Will the Census Bureau require challenging governments to provide location information for each housing unit (HU) they provide on their list?
Expected Date: June 2019

Question: What types of challenges will be in scope?
Expected Date: June 2019

Question: What documents and systems will be needed to research and respond to challenges?
Expected Date: June 2019
Cost and Quality
Investment in CQR is projected to have minimal
influence on the overall cost of the 2020 Census,
but is designed to improve the quality of the 2020
Census results.
Risks
No risks have been identified to date for this
operation.
Milestones
January 2017: Begin planning and development of program schedule, process, and initial Federal Register Notice.

June 2019: Release the CQR Detailed Operational Plan.

August 2020: Publish initial Federal Register Notice identifying process and types of challenges to be considered.

May 2021: Publish final Federal Register Notice to establish process, timing, and types of challenges in scope for the program.

June 2021: Begin accepting challenges from governmental units.

2021–2023: Issue revised certified counts as appropriate and make available on .

June 2023: Deadline for governmental units to submit challenges.

September 2024: End program and issue assessment and lessons learned report.
5.6.5 Archiving
Detailed Planning Status: Underway (DOP published in FY 2018)
Purpose
The Archiving (ARC) Operation performs the following functions:
•• Coordinates storage of materials and data and
provides records deemed permanent as the
official data of the 2020 Census, including files
containing the individual responses to the 2020
Census, to the National Archives and Records
Administration (NARA).
•• Provides similar files to the National Processing
Center (NPC) to use as source materials to conduct the Age Search Service.
•• Stores data to cover in-house needs.
Changes Made Since Version 3.0 Operational Plan
Release: There have been no major changes to
this operation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Make sure staff are regularly reminded of their
records management responsibilities. They need
to understand the distinction between permanent and temporary records, and the Census
Bureau’s legal obligation to archive permanent
records.
•• Start archiving planning (with an interdivisional
team) earlier in the life cycle—suggest FY 2018
at the latest.
•• Keep a log or spreadsheet recording which
materials the records schedule requires to be
sent to NARA, how they will be sent, the dates
promised, and the actual transfer dates.
Operational Innovations
Participate in cloud implementation as a solution
for storing and transferring electronic records for
archiving.
Description of Operation
The Census Bureau must provide copies of the
individual responses to the 2020 Census (including
names and addresses) to NARA. The specific
format, media, and timing of the delivery
are negotiated between the Census Bureau and
NARA. Because the primary use of this information is for genealogical searches (to be released
no sooner than 72 years after Census Day), the
Census Bureau may also have to provide a linkage
between the individual response data and the
copies of questionnaires on paper, on microfilm, or
in electronic form. This operation also provides similar data to support the Census Bureau
Age Search Program at the NPC in Jeffersonville,
Indiana.
The ARC Operation is responsible for the Census
Bureau Records Schedule relating to the 2020
Census. The 2020 Census Records Schedule established with NARA is only intended to encompass
final records used to capture, process, and tabulate respondent data, and final records used to
collect and update address and map information.
Research Completed
Planning for this operation has started, but
research that directly supports this operation has
not yet been completed.
Decisions Made
The individual responses will be delivered electronically to NARA no later than 15 years from
the census year, in 2035. The individual responses
will be delivered to NARA in the format outlined
in the 2020 Census Records Schedule that will be
finalized in 2019.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in ARC is necessary to support legislative and constitutional mandates and will require
funding for 15 years after the census to support all
archiving solutions, which will influence the cost and
quality of the 2020 Census.

Risks
The archiving solution for 2020 Census-related
data will need to support the storage of data and
materials from several large sources, as well as
provide access to those data and materials. Because
of the length of time that the archived materials and data must be maintained, there will be
significant cost associated with supporting the
systems that store, and provide access to, data
in the archiving solution. IF funding
is not provided for the systems supporting the
archiving solution, THEN the solution may not be
able to store all the 2020 Census data
required by law to be sent to NARA, nor provide the
access necessary for responding to legal inquiries or
conducting research for planning future censuses.

Milestones
Annually, beginning in 2016: Update the official records plan maintained by the Records Manager for each participating division.
August 2016: Begin identification and review of all records that will be generated by or for the 2020 Census.
October 2016: Begin negotiations with NARA to make preliminary determinations of which records will be deemed permanent and must be archived.
September 2018: Release the ARC Detailed Operational Plan.
April 2021: Develop the final records schedule with NARA and submit it for approval by the Archivist.
July 2022: Begin transfer of permanent records to NARA.
January 2023: Complete transfer of all permanent records to NARA. Complete destruction of all temporary records no longer needed by the Census Bureau.

5.7 OTHER CENSUSES
Other Censuses comprises all functions associated
with the decennial censuses for American Samoa,
the Commonwealth of the Northern Mariana Islands
(CNMI), Guam, and the U.S. Virgin Islands, collectively known as the Island Areas. There is one
operation in this area: Island Areas Censuses.

5.7.1 Island Areas Censuses
Detailed Planning Status: Underway (DOP published in FY 2018, delayed)
Purpose
The purpose of the Island Areas Censuses (IAC)
Operation is to enumerate all residents of American
Samoa, the Commonwealth of the Northern
Mariana Islands (CNMI), Guam, and the U.S. Virgin
Islands; process and tabulate the collected data;
and disseminate data products to the public.
Changes Made Since Version 3.0 Operational Plan
Release: There have been no major changes to this
operation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• The contracts with the Island Areas’ local
governments need to stipulate the roles and
responsibilities of the census office managers,
the onsite Census Advisors, the officials of the
local governments, and the officials at Census
Bureau headquarters.
•• The IAC data collection operations and data
processing need to be more in line with stateside operations and data processing.
2020 Census Operational Plan—Version 4.0 149
•• The planning phase of the IAC should involve
data processing staff who can help create testing strategies.
Operational Innovations
Use of enterprise solutions for processing data,
creating data products, and disseminating the
information to the public.
Description of Operation
The Census Bureau will conduct the 2020 Island
Areas censuses through contract agreements with
local government agencies in American Samoa,
CNMI, Guam, and the U.S. Virgin Islands. The
Census Bureau will provide the materials and guidance to the local government agencies that are
then responsible for recruiting and hiring the staff
to conduct the data collection activities. The data
collection activities include (but are not limited to):
•• Address listing.
•• Enumeration.
•• Field follow-up.
Research Completed
Detailed planning for this operation is integrated
with many of the stateside operations. This collaboration leverages the results obtained through
the stateside operations’ research and testing
activities.
Decisions Made
The following decisions have been made for this
operation:
✓✓ Continuously engage and communicate the Census Bureau's plans with liaisons in the local Island Areas' governments and with the Office of Insular Affairs in the Department of the Interior.
✓✓ Establish agreements with the local Island Areas' governments to conduct the census data collection.
✓✓ Establish five Census Offices: two in the U.S. Virgin Islands and one in each of the Pacific Island Areas.
✓✓ Use a "long-form like" questionnaire.
✓✓ Use the American Community Survey (ACS) questionnaire with minor wording changes to accommodate time reference differences, and incorporate the final 2020 Census questions, taking into account Island Area local government concerns where possible.
✓✓ Due to funding uncertainty and reprioritization of critical components of the 2020 Census, the Census Bureau is no longer planning to produce a Master Address File of Island Areas' addresses prior to the 2020 Census. The Census Bureau will conduct an address listing operation instead.
✓✓ Use existing systems whenever possible; some modifications may be needed.
✓✓ Deploy Census Advisors in 2019 to provide guidance throughout the data collection process and to report back to Census Bureau headquarters—one advisor for each of the Pacific Island Areas (American Samoa, CNMI, and Guam), and two advisors for the U.S. Virgin Islands (one for St. Thomas and St. John, and one for St. Croix).
✓✓ Field enumerators will list addresses using paper address registers and paper maps, following the same listing procedures used in the Remote Alaska and Update Enumerate Operations. For every living quarter the enumerators visit, they will conduct interviews with household members and follow up as necessary.
✓✓ The IAC will use paper questionnaires, paper maps, and paper address registers.
✓✓ The Automated Tracking and Control System currently used by the NPC will be used as a control system in the Island Areas. The NPC will receive bulk shipments of completed data collection materials and use Integrated Computer-Assisted Data Entry to capture the data.
✓✓ The questionnaires for the IAC will align with ACS questionnaires with some modifications, such as the addition of questions on parents' place of birth, reasons for migration, sewage disposal, and source of water.

Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.

Cost and Quality
Investment in IAC is projected to have minimal
influence on the overall cost and quality of the
2020 Census.
Risks
Due to the increased use of digital advertising
and social media planned for the 2020 Census,
Island Areas residents are likely to be exposed to
stateside messaging about a shorter questionnaire
with Internet self-response and mailback
options. IF the 2020 Census media message for
stateside cannot be effectively counteracted,
THEN public trust and customer expectations may
be adversely affected in the Island Areas.
Milestones
June 2018: Finalize plans for the IAC operations.
June 2018: Determine which stateside systems will be used for the 2020 IAC operations.
September 2018: Release the IAC Detailed Operational Plan (delayed).
December 2018: Finalize contracts with the Island Areas governments.
March 2019: Identify sites for Census Offices.
September 2019: Open Census Offices.
February 2020: Begin address listing.
March 2020: Begin enumeration.
April 2020: Begin field follow-up.
July 2020: Close data collection.
5.8 TEST AND EVALUATION
The Test and Evaluation area performs two primary functions:
•• Evaluate the quality of the 2020 Census.
•• Prepare for the 2030 Census.
This area includes four operations:
•• Coverage Measurement Design and Estimation (CMDE): Designs the Post-Enumeration Survey (PES), including sampling and estimation.
•• Coverage Measurement Matching (CMM): Identifies matches and nonmatches between the 2020 Census and the PES for the enumerated housing units (HUs) and people.
•• Coverage Measurement Field Operations (CMFO): Collects person and HU information (independent from the 2020 Census operations) for the sample of HUs in the Coverage Measurement Survey.
•• Evaluations and Experiments (EAE): Measures the success of critical 2020 Census operations. Formulates and executes an experimentation program to support early planning and inform the transition and design of the 2030 Census.
Each operation is described below.
5.8.1 Coverage Measurement Design and
Estimation
Detailed Planning Status: Underway (DOP to be published in FY 2019)
Purpose
The Coverage Measurement Design and
Estimation (CMDE) Operation develops the survey
design and sample for the Post Enumeration
Survey (PES) of the 2020 Census. It also produces
estimates of census coverage based on the PES.
Changes Made Since Version 3.0 Operational Plan
Release: There have been no major changes to
this operation.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Simplify the sampling operations, the data collection, the matching operations, and the estimation by eliminating the creation and use of
block clusters, provided the basic collection unit
(BCU) concept is similar to the 2010 block cluster.
•• Follow best practices from the 2010 Census
Coverage Measurement operations where the
Census Bureau anticipated potential changes
in implementing the sample design, allowing
changes to sample design requirements to
be easily handled given the implementation
approach.
•• Use the Planning Database for designing the
PES sample.
Operational Innovations
The 2020 PES will use the BCU instead of the
block cluster as the primary sampling unit. This
will reduce the need for creating block clusters
and simplify operations.
The 2020 PES sample will be allocated based on
state-specific measures of size, instead of national
measures of size. This will improve the within-state
stratification and allocation.
The CMDE Operation is currently researching
methods to:
•• Improve coverage estimates for young
children and babies by using demographic
analysis results by age in the correlation bias
adjustment.
•• Improve the saliency and timeliness of estimates by researching the feasibility of releasing
coverage estimates in Fiscal Year 2021.
Description of Operation
The operational design of the 2020 PES will be
based on the 2010 Census Coverage Measurement
operational design.
The CMDE Operation performs the following
functions:
•• Develop the survey design for the PES.
•• Design and implement the sample to support
the estimation of coverage estimates in the
2020 Census for the United States and Puerto
Rico, excluding Remote Alaska.
•• Produce estimates of net coverage error and
the components of census coverage for housing units (HUs) and people living in HUs for
the United States and Puerto Rico, excluding
Remote Alaska.
•• Similar to the 2010 Census Coverage
Measurement approach, net coverage estimates
will be made using the capture-recapture,
dual-system estimation methodology.
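The capture-recapture (dual-system) approach named in the last bullet can be summarized, in its simplest Lincoln-Petersen form, as follows (generic symbols for illustration, not the operation's actual notation): with $C$ the census count of correctly enumerated units in an area, $P$ the independent PES count, and $M$ the number of units matched in both,

$$\hat{N}_{\mathrm{DSE}} = \frac{C \times P}{M}.$$

Since $M/P$ estimates the share of true units the census captured, dividing $C$ by that rate scales the census count up to an estimate of the true total.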
Research Completed
Research concluded that BCUs were a suitable
replacement for block clusters.
Research concluded that using American
Community Survey and Planning Database variables did not outperform stratification using 2010
tenure (owner vs. nonowner status) and BCU size.
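The tenure-by-size stratification finding above can be illustrated with a minimal sketch; the strata definitions, field names, and per-stratum counts here are hypothetical, not the actual PES design:

```python
# Illustrative stratified sampling of BCUs by 2010 tenure (owner vs.
# nonowner) and a size class, with a fixed take per stratum.
import random
from collections import defaultdict

def stratified_sample(bcus, per_stratum, seed=2020):
    """Sample up to `per_stratum` BCUs from each (tenure, size_class) stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for bcu in bcus:
        strata[(bcu["tenure"], bcu["size_class"])].append(bcu)
    sample = []
    for units in strata.values():
        k = min(per_stratum, len(units))
        sample.extend(rng.sample(units, k))
    return sample

# Hypothetical BCU frame: three distinct strata across four BCUs.
bcus = [
    {"id": i, "tenure": t, "size_class": s}
    for i, (t, s) in enumerate(
        [("owner", "small"), ("owner", "large"),
         ("nonowner", "small"), ("nonowner", "small")]
    )
]
picked = stratified_sample(bcus, per_stratum=1)
```

Sampling within strata guarantees every tenure-by-size group is represented, which is what stratification buys over a simple random draw from the whole frame.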
Assumptions Made
Based on the 2010 Census design and planning of
other operations for the 2020 Census, the following assumptions have been made:
•• Maintain the independence of the PES operations from the 2020 Census operations.
•• Continue to use Demographic Analysis as an
input to coverage measurement estimation, as
in the 2010 Census.
Decisions Made
The following decisions have been made for this
operation:
✓✓ The Census Bureau will estimate the net coverage error and the components of census coverage for HUs and people living in HUs. The components of census coverage will include correct enumerations, erroneous enumerations (which include census duplicates), whole-person imputations, and omissions.
✓✓ Based on funding uncertainty and reprioritization of critical components of the 2020 Census, the Census Bureau may experience a delay in releasing the estimates as compared to the original 2020 Census plan.
✓✓ First drafts of the sampling research results reports were completed by September 30, 2016. These reports document the findings for the research questions outlined in the "2020 Coverage Measurement: Sample Design Research Plan." The "2020 Coverage Measurement: Sample Design" memo has been drafted. This memo documents the sample design for the 2020 Coverage Measurement and describes the methodology that will be used to select the sample. The design recommended for 2020 is similar to that used in 2010.
✓✓ The Census Bureau will produce estimates for the United States (including Washington, DC) and Puerto Rico by major demographic subgroups and by specified census operations. Other domains are being considered.
✓✓ The Census Bureau will implement processes and procedures as they were done in the 2010 Census.
✓✓ The systems will undergo standard testing prior to the 2020 Census operations.
✓✓ The Census Bureau cannot determine the effect of design changes from the 2010 Census Coverage Measurement Survey on the estimates until the estimation systems are in place. Moreover, running the 2010 data through the 2020 systems would be burdensome, because the 2020 software would have to be modified to read in and account for different variables in the 2010 data. Given the additional resources this would require, the Census Bureau will not be able to determine the effects of potential operational and system changes on the estimates.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in CMDE is projected to have minimal
influence on the overall cost and quality of the
2020 Census.
Risks
The CMDE Operation was descoped from the 2018
End-to-End Census Test, and funding and resources
for the 2020 Census CMDE Operation are uncertain. IF sufficient resources are not provided
for the 2020 Census CMDE Operation, THEN all
expected innovations for CMDE may not be fully
implemented for the 2020 Census.
To meet the release date for coverage estimates,
the CMDE Integrated Project Team needs software
development and production within a specified
timeframe. It is possible that resources will be
pulled from the PES to work on other decennial-related projects. IF the PES developers are
pulled off development or not available to implement change requests during production, THEN
coverage estimates may be delayed.
Ratios from demographic analysis are used to
reduce correlation bias in the dual system estimates. This requires matching tabulations from
demographic analysis, the 2020 Census, and PES
by age-race-sex. Differences in the reporting and
classification of race in these systems can add
measurement error to the correlation bias adjustment. For 2020, this would relate to the Black vs.
non-Black tabulations for age groups. Changes
in the census questions for 2020, especially
with regard to Hispanic origin, could reduce the
integrity of the correlation bias adjustment. IF the
difference between the demographic analysis and
2020 Census race classifications is large enough,
THEN the PES may not be able to accurately correct for correlation bias within race.
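To make the mechanism above concrete, one common form of a sex-ratio correlation bias adjustment is sketched below; this is a generic illustration, not the operation's specified method. Let $\mathrm{DA}_m$ and $\mathrm{DA}_f$ be the demographic analysis counts of males and females in an age-race group, and $\hat{N}_f$ the female dual-system estimate. The adjusted male estimate is

$$\hat{N}_m^{\mathrm{adj}} = \frac{\mathrm{DA}_m}{\mathrm{DA}_f}\,\hat{N}_f,$$

so the adjusted estimates reproduce the DA sex ratio. Any mismatch between the DA and census race classifications therefore feeds directly into this ratio and the resulting adjustment.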
Any changes to the PES design will likely require
changes to the sampling and estimation. IF
changes are made to the PES design any time
after specifications and requirements have
been started, THEN estimates may be delayed,
staff morale may be reduced, and additional
development resources will be needed to rewrite
the specifications and software.
Milestones
January 2016: Start CMDE.
March 2019: Release the CMDE Detailed Operational Plan.
June–July 2019: Select PES BCUs.
April 2020: Conduct small BCU subsampling.
May–August 2020: Identify PES Person Interview sample.
June 2021: Release National Net Person Coverage Error and National Components of Person Coverage Estimates.
October 2021: Release National Net Housing Unit Coverage Error and National Components of Housing Unit Coverage Estimates.
October 2021: Release State Results of Net Error and Components of Coverage for People and Housing Units.
5.8.2 Coverage Measurement Matching
Detailed Planning Status: Underway (DOP to be published in FY 2019)
Purpose
The Coverage Measurement Matching (CMM)
Operation identifies matches, nonmatches, and
discrepancies between the 2020 Census and the
Post-Enumeration Survey (PES) for both housing
units (HUs) and people in the sample areas. Both
computer and clerical components of matching
are conducted.
Changes Made Since Version 3.0 Operational
Plan Release: The Clerical Match and Map Update
system will be developed for the Housing Unit
Clerical Matching System.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Simplify the Coverage Measurement clerical
matching tasks.
•• Rely more on the automated matching systems
than the clerical matchers.
•• Move HU matching and follow-up operations
closer to the listing operation.
•• Automate the assignment of status codes and
address information where possible.
Operational Innovations
Person computer matching will use
telephone numbers from administrative records
for census records in the sample areas when no
telephone number was reported in the census. As
a result, the use of the updated telephone numbers could improve computer match rates, thereby
reducing the need for clerical matching and
potential HU follow-up operations. Also, to simplify the PES clerical matching tasks, the Census
Bureau will reengineer the business process to
improve the efficiency of the analyst by relying
more on the automation of the clerical matching
system.
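A minimal sketch of the phone-number backfill idea described above; the record layout, field names, and address-ID key are hypothetical, not the actual census or administrative-records systems:

```python
# Before computer matching, fill in missing telephone numbers on census
# records from an administrative-records lookup keyed by address ID.
def backfill_phones(census_records, admin_phones):
    """Return records with missing 'phone' values filled from admin data."""
    filled = []
    for rec in census_records:
        if not rec.get("phone"):
            # Copy the record so the original census data stays untouched.
            rec = {**rec, "phone": admin_phones.get(rec["address_id"])}
        filled.append(rec)
    return filled

records = [
    {"address_id": "A1", "phone": "555-0100"},
    {"address_id": "A2", "phone": None},  # missing in the census response
]
admin = {"A2": "555-0199"}
updated = backfill_phones(records, admin)
```

With more records carrying usable phone numbers, more cases can be resolved by computer matching, which is exactly the reduction in clerical matching and follow-up workload the innovation anticipates.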
Description of Operation
The PES design has three phases: Initial Housing
Unit Phase, Person Phase, and Final Housing Unit
Phase. In each phase, PES data that are independently collected are compared to census
collected data to check the accuracy and completeness of the census data. Within the three
phases, the matching operation contains 11
suboperations:
•• Initial Housing Unit Computer Matching:
The file of addresses developed during PES
Independent Listing in each basic collection
unit (BCU) is computer-matched against the
census HU and group quarters addresses in
the Census HU Matching Universe files in the
same BCU plus one ring of surrounding BCUs.
Addresses are assigned a match, possible
match, possible duplicate, or nonmatch code
for clerical matching.
•• Initial Housing Unit Before Follow-Up Clerical
Matching: The National Processing Center
(NPC) technicians and analysts use clerical
matching techniques, along with PES and census maps, to verify the computer matching and
attempt to locate matches, possible matches,
or possible duplicates that the computer could
not. Since analysts perform Quality Assurance
(QA) on technicians’ work, they need to acquire
the level of expertise required in the QA model
that will only be gained through extensive training and experience.
•• Initial Housing Unit After Follow-Up Clerical
Matching: The NPC technicians and analysts
will use the results of the Initial Housing Unit
Follow-Up to resolve all remaining cases. Since
analysts perform QA on technicians’ work, they
need to acquire the level of expertise required
in the QA model that will only be gained
through extensive training and experience.
•• Clerical Geocoding: Clerical matching staff will
assign census geography to alternate addresses
collected in the personal interview to determine
search areas for matching.
•• Residence Status Coding: Clerical matching staff
will assign a code that indicates where each
person should be counted in the census. The
software also displays maps of the locations
(mapspots) assigned to addresses in the sample
area by 2020 Census Operations versus the PES.
•• Person Computer Matching: The rosters of people collected during the personal interview are
computer matched against the census rosters
collected across the nation and Puerto Rico.
The computer also conducts a search for duplicates in the personal interview rosters collected
for each BCU and in the census rosters across
the nation and Puerto Rico. Using these results,
people are assigned match codes (match, possible match, possible duplicate, or nonmatch)
for clerical matching.
•• Person Before Follow-Up Clerical Matching:
The NPC technicians and analysts use clerical
matching techniques, along with maps, to
geocode alternate addresses, assign residence
status codes, and find matches, possible
matches, duplicates, and possible duplicates
that the computer could not.
•• Person After Follow-Up Clerical Matching:
The NPC technicians and analysts will use the
results of the Person Follow-Up to resolve all
remaining cases.
•• Final Housing Unit Computer Matching and
Processing: The computer will process and
match the late adds and deletes from the census
listing of HU and GQ addresses in the Census HU
Matching Universe files. Addresses are assigned
a match, possible match, possible duplicate, or
nonmatch code for clerical matching.
•• Final Housing Unit Before Follow-Up Clerical
Matching: The NPC technicians and analysts
use clerical matching techniques, along with
PES and census maps, to verify the computer
matching, attempt to locate matches, possible
matches or possible duplicates that the computer could not, and reconcile any resulting discrepancies from the census adds and deletes.
•• Final Housing Unit After Follow-Up Clerical
Matching: The NPC technicians and analysts
will use the results of the Final Housing Unit
Follow-Up to resolve all remaining cases.
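As a rough illustration of how computer matching might assign the four codes used throughout the suboperations above (match, possible match, possible duplicate, nonmatch), the sketch below scores string similarity between one PES address and the census addresses in its search area. The scoring function and thresholds are hypothetical, not the Census Bureau's actual matching system:

```python
from difflib import SequenceMatcher

# Hypothetical cutoffs; the real system's rules are far richer.
MATCH, POSSIBLE = 0.9, 0.6

def similarity(a: str, b: str) -> float:
    """Crude string similarity; real record linkage uses many more fields."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_code(pes_addr: str, census_addrs: list) -> str:
    """Assign a PES-style match code for one address against its search area."""
    scores = sorted((similarity(pes_addr, c) for c in census_addrs), reverse=True)
    if not scores or scores[0] < POSSIBLE:
        return "nonmatch"
    if scores[0] >= MATCH:
        # Two strong candidates suggest a possible duplicate on the census list.
        if len(scores) > 1 and scores[1] >= MATCH:
            return "possible duplicate"
        return "match"
    return "possible match"
```

Cases coded possible match or possible duplicate would then flow to clerical review, mirroring the hand-off from computer matching to the clerical matching suboperations described above.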
Research Completed
Research was undertaken to determine if the
Initial Housing Unit Follow-Up and Final Housing
Unit Follow-Up Operations are needed. A decision
was made to conduct both Initial Housing Unit
Follow-Up and Final Housing Unit Follow-Up.
Decisions Made
The following decisions have been made for this
operation:
✓✓ The systems will undergo standard testing prior to the 2020 Census operations.
✓✓ A contract was awarded on September 30, 2016, to use the "person" Matching, Coding, and Review system to assist Coverage Measurement clerical matchers in matching the PES to census HUs and people living in HUs.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in CMM is projected to have minimal
influence on the overall cost and quality of the
2020 Census.
Risks
Development of the PES Housing Unit Clerical
Matching System has not yet started. IF
the development and testing of the PES Housing
Unit Clerical Matching System is not completed in
time for training and production, THEN this could
create delays in matching production for the 2020
PES.
Milestones
March 2019: Release the CMM Detailed Operational Plan.
April 2020: Conduct Initial Housing Unit Computer Matching.
April–July 2020: Conduct Initial Housing Unit Clerical Matching.
December 2020: Conduct Person Computer Matching.
January–April 2021: Conduct Person Clerical Matching.
March–April 2021: Conduct Final Housing Unit Computer Processing and Matching.
April–July 2021: Conduct Final Housing Unit Clerical Matching.
5.8.3 Coverage Measurement Field
Operations
Detailed Planning Status: Underway (DOP to be published in FY 2019)
Purpose
The Coverage Measurement Field Operation
(CMFO) collects person and housing unit (HU)
information (independent from 2020 Census
operations) for the sample of Post-Enumeration
Survey (PES) HUs in selected basic collection
units (BCUs). The PES collects the same data
as the 2020 Census for both HUs and persons.
Additional information is collected by PES to provide estimates of census net coverage error and
components of census coverage for the United
States and Puerto Rico, excluding Remote Alaska.
Changes Made Since Version 3.0 Operational
Plan Release: Independent Listing will use the
Listing and Mapping Application automated listing
instrument. Person Interview will use the BLAISE
instrument. Initial Housing Unit Followup, Person
Followup, and Final Housing Unit Followup will
use paper maps.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations were made:
•• Automate all Coverage Measurement data collection instruments.
•• To ensure more accurate data, minimize the
time lag between the follow-up operations
where beneficial.
•• Consider including an early telephone phase
prior to personal visit for the Person Interview.
Operational Innovations
Operational innovations include the following:
The initial plan for 2020 Census PES field data
collection was for full automation of the five PES
collection activities. However, due to funding
uncertainty and reprioritization of critical components of the 2020 Census, only two of the five PES
data collection activities (Independent Listing and
Person Interview) will use automated data collection instruments. The other three operations will
use paper questionnaires.
Description of Operation
This operation collects person and HU information for the sample of PES HUs. The Coverage
Measurement Program for the 2020 Census will
follow the design of the 2010 Census Coverage
Measurement Program with some minor differences discussed in Coverage Measurement Design
and Estimation Operation. Accordingly, this operation includes the following five PES field data
collection suboperations:
•• Independent Listing: In this operation, listers
walk all areas of the sample BCUs and list all
the HUs in the sample area from scratch, that is,
without using Master Address File information.
This is an independent listing. Listers knock on
all HUs to inquire whether there is more than one
HU at the address (such as a basement or garage
apartment); if so, these are listed separately.
•• Initial Housing Unit Followup: Following
Independent Listing, the Coverage
Measurement Matching (CMM) Operation
matches the list of PES HU addresses collected
during PES Independent Listing to the initial
census list of addresses in the same sample
areas to identify matches, possible matches,
duplicates, and possible duplicates and nonmatches between the two lists. The cases
(addresses) that are identified as nonmatches,
possible matches, or possible duplicates are
sent back to the field for an Initial Housing Unit
Followup interview. Additional clerical matching
is conducted in CMM using the results of the
Initial Housing Unit Followup.
•• Person Interview: Collects person information for the PES sample HUs by performing
in-person interviews using a computer-assisted
data collection instrument. The enumerators
collect data similar to that collected in the
2020 Census, as well as additional data about
people in the household to determine if any of
these people may have been counted at other
addresses on Census Day.
•• Person Followup: Following Person Interview,
the CMM Operation matches the list of PES HU
people collected during PES Person Interview
to the list of people in the census in the same
sample areas to identify matches, possible
matches, duplicates, and possible duplicates
and nonmatches between the two lists. The
nonmatched persons (those in only one list)
and those identified as possible matches or
possible duplicates are sent back to the field for
the Person Follow-up interview to obtain additional information. The collected information is
used in the CMM Person After Follow-up clerical
matching operation to resolve the cases, and
the results are used in the estimation of person
coverage.
•• Final Housing Unit Followup: After completion of census operations, CMM matches the
updated list of census addresses to the PES
list of addresses to identify matches, possible matches, duplicates, and possible duplicates and nonmatches between the two lists.
Unresolved cases are sent back to the field
to conduct the Final Housing Unit Followup
Interview. The collected information is used in
the CMM Final Housing Unit Matching, where
clerical matchers try to resolve remaining
matching, duplication, and HU status issues.
The results of Final Housing Unit Matching are
then used in the estimation of HU coverage.
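The list-matching steps described in the bullets above can be sketched in code. The snippet below is a deliberately simplified, hypothetical illustration of classifying one address list against another; the actual CMM Operation uses probabilistic matching plus clerical review, and the function name and sample data here are invented for illustration only.

```python
# Hypothetical sketch of the list-matching step described above: compare a
# PES address list with the initial census address list for the same sample
# area and classify each PES address. Nonmatches and possible duplicates
# would be sent to the field for a followup interview. Real CMM matching is
# probabilistic and clerical, and also works in both directions; this is a
# one-directional toy.
from collections import Counter

def classify_addresses(pes_list, census_list):
    """Classify each PES address against the census list after light normalization."""
    census_counts = Counter(a.strip().lower() for a in census_list)
    results = {}
    for addr in pes_list:
        key = addr.strip().lower()
        if census_counts[key] == 1:
            results[addr] = "match"
        elif census_counts[key] > 1:
            # The same address appears more than once on the census list.
            results[addr] = "possible duplicate"   # field followup
        else:
            results[addr] = "nonmatch"             # field followup
    return results

pes = ["101 Main St", "102 Main St", "5 Oak Ave"]
census = ["101 main st", "102 Main St", "102 MAIN ST"]
print(classify_addresses(pes, census))
# {'101 Main St': 'match', '102 Main St': 'possible duplicate', '5 Oak Ave': 'nonmatch'}
```

In practice the "possible match" category arises from near-but-not-exact agreement (for example, unit number differences), which exact string comparison cannot express; that is one reason clerical matching follows the automated pass.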
Research Completed
The CMFO will leverage research conducted to
support other field operations such as In-Field
Address Canvassing and Nonresponse Followup
(NRFU) Operations.
Decisions Made

The following decisions have been made for this operation:

99 The Coverage Measurement data collection field staff will be issued laptops to record time and expense data.

99 There will be no additional telephone operation prior to the Coverage Measurement Person Interview.

99 The systems will undergo standard testing prior to the 2020 Census operations.

Assumptions Made

Based on planning of other operations, the following assumptions have been made:

•• Directorate and enterprise automation processes will be leveraged whenever possible.

•• Operational independence must be maintained between PES and 2020 Census data collection operations.

Design Issues to Be Resolved

There are no remaining design issues to be resolved for this operation.

Cost and Quality

Investment in CMFO is projected to have minimal influence on the overall cost and quality of the 2020 Census.

Risks

CMFO was descoped from the 2018 End-to-End Census Test, and there are no plans in place to test collection and processing of real-world data before CMFO goes live. IF CMFO does not participate in a field test prior to the start of the operation, THEN the operation may encounter unforeseen operational issues, potentially increasing cost and reducing data quality.

PES Person Interview operations cannot start in the field until NRFU operations are complete in a BCU due to potential bias and contamination. IF NRFU operations are delayed or extended, THEN this will impact the timing of the PES Person Interview and later PES operations.

Milestones

Date                  Activity
March 2019            Release the CMFO Detailed Operational Plan.
January–March 2020    Conduct PES Independent Listing and Quality Control.
May–June 2020         Conduct Initial Housing Unit Followup and Quality Control.
June–September 2020   Conduct PES Person Interview and Reinterview.
February–March 2021   Conduct PES Person Followup and Reinterview.
May–June 2021         Conduct Final Housing Unit Followup and Quality Control.
5.8.4 Evaluations and Experiments
Detailed Planning
Status:
Underway
DOP to be published in FY 2019
Purpose
The 2020 Census Evaluations and Experiments
(EAE) Operation covers operational assessments
that document how well the 2020 Census was conducted; evaluations that analyze, interpret, and
synthesize the effectiveness of census components and their impact on data quality, coverage,
or both; and experiments that identify potential
designs for early 2030 Census life cycle research
and testing. Experiments are quantitative or qualitative studies that must occur during a decennial
census in order to have meaningful results to
inform planning for future decennial censuses. In
general, experiments involve comparing responses under test treatments (new or modified methods or procedures) against 2020 Census production methods or procedures.
The EAE Operation performs the following
functions:
•• Assesses the 2020 Census operations through Demographic Analysis, operational assessments, and evaluations.
•• Formulates a 2020 Census experimental program that will further refine 2030 Census operational design options.
•• Captures and manages knowledge stemming
from decennial research recommendations.
•• Contributes to the formulation of the 2030
Census Research and Testing phase objectives.
•• Develops a transition plan and appropriate
organizational structures to establish 2030
Census life cycle planning.
•• Initiates other early planning activities for the
2030 Census, including the monitoring of policy
concerns and technological, societal, and public
cooperation trends.
•• Produces an independent assessment of population and housing unit coverage.
Changes Made Since Version 3.0 Operational Plan
Release:
The Decennial Research Objectives and Methods
(DROM) working group completed an assessment of each proposal using predefined criteria,
and recommended a final scope to 2020 Census
Executive Leadership. In May 2018, the scope of
2020 Census evaluations and experiments was formally baselined by the DROM working group and
the 2020 Census Executive Steering Committee.
Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations associated with the development and
management of the 2020 Census EAE Operation
were made:
•• Deployment of a Knowledge Management database to capture and track 2010 Census recommendations, recommendations from oversight
bodies, and early 2020 Census research and
testing results would be valuable for connecting past experiences and research to future
research and planning objectives.
•• Dedicated resources are needed earlier in the
2020 Census life cycle to initiate 2030 Census
life cycle planning efforts to enable a smooth
transition from the 2020 Census implementation to the 2030 Census research.
Operational Innovations
At its core, the scope of the 2020 Census EAE
Operation will focus on aspects of the 2020 Census
design that could lead to 2030 Census innovations.
As the 2020 Census operational design solidifies,
the EAE operational process will define the 2020
Census EAE Operation, identify data requirements, and document methods to address research
objectives.
To date, opportunities to innovate, as documented
below, focus primarily on aspects of the planning
and scope definition process. These opportunities
to innovate include the following:
•• Implementing a Knowledge Management system and application for the Decennial Census
Programs Directorate.
•• Formulating 2020 Census evaluations and
experiments that are more formally guided by
the decisions on the 2020 Census operational
design and the 2030 Census planning and
objectives.
•• Evaluating how administrative records can
be better used or combined with other data
sources to improve the Demographic Analysis
estimates by age and sex, and to estimate or
expand the race and Hispanic origin categories
for which the Demographic Analysis estimates
are produced.
•• Formulating fiscal years 2022–2024 research
and testing objectives that are more formally
guided by 2030 planning and objectives.
•• Formulating 2030 Census life cycle budget
simulations that are more formally aligned with
strategic planning and research objectives.
Description of Operation
The 2020 Census EAE Operation is unlike other
2020 Census operations in that, at its start, the
Census Bureau followed a process to establish
and reach consensus on the set of evaluations
and experiments to be conducted as part of the
2020 Census Program. In May 2018, the 2020 EAE
scope was approved to move forward.
The details that follow address various aspects of
the planning process more so than the detailed
scope of the 2020 Census evaluations and experiments themselves. The detailed scope of evaluations and experiments will result from the 2020
Census evaluations and experiments formulation
process. The initial planning, formation of governing bodies, solicitation of input, and the agreement on scope of the 2020 Census EAE Operation
are dependent upon funding.
Phases of the EAE Operation include the formulation of research projects; the delivery of
requirements to 2020 systems, operations, and
independent data collections; implementation of
evaluation and experiment activities; data analysis;
the publishing of results; and the identification of
2022 to 2024 research and testing objectives.
To initiate the formulation of the 2020 Census
EAE Operation, an understanding of the 2020
Census operational design is necessary. In general, the scope for the 2020 Census operations
sets the landscape for identifying evaluations.
Some aspects of the 2020 Census design options,
deemed out-of-scope, provide the initial canvas
for potential experiments. The formulation phase
involves:
•• Executive staff guidance on strategic principles
and high-level research targets.
•• Feedback from internal program managers,
operational subject-matter experts, and senior
researchers/methodologists.
•• Feedback from oversight groups, advisory
committees, the international collaboration
consortium, the National Academies of Science,
and other external experts.
•• Recommendations from census research
and testing as captured in the Knowledge
Management database.
•• Establishment of parameters (e.g., cost, quality,
risks, and visibility) and criteria for selecting
evaluations and experiment proposals.
•• Management of the scope of the 2020 Census
program for evaluations and experiments.
The conduct and coordination of the phases and
activities that follow program formulation will be
described in future versions of the operational
plan.
In addition, the Demographic Analysis Program is
included within the scope of the EAE Operation.
Demographic analysis refers to a set of methods
that have historically been used to develop
national-level estimates of the population for
comparison with decennial census counts.
Demographic Analysis estimates are developed
from historical vital statistics, estimates of international migration, and other sources that are essentially independent of the census. The estimates
are then compared with the census counts by
age, sex, and limited race and/or ethnicity groups
to evaluate net coverage error in the census. The
EAE Operation will also sponsor the derivation
of housing unit estimates for comparison to the
decennial frame used for the 2020 Census.
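The comparison described above reduces to simple arithmetic. The sketch below uses one common convention, the census count minus the Demographic Analysis (DA) estimate as a percentage of the DA estimate, so a negative value indicates a net undercount; the published DA products define their measures precisely, and every figure below is invented for illustration.

```python
# Illustrative arithmetic for evaluating net coverage error by comparing
# census counts with Demographic Analysis estimates. Convention assumed
# here: (census - DA) / DA * 100; negative = net undercount. Figures are
# made up for illustration, not real data.
def net_coverage_error(census_count, da_estimate):
    """Percent net coverage error; negative means a net undercount."""
    return 100.0 * (census_count - da_estimate) / da_estimate

# Hypothetical counts for two age groups (not real data).
groups = {"0-4": (19_900_000, 20_200_000), "65+": (40_400_000, 40_000_000)}
for group, (census, da) in groups.items():
    print(f"{group}: {net_coverage_error(census, da):+.2f}%")
# 0-4: -1.49%
# 65+: +1.00%
```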
Research Completed
While the ultimate set of 2020 Census evaluations
and experiments is considered research, the process for reaching agreement on the scope of the
evaluations and experiments and the underlying
governance of the formulation process are not
considered research. As such, no research occurs
at this stage in the EAE Operation.
Decisions Made
99 The 2030 guiding principles were applied to the
solicitation, proposal, and selection processes
to formulate the scope of 2020 Census evaluations and experiments. In April 2018, the DROM
working group recommended the final scope of
the research program. In May 2018, the program
scope was approved by the Executive Steering
Committee. The summary of the 13 approved
research projects was released through the
2020 Census Memorandum Series on July 6,
2018 and is located at .
99 A core set of strategic questions and assumptions guided the formulation of the evaluations and experiments for the 2020 Census, including such factors as whether an evaluation or experiment perfects or improves on the innovations of the 2020 Census operational plan, or explores the possibility of eliminating decennial operations.
99 Criteria and considerations for assessing
proposed 2020 Census Evaluations and
Experiments have been defined. Criteria will
include cost, quality, new to census, feasibility,
attainment, risk to census, burden, etc.
Considerations will include such things as sensitivity, traceability, and whether the scope has
benefit to the enterprise.
99 It has been determined that the best use of
administrative records in the production of the
demographic analysis estimates by age and
sex and expanded race categories is through
the same component-based historical model
used in previous decades. Data continue to
be available to allow for the production of
demographic analysis estimates for the Black/
non-Black race categories for all ages as in
past years. Currently available data support
the expansion of the estimates for the Black
Alone or in Combination and Not Black Alone
or in Combination race categories to include
ages zero through 39. Data are also available to
expand the demographic analysis estimates by
Hispanic origin to include ages zero through 29.
99 It has also been determined that data are available to support the production of estimates for
the Asian and Pacific Islander population ages
zero through 29 on an experimental basis as
part of the 2020 demographic analysis effort.
In addition to the data previously used in
demographic analysis (vital statistics, Medicare
records, American Community Survey data),
a legal permanent resident file maintained by
the Office of Immigration Statistics and Internal
Revenue Service tax return data may also be
used to assess the uncertainty of the demographic analysis estimates.
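The criteria-based assessment of proposals described in the decisions above can be illustrated with a toy weighted-scoring sketch. The weights, rating scales, and proposal names below are entirely invented; the plan does not publish the DROM working group's actual scoring method.

```python
# Hypothetical illustration of scoring EAE proposals against predefined
# criteria such as cost, quality, and feasibility. All weights, scales,
# and proposals are invented for illustration.
CRITERIA_WEIGHTS = {"cost": 0.3, "quality": 0.4, "feasibility": 0.3}

def score_proposal(ratings):
    """Weighted sum of 1-5 ratings over the criteria."""
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items())

proposals = {
    "Evaluation A": {"cost": 4, "quality": 5, "feasibility": 3},
    "Experiment B": {"cost": 2, "quality": 4, "feasibility": 5},
}
# Rank proposals from highest to lowest weighted score.
ranked = sorted(proposals, key=lambda p: score_proposal(proposals[p]), reverse=True)
print(ranked)
```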
Design Issues to Be Resolved

There are no remaining design issues to be resolved for this operation.

Cost and Quality

Investment in EAE is projected to have minimal influence on the overall cost and quality of the 2020 Census.

Risks

Yearly budget formulation for the EAE Operation reflects an expectation for a robust 2020 Census research agenda to evaluate 2020 innovation areas and to experiment with future design alternatives. IF the budget operating plans for the EAE Operation continue to be significantly less than what was formulated, THEN critical research questions and hypotheses will not be studied, which will greatly minimize meaningful examination of the 2020 Census and plans to inform research and testing leading toward 2030.

The U.S. population continues to become more racially and ethnically diverse. A challenge for the Demographic Analysis Program is that the vital records, which are the core of the demographic analysis method, have limited information on race and Hispanic origin. IF the Demographic Analysis Program does not research how to expand the race and Hispanic origin detail of the estimates, THEN the 2020 demographic analysis estimates will not reflect the characteristics of the U.S. population and will not provide a useful evaluation of the 2020 Census.

Milestones

Date                  Activity
June 2017             •• Finalize content requirements and instructions for submitting 2020 Census EAE proposals.
                      •• Finalize plan for soliciting 2020 Census EAE proposals.
June–July 2017        Formally solicit 2020 Census EAE proposals within the Census Bureau.
August–January 2018   DROM working group analyze and prioritize 2020 Census EAE proposals by groupings.
January 2018          DROM working group call for round 2 revised 2020 Census EAE proposals.
February–March 2018   If requested, DROM holds special sessions with proposers on the grouping topic.
February–April 2018   DROM working group analyze and prioritize round 2 2020 Census EAE proposals—Final Recommendation of EAE Scope.
March 2018            DROM receives final research proposals.
March–April 2018      2020 Census EAE research leads meet with Decennial Architecture Requirements teams to assess 2020 Census systems’ support.
April–May 2018        Participating divisions and offices validate needed resources and identify 2020 Census EAE research leads.
May 2018              DROM working group presents for approval the recommended program-level scope of 2020 Census EAE to the 2020 Executive Steering Committee—Final Approval of EAE Scope.
May–June 2018         Obtain 2020 Census Memorandum Clearance to baseline program-level research plans for 2020 Census EAE.
June 2018             •• 2020 Census EAE research leads begin developing detailed study plans.
                      •• Finalize cost impacts involving 2020 Census systems and operations.
                      •• Demographic Analysis Conference.
December 2018         Release the EAE Detailed Operational Plan (delayed).
January 2019          Begin receiving Office of Management and Budget clearances for 2020 Census Evaluations.
October 2019          Begin receiving Office of Management and Budget clearances for 2020 Census Experiments.
March 2023            Begin issuing results for 2020 Census Experiments.
July 2023             Begin issuing results for 2020 Census Evaluations.
5.9 INFRASTRUCTURE
The following four operations support the infrastructure of the 2020 Census:

•• Decennial Service Center (DSC): Supports 2020 Census field operations for decennial staff.

•• Field Infrastructure (FLDI): Provides the administrative infrastructure for data collection operations covering the 50 states, the District of Columbia, and Puerto Rico.

•• Decennial Logistics Management (DLM): Coordinates space acquisition for and lease management of the regional census centers (RCCs), area census offices (ACOs), and the Puerto Rico Area Office (PRAO) and provides logistics support services.

•• IT Infrastructure (ITIN): Provides the ITIN to support the 2020 Census, including enterprise systems and applications, 2020 Census-specific applications, field ITIN, and mobile computing.

Each operation is described below.

5.9.1 Decennial Service Center

Detailed Planning
Status:
In Production
DOP published in FY 2018

Purpose

The Decennial Service Center (DSC) will support 2020 Census field operations for decennial staff (i.e., Headquarters, Paper Data Capture (PDC) Operation, Regional Census Center (RCC), Area Census Office (ACO), Island Area Censuses Operation, remote workers, and listers/enumerators).

Changes Made Since Version 3.0 Operational Plan Release: There have been no major changes to this operation.

Lessons Learned

Based on lessons learned from the 2014, 2015, and 2016 Census Tests, the following recommendations were made:

•• Implementing the DSC during annual census tests provides insight into potential issues that may arise during full 2020 Census operations.

•• Including service center staff in user acceptance testing helps them gain a better understanding of possible issues that may occur in the field.

•• Secure funding for DSC staff support from the beginning of testing to enhance knowledge transfer, training, and appropriate scaling of resources throughout the decennial operation.

Operational Innovations

Operational innovations include the following:

•• A centralized service center system provides call, incident, and service management systems supporting decentralized service center technicians (e.g., technicians based in an ACO answering any call to the DSC).

•• Online training is provided for service center technicians as opposed to classroom training. Online training is more accessible and less time-consuming than classroom training.
•• Cloud technology will support the centralized
service center system for call and incident
management.
Description of Operation
The overall goal of the 2020 Census DSC
Operation is the design and deployment of an
integrated service center, which will support
field operations and handle all help or service
requests initiated by decennial staff during the
2020 Census. Some of the services include the
following:
•• Application access issues.
•• Resolution of software and hardware issues.
•• Safety, security, and cyber incident
management.
•• Communications to and from field sites.
The DSC has three major areas:
•• Safety, security, and cyber incident management
ºº Provides nontechnical help desk services
for safety, security, and cyber incident data
entry for all 2020 Census operations.
•• Decennial support
ºº Provides technical help desk services for
2020 Census field operations.
•• PDC IT services
ºº Provides on-site technical help desk services
and systems administration for all 2020
Census PDC operations (PDC IT support and
systems administration).
Work Completed
The following research has been completed for
this operation:
•• Tested DSC use as part of the 2014, 2015, and
2016 Census Tests.
ºº Findings:
•• Changes to PIN and password configurations for enumerators have reduced the
number of calls expected for password
resets.
•• There was a lower-than-expected call volume for online training-related issues.
Decisions Made

The following decisions have been made for this operation:

99 The DSC will be limited to providing service center support for 2020 Census field staff with technical issues related to 2020 Census enterprise organization applications.

99 The DSC will provide support to field staff for the 2020 Census systems and applications.

99 The DSC will provide support for various types of mobile devices and mobile operating systems.

99 Automated training will increase call volume, and it will occur earlier in the schedule. This expected increase in call volume will require additional staff for a longer period of time to field additional calls. Telephone, Internet, Paper External Demand Model outputs have been developed. The model has been updated in several significant ways and will continue to be refined.

99 Based on the changes in the business process, the Census Bureau will no longer support Control Panel field procedures for enumerators. There is no impact to call volume. Field staff will be available during classroom training to assist with IT support.

99 The methods for contacting DSC will be through online submission and telephone.

99 The Census Bureau did not release a request for proposal but added a Technical Directive to the Technical Integrator contract for the Technical Support Remedy System in December 2017.

99 The Census Bureau is planning to use a centralized IT service manager and call manager solution. Staff will be located in the field offices and will access both systems. The Census Bureau is still working on the optimal staffing ratios since the online training schedule changed during the 2016 Address Canvassing Test, and additional information will be gathered during the 2018 End-to-End Census Test. The field offices will have tier 1 clerks for troubleshooting calls. The offices will have Wi-Fi access to the Internet only. There will be minimal impact to DSC since Wi-Fi will not be used for the workstations and phones.

Design Issues to Be Resolved

There are no remaining design issues that need to be resolved for this operation.
Cost and Quality
Investment in DSC is projected to have minimal
influence on the overall cost of the 2020 Census.
Impacts of this operation on overall 2020 Census
quality include the following:
ÏÏ Providing an efficient DSC Operation will enhance the quality of data collection by enumerators during the 2020 Census.
Risks
Finalized documentation for systems and applications needs to be provided to DSC by scheduled
deadlines in order to create training material and
sufficiently train DSC staff. IF finalized documentation for systems and applications requiring DSC support is not obtained by scheduled deadlines,
THEN preparation of training materials may not be
completed in time for scheduled training, which
would decrease the quality, efficiency, and effectiveness of DSC support.
The number of staff hired for the DSC will be
heavily based on the expected volume of calls
received. IF call volumes are not accurately forecast, THEN staffing levels for the DSC may be
inaccurate.
Adjustments to DSC staffing levels and roles
are based on the schedule and scope for the
2020 Census field operations. IF late or frequent
changes to the 2020 Census field operations
schedule or scope occur, THEN there may not be
sufficient time to hire and train additional DSC
staff as needed.
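The second risk above ties DSC staffing levels to forecast call volume. As a back-of-the-envelope sketch, staffing can be sized from a forecast as shown below; the handle time and utilization target are invented, and the Census Bureau's actual demand model is not described in this plan.

```python
# Toy illustration of sizing help desk staff from a call-volume forecast.
# The Bureau's actual demand model and staffing ratios are not given here;
# the handle time and utilization target below are invented.
import math

def agents_needed(calls_per_hour, avg_handle_minutes, utilization=0.85):
    """Minimum agents so the offered workload fits within a target utilization."""
    workload_hours = calls_per_hour * avg_handle_minutes / 60.0  # agent-hours per hour
    return math.ceil(workload_hours / utilization)

# 480 calls/hour at 6 minutes each is 48 agent-hours of work per hour.
print(agents_needed(480, 6))  # 57
```

An overestimate of call volume wastes staff; an underestimate lengthens wait times, which is exactly the forecasting risk the text describes.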
Milestones

Date             Activity
September 2015   Open DSC to support the 2016 Census Test.
September 2016   Start support for the 2017 Census Tests.
July 2017        Start support for the 2018 End-to-End Census Test.
December 2017    Release the DSC Detailed Operational Plan.
April 2018       Start support for the 2020 Census RCC.
January 2019     Start support for the 2020 Census Area Census Offices.
June 2021        Close the DSC.

5.9.2 Field Infrastructure

Detailed Planning
Status:
In Production
DOP published in FY 2018

Purpose

The Field Infrastructure (FLDI) Operation performs the following functions:

•• Provides the administrative infrastructure for data collection covering the 50 states, the District of Columbia, and Puerto Rico, including:
ºº Recruiting
ºº Hiring and onboarding
ºº Personnel and payroll administration
ºº Training
ºº Staffing
ºº Management and supervision
ºº Clerical support.

Changes Made Since Version 3.0 Operational Plan Release:
There have been no major changes to this operation.

Lessons Learned

Based on lessons learned from the 2010 Census studies and reviews, the following recommendations were made:

•• Streamline and automate the job application process to replace the paper-based recruitment and testing process.

Operational Innovations

Operational innovations include the following:
•• Streamlined field management structure using
automation and technology to manage the
Nonresponse Followup (NRFU) Operation
caseload.
•• Use of automation for the job application and
recruiting processes, payroll submission and
approval process, and other administrative processes to streamline personnel processes and
reduce staffing requirements and related costs.
•• Use of automation for training, including providing newly hired staff with electronic training
modules.
•• Use of a third-party vendor (3PV) to collect
fingerprints and potentially take pictures for
badging.
Description of Operation
FLDI includes:
•• Providing human resources and personnel
management support functions, including
recruiting, hiring and onboarding (i.e., suitability
and background checks), training, payroll, and
out-processing (i.e., separation management).
Research Completed
The following research has been completed for
this operation:
•• Review of other countries’ census field
infrastructure.
ºº Findings: Best practices include consolidation of support functions in the field, specifically payroll, recruiting, and other administrative functions.
•• Develop a new concept of operations for field
infrastructure and test in the 2015 Census Test.
ºº Findings: Field Staff Training:
•• Combination of online and classroom
training provided standardization of the
information, provided tracking capabilities, and offered various learning methods.
•• Reduced training hours from 32 to 20 compared with the 2010 Census NRFU enumerator training.
•• Deployment of online videos to provide
targeted training to enumerators quickly
and efficiently.
•• Identified topics requiring additional training in future tests.
ºº Findings: Field Reengineering.
•• Area Operations Support Center and its staffing successful.
•• Electronic payroll successful.
Decisions Made
The following decisions have been made for this
operation:
99 As of June 2018, the Regional Census Center
staffing model is as follows:
ºº General Management: one Regional Director
and one Deputy Regional Director.
ºº Field Operations: four Assistant Regional
Census Managers (ARCM), eight Area
Managers (average two per ARCM), and one
Quality Assurance Manager.
ºº Other Operations (Geography, IT, and
Space Leasing): one ARCM, one Geographic
Coordinator, one Supervisory IT Specialist,
and one Space Leasing Coordinator.
ºº Other Operations (Administrative and
Recruiting): one ARCM, one Administrative
Coordinator for National Finance Center
(NFC) staff, one Administrative Coordinator
for non-NFC staff, and one Recruiting
Coordinator.
ºº Partnership: one ARCM and multiple
Partnership Coordinators per region.
99 The Area Census Office (ACO) staff model is as
follows:
ºº General Management: one ACO Manager,
one Administrative Manager, one Recruiting
Manager, and one IT Manager.
ºº Data Collection: one Lead Census Field
Manager, multiple Census Field Managers,
Census Field Supervisors, and Enumerators;
specific numbers based on workload; supervisory ratios to be determined.
99 In-Field Address Canvassing will be managed
out of the ACOs.
99 Recruiting activities will be automated.
99 The job application and assessment (testing)
process will be automated.
99 Field staff training will employ the use of online
training capabilities.
99 The training pay rate will be lower than the production pay rate.
99 The time and expense recording and approval
process for data-collection field staff will be
automated for field operations.
99 Recruiting for staff out of the ACOs will be
conducted by recruiting assistants with help
from local partnership staff and through the use
of paid advertisement and earned media (news
reports, etc.). Recruiting of potential employees
will be conducted throughout the ACOs’ geographic areas, based on projected operational
workloads and staffing models developed for
2020 Census operations. New to the 2020
Census will be the use of the Recruiting and
Assessment application, which is part of the
Census Schedule A Human Resources and
Recruiting Payroll System (C-SHaRPS). For the
first time this decade, candidates will apply and
take a skills assessment online, as opposed to
attending recruiting sessions in person and taking a written test. Candidates will be selected
for employment based on the information
provided in their online application, the results
of the skills assessment, and other factors
depending upon the position for which they
apply. Selected candidates will be invited to be
fingerprinted and submit selected appointment
paperwork prior to attending classroom training. The candidates will be sworn in and hired
during the first day of training.
99 The USPS will not perform fingerprinting and
on-boarding functions for temporary field staff
selected during the 2018 End-to-End Census
Test or the 2020 Census.
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in FLDI is projected to influence
(reduce or increase) the 2020 Census overall
costs in the following ways:
ÐÐ Increased efficiencies due to automated
administrative functions, including recruiting,
onboarding, training, and payroll.
ÐÐ Increased cost savings due to reduced field
staffing.
Impacts of this operation on overall 2020 Census
quality include:
ÏÏ Fewer enumerator errors resulting from use of
automation to improve training methodology
and supervision capabilities using:
•• Automated Job Application and Employment
Assessment Testing.
•• Automated Personnel and Payroll
Administration (e.g., Time and Attendance
Submission).
Risks
Currently, the Emergency Notification System
(ENS) does not have employee work phone
numbers nor work mobile numbers. IF employees
do not have their work phone numbers nor work
mobile numbers in ENS, THEN they may not be
able to be reached during an emergency.
In the ENS, once the fields for employee work
phone and work mobile phone are ready, the data
will have to be entered. IF the employee work
phone number and work mobile number have to
be manually entered into ENS, THEN data entry
errors could cause some employees to not be able
to be reached during an emergency.
The 2020 Census Web site has a limitation: if it goes down, the only option is to put a temporary static page in its place until the site is back up. IF the 2020 Census Web site goes down, THEN the job application site will be unreachable, negatively affecting the recruitment process.
Milestones

Date             Activity
June 2017        Begin 2020 Census recruiting campaign and partnership programs.
December 2017    Release the FLDI Detailed Operational Plan, version 1.0.
December 2018    Release the FLDI Detailed Operational Plan, version 2.0 (delayed).
January 2019     Begin recruiting for address canvassing field staff.
July 2019        Begin early operations training.
September 2019   Begin recruiting for peak operations.
September 2019   Launch advertising campaign.
December 2019    Begin training for peak operations.
5.9.3 Decennial Logistics Management
Detailed Planning
Status:
Underway
DOP published in FY 2018

Purpose
The Decennial Logistics Management (DLM)
Operation performs the following functions:
•• Coordinates space acquisition for and lease
management of the Regional Census Centers
(RCCs), Area Census Offices (ACOs), and the
Puerto Rico Area Office (PRAO) in collaboration
with Field Division (FLD) and the General
Services Administration.
•• Provides logistics management support services
(e.g., kit assembly, supplies to field staff)
in collaboration with FLD and the National
Processing Center (NPC).

Changes Made Since Version 3.0 Operational Plan
Release:
There have been no major changes to this
operation.

Lessons Learned
Based on lessons learned from the 2010 Census
studies and reviews, the following recommendations
were made:
•• Establish an interagency working group to identify
and develop effective strategies for space
acquisition and build communication among
stakeholders.
•• Opening some field offices earlier than others
allowed for a "test" run of implementation in
the space acquisition effort and improved the
process for opening the remaining (majority) of
offices.

Operational Innovations
Operational innovations include the following:
•• Implemented an Integrated Logistics
Management System at NPC to facilitate better
inventory management of decennial supplies
and procurements.
•• Continue the belt-driven kit assembly line
process.
Description of Operation
The DLM Operation for the 2020 Census consists
of:
•• Space acquisition and lease management for
RCCs, ACO, and the PRAO (secure bids, award
contracts/leases).
•• Building-out space (i.e., specifications, schemas,
designs, etc.).
•• Physical security.
•• Procuring and setting-up warehouse space to
support RCCs, ACOs, and PRAO.
•• Provisioning RCCs, ACOs, and the PRAO with
office furniture, supplies, operating materials,
and non-IT equipment.
•• Provisioning RCC, ACO, and PRAO field staff
with supplies.
•• Inventory management.
•• Kit assembly (e.g., recruiting, hiring, and training kits).
•• Deploying materials to RCCs, ACOs, and the
PRAO.
•• Receiving and excessing remaining materials
after the operation concludes.
•• Printing and shipping—NPC or external print
vendor.
•• Purchase and deploy an Integrated Logistics
Management System to gain cost benefits generated from bulk purchasing and significantly
improve inventory control.
•• Utilize barcode technology entirely, in conjunction
with an Integrated Logistics Management
System, to improve inventory control and
reduce costs.
•• Conduct training at local offices for inventory
control, in conjunction with use of an Integrated
Logistics Management System.

Work Completed
The following research has been completed for
this operation:
•• Study of current literature regarding Third-Party
Logistics Organizations.
ºº Findings: Third-Party Logistics Organizations
need well-defined and finalized requirements
up front to effectively provide decennial
census logistics support. The iterative
development of the 2020 Census logistics
requirements prevents the Census Bureau
from meeting that criterion.
•• Study of current literature on other logistics
support models that may fit the characteristics
of the 2020 Census:
ºº Findings:
•• There were no new logistics models that
align with the major characteristics of the
2020 Census: limited and short duration,
high variety and high mix of Operating
Materials and Supplies per operation,
evolving data availability regarding quantities of Operating Materials and Supplies.
•• Distributed warehousing will likely not
work for the 2020 Census. The strong
implication with distributed warehousing
is that whatever is needed in each warehouse is well known ahead of time, which
is not characteristic of a decennial census.
•• The NPC has implemented the first phase of the
Integrated Logistics Management System project,
to include inventory management.

Decisions Made
The following decisions have been made for this
operation:
99 Logistics support for procurement, assembly,
receiving, and deployment of non-IT operating
materials, supplies, and equipment will be conducted
by the NPC.
99 Field Logistics support conducted by the NPC
will occur at an off-site location due to space
limitations within the current facility.
99 The preliminary plans for the Operating
Materials and Supplies have been developed
based on requirements from the census tests to
date.
99 The preliminary plans for the quantities of
Operating Materials and Supplies have been
developed based on requirements from prior
census tests (e.g., 2015, 2016) and continued
analysis of 2020 Census staffing needs.
99 The 2020 Census field office infrastructure will
include six RCCs.
99 The RCCs will be located in the same metropolitan
areas as the regional offices, with the
exception of the Denver region, where the RCC
will be located in Dallas, Texas.
99 Separate office space will be needed in the RCC
to support and manage the coverage measurement
operations.
99 The 2020 Census field office infrastructure
will include 248 field offices, a subset of which
will open a few months early to support early
census operations, including In-Field Address
Canvassing.
99 The plan for locating the 248 ACOs takes into
account a variety of factors, which determine
the actual number of offices and their
associated boundaries. Further information is
outlined in the 2020 Census ACOs decision
memorandum.

Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.

Cost and Quality
Investment in DLM is projected to influence
(reduce or increase) the 2020 Census overall
costs in the following ways:
ÐÐ Reduced office infrastructure due to automation.
ÐÐ Increased efficiencies due to automated
administrative functions, including recruiting,
onboarding, and payroll.
ÐÐ Increased efficiencies due to automation in
operations versus paper processes.

Investment in DLM is expected to have minimal
influence on the overall quality of the 2020
Census.
Risks
NPC delivered baselined space requirements for
the logistics operation to the General Services
Administration on April 1, 2016, to accommodate
an 18-month lead-time before occupancy. Major
changes to these requirements could mean issues
with space available or the need to increase the
amount of space to meet the changes in material requirements. IF the NPC receives significant
changes to requirements for Operating Materials
and Supplies after the requirements for warehousing logistics have been baselined, THEN this may
result in a change in space requirements necessitating additional warehousing space.
Early information about operational requirements
during the planning and development stages
tends to mitigate the need for, and the magnitude
of, additional resources and costs. IF the NPC
receives changes to operational requirements as
the 2020 Census work progresses, THEN this may
change the cost of logistics operational support,
due to the need to add staff or implement overtime
to avoid schedule delays.
The Census Bureau plans for every state to
include at least one ACO (currently 248 ACOs
are planned). These ACOs will meet a variety of
boundary and delineation criteria (areas of consideration)
provided by the six regional offices (i.e.,
high population density and strong likelihood of
finding office space; centrally located in the state;
close to major transportation networks; located
in areas with a diverse labor force and substantial
applicant pool). The Census Bureau also considers
other related factors. IF there are no submitted
bids that meet the ACO requirements, THEN the
area of consideration will have to be expanded.

Under the same siting criteria, IF the Census
Bureau receives a request that changes the criteria
for an ACO location, THEN the Census Bureau will
have to incur additional costs so that field operations
will not be impacted.

NPC's Document Services Branch (DSB) has
seven high-volume printers used for its current
work. The existing printer lease will expire in early
2019, and NPC is working with the Government
Printing Office on a solution to extend or renew the
current lease. NPC DSB plans to lease four or five
black-and-white printers (which will likely require
an ATO), although it has not been provided
with specific information regarding color printing
needs from the field data collection operational
areas to date. NPC DSB also needs a designated
climate-controlled room to accommodate storage
of the paper and the additional printers, as well as
staffing for the management and implementation
of the printing and the paper. It has been
determined that funding for printers will be
provided via a click charge for each page printed.
IF NPC cannot quickly and accurately determine
workloads and timelines for when and how many
additional printers/staff are needed, THEN printing
and kit delivery challenges or delays may be
experienced.

Milestones
Date                        Activity
April 2016                  Initiate search and build-out activities for NPC logistics space.
March 2017                  Initiate equipment leases for logistics functions.
July 2017                   Initiate search for ACO space for Wave 1 offices.
December 2017               Release the DLM Detailed Operational Plan, version 1.0.
February 2018–August 2019   Design and build-out ACO space.
April 2018                  Open RCCs.
October 2018                Occupy NPC logistics space: installations complete and ready to operate.
December 2018               Release the DLM Detailed Operational Plan, version 2.0 (delayed).
January 2019–August 2019    Accept field office space for ACOs.
January–September 2019      Open ACOs (flow basis): installations complete and ready to operate.
December 2020               Close ACOs.
June 2021                   Close RCCs.
5.9.4 IT Infrastructure
Detailed Planning
Status:
In Production
DOP published in FY 2017
Purpose
The purpose of the IT Infrastructure (ITIN)
Operation is to provide the IT-related systems
and infrastructure support to the 2020 Census,
including:
•• Enterprise systems and applications.
•• Decennial-specific systems, applications, and
interfaces.
•• Field ITIN (Regional Census Center [RCC], Field
Office, Work at Home [WAH], and Paper Data
Capture Operation).
•• Mobile computing.
•• Cloud computing.
Changes Made Since Version 3.0 Operational Plan
Release: There have been no major changes to
this operation.
Lessons Learned
Based on lessons learned from the 2010 Census,
as well as the 2014, 2015, and 2016 Census Tests,
the Address Canvassing (ADC) Operation Test, the
2017 Census Test, and the 2018 End-to-End Census
Test, the following recommendations were made:
•• Use of prototypes and a test local census office
helps validate the design of the IT infrastructure.
•• Opening some field offices earlier than the others
allowed for a "test" run of the deployment
of the IT infrastructure, including the equipment
and the telecommunications.
•• The ITIN readiness preparation and assessment
process for the 2015 Census Test was instrumental
and should continue to be used to improve
remaining tests for the 2020 Census.
•• Improvements are needed in assessing and
approving requested changes to business and
technical requirements.
•• Thread testing alone may not be enough to
assure quality products.
•• Cloud computing has its own limitations.
•• Automate deployments.
•• Infrastructure authorizations across environment
levels allow for greater flexibility of
applications within environments and reduction
of authorization processing times for the application
in the future.

Description of Operation
Each component of the ITIN Operation is
described below.

Enterprise Systems and Applications: This support
area includes the planning and implementation of
all hardware and software to support operations
for the 2020 Census, as well as the management
and monitoring of those systems, including, but
not limited to, the following:
•• Census Enterprise Data Collection and
Processing (CEDCaP) system.
•• Center for Enterprise Dissemination Services
and Consumer Innovation system.
•• Census–Schedule A Human Resources and
Recruiting Payroll System.
•• Census Data Lake system (future Enterprise
Data Lake).
•• Demographic Survey Systems to support
Post-Enumeration Survey (PES).
•• Shared Services (Virtual Desktop Infrastructure
[VDI], etc.).

Decennial Specific Applications: This support
area includes the planning and implementation of
all hardware and software to support operations
for the 2020 Census, as well as the management
and monitoring of those systems, including, but
not limited to, the following:
•• Real Time Non-ID Processing system.
•• Production Environment for Administrative
Records Staging, Integration, and Storage
system.
•• Sampling, Matching, Reviewing, and Coding
System.
•• Matching and Coding Software system.
•• Decennial Response Processing system.
•• Data Editing, Imputation, and Estimation
systems.
•• Evaluation systems.

RCC and Field Office IT Infrastructure: This support
area covers the deployment of IT capabilities
in the form of office automation services to any
RCC, field office, facility, or work location opened
as part of the 2020 Census operations. It includes
support for all field data collection operations
through automated recruiting, hiring, staffing,
training, fingerprinting, and mobile device support,
including the following:
•• Definition of functional and nonfunctional solution
requirements for field offices.
•• Development of the IT computing environment
design.
•• Procurement of circuits and IT equipment for
the census field offices.
•• Shipping, configuration, testing, and staging of
IT equipment for the census field offices.
•• Tear-down and disposition of IT equipment and
circuits at the conclusion of the 2020 Census
activities.
Field IT infrastructure requirements will provide, at
a minimum, for the following:
•• Decennial Service Center Operation.
•• National Processing Center (NPC).
•• RCC.
•• Area Census Office (ACO).
•• Data Capture Centers.
•• Partnerships, if needed.
•• Mobile offices and vehicles, if needed.
•• Census offices in the Island Areas.
•• Regional technicians.
A summary of operational highlights is:
•• Alignment to the Enterprise Architecture.
•• Use of enterprise solutions.
•• Iterative deployment of infrastructure aligned
with and based on testing and the Integration
and Implementation Plan.
•• Use of workload demand models to size IT solutions appropriately.
•• Scalable IT solutions.
•• Agile development of applications.
•• Use of cloud computing.
•• Standardized IT infrastructure components and
baselines to provide common building blocks
for emerging application development efforts.
•• Service Oriented Architecture.
Mobile Computing: The Census Bureau will
leverage technology innovations such as decennial Device-as-a-Service (dDaaS), the Mobile
Application Manager (MAM), and Mobile Device
Management (MDM) programs and secure applications provided through Device-as-a-Service.
This will result in a flexible and efficient acquisition
strategy to procure mobile devices and services
for fieldworkers.
Cloud Computing: The Census Bureau will leverage cloud-computing capabilities to transition
workloads onto FedRAMP-certified commercial cloud service providers. The Census Bureau
will implement cloud computing with configuration-managed automated deployments,
automated testing, and auto-scaling to meet
demands with a cloud consumption model for
cost and billing. Continuity of Operations Planning
will also leverage the cloud.
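The auto-scaling behavior described above can be sketched as a simple target-tracking rule: size the fleet so that each instance runs near a target utilization, within fixed bounds. Every number here (sessions per instance, target utilization, fleet limits) is an invented placeholder, not an actual 2020 Census parameter.

```python
import math

# Assumed sizing parameters (illustrative only, not actual 2020 Census values).
SESSIONS_PER_INSTANCE = 500      # concurrent sessions one instance can serve
TARGET_UTILIZATION = 0.60        # aim for instances running at ~60% of capacity
MIN_INSTANCES, MAX_INSTANCES = 2, 200

def desired_instances(concurrent_sessions: int) -> int:
    """Instance count that keeps per-instance load near the target."""
    needed = concurrent_sessions / (SESSIONS_PER_INSTANCE * TARGET_UTILIZATION)
    return max(MIN_INSTANCES, min(MAX_INSTANCES, math.ceil(needed)))

# A Census Day demand spike scales the fleet out, then back in afterward.
for load in (1_000, 150_000, 20_000):
    print(load, "->", desired_instances(load))
```

A consumption-model billing arrangement, as described, means cost tracks the output of a rule like this rather than a fixed peak-sized fleet.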
Work Completed
The following work has been completed for this
operation:
•• Established the Field IT infrastructure for the
2014 Census Test, 2014 Simulation Experiment
(SIMEX), 2015 Census Tests, 2016 Census Test,
2017 Census Test, and 2018 End-to-End Census
Test.
•• Established the Headquarters IT infrastructure
to support the 2014 Census Test, 2014 SIMEX,
2015 Census Tests, 2016 Census Test, 2017
Census Test, and 2018 End-to-End Census Test.
Mapped the IT infrastructure to each operational component tested to evaluate and ensure
readiness.
•• Used MDM solution and MAM solution to push
and securely manage mobile applications on
mobile devices.
•• Automated deployment and configuration of
Field Office IT systems and components.
•• Established VDI to provide Office Computing
Environment capabilities to Field Office and
WAH staff.
•• Established national, regional, and local file
share system to support the secure exchange
of documents between Field Office and
Headquarters staff.
•• Deployed Decennial Voice over Internet
Protocol system to provide telephony services
to Field Offices.
•• Deployed Field Office IT systems for the six
RCCs.
•• Provided cloud infrastructure for systems
planned to be hosted in the cloud services.
Decisions Made
The following decisions have been made for this
operation:
99 An incremental approach will be used to define,
deploy, and test the IT Infrastructure.
99 Mobile devices will be used for field data
collection.
99 Whenever technically feasible and cost effective, enterprise solutions will be used in support
of the 2020 Census.
99 A hybrid cloud design will be used for all 2020
Census systems requiring scaling wherever
possible.
99 VDI will be used for all RCC and field office
staff.
99 The demand models that the IT Infrastructure
and systems need to accommodate have been
developed based on data from past census
tests and other surveys. These models are being
used to support future tests and the System of
Systems architecture. Future data will be used
to refine these models.
99 The solution architecture was formalized in
FY 2016 and was officially presented by the
Decennial IT Division Chief at the July 22, 2016,
2020 Census Program Management Review.
99 Bring Your Own Device (BYOD) will not be used
moving forward, but lessons learned will inform
how we structure and use the decennial dDaaS
program. The dDaaS approach will be used to
provide mobile devices, accessories, cellular
connectivity, and device provisioning for each
2020 Census operation beginning with the 2018
End-to-End Census Test through 2020 Census
Coverage Measurement survey.
99 Amazon Web Services GovCloud will be used
as the 2020 Census cloud service provider.
99 The 2020 Census will use a variety of mobile
devices. For primary data collection, smartphones will be used. Field supervisory staff
will use tablets for oversight and for operation control system functionality. Laptops (or
tablets) will also be used by field recruiters and
outreach staff for ADC, PES, and Update Leave
(UL) Operation. The security approach will be
to encrypt data at rest and in transit through a
Federal Information Processing Standard 140-2
solution. Mobile devices will also have a secure
authentication protocol. BYOD efforts in earlier
tests will serve as lessons learned in going forward with a government-furnished equipment
approach via the dDaaS acquisition vehicle.
99 The NPC will not have a role in IT deployments
to the RCCs and ACOs. The decision is that IT
deployments (e.g., keyboards, monitors) will be
provided through a contracted service.
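The workload demand models noted in the decisions above convert projected response volume and an assumed arrival curve into peak concurrent load, which in turn sizes the IT infrastructure. The sketch below illustrates the idea; all inputs are hypothetical placeholders, not Census Bureau projections.

```python
# Hypothetical inputs (invented for illustration): total internet responses
# and the share arriving on the busiest day and in the busiest hour.
TOTAL_RESPONSES = 80_000_000
PEAK_DAY_SHARE = 0.08            # 8% of all responses on the peak day
PEAK_HOUR_SHARE = 0.12           # 12% of that day's responses in the peak hour
MEAN_SESSION_MINUTES = 10

def peak_concurrent_sessions() -> int:
    """Estimate peak concurrency via Little's law:
    concurrency = arrival rate x mean session duration."""
    peak_hour_responses = TOTAL_RESPONSES * PEAK_DAY_SHARE * PEAK_HOUR_SHARE
    arrivals_per_minute = peak_hour_responses / 60
    return round(arrivals_per_minute * MEAN_SESSION_MINUTES)

print(peak_concurrent_sessions())
```

Rebaselining the model after each census test, as the milestones below describe, amounts to replacing these assumed shares with observed ones.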
Design Issues to Be Resolved
There are no remaining design issues to be
resolved for this operation.
Cost and Quality
Investment in ITIN is projected to influence
(reduce or increase) the 2020 Census overall
costs through:
ÐÐ Leveraging enterprise solutions.
ÐÐ Leveraging cloud computing to address peak
performance requirements.
Impacts of this operation on overall 2020 Census
quality include the following:
ÏÏ Use of automation to collect real-time data,
enabling better monitoring and management of
the data collection activities.
ÏÏ Automated Training and Knowledge Base.
ÏÏ Sufficient mobile and networking infrastructure
to effectively support field operations.
ÏÏ Sufficient IT infrastructure to provide necessary
levels of performance, to include acceptable
interactions by the public, partners, and others.
Risks
Major concerns for the ITIN Operation are covered by the IT-related 2020 Census risks listed in
Chapter 6.
Milestones

IT Infrastructure Milestones
Date            Activity
September 2016  Finalize definition of Field ITIN solution requirements.
December 2016   Award contract for Field ITIN.
April 2017      Release the ITIN Detailed Operational Plan.
November 2017   Begin installation of ITIN for the RCCs.
June 2019       Begin installation of ITIN for the ACOs. Finalize Field Office ITIN design.
Cloud Testing and Readiness Milestones
Date            Activity
January 2015    Identify cloud computing as the assumed technical solution in support of the CEDCaP Decennial Infrastructure Scale-Up Project.
June 2015       Conduct initial testing of ISR instrument using cloud computing services.
September 2015  Acquire cloud-computing services in place to support the 2016 Census Tests.
December 2015   Deliver initial baseline of decomposed 2020 Census solution-level performance requirements provided by 2020 Census Integrated Project Teams.
June 2016       Deliver analyses of alternatives and recommended solutions architecture, to include cloud computing as a solution alternative, in support of technical solution-level requirements. Acquire cloud-computing services to support the 2017 Census tests and future census tests.
August 2016     Complete 2020 Census technical solution-level requirements, including performance requirements.
September 2016  Provision cloud-computing services to support the 2017 Census tests and future census tests. Rebaseline and deliver demand models based on 2016 Census Test results.
December 2016   Phase 2 Cloud contract available; analysis to transition or migrate 2020 Cloud Solutions to Cloud Service Providers for 2020 Census production completed.
April 2017      Leverage cloud computing in support of 2017 Census Test. Deliver initial output from the 2020 Census workload demand models, including Internet Response. Conduct performance and scalability testing in the cloud (2017 Census Test Solution).
June 2017       Modify technical solution architecture: plan for larger-scale performance, scalability, and resilience testing in the cloud.
September 2017  Rebaseline workload demand models based on 2017 Census Test results.
December 2017   Initiate performance, scalability, and resilience testing in the cloud.
June 2018       Leverage cloud computing in support of 2018 End-to-End Census Test and analyze test results. Modify workload demand models and technical solution architecture.
September 2018  Review performance, scalability, and resilience testing in the cloud.
September 2019  Ensure readiness of final cloud-computing solution for 2020 Census.
6. Key 2020 Census Risks
The 2020 Census Risk Management process consists of activities performed to reduce the probability and consequences of events that could negatively affect the ability to meet the objectives of
the 2020 Census. The goal of the risk management
process is to ensure a common, systematic, and
repeatable assessment approach so that risks can
be effectively identified and managed, and clearly
communicated to management, stakeholders, and
executive-level decision-makers. Risk management is iterative and designed to be performed
continuously throughout the 2020 Census life
cycle. Therefore, the 2020 Census Risk Register is
revisited regularly and changes are made on an
ongoing basis, including the addition of new risks.
Figure 31 shows the current risk matrix for all risks
in the 2020 Census Risk Register, as of October
31, 2018.
From the 2020 Census Risk Register, 10 key risks
are highlighted in the sections below. These risks
were selected from the risk register by the members of the 2020 Census Risk Review Board as a
broad representation of the major concerns that
could affect the design or the successful implementation of the 2020 Census. Along with the
risk statement, the probability rating, the impact
rating, the risk exposure level, and the risk color
are provided for each risk. Mitigation strategies
are also provided. For information about all of
the risks, the full risk register is available upon
request.
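Each key risk below pairs a probability rating and an impact rating (both on a 1–5 scale) with an exposure level. The banding rule in this sketch, the product of the two ratings bucketed into three levels, is an assumed reconstruction chosen to reproduce the pairings shown in this chapter; it is not the Census Bureau's documented method.

```python
def exposure_level(probability: int, impact: int) -> str:
    """Map 1-5 probability and impact ratings to an exposure band.

    Thresholds are assumptions: they reproduce the pairings in this
    chapter (3x5 -> HIGH, 3x4 -> MEDIUM) but are not official.
    """
    score = probability * impact
    if score >= 15:
        return "HIGH"
    if score >= 8:
        return "MEDIUM"
    return "LOW"

# The two rating combinations used by the key risks in this chapter:
print(exposure_level(3, 5))  # HIGH
print(exposure_level(3, 4))  # MEDIUM
```

Under a rule like this, the Section 6.3 change note (Impact lowered from 5 to 4) is exactly what moves a risk's exposure from HIGH to MEDIUM.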
Figure 32: 2020 Census Risk Matrix (risks plotted by probability, 1–5, and impact, 1–5)

6.1 PUBLIC PERCEPTION OF ABILITY
TO SAFEGUARD RESPONSE DATA
The accuracy and usefulness of the data collected
for the 2020 Census are dependent upon
the ability to obtain information from the public,
which is influenced partly by the public’s perception of how well their privacy and confidentiality concerns are being addressed. The public’s
perception of the Census Bureau’s ability to
safeguard their response data may be affected
by security breaches or the mishandling of data
at other government agencies or in the private
sector.
IF a substantial segment of the public is not convinced that the Census Bureau can safeguard their
response data against data breaches and unauthorized use, THEN response rates may be lower
than projected, leading to an increase in cases for
follow-up and cost increases.
Probability: 3 (Moderately likely)    Impact: 5 (Major impact)    Exposure level: HIGH
Mitigation Strategies include the following:
•• Develop and implement a strategy to build and
maintain the public’s confidence in the Census
Bureau’s ability to keep their data safe.
•• Follow the IT security-related mitigation
strategies of 2020 Census Risk Cybersecurity
Incidents.
•• Continually monitor the public’s confidence in
data security in order to gauge their probable
acceptance of the Census Bureau’s methods for
enumeration.
6.2 CYBERSECURITY INCIDENTS
Cybersecurity incidents (e.g., a breach or denial-of-service
attack) could impact the Census
Bureau's authorized IT systems, such as the
Internet self-response instrument, mobile devices
used for fieldwork, and data processing and
storage systems. IT security controls will be put in
place to protect the confidentiality, integrity, and
availability of the IT systems and data, ensuring
that 2020 Census operations will not be negatively
impacted by such incidents.
IF a cybersecurity incident occurs to the systems
supporting the 2020 Census, THEN additional
technological efforts may be required to repair or
replace the systems affected in order to maintain
secure services and data.
Probability: 3 (Moderately likely)    Impact: 5 (Major impact)    Exposure level: HIGH
Mitigation Strategies include the following:
•• Monitor system development efforts to ensure
the proper Census Bureau IT security guidelines
are followed during the system development
phase.
•• Research other Census Bureau programs, other
government agencies, other countries, and the
private sector to understand how they effectively mitigate cybersecurity incidents.
•• Audit systems and check logs to help in detecting and tracing an outside infiltration.
•• Perform threat and vulnerability analysis
through testing.
•• Prepare for rapid response to address any
detected cybersecurity incidents.
•• Leverage data stewardship and information
safeguarding policies and procedures of Census
Bureau programs, other government agencies, other countries, and the private sector
to understand how to mitigate cybersecurity
incidents.
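The audit-and-log-checking mitigation above can be illustrated with a minimal scanner that flags repeated authentication failures from a single source address. The log format, event names, and threshold are invented for the sketch and do not describe any Census Bureau system.

```python
from collections import Counter

# Hypothetical log lines in the form "<timestamp> <source-ip> <event>".
LOG = [
    "2020-03-12T10:00:01 203.0.113.9 AUTH_FAIL",
    "2020-03-12T10:00:02 203.0.113.9 AUTH_FAIL",
    "2020-03-12T10:00:03 203.0.113.9 AUTH_FAIL",
    "2020-03-12T10:00:05 198.51.100.4 AUTH_OK",
]

def suspicious_sources(lines, threshold=3):
    """Source IPs with at least `threshold` failed authentications."""
    fails = Counter(
        line.split()[1] for line in lines if line.endswith("AUTH_FAIL")
    )
    return sorted(ip for ip, n in fails.items() if n >= threshold)

print(suspicious_sources(LOG))
```

Real deployments would feed such rules from centralized logging into an alerting pipeline, supporting the rapid-response mitigation listed above.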
6.3 ADMINISTRATIVE RECORDS AND
THIRD-PARTY DATA—EXTERNAL
FACTORS
The Census Bureau is planning to use administrative
records and third-party data to reduce the need
to follow up with nonrespondents through the
identification of vacant and deleted housing units
(those that do not meet the Census Bureau's definition
of a housing unit), the enumeration of nonresponding
housing units, and the improvement of
the quality of imputation for demographic characteristics
that are missing for person and housing
unit records. Administrative records will also be
used to update the Master Address File (MAF),
predict the best times to contact nonresponding
households, and verify the information provided
by respondents and enumerators.
IF external factors or policies prevent the Census
Bureau from utilizing administrative records and
third-party data as planned, THEN the Census
Bureau may not be able to fully meet the strategic goal of containing the overall cost of the 2020
Census or to fully utilize the data quality benefits
of using administrative records in characteristic
imputation.
Probability 3
(Moderately
likely)
Impact 4
(Substantial
impact)
Exposure level
MEDIUM
THEN the strategic goals and objectives of the
2020 Census may not be met.
Probability 3
(Moderately
likely)
Impact 4
(Substantial
impact)
Exposure level
MEDIUM
Mitigation Strategies include the following:
Mitigation Strategies include the following:
•• Identify external stakeholders that have an
interest in Census Bureau policies regarding
administrative record and third-party data
usage.
•• Leverage Decennial Information Technology
Division’s Systems Engineering and Integration
(SEI) System Development Life Cycle system
readiness/phase gate review process, the SEI
program metrics dashboard, and various 2020
Census governance forums to provide a current
sense of where all solutions providers are in the
system development process and to raise issues
quickly for corrective action.
•• Develop a stakeholder communications plan for
identified external stakeholders.
•• Regularly communicate to and seek feedback
from identified external stakeholders on design
decisions and research and testing results
related to the use of administrative records and
third-party data for the 2020 Census.
•• Assess impacts of any changes to the design
based on feedback from external stakeholders
and update plans accordingly.
•• Monitor external factors and policies that may
impact the Census Bureau’s planned use of
administrative records and third-party data for
the 2020 Census.
Changes since the last version of the 2020 Census
Operational Plan:
The Cost Impact rating was lowered from 5 to
4. Even though there would be a cost increase
associated with a higher Nonresponse Followup
workload, there are processes in place to assist
with rapidly up-scaling the operation as necessary. This changed the overall Impact rating to
4.
6.4 OPERATIONS AND SYSTEMS
INTEGRATION
Due to the critical timing of 2020 Census operations and the potential impact of systems not
being ready to support them, managers must
have an accurate gauge of the progress made
towards integrating the various operations and
systems that support the 2020 Census. Progress
towards integration must take place throughout
the planning, development, and testing stages of
the operations and systems.
IF the various operations and systems are not
properly integrated prior to implementation,
U.S. Census Bureau
•• Conduct regularly scheduled reviews of the
2020 Census operations.
•• Ensure all operational areas and their associated Integrated Project Teams have adequate
resources assigned to integration efforts and
required project artifacts are developed and
approved.
•• Ensure each planned census test has an
approved Goals, Objectives, and Success
Criteria document; adequate resources
identified and assigned to plan and conduct
the test; a detailed test plan developed and
approved (including key milestones and roles
and responsibilities); and deadlines that are
met through regular management reviews with
the test team.
•• Ensure adequate technical review sessions are
planned and conducted in conjunction with SEI
staff (including the systems engineers
responsible for developing the solutions).
•• Create an operational integration design team
to support the 2020 Census through creation
and distribution of artifacts which depict integration between the operations.
6.5 LATE OPERATIONAL DESIGN
CHANGES
After key planning and development milestones
are completed, stakeholders may disagree with
the planned innovations behind the 2020 Census
and propose modifications to the design, possibly
resulting in late operational design changes.
2020 Census Operational Plan—Version 4.0 175
IF operational design changes are required
following the completion of key planning and
development milestones, THEN costly design
changes may have to be implemented, increasing
the risk of not conducting a timely and successful
2020 Census.
Probability: 3 (Moderately likely)
Impact: 4 (Substantial impact)
Exposure level: MEDIUM
Mitigation Strategies include the following:
•• Identify internal and external stakeholders that
have an interest in the 2020 Census operational
design.
•• Develop a stakeholder communications plan for
identified internal and external stakeholders.
•• Regularly communicate with and seek feedback
from identified external stakeholders on design
decisions and research and testing results.
•• Monitor external factors and policies that may
impact the Census Bureau’s planned innovations for the 2020 Census operational design.
•• Establish a change-control management process to assess impacts of change requests to
facilitate decision-making.
•• Prepare for rapid response to address potential
changes and make decisions based on the
results of the change-control process.
6.6 INSUFFICIENT LEVELS OF STAFF
WITH SUBJECT-MATTER SKILL SETS
The 2020 Census consists of programs and
projects that require subject-matter skill sets to
complete the work. The potential of not having
the necessary staffing levels and staff with the
appropriate competencies to satisfy objectives is
an ongoing concern. This is the result of a lack of
consistent strategic workforce planning throughout
the 2020 Census life cycle. Staff with the
necessary skill sets leave due to retirements and
movement within and out of the Decennial Census
Programs Directorate; hiring freezes and slow
processes delay or prevent the recruitment and
hiring of candidates who possess the knowledge,
skills, and abilities to perform core functions of
the 2020 Census; and budgetary constraints
compound both problems. In addition, with
increasing numbers of staff eligible
for retirement before 2020, there is also the potential of losing valuable institutional knowledge, as
employees in key positions may not be accessible
to share their knowledge and participate in succession planning.
IF the 2020 Census does not hire and retain staff
with the necessary subject-matter skill sets at the
levels required, THEN additional staffing shortages
may occur, making it difficult to meet the
goals and objectives of the 2020 Census.
Probability: 3 (Moderately likely)
Impact: 4 (Substantial impact)
Exposure level: MEDIUM
Mitigation Strategies include the following:
•• Identify high priority competencies and staffing
positions needed for the work of the 2020
Census.
•• Decennial Directorate Support Services Office
will continue to collaborate with the Human
Resources Division to facilitate hiring.
•• Employ various strategies to facilitate staff retention, development, and
knowledge-sharing.
6.7 ABILITY OF IT SOLUTIONS TO
SUPPORT THE 2020 CENSUS
There are 52 systems supporting the 2020 Census,
including enterprise systems, vendor-developed
systems, and in-house-developed systems.
There is the possibility that one or more of these
systems will not address all of the baselined
requirements or will not function as required,
negatively impacting the operations being
supported. Proper development and testing are
needed for each system, as well as integration
testing between systems, to ensure a successful
deployment of the IT solutions supporting the
implementation of the 2020 Census operations.
IF the IT solutions supporting the 2020 Census
cannot meet the baselined requirements or workloads, THEN the systems may require substantial
modifications or manual workarounds may have to
be developed, adding complexity and increasing
risk for a timely and successful 2020 Census.
Probability: 3 (Moderately likely)
Impact: 4 (Substantial impact)
Exposure level: MEDIUM
Mitigation Strategies include the following:
•• Engage with enterprise efforts to ensure that
solutions architectures align and provide continued support for 2020 Census systems development and management.
•• Ensure that contingencies are planned ahead of
time and exercised when necessary.
•• Design IT solutions based on clearly understood requirements to ensure minimal design
changes.
•• Revamp the integrated testing framework based
on lessons learned from the 2018 End-to-End
Census Test.
6.8 ADMINISTRATIVE RECORDS AND
THIRD-PARTY DATA—ACCESS AND
CONSTRAINTS
The Census Bureau is planning to use administrative
records and third-party data to reduce the need to
follow up with nonrespondents by identifying vacant
and deleted housing units (those that do not meet
the Census Bureau’s definition of a housing unit)
and by enumerating nonresponding occupied housing
units, and to improve the quality of imputation for
demographic characteristics that are missing from
person and housing unit records. Administrative
records will also be used to update the MAF,
predict the best times to contact nonresponding
households, and verify the information provided by
respondents and enumerators. The use of
administrative records data requires special
handling and security protocols that affect the
development of the systems and infrastructure
supporting the 2020 Census.
IF the Census Bureau does not have timely and
continual access to administrative records and
third-party data, or the data providers place
constraints on the use of the data that conflict
with planned 2020 Census operations, THEN the
Census Bureau may not be able to fully meet the
challenge of containing the overall cost of the
2020 Census or to fully utilize the data quality
benefits of using administrative records in characteristic imputation.
Probability: 2 (Not likely)
Impact: 5 (Major impact)
Exposure level: MEDIUM
Mitigation Strategies include the following:
•• Identify all required administrative records
and third-party data sets needed for the
2020 Census, including data providers and
points-of-contact.
•• Review data sharing agreements/contracts in
order to understand all the conditions assigned
to the administrative records and third-party
data sets and to ensure conditions are
appropriate.
•• Ensure requirements for administrative
records and third-party data usage are
developed and documented.
•• Inform data providers that data agreements/
contracts need to be updated.
•• Disseminate updated data agreements/
contracts to internal stakeholders.
•• Negotiate with the source providers to ensure
required administrative records and third-party
data are available when needed.
•• Ensure the build-out for all systems supporting
the 2020 Census takes into account the
handling of administrative records and
third-party data.
•• Ensure the security requirements, including
physical security, for all systems supporting
the 2020 Census cover the handling of
administrative records and third-party data.
•• Ensure staff has been trained in the proper
handling of administrative records and
third-party data.
6.9 INTERNET SELF-RESPONSE
INSTRUMENT
For the 2020 Census, it is anticipated that online
self-response will be the primary mode of data
collection, with a goal of 45 percent of households
responding online. The Internet self-response
solution must therefore sustain a positive user
experience while protecting the confidentiality
and privacy of respondent data from security
threats. Under the current plan, the Internet
self-response instrument is a public-facing
application that represents a possible single
point of failure for 2020 Census data collection.
Cybersecurity threats are expected, and any performance issues may disrupt electronic collection
of respondent data.
IF the Internet self-response instrument
experiences lengthy disruptions during the data
collection period for the 2020 Census, THEN public
confidence may decline sharply and the entire 2020
Census design could be negatively impacted,
leading to lower public response, higher cost, and
delayed completion of the 2020 Census.
Probability: 2 (Not likely)
Impact: 5 (Major impact)
Exposure level: MEDIUM
Mitigation Strategies include the following:
•• Develop a backup Internet self-response
instrument that can perform data collection to
meet the needs of the 2020 Census.
•• Conduct ongoing user acceptance testing,
usability testing, output testing, performance
testing, and program-level testing to ensure
the ability to support a primarily electronic
and successful 2020 Census.
6.10 SYSTEMS SCALABILITY
All systems supporting the 2020 Census must be
able to handle the large, dynamic demands of the
operations and support the system of systems.
IF systems are not properly designed, tested,
and implemented with the ability to scale, THEN
critical issues may arise during the production
window of operations when any system in the
environment needs to scale up (or down),
potentially resulting in a limited capacity to
support the operations or in system failure.
Probability: 3 (Moderately likely)
Impact: 3 (Moderate impact)
Exposure level: MEDIUM
Mitigation Strategies include the following:
•• Under the direction of the SEI Chief Architect,
conduct a scalability assessment with the
Technical Integrator team.
•• Provide accurate demand models to the
systems to ensure proper system of systems
design.
7. Quality Analysis
As the Census Bureau continues to evaluate the
2020 Census operational design, an analysis of
the impact on the quality of the census results
is required to ensure that innovations designed
to reduce cost do not have an unacceptable
impact on quality. This section describes the
processes and analysis performed to date on the
quality impacts of the four key innovation areas:
Reengineering Address Canvassing, Optimizing
Self-Response (OSR), Utilizing Administrative
Records and Third-Party Data, and Reengineering
Field Operations. The analysis focused on impacts
of innovations. For example, the analysis related
to administrative records and third-party data
focuses on the impact of these innovations on
Nonresponse Followup (NRFU), as that operation
is where the innovations are expected to provide
the greatest cost savings. The Census Bureau analyzed all major frame development and enumeration operations in the 2020 Census design.
This section is organized as follows, with the supporting operations listed under each analysis area:
•• Quality Impacts for Reengineering Address
Canvassing
ºº Address Canvassing (ADC)
ºº Local Update of Census Addresses (LUCA)
ºº Geographic Programs (GEOP)
•• Quality Impacts for Optimizing Self-Response
(OSR)
ºº Paper Data Capture (paper as a response
mode) (PDC)
ºº Internet Self-Response (ISR)
ºº Non-ID Processing (NID)
ºº Census Questionnaire Assistance (CQA)
•• Quality Impacts of Utilizing Administrative
Records and Third-Party Data
ºº Nonresponse Followup (NRFU)
•• Quality Impacts of Reengineering Field
Operations
ºº Update Leave (UL)
ºº Update Enumerate (UE)
ºº Group Quarters (GQ)
ºº Nonresponse Followup (NRFU)
This release expands the analysis in version 3.0
of the 2020 Census Operational Plan by
incorporating analysis of 2018 End-to-End Census
Test data, modifications to the Type of
Enumeration Areas, the new mailing contact
strategy, updated self-response rate projections,
and updated housing unit (HU) projections for the
2020 Census.
This analysis produces two major outputs: estimated housing-unit coverage error and person-level coverage error. Reengineering Address
Canvassing studies only HU coverage. The
enumeration analysis integrates both subsections—one
for HUs and one for people. As was done in the
2010 Census enumeration, final quality metrics for
people are divided into three major parts: estimated correct enumerations, estimated erroneous
enumerations, and estimated omissions. Although
all of these estimates for both HUs and people in
2018 are reported at the national level, lower levels
of geography could be analyzed in 2019 before
2020.
This quality analysis leverages data from the 2010
Census Coverage Measurement Survey (CCM),
2010 Census, census tests conducted from 2012
through 2018, and the American Community
Survey (ACS) to produce specific parameters. A
parameter is a quantitative input to the quality
models. For example, one parameter for an
operation could be an estimated workload, and
another could be the estimated number of errors
of a given kind that the operation will
produce. In some cases, expert
judgment was used when data were not available.
Expert judgment varies from team to team, but
in general, the experts for each parameter were
asked to predict a value of the parameter for the
2020 Census as accurately as possible. Typically,
a parameter is based on data but then adjusted
based on expert judgment to account for deficiencies in the data. An example is provided below in
the Methodology Example section.
The integration of cost and quality drove the quality methodology. In past years, the cost estimation
team used parameters produced by subject-matter experts (SMEs) to define workloads and estimate costs across the operations. To be consistent
with cost models, a complex set of parameters
drives this quality methodology, and each parameter includes five important components from
SMEs:
1. Minimum value.
2. Middle value (typically mean, median, or mode).
3. Maximum value.
4. Distribution (normal, uniform, triangular, log-normal, etc.).
5. Source.
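As an illustration (a sketch, not the Census Bureau's actual implementation; the class and method names are assumptions), the five components above can be represented in a small container that also knows how to draw a sample. The example values are the TEA 1 paper-rate figures given in the Methodology Example later in this section.

```python
import random
from dataclasses import dataclass

@dataclass
class QualityParameter:
    """One SME-supplied model input: its five components."""
    minimum: float
    middle: float       # typically mean, median, or mode
    maximum: float
    distribution: str   # "triangular", "uniform", ...
    source: str

    def sample(self, rng: random.Random) -> float:
        # Draw one value according to the stated distribution.
        if self.distribution == "triangular":
            # random.triangular takes (low, high, mode), so middle goes last.
            return rng.triangular(self.minimum, self.maximum, self.middle)
        if self.distribution == "uniform":
            return rng.uniform(self.minimum, self.maximum)
        raise ValueError(f"unsupported distribution: {self.distribution}")

# TEA 1 paper self-response rate from the Methodology Example (percent).
tea1_paper_rate = QualityParameter(
    minimum=13.2, middle=17.2, maximum=21.2,
    distribution="triangular",
    source="2012-2018 census tests, ACS, 2010 Census, expert judgment")

rng = random.Random(2020)
draw = tea1_paper_rate.sample(rng)   # always between 13.2 and 21.2
```

Every draw stays inside the stated bounds, and for a triangular distribution the long-run average of the draws is (minimum + middle + maximum) / 3.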
Two models integrate the parameters—one for
frame and the other for enumeration. The models interact with each other and produce quality
estimates of an integrated design of the 2020
Census. In other words, the effects of Address
Canvassing quality can be traced through the various self-response methods and all the way down
to the nonresponse operations to see the impacts
Address Canvassing has on cost and quality of
all the later operations of the design. This analysis reviewed the impacts and interactions of all
the major operations in the design. This analysis
includes HUs and population for the 50 states, the
District of Columbia, and Puerto Rico.
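To illustrate how frame quality can be traced through to enumeration, here is a deliberately simplified sketch. The function names, the error rates, and the persons-per-household figure are all invented for illustration; only the 141,415,000 HU projection comes from this plan, and the Bureau's actual frame and enumeration models are far more detailed.

```python
def frame_model(true_hus: float, missed_add_rate: float,
                missed_delete_rate: float) -> dict:
    """Estimate the enumeration frame from the true housing stock.

    Missed adds are valid HUs absent from the frame; missed deletes
    are frame records that are not valid HUs on the ground.
    """
    missed_adds = true_hus * missed_add_rate
    missed_deletes = true_hus * missed_delete_rate
    return {"frame_size": true_hus - missed_adds + missed_deletes,
            "missed_adds": missed_adds,
            "missed_deletes": missed_deletes}

def enumeration_model(frame: dict, persons_per_hu: float) -> dict:
    """Trace frame errors through to person-level coverage effects."""
    # HUs missing from the frame can become person omissions; invalid
    # frame records can become erroneous enumerations.
    return {"potential_omissions": frame["missed_adds"] * persons_per_hu,
            "potential_erroneous": frame["missed_deletes"] * persons_per_hu}

# Hypothetical error rates and household size, applied to the projected stock.
frame = frame_model(true_hus=141_415_000,
                    missed_add_rate=0.010, missed_delete_rate=0.008)
people = enumeration_model(frame, persons_per_hu=2.5)
```

The point of the chaining is the one described in the text: a change in frame quality propagates into the person-level coverage estimates of every later operation.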
These analyses are potentially valuable in several
ways, not only to measure quality, but also to
predict operational and technical workloads. First,
they point out dependencies and gaps among the
operations that warrant consideration as the census design moves from planning to implementation. For example, this analysis reviews the impacts
of LUCA decisions on later operations, such as
paper self-response, through workloads. If LUCA adds a
million correct or erroneous addresses, then paper
operations have to prepare to mail materials to
them. Second, they help determine which factors
(parameters) are the key drivers of cost or quality
and must be constantly considered and monitored,
versus which factors must be addressed but play
a less important role in the design. By changing
many parameters together and reviewing impacts,
the Census Bureau can prepare for and mitigate
extreme circumstances that may arise (risk management). For example, if a major government
security breach occurs in early 2020, then all of
the parameters for self-response, especially for
Internet, may drop considerably. We can very
quickly model these possibilities and see extreme
examples with relatively minor effort. Finally, by
changing the values of one parameter while keeping all others fixed (performing sensitivity analyses), one can study potential effects on quality
under alternative operational designs. If we change
the percentage of addresses visited in the In-Field
Address Canvassing (IFAC) Operation, we can see
the impacts of that change to cost and quality for
other operations and the overall design of the 2020
Census.
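A one-at-a-time sensitivity analysis of the kind described can be sketched as follows. The toy output function is an assumption, and the TEA 6 rate of roughly 32 percent is back-calculated rather than stated in the plan; only the TEA 1 values and the universe sizes come from the Methodology Example below.

```python
def completed_paper_questionnaires(params: dict) -> float:
    """Toy output measure: completed paper questionnaires, in millions."""
    return (params["tea1_rate"] * params["tea1_universe_m"]
            + params["tea6_rate"] * params["tea6_universe_m"])

# Baseline scenario: middle values for every parameter.
baseline = {"tea1_rate": 0.172, "tea1_universe_m": 140.0,
            "tea6_rate": 0.32, "tea6_universe_m": 6.6}

def one_at_a_time(name: str, low: float, high: float) -> dict:
    """Swing one named parameter between low and high, holding others fixed."""
    out = {}
    for label, value in (("low", low), ("high", high)):
        scenario = dict(baseline, **{name: value})
        out[label] = completed_paper_questionnaires(scenario)
    return out

# Swing the TEA 1 paper rate across its stated minimum and maximum.
swing = one_at_a_time("tea1_rate", 0.132, 0.212)
spread = swing["high"] - swing["low"]   # how much this one factor moves the output
```

Parameters with large spreads are the cost and quality drivers that warrant constant monitoring; parameters with small spreads play the less important role the text describes.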
Baseline
The quality of the 2010 Census was measured
using the CCM survey.12 The CCM was a postenumeration survey designed to assess the coverage of the census for HUs and persons, producing
estimates of omissions and erroneous enumerations. The CCM estimated a net overcount of
0.01 percent, or 36,000 persons, which was not
statistically different from zero. There were an
estimated 10.0 million erroneous enumerations for
the household population and 10.0 million omissions, after removing the 6.0 million whole-person
imputations. To predict the potential cost and
quality implications of the 2020 Census design,
the Census Bureau does not have the benefit of
a post-enumeration survey. However, the analysis
presented here uses some findings from the 2010
CCM survey to make assumptions about what to
expect given the 2020 Census design plans. In
addition, census test results and simulations with
2010 Census data are used to assess potential cost
and quality effects.
The Census Bureau produces quarterly estimates
of residential vacancy rates and HU counts. The
quality analysis presented in this report used these
quarterly estimates from 1965 to 2018 to predict
the number of HUs on the ground in 2020, using
time series analysis. This process yielded
projections for future quarters, each with a 95
percent confidence interval.
The projection for the third quarter in 2020 for the
United States and Puerto Rico is 141,415,000. This
will be the estimated true number of HUs in 2020.
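The plan does not specify the time-series model, so the following is only a schematic sketch: it fits a linear trend to synthetic quarterly HU counts with ordinary least squares and extrapolates forward. The Census Bureau's actual projection and its confidence interval come from a richer model fit to the real 1965-2018 quarterly series.

```python
def fit_linear_trend(series):
    """Ordinary least squares fit of y = a + b*t over t = 0..n-1."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
             / sum((t - t_mean) ** 2 for t in range(n)))
    intercept = y_mean - slope * t_mean
    return intercept, slope

# Synthetic stand-in for quarterly HU estimates in millions (NOT real data).
history = [130.0 + 0.3 * q for q in range(36)]   # 36 quarters of steady growth

a, b = fit_linear_trend(history)
# Extrapolate 7 quarters past the last observation (index 35 -> 42).
projection_2020q3 = a + b * 42
```

A production projection would also carry a prediction interval that widens the further past the last observed quarter the extrapolation goes.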
Methodology Example
2020 Census operational teams prepared and
provided parameters for predicting the quality of
their operations. This example will focus on the
Self-Response Team, but all the teams followed a
similar process to provide parameters. We focus
on paper self-response and the impacts that
paper self-response has on the overall quality of
the 2020 Census design.

12 The scope of the 2010 CCM excluded people living in GQ
and in Remote Alaska.

The entire country is
divided into three basic parts for the purposes of
mail contact—Type of Enumeration Area (TEA) 1,
which are Self-Response areas; TEA 6, which are
UL areas; and TEAs 2 through 5, which are the rest
of the country.13
The five parameters for paper self-response
include:
1. Percentage of paper questionnaires completed
in TEA 1 (Self-Response).
2. Percentage of paper questionnaires completed
in TEA 6 (UL).
3. Percentage of paper questionnaires with
erroneous people (called erroneous enumerations).
4. Percentage of paper questionnaires with
omitted people (called omissions).
5. Percentage of paper questionnaires with
missing race or Hispanic origin.
Focusing on the first parameter, percentage of
paper questionnaires completed in TEA 1, the
Self-Response Team provided the following
information:
1. Minimum value—13.2 percent.
2. Middle value—17.2 percent.
3. Maximum value—21.2 percent.
4. Distribution (normal, uniform, triangular,
log-normal, etc.)—triangular.
5. Source—2012 National Census Test, 2014
Census Test, 2015 Census Test, 2015 National
Content Test, 2018 End-to-End Census Test,
ACS, 2010 Census, Pew Research, and expert
judgment.
These estimates are based on analysis involving
multiple tests and survey data. However, the test
and survey data do not yield the same self-response rates that have been seen in past censuses.
Based on expert judgment, a factor was applied to
the self-response rate to account for the “Census
Environment” that is not replicable in any census
test or survey. The middle-value estimate of this
parameter was applied to the total TEA 1 universe
(140.0 million addresses). The second parameter,
for TEA 6, was applied to the TEA 6 universe (6.6
million addresses), and then the estimates were
added together to get approximately 26,200,000.
This total, 26.2 million, represents the current
point estimate of the number of completed paper
questionnaires expected in the 2020 Census.

13 TEAs 2 through 5 are not included in this analysis; they
make up 0.7 percent of the addresses in the country.
This estimate has uncertainty around it, based
on the minimum and maximum values of the
parameters. The minimum and maximum, as well
as the distribution, are used to feed the Monte
Carlo simulation. The outputs of the Monte Carlo
simulation, after they are integrated with all other
parameters, provide a basis for uncertainty around
the parameters and the 2020 Census design as a
whole. Finally, the source information helps people outside the team understand the supporting
documentation and methodology behind each
estimate.
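Putting the pieces together, the point estimate and the Monte Carlo propagation described above can be sketched as follows. Only the TEA 1 triangular values (13.2, 17.2, and 21.2 percent) and the two universe sizes come from the text; the TEA 6 middle value of about 32 percent is back-calculated from the 26.2 million total, and its spread is invented for illustration.

```python
import random

TEA1_UNIVERSE = 140.0e6   # TEA 1 addresses, from the text
TEA6_UNIVERSE = 6.6e6     # TEA 6 addresses, from the text

# Point estimate: middle values applied to each universe, then summed.
point_estimate = 0.172 * TEA1_UNIVERSE + 0.32 * TEA6_UNIVERSE  # ~26.2 million

# Monte Carlo: sample each rate from its triangular distribution and
# propagate the draws through the same arithmetic.
rng = random.Random(0)
totals = []
for _ in range(20_000):
    tea1 = rng.triangular(0.132, 0.212, 0.172)   # (low, high, mode) from text
    tea6 = rng.triangular(0.28, 0.36, 0.32)      # assumed spread for TEA 6
    totals.append(tea1 * TEA1_UNIVERSE + tea6 * TEA6_UNIVERSE)

totals.sort()
lo, hi = totals[int(0.025 * len(totals))], totals[int(0.975 * len(totals))]
# (lo, hi) is an empirical 95 percent uncertainty band around the estimate.
```

In the actual analysis, the sampled parameters are integrated with all the other frame and enumeration parameters before the uncertainty is summarized, rather than propagated through a two-term sum.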
7.1 REENGINEERING ADDRESS
CANVASSING
Throughout the entire Reengineering Address
Canvassing section, the analysis focuses on three
ultimate estimates:
1. Total living quarters on the enumeration
frame at the beginning of enumeration.
2. Missed adds—addresses expected on the
ground that are missing from the frame
(missed adds include addresses that are on
the frame but lack a geocode).14
3. Missed deletes—addresses on the frame that
are not actually valid living quarters on the
ground.
The final outputs from Reengineering Address
Canvassing are the starting point for enumeration
(approximately January 1, 2020). The specific
parameters collected to define the Reengineering
Address Canvassing outputs are summarized in
Table 8. Table 8 gives a rough approximation of
the level of detail and complexity of the various
operations for this analysis.
14 A geocoded address is one that has a block code. This code
is critical for the 2020 Census because we must count people
and living quarters in a block.
Table 8: Summary of Quality Parameters
Collected for Reengineering Address
Canvassing

Operation | Number of parameters collected for quality analysis
Initial frame development | 20
MAFCS | 2
IOAC | 10
LUCA | 10
IFAC | 6
GEOP | 4
Total | 52
To simplify the analysis, the starting point is the
beginning of calendar year 2017, with the estimated number of the three main aggregates. The
numbers evolve though the subsequent fiscal years
by incorporating growth in the housing stock, and
cleaning up the frame by resolving missed adds
and missed deletes—errors on the frame. These
errors are resolved through several operations,
including Ungeocoded Resolution, IOAC, the
MAFCS, the LUCA program, and IFAC. The critical
point is January 1, 2020, when the enumeration
frame is defined and created for census enumerations, such as Self-Response, UL, and others.
This analysis integrates operations. As an example, the errors on the frame are tracked across
operations down to the NRFU Operation, so that
the same error is not fixed by more than one
operation.
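The rule that the same error is not fixed by more than one operation can be sketched as a simple sequential accounting; the operation names match the text, but the error identifiers and overlaps are invented for illustration.

```python
# Operations run in sequence; errors already resolved upstream are not
# credited again downstream. Error IDs and overlaps are hypothetical.
frame_errors = {f"err{i}" for i in range(10)}
operation_finds = {
    "IOAC": {"err0", "err1", "err2"},
    "LUCA": {"err2", "err3"},      # err2 was already found by IOAC
    "IFAC": {"err3", "err4"},      # err3 was already found by LUCA
}

resolved = set()
credit = {}
for op, found in operation_finds.items():
    new = (found & frame_errors) - resolved   # only errors not already fixed
    credit[op] = len(new)
    resolved |= new
```

Without the deduplication step, the three operations would appear to resolve seven errors when only five distinct errors were actually fixed.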
Initial Frame Development
The frame quality analysis begins with estimates
of the state of the frame as of January 1, 2015.
Based on analysis of the MAF and using results
of the MAFCS that occurred in 2016, some
parameters for estimating the initial state of
the MAF are summarized in Table 9.
Table 9: Summary of Quality Parameters
Collected for Initial Frame

Parameter | Source
Number of HUs on the frame in 2015 and addresses in 2017 (United States only)—four parameters | Decennial extracts of total addresses and ungeocoded addresses.
Estimated number of HUs in 2020 (Puerto Rico only)—three parameters | HU projections.
Estimated number of actual addresses on the ground in 2020 | Decennial Statistical Studies Division.
Errors already on the frame—percentage of missed adds in frame | From analysis of the MAFCS.
Errors already on the frame—percentage of missed deletes in frame | From analysis of the MAFCS.
Estimated percentage of growth missing from the frame | From analysis of the MAFCS.
Estimated percentage of the growth that is ungeocoded | From analysis of the MAFCS.
Estimated percentage of growth to be overcoverage (missed deletes) | From analysis of the MAFCS.
Expected filter changes—two parameters | ACS extracts and expert judgment.
Number of ungeocoded addresses on the frame in 2017—two parameters | Decennial extracts.
Estimated workload of ungeocoded addresses through 2020 | Trend analysis by Geography Division.
Estimates of resolution of geocoding results—two parameters | Past tests, MAFCS results, reports from Geocoding Operation, and expert judgment.
These 20 parameters, once integrated, represent
the state of the frame in 2017 including data from
the IOAC and the MAFCS, which both started in
full production in FY 2016 and for which portions
were paused in 2017.
The number of addresses on the frame in 2020
is projected by taking the projected estimate of
HUs in 2020, adding an estimate of the missed
adds that will be resolved during 2017 to 2020,
and subtracting an estimate of the missed deletes
that will be resolved in 2017 to 2020. The missed
adds and missed deletes are resolved through the
operations mentioned above. These numbers take
into account the projected new growth and the
estimated numbers of missed adds and missed
deletes that accompany this growth.
For this analysis, estimates of the numbers of
missed adds are separated into two categories:
addresses on the MAF that are ungeocoded, and
other categories of missed adds. Some operations
will resolve both types. On the other hand, a
planned geocoding operation that started in the
middle of 2017 will geocode many addresses. The
office work involved in the LUCA Operation will
also differentiate between the two types of missed
adds as it attempts to resolve cases.

In-Office Address Canvassing and MAF
Coverage Study

The Address Canvassing Operation has three
major components, as described in section 5.4.3:
IOAC, IFAC, and the MAFCS. Both the IOAC
Operation and the MAFCS started in full
production in FY 2016 but were put on hold in
2017 due to budget constraints. The IOAC
Operation has two phases, Interactive Review (IR)
and Active Block Resolution (ABR). IR categorizes
the blocks into passive, active, or on-hold blocks.
For blocks considered “active,” ABR updates the
block and adds or deletes addresses. Table 10
describes the five key parameters, out of the 10
total collected, for IOAC and MAFCS conducted in
2016 through 2019, before LUCA. Workload
parameters, not described, include the amount of
work planned for each year based on approved
budgets.

Table 10: Summary of Key Quality Parameters
Collected for the In-Office Address Canvassing
and MAF Coverage Study

Parameter | Source
Percentage of blocks identified as Passive and Active during IR | Based on observed IR work that occurred in 2016 and 2017.
Percentage of missed adds in Passive blocks | MAFCS results.
Percentage of missed deletes in Passive blocks | MAFCS results.
Percentage of missed adds captured in Active blocks | MAFCS results and expert judgment.
Percentage of missed deletes captured in Active blocks | MAFCS results and expert judgment.
Recognizing that the frame is the single largest
contributor to overall quality, the parameters in
Tables 9 and 10 show the most critical contributors
to quality in the entire 2020 Census design. IOAC
can correct hundreds of thousands of addresses
for both missed adds and missed deletes each
year. The quality outputs from the integration
of IOAC parameters illustrate the core quality
improvement in the 2020 Census design. This
ongoing frame improvement work involves inputs
and outputs that produce a higher quality frame
than the Census Bureau saw coming out of the
2010 Census. Better frame maintenance processes
conducted throughout the decade, including the
Geographic Support System, geocoding, and
improved technology such as aerial imagery,
help define the overall quality of the 2020
Census. The addition of this IOAC process shows
promise to improve the quality of the 2020 Census,
demographic surveys, and future censuses.
Local Update of Census Addresses
In analyzing the effect of the LUCA Operation, the
most important input parameter is the number of
LUCA submissions from the various governmental
entities. The procedures and requirements for submission changed from the 2000 Census to the 2010
Census, and changed again for the 2020 Census.
That makes it more difficult to project the volume
of submissions the Census Bureau will receive.
Another parameter considered is the number
of addresses submitted to the Census Bureau
through LUCA and then rejected by the Census
Bureau as not valid. These rejections may be
appealed to the Office of Management and
Budget for additional consideration. Unless the
appeals are resolved before the enumeration
frame is identified, such cases will be included in
the frame.
For the quality analysis, the projected number of
submissions is subdivided into several categories
according to the Census Bureau’s assessment of
the addresses provided, including whether the
address is valid or not, on the MAF already or not,
etc.
Based on results of the LUCA program in past
censuses, experts on the LUCA process have projected the total number of submissions the Census
Bureau might anticipate, the proportions of those
addresses that may fall into the categories above,
and the chance that rejected submissions will be
appealed. Past data are used to estimate how
many of those appealed cases will turn out to be
valid living quarters and added to the frame and
how many will not.
The most important result of the quality analysis for LUCA is summarized in estimates of two
numbers from the LUCA program, good addresses
missing from the frame and erroneous addresses
added to the frame. The first represents the
reduction in the number of missed adds, while the
second represents additions to the frame in error
(missed deletes). The former quantifies a reduction in potential omission of HUs (and, eventually,
people); the latter quantifies additional cases
that may be sent for fieldwork erroneously. Just
as important, the sum of these two numbers has
a serious effect on census operations and their
accompanying cost.
An important dependency included in this analysis
is the relative state of the address frame when the
LUCA program begins and when submissions are
received and processed. As errors on the frame
are rectified through other geographic programs,
such as IOAC, the number of missed adds and
missed deletes should diminish. This may result in
fewer address submissions from the government
partners in the LUCA program and should
result in fewer actual address corrections, that is,
less error reduction. The quality analysis on the
frame takes these dependencies into account.
In-Field Address Canvassing
The IFAC Operation will occur in 2019 for approximately 38.4 percent of the HUs in TEA 1, the
key IFAC parameter. This operation incorporates
fieldwork identified through the results of IOAC
and LUCA submissions. For this final field operation, which prepares the frame for enumeration,
the Census Bureau identifies parameters about
capture rates of the missed adds and missed
deletes expected in these canvassed blocks. After
this fieldwork is complete, the final enumeration
universe as of January 1, 2020, is created and estimated by this analysis.
Measures of Uncertainty for Reengineering
Address Canvassing
As described earlier, each input parameter has a minimum, middle, and maximum value, and a distribution. After these parameters are integrated for Reengineering Address Canvassing, the analysis concludes with the outputs from the Monte Carlo simulations that integrate all the uncertainty around these key frame-development parameters. The resulting variability is an input to the next phase: enumeration.
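As an illustration of how these pieces fit together, the sketch below draws each parameter from a triangular distribution defined by its minimum, middle, and maximum values and propagates the draws into a distribution of total frame errors. The parameter names, rates, and percentile summary are illustrative assumptions, not the plan's actual inputs or outputs.

```python
import random

# Illustrative frame-development parameters as (minimum, middle, maximum).
# These names and values are hypothetical, not the plan's actual estimates.
PARAMETERS = {
    "missed_add_rate":    (0.010, 0.020, 0.040),  # good addresses missing from the frame
    "missed_delete_rate": (0.005, 0.010, 0.020),  # erroneous addresses on the frame
}
TOTAL_HOUSING_UNITS = 146_900_000  # middle value reported in Table 11

def simulate(n_draws=10_000, seed=1):
    """Draw each parameter from its triangular distribution and accumulate
    the implied count of frame errors (missed adds plus missed deletes)."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_draws):
        total_rate = 0.0
        for low, middle, high in PARAMETERS.values():
            # random.triangular takes (low, high, mode).
            total_rate += rng.triangular(low, high, middle)
        outcomes.append(total_rate * TOTAL_HOUSING_UNITS)
    outcomes.sort()
    return {"p5": outcomes[int(n_draws * 0.05)],
            "median": outcomes[n_draws // 2],
            "p95": outcomes[int(n_draws * 0.95)]}

print(simulate())
```

The percentile spread of the simulated outcomes is what carries forward as the "measure of uncertainty" for the next phase.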
Reengineering Address Canvassing
Alternatives Analysis
One of the goals of the 2020 Census Quality
Analysis Team is to use the models to look at
alternative designs and potential refinements to the
2020 Census operational design. To that end, the Quality Analysis Team identified the five key parameters that most affect cost or quality, and the Census Bureau considers alternative designs to gauge the quality impacts of these parameters.
•• The volume of addresses sent to IFAC is a major
cost-driver, so that parameter is included.
•• Workload for the Ungeocoded Resolution
Operation is expected to add significant numbers of good addresses to the frame.
•• Expected filter changes are also expected to
add significant numbers of addresses to the
frame.
•• The number of addresses submitted to the
Census Bureau through LUCA will impact both
missed adds and missed deletes, decreasing
one and increasing the other, respectively.
•• IFAC capture rates of the missed adds and
missed deletes expected in these canvassed
blocks are a critical estimate of the quality of
the fieldwork expected in 2019 for IFAC.
Analysis of alternatives for the cost and quality
tradeoffs began in late summer 2016. The Census
Bureau continued conducting a detailed analysis of alternatives through FY 2018, as resources
permitted.
Geographic Programs
After the frame definition is complete, the GEOP Operation prepares the frame for enumeration. The parameters from the GEOP Operation subdivide the universe that goes to enumeration and define enumeration methods for the specific
addresses. Based on the newly updated results of
TEA delineation produced in July 2018, all of the
parameters collected for Geographic Programs
are applied to the estimated total number of HUs
predicted for January 1, 2020, and are shown in
Table 11.
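That application of the TEA shares to the projected total is simple proportional arithmetic, sketched below with the middle values published in Table 11. This is a minimal sketch; any rounding differences relative to the table come from the published figures themselves.

```python
# Middle-value TEA shares (percent) from Table 11, applied to the
# estimated total housing units as of January 1, 2020.
TOTAL_HU = 146_900_000
TEA_SHARES = {
    "TEA 1 (self-response)": 95.3,
    "TEA 2 (UE)": 0.1,
    "TEA 3-5 BCUs (all other)": 0.2,
    "TEA 6 (UL)": 4.5,
}

def allocate(total, shares):
    """Subdivide the enumeration universe by applying percentage shares."""
    return {tea: round(total * pct / 100) for tea, pct in shares.items()}

print(allocate(TOTAL_HU, TEA_SHARES))
```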
7.2 OPTIMIZING SELF-RESPONSE
Before the analysis turns to Optimizing Self-Response (OSR), note that Sections 7.2, 7.3, and 7.4 all focus on enumeration operations that impact quality. As in the frame-development analysis, the analysis of enumeration continues to estimate the number of addresses enumerated, addresses missing from enumeration, and addresses enumerated erroneously, along with an additional dimension added for people. The final outputs from enumeration include:
1. Total living quarters enumerated.
2. Missed adds for living quarters.
3. Missed deletes for living quarters.
4. Correct enumerations for people.
5. Erroneous enumerations for people.
6. Omissions for people.
7. Imputed Race or Hispanic origin.¹⁵
The results for enumeration are summarized by
these seven measures for this quality analysis.
For this quality analysis, not all of the addresses in
TEA 3 through 5 are considered. As seen in Table
11, these TEAs only account for an estimated
300,000 living quarters.
The detailed parameters collected from SMEs
to define the enumeration, including OSR, using
Administrative Records, and Reengineering Field
Operations, are summarized in Table 12.
¹⁵ Imputation is the process of replacing missing data with substituted values. Imputations come from three main sources: whole-household imputations, whole-person imputations, and item-missing imputations.

Table 11: Geographic Programs Quality Parameters

Parameter                                        Percent    Number of living quarters
Total living quarters from reengineering
  address canvassing                             100.0      146,900,000
Percentage of all addresses in TEA 1
  (self-response)                                95.3       140,000,000
Percentage of all addresses in TEA 2 (UE)        0.1        15,000
Percentage of all addresses in TEA 3–5 BCUs
  (all other)¹                                   0.2        300,000
Percentage of all addresses in TEA 6 (UL)        4.5        6,600,000

¹ Measurement of the quality of these addresses will occur in FY 2018 and beyond.
Note: These data do not reflect the uncertainty of the estimates. All the numbers in this table reflect the middle values of a range of estimates provided by the teams.
Table 12: Summary of Quality Parameters Collected for Enumeration

Operation                                                 Number of parameters collected for quality analysis
Paper¹                                                    2
Internet Self-Response (ID only)                          4
Non-ID Processing (sources are Internet and telephone)    7
CQA (ID only)                                             4
NRFU (administrative records)                             12
NRFU (not administrative records)                         12
Coverage Improvement                                      3
UE/UL (frame updates during enumeration)                  2
GQ                                                        5
Total                                                     51

¹ The Quality Analysis Team recognizes that there is not a formal operation called "Paper," but we ask readers to accept this language for simplicity of the analysis.
The remainder of this section focuses on OSR,
specifically.
Paper Enumeration
The Census Bureau projects the percentage of
the households in the Self-Response universe
that will complete their questionnaires on paper
and send them back in 2020. It also projects the
percentage of households in UE and UL geography that will complete their questionnaires on
paper. Based on the parameters for this mode,
the Census Bureau estimated the total number of
completed questionnaires expected from paper
in the 2020 Census.
Internet Enumeration (ID only)
The quality parameters collected for the Internet were similar to those for paper. The Census Bureau projects the percentage of the Self-Response universe that will complete questionnaires on the Internet. The second component of Internet response comes from the UL universe, which completes a questionnaire on the Internet based on materials left during the first visit in UL. Because of quality differences expected for Internet non-ID cases, those cases are analyzed independently from these parameters; measurement of Internet non-ID occurs in the non-ID subsection. These parameters estimate only Internet ID cases.
Census Questionnaire Assistance
Enumeration (ID only)
The CQA quality parameters collected from SMEs parallel the Internet parameters. The Census Bureau projects the percentage of the Self-Response universe that will complete questionnaires using the CQA telephone option. The Census Bureau also projects the portion of the UL universe that completes a questionnaire by calling in based on materials left during the first visit in UL. Because of quality differences expected for telephone non-ID cases, those cases are analyzed independently from these parameters; measurement of telephone non-ID occurs in the non-ID subsection. The CQA parameters only provide estimates for CQA ID cases.
Non-ID Processing Enumeration (Internet
and Telephone)
The Census Bureau projects the percentage of
the enumeration universe that will complete their
questionnaires using the non-ID process from
either the Internet or CQA. This includes portions
from both Self-Response and UL TEAs. Some
cases match and get an ID via automated matching; other cases are matched through the clerical
process; and finally some require a field visit to verify the geography. These parameters project all completed cases identified and
enumerated through the non-ID process from
all paths. The NID will add new addresses that
the Census Bureau does not have on the initial
enumeration frame, which is different from the self-response options applied in the 2010 Census.
These projections for real adds through non-ID are
based on the outputs from Reengineering Address
Canvassing operations, that is, the quality of the
frame going into enumeration operations. This is a
significant integration point that occurs in upcoming operations as well.
Self-Response Housing Unit Summary
Because self-response generally does not add or
delete addresses from the enumeration universe,
minimal impacts come from self-response on the
HU side. The one exception is of course Non-ID
processing, as seen in Table 13.
For this analysis, "Completed cases" includes
the total of Occupied, Vacant, and Unresolved
addresses. Although deleted cases have cost
impacts, there are no quality impacts for person
enumeration. The addresses in the “Adds” row
are already captured in the occupied and vacant
figures in this table.
Self-Response Person Summary
For this analysis, the measurements or parameters
of person-level error come from the 2010 CCM
with adjustments to include dependencies with the
Reengineered Address Canvassing. Similar methods were applied to all the self-response modes
to estimate 2020 Census person-level coverage
error. Each parameter that feeds Table 14 has
detailed methodology based on input from SMEs
and only includes within-questionnaire error. Entire
addresses either missed or over-counted are not
included in these estimates but are considered
elsewhere.
Table 13: Summary of Self-Response Workloads for Housing Units

                           Paper         Internet ID   Internet      CQA           CQA ID
                                                       Non-ID        Non-ID
Completed cases total      26,200,000    55,400,000    6,600,000     740,000       6,100,000
Occupied                   26,200,000    55,400,000    6,200,000     700,000       6,100,000
Vacant                     X             X             X             X             X
Delete                     X             X             X             X             X
Adds                       X             X             400,000¹      40,000¹       X
Unresolved                 X             X             X             X             X

X Not applicable.
¹ For this analysis, these added non-ID addresses are included as occupied. Some could be vacant, but a very small number is expected.
Note: These data do not reflect the uncertainty of the estimates. All the numbers in this table reflect the middle values of a range of estimates provided by the teams.
Table 14: Summary of Key Quality Parameters Collected for Self-Response Person Error

All three parameters draw on the same sources: the 2012 National Census Test, 2014 Census Test, 2015 Census Test, 2015 National Content Test, 2018 End-to-End Census Test, ACS, 2010 Census, Pew Research, and expert judgment.

•• Number of erroneous enumerations by Self-Response Mode.
•• Number of missed people (omissions) by Self-Response Mode.
•• Number of people with missing Race or Hispanic origin by Self-Response Mode.
7.3 USING ADMINISTRATIVE RECORDS
Use of administrative records and third-party data
is the third major innovation area introduced in
the 2020 Census design. The key parameters from
Administrative Records are:
1. Percentage of the NRFU universe removed for Occupied.
2. Percentage of the NRFU universe removed for Vacants.
3. Percentage of the NRFU universe removed for Deletes.
4. Percentage of the NRFU universe removed after the last visit.
Although the percentage removed after the last
visit is not yet developed, the Census Bureau built
this component into the model for the purpose of
analyzing design alternatives. Table 15 shows the
person-level parameters for using administrative
records. Recognizing that GQ will use administrative records, the analysis team added analysis of
GQ administrative records usage in FY 2018.
The person-level error based on using administrative records seen in Table 15 is a new source of
error compared to the 2010 Census design.
Table 15: Summary of Key Quality Parameters Collected for Using Administrative Records Error for Persons

Parameter                                      Sources
Number of erroneous enumerations               2010 Census simulation using the 2017 test models
Number of missed people (omissions)            2010 Census simulation using the 2017 test models
Number of people with imputed race or          2010 Census simulation using the 2017 test models
Hispanic origin
The process implemented to estimate quality for
administrative records usage involves applying
these rates of error to the NRFU and UE universes
removed using administrative records. The quality metrics produced for person-level error came
from analysis on the entire 2010 Census NRFU
universe.
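As a rough sketch of that process, the error rates are simply applied to the people in the universe removed with administrative records. The rates, household size, and workload below are placeholder assumptions for illustration, not values from the 2010 Census simulation.

```python
# Hypothetical person-level error rates; placeholders for illustration only,
# not the rates produced by the 2010 Census simulation models.
ERRONEOUS_ENUM_RATE = 0.015   # erroneous enumerations per person
OMISSION_RATE = 0.020         # missed people per person
IMPUTED_RACE_RATE = 0.030     # imputed race or Hispanic origin per person

def admin_records_quality(removed_units, persons_per_unit=2.5):
    """Apply person-level error rates to the people living in units
    removed from the NRFU workload using administrative records."""
    persons = removed_units * persons_per_unit
    return {
        "erroneous_enumerations": persons * ERRONEOUS_ENUM_RATE,
        "omissions": persons * OMISSION_RATE,
        "imputed_race_or_origin": persons * IMPUTED_RACE_RATE,
    }

# Example: 6 million housing units removed via administrative records.
print(admin_records_quality(6_000_000))
```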
7.4 REENGINEERING FIELD
OPERATIONS
Nonresponse Followup
The NRFU field operation is the most costly
operation. After the Census Bureau removes the
addresses using administrative records and adds
new addresses in the field, what remains is the
field workload for NRFU.
For this analysis, “Completed cases” includes
the total of Occupied, Vacant, and Unresolved
addresses. Although deleted cases have cost
impacts, there are no quality impacts for person enumeration within questionnaires. Added
addresses, on the other hand, are included in the
occupied and vacant components. NRFU will add
new addresses that the Census Bureau did not
have on the initial enumeration frame, and NRFU
will delete addresses from the frame that do not
exist on the ground. The parameters for added
and deleted addresses through NRFU are integrated with the missed adds and missed deletes
from Reengineering Address Canvassing operations. These are important integration points with
Reengineering Address Canvassing. Finally, the
unresolved addresses represent cases that are
deemed finished without a completed interview.
Unresolved cases typically occur after the maximum number of visits is reached.
Table 16 shows the person-level parameters of
error for the NRFU Operation.
The “unresolved” addresses from NRFU, included
in the final row of this table, are one primary source
of the imputations. Cost impacts related to the
number of visits drive the number of cases that
remain unresolved at the end of NRFU. This balance between cost and quality is manifested clearly
in this component of the operational design.
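This balance can be illustrated with a toy model: if each visit resolves some fraction of the remaining cases, capping the number of visits trades fieldwork cost against the number of unresolved cases left for imputation. The resolution rate, cost per visit, and workload below are hypothetical values chosen only to show the shape of the tradeoff, not estimates from the plan.

```python
def nrfu_tradeoff(workload, resolve_per_visit=0.5, cost_per_visit=30.0,
                  max_visits=3):
    """Toy model: each visit resolves a fraction of the remaining cases;
    capping visits trades fieldwork cost against unresolved cases."""
    remaining = float(workload)
    total_visits = 0.0
    for _ in range(max_visits):
        total_visits += remaining              # every open case gets a visit
        remaining *= (1.0 - resolve_per_visit) # a fraction stays unresolved
    return {"cost": total_visits * cost_per_visit, "unresolved": remaining}

# Raising the visit cap lowers unresolved cases but raises cost.
for cap in (1, 3, 6):
    print(cap, nrfu_tradeoff(50_000_000, max_visits=cap))
```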
Update Leave/Update Enumerate
The UL and UE operations are somewhat more
complicated and have a sizable effect on the overall quality of the 2020 Census design. Based on the
current, untested methodology, the Census Bureau
expects two sources of response data for these
geographies.
1. Self-response through paper, Internet, and telephone from questionnaires left at the door. This universe is not included in this section because it has already been covered in the paper, Internet, and CQA sections.

2. Frame updates that come from the listing component of the operations. This includes adding addresses, deleting addresses, identification of vacant HUs, and unresolved rates.
The quality parameters for the UL and UE
Operations have a smaller impact compared to
prior years due to the smaller universes identified
in the production TEA delineation that occurred in
FY 2018.
Table 16: Summary of Key Quality Parameters Collected for Update Leave and Update Enumerate for Person Error

Parameter                                      Sources
Number of erroneous enumerations by            2010 CCM reports and expert judgment
type of respondent and visit
Number of missed people (omissions) by         2010 CCM reports and expert judgment
type of respondent and visit
Number of people with imputed                  2010 CCM reports and expert judgment
demographics by type of respondent
and visit
Coverage Improvement
The Coverage Improvement suboperation
parameters estimate the number of households
attempted, number of people added, and number
of people deleted during the operation. Only basic
quality impacts are covered in this quality analysis
for FY 2018 in an effort to include all significant
quality operations in the model.
Group Quarters
The GQ Operation parameters estimate the number of GQs enumerated and number of people
enumerated in GQs. Only basic quality impacts are
covered in this quality analysis for FY 2018 in an
effort to include all significant methods of enumeration. Minor adjustments for person-level error
occurred in this analysis for GQ.
Measures of Uncertainty for Enumeration
Consistent with prior descriptions, each parameter
has a point estimate and measures of uncertainty
around the point estimate. After enumeration is
completed, the final description of the work logically concludes with the outputs from the Monte
Carlo simulations that integrate all the uncertainty
around these parameters. As described earlier, each parameter has a minimum, middle, and maximum value, and a distribution. These pieces of information are the inputs to Monte Carlo simulations on the integration of frame and enumeration that describe the uncertainty of quality for the 2020 Census design.
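That integration step can be sketched in a few lines: each draw of the frame size feeds the draws of the enumeration parameters, so frame uncertainty carries through to the enumeration outputs. All distributions and values below are illustrative assumptions, not the model's actual inputs.

```python
import random

def integrated_simulation(n_draws=10_000, seed=7):
    """Propagate frame uncertainty into enumeration: each draw of the
    frame size feeds the draw of the enumeration error parameters."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_draws):
        # random.triangular takes (low, high, mode); values are illustrative.
        frame = rng.triangular(145_000_000, 149_000_000, 146_900_000)
        occupied_share = rng.triangular(0.85, 0.92, 0.89)
        persons_per_unit = rng.triangular(2.3, 2.7, 2.5)
        omission_rate = rng.triangular(0.01, 0.03, 0.02)
        people = frame * occupied_share * persons_per_unit
        results.append(people * (1.0 - omission_rate))  # correct enumerations
    results.sort()
    return results[n_draws // 2]  # median of the output distribution

print(f"{integrated_simulation():,.0f}")
```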
8. Approval Signature
Albert E. Fontenot, Jr. (signed)          December 31, 2018
Albert E. Fontenot, Jr.
Associate Director for Decennial Census Programs
U.S. Census Bureau
9. Document Logs
9.1 SENSITIVITY ASSESSMENT
This table specifies whether or not the document contains any administratively restricted information.
Verification of Document Content
This document does not contain any:
•• Title 5, Title 13, or Title 26 protected information
•• Procurement information
•• Budgetary information
•• Personally identifiable information
9.2 REVIEW AND APPROVALS
This 2020 Census Operational Plan document has been reviewed and approved for use.
This table documents the necessary approvals leading up to the point of baselining.
Document Review and Approval Tier: Operational Plan

Name                       Area Represented                                        Date
Robin A. Pennington        2020 Census Operational Plan Team                       November 16, 2018

2020 Census Operational Plan Team Leadership Group:
Albert E. Fontenot, Jr.    Associate Director for Decennial Census Programs        December 31, 2018
James B. Treat             Assistant Director for Decennial Census Programs        December 31, 2018
Michael Thieme             Assistant Director for Decennial Census Programs        December 31, 2018
Patrick J. Cantwell        Chief, Decennial Statistical Studies Division           December 31, 2018
Deirdre D. Bishop          Chief, Geography Division                               December 31, 2018
Phani-Kumar A. Kalluri     Chief, Decennial IT Division                            December 31, 2018
Burton Reist               Chief, Decennial Communications and Stakeholder         December 31, 2018
                           Relations Office
Louis Cano                 Chief, Decennial Contracts Execution Office             December 31, 2018

2020 Census Portfolio Management Governing Board                                   December 31, 2018
2020 Census Executive Steering Committee                                           December 31, 2018
9.3 VERSION HISTORY
The document version history recorded in this section provides the revision number,
the version number, the date it was issued, and a brief description of the changes since the
previous release. Baseline releases are also noted.
Rev #   Version   Date                 Description
Final   V 1.0     October 1, 2015      Original baseline.
Final   V 1.1     November 6, 2015     Conversion of 2020 Census Operational Plan content into
                                       Communications Directorate Desktop Publisher. Converted
                                       all figures and updated figures 8 and 28. Also added
                                       Section 8—Lifecycle Cost Estimate and Appendices.
Final   V 2.0     September 30, 2016   Fiscal year 2016 update of 2020 Census Operational Plan.
Final   V 3.0     September 30, 2017   Fiscal year 2017 update of 2020 Census Operational Plan.
Final   V 4.0     December 31, 2018    2018 update of 2020 Census Operational Plan.
Appendix A. List of Acronyms

Acronym            Definition
3PV                Third Party Vendor
ABR                Active Block Resolution
ACO                Area Census Office
ACS                American Community Survey
ADC                Address Canvassing Operation
ARC                Archiving Operation
ARCM               Assistant Regional Census Manager
ATO                Authority to Operate
ATT                Authority to Test
AVT                Address Validation Test
BAS                Boundary and Annexation Survey
BCU                Basic Collection Unit
BPM                Business Process Models
BYOD               Bring Your Own Device
BVP                Boundary Validation Program
CAP                Capability Requirements
CBAMS              Census Barriers, Attitudes, and Motivators Study
CCFR               Census Count and File Review
CCM                Census Coverage Measurement Survey
CDR                Critical Design Review
CEDCaP             Census Enterprise Data Collection and Processing
CEDSCI             Center for Enterprise Dissemination Services and Customer Innovation
CFD                Content and Forms Design Operation
CMDE               Coverage Measurement Design and Estimation Operation
CMFO               Coverage Measurement Field Operations
CMM                Coverage Measurement Matching Operation
CNMI               Commonwealth of the Northern Mariana Islands
COMPASS            Census Operations Mobile Platform for Adaptive Services and Solutions
CQA                Census Questionnaire Assistance Operation
CQR                Count Question Resolution Operation
CRO                Count Review Operation
CUF                Census Unedited File
C-SHARPS           Census-Schedule A Human Resources and Recruiting Payroll System
DAPPS              Decennial Applicant, Personnel, and Payroll System
DCEO               Decennial Contracts Execution Office
DCEO/TI            Decennial Contracts Execution Office/Technical Integrator
dDaaS              decennial Device as a Service
DLM                Decennial Logistics Management Operation
DOP                Detailed Operational Plan
DPD                Data Products and Dissemination Operation
DSC                Decennial Service Center Operation
DSF                (USPS) Delivery Sequence File
eADRec             Electronic Administrative Record
EAE                Evaluations and Experiments Operation
ECaSE-OCS          Enterprise Census and Survey Enabling-Operational Control System
EMM                Enterprise Mobility Management
eResponse          Electronic Response Data Transfer
ESB                Enterprise Service Bus
eSDLC              Enterprise Systems Development Life Cycle
ETL                Enumeration at Transitory Locations Operation
FACO               Federally Affiliated Count Overseas Operation
FAQ                Frequently Asked Question
FHUFU              Final Housing Unit Follow-Up
FIPS               Federal Information Processing Standard
FLD                Field Division
FLDI               Field Infrastructure Operation
FPD                Forms Printing and Distribution Operation
FSCPE              Federal-State Cooperative Population Estimates
GAO                Government Accountability Office
GARP               Geographic Area Reconciliation Program
GEOP               Geographic Programs Operation
GIS                Geographic Information System
GPS                Global Positioning System
GQ                 Group Quarters Operation
GSS                Geographic Support System
GSS-I              Geographic Support System Initiative
GUPS               Geographic Update Partnership Software
HP                 Hewlett Packard
HQ                 Headquarters
HU                 Housing Unit
HUFU               Housing Unit Follow-up
IaaS               Infrastructure as a Service
IAC                Island Areas Censuses Operation
iCADE              integrated Computer-Assisted Data Entry
ICD                Interface Control Document
ICQ                Individual Census Questionnaire
ID                 Identifier
IFAC               In-Field Address Canvassing
IHUFU              Initial Housing Unit Follow-Up
IIP                Integration and Implementation Plan
IOAC               In-Office Address Canvassing
IOD                Integrated Operations Diagram
IPC                Integrated Partnership and Communications Operation
IPT                Integrated Project Team
IR                 Interactive Review
ISA                Interconnection Security Agreement
ISR                Internet Self-Response Operation
IT                 Information Technology
ITIN               IT Infrastructure Operation
IVR                Interactive Voice Response
JAWS               Job Access With Speech screen reader
KFI                Key From Image
LEP                Limited English Proficiency
LiMA               Listing and Mapping Application
LNG                Language Services Operation
LQ                 Living Quarters
LUCA               Local Update of Census Addresses Operation
MAF                Master Address File
MAM                Mobile Application Manager
MAFCS              MAF Coverage Study
MDF                Microdata Detail File
MDM                Mobile Device Management
MMVT               MAF Model Validation Test
MOJO               In-field operational control system
MOU                Memorandum of Understanding
MS                 Microsoft
MTDB               MAF/TIGER Database
NARA               National Archives and Records Administration
NCP                New Construction Program
NFC                National Finance Center
NID                Non-ID Processing Operation
NPC                National Processing Center
NRFU               Nonresponse Followup Operation
OBAV               Office-Based Address Verification
OCR                Optical Character Recognition
O&M                Operations and Maintenance
OIS                Office of Information Security
OMR                Optical Mark Recognition
OSR                Optimizing Self-Response
PBC                Partial Block Canvassing
PC                 Personal Computer
PDC                Paper Data Capture Operation
PDCC               Paper Data Capture Center
PEARSIS            Production Environment for Administrative Record Staging, Integration, and Storage
PES                Post-Enumeration Survey
PFU                Person Follow-Up
P.L.               Public Law
PaaS               Platform as a Service
PLBR               Project-Level Business Requirements
PM                 Program Management
POA&M              Plan of Action and Milestones
POP                Population Division
PRR                Production Readiness Review
PSAP               Participant Statistical Areas Program
PUMA               Public Use Microdata Areas
QC                 Quality Control
RCC                Regional Census Center
RDP                Redistricting Data Program Operation
REMP               Requirements Engineering Management Plan
RFP                Request for Proposal
RPO                Response Processing Operation
RTN/RTNP (ITIN)    Real Time Non-ID Processing System
RV                 Recreational Vehicle
SaaS               Software as a Service
SEI                Systems Engineering and Integration Operation
SIMEX              Simulation Experiment
SPC                Security, Privacy, and Confidentiality Operation
SRQA               Self-Response Quality Assurance
SRR                Systems Requirements Review
TEA                Type of Enumeration Area
TI                 Technical Integrator
TIGER              Topologically Integrated Geographic Encoding and Referencing System
TL                 Transitory Location
TRR                Test Readiness Review
TSAP               Tribal Statistical Areas Program
UE                 Update Enumerate Operation
UHE                Usual Home Elsewhere
UL                 Update Leave Operation
UR                 Ungeocoded Resolution
URL                Uniform Resource Locator
U.S.               United States
U.S.C.             United States Code
USPS               United States Postal Service
VOIP               Voice-over Internet Protocol
WAH                Work at Home
WBS                Work Breakdown Structure
WCAG               Web Content Accessibility Guidelines
Appendix B. 2020 Census Operational Design: An Integrated Design for Hard-to-Count Populations
The goal of each decennial census is to count
everyone once, only once, and in the right place.
Accomplishing this is no small task; it is impacted
by the ever-evolving environment in which we live,
work, and will conduct the 2020 Census. Societal,
demographic, and technological trends result in
a population that is harder and more expensive
to enumerate. As it becomes more challenging to
locate individuals, connect with them, and solicit
their participation through traditional methods, the U.S. Census Bureau must, decade after
decade, devote additional thought and effort to
understanding our environment and the potential
impacts on counting the population, especially
populations that have historically been hard to
count.
To establish a framework around which we will
consider hard-to-count populations, we will
leverage the work of Roger Tourangeau. Slight
modifications to Tourangeau’s definitions of the
segmentation of hard-to-count populations have
been made to fit the 2020 Census environment.
The 2020 Census operational design considers the hard-to-count population in relation to four segments: Hard-to-Locate, Hard-to-Contact, Hard-to-Persuade, and Hard-to-Interview, as depicted in Figure 1.
[Figure: a four-segment diagram. Hard to Locate: housing units not in our frame and/or persons wanting to remain hidden. Hard to Contact: highly mobile populations, people experiencing homelessness, physical access barriers such as gated communities. Hard to Persuade: suspicious of the government, low levels of civic engagement. Hard to Interview: participation hindered by language barriers, low literacy, lack of Internet access.]
Figure 1: 2020 Census Hard-to-Count Framework
The Hard-to-Locate segment includes housing units that we do not have in our frame and persons wanting to remain hidden, whether to keep themselves or certain characteristics about themselves quiet out of fear or because of other factors that create reluctance to respond. The Hard-to-Contact segment includes highly mobile populations, people experiencing homelessness, and populations with physical access barriers such as gated communities. The Hard-to-Persuade segment can include populations with low civic engagement and populations suspicious of the government. In addition, the Hard-to-Interview segment can include populations whose participation may be hindered by language barriers, low literacy, or lack of Internet access. Some populations may fall in one, more than one, or all of these segments of the Hard-to-Count (HTC) Framework.
As our environment evolves and we lay the foundation for the 2020 Census operational design, we
must ask ourselves if and how the design impacts,
changes, or adds to the populations we historically think of as the hard-to-count. These populations include, but are not limited to:
•• Young children.
•• Highly mobile persons.
•• Racial and ethnic minorities.
•• Non-English speakers.
•• Low-income persons.
•• Persons experiencing homelessness.
•• Undocumented immigrants.
•• Persons who have distrust in the government.
•• Lesbian, Gay, Bisexual, Transgender, and
Questioning/Queer (LGBTQ) persons.
•• Persons with mental and physical disabilities.
•• Persons who do not live in traditional housing.
The ever-evolving societal changes and trends
have influenced the 2020 Census operational
design. Woven throughout the operational design
are operations and activities undertaken for populations that have historically been hard to count,
continue to be hard to count, or are emerging
as hard to count. Shown in Figure 2 is the 2020 Census operational placemat. Shaded in darker blue are operations that make the most significant contributions to an integrated design for hard-to-count populations. Through these operations the Census Bureau:
•• Engages with stakeholders to understand the
opportunities and challenges in enumerating
hard-to-count populations.
•• Determines what information to collect.
•• Identifies the addresses where people live or
could live.
•• Determines how to connect with people.
•• Motivates people to respond.
•• Collects information from all households,
including those residing in groups or unique
living arrangements.
To a certain extent, any deviation from the traditional or "ideal" path to response can be seen as an effort to encourage response and participation from someone who might otherwise not respond to, and be counted as part of, the 2020 Census. From an operational design perspective, the ideal path to a 2020 Census response involves the delivery of an initial invitation letter containing a unique census identifier; the respondent receives the letter, sits down at a computer or similar device, and uses that unique identification code to complete and submit a census response. However, our world is not ideal for everyone.
In the text that follows, we itemize activities and
operations the Census Bureau will implement in
support of hard-to-count populations.
SUPPORT OPERATIONS
We learn from every decennial census. The Census
Bureau’s ability to connect with the population as
a whole and to have the population connect with
the data collected in a decennial census provides
opportunities for hard-to-count populations to
understand the importance of the census and
to see themselves in the data that are collected.
Understanding the challenges that face hard-to-count populations, providing materials in multiple languages for non-English-proficient populations,
and—as the diversity of the U.S. population has
grown—evolving the decennial census content are
ways in which the Census Bureau engages with
and encourages participation in the 2020 Census.
Figure 2: 2020 Census Operational Placemat

[Placemat graphic not reproduced here. Its panels group the 2020 Census operations under the banners Program Management; Census/Survey Engineering; SUPPORT (Infrastructure); FRAME; RESPONSE DATA; PUBLISH DATA; OTHER CENSUSES; and TEST AND EVALUATION.]
Program Management
Stakeholder Communication and
Engagement
Nested within the 2020 Census Program
Management Operation is Stakeholder
Communication and Engagement. The Census
Bureau engages with various internal and external
stakeholders regarding our planning, research, and operational design. The Census Bureau engages early and on a regular basis to share our plans but, more importantly, to listen, to understand, and to collect information on the opportunities and challenges with groups that have historically been hard to count, as well as groups that are emerging as hard to count.
Content and Forms Design
The 2020 Census will enable different race and
ethnic groups to self-identify their race/ethnicity
on their census questionnaires. Respondents will be able to select multiple checkboxes for race and Hispanic origin. The race question includes 15 different checkboxes, and respondents can also select “Some other race” if they do not see themselves in the other 14 options.
In addition, regardless of which checkboxes a
respondent selects, detailed responses can be
added in the write-in fields.
Stakeholder Communication and Engagement
activities include:
•• 2020 Census Program Management Reviews.
•• National Advisory Committee meetings and working groups that specifically look at hard-to-count populations and potential impacts that aspects of the 2020 Census operational design would have on hard-to-count populations.
•• Census Scientific Advisory Committee
meetings.
•• Congressional briefings.
The Census Bureau also conducted a series of tribal consultations with federal- and state-recognized tribes. In these consultations, the Census Bureau shared information about the 2020 Census Operational Design and collected input on each tribe's preferred enumeration method (Self-Response, Update Leave, or Update Enumerate).
In addition, we regularly share information about
the 2020 Census Operational Design via presentations in various forums, which often leads to feedback, concerns, and recommendations pertaining
to hard-to-count populations.
From a HTC Framework perspective, Stakeholder Communication and Engagement is focused on all four segments: Hard-to-Locate, Hard-to-Contact, Hard-to-Persuade, and Hard-to-Interview.
Figure 3: 2020 Census Race Question
Respondents who self-identify with Hispanic,
Latino, or Spanish origin will be able to further
indicate if they are Mexican, Mexican-American,
Chicano, Puerto Rican, Cuban, or another
Hispanic, Latino, or Spanish origin. There will also
be a dedicated write-in box to print origins of
those identifying with "another Hispanic, Latino,
or Spanish origin."
Figure 4: 2020 Census Hispanic Origin
Question
The 2020 Census also enables respondents to indicate their relationship to household members through a variety of relationship categories.
This includes the distinction between opposite-sex
and same-sex husband/wife/spouse/unmarried
partner categories. Relationship data are used
in planning and funding government programs
that provide funds or services for families, people
living or raising children alone, grandparents living
with grandchildren, or other households that qualify for additional assistance.
From the very first census in 1790, Congress
established the principle of counting people
where they usually reside, which is defined as the
place where a person lives or sleeps most of the
time, in order to be fair and consistent. The 2020
Census residence criteria and residence situations
determine who should be counted and where they
should be counted. Every decade, the Census
Bureau undertakes a review of the decennial residence criteria and residence situations to ensure
the concept of “usual residence” is applied in a
way that is consistent with the Census Bureau’s
commitment to count every person once, only
once, and in the right place. With respect to the 2020 Census residence criteria, language on our questionnaires helps to count all people, including young children. A summary of the residence criteria is the first thing a respondent will read on the paper questionnaire; our Internet instrument provides help text with a clear summary of the residence criteria; and during field operations, respondents are shown an informational sheet with instructions on who should be counted.
In addition, our undercount question gives the
respondent an opportunity to ensure everyone has
been included. With increasingly complex living arrangements, determining whom to include in the household population count can be a challenge. The
2020 Census will include revised wording related
to young children, who have historically been
undercounted.
Figure 6: 2020 Census Undercount Question
To address the undercount of young children, we include specific instructions in our mailing materials about including young children.
Figure 5: 2020 Census Relationship Question
From a HTC Framework perspective, the Content
and Forms Design Operation is focused on the
Hard-to-Persuade and the Hard-to-Interview
segments.
Language Services
The 2020 Census will enable limited English-speaking individuals to respond to the census by providing language assistance. This represents a significant expansion compared to the 2010 Census; the 2020 Census will have the most robust language program the Census Bureau has ever built.
According to the 2016 American Community Survey 5-year estimates, there are over 3 million households in the United States that are Spanish-speaking and limited English-speaking. This accounts for over 60 percent of total limited English-speaking households. Accordingly,
the 2020 Census will deliver bilingual English/
Spanish mailing materials to addresses in Spanish
tracts, as well as enable enumerators to toggle
between English and Spanish in the enumeration
instrument when enumerating Spanish-speaking
households. In addition, the 2020 Census will
provide the Internet Self-Response instrument
and Census Questionnaire Assistance in Spanish
and 11 additional non-English languages, covering
over 85 percent of total limited English-speaking
households. The languages, in descending order of need, are Spanish, Chinese, Vietnamese, Korean, Russian, Arabic, Tagalog, Polish, French, Haitian
Creole, Portuguese, and Japanese. For Census
Questionnaire Assistance, there will be separate
telephone numbers dedicated to each language
with two different numbers for Chinese, one for
Mandarin and one for Cantonese. There will also be a dedicated number for Telephone Display Devices (TDD). The telephone numbers will be included in
the 2020 Census mailing packages.
There will also be language guides in 59 non-English languages (including the aforementioned languages), through which respondents will receive information via video and/or print guides on filling out their questionnaires. The language guides will include American Sign Language, braille, and large print. This will account for approximately 98 percent of total limited English-speaking households. For the remaining limited English-speaking households, 2020 Census staff and partnership specialists will work with the language communities to provide additional assistance in their languages.
From a HTC Framework perspective, the
Language Services Operation is focused primarily
on the Hard-to-Interview segment.
Field Infrastructure
Often the topic of hard-to-count efforts leads to
questions about hiring and language skills. The
objective of our Field Infrastructure Operation
is to provide the human resources and personnel management support functions, including recruiting, hiring, and onboarding, that reflect the diversity of the nation to support, facilitate, and encourage response.
A key point in our recruiting and hiring process is
to make it local. The Census Bureau will hire enumerators who are comfortable and familiar with
the neighborhoods where they work. Recruiting
and hiring at low levels of geography is essential,
as is the ability to speak the languages of the local
community. The overarching strategy for hiring
enumerators is to hire people who will work in the
communities in which they live.
The 2020 Census Community Partnership and
Engagement Program will focus the efforts of
approximately 1,500 partnership specialists to increase self-response and participation in communities that are hesitant to respond or that will not respond. Partnership specialists will use
existing networks, resources, and “trusted voices”
to increase census participation in low response
communities.
When considering the HTC Framework, the primary focus of the Field Infrastructure Operation is related to the Hard-to-Persuade and Hard-to-Interview segments.
FRAME
The Census Bureau never ceases its efforts to
maintain the Master Address File (MAF) and
Topologically Integrated Geographic Encoding
and Referencing (TIGER) System, which serve as
the foundation on which we base the 2020 Census
operational design. The objective of the operations associated with the frame is to develop a
high-quality geospatial frame that serves as the
universe for enumeration activities representing all
of the places where people live or could live. The
Census Bureau regularly updates our address list—
the MAF—with new information from the United
States Postal Service, and data from tribal, state,
and local governments and third-party data (commercial vendors). We are in a constant state of
exploration to identify new sources of address and
geospatial information that can corroborate data
from other sources, fill in missing information, and
add new addresses and spatial data to improve
the overall coverage and quality of the MAF and
TIGER data.
Area Census Office (ACO) Delineation
The Census Bureau is opening 248 ACOs to support the 2020 Census. The estimated Nonresponse Followup (NRFU) workload, which comprises hard-to-count addresses, was the primary driver in determining the location and span of control
for each office. The initial number of ACOs was
determined based on the number of enumerators
needed for field operations. Several data sources
were used to estimate the number of enumerators
needed per area, such as response rate projections based on the 2010 Census, the estimated
NRFU workload, and the locations of group
quarters (university dormitories, nursing homes,
prisons, military barracks, etc.).
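As a rough illustration of the sizing logic just described, the sketch below converts a projected self-response rate into an estimated NRFU workload and an enumerator count. The per-enumerator caseload and every input figure here are hypothetical assumptions, not actual 2020 Census planning parameters.

```python
# Hypothetical sketch of ACO sizing arithmetic; all numbers are invented.

def estimated_nrfu_workload(housing_units: int, projected_self_response: float) -> int:
    """Addresses expected to need in-person follow-up."""
    return round(housing_units * (1.0 - projected_self_response))

def enumerators_needed(nrfu_workload: int, cases_per_enumerator: int = 300) -> int:
    """Ceiling division: every case must be covered by some enumerator."""
    return -(-nrfu_workload // cases_per_enumerator)

# A hypothetical office area: 120,000 housing units with a 60.5 percent
# self-response projection based on 2010 Census response rates.
workload = estimated_nrfu_workload(120_000, 0.605)  # 47,400 addresses
staff = enumerators_needed(workload)                # 158 enumerators
```

In practice the Census Bureau combined several such inputs (response-rate projections, NRFU workload estimates, and group quarters locations) per area; this sketch shows only the basic arithmetic.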
Type of Enumeration Area (TEA)
The TEA represents the predominant enumeration method for conducting the 2020 Census in
a given geographic area. The TEA assignment is
based on area characteristics to maximize respondent participation. TEAs are attributes of a Basic
Collection Unit (BCU); every BCU will have a TEA
attribution. In a very general sense, all TEAs other
than Self-Response are aimed at listing and enumerating housing units in areas that may require
special procedures to ensure accurate counting,
i.e., treating all areas in the same way will not
work. We cannot use a mail contact strategy in
areas where the majority of housing units do not
have mail delivered to the physical location of
the address. Many of these areas (such as Update
Enumerate) contain hard-to-count populations.
Please see the sections below pertaining to the
individual operations.
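A minimal sketch of how a TEA might be attributed to a BCU from its characteristics follows; the threshold and the attribute names are illustrative assumptions, not the Census Bureau's actual delineation rules.

```python
from dataclasses import dataclass

@dataclass
class BCU:
    """Basic Collection Unit with two hypothetical attributes."""
    physical_mail_share: float  # share of units with mail delivered to the physical address
    remote: bool                # geographically remote or hard to access

def assign_tea(bcu: BCU) -> str:
    """Attribute a Type of Enumeration Area to a BCU (illustrative rules only)."""
    if bcu.remote:
        return "Update Enumerate"
    if bcu.physical_mail_share < 0.5:
        # Majority of units lack mail delivery at the physical location,
        # so a mail contact strategy will not work there.
        return "Update Leave"
    return "Self-Response"
```

The point of the sketch is that TEA is a per-BCU attribute derived from area characteristics, with Self-Response as the default and the other TEAs reserved for areas needing special procedures.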
Address Canvassing (ADC) Operation
ADC is part of a continual effort to identify all
possible places where people live or could live.
In our efforts to ensure we count everyone where
they spend most of their time, we must identify
all possible places where people could live. This
includes hidden housing units. Occupants of hidden housing units are considered hard-to-count.
If we are unable to discover hidden housing units,
we are unable to count the occupants of those
units. As part of the ADC training, the Census
Bureau instructs listers to identify and inquire
about hidden housing units.
Local Update of Census Addresses (LUCA)
Operation
The LUCA Operation provides the opportunity
for tribal, state, and local governments to review
and comment on the Census Bureau’s address list
and maps to ensure an accurate and complete
enumeration of their communities. The Census
Address List Improvement Act of 1994 (P.L. 103-430) authorized the Census Bureau to provide
individual addresses to designated local officials
of tribal, state, and local governments who agreed
to conditions of confidentiality in order to review
and comment on the Census Bureau’s address list
and maps prior to the decennial census. The basic
process for LUCA includes:
•• Census Bureau provides address list and maps
to the governmental entities.
•• Governmental entities review and add, delete,
or change address records or features.
•• Census Bureau incorporates the updates into the MAF/TIGER system.
•• Census Bureau validates the updates through
a clerical review, automated address matching,
and ADC.
•• Census Bureau provides feedback to the governmental entities.
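The review-and-update step above can be pictured as applying a batch of entity-submitted add/delete/change actions to the address list before validation. This toy sketch uses an invented record layout and is not the actual MAF/TIGER update mechanism.

```python
def apply_luca_updates(address_list, actions):
    """address_list: dict mapping a MAF-style ID to an address string.
    actions: iterable of (verb, maf_id, address) with verb in {add, delete, change}.
    Returns a new dict with the governmental entity's updates applied."""
    updated = dict(address_list)
    for verb, maf_id, address in actions:
        if verb == "add":
            updated[maf_id] = address
        elif verb == "delete":
            updated.pop(maf_id, None)          # ignore deletes for unknown IDs
        elif verb == "change" and maf_id in updated:
            updated[maf_id] = address
    return updated
```

In the real process the applied updates are then validated through clerical review, automated address matching, and ADC before feedback is returned to the entity.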
The Census Bureau offers additional opportunities to review and provide input on the coverage,
completeness, and accuracy of the address list
through:
•• The Geographic Support System program.
•• The Count Review Operation.
•• The New Construction program.
From a HTC Framework perspective, operations
associated with the frame are focused on the
Hard-to-Locate segment.
RESPONSE DATA
Targeted advertising and tailored contact strategies for different demographic and geographic areas, along with our partnership program outlined below, assist in connecting with hard-to-count
populations. The 2020 Census operational design
makes it easier for people to respond through
multiple modes (Internet, paper, or telephone), by
allowing respondents to submit a questionnaire
without a unique Census identifier, and by providing online forms, paper forms, and flexible and
adaptive telephone support in multiple languages.
When and where field data collection efforts are
implemented, the Census Bureau tailors the enumeration strategy to the demographic and geographic areas.
Integrated Partnership and Communications
(IPC) Operation
The IPC Operation must reach every household in
the nation, delivering the right messages to the
right audiences at the right time. It must allocate messages and resources efficiently, ensuring a consistent message and a consistent look and feel across all public-facing materials, in both communications efforts and operations. The program will
offer the following components:
•• Advertising, using print, radio, digital, television, and out-of-home.
•• Earned media and public relations.
•• Partnership, including both regional and
national efforts.
•• Social media, to include blogs and messages on
platforms such as Facebook, Twitter, Instagram,
Snapchat, etc.
•• Statistics in Schools.
•• Rapid response.
•• Web site.
The IPC Operation will implement an integrated
communications campaign, to increase awareness
of the decennial census, promote self-response,
reduce cost for NRFU operations, and improve
response rates for our audiences. These audiences
include hard-to-count populations.
Foundational research conducted as part of the
IPC Operation to better identify and understand
our audiences, particularly hard-to-count audiences, is known as the Census Barriers, Attitudes,
and Motivators Study (CBAMS). As part of 2020
CBAMS, the Census Bureau conducted a survey
called the 2020 Census Barriers, Attitudes, and
Motivators Study Survey (2020 CBAMS Survey),
designed to understand mindsets or correlated
attitudes and barriers that relate to census participation across demographic subgroups. The
2020 CBAMS Survey was a self-administered mail
and Internet data collection covering a range of
topics related to respondents’ knowledge of and
attitudes toward the 2020 Census. Results will
be used to understand how demographic subgroups respond to these questions. Results of the
quantitative survey will also serve as an input to
understanding the mindsets used in an audience
segmentation analysis. The audience segmentation analysis considers tracts and clusters them
based on their propensity to self-respond, their
demographic characteristics, and our understanding of their mindsets based on responses to the
2020 CBAMS Survey. This audience segmentation
analysis will drive creative development and media
planning.
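As a simplified picture of what tract-level segmentation produces, the sketch below bins tracts by predicted self-response propensity alone. The real analysis also clusters on demographic characteristics and CBAMS-derived mindsets, and the cutoffs and tract figures here are invented.

```python
def segment_tracts(propensity_by_tract):
    """Map each tract to a coarse audience segment (illustrative cutoffs)."""
    segments = {}
    for tract, propensity in propensity_by_tract.items():
        if propensity < 0.50:
            segments[tract] = "low self-response"       # heaviest media and partnership effort
        elif propensity < 0.70:
            segments[tract] = "moderate self-response"
        else:
            segments[tract] = "high self-response"      # lightest-touch contact
    return segments
```

The resulting segments are the kind of input that drives creative development and media planning: low-propensity tracts receive more intensive outreach.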
Because the survey could not achieve 100 percent
response and because we cannot obtain enough
cases for small demographic groups or otherwise
hard-to-count populations, the Census Bureau
supplemented the survey data collection with
qualitative research. The qualitative research was
achieved through conducting focus groups (2020
CBAMS Focus Groups) aimed at gathering insights
from subgroups unable to participate in the 2020
CBAMS Survey or from subgroups that would not
have a large enough number of respondents from
which to draw meaningful inferences. Although
the results of the focus groups will not be directly
incorporated into segmentation, they will provide
an anecdotal guide that will be effective in planning communications.
The qualitative research provided better reach for
small and hard-to-count communities. It provided
deeper insights that will further inform message
development and creation. The CBAMS qualitative
research comprised 42 focus groups with six to
eight participants per group. The following are the
groups for the English language focus groups:
•• Two focus groups with rural, economically disadvantaged individuals.
•• Four focus groups with low Internet proficiency
individuals.
•• Four focus groups with Black/African American individuals with a hard-to-count focus.
•• Six focus groups with American Indian and
Alaska Native individuals—two in Alaska and
four in the continental United States.
•• Four focus groups with Middle Eastern and North African individuals.
•• Four focus groups with Native Hawaiian and
Pacific Islander individuals.
•• Two focus groups with young, single, mobile
individuals with mixed race/ethnicity.
The following are the groups for non-English
speaking focus groups:
•• Four focus groups with Spanish-speaking individuals who live on the U.S. mainland.
•• Four focus groups with Spanish-speaking individuals in Puerto Rico.
•• Four focus groups with Chinese-speaking
individuals.
•• Four focus groups with Vietnamese-speaking
individuals.
English-speaking audiences prioritized for the
2020 CBAMS Focus Groups represent groups who
either will not be surveyed by the 2020 CBAMS
Survey or who are anticipated to be underrepresented in that dataset. During this phase of the
research, there will not be dedicated focus groups
with additional hard-to-count audiences such
as people experiencing homelessness, undocumented immigrants, children, persons who are
angry at and/or distrust the government, and
LGBTQ persons. However, individuals from these
groups may be represented within focus groups
planned at this stage. They will also be part of
the creative testing research, for which more
resources should be available to increase capacity
to reach and engage audiences. In addition, IPC
plans to engage these groups through mechanisms outside of focus groups.
We will advertise in multiple languages and work
with the “trusted voices” in communities across
the nation to encourage response to the 2020
Census. The Census Bureau will expend resources
to reach the hard-to-count populations using both
traditional and digital media, as well as the use
of ethnic and local media. However, final decisions on how much to allocate to each of these
efforts have not been made pending results of the
CBAMS research efforts. Digital media will allow
us to reach hard-to-count populations more effectively than ever before. Census Bureau partners
include national organizations, but also churches
and other faith-based organizations, health clinics,
legal aid centers, and other support organizations
that traditionally undercounted populations rely
on.
Partnerships educate people about the importance of the census, motivate them to return their
questionnaires, and encourage cooperation with
enumerators. The Census Bureau traditionally
focuses on establishing partnerships with organizations that represent hard-to-count populations. For the 2010 Census, the Census Bureau established over 250,000 partnerships and has sustained as many of those relationships as possible during the intercensal years, putting the 2020 Census in a better starting position than previous censuses. In order
to optimize self-response, the Census Bureau
has a robust relationship through the Partnership
Program that includes state, local, and tribal
governments; nongovernmental organizations at
the national and local level; national companies;
and schools. Within the Partnership Program, the
Community Partnership and Engagement Program
includes objectives to:
•• Increase self-response.
•• Use “trusted voices” to make census messages
relevant at the local level.
•• Grow the partnership audience.
•• Increase awareness among the general public.
•• Increase partnership engagement at the local
level through new or improved programs.
The Census Bureau relies on the support of partners throughout the country to help perform a
complete and accurate count. We work together
with our partners to extend our outreach efforts
and connect with hard-to-count populations.
From a HTC Framework perspective, IPC focuses
on the Hard-to-Contact, Hard-to-Persuade, and
Hard-to-Interview segments.
Internet Self-Response Operation
A goal of the 2020 Census Optimizing Self-Response effort is to generate the largest possible self-response, reducing the need to conduct expensive in-person follow-up with nonresponding households. This is done in several ways:
•• Enabling people to respond via multiple modes (Internet, paper, or telephone) and allowing people to respond on devices such as a home computer, laptop, tablet, or smartphone.
•• Allowing respondents to submit a questionnaire without a unique Census identifier (see the Non-ID Processing Operation below).
•• Providing online forms in multiple languages (see Language Services above).
The operational design for the Internet Self-Response Operation includes, but is not limited to, the following:
•• Ability to capture larger households than is possible in a traditional paper-based survey.
•• Deployment of an application that can be used across modern Internet devices and browsers.
•• An application user interface that is available in English and non-English languages.
•• A self-response contact strategy that is tailored to demographic and geographic areas, designed to encourage Internet self-response.
While the 2020 Census operational design mailing strategy is tailored to demographic and geographic areas to encourage self-response, the strategy recognizes that the Internet-first response option is not optimal for some populations who may have the will, but not the ability, to respond online. As such, when areas have known characteristics, such as low Internet connectivity and concentrations of elderly populations, providing a paper questionnaire with the first mailing provides maximum response opportunities and increases the likelihood of receiving a self-response. Because many people need more encouragement and reminders, our mail strategy involves up to five mailings, with a combination of letters and reminders and, for anyone who has not responded, a paper questionnaire with the fourth mailing. Any address that does not self-respond is included in the workload for NRFU and subject to in-person contact attempts to collect decennial census response data.
Non-ID Processing (NID) Operation
The NID Operation is focused on making it easy for people to respond anytime, anywhere, to increase self-response rates. We will do this by:
•• Maximizing real-time matching of NID respondent addresses to the census address inventory.
•• Providing response options that do not require a unique Census identifier.
•• Accurately assigning nonmatching addresses to census blocks.
The NID response option provides opportunities to populations who predominantly use mobile devices and may respond while taking a bus to work, sitting in a doctor's office where they see a 2020 Census poster, etc. The NID response option may also improve coverage by reaching households that were not on our frame and may not have received any census mailing but saw an advertisement and were able to respond.
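The mailing cadence described in the Internet Self-Response discussion above, up to five mailings, with a paper questionnaire either up front (for areas with low Internet connectivity) or in the fourth mailing, can be sketched as follows. The panel labels and the "Internet Choice" flag are simplified assumptions rather than the actual mailing package names.

```python
from typing import List, Optional

def mailing_plan(internet_choice: bool, responded_after: Optional[int]) -> List[str]:
    """Mailings a household receives. responded_after is the number of
    mailings delivered before the household self-responded, or None if
    it never responds."""
    panel = [
        # Hypothetical "Internet Choice" areas get paper in the first mailing.
        "invitation + paper questionnaire" if internet_choice else "invitation letter",
        "reminder letter",
        "reminder postcard",
        "paper questionnaire",   # every nonresponder gets paper by mailing 4
        "final reminder",
    ]
    if responded_after is None:
        # No self-response after all five mailings: the address goes to NRFU.
        return panel + ["NRFU in-person follow-up"]
    return panel[:responded_after]
```

The design choice the sketch mirrors is that mail contacts stop as soon as a household responds, so earlier self-response directly reduces both mailing and NRFU costs.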
Update Leave (UL) Operation
The UL Operation is designed to update the
address frame and deliver questionnaires in geographic areas where the majority of housing units
either do not have mail delivered to the physical
location of the housing unit, or the mail delivery
information for the housing unit cannot be verified. The purpose of the operation is to update the
address and feature data for the area assigned,
and to leave a 2020 Census Internet Choice
Questionnaire Package at every housing unit
identified to allow the household to self-respond.
In many ways, the UL Operation is an extension
of a Self-Response area with the major difference
being that a Census Bureau employee, rather than
a U.S. Postal Carrier, is delivering the 2020 Census
invitation to respond, along with a paper questionnaire. While the Census Bureau hand delivers
questionnaires, respondents will also have the
option to respond online or over the telephone
by calling Census Questionnaire Assistance. The
UL Operation—similar to In-Field ADC—involves walking a geographic area to update the address list, identify missing and hidden housing units, and knock on every door to inquire about the existence of additional housing units.
Hard-to-count populations often reside in UL
areas. In order to effectively count these populations, their location must be accurately verified.
UL can occur in geographic areas that:
•• Do not have city-style addresses.
•• Do not receive mail through city-style
addresses.
•• Receive mail at post office boxes.
•• Have been affected by major or natural disasters such as hurricanes, earthquakes, wildfires, tornadoes, etc.
•• Have high concentrations of seasonally vacant
housing.
From a HTC Framework perspective, the focus of the UL Operation is primarily on the Hard-to-Locate and Hard-to-Contact segments.
Update Enumerate (UE) Operation
The UE Operation is designed to update the
address frame and enumerate respondents in geographically remote areas that have low housing-unit density, are sparsely populated, or have challenges with accessibility.
UE will occur in the following geographic areas:
•• Remote areas of Maine and Southeast Alaska.
•• Select tribal areas.
•• Remote Alaska, which is considered a suboperation of UE.
In the UE Operation, field staff update the address
and feature data, and enumerate respondents in
person. UE offers respondents in areas with limited or no (broadband) Internet access and limited
cell phone access (with expensive data plans in
remote areas) an effective and familiar enumeration method.
Many of the hard-to-count populations reside in
areas where the Census Bureau is not confident in
the accuracy of the address or demographic data,
and where updates may not be conducted as
often as in areas that are more populous. In order
to be thorough and accurate, yet cost-effective,
UE will ensure that data for listing and enumeration are collected together. UE addresses hard-to-count populations by:
•• Linking enumerated housing units to listing
data, ensuring accurate processing of both listing and enumeration data.
•• Involvement with the local community or tribe
in order to optimize effective operational
implementation and to encourage higher
response rates.
•• Hiring from local population for enumerators,
guides, or cultural facilitators who are familiar
with the residents and have the language or
other necessary skills to facilitate a response.
Group Quarters (GQs) Operation/Service-Based Enumeration (SBE) Program
The Census Bureau conducts a number of operations designed for the enumeration of populations in special living arrangements. GQs are
places where people live or stay in a group living
arrangement that is owned or managed by an entity or organization providing housing or
other services for the residents. GQs may have
administrators or gatekeepers that make residents
of these facilities hard to interview. Some GQ facilities are for persons experiencing homelessness, making the populations receiving services
both hard to interview and hard to contact. SBE
is designed specifically to enumerate at service-based locations such as emergency and
transitional shelters, soup kitchens, regularly
scheduled mobile food vans, and Targeted Non-Sheltered Outdoor Locations. The SBE process is
specifically designed to approach people using
service facilities because they may be missed
during the traditional enumeration of housing
units and GQs.
An additional special enumeration operation
designed for a specific population group is the
Federally Affiliated Count Overseas (FACO),
where the Census Bureau will receive administrative records for all military personnel and
their dependents from the Defense Manpower Data Center under the Department of
Defense or from federal agencies who have staff
stationed overseas.
Enumeration at Transitory Locations (ETL)
Operation
A Transitory Location (TL) is a location composed of nontraditional living quarters where people are unlikely to live year-round due to the transitory, temporary, or impermanent nature of these
living quarters. At TLs, we enumerate highly
mobile populations. TLs include places such as
recreational vehicle parks, campgrounds, hotels,
motels, marinas, racetracks, circuses, or carnivals.
From a HTC Framework perspective, the special enumeration operations, including GQ, SBE, Military Enumeration, FACO, and ETL, focus on populations in both the Hard-to-Contact and Hard-to-Interview segments.
Nonresponse Followup
The NRFU Operation is entirely about hard-to-count populations. NRFU is focused on contacting and persuading residents of nonresponding addresses to provide their census responses. The objective of NRFU is to determine or resolve the housing unit status (occupied, vacant, or nonexistent) for all addresses for which a self-response has not been received and to collect census response data for housing units determined to be occupied.
Administrative records, when high-quality data
exist, are used in place of repeated attempts to
reach nonresponding housing units. This enables
the Census Bureau to focus its NRFU contact
attempts on those housing units not represented
well by high-quality administrative records, likely
the harder-to-count populations.
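The split described above, resolving cases with administrative records where possible and sending enumerators otherwise, can be sketched as a simple workload partition. This is an illustrative sketch only, not Census Bureau software; the function name and the `has_high_quality_admin_records` field are assumptions:

```python
def nrfu_workload(nonresponding_cases):
    """Partition nonresponding cases into two groups: those resolvable
    with high-quality administrative records, and those requiring
    in-person contact attempts. Field names are illustrative."""
    resolved, field_visits = [], []
    for case in nonresponding_cases:
        if case["has_high_quality_admin_records"]:
            resolved.append(case)       # enumerate from records
        else:
            field_visits.append(case)   # send an enumerator
    return resolved, field_visits
```

The point of the partition is that field effort concentrates on the cases least represented by records, which, as the text notes, are likely the harder-to-count populations.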
NRFU enumerator training, job aids, and frequently asked questions include information
and an emphasis on counting young children.
Enumerator training includes a case study
intended to provide clarity about how to count
young children during the 2020 Census. All frequently asked questions and job aids have also
been updated to address counting young children.
Additionally, the verbiage that enumerators use
during the interview will be updated to highlight
the count of children when determining the housing unit's roster. For example, rather than using
phrases such as "the census counts people/residents..." enumerators will say, "the census counts
all adults and children..." Enumerators will also ask
the additional coverage questions featured in the
Internet self-response mode of data collection.
The NRFU operational design also addresses
hard-to-count populations in the procedures
used for making contact attempts. While most
cases receive a maximum of six attempts, cases
in hard-to-count areas may receive more than six
attempts to achieve a consistent response rate
for all geographic areas. Additionally, all cases
are eligible for proxy enumeration after the third
attempt, allowing for four proxy attempts to
enumerate the housing unit. These attempts are
especially helpful in enumerating hard-to-count
populations.
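The contact-attempt rules above can be sketched as simple decision logic. This is an illustrative sketch, not actual Census Bureau software; the function name and return labels are assumptions, while the thresholds (a typical six-attempt cap, proxy eligibility after the third attempt) come from the text:

```python
def next_attempt_type(attempts_made, hard_to_count=False):
    """Classify the next contact attempt for a nonresponding NRFU case.

    Sketch of the rules described in the text: most cases receive a
    maximum of six attempts, hard-to-count areas may receive more, and
    proxy enumeration becomes an option after the third attempt.
    """
    if attempts_made < 3:
        return "household only"        # proxy respondents not yet allowed
    if attempts_made < 6 or hard_to_count:
        return "household or proxy"    # proxy enumeration now eligible
    return "no further attempts"       # typical six-attempt cap reached
```

For example, a case with four completed attempts may be worked through either the household or a proxy, while a seventh attempt occurs only for hard-to-count cases.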
In an effort to increase the likelihood that students
and faculty living in geographic areas surrounding
colleges and universities will be counted where
they lived on Census Day, the Census Bureau
will conduct early NRFU. Early NRFU focuses on
colleges and universities where the 2020 spring
semester concludes prior to mid-May when NRFU
begins nationwide. In these select geographic
areas, NRFU will begin in early April.
In all NRFU areas, as enumerators are making
contact attempts in the communities in which they
work, they may encounter language barriers to
completing an interview. When a language barrier
is encountered, efforts will be made to identify an
enumerator who speaks the non-English language of the respondents. If an enumerator with
the needed language skills cannot be identified,
the Census Bureau will engage the services of an
interpreter to facilitate the interview. In addition,
if an enumerator visits a nonresponding address
and no one answers, the enumerator will leave a
Notice of Visit that provides information for the
household on how to respond online or over the
telephone.
From an HTC Framework perspective, NRFU
focuses on the Hard-to-Persuade segment.
Coverage Improvement Operation
The Coverage Improvement Operation is unique
in that the Census Bureau has a response from an
address, but some question remains about that
response. The objective of the Coverage
Improvement Operation is to recontact housing
units in an effort to determine if people were
missed, counted in the wrong place, or counted
more than once during the census.
Criteria for the identification of cases for the
Coverage Improvement Operation include:
•	Cases with count discrepancies, either high or low, between the population count reported and the number of people for whom data are reported.
•	Affirmative responses to either the Undercount (shown in image 6) or the Overcount question.
U.S. Census Bureau
The goal of Coverage Improvement is to resolve
potential coverage issues through a recontact with
the household, asking questions in an attempt
to resolve whether someone has been missed
and should be included in the count, or whether
someone was included in the count and should be
removed. The following is an example of an undercount question or probe asked in the Coverage
Improvement instrument: "I'd like to make sure
that we are not missing anyone who lived or
stayed at this address. Were there any babies,
children, grandchildren, or foster children that you
did not mention?"
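The selection criteria for Coverage Improvement amount to a simple predicate over a household's response. The following is a minimal sketch under assumed field names; it is not the Census Bureau's actual selection code:

```python
def needs_coverage_improvement(reported_count, person_records,
                               undercount_flag=False, overcount_flag=False):
    """Flag a response for Coverage Improvement recontact.

    A case qualifies if the reported population count disagrees (high
    or low) with the number of people for whom data were provided, or
    if the respondent answered the Undercount or Overcount coverage
    question affirmatively. Parameter names are illustrative.
    """
    count_discrepancy = reported_count != len(person_records)
    return count_discrepancy or undercount_flag or overcount_flag
```

For instance, a household that reported three residents but provided data for only two would be flagged for recontact, as would one that answered the Undercount question affirmatively.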
When considering the HTC Framework, Coverage
Improvement cases fall in the Hard-to-Locate
segment, not from a missing-address perspective
as with the frame, but from a person perspective
in terms of where a person should be counted.
UNEXPECTED EVENTS
Despite the Census Bureau’s best efforts to plan
for the execution of the 2020 Census, unexpected
events, such as natural disasters, can occur. When
an unexpected event occurs, geographic areas,
populations, or both may become hard to count.
How the Census Bureau reacts and how we
approach the 2020 Census enumeration depends
on the event. When an event occurs, the Census
Bureau will form a rapid-response team to assess
the impact of the event and develop a recommended reaction to the event. In forming a
response plan, the Census Bureau will consider
factors such as the timing of the event, its severity,
the impacted geographic area, access to the impacted
area, and other environmental concerns.
Past events impacting a decennial census enumeration include Hurricanes Katrina and Rita, which
devastated the Gulf Coast prior to the 2010 Census
and necessitated changes in the planned enumeration
methodology. More recently, the devastation resulting
from Hurricane Maria, which hit Puerto Rico in 2017,
led the Census Bureau to decide to conduct UL in
Puerto Rico; this decision allows additional recovery
time for the impacted area and will result in Census
Bureau staff hand delivering questionnaires to all of
the locations where people are living.
The Census Bureau will face the challenges of any
unexpected event and will take steps necessary to
enumerate the population impacted by any such
event.
TOOLS
In the Census Bureau’s efforts to enumerate hard-to-count populations, there are tools, techniques,
and methods that support the operations and
activities outlined above. While not an exhaustive
list, they include:
•	The Planning Database.
•	Response Outreach Area Mapper (ROAM).
•	Residence Criteria/Instructions.
•	Coverage Questions.
•	Language Materials.
•	Mailing Strategy.
•	Non-ID Response.
•	Administrative Records.
•	Field Workforce.
•	Blitz Enumeration.
The Planning Database and ROAM were not
mentioned previously, but are tools that can be
used by Census Bureau Partnership Specialists,
local officials, and community leaders to identify
hard-to-count areas. The ROAM combines low-response-score data with an interactive mapping
platform to allow users to identify hard-to-count
areas and better understand the populations
of these areas for the purposes of 2020 Census
outreach and promotion. Identifying areas needing extra attention can help make the most of
time and resources when devising a communication and outreach strategy for hard-to-count
populations.
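ROAM itself is an interactive web map, not a programming interface, but the underlying idea, ranking geographic areas by a predicted low-response score from the Planning Database, can be illustrated with a short sketch. The tract identifiers, field names, and scores below are invented for the example:

```python
def hardest_to_count(tracts, top_n=3):
    """Return the top_n tracts with the highest predicted low-response
    scores, i.e., the areas most likely to need extra outreach."""
    return sorted(tracts, key=lambda t: t["low_response_score"],
                  reverse=True)[:top_n]

# Invented example data; real scores come from the Planning Database.
tracts = [
    {"tract": "0101", "low_response_score": 17.2},
    {"tract": "0102", "low_response_score": 31.8},
    {"tract": "0103", "low_response_score": 24.5},
]
priority = hardest_to_count(tracts, top_n=2)
```

A partnership specialist working from such a ranking would direct outreach resources to the highest-scoring tracts first.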
Each tool listed above is important in supporting
an integrated design for the enumeration of hard-to-count populations.
SUMMARY
Efforts to count everyone once, only once, and
in the right place—including hard-to-count populations—are infused throughout the operational
design of the 2020 Census. From early efforts
that engaged hard-to-count populations, such
as federal- and state-recognized tribes, through
ongoing interactions with our National Advisory
Committee, the Census Bureau devotes resources
to research, testing, and an operational design
that considers how our environment, societal
changes, and technological innovations shape
our understanding of our population and the
approaches we must take to ensure a complete
and accurate enumeration. The approaches we
employ consider both traditional enumeration
approaches and approaches tailored to specific
populations, such as hard-to-count populations.
Figure 7: Screenshot From the Response Outreach Area Mapper