Design and Methodology
Current Population Survey—America’s Source for Labor Force Data
Technical Paper 77

Issue Date
October 2019

Acknowledgments

This Current Population Survey Technical Paper (CPS TP77) is an updated
version of TP66 released in 2006. Previous content has been brought
up-to-date and reorganized to reflect the 2010 Current Surveys’ Sample
Redesign, U.S. Census Bureau organizational changes, and revised Section
508 standards of the Rehabilitation Act of 1973 for electronic document
accessibility. As future updates in CPS design and methodology occur, they
will be posted to the Census Bureau’s Internet site.
Many individuals contributed to the publication of the CPS TP77. Some
individuals authored chapters, sections, and appendixes, and others
reviewed the text for technical, procedural, and grammatical accuracy. Some
individuals served in both roles. While many of these individuals are still with
their respective agencies, some have since left.
This technical paper was written under the coordination of Leslie R. Flores,
Mei Li, and Aaron J. Gilary of the Census Bureau. Here, we have attempted to
document all individuals whose combined effort produced this publication.
Authors and reviewers of this paper at the Census Bureau were David E.
Adams, James R. Back, John T. Baker II, Candice A. Barnes, Redouane
Betrouni, Benjamin C. Bolender, Rosemary Byrne, Stephanie Chan-Yang, Yang Cheng, Dennis Clark, Lisa A. Clement, Jeffrey S. Corteville,
Annetta Golding DePompa, Christopher Dick, Khoa Dong, James Farber,
Tremika R. Finney, Leslie R. Flores, Brian A. Fouche, Susan S. Gajewski,
Susan T. Galeano, Aaron J. Gilary, Jennifer Williams Hill, David V. Hornick,
John A. Jones, Olga Koly, Chris Kuwik, Mei Li, Kyra M. Linse, Clifford L.
Loudermilk, Antoinette Lubich, Tim J. Marshall, Alan K. Peterson, KeTrena
Phipps, Jimmie B. Scott, Franklin D. Silberstein, Larry Douglas Sink, Daniel
Sommers, Thomas R. Spaulding, Kenisha Staine, Tim Trudell, Katrina L.
Wengert, and Gregory D. Weyland.
Authors and reviewers of this paper at the U.S. Bureau of Labor Statistics
(BLS) were Dorinda Allard, Andrew Blank, Thomas Evans, Steven Haugen,
Fran Horvath, Karen Kosanovich, Justin McIllece, Stephen M. Miller, Anne
Polivka, and Ed Robison.
Approving officials from the Census Bureau and BLS were Dorinda Allard,
John T. Baker II, Benjamin C. Bolender, Lisa A. Clement, Glenn A. Eanes,
Julie Hatch Maxfield, Karen S. Silloway, Jeffrey D. Sisson, and Anthony G.
Tersine, Jr.
At the Census Bureau, Musbaw Alawiye, Corey T. Beasley, Faye E. Brock,
Linda Chen, Christine E. Geter, Amanda J. Perry, and Janet S. Sweeney of
the Public Information Office provided publication management, graphics
design and composition, and editorial review for print and electronic media,
and Linda Vaughn of the Administrative and Customer Services Division
provided printing management. Lawrence Malakhoff served as accessibility
consultant, and Cynthia L. Wellons-Hazer provided administrative support.
We are grateful for the assistance of these individuals, and all others who
are not specifically mentioned, for their help with the preparation and
publication of this document.

Design and Methodology

Current Population Survey—America’s Source for Labor Force Data

U.S. Department of Labor
Eugene Scalia,
Secretary of Labor

U.S. Bureau of Labor Statistics
William W. Beach,
Commissioner

U.S. Department of Commerce
Wilbur Ross,
Secretary

Karen Dunn Kelley,
Deputy Secretary

U.S. CENSUS BUREAU
Steven D. Dillingham,
Director

Issued October 2019

Suggested Citation
U.S. Census Bureau,
Current Population Survey
Design and Methodology
Technical Paper 77,
October 2019.

U.S. CENSUS BUREAU
Steven D. Dillingham,
Director
Ron S. Jarmin,
Deputy Director and
Chief Operating Officer
Victoria A. Velkoff,
Associate Director for
Demographic Programs
Eloise K. Parker,
Assistant Director for
Demographic Programs
Anthony G. Tersine, Jr.,
Chief,
Demographic Statistical Methods Division

U.S. BUREAU OF LABOR STATISTICS
William W. Beach,
Commissioner
William J. Wiatrowski,
Deputy Commissioner
Julie Hatch Maxfield,
Associate Commissioner for
Employment and Unemployment
Statistics

FOREWORD
The Current Population Survey (CPS) is one of the oldest, largest, and most widely recognized surveys in
the United States. It is an important survey, providing information on many of the things that define us
as a society—our work, our earnings, our education. It is also very complex. In this publication, the staff
of the U.S. Census Bureau and the U.S. Bureau of Labor Statistics have attempted to provide data users
with a thorough description of the design and methodology used to administer the CPS and create
estimates. The preparation of this technical paper was a major undertaking, spanning several years and
involving dozens of statisticians, economists, and others from the two agencies.
While the basic approach to collecting labor force and other data through the CPS has remained intact
since this report was last updated, much has changed. In particular, since CPS Design and Methodology
Technical Paper 66 was issued in 2006, annual sampling replaced once-a-decade sampling to increase
flexibility in design and operations. Another major difference from the previous design is the adoption of
the master address file (MAF) to take advantage of the extensive and regularly updated address information from the U.S. Postal Service. The MAF is now the primary source of frames and sample units for
the CPS, as well as for many other demographic surveys the Census Bureau conducts. This change has
allowed the Census Bureau to abandon costly address listing operations in the field.
In addition to this technical paper, please visit our CPS Web sites at <www.census.gov/cps> and
<www.bls.gov/cps>, where additional information about CPS methodology
and data is available. Also, we welcome comments from users about the value of this document and
ways that it could be improved. To provide comments or submit a question, please refer to the CPS
contact page on either Web site.

Steven D. Dillingham
Director
U.S. Census Bureau
October 2019

William W. Beach
Commissioner
U.S. Bureau of Labor Statistics
October 2019

CONTENTS
1.	General Current Population Survey Information . . . . . . . . . . . . 	1
	1-1	Background and History . . . . . . . . . . . . 	3
	1-2	Questionnaire Concepts and Definitions . . . . . . . . . . . . 	5
	1-3	Supplements . . . . . . . . . . . . 	13
	1-4	Data Products . . . . . . . . . . . . 	26
	Appendix: History of the Current Population Survey . . . . . . . . . . . . 	29

2.	Current Population Survey Sample Design and Estimation . . . . . . . . . . . . 	41
	2-1	Sample Frame . . . . . . . . . . . . 	43
	2-2	Sample Design . . . . . . . . . . . . 	52
	2-3	Weighting and Estimation . . . . . . . . . . . . 	67
	2-4	Variance Estimation . . . . . . . . . . . . 	82
	2-5	Seasonal Adjustment . . . . . . . . . . . . 	94

3.	Current Population Survey Operations . . . . . . . . . . . . 	101
	3-1	Instrument Design . . . . . . . . . . . . 	103
	3-2	Conducting the Interviews . . . . . . . . . . . . 	116
	3-3	Transmitting the Interview Results . . . . . . . . . . . . 	129
	3-4	Data Preparation . . . . . . . . . . . . 	132
	3-5	Organization and Training of the Data Collection Staff . . . . . . . . . . . . 	135

4.	Current Population Survey Data Quality . . . . . . . . . . . . 	141
	4-1	Nonsampling Error . . . . . . . . . . . . 	143
	4-2	Reinterview Design and Methodology . . . . . . . . . . . . 	154
	Appendix: Weighting Scheme for Quality Control Reinterview Estimates . . . . . . . . . . . . 	159

Acronyms . . . . . . . . . . . . 	161
Index . . . . . . . . . . . . 	165

LIST OF FIGURES
Figure 1-3.1	Diagram of the Annual Social and Economic Weighting Scheme . . . 	22
Figure 3-2.1	Sample Introductory Letter . . . 	117
Figure 3-3.1	Overview of Current Population Survey Monthly Operations for a Typical Month . . . 	131
Figure 3-5.1	Census Bureau Regional Office Boundaries . . . 	139
Figure 4-1.1	Current Population Survey Total Coverage Ratios: December 2014–December 2017, National Estimates . . . 	146
Figure 4-1.2	Current Population Survey Nonresponse Rates: December 2014–December 2017, National Estimates . . . 	147
Figure 4-2.1	Quality Control Reinterview Sampling Diagram . . . 	155

LIST OF TABLES
Table 1-3.1.	Current Population Survey Supplements, January 2004–December 2017 . . . 	15
Table 1-3.2.	Month-In-Sample Groups Included in the Annual Social and Economic Supplement Sample for Years 2001 to 2017 . . . 	20
Table 1-3.3.	Summary of the Annual Social and Economic Supplement Interview Months . . . 	21
Table 1-3.4.	Summary of the Annual Social and Economic Supplement Children’s Health Insurance Program Adjustment Factor . . . 	25
Table 2-1.1.	Address Sources for the Current Population Survey Sampling Frames for the 2000 and 2010 Sample Designs . . . 	43
Table 2-2.1.	Civilian Noninstitutional Population 16 Years and Over in Sample Areas for 852-Primary-Sampling-Unit Design by State . . . 	56
Table 2-2.2.	Current Population Survey/Children’s Health Insurance Program Rotation Chart, January 2018–March 2020, Month-in-Sample by Sample Designation and Rotation . . . 	63
Table 2-3.1.	National Coverage Step Cell Definitions . . . 	72
Table 2-3.2.	State Coverage Adjustment Cell Definition . . . 	73
Table 2-3.3.	Second-Stage Adjustment Cell by Race, Age, and Sex . . . 	74
Table 2-3.4.	Second-Stage Adjustment Cell by Ethnicity, Age, and Sex . . . 	75
Table 2-3.5.	Composite National Ethnicity Cell Definition . . . 	78
Table 2-3.6.	Composite National Race Cell Definition . . . 	78
Table 2-4.1.	Generalized Variance Function Parameters for Estimates of Monthly Levels . . . 	88
Table 2-4.2.	Within-Primary Sampling Unit to Total Variance Ratios, 2016 Annual Averages . . . 	89
Table 2-4.3.	Relative Standard Errors After Selected Weighting Stages: 2016 Annual Averages . . . 	90
Table 2-4.4.	Design Effects After Selected Weighting Stages: 2010–2017 Averages . . . 	91
Table 3-2.1.	Noninterviews: Types A, B, and C . . . 	118
Table 3-2.2.	Noninterviews: Main Household Items Asked for Types A, B, and C . . . 	119
Table 3-2.3.	Interviews: Main Household Items Asked in Month-In-Sample 1 and Replacement Households . . . 	120
Table 3-2.4.	Summary Table for Determining Who Is to Be Included as a Member of the Household . . . 	121
Table 3-2.5.	Interviews: Main Demographic Items Asked . . . 	122
Table 3-2.6.	Demographic Edits in the Current Population Survey Instrument . . . 	124
Table 3-2.7.	Interviews: Main Household and Demographic Items Asked in Month-In-Sample 5 . . . 	126
Table 3-2.8.	Interviewing Results for October 2017 . . . 	127
Table 3-2.9.	Telephone and Personal Field Interviewing for October 2017 . . . 	128
Table 3-5.1.	Average Monthly Workload by Regional Office: 2017 . . . 	140
Table 4-1.1.	National Current Population Survey Item Types, Monthly Average Percentage of Invalid or Missing Values, July 2017–December 2017 . . . 	148
Table 4-1.2.	Percentage of Current Population Survey Labor Force Reports by Type of Respondents: 2015–2017 . . . 	149
Table 4-1.3.	Month-in-Sample Bias Indexes for Total Employed and Total Unemployed Average: January 2014–December 2015 . . . 	151
Table 4-2.1.	Reinterview Results for November 2016—Counts . . . 	157
Table 4-2.2.	Reinterview Results for November 2016—Statistics . . . 	158


Unit 1
General Current Population Survey Information

Chapter 1-1: Background and History
BACKGROUND
The Current Population Survey (CPS) is a monthly
survey sponsored jointly by the U.S. Census
Bureau and the U.S. Bureau of Labor Statistics
(BLS). It is the source of the national unemployment rate and provides a wide range of information
about employment, unemployment, and people not in the labor force. The CPS also collects
extensive demographic data that complement and
enhance our understanding of labor market conditions in the nation overall, as well as among many
different population groups and geographic areas.
The labor force concepts and definitions used in
the CPS have undergone only slight modifications
since the survey’s inception in 1940. Those concepts and definitions are discussed in Chapter 1-2,
Questionnaire Concepts and Definitions. Although
labor market information is central to the CPS, the
survey provides a wealth of other demographic
and socioeconomic data that are widely used in
both the public and private sectors. In addition,
because of its long history and the quality of its
data, the CPS has been a model for other household surveys, both in the United States and in
other countries.
The CPS is a source of information not only for
economic and social science research, but also
for the study of survey methodology. This report
focuses on labor force data because the timely
and accurate collection and publication of these
data remain the principal purpose of the survey.
The CPS is administered by the Census Bureau
using a probability-selected sample of about
60,000 eligible households. Survey data generally are collected during the calendar week that
includes the nineteenth of the month. The questions refer to activities during the prior week for
those who are employed and the prior 4 weeks for
those who are unemployed. Activities in the prior
week generally refer to those done in the week
that includes the twelfth of the month.¹ Sampled
households from all 50 states and the District
of Columbia are in the survey for 4 consecutive
months, are out for 8 months, and then return
for another 4 consecutive months before leaving
the sample permanently. This design ensures a
high degree of continuity from one month to the
next, as well as year to year. This 4-8-4 sampling
scheme has the added benefit of allowing the
constant replenishment of the sample without
excessive burden to respondents. A new group of
respondents starts its 4-8-4 rotation each calendar
month, while at the same time another group
completes its rotation.

¹ In the month of December, the survey is often conducted
1 week earlier to avoid conflicting with the holiday season.
Additionally, since 2006, the November collection may be moved
1 week earlier to avoid the holiday and/or to allow adequate
processing time before the December collection. The reference
week is then also moved 1 week earlier.
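To make the 4-8-4 pattern concrete, here is a minimal Python sketch, illustrative only and not Census Bureau production code, that lists the interview months for a household entering the sample in a given month:

```python
from datetime import date

def rotation_months(first_month: date) -> list[date]:
    """Interview months for a household under the 4-8-4 scheme:
    4 months in sample, 8 months out, then 4 months back in."""
    def add_months(d: date, n: int) -> date:
        year, month0 = divmod(d.year * 12 + (d.month - 1) + n, 12)
        return date(year, month0 + 1, 1)

    # Months-in-sample 1-4 are offsets 0-3; months-in-sample 5-8
    # resume a year later at offsets 12-15.
    return [add_months(first_month, k) for k in (*range(4), *range(12, 16))]

# A household entering in January 2018 is interviewed January-April
# 2018 and again January-April 2019.
print([d.strftime("%b %Y") for d in rotation_months(date(2018, 1, 1))])
```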
The CPS is collected by Census Bureau field representatives (FR) across the country, through both
personal and telephone interviews, using laptop
computers. Additional telephone interviewing is
conducted from the Census Bureau’s two centralized contact centers in Jeffersonville, Indiana, and
Tucson, Arizona.
To be eligible to participate in the CPS, individuals must be U.S. residents aged 15 or over who
are not in the armed forces. People in institutions, such as prisons, long-term care facilities,
and nursing homes, are ineligible for the CPS. In
general, the BLS publishes labor force data only
for people aged 16 and over, since those under
the age of 16 are limited in their labor market
activities by compulsory schooling and child labor
laws. No upper age limit is used and students are
treated the same as nonstudents. One person
generally responds for all eligible members of the
household.
Usually within 2 weeks of the completion of these
interviews, the BLS publishes the Employment
Situation news release. This release highlights
monthly labor force statistics from the CPS, as
well as data on employment, hours, and earnings
from the Current Employment Statistics (CES)
survey of establishments. This closely watched
release provides some of the earliest economic
indicators available each month and represents
the nation’s most comprehensive measures of
national employment and unemployment. Dozens
of data tables and thousands of time series estimates are also made available to the public at the
time of the release.


In addition to the regular labor force questions,
the CPS often includes special supplementary
questions on a variety of topics. The best known
of the CPS supplements is the Annual Social and
Economic Supplement (ASEC), which is the source
of the national poverty rate and is also used to
measure income and health insurance coverage.
Additional supplement topics include veterans,
school enrollment, worker displacement, and
job tenure. Because of the large sample size and
broad population coverage of the CPS, a wide
range of sponsors use supplements to collect
data on topics that are not directly related to the
labor market, such as participation in the arts,
tobacco use, computer use, and voting patterns.
The supplements are described in greater detail in
Chapter 1-3, Supplements.

HISTORY OF THE CURRENT
POPULATION SURVEY
The CPS has its origin in a program established
to provide direct measurement of national unemployment each month on a sample basis. Several
earlier efforts attempted to estimate the number
of unemployed using various devices ranging
from guesses to enumerative counts. The problem
of measuring unemployment became especially
acute during the economic depression of the
1930s.


The Enumerative Check Census, taken as part of
the 1937 unemployment registration, was the first
attempt to estimate unemployment on a nationwide basis using probability sampling. During
the latter half of the 1930s, the Work Projects
Administration (WPA) developed techniques for
measuring unemployment, first on a local area
basis and later on a national basis. This research
by the WPA, combined with the experience from
the Enumerative Check Census, led to a regular
monthly sample survey of unemployment that
provided an accurate and timely measurement of
employment, unemployment, and the size of the
labor force on a systematic basis. Early tests of
this first national monthly sample survey were initiated in December 1939 and the survey officially
began in March 1940 with the collection of data
for the 1940 Census.
Over the years, survey questions have been
expanded to capture additional labor market data.
Since the survey's inception, there have been
numerous modifications to the definitions, sample
design, and data collection methods to enhance
the reliability of labor force statistics derived from
the survey data. See the Appendix: History of the
Current Population Survey for a list of important
modifications to the CPS since 1940.


Chapter 1-2: Questionnaire Concepts and Definitions
INTRODUCTION
An important component of the CPS is the questionnaire, also called the survey instrument. The
survey instrument uses automated data collection methods: computer-assisted personal interviewing (CAPI) and computer-assisted telephone
interviewing (CATI). This chapter explains the
definitions embedded in the questionnaire and
key survey questions. For details on data collection procedures and protocols, see Chapter 3-1,
Instrument Design and Chapter 3-2, Conducting
the Interviews.

STRUCTURE OF THE SURVEY
INSTRUMENT
The CPS interview questionnaire is divided into
three basic parts: (1) household and demographic questions, (2) labor force questions, and
(3) supplemental questions. Supplemental questions are added to the CPS nearly every month
and cover a number of different topics. The order
in which interviewers attempt to collect information is: (1) housing unit (HU) data, (2) demographic data, (3) labor force data, and (4) supplemental data. Supplemental data may include
more demographic and household questions.
The definitions underlying the household, demographic, and labor force data are described below.
For more information about supplements to the
CPS, see Chapter 1-3, Supplements.

CONCEPTS AND DEFINITIONS
Household and Demographic Information
Upon contacting a household, interviewers
proceed with the interview unless the case is a
definite noninterview. (Chapter 3-2, Conducting
the Interviews, discusses the interview process
and explains refusals and other types of noninterviews.) When interviewing a household for the
first time, interviewers collect information about
the HU and all individuals who usually live at the
address.
Housing unit information. Upon first contact with
a HU, interviewers collect information on the HU
physical address, its mailing address, the year it
was constructed, the type of living quarters (e.g.,
a house, apartment, or mobile home), whether it
is renter- or owner-occupied, and whether it has a
telephone and, if so, the telephone number.
Household roster. After collecting or updating
the HU data, the interviewer either creates or
updates a list of all individuals living in the unit
and determines whether they are members of the
household. This list is referred to as the household
roster.
Household respondent. One person may provide
all of the CPS data for the entire sample unit, provided the person is a household member 15 years
or older who is knowledgeable about the household. The person who responds for the household
is called the household respondent. Information
collected from the household respondent for
other members of the household is referred to as
proxy response.
Reference person. To create the household roster,
the interviewer asks the household respondent to
give “the names of all persons living or staying” in
the HU, and to “start with the name of the person
or one of the persons who owns or rents” the unit.
The person whose name the interviewer enters
first (presumably one of the individuals who owns
or rents the unit) becomes the reference person.
The household respondent and the reference
person are not necessarily the same. For example,
if you are the household respondent and you give
your name first when asked to report the household roster, then you are also the reference person. If, on the other hand, you are the household
respondent and you give your spouse’s name first
when asked to report the household roster, then
your spouse is the reference person.
Household. A household is defined as all individuals (related family members and all unrelated individuals) whose usual place of residence
at the time of the interview is the sample unit.
Individuals who are temporarily absent and who
have no other usual address are still classified as
household members even though they are not
present in the household during the survey week.
College students compose the bulk of such absent
household members, but people away on business or vacation are also included. (Not included
are individuals in the military, individuals who
have usual residences elsewhere, and individuals
in institutions such as prisons or nursing homes.)
Once household or nonhousehold membership
has been established for all people on the roster,
the interviewer proceeds to collect all other demographic data, but only for household members.
Relationship to reference person. The interviewer will show a flash card with relationship
categories (i.e., opposite-sex spouse, same-sex
spouse, opposite-sex unmarried partner, same-sex unmarried partner, child, grandchild, parent,
brother or sister, housemate or roommate) to the
household respondent and ask them to report
each household member’s relationship to the reference person (the person listed on line one of the
household roster). These relationships are used to
define families, subfamilies, and unrelated individuals. A family is defined as a group of two or more
individuals residing together who are related by
birth, marriage, or adoption; all such individuals
are considered members of one family. Families
are further classified either as married-couple
(spouse present) families or as families maintained by women or men without spouses present. Subfamilies are further classified as related
subfamilies or unrelated subfamilies. A related
subfamily is a married couple with or without children, or one parent with one or more own, single
(never-married) children under 18 years old, living
in a household, and related to, but not including,
the householder or spouse. An unrelated subfamily is a family that lives in a HU where none of its
members is related to the reference person. An
unrelated individual(s) may be part of a household
containing one or more families (like an unmarried
partner, with or without children, or a housemate),
or may reside in group quarters such as a rooming
house.
Additional demographic information. In addition
to asking for relationship data, the interviewer
asks for other demographic data for each household member, including birth date, marital status,
armed forces or veteran status, level of education,
race, ethnicity, nativity, and disability status. Total
family income is also collected.
The following terms define an individual’s marital
status at the time of the interview:
•	Married, spouse present: applies to a married
couple who both live at the same address, even
though one may be temporarily absent due to
business, vacation, a visit away from home, a
hospital stay, etc.

•	Married, spouse absent: refers to married
people living apart because a spouse was
employed and living at a considerable distance
from home, was serving away from home in the
armed forces, had moved to another area, or had
a different place of residence for any other
reason except separation as defined below.

•	Separated: includes people with legal
separations, those living apart with intentions
of obtaining a divorce, and other people
permanently or temporarily separated because of
marital discord.

•	Widowed.

•	Divorced.

•	Never married.

Educational attainment for each person in the
household aged 15 or older is obtained through a
question asking about the highest grade or degree
completed. Beginning in 2015, additional questions about professional certifications and state
or industry licenses used for getting or keeping a
job were added to measure credentials granted
outside of the regular education system.
Questions on race and Hispanic or Latino ethnicity
comply with federal standards established by the
Office of Management and Budget. Respondents
are asked a question to determine if they are
Hispanic, which is considered an ethnicity rather
than a race. The question asks if the individual is
of Hispanic, Latino, or Spanish origin, and appears
before the question on race. Next, all respondents, including those who identify themselves
as Hispanic, are asked to choose one or more of
the following races they consider themselves to
be: White, Black or African American, American
Indian or Alaska Native, Asian, or Native Hawaiian
or Other Pacific Islander. They are reminded that
Hispanic origin is not a race. Although not indicated to the respondent, responses of “other” are
accepted and allocated among the race categories. Respondents may choose more than one
race, so data may be tabulated for many different
combinations of race and Hispanic ethnicity.


The CPS uses a set of six questions to identify
persons with disabilities. People are classified as
having a disability if there is a response of “yes”
to any of these questions. Each of the questions asks the respondent whether anyone in the
household (civilian, aged 15 and older) has the
condition described, and if the respondent replies
“yes,” they are then asked to identify everyone
in the household who has the condition. Brief
descriptions of the six conditions are: (1) deaf or
serious difficulty hearing; (2) blind or serious difficulty seeing; (3) serious difficulty concentrating,
remembering, or making decisions; (4) serious
difficulty walking or climbing stairs; (5) difficulty
dressing or bathing; and (6) difficulty doing
errands.
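The any-of-six rule above reduces to a simple check. A sketch follows; the short field names are hypothetical, as the CPS instrument uses its own item names:

```python
# Hypothetical short names for the six disability questions.
DISABILITY_ITEMS = ("hearing", "vision", "cognitive",
                    "ambulatory", "self_care", "errands")

def has_disability(answers: dict[str, bool]) -> bool:
    """A person is classified as having a disability if any of the
    six questions is answered 'yes' (True)."""
    return any(answers.get(item, False) for item in DISABILITY_ITEMS)

print(has_disability({"hearing": False, "ambulatory": True}))  # True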

Labor Force Information
The CPS provides a measure of monthly employment—not jobs. An employed person, as measured in the CPS, may have more than one job.
Labor force information is obtained after most
household and demographic information has been
collected. Besides determining labor force status,
information is also collected on hours worked,
occupation, industry, self-employment, earnings,
duration of job search, and other labor force
characteristics.
The primary purpose of the labor force questions
is to classify all individuals as employed, unemployed, or not in the labor force. The major labor
force categories are defined hierarchically and,
thus, are mutually exclusive. Employed supersedes
unemployed, which supersedes not in the labor
force. For example, individuals who are classified as employed, no matter how many hours
they worked, are not asked the questions about
having looked for work and cannot be classified as
unemployed. Survey respondents are never asked
specifically if they are unemployed, nor are they
given an opportunity to decide their own labor
force status. Similarly, an individual’s personal
activities or characteristics, like going to school,
taking care of a family member, or the presence of
a disability or long-term illness, do not determine
how labor force status is measured. Their status
will be determined based on how they respond to
a specific set of questions about their recent labor
market activities. For example, someone who
“retired” from one job may be working at another
job and, thus, is classified as employed.

The current concepts and definitions underlying
the collection and estimation of labor force data
are presented below. A more expanded, nontechnical discussion of the various labor force categories is available in the BLS publication “How the
Government Measures Unemployment,” online at
<www.bls.gov/cps/cps_htgm.htm>.
Reference week. The CPS labor force questions
ask about labor market activities for 1 week each
month. This week is referred to as the “reference
week.” The reference week is defined as the 7-day
period, Sunday through Saturday, that typically
includes the twelfth of the month. (For November
and December, the reference week and survey
collection period may be moved 1 week earlier to
avoid the Thanksgiving and Christmas holidays
and allow enough time for processing between
data collection.)
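The standard rule, the Sunday-through-Saturday week containing the twelfth, can be computed directly. A minimal sketch follows; the November and December shifts noted above are operational decisions and are not modeled:

```python
from datetime import date, timedelta

def reference_week(year: int, month: int) -> tuple[date, date]:
    """Sunday-through-Saturday week containing the 12th of the month."""
    twelfth = date(year, month, 12)
    # date.weekday(): Monday = 0 ... Sunday = 6; step back to Sunday.
    start = twelfth - timedelta(days=(twelfth.weekday() + 1) % 7)
    return start, start + timedelta(days=6)

# October 2017: Sunday, October 8 through Saturday, October 14.
print(reference_week(2017, 10))
```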
Civilian noninstitutional population. In the CPS,
labor force data are restricted to people who currently reside in one of the 50 states or the District
of Columbia, who do not reside in institutions
(such as a correctional institution or a residential nursing or mental health care facility), and
who are not on active duty in the armed forces.
Although data are collected from household members 15 years and over, official labor force estimates refer to people 16 years and over.
Employed people. Employed people are those
who, during the reference week, (1) did any work
at all (for at least 1 hour) as paid employees; (2)
worked for profit in their own business, profession,
or farm; (3) worked 15 hours or more as unpaid
workers in an enterprise operated by a family
member they lived with; or (4) were temporarily absent from their jobs because of illness, bad
weather, vacation, labor dispute, or another reason (whether or not they were paid for the time
off or were seeking other jobs). Each employed
person is counted only once, even if he or she
holds more than one job. (See the discussion of
multiple jobholders below.)
Citizens of foreign countries who are working
in the United States, not living on the premises
of an embassy, and not simply visiting or traveling, are included in the number of employed
people. People whose only activity consisted of
work around their own house (painting, repairing,
cleaning, or other home-related housework) or
unpaid volunteer work for religious, charitable, or
other organizations are excluded.
The key questions used to determine whether an
individual is employed or not are presented at the
end of this unit.
Multiple jobholders. These are employed people who, during the reference week, had two or
more jobs as wage and salary workers; were self-employed and also held one or more wage and
salary jobs; or worked as unpaid family workers
and also held one or more wage and salary jobs.
A person employed only in private households
(cleaner, gardener, babysitter, etc.) who worked
for two or more employers during the reference
week is not counted as a multiple jobholder since
working for several employers is considered an
inherent characteristic of private household work.
Self-employed people with multiple unincorporated businesses and people with multiple jobs as
unpaid family workers are excluded.
CPS respondents are asked questions each month
to identify multiple jobholders. First, all employed
people are asked “Last week, did you have more
than one job (or business, if one exists), including
part-time, evening, or weekend work?” Those who
answer “yes” are then asked, “Altogether, how
many jobs (or businesses) did you have?”
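A simplified sketch of the multiple-jobholder rule follows; the class-of-worker codes and the private-household flag are illustrative, not the production edit specification:

```python
def is_multiple_jobholder(class_of_worker: list[str],
                          private_household_only: bool = False) -> bool:
    """Simplified multiple-jobholder test. 'class_of_worker' holds one
    code per job held in the reference week: 'wage' (wage and salary),
    'self' (unincorporated self-employed), or 'unpaid_family'."""
    if private_household_only:
        # Working for several employers is inherent to private
        # household work, so these workers are not counted.
        return False
    wage_jobs = class_of_worker.count("wage")
    # Two or more wage and salary jobs, or self-employment/unpaid
    # family work combined with at least one wage and salary job.
    # Multiple unincorporated businesses alone do not count, nor do
    # multiple unpaid family jobs alone.
    return wage_jobs >= 2 or (wage_jobs >= 1 and len(class_of_worker) >= 2)

print(is_multiple_jobholder(["wage", "self"]))   # True
print(is_multiple_jobholder(["self", "self"]))   # False
```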
Hours of work. Information on both actual and
usual hours of work is collected. Published data on
hours of work generally relate to the actual number of hours spent “at work” during the reference
week. For example, people who normally work
40 hours a week but were off on the Labor Day
holiday would be reported as working 32 hours,
even though they were paid for the holiday. For
people working more than one job, the published
figures relate to the number of hours worked at all
jobs during the week. Data on people “at work”
exclude employed people who were absent from
their jobs during the entire reference week for reasons such as vacation, illness, or industrial dispute.
Data are also available on usual hours worked by
all employed people, including those who were
absent from their jobs during the reference week.
Usual full- or part-time status. In order to differentiate a person’s normal schedule from his or
her activity during the reference week, people
are also classified according to their usual full- or
part-time status. In this context, full-time workers
are defined as those who usually work 35 hours or
more (at all jobs combined). This group includes
some individuals who worked fewer than 35 hours
in the reference week—for either economic or
noneconomic reasons—as well as those who are
temporarily absent from work. Similarly, part-time workers are those who usually work fewer
than 35 hours per week (at all jobs combined),
regardless of the number of hours worked in the
reference week. This may include some individuals who actually worked more than 34 hours in
the reference week, as well as those who were
temporarily absent from work. The full-time labor
force includes all employed people who usually
work full-time and unemployed people who are
looking for full-time work or are on layoff from
full-time jobs. The part-time labor force consists
of employed people who usually work part-time
and unemployed people who are seeking or are on
layoff from part-time jobs.
At work part-time for economic reasons.
Sometimes referred to as involuntary part-time,
this category refers to individuals who gave an
economic reason for working 1 to 34 hours during
the reference week. (This includes both those who
usually work part-time and those who worked
part-time in the reference week, but usually work
full-time.) Economic reasons include slack work or
unfavorable business conditions, inability to find
full-time work, and seasonal declines in demand.
Those who usually work part-time also must
indicate that they want and are available to work
full-time to be classified as being part-time for
economic reasons.
At work part-time for noneconomic reasons.
This group includes people who usually work
part-time and were at work 1 to 34 hours during
the reference week for a noneconomic reason.
Noneconomic reasons include illness or other
medical limitation, childcare problems or other
family or personal obligations, school or training,
retirement or social security limits on earnings,
and being in a job where full-time work is less than
35 hours. The group also includes those who gave
an economic reason for usually working 1 to 34
hours but said they do not want to work full-time
or were unavailable for such work.
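The two definitions can be condensed into one simplified sketch; the boolean inputs stand in for the underlying questionnaire items and are illustrative:

```python
def part_time_reason(hours_last_week: int,
                     usually_full_time: bool,
                     economic_reason: bool,
                     wants_full_time: bool,
                     available_full_time: bool) -> str | None:
    """Classify someone at work 1-34 hours in the reference week as
    part-time for 'economic' or 'noneconomic' reasons (None otherwise).
    Usual part-time workers citing an economic reason must also want,
    and be available for, full-time work to count as economic."""
    if not 1 <= hours_last_week <= 34:
        return None
    if economic_reason and (usually_full_time
                            or (wants_full_time and available_full_time)):
        return "economic"
    return "noneconomic"

# Usually full-time, worked 20 hours because of slack work:
print(part_time_reason(20, True, True, True, True))       # economic
# Usually part-time for school, 20 hours, no economic reason:
print(part_time_reason(20, False, False, False, False))   # noneconomic
```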
Occupation, industry, and class of worker. For the
employed, this information applies to the job held
Current Population Survey TP77
U.S. Bureau of Labor Statistics and U.S. Census Bureau

in the reference week. A person with two or more
jobs is classified according to their main job—the
job at which they usually worked the greatest
number of hours. The unemployed are classified
according to their last job, if any. CPS data use
the census occupational and industry classification systems, which are based on the Standard
Occupational Classification (SOC) and the North
American Industry Classification System (NAICS).
A list of these census codes can be found in the
Alphabetical Index of Industries and Occupations
(see References).
The class-of-worker designation assigns workers to one of the following categories: wage and
salary workers, self-employed workers, and unpaid
family workers. Wage and salary workers are
those who receive wages, salary, commissions,
tips, or pay-in-kind from a private employer or
from a government unit. The class-of-worker question also includes separate response categories for
government, private for-profit company, and nonprofit organization to further classify wage and
salary workers. The self-employed are those who
work for profit or fees in their own businesses,
professions, trades, or farms. Self-employed
individuals are identified as those whose businesses are either incorporated or unincorporated.
Typically, self-employment refers to only the unincorporated self-employed category since those
whose businesses are incorporated are technically
wage and salary employees of the corporation.
However, BLS publishes some data separately for
the unincorporated self-employed and the incorporated self-employed. Unpaid family workers are
individuals working without pay for 15 hours a
week or more on a farm or in a business operated
by a member of the household to whom they are
related by birth, marriage, or adoption.
Occupation, industry, and class of worker on second job. The occupation, industry, and class-of-worker information for individuals’ second jobs is
collected in order to obtain a more accurate measure of multiple jobholders and to obtain more
detailed information about their employment
characteristics. For the majority of multiple jobholders, occupation, industry, and class-of-worker
data for their second jobs are collected only from
one-fourth of the sample—those in their fourth or
eighth monthly interview. However, for those who
say they are “self-employed, unincorporated” on
their main jobs, class of worker of the second job
is collected each month. This is done because,
according to the official definition, individuals who
are “self-employed, unincorporated” on both of
their jobs are not considered multiple jobholders.
Earnings. Information on what people earn at
their main job is collected only for those who are
receiving their fourth or eighth monthly interviews. This means that earnings questions are
asked of only one-fourth of the survey respondents each month. Employed respondents are
asked to report their usual earnings before taxes
and other deductions and to include any overtime
pay, commissions, or tips usually received. The
term “usual” is as perceived by the respondent.
If the respondent asks for a definition of usual,
interviewers are instructed to define the term as
more than half the weeks worked during the past
four or five months. Respondents may report
earnings in the period of time they prefer—for
example, hourly, weekly, biweekly, monthly, or
annually. (Allowing respondents to report in a
periodicity with which they are most comfortable was a feature added in the 1994 redesign.)
Based on additional information collected during
the interview, earnings reported on a basis other
than weekly are converted to a weekly amount
in later processing. Data are collected for wage
and salary workers, and for self-employed people
whose businesses are incorporated; earnings data
are not collected for self-employed people whose
businesses are unincorporated. (Earnings data are
not edited and are not released to the public for
the “self-employed whose businesses are incorporated.”) These earnings data are used to construct
estimates of the distribution of usual weekly earnings and median earnings. Individuals who do not
report their earnings on an hourly basis are asked
if they are, in fact, paid at an hourly rate and if
so, what the hourly rate is. The earnings of those
who reported hourly and those who are paid at an
hourly rate are used to analyze the characteristics
of hourly paid workers such as those who are paid
the minimum wage.
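For illustration, a minimal sketch of the periodicity-to-weekly conversion follows. The factors shown are simple approximations; the production edit draws on additional detail collected in the interview, such as usual weekly hours for hourly reporters:

```python
# Approximate weeks per reporting period (illustrative factors only).
WEEKS_PER_PERIOD = {
    "weekly": 1.0,
    "biweekly": 2.0,
    "monthly": 52.0 / 12.0,   # about 4.33 weeks per month
    "annually": 52.0,
}

def usual_weekly_earnings(amount: float, periodicity: str,
                          usual_hours: float | None = None) -> float:
    """Convert usual earnings reported in any periodicity to weekly."""
    if periodicity == "hourly":
        if usual_hours is None:
            raise ValueError("hourly reporters need usual weekly hours")
        return amount * usual_hours
    return amount / WEEKS_PER_PERIOD[periodicity]

print(usual_weekly_earnings(52_000, "annually"))   # 1000.0
print(usual_weekly_earnings(15.0, "hourly", 40))   # 600.0
```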
Unemployed people. All people who were not
employed during the reference week but were
available for work (excluding temporary illness)
and had made specific efforts to find employment
some time during the 4-week period ending with
the reference week are classified as unemployed.
Individuals who were waiting to be recalled to a
job from which they had been laid off need not
have been looking for work to be classified as
unemployed.
People waiting to start a new job must have
actively looked for a job within the last 4 weeks in
order to be counted as unemployed. Otherwise,
they are classified as not in the labor force.
As the definition indicates, there are two ways
people may be classified as unemployed. They are
either looking for work (job seekers) or they have
been temporarily separated from a job (people
on layoff). Job seekers must have engaged in an
active job search during the above-mentioned
4-week period in order to be classified as unemployed. Active methods are defined as job search
methods that have the potential to result in a job
offer without any further action on the part of the
job seeker. Examples of active job search methods
include contacting an employer directly, using
a public or private employment agency, seeking assistance from friends or relatives, placing
or answering ads, or using some other active
method. Examples of the “other active” category
include auditioning for a part in a play, bidding on
a contract, or waiting at a designated labor pickup
point. Passive methods, which do not qualify as
job search, include simply reading “help wanted”
ads, researching companies as places to work,
and taking a job training course, as opposed to
actually answering “help wanted” ads or posting a resume online. The response categories for
active and passive methods are clearly delineated
in separately labeled columns on the interviewers’ computer screens. Job search methods are
identified by the following questions: “Have you
been doing anything to find work during the last 4
weeks?” and “What are all of the things you have
done to find work during the last 4 weeks?” To
ensure that respondents report all of the methods
of job search used, interviewers ask “Anything
else?” after the initial or a subsequent job search
method is reported.
Persons “on layoff” are defined as those who
have been separated from a job to which they are
waiting to be recalled (i.e., their layoff status is
temporary). In order to measure layoffs accurately,
the questionnaire determines whether people
reported to be on layoff did in fact have an expectation of recall; that is, whether they had been
given a specific date to return to work or, at least,
had been given an indication that they would be
recalled within the next 6 months. As previously
mentioned, people on layoff need not be actively
seeking work to be classified as unemployed.
The key questions used to classify an individual as
unemployed are presented at the end of this unit.
Reason for unemployment. Unemployed individuals are categorized according to their status at the
time they became unemployed. The categories
are:
•	Job losers and people who completed temporary
jobs: a group composed of (1) people on
temporary layoff from a job to which they expect
to be recalled, (2) permanent job losers, whose
employment ended involuntarily and who began
looking for work, and (3) people who completed
temporary jobs—that is, individuals who began
looking for work after their jobs ended.

•	Job leavers: people who quit or otherwise
terminated their employment voluntarily and
began looking for work.

•	Reentrants: unemployed people who previously
worked but were out of the labor force prior to
beginning their job search.

•	New entrants: unemployed individuals who never
worked before and who were entering the labor
force for the first time.

Duration of unemployment. The duration of
unemployment is typically expressed in weeks,
although the survey collects information in weeks,
months, or years. For individuals classified as
unemployed because they are looking for work,
the duration of unemployment is the length of
time (through the current reference week) that
they have been looking for work. For people on
layoff, the duration of unemployment is the number of weeks (through the reference week) they
have been on layoff.
Not in the labor force. Included in this group are
all members of the civilian noninstitutional population who are neither employed nor unemployed.
Information is collected on their desire for and
availability to take a job at the time of the CPS
interview, job search activity in the prior year, and

Current Population Survey TP77
U.S. Bureau of Labor Statistics and U.S. Census Bureau

reason for not looking in the 4-week period prior
to the survey week. Responses to the question “Do
you currently want a job, either full- or part-time?”
are used in determining desire for work.
The not-in-labor-force group includes a subset
of individuals marginally attached to the labor
force, defined as people not working who want
and are available for a job and who have looked
for work sometime in the past 12 months (or since
the end of their last job if they held one within the
past 12 months). They are not counted as unemployed because they had not actively searched for
work in the prior 4 weeks. Within the marginally
attached group are discouraged workers—people
who are not currently looking for work because
they believe there are no jobs available or there
are none for which they would qualify. Reasons
identified by discouraged workers for not recently
looking for work are:

•	Belief that no work is available in line of work
or area.

•	Inability to find any work.

•	Lack of necessary schooling, training, skills, or
experience.

•	Belief that employers have age bias.

•	Belief that there are other types of
discrimination.

The other persons marginally attached to the
labor force group includes persons who want a job
but had not looked for work in the past 4 weeks
for reasons such as family responsibilities, being
in school or training, ill health or disability, or
transportation problems.

Estimates of the number of employed and
unemployed are used to construct a variety of
measures. These measures include:

•	Labor force. The labor force consists of all
people aged 16 and older in the civilian
noninstitutional population classified as
employed or unemployed according to the criteria
described above. In other words, the labor force
level is the number of people who are either
working or actively seeking work.

•	Unemployment rate. The unemployment rate
represents the number of unemployed as a
percentage of the labor force.

•	Labor force participation rate. The labor force
participation rate is the proportion of the
age-eligible, civilian, noninstitutional
population that is in the labor force. For the
CPS, the age-eligible population consists of
people aged 16 and older. It represents the
proportion of the population that is either
working, unemployed and actively seeking work, or
temporarily laid off from a job to which they
expect to be recalled.

•	Employment-population ratio. The
employment-population ratio is the number of
employed people aged 16 and older as a percentage
of the civilian noninstitutional population. It
represents the proportion of the population that
is working.

KEY QUESTIONS FOR EMPLOYED AND UNEMPLOYED

1.	Does anyone in this household have a business
or a farm?
This question is asked once for each household.
A series of questions is then asked for each
household member. The parentheticals below are
filled if there is a business or farm in the
household.

2.	LAST WEEK, did you do ANY work for (either)
pay (or profit)?

3.	LAST WEEK, did you do any unpaid work in the
family business or farm?

4.	LAST WEEK, (in addition to the business) did
you have a job, either full- or part-time?
Include any job from which you were temporarily
absent.

5.	LAST WEEK, were you on layoff from a job?

6.	Has your employer given you a date to return
to work?

7.	Have you been given any indication that you
will be recalled to work within the next 6
months?

8.	Have you been doing anything to find work
during the last 4 weeks?

9.	What are all of the things you have done to
find work during the last 4 weeks?
Interviewers ask “Anything else?” until all of
the methods of job search used by the individual
are reported.

10.	LAST WEEK, could you have started a job if
one had been offered?

Individuals are classified as employed if they say
“yes” to questions 2, 3 (and work 15 hours or more
in the reference week or receive profits from the
business or farm), or 4.

Individuals who are available to work (“yes” to
10) are classified as unemployed if they say “yes”
to 5 and either 6 or 7 (on temporary layoff), or
if they say “yes” to 8 and provide an active job
search method (one that could potentially result
in a job offer) in 9.
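The classification logic just described can be condensed into a sketch. The dictionary keys mirror the key question numbers above and are illustrative, not the production edit specification:

```python
def labor_force_status(r: dict) -> str:
    """Hierarchical classification: employed supersedes unemployed,
    which supersedes not in the labor force. Keys q2-q10 are booleans
    mirroring the key questions; 'hours' and 'profit' support the
    unpaid-family-work test under question 3, and 'active_method'
    records whether question 9 yielded an active search method."""
    worked_for_pay = r.get("q2", False)
    unpaid_family = r.get("q3", False) and (r.get("hours", 0) >= 15
                                            or r.get("profit", False))
    job_absent = r.get("q4", False)
    if worked_for_pay or unpaid_family or job_absent:
        return "employed"

    available = r.get("q10", False)
    on_temporary_layoff = r.get("q5", False) and (r.get("q6", False)
                                                  or r.get("q7", False))
    active_search = r.get("q8", False) and r.get("active_method", False)
    if available and (on_temporary_layoff or active_search):
        return "unemployed"
    return "not in labor force"

# The resulting counts feed the measures above, e.g. the unemployment
# rate = unemployed / (employed + unemployed).
print(labor_force_status({"q2": True}))                           # employed
print(labor_force_status({"q5": True, "q6": True, "q10": True}))  # unemployed
```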
For a complete version of the questionnaire, see
the CPS Web site at <www.census.gov/cps>.

REFERENCES
U.S. Department of Commerce, U.S. Census Bureau,
Alphabetical Index of Industries and Occupations,
2014.
U.S. Department of Labor, U.S. Bureau of Labor
Statistics, How the Government Measures
Unemployment, 2015, retrieved from
<www.bls.gov/cps/cps_htgm.htm>.

FURTHER READING
U.S. Bureau of Labor Statistics definitions related
to Current Population Survey concepts, retrieved
from <www.bls.gov/cps>.
U.S. Census Bureau definitions related to Current
Population Survey concepts, retrieved from
<www.census.gov/cps>.


Chapter 1-3: Supplements
INTRODUCTION
In addition to providing data on the labor force
status of the population, the CPS supports a
variety of supplemental studies on the entire
U.S. population and specific population subsets.
Upon completing the basic CPS interview, some
respondents are requested to answer a series of
supplemental questions. These questions provide
information on the economic and social well-being
of the nation. Federal agencies and federal independent organizations sponsor these supplemental data collections. Supplemental inquiries take
advantage of several special features of the CPS
such as large sample size; general-purpose design;
highly skilled, experienced interviewing and field
staff; and generalized processing systems that can
easily accommodate the inclusion of additional
questions.
Some CPS supplemental inquiries are conducted annually, others every other year, and still others on a one-time basis. The frequency and recurrence of a supplement depend on what best meets the needs of the supplement’s sponsor. In addition, any supplemental inquiry must meet the strict criteria discussed in the next section.

Producing supplemental data from the CPS involves more than including additional questions. Separate data processing is required to edit responses for consistency and sometimes to impute missing values. Additional weighting methods are often necessary because the supplement targets a different universe from that of the basic CPS. A supplement can also engender a different level of response or cooperation from respondents.

With the many different subject matters that the supplements address, the level of data (data about households versus data about individuals), target populations, contact strategies, weighting procedures, etc., may differ from those of the basic CPS.

CRITERIA FOR CURRENT POPULATION SURVEY SUPPLEMENTS

Over the years, the Census Bureau, in consultation with BLS, has developed and refined a number of criteria to determine the acceptability of undertaking supplements for federal agencies or federal independent organizations.

Working with the sponsors, the staff of the Census Bureau develop the survey design, including the methodologies, questionnaires, pretesting options, interviewer instructions, and processing requirements. The Census Bureau provides a written description of the statistical properties associated with each supplement. The same standards of quality that apply to the basic CPS also apply to the supplements.

The Census Bureau considers the following criteria before undertaking a supplement:

•	The subject matter of the inquiry must be in the public interest and must be appropriate for inclusion in a government-run survey. The questions should be of a factual nature rather than gathering opinions.

•	The inquiry must not have an adverse effect on the CPS or on the image of BLS or the Census Bureau. Specifically, the questions must not be so far removed from the subject and tenor of the basic CPS, or from the missions of BLS and the Census Bureau, that they damage survey or institutional legitimacy. They must not cause respondents to question the importance of the survey or result in significantly decreased response rates or data quality. BLS and the Census Bureau must not be adversely affected in terms of congressional acceptance and approval of their programs.

•	The subject matter must be compatible with the basic CPS survey and must not introduce a concept that could affect the accuracy of responses to the basic CPS information. For example, a series of questions incorporating a revised labor force concept could inadvertently affect responses to the standard labor force items and would not be allowed.
•	The subject matter must not be sensitive.
This criterion is imprecise, and its interpretation has changed over time. For example, the
subject of birth expectations, once considered
sensitive, has been included as a CPS supplemental inquiry. Religious affiliation and activity
are examples of subjects currently considered
sensitive.

•	The inquiry must not slow down the work of
the basic survey or impose a response burden that may affect future participation in
the basic CPS. In general, the supplemental
inquiry must not add more than 10 minutes
of interview time per household. Competing
requirements for the use of BLS and Census
Bureau staff or facilities that arise in dealing
with a supplemental inquiry are resolved by
giving the basic CPS first priority. BLS and the
Census Bureau will not jeopardize the schedule for completing the basic CPS or other
work to favor completing a supplemental
inquiry within a specified time frame.

•	It must be possible to meet the objectives of
the inquiry through survey methods. That is,
it must be possible to translate the supplemental survey’s objectives into meaningful
questions, and the respondent must be able
to supply the information required to answer
the questions.

•	If the Census Bureau is to collect the supplemental information during the CPS interview,
the inquiry must be suitable for the personal
visit and telephone procedures used in the
CPS.

•	All supplements must abide by the Census
Bureau’s enabling legislation, which, in part,
ensures that the Census Bureau will not
release information that can identify any individual. Requests for a person’s name, address,
social security number, or other information
that can directly identify an individual will
not be included in the dataset. In addition,
information that could be used to indirectly
identify an individual with a high probability of success (e.g., small geographic areas
in conjunction with income or age) will be
suppressed.
•	The cost of supplements must be borne by
the sponsor, regardless of the nature of the
request or the relationship of the sponsor to
the ongoing CPS. The questionnaires developed for the supplement are subject to BLS’
and the Census Bureau’s pretesting policies.
These policies encourage questionnaire
research aimed at improving data quality.
While BLS and the Census Bureau may reject
proposed supplement questions and topics,
they cannot give final approval of questions or
supplements. The Office of Management and
Budget (OMB), through its Statistical Policy
Division, reviews the proposed supplement
to make certain it meets government-wide
standards regarding the need for the data and
the appropriateness of the design and ensures
that the survey instruments, strategy, and
response burden are acceptable. OMB may disallow some questions, or even whole supplements, that had been approved by BLS and the Census Bureau. The risk of this occurring
is minimized by consulting with OMB early
in the development process if there is some
question about the appropriateness or validity
of the proposed questions.

RECENT CURRENT POPULATION
SURVEY SUPPLEMENTS
The scope and type of CPS supplements vary
considerably from month to month and from
year to year. Generally, the interviewers ask the
selected respondent(s) the additional questions
that are included in the supplemental survey after
completing the basic CPS. Table 1-3.1 summarizes
CPS supplements that were conducted between
January 2004 and December 2017.


Table 1-3.1.

Current Population Survey Supplements, January 2004–December 2017

Annual Social and Economic Supplement. February, March, April 2004–2017. Recent sponsor: Census Bureau/Bureau of Labor Statistics (BLS). Purpose: Provide data concerning family characteristics, household composition, marital status, educational attainment, health insurance coverage, foreign-born population, prior year’s income from all sources, work experience, receipt of noncash benefits, poverty, program participation, and geographic mobility.

Child Support. April 2004, 2006, 2008, 2010, 2012, 2014, 2016. Recent sponsor: Office of Child Support Enforcement. Purpose: Identify households with absent parents and provide data on child support arrangements, visitation rights of the absent parent, amount and frequency of actual versus awarded child support, and health insurance coverage. Also provide data on why child support was not received or awarded.

Civic Engagement. November 2008, 2009, 2010, 2011, 2013. Recent sponsor: Corporation for National and Community Service (CNCS). Purpose: Provide information on the extent to which our nation’s communities are places where individuals are civically active.

Computer Use/Internet Use. October 2007, July 2011, July 2013, July 2015, November 2017. Recent sponsor: National Telecommunications and Information Administration. Purpose: Provide information about household access to computers and the use of the Internet.

Contingent Workers. February 2005, May 2017. Recent sponsor: Department of Labor. Purpose: Provide information on the type of employment arrangement workers have on their current job and other characteristics of the current job (e.g., earnings, benefits, longevity), along with their satisfaction with and expectations for their current jobs.

Disability. May 2012. Recent sponsor: Office of Disability Employment Policy. Purpose: Provide information on labor market challenges facing people with a disability.

Displaced Workers. January 2004, 2006, 2008, 2010, 2012, 2014, 2016. Recent sponsor: Department of Labor. Purpose: Provide data on workers who lost a job in the last 5 years because of plant closing, shift elimination, or other work-related reason.

Fertility. June 2004, 2006, 2008, 2010, 2012, 2014, 2016. Recent sponsor: Census Bureau. Purpose: Provide data on the number of children that women aged 15 through 44 have ever had and the children’s characteristics.

Food Security. December 2004–2017. Recent sponsor: Food and Nutrition Service. Purpose: Provide data that will measure hunger and food security, such as food expenditure, access to food, and food quality and safety.

Housing Vacancy. Monthly. Recent sponsor: Census Bureau. Purpose: Provide quarterly data on vacancy rates and characteristics of vacant units.

International Migration. August 2008. Recent sponsor: Census Bureau. Purpose: Provide information on how migration patterns have changed, as well as how migrants adapt to living in the United States.

Participation in the Arts (PPA)/Annual Arts Benchmark Survey (AABS). PPA: February 2008, February 2012; AABS: February 2013, 2014, 2015, 2016. Recent sponsor: National Endowment for the Arts. Purpose: PPA provides data on the type and frequency of adult participation in the arts; training and exposure (particularly while young); and musical and artistic activity preferences. AABS collects a subset of the same information as PPA.

School Enrollment. October 2004–2017. Recent sponsor: BLS/Census Bureau/National Center for Education Statistics. Purpose: Provide information on the population 3 years and older on school enrollment, junior or regular college attendance, and high school graduation.

Tobacco Use. May 2006, August 2006, January 2007, May 2010, August 2010, January 2011, July 2014, January 2015, May 2015. Recent sponsor: National Cancer Institute/Food and Drug Administration. Purpose: Provide data for the population 15 years and older on current and former use of tobacco products; restrictions on smoking in the workplace for employed persons; and personal attitudes toward smoking.

Unbanked and Underbanked. June 2009, 2011, 2013, 2015, 2017. Recent sponsor: Federal Deposit Insurance Corporation. Purpose: Provide data on unbanked and underbanked households, their demographic characteristics, and their reasons for being unbanked and underbanked.

Unemployment Insurance Nonfiler. January, May, July, November 2005. Recent sponsor: Department of Labor. Purpose: Provide data on individuals who do not apply for unemployment insurance and the reasons they do not apply.

Veterans. August 2005, 2007, 2009, July 2010, August 2011–2017. Recent sponsor: Veterans’ Employment and Training Service and Department of Veterans Affairs. Purpose: Provide data for veterans of the United States on Vietnam-theater and Persian Gulf-theater status, service-connected income, effect of a service-connected disability on current labor force participation, and participation in veterans’ programs.

Volunteers. September 2004–2015. Recent sponsor: CNCS. Purpose: Provide a measurement of participation in volunteer service, specifically the frequency of volunteer activity, the kinds of organizations volunteered with, and the types of activities chosen. Among nonvolunteers, questions identify what barriers were experienced in volunteering or what encouragement is needed to increase participation.

Voting and Registration. November 2004, 2006, 2008, 2010, 2012, 2014, 2016. Recent sponsor: Census Bureau. Purpose: Provide demographic information on persons who did and did not register to vote. Also measure the number of persons who voted and reasons for not registering.

A widely used supplement is the ASEC, which the Census Bureau conducts every February, March,
and April. This supplement collects data on work
experience, several sources of income, poverty,
migration, household composition, health insurance coverage, and receipt of noncash benefits.
The Housing Vacancy Survey (HVS) is unusual in
that it is conducted every month. The HVS collects
additional information (e.g., number of rooms,
plumbing, and rental or sales price) on HUs identified as vacant in the basic CPS.
The basic CPS weighting is not always appropriate for supplements, since supplements tend to have higher nonresponse rates. In general, only CPS respondents are requested to participate in supplements, so response to a supplement is conditional on CPS response. In addition, supplement universes may be different from the basic CPS universe. Thus, some supplements require weighting procedures that are different from those of the basic CPS.
Although it is not a supplement, there is one
survey that uses the CPS as its sampling frame:
the American Time Use Survey (ATUS). The ATUS
draws sample households from households that
have completed their final CPS interview. The
ATUS, which is conducted 2 to 5 months after the
final CPS interview, collects information about
how people spend their time. More information
about the ATUS, including sampling and weighting
procedures, can be found in the American Time
Use Survey User’s Guide, at .

HOUSING VACANCY SURVEY
Description of supplement
The Housing Vacancy Survey (HVS) is a monthly
supplement to the CPS. The Census Bureau
administers this supplement when the CPS interviewers encounter a sample unit that is intended
for year-round or seasonal occupancy and is
currently vacant or occupied by people with a
usual residence elsewhere. The interviewer asks
a reliable respondent (e.g., the owner, a rental
agent, or a knowledgeable neighbor) questions
on year built; number of rooms, bedrooms, and
bathrooms; how long the HU has been vacant; the
vacancy status (for rent, for sale, etc.); and when
applicable, the selling price or rent amount.

The purpose of the HVS is to provide current
information on the rental and homeowner vacancy
rates, home ownership rates, and characteristics of units available for occupancy. The rental
vacancy rate is a component of the index of
leading economic indicators, which is used to
gauge the current economic climate. Although
the Census Bureau performs this survey monthly,
data for the nation and for the Northeast, South,
Midwest, and West regions are released quarterly
and annually. The data released annually include
information for states and large metropolitan
areas.

Calculation of vacancy rates
The HVS collects data on year-round and seasonal
vacant units. Vacant year-round units are those
intended for occupancy at any time of the year,
even though they may not be in use year-round. In
resort areas, a HU that is intended for occupancy
on a year-round basis is considered a year-round
unit; those intended for occupancy only during
certain seasons of the year are considered seasonal. In addition, vacant HUs held for occupancy
by migratory workers employed in farm work
during the crop season are classified as seasonal.
The rental and homeowner vacancy rates are the
most prominent HVS statistics. The vacancy rates
are determined using information collected by the
HVS and CPS, since the rates are calculated using
both vacant and occupied HUs.
The rental vacancy rate is the ratio of vacant year-round units for rent to the sum of renter-occupied units, vacant year-round units rented but awaiting occupancy, and vacant year-round units for rent. The homeowner vacancy rate is the ratio of vacant year-round units for sale to the sum of owner-occupied units, vacant year-round units sold but awaiting occupancy, and vacant year-round units for sale.
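As a worked illustration, the sketch below computes both rates (expressed as percentages) from the unit counts named in the two definitions; the variable names are hypothetical.

    def rental_vacancy_rate(vacant_for_rent, renter_occupied, rented_awaiting):
        # Vacant year-round units for rent as a percentage of renter-occupied
        # units + units rented but awaiting occupancy + units for rent.
        return 100.0 * vacant_for_rent / (renter_occupied + rented_awaiting + vacant_for_rent)

    def homeowner_vacancy_rate(vacant_for_sale, owner_occupied, sold_awaiting):
        # Vacant year-round units for sale as a percentage of owner-occupied
        # units + units sold but awaiting occupancy + units for sale.
        return 100.0 * vacant_for_sale / (owner_occupied + sold_awaiting + vacant_for_sale)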

Weighting procedure
Since the HVS universe differs from the CPS universe, the HVS records require a different weighting procedure from the CPS records. The HVS
records are weighted by the CPS basic weight,
the CPS special weighting factor, two HVS adjustments, and a regional HU adjustment. (Refer to
Chapter 2-3, Weighting and Estimation, for a description of the CPS weighting adjustments.)
The two HVS adjustments are referred to as the
HVS first-stage weighting adjustment factor and
the HVS second-stage weighting adjustment
factor.
The HVS first-stage adjustment factor is comparable to the CPS first-stage adjustment factor in
that it reduces the contribution to variance from
the sampling of Primary Sampling Units (PSUs).
The adjustment factors are based on 2010 Census
data. There are separate first-stage factors for
year-round and seasonal HUs. For each state,
they are calculated as the ratio of the state-level
census count of vacant year-round or seasonal
HUs in all non-self-representing (NSR) PSUs to the
corresponding state-level estimate of vacant yearround or seasonal HUs from the sample NSR PSUs.
The appropriate first-stage adjustment factor is
applied to every vacant year-round and seasonal
HU in the NSR PSUs. The following formula is used to compute the first-stage adjustment factors for each state for the year-round and seasonal HUs:

FS_s = [ Σ(i=1 to n) C_(s,i) ] / [ Σ(k=1 to m) C_(s,k) / π_(s,k) ]

where

FS_s = The first-stage ratio adjustment factor for state s.

C_(s,i) = The number of seasonal or year-round vacancies based on 2010 census data for NSR PSU i (sample and nonsample) in state s.

C_(s,k) = The number of seasonal or year-round vacancies based on 2010 census data for sampled NSR PSU k only in state s.

π_(s,k) = The probability of selection for sample PSU k in state s.

n = The number of NSR PSUs (sample and nonsample) in state s.

m = The number of NSR sample PSUs in state s.

The HVS second-stage adjustment, which applies to vacant year-round and seasonal HUs in self-representing and NSR PSUs, is calculated as the ratio of the weighted CPS interviewed HUs after the CPS second-stage adjustment to the weighted CPS interviewed HUs after the CPS first-stage adjustment. The cells for the HVS second-stage adjustment are calculated within each month-in-sample (MIS) by census region and type of area (metropolitan or nonmetropolitan, central city or balance of Core Based Statistical Area, and urban or rural). This adjustment is made to all eligible HVS records.

The regional HU adjustment is the final stage in the HVS weighting procedure. The factor is calculated as the ratio of the HU control estimates (including occupied and vacant HUs) for the four major geographic regions of the United States (Northeast, South, Midwest, and West), supplied by the Population Division, to the sum of estimated occupied HUs (from the CPS) plus vacant HUs weighted through the HVS second-stage adjustment. This factor is applied to both occupied and vacant HUs.

The final weight for each HVS record is determined by calculating the product of the CPS basic weight, the HVS first-stage adjustment, and the HVS second-stage adjustment, that is,

Final weight = (base weight)
x (HVS first-stage adjustment)
x (HVS second-stage adjustment).
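A minimal sketch of these calculations, assuming per-PSU census vacancy counts and selection probabilities are available (all names hypothetical):

    def hvs_first_stage_factor(counts_all_nsr, counts_sample_nsr, selection_probs):
        # FS_s for one state and one unit type (year-round or seasonal):
        # census vacancies over all NSR PSUs divided by the probability-weighted
        # census vacancies over the sampled NSR PSUs.
        numerator = sum(counts_all_nsr)  # sum of C_(s,i), i = 1..n
        denominator = sum(c / p for c, p in zip(counts_sample_nsr, selection_probs))
        return numerator / denominator   # = FS_s

    def hvs_final_weight(base_weight, first_stage, second_stage):
        # Final weight = (base weight) x (HVS first-stage) x (HVS second-stage).
        return base_weight * first_stage * second_stage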

The occupied units in the denominator of the vacancy rate formulas use a different final weight since the data come from the CPS. The final weight applied to the renter-occupied and owner-occupied units is the CPS household weight. (Refer to Chapter 2-3, Weighting and Estimation, for a description of the CPS household weight.)

ANNUAL SOCIAL AND ECONOMIC
SUPPLEMENT
Description of supplement
The Census Bureau and BLS sponsor the ASEC.
The Census Bureau has collected ASEC data since
1947. From 1947 to 1955, the ASEC interviews
took place in April; from 1956 to 2001, the ASEC
interviews took place in March; and from 2002 to
the present, the ASEC interviews take place in
February, March, and April, with most interviews
in March.


Prior to 2003, the ASEC was known as the Annual
Demographic Supplement (ADS) or the March
Supplement. In 2001,2 a sample increase was
implemented that required more time for data
collection. Thus, additional ASEC interviews now
take place in February through April.
The supplement collects data on health insurance
coverage, work experience, and income from all
sources, receipt of noncash benefits, poverty,
program participation, and geographic mobility.
A major reason for conducting the ASEC in the
month of March is to obtain better income data,
given proximity to tax season. The universe of
the ASEC is slightly different from that of the
basic CPS—it includes certain members of the
armed forces. This difference requires some minor
changes to the weighting methodology.

Sample
The ASEC sample consists of the March CPS
sample, plus additional CPS households identified
in prior CPS samples and the following April CPS
sample. Table 1-3.2 shows the months when the
eligible sample is identified for years 2001 through
2017. Starting in 2004, the eligible ASEC sample
households are:
•	The entire March CPS sample—8 MIS groups.

•	Hispanic households—identified in November (from all eight MIS groups).

•	Hispanic households—identified in April (from MIS 1 and 5 groups)—a total of two additional MIS groups.

•	Non-Hispanic, non-White households—identified in August (MIS 8), September (MIS 8), October (MIS 8), November (MIS 1 and 5), and April (MIS 1 and 5)—a total of seven additional MIS groups.

•	Non-Hispanic White households with children 18 years or younger—identified in August (MIS 8), September (MIS 8), October (MIS 8), November (MIS 1 and 5), and April (MIS 1 and 5)—a total of seven additional MIS groups.

2 The Census Bureau first used the expanded sample in 2001 for testing; the expanded sample was not included in the official ADS statistics for 2001. The statistics from 2002 are the first official set of statistics published by the Census Bureau using the expanded sample. The 2001 expanded sample statistics were released and are used for comparing the 2001 data to the official 2002 statistics.

Prior to 1976, no additional sample households were added. From 1976 to 2001, only the
November CPS households containing at least
one person of Hispanic origin (item 2 above) were
added to the ASEC. The rest of the households
(items 3, 4, and 5 above) were added in 2001,
along with a general sample increase in selected
states, and are collectively known as the Children’s
Health Insurance Program (CHIP) sample expansion.3 The added households improve the precision of the ASEC estimates for the Hispanic
households; non-Hispanic, non-White households;
and non-Hispanic White households with children
18 years or younger.
Because of the characteristics of CPS sample
rotation (see Chapter 2-2, Sample Design), the
additional cases from the August, September,
October, November, and April CPS do not overlap with those in the March CPS. The March CPS
sample alone consists of eight MIS groups. The
additional sample cases in the ASEC increase its
effective sample size in comparison. The ASEC
sample includes 18 MIS groups for Hispanic households, 15 MIS groups for non-Hispanic, non-White
households, 15 MIS groups for non-Hispanic White
households with children 18 years or younger, and
8 MIS groups for all other households.
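These group totals follow directly from the eligible samples listed above: for Hispanic households, 8 MIS groups (March) + 8 (November) + 2 (April, MIS 1 and 5) = 18; for non-Hispanic, non-White households and for non-Hispanic White households with children, 8 (March) + 1 (August, MIS 8) + 1 (September, MIS 8) + 1 (October, MIS 8) + 2 (November, MIS 1 and 5) + 2 (April, MIS 1 and 5) = 15.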

3 The CHIP was formerly known as the State Children’s Health Insurance Program. For additional information about CHIP, see Chapter 2-2, Sample Design.


Table 1-3.2.

Month-in-Sample Groups Included in the Annual Social and Economic Supplement Sample for Years 2001 to 2017¹

[Table: for each CPS month (August, September, October, November, March, and April) and Hispanic status,² the table lists the month-in-sample groups (1 to 8) included in the ASEC and the years between 2001 and 2017 in which each group was included; NI marks groups not interviewed for the ASEC. As of 2004, the eligible groups are those listed above: the entire March sample (MIS 1 to 8); Hispanic households from all eight November MIS groups and from April MIS 1 and 5; and non-Hispanic³ households from August, September, and October MIS 8 and from November and April MIS 1 and 5.]

NI Not interviewed for the Annual Social and Economic Supplement.
1 The 2010 sample design was phased in in 2014. The month-in-sample groups for 2014 continue in subsequent years.
2 Hispanics may be any race.
3 The non-Hispanic group includes both non-Hispanic, non-Whites, and non-Hispanic Whites with children 18 years or younger.

Prior to 2004, non-Hispanic, non-White and non-Hispanic White with children-under-18 households were selected from different months-in-sample. This table shows the sample selection as of 2004.
The March and April ASEC eligible cases are
given the ASEC questionnaire in those respective
months (see Table 1-3.3). The November eligible
Hispanic households are administered the ASEC
questionnaire in February for November MIS
groups 1 and 5, during their regular CPS interviewing time, and the remaining MIS groups (MIS
2 to 4 and 6 to 8) receive the ASEC interview in
March. November MIS 6 to 8 households have
already completed all 8 months of interviewing for
the CPS before March, and the November MIS 2
to 4 households have an extra contact scheduled
for the ASEC before the fifth interview of the CPS
later in the year.
The August, September, October, and November
eligible non-Hispanic households are given the
ASEC questionnaire in either February or April.


November ASEC-eligible cases in MIS 1 and 5
are interviewed for the CPS in February (in MIS 4
and 8, respectively), so the ASEC questionnaire
is given in February. The August, September, and
October MIS 8 eligible cases are split between the
February and April CPS interviewing months. The
households in other rotation groups in February
and April receive the corresponding CPS non-ASEC supplement for that month.
Mover households are defined at the time of the ASEC interview as households with a different reference person than in the previous CPS interview, or households in which the person who made the household eligible has moved out (i.e., the Hispanic person or other race minority moved out, or a single child aged the household out of eligibility). Mover households identified from the August, September, October, and November eligible sample are removed from the ASEC sample. Mover households identified in the March and April eligible sample receive the ASEC questionnaire.


Table 1-3.3.

Summary of the Annual Social and Economic Supplement Interview Months

[Table: for each CPS month (August, September, October, November, March, and April), Hispanic status,¹ ² and month-in-sample group (1 to 8), the table gives the month (February, March, or April) in which mover and nonmover households receive the ASEC interview, with NI marking cases not interviewed. As described above, March and April eligible cases receive the ASEC in those months; November eligible Hispanic households receive it in February (MIS 1 and 5) or March (MIS 2 to 4 and 6 to 8); November eligible non-Hispanic households in MIS 1 and 5 receive it in February; the August, September, and October MIS 8 eligible cases are split between February and April; and mover households from the August through November eligible samples are not interviewed.]

NI Not interviewed for the Annual Social and Economic Supplement.
1 Hispanics may be any race.
2 The non-Hispanic group includes both non-Hispanic, non-Whites, and non-Hispanic Whites with children 18 years or younger.

Weighting procedure
Prior to weighting, missing supplement items are
assigned values based on hot deck imputation, a
method in which each missing value is replaced
with an observed response from a “similar” unit.
Values are imputed even if all of the supplement
data are missing; thus, there is no separate adjustment for households that respond to the basic
CPS survey but not to the supplement. The ASEC
records are weighted by the CPS base weight, the
CPS special weighting factor,4 the CPS noninterview adjustment, the CPS first-stage adjustment,
and the CPS second-stage adjustment procedure.
(Chapter 2-3, Weighting and Estimation, contains a description of these adjustments.) The
ASEC also receives an additional noninterview
adjustment for the August, September, October,
and November ASEC sample, a CHIP Adjustment
Factor, a family equalization adjustment, and
weights applied to armed forces members.
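As context for the imputation step, the sketch below shows hot deck imputation in its simplest sequential form: a missing item is filled from the most recently observed donor in the same imputation cell. This is a schematic only; the cell definitions and the ASEC hot decks themselves are far more detailed.

    def hot_deck_impute(records, item, cell_key):
        # Sequential hot deck: replace each missing value of `item` with the
        # most recently observed response from a record in the same cell.
        last_donor = {}  # most recent observed value per imputation cell
        for r in records:
            cell = cell_key(r)  # e.g., (age group, sex, region); hypothetical
            if r.get(item) is None:
                if cell in last_donor:
                    r[item] = last_donor[cell]  # impute from the donor
            else:
                last_donor[cell] = r[item]  # this record becomes the new donor
        return records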
The August, September, October, and November
eligible samples are each weighted through the
CPS noninterview adjustment and then combined.
A noninterview adjustment for the combined samples and the CPS first-stage adjustments are applied before the CHIP adjustment is applied.

4 For the 2010 Sample Design, the CPS special weighting factor applies only to group quarters frame units.
The March eligible sample and the April eligible
sample are also weighted separately before the
second-stage weighting adjustment. All of the
samples are then combined so that one second-stage adjustment procedure is performed.
The flowchart in Figure 1-3.1 illustrates the
weighting process for the ASEC sample.
Households from August, September, October,
and November eligible samples: The households from the August, September, October,
and November eligible samples start with their
base CPS weight as calculated in the appropriate
month, modified by the appropriate CPS special
weighting factor and appropriate CPS noninterview adjustment in the August, September,
October, or November interviews. Next, a second noninterview adjustment is made for eligible
households that are still occupied but for which
an interview could not be obtained to account for
the nonrespondent households in the February,
March, or April interviews. Then, the ASEC sample weights for the prior sample are adjusted
by the CPS first-stage adjustment and the CHIP
Adjustment Factor.


Figure 1-3.1.

Diagram of the Annual Social and Economic Supplement Weighting Scheme

[Flowchart: each eligible sample (August,¹ September,¹ October,² November,³ March,⁴ and April⁵) starts from its CPS base weight and the special weighting factor,* and receives the CPS noninterview adjustment for its own month. The August through November samples then have mover households removed and receive the ASEC noninterview adjustment,⁶ the first-stage adjustment, and the CHIP adjustment; the March and April samples receive the first-stage adjustment and the CHIP adjustment. The combined samples then pass through the second-stage adjustment for civilian persons, armed forces weighting, and family equalization to produce the final ASEC weight.]

* For the 2010 Sample Design, the Current Population Survey Special Weighting Factor applies only to group quarters frame units.
Eligible cases for:
1 From MIS 8 rotations that are non-Hispanic, non-Whites and non-Hispanic Whites with children; interviewed in February.
2 From MIS 8 rotations that are non-Hispanic, non-Whites and non-Hispanic Whites with children; interviewed in April.
3 Hispanics from MIS 2–4 and 6–8 groups, interviewed in March; and Hispanic; non-Hispanic, non-White; and non-Hispanic White with children households from MIS 1 and 5, interviewed in February as part of their MIS 4 and 8 interviews.
4 All cases interviewed from March.
5 From MIS 1 and 5, all Hispanics; non-Hispanic, non-Whites; and non-Hispanic Whites with children; all interviewed in April.
6 NI (Not interviewed for the Annual Social and Economic Supplement).

The ASEC noninterview adjustment for the August,
September, October, and November eligible
sample. The second noninterview adjustment is
applied to the August, September, October, and
November eligible sample households to reflect
nonresponse of occupied HUs that occur in the
February, March, or April interviews. If a nonresponding household is actually a mover household, it would not be eligible for interview. Since
the mover status of nonresponding households
is not known, we assume that the proportion of
mover households is the same for interviewed
and nonresponding households. This is reflected
in the noninterview adjustment. With this exception, the noninterview adjustment procedure is
the same as described in Chapter 2-3, Weighting
and Estimation. At this point, the nonresponding
households and those mover households receive
no further ASEC weighting. The weights of nonresponding, occupied, nonmover households from
August, September, October, and November are
transferred to the ASEC respondents from August,
September, October, and November so that the
ASEC respondents represent all eligible persons.
The noninterview adjustment factor, F_ij, is computed as follows:

F_ij = (Z_ij + N_ij + B_ij) / (Z_ij + B_ij)

where

Z_ij = The weighted number of August, September, October, and November eligible sample households interviewed in the February, March, or April CPS in cell j of cluster i.

N_ij = The weighted number of August, September, October, and November eligible sample occupied, nonresponding HUs in the February, March, or April CPS in cell j of cluster i.

B_ij = The weighted number of August, September, October, and November eligible sample mover households identified in the February, March, or April CPS in cell j of cluster i.

The weighted counts used in this formula are
those after the CPS noninterview adjustment
is applied. The clusters i refer to the noninterview clusters (NICL), which are groups of PSUs.
NICL boundaries do not cross the census regions
(Northeast, Midwest, South, and West). Within
each of these clusters, the cell j could be (1)
Central City, (2) Balance of Metropolitan Statistical
Area, or (3) Nonmetropolitan, depending on the
type of cluster, with each cluster having either
cells (1) and (2), or only cell (3).
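Given the stated assumption that movers occur at the same rate among nonresponding and interviewed households, the factor can be computed per cell as in the sketch below (hypothetical names):

    def asec_noninterview_factor(z, n, b):
        # F_ij for one cell j of cluster i, with
        #   z = Z_ij (weighted interviewed eligible households),
        #   n = N_ij (weighted occupied, nonresponding HUs),
        #   b = B_ij (weighted mover households identified).
        # Weighting respondents by F transfers the estimated nonmover share of
        # nonrespondent weight, n * z / (z + b), to the respondents.
        return (z + n + b) / (z + b)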
CHIP adjustment factor for the August,
September, October, and November eligible
sample. The CHIP adjustment factor is applied to
nonmover eligible households that contain residents who are Hispanic; non-Hispanic, non-White;
and non-Hispanic Whites with children 18 years or
younger to compensate for the increased sample
in these demographic categories. Hispanic households receive a CHIP adjustment factor of 8/18
and non-Hispanic, non-White households and
non-Hispanic White households with children 18
years or younger receive a CHIP adjustment factor
of 8/15 (see Table 1-3.4). After this adjustment
is applied, the August, September, October, and
November eligible sample households are ready
to be combined with the March and April eligible
samples for the application of the second-stage
adjustment.
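The two factors are the ratio of the 8 basic MIS groups to the 18 or 15 MIS groups in the expanded ASEC sample. A minimal sketch (hypothetical labels):

    CHIP_FACTOR = {
        "hispanic": 8 / 18,                          # eligible in 18 MIS groups
        "non_hispanic_non_white": 8 / 15,            # eligible in 15 MIS groups
        "non_hispanic_white_with_children": 8 / 15,  # eligible in 15 MIS groups
    }

    def chip_adjust(weight, household_type):
        # All other households keep a CHIP adjustment factor of 1.
        return weight * CHIP_FACTOR.get(household_type, 1.0)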
Eligible households from the March sample: The
March eligible sample households start with their
CPS base weight, modified by the CPS special
weighting factor, the March CPS noninterview
adjustment, the March CPS first-stage adjustment (as described in Chapter 2-3, Weighting and
Estimation), and the CHIP adjustment factor. After
the CHIP adjustment factor is applied, the March
eligible sample is ready to be combined with the
August, September, October, November, and April
eligible samples for the application of the second-​
stage adjustment.
CHIP adjustment factor for the March eligible sample. The CHIP adjustment factor is applied to the
March eligible nonmover households that contain
residents who are Hispanic; non-Hispanic, nonWhite; and non-Hispanic Whites with children 18
years or younger to compensate for the increased
sample size in these demographic categories.
Hispanic households receive a CHIP adjustment
factor of 8/18 and non-Hispanic, non-White households and non-Hispanic White resident households with children 18 years or younger receive
a CHIP adjustment factor of 8/15. Mover households, households with newborns, and all other
households receive a CHIP adjustment of one.
Table 1-3.4 summarizes these weight adjustments.
Eligible households from the April sample: The
households in the April eligible sample start with
their CPS base weight as calculated in April, modified by the April CPS special weighting factor, the
April CPS noninterview adjustment, the April CPS
first-stage adjustment, and the CHIP adjustment
factor. After the CHIP adjustment factor is applied,
the April eligible sample is ready to be combined
with the August, September, October, November,
and March eligible samples for the application of
the second-stage adjustment.
CHIP adjustment factor for the April eligible
sample. The CHIP adjustment factor is applied to
April-eligible households that contain residents
who are Hispanic; non-Hispanic, non-Whites; or
non-Hispanic Whites with children 18 years or
younger to compensate for the increased sample
size in these demographic categories regardless of
mover status. Hispanic households receive a CHIP
adjustment factor of 8/18 and non-Hispanic, nonWhite households and non-Hispanic White households with children 18 years or younger receive
a CHIP adjustment factor of 8/15. Table 1-3.4
summarizes these weight adjustments.
Combined sample of eligible households from the
August, September, October, November, March,
and April CPS: At this point, the eligible samples
from August, September, October, November,
March, and April are combined. The remaining
adjustments are applied to this combined sample
file.
ASEC second-stage adjustment: The second-​
stage adjustment adjusts the ASEC estimates,
so that they agree with independent age, sex,
race, and Hispanic-origin population controls
as described in Chapter 2-3, Weighting and
Estimation. The same procedure used for CPS is
used for the ASEC.
Additional ASEC weighting: After the ASEC
weight through the second-stage procedure is
determined, the next step is to determine the
final ASEC weight. There are two more weighting
adjustments applied to the ASEC sample cases.
The first is applied to the armed forces members.


The armed forces adjustment assigns weights
to the eligible armed forces members so they
are included in the ASEC estimates. The second
adjustment is for family equalization. Without this adjustment, the estimate of men in partnerships would exceed the estimate of women in partnerships. Weights are adjusted to give married partners and unmarried partners the same weight, while maintaining the overall age, race, sex, and Hispanic origin control totals.
Armed forces: Male and female members of the
armed forces living off post or living with their
families on post are included in the ASEC as long
as at least one civilian adult lives in the same
household, whereas the CPS excludes all armed
forces members. Households with no civilian
adults in the household, i.e., households with only
armed forces members, are excluded from the
ASEC. Control totals, used in the second-stage
factor, do not include armed forces members;
thus, armed forces members do not go through
the second-stage adjustment. During family equalization, the weight of an armed forces member
with a spouse or partner is assigned to the weight
of their spouse or partner.
Family equalization: The family equalization procedure equalizes the estimates of the number of
people in partnerships, both married and unmarried, while maintaining the overall civilian age/
race/sex/ethnicity control totals. Seven categories
of adults (at least 16 years old) based on sex and
household composition are formed:
•	Female partners in female/female married or unmarried partner households.

•	All other civilian females.

•	Married males, spouse present.

•	Male partners in male/female unmarried partner households.

•	Other civilian male heads of households.

•	Male partners in male/male married or unmarried partner households.

•	All other civilian males.


Table 1-3.4.

Summary of the Annual Social and Economic Supplement Children’s Health Insurance Program Adjustment Factor

[Table: for each CPS month (August, September, October, November, March, and April), Hispanic status,¹ ³ and month-in-sample group (1 to 8), the table gives the CHIP adjustment factor applied to mover and nonmover households: 8/18 for eligible Hispanic households; 8/15 for eligible non-Hispanic households; 1 for all other March and April cases;⁴ and 0² for cases ineligible for the ASEC, including mover households from the August through November eligible samples.]

1 Hispanics may be any race.
2 Zero weight indicates the cases are ineligible for the Annual Social and Economic Supplement.
3 The non-Hispanic group includes both non-Hispanic, non-Whites, and non-Hispanic Whites with children 18 years or younger.
4 Nonmover non-Hispanic Whites without children 18 years or younger get a Children’s Health Insurance Program Adjustment Factor of 1.

Three different methods, depending on the
household composition, are used to assign the
ASEC weight to other members of the household.
The methods are (1) assigning the weight of the
householder to the spouse or partner, (2) averaging the weights of the householder and partner,
or (3) computing a ratio adjustment factor and
multiplying the factor by the ASEC weight.
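A schematic of the three methods (hypothetical structure; the actual procedure operates within the seven categories above and preserves the control totals):

    def equalize_partner_weights(householder_wt, partner_wt, method, factor=None):
        # Method 1: assign the householder's weight to the spouse or partner.
        # Method 2: average the weights of the householder and partner.
        # Method 3: multiply the ASEC weights by a precomputed ratio adjustment factor.
        if method == 1:
            return householder_wt, householder_wt
        if method == 2:
            average = (householder_wt + partner_wt) / 2
            return average, average
        if method == 3:
            return householder_wt * factor, partner_wt * factor
        raise ValueError("method must be 1, 2, or 3")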

SUMMARY
Although this discussion focuses on only two
CPS supplements (the HVS and the ASEC), every
supplement has its own unique objectives. The
Census Bureau tailors the particular questions,
edits, and imputations to each supplement’s data
needs. For many supplements, this also means
altering the weighting procedure to reflect a
different universe, account for a modified sample, or adjust for a higher rate of nonresponse. The
weighting revisions discussed here for HVS and
ASEC indicate only the types of modifications that
the Census Bureau might use for a supplement.
Technical documentation for the CPS supplements
can be found at .

REFERENCES
U.S. Bureau of Labor Statistics, American Time
Use Survey User’s Guide retrieved from .
U.S. Census Bureau, Current Population Survey
Questionnaires retrieved from .


Chapter 1-4: Data Products
INTRODUCTION
Information collected in the CPS is made available
by both the BLS and the Census Bureau through
broad publication programs that include news
releases, reports, charts, data tables, time series,
and other formats. In addition to tabulated data,
public-use microdata files are also available. Some
of the major products are identified below. This
section is not intended to be an exhaustive reference for all information available from the CPS.

BUREAU OF LABOR STATISTICS
PRODUCTS
Each month, employment and unemployment
data are published initially in The Employment
Situation news release about 2 weeks after data
collection is completed. The release includes
a narrative summary and analysis of the major
employment and unemployment developments
together with tables containing statistics for the
principal data series. The news release can be
accessed at .
In addition to the news release, detailed tables
provide information on the labor force, employment, and unemployment by a number of characteristics such as age, sex, race, Hispanic and
Latino ethnicity, educational attainment, industry,
and occupation. Tables may contain monthly,
quarterly, or annual average data. Annual average
tables typically show more detail than tables published on a monthly or quarterly basis. See
.
The BLS LABSTAT database system contains four
separate databases with CPS time series data.
(They can be accessed from ). The largest database contains the
main labor force statistics, and others contain
earnings, union, and marital and family statistics. About 15,000 labor force series are updated
each month, including a few hundred time series
that go back to the 1940s. In total, there are over
100,000 monthly, quarterly, or annual-only time
series available from the CPS.
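Readers who want to retrieve LABSTAT series programmatically can use the BLS public data API. The sketch below assumes the v2 endpoint and the series ID LNS14000000 (the seasonally adjusted unemployment rate); consult the BLS developer documentation for current details.

    import requests

    # Fetch the seasonally adjusted unemployment rate series (LNS14000000)
    # from the BLS public data API (v2 endpoint; registration allows larger pulls).
    URL = "https://api.bls.gov/publicAPI/v2/timeseries/data/"
    payload = {"seriesid": ["LNS14000000"], "startyear": "2016", "endyear": "2017"}

    response = requests.post(URL, json=payload, timeout=30)
    response.raise_for_status()
    for series in response.json()["Results"]["series"]:
        for obs in series["data"]:
            # Each observation carries a year, a period (e.g., M01), and a value.
            print(series["seriesID"], obs["year"], obs["period"], obs["value"])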


Some of the tables and time series contain seasonally adjusted monthly or quarterly data. These
estimates, useful for month-to-month analysis,
undergo a process that removes predictable seasonal patterns in the data and makes it easier to
see the underlying trends. However, only a small
portion of estimates are available on a seasonally
adjusted basis. The majority of the data are not
seasonally adjusted. An over-the-year comparison
of unadjusted data is recommended rather than
a month-to-month analysis. (Note that annual
averages do not exhibit seasonality.) A discussion
of seasonal adjustment is available in Chapter 2-5,
Seasonal Adjustment.
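As an illustration, the over-the-year comparison amounts to the following (a minimal sketch; the series mapping is hypothetical):

    def over_the_year_change(series, year, month):
        # Compare a not seasonally adjusted estimate with the same month one
        # year earlier; `series` maps (year, month) to an estimate.
        current, prior = series[(year, month)], series[(year - 1, month)]
        return current - prior, 100.0 * (current - prior) / prior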
BLS publishes CPS data in annual news releases
on a wide variety of subjects including union
membership, veterans, people with disabilities,
families, the foreign-born, and summer employment of youth. BLS also publishes several large
reports on minimum wage workers, labor force
characteristics by race and ethnicity, women in the
labor force, and earnings of men and women. All
of these publications include both analytical content and data tables. They also include a Technical
Note describing the source of the data and key
concepts or issues in interpreting the information
presented. Reports and news releases are available at .
BLS also sponsors (or co-sponsors) many of the
supplemental surveys described in Chapter 1-3,
Supplements. BLS news releases and reports from
the CPS supplements include work experience
over the calendar year (annual), a profile of the
working poor (annual), characteristics of veterans
(annual), college enrollment and work activity of
recent high school graduates (annual), worker
displacement (biennial), and employee tenure
(biennial). Other topics have also periodically
been addressed, like barriers to employment and
labor-related issues for people with disabilities,
workers on flexible and shift schedules, work at
home, and contingent and alternative employment
arrangements.


Because the CPS provides information on such
a wide variety of topics, the best way to explore
data from the BLS is using the subject matter
summaries of available data on the labor force,
employment, unemployment, persons not in the
labor force, hours of work, earnings, and other
demographic and labor force characteristics.
The CPS topics A to Z list is often a good place
to start: . Each
subject overview presents relevant news releases,
tables, database series, charts, and analytical articles authored by BLS staff.
Information about the CPS itself, including concepts, measurement, and changes to the program
over time, is available at .
Most of the CPS data available from BLS are estimates for the United States as a whole, but some
information is available for states and other geographic areas. The BLS Local Area Unemployment
Statistics (LAUS) program publishes monthly
estimates of employment and unemployment
for all states, counties, metropolitan areas, cities
of 25,000 population or more, and New England
cities and towns of 1,000 population or more,
by place of residence. Labor force data from the
LAUS program follow the same CPS concepts
and definitions used for the national labor force
data. However, because the CPS sample is insufficient for creating reliable monthly estimates for
statewide and substate areas, LAUS uses different estimating procedures for different levels of
geography. In general, monthly estimates for the
states, such as the official state unemployment
rates, are developed using statistical models that
incorporate current and historical data from the
CPS, the monthly CES establishment survey, and
regular state unemployment insurance systems.
The annual subnational demographic labor force
data published in the Geographical Profile of
Employment and Unemployment are derived solely from the CPS. For more information about
these estimates, see .

CENSUS BUREAU PRODUCTS
The Census Bureau has been analyzing data from
the CPS and reporting the results to the public for
over five decades. The reports provide information
on a recurring basis about a wide variety of social,
demographic, and economic topics. Most of these
reports appear in one of the following series
issued by the Census Bureau: P-20, Population
Characteristics; P-23, Special Studies; and P-60,
Consumer Income. Many of the reports are based
on data collected by the ASEC, which is a CPS
supplement. However, other reports use data from
various CPS supplements. Generally, reports are
announced by news release and are released to
the public via .

Census Bureau Report Series
P-20, Population Characteristics. Regularly
recurring reports in this series include topics such
as geographic mobility, educational attainment,
school enrollment, marital status, households
and families, Hispanic origin, fertility, voter registration and participation, and the foreign-born
population.
P-23, Special Studies. Information pertaining
to methods, concepts, or specialized data is
furnished in these publications. Also included
are occasional reports on the Black population,
metropolitan-nonmetropolitan population, youth,
women, the older population, and other topics.
P-60, Consumer Income. Information concerning
families, individuals, and households at various
income levels is presented in this group of reports.
Data are also presented on noncash benefits and
relationship of income to age, sex, race, family
size, education, occupation, work experience, and
other characteristics.


Public-Use Microdata Files
In addition to the regularly tabulated statistics
described above, special data can be generated
through the use of the CPS individual record
(microdata) files. These files contain records of
the responses to the survey questionnaire for all
respondents in the survey and can be used to
create additional cross-sectional detail. The actual
identities of the individuals are protected on all
versions of the files made available to users outside of the Census Bureau.


Access to CPS raw microdata, that is, the untabulated data from survey items, is available through
an FTP site . This site provides the
microdata files for the basic monthly labor force
survey from 1994 forward, along with record
layouts (data dictionaries). It also includes files for
selected CPS supplements.


Appendix: History of the Current Population Survey
•	March 1940. WPA compiles the first official monthly figures on unemployment; estimates are released in an internal memorandum in April 1940 with the title “Monthly Report of Unemployment.” The reference week for the survey was March 24–30; data were collected the following week. (This is generally thought of as the survey's first collection date.) March was chosen since it closely matched the reference period for the 1940 decennial census.

•	August 1942. The Census Bureau assumes responsibility for the survey, and the program is renamed the Monthly Report on the Labor Force.

•	October 1943. Following the 1940 decennial census, the sample for the survey is revised to make full use of probability sampling principles. PSUs of one or more counties were defined covering the entire United States. Sixty-eight PSUs were selected, and about 26,000 HUs were designated for the sample, of which about 22,500 were eligible for interview.

•	March 1945. The monthly sample is reduced to about 25,000 designated households (21,500 eligible for interview) without affecting the reliability of the estimates.

•	July 1945. The CPS questionnaire is revised. The revision consisted of the introduction of four basic employment status questions. Methodological studies showed that the previous questionnaire produced results that misclassified large numbers of part-time and intermittent workers such as students, housewives, and unpaid family workers on farms. These groups were erroneously reported as not active in the labor force.

•	August 1947. The sample selection method is revised. The method of selecting sample units within a sample area was changed so that each unit selected would have the same chance of selection. This change simplified estimation procedures.

•	July 1949. Previously excluded dwelling places are now covered. The sample was extended to cover special dwelling places—hotels, motels, trailer camps, etc. This led to improvements in the statistics (i.e., reduced bias), since residents of these places often have characteristics that are different from the rest of the population.

•	February 1952. Document-sensing procedures are introduced into the survey process. The CPS questionnaire was printed on a document-sensing card. In this procedure, responses were recorded by drawing a line through the oval representing the correct answer using an electrographic lead pencil. Punch cards were automatically prepared from the questionnaire by document-sensing equipment.

•	January 1953. Ratio estimates begin to use data from the 1950 population census. Prior to January 1953, the ratio estimates had been based on 1940 census relationships for the first-stage ratio estimate, and 1940 population data were used to adjust for births, deaths, etc., for the second-stage ratio estimate.

•	July 1953. The 4-8-4 rotation system is introduced. This sample rotation system was adopted to improve measurement of changes over time. In this system, households are interviewed for 4 consecutive months, leave the sample for 8 months, and then return for the same period of 4 months the following year. In the previous system, households were interviewed for 6 consecutive months and then replaced. The 4-8-4 system provides some year-to-year overlap (50 percent), thus improving estimates of change on a year-to-year basis (while largely preserving the degree of month-to-month overlap).

•	September 1953. In second-stage ratio estimation, information on “color” (the race terminology used at that time) is added and information on “veteran status” is omitted. This change made it feasible to publish separate, absolute numbers for individuals by race, whereas previously only the percent distributions had been published.
High-speed electronic equipment is introduced for tabulations. The introduction of electronic calculation greatly increased data timeliness and led to other improvements in estimation methods. Other benefits included the substantial expansion of the scope and content of the tabulations and the computation of sampling variability. (The shift to modern computers was made in 1959.)
•

•	

January 1957. In response to recommendations from an interagency committee tasked
with reviewing conceptual and methodological issues in the measurement of employment and unemployment, two relatively small
groups of people, both formerly classified as
employed ‘‘with a job but not at work,’’ are
assigned to new classifications. The reassigned groups were (1) people on layoff with
definite instructions to return to work within
30 days of the layoff date, and (2) people
waiting to start new wage and salary jobs
within 30 days of the interview. Most of the
people in these two groups were shifted to the
unemployed classification. The only exception
was the small subgroup in school during the
survey week who were waiting to start new
jobs; these were transferred to ‘‘not in labor
force.’’ This change in definition did not affect
the basic labor force questions or the enumeration procedures.

•	

June 1957. A seasonally adjusted unemployment rate is introduced, reflecting ongoing advances in electronic computers. A further
extension of the data—using more refined
seasonal adjustment methods—was introduced in July 1957. (Some seasonally adjusted
unemployment data were introduced early in
1955 in index form.) The new data included a
seasonally adjusted rate of unemployment and
trends in seasonally adjusted levels for total
employment and unemployment. Significant
improvements in methodology emerged
from research conducted at the BLS and the
Census Bureau in the following years.

•	

July 1959. Responsibility for the planning,
analysis, seasonal adjustment, and publication of the labor force statistics from the
CPS is transferred to BLS as part of a large
exchange of statistical functions between the
Department of Commerce and Department of
Labor. The Census Bureau continued to have
(and still has) responsibility for the collection
and computer processing of these statistics,
for maintenance of the CPS sample, and for
related methodological research. Interagency
review of CPS policy and technical issues continues to be the responsibility of the Bureau
of the Budget (now the Statistical Policy
Division, OMB).

•	

January 1960. Upon achieving statehood,
Alaska and Hawaii are included in the independent population estimates and in the
sample survey. This increased the number of
sample PSUs from 330 to 333. The addition of
these two states affected the comparability of
population and labor force data with previous
years. There was an increase of about 500,000
in the noninstitutional population aged 16 and
over and about 300,000 in the labor force,
four-fifths of this in nonagricultural employment. The levels of other labor force categories were not appreciably changed.

•	

October 1961. The CPS questionnaire is
converted to the Film Optical Sensing Device
for Input to the Computer system (the system used in the 1960 Census). Entries were
made by filling in small circles with an ordinary
lead pencil. The questionnaires were photographed to microfilm. The microfilms were
then scanned by a reading device that transferred the information directly to computer
tape. This system permitted a larger form and
a more flexible arrangement of items than the
previous document-sensing procedure and did
not require the preparation of punch cards.
This data entry system was used through
December 1993.

•	

December 1961. For the redesign following the 1960 decennial census, new sample
is phased in from December 1961 to March
1963. There were 357 PSUs; 35,000 households were eligible for interview. In a major
improvement, most of the sample is drawn
from household lists prepared during the
decennial census. Starting February 1962,
population controls from the decennial census
were used in weighting.

•	

January 1963. In response to recommendations from the President’s Committee to
Appraise Employment and Unemployment
Statistics (also referred to as the “Gordon
Committee”), two new items are added to the
monthly questionnaire. The first was a question, formerly asked only intermittently, on
whether the unemployed were seeking full- or
part-time work. The second was an expanded
item on household relationships, formerly
included only annually, to provide greater
detail on the marital status and household
relationships of unemployed people.


•	

January 1967. The CPS sample is expanded
from 357 to 449 PSUs. An increase in total
budget allowed the overall sample size to
increase by roughly 50 percent to a total of
about 60,000 HUs (50,000 eligible units). This
expansion improved the reliability of the major
statistics by about 20 percent and made it
possible to publish more detailed statistics.
The definitions of employment and unemployment are modified. In line with the basic recommendations of the President’s Committee
to Appraise Employment and Unemployment
Statistics (U.S. Department of Commerce,
1962), a several-year study was conducted
to develop and test proposed changes in the
labor force concepts. The principal research
results were implemented in January 1967.
The changes included a revised lower age
limit in defining the labor force (from age 14
to age 16) and new questions to improve the
information on hours of work, the duration of
unemployment, and the self-employed. The
definition of unemployment was also revised
slightly. A 4-week job search period was
specified, and new questions on job search
methods and current availability for work were
added. The refined definition of unemployment led to small differences in the estimates of levels and month-to-month change.
Collection of additional information on those
not in the labor force began.

•	

March 1968. Separate age and sex ratio
estimation cells are introduced for Negro and
Other races. (Negro was the race terminology
used at that time. Other includes American
Indian, Eskimo, Aleut, Asian, and Pacific
Islander.) Previously, the second-stage ratio
estimation used White and non-White race
categories by age groups and sex. The revised
procedures allowed separate ratio estimates
for Negro and Other race categories. This
change increased the number of ratio estimation cells from 68 to 116.

•	

January 1970. The detailed not in labor force
questions are shifted from the incoming
rotation groups (MIS 1 and 5) to the outgoing
rotation groups (MIS 4 and 8).

•	

January 1971. The 1970 Census occupational
classification system is introduced.


•	

December 1971. For the redesign following the 1970 decennial census, new sample
is phased in from December 1971 to March
1973. There were 461 PSUs; 47,000 HUs were
eligible for interview. Also, the cluster design
was changed from six nearby households
(but not contiguous) to four usually contiguous households. This change was undertaken
after research found that smaller cluster sizes
would increase sample efficiency. Even with
the reduction in sample size, this change led
to a small gain in reliability for most characteristics. Also, the questions on occupation were
made more comparable to those used in the
1970 Census by adding a question on major
activities or duties of current job.

•	

January 1972. The population estimates used
in second-stage ratio estimation are updated
to the 1970 Census base. Data tabulated using
the 1970 Census occupation classification
system were produced.

•	

January 1973. Seasonal adjustment converts
to X-11 software, replacing the BLS Factor
Method (which had been introduced in 1960).
The X-11 software included several options
for handling both additive and multiplicative
series, diagnostics were improved, and variable trends and seasonals were allowed.

•	

January 1974. The inflation-deflation method
is introduced for deriving independent estimates of the civilian noninstitutional population by age, race, and sex used in second-stage ratio estimation.

•	

July 1975. As a result of the large inflow of
Vietnamese refugees to the United States, the
total and Black-and-Other independent population controls for those 16 years and over are
adjusted upward.

•	

September 1975. In order to obtain state
estimates from the CPS, state supplementary
samples are introduced; these samples were
not used for national estimation. An additional sample, consisting of about 14,000 interviews each month, was introduced in July 1975 to
supplement the national sample in 26 states
and the District of Columbia. In all, 165 new
PSUs were involved. The supplemental sample was added to meet a specific reliability standard for estimates of the annual average
number of unemployed people for each state.
•

January 1978. State supplemental samples for 24 states and the District of Columbia are incorporated into national estimation. Also,
a supplemental sample was introduced to
improve coverage in “address list” enumeration districts.

•

October 1978. Procedures for determining
demographic characteristics are modified. At
this time, changes were made in the collection
methods for household relationship, race, and
ethnicity data. Race is now determined by the
respondent rather than by the interviewer.
Other modifications included the introduction
of earnings questions for the two outgoing
rotations in the monthly survey (with regular
collection of these data beginning in January
1979). New questions focused on usual hours
worked, hourly wage rate, and usual weekly
earnings. Earnings questions were asked of
currently employed wage and salary workers.
Previously, earnings data were collected in
supplements.

•

January 1979. A new two-level, first-stage
ratio estimation procedure is introduced.
This procedure was designed to improve the
reliability of metropolitan and nonmetropolitan estimates. Also introduced were monthly
tabulations of children’s demographic data
including relationship, age, sex, race, and
origin.

•

September 1979. The final report of the
National Commission on Employment
and Unemployment Statistics (“Levitan
Commission”) is issued (Executive Office of
the President, 1979). This report shaped many
of the future changes to the CPS.

•

January 1980. To improve coverage, about
450 households are added to the sample,
increasing the number of total PSUs to 629.
Also, X-11 Auto-Regressive Integrated Moving
Average (ARIMA) software was introduced for
seasonal adjustment.

•

May 1981. The sample is reduced by approximately 6,000 assigned households, bringing
the total sample size to approximately 72,000
assigned households (with about 60,000
households eligible for interview).

•

January 1982. The race categories in second-stage ratio estimation adjustment are changed
from White and non-White to Black and nonBlack. These changes were made to eliminate
classification differences in race that existed
between the 1980 Census and the CPS. The
change did not result in notable differences in
published CPS data. Nevertheless, it did result
in more variability for certain “White,” “Black,”
and “Other” characteristics. As is customary,
the CPS uses ratio estimates from the most
recent decennial census. Beginning in January
1982, current population estimates used in
the second-stage estimation procedure were
based on findings from the 1980 Census. The
use of the 1980 Census-based population
estimates, in conjunction with the revised
second-stage adjustment, resulted in about
a 2 percent increase in the estimates for total
civilian noninstitutional population aged 16
and over, civilian labor force, and unemployed
people. The magnitude of the differences
between 1970 and 1980 Census-based ratio
estimates affected the historical comparability and continuity of major labor force series;
therefore, BLS revised approximately 30,000
series going back to 1970.

•

January 1983. The occupational and industrial
data are coded using the 1980 classification
systems. While the effect on industry-related
data was minor, the conversion was viewed
as a major break in occupation-related data
series. The census developed a ‘‘list of conversion factors’’ to translate occupation descriptions based on the 1970 census-coding classification system to their 1980 equivalents.
From the “Black and Other” race category, data for Blacks are broken out and tabulated
separately going back to 1972. Data continued to be published for a number of “Black
and Other” data series until 2003. Two questions on union membership and union coverage for the two outgoing rotations in the
monthly survey are introduced. The questions
were asked of currently employed wage and
salary workers. Previously, union membership
data were collected in supplements.
Reflecting a recommendation from the Levitan
Commission, selected series that included the
resident armed forces among the employed are introduced. This included an overall unemployment rate for the nation, which tended to
be about 0.1 percentage point lower than the
civilian unemployment rate.
•

April 1984. The 1970 Census-based sample is phased out through a series of changes that were completed by July 1985. The redesigned sample used data from the 1980 Census to update the sampling frame, took advantage of recent research findings to improve the efficiency and quality of the survey, and used a state-based design to improve the estimates for the states without any change in sample size.

•

October 1984. School enrollment questions are added to the basic CPS for people 16 to 24 years of age.

•

January 1985. Estimation procedures are
changed to use data from the 1980 Census
and the new sample. The major changes
were to the second-stage adjustment, which
replaced population estimates for “Black” and
“Non-Black” (by sex and age groups) with
population estimates for “White,” “Black,”
and “Other” population groups. In addition,
a separate, intermediate step was added as a
control to the Hispanic population. (Hispanics
may be of any race.) The combined effect of
these changes on labor force estimates and
aggregates for most population groups was
negligible; however, the Hispanic population
and associated labor force estimates were
dramatically affected, and revisions to the
major Hispanic data series were made back to
January 1980 to the extent possible.

•

June 1985. The CPS CATI contact center is
opened at Hagerstown, Maryland. A series of
tests over the next few years were conducted
to identify and resolve the operational issues
associated with the use of CATI. Later tests
focused on CATI-related issues, such as data
quality, costs, and mode effects on labor force
estimates. Samples used in these tests were
not used in the official CPS estimates.

•

July 1985. Response categories that obtain
information on period of service for female
veterans are added to the monthly CPS questionnaire. Estimates of the number of female veterans based on these questions were first
published in a 1986 Monthly Labor Review
article.
•

January 1986. For the first time, the population controls used in the second-stage ratio
adjustment method are revised to reflect an
explicit estimate of the number of undocumented immigrants (largely Hispanic) since
1980. In addition, the population controls
included an improved estimate of emigration, or movement out of the United States,
by legal residents. Taken together, the two
changes had a comparatively small effect on
the total population figure, but their effect
on the Hispanic population was more pronounced. Because of the magnitude of the
adjustments for Hispanics, many Hispanic data
series were revised back to January 1980.

•

January 1989. First CATI cases are used on a
large scale in CPS monthly estimates. Initially,
CATI started with several hundred cases each
month. As operational issues were resolved
and new contact centers were opened—
Tucson, Arizona (May 1992) and Jeffersonville,
Indiana (September 1994)—the CATI workload
was gradually increased to about 10,000 cases
a month.

•

June 1990. The first of a series of experiments
to test alternative labor force questionnaires is
started at the Hagerstown Telephone Center.
These tests used random-digit dialing and
were conducted in 1990 and 1991.

•

January 1992. Industry and occupation (I&O)
codes from the 1990 Census are introduced.
Population estimates were converted to the
1990 Census base for use in ratio estimation
procedures.

•

July 1992. The CATI and CAPI Overlap (CCO) experiment begins. CATI and automated
laptop versions of the revised CPS questionnaire were used in a sample of about 12,000
households selected from the National Crime
Victimization Survey sample.
The CCO’s main purpose was to gauge the
combined effect of the new questionnaire and
computer-assisted data collection. The initial
CCO ran parallel to the official CPS from July
1992 to December 1993. An extended parallel
survey with modifications ran from January 1994 to May 1994. Research indicated that
the initial CCO parallel survey results provided misleading indications of the impact
of the redesign on major CPS indicators.
Additional research, using data collected from
both parallel surveys, along with data collected from the official CPS both prior to and after the January 1994 redesign, enabled a more
accurate interpretation of the effects of the
redesigned survey on CPS estimates.
•

January 1994. A new questionnaire designed
solely for use in computer-assisted interviewing is introduced in the official CPS. This
allowed the use of a very complex questionnaire without increasing respondent burden,
increased consistency by reducing interviewer
error, permitted editing at time of interviewing, and allowed the use of dependent
interviewing where information reported in 1
month (I&O, retired and disabled statuses, and
duration of unemployment) was confirmed or
updated in subsequent months. In developing
the automated questionnaire, extensive use
was made of cognitive testing techniques.
It is estimated that the redesign had no
statistically significant effect on the total
unemployment rate, but it did affect statistics
related to unemployment such as the reasons
for unemployment, the duration of unemployment, and the I&O distribution of the
unemployed with previous work experience.
It is also estimated that the redesign significantly increased the employment-population
ratio and the labor force participation rate
for women, but significantly decreased the
employment-population ratio for men. Along
with the changes in employment data, the
redesign significantly influenced the measurement of characteristics related to employment such as the proportion of the employed
working part-time, the proportion working
part-time for economic reasons, the number
of individuals classified as self-employed, and
the I&O distribution of the employed.
The redesigned questionnaire collects some
new data that sharpen labor force concepts and definitions and incorporates some
of the recommendations from the Levitan
Commission. For example, the definition of
discouraged workers is tightened by requiring some evidence of attachment to the job
market, and it improves the measurement of involuntary part-time employment by explicitly asking about current availability for full-time work. Questions were added on citizenship, country of birth of a person’s parents, and whether they were immigrants. These
questions were phased into the sample.
CPS data used by BLS are adjusted to reflect
an undercount in the 1990 decennial census.
Quantitative measures of this undercount
are derived from a post-enumeration survey.
Because of reliability issues associated with
the post-enumeration survey for small areas
of geography (i.e., places with populations of
less than 1,000,000), the undercount adjustment was made only to state and national
level estimates. While the undercount varied by geography and demographic group,
the overall undercount was estimated to be
slightly more than 2 percent for the total aged
16 and over civilian noninstitutional population.
Most interviews are conducted by FRs in
person or by telephone, but the transfer of
workload to CATI contact centers continued.
•

April 1994. For the redesign following the
1990 decennial census, new sample is phased
in from April 1994 to July 1995. This resulted
in a monthly overall sample size of 68,000,
with about 58,000 eligible HUs in 792 PSUs.

•

December 1994. A new set of response categories is phased in for the relationship-to-reference-person question. This modification was directed at
individuals not formally related to the reference person to determine whether there were
unmarried partners in a household. The old
partner or roommate category was deleted
and replaced with the following categories:
unmarried partner, housemate or roommate,
and roomer or boarder. This modification was
phased in for two rotation groups at a time
and was fully in place by March 1995. This
change had no effect on the family statistics
produced by CPS.

•

January 1996. The 1990 CPS design is
changed because of a funding reduction. The
original reliability requirements of the sample were relaxed, allowing a reduction in the
national sample size. The overall sample size decreased from 68,000 to 59,500 HUs, and the number of eligible HUs fell from 56,000 to about 50,000. The reduced CPS national
sample contained 754 PSUs. As a result of the
sample reduction, it was decided to use time
series models to produce monthly employment and unemployment estimates for all
states.
•

January 1998. A new composite weighting
methodology is implemented. The procedure
starts with computed composite estimates for
the main labor force categories, classified by
key demographic characteristics. Then, the
procedure adjusts person weights, through a
series of ratio adjustments, to agree with the
composite estimates—thus incorporating the
effect of composite estimation into the person
weights.
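
The final ratio-adjustment step lends itself to a compact illustration. The sketch below is ours; the data and the single adjustment dimension are invented, and the production procedure applies a series of such adjustments across detailed labor force and demographic cells.

    # Stylized one-dimension ratio adjustment toward composite estimates.
    def ratio_adjust(persons, composite_totals):
        # persons: list of {'cell': ..., 'weight': ...}
        # composite_totals: {cell: composite estimate of that cell's total}
        cell_sums = {}
        for p in persons:
            cell_sums[p['cell']] = cell_sums.get(p['cell'], 0.0) + p['weight']
        factors = {c: composite_totals[c] / s for c, s in cell_sums.items()}
        return [dict(p, weight=p['weight'] * factors[p['cell']]) for p in persons]

    persons = [{'cell': 'employed', 'weight': 1500.0},
               {'cell': 'employed', 'weight': 1700.0},
               {'cell': 'unemployed', 'weight': 1600.0}]
    adjusted = ratio_adjust(persons, {'employed': 3400.0, 'unemployed': 1550.0})
    # Each cell's weighted total now equals its composite estimate.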

•

July 2001. Effective with the release of July
2001 data, official labor force estimates
from the CPS and the LAUS program reflect
the expansion of the monthly CPS sample.
The overall sample size increased to 72,000
assigned HUs, with about 60,000 eligible
households. This expansion of the monthly
CPS sample was one part of the Census
Bureau’s plan to meet the requirements of
the State Children’s Health Insurance Program
(SCHIP, more recently referred to as CHIP)
legislation. This legislation requires the Census
Bureau to improve state estimates of the number of children who live in low-income families
and lack health insurance. These estimates
are obtained from the Annual Demographic
Supplement to the CPS (now known as the
ASEC). In September 2000, the Census Bureau
began expanding the monthly CPS sample in
31 states and the District of Columbia. States
were identified for sample supplementation
based on the standard error of their March
estimate of low-income children without
health insurance. The regular CPS design was
unchanged, and the SCHIP sample was added
using its own design criteria. The additional
10,000 households were added to the sample over a 3-month period. BLS chose not to
include the additional households in the official labor force estimates, however, until it had
sufficient time to evaluate the estimates from
the 60,000 household sample.


•	

January 2003. The 2002 Census Bureau occupational and industrial classification systems,
which are derived from the 2000 SOC and
the 2002 NAICS, are introduced into the CPS.
The composition of detailed occupational and
industrial classifications in the new systems
was substantially changed from the previous
systems, as was the structure for aggregating
them into broad groups. This created breaks in
existing data series at all levels of aggregation.
Questions on race and ethnicity are modified to comply with new federal standards.
Beginning in January 2003, individuals are
asked whether they are of Hispanic ethnicity before being asked about their race.
Individuals are asked directly if they are
Spanish, Hispanic, or Latino. With respect
to race, the response category of Asian and
Pacific Islander was split into two categories:
(1) Asian and (2) Native Hawaiian or Other
Pacific Islander. The questions on race were
reworded to indicate that individuals could
select more than one race and to convey more
clearly that individuals should report their own
perception of their race. These changes had
little or no impact on the overall civilian noninstitutional population and civilian labor force
but did reduce the population and labor force
levels of Whites, Blacks or African Americans,
and Asians beginning in January 2003. There
was little or no impact on the unemployment rates of these groups. The changes did
not affect the size of the Hispanic or Latino
population and had no significant impact on
the size of their labor force, but did cause an
increase of about 0.5 percentage points in
their unemployment rate.
New population controls reflecting the results
of the 2000 Census substantially increase the
size of the civilian noninstitutional population
and the civilian labor force. As a result, data
from January 2000 through December 2002
were revised. In addition, the Census Bureau
introduced another large upward adjustment
to the population controls as part of its annual
update of population estimates for 2003. The
entire amount of this adjustment was added
to the labor force data in January 2003. The
unemployment rate and other ratios were not
substantially affected by either of these population control adjustments.


The CPS program begins using the X-12
ARIMA software for seasonal adjustment of
time series data with release of the data for
January 2003. Because of the other revisions
introduced with the January data, the annual
revision of 5 years of seasonally adjusted data
that typically occurs with the release of data
for December was delayed until the release of
data for January. As part of the annual revision process, the seasonal adjustment of CPS
series was reviewed to determine if additional
series could be adjusted and if the series
currently adjusted would pass a technical
review. As a result of this review, some series
that were seasonally adjusted in the past are
no longer adjusted. Most of these series were
related to I&O.
Improvements are introduced to both the
second-stage and composite weighting procedures. These changes adapted the weighting
procedures to the new race and ethnicity classification system and enhanced the stability
over time of national, state, and substate labor
force estimates for demographic groups.
•

January 2004. Population controls are
updated to reflect revised estimates of net
international migration for 2000 through
2003. The updated controls resulted in a
decrease of 560,000 in the estimated size of
the civilian noninstitutional population for
December 2003. The civilian labor force and
employment levels decreased by 437,000 and
409,000, respectively. The Hispanic or Latino
population and labor force estimates declined
by 583,000 and 446,000, respectively, and
Hispanic or Latino employment was lowered
by 421,000. The updated controls had little
or no effect on overall and subgroup unemployment rates and other measures of labor
market participation.
Beginning with the publication of December
2003 estimates in January 2004, the practice of concurrent seasonal adjustment is
adopted. Under this practice, the current
month’s seasonally adjusted estimate is computed using all relevant original data up to,
and including, those for the current month.
Revisions to estimates for previous months,
however, are postponed until the end of the
year. Previously, seasonal factors for the CPS
labor force data were projected twice a year.

With the introduction of concurrent seasonal
adjustment, BLS no longer published projected seasonal factors for CPS data.
In addition to introducing population controls
that reflected revised estimates of net international migration for 2000 through 2003 in
January 2004, the LAUS program introduces
a linear wedge adjustment to CPS statewide
estimates of the civilian noninstitutional population aged 16 and over, labor force, employment, unemployment, and unemployment
rate. This adjustment linked the 1990 decennial census-based CPS estimates, adjusted
for the undercount (see January 1994), to the
2000 decennial census-based CPS estimates.
This adjustment provided consistent estimates of statewide labor force characteristics
from the 1990s to the 2000s. It also provided
consistent CPS series for the LAUS program’s
econometric models used to produce the
official labor force estimates for states and
selected substate areas. These models use
CPS employment and unemployment estimates as dependent variables.
•

April 2004. For the redesign following the
2000 decennial census, new sample is phased
in from April 2004 to July 2005. There were
824 PSUs; the overall sample size decreased
slightly, to 71,000 assigned HUs, with about
60,000 households eligible for interview.

•

September 2005. Hurricane Katrina made
landfall on the Gulf Coast on August 29,
2005, after the August 2005 survey reference
period. The data produced for the September
reference period were the first from the CPS
to reflect any impacts of this unusually catastrophic storm. The Census Bureau attempted
to contact all sample households in the disaster areas except those areas under mandatory
evacuation at the time of the survey. Starting
in October, all areas were surveyed. In accordance with standard procedures, uninhabitable households, and those for which the
condition was unknown, were taken out of the
CPS sample universe. People in households
that were successfully interviewed were given
a higher weight to account for those missed.
Also, starting in October, BLS and the Census
Bureau added several questions to identify
people who were evacuated from their homes,
even temporarily, due to Hurricane Katrina.


Beginning in November 2005, state population controls used for CPS estimation were
adjusted to account for interstate moves by
evacuees. This had a negligible effect on estimates for the total United States.
•

November 2006. For the first time, the
November survey is conducted 1 week earlier
in the month than usual. Interviewing is done
during the calendar week that includes the
twelfth of the month, and the reference week
is the week containing the fifth of the month.
(This has often been done for December.)
Policy now allows moving November collection earlier in the month to (1) avoid conflicting with the Thanksgiving holiday and (2) allow adequate processing time before December data collection begins.
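
For reference, the standard reference-week rule is easy to compute. The sketch below is ours and assumes the calendar week runs Sunday through Saturday; it is an illustration, not an official algorithm.

    import datetime

    def reference_week(year, month, anchor_day=12):
        # Calendar week (assumed Sunday through Saturday) containing the
        # anchor day; the standard CPS rule uses the twelfth of the month.
        anchor = datetime.date(year, month, anchor_day)
        days_since_sunday = (anchor.weekday() + 1) % 7  # Monday=0 ... Sunday=6
        sunday = anchor - datetime.timedelta(days=days_since_sunday)
        return sunday, sunday + datetime.timedelta(days=6)

    print(reference_week(2006, 11))                # the standard rule
    print(reference_week(2006, 11, anchor_day=5))  # the earlier week used in 2006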

•

January 2007. The system running the data collection instrument changes from a DOS-based system to BLAISE©, a Windows-based system. BLAISE© increases interviewers’ awareness of their position in the survey flow during the interview and enables them to better navigate backward and forward.

•

January 2008. A new feature is added to
the questionnaire to comply with the new
Respondent Identification Policy. The policy
requires surveys to protect certain sensitive
information when using dependent interviewing techniques.

•

June 2008. Six questions are added to the
CPS to identify people with disabilities aged
15 and over. They are typically asked in new
and returning households (MIS 1 and 5), in
replacement households, and of new household members.

•

January 2009. CPS switches to the 2007
Census industry classification system (used
through 2013). The codes are based on the
2007 NAICS. Changes were relatively minor
and historical data were not revised.

•

January 2011. CPS switches to the 2010
Census occupational classification system.
This classification system is based on the 2010
SOC and replaced an earlier version based on
the 2000 SOC. The names of broad- and
intermediate-level occupational groups
remained the same, but some detailed occupations were reclassified between them. This affected comparability with earlier data, but
historical data were not revised.
The questionnaire is modified to allow
reported durations of unemployment of up to
5 years. The previous limit was 2 years. The
change was phased in two panels (MIS rotation groups) at a time from January 2011 to April 2011. The change increased estimates of the mean duration of unemployment but not the median.
The Census Bureau incorporates additional
safeguards for CPS public-use microdata files
to ensure that respondent identifying information is not disclosed; this includes perturbing
respondent ages. One result of the measures
taken to enhance data confidentiality is that
most estimates from public-use microdata
files will no longer exactly match the comparable estimates published by BLS. Although
certain topside labor force estimates will continue to match published data—such as the
overall levels of employed, unemployed, and
not in the labor force—estimates below the topside level may differ slightly from the published data (but will be well within the sampling variability associated with the CPS).
•

January 2012. New benchmark population
controls reflecting the results of the 2010
Census are used in the weighting process. The
civilian noninstitutional population aged 16
and over increased by 1,510,000, but those
not in the labor force were disproportionately
affected and increased by 1,252,000. There
were increases of 258,000 in the civilian labor
force, 216,000 in employment, and 42,000 in
unemployment. The labor force participation
rate and the employment-population ratio
were each reduced by 0.3 percentage points.
Historical data were not revised.

•

January 2013. The Census Bureau regional
office (RO) structure is revised after remaining
substantially unchanged for 50 years. Staff in
the 12 ROs began transitioning in phases to a
6-RO structure in January 2012, and the new
structure took effect in January 2013. CPS
operations began reporting results based on
the six ROs immediately.


•	

January 2014. CPS switches to the 2012
Census industry classification system. The
codes are based on the 2012 NAICS. The differences between the 2012 and 2007 industry
classification systems were minor and did not
involve reclassification of industries between
the broader industry sectors. Historical data
were not revised.

•	

April 2014. For the sample redesign following the 2010 decennial census, new sample
is phased in from April 2014 to July 2015.
There were 852 PSUs; the overall sample size
increased to 74,000 assigned HUs, with about
61,000 households eligible for interview. There
was no attempt to maximize PSU overlap with
the previous design.

•	

January 2015. BLS begins using X-13ARIMA-SEATS (Signal Extraction in ARIMA Time Series) software to seasonally adjust data.
Questions are added on professional certifications and state or industry licenses, and
whether they were required for a person’s current or most recent job. To limit lengthening
the survey and increasing respondent burden,
three little-used questions on graduate and
professional coursework were removed from
the CPS when the three certification and
licensing questions were added.

•	

May 2015. Changes to response categories
take effect for the relationship-to-reference-person question. The two response
categories “spouse (husband or wife)” and
“unmarried partner” were replaced with four
response categories: “opposite-sex spouse
(husband or wife),” “opposite-sex unmarried partner,” “same-sex spouse (husband or
wife),” and “same-sex unmarried partner.”

•	

December 2017. Due to the reduction in contact center workloads and more effective data collection modes, the centralized CATI contact center in Hagerstown, Maryland, is closed. The caseload previously handled
by the Hagerstown center is reassigned
to the remaining two contact centers in
Jeffersonville, Indiana, and Tucson, Arizona.
(Less than 10 percent of CPS interviews were
conducted from the centralized contact centers in 2017.)


FURTHER READING
Bregger, J., “The Current Population Survey: a
Historical Perspective and BLS’ Role,” Monthly
Labor Review, June 1984, pp. 8–14.
Dunn, M., S. E. Haugen, and J. Kang, “The Current
Population Survey – Tracking Unemployment
in the United States for over 75 years,” Monthly
Labor Review, January 2018, pp. 1–23.

Executive Office of the President, Office of
Management and Budget, Statistical Policy
Division, Federal Statistics: Coordination,
Standards, Guidelines: 1976, Washington, DC:
Government Printing Office.
Frankel, L. R., and J. Stevens Stock, “On the
Sample Survey of Unemployment,” Journal of the
American Statistical Association, March 1942.
U.S. Department of Commerce, U.S. Census
Bureau, “Sampling Procedures and Method of
Operation of the Monthly Report on the Labor
Force,” November 1942.



Unit 2
Current Population Survey Sample
Design and Estimation


Chapter 2-1: Sample Frame
INTRODUCTION
For the 2010-based sample design, the sampling frames and sampling methodology for the Current Population Survey (CPS) have undergone important changes. CPS staff now selects the CPS sample from two dynamic sampling frames, one for
housing units (HUs) and one for group quarters
(GQs). Both frames are based upon the Master
Address File (MAF). The MAF is a national inventory of addresses that is continually updated by
the U.S. Census Bureau to support its decennial
programs and demographic surveys. The MAF,
which is maintained by Geography Division (GEO),
is described in greater detail in later sections,
while the CPS sampling methodology is described
in Chapter 2-2, Sample Design.
The MAF replaces a variety of address sources
used in the past to construct sampling frames for
CPS. For the sample design based upon the 2000
and earlier censuses, CPS sample was selected
from a coordinated set of four sampling frames:
the unit frame, the area frame, the GQ frame, and
the new construction permit frame. The address
sources for these frames included the official
address list from the most recent census, block
listings, and addresses from building permits
(Table 2-1.1).
As a comprehensive source for all HU and GQ
addresses in the nation, the MAF eliminates the
need for costly field visits to conduct area block
listings or to collect building permit information for new addresses. Instead, new growth is captured through semiannual updates to the
MAF from a variety of address sources, the most
important of which is the Delivery Sequence File
(DSF). The DSF is the master file of mail delivery
points maintained and regularly updated by the
U.S. Postal Service (USPS). The DSF and other
MAF update sources are discussed in more detail
in following sections.
The American Community Survey (ACS), the largest demographic survey conducted by the Census
Bureau, has been using the MAF as the sole
source for its HU sampling frame since its earliest
testing phases in the late 1990s. ACS paired its
adoption of a MAF-based sampling frame with an
annual sampling methodology that was designed
to take full advantage of the dynamic qualities
of the MAF. Likewise, during its 2010 Sample
Redesign, the Demographic Statistical Methods
Division (DSMD) paired its proposal to switch to
MAF-based sampling frames with a recommendation to convert to annual sampling for CPS and the
other current household surveys. Annual sampling,
in which only a year’s worth of sample is selected
at a time, replaced the previous model in which
enough sample to last a decade or more was
selected in a one-time operation at the beginning
of each sample design. Going forward, annual
sampling from a continuously updated MAF-based
sampling frame allows CPS the flexibility to adjust
sample sizes, reorder the HU universe (including
new growth) using updated sort variables, or even
change the universe sort variables altogether.
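
As a schematic illustration of what annual sampling from a re-sorted universe involves (entirely ours; the actual CPS selection procedure is described in Chapter 2-2, Sample Design), a year's sample might be drawn systematically from the updated, sorted frame:

    import random

    def systematic_sample(frame, n, sort_keys):
        # Re-sort the updated universe (including new growth) by the chosen
        # sort variables, then take every k-th unit from a random start.
        ordered = sorted(frame, key=lambda hu: [hu[k] for k in sort_keys])
        k = len(ordered) / n                 # sampling interval
        start = random.uniform(0, k)
        return [ordered[int(start + i * k)] for i in range(n)]

    frame = [{'id': i, 'tract': i % 50, 'block': i % 7} for i in range(10000)]
    year_sample = systematic_sample(frame, n=200, sort_keys=['tract', 'block'])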

Table 2-1.1.
Address Sources for the Current Population Survey Sampling Frames for the 2000 and 2010 Sample Designs

Sample design | Sampling frame | Address source                      | Old construction HUs | New growth HUs | Group quarters
2000          | Housing units  | 2000 Census                         | √                    |                |
2000          | Area           | Block listings                      | √                    | √              | √
2000          | Group quarters | 2000 Census and area block listings |                      |                | √
2000          | Permit         | Building permits                    |                      | √              |
2010          | Housing units  | Master Address File                 | √                    | √              |
2010          | Group quarters | Master Address File                 |                      |                | √

THE MASTER ADDRESS FILE
A Short History of the MAF
In preparation for the 2000 Census, the Census
Bureau created a preliminary version of the MAF
that would be used as the comprehensive address
source for decennial operations. The core of this
early MAF consisted of addresses collected during
the 1990 Census operations (the 1990 Address
Control File) merged with DSFs provided by the
USPS. This proto-MAF was then supplemented
and updated by a series of decennial operations.
These included the Local Update of Census
Addresses (LUCA), a partnership program by
which local governments could provide their own
address lists, and field listing operations like 2000
Address Canvassing. At the conclusion of the
2000 Census, the MAF was considered a virtually
complete inventory of all known HU addresses in
the nation.1

1 The MAF did not include GQ addresses at this point in time. GQs were maintained in a separate file called the Special Place/GQ Inventory. GQs would not be merged into the MAF until the 2010 Census.

While previous census address lists were not maintained after the census was complete, the vision
for the MAF after 2000 was very different—the
MAF would be continually updated with the DSF
and other sources so it could serve as the official
inventory of HUs and GQs for all future censuses.
The MAF would also support the demographic
surveys and other statistical programs conducted
by the Census Bureau.
Consistent with this vision, the MAF has evolved
into a critical corporate resource for the Census
Bureau:
•	

The MAF has been the sole source of
addresses for the HU sampling frame for ACS
since its implementation as a national survey
in 2005.

•	

The MAF was the major address source for the
2010 Census and will serve the same role for
future censuses.

•	

The MAF is a critical input to the Population
Estimates Program.

•	

The MAF is the sole source of addresses for
the HU and GQ sampling frames for CPS and
other demographic household surveys, including the American Housing Survey (AHS), the
National Crime Victimization Survey (NCVS),
the Consumer Expenditure (CE) Surveys,
and the Survey of Income and Program
Participation (SIPP).
To further enhance its value, the MAF was integrated in 2007 with the Census Bureau’s geospatial database, the Topologically Integrated
Geographic Encoding and Referencing (TIGER)
System. The TIGER database contains digital
representations of all map features and related
attributes required by the census. An important
function of TIGER is to assign geocodes (i.e.,
state, county, tract, and block) to the addresses
on the MAF. For a short history of TIGER, see
Thompson (2014).
The integrated database created by merging
the MAF with TIGER is called the MAF/TIGER
Database. While the MAF is actually the address
portion of the MAF/TIGER Database, it is the better-known acronym, and we will use it throughout this chapter to refer to the database maintained by GEO.

The Content of the MAF
The major purpose of the MAF is to store address
and geographic information about the HUs and
GQs (as well as some nonresidential addresses)
in the United States.2 Accurate, complete location address information is critical to CPS and the
other demographic surveys that conduct personal
visit interviews. Most MAF records have complete
city-style location addresses, which consist of
a house number, street name, and ZIP code; an
example of a complete city-style address is “3227
Holt Lane, Anytown, PA, 29999.”3 The importance
of a city-style location address is that it is usually
sufficient in itself to locate a sample case in the
field.

2 The MAF includes addresses for Puerto Rico, which are out of scope for CPS but in scope for ACS.

3 The MAF does not contain locality or post office names (“Anytown” in this example) or the two-letter postal abbreviation for the state. The ZIP code must be matched to a separate file to pick up the post office name.

Incomplete location addresses, conversely, are
missing house number, street name, or both.
For the 2010 Census, the percentage of location addresses that were incomplete dropped to
1.9 percent. For incomplete addresses, the MAF
usually has other information, including a location
description (for example, “SECOND HOUSE ON
LEFT DOWN UNNAMED DIRT ROAD OFF HOLT
LANE”) or latitude and longitude coordinates
that can help a field representative (FR) locate
the HU in the field. One of the imperatives of the
2010 Address Canvassing operation was to collect
Geographic Positioning System (GPS) coordinates
for as many listed HUs as possible. As a result,
only 8.2 percent of the incomplete city-style
addresses in the 2010 Census were also missing
coordinates.
Every MAF record contains a MAFID, which is a
unique identifier for that record across the entire
MAF. Other fields that are always filled are the
HU/GQ indicator, unit status (one purpose of
which is to flag known duplicates), and residential status. The MAF also contains action codes
(“add,” “delete,” “verify,” etc.) and dates from
each source that has provided information for the
MAF record.4

4 The action codes and dates for MAF sources are actually provided in a separate product from the MAF extracts. These files, the MAF Operations (MAFOP) data sets, are delivered in conjunction with the MAF extracts, and data from the two files are merged for each county during MAF processing.

A MAF record may, but does not always, include
entries for these fields: mailing address, location
description, latitude and longitude coordinates,
census tract, and census block. MAF records may
be linked as duplicates through the Surviving
MAFID. A record with a nonblank Surviving MAFID
is a “retired” record. The Surviving MAFID denotes
a different record that replaces or “survives” the
original MAFID.
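
In effect, the Surviving MAFID acts as a forwarding pointer. The sketch below (ours, with simplified field names) shows how a retired record can be resolved to its active successor, following the chain if the survivor was itself later retired.

    # Illustrative records keyed by MAFID; a nonblank 'surviving' entry marks
    # a retired record and names the record that replaces it.
    records = {
        101: {'surviving': 205},   # retired, replaced by 205
        205: {'surviving': 310},   # itself later retired, replaced by 310
        310: {'surviving': None},  # active record
    }

    def resolve(mafid):
        # Follow Surviving MAFID links until an active record is reached.
        seen = set()
        while records[mafid]['surviving'] is not None:
            if mafid in seen:
                raise ValueError('cycle in Surviving MAFID links')
            seen.add(mafid)
            mafid = records[mafid]['surviving']
        return mafid

    print(resolve(101))  # 310
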
The MAF contains several variables relating to the
DSF, including the historical series of DSF flags.
Each DSF flag indicates whether the MAFID was
on that version of the DSF and whether it was residential or nonresidential. A more detailed discussion of the DSF later in this section provides more
information about the DSF content on the MAF.
For GQ records, the MAF contains GQ names, GQ
type, location address information, contact information, and information about the size of the GQ.
The geographic information on the MAF is mainly
restricted to tract and block codes and latitude
and longitude coordinates, but more information
can be obtained by matching by block to other
geographic files maintained by GEO. This leads
to an important concept for MAF-based sampling
frames, the ungeocoded HU. MAF records that
have been assigned census tract and block codes
are said to be geocoded, while records without
tract and block codes are ungeocoded.
All MAF HUs that were included on the final census address list for 2010 were geocoded; it was a
census requirement that all valid living quarters
be assigned block codes for tabulation purposes.
Also, MAF HUs that originate from a block listing
operation like the Community Address Updating
System (CAUS) operation will have geocodes.
Ungeocoded HUs result from adding addresses to
the MAF from a source that does not provide its
own block codes. The primary MAF source without block codes, and therefore for ungeocoded
MAF addresses, is the DSF.
Block codes are important to the CPS HU Frame for several reasons (a brief illustrative sketch follows the list):
•	

Block information can be important in locating
an address in the field for personal interview.

•	

The block code is the “gateway” to other
census geography; if you know the block for
an address, then you can determine its place,
county subdivision, urban or rural status,
urban area, principal city status, etc. None of
this information is available for ungeocoded
HUs (unless imputed).

•	

Block summary data from the census or ACS
is linked to geocoded HUs in the CPS HU
Frame for efficient sorting of the universe for
sampling.
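
The sketch promised above (ours; the lookup values are invented) illustrates the “gateway” role: the standard 15-digit census block identifier concatenates state (2 digits), county (3), tract (6), and block (4), so a geocoded housing unit can be joined to any block-level file.

    def parse_block_geoid(geoid):
        # Standard 15-digit block identifier layout:
        # state (2) + county (3) + tract (6) + block (4).
        return {'state': geoid[0:2], 'county': geoid[2:5],
                'tract': geoid[5:11], 'block': geoid[11:15]}

    # Invented block-level summary data keyed by block code.
    block_info = {'421010001011000': {'urban': True}}

    hu = {'mafid': 12345, 'geoid': '421010001011000'}
    print(parse_block_geoid(hu['geoid']))
    print(block_info[hu['geoid']]['urban'])  # available only for geocoded HUs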

Keeping the MAF Up-to-Date
The Census Bureau keeps the MAF as current
as possible by continuously updating it using a
diverse array of address sources, imagery, and
field interview and listing operations. While geographic updates to the TIGER are very important,
it is the address updates to the MAF portion that
are most critical to the mission of maintaining
address-based sampling frames for CPS and other
surveys. Demographic surveys depend upon frequent and accurate address updates to the MAF
to maintain survey coverage and quality.


The MAF updates take two forms:
•	

Augmenting and improving the information for existing MAF addresses. Examples
include (1) adding or correcting location
address components like street names, unit
designations, and ZIP codes; (2) establishing
a link between two MAF records because new
information shows they actually represent the
same HU; and (3) adding a block code to a
previously ungeocoded address.

•	

Adding new HU or GQ addresses to the MAF
from other address sources or field operations. Sources of new addresses include the
DSF, local partnership files, ACS interviews,
field listing operations, and decennial field
activities.

The Delivery Sequence File
The DSF is a national inventory of mail delivery points maintained by the USPS. Since the
inception of the MAF, the DSF has been the major source of new growth addresses on the MAF.
As discussed previously, the DSF was first used
to update the MAF in the late 1990s in preparation for the 2000 Census. Since 2000, the Census
Bureau has “refreshed” the MAF with new versions
of the DSF twice a year. By 2009, after 9 years of
these DSF updates, 15.5 percent of the eligible
HUs in the ACS HU frame were DSF addresses
added to the MAF since the 2000 Census. This figure far surpassed all other sources of new growth
addresses between the 2000 and 2010 censuses.
The semiannual DSF refreshes involve a complex
address matching operation. The USPS does not
specifically identify the new records on the DSF
since the last time the file was delivered, so the
Census Bureau determines which records represent new addresses not already on the MAF by
matching the entire DSF to the MAF by address.5
DSF addresses that match existing MAF addresses
are used to update those records, while new MAF
records are created for DSF addresses not found
on the MAF. For further detail on the DSF update
process, see U.S. Census Bureau (2017).

5 Other USPS products, like the Locatable Address Conversion System (LACS) file and the ZIP+4 house number range files, are also used in this process. The LACS file will be discussed in more detail later in this section.
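
In outline, the refresh is an update-or-add match on standardized address keys. The sketch below is ours; it omits the LACS and ZIP+4 products noted in the footnote and the far more elaborate address standardization used in production.

    def normalize(addr):
        # Crude address key: uppercase and collapse whitespace.
        return ' '.join(addr.upper().split())

    def refresh(maf, dsf_addresses):
        # maf: {address key: record}. Matched DSF addresses update existing
        # records; unmatched addresses become new MAF records.
        for addr in dsf_addresses:
            key = normalize(addr)
            if key in maf:
                maf[key]['on_latest_dsf'] = True                  # update
            else:
                maf[key] = {'address': key, 'source': 'DSF',
                            'on_latest_dsf': True}                # new growth
        return maf

    maf = {'12 ELM ST': {'address': '12 ELM ST', 'source': 'CENSUS'}}
    refresh(maf, ['12 Elm St', '98 Oak Ave'])
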
An important feature of the DSF refresh is that
non-city-style addresses from the DSF are
discarded and not used to update the MAF.
Examples of non-city-style addresses are “RR 4,
Box 16, Anytown, PA 29999” and “P.O. Box 11896,
Anytown, PA 29999.” These addresses are not
used in the DSF match to the MAF because of the
low match rates and the impermanent relationship
between such addresses and HUs on the ground.
To the extent that these non-city-style addresses
represent new growth in certain areas, MAF coverage is lost. Another limitation of the DSF is that
GQ addresses are not flagged as such, so DSF
address updates are classified by default as HUs.
The DSF is therefore not a source of new GQs on
the MAF, while inadvertently contributing some
GQ addresses that are misclassified as HUs and
included in the HU Frame.
As discussed above, most ungeocoded HUs on
the MAF originate from the DSF. This is because
the DSF contains no block codes or GPS coordinates. The primary means by which GEO assigns
geocodes to DSF additions, then, is by finding a
match to an address segment already in TIGER.
Because many new DSF addresses are located
on newly built streets or street segments not
yet included in TIGER, a large share of the new
addresses added to the MAF in each year cannot be assigned a geocode through TIGER. As
an example, 56.1 percent of the DSF addresses
added to the MAF in 2018 were ungeocoded.6

6 Computed using data from the MAF extracts delivered to DSMD in January 2018 and July 2018.

The DSF refresh of the MAF is a complicated
process, but there are corresponding challenges
for MAF users in determining how to use these
DSF updates. To consider just one example among
many, the USPS classifies each DSF address
as either an Include in Delivery Statistics (IDS)
address or an Exclude from Delivery Statistics
(EDS) address. The nominal definition of an
IDS address is that it represents a current mail
delivery point, while an EDS address does not.
Research has suggested that some portion of EDS
addresses represent planned new construction,
so (subject to other criteria) such addresses are
included in the CPS HU Frame (Loudermilk, 2010;
Ying, 2012). There are many other such choices
that must be made in determining which DSF
addresses should be included and which should
not. This process, referred to as MAF filtering, is a
critical component of frame construction for CPS
and all other surveys building a sampling frame
from the MAF.
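
Decisions of this kind reduce naturally to a predicate applied to every MAF record during frame construction. The criteria below are illustrative stand-ins (ours), not the actual CPS filter.

    def keep_for_hu_frame(rec):
        # Illustrative filter: keep residential, non-duplicate records that
        # are current IDS delivery points, or EDS records flagged as likely
        # planned construction. The real filter applies many more criteria.
        if not rec.get('residential') or rec.get('duplicate'):
            return False
        if rec.get('dsf_status') == 'IDS':
            return True
        return rec.get('dsf_status') == 'EDS' and rec.get('planned_construction', False)

    maf_records = [
        {'residential': True, 'duplicate': False, 'dsf_status': 'IDS'},
        {'residential': True, 'duplicate': False, 'dsf_status': 'EDS',
         'planned_construction': True},
        {'residential': False, 'dsf_status': 'IDS'},
    ]
    hu_frame = [r for r in maf_records if keep_for_hu_frame(r)]  # keeps first two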

Local Partnership Files

In 2013, the Census Bureau began receiving address and Geographic Information System (GIS) files through a partnership program involving local governments. This program is modeled upon the decennial partnership program, the LUCA program, and is intended to extend the benefits of such a program to the intercensal period. The files submitted by the local government partners ("local partnership files") can cover places, counties, county subdivisions, American Indian Areas, and even entire states. The files undergo a review for quality and, if acceptable, are used to update existing records and add new addresses to the MAF. The data sources for the partnership files can include building permits, tax assessment data, GIS databases, real estate records, and other types of administrative data. Since many local governments now have very complete and accurate GIS information, the partnership file updates can be an important source of geocodes for previously ungeocoded postcensus additions to the MAF. For more information on the partnership file program, see Trainor (2014). See U.S. Census Bureau (2015) for details on the partnership updates.

Field Listing Operations

Several post-2010 block-listing operations have provided updates to the MAF, including:

•	CAUS and MAF Coverage Study (MAFCS). CAUS was implemented in 2003 specifically to address ACS coverage concerns with the MAF. CAUS targeted predominantly rural blocks where DSF coverage of postcensus growth was most problematic due to a combination of suspected high growth and a prevalence of non-city-style addresses. Through block canvassing based on a dependent list from the MAF, CAUS supplemented MAF coverage by adding new addresses and changing or deleting existing MAF addresses in up to 1,500 blocks per year through 2016. Starting in 2016, CAUS blocks were combined with sample blocks from the MAF Coverage Study to create a national canvassing workload of 20,000 blocks. MAFCS was designed to produce yearly MAF coverage estimates for the United States and to provide continuous updates to the MAF for current surveys and the 2020 Census. For 2017, MAFCS selected 20,000 sample blocks; at that point, CAUS was essentially absorbed into MAFCS and no longer existed as a separate program. Though originally planned to continue annually through 2019, the MAFCS program was discontinued in April 2017 (U.S. Census Bureau, 2018).

•	Starting in 2012, the Coverage Improvement (CI) Frame began sending blocks out for listing under the DAAL banner. These listings were very limited in scope and were discontinued after 2 years when CPS and AHS, the only participants in the frame, decided to drop out. The results from these listings were used to update the MAF and would do so again should the program be revived. The CI Frame is discussed in more detail later.

•	Demographic Area Address Listings (DAAL).7 Block listings for the 2000-based area frame for the current household surveys continued to be conducted until 2014, when the DAAL program was phased out and replaced by the MAF as the sole source of sample for the 2010 sample design. The 2000-based block listings were conducted in blocks that were screened into the area frame for the current surveys; these were primarily in rural areas with high concentrations of non-city-style addresses or in areas with no building permit coverage. The listing results were used to update the MAF.

•	Census tests and dress rehearsals. Prior to the 2010 Census, block listings were conducted in a very limited set of counties to support various census tests and dress rehearsals. These listings updated the MAF, as will any future listings conducted in support of 2020 decennial efforts.

7 For operational reasons, the term DAAL came over time to refer to both the 2000-based area frame block listings for the current surveys and the CAUS listings. Here, we use DAAL in its original sense to refer to the block listings conducted for the current surveys—the 2000-based area frame listings and their successors, the 2010-based CI Frame listings.

ACS provides other MAF updates through its time-of-interview (TOI) updates and the ACS GQ program, which includes both a field listing component and headquarters research. ACS-TOI

provides only updates to existing MAF records
that are in the ACS sample, such as adding block
codes or modifying address information. The ACS
GQ program is one of the few sources of GQ updates to the MAF in the years between censuses; these updates include both new GQ records
and HU-to-GQ conversions. The partnership files
provide a very limited number of GQ updates, as
did the DAAL program before it was phased out
for the current surveys.
Another USPS product, the LACS file, is used in
conjunction with the DSF to help mitigate duplication on the MAF. The LACS file is a national
dataset of address conversions, often from E-911
readdressing operations in rural areas.8 If both the
“old” and “new” address on a LACS record are
found in the MAF, they are linked; the old address
is the retired record and the new address is the
surviving record. This allows surveys that use the
MAF to avoid including both records in their sampling frames, thereby avoiding duplication.
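A minimal sketch of this linking step, with an invented record layout (the field names and the example addresses here are hypothetical, not the actual MAF schema):

import_note = "illustrative only"

def apply_lacs(maf: dict, lacs_pairs: list) -> None:
    # Retire the old address and keep the new one when a LACS old/new
    # pair is matched to the MAF (field names are hypothetical).
    for old_addr, new_addr in lacs_pairs:
        if old_addr in maf and new_addr in maf:
            maf[old_addr]["status"] = "retired"    # retired record
            maf[old_addr]["survivor"] = new_addr   # link to surviving record
            maf[new_addr]["status"] = "surviving"  # stays frame-eligible

# Example: both addresses exist on the MAF, so they are linked and only
# the new (E-911 style) address remains eligible for sampling frames.
maf = {"RR 4 BOX 16": {}, "1550 FOGERTY LN": {}}
apply_lacs(maf, [("RR 4 BOX 16", "1550 FOGERTY LN")])
print(maf["RR 4 BOX 16"]["survivor"])  # 1550 FOGERTY LN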

CREATING AND UPDATING THE
CURRENT POPULATION SURVEY
SAMPLING FRAMES
While both CPS sampling frames for the 2010
sample design are based upon the MAF, they are
created in different ways and on different cycles.
The HU Frame was created for the first time in
2013 and is updated every 6 months with the
latest MAF data. The GQ Frame, conversely, is
created every 3 years. The frame construction and
updating processes will be described separately
for each frame.

Current Population Survey Housing Unit
Frame
GEO delivers MAF extracts to DSMD twice each
year, in January and July.9 A MAF extract is a
“snapshot” of the MAF for a given county that reflects 6 months of DSF and other updates. The MAF extracts do not contain all information from the MAF. The MAF often contains, for example, multiple location addresses and multiple mailing addresses for a given HU; the MAF extract contains only one instance of each address type (the “preferred address”).

8 E-911 readdressing is the process whereby local or state governments mandate new city-style addresses in areas (typically rural) where non-city-style addresses are common. The purpose of the new addresses, which are usually assigned in a systematic manner (for example, 1550 Fogerty Lane represents the house that is 1,550 feet from the start of Fogerty Lane), is to provide better location information to allow faster response by emergency vehicles.

9 The July MAF delivery also includes two sets of Geographic Reference Files (GRFs), one set with block-level records with corresponding geographic attributes (county subdivision, place, urban or rural, etc.) and the other set with the names for all the geographic entities. The GRFs contain geographic codes that are not included on the MAF extracts, so they are needed in MAF processing to attach these geographic codes to the Edited MAF Extract files.
The delivery of the MAF extracts from GEO to
Demographic Systems Division (DSD) kicks off
the semiannual HU Frame update cycles.10 Details
on the production of MAF extracts can be found
in U.S. Census Bureau (2016). The MAF extracts
are reviewed by DSMD for quality, with a focus on
the most recent MAF updates. If errors are discovered, DSMD may request a fix and redelivery
of the extracts. Once the MAF extracts have all
been delivered by GEO and formally accepted by
DSMD, the files are submitted to the MAF processing systems within DSD. These systems perform
various quality edits, assign codes, and apply the
MAF filtering rules to create a set of Edited MAF
Extract files.
The MAF filtering is a critical feature of the frame
creation process; its outcomes can have an
important effect on frame coverage. The MAF
extracts contain all records from the MAF for a
given county, including many that should not
be eligible for the CPS HU Frame. The filtering
rules designate each MAF record as either “valid”
(passed the filter and eligible for the HU frame) or
“invalid” (failed the filter, ineligible for the frame).
While some filtering decisions are easy (for example, any record denoted as “nonresidential” or
as a “duplicate” is invalid), others are much less
obvious. Should ungeocoded additions from the
DSF be valid? How about addresses deleted by
the 2010 Census that continue to show up on the
DSF?
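A minimal sketch of the valid/invalid designation just described. The record fields and the treatment of ungeocoded DSF additions below are illustrative choices, not the actual ACS filter rules:

def maf_filter(record: dict, keep_ungeocoded_dsf: bool = True) -> str:
    """Designate a MAF record 'valid' (HU Frame eligible) or 'invalid'."""
    if record.get("nonresidential") or record.get("duplicate"):
        return "invalid"                    # the easy decisions
    if record.get("census_delete") and record.get("on_dsf"):
        return "invalid"                    # deleted by the 2010 Census
    if record.get("source") == "DSF" and not record.get("geocoded"):
        # Ungeocoded DSF additions: a genuine judgment call in practice.
        return "valid" if keep_ungeocoded_dsf else "invalid"
    return "valid"

print(maf_filter({"source": "DSF", "geocoded": False}))   # valid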
As a result of research conducted by DSMD as
part of the 2010 Sample Redesign, DSMD decided
to adopt the ACS filtering rules for CPS and
the other current household surveys when they
switched to a MAF-based HU frame. DSMD plans
to continually assess the filter rules and work with
ACS to identify possible enhancements. Most
filtering questions revolve around this question: “When does a new DSF address represent a new growth HU rather than an existing HU already on the MAF from the census or another source?” Failing to provide the correct answer for any subclass of DSF addresses can create either overcoverage (a new DSF address is included in the frame, but the HU is already on the MAF) or undercoverage (a new DSF address is excluded, but it represents a new growth HU not otherwise represented on the MAF).

10 DSD is responsible for the programming systems and production files for all the current surveys, while DSMD is responsible for sample design, survey quality, and operational systems. DSMD and DSD work very closely together on frame construction issues.
The CPS HU Frame actually takes the form of
separate HU universes by county, just as the
MAF extracts are separate by county. The HU
Frame files are called the Unit Frame Universe
Files (UFUFs). The original UFUFs for CPS and
the other current surveys were created in 2013
and consisted of all the valid and invalid HUs
from the MAF at that time. If the invalid HUs
are MAF records that we consider ineligible for
the HU Frame, though, why are they included
on the UFUFs? A decision was made to include
all MAFIDs on the UFUFs and make the valid or
invalid status the first sort key for sampling. All
MAFIDs would thereby be given a chance of selection, but any invalid HUs selected for sample will
be suppressed from interview unless the MAFID
changes to valid status by the time of the first
interview.
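A sketch of this sort-and-suppress convention (the record layout and field names are hypothetical, not the production UFUF format):

from dataclasses import dataclass

@dataclass
class UfufRecord:
    mafid: int
    valid: bool          # current filter status (eligible for the HU Frame)
    sort_key: tuple      # survey-specific within-frame sort keys

def frame_order(records):
    # Valid/invalid status is the first sort key, so every MAFID keeps a
    # chance of selection while valid units sort to the front.
    return sorted(records, key=lambda r: (not r.valid, r.sort_key))

def interviewable(record: UfufRecord) -> bool:
    # Re-check status at interview time: an invalid selection is suppressed
    # unless the MAFID has changed to valid by the first interview.
    return record.valid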
Starting with those initial 2013 universe files, the
UFUFs are updated every 6 months with MAF data
in two ways:
•	Each existing UFUF record is updated with the most recent MAF data (addresses, block codes, etc.) and its latest filtering status.

•	New growth records are added to the UFUF.

The UFUFs are also updated with sort information from ACS and decennial block-level data as part of the annual sampling process, which takes place during the January MAF processing cycle. Each survey participating in annual sampling can sort the frame units in its own way.
The CPS sort keys for the UFUFs are discussed in
Chapter 2-2, Sample Design.
If the UFUFs that are updated from the January
MAF extracts each year are used for annual sampling, what is the purpose of the July updates? In
addition to refreshing the MAF data for sample
cases that have not yet gone out to interview,

the new growth cases added to the UFUF in the
July phase can “activate” CPS sample that was
selected in a prior annual sampling phase, thereby
increasing the CPS sample size. In fact, every MAF
update, whether January or July, has this ability
to activate sample cases by adding new growth to
the UFUF.11
To understand how this new growth works, consider that each UFUF at the time of annual sampling consists of two portions: the actual universe, which consists of actual MAFIDs, and the “skeleton” universe, made up of units not yet linked with MAFIDs. The skeleton portion of the UFUF is
essentially an empty framework that is to be filled
in with new growth over time. During annual sampling, CPS selects its sample across both portions
of the UFUF. Many lines in the skeleton universe
are thereby selected for sample, but will not be
activated unless associated with MAFIDs at some
later point. This feature of the HU Frame allows
for constant augmentation of the CPS sample with
new growth up until the final scheduled interview
for a given sample designation.
Consider this example:
In the 2013 annual sampling phase, CPS
selected sample cases that were all designated A03. Many of these cases were
assigned to UFUF units associated with
MAFIDs, while others were assigned to empty
records in the skeleton portion of the universe. As the skeleton universe was filled in
with actual MAF data by subsequent MAF
updates (July 2013, January 2014, July 2014,
etc.), additional A03 cases were activated and
added to the CPS sample. The first interviews
for A03 cases began in August 2014, but
some A03 cases started interviews as late as
March 2016. Therefore, the A03 sample that
was originally selected from the HU Frame in
2013 could have been augmented with new
cases as late as July 2015.

11 Note that while every MAF processing cycle can add new
growth to the UFUFs, not all do. An example is the January 2015
cycle, when problems with the MAF delivery could not be fixed
in time, so MAF extracts from the previous delivery (July 2014)
were substituted. No new growth was added to the UFUFs for
that cycle. In July 2015, though, the UFUFs “caught up” when
updated with MAF extracts that by then contained an entire year
of new growth.
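A sketch of skeleton-universe activation (class and function names are hypothetical): a sampled skeleton line becomes an active case only when a MAF update links it to a new-growth MAFID.

class FrameLine:
    def __init__(self, designation, selected=False):
        self.designation = designation   # e.g., "A03"
        self.selected = selected         # hit during annual sampling
        self.mafid = None                # empty until new growth is linked

def maf_update(line, new_mafid):
    """A January or July MAF update fills a skeleton line with new growth."""
    line.mafid = new_mafid

def active_cases(lines):
    # Only selected lines that now carry a MAFID go out for interview.
    return [ln for ln in lines if ln.selected and ln.mafid is not None]

line = FrameLine("A03", selected=True)
print(active_cases([line]))       # [] -- selected but not yet activated
maf_update(line, 123456789)
print(len(active_cases([line])))  # 1 -- activated by new growth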


Group Quarters Frame
The GQ Frame for CPS and the other current surveys is created from MAF extracts only once every
3 years. CPS and the other current surveys participating in the frame each select a 3-year sample
of GQs. The first GQ Frame was created from the
July 2012 MAF extracts; the second GQ Frame
was created from the January 2016 MAF extracts.
The GQs on the frame all come from the MAF, and
nearly all GQs on the MAF were collected during
the 2010 Census through operations like 2010
Address Canvassing and GQ Validation. Unlike the
DSF on the HU side, there is not a dependable
source of new GQ addresses for the MAF. ACS
GQ operations add some new GQs and HU-to-GQ
conversions, the local partnership files can contribute some GQs, and block listing operations
like CAUS capture a relatively small number of
GQ addresses. For the most part, though, the GQ
Frame remains relatively static from one version
to the next. There is some possibility that future
GQ Frames may include new college housing GQs
collected through an independent study by DSMD.
Note that, while the HU Frame contains information down to the HU level, including individual
units within apartment complexes, the GQ Frame
does not include “unit” information for each GQ.
Instead, the MAF provides an expected GQ size;
any GQ selected for sample must then be sent to
the field for a listing of the “units” (which can be
rooms, beds, or people) at the GQ before individual sample units can be identified and interviewed.
The GQ Frame consists only of noninstitutional GQs and excludes institutional GQs.
Noninstitutional GQs are facilities for people who are not under formally authorized and supervised care and custody, such as college housing, group homes for adults, workers’ living quarters, and
convents. Institutional GQs include facilities such
as prisons, skilled nursing facilities, and residential
treatment centers for juveniles.

Coverage Improvement Frame
In the period immediately following the 2010
Census, the MAF had a very high level of coverage due to 2010 Address Canvassing and other


decennial operations that systematically captured address information. That coverage may
start to degrade over time in areas where there
is no reliable source of new growth (postcensus)
addresses. The DSF, as the major source of HU
updates to the MAF between censuses, is the primary driver of new growth coverage on the MAF.
In most urban and suburban areas, the DSF should
be a thorough source of new growth. In more
rural areas with more non-city-style addresses
or lack of home mail delivery, though, the lack of
DSF coverage may lead to MAF undercoverage
concerns.12
Research conducted as part of the 2010 Sample Redesign (Liu, 2009) suggested a risk of future bias in the CPS state-level labor force participation estimates due to potential MAF undercoverage. DSMD proposed a coverage improvement
operation in the 13 states identified as at-risk in
the study: Alabama, Alaska, Arkansas, Kentucky,
Maine, Mississippi, Montana, New Hampshire,
New Mexico, Oklahoma, Vermont, West Virginia,
and Wyoming. Within these 13 states, a universe
of blocks with suspect DSF coverage would be
identified each year. The blocks would be sampled
and the selected blocks would be sent to the field
to be canvassed. Because the HU Frame already
provides coverage of any HUs in these blocks that
were on the MAF, the CI Frame was concerned
only with the HUs that were added by the field
listers. Any added HUs within the blocks selected
by a survey would automatically be in sample for
the survey; in effect, these CI Frame additions
would supplement and be indistinguishable from
the HU Frame sample cases for the survey.
CPS was a participant in the CI Frame from its
first sampling and listing phases in 2012, with AHS
joining a year later. By late 2014, though, CPS
and AHS decided to end their participation in the
frame due to cost. Therefore, the CI Frame has
been suspended and will not be reinstated unless
CPS or other surveys decide that MAF supplementation is needed.

12 As stated earlier, non-city-style addresses from the DSF are not used to refresh the MAF, so the MAF may be deficient in coverage in areas with such addresses (even if covered by the DSF).

REFERENCES

Liu, X., “A Supplemental Report of the Frame Assessment for Current Household Surveys: Effects of MAF-Based Frame Coverage on Survey Estimates (Doc. #2010-4.0-G-7),” U.S. Census Bureau, February 20, 2009.

Loudermilk, C., and J. Martin, “Frame Assessment for Current Household Surveys (FACHS) Filter Rules Research: 2009 Analytical Report,” U.S. Census Bureau, May 24, 2010.

Thompson, J. H., “Happy 25th Anniversary, TIGER,” U.S. Census Bureau, November 20, 2014.

Trainor, T. F., “Geography Division Address Canvassing Recommendation,” U.S. Census Bureau, November 15, 2014.

U.S. Census Bureau, Decennial Census Management Division, “Geographic Support System Initiative Master Address File Updates using Partnership Data Software Requirements Specification,” U.S. Census Bureau, March 31, 2015.

U.S. Census Bureau, Decennial Census Management Division, “2020 Census Detailed Operational Plan for 6. Geographic Programs Operation (GEOP) – 6-3. Geographic Data Processing Component (GEOP/GDP),” U.S. Census Bureau, September 13, 2016.

U.S. Census Bureau, Geography Division, “Delivery Sequence File (DSF) Refresh Software Requirements Specification,” U.S. Census Bureau, July 3, 2017.

U.S. Census Bureau, Decennial Census Management Division, “2020 Census Detailed Operational Plan for 8. Address Canvassing Operation,” U.S. Census Bureau, May 9, 2018.

Ying, S., “Identifying Excluded from Delivery Statistics Records that Elude the American Community Survey Housing Unit Frame Filters,” U.S. Census Bureau, February 27, 2012.


Chapter 2-2: Sample Design
INTRODUCTION
For more than seven decades, the CPS has been
one of the major sources of up-to-date information on the labor force and demographic characteristics of the U.S. population. Because of the
CPS’s importance and high profile, the reliability
of the estimates is evaluated periodically. The
design has often been under close scrutiny in
response to demand for new data and the need to
improve the reliability of the estimates by applying
research findings and new types of information
(especially decennial census results). All changes
are implemented with concern for minimizing cost
and maximizing comparability of estimates across
time. The methods used to select the sample
households for the survey are reevaluated after
each decennial census. Based on these reevaluations, the design of the survey is modified and
systems are put in place to provide sample for the
following decade. The most recent decennial revision incorporated new information from the 2010
Census and was fully implemented as of July 2015.
This chapter describes the CPS sample design as
of July 2015. It is directed to a general audience
and presents many topics with varying degrees
of detail. The following section provides a broad
overview of the CPS design.

SURVEY REQUIREMENTS AND DESIGN
Survey Requirements
The following bulleted items briefly describe the
major characteristics of the CPS sample as of July
2015:
•	The CPS sample is a probability sample.

•	The sample is designed primarily to produce national and state estimates of labor force characteristics of the civilian noninstitutional population aged 16 and older (CNP16+).

•	The CPS sample consists of independent samples from each state and the District of Columbia. Each state sample is specifically tailored to the demographic and labor market conditions that prevail in that particular state. California and New York State are further divided into two substate areas that also have independent designs: Los Angeles County and the rest of California, and New York City and the rest of New York State.13 Since the CPS design consists of independent samples for the states and substate areas, it is said to be state-based.

•	Sample sizes are determined by reliability requirements that are expressed in terms of the coefficient of variation (CV). The CV is a relative measure of the sampling error, calculated as the sampling error divided by the expected value of the given characteristic. The specified CV requirement for the monthly unemployment level for the nation, given a 6.0 percent unemployment rate, is 1.9 percent. The 1.9 percent CV is based on the requirement that a difference of 0.2 percentage points in the unemployment rate between 2 consecutive months be statistically significant at the 0.10 level (see the sketch following this list).

•	The required CV on the annual average unemployment level for each state, substate area, and the District of Columbia, given a 6.0 percent unemployment rate, is 8.0 percent.

Overview of Survey Design
The CPS sample is a multistage stratified sample
of approximately 72,000 assigned HUs from 852
sample areas. It is designed to measure demographic and labor force characteristics of the civilian noninstitutional population aged 16 and older.
Approximately 12,000 of these assigned HUs are
sampled under the Children’s Health Insurance
Program (CHIP) expansion that has been part of
the official CPS sample since July 2001. CPS samples HUs from the MAF HU and GQ sections that
include all the official 2010 Census addresses and
postcensus additions from the USPS, local jurisdictions, and field listings. As of July 2015, sample
is drawn annually to allow newly constructed HUs
a chance of selection before the transition to a
new sample.
The first stage of sampling involves dividing the
United States into primary sampling units (PSUs)—
most of which comprise a metropolitan area, a
large county, or a group of smaller counties. Every PSU is nested within the boundary of a state.

13 New York City consists of Bronx, Kings, New York, Queens, and Richmond Counties.
The PSUs are then grouped into strata based on
independent information that is obtained from
the decennial census or other sources. The strata
are constructed so that they are as homogeneous
as possible with respect to labor force and other
social and economic characteristics that are highly
correlated with unemployment. One PSU is sampled in each stratum. The probability of selection
for each PSU in the stratum is proportional to its
population as of the 2010 Census.
A second stage of sampling is conducted annually; a sample of HUs within the sample PSUs is
drawn. Ultimate sampling units (USUs) are small
groups of HUs. The bulk of the USUs sampled
in the second stage consist of sets of addresses
that are systematically drawn from sorted lists
of blocks. HUs from blocks with similar demographic composition and geographic proximity are
grouped together in the list. In parts of the United
States where addresses are not recognizable on
the ground, USUs are identified using area sampling techniques.
The CPS sample is usually described as a two-stage sample because PSUs and groups of HUs are each selected. PSUs are selected from strata, and HUs are selected from these PSUs.
Each month, interviewers collect data from the
sample HUs. A HU is interviewed for 4 consecutive
months, dropped out of the sample for the next 8
months, and interviewed again in the following 4
months. In all, a sample HU is interviewed 8 times;
this is known as the 4-8-4 design.
Households are rotated in and out of the sample in a way that improves the accuracy of the month-to-month and year-to-year change estimates.
The rotation scheme ensures that in any single
month, approximately one-eighth of the HUs are
interviewed for the first time, another eighth are
interviewed for the second time, and so on. That
is, after the first month, six of the eight rotation
groups will have been in the survey for the previous month—there will always be a 75 percent
month-to-month overlap. Thus, four of the eight
rotation groups in any month will have been in the
survey for the same month, 1 year ago; there will
always be a 50 percent year-to-year overlap. This
rotation scheme upholds the scientific tenets of
probability sampling, and each month’s sample

produces an unbiased representation of the target
population. The rotation system makes it possible
to reduce sampling error by using a composite
estimation procedure (Chapter 2-3, Weighting
and Estimation) and, at slight additional cost, by
increasing the representation in the sample of
USUs with unusually large numbers of HUs.
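A small sketch (hypothetical helper functions, not CPS production code) that makes the 4-8-4 arithmetic concrete by verifying the 75 percent and 50 percent overlaps:

def months_in_sample(start: int) -> set:
    """Months in which a household entering in `start` is interviewed:
    4 months in sample, 8 months out, then 4 months back in."""
    return {start + k for k in range(4)} | {start + 12 + k for k in range(4)}

def in_sample(month: int) -> set:
    """Household cohorts (identified by entry month) in sample in `month`."""
    return {s for s in range(month - 15, month + 1)
            if month in months_in_sample(s)}

month = 40
this_month = in_sample(month)
print(len(this_month))                              # 8 rotation groups
print(len(this_month & in_sample(month + 1)) / 8)   # 0.75 month-to-month overlap
print(len(this_month & in_sample(month - 12)) / 8)  # 0.5 year-to-year overlap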
Each state’s sample design ensures that most HUs
within a state have the same overall probability
of selection. Because of the state-based nature
of the design, sample HUs in different states have
different overall probabilities of selection. The
system of state-based designs ensures that both
state and national reliability requirements are met.

FIRST STAGE OF THE SAMPLE DESIGN
The first stage of the CPS sample design is the
selection of counties. The purpose of selecting a
subset of counties instead of having all counties in
the sample is to minimize the cost of the survey.
This is done mainly by minimizing the number of
FRs needed to conduct the survey and reducing
the travel cost incurred in visiting the sample HUs.
Two objectives of first-stage sampling are: (1) to ensure that sample counties represent other counties with similar labor force characteristics that are not selected; and (2) to ensure that each FR is allotted a manageable workload in his or her sample area.
The first-stage sample selection is carried out in
three major steps:
•	Definition of the PSUs.

•	Stratification of the PSUs within each state.

•	Selection of the sample PSUs in each state.

These steps are implemented soon after the
decennial census.

Definition of the Primary Sampling Units
PSUs are delineated so that they encompass the
entire United States. The land area covered by
each PSU is made reasonably compact so an interviewer can traverse it without incurring unreasonable costs. The population is as heterogeneous
with regard to labor force characteristics as can
be made consistent with the other constraints.
Strata are constructed that are homogenous in
terms of labor force characteristics to minimize
between-PSU variance. Between-PSU variance

is a component of total variance that arises from
selecting a sample of PSUs rather than selecting
all PSUs. In each stratum, one PSU is selected to
represent the other PSUs in the same stratum.

Most PSUs are groups of contiguous counties rather than single counties. A group of counties is more likely than a single county to have diverse labor force characteristics. Limits are placed on the geographic size of a PSU to restrict the distance an FR must travel.

Rules for Defining Primary Sampling Units

•	Each PSU is contained within the boundary of a single state.

•	Metropolitan Statistical Areas (MSAs) are defined as separate PSUs using projected 2013 Core-Based Statistical Area (CBSA) definitions. CBSAs are defined as metropolitan or micropolitan areas and include at least one county. Micropolitan areas and areas outside of CBSAs are considered nonmetropolitan areas. If any metropolitan area crosses state boundaries, each state/metropolitan area intersection is a separate PSU.

•	For most states, PSUs are either one county or two or more contiguous counties. In some states, county equivalents are used: cities independent of any county organization in Maryland, Missouri, Nevada, and Virginia; parishes in Louisiana; and boroughs and census divisions in Alaska.

•	The area of the PSU should not exceed 3,000 square miles, except in cases where a single county exceeds the maximum area.

•	The population of the PSU is at least 7,500, except where this would require exceeding the maximum area of 3,000 square miles.

•	In addition to meeting the limitation on total area, PSUs are formed to limit extreme length in any direction and to avoid natural barriers within the PSU.

The PSU definitions are revised each time the CPS sample design is revised. Revised PSU definitions reflect changes in metropolitan area definitions and an attempt to have PSUs consistent with other Census Bureau demographic surveys.14 The following are steps for combining counties, county equivalents, and independent cities into PSUs for the 2010 design:

•	The 2010 PSUs are revised by incorporating new or redefined metropolitan areas into the PSU definitions.

•	Any single county is classified as a separate PSU if it exceeds the maximum area limitation deemed practical for FR travel (regardless of its 2010 population).

•	Other counties within the same state are examined to determine whether they might advantageously be combined with contiguous counties without violating the population and area limitations.

•	Contiguous counties with natural geographic barriers between them are placed in separate PSUs to reduce the cost of travel within PSUs.

These steps created 1,987 PSUs in the United States from which to draw the sample for the CPS when it was redesigned after the 2010 decennial census.

Stratification of Primary Sampling Units
The CPS sample design calls for combining PSUs
into strata within each state and selecting one
PSU from each stratum. For this type of sample
design, sampling theory and cost considerations
suggest forming strata with approximately equal
population sizes. When the design is self-weighting (i.e., uses the same sampling fraction in all
strata) and one FR is assigned to each sample
PSU, equal stratum sizes have the advantage of
providing equal FR workloads (before population
growth and migration significantly affect the PSU
population sizes).
Sampling theory and costs dictate that highly
populated PSUs should be selected for sample
with certainty. The rationale is that some PSUs
exceed or come close to the population size
needed for equalizing stratum sizes. These PSUs
are designated as self-representing (SR). Each SR PSU is treated as a separate stratum and is included in the sample.

14 Final metropolitan area definitions were not available from the Office of Management and Budget when PSUs were defined. Fringe counties having a good chance of being in final CBSA definitions are separate PSUs. Most projected CBSA definitions are the same as final CBSA definitions (Executive Office of the President, 2013).

The following describes the steps for stratifying PSUs for the 2010 redesign:

1.	CPS used several criteria to determine which PSUs would be SR. First, all counties that existed in the 150 most populous CBSAs were set as SR after determining that a natural break in population existed between CBSAs ranked 150 and 151. Then, a formula was used to determine which of the remaining PSUs would become SR: if the calculated field workload (the number of HUs selected) for a PSU is greater than or equal to 55, the PSU is classified as SR. The workload is defined as

$$\text{workload} = \text{PSU MOS} \times p(\text{selection}) = \frac{\text{PSU MOS}}{SI}$$

where

PSU MOS = total number of HUs in the PSU

p(selection) = probability of selection of each HU

SI = state sampling interval

2.	 The remaining PSUs were grouped into
non-self-representing (NSR) strata within
state boundaries. In each NSR stratum, one
PSU was selected to represent all of the PSUs
in the stratum. They are formed by adhering
to the following criteria:
a.	 Roughly equal-sized NSR strata are
formed within a state.
b.	 NSR strata are formed so as to yield reasonable FR workloads of roughly 35 to 55
HUs in an NSR PSU. The number of NSR
strata in a state is a function of the 2010
population, civilian labor force, state CV,
and between-PSU variance on the unemployment level. (Workloads in NSR PSUs
are constrained because one FR must
canvass the entire PSU. No such constraints are placed on SR PSUs.) In Alaska,
the strata are also a function of expected
interview cost.
c.	NSR strata are formed with PSUs homogeneous with respect to labor force and other social and economic characteristics that are highly correlated with unemployment. This helps to minimize the between-PSU variance.

Key variables used for stratification include:

•• Number of males unemployed.
•• Number of females unemployed.
•• Number of families with female head of household.
•• Number of households with three or more people.

In addition to these, a number of other variables were used for stratification in certain states, such as industry and wage variables obtained from the Quarterly Census of Employment and Wages program at the BLS. The number of stratification variables in a state ranged from three to six, except in Alaska, where the only variable used is the number of males unemployed.
d.	 Starting with the 2010 sample redesign,
stratifications and PSUs in sample for CPS
and CHIP are exactly the same in states
that contain CHIP sample.
e.	Table 2-2.1 summarizes the percentage of the targeted population in SR and sampled NSR areas by state. SR percentages for a given state were computed as the ratio of the sum of the MOS (measure of size) of all SR PSUs to the total MOS for that state. NSR percentages for a given state were computed as the ratio of the sum of the MOS of only the selected NSR PSUs to the total MOS for that state.
The CPS used the PSU Stratification Program (PSP), created by the Demographic Statistical Methods Division of the Census Bureau, to perform the PSU stratification. CPS strata in all states are formed by the PSP. As an initial stratification, the PSP randomly places NSR PSUs into strata for each state or area, adhering to the NSR stratum size tolerance. Stratifications are compared using a criterion score, an estimate of the variance that would result from a given stratification. The criterion score (using between or total variance) is computed, and PSUs are then moved from stratum to stratum (maintaining the sample size criterion) in an attempt to improve (lower) the criterion score. All possible swaps of PSUs from one stratum to another are then evaluated. A list of the best stratifications based on criterion score was created and given to the analyst for each state. A national file is then produced containing the chosen stratification for each state.

A consequence of the above stratification criteria is that states that are geographically small, mostly urban, or demographically homogeneous are entirely SR. These states are Connecticut, Delaware, Hawaii, Massachusetts, New Hampshire, New Jersey, Rhode Island, and Vermont. Additionally, the District of Columbia and the New York City and Los Angeles substate areas are entirely SR.

Table 2-2.1.
Civilian Noninstitutional Population 16 Years and Over in Sample Areas for 852-Primary-Sampling-Unit Design by State

                                        Total                   Self-representing          Non-self-representing
State                          Population1    Percent     Population1    Percent     Population1    Percent

   Total                       185,883,503       85.8     167,296,681       77.2      18,586,822        8.6

Alabama                          2,201,623       66.7       1,559,216       48.4         602,407       18.2
Alaska                             388,785       85.2         337,864       74.1          50,921       11.2
Arizona                          3,905,714       90.5       3,505,676       81.3         400,038        9.3
Arkansas                         1,308,227       65.8         936,784       47.1         371,443       18.7
California                      25,685,930       96.5      24,875,235       93.5         810,695        3.0
  Los Angeles                    7,286,643      100.0       7,286,643      100.0               Z          Z
  Remainder of California       18,399,287       95.2      17,588,592       91.0         810,695        4.2
Colorado                         2,853,127       82.4       2,613,303       75.5         239,824        6.9
Connecticut                      2,568,385      100.0       2,568,385      100.0               Z          Z
Delaware                           624,999      100.0         624,999      100.0               Z          Z
District of Columbia               472,453      100.0         472,453      100.0               Z          Z
Florida                         12,599,956       94.4      11,978,102       89.7         621,854        4.7
Georgia                          5,177,165       78.4       4,462,856       67.6         714,309       10.8
Hawaii                             939,702      100.0         939,702      100.0               Z          Z
Idaho                              810,619       78.2         624,070       60.2         186,549       18.0
Illinois                         8,145,484       89.2       7,320,664       80.2         824,820        9.0
Indiana                          3,502,350       78.1       2,964,356       60.1         807,994       18.0
Iowa                             1,265,473       59.8         736,978       34.8         528,495       25.0
Kansas                           1,380,262       71.5       1,091,805       56.6         288,457       14.9
Kentucky                         1,903,506       62.9       1,360,990       44.9         542,516       17.9
Louisiana                        2,572,843       82.1       2,030,269       64.8         542,574       17.3
Maine                              790,526       84.2         659,409       70.2         131,117       14.0
Maryland                         3,864,604       95.5       3,727,311       92.1         137,293        3.4
Massachusetts                    4,811,359      100.0       4,811,359      100.0               Z          Z
Michigan                         6,062,207       86.6       5,177,361       73.9         884,846       12.6
Minnesota                        2,597,419       69.7       2,274,189       61.1         323,230        8.7
Mississippi                      1,183,622       58.3         798,265       39.3         385,357       19.0
Missouri                         2,839,179       68.8       2,600,607       63.0         238,572        5.8
Montana                            523,403       77.0         433,654       63.8          89,749       13.2
Nebraska                           875,096       70.4         696,627       56.0         178,469       14.4
Nevada                           1,787,944       96.9       1,737,016       94.1          50,928        2.8
New Hampshire                      933,310      100.0         933,310      100.0               Z          Z
New Jersey                       6,390,073      100.0       6,390,073      100.0               Z          Z
New Mexico                       1,133,369       81.3         950,061       68.1         183,308       13.1
New York                        13,219,759       92.1      12,739,122       88.8         480,637        3.6
  New York City                  6,316,113      100.0       6,316,113      100.0               Z          Z
  Remainder of New York          6,903,646       86.0       6,423,009       80.0         480,637        6.0
North Carolina                   5,144,855       77.8       4,302,744       65.1         842,111       12.7
North Dakota                       365,709       77.5         272,519       57.8          93,190       19.7
Ohio                             6,601,621       82.1       6,056,024       75.3         545,597        6.8
Oklahoma                         1,701,769       67.3       1,484,962       58.8         216,807        8.6
Oregon                           2,298,939       85.5       1,798,682       66.9         500,257       18.6
Pennsylvania                     7,883,979       87.1       7,167,370       79.2         716,609        7.9
Rhode Island                       764,720      100.0         764,720      100.0               Z          Z
South Carolina                   2,757,560       86.3       2,433,050       76.2         324,510       10.2
South Dakota                       321,254       58.6         230,650       42.1          90,604       16.5
Tennessee                        3,488,335       78.8       2,747,854       62.1         740,481       16.7
Texas                           15,033,549       88.4      13,369,052       78.7       1,664,497        9.8
Utah                             1,652,069       91.5       1,533,187       84.9         118,882        6.6
Vermont                            455,575      100.0         455,575      100.0               Z          Z
Virginia                         4,498,245       82.7       3,854,979       70.8         643,266       11.8
Washington                       3,532,297       76.1       3,085,917       66.5         446,380        9.6
West Virginia                      764,529       58.2         561,890       42.8         202,639       15.4
Wisconsin                        2,785,300       69.6       2,020,342       50.5         764,958       19.1
Wyoming                            284,725       74.6         225,093       59.0          59,632       15.6

Z Represents or rounds to zero.
1 Civilian noninstitutional population from sample areas 16 years of age and over based on the 2010 Census.
Source: U.S. Census Bureau, 2010 Census.



Selection of Primary Sampling Units

Each SR PSU is in the sample by definition. There are currently 506 SR PSUs. In each of the remaining 346 NSR strata, one PSU is selected for the sample following the guidelines described next.

At each sample redesign of the CPS, it is important to minimize the cost of introducing a new set of PSUs. Substantial investment has been made in hiring and training FRs in the existing sample PSUs. For each PSU dropped from the sample and replaced by another in the new sample, the expense of hiring and training a new FR must be accepted. Furthermore, there is a temporary loss in accuracy of the results produced by new and relatively inexperienced FRs. Concern for these factors is reflected in the procedure used for selecting PSUs.

Objectives of the Non-Self-Representing Selection Procedure

The selection of the NSR PSUs was carried out within the strata using the 2010 Census population. The selection procedure selected one PSU from each stratum with probability proportional to the 2010 population.

Calculation of Overall State Sampling Interval

After stratifying the PSUs within the states, the overall sampling interval in each state is computed. The overall state sampling interval is the inverse of the probability of selection of each HU in a state for a self-weighting design. By design, the overall state sampling interval is fixed, but the state sample size is not fixed, allowing for growth of the CPS sample because of HUs built after the 2010 Census. (See below for information about how the desired CPS sample size is maintained.)

The state sampling interval is designed to meet the requirements for the variance on an estimate of the unemployment level. This variance can be thought of as a sum of variances from the first stage and the second stage of sample selection. The first-stage variance is called the between-PSU variance and the second-stage variance is called the within-PSU variance.

The square of the state CV, or the relative variance, is expressed as

$$CV^2 = \frac{\sigma_b^2 + \sigma_w^2}{x^2} \qquad \text{(2-2.1)}$$

where

σb² = between-PSU variance contribution to the variance of the state unemployment level estimator

σw² = within-PSU variance contribution to the variance of the state unemployment level estimator

E(X) = x, the expected value of the unemployment level for the state

The term σw² can be written as the variance assuming a binomial distribution from a simple random sample multiplied by a design effect:

$$\sigma_w^2 = \frac{N^2 pq}{n} \cdot deff$$

where

N = the civilian noninstitutional population, 16 years of age and older (CNP16+), for the state

p = proportion of unemployed in the CNP16+ for the state, or p = x/N

n = the state sample size

q = 1 − p

deff = the state within-PSU design effect. This is a factor accounting for the difference between the variance calculated from a multistage stratified sample and that from a simple random sample.

Substituting Np = x, this formula can be rewritten as

$$\sigma_w^2 = deff \cdot SI \cdot x \cdot q \qquad \text{(2-2.2)}$$

where SI = the state sampling interval, or SI = N/n.

Substituting Formula 2-2.2 into Formula 2-2.1 and rewriting in terms of the state sampling interval gives

$$SI = \frac{CV^2 x^2 - \sigma_b^2}{deff \cdot x \cdot q}$$

where CV²x² is the variance of the unemployment level estimator. Generally, the overall state sampling interval is used for all strata in a state, yielding a self-weighting state design. (In some states, the sampling interval is adjusted in certain strata to equalize FR workloads.) When computing the sampling interval for the current CPS sample, a six percent state unemployment rate is assumed. Table 2-2.1 provides information on the proportion of the population in sample areas for each state.
The CHIP sample is allocated among the states after the CPS sample is allocated. A sampling interval accounting for both the CPS and CHIP samples can be computed by combining the two sampling rates,

$$SI_{CPS+CHIP} = \left(\frac{1}{SI_{CPS}} + \frac{1}{SI_{CHIP}}\right)^{-1},$$

and the between-PSU variance for the combined CPS/CHIP sample can be estimated in a corresponding manner.
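As a purely illustrative calculation (every input below is hypothetical, chosen only to show how the formula behaves): for an entirely SR state (σb² = 0) with x = 80,000 unemployed out of a CNP16+ of N = 2,000,000 (so p = 0.04 and q = 0.96), an 8.0 percent CV requirement, and an assumed deff of 1.2,

$$SI = \frac{(0.08)^2 (80{,}000)^2}{1.2 \times 80{,}000 \times 0.96} \approx 444,$$

implying a state sample size of roughly n = N/SI ≈ 4,500.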

SECOND STAGE OF THE SAMPLE
DESIGN
The second stage of the CPS sample design is the
selection of sample HUs within PSUs. The objectives of within-PSU sampling are to:
•	Select a probability sample that is representative of the civilian noninstitutional population.

•	Give each HU in the population one chance of selection, with virtually all HUs in a state or substate area having the same overall chance of selection.

•	For the sample size used, keep the within-PSU variance of the labor force statistics (in particular, unemployment) at as low a level as possible, subject to respondent burden, cost, and other constraints.

•	Select within-PSU sample units annually.

•	Put particular emphasis on providing reliable estimates of monthly levels and change over time of labor force items.

USUs are the sample units selected during the
second stage of the CPS sample design. Most
USUs consist of a geographically compact cluster
of approximately four addresses, corresponding to
four HUs at the time of the census. Use of HU clusters lowers travel costs for FRs. Clustering slightly
increases within-PSU variance of estimates for
some labor force characteristics since respondents
within a compact cluster tend to have similar labor
force characteristics.

Overview of Sampling Sources
To accomplish the objectives of within-PSU sampling, extensive use is made of data from the 2010
Census. The 2010 Census collected information on
all living quarters existing as of April 1, 2010, as
well as the demographic composition of people
residing in these living quarters. Data on the economic well-being and labor force status of individuals was obtained from the American Community
Survey.
These sources provide sampling information for
numerous demographic surveys conducted by
the Census Bureau.15 In consideration of respondents, sampling methodologies are coordinated
among these surveys to ensure that a sampled HU
is selected for one survey only. Consistent definition of sampling frames allows the development
of separate, optimal sampling schemes for each
survey. The general strategy for each survey is to
sort and stratify all the elements in the sampling
frame (eligible and not eligible) to satisfy individual survey requirements, select a systematic
sample, and remove the selected sample from
the frame. Sample is selected for the next survey
from what remains. Procedures are developed to
determine eligibility of sample cases at the time of interview for each survey. This coordinated sampling approach is computer intensive and started with the 2000 sample redesign.

15 CPS sample selection is coordinated with the following demographic surveys in the 2010 redesign: the AHS-Metropolitan sample, the AHS-National sample, the CE Survey-Diary sample, the CE Survey-Quarterly sample, the Telephone Point of Purchase Survey, the NCVS, the National Health Interview Survey, the Rent and Property Tax Survey, and the SIPP.

Type of Living Quarters
Two types of living quarters were defined for the
census. The first type is a HU. A HU is a group of
rooms or a single room occupied as a separate
living quarter or intended for occupancy as a
separate living quarter.16 A HU may be occupied
by a family, one person, or two or more unrelated
people who share the living quarter. About 99 percent of the population counted in the 2010 Census
resided in HUs.
The second type of living quarter is a GQ. A GQ
is a living quarter where residents share common facilities or receive formally authorized care.
Examples include college dormitories, retirement
homes, and communes. Some GQs, such as fraternity and sorority houses and certain types of
group houses, are distinguished from HUs if they
house ten or more unrelated people. The GQ population is classified as institutional or noninstitutional and as military or civilian. CPS targets only
the civilian noninstitutional population residing in
GQs. As a cost-savings measure, student dormitories are not sampled, since the vast majority
of students in dormitories either have a usual
residence elsewhere, are not in the labor force,
or both. A subset of institutional GQs is included
in the GQ frame and given a chance of selection
in case of conversion to civilian noninstitutional
housing by the time it is scheduled for interview.
Less than 1 percent of the population counted in
the 2010 Census resided in GQs.

Development of Sampling Frames
The primary sampling frame used by the CPS
in 2010 was the MAF. This file is used by many
demographic surveys and comprises 2010 decennial census addresses with updates from the USPS
and local governments. Refer back to Chapter
2-1, CPS Frame, for more about the MAF. Separate HU and GQ frames are created from the MAF. The skeleton frame, described below, is also used as a placeholder for future sample from new construction.

16 Separate living quarters are living quarters in which one or more occupants live separately from any other individual(s) in the building and have direct access to the living quarters without going through another living quarters, such as from outside the building or through a common hall. For vacant units, the criteria of separateness and direct access are applied to the intended occupants (U.S. Census Bureau, Decennial Management Division Glossary [2014]).

Housing Units on the MAF
The HU portion of the MAF includes all the official
2010 census addresses and postcensus additions
from the USPS, local jurisdictions, and field listings. About 99 percent of CPS sample comes from
the HU portion of the MAF. The unit frame consists
of HUs in census blocks that contain a very high
proportion of complete addresses. The unit frame
covers most of the population. A USU in the unit
frame consists of a geographically compact cluster of four addresses, which are identified during
sample selection. The addresses, in most cases,
are those for separate HUs. However, over time
some buildings may be demolished or converted
to nonresidential use, and others may be split up
into several HUs. These addresses remain sample
units, resulting in a small variability in cluster size.

Group Quarters on the MAF
About 1 percent of sample is also selected from
the GQs portion of the MAF. The GQs on the MAF
consist of noninstitutional facilities such as college
dorms, adult group homes, Job Corps centers,
and religious GQs.
The GQ frame covers a small proportion of the
population. A CPS USU in the GQ frame consists
of two HU equivalents. The GQ frame is converted
into HU equivalents because the 2010 Census
addresses of individual GQs or people within a
GQ are not used in the sampling. The number of
HU equivalents is computed by dividing the 2010
Census GQs population by the average number of
people per household (calculated from the 2010
Census as 2.61).
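For example (hypothetical numbers), a GQ whose 2010 Census population was 261 would contribute

$$\frac{261}{2.61} = 100 \text{ HU equivalents},$$

or 50 CPS USUs, since a CPS USU in the GQ frame consists of two HU equivalents.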

Skeleton Frame
The skeleton frame in the 2010 redesign is different from the skeleton frame created for the
2000 redesign. The 2000 skeleton frame, most
commonly referred to as the permit frame, was
filled in with HUs listed in the permit address
listing/new construction operations. The 2010
skeleton frame provides placeholders to be filled


in by new growth identified in the MAF extracts
every 6 months. The purpose of the 2000 skeleton frame was to allocate new growth sample in
selected areas to a 15-year sampling period. The
primary purpose of the skeleton frame for the
2010 redesign is to select sample units from new
growth updates as part of annual sampling, and
its secondary purpose is to provide representative
new growth sample between main sample selection periods.

Group Quarters Details
An integer number of GQ units is calculated at the census block level. The number of GQ units is referred to as the GQ block MOS and is calculated as

$$\text{GQ block MOS} = \left\lceil \frac{NI + CH}{\alpha} \right\rceil + IN$$

where

NI = tabulation block noninstitutional GQ population, excluding college housing and military GQs

CH = tabulation block college housing population

IN = number of institutional GQs in the tabulation block

α = estimated average household size in the United States (2.61 for the 2010 redesign)

A listing of GQs, with closeouts, occurs quarterly. Sampling of GQ units is done
after units are sampled in the within-PSU sampling
stage and after GQ listings have been loaded onto
the database. Units are selected monthly by interview date, meaning only the current upcoming
sample designation, rotation, and panel are sampled. If more units are selected than the cut-off
per segment for the survey, subsampling occurs.
Only the civilian noninstitutional population is
interviewed for CPS. An institutional GQ is equivalent to one measure, regardless of the number of
people counted there in the 2010 Census.
Unduplication will occur to allow at least 2 years
between interviews for a unit within a GQ. For


instance, a unit could be selected for CPS in the
first sample period with an initial interview date
of January 2016. This unit is next eligible to be
selected and interviewed with initial interview
date after January 2018. This would give that unit
at least 9 months off between the two 16-month
interview cycles.

SELECTION OF SAMPLE UNITS
The CPS sample is designed to be self-weighting
by state or substate area. A systematic sample is
selected from each PSU at a sampling rate of 1
in k, where k is the within-PSU sampling interval.
This interval is equal to the product of the PSU
probability of selection and the stratum sampling
interval. The stratum sampling interval is usually
the overall state sampling interval. (See the earlier
section in this chapter, “Calculation of overall state
sampling interval.”)
CPS sample is selected separately for the unit and
GQ frames. Since sample is selected at a constant
overall rate, the percentage of sample selected
from each frame is proportional to population size.

Within-Primary Sampling Unit Sampling
Procedure
Units are arranged within sampling frames based
on characteristics of the 2010 Census and geography. The 2010 Census characteristics used are the percentage of households with a female householder, the percentage owner occupied, the percentage Black, and the percentage aged 65 and over. Sorting minimizes within-PSU variance of
estimates by grouping together units with similar
characteristics. The 2010 Census data and geography are used to sort blocks and units. (Sorting
is done within block and state since sampling is
performed within block and state.) The MAF HU
frame is sorted on block level characteristics,
keeping HUs in each block together, and then
by a HU identification number to sort the HUs
geographically.
CPS selects HUs from the HU frame every year and selects group quarters from the GQ frame every 3 years. This differs from past designs, in which samples for the entire decade were selected at once.


After frame files are ready, the sampling intervals are computed and adjustments are made, random starts are calculated, and then the sample is selected. A sampling interval is the inverse of the sampling rate; for example, a sampling interval of 20 means 1 out of 20, or 5 percent. A random start is simply the initial position in the list where sampling begins.
Example:
Final sampling interval = 5.75
String length = 4
Random start = 3.76
Number of records = 25
Then the following sequence will be created:
3.76
3.76 + (1 * 5.75) = 9.51
3.76 + (2 * 5.75) = 15.26
3.76 + (3 * 5.75) = 21.01
The rounded-up sequence would then be 4, 10, 16, and 22.
The sample cases would be records 4–7, 10–13, 16–19, and 22–25.
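
The selection mechanics of this example can be expressed in a few lines. The sketch below is illustrative, not the production sampling code; it reproduces the worked example above:

```python
import math

def systematic_hits(interval: float, string_length: int,
                    random_start: float, n_records: int) -> list[list[int]]:
    """Systematic selection of hit strings from a sorted frame.

    Each hit position random_start + j * interval is rounded up to the
    first record of a string of string_length consecutive records;
    records are numbered starting at 1.
    """
    hits = []
    position = random_start
    while math.ceil(position) + string_length - 1 <= n_records:
        first = math.ceil(position)
        hits.append(list(range(first, first + string_length)))
        position += interval
    return hits

print(systematic_hits(interval=5.75, string_length=4,
                      random_start=3.76, n_records=25))
# [[4, 5, 6, 7], [10, 11, 12, 13], [16, 17, 18, 19], [22, 23, 24, 25]]
```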

Assignment of Postsampling Codes
Two types of postsampling codes are assigned to
the sampled units. First, there are the CPS technical codes used to weight the data, estimate the
variance of characteristics, and identify representative subsamples of the CPS sample units. The
technical codes include final hit number, rotation
group, and random group codes. Second, there
are operational codes common to the demographic household surveys used to identify and
track the sample units through data collection and
processing. The operational codes include field
PSU and control number.
Final hit number—The final hit number identifies
the original within-PSU order of selection. All
USUs in a hit string are assigned the same final
hit number. For each PSU, this code is assigned
sequentially, starting with one. The final hit number is used in the application of the CPS variance
estimation method discussed in Chapter 2-4,
Variance Estimation.
Rotation group—The sample is partitioned into eight representative subsamples, called rotation groups, used in the CPS rotation scheme. All USUs
in a hit string are assigned to the same rotation
group. Rotation groups are assigned after sorting
hits by state, MSA or non-MSA status (old construction only), SR or NSR status, stratification
PSU, and final hit number. Because of this sorting,
the eight subsamples are balanced across stratification PSUs, states, and the nation. Rotation
group is used in conjunction with sample designation to determine units in sample for particular
months during the decade.
Random group—The sample is partitioned into ten
representative subsamples called random groups.
All USUs in the hit string are assigned to the same
random group. Since random groups are assigned
after sorting hits by state, stratification PSU,
rotation group, and final hit number, the ten subsamples are balanced across stratification PSUs,
states, and the nation. Random groups can be
used to partition the sample into test and control
panels for survey research.
Field PSU—A field PSU is a single county within a
stratification PSU. Field PSU definitions are consistent across all demographic surveys and are more
useful than stratification PSUs for coordinating FR
assignments among demographic surveys.

ROTATION OF THE SAMPLE
The CPS sample rotation scheme is a balance
between a permanent sample (from which a high
response rate would be difficult to maintain) and a
completely new sample each month (which results
in more variable estimates of change). The CPS
sample rotation scheme represents an attempt
to strike a balance in the minimization of the
following:
•	 Variance of estimates of month-to-month change: three-fourths of the sample units are the same in consecutive months.
•	 Variance of estimates of year-to-year change: one-half of the sample units are the same in the same month of consecutive years.
•	 Variance of other estimates of change: outgoing sample is replaced by sample likely to have similar characteristics.
•	 Response burden: eight interviews are dispersed across 16 months.


The rotation scheme follows a 4-8-4 pattern. A
HU or GQ is interviewed 4 consecutive months,
removed from sample for the next 8 months, interviewed the next 4 months, and then retired from
sample. The rotation scheme is designed so outgoing HUs are replaced by HUs from the same hit
string, which tend to have similar characteristics.

Rotation Chart
The CPS rotation chart illustrates the rotation pattern of CPS sample over time. Table 2-2.2 presents the rotation chart beginning in January 2018. The following statements provide guidance in interpreting the chart:
•	 The chart covers the interview period from January 2018 through March 2020 for the CPS and for CHIP. For each month, the chart shows the sample designation and rotation (or rotation group) for the units interviewed. A sample designation is represented by the combination of the letter A or B with a two-digit number. The letter A represents CPS and the letter B represents CHIP, so that the designation A06 is a CPS sample and B06 is a CHIP sample. Each sample designation consists of rotations numbered 1 through 8. Sample designations and rotations appear as column headings, and the numbers within the chart refer to the month-in-sample (MIS). For example, the 5 under the column heading A08/B08 (rotation 2) for January 2019 indicates that rotation 2 of sample designation A08 (or B08) is being interviewed for the fifth time (MIS 5).
•	 The sample units in a particular rotation are interviewed for 4 consecutive months; then, after a lapse of 8 months, they are interviewed for another 4 months. For example, A08/B08 (rotation 4) is interviewed from March 2018 to June 2018 and again from March 2019 to June 2019. In a given month, two or three sample designations are in operation. For example, in January 2019, units with sample designations A07/B07 (rotations 7 and 8), A08/B08 (rotations 1 and 2), and A09/B09 (rotations 3, 4, 5, and 6) are being interviewed.
•	 Each month, a new rotation comes into sample for the first time, and another returns to sample after an 8-month rest. The remaining sample designation/rotations were interviewed during the preceding month. For example, in January 2019, A09/B09 (rotation 6) units come into sample for the first time; A08/B08 (rotation 2) units return after an 8-month lapse; and A07/B07 (rotations 7 and 8), A08/B08 (rotation 1), and A09/B09 (rotations 3, 4, and 5) sample units are interviewed again after being interviewed in the preceding month, December 2018.
•	 The chart differentiates the annual samples using light and dark gray shading, with each annual sample consisting of one-and-a-half sample designations. Each annual sample begins its interviews in April.
•	 This rotation scheme has been used since 1953. The most recent research into alternate rotation patterns occurred prior to the 1980 redesign, when state-based designs were introduced (Tegels and Cahoon, 1982).
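
The 4-8-4 pattern can be checked mechanically. The sketch below (illustrative names, not production code) maps the number of months since a rotation group's first interview to its MIS:

```python
def month_in_sample(months_since_entry: int) -> int | None:
    """MIS (1-8) of a rotation group under the 4-8-4 pattern, or None
    if the group is resting or has been retired.

    months_since_entry -- 0 for the group's first interview month.
    """
    m = months_since_entry
    if 0 <= m <= 3:
        return m + 1        # first 4 interview months: MIS 1-4
    if 12 <= m <= 15:
        return m - 7        # second 4 interview months: MIS 5-8
    return None             # 8-month rest (4-11) or retired (16+)

# A08/B08 rotation 4 enters in March 2018; in March 2019 (12 months
# later) it returns as MIS 5.
print(month_in_sample(0), month_in_sample(6), month_in_sample(12))  # 1 None 5
```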


Table 2-2.2.
Current Population Survey/Children's Health Insurance Program Rotation Chart, January 2018–March 2020: Month-in-Sample by Sample Designation and Rotation
[The chart lists, for each month from January 2018 through March 2020, the MIS (1–8) of each rotation (1–8) of sample designations A06/B06 through A11/B11, with the 2016–2019 annual samples shaded.]

Overlap of the Sample
Table 2-2.3 shows the approximate proportion of
overlap between any 2 months of sample depending on the time lag between them. The proportion of sample in common has a strong effect
on correlation between estimates from different
months and, therefore, on variances of estimates
of change.
Table 2-2.3.
Approximate Proportion of Sample in Common for 4-8-4 Rotation System

Interval (in months)          Percentage of sample in common between 2 months
1                             75.0
2, 12                         50.0
3, 10, 14                     25.0
4–8, 16 and greater           0.0
9, 15                         12.5
11, 13                        37.5
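
The proportions in Table 2-2.3 follow directly from the 4-8-4 pattern. A short sketch that recomputes them by counting the rotation groups in sample in both months (illustrative, not production code):

```python
def overlap_pct(lag: int) -> float:
    """Percentage of the eight rotation groups interviewed in both
    month t and month t + lag under the 4-8-4 pattern."""
    # Months, relative to a group's entry, in which it is interviewed.
    in_sample = set(range(0, 4)) | set(range(12, 16))
    shared = sum(1 for m in in_sample if m + lag in in_sample)
    return 100.0 * shared / 8

for lag in (1, 2, 3, 9, 11, 12, 16):
    print(lag, overlap_pct(lag))
# Prints 75.0, 50.0, 25.0, 12.5, 37.5, 50.0, and 0.0, matching the table.
```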

Phase-In of a New Design
When a newly redesigned sample is introduced
into the ongoing CPS rotation scheme, there are
a number of reasons not to discard the old CPS
sample 1 month and replace it with a completely
redesigned sample the next month. Because the redesigned sample contains different sample areas, new FRs must be hired, and modifications in survey procedures are usually made for a redesigned sample. These factors can cause discontinuity in estimates if the transition is made all at once.
Instead, a gradual transition from the old sample
design to the new sample design is undertaken.
Beginning in April 2014, the 2010 Census-based
design was phased in through a series of changes
completed in July 2015 (BLS, 2014).

MAINTAINING THE DESIRED SAMPLE
SIZE
The CPS sample is continually updated to include
recently built HUs. If the same sampling rates were
used throughout the decade, the growth of the
U.S. housing inventory would lead to increases
in the CPS sample size and, consequently, to
increases in cost. To avoid exceeding the budget,
the sampling rate is periodically reduced to maintain the desired sample size. Referred to as maintenance reductions, these changes in the sampling rate are implemented in a way that retains the desired set of reliability requirements.
These maintenance reductions are different from
changes to the base CPS sample size resulting
from modifications to the CPS funding levels. The
methodology for designing and implementing this
type of sample size change is generally dictated
by new requirements specified by BLS. For example, the sample reduction implemented in January
1996 was due to a reduction in CPS funding; new
design requirements were specified at that time.

Developing the Reduction Plan
The CPS sample size for the United States is
projected forward for about 1 year using linear
regression based on previous CPS monthly sample
sizes. The future CPS sample size must be predicted because CPS maintenance reductions are
gradually introduced over 16 months and operational lead-time is needed so that dropped cases
will not be interviewed.
Housing growth is examined in all states and
major substate areas to determine whether it is
uniform or not. The states with faster growth are
candidates for maintenance reduction. The postreduction sample must be sufficient to maintain
the individual state and national reliability requirements. Generally, the sample in a state is reduced by the same proportion in all frames in all PSUs to maintain the self-weighting nature of the state-level design.

Reduction Groups
The CPS sample size is reduced by deleting one
or more subsamples of USUs from each sampling
frame.
The original sample of USUs is partitioned into 101 subsamples called reduction groups; each is representative of the overall sample. The decision to use 101 subsamples is somewhat arbitrary. A useful attribute of 101 is that it is relatively prime to the number of rotation groups (eight), so that reductions have a uniform effect across rotations. A number larger than 101 would allow greater flexibility in pinpointing proportions of the sample to reduce. However, a large number of reduction groups can lead to imbalances in the distribution of sample cuts across PSUs, since small PSUs may
not have enough sample to have all reduction groups represented.
All USUs in a hit string have the same reduction group number. For all frames, hit strings are sorted and then sequentially assigned a reduction group code from 1 through 101. The sort sequence is:
1.	 State or substate.
2.	 MSA or non-MSA status.
3.	 SR or NSR status.
4.	 Stratification PSU.
5.	 Final hit number, which defines the original order of selection.
The state or national sample can be reduced by deleting USUs from the frame in one or more reduction groups. If there are k reduction groups in the sample, the sample may be reduced by 1/k by deleting one of the k reduction groups. For the first reduction applied to redesigned samples, each reduction group represents roughly 1 percent of the sample. Reduction group numbers are chosen for deletion in a specific sequence designed to maintain the nature of the systematic sample to the extent possible.
For example, suppose a state has an overall state sampling interval of 500 at the start of the 2010 design. Suppose the original selection probability of 1 in 500 is modified by deleting 5 of 101 reduction groups. The resulting overall state sampling interval (SI) is

$$SI = 500 \times \frac{101}{101 - 5} \approx 526$$

This makes the resulting overall selection probability in the state approximately 1 in 526. In the subsequent maintenance reduction, the state has 96 reduction groups remaining. A further reduction of 1 in 96 can be accomplished by deleting 1 of the remaining 96 reduction groups.
The resulting overall state sampling interval is the new basic weight for the remaining uncut sample.
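
A sketch of this arithmetic (function and variable names are illustrative):

```python
def reduced_interval(si: float, groups_total: int, groups_cut: int) -> float:
    """Overall sampling interval after deleting reduction groups.

    Deleting groups_cut of groups_total groups keeps a fraction
    (groups_total - groups_cut) / groups_total of the sample, so the
    interval (and hence the basic weight) grows by the reciprocal.
    """
    return si * groups_total / (groups_total - groups_cut)

si_after_first = reduced_interval(500.0, 101, 5)        # ~526: 1 in 526
si_after_second = reduced_interval(si_after_first, 96, 1)
print(round(si_after_first, 1), round(si_after_second, 1))  # 526.0 531.6
```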
Introducing the Reduction
A maintenance reduction is implemented only when a new sample designation is introduced, and it is gradually phased in with each incoming rotation group to minimize the effect on survey estimates and reliability and to prevent sudden changes to interviewer workloads. The basic weight applied to each incoming rotation group reflects the reduction. Once this basic weight is assigned, it does not change until future sample changes are made. In all, it takes 16 months for a maintenance sample reduction and new basic weights to be fully reflected in all eight rotation groups interviewed for a particular month. During the phase-in period, rotation groups have different basic weights; consequently, the average weight over all eight rotation groups changes each month. After the phase-in period, all eight rotation groups have the same basic weight.

REFERENCES
Executive Office of the President, Office of
Management and Budget, Metropolitan Area
Changes Effective with the Office of Management
and Budget’s Bulletin 13-01, February 28, 2013.
Tegels, R., and L. Cahoon, “The Redesign of the
Current Population Survey: The Investigation
into Alternate Rotation Plans,” paper presented
at the 1982 Joint Statistical Meetings, American
Statistical Association, 1982.
U.S. Bureau of Labor Statistics, “Redesign of the Sample for the Current Population Survey,” Current Population Survey, Technical Documentation, April 2014.


FURTHER READING
Farber, J., “2010 Sample Redesign
Requirements for Creating the Skeleton Frame,”
Doc. #2010-3.8-Q-7, Version 1.5, U.S. Census
Bureau, 2011.
Farber, J., and M. Li, “2010 Sample Redesign
Requirements for Within-PSU Unit Frame
Sample Selection,” Doc. #2010-3.8-Q-9, Version
1.2, U.S. Census Bureau, 2014.
Farber, J., and M. Neiman, “Sample Code
Assignments for the Current Population Survey
and the State Children’s Health Insurance
Program Survey for the 2010 Redesign on the
Unit Frame,” Doc. #2010-3-8-Q-12, Version 1.2,
U.S. Census Bureau, 2012.
Farber, J., and M. Sundukchi, “PSU Stratification
Program Requirements for the 2010 Sample
Redesign,” Doc. #2010-3-2-Q-2, Version 1.4,
U.S. Census Bureau, 2012.
Kostanich, D., D. Judkins, R. Singh, and
M. Schautz, “Modification of Friedman-Rubin’s
Clustering Algorithm for Use in Stratified PPS
Sampling,” paper presented at the 1981 Joint
Statistical Meetings, American Statistical
Association, 1981.


Kuwik, C., “Group Quarters Overview
Document,” Doc. #2010-3.4-G-2, Version 0.7,
U.S. Census Bureau, 2012.
Ludington, P. W., “Stratification of Primary
Sampling Units for the Current Population
Survey Using Computer Intensive Methods,”
paper presented at the 1992 Joint Statistical
Meetings, American Statistical Association,
1992.
Rawlings, A., “Annual Sampling
Recommendation for the 2010 Sample
Redesign,” Doc. #2010-3.8-R-3, Version 1.0,
U.S. Census Bureau, 2010.
Statt, R., E. A. Vacca, C. Wolters, and
R. Hernandez, “Problems Associated with Using
Building Permits as a Frame of Post-Census
Construction: Permit Lag and ED Identification,”
paper presented at the 1981 Joint Statistical
Meetings, American Statistical Association,
1981.
U.S. Bureau of Labor Statistics, “Redesign of
the Sample for the Current Population Survey,”
Employment and Earnings, Government Printing
Office, Washington, DC, December 2004.


Chapter 2-3: Weighting and Estimation
INTRODUCTION
The CPS is a multistage probability sample of HUs
in the United States. It produces monthly labor
force and related estimates for the total U.S. civilian
noninstitutional population (CNP) and provides
details by age, sex, race, and Hispanic ethnicity. In
addition, the CPS produces estimates for a number of other population subgroups (e.g., families,
veterans, and earnings of employed people) on
either a monthly, quarterly, or annual basis. Each
month a sample of eight panels, or rotation groups,
is interviewed, with demographic data collected for
all occupants of the sample HUs. Labor force data
are collected from people aged 15 and older. Each
rotation group is itself a representative sample of
the U.S. population. The labor force estimates are
derived through a number of weighting steps in the
estimation procedure. In addition, the weighting at
each step is replicated in order to derive variances
for the labor force estimates (see Chapter 2-4,
Variance Estimation for details).
The weighting procedures of the CPS supplements
are discussed in Chapter 1-3, Supplements. Many
of the supplements apply to specific demographic
subpopulations and differ in coverage from the
basic CPS universe. The supplements tend to have
higher nonresponse rates.
To produce national and state estimates from
survey data, a statistical weight for each person
in the sample is developed through the following
steps, each described in this chapter:
1.	 Base weighting produces simple, unbiased
estimates for the basic CPS universe under
ideal survey conditions, such as 100 percent
response rate, zero frame error, and zero
reporting error. Most sample units within a
state have the same probability of selection
and therefore have the same base weight.
2.	 Nonresponse adjustment reduces bias that
would arise from ignoring HUs that do not
respond.
3.	 First-stage weighting reduces variances due
to the sampling of NSR PSUs.
4.	 State and national coverage steps and second-stage weighting reduce variances by controlling, or benchmarking, CPS estimates of the population to independent estimates of the current population.
5.	 Composite weighting uses estimates from previous months to reduce the variances, particularly for certain estimates of change.

In addition to estimates of basic labor force characteristics, several other types of estimates can be
produced on a monthly, quarterly, or annual basis.
Each of these involves additional weighting steps
to produce the final estimate. These additional
estimation procedures provide the estimates for
particular subgroups of the CNP. The types of
characteristics include:
•	 Household-level estimates and estimates of families (such as married-couple families living in the same household) using family weights.
•	 Estimates of earnings, union affiliation, and industry and occupation (I&O) of second jobs collected from respondents in the two outgoing rotation groups (about one-fourth of the sample) using outgoing rotation weights.
•	 Estimates of labor force status for veterans and nonveterans using veterans’ weights.

The independent population controls used in second-stage weighting are produced by the Census
Bureau’s Population Estimates Program. Monthly
population figures are estimated using the 2010
Census as the basis and information from a variety
of primarily administrative sources that account
for births, deaths, and net migration. Subtracting
estimated numbers of resident armed forces
personnel and institutionalized people from the
resident population gives the CNP. These population controls are updated annually. CPS demographic weighted estimates are benchmarked to
the independent monthly controls. The Derivation
of Independent Population Controls section
later in the chapter provides a reference to the
methodology.
Although the processes described in this chapter have remained essentially unchanged since
January 1978, and seasonal adjustment has
been part of the estimation process since June
1975, modifications have been made in some of
the procedures from time to time. For example,
in January 1998, a new compositing procedure
was introduced. In January 2003, new race cells
were introduced for first-stage weighting and
second-stage weighting; also, national and state
coverage steps were added. In January 2005, the
number of cells used in the national coverage
adjustment and in second-stage weighting was
expanded to improve the estimates of children.

BASE WEIGHTING
The CPS selects a sample for every state based on state-specific survey requirements. A sample unit’s base weight is equal to the inverse of its probability of selection:

$$B_s = \frac{1}{\pi_s}$$

where

B_s = base weight for a unit in state s,

π_s = probability of selection for a unit in state s.

Almost all sample people within the same state have the same probability of selection. As the first step in estimation, the base weights from eligible individuals in eligible HUs are summed.

Effect of Annual Sampling
As described in Chapter 2-2, Sample Design, the
CPS selects samples annually, which allows for
a stabilized sample size at both the state and
national levels. CPS annual sampling also includes
a selection of samples for newly constructed HUs,
and the probabilities of selection for every state
are adjusted accordingly to approximately maintain a constant sample size from year to year. Base
weights also adjust, as these are inversely related.
Construction growth on the sampling frame from
year to year is small enough that the adjustments
to national base weights are typically only marginal. However, state-specific growth does have
the potential to introduce a more noticeable
change in a state’s base weight.
In addition to the growth associated with the
annual sampling, midyear growth also introduces
a small amount of additional sample to the CPS.
These sample cases are selected with the same
probability of selection as other cases in the state
and therefore have the same base weight.


WEIGHTING CONTROL FACTOR
The weighting control factor (WCF) adjusts the
base weight to account for any subsampling
required in the field or within the Census Bureau’s
Demographic Statistical Methods Division after a
survey selects its sample. As a means to control
sample size overrun, field subsampling reduces
the number of interviews in a segment to a manageable number when there are more than 15 designated interviews. In the 2010 design, it is no longer necessary to create a separate WCF file. For
each month, WCFs are provided in the Universe
Control File before FRs go out for interviews.

NONRESPONSE ADJUSTMENT
WEIGHTING
Nonresponse arises when HUs or other units of
observation that have been selected for inclusion in a survey fail to provide all or some of the
data that were to be collected. This failure to
obtain complete results from units selected can
arise from several different sources, depending
upon the survey situation. There are two major
types of nonresponse: item nonresponse and unit
nonresponse. Item nonresponse occurs when a
cooperating HU fails or refuses to provide some
specific items of information. Unit nonresponse
refers to the failure to collect any survey data from
an occupied sample HU. For example, data may
not be obtained from an eligible HU in the survey because of impassable roads, a respondent’s
absence or refusal to participate in the interview,
or unavailability of the respondent for other
reasons.
Unit nonresponse in the CPS is also called Type
A nonresponse. The nonresponse adjustment is
limited to eligible or in-scope HUs. Some HUs are
permanently out-of-scope, such as those that are
demolished (Type C nonresponse). Some HUs are
temporarily out-of-scope, such as those that are
vacant or those without any people in the CNP
(Type B nonresponse). Eligible HUs that do not
respond are Type A nonresponses.
In the CPS estimation process, the weights for
all eligible interviewed households are adjusted
to account for occupied sample households for
which no information was obtained because of
unit nonresponse. This nonresponse adjustment is made separately for similar sample areas that are
usually, but not necessarily, contained within the
same state. Increasing the weights of responding
sample units to account for eligible sample units
that have not responded is valid if the responding
units are similar to the nonresponding units with
regard to their demographic and socioeconomic
characteristics. Nonresponse bias is present in
CPS estimates when the nonresponding units differ in relevant aspects from those that respond to
the survey. For more information, see Chapter 4-1,
Nonsampling Error.

Nonresponse Clusters and Nonresponse
Adjustment Cells
Nonresponse adjustment is performed on groups
of sample PSUs that have similar metropolitan
status and population size in order to reduce bias
due to nonresponse. These groups of PSUs are
called nonresponse clusters. In general, PSUs of similar metropolitan status and population size in the same state belong to the same nonresponse cluster.
PSUs classified as MSAs are assigned to metropolitan clusters of similar size, and nonmetropolitan
PSUs are assigned to nonmetropolitan clusters.
Within each metropolitan cluster, there is a further
breakdown into two nonresponse adjustment
cells: “principal city” and “not principal city.” The
nonmetropolitan clusters are not divided further.
In the 2010 redesign, there are 125 clusters (82
metropolitan and 43 nonmetropolitan).

Computing Nonresponse Adjustment Factors
Weighted counts of responding and nonresponding households are tabulated separately for each nonresponse adjustment cell. The base weight is used as the weight for this purpose. The nonresponse adjustment factor NRAF_ij is computed as:

$$NRAF_{ij} = \frac{Z_{ij} + N_{ij}}{Z_{ij}}$$

where

Z_ij = weighted count of eligible responding HUs in cell j of cluster i,

N_ij = weighted count of eligible nonresponding HUs in cell j of cluster i.

These factors are applied to data for each responding person except in cells where any of the following situations occur:
•	 The computed factor is greater than 2.0.
•	 There are fewer than 50 unweighted responding HUs in the cell.
•	 The cell contains only nonresponding HUs.
If one of these situations occurs, the weighted counts are combined for the nonresponse adjustment cells within the nonresponse cluster. A common adjustment factor is computed and applied to weights for responding people within the cluster. If, after collapsing, any of the cells still meet any of the situations above, the cell is output to an “extreme cell file” that is created for review each month. This allows the extreme cells to be tracked over time, so adjustments can be made to the cell definitions if needed.
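
A simplified sketch of the factor computation and the cluster-level collapsing rule (the data structure, function names, and example values are illustrative; the thresholds are those listed above):

```python
def nraf(resp_wt: float, nonresp_wt: float) -> float:
    """Nonresponse adjustment factor for one cell:
    (responding + nonresponding weighted counts) / responding."""
    return (resp_wt + nonresp_wt) / resp_wt

def cluster_factors(cells: list[dict]) -> list[float]:
    """Compute NRAFs for the cells of one nonresponse cluster,
    collapsing to a single common factor if any cell fails the
    edit rules given above.

    Each cell dict holds weighted counts 'resp_wt' and 'nonresp_wt'
    and the unweighted responding HU count 'resp_n'.
    """
    def acceptable(c: dict) -> bool:
        return (c["resp_n"] >= 50                     # enough responding HUs
                and c["resp_wt"] > 0                  # not all nonresponse
                and nraf(c["resp_wt"], c["nonresp_wt"]) <= 2.0)

    if all(acceptable(c) for c in cells):
        return [nraf(c["resp_wt"], c["nonresp_wt"]) for c in cells]
    # Collapse: pool the weighted counts across the cluster's cells
    # and apply one common factor to every cell.
    pooled_resp = sum(c["resp_wt"] for c in cells)
    pooled_nonresp = sum(c["nonresp_wt"] for c in cells)
    return [nraf(pooled_resp, pooled_nonresp)] * len(cells)

# A metropolitan cluster with "principal city" and "not principal city"
# cells; the second fails the 50-HU rule, so the cluster collapses to a
# common factor of (1100 + 400) / 1100 = 1.36.
cells = [{"resp_wt": 900.0, "nonresp_wt": 100.0, "resp_n": 60},
         {"resp_wt": 200.0, "nonresp_wt": 300.0, "resp_n": 30}]
print(cluster_factors(cells))
```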

Nonresponse Adjustment Weights
At the completion of the nonresponse adjustment
procedure, the weight for each interviewed person
is the product
(base weight) x (nonresponse adjustment factor)
At this point, records for all individuals in the same
household have the same weight, since the adjustments discussed so far depend only on household
characteristics.

BENCHMARKING
Distributions of demographic characteristics
derived from the CPS sample in any month will
be somewhat different from the true distributions
even for such basic characteristics as age, race,
sex, and Hispanic ethnicity.17 These particular population characteristics are closely correlated with
labor force status and other characteristics estimated from the sample. Therefore, the variance
of sample estimates based on these characteristics can be reduced when, by use of appropriate
weighting adjustments, the sample population distribution is brought as closely into agreement as
possible with the known distribution of the entire
population with respect to these characteristics.
This is accomplished by adjusting the weights
through a series of benchmarking adjustments.
17 Hispanics may be of any race.

There are five of these primary weighting adjustments in the CPS estimation process:
•	 First-stage weighting.
•	 National coverage step.
•	 State coverage step.
•	 Second-stage weighting.
•	 Composite estimation.

In first-stage weighting, weights are adjusted using the race distribution to reduce the variance caused by PSU sampling. In the national
and state coverage steps, weights are adjusted
in preparation for second-stage weighting. In
second-stage weighting, weights are iteratively
adjusted so that aggregated CPS sample estimates match independent estimates of population
controls in various age/sex/race and age/sex/ethnicity cells at the national level. Adjustments are
also made so that the estimated state populations
from CPS match independent state population
estimates by age and sex. In first-stage weighting,
the population distribution comes from the 2010
Census. For the other steps, the population distribution comes from estimated monthly population
controls.

FIRST-STAGE WEIGHTING
In first-stage weighting, weights are adjusted so that the Black alone/non-Black alone population distribution from the sample NSR PSUs in a state corresponds to the Black alone/non-Black alone population distribution from the 2010 Census for all PSUs in the state.
The purpose is to reduce the contribution to the variance of state estimates arising from the sampling of NSR PSUs. This is called between-PSU variance. For some states, the between-PSU variance makes up a relatively large proportion of the total variance, while the overall contribution of the between-PSU variance at the national level is generally quite small.
There are several factors to be considered in determining what information to use in first-stage weighting. The information must be available for each PSU, correlated with as many of the critical statistics published from the CPS as possible, and reasonably stable over time so that the accuracy gained from the weighting adjustment procedure does not deteriorate. The distribution of the population by race (Black alone/non-Black alone) crossed with age groups 0 to 15 and 16 and over satisfies all three criteria.
By using the four race/age categories, first-stage weighting compensates for the possibility that the racial composition of the sampled NSR PSUs in a state could differ substantially from the racial composition of all NSR PSUs in the state. The adjustment is not necessary for SR PSUs. The weight adjustment factors are computed once and are unchanged until a new sample of NSR PSUs is selected in a state.

Computing First-Stage Adjustment Factors
The first-stage adjustment factors are based on the 2010 Census data and are applied only to sample data for the NSR PSUs. Factors are computed in four race/age cells (Black alone/non-Black alone crossed with 0 to 15 and 16 and over age groups) for each state containing NSR PSUs. The following formula is used to compute the first-stage adjustment factors for each state:

$$FSAF_{sj} = \frac{\sum_{i=1}^{n} C_{sij}}{\sum_{k=1}^{m} C_{skj}/\pi_{sk}}$$

where

FSAF_sj = first-stage adjustment factor for state s and race/age cell j (j = 1, 2, 3, or 4),

C_sij = 2010 Census civilian noninstitutional population for NSR PSU i (sample or nonsample) in state s, race/age cell j,

n = number of total (sampled and nonsampled) NSR PSUs in state s,

C_skj = 2010 Census civilian noninstitutional population for NSR sample PSU k in state s, race/age cell j,

m = number of sampled NSR PSUs,

π_sk = 2010 probability of selection for sample NSR PSU k in state s.

The estimate in the denominator of each of the factors is obtained by multiplying the 2010 Census CNP in the appropriate race/age cell for each NSR sample PSU by the inverse of the probability of selection for that PSU and summing over all NSR sample PSUs in the state.
The Black alone and non-Black alone cells are collapsed within a state when a cell meets one of the following criteria:
•	 The factor (FSAF_sj) is greater than 1.5.
•	 The factor is less than 0.5.
•	 There are fewer than four sampled NSR PSUs in the state.
•	 There are fewer than ten expected interviews in an age/race cell in the state.
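
A sketch of the factor computation for a single state and race/age cell, mirroring the numerator and denominator described above (names and values are illustrative, and the collapsing criteria are omitted):

```python
def fsaf(census_all: list[float], census_sampled: list[float],
         probs: list[float]) -> float:
    """First-stage adjustment factor for one state and race/age cell.

    census_all     -- 2010 Census CNP in the cell for every NSR PSU in
                      the state, sampled or not
    census_sampled -- the same quantity for the sampled NSR PSUs only
    probs          -- selection probabilities of the sampled NSR PSUs
    """
    total = sum(census_all)                                # numerator
    estimate = sum(c / p for c, p in zip(census_sampled, probs))
    return total / estimate                                # weighted estimate

# Three NSR PSUs with cell populations 5,000, 8,000, and 7,000; only
# the second was sampled, with probability 0.4. The weighted estimate
# 8,000 / 0.4 = 20,000 happens to equal the true total, so FSAF = 1.0.
print(fsaf([5000.0, 8000.0, 7000.0], [8000.0], [0.4]))
```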

First-Stage Weights
At the completion of first-stage weighting, the
weight for each responding person is the product
(base weight) x (nonresponse adjustment factor)
x (first-stage adjustment factor)
The weight after the first-stage adjustment is
called the first-stage weight. As in nonresponse
adjustment weighting, records for all individuals
in the same household have the same first-stage
weight, since the adjustments discussed so far
depend only on household and PSU characteristics, not on any respondent characteristics.

NATIONAL COVERAGE STEP
The national coverage step is an adjustment
by race/ethnicity/age/sex groups that ensures
weighted CPS estimates match independent
national population controls. This coverage step
helps correct for interactions between race and
Hispanic ethnicity that are not addressed in
second-stage weighting. Research has shown
that the undercoverage of certain race/ethnicity
combinations (e.g., non-Black Hispanic) cannot
be corrected with second-stage weighting alone.
The national coverage step also helps to speed
the convergence of the second-stage benchmarking process (Robison, Duff, Schneider, and
Shoemaker, 2002).

Computing National Coverage Adjustment Factors
In the national coverage step, adjustment factors are calculated that are based on independently derived estimates of the population. People records are grouped into four pairs (MIS 1 and 5, MIS 2 and 6, MIS 3 and 7, and MIS 4 and 8). Each MIS pair is then adjusted to age/sex/race/ethnicity population controls (see Table 2-3.1) using the following formula:

$$NCAF_{jk} = \frac{C_j}{4\,E_{jk}}$$

where

NCAF_jk = national coverage adjustment factor for cell j and MIS pair k,

C_j = national coverage adjustment control for cell j,

E_jk = weighted tally (using first-stage weights) for cell j and MIS pair k.

The age ranges in Table 2-3.1 are used to maximize demographic detail while limiting extreme cells with: (1) fewer than 20 persons responding each month, and (2) adjustments outside the range 0.6 to 2.0.

PAIRING ROTATION GROUPS
Rotation groups, or months-in-sample (MIS), are paired for the national coverage step, state coverage step, and second-stage weighting. Prior to 2003, second-stage benchmarking to population controls was done separately for each of the eight rotation groups in a given month, labeled MIS 1–MIS 8. There were no coverage steps at that time. Pairing the rotation groups enables the creation of more cell detail in estimation steps. The particular pairing was motivated by the structure of the composite estimation formula, detailed in the Composite Estimation section below, and the observed patterns of MIS bias through 2012 (Erkens, 2012). The pairings are:
•	 MIS 1 and MIS 5.
•	 MIS 2 and MIS 6.
•	 MIS 3 and MIS 7.
•	 MIS 4 and MIS 8.
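
Under the reconstruction above, in which each MIS pair is raked to one-fourth of the national control (consistent with the m_j convention used in second-stage weighting), the computation is one line per pair. A hypothetical sketch:

```python
def ncaf(control: float, pair_tallies: list[float]) -> list[float]:
    """National coverage adjustment factors for one cell: each of the
    four MIS pairs is adjusted to represent one-fourth of the national
    control for that cell."""
    return [control / (4.0 * e) for e in pair_tallies]

# Hypothetical cell: a control of 1,000,000 people and first-stage
# weighted tallies for the four MIS pairs.
print(ncaf(1_000_000.0, [240_000.0, 255_000.0, 248_000.0, 251_000.0]))
```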


Table 2-3.1.
National Coverage Step Cell Definitions
Each race/ethnicity group below is divided into male and female cells for each age range:
•	 White alone, non-Hispanic: single years 0 through 9; 10–11; 12–13; 14; 15; 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45–49; 50–54; 55–59; 60–62; 63–64; 65–69; 70–74; 75 and older.
•	 White alone, Hispanic: single years 0 through 9; 10–11; 12–13; 14; 15; 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45–49; 50–54; 55–64; 65 and older.
•	 Black alone, non-Hispanic: 0–1; 2–4; 5–7; 8–9; 10–11; 12–13; 14; 15; 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45–49; 50–54; 55–64; 65 and older.
•	 Asian alone, non-Hispanic: 0–4; 5–9; 10–15; 16–24; 25–34; 35–44; 45–54; 55–64; 65 and older.
•	 Residual race, non-Hispanic: 0–4; 5–9; 10–15; 16–24; 25–34; 35–44; 45–54; 55–64; 65 and older.
•	 Non-White alone, Hispanic: 0–15; 16 and older.
Note: Taken from Weighting Specifications for the Current Population Survey, memorandum October 23, 2018.
National Coverage Weights
After the completion of the national coverage
step, the weight for each person is the product
(base weight) x (nonresponse adjustment
factor) x (first-stage adjustment factor)
x (national coverage adjustment factor).
This weight will usually vary for people in the
same household due to household members having different demographic characteristics.

STATE COVERAGE STEP
In the state coverage step, weights are adjusted so that CPS estimates by race/sex/age groups match each month’s independent state population controls.18 This coverage step compensates for some differences in race/sex/age coverage by state.

Computing State Coverage Adjustment Factors
Table 2-3.2 shows the maximum number of adjustment cells in each state:
•	 Maximum of six age/sex cells for Black alone, with no pairing of rotation groups.
•	 Maximum of 24 cells for non-Black alone—the same six age/sex groups further split by MIS pair.
Some collapsing is needed to avoid extreme cells with: (1) fewer than 20 people responding each month, and (2) adjustments outside of the range 0.6 to 2.0.

Table 2-3.2.
State Coverage Adjustment Cell Definition
Ages 0–15, 16–44, and 45 and older, each split by sex:
•	 Black alone, all months-in-sample combined (6 cells): male, female.
•	 Non-Black alone, by months-in-sample pair (24 cells): male, female.

In the District of Columbia, all MIS are combined for the non-Black alone group, resulting in six cells. For all other states (including Los Angeles County, balance of California, New York City, and balance of New York State), no further collapsing is done in the non-Black alone cells.
For the Black alone cells, the collapsing varies by state:
•	 No further collapsing: AL, AR, CA (Los Angeles County and balance of CA), CT, DC, DE, FL, GA, IL, LA, MA, MD, MI, MO, MS, NC, NJ, NY (New York City and balance of NY), OH, PA, SC, TN, TX, VA.
•	 Collapse to two-sex cells (males of all ages, females of all ages): AK, AZ, CO, IN, KS, KY, MN, NE, NV, OK, RI, WA, WI, WV.
•	 Collapse to one cell (males and females of all ages): HI, ID, IA, ME, MT, ND, NH, NM, OR, SD, UT, VT, WY.

Each cell is then adjusted to age/sex/race population controls in each state using the following formula:

$$SCAF_{jk} = \frac{C_j}{m_j\,E_{jk}}$$

where

SCAF_jk = state coverage adjustment factor for cell j and MIS pair k,

C_j = state coverage adjustment control for cell j,

E_jk = weighted tally for cell j and MIS pair k,

m_j = number of split MIS pairs in cell j (1 if all MIS combined, 4 if split into pairs).

The independent population controls used for the state coverage step are from the same source as those for second-stage weighting.

18 In the state coverage step, California is split into two parts and each part is treated like a state—Los Angeles County and the rest of California. Similarly, New York is split into two parts with each part being treated as a separate state—New York City (New York, Queens, Bronx, Kings, and Richmond Counties) and the balance of New York. The District of Columbia is generally treated as a state.

State Coverage Weights
After the completion of the state coverage adjustment, the weight for each person is the product
(base weight) x (nonresponse adjustment factor) x (first-stage adjustment factor) x (national coverage adjustment factor) x (state coverage adjustment factor).
This weight will vary for people in the same household due to household members having different demographic characteristics.

SECOND-STAGE WEIGHTING
Second-stage weighting decreases the error in the great majority of sample estimates. Chapter 2-4, Variance Estimation, illustrates the amount of reduction in relative standard errors for key labor force estimates. The benchmark procedure is also a method used to reduce the bias due to coverage errors (see Chapter 4-1, Nonsampling Error). The benchmark procedure adjusts the weights within each MIS pair such that the sample estimates for geographic and demographic subgroups are matched to independent population controls. These independent population controls are updated each month. Three sets of controls are used:
•	 State/sex/age: the CNP for the states (see footnote 18 for state coverage) by sex and age (0–15, 16–44, 45 and older).
•	 Ethnicity/sex/age: total national CNP for 36 Hispanic and non-Hispanic age/sex cells (see Table 2-3.3).
•	 Race/sex/age: total national CNP for 56 White, 36 Black, and 26 “residual race” age/sex cells (see Table 2-3.4).
The specified demographic detail avoids extreme cells with: (1) fewer than 20 people responding monthly, and (2) overall adjustments outside the range 0.6 to 2.0.

Table 2-3.3.
Second-Stage Adjustment Cell by Ethnicity, Age, and Sex
Each age range below is split by sex (male, female) for both Hispanic and non-Hispanic:
0–1; 2–4; 5–7; 8–9; 10–11; 12–13; 14; 15; 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45–49; 50–54; 55–64; 65 and older.
Note: Taken from Weighting Specifications for the Current Population Survey, memorandum October 23, 2018.


Table 2-3.4.
Second-Stage Adjustment Cell by Race, Age, and Sex
Each age range below is split by sex (male, female) within the race group:
•	 White alone: single years 0 through 9; 10–11; 12–13; 14; 15; 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45–49; 50–54; 55–59; 60–62; 63–64; 65–69; 70–74; 75 and older.
•	 Black alone: 0–1; 2–4; 5–7; 8–9; 10–11; 12–13; 14; 15; 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45–49; 50–54; 55–64; 65 and older.
•	 Residual race: 0–1; 2–4; 5–7; 8–9; 10–11; 12–13; 14–15; 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45–49; 50–54; 55–64; 65 and older.
Note: Taken from Weighting Specifications for the Current Population Survey, memorandum October 23, 2018.

The adjustment is done separately for each MIS
pair (MIS 1 and 5, MIS 2 and 6, MIS 3 and 7, and
MIS 4 and 8). Adjusting the weights to match
one set of controls can cause differences in other
controls, so an iterative process is used to simultaneously control all variables. Successive iterations
begin with the weights as adjusted by all previous
iterations. A total of ten iterations is performed,
which results in near consistency between the
sample estimates and population controls. The
three-dimensional (state/sex/age, ethnicity/sex/
age, race/sex/age) weighting adjustment is also
known as iterative proportional fitting, raking ratio
estimation, or raking.
In addition to reducing the error in many CPS estimates and converging to the population controls within ten iterations for most items, this estimator minimizes the statistic

$$\sum_{i} W_{2,i} \ln\left(\frac{W_{2,i}}{W_{1,i}}\right)$$

where

W_2,i = weight for the ith sample record after the second-stage adjustment,

W_1,i = weight for the ith record after the first-stage adjustment.

Thus, the raking adjusts the weights of the records so that the sample estimates converge to the population controls while minimally affecting the weights after the state coverage adjustment. Ireland and Kullback (1968) provide more details on the properties of raking ratio estimation.
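
A minimal two-dimensional illustration of raking ratio estimation (the CPS rake is three-dimensional and uses ten iterations, but the mechanics are the same; the tallies and controls below are hypothetical):

```python
def rake(table: list[list[float]], row_controls: list[float],
         col_controls: list[float], iterations: int = 10) -> list[list[float]]:
    """Two-dimensional raking (iterative proportional fitting):
    alternately scale rows and columns to their control totals."""
    for _ in range(iterations):
        for i, target in enumerate(row_controls):        # row step
            row_sum = sum(table[i])
            table[i] = [v * target / row_sum for v in table[i]]
        for j, target in enumerate(col_controls):        # column step
            col_sum = sum(row[j] for row in table)
            for row in table:
                row[j] *= target / col_sum
    return table

# Hypothetical weighted tallies for two cells in each of two dimensions.
tallies = [[90.0, 60.0],
           [40.0, 110.0]]
raked = rake(tallies, row_controls=[160.0, 140.0], col_controls=[120.0, 180.0])
print([[round(v, 1) for v in row] for row in raked])
```

Each pass matches one dimension exactly while slightly disturbing the other; with successive iterations the discrepancy shrinks, which is the convergence behavior described above.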

Computing Second-Stage Adjustment Factors
As mentioned before, second-stage weighting involves a three-step weighting adjustment, or rake:
•	 State step: state/sex/age.
•	 Ethnicity step: ethnicity/sex/age.
•	 Race step: race/sex/age.
Second-stage adjustment factors are successively calculated for each raking step, using estimates based on the most recently iterated weights as the basis for each successive adjustment. Since every iterated weighting adjustment is incorporated at every step of the three-step raking procedure, the successive estimates gradually converge to the population controls in all three dimensions.

$$SSAF_{ijk} = \frac{C_j}{m_j\,E_{i-1,jk}}$$

where

SSAF_ijk = second-stage adjustment factor for cell j and MIS pair k after iteration i,

C_j = second-stage adjustment control for cell j,

E_i-1,jk = weighted tally for cell j and MIS pair k at iteration i−1, tabulated using the intermediate second-stage weights after iteration i−1 (when i−1 = 0, this is equal to the estimate tabulated using state-coverage weights),

m_j = number of split MIS pairs in cell j (1 if all MIS combined, 4 if split into pairs).

Each of the three raking steps is iterated ten times, enough for most estimates to fully converge.
Note that the matching of estimates to controls for one dimension causes the cells to differ slightly from the controls for the previous dimension. With each successive iteration, these differences decrease. For most cells, after ten iterations, the estimates have converged to the population controls. Thus, the weight for each record after second-stage weighting can be thought of as the weight for the record after the state coverage adjustment multiplied by a series of 30 adjustment factors (ten iterations, each with three raking steps).

Second-Stage Weights
At the completion of second-stage (SS) weighting, the record for each person has a weight reflecting the product of:
(base weight) x (nonresponse adjustment factor) x (first-stage adjustment factor) x (national coverage adjustment factor) x (state coverage adjustment factor) x (second-stage adjustment factor).
The estimates produced after second-stage weighting are referred to as second-stage estimates. Once each record has a second-stage weight, an estimate for any given set of characteristics identifiable in the CPS can be computed by summing the second-stage weights for all respondents that have that set of characteristics. The process for producing this type of estimate is referred to as a Horvitz-Thompson estimator or a simple weighted estimator.

COMPOSITE ESTIMATION
In general, a composite estimate is a weighted average of several estimates. Most official CPS labor force estimates are derived using a composite estimator. Historically, the CPS composite estimate consisted only of the second-stage estimate, the composite from the preceding month, and an estimate of change from preceding to current month. Over time, the CPS refined the composite, updating the weights used in the weighted average as well as adding a component that captures the net difference between the incoming and continuing parts of the current month’s sample. In 1998, the BLS introduced a compositing method that allows more operational simplicity for microdata users as well as determining better compositing coefficients for different labor force categories.
Breaking down the CPS composite estimator into more detail and keeping in mind the underlying 4-8-4 rotation pattern, the estimator and its separate components are understood in terms of the following expression:

$$Y''_t = (1-K)\sum_{i=1}^{8} x_{ti} + K\left[Y''_{t-1} + \frac{4}{3}\left(\sum_{i\neq 1,5} x_{ti} - \sum_{i\neq 4,8} x_{t-1,i}\right)\right] + A\,\beta_t, \qquad \beta_t = \sum_{i=1,5} x_{ti} - \frac{1}{3}\sum_{i\neq 1,5} x_{ti}$$

where

i = MIS 1, 2, ..., 8,

x_ti = sum of second-stage weights of respondents in month t and MIS i with characteristic of interest,

K = 0.4 for unemployed, 0.7 for employed,

A = 0.3 for unemployed, 0.4 for employed.

K determines the weight to get the weighted average of two estimators for the current month: (1) the current month’s second-stage estimate and (2) the sum of the previous month’s composite estimator and an estimator of the change since the previous month. The estimate of change is based on data from sample households in the six rotation groups common to months t and t−1 (about 75 percent). Higher correlation in the estimates coming from this part of the sample tends to reduce variance of the estimated month-to-month change. A determines the weight of β_t, a term adjusting for the difference between the continuing months in sample and the new months in sample. It is intended to reduce both the variance of the composite estimator and the bias associated with time in sample (Breau and Ernst, 1983; and Bailar, 1975). The values given above for the constant coefficients A and K were selected after extensive review of many series for minimum variance for month-to-month change estimates of unemployment and employment.
The composite estimator for the CPS provides, on average, an overall improvement in the variance for estimates of month-to-month change. The composite also provides additional improvements for estimates of change over longer intervals of time, for a given month (Breau and Ernst, 1983).
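
A sketch of the composite estimator as reconstructed above, for a single characteristic (the tallies are hypothetical; index 0 corresponds to MIS 1):

```python
def composite(x_t: list[float], x_prev: list[float], y_prev: float,
              K: float = 0.4, A: float = 0.3) -> float:
    """AK composite estimate for month t of one characteristic.

    x_t, x_prev -- second-stage tallies by MIS (index 0 = MIS 1)
                   for months t and t-1
    y_prev      -- composite estimate for month t-1
    K, A        -- shown with the unemployment values; use K=0.7,
                   A=0.4 for employment
    """
    y_t = sum(x_t)                                       # second-stage estimate
    continuing_t = sum(x_t[i] for i in range(8) if i not in (0, 4))
    continuing_prev = sum(x_prev[i] for i in range(8) if i not in (3, 7))
    delta = (4.0 / 3.0) * (continuing_t - continuing_prev)  # change estimate
    beta = (x_t[0] + x_t[4]) - continuing_t / 3.0            # incoming vs. continuing
    return (1.0 - K) * y_t + K * (y_prev + delta) + A * beta

# With flat tallies the change and bias terms behave as expected:
print(composite(x_t=[820.0] * 8, x_prev=[800.0] * 8, y_prev=6400.0))  # 6560.0
```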

Computing Composite Weights
Weights are derived for each record that, when aggregated, produce estimates consistent with those produced by the composite estimator. Composite estimation is performed at the macro level. The composite weights allow one month of microdata to be used to produce estimates consistent with the composite estimators for any given month.
The composite weighting method involves two steps: (1) the computation of composite estimates for the main labor force categories, classified by important demographic characteristics; and (2) the adjustment of the microdata weights, through a series of weighting adjustments, to agree with these composite estimates, thus incorporating the effect of composite estimation into the microdata weights. Under this procedure, the sum of the composite weights of all sample people in a particular labor force category equals the composite estimate of the level for that category. To produce a composite estimate for a particular month, a data user may simply access the microdata file for that month and compute a weighted sum. This composite weighting approach also improves the accuracy of labor force estimates by using different compositing coefficients for different labor force categories. The weighting adjustment method assures additivity while allowing this variation in compositing coefficients.
Composite weights are produced only for sample
persons aged 16 or older. The method of computing composite weights for the CPS imitates
second-stage weighting. Sample person weights
are raked to force their sums to equal control
totals. Composite labor force estimates are used
as controls in place of independent population
controls. The composite raking process is performed separately within each of the three major
labor force categories: employed (E), unemployed
(UE), and not in the labor force (NILF). These
three labor force totals sum to the CNP. Since CNP
is a population control, and E and UE are directly
estimated, the NILF is indirectly computed as the
residual:

NILF = CNP − (E + UE)

The composite weighting process is similar to the raking process used to compute the second-stage weights. Within each labor force category, a three-dimensional rake is applied, using the second-stage estimate at the state level as one step and national second-stage estimates for age/sex/race (Table 2-3.5) and age/sex/ethnicity (Table 2-3.6) as the second and third steps.


Table 2-3.5.
Composite National Race Cell Definition
Each age range below is split by sex (male, female) within the race group:
•	 White alone: 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45–49; 50–54; 55–59; 60–64; 65 and older.
•	 Black alone: 16–19; 20–24; 25–29; 30–34; 35–39; 40–44; 45 and older.
•	 Residual race: 16–19; 20–24; 25–34; 35–44; 45 and older.
Note: Taken from Weighting Specifications for the Current Population Survey, memorandum October 23, 2018.

Table 2-3.6.
Composite National Ethnicity Cell Definition

Hispanic (male and female cells for each age group):
16–19, 20–24, 25–34, 35–44, 45 and older

Non-Hispanic (male and female cells for each age group):
16–19, 20–24, 25–34, 35–44, 45 and older

Note: Taken from Weighting Specifications for the Current Population Survey, memorandum October 23, 2018.

Data from all eight rotation groups are combined
for the purpose of computing composite weights.
As with second-stage weighting, each iteration of
the composite weight calculation uses the most
recently adjusted weights as the basis. The formulas are analogous. Like second-stage weighting,
ten iterations are performed in each of the three
dimensions for each of the three composited labor
force groups (E, UE, and NILF).
While E and UE are estimated directly in composite estimation, NILF is indirectly calculated as
the difference between the relevant population
control and the sum of E and UE, which is the estimate of the civilian labor force.
The specified demographic detail avoids extreme
cells with: (1) fewer than 10 people responding
monthly, and (2) overall adjustments outside the
range 0.7 to 1.3.

Final Weights
The composite weights are the final CPS weights
and the weights used to compute most BLS published estimates. Within each rotation group or
paired MIS, the sum of the final weights does not
match the independent population controls, because
data from all eight rotation groups are combined
to form the composite weights. However, the sum
of the final weights for the entire sample will
match the independent population controls.

PRODUCING OTHER LABOR FORCE
ESTIMATES
In addition to base weighting to produce estimates for people, several special-purpose weighting procedures are performed each month. These
include:

•	Weighting to produce estimates for households and families.

•	Weighting to produce estimates from data based on only two of the eight rotation groups (the outgoing rotation groups).

•	Weighting to produce labor force estimates for veterans and nonveterans (veterans’ weighting).

Most special weights are based on second-stage
weights. Some also make use of composited
estimates. In addition, consecutive months of data
are often used to produce quarterly or annual
average estimates. Each of these procedures is
described in more detail below.

Family Weight
Family weights are used to produce estimates
related to families and family composition. They
also provide the basis for household weights.
More than one family may be identified in a HU.
For families maintained by women or men with no
spouse present, the family weight is equivalent to
the second-stage weight of the reference person.
For married-couple families, the family weight is
equivalent to the wife’s second-stage weight (in
same-sex marriages, the family weight is the second-stage weight of the reference person).
Weighted tabulations of CPS data by sex and
marital status show the same number of married
women and married men with their spouses present. The wife’s weight is used as the family weight,
because CPS coverage ratios for women tend to
be higher and subject to less month-to-month
variability than those for men.
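A minimal sketch of this assignment rule, with hypothetical field names standing in for CPS microdata items:

```python
# Hedged sketch of the family-weight rule described above (hypothetical
# field names, not actual CPS variable names).
def family_weight(family):
    """family: dict with 'type' and second-stage weights of key members."""
    if family["type"] == "married-couple":
        # Wife's second-stage weight for opposite-sex couples; the reference
        # person's weight for same-sex married couples.
        if family.get("same_sex"):
            return family["reference_person_weight"]
        return family["wife_weight"]
    # Families maintained by a woman or man with no spouse present.
    return family["reference_person_weight"]
```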

Household Weight
The same household weight is assigned to every
person in the same HU and is equal to the family
weight of the household reference person. The
household weight can be used to produce estimates at the household level. This varies from
family weight in the situations where more than
one family is living in the same household.

Outgoing Rotation Weights
Some items in the CPS questionnaire are asked
only in HUs due to rotate out of the sample temporarily (MIS 4) or permanently (MIS 8) after the
current month. These are referred to as the outgoing rotation groups. Items asked in the outgoing
rotations include those on earnings (since 1979),
union affiliation (since 1983), and I&O of second
jobs of multiple jobholders (beginning in 1994).
Since the data are collected from only one-fourth
of the sample each month, data from 3 or 12
months are used to improve their reliability, and
published as quarterly or annual averages.
Since 1979, most CPS files have included separate
weights for the outgoing rotations. An individual’s
outgoing rotation weight is approximately four
times that of his or her final weight. However,
these outgoing rotation weights are benchmarked
to controls for employment status, which are
based on the composited estimates of E, UE, and
NILF each month from the full sample. The cells
are classified by age, race, sex, and employment
status: employed wage and salary workers, other
employed, UE, and NILF. The outgoing adjustment
factor for each cell is computed by taking the ratio
of the total composite weights in the control to
the total second-stage weights for the outgoing
rotation groups.
The outgoing rotation weights are obtained by
multiplying the outgoing adjustment factors by
the second-stage weights. For consistency, an
outgoing rotation group weight of four times the
basic CPS family weight is assigned to all people
in the two outgoing rotation groups who were not
eligible for this special weighting (mainly military
personnel and people aged 15 and younger).
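The benchmarking step described above amounts to a cell-level ratio adjustment; a hedged sketch follows (array and dictionary names are hypothetical):

```python
import numpy as np

# Hedged sketch of outgoing rotation weighting: within each age/race/sex/
# employment-status cell, second-stage weights of MIS 4 and MIS 8 persons
# are ratio-adjusted to full-sample, composite-based controls.
def outgoing_rotation_weights(ss_weights, cells, composite_controls):
    """ss_weights: second-stage weights of outgoing-rotation persons.
    cells: cell identifier for each person (numpy array).
    composite_controls: dict mapping cell -> total composite weight in that
    cell from the full sample (the benchmark)."""
    orw = ss_weights.astype(float).copy()
    for cell, control in composite_controls.items():
        mask = cells == cell
        total = orw[mask].sum()
        if total > 0:
            # Adjustment factor: composite control over the outgoing-rotation
            # second-stage total; roughly 4, since the outgoing groups are
            # about one-fourth of the sample.
            orw[mask] *= control / total
    return orw
```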
Production of monthly, quarterly, and annual
estimates using the outgoing rotation weights is
completely parallel to production of second-stage
estimates from the full sample—the weights are
summed and divided by the number of months
used.
The composite estimator is not applicable for
these estimates because there is no overlap
between the quarter samples in consecutive
months.

Family Outgoing Rotation Weight
The family outgoing rotation weight is analogous
to the family weight computed for the full sample,
except that outgoing rotation weights are used,
rather than second-stage weights.

Veterans’ Weights
Since 1986, CPS interviewers have collected
data on veteran status from all household members. Veterans’ weights are calculated for all CPS
household members based on their veteran status.
This information is used to produce tabulations of
employment status for veterans and nonveterans.
Each individual is classified as a veteran or a nonveteran. Veterans are currently classified into cells
based on age, sex, and veteran status (Gulf War,
Other War, or Non-War). The composite weights
for CPS veterans are tallied into type-of-veteran/
sex/age cells. Separate adjustment factors are
computed for each cell, using independently
established monthly estimates of veterans provided by the Department of Veterans Affairs. The
adjustment factor is the ratio of the independent
control total to the sample estimate total for each
cell. This adjustment factor is multiplied by the
composite weight for each veteran to produce the
veterans’ weight.
To compute veterans’ weights for nonveterans, a
table of composited estimates is produced from
the CPS data by race (White alone/non-White
alone), sex, age, and labor force status (E, UE,
and NILF). The veterans’ weights produced in the
previous step are tallied into the same cells. The
estimated number of veterans is then subtracted
from the corresponding cell entry for the composited table to produce nonveterans control totals.
The composite weights for CPS nonveterans are
tallied into the same race/sex/age/labor force
status cells. The adjustment factor for each cell
is the ratio of the nonveteran control total to the
sample estimate total in each cell. The composite
weight for each nonveteran is multiplied by the
adjustment factor to produce the veterans’ weight
for nonveterans.
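A hedged sketch of the two adjustment steps, operating on hypothetical cell-level aggregates:

```python
# Hedged sketch of the two-step veterans' weighting described above; the
# dictionaries are hypothetical cell-level aggregates keyed by demographic
# cell (e.g., type-of-veteran/sex/age or race/sex/age/labor force status).
def veteran_adjustment_factors(composite_vet_totals, va_controls):
    # Step 1: ratio of the independent VA control to the tallied composite
    # weights of veterans in each cell.
    return {cell: va_controls[cell] / composite_vet_totals[cell]
            for cell in va_controls}

def nonveteran_adjustment_factors(composited_totals, veteran_estimates,
                                  composite_nonvet_totals):
    # Step 2: nonveteran control = composited total minus the weighted
    # veteran estimate in the same cell; the factor is that control over
    # the tallied composite weights of nonveterans.
    return {cell: (composited_totals[cell] - veteran_estimates[cell])
                  / composite_nonvet_totals[cell]
            for cell in composited_totals}
```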

Estimates of Averages
CPS frequently produces estimates of averages,
using multiple months of data. The most commonly computed averages are: (1) quarterly,
which provide four estimates per year by grouping
the months of the calendar year in nonoverlapping intervals of three, and (2) annual, combining
all 12 months of the calendar year. Quarterly and
annual averages can be computed by summing
the weights for all of the months contributing
to each average and dividing by the number of
months involved. Averages for calculated cells,
such as rates, percentages, means, and medians,
are computed from the averages for the component levels, not by averaging the monthly values
(for example, a quarterly average unemployment
rate is computed by taking the quarterly average
unemployment level as a percentage of the quarterly average labor force level, not by averaging
the three monthly unemployment rates together).

Although such averaging increases the number of interviews contributing to the resulting estimates by a factor of approximately the number of
months involved in the average, the sampling variance for the average estimate is actually reduced
by a factor substantially less than that number of
months. This is primarily because the CPS rotation
pattern and resulting month-to-month overlap in
sample units mean that people respond in several
months and estimates from the individual months
are not independent.
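A short worked example of the averaging rule, with purely illustrative numbers:

```python
# A quarterly average rate is the ratio of quarterly average levels, not the
# average of the three monthly rates. All figures below are illustrative.
unemployed = [6_100_000, 6_000_000, 5_900_000]          # monthly UE levels
labor_force = [160_000_000, 161_000_000, 159_000_000]   # monthly CLF levels

avg_ue = sum(unemployed) / 3
avg_clf = sum(labor_force) / 3
quarterly_rate = 100 * avg_ue / avg_clf   # correct: ratio of average levels

wrong_rate = sum(100 * u / lf for u, lf in zip(unemployed, labor_force)) / 3
# quarterly_rate and wrong_rate differ slightly whenever the denominator
# (the labor force level) varies across the months.
```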

DERIVATION OF INDEPENDENT
POPULATION CONTROLS
The Census Bureau’s Population Estimates
Program (PEP) produces and publishes estimates
of the population for the nation, states, counties,
state/county equivalents, cities, towns, and for
Puerto Rico and its municipios. The PEP estimates
the population for each year following the most
recent decennial census using measures of population change. These population estimates are
used for federal funding allocations, as controls
for major surveys including the CPS and the ACS,
for community development, to aid business planning, and as denominators for statistical rates.
The PEP produces estimates of the resident
population, which includes all people currently
residing in the United States as well as estimates
of selected subpopulations. A more complete
description of PEP methodology can be found on the Census Bureau's website.

REFERENCES
Bailar, B. A., “The Effects of Rotation Group Bias
on Estimates from Panel Surveys,” Journal of the
American Statistical Association, Vol. 70, 1975,
pp. 23−30.
Breau, P., and L. Ernst, “Alternative Estimators to
the Current Composite Estimator,” Proceedings
of the Section on Survey Research Methods,
American Statistical Association, 1983,
pp. 397−402.
Erkens, G., “Changes in Panel Bias in the
U.S. Current Population Survey and its Effects
on Labor Force Estimates,” Proceedings of the
Section on Survey Research Methods, American
Statistical Association, 2012, pp. 4220−4232.


Ireland, C. T., and S. Kullback, “Contingency Tables
with Given Marginals,” Biometrika, 55, 1968,
pp. 179−187.
Robison, E., M. Duff, B. Schneider, and
H. Shoemaker, “Redesign of Current Population
Survey Raking to Control Totals,” Proceedings of
the Survey Research Methods Section, American
Statistical Association, 2002.

FURTHER READING
Bell, W. R., and S. C. Hillmer, “The Time Series
Approach to Estimation for Repeated Surveys,”
Survey Methodology, 16, 1990, pp. 195−215.
Copeland, K. R., F. K. Peitzmeier, and C. E. Hoy,
“An Alternative Method of Controlling Current
Population Survey Estimates to Population
Counts,” Proceedings of the Survey Research
Methods Section, American Statistical Association,
1986, pp. 332−339.
Evans, T. D., R. B. Tiller, and T. S. Zimmerman,
“Time Series Models for State Labor Force
Estimates,” Proceedings of the Survey Research
Methods Section, American Statistical Association,
1993, pp. 358−363.
Hansen, M. H., W. N. Hurwitz, and W. G. Madow,
Sample Survey Methods and Theory, Vol. II, John
Wiley and Sons, New York, 1953.
Harvey, A. C., Forecasting, Structural Time
Series Models and the Kalman Filter, Cambridge
University Press, Cambridge, 1989.
Kostanich, D., and P. Bettin, “Choosing a
Composite Estimator for CPS,” presented at
the International Symposium on Panel Surveys,
Washington, DC, 1986.


Lent, J., S. Miller, and P. Cantwell, “Composite
Weights for the Current Population Survey,”
Proceedings of the Survey Research Methods
Section, American Statistical Association, 1994,
pp. 867−872.
Lent, J., S. Miller, P. Cantwell, and M. Duff, “Effect
of Composite Weights on Some Estimates from
the Current Population Survey,” Journal of Official
Statistics, Vol. 15, No. 3, 1999, pp. 431−438.
Oh, H. L., and F. Scheuren, “Some Unresolved
Application Issues in Raking Estimation,”
Proceedings of the Section on Survey Research
Methods, American Statistical Association, 1978,
pp. 723−728.
Scott, J. J., and T. M. F. Smith, “Analysis of
Repeated Surveys Using Time Series Methods,”
Journal of the American Statistical Association, 69,
1974, pp. 674−678.
Thompson, J. H., “Convergence Properties of the
Iterative 1980 Census Estimator,” Proceedings
of the American Statistical Association on
Survey Research Methods, American Statistical
Association, 1981, pp. 182−185.
Tiller, R. B., “A Kalman Filter Approach to Labor
Force Estimation Using Survey Data,” Proceedings
of the Survey Research Methods Section,
American Statistical Association, 1989, pp. 16−25.
Tiller, R. B., “Time Series Modeling of Sample
Survey Data from the U.S. Current Population
Survey,” Journal of Official Statistics, 8, 1992, pp.
149−166.
Zimmerman, T. S., T. D. Evans, and R. B. Tiller,
“State Unemployment Rate Time Series,”
Proceedings of the Survey Research Methods
Section, American Statistical Association, 1994,
pp. 1077−1082.


Chapter 2-4: Variance Estimation
INTRODUCTION
Variance estimation of major CPS statistics serves
the following two objectives:
•	

Estimate the variance of the survey estimates
for use in various statistical analyses.

•	

Evaluate the effect of each of the stages of
sampling and estimation on the overall precision of the survey estimates.

CPS variance estimates take into account the
magnitude of the sampling error as well as the
effects of some nonsampling errors, such as
nonresponse and coverage error. Chapter 4.1,
Nonsampling Error, provides additional information on these topics. Certain aspects of the CPS
sample design, such as the use of one sample PSU
per NSR stratum and the use of cluster subsampling within PSUs, make it impossible to obtain a
completely unbiased estimate of the total variance. The use of ratio adjustments in the estimation procedure also contributes to this problem.
Although imperfect, the current variance estimation procedure is accurate enough for all practical
uses of the data, and captures the effects of sample selection and estimation on the total variance.
Variance estimates of selected characteristics, and
tables that show the effects of estimation steps on
variances, are presented at the end of this chapter.

CURRENT POPULATION SURVEY
REPLICATION METHODS
Replication methods are able to provide satisfactory estimates of variance for a wide variety of
designs that use probability sampling, even when
complex estimation procedures are used (Dippo et
al., 1984). This method presumes that the sample
selection, the collection of data, and the estimation procedures are independently carried out
(replicated) several times. The dispersion of the
resulting estimates can be used to measure the
variance of the full sample.
It would not be feasible to repeat the entire
CPS several times each month simply to obtain
variance estimates. A practical alternative is to
perturb the weights of the full sample many different times to create new artificial samples, and
to apply the regular CPS estimation procedures to
these subsamples, or replicates. While CPS uses
a fixed number of replicates, the optimal number
of replicates can vary by estimate series, meaning
that replication variances for any particular estimate may be suboptimal.

Replication Methods for 1970 and 1980
Designs
Historically, computational burden was balanced
with optimality considerations to determine the
number of replicates to use. Prior to the 1970 CPS
design, variance estimates were computed using
40 replicates. The replicates were subjected to
only second-stage weighting for the same age/
sex/race categories used for the full sample at
the time. Nonresponse and first-stage weighting
adjustments were not replicated. (See Chapter
2-3, Weighting and Estimation, for more context
on CPS weighting.) Even with these simplifications, limited computer capacity allowed the computation of variances for only 14 characteristics.
For the 1970 design, an adaptation of the Keyfitz
method of calculating variances was used (Keyfitz,
1957). These variance estimates were derived
using the Taylor approximation, dropping terms
with derivatives higher than the first. By 1980,
improvements in computer memory capacity
allowed the calculation of variance estimates for
many characteristics, with replication of all stages
of the weighting through compositing.
Starting with the 1980 design, variances were
computed using a modified balanced half-sample
approach. The sample was divided in half 48 times
to form replicates that retained all the features
of the sample design, such as stratification and
within-PSU sample selection. For total variance,
a pseudo first-stage design was imposed on the
CPS by dividing large SR PSUs into smaller pieces
called Standard Error Computation Units (SECUs)
and combining small NSR PSUs into paired strata
or pseudostrata. One NSR PSU was selected
randomly from each pseudostratum for each
replicate.
Forming these pseudostrata was necessary since
the first stage of the sample design has only one
NSR PSU per stratum in the sample. However,
pairing the original strata for variance estimation
purposes creates an upward bias in the variance
estimator. For SR PSUs, each SECU was divided
into two panels, and one panel was selected
for each replicate. One column of a 48-by-48
Hadamard orthogonal matrix was assigned to
each SECU, or pseudostratum. The unbiased
weights were multiplied by replicate factors of 1.5
for the selected panel and 0.5 for the other panel
in the SR SECU or NSR pseudostratum (Dippo et
al., 1984). Thus, the full sample was included in
each replicate, but the matrix determined differing
weights for the half samples. These 48 replicates
were processed through all stages of the CPS
weighting through compositing. The estimated
variance for the characteristic of interest was computed by summing a squared difference between
each replicate estimate (Ŷr ) and the full sample
estimate (Ŷo ). The complete formula19 is

Due to costs and computer limitations, variance
estimates were calculated for only 13 months
(January 1987 through January 1988) and for
only about 600 estimates at the national level.
Replication estimates of variances at the subnational level were not reliable because of the
small number of SECUs available (Lent, 1991).
Generalized sampling errors (explained below)
were calculated based on the 13 months of variance estimates. (See Wolter, 1985; Fay, 1984; or
Fay, 1989 for more details on half-sample replication for variance estimation.)
¹⁹ The basic formula for balanced half-sample replication uses replicate factors of 2 and 0 with the formula $\hat{V}(\hat{Y}_0) = \frac{1}{k}\sum_{r=1}^{k}(\hat{Y}_r-\hat{Y}_0)^2$, where k is the number of replicates. The factor of 4 in our variance estimator is the result of using a modified formula with replicate factors of 1.5 and 0.5. See Dippo et al. (1984) for more details.

Replication Methods for 1990, 2000, and
2010 Designs
The general goal of the current variance estimation methodology, the method in use since
July 1995, is to produce consistent variances
and covariances for each month over the entire
life of the design. Periodic maintenance reductions in sample size and the continuous addition
of new construction to the sample complicated
the strategy to achieve this goal. However,
research has shown that variance estimates are
not adversely affected as long as the cumulative
effect of the reductions is less than 20 percent of
the prereduction sample size (Kostanich, 1996).
Assigning all future new construction sample to
replicates when the variance subsamples are originally defined provides the basis for consistency
over time in the variance estimates.
The current approach to estimating the design
variances is called successive difference replication. The theoretical basis for the successive difference method was discussed by Wolter (1984)
and extended by Fay and Train (1995) to produce
the successive difference replication method
used for the CPS. The following is a description
of the application of this method. Successive
USUs are paired in the order of their selection to
take advantage of the systematic nature of the
CPS within-PSU sampling scheme.20 Each USU
usually occurs in two consecutive pairs: (USU1,
USU2), (USU2, USU3), (USU3, USU4), etc. A pair
is therefore similar to a SECU in the 1980 design
variance methodology. For each USU within a
PSU, two pairs (or SECUs) of neighboring USUs
are defined based on the order of selection—one
with the USU selected before and one with the
USU selected after it. This procedure allows USUs
adjacent in the sort order to be assigned to the
same SECU, thus better reflecting the systematic
sampling in the variance estimator. In addition, a
large increase in the number of SECUs and in the
number of replicates (160 vs. 48) starting with the
1990 design increases the precision of the variance estimator.
²⁰ An ultimate sampling unit is usually a group of four HUs.
The 2010 sample design introduced two major
changes in methodology. First, the samples are
now selected on an annual rather than decadal
basis. Second, a new variance estimator was
introduced that includes rotation group among
the sort criteria for successive difference replication. This estimator allows estimation of variances
and covariances at the rotation group level. See
Chapter 2.2, Sample Design, for more information
about CPS sampling procedures and 2010 design
changes.

Replicate Factors for Total Variance
Total variance is composed of two types of variance, the variance due to sampling of HUs within
PSUs during second-stage sampling (within-PSU
variance) and the variance due to the selection of
PSUs during first-stage sampling (between-PSU
variance). Due to the selection of one PSU within
NSR stratum in the CPS Design, between-PSU
variance cannot be estimated directly using this
methodology, but it can be estimated as the difference between the estimates of total variance
and within-PSU variance.
To produce estimates of total variance, replicates
are formed differently for SR and NSR samples.
For NSR PSUs, the original strata within a state
are collapsed into pseudostrata of pairs and at
most one triplet for states with an odd number
of NSR PSUs. After that collapsing, a replication
method of the Collapsed Stratum Estimator is
used to assign replicate factors to units in these
PSUs (Wolter, 1985). In pseudostrata containing
a pair of NSR PSUs, replicate factors of 1.5 or 0.5
adjust the weights.21 These factors are assigned
based on a single row from the Hadamard matrix
and are further adjusted to account for unequal
sizes of the original strata within the pseudostratum (Wolter, 1985). In pseudostrata containing
a triplet, for the 1990 design, two rows from the
Hadamard matrix were assigned to the pseudostratum, resulting in replicate factors of approximately 0.5, 1.7, and 0.8; or 1.5, 0.3, and 1.2 for the
three PSUs assuming equal sizes of the original
strata. However, for the 2000 design, these factors
were further adjusted to account for unequal sizes
of the original strata within the pseudostratum. All
USUs in a pseudostratum are assigned the same
row number(s).
²¹ Replicate factors are calculated using a 160-by-160 Hadamard matrix.
In SR strata, a single PSU is picked with probability 1; therefore, there is no contribution from
between-PSU variance, and the total variance is
the within-PSU variance. Successive difference
replication is used in these PSUs to create replicate factors. For an SR sample, two rows of the
Hadamard matrix are assigned to each pair of USUs
creating replicate factors fir for r = 1, … , 160:
where

ai,r = a number in the Hadamard matrix (+1 or
−1) for the ith USU in the systematic sam-

ple. This formula yields replicate factors of
approximately 1.7, 1.0, or 0.3.
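A scaled-down illustration of these replicate factors and of the SDR variance formula shown below (the CPS assigns two specific rows per USU from a 160-by-160 Hadamard matrix; here scipy's power-of-two construction and a simplified neighbor pairing stand in):

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(7)
k = 16                       # number of replicates (160 in the CPS)
H = hadamard(k)              # entries are +1/-1
n_usu = 50
weights = np.full(n_usu, 100.0)           # toy base weights
y = rng.normal(10.0, 2.0, n_usu)          # toy characteristic per USU

# Rows i and i+1 of the Hadamard matrix stand in for the two rows assigned
# to each USU based on the sample sort order.
rows1 = np.arange(n_usu) % k
rows2 = (np.arange(n_usu) + 1) % k
f = 1.0 + 2.0 ** -1.5 * H[rows1] - 2.0 ** -1.5 * H[rows2]  # factors ~1.7/1/0.3

y0 = np.sum(weights * y)                           # full-sample estimate
yr = np.sum((weights * y)[:, None] * f, axis=0)    # k replicate estimates
var = (4.0 / k) * np.sum((yr - y0) ** 2)           # SDR variance estimate
```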

As in the 1980 methodology, the unbiased weights
(base weight × special weighting factor) are multiplied by the replicate factors to produce unbiased
replicate weights. These unbiased replicate
weights are further adjusted through the weighting steps applied to the full sample, as described
in Chapter 2-3, Weighting and Estimation. A
variance estimator for the characteristic of interest
is a sum of squared differences between each replicate estimate ($\hat{Y}_r$) and the full sample estimate
($\hat{Y}_0$). The formula is

$$\hat{V}(\hat{Y}_0) = \frac{4}{160}\sum_{r=1}^{160}\left(\hat{Y}_r-\hat{Y}_0\right)^2$$

The replicate factors 1.7, 1.0, and 0.3 for the
SR portion of the sample were specifically constructed to yield a numerator of 4 in the above
formula, so that the formula remains consistent
between SR and NSR areas (Fay and Train, 1995).

Replicate Factors for Within-Primary
Sampling Unit Variance
All PSUs (NSR or SR) undergo systematic sampling during second-stage sampling; therefore,
successive difference replication can be used to
estimate within-PSU variance. For SR PSUs, the
same replicate factors from total variance estimation are assigned to the within-PSU estimator. This
is because the PSUs were picked with probability
1 during first-stage sampling, so only within-PSU
variance contributes to the total variance. For
sample in an NSR PSU, successive difference
replication is also used in order to estimate the
contribution to variance from second-stage
sampling (within-PSU variance). Alternate row
assignments are made for USUs to form pairs of
USUs in the same manner that was used for the
SR assignments. Thus, for within-PSU variance, all
USUs (both SR and NSR) have replicate factors of
approximately 1.7, 1.0, or 0.3.
The successive difference replication method is
used to calculate total national variances and
within-PSU variances for some states and metropolitan areas. For more detailed information
regarding the formation of replicates, see Gunlicks
(1996).

Variances for State and Local Areas
For estimates at the national level, total variances are estimated from the sample data by the
successive difference replication method previously described. For local areas that are coextensive with one or more sample PSUs, variance
estimators can be derived from the same variance
estimation methods used for the SR portion of the
national sample. However, variance estimation for
states and areas that have substantial contributions from NSR sample areas can be problematic.
Most states contain a small number of NSR sample
PSUs, so between-PSU variances at the state level
are based on relatively small sample sizes. Pairing
these PSUs into pseudostrata further reduces the
number of NSR SECUs and increases reliability
problems. In addition, the component of variance
resulting from sampling PSUs can be more important for state estimates than for national estimates
in states where the proportion of the population
in NSR strata is larger than the national average.
Further, creating pseudostrata for variance estimation purposes introduces a between-stratum
variance component that is not in the sample
design, causing overestimation of the true variance. The between-PSU variance, which includes
the between-stratum component, is relatively
small at the national level for most characteristics, but it can be much larger at the state level
(Gunlicks, 1993; Corteville, 1996). Thus, this additional component should be accounted for when
estimating state variances.

GENERALIZING VARIANCES
With some exceptions, the standard errors for
CPS estimates are based on generalized variance
functions (GVFs). The GVF is a simple model
that expresses the variance as a function of the
expected value of the survey estimate. The parameters of the model are estimated using the direct
replicate variances discussed above. These models
provide a relatively easy way to obtain approximate standard errors for numerous characteristics.
One could not possibly predict all of the combinations of results that may be of interest to data
users. Therefore, a presentation of the individual
standard errors based on survey data, while technically possible to compute, would be of limited
use. In addition, for estimates of differences and
ratios that users may compute, the published
standard errors would not account for the correlation between the estimates.


Most importantly, variance estimates are based on
sample data and have variances of their own. The
variance estimate for a survey estimate for a particular month generally has less precision than the
survey estimate itself. The estimates of variance
for the same characteristic may vary considerably
from month to month or for related characteristics
(that might have similar levels of true precision) in
a given month. Therefore, some method of stabilizing these estimates of variance, such as by generalization or by averaging over time, is needed to
improve both their reliability and usability.

Generalization Method
The GVF that is used to estimate the variance of
an estimated monthly level, x, given a population
total N (CNP, 16 years and over), is of the form

$$Var(x; N) = (\alpha + \beta N)\,x\left(1-\frac{x}{N}\right) \qquad (2\text{-}4.1)$$

where α and β are two parameters estimated
using least squares regression. The rationale for
this form of the GVF model is the assumption that
the variance of x can be expressed as the product
of the variance from a simple random sample (for
a binomial random variable) and a design effect.
The design effect (δ) accounts for the effect of
a complex sample design relative to a simple
random sample. Defining p = x/N as the proportion of the population having some characteristic,
where n is the size of the sample, the variance of
the estimated monthly level x is

$$Var(x) = \delta\,\frac{N}{n}\,x\left(1-\frac{x}{N}\right) \qquad (2\text{-}4.2)$$

Letting $d = \delta N/n$, then

$$Var(x) = d\,x\left(1-\frac{x}{N}\right) \qquad (2\text{-}4.3)$$

The design effect component d (inclusive of the
sampling interval, which fluctuates monthly and
tends to increase over time) is estimated by the
regression term (α + βN), yielding the functional
form given in formula 2-4.1. Since N is a known
population total, α and β are the two modeled
GVF parameters for a particular series.

The generalized variance models used from 1947
to 2016 were based on the model form:

$$Var(x) = ax^2 + bx \qquad (2\text{-}4.4)$$


The only difference between these model forms
is that the historical model (2-4.4) assumed a
fixed population level N, therefore producing
GVF parameters that were useful for time periods
in which the true population level was approximately equal to the fixed value. The current model
(2-4.2), implemented in 2017, allows N to vary
from month to month, resulting in GVF parameters
that are useful over the entire reference period of
the model (typically 10 or more years). Since N
is a population control readily available for every
month, the current model was adopted to increase
the generalizability of the GVF parameters.
The α and β parameters are designed to produce
variance estimates for monthly CPS estimates.
For estimates of changes or averages over time,
adjustment factors are calculated based on historical correlations and an assumption of equal
monthly variances, which simplifies the calculation. While monthly variances do change over
time, empirical review suggests the equal-variance
assumption has negligible impact on the quality of
variance estimates for most series.
The α and β parameters are updated annually
to incorporate recent information into the GVF
models. Since the overlap from one model period
to the next is very high, the resulting parameters
usually do not change much from one year to
the next, allowing for smooth comparisons over
time. In the event of a major sample reduction, the
GVF models would be adjusted to account for the
changing sampling intervals and design effects.

Calculation of Generalized Variance Function Parameters

In 2015, the CPS introduced a new method for
computing GVF parameters (McIllece, 2016). The
new process models each series individually rather
than clustering series with similar design effects.
Under the previous method, the model used
relative variance as a dependent quantity. Under
the new method, the dependent quantity is the
design effect times the sampling interval, both of
which are stable over time for most series, presuming stability of the CPS sample size. Notably,
the fluctuation of the replication process, which
is implicitly included in d since direct replicate
variances are used (see formulations below), is
relatively unstable. By fitting each model on a
monthly series history of approximately 10 years
or more, the long-term stability of the sampling
interval and the design effect are leveraged, while
the relative volatility of the replication procedure
is smoothed, resulting in a GVF model fit that
accounts for binomial variance and seasonality.
Unlike direct replication variances, estimates of
variances from this GVF method are not typically
volatile, excepting some sparse subgroups.

The GVF model subsequently detailed is only
applied to level (i.e., count) and rate series that
may be considered binomial, such as the number of
employed people and the unemployment rate,
respectively. Most CPS series fit this criterion. The
variances of nonbinomial series, such as means
and medians, are not estimated using this specific
method.

To estimate the variance of some level estimate x,
such as total employed or total unemployed, recall
formula 2-4.3:

$$Var(x) = d\,x\left(1-\frac{x}{N}\right)$$

While this formula represents the theoretical form,
binomial variances are not directly computed.
Direct CPS variances, as referred to in this chapter,
are calculated by the successive difference replication method described in earlier sections. These
replicate variances are used in the construction of
the GVF model.

Define $V^*(x; N)$ as the replicate estimate of the
variance of x, given the population total N. Then
calculate the design effect component d as

$$d^* = \frac{V^*(x; N)}{x\left(1-\dfrac{x}{N}\right)}$$

By computing d as a ratio based on replicate
variance, d implicitly includes the volatility of the
replication procedure. Since this volatility is large
relative to the size of the estimate x, the direct
replicate variance estimates are impractical for
use. The GVF model, therefore, is constructed to
retain the form of binomial variance and any associated seasonal effects, while smoothing through
the replication volatility that otherwise destabilizes the variance estimation series.
Let

$\bar{d}^*$ = the average value of $d^*$ over the model period,

$\bar{N}^*$ = the average value of $N^*$ over the model period,

$s_{d^*}$ = the standard deviation of $d^*$ over the model period,

$s_{N^*}$ = the standard deviation of $N^*$ over the model period,

$r_{d^*,N^*}$ = the correlation between $d^*$ and $N^*$ over the model period.

The GVF model is constructed as a least squares
regression model:

$$d^* = \alpha + \beta N^* + \varepsilon$$

d is then estimated by the modeled value $\hat{d}$:

$$\hat{d} = \alpha + \beta N$$

Therefore, the α and β GVF parameters from 2-4.1
are calculated as:

$$\beta = r_{d^*,N^*}\,\frac{s_{d^*}}{s_{N^*}}, \qquad \alpha = \bar{d}^* - \beta\,\bar{N}^*$$

Modeled variances for rate estimates, such as
the unemployment rate or the labor force participation rate, are developed analogously to level
estimates, but with the base y (the denominator
of the rate) replacing N. Though y is often an
estimate, it is treated as a population value for
GVF parameter estimation, as empirical results
have indicated that treating the base as a random
variable typically has little impact on the variance
estimates. By treating y as a population value, the
variance of a rate estimate p with base y can be
approximated by a slight modification to 2-4.1:

$$Var(p; y) = (\alpha + \beta y)\,\frac{p(1-p)}{y}$$

The design effect component d for rate series has
a slightly different form, which accounts for the
difference in the form of the variance estimators
between levels and rates:

$$d^* = \frac{y\,V^*(p; y)}{p(1-p)}$$

Once d is computed over the modeling period, the
α and β GVF parameters for rate series are developed using the same modeling method that is
used for level series, but with the base quantities
y and $\bar{y}$ replacing N and $\bar{N}$, respectively, in the
least squares regression model.
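A compact sketch of this parameter calculation under the definitions above (the inputs are hypothetical monthly series):

```python
import numpy as np

# Hedged sketch of the GVF parameter computation: d* is the design effect
# component implied by each month's replicate variance, and alpha, beta come
# from a least squares fit of d* on N*.
def gvf_parameters(x, N, rep_var):
    """x: monthly level estimates; N: monthly population controls;
    rep_var: monthly replicate variance estimates (all numpy arrays)."""
    d_star = rep_var / (x * (1.0 - x / N))     # invert formula 2-4.3
    beta = (np.corrcoef(d_star, N)[0, 1]
            * d_star.std(ddof=1) / N.std(ddof=1))
    alpha = d_star.mean() - beta * N.mean()
    return alpha, beta
```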

Computing Standard Errors

After the parameters α and β of formula 2-4.1 are
determined, it is a simple matter to construct a
table of standard errors of estimates. In practice, such tables show the standard errors that
are appropriate for specific estimates, and the
user is instructed to interpolate for estimates not
explicitly shown in the table. The Bureau of Labor
Statistics publishes tables of GVF parameters,
which are available in the “Reliability of estimates
from the CPS” section of its website
https://www.bls.gov/cps/documentation.htm.
Table 2-4.1 provides example GVF parameters for
the labor force characteristics of several demographic subgroups for the purpose of demonstrating standard error computation. Refer to the
source listed previously for current parameters.
Since the standard error of an estimate is the
square root of its variance, the approximate standard error, se(x; N), of an estimated monthly total
can be obtained with α and β from Table 2-4.1
and the square root of formula 2-4.1:

$$se(x; N) = \sqrt{(\alpha + \beta N)\,x\left(1-\frac{x}{N}\right)} \qquad (2\text{-}4.5)$$

In January 2007, there were an estimated
4,406,000 unemployed men in a population N
of 230,650,000 people. Obtaining the appropriate
α and β parameters from Table 2-4.1, an approximate standard error can be calculated for this
estimate using equation 2-4.5:

$$se(4{,}406{,}000;\ 230{,}650{,}000) = \sqrt{(1{,}050.17 + 0.00000883 \times 230{,}650{,}000) \times 4{,}406{,}000 \times \left(1-\frac{4{,}406{,}000}{230{,}650{,}000}\right)} \approx 115{,}502$$


Table 2-4.1.
Generalized Variance Function Parameters for Estimates of Monthly Levels

Characteristic                            α              β

Total:
  Civilian labor force               –4,629.00     0.00002821
  Employed                           –4,868.95     0.00002960
  Unemployed                            558.44     0.00001184
  Not in labor force                 –4,629.00     0.00002821

Men:
  Civilian labor force               –3,045.81     0.00001622
  Employed                           –2,452.20     0.00001450
  Unemployed                          1,050.17     0.00000883
  Not in labor force                 –2,878.85     0.00001860

Women:
  Civilian labor force               –2,678.77     0.00001593
  Employed                           –2,807.89     0.00001687
  Unemployed                         –2,748.05     0.00002370
  Not in labor force                 –1,940.94     0.00001401

Both sexes, 16 to 19 years:
  Civilian labor force               –2,804.43     0.00001981
  Employed                           –3,729.02     0.00002414
  Unemployed                         –1,368.68     0.00001808
  Not in labor force                  1,541.65    –0.00000164

Note: These values are based on standard-error models that were updated with the release of July 2018 data.
Source: Bureau of Labor Statistics, “Reliability of estimates from the CPS” section of <www.bls.gov/cps/documentation.htm>.

In January 2017, there were an estimated
4,514,000 unemployed men in a population N of
254,082,000 persons. Using the same α and β
parameters from Table 2-4.1, an approximate
standard error can be calculated for this estimate
using equation 2-4.5:

$$se(4{,}514{,}000;\ 254{,}082{,}000) = \sqrt{(1{,}050.17 + 0.00000883 \times 254{,}082{,}000) \times 4{,}514{,}000 \times \left(1-\frac{4{,}514{,}000}{254{,}082{,}000}\right)} \approx 120{,}846$$

In this example, since the values of x are similar,
the increase in the standard error estimate for
unemployed men is primarily driven by the 23.4
million increase in the population N over that decade.

To form 90 percent confidence intervals (rounded
to the nearest thousand) around these estimates,
1.645 times the standard error is added to and
subtracted from the estimate x. For the January
2007 estimate of unemployed men, the 90 percent
confidence interval is calculated as:

$$4{,}406{,}000 \pm 1.645 \times 115{,}502 \implies (4{,}216{,}000;\ 4{,}596{,}000)$$

As a practical interpretation, there is about
a 90 percent chance that the true number of
unemployed men in January 2007 was between
4.216 million and 4.596 million. Using the same
approach, the 90 percent confidence interval for
unemployed men in January 2017 was approximately 4.315 million to 4.713 million.

TOTAL VARIANCE COMPONENTS
The following tables show variance estimates
computed using replication methods by type
(total and within-PSU) and by stage of estimation.
The estimates presented are based on the 2010
sample design and averaged across all twelve
months of 2016. This period was used for estimation because the sample design was essentially
unchanged throughout the year, and the variances
tend to be more stable than in earlier postrecessionary years.

Within-Primary Sampling Unit Variance
Ratios by Weighting Stage
The CPS employs a complex sample design that
selects both SR and NSR PSUs within states (see
Chapter 2-2, Sample Design). An NSR stratum
consists of a group of PSUs, typically including
rural areas, from which only one PSU is selected
to be included in the CPS sample. Since this PSU
represents the entire stratum, between-PSU
variability is introduced, which is a measure of
the variability that results from having selected
a subset of NSR PSUs throughout the nation. An
SR stratum includes only one PSU; therefore, the
contribution of between-PSU variance is zero by
definition from these sample areas. Both SR and
NSR PSUs contribute within-PSU variance, which
is the variability introduced by sampling a subset
of HUs within each selected PSU.
Total variance is the sum of between-PSU and
within-PSU variance. Table 2-4.2 presents ratios of
within-PSU to total variance for the primary labor

force estimates of selected demographic groups
after various stages in weighting. For national
estimates, within-PSU variance is the dominant
contributor to total variance, which is clearly
demonstrated in the table and is consistent across
weighting stages. Thus, the effect of selecting
NSR PSUs in some strata has a small effect on the
total variance of most national estimates.
Different replicate weights are assigned to sample units to reflect either total variance or within-PSU variance. Since the assignment of replicate
weights cannot be perfectly optimized for these
variance levels—and since the variances calculated using those replicates are only estimates
themselves, with their own errors—in some cases,

Table 2-4.2.
Within-Primary Sampling Unit to Total Variance Ratios: 2016 Annual Averages
(Ratio of within-PSU to total variance)

Characteristic           Nonresponse adjustment   Second stage   Compositing

Civilian labor force              0.99                0.98           0.96
  Men                             0.97                0.94           0.91
  Women                           1.02                1.00           1.00
  White                           0.97                0.95           0.92
  Black                           0.90                0.99           1.00
  Hispanic                        0.95                1.00           1.01
  Asian                           0.97                0.98           0.99

Employed                          0.99                0.97           0.97
  Men                             0.98                0.97           0.93
  Women                           1.01                1.01           1.02
  White                           0.97                0.95           0.92
  Black                           0.91                0.97           0.99
  Hispanic                        0.94                0.99           1.00
  Asian                           0.97                0.98           0.98

Unemployed                        0.99                1.00           1.01
  Men                             0.99                1.02           1.02
  Women                           0.99                0.98           0.98
  White                           0.99                0.97           0.98
  Black                           0.95                0.99           1.00
  Hispanic                        1.03                1.01           1.03
  Asian                           1.00                1.01           1.02

Unemployment rate                 0.99                0.99           1.01
  Men                             1.00                1.03           1.02
  Women                           0.99                0.99           0.98
  White                           0.98                0.97           0.98
  Black                           0.98                0.99           1.00
  Hispanic                        1.01                1.01           1.02
  Asian                           1.00                1.01           1.01

Not in labor force                0.96                0.98           0.96
  Men                             0.96                0.94           0.91
  Women                           0.95                1.00           1.00
  White                           0.96                0.95           0.92
  Black                           0.87                0.99           1.00
  Hispanic                        0.99                1.00           1.01
  Asian                           0.97                0.99           0.99

Source: U.S. Census Bureau, Tabulation of 2016 Current Population Survey microdata.


the ratio of within-PSU to total variance is slightly
greater than one. While this implies a negative
between-PSU variance (since it can be derived
as total variance minus within-PSU variance), it
should not be interpreted that way. By definition,
variances cannot be less than zero. Practically
speaking, a ratio slightly greater than one can be
interpreted to mean that the true between-PSU
variance contribution is close to zero, but not
actually negative.
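In code form, the residual derivation with the zero floor discussed above might look like:

```python
# Sketch of the residual decomposition: between-PSU variance is total
# variance minus within-PSU variance, floored at zero because a true
# variance cannot be negative even when the estimated ratio exceeds one.
def between_psu_variance(total_var, within_psu_var):
    return max(total_var - within_psu_var, 0.0)

# e.g., a within-to-total ratio of 1.02 implies an estimated between-PSU
# component of -0.02 * total_var, which is read as "close to zero."
```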

Relative Standard Errors by Weighting
Stage
Table 2-4.3 shows how the separate estimation
steps affect a survey estimate’s relative standard

errors (RSEs), calculated as the replicate standard
error of an estimate divided by the estimate itself.
As stated before, the standard error of an estimate is simply the square root of its variance. It is
more instructive to compare RSEs than the standard errors themselves, since the various stages
of estimation can affect both the level of a survey
estimate and its variance (Hanson, 1978; Train,
Cahoon, and Makens, 1978). The nonresponse-​
adjusted estimate includes base weights, special
weighting factors, and nonresponse adjustment.
The second-stage estimate includes all weighting
steps applied to the full sample except compositing (see Chapter 2-3, Weighting and Estimation).

Table 2-4.3.
Relative Standard Errors After Selected Weighting Stages: 2016 Annual Averages
(In percent)

Labor force status       Nonresponse adjustment   Second stage   Compositing

Civilian labor force              0.86                0.26           0.24
  Men                             0.91                0.31           0.29
  Women                           0.95                0.40           0.36
  White                           0.94                0.28           0.25
  Black                           2.31                0.81           0.74
  Hispanic                        2.13                0.64           0.58
  Asian                           3.18                1.26           1.23

Employed                          0.87                0.29           0.27
  Men                             0.92                0.35           0.32
  Women                           0.95                0.43           0.38
  White                           0.94                0.31           0.28
  Black                           2.35                0.94           0.84
  Hispanic                        2.14                0.70           0.63
  Asian                           3.19                1.31           1.28

Unemployed                        2.32                2.12           2.07
  Men                             2.99                2.85           2.78
  Women                           3.25                3.05           3.02
  White                           2.71                2.53           2.45
  Black                           5.24                4.82           4.76
  Hispanic                        5.38                4.93           4.81
  Asian                          10.99               10.52          10.37

Unemployment rate                 2.14                2.12           2.07
  Men                             2.86                2.84           2.77
  Women                           3.05                3.03           2.99
  White                           2.52                2.53           2.45
  Black                           4.70                4.78           4.67
  Hispanic                        4.89                4.88           4.75
  Asian                          10.46               10.39          10.25

Not in labor force                0.93                0.44           0.40
  Men                             1.16                0.70           0.64
  Women                           1.01                0.53           0.47
  White                           1.04                0.48           0.42
  Black                           2.70                1.31           1.18
  Hispanic                        2.54                1.25           1.11
  Asian                           3.65                1.97           1.91

Source: U.S. Census Bureau, Tabulation of 2016 Current Population Survey microdata.


In Table 2-4.3, the figures across demographic
groups for the civilian labor force, employed, and
not in labor force show, for example, that the second-stage weighting process substantially reduces
RSEs, while compositing further decreases the
RSEs but more modestly. The figures for unemployed and unemployment rate demonstrate little
difference between RSEs of nonresponse-adjusted
and second-stage estimates, but a more consistent, minor decrease in RSEs from second-stage
to composite estimates. The RSEs of the unemployment statistics tend to be larger because
unemployment is a relatively rare characteristic,
typically less than ten percent for most groups in
the primary estimation tables, depending on the
state of the economy.

DESIGN EFFECTS
Table 2-4.4 shows the second-stage and composite design effects for the total variances of major
labor force characteristics for selected demographic groups. A design effect (δ) is the ratio of
the variance from a complex sample design to the
variance of a simple random sample (SRS) design.
The design effects in this table were computed
by solving equation 2-4.3 for δ for a theoretical
response rate of 100 percent and also for realized
response rates of approximately 90 percent over
the reference period, 2010 to 2017. The design
effects for the realized response rates are smaller
because the estimated SRS variances increase as
the number of respondents decreases.
For the unemployed, assuming about a 90 percent
response rate, the design effect for total variance is approximately 1.59 for the second-stage
estimate and 1.49 for the composite estimate.

Table 2-4.4.
Design Effects After Selected Weighting Stages: 2010–2017 Averages
(Ratio of complex sample design variance to simple random sample variance)

                              100 percent response rates   2010–2017 response rates
Labor force status            Second stage  Compositing    Second stage  Compositing

Civilian labor force              1.39         1.09            1.24         0.97
  Men                             0.55         0.45            0.49         0.40
  Women                           0.77         0.59            0.69         0.52
  White                           0.93         0.72            0.82         0.64
  Black                           0.66         0.55            0.58         0.49
  Hispanic                        0.54         0.44            0.48         0.39
  Asian                           0.70         0.63            0.62         0.56

Employed                          1.46         1.13            1.30         1.01
  Men                             0.66         0.52            0.58         0.46
  Women                           0.84         0.64            0.75         0.57
  White                           1.02         0.78            0.90         0.70
  Black                           0.81         0.66            0.72         0.59
  Hispanic                        0.66         0.52            0.58         0.46
  Asian                           0.75         0.66            0.67         0.59

Unemployed                        1.78         1.68            1.59         1.49
  Men                             1.64         1.54            1.46         1.37
  Women                           1.56         1.48            1.39         1.31
  White                           1.68         1.57            1.49         1.40
  Black                           1.81         1.70            1.61         1.52
  Hispanic                        1.83         1.74            1.63         1.55
  Asian                           1.77         1.69            1.58         1.51

Not in labor force                1.39         1.09            1.24         0.97
  Men                             0.99         0.80            0.88         0.71
  Women                           0.94         0.71            0.84         0.63
  White                           1.13         0.87            1.01         0.78
  Black                           1.03         0.86            0.92         0.77
  Hispanic                        1.02         0.82            0.91         0.73
  Asian                           1.02         0.90            0.91         0.80

Source: U.S. Census Bureau, Tabulation of 2010–2017 Current Population Survey microdata.


This means that, for the same number of sample
cases, the design of the CPS (including sample
selection and weighting) increases total variance
by about 60 percent for second-stage estimates
and about 50 percent for composite estimates,
relative to an SRS design. The design effects for
the civilian labor force, employed, and not in labor
force for all groups tend to be smaller than those
for unemployed. The design effects for composite
estimates tend to be smaller than those for second-stage estimates, which indicates that composite weighting generally increases the precision of estimates.
The design effects in Table 2-4.4 tend to be larger
than the analogous design effects reported in
past technical papers, such as Table 14-4 of
CPS Technical Paper 66, due to a change in
the accounting of the sample size. Table 2-4.4
assumes an average of slightly over 1.9 responding adults per eligible sampled household (excluding out-of-scope HUs selected in the sample),
which was derived from average respondent data
from 2010 to 2017. Past technical papers used
different denominators that are not reflective of
recent years of data, resulting in smaller design
effects than are reported in Table 2-4.4.
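Concretely, solving formula 2-4.3 for δ (with d = δN/n and n the number of respondents) gives δ = Var(x)·n / [N·x(1 − x/N)]; a short sketch with hypothetical argument names:

```python
# Design effect implied by formula 2-4.3. Using a realized respondent count
# n (rather than the 100 percent response rate count) raises the estimated
# SRS variance and therefore lowers the implied design effect, as in the
# right-hand columns of Table 2-4.4.
def design_effect(var_x, x, N, n):
    return var_x * n / (N * x * (1.0 - x / N))
```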

REFERENCES
Corteville, J., “State Between-PSU Variances
and Other Useful Information for the CPS 1990
Sample Design (VAR90−16),” Memorandum for
Documentation, Demographic Statistical Methods
Division, U.S. Census Bureau, March 6, 1996.
Dippo, C., R. Fay, and D. Morganstein, “Computing
Variances from Complex Samples with Replicate
Weights,” Proceedings of the Section on
Survey Research Methods, American Statistical
Association, 1984, pp. 489−494.
Fay, R. E., “Some Properties of Estimates of
Variance Based on Replication Methods,”
Proceedings of the Section on Survey Research
Methods, American Statistical Association, 1984,
pp. 495−500.
Fay, R. E., “Theory and Application of Replicate
Weighting for Variance Calculations,” Proceedings
of the Section on Survey Research Methods,
American Statistical Association, 1989,
pp. 212−217.

Fay, R., and G. Train, “Aspects of Survey and
Model-Based Postcensal Estimation of Income and
Poverty Characteristics for States and Counties,”
Proceedings of the Section on Government
Statistics, American Statistical Association, 1995,
pp. 154−159.
Gunlicks, C., “Overview of 1990 CPS PSU
Stratification (S−S90−DC−11),” Internal
Memorandum for Documentation, Demographic
Statistical Methods Division, U.S. Census Bureau,
February 24, 1993.
Gunlicks, C., “1990 Replicate Variance System
(VAR90−20),” Internal Memorandum for
Documentation, Demographic Statistical Methods
Division, U.S. Census Bureau, June 4, 1996.
Hanson, R. H., “The Current Population Survey:
Design and Methodology,” Technical Paper 40,
Government Printing Office, Washington, DC,
1978.
Keyfitz, N., “Estimates of Sampling Variance Where
Two Units are Selected from Each Stratum,”
Journal of the American Statistical Association,
52(280), 1957, pp. 503–510.
Kostanich, D.,“Proposal for Assigning Variance
Codes for the 1990 CPS Design (VAR90−22),”
Internal Memorandum to BLS/Census Variance
Estimation Subcommittee, Demographic
Statistical Methods Division, U.S. Census Bureau,
June 17, 1996.
Lent, J., “Variance Estimation for Current
Population Survey Small Area Estimates,”
Proceedings of the Section on Survey Research
Methods, American Statistical Association, 1991,
pp. 11−20.
McIllece, J., “Calculating Generalized Variance
Functions with a Single-Series Model in the
Current Population Survey,” paper presented at
the 2016 Joint Statistical Meetings, Proceedings
of the Section on Survey Research Methods,
American Statistical Association, 2016,
pp. 2176–2187.
Train, G., L. Cahoon, and P. Makens, “The Current
Population Survey Variances, Inter-Relationships,
and Design Effects,” Proceedings of the Section
on Survey Research Methods, American Statistical
Association, 1978, pp. 443−448.

Current Population Survey TP77
U.S. Bureau of Labor Statistics and U.S. Census Bureau

Wolter, K., “An Investigation of Some Estimators
of Variance for Systematic Sampling,” Journal of
the American Statistical Association, 79, 1984,
pp. 781−790.
Wolter, K., Introduction to Variance Estimation,
Springer-Verlag, New York, 1985.

FURTHER READING
Kostanich, D., “Revised Standard Error
Parameters and Tables for Labor Force Estimates:
1994−1995 (VAR80−6),” Internal Memorandum for
Documentation, Demographic Statistical Methods
Division, U.S. Census Bureau, February 23, 1996.

U.S. Bureau of Labor Statistics, Current Population
Survey online documentation, 2018.
U.S. Census Bureau, “Current Population Survey:
Design and Methodology, Technical Paper 66,”
Washington, DC, October 2006.
Valliant, R., “Generalized Variance Functions in
Stratified Two-Stage Sampling,” Journal of the
American Statistical Association, 82, 1987,
pp. 499−508.


Chapter 2-5: Seasonal Adjustment
INTRODUCTION
Short-run movements in labor force time series
are strongly influenced by seasonality—periodic
fluctuations associated with recurring calendar-
related events such as weather, holidays, and the
opening and closing of schools. Seasonal adjustment removes the influence of these fluctuations
and makes it easier for users to observe fundamental changes in the level of the series, particularly changes associated with general economic
expansions and contractions.
While seasonal adjustment is feasible only if
the seasonal effects are reasonably stable with
respect to timing, direction, and magnitude, these
effects are not necessarily fixed and often evolve
over time. The evolving patterns are estimated
by the X-13ARIMA-SEATS (X-13) program, with
procedures based on “filters” that successively
average a shifting timespan of data, thereby providing estimates of seasonal factors that change in
a smooth fashion from year to year.
For observations in the middle of a series, a set
of symmetric moving averages with fixed weights
produces final seasonally adjusted estimates. A
filter is symmetric if it is centered around the time
point being adjusted, with an equal amount of
data preceding and following that point. Standard
seasonal adjustment options imply a symmetric
filter that uses from 6 to 10 years of original data
to produce a final seasonally adjusted estimate.
Obviously, this final adjustment can be made only
where there are enough data beyond the time
point in question to adjust with the symmetric
filter.
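As an aside not drawn from TP77, the following toy sketch in Python shows the basic mechanics: a symmetric centered filter simply has no output for the first and last few points of a series, which is why those points need different treatment.

import numpy as np

# Toy monthly series (10 years): trend, seasonality, and noise.
rng = np.random.default_rng(0)
t = np.arange(120)
series = 0.1 * t + 5.0 * np.sin(2.0 * np.pi * t / 12.0) + rng.normal(0.0, 0.5, 120)

# A centered 2x12 trend filter: weights 1/24 at both ends, 1/12 inside.
weights = np.r_[0.5, np.ones(11), 0.5] / 12.0

# 'valid' convolution leaves no estimate where the symmetric filter
# would need data before the start or beyond the end of the series.
trend = np.convolve(series, weights, mode="valid")
print(len(series), len(trend))  # 120 vs 108: six points lost at each end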
To seasonally adjust recent data, shorter filters
with less desirable properties must be used. These
filters are referred to as asymmetric because they
use fewer observations after the reference point
than preceding it. The weights for such filters vary
with the number of observations that are available
beyond the time point for which estimates are
to be adjusted. Seasonally adjusted data for the
current year are produced with a technique known
as concurrent adjustment. Under this practice, the
current month’s seasonally adjusted estimate is
computed with the use of all relevant original data
up to and including data for the current month.
Every time an observation is added, previous
estimates will be revised. The number of estimates
that are revised depends on the filter. Revisions
to a seasonally adjusted estimate for a given time
point continue until enough future observations
become available to use the symmetric weights.
This effectively means waiting up to 5 years for a
final adjustment when standard options are used.
At the end of each calendar year, the BLS recalculates the seasonal factors for CPS national
series by including another full year of data in the
estimation process. On the basis of this annual
recalculation, BLS revises the historical seasonally adjusted data for the previous 5 years. As a
result, each year’s data are generally subject to
five revisions before the values are considered
final. The fifth and final revisions to data for the
earliest of the 5 years are usually quite small,
while the first-time revisions to data for the most
recent years are generally much larger. For the
major aggregate labor force series, however, the
first-time revisions rarely alter the essential trends
observed in the initial estimates. (For information about seasonal adjustment of the Local Area
Unemployment Statistics program’s state and
local area estimates, see the BLS Handbook of
Methods [2018].)

ADJUSTMENT METHODS AND
PROCEDURES
Beginning in 2003, BLS adopted the use of X-12-ARIMA (X-12) as the official seasonal adjustment
program for CPS labor force series, replacing the
X-11-ARIMA (X-11) program that had been used
since 1980. Both X-12-ARIMA and X-11-ARIMA
incorporate the widely used X-11 method developed at the Census Bureau in the 1960s. Statistics
Canada added to the X-11 method the ability to
extend the time series with forward and backward
extrapolations from Auto-Regressive Integrated
Moving Average (ARIMA) models prior to seasonal adjustment. The X-11 algorithm for seasonal
adjustment is then applied to the extended series.
When adjusted data are revised after future data
become available, the use of forward extensions
results in initial seasonal adjustments that are subject to smaller revisions on average.

The enhancements in the X-12 program fall into
three basic categories: (1) enhanced ARIMA
model selection and estimation, (2) detection
and estimation of outlier, trading day, and holiday
effects, and (3) new postadjustment diagnostics.
Starting in 2015, BLS moved to the X-13 program,
which was developed by the Census Bureau and
the Bank of Spain. The X-13 program includes
all of the capabilities of the X-12 program, while
adding the Signal Extraction in ARIMA Time
Series (SEATS) seasonal adjustment methodology.
(See the subsequent section titled "X-11 and
Signal Extraction in ARIMA Time Series
Decompositions.")
The X-11 and SEATS methods have strong similarities. The ARIMA model of the observed series is
the starting point for both. They both also use the
same basic estimator, which is a weighted moving
average of the series, to produce the seasonally
adjusted output. The methods differ in the derivation of moving average weights.
The X-11 method is empirically based. It directly
selects the seasonal moving averages from a
prespecified set. The weights are not designed
for any specific series, but fit a wide variety of
series. SEATS is model-based and derives a moving average from the ARIMA model of the series.
The moving averages are tailored to the specific
modeled properties of the series. The SEATS
methodology is flexible in that it can adjust some
series that the X-11 method finds too variable. It
also facilitates analysis with a variety of error measures produced within the system. See Shiskin et
al. (1967), Dagum (1980), Findley et al. (1998),
Gomez and Maravall (2001), and Census Bureau
(2015) for details on these methods.
For almost all of the national labor force series
that are seasonally adjusted by BLS, the main
steps of the seasonal adjustment process proceed
in the following order:
1.	 Time series modeling. A REGARIMA model
(a combined regression and ARIMA model) is
developed to account for the normal evolutionary behavior of the time series and to
control for outliers and other special external
effects that may exist in the series.
2.	 Prior adjustments. Given an adequate
REGARIMA model, the series is modified
by prior adjustments for external effects
estimated from the regression part of the
model and extrapolated forward 24 months or
more by the ARIMA part of the model.
3.	 X-11 or SEATS decomposition. The modified
and extrapolated series is decomposed into
trend-cycle, seasonal, and irregular components by means of a series of moving averages
to produce seasonal factors for implementing
seasonal adjustment.
4.	 Evaluation. A battery of diagnostic tests is
produced to evaluate the quality of the final
seasonal adjustment.

Time series modeling
Time series models play an important role in
seasonal adjustment. They are used to identify
and correct the series for atypical observations
and other external effects, as well as to extend
the original series with backcasts and forecasts so
that fewer asymmetric filters can be used at the
beginning and end of the series.
ARIMA models (see Box and Jenkins, 1970 or
Kendall and Ord, 1990) are designed to make
forecasts of a time series based on only its past
values. While these models can represent a wide
class of evolving time series patterns, they do not
account for the presence of occasional outliers
and other special external effects. An outlier represents a sudden break in the normal evolutionary
behavior of a time series. Ignoring the existence
of outliers may lead to serious distortions in the
seasonally adjusted series.
A common form of outlier that presents a special
problem for seasonal adjustment is an abrupt shift
in the level of the data that may be either transitory or permanent. Three types of outliers are
usually distinguished: (1) an additive change that
affects only a single observation, (2) a temporary change having an effect that diminishes to
zero over several periods, and (3) a level shift or
a break in the trend of the data, which represents
a permanent increase or decrease in the underlying level of the series. These three main types of
outliers, as well as other types of external effects,
may be handled by the time series modeling
component of the X-13 program. This is done by
adding to the ARIMA model appropriately defined
regression variables based on intervention analysis
originally proposed by Box and Tiao (1975).

The combined regression and ARIMA model is
referred to as a REGARIMA model and is represented by

Yt = βXt + Zt ,

where Yt is the original series or a logarithmic
transformation of it; Xt is a set of fixed regression variables; β represents the regression coefficients; and Zt is a standard seasonal ARIMA model
described by the notation (p,d,q)(P,D,Q), where
p is the number of regular (nonseasonal) autoregressive parameters, d is the number of regular
differences, q is the number of regular moving
average parameters, P is the number of seasonal
autoregressive parameters, D is the number of
seasonal differences, and Q is the number of seasonal moving average parameters.
While the ARIMA model can be very complicated
theoretically, in practice it takes a parsimonious
form involving only a few estimated parameters.
There are well-developed methods for determining the number and types of parameters and the
degree of differencing appropriate for a given
series.
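As a hedged illustration (not the models BLS uses), a REGARIMA-style fit of the form above can be approximated in Python with statsmodels' SARIMAX, which combines exogenous regressors with a seasonal ARIMA specification; the simulated series, the break date, and the (0,1,1)(0,1,1) order with a seasonal span of 12 are all hypothetical choices made for illustration.

import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly series with an injected permanent level shift.
idx = pd.date_range("2000-01", periods=180, freq="MS")
rng = np.random.default_rng(1)
y = pd.Series(100.0 + 0.2 * np.arange(180)
              + 8.0 * np.sin(2.0 * np.pi * np.arange(180) / 12.0)
              + rng.normal(0.0, 1.0, 180), index=idx)
y.loc["2008-01-01":] += 6.0

# Regression part (the Xt of Yt = βXt + Zt): a level-shift dummy,
# 0 before the assumed break and 1 from the break onward.
level_shift = (idx >= "2008-01-01").astype(float)

# ARIMA part (the Zt): a hypothetical (0,1,1)(0,1,1) model, span 12.
fit = SARIMAX(y, exog=level_shift, order=(0, 1, 1),
              seasonal_order=(0, 1, 1, 12)).fit(disp=False)
print(fit.params["x1"])  # estimated level-shift size, near the injected 6.0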
With respect to specifying the regression component to control for outliers, the X-13 program
offers two approaches. First, major external
events, such as breaks in trend, are usually associated with known events. In such cases, the user
has sufficient prior information to specify special
regression variables to estimate and control for
these effects.
Second, because it is rare that there is sufficient
prior information to locate and identify all of the
atypical observations that may exist in a time
series, as a second approach to specifying the
regression component, REGARIMA offers automatic outlier detection based on work by Chang,
Tiao, and Chen (1988). This approach is especially
useful when a large number of series must be
processed. Of course, both approaches may be
combined so that readily available prior information can be used directly, while unknown substantial outliers may still be discovered.
Model adequacy and length of series. The preference is to use relatively long series in fitting
time series models, but with some qualifications.
Sometimes, the relevance of data in the distant
past to seasonal adjustment is questionable, which
may lead to using a shorter series.
Even though the filters have limited memory,
there are reasons for using longer series. First, for
homogeneous time series, the more data used to
identify and estimate a model, the more likely it
is that the model will represent the structure of
the data well and the more accurate the parameter estimates will be. The exact amount of data
needed for time series modeling depends on
the properties of the series involved. Arbitrarily
truncating the series, however, may lead to more
frequent changes in model identification and to
large changes in estimated parameters, which in
turn may lead to larger-than-necessary revisions in
forecasts.
Second, although level shifts and other types of
outliers tend to occur more often in longer series,
the X-13 program has the capability of automatically controlling for these effects. Third, some
very useful diagnostics available in X-13 typically
require a minimum of 11 years of data and, in
some cases, as much as 14 years of data. Fourth,
attempting to fit longer series often provides
useful insights into the properties of the series,
including their overall quality and the effects of
major changes in survey design.
Extensive use is made of intervention analysis
to estimate the magnitude of known breaks in
CPS series and of automatic outlier detection to
identify and correct for the presence of additional
atypical observations. Once a model is estimated,
it is evaluated in terms of its adequacy for seasonal adjustment purposes. The criteria essentially
require a model to fit the series well (there should
be no systematic patterns in the residuals) and to
have low average forecasting errors for the last 3
years of observed data. When there is a tradeoff
between the length of the series and the adequacy of the model, a shorter series is selected.
In this case, the identification of the model is not
changed with the addition of new data unless the
model fails diagnostic testing.
Acceptable REGARIMA models have been developed for all of the labor force series that are
directly adjusted. (For information about directly
and indirectly adjusted series, see subsequent section titled “Aggregation procedures.”)

Prior adjustments
Prior adjustments are adjustments made to the
original data prior to seasonal adjustment. Their
purpose is to correct the original series for atypical observations and other external effects that
otherwise would seriously distort the estimates of
the seasonal factors. Prior adjustment factors are
subtracted from or used as divisors for the original
series, depending on whether the seasonal adjustment is additive or multiplicative.
Prior adjustment factors for CPS series may be
based on special user-defined adjustments or
handled more formally with REGARIMA modeling. Most of the prior adjustment factors for the
labor force series are estimated directly from
REGARIMA.
Level shifts. The most common type of outlier that
occurs in CPS series is the permanent level shift.
Most such shifts have been due to noneconomic
methodological changes related to revisions in
population controls and to major modifications to
the CPS design. One notable economic level shift
was due to the 2001 terrorist attacks.
Population estimates extrapolated from the latest
decennial census are used in the second-stage
estimation procedure to control CPS sample
estimates to more accurate levels. These intercensal population estimates are regularly revised
to reflect the latest information on population
change.
During the 1990s, three major breaks occurred in
the intercensal population estimates. Population
controls based on the 1990 Census, adjusted
for the estimated undercount, were introduced
into the CPS series in 1994 and, in 1996, were
extended back to 1990. In January 1997 and again
in January 1999, the population controls were
revised to reflect updated information on net
international migration.
Population revisions that reflected the results of
the 2000 Census were introduced with the release
of data for January 2003 and were extended back
to data beginning in January 2000. Since 2003,
population controls also are updated in January
to reflect new estimates of net international
migration in the postcensus period, updated vital
statistics and other information, and any methodological changes in the population estimation
process. The population revisions introduced in
January 2012 incorporated the results of the 2010
Census; those revisions were not extended back to
January 2010. Further information on CPS population controls for 1996 to the present is available
on the BLS Web site.
In 1994, major changes to the CPS were introduced, including a redesigned and automated
questionnaire and revisions to some of the labor
force concepts and definitions. In January 2003,
new industry and occupation (I&O) classifications also were introduced and
were extended back to data beginning in 2000.
To test for the possibility that revisions to the
population controls or significant survey changes
have important effects on those CPS series with
large numerical revisions, each REGARIMA model
is modified to include intervention variables for
those years. The coefficients for these variables
provide estimates of the direction and magnitude
of the intervention effects.
Intervention effects for 2000 were necessary for
selected employment series related primarily
to Hispanic, adult, and agricultural categories.
These effects mainly reflect increases in adult and
Hispanic employment due to the introduction of
the 2000 Census-based population controls and a
decline in agricultural employment caused by the
change in the industry classification system (see
Bowler et al., 2003).
Due to an unusual revision in the population controls in January 2000, the unadjusted employment
level of Black or African-American men 20 years or
over had a strong upward shift in the first quarter of 2000. This temporary effect is permanently
removed from the seasonally adjusted series with
the use of the REGARIMA model.
At the end of 2001, unemployed job losers were
identified as having had substantial upward level
shifts 1 month after the September 11, 2001,
terrorist attacks on the World Trade Center in
New York City. For more details, see McIntire et
al. (2002). Also, four additional series related to
workers employed part time for economic reasons
were identified as having substantial upward shifts
at the time of the terrorist attacks.
Calendar effects. Calendar effects are transitory
level shifts in a series that result from calendar
events such as moving holidays or the differing
composition of weekdays in a month between
years. These effects have different influences on
the same month across years, thereby distorting
the normal seasonal patterns for the given month.

X-11 and Signal Extraction in ARIMA Time
Series Decompositions
The X-11 and SEATS methods of seasonal adjustment contained within the X-13 program assume
that the original series is composed of three
components: trend-cycle, seasonal, and irregular.
Depending on the relationship between the original series and each of the components, the mode
of seasonal adjustment may be additive or multiplicative. Formal tests are conducted to determine
the appropriate mode of adjustment.
The multiplicative mode assumes that the absolute magnitudes of the components of the series
are dependent on each other, which implies that
the size of the seasonal component increases and
decreases with the level of the series. With this
mode, the monthly seasonal factors are ratios
with all positive values centered around unity. The
seasonally adjusted series values are computed by
dividing each month’s original value by the corresponding seasonal factor.
In contrast, the additive mode assumes that
the absolute magnitudes of the components of
the series are independent of each other, which
implies that the size of the seasonal component
is independent of the level of the series. In this
case, the seasonal factors represent positive or
negative deviations from the original series and
are centered around zero. The seasonally adjusted
series values are computed by subtracting the
corresponding seasonal factor from each month’s
original value.
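As a simple illustration with hypothetical numbers (not published factors), the two modes reduce to division or subtraction:

# Hypothetical July value and seasonal factors; illustration only.
original = 150_000.0

sa_multiplicative = original / 1.04   # ratio factor centered around 1
sa_additive = original - 5_800.0      # deviation factor centered around 0

print(round(sa_multiplicative), round(sa_additive))  # 144231 144200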
Most seasonally adjusted CPS series are seasonally adjusted using the X-11 component of
the X-13 program. Given an appropriate choice
for the mode of adjustment, the prior-adjusted,
forecasted series is seasonally adjusted by the
X-11 component of the X-13 program. The X-11
method applies a sequence of moving average and smoothing calculations to estimate the
trend-cycle, seasonal, and irregular components.
The method takes either a ratio-to- or difference-from-moving-average approach, depending
on whether the multiplicative or additive model is
used. For observations in the middle of the series,
a set of fixed symmetric moving averages (filters)
is used to produce final estimates. The implied
length of the final filter under standard options
is 72 time points for the 3 by 5 seasonal moving
average or 120 time points for the 3 by 9 moving average. That is, to obtain a final seasonally
adjusted estimate for a single time point requires
up to 5 years of monthly data preceding and
following that time point. For recent data, asymmetric filters with less desirable properties than
symmetric filters must be used.
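As an aside not in TP77, a "3 by 5" seasonal average is a 3-term simple average of 5-term simple averages, so its composite weights follow from a convolution; applied to same-calendar-month values, its seven terms span 7 years of a given month. A short numpy check:

import numpy as np

# Compose the "3 by 5" seasonal filter from its two simple averages.
w35 = np.convolve(np.ones(3) / 3.0, np.ones(5) / 5.0)

print(w35)        # [1, 2, 3, 3, 3, 2, 1] / 15
print(w35.sum())  # sums to 1 (up to float rounding): a proper average
print(len(w35))   # 7 terms = 7 years of a given calendar month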
Some seasonally adjusted CPS series are adjusted
using the SEATS component of the X-13 program rather than the X-11 component. Like the
X-11 method, SEATS decomposes the observed
series into trend-cycle, seasonal, and irregular
components, and the relationship between the
components may be additive or multiplicative
(via a log transformation). Both the X-11 and
SEATS methods use moving averages to decompose the series. The major difference between
the two procedures is how the moving averages
are constructed. The X-11 method selects from a
set of predefined filters, while the SEATS method
derives its filters from a decomposition of the
ARIMA model fit to the observed series into
models for the trend-cycle, seasonal, and irregular
components. The SEATS filters therefore are more
tailored to the specific properties of the series
as reflected in the ARIMA model fit, while X-11
filters are nonparametric in that they can fit a wide
variety of series without depending on a specific
model.

Evaluation
A series should be seasonally adjusted if three
conditions are satisfied: the series is seasonal, the
seasonal effects can be estimated reliably, and no
residual seasonality is left in the adjusted series. A
variety of diagnostic tools is available for the X-11
method to test for these conditions, including
frequency-spectrum estimates, revision-history
statistics, and various seasonal tests. The X-13
program provides some of the above diagnostics for SEATS analysis but also provides its own
battery of model-based diagnostics. If diagnostic
testing shows that any of the three conditions
listed fails to hold for a given series, that series is
deemed not suitable for seasonal adjustment.

CONCURRENT SEASONAL
ADJUSTMENT
Concurrent seasonal adjustment of national labor
force data began for CPS with the release of estimates for December 2003 in January 2004. This
practice replaced the projected-factor method,
which updated seasonal factors twice a year.
Under the latter procedure, projected seasonal
factors were used to seasonally adjust the new
original data as they were collected. At midyear,
the historical series were updated with data for
January through June, and the seasonal adjustment program was rerun to produce projected
seasonal factors for July through December.
With concurrent seasonal adjustment, the seasonal adjustment program is rerun each month
as the latest CPS data become available. The
seasonal factors for the most recent month are
produced by applying a set of moving averages
to the entire data set, extended by extrapolations,
including data for the current month. While all
previous-month seasonally adjusted estimates
are recalculated in this process, BLS policy is
not to revise previous months’ official seasonally adjusted CPS estimates as new data become
available during the year. Instead, revisions are
introduced for the most recent 5 years of data at
the end of each year.
Numerous studies, including that discussed in
Methee and McIntire (1987) on the CPS labor force series, have indicated that concurrent
adjustment generally produces initial seasonally
adjusted estimates requiring smaller revisions than
do estimates produced with the projected-factor
method. Revisions to data for previous months
also may produce gains in accuracy, especially
when the original data are themselves regularly
revised on a monthly basis. Publishing numerous
revisions during the year, however, can confuse
data users.
The case for revisions to previous-month seasonally adjusted estimates is less compelling for CPS
series because the original sample data normally
are not revised. Moreover, an empirical investigation indicated that there were no substantial
gains in estimating month-to-month changes by
introducing revisions to the data for the previous
month. For example, it was found that if previous-month revisions were made to the labor force
series, the overall unemployment rate would be
different in only 2 months between January 2001
and November 2002, in each case by only one-tenth of a percentage point.

AGGREGATION PROCEDURES
BLS directly seasonally adjusts national series
on the basis of age, sex, industry, education, and
other characteristics. BLS also provides seasonally
adjusted totals, subtotals, and ratios of selected
series. It is possible to seasonally adjust an aggregate series either directly or indirectly by seasonally adjusting its components and adding the
results, or dividing in the case of ratios. Indirect
and direct adjustments usually will not give identical results because (1) seasonal patterns vary
across series, (2) there are inherent nonlinearities
in the X-13 program, (3) many series are multiplicatively adjusted, and (4) some series are ratios.
BLS uses indirect seasonal adjustment for most of
the major labor force aggregates. Besides retaining, so far as possible, the essential accounting
relationships, the indirect approach is needed
because many of the aggregates include components having different seasonal and trend characteristics that sometimes require different modes of
adjustment.
Examples of indirectly seasonally adjusted series
are the levels of total unemployment, employment, and the civilian labor force, as well as the
unemployment rate for all civilian workers. These
series are produced by the aggregation of some
or all of the seasonally adjusted series for the
eight major civilian labor force components. The
seasonally adjusted level of total unemployment
is the sum of the seasonally adjusted levels of
unemployment for four age/sex groups: men 16
to 19 years, women 16 to 19 years, men 20 years
and over, and women 20 years and over. Likewise,
seasonally adjusted civilian employment is the
sum of employment in all industries for the same
four age/sex groups. The seasonally adjusted civilian labor force is the sum of all eight components.
The seasonally adjusted civilian unemployment
rate is computed as the ratio of the total seasonally adjusted unemployment level to the total
seasonally adjusted civilian labor force (expressed
as a percentage).
A problem with producing seasonally adjusted
estimates for a series by aggregation is that
seasonal adjustment factors cannot be directly
computed for that series. Implicit seasonal adjustment factors, however, can be calculated after the
fact by taking the ratio of the unadjusted aggregate to the seasonally adjusted aggregate or, for
additive implicit factors, the difference between
those two aggregates.
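A sketch of this bookkeeping with hypothetical numbers (illustration only, not published estimates), covering the summation, the rate, and an implicit multiplicative factor:

# Hypothetical seasonally adjusted unemployment components (thousands).
men_16_19, women_16_19 = 450.0, 400.0
men_20_over, women_20_over = 3_100.0, 2_700.0

# Indirect adjustment: the aggregate is the sum of adjusted components.
unemployment_sa = men_16_19 + women_16_19 + men_20_over + women_20_over

labor_force_sa = 160_000.0  # assumed sum of all eight components
rate_sa = 100.0 * unemployment_sa / labor_force_sa

# Implicit multiplicative factor: unadjusted over adjusted aggregate.
unemployment_nsa = 7_000.0  # hypothetical unadjusted aggregate
implicit_factor = unemployment_nsa / unemployment_sa

print(round(rate_sa, 1), round(implicit_factor, 3))  # 4.2 1.053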

REFERENCES

Bowler, M., R. E. Ilg, S. Miller, E. Robison, and
A. Polivka, "Revisions to the Current Population
Survey Effective in January 2003," 2003.

Box, G. E. P., and G. M. Jenkins, Time Series
Analysis: Forecasting and Control, Holden Day,
San Francisco, 1970.

Box, G. E. P., and G. C. Tiao, "Intervention Analysis
with Applications to Economic and Environmental
Problems," Journal of the American Statistical
Association, 70:349, 1975, pp. 70–79.

Chang, I., G. C. Tiao, and C. Chen, "Estimation
of Time Series Parameters in the Presence of
Outliers," Technometrics, 1988, pp. 193–204.

Dagum, E. B., "The X-11-ARIMA Seasonal
Adjustment Method," Statistics Canada, Catalog
No. 12-564E, 1980.

Findley, D. F., B. C. Monsell, W. R. Bell, M. C. Otto,
and B. Chen, "New Capabilities and Methods of
the X-12-ARIMA Seasonal Adjustment Program,"
Journal of Business and Economic Statistics, April
1998, pp. 127–152.

Gomez, V., and A. Maravall, "Seasonal Adjustment
and Signal Extraction in Economic Time Series,"
in A Course in Time Series Analysis, eds. D. Peña,
G. C. Tiao, and R. S. Tsay, pp. 202–247, J. Wiley and
Sons, New York, 2001.

Kendall, M., and J. K. Ord, Time Series, University
Press, New York, 1990.

McIntire, R. J., R. B. Tiller, and T. D. Evans,
"Revision of Seasonally Adjusted Labor Force
Series," 2002.

Methee, G. R., and R. J. McIntire, "An Evaluation
of Concurrent Seasonal Adjustment for the Major
Labor Force Series," Proceedings of the Business
and Economic Statistics Section, American
Statistical Association, 1987, pp. 358–363.

Shiskin, J., A. Young, and J. Musgrave, "The
X-11 Variant of the Census Method II Seasonal
Adjustment Program," Technical Paper 15,
U.S. Census Bureau, 1967.

U.S. Bureau of Labor Statistics, "Local Area
Unemployment Statistics: Estimation," Handbook
of Methods, 2018.

U.S. Census Bureau, X-13ARIMA-SEATS Reference
Manual (Version 1.1), 2017.


Unit 3
Current Population Survey
Operations


Chapter 3-1: Instrument Design
BASIC DESIGN
Chapter 1-2, Questionnaire Concepts and
Definitions, describes the current concepts and
definitions underpinning the Current Population
Survey (CPS) data collection instrument. This
chapter offers a general description of how the
data collection is designed and how it operates.
The current data collection instrument is designed
to be administered by live interviewers using a
laptop computer or calling from a centralized telephone facility. The CPS is designed to use computer-assisted interviewing (CAI). CAI consists of
either computer-assisted telephone interviewing
(CATI) or computer-assisted personal interviewing (CAPI), which can be conducted in person
or by telephone. Interviewers ask the survey
questions as they appear on the screen of the
laptop (or desktop) computer and then type the
responses directly into the computer. A portion of
sample households—currently about 10 percent—
is interviewed via CATI at contact centers located
in Tucson, Arizona, and Jeffersonville, Indiana.
CAI methods allow greater flexibility in questionnaire design than paper-and-pencil data collection
methods. Complicated skips, respondent-specific
question wording, and carryover of data from one
interview to the next are all possible in an automated environment.
CAI allows capabilities such as:
•	The use of dependent interviewing, i.e., carrying over information from the previous month for industry, occupation, and duration of unemployment data.

•	The use of respondent-specific question wording based on the person's name, age, and sex, answers to prior questions, household characteristics, etc. By automatically bringing up the next question on the interviewer's screen, CAI reduces the probability that an interviewer will ask the wrong set of questions.

•	The inclusion of several built-in editing features, including automatic checks
for internal consistency and unlikely responses
and verification of answers. With these built-in
editing features, errors can be caught and
corrected during the interview itself.
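To illustrate the idea (the field names and edit rules below are invented for illustration, not the CPS instrument's actual checks), a consistency edit and a respondent-specific text fill might look like this in Python:

# Hypothetical CAI-style text fill and edit checks; illustration only.

def fill_question(name):
    """Respondent-specific wording filled in by the instrument."""
    return f"Last week, did {name} do any work for pay or profit?"

def edit_checks(answers):
    """Flag internally inconsistent or unlikely responses immediately."""
    flags = []
    if answers["worked_last_week"] and answers["hours_last_week"] == 0:
        flags.append("Worked last week but zero hours reported: verify.")
    if answers["hours_last_week"] > 100:
        flags.append("More than 100 hours reported: unlikely, verify.")
    return flags

print(fill_question("Maria"))
print(edit_checks({"worked_last_week": True, "hours_last_week": 0}))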
The CPS instrument is created using Blaise®, a
CAI system and survey processing tool for the
Windows operating system and the Internet.
Developed by Statistics Netherlands and licensed
and supported in the United States by Westat,
Blaise® includes advanced capabilities that meet
the CAI needs of the CPS. The instrument consists
of complicated skip patterns and automated question text fills. Other features include help screens
and pop-up check boxes to guide and assist
accurate data entry. The instrument also collects
paradata, which is respondent contact history
information recorded by interviewers to help better evaluate the survey experience and willingness
to respond.
Typically, households’ first and fifth monthly
interviews are conducted in person. The remaining monthly interviews are generally done by
telephone. The survey instrument questions differ
slightly, depending on where households are in
their schedule of monthly interviews. The first and
fifth interviews collect the vast majority of the
demographic information, which is carried forward
to subsequent months. The fourth and eighth
interviews collect information on earnings and
union membership for wage and salary workers.
This information is often referred to as outgoing
rotation data. Earnings data are collected only
in the outgoing rotations in order to reduce the
potential for higher nonresponse in other monthly
interviews.
At the end of each day’s interviewing, the data
collected are transmitted to the U.S. Census
Bureau’s central database. Once files are transmitted, they are deleted from the laptops.
A copy of the questionnaire can be obtained from
the Internet.

CURRENT POPULATION SURVEY
QUESTIONNAIRE
The CPS has been conducted for more than
three-quarters of a century and produces some
of the most important data used to develop
economic and social policy in the United States.
Although the U.S. economy and society have
experienced major changes since the survey was
first conducted in 1940, the concepts used in the
measurement of unemployment and the unemployment rate have remained largely unchanged.
During the history of the survey, there have been
several wholesale reviews of CPS concepts, most
notably two presidentially appointed evaluations—the Gordon Committee in the 1960s and the
Levitan Commission in the 1970s. While both of
these reviews found the main labor force definitions to be sound, they suggested tightening the
definition of unemployment at the margins. Many
recommendations from the Gordon Committee
were implemented into a major redesign of the
CPS in 1967. From 1967 to 1993, the survey questionnaire remained virtually unchanged. A major
redesign of the survey in 1994 implemented many
of the Levitan Commission recommendations. In
addition, the CPS questionnaire has expanded
to collect more detail about a variety of demographic and labor market characteristics over the
years. For more information about the history of
the CPS, see Dunn, Haugen, and Kang (2018).
The remainder of this chapter describes the work
performed on the data collection instrument and
changes implemented in 1994 because of that
work. A description of more recent improvements and additions to the survey instrument also
appears in this chapter.

1994 REDESIGN
A major redesign of the CPS was implemented
in January 1994. The 1994 revisions to the survey were designed to take advantage of major
advances in survey research methods and data
collection technology, as well as to account
for recommendations made by the Levitan
Commission. This section describes in detail the
work performed on the data collection instrument
and changes implemented in 1994.


Objectives of the Redesign
There were five main objectives in redesigning the
CPS questionnaire:
•	Better operationalize existing definitions and reduce reliance on volunteered responses.

•	Reduce the potential for response error in the questionnaire-respondent-interviewer interaction and, hence, improve measurement of CPS concepts.

•	Implement minor definitional changes within the labor force classifications.

•	Expand the labor force data available and improve longitudinal measures.

•	Exploit the capabilities of CAI for improving data quality and reducing respondent burden. See Copeland and Rothgeb (1990) for a fuller discussion.

Enhanced Accuracy
In redesigning the CPS questionnaire, the Bureau
of Labor Statistics (BLS) and the Census Bureau
developed questions to lessen the potential for
response error. Among the approaches used were:
•	Shorter, clearer question wording.

•	Splitting complex questions into two or more separate questions.

•	Building concept definitions into question wording.

•	Reducing reliance on volunteered information.

•	Explicit and implicit strategies for the respondent to provide numeric data such as hours and earnings.

•	Use of revised, precoded response categories for open-ended questions (Copeland and Rothgeb, 1990).

Definitional Changes
The labor force definitions used in the CPS have
undergone only minor modifications since the
survey’s inception in 1940 and, with only one
exception, the definitional changes and refinements made in 1994 were small. The one major
definitional change dealt with the concept of
discouraged workers, i.e., people outside the labor
force who are not looking for work because they
believe that there are no jobs available for them.
As noted in Chapter 1-2, Questionnaire Concepts
and Definitions, discouraged workers are similar
to the unemployed in that they are not working
and want a job. Since they are not conducting an
active job search, however, they do not satisfy a
key element necessary to be classified as unemployed. The previous measurement of discouraged
workers was criticized by the Levitan Commission
as being arbitrary and subjective. It was deemed
arbitrary because assumptions about a person’s
availability for work were made from responses
to a question on why the respondent was not
currently looking for work. It was considered
subjective because the measurement was based
on a person’s stated desire for a job regardless of
whether the individual had ever looked for work.
A new, more precise measurement of discouraged
workers was introduced that specifically asked if
a person had searched for a job during the prior
12 months and was available for work. The new
questions also enable estimation of the number
of people outside the labor force who, although
they cannot be precisely defined as discouraged,
satisfy many of the same criteria as discouraged
workers and, thus, show some marginal attachment to the labor force.
Other minor conceptual and questionnaire
changes were made to fine-tune the definitions of
unemployment, categories of unemployed people, and people who were employed part-time for
economic reasons.

New Labor Force Information Introduced
With the revised questionnaire, several types of
labor force data became available regularly for
the first time. For example, information became
available each month on the number of employed
people who have more than one job and the hours
multiple jobholders work on their main job and
all of their other jobs combined. By separately
collecting information on the number of hours
multiple jobholders work on their main job and
secondary jobs, it became possible to derive estimates of the number of workers who combined
two or more part-time jobs into a full-time workweek and the number of full- and part-time jobs
in the economy. The inclusion of the multiple job
questions also improves the accuracy of answers
to the questions on hours worked and facilitates
comparisons of employment estimates from the
CPS with those from the Current Employment
Statistics (CES), the survey of nonfarm business
establishments. (For a discussion of the CES, see
the "BLS Handbook of Methods," referenced at
the end of the chapter.) In addition, beginning
in 1994, monthly data on the number of hours
usually worked per week and data on the number
of discouraged workers became available from
the entire CPS sample rather than just from those
respondents who were in their fourth or eighth
monthly interviews.

Evaluation and Selection of Revised
Questions
Planning for the CPS questionnaire revisions
implemented in 1994 began 8 years beforehand,
when BLS and the Census Bureau convened a task
force to identify areas for improvement. Studies
employing methods from the cognitive sciences
were conducted to test possible solutions to the
problems identified. These studies included interviewer focus groups, respondent focus groups,
respondent debriefings, a test of interviewers’
knowledge of concepts, in-depth cognitive laboratory interviews, response categorization research,
and a study of respondents’ comprehension
of alternative versions of labor force questions
(Campanelli, Martin, and Rothgeb, 1991; Edwards,
Levine, and Cohany, 1989; Fracasso, 1989;
Gaertner, Cantor, and Gay, 1989; Martin, 1987;
Palmisano, 1989).
The revised questionnaire developed jointly by
Census Bureau and BLS staff used information
collected in a large two-phase test of question
wording in addition to qualitative research. During
Phase I, two alternative questionnaires were tested
using the then official questionnaire as the control.
During Phase II, one alternative questionnaire was
tested with the control. The questionnaires were
tested using computer-assisted telephone interviewing and a random-digit dialing sample (CATI/
RDD). During these tests, interviews were conducted from the centralized telephone interviewing contact centers of the Census Bureau (Polivka
and Rothgeb, 1993).
Both quantitative and qualitative information
was used in the two phases to select questions,
identify problems, and suggest solutions. Analyses
were based on information from item response
distributions, respondent and interviewer debriefing data, and behavior coding of interviewer and
respondent interactions. For more on the evaluation methods used for redesigning the questions, see Esposito et al. (1992) and Esposito and
Rothgeb (1997).

Item Response Analysis
The primary use of item response analysis was
to determine whether different questionnaires
produce different response patterns, which may
affect the labor force estimates. Unedited data
were used for this analysis. Statistical tests were
conducted to ascertain whether differences
among the response patterns of different questionnaire versions were statistically significant. The
statistical tests were adjusted to take into consideration the use of a nonrandom clustered sample, repeated measures over time, and multiple
persons in a household.
Response distributions were analyzed for all items
on the questionnaires. The response distribution
analysis indicated the degree to which new measurement processes produced different patterns
of responses. Data gathered using the other methods outlined above also aided interpretation of the
response differences observed. (Response distributions were calculated based on people who
responded to the item, excluding those whose
response was “don’t know” or “refused.”)

Respondent Debriefings
At the end of the test interview, respondent
debriefing questions were administered to a sample of respondents to measure respondent comprehension and response formulation. From these
data, indicators of how respondents interpret
and answer the questions and some measures of
response accuracy were obtained.
The debriefing questions were tailored to the
respondent and depended on the path the interview had taken. Two forms of respondent debriefing questions were administered—probing questions and vignette classification. Question-specific
probes were used to ascertain whether certain
words, phrases, or concepts were understood by
respondents in the manner intended (Esposito
et al., 1992). For example, those who did not
indicate in the main survey that they had done any
work were asked the direct probe, “LAST WEEK
did you do any work at all, even for as little as 1
hour?” The classification of vignette debriefings
was designed to determine if alternative question
wording altered respondents’ understanding of
the question and changed their perception of the
concepts underlying them. An example of the
vignettes respondents received is, “Last week,
Amy spent 20 hours at home doing the accounting for her husband’s business. She did not receive
a paycheck.” Individuals were asked to classify the
person in the vignette as working or not working based on the wording of the question they
received in the main survey. For example, “Would
you report her as working last week, not counting work around the house?” if the respondent
received the unrevised questionnaire, or “Would
you report her as working for pay or profit last
week?” if the respondent received the current,
revised questionnaire (Martin and Polivka, 1995).

Behavior Coding
Behavior coding entails monitoring or audiotaping
interviews and recording significant interviewer
and respondent behaviors (for example, minor
or major changes in question wording, probing behavior, inadequate answers, requests for
clarification). During the early stages of testing,
behavior coding data were useful in identifying
problems with proposed questions. For example,
if interviewers frequently reworded a question, this
may indicate that the question was too difficult to
ask as worded; respondents’ requests for clarification may indicate that they were experiencing
comprehension difficulties; or interruptions by
respondents may indicate that a question was too
lengthy (Esposito et al., 1992).
During later stages of testing, the objective of
behavior coding was to determine whether the
revised questionnaire improved the quality of
interviewer/respondent interactions as measured
by accurate reading of the questions and adequate responses by respondents. Additionally,
results from behavior coding helped identify areas
of the questionnaire that would benefit from
enhancements to interviewer training.

Interviewer Debriefings
The primary objective of interviewer debriefing
was to identify areas of the revised questionnaire
or interviewer procedures that were problematic
for interviewers or respondents. The information collected was used to identify questions
that needed revision, and to modify interviewer
training and the interviewer manual. A secondary
objective was to obtain information about the
questionnaire, interviewer behavior, or respondent behavior that would help explain differences
observed in the labor force estimates from the
different measurement processes.
Two different techniques were used to debrief
interviewers. The first was the use of focus groups
at the centralized telephone interviewing contact centers and in geographically dispersed
regional offices (ROs). The focus groups were
conducted after interviewers had at least 3 to 4
months of experience using the revised CPS instrument. Approximately 8 to 10 interviewers were
selected for each focus group. Interviewers were
selected to represent different levels of experience and ability. The second technique was the
use of a self-administered, standardized interviewer debriefing questionnaire. Once problematic
areas of the revised questionnaire were identified
through the focus groups, a standardized debriefing questionnaire was developed and administered
to all interviewers. See Esposito and Hess (1992)
for more information on interviewer debriefing.

HIGHLIGHTS OF THE 1994
QUESTIONNAIRE REVISION
General
Definition of reference week. In the interviewer
debriefings that were conducted in 13 different geographic areas during 1988, interviewers
reported that the major activity question, “What
were you doing most of LAST WEEK, working or
something else?” was unwieldy and sometimes
misunderstood by respondents. In addition to not
always understanding the intent of the question,
respondents were unsure what was meant by the
time period “last week” (BLS, 1988). A respondent
debriefing conducted in 1988 found that only 17
percent of respondents had definitions of “last
week” that matched the CPS definition of Sunday
through Saturday of the reference week. The
majority (54 percent) of respondents defined “last
week” as Monday through Friday (Campanelli et
al., 1991).

In the revised questionnaire, an introductory statement was added with the reference period clearly
stated. The new introductory statement reads, “I
am going to ask a few questions about work-related activities LAST WEEK. By last week I mean
the week beginning on Sunday, August 9
and ending Saturday, August 15.” This statement makes the reference period more explicit
to respondents. Additionally, the major activity
question was deleted from the questionnaire. In
the past, the question had served as a preamble to
the labor force questions, although many people’s
labor force status was found to be determined by
responses to it. In the revised questionnaire, the
survey content is defined in the introductory statement, which also defines the reference week.
Direct question on presence of business. The
definition of employed persons includes those
who work without pay for at least 15 hours per
week in a family business. In the pre-1994 questionnaire, there was no direct question on the
presence of a business in the household. Such a
question is included in the revised questionnaire.
This question is asked only once for the entire
household prior to the labor force questions. The
question reads, “Does anyone in this household
have a business or a farm?” This question determines whether a business exists and who in the
household owns the business. The primary purpose of this question is to screen for households
that may have unpaid family workers, not to
obtain an estimate of household businesses. For
a fuller discussion of the need for a direct question on presence of a business, see Rothgeb et al.
(1992), Copeland and Rothgeb (1990), and Martin
(1987).
For households that have a family business, direct
questions are asked about unpaid work in the family business by all people who were not reported
as working last week in an earlier question. BLS
produces monthly estimates of unpaid family
workers who work 15 or more hours per week.

Employment-Related Revisions
Revised “at work” question. Having a direct question on the presence of a family business not only
improved the estimates of unpaid family workers,
but also permitted a revision of the “at work”
question. In the pre-1994 questionnaire, the “at
work” question read, “LAST WEEK, did you do any
work at all, not counting work around the house?”
In the revised questionnaire, the wording reads,
“LAST WEEK did you do ANY work for (either)
pay (or profit)?” (The parentheticals in the question are displayed only when a business or farm
is in the household.) The revised wording “work
for pay (or profit)” better captures the concept
of work that BLS is attempting to measure. See
Martin (1987) or Martin and Polivka (1995) for a
fuller discussion of problems with the measurement of “work.”
Direct question on multiple jobholding. In the
pre-1994 questionnaire, the actual hours question
read: “How many hours did you work last week
at all jobs?” During the interviewer debriefings
conducted in 1988, it was reported that respondents do not always hear the last phrase “at all
jobs.” Some respondents who work at two jobs
may have only reported hours for one job (BLS,
1988). In the revised questionnaire, a question is
included at the beginning of the hours series to
determine whether the person holds multiple jobs.
A follow-up question also asks for the number of
jobs the multiple jobholder has. Multiple jobholders are asked about their hours on their main job
and other job(s) separately to avoid the problem
of multiple jobholders not hearing the phrase “at
all jobs.” These new questions also allow monthly
estimates of multiple jobholders to be produced.
Hours series. The pre-1994 question on “hours
worked” read: “How many hours did you work last
week at all jobs?” If a person reported 35 to 48
hours worked, additional follow-up probes were
asked to determine whether the person worked
any extra hours or took any time off. Interviewers
were instructed to correct the original report of
actual hours, if necessary, based on responses to
the probes. The hours data are important because
they are used to determine the sizes of the fulltime and part-time labor forces. It is unknown
whether respondents reported exact actual hours,
usual hours, or some approximation of actual
hours.
In the revised questionnaire, a revised hours series
was adopted. An anchor-recall estimation strategy was used to obtain a better measure of actual
hours and to address the issue of work schedules
more completely. For multiple jobholders, separate data on hours worked at a main job and other
jobs are also collected. The revised questionnaire
first asks about the number of hours a person
usually works at the job. Then separate questions
are asked to determine whether a person worked
extra hours or fewer hours. Finally, a question is
asked on the number of actual hours worked last
week. The new hours series improves the accuracy
of the data and allows monthly estimates of usual
hours worked to be produced for all employed
people (Polivka and Rothgeb, 1993). In the pre-1994 questionnaire, usual hours were obtained
only in the outgoing rotation for employed private
wage and salary workers and were available only
on a quarterly basis. The revised questionnaire
also permits estimates of usual and actual hours
on multiple jobholders’ main jobs and their other
jobs. This information is used to derive estimates
of the number of multiple jobholders who work
full-time on their main jobs and part-time on their
other jobs, those who work full-time on both their
main jobs and their other jobs, and those who
work part-time on all of their jobs.
Industry and occupation (Dependent interviewing). Prior to the 1994 revision, CPS industry and
occupation (I&O) data were not always consistent
from month to month for the same person in the
same job. These inconsistencies arose, in part,
because the household respondent frequently
varies from one month to the next. Furthermore, it
is sometimes difficult for a respondent to describe
an occupation consistently from month to month.
Moreover, distinctions at the three-digit occupation and industry level, i.e., at the most detailed
classification level, can be very subtle. To obtain
more consistent data and make full use of the
automated interviewing environment, dependent
interviewing for the I&O question, which uses
information collected during the previous month’s
interview in the current month’s interview, was
implemented in the revised questionnaire for
month-in-sample 2−4 (MIS 2–4) households and
MIS 6−8 households. Different variations of dependent interviewing were evaluated during testing.
See Rothgeb et al. (1992) for more detail.
In the revised CPS, respondents are provided the
name of their employer from the previous month
and asked if they still work for that employer. If
they answer “no,” respondents are asked the independent questions on I&O.
If they answer “yes,” respondents are asked, “Have
the usual activities and duties of your job changed
since last month?” If individuals say “yes,” they are
then asked the independent questions on occupation, usual activities or duties, and class of worker.
If their usual duties have not changed, individuals
are asked to verify the previous month’s description through the question “Last month, you were
reported as (previous month’s occupation or kind
of work performed) and your usual activities were
(previous month’s duties). Is this an accurate
description of your current job?”
If they answer “yes,” the previous month’s occupation and class of worker are brought forward and
no coding is required. If they answer “no,” they are
asked the independent questions on occupation,
usual activities or duties, and class of worker. This
redesign permits a direct inquiry about job change
before the previous month’s information is provided to the respondent.
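The branching just described can be summarized in a simplified sketch. The function and field names below are illustrative, not the actual CPS instrument logic.

    # Simplified sketch of the dependent industry-and-occupation (I&O)
    # interview flow described above; names are illustrative, not the
    # actual CPS instrument.
    def ask_independent_io_questions():
        return "collect industry, occupation, usual duties, class of worker"

    def ask_independent_occupation_questions():
        return "collect occupation, usual duties, class of worker"

    def io_flow(prev, ask):
        """prev: last month's record; ask: function that poses a question."""
        if ask("Do you still work for " + prev["employer"] + "?") == "no":
            return ask_independent_io_questions()
        if ask("Have the usual activities and duties of your job "
               "changed since last month?") == "yes":
            return ask_independent_occupation_questions()
        accurate = ask("Last month, you were reported as " + prev["occupation"]
                       + " and your usual activities were " + prev["duties"]
                       + ". Is this an accurate description of your current job?")
        if accurate == "yes":
            return prev  # bring last month's codes forward; no recoding required
        return ask_independent_occupation_questions()

    # Example: a respondent whose job is unchanged from last month.
    record = {"employer": "Acme Corp", "occupation": "machinist",
              "duties": "operating lathes"}
    print(io_flow(record,
                  lambda q: "yes" if "still work" in q or "accurate" in q else "no"))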
Earnings. The earnings series in the revised questionnaire is considerably different from that in the
pre-1994 questionnaire. In the pre-1994 questionnaire, persons were asked whether they were paid
by the hour, and if so, what the hourly wage was.
All wage and salary workers were then asked for
their usual weekly earnings. In the pre-1994 version, earnings could be reported as weekly figures
only, even though that may not have been the
easiest way for the respondent to recall and report
earnings. Data from early tests indicated that only a small proportion (14 percent) of non-hourly-wage workers (n = 853) were paid at a weekly rate, and less than 25 percent of non-hourly-wage workers (n = 1,623) found it easiest to report earnings as a weekly amount (Polivka and Rothgeb, 1993).
In the revised questionnaire, the earnings series is
designed to first request the periodicity for which
the respondent finds it easiest to report earnings
and then request an earnings amount in the specified periodicity as displayed below. The wording
of questions requesting an earnings amount is
tailored to the periodicity identified earlier by the
respondent. (Because data on weekly earnings are
published quarterly by BLS, earnings data provided by respondents in periodicities other than
weekly are converted to a weekly earnings estimate later during processing operations.)
Revised Earnings Series (Selected items)

1. For your (MAIN) job, what is the easiest way for you to report your total earnings BEFORE taxes or other deductions: hourly, weekly, annually, or on some other basis?

2. Do you usually receive overtime pay, tips, or commissions (at your MAIN job)?

3. (Including overtime pay, tips, and commissions) What are your usual (weekly, monthly, annual, etc.) earnings on this job, before taxes or other deductions?

As can be seen from the revised questions presented above, other revisions to the earnings
series include a specific question to determine
whether a person usually receives overtime pay,
tips, or commissions. If so, a statement precedes
the earnings questions that reminds respondents
to include overtime pay, tips, and commissions
when reporting earnings. If a respondent reports
that it is easiest to report earnings on an hourly
basis, then a separate question is asked regarding
the amount of overtime pay, tips, and commissions usually received, if applicable.
Individuals for whom the easiest way to report is
hourly are assumed to be paid hourly. An additional question is asked of people who do not
report that it is easiest to report their earnings
hourly. The question is displayed below. Studies of
the number of minimum-wage workers are based
upon those individuals identified as hourly wage
workers.
“Even though you told me it is easier to
report your earnings annually, are you
PAID AT AN HOURLY RATE on this job?”
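
Because weekly earnings are the published concept, amounts reported in other periodicities must be converted during processing. The sketch below is a minimal illustration of such a conversion; the factors shown are simple calendar ratios and are not taken from the actual CPS processing specifications.

    # Minimal sketch of converting reported earnings to a weekly
    # estimate; conversion factors are simple calendar ratios, not the
    # actual CPS processing specifications.
    def weekly_earnings(amount, periodicity, usual_hours_per_week=None):
        if periodicity == "weekly":
            return amount
        if periodicity == "biweekly":
            return amount / 2
        if periodicity == "monthly":
            return amount * 12 / 52
        if periodicity == "annual":
            return amount / 52
        if periodicity == "hourly":
            if usual_hours_per_week is None:
                raise ValueError("hourly pay requires usual weekly hours")
            return amount * usual_hours_per_week
        raise ValueError("unknown periodicity: " + periodicity)

    print(weekly_earnings(52000, "annual"))     # 1000.0
    print(weekly_earnings(15.0, "hourly", 40))  # 600.0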

Unemployment-Related Revisions
People on layoff (direct question). Previous
research (Rothgeb, 1982; Palmisano, 1989)
demonstrated that the pre-1994 question on layoff status—“Did you have a job or business from
which you were temporarily absent or on layoff
LAST WEEK?”—was long, awkwardly worded, and
frequently misunderstood by respondents. Some
respondents heard only part of the question,
while others thought that they were being asked
whether they had a business.
In an effort to reduce response error, the revised
questionnaire includes two separate direct questions about layoff and temporary absences. The
layoff question is, “LAST WEEK, were you on
layoff from a job?” Questions asked later screen
out those who do not meet the criteria for layoff
status.
People on layoff (expectation of recall). The
official definition of layoff includes the criterion of
an expectation of being recalled to the job. In the
pre-1994 questionnaire, people reported to be on layoff were never directly asked whether they expected to be recalled. In an effort to better capture the existing definition, people reported to be on layoff in the revised questionnaire are asked,
“Has your employer given you a date to return to
work?” People who respond that their employers
have not given them a date to return are asked,
“Have you been given any indication that you will
be recalled to work within the next 6 months?” If
the response is positive, their availability is determined by the question, “Could you have returned
to work LAST WEEK if you had been recalled?”
People who do not meet the criteria for layoff are
asked the job search questions so they still have
an opportunity to be classified as unemployed.
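
Taken together, the layoff sequence effectively implements the following classification rule. This is a schematic sketch of the logic described above, not the actual instrument code.

    # Schematic sketch of the layoff classification described above;
    # not the actual CPS instrument or edit specifications.
    def on_layoff(has_return_date, expects_recall_in_6_months,
                  available_last_week):
        """Return True if the person meets the CPS criteria for layoff."""
        expects_recall = has_return_date or expects_recall_in_6_months
        return expects_recall and available_last_week

    # A person with no recall expectation is routed to the job search
    # questions instead, so they may still be classified as unemployed.
    print(on_layoff(False, True, True))   # True
    print(on_layoff(False, False, True))  # False -> job search path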
Job search methods. The concept of unemployment requires, among other criteria, an active job
search during the previous 4 weeks. In the pre-1994 questionnaire, the following question was
asked to determine whether a person conducted
an active job search. “What has ... been doing in
the last 4 weeks to find work?” Responses that
could be checked included:
•	 Public employment agency.
•	 Private employment agency.	
•	 Employer directly.
•	 Friends and relatives.
•	 Placed or answered ads.
•	 Nothing.
•	 Other.
Interviewers were instructed to code all passive
job search methods, i.e., a job search method
that could not directly lead to a job offer, into
the “nothing” category. This included such activities as looking at newspaper ads, attending job
training courses, and practicing typing. Only
active job search methods for which no appropriate response category exists were to be coded as
“other.”
In the revised questionnaire, several additional
response categories were added and the response
options were reordered and reformatted to more
clearly represent the distinction between active
job search methods and passive methods. The
revisions to the job search methods question grew
out of concern that interviewers were confused
by the precoded response categories. This was
evident even before the analysis of the CATI/RDD
test. Martin (1987) conducted an examination
of verbatim entries for the “other” category and
found that many of the “other” responses should
have been included in the “nothing” category. The
analysis also revealed responses coded as “other”
that were too vague to determine whether an
active job search method had been undertaken.
Fracasso (1989) also concluded that the current
set of response categories was not adequate for
accurate classification of active and passive job
search methods.
During development of the revised questionnaire,
two additional passive categories were included:
(1) “looked at ads” and (2) “attended job training
programs/courses.” Two additional active categories were included: (1) “contacted school/
university employment center” and (2) “checked
union/professional registers.” Later research also
demonstrated that interviewers had difficulty
coding relatively common responses such as “sent
out resumes” and “went on interviews”; thus, the
response categories were further expanded to
reflect these common job search methods.
Research conducted in the 2000s indicated a need
for instructions on how to code job search methods such as “online advertisements” or “placed
resume online.” The result of this research was
the introduction in 2014 of clearer descriptions
on how to code such job search methods into
the interviewer instructions and the interviewer
manual.
Duration of job search and layoff. The duration
of unemployment is an important labor market
indicator published monthly by BLS. In the pre-1994 questionnaire, this information was collected
by the question “How many weeks have you been
looking for work?” This wording forced people
to report in a periodicity that may not have been
meaningful to them, especially for the longer-term
unemployed. In addition, asking for the number of
weeks (rather than months) may have led respondents to underestimate the duration. In the revised
questionnaire, the question reads: “As of the end
of LAST WEEK, how long had you been looking
for work?” Respondents can select the periodicity
themselves and interviewers are able to record the
duration in weeks, months, or years.
To avoid the clustering of answers around whole
months, the revised questionnaire also asks those
who report duration in whole months (between
1 and 4 months) a follow-up question to obtain
an estimated duration in weeks, “We would like
to have that in weeks, if possible. Exactly how
many weeks had you been looking for work?” The
purpose of this is to lead people to report the
exact number of weeks instead of multiplying their
monthly estimates by four as was done in an earlier test and may have been done in the pre-1994
questionnaire.
As mentioned earlier, the CATI and CAPI technology makes it possible to automatically update
duration of job search and layoff for people who
are unemployed in consecutive months. For people reported to be looking for work for 2 consecutive months or longer, the previous month’s
duration is updated without re-asking the duration questions. For those on layoff for at least 2
consecutive months, the duration of layoff is also
automatically updated. This revision was made to
reduce respondent burden and enhance the longitudinal capability of the CPS. This revision also will
produce more consistent month-to-month estimates of duration. Previous research indicates that
about 25 percent of those unemployed in consecutive months who received the pre-1994 questionnaire (where duration was collected independently
each month) increased their reported durations
by 4 weeks plus or minus a week (Polivka and
Rothgeb, 1993; Polivka and Miller, 1998). A very
small bias is introduced when a person has a brief
(less than 3 or 4 weeks) period of employment in
between surveys. However, testing revealed that
only 3.2 percent of those who had been looking
for work in consecutive months said that they
had worked in the interlude between the surveys.
Furthermore, of those who had worked, none indicated that they had worked for 2 weeks or more.
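
In schematic terms, the dependent update works as in the following sketch. The code is illustrative only; the actual instrument operates on instrument variables, and the number of weeks added reflects the actual interval between reference weeks.

    # Illustrative sketch of dependent updating of unemployment
    # duration between consecutive monthly interviews; not actual CPS
    # instrument code.
    WEEKS_BETWEEN_INTERVIEWS = 4  # typically 4 or 5 weeks between reference weeks

    def updated_duration(prev_duration_weeks, unemployed_last_month,
                         unemployed_this_month):
        """Carry forward and extend duration for people unemployed in
        consecutive months, instead of re-asking the duration questions."""
        if unemployed_last_month and unemployed_this_month:
            return prev_duration_weeks + WEEKS_BETWEEN_INTERVIEWS
        return None  # duration must be collected anew

    print(updated_duration(10, True, True))  # 14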

Revisions to Not-in-the-Labor-Force
Questions
Response options of retired, disabled, and
unable to work. In the pre-1994 questionnaire,
when individuals reported they were retired in
response to any of the labor force items, the interviewer was required to continue asking whether
they worked last week, were absent from a job,
were looking for work, and, in the outgoing rotation, when they last worked and their job histories.
Interviewers commented that elderly respondents
frequently complained that they had to respond
to questions that seemed to have no relevance to
their own situation.
In an attempt to reduce respondent burden, a
response category of “retired” was added to
each of the key labor force status questions in
the revised questionnaire. If individuals 50 years
of age or older volunteer that they are retired,
they are immediately asked a question inquiring
whether they want a job. If they indicate that they
want to work, they are then asked questions about
looking for work and the interview proceeds as
usual. If they do not want to work, these individuals are not asked the remainder of the labor
force questions and they are classified as not in
the labor force—retired. (If they are in the outgoing rotation, an additional question is asked to
determine whether they worked within the last 12
months. If so, the I&O questions are asked about
the last job held.)
A similar change has been made in the revised
questionnaire to reduce the burden for individuals reported to be “unable to work” or “disabled.”
(Individuals who may be “unable to work” for a
temporary period of time may not consider themselves as “disabled” so both response options are
provided.) If a person is reported to be “disabled”
or “unable to work” at any of the key labor force
classification items, a follow-up question is asked
to determine whether the person can do any
gainful work during the next 6 months. Different
versions of the follow-up probe are used depending on whether the person identifies as disabled
or as unable to work. Individuals who indicate that their disability prevents them from working in the next 6 months are also skipped past the remaining labor force questions.
Dependent interviewing for people reported as
retired, disabled, or unable to work. The revised
questionnaire also was designed to use dependent interviewing across months for individuals
reported to be retired, disabled, or unable to
work. An automated questionnaire increases the
ease with which information from the previous
month’s interview can be used during the current
month’s interview.
Once it is reported that the person did not work
during the current month’s reference week, the
previous month’s status of retired (if a person is
50 years of age or older), disabled, or unable to
work is verified, and the regular series of labor
force questions is not asked. This revision reduces
respondent and interviewer burden.
Discouraged workers. The implementation of
the Levitan Commission’s recommendations on
discouraged workers resulted in one of the major
definitional changes in the 1994 redesign (NCEUS,
1979). The Levitan Commission criticized the pre-1994 definition because it was based on a subjective desire for work and questionable inferences
about an individual’s availability to take a job.
As part of the redesign, two requirements were added: for persons to qualify as discouraged, they
must have engaged in some job search within the
past year (or since they last worked if they worked
within the past year), and they must currently be
available to take a job. Formerly, availability was
inferred from responses to other questions; now
there is a direct question.
Data on a larger group of people outside the
labor force (one that includes discouraged workers, as well as those who desire work but give
other reasons for not searching such as child care
problems, family responsibilities, school, or transportation problems) also have been published
monthly since the 1994 redesign. This group is
made up of people who want a job, are available
for work, and have looked for work within the past
year. This group is generally described as having
some marginal attachment to the labor force. Also
beginning in 1994, questions on this subject are
asked of the full CPS sample rather than a quarter
of the sample as was done prior to the redesign,
permitting estimates of the number of discouraged workers to be published monthly rather than
quarterly.
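
In classification terms, discouraged workers are the subset of the marginally attached who cite a job-market-related reason for not currently searching. The sketch below uses hypothetical field names and is simplified relative to the actual definitions.

    # Sketch of the nested definitions: marginally attached workers are
    # a superset of discouraged workers. Field names are illustrative.
    def marginally_attached(wants_job, available, searched_past_year,
                            in_labor_force):
        return (not in_labor_force and wants_job and available
                and searched_past_year)

    def discouraged(wants_job, available, searched_past_year,
                    in_labor_force, reason_not_searching):
        """Discouraged workers cite a job-market-related reason for not
        currently searching (e.g., believes no work is available)."""
        return (marginally_attached(wants_job, available,
                                    searched_past_year, in_labor_force)
                and reason_not_searching == "discouraged over job prospects")

    print(discouraged(True, True, True, False,
                      "discouraged over job prospects"))  # True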
Findings. Tests of the revised questionnaire
showed that the quality of labor force data
improved because of the redesign of the CPS
questionnaire, and in general, measurement error
diminished. Data from respondent debriefings,
interviewer debriefings, and response analysis
demonstrated that the revised questions are
more clearly understood by respondents and
the potential for labor force misclassification is
reduced. Results from these tests formed the
basis for the design of the final revised version of
the questionnaire.
This revised version was tested in a separate year-and-a-half parallel survey prior to implementation
as the official survey in January 1994. In addition,
the unrevised procedures were used with the
parallel survey sample from January 1994 through
May 1994. These parallel surveys were conducted
to assess the effect of the redesign on national
labor force estimates. Estimates derived from the
initial year and a half of the parallel survey indicated that the redesign might increase the unemployment rate by 0.5 percentage points. However,
subsequent analysis using the entire parallel
survey indicates that the redesign did not have a
statistically significant effect on the unemployment rate. (Analysis of the effect of the redesign
on the unemployment rate and other labor force
estimates can be found in Cohany, Polivka, and
Rothgeb [1994]). Analysis of the redesign's effect on the unemployment rate, along with a wide variety of other labor force estimates, using data from the entire parallel survey can be found in Polivka and Miller (1998).

RECENT IMPROVEMENTS IN THE
QUESTIONNAIRE
Changes to Race and Ethnicity Questions
Starting in 2003, race and ethnicity questions
were modified to allow individuals to report being
of more than one race. Hispanics were identified
through a direct question; previously, individuals’
ethnicity was determined through their country
of origin or that of their ancestors (Bowler et al.,
2003). These changes were made so CPS data
would be in accord with Office of Management
and Budget (OMB) guidelines on collection of race
and ethnicity. The changes were the result of a
multiyear research effort that started in 1996 and
included two supplements to the CPS that tested
alternative question wordings.

Disability Questions
Development of disability questions for the CPS
was a long process that started in the late 1990s.
The Presidential Task Force on Employment of
Adults with Disabilities formed the Employment
Rate Measurement Work Group, which included
members from 17 different agencies, to develop
a measure of the employment rate of people
with disabilities. Questions proposed for the CPS
were tested in the National Comorbidity Survey
(NCS) and again in a 2006 supplement to the CPS.
However, the disability rate obtained from the
CPS was much lower than that from the NCS. The
Census Bureau had independently developed and
tested six questions on disability for the American
Community Survey (ACS). Because the NCS and
CPS supplement testing had uncovered problems
and because OMB was encouraging federal surveys to use the same questions, the ACS questions were modified slightly so that they could be
incorporated into the CPS questionnaire. The six
disability questions were added to the monthly
CPS in June 2008 (McMenamin and Hipple, 2014).

Certification and Licensing Questions
The federal Interagency Working Group on
Expanded Measures of Enrollment and Attainment
(GEMEnA) was formed in 2009 to develop questions to measure the prevalence and key characteristics of nondegree credentials, such as
certifications and licenses, for existing federal
surveys. The question development process took
several years and included cognitive testing, focus
groups, consultation with experts, and a number
of pilot studies. After additional cognitive testing and stakeholder outreach, BLS made minor
changes to three general GEMEnA questions
on certifications and licenses so they could be
incorporated into the CPS questionnaire. Fielding
of the three new questions began in January 2015
(Allard, 2016).

Introduction of New Relationship Categories
to Capture Same-Sex Marriages
In 2010, the Interagency Working Group on
Measuring Relationships in Federal Household
Surveys was established, convened, and chaired
by the Statistical and Science Policy Branch of
OMB. At the behest of this group, the Census
Bureau conducted focus groups and cognitive
testing to develop revised relationship categories; the question wording itself did not change. Based on the focus group findings, two alternative relationship category groupings were developed, and both underwent further cognitive testing. The version implemented in the CPS contains the response categories that were recommended
for use following this cognitive testing (Kreider,
2017). The Census Bureau began phasing in the
new relationship question categories in CPS in
May 2015. In addition to the change to response
categories for the relationship question, the questions that ask about the presence of parents in the
household have also changed. Instead of asking
whether the person has a mother present (who is
always female) and then whether the person has a
father present (who is always male), the questions
now ask whether the person has a parent present,
and then whether they have another parent present. This allows for the identification of children
who live with same-sex parents.

CONTINUOUS TESTING AND
IMPROVEMENTS OF THE CURRENT
POPULATION SURVEY AND ITS
SUPPLEMENTS
Experience gained during the 1994 redesign of the
CPS demonstrates the importance of testing questions and monitoring data quality. The experience,
along with contemporaneous advances in research
on questionnaire design, informs the development
of methods for testing new or improved questions
for the basic CPS and its periodic supplements
(Martin and Polivka, 1995; Oksenberg et al., 1991;
Bischoping et al., 1989; Campanelli et al., 1991;
Esposito et al., 1992; Forsyth and Lessler, 1991).
Methods to continuously test questions and
assess data quality are discussed in Chapter 4-1,
Nonsampling Error. Changes to the CPS are made
carefully and their effects are consistently monitored. A major strength of the CPS is the ability
to compare estimates over time, and even minor
changes to the questionnaire can affect data
comparability. Proposed revisions to the CPS are
evaluated within the context of this comparability. Furthermore, methods to bridge differences
caused by changes and techniques to avoid the
disruption of historical series are included in the
testing of new or revised questions.

REFERENCES
Allard, M. D., “Adding Questions on Certifications
and Licenses to the Current Population Survey,”
Monthly Labor Review, U.S. Bureau of Labor
Statistics, November 2016, retrieved from .
Bischoping, K., C. Cannell, F. Fowler, G. Kalton,
and L. Oksenberg, “New Techniques for Pretesting
Survey Questions,” Survey Research Center,
University of Michigan, 1989, Chapter 2.
Bowler, M., R. Ilg, S. Miller, E. Robison, and
A. Polivka, “Revisions to the Current Population
Survey Effective in January 2003,” 2003, retrieved
from .
Campanelli, P. C., E. A. Martin, and J. M. Rothgeb,
“The Use of Interviewer Debriefing Studies as a
Way to Study Response Error in Survey Data,” The
Statistician, vol. 40, 1991, pp. 253−264.
Cohany, S. R., A. E. Polivka, and J. M. Rothgeb,
“Revisions in the Current Population Survey
Effective January 1994,” Employment and
Earnings, vol. 41, no. 2, 1994, pp. 13−37.
Copeland, K., and J. M. Rothgeb, “Testing
Alternative Questionnaires for the Current
Population Survey,” Proceedings of the Section
on Survey Research Methods, American Statistical
Association, 1990, pp. 63−71.
Dunn, M., S. E. Haugen, and J. Kang, “The Current
Population Survey – Tracking Unemployment
in the United States for over 75 years,” Monthly
Labor Review, January 2018, pp. 1–23.
Edwards, S. W., R. Levine, and S. R. Cohany,
“Procedures for Validating Reports of Hours
Worked and for Classifying Discrepancies Between
Reports and Validation Totals,” Proceedings of the
Section on Survey Research Methods, American
Statistical Association, 1989.
Esposito, J. L., and J. Hess, “The Use of
Interviewer Debriefings to Identify Problematic
Questions on Alternate Questionnaires,” Paper
presented at the Annual Meeting of the American
Association for Public Opinion Research, St.
Petersburg, FL, 1992.
Esposito, J. L., J. M. Rothgeb, A. E. Polivka,
J. Hess, and P. C. Campanelli, “Methodologies for
Evaluating Survey Questions: Some Lessons from
the Redesign of the Current Population Survey,”
Paper presented at the International Conference
on Social Science Methodology, Trento, Italy, 1992.

Esposito, J. L., and J. M. Rothgeb, “Evaluating
Survey Data: Making the Transition From
Pretesting to Quality Assessment,” Survey
Measurement and Process Quality, Wiley, New
York, 1997, pp. 541−571.
Forsyth, B. H., and J. T. Lessler, “Cognitive
Laboratory Methods: A Taxonomy,” Measurement
Errors in Surveys, Wiley, New York, Ch. 20, 1991,
pp. 393−418.
Fracasso, M. P., “Categorization of Responses to
the Open-Ended Labor Force Questions in the
Current Population Survey,” Proceedings of the
Section on Survey Research Methods, American
Statistical Association, 1989, pp. 481−485.
Gaertner, G., D. Cantor, and N. Gay, “Tests of
Alternative Questions for Measuring Industry
and Occupation in the CPS,” Proceedings of the
Section on Survey Research Methods, American
Statistical Association, 1989.
Kreider, R., “Changes to the Household
Relationship Data in the Current Population
Survey,” Paper presented at the Joint Statistical
Meetings, Baltimore, MD, July 31, 2017.
Martin, E. A., “Some Conceptual Problems in the
Current Population Survey,” Proceedings of the
Section on Survey Research Methods, American
Statistical Association, 1987, pp. 420−424.
Martin, E. A., and A. E. Polivka, “Diagnostics for
Redesigning Survey Questionnaires: Measuring
Work in the Current Population Survey,” Public
Opinion Quarterly, Winter 1995, vol. 59, no.4,
pp. 547−567.
McMenamin, T. M., and S. F. Hipple, “The
Development of Questions on Disability for
the Current Population Survey,” Monthly Labor
Review, U.S. Bureau of Labor Statistics, April
2014, retrieved from .
National Commission on Employment and
Unemployment Statistics (NCEUS), Counting the
Labor Force, 1979. (Also known as the Levitan
Commission.)

Oksenberg, L., C. Cannell, and G. Kalton, “New
Strategies for Pretesting Survey Questions,”
Journal of Official Statistics, 1991, vol. 7, no. 3,
pp. 349−365.
Palmisano, M., “Respondents’ Understanding
of Key Labor Force Concepts Used in the CPS,”
Proceedings of the Section on Survey Research
Methods, American Statistical Association, 1989.
Polivka, A. E., and S. M. Miller, “The CPS after the
Redesign: Refocusing the Economic Lens,” Labor
Statistics Measurement Issues, 1998, pp. 249−286.
Polivka, A. E., and J. M. Rothgeb, “Overhauling
the Current Population Survey: Redesigning the
Questionnaire,” Monthly Labor Review, 1993, vol.
116, no. 9, pp. 10−28.

Rothgeb, J. M., “Summary Report of July Followup of the Unemployed,” U.S. Bureau of Labor Statistics, December 20, 1982.

Rothgeb, J. M., A. E. Polivka, K. P. Creighton, and S. R. Cohany, “Development of the Proposed Revised Current Population Survey Questionnaire,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 1992, pp. 56−65.

U.S. Bureau of Labor Statistics, "Handbook of Methods," retrieved from .

U.S. Bureau of Labor Statistics, “Response Errors on Labor Force Questions Based on Consultations with Current Population Survey Interviewers in the United States,” Paper prepared for the Organisation for Economic Co-Operation and Development Working Party on Employment and Unemployment Statistics, 1988.

Chapter 3-2: Conducting the Interviews
INTRODUCTION
Each month during the interview week for CPS,
field representatives (FRs) and computer-assisted
telephone interviewers attempt to contact and
interview a knowledgeable adult household member living in each sample household to complete
an interview. Generally, the week containing
the twelfth is the reference week (i.e., the week
about which the labor force questions are asked).
Interviewing typically begins the week containing the nineteenth of the month. As outlined in
Chapter 2-2, Sample Design, a sample household
is in sample for 8 months total. It is in sample for 4
consecutive months, not in the sample for the next
8 consecutive months, and in the sample again
for the next 4 consecutive months. Each month,
one-eighth of the households are in sample for
the first time (referred to as MIS 1), one-eighth for
the second time (MIS 2), and so on, through the
eighth time (MIS 8). Because of this schedule, each
interviewer conducts different types of interviews
within their weekly assignment (due to differing
MIS cases).
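
Because the rotation pattern is fixed, a household's month-in-sample follows mechanically from the number of months elapsed since its first interview. The sketch below illustrates the 4-8-4 pattern; it is illustrative Python, not Census Bureau production code.

    # Illustrative sketch of the CPS 4-8-4 rotation pattern: in sample
    # for months 0-3 (MIS 1-4), out of sample for months 4-11, and back
    # in for months 12-15 (MIS 5-8). Not actual Census Bureau code.
    def month_in_sample(months_since_first_interview):
        m = months_since_first_interview
        if 0 <= m <= 3:
            return m + 1       # MIS 1-4
        if 12 <= m <= 15:
            return m - 12 + 5  # MIS 5-8
        return None            # household not interviewed this month

    assert month_in_sample(0) == 1
    assert month_in_sample(5) is None   # in the 8-month rest period
    assert month_in_sample(12) == 5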
An introductory letter is sent to each sample
household prior to its first- and fifth-month interviews. The letter describes the CPS, announces
the forthcoming visit, provides potential respondents with information regarding their rights
under the Privacy Act, the voluntary nature of the
survey, and the guarantees of confidentiality for
the information they provide.
Figure 3-2.1 shows the front side of the introductory letter sent to sample units in the area administered by the Atlanta Regional Office. A personal visit interview is required for all MIS 1 households because the CPS sample is strictly a sample of addresses: the Census Bureau has no way of knowing who the occupants of the sample household are or whether the household is occupied or eligible for interview. A personal visit is also attempted for the MIS 5 interview. For some MIS
1 and MIS 5 households, telephone interviews
are conducted if, for example, the respondent
requests a telephone interview during the initial
personal visit.

NONINTERVIEWS AND HOUSEHOLD
ELIGIBILITY
The FR’s first task is to establish the eligibility
of the sample address for the CPS. There are
many reasons an address may not be eligible for
interview. For example, the address may have
been condemned, demolished, or converted to
a permanent business. Regardless of the reason,
such sample addresses are classified as Type C
noninterviews. Type C units have no chance of
becoming eligible for the CPS interview in future
months because the condition is considered
permanent. These addresses are stricken from the
lists of sample addresses and are not visited again
in subsequent months. All households classified
as Type C undergo a supervisory review of the
circumstances surrounding the case before the
determination becomes final.
Type B ineligibility includes units that are
intended for occupancy but are not occupied by
any eligible individuals. Reasons for such ineligibility include a vacant housing unit (either
for sale or rent) or a unit occupied entirely by
individuals who are not eligible for a CPS labor
force interview, such as individuals with a usual
residence elsewhere (URE), or who are in the
armed forces. These Type B noninterview units
have a chance of becoming eligible for interview in future months because the condition is
considered temporary (e.g., a vacant unit could
become occupied). Therefore, Type B units are
reassigned to FRs in subsequent months. These
sample addresses remain in sample for the entire
8 months that households are eligible for interview. An FR visits the unit each succeeding month
to determine whether it has changed status and
either continues the Type B classification, revises
the noninterview classification, or conducts an
interview as applicable. All Type B cases are
subject to supervisory review. Some of these
Type B households are eligible for the Housing
Vacancy Survey, which is described in Chapter
1-3, Supplements.

Figure 3-2.1. Sample Introductory Letter

CPS-263(MIS-1)(L) ATLANTA (3-2018)
UNITED STATES DEPARTMENT OF COMMERCE
Economics and Statistics Administration
U.S. Census Bureau
Washington, DC 20233-0001
OFFICE OF THE DIRECTOR

A Message from the Director of the U.S. Census Bureau:

Dear Resident,
Your address has been selected to participate in the Current Population Survey. This
monthly survey is the source of the Nation’s unemployment rate that you may hear about in
the news. The U.S. Census Bureau conducts this survey in partnership with the U.S. Bureau
of Labor Statistics.
The Current Population Survey collects information on employment and earnings of people
living in the United States. The results help determine federal funding for veterans’
programs, youth activities, food assistance, and more. The survey results are also used by
the Federal Reserve to set interest rates.
The success of this survey depends on your participation. We cannot substitute another
address for yours. Your address is part of a scientifically selected sample of addresses
chosen throughout the country. Your answers represent hundreds of other U.S. households.
Answers to frequently asked questions are on the back of this letter. If you have
other questions, please visit census.gov/cps, or call your Census Bureau Regional
Office at 1-800-424-6974, #53939.
You do not need to take any action at this time. A Census Bureau representative will
contact you soon to ask your household to complete the survey.
Thank you in advance for participating in this important survey.

census.gov


One final set of households not interviewed for
CPS are Type A households. These are households
that the FR determined were eligible for a CPS
interview, but for which no useable data were
collected. To be eligible, the unit has to be occupied by at least one person eligible for an interview (i.e., an individual who is a civilian, at least
15 years of age, and does not have a URE). Even
though these households were eligible, they were
not interviewed because the household members
refused, were absent during the interviewing
period, or were unavailable for other reasons. All
Type A cases are subject to supervisory review
before the determination is made final. Every
effort is made to keep such noninterviews to a
minimum. All Type A cases remain in the sample
and are assigned for interview in all succeeding
months. Even in cases of confirmed refusals (cases
that still refuse to be interviewed despite supervisory attempts to convert the case), the FR must
verify that the same household still resides at that
address before submitting a Type A noninterview.
Table 3-2.1 shows how the three types of noninterviews are classified and various reasons that
define each category. Table 3-2.2 lists the main
household information collected for each noninterview category and briefly summarizes each
item.

INITIAL INTERVIEW
If the unit is not classified as a noninterview,
then the FR initiates the CPS interview. The FR
attempts to interview a knowledgeable adult
household member (known as the household
respondent). The FRs are trained to ask the
questions worded exactly as they appear on the
computer screen. The interview begins with the
verification of the unit’s address and confirmation
of its eligibility for a CPS interview. Part 1 of Table
3-2.3 shows some of the household items asked at
the beginning of the interview. Once eligibility is
established, the interview moves into the demographic portion of the instrument. The primary
purpose of this portion is to establish the household’s roster, which is the listing of all household
residents at the time of the interview. The main
concern here is to establish an individual’s usual
place of residence. (These rules are summarized in
Table 3-2.4.)

Table 3-2.1.

Noninterviews: Types A, B, and C

Type A
1. No one home
2. Temporarily absent
3. Refusal
4. Language barrier
5. Unable to locate
6. Other occupied

Type B
1. Entire household in the armed forces
2. Entire household under 15 years
3. Vacant regular
4. Temporarily occupied by persons with usual residence elsewhere
5. Vacant—storage of household furniture
6. Unfit or to be demolished
7. Under construction, not ready
8. Converted to temporary business or storage
9. Unoccupied tent site or trailer site
10. Permit granted, construction not started
11. Other Type B—Specify

Type C
1. Demolished
2. House or trailer moved
3. Outside segment
4. Converted to permanent business or storage
5. Merged
6. Condemned
7. Unused line of listing sheet
8. Unlocatable sample address
9. Unit does not exist or unit is out-of-scope
10. Other Type C—Specify

For all individuals residing in the household who
do not have a URE, a number of personal and family demographic characteristics are collected. Part
1 of Table 3-2.5 shows the demographic information collected from MIS 1 households. These
characteristics are the relationship to the reference person (the person who owns or rents the
home), parent or spouse or cohabitation partner
pointers (if applicable), age, sex, marital status,
educational attainment, veteran status, current
armed forces status, race, and Hispanic ethnicity.
As shown in Table 3-2.6, these characteristics are
collected in an interactive format that includes
a number of consistency edits embedded in the
interview itself. The goal is to collect as consistent a set of demographic characteristics as possible. The final steps in this portion of the interview are to verify the accuracy of the roster. To this end, a series of questions is asked to ensure that all household members have been accounted for.

Table 3-2.2.

Noninterviews: Main Household Items Asked for Types A, B, and C

Note: This list is not complete and covers only the main data items. It does not include related items used to identify the final response, such as probes and verification screens. See the Current Population Survey Interviewing Manual for illustrations of the actual instrument screens for all Current Population Survey items.

TYPE A CASES
1. TYPA: Which specific kind of Type A is the case?
2. ABMAIL: What is the property’s mailing address?
3. ACCESS: Is access to the household direct or through another unit? This item is answered by the interviewer based on observation.
4. LIVQRT: What is the type of housing unit (house/apartment, mobile home/trailer, etc.)? This item is answered by the interviewer based on observation.
5. INOTES-1: The interviewer may want to make notes about the case that might help with the next interview.

TYPE B CASES
1. TYPB: Which specific kind of Type B is the case?
2. ABMAIL: What is the property’s mailing address?
3. ACCESS: Is access to the household direct or through another unit? This item is answered by the interviewer based on observation.
4. LIVQRT: What is the type of housing unit (house/apartment, mobile home/trailer, etc.)? This item is answered by the interviewer based on observation.
5. SEASON: Is the unit intended for occupancy year-round, by migrant workers, or seasonally?
6. BCINFO: What is the name, title, and phone number of the contact who provided the Type B information, or was the information obtained by interviewer observation?
7. INOTES-1: The interviewer may want to make notes about the case that might help with the next interview.

TYPE C CASES
1. TYPC: Which specific kind of Type C is the case?
2. ACCESS: Is access to the household direct or through another unit? This item is answered by the interviewer based on observation.
3. LIVQRT: What is the type of housing unit (house/apartment, mobile home/trailer, etc.)? This item is answered by the interviewer based on observation.
4. BCINFO: What is the name, title, and phone number of the contact who provided the Type C information, or was the information obtained by interviewer observation?
5. INOTES-1: The interviewer may want to make notes about the case that might help with the next interview.
The FR then collects labor force data about all
civilian adult individuals (at least 15 years of age)
who do not have a URE. In the interest of timeliness and efficiency, a household respondent
(preferably, the most knowledgeable household
member, though any household member aged 15
or over can be a household respondent) generally provides the data for each eligible individual.

Since household respondents provide data about
themselves, just over one-half of the CPS labor
force data is collected by self-response. The
remainder is collected by proxy from the household respondent. In certain limited situations,
collection of the data from a nonhousehold member is allowed; all such cases receive supervisory
review before the data are accepted into the CPS
processing system.
In a household’s initial interview, information about a few additional characteristics is collected after completion of the labor force portion of the interview. This information includes questions on family income, disabilities, country of birth of each household member, country of birth of each member’s father and mother, and, for the foreign-born, year of entry into the United States and citizenship status. See Part 2 of Table 3-2.5 for a list of these items.
Table 3-2.3.

Interviews: Main Household Items Asked in Month-in-Sample 1 and Replacement Households

Note: This list is not complete and covers only the main data items. It does not include related items used to identify the final response, such as probes and verification screens. See the Current Population Survey Interviewing Manual for illustrations of the actual instrument screens for all Current Population Survey items.

PART 1. Items Asked at the Beginning of the Interview
1. INTROb: Does the interviewer want to classify the case as a noninterview?
2. NONTYP: What type of noninterview is the case (A, B, or C)? It is asked or not depending on the answer to INTROb.
3. VERADD: What is the street address (as verification)?
4. MAILAD: What is the mailing address (as verification)?
5. TENUR: Is the unit owned, rented, or occupied without paid rent?
6. ACCESS: Is access to the household direct or through another unit? This item is answered by the interviewer (not read to the respondent).
7. LIVQRT: What is the type of housing unit (house/apartment, mobile home/trailer, etc.)? This item is answered by the interviewer (not read to the respondent).

PART 2. Items Asked at the End of the Interview
1. TELHH: Is there a telephone in the unit?
2. TELAV: Is there a telephone elsewhere on which people in this household can be contacted? This is asked or not depending on the answer to TELHH.
3. TELWHR: If there is a telephone elsewhere, where is the phone located? This is asked or not depending on the answer to TELAV.
4. TELIN: Is a telephone interview acceptable?
5. TELPHN: What is the phone number, and is it a home or office phone?
6. BSTTM: When is the best time to contact the respondent?
7. NOSUN: Is a Sunday interview acceptable?
8. THANKYOU: Is there any reason why the interviewer will not be able to interview the household next month?
9. INOTES-1: The interviewer may want to make any notes about the case that might help with the next interview. The interviewer is also asked to list the names/ages of ALL additional persons if there are more than 16 household members.


Table 3-2.4.

Summary Table for Determining Who Is to Be Included as a Member of the Household

(Each item ends with Yes or No, indicating whether the person is to be included as a member of the household.)

A. PERSONS STAYING IN SAMPLE UNIT AT TIME OF INTERVIEW

Person is member of family, lodger, servant, visitor, etc.
1. Ordinarily stays here all the time (sleeps here) .... Yes
2. Here temporarily—no living quarters held for person elsewhere .... Yes
3. Here temporarily—living quarters held for person elsewhere .... No

Person is in armed forces
1. Stationed in this locality, usually sleeps here .... Yes
2. Temporarily here on leave—stationed elsewhere .... No

Person is a student—here temporarily attending school—living quarters held for person elsewhere
1. Not married or not living with immediate family .... No
2. Married and living with immediate family .... Yes
3. Student nurse living at school .... No

B. ABSENT PERSON WHO USUALLY LIVES HERE IN SAMPLE UNIT

Person is inmate of institutional special place—absent because inmate in a specified institution, regardless of whether or not living quarters held for person here .... No

Person is temporarily absent on vacation, in general hospital, etc. (including veterans’ facilities that are general hospitals)—living quarters held here for person .... Yes

Person is absent in connection with job
1. Living quarters held here for person—temporarily absent while ‘‘on the road’’ in connection with job (e.g., traveling salesperson, railroad conductor, bus driver) .... Yes
2. Living quarters held here and elsewhere for person but comes here infrequently (e.g., construction engineer) .... No
3. Living quarters held here at home for unmarried college student working away from home during summer school vacation .... Yes

Person is in armed forces—was member of this household at time of induction but currently stationed elsewhere .... No

Person is a student in school—away temporarily attending school—living quarters held for person here
1. Not married or not living with immediate family .... Yes
2. Married and living with immediate family .... No
3. Attending school overseas .... No

C. EXCEPTIONS AND DOUBTFUL CASES

Person with two concurrent residences—determine length of time person has maintained two concurrent residences
1. Has slept greater part of that time in another locality .... No
2. Has slept greater part of that time in sample unit .... Yes

Citizen of foreign country temporarily in the United States
1. Living on premises of an Embassy, Ministry, Legation, Chancellery, Consulate .... No
2. Not living on premises of an Embassy, Ministry, etc.
   a. Living here and no usual place of residence elsewhere in the United States .... Yes
   b. Visiting or traveling in the United States .... No


Table 3-2.5.

Interviews: Main Demographic Items Asked¹

Note: This list is not complete and covers only the main data items. It does not include related items used to identify the final response, such as probes and verification screens. See the Current Population Survey Interviewing Manual for illustrations of the actual instrument screens for all Current Population Survey items.

PART 1. Items Asked at the Beginning of the Interview
1. HHRESP: What is the line number of the household respondent?
2. REFPER: What is the name of the reference person (i.e., the person who owns/rents the home, whose name should appear on line number 1 of the household roster)?
3. NEXTNM: What is the name of the next person in the household (line numbers 2 through a maximum of 16)?
4. URE: Is the sample unit the person’s usual place of residence?
5. HHMEM: Does the person have his/her usual place of residence elsewhere? (This is asked only when the sample unit is not the person’s usual place of residence.)
6. SEX: What is the person’s gender? This item is answered by the interviewer (not read to the respondent).
7. MCHILD: Is the household roster (displayed on the screen) missing any babies or small children?
8. MLODGE: Is the household roster (displayed on the screen) missing any lodgers, boarders, or live-in employees?
9. MELSE: Is the household roster (displayed on the screen) missing anyone else staying there?
10. RRP: How is the person related to the reference person? The interviewer shows a flashcard from which the respondent chooses the appropriate relationship category.
11. SUBFAM: Is the person related to anyone else in the household? This is asked only when the person is not related to the reference person.
12. SBFAM_WHO: Who on the household roster (displayed on the screen) is the person related to? This is asked/not asked depending on the answer to VR-NONREL.
13. PAR1/PAR2: What is the parent’s line number?
14. BIRTHM: What is the month of birth?
15. BIRTHD: What is the day of birth?
16. BIRTHY: What is the year of birth?
17. AGE: How many years of age is the person (as verification)?
18. MARITL: What is the person’s marital status? This is asked only of people 15+ years of age.
19. SPOUSE: What is the spouse’s line number? This is asked only of people 15+ years of age.
20. COHAB: Does the person have a boyfriend, girlfriend, or partner in the household?
21. AFEVER: Has the person ever served on active duty in the U.S. armed forces? This is asked only of people 17+ years of age.
22. AFWHEN: When did the person serve? This is asked only of people 17+ years of age who have served in the U.S. armed forces.
23. AFNOW: Is the person in the U.S. armed forces now? This is asked only of people 17+ years of age who have served in the U.S. armed forces. Interviewers will continue to ask this item each month as long as the answer is ‘‘yes.’’
24. EDUCA: What is the highest level of school completed or highest degree received? This is asked only of people 15+ years of age. It is asked for the first time in Month-in-Sample 1 and then verified in Month-in-Sample 5 and in specific months (i.e., February, July, and October).
25. CERT1: Does the person have a currently active professional certification or a state or industry license? Do not include business licenses such as liquor or vending.
26. CERT2: Were any of the certifications/licenses issued by the federal, state, or local government?

See note at end of table.


Table 3-2.5.

Interviews: Main Demographic Items Asked¹—Con.

PART 2. Items Asked at the End of the Labor Force Interview
27. DS1: Does anyone in this household have serious difficulty hearing? If the response is yes, then the instrument asks “Who is that?” and “Anyone else?”
28. DS2: Does anyone in this household have serious difficulty seeing, even when wearing glasses? If the response is yes, then the instrument asks “Who is that?” and “Anyone else?”
29. DS3: Because of a physical, mental, or emotional condition, does anyone in this household have serious difficulty concentrating, remembering, or making decisions? If the response is yes, then the instrument asks “Who is that?” and “Anyone else?”
30. DS4: Does anyone in this household have serious difficulty walking or climbing stairs? If the response is yes, then the instrument asks “Who is that?” and “Anyone else?”
31. DS5: Does anyone in this household have serious difficulty dressing or bathing? If the response is yes, then the instrument asks “Who is that?” and “Anyone else?”
32. DS6: Because of a physical, mental, or emotional condition, does anyone in this household have difficulty doing errands alone such as visiting a doctor’s office or shopping? If the response is yes, then the instrument asks “Who is that?” and “Anyone else?”
33. NTVT: What is the person’s country of birth?
34. MNTVT: What is his/her mother’s country of birth?
35. FNTVT: What is his/her father’s country of birth?
36. CITIZN: Is the person a U.S. citizen? This is asked only when neither the person nor both of his/her parents were born in the United States or a U.S. territory.
37. CITYPA: Was the person born a U.S. citizen? This is asked when the answer to CITZN-scrn is yes.
38. CITYPB: Did the person become a U.S. citizen through naturalization? This is asked when the answer to CITYA-scrn is no.
39. INUSYR: When did the person come to live in the United States? This is asked of U.S. citizens born outside of the 50 states (e.g., Puerto Ricans, U.S. Virgin Islanders, etc.) and of non-U.S. citizens.
40. FAMINC: What is the household’s total combined income during the past 12 months? The interviewer shows the respondent a flashcard from which he/she chooses the appropriate income category.

¹ There are two occasions when demographic items are asked: (1) all month-in-sample 1 households and (2) any month-in-sample 2 through 8 household where additional members are added.


Table 3-2.6.

Demographic Edits in the Current Population Survey Instrument
Note: The following list of edits is not complete; only the major edits are described. The demographic edits in the
Current Population Survey instrument take place while the interviewer is creating or updating the roster. After the
roster is in place, the interviewer may still make changes to the roster, such as adding or deleting persons or changing variables.

EDUCATION EDITS
1.	The instrument will force interviewers to probe if the education level is inconsistent with the person's age; interviewers will probe for the correct response if the education entry fails any of the following range checks:
	•	If 19 years of age, then the person should have an education level below that of a master's degree (EDUCA-scrn < 44).
	•	If 16–18 years of age, then the person should have an education level below that of a bachelor's degree (EDUCA-scrn < 43).
	•	If 15 years of age or younger, the person should have an education level below college (EDUCA-scrn < 40).
2.	The instrument will force the interviewer to probe before it allows lowering an education level reported in a previous month-in-sample.

VETERANS EDITS
The instrument will display only the answer categories that apply (i.e., periods of service in the armed forces), based on the person's age. For example, the instrument will not display certain answer categories for a 40-year-old veteran (e.g., World War I, World War II, Korean War), but it will display them for a 99-year-old veteran.

NATIVITY EDITS
The instrument will force the interviewer to probe if the person's year of entry into the United States is earlier than his/her year of birth.

SPOUSE LINE NUMBER EDITS
1.	If the household roster does not include a spouse for the reference person, the reference person's SPOUSE line number will be set to zero. The instrument will also omit the first answer category (i.e., married, spouse present) when it asks for the spouse; likewise for the reference person.
2.	The instrument will not ask the SPOUSE line number for both spouses in a married couple. Once it obtains the first SPOUSE line number on the roster, it will set the second spouse's marital status equal to that of his/her spouse.
3.	For each household member with a spouse, the instrument will ensure that his/her SPOUSE line number is not equal to his/her own line number, nor to his/her own PARENT line number (if any). In both cases, the instrument will not allow the wrong entry and will display a message telling the interviewer to "TRY AGAIN."

PARENT LINE NUMBER EDITS
1.	The instrument will never ask for the reference person's PARENT line number. It will set the reference person's PARENT line number equal to the line number of whoever was reported on the roster as the reference person's parent, or equal to zero if no one on the roster fits that criterion.
2.	Likewise, for each individual reported as the reference person's child (S_RRP=22), the instrument will set his/her PARENT line number equal to the reference person's line number, without asking for each individual's PARENT line number.
3.	The instrument will not allow more than two parents for the reference person.
4.	If an individual is reported to be a brother or sister of the reference person, and the reference person's PARENT is recorded, the instrument will (a) first verify that the line number to which both siblings are pointing is indeed the reference person's PARENT line number and, if so, (b) set the individual's PARENT line number equal to the reference person's PARENT line number.
5.	For each household member, the instrument will ensure that the PARENT line number is not equal to his/her own line number. In such a case, the instrument will not allow the interviewer to make the wrong entry and will display a message telling the interviewer to "TRY AGAIN."
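The age/education range checks above lend themselves to a compact restatement. The sketch below is illustrative only: it uses the EDUCA-scrn thresholds quoted in the table, but the function name and surrounding logic are ours, not the instrument's.

    def education_consistent(age: int, educa_scrn: int) -> bool:
        # EDUCA-scrn thresholds per Table 3-2.6: entries below 40 are below
        # college, 43 is a bachelor's degree, and 44+ is a master's or higher.
        if age <= 15:
            return educa_scrn < 40   # below college
        if 16 <= age <= 18:
            return educa_scrn < 43   # below a bachelor's degree
        if age == 19:
            return educa_scrn < 44   # below a master's degree
        return True                  # no age-based range check applies

    # A failing check forces the interviewer to probe for the correct response:
    if not education_consistent(age=17, educa_scrn=43):
        print("Education level inconsistent with age; probe for the correct entry.")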


The first month's interview also includes questions on family income, disabilities, country of
birth of each household member, country of birth
of each member’s father and mother, and for the
foreign-born, year of entry into the United States
and citizenship status. See Part 2 of Table 3-2.5 for
a list of these items.

SUBSEQUENT MONTHS’ INTERVIEWS
For households in sample for the second, third,
and fourth months, the FR has the option to
conduct the interview over the telephone. The
respondent must approve the use of this interviewing mode. Such approval is obtained at the
end of the first month’s interview upon completion
of the labor force and any supplemental questions. Telephone interviewing is the preferred
method for collecting the data; it is much more
time and cost efficient. Approximately 85 percent
of interviews in these three MIS are obtained by
telephone. See Part 2 of Table 3-2.3 for the questions asked to obtain consent for the telephone
interview. FRs must attempt to conduct a personal visit interview for the fifth-month interview.
After one attempt, a telephone interview may
be conducted provided the original household
still occupies the sample unit. This fifth-month
interview follows a sample unit’s 8-month dormant period and is used to reestablish rapport
with the household. Fifth-month households are
more likely than any other MIS households to be a
replacement household—i.e., a household in which
all the previous month’s residents have moved out
and been replaced by an entirely different group
of residents. This can and does occur in any MIS
except for MIS 1. Households in MIS 6, 7, and 8
are also eligible for telephone interviewing. Once
again, approximately 85 percent of the interviews
of these cases are obtained by telephone.
The first thing the FR does in subsequent interviews is update the roster. The instrument presents a screen (or a series of screens for MIS 5
interviews) that prompts the FR to verify the accuracy of the roster. Since households in MIS 5 are
returning to sample after an 8-month hiatus, additional probing questions are asked to establish the
household’s current roster and to update some
characteristics. See Table 3-2.7 for a list of the
major items asked in MIS 5 interviews. If there are
any changes, the interviewer goes through the
necessary steps to add or delete individuals and
update relationship items (e.g., relationship to reference person, marital status, and parent, spouse,
or unmarried partner pointers) that may be
subject to change. After making the appropriate
corrections, the instrument takes the interviewer
to any items, such as educational attainment, that
require periodic updating.
The labor force interview in MIS 2, 3, 5, 6, and 7
collects the same information as the MIS 1 interview. MIS 4 and 8 interviews are different in several respects. These interviews include additional
questions for employed wage and salary workers
about their usual weekly earnings at their sole or
main job, as well as union membership for that
job. In addition, for all individuals who are multiple
jobholders, information is collected on the I&O
of their second job. For individuals who are not
in the labor force, additional information on their
previous labor force attachment is obtained.
After an initial interview, dependent interviewing is
used in the collection of several items. Information
collected in the previous month’s interview is
carried forward into the current interview to ease
response burden and improve the quality of the
labor force data. This is most noticeable in the
collection of I&O of the main job; job description
data from the previous month are imported into
the current month's interview, and interviewers
verify whether an individual still has the same
job. Other information collected using dependent interviewing includes items on duration of
unemployment and data on the not in labor force
subgroups of the retired and people who are not
working due to a disability. Dependent interviewing is not used in the MIS 5 interviews or for any of
the data collected solely in MIS 4 and 8 interviews.
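Dependent interviewing can be pictured as a carry-forward-and-verify step. The sketch below is schematic only; the record fields and prompts are hypothetical and do not reproduce the CPS instrument's actual screens.

    def collect_main_job(previous: dict | None) -> dict:
        # Schematic sketch of dependent I&O collection; names are invented.
        if previous and previous.get("employer"):
            # Import last month's job description and simply verify it.
            same = input(f"Do you still work at {previous['employer']}? (y/n) ")
            if same.strip().lower().startswith("y"):
                # Same job: the description and its I&O codes carry forward,
                # so the case needs no new coding at NPC.
                return dict(previous)
        # MIS 1, a new household member, or a changed job: collect the
        # description from scratch and leave the codes for NPC to assign.
        return {
            "employer": input("For whom did you work? "),
            "industry_code": None,
            "occupation_code": None,
        }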
The CPS uses centralized contact centers in
Jeffersonville, Indiana, and Tucson, Arizona, for
CATI. These have been used since 1983.


Table 3-2.7.

Interviews: Main Household and Demographic Items Asked in Month-in-Sample 5
Note: This list is not complete and covers only the main data items. It does not include related items used to identify
the final response such as probes and verification screens. See the Current Population Survey Interviewing Manual
for illustrations of the actual instrument screens for all Current Population Survey items.
HOUSEHOLD ITEMS
	Item name	Item generally asks
1	HHNUM	Is the household a replacement household?
2	VERADD	What is the street address (as verification)?
3	CHNGPH	Does the current phone number need updating?
4	MAILAD	What is the mailing address (as verification)?
5	TENUR	Is the unit owned, rented, or occupied without paid rent?
6	TELHH	Is there a telephone in the unit?
7	TELIN	Is a telephone interview acceptable?
8	TELPHN	What is the phone number and is it a home or office phone?
9	BESTTI	When is the best time to contact the respondent?
10	NOSUN	Is a Sunday interview acceptable?
11	THANKYOU	Is there any reason why the interviewer will not be able to interview the household next month?
12	INOTES-1	The interviewer may want to make any notes about the case that might help with the next interview. The interviewer is also asked to list the names/ages of ALL additional persons if there are more than 16 household members.

DEMOGRAPHIC ITEMS
	Item name	Item generally asks
13	RESP1	Is the respondent different from the previous interview?
14	STLLIV	Are all persons listed still living in the unit?
15	NEWLIV	Is anyone else staying in the unit now?
16	MCHILD	Is the roster (displayed on the screen) missing any babies or small children?
17	MAWAY	Is the roster (displayed on the screen) missing usual residents temporarily away from the unit (e.g., traveling, at school, in the hospital)?
18	MLODGE	Is the roster (displayed on the screen) missing any lodgers, boarders, or live-in employees?
19	MELSE	Is the roster (displayed on the screen) missing anyone else staying in the unit?
20	EDUCA	What is the highest level of school completed or highest degree received? This is asked for the first time in Month-in-Sample 1, and then verified in Month-in-Sample 5 and in specific months (i.e., February, July, and October).
21	CHANGE	Since last month, has there been any change in the roster (displayed with full demographics), particularly in the marital status?


Table 3-2.8.

Interviewing Results for October 2017
Description	October 2017

TOTAL HOUSEHOLDS	72,151
 Eligible households	61,040
  Interviewed households	52,638
   Response rate (unweighted)	86.24
   Response rate (weighted)	86.41
 Noninterviews	19,513
  Rate	27.04
  Type A	8,402
   Rate (unweighted)	13.76
   Rate (weighted)	13.59
    No one home	1,095
    Temporarily absent	196
    Refused	6,547
    Language barrier	36
    Unable to locate	25
    Other—specify	503
    Callback needed—no progress	0
  Type B	10,634
   Rate (unweighted)	14.84
   Rate (weighted)	13.88
    Entire household armed forces	158
    Entire household under the age of 15	4
    Temporarily occupied with people with URE	974
    Vacant regular	7,892
    Vacant household furniture storage	346
    Unfit, to be demolished	425
    Under construction, not ready	336
    Converted to temporary business or storage	100
    Unoccupied tent or trailer site	234
    Permit granted, construction not started	73
    Other Type B	92
  Type C	477
   Rate (unweighted)	0.66
   Rate (weighted)	0.91
    Demolished	93
    House or trailer moved	52
    Outside segment	0
    Converted to permanent business or storage	48
    Merged	11
    Condemned	10
    Unused serial number/listing sheet line	2
    Unlocatable sample address	0
    Unit does not exist or unit is out-of-scope	227
    Other Type C	34
Source: Derived from the Current Population Survey Summary Report, October 2017.


Table 3-2.9.

Telephone and Personal Field Interviewing for October 2017
Months-in-sample (MIS)      Total             Field telephone    Field personal visit   CATI center
                            Number   Percent  Number   Percent   Number   Percent       Number   Percent
MIS 1                        6,346             1,210      4.7     5,136     22.8            Z        Z
MIS 2                        6,469             3,425     13.2     2,353     10.4          691     14.8
MIS 3                        6,563             3,624     14.0     2,243      9.9          696     14.9
MIS 4                        6,413             3,421     13.2     2,150      9.5          842     18.1
MIS 5                        6,558             2,431      9.4     4,127     18.3            Z        Z
MIS 6                        6,624             3,805     14.7     2,201      9.8          618     13.3
MIS 7                        6,740             3,776     14.6     2,148      9.5          816     17.5
MIS 8                        6,925             3,743     14.4     2,188      9.7          994     21.3
Total (MIS 1–8)             52,638            25,435    100.0    22,546    100.0        4,657    100.0
MIS 1 and MIS 5             12,904    24.5     3,641     28.2     9,263     71.8            Z        Z
MIS 2–4 and MIS 6–8         39,734    75.5    21,794     54.9    13,285     33.4        4,657     11.7

Z Represents or rounds to zero.
Note: This table for October 2017 is consistent with usual monthly results.
Source: Derived from the Current Population Survey Summary Report, October 2017.

The first use of CATI in production was the Tri-Cities Test,
which began in April 1987. Each month about
10,000 cases are sent to these telephone centers,
but no MIS 1 or MIS 5 cases are sent. Most of
the cases sent to CATI are in metropolitan areas,
which eases recruiting and hiring efforts in those
areas. Selection is made by RO supervisors based
on the FR’s analysis of a household’s probable
acceptance of a telephone center interview and
the need to balance workloads and meet specific
goals on the number of cases sent to the centers.
About 65 percent are interviewed and the remainder is recycled to FRs. The net result is that about
8 percent of all CPS interviews are completed at
a CATI center. The centers generally cease labor
force collection on the Wednesday of interview
week (which is generally the week containing the
nineteenth of the month), allowing field staff 3 or
4 days for follow-up activities. Generally, about 80
percent of the recycled cases are interviewed.


Tables 3-2.8 and 3-2.9 show the results of a typical
month’s CPS interviewing, October 2017. Table
3-2.8 lists the outcomes of all the households in
the CPS sample. For monthly interviewing, Type
A rates average around 15 percent with an overall
noninterview rate in the range of 28 to 30 percent. In October 2017, the Type A rate was 13.76
percent. The overall noninterview rate for October
2017 was 27.04 percent.
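The rates quoted in this paragraph, and the CATI share quoted in the preceding one, follow directly from the counts in Tables 3-2.8 and 3-2.9; the short calculation below simply reproduces them.

    # Reproduce the October 2017 rates from Tables 3-2.8 and 3-2.9.
    total_households = 72_151
    eligible = 61_040                  # total minus Type B and Type C units
    interviewed = 52_638
    type_a, type_b, type_c = 8_402, 10_634, 477

    response_rate = 100 * interviewed / eligible        # 86.24 (unweighted)
    type_a_rate = 100 * type_a / eligible               # 13.76 (unweighted)
    noninterview_rate = 100 * (type_a + type_b + type_c) / total_households  # 27.04

    cati_completes = 4_657             # CATI center column, Table 3-2.9
    cati_share = 100 * cati_completes / interviewed     # 8.8, i.e., "about 8 percent"
    print(f"{response_rate:.2f} {type_a_rate:.2f} {noninterview_rate:.2f} {cati_share:.1f}")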

REFERENCES
U.S. Census Bureau, Current Population Survey Interviewing Manual, 2015.


Chapter 3-3: Transmitting the Interview Results
INTRODUCTION
FRs send completed CPS work through electronic
data transmissions. Secure lines are used for all
data transmissions between the Census Bureau
headquarters, the Jeffersonville, Indiana, National
Processing Center (NPC), two CATI contact centers, ROs, field supervisors, and FRs. This chapter
provides a summary of these procedures and how
the data are prepared for production processing.

THE SYSTEM
The system for the transmission of data is centralized at the Census Bureau headquarters. All
data transfers must pass through headquarters
even if headquarters is not the final destination
of the information. The system was designed
this way for ease of management and to ensure
uniformity of procedures within a given survey as
well as between different surveys. The transmission system was designed to satisfy the following
requirements:
•	Provide minimal user intervention.
•	Upload or download in one transmission.
•	Transmit interview responses in one transmission.
•	Transmit software upgrades with data.
•	Maintain integrity of the software and the assignment.
•	Prevent unauthorized access.

The central database system at headquarters
cannot initiate transmissions. Either the FRs or
the ROs must initiate any transmissions. Devices
in the field are not continuously connected to the
headquarters’ computers; rather, the field devices
contact the headquarters’ computers to exchange
data using a secure Census Virtual Private
Network (VPN). The central database system servers store messages and case information required
by the FRs or the ROs. When an FR connects, the
transfer of data from the FR’s device to headquarters’ database is completed first and then any
outgoing data are transferred to the FR’s device.
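The exchange sequence just described (upload completed cases first, then receive any outgoing data) can be sketched in a few lines. This is purely illustrative: the class and method names are invented for the sketch and do not correspond to the Census Bureau's actual transmission software.

    class CentralDatabase:
        """Toy stand-in for the headquarters store-and-forward database."""
        def __init__(self):
            self.received = []     # completed cases uploaded by FRs
            self.outgoing = {}     # per-FR queues of cases, messages, updates

        def exchange(self, fr_id, completed_cases):
            # The FR's device initiates the connection; headquarters never does.
            self.received.extend(completed_cases)   # upload happens first ...
            return self.outgoing.pop(fr_id, [])     # ... then the download

    hq = CentralDatabase()
    hq.outgoing["FR-017"] = ["reassigned case", "software update"]
    inbox = hq.exchange("FR-017", completed_cases=["case 7", "case 12"])
    print(inbox)   # the device stores whatever headquarters sent back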
A major concern is the need for security. Both the
Census Bureau and the BLS are required to honor
the pledge of confidentiality given to CPS respondents. The system is designed to safeguard this
pledge. All transmissions between the headquarters’ central database and the FRs’ devices are
encrypted and occur over the secure Census VPN.
All transmissions between headquarters, the NPC,
the CATI centers, and the ROs are over secure
telecommunications lines that are leased by the
Census Bureau and are accessible only by Census
Bureau employees.

TRANSMISSION OF INTERVIEW DATA
Each FR is expected to make a telecommunications transmission at the end of every workday
during the interview period. Additional transmissions may be made at any time as needed.
Each transmission is a batch process in which all
relevant files are automatically transmitted and
received. The device connects to the headquarters’ central database and transmits the completed cases. At the same time, the central database returns any other data to complete the FR’s
assignment and any software or security updates.
The RO or field supervisory staff performs a data
transmission when releasing or reassigning work
and after completing check-in and supervisory
review activities.

CATI Contact Center Transmissions
All cases are initially sent to the ROs and assigned
to FRs. A subset of the cases is selected for
CATI. These cases are then sent back to the
Master Control System (MCS) and subsequently
to a WebCATI System. The WebCATI system has a
central database where cases are shared between
the two CATI contact centers. Data are moved
between MCS and WebCATI using secure database links. Data are moved between WebCATI and
the two CATI contact centers using encryption
and secure transmission lines.
All cases sent to CATI are ultimately sent back to
the MCS. Both response data and operational data
are provided.
•	Cases with an interview disposition of complete, sufficient partial, out-of-scope, or "Congressional refusal" are sent to the MCS and headquarters.
•	All other cases are recycled to the field. These include noncontacts, refusals, and unavailable cases.

Each recycled case is transmitted directly to the
assigned FR. Case notes that include the reason
for the recycle are also transmitted to the FR to
assist in follow-up. These transmissions are similar
to the transmission of the initial assignment except
for the timing (i.e., they occur after the start of
data collection). The ROs monitor the progress of
the CATI recycled cases.

Transmission of Interviewed Data From the
Centralized Database
Each day during the production cycle, field staff
send files to headquarters with the results of the
previous day’s interviewing. (See Figure 3-3.1 for
an overview of the daily processing cycle.) The
headquarters' production processing system usually receives three files per day: one combined file
from the CATI contact centers and two files from
the field. At this time, cases requiring I&O coding
are identified and a file of these cases is created.
This file is then used by NPC coders to assign
the appropriate I&O codes. The I&O data are not
transmitted to NPC; rather, the NPC coding staff
remotely accesses the data on headquarters’
servers. Chapter 3-4, Data Preparation, provides
an overview of the I&O coding and processing
system.
The CATI contact centers finish sending files to
headquarters by the middle of the week including
the nineteenth of the month (the week that interviewing generally begins). All data are received by
headquarters usually on Tuesday or Wednesday of
the week after interviewing begins.


Figure 3-3.1.

Overview of Current Population Survey Monthly Operations for a Typical Month
[Flowchart; the sequence it depicts is summarized below.]

Week 1 (week containing the 5th): FRs pick up the computerized questionnaire via transmission.

Week 2 (week containing the 12th): FRs pick up assignments via transmission; FRs and computer-assisted telephone interviewing (CATI) interviewers complete practice interviews and home studies.

Week 3 (week containing the 19th):
•	CATI interviewing: CATI contact centers transmit completed work to headquarters nightly; CATI supervisors monitor interviewing progress.
•	Computer-assisted personal interviewing: FRs transmit completed work to headquarters nightly; regional offices (ROs) monitor interviewing assignments and progress.
•	Processing begins: files received overnight are checked in by ROs and headquarters daily; headquarters performs initial processing and sends eligible cases to the National Processing Center (NPC) for industry and occupation (I&O) coding.

Week 4 (week containing the 26th): All assignments are completed except for telephone holds; all interviews are completed; ROs complete final field closeout; NPC sends back completed I&O cases; headquarters performs final computer processing (edits, recodes, weighting); the Census Bureau delivers data to the Bureau of Labor Statistics (BLS).

Week 5: BLS analyzes the data; results are released to the public at 8:30 a.m. by BLS.

Chapter 3-4: Data Preparation
INTRODUCTION
For the CPS, postdata collection activities transform raw data into a microdata file that can be
used to produce estimates. Several processes
are needed for this transformation. The raw data
files must be read and processed. Textual I&O
responses must be coded. Even though some
editing takes place in the instrument at the time
of the interview (see Chapter 3-2, Conducting
the Interviews), further editing is required once
all data are received. Editing and imputations,
explained below, are performed to improve the
consistency and completeness of the microdata.
New data items are created based upon responses
to multiple questions. These activities prepare
the data for weighting and estimation procedures, described in Chapter 2-3, Weighting and
Estimation.

DAILY PROCESSING
For a typical month, CATI starts on Sunday of the
week containing the nineteenth of the month and
continues through Wednesday of the same week.
The answer files from these interviews are sent
to headquarters on a daily basis from Monday
through Thursday of this interview week. CAPI
also begins on the same Sunday and continues
through Tuesday of the following week. The CAPI
answer files are again sent to headquarters daily
until all the interviewers and ROs have transmitted
the workload for the month. This phase is generally completed by Wednesday of the week containing the 26th of the month.
Various computer checks are performed to ensure
the data can be accepted into the CPS processing system. These checks include, but are not
limited to, ensuring the successful transmission
and receipt of the files, confirming the item range
checks, and rejecting invalid cases. Files containing records needing four-digit I&O codes are electronically sent to the NPC for assignment of these
codes. Once the NPC staff has completed the I&O
coding, the files are electronically transferred back
to headquarters where the codes are placed on
the CPS production file. When all of the expected
data for the month are accounted for and all of
NPC's I&O coding files have been returned and
placed on the appropriate records on the data file,
editing and imputation are performed.

INDUSTRY AND OCCUPATION CODING
The I&O code assignment operation requires 10 coders working for 1 week to assign approximately 30,000 codes. The volume of codes has decreased
significantly with the introduction of dependent
interviewing for I&O codes (see Chapter 3-1,
Instrument Design). Both new monthly CPS cases
and people whose I&O have changed since the
previous interview are sent to NPC for coding. For
those whose I&O information has not changed,
the four-digit codes are brought forward from the
previous month of interviewing and require no
further coding.
A computer-assisted I&O coding system is used
by the NPC I&O coders. Files of all eligible I&O
cases are sent to this system each day. Each coder
works at a computer where the screen displays
the I&O descriptions that were captured by the
FRs at the time of the interview. The coder then
enters a four-digit numeric code for industry
based on the Census Industry Classification. The
coder also assigns a four-digit numeric code for
occupation based on the Census Occupational
Classification.
A substantial effort is directed at supervision and
control of the quality of this operation. The supervisor is able to turn the dependent verification
setting ‘‘on’’ or ‘‘off’’ at any time during the coding
operation. The ‘‘on’’ mode means that a particular
coder’s work is verified by a second coder. In addition, a 10 percent sample of each month’s cases is
selected to go through a quality assurance system
to evaluate the work of each coder. The selected
cases are verified by another coder after the current monthly processing has been completed.
After this operation, the batch of records is electronically returned to headquarters for the next
stage of monthly production processing.
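The 10 percent quality assurance sample and its agreement check lend themselves to a short sketch. This is illustrative only: the record layout and function names are hypothetical, not the NPC production system's.

    import random

    def select_qa_sample(month_cases: list, rate: float = 0.10) -> list:
        # A 10 percent sample of the month's cases is set aside for
        # verification by a second coder after monthly processing.
        k = round(rate * len(month_cases))
        return random.sample(month_cases, k)

    def passes_verification(case: dict, second_pass: tuple) -> bool:
        # Agreement means both four-digit codes match the second coder's.
        return (case["industry_code"], case["occupation_code"]) == second_pass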

EDITS AND IMPUTATIONS
The CPS is subject to two sources of nonresponse.
The most frequent source is nonresponding
households (unit nonresponse). To compensate
for unit nonresponse, the weights of nonresponse
households are distributed among interviewed
households as explained in Chapter 2-3, Weighting
and Estimation. The second source of data loss
is from item nonresponse, which occurs when a
respondent either does not know the answer to
a question or refuses to provide the answer. Item
nonresponse in the CPS is modest (see Chapter
4-1, Nonsampling Error, Table 4-1.1).
Before the edits are applied, the daily data files
are merged and the combined file is sorted by
state and primary sampling unit (PSU). This is
important because many labor force and I&O
characteristics are geographically clustered.
The edits effectively blank all entries in inappropriate questions (e.g., followed incorrect path
of questions) and ensure that all appropriate
questions have valid entries. For the most part,
illogical entries or out-of-range entries have been
eliminated with the use of electronic instruments;
however, the edits still address these possibilities
that may arise from data transmission problems
and occasional instrument malfunctions. The main
purpose of the edits, however, is to assign values to questions where the response was ‘‘Don’t
know’’ or ‘‘Refused.’’ This is accomplished by using
one of the three imputation techniques described
below.
The edits are run in a deliberate and logical
sequence. Household and demographic variables
are edited first because several of those variables
are used to allocate missing values in later edits.
The labor force variables are edited next since
labor force status and related items are used to
impute missing values for I&O codes and so forth.
The three imputation methods used by the CPS
edits are described below:
1.	 Relational imputation infers the missing value
from other characteristics on the person’s
record or within the household. For instance,
if race is missing, it is assigned based on the
race of another household member, or failing
that, taken from the previous record on the
file. Similarly, if relationship data are missing,
it is assigned by looking at the age and sex
of the person in conjunction with the known
relationships of other household members.
Missing occupation codes are sometimes
assigned by analyzing the industry codes
and vice versa. This technique is used as
appropriate across all edits. If missing values
cannot be assigned using this technique, they
are assigned using one of the two following
methods.
2.	 Longitudinal edits are used in most of the
labor force edits as appropriate. If a question
is blank and the individual is in the second
or later month’s interview, the edit procedure looks at last month’s data to determine
whether there was an entry for that item. If
so, last month’s entry is assigned. The exception to filling missing values from the previous
month longitudinally is when handling blank
labor force items for MIS 5 interviews. In
this specific instance, the longitudinal link is
broken for MIS 5 interviews. If the item cannot
be assigned using this technique, the item is
assigned a value using the appropriate hot
deck as described next.
3.	 The ‘‘hot deck’’ imputation method assigns a
missing value from a record with similar characteristics. Hot decks are defined by variables
such as age, race, and sex. Other characteristics used in hot decks vary depending on
the nature of the unanswered question. For
instance, most labor force questions use age,
race, sex, and occasionally another correlated
labor force item such as full- or part-time
status. This means the number of cells in labor force hot decks is relatively small, perhaps fewer than 100. On the other hand, the weekly
earnings hot deck is defined by age, race,
sex, usual hours, occupation, and educational
attainment. This hot deck has several thousand cells.
All CPS items that require imputation for missing values have an associated hot deck. The
initial values for the hot decks are the ending
values from the preceding month. As a record
passes through the editing procedures, it will
either donate a value to each hot deck in its
path or receive a value from the hot deck. For
instance, in a hypothetical case the hot deck
for question X is defined by the characteristics Black, non-Black, male, female, and age
16−25/25 and older. Further assume a record
has the value of White, male, and age 64.
When this record reaches question X, the edits
determine whether it has a valid entry. If so,
that record’s value for question X replaces the
value in the hot deck reserved for non-Black,
male, and age 25 and older. Comparably, if the
record was missing a value for question X, it
would be assigned the value in the hot deck
designated for non-Black, male, and age 25
and older.
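The donate-or-receive mechanics of the hot deck can be made concrete using the hypothetical question X above. A minimal sketch (the data structures are ours, not the production system's):

    # Hot deck cells for the hypothetical question X: race (Black/non-Black)
    # by sex by age group (the text's 16-25 / 25-and-older split; the exact
    # boundary handling here is our assumption).
    def cell(person):
        race = "Black" if person["race"] == "Black" else "non-Black"
        age_group = "16-25" if person["age"] <= 25 else "25+"
        return (race, person["sex"], age_group)

    # Each cell starts with the ending value from the preceding month.
    hot_deck = {("non-Black", "male", "25+"): "carried-over value"}

    def edit_question_x(person, hot_deck):
        key = cell(person)
        if person.get("question_x") is not None:
            hot_deck[key] = person["question_x"]    # valid entry: donate
        else:
            person["question_x"] = hot_deck[key]    # missing entry: receive
        return person

    # The White, male, 64-year-old record from the example falls in the
    # (non-Black, male, 25+) cell and, lacking a value, receives from it.
    record = {"race": "White", "sex": "male", "age": 64, "question_x": None}
    print(edit_question_x(record, hot_deck)["question_x"])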
As stated above, the various edits are logically
sequenced, in accordance with the needs of subsequent edits. The edits and recodes, in order of
sequence, are:

1.	Household edits and recodes. This processing step performs edits and creates recodes for items pertaining to the household. It classifies households as interviews or nonresponse and edits items appropriately. Hot deck imputations defined by geography and other related variables are used in this edit.

2.	Demographic edits and recodes. This processing step ensures consistency among all demographic variables for all individuals within a household. It ensures all interviewed households have one and only one reference person and that entries stating marital status, spouse, and parents are all consistent. It also creates families based upon these characteristics. It uses relational imputation, longitudinal editing, and hot deck imputation defined by related demographic characteristics. Demographic-related recodes are created for both individual and family characteristics.

3.	Labor force edits and recodes. This processing step first establishes an edited Major Labor Force Recode (MLR), which classifies adults as employed, unemployed, or not in the labor force. Based upon MLR, the labor force items related to each series of classification are edited. This edit uses longitudinal editing and hot deck imputation matrices. The hot decks are defined by age, race, and/or sex and, possibly, by a related labor force characteristic.

4.	I&O edits and recodes. This processing step assigns four-digit I&O codes to those I&O-eligible individuals for whom the I&O coders were unable to assign a code. It also ensures consistency, wherever feasible, between industry, occupation, and class of worker. I&O-related recode guidelines are also created. This edit uses longitudinal editing, relational imputation, and hot deck imputation. The hot decks are defined by such variables as age, sex, race, and educational attainment.

5.	Earnings edits and recodes. This processing step edits the earnings series of items for earnings-eligible individuals. A usual weekly earnings recode is created to allow earnings amounts to be in a comparable form for all eligible individuals. There is no longitudinal editing because this series of questions is asked only of MIS 4 and 8 households. Hot deck imputation is used here. The hot deck for weekly earnings is defined by age, race, sex, major occupation recode, educational attainment, and usual hours worked. Additional earnings recodes are created.

6.	School enrollment edits and recodes. School enrollment items are edited for individuals 16 to 54 years old. Longitudinal edits or hot deck imputation based on age and other related variables are used as necessary.

7.	Disability edits and recodes. The disability items are asked of all interviewed adult records in MIS 1 and 5. Valid responses from MIS 1 and 5 are longitudinally filled into MIS 2−4 and 6−8 as appropriate. Any missing data, regardless of MIS, are filled using longitudinal or hot deck imputation.

8.	Certification edits and recodes. The certification items are edited in their own separate edit after completion of the I&O edit, since the occupation and labor force data are used in the imputation process for these items. Any missing data are filled using longitudinal or hot deck imputation.

Chapter 3-5: Organization and Training of the Data
Collection Staff
INTRODUCTION
The data collection staff for all Census Bureau
programs is directed through six ROs and two
contact centers. The ROs collect data using two
modes: CAPI and CATI. The six ROs report to the
Chief of the Field Division, whose headquarters is
located in Washington, DC. The two CATI facility
managers report to the Chief of the NPC located
in Jeffersonville, Indiana.

ORGANIZATION OF REGIONAL
OFFICES/CATI FACILITIES
The staffs of the ROs and CATI contact centers
carry out the Census Bureau’s field data collection
programs for both sample surveys and censuses.
Currently, the ROs supervise over 7,000 part-time and intermittent FRs who work on continuing current programs and one-time surveys.
Approximately 2,700 of these FRs work on the
CPS. When a census is being taken, the field staff
increases dramatically.
The location of the ROs and the boundaries of
their responsibilities are displayed in Figure 3-5.1.
RO areas were originally defined to evenly distribute the office workloads for all programs. Table
3-5.1 shows the average number of CPS units
assigned for interview per month in each RO.
Each RO is headed by a Regional Director (RD)
and has two Assistant Regional Directors (ARDs).
The CPS is the responsibility of the program and
management analysis coordinator who reports
to the RD through the ARD. The program coordinator has two or three CPS Regional Survey
Managers-Expert (RSM-E) on staff. The RSM-Es
work with Regional Survey Managers-Geography
(RSM-G) in the RO. Each RO has eight RSM-Gs.
The typical RO employs about 360 to 580 FRs who
are assigned to the CPS. Most FRs also work on
other surveys. ROs have a total of 408 full-time
Field Supervisors (FSs) who supervise teams
of FRs. Each FS is assigned around 17 FRs. The
primary function of the FS is to assist the RSM-Es
and RSM-Gs with training and supervising the field
interviewing staff. In addition, the FSs conduct
nonresponse reinterview follow-up with eligible
households. Like FRs, the FSs work out of their
home. However, the FRs are part-time or intermittent employees.
Despite the geographic dispersion of the sample
areas, there is a considerable amount of personal
contact between the supervisory staff and the FRs
accomplished mainly through the training programs and various aspects of the quality control
program. For some of the outlying PSUs, it is
necessary to use the telephone and written communication to keep in continual touch with all FRs.
The ROs, RSMs, and FSs also communicate with
the FRs using e-mail. FRs mostly communicate
with their FS and sometimes their RSM-G. The
RSM-Gs and FSs communicate with the RSM-Es. In
addition to communications relating to the work
content, there is a regular system for reporting
progress and costs.
The CATI contact centers are staffed with one
facility manager per each site who directs the
work of two to four Supervisory Operations
Specialists or Ops Specialists. Each Ops Specialist
is in charge of about five to ten supervisors or
Supervisory Statistical Assistants (Sups). Sups are
in charge of about 10 to 20 interviewers. In addition to conducting data collection for CPS, the
CATI contact centers also conduct quality control
reinterview follow-up with eligible households.
A substantial portion of the budget for field
activities is allocated to monitoring and improving
the quality of the FRs’ work. This includes FRs’
group training, monthly home studies, personal
observation, and reinterview. Approximately 25
percent of the CPS budget (including travel for
training) is allocated to quality enhancement. The
remaining 75 percent of the budget goes to FR
and FS salaries, all other travel, clerical work in
the ROs, recruitment, and the supervision of these
activities.

TRAINING FIELD REPRESENTATIVES
Approximately 20 to 25 percent of the CPS FRs
leave the staff each year. As a result, the recruitment and training of new FRs is a continuing task
in each RO. To be selected as a CPS FR, a candidate must pass the Basic Skills Test on reading,
arithmetic, and map reading. Applicants who pass
the Basic Skills Test must then pass the scored
Mock and Structured Job Interviews. Additionally,
the FR is required to live in or near the PSU in
which the work is to be performed and have a residence telephone and, in most situations, an automobile. As a part-time or intermittent employee,
the FR works 40 or fewer hours per week or
month. In most cases, new FRs are paid at the
GS−3 level and are eligible for payment at the
GS−4 level after 1 year of fully successful or better
work. FRs are paid mileage for the use of their
own cars while interviewing and for commuting to
classroom training sites. They also receive pay for
completing their home-study training packages.

FIELD REPRESENTATIVE TRAINING
PROCEDURES
Initial training for new Field Representatives
When appointed, each FR undergoes an initial
training program prior to starting his/her assignment. The initial training program consists of up
to 20 hours of preclassroom home study, 3.5 to
4.5 days of classroom training (dependent upon
the trainee’s interviewing experience) conducted
by an RSM or coordinator. The first day of training
(not required for experienced staff) is dedicated
to Generic Path to Success training, where FRs
learn generic soft interviewing skills and administrative tasks. An on-the-job field observation by
the FS during the FR’s first 2 days of interviewing
is also required. The classroom training includes
comprehensive instruction on the completion of
the survey using the laptop computer. In classroom training, special emphasis is placed on the
labor force concepts to ensure that the new FRs
fully grasp these concepts before conducting
interviews. In addition, a large part of the classroom training is devoted to practice interviews
that reinforce the correct interpretation and classification of the respondents' answers. A 12-hour post-classroom training consists of a home-study exercise and a test.
Trainees receive extensive training on interviewing skills, such as how to handle noninterview
situations, how to probe for information, ask
questions as worded, and implement face-to-face
and telephone techniques. Before beginning the
first month observation and assignment, each FR
also completes a practice phone interview with an

experienced FS. Each FR completes a home-study
exercise before the second month’s assignment
and, during the second month’s interview assignment, is observed for at least 1 full day by the
FS who gives supplementary training if needed.
The FR also completes a home-study exercise
and a final review test prior to the third month’s
assignment.

Training for All Field Representatives
As part of each monthly assignment, FRs are
required to complete a home-study exercise,
which usually consists of questions concerning
labor force concepts and survey coverage procedures. Once a year, the FRs are gathered in groups
of about 12 to 15 for 1 or 2 days of refresher training. These sessions are usually conducted by RSMs or FSs. These group sessions cover regular CPS
and supplemental survey procedures.

Training for Interviewers at the CATI Centers
Candidates selected to be CATI interviewers
receive classroom training. The classroom training
consists of 3 days (15 hours) of new hire orientation training and 2 days (10 hours) of computer
training in the CATI case management system,
WebCATI. Additionally, CATI interviewers selected
for CPS complete a 5-hour CPS subject matter
self-study and receive a 3-day classroom training
(15 hours) specifically about the CPS. New CPS
interviewers are then closely monitored by having coaches observe recorded interviews. Prior to
data collection each month, all CATI interviewers
are required to come into the contact center to
complete a self-study that provides a review and
clarification of specific CPS concepts.

FIELD REPRESENTATIVES
PERFORMANCE GUIDELINES
Performance guidelines have been developed
for CPS CAPI FRs for response, nonresponse,
and production. Response and nonresponse rate
guidelines have been developed to ensure the
quality of the data collected. Production guidelines have been developed to assist in holding
costs within budget and to maintain an acceptable level of efficiency in the program. Both
sets of guidelines are intended to help supervisors analyze activities of individual FRs and to
assist supervisors in identifying FRs who need to
improve their performance.


Each FS is responsible for developing each employee to his or her fullest potential. Employee development requires meaningful feedback on a continuous basis. By acknowledging strong points and highlighting areas for improvement, the FS can monitor an employee's progress and take appropriate steps to improve weak areas.

FR performance is measured by a combination of the following: response rates, production rates, supplement response, reinterview results, observation results, accurate payrolls submitted on time, deadlines met, reports to the FS, training sessions, and administrative responsibilities.

The most useful tool to help supervisors evaluate FR performance is the CPS 11−39 form. This report is generated monthly and is produced using data from the Regional Office Survey Control System (ROSCO) and the Cost and Response Management Network (CARMN) systems. This report provides the supervisor information on:

•	Workload and number of interviews.
•	Response rate and numerical rating.
•	Noninterview results (counts).
•	Production rate, numerical rating, and mileage.
•	Meeting transmission goals.

EVALUATING FIELD REPRESENTATIVE PERFORMANCE
Census Bureau headquarters, located in Suitland, Maryland, provides guidelines to the ROs for developing performance standards for FRs' response and production rates. The ROs have the option of using the production guidelines, modifying them, or establishing a completely different set of standards for their FRs. ROs must notify the FRs of the standards.

Response Rate Guidelines
Maintaining high response rates is of primary importance to the Census Bureau. The response rate is defined as the proportion of all sample households eligible for interview that is actually interviewed. It is calculated by dividing the total number of interviewed households by the sum of the number of interviewed households, the number of refusals, noncontacts, and eligible households that were not interviewed for other reasons, including temporary refusals. (All of these noninterviews are referred to as Type A noninterviews.) Type A cases do not include vacant units, those that are used for nonresidential purposes, or other addresses that are not eligible for interview.

Production Guidelines
The production guidelines used in the CPS CAPI program are designed to measure the efficiency of individual FRs and the RO field functions. Efficiency is measured by total minutes per case. The measure is calculated by dividing total time reported on payroll documents, which includes interview time and travel time, by total workload. When looking at an FR's production, FSs must consider extenuating circumstances, such as:

•	Unusual weather conditions such as floods, hurricanes, or blizzards.
•	Extreme distances between sample units or assignments that cover multiple PSUs.
•	A large number of inherited or confirmed refusals.
•	Working part of another FR's assignment.
•	An inordinate number of temporarily absent cases.
•	A high percentage of Type B/C noninterviews that decrease the base for the nonresponse rate.
•	Other substantial changes in normal assignment conditions.
•	Personal visits/phones.

The supplement response rate is another measure that FSs must use in measuring the performance of their FR staff.
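Both guideline measures reduce to simple ratios. The sketch below restates the response rate and minutes-per-case definitions given above; the function and variable names are ours, not ROSCO's or CARMN's.

    def response_rate(interviewed, refusals, noncontacts, other_type_a):
        # Type A noninterviews: refusals, noncontacts, and other eligible
        # households not interviewed. Vacant and otherwise ineligible units
        # (Type B/C) are excluded from the base.
        eligible = interviewed + refusals + noncontacts + other_type_a
        return 100 * interviewed / eligible

    def minutes_per_case(interview_minutes, travel_minutes, workload):
        # Production guideline: total payroll time (interview plus travel)
        # divided by total workload.
        return (interview_minutes + travel_minutes) / workload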

Transmittal Rates
The ROSCO system allows the supervisor to
monitor transmittal rates of each CPS FR. A
daily receipts report shows the progress of each case on CPS.

Observation of Fieldwork
Field observation is one of the methods used by
the supervisor to check and improve performance
of the FR staff. It provides a uniform method for
assessing the FR’s attitude toward the job, use of
the computer, and the extent to which FRs apply
CPS concepts and procedures during actual work
situations. There are three types of observations:


•	Initial observations (N1, N2, N3).
•	General performance review.
•	Special needs (SN).

Initial observations are an extension of the initial classroom training for new hires and provide
on-the-job training for FRs new to the survey.
They also allow the survey supervisor to assess
the extent to which a new CPS CAPI FR grasps
the concepts covered in initial training and are
an integral part of the initial training given to all
FRs. A 2-day initial observation (N1) is scheduled
during the FR’s first CPS CAPI assignment. A
second 1-day initial observation (N2) is scheduled
during the FR’s second CPS CAPI assignment. A
third 1-day initial observation (N3) is scheduled
during the FR’s fourth through sixth CPS CAPI
assignment.
General performance review observations are
conducted at least annually and allow the supervisor to provide continuing developmental feedback
to all FRs. Each FR is regularly observed at least
once a year.
SN observations are made when an FR has problems or poor performance. The need for a SN
observation is usually detected by other checks on
the FR's work. For example, SN observations are conducted if an FR has a high Type A noninterview rate, a high minutes-per-case rate, a failure on reinterview, or an unsatisfactory evaluation on a previous observation; if the FR has requested help; or for other reasons related to the FR's performance.
An observer accompanies the FR for a minimum
of 6 hours during an actual work assignment.
The observer notes the FR’s performance including how the interview is conducted and how
the computer is used. The observer stresses the
requirement to ask questions as worded and in the
order presented on the CAPI screen. The observer
also covers adhering to instructions on the instrument and in the manuals, knowing how to probe,
recording answers correctly and in adequate
detail, developing and maintaining good rapport
with the respondent conducive to an exchange
of information, avoiding questions or probes that
suggest a desired answer to the respondent, and
determining the most appropriate time and place
for the interview. The observer also stresses vehicular and personal safety practices.

The observer reviews the FR’s performance and
discusses the FR’s strong and weak points with an
emphasis on correcting habits that interfere with
the collection of reliable statistics. In addition, the
FR is encouraged to ask the observer to clarify
survey procedures not fully understood and to
seek the observer’s advice on solving other problems encountered.

Unsatisfactory Performance
When the performance of an FR is at the unsatisfactory level over any period (usually 90 days),
he or she may be placed in a trial period for 90
days. Depending on the circumstances, and with
guidance from the Human Resources Division, the
FR will be issued a letter stating that he or she
is being placed in a Performance Improvement
Plan (PIP). These administrative actions warn
the FR that his or her work is substandard, provide specific suggestions on ways to improve
performance, alert the FR to actions that will be
taken by the survey supervisor to assist the FR in
improving his or her performance, and notify the
FR that he or she is subject to separation if the
work does not show improvement in the allotted
time. Once placed on a PIP, the RO provides performance feedback during a specified time period
(usually 30 days, 60 days, and 90 days).

EVALUATING INTERVIEWER
PERFORMANCE AND QUALITY
GUIDELINES AT THE CATI CONTACT
CENTERS
Each CPS supervisor is responsible for ensuring
their assigned interviewers perform at satisfactory levels. Interviewer performance is measured
at both an overall level and a survey-specific
level. Interviewers’ performance evaluations take
into account their attendance and administrative
responsibilities such as completing their payrolls
correctly, completing required self-studies, and
completing required training classes. Additionally,
interviewer performance is measured for each
survey they are assigned to, and performance is
measured qualitatively and quantitatively.
Each CATI interviewer’s work on CPS is measured
qualitatively through the review of recorded interviewing sessions. Calls are digitally recorded and
reviewed using the NICE Interaction Management
(NIM) software. The recordings capture the
content of each call, and the content is stored in
NIM.

Figure 3-5.1.
U.S. Census Bureau Regional Office Boundaries
[Map of regional office boundaries.]

Informed consent is obtained from respondents
before calls are recorded, and
all calls that receive consent are recorded. The
NIM software randomly assigns recent recordings
to coaches or monitors at the contact centers.
Coaches are then required to review and evaluate
these sessions in a timely manner as the coach/
monitor assignments expire in the NIM software
after 5 days. Coaches evaluate these sessions
using an evaluation scorecard that has numerous
categories relating to survey-specific guidelines
and procedures. Each category has a set point
value and a “yes/no” option. Coaches are also
required to note a specific timeframe on the
recording when an issue occurred and describe
the reason an element was marked down.
The overall score can range from zero to 100.
Scores below 81 are considered unsatisfactory.
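The scorecard arithmetic lends itself to a brief sketch. The category names and point values below are hypothetical, illustrating only the yes/no scoring mechanics, not the actual NIM scorecard:

    # Hypothetical categories and point values illustrating the yes/no
    # scorecard mechanics; not the actual NIM evaluation scorecard.
    SCORECARD = {
        "asked_questions_as_worded": 40,
        "followed_survey_procedures": 30,
        "maintained_professional_tone": 20,
        "recorded_answers_correctly": 10,
    }

    def score_session(marks):
        """Sum the point values of categories marked 'yes' (0 to 100)."""
        return sum(points for category, points in SCORECARD.items()
                   if marks.get(category))

    marks = {"asked_questions_as_worded": True,
             "followed_survey_procedures": True,
             "maintained_professional_tone": False,
             "recorded_answers_correctly": True}
    score = score_session(marks)            # 80
    needs_immediate_feedback = score < 81   # below 81 is unsatisfactory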

Interviewers are given feedback on all of their
monitored sessions scoring below 100;
however, only those sessions below 81 receive
immediate feedback. Feedback can be delivered
by either a coach or a supervisor, and the interviewer has the opportunity to observe their own
recording. This gives an interviewer an opportunity to see any errors made and ensure the evaluation is reflective of their performance. If the interviewer is in agreement with the evaluation, they
digitally sign the evaluation; however, if they are in
disagreement, they have the option to dispute the
evaluation and have it reviewed by a supervisor.
Interviewers who receive three consecutive
unsatisfactory evaluations or demonstrate severe
misconduct on CPS are placed on an Individual
Coaching Assistance plan. Under this plan, a
minimum of three to five inbound and three to five
outbound calls are randomly assigned to coaches
to evaluate each month. Interviewers remain in
the Individual Coaching Assistance plan until they
receive three consecutive satisfactory evaluations.
Once this occurs, the interviewer is placed onto
a Systematic plan where one inbound and one
outbound call is assigned to coaches to evaluate
each week. Interviewers who are newly trained on
CPS are placed on an Initial plan. This plan follows
the same guidelines as the Individual Coaching
Assistance plan; however, it is considered a continuation of training and poor scores under this
program do not affect an employee’s performance
evaluation.

Interviewers’ work on CPS is measured quantitatively using WebCATI statistics. Supervisors
meet with interviewers on an individual basis
each month to review their statistics on each
survey. Supervisors are able to use statistics from
WebCATI to track interviewer completes, efficiency (measured by completes per hour), call
attempts per hour, refusal conversion rates, and
numerous other categories. Supervisors can then
compare these statistics to prior months’ performance levels and the contact center averages.

Table 3-5.1.
Average Monthly Workload by Regional Office: 2017

Regional office       Total    Percent   Base workload   CATI workload
  Total. . . . . .   73,382     100.00          66,528           6,854
New York . . . . .   10,696      14.58           9,681           1,015
Philadelphia . . .   12,982      17.69          11,779           1,203
Chicago. . . . . .    9,799      13.35           8,916             883
Atlanta. . . . . .   12,594      17.16          11,396           1,198
Denver . . . . . .   16,075      21.91          14,581           1,494
Los Angeles. . . .   11,236      15.31          10,175           1,061

Note: Derived from Current Population Survey Monthly Summary Report, January 2017–December 2017.


Unit 4
Current Population Survey
Data Quality


Chapter 4-1: Nonsampling Error
INTRODUCTION
This chapter discusses sources of nonsampling
error that can affect the Current Population
Survey (CPS) estimates. The effect of nonsampling error is difficult to measure, and the full
impact on estimates is generally unknown. The
CPS strategy is to examine potential sources of
nonsampling error and to take steps to minimize
their impact. Some sources of nonsampling error
are external; the CPS cannot control them
(e.g., Master Address File [MAF] errors or errors in
benchmark population controls).
Total survey error comprises sampling error and
nonsampling error. The sampling error of an
estimate is commonly measured by its standard
error (square root of the variance). Nonsampling
errors can arise at any stage of a survey, and they
can affect both bias and variance, but the principal concern is usually the bias they introduce
into a survey.
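One standard way to formalize this decomposition (a general statistical identity, not a CPS-specific formula) is through the mean squared error of an estimator, which combines the variance, dominated by sampling error, with the squared bias, often dominated by nonsampling error:

    \mathrm{MSE}(\hat{\theta}) = \mathrm{Var}(\hat{\theta}) + \bigl[\mathrm{Bias}(\hat{\theta})\bigr]^{2},
    \qquad \mathrm{SE}(\hat{\theta}) = \sqrt{\mathrm{Var}(\hat{\theta})}.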
Nonsampling error can be attributed to many
sources and many types could affect the results of
a census as well as a survey. The following categories of nonsampling errors are discussed in this
chapter:
•	 Processing error.
•	 Coverage error.
•	 Nonresponse error.
•	 Measurement error.
•	 Other sources of nonsampling error.

The sources, controls, and quality indicators of
nonsampling errors presented in this chapter do
not necessarily constitute a complete list.

PROCESSING ERROR

Computer Processing Error
All survey operations, from sampling to field processing to tabulation, involve computer processing. At any stage of survey operations, it is conceptually possible for computer processing errors
to be introduced. Best practices in development
and testing prevent major computer processing
problems. Quality assurance procedures also track
data processing and detect errors.

Computer systems at the U.S. Census Bureau and
the Bureau of Labor Statistics (BLS) are compliant with federal requirements. This applies to
hardware, platforms, operating systems, software,
backups, and security procedures that control
access to computers. There is tight security on
data files and the production systems for CPS processing. There are contingency plans for continuing operations under weather-related and other
emergencies. Both agencies have offsite backups
that allow processing to continue should primary
systems fail.

Production system development is rigorous.
Programs have written specifications and data
files have detailed documentation. Production
programs are written with standard programming languages that have commercial support.
Testing procedures ensure files can be read and
that programs conform to specifications. Changes
to the production system are tested using prior
data, simulating associated changes in data files
as needed.

Quality assurance procedures during monthly processing include the following:
•	Embedded checks in programs identify faulty data.
•	Record counts and file sizes are tracked at various stages of production.
•	Person weights are summed to make sure they add to the benchmark population controls (see the sketch after this list).
•	The Census Bureau and the BLS independently produce a set of tables and compare them cell by cell.
•	Data are compared to the previous month and to the same month of the previous year by geographic, demographic, and labor force characteristics.
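As a minimal sketch of the weight-sum check above (the column names, cell definitions, and tolerance are assumptions for illustration, not the CPS production system):

    # Minimal sketch of a weight-sum quality assurance check; column
    # names, cell definitions, and tolerance are illustrative assumptions.
    import pandas as pd

    def check_weight_sums(persons: pd.DataFrame, controls: pd.DataFrame,
                          tol: float = 0.5) -> pd.DataFrame:
        """Compare summed person weights to benchmark controls by cell."""
        sums = persons.groupby(["age_group", "sex"])["weight"].sum()
        merged = controls.set_index(["age_group", "sex"]).join(
            sums.rename("weight_sum"))
        merged["diff"] = merged["weight_sum"] - merged["control_total"]
        # Any cell whose weighted total misses its control beyond the
        # tolerance is flagged for review.
        return merged[merged["diff"].abs() > tol]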


Sample Processing Error
The CPS sample is coordinated with the samples
of several other demographic surveys conducted
by the Census Bureau. The sample selection
process accounts for needed sample sizes, determines necessary sampling intervals, and has a
protocol to ensure that a housing unit (HU) is
selected for at most one survey in a 5-year period.
Sampling verification has a testing phase and a
production phase. Verification in both phases is
done independently by an information technology
specialist and a subject-matter expert.
The testing phase includes systems testing and
unit testing. In systems testing, programs are
tested individually and collectively per specifications. In unit testing, trial data sets are run
through the programs. In particular, simulated
data sets with unusual observations are used to
test the performance of the system in extreme
situations.
The goal of the production phase is to verify
that samples were selected as specified and that
selected HUs have appropriate information, such
as correct sampling probabilities. The number of
HUs selected is compared to a computed target
for each state. Outputs at several stages of sampling are compared to results from the previous
annual sample to monitor consistency. The sample
files are checked for incomplete or inconsistent
HU information.

Field Processing Error
Many field processing errors are prevented by
computer systems. Particular attention is given to
the transmission of sample assignments and data.
Monthly sample processing operations send out
assignments through the regional offices (ROs)
to field representatives (FRs) and to telephone
interviewers in the contact centers. That information is shared with supervisory and RO staff. ROs
verify that each interviewer obtained their entire
assignment. The system ensures each sampled
HU that is to be interviewed is assigned once and
only once. During data collection, daily summary
information about each interviewer is available.
The ROs can make modifications to interviewer
assignments, if necessary.
Quality assurance procedures are implemented
on data transmissions from FRs and the contact
centers. During data collection, daily summary
information on transmissions is tracked. ROs verify
that all FR assignments are returned and resolve
any discrepancies before certifying completion.
System checks verify total counts and ensure that
one final report—either a response or a reason for
nonresponse—is obtained for every HU and only
one response is retained per HU.
New interviewers are thoroughly trained on
retrieving assignments and transmitting data.
They are periodically assigned refresher training.
See Chapter 3-5, Organization and Training of
Staff, for more information.

COVERAGE ERROR
Coverage error exists when a survey does not
completely represent the population of interest.
The population of interest in the CPS is the civilian
noninstitutional population aged 16 and over
(CNP16+). Ideally, every person in this population
would have a known, nonzero probability of being
sampled or “covered” by the CPS.

Sampling Frame Omission
The CPS reaches persons in the CNP16+ through
a sample of HUs selected from the MAF, which
serves as its sampling frame. The MAF is used by
several demographic surveys conducted by the
Census Bureau. For more information about the
MAF, see Chapter 2-1, CPS Frame.
The MAF is a list of living quarters nationwide,
updated semiannually, which may be incomplete
or contain units that cannot be located. Delays
in identifying new construction and adding those
HUs to the MAF are the primary cause of frame
omission. Other examples of omission include
units in multiunit structures missed during canvassing, mobile homes that move into an area
after canvassing, and new group quarters that
have not been identified.

Housing Unit Misclassification
CPS operations classify some sampled HUs,
including some group quarters (Chapter 2-1,
CPS Frame), as temporarily (Type B) or permanently (Type C) out-of-scope. The remaining
HUs are treated as in-scope, although some
do not respond (Type A). Misclassification can
result from identifying too few (overcoverage) or
too many (undercoverage) HUs as out-of-scope.
Misclassification reduces the efficiency of weighting adjustments made to compensate for HU
nonresponse.


Type B HUs remain in interviewer assignments and
are rechecked in later months. There are many
reasons for a Type B classification. There may be
no in-scope persons living in the HU. For example,
the unit may be vacant, the adults may all be in
the armed forces, the unit may only be inhabited
by persons under the age of 16, or all persons in
the unit may have a usual residence elsewhere.
Examples of other Type B HUs include those
under construction but not ready for habitation,
conversions to temporary business or storage, and
to-be-demolished structures. Type C HUs do not
remain in interviewer assignments. Type C examples include demolished structures, mobile residences that have moved, and structures converted
to permanent business or storage.
Misclassification errors are minimized with uniform
procedures and guidelines for identification of
Type B and Type C HUs. However, some structures
may not be easily classified by the interviewer.
Type B HUs may be especially difficult to identify
in some situations as there is a wide range of reasons for that classification. Errors made in classifying out-of-scope HUs affect CPS coverage.

Housing Unit Roster Error
Creating the HU roster is a step in determining
CPS coverage. During the first interview of a HU,
the HU roster is established. It can be changed
during subsequent interviews.
The automated questionnaire guides the rostering
process, minimizing potential errors, and interviewers are trained to handle difficult situations.
Some errors may still arise, affecting CPS coverage. Examples include omission of residents,
erroneous inclusion of persons, misclassification
of persons as armed forces or usual residence
elsewhere, and misclassification of demographic
information of residents. Demographic information, such as age, sex, race, and ethnicity, is used
to benchmark CPS data to corresponding population controls. Misclassification of respondents
affects the efficiency of CPS weighting.
Another potential source of coverage error is
changing HU composition. A HU is included in the
CPS sample for 8 months. It is interviewed for 4
consecutive months, excluded from interview the
next 8 months, and interviewed again for the next
4 months. The characteristics of the HU roster
are subject to change over this 16-month period
(Grieves and Robison, 2013).

Coverage Ratios
Coverage ratios are measures of CPS coverage
monitored at various stages of the weighting
process, calculated by dividing weighted respondent totals by corresponding population controls.
The weighting and benchmarking steps reduce
CPS coverage error by calibrating respondent
weights to the population controls for certain
demographic subgroups. However, since both the
weighting process and the population controls
themselves are imperfect, some coverage error
remains after the weighting process is complete.
For example, the iterative second-stage weighting procedure does not eliminate all nonsampling
errors associated with undercoverage or overcoverage. Instead, it eliminates the relative undercoverage or overcoverage compared to benchmark
CNP16+ controls for particular subgroups. See
Chapter 2-3, Weighting and Estimation, for details
about the weighting process and population
controls, including specific weighting terminology
that is utilized in the tables of this section.
A coverage ratio less than 1.00 represents undercoverage, while a coverage ratio greater than 1.00
represents overcoverage. As displayed in Figure
4-1.1, using first-stage weights, the national coverage ratio for total CNP16+ ranged from 0.872
to 0.900 between December 2014 and December
2017. Coverage ratios remained generally the
same over this 3-year time period. Coverage for
Whites (0.896 to 0.927) is higher than for total
CNP16+, while coverage is poorer for Blacks
(0.771 to 0.820), Hispanics (0.787 to 0.842), and
other race and ethnicity groups (0.765 to 0.835).
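The computation behind these ratios is simple division; a minimal sketch, with the inputs invented for illustration:

    # Minimal sketch of a coverage ratio: the weighted respondent total
    # for a subgroup divided by its benchmark population control.
    def coverage_ratio(respondent_weights, population_control):
        """Return the coverage ratio; values below 1.00 indicate
        undercoverage, values above 1.00 overcoverage."""
        return sum(respondent_weights) / population_control

    # E.g., a weighted respondent total of 218.0 (millions) against a
    # control of 250.0 (millions) gives 0.872 (undercoverage); these
    # inputs are invented for illustration.
    ratio = coverage_ratio([218.0], 250.0)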

NONRESPONSE ERROR
After identifying out-of-scope (Type B and Type
C) HUs, the remaining sampled HUs are identified
as eligible to be interviewed. In any given month
of inclusion in the CPS sample, the HU’s residents
may not provide usable or complete data. This
section discusses nonresponse errors in the CPS
and the methods used to minimize their impact on
the quality of CPS estimates.

Figure 4-1.1.
Current Population Survey Total Coverage Ratios: December 2014–December 2017, National Estimates
[Line chart of monthly coverage ratios for the Total, White, Black, Other, and Hispanic groups; vertical axis from 0.70 to 1.00.]
Source: U.S. Census Bureau, Current Population Survey Monthly Summary Report, December 2014–December 2017.
The CPS is affected primarily by unit nonresponse
and item nonresponse. Unit nonresponse occurs
when the residents of eligible HUs are not interviewed. Item nonresponse occurs when the residents of HUs eligible for interviewing provide
incomplete responses, answering some but not all
questions in the interview. Both of these types of
nonresponse are discussed in more detail in the
following sections.

Person nonresponse occurs when some individuals within an eligible HU are not interviewed.
Person nonresponse has not been much of a problem in the CPS because any responsible person
aged 15 or over in the HU is able to respond for
others in the HU as a proxy.

Unit (Type A) Nonresponse

In CPS, unit nonresponses—eligible HUs that do
not yield any interviews—are called Type A nonresponses. There are many reasons why a Type
A nonresponse may occur. A majority of Type
A nonresponses are due to refusals, which can
result from privacy concerns, attitudes toward
the government, or other reasons. Other potential reasons, such as language barriers or inability
to locate HUs, cause few Type A nonresponses,
since alternative language speaking interviewers
(from the contact centers) and Global Positioning
Systems are utilized to avoid these problems.

The Type A nonresponse rate is calculated by
dividing total Type A nonresponses by total eligible HUs. This can be computed using raw counts
or by weighting each HU by its base weight.
National unweighted Type A nonresponse and
refusal rates from December 2014 to December
2017 are shown in Figure 4-1.2. The overall Type
A rate ranged from 11.7 percent to 15.1 percent
during this 37-month timespan, mostly resulting
from refusals, which varied between 8.0 percent
and 11.9 percent. About two-thirds of Type A
nonresponses were due to refusals.

Figure 4-1.2.
Current Population Survey Nonresponse Rates: December 2014–December 2017, National Estimates
(Percent)
[Line chart of the overall Type A rate and the refusal rate; vertical axis from 0.00 to 16.00 percent.]
Source: U.S. Census Bureau, Current Population Survey Monthly Summary Report, December 2014–December 2017.

As discussed in Chapter 2-3, Weighting and
Estimation, the weights of responding HUs are
increased to account for nonresponding units.
These nonresponding HUs may systematically
differ from responding HUs, introducing bias
into the weighting and estimation process. This
bias is a source of nonsampling error. Weighting-cell nonresponse adjustment compensates for
nonresponse rates that differ by geographic area
and metropolitan status, which reduces bias due
to HU nonresponse but does not eliminate it. How
closely responding HUs resemble nonresponding
HUs varies by weighting cell.
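A minimal sketch of the Type A rate computation described above, in its unweighted and base-weighted forms (the case records and field names are invented for illustration):

    # Minimal sketch of the Type A nonresponse rate: Type A cases
    # divided by eligible HUs, unweighted or base-weighted. The case
    # records and field names are illustrative assumptions.
    def type_a_rate(cases, weighted=False):
        """Return the Type A nonresponse rate as a percent."""
        eligible = [c for c in cases if c["eligible"]]
        total = sum(c["base_weight"] if weighted else 1 for c in eligible)
        type_a = sum(c["base_weight"] if weighted else 1
                     for c in eligible if c["outcome"] == "type_a")
        return 100 * type_a / total

    cases = [
        {"eligible": True, "outcome": "interview", "base_weight": 1500.0},
        {"eligible": True, "outcome": "type_a",    "base_weight": 2100.0},
        {"eligible": True, "outcome": "interview", "base_weight": 1800.0},
    ]
    unweighted = type_a_rate(cases)               # 33.3 percent
    weighted = type_a_rate(cases, weighted=True)  # about 38.9 percent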
Dissimilarities between respondents and nonrespondents can arise for many reasons. Mode of
interview can affect responses. Households in
the CPS sample in the first or fifth month tend
to have different response propensities than in
other months. Respondents refusing to answer
items related to income data are more likely to
refuse to participate in subsequent months and
are less likely to be unemployed (Dixon, 2012).
Demographic information may be related to
response propensity in some areas, but it is not
explicitly included in weighting cell construction.
The best way to limit nonresponse error is to have
uniform procedures in place to limit nonresponse
and increase response rates. For example:
•	An interviewer must report a possible refusal to the RO as soon as possible so conversion attempts can be made.
•	Monthly feedback is provided to the interviewers for converting reluctant respondents.
•	All ROs send tailored letters to not-interviewed HUs that remain in the survey for additional months.

Unit nonresponse is also subject to misclassification error. Some Type A nonresponses may be
misclassified as Type B (temporarily out-of-scope)
or Type C (permanently out-of-scope) HUs. For
example, a difficult-to-reach HU may be misclassified as “vacant” when it should be classified as a
Type A nonresponse.
By its very nature, unit nonresponse error can
be difficult to quantify or assess since it occurs
when sample units do not respond to the survey.
Research findings by Tucker and Harris-Kojetin
(1997) and Dixon (2004, 2012) show little bias
arising from unit nonresponse, particularly for
major CPS estimates like the unemployment rate.

Item Nonresponse
Item nonresponse occurs when answers are not
provided for all questions (or items) of the CPS
interview. Item nonresponse rates, which include
invalid data as well as missing values, vary by
question, as shown in Table 4-1.1. Item nonresponse is quite small for demographic and labor
force questions, but is larger for industry, occupation, and earnings data.
To minimize item nonresponse, navigation is built
into the automated CPS questionnaire so that
questions are not inadvertently skipped, and only
appropriate responses for a person’s path through
the questionnaire can be recorded. However, the
automated questionnaire does not include extensive editing, and some items can be left blank. For
example, a proxy respondent may be unsure of
another HU member’s hours of work, or a person
may refuse to answer a sensitive question.
Ignoring item nonresponse runs the risk of incurring bias. Item nonresponses generally are not
randomly distributed. For example, it is common
for item nonresponse rates to differ by geography and demographics. The CPS uses item
imputation to fill in item nonresponses. Three
imputation methods, detailed in Chapter 3-4, Data
Preparation, are used:
•	Relational. Likely values are logically inferred by analyzing monthly answers for other data items provided by the respondent.
•	Longitudinal. A previous month's value provided by the respondent may be used.
•	Hot deck. A value is assigned from a similar HU or person.
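As a rough illustration of the hot-deck idea only (CPS production hot decks use far more detailed matching cells and donor rules), a missing item is filled from the most recently seen similar record:

    # Rough illustration of sequential hot-deck imputation: a missing
    # item is filled from the last responding record in the same
    # matching cell. The cell keys are simplified assumptions.
    def hot_deck_impute(records, item, cell_keys=("age_group", "sex")):
        donors = {}  # most recent observed value per matching cell
        for rec in records:
            cell = tuple(rec[k] for k in cell_keys)
            if rec.get(item) is not None:
                donors[cell] = rec[item]      # this record can be a donor
            else:
                rec[item] = donors.get(cell)  # impute from the donor
        return records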

The CPS imputation procedures are imperfect and
can introduce bias. A nonsampling error occurs
when an imputed value is incorrect. However,
there is no definitive way of knowing which
imputed values are incorrect. What is important is
the net effect of all imputed values. Imputations
are consistent with what is known about the
distributions of responses. Biased estimates
result when the characteristics of nonrespondents
differ from the characteristics of respondents.

The magnitude of nonsampling error arising from
item nonresponse is difficult to quantify, but its
impact is likely very small for demographic and
labor force questions since their item nonresponse
rates are very low. Depending on the net effect of
imputed values, bias may be (but is not necessarily) larger for industry, occupation, and earnings
data since their item nonresponse rates are higher.

MEASUREMENT ERROR
Survey measurement may be incorrect for a
variety of reasons. Vague concepts, imperfectly
designed questions, interviewer errors, and
respondent errors all can contribute to measurement error. The mode of interview, personal visit
or telephone, is known to affect measurement.
Errors may arise when a respondent provides
answers for another member of the household.
A respondent may not know the true answer to
a question or may provide an estimate such as
for earnings or the dates of a job search. Despite
extensive training for interviewers and coders,
industry and occupation (I&O) data based on
open-ended descriptions given by respondents
may be coded incorrectly.

Automated Questionnaire
CPS concepts are well defined, particularly those
for labor force classification (see Chapter 1-2,
Questionnaire Concepts and Definitions). When
the first automated questionnaire was designed,
it was thoroughly tested for agreement with the
concepts.
The Census Bureau and the BLS are cautious
about adding or changing questions. New questions are reviewed to make sure they will not
interfere with the collection of basic labor force
information. The testing protocol avoids unintended disruptions to historical series and detects
problems with question wording. Refer to Chapter
1-2, Questionnaire Concepts and Definitions, for
more information on the CPS questionnaire and
the protocol for testing questions.

Table 4-1.1.
National Current Population Survey Item Types, Monthly Average Percentage of Invalid or Missing Values: July 2017–December 2017

Household   Demographic   Labor force   Industry and occupation   Earnings¹
   5.5          2.9            3.2               15.0                24.4

¹ Certain items of the earnings series of questions had invalid/missing rates around 40 percent.
Source: U.S. Census Bureau, Current Population Survey Summary Report, July 2017 through December 2017.


The automated questionnaire contains several
features that help control measurement error as
listed below.
•	Names, pronouns, and reference dates are inserted into questions.
•	Question wording is standard and uses plain language.
•	Complex questions are split into several shorter questions.
•	Navigation of complicated skip patterns is built into the questionnaire, so that a respondent is not asked questions that can be skipped based on answers to earlier questions.
•	Human error is reduced by keying responses into predefined categories, so that only valid responses may be accepted for a question.
•	On-screen help is available to assist matching open-ended responses to predefined response categories.
•	Basic real-time edits check responses for internal consistency, allowing many potential errors to be corrected prior to transmission (a minimal sketch follows this list).
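A minimal sketch of the kind of real-time edit the last bullet describes; the items and limits are invented for illustration, not the actual CPS instrument logic:

    # Minimal sketch of a real-time range and consistency edit; item
    # names and limits are illustrative assumptions.
    def edit_hours(response):
        """Flag out-of-range or internally inconsistent answers."""
        errors = []
        hours = response.get("hours_last_week", 0)
        if not 0 <= hours <= 168:
            errors.append("hours_last_week outside 0-168")
        if hours > 0 and response.get("worked_last_week") == "no":
            errors.append("positive hours for a person reported as not working")
        return errors  # surfaced to the interviewer before transmission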

Another helpful feature in controlling measurement error is that HU and person data are carried forward to the next interview. Prior months’
responses can be accepted as still current under
the proper circumstances, such as a person's occupation when there is no change of job. In addition
to reducing respondent and interviewer burden, the automated questionnaire avoids erratic
variation in I&O codes for people who have not
changed jobs but describe their I&O differently in
successive months. The automated questionnaire
provides the interviewer opportunities to review
and correct information before continuing to the
next series of questions.

Mode of Interview
A panel of HUs is included in the CPS sample for 8
months. Ideally, interviews are conducted in person by an FR during a HU's first and fifth months-in-sample (MIS 1 and 5). Interviews during the
other months (MIS 2–4 and MIS 6–8) are usually
conducted over telephone by a FR or an interviewer from the contact centers. In some cases, a
combination of in-person and telephone may be
needed to complete an interview.

The mode of interview can systematically affect
responses. Mode effects cannot be easily estimated using the current CPS sample since there are
confounding influences—e.g., HUs are not randomly assigned to interview mode.
A parallel survey was conducted from 1992 to 1993
before the implementation of the automated questionnaire. A study found only modest mode-of-interview effects (Thompson, 1994). It is unknown
if similar effects would be realized in the full CPS
survey conducted by staff fully acclimated to the
automated questionnaire.

Proxy Reporting
The CPS seeks information about all eligible people in a sample HU, whether or not they are available for an interview. Since CPS data collection
occurs on a brief time schedule, interviewers can
accept reports about all household members from
any knowledgeable person in the household aged 15 or older.
Respondents who provide answers about other
household members are called proxy reporters.
Measurement error occurs when proxy reporters
are unable to provide accurate information about
the demographic and labor force characteristics
of other household members. Self-reporting is
typically more accurate for CPS data (Kojetin and
Mullin, 1995; Tanur, 1994). The CPS instrument is
designed to reduce measurement error if a proxy
respondent cannot provide labor force or earnings
information for someone. The instrument allows
the interviewer to skip over the questions for that
person and go to the next eligible person. At the
end of the interview, the interviewer can ask if
anyone else can provide the needed information
for the person, and if not, set up a callback for a
later time.
Table 4-1.2.
Percentage of Current Population Survey Labor Force Reports by Type of Respondent: 2015–2017

Type                            Percent
Self reports. . . . . . . . . .    50.6
Proxy reports . . . . . . . . .    49.2
Both self and proxy . . . . . .     0.1

Source: U.S. Census Bureau, Tabulation of 2015–2017 Current Population Survey Microdata.


Table 4-1.2 shows overall proxy reporting levels for
2015 through 2017. The level of proxy reporting in
the CPS has historically been around 50 percent.
Proxy reporters are more likely to be found at
home when the interviewer calls or visits. For this
reason, the incidence of proxy reporting tends
to be systematically related to important labor
force characteristics and demographics. If proxy
answers contain systematic reporting errors, then
bias is introduced into related estimates. As it
is difficult to measure how much response error
occurs due to proxy reporting, it is difficult to
assess the resulting bias in CPS estimates.

OTHER SOURCES OF NONSAMPLING
ERROR
Industry and Occupation Coding Verification
Files containing records needing I&O codes
are transmitted electronically to the National
Processing Center (NPC). For each record on the
file, the NPC coder enters four-digit numeric I&O
codes that best represent the job description
recorded during the CPS interview. Nonsampling
error can arise during the coding process due
to unclear descriptions or incorrect coding
assignment.
A substantial effort is directed at the supervision
and quality control of the I&O coding operation.
If the initial coder cannot determine the proper
code, the case is assigned a referral code and will
later be coded by a referral coder. During coding,
a supervisor may turn on dependent verification
in which the work is verified by a second coder.
Additionally, a sample of each month’s cases is
selected to go through a quality assurance system
to evaluate each coder’s work. See Chapter 3-4,
Data Preparation, for more information on I&O
coding.

Current Population Survey Population
Controls
Benchmark controls for the monthly CNP16+
are used for the weighting steps that produce
second-stage weights and composite weights.
National and state CPS population controls are
developed by the Census Bureau. Starting with
data from the most recent decennial census, the
population controls are estimates that are annually updated using administrative records and
projection techniques. This is done independently
from the collection and processing of CPS data.
See Chapter 2-3, Weighting and Estimation, for
more detail on weighting and population controls.
As a means of controlling nonsampling error
throughout the process of creating population
controls, numerous internal consistency checks
in the programming are performed. For example, input files containing age and sex details are
compared to external counts, and internal redundancy is intentionally built into the system. The
BLS reviews a time series of CNP16+ population
controls for unexpected changes. An important
means of assuring that quality data are used as
input into the CPS population controls is continuous research into improvements in methods of
making population estimates and projections.
The population controls have no sampling error,
but nonsampling error is present, specifically
arising from statistical modeling. The modeled changes over time tend to be smooth, and
although some subnational CNP16+ populations
may fluctuate seasonally, there is no seasonal
component in the models. There is also nonsampling error in the input data for the models. The
inputs are examined and frequently updated in
order to limit nonsampling error. CPS implements
changes in the CNP16+ controls each January,
resulting in level shifts for some series that affect
CPS weighting and estimation.
While the CNP16+ controls contain nonsampling
error, there are many benefits to using them in
CPS weighting. The decennial censuses exercise
extreme care in ensuring good population coverage. Changes over time are relatively smooth,
whereas CPS estimates of the CNP16+ would
appear erratic, primarily due to sampling error, if
not benchmarked to these controls. The models
used to update the population controls have performed well historically.

Month-in-Sample Bias
The CPS is a panel survey with a 4-8-4 rotation
scheme (Chapter 2-2, Sample Design). Each
panel, or rotation group, is included in the survey
4 consecutive months-in-sample (MIS 1–MIS 4), is
rotated out for 8 months, and is rotated back in
for another 4 consecutive months (MIS 5–MIS 8).
Collected data systematically differs across the 8
months in sample, and this is referred to as MIS
bias. Several possible sources contribute to MIS
bias. Each MIS has a different mix of data collection modes, persons listed on housing rosters
(Grieves and Robison, 2013), and different housing-unit nonresponse rates. Although MIS bias
has changed over the decades (Erkens, 2012),
one continuing characteristic is that the estimated
unemployment rate is usually highest in MIS 1.

The usual MIS bias measure used by the CPS is
a simple ratio index. This index is the ratio of the
estimate based on a particular MIS group to the
average estimate from all eight MIS groups combined, multiplied by 100. This calculation treats
the average of all eight MIS groups as an unbiased
estimate. Table 4-1.3 shows 2014 to 2015 2-year
average MIS bias indexes for total employed and
total unemployed.

Table 4-1.3.
Month-in-Sample Bias Indexes for Total Employed and Total Unemployed Average: January 2014–December 2015

                         Month-in-sample
Employment        1     2     3     4     5     6     7     8
Employed . . .   1.02  1.02  1.01  1.01  0.99  0.98  0.98  0.99
Unemployed . .   1.14  1.06  1.04  0.98  0.97  0.95  0.92  0.93

Source: U.S. Census Bureau, Tabulation of 2014–2015 Current Population Survey microdata.

While the MIS bias indexes are close to one for all
months for total employed, for total unemployment they are much more variable. Relatively, the
estimate of unemployed persons is higher in MIS
1 than in other months, and there is a decreasing
trend across the MIS, such that MIS 7 and MIS 8
tend to have the lowest estimates of unemployed
persons.
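A minimal sketch of the ratio index as described (note that the table above prints the index on the ratio scale, without the factor of 100; the input levels below are invented for illustration):

    # Minimal sketch of the MIS bias ratio index: the estimate from one
    # MIS group over the mean of all eight groups, multiplied by 100.
    def mis_index(estimates_by_mis):
        overall = sum(estimates_by_mis) / len(estimates_by_mis)
        return [100 * est / overall for est in estimates_by_mis]

    # Illustrative unemployment levels (thousands) for MIS 1 through 8,
    # highest in MIS 1 as the text describes.
    indexes = mis_index([8800, 8400, 8000, 8000, 7600, 7600, 7600, 8000])
    # -> [110.0, 105.0, 100.0, 100.0, 95.0, 95.0, 95.0, 100.0]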

Composite Estimation
Composite estimates, described in Chapter 2-3,
Weighting and Estimation, systematically differ
from second-stage estimates. Presuming that second-stage estimates are closer to unbiased than
composite estimates, the composite estimation
process introduces a bias into estimates that is
more noticeable for estimates of levels than rates.
This bias is a form of nonsampling error.

In 2014 and 2015, average monthly composite
estimates of total employment fell below second-stage estimates by more than 400,000.
Analogously for unemployment, composite
estimates fell below second-stage estimates by
more than 100,000. Over that same time frame,
average monthly composite estimates of the labor
force participation rate and employment-population ratio were about 0.2 percent lower, while
the composited unemployment rate fell below
second-stage estimates by an average of 0.05
percent.

Seasonal Adjustment
Seasonal adjustment models, described in Chapter
2-5, Seasonal Adjustment, are applied to CPS
composite estimates. Seasonal adjustment models
are generated using X-13ARIMA-SEATS software.
Relating to nonsampling errors, two perspectives
may be considered:
•	In the strict sense of classical finite sampling theory, seasonal adjustment is a linear or near-linear transformation of the survey data, which does not introduce any new nonsampling errors into the estimation process. Any nonsampling error present in the seasonal adjustment process already existed in the survey.
•	Viewed in a broader context, seasonal adjustment may introduce some model specification error and error arising from estimating the unobserved seasonal component, but because it performs smoothing of the estimates it likely reduces total error (inclusive of sampling error). To minimize the potential impact of specification error, seasonally adjusted series are thoroughly reviewed to confirm that the ARIMA models fit sufficiently well.

In either perspective, seasonal adjustment is not
considered to be a major source of nonsampling
error in the CPS, and sampling error tends to be
reduced as a result of the seasonal adjustment
process at the state or local level (Tiller, 1992) and
the national level (Evans et al., 2015).

METHODS TO REDUCE NONSAMPLING ERROR

Interviewer Training and Performance
Monitoring
Comprehensive interviewer training can reduce
nonsampling error. Training to convert reluctant
respondents can improve CPS response rates, and
higher response rates are associated with lower
nonresponse error. In addition, interviewer training
lessens response error by improving the accuracy
of survey data.
Interviewers are trained on concepts, on what
answers are acceptable, on how to interpret
verbal responses, on how to choose the correct
responses from menus, and on how to probe to
obtain sufficient information from a respondent.
Detailed manuals are provided to interviewers.
Monitoring and observing interviewers can also
reduce nonsampling error. Interviews are observed
immediately after interviewers complete initial
training. Additionally, observations are made
as part of general performance review and for
interviewers who are identified as needing additional help. Interviews in the contact centers are
regularly monitored by supervisors with feedback
provided to the interviewer. The emphasis is on
reinforcing good habits and correcting habits that
interfere with the collection of reliable statistics.
Refer to Chapter 3-5, Organization and Training of
the Data Collection Staff, for more information.
Behavior that is examined during observation or
monitoring includes whether interviewers:
•	Ask questions as worded and in the order presented on the questionnaire.
•	Adhere to instructions on the instrument and in the manuals.
•	Probe for answers that are consistent with CPS concepts.
•	Avoid follow-up questions or probes that suggest a particular answer to the respondent.
•	Record answers in the correct manner and in adequate detail.
•	Develop and maintain good rapport with the respondent conducive to obtaining accurate information.
•	Determine the most appropriate time and place for the interview.
•	Are competent in using the laptop or computer.

In addition, continuing efforts are made to
improve interviewer performance. Periodic
refresher training and training on special topics
are given to all interviewers. Quality assurance
procedures (such as the reinterview program
described in Chapter 4-2, Reinterview Design and
Methodology), administrative procedures, and the
collection of performance measures all contribute
to evaluating the performance of interviewers.
When needed, tailored training is assigned to an
interviewer.
Performance measures are gathered on each
interviewer’s monthly assignment, including Type
A nonresponse rate, item nonresponse rates, reasons for nonresponse, number of HUs classified as
Type B and Type C, and length of time taken per
interviewed HU. Many of these measures are gathered and examined by automated methods.
Acceptable ranges are set for some performance
measures. An interviewer may have poor response
rates, too many noninterviews, an unusually high
rate of blank items, or an unusually high or low
minutes-per-interview measure. If so, the interviewer is considered to be in need of additional
training or other remedial action.
FR performance measures are summarized for
the nation and each RO. Each RO can make a
determination about any general remediation that
is needed to improve interviewer performance.
National “don’t know” and refusal rates are monitored for individual questions; a high item nonresponse rate may be attributable to the question
itself and not to poor interviewer performance.
Changes to historical response patterns can be
detected.


Finally, additional monitoring and evaluation is
done by experts in questionnaire design and
interviewing (behavioral scientists). Observation
by behavioral scientists provides a platform for
assessing interviewers’ attitudes toward the job
and is an attempt to evaluate individual interviewers. In conjunction with other techniques, such
as interviewer debriefing, the aim is to improve
overall data collection. Behavioral scientists’ studies can recommend changes to training, instructions, the automated questionnaire, or survey
procedures.

Edit Modules
After transmission to Census Bureau headquarters, there are eight edit modules applied to the
data (Chapter 3-4, Data Preparation):
•	Household.
•	Demographic.
•	Labor force.
•	I&O.
•	Earnings.
•	School enrollment.
•	Disability.
•	Certification.

Each module establishes consistency between
logically related monthly items. Longitudinal
checks are made for consistency with prior
responses. Inappropriate entries are deleted and
become item nonresponses. If the labor force edit
finds that insufficient data has been collected to
determine a person’s labor force status, then data
for that person is dropped.
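A minimal sketch of the blank-and-drop logic just described (the field names and the specific consistency rule are assumptions for illustration):

    # Minimal sketch of an edit module: an inconsistent entry is blanked
    # (becoming item nonresponse), and a person with too little
    # information for a labor force status is dropped. Field names and
    # the consistency rule are illustrative assumptions.
    def edit_labor_force(person):
        hours = person.get("hours_last_week") or 0
        if hours > 0 and person.get("employed") == "no":
            person["hours_last_week"] = None  # deleted entry -> item nonresponse
        if person.get("employed") is None and person.get("looking_for_work") is None:
            return None  # insufficient data: drop the person's record
        return person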

REFERENCES
Dixon, J., “Using Census Match Data to Evaluate
Models of Survey Nonresponse,” Proceedings
of the Section on Survey Research Methods,
American Statistical Association, 2004, pp.
3421−3428.
Dixon, J., “Using Contact History Information
to Adjust for Nonresponse in the Current
Population Survey,” Proceedings of the Section
on Government Statistics, American Statistical
Association, 2012, pp. 3421−3428.


Erkens, G., “Changes in Panel Bias in the
U.S. Current Population Survey and its Effects
on Labor Force Estimates,” Proceedings of the
Section on Survey Research Methods, American
Statistical Association, 2012, pp. 4220−4232.
Evans, T., J. McIllece, and S. Miller, “Variance
Estimation by Replication for National CPS
Seasonally Adjusted Series,” Proceedings of the
Section on Business and Economic Statistics,
American Statistical Association, 2015, pp.
884–895.
Grieves, C., and E. Robison, “Panel Analysis of
Household Nonresponse and Person Coverage
in the Current Population Survey,” Proceedings
of the Section on Survey Research Methods,
American Statistical Association, 2013, pp.
1339–1349.
Kojetin, B. A., and P. Mullin, “The Quality of Proxy
Reports on the Current Population Survey (CPS),”
Proceedings of the Section on Survey Research
Methods, American Statistical Association, 1995,
pp. 1110−1115.
Tanur, J. M., “Conceptualizations of Job Search:
Further Evidence from Verbatim Responses,”
Proceedings of the Section on Survey Research
Methods, American Statistical Association, 1994,
pp. 512−516.
Thompson, T., “Mode Effects Analysis of Labor
Force Estimates,” Current Population Survey
Overlap Analysis Team Technical Report 3,
U.S. Census Bureau, April 14, 1994.
Tiller, R. B., “Time Series Modeling of Sample
Survey Data from the U.S. Current Population
Survey,” Journal of Official Statistics, Statistics
Sweden, 1992, pp. 149–166.
Tucker, C., and B. A. Harris-Kojetin, “The Impact
of Nonresponse on the Unemployment Rate in
the Current Population Survey,” paper presented
at the 8th International Workshop on Household
Survey Nonresponse, 1997, Mannheim, Germany.

FURTHER READING
Groves, R. M., Survey Errors and Survey Costs,
John Wiley & Sons, 1989.


Chapter 4-2: Reinterview Design and Methodology
INTRODUCTION
The CPS Reinterview Program, which has been
in place since 1954, conducts a second interview on about 3 percent of sample households
each month. This quality control (QC) reinterview program provides a measure of control or
feedback about the quality of data received from
the respondent and the original interviewer. The
QC reinterview is used to deter data falsification
and to monitor the FRs' adherence to established
procedures.

REINTERVIEW DATA SELECTION AND
COLLECTION
The sample for the QC reinterview is a subsample
of the CPS. For this subsample, HUs are selected
by a specific method and specific field procedures
are followed. To minimize respondent burden, a
subsampled household is reinterviewed only once.
Sample selection for reinterview is performed
immediately after the monthly interview assignments to FRs. This sample does not include the
contact center’s Computer-Assisted Telephone
Interviewing (CATI) interviewers because the contact centers have their own QC program in place.
With the CATI monitoring program, interviews are
recorded (with the permission of the respondent)
and reviewed by coaches and monitoring staff.
These staff members review and evaluate the
interviewers’ performance and provide feedback
to the interviewers.
The QC reinterview is used to measure some
nonsampling errors (as discussed in Chapter
4-1, Nonsampling Error) attributable to FRs. The
QC reinterview is used to make decisions as to
whether the FRs adhered to procedures and to
check for any possible falsification. QC reinterview
checks if the FR:
•	Falsified responses.
•	Did not ask questions as worded.
•	Did not ask all questions.
•	Provided auxiliary information that could influence or lead a respondent to answer differently.
•	Did not use the proper equipment to capture data.
•	Did not maintain the standards established by the Census Bureau and BLS.

Sampling Methodology
Previous research has indicated that inexperienced
FRs (less than 5 years of work with the Census
Bureau) are more likely to have significant procedural errors than experienced FRs, so more of
them are selected for QC reinterview each month
(U.S. Census Bureau, 1993 and 1997). Research
has also indicated that inexperienced FRs falsify
a greater percentage of households than experienced FRs, so fewer of their cases (households)
are needed to detect falsification. The QC reinterview sampling selection system is set up so that a
selected FR is in reinterview at least once but no
more than four times within a 15-month cycle.
The QC reinterview sample is selected in two
stages (a simplified sketch follows the lists):

First Stage (assignment of FRs for QC reinterview)
•	The QC reinterview uses a 15-month cycle. FRs are randomly assigned to one of the 15 different groups. Throughout the cycle, new FRs are assigned to the groups.
•	The system is set up so that an FR is in reinterview at least once but no more than four times within the 15-month cycle.

Second Stage (subsample of households for QC reinterview)
•	Each month, one of the 15 groups of interviewers is included for QC reinterview.
•	A subsample of households is selected from each FR's interview assignment.
•	The number of households subsampled per FR assignment is based on the length of tenure of the FR (fewer for inexperienced FRs since fewer cases are needed to detect falsification).
•	Typically, 3 percent of CPS households are assigned for QC reinterview each month.
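A simplified sketch of the two-stage selection (the group assignment, tenure rule, and subsample sizes here are invented for illustration, not the production rules):

    # Simplified sketch of two-stage QC reinterview selection; the
    # tenure rule and subsample sizes are illustrative assumptions.
    import random

    rng = random.Random(0)

    def stage_one(frs, n_groups=15):
        """Stage 1: randomly assign each FR to one of 15 groups."""
        return {fr["name"]: rng.randrange(n_groups) for fr in frs}

    def stage_two(frs, groups, month):
        """Stage 2: subsample households for FRs in this month's group."""
        sample = {}
        for fr in frs:
            if groups[fr["name"]] != month % 15:
                continue
            # Fewer cases are needed to detect falsification among
            # inexperienced FRs (tenure under 5 years).
            n = 2 if fr["tenure_years"] < 5 else 4
            sample[fr["name"]] = rng.sample(fr["assignment"],
                                            min(n, len(fr["assignment"])))
        return sample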

In addition to the two-stage QC reinterview sample, any FR with an original interview assignment
can be put into supplemental reinterview. These
are cases available to the ROs to add to reinterview at their discretion. The FR can be flagged for
supplemental reinterview for the next assignment
period or the FR’s inactive cases can be activated
during the current assignment period. Inactive
cases are those cases that were not selected for
the prior two-stage or the supplemental reinterview. FRs flagged for review based on various
data quality indicators may be put into supplemental reinterview.

All households are eligible for QC reinterview,
except noninterviews for occupied households.
Eligible households include completed and partial
interviews as well as noninterviews due to the HU
being unoccupied, such as vacant units. In addition, FRs are evaluated on their rate of noninterviews for occupied households, with penalties for
a high noninterview rate. Therefore, FRs have no
incentive to misclassify a case as a noninterview
for an occupied household. See Figure 4-2.1 for an
overview of determining QC reinterview household eligibility and data collection methods.

Data Collection Methods

Contact centers attempt all QC reinterviews with
a valid telephone number, and if not completed,
the cases are then transferred to the ROs for
assignment to FRs. Cases without a valid telephone number are sent directly to the ROs for
assignment to FRs. These FRs must not fall under
the same supervisory chain as the original FR who
conducted the interview. This minimizes bias when
reviewing the work of the FR. Most reinterviews
are conducted by telephone, but in some cases by
personal visit. They are conducted on a flow basis
extending through the week following the interview week.

For QC reinterview, the reinterviewers are
instructed to try to reach the original household
respondent, but are allowed to conduct the reinterview with a proxy, another eligible household
respondent.

The QC reinterview asks questions to determine
if the FR conducted the original interview and
followed interviewing procedures. The labor
force questions are not asked. The QC reinterview instrument allows reinterviewers to indicate
whether any noninterview misclassifications
occurred or any falsification is suspected.

Figure 4-2.1.
Quality Control Reinterview Sampling Diagram
[Flowchart: production interviews transmitted from the contact centers and the regional offices are screened for QC reinterview selection and eligibility. Ineligible or unselected cases are not reinterviewed. Selected cases with a valid phone number are sent to the contact centers; cases without a valid phone number, and contact center cases that are not completed, are sent to the regional offices.]

Quality Control Reinterview Evaluation

Cases are assigned outcome codes and disposition codes. These codes are based upon the judgment of the reinterviewer and the response of the
reinterview respondent. The outcome codes are
used as an indicator of the reinterview result (e.g.,
confirming correct or incorrect information, the
type of noninterview, or any discrepancies found).
The reinterview outcome codes are similar to the
ones used for the original interviews (see Table
3-2.1 of Chapter 3-2, Conducting the Interviews).
Disposition codes are used as an indicator for
cases suspected of falsification, which triggers
a falsification investigation. The most common
cause for a disposition of suspected falsification
is a finding that FR training procedures were not
followed. A data falsification investigation is triggered by the case being flagged as having at least
one of three discrepancy types: household was
not contacted; the interviewer classified the unit
as a Type B or C noninterview when it should have
been an interview or Type A; or the respondent
said the interviewer did not use a laptop during a
personal visit interview. Any of these discrepancies would be considered a major departure from
FR training procedures.
Reinterview outcome and disposition codes are
summarized in a manner similar to those for the
original interview (refer to Table 3-2.8 of Chapter
3-2, Conducting the Interviews). Examples of reinterview results are shown in Table 4-2.1 and Table
4-2.2.

Adherence to Training Procedures
Reinterview serves as a quality check on FRs'
adherence to training procedures. Most of
the suspected cases stem from a deviation from
training procedures. Findings such as those below
typically result from these deviations:
•	 According to the respondent, the FR did not
contact the household.
•	 The FR was not polite.
•	 The FR did not use a laptop during an in-person
interview.
•	 According to the respondent, there were errors
made on the household roster.

Falsification
Reinterview can help provide measures of “curbstoning” or survey data falsification (Kennickell,
2015) and serves as a deterrent.
Falsification can be defined as a manipulation
of the data collected. It can be intentional or

unintentional. Examples of unintentional falsification can range from a simple typo to a household
member omission or deviations from procedure.
Examples of intentional falsification can range
from making up one or more responses to having
fictional interviews. FRs suspected of falsification
may undergo an investigation; results of these
investigations range from an oral or written
warning to recommendations for additional training
or termination.
Periodic reports on the QC reinterview program
are issued. These reports identify: (1) a preliminary count of cases and FRs suspected of falsification, (2) the results of the investigation with
the confirmed falsification counts for cases and
FRs, and (3) unweighted (see formula below) and
weighted falsification rates. (See Appendix A,
Weighting Scheme for Quality Control Reinterview
Estimates).
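In general terms, an unweighted rate of this kind can be sketched as the share of reinterviewed cases with confirmed falsification; the form below is an assumption for illustration, not necessarily the exact formula used by the program:

    \text{unweighted falsification rate} =
    \frac{\text{cases with confirmed falsification}}{\text{total cases reinterviewed}} \times 100

The weighted rate would replace these counts with weighted totals, per the weighting scheme in Appendix A.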

Since FRs are made aware of the QC reinterview
program and are informed of the results following reinterview, falsification has not been a major
problem in the CPS. Only about 0.5 percent of
CPS FRs are found to have falsified data. FRs
found to have falsified data typically resign or
have their employment terminated.

REFERENCES
Kennickell, A., “Curbstoning and culture,”
Statistical Journal of the International Association
for Official Statistics (IAOS), 2015, (31), 237–240.
U.S. Census Bureau, “Falsification by Field
Representatives 1982−1992,” memorandum from
Preston Jay Waite to Paula Schneider, May 10,
1993.
U.S. Census Bureau, “Falsification Study Results
for 1990−1997,” memorandum from Preston Jay
Waite to Richard L. Bitzer, May 8, 1997.


Table 4-2.1.
Reinterview Results for November 2016—Counts

Description                                                      November 2016

ORIGINAL INTERVIEW CASES SELECTED FOR RANDOM REINTERVIEW . . . . . . .   1,897

ORIGINAL INTERVIEW CASES ELIGIBLE FOR RANDOM REINTERVIEW . . . . . . .   1,629
 Complete or sufficient partial reinterviews . . . . . . . . . . . . .   1,217
 Reinterview noninterviews . . . . . . . . . . . . . . . . . . . . . .     412
  Noninterviews. . . . . . . . . . . . . . . . . . . . . . . . . . . .     367
  Headquarters discretion. . . . . . . . . . . . . . . . . . . . . . .      45
  Regional office discretion . . . . . . . . . . . . . . . . . . . . .       0

NONINTERVIEWS EXCLUDING HEADQUARTERS AND REGIONAL OFFICE DISCRETION CASES
 Type A. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .     361
  Language problem . . . . . . . . . . . . . . . . . . . . . . . . . .       1
  Unable to locate . . . . . . . . . . . . . . . . . . . . . . . . . .       6
  Unable to complete, bad telephone number . . . . . . . . . . . . . .      73
  Insufficient partial . . . . . . . . . . . . . . . . . . . . . . . .       1
  No one home. . . . . . . . . . . . . . . . . . . . . . . . . . . . .     101
  Temporarily absent . . . . . . . . . . . . . . . . . . . . . . . . .       4
  Respondent can't remember or refused . . . . . . . . . . . . . . . .      55
  Other Type A . . . . . . . . . . . . . . . . . . . . . . . . . . . .     120

 Type B. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .       2
  Entire household under or over age limit . . . . . . . . . . . . . .       0
  Temporarily occupied by persons with URE . . . . . . . . . . . . . .       1
  Vacant, regular. . . . . . . . . . . . . . . . . . . . . . . . . . .       1
  Vacant, storage of household furniture . . . . . . . . . . . . . . .       0
  Converted to temporary business or storage . . . . . . . . . . . . .       0
  Unoccupied mobile home, trailer, or tent site. . . . . . . . . . . .       0
  Household institutionalized or temporarily ineligible. . . . . . . .       0
  Unfit, to be demolished. . . . . . . . . . . . . . . . . . . . . . .       0
  Other Type B . . . . . . . . . . . . . . . . . . . . . . . . . . . .       0

 Type C. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .       2
  Demolished . . . . . . . . . . . . . . . . . . . . . . . . . . . . .       0
  House or trailer moved . . . . . . . . . . . . . . . . . . . . . . .       0
  Converted to permanent business or storage . . . . . . . . . . . . .       0
  Condemned. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .       0
  Moved out of country . . . . . . . . . . . . . . . . . . . . . . . .       0
  Deceased . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .       1
  Other Type C . . . . . . . . . . . . . . . . . . . . . . . . . . . .       1

 Type D. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .       2
  Household replaced by new household. . . . . . . . . . . . . . . . .       2

Source: U.S. Census Bureau, Demographic Statistical Methods Division, November 2016 Current Population Survey Reinterview.


Table 4-2.2.
Reinterview Results for November 2016—Statistics

Description                                                      November 2016

Number of interviewers in reinterview. . . . . . . . . . . . .             482
Number of reinterview cases conducted. . . . . . . . . . . . .           1,629
Reinterview completion rate. . . . . . . . . . . . . . . . . .  74.7% (1,217/1,629)
 Telephone reinterview rate. . . . . . . . . . . . . . . . . .  91.3% (525/575)
 Proxy reinterview rate. . . . . . . . . . . . . . . . . . . .  4.6% (75/1,629)
 Household accuracy rate . . . . . . . . . . . . . . . . . . .  97.3% (934/960)
 Households with missing members (adds). . . . . . . . . . . .              15
 Households with additional members (drops). . . . . . . . . .               9
 Households with both adds and drops . . . . . . . . . . . . .               2
 Original noninterview misclassification rate. . . . . . . . .  5.5% (19/347)
Total discrepancies. . . . . . . . . . . . . . . . . . . . . .             225
 Household not contacted . . . . . . . . . . . . . . . . . . .              26
 Status incorrect. . . . . . . . . . . . . . . . . . . . . . .              50
 Roster incorrect. . . . . . . . . . . . . . . . . . . . . . .              26
 Not all questions asked . . . . . . . . . . . . . . . . . . .              41
Laptop usage rate. . . . . . . . . . . . . . . . . . . . . . .  98.0% (436/445)
No laptop used . . . . . . . . . . . . . . . . . . . . . . . .               9
 Bad phone number collected. . . . . . . . . . . . . . . . . .              73
Suspected falsification rate by reinterview cases
 conducted . . . . . . . . . . . . . . . . . . . . . . . . . .  4.9% (80/1,629)
Suspected falsification rate by interviewer. . . . . . . . . .  13.9% (67/482)
Number of interviewers with falsification confirmed. . . . . .               1

Source: U.S. Census Bureau, Demographic Statistical Methods Division, November 2016 Current Population Survey Reinterview.


Appendix: Weighting Scheme for Quality Control
Reinterview Estimates
To produce reinterview weights, data are stratified by the reinterview period and the experience level of
the FRs. For the CPS, FRs are considered experienced after 5 years of work at the Census Bureau. Each
month of the year is an interview period for the CPS. The data related to the quality of the production
survey (falsification, household coverage/roster accuracy, politeness, noninterview misclassification, discrepancies) are weighted for the QC reinterview reports, provided there are sufficient experience-level
data available for the CPS FRs. The data related to the quality of the reinterview survey (reinterview
completion rate, lag time average, reinterview telephone rate) are not weighted.
At the beginning of each 15-month CPS cycle, CPS FRs are randomly assigned to one of 15 groups. New
CPS FRs are added to the groups throughout the cycle; generally, FRs are not regrouped during the
cycle. A sample for random reinterview is selected each month. The sample size depends on the
experience level of the FR originally assigned to the case. Each case preselected for reinterview remains
in the reinterview sample even if the case is reassigned to another FR in the same RO.
The reinterview weight is the reciprocal of the probability of selection into the reinterview sample. The
probability of selection in an interview period depends upon:
•	 The numbers of experienced and inexperienced FRs selected for random reinterview for the interview period.
•	 The total numbers of experienced and inexperienced FRs for the interview period.
•	 The number of cases preselected for random reinterview in the interview period for each originally
assigned FR.
•	 The originally assigned FR's assignment size in the interview period.

To determine the reinterview weight for cases originally assigned to a specific FR during an interview
period (a worked sketch follows the steps):
1.	 Assess the FR's experience level, j, for that interview period (month).
2.	 Compute P1j as follows:

	 P1j = (# of cases preselected for random reinterview for that FR for the interview period) ÷
	       (# of production cases originally assigned to that FR for the interview period)

3.	 Compute P2j as follows:

	 P2j = (# of FRs of experience level j preselected for reinterview for the interview period) ÷
	       (# of FRs of experience level j in the interview period)

4.	 The probability of selection of production cases originally assigned to that FR into reinterview for
that interview period is Pj = P1j × P2j.
5.	 The reinterview weight for all random reinterview cases originally assigned to that CPS FR for that
interview period is wj = 1/Pj.
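
As a worked illustration of steps 1 through 5, the following minimal sketch (not Census Bureau
production code; all names are hypothetical) computes the weight for one FR in one interview period:

    # Minimal sketch of the QC reinterview weight for one FR in one
    # interview period. All names are hypothetical.

    def reinterview_weight(cases_preselected_for_fr, cases_assigned_to_fr,
                           frs_level_j_preselected, frs_level_j_total):
        """Return wj = 1/(P1j * P2j) for one FR of experience level j."""
        p1j = cases_preselected_for_fr / cases_assigned_to_fr   # step 2
        p2j = frs_level_j_preselected / frs_level_j_total       # step 3
        pj = p1j * p2j        # step 4: probability of selection
        return 1.0 / pj       # step 5: weight is the reciprocal

    # Example: 2 of an experienced FR's 40 production cases are
    # preselected (P1j = 0.05), and 30 of 150 experienced FRs are
    # preselected that month (P2j = 0.20), so Pj = 0.01, weight = 100.
    print(reinterview_weight(2, 40, 30, 150))  # 100.0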


Acronyms
ACS	American Community Survey
ADS	Annual Demographic Supplement
AHS	American Housing Survey
ARD	Assistant Regional Director
ARIMA	Auto-Regressive Integrated Moving Average
ASEC	Annual Social and Economic Supplement
ATUS	American Time Use Survey
BLS	Bureau of Labor Statistics
CAI	Computer-Assisted Interviewing
CAPI	Computer-Assisted Personal Interviewing
CARMN	Cost and Response Management Network
CATI	Computer-Assisted Telephone Interviewing
CAUS	Community Address Updating System
CBSA	Core-Based Statistical Area
CCM	Civilian U.S. Citizen's Migration
CCO	CATI and CAPI overlap
CE	Consumer Expenditure
CES	Current Employment Statistics
CHIP	Children's Health Insurance Program
CI	Coverage Improvement
CNP	Civilian Noninstitutional Population
CNP16+	Civilian Noninstitutional Population age 16 and over
CNCS	Corporation for National and Community Service
CPS	Current Population Survey
CV	Coefficient of Variation
DAAL	Demographic Area Address Listings
DSD	Demographic Systems Division
DSF	Delivery Sequence File
DSMD	Demographic Statistical Methods Division
EDS	Exclude from Delivery Statistics
EDMAFX	Edited MAF Extract
FNS	Food and Nutrition Service
FR	Field Representative
FS	Field Supervisor
GEMEnA	Group on Expanded Measures of Enrollment and Attainment
GEO	Geography Division
GEOP	Geographic Programs Operation
GDP	Geographic Data Processing
GIS	Geographic Information System
GPS	Global Positioning System
GQ	Group Quarters
GRF	Geographic Reference File
GVF	Generalized Variance Function
HU	Housing Unit
HVS	Housing Vacancy Survey
I&O	Industry and Occupation
IDS	Include in Delivery Statistics
LACS	Locatable Address Conversion System
LAUS	Local Area Unemployment Statistics
LUCA	Local Update of Census Addresses
MAF	Master Address File
MAFCS	MAF Coverage Study
MAFID	MAF Identifier
MAFOP	MAF Operations
MCS	Master Control System
MIS	Month-In-Sample
MIS1	Month-In-Sample 1
MLR	Major Labor Force Recode
MOS	Measure of Size
MSA	Metropolitan Statistical Area
N1, N2, N3	Initial Observations 1, 2, and 3
NAICS	North American Industry Classification System
NCEUS	National Commission on Employment and Unemployment Statistics
NCS	National Comorbidity Survey
NCVS	National Crime Victimization Survey
NICL	Noninterview Clusters
NILF	Not in Labor Force
NIM	NICE Interaction Management
NPC	National Processing Center
NRAF	Net Recruits to the Armed Forces
NSR	Non-Self-Representing
OMB	Office of Management and Budget
PIP	Performance Improvement Plan
PSU	Primary Sampling Unit
PWBA	Pension and Welfare Benefits Administration
QC	Quality Control
RD	Regional Director
RDD	Random-Digit Dialing
REGARIMA	Combined regression and ARIMA time series model
RO	Regional Office
ROSCO	Regional Office Survey Control System
RSE	Relative Standard Error
RSM-E	Regional Survey Managers - Expert
RSM-G	Regional Survey Managers - Geography
SCHIP	State Children's Health Insurance Program
SEATS	Signal Extraction in ARIMA Time Series
SECU	Standard Error Computation Unit
SIPP	Survey of Income and Program Participation
SN	Special Needs
SOC	Standard Occupational Classification
SR	Self-Representing
SRS	Simple Random Sample
SS	Second-Stage
Sups	Supervisory Statistical Assistants
TIGER	Topologically Integrated Geographic Encoding and Referencing System
TOI	Time of Interview
UE	Unemployed
UFUF	Unit Frame Universe File
URE	Usual Residence Elsewhere
USPS	United States Postal Service
USU	Ultimate Sampling Unit
VPN	Virtual Private Network
WPA	Work Projects Administration

Index
4-8-4 rotation 3, 29, 76, 150

A
Address sources 43, 45–46
American Time Use Survey (ATUS) 17
Annual Demographic Supplement 19, 35
Annual Social and Economic Supplement (ASEC) 18–25
Area frame 43, 47
Armed Forces
  adjustment 24
  members 19, 21, 24
Auto-Regressive Integrated Moving Average (ARIMA) 32, 36, 38, 94–96, 150–151

B
Base (basic) weight 17, 18, 21–24, 65, 67–69, 71, 73, 76, 78, 84, 90, 146
Behavior coding 106
Births
  births and deaths 29, 67
Building permits 43, 47, 66
Business presence 107

C
CATI recycled 130
Census Bureau Report Series 27
Citizenship 35, 125
Civilian noninstitutional population (CNP) 7, 10, 11, 32, 33, 35–38, 52, 56–60, 67, 68, 70, 74, 77, 85, 144, 145, 150
Civilian noninstitutional housing 59
Class-of-worker 8, 9, 109
Coefficient of variation (CV) 52, 55, 57, 58
Collapsing 69, 73, 84
Composite estimation 30, 35, 53, 70, 71, 76–78, 151
Composite weight 35, 36, 67, 77–81, 92, 150
Computer-Assisted Personal Interviewing (CAPI) 5, 103
Computer-Assisted Telephone Interviewing (CATI) 5, 103
CATI and CAPI Overlap (CCO) 34


Core-Based Statistical Area (CBSA)
  definition 54
  criteria to determine SR PSUs 55
Counties 27, 29, 47, 53–55, 80
Coverage errors 74, 82, 143–145
Coverage Improvement (CI) Frame 47, 50
Coverage ratio 79, 145, 146
Current Employment Statistics 3, 105
Current Population Survey (CPS)
  background and history 3, 4
  concepts 5–11
  data products 26–28
  key questions 11, 12
  sample frame 43–50
  sample rotation 61–63
  supplements 13–25
  variance estimation 82–92
  weighting and estimation 67–80

D
Data collection staff 135–140
Data preparation 132–134	
Data quality 13–14, 113, 154–155, 159
Demographic Area Address Listings (DAAL) 47–48
Dependent interviewing 34, 37, 103, 108, 111,
125, 132
Design effects 86, 91, 92
Disabled 111, 112
Discouraged workers 11, 34, 104, 105, 112
Dormitories 59

E
Earnings
  earnings edits 134
  instrument design 9, 103
  revision 104, 109
  weighting 67, 79
Edits, demographic 124, 134
Educational attainment 6
Employment status 7, 8
Employment-population ratio 11, 34, 38, 151
Enumerative Check Census 4
Estimators 77, 80, 85, 87, 93
Expected value 52, 57, 85


F
Families 6
Family business 11, 107
Family weights 67, 79
Field representatives (FRs) (see also Interviewers)
  conducting interviews 3, 116, 136
  evaluating performance 137, 138
  training 135, 136
  transmitting interview results 129, 130, 131
Field subsampling 68
Final hit number 61, 65
Final weight 18, 78, 79
First-stage ratio adjustment 18
First-stage weighting 18, 67–68, 70, 71
Full-time workers, definition 8

G
Generalized variance functions (GVF) 85–88, 92, 93

H
Hadamard matrix 84
Hit string 61, 62, 65
Home Ownership Rates 17
Homeowner Vacancy Rate 17
Horvitz-Thompson estimators 76
Hot deck 21, 133, 134
Hours of work 8
Household
  definition 5–6
Housing Unit (HU) Frame 45, 46, 48–51, 60
Housing unit (HU) 5, 143
Housing Vacancy Survey (HVS) 17–18

I
Imputation method 133, 148
Independent population controls 24, 32, 67, 71, 73, 74, 77, 78, 80
Industry and occupation data (I & O)
  coding of 130–132, 148–150
  edits and codes 133, 134
  questionnaire revision 34
Interview week 116, 128, 132, 155
Interviewers (see also Field representatives) 116, 125, 132, 135–136, 138–140, 144–149, 152–155
Item nonresponse 68, 133, 146–148, 152, 153
Iterative proportional fitting 75

J
Job leavers 10
Job losers 10
Job seekers 10

L
Labor force
  edits and recodes 134
  information 7–11, 105
Labor force participation rate, definition 11
LABSTAT 26
Layoff 10–12, 109–111
Levitan Commission (see also National Commission on Employment and Unemployment Statistics) 32–34, 104–105, 112, 114
Listing operation 44–47
Longitudinal edits 133–134

M
Maintenance reductions 64, 65, 83
Major Labor Force Recode 134
Marital status categories 6
Master Address File (MAF) 43–51, 143
Measure of size (MOS) 55, 60
Measurement error 112, 143, 148, 149
Metropolitan Statistical Areas (MSAs) 54, 61, 65
Microdata files 26, 28, 38
Month-in-sample (MIS) 18, 62

N
National Commission on Employment and Unemployment Statistics (see also Levitan Commission) 32, 114
National Processing Center (NPC) 129–132, 135
New entrants 10
Noninstitutional GQ 50, 60
Noninterviews 116, 118–119
Nonresponse adjustment 67–69, 89, 90
Nonresponse clusters 69
Nonresponse error 143, 145–147
Nonresponse rate 67, 136, 146–148, 152
Nonsampling error 82, 143, 145, 146, 148, 150–152, 154
Non-self-representing primary sampling units (NSR PSUs) 18, 55, 57
North American Industry Classification System (NAICS) 9


O
Office of Management and Budget (OMB) 14
Old construction 43, 61
Outgoing rotation weight 67, 79
Overlap of the sample 64

P
Part-time status 8
Performance Improvement Plan (PIP) 138
Permit Address List 59
Permit frame 43, 59
Phase-in of a new design 64
Population control adjustments 36
Primary Sampling Unit (PSU) 18, 52–61, 64–66, 69–71, 82–85, 88–89, 133, 135–137
Probability sample 52, 58, 67
Processing error 143, 144
Professional Certifications 6, 38
Proxy reporting 149–150
Publications 26–27
Public Use Microdata 26, 28, 38

Q
Quality control (QC)
  reinterview program 30, 154, 156
Quality measures of statistical process 95, 145
Questionnaires (see also interviews)
  1994 redesign 104–112
  household and demographic information 5–7
  labor force information 7–11
  recent improvements 112–113
  reinterview program 154–158
  structure of the survey 5

R
Race
  determination of 32
  racial categories 6
Raking 75–77, 81
Random digit dialing sample (CATI/RDD) 34, 105
Random group 61
Random start 61
Ratio adjustment 70–71, 73, 82
Reduction group 64–65
Reduction plan 64
Reentrants 10
Reference person 5–6, 118, 122, 124, 126
Reference week 3, 7–12, 107, 116
Referral code 150
Refusal rate 146–147, 152
REGARIMA 95–97
Regional offices (ROs)
  organization and training of data collection staff 131, 135
  operations of 135–138, 144, 147, 154
  transmitting interview results 129, 132
Reinterview Program 30, 154–156
Relational imputation 133, 134
Relationship categories 6, 113
Relative variance 57, 86
Rental vacancy rate 17
Replicate factors 83–84
Replication methods 82–83, 88
Respondent Identification Policy 37
Response distribution 106
Rotation chart 62–63
Rotation system 29, 53, 64

S
Sample characteristics 52, 60–62
Sample design
  annual sampling 43, 49, 68
  definition of the primary sampling units (PSUs) 52–54
  field subsampling 68
  permit frame 43, 59
  sampling frames 43–50, 58–60, 144
  sampling procedure 58, 60
  sampling sources 58–59
  selection of the sample primary sampling units (PSUs) 53–57
  stratification of the PSUs 53–57, 61
  within PSU sampling 58, 60, 83
Sample preparation
  MAF updates 45–49
Sample redesign 38, 43, 48, 50, 57, 66
Sample size
  determination of 52
  maintenance reduction 64–65, 83
Sampling
  error 52–53, 82–83, 143, 150–151
  frame 43–50, 58–60, 144
  interval 57–58, 60–61, 65, 85–86, 143
School enrollment
  edits and codes 134
  general 16, 27, 33
Seasonal adjustment of employment 94, 97, 99
Second jobs 9, 67
Second-stage weighting 67–68, 70–71, 74–78
Security
  transmitting interview results 129
Self-employed definition 9
Self-representing primary sampling units (SR PSUs) 54–57, 70, 82, 84, 88–89
Self-weighting 54, 60, 64
Simple weighted estimator 76
Skeleton frame 59–60, 65
Standard Error Computation Units (SECUs) 82–83, 85
Standard Occupational Classification (SOC) 9
State sampling interval 55, 57–58, 60, 66
State supplementary samples 32
State-based design 53, 62
State
  coverage step 68, 70, 73
Statistical Policy Division 14, 30
Subfamilies 6
Subsampling 60, 68, 82
Successive difference replication 83–86
Supplemental inquiries
  Annual Social and Economic Supplement (ASEC) 15, 18–25
  criteria for 13
Survey week 5, 11, 30
Systematic sampling 83–84, 93

T
The Employment Situation 3, 26
Time series modeling 81, 95–96, 153
Topologically Integrated Geographic Encoding
and Referencing (TIGER) System 44
Type A nonresponse 146–147, 152
Type B noninterviews 68, 116–119, 127, 137,
144–147, 152, 157
Type C noninterviews 68, 116–119, 127, 144–147,
152, 157

U
Ultimate sampling units (USUs) 53, 58, 61, 64–65, 83–84
Unable to work 111–112
Unemployment
  classification of 9, 134
  definition 9
  duration of 10
  early estimates of 4
  questionnaire information 104–105
  reason for 10
  seasonal adjustment 94, 99
  variance on estimate of 89–91
Union membership 26, 33, 103
Unit frame 43–49, 59
Unit nonresponse (see also nonresponse) 68, 132–133, 146–147, 151
Usual residence elsewhere (URE) 116, 118, 122

V
Vacancy rate 15, 17
Variance estimation 82–92
Veterans, data for females 33–34
Veterans' weights 67, 79–80

W
Weighting Procedure
  Annual Social and Economic Supplement (ASEC) 21
  Current Population Survey (CPS) 67–80
  Housing Vacancy Survey (HVS) 17–18
Work Projects Administration (WPA) 4

X
X-11, X-12, and X-13 ARIMA programs 32, 94–95, 98
