
SUPPORTING STATEMENT B

U.S. Department of Commerce

U.S. Census Bureau

Census Household Panel

OMB Control No. 0607-XXXX


B. Collections of Information Employing Statistical Methods

  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

The Census Household Panel (CHP) is sampled from the Census Bureau’s gold-standard Master Address File (MAF), which contains an accurate, up-to-date inventory of all known living quarters in the United States, Puerto Rico, and the associated island areas. The MAF is used to support most of the censuses and surveys that the Census Bureau conducts, including the decennial census, the American Community Survey, and the ongoing demographic surveys. The content of the MAF includes address information, Census geographic location codes, and source and history data. The MAF can be linked to administrative records and other existing Census Bureau frames, such as the Demographic Frame (a comprehensive, person-level frame consisting of demographic, social, and economic characteristics), all securely maintained and curated by the Census Bureau, to provide additional information that ensures representativeness and enhances the informative power of the CHP. Grounding the CHP in the Census Bureau’s Title 13 infrastructure allows the Census Bureau and partner agencies to leverage administrative records and other non-survey data, in combination with data from the CHP, to create a platform for a high-quality integrated data program in alignment with best practices and high-priority areas for innovation.

The targeted initial invitation sample size from the MAF will be 75,000 housing units.


  2. Describe the procedures for the collection of information including:

    • Statistical methodology for stratification and sample selection,

    • Estimation procedure,

    • Degree of accuracy needed for the purpose described in the justification,

    • Unusual problems requiring specialized sampling procedures, and

    • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The sample design is a stratified systematic sample of all eligible housing units (HUs) from the Census Bureau’s Master Address File (MAF), which covers all 50 states and the District of Columbia. Auxiliary data from the Demographic Frame (DF) and the Planning Database (PDB) will be linked to the MAF to stratify the housing units into strata based on demographic variables within the four Census Bureau regions. Each region will have eleven strata: six strata based on information from the DF, four strata based on information from the PDB, and one stratum for households for which no information exists on either frame.

Information from the DF will be used to stratify households into Hispanic-origin/race strata: three race strata (Black Alone, White Alone, and Other) within each Hispanic-origin status (Hispanic/Non-Hispanic) form the six DF strata. Households with no information on the DF will be evaluated at the block-group level and stratified into four strata based on block-group information from the PDB. Because the PDB provides race information only for the Non-Hispanic population, these housing units will be stratified into a Hispanic stratum or one of three Non-Hispanic race strata; the four PDB strata are Hispanic, Non-Hispanic Black Alone, Non-Hispanic White Alone, and Non-Hispanic Other. The eleventh stratum consists of MAF records that cannot be stratified using either the DF or the PDB. The sample will be distributed proportionately to each stratum within the Census regions based on the number of housing units in the stratum. We will conduct a subsampling operation in strata that, based on results from other demographic surveys, have higher response rates; as a result, the strata where no subsampling occurs will be oversampled.
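As an illustration of the stratification rules above, the following sketch assigns a housing unit to one of the eleven per-region strata. The field names (df_hispanic, df_race, pdb_block_group_class) are hypothetical placeholders, not actual MAF, DF, or PDB variable names, and the logic is a simplified restatement of the text.

```python
# Hypothetical sketch of stratum assignment within a Census region.
# Field names are illustrative placeholders, not actual DF/PDB variables.

def assign_stratum(unit: dict) -> str:
    """Return one of the eleven per-region strata for a housing unit."""
    # Strata 1-6: DF-based strata (Hispanic status by three race groups).
    if unit.get("df_hispanic") is not None and unit.get("df_race") is not None:
        hispanic = "Hispanic" if unit["df_hispanic"] else "Non-Hispanic"
        race = unit["df_race"] if unit["df_race"] in ("Black Alone", "White Alone") else "Other"
        return f"DF: {hispanic}, {race}"
    # Strata 7-10: PDB block-group strata; race is available only for
    # the Non-Hispanic population on the PDB.
    if unit.get("pdb_block_group_class") is not None:
        # One of: "Hispanic", "Non-Hispanic Black Alone",
        # "Non-Hispanic White Alone", "Non-Hispanic Other"
        return f"PDB: {unit['pdb_block_group_class']}"
    # Stratum 11: no usable information on either the DF or the PDB.
    return "No DF/PDB information"


# Illustrative records only.
print(assign_stratum({"df_hispanic": False, "df_race": "Black Alone"}))
print(assign_stratum({"pdb_block_group_class": "Hispanic"}))
print(assign_stratum({}))
```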

Sample sizes will be refreshed to maintain proportional sample sizes within the strata based on updated DF or PDB universe information. Each quarter, a refreshment sample panel will be selected based on an evaluation of attrition within each stratum. Sample weights will be adjusted based on the inverse of the sum of the selection probabilities of each panel that is in sample at the time.

For example, if the MAF contains 147 million records, the base weight for the original sample will be approximately 147,000,000 / 75,000 = 1,960, with a probability of selection of 1/1,960. If a refreshment sample of 13,775 units (base weight 147,000,000 / 13,775 = 10,672) is required to maintain 15,000 units in sample, then the adjusted base weight would be 1 / (1/1,960 + 1/10,672) = 1,656.
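The composite weight in this example can be reproduced directly from the probabilities of selection. The short Python sketch below simply restates the arithmetic above; the figures are the illustrative values from the example, not production parameters.

```python
# Reproduces the adjusted base weight example above. The original panel and
# the refreshment panel are treated as selections from the same frame, so
# their selection probabilities are summed before inverting.

frame_size = 147_000_000       # illustrative MAF universe size
original_sample = 75_000       # initial invitation sample
refreshment_sample = 13_775    # refreshment selections needed

p_original = original_sample / frame_size      # about 1/1,960
p_refresh = refreshment_sample / frame_size    # about 1/10,672

adjusted_base_weight = 1 / (p_original + p_refresh)
print(round(adjusted_base_weight))             # approximately 1,656
```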

The final CHP survey weights are designed to produce estimates for the total population aged 18 and older living within HUs (based on the person weight) and occupied-household-level estimates (based on the household weight). We will create these weights by adjusting the household-level sampling base weights by various factors to account for nonresponse, the number of adults per household, and coverage. The final survey weights are created by applying a housing unit adjustment, which converts the person-level weight back into a housing unit (HU) weight by dividing the person-level weight by the number of persons age 18 and older reported to live within the household, and an occupied-HU ratio adjustment, which ensures that the final CHP survey weights sum to the American Community Survey (ACS) one-year, state-level estimates of occupied HUs.
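The two final adjustments can be illustrated with a minimal sketch. The inputs below (a nonresponse-adjusted person weight, a reported count of adults, and an ACS occupied-HU control total) are hypothetical placeholders, and the sketch is not the production weighting system.

```python
def housing_unit_weight(person_weight: float, adults_in_household: int) -> float:
    """Convert a person-level weight back to a housing-unit weight by dividing
    by the number of persons age 18 and older reported in the household."""
    return person_weight / adults_in_household


def occupied_hu_ratio_adjustment(hu_weights: list[float],
                                 acs_occupied_hu_total: float) -> list[float]:
    """Ratio-adjust HU weights so they sum to the ACS one-year,
    state-level estimate of occupied housing units."""
    factor = acs_occupied_hu_total / sum(hu_weights)
    return [w * factor for w in hu_weights]


# Illustrative values only.
weights = [housing_unit_weight(3200.0, 2), housing_unit_weight(2800.0, 1)]
print(occupied_hu_ratio_adjustment(weights, acs_occupied_hu_total=5000.0))
```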

  3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

To enroll in the CHP, potential panelists must complete a baseline questionnaire. A $5 prepaid incentive will accompany the mailed invitation to the baseline questionnaire. Potential panelists will be considered enrolled after completing the baseline questionnaire. The baseline questionnaire will be programmed using Qualtrics.

Out of the 75,000 invited households, the response rate to the initial baseline operation is expected to be approximately 20%, resulting in about 15,000 enrolled households (75,000 × 0.20). The response rate to the first topical data collection is estimated at approximately 85%, resulting in approximately 12,750 responses. The overall cumulative response rate (12,750/75,000) is expected to be 17%.

For an estimate with a 40% prevalence rate, and given an expected response rate of 80% for module data collection, we expect to achieve a 1.1% CV for the estimate at the national level, a 2.2% CV at the region level, and a 3.4% CV at the division level.
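These coefficients of variation are consistent with a simple random sampling approximation that ignores design effects and weighting. The sketch below reproduces them under assumptions drawn from the figures above: roughly 15,000 enrolled panelists, an 80% module response rate, and respondents spread evenly across the 4 Census regions and 9 divisions.

```python
from math import sqrt


def cv_of_proportion(p: float, n: float) -> float:
    """Coefficient of variation of an estimated proportion under simple
    random sampling, with no design effect or weighting adjustment."""
    return sqrt(p * (1 - p) / n) / p


panel_size = 15_000      # enrolled panelists (assumed)
response_rate = 0.80     # expected module response rate
prevalence = 0.40        # estimate with a 40% prevalence rate

respondents = panel_size * response_rate   # about 12,000
for level, groups in [("national", 1), ("region", 4), ("division", 9)]:
    cv = cv_of_proportion(prevalence, respondents / groups)
    print(f"{level}: CV of about {cv:.1%}")
# national: ~1.1%, region: ~2.2%, division: ~3.4%
```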


The Baseline Questionnaire is a 20-minute questionnaire for potential panelists. This questionnaire will collect a household roster, detailed demographic information (mirroring questions from the ACS for benchmarking), attitudes about privacy and confidentiality, and views on science and government. These data will establish important benchmarks for subsequent analyses, including examination of the characteristics of nonrespondents and of CHP members who attrit over time. The baseline questionnaire will also collect detailed contact information and permission to send text messages for survey invitations and nonresponse follow-up. A panelist is considered enrolled after completing the baseline questionnaire, and a $20 incentive will be mailed to panelists upon completion.

For initial recruitment, we will mail an invitation to complete the baseline questionnaire to the sample address with a visible $5 prepaid incentive. The letter will contain a unique link to the Baseline Questionnaire with a QR code, a phone number for inbound calling, and a brochure describing the CHP and incentive structure. Web respondents will complete the questionnaire on their computer, tablet, or smartphone. Those who choose to complete via phone will call into a phone line provided by NPC. Phone interviewers will have access to a Qualtrics instrument for data entry and AWS Connect will be the case management system.

Three days after the initial invitation, all cases will be mailed a first reminder with a web link and inbound CATI number. This first reminder will be a pressure-sealed envelope. One week later (10 days after the initial invitation), nonresponding cases that have an associated phone number in the Alternate Contact Frame will receive a phone call reminder. A final mailing of the survey invitation with web link, QR code and inbound CATI phone number will be sent one week after the phone call reminder. See Figure 2 for the Initial Recruitment Contact Strategy. CHP recruitment is expected to take place over an 8-week period. After 8 weeks, the baseline invitation will close, and enrolled panelists will be mailed a $20 incentive for completing the baseline questionnaire.
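For reference, the mailing and phone contact timing described above can be summarized as a simple schedule. The day offsets restate the text; the data structure itself is only a hypothetical illustration, not an operational case management configuration.

```python
# Initial recruitment contact strategy, restated from the text above.
# Day offsets are relative to the initial invitation mailing (day 0).
RECRUITMENT_CONTACTS = [
    {"day": 0, "contact": "Invitation letter with visible $5 prepaid incentive, "
                          "unique web link, QR code, inbound phone number, and brochure"},
    {"day": 3, "contact": "First reminder (pressure-sealed envelope) with web link "
                          "and inbound CATI number"},
    {"day": 10, "contact": "Phone call reminder for nonrespondents with a phone number "
                           "on the Alternate Contact Frame"},
    {"day": 17, "contact": "Final mailing with web link, QR code, and inbound CATI number"},
]
FIELD_PERIOD_WEEKS = 8   # baseline invitation closes after 8 weeks

for step in RECRUITMENT_CONTACTS:
    print(f"Day {step['day']:>2}: {step['contact']}")
```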

Once enrolled, panelists will be invited to respond to monthly topical surveys. Invitations will be sent by email, text message (opt-in), and Interactive Voice Response (IVR) phone calls. Using a unique login or QR code, panelists can access a topical questionnaire by computer, tablet, or smartphone to complete a topical survey. Phone-only panelists will complete topical surveys via inbound CATI.

Data collection for each topical survey will take place in a 2-week window. Panelists will receive the first topical survey invitation 4 weeks after the initial recruitment period ends. Each topical survey will be approximately 15 minutes long and panelists will receive up to two reminders to complete a topical survey. Panelists who complete a topical survey will be mailed a thank you letter with a $10 cash incentive about 10 days after the topical survey field period closes.

The first topical survey planned for the Census Household Panel is the Census Barriers, Attitudes and Motivators survey that will inform the decennial advertising and communications program for the 2030 Census. Future topical surveys can be sponsored by other Census Bureau survey programs or by other Federal statistical agencies. Each topical survey will offer panelists an opportunity to update contact information and verify their address for incentive mailing. Content governance will initially follow policies developed for the Household Pulse Survey and be amended as necessary.


Keeping panelists engaged will help prevent attrition and maintain the representativeness of the CHP. We anticipate sending panelists one topical survey per month to keep them engaged; panelists will not be eligible for more than one survey per month, which keeps burden low and reduces panel conditioning. Topical surveys may target specific groups of panelists depending on the topical survey sponsor. Panelists who are not sampled for a particular month’s topical survey will be asked to respond to a pre-designed panel maintenance questionnaire that will also serve to verify demographic information and record any changes.


We plan to use the Audience Management functionality in Qualtrics to create a web page where panelists can view their upcoming surveys, check for mailing of incentives for past questionnaires, update their contact information, access technical assistance, and opt-out of CHP participation. At least once a year, panelists will be asked to verify or update information from their original Baseline Questionnaire to ensure information about the panelist and their household is current.


Panel Replacement and Replenishment

Census Household Panel members will be asked to complete approximately one questionnaire per month and will receive an incentive for each questionnaire. Panelists will be enrolled for three years and will roll off the panel after that period. In addition to this three-year limit, we expect attrition due to inactivity and requests to disenroll. Attrition can bias the panel estimates, making the development of a panel member replenishment plan of vital importance (Herzing & Blom, 2019; Lugtig et al., 2014; Schifeling et al., 2015; Toepoel & Schonlau, 2017).

Panelist requests to disenroll from the CHP will be identified and processed according to forthcoming protocols. Periodic nonresponse or refusal to the monthly requests for otherwise active panelists is expected. The definition of an inactive panelist is as follows:

No response or active refusal to:

  • a survey request for three consecutive months; or

  • more than 50% of survey requests within a 12-month period.

A particular questionnaire may be classified as “no response” due to unit nonresponse (i.e., no questionnaire initiation); item nonresponse that leaves the questionnaire unusable for analyses (e.g., item nonresponse to questions deemed critical for analysis, or high item nonresponse either alone or after data review); or poor-quality data resulting in an unusable interview. Inactive panelists will remain members of the Census Household Panel if reengagement is desired by Census Bureau staff, especially for rare or historically undercounted populations. A definition of poor-quality responses is forthcoming.
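The following is a minimal sketch of the inactivity rule, assuming a simple monthly history of outcomes coded as responded, nonresponse, or refusal; the thresholds restate the definition above, and the rest (including evaluating only the most recent months) is illustrative.

```python
def is_inactive(monthly_outcomes: list[str]) -> bool:
    """Apply the inactivity definition to a panelist's monthly history.

    monthly_outcomes lists the most recent month last; each entry is
    "responded", "nonresponse", or "refusal".
    """
    missed = [o in ("nonresponse", "refusal") for o in monthly_outcomes]

    # Rule 1: no response or active refusal for the three most recent months.
    if len(missed) >= 3 and all(missed[-3:]):
        return True

    # Rule 2: more than 50% of survey requests missed within a 12-month period.
    last_12 = missed[-12:]
    if last_12 and sum(last_12) / len(last_12) > 0.5:
        return True

    return False


# Rule 1 example: three consecutive months of nonresponse or refusal.
print(is_inactive(["responded", "nonresponse", "refusal", "nonresponse"]))   # True
# Rule 2 example: 7 of the last 12 requests missed, never 3 in a row.
print(is_inactive(["nonresponse", "responded"] * 5 + ["nonresponse", "nonresponse"]))  # True
```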

We will assess on an ongoing basis (and no less than quarterly) the generalizability of the CHP estimates to represent the target population. Evaluative methods will include precision within important demographic and geographic characteristics, R-indicators, propensity scores, and nonresponse bias analyses (Bianchi & Biffignandi, 2017; Eckman et al., 2021; Groves & Peytcheva, 2008; Peytcheva & Groves, 2009; Rosen et al., 2014).
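As one illustration of these evaluative methods, a commonly used form of the representativeness indicator (R-indicator) is one minus twice the standard deviation of estimated response propensities. The sketch below assumes propensities have already been estimated (for example, from a logistic model of response on frame and DF covariates) and uses made-up values; it is not the CHP’s specific evaluation code.

```python
from statistics import pstdev


def r_indicator(propensities: list[float]) -> float:
    """Unweighted R-indicator: R = 1 - 2 * SD(estimated response propensities).
    Values closer to 1 indicate a more representative respondent pool."""
    return 1 - 2 * pstdev(propensities)


# Illustrative response propensities only (e.g., fitted by stratum or subgroup).
rho = [0.85, 0.80, 0.78, 0.90, 0.60, 0.55, 0.82, 0.75]
print(round(r_indicator(rho), 3))
```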


Based on results from multiple analyses, we will identify any subgroups requiring replenishment. New members will be sampled and recruited using the same protocol as for initial enrollment.


Incentives remain one of the most effective ways to encourage survey participation. The current incentive design includes the following:


  • Initial Invitation: $5 visible prepaid incentive with the initial invitation to complete the Baseline Questionnaire.

  • Baseline Questionnaire: $20 baseline contingent incentive after initial recruitment field period.

  • Topical Surveys: $10 for each topical survey (~20-minute average; once per month).

Respondents will be mailed cash incentives for survey completion. NPC will coordinate incentive distribution. The incentive structure could be amended to facilitate ongoing engagement of panelists, particularly for groups of panelists that are rare or historically undercounted.


  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The Ask U.S. Panel Pilot was developed to test methods for a federally sponsored, probability-based, nationally representative survey panel that would include historically undercounted populations. The Pilot was designed to answer critical methodological questions about our ability to recruit and retain historically undercounted population groups in a panel. To address two related challenges that may contribute to nonresponse bias in estimates (how to engage those who are unlikely to complete an online screening questionnaire, and how to include the population without internet access), we launched a two-phase panel recruitment design with subsampling for nonresponse. We oversampled populations that have historically been missing from online surveys, namely those with low internet penetration and Hispanics, and evaluated recruitment protocols that may increase response rates and minimize the potential for nonresponse bias. We found that nonresponse follow-up efforts allowed us to reach more diverse households, such as those that do not own their home, speak a language other than English at home, or receive financial assistance.

Experimentally, we focused on two design elements: sponsorship and prepaid incentives. In a 2x2 design, we compared explicit government sponsorship versus none, and a visible versus non-visible $5 prepaid incentive sent with the initial recruitment letter. We found that both the explicit government sponsorship and the visible $5 incentive had a positive and significant influence on response rates, and the effect remained significant even after controlling for design variables. The interaction of the two experimental conditions was also significant, such that the condition with the visible incentive and the Census Bureau brand had the highest response rate. These findings are described in the Ask U.S. Panel Pilot General Population Final Report (census.gov).

We plan to continue to experiment with ways to maximize recruitment and retention for the Census Household Panel. Each experiment will be submitted to OMB as it is planned.

  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Statistical Design:

Anthony Tersine

Demographic Statistical Methods Division

Demographic Programs Directorate

[email protected]

Data Collection/Survey Design:

Jason Fields

Social Economic and Housing Statistics Division

Demographic Programs Directorate

[email protected]


Jennifer Hunter Childs

Center for Behavioral Science Methods

Research and Methodology Directorate

[email protected]

Statistical Analysis:


David Waddington

Social Economic and Housing Statistics Division

Demographic Programs Directorate

[email protected]



