SUPPORTING STATEMENT B
U.S. Department of Commerce
U.S. Census Bureau
Census Household Panel
OMB Control No. 0607-1025
B. Collections of Information Employing Statistical Methods
The topical sample size is currently 17,852 housing units after conducting sample replenishment in March. Previous topicals yielded, on average, a response rate of approximately 58%. Subsequent topicals are expected to have a similar response rate, resulting in approximately 10,354 households responding to the 10th, 11th, and 12th topical surveys.
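As a rough check on these figures, the expected respondent count follows directly from the sample size and the historical response rate. The following minimal sketch is illustrative only; the 58% rate is the average noted above:

    sample_size = 17_852        # housing units after March replenishment
    response_rate = 0.58        # average response rate across previous topicals
    expected = round(sample_size * response_rate)
    print(expected)             # 10354, matching the estimate cited above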
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The sample design is a stratified systematic sample of all eligible HUs from the Census Bureau’s Master Address File (MAF), which covers all 50 states and the District of Columbia. Auxiliary data from the Demographic Frame (DF) and Planning Data Base (PDB) will be linked to the MAF to stratify the housing units into strata based on demographic variables within the four Census Bureau regions. MAF records that cannot be stratified based on the DF or PDB will be defined as their own strata. The sample will be distributed proportionately within regions of the country to each stratum based on the number of housing units within the stratum. We will conduct a subsampling operation in strata that, based on results of other demographic surveys, have higher response rates. Thus, the strata where no subsampling occurs will, in effect, be oversampled.
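To illustrate the allocation logic described above, the following sketch draws a proportionally allocated systematic sample within each stratum. The frame contents, stratum labels, and sample sizes are hypothetical, and the subsampling step for high-response strata is omitted:

    import random

    def systematic_sample(units, n):
        # Draw a systematic sample of n units from an ordered frame:
        # pick a random start within the interval, then take every k-th unit.
        if n <= 0 or not units:
            return []
        interval = len(units) / n
        start = random.uniform(0, interval)
        return [units[int(start + i * interval)] for i in range(n)]

    def allocate_proportionally(stratum_sizes, total_n):
        # Allocate total_n across strata in proportion to stratum size.
        # (Rounding may shift totals by a unit or two; a sketch, not production.)
        total = sum(stratum_sizes.values())
        return {s: round(total_n * size / total) for s, size in stratum_sizes.items()}

    # Hypothetical frame: stratum label -> sorted list of address IDs in one region
    frame = {"stratum_1": list(range(1_000)), "stratum_2": list(range(4_000))}
    allocation = allocate_proportionally({s: len(u) for s, u in frame.items()}, 500)
    sample = {s: systematic_sample(frame[s], allocation[s]) for s in frame}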
Future refreshment samples will be drawn from a frame that uses updated MAF, DF and PDB information, and those samples may be targeted at the geographic or domain level, to maintain representativeness of the Household Panel Survey, adjust sample sizes based on observed nonresponse, and account for sample units that are rotating out of the panel.
The final Household Panel Survey weights are designed to produce estimates for persons aged 18 and older living within HUs (based on the person weight) and occupied-household-level estimates (based on the household weight). We will create these weights by adjusting the household-level sampling base weights by various factors that account for nonresponse, the number of adults per household, and coverage. Two final adjustments complete the process: a Housing Unit adjustment, which converts the person-level weight back into a housing unit (HU) weight by dividing it by the number of persons aged 18 and older reported to live within the household, and an Occupied HU ratio adjustment, which ensures that the final Household Panel Survey weights sum to the American Community Survey (ACS) one-year, state-level estimates of occupied HUs.
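A minimal sketch of the two final adjustments described above, with hypothetical numbers (the upstream base-weight, nonresponse, and coverage adjustments are omitted):

    def final_hu_weights(person_weights, adults_per_hu, acs_occupied_hu_total):
        # Housing Unit adjustment: divide each person-level weight by the
        # number of adults (18+) reported in the household.
        hu_weights = [w / max(a, 1) for w, a in zip(person_weights, adults_per_hu)]
        # Occupied HU ratio adjustment: scale so the weights sum to the ACS
        # one-year, state-level occupied-HU control total.
        factor = acs_occupied_hu_total / sum(hu_weights)
        return [w * factor for w in hu_weights]

    # Toy example: three responding households in one state
    weights = final_hu_weights([1200.0, 800.0, 1500.0], [2, 1, 3], 3_000_000)
    print(sum(weights))  # ~3,000,000, the ACS control total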
Enrolled panelists will be invited to respond to monthly topical surveys. Invitations will be sent by email, text message (opt-in), and for those panelists with no email or mobile phone contact information, outbound telephone calling. Using a unique login or QR code, panelists can access a topical questionnaire by computer, tablet, or smartphone to complete a topical survey. Phone-only panelists will complete topical surveys via inbound or outbound CATI.
Data collection for each topical survey will take place in a 2-week window. Each topical survey will be approximately 20 minutes long and panelists will receive up to two reminders to complete a topical survey. Panelists who complete a topical survey will be mailed a thank you letter with a $10 cash incentive about 6 weeks after the topical survey field period closes.
The topical survey fielding in August (Topical 10) will include a roster experiment and content from the Household Pulse Survey (HPS), run in parallel with the HPS. The roster experiment is a split ballot design that varies the question wording across two experimental treatment groups (i.e., Treatment A and Treatment B) as compared to a control group. Fielding Household Pulse Survey content simultaneously and longitudinally in the CHP will allow methodological assessment of the implications of changing a program such as the HPS from a cross-sectional design to a longitudinal design. The September topical (Topical 11) will include a test of the Survey of Income and Program Participation’s (SIPP) labor force, assets, and homeownership items; the design of each component is described in the SIPP Labor Force Test section below. The October topical questionnaire (Topical 12) will repeat the Household Pulse Survey content without the roster experiment.
Future topical surveys can be sponsored by other Census Bureau survey programs. Each topical survey will offer panelists an opportunity to update contact information and verify their address for incentive mailing. Content governance will initially follow policies developed for the Household Pulse Survey and be amended as necessary.
Keeping panelists engaged helps prevent attrition and maintain the representativeness of the panel. We will continue sending panelists one topical survey per month, and panelists will not be eligible for more than one survey per month, keeping burden low and reducing panel conditioning. Topical surveys may target specific groups of panelists depending on the topical survey sponsor. Panelists not sampled for a particular month’s topical survey will be asked to respond to a pre-designed panel maintenance questionnaire that also serves to verify demographic information and record any changes.
In the future, we plan to use the Audience Management functionality in Qualtrics to create a web page where panelists can view their upcoming surveys, check the status of incentive mailings for past questionnaires, update their contact information, access technical assistance, and opt out of panel participation. At least once a year, panelists will be asked to verify or update information from their original Baseline Questionnaire to ensure information about the panelist and their household is current.
Census Household Panel members will be asked to complete approximately one questionnaire per month and will receive an incentive for each questionnaire. Panelists will be enrolled for three years and will be rotated out of the panel after that period. In addition to this three-year limit, we expect attrition due to inactivity and requests to disenroll. Attrition can bias the panel estimates, making the development of a panel member replenishment plan of vital importance (Herzing & Blom, 2019; Lugtig et al., 2014; Schifeling et al., 2015; Toepoel & Schonlau, 2017).
Panelist requests to disenroll from the panel will be identified and processed according to forthcoming protocols. Periodic nonresponse or refusal to the monthly requests from otherwise active panelists is expected. The definition of an inactive panelist is as follows (an illustrative sketch of this rule appears after the list):
No response or active refusal to:
a survey request for two consecutive months; or
more than 50% of survey requests within a 12-month period.
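Stated as a rule, the definition above can be checked mechanically. The sketch below is illustrative, assuming a trailing 12-month window for the 50% criterion and a monthly response history coded as usable or not usable:

    def is_inactive(history):
        # history: list of booleans, one per monthly survey request,
        # oldest to newest; True means a usable response was received.
        # Criterion 1: no response or active refusal for two consecutive months.
        for prev, curr in zip(history, history[1:]):
            if not prev and not curr:
                return True
        # Criterion 2: more than 50% of requests missed in a 12-month period.
        last_12 = history[-12:]
        return last_12.count(False) > len(last_12) / 2

    print(is_inactive([True, True, False, False]))  # True: two consecutive misses
    print(is_inactive([True, False] * 6))           # False: alternating, exactly 50%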
A particular questionnaire may be classified as “no response” due to unit nonresponse (i.e., no questionnaire initiation); item nonresponse resulting in an interview that is not usable for analyses (e.g., item nonresponse to questions deemed critical for analysis, or high item nonresponse identified either on its own or during data review); or poor-quality data resulting in an unusable interview. Inactive panelists will remain members of the Census Household Panel if reengagement is desired by Census staff, especially for rare or historically undercounted populations. A definition of poor-quality responses is forthcoming.
We will assess on an ongoing basis (and no less than quarterly) the generalizability of the panel estimates to represent the target population. Evaluative methods will include precision within important demographic and geographic characteristics, R-indicators, propensity scores, and nonresponse bias analyses (Bianchi & Biffignandi, 2017; Eckman et al., 2021; Groves & Peytcheva, 2008; Peytcheva & Groves, 2009; Rosen et al., 2014).
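As one concrete example among these methods, an R-indicator can be computed from modeled response propensities; a common formulation is R = 1 - 2*S(p), where S(p) is the standard deviation of the estimated propensities. The unweighted sketch below uses simulated data as a stand-in for frame variables; production use would incorporate sampling weights:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def r_indicator(frame_covariates, responded):
        # Fit response propensities on frame variables, then compute
        # R = 1 - 2 * std(propensities). Values near 1 suggest the respondent
        # pool is representative; values near 0 suggest strong selectivity.
        model = LogisticRegression(max_iter=1000).fit(frame_covariates, responded)
        propensities = model.predict_proba(frame_covariates)[:, 1]
        return 1 - 2 * propensities.std()

    # Simulated stand-in for frame variables and response flags
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    responded = (rng.random(500) < 0.58).astype(int)
    print(r_indicator(X, responded))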
Based on results from multiple analyses, we will identify any subgroups requiring replenishment. New members will be sampled and recruited using the same protocol as for initial enrollment.
Incentives remain one of the most effective ways to encourage survey participation. The current incentive design includes the following:
Initial Invitation: $5 visible prepaid incentive with the initial invitation to complete the screener.
Baseline Questionnaire: $20 baseline contingent incentive after initial recruitment field period.
Topical Surveys: $10 for each topical survey (~15-minute average; once per month).
Respondents will be mailed cash incentives for survey completion. NPC will coordinate incentive distribution. The incentive structure could be amended to facilitate ongoing engagement of panelists, particularly for groups of panelists that are rare or historically undercounted.
Rostering Experiment
Stemming from findings on roster question wording and design from the Decennial 2030 Project 21, we are proposing a split ballot experiment for the CHP Topical 10 Questionnaire. The primary objective of this experiment is to compare the final roster wording recommendations from Project 21 to the 2020 Census roster wording. More specifically, the split ballot experiment will vary the question wording used across two experimental treatment groups (i.e., Treatment A and Treatment B) as compared to a control group. Across all three groups, each survey participant will also respond to a subset of undercount and overcount “probes” to act as a truth measure in relation to the participants’ answers to the different rostering questions presented. For this experimental research design, the primary experimental manipulation will be the inclusion/exclusion of the word “stay” in the initial rostering question (with “live or stay” being the wording used in the 2010 and 2020 Census). It is important to note that the design of this experiment not only extends and tests findings and recommendations from Project 21, but also complies with a memorandum issued by the Deputy Chief Counsel for Economic Affairs that outlines the legal requirement for initial roster questions to retain the word “live” (to be in line with the Decennial Census residence criteria). In doing so, the proposed split ballot experiment ultimately seeks to answer (within the parameters of the aforementioned residence criteria) the following core research question: Is there a difference in how the new procedure performs for households with historically undercounted characteristics or with people more likely to be duplicated in the Census?
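A minimal sketch of the random assignment mechanics for a three-arm split ballot such as the one described above (the equal allocation across Control, Treatment A, and Treatment B is an assumption; the actual design may allocate differently):

    import random

    def assign_split_ballot(panelist_ids, seed=2024):
        # Shuffle reproducibly, then deal panelists into the three arms
        # round-robin, yielding near-equal group sizes.
        rng = random.Random(seed)
        ids = list(panelist_ids)
        rng.shuffle(ids)
        arms = ("Control", "Treatment A", "Treatment B")
        groups = {arm: [] for arm in arms}
        for i, pid in enumerate(ids):
            groups[arms[i % 3]].append(pid)
        return groups

    groups = assign_split_ballot(range(9_000))
    print({arm: len(members) for arm, members in groups.items()})  # 3,000 each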
SIPP Labor Force Test
The Survey of Income and Program Participation (SIPP) is undergoing significant changes to adapt to a predominantly internet-based format, in which a single household member responds on behalf of all adults in the household. The Labor Force Statistics Branch plans to test a series of questions in the September CHP topical.
Labor Force
The labor force section of the A-B test aims to evaluate the effectiveness of different question formats in gathering comprehensive information about the employment, earnings, and work hours of respondents and their spouses over the past six months. Version A of the questionnaire will ask respondents to provide total amounts for earnings and hours worked across various types of work arrangements, including formal employment, self-employment, business ownership, and informal work activities. In contrast, Version B will ask respondents to provide average amounts for earnings and hours worked in a typical month, as well as the minimum and maximum amounts earned during the six-month period. By comparing the results from both versions, the A-B test will provide insights into the most effective way to capture a nuanced understanding of labor force participation and economic well-being while minimizing respondent burden.
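The section above does not specify the evaluation metrics, so as an assumed example, one natural comparison between versions is a two-proportion z-test on item nonresponse rates for the earnings questions (all counts below are hypothetical):

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(x_a, n_a, x_b, n_b):
        # Two-sided z-test for a difference in proportions, e.g., item
        # nonresponse among Version A vs. Version B respondents.
        p_a, p_b = x_a / n_a, x_b / n_b
        pooled = (x_a + x_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # Hypothetical: 120 of 1,500 Version A respondents and 90 of 1,500
    # Version B respondents skip the earnings items
    z, p = two_proportion_z_test(120, 1_500, 90, 1_500)
    print(round(z, 2), round(p, 4))  # z ~ 2.15, p ~ 0.032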
Assets
Currently in the SIPP, we ask each adult household member to report the types and values of the assets and debts they hold. We want to update the assets content while preserving the quality of household-level asset ownership and debt-holding rates and net worth. A and B versions of the questions will be randomly assigned to test how response rates are affected when a single respondent is asked (a) whether anyone in the household owns a given asset/debt type and (b) total household amounts. We will also test whether we can identify person-level ownership. We include two types of assets, which differ in likelihood of ownership, and one unsecured liability, which together can be used to evaluate how responses differ across assets and debts. We also include several questions to identify how household dynamics affect responses: (a) who in the household shares resources and financial responsibility, (b) financial knowledge sharing among couples, and (c) how couples share assets.
Homeownership
The current iteration of the SIPP asks individuals the value of the residence they own, along with the loans on the property and various details about those loans (interest rates, loan types, where they were obtained). Because SIPP respondents can have two or even three loans, this can balloon respondent burden. For SIPP Seamless, we have been discussing aggregating the loans: asking about the total amount owed on the house rather than about each loan in succession, with mark-all-that-apply questions to capture loan characteristics instead of multiple questions with radio buttons. In the CHP we hope to measure the efficacy of this change by testing questions about two loans and their associated details versus a more compressed set of questions.
Statistical Design:
Anthony Tersine
Demographic Statistical Methods Division
Demographic Programs Directorate
Data Collection/Survey Design:
Jason Fields
Social Economic and Housing Statistics Division
Demographic Programs Directorate
Jennifer Hunter Childs
Associate Director for Demographic Programs
Statistical Analysis:
David Waddington
Social Economic and Housing Statistics Division
Demographic Programs Directorate