
OMB Information Collection Request

Supporting Statement B

U.S. Department of Commerce

U.S. Census Bureau


High Frequency Surveys Program

Household Trends and Outlook Pulse Survey (March, April, May)

OMB Control Number 0607-1029


B. Collections of Information Employing Statistical Methods

  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

The topical monthly sample size is currently 17,812 housing units after conducting sample replenishment in March. Previous topical surveys yielded an average response rate of approximately 55 percent. Subsequent topical surveys are expected to achieve a similar response rate, resulting in approximately 10,000 households responding to the January and February surveys.

In January 2025, a sample replenishment will be conducted. Approximately 110,000 households will be invited to join the HTOPS panel. Of those households, we expect a 17 percent response rate, resulting in 18,800 responses to the baseline questionnaire. We expect a 55 percent response rate for the monthly topical surveys, resulting in 10,350 additional monthly topical responses. The new Household Trends and Outlook Pulse Survey (HTOPS) panel sample size is estimated to reach 36,600, with over 20,000 monthly responses beginning with the April 2025 topical collection.


  2. Describe the procedures for the collection of information including:

    • Statistical methodology for stratification and sample selection,

    • Estimation procedure,

    • Degree of accuracy needed for the purpose described in the justification,

    • Unusual problems requiring specialized sampling procedures, and

    • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

The sample design is a stratified systematic sample of all eligible housing units (HUs) from the Census Bureau’s Master Address File (MAF), which covers all 50 states and the District of Columbia. Auxiliary data from the Demographic Frame (DF) and Planning Database (PDB) will be linked to the MAF to stratify the housing units into strata based on demographic variables within the nine Census Bureau divisions. MAF records that cannot be stratified using the DF or PDB will form their own stratum. The sample will be distributed proportionately within divisions to each stratum based on the number of housing units in the stratum. We will conduct a subsampling operation in strata that, based on results of other demographic surveys, have higher response rates; thus, strata where no subsampling occurs will be oversampled.
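To make the selection mechanics concrete, the sketch below illustrates proportional allocation across strata and systematic selection within each stratum. It is a minimal illustration of the design described above, assuming a hypothetical frame structure and stratum labels; it is not the production sampling system.

```python
import random

def allocate_proportionally(strata_sizes, total_n):
    """Allocate total_n sampled units to strata in proportion to stratum size."""
    total = sum(strata_sizes.values())
    return {s: round(total_n * size / total) for s, size in strata_sizes.items()}

def systematic_sample(frame, n, rng=random):
    """Select a systematic sample of n units from an ordered list of unit IDs."""
    if n <= 0 or not frame:
        return []
    k = len(frame) / n                 # sampling interval
    start = rng.uniform(0, k)          # random start in [0, k)
    return [frame[int(start + i * k)] for i in range(n)]

# Hypothetical frame for one Census division: {stratum: [housing-unit IDs]}.
frame = {"stratum_1": [f"HU1-{i}" for i in range(1000)],
         "stratum_2": [f"HU2-{i}" for i in range(500)]}
alloc = allocate_proportionally({s: len(u) for s, u in frame.items()}, total_n=150)
sample = {s: systematic_sample(units, alloc[s]) for s, units in frame.items()}
```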

Future refreshment samples will be drawn from a frame that uses updated MAF, DF, and PDB information. Those samples may be targeted at the geographic or domain level to maintain the representativeness of the Household Panel Survey, adjust sample sizes based on observed nonresponse, and account for sample units rotating out of the panel.

The final HTOPS survey weights are designed to produce estimates for persons aged 18 and older living in housing units (based on the person weight) and for occupied households (based on the household weight). We will create these weights by adjusting the household-level sampling base weights by various factors to account for nonresponse, adults per household, and coverage. The final survey weights are then created by applying a Housing Unit adjustment, which converts the person-level weight back into a housing unit (HU) weight by dividing it by the number of persons aged 18 and older reported to live in the household, and an Occupied HU ratio adjustment, which ensures that the final weights sum to the American Community Survey (ACS) one-year, state-level estimates of occupied HUs.
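The chain of adjustments can be illustrated with a short sketch. This is a minimal illustration under assumed column names (base_wt, nr_factor, n_adults, state) and omits the coverage and raking adjustments applied in production; it is not the Census Bureau’s weighting implementation.

```python
import pandas as pd

def final_hu_weights(resp: pd.DataFrame, acs_occupied_hus: dict) -> pd.Series:
    """Sketch of the HTOPS weighting chain on a respondent file.

    Assumed columns: base_wt (household sampling base weight), nr_factor
    (nonresponse adjustment computed from the full sample), n_adults
    (persons aged 18+ reported in the household), and state.
    acs_occupied_hus maps state -> ACS 1-year estimate of occupied HUs.
    """
    # Nonresponse-adjusted household weight.
    hh_wt = resp["base_wt"] * resp["nr_factor"]

    # Person weight: the responding adult represents all adults in the HU.
    # (In production, coverage and raking adjustments are applied here, so
    # the later division back to an HU weight is not a simple inverse.)
    person_wt = hh_wt * resp["n_adults"]

    # Housing Unit adjustment: convert the person weight back to an HU
    # weight by dividing by the number of persons aged 18+ in the HU.
    hu_wt = person_wt / resp["n_adults"]

    # Occupied HU ratio adjustment: force state totals to the ACS controls.
    state_totals = hu_wt.groupby(resp["state"]).transform("sum")
    return hu_wt * resp["state"].map(acs_occupied_hus) / state_totals
```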

  3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Enrolled HTOPS participants are invited to respond to monthly topical surveys. Invitations will be sent by email, by text message (for panelists who opt in), and, for panelists with no email or mobile phone contact information, by outbound telephone calling. Using a unique login, panelists can access a topical questionnaire by computer, tablet, or smartphone. Phone-only panelists will complete topical surveys via inbound or outbound CATI.

Data collection for each topical survey will take place in a 2-week window. Each topical survey will be approximately 20 minutes long and panelists will receive up to two reminders to complete a topical survey. Panelists who complete a topical survey will be mailed a thank you letter with a $5 incentive (either digital or cash) about 2 to 6 weeks after the topical survey field period closes (depending on the type of incentive).

Future topical surveys can be sponsored by other Census Bureau survey programs. Each topical survey will offer panelists an opportunity to update contact information and verify their address for incentive mailing. Content governance will initially follow policies developed for the Household Pulse Survey and be amended as necessary.


Keeping panelists engaged helps prevent attrition and maintains the representativeness of the panel. We will continue sending panelists one topical survey per month to keep them engaged. Panelists will not be eligible for more than one survey per month, both to keep burden low and to reduce panel conditioning. Topical surveys may target specific groups of panelists depending on the topical survey sponsor. Panelists who are not sampled for a particular month’s topical survey will be asked to respond to a pre-designed panel maintenance questionnaire that also serves to verify demographic information and record any changes.



HTOPS Replacement and Replenishment

HTOPS panel members will be asked to complete approximately one questionnaire per month and will receive an incentive for each questionnaire. Panelists will be enrolled for three years and will rotate out after that period. In addition to this three-year limit, we expect attrition due to inactivity and requests to disenroll. Attrition can bias the panel estimates, making the development of a panel member replenishment plan vitally important (Herzing & Blom, 2019; Lugtig et al., 2014; Schifeling et al., 2015; Toepoel & Schonlau, 2017).

Panelist requests to disenroll from the panel will be identified and processed according to forthcoming protocols. Periodic nonresponse or refusal to the monthly requests for otherwise active panelists is expected. The definition of an inactive panelist is as follows:

No response or active refusal to:

  • a survey request for three consecutive months; or

  • more than 50% of survey requests within a 12-month period.

A particular questionnaire may be classified as “no response” due to unit nonresponse (i.e., no questionnaire initiation); item nonresponse that renders an interview unusable for analysis (e.g., item nonresponse to questions deemed critical for analysis, or high item nonresponse overall or after data review); or poor-quality data resulting in an unusable interview. Inactive panelists will remain members of the HTOPS panel if Census staff desire reengagement, especially for rare or historically undercounted populations. A definition of poor-quality responses is forthcoming.
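The inactivity rule stated above can be expressed as a small function. The sketch below is illustrative only; the structure of the response history is an assumption, and the forthcoming definition of poor-quality responses would determine which interviews count as non-responses.

```python
def is_inactive(history):
    """Apply the inactivity rule to a panelist's response history.

    history: list of booleans, one per monthly survey request in
    chronological order; False marks unit nonresponse, an active refusal,
    or an interview judged unusable. Illustrative sketch only.
    """
    # Rule 1: no response to three consecutive monthly requests.
    streak = 0
    for responded in history:
        streak = 0 if responded else streak + 1
        if streak >= 3:
            return True
    # Rule 2: missed more than 50% of requests within a 12-month window.
    window = history[-12:]
    return bool(window) and sum(not r for r in window) > len(window) / 2

# Four months of response followed by three consecutive misses -> inactive.
print(is_inactive([True, True, True, True, False, False, False]))  # True
```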

We will assess on an ongoing basis (and no less than quarterly) how well the panel estimates generalize to the target population. Evaluative methods will include precision within important demographic and geographic characteristics, R-indicators, propensity scores, and nonresponse bias analyses (Bianchi & Biffignandi, 2017; Eckman et al., 2021; Groves & Peytcheva, 2008; Peytcheva & Groves, 2009; Rosen et al., 2014).
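As one example of these evaluative methods, the sketch below computes a common form of the R-indicator, R = 1 − 2·S(ρ), where S(ρ) is the standard deviation of estimated response propensities. The covariates, propensity model, and weighting shown are assumptions for illustration, not the program’s specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def r_indicator(X, responded, weights=None):
    """One common form of the R-indicator: R = 1 - 2 * sd(propensity).

    Response propensities are estimated here by logistic regression of a
    response flag on frame covariates X (e.g., stratum and PDB variables);
    values near 1 suggest response varies little across the covariates,
    while lower values flag potential nonresponse bias.
    """
    rho = LogisticRegression(max_iter=1000).fit(X, responded).predict_proba(X)[:, 1]
    w = np.ones(len(rho)) if weights is None else np.asarray(weights)
    mean = np.average(rho, weights=w)
    sd = np.sqrt(np.average((rho - mean) ** 2, weights=w))
    return 1.0 - 2.0 * sd
```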


Based on results from multiple analyses, we will identify any subgroups requiring replenishment. New members will be sampled and recruited using the same protocol as for initial enrollment.

Because incentives remain one of the most effective ways to encourage survey participation, the current incentive design includes the following:


  • Initial Invitation: $2 visible prepaid incentive with the initial invitation to complete the screener.

  • Baseline Questionnaire: $10 baseline contingent incentive after initial recruitment field period.

  • Topical Surveys: $5 for each topical survey (~20-minute average; once per month).

Respondents will be emailed digital incentives or mailed cash incentives (if unable or unwilling to accept digital incentives) for survey completion. The National Processing Center (NPC) and the Associate Director for Demographic Programs – Survey Operations (ADDP-SO) team will coordinate incentive distribution. The incentive structure could be amended to facilitate ongoing engagement of panelists, particularly for groups of panelists that are rare or historically undercounted.


  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


March Topical Experiments

In March, the topical content tests new SIPP content for labor force and assets. The labor force portion of the survey asks about the respondent’s work history between October 1, 2024 and the present day (the “reference period”). For each job collected from the respondent, we then collect earnings, expenses, and work time information. The respondent’s path through this section is highly dependent on their personal work situation, including the type(s) of pay they receive, the regularity of their work schedule, and the type of job they hold: we include multiple versions of the earnings and hours questions, each meant to be seen by mutually exclusive subsets of respondents. For each job, we also test new SIPP questions capturing industry and occupation. For all respondents (regardless of work history), we ask about their job search activity over the past four weeks.

The assets portion of the survey asks the respondent about ownership of certain assets within their household and whether they hold certain types of debt. For checking and savings accounts (as one asset type), credit card debt, and medical debt, we also collect the current value (where relevant). We include two questions about whether the respondent pays any utilities for their residence.

Experimental treatments are included to test a highly ‘personalized’ labor force questionnaire and to randomize respondents over different versions of the same question(s). First, we randomize whether respondents see the assets content or the labor force content first in the survey.

For respondents who worked for pay at some point during the reference period, we randomly assign them to one of two question sequences meant to capture their full list of jobs over that period. For respondents who report receiving an hourly wage, we randomly assign them to one of two question sequences meant to capture their total earnings during the reference period. We randomize whether we include instructions to check earnings records within the labor force content.

We are also testing questions designed to collect information about looking for work and about new types of work arrangements. For “gig work” type jobs that respondents report having, we randomize whether respondents are first asked about commuting expenses or first asked about work expenses. We randomly assign respondents to one of four possible questions about job search experiences.

Within the assets series of questions, we randomly assign one of two different reference periods to a question about debts possessed by the respondent. We randomly assign respondents to one of two ways of capturing ownership of less-commonly-held assets: one version is a mark-all question, while the other is a series of yes-no questions. We randomly assign respondents to one of four versions of a question about ownership of retirement accounts.
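The randomized assignments described in this section can be sketched as a set of independent factor randomizations. The factor names and levels below are illustrative labels drawn from the description above, not the production specification, and eligibility screening for conditional factors is omitted.

```python
import random

# Factor labels and levels are illustrative names drawn from the
# description above, not the production specification.
TREATMENTS = {
    "section_order":         ["assets_first", "labor_force_first"],
    "job_roster_version":    ["sequence_1", "sequence_2"],
    "hourly_earnings_path":  ["sequence_1", "sequence_2"],
    "records_instruction":   ["shown", "not_shown"],
    "gig_expense_order":     ["commuting_first", "work_expenses_first"],
    "job_search_version":    ["v1", "v2", "v3", "v4"],
    "debt_reference_period": ["period_1", "period_2"],
    "rare_asset_format":     ["mark_all", "yes_no_series"],
    "retirement_version":    ["v1", "v2", "v3", "v4"],
}

def assign_treatments(case_id, seed="HTOPS-2025-03"):
    """Deterministically assign one level of each factor to a case.

    Several factors apply only to eligible subsets (e.g., the hourly
    earnings path applies only to respondents reporting an hourly wage);
    that screening is omitted from this sketch.
    """
    rng = random.Random(f"{seed}:{case_id}")  # reproducible per-case stream
    return {factor: rng.choice(levels) for factor, levels in TREATMENTS.items()}
```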


May Contact Strategies Experiments



HTOPS will include two contact strategy experiments for the May 2025 data collection. The first experiment tests whether sending the first reminder to nonrespondents on a Saturday improves response, and whether text messages or emails are more effective on a Saturday. To do this, the sample will be divided into four experimental groups: A, B, C, and D. Groups A and B will serve as the control groups: nonrespondents in Group A will receive their first reminder as a text message, and those in Group B as an email, on the first Thursday after the start of data collection. Nonrespondents in Group C will receive their first reminder as a text message, and those in Group D as an email, on the first Saturday after data collection begins. A reminder in the alternative mode will go to nonrespondents on the Tuesday following the start of data collection: Groups A and C will receive this reminder as an email, and Groups B and D as a text message. The remaining reminders will be sent on the same day for all experimental groups. All groups will receive up to four email invitations/reminders and up to four text message invitations/reminders. See the table below for a summary of the planned invitations and reminders across the four experimental groups.

                         Group A      Group B      Group C      Group D
Initial Email            5/20/2025    5/20/2025    5/20/2025    5/20/2025
Initial Text             5/20/2025    5/20/2025    5/20/2025    5/20/2025
First Reminder Text      5/22/2025    5/27/2025    5/24/2025    5/27/2025
First Reminder Email     5/27/2025    5/22/2025    5/27/2025    5/24/2025
Second Reminder Text     5/29/2025    5/29/2025    5/29/2025    5/29/2025
Second Reminder Email    5/29/2025    5/29/2025    5/29/2025    5/29/2025
Final Reminder Email     6/3/2025     6/3/2025     6/3/2025     6/3/2025
Final Reminder Text      6/3/2025     6/3/2025     6/3/2025     6/3/2025
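A minimal sketch of how cases might be assigned to the four groups and mapped to the contact schedule in the table follows; the assignment logic, seed, and key names are hypothetical illustrations, not the production contact system.

```python
import random

# Group-specific first-reminder dates from the table above.
SCHEDULE = {
    "A": {"first_reminder_text": "5/22/2025", "first_reminder_email": "5/27/2025"},
    "B": {"first_reminder_email": "5/22/2025", "first_reminder_text": "5/27/2025"},
    "C": {"first_reminder_text": "5/24/2025", "first_reminder_email": "5/27/2025"},
    "D": {"first_reminder_email": "5/24/2025", "first_reminder_text": "5/27/2025"},
}

# Contacts common to all four groups.
COMMON = {
    "initial_email": "5/20/2025", "initial_text": "5/20/2025",
    "second_reminder_text": "5/29/2025", "second_reminder_email": "5/29/2025",
    "final_reminder_email": "6/3/2025", "final_reminder_text": "6/3/2025",
}

def contact_plan(case_id, seed="HTOPS-2025-05"):
    """Randomly assign a case to a group and return its contact schedule."""
    group = random.Random(f"{seed}:{case_id}").choice("ABCD")
    return group, {**COMMON, **SCHEDULE[group]}
```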

The second experiment will test whether urgent messaging in the final reminder is more effective in gaining response at the end of data collection than less urgent messaging. The “urgent” message will retain the language used for other HTOPS final reminder emails (e.g., “final chance” in the email subject line and in the first paragraph of the email). The “urgent” final reminder text message will replace “today” with “now” to increase the urgency to respond. The control group will soften this messaging: the final email for the control group will remove “Final chance” from the subject line and the first paragraph while retaining a reference to the survey due date, and the final reminder text message for the control group will remove the word “today” from the request asking panelists to respond to their survey. See the materials below demonstrating the changes made to the urgent and control versions of the final reminder. Groups A and C will receive the urgent messages, and Groups B and D will receive the control messages.



  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Statistical Design:

Anthony Tersine

Demographic Statistical Methods Division

Demographic Programs Directorate

[email protected]

Data Collection/Survey Design:

Jason Fields

Social Economic and Housing Statistics Division

Demographic Programs Directorate

[email protected]


Jennifer Hunter Childs

Demographic Programs Directorate

[email protected]

Statistical Analysis:


David Waddington

Social Economic and Housing Statistics Division

Demographic Programs Directorate

[email protected]



