SUPPORTING STATEMENT B
U.S. Department of Commerce
U.S. Census Bureau
American Community Survey Methods Panel
OMB Control No. 0607-0936
B. Collections of Information Employing Statistical Methods
Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
For each test or set of tests that use similar methodology, we have outlined the general planned respondent universe, sampling method, sample size, and response rate. Once the details of specific tests are determined, this information will be refined as part of a non-substantive change request (and Federal Register Notice, if required).
Every effort is made to use the existing American Community Survey (ACS) sample for testing when the tests do not involve content changes. This approach reduces overall burden on the public, imposes no additional burden on respondents already selected for the ACS, and reduces costs of the ACS program. Because the decision to use the production sample is made on a case-by-case basis, the sample details listed below assume a sample independent of the production ACS is needed to conduct testing (except where noted). When an independent sample is used, addresses selected to participate in the production ACS are not eligible for the test.
Questionnaire Timing Test
Universe: The sample universe for this test consists of all mailable residential housing unit addresses in the United States from the Census Bureau’s Master Address File. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection and Sample Size: This test will use a 2025 production monthly panel consisting of about 288,000 housing unit addresses. The sample design will be based on the ACS production multi-stage sample design. The monthly ACS production sample of approximately 288,000 housing unit addresses is divided into 24 groups, called methods panel groups. Each methods panel group contains approximately 12,000 addresses and is a representative subsample of the entire monthly sample. Each monthly sample is representative of the entire yearly sample and the country. Any combination of methods panel groups can be used for a test. This test will have four experimental treatments and a control treatment. It will use all 24 methods panel groups, with four randomly assigned methods panel groups for the control treatment and five randomly assigned methods panel groups for each experimental treatment.
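For illustration only, the sketch below (hypothetical Python, not production Census Bureau code) shows one way the random assignment of the 24 methods panel groups to the control and experimental treatments could be carried out; the actual assignment is performed within the Census Bureau's sampling systems.

import random

# Illustrative sketch only: assign the 24 methods panel groups (each of
# approximately 12,000 addresses) to one control and four experimental
# treatments, matching the group counts described above.
GROUPS_PER_TREATMENT = {"Control": 4, "Treatment 1": 5, "Treatment 2": 5,
                        "Treatment 3": 5, "Treatment 4": 5}

def assign_methods_panel_groups(seed=None):
    groups = list(range(1, 25))  # methods panel groups 1 through 24
    rng = random.Random(seed)
    rng.shuffle(groups)
    assignment, start = {}, 0
    for treatment, count in GROUPS_PER_TREATMENT.items():
        assignment[treatment] = sorted(groups[start:start + count])
        start += count
    return assignment

if __name__ == "__main__":
    for treatment, panels in assign_methods_panel_groups(seed=2025).items():
        # Four groups correspond to roughly 48,000 addresses; five to roughly 60,000.
        print(treatment, panels, f"approximately {len(panels) * 12000:,} addresses")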
Response Rate: The Questionnaire Timing Test mail contact strategy consists of sending up to five mailings to sampled housing unit addresses to solicit self-response including internet self-response. This strategy results in a return rate prior to Computer-Assisted Personal Interview (CAPI) of approximately 55.2 percent. The Questionnaire Timing Test is expected to have similar final self-response rates to the production ACS, with the goal being higher internet self-response.
Response Option and Error Message Design Test
Universe: The sample universe for the Response Option and Error Message Design (RED) Test consists of all mailable residential housing unit addresses in the United States from the Census Bureau’s Master Address File. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection and Sample Size: The RED Test will be conducted using a 2025 production monthly panel. The monthly ACS production panel consists of approximately 288,000 housing unit addresses and is divided into 24 nationally representative groups (referred to as methods panel groups) of approximately 12,000 addresses each. This test will use all 24 methods panel groups. Each group will be randomly assigned to one of the four treatments (control, response buttons, edit message formatting, combined response buttons and edit message formatting), so that each treatment uses six randomly assigned methods panel groups.
Response Rate: The ACS mail contact strategy consists of sending up to five mailings to sampled housing unit addresses to solicit self-response, including internet self-response. This strategy results in an internet self-response rate prior to CAPI of approximately 38.6 percent. The RED Test is expected to have internet self-response rates similar to the production ACS.
Additional Internet Instrument Testing
Universe: The sample universe for the internet instrument testing consists of all mailable residential housing unit addresses in the United States from the Census Bureau’s Master Address File. Addresses selected to participate in the production ACS will be out-of-scope for the tests. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection and Sample Size: The sample design will be based on the ACS production multi-stage sample design. The number of treatments will be determined as the details of the tests are defined. Once the number of treatments is determined, methods will be developed to sample addresses and randomly assign them to the various experimental treatments.
We will use production sample, when possible, to conduct these tests. The monthly ACS production sample of approximately 288,000 housing unit addresses is divided into 24 groups, called methods panel groups. Each methods panel group contains approximately 12,000 addresses and is a representative subsample of the entire monthly sample. Each monthly sample is representative of the entire yearly sample and the country. Any combination of methods panel groups can be used for a test.
If an independent sample is needed for testing, each internet instrument test will have a national sample of 60,000 addresses divided into treatments.
Response Rate: The ACS mail contact strategy consists of sending up to five mailings to sampled housing unit addresses to solicit self-response including internet self-response. This strategy results in an internet self-response rate prior to CAPI of approximately 38.6 percent. The Internet Instrument Tests are expected to have similar internet self-response rates to the production ACS.
Self-Response Mail Messaging and Contact Strategies Testing
Universe: The sample universe for the mail messaging tests consists of all mailable residential housing unit addresses in the United States from the Census Bureau’s Master Address File. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection and Sample Size: The sample design will be based on the ACS production multi-stage sample design. The number of treatments will be determined as the details of the tests are defined. Once the number of treatments is determined, methods will be developed to sample addresses and randomly assign them to the various experimental treatments.
We will use production sample, when possible, to conduct these tests. The monthly ACS production sample of approximately 288,000 housing unit addresses is divided into 24 groups, called methods panel groups. Each methods panel group contains approximately 12,000 addresses and is a representative subsample of the entire monthly sample. Each monthly sample is representative of the entire yearly sample and the country. Any combination of methods panel groups can be used for a test.
If an independent sample is needed for testing, each self-response test will have a national sample of 60,000 addresses divided into treatments.
Response Rate: The ACS mail contact strategy consists of sending up to five mailings to sampled housing unit addresses to solicit self-response including internet self-response. This strategy results in a return rate prior to CAPI of approximately 55.2 percent. The Self-Response Mail Messaging and Contact Strategies Tests are expected to have similar final self-response rates to the production ACS.
Content Testing and Content Testing Follow-up Interview
Universe: The sample universe for the Content Tests consists of all residential addresses in the United States from the Census Bureau’s Master Address File. Addresses selected to participate in production ACS will be out-of-scope for the tests. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection: Content Testing is typically conducted with sample separate from the production ACS. The sample design will be based on the ACS production multi-stage sample design and sampling methods used in prior Content Tests. The sample size will be determined based on the topics being tested to ensure differences between treatments are detectable to a sufficient degree and is estimated to require 40,000 housing units. Past ACS Content Tests have excluded Puerto Rico, Alaska, and Hawaii for cost reasons. The Census Bureau will reconsider whether Hawaii and non-remote portions of Alaska should continue to be excluded. These tests will continue to exclude Remote Alaska and Puerto Rico.
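As a rough illustration of how detectability drives the sample size, the sketch below computes an approximate minimum detectable difference between two treatment proportions under simplifying assumptions that are not part of the actual design work: equal allocation of roughly 40,000 housing units to two treatments, simple random sampling with no design effect or nonresponse adjustment, a 50 percent baseline proportion, a two-sided 10 percent significance level, and 80 percent power.

from math import sqrt
from statistics import NormalDist

def minimum_detectable_difference(n_per_treatment, p=0.5, alpha=0.10, power=0.80):
    """Approximate minimum detectable difference between two independent
    proportions under simple random sampling (illustrative only)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    standard_error = sqrt(2 * p * (1 - p) / n_per_treatment)
    return (z_alpha + z_power) * standard_error

if __name__ == "__main__":
    # Hypothetical allocation: 40,000 housing units split evenly across two treatments.
    print(f"{minimum_detectable_difference(20_000):.3f}")  # roughly a 1.2 percentage point difference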
Follow-up interviews are conducted with respondents to measure response bias or variance for the question topics being tested. No additional sampling is currently planned.
Response Rate: The ACS contact strategy consists of sending up to five mailings to sampled housing unit addresses to solicit self-response. Nonrespondents are then subsampled for in-person and telephone interviewing. This strategy results in an overall response rate of approximately 84.7 percent. The Content Tests employ the same contact strategy as production and, as a result, are expected to produce similar response rates.
Nonresponse Follow-up Data Collection Testing
Universe: The sample universe for the Nonresponse Follow-up Data Collection Testing consists of all addresses in the United States from the Census Bureau’s Master Address File that did not respond to the ACS prior to the CAPI data collection. Also, no housing unit addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection: The sample design will be based on the ACS production multi-stage sample design. Additional sampling methodology may be incorporated in the design based on the nonresponse distributions and other considerations.
Sample Size: This test is estimated to need a national sample of 60,000 addresses. The details of the test are yet to be defined.
Response Rate: The Nonresponse Follow-up Data Collection Testing focuses on in-person and telephone interviews conducted by Census Bureau field representatives (FRs). Since FRs also encourage respondents to respond online, and respondents may still mail back a paper questionnaire they received during the self-response phase of the ACS, response rates will depend on these factors as well as on the specific data collection intervention and thus are unknown at this time.
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The Methods Panel Tests typically follow the data collection procedures for the production ACS mailing strategy, with modifications based on the design and purpose of the test. Examples include changing the wording in a letter mailed to the sampled address, changing the content of a question, or sending an additional reminder to a sampled address. Section 4 below, on Test Procedures, outlines specific changes to the data collection methodology for the proposed tests. A summary of the data collection methodology for the production ACS follows. More details about the data collection methodology for the ACS can be found in the ACS Design and Methodology report (U.S. Census Bureau, 2022).
ACS Housing Unit Data Collection
The ACS employs a 3-month data collection process for each monthly sample, first through self-response and later through in-person and telephone interviews.
To solicit self-response, the Census Bureau sends up to five mailings to potential respondents. The first two mailings are sent to all mailable addresses in the monthly sample. The first mailing is a pressure seal mailer with information about the survey and about how to respond via the internet. A week later, the same addresses are sent a second mailing (reminder letter in a pressure seal mailer).
Responding addresses are removed from the address file after the second mailing to create a new mailing universe of nonrespondents; these addresses are sent the third and fourth mailings. The third mailing is a package that includes a letter, a paper questionnaire, and a business reply envelope. Four days later, these addresses are sent a fourth mailing (reminder postcard) which encourages them to respond.
After the fourth mailing, responding addresses are again removed from the address file to create a new mailing universe of nonrespondents. The remaining sample addresses are sent the fifth mailing (a more urgent final reminder letter with a due date in a pressure seal mailer).
The Census Bureau provides Telephone Questionnaire Assistance (TQA) for respondents who need assistance with completing the paper or internet questionnaires, who have questions about the survey, or who would like to complete the ACS interview over the telephone instead of by other modes. Respondents may call the ACS toll-free TQA numbers listed on various ACS mail materials. The TQA staff answer respondent questions and/or complete the entire ACS interview using a computer-assisted telephone interview (CATI) instrument.
Two to three weeks after the fifth mailing is sent, responding addresses are removed and the unmailable and undeliverable addresses (from the initial sample) are added to create the universe of addresses eligible for the CAPI Nonresponse Follow-up operation. Of this universe, a subsample is chosen to be included in the CAPI operation, which starts at the beginning of the month following the fifth mailing. A pressure seal reminder letter is also sent to all mailable addresses sampled for CAPI at the start of the interviewing month. This letter lets respondents know that a Census Bureau FR may call or visit them to complete the interview and encourages them to complete the survey online, if possible. For most addresses, Census Bureau FRs first attempt to interview those selected for CAPI by phone. If the FR is unable to complete a phone interview, they visit the address to conduct an in-person interview.
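To make the flow concrete, the sketch below is a simplified, hypothetical illustration (in Python, with made-up function names) of forming the CAPI-eligible universe and drawing the subsample; the production ACS uses differential subsampling rates and operational rules that are not reflected here.

import random

def capi_universe(sample_addresses, responded, unmailable_or_undeliverable):
    """Nonresponding addresses, plus unmailable and undeliverable addresses
    from the initial sample, are eligible for CAPI Nonresponse Follow-up."""
    return [address for address in sample_addresses
            if address in unmailable_or_undeliverable or address not in responded]

def capi_subsample(eligible_addresses, rate, seed=None):
    """Draw a flat-rate subsample of the eligible universe (the production
    ACS applies differential rates rather than a single flat rate)."""
    rng = random.Random(seed)
    return [address for address in eligible_addresses if rng.random() < rate]

if __name__ == "__main__":
    sample = [f"ADDR{i:04d}" for i in range(1, 101)]  # hypothetical addresses
    responded = set(sample[:55])                      # roughly 55 percent self-response
    unmailable = set(sample[95:])                     # a few unmailable addresses
    eligible = capi_universe(sample, responded, unmailable)
    chosen = capi_subsample(eligible, rate=0.5, seed=1)
    print(len(chosen), "of", len(eligible), "eligible addresses selected for CAPI")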
Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
Similar to the production ACS, the Methods Panel Tests include a well-researched mail contact strategy (as outlined in Section 2 above) to encourage self-response. TQA is available via a toll-free number in the mailings, which respondents may call to obtain help in completing the survey, to ask questions regarding their participation in the ACS, or to complete the questionnaire over the phone. The mailing materials and online survey also provide links to additional information about the ACS as well as the Census Bureau’s policies on privacy, security, and accessibility.
Nonresponse follow-up operations are conducted to ensure a high final weighted response rate. These operations are conducted via computer-assisted interviewing for a sample of addresses from which we have not obtained a response. We maintain high levels of data accuracy and response rates through interviewer instruction, training, and close monitoring of the data.
Data collection instruments are available to respondents and interviewers in English and Spanish. Respondents may also complete the survey via a phone interview with bilingual staff in about 15 languages. There are toll-free numbers to receive assistance in Spanish, Chinese, Russian, Korean, and Vietnamese.
Additional methods for maximizing response are explored as part of several of the proposed tests. Once the details of specific strategies are determined, they will be provided as part of a non-substantive change request.
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The Census Bureau continuously engages with and responds to stakeholders to adapt the way we gather data, administer the survey, and conduct business. The ACS Methods Panel program allows the Census Bureau to respond to emerging trends and changes in our nation that spawn new data needs by building on our comprehensive research agenda. This work not only improves the ACS but also allows the Census Bureau to innovate responsively across key aspects of our work. The ACS Methods Panel program also provides an opportunity to research and test elements of survey data collection that relate to the decennial census.
The series of tests proposed over the next three years allows the Census Bureau the opportunity to improve data quality, reduce data collection costs, improve questionnaire content and data collection materials, and react to emerging needs.
The Questionnaire Timing Test was created to compare alternate mail contact strategies that mail the paper questionnaire later than the current timing (in the third mailing). The purpose of the test is to determine which contact strategy decreases operational costs without decreasing overall self-response.
The ACS has a two-month self-response period where sampled households are contacted by mail and encouraged to respond to the survey by internet, paper questionnaire, or phone. Since the internet instrument was introduced in 2013, internet self-response has increased over time whereas paper response has decreased (Mills et al., 2022). Responding online is both convenient for respondents and cost-effective compared to the other self-response modes. Additionally, the internet instrument has technological features that are designed to improve data quality, such as automated skip patterns and a review screen.
Over time, the ACS program has modified the ACS mail contact strategy to emphasize responding online over the other response modes. Currently, the paper questionnaire is the third mailing sent to households and is mailed three weeks after the first contact. Mailing the paper questionnaire later may push respondents to respond online sooner, improving data quality, reducing the number of paper questionnaires sent and decreasing costs of questionnaire printing, assembly, mailing, and processing.
In addition to changing the timing of the paper questionnaire mailing, we are also interested in testing the addition of a Quick Response (QR) code to the ACS mailing materials. A QR code is a two-dimensional barcode that can be read by smartphones and tablets (by either a QR reader or a camera application) and sends the user to a linked website. Most internet responses are from personal computers, but responses from smart devices (i.e., smartphones and tablets) have steadily increased (Mills et al., 2022).
Currently, an ACS respondent must type the URL provided in the mailing materials to access the survey. With a QR code, respondents would not have to open a browser and type in the URL; instead, they could simply scan the code with their smart device. Providing a QR code might help respondents access the survey more quickly, reducing respondent burden.
Experimental Design: This test will include four experimental treatments and a control treatment.
The Control Treatment will use production ACS mail materials and mailout timing. All mailable sample addresses in this treatment will be sent an initial mailing. Seven days later, those same sampled addresses will be sent a reminder letter. Nonresponding addresses will be sent a paper questionnaire package 14 days after the reminder letter, followed by a reminder postcard 4 days later. Remaining nonresponding addresses will be sent a due date letter 22 days after the reminder postcard.
Treatment 1 will follow the control treatment mailout timing and will include a QR code on all ACS mail materials.
Treatment 2 is based on the 2020 Census Internet First mailout timing strategy. The questionnaire will be sent to all nonrespondents but moved from the third mailing to the fourth mailing. The motivation behind this treatment is that mailing the questionnaire package later may allow more respondents to respond by internet first, reducing costs if fewer questionnaire packages need to be mailed. The QR codes will be included on the mail materials in this treatment. All mailable sample addresses in this treatment will be sent an initial mailing. Four days later those same addresses will be sent a reminder letter. Ten days later those same addresses will be sent a reminder postcard. Nonresponding addresses will be sent a paper questionnaire package 14 days after the reminder postcard. Remaining nonresponding addresses will be sent a due date letter 19 days after the paper questionnaire.
In Treatment 3, the paper questionnaire will be sent in the fourth mailing, followed by the reminder postcard in the fifth mailing. This treatment keeps the ACS production order of questionnaire followed by postcard, which was shown to perform well in the Strategic Framework Test (Oliver et al., 2023), while delaying the questionnaire to a later mailout. All mailable sample addresses in this treatment will be sent an initial letter. Seven days later, those same addresses will be sent a reminder letter. Nonresponding addresses will be sent a second reminder letter 19 days after the first one. Remaining nonresponding addresses will be sent a paper questionnaire package 16 days after the second reminder letter, followed by a reminder postcard with a due date four days later. The QR codes will be included on the mail materials in this treatment.
In Treatment 4, the paper questionnaire will be sent in the fifth mailing to all remaining nonrespondents. All mailable sample addresses in this treatment will be sent an initial mailing. Seven days later those same addresses will be sent a reminder letter. Nonresponding addresses will be sent a reminder postcard 14 days after the reminder letter, followed by a second reminder letter seven days later. Remaining nonresponding addresses will be sent a paper questionnaire package with a due date 14 days after the second reminder letter. The QR codes will be included on the mail materials in this treatment.
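For reference, the following sketch (illustrative Python, not an operational schedule) converts the intervals described above into the approximate day, counted from the initial mailing, on which each mailing would be sent under each treatment; actual mailout dates are set by the production mailing schedule.

from itertools import accumulate

# Intervals (in days) between consecutive mailings, taken from the
# treatment descriptions above; day 0 is the initial mailing.
MAIL_INTERVALS = {
    "Control / Treatment 1": [("Initial mailing", 0), ("Reminder letter", 7),
                              ("Questionnaire package", 14), ("Reminder postcard", 4),
                              ("Due date letter", 22)],
    "Treatment 2": [("Initial mailing", 0), ("Reminder letter", 4),
                    ("Reminder postcard", 10), ("Questionnaire package", 14),
                    ("Due date letter", 19)],
    "Treatment 3": [("Initial letter", 0), ("Reminder letter", 7),
                    ("Second reminder letter", 19), ("Questionnaire package", 16),
                    ("Reminder postcard with due date", 4)],
    "Treatment 4": [("Initial mailing", 0), ("Reminder letter", 7),
                    ("Reminder postcard", 14), ("Second reminder letter", 7),
                    ("Questionnaire package with due date", 14)],
}

if __name__ == "__main__":
    for treatment, mailings in MAIL_INTERVALS.items():
        days = list(accumulate(gap for _, gap in mailings))
        schedule = ", ".join(f"{name}: day {day}"
                             for (name, _), day in zip(mailings, days))
        print(f"{treatment} -> {schedule}")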
Sample Size: This test will use the production ACS sample. The monthly ACS production sample consists of approximately 288,000 housing unit addresses and is divided into 24 nationally representative groups (referred to as methods panel groups) of approximately 12,000 addresses each. This test will use the entirety of the July 2025 ACS sample. The sample for each of the four experimental treatments in this test will consist of five randomly assigned methods panel groups (approximately 60,000 mailing addresses per treatment). The sample for the control treatment will consist of four randomly assigned methods panel groups. The test will exclude Remote Alaska since sampled addresses do not receive mail materials. Similarly, Puerto Rico is out of scope for this test because different mail methods are used.
Evaluation: The primary evaluation of the experiment will compare the self-response rates for each treatment with the control. See Attachment A for more details about the treatment development, the analysis plan and methodology, and the mail materials.
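As a simplified illustration of that comparison only (the analysis plan in Attachment A accounts for the ACS sample design and weighting, which this sketch does not), a two-sample test of proportions under simple random sampling assumptions might look like the following; the counts shown are hypothetical.

from math import sqrt
from statistics import NormalDist

def compare_self_response_rates(responses_treatment, n_treatment, responses_control, n_control):
    """Two-sample z-test for a difference in self-response rates under
    simple random sampling (no survey weights or design effect)."""
    rate_t = responses_treatment / n_treatment
    rate_c = responses_control / n_control
    pooled = (responses_treatment + responses_control) / (n_treatment + n_control)
    standard_error = sqrt(pooled * (1 - pooled) * (1 / n_treatment + 1 / n_control))
    z = (rate_t - rate_c) / standard_error
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_t - rate_c, z, p_value

if __name__ == "__main__":
    # Hypothetical counts: about 60,000 addresses per experimental treatment
    # and 48,000 in the control, with illustrative numbers of self-responses.
    difference, z, p_value = compare_self_response_rates(34_200, 60_000, 26_500, 48_000)
    print(f"difference = {difference:.3f}, z = {z:.2f}, p = {p_value:.3f}")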
The Response Option and Error Message Design Test is proposed to test and evaluate revised features of the internet instrument. With a growing population using the internet to respond to the ACS, as well as the increased use of smartphones and other electronic devices with smaller screens, an evaluation and redesign of the internet instrument is needed. This test will determine the impact of two proposed design standards, changes to response buttons and to edit messages, on response and respondent burden in the ACS internet instrument. We will measure the effectiveness of these changes by comparing item nonresponse on specific questions to that of the control. These changes are expected to lead to higher response and better data quality. The changes to response buttons, in particular, are in line with standards to improve response for respondents using mobile devices.
Experimental Design: This test will include a control group and three treatment groups. The control group will receive the 2025 ACS production internet instrument. The three treatment groups will have the following changes to the production internet instrument.
Treatment 1: Replace radio buttons and ‘select all that apply’ checkboxes with response buttons. Response buttons outline the touch or click area and are colored once a selection is made.
Treatment 2: Updated edit message formatting (yellow formatting and outline of missing response, where applicable).
Treatment 3: Use response buttons and updated edit message formatting.
Sample Size: This test will use the production ACS sample. The monthly ACS production sample consists of approximately 288,000 housing unit addresses and is divided into 24 nationally representative groups (referred to as methods panel groups) of approximately 12,000 addresses each. This test will use the entirety of the August 2025 ACS sample. The sample for each of the three experimental treatments in this test will consist of six randomly assigned methods panel groups (approximately 72,000 mailing addresses per treatment). The sample for the control treatment will also consist of six randomly assigned methods panel groups. The test will exclude Remote Alaska since sampled addresses do not receive mail materials. Similarly, Puerto Rico is out of scope for this test because different mail methods are used.
Evaluation: The primary evaluation of the experiment will compare item nonresponse rates for selected questions with the control. See Attachment B for more details about the treatment development, the analysis plan and methodology, and screen captures of the proposed changes to the internet instrument.
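As a simplified, hypothetical illustration of the underlying metric (the analysis in Attachment B accounts for the ACS design and question eligibility rules, which this sketch does not), an item nonresponse rate for a single question could be computed as follows.

def item_nonresponse_rate(responses):
    """Share of eligible respondents who left the item blank.
    `responses` is a list of answers to one question, with None marking
    a missing (blank) answer among respondents eligible for the question."""
    if not responses:
        return float("nan")
    missing = sum(1 for answer in responses if answer is None)
    return missing / len(responses)

if __name__ == "__main__":
    # Hypothetical answers for one question within one treatment group.
    sample_answers = ["Yes", None, "No", "No", None, "Yes", "Yes", "No"]
    print(f"Item nonresponse rate: {item_nonresponse_rate(sample_answers):.1%}")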
Additional Internet Instrument Testing is proposed to test and evaluate revised features of the internet instrument. In 2013, the ACS incorporated the use of an internet instrument to collect survey responses. The design of the instrument reflected the research and standards of survey data collection at that time. With a growing population using the internet to respond to the ACS, as well as the increased use of smartphones and other electronic devices with smaller screens, an evaluation and redesign of the internet instrument is needed. Design elements will be developed and tested based on input from experts in survey methodology and web survey design. Testing may include revisions focused on improving login procedures and screen navigation, improving the user interface design, as well as methods to decrease respondent burden. Multiple tests may be conducted.
Self-Response Mail Messaging and Contact Strategies Testing is proposed to test and evaluate revised language and messaging in the mail materials, visual appearance of the materials, changing the type of material used (letter vs postcard for example), or other changes that could be used to improve response to the ACS. Multiple tests may be conducted.
Content Testing is conducted by the Census Bureau periodically to improve data quality. Working through the Office of Management and Budget Interagency Committee for the ACS, the Census Bureau solicits proposals from other Federal agencies to change existing questions or add new questions to the ACS. The objective of content testing is to determine the impact of changing question wording and response categories, as well as redefining underlying constructs, on the quality of the data collected. The Census Bureau proposes evaluating changes to current questions by comparing the revised questions to the current ACS questions. For new questions, the Census Bureau proposes comparing the performance of two versions of any new questions and benchmark results with other well-known sources of such information. The questions would be tested using all modes of data collection. Response bias or variance may also be measured to evaluate the questions by conducting a followup interview with respondents. Multiple tests may be conducted.
Nonresponse Follow-up Data Collection Testing is proposed to evaluate the use of adaptive survey design techniques for the ACS CAPI Nonresponse Follow-up operation. Models and rules would be developed to predict case outcomes and determine interventions for a case, such as assigning a case to a refusal specialist. The models and rules would also prioritize cases based on the likelihood of completing an interview. The adaptive approach would be evaluated by comparing results to traditional methods of case assignment and progress.
Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
The Census Bureau will collect and process these data as needed for each test. Within the Census Bureau, please consult the following individuals for further information on their area of expertise.
Statistical Aspects
Joan Hill, Assistant Division Chief for Experiments and Evaluations
Decennial Statistical Studies Division
Phone: (301) 763-4286
Test Methodology and Management
Elizabeth Poehler, Assistant Division Chief for Survey Methodology
American Community Survey Office
Phone: (301) 763-9305
References
Oliver, B., Contard, L., and Barth, D. (2023). “2021 ACS Strategic Framework Mail Materials Test Report.” Washington, DC: U.S. Census Bureau. Retrieved December 20, 2024, from https://www.census.gov/library/working-papers/2023/acs/2023_Oliver_01.html
U.S. Census Bureau. (2022). American Community Survey Design and Methodology. Washington, DC: U.S. Census Bureau. Retrieved December 20, 2024, from https://www.census.gov/programs-surveys/acs/methodology/design-and-methodology.html
Attachments
Attachment A: Research and Analysis Plan for the Questionnaire Timing Test
Attachment B: Research and Analysis Plan for the Response Option and Error Messaging Design Test