SUPPORTING STATEMENT A
U.S. Department of Commerce
U.S. Census Bureau
American Community Survey Methods Panel Tests
OMB Control No. 0607-0936
Abstract
The American Community Survey (ACS) is an ongoing monthly survey that collects detailed social, economic, housing and demographic data from about 3.5 million addresses in the United States and about 36,000 addresses in Puerto Rico each year (where it is called the Puerto Rico Community Survey [PRCS]). The ACS also collects detailed data from about 153,600 residents living in group quarters (GQ) facilities in the United States and Puerto Rico. Resulting tabulations from this data collection are provided on a yearly basis. The ACS allows the Census Bureau to provide timely and relevant social, economic, housing, and demographic statistics, even for low levels of geography.
An ongoing data collection effort with an annual sample of this magnitude requires that the Census Bureau continue research, tests, and evaluations aimed at improving data quality, reducing data collection costs, and improving the ACS questionnaire content and related data collection materials. The ACS Methods Panel is a research program at the Census Bureau designed to address and respond to survey issues and needs of the ACS. As part of the Decennial Census Program, the ACS also provides an opportunity to research and test elements of survey data collection that relate to the decennial census. As such, the ACS Methods Panel can serve as a testbed for the decennial census. From 2025 to 2028, the ACS Methods Panel may test ACS and decennial census methods for reducing survey cost, addressing respondent burden, and improving survey response, data quality, and survey efficiencies for housing units and group quarters. Testing may also include revising content or testing new questions. The ACS Methods Panel may also address other emerging needs of the programs.
Justification
Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
The Census Bureau developed the ACS to collect and update demographic, social, economic, and housing data every year that are essentially the same as the “long-form” data that the Census Bureau formerly collected once a decade as part of the census. The general public uses information like housing quality, income distribution, journey-to-work patterns, immigration data, and regional age distributions for decision-making and program evaluation. The ACS is now the only source of comparable data about social, economic, housing, and demographic characteristics for small areas and small subpopulations across the nation and in Puerto Rico.
The ACS program provides estimates annually for all states and for all medium and large cities, counties, and metropolitan areas. For smaller areas and population groups, it takes five years to accumulate enough data to provide reliable estimates. Every community in the nation continues to receive a detailed, statistical portrait of its social, economic, housing, and demographic characteristics each year through one-year and five-year ACS products. An ongoing data collection effort with an annual sample of this magnitude requires that the ACS continue research, testing, and evaluations aimed at improving data quality, reducing data collection costs, and improving the ACS questionnaire content and related data collection materials.
The Census Bureau is collecting these data under authority of Title 13, United States Code, Sections 141, 193, and 221. The goals of the ACS and PRCS are to:
Provide federal, state, tribal, and local governments an information base for the administration and evaluation of government programs; and
Provide data users with timely demographic, housing, social, and economic data updated every year that can be compared across states, communities, and population groups.
Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
The ACS collects detailed data, which are documented here:
https://www.census.gov/programs-surveys/acs/guidance/subjects.html
Information is requested from a resident of the sampled address. Respondents are invited to complete the survey online, on a paper questionnaire, by telephone, or in person. For in-person interviews, respondents must be at least 15 years old. The Census Bureau selects a random sample of addresses to be included in the ACS. Each address has about a 1-in-480 chance of being selected in a month, and no address should be selected more than once every 5 years. A detailed discussion of the ACS data collection practices can be found in Part B of this supporting statement.
The ACS has been conducted since 2005. The questions asked and how the data are collected have changed over time to reflect the changing needs of our nation, as well as changes to data collection technology and respondent needs. For example, ACS questions undergo a periodic review and testing of updated constructs and wording to reflect societal and technological changes. The ACS Content Test conducted in 2022 resulted in refreshed ACS content. Working through the Office of Management and Budget (OMB) Interagency Committee for the ACS, the Census Bureau solicits proposals from other federal agencies to change existing questions.
Information quality is an integral part of the predissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau’s Information Quality Guidelines). Information quality is also integral to the information collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act. Individual responses are protected by Title 13 and can only be shared with people with Special Sworn Status. Summary information is available to the public, including other Federal Agencies, after applying disclosure avoidance techniques to protect privacy of respondents.
See https://www.census.gov/about/policies/quality/guidelines.html for more information about the Census Bureau’s Information Quality Guidelines.
Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g. permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
The ACS uses web-based technology to collect data for stateside housing unit responses and certain types of group quarters responses. Paper questionnaires are still available to respondents who need or prefer to use paper. If a respondent starts the survey online and provides an e-mail address but does not complete the survey, they will be sent an e-mail reminding them to return to the survey and complete their online questionnaire. This e-mail includes a link to the online survey and clear instructions for logging in, including an explicit reference to the user identification number. This e-mail is sent only once to a respondent.
The ACS uses web-based technology to obtain group quarters residency lists directly from facilities.
Computer-assisted interviewing is used for in-person and telephone interviews, for both housing unit and group quarters. Computer-assisted instruments allow for the automation of skip patterns and conduct error checks on-the-spot to minimize costly follow-up interviews or editing.
Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Question 2.
The ACS is the instrument used to collect long-form data that have traditionally been collected only during the decennial census. The questions on the ACS reflect federal legal requirements, and the Census Bureau has determined that they are not duplicative of another agency’s data collection. Several questions in the ACS appear in other demographic surveys, but those results are typically not released as frequently as ACS results or at the same level of geography. The comprehensive set of ACS questions, coupled with the tabulation and dissemination of data for small geographic areas, does not duplicate any other single information collection. Moreover, many smaller federal and non-federal studies use a small subset of the same measures to benchmark their results against the ACS, which is often the most authoritative source for local area demographic data.
The OMB Interagency Committee for the ACS, co-chaired by OMB and the Census Bureau, includes more than 30 participating federal agencies and meets periodically to examine and review ACS content. This committee provides an extra safeguard to ensure that other agencies are aware of the ACS content and do not duplicate its collection and content with other surveys.
The ACS Methods Panel is the only testing vehicle for the ACS. There is no other program designed to improve the ACS. Testing for the ACS builds on research conducted for other surveys and by other statistical agencies. Specifically, lessons learned from ongoing decennial census cognitive and field testing of mail materials and multilingual approaches inform the design of ACS testing. Staff from the ACS program meet regularly with staff working on other demographic surveys to share test plans and research results. Proposals for content changes in the ACS also frequently build on research conducted for other federal surveys.
If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
The collection of ACS data for group quarters could include small entities (such as small group homes). Small group quarters facilities (defined as having fewer than 15 people) are eligible to be in sample only once every five years. This decision was made to reduce burden both on the facility and the residents living there. The focus of the interview is on a sample of residents, not the business, though a facility administrator is involved in the data collection.
Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
Collecting the data less frequently, or from fewer sampled cases, would increase the variances of the estimates produced from the ACS, especially affecting small geographic areas. Many federal agencies would also not be able to implement programs as intended or meet legal requirements. Examples of federal uses for the ACS data, and the associated laws, are described in the ACS Handbook of Questions and Current Federal Uses.
The ACS is also a critical resource for Congress, providing data that are used to allocate federal program funds, conduct legislative research, draft legislation, and understand constituencies. Lack of ACS data would eliminate nationally consistent, detailed data that are a critical resource for making data-driven policies and legislation.
Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
This data collection does not require any of the special circumstances mentioned above.
The ACS samples housing unit addresses in such a way that no housing unit address can be in sample more than once every five years. Data for a sampled housing unit are collected once. If a person moves or has more than one location where they live, they can be in sample more than once.
Respondents are instructed to respond to the survey as soon as possible, which avoids costly follow-up activities such as in-person interviewing. However, once in sample, a respondent has three months to respond to the survey.
Other than the survey itself, respondents are not required to submit any documents.
Respondents are not required to acquire or retain records for the survey.
The data collected from the ACS and associated research studies result in statistics that are released or in reports documenting findings. The sample is designed to ensure sufficient geographic coverage so that the ACS can produce an accurate demographic snapshot by surveying a representative sample of the population.
ACS survey results, including statistical estimate categories, are established and recognized as an official survey time series that has served as a benchmark for other data collection programs for nearly two decades.
ACS survey materials include information related to Title 13 protections of the data collection and other associated federal regulations. The Census Bureau implements disclosure avoidance procedures on all of its products and complies with federal regulations related to data security policies.
Respondents are not required to submit proprietary information. All information collected is protected by law.
If applicable, provide a copy and identify the date and page number of publications in the Federal Register of the agency's notice, required by 5 CFR 1320.8 (d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
The ACS frequently consults with persons outside of the Census Bureau to obtain views on a variety of topics. The ACS consults with data users, other federal agencies, and experts in the fields of survey methodology and statistics for feedback on the data produced from the survey. A similar process is followed for data collection activities, statistical techniques, and disclosure protection. For example, the Population Reference Bureau, in partnership with the Census Bureau, maintains an online data user community, organizes webinars and special sessions at professional meetings, and holds ACS Data Users Conferences.
The Census Bureau published a notice of its intent to conduct Methods Panel Testing for the ACS in the October 23, 2024, edition (Vol. 89, No. 205, pages 84526-84529) of the Federal Register (USBC-2024-0027). The Census Bureau received four public comments on this information collection submission. A summary of the comments, organized by topic, is below.
Questionnaire Timing Test – There was general support for this test, and three of the commenters provided suggestions. Several comments recommended developing a QR code that would direct recipients to multilingual resources about the ACS, including a collection instrument in multiple languages (as referenced in ACS brochures) and plain-language guidance about the purpose of the survey and the questions asked. Another comment advocated for user-friendly, visually interesting resources, made available in multiple languages, and for highlighting in respondent materials that language assistance is available.
Two comments asked the Census Bureau to consider respondents who require materials in languages other than English and Spanish.
The Census Bureau appreciates the input and will take these recommendations into consideration moving forward for future tests. The Questionnaire Timing Test implementation date precludes the development of additional online resources. This test will be a foundation for future testing, including potentially more language resources.
It was recommended that online responses should be formatted so that respondents can use smartphones to respond when necessary.
The Census Bureau has optimized the survey for mobile platforms and will continue to include mobile platforms during testing and as we review and implement future tests. We will take these recommendations into consideration moving forward as we look to further support mobile response.
Internet Instrument Response Option and Error Message Design Test and Additional Internet Testing – It was recommended that internet instrument design and testing address accessibility requirements for respondents with disabilities, limited English proficiency, and/or limited digital literacy, and include testing of mobile platforms beyond traditional web instrument designs. Two comments recommended developing a “hover” or pop-up feature associated with each question on the internet instrument so that respondents can easily understand how the data collected from each question are used. Additional recommendations included testing alternative design elements, such as dropdown menus or sliders for certain question types (such as age ranges and income brackets), which might streamline the response process and improve the user interface, particularly for more complex questions. Additional research into how error messages are perceived across different demographic groups, particularly those with limited digital literacy or English proficiency, was also recommended.
This testing focuses on changes to layout, not content. However, the Census Bureau will consider these recommendations for potential future internet instrument tests. The ACS internet instrument meets the guidance set forth in Section 508 of the Rehabilitation Act of 1973. Development of a “hover” option and other design elements would need to comply with this law. Information about how the data are used and why the question is asked is presented in “help” for each screen; the Census Bureau will consider alternatives to make this information more readily available to respondents. To the degree that sample size supports the analysis, we will look at differences of tested elements by demographic groups.
Self-Response Mail Materials Testing – Two comments recommended that the ACS improve its branding; increase awareness of the ACS through public education that confronts concerns about confidentiality and data protection head-on; make the survey form and associated guidance materials available in languages beyond English and Spanish; and use locally-targeted messaging. Two other comments noted that more research is needed to better understand how best to improve participation across all communities, particularly among populations that are persistently undercounted in the census and under-covered in the ACS, as well as those who require materials in languages other than English and Spanish. Additional outreach was recommended to address issues surrounding privacy, confidentiality, and data protection.
The Census Bureau will continue to work to improve ACS branding and public education moving forward. We will also work to improve access when possible.
It was recommended to test how best to deploy the message that ACS response is required by law and why that is so to ensure it is effectively motivating.
The Census Bureau will consider this recommendation as future tests are developed.
We received a comment to translate the survey and associated materials into multiple languages beyond English and Spanish and for the translations to be conducted by native speakers.
The Census Bureau will consider these recommendations as we explore materials development.
There was also a recommendation to consult with literacy experts to ensure that in addition to using plain language, the materials use sentence structure that is readily understandable and accessible to those who have literacy issues.
The Census Bureau will continue to use plain language principles as we develop and implement tests and will review materials with this consideration.
There was a comment stressing the importance of research to investigate the sequencing, timing, mode, and content of different types of contact attempts using responsive and adaptive design in order to efficiently promote high response rates.
The Census Bureau appreciates the reminder and will strive to incorporate responsive and adaptive design when formulating future test efforts.
Content Testing – Commenters recommended prioritizing consultation with a broad range of external stakeholders including subject matter experts and individuals with lived experience to ensure the questions are inclusive and serve the needs of affected communities. They note that content testing should address how changes in question wording or formats affect respondents with different educational backgrounds or those with limited English proficiency. Tailoring follow-up interviews to better capture the experiences of these groups could further reduce biases in the data.
The Census Bureau appreciates these comments. We are currently reviewing communication efforts.
Regarding content testing, it was requested that the Census Bureau work with OMB and other federal statistical agencies in the Interagency Council on Statistical Policy to examine the process by which revisions are proposed by agencies. Revisions to the disability items being considered for the ACS may be needed but require appropriate consultation and input. The Census Bureau should provide information notices and hold listening sessions, and the content test could develop specifications for stakeholder engagement.
The Census Bureau and the National Center for Health Statistics (NCHS) carefully reviewed public feedback regarding the suite of disability questions on the ACS. Based on that feedback, we plan to retain the current ACS disability questions for collection year 2025. Along with colleagues at OMB, NCHS, and other statistical agencies, we will continue to assess which, if any, revisions are needed across the federal statistical system to better address those needs. In the past, we have had discussions with the Office of Management and Budget, the National Center for Health Statistics, the Leadership Conference on Civil and Human Rights, the National Council on Disability, and the Consortium for Constituents with Disabilities, and have met with federal agency disability stakeholders, disability community representatives, data users, researchers, and disability advocates. For more information, refer to the Census Bureau Director’s Blog on the Next Steps on the ACS Disability Questions: https://www.census.gov/newsroom/blogs/director/2024/02/next-steps-on-acs-disability-questions.html
Another comment highlighted that understanding if and how the revised SPD 15 questions generate differences in responses is important not only for the Decennial Program but also for the full federal statistical system. The comment stated that additional testing of the revised question both with and without a Some Other Race category is important to understand the impact of the changes, and that the results of this testing should be shared with OMB’s Federal Committee on SPD 15 and the public. The comment also raised concern about the implications of implementing the updated SPD 15 in the 2027 ACS while conducting the ACS Methods Panel from 2025 through 2028.
The Census Bureau is focused on implementing Statistical Policy Directive No. 15 (SPD 15) as soon as practicable. Plans for experimental testing of changes to the updated race and ethnicity standards will be considered at a later time and in collaboration with the Office of Management and Budget (OMB). We will follow the guidance in OMB’s updated standards when the new combined race and ethnicity question is implemented in the ACS. The combined race/ethnicity question was tested with and without a Some Other Race (SOR) category by multiple agencies as part of OMB’s Interagency Technical Working Group (ITWG) on SPD 15. Agencies identified surveys and other collections that could be used for testing the collection of race and ethnicity information using the proposed new combined question and provided the resources (e.g., personnel, funds) necessary to conduct that research. The findings from this research informed OMB’s final decisions on SPD 15. More information about this research can be found in the ITWG’s Testing Team Final Report: https://www2.census.gov/about/ombraceethnicityitwg/annex-2-testing-team-final-report.pdf.
The Census Bureau is participating in OMB’s Federal Committee on Statistical Policy Directive No. 15, which will provide statistical tools such as bridging programs to crosswalk data collected under the 1997 SPD 15 with data collected under the 2024 SPD 15. More information can be found in the Implementation of SPD 15 in the American Community Survey blog post: https://www.census.gov/newsroom/blogs/random-samplings/2024/11/implementation-spd-15-acs.html
Nonresponse Follow-up Data Collection Testing – We received a comment that the Census Bureau should explore ways to reduce respondent burden through alternative nonresponse follow-up data collection approaches, such as more flexible hours for interviews or a broader range of languages for in-person or phone interviews. It was also recommended that the Census Bureau explain why a field representative might follow up with a personal call or visit, and make clear that the personal contact is meant to be helpful, to answer questions and assist with filling out the survey. One comment suggested testing instant language support or translation tools for field representatives who are unable to speak the household’s language.
The Census Bureau will continue to work towards reducing respondent burden and optimizing interview and interviewer methods where possible.
Another comment recommended that internet testing of questionnaire design elements cover the full range of devices and operating systems that respondents may use. In addition, it was stressed that the Census Bureau needs to understand the effects on response of having fewer or additional mailings (and other contact attempts) compared to the baseline of five mailings. More generally, there is a pressing need to investigate the sequencing, timing, mode, and content of different types of contact attempts using responsive and adaptive design to efficiently promote higher response rates.
The Census Bureau will consider this recommendation as future tests are developed.
Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
The Census Bureau does not pay ACS respondents or provide respondents with gifts.
Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy. If the collection requires a system of records notice (SORN) or privacy impact assessment (PIA), those should be cited and described here.
The Census Bureau collects data for this survey under Title 13, United States Code, Sections 141, 193, and 221. All data are afforded confidential treatment under Section 9 of that Title. In accordance with Title 13, each household, GQ administrator, and each person within a GQ participating in the ACS is assured of the confidentiality of their answers. Confidentiality information is sent to sampled housing units in the initial mailing. Respondents who complete the internet questionnaire are presented with additional assurances of the confidentiality and security of their online responses. At the beginning of follow-up interviews, the interviewer explains the confidentiality of data collected and that participation is required by law. The interviewer may also give the household respondent a copy of a letter explaining the confidentiality of all information provided and a Frequently Asked Questions brochure, as appropriate.
ACS data collection is covered under the COMMERCE/CENSUS-5 Decennial Census Programs system of records notice (SORN). Records are maintained to perform methodological evaluations and enhancements for data collection and quality control studies, and to undertake linkages with survey and administrative data for statistical projects as authorized by law and the Census Bureau. Additional information can be found here:
https://www.commerce.gov/opog/privacy/SORN
The Associate Directorate for Decennial Census Programs (ADDCP) American Community Survey Office (ACSO) IT system maintains the ACS data stored and processed on Census Bureau servers. This system interacts with other Census Bureau IT systems to collect, process, and store data. The Privacy Impact Assessment (PIA) for the American Community Survey can be found here: https://www.commerce.gov/sites/default/files/2024-08/ADDCP-ACS-PIA-FY2024_SAOP_Approved_Delegated.pdf
Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
Some of the data the Census Bureau collects on the ACS, such as race, ethnicity, disability, and sources of income and assets, may be considered to be of a sensitive nature. The Census Bureau takes the position that the collection of these types of data is necessary for the analysis of important policy and program issues and has structured the questions to lessen their sensitivity. The Census Bureau has provided guidance to the in-person interviewer on how to ask these types of questions. The Census Bureau has materials that demonstrate how the data from sensitive questions are used and how those data are kept confidential. Respondents who use the internet to complete the survey have access to links on the survey screens that provide information to help address their questions or concerns with sensitive topics.
In 2024, the U.S. Office of Management and Budget (OMB) published the results of its review of Statistical Policy Directive No. 15 (SPD 15) and issued updated standards for maintaining, collecting, and presenting race and ethnicity data across federal agencies. The updated 2024 SPD 15 requires the use of a combined race and ethnicity question, the addition of a new “Middle Eastern or North African” minimum reporting category, and the collection of detailed race and ethnicity responses. The ACS will implement the finalized 2024 SPD 15, published by OMB on March 28, 2024, beginning with the 2027 ACS data collection cycle. The first ACS 1-year estimates produced using the updated standards will be the 2027 ACS 1-year data, planned for release in September 2028.
Testing for ACS will use the same content as production ACS unless specifically testing content changes.
Additional information on the SPD 15 revision is available here: https://spd15revision.gov/content/spd15revision/en/news/2024-10-16-omb-blog.html
Provide estimates of the hour burden of the collection of information.
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under ‘Annual Cost to Federal Government’ (Item #14).
The ACS Methods Panel Testing program includes multiple tests over a period of 3 years. Each test potentially includes a different number of respondents, though the ACS survey has a similar burden in each test. The ACS estimates that the burden to complete the survey for an average household is approximately 40 minutes.
This is the maximum burden requested for these tests. Every effort is taken to use existing ACS sample for testing when the tests do not involve content changes. The use of existing ACS sample results in no additional burden to the public.
Estimated Annualized Respondent Burden Hours
Test | Type of Respondent | Estimated Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in minutes) | Total Burden Hours
Questionnaire Timing Test | Household Respondent | 288,000 | 1 | 40 | 192,000
Response Option and Error Message Design Test | Household Respondent | 288,000 | 1 | 40 | 192,000
Additional Internet Instrument Testing | Household Respondent | Test A – 60,000; Test B – 60,000 | 1 | 40 | 40,000; 40,000
Self-Response Mail Messaging and Contact Strategies Testing | Household Respondent | Test A – 60,000; Test B – 60,000; Test C – 60,000 | 1 | 40 | 40,000; 40,000; 40,000
Content Testing | Household Respondent | 40,000 | 1 | 40 | 26,667
Content Testing Follow-up Interview | Household Respondent | 40,000 | 1 | 20 | 13,333
Nonresponse Follow-up Data Collection Testing | Household Respondent | 60,000 | 1 | 40 | 40,000
Total (over 3 years) | | 1,016,000 | | | 664,000
Annual Burden Hours | | 338,667 | | | 221,333
Estimated Annualized Respondent Costs
Type of Respondent | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in minutes) | Hourly Wage Rate* | Total Burden Costs
Household Respondent | 1,016,000 | 1 | 40 | $31.48 | $21,322,453
Total | -- | -- | -- | -- | $21,322,453
*The wage rate for household respondents is based on the average hourly wage among all occupations as reported by the Bureau of Labor Statistics as of May 2023 (BLS Occupational Outlook Handbook: https://www.bls.gov/bls/blswage.htm).
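The burden-hour and cost figures above follow directly from respondents multiplied by minutes per response. The following sketch (respondent counts and per-response minutes taken from the tables; the 3-year annualization and the 40-minute average applied to all responses in the cost table are as described in the text) shows how the totals can be reproduced:

```python
# Per-test (respondents, minutes per response), as listed in the burden table.
tests = {
    "Questionnaire Timing Test": (288_000, 40),
    "Response Option and Error Message Design Test": (288_000, 40),
    "Additional Internet Instrument Testing": (120_000, 40),          # Tests A + B
    "Self-Response Mail Messaging and Contact Strategies": (180_000, 40),  # Tests A + B + C
    "Content Testing": (40_000, 40),
    "Content Testing Follow-up Interview": (40_000, 20),
    "Nonresponse Follow-up Data Collection Testing": (60_000, 40),
}

# Total burden hours = sum of respondents x (minutes / 60) across all tests.
total_hours = sum(n * minutes / 60 for n, minutes in tests.values())
total_respondents = sum(n for n, _ in tests.values())

# Annualized figures divide the 3-year totals by 3.
annual_hours = total_hours / 3

# The cost table applies the 40-minute average and the BLS May 2023
# average hourly wage ($31.48) to all 1,016,000 responses.
WAGE = 31.48
total_cost = total_respondents * (40 / 60) * WAGE

print(round(total_hours))        # 3-year burden hours
print(round(annual_hours))       # annual burden hours
print(round(total_cost))         # estimated respondent cost
```

Rounding the results reproduces the published totals of 664,000 burden hours over 3 years, 221,333 annual burden hours, and $21,322,453 in respondent costs.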
Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden already reflected on the burden worksheet).
There are no direct costs to responding to the ACS.
Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.
The Methods Panel Tests program is expected to cost an estimated $4,500,000 per year, less than two percent of the annual ACS budget. This cost includes staff time to conduct, analyze, and report on the tests as well as the cost to print materials and collect data.
Explain the reasons for any program changes or adjustments reported in ROCIS.
As an ongoing testing program, the ACS plans testing in 3-year increments in line with the OMB clearance process. The testing proposed here responds to emerging issues related to improving data quality, reducing data collection costs, and improving the ACS questionnaire content and related data collection materials. Changes in technology, emerging societal issues, and advances in survey methodology necessitate testing to maintain the relevance of ACS survey estimates. Additionally, the ACS Methods Panel can serve as a testbed for the decennial census. The agility of this ongoing testing program benefits the larger field of survey methodology and contributes specifically to reducing survey costs, improving data collection efficiencies, and improving data quality for the ACS and the decennial census.
For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
Data collected in the proposed Methods Panel Tests are analyzed at the conclusion of data collection. Results are published online roughly a year after the data collection for the test ends. This allows time to conduct thorough analysis, document the findings accurately, and apply disclosure protections to the summary report.
If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The Methods Panel tests will display the expiration date on materials in line with the methods used for the production ACS. We request that the OMB expiration date not be displayed on the paper questionnaire. The ACS is an ongoing, continuous, and mandatory survey; if an expiration date appeared on the questionnaire, respondents might infer that the survey ends as of that date, which is not the case.
Explain each exception to the certification statement identified in “Certification for Paperwork Reduction Act Submissions."
The Census Bureau certifies compliance with 5 CFR 1320.9 and the related provisions of 5 CFR 1320.8(b)(3) for the ACS Methods Panel Tests program.
File Type | application/vnd.openxmlformats-officedocument.wordprocessingml.document |
File Title | SUPPORTING STATEMENT A |
Author | Joy M Barger (CENSUS/ACSO FED) |
File Modified | 0000-00-00 |
File Created | 2025-05-19 |