SUPPORTING STATEMENT
U.S. Census Bureau
The American Community Survey
OMB Control No. 0607-0810
A. Justification
1. Necessity of the Information Collection
The U.S. Census Bureau requests authorization from the Office of Management and Budget (OMB) for revisions to the American Community Survey (ACS). The Census Bureau has developed a methodology to collect and update demographic, social, economic, and housing data every year that are essentially the same as the "long-form" data that the Census Bureau traditionally has collected once a decade as part of the decennial census. Federal and state government agencies use such data to evaluate and manage federal programs and to distribute funding for various programs that include food stamp benefits, transportation dollars, and housing grants. State, county, tribal, and community governments, nonprofit organizations, businesses, and the general public use information like housing quality, income distribution, journey-to-work patterns, immigration data, and regional age distributions for decision-making and program evaluation.
In years past, the Census Bureau collected the long-form data only once every ten years, and the data became out of date over the course of the decade. To provide more timely data, the Census Bureau developed the ACS. The ACS blends the strength of small area estimation with the high quality of current surveys. There is an increasing need for current data at finer levels of geographic detail. The ACS is now the only source of uniform data about general demographic and housing characteristics for small areas across the Nation and in Puerto Rico. In addition, there is increased interest in obtaining data for small subpopulations such as groups within the Hispanic, Asian, and American Indian populations, the elderly, and children. The ACS provides current data throughout the decade for small areas and subpopulations.
The ACS began providing up-to-date profiles in 2006 for areas and population groups of 65,000 or more people, providing policymakers, planners, and service providers in the public and private sectors with information every year–not just every ten years. The ACS program provides estimates annually for all states and for all medium and large cities, counties, and metropolitan areas. For smaller areas and population groups, it takes three to five years to accumulate information to provide accurate estimates. The first three-year estimates were released in 2008; the first five-year estimates in 2010. Since then, these multiyear estimates have been updated annually.
Using the Master Address File (MAF) from the decennial census, which is updated each year, we select a sample of addresses and mail survey materials each month to a new group of potential households. Most households are asked first to complete the survey via the Internet, with a paper questionnaire provided to those households that do not respond via the Internet. We then attempt to conduct interviews over the telephone with households that have not responded by mail or Internet. Upon completion of the telephone follow-up, we select a sub-sample of the remaining households that have not responded by mail, Internet, or telephone and designate those households for a personal interview. Typically, for personal interviews, we sample at a rate of one in three. We also conduct interviews with a sample of residents at selected group quarters (GQ) facilities. Collecting these data from a new sample of housing units (HUs) and GQ facilities every month provides more timely data and lessens respondent burden in the decennial census.
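To illustrate the subsampling step described above, the following is a minimal sketch, assuming a simple systematic one-in-three selection of nonrespondents for personal-visit follow-up. The function and data names are hypothetical, and actual ACS subsampling rates vary by area; this is not the production selection method.

```python
# Minimal, illustrative sketch only: systematic 1-in-3 subsampling of
# remaining nonrespondents for CAPI (personal visit) follow-up.
# Names are hypothetical; actual ACS subsampling rates vary by area.
import random

def select_capi_subsample(nonrespondents, rate=3, seed=None):
    """Return roughly one in `rate` addresses for personal-visit follow-up."""
    rng = random.Random(seed)
    start = rng.randrange(rate)          # random starting point in [0, rate)
    return nonrespondents[start::rate]   # every `rate`-th address thereafter

# Example: addresses with no Internet, mail, or telephone response
remaining = [f"address_{i:03d}" for i in range(1, 13)]
capi_cases = select_capi_subsample(remaining, rate=3, seed=2016)
print(capi_cases)  # about one third of the remaining addresses
```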
We release a yearly microdata file, similar to the Public Use Microdata Sample (PUMS) file of the Census 2000 long-form records. In addition, we produce total population summary tabulations, similar to the Census 2000 tabulations, down to the block group level. The microdata files, tabulated files, and their associated documentation are available through the Internet.
In January 2005, the Census Bureau began full implementation of the ACS in households with a sample of approximately 250,000 addresses per month in the 50 states and the District of Columbia. In addition, we select approximately 3,000 residential addresses per month in Puerto Rico and refer to the survey as the Puerto Rico Community Survey (PRCS).
In January 2006, the Census Bureau implemented ACS data collection for the entire national population by including a sample of 20,000 GQ facilities and a sample of 200,000 residents living in GQ facilities in the 50 states and the District of Columbia along with the annual household sample. A sample of 100 GQs and 1,000 GQ residents was also selected for participation in the PRCS.
Starting with the June 2011 mail panel, the Census Bureau increased the annual sample size for the ACS to 3,540,000 households (or 295,000 households per month) in the 50 states and the District of Columbia.
The goals of the ACS and PRCS are to:
Provide federal, state, tribal, and local governments an information base for the administration and evaluation of government programs; and
Provide data users with timely demographic, housing, social, and economic data updated every year that can be compared across states, communities, and population groups.
ACS Household Data Collection
Historically, the ACS employed a tri-modal strategy for household data collection: mail, telephone, and personal visit. In 2011, the Census Bureau conducted two tests to assess the feasibility of providing an Internet response option to households that receive survey materials by mail. Based on the results of these tests, the Census Bureau implemented an Internet response option for the ACS at the start of the 2013 data collection.
Detailed reports documenting the methods and results from tests that led to the implementation of this methodology can be found at: http://www.census.gov/acs/www/library/by_series/internet_data_collection/
For households eligible to receive survey materials by mail, the first contact (Attachment A) includes a letter and instruction card explaining how to complete the survey online. Also included are a Frequently Asked Questions (FAQ) brochure and a brochure that provides basic information about the survey in English, Spanish, Russian, Chinese, Vietnamese, and Korean, and provides a phone number to call for assistance in each language. The instruction card provides the information on how to respond in English and Spanish. The letter explains that if the respondent is unable to complete the survey online, a paper questionnaire will be sent later. The Internet version of the questionnaire is available in English and Spanish and includes questions about the HU and the people living in the HU. The Internet questionnaire (Attachment B) has space to collect detailed information for twenty people in the household.
The second mailing is a letter (Attachment C) that reminds respondents to complete the survey online, thanks them if they have already done so, and informs them that a paper form will be sent later if we do not receive their response. This letter includes clear instructions to log in, including an explicit reference to the user identification number.
In a third mailing (Attachment D), the American Community Survey Household (HU) Questionnaire Package is sent only to those sample addresses that have not completed the online questionnaire within two weeks. The content includes a follow-up letter, a paper copy of the questionnaire, an instruction guide for completing the paper form, an instruction card for completing the survey online, a FAQ brochure, and a return envelope. The cover letter with this questionnaire package reminds the household of the importance of the ACS and asks them to respond soon, either by completing the survey online or by returning a completed paper questionnaire.
The fourth mailing (Attachment E) is a postcard that reminds respondents that “now is the time to complete the survey,” informs them that an interviewer may contact them if they do not complete the survey, and reminds them of the importance of the ACS.
A fifth mailing (Attachment F) is sent to respondents who have not completed the survey within five weeks and are not eligible for telephone follow-up because we do not have a telephone number for the household. This postcard reminds these respondents to return their questionnaires and thanks them if they have already done so.
A sixth mailing is sent to those respondents who request a replacement package in Spanish. Similar to the third mailing, but in Spanish, its contents include an introductory letter, a paper copy of the questionnaire, an instruction guide for completing the paper form, an instruction card for completing the survey online, a FAQ brochure, a follow-up letter, and a return envelope. The cover letter with this questionnaire package reminds the household of the importance of the ACS and asks them to respond soon, either by completing the survey online or by returning a completed paper questionnaire.
A second reminder postcard is sent a few days after this mailing, emphasizing the importance of completing the survey.
If we do not receive the completed questionnaire by the cut-off date and we do not have a telephone number on file for the housing unit, an additional reminder postcard is sent.
All of the materials that are sent to respondents who request a replacement package in Spanish, including related reminder postcards, are included in Attachment G.
For sample housing units in Puerto Rico, a different mail strategy is employed. Based on the results of testing in 2011 and concerns with the resulting Internet response rates from that testing, we are delaying the introduction of an Internet response option for Puerto Rico until a later date while we assess the best implementation approach. Therefore, in 2016 for Puerto Rico, we plan to continue to use the previously used mail strategy with no references to an Internet response option (Attachment H). Similar to the stateside mailing strategy, our first Puerto Rico mailing includes a pre-notice letter in Spanish and English.
The second Puerto Rico mailing includes an introductory letter, a FAQ brochure, a copy of the paper questionnaire, an instruction booklet, and a return envelope.
The third Puerto Rico mailing is a reminder postcard.
The fourth Puerto Rico mailing is a replacement package similar to the second mailing and is mailed only to non-respondents.
The fifth Puerto Rico mailing is a reminder postcard that is mailed only to non-respondents not selected for telephone follow-up because we do not have a good telephone number on file for the housing unit.
After the self-response modes of mail and Internet, the next mode of data collection is computer-assisted telephone interviewing (CATI). This is used to conduct telephone interviews for all households that do not respond by Internet or mail and for which we were able to obtain telephone numbers.
The final mode of data collection is computer-assisted personal interviewing (CAPI) and is used to conduct personal interviews for a sample of addresses for which we have not obtained a self-response (paper or Internet) or CATI interview. Both CATI and CAPI instruments are available to interviewers in English and Spanish. We also conduct a CAPI-only operation to collect ACS data from sampled HUs in remote areas of Alaska.
We provide telephone questionnaire assistance (TQA) for respondents who need assistance with completing the paper or Internet questionnaires, who have questions about the survey or who would like to complete the ACS interview over the telephone instead of by other modes. Respondents may call the ACS toll free TQA numbers listed on various ACS mail materials. The TQA staff answers respondent questions and/or completes the entire ACS interview using CATI. Interested households may request a survey form in Spanish (Attachment G) by calling our TQA center. Since May 2012, households are also able to request a Language Assistance Guide in Simplified Chinese or Korean. Copies of these guides are found in Attachments I and J, respectively. For Puerto Rico households, we mail a Spanish version of the questionnaire. Upon request through TQA, respondents are mailed an English version of the PRCS questionnaire and appropriate informational materials (Attachment K).
Previously, we conducted a CATI Failed Edit Follow-up (FEFU) if we had a telephone number and either: 1) respondents omitted answering a set of critical questions that are deemed essential for the questionnaire to be considered complete; or 2) the household had more than five people, so that we could obtain information for the additional members of the household. Starting in October 2012, we scaled back the FEFU operation to focus only on households with coverage problems (such as mail respondents with more than five people, mail respondents with more people listed on the cover than in the basic demographic section, or questionnaires returned for vacant units). We also use the FEFU operation to confirm the status of Internet responses classified as businesses or vacant units and to collect the minimum amount of information needed to further process the questionnaire. If funding allows, we would resume FEFU for mail and Internet returns missing responses to critical questions. The FEFU instrument (Attachment L) is available to interviewers in both English and Spanish.
We also collect information from HUs identified as vacant. We ask a knowledgeable contact to answer the housing questions on the ACS questionnaire, along with some additional questions for these units. Questions on the ACS household CATI and CAPI instruments that are worded differently for vacant units, and those asked in addition to the questions on the household ACS questionnaire, are included in Attachment M.
We conduct a reinterview operation to monitor Field Representative (FR) performance. Only households that provide an interview via CAPI are eligible for this reinterview. For the household reinterview operation, we use a separate set of questions for units that were identified as occupied, vacant, or noninterview at the time of the original CAPI interview. The household ACS Reinterview questions are included in Attachment N.
CAPI interviewers have several tools available for use to explain the ACS to households, including an introductory letter, a thank you letter, a short explanatory brochure, and a longer brochure in question and answer format. Each of these materials is available in English, Spanish, Arabic, Simplified Chinese, French, Haitian-Creole, Korean, Polish, Portuguese, Russian, and Vietnamese. The Census Bureau also provides letters for reluctant CATI and CAPI respondents in English, Spanish, Korean, Simplified Chinese, Russian, and Vietnamese. These letters and brochures can be found in Attachment O.
ACS Group Quarters (GQ) Collection
In addition to selecting a sample of residential addresses, we select a sample of GQs. An introductory letter and FAQ brochure for the facility administrator are mailed to the sample GQ approximately two weeks prior to the period when a field representative (FR) may begin making contact with the GQ. The FR gives the facility contact person a thank you letter when they arrive for the interview (Attachment P). The FRs use the CAPI Group Quarters Facility Questionnaire (GQFQ) in English or Spanish when making initial telephone contact to schedule an appointment to conduct a personal visit at the sample GQ and also use a GQ listing sheet to generate the sub-sample of persons for ACS interviews (Attachment Q).
We use a subset of the ACS HU questions to conduct interviews with sample residents in GQs. Resident-level personal interviews with sampled GQ residents are conducted using CAPI, but bilingual paper questionnaires can also be used for self-response. The GQ CAPI and paper questionnaires contain questions for one person. The GQ resident data collection packages (Attachment R) include an introductory letter, a bilingual Confidentiality Notice, a paper questionnaire (for self-response only), an instruction guide for completing the paper form, a thank you letter, and a copy of the ACS GQ brochure. We conduct a separate operation to collect ACS GQ data from sampled GQs in Federal Prisons and in remote Alaska.
For Puerto Rico sample GQ residents, we use PRCS data collection packages (Attachment S) to collect the GQ data.
We conduct a GQ reinterview (RI) operation to monitor the performance of FRs in conducting the GQFQ interviews. For the GQ RI operation, we use a separate set of questions to verify and monitor the FR interviews at the GQ level (Attachment T).
The Census Bureau is collecting these data under authority of Title 13, United States Code, Sections 141, 193, and 221.
Changes in ACS Content for 2016
The content of the proposed 2016 ACS questionnaire and data collection instruments for both the Housing Unit and Group Quarters operations reflects changes to content and instructions that were proposed as a result of the 2014 ACS Content Review and of findings from concurrent research and testing.
The American Community Survey (ACS) is one of the Department of Commerce's most valuable data products, used extensively by businesses, non-governmental organizations (NGOs), local governments, and many federal agencies. In conducting this survey, the Census Bureau's top priority is respecting the time and privacy of the people providing information while preserving its value to the public. The 2016 survey content changes are the initial step in a multi-faceted approach to reducing respondent burden. The Census Bureau is currently carrying out this program of research, which includes several components, as discussed briefly below.
One of the areas with strong potential to reduce respondent burden is reusing information already supplied to the federal government in lieu of directly collecting it again through particular questions on the ACS. The Census Bureau is conducting groundbreaking work aimed at understanding the extent to which existing government data can reduce redundancy and improve efficiency. The tests we are conducting in the next two years will tell us whether existing government records can provide substitute data for households that have not responded to the ACS.
In addition, we continue to look into the possibility of asking questions less often, beginning with initial efforts on the marital history series of questions. For example, we might ask a question every other year, every third year, or of only a subset of respondents each year. We also want to examine ways we can better phrase our questions to reduce respondent concern, especially for those who may be sensitive to providing information.
The outcome of these future steps will be a more efficient survey that minimizes respondent burden while continuing to provide quality data products for the nation. We expect to make great progress during fiscal year 2015 on this front and will report our progress to the Secretary of Commerce at the end of the fiscal year.
Since the founding of the nation, the U.S. Census has mediated between the demands of a growing country for information about its economy and people and concerns about privacy and respondent burden. Beginning with the 1810 Census, Congress added questions to support a range of public concerns and uses, and over the course of a century questions were added about agriculture, industry, and commerce, as well as occupation, ancestry, marital status, disabilities, and other topics. In 1940, the U.S. Census Bureau introduced the long form; since then, the more detailed questions have been asked of only a sample of the public.
The ACS, launched in 2005, is the current embodiment of the long form of the census, and is asked each year of a sample of the U.S. population in order to provide current data needed more often than once every ten years. In December of 2010, five years after its launch, the ACS program accomplished its primary objective with the release of its first set of estimates for every area of the United States. The Census Bureau concluded it was an appropriate time to conduct a comprehensive assessment of the ACS program. This program assessment focused on strengthening programmatic, technical, and methodological aspects of the survey to assure that the Census Bureau conducts the ACS efficiently and effectively.
In August 2012, the OMB and the Census Bureau chartered the Interagency Council on Statistical Policy (ICSP) Subcommittee on the ACS to “provide advice to the Director of the Census Bureau and the Chief Statistician at OMB on how the ACS can best fulfill its role in the portfolio of Federal household surveys and provide the most useful information with the least amount of burden.” The Subcommittee charter also states that the Subcommittee would be expected to “conduct regular, periodic reviews of the ACS content…designed to ensure that there is clear and specific authority and justification for each question to be on the ACS, the ACS is the appropriate vehicle for collecting the information, respondent burden is being minimized, and the quality of the data from ACS is appropriate for its intended use.”
The formation of the ICSP Subcommittee on the ACS and the aforementioned assessment of the ACS program also provided an opportunity to examine and confirm the value of each question on the ACS, which resulted in the 2014 ACS Content Review. This review, which was an initial step in a multi-faceted approach of a much larger content review process, included examination of all 72 questions contained on the 2014 ACS questionnaire, including 24 housing-related questions and 48 person-related questions.
The Census Bureau proposed two analysis factors, which the ICSP Subcommittee accepted: benefit, defined by the level of usefulness, and cost, defined by the level of respondent burden or difficulty in obtaining the data. Based on a methodology pre-defined by the Census Bureau with the input and concurrence of the ICSP Subcommittee on the ACS, each question received between 0 and 100 points based on its benefits and between 0 and 100 points based on its costs. These points were then used as the basis for creating four categories: High Benefit and Low Cost; High Benefit and High Cost; Low Benefit and Low Cost; or Low Benefit and High Cost. For this analysis, any question that was designated as either Low Benefit and Low Cost or Low Benefit and High Cost, and was NOT designated as Mandatory (i.e., statutory) by the Department of Commerce Office of General Counsel (OGC) or NOT Required (i.e., regulatory) with a sub-state use, was identified as a potential candidate for removal. The Department of Commerce OGC worked with its counterparts across the federal government to determine mandatory, required, or programmatic status, as defined below (an illustrative sketch of this classification logic follows the definitions):
Mandatory – a federal law explicitly calls for use of decennial census or ACS data on that question
Required – a federal law (or implementing regulation) explicitly requires the use of data and the decennial census or the ACS is the historical source; or the data are needed for case law requirements imposed by the U.S. federal court system
Programmatic – the data are needed for program planning, implementation, or evaluation and there is no explicit mandate or requirement.
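The following is a minimal sketch of the benefit/cost classification logic described above, written in Python for illustration only. The 50-point threshold, field names, and example scores are hypothetical; the actual scoring methodology and results are documented in Attachment U.

```python
# Illustrative sketch of the 2014 Content Review classification logic.
# Thresholds, field names, and scores are hypothetical; see Attachment U.
from dataclasses import dataclass

@dataclass
class Question:
    name: str
    benefit: int             # 0-100 points based on benefits (usefulness)
    cost: int                # 0-100 points based on costs (burden/difficulty)
    mandatory: bool          # statutory use identified by Commerce OGC
    required_substate: bool  # regulatory requirement with a sub-state use

def category(q: Question, threshold: int = 50) -> str:
    """Place a question into one of the four benefit/cost categories."""
    benefit = "High Benefit" if q.benefit >= threshold else "Low Benefit"
    cost = "High Cost" if q.cost >= threshold else "Low Cost"
    return f"{benefit} and {cost}"

def removal_candidate(q: Question, threshold: int = 50) -> bool:
    """Low Benefit questions that are neither Mandatory nor Required
    (with a sub-state use) were flagged as potential removal candidates."""
    return (q.benefit < threshold
            and not q.mandatory
            and not q.required_substate)

# Hypothetical example
q = Question("Business/Medical Office on Property",
             benefit=20, cost=30, mandatory=False, required_substate=False)
print(category(q))           # Low Benefit and Low Cost
print(removal_candidate(q))  # True
```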
Based on the analysis, the following questions were initially proposed for removal:
Housing Question No. 6—Business/Medical Office on Property
Person Question No. 12—Undergraduate Field of Degree
Person Question No. 21—(In the Past 12 mos, did this person) Get Married, Widowed, Divorced
Person Question No. 22—Times Married
Person Question No. 23—Year Last Married
For reports that provide a full description of the overall 2014 ACS Content Review methods and results, see “Final Report - American Community Survey FY14 Content Review Results” (Attachment U); additional reports about the 2014 ACS Content Review are also available at http://www.census.gov/acs/www/about_the_survey/methods_and_results_report/
Removal of the Question on Business/Medical Office on Property
Regarding the business/medical office on property question, the Census Bureau received 41 comments from researchers and individuals. Most of these comments came from researchers who felt that the Census Bureau should keep all of the proposed questions in order to keep the survey content consistent over time, or who felt that modifications to the question could potentially make it more useful. Housing Question No. 6—Business/Medical Office on Property is currently not published by the Census Bureau in any data tables. The only known use of the question is to produce a variable for the Public Use Microdata Sample (PUMS), a recode for the Specified Owner (SVAL) variable that allows users to make comparisons with other datasets. The Content Review did not reveal any uses by federal agencies, and the comments to the Federal Register notice did not reveal any non-federal uses. Additionally, no uses were uncovered in meetings with stakeholders, data user feedback forms, or other methods employed to understand the uses of ACS data. Lastly, independent research conducted on behalf of the Census Bureau did not uncover any further uses. Though the question has a low cost, it has no benefit to federal agencies, the federal statistical system, or the nation. The Census Bureau plans to remove this question, beginning with the 2016 ACS content.
Retention of the Question on Undergraduate Field of Degree
Regarding the field of degree question, the Census Bureau received 625 comments from researchers, professors and administrators at many universities, professional associations that represent science, technology, engineering and mathematics (STEM) careers and industries, members of Congress, the National Science Foundation, and many individuals interested in retaining this question. A number of commenters (92) cited the importance of these estimates for research that analyzes the effect of field of degree choice on economic outcomes, including earnings, education, occupation, industry, and employment. University administrators (37) commented that this information allows for analysis of postsecondary outcomes, and allows them to benchmark their graduates’ relative success in different fields as well as to plan degree offerings. While some commenters used the estimates to understand fields such as humanities or philosophy (56), the majority of these comments (125) addressed the value of knowing about the outcomes of people who pursued degrees in science, technology, engineering and mathematics. These commenters felt that knowing more about the people currently earning STEM degrees and the people currently working in STEM fields would enable universities, advocacy groups, and policy makers to encourage more people to pursue STEM careers, and to encourage diversity within STEM careers.
The initial analysis of Person Question No. 12—Undergraduate Field of Degree did not uncover any evidence that the question was Mandatory or Required. However, comments to the Federal Register notice uncovered the existence of a relationship between the Census Bureau and the National Science Foundation, dating back to 1960. Over the course of this established relationship, long-form decennial census data was used as a sampling frame for surveys that provided important information about scientists and engineers. These comments demonstrated that the Field of Degree question on the ACS continues this historical use of decennial long-form and ACS data for this purpose, and makes this process more efficient. Many commenters (58) also cited the necessity of the National Survey of College Graduates (NSCG), and recommended retaining the question because it is needed as a sampling frame for the NSCG. Though commenters theorized that the NSCG might still be able to produce STEM estimates without the ACS, a number of commenters (16) thought that doing so would be very expensive, costing as much as $17 million more (1).
Additionally, many comments indicated uses of this question to understand the economic outcomes of college graduates at local geographic levels, especially those with STEM degrees. These commenters included professional, academic, congressional, and policy-making stakeholders who expressed concerns that the absence of statistical information about STEM degrees would harm the ability to understand the characteristics of small populations attaining STEM degrees. Given the importance of this small population group to the economy, the federal statistical system, and the nation, and bolstered by the historical precedent brought to light by commenters to the Federal Register notice, the Census Bureau plans to retain this question on the 2016 ACS.
Retention of the Questions on Marital History, including Changes in Marital Status, Year Last Married, and Number of Times Married
Regarding the marital history questions, the Census Bureau received 1,361 comments from researchers and professors, professional associations that represent marriage and family therapists, the Social Security Administration (SSA), and many individuals interested in retaining these questions. SSA commented that it uses the marital history questions to estimate future populations by marital status as part of the Board of Trustees annual report on the actuarial status (including future income and disbursements) of the Old-Age and Survivors Insurance (OASI) and Disability Insurance (DI) Trust Funds. The Department of Health and Human Services (HHS) also uses these questions to distinguish households in which a grandparent has primary responsibility for a grandchild or grandchildren, as well as to provide family formation and stability measures for the Temporary Assistance for Needy Families (TANF) program.
The proposed elimination covered the marital history questions only, with no change to the collection of marital status. Over 400 additional comments to the Federal Register notice cited concerns that the proposed elimination of the marital history questions indicated that the government views information about marriage as somehow less valuable than other ACS question topics that were not proposed for removal. While the Census Bureau had always planned to continue collecting information about the “marital status” of each person in a household (Person Question No. 20) and their relationships to each other (Person Question No. 2), the Census Bureau remains sensitive to these criticisms.
More than 100 supporters of retaining the marital history questions mentioned their utility for research into marital status changes over time, and they correctly noted that there is currently no other national source of marital history information. As a result, many commenters felt they would not be able to compare marriage characteristics and patterns with those of other nations in the same depth that is possible today. Similarly, without these questions, the commenters felt that analysis of changes in marriage events (especially those due to changing societal values and pressures or policy changes) would be less robust. In particular, comments focused on six research areas that would be more difficult to analyze without the marital history questions:
Family formation and stability (23)
Patterns/trends of marriage and divorce (168)
Marital effects on earnings, education and employment (45)
Marital effects on child wellbeing (6)
Same-sex marriages, civil unions and partnerships (70)
New government policy effects on marriage (9)
Because the initial analysis of Person Question Nos. 21-23 on marital history did not uncover any evidence that data from these questions were “Required” for federal use at sub-state geographies, those questions received a lower benefit score than many other ACS questions. However, in deference to the very large number (1,367) of comments received on the Census Bureau proposal to eliminate those questions, the Census Bureau plans to retain those questions on the 2016 ACS.
The Census Bureau takes respondent concerns very seriously and recognizes that the Content Review and the resulting proposed question changes discussed above are only initial steps toward addressing them. The Census Bureau has implemented an extensive action plan to address respondent burden and concerns. The work completed, and the comments received, on the 2014 Content Review provide a foundation for ongoing and future efforts to reduce burden and concerns. In addition to the immediate content changes proposed above, the Census Bureau is also currently testing language on the survey materials that may cause concern, such as the reminder that responses are required by law. In order to be responsive to concerns about the prominence of the mandatory message on the envelopes, we are conducting research with a subset of ACS respondents in May 2015. Over the summer, we will work with external methodological experts to test other revisions of the ACS mail materials to gauge respondent perceptions of softened references to the mandatory nature of participation in the ACS. The preliminary results of those tests will be available in the fall, and the Census Bureau will make changes to the 2016 ACS mail materials based on those results.
Concurrently, we are also identifying additional questions that we may need to ask only intermittently, rather than each month or year. The current ACS sample design asks all of the survey questions of all selected households in order to produce estimates each year for small geographies and small populations. However, during the Content Review we learned about more than 300 data uses that federal agencies require to implement their missions. We see several potential opportunities either to include some questions periodically or to ask them of a smaller subset of ACS respondents in cases where those agencies do not need certain data annually. The Census Bureau plans to engage the federal agencies and external experts on this topic during 2015. In addition, we need to assess the operational and statistical issues associated with alternate designs. The alternate designs will result in a reduction in the number of questions asked of individual households.
We are also conducting research on substituting the direct collection of information with the use of information already provided to the government. It is possible that the Census Bureau could use administrative records from federal and commercial sources in lieu of asking particular questions on the ACS.
Lastly, we are examining our approaches to field collection to reduce the number of in-person contact attempts while preserving data quality. For example, based on research conducted in 2012, we implemented changes in 2013 that led to an estimated reduction of approximately 1.2 million call attempts per year, while sustaining the 97 percent response rate for the survey overall. For the personal visit operation, we are researching a reduction in the number of contact attempts. We plan to field test this change in August 2015. If successful, we would implement it nationwide in spring 2016.
We will continue to look for other opportunities to reduce respondent burden while maintaining survey quality. Taken together, these measures will make a significant impact on reducing respondent burden in the ACS. In fact, as we have been accelerating our research program in parallel with the content review, we are proposing several additional immediate changes to the 2016 ACS.
Changes in 2016 ACS Content resulting from Cognitive Testing on Computer Usage and Internet Questions
In early 2013, the Census Bureau began to reach out to Federal agency stakeholders through the forum provided by the OMB Interagency Committee for the ACS to identify possible question changes to be considered for the 2016 ACS Content Test. The ICSP Subcommittee on the ACS conducted an initial review of the proposals received from these Federal agencies and identified a set of topics approved for the formation of topical subcommittees. These topical subcommittees worked with the Census Bureau to develop proposed question wording, which was refined through multiple rounds of cognitive testing in 2014 and 2015.
During the course of the preparations for the 2016 ACS Content Test, attention was given to the computer usage and Internet series of questions (questions 9 through 11 on the ACS-1(HU) questionnaire). When this series of questions was added to the production ACS questionnaire in 2013, it was clear that the quickly evolving nature of the computing devices available and the ways individuals access the Internet would cause the series to become out of date quickly. Cognitive testing of these questions in 2014 brought to light difficulties respondents face when answering the current versions of these questions, difficulties that were corroborated by the metrics collected during the ACS Content Review. Specifically, the technical terms and the types of devices and Internet services referenced in the current questions are not easily reconciled with the devices and Internet services used by households today. Additionally, there is evidence in the production data that respondents are misreporting their usage of tablets, since there is no clear category that references tablet computers. Proposed changes to bring the question wording more in line with current devices and Internet services were shown to be effectively understood during the cognitive testing process. Therefore, in order to improve the quality of the ACS data and to reduce the difficulty respondents experience when answering these questions, the Census Bureau proposes revising these questions as outlined in Attachment W. Given the timing of the receipt of the cognitive testing results, the proposal to revise these questions in the 2016 ACS was not included in the October 31, 2014 notice in the Federal Register.
In order to ensure that question changes are effective at collecting high quality data, current policy requires that proposed revisions to questions first be cognitively tested and, if successful, that the results of the cognitive testing be used as input to a field test that utilizes multiple ACS modes of collection. However, the current concerns with the computer use and Internet questions suggest that, in some instances, the ACS program needs to be more nimble in making changes than our current process for cognitive and field testing allows. Therefore, we are evaluating, on a pilot basis, incorporating the following criteria into the pretesting requirements of the ICSP Subcommittee on the ACS to determine when to implement changes without field testing:
The external environment related to the topic being measured has changed such that there is evidence of significant measurement error in the absence of a question change.
Cognitive testing has been conducted on versions of the question accounting for multiple modes of administration (such as self-response and interviewer-administered) and the results have led to clear recommendations on the specific changes to make.
There is evidence that implementing changes to the production versions of the question should be done on a timeline that makes field testing unfeasible, OR the Census Bureau has not received sufficient funding to conduct field testing.
If all of these criteria are met, then a change to ACS question wording could be considered without field testing. Regular reviews and analysis would continue to evaluate any questions changed under this policy, allowing the Census Bureau to preserve the quality of the ACS data and to be more responsive in making question wording changes that reflect the changing environment.
Changes in 2016 ACS Content Concerning the Flush Toilet Section of the Plumbing Facilities Question
Traditionally, the means of determining substandard housing has involved identifying housing that lacks complete plumbing facilities or complete kitchen facilities. Until 2008, the Census Bureau asked one question to determine complete plumbing facilities: “Does the house, apartment or mobile home have COMPLETE plumbing facilities; that is, 1) hot and cold running water, 2) flush toilet, and 3) bathtub or shower?” Similarly, the Census Bureau used one question to determine complete kitchen facilities (sink with a faucet, stove or range, and a refrigerator). In 2008, in conjunction with our stakeholders, we broke the plumbing and kitchen facilities questions into six sub-parts in order to ask about each component separately. Having data available for each sub-part has enabled us to better understand the impact of asking each one, including the flush toilet component. As we have accelerated our research into this topic, we have learned that there are very few instances where flush toilets alone determine the existence of substandard housing. After consultation with some of our key stakeholders, the Census Bureau believes that the flush toilet question places unnecessary burden on the American public relative to the value of the information gained from it, and recommends that it be removed in the 2016 ACS (as shown in Attachment V), though we will continue to work with stakeholders to explore how this information can be collected apart from the ACS.
Changes in 2016 ACS Mailing Procedures
Based on the results of testing conducted in 2015, the Census Bureau is proposing to modify the mail out strategy for the ACS by eliminating the previously used pre-notice letter, and by replacing the first reminder postcard with a reminder letter (Attachment C). The testing has shown that this change increases response to the online questionnaire, and reduces the total number of mailings sent to households by eliminating one entire mailing and replacing a postcard with a letter. (These modified procedures are also included in the above ACS Household Data Collection section, pp. 3-4.)
For households eligible to receive survey materials by mail, the first contact includes a letter and instruction card explaining how to complete the survey online. Also included are a Frequently Asked Questions (FAQ) brochure and a brochure that provides basic information about the survey in English, Spanish, Russian, Chinese, Vietnamese, and Korean, and provides a phone number to call for assistance in each language. The instruction card provides the information on how to respond in English and Spanish. The letter explains that if the respondent is unable to complete the survey online, a paper questionnaire will be sent later. The Internet version of the questionnaire is available in English and Spanish and includes questions about the HU and the people living in the HU. The Internet questionnaire has space to collect detailed information for twenty people in the household.
The second mailing is a letter that reminds respondents to complete the survey online, thanks them if they have already done so, and informs them that a paper form will be sent later if we do not receive their response. This letter includes clear instructions to log in, including an explicit reference to the user identification number.
In a third mailing, the American Community Survey Household (HU) Questionnaire Package is sent only to those sample addresses that have not completed the online questionnaire within two weeks. The content includes a follow-up letter, a paper copy of the questionnaire, an instruction guide for completing the paper form, an instruction card for completing the survey online, a FAQ brochure, and a return envelope. The cover letter with this questionnaire package reminds the household of the importance of the ACS and asks them to respond soon, either by completing the survey online or by returning a completed paper questionnaire.
The fourth mailing is a postcard that reminds respondents that “now is the time to complete the survey,” informs them that an interviewer may contact them if they do not complete the survey, and reminds them of the importance of the ACS.
A fifth mailing is sent to respondents who have not completed the survey within five weeks and are not eligible for telephone follow-up because we do not have a telephone number for the household. This postcard reminds these respondents to return their questionnaires and thanks them if they have already done so.
2. Needs and Uses
The primary necessity for continued full implementation of the ACS is to provide comparable data for small geographies, including metropolitan and micropolitan areas, as well as at the census tract and block group level. These data are used by federal agencies and others to ensure the continued availability of long-form-type data since the elimination of the long form from the 2010 Census. The 2014 ACS Content Review collected information about how ACS estimates are being used to meet current federal data needs; the following are examples of these uses:
Federal agencies frequently use ACS data as an input for a funding allocation formula. The Department of Housing and Urban Development (HUD) uses state, county, and metropolitan area level ACS median income estimates to allocate Section 8 Housing funds and to set Fair Market Rents for metropolitan areas.1 Both these calculations use a yearly update factor based on ACS data and earlier data (currently from the Census 2000 Long Form, though HUD is in the process of phasing this out).2
Federal agencies also fund state and local programs through block grants that are administered and evaluated at the state and local level. The data collected via the ACS are useful not only to federal agencies in determining program requirements but also to state, local, and tribal governments in planning, administering, and evaluating programs. For example, within the Department of Health and Human Services (HHS), the Community Services Block Grant program uses ACS data at the county level to determine the allocation of funds from states to eligible entities, to determine guidelines used for participant eligibility, and to assess the need for assistance for low-income households, including elderly low-income households.3 Additionally, the USDA’s Food and Nutrition Service (FNS) provides states and school districts data based on ACS poverty estimates in order to evaluate their Supplemental Nutrition Assistance Program (SNAP).4
Federal agencies find value in using ACS estimates to understand characteristics of population groups in order to make program decisions. The Federal Communications Commission uses computer and Internet use estimates to assist in evaluation of the extent of access to, and adoption of, broadband.5 Additionally, HHS uses disability, health insurance and other estimates to measure, report, and evaluate health disparities and improvements in health equity.6
Some federal agencies use ACS data to estimate future needs; the ACS provides more timely data for use in estimation models that produce estimates of various concepts for small geographic areas. The Department of Transportation’s Federal Highway Administration (FHWA) uses American Community Survey Journey to Work estimates (including means of transportation, the time a worker leaves the house to go to work, travel time, and work location) to create traffic flow models.7 These flow patterns are used by both the FHWA and state transportation agencies to plan and fund new road and other travel infrastructure projects. Additionally, the Department of Energy uses ACS estimates to project residential energy demand over the next 30 years, which is detailed in the Energy Information Administration’s (EIA) Annual Energy Outlook (AEO), the premier source for assessing the energy needs of the U.S. economy in a domestic and international context.
The Census Bureau continues to examine the operational issues, research the data quality, collect cost information, and make recommendations for the future of this annual data collection.
Information quality is an integral part of the pre-dissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau's Information Quality Guidelines). Information quality is also integral to the information collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act of 1995.
Additional question by question justification can be found in Attachment V.
3. Use of Information Technology
We use Internet, CATI, and CAPI technologies for collecting data from households for the ACS. These technologies allow us to skip past questions that may be inappropriate for a person or household, which, in turn, keeps respondent burden to a minimum. We use CAPI technologies for collecting information from GQ facilities to accurately classify the GQs by type and to generate a sample of residents at the GQs. CAPI is also used to conduct personal interviews with GQ residents. We use CAPI technologies for both the HU and GQ reinterview operations. Additionally, by continuing to offer an Internet response option in the ACS, the Census Bureau is taking further steps to comply with the e-gov initiative. Early implementation has shown that the Internet response option also slightly improves self-response rates and creates cost savings by reducing printing and data capture costs as well as workloads for more costly follow-up operations.
4. Efforts to Identify Duplication
The ACS is the instrument used to collect long-form data that has traditionally been collected only during the decennial census. The content of the ACS reflects topics that are required directly or indirectly by the Congress and that the Census Bureau determines are not duplicative of another agency’s data collection. A number of questions in the ACS appear in other demographic surveys, but the comprehensive set of questions, coupled with the tabulation and dissemination of data for small geographic areas, does not duplicate any other single information collection. Moreover, many smaller Federal and non-Federal studies use a small subset of the same measures in order to benchmark those results to the ACS, which is often the most authoritative source for local area demographic data.
In addition, the OMB Interagency Committee for the ACS, co-chaired by OMB and the Census Bureau, includes more than 30 participating agencies and meets periodically to examine and review ACS content. This committee provides an extra safeguard to ensure that other agencies are aware of the ACS content and do not duplicate its collection and content with other surveys.
5. Minimizing Burden
Research and data from survey administrators indicate that the ACS HU questionnaire takes an estimated 39 minutes to complete, CATI/CAPI data collection takes an estimated 27 minutes, and response via the Internet takes an estimated 39 minutes. The GQ facility questionnaire takes an estimated 15 minutes to complete, and the ACS GQ questionnaire takes an estimated 25 minutes to complete. Every effort is taken to minimize the time needed for respondents or GQ contacts to answer the questions in all ACS data collection operations.
6. Consequences of Less Frequent Collection
A less frequent data collection plan would preclude the Census Bureau's goal of producing data annually in order to examine year-to-year changes in estimates. The ACS is conducted monthly because collecting data every month provides the most accurate annual average of many survey items that can vary by month or season. A monthly survey also helps us stabilize workloads across the year for CATI and CAPI operations and account for seasonal changes that occur.
7. Special Circumstances
The Census Bureau collects these data in a manner consistent with the OMB guidelines.
8. Consultations Outside the Agency
In August 2012, the Office of Management and Budget (OMB) in conjunction with the Census Bureau established a Subcommittee of the Interagency Council on Statistical Policy (ICSP) on the ACS. The ICSP Subcommittee on the ACS exists to advise the Chief Statistician at OMB and the Director of the Census Bureau on how the ACS can best fulfill its role in the portfolio of Federal household surveys and provide the most useful information with the least amount of burden. It may also advise Census Bureau technical staff on issues they request the subcommittee to examine or that otherwise arise in discussions. The ICSP Subcommittee on the ACS reviewed the proposed 2016 ACS content changes and recommended their approval to the OMB and Census Bureau.
Decisions on the content of the ACS moving forward are supported by the findings of the 2014 ACS Content Review, extensive consultation during meetings with the ICSP Subcommittee on the ACS, Census advisory groups, and other federal agencies. In addition, we have consulted with federal agencies most impacted by the removal of questions from the ACS, including experts at the National Science Foundation and the Social Security Administration.
Since the inception of the Content Review in 2013, the Census Bureau has been deliberate in casting a wide net to ensure ACS stakeholders are aware of the project and understand the steps involved in reaching an outcome. The communications strategy for the Content Review has included outreach to a range of groups, starting with the 23 federal agencies that participated in the effort. The Census Bureau launched its efforts to work with these agencies through a kickoff summit held in the spring of 2014 and implemented an electronic data collection process to gather input. A variety of communications mechanisms were also implemented to correspond with the agencies and answer questions. Finally, a technical briefing was provided in the fall of 2014 to agencies affected by the proposed removal of survey content, as well as a general briefing for all participating agencies regardless of impact.
Beyond federal agencies, the Census Bureau has also promoted awareness and provided specific Content Review information in forums to state and local governments, academia, Congress, the public, data users, Census advisory groups, the business community, Census data dissemination support networks, think-tanks, non-profit associations, grant writers, and many scientific and professional organizations.
Many examples can be given to illustrate the depth and breadth of these outreach activities. The Census Bureau has sent periodic updates on the Content Review to the members of its ACS mailing lists (via GovDelivery). We have also utilized the ACS Data Users Group, established in 2012, to promote awareness of the process among survey data users and to encourage participation.
With regard to Census advisory groups, we have engaged regularly with the National Advisory Committee (NAC) on Race and Ethnicity and the Census Scientific Advisory Committee (CSAC), beginning with the respective fall 2013 meetings of those committees, in order to provide program updates and solicit input on the Content Review. Furthermore, during the summer of 2014, the NAC chartered a working group to provide input to the Census Bureau on the utility of ACS questions and to examine the burden imposed on respondents, especially with regard to sensitivities raised with small population groups. This input has been considered in the Content Review process. Also, during June and July of 2014, the Census Bureau solicited feedback from the larger data user community on the most useful or most frequently used questions on the ACS. A total of 932 individual responses were received, and this input also has been considered in the Content Review.
The Census Bureau also conducted specific outreach concerning the 60-day Federal Register notice (posted October 31 to December 30, 2014) to encourage a strong response. A total of 1,693 comments was received from many different organizations and individuals. The majority of comments came from individuals who did not identify an affiliation with an organization (801), followed by commenters from academia (591), nonprofits (131), government (70), business (45), university administrators (34), Census stakeholders (15), and media (6). This volume of comments about the ACS is unprecedented in the history of the survey. The Census Bureau was pleased to hear from such a large number of diverse organizations, agencies, and individuals, including but not limited to the following:
Congressional Black Caucus
House of Representatives Committee on Science, Space and Technology
National Science Foundation
Social Security Administration
Department of Housing and Urban Development
Department of Health and Human Services
Florida Legislature
State of North Carolina
Committee on National Statistics (CNSTAT)
Census Scientific Advisory Committee (CSAC)
Members of the Census National Advisory Committee (NAC)
State Data Center affiliates
Census Information Center affiliates
Research Institutes including:
Brookings Institution
Pew Research
Urban Institute
American Enterprise Institute
Professional Associations including:
Council of Professional Associations on Federal Statistics (COPAFS)
Population Association of America (PAA) and the Committee on Population Statistics
National Council on Family Relations (NCFR)
American Council on Education
American Sociological Association
Council of Graduate Schools
Universities including:
Bowling Green University
University of Michigan
University of Minnesota
The transparent approach the Census Bureau has followed in the Content Review extends to establishing and maintaining a location on the Census Bureau’s website where project background materials and methodological reports can be viewed: http://www.census.gov/acs/www/about_the_survey/acs_content_review/
We will continue to maintain the website, adding the public comment materials that support the Census Bureau’s recommendations to OMB and OMB’s final decisions on 2016 survey content. Regarding the specific comments on the Federal Register notice, many comments were identical, using templates developed and shared by people with similar interests. However, more than 70 percent of the comments were unique. Rather than address each individual comment, this statement addresses the themes that appeared throughout the entire set of comments. Numbers that appear parenthetically represent the number of comments on each theme.
ACS and the 2014 ACS Content Review
In general, many commenters (207) felt that the ACS provides important estimates and a good value to the public. Some commenters (112) also suggested modifications to existing questions or additional questions, citing the advantages of a large-sample continuous survey for collecting estimates that are potentially valuable to them. The suggested additions included many questions that have previously been tested and considered for inclusion on the ACS, such as health insurance marketplace and parental place of birth questions. While the Census Bureau recognizes and appreciates the interest of federal partners, stakeholders, and other data users in the collection of data through the ACS, the process for proposing new and revised questions involves more than reviewing suggestions. Because participation in the ACS is mandatory, OMB will approve only questions that are necessary for inclusion on the ACS. The uses of the data must be identified to determine the appropriateness of collecting them through a national mandatory survey, and because ACS data are collected and tabulated at the census tract or block-group level, the response burden on the majority of respondents must be considered and minimized. More information about our Content Policy is available in section 5.4, “Content Policy and Content Change Process,” of the American Community Survey Design and Methodology report (January 2014), available at http://www.census.gov/acs/www/Downloads/survey_methodology/acs_design_methodology_report_2014.pdf
While most comments addressed the questions considered for removal, 88 comments discussed the ACS Content Review process itself. Eight comments specifically commended the process, while 12 criticized it. Of the 88 comments, 27 felt that the process did not consider the value of providing data for small populations and population subgroups. The Census Bureau agrees that the ACS is a very valuable tool for the analysis of small population groups. However, defining and identifying small population groups would have been problematic in this analysis, as nearly any group can become a small population group when social characteristics are tabulated at sub-state geographies.
Similarly, 10 commenters felt either that the analysis did not consider non-federal data uses or that it did not consider them to be of equivalent value. The Census Bureau recognizes and appreciates these non-federal uses; however, an important part of the analysis was understanding how questions are used by federal agencies. Because the Census Bureau provides ACS data at no cost to data users and without any registration requirement, it is not possible to identify the diverse and vast number of non-federal uses. Thus, non-federal data users’ input was incorporated through extensive communication efforts and through the Federal Register notice.
The remaining 34 commenters felt that the reduction in respondent burden resulting from the removal of these questions would be insignificant. While removing the proposed questions reduces burden by just one minute per sample household, this step is only the first toward reducing respondent burden. Research will continue into the questions identified as high cost (including those with a high number of relative seconds to answer) in an effort to make the ACS the highest-value, lowest-cost survey possible.
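As a rough illustration only (this calculation is not part of the official burden estimate), applying that one minute per sample household to the approximately 3,540,000 households sampled annually, as shown later in Table 1, works out to roughly 59,000 burden hours per year:

    # Rough, unofficial illustration of the one-minute-per-household reduction.
    # The 3,540,000 annual household sample figure is taken from Table 1 of
    # this document.
    annual_household_sample = 3_540_000
    minutes_saved_per_household = 1

    hours_saved = annual_household_sample * minutes_saved_per_household / 60
    print(f"Approximate annual burden hours saved: {hours_saved:,.0f}")
    # Approximate annual burden hours saved: 59,000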
Business on Property Question
Relatively few comments (41) were received about the Business on Property question. Most of these comments (30) came from researchers who felt that the Census Bureau should keep all of the proposed questions in order to keep the survey content consistent over time. Other comments (4) theorized that modifications to the question could make it more useful, while one commenter objected to the method for calculating owner-occupied unit value estimates. The Census Bureau appreciates that keeping survey content consistent does help with comparability and trend analysis over time. However, none of the comments mentioned current uses of estimates from this question.
Field of Degree Question
Regarding the field of degree question, the Census Bureau received comments from researchers, professors and administrators at many universities, professional associations that represent STEM careers and industries, members of Congress, and the National Science Foundation.
Commenters frequently cited the importance of these estimates for research that analyzes the effect of field of degree choice on economic outcomes, including earnings, education, occupation, industry, and employment (92). University administrators (37) commented that this information allows for analysis of postsecondary outcomes, and allows them to benchmark their graduates’ relative success in different fields as well as to plan degree offerings.
While some commenters used the estimates to understand fields such as humanities or philosophy (56), the majority of the comments (125) addressed the value of knowing about the outcomes of people who pursued degrees in science, technology, engineering and math (STEM). These commenters felt that knowing more about the people currently earning STEM degrees and the people currently working in STEM fields would enable universities, advocacy groups, and policy makers to encourage more people to pursue STEM careers, and to encourage diversity within STEM careers.
While some of the commenters used the ACS estimates directly, many (58) also cited the necessity of the National Survey of College Graduates (NSCG), and recommended retaining the question because it is used as a sampling frame for the NSCG. Though commenters theorized that the NSCG might still be able to produce STEM estimates without the ACS, a number of commenters (16) thought that doing so would be very expensive, costing as much as $17 million more (1).
The Census Bureau understands the significant value of estimates of the characteristics of STEM degree-holders and people employed in STEM occupations or industries. In making a recommendation to retain or remove, the Census Bureau weighed the burden of including this single, easily understood question on the ACS against the likely decrease in efficacy and the increase in monetary cost and respondent burden of any potential alternative means of collecting this information.
The ability to know more about the effect of field of degree on many other aspects of the lives of college graduates, such as earnings, employment, and migration, is an additional area of research that commenters noted in their feedback to the Census Bureau. However, some of this information may be available in the form of university records and surveys fielded by the universities themselves. In making a recommendation to retain or remove, the Census Bureau did not consider whether this information should be collected, but rather whether the ACS is the appropriate vehicle for this data collection given our commitment to minimize respondent burden.
Marital History Questions
Regarding the marital history questions, the Census Bureau received comments from researchers and professors at many universities, professional associations that represent marriage and family therapists, the Social Security Administration, and many individuals with an interest in marriage estimates.
The majority of comments (1,361) asked the Census Bureau to reconsider removing the questions about marital history. A large number of these comments (422) articulated a perception that removing these questions signals government sentiment against marriage. The Census Bureau plans to continue collecting Person Question No. 2 (Household Relationship) and Person Question No. 20 (Marital Status, which includes the response categories Now Married, Widowed, Divorced, Separated, and Never Married). Regardless of whether the marital history questions are kept or removed, these questions will continue to provide information on marital status.
The remaining supporters of retaining the marital history questions mentioned their utility for research into marital status changes. These comments stated that without these questions, the United States would have no other source of these estimates (113). As a result, many commenters felt they would not be able to compare marriage characteristics and patterns with those of other nations in the same depth that is possible today. Similarly, commenters felt that without these questions, analysis of changes in marriage events (especially those due to changing societal values and pressures or policy changes) would be less robust. In particular, comments focused on six research areas that would be more difficult to analyze without the marital history questions:
Family formation and stability (23)
Patterns/trends of marriage and divorce (168)
Marital effects on earnings, education and employment (45)
Marital effects on child wellbeing (6)
Same-sex marriages, civil unions and partnerships (70)
New government policy effects on marriage (9)
Finally, 33 comments, including those from the Social Security Administration, questioned the government’s ability to adequately plan and fund federal programs without robust marital history estimates. These comments also discussed the lack of vital statistics or other administrative records at the state level, and the inability of other federal surveys to provide this information at the state level because of small sample sizes. While market forces might encourage other federal surveys or private entities to begin collecting this information and disseminating statistics, some commenters (5) felt that those sources would be cost-prohibitive as well.
The Census Bureau appreciates the utility of these estimates for research and modeling. In making a recommendation to retain or remove, the Census Bureau did not consider whether this information should be collected, but rather whether the ACS is the appropriate vehicle for this data collection given our commitment to minimize respondent burden.
Other ACS Questions
Several federal agencies that participated in the 2014 ACS Content Review wrote to reaffirm their uses of other questions that were not proposed as candidates for removal. The Census Bureau recognizes and appreciates the participation of these agencies in the Federal Register notice process. However, these federal uses are documented in other reports available at http://www.census.gov/acs/www/about_the_survey/methods_and_results_report/, and are considered beyond the scope of this document.
Other Comments
The Census Bureau found that several comment types were not relevant to this effort:
One commenter advised the Census Bureau to work with Congress to pass a budget. While the Census Bureau submits budget estimates each year and communicates with appropriators throughout the budget process, funding for the Census Bureau, as with all executive branch agencies, is ultimately determined by Congress.
One commenter felt the Census Bureau should pay respondents.
Many commenters felt that the field of degree and marital history questions have great genealogical value. However, the individual record-level information the genealogists cited would not be released until 72 years after its collection, many decades in the future.
Finally, many commenters wrote to tell us about their research goals, personal experience with marriage and divorce, opinions about government and social policy, and many other personal anecdotes. The Census Bureau appreciates each comment received and will retain them as personal perspectives shared on the Content Review.
9. Paying Respondents
We do not pay respondents or provide respondents with gifts.
10. Assurance of Confidentiality
The Census Bureau collects data for this survey under Title 13, United States Code, Sections 141, 193, and 221. All data are afforded confidential treatment under Section 9 of that Title.
In accordance with Title 13, each household, each GQ administrator, and each person within a GQ participating in the ACS is assured of the confidentiality of their answers. A brochure containing this assurance is sent to sample households with the initial mail package. Households responding via the Internet questionnaire are presented with additional assurances about the confidentiality and security of their online responses. The brochure mailed to sample GQs with the GQ introductory letter contains assurances of confidentiality; it is also provided to sample GQ residents at the time of interview.
Household members, GQ administrators or GQ residents may ask for additional information at the time of interview. A Question and Answer Guide, and a Confidentiality Notice are provided to respondents, as appropriate. These materials explain Census Bureau confidentiality regulations and standards.
At the beginning of follow-up interviews (CATI and CAPI), the interviewer explains the confidentiality of data collected and that participation is required by law. For all CAPI interviews, the interviewer gives the household respondent, GQ administrator, or GQ resident a copy of a letter from the Census Bureau Director explaining the confidentiality of all information provided.
11. Justification for Sensitive Questions
Some of the data we collect, such as race and sources of income and assets, may be considered sensitive. The Census Bureau takes the position that the collection of these types of data is necessary for the analysis of important policy and program issues, and we have structured the questions to lessen their sensitivity. We have provided guidance to the CATI and CAPI interviewers on how to ask these types of questions during the interview. The Census Bureau has materials that demonstrate how we use the data from sensitive questions and how we keep those data confidential. Respondents who use the Internet to complete the survey have access to links on the survey screens that provide information to help address their questions or concerns about sensitive topics.
12. Estimate of Hour Burden
The sample size is 295,000 households per month, and we plan to mail survey materials to the approximately 286,000 of those households each month that have mailable addresses. The Census Bureau estimates that, for the average household, the new 2013 version of either the paper ACS-1 questionnaire or the Internet questionnaire will take 40 minutes to complete, including the time for reviewing the instructions and answers. This reflects a two-minute increase from the estimated time to complete the 2012 household version of the paper questionnaire. We do not estimate any increase in the time to complete the Group Quarters interviews. We plan to conduct reinterviews with approximately 3,600 households each month; we estimate that the average reinterview will take 10 minutes.
We plan to conduct personal interviews at 1,667 GQ facilities each month. At each facility, one GQ contact is interviewed to collect data about the GQ and to provide a list of residents; this list is used to randomly select the sample of individuals who complete the ACS. The estimated time for each facility interview is 15 minutes. We conduct interviews with approximately 16,667 people in GQs each month, and the estimated response time for each person to complete the ACS-1(GQ) is 25 minutes. We also conduct GQ reinterviews at approximately 166 GQs each month; we estimate that the average GQ reinterview will take 10 minutes.
We have based these estimates of average interview length on our previous ACS tests and on experience with forms of comparable length used in previous censuses and tests. The total respondent burden for a full year is 2,455,868 hours. See Table 1 below for the detailed respondent and burden hour estimates.
Table 1. Annual ACS Respondent and Burden Hour Estimates
Data Collection Operation | Forms or Instrument Used in Data Collection | Annual Estimated Number of Respondents | Estimated Minutes Per Respondent by Data Collection Activity | Annual Estimated Burden Hours
I. ACS Household Questionnaire - Paper Mailout/Mailback | ACS-1, ACS-1(SP), ACS-1PR, ACS-1PR(SP) | 3,540,000 | 40 | 2,360,000
ACS Household CATI - Telephone Non-response Follow-up | CATI HU | [1,364,000 included in I.] | [40] | [910,000 included in I.]
ACS Household CAPI - Personal Visit Non-response Follow-up | CAPI HU | [698,000 included in I.] | [40] | [466,000 included in I.]
ACS Household Internet | Internet HU | [712,000 included in I.] | [40] | [475,000 included in I.]
II. ACS GQ Facility Questionnaire CAPI - Telephone and Personal Visit | CAPI GQFQ | 20,000 | 15 | 5,000
III. ACS GQ CAPI Personal Interview or Telephone, and Paper Self-response | CAPI, ACS-1(GQ), ACS-1(GQ)(PR) | 200,000 | 25 | 83,333
IV. ACS Household Reinterview - CATI/CAPI | ACS HU-RI | 43,200 | 10 | 7,200
V. ACS GQ GQ-level Reinterview - CATI/CAPI | ACS GQ-RI | 2,000 | 10 | 335
TOTALS | | 3,805,200 | N/A | 2,455,868
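For readers who want to verify the arithmetic, the short Python sketch below (illustrative only; the row values are copied directly from Table 1) sums the rows and confirms the annual respondent and burden-hour totals.

    # Illustrative cross-check of Table 1 (not part of the official estimate).
    # Each entry: (annual respondents, minutes per respondent, annual burden hours).
    rows = {
        "I. Household questionnaire":    (3_540_000, 40, 2_360_000),
        "II. GQ facility questionnaire": (20_000,    15, 5_000),
        "III. GQ person interview":      (200_000,   25, 83_333),
        "IV. Household reinterview":     (43_200,    10, 7_200),
        "V. GQ-level reinterview":       (2_000,     10, 335),
    }

    total_respondents = sum(r for r, _, _ in rows.values())
    total_hours = sum(h for _, _, h in rows.values())
    print(total_respondents, total_hours)  # 3805200 2455868

    # Each row's burden hours are approximately respondents * minutes / 60
    # (for example, 3,540,000 * 40 / 60 = 2,360,000 for row I); small
    # differences in other rows reflect rounding in the published table.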
13. Estimate of Cost Burden
There are no costs to the respondent other than his/her time to respond to the survey.
14. Cost to Federal Government
As requested in the FY 2015 President’s Budget, the estimated cost of the 2016 ACS is approximately $256.8 million. The Census Bureau will pay the total cost of the ACS.
15. Reason for Change in Burden
We do not estimate any change in burden due to the 2016 content changes.
16. Project Schedule
We will release data based on the new 2016 content beginning in September 2017. The data releases will include data collected from HUs and GQs.
The data collection activities for the 2016 Content will begin in late December 2015.
Approximately one month after the initial mailing for a sample month, we begin the CATI operation for households that have not responded by mail or Internet. Approximately two months after the initial mailing, we begin a field follow-up operation using CAPI for a sample of the remaining nonresponding households.
Each month, we begin interviews with sample GQ administrators and a sample of GQ residents. The data collection period for each GQ sample month is six weeks. The GQ reinterview takes place approximately one month after the beginning of the survey year and continues until the end of December each year. The ACS GQ data collection does not include a formal non-response follow-up operation, but field representatives (FRs) contact a respondent or GQ administrator about missing responses on the questionnaire at any point during the six-week data collection period.
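As a purely illustrative sketch (the month offsets below simply restate the approximate household follow-up timing described above and are not an official schedule), the cycle for any single sample month can be laid out as follows:

    # Illustrative only: approximate household follow-up timing for one sample
    # month, based on the cycle described above (CATI about one month after the
    # initial mailing, CAPI about two months after).
    def add_months(year: int, month: int, n: int) -> tuple:
        total = (month - 1) + n
        return year + total // 12, total % 12 + 1

    def follow_up_schedule(mail_year: int, mail_month: int) -> dict:
        return {
            "initial mailing": (mail_year, mail_month),
            "CATI follow-up (about +1 month)": add_months(mail_year, mail_month, 1),
            "CAPI follow-up (about +2 months)": add_months(mail_year, mail_month, 2),
        }

    # Example: a January 2016 sample month would reach CATI follow-up around
    # February 2016 and CAPI follow-up around March 2016.
    print(follow_up_schedule(2016, 1))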
17. Request to Not Display Expiration Date
We request that we not display the OMB expiration date on the questionnaire. The ACS is an ongoing and continuous survey that is mandatory. If there is an expiration date on the questionnaire, respondents may infer that the survey is over as of the expiration date, which is not the case.
18. Exceptions to the Certification
There are no exceptions to the Certification for Paperwork Reduction Act Submission.
1 See 42 U.S.C. 1437b and 1437f. HUD’s funding formulas are available at: http://www.huduser.org/portal/datasets/fmr/fmrover_071707R2.doc and http://www.huduser.org/portal/datasets/il/il10/IncomeLimitsBriefingMaterial_FY10.pdf. The results of these formulas are announced yearly in the Federal Register.
2 See United States Housing Act of 1937, Public Law 93-383, as amended, and 42 U.S.C. § 1437f(c)(1); 24 CFR 888.113; 24 CFR 982.401.
3 See Community Services Block Grant Act, Pub. L. No. 105-285, Sections 673(2), 674, and 681A, and 42 U.S.C. § 9902(2), 9903, and 9908(b)(1)(A), (b)(11), and (c)(1)(A)(i).
4 See 7 U.S.C. 2025(d)(2) and 7 CFR 275.24(b)(3). The FNS calculates a Program Access Index that allows them to provide additional award funds to states that have the highest levels of SNAP access, or show the greatest annual improvement in SNAP access. For the PAI formula, see: http://www.fns.usda.gov/ora/menu/Published/snap/FILES/Other/pai2008.pdf and 7 CFR 275.24.
5 See Broadband Data Improvement Act of 2008, Pub. L. No. 110-385; 47 U.S.C. § 1303(d).
6 See Patient Protection and Affordable Care Act, Pub. L. No. 111-148, §10334 and 42 U.S.C. 300kk.
7 See 23 U.S.C. 134 and 23 U.S.C. 135. See also 23 U.S.C. 303 and 23 CFR 450.316-322. See also P.L. 109-59.