
OMB: 0607-0989



Department of Commerce

United States Census Bureau

OMB Information Collection Request

2016 Census Test

OMB Control Number 0607-XXXX



Part A. Justification



1. Necessity of the Information Collection

During the years preceding the 2020 Census, the Census Bureau is pursuing its commitment to reduce the cost of conducting the census while maintaining the quality of the results. A primary decennial census cost driver is the collection of data in person from addresses for which the Census Bureau received no reply via initially offered response options. We refer to these as nonresponse cases, and the efforts we make to collect data from these cases as the Nonresponse Followup, or NRFU, operation.

The 2016 Census Test will allow the Census Bureau to build upon past tests to refine our plans and methods for the reengineered field operations of the census's NRFU operation. Specifically, this test will allow us to:

  • Test refinements to the ratios of field enumerators to field supervisors.

  • Test refinements to our enhanced operational control system, including the way we assign work to field staff, and how those assignments are routed.

  • Test alternatives to government furnished equipment for data collection, such as devices provided by a private company as part of a contract for wireless service (sometimes known as "Device as a Service").

  • Test refinements to our use of administrative records to reduce the NRFU workload.

  • Test new methods of conducting NRFU quality control reinterviews.

Increasing the number of people who take advantage of self response options (such as responding online, completing a paper questionnaire and mailing it back to the Census Bureau, or responding via telephone) can contribute to a less costly census. The Census Bureau has committed to using the Internet as a primary response option in the 2020 Census, and we are studying ways to offer and promote this option to respondents. In addition to increasing and optimizing self response through the Internet, the Census Bureau plans to test the impacts of providing additional materials to respondents as part of their first mailing along with a letter invitation. One example of additional material is an insert to be included for traditionally hard-to-count populations. We will also test a tailored envelope treatment to determine whether this represents an effective way to encourage and support self response for respondents who speak languages other than English. We also will continue to study the option of allowing people to respond on the Internet without having or using a unique identification code previously supplied by the Census Bureau. Each of these will be discussed in more detail in subsequent sections of this supporting statement.



Supporting Documents about the 2020 Census Design and the 2016 Census Test Objectives

We are submitting with this package the following documents with the purposes as stated:

The 2020 Operational Plan documents at a high-level the objectives for the census tests both already completed and planned for the future. This document shows the current planned design of the 2020 Census and identifies design decisions made, as well as remaining decisions to be made using Census Test results. Key design components related to the 2016 Census Test are discussed in Chapter 4 and in Sections 5.5.4, 5.5.5, 5.5.9, and 5.5.10. https://www.census.gov/programs-surveys/decennial-census/2020-census/planning-management/operational-plan.html



We are also submitting The 2020 Research and Testing Management Plan, which defines the high-level research for the life-cycle of the program, thereby providing direction for research and testing activities and for decision-making based on the outcomes.

http://www.census.gov/programs-surveys/decennial-census/2020-census/planning-management/memo-series/2020-memo-2015_03.html



In addition, we are submitting a planning document that lists our Goals, Objectives, and Success Criteria for the 2016 Census Test. This is the document that shows in finer detail what the research questions are related to design decisions to be made using the results of this test.



2016 Census Test – Los Angeles County (part), California and Harris County (part), Texas

For the 2016 Census Test, the areas within Los Angeles County (part), California and Harris County (part), Texas were chosen based on a variety of characteristics, including language diversity, demographic diversity, varying levels of Internet usage, large metropolitan areas, and high vacancy rates. These characteristics can help the Census Bureau refine its operational plans for the 2020 Census by testing operational procedures on traditionally hard-to-count populations. The test will also allow us to continue developing additional ways for the population to respond to the once-a-decade census, as well as more cost-effective ways for census takers to follow up with households that fail to respond.

Los Angeles County (part), California

Places: Alhambra city, Los Angeles city, Montebello city, Monterey Park city, Pasadena city, Rosemead city, San Gabriel city, San Marino city, South El Monte city, South Pasadena city, Temple City city

Census Designated Places (CDPs): East Los Angeles CDP, East Pasadena CDP, East San Gabriel CDP, San Pasqual CDP, South San Gabriel CDP

Harris County (part), Texas

Places: Bunker Hill Village city, Hedwig Village city, Hilshire Village city, Houston city, Hunters Creek Village city, Jersey Village city, Piney Point Village city, Spring Valley Village city

To increase Internet self response rates, the Census Bureau will improve contact and notification strategies that were studied in prior testing. The core of our contact strategy is an Internet-push strategy, which was previously tested in the 2012 National Census Test, 2014 Census Test, and the 2015 Optimizing Self Response and Census Tests, and is now being further refined. We also introduced a supplemental contact strategy in the 2015 National Content Test, the Internet Choice panel, which we will continue to study in the 2016 Census Test. In the 2016 Census Test, improvements to this approach will be tested by modifying the content of our messages, including materials in the mailing packages. Additional information about how the components of the 2016 Census Test fit together is provided in the 2016 Census Test Design Matrix in the supplemental materials.

We also will continue our efforts to make it easier for respondents by allowing them to respond without providing a pre-assigned identification (ID) number associated with their address. This response option, referred to as “Non-ID,” was successfully implemented on the Internet in the 2014 and 2015 Census Tests. In this test, we will continue to develop the infrastructure to deploy real-time processing of Non-ID responses. Specifically, we will implement automated processing of Non-ID responses in a cloud-based environment instead of using Census Bureau hardware. This work will help us prepare for conducting Non-ID Processing at the scale we anticipate for 2020. In addition, we will be conducting a manual matching and geocoding operation for Non-ID responses that could not be matched to a record in the Census address list, or assigned to a census block during automated processing. Some of this processing will require Census staff to call respondents to obtain further information, such as missing address items that could help us obtain a match to a record in the Census address list. In some cases, we may also ask for the respondent’s assistance in accurately locating their living quarters on a map so that we can associate the response to the correct census block, which is required for data tabulation.

The 2016 Census Test will consist of four major phases: Self Response (including Non-ID processing and Response Validation), NRFU (with a reinterview component), Coverage Reinterview, and focus groups.

Self Response

We will implement an “Internet Push” contact strategy, which involves first sending a letter inviting people to respond via the Internet; then sending two postcard reminders to non-responding addresses; and ultimately sending a paper questionnaire to addresses that still have not responded. The Census Bureau will directly contact up to 250,000 addresses in each site to request self response via one of the available response modes (Internet, telephone, paper). Materials included in the mailing explain the test and provide information on how to respond.

The impact of message content on self response will be tested by varying the content of the mailing packages in the “Internet Push” for different panels. Specifically, we will test language that addresses how participation in the census benefits respondents’ communities and cites the mandatory nature of the census. Mail panels targeting limited English proficiency (LEP) households will include a language insert as part of the contact strategy. LEP households represent a subsample of housing units in each test location. We also plan to include the Census Internet Uniform Resource Locator (URL) on envelopes with messaging in multiple languages for a panel of housing units. This is intended to serve as a prompt for LEP respondents to access the Census URL without needing to read a letter written in a language in which they are not fluent.

An “Internet Choice” panel will also be tested, which involves first sending a questionnaire with a letter inviting people to respond via the Internet or by using the questionnaire; then sending up to two postcard reminders to non-responding addresses; and ultimately sending a second paper questionnaire to addresses that still have not responded. The design of the mail panels is fully described in Supporting Statement B.

In addition, we will conduct a Response Validation Reinterview operation to re-collect response data for a sample of an estimated 5,000 Non-ID returns. This will be performed via in-person visits during the Nonresponse Followup (NRFU) operation. The followup interview data will be compared to the original data collected for the sample households in order to determine the level of consistency.

Census Questionnaire Assistance will be available to all respondents. In addition, on-line respondents will be provided with pre-defined “Help” screens or “Frequently Asked Questions” accessible through the Internet instrument. People who prefer not to respond via a paper form or on the Internet can also call the Census Questionnaire Assistance number and speak to an agent to complete the questionnaire for their household.

In addition to supporting Non-ID self response and conducting manual processing of Non-ID returns when required, we will take steps to identify duplicate or potentially fraudulent Non-ID responses. For all Non-ID responses, we will compare response data to information contained in Federal administrative records and third-party data maintained within the Census Bureau. This will help validate respondent-provided data, as well as identify gaps in coverage in currently available administrative records datasets.

The Census Bureau’s Center for Administrative Records Research and Applications (CARRA) will compare the original and Response Validation reinterview responses to administrative records using the Person Identification Validation System (PVS). PVS is a suite of programs and parameters developed by CARRA for validating the identity of persons represented in data records from censuses, surveys, and administrative datasets. It uses a reference file of Social Security Numbers, names, birth dates, sex, and address information drawn from various government administrative data, including Internal Revenue Service returns and Social Security Administration records. PVS is the standardized Census Bureau process by which globally unique identifiers called Protected Identification Keys (PIKs) are assigned to records that match these reference files. At the heart of PVS is a comparison and scoring tool based on the Fellegi-Sunter probabilistic linking methodology, which assigns a match score to each candidate pair based on a weighted combination of agreement measures for the matching fields in each pass. Using PVS, CARRA will attempt to match the 2016 response and Response Validation reinterview data to administrative records in order to validate respondent-provided data, as well as determine which groups of respondents are not well represented in administrative records.
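The Fellegi-Sunter scoring at the heart of PVS can be illustrated with a minimal sketch. The field names, match/non-match probabilities, and thresholds below are hypothetical placeholders for illustration only; the actual PVS matching fields and parameters are not described in this document.

```python
import math

# Hypothetical per-field parameters (NOT actual PVS values):
# m = P(field agrees | records are a true match)
# u = P(field agrees | records are not a match)
FIELD_PARAMS = {
    "name":       {"m": 0.95, "u": 0.05},
    "birth_date": {"m": 0.98, "u": 0.01},
    "sex":        {"m": 0.99, "u": 0.50},
    "address":    {"m": 0.90, "u": 0.10},
}

def fellegi_sunter_score(record_a, record_b):
    """Sum log-likelihood-ratio weights over the matching fields.

    Agreement on a field contributes log2(m/u); disagreement contributes
    log2((1-m)/(1-u)). Higher totals indicate a more likely match.
    """
    score = 0.0
    for field, p in FIELD_PARAMS.items():
        if record_a.get(field) == record_b.get(field):
            score += math.log2(p["m"] / p["u"])
        else:
            score += math.log2((1 - p["m"]) / (1 - p["u"]))
    return score

# In a full linkage pass, candidate pairs scoring above an upper threshold
# would be treated as matches, those below a lower threshold as non-matches,
# and those in between would be referred for clerical review.
```

In this sketch, a pair agreeing on name, date of birth, sex, and address accumulates only positive weights and scores well above a pair that disagrees on several fields, which is the mechanism by which a weighted combination of field-level agreements yields a single match score.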

Content Test Objectives in Self Response and Nonresponse Followup Data Collection

The 2016 Census Test questionnaire will include questions on housing tenure, household roster, age, sex/gender, date of birth, race and Hispanic origin, and relationship. Based on results from the 2010 Race and Hispanic Origin Alternative Questionnaire Experiment (Compton et al., 2012), the 2016 Census Test will include a combined race and Hispanic origin question intended to build on what is being tested in the 2015 National Content Test. This combined question provides examples and write-in areas for each major response category, including a response category for Middle Eastern and North African ethnicities. With this combined question format, no separate “Hispanic origin” question is used; rather, Hispanic ethnicity or origin is measured within the single item. Respondents are asked to self-identify by selecting one or more checkboxes, and to write in a specific origin for each checkbox selected. The 2016 Census Test allows us to test responses to these questions in geographic areas with race and Hispanic origin concentrations that differ from the prior test areas.

The inclusion of the combined question will also allow the Census Bureau to conduct imputation research using this combined format in a setting when there are self responses, administrative records and NRFU enumerator responses. This will allow the Census Bureau to understand imputation approaches needed for a combined question.

We also plan to test variation in terminology by comparing “Am.” with “American” in the response category “Black or African Am.” on the Internet instrument. This research is being undertaken to assess the impact of different wording for the racial category used to collect and tabulate data for the African American, African, and Afro-Caribbean populations. This test will provide insight into how respondents identify with the race category, depending on the wording used to describe the category itself (“Black or African Am.” vs. “Black or African American”).

For the relationship question, we plan to include variations in question wording associated with “non-relatives.” We will compare responses to a relationship question with, and without, the response categories “roomer or boarder” and “housemate or roommate.” Cognitive testing has repeatedly shown that respondents do not know what the Census Bureau sees as the differences between these categories.

The 2016 Census Test will continue to include the response categories recommended by the OMB Interagency Working Group (see Section 11 of this document – Justification for Sensitive Questions) for opposite-sex and same-sex husband/wife/spouse households, and for the category for unmarried partner.

The 2016 Census Test will include a question on the Internet instrument that will allow respondents to report that a housing unit they own is vacant as of Census Day, and to provide the reason for the vacancy status (e.g., a seasonal or rental unit). Collecting these data from respondents may allow the Census Bureau to identify some vacant housing units during self response so they can be removed from NRFU operations.

The Census Bureau’s research on how best to present and explain the residence rule (who to count) in specific situations will continue. The Internet data collection instrument will include various ways to ask about and confirm the number of persons residing at an address. Respondents will see one of three screens about the enumeration of people in their household:

  • one that displays the Census Bureau’s basic residence rule and then asks for the number of people in the household based on that rule;

  • one that asks for the number of people who live in the household but provides our residence rule definition in the help text; and

  • one that asks if any other people live at the household, with the residence rule in the help text.

After the names of the roster members are collected, the respondent will receive one of three treatments associated with undercount detection questions: the first asks for additional people on two separate screens, the second asks for additional people on only one screen, and the third asks no undercount questions at all. After the demographic items are collected, respondents who received undercount questions will then see overcount detection questions; cases that received no undercount questions will see no overcount detection questions.
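The screen flow described above can be summarized in a small sketch. The treatment labels and screen names below are illustrative shorthand, not the actual instrument specification:

```python
# Hypothetical labels for the three residence-rule roster screens and
# the three undercount-question treatments described above.
ROSTER_TREATMENTS = ("rule_then_count", "count_with_help_text", "anyone_else_with_help_text")
UNDERCOUNT_TREATMENTS = ("two_screens", "one_screen", "none")

def coverage_screens(roster_treatment, undercount_treatment):
    """Return the ordered coverage-related screens a respondent would see."""
    screens = [f"roster:{roster_treatment}"]
    if undercount_treatment == "two_screens":
        screens += ["undercount:screen_1", "undercount:screen_2"]
    elif undercount_treatment == "one_screen":
        screens += ["undercount:screen_1"]
    screens.append("demographics")
    # Overcount detection questions follow the demographic items only for
    # cases that received undercount questions.
    if undercount_treatment != "none":
        screens.append("overcount")
    return screens
```

For example, a case in the one-screen undercount treatment would see its roster screen, one undercount screen, the demographic items, and then the overcount questions, while a case in the no-undercount treatment would skip both undercount and overcount screens.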

The materials mailed to the respondents will inform them that the survey is mandatory in accordance with Title 13, United States Code, Sections 141 and 193. This information also will be available via a hyperlink from within the Internet instrument.

Nonresponse Followup (NRFU) Operation Testing

The 2016 Census Test will help determine 2020 Census methods for conducting NRFU operations that increase efficiency and reduce costs. Based on previous tests, the Census Bureau will refine its contact strategies and methods for field data collection, case assignment management, and field staff administrative functions. This will include further testing of how administrative records can be used to reduce the NRFU workload.

As part of the 2016 Census Test, we will collect housing unit status and enumerate the occupants of households that do not respond to the self response phase of the census using automated enumeration software on standard (iOS and Android operating system) smartphone devices. The test will enable our continued study of options for alternatives to using government furnished equipment. This includes an option to use a ‘Device as a Service’ contract, where the Census Bureau will not own the smartphone devices outright, but instead will pay a vendor for their use, including any initialization and setup processes required. This has the potential to mitigate risks to the operation, such as unpredictable increases in costs associated with device initialization and hardware support. We will also continue to operationally test the field data collection application we use on these devices. The devices will use a modified version of the software used in the 2015 Census Test, with updated capabilities for handling special non-interview cases (such as demolished homes and non-existent addresses), better handling of addresses with multiple units (like apartment buildings), a clearer path for enumerators to take when attempting to collect data from a householder’s neighbor or another knowledgeable source, new screens related to detecting potential “overcount” in a household (scenarios where current household residents also lived at another location, like student housing), and numerous other minor incremental user interface and performance updates.

The Census Bureau also plans to test a newly redesigned portion of our quality assurance activities – the NRFU Reinterview program (NRFU-RI). In particular, the Census Bureau plans to test:

  • New methodologies for selecting cases to be reinterviewed, including the potential use of operational control system data (paradata) and administrative records to detect potential falsification by enumerators

  • Using our automated field data collection instrument for conducting these reinterviews

  • Using our recently re-designed operational control system to optimize the routing and assignment of reinterview cases, and

  • Using the same field staff to conduct both NRFU interviews and associated reinterviews, with an explicit rule within the instrument that an enumerator is not allowed to reinterview their own work.

All of these changes have the potential to lead to a more cost-effective, streamlined, and higher quality NRFU operation for the 2020 Census.

We will continue to test our newly re-engineered field infrastructure, allowing us to refine our requirements for staffing ratios and position duties for 2020 Census operations. We will also continue to test our enhanced operational control system, using lessons learned from the 2015 Census Test to make further improvements to how assignments are made and routed. We will continue to test improvements to our use of systematic alerts that will quickly notify field supervisors of potential problem enumerators, detect possible falsification, and improve both quality and efficiency for the NRFU operation.

All units that do not self respond will be eligible for administrative records processing to determine whether they are vacant or occupied. Units determined to be vacant will be removed from the NRFU universe and will receive no in-person contacts. Units determined to be occupied will receive a reduced number of contacts.

For the 2016 Census Test, the Census Bureau will implement a fixed contact strategy approach. The plan is to contact each address a maximum of six times. Starting with the third visit, unresolved addresses will be eligible to be resolved by collecting information from a knowledgeable source (proxy). If additional contacts are needed, the interviewer will continue trying to resolve occupied units by completing the interview with household members before attempting a proxy.
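A minimal sketch of this fixed contact strategy follows. The function and constants are illustrative only; the actual operational business rules are more detailed:

```python
MAX_CONTACTS = 6          # fixed maximum number of contact attempts per address
PROXY_ELIGIBLE_VISIT = 3  # visit at which a knowledgeable source (proxy) becomes allowable

def next_step(visit_number, resolved):
    """Decide what an enumerator may do on a given contact attempt.

    visit_number: 1-based count of the current contact attempt.
    resolved: whether the case has already been resolved.
    """
    if resolved:
        return "complete"
    if visit_number > MAX_CONTACTS:
        return "stop"
    if visit_number >= PROXY_ELIGIBLE_VISIT:
        # A proxy interview is now allowed, but household members
        # are still attempted first.
        return "attempt_household_then_proxy"
    return "attempt_household_only"
```

Under this sketch, the first two visits allow only household-member interviews, visits three through six permit falling back to a proxy, and no contacts are made past the sixth attempt.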

Finally, we will build upon work from the 2013, 2014, and 2015 Census Tests in a continued attempt to refine and evaluate our use of administrative records (including government and third-party data sources) to reduce the NRFU workload. Cases will be removed from the NRFU operation based on our administrative records modeling as follows:

  • Any case that is given a status of vacant from our administrative records modeling will be immediately removed from the NRFU workload; and

  • Any case that is given a status of occupied from our administrative records modeling will be removed from the NRFU workload after one unsuccessful attempt at field enumeration is made (as long as good administrative records exist for that case).

Unlike in previous tests, we will test mailing a supplemental letter to all addresses removed from the NRFU workload in this way, to prompt a self response. If these cases do not self-respond, we will enumerate the unit based on the results of our administrative records modeling.

For a sample of the cases that would be removed under these criteria, we will continue to perform field followup activities. This will allow us to compare the outcomes of cases with a completed interview against the modeled status of the household, and to determine the quality of our administrative records modeling.
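The administrative-records removal rules above can be summarized in a short sketch. The status labels and function below are illustrative, not the actual production logic:

```python
def nrfu_disposition(model_status, good_records, failed_attempts):
    """Apply the administrative-records removal rules described above.

    model_status: "vacant", "occupied", or "unresolved" from the records model.
    good_records: whether good administrative records exist for the case.
    failed_attempts: number of unsuccessful field enumeration attempts so far.
    """
    if model_status == "vacant":
        # Vacant cases are removed from the NRFU workload immediately.
        return "remove_from_workload"
    if model_status == "occupied" and good_records and failed_attempts >= 1:
        # Occupied cases with good records are removed after one unsuccessful
        # field attempt; the address is then mailed a supplemental letter
        # prompting self response, and is enumerated from administrative
        # records if no response arrives.
        return "remove_and_mail_letter"
    # All other cases remain in the regular NRFU contact strategy.
    return "continue_followup"
```

A holdout sample of cases that would be removed under these rules continues through full field followup, so that completed-interview outcomes can be compared against the modeled statuses.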





Coverage Reinterview

As described previously, the 2016 Census Test Internet instrument contains embedded coverage experiments, and a reinterview is needed to quantify the effect of each particular version on the roster provided by the Internet respondent. The quality of the final household roster created from the panels with experimentally applied questions will be evaluated by a coverage reinterview conducted by telephone; these reinterviews will be used to evaluate the different residence rule approaches across the questionnaire panels. The reinterview will contain extensive questions about potentially missed roster members and other places any household members sometimes stay. Specifically, the reinterview will re-contact responders to determine whether any people may have been left off the roster, or erroneously included on it, during the initial response. If there are indications during the reinterview that some people may have been left off the roster, we will ask for demographic information about the missed people. If there are indications that some people may have been erroneously included, we will ask about stay durations in order to resolve residency situations. The reinterview will be a Computer Assisted Telephone Interviewing (CATI) operation conducted in the Census Bureau's call centers.

In addition to contacting Internet responders, a small portion of people who responded by paper or as a part of NRFU will be selected for the Coverage Reinterview. The inclusion of such cases will allow us to quantify the quality of household rosters collected in these two other modes.







Focus Groups

Following the end of data collection, the Census Bureau will conduct focus groups with 2016 Census Test participants to ask about their experience. Two focus groups will be conducted in each site, with English- and Spanish-dominant speakers respectively. Topics will include their opinions on the use of administrative records by the Census Bureau. Participants also will be asked about their general concerns with government data collection and the government’s ability to protect confidential data. Other topics to be covered will be informed by feedback received from stakeholder groups, including the National Hispanic Leadership Agenda (NHLA) and the National Association of Latino Elected and Appointed Officials (NALEO) Educational Fund. The specific information collection materials for those activities will be submitted separately as non-substantive changes.



2. Needs and Uses

Testing in 2016 is necessary to build on the findings from prior testing and to establish recommendations for contact strategies, response options, and field operation efficiencies that can be further refined and deployed again in subsequent operational and system development activities. At this point in the decade, the Census Bureau needs to solidify evidence showing whether the strategies being tested can reduce the cost per housing unit during a decennial census, while still providing high quality and accuracy of the census data. The results of the 2016 Census Test from both sites will inform decisions that the Census Bureau will make about refining the detailed operational plan for the 2020 Census and will help guide the evaluation of additional 2020 Census test results later this decade.

Along with other results related to content, the 2016 Census Test response rates for paper and Internet collection will be used to help inform 2020 Census program planning and cost estimates. These estimates are inputs into the sizing of systems to support paper data capture and Internet response, and ultimately inform the estimated Nonresponse Followup workload and, as a result, the physical infrastructure needed to support Nonresponse Followup data collection.

An accurate population count and collection of the roster of persons at a housing unit are integral to an accurate census. The Census Bureau strives to collect an accurate housing unit population count and roster by using prescribed residence rules. Decisions on how the residence rules are presented to respondents, and on how coverage questions ensure an accurate accounting of the persons at a housing unit on Census Day, are required in order to meet Congressional deadlines for the 2020 Census topics (April 1, 2017) and 2020 Census questions (April 1, 2018). In the 2015 National Content Test, residence rule presentation and associated coverage questions were tested, although the presentation of the coverage questions was not a major test objective. Some hard-to-count populations are known to see coverage improvements when coverage questions are asked to clarify census residence rules. However, this must be balanced against respondent burden, to avoid extended follow-up questions for more typical household situations. We continue to test branching and other options within automated instruments in order to achieve the best mix of coverage improvements while minimizing respondent burden. We are also including versions of the demographic questions that we believe show promise based on earlier test results but that do not incorporate results from our major content test, the 2015 National Content Test. This test will provide further indicators of the potential for successful use of these questions. Details about the experimental treatments associated with options for asking coverage questions are presented earlier in this document, with additional detail provided in Part B.

Testing enhancements to Non-ID processing will inform final planning for the 2020 Census design, as well as the infrastructure required to support large scale, real-time processing of electronic Non-ID response data submitted via the Internet.

Building upon previous Census Tests, the NRFU portion of the 2016 Census Test will inform the following important decisions for conducting the 2020 Census:

  • We will continue to research the cost and quality impact of reducing the NRFU caseload through the use of administrative records information, to inform our final strategy for the use of administrative records. This test will also allow us to further define our core set of administrative records that will be used for the 2020 Census, and our strategies for acquiring and using those records. This research will help us achieve our goal of a more cost-effective 2020 Census, while maintaining quality of the results.

  • We will continue to research the cost and quality impacts of new NRFU contact strategies that make use of adaptive design and a re-engineered management structure employing automated payroll, automated training, and minimal face-to-face contact between enumerators and supervisors. Enumerators are asked to provide work-time availability in advance, and the system then will assign the optimal number of cases to attempt each day, as well as the optimal route to follow that day. Again, this operational research will help us towards our goal of a more cost-effective 2020 Census, while maintaining quality of the results.

  • We will be able to make quality and cost determinations about a ‘Device as a Service’ option, and be able to develop more mature cost models to inform our decisions related to the device provision strategies for the 2020 Census NRFU operation.

  • We will be able to determine the cost and quality impacts of our newly re-engineered NRFU Reinterview quality assurance program. This data will inform our decision on an integrated and re-designed approach to quality assurance for the 2020 Census.

For each of these areas of the test, we will use cost and quality measures to assess their effectiveness. Some examples of such measures include:

  • Enumerator Attrition Rates

  • Enumerator Productivity Rates

  • Cost per Enumerator

  • Cost per Contact Attempt

  • Cost per Completed Case

  • Mileage per Case

  • Mileage per Attempt

  • Average Time to Conduct Interviews

Information Quality

Information quality is an integral part of the pre-dissemination review of the data disseminated by the Census Bureau (fully described in the Census Bureau’s Information Quality Guidelines at https://www.census.gov/quality/guidelines/). Information quality is also integral to the data collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act.

The data collected from households and individuals during the 2016 Census Test will be used to research and evaluate new methodologies and systems to plan the 2020 Census. The Census Bureau will not publish any tabulations or population estimates using the results from this test. However, methodological papers may be written that include summary tallies of response characteristics or problems identified, and responses may be used to inform future research studies building upon the results of these tests. The Census Bureau plans to make the evaluation results of this study available to the general public.



3. Use of Information Technology

The 2016 Census Test will use the Internet, telephone centers and data collection software residing on handheld mobile devices to communicate with respondents. Respondents will initially have the option to respond to this test via the Internet (on various devices, e.g., computers, tablets, smart phones) or through Census Questionnaire Assistance.

The 2016 Census Test is heavily dependent on information technology systems, and a significant portion of the test is devoted to operationally testing these systems. The test will employ automated systems to administer and manage training, manage workloads, route field workers, alert supervisors of potential problems, create management reports, and process responses.



4. Efforts to Identify Duplication

As part of its efforts to reduce the cost of conducting the next decennial census while still providing the highest data quality possible, the Census Bureau continues its testing of new methods for the public to respond and new ways to automate and more efficiently manage field data collection that have not previously been examined or used in a decennial census.

Where possible, we are utilizing research results from other Census Bureau surveys. Current research on American Community Survey paradata from its Internet data collection operations will inform the usefulness of continuing to contact respondents after they initially fail to complete the survey. We also are working with other countries with similar issues and goals (for example, Australia, Canada, and England) to share information on these matters. The 2020 Census Research and Testing Program is also conducting additional literature reviews on results seen in other surveys about paradata, NRFU procedures, fieldwork efficiencies, telephone contacts, Internet response, and geographic differences. However, most survey results cannot be directly applied to a decennial census environment. The size, scope, mandatory nature, importance of results (for such things as Congressional apportionment, state redistricting efforts, and the allocation of over $400 billion in Federal funds each year), and timing constraints (legal deadlines for producing apportionment and redistricting data) of the decennial census are unique. Thus, thorough and separate research and testing must be conducted to ensure that new methods and operations will work in a decennial census environment.



5. Minimizing Burden

The collection of data is only for households and individuals and should have no effect on small businesses. To reduce total burden on respondents, any housing units selected for the 2016 American Community Survey (ACS) - by far the Census Bureau’s largest monthly survey - are excluded from sample selection for this test. Housing units from the recent 2015 National Content Test are also excluded. The sample selection processes of the Coverage Reinterview and the NRFU Reinterview will include safeguards to prevent the selection of a housing unit for both reinterviews.



6. Consequences of Less Frequent Collection

If this collection of information does not occur, it would significantly delay or prevent the Census Bureau’s ability to improve upon the current decennial census design, and thus the ability to make design changes with major cost savings for the 2020 Decennial Census. Frequency cannot be decreased, as this is a one-time data collection activity.



7. Special Circumstances

No special circumstances exist.



8. Consultations outside the Agency

In developing these tests, the Census Bureau consulted with a variety of stakeholders, including, but not limited to, academics, national researchers, community and organizational leaders, and the Census Bureau’s Advisory Committees. In addition, external consultants from the National Academy of Sciences shared information about other relevant studies and provided quarterly feedback about the Census Bureau’s research plans and objectives for the 2015 Census Test. The results from these tests will be shared widely with decennial census stakeholders.

To help evaluate and assess the results of the 2016 Census Test, the Census Bureau will conduct debriefings with enumerators and others who work on the tests, and will conduct focus groups, as described earlier in this supporting statement. To support the development of the Internet and nonresponse instruments, we are conducting cognitive and usability testing under separate submissions for OMB approval; participants will be recruited from outside the Census Bureau to provide their views on the wording of questionnaire items and the navigation of the Internet application, including the optimization of screens for mobile and tablets.

The notice for public comment, entitled, “2016 Census Test,” was published in the Federal Register August 4, 2015 (Vol. 80, No. 149, pp. 46239-46242). The Census Bureau received two comments during the 60-day period.

One comment expressed concern about how our use of administrative records to reduce the NRFU workload might impact the enumeration of potential “squatters” -- individuals who occupy abandoned or unoccupied buildings that might otherwise be considered vacant. This is a historic challenge faced by the Decennial Census. The 2020 Census will rely heavily on a strong partnership and outreach program to assist in the identification, location, and enumeration of these difficult to count populations.

The second comment expressed concerns on behalf of the NALEO Educational Fund in the area of Census policy development and public education. This comment was broken into multiple parts:

Internet/Technology Response Option. One comment cautioned that questionnaires on the Internet and on mobile phones may increase response rates among some groups but have a different effect among certain racial/ethnic and socio-economic groups. The Census Bureau is committed to optimizing self response across all demographic groups, particularly for traditionally hard-to-count populations. We are continuing to explore the best ways to motivate households to respond and are taking steps to ensure that all households have the opportunity to respond to the census so that everyone is counted. Since there is wide variation in the U.S. in Internet access, connectivity, and language use, we are researching methods to accommodate the needs of the whole population. These include providing Internet (for home computers as well as mobile devices), paper, and telephone response options, all in multiple languages, along with a program for broad outreach and partnership support at local levels. As described in the Census Bureau’s 2020 Census Operational Plan, to encourage and better facilitate self response, the Census Bureau continues to research and refine our contact strategies. Specifically, we will again identify areas that are less likely to respond online and will mail paper questionnaires to these households as part of their initial contact (the aforementioned Choice Panel). These areas will likely include geographies with low Internet penetration, which are expected to include traditionally hard-to-count populations. We will also identify areas with higher concentrations of populations that may need information in languages other than English and will provide mail materials for these areas in multiple languages.

Another component to the 2016 Census Test will be analysis of all data from the test in order to further refine our strategies and methods to increase self response. The Census Bureau plans to look at response data across multiple strata, including response mode (mobile device, home computer, paper questionnaire, and telephone), as well as across multiple demographic characteristics.



Proposed Revisions to Race and Ethnicity Questions. While the comments expressed support for the collection of detailed data that accurately illuminates the diversity within racial and ethnic groups, they highlighted several overarching requirements for making such changes. These include the need to ensure that all Census products provide data that are useful in the implementation, enforcement, and monitoring of the nation’s civil rights laws by collecting data comparable to that collected in previous decades, and the goal of maximizing the accuracy of the data being collected through comprehensive testing of all questions and instructions with multiple language communities.

The Census Bureau has ongoing collaborations with the Civil Rights community, academics, national researchers, community and organizational leaders, the Census Bureau’s Advisory Committees, and other key stakeholders. Results of the 2010 Alternative Questionnaire Experiment (AQE) research showed that the combined question is comparable to current and past Census tabulations on race and ethnicity because it easily maps to the traditional race and ethnicity groups produced by the two separate questions. Similar analyses are planned for data collected during the 2015 National Content Test (NCT) before 2020 Census content decisions are finalized.

The NCT is the major means of testing proposed changes to the race and ethnicity questions for this decade. Since the 2010 AQE, the Census Bureau has continued its research on measurement of race and ethnicity. This ongoing program of research designed to inform the 2020 Census includes qualitative research, such as focus groups, cognitive interviewing, expert reviews, and usability testing, with Spanish-dominant residents as well as other language groups. For example, recent Census Bureau qualitative research found that the terms “race,” “ethnicity,” and “origin” are confusing or misleading to many respondents and mean different things to different people.

This qualitative work also informed the development of the 2015 NCT, which was conducted in the fall of 2015. The NCT was a large-scale test designed to ascertain the best way to elicit and collect detailed data on race and ethnicity for the 2020 Census, testing alternative ways to word questions, instructions, and questionnaire layout, including refinements for collecting detailed race/ethnic responses, with the goal of optimizing reporting for all communities.  Data were collected in multiple modes, including Internet, mailed paper questionnaires, and interviewer-administered over the telephone. The 2015 NCT also included a reinterview to assess the accuracy and reliability of the question alternatives for race and ethnicity.  The data collected in the 2015 NCT will allow us to compare the success of different race and ethnicity designs to determine how questions perform in various modes and with specific populations. The 2016 Census Test will use a version of question wording tested in the 2015 NCT.

In further attempts to optimize self response and data quality, the Internet response option will differ from the paper response option in the 2016 Census Test. We plan to include both multiple checkboxes and the option to manually enter multiple responses for each race a respondent selects, to better capture the diversity and detail of our population. The Internet and NRFU applications have 200 characters available for these write-in replies. Response data will be used to inform future decisions regarding race and ethnic origin.


Administrative Records and NRFU. A significant set of comments was related to the use of administrative records for enumeration of hard-to-count populations. The Census Bureau plans to use the most cost-effective strategy for contacting and counting people to ensure fair representation of every person in the United States. The Census Bureau will only use administrative records for the enumeration of nonresponding housing units where high-quality administrative records are available from trusted sources. Where high-quality data are not available from trusted sources, we will continue in-person visits to reach nonresponding housing units up to an established maximum number of visits. In addition to our ongoing research and testing efforts, in partnership with the Census Bureau’s National Advisory Committee (NAC), we developed a Hard to Count NAC working group, which has been focusing on understanding which populations are well-represented in administrative records and which ones would be most affected by the use of these new methodologies. This NAC will provide recommendations on how to best reach the latter groups.

During the 2016 Census Test, after giving the population in the test sites an opportunity to self-respond to the test census, we will use administrative records to identify vacant units so enumerators do not have to visit those addresses. For the remaining workload, enumerators will make one visit to collect information in person. After the attempt to collect information in person, only where we have high-quality administrative records from trusted sources will we use that data as the response for the housing unit. Where high-quality administrative records are not available, we will continue in-person attempts to reach the nonresponding housing units up to an established maximum number of visits. For those nonresponding addresses where we could use administrative records as the response for the housing unit, the Census Bureau will mail a postcard providing an additional opportunity for residents of the housing unit to self-respond.

NRFU Technological Improvements and Operational Procedures. One comment recommended that the Census Bureau ensure the field workforce for the Nonresponse Followup portion of the test reflects the racial, ethnic, and linguistic diversity of the test sites.  The Census Bureau agrees and, for the 2016 Census Test, will continue to implement field hiring practices to support the concept of hiring staff to enumerate at or near where they live, and in their communities.

Focus Groups. One comment recommended that the focus groups held with Latino participants reflect the diversity of the Latino communities in the test sites. As noted above, for the 2016 Census Test we will conduct focus groups in each site: one in English and one in Spanish.

Regional Offices and Partnership Program. Since the 2010 Census many of our partnership efforts have been maintained through our data dissemination specialist program, which trains data users on how to access and use Census Bureau data, allowing us to maintain a level of presence and sharing of census data in communities across the nation. We have also implemented partnership activities in recent tests, beginning with the 2015 Optimizing Self-Response Test in Savannah, Georgia. For the 2016 Census Test, in keeping with our policy of hiring individuals from local communities, we plan to hire partnership staff who reside in or represent local test areas. We will work with local officials to inform them about the test, coordinate with trusted voices in the community to gain their help in encouraging respondent participation, participate in community events, and reach out to special groups, such as faith-based organizations, schools and non-profits. 

The Census Bureau’s 2020 Operational Plan provides additional context as to how the various tests over this decade build on one another or otherwise fit together to lead to final design decisions. Many design decisions are documented in the initial version of this document, completed in 2015.

The second notice for public comment on the 2016 Census Test was published in the Federal Register November 9, 2015 (Vol. 80, pp. 69188-69193). The Census Bureau received two sets of comments during the 30-day review period. The first set of comments was from the NHLA, and the second set was from the NALEO Educational Fund. The comments received from these two organizations overlapped significantly, and many repeated comments from the 60-day review because the responses provided in the 30-day notice addressed the comments more generally than this document does. Responses to the 60-day notice comments are provided above.

Two other comments that were sent in response to the 30-day Federal Register Notice are topics that would be better addressed in Focus Groups and will be considered for inclusion there. The first was a recommendation to determine the extent to which heads of households need help when utilizing technological response modes. The second was a recommendation to evaluate customer satisfaction with the bilingual assistance within the telephone response mode.

An additional comment expressed concern that the Coverage Reinterview would underrepresent the Latino population by drawing the preponderance of its sample from responses received by Internet, when the Latino population may be less likely to respond by Internet. The Coverage Reinterview is the means of collecting the data that will be used to analyze the effectiveness of the tested implementations of the coverage question experiments in the Internet instrument; therefore, the focus on Internet responses is necessary. Coverage questions in other modes have been tested in previous census tests.



9. Paying Respondents

Respondents participating in the 2016 Census Test will not receive any form of compensation for their participation. Respondents participating in the focus groups will receive a $75 stipend after the group concludes.



10. Assurances of Confidentiality

The Census Bureau will conduct the 2016 Census Test under the authority of Title 13 United States Code Sections 141 and 193. All respondents who participate in the 2016 Census Test will be informed that the information they provide is confidential under that law, but that the same law makes participation mandatory. All collected information that identifies individuals will be held in strict confidence according to the provisions of Title 13 United States Code, Sections 9 and 214.



11. Justification for Sensitive Questions

The Census Bureau is currently considering the inclusion of new categories for same-sex couples on the decennial census questionnaire. In August 2009, the Secretary of Commerce requested that the Office of Management and Budget (OMB) establish the interagency task force [Measuring Relationship in Federal Household Surveys] to research issues related to improving the collection and tabulation of marriage and relationship data. One focus of the research was family relationships, particularly with respect to same-sex couples who report being married. The first phase of research involved focus groups conducted primarily with persons cohabiting in same-sex relationships. The focus groups explored the meaning and interpretation of the current decennial Census and American Community Survey (ACS) relationship and marital status items. The second phase of qualitative research was conducted by the Census Bureau under the auspices of the OMB working group. As a result of the focus groups and expert panel review, two alternatives were developed for recommended wording to be further tested in larger-scale quantitative content tests. The 2016 Census Test includes the relationship question tested in the 2015 National Content Test which includes these same-sex response categories.

12. Estimate of Burden Hours

Type of Respondent / Operation | Estimated Number of Respondents | Estimated Time per Response | Estimated Total Annual Burden Hours
Self Response | 250,000 | 10 minutes | 41,667 hrs
NRFU | 120,000 | 10 minutes | 20,000 hrs
NRFU Quality Control Reinterview | 12,000 | 10 minutes | 2,000 hrs
Non-ID Manual Processing – phone followup | 400 | 5 minutes | 34 hrs
Coverage Reinterview | 24,500 | 10 minutes | 4,084 hrs
Non-ID Response Validation | 5,000 | 10 minutes | 834 hrs
Focus Group Selection Contacts | 288 | 3 minutes | 15 hrs
Focus Group Participants | 160 | 120 minutes | 320 hrs
Totals | 412,348 | | 68,954 hrs
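The burden figures follow directly from respondents multiplied by minutes per response; assuming each row is rounded up to the next whole hour (the apparent convention, given values such as 41,667 and 4,084), the table totals can be reproduced as a check:

```python
import math

# (respondents, minutes per response) for each operation in the table
operations = {
    "Self Response": (250_000, 10),
    "NRFU": (120_000, 10),
    "NRFU Quality Control Reinterview": (12_000, 10),
    "Non-ID Manual Processing - phone followup": (400, 5),
    "Coverage Reinterview": (24_500, 10),
    "Non-ID Response Validation": (5_000, 10),
    "Focus Group Selection Contacts": (288, 3),
    "Focus Group Participants": (160, 120),
}

# Burden hours per row, rounded up to the next whole hour
hours = {op: math.ceil(n * mins / 60) for op, (n, mins) in operations.items()}
total_respondents = sum(n for n, _ in operations.values())
total_hours = sum(hours.values())
print(total_respondents, total_hours)  # 412348 68954
```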



13. Estimate of Cost Burden to Respondents

There are no costs to respondents (self respondents and those contacted during Nonresponse Followup) other than their time to participate in this data collection. Those recruited to participate in subsequent focus groups will be compensated to cover expenses and for their participation.

14. Cost to Federal Government

The cost of this collection is covered under the requested budget for the 2016 Census Test, Research and Testing Program, and is estimated to be $34 million. This amount includes space acquisition, field infrastructure and equipment, recruiting and hiring processes, postage, and print contracts (“other objects”). This estimate also includes salaries for field workers, Census Questionnaire Assistance agents, data capture processing staff, and the staff in headquarters providing program management and/or systems engineering and integration (SE&I) support.

15. Reason for Change in Burden

The increase in burden is attributable to the information collection being submitted as a new collection.



16. Project Schedule

Activity / Milestone | Date / Range
Initial Contact | March 23, 2016
Census Day | April 1, 2016
Public Response Period – Internet and CATI | March 23 – June 16, 2016
Conduct Non-ID Manual Processing phone followup | March 23 – June 16, 2016
Conduct NRFU Interviews | May 12 – June 16, 2016*
Conduct NRFU Quality Control Reinterviews | May 13 – June 21, 2016*
Conduct Coverage Reinterviews | April 25 – June 30, 2016
Conduct Non-ID Response Validation Interviews | May 12 – June 16, 2016
Recruit Focus Group Participants | TBD
Conduct Focus Groups | TBD

*While the main portions of the NRFU and NRFU-RI field work will begin on May 12th and 13th, respectively, we will conduct a small number of early interviews starting on May 2nd to ensure systems integration and operational readiness for the test.



17. Request to Not Display Expiration Date

No exemption is requested.



18. Exceptions to the Certification

There are no exceptions to the certification.

1 Compton, E., Bentley. M., Ennis, S., Rastogi, S., (2012), “2010 Census Race and Hispanic Origin Alternative Questionnaire Experiment,” DSSD 2010 CPEX Memorandum Series #B-05-R2, U.S. Census Bureau.


