2013 National Census Contact Test

OMB: 0607-0972



SUPPORTING STATEMENT

U.S. Department of Commerce

U.S. Census Bureau

2013 National Census Contact Test


Part A Justification


Question 1. Necessity of the Information Collection


The U.S. Census Bureau is committed to using alternative approaches for contacting potential respondents, such as cell phones, landlines, text messages, and e-mails, in an effort to reduce costs by increasing self-response. However, developing and implementing successful and secure contact and response strategies for the 2020 Census requires research throughout the next decade. The Census Bureau must conduct a series of research projects and tests to fulfill its commitment to provide the public with an option to complete the 2020 Decennial Census questionnaire using these alternate contact strategies, beginning with a self-response test.

The 2013 National Census Contact Test (NCCT), formerly named the 2013 Alternative Contact Strategy Test, supports this alternate contact research. The 2013 NCCT will be conducted over the telephone, using a computer-assisted telephone interviewing (CATI) instrument, with approximately 40,000 households between January 7, 2013, and February 1, 2013. These interviews will enable Census staff to assess the quality of the data from vendor files that are under consideration for use in the construction of an Alternate Contact Frame.

The 2013 NCCT questionnaire will ask respondents for basic demographic information collected in a census, including household roster, age, race, Hispanic origin, relationship, and sex. The questionnaire will also request email addresses and telephone numbers from respondents. Afterward, the respondent-provided addresses, phone numbers, and email addresses will be matched to the vendor data being evaluated, as well as to the Master Address File (MAF), a national inventory of living-quarters addresses compiled and maintained by the Census Bureau.

In addition, the NCCT provides an opportunity for the Census Bureau to test potential enhancements to its automated processing of responses lacking a pre-assigned Census identification number. "Non-ID Processing," as it is known within the Census Bureau, compares respondent addresses to the MAF. In the case of a non-match, Non-ID Processing includes the assignment of geographic codes to the respondent address, which enables a record to be tabulated to the correct geographic area (e.g., state, congressional district, county, census tract).

Finally, the interview contains a question intended to gauge respondents' attitudes regarding the collection of Global Positioning System (GPS) coordinate data from a respondent's mobile device, such as a cellular phone or tablet, made available through technology referred to as "location-based services."
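The Non-ID Processing flow described above (match the respondent address against the MAF; on a non-match, assign geographic codes so the record can still be tabulated) can be sketched in miniature. This is purely an illustrative sketch: the dictionary-based MAF lookup and the `normalize` and `assign_geocodes` helpers are hypothetical stand-ins, not Census Bureau systems, and real address matching uses far richer standardization and matching logic.

```python
# Illustrative sketch only -- the MAF lookup and geocoder below are
# hypothetical stand-ins, not actual Census Bureau systems.

def normalize(address: str) -> str:
    # Placeholder standardization: uppercase and collapse whitespace.
    return " ".join(address.upper().split())

def assign_geocodes(address: str) -> dict:
    # Placeholder geocoder for non-matched addresses; returns the
    # geography fields needed for tabulation, here left unresolved.
    return {"state": None, "county": None, "tract": None}

def process_non_id_response(address: str, maf: dict) -> dict:
    """Match a respondent address to the MAF; geocode it on a non-match."""
    record = maf.get(normalize(address))
    if record is not None:
        return record["geocodes"]     # matched: reuse the MAF geography
    return assign_geocodes(address)   # non-match: assign geographic codes
```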

The advance letter mailed to respondents prior to the telephone interview will inform them that the survey is mandatory, in accordance with Title 13, United States Code, Sections 141 and 193.



Question 2. Needs and Uses

The Census Bureau designed the 2013 National Census Contact Test to inform 2020 Census testing and planning. The intent is to research and validate the quality of administrative-records files containing alternate contact data for connecting with individuals and households, such as e-mail addresses and cell phone numbers, as well as to evaluate enhancements to the Census Bureau's process for matching and geocoding Non-ID responses. Additionally, responses to the final interview question (regarding collection of respondent coordinate location) will be compiled and will provide some indication of public attitudes regarding the use of location-based services on mobile devices to derive a respondent's location.

This study is one of the initial steps in answering the following research questions:

  1. How can we identify or develop alternative contact frames that can be associated with an address?

  2. What is the coverage of the alternative contact frame over different demographic and geographic characteristics?

  3. Could the contact information be identified on the frame (e.g., best phone, alternate phone)?

  4. How can the Census Bureau more effectively match and geocode addresses lacking a pre-assigned Census ID during automated processing?

  5. For cases not matched to a geocoded record during automated Non-ID Processing, how can the Census Bureau significantly reduce the Field Verification workload?

The results from the 2013 NCCT will influence internal Census Bureau planning decisions that will guide the design of additional 2020 Decennial Census testing later this decade. By testing in 2013, we aim to establish a baseline approach for multi-mode testing. Testing enhancements to Non-ID Processing early in the decade will inform early planning for the 2020 Census design, as well as the infrastructure required to support large-scale processing of electronic Non-ID response data submitted via the Internet or a Census-provided questionnaire application designed for mobile devices.

The Census Bureau plans to make the aggregated results of this study available to the public. Information quality is an integral part of the pre-dissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau's Information Quality Guidelines). Information quality is also integral to the information collections conducted by the Census Bureau and is part of the clearance process required by the Paperwork Reduction Act.

Data from the research will be included in reports with clear statements about the limitations and that the data were produced for strategic and tactical decision-making and exploratory research and not for official estimates. Research results may be prepared for presentation at professional meetings or in publications in professional journals to promote discussion between the larger survey and statistical community, and to encourage further research and refinement. Again, all presentations or publications will provide clear descriptions of the methodology and its limitations.



Question 3. Use of Information Technology


The Census Bureau will use a web-based CATI instrument to collect survey responses, reducing respondent burden through a more efficient interview (including using technology to invoke skip patterns efficiently). Automation will also be used to compare survey data to frame data.

Prior to the start of interviewing, the NCCT will validate all sample phone numbers, and eliminate non-productive numbers, using a vendor-provided service called "Genesys CSS." The service first compares the phone numbers to a comprehensive national database to identify any listed non-residential (e.g., business) phone numbers, as well as any cell phone numbers. Next, vendor staff manually dial the remaining (i.e., residential, non-cellular) phone numbers. Calling is always conducted from 9 a.m. to 5 p.m. in the time zone associated with the area code. If someone answers, the vendor agent asks whether the number dialed is a business. Once residential or commercial status is established, the vendor agent says either "Sorry, I have dialed the wrong number" (residential) or "Thank you, we are conducting research and thank you for your time" (business) and ends the call. Use of this vendor service will eliminate non-valid numbers from the survey sample and thus make calling more efficient (i.e., less costly).
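The screening described above amounts to a two-stage filter over the sample. The sketch below is purely illustrative and is not the Genesys CSS interface: the `SampleNumber` type, the classification labels, and the handling shown (dropping listed business and invalid numbers, keeping cell numbers without manual dialing, and keeping landlines only when manual dialing confirms a residence) are one plausible reading of the process, not a documented vendor API.

```python
from dataclasses import dataclass

# Hypothetical classification labels; the actual vendor service
# exposes its own interface, not this one.
RESIDENTIAL, BUSINESS, CELL, INVALID = "residential", "business", "cell", "invalid"

@dataclass
class SampleNumber:
    phone: str
    db_class: str                        # stage 1: database lookup result
    confirmed_residential: bool = False  # stage 2: manual dialing result

def screen(sample):
    """Keep only sample numbers that survive both screening stages."""
    kept = []
    for n in sample:
        if n.db_class in (BUSINESS, INVALID):
            continue                 # stage 1 drops listed business/invalid numbers
        if n.db_class == CELL:
            kept.append(n)           # cell numbers are not manually dialed
        elif n.confirmed_residential:
            kept.append(n)           # stage 2 confirms residential landlines
    return kept
```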



Question 4. Efforts to Identify Duplication

A literature review was conducted to identify private- and public-sector best practices for using texts and e-mails to contact households, as well as to identify other internal and external efforts to collect similar information, with the intent of evaluating the utility of supplemental contact information. In addition, members of an expert panel convened by the National Academies of Science were asked for any knowledge they had of similar studies. Although they had no insight into past studies, they indicated much interest in the results of our research, as many industry sectors are extremely interested in alternate ways to contact the public.



Question 5. Minimizing Burden

The collection of information targets households and should have no effect on small businesses.



Question 6. Consequences of Less Frequent Collection

This data collection will support 2020 decennial census planning and research. If this data collection were cancelled, the Census Bureau would lack some quantitative evidence to support decennial census design decisions.



Question 7. Special Circumstances

There are no special circumstances.



Question 8. Consultations outside the Agency

The notice for public comment, entitled, “Proposed Information Collection; Comment Request; 2013 Alternative Contact Strategy Test,” was published in the Federal Register on July 13, 2012 (Vol. 77, No. 135, pp. 41370 – 41371). No comments were received.

Cognitive testing participants recruited from outside the Census Bureau in August 2012 provided their views on the clarity and sensitivity of the survey instrument, and minor changes to the instrument were made based on those interviews.



Question 9. Paying Respondents

Respondents will not be paid or provided with gifts for participating in this data collection.

Question 10. Assurance of Confidentiality

In accordance with Title 13, United States Code, Section 9, any information collected in this test that can identify individuals is protected and kept confidential. This study complies with the requirements of the Privacy Act and the Paperwork Reduction Act. All persons involved with the collection of these data will be Census Bureau employees, contractors, or other persons appointed to special sworn status. All individually identifiable data will be kept in a secure environment.

The materials mailed to the respondents will inform them that the survey is mandatory, in accordance with Title 13 United States Code, Sections 141 and 193. These materials will also notify them of the confidentiality safeguards.



Question 11. Justification for Sensitive Questions

The content of the 2013 National Census Contact Test will include household roster and characteristic information similar to the 2010 Census. However, contact information such as cell phone numbers, email addresses, and associated usage metrics will also be collected for household members. Usage metrics will include information on how often a particular e-mail address is checked.



Question 12. Estimate of Hour Burden

Approximately 40,000 households will be selected for the study. Our estimates of survey response rates account for ineligible cases and unknown eligibility and include both completed interviews and sufficient partials. Overall, we estimate that 25 percent of the sampled cases will result in a completed interview (10,000 out of 40,000 households). This estimate is based on a review of previous telephone studies and is most comparable to the response rate from a random-digit-dial (RDD) telephone sample. The burden estimates are shown in the table below.




                          Total # of     Estimated       Estimated
                          Respondents    Response Time   Burden Hours
CATI – Highest Burden     40,000         7 minutes       4,667
CATI – Expected Burden    10,000         7 minutes       1,167
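The burden-hour figures follow directly from the respondent counts and the 7-minute interview length (respondents × 7 ÷ 60, rounded to the nearest hour). A minimal sketch of the arithmetic:

```python
def burden_hours(respondents, minutes_per_interview=7):
    """Total burden hours: respondents x minutes per interview / 60, rounded."""
    return round(respondents * minutes_per_interview / 60)

highest = burden_hours(40_000)   # all sampled households respond -> 4,667 hours
expected = burden_hours(10_000)  # anticipated 25% completion -> 1,167 hours
```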



Question 13. Estimate of Cost Burden

There are no costs to respondents other than their time to respond.



Question 14. Cost to Federal Government

Estimated costs to the U.S. Census Bureau are approximately $658,000, which includes: the development of a survey instrument; use of a phone-number validation service prior to finalizing the sample; printing, assembly, and shipment of an advance notification letter; and the implementation of the survey. Implementation also includes a toll-free number that respondents may call with any questions about the survey, staffed by interviewers who are trained to conduct the interview given the respondent's consent.



Question 15. Reason for Change in Burden

This is a new collection request.



Question 16. Project Schedule

Survey drafted               6/2012
Cognitive testing            8/2012
Study plan drafted           9/2012
Instrument programmed        11/2012
CATI interviewers trained    1/2013
Begin data collection        1/2013
End data collection          1/2013
Conduct analysis             2/2013
Draft report                 4/2013
Final report                 7/2013



Question 17. Request to Not Display Expiration Date

None.



Question 18. Exceptions to the Certification

None.


