
SUPPORTING STATEMENT

U.S. Department of Commerce

U.S. Census Bureau

American Community Survey Methods Panel Tests

OMB Control Number 0607-0936


Part A. Justification


  1. Necessity of the Information Collection


The U.S. Census Bureau requests authorization from the Office of Management and Budget (OMB) to conduct the American Community Survey (ACS) Methods Panel tests.


American Community Survey


The Census Bureau has developed a methodology to collect and update demographic, social, economic, and housing data every year that are essentially the same as the "long-form" data that the Census Bureau traditionally has collected once a decade as part of the decennial census. Federal and state government agencies use such data to evaluate and manage federal programs and to distribute funding for various programs that include Supplemental Nutrition Assistance Program benefits, transportation dollars, and housing grants. State, county, and community governments, nonprofit organizations, businesses, and the general public use information like housing quality, income distribution, journey-to-work patterns, immigration data, and regional age distributions for decision-making and program evaluation.


In years past, the Census Bureau collected the long-form data only once every ten years, and the data became out of date over the course of the decade. To provide more timely data, the Census Bureau developed the ACS. The ACS blends the strength of small area estimation with the high quality of current surveys. There is an increasing need for current data at lower levels of geographic detail. The ACS is now the only source of uniform data available about general demographic and housing characteristics for small areas across the Nation and in Puerto Rico. In addition, there is increased interest in obtaining data for small subpopulations such as groups within the Hispanic, Asian, and American Indian populations, the elderly, and children. The ACS provides current data throughout the decade for small areas and subpopulations.


For households eligible to receive survey materials by mail, the first contact includes a letter (Attachment A1 – ACS-13) and instruction card (Attachment A2 – ACS-34IM) explaining how to complete the ACS survey online, and encouraging them to do so promptly. A Frequently Asked Questions (FAQ) brochure (Attachment A3 – ACS-10SM) is included and answers questions about confidentiality, the benefits and uses of the information, as well as the legal requirements related to the survey. A multi-lingual brochure is also included in this mailing (Attachment A4 – ACS-9). The mailing arrives in an envelope that indicates that response is required by law (Attachment A5 – ACS-46IM). This mailing is also referred to as the initial package. The Internet questionnaire (Attachment B Internet Screen Capture Guide) has space to collect detailed information for twenty people in the household.


The second mailing is a letter (Attachment C1 – ACS-20(L)) in an envelope (Attachment C2 – ACS-40) that reminds respondents to complete the survey online, thanks them if they have already done so, and informs them that a paper form will be sent later if we do not receive their response.


In a third mailing, the ACS Housing Unit (HU) Questionnaire Package is sent only to those sample addresses that have not completed the online questionnaire within two weeks and includes a paper copy of the questionnaire. In addition to the paper questionnaire (Attachment D1 – ACS-1), this mailing includes a letter (Attachment D2 – ACS-14), instruction card (Attachment D3 – ACS-34RM), return envelope (Attachment D4), FAQ brochure (Attachment D5 – ACS-10SM), and question guide (Attachment D6 – ACS-30). The mailing arrives in an envelope that indicates that response is required by law (Attachment D7 – ACS-46). This mailing is also referred to as the replacement package.


The fourth mailing is a postcard (Attachment E – ACS-29) reminding respondents to complete the survey. A fifth mailing (Attachment F – ACS-23) is a reminder postcard sent to respondents who have not completed the survey within five weeks and are not eligible for telephone followup because we do not have a telephone number for the household.


After the self-response modes of Internet and mail, the next mode of data collection is computer-assisted telephone interviewing (CATI). This is used to conduct telephone interviews for all households that do not respond by Internet or mail and for which we were able to obtain telephone numbers.


The final mode of data collection is computer-assisted personal interviewing (CAPI) and is used to conduct personal interviews for a sample of addresses for which we have not obtained a self-response (Internet or paper) or CATI interview.


We provide telephone questionnaire assistance (TQA) for respondents who need assistance with completing the Internet or paper questionnaires, who have questions about the survey, or who would like to complete the ACS interview over the telephone instead of by other modes.

Methods Panel Tests

The ACS samples about 3.5 million housing unit addresses in the United States and 36,000 in Puerto Rico each year to collect detailed socioeconomic data. The ACS also samples about 195,000 residents living in Group Quarters (GQ) facilities to collect detailed socioeconomic data. Resulting tabulations from that data collection are provided on a yearly basis. The ACS allows the Census Bureau to provide timely and relevant housing and socioeconomic statistics, even for low levels of geography.

An ongoing data collection effort with an annual sample of this magnitude requires that the ACS continue research, testing, and evaluations aimed at improving data quality, achieving survey cost efficiencies, and improving ACS questionnaire content and related data collection materials. The ACS Methods Panel is a research program that is designed to address and respond to emerging issues and survey needs. Over the next three years, the Methods Panel may include testing methods for increasing survey efficiencies, reducing survey cost, lessening respondent burden, and improving response rates. Testing may also include methods to improve data quality.

At this time, plans are in place to propose several tests: a summer 2015 mail messaging test, a fall 2015 mail messaging test, a 2016 ACS Content Test, a 2016 mail messaging test, a 2017 self-response test with the potential to test both mail messaging and questionnaire content, a 2018 self-response test building on the previous tests, as well as tests of Internet data collection enhancements in 2017 and 2018. Since the ACS Methods Panel is designed to address emerging issues, we may conduct additional testing as needed. Any additional testing would focus on methods for reducing data collection costs, improving data quality, revising content, or testing new questions that have an urgent need to be included on the ACS. Please note that this proposal includes the summer and fall 2015 mail messaging tests, which were not included in the pre-submission notice.

We are requesting approval to conduct three tests for which plans are in a relatively mature status: a summer 2015 mail messaging test (planned to be mailed out in late August 2015 in connection with the September sample panel), a fall 2015 mail messaging test (planned to be mailed out in late October 2015 in connection with the November sample panel), and a 2016 ACS Content Test.


We are also proposing several other tests that are not yet fully scoped, but are generally described in this package. For those later tests, when revisions to the content of the questionnaire are being tested, or when the messaging revisions being tested are likely to be of high level interest to the public, we will provide an opportunity for public comment via a 30-day Federal Register notice, and additional details will be provided. When later tests are not likely to be of high interest to the public, and are of a more operational nature, we will provide updated plans with additional details through a non-substantive change request.


Summer 2015 Mail Messaging Test


In response to respondent concerns about prominent references to the mandatory nature of participation in the ACS, the Census Bureau plans to test methods to soften the mandatory messages while emphasizing the benefits of participation in the survey. In May 2015, the Census Bureau conducted a test to study the impact of removing the phrase “YOUR RESPONSE IS REQUIRED BY LAW” from the envelopes used in the first and third mailings to respondents. The Summer 2015 Mail Messaging Test will advance the study of mandatory messaging as well as improve communication with respondents by modifying the messages included in several of the mailings, including postcards and letters.


In preparation for the Summer 2015 Mail Messaging Test, the Census Bureau solicited feedback about the mandatory messages on the current ACS mail materials from respected experts in the field of survey methodology. Additionally, in 2014, the Census Bureau conducted messaging and mail package assessment research that helps us address frequent questions and concerns we hear about the ACS surrounding privacy, intrusiveness, value of the data, and burden of completion. This research included several iterative rounds of qualitative and quantitative testing to improve the way we communicate about the importance of the ACS and the benefits to communities that result from the data. The purpose of this research was to develop messages to increase ACS self-response rates as well as to obtain insights to support general outreach, data dissemination, materials development, and call center and field operations. The design of the mail materials proposed for these tests is based on the key findings from this study. The final report, “American Community Survey Messaging and Mail Package Assessment Research: Cumulative Findings,” can be found on the Census Bureau’s website at: http://www.census.gov/acs/www/Downloads/library/2014/2014_Walker_02.pdf.


Taking this research and feedback from survey methodology experts into account, the Census Bureau is proposing to test five sets of mail materials aimed at improving the way we communicate the importance and benefits of the ACS as well as reducing or modifying statements about the mandatory nature of the survey. A description of each treatment follows Table 1, which explains the proposed experimental design.

Table 1. Experimental Design for the Summer 2015 Mail Messaging Test

Control Design (same look and feel as production for the letters, envelopes, and brochures):

Existing Mandatory Messaging – Control Design Treatment (n=24,000)

  • Use production design of materials

  • Use production wording of materials

  • No multi-lingual brochure

Experimental Softened Messaging – Softened Mandatory Messaging Treatment (n=24,000)

  • Use production design of materials

  • Modify production wording to minimize and soften mandatory messaging

  • Deadline/timing messages used throughout materials (e.g., “Open Immediately” on envelopes)

  • No multi-lingual brochure

Minimal Mandatory Messaging – no treatment under the control design

Revised Design (new look and feel of letters, envelopes, and brochures, and modified content):

Existing Mandatory Messaging – Design Treatment (n=24,000)

  • Use revised design of materials where appropriate (envelopes/letters/postcards)

  • Use wording that includes prominent mandatory messaging

  • Eliminate FAQ and multi-lingual brochures from mail materials

  • Deadline/timing messages used throughout materials (e.g., “Open Immediately” on envelopes)

Experimental Softened Messaging – Softened Design Treatment (n=24,000)

  • Use revised design of materials where appropriate (envelopes/letters/postcards)

  • Modify wording to minimize and soften mandatory messaging

  • Eliminate FAQ and multi-lingual brochures from mail materials

  • Deadline/timing messages used throughout materials (e.g., “Open Immediately” on envelopes)

Minimal Mandatory Messaging – Minimal Design Treatment (n=24,000)

  • Use revised design of materials where appropriate (envelopes/letters/postcards)

  • Modify wording to eliminate all mandatory messaging with the exception of one reference on the back of the letter in the initial mail package

  • Eliminate FAQ and multi-lingual brochures from mail materials

  • Deadline/timing messages used throughout materials (e.g., “Open Immediately” on envelopes)

Control Design Treatment – The mail materials in this treatment have no revisions to the mandatory messages. The multi-lingual brochure will not be sent in the mail package for this test. This change to the production materials minimizes confounding effects with the other experimental treatments.


Softened Mandatory Messaging Treatment – This experimental treatment builds on the Control Design Treatment. Mandatory messaging on envelopes, postcards, letters, and brochures is removed or softened. We softened emphasis on the mandatory message by using plain text instead of bold text and by including the mandatory message in sentences with statements about the benefits of the survey (see Attachment G – Softened Mandatory Messaging Treatment for wording modifications).


Design Treatment – This experimental treatment primarily uses materials designed as a result of the messaging and mail package assessment research that the Census Bureau conducted, with elements that are intended to better emphasize the benefits of participation in the survey. The multi-lingual and FAQ brochures (which were redesigned) will not be sent in the mail materials for this test, in order to test recommendations from external experts that we should significantly streamline the set of materials included in each package. Additionally, the time needed to develop and print these redesigned materials would delay the test from being implemented as quickly as possible. Some information contained in the FAQ brochure will be included in the other mail materials.


The mailing of the design treatment follows the production mailing schedule but several of the materials have been redesigned, including the following: the letter in the initial mailing (Attachment H1 – A3_Official Internet Invitation Letter) and the envelope containing the initial mailing (Attachment H2 – A1_Official Main Envelope), the second mailing reminder letter (Attachment H3 – Rem2_Official Reminder Letter), the letter included in the questionnaire package (Attachment H4 – C6_Official Choice Letter) as well as the mailing envelope (Attachment H5 – C1_Official Main Envelope), the reminder postcard (Attachment H6 – D1_Official Second Reminder Card) and final reminder postcard (Attachment H7 – E1_Official Final Reminder Card). The redesigned FAQ brochure that will not be included can be seen in Attachment H8 – C4_Official FAQ.


Softened Design Treatment – This experimental treatment builds on the Design Treatment. Mandatory messaging on envelopes, postcards, and letters is removed or softened. We softened emphasis on the mandatory message by using plain text instead of bold text and by including the mandatory message in sentences with statements about the benefits of the survey (see Attachment I – Design Treatments Materials for wording modifications).


Minimal Design Treatment – This experimental treatment builds on the Design Treatment. Mandatory messaging on envelopes, postcards, and letters is minimized by removing all references to the mandatory requirement except in the letter in the initial package. We are required by the Paperwork Reduction Act of 1995 to let the respondent know that the survey is mandatory. The initial package letter will have one reference explaining the mandatory nature of the survey on the back of the letter (see Attachment J – Minimal Design Treatment Initial Letter for wording modifications).

Fall 2015 Mail Messaging Test


Respondents sometimes ask why the Census Bureau needs to ask the specific questions on the ACS questionnaire. A better understanding of why we ask the questions, and of how the data from responses to each question are used to benefit respondents’ communities, has been shown to be an effective means of addressing respondent concerns about the sensitive nature of some questions. When responding to the survey by Internet or by mail, the respondent has few tools available to obtain information about why we ask the survey questions. Therefore, the Census Bureau will explore additional tools and materials to provide relevant information to respondents to address these concerns.


One approach will be to test adding information about why we ask the questions to the questionnaire package. We will develop an additional mail piece (Attachment K – ACS-8(X)) that would be inserted in this package to draw the respondent’s attention to information about why we ask some of the survey questions that frequently are of interest to respondents, and to examples of how the data are used to benefit their communities. The test is proposed to be conducted in connection with the November 2015 panel, and will contain the following treatments:


Modified Control Treatment—This treatment would mimic the production mail materials with one exception: the Instruction Card that accompanies the paper questionnaire would be eliminated for this test for better comparison to the experimental treatments.


Experimental Treatment 1—This treatment would be similar to the Modified Control Treatment, but would also include the additional ACS-8(X) mail piece that provides information about why we ask some of the survey questions. Due to limitations in the number of individual components that Census Bureau equipment can insert into a single package, we have removed the Instruction Card from the package in this treatment.


Experimental Treatment 2—This treatment would be similar to Experimental Treatment 1, including the additional ACS-8(X) insert, but would also remove the Question Guide from the package. Given that testing in early 2015 measured the impact of removing the Question Guide, we would like to evaluate the addition of the new insert without the context of the Question Guide.


2016 ACS Content Test


In response to Federal agencies’ requests for new and revised ACS questions, the Census Bureau plans to conduct the 2016 ACS Content Test. The objective of the 2016 ACS Content Test is to determine the impact of changes to question wording and response categories, and of redefinitions of underlying constructs, on the quality of the data collected. Changes to the current ACS content and the addition of new content were identified through the OMB Interagency Committee for the ACS, and must be approved for testing by the OMB. Revisions to twelve questions/topics are proposed for inclusion in the 2016 ACS Content Test:

  • Telephone Service

  • Computer and Internet

  • Relationship

  • Race and Hispanic Origin

  • Health Insurance

  • Health Insurance Premium and Subsidies (new questions)

  • Journey to Work: Commuting Mode

  • Journey to Work: Time Left for Work

  • Number of Weeks Worked

  • Class of Worker

  • Industry and Occupation

  • Retirement Income


There will be two treatments for this test: a Control Treatment and a Test Treatment. The Test Treatment includes the alternative proposed wording that was developed as a result of cognitive testing. The Control Treatment includes the current production ACS versions of the questions, except for Race and Hispanic Origin and the Telephone questions which will use modified versions of the current production questions. In addition, since the Health Insurance Premium and Subsidies questions are proposed new additions to the form, a version of these questions will also be tested in the Control Treatment. See Attachment L1 – ACS-1(X)CTC for the Content Test Control Treatment Questionnaire, Attachment L2 - ACS-1(X)CTT for the Content Test Test Treatment Questionnaire, and Attachment L3 – Content Test Question Wording for revised question wording and new questions in all modes.


In coordination with research for the decennial census and, specifically, the 2015 National Content Test, the 2016 ACS Content Test will also test ways to collect and tabulate detailed information for the detailed Race and Hispanic Origin groups, not just the broad groups as identified in OMB’s October 30, 1997 “Revisions to the Standards for the Classification of Federal Data on Race and Ethnicity.” This testing includes collecting detailed data for specific White and Black population groups, a separate “Middle Eastern or North African” category, as well as a “combined” approach to collecting Race and Hispanic Origin data as compared to a “separate” approach, which collects Race separately from Hispanic Origin.


The Census Bureau proposes to evaluate changes to the twelve questions/topics by comparing the revised questions in the Test Treatment to the current ACS questions in the Control Treatment (for new questions we will compare the performance of question versions to each other). The primary metrics of interest in this comparison include the following: response distributions, item missing data rates, comparisons against benchmark data (when available), and response reliability (as measured by gross difference rates and the index of inconsistency). Response reliability will be measured by conducting a Content Followup (CFU) reinterview. The CFU is a telephone reinterview for all housing units that complete the initial interview. The questions being tested (along with some other ACS questions for context) will be asked as they are asked in the CATI version of the 2016 ACS Content Test instrument, with one exception. For retirement income, instead of evaluating response reliability we will evaluate response bias. In the CFU interview we will ask a series of questions about retirement income from the Current Population Survey, which subject-matter experts believe is the most accurate measure for collecting income information. Attachment L3 contains the CATI wording that will be used in CFU as well as the specific CFU questions for retirement income. In addition to the above metrics, which may also be examined by mode if there are sufficient data, some of the questions being tested have additional secondary metrics (see Attachment M – Secondary Metrics for the Analysis of the 2016 Content Test) that are important to consider in determining whether the Test Treatment wording should be implemented in the future.
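For a dichotomous item, the gross difference rate is the share of matched interview/reinterview pairs whose answers disagree, and the index of inconsistency divides that rate by the disagreement expected if the two responses were statistically independent given their margins. The following sketch illustrates the arithmetic only; the data and function name are hypothetical and are not taken from the Content Test analysis plan.

    # Illustrative sketch: response reliability metrics for a yes/no item,
    # computed from paired original-interview and CFU-reinterview answers.
    # The data are hypothetical; the formulas follow the standard definitions
    # of the gross difference rate (GDR) and the index of inconsistency.

    def reliability_metrics(original, reinterview):
        """original, reinterview: equal-length lists of 0/1 answers for the same cases."""
        n = len(original)
        # Gross difference rate: proportion of cases whose answer changed.
        gdr = sum(o != r for o, r in zip(original, reinterview)) / n
        # Marginal proportions answering "yes" in each interview.
        p1 = sum(original) / n
        p2 = sum(reinterview) / n
        # Disagreement rate expected if the two responses were independent.
        expected = p1 * (1 - p2) + p2 * (1 - p1)
        # Index of inconsistency: observed vs. chance-expected disagreement.
        index = gdr / expected if expected > 0 else float("nan")
        return gdr, index

    # Hypothetical example: 10 paired responses.
    orig = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
    cfu  = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
    gdr, index = reliability_metrics(orig, cfu)
    print(f"gross difference rate = {gdr:.2f}, index of inconsistency = {index:.2f}")

In this toy example two of ten answers change, so the gross difference rate is 0.20; with 60 percent answering "yes" in both interviews, chance alone would produce disagreement 48 percent of the time, giving an index of inconsistency of about 0.42.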


Other Potential Tests


In response to declining response rates and increasing costs, the Census Bureau plans to analyze methods to increase self-response, the least expensive mode of data collection, especially Internet response. The tests would include changes to the messages included in mail materials to motivate the public to respond to the ACS and to increase awareness of the ACS, as well as changes to design elements of the materials, including color and graphics. Additional tests would be conducted in 2016, 2017, and 2018, building on previous tests’ findings.


Additionally, as part of the mail messaging tests in 2017 and 2018, the Census Bureau may include content changes based on continued review of the ACS content in an effort to address respondent concerns and potentially reduce respondent burden. In 2014 the U.S. Census Bureau conducted a top-to-bottom review of the ACS. The goal of this review was to ensure that the survey contains only the content and questions that are necessary, given that response to the survey, an official component of the Decennial Census Program, is mandatory. Additionally, among other activities, the Census Bureau is reviewing questions to determine if we can revise the wording in a way to make them less burdensome for survey respondents, especially for questions determined during the 2014 ACS Content Review to be especially sensitive, difficult, or time-consuming. Proposed changes to content would be cognitively tested and then included in a field test to assess the impact on both respondent burden and data quality.


Also, the ACS began collecting data using the Internet in January 2013. To date, the website used to collect the data is designed for a desktop computer screen. The Internet tests being proposed in 2017 and 2018 would evaluate Internet data collection via mobile devices, examine ways to reduce Internet break-offs, test e-mail contacts, and pursue other improvements to Internet data collection. Recently, the Census Bureau conducted usability testing of the ACS Internet instrument on mobile devices, finding that the current design causes many user problems due to the font size and display.1 After a redesign of the instrument for mobile devices later this year, the Census Bureau is proposing to test the implementation in a field test. Additionally, in 2014 the Census Bureau conducted an Internet test that redesigned aspects of the Internet instrument to reduce break-offs and tested the inclusion of e-mail reminders for respondents who start the interview on the Internet but do not complete it.2 The tests proposed here would build on findings from that test as well as address emergent issues.


Other testing not outlined above is being considered, but the specific details of these tests are not known at this time. These tests would cover similar topics, testing content and methods to address emergent issues or needs. The tests may be conducted on residential households, group quarters, or both. Although burden hours associated with the tests described above are included in this package, any additional burden hours needed for other testing would be requested separately in the future as needed.


The Census Bureau collects data for this survey under authority of Title 13, United States Code, Sections 141 and 193. All data are afforded confidential treatment under Section 9 of that Title.


  2. Needs and Uses


The primary need for continued full implementation of the ACS is to provide comparable data at small geographies, including metropolitan and micropolitan areas, as well as the census tract and block group level. These data are needed by federal agencies and others to provide assurance of long-form type data availability since the elimination of the long form from the 2010 Census. For instance, the Department of Housing and Urban Development (HUD) uses state, county, and metropolitan area level ACS median income estimates to allocate Section 8 Housing funds and to set Fair Market Rents for metropolitan areas.3 Both these calculations use a yearly update factor based on ACS data and baselined data (currently from the Census 2000 Long Form, though HUD is in the process of phasing this out).4


State and local governments are becoming more involved in administering and evaluating programs traditionally controlled by the federal government. This devolution of responsibility is often accompanied by federal funding through block grants. The data collected via the ACS are useful not only to federal agencies but also to state, local, and tribal governments in planning, administering, and evaluating programs. For example, within the Department of Health and Human Services (HHS), the Low Income Home Energy Assistance Program (LIHEAP) uses ACS data at the state level of geography in both its funding formula and its program administration.5 Additionally, the USDA’s Food and Nutrition Service (FNS) provides states and school districts with data based on ACS poverty estimates in order to better target services for the Supplemental Nutrition Assistance Program.6


The ACS provides more timely data for use in estimation models that provide estimates of various concepts for small geographic areas. In essence, detailed data from national household and GQ surveys (whose samples are too small to provide reliable estimates for states or localities) can be combined with data from the ACS to create reliable estimates for small geographic areas. The Department of Education’s Title I program, under the Elementary and Secondary Education Act reauthorization in 2001,7 uses the Census Bureau’s Small Area Income and Poverty Estimates (SAIPE) to allocate funds to school districts in order to close the achievement gap between upper- and lower-income students. The SAIPE program uses ACS income estimates as a key input in its model. As an additional example, the Department of Transportation’s Federal Highway Administration (FHWA) uses American Community Survey Journey to Work estimates (including means of transportation, time a worker leaves the house to go to work, travel time, and work location) to create traffic flow models.8 These flow patterns are used by both the FHWA and state transportation agencies to plan and fund new road and other travel infrastructure projects.


We will continue to examine the operational issues, research the data quality, collect cost information, and make recommendations in the future for this annual data collection. The ACS Methods Panel tests, such as the 2015 mail messaging tests, the 2016 Content Test, the 2016 mail messaging test, the 2017 self-response test, and the 2018 self-response test, provide a mechanism to investigate ways to reduce or at least maintain data collection costs and improve the quality of the data.


Information quality is an integral part of the pre-dissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau's Information Quality Guidelines). Information quality is also integral to the information collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act of 1995.


  3. Use of Information Technology


The ACS data collection includes response by Internet (available in English and Spanish). Sampled addresses are mailed materials to request a response online. Households that do not respond online in a timely manner are mailed a paper questionnaire and reminder(s) urging response. The Mail Messaging Tests as well as the Self-Response Tests may modify the frequency and type of materials mailed to respondents to encourage response.


Several of the mail pieces include a toll-free number to reach staff at the TQA center. In the TQA operation, interviewers can complete the ACS interview with a respondent using an automated data collection instrument. Additionally, several mailing pieces include a URL for the ACS where respondents can go to obtain more information about the ACS.


For the 2016 Content Test, households that do not return their mailed questionnaire will be contacted for a computer-assisted telephone interview (CATI) (or sent another postcard if there is no phone number for the address) or a computer-assisted personal interview (CAPI). The Content Test, along with tests in 2017 and 2018 that include content changes, will also include a CATI Content Followup (CFU), which uses another automated instrument. These technologies allow us to skip past questions that may be inappropriate for a person or household, which, in turn, keeps respondent burden to a minimum.


  4. Efforts to Identify Duplication


The ACS is the instrument used to collect long-form data that traditionally were collected only during the decennial census. The content of the ACS reflects topics of which the Congress has been properly notified, as directed by law, and which the OMB has approved the Census Bureau to collect. A number of questions in the ACS appear in other demographic surveys, but the comprehensiveness of the ACS questions, coupled with the ability of the ACS sample design to support the tabulation and dissemination of data for small geographic areas, fills a unique role.


In addition, the OMB Interagency Committee for the ACS, co-chaired by OMB and the Census Bureau, includes more than 30 participating agencies and meets periodically to examine and review ACS content. This committee provides an extra safeguard to ensure that other agencies are aware of the ACS content and do not duplicate its collection and content with other surveys.


The ACS Methods Panel is the only testing vehicle for the ACS. There is no other program designed to improve the ACS.


Multiple federal household surveys collect information on the topic of health insurance coverage, including the Current Population Survey (CPS), the National Health Interview Survey (NHIS), and the ACS. With the passage of the Patient Protection and Affordable Care Act (ACA), people have expanded options for getting coverage, as well as new types of coverage, which may create additional confusion as to the types of plans that they have. The additions and changes proposed for the 2016 ACS Content Test will help ACS respondents provide responses that are more accurate. By combining new information on premiums and subsidies with existing questions on coverage types in the ACS, the Census Bureau hopes to be able to resolve conflicting reports of health insurance coverage and produce more accurate estimates of coverage types. On a more general level, the CPS, the NHIS, and the ACS have different strengths when it comes to health insurance data. The CPS makes it possible to examine health insurance coverage in conjunction with detailed employment and income data. The NHIS allows detailed collection of health insurance coverage information in connection with other detailed health data. The ACS allows the public to obtain estimates of health insurance coverage at the very lowest levels of geography, which is important for decision-makers at the local level, as well as for meeting many federal data needs, including those outlined in the ACA.9


Some question changes that are being considered for the future ACS are also being considered for the 2020 Census (specifically, Race, Hispanic Origin, and Relationship). Preparation for the 2016 ACS Content Test in connection with these items has involved close collaboration with preparation for the 2015 National Content Test. Specifically, cognitive testing that was conducted in preparation for the 2016 ACS Content Test helped to inform the design of the question wording included in both tests. Although the design of the 2015 National Content Test allows for many more treatments of question revisions (the 2016 ACS Content Test only includes two main versions for race and ethnicity – the “separate” and “combined” approaches), testing the combined question on the 2016 ACS Content Test will provide additional questionnaire design and data mode research opportunities, including evaluating the performance of the new “Middle Eastern or North African” (MENA) category and Relationship question in the ACS. Although the 2015 National Content Test is designed for self-response modes (paper, Internet, and Telephone Questionnaire Assistance), the 2016 ACS Content Test allows for additional testing in interviewer-administered telephone and personal visit non-response followup modes. In addition, the 2016 ACS Content Test will also provide operational testing of questions to analyze areas such as autocoding, residual coding, and classification and tabulation of future ACS data products, as well as providing an opportunity to examine selected socioeconomic characteristics in the 2016 ACS Content Test in connection with responses to the modified questions. Results from both the 2015 National Content Test and the 2016 ACS Content Test will be used together to make recommendations for future changes to the production ACS and the 2020 Census.



  5. Minimizing Burden


Research and data from survey administrators indicate that the ACS HU questionnaire takes an estimated 40 minutes to complete, CATI/CAPI data collection takes an estimated 27 minutes, and response via the Internet takes an estimated 39 minutes. Every effort is made to minimize the time needed for respondents to answer the questions in all ACS data collection operations.


  6. Consequences of Less Frequent Collection


A less frequent data collection plan would preclude the Census Bureau's goal of producing data annually in order to examine year-to-year changes in estimates. The ACS is conducted monthly because we need to collect data every month for developing an annual average. A monthly survey also helps us stabilize workloads across the year for CATI and CAPI operations and observe seasonal changes that occur.


The Methods Panel tests represent one-time, special tests with a defined period for data collection.


  7. Special Circumstances


The Census Bureau will collect these data in a manner consistent with the OMB guidelines.


  8. Consultations Outside of the Agency


In August 2012, the Office of Management and Budget (OMB) in conjunction with the Census Bureau established a Subcommittee of the Interagency Council on Statistical Policy (ICSP) on the ACS. The ICSP Subcommittee on the ACS exists to advise the Chief Statistician at OMB and the Director of the Census Bureau on how the ACS can best fulfill its role in the portfolio of Federal household surveys and provide the most useful information with the least amount of burden. It may also advise Census Bureau technical staff on issues they request the subcommittee to examine or that otherwise arise in discussions.


The content of the ACS is a result of extensive consultation during meetings with the ICSP Subcommittee on the ACS, advisory committees, and other federal agencies. We also regularly consult with researchers from Westat, NORC, RTI, and other survey and research firms, as well as the University of Michigan’s Institute for Survey Research and other academic institutions. In addition, staff regularly review survey methodology literature and attend conferences that present state-of-the-art methods and procedures.


For the 2016 Content Test, we consulted with the following federal agencies: Office of Management and Budget, Federal Communications Commission, Department of Commerce, Bureau of Economic Analysis, National Telecommunications and Information Administration, Department of Transportation, Federal Transit Administration, Equal Employment Opportunity Commission, Department of Veterans Affairs, Department of Health and Human Services, Social Security Administration, Centers for Disease Control and Prevention, Department of Labor, Bureau of Labor Statistics and National Science Foundation.


We published a notice in the Federal Register on January 22, 2015 (vol. 80, pp. 3213–3215), inviting the public and other federal agencies to comment on our plans to submit this request. We received two comments in response to that notice. The first indicated that the commenter felt that the Census Bureau’s budget was too large; that the Census Bureau conducts too many censuses, too often, that are too costly and too big; and that the results of the surveys are propagandized. The second response was from Andrew Reamer, Research Professor at the George Washington University. Mr. Reamer had many specific comments and suggestions about the design of ACS Methods Panel testing of respondent materials, and referred to findings from research conducted by the Census Bureau working with a contractor (“Team Reingold”) as summarized in the report “American Community Survey Messaging and Mail Package Assessment Research: Cumulative Findings.”10


Specifically, Mr. Reamer commented that the ACS Messaging and Mail Package Assessment Research (MMPAR) found that the mandatory response message was the single most effective message in attracting attention and motivating response, and he suggested that the Census Bureau test the approach recommended in that research, which combines “more and earlier legal warnings” with compelling neighborhood information about the community benefits of the ACS. The Census Bureau has built into the Summer 2015 Mail Messaging Test a panel that includes the final recommended design from the ACS MMPAR research, which reflects these findings. This is identified as the “Design Treatment” in Table 1 of this document. Although we recognize the ACS MMPAR findings related to the effectiveness of mandatory messaging in motivating response, we also acknowledge that those same messages can cause significant concern to some respondents. Therefore, the design of the Summer 2015 Mail Messaging Test also includes softened versions of the “Design Treatment” to be responsive to those concerns and to allow us to measure the relative impact of various components of the changes to the mail materials.


Mr. Reamer had several other specific suggestions for future testing, including: “short generic messages on benefits that are matched to each test community’s particular circumstances;” state-specific information and testimonials from trusted local figures on web pages for ACS respondents, and equipping field staff with similar information; revisions to the ACS FAQ brochure to address the legal history of census data collection and congressional oversight of the ACS questions; and engagement with the Social and Behavioral Sciences Team (SBST) out of the White House Office of Science and Technology Policy on behavioral research, such as research regarding social norms, to improve ACS response rates. The Census Bureau has already begun meeting with the SBST team to discuss their ideas, and is working with them to develop ideas for the 2016 Mail Messaging Test. The Census Bureau will also look for opportunities to include Mr. Reamer’s other suggestions in that test or in other 2017 and 2018 testing.


  9. Paying Respondents


We do not pay respondents or provide respondents with gifts in the ACS.


  10. Assurance of Confidentiality


The Census Bureau collects data for this survey under Title 13, United States Code, Sections 141 and 193. All data are afforded confidential treatment under Section 9 of that Title.


In accordance with Title 13, each household will be assured of the confidentiality of its answers. A brochure sent to sample households with the initial mail package contains this assurance. Households responding using the Internet questionnaire are presented with additional assurances of the confidentiality and security of their online responses.


Household members may ask for additional information at the time of interview. A Question and Answer Guide, and a Confidentiality Notice are provided to respondents, as appropriate. These materials explain Census Bureau confidentiality regulations and standards.


At the beginning of followup interviews (CATI and CAPI), the interviewer explains the confidentiality of data collected and that participation is required by law. For all CAPI interviews, the interviewer gives the household respondent a copy of a letter from the Census Bureau Director explaining the confidentiality of all information provided.


  11. Justification for Sensitive Questions


Some of the data we collect, such as race and sources of income and assets, may be considered to be of a sensitive nature. The Census Bureau takes the position that the collection of these types of data is necessary for the analysis of important policy and program issues and has structured the questions to lessen their sensitivity. We have provided guidance to the CATI and CAPI interviewers on how to ask these types of questions during the interview. The Census Bureau has materials that demonstrate how we use the data from sensitive questions, and how we keep those data confidential. Respondents who use the Internet to complete the survey have access to links on the survey screens that provide information to help address their questions or concerns with sensitive topics.


  12. Estimate of Hour Burden


When quick results are required, when we expect improvements in response rates, when we do not expect adverse impacts to the production estimates, and when we are not testing modified questionnaire content, field tests can be conducted within our production ACS sample. In other cases, we must select a separate sample to conduct the test. Given the cost and complexity of conducting a test using a separate sample, the Census Bureau looks for opportunities to embed tests within the production sample. By doing so, we also minimize respondent burden. Because we plan to use production sample for the Summer 2015 Mail Messaging Test and the Fall 2015 Mail Messaging Test, we do not include additional burden hours for them in this package, since there is no increase over production ACS interviewing. For the other tests outlined in this package, we are requesting additional burden hours, though for some of these tests we may later determine to embed the tests within the production ACS.


Summer 2015 Mail Messaging Test: We have divided the monthly production ACS sample (clearance number 0607-0810, expires 6/30/2018) of 295,000 addresses into 24 nationally representative groups of approximately 12,000 addresses each. We will use two randomly assigned groups for each of the five experimental treatments, yielding 24,000 addresses per treatment and a total experimental sample of 120,000 addresses. The Census Bureau estimates that, for the average household, the ACS questionnaire will take 40 minutes to complete, including the time for reviewing the instructions and answers. As we are using production cases for this test, there is no increase in burden from this test. Although we do expect reductions in self-response for this test, we are using production cases in order to field this test in a timely manner to be responsive to concerns expressed by ACS respondents.
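To illustrate the assignment logic described above, the sketch below randomly partitions a sample into 24 groups and assigns two groups to each of the five treatments, with unassigned groups remaining in production. It is a simplified, hypothetical illustration: it uses a simple random shuffle rather than the Census Bureau's actual method for forming nationally representative groups, and all names in it are invented.

    import random

    # Illustrative sketch of the group assignment described above (hypothetical
    # code, not the Census Bureau's production system): split the monthly sample
    # into 24 groups, then assign two groups to each of the five treatments.

    TREATMENTS = [
        "Control Design",
        "Softened Mandatory Messaging",
        "Design",
        "Softened Design",
        "Minimal Design",
    ]

    def assign_treatments(addresses, n_groups=24, groups_per_treatment=2, seed=2015):
        rng = random.Random(seed)
        shuffled = addresses[:]
        rng.shuffle(shuffled)
        # Deal addresses round-robin into n_groups groups of near-equal size
        # (a stand-in for the nationally representative groups in the real design).
        groups = [shuffled[i::n_groups] for i in range(n_groups)]
        # Randomly pick two groups per treatment; the rest stay in production.
        order = rng.sample(range(n_groups), len(TREATMENTS) * groups_per_treatment)
        assignment = {}
        for t, treatment in enumerate(TREATMENTS):
            for g in order[t * groups_per_treatment:(t + 1) * groups_per_treatment]:
                for address in groups[g]:
                    assignment[address] = treatment
        return assignment  # addresses absent from the dict remain in production

    # Hypothetical example with a synthetic monthly sample:
    sample = [f"address-{i}" for i in range(295_000)]
    assignment = assign_treatments(sample)
    print(len(assignment))  # about 5 treatments x 2 groups x ~12,300 addresses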


Fall 2015 Mail Messaging Test: We will use two randomly assigned groups for each of the three experimental treatments, yielding 24,000 addresses per treatment and a total experimental sample of 72,000 addresses. The Census Bureau estimates that, for the average household, the ACS questionnaire will take 40 minutes to complete, including the time for reviewing the instructions and answers. As we are using production cases for this test, there is no increase in burden from this test.


2016 Content Test: From late February through mid-June 2016, we plan to contact 70,000 residential addresses for the field test portion of the Content Test. We estimate the time to complete the ACS Content Test will be the same as ACS production (40 minutes). During the Content Followup (CFU) portion of the Content Test, we plan to recontact approximately 40,000 addresses that responded to the field test. The CFU reinterview should average about 30 minutes per address. Note that this is an increase from the estimated burden listed in the pre-submission notice, in which we estimated the reinterview would average 15 minutes per address. This increase in burden is due to updated information about the questions that will be asked in the reinterview.


2016 Mail Messaging Test: We plan to contact 60,000 residential addresses. We estimate the time to complete the Mail Messaging Test will be the same as ACS production (40 minutes).


2017 Self-Response Test: We plan to contact 60,000 sampled addresses and 35,000 responding addresses during the content followup conducted by telephone. We estimate the time to complete the test will be the same as ACS production (40 minutes) plus an additional 15 minutes for those addresses included in the content followup.

Internet Tests in 2017 and 2018: In order to study break-offs from the Internet, which do not happen frequently, the sample size must be larger than for other tests. We plan to contact 108,000 sampled addresses for each test. For an average household, the estimated time to complete the ACS is 40 minutes, including the time for reviewing the instructions and answers.

2018 Self-Response Test: We plan to contact 60,000 sampled addresses and 35,000 responding addresses during the content followup conducted by telephone. We estimate the time to complete the test will be the same as ACS production (40 minutes) plus an additional 15 minutes for those addresses included in the content followup.



Table 2. Response Burden

Test and Test Components                    Sampled Units    Estimated Burden Hours

2016 Content Test
  Field Test                                   70,000                46,667
  CFU Reinterview                              40,000                20,000
2016 Mail Messaging Test                       60,000                40,000
2017 Self-Response Test
  Field Test                                   60,000                40,000
  CFU Reinterview                              35,000                 8,750
2017 Internet Test                            108,000                72,000
2018 Self-Response Test
  Field Test                                   60,000                40,000
  CFU Reinterview                              35,000                 8,750
2018 Internet Test                            108,000                72,000
Total                                         576,000*              348,167

*This total counts twice the housing units that are selected for the field test and subsequently included in the CFU reinterview sample.


Annual estimates of the number of respondents and of reporting burden are calculated by dividing the totals above by three years. The estimated total annual burden is 116,056 hours. The Summer 2015 Mail Messaging Test and Fall 2015 Mail Messaging Test will use production sample, so there is no increase in burden from these tests.
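The burden figures in Table 2 follow a simple rule: estimated hours equal sampled units multiplied by estimated minutes per response, divided by 60, and the annual figure divides the three-year total by three. A minimal sketch of the arithmetic follows, using the sampled units and per-response times stated in this package (the code itself is illustrative, not part of the collection):

    # Sketch of the Table 2 burden arithmetic: hours = units * minutes / 60.
    # Units and minutes are taken from this statement; the code is illustrative.
    components = [
        ("2016 Content Test field test",        70_000, 40),
        ("2016 Content Test CFU reinterview",   40_000, 30),
        ("2016 Mail Messaging Test",            60_000, 40),
        ("2017 Self-Response Test field test",  60_000, 40),
        ("2017 Self-Response Test CFU",         35_000, 15),
        ("2017 Internet Test",                 108_000, 40),
        ("2018 Self-Response Test field test",  60_000, 40),
        ("2018 Self-Response Test CFU",         35_000, 15),
        ("2018 Internet Test",                 108_000, 40),
    ]
    total_hours = 0
    for name, units, minutes in components:
        hours = round(units * minutes / 60)
        total_hours += hours
        print(f"{name}: {hours:,} hours")
    print(f"total: {total_hours:,} hours; annual: {round(total_hours / 3):,} hours")

Run as written, this reproduces the table: for example, 70,000 field test responses at 40 minutes each yield 46,667 hours, the components sum to 348,167 hours, and dividing by three gives the 116,056 annual hours cited above.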


  13. Estimate of Cost Burden


There are no costs to respondents other than their time to respond to the survey.


  14. Cost to Federal Government


As requested in the FY 2015 President’s Budget, the estimated cost of the ACS Methods Panel program is approximately $9.045 million per year. The Census Bureau will pay all costs of the Methods Panel tests.


  15. Reason for Change in Burden


This collection is being submitted as a revision because it is an ongoing activity. The burden hours are increasing because the Methods Panel is implementing different tests to improve the ACS. The experimental designs and objectives for these tests require additional sample to measure the impact of the changes.


  16. Project Schedules


The preliminary schedule for the Summer 2015 Mail Messaging Test is in Table 3 below. This schedule assumes that the test will be conducted using the September ACS production sample.


Table 3. Summer 2015 Mail Messaging Test Schedule

Activities                                    Time Frame
Identify test design                          April 2015
Develop and print materials                   May – July 2015
Field test data collection                    August – November 2015
Statistically analyze results                 September 2015 – April 2016
Preliminary self-response results             December 2015
Final results and recommendations released    May 2016


The preliminary schedule for the Fall 2015 Mail Messaging Test is in Table 4 below. This schedule assumes that the test will be conducted using the November ACS production sample.


Table 4. Fall 2015 Mail Messaging Test Schedule

Activities                                    Time Frame
Identify test design                          June – July 2015
Develop and print materials                   July – September 2015
Field test data collection                    October 2015 – January 2016
Statistically analyze results                 November 2015 – July 2016
Preliminary self-response results             March 2016
Final results and recommendations released    August 2016


The preliminary schedule for the 2016 Content Test is in Table 5 below.


Table 5. 2016 Content Test Schedule

Activities                                    Time Frame
Identify suggested content for testing        2013
Cognitive testing                             June 2014 – June 2015
Field test data collection                    February – June 2016
CFU data collection                           March – June 2016
Statistically analyze results                 October 2016 – February 2017
Final results and recommendations released    February – March 2017


Schedules for all other testing are to be determined and will be included in a future submission.


  17. Request Not to Display Expiration Date


We will display the expiration date on the form(s) and the Internet instrument.


  18. Exceptions to the Certification


There are no exceptions to the Certification for Paperwork Reduction Act submission.

1 Olmsted-Hawala, Erica, Elizabeth Nichols, Temika Holland, and Misa Gareau (Forthcoming). “Round 1 Results of the American Community Survey Online Form Usability Testing on Smartphones and Tablets (draft).”

2 Horowitz, Rachel, and Mary Frances Zelenak (2014). “The 2014 American Community Survey (ACS) Internet Test Research and Evaluation Analysis Plan.”

3 See 42 U.S.C. Sections 1437b and 1437f

4 HUD’s funding formulas are available at: http://www.huduser.org/portal/datasets/fmr/fmrover_071707R2.doc and http://www.huduser.org/portal/datasets/il/il10/IncomeLimitsBriefingMaterial_FY10.pdf. The results of these formulas are announced yearly in the Federal Register.

5 See 42 U.S.C. Sections 8621 through 8630

6 See 7 U.S.C. Section 2025(9)(d). The FNS calculates a Program Access Index that allows them to provide additional award funds to states that have the highest levels of SNAP access, or show the greatest annual improvement in SNAP access. For the PAI formula, see: http://www.fns.usda.gov/ora/menu/Published/snap/FILES/Other/pai2008.pdf and 7 CFR Section 275.24.

7 See 20 U.S.C. Section 6313 (a)(5) and P.L.107-110

8 See 23 U.S.C. Section 134 and 23 U.S.C. Section 135. See also 23 U.S.C. Section 303 and 23 CFR Sections 450.316-322. See also P.L. 109-59.

9 The Patient Protection and Affordable Care Act (ACA) requires, to the extent possible, that data be collected on individuals who participate in the marketplace and those who are recipients of premium subsidies under the ACA. Title XXXI, SEC. 3101. DATA COLLECTION, ANALYSIS, AND QUALITY. IN GENERAL.—“The Secretary shall ensure that, by not later than 2 years after the date of enactment of this title, any federally conducted or supported health care or public health program, activity or survey (including Current Population Surveys and American Community Surveys conducted by the Bureau of Labor Statistics and the Bureau of the Census) collects and reports, to the extent practicable—

‘‘(A) data on race, ethnicity, sex, primary language, and disability status for applicants, recipients, or participants;

‘‘(B) data at the smallest geographic level such as State, local, or institutional levels if such data can be aggregated;

‘‘(C) sufficient data to generate statistically reliable estimates by racial, ethnic, sex, primary language, and disability status subgroups for applicants, recipients or participants using, if needed, statistical oversamples of these subpopulations; and

‘‘(D) any other demographic data as deemed appropriate by the Secretary regarding health insurance.”


10 See http://www.census.gov/library/working-papers/2014/acs/2014_Walker_01.html
