SUPPORTING STATEMENT


U.S. Department of Commerce

U.S. Census Bureau

Generic Clearance for Internet Nonprobability Panel Pretesting

OMB Control Number 0607-0978



A. Justification

1. Necessity of Information Collection



At this time, the U.S. Census Bureau is seeking a three-year reinstatement of the generic clearance covering a variety of medium-scale iterative Internet pretesting activities. We will dedicate a block of hours to these activities for each of the next three years. OMB will be informed in writing of the purpose and scope of each of these activities, as well as the time frame and number of burden hours used. The number of hours used will not exceed the number set aside for this purpose.



The Census Bureau is committed to conducting research in a cost-efficient manner. Research projects at the Census Bureau proceed through several stages of testing. As a first stage, the Census Bureau pretests survey or census questions and evaluates the usability of websites with a small number of subjects during focus groups, usability tests, and cognitive testing. These projects are in-person and labor-intensive, but typically target samples of only 20 to 30 respondents. This small-scale work is done through another existing OMB generic clearance. The second stage is often a larger-scale field test of a survey with a split-panel design, or the release of a Census Bureau data dissemination product with a feedback mechanism. Field tests involve substantial preparatory work and are often limited in the number of panels tested due to cost considerations. They target very large sample sizes and are typically done under stand-alone OMB clearances.



Cost efficiencies can be achieved by testing some research questions in a medium-scale test, using a smaller number of participants than we typically use in a field test, yet a larger and more diverse set of participants than we recruit for cognitive and usability tests. Using a medium-scale test, we can answer some research questions more thoroughly than in small-scale testing, but less expensively than in a large-scale field test. This clearance established a medium-scale, cost-efficient method of testing questions and contact strategies.



This research program will be used by the Census Bureau and survey sponsors to test alternative contact methods, including emails and text messages (via an opt-in strategy), improve online questionnaires and procedures, reduce respondent burden, and ultimately increase the quality of data collected in Census Bureau censuses and surveys. We will use the clearance to pretest decennial and demographic census and survey questionnaires prior to fielding them, as well as communications and/or marketing strategies and data dissemination tools for the Census Bureau. The primary method of identifying measurement problems with a questionnaire or survey procedure is the split-panel test. This will encompass both methodological and subject-matter research questions that can be tested on a medium-scale nonprobability panel.



This research program will also be used by the Census Bureau for remote usability testing of electronic interfaces and to perform other qualitative analyses such as respondent debriefings. An advantage of remote, medium-scale testing is that participants can test products at their convenience using their own equipment, as opposed to using Census Bureau-supplied computers. A more diverse participant pool (geographically, demographically, or economically) is another advantage. Remote usability testing would use click-through rates and other paradata, accuracy and satisfaction scores, and written qualitative comments to determine optimal interface designs and to obtain feedback from respondents.



Starting this year, the Census Bureau will conduct research to inform and test communications materials and advertisements for the 2020 Integrated Communications Campaign (ICC) using this pretesting vehicle. Research will inform all aspects of the campaign, including audience segment profiling, message development, pretesting, and refinement, leading up to the campaign in 2020. Ultimately, these techniques will inform a cost-efficient communications strategy aimed at increasing self-response to the 2020 Census, which will save the government money by reducing Nonresponse Follow-Up activities. While the qualitative methods used under this generic clearance allow for depth of understanding, the quantitative methods allow for breadth of understanding across a large swath of the U.S. population. A number of different techniques may be used for message development, including in-depth interviews, focus groups, online discussion communities, surveys (including mail, online, and telephone responses), and detailed ethnographies. Once the contractors develop advertising content, pretesting techniques such as focus groups, dial sessions, and cognitive interviews will also be used. The objectives of this research are to identify, for each audience segment, the barriers and motivators to self-response and the most compelling message frames. The data from this research will also allow the Census Bureau to tailor messaging and communication plans to each group. The interviews will be conducted in person, on paper, over the phone, and online.



Methods

Quantitative surveys. These methods may include probability or nonprobability samples used to interview a number of participants with primarily closed-ended questions. For example, the Census Bureau will use quantitative surveys at several stages of the communications campaign, including initial planning, pretesting draft creative materials with target audiences, and monitoring the communications campaign during the enumeration process.

One of these studies will be a survey measuring barriers, attitudes, and motivators for completing the 2020 Census, known as CBAMS 2020. This survey will assess barriers to and motivators of self-response to the 2020 Census in the broader population. The Census Bureau will use the findings to create attitudinal profiles of various audience segments in the population, which in turn will inform the communications tailored to those segments. With tailored communication, the Census Bureau aims to increase self-response to the 2020 Census, resulting in cost savings to the government.

Another type of survey will be quantitative online visual testing of creative materials. This may involve a probability or nonprobability sample from a commercially available survey panel. Participants would view one or more draft advertisements and provide feedback on the materials. The exercise may also include viewing advertisements or materials not related to the Census Bureau, to make it more realistic.



Split sample experiments. This method involves testing alternative versions of questionnaires, invitations to questionnaires (e.g., emails or text messages), or websites, at least some of which have been designed to address problems identified in draft versions or versions from previous waves. The critical component is the use of multiple questionnaires, invitations, or websites, randomly assigned to permit statistical comparisons; data collection will be via the Internet. Comparing revised questionnaires (or invitations) against a control version (preferably), or against each other, facilitates statistical evaluation of the performance of the alternative versions.

The number of versions tested and the number of cases per version will depend on the objectives of the test. We cannot specify a minimum panel size with certainty, although we would expect that no questionnaire version would be administered to fewer than fifty respondents.

Split sample tests that incorporate methodological questionnaire design experiments will have a larger maximum sample size (up to several hundred cases per panel) than other pretest methods. This will enable the detection of statistically significant differences, and facilitate methodological experiments that can extend questionnaire design knowledge more generally for use in a variety of Census Bureau data collection instruments.
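As a hedged illustration of why several hundred cases per panel support the detection of statistically significant differences, the sketch below applies a standard two-proportion z-test to compare response rates between a control and an experimental panel. The panel sizes and response counts are invented for illustration; they are not part of the clearance and do not reflect any planned test.

```python
# Illustrative only: a two-proportion z-test comparing response rates
# between two randomly assigned panels. All figures below are invented.
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Return the z statistic and two-sided p-value for H0: p_a == p_b."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)  # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# With 300 cases per panel, a 60% vs. 52% response-rate difference is
# just detectable at the conventional 0.05 level.
z, p = two_proportion_z(180, 300, 156, 300)
print(round(z, 2), round(p, 3))  # 1.97 0.048
```

The same comparison run with only fifty cases per panel would fall well short of significance, which is why split-sample panels are sized in the hundreds rather than at the cognitive-testing scale of 20 to 30 participants.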



Usability Interviews: This method involves getting respondent input to aid in the development of automated questionnaires and websites and associated materials. The objective is to identify problems that keep respondents from completing automated questionnaires accurately and efficiently with minimal burden, or that prevent respondents from successfully navigating websites and finding the information they seek. Remote usability testing may be conducted under this clearance, whereby a user would receive an invitation to use a website or survey, then answer targeted questions about that experience.



Qualitative Interviews: This method involves one-on-one or group interviews in which the respondent is typically asked questions about survey content areas, survey questions, or the survey process. A number of different techniques may be involved, including cognitive interviews, online discussion forums, and focus groups. The objective is to identify ambiguity, misunderstanding, or other difficulties respondents may have in answering survey questions, in order to improve the information ultimately collected in large-scale surveys and censuses.



Procedures for Clearance



Since the types of surveys included under the umbrella of this clearance are so varied, it is difficult to specify at this point what kinds of activities would be involved in any particular test, but a key component will be the comparison of one invitation, questionnaire, or website to another.

We will provide OMB with a copy of questionnaires and invitations in advance of any testing activity. Depending on the stage of development, this may be the printed material from the last round of a survey or a revised draft based on analysis of other evaluation data. For a test of alternative procedures, the description and rationale for the procedures would be submitted. We will also provide a description of the sample design and the planned administration. OMB has previously agreed to endeavor to provide comments on substantive issues within 10 working days of receipt.

The Census Bureau will consult with OMB prior to submission on the appropriateness of submissions under this clearance that may raise policy or substantive issues. The Census Bureau will strive to send OMB an annual report at the end of each year summarizing the number of hours used, as well as the nature and results of the activities completed under this clearance.



Data collection for this project is authorized under the authorizing legislation for the questionnaire being tested. This may be Title 13, Sections 131, 141, 161, 181, 182, 193, and 301 for Census Bureau-sponsored surveys, or other corresponding collection authorities for surveys sponsored by other Federal agencies. We do not yet know what other titles will be referenced, since we do not know which survey questionnaires will be pretested during the course of the clearance.



Literature on and considerations about the use of nonprobability samples for this type of work have been thoroughly covered by a Task Force commissioned by the American Association for Public Opinion Research and are well documented there (Baker, et al., 2013).



2. Needs and Uses


The information collected in this program of developing and testing questionnaires will be used by staff from the Census Bureau, contractors, and sponsoring agencies to evaluate and improve the quality of the data in the surveys and censuses that are ultimately conducted. Although the CBAMS is portrayed as being nationally representative, it does not meet Census Bureau quality standards for dissemination and is not intended for use as precise national estimates or for distribution as a Census Bureau data product. The Census Bureau will use the results from this survey to monitor awareness and attitudes, as an indicator of the impact of potential negative events, and as an indicator of potential changes in awareness activities. Data from the research will be included in research reports with clear statements about the limitations, noting that the data were produced for strategic and tactical decision-making and exploratory research, not for official estimates. Research results may be prepared for presentation at professional meetings or for publication in professional journals to promote discussion among the larger survey and statistical community and to encourage further research and refinement. Again, all presentations and publications will provide clear descriptions of the methodology and its limitations.


Information quality is an integral part of the pre-dissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau's Information Quality Guidelines). Information quality is also integral to the information collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act.

3. Use of Information Technology


This generic clearance will use approximately 75 percent electronic data collection: primarily electronic data collection instruments, with some paper questionnaires. We will also test Internet data dissemination websites, inviting participants by email or by posting a link on a Census Bureau webpage.

4. Efforts to Identify Duplication


This research does not duplicate any other questionnaire design work being done by the Census Bureau or other Federal agencies. The purpose of this clearance is to stimulate additional research, which would not be done under other circumstances due to time constraints. This research will involve collaboration with staff from other agencies that are sponsoring the surveys conducted by the Census Bureau. The research may also involve joint efforts with staff from other Federal laboratory facilities. All efforts would be collaborative in nature, and no duplication in this area is anticipated.


To the maximum extent possible, we will make use of previous information, reviewing results of previous evaluations of survey data before we attempt to revise invitations or questionnaires. However, this information is not sufficient to refine our census and survey invitations or questionnaires without conducting additional research.


This generic clearance request is an explicit addition to the current generic clearance for pretesting (Number 0607-0725) that allows smaller-scale cognitive and questionnaire design and usability testing research as part of testing for its censuses and surveys.


5. Minimizing Burden

This research will be designed as small-to-medium scale data collection efforts. This will minimize the amount of burden required to improve questionnaires and procedures, test new ideas, and refine or improve upon positive or unclear results from other tests.

6. Consequences of Less Frequent Collection


This clearance involves one-time questionnaire development activities for each survey that is connected with the clearance. If this project were not carried out, the quality of the data collected in the surveys would suffer. In addition, testing websites with a larger sample-base will allow us to have more confidence in our findings and the resulting revisions to the websites.


7. Special Circumstances


All of the applicable OMB guidelines are met. There are no special circumstances.


8. Consultations Outside the Agency


Reg Baker, a consultant at Market Strategies International and an expert in nonprobability Internet panel testing, has been a consultant on this project.


Consultation with staff from other Federal agencies that sponsor surveys conducted by the Census Bureau will occur in conjunction with the testing program for the individual survey. Consultation with staff from other Federal laboratory facilities may also occur as part of joint research efforts. These consultations will include discussions concerning potential response problems, clarity of questions and instructions, and other aspects of respondent burden. Additional efforts to consult with potential respondents to obtain their views on the availability of data, clarity of instructions, etc., may be undertaken as part of the testing that is conducted under this clearance.


A notice was published in the Federal Register on January 13, 2017 (82 FR 4284-4286), inviting public comment on our plans to submit this request. We did not receive any responses to this notice.


9. Paying Respondents


Respondents will not be paid or provided any cash incentive for their participation in remote testing activities. For activities conducted in the cognitive laboratory or in focus groups, respondents may be offered an incentive to offset the cost of participation. Details of this incentive will be outlined with each individual request submitted by Census to OMB.


10. Assurance of Confidentiality


All respondents who participate in research under this clearance will be informed that the information they provide will not be made available in any way that would personally identify them and that their participation is voluntary. This disclosure will be made prior to any data collection.


Depending on the collection, the confidentiality of information may be assured by Title 13, United States Code, or another applicable title. Per the Federal Cybersecurity Enhancement Act of 2015, data are protected from cybersecurity risks through screening of the systems that transmit data.


11. Justification for Sensitive Questions


There will be no sensitive questions asked.


12. Estimate of Hour Burden


We will use a variety of forms to conduct the research under this clearance, but the exact number of different forms, length of each form, and number of subjects/respondents per form are unknown at this time.


The table below lists each research phase (with dates), the research techniques used, a description of each, and the maximum burden hours.

Fiscal Year 2018

CBAMS (Oct-Dec ’17)

- CBAMS quantitative survey: Nationwide mailout survey with paper or online response; the survey will provide data to enrich segmentation for communication planning. (12,500 hours)
- In-person focus groups: Groups with hard-to-count populations that may not be covered sufficiently by the quantitative survey. (2,500 hours)
- Online discussion communities: Weeklong online discussion communities with a mix of participants recruited from broad geographic areas. (600 hours)
- In-depth telephone interviews: Conversations with community leaders from small populations that may not receive coverage with other techniques. (300 hours)
- Ethnographic sessions: Interviews with individuals with low self-response propensity, conducted by Team Y&R researchers and multicultural experts. (450 hours)

Rapid Iterative Pretesting (RIPs) (May-Jun ’18)

- Quantitative online visual testing (AdLab): Online survey exposing participants to one of several creative platform statements; a monadic design to identify the best platform for the 2020 campaign. (3,500 hours)
- Online discussion communities: Weeklong online discussion communities with a mix of participants recruited from broad geographic areas, considering each of the platforms in turn. (600 hours)
- In-person focus groups: Group discussions to identify the best platforms with key populations not covered by the online responses. (1,000 hours)

Recruiting Campaign Pretesting (Jun-Jul ’18)

- In-person focus groups with potential enumerator candidates: Group discussions to identify the best platforms with key populations not covered by the online responses. (500 hours)

Other yet-unidentified pretesting (dates unknown)

- We are estimating 10 minutes per response for 20,000 respondents per year. (3,333 hours)

Fiscal Year 2018 Total: 25,283 burden-hours

Fiscal Year 2019

Creative Pretesting with draft creative (Sept-Dec ’18)

- Quantitative online visual testing (AdLab): Online survey that exposes participants to one of several executions of an advertisement, allowing for comparisons between ads from hundreds of monadic ad exposures; similar to the process used for the 2014 ACS mail package online visual testing. (5,500 hours)
- Online discussion communities: Weeklong online sessions to discuss several different executions in detail, with participants recruited from across the country. (550 hours)
- In-person focus groups: Group discussions with key audiences to explore creative mock-ups and early-stage production (storyboards, animatic videos), discussing what is working with the creative and ways to improve it. (4,200 hours)

Final Audience Validation (May-Jun ’19)

- Quantitative online visual testing (AdLab): Using an online survey, subjects will review post-production rough cuts as a last effort to ensure there are no previously unidentified issues that would make a piece unsuitable for promotion of the census. (4,500 hours)
- In-person focus groups: For populations (such as in-language audiences) that cannot be reached through online survey panels, we will conduct focus groups on post-production materials as a last effort to ensure there are no previously unidentified issues. (2,000 hours)

Other yet-unidentified pretesting (dates unknown)

- We are estimating 10 minutes per response for 20,000 respondents per year. (3,333 hours)

Fiscal Year 2019 Total: 20,083 burden-hours

Fiscal Year 2020

Thank You Validation (Apr-May ’20)

- In-person focus groups: Pretesting the “thank you” campaign with in-person focus groups to ensure adequate messages and execution for delivery. (2,000 hours)

Other yet-unidentified pretesting (dates unknown)

- We are estimating 10 minutes per response for 20,000 respondents per year. (3,333 hours)

Fiscal Year 2020 Total: 5,333 burden-hours
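The fiscal-year totals can be reproduced arithmetically from the individual rows. As an illustrative check (plain arithmetic only; every figure comes from the burden table, and the recurring "other pretesting" line is 10 minutes per response for 20,000 respondents per year):

```python
# Arithmetic check of the burden-hour totals in the table above.
# Recurring estimate: 20,000 responses/year at 10 minutes each.
other_pretesting = round(20_000 * 10 / 60)  # 3,333 hours

fy2018 = 12_500 + 2_500 + 600 + 300 + 450 + 3_500 + 600 + 1_000 + 500 + other_pretesting
fy2019 = 5_500 + 550 + 4_200 + 4_500 + 2_000 + other_pretesting
fy2020 = 2_000 + other_pretesting

print(fy2018, fy2019, fy2020)  # 25283 20083 5333
# Three-year average matches the 16,900 annual hours requested in Section 15.
print(round((fy2018 + fy2019 + fy2020) / 3))  # 16900
```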



13. Estimate of Cost Burden


There is no cost to respondents for participating in the research we are conducting under this clearance, except for their time to complete the questionnaire and (if applicable) travel to the location to participate in the research.


14. Cost to Federal Government


There is no way to anticipate the actual number of participants, the length of the interviews, or the complexity of the data collection instruments for the surveys to be conducted under this clearance. Thus, it is impossible to estimate the cost to the Federal Government in advance. The Census Bureau will cover the costs.

15. Reason for Change in Burden


We are requesting an increase in hours from 8,334 to 16,900 annually because we incorporated the pretesting needs for the 2020 Census communications campaign into this request.


16. Project Schedule


This research program is for questionnaire and procedure development purposes. We will use data tabulations to evaluate the results of questionnaire testing. The information collected in this effort will not be the subject of any printed Census Bureau reports; however, it might be included as a methodological appendix or footnote in a report containing data from a larger data collection effort. The results of this research may, however, be prepared for presentation at professional meetings or publication in professional journals.

Due to the nature of this clearance, there is no definite or tentative time schedule at this point. We expect work to continue more or less continuously throughout the duration of the clearance.


17. Request to Not Display Expiration Date


No exemption is requested.


18. Exceptions to the Certification


There are no exceptions to the certification.






