FSS Trust Tracking Survey Pretesting Plan


Generic Clearance for Questionnaire Pretesting Research


OMB: 0607-0725


The Census Bureau plans to add questions to a daily tracking survey under the generic clearance for pretesting (OMB 0607-0725). The objective of this research is to conduct a field pretest with respondent debriefing on draft questions for the Federal Statistical System Public Opinion Survey (FSS POS). This survey will be fielded as a joint effort among Federal Statistical Agencies and will attempt to monitor the public’s trust in official statistics.


This questionnaire has been revised based on the results of cognitive testing, which was conducted under this clearance and approved by OMB on October 12, 2011. When fielded, these public opinion data will enable the Census Bureau to better understand public perceptions, providing guidance for communicating with the public and for future planning of data collection that reflects a good understanding of public perceptions and concerns. Because all federal statistical agencies face declining response rates and increasing costs in a time of constrained budgets, the Census Bureau will share the results of these surveys with other federal statistical agencies to maximize the utility of this information collection and, ultimately, the quality and efficiency of federal statistics. Specifically, the National Agricultural Statistics Service, the National Center for Health Statistics, the Economic Research Service, the Internal Revenue Service Survey of Income, and the Office of Management and Budget plan to use results from this data collection to inform public communication and future planning of data collection. A separate OMB package using regular submission procedures will be submitted for the tracking survey itself.


The pretest will be fielded from January 3, 2012 through February 7, 2012. It will consist of three weeks of pretesting; each week corresponds to a different phase and is separated from the next by one week for analysis. During Phase One, 30 items will be piloted and evaluated using a variety of methods:

1) Item response distributions will be assessed to determine the value of the various response categories; if some categories are not being used, they will be adjusted or eliminated.

2) The factor structure of the scaled items will be explored using Exploratory Factor Analysis (EFA), which will allow us to determine whether items load under the factor they were designed to measure, whether items load under multiple factors (cross-load), and whether the hypothesized factors are identified (meaning at least three items load under them).

3) Random probing (respondent debriefing) will be used to increase our understanding of how each question is being interpreted (described in more detail below).

These questions will be adjusted as necessary for Phase Two.
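The first evaluation step above (checking whether every response category is actually used) can be sketched as a simple frequency tally. This is an illustrative sketch only; the scale values and responses below are hypothetical, not taken from the FSS POS instrument.

```python
from collections import Counter


def category_usage(responses, categories):
    """Count how often each response category was chosen for one item.

    Categories with zero (or very few) selections are candidates for
    collapsing or elimination in the Phase Two revision.
    """
    counts = Counter(responses)
    return {c: counts.get(c, 0) for c in categories}


# Hypothetical 5-point scale with one unused category.
scale = [1, 2, 3, 4, 5]
usage = category_usage([2, 3, 3, 4, 4, 4, 5], scale)
# usage == {1: 0, 2: 1, 3: 2, 4: 3, 5: 1}  -> category 1 was never used
```

In practice the tally would be run per item across all Phase One interviews before deciding which categories to adjust.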


During Phase Two, 30 items will be piloted and evaluated in terms of item and model fit. Using Confirmatory Factor Analysis (CFA) we will assess measurement error, model invariance, and review recommendations for improving model fit by either eliminating items or respecifying the factor structure. Random probing will be used as necessary for items that have been modified.


In addition to the 30 survey questions, each interview in the first and second weeks will include 3 “random probes”: open-ended questions asking respondents why they answered the way they did.1 The same open-ended question will be asked three times of each respondent, at different survey items. By dividing respondents into 10 groups, each of the 30 survey questions will be probed in 1/10 of the interviews.
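The probe rotation described above (10 groups, 3 probes per interview, full coverage of all 30 items) can be sketched as follows. The group-to-item mapping is an assumption for illustration; the plan does not specify how items are allocated to groups.

```python
NUM_QUESTIONS = 30          # survey items in Phases One and Two
PROBES_PER_INTERVIEW = 3
NUM_GROUPS = NUM_QUESTIONS // PROBES_PER_INTERVIEW  # 10 groups


def probe_items(respondent_id: int) -> list[int]:
    """Return the three item indices (0-29) probed for this respondent.

    Each respondent falls into one of 10 groups; here group g is
    assumed to probe items 3g, 3g+1, and 3g+2, so every item is
    probed in exactly 1/10 of the interviews.
    """
    group = respondent_id % NUM_GROUPS
    return [group * PROBES_PER_INTERVIEW + k for k in range(PROBES_PER_INTERVIEW)]


# Any 10 consecutive respondents jointly cover all 30 items exactly once.
coverage = sorted(item for r in range(NUM_GROUPS) for item in probe_items(r))
assert coverage == list(range(NUM_QUESTIONS))
```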


During Phases One and Two, we will also review interview tapes and debrief interviewers to gain deeper insight into how the questions are being interpreted.


During Phase Three, the 25 questions that are likely to be fielded for the time series will be implemented. Random probing will be used as necessary for items that have been modified.


The specifications of the data collection in the field pretest are:

  • Questions will be added to an established public opinion tracking survey (the Gallup Daily Tracking Survey);

  • The survey will begin on or about January 3, 2012;

  • The survey will run daily every other week until about February 7, 2012 (seven days of data collection, followed by seven days off for analysis);

  • The survey will reflect a national random probability sample of the U.S. population;

  • The survey will be conducted using random-digit-dial telephone methodology;

  • The survey will use a random selection method to select respondents in households containing more than one person;

  • The interview protocol will include a multi-call design to reach respondents not contacted on the initial attempt;

  • The sample frame will include both cell phone and landline phone numbers;

  • The survey will conduct interviews in both English and Spanish;

  • The sample size will be approximately 100 completed interviews per day;

  • The sample will include cases in Alaska and Hawaii;

  • The survey will include language to notify respondents that their information will not be made available in any way that would personally identify them; and

  • No compensation will be provided to respondents.


The Census Bureau’s questions will follow Gallup’s questions on politics, wellbeing, and health behaviors, on which Gallup has a long time series that it does not wish to disrupt. We estimate that our portion of the interview will last at most 15 minutes. Gallup’s usual demographic questions will be asked at the end of the interview. On average, the Gallup survey will not last more than 18 minutes (though this average takes into account the 100/1000 Census Bureau sample, whose interviews will likely run longer than 18 minutes; we estimate no more than 29 minutes). The survey will run for a total of 21 days at 100 interviews per day, so we estimate a total of 2,100 respondents over the course of the pretest. Thus, the total estimated burden for this research is 525 hours.
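The burden estimate above follows directly from the figures in the plan:

```python
# Burden-hour arithmetic using the figures stated in the plan.
interviews_per_day = 100
days = 21                    # three 7-day data-collection weeks
minutes_per_interview = 15   # upper bound for the Census Bureau portion

respondents = interviews_per_day * days                      # 2,100
burden_hours = respondents * minutes_per_interview / 60      # 525.0
```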


The initial thirty tracking questions are in Attachment A.


The contact person for questions regarding this tracking survey is:


Jennifer Hunter Childs

Center for Survey Measurement

Research and Methodology Directorate

U.S. Census Bureau

Room 5K020A

301-763-4927 (o)

571-263-5813 (c)

[email protected]




1 Two versions of the random probe will be used: ALTERNATIVE C1: What do you think this question was asking? ALTERNATIVE C2: Can you tell us why you chose that response?


File Type: application/msword
File Title: The Census Bureau plans to add questions to a daily tracking survey under the generic clearance for 2010 Census Communications
Author: bates005
Last Modified By: Jenny Childs
File Modified: 2011-12-16
File Created: 2011-12-16
