March 4, 2011
NOTE TO THE REVIEWER OF: OMB CLEARANCE #1220-0141 “Cognitive and Psychological Research”
FROM: Scott Fricker, Research Psychologist, Office of Survey Methods Research
SUBJECT: Submission of Materials for the Classifying Data Study
Please accept the enclosed materials for approval under the OMB clearance package #1220-0141 “Cognitive and Psychological Research.” In accordance with our agreement with OMB, I am submitting a brief description of the study.
The total estimated respondent burden for this study is 90 hours.
If there are any questions regarding this project, please direct them to Scott Fricker (202-691-7390).
Introduction and Purpose
Much of the data the BLS collects and processes are captured through the use of questionnaires. Often, these questionnaires consist of a series of “closed-ended” items with fixed, categorical response options from which respondents choose an answer. Closed-ended questions offer a number of advantages over other types of survey questions. For example, the response categories can serve as memory cues for information that respondents might not consider in answering open-ended questions, and the resulting data are often easier to process and analyze than open-ended responses.
One of the challenges of developing effective and reliable closed-ended questions is to ensure that the response choices provided to respondents are both exhaustive and mutually exclusive. If respondents cannot find a response category that corresponds to their answer to the question, they may skip the question entirely (nonresponse error) or may select a response option that does not accurately reflect their “true” answer (measurement error).
Another challenge is to provide response options that are relevant and familiar from the respondents’ perspective. This can be particularly difficult in official government surveys, in which the words used to label the response options, and even the underlying concepts of interest themselves, may have specialized meaning to the survey sponsor (e.g., Bournazian et al., 2001). Respondents bring to the survey a naïve or everyday sense of those same words and concepts, and this can cause them to misinterpret the question and select an inappropriate response option (e.g., Gerber, Wellens, and Keeley, 1996; Tourangeau, Conrad, Arens, Fricker, Lee, and Smith, 2006). Even when fixed-response categories are exhaustive, mutually exclusive, and well understood, respondents can still have difficulty deciding whether their situation fits a given response option. Empirical work in the field of cognition has shown that people often have difficulty categorizing instances that fit some features of a concept while not fitting others (e.g., Rosch and Mervis, 1975; see also Schober and Conrad, 1997, for a related discussion of the effects of ‘complicated mappings’ on survey responses).
The focus of our examination will be on a relatively neglected area of study: the sources of error in categorization decisions specifically associated with the response options in closed-ended survey questions. In particular, the proposed study will examine the Class of Worker (COW) question from the Current Population Survey (CPS):
Now I have a few questions about the job at which you worked LAST WEEK.
Were you employed by government, by a private company, a non-profit organization, or were you self employed (or working in the family business)?
In the CPS interview, the COW question is asked of any respondent who was working for pay during the reference week. Responses to this question allow the CPS instrument to tailor industry and occupation questions for each respondent. This information also helps Census coders determine the correct industry and occupation codes to assign to the interview.
Studies have shown that the employment estimates derived from the CPS COW question differ significantly from estimates taken from the other BLS employment survey, the Current Employment Statistics survey (CES) (e.g., Bowler and Morisi, 2006). Abraham et al. (2007) suggest that one reason for this difference may be that CPS respondents, particularly those who report that they or another household member are self-employed, are misclassifying the types of jobs that they hold. Another reason to examine the classification decisions in the COW question more closely is that the response options are not clearly mutually exclusive of one another (e.g., a person working for an incorporated sole proprietorship is both self-employed and working for a private company).
Discussions with internal and external CPS/COW stakeholders suggest that one of the main drivers of potential respondent confusion with the class of worker question is the ambiguity of what it means to be self-employed. Definitions of self-employment vary considerably across organizations that measure and track this employment category, and respondents’ everyday sense of this concept may not align well with some of those definitions. Moreover, such misalignments can be exacerbated when respondents are asked to categorize the employment situation of other household members (given what we know about self- vs. proxy reporting), and when no clear definition is available (as in the case of the CPS). We therefore focus our efforts in this study on the dimension of self-employment, and on two factors that we believe may affect categorization errors in the COW: question format (i.e., the order in which respondents are asked about the four employment categories), and the information available to respondents about the underlying nature of the work situation.
Research Design
The current CPS Class of Worker (COW) item will be examined in a laboratory study. Each test session will consist of two parts. In the first part, participants will be given a series of written vignettes that briefly describe the work situations of different fictional characters (Attachment I). Test participants will be asked to indicate which COW category they believe best fits the characters’ work situations based on the information provided in each vignette (Attachment II). At the end of the test session, participants will be administered a short set of debriefing questions to explore their reasoning for their vignette-based responses and their understanding of COW-related concepts (Attachment III).
One of our key experimental manipulations will be the way in which the classification request is presented. Study participants will go through a set of 12 vignettes twice during the test session. In the first round, half of the participants will be shown the vignettes and asked to classify each character using the current COW question (Standard condition); the other half will simply be asked to make an initial determination of whether or not the character is self-employed (Split condition). Participants in both conditions will then go back through each of the 12 vignettes and decide “the nature of the employment”: private, for profit; government; or non-profit. This design will enable us to capture the full set of possible class of worker responses for the Split participants, assess reliability between rounds of administration for the Standard participants, and examine differences in response distributions under the two conditions.
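To illustrate the design logic, here is a minimal sketch in Python (all identifiers are hypothetical, and the actual study is self-administered on paper rather than computerized) showing how round 1 differs between the two conditions while round 2 is shared:

import random

# Hypothetical identifiers throughout; the real study collects written responses.
VIGNETTES = [f"vignette_{i:02d}" for i in range(1, 13)]        # the 12 work situations
COW_OPTIONS = ["government", "private company",
               "non-profit organization", "self-employed"]     # current COW categories
SECTOR_OPTIONS = ["private, for profit", "government", "non-profit"]
SPLIT_OPTIONS = ["self-employed", "not self-employed"]

def ask(vignette, options):
    """Stand-in for a participant's written answer to one vignette."""
    return random.choice(options)

def run_session(participant_id, condition):
    """Round 1 differs by condition; round 2 is identical for all participants."""
    if condition == "standard":
        round1 = {v: ask(v, COW_OPTIONS) for v in VIGNETTES}    # full COW question
    else:
        round1 = {v: ask(v, SPLIT_OPTIONS) for v in VIGNETTES}  # self-employed yes/no only
    round2 = {v: ask(v, SECTOR_OPTIONS) for v in VIGNETTES}     # "the nature of the employment"
    return {"id": participant_id, "condition": condition,
            "round1": round1, "round2": round2}

# Half of the 90 participants in each condition:
sessions = [run_session(i, "standard" if i % 2 == 0 else "split") for i in range(90)]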
Vignettes
The purpose of asking participants to answer on the basis of fictional scenarios is to allow for systematic variation of information that either strengthens or weakens the “self-employed” nature of the job being described. This additional information will vary on two dimensions: the degree to which the hypothetical job-holder (1) is free from supervisory oversight (supervision) and (2) receives and allocates income from the job free from oversight by another person (financial role). We selected these two indicators of “being self-employed” from approximately 20 potential indicators suggested in the research literature, based on ratings by a panel of experts in employment (particularly self-employment) data.
We created vignettes that described work being done in one of three employment sectors (i.e., government, private for-profit, and non-profit), and varied the amount of information suggesting self-employment (i.e., by making no direct reference to the self-employment dimensions, referencing positive evidence on one of the two dimensions, or referencing negative evidence on one of the two dimensions). The following example illustrates these manipulations. In the first vignette below, the intent was to suggest that the job holder works for a government agency, absent any mention of self-employment dimensions. (We refer to this as the “baseline” version of the vignette.)
“Marvin works full time collecting water samples and measuring air quality for his state’s environmental agency. He gathers environmental data at sampling locations in the northwest part of the state where he lives. He works mostly outdoors in the field in his car and on foot. He does occasionally attend meetings at agency headquarters in the state capital.”
This vignette is deliberately crafted to leave open the possibility that Marvin is, in his own eyes, self-employed, even though it clearly indicates the governmental nature of his work. That is, an actual holder of the job described above could well select “self-employed” when asked the COW question in the CPS. As it stands, then, it is likely that some fraction of our study participants will classify Marvin as “self-employed” in the Standard condition.[1] If this occurs, these responses would be “errors”: as we understand it, the model of employment on which the CPS’s work classification scheme rests treats a “government-related self-employed” response as impossible. The proportion of this kind of “misclassification” thus provides an estimate of one type of potential response error in the CPS data.
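To make the estimate concrete, the rate of such impossible responses in the Standard condition can be summarized as a simple proportion with a confidence interval. The sketch below is illustrative only; the counts and the wilson_interval helper are hypothetical, not taken from the study.

import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% by default)."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Hypothetical counts: suppose 7 of 45 Standard-condition participants classify the
# government vignette's character as "self-employed", a combination the CPS scheme
# treats as impossible; each such response counts as a misclassification.
low, high = wilson_interval(7, 45)
print(f"misclassification rate: {7/45:.2f} (95% CI {low:.2f} to {high:.2f})")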
We reproduced this vignette with added information intended to increase the likelihood that our study participants would judge Marvin to be self-employed.
“Marvin works full time collecting water samples and measuring air quality for his state environmental agency. He gathers environmental data at certain locations in the northwest part of the state where he lives. He’s mostly outdoors in the field in his car and on foot. He occasionally attends meetings at agency headquarters in the state capital, but he does his work free from any day-to-day supervision or direction.”
This variation adds information about the critical indicator of supervision. Another variation, involving the financial independence indicator and designed to lessen the appearance that Marvin is self-employed, would replace the added clause above (“...free from any day-to-day supervision or direction”) with the following information: “Marvin has to record his hours worked each day on a time-sheet that he submits bi-weekly in order to collect his pay.” (See Attachment I for the full set of baseline vignettes.)
Each test participant will receive a total of 12 vignettes: 6 describing work done in a private company, 3 describing work in the government sector, and 3 describing work in the non-profit sector. Of these 12, four will contain only the baseline information (i.e., no mention of self-employment dimensions), four will contain positive evidence of self-employment (two suggesting greater supervisory autonomy and two suggesting greater financial autonomy), and four will contain negative evidence of self-employment (again, varying along the two self-employment dimensions). Each participant will be randomly assigned one version of each vignette, and the order of presentation will be randomized within each test session as well. One possible implementation of this assignment is sketched below.
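The memo does not prescribe a particular randomization algorithm, so the following Python sketch is an illustrative assumption: each packet draws one version per vignette subject to the 4/4/4 quota above, then shuffles presentation order.

import random

# The 12 vignettes by sector (identifiers are hypothetical): 6 private, 3 government, 3 non-profit.
VIGNETTES = ([("private", i) for i in range(6)]
             + [("government", i) for i in range(3)]
             + [("non-profit", i) for i in range(3)])

# Per-packet quota: 4 baseline versions, 2 of each positive cue, 2 of each negative cue.
VERSION_QUOTA = (["baseline"] * 4
                 + ["positive-supervision"] * 2 + ["positive-financial"] * 2
                 + ["negative-supervision"] * 2 + ["negative-financial"] * 2)

def build_packet(rng):
    """Pair one randomly chosen version with each vignette, then shuffle the order."""
    versions = VERSION_QUOTA[:]
    rng.shuffle(versions)                  # random version-to-vignette assignment
    packet = [(sector, idx, version)
              for (sector, idx), version in zip(VIGNETTES, versions)]
    rng.shuffle(packet)                    # randomize presentation order within the session
    return packet

for sector, idx, version in build_packet(random.Random(2011)):
    print(f"{sector}-{idx}: {version}")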
Procedures
The test sessions will be conducted in the Office of Survey Methods Research (OSMR) laboratory. Participants will be run in small groups of no more than six individuals per session. At the start of each session, a researcher will explain the study’s purpose and procedures and obtain informed consent from the participants (Attachment IV). The study is designed to be primarily self-administered; participants will be given a packet of materials containing written instructions, the 12 vignettes and COW-related questions, and both open- and closed-ended debriefing items to investigate the effects of question wording/format and context on interpretations of the COW terms. In addition, study participants will be asked to classify themselves into one of the COW categories and to briefly describe any conceptual or reporting issues associated with that classification. At the end of the test session, participants will be given time to ask the researcher any questions they have about the study.
Participants
Ninety participants will be recruited from the OSMR participant database. Potential study participants will be screened to ensure that the sample includes people who are currently active in the labor force (although they may be temporarily unemployed at the time we collect their data). In addition, we will collect information about recruits’ histories of filing as self-employed with the IRS; we may oversample this subgroup and split them roughly equally between the main study conditions. Efforts will be made to select participants with a range of education levels, incomes, and occupations, based on self-reported information provided during the initial recruitment process.
Burden Hours
We anticipate that each test session will last 1.0 hour on average (5 minutes for front matter, 40 minutes for administration of the vignettes and COW items, and 15 minutes for debriefing). With 90 participants, we therefore estimate the total burden at 90 hours.
Payment
For this study, participants will be paid the standard $40.
Data Confidentiality
Participants will be informed of the voluntary nature of the study. They will also be informed that the study will be used for internal purposes to improve the design of a national labor force survey. Participants will be given a consent form to read and sign (Attachment IV) prior to beginning the test session. Information related to this study will not be released to the public in any way that would allow identification of individuals, except as prescribed under the conditions of the Privacy Act Notice.
Attachments
Attachment I: Baseline Study Vignettes
Attachment II: Respondent Questionnaire
Attachment III: Debriefing Items
Attachment IV: Consent agreement form and Privacy Act statement
References
Abraham, K., Bakhtiari, S., Haltiwanger, J., Sandusky, K., and Spletzer, J. (2007). “Comparing household administrative measures of self-employment.” Paper presented at the Princeton Data Improvement Initiative conference, February.
Bournazian, J., French, D., Golby, M.E., Lindstrom, P., Miller, R., and Schloss, L. (2001). “The reality of metadata: Are common data definitions possible?” Paper presented at the Federal Committee on Statistical Methodology Conference, Washington, DC.
Bowler, M. and Morisi, T. (2006). “Understanding the employment measures from the CPS and CES survey.” Monthly Labor Review, February, 23-38.
Conrad, F. and Schober, M. (1999). “A conversational approach to text-based computer-administered questionnaires.” Proceedings of the 3rd International ASC Conference. Chesham, UK: Association for Survey Computing, 91-102.
Gerber, E., Wellens, T., and Keeley, C. (1996). “Who lives here? The use of vignettes in household roster research.” Proceedings of the American Statistical Association, Section on Survey Research Methods, 962-967. Alexandria, VA: American Statistical Association.
Groves, R.M. and Lyberg, L. (1988). “An overview of nonresponse issues in telephone surveys.” In R.M. Groves et al. (eds.), Telephone Survey Methodology. New York: John Wiley & Sons.
Rosch, E. and Mervis, C. (1975). “Family Resemblances: Studies in the internal structure of categories.” Cognitive Psychology, 7, 573-605.
Schober, M. and Conrad, F. (1997). “Does conversational interviewing reduce survey measurement error?” Public Opinion Quarterly, 61, 576-602.
Tucker, C. and Lepkowski, J. (2008). “Telephone survey methods: Adapting to change.” Chapter 1 in J.M. Lepkowski, C. Tucker, J.M. Brick, E.D. de Leeuw, L. Japec, P. Lavrakas, M.W. Link, and R.L. Sangster (eds.), Advances in Telephone Survey Methodology. Hoboken, NJ: Wiley.
Tourangeau, R., Conrad, F., Arens, Z., Fricker, S., Lee, S., and Smith, E. (2006). “Everyday concepts and classification errors: Judgments of disability and residence.” Journal of Official Statistics, 22(3), 385-418.
Attachment IV: Consent agreement form and Privacy Act statement
The Bureau of Labor Statistics (BLS) is conducting research to improve the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.
The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.
During this research you may be audiotaped, videotaped, or observed. If you do not wish to be taped, you may still participate in this research.
We estimate it will take you an average of one hour to participate in this research.
Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.
Persons are not required to respond to the collection of information unless it displays a currently valid OMB control number. The OMB control number for this collection is 1220-0141, which expires February 29, 2012.
------------------------------------------------------------------------------------------------------------
I have read and understand the statements above. I consent to participate in this study.
___________________________________ ___________________________
Participant's signature Date
___________________________________
Participant's printed name
___________________________________
Researcher's signature
OMB Control Number: 1220-0141
Expiration Date: 2/29/12
In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.
[1] Our study participants are, of course, only role-playing as actual CPS interviewees. Ideally, this study would be performed on a carefully selected, complex sample of actual workers whose jobs resemble those in our fictional vignettes, but that would require an unjustifiable commitment of time and money. Instead, we chose to do what was feasible, and we regard our study respondents as “proxy respondents” in the CPS framework.