
August 23, 2016




NOTE TO THE REVIEWER OF: OMB CLEARANCE 1220-0141
"Cognitive and Psychological Research"


FROM:

Scott Fricker

Senior Research Psychologist

Office of Survey Methods Research


SUBJECT:

Submission of Materials for Evaluating ORS Cognitive Items – Methods Comparison




Please accept the enclosed materials for approval under the OMB clearance package 1220-0141 “Cognitive and Psychological Research.” In accordance with our agreement with OMB, I am submitting a brief description of the study.


The total estimated respondent burden hours for this study are 133 hours.


If there are any questions regarding this project, please direct them to Scott Fricker (202-691-7390).

  1. Introduction and Purpose

In the past two decades, survey research has established standards for the systematic evaluation of survey instruments. The aim of these evaluations is to ensure that survey questions ask about the right thing (have construct validity) and can be answered accurately by respondents (meet the cognitive requirements of the response process). Although there are a variety of pretesting methods that researchers can use to achieve this aim, the most widely used evaluative technique is the cognitive interview. In this method, respondents are interviewed in a lab setting (or sometimes on site, e.g., at their place of work) and asked to report on the internal cognitive processes they use to answer survey questions; the interviewer in turn asks additional probes about the meaning of specific terms or the perceived intent of a question. The objective is to reveal the thought processes involved in the survey response and to diagnose problems with survey questions.

Cognitive interviews offer a rich source of information that can help researchers evaluate and improve surveys, but they present some challenges as well. For example, they generally are conducted in person, which can be expensive and in practice often constrains the number and geographic diversity of study participants. Recent and emerging technology, however, has permitted survey researchers to begin exploring online alternatives to in-person cognitive interviews, and the penetration of the Internet and Internet-enabled devices in the U.S. has provided access to a broader population and made it easier and cheaper to recruit people for web-based evaluative studies (e.g., Murphy, Keating, and Edgar, 2013).

These online evaluative approaches adapt some of the foundational elements of traditional cognitive interviewing (e.g., asking respondents to report on their cognitive processes, probing for meaning or clarification), but vary in how they are implemented. Embedded-probing techniques simply intersperse targeted, cognitive-interview-like questions throughout (or at the end of) a regular online survey. Virtual cognitive interviews use web-conferencing tools or virtual worlds like Second Life to conduct remote cognitive interviews. Crowdsourcing platforms like Amazon’s Mechanical Turk (MTurk) increasingly are being used to recruit participants for online survey research, with workers paid to perform micro-tasks (e.g., spending a few minutes evaluating a question or two). And specialized web services like TryMyUI provide panels of participants who are experienced in giving think-aloud reports (verbalized reactions and opinions) about online stimuli (typically a website); those reports, along with video recordings of participants’ computer screens, are provided to researchers, who can then apply qualitative research methods (e.g., behavior coding) to assess performance.

As the popularity and variety of online evaluation techniques grow, researchers need to determine whether they yield results that are consistent and/or complementary with those from more established methods like cognitive interviewing. How do these methods compare in terms of the number and type of problems they identify, the composition of their samples, their cost and the duration of the evaluation cycle, and respondent engagement? Although a handful of studies have begun to examine these issues (e.g., Murphy et al., 2013; Edgar, 2012; Behrend et al., 2011) and suggest that web-based evaluation methods hold promise, additional and more systematic research is needed.

In the proposed study, we will evaluate a set of questions from the new Occupational Requirements Survey (ORS) (see Appendix D) using three pretesting tools: a traditional, lab-based cognitive interview; a web-based panel from TryMyUI, where participants respond verbally to scripted cognitive-interview prompts as they complete the survey; and an embedded-probe approach using crowdsourced recruitment (MTurk), where respondents provide written responses to cognitive probes. Respondents in each condition will be administered the ORS survey items followed by the same set of cognitive probes (see Appendix E). Responses to these probes subsequently will be coded to indicate the adequacy of the answer and the type of problems (e.g., comprehension, retrieval, response selection) that arise (see Appendix F).

The objective of this study is to explore and begin to assess the impact of these three evaluation methods on interpretations of item performance, the ability to generate solutions/revised questions, study cost and efficiency, and respondent composition and engagement. How do respondents interpret the questions they are asked? How do the survey and probing formats affect the ORS data elements that are captured from reports? What memory and estimation processes do respondents use in those reports, and what errors do they make? Do the methods differ in their ability to identify changes to survey protocols that would improve the respondent experience and data quality?

  2. Research Design and Procedures

The proposed study is not intended to provide definitive evidence for informing BLS policy on online cognitive testing, but rather to serve as a preliminary evaluation of three potential approaches for pretesting survey questions. The focus of data collection and analysis in this study will be the ORS cognitive demand data elements. In addition, participants will be asked to provide information about general occupational characteristics (e.g., job title, work schedule) and to generate a list of their primary work tasks. No Personally Identifiable Information (PII) will be collected as part of this study. All participants will be currently employed; targeted screening/recruiting efforts will be made to ensure the inclusion of nursing assistants and cashiers, two occupations of particular interest to the Social Security Administration (SSA).

In-lab cognitive interview participants will be recruited through the Office of Survey Methods Research (OSMR) participant database and will complete the study in the OSMR lab at BLS. The session facilitator will begin by introducing the purpose of the study, obtaining informed consent, and answering respondent questions (Appendix B). Study participants will complete a self-administered, paper version of the ORS cognitive demand questionnaire; as they complete the survey they will be asked to verbalize their thoughts (i.e., provide a “think-aloud”). Upon completion of the survey, participants will be asked the set of cognitive interview probes by the researcher. Traditional cognitive interviewing procedures will be used – scripted questions with follow-up probes as necessary. Sessions will be audio recorded with participant consent. The interviews are expected to last approximately 30 minutes. We intend to run 20 participants in this condition, paying the standard rate of $40 for in-person research participation.

The embedded-probe condition will be conducted using an online survey tool, Qualtrics. Study participants will be recruited on MTurk and sent a link to the web survey, which contains the ORS questions, the scripted cognitive probes, and a set of demographic questions. The entire survey is intended to take respondents no more than 20 minutes. To assess the impact of incentive amounts on sample composition and substantive findings, three between-subjects payment levels will be offered ($1.00, $2.00, and $4.00), with 100 participants in each incentive condition.
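For illustration only, the following is a minimal sketch (in Python) of the balanced between-subjects assignment described above; in practice, assignment would be handled by the survey software's randomizer, and the names used here are hypothetical.

    import random

    # Balanced random assignment to the three incentive conditions
    # (100 slots each), assuming respondents are assigned in arrival
    # order. Illustrative sketch only.
    INCENTIVES = ["$1.00", "$2.00", "$4.00"]
    slots = [amount for amount in INCENTIVES for _ in range(100)]
    random.shuffle(slots)  # randomize the order of the 300 condition slots

    def assign_condition(arrival_index: int) -> str:
        """Return the incentive condition for the nth arriving respondent (0-based)."""
        return slots[arrival_index]

    # Example: condition for the first respondent
    print(assign_condition(0))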

The TryMyUI condition will be conducted using TryMyUI panel members. Researchers define study eligibility criteria (e.g., only employed individuals; a mix of education, gender, and age groups), and eligible participants are invited by TryMyUI to participate and sent a link to the Qualtrics web survey. The sequence and content of this survey will be identical to those in the other two conditions – i.e., ORS survey questions followed by cognitive probes. TryMyUI participants will provide verbal answers to the cognitive questions as they complete their think-aloud protocols; these verbal reports will be recorded, in addition to the substantive answers participants provide through the Qualtrics survey. TryMyUI limits tasks to 20 minutes and charges researchers $25 per completed interview. We intend to run 45 TryMyUI participants.


  3. Participants

Twenty participants will be recruited from the OSMR participant database for the in-lab condition. During the recruiting process, recruiters will screen individuals for employment status and occupation and collect basic demographic information. Individuals will be directed to come to the OSMR lab at BLS. Participants will be balanced across the targeted occupational groups, as well as by education and age (see Appendix A for screener questions).

Three hundred participants will be recruited from the MTurk platform for the embedded-probe condition. The recruitment announcement will identify eligibility criteria (employment status, targeted occupational groups) and eligible individuals will be sent a link to the Qualtrics survey.

Forty-five participants will be recruited from the TryMyUI research panel using the same recruitment criteria as above. Participants will be sent a link to the Qualtrics survey.


  4. Burden Hours

The target sample size for this study is 365 participants. We anticipate that each in-lab session will last no longer than 30 minutes. Screening potential participants for the in-lab condition is estimated to take five minutes per participant. Since individuals will be responding to targeted recruitment ads and/or will have been participants in prior OSMR studies, we expect a fairly high success rate (approximately 65%). This means we will have to screen 31 people in order to find 20 eligible participants.

We anticipate that the online survey conditions (embedded-probe and TryMyUI) will last no longer than 20 minutes. Screening for the embedded-probe condition will be accomplished through the announcement ad and is estimated to take 0.5 minutes per individual. Based on previous study experience with comparable payment structures, we anticipate a moderate take-up rate among MTurk workers (approximately 50%). This means that we will have to screen 600 people in order to obtain 300 eligible participants.


TryMyUI sessions are limited to 20 minutes, and there is no additional screener time (TryMyUI pre-matches eligible participants and does not release their take rate).

Total burden hours for screening and participation for this study are expected to be 133.


Interview Task                    Number of Responses   Minutes per Respondent   Total Minutes   Burden Hours
Screening for in-lab collection   31                    5                        155             3
In-lab collection                 20                    30                       600             10
MTurk recruitment                 600                   0.5                      300             5
Embedded-probe collection         300                   20                       6,000           100
TryMyUI collection                45                    20                       900             15
Total                                                                            7,955           133
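To make the arithmetic behind these figures explicit, here is a minimal sketch (in Python, for illustration only) that reproduces the row and total estimates above from the eligibility rates and per-respondent times stated in the text; the function and variable names are ours, not part of the study materials.

    import math

    def screened_needed(target, rate):
        """People to screen to obtain `target` eligible participants at eligibility `rate`."""
        return math.ceil(target / rate)

    tasks = {
        # task: (number of responses, minutes per respondent)
        "Screening for in-lab collection": (screened_needed(20, 0.65), 5),   # 31
        "In-lab collection": (20, 30),
        "MTurk recruitment": (screened_needed(300, 0.50), 0.5),              # 600
        "Embedded-probe collection": (300, 20),
        "TryMyUI collection": (45, 20),
    }

    total_hours = 0
    for task, (n, minutes) in tasks.items():
        hours = round(n * minutes / 60)  # each row rounds to whole hours, as in the table
        total_hours += hours
        print(f"{task}: {n} x {minutes} min = {hours} burden hours")

    print(f"Total: {total_hours} burden hours")  # 3 + 10 + 5 + 100 + 15 = 133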


  5. Payment

In-lab participants will receive $40 for their time; MTurk/embedded-probe participants will receive $1, $2, or $4 for completing the task, split evenly among the three incentive groups; TryMyUI participants will receive $25.

  6. Data Confidentiality

Participants will be informed of the voluntary nature of the study and that the data will be used for internal research purposes only. Participants in the interviewer-administered conditions will be given a consent form to read and sign (Appendix C). Participants in the online conditions will not be given a pledge of confidentiality because data collected as part of this study will be stored on Qualtrics servers. The following notice will be placed on the first page of the survey:


This survey is being administered by qualtrics.com and resides on a server outside of the BLS domain. BLS cannot guarantee the protection of survey responses and advises against the inclusion of sensitive personal information in any response.


Appendix A – Screening Questions

Appendix B – Introduction

Appendix C – Consent Form – Interviewer-Administered (In Person and by Phone)

Appendix D – ORS Cognitive Demand Items

Appendix E – Debriefing Questions

Appendix F – Coding Scheme for Participant Responses




Appendix A: Screening Questions for In-Person Interviews and MTurk


  1. Are you currently employed either part-time or full-time?

    • Yes

    • No



  2. (If Yes to Q1) What is your current job title?




  3. (If eligible for study) How long have you worked as a [fill with occupation from Q2] for your current employer?


  • Less than a year

  • 1 – 2 years

  • 3 – 5 years

  • 6 – 10 years

  • More than 10 years


  4. (If eligible for study) What is the highest level of education you have achieved, or the highest grade you have completed?


  • Less than HS

  • HS diploma or equivalent

  • Some college but no degree

  • BA or BS

  • Post Graduate (MA/MS, PhD, MD)

  • Other, specify


  5. (If Yes to Q1) How old are you?





Appendix B: Introduction

  • Hi! Thank you for coming in today.

  • [If team member is observing] I have a colleague in the next room who will be observing and taking notes.

  • Explanation of the study purpose:

    • Today we’re going to be evaluating a new survey that collects information about the demands of occupations. BLS currently collects this information by speaking to managers, supervisors, or HR staff in a company, but we’re exploring the feasibility of collecting this information directly from the employees themselves. We will be asking you to complete a questionnaire, and then we will spend some time at the end discussing your reactions – for example, what you liked or disliked, how easy or difficult you found it, suggestions for how we can improve things.

    • Here is the questionnaire that we want you to complete. The instructions are simple: just complete each item. However, as you do so, it would help us if you could think out loud about what each question is asking and why you picked the answer that you did. I’ll be sitting right here, so feel free to share any thoughts that you have about a question and the answer choices you are given, for example, if there is a word that isn’t clear, or if you’re not sure what is being asked for, or you’re not sure what something means. Because I’m a slow note-taker, and to help make sure I don’t miss anything, I hope it’s okay with you if I record this session.

    • Before we begin, it’s important to note that we’re not here to evaluate you or your abilities. We’re speaking to a number of people with different backgrounds and in different jobs, and we’re really just trying to learn what works and what doesn’t. This questionnaire has to be usable by a wide variety of people, so we want it to be as easy to use as possible. Also, we’re pretty early in the process of developing this questionnaire, so we know it isn’t perfect by any stretch – don’t worry about hurting our feelings.



    • Consent Form [for interviewer-administered conditions] or confidentiality acknowledgement [for self-administered conditions]

  • Any questions before we begin?

Appendix C: Consent Form

OMB Control Number: 1220-0141

Expiration Date: April 30, 2018


CONSENT FORM


The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.


The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.


During this research you will be observed, and with your permission the session will be audio recorded. If you do not wish to be taped, you still may participate in this research.


We estimate it will take you an average of 30 minutes to participate in this research.


Your participation in this research project is voluntary and you have the right to stop at any time. If you agree to participate, please sign below.


Persons are not required to respond to the collection of information unless it displays a currently valid OMB control number. The OMB control number is 1220-0141 and expires April 30, 2018.


------------------------------------------------------------------------------------------------------------

I have read and understand the statements above. I consent to participate in this study.



___________________________________ ___________________________

Participant's signature Date



___________________________________

Participant's printed name



___________________________________

Researcher's signature



PRIVACY ACT STATEMENT

In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.





Appendix D: ORS Cognitive Demand Questions



Job Requirements Questionnaire


Instructions. Please answer each question below. Please “think out loud” as you do so.


  1. Work tasks are the regular duties of your job. Please briefly list your most important work tasks in the space below.








  2. How often do your work tasks change?

  • At least once per day

  • At least once per week, but less often than daily

  • At least once per month, but less often than weekly

  • Less than monthly, including never


  3. Work location is the physical site where you perform your work. How often does your work location change?

  • Does not change (my work location is permanent)

  • Changes up to four times a year

  • Changes more than four times a year


  4. Work schedule is the work hours and days that are set by your employer. Does your work schedule change?

  • Yes

  • No


  5. Does your job have faster and slower periods of work?

  • Yes

  • No


  6. What is the fastest pace of work that you perform?

  • Rapid, with no periods of waiting

  • Steady, with rare periods of waiting

  • Unhurried, with much time spent observing or waiting; rushed periods are rare or never occur




  7. Can you intervene and control the flow of your work?

  • Yes. I can change the priority of work tasks or the amount of time allotted to complete them

  • No. The work is primarily driven by business processes, production line speed, or customer demands


  8. How frequently is your work checked?

  • More than once per day

  • Once per day

  • At least once per week, but less often than daily

  • Less often than weekly


  9. What is the highest level of independent judgment you are expected to use to perform your work?

  • Employee uses independent judgment to select from a limited number of predetermined actions

  • Employee uses independent judgment to determine the most appropriate course of action in situations that do not have set responses

  • Employee uses independent judgment to make decisions by choosing from a large number of possibilities in situations where a high degree of uncertainty or complexity may exist




Appendix E: Debriefing Questions


Thanks for completing that survey. We are going to switch gears now and talk a bit about what it was like for you to answer those questions. I have some follow-up questions that will help us better understand how people are responding to this survey. Feel free to refer to the questionnaire to refresh your memory as we go through these questions.


I want to emphasize again though that there are no ‘right’ or ‘wrong’ answers to these questions – we are just interested in your interpretations and reactions.


What were your general reactions to the survey?



Work Tasks


  1. The first question asked you to list the regular duties of your job. Did you have any difficulties doing this?

  • Yes – enter explanation below

  • No



  2. Next, you were asked several questions about changes in your work routine, including your work tasks, your schedule, and the location of your work.


    a. How would you define a “work task” in your current job? Please give a couple of examples, and examples of when or how your work tasks change.



    b. How easy or difficult was it to figure out how often your work tasks change?

  • Very easy

  • Somewhat easy

  • Neither easy nor difficult

  • Somewhat difficult

  • Very difficult


Please tell me more about that. What made you give that answer? How did you arrive at it?



    c. How much variability is there in your job in terms of how frequently work tasks change? For example, is the pattern basically the same throughout the year, or does it vary? Besides time of year, are there other factors that affect this?




    d. Do you think that other individuals working in your job in this company would report the same level of changes in work tasks?

  • Yes

  • No


    e. [For both the work schedule and work location questions, ask:] How easy or difficult was it for you to answer the questions about changes in your work schedule and your work location?

  • Very easy

  • Somewhat easy

  • Neither easy nor difficult

  • Somewhat difficult

  • Very difficult


Why is that?



    f. What do you consider changes in work schedule to be?



    g. If your employer allows you to occasionally change the times you begin and end work, would you consider that a change in work schedule?

  • Yes

  • No


    h. [For participants who reported some change in work location, ask:] How did you arrive at your answer? Can you describe the types of location changes you thought about when answering this question? (Probe to gauge extent to which change is unique to this employee, etc.)




  3. The next question asked whether your job has faster and slower periods of work. How well does this question apply to your job?



How easy or difficult was this question for you to answer?

  • Very easy

  • Somewhat easy

  • Neither easy nor difficult

  • Somewhat difficult

  • Very difficult


  4. The next section asked a question about the pace of work in your job, whether it was rapid, steady, or unhurried. How well does this question apply to your job?


    a. In your own words, what does it mean to perform work at a rapid pace, with no periods of waiting; at a steady pace, with rare periods of waiting; or at an unhurried pace, with much time spent observing or waiting?




How easy or difficult was the Pace of Work question for you to answer?

  • Very easy

  • Somewhat easy

  • Neither easy nor difficult

  • Somewhat difficult

  • Very difficult


  5. The next question asked, “Can you intervene and control the flow of your work?”

In your own words, what does it mean to “intervene and control the flow of work”?




    a. Let’s look at the “no” answer for this question. What does it mean to have work driven by business processes?





    b. How easy or difficult was it for you to answer the question about intervening and controlling the flow of work?

  • Very easy

  • Somewhat easy

  • Neither easy nor difficult

  • Somewhat difficult

  • Very difficult


  6. The next question asked how frequently your work is checked. How well does this question apply to your job?



    a. What does it mean to have “work checked” in your job? (What is being checked? Who is doing the checking?)



    b. Is your work checked on a regular, predictable basis, or is it checked only occasionally, or even randomly?



    c. Can you give me some examples of the types of situations or activities that you thought about when answering this question?





How easy or difficult was this question for you to answer?

  • Very easy

  • Somewhat easy

  • Neither easy nor difficult

  • Somewhat difficult

  • Very difficult


  7. The last question asked about the highest level of independent judgment you are expected to use to perform your work. What are your reactions to this question?


    a. In your own words, please explain what it means to use “independent judgment” on the job.



    b. Let’s look at the first answer choice you were given, “Employee uses independent judgment to select from a limited number of predetermined actions.” In your own words, what is a “predetermined action”?




    c. Can you give any examples of your work tasks where you select from a limited number of predetermined actions?

  • Yes – if yes, please describe

  • No




    d. Let’s look at the second answer choice you were given. How would you describe a job task that does not have a set response?



    e. Can you give any examples of your work tasks where there are no set responses and you have to determine the most appropriate course of action?

  • Yes – if yes, please describe

  • No



    f. And, finally, are there any situations in your job that the third answer choice describes?

  • Yes

  • No


    g. And can you give any examples of your work tasks that are described by this answer?

  • Yes – if yes, please describe

  • No



How easy or difficult was this question for you to answer?

  • Very easy

  • Somewhat easy

  • Neither easy nor difficult

  • Somewhat difficult

  • Very difficult



    h. What made it [fill]?



  8. Sometimes two people looking at the same job might view it differently – for example, one person might view something as a requirement of the job while another views it as simply a choice, or they could have different estimates about the amount of time a person is required to spend doing something on a typical day. With that in mind, imagine that we had your supervisor fill out the same survey for your occupation.


    a. To what extent do you think you and your supervisor would agree on the level of cognitive demands in your occupation?

  • 100% agreement

  • 75%

  • 50%

  • 25%

  • 0%

  • Don’t know


Why is that?




Appendix F: Coding Scheme for Participant Responses



Question Comprehension

  a. A comment was made that the wording of the question stem was ambiguous, unclear, too abstract, or confusing
  b. A comment was made that the wording of a question response option was ambiguous, unclear, too abstract, or confusing
  c. Respondent interpreted the question incorrectly
  d. (In person) Respondent asked that the question be reread / (TryMyUI) Respondent had to read the question more than once
  e. A comment was made that the question was long or wordy
  f. (In person) Respondent interrupted reading of the question or response options / (TryMyUI) Respondent did not fully read the question and all of the response options out loud / (Embedded probe) Respondent indicated that he/she did not read all of the question or response options

Response Selection

  g. None of the response options applied; the correct response option was missing
  h. Struggled with selection of the best response option
  i. Answered the question but expressed uncertainty or lack of confidence about accuracy

Retrieval

  j. Gave (in person) or entered (online) a “don’t know” response because the respondent could not recall the information (e.g., number of hours of leave taken in the past 7 days)
  k. Gave (in person) or entered (online) a “don’t know” response because the respondent did not have the knowledge to answer the question (e.g., was not sure how often the employer reviews work)
  l. A comment was made that the question was not relevant to the respondent
  m. Respondent had difficulty recalling, formulating, or reporting the answer (e.g., remembering/calculating the frequency of work changes)

Other

  n. A positive comment was made (e.g., the question was easy, straightforward)
  o. The answer to the current question is not in agreement with, or is inconsistent with, an answer to a previous question

No Problems

  p. No problems noted
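For analysis, coded responses can be tallied by problem category and condition. The following minimal sketch (in Python, for illustration only) shows one way the scheme above might be applied; the observation data and condition labels are hypothetical, not study results.

    from collections import Counter

    # Map each code letter to its category, per the scheme above.
    CATEGORY = {
        **dict.fromkeys("abcdef", "Question Comprehension"),
        **dict.fromkeys("ghi", "Response Selection"),
        **dict.fromkeys("jklm", "Retrieval"),
        **dict.fromkeys("no", "Other"),
        "p": "No problems",
    }

    # Hypothetical coded observations: (condition, question, code)
    observations = [
        ("in-lab", "Q2", "a"),
        ("embedded-probe", "Q2", "c"),
        ("TryMyUI", "Q7", "h"),
    ]

    # Count problems by condition and category.
    tally = Counter((cond, CATEGORY[code]) for cond, _, code in observations)
    for (cond, category), n in sorted(tally.items()):
        print(f"{cond}: {category} = {n}")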





