
June 20, 2017




NOTE TO THE REVIEWER OF:

OMB CLEARANCE 1220-0141, “Cognitive and Psychological Research”


FROM:

Jean Fox and Robin Kaplan

Research Psychologists

Office of Survey Methods Research


SUBJECT:

Submission of Materials for “The Impact of Question Format on Reading Behavior” and “OES Envelopes and Forms”




Please accept the enclosed materials for approval under the OMB clearance package 1220-0141 “Cognitive and Psychological Research.” In accordance with our agreement with OMB, we are submitting a brief description of the work.


We are planning to run two short studies together, for a total of about 45 minutes.


One study is titled “The Impact of Question Format on Reading Behavior” and the other is “OES Envelopes and Forms.”


The total estimated respondent burden for these studies is 38 hours.


If there are any questions regarding this project, please contact Jean Fox at 202-691-7370.





The Impact of Question Format on Reading Behavior

Abstract

There have been numerous studies on how the format of a question can affect the response. The format can encourage respondents to skew their responses one way or another, or it can help them answer more quickly. These studies have generally evaluated such effects through the responses themselves. This study will look at the impact of question format in a different way: it will use eye tracking to explore how the format affects how respondents read questions and response options. The study will consider two versions of each of two question types: (1) mark-all-that-apply versus forced yes/no responses and (2) grid versus individual questions.

Background

There has been some research on how question formats can affect the way people respond to survey questions. Most of this research compares responses across conditions. Looking at the final results (the responses) has helped us understand the overall response process, but it has not helped us understand the individual steps leading to a response that might affect the results. In this study, we will use eye tracking to explore how one step in the response process, the reading of the questions, might be affected by question format.

Eye Tracking

Eye-tracking equipment allows researchers to monitor respondents’ eye movements from one portion of the interface to another as they interact with a website. This technology can uncover valuable insights about the best way to design web survey interfaces, and it can be used to test the validity of general principles of survey design. For example, researchers can monitor how much time respondents spend reading instructions or individual questions, trace the path their eyes follow as their gaze moves around the page, and identify the features on the page that their gaze dwells on.

There are a number of eye-tracking studies looking at the design of form items (e.g., data entry mechanisms, labels, instructions), but very few looking at question formatting. We can combine evidence from eye-tracking studies with the results of other types of studies to better understand how respondents read and respond to survey items.

This study will look at two issues where prior research has suggested that one format may be better than another. We will study these issues by looking at how participants read both the questions and the response options. The first issue is whether asking respondents to pick “yes” or “no” differs from asking them to “mark all that apply.” The second issue is whether listing questions with similar response scales separately differs from presenting them together in a grid.

Eye-tracking data can provide objective evidence of how respondents look at questions and help fill gaps in our understanding. There may be other factors that affect how respondents read questions (such as who the respondents are, the topic of the questions, and whether the items are factual or attitudinal). This study will begin to look at these issues and provide guidance for future studies.

Question Types

In this study, we will look at how two different question formats affect the way participants read questions and responses. The sections below describe the types of questions to be included in the study.

“Mark All That Apply” Lists versus Forced Choice of Yes or No

Previous research has shown that respondents provide more affirmative answers in the forced-choice format, where the respondent is asked to choose either an affirmative response (e.g., “yes”) or a negative response (e.g., “no”), than in the check-all-that-apply format, where respondents mark all the responses they feel apply to them. This effect occurs on paper questionnaires (Rasinski, Mingay, and Bradburn, 1994) as well as on Web forms (Smyth, Dillman, Christian, and Stern, 2006).

There have been several theoretical explanations for this effect, including satisficing and depth of processing (Rasinski et al., 1994; Smyth et al., 2006) and acquiescence bias, the tendency to be agreeable (Callegaro, Murakami, Tepman, and Henderson, 2015). In addition, some researchers have conjectured that the type of item, factual versus opinion, may also play a role (Smyth et al., 2006).

Smyth et al. (2006) found that the forced-choice (yes/no) format appears to invoke deeper processing and to eliminate satisficing behavior that occurs among some respondents using the check-all format. They reported little evidence of acquiescence in the forced-choice format.

These studies show that respondents approach the two question formats differently. However, no one has studied how people actually read the questions. For example, do respondents read one format more thoroughly than another? This study will evaluate whether there are differences in how respondents read the questions, which should provide additional insight into how people process the different formats.

Grid Layouts versus Individual Items

Researchers often prefer to use grids to display a series of questions that share the same response scale. The grid layout allows for more questions on a page and may be seen as an easy way to lay out and present questions.

However, research has shown that presenting survey items in grids leads to greater straightlining (picking the same response for multiple questions), failure to notice when items are reverse-coded, faster response times (speeding), higher breakoff rates, and higher item nonresponse rates (Tourangeau, Couper, & Conrad, 2004). The authors speculated that these effects occur because respondents use a “near means related” heuristic, the assumption that items in close physical proximity are also related conceptually. As a result, respondents may perceive all of the items in the grid to be highly related to one another and might not read each item carefully.

An alternative hypothesis is that grids are more complex to navigate than questions presented individually (Couper, 2000). Respondents may take longer to orient themselves to a grid than to an individual question. However, since they only need to do this once per grid, the impact on the response process might depend on the number of questions.

Thus, there are several theories about how respondents process grid questions. This proposed research will use eye tracking to investigate differences in how respondents read and navigate among items presented in grids versus those same items presented individually. By examining gaze patterns and fixation durations, we will explore whether respondents read grid items as carefully and whether they experience more navigational difficulties with grids than with individual questions. Because many surveys both inside and outside of BLS use grids, this research will help improve our knowledge of how respondents navigate grids and will have widespread applications.

Experimental Design

In this study, participants will complete a short online survey while the Tobii eye tracker monitors their eye movements. We will evaluate the results of the eye tracking to identify any patterns in their reading, especially between the two conditions in each question format.



Research Questions

In this study, we will explore the research questions listed below.

Topic: Lists (“Mark All That Apply” vs. Yes/No)

Number of questions: 6 questions with 12 items each; each participant answers 3 in each format.

Research questions and dependent measures:

  • Do people spend more time reading the Yes/No list than the mark-all list?
    Dependent measures: eye-tracking measures from the response-option areas (fixation duration and total number of fixations)

  • Do people make more selections in the Yes/No list?
    Dependent measure: number of items selected for each question

  • Do people respond more quickly with one format?
    Dependent measure: total time answering each question


Topic: Grids (Grid vs. individual items)

Number of questions: 6 topics with 10 items each; each participant answers 3 in each format.

Research questions and dependent measures:

  • Do people spend more time reading the question text with individual questions than with questions in a grid?
    Dependent measures: eye-tracking measures from the question-text areas (fixation duration and total number of fixations)

  • Do people spend more time reading the response options with individual items?
    Dependent measures: eye-tracking measures from the response-option areas (fixation duration and total number of fixations)

  • Do people respond more quickly with one format?
    Dependent measure: total time answering each question
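
To make the eye-tracking dependent measures above concrete, the Python sketch below shows one way that fixation duration and fixation counts could be aggregated within an area of interest (AOI), such as the response-option region of a question. It is a hypothetical illustration only; the data structures, field names, and coordinates are assumptions made for this example, not the Tobii export format or the analysis software that will actually be used.

from dataclasses import dataclass

@dataclass
class Fixation:
    x: float            # gaze x-coordinate in pixels (hypothetical field)
    y: float            # gaze y-coordinate in pixels (hypothetical field)
    duration_ms: float  # fixation duration in milliseconds

@dataclass
class AOI:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, f: Fixation) -> bool:
        return self.left <= f.x <= self.right and self.top <= f.y <= self.bottom

def aoi_metrics(fixations: list[Fixation], aoi: AOI) -> dict:
    """Total fixation duration and fixation count inside one area of interest."""
    inside = [f for f in fixations if aoi.contains(f)]
    return {
        "total_fixation_duration_ms": sum(f.duration_ms for f in inside),
        "fixation_count": len(inside),
    }

# Example: a made-up AOI covering the response options of one question
response_options_aoi = AOI(left=400, top=300, right=900, bottom=700)
sample_fixations = [Fixation(450, 350, 220), Fixation(120, 80, 180), Fixation(600, 500, 310)]
print(aoi_metrics(sample_fixations, response_options_aoi))
# -> {'total_fixation_duration_ms': 530, 'fixation_count': 2}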


Survey Questions

The questions in the survey will be on the following topics:

  • List Questions

    • 1: Cell phone apps used

    • 2: Languages spoken

    • 3: Purchases made (1)

    • 4: Purchases made (2)

    • 5: Exercise engaged in

    • 6: Sports watched


  • Grid Questions

    • 1: Frequency of activities

    • 2: Importance of work benefits

    • 3: Enjoyment of music genres

    • 4: Frequency of transportation by mode

    • 5: Importance of activities for health

    • 6: Enjoyment of movie genres


See Appendix A for a description of the structure and randomization of the survey. The exact questions used are in Appendix B. A preview of the survey is available online. Note that in the preview, you will only see what an individual participant would see, which is one version of each question.

Debriefing Questions

After participants complete the survey, we will conduct a post-test interview addressing their thoughts about the different question formatting options. The interview will cover the following questions:

  • In this study, we are looking at how the format of the question can impact responses. Do you have any thoughts on how the format impacted your responses for the List questions? For the Grid questions?

  • Can you compare your experience with the “mark all that apply” questions versus the “yes/no” questions?

  • Can you compare your experience with the grid questions versus the questions presented individually?

References for Question Format Study

Callegaro, M., Murakami, M. H., Tepman, Z., & Henderson, V. (2015). Yes-no answers versus check-all in self-administered modes. International Journal of Market Research, 57(2), 203-223.

Couper, M.P. (2000). Usability evaluation of computer-assisted survey instruments. Social Science Computer Review, 18(4), 384-396.

Rasinski, K. A., Mingay, D., & Bradburn, N. M. (1994). Do respondents really “Mark all that apply” on self-administered questions? Public Opinion Quarterly, 58(3), 400-408.

Smyth, J. D., Dillman, D. A., Christian, L. M., & Stern, M. J. (2006). Comparing check-all and forced-choice question formats in web surveys. Public Opinion Quarterly, 70(1), 66-77.

Tourangeau, R., Couper, M. P., & Conrad, F. (2004). Spacing, position, and order: Interpretive heuristics for visual features of survey questions. Public Opinion Quarterly, 68, 368-393.



OES Envelopes and Forms

Background

The design and administration of mail survey materials such as envelopes, advance letters, and survey response requests is crucial to maintaining high response rates in a climate where survey response is declining (Dillman, 1991; Miller, 2017). Best practices for mail materials include printing the organization’s logo on the mailing to increase legitimacy, sending multiple mailings with a different look and appeal, and using color to make the envelope stand out from other pieces of mail, such as junk mail or marketing materials (Dillman et al., 2014). However, much of this research was conducted with household respondents rather than with the business respondents of establishment surveys. Establishment respondents may have different concerns than household respondents that affect their decision to participate in a survey, including the authority of the survey requestor, the benefit the business will receive from participating, and having the time and resources to respond (Snijkers et al., 2013). Yet establishment survey response remains less well researched than household survey response. Thus, it is critical that BLS research best practices for ensuring that our mailing materials reach and motivate our respondents.

The Occupational Employment Statistics (OES) program at BLS collects employment and wage data from establishments. OES is interested in conducting research to increase response rates for its surveys by improving its mailing materials, such as envelopes, advance letters, and brochures (see Appendix C for mock-up samples of the materials to be tested). To get this feedback, we propose using eye tracking to determine which areas people look at and find most appealing in the OES mailing materials and how to improve them going forward. Thus, the goal of this study is to add information about how participants interact with these materials. By tracking their eye movements as they look at the materials, we can better understand what words, if any, they focus on and what type of reading behavior they exhibit. Although participants will be recruited from the general public, their reading behavior and gaze patterns are expected to be similar to those of actual OES respondents. Further, this research is part of a larger project that will recruit establishment respondents from a variety of industries and size classes for feedback on these OES mailing materials. We hope to use the results to provide guidance on how to improve response rates and motivate respondents more generally throughout BLS.

Experimental Design

Participants will be instructed to imagine that they are an administrator working at a business and that it is their responsibility to answer the postal mail they receive. They will then be instructed to imagine that they receive a stack of mail one day and that among that mail are envelopes with the OES report request enclosed. They will be asked to view each envelope at their own pace, and afterward we will ask them follow-up questions about each one.

Participants will view an electronic version of each envelope and form. For the envelope testing, they will be instructed to imagine they are an administrator for a business in Alabama; for the form testing, they will be instructed to imagine they are an administrator for a hospital in Colorado who received the forms in the mail. See Appendix C for the envelopes and forms to be used in the test. Participants will view the forms at their own pace and then answer questions about them afterward. For the full protocol, see Appendix D.

References for OES Envelopes and Forms Study

Dillman, D. A. (1991). The design and administration of mail surveys. Annual Review of Sociology, 17(1), 225-249.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: the tailored design method. John Wiley & Sons.

Miller, P. V. (2017). Is there a future for surveys? Public Opinion Quarterly, 81(S1), 205-212.

Snijkers, G., Haraldsen, G., Jones, J., & Willimack, D. (2013). Designing and Conducting Business Surveys (Vol. 568). John Wiley & Sons.



Full Study Protocol

At the beginning of the test session, we will describe the study and have the participant sign the informed consent form (Appendix E), which has been modified to describe the eye tracking. Next, we will calibrate the eye tracker for the participant and then start the study. We do not expect any interaction between the two studies; we will conduct the question format study first for all participants.

We will not interrupt the participants or ask them to think aloud during the study, as we are interested in their natural reading patterns.

At the end of the session, we will pay the participants and have them sign a receipt.

Participants

The goal is to get data from 40 general-population participants for each study. Because the eye tracker does not work with all participants, we will aim to run 50 participants total. We expect each study to take approximately 20 minutes. The burden will therefore be 50 participants at 20 minutes for each study plus about five minutes between studies, that is, 2,250 minutes, or 38 burden hours.
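
Spelled out, the burden arithmetic (all values from the paragraph above) is:

\[
50 \ \text{participants} \times (20 + 20 + 5) \ \text{minutes} = 2{,}250 \ \text{minutes} = 37.5 \ \text{hours} \approx 38 \ \text{burden hours}
\]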

Participants will receive $40 for their participation, which should take about 45 minutes.

Confidentiality

Participants will be informed of the voluntary nature of the study. Participants will be given a consent form to read and sign (see Appendix E). Information related to this study will not be released to the public in any way that would allow identification of individuals, except as prescribed under the conditions of the Privacy Act Notice.



Appendix A: Structure of the Question Format Survey

Structure of the Survey

Survey 1 (List questions)

  • Block 1: Yes/No versions of questions 1-3; Mark-all versions of questions 4-6

  • Block 2: Mark-all versions of questions 1-3; Yes/No versions of questions 4-6

Survey 2 (Grid questions)

  • Block 3: Grid versions of questions 1-3; Individual versions of questions 4-6 (each “individual question” is really a page containing a series of 10 questions, corresponding to the 10 items in the grid)

  • Block 4: Individual versions of questions 1-3; Grid versions of questions 4-6


  • For the Yes/No, Mark-all, and Grid questions, each question will be on its own page. For the individual version of the grid questions, all the questions will appear on the same page, to parallel the grid version of the question.

  • To counterbalance the presentation of questions and format, participants will see questions 1-3 of each type in one format and questions 4-6 in the other.

  • To minimize order effects, we have built two separate surveys.

    • In the first survey, participants will be randomly assigned to either block 1, which contains Yes/No questions 1-3 and Mark all questions 4-6, or to block 2, which contains Yes/No questions 4-6 and Mark all questions 1-3.

    • At the end of the first survey, SurveyMonkey will automatically take participants to the second survey, which uses the same strategy for randomizing the grid and individual questions. Due to limitations of the randomization features of SurveyMonkey, it wasn’t possible to randomize the order of the surveys.

    • Within an individual Yes/No, Mark all, and grid question, the items (rows) will be randomized.

    • Likewise, the 10 questions for each “Individual” grid condition will be randomized within a page.

    • Within each of the two surveys, the order of the questions (or of the groups of 10 “individual” grid questions) is randomized, as illustrated in the sketch following this list.
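
The sketch below is a minimal Python illustration of the counterbalancing and randomization logic described in this appendix. It is illustrative only: the actual randomization is handled by SurveyMonkey, the question labels are placeholders, and the item-level (row) randomization within each question is omitted for brevity.

import random

# Placeholder labels for the 6 list topics and 6 grid topics (see Appendix B).
LIST_QUESTIONS = [f"List Q{i}" for i in range(1, 7)]
GRID_QUESTIONS = [f"Grid Q{i}" for i in range(1, 7)]

def assign_block(questions, format_a, format_b):
    """Randomly choose one of the two counterbalanced blocks (questions 1-3 in
    one format, questions 4-6 in the other), then randomize question order."""
    if random.random() < 0.5:  # e.g., Block 1 or Block 3
        block = [(q, format_a) for q in questions[:3]] + [(q, format_b) for q in questions[3:]]
    else:                      # e.g., Block 2 or Block 4
        block = [(q, format_b) for q in questions[:3]] + [(q, format_a) for q in questions[3:]]
    random.shuffle(block)      # question order is randomized within the survey
    return block

def build_session():
    """Survey 1 (list questions) always precedes survey 2 (grid questions)."""
    survey1 = assign_block(LIST_QUESTIONS, "Yes/No", "Mark all")
    survey2 = assign_block(GRID_QUESTIONS, "Grid", "Individual")
    return survey1, survey2

survey1, survey2 = build_session()
print("Survey 1:", survey1)
print("Survey 2:", survey2)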




Appendix B: Survey Questions for the Question Format Study

List Questions

Lists Question 1: Cell phone apps

Mark all version: Which of the following cell phone apps do you use? (Mark all that apply)

Yes/No version: Do you use the following cell phone apps?

  • Banking services

  • Camera

  • Email

  • Games

  • Mapping / Navigating

  • News services

  • Podcast player

  • Social media

  • Streaming music

  • Streaming video

  • Weather

  • Web Browser


Lists Question 2: Languages

Mark all version: What languages are you able to read or speak at even a basic level? (Mark all that apply)

Yes/No version: Can you read or speak the following languages at even a basic level?

  • Chinese

  • English

  • French

  • German

  • Hindi

  • Italian

  • Japanese

  • Korean

  • Portuguese

  • Spanish

  • Swahili

  • Urdu




Lists Question 3: Purchases

Mark all version: Which of the following items have you purchased in the last month? (Mark all that apply)

Yes/No version: Have you purchased any of the following items in the last month?

  • Automobile

  • Large kitchen appliances

  • Housing (rent or mortgage payments)

  • Utilities (e.g., gas, water, electricity)

  • Furniture

  • Cell phone service

  • Digital book reader or tablet

  • Clothing or shoes

  • Tuition or child care

  • Food at a restaurant

  • Food for consumption at home

  • Public transportation


Lists Question 4: Purchases

Mark all version: Which of the following items have you purchased in the last month? (Mark all that apply)

Yes/No version: Have you purchased any of the following items in the last month?


  • Personal care products (e.g., hair care products, skin care products)

  • First aid products

  • Over the counter medicine

  • Housekeeping supplies (cleaners, paper goods)

  • Hardware or other home maintenance supplies

  • Household services (housecleaning, home maintenance, or yard work)

  • Entertainment (e.g., movies, music, tickets to concerts or sporting events)

  • Long distance travel (not local commuting)

  • Books or magazines (including digital and print editions)

  • Charitable donations

  • Gasoline

  • Dry cleaning or tailoring




Lists Question 5: Exercise

Mark all version: Which of the following types of exercise have you engaged in anytime in the last year? (Mark all that apply)

Yes/No version: Have you engaged in the following types of exercise anytime in the last year?

  • Biking / Spinning

  • Dancing

  • Martial Arts

  • Rowing

  • Running

  • Skating

  • Swimming

  • Tai Chi

  • Tennis

  • Walking / Hiking

  • Weightlifting

  • Yoga


Lists Question 6: Sports

Mark all version: Which of the following sports do you enjoy watching? (Mark all that apply)

Yes/No version: Do you enjoy watching the following sports?

  • Auto Racing

  • Baseball

  • Basketball

  • Boxing

  • Ice Hockey

  • Cricket

  • Football

  • Golf

  • Horse Racing

  • Lacrosse

  • Soccer

  • Tennis




Grid vs Individual Questions

Grid Question 1: Frequency

Grid: How often do you do the following?

Individual: How often do you <activity>?

  • Spend an evening with friends

  • Attend a sporting event

  • Attend a class of any kind

  • Play a computer game

  • Exercise

  • Go to the library

  • Volunteer for charity

  • Cook a meal

  • Read a book for pleasure

  • Watch TV


Options:

  • Every day

  • Several times a week

  • Several times a month

  • Several times a year

  • Never


Grid Question 2: Importance

Grid: At work, how important is it to have the following?

Individual: At work, how important is it to have <item>?

  • Services such as a post office or bank

  • Music in the common areas

  • Opportunities for advancement

  • Time off

  • Benefits other than time off

  • Windows in your workplace

  • Parking

  • A cafeteria

  • Manageable commute

  • Good location


Options:

  • Extremely important

  • Very important

  • Moderately important

  • Slightly important

  • Not at all important




Grid Question 3: Enjoyment

Grid: How much do you enjoy listening to the following types of music?

Individual: How much do you enjoy listening to <genre>?

  • Alternative

  • Blues

  • Classical

  • Country

  • Electronic

  • Hip-hop

  • Jazz

  • Pop

  • R&B

  • Rock


Options:

  • Enjoy a lot

  • Enjoy a little

  • Neither enjoy nor dislike

  • Dislike a little

  • Dislike a lot


Grid Question 4: Frequency

Grid: How often do you travel by the following modes?

Individual: How often do you travel by <mode>?

  • Commercial Airline

  • Private Airplane

  • Automobile

  • Commuter Train

  • Amtrak Train

  • Boat

  • Taxi/Uber/Lyft

  • Bicycle

  • Motorcycle

  • Bus


Options:

  • Every day

  • Several times a week

  • Several times a month

  • Several times a year

  • Never




Grid Question 5: Importance

Grid: For good health, how important is it to do the following?

Individual: For good health, how important is it to <do activity>?

  • Exercise

  • Eat healthy foods

  • Meditate

  • Spend time with friends

  • Spend time with family

  • Run a marathon

  • Limit time watching television

  • Read books

  • Do crossword puzzles

  • Go to bed early


Options:

  • Extremely important

  • Very important

  • Moderately important

  • Slightly important

  • Not at all important


Grid Question 6: Enjoyment

Grid: How much do you enjoy the following types of movies?

Individual: How much do you enjoy <genre> movies?

  • Action / Adventure

  • Animated

  • Comedy

  • Documentary

  • Drama

  • Musical

  • Mystery

  • Romance

  • Science Fiction

  • Western


Options:

  • Enjoy a lot

  • Enjoy a little

  • Neither enjoy nor dislike

  • Dislike a little

  • Dislike a lot

Appendix C: Envelopes and Forms



Envelope 1:





Envelope 2:











Envelope 3:


You’ve been selected to participate in the May 2017 Occupational Employment Statistics (OES) report.



See inside for how your participation provides vital wage and employment data.



Form 1:

Form 2:





Appendix D: Protocol for OES Envelopes and Forms Study

Imagine that you are a busy administrator for a business in Alabama and receive a stack of postal mail each day. On the next few pages, you will be asked to imagine you received the following envelopes at your desk. Please view each envelope at your own pace, and then we will ask you some follow-up questions about them later.

--page break--

[insert envelope 1]

--page break--

[insert envelope 2]

--page break--

[insert envelope 3]

--page break--

Now we’d like you to imagine you are a busy administrator working for a hospital in Colorado. On the next few pages, you will view some forms telling you more information about participating in the Occupational Employment Statistics report. Please view each form at your own pace, and then we will ask you some follow-up questions about them later.

--page break--

[insert form 1]

--page break--

[insert form 2]

--page break--

Next we’d like you to answer some questions about the envelopes you viewed earlier.

[insert pictures of all 3 envelopes, labeled, randomized order]

What were your reactions to this envelope? [insert envelope 1 / envelope 2 / envelope 3 individually for each] ______________

Which envelope do you most prefer?

  • Envelope 1

  • Envelope 2

  • Envelope 3

Which envelope do you think businesses would be most likely to open?

  • Envelope 1

  • Envelope 2

  • Envelope 3

-- page break --

Which logo do you prefer to appear on the envelope?

  • [insert DOL logo]

  • [insert BLS logo]

What language do you prefer to appear above the address window?

  • “Information about submitting data inside”

  • “Official government correspondence”

What language do you prefer to appear below the address window?

What language do you prefer to appear on the left side of the envelope, below the return address?

  • “U.S. Bureau of Labor Statistics Occupational Employment Statistics Report Enclosed”

  • “You’ve been selected to participate in the May 2017 Occupational Employment Statistics (OES) report. See inside for how your participation provides vital wage and employment data.”

-- page break --

Please look over the first form below again:

[insert form 1]

What were your reactions to this form? _____________

How effective do you think this form would be at getting businesses to complete the OES report?

  • Not at all effective

  • Slightly effective

  • Moderately effective

  • Very effective

  • Extremely effective

-- page break --

Please click ONCE on the words, images, or figures that you find the most persuasive to participate in the OES report.

[insert form 1]

-- page break --

Please look over the second form below again:

[insert form 2]

What were your reactions to this form? _________________

How effective do you think this form would be at getting businesses to complete the OES report?

  • Not at all effective

  • Slightly effective

  • Moderately effective

  • Very effective

  • Extremely effective

-- page break --

Please click ONCE on the words, images, or figures that you find the most persuasive to participate in the OES report.

[insert form 2]





OMB Control Number: 1220-0141

Expiration Date: April 30, 2018

Appendix E: Consent Form

The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.

The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.

During this research you and your activity on the computer will be recorded, and you may be observed. In addition, we will record your eye movements using an eye tracker. This will help us understand how you are processing the content in the study. The eye tracker follows reflections of light on your eyes to determine what you are looking at on the monitor.

If you do not wish to be recorded, you still may participate in this research.

We estimate it will take you an average of 45 minutes to participate in this research (ranging from 30 minutes to 60 minutes).

Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.

Persons are not required to respond to the collection of information unless it displays a currently valid OMB control number. The OMB control number is 1220-0141 and expires April 30, 2018.

------------------------------------------------------------------------------------------------------------

I have read and understand the statements above. I consent to participate in this study.



___________________________________ ___________________________

Participant's signature Date



___________________________________

Participant's printed name



___________________________________

Researcher's signature



PRIVACY ACT STATEMENT

In accordance with the Privacy Act of 1974 (DOL/BLS – 14 BLS Behavioral Science Research Laboratory Project Files (81 FR 47418)), as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data.






