Supporting Statement Part A (1220-0141) Final


Cognitive and Psychological Research

OMB: 1220-0141











BUREAU OF LABOR STATISTICS

OMB CLEARANCE PACKAGE






for

CLEARANCE TO CONDUCT COGNITIVE AND PSYCHOLOGICAL RESEARCH

IN FY2018 THROUGH FY2020









Prepared by

BEHAVIORAL SCIENCE RESEARCH CENTER

OFFICE OF SURVEY METHODS RESEARCH

BUREAU OF LABOR STATISTICS



2017



Request for Extension of Clearance to Conduct

Cognitive and Psychological Research


Abstract


This is a request for clearance by the Bureau of Labor Statistics' (BLS) Behavioral Science Research Center (BSRC) to conduct research to improve the quality of data collection by examining the psychological and cognitive aspects of methods and procedures. BLS staff, employing state-of-the-art cognitive psychological testing methods, will conduct these research and development activities. The use of cognitive techniques to improve the quality of data collection has been advocated by the Cognitive Aspects of Survey Methodology seminar sponsored by the National Academy of Sciences and by participants in a questionnaire design advisory conference. The planned research and development activities will be conducted during FY2018 through FY2020 with the goal of improving overall data quality through improved procedures.

Supporting Statement


A. Justification


1. Collection of Information


The Bureau of Labor Statistics' Behavioral Science Research Center (BSRC) conducts psychological research focusing on the design and execution of the data collection process in order to improve the quality of data collected by the Bureau. The BSRC conducts research aimed at improving data collection quality by assessing questionnaire/form management and administration, as well as issues which relate to interviewer training and interaction with respondents in the interview process. BSRC staff work closely with economists and/or program specialists responsible for defining the concepts to be measured by the Bureau of Labor Statistics' collection programs.


Both questionnaires and forms are used in the Bureau's surveys. Questionnaires specify the preferred wording of the questions to be asked, whereas forms specify the data items to be collected. Each possesses distinctive problems which in many cases can be related to respondent characteristics, survey content, or format of administration. Such problems impede the effectiveness of surveys and the mission of the Bureau in general.


The purpose of this request for clearance for cognitive psychological research and development activities by the BSRC is to enhance the quality of the Bureau's data collection procedures and overall data management. The basic goal of the BSRC is to improve, through interdisciplinary research, the quality of the data collected and published by the BLS. BLS is committed to producing the most accurate and complete data within the highest quality assurance guidelines. It is with this mission in mind, then, that BSRC was created to aid in the effort of not only maintaining, but also improving the quality of the data collection process.


This laboratory was established in 1988 by Commissioner Janet Norwood to employ behavioral science to investigate all forms of oral and written communication used in the collection and processing of survey data. This exploration also includes all aspects of data collection, such as mode, manuals, and interviewer training. BSRC performs a state-of-the-art service for many programs within BLS, the DOL, and other agencies as requested, providing questionnaire redesign efforts, survey updates, and improvements in the overall quality of data collection management. These efforts, in turn, increase data quality and reduce respondent burden. The techniques proposed here have been successfully applied to many BLS surveys.


The research techniques and methods to be used in these studies will include analyses of questionnaire construction and the interview process, as well as survey technology. Within the structure of the questionnaire, analyses will be conducted in the following domains:


  1. Question Analysis - Evaluation of individual questionnaires appraising question intent, assessment of semantic clarity, and an examination of relationships between questions.


  2. Term Analysis - Evaluation of specific wording and phrases in terms of their psycholinguistic properties and an assessment of respondent interpretation of the meaning of these terms, at both the conscious and unconscious levels.


  3. Instruction Analysis - Inspection of instructions for their semantic clarity, the degree to which they reflect the stated intention of investigators, ease of interpretation, and other considerations which may elicit unambiguous and appropriate answers or behaviors from respondents or interviewers.


  4. Format Analysis - Review of questionnaires or subsets of questions for perceptual characteristics in order to facilitate better respondent comprehension and to promote more focused attention on the questionnaire or form.


Within the interview process, several analyses are conducted to assess nonverbal communication, interpersonal dynamics, and symbolic interaction--the use of cultural symbols to make social statements. Staff conducts research to evaluate the overall effectiveness of data collection procedural characteristics, including:


  1. Interviewer Characteristics and Behavior Analysis - Study of the presentation of appearance, manner, relation to subject population, etc., in order to enhance interpersonal skills of interviewers in general and develop and improve procedures for the training of interviewers.


  2. Respondent Characteristics and Behavior Analysis - Assessment of the social, cultural, and ethnic characteristics of the respondent and how that may bear upon interactions with the interviewer. Staff members also observe the behavior of respondents for cues concerning their reactions to the interview process. Because BLS constantly collects data from different populations that change over time, the analysis of respondent characteristics needs frequent updating.


  3. Mode Characteristics - Examination of the unique properties of interviewer and/or respondent behavior as a function of the media used to collect data; for example, self-administered interviews, personal interviews, telephone interviews, and interviews utilizing assistive technologies (e.g., CAPI, CASI, and CATI).


  4. Usability Analysis - Evaluation of the effectiveness, efficiency, and satisfaction with which respondents complete tasks assigned to them, especially when using self-guided instruments (PAPI or CASI).



  5. Data Collection Methodology Analysis - Assessment of alternative formats for collecting survey data (e.g., respondent-provided records, administrative records). Staff will evaluate the validity and reliability of data collected through the alternative methodology as well as the level of respondent burden relative to current procedures.


BLS also uses a variety of methodologies, such as usability analysis, debriefings, and in-depth interviews, to better understand how to communicate more effectively with its stakeholder and user communities through its website and other materials.


2. The Purpose of Data Collection


The purpose of BSRC's data collection is to improve Federal data collection processes through scientific research. Theories and methods of cognitive science provide essential tools for the development of effective questionnaires. For example, they can provide an understanding of how respondents comprehend survey questions, recall relevant facts from memory, make judgments, and respond. On the basis of such knowledge, questions can be tailored to increase the accuracy and validity of the collected information and to reduce respondent burden. Similar improvements can be made with respect to other aspects of the data collection process.


BSRC’s research contributes to BLS and to the entire survey research field. Research results are shared with the Bureau through seminars, training sessions, reports, publications, and presentations at professional conferences. The BSRC staff has instituted a method of peer review to encourage high standards of social science research practice. A list of BSRC staff publications and internal reports covering the last five years can be found in Attachment I.


The BSRC’s research is expected to 1) improve the data collection instruments employed by the Bureau, 2) increase the accuracy of the economic data produced by BLS and on which economic policy decisions are based, 3) increase the ease of administering survey instruments for both respondents and interviewers, 4) increase response rates in panel surveys as a result of reduced respondent burden, 5) increase the ease of use of the BLS website and other BLS products, and 6) enhance BLS’s reputation, resulting in greater confidence in and respect for the survey instruments used by BLS.

The application of cognitive and psychological theories and methods to survey data collection is widespread and well established. The consequences of failing to scientifically investigate the data collection process are a lag in the use of accepted practices, reliance on archaic survey development techniques based on intuition and trial and error, and ultimately a severe cost in data quality and in burden to respondents, interviewers, and data users alike. For example, without knowledge of what respondents can be expected to remember about the past and how to ask questions that effectively aid retrieval of the appropriate information, survey researchers cannot ensure that respondents will not take shortcuts to avoid careful thought in answering the questions, or be subject to undue burden. Likewise, without investigation of the interviewers’ roles and abilities in the data collection process, survey researchers cannot ensure that interviewers will read their questions correctly with ease and fluency, or record respondents’ answers correctly.


3. Use of Improved Technology


Staff members will design, conduct, and interpret field and laboratory research that contributes new knowledge of the cognitive aspects of human behavior in relationship to questionnaire design and survey methodology. Cognitive psychological research methods in use include such techniques as probing questioning, memory cueing, group discussion, and intensive interviewing. Depending on research goals, these methods may be used separately or in combination with one another.


The laboratory approach has a number of advantages, including rapid and in-depth testing of questionnaire items, a more detailed understanding of respondents’ comprehension of concepts, and access to special populations who can be quickly recruited and tested. Different laboratory methods will be used in different studies depending on the aspects of the data collection process being studied. Computer technology will be used when appropriate to aid respondents and interviewers and minimize burden.


In addition to laboratory methods, research with online respondents allows information to be collected from a larger variety and number of respondents. By using existing online panels of volunteers, BSRC staff can efficiently recruit and screen for respondents with the characteristics of interest while imposing less burden. By dividing traditional laboratory studies, which commonly run 45 to 60 minutes, into smaller tasks, online testing allows similar information to be collected while burdening individual respondents far less. Finally, online testing allows experimentation with survey features such as question wording or format in a way that is simply not possible in the laboratory, given the resources required to obtain the sample sizes necessary to detect statistical differences.


Respondent burden in this collection will be held to a minimum. The proposed approach to research of data collection methods is designed to obtain the maximum amount of information for the minimum respondent burden. The research includes such methods as:


  1. cognitive interviews, focus groups, or usability tests,


  2. interview pacing and latency classification,


  3. degree of structure within the interview format, group dynamics observation, and recording of response, decision, and reporting behaviors,


  4. structured evaluation tasks such as card sorts and vignettes,


  5. experiments involving the administration of forms to study respondents, and


  6. usability tests of existing or proposed data collection and data dissemination systems (including the public BLS website).


4. Efforts to Avoid Duplication


This research does not duplicate any other research effort being done within the BLS. This research will provide critical, groundbreaking, and important supplemental information beyond that currently available in the field of survey methodology as it applies to BLS surveys.


This research also does not duplicate any outside-of-government research effort, as its purpose is not to replicate survey research studies. The staff of BSRC is cognizant of current research being done in the field of cognitive psychology through attendance at conferences, research reported in professional journals, and through in-house staff meetings and peer review processes. There is no current, similar, existing data that can be used or modified for the purposes of improving the overall data collection process.


5. Collection of Information Involving Small Establishments

BSRC data collection efforts focus primarily on information gained through laboratory interviews, telephone interviews, and self-administered questionnaires with individuals recruited from the general public. However, in some instances, organizational goals necessitate the involvement of businesses, state agencies, or other entities. To the extent these establishments are included in a research project, they normally are surveyed only once.


6. The Consequences of Less Frequent Data Collection


The planned collection of data will allow BSRC to suggest modifications and alterations to survey research in an ongoing manner. Because this collection is expected to be an ongoing effort, it has the potential to have immediate impact on all survey collection methods within the Bureau's jurisdiction. Its delay would sacrifice potential gain in survey modification within the Bureau as a whole.


7. Special Circumstances


There are no special circumstances.


8. Federal Register and Consultation Outside BLS


Federal Register: No comments were received as a result of the Federal Register notice published in 82 FR 35826 on August 1, 2017.


Outside Consultation: Consultation with individuals outside BLS to obtain views on the availability of data, frequency of collection, suitability of particular laboratory methods, clarity of instructions and recordkeeping, disclosure, reporting format, and the data elements to be recorded and/or disclosed is frequent and ongoing with the National Center for Health Statistics, the Bureau of the Census, the University of Maryland, the University of Michigan, and other federal agencies and institutions of higher learning. Consultants come from a wide range of subject areas and expertise. A list of individuals consulted in the past is attached to this document (Attachment II).


The individual responsible for the BSRC research efforts is:


Dr. Jennifer Edgar

Director of Behavioral Science Research

Office of Survey Methods Research

Bureau of Labor Statistics

PSB Room 1950

2 Massachusetts Ave., NE

Washington, DC 20212
(202) 691-7528


9. Payment to Respondents


Respondents for activities conducted in the laboratory (that is, cognitive interviews and focus groups) under this clearance will receive a small stipend.  This practice has proven necessary and effective in recruiting subjects to participate in this small-scale research, and is also employed by the other Federal cognitive laboratories.  The incentive for participation in an in-person cognitive interview is $40, and for participation in an in-person focus group is $75.  BLS may provide smaller incentives than these amounts at its discretion; however, any requests for larger amounts must be justified in writing to OMB.  

 

Respondents for methods that are generally administered as part of field test activities (that is, split sample tests, behavior coding of interviewer/respondent interaction, and respondent debriefing) or other research projects where BLS lab staff travel to and test in respondents’ residences will not receive payment unless there are extenuating circumstances that warrant it.  Such circumstances and proposed incentives must be justified in writing to OMB.  


10. Confidentiality and Privacy Concerns


The data collected from respondents will be tabulated and analyzed only for the purpose of evaluating the research in question. Laboratory respondents will be asked to read and sign a consent form, which explains the voluntary nature of the studies, the use of the information, and that the interview may be taped or observed, as well as a Privacy Act Statement (Attachment III). The Privacy Act Statement given to respondents is as follows:


In accordance with the Privacy Act of 1974 (DOL/BLS – 14 BLS Behavioral Science Research Laboratory Project Files (81 FR 47418)), as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data.

Surveys with current OMB approval that are involved in BSRC studies and are collected outside the laboratory, such as mail or CATI surveys, use the pledge of the existing approved collection or the Privacy Act Statement.


The Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA) safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality for exclusively statistical purposes by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of the BLS.


BLS policy on the confidential nature of respondent identifiable information (RII) states that “RII acquired or maintained by the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that ensures the information will be used only for statistical purposes and will be accessible only to authorized individuals with a need-to-know.”


11. Sensitive Questions


Most of the questions that are included on BLS questionnaires are not of a sensitive nature. However, it is possible that some potentially sensitive questions may be included in questionnaires that are tested under this clearance. One of the purposes of this testing is to identify such questions, determine sources of sensitivity and alleviate them insofar as possible before the actual survey is administered.




12. Estimated Respondent Burden


The FY2018, FY2019, and FY2020 estimated respondent burdens are as follows:


Individuals and Households

                Respondents   Frequency   Responses   Minutes per   Total Response
                                                      Response      Burden (Hours)*
FY2018                6,000        once       6,000            20            2,000
FY2019                6,000        once       6,000            20            2,000
FY2020                6,000        once       6,000            20            2,000
Total FY18-20        18,000                  18,000                          6,000

*Burden estimates include recruiting, screening, online studies, and interviews.



Private Sector

                Respondents   Frequency   Responses   Minutes per   Total Response
                                                      Response      Burden (Hours)
FY2018                  100        once         100            60              100
FY2019                  100        once         100            60              100
FY2020                  100        once         100            60              100
Total FY18-20           300                     300                            300




                Individuals and   Private Sector   Total Response
                Households                         Burden (Hours)
FY2018                    2,000              100            2,100
FY2019                    2,000              100            2,100
FY2020                    2,000              100            2,100
Total FY18-20             6,000              300            6,300





The burden hours are estimated based on the anticipation that the research will require approximately one hour per respondent for in-person interviews. Online studies range from 5 to 20 minutes on average. Each study will differ substantively from the others. The projects are expected to be complex, at times involving several cognitive testing methods to test the hypotheses of the given research question.


In addition to the burden hours required for data collection, the Office of Management and Budget has instructed that time spent recruiting and screening respondents for studies be included in estimates of burden. We estimate that screening takes approximately 10 minutes per household respondent coming into the cognitive laboratory, and 2 minutes per online respondent. Business respondents are often sampled from an existing BLS frame, so screening is not necessary.
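As a cross-check, the annual totals in the burden tables above follow directly from the per-category figures. A minimal sketch of the arithmetic (responses and minutes per response taken from the tables in this section; the minutes-per-response figures already include recruiting and screening time):

```python
# Per-category annual figures from the Section 12 burden tables:
# (responses per fiscal year, minutes per response).
categories = {
    "Individuals and Households": (6000, 20),
    "Private Sector": (100, 60),
}

# Burden hours per category per fiscal year: responses * minutes / 60.
annual_hours = {name: n * mins / 60 for name, (n, mins) in categories.items()}

total_annual = sum(annual_hours.values())  # hours per fiscal year
total_fy18_20 = 3 * total_annual           # FY2018 through FY2020

print(annual_hours)    # {'Individuals and Households': 2000.0, 'Private Sector': 100.0}
print(total_annual)    # 2100.0
print(total_fy18_20)   # 6300.0
```

These totals match the 2,100 annual hours and 6,300 three-year hours shown in the summary table.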




Coverage of the Estimates


The estimates cover the time that each respondent will spend answering questions, including the debriefing about the cognitive testing procedures used. The time required to travel to the laboratory, if needed, is not covered, since distances and modes of transportation are unknown. No retrieval of information by respondents is anticipated, although it is possible that validation of data at some point may require respondents to keep and check records. In that case, burden hour requests will include the estimated time required to gather records.


Basis for the Estimate


These estimates are based on the BSRC’s previous experience in conducting such research under the existing OMB Clearance 1220-0141, and on expectations concerning the research projects to be conducted in the next 3 years. BSRC staff and its laboratory facilities (especially the usability lab) have been increasingly utilized by both program offices and outside agencies, and it is anticipated that this trend will continue. The estimates are also based on the experience of other government agencies (such as the National Center for Health Statistics' study of the Cognitive Aspects of Survey Methods, 1987) which have conducted cognitively oriented questionnaire design research.

Annualized Cost to Respondents


The estimated dollar cost for these individuals is based on average hourly earnings of employed wage and salaried workers, $26.22 per hour, taken from the May 2017 Current Employment Statistics Program data. Using the $26.22 per hour figure, the annualized cost to the respondents is $55,062 for FY2018 – FY2020 (based on an average of 2,100 burden hours annually).
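The annualized cost figure can be reproduced from the two quantities cited above; a minimal sketch of the calculation:

```python
# Inputs cited in the paragraph above: May 2017 CES average hourly
# earnings, and average annual burden hours across both respondent types.
hourly_earnings = 26.22       # dollars per hour
annual_burden_hours = 2100    # 2,000 household + 100 private-sector hours

annualized_cost = hourly_earnings * annual_burden_hours
print(f"${annualized_cost:,.0f}")  # $55,062
```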

13. Total Annual Cost Burden


a) There will be no total capital and start-up cost component for respondents or record keepers resulting from the collection of information.


b) The respondents and record keepers will have no expenses for operation and maintenance or purchase of services resulting from the collection of information.


14. Cost to the Federal Government


The maximum cost to the Federal Government is $26,000 annually for FY2018, FY2019, and FY2020. Those costs are comprised entirely of the reimbursements paid to respondents and newspaper advertisement costs. Other costs, such as operational expenses (e.g., equipment, overhead, printing, fees for use of online panels, and support staff), are in place as a function of the laboratory proper and would be incurred regardless of this research.


15. Changes in Burden


This is a request for an extension of the existing OMB Clearance 1220-0141 in order to continue the research mission for another 3 years. The expected burden for the next 3 years is 6,300 hours, with an annual average of 2,100 hours. This is a decrease of 966 hours from the previously approved package.


16. Tabulation, Analysis and Publication Plans, and Project Schedule


This clearance request covers survey methodology and questionnaire quality assurance, including the exploratory activities leading to the evaluation of data quality. Both quantitative and qualitative analyses are planned for the evaluation of these activities, depending on the circumstances.

 



The results of these investigations will be used primarily to improve the quality of data collection and assure total collection quality as it relates to data management.  Because BLS is using the latest techniques and cognitive psychological testing methodology, methodological papers may be written that include some tallies of response problems, recall strategies, or results from other testing procedures used, etc.  The methodological results may be included as a methodological appendix or footnote in a report containing data from a larger data collection effort.  The methodological results of this research may be prepared for presentation at professional meetings or publication in professional journals. While these methodological publications may include substantive findings as part of the results, the substantive findings will not be published on their own.


Project Schedule


This project schedule calls for laboratory interviews to commence once OMB approval is received.


The proposed time schedule is continuously ongoing in nature, with research publication dates dependent on the data collection specified in each researcher's proposal and its subsequent results.


17. Expiration for OMB Approval


The BSRC is not seeking approval to avoid displaying the expiration date for OMB approval of the information collection.


18. Exception to Certification Statement


There are no exceptions to the certification statement “Certification for Paperwork Reduction Act Submissions.”

ATTACHMENT I


BEHAVIORAL SCIENCE RESEARCH LAB RECENT ARTICLES AND REPORTS BASED ON OMB APPROVED STUDIES


Crafts, J., Tesler, R., Blair, J., Behm, J., Stein, K., Leonard, M., & Folz, J. (2015). Testing Global Questions for the Consumer Expenditure Gemini Recall Interview. Internal Report

Edgar, J. (2016). Cognitive Testing of Revised CIPSEA Pledge.  Internal BLS Report.

Edgar, J. (2016). Respondent Perceptions:  What happens after legislation and technology collide, collaborate and compromise.  2016 FCSM Policy Conference.

Edgar, J., Esposito, J., Kopp, B., Mockovak, W., and Yu, E.  (2016)  Identifying Factoryless Goods Producers in the U.S. Statistical System.  Presented at the 5th International Conference on Establishment Surveys, Geneva, Switzerland.

Edgar, J., Kaplan, R., Earp, M. (2016) Online Testing of Revised CIPSEA Pledge.  Internal BLS Report.

Edgar, J., Kaplan, R. (2017). Does It Matter?  Impact of Confidentiality Pledges on Web Survey Response.  2017 American Association of Public Opinion Research Conference.

Edgar, J., Kopp, B., Mockovak, W., Yu, E. (2014). Factoryless Goods Producers Scoping Interviews Report. Internal BLS report.

Edgar, J., Kopp, B., Mockovak, W., Yu, E. (2014). Factoryless Goods Producers Scoping Interviews Report. Internal BLS presentation.

Edgar, J., Mockovak, W., Kopp, B. (2014). Results from CPS Certification Cognitive Testing.  Internal BLS Memo.

Edgar, J., Murphy, J., Keating, M. (2014).  Crowdsourcing in the cognitive interview process. Presentation at the 2014 American Association for Public Opinion Research Conference.

Edgar, J. (2017). Cognitive Testing Results: CPS UI Non-filers.  Internal BLS report.

Edgar, J., Ridolfo, H. (2015). Repeating After You: Dependent Interviewing in Establishment Surveys.  Presentation at the 2015 Joint Statistical Meetings.

Edgar, J., Ridolfo, H. (2015). Let Me Tell You What You Told Me: Dependent Interviewing in Establishment Surveys.  Presentation at the 2015 American Association for Public Opinion Research Conference.

Edgar, J., Samuel, S. (2017). Factoryless Goods Producers Sensitivity Results Memo. Internal BLS Report.

Westat (2016). Factoryless Goods Producers: Report on In-depth Interviews with Establishments. Internal BLS Report.

Edgar, J., Shkodriani, G., Koenig, T., (2015). Revising the Reference Period in the JOLTS survey.  Presentation at the 2015 Business Data Collection Methods Workshop.

Eggleston, C., Olmsted Hawala, E., Edgar, J. (2017) Do They Read It? Using Paradata to Evaluate the Extent to Which Respondents Attend to Confidentiality Pledge Language. 2017 American Association of Public Opinion Research Conference.

Jones, C., Martinelli, C., Edgar, J. (2016). Collecting Previously Reported Data: Testing Telephone Interviewing Techniques in the Occupational Employment Statistics Survey.  Presentation at the Questionnaire Design, Evaluation and Testing 2 conference.

Kaplan, R., & Edgar, J. (2017). Confidentiality Concerns, Do They Matter More than Confidentiality Pledges? Presented at the American Association of Public Opinion Research, New Orleans, LA.

Kaplan, R., Kopp, B., Phipps, P. (2015). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Presented at the American Association of Public Opinion Research, Hollywood, FL.

Kaplan, R., Kopp, B., Phipps, P. (2015). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Presented at the International Association of Time Use Researchers Conference, Ankara, Turkey.

Kaplan, R., Kopp, B., & Phipps, P. (2017). Using wearable devices to assess the validity of diary and stylized sleep measures. Presented at the European Survey Research Association, Lisbon, Portugal.

Kaplan, R., Kopp, B., & Phipps, P. (in progress). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Chapter accepted to the Questionnaire Design, Evaluation, and Testing Wiley Volume.

Kaplan, R., & Phipps, P. (2016). Results from the Office of Survey Methods Research’s Pretesting of the SOII Respondent Re-contact Survey. Internal report.

Kaplan, R., & Phipps, P. (2015) Designing and Testing Tools to Assess Measurement Error. 2015 Workshop on Business Data Collections Methodology, Washington, DC.

Kaplan, R., & Yu, E. (2016). What would you ask? Exploring why interviewers select different techniques to reduce question sensitivity. Presented at the American Association of Public Opinion Research, Austin, TX.

Kopp, B., Kaplan, R., & Phipps, P. (2016). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Presented at the American Association of Public Opinion Research, Austin, TX.

Kaplan, R., Kopp, B., & Phipps, P. (2016). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Invited presentation at the Questionnaire Design, Evaluation, and Testing Conference, Miami, FL.

Kaplan, R., & Yu, E. (2015). Measuring Question Sensitivity. Internal report.

Kaplan, R., & Yu, E. (2015). Measuring Sensitivity. Presented at the American Association of Public Opinion Research Conference 2015, Hollywood, FL.

Kopp, B., & Edgar, J. (2016). Current Population Survey Program Electronic Mediation of Contingent Work Question Cognitive Testing. Internal BLS report.

Kopp, B., & Yu, E. (2016). Final Cognitive Testing Report for the CEQ 2017 Changes. Internal Report.

Kopp, B. (2016). Usability Test Results from the Testing of the CE Paper Diary. Internal Report.

Martinelli, C., & Jones, C. (2015). Field Testing the Collection of New Data Elements in the Occupational Employment Statistics Survey. Presented at the 2015 Joint Statistical Meetings.

Mockovak, W. (2014). Evaluation of the Revised BLS Marketing Brochure. Internal BLS report.

Mockovak, W., & Kaplan, R. (2016). Comparing Face-to-Face Cognitive Interviewing with Unmoderated, Online Cognitive Interviewing with Embedded and Follow-Up Probing. Presented at the Questionnaire Design, Evaluation, and Testing Conference, Miami, FL.

Mockovak, W., & Kaplan, R. (2015). Comparing Results from Telephone Reinterview with Unmoderated, Online Cognitive Interviewing. Internal report.

Mockovak, W., & Kaplan, R. (2015). Summary of Cognitive Interviewing Testing for the 2017 ATUS Annual Leave Module. Internal report.

Phipps, P., Kaplan, R., & Kopp, B. (2017). Exploring Interviewer and Respondent Interactions Surrounding Sleep Questions in the American Time Use Survey. Presented at the American Association of Public Opinion Research, New Orleans, LA.

Redline, C., Bournazian, J., Edgar, J., & Ridolfo, H. (2017). Do Establishments Understand It? Cognitive Interviewing Assessment of Confidentiality Pledges for Establishment Surveys. Presented at the 2017 American Association of Public Opinion Research Conference.

Scherer, A., & Edgar, J. (2016). Confidentiality Pledge Changes: TryMyUI Testing. Internal BLS report.

Swallow, A., Kaplan, R., & Edgar, J. (2017). Exploring Respondents' Perceptions of Data Confidentiality and Enhanced Cybersecurity. Presented at the 2017 FedCASIC Conference.

Westat. (2015). Phase 1 Report of Mail Respondent Debriefing Interviews. Internal BLS report.

Westat. (2015). Phase 2 Report of Mail Respondent Debriefing Interviews. Internal BLS report.

Westat. (2015). Job Openings and Labor Turnover Survey (JOLTS) Reference Period Study, Final Report. Internal BLS report.

Westat. (2017). Factoryless Goods Producers: Enterprise vs. Establishment – Task Order #27. Internal BLS report.

Westat. (2016). Factoryless Goods Producers: Report on In-depth Interviews with Industry Associations. Internal BLS report.

Yan, T., Warren, A., Sun, H., & Muller, M. (2017). Testing Global Questions for the Consumer Expenditure Gemini Recall Interview. Internal Report.

Yu, E., (2017). Results from pre-testing the proposed questions for collecting outlet information in the Q183 Consumer Expenditure Quarterly Interview Survey. Internal Report.

Yu, E., & Kopp, B. (2015). Testing Outlet Questions in the CE Diary - Final Report. Internal Report.

Yu, E., & Kopp, B. (2015). Testing Outlet Questions for Recall – Final Report. Internal Report.

Yu, E. (2016). Electronic records online study report. Internal Report.

Yu, E. (2016). Testing instructions for organizing electronic records for the redesigned CE interview – Final report. Internal Report.

Yu, E. (2016). Testing New Interview Protocols: Lessons Learned about Interviewers, Respondents, and Survey Content. Presented at the Annual Conference of the Joint Statistical Meetings, Chicago, IL.         

Yu, E., Martinez, W., Kopp, B., & Fricker, S. (2016). Using Text Analysis to Find the Meaning of Respondent Burden. Presented at the Annual Conference of the American Association for Public Opinion Research, Austin, TX.    

Yu, E., Fricker, S., & Kopp, B. (2015). Can Survey Instructions Relieve Respondent Burden? Presented at the Annual Conference of the American Association for Public Opinion Research, Hollywood, FL.

















ATTACHMENT III


Consent Form

OMB Control Number: 1220-0141

Expiration Date: month xx, 2020


The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.


The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.


During this research you may be audio- and/or video-recorded, or you may be observed. If you do not wish to be recorded, you may still participate in this research.


We estimate it will take you an average of xx minutes to participate in this research.


Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.


Persons are not required to respond to this collection of information unless it displays a currently valid OMB control number. The OMB control number for this study is 1220-0141, which expires month xx, 2020.


------------------------------------------------------------------------------------------------------------

I have read and understand the statements above. I consent to participate in this study.



___________________________________ ___________________________

Participant's signature Date



___________________________________

Participant's printed name



___________________________________

Researcher's signature




PRIVACY ACT STATEMENT

In accordance with the Privacy Act of 1974 (DOL/BLS – 14 BLS Behavioral Science Research Laboratory Project Files (81 FR 47418)), as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data.




1 Internal reports available upon request.


