BUREAU OF LABOR STATISTICS
OMB CLEARANCE PACKAGE
for
CLEARANCE TO CONDUCT COGNITIVE AND PSYCHOLOGICAL RESEARCH
IN FY2015 THROUGH FY2017
Prepared by
BEHAVIORAL SCIENCE RESEARCH CENTER
OFFICE OF SURVEY METHODS RESEARCH
BUREAU OF LABOR STATISTICS
2014
Request for Extension of Clearance to Conduct
Cognitive and Psychological Research
Abstract
This is a request for clearance by the Bureau of Labor Statistics' (BLS) Behavioral Science Research Center (BSRC) to conduct research to improve the quality of data collection by examining the psychological and cognitive aspects of data collection methods and procedures. BLS staff, employing state-of-the-art cognitive psychological testing methods, will conduct these research and development activities. The use of cognitive techniques to improve the quality of data collection has been advocated by the Cognitive Aspects of Survey Methodology seminar sponsored by the National Academy of Sciences and by participants in a questionnaire design advisory conference. The planned research and development activities will be conducted during FY2015 through FY2017 with the goal of improving overall data quality through improved procedures.
Supporting Statement
A. Justification
1. Collection of Information
The Bureau of Labor Statistics' Behavioral Science Research Center (BSRC) conducts psychological research focusing on the design and execution of the data collection process in order to improve the quality of data collected by the Bureau. The BSRC conducts research aimed at improving data collection quality by assessing questionnaire/form management and administration, as well as issues related to interviewer training and interviewer-respondent interaction during the interview process. BSRC staff work closely with economists and/or program specialists responsible for defining the concepts to be measured by the Bureau of Labor Statistics' collection programs.
Both questionnaires and forms are used in the Bureau's surveys. Questionnaires specify the preferred wording of the questions to be asked, whereas forms specify the data items to be collected. Each presents distinctive problems that in many cases relate to respondent characteristics, survey content, or format of administration. Such problems impede the effectiveness of surveys and the mission of the Bureau in general.
The purpose of this request for clearance for cognitive psychological research and development activities by the BSRC is to enhance the quality of the Bureau's data collection procedures and overall data management. The basic goal of the BSRC is to improve, through interdisciplinary research, the quality of the data collected and published by the BLS. BLS is committed to producing the most accurate and complete data within the highest quality assurance guidelines. It is with this mission in mind that the BSRC was created to aid in not only maintaining but also improving the quality of the data collection process.
This laboratory was established in 1988 by Commissioner Janet Norwood to employ behavioral science to investigate all forms of oral and written communication used in the collection and processing of survey data. This exploration includes all aspects of data collection, such as mode, manuals, and interviewer training. BSRC provides state-of-the-art services to many programs within BLS, the Department of Labor (DOL), and other agencies as requested, including questionnaire redesign, survey updates, and improvements in the overall quality of data collection management. These efforts, in turn, increase data quality and reduce respondent burden. The techniques proposed here have been successfully applied to many BLS surveys.
The research techniques and methods to be used in these studies will include analyses of questionnaire construction, the interview process, and survey technology. Within the structure of the questionnaire, analyses will be conducted in the following domains:
Question Analysis - Evaluation of individual questionnaires appraising question intent, assessment of semantic clarity, and an examination of relationships between questions.
Term Analysis - Evaluation of specific wording and phrases in terms of their psycholinguistic properties and an assessment of respondent interpretation of the meaning of these terms, at both the conscious and unconscious levels.
Instruction Analysis - Inspection of instructions for their semantic clarity, the degree to which they reflect the stated intention of investigators, ease of interpretation, and other characteristics that help elicit unambiguous and appropriate answers or behaviors from respondents or interviewers.
Format Analysis - Review of questionnaires or subsets of questions for perceptual characteristics in order to facilitate better respondent comprehension and to promote more focused attention on the questionnaire or form.
Within the interview process, several analyses are conducted to assess nonverbal communication, interpersonal dynamics, and symbolic interaction (the use of cultural symbols to make social statements). Staff conducts research to evaluate the overall effectiveness of data collection procedural characteristics, including:
Interviewer Characteristics and Behavior Analysis - Study of appearance, manner, relation to the subject population, etc., in order to enhance the interpersonal skills of interviewers and to develop and improve procedures for the training of interviewers.
Respondent Characteristics and Behavior Analysis - Assessment of the social, cultural, and ethnic characteristics of the respondent and how that may bear upon interactions with the interviewer. Staff members also observe the behavior of respondents for cues concerning their reactions to the interview process. Because BLS constantly collects data from different populations that change over time, the analysis of respondent characteristics needs frequent updating.
Mode Characteristics - Examination of the unique properties of interviewer and/or respondent behavior as a function of the media used to collect data; for example, self-administered interviews, personal interviews, telephone interviews, and interviews utilizing assistive technologies (e.g., CAPI, CASI, and CATI).
Usability Analysis - Evaluation of the effectiveness, efficiency, and satisfaction with which respondents complete tasks assigned to them, especially when using self-guided instruments (PAPI or CASI).
Data Collection Methodology Analysis – Assessment of alternative formats for collecting survey data (e.g., respondent-provided records, administrative records). Staff will evaluate the validity and reliability of data collected through the alternative methodology as well as the level of respondent burden relative to current procedures.
BLS also uses a variety of methodologies, such as usability analysis, debriefings, and in-depth interviews, to better understand how to communicate more effectively with its stakeholder and user communities through its website and other materials.
2. The Purpose of Data Collection
The purpose of BSRC's data collection is to improve Federal data collection processes through scientific research. Theories and methods of cognitive science provide essential tools for the development of effective questionnaires. For example, they can provide an understanding of how respondents comprehend survey questions, recall relevant facts from memory, make judgments, and respond. On the basis of such knowledge, questions can be tailored to increase the accuracy and validity of the collected information and to reduce respondent burden. Similar improvements can be made with respect to other aspects of the data collection process.
BSRC’s research contributes to BLS and to the entire survey research field. Research results are shared with the Bureau through seminars, training sessions, reports, publications, and presentations at professional conferences. The BSRC staff has instituted a method of peer review to encourage high standards of social science research practice. A list of BSRC staff publications and internal reports¹ covering the last five years can be found in Attachment I.
The BSRC’s research is expected to 1) improve the data collection instruments employed by the Bureau, 2) increase the accuracy of the economic data produced by BLS and on which economic policy decisions are based, 3) increase the ease of administering survey instruments for both respondents and interviewers, 4) increase response rates in panel surveys as a result of reduced respondent burden, 5) increase the ease of use of the BLS website and other BLS products, and 6) enhance BLS’s reputation, resulting in greater confidence in, and respect for, the survey instruments used by BLS.
The application of cognitive and psychological theories and methods to survey data collection is widespread and well established. The consequences of failing to scientifically investigate the data collection process are a lag in the use of accepted practices, the application of archaic survey development techniques based on intuition and trial and error, and ultimately a severe cost in data quality and in burden to respondents, interviewers, and data users alike. For example, without knowledge of what respondents can be expected to remember about the past and how to ask questions that effectively aid the retrieval of the appropriate information, survey researchers cannot ensure that respondents will not take shortcuts to avoid careful thought in answering the questions, or that they will not be subject to undue burden. Likewise, without investigation of the interviewers’ roles and abilities in the data collection process, survey researchers cannot ensure that interviewers will read their questions correctly with ease and fluency, or record the respondent’s answers correctly.
3. Use of Improved Technology
Staff members will design, conduct, and interpret field and laboratory research that contributes new knowledge of the cognitive aspects of human behavior in relation to questionnaire design and survey methodology. Cognitive psychological research methods in use include such techniques as probing questions, memory cueing, group discussion, and intensive interviewing. Depending on research goals, these methods may be used separately or in combination with one another.
The use of the laboratory approach has a number of advantages associated with it. These advantages include rapid and in-depth testing of questionnaire items, a more detailed understanding of the respondents’ comprehension of concepts, and access to special populations who can be quickly recruited and tested. Different laboratory methods will be used in different studies depending on the aspects of the data collection process being studied. Computer technology will be used when appropriate to aid the respondents and interviewers and minimize burden.
Respondent burden in this collection will be held to a minimum. The proposed approach to research of data collection methods is designed to obtain the maximum amount of information for the minimum respondent burden. The research includes such methods as:
cognitive interviews,
interview pacing and latency classification,
degree of structure within the interview format,
group dynamics observation and recording of decision behaviors and/or negotiation processes,
structured tasks: card sorts and vignettes,
expert analyses,
experiments involving the administration of forms to study respondents, and
usability tests of existing or proposed data collection and data dissemination systems (including the public BLS website).
4. Efforts to Avoid Duplication
This research does not duplicate any other research effort being done within the BLS. It will provide critical supplemental information beyond that currently available in the field of survey methodology as it applies to BLS surveys.
This research also does not duplicate any outside-of-government research effort, as its purpose is not to replicate survey research studies. The staff of BSRC is cognizant of current research being done in the field of cognitive psychology through attendance at conferences, research reported in professional journals, and in-house staff meetings and peer review processes. No current, similar data exist that can be used or modified for the purposes of improving the overall data collection process.
5. Collection of Information Involving Small Establishments
BSRC data collection efforts focus primarily on information gained through laboratory interviews, telephone interviews, and self-administered questionnaires with individuals recruited from the general public. However, in some instances, organizational goals necessitate the involvement of businesses, state agencies, or other entities. To the extent these establishments are included, they normally are surveyed only once.
6. The Consequences of Less Frequent Data Collection
The planned collection of data will allow BSRC to suggest modifications and alterations to survey research on an ongoing basis. Because this collection is expected to be an ongoing effort, it has the potential to have an immediate impact on all survey collection methods within the Bureau's jurisdiction. Delaying it would sacrifice potential gains in survey improvement within the Bureau as a whole.
7. Special Circumstances
There are no special circumstances.
8. Federal Register and Consultation Outside BLS
Federal Register: No comments were received as a result of the Federal Register notice published at 79 FR 51614 on August 29, 2014.
Outside Consultation: Consultations with individuals outside BLS to obtain views on the availability of data, frequency of collection, suitability of particular laboratory methods, clarity of instruction and record keeping, disclosure, or reporting format, and on the data elements to be recorded and/or disclosed are frequent and ongoing with the National Center for Health Statistics, the Bureau of the Census, the University of Maryland, the University of Michigan, and other federal agencies and institutions of higher learning. Consultants come from a wide range of subject areas and expertise. A list of individuals consulted in the past is attached to this document (Attachment II).
The individual responsible for the BSRC research efforts is:
Dr. Jennifer Edgar
Director of Behavioral Science Research
Office of Survey Methods Research
Bureau of Labor Statistics
PSB Room 1950
Washington, DC 20212
(202) 691-7528
9. Payment to Respondents
Respondents for activities conducted in the laboratory (that is, cognitive interviews and focus groups) under this clearance will receive a small stipend. This practice has proven necessary and effective in recruiting subjects to participate in this small-scale research, and it is also employed by the other Federal cognitive laboratories. The incentive for participation in an in-person cognitive interview is $40; for participation in an in-person focus group, it is $50-$75. BLS may provide smaller incentives than these amounts at its discretion; however, any requests for larger amounts must be justified in writing to OMB.
Respondents for methods that are generally administered as part of field test activities (that is, split sample tests, behavior coding of interviewer/respondent interaction, and respondent debriefing) or other research projects where BLS lab staff travel to and test in respondents’ residences will not receive payment unless there are extenuating circumstances that warrant it. Such circumstances and proposed incentives must be justified in writing to OMB.
10. Confidentiality and Privacy Concerns
The data collected from respondents will be tabulated and analyzed only for the purpose of evaluating the research in question. Laboratory respondents will be asked to read and sign a consent form that explains the voluntary nature of the studies and the use of the information, notes that the interview may be taped or observed, and includes a Privacy Act Statement (Attachment III). The Privacy Act Statement given to respondents is as follows:
In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.
Surveys with current OMB approval that are involved in BSRC studies and are collected outside the laboratory (such as mail or CATI surveys) use the confidentiality pledge of the existing approved collection or the Privacy Act Statement.
The Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA) safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality for exclusively statistical purposes by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of the BLS.
BLS policy on the confidential nature of respondent identifiable information (RII) states that “RII acquired or maintained by the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that ensures the information will be used only for statistical purposes and will be accessible only to authorized individuals with a need-to-know.”
11. Sensitive Questions
There are no questions of a sensitive nature.
12. Estimated Respondent Burden
The current burden inventory for FY2011 to FY2014 is 3,600 hours. The FY2015, FY2016, and FY2017 estimated respondent burdens are as follows:
Fiscal Year      Individuals and Households    Private Sector    Recruiting and Screening    Total Response Burden (Hours)
FY2015           1,000²                        1,865³            166                         3,031
FY2016           900                           1,175⁴            150                         2,225
FY2017           900                           300               150                         1,350
Total FY15-17    2,800                         3,340             466                         6,606
The burden hours are estimated based on the anticipation that the research will require approximately one hour per respondent. The studies will differ substantively from one another. The projects are expected to be complex, at times involving several cognitive testing methods to test the hypotheses of a given research question.
In addition to the burden hours required for data collection, the Office of Management and Budget has instructed that time spent recruiting and screening participants for studies be included in estimates of burden. The recruiting and screening column in the table above reflects that requirement. Specifically, we estimate that screening takes approximately 10 minutes per household participant. Private sector participants are often sampled from an existing BLS frame, so screening is not necessary.
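These screening estimates follow directly from the household participant counts and the 10-minute-per-participant figure:
FY2015: 1,000 participants × 10 minutes ≈ 166 hours (rounded down)
FY2016: 900 participants × 10 minutes = 150 hours
FY2017: 900 participants × 10 minutes = 150 hours
Total recruiting and screening: 166 + 150 + 150 = 466 hours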
Coverage of the Estimates
The estimates cover the time that each respondent will spend answering questions, including debriefing concerning the cognitive testing procedures used. The time required to travel to the laboratory, if needed, is not covered, since distances and modes of transportation are unknown. No retrieval of information by respondents is anticipated, although it is possible that validation of data at some point may require respondents to keep and check records. In this case, burden hour requests will include the estimated time required to gather records.
Basis for the Estimate
These estimates are based on the BSRC’s previous experience in conducting such research under the existing OMB Clearance 1220-0141, and on expectations concerning the research projects to be conducted in the next 3 years. BSRC staff and its laboratory facilities (especially the usability lab) have been increasingly utilized by both program offices and outside agencies, and it is anticipated that this trend will continue. The estimates are also based on the experience of other government agencies (such as the National Center for Health Statistics' study of the Cognitive Aspects of Survey Methods, 1987) which have conducted cognitively oriented questionnaire design research.
The estimated dollar cost for these individuals is based on average hourly earnings of employed wage and salaried workers, $24.45 per hour, taken from July 2014 Current Employment Statistics Program data. Using the $24.45 per hour figure, the annualized cost to the respondents is $53,839 for FY2015 – FY2017 (based on an average of 2,202 burden hours annually).
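In detail, these figures derive from the burden table in item 12:
Average annual burden: 6,606 total hours ÷ 3 years = 2,202 hours
Annualized respondent cost: 2,202 hours × $24.45 per hour = $53,838.90, or approximately $53,839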
13. Total Annual Cost Burden
a) There will be no total capital and start-up cost component for respondents or record keepers resulting from the collection of information.
b) The respondents and record keepers will have no expenses for operation and maintenance or purchase of services resulting from the collection of information.
14. Cost to the Federal Government
The maximum cost to the Federal Government is $42,000 annually for FY2015, FY2016, and FY2017. These costs consist entirely of the reimbursements paid to respondents and newspaper advertisement costs. Other costs, such as operational expenses (e.g., equipment, overhead, printing, and support staff), are part of the laboratory's ongoing operations and would be incurred regardless of this research.
15. Changes in Burden
This is a request for an extension of the existing OMB Clearance 1220-0141 in order to continue the research mission for another 3 years. In addition to extending the 3,600 burden hours requested during the last clearance package, the BSRC at BLS is requesting an additional 2,540 hours during FY2015 and FY2016 to cover three one-time projects: 1) a study that will examine the feasibility of using respondent-provided financial records in the context of a redesigned Consumer Expenditure Survey (CE), totaling 100 hours; 2) a feasibility study to explore the possibility of collecting Job Openings and Labor Turnover Survey data on an accelerated schedule to align with the Current Employment Statistics survey, totaling an additional 1,565 hours; and 3) a Factoryless Goods Production (FGP) test, for an additional 875 hours. Each of these tests will be conducted by contractors and supervised by BSRC staff. BLS is also requesting an additional 10 minutes per household participant, a total of 466 burden hours, to cover time spent recruiting and screening participants for various studies.
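The total request reconciles with the burden table in item 12 as follows:
Extension of the existing inventory: 3,600 hours
One-time projects: 100 + 1,565 + 875 = 2,540 hours
Recruiting and screening: 466 hours
Total requested: 3,600 + 2,540 + 466 = 6,606 hours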
16. Tabulation, Analysis and Publication Plans, and Project Schedule
This clearance request is for survey methodological and questionnaire quality assurance, which includes the exploratory activities leading to the evaluation of data quality. Both quantitative and qualitative analyses are planned for the evaluation of these activities, depending on the circumstances.
The results of these investigations will be used primarily to improve the quality of data collection and assure total collection quality as it relates to data management. Because BLS is using the latest techniques and cognitive psychological testing methodology, methodological papers may be written that include tallies of response problems, recall strategies, or results from other testing procedures. However, BLS will not publish any reports of the substantive results collected under this clearance. The methodological results may be included as a methodological appendix or footnote in a report containing data from a larger data collection effort. The methodological results of this research may be prepared for presentation at professional meetings or publication in professional journals.
The project schedule calls for laboratory interviews to commence once OMB approval is received. The schedule is ongoing in nature, with research publication dates dependent on the data collection defined in each researcher's proposal and on subsequent results.
17. Expiration for OMB Approval
The BSRC is not seeking approval to avoid displaying the expiration date for OMB approval of the information collection.
18. Exception to Certification Statement
There are no exceptions to the certification statement “Certification for Paperwork Reduction Act Submissions.”
ATTACHMENT I
BEHAVIORAL SCIENCE RESEARCH LAB RECENT ARTICLES AND REPORTS
Amchin, S., Creech, B., Davis, J., Edgar, J., Fraser, W., Gloster, J., Murphy, P., & To, N. (2010). Information booklet for telephone respondents feasibility test: Final report. Internal Report.
Anderson, S., Applebaum, M., Eickman, M., Erkens, G., Fairman, F., Groen, J., Kroll, S., Manning, C., & Phipps, P. (2009). Differences in seasonality between the CES and QCEW programs: Results from the 2008 response analysis survey. Internal Report.
Bates, N., Dahlhamer, J.M., Phipps, P., Safir, A., & Tan, L. (2010). Assessing contact history data quality and consistency across several federal surveys. Proceedings of the ASA Joint Statistical Meetings, Survey Methods Research Section, 91-105.
Book, T., & Edgar, J. (2012). Proxy reporting lab study report. Internal Report.
Denton, S., Edgar, J., Fricker, S., & Phipps, P. (2012). Exploring conversational interviewing in the American Time Use Survey: Behavior coding study report. Presented at the Annual Conference of the American Association for Public Opinion Research, Orlando, FL.
Edgar, J. (2009). What does “usual” usually mean? Presented at the Annual Conference of the American Association for Public Opinion Research, Orlando, FL.
Edgar, J. (2010). Cognitive testing 2011 CEQ changes results. Internal Memo.
Edgar, J. (2011). Ask more, get more? Comparing responses to detailed and global questions. Presented at the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ.
Edgar, J. (2011). Global questions cognitive testing results. Internal Report.
Edgar, J. (2011). SOII IDCF usability testing results. Internal Memo.
Edgar, J. (2012). Cognitive interviews without the cognitive interviewer? Presented at the Annual Conference of the American Association for Public Opinion Research, Orlando, FL.
Edgar, J. (2013). Improving proxy reporting. Presented at the Annual Conference of the American Association for Public Opinion Research, Boston, MA.
Edgar, J., Mockovak, W., & Kopp, B. (2014). Results from CPS certification cognitive testing. Internal Report.
Edgar, J., Schwarz, D., & Book, T. (2012). Reference period web survey report. Internal Report.
Fairman, K., Applebaum, M., Manning, C., & Phipps, P. (2009). Response analysis survey: Examining reasons for employment differences between the QCEW and the CES survey. pp. 3483-3496 in Proceedings of the ASA Joint Statistical Meetings, Survey Methods Research Section. Alexandria, VA: American Statistical Association.
Fox, J.E., & Fricker, S. (2009). Designing ratings scales for questionnaires. Presented at the Usability Professionals’ Association Annual Meeting, Portland, OR, June 11, 2009.
Fox, J. & Fricker, S. (2012). Steps to design a better survey. Presentation at User Focus, Chevy Chase, MD.
Fricker, S., Bosley, J., & Gillman, D. (2012). Effects on employment classifications of conceptual variability of response category options—Implications for data quality. Proceedings of the Annual Meeting of the American Statistical Association.
Fricker, S., Creech, B., Davis, J., Gonzalez, J., Tan, L., & To, N. (2012). Exploring the effects of a shorter interview on data quality, nonresponse, and respondent burden. Proceedings of the Annual Meeting of the Federal Committee on Survey Methodology, Washington, DC.
Fricker, S. & Edgar, J. (2009). Exploring sources of error in the Consumer Expenditure Survey – Results from a small-scale validation study. Paper Presented at the Annual Conference of the American Association for Public Opinion Research, Hollywood, FL.
Fricker, S., Gonzalez, J., & Tan, L. (2011). Are you burdened? Let’s find out. Paper Presented at the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ.
Fricker, S., Kreisler, C., & Tan, L. (2012). Exploratory research on the construction of a summary index for respondent burden, Proceedings of the Annual Meeting of the American Statistical Association.
Fricker, S. & Kopp, B. (2010). Summary report of the usability test of the BLS public website homepage redesign. Internal Report.
Fricker, S., Kopp, B., & To, N. (in press). Exploring a balance edit approach in the Consumer Expenditure Quarterly Interview Survey. In C. Carroll, T. Crossley, & J. Sablehaus (Eds.) Improving the Measurement of Consumer Expenditures. Chicago, IL: University of Chicago Press.
Fricker, S., & Tourangeau, R. (2010). Examining the relationship between nonresponse propensity and data quality in two national household surveys. Public Opinion Quarterly, 74(5), 934-955.
Geisen, E., Richards, A., Strohm, C., & Wang, J. (2011). U.S. Consumer Expenditure records study final report. Internal Report.
Gonzalez, J., & Edgar, J. (2009). Correlates of data quality in the Consumer Expenditure Quarterly Interview Survey. Proceedings of the Section on Survey Research Methods, American Statistical Association.
Kaplan, R. & Mockovak, W. (2013). Occupational Outlook Quarterly reinvention: Section titles feedback. Internal Report.
Kopp, B. (2011). CPS disability supplement questions: Summary of findings from cognitive testing. Internal Report.
Kopp, B. (2012). Cognitive testing report for the CEQ 2013 changes. Internal Report.
Kopp, B., Fox, J., Yu, E., & To, N. (2014). Summary report of the phases II & III usability tests of the CE mobile-optimized web diary. Internal Report.
Kopp, B., Kaplan, K., & Phipps, P. (2014). Results from cognitive testing of the ATUS sleep questions: Contrasting time diary and stylized sleep estimates. Internal Report.
Kopp, B. & Yu, E. (2013). Summary report of the 2013 global questions cognitive testing study. Internal Report.
Kopp, B. & Yu, E. (2014). Final cognitive testing report for the CEQ 2015 changes. Internal Report.
Mockovak, W. (2010). Using an action-research model to move from conversational to hybrid standardized interviewing: A case study. Proceedings of the Section on Survey Research Methods, American Statistical Association.
Mockovak, W. (2011). The impact of visual design in survey cover letters on response and web take-up rates. Proceedings of the Section on Survey Research Methods, American Statistical Association.
Mockovak, W. (2011). Summary of interviews with technical users of employment projections data. Internal Report.
Mockovak, W. (2011). Usability findings from the Occupational Outlook Handbook search function test. Internal Report.
Mockovak, W. & Kopp, B. (2012). An evaluation of alternative prototypes of drop-down menus on BLS.gov. Internal Report.
Mockovak, W. (2012). Exploratory study of the Adobe fillable GTP form using eye tracking. Internal Report.
Mockovak, W. (2013). Usability test of the OOH home page prototypes. Internal Report.
Mockovak, W. (2013). Initial summary of BLS brochure evaluation data. Internal Report.
Mockovak, W. (2013). Occupational Outlook Quarterly (OOQ) focus group results from GMU career services staff. Internal Report.
Mockovak, W. & Bartsch, K. (2013). Reinventing and evaluating a redesigned Occupational Outlook Handbook. Proceedings of the Section on Survey Research Methods, American Statistical Association.
Mockovak, W., Harney, T., Hersey, R., Muck, J., Carney, P., & Rowinski, N. (2013). Summary report from the Occupational Requirements Survey (ORS) educational requirements test. Internal Report.
Mockovak, W. & Stang, S. (2012). Using an action-research model to develop a grid on a self-administered questionnaire. Proceedings of the Fourth International Conference on Establishment Surveys, Montreal.
Phipps, P. (2009). Employer interviews on the Survey of Occupational Injuries and Illnesses and Workers’ Compensation Claims reporting. Internal Report.
Phipps, P., Edgar, J., Denton, S., & Fricker, S. (2012). Exploring conversational interviewing in the American Time Use Survey, Paper Presented at the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ.
Phipps, P. & Moore, D. (2010). Employer interviews: Exploring differences in reporting work injuries and illnesses in the Survey of Occupational Injuries and Illnesses and state Workers’ Compensation claims. pp. 2910-2924 in Proceedings of the ASA Joint Statistical Meetings, Survey Methods Research Section. Alexandria, VA: American Statistical Association.
Phipps, P. & Vernon, M.K. (2009). 24 hours: An overview of the recall diary method and data quality in the American Time Use Survey. pp. 109-124 in R. F. Belli, F. Stafford, and D. F. Alwin (eds.), Calendar and Time Diary Methods in Life Course Research. Thousand Oaks, CA: Sage.
Ruther, N. (2013). Producing paradata from American Time Use Survey audit trails. Internal Report.
Sjoblom, M., & Lee L. (2012). Records information and feasibility of use study: Final report. Internal Report.
Yu, E. (2013). Asking questions about household member activities to improve expenditure reporting. Internal Report.
ATTACHMENT II
CONSULTANTS TO THE
BEHAVIORAL SCIENCE RESEARCH LABORATORY
Dr. Paul Biemer, Distinguished Fellow
Research Triangle Institute
3040 Cornwallis Road
Ragland Building
Research Triangle Park, NC 27709
(919) 541-6000
Dr. Roger Tourangeau
Westat
1600 Research Boulevard
Rockville, MD 20850
(301) 294-2828
Dr. Ting Yan
University of Michigan Institute of Social Research
426 Thompson Street
Ann Arbor, MI 48104
(734) 647-5380
ATTACHMENT III
CONSENT FORM
The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.
The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.
During this research you may be audio and/or videotaped, or you may be observed. If you do not wish to be taped, you still may participate in this research.
We estimate it will take you an average of [enter #] minutes to participate in this research (ranging from [enter #] minutes to [enter #] minutes).
Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.
Persons are not required to respond to the collection of information unless it displays a currently valid OMB control number. The OMB control number for this collection is 1220-0141, and it expires [enter date].
------------------------------------------------------------------------------------------------------------
I have read and understand the statements above. I consent to participate in this study.
___________________________________ ___________________________
Participant's signature Date
___________________________________
Participant's printed name
___________________________________
Researcher's signature
OMB Control Number: 1220-0141
Expiration Date: [enter expiration date]
In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.
¹ Internal reports are available upon request.
² This estimate includes an additional 100 hours for a study that will be conducted in FY2015 by a contractor under OSMR’s Blanket Purchase Agreement. This study will examine the feasibility of using respondent-provided financial records in the context of a redesigned Consumer Expenditure Survey (CE). The study will have 50 participants, each of whom is estimated to participate for a total of two hours across the data collection period (50 participants × 2 hours = 100 burden hours).
³ This estimate includes an additional 1,565 burden hours for one study that will be conducted in FY2015 by a contractor under OSMR’s Blanket Purchase Agreement. This is a feasibility study to explore the possibility of collecting Job Openings and Labor Turnover Survey data on an accelerated schedule to align with the Current Employment Statistics survey. One thousand study participants will be mailed a questionnaire once a month for six months; the form is estimated to take 15 minutes to complete. A subsample of 65 participants will be contacted by phone for a 1-hour follow-up interview (1,000 participants × 6 forms × 0.25 hours per form = 1,500 burden hours; 65 participants × 1 hour = 65 burden hours; total: 1,565 burden hours).
⁴ This estimate includes an additional 875 burden hours for one study that is anticipated to be fielded in FY2016, though it may occur in FY2015. The date depends on decisions regarding the measurement of Factoryless Goods Production (FGP) that are outside of BLS control. Based on current plans to study FGP, we expect up to two thousand participants will be mailed a questionnaire that is estimated to take 15 minutes to complete. A subsample of 500 of these participants will be contacted by phone for a 45-minute debriefing interview (2,000 participants × 0.25 hours per questionnaire = 500 burden hours; 500 participants × 0.75 hours = 375 burden hours; total: 875 burden hours).