BUREAU OF LABOR STATISTICS

OMB CLEARANCE PACKAGE






for

CLEARANCE TO CONDUCT COGNITIVE AND PSYCHOLOGICAL RESEARCH

IN FY2012 THROUGH FY2014









Prepared by

BEHAVIORAL SCIENCE RESEARCH LABORATORY

OFFICE OF SURVEY METHODS RESEARCH

BUREAU OF LABOR STATISTICS



2011



Request for Extension of Clearance to Conduct

Cognitive and Psychological Research


Abstract


This is a request for clearance by the Bureau of Labor Statistics' (BLS) Behavioral Science Research Laboratory (BSRL) to conduct research to improve the quality of data collection by examining the psychological and cognitive aspects of methods and procedures. BLS staff, employing state-of-the-art cognitive psychological testing methods, will conduct these research and development activities. The feasibility and value of this approach to questionnaire construction, survey technology, and interview processes have been demonstrated over the past 25 years. The use of this technique to improve the quality of data collection has been advocated by the Cognitive Aspects of Survey Methodology seminar sponsored by the National Academy of Sciences and by participants in a questionnaire design advisory conference. The planned research and development activities will be conducted during FY2012 through FY2014 with the goal of improving overall data quality through improved procedures.

Supporting Statement


A. Justification


1. Collection of Information


The Bureau of Labor Statistics' (BLS) Behavioral Science Research Laboratory (BSRL) conducts psychological research on the design and execution of the data collection process in order to improve the quality of data collected by the Bureau. This research assesses questionnaire and form management and administration, as well as issues relating to interviewer training and to interviewers' interaction with respondents during the interview process. BSRL staff work closely with the economists and program specialists responsible for defining the concepts to be measured by BLS collection programs.


Both questionnaires and forms are used in the Bureau's surveys. Questionnaires specify the preferred wording of the questions to be asked, whereas forms specify the data items to be collected. Each possesses distinctive problems which in many cases can be related to respondent characteristics, survey content, or format of administration. Such problems impede the effectiveness of particular surveys and the mission of the Bureau in general.


The purpose of this request for clearance for cognitive psychological research and development activities by the BSRL is to enhance the quality of the Bureau's data collection procedures and overall data management. The basic goal of the BSRL is to improve, through interdisciplinary research, the quality of the data collected and published by the BLS. BLS is committed to producing the most accurate and complete data possible under the highest quality assurance guidelines. With this mission in mind, BSRL was created to help not only maintain but also improve the quality of the data collection process.


The laboratory was established in 1988 by Commissioner Janet Norwood to apply behavioral science to all forms of oral and written communication used in the collection and processing of survey data. This work covers all aspects of data collection, including mode, manuals, and interviewer training. BSRL provides state-of-the-art services to numerous programs within BLS, the Department of Labor (DOL), and other agencies as requested, including questionnaire redesign, survey updates, and improvements to overall data collection management. These efforts, in turn, increase data quality and reduce respondent burden. The techniques proposed here have been applied successfully to many BLS surveys.


The research techniques and methods to be used in these studies will include analyses of questionnaire construction, the interview process, and survey technology. Within the structure of the questionnaire, analyses will be conducted in the following domains:


  1. Question Analysis--Evaluation of individual questionnaires, appraising question intent, semantic clarity, and the relationships between questions.


  2. Term Analysis--Evaluation of specific wording and phrases in terms of their psycholinguistic properties and an assessment of respondent interpretation of the meaning of these terms, at both the conscious and unconscious levels.


  3. Instruction Analysis--Inspection of instructions for their semantic clarity, the degree to which they reflect the stated intention of investigators, ease of interpretation, and other considerations which may elicit unambiguous and appropriate answers or behaviors from respondents or interviewers.


  4. Format Analysis--Review of questionnaires or subsets of questions for perceptual characteristics in order to facilitate better respondent comprehension and promote more active attention to the focus of the questionnaire or form.


Within the interview process, several analyses are conducted to assess nonverbal communication, interpersonal dynamics, and symbolic interaction--the use of cultural symbols to make social statements. Staff conduct research to evaluate the overall effectiveness of data collection procedural characteristics, including:


  1. Interviewer Characteristics and Behavior Analysis - Study of interviewer presentation (e.g., appearance, manner, relation to the subject population) in order to enhance the interpersonal skills of interviewers in general and to develop and improve procedures for training interviewers.


  2. Respondent Characteristics and Behavior Analysis - Assessment of the social, cultural, and ethnic characteristics of respondents and how those characteristics may bear upon interactions with the interviewer. Staff members also observe the behaviors of respondents for cues concerning their reactions to the interview process. Because BLS constantly collects data from different populations that change over time, the analysis of respondent characteristics needs frequent updating.


  3. Mode Characteristics - Examination of the unique properties of interviewer and/or respondent behavior as a function of the media used to collect data; for example, self-administered interviews, personal interviews, telephone interviews, and interviews utilizing assistive technologies (e.g., CAPI, CASI, and CATI).


  4. Usability Analysis - Evaluation of the effectiveness, efficiency, and satisfaction with which respondents complete tasks assigned to them, especially when using self-guided instruments (PAPI or CASI).



  5. Data Collection Methodology Analysis - Assessment of alternative formats for collecting survey data (e.g., respondent-provided records, administrative records). Staff will evaluate the validity and reliability of data collected through the alternative methodology, as well as the level of respondent burden relative to current procedures.


BLS also uses a variety of methodologies, such as usability analysis, debriefings, and in-depth interviews, to better understand how to communicate more effectively with its stakeholder and user communities through its website and other materials.


2. The Purpose of Data Collection


The purpose of BSRL's data collection is to improve Federal data collection processes through scientific research. Theories and methods of cognitive science provide essential tools for the development of effective questionnaires. For example, they can provide an understanding of how respondents comprehend survey questions, recall relevant facts from memory, make judgments, and respond. On the basis of such knowledge, questions can be tailored to increase the accuracy of the collected information and to reduce the respondent burden. Similar improvements can be made with respect to other aspects of the data collection process.


BSRL’s research contributes to BLS and to the entire survey research field. Research results are shared with the Bureau through seminars, training sessions, reports, publications, and presentations at professional conferences. The BSRL staff has instituted a method of peer review to encourage high standards of social science research practice. A current bibliography of BSRL staff publications can be found in Attachment I.


The BSRL’s research is expected to 1) improve the data collection instruments employed by the Bureau, 2) increase the accuracy of the economic data produced by BLS and on which economic policy decisions are based, 3) increase the ease of administering survey instruments for both respondents and interviewers, 4) increase response rates in panel surveys as a result of reduced respondent burden, 5) increase the ease of use of the BLS website and other BLS products, and 6) enhance BLS’s reputation, resulting in greater confidence in and respect for the survey instruments used by BLS.

The application of cognitive and psychological theories and methods to survey data collection is widespread and well established. The consequences of failing to scientifically investigate the data collection process are a lag in the use of accepted practices, the application of archaic survey development techniques based on intuition and trial and error, and ultimately a severe cost in data quality and in burden to respondents, interviewers, and data users alike. For example, without knowledge of what respondents can be expected to remember about the past and how to ask questions that effectively aid the retrieval of the appropriate information, survey researchers cannot ensure that respondents will not take shortcuts to avoid careful thought in answering the questions, or that they will not be subject to undue burden. Likewise, without investigation of interviewers’ roles and abilities in the data collection process, survey researchers cannot ensure that interviewers will read their questions correctly with ease and fluency, or record respondents’ answers correctly.


3. Use of Improved Technology


Staff members will design, conduct, and interpret field and laboratory research that contributes new knowledge of the cognitive aspects of human behavior in relation to questionnaire design and survey methodology. Cognitive psychological research methods in use include techniques such as probing questions, memory cueing, group discussion, and intensive interviewing. Depending on research goals, these methods may be used separately or in combination with one another.


The laboratory approach has a number of advantages: rapid and in-depth testing of questionnaire items, a more detailed understanding of respondents’ comprehension of concepts, and access to special populations who can be quickly recruited and tested. Different laboratory methods will be used in different studies depending on the aspects of the data collection process being studied. Computer technology will be used when appropriate to aid respondents and interviewers and to minimize burden.


Respondent burden in this collection will be held to a minimum. The proposed approach to research of data collection methods is designed to obtain the maximum amount of information for the minimum respondent burden. The research includes such methods as:


  1. interview pacing and latency classification,


  2. degree of structure within the interview format, group dynamics observation, and recording of decision behaviors and/or negotiation processes,


  3. structured tasks: card sorts and vignettes,


  4. expert analyses,


  5. experiments involving the administration of forms to study respondents, and


  6. usability tests of existing or proposed data collection and data dissemination systems (including the public BLS website).


4. Efforts to Avoid Duplication


This research does not duplicate any other research effort within the BLS. It will provide critical supplemental information beyond that currently available in the field of survey methodology as it applies to BLS surveys.


This research also does not duplicate any research effort outside of government, as its purpose is not to replicate survey research studies. The BSRL staff stays abreast of current research in cognitive psychology through conference attendance, research reported in professional journals, and in-house staff meetings and peer review processes. There are no current, similar data that can be used or modified for the purpose of improving the overall data collection process.


5. Collection of Information Involving Small Establishments

BSRL data collection efforts focus primarily on information gained through laboratory interviews, telephone interviews, and self-administered questionnaires with individuals recruited from the general public. However, in some instances organizational goals necessitate the involvement of businesses, state agencies, or other entities. To the extent these establishments are included, they normally are surveyed only once.


6. The Consequences of Less Frequent Data Collection


The planned collection of data will allow BSRL to suggest modifications and alterations to survey research on an ongoing basis. Because this collection is expected to be an ongoing effort, it has the potential to have an immediate impact on all survey collection methods within the Bureau's jurisdiction. Delaying it would sacrifice potential gains in survey modification within the Bureau as a whole.


7. Special Circumstances


There are no special circumstances.


8. Federal Register and Consultation Outside BLS


Federal Register: No comments were received as a result of the Federal Register notice published at 76 FR 56226 on September 12, 2011.


Outside Consultation: BLS consults frequently with individuals outside the agency to obtain views on the availability of data, frequency of collection, suitability of particular laboratory methods, clarity of instructions and record keeping, disclosure, reporting format, and the data elements to be recorded and/or disclosed. These ongoing consultations involve the National Center for Health Statistics, the Bureau of the Census, the University of Maryland, the University of Michigan, and other federal agencies and institutions of higher learning. Consultants come from a wide range of subject areas and expertise. A list of individuals consulted in the past is attached to this document (Attachment II).


The individual responsible for the BSRL research efforts is:


Dr. William P. Mockovak

Director of Behavioral Science Research

Office of Survey Methods Research

Bureau of Labor Statistics

PSB Room 1950

2 Massachusetts Ave., NE

Washington, DC 20212
(202) 691-7414


9. Payment to Respondents


Respondents for activities conducted in the laboratory (that is, cognitive interviews and focus groups) under this clearance will receive a small stipend. This practice has proven necessary and effective in recruiting subjects to participate in this small-scale research, and it is also employed by the other Federal cognitive laboratories. The incentive for participation in a cognitive interview is $40, and for participation in a focus group it is $50-$75. BLS may provide smaller incentives than these amounts at its discretion; however, any requests for larger amounts must be justified in writing to OMB.

 

Respondents for methods that are generally administered as part of field test activities (that is, split sample tests, behavior coding of interviewer/respondent interaction, and respondent debriefing) or other research projects where BLS lab staff travel to and test in respondents’ residences will not receive payment unless there are extenuating circumstances that warrant it.  Such circumstances and proposed incentives must be justified in writing to OMB.  


10. Confidentiality and Privacy Concerns


The data collected from respondents will be tabulated and analyzed only for the purpose of evaluating the research in question. Laboratory respondents will be asked to read and sign a consent form (Attachment III) explaining the voluntary nature of the studies and the use of the information, noting that the interview may be taped or observed, and containing a Privacy Act statement. The Privacy Act statement given to respondents is as follows:

In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.

Surveys with current OMB approval that are involved in BSRL studies and are collected outside the laboratory, such as mail or CATI surveys, use the confidentiality pledge of the existing approved collection or the Privacy Act statement.


The Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA) safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality for exclusively statistical purposes by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of the BLS.


The Bureau of Labor Statistics Commissioner's Order No. 1-06, “Confidential Nature of BLS Statistical Data,” explains the Bureau's policy on confidentiality: “In conformance with existing law and Departmental regulations, it is the policy of the BLS that respondent identifiable information collected or maintained by, or under the auspices of, the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that will ensure that the information will be used only for statistical purposes and will be accessible only to authorized persons.”


11. Sensitive Questions


There are no questions of a sensitive nature.


12. Estimated Respondent Burden


The current OMB inventory for FY2011 is 2,200 hours. The FY2012, FY2013, and FY2014 estimated respondent burdens are as follows:



              Individuals and    Private    Total Response
              Households         Sector     Burden (Hours)

FY2012              900            300          1,200

FY2013              900            300          1,200

FY2014              900            300          1,200

Total FY12-14     2,700            900          3,600


The burden hours are estimated based on the anticipation that the research will require approximately one hour per respondent. Each study will differ substantively from the others. The projects are expected to be complex, at times involving several cognitive testing methods to test the hypotheses of a given research question.
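Under the stated assumption of approximately one hour per respondent, the 1,200 annual burden hours correspond to roughly 1,200 participants per year:

\[
1{,}200 \text{ respondents} \times 1 \text{ hour per respondent} = 1{,}200 \text{ hours per year}
\]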


Coverage of the Estimates


The estimates cover the time each respondent will spend answering questions, including debriefing about the cognitive testing procedures used. The time required to travel to the laboratory, if needed, is not covered, since distances and modes of transportation are unknown. No retrieval of information by respondents is anticipated, although it is possible that validation of data at some point may require respondents to keep and check records. In that case, experiments will be designed to include retrieval time.


Basis for the Estimate


These estimates are based on the BSRL’s previous experience conducting such research under the existing OMB Clearance 1220-0141, and on expectations concerning the research projects to be conducted in the next 3 years. BSRL staff and laboratory facilities (especially the usability lab) have been increasingly utilized by both program offices and outside agencies, and it is anticipated that this trend will continue. The estimates are also based on the experience of other government agencies (such as the National Center for Health Statistics' study of the Cognitive Aspects of Survey Methods, 1987) that have conducted cognitively oriented questionnaire design research.

Annualized Cost to Respondents


The estimated dollar cost for these individuals is based on the average hourly earnings of employed wage and salary workers, $23.13 per hour, taken from July 2011 Current Employment Statistics program data. Using the $23.13 per hour figure, the annualized cost to respondents is $27,756 for FY2012 through FY2014 (based on 1,200 burden hours annually).
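As a check on the arithmetic, the annualized figure is the product of the annual burden hours and the hourly earnings rate cited above:

\[
1{,}200 \text{ hours} \times \$23.13 \text{ per hour} = \$27{,}756 \text{ per year}
\]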

13. Total Annual Cost Burden


a) There will be no total capital and start-up cost component for respondents or record keepers resulting from the collection of information.


b) The respondents and record keepers will have no expenses for operation and maintenance or purchase of services resulting from the collection of information.


14. Cost to the Federal Government


The maximum cost to the Federal Government is $52,000 annually for FY2012, FY2013, and FY2014. These costs consist entirely of the reimbursements paid to respondents and newspaper advertisement costs. Other costs, such as operational expenses (e.g., equipment, overhead, printing, and support staff) and any other expenses that would not have been incurred without the paperwork burden, are in place as a function of the laboratory proper and are not contingent on or necessary for this research.


15. Changes in Burden


This is a request for extension to the existing OMB Clearance 1220-0141 in order to continue the research mission for another 3 years. Because this is a generic information collection, where burdens are cumulative over the approval cycle as opposed to being averaged over a period of three years, the BLS is increasing its burden estimate to 3,600 hours. The prior estimate proved insufficient to meet BLS program needs.
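The cumulative estimate follows directly from the annual burden figures given in item 12:

\[
3 \text{ years} \times 1{,}200 \text{ hours per year} = 3{,}600 \text{ hours}
\]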


16. Tabulation, Analysis and Publication Plans, and Project Schedule


This clearance request is for survey methodology and questionnaire quality assurance work, which includes the exploratory activities leading to the evaluation of data quality. Both quantitative and qualitative analyses are planned for the evaluation of these activities, depending on the circumstances.

 

The results of these investigations will be used primarily to improve the quality of data collection and to assure total collection quality as it relates to data management. Because BLS is using the latest techniques and cognitive psychological testing methodology, methodological papers may be written that include some tallies of response problems, recall strategies, or results from other testing procedures used. However, BLS will not publish any reports of the substantive results collected under this clearance. The methodological results may be included as a methodological appendix or footnote in a report containing data from a larger data collection effort. The methodological results of this research may also be prepared for presentation at professional meetings or publication in professional journals.

Project Schedule


This project schedule calls for laboratory interviews to commence once OMB approval is received.


The proposed time schedule is continuously ongoing in nature, with research publication dates dependent on data collection as defined in each researcher's proposal and on the subsequent results.


17. Expiration for OMB Approval


The BSRL is not seeking approval to avoid displaying the expiration date for OMB approval of the information collection.


18. Exception to Certification Statement


There are no exceptions to the certification statement “Certification for Paperwork Reduction Act Submissions.”

ATTACHMENT I


BEHAVIORAL SCIENCE RESEARCH LAB BIBLIOGRAPHY


Bates, N., Dahlhamer, J.M., Phipps, P., Safir, A., & Tan, L. (2010). Assessing contact history data quality and consistency across several federal surveys. Proceedings of the ASA Joint Statistical Meetings, Survey Methods Research Section, 91-105.


Bosley, J.J., Dashen, M., & Fox, J.E. (1999). Effects on list length and recall accuracy of order of asking follow-up questions about lists of items recalled in surveys. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Bosley, J.J., Eltinge, J.L., Fox, J.E., & Fricker, S.S. (2003). Conceptual and practical issues in the statistical design and analysis of usability tests. Presented at The Federal Committee on Statistical Methodology’s Research Conference, Arlington, VA.


Butani, S.J., and McElroy, M. (1999). Managing various customer needs for occupational employment statistics and wage survey. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Butani, S.J., Robertson, K., & Mueller, K. (1998). Assigning permanent random numbers to the Bureau of Labor Statistics longitudinal (universe) data base. Proceedings of the Section on Survey Research Methods, American Statistical Association, 451-462.


Chen, B., & Zadrozny, P. (1998). An extended Yule-Walker method for estimating a vector autoregressive model with mixed-frequency data. Advances in Econometrics: Messy Data--Missing Observations, Outliers, and Mixed-Frequency Data, Vol. 13, T.B. Fomby and R.C. Hill (eds.), JAI Press Inc., Greenwich, CT.


Cho, E.C., & Cho, M.J. (2005). Maximum Entropy and Differential Forms. The International Journal of Pure and Applied Mathematics, 18, No. 3, 395-402.


Cho, E.C., & Cho, M.J. (2009). Variance of Sample Variance with Replacement. The International Journal of Pure and Applied Mathematics, 51, No. 1, 73-77.


Cho, E.C., Cho, M.J., & Eltinge, J.L. (2005). The variance of sample variance from a finite population. The International Journal of Pure and Applied Mathematics, 21, No. 3, 389-396.


Cho, M.J., & Eltinge, J. (2001). Diagnostics for evaluation of superpopulation models for variance estimation under systematic sampling. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Cho, M.J., Eltinge, J.L., & Cho, E. C. (2007). Optimal pairing for stratum collapse methods with interviewer-level measurement error shared across strata. The International Journal of Pure and Applied Mathematics, 40, 3, 399-411.


Cho, M.J., & Stockbridge, R.H. (2002). Linear programming formulation for optimal stopping problems. SIAM Journal on Control and Optimization, 40, 1965-1982.


Clements, J. (2000). Protecting data in two-way statistical tables using network flow methodology. Proceedings of the Section on Government Statistics, American Statistical Association.


Cohen, S. (1997). The National Compensation Survey: The new BLS Integrated Compensation Program. Proceedings of the Section on Survey Research Methods, American Statistical Association, 451-456.


Cohen, S., Wright, T., Vangel, M., & Wacholder, S. (2000). Postdoctoral research programs in the federal statistical system. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Conrad, F., Blair, J., & Tracy, E. (1999). Verbal reports are data! A theoretical approach to cognitive interviews. Proceedings of the Federal Committee on Statistical Methodology Research Conference.


Conrad, F., Brown, N., & Dashen, M. (1999). Estimating the frequency of events from unnatural categories. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Conrad, F.G., & Schober, M.F. (1999). Conversational Interviewing and Data Quality. Proceedings of the Federal Committee on Statistical Methodology Research Conference.


Couper, M., & Stinson, L. (1999). Completion of self administered questionnaires in a sex survey. The Journal of Sex Research, 36, 321-330.


Dashen, M. (2000). Improving purchase recollection. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dashen, M., & Fricker, S. (2001). Understanding the cognitive processes of open-ended categorical questions and their effects on data quality. Journal of Official Statistics, 17(4), 457-477.


Dashen, M., & Sangster, R.L. (1997). Does item similarity and word order influence comparative judgments? Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dippo, C.S., & Gillman, D.W. (1999). The role of metadata in statistics. Paper presented at UN/ECE Work Session on Statistical Metadata.


Dippo, C.S., & Hoy, E. (1997). Providing metadata to survey staff via Internet. Presented at the ISI 51st Session, Istanbul, Turkey.


Dippo, C.S., & Tupek, A. (1997). Quantitative literacy: New website for federal statistics provides research opportunities. D-Lib Magazine.


Dixon, J. (2001). Using 'gross flows' to evaluate the impact of nonresponse on federal household survey estimates. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dixon, J. (2000). The relationship between household moving, nonresponse, and the unemployment rate in the Current Population Survey. Proceedings of the Section on Government Statistics, American Statistical Association.


Dorfman, A.H. (1999). The stochastic approach to price indices. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dorfman, A.H. (1999). Issues in the analysis of complex surveys. Proceedings Book 2 Topic 67 (Bulletin of the International Statistical Institute).


Dorfman, A.H. (2000). Nonparametric regression for estimating totals in finite populations. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dorfman, A.H. (2009). Nonparametric regression and the two sample problem. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dorfman, A.H., & Valliant, R. (1997). The Hajek estimator revisited. Proceedings of the Section on Survey Research Methods, American Statistical Association, 760-765.


Eltinge, J. (2001). Accounting for design and superpopulation components of variability in approximations for design effects, generalized variance functions and related quantities. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Eltinge, J. (2000). Implications of model validation criteria for the performance of small domain estimation methods. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Eltinge, J. (1999). Evaluation and reduction of cluster-level identification risk for public-use survey microdata files. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Eltinge, J., & Phipps, P. (2009). Characterization, Evaluation, and Management of Prospective Benefits, Costs, and Risks in the Development of New Statistical Programs for Energy. Proceedings of the ASA Joint Statistical Meeting, American Statistical Association. 241-255.


Ernst, L., & Paben, S. (2000). Maximizing and minimizing overlap when selecting any number of units per stratum simultaneously for two designs with different stratifications. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, L.R., Powers, R., Sadler, A., & Slack, D. (2011). Adjusting sampling and weighting to account for births and deaths. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, L.R. (2001). The history and mathematics of apportionment of the U.S. House of Representatives. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, L.R. (2001). Retrospective assignment of permanent random numbers for Ohlsson's exponential sampling overlap maximization procedure for designs with more than one sample unit per stratum. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, L.R. (1999). The maximization and minimization of sample overlap problems: A half century of results. Bulletin of the International Statistical Institute, Proceedings Tome LVII, Book 2, 293-296.


Ernst, L.R., Valliant, R., & Casady, R.J. (1998). Permanent and collocated random number sampling and the coverage of births and deaths. Proceedings of the Section on Survey Research Methods, American Statistical Association, 457-462.


Ernst, L.R., & Ponikowski, C.H. (1998). Selecting the Employment Cost Index survey sample as a subsample of the National Compensation Survey. Proceedings of the Section on Survey Research Methods, American Statistical Association, 517-522.


Ernst, L.R. (1998). Maximizing and minimizing overlap when selecting a large number of units per stratum with simultaneous selection. Journal of Official Statistics, 14, 297-314.


Esposito, J.L., & Fisher, S. (1998). A summary of quality-assessment research conducted on the 1996 displaced-worker/job-tenure/occupational-mobility supplement. BLS Statistical Notes (No. 43), Bureau of Labor Statistics, Washington, DC.


Fisher, S.K. (2001). A clinic-based needs assessment study of women who partner with women: The relationship between sexual orientation/gender identity, health-seeking behaviors and perceived quality of care issues. Proceedings of the Section on Committee on Gay and Lesbian Concerns in Statistics, American Statistical Association.


Fisher, S.K. (2000). Improving the quality of data reporting in business surveys: Discussant comments. Proceedings of the International Conference on Establishment Surveys II.


Fisher, S.K., Ramirez, C., McCarthy, S., & Shimizu, I. (2000). Examining standardization of response rate measures in establishment surveys. Proceedings of the Council of Professional and Federal Statistics.


Fox, J.E. (2003). Designing the user interface of a data collection instrument for the Consumer Price Index. CHI 2003, Ft. Lauderdale, FL.


Fox, J.E. (2005). How do you resolve conflicting requirements from different user groups? Presented at the Usability Professionals’ Association Annual Meeting. (http://www.upassoc.org/usability_resources/conference/2005/im_fox.html)


Fox, J.E., & Dumas, J.S. (2007). Usability testing: Current practice and future directions. In A. Sears and J. Jacko (Eds.), The Human-Computer Interaction Handbook. New York: Lawrence Erlbaum Associates.


Fox, J.E., Fisher, S.K., Tucker, N.C., Sangster, R.L., & Rho, C. (2003). A qualitative approach to the study of BLS establishment survey nonresponse. Presented at The 163rd Annual Joint Statistical Meetings, San Francisco, CA, August 4, 2003.


Fox, J.E., & Fricker, S.S. (2008).  Beyond words: Strategies for designing good questionnaires.  Presented at the Usability Professionals’ Association Annual Meeting, Baltimore, MD, June 19, 2008.


Fox, J.E., & Fricker, S.S. (2009).  Designing ratings scales for questionnaires.  Presented at the Usability Professionals’ Association Annual Meeting, Portland, OR, June 11, 2009.


Fox, J.E., Mockovak, W., Fisher, S.K., & Rho, C. (2003). Usability issues associated with converting establishment surveys to web-based data collection. Presented at The Federal Committee on Statistical Methodology’s Research Conference, Arlington, VA.


Fricker, S., Galesic, M., Tourangeau, R., & Yan, T. (2005). An experimental comparison of web and telephone surveys. Public Opinion Quarterly, 69, 370-392.


Fricker, S., Gonzalez, J., & Tan, L. (2011).  Are you burdened?  Let’s find out.  Paper Presented at the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ.


Fricker, S., & Tourangeau, R. (2010).  Examining the relationship between nonresponse propensity and data quality in two national household surveys.  Public Opinion Quarterly, 74(5), 934-955.


Garner, T., Stinson, L., & Shipp, S. (1998). Subjective assessments of economic well-being: Cognitive research at the U.S. Bureau of Labor Statistics. Focus, 19, 43-46.


Goldenberg, K., & Phillips, M.A. (2000). Now that the study is over, what have you told us? Identifying and correcting measurement error in the Job Openings and Labor Turnover Survey pilot test. Paper Presented at the International Conference on Establishment Surveys II, Buffalo, NY, June 2000.


Goldenberg, K.L., & Stewart, J. (1999). Earnings concepts and data availability for the Current Employment Statistics Survey: Findings from cognitive interviews. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Goldenberg, K.L., Levin, K., Hagerty, T., Shen, T. & Cantor, D. (1997). Procedures for reducing measurement error in establishment surveys. Proceedings of the Section on Survey Research Methods, American Statistical Association, 994-999.


Gonzalez, J., & Edgar, J. (2009). Correlates of data quality in the Consumer Expenditure Quarterly Interview Survey. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Gregg, V., & Dippo, C.S. (1999). FedStats: Partnering to create the national statistical information infrastructure of the 21st century. Proceedings of the Section on Government Statistics, American Statistical Association.


Guciardo, C. (2001). Estimating variances in the National Compensation Survey using balanced repeated replication. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Harris-Kojetin, B.A., & Fricker, S.S. (1999). The influence of environmental characteristics on survey cooperation: A comparison of metropolitan areas. Paper Presented at 28th Session International Conference on Survey Nonresponse, Portland, OR.


Heo, S., & Eltinge, J. (1999). The analysis of categorical data from a complex sample survey: Chi-squared tests for homogeneity subject to misclassification error. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Kydoniefs, L., & Stinson, L. (1999). Standing on the outside, looking in: Tapping data users to compare and review surveys. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Lee, S., & Eltinge, J. (1999). Diagnostics for the stability of an estimated misspecification effect matrix. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Lent, J. (2000). Chain drift in some price index estimators. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Lent, J., & Dorfman, A.H. (2009). Using a weighted average of the Jevons and Laspeyres indexes to approximate a superlative index. Journal of Official Statistics, 25, 139-149.


Lent, J., Miller, S., Duff, M., & Cantwell, P. (1998). Comparing Current Population Survey estimates computed using different composite estimators. Proceedings of the Section on Survey Research Methods, American Statistical Association, 564-569.


Levi, M. (1997). A Shaker approach to web site design. Proceedings of the Section on Statistical Computing, American Statistical Association.


Mason, C., Sangster, R., & Wirth, C. (2001). Comparison of final disposition codes used for attrition calculations for telephone samples. Proceedings of the Section on Survey Research Methods, American Statistical Association.


McKay, R.B. (1997). The multiracial category as “wild card” in racial questionnaire design. Proceedings of the Section on Survey Research Methods (AAPOR), American Statistical Association, 916-921.


Mockovak, W. (2011). The impact of visual design in survey cover letters on response and web take-up rates. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Mockovak, W., and Fox, J.E. (2002). Approaches for incorporating user-centered design into CAI development. Proceedings of the International Conference on Questionnaire Development, Evaluation, and Testing Methods (QDET).


Mockovak, W., & Powers, R. (2008). The use of paradata for evaluating interviewer training and performance. Poster for the Joint Statistical Meetings.


Moore, J., Stinson, L., & Welniak, E. (1999). Income reporting in surveys: Cognitive issues and measurement error. In Monroe Sirken, Douglas Herrmann, Susan Schecter, Norbert Schwarz, Judith Tanur, and Roger Tourangeau (eds.), Cognitive and Survey Research. New York: John Wiley & Sons, Inc.


Moy, L., & Stinson, L. (1999). Two sides of a single coin?: Dimensions of change in different settings. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Park, I., & Eltinge, J. (1999). Fitting complex survey data to the tail of a parametric distribution. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Park, I., & Eltinge, J. (2001). The effect of cluster sampling on the covariance and correlation matrices of sample distribution functions. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Parsons, V., & Eltinge, J. (1999). Stratum partition, collapse, and mixing in construction of balanced repeated replication variance estimators. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Pfeffermann, D., & Scott, S. (1997). Variance measures for X-11 seasonally adjusted estimators; Some new developments with application to labor force series. Proceedings of the Section on Business & Economic Statistics, American Statistical Association, 211-216.


Pfeffermann, D., Tiller, R., & Zimmerman, T. (2000). Accounting for sampling error autocorrelations towards signal extraction from models with sampling error. Proceedings of the Section on Business and Economics Statistics, American Statistical Association.


Phipps, P.A., Butani, S.J., & Chun, Y.I. (1995). Research on establishment survey questionnaire design. Journal of Business and Economic Statistics, 13, 337-346.


Phipps, P.A., & Tupek, A.R. (1991). Assessing measurement errors in a touchtone recognition survey. Survey Methodology, 17, 15-26.


Polivka, A.E., & West, S.A. (1997). Earnings data from the Current Population Survey after the redesign. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Powers, R., & Cohen, S. (2004). Use of an audit program to improve confidentiality protection of tabular data at the Bureau of Labor Statistics. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Powers, R., & Eltinge, J. (2006). Evaluation of the detectability and inferential impact of nonresponse bias in establishment surveys. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Powers, R., & Eltinge, J. (2008). Properties of callback procedures under moderate deviations from specified models. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Powers, R., & Eltinge, J. (2009). Evaluation of randomization-based estimation and inference methods. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Powers, R., & Eltinge, J. (2010). Evaluating prospective integration of survey and administrative record data: The impact of uncertainty in measurement of data quality and cost factors. Proceedings of the Section on Government Statistics, American Statistical Association.


Powers, R., & Eltinge, J. (2011). Responsive designs for rare subpopulations subject to misclassification. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Presser, S., & Stinson, L. (1998). Data collection mode and social desirability bias in self-reported religious attendance. American Sociological Review, 63, 137-145.


Rips, L.J., Conrad, F.G., & Fricker, S.S. (2003). Straightening out the seam effect in panel surveys. Public Opinion Quarterly, 67, 522-554.


Rips, L.J., Conrad, F.G., & Fricker, S.S. (2000). Unraveling the seam effect. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sangster, R. (2005). Consumer Price Index (CPI) housing survey sample attrition. Presented at the 16th International Workshop on Household Survey Nonresponse, Tällberg, Sweden.


Sangster, R., & Meekins, B. (2004). Modeling the likelihood of interviews and refusals: using call history data to improve efficiency of effort in a national RDD survey. Proceedings of the Section on Survey Research Methods, American Statistical Association, Toronto, Canada.


Sangster, R. & Meekins, B. (2003). Data concerns for hard to reach and reluctant respondents in telephone panel surveys. Presented at the 14th International Workshop on Household Survey Nonresponse, Leuven, Belgium.


Sangster, R., & Willits, F. (2001). Evaluating numeric rating scales: Replicated results. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Schober, M.F., Conrad, F.G., & Fricker, S.S. (2004). Misunderstanding standardized language in research interviews. Applied Cognitive Psychology, 18, 169-188.

Schober, M.F., Conrad, F.G., & Fricker, S.S. (2000). When and how should survey interviewers clarify question meaning? Proceedings of the Section on Survey Research Methods, American Statistical Association.


Scott, S., & Sverchkov, M. (2010). Characteristics of a model-based variance measure for X-11 seasonal adjustment. Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Scott, S., & Zadrozny, P. (1999). Aggregation & model-based methods in seasonal adjustment of labor force series. Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Schwartz, L., & Paulin, G. (2000). Improving response rates to income questions: a comparison of range techniques. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Shettle, C., Ahmed, S., Cohen, S., Miller, R., & Waite, P. (1997). Improving statistical reports from government agencies through the reports review process. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sirken, M., Tanur, J., Tucker, C.N., & Martin, E.A. (1997). Synthesis of CASM II: Current and future directions in interdisciplinary research on cognitive aspects of survey methods. Proceedings of the Section on Survey Research Methods, American Statistical Association, 1-10.


Stamas, G., Goldenberg, K., Levin, K., & Cantor, D. (1997). Sampling for employment at new establishments in a monthly business survey. Proceedings of the Section on Survey Research Methods, American Statistical Association, 279-284.


Steiger, D.M., Mainieri, T., & Stinson, L. (1997). Subjective assessments of economic well-being: Understanding the minimum income question. Proceedings of the Section on Survey Methods Research, American Statistical Association, 899-903.


Stewart, J., & Joyce, M. (1999). Why do we need time-use data? Proceedings of the Section on Social Statistics, American Statistical Association.


Stewart, J., & Frazis, H. (1998). Keying errors caused by unusual response categories: Evidence from a Current Population Survey test. Proceedings of the Section on Survey Research Methods, American Statistical Association, 131-134.


Stinson, L. (2000). Day of week differences and implications for time-use research. Proceedings of the Section on Social Statistics, American Statistical Association.


Stinson, L. (1999). Measuring how people spend time. Proceedings of the Section on Social Statistics, American Statistical Association.


Stinson, L. (1997). Using the delighted/terrible scale to measure feelings about income and expenses. Proceedings of the Section on Survey Methods Research, American Statistical Association, 904-909.


Sukasih, A., & Eltinge, J. (2001). A goodness-of-fit test for response probability models in the analysis of complex survey data. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sverchkov, M. (2010). On modeling and estimation of response probabilities when missing data are not missing at random. Proceedings of the Section on Survey Methods Research, American Statistical Association.


Sverchkov, M., & Pfeffermann, D. (2000). Prediction of finite population totals under informative sampling utilizing the sample distribution. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sverchkov, M., Pfeffermann, D., & Scott, S. (2009). On X-11 seasonal adjustment and estimation of its MSE. Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Swanson, D.C., Hauge, S.K, & Schmidt, M.L. (1999). Evaluation of composite estimation methods for cost weights in the CPI. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Tourangeau, R., Shapiro, G., Kearney, A., & Ernst, L. (1997). Who lives here? Survey undercoverage and household roster questions. Journal of Official Statistics, 13, 1-18.


Tucker, C. (2001). Using the new race and ethnicity data. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Valliant, R., & Dorfman, A.H. (1997). Stratification on a size variable revisited. Proceedings of the Section on Survey Research Methods, American Statistical Association, 766-771.


Walker, E., & Mesenbourg, T. (1997). The Census Bureau's business register: Quality issues and observations. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Walker, M.A.C., & Bergman, B. (1997). Estimates of year-to-year change in costs per hour worked from the employer costs for Employee Compensation Survey. Proceedings of the Business and Economic Statistics Section, American Statistical Association.


Wang, S., Dorfman, A.H., & Chambers, R. (1999). Maximum likelihood under informative sampling. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Weber, W. (1999). A method of microdata disclosure limitation based on noise infusion and outlier substitution. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Werking, G.S. (1997). Overview Of The CES Redesign Research. Proceedings of the Section on Survey Research Methods, American Statistical Association, 512-516.


West, S.A., Kratzke, T., & Garden, P. (1997). Estimators for average hourly earnings and average weekly hours for The Current Employment Statistics Survey. Proceedings of the Section on Survey Research Methods, American Statistical Association, 529-534.


Wohlford, J., & Mueller, C. (2000). The debut of a new establishment survey: The Job Openings and Labor Turnover Survey at the Bureau of Labor Statistics. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Yansaneh, I., & Eltinge, J. (2001). Design Effect and Cost Issues for Surveys in Developing Countries. Proceedings of the Section on Survey Research Methods, American Statistical Association.


Zadrozny, P. (2001). An estimated autoregressive model for forecasting U.S. GDP Based on real-time data. Proceedings of the Section on Business and Economics Statistics, American Statistical Association.


Zadrozny, P. (2000). Modeling survey-error autocorrelations subject to time-in-sample effects for model-based seasonal adjustments. Proceedings of the Section on Business and Economics Statistics, American Statistical Association.


Zadrozny, P., & Chen, B. (1999). Estimation of capital and technology with a dynamic economic model. Proceedings of the Section on Business and Economic Statistics, American Statistical Association.

ATTACHMENT II


CONSULTANTS TO THE

BEHAVIORAL SCIENCE RESEARCH LABORATORY


Dr. Paul Biemer, Distinguished Fellow

Research Triangle Institute

3040 Cornwallis Road

Ragland Building

Research Triangle Park, NC 27709

(919) 541-6000


John J. Bosley, Ph.D.

Survey Methods Consulting

805 Union Avenue

Baltimore, MD 21211-2210

(410) 366-2344


Dr. Frederick G. Conrad

University of Michigan

Institute for Social Research

PO Box 1248

Room 4025

Ann Arbor, MI 48106

(734) 936-1019


ATTACHMENT III

CONSENT FORM


The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.


The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.


During this research you may be audio and/or videotaped, or you may be observed. If you do not wish to be taped, you still may participate in this research.


We estimate it will take you an average of [enter #] minutes to participate in this research (ranging from [enter #] minutes to [enter #] minutes).


Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.


Persons are not required to respond to this collection of information unless it displays a currently valid OMB control number. The OMB control number for this collection is 1220-0141, and it expires [enter date].


------------------------------------------------------------------------------------------------------------

I have read and understand the statements above. I consent to participate in this study.



___________________________________ ___________________________

Participant's signature Date



___________________________________

Participant's printed name



___________________________________

Researcher's signature



OMB Control Number: 1220-0141

Expiration Date: [enter expiration date]



PRIVACY ACT STATEMENT

In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.
