1220-0141 Supporting_Statement_Part_A (revised 2-23)


Cognitive and Psychological Research

OMB: 1220-0141











BUREAU OF LABOR STATISTICS

OMB CLEARANCE PACKAGE






for

CLEARANCE TO CONDUCT COGNITIVE AND PSYCHOLOGICAL RESEARCH

IN FY2009 THROUGH FY2011









Prepared by

BEHAVIORAL SCIENCE RESEARCH LABORATORY

OFFICE OF SURVEY METHODS RESEARCH

BUREAU OF LABOR STATISTICS



September 9, 2008



Request for Revision to Clearance to Conduct

Cognitive and Psychological Research


Abstract


This is a request for clearance by the Bureau of Labor Statistics' (BLS) Behavioral Science Research Laboratory (BSRL) to conduct research to improve the quality of data collection by examining the psychological and cognitive aspects of data collection methods and procedures. BLS staff employing state-of-the-art cognitive psychological testing methods will conduct these research and development activities. The feasibility and value of this approach to questionnaire construction, survey technology, and interview processes have been demonstrated over the past 20 years. The use of these techniques to improve the quality of data collection has been advocated by the Cognitive Aspects of Survey Methodology seminar sponsored by the National Academy of Sciences and by participants in a questionnaire design advisory conference. The planned research and development activities will be conducted during FY2009 through FY2011 with the goal of improving overall data quality through improved procedures.

Supporting Statement


A. Justification


1. Collection of Information


The Bureau of Labor Statistics' Behavioral Science Research Laboratory (BSRL) conducts psychological research focusing on the design and execution of the data collection process in order to improve the quality of data collected by the Bureau. BSRL conducts research aimed at improving data collection quality by assessing questionnaire/form management and administration, as well as issues which relate to interviewer training and interaction with respondents in the interview process. BSRL staff work closely with economists and/or program specialists responsible for defining the concepts to be measured by the Bureau of Labor Statistics' collection programs.


Both questionnaires and forms are used in the Bureau's surveys. Questionnaires specify the preferred wording of the questions to be asked, whereas forms specify the data items to be collected. Each possesses distinctive problems which in many cases can be related to respondent characteristics, survey content, or format of administration. Such problems impede the effectiveness of particular surveys and the mission of the Bureau in general.


The purpose of this request for clearance for cognitive psychological research and development activities by the BSRL is to enhance the quality of the Bureau's data collection procedures and overall data management. The basic goal of the BSRL is to improve through interdisciplinary research the quality of the data collected and published by the BLS. BLS is committed to producing the most accurate and complete data within the highest quality assurance guidelines. It is with this mission in mind, then, that BSRL was created to aid in the effort of not only maintaining, but also improving the quality of the data collection process.


This laboratory was established in 1988 by Commissioner Janet Norwood to employ behavioral science in investigating all forms of oral and written communication used in the collection and processing of survey data. This exploration includes all aspects of data collection, such as mode, manuals, and interviewer training. BSRL provides state-of-the-art services to numerous programs within BLS, the Department of Labor (DOL), and other agencies as requested, including questionnaire redesign efforts, survey updates, and improvements in the overall quality of data collection management. These efforts, in turn, increase data quality and reduce respondent burden. The techniques proposed here have been applied successfully to a number of BLS surveys; e.g., the Current Population Survey (CPS), Current Employment Statistics (CES), the Consumer Expenditure Survey (CE), and the American Time Use Survey (ATUS).


The research techniques and methods to be used in these studies will include analyses of questionnaire construction and the interview process, as well as survey technology. Within the structure of the questionnaire, analyses will be conducted in the following domains:


  1. Question Analysis--Evaluation of individual questionnaires appraising question intent, assessment of semantic clarity, and an examination of relationships between questions.


  2. Term Analysis--Evaluation of specific wording and phrases in terms of their psycholinguistic properties and an assessment of respondent interpretation of the meaning of these terms, at both the conscious and unconscious levels.


  3. Instruction Analysis--Inspection of instructions for their semantic clarity, the degree to which they reflect the stated intention of investigators, ease of interpretation, and other considerations which may elicit unambiguous and appropriate answers or behaviors from respondents or interviewers.


  4. Format Analysis--Review of questionnaires or subsets of questions for perceptual characteristics in order to facilitate better respondent comprehension and promote more active attention to the focus of the questionnaire or form.


Within the interview process, several analyses are conducted to assess nonverbal communication, interpersonal dynamics and symbolic interaction--the use of cultural symbols to make social statements. Staff conducts research to evaluate the overall effectiveness of data collection procedural characteristics, including:


  1. Interviewer Characteristics and Behavior Analysis - Study of the interviewer's presentation (e.g., appearance, manner, relation to the subject population) in order to enhance the interpersonal skills of interviewers in general and to develop and improve procedures for the training of interviewers.


  2. Respondent Characteristics and Behavior Analysis - Assessment of the social, cultural, and ethnic characteristics of the respondent and how these may bear upon interactions with the interviewer. Staff members also observe the behaviors of respondents for cues concerning their reactions to the interview process. Because BLS constantly collects data from different populations that change over time, the analysis of respondent characteristics needs frequent updating.


  3. Mode Characteristics - Examination of the unique properties of interviewer and/or respondent behavior as a function of the medium used to collect data; for example, self-administered interviews, personal interviews, telephone interviews, and interviews utilizing assistive technologies (e.g., CAPI, CASI, and CATI).


2. The Purpose of Data Collection


The purpose of BSRL's data collection is to improve Federal data collection processes through scientific research. Theories and methods of cognitive science provide essential tools for the development of effective questionnaires. For example, they can provide an understanding of how respondents comprehend survey questions, recall relevant facts from memory, make judgments, and respond. On the basis of such knowledge, questions can be tailored to increase the accuracy of the collected information and to reduce the respondent burden. Similar improvements can be made with respect to other aspects of the data collection process.


BSRL’s research contributes to BLS and to the entire survey research field. Research results are shared with the Bureau through seminars, training sessions, reports, publications, and presentations at professional conferences. BSRL staff have instituted a method of peer review to encourage high standards of social science research practice. A current bibliography of BSRL staff publications can be found in Attachment I.


The BSRL’s research is expected to 1) improve the data collection instruments employed by the Bureau, 2) increase the accuracy of the economic data produced by BLS and on which economic policy decisions are based, 3) increase the ease of administering survey instruments for both respondents and interviewers, 4) increase response rates in panel surveys as a result of reduced respondent burden, and 5) enhance BLS’s reputation, resulting in greater confidence in and respect for the survey instruments used by BLS.

The application of cognitive and psychological theories and methods to survey data collection is widespread and well established. The consequences of failing to scientifically investigate the data collection process are to lag in the use of accepted practices, to apply archaic survey development techniques based on intuition and trial and error, and ultimately to incur a severe cost in data quality and in burden to respondents, interviewers, and data users alike. For example, without knowledge of what respondents can be expected to remember about the past and how to ask questions that effectively aid in the retrieval of the appropriate information, survey researchers cannot ensure that respondents will not take shortcuts to avoid careful thought in answering the questions, or that they will not be subject to undue burden. Likewise, without investigation of the interviewers’ roles and abilities in the data collection process, survey researchers cannot ensure that interviewers will read their questions correctly with ease and fluency, or record the respondent’s answers correctly.


3. Use of Improved Technology


Staff members will design, conduct and interpret field and laboratory research that contributes new knowledge of the cognitive aspects of human behavior in relationship to questionnaire design and survey methodology. Cognitive psychological research methods in use include such techniques as: probing questioning, memory cueing, group discussion, and intensive interviewing. Depending on research goals, these methods may be used separately or in combination with one another.


The use of the laboratory approach has a number of advantages associated with it. These advantages include: rapid and in-depth testing of questionnaire items, a more detailed understanding of the respondents’ comprehension of concepts, and access to special populations who can be quickly recruited and tested. Different laboratory methods will be used in different studies depending on the aspects of the data collection process being studied. Computer technology will be used when appropriate to aid the respondents and interviewers and minimize burden.


Respondent burden in this collection will be held to a minimum. The proposed approach to research of data collection methods is designed to obtain the maximum amount of information for the minimum respondent burden. The research includes such methods as:


  1. interview pacing and latency classification,


  2. degree of structure within the interview format, group dynamics observation and recording of decision behaviors and/or the negotiation processes,


  3. structured tasks: card sorts and vignettes,


  4. expert analyses,


  5. experiments involving the administration of forms to study respondents, and


  6. usability tests of existing or proposed data collection and data dissemination systems.


4. Efforts to Avoid Duplication


This research does not duplicate any other research effort being done within the BLS. It will provide critical supplemental information beyond that currently available in the field of survey methodology as it applies to BLS surveys.


This research also does not duplicate any outside-of-government research effort, as its purpose is not to replicate survey research studies. The staff of BSRL is cognizant of current research being done in the field of cognitive psychology through attendance at conferences, research reported in professional journals, and through in-house staff meetings and peer review processes. There is no current similar existing data that can be used or modified for the purposes of improving the overall data collection process.


5. Collection of Information Involving Small Establishments

BSRL data collection efforts focus primarily on information gained through laboratory interviews, telephone interviews, and self-administered questionnaires with individuals recruited from the general public. However, in some instances organizational goals necessitate the involvement of businesses, state agencies, or other entities. To the extent these establishments are included, they normally are surveyed only once. Typically, this involves fewer than ten establishments per study, but when resources allow, additional establishments may be included to help generalize findings to the field.


6. The Consequences of Less Frequent Data Collection


The planned collection of data will allow BSRL to suggest modifications and alterations to survey research in an on-going manner. Because this collection is expected to be an on-going effort, it has the potential to have immediate impact on all survey collection methods within the Bureau's jurisdiction. Its delay would sacrifice potential gain in survey modification within the Bureau as a whole.


7. Special Circumstances


There are no special circumstances.


8. Federal Register and Consultation Outside BLS


Federal Register: No comments were received as a result of the Federal Register notice published at 73 FR 54623 on September 22, 2008. A correction notice was published at 73 FR 62325 on October 20, 2008, correcting the comment period to a 60-day notice; it had originally been cited as a 30-day notice.


Outside Consultation: Consultation with individuals outside BLS to obtain views on the availability of data, frequency of collection, suitability of particular laboratory methods, clarity of instructions and record keeping, disclosure, or reporting format, and on the data elements to be recorded and/or disclosed, is frequent and ongoing with the National Center for Health Statistics, the Bureau of the Census, the University of Maryland, the University of Michigan, and other federal agencies and institutions of higher learning. Consultants come from a wide range of subject areas and expertise. A list of individuals consulted in the past is attached to this document (Attachment II).


The individual responsible for the BSRL research efforts is:


Dr. William P. Mockovak

Director of Behavioral Science Research

Office of Survey Methods Research

Bureau of Labor Statistics

PSB Room 1950

2 Massachusetts Ave., NE

Washington, DC 20212
202-691-7414


9. Payment to Respondents


Respondents for activities conducted in the laboratory (that is, cognitive interviews and focus groups) under this clearance will receive a small stipend.  This practice has proven necessary and effective in recruiting subjects to participate in this small-scale research, and is also employed by the other Federal cognitive laboratories.  The incentive for participation in a cognitive interview is $40, and for participation in a focus group is $50-$75.  BLS may provide smaller incentives than these amounts at its discretion; however, any requests for larger amounts must be justified in writing to OMB.  

 

Respondents for methods that are generally administered as part of field test activities (that is, split sample tests, behavior coding of interviewer/respondent interaction, and respondent debriefing) or other research projects where BLS lab staff travel to and test in respondent’s residences will not receive payment unless there are extenuating circumstances that warrant it.  Such circumstances and proposed incentives must be justified in writing to OMB.  


10. Confidentiality and Privacy Concerns


The data collected from respondents will be tabulated and analyzed only for the purpose of evaluating the research in question. Laboratory respondents will be asked to read and sign a consent form (Attachment III) that explains the voluntary nature of the studies and the use of the information, notes that the interview may be taped or observed, and includes a Privacy Act statement. The Privacy Act Statement given to respondents is as follows:

In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.

Surveys with current OMB approval that are involved in BSRL studies and are collected outside the laboratory, such as mail or CATI surveys, use the pledge of the existing approved collection or the Privacy Act statement.


The Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA) safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality for exclusively statistical purposes by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of the BLS.


The Bureau of Labor Statistics Commissioner's Order No. 1-06, “Confidential Nature of BLS Statistical Data,” explains the Bureau's policy on confidentiality: “In conformance with existing law and Departmental regulations, it is the policy of the BLS that respondent identifiable information collected or maintained by, or under the auspices of, the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that will ensure that the information will be used only for statistical purposes and will be accessible only to authorized persons.”


11. Sensitive Questions


There are no questions of a sensitive nature.


12. Estimated Respondent Burden


The current OMB inventory for FY2008 is 1,200 hours. The FY2009, FY2010, and FY2011 estimated respondent burdens are as follows:


Fiscal Year          Response Burden (Hours)

FY2009               1,200
FY2010               1,200
FY2011               1,200

Total FY09-11        3,600


The burden hours are estimated based on the anticipation that the research will require approximately one hour per respondent. The studies will differ substantively from one another. The projects are expected to be complex, at times involving several cognitive testing methods to test the hypotheses of a given research question.


Coverage of the Estimates


The estimates cover the time that each respondent will spend answering questions, including the debriefing procedure concerning the cognitive testing procedure used. The time required to travel to the laboratory, if needed, is not covered, since distances and modes of transportation are unknown. No retrieval of information by respondents is anticipated, although it is possible that validation of data at some point may require respondents to keep and check records. In this case, experiments will be designed to include retrieval time.




Basis for the Estimate


These estimates are based on the BSRL’s previous experience in conducting such research under the existing OMB Clearance 1220-0141, and on expectations concerning the research projects to be conducted in the next 3 years. BSRL staff and its laboratory facilities (especially the usability lab) have been increasingly utilized by both program offices and outside agencies, and it is anticipated that this trend will continue. The estimates are also based on the experience of other government agencies (such as the National Center for Health Statistics' study of the Cognitive Aspects of Survey Methods 1987) which have conducted cognitively oriented questionnaire design research.

Annualized Cost to Respondents


The estimated dollar cost for these individuals is based on average hourly earnings of employed wage and salaried workers, $17.42 per hour, taken from 2007 Current Employment Statistics Program data. Using the $17.42 per hour figure, the annualized cost to the respondents is $20,904 for FY2009 – FY2011 (based on 1,200 burden hours annually).

13. Total Annual Cost Burden


a) There will be no total capital and start-up cost component for respondents or record keepers resulting from the collection of information.


b) The respondents and record keepers will have no expenses for operation and maintenance or purchase of services resulting from the collection of information.


14. Cost to the Federal Government


The maximum cost to the Federal Government is $48,000 annually for FY2009, FY2010, and FY2011. Those costs consist entirely of the $35 reimbursements and $5 travel allowances provided to respondents, and of newspaper advertisement costs. Other costs, such as operational expenses (e.g., equipment, overhead, printing, and support staff) and any other expense that would not have been incurred without the paperwork burden, are in place as a function of the laboratory proper and are not contingent on, or necessary to, this research.


15. Changes in Burden


This is a request for extension to the existing OMB Clearance 1220-0141 in order to continue the research mission for another 3 years. No change in the number of burden hours is being requested.


16. Tabulation, Analysis and Publication Plans, and Project Schedule


This clearance request is for survey methodological and questionnaire quality assurance activities, which include the exploratory activities leading to the evaluation of data quality. Both quantitative and qualitative analyses are planned for the evaluation of these activities, depending on the circumstances.

 

The results of these investigations will be used primarily to improve the quality of data collection and assure total collection quality as it relates to data management.  Because BLS is using the latest techniques and cognitive psychological testing methodology, methodological papers may be written that include some tallies of response problems, recall strategies, or results from other testing procedures used, etc.  However, BLS will not publish any reports of the substantive results collected under this clearance.  The methodological results may be included as a methodological appendix or footnote in a report containing data from a larger data collection effort.  The methodological results of this research may be prepared for presentation at professional meetings or publication in professional journals.

Project Schedule


This project schedule calls for laboratory interviews to commence once OMB approval is received.


The proposed time schedule is continuously ongoing in nature, with research publication dates dependent on data collection as defined in each researcher's proposal and the subsequent results.


17. Expiration for OMB Approval


The BSRL is not seeking approval to avoid displaying the expiration date for OMB approval of the information collection.


18. Exception to Certification Statement


There are no exceptions to the certification statement “Certification for Paperwork Reduction Act Submissions.”

ATTACHMENT I


BEHAVIORAL SCIENCE RESEARCH LAB BIBLIOGRAPHY


Bosley, John, Dashen, Monica, and Fox, Jean (1999), "Effects on List Length and Recall Accuracy of Order of Asking Follow-up Questions About Lists of Items Recalled in Surveys," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Bosley, J.J., Eltinge, J.L., Fox, J.E., and Fricker, S.S. (2003). Conceptual and Practical Issues in the Statistical Design and Analysis of Usability Tests. Presented at The Federal Committee on Statistical Methodology’s Research Conference, Arlington, VA.



Butani, Shail J., and McElroy, Michael (1999), "Managing Various Customer Needs for Occupational Employment Statistics and Wage Survey," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Butani, Shail, Robertson, Kenneth and Mueller, Kirk (1998), “Assigning Permanent Random Numbers to the Bureau of Labor Statistics Longitudinal (Universe) Data Base,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 451-462.


Chen, Baoline and Zadrozny, Peter (1998), "An Extended Yule-Walker Method for Estimating a Vector Autoregressive Model with Mixed-Frequency Data," Advances in Econometrics: Messy Data--Missing Observations, Outliers, and Mixed-Frequency Data, Vol. 13, T.B. Fomby and R.C. Hill (eds.), JAI Press Inc., Greenwich, CT.


Cho, Moon Jung and Eltinge, John (2001), “Diagnostics for Evaluation of Superpopulation Models for Variance Estimation Under Systematic Sampling,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Clements, Joseph (2000), "Protecting Data in Two-Way Statistical Tables Using Network Flow Methodology," Proceedings of the Section on Government Statistics, American Statistical Association.


Cohen, Stephen (1997), “The National Compensation Survey: The New BLS Integrated Compensation Program,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 451-456.


Cohen, Stephen, Wright, Tommy, Vangel, Mark, and Wacholder, Sholom (2000), "Postdoctoral Research Programs in the Federal Statistical System," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Conrad, Frederick, Blair, Johnny and Tracy, Elena (1999), “Verbal Reports are Data! A Theoretical Approach to Cognitive Interviews,” Proceedings of the Federal Committee on Statistical Methodology Research Conference.


Conrad, Frederick, Brown, Norman and Dashen, Monica (1999), “Estimating the Frequency of Events from Unnatural Categories,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Conrad, Frederick G. and Schober, Michael F. (1999), “Conversational Interviewing and Data Quality,” Proceedings of the Federal Committee on Statistical Methodology Research Conference.


Couper, Mick and Stinson, Linda (1999), “Completion of Self-Administered Questionnaires in a Sex Survey,” The Journal of Sex Research, vol. 36, pp. 321-330.


Dashen, Monica and Fricker, Scott (1998), “Taking Different Perspectives on a Survey Question,” Journal of Official Statistics, 17(4), 457-479.


Dashen, Monica (2000), "Improving Purchase Recollection," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dashen, Monica and Sangster, Roberta L. (1997), “Does Item Similarity and Word Order Influence Comparative Judgments?,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dippo, Cathryn S. and Gillman, Daniel W. (1999), "The Role of Metadata in Statistics," paper presented at the UN/ECE Work Session on Statistical Metadata.


Dippo, Cathryn S. and Hoy, Easley (1997), “Providing Metadata to Survey Staff Via Internet,” presented at the ISI 51st Session, Istanbul, Turkey.


Dippo, Cathryn S. and Tupek, Alan (1997), “Quantitative Literacy: New Website for Federal Statistics Provides Research Opportunities,” D-Lib Magazine, December 1997.


Dixon, John, (2001), “Using 'Gross Flows' to Evaluate the Impact of Nonresponse on Federal Household Survey Estimates,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dixon, John (2000), "The Relationship Between Household Moving, Nonresponse, and the Unemployment Rate in the Current Population Survey," Proceedings of the Section on Government Statistics, American Statistical Association.


Dorfman, Alan H. (1999), "The Stochastic Approach to Price Indices," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dorfman, Alan H. (1999), "Issues in the Analysis of Complex Surveys," Proceedings Book 2, Topic 67 (Bulletin of the International Statistical Institute).


Dorfman, Alan H. (2000), "Non-Parametric Regression for Estimating Totals in Finite Populations," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dorfman, Alan, Leaver, Sylvia, and Lent, Janice (1999), "Some Observations on Price Index Estimators," Statistical Policy Working Paper 29, Part 2 of 5, pages 56-65.


Dorfman, Alan H. and Valliant, Richard (1997), “The Hajek Estimator Revisited,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 760-765.

Eltinge, John, (2001), “Accounting for Design and Superpopulation Components of Variability in Approximations for Design Effects, Generalized Variance Functions and Related Quantities,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Eltinge, John (2000), "Implications of Model Validation Criteria for the Performance of Small Domain Estimation Methods," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Eltinge, John (1999), "Evaluation and Reduction of Cluster-Level Identification Risk for Public-Use Survey Microdata Files," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, Lawrence and Paben, Steven (2000), "Maximizing and Minimizing Overlap When Selecting Any Number of Units per Stratum Simultaneously for Two Designs with Different Stratifications," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, Lawrence R. (2001), “The History and Mathematics of Apportionment of the U.S. House of Representatives,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, Lawrence R. (2001), “Retrospective Assignment of Permanent Random Numbers for Ohlsson's Exponential Sampling Overlap Maximization Procedure for Designs with More than One Sample Unit per Stratum,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, Lawrence R. (1999), “The Maximization and Minimization of Sample Overlap Problems: A Half Century of Results,” Bulletin of the International Statistical Institute, Proceedings Tome LVII, Book 2, 293-296.


Ernst, Lawrence R., Valliant, Richard and Casady, Robert J. (1998), “Permanent and Collocated Random Number Sampling and the Coverage of Births and Deaths,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 457-462.


Ernst, Lawrence R. and Ponikowski, Chester H. (1998), “Selecting the Employment Cost Index Survey Sample as a Subsample of the National Compensation Survey,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 517-522.


Ernst, Lawrence R. (1998), “Maximizing and Minimizing Overlap When Selecting a Large Number of Units per Stratum with Simultaneous Selection,” Journal of Official Statistics, 14, 297-314. Also in Proceedings of the Section on Survey Research Methods, American Statistical Association (1997), 475-480.


Esposito, James L. (1999), "Evaluating the Displaced Worker/Job-Tenure Supplement to the CPS: An Illustration of Multimethod Quality Assessment Research," paper presented at the Conference of the Federal Committee on Statistical Methodology.


Esposito, James L. and Fisher, Sylvia (1998), “A Summary of Quality-Assessment Research Conducted on the 1996 Displaced-Worker/Job-Tenure/Occupational-Mobility Supplement,” BLS Statistical Notes (No. 43), Bureau of Labor Statistics, Washington, DC.


Fisher, Sylvia K. (2001), “A Clinic-Based Needs Assessment Study of Women Who Partner with Women: The Relationship Between Sexual Orientation/Gender Identity, Health-Seeking Behaviors and Perceived Quality of Care Issues,” Proceedings of the Section on Committee on Gay and Lesbian Concerns in Statistics, American Statistical Association.


Fisher, Sylvia K. (2000), "Improving the Quality of Data Reporting in Business Surveys: Discussant Comments," Proceedings of the International Conference on Establishment Surveys II.


Fisher, Sylvia K., Ramirez, Carl, Stanley McCarthy, Jaki, and Shimizu, Iris (2000), "Examining Standardization of Response Rate Measures in Establishment Surveys," Proceedings of the Council of Professional Associations on Federal Statistics.


Fox, J.E. (2005) How Do You Resolve Conflicting Requirements from Different User Groups? Presented at the Usability Professionals’ Association Annual Meeting. (http://www.upassoc.org/usability_resources/conference/2005/im_fox.html.)


Fox, J.E., Mockovak, W., Fisher, S.K., Rho, C. (2003). Usability Issues Associated with Converting Establishment Surveys to Web-Based Data Collection. Presented at The Federal Committee on Statistical Methodology’s Research Conference, Arlington, VA.


Fox, J.E., Fisher, S.K., Tucker, N.C., Sangster, R.L., Rho, C. (2003). A Qualitative Approach to the Study of BLS Establishment Survey Nonresponse. Presented at The 163rd Annual Joint Statistical Meetings, San Francisco, CA, August 4, 2003.


Fox, J.E. (2003). Designing the User Interface of a Data Collection Instrument for the Consumer Price Index. CHI 2003, Ft. Lauderdale, FL.


Fricker, S., Galesic, M., Tourangeau, R. and Yan, T. (2005). “An Experimental Comparison of Web and Telephone Surveys,” Public Opinion Quarterly, 69, 370-392.


Fricker, S. (2005). “The Relation Between Response Propensity and Data Quality in the American Time Use Survey,” American Statistical Association, Minneapolis, Minnesota.


Fricker, S. and Dashen, M., (2001), “How Do People Interpret Open-ended Categorical Questions?” Paper Presented at the American Association for Public Opinion Research Conference.


Garner, Thesia, Stinson, Linda and Shipp, Stephanie (1998), “Subjective Assessments of Economic Well-Being: Cognitive Research at the U.S. Bureau of Labor Statistics,” Focus, vol. 19, pp. 43-46.


Goldenberg, Karen, and Phillips, Mary Anne (2000), "Now That the Study Is Over, What Have You Told Us? Identifying and Correcting Measurement Error in the Job Openings and Labor Turnover Survey Pilot Test," Paper presented at the International Conference on Establishment Surveys II, Buffalo, NY, June 2000 (proceedings forthcoming).


Goldenberg, Karen L., and Stewart, Jay (1999), "Earnings Concepts and Data Availability for the Current Employment Statistics Survey: Findings from Cognitive Interviews," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Goldenberg, Karen L., Levin, Kerry, Hagerty, Tracey, Shen, Ted and Cantor, David (1997), "Procedures for Reducing Measurement Error in Establishment Surveys," Proceedings of the Section on Survey Research Methods, American Statistical Association, 994-999.


Gregg, Valerie and Dippo, Cathryn S. (1999), "FedStats: Partnering to Create the National Statistical Information Infrastructure of the 21st Century," Proceedings of the Section on Government Statistics, American Statistical Association.


Guciardo, Christopher (2001), “Estimating Variances in the National Compensation Survey Using Balanced Repeated Replication,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Harris-Kojetin, Brian A., and Fricker, Scott (1999), "The Influence of Environmental Characteristics on Survey Cooperation: A Comparison of Metropolitan Areas," Paper presented at the International Conference on Survey Nonresponse, Portland, OR.


Heo, Sunyeong and Eltinge, John (1999), "The Analysis of Categorical Data from a Complex Sample Survey: Chi-squared Tests for Homogeneity Subject to Misclassification Error," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Kydoniefs, Leda and Stinson, Linda (1999), "Standing on the Outside, Looking In: Tapping Data Users to Compare and Review Surveys," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Lee, Sangrae and Eltinge, John (1999), "Diagnostics for the Stability of an Estimated Misspecification Effect Matrix," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Lent, Janice (2000), "Chain Drift in Some Price Index Estimators," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Lent, Janice, Miller, Stephen, Duff, Martha and Cantwell, Patrick (1998), “Comparing Current Population Survey Estimates Computed Using Different Composite Estimators,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 564-569.


Levi, Michael (1997), “A Shaker Approach to Web Site Design,” Proceedings of the Section on Statistical Computing, American Statistical Association.



Mason, Charles, Sangster, Roberta and Wirth, Cassandra (2001), “Comparison of Final Disposition Codes Used for Attrition Calculations for Telephone Samples,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


McKay, Ruth B. (1997), “The Multiracial Category as ‘Wild Card’ in Racial Questionnaire Design,” Proceedings of the Section on Survey Research Methods (AAPOR), American Statistical Association, 916-921.



Meekins, B. and Sangster, R. (2004). "Telephone Point of Purchase Advance Letter Study: Technical Report for the Cost-Weights Division for the Consumer Price Index."

Mockovak, W., and Fox, J.E. (2002). Approaches for incorporating user-centered design into CAI development. Proceedings of the International Conference on Questionnaire Development, Evaluation, and Testing Methods (QDET).


Moore, Jeff, Stinson, Linda and Welniak, Ed. Jr. (2001), “Income Measurement Error in Surveys: A Review,” Journal of Official Statistics, vol. 16.


Moore, Jeff, Stinson, Linda, and Welniak, Ed, Jr. (1999), “Income Reporting in Surveys: Cognitive Issues and Measurement Error.” In Monroe Sirken, Douglas Herrmann, Susan Schecter, Norbert Schwarz, Judith Tanur, and Roger Tourangeau (eds.), Cognitive and Survey Research. New York: John Wiley & Sons, Inc.


Moy, Luann and Stinson, Linda (1999), "Two Sides of a Single Coin?: Dimensions of Change in Different Settings," Proceedings of the Section on Survey Research Methods, American Statistical Association.


O'Neill, G. & Vernon, M. (2005). "Revising the American Time Use Survey Advance Materials." The International Field Directors and Technologies Conference, Miami, FL.


Park, Inho and Eltinge, John (1999), "Fitting Complex Survey Data to the Tail of a Parametric Distribution," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Park, Inho and Eltinge, John (2001), "The Effect of Cluster Sampling on the Covariance and Correlation Matrices of Sample Distribution Functions," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Parsons, Van and Eltinge, John (1999), "Stratum Partition, Collapse and Mixing in Construction of Balanced Repeated Replication Variance Estimators," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Pfeffermann, Danny and Scott, Stuart (1997), “Variance Measures for X-11 Seasonally Adjusted Estimators; Some New Developments with Application to Labor Force Series,” Proceedings of the Section on Business & Economic Statistics, American Statistical Association, 211-216.


Pfeffermann, Danny, Tiller, Richard, and Zimmerman, Tamara (2000), "Accounting for Sampling Error Autocorrelations Towards Signal Extraction from Models with Sampling Error," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Polivka, Anne E. and West, Sandra A. (1997), “Earnings Data from the Current Population Survey After the Redesign,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Poole, R. and Sangster, R. (2005). "Housing Tenure Study: Technical Report for Survey Methods Division of the Consumer Price Index."


Presser, Stanley and Stinson, Linda (1998). “Data Collection Mode and Social Desirability Bias in Self-Reported Religious Attendance,” American Sociological Review, vol. 63, pp. 137-145.


Rips, L. J., Conrad, F.G. & Fricker, S. S. (2004). Straightening out the seam effect in panel surveys. Public Opinion Quarterly, 67, 522-554.


Rips, Lance, Conrad, Frederick and Fricker, Scott (2000), "Unraveling the Seam Effect," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sangster, R (2005). "Consumer Price Index (CPI) Housing Survey Sample Attrition." Presentation for the 16th International Workshop on Household Survey Nonresponse, Tällberg, Sweden, August 28-31, 2005


Sangster, R., and Meekins, B. (2004). "Assessing Data Quality for Hard to Reach and Reluctant Respondents in an RDD Telephone Panel Survey." Poster for the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ.


Sangster, R and Meekins, B. (2004). "Modeling the Likelihood of Interviews and Refusals: Using Call History Data to Improve Efficiency of Effort in a National RDD Survey." Proceedings of the Section on Survey Research Methods, American Statistical Association, Toronto, Canada.


Sangster, R and Meekins, B. (2003). "Data Concerns for Hard to Reach and Reluctant Respondents in Telephone Panel Surveys." Presentation for the 14th International Workshop on Household Survey Nonresponse, Leuven, Belgium, 22-24 September 2003.


Sangster, Roberta and Willits, Fern (2001), “Evaluating Numeric Rating Scales: Replicated Results,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Schecter, Susan, Stinson, Linda and Moy, Luann (1999), "Developing and Testing Aggregate Reporting Forms for Data on Race and Ethnicity," Paper presented at the Conference of the Federal Committee on Statistical Methodology.


Schober, M.F., Conrad, F.G. and Fricker, S.S. (2004). Misunderstanding standardized language in research interviews. Applied Cognitive Psychology, 18, 169-188.

Schober, Michael F., Conrad, Frederick G., and Fricker, Scott S. (2000), “When and How Should Survey Interviewers Clarify Question Meaning?” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Scott, Stuart and Zadrozny, Peter (1999), "Aggregation & Model-based Methods in Seasonal Adjustment of Labor Force Series," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Schwartz, Lisa and Paulin, Geoffrey (2000), “Improving Response Rates to Income Questions: A Comparison of Range Techniques,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Shettle, Carolyn, Ahmed, Susan, Cohen, Steve, Miller, Renee and Waite, Preston (1997), “Improving Statistical Reports from Government Agencies Through the Reports Review Process,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sirken, Monroe, Tanur, Judith, Tucker, Clyde N. and Martin, Elizabeth A. (1997), “Synthesis of CASM II: Current and Future Directions in Interdisciplinary Research on Cognitive Aspects of Survey Methods,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 1-10.


Stamas, George, Goldenberg, Karen, Levin, Kerry and Cantor, David (1997), “Sampling for Employment at New Establishments in a Monthly Business Survey,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 279-284.


Steiger, Darby Miller, Mainieri, Tina, and Stinson, Linda (1997), “Subjective Assessments of Economic Well-Being: Understanding the Minimum Income Question,” Proceedings of the Section on Survey Methods Research, American Statistical Association, 899-903.


Stewart, Jay, Goldenberg, Karen, Gomes, Tony, and Manser, Marilyn (2000), “Collecting All-Employee Earnings Data in the Current Employment Statistics Survey.”


Stewart, Jay and Joyce, Mary (1999), "Why Do We Need Time-Use Data?," Proceedings of the Section on Social Statistics, American Statistical Association.


Stewart, Jay and Frazis, Harley (1998), “Keying Errors Caused by Unusual Response Categories: Evidence from a Current Population Survey Test,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 131-134.


Stinson, Linda (2000), “‘Day of Week’ Differences and Implications for Time-Use Research,” Proceedings of the Section on Social Statistics, American Statistical Association.


Stinson, Linda (1999), "Measuring How People Spend Time," Proceedings of the Section on Social Statistics, American Statistical Association.


Stinson, Linda (1997), “Using the Delighted/Terrible Scale to Measure Feelings About Income and Expenses,” Proceedings of the Section on Survey Methods Research, American Statistical Association, 904-909.


Sukasih, Amang and Eltinge, John (2001), “A Goodness-of-Fit Test for Response Probability Models in the Analysis of Complex Survey Data,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sverchkov, Michael, and Pfeffermann, Danny (2000), "Prediction of Finite Population Totals Under Informative Sampling Utilizing the Sample Distribution," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Swanson, David C., Hauge, Sharon K., and Schmidt, Mary Lynn (1999), "Evaluation of Composite Estimation Methods for Cost Weights in the CPI," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Tourangeau, Roger, Shapiro, Gary, Kearney, Anne and Ernst, Lawrence (1997), "Who Lives Here? Survey Undercoverage and Household Roster Questions," Journal of Official Statistics, 13, 1-18.


Tucker, Clyde (2001), “Using the New Race and Ethnicity Data,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Valliant, Richard and Dorfman, Alan H. (1997), “Stratification on a Size Variable Revisited,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 766-771.


Vernon, M. (2005). "Pre-testing Sensitive Questions: Perceived Sensitivity, Comprehension, and Order Effects of Questions about Income and Weight." American Statistical Association, Minneapolis, Minnesota.


Walker, Ed and Mesenbourg, Tom (1997), “The Census Bureau's Business Register: Quality Issues and Observations,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Walker, Martha A. C. and Bergman, Bruce (1997), “Estimates of Year-to-Year Change in Costs per Hour Worked from the Employer Costs for Employee Compensation Survey,” Proceedings of the Business and Economic Statistics Section, American Statistical Association.


Wang, Suojin, Dorfman, Alan H., and Chambers, Raymond (1999), "Maximum Likelihood Under Informative Sampling," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Weber, Wolf (1999), "A Method of Microdata Disclosure Limitation Based on Noise Infusion and Outlier Substitution," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Werking, George S. Jr. (1997), “Overview of the CES Redesign Research,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 512-516.


West, Sandra A., Kratzke, Tran and Grden, Paul (1997), “Estimators for Average Hourly Earnings and Average Weekly Hours for the Current Employment Statistics Survey,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 529-534.


Wohlford, John and Mueller, Charlotte (2000), "The Debut of a New Establishment Survey: The Job Openings and Labor Turnover Survey at the Bureau of Labor Statistics," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Yansaneh, Ibrahim, and Eltinge, John (2001), "Design Effect and Cost Issues for Surveys in Developing Countries," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Zadrozny, Peter (2001), “An Estimated Autoregressive Model for Forecasting U.S. GDP Based on Real-time Data,” Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Zadrozny, Peter (2000), "Modelling Survey-Error Autocorrelations Subject to Time-in-Sample Effects for Model-Based Seasonal Adjustments," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Zadrozny, Peter and Chen, Baoline (1999), "Estimation of Capital and Technology with a Dynamic Economic Model," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.



Zarate, Alvan, Greenberg, Brian, Bournazian, Cohen, Stephen and Eden, Donna (2001), “Privacy, Confidentiality and the Protection of Health Data - A Statistical Perspective,” Proceedings of the Section on Government Statistics, American Statistical Association.

ATTACHMENT II


CONSULTANTS TO THE

BEHAVIORAL SCIENCE RESEARCH LABORATORY


Dr. Paul Biemer, Distinguished Fellow

Research Triangle Institute

3040 Cornwallis Rd.

Ragland Building

Research Triangle Park, NC 27709

(919) 541-6000


Dr. Frederick G. Conrad

University of Michigan

Institute for Social Research

PO Box 1248

Room 4025

Ann Arbor, MI 48106

734-936-1019


Carl Ramirez, Senior Design Methodologist

Government Accountability Office

441 G St., NW Room 6K17R

Washington, DC 20548

(202) 512-3721


Kristin Stettler

Survey Methodologist, Establishment Survey Methods Staff

U.S. Census Bureau

FOB4-3110

Washington, DC 20296

301-763-7596


Diane Willimack

Chief, Establishment Survey Methods Staff, ESMPD

U.S. Census Bureau

4700 Silver Hill Road #6200

Washington, DC 20233-6200

ATTACHMENT III

Consent Form


The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.


The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.


During this research you may be audio and/or videotaped, or you may be observed. If you do not wish to be taped, you still may participate in this research.


We estimate it will take you an average of [enter #] minutes to participate in this research (ranging from [enter #] minutes to [enter #] minutes).


Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.


Persons are not required to respond to this collection of information unless it displays a currently valid OMB control number. The OMB control number for this collection is 1220-0141, and it expires [enter date].


------------------------------------------------------------------------------------------------------------

I have read and understand the statements above. I consent to participate in this study.



___________________________________ ___________________________

Participant's signature Date



___________________________________

Participant's printed name



___________________________________

Researcher's signature



OMB Control Number: 1220-0141

Expiration Date: [enter expiration date]



PRIVACY ACT STATEMENT

In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.




File Type: application/msword
File Title: BUREAU OF LABOR STATISTICS
Author: Nora Kincaid
Last Modified By: Nora Kincaid
File Modified: 2009-02-23
File Created: 2009-02-23
