Cognitive and Psychological Research

Supporting Statement (1220-0141)

OMB: 1220-0141











BUREAU OF LABOR STATISTICS

OMB CLEARANCE PACKAGE






for

CLEARANCE TO CONDUCT COGNITIVE AND PSYCHOLOGICAL RESEARCH

IN FY2006 THROUGH FY2008









Prepared by

BEHAVIORAL SCIENCE RESEARCH LABORATORY

OFFICE OF SURVEY METHODS RESEARCH

BUREAU OF LABOR STATISTICS



November 17, 2005



Request for Revision to Clearance to Conduct

Cognitive and Psychological Research


Abstract


This is a request for clearance for the Bureau of Labor Statistics' (BLS) Behavioral Science Research Laboratory (BSRL) to conduct research to improve the quality of data collection by examining the psychological and cognitive aspects of data collection methods and procedures. BLS staff employing state-of-the-art cognitive psychological testing methods will conduct these research and development activities. The feasibility and value of this approach to questionnaire construction, survey technology, and interview processes have been demonstrated over the past 15 years. The use of these techniques to improve the quality of data collection has been advocated by the Cognitive Aspects of Survey Methodology seminar sponsored by the National Academy of Sciences and by participants in a questionnaire design advisory conference. The planned research and development activities will be conducted during FY2006 through FY2008 with the goal of improving overall data quality through improved procedures.

Supporting Statement


A. Justification


1. Collection of Information


The Bureau of Labor Statistics' Behavioral Science Research Laboratory conducts psychological research focusing on the design and execution of the data collection process in order to improve the quality of data collected by the Bureau. BSRL conducts research aimed at improving data collection quality by assessing questionnaire/form management and administration, as well as issues which relate to interviewer training and interaction with respondents in the interview process. BSRL staff work closely with economists and/or program specialists responsible for defining the concepts to be measured by the Bureau of Labor Statistics' collection programs.


Both questionnaires and forms are used in the Bureau's surveys. Questionnaires specify the preferred wording of the questions to be asked, whereas forms specify the data items to be collected. Each possesses distinctive problems which in many cases can be related to respondent characteristics, survey content, or format of administration. Such problems impede the effectiveness of particular surveys and the mission of the Bureau in general.


The purpose of this request for clearance for cognitive psychological research and development activities by the BSRL is to enhance the quality of the Bureau's data collection procedures and overall data management. The basic goal of the BSRL is to improve through interdisciplinary research the quality of the data collected and published by the BLS. BLS is committed to producing the most accurate and complete data within the highest quality assurance guidelines. It is with this mission in mind, then, that BSRL was created to aid in the effort of not only maintaining, but also improving the quality of the data collection process.


This laboratory was established in 1988, by Commissioner Janet Norwood, to employ behavioral science to investigate all forms of oral and written communication used in the collection and processing of survey data. This exploration also includes all aspects of data collection, such as mode, manuals, and interviewer training. BSRL provides state-of-the-art services to numerous programs within BLS, the DOL, and other agencies as requested, including questionnaire redesign, survey updates, and improvements in the overall quality of data collection management. These efforts, in turn, increase data quality and reduce respondent burden. The techniques proposed here have been applied successfully to a number of ongoing BLS surveys, e.g., the Current Population Survey (CPS), Current Employment Statistics (CES), Consumer Expenditure Survey (CE), and the American Time Use Survey (ATUS).


The research techniques and methods to be used in these studies will include analyses of questionnaire construction and the interview process, as well as survey technology. Within the structure of the questionnaire, analyses will be conducted in the following domains:


  1. Question Analysis--Evaluation of individual questionnaires appraising question intent, assessment of semantic clarity, and an examination of relationships between questions.


  2. Term Analysis--Evaluation of specific wording and phrases in terms of their psycholinguistic properties and an assessment of respondent interpretation of the meaning of these terms, at both the conscious and unconscious levels.


  3. Instruction Analysis--Inspection of instructions for their semantic clarity, the degree to which they reflect the stated intention of investigators, ease of interpretation, and other considerations which may elicit unambiguous and appropriate answers or behaviors from respondents or interviewers.


  4. Format Analysis--Review of questionnaires or subsets of questions for perceptual characteristics in order to facilitate better respondent comprehension and promote more active attention to the focus of the questionnaire or form.


Within the interview process, several analyses are conducted to assess nonverbal communication, interpersonal dynamics and symbolic interaction--the use of cultural symbols to make social statements. Staff conducts research to evaluate the overall effectiveness of data collection procedural characteristics, including:


  1. Interviewer Characteristics and Behavior Analysis--Study of interviewer presentation (e.g., appearance, manner, relation to subject population) in order to enhance the interpersonal skills of interviewers in general and to develop and improve procedures for interviewer training.


  2. Respondent Characteristics and Behavior Analysis--Assessment of the social, cultural, and ethnic characteristics of respondents and how these may bear upon interactions with the interviewer. Staff members also observe the behaviors of respondents for cues concerning their reactions to the interview process. Because BLS constantly collects data from different populations that change over time, the analysis of respondent characteristics needs frequent updating.


  3. Mode Characteristics--Examination of the unique properties of interviewer and/or respondent behavior as a function of the media used to collect data; for example, self-administered interviews, personal interviews, telephone interviews, and interviews utilizing assistive technologies (e.g., CAPI, CASI, and CATI).


2. The Purpose of Data Collection


The purpose of BSRL's data collection is to improve Federal data collection processes through scientific research. Theories and methods of cognitive science provide essential tools for the development of effective questionnaires. For example, they can provide an understanding of how respondents comprehend survey questions, recall relevant facts from memory, make judgments, and respond. On the basis of such knowledge, questions can be tailored to increase the accuracy of the collected information and to reduce the respondent burden. Similar improvements can be made with respect to other aspects of the data collection process.


BSRL’s research contributes to BLS and to the entire survey research field. Research results are shared with the Bureau through seminars, training sessions, reports, publications, and presentations at professional conferences. The BSRL Staff has instituted a method of peer review to encourage high standards of social science research practice. A current bibliography of BSRL staff publications can be found in Attachment I.


The BSRL’s research is expected to 1) improve the data collection instruments employed by the Bureau, 2) increase the accuracy of the economic data produced by BLS and on which economic policy decisions are based, 3) increase the ease of administering survey instruments for both respondents and interviewers, 4) increase response rates in panel surveys as a result of reduced respondent burden, and 5) enhance BLS’s reputation, resulting in greater confidence in and respect for the survey instruments used by BLS.

The application of cognitive and psychological theories and methods to survey data collection is widespread and well established. The consequences of failing to scientifically investigate the data collection process are to lag in the use of accepted practices, to apply archaic survey development techniques based on intuition and trial and error, and ultimately to incur a severe cost in data quality and in burden to respondents, interviewers, and data users alike. For example, without knowledge of what respondents can be expected to remember about the past and how to ask questions that effectively aid the retrieval of the appropriate information, survey researchers cannot ensure that respondents will not take shortcuts to avoid careful thought in answering the questions, or will not be subjected to undue burden. Likewise, without investigation of interviewers’ roles and abilities in the data collection process, survey researchers cannot ensure that interviewers will read their questions correctly with ease and fluency, or record the respondent’s answers correctly.


3. Use of Improved Technology


Staff members will design, conduct and interpret field and laboratory research that contributes new knowledge of the cognitive aspects of human behavior in relationship to questionnaire design and survey methodology. Cognitive psychological research methods in use include such techniques as: probing questioning, memory cueing, group discussion, and intensive interviewing. Depending on research goals, these methods may be used separately or in combination with one another.


The laboratory approach has a number of advantages, including rapid and in-depth testing of questionnaire items, a more detailed understanding of respondents’ comprehension of concepts, and access to special populations who can be quickly recruited and tested. Different laboratory methods will be used in different studies depending on the aspects of the data collection process being studied. Computer technology will be used when appropriate to aid respondents and interviewers and minimize burden.


Respondent burden in this collection will be held to a minimum. The proposed approach to research of data collection methods is designed to obtain the maximum amount of information for the minimum respondent burden. The research includes such methods as:


  1. interview pacing and latency classification,


  2. degree of structure within the interview format, group dynamics observation, and recording of decision behaviors and/or negotiation processes,


  3. structured tasks: card sorts and vignettes,


  4. expert analyses,


  5. experiments involving the administration of forms to study respondents, and


  6. usability tests of existing or proposed data collection and data dissemination systems.


4. Efforts to Avoid Duplication


This research does not duplicate any other research effort within the BLS. It will provide critical supplemental information beyond that currently available in the field of survey methodology as it applies to BLS surveys.


This research also does not duplicate any outside-of-government research effort, as its purpose is not to replicate survey research studies. The staff of BSRL is cognizant of current research being done in the field of cognitive psychology through attendance at conferences, research reported in professional journals, and through in-house staff meetings and peer review processes. There is no current similar existing data that can be used or modified for the purposes of improving the overall data collection process.


5. Collection of Information Involving Small Establishments

BSRL data collection efforts focus primarily on information gained through laboratory interviews, telephone interviews, and self-administered questionnaires with individuals recruited from the general public. However, in some instances organizational goals necessitate the involvement of businesses, state agencies, or other entities. To the extent these establishments are included they normally are surveyed only once. Typically, this involves fewer than ten establishments per study, but when resources allow, additional establishments may be included to help generalize findings to the field.


6. The Consequences of Less Frequent Data Collection


The planned collection of data will allow BSRL to suggest modifications and alterations to survey research on an ongoing basis. Because this collection is expected to be an ongoing effort, it has the potential to have an immediate impact on all survey collection methods within the Bureau's jurisdiction. Delay would sacrifice potential gains in survey modification within the Bureau as a whole.


7. Special Circumstances


There are no special circumstances.


8. Federal Register and Consultation Outside BLS


Federal Register: No comments were received as a result of the Federal Register notice published in Volume 70, No. 172, on September 7, 2005.


Outside Consultation: Consultation with individuals outside BLS to obtain views on the availability of data, frequency of collection, suitability of particular laboratory methods, clarity of instruction and record keeping, disclosure, or reporting format, and on the data elements to be recorded and/or disclosed, is frequent and ongoing with the National Center for Health Statistics, the Bureau of the Census, the University of Maryland, the University of Michigan, and other federal agencies and institutions of higher learning. Consultants come from a wide range of subject areas and expertise. A list of individuals consulted in the past is attached to this document (Attachment II).


The individual responsible for the BSRL research efforts is:


Dr. William P. Mockovak

Director of Behavioral Science Research

Office of Survey Methods Research

Bureau of Labor Statistics

PSB Room 1950

2 Massachusetts Ave., NE

Washington, DC 20212
(202) 691-7414


9. Payment to Respondents


For some research projects, lab staff will travel to, and conduct testing in, the vicinity of respondents' residences. In other cases, subjects will travel to the BSRL laboratory facilities. When respondents are asked to leave their homes and travel to the laboratory testing site, they will be reimbursed $35.00 for their time in the lab (approximately 1/2 hour to 2 hours) and $5.00 for transportation. With the exception of remuneration, there are no circumstances that require data collection to be inconsistent with 5 CFR 1320.5.


10. Confidentiality and Privacy Concerns


The data collected from respondents will be tabulated and analyzed only for the purpose of evaluating the research in question. Laboratory respondents will be asked to read and sign a consent form explaining the voluntary nature of the studies and the use of the information, noting that the interview may be taped or observed, and including a Privacy Act statement (Attachment III). Surveys with current OMB approval that are involved in BSRL studies and are collected outside the laboratory, such as mail or CATI surveys, use the pledge of the existing approved collection or the Privacy Act statement.


The Commissioner's Order, "Confidential Nature of Bureau Records," explains the BLS policy on confidentiality: "In conformance with existing law and Departmental regulations, it is the policy of the Bureau of Labor Statistics that data collected or maintained by, or under the auspices of, the Bureau under a pledge of confidentiality shall be treated in a manner that will assure that individually identifiable data will be accessible only for statistical purposes or for other purposes made known in advance to the respondent."


11. Sensitive Questions


There are no questions of a sensitive nature.


12. Estimated Respondent Burden


The current OMB inventory for FY2005 is 3,000 hours. The FY2006, FY2007, and FY2008 estimated respondent burdens are as follows:


Response Burden (Hours)

FY2006           1,200
FY2007           1,200
FY2008           1,200
Total FY06-08    3,600


The burden hours are estimated based on the anticipation that the research will require approximately one hour per respondent. Each study will differ substantively from the others. The projects are expected to be complex, at times involving several cognitive testing methods to test the hypotheses of the given research question.


Coverage of the Estimates


The estimates cover the time that each respondent will spend answering questions, including the debriefing procedure concerning the cognitive testing procedure used. The time required to travel to the laboratory, if needed, is not covered, since distances and modes of transportation are unknown. No retrieval of information by respondents is anticipated, although it is possible that validation of data at some point may require respondents to keep and check records. In this case, experiments will be designed to include retrieval time.




Basis for the Estimate


These estimates are based on BSRL’s previous experience in conducting such research under the existing OMB Clearance 1220-0141, and on expectations concerning the research projects to be conducted in the next 3 years. BSRL staff and its laboratory facilities (especially the usability lab) have been increasingly utilized by both program offices and outside agencies, and it is anticipated that this trend will continue. The estimates are also based on the experience of other government agencies (such as the National Center for Health Statistics' 1987 study of the Cognitive Aspects of Survey Methods) that have conducted cognitively oriented questionnaire design research.

Annualized Cost to Respondents


Using the $35 per hour figure in item 9, the annualized cost to respondents for laboratory time will be $42,000 for FY2006 through FY2008 (based on 1,200 annual burden hours).

13. Total Annual Cost Burden


a) There will be no total capital and start-up cost component for respondents or record keepers resulting from the collection of information.


b) The respondents and record keepers will have no expenses for operation and maintenance or purchase of services resulting from the collection of information.


14. Cost to the Federal Government


The maximum cost to the Federal Government is $48,000 annually for FY2006, FY2007, and FY2008. These costs consist entirely of the $35 reimbursements and $5 travel allowances provided to respondents, plus newspaper advertisement costs. Other costs, such as operational expenses (e.g., equipment, overhead, printing, and support staff), are in place as a function of the laboratory proper and are not contingent on or necessary to perform this research. The overall annualized dollar cost to respondents is $42,161. This assumes that approximately four-fifths of the participants will be reimbursed at a rate of $40 per session; the remainder will be volunteers willing to participate without reimbursement. The estimated dollar cost for these individuals is based on the average hourly earnings of employed wage and salary workers, $15.67 per hour, taken from Current Employment Statistics program data.

960 respondents X $40.00 = $38,400

240 respondents X $15.67 = $3,761






15. Changes in Burden


This is a request for extension to the existing OMB Clearance 1220-0141 in order to continue the research mission for another 3 years. The estimated reduction of 1,800 hours results from a reduction in the projected number of respondents. Previously, BLS estimated that 4,000 respondents annually would participate in studies approved under this clearance; however, experience over the past three years indicates that the annual number of respondents is approximately 1,200.


16. Tabulation, Analysis and Publication Plans, and Project Schedule


This clearance request covers survey methodology and questionnaire quality assurance, including the exploratory activities leading to the evaluation of data quality. Both quantitative and qualitative analyses are planned for the evaluation of these activities, depending on the circumstances.


The results of these investigations will be used primarily to improve the quality of data collection and assure total collection quality as it relates to data management. Because BLS is using the latest techniques and cognitive psychological testing methodology, papers may be written that include some tallies of response problems, recall strategies, or results from other testing procedures used, etc.

Project Schedule


This project schedule calls for laboratory interviews to commence once OMB approval is received.


The proposed schedule is ongoing in nature, with research publication dates dependent on the data collection specified in each researcher's proposal and the subsequent results.


17. Expiration for OMB Approval


The BSRL is not seeking approval to avoid displaying the expiration date for OMB approval of the information collection.


18. Exception to Certification Statement


There are no exceptions to the certification statement identified in Item 19 “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.

B. Collection of Information Employing Statistical Methods


1. Sample Design


The data collected will be used for research activities which improve data collection processes, rather than to produce estimates about the population. The objective is to interview a variety of people, rather than a probability sample of the population. For most of the research design activities concerning items applicable to the general population, respondents will be recruited by means of fliers or other advertisements posted in public places or in newspapers.


For testing some hypotheses, however, some initial screening of individuals will be done to identify eligible respondents. Eligible respondents are defined as those individuals who have not participated in more than 3 survey research projects in the preceding six months, or who meet other necessary requirements. Special attempts may be made to recruit from specific groups if there are no volunteers from these groups as a result of the general recruiting effort.


In addition, projects in the furtherance of Fed-State cooperative agreements or interagency initiatives may call for participation by state agencies, federal contractors, and other establishments. The cooperation of these organizations will be solicited through agency contacts and/or written correspondence to the appropriate department personnel.


2. Data Collection Procedures


Recruitment:

Potential respondents are typically solicited through newspaper advertisements that state briefly that individuals are needed to participate in research on surveys conducted by the Bureau of Labor Statistics, and that $35 compensation is offered. Persons responding to the advertisement are given a brief description of the nature of the research task. Those interested provide their name and a minimal set of demographic characteristics that are matched against the needs of the particular study. Eligible individuals are then scheduled for an appointment. Those not meeting current study requirements are placed in a respondent pool, and are eligible to participate in future studies.


Some projects require the use of a targeted sample, such as establishments involved in government sponsored surveys, reporting offices from state agencies, or organizations conducting contractual work for the federal government. Organizations will be asked to participate based on the requirements of the research design, and in accordance with the goals outlined above (A.2.). Prior to giving their consent they will be provided with: (1) a written description of the study, including details of study purpose, data collection methodology, and burden estimate; (2) a copy of the Privacy Act statement; and (3) a Consent statement explaining the use of the information collected and the voluntary nature of the study.


Telephone interview and mail-in survey studies that draw upon respondents from the general population use recruitment methods similar to those used for in-house laboratory sessions. However, individuals in these studies will receive additional written materials to include (as above): (1) a study description and estimate of interview/survey length; (2) the Privacy Act Statement; and (3) a Consent statement. Mail-in study participation and telephone interviews will be scheduled only after these materials have been read, and the individuals have given their informed consent, either verbally or in writing.


Lab Interviews:


Once a laboratory interview is scheduled, it is the responsibility of the respondent (from the public sector) to travel to the interview site. The BLS research rooms are located in Room 1950 on the first floor of the Postal Square Building, 2 Massachusetts Avenue, NE, Washington, DC. The rooms are private to ensure the confidentiality of the interview. To reduce the number of no-shows, scheduled volunteers are phoned to remind them of their appointment.


When respondents arrive they receive an oral and/or written explanation of the purpose of the study and the research procedures. The respondent is then given a consent form to sign, which includes a Privacy Act statement on the back. The consent form, in addition to the OMB number and expiration date, includes the OMB failure-to-comply notice, which states that if the OMB control number is not present, the respondent does not have to complete the survey. If audio or video taping of the interview is planned, the need for taping is explained and the respondent's consent to recording is requested on the form. If consent is not granted, the session will not be recorded. The study may last from 1/2 hour to 2 hours depending on the specific laboratory techniques applied.


The selection of the laboratory technique, in turn, is determined by the hypotheses to be tested. The most commonly used methods include concurrent and retrospective think-aloud interviews. In these interviews, respondents are asked questions (pertinent to the data collection instrument in question) and are asked to think-aloud about how and why they answered as they did. The interviewer usually probes extensively to ascertain the degree of comprehension and the recall processes involved.


Debriefing:

All respondents will be debriefed. This procedure explains the purpose of the project and answers respondents' questions regarding the study.


3. Methods to Maximize Response Rates


As noted, to reduce the number of no-shows, scheduled laboratory respondents will be sent a reminder letter giving the time of the interview and directions to the laboratory. They will also receive a reminder telephone call before the interview. Other data collection procedures will incorporate similar reminders to reduce the level of non-response.


4. Tests of Procedures and Methods


The tests proposed for research fall into a number of categories that cognitive psychologists use to confirm or reject research hypotheses. Several of these are drawn from tasks outlined by Michael W. Eysenck (1984) in A Handbook of Cognitive Psychology. Possible tests include:


  • developing protocols, scenarios, and question probes--follow-up questions used to gain more information about respondents' strategies for answering questions,


  • concurrent think-aloud interviews--respondents think aloud while answering questions, and responses are probed extensively,


  • focus groups and individual interviews--structured and unstructured discussion of the survey topic with groups or individuals,


  • retrospective think-aloud interviews--respondents answer all questions first, then are asked how they arrived at their answers,


  • sorting and ranking tasks--respondents sort lists of similar items into groups that go together and rank the items according to a specified scale,


  • confidence ratings--respondents relate the degree of confidence they have in the accuracy of their answers,


  • memory cues--the interviewer reads terms intended as aids to recall,


  • response latency--measurement of the elapsed time between the presentation of the question and the respondent’s answer,


  • paraphrasing--respondents repeat the questions in their own words.


In addition, BSRL increasingly provides evaluation of and development assistance with BLS electronic data collection and data dissemination instruments (e.g., usability tests of Bureau websites, interviewing software, etc.). BSRL’s usability laboratory offers both on-site and remote testing capabilities.


5. General BSRL Procedures for Submitting Packages to OMB


In accordance with study guidelines, studies originating from the BSRL submit supporting documentation that outlines the purpose, cost, and estimated burden. This documentation also provides a description of the study design, the data collection methodology, and the guidelines used for ensuring confidentiality, and includes copies of all relevant project materials (e.g., contact letters, collection instruments, confidentiality forms, protocols). An Inventory Correction Worksheet (ICW), OMB Form 83-C, is submitted with each package. The ICW indicates the title of the study, the assigned project number (based on the fiscal year and number of submissions), and the number of burden hours requested. Three copies of the entire study clearance package are submitted to OMB for consideration (DOL and BLS/DMS also retain a copy). Within 10 working days, OMB will review, provide comment, and take action on each package request. A written decision, including any terms of clearance, is provided to BLS/DMS and forwarded to the primary investigator. Upon completion of the project, the primary investigator submits a summary report, including a statement of the actual burden hours used, to BLS/DMS and OMB.


6. Statistical Consultants


The individual acting as a consultant to the Laboratory on statistical aspects of the basic research design is:

Dr. N. Clyde Tucker

Senior Survey Methodologist

Office of Survey Methods Research

Bureau of Labor Statistics

PSB Room 1950

2 Mass Ave., NE

Washington, DC 20212

(202) 691-7371


Individuals collecting data and analyzing information are:


Dr. John Bosley

Office of Survey Methods Research

(202) 691-7514

Dr. Monica Dashen

Office of Survey Methods Research

(202) 691-7530


Dr. John Dixon

Office of Survey Methods Research

(202) 691-7516


Dr. Kathy Downey-Sargent

Office of Survey Methods Research

(202) 691-7382


Dr. Jennifer Edgar

Office of Survey Methods Research

(202) 691-7528

Dr. Jean Fox

Office of Survey Methods Research

(202) 691-7370


Scott Fricker, Doctoral Candidate

Office of Survey Methods Research

(202) 691-7390


Dr. Brian Meekins

Office of Survey Methods Research

(202) 691-7594


Dr. William Mockovak

Office of Survey Methods Research

(202) 691-7414


Dr. Polly Phipps

Office of Survey Methods Research

(202) 691-7513


Dr. Christine Rho

Office of Survey Methods Research

(202) 691-7399


Dr. Roberta Sangster

Office of Survey Methods Research

(202) 691-7517

Dr. N. Clyde Tucker

Office of Survey Methods Research

(202) 691-7371


Dr. Margaret Vernon

Office of Survey Methods Research

(202) 691-7386


ATTACHMENT I


BEHAVIORAL SCIENCE RESEARCH LAB BIBLIOGRAPHY


Bosley, John, Dashen, Monica, and Fox, Jean (1999), "Effects on List Length and

Recall Accuracy of Order of Asking Follow-up Questions About Lists of Items

Recalled in Surveys," Proceedings of the Section on Survey Research Methods,

American Statistical Association.


Bosley, J.J., Eltinge, J.L., Fox, J.E., and Fricker, S.S. (2003). Conceptual and Practical Issues in the Statistical Design and Analysis of Usability Tests. Presented at The Federal Committee on Statistical Methodology’s Research Conference, Arlington, VA.



Butani, Shail J., and McElroy, Michael (1999), "Managing Various Customer Needs for Occupational Employment Statistics and Wage Survey," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Butani, Shail, Robertson, Kenneth and Mueller, Kirk (1998), “Assigning Permanent

Random Numbers to the Bureau of Labor Statistics Longitudinal (Universe) Data

Base,” Proceedings of the Section on Survey Research Methods, American

Statistical Association, 451-462.


Chen, Baoline and Zadrozny, Peter (1998), "An Extended Yule-Walker Method for

Estimating a Vector Autoregressive Model with Mixed-Frequency Data,"

Advances in Econometrics: Messy Data--Missing Observations, Outliers, and

Mixed-Frequency Data, Vol. 13, T.B. Fomby and R.C. Hill (eds.), JAI Press Inc.,

Greenwich, CT.


Cho, Moon Jung and Eltinge, John, (2001), "Diagnostics for Evaluation of Superpopulation Models for Variance Estimation Under Systematic Sampling," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Clements, Joseph, (2000), "Protecting Data in Two-Way Statistical Tables Using

Network Flow Methodology," Proceedings of the Section on Government

Statistics, American Statistical Association.


Cohen, Stephen (1997), “The National Compensation Survey: The New BLS Integrated

Compensation Program,” Proceedings of the Section on Survey Research

Methods, American Statistical Association, 451-456.


Cohen, Stephen, Wright, Tommy, Vangel, Mark, and Wacholder, Sholom, (2000), "Postdoctoral Research Programs in the Federal Statistical System," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Conrad, Frederick, Blair, Johnny and Tracy, Elena (1999). “Verbal Reports are Data! A

Theoretical Approach to Cognitive Interviews.” Proceedings of the Federal

Committee on Statistical Methodology Research Conference.


Conrad, Frederick, Brown, Norman and Dashen, Monica (1999). "Estimating the Frequency of Events from Unnatural Categories." Proceedings of the Section on Survey Research Methods, American Statistical Association.


Conrad, Frederick G. and Schober, Michael F. (1999) “Conversational Interviewing

and Data Quality.” Proceedings of the Federal Committee on Statistical

Methodology Research Conference.


Couper, Mick and Stinson, Linda (1999), “Completion of Self Administered

Questionnaires in a Sex Survey,” The Journal of Sex Research, vol. 36, pp.321-

330.


Dashen, Monica and Fricker, Scott (1998), “Taking Different Perspectives on a Survey

Question,” Journal of Official Statistics, 17(4), 457-479.


Dashen, Monica, (2000), "Improving Purchase Recollection," Proceedings of the Section

on Survey Research Methods, American Statistical Association.


Dashen, Monica and Sangster, Roberta L. (1997), “Does Item Similarity And Word

Order Influence Comparative Judgments?,” Proceedings of the Section on Survey

Research Methods, American Statistical Association.


Dippo, Cathryn S. and Gillman, Daniel W. (1999), "The Role of Metadata in Statistics," Paper presented at UN/ECE Work Session on Statistical Metadata.


Dippo, Cathryn S. and Hoy, Easley (1997), "Providing Metadata to Survey Staff Via Internet," Presented at ISI 51st Session, Istanbul, Turkey.


Dippo, Cathryn S. and Tupek, Alan (1997), "Quantitative Literacy: New Website for Federal Statistics Provides Research Opportunities," D-Lib Magazine, December 1997.


Dixon, John, (2001), “Using 'Gross Flows' to Evaluate the Impact of Nonresponse on Federal Household Survey Estimates,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dixon, John, (2000), "The Relationship Between Household Moving, Nonresponse, and

the Unemployment Rate in the Current Population Survey," Proceedings of the

Section on Government Statistics, American Statistical Association.


Dorfman, Alan H. (1999), "The Stochastic Approach to Price Indices," Proceedings of

the Section on Survey Research Methods, American Statistical Association.


Dorfman, Alan H. (1999), "Issues in the Analysis of Complex Surveys," Proceedings Book 2, Topic 67 (Bulletin of the International Statistical Institute).


Dorfman, Alan H., (2000), "Non-Parametric Regression for Estimating Totals in Finite Populations," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Dorfman, Alan, Leaver, Sylvia, and Lent, Janice (1999). "Some Observations on Price Index Estimators," Statistical Policy Working Paper 29, Part 2 of 5, pages 56-65.


Dorfman, Alan H. and Valliant, Richard (1997), “The Hajek Estimator Revisited,”

Proceedings of the Section on Survey Research Methods, American Statistical

Association, 760-765.

Eltinge, John, (2001), “Accounting for Design and Superpopulation Components of Variability in Approximations for Design Effects, Generalized Variance Functions and Related Quantities,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Eltinge, John, (2000), "Implications of Model Validation Criteria for the Performance of

Small Domain Estimation Methods," Proceedings of the Section on Survey

Research Methods, American Statistical Association.


Eltinge, John (1999), "Evaluation and Reduction of Cluster-Level Identification Risk for

Public-Use Survey Microdata Files," Proceedings of the Section on Survey

Research Methods, American Statistical Association.


Ernst, Lawrence and Paben, Steven, (2000), "Maximizing and Minimizing Overlap

When Selecting Any Number of Units per Stratum Simultaneously for Two

Designs with Different Stratifications," Proceedings of the Section on Survey

Research Methods, American Statistical Association.


Ernst, Lawrence R. (2001), “The History and Mathematics of Apportionment of the U.S. House of Representatives,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, Lawrence R. (2001), “Retrospective Assignment of Permanent Random Numbers for Ohlsson's Exponential Sampling Overlap Maximization Procedure for Designs with More than One Sample Unit per Stratum,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Ernst, Lawrence R. (1999), “The Maximization and Minimization of Sample Overlap

Problems: A Half Century of Results,” Bulletin of the International Statistical

Institute, Proceedings Tome LVII, Book 2, 293-296.


Ernst, Lawrence R., Valliant, Richard and Casady, Robert J. (1998), “Permanent and

Collocated Random Number Sampling and the Coverage of Births and Deaths,”

Proceedings of the Section on Survey Research Methods, American Statistical

Association, 457-462.


Ernst, Lawrence R. and Ponikowski, Chester H. (1998), “Selecting the Employment

Cost Index Survey Sample as a Subsample of the National Compensation Survey,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 517-522.


Ernst, Lawrence R. (1998), “Maximizing And Minimizing Overlap When Selecting A

Large Number Of Units Per Stratum With Simultaneous Selection,” Journal of

Official Statistics, 14, 297-314. Also in Proceedings of the Section on Survey

Research Methods, American Statistical Association (1997), 475-480.


Esposito, James L., (1999) "Evaluating the Displaced Worker/Job-Tenure Supplement to the CPS: An Illustration of Multimethod Quality Assessment Research," Paper

Presented at the Conference of the Federal Committee on Statistical Methodology.


Esposito, James L. and Fisher, Sylvia (1998), “A Summary of Quality-Assessment

Research Conducted on the 1996 Displaced-Worker/Job-Tenure/Occupational-

Mobility Supplement,” BLS Statistical Notes (No. 43), Bureau of Labor Statistics,

Washington, DC.


Fisher, Sylvia K. (2001), “A Clinic-Based Needs Assessment Study of Women Who Partner with Women: The Relationship Between Sexual Orientation/Gender Identity, Health-Seeking Behaviors and Perceived Quality of Care Issues,” Proceedings of the Section on Committee on Gay and Lesbian Concerns in Statistics, American Statistical Association.


Fisher, Sylvia K. (2000), "Improving the Quality of Data Reporting in Business Surveys:

Discussant Comments," Proceedings of the International Conference on Establishment Surveys II.


Fisher, Sylvia K., Ramirez, Carl, Stanley McCarthy, Jaki, and Shimizu, Iris, (2000), "Examining Standardization of Response Rate Measures in Establishment Surveys," Proceedings of the Council of Professional Associations on Federal Statistics.


Fox, J.E. (2005) How Do You Resolve Conflicting Requirements from Different User Groups? Presented at the Usability Professionals’ Association Annual Meeting. (http://www.upassoc.org/usability_resources/conference/2005/im_fox.html.)


Fox, J.E., Mockovak, W., Fisher, S.K., Rho, C. (2003). Usability Issues Associated with Converting Establishment Surveys to Web-Based Data Collection. Presented at The Federal Committee on Statistical Methodology’s Research Conference, Arlington, VA.


Fox, J.E., Fisher, S.K., Tucker, N.C., Sangster, R.L., Rho, C. (2003). A Qualitative Approach to the Study of BLS Establishment Survey Nonresponse. Presented at The 163rd Annual Joint Statistical Meetings, San Francisco, CA, August 4, 2003.


Fox, J.E. (2003). Designing the User Interface of a Data Collection Instrument for the Consumer Price Index. CHI 2003, Ft. Lauderdale, FL.


Fricker, S., Galesic, M., Tourangeau, R. and Yan, T. (2005). "An Experimental Comparison of Web and Telephone Surveys." Public Opinion Quarterly, 69, 370-392.


Fricker, S. (2005). "The Relation Between Response Propensity and Data Quality in the American Time Use Survey." American Statistical Association, Minneapolis, Minnesota.


Fricker, S. and Dashen, M., (2001), “How Do People Interpret Open-ended Categorical Questions?” Paper Presented at the American Association for Public Opinion Research Conference.


Garner, Thesia, Stinson, Linda and Shipp, Stephanie (1998), “Subjective Assessments

of Economic Well-Being: Cognitive Research at the U.S. Bureau of Labor

Statistics,” Focus, vol. 19, pp. 43-46.


Goldenberg, Karen, and Phillips, Mary Anne, (2000), "Now that the Study is Over, What have You Told Us? Identifying and Correcting Measurement Error in the Job Openings and Labor Turnover Survey Pilot Test," Paper Presented at the International Conference on Establishment Surveys II, Buffalo, NY, June 2000 (Proceedings Forthcoming).


Goldenberg, Karen L., and Stewart, Jay (1999), "Earnings Concepts and Data

Availability for the Current Employment Statistics Survey: Findings from

Cognitive Interviews," Proceedings of the Section on Survey Research Methods,

American Statistical Association.


Goldenberg, Karen L., Levin, Kerry, Hagerty, Tracey, Shen, Ted and Cantor, David

(1997), "Procedures for Reducing Measurement Error in Establishment Surveys,"

Proceedings of the Section on Survey Research Methods, American Statistical

Association, 994-999.


Gregg, Valerie and Dippo, Cathryn S. (1999), "FedStats: Partnering to Create the National Statistical Information Infrastructure of the 21st Century," Proceedings of the Section on Government Statistics, American Statistical Association.


Guciardo, Christopher (2001), “Estimating Variances in the National Compensation Survey Using Balanced Repeated Replication,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Harris-Kojetin, Brian A., and Fricker, Scott (1999), "The Influence of Environmental Characteristics on Survey Cooperation: A Comparison of Metropolitan Areas," Paper Presented at the International Conference on Survey Nonresponse, Portland, OR.


Heo, Sunyeong and Eltinge, John (1999), "The Analysis of Categorical Data from a

Complex Sample Survey: Chi-squared Tests for Homogeneity Subject to

Misclassification Error," Proceedings of the Section on Survey Research Methods,

American Statistical Association.


Kydoniefs, Leda and Stinson, Linda (1999) "Standing on the Outside, Looking In:

Tapping Data Users to Compare and Review Surveys," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Lee, Sangrae and Eltinge, John (1999), "Diagnostics for the Stability of an Estimated Misspecification Effect Matrix," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Lent, Janice, (2000). "Chain Drift in Some Price Index Estimators," Proceedings of the

Section on Survey Research Methods, American Statistical Association.


Lent, Janice, Miller, Stephen, Duff, Martha and Cantwell, Patrick (1998), "Comparing Current Population Survey Estimates Computed Using Different Composite Estimators," Proceedings of the Section on Survey Research Methods, American Statistical Association, 564-569.


Levi, Michael (1997), “A Shaker Approach To Web Site Design,” Proceedings of the

Section on Statistical Computing, American Statistical Association.



Mason, Charles, Sangster, Roberta and Wirth, Cassandra (2001), “Comparison of Final Disposition Codes Used for Attrition Calculations for Telephone Samples,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


McKay, Ruth B. (1997), "The Multiracial Category as 'Wild Card' in Racial Questionnaire Design," Proceedings of the Section on Survey Research Methods (AAPOR), American Statistical Association, 916-921.



Meekins, B. and Sangster, R. (2004). "Telephone Point of Purchase Advance Letter Study: Technical Report for the Cost-Weights Division for the Consumer Price Index."

Mockovak, W., and Fox, J.E. (2002). Approaches for incorporating user-centered design into CAI development. Proceedings of the International Conference on Questionnaire Development, Evaluation, and Testing Methods (QDET).


Moore, Jeff, Stinson, Linda and Welniak, Ed. Jr. (2001), “Income Measurement Error in Surveys: A Review,” Journal of Official Statistics, vol. 16.


Moore, Jeff, Stinson, Linda, and Welniak, Ed, Jr. (1999), “Income Reporting in

Surveys: Cognitive Issues and Measurement Error.” In Monroe Sirken, Douglas

Herrmann, Susan Schecter, Norbert Schwarz, Judith Tanur, and Roger

Tourangeau (eds.), Cognitive and Survey Research. New York: John Wiley &

Sons, Inc.


Moy, Luann and Stinson, Linda (1999) "Two Sides of A Single Coin?: Dimensions of

Change in Different Settings," Proceedings of the Section on Survey Research

Methods, American Statistical Association.


O'Neill, G. & Vernon, M. (2005). "Revising the American Time Use Survey Advance Materials." The International Field Directors and Technologies Conference, Miami, FL.


Park, Inho and Eltinge, John (1999), "Fitting Complex Survey Data to the Tail of a

Parametric Distribution," Proceedings of the Section on Survey Research

Methods, American Statistical Association.


Park, Inho, and Eltinge, John, (2001), "The Effect of Cluster Sampling on the

Covariance and Correlation Matrices of Sample Distribution Functions,"

Proceedings of the Section on Survey Research Methods, American Statistical

Association.


Parsons, Van and Eltinge, John (1999), "Stratum Partition, Collapse and Mixing in Construction of Balanced Repeated Replication Variance Estimators," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Pfeffermann, Danny and Scott, Stuart (1997), "Variance Measures For X-11 Seasonally Adjusted Estimators; Some New Developments with Application to Labor Force Series," Proceedings of the Section on Business & Economic Statistics, American Statistical Association, 211-216.


Pfeffermann, Danny, Tiller, Richard, and Zimmerman, Tamara, (2000), "Accounting for Sampling Error Autocorrelations Towards Signal Extraction from Models with Sampling Error," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Polivka, Anne E. and West, Sandra A. (1997), “Earnings Data From The Current

Population Survey After The Redesign,” Proceedings of the Section on Survey

Research Methods, American Statistical Association.


Poole, R. and Sangster, R. (2005). "Housing Tenure Study: Technical Report for Survey Methods Division of the Consumer Price Index."


Presser, Stanley and Stinson, Linda (1998). “Data Collection Mode and Social

Desirability Bias in Self-Reported Religious Attendance,” American Sociological

Review, vol. 63, pp. 137-145.


Rips, L. J., Conrad, F.G. & Fricker, S. S. (2004). Straightening out the seam effect in panel surveys. Public Opinion Quarterly, 67, 522-554.


Rips, Lance, Conrad, Frederick and Fricker, Scott, (2000), "Unraveling the Seam

Effect," Proceedings of the Section on Survey Research Methods, American

Statistical Association.


Sangster, R (2005). "Consumer Price Index (CPI) Housing Survey Sample Attrition." Presentation for the 16th International Workshop on Household Survey Nonresponse, Tällberg, Sweden, August 28-31, 2005


Sangster, R., and Meekins, B. (2004). "Assessing Data Quality for Hard to Reach and Reluctant Respondents in an RDD Telephone Panel Survey." Poster for the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ.


Sangster, R and Meekins, B. (2004). "Modeling the Likelihood of Interviews and Refusals: Using Call History Data to Improve Efficiency of Effort in a National RDD Survey." Proceedings of the Section on Survey Research Methods, American Statistical Association, Toronto, Canada.


Sangster, R and Meekins, B. (2003). "Data Concerns for Hard to Reach and Reluctant Respondents in Telephone Panel Surveys." Presentation for the 14th International Workshop on Household Survey Nonresponse, Leuven, Belgium, 22-24 September 2003.


Sangster, Roberta and Willits, Fern (2001), “Evaluating Numeric Rating Scales: Replicated Results,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Schecter, Susan, Stinson, Linda and Moy, Luann (1999), " Developing and Testing

Aggregate Reporting Forms for Data on Race and Ethnicity," Paper Presented at

the Conference of the Federal Committee on Statistical Methodology.


Schober, M.F., Conrad, F.G. and Fricker, S.S. (2004). Misunderstanding standardized language in research interviews. Applied Cognitive Psychology, 18, 169-188.

Schober, Michael F., Conrad, Frederick G., and Fricker, Scott S. (2000). "When and How Should Survey Interviewers Clarify Question Meaning?" Proceedings of the Section on Survey Research Methods, American Statistical Association.


Scott, Stuart and Zadrozny, Peter (1999), "Aggregation & Model-based Methods in Seasonal Adjustment of Labor Force Series," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Schwartz, Lisa and Paulin, Geoffrey (2000), "Improving Response Rates to Income Questions: A Comparison of Range Techniques," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Shettle, Carolyn, Ahmed, Susan, Cohen, Steve, Miller, Renee and Waite, Preston (1997), "Improving Statistical Reports From Government Agencies Through The Reports Review Process," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sirken, Monroe, Tanur, Judith, Tucker, Clyde N. and Martin, Elizabeth A. (1997), "Synthesis Of Casm II: Current And Future Directions In Interdisciplinary Research On Cognitive Aspects Of Survey Methods," Proceedings of the Section on Survey Research Methods, American Statistical Association, 1-10.


Stamas, George, Goldenberg, Karen, Levin, Kerry and Cantor, David (1997), "Sampling For Employment At New Establishments In A Monthly Business Survey," Proceedings of the Section on Survey Research Methods, American Statistical Association, 279-284.


Steiger, Darby Miller, Mainieri, Tina, and Stinson, Linda (1997), “Subjective

Assessments of Economic Well-Being: Understanding the Minimum Income

Question,” Proceedings of the Section on Survey Methods Research, American

Statistical Association, 899-903.


Stewart, Jay, Goldenberg, Karen, Gomes, Tony, and Manser, Marilyn, (2000), "Collecting All-Employee Earnings Data in the Current Employment Statistics Survey."


Stewart, Jay and Joyce, Mary (1999), "Why Do We Need Time-Use Data?," Proceedings of the Section on Social Statistics, American Statistical Association.


Stewart, Jay and Frazis, Harley (1998), “Keying Errors Caused by Unusual Response

Categories: Evidence from a Current Population Survey Test,” Proceedings of the

Section on Survey Research Methods, American Statistical Association, 131-134.


Stinson, Linda (2000), "'Day of Week' Differences and Implications for Time-Use Research," Proceedings of the Section on Social Statistics, American Statistical Association.


Stinson, Linda (1999), "Measuring How People Spend Time," Proceedings of the Section on Social Statistics, American Statistical Association.


Stinson, Linda (1997), “Using the Delighted/Terrible Scale to Measure Feelings About

Income and Expenses,” Proceedings of the Section on Survey Methods Research,

American Statistical Association, 904-909.


Sukasih, Amang and Eltinge, John (2001), "A Goodness-of-Fit Test for Response Probability Models in the Analysis of Complex Survey Data," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Sverchkov, Michael, and Pfeffermann, Danny, (2000), "Prediction of Finite Population Totals Under Informative Sampling Utilizing the Sample Distribution," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Swanson, David C., Hauge, Sharon K., and Schmidt, Mary Lynn (1999), "Evaluation of Composite Estimation Methods for Cost Weights in the CPI," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Tourangeau, Roger, Shapiro, Gary, Kearney, Anne and Ernst, Lawrence (1997), "Who Lives Here? Survey Undercoverage and Household Roster Questions," Journal of Official Statistics, 13, 1-18.


Tucker, Clyde (2001), "Using the New Race and Ethnicity Data," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Valliant, Richard and Dorfman, Alan H. (1997), “Stratification On A Size Variable

Revisited,” Proceedings of the Section on Survey Research Methods, American

Statistical Association, 766-771.


Vernon, M. (2005). "Pre-testing Sensitive Questions: Perceived Sensitivity, Comprehension, and Order Effects of Questions about Income and Weight." American Statistical Association, Minneapolis, Minnesota.


Walker, Ed and Mesenbourg, Tom (1997), “The Census Bureau's Business Register:

Quality Issues And Observations,” Proceedings of the Section on Survey Research Methods, American Statistical Association.


Walker, Martha A. C. and Bergman, Bruce (1997), “Estimates Of Year-To-Year

Change In Costs Per Hour Worked From The Employer Costs For Employee

Compensation Survey,” Proceedings of the Business and Economic Statistics

Section, American Statistical Association.


Wang, Suojin, Dorfman, Alan H., and Chambers, Raymond (1999), "Maximum Likelihood Under Informative Sampling," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Weber, Wolf, (1999), "A Method of Microdata Disclosure Limitation based on Noise

Infusion and Outlier Substitution," Proceedings of the Section on Survey Research

Methods, American Statistical Association.


Werking, George S. Jr. (1997), “Overview Of The CES Redesign Research,”

Proceedings of the Section on Survey Research Methods, American Statistical

Association, 512-516.


West, Sandra A., Kratzke, Tran and Grden, Paul (1997), “Estimators For Average

Hourly Earnings And Average Weekly Hours For The Current Employment

Statistics Survey,” Proceedings of the Section on Survey Research Methods,

American Statistical Association, 529-534.


Wohlford, John and Mueller, Charlotte, (2000), "The Debut of a New Establishment Survey: The Job Openings and Labor Turnover Survey at the Bureau of Labor Statistics," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Yansaneh, Ibrahim, and Eltinge, John, (2001), "Design Effect and Cost Issues for Surveys in Developing Countries," Proceedings of the Section on Survey Research Methods, American Statistical Association.


Zadrozny, Peter (2001), "An Estimated Autoregressive Model for Forecasting U.S. GDP Based on Real-time Data," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Zadrozny, Peter, (2000), "Modelling Survey-Error Autocorrelations Subject to Time-in-Sample Effects for Model-Based Seasonal Adjustments," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.


Zadrozny, Peter and Chen, Baoline, (1999), "Estimation of Capital and Technology with a Dynamic Economic Model," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.



Zarate, Alvan, Greenberg, Brian, Bournazian, Cohen, Stephen and Eden, Donna (2001), "Privacy, Confidentiality and the Protection of Health Data - A Statistical Perspective," Proceedings of the Section on Government Statistics, American Statistical Association.

ATTACHMENT II


CONSULTANTS TO THE

BEHAVIORAL SCIENCE RESEARCH LABORATORY


Dr. Paul Biemer, Distinguished Fellow

Research Triangle Institute

3040 Cornwallis Rd.

Ragland Building

Research Triangle Park, NC 27709

(919) 541-6000


Pamela Doty, Senior Policy Analyst

Division of Disability, Aging and Long-term Care Policy

Office of the Assistant Secretary for Planning and Evaluation

U.S. Department of Health and Human Services

200 Independence Ave, SW

Washington, DC 20201

Phone: (202) 690-6449


Carl Ramirez, Senior Design Methodologist

Government Accountability Office

441 G St., NW Room 6K17R

Washington, DC 20548

(202) 512-3721


Kristin Stettler

Survey Methodologist, Establishment Survey Methods Staff

U.S. Census Bureau

FOB4-3110

Washington, DC 20296

301-763-7596



Diane Willimack

Chief, Establishment Survey Methods Staff, ESMPD

U.S. Census Bureau

4700 Silver Hill Road #6200

Washington, DC 20233-6200

ATTACHMENT III

Consent Form


The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.


The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.


During this research you may be audio- and/or videotaped, or you may be observed. If you do not wish to be taped, you may still participate in this research.


We estimate it will take you an average of xx minutes to participate in this research (ranging from xx minutes to xx minutes).


Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.


Persons are not required to respond to this collection of information unless it displays a currently valid OMB control number. The OMB control number for this research is 1220-0141 and expires 01/31/06.


------------------------------------------------------------------------------------------------------------


I have read and understand the statements above. I consent to participate in this study.


___________________________________ ___________________________

Participant's signature Date


___________________________________

Participant's printed name


___________________________________

Researcher's signature



OMB Control Number: 1220-0141

Expiration Date: 01/31/06



PRIVACY ACT STATEMENT

In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The BLS may release individually identifiable information to individuals designated as agents of the BLS in accordance with Public Law 107-347 to perform exclusively statistical activities. Individuals designated as agents of the BLS may be imprisoned for not more than 5 years, fined not more than $250,000, or both for any knowing and willful disclosure of respondent information to unauthorized persons. Such designated agents may include individuals from other sponsoring agencies; contractors, grantees, and their employees or volunteers who are working on this study for the BLS and who need access to the information; or the National Archives and Records Administration or the General Services Administration for records management purposes. Under written agreements to protect the confidentiality and security of individually identifiable information, the BLS may provide individually identifiable information to other researchers designated as agents of the BLS to conduct statistical research projects that further the mission and functions of the BLS.



