Supporting Statement
B. Collection of Information Employing Statistical Methods
1. Sample Design
The data collected will be used for research activities that improve data collection processes, rather than to produce estimates about the population. The objective is to interview a variety of people, rather than a probability sample of the population. For most research activities concerning items applicable to the general population, respondents will be recruited through fliers or other advertisements posted in public places or in newspapers.
For testing some hypotheses, however, some initial screening of individuals will be done to identify eligible respondents. Eligible respondents are defined as those individuals who have not participated in more than three survey research projects in the preceding six months, or who meet other necessary requirements. Special attempts may be made to recruit from specific groups if the general recruiting effort produces no volunteers from these groups.
In addition, projects in the furtherance of Fed-State cooperative agreements or interagency initiatives may call for participation by state agencies, federal contractors, and other establishments. The cooperation of these organizations will be solicited through agency contacts and/or written correspondence to the appropriate department personnel.
2. Data Collection Procedures
Recruitment:
Potential respondents are typically solicited through newspaper advertisements that state briefly that individuals are needed to participate in research on surveys conducted by the Bureau of Labor Statistics, and that $35 compensation is offered. Persons responding to the advertisement are given a brief description of the nature of the research task. Those interested provide their name and a minimal set of demographic characteristics that are matched against the needs of the particular study. Eligible individuals are then scheduled for an appointment. Those not meeting current study requirements are placed in a respondent pool, and are eligible to participate in future studies.
Some projects require the use of a targeted sample, such as establishments involved in government sponsored surveys, reporting offices from state agencies, or organizations conducting contractual work for the federal government. Organizations will be asked to participate based on the requirements of the research design, and in accordance with the goals outlined above (A.2.). Prior to giving their consent they will be provided with: (1) a written description of the study, including details of study purpose, data collection methodology, and burden estimate; (2) a copy of the Privacy Act statement; and (3) a Consent statement explaining the use of the information collected and the voluntary nature of the study.
Telephone interview and mail-in survey studies that draw upon respondents from the general population use recruitment methods similar to those used for in-house laboratory sessions. However, individuals in these studies will receive additional written materials to include (as above): (1) a study description and estimate of interview/survey length; (2) the Privacy Act Statement; and (3) a Consent statement. Mail-in study participation and telephone interviews will be scheduled only after these materials have been read, and the individuals have given their informed consent, either verbally or in writing.
Lab Interviews:
Once a laboratory interview is scheduled, it is the responsibility of the respondent (recruited from the general public) to travel to the interview site. The BLS research rooms are located in Room 1950 on the first floor of the Postal Square Building, 2 Massachusetts Avenue, NE, Washington, DC. The rooms are private to ensure the confidentiality of the interview. To reduce the number of no-shows, scheduled volunteers are phoned to remind them of their appointment.
When respondents arrive, they receive an oral and/or written explanation of the purpose of the study and the research procedures. Each respondent is then given a consent form to sign, which includes a Privacy Act statement on the back. In addition to the OMB control number and expiration date, the consent form includes the OMB notice stating that respondents are not required to complete the survey if the OMB control number is not displayed. If audio or video recording of the interview is planned, the need for recording is explained and the respondent is asked to indicate consent on the form. If consent is not granted, the session will not be recorded. The study may last from one-half hour to two hours, depending on the specific laboratory techniques applied.
The selection of the laboratory technique, in turn, is determined by the hypotheses to be tested. The most commonly used methods are concurrent and retrospective think-aloud interviews. In these interviews, respondents answer questions pertinent to the data collection instrument under study and are asked to think aloud about how and why they answered as they did. The interviewer usually probes extensively to ascertain the degree of comprehension and the recall processes involved.
Debriefing:
All respondents will be debriefed. This procedure explains the purpose of the project and answers respondents' questions regarding the study.
3. Methods to Maximize Response Rates
As noted, to reduce the number of no-shows, scheduled laboratory respondents will be sent a reminder letter giving the time of the interview and directions to the laboratory. They will also receive a reminder telephone call before the interview. Other data collection procedures will incorporate similar reminders to reduce the level of non-response.
4. Tests of Procedures and Methods
The tests proposed for research fall into a number of categories that cognitive psychologists use to confirm or reject research hypotheses. Several of these tests draw on the tasks outlined by Michael W. Eysenck (1984) in A Handbook of Cognitive Psychology. The possible tests include:
developing protocols, scenarios, and question probes--follow-up questions used to gain more information about respondents' strategies for answering questions,
concurrent think-aloud interviews--respondents think aloud while answering questions, and their responses are probed extensively,
focus groups and individual interviews--structured and unstructured discussion of the survey topic with groups or individuals,
retrospective think-aloud interviews--respondents answer all questions first, then are asked how they arrived at their answers,
sorting and ranking tasks--respondents sort lists of similar items into groups that go together and rank the items according to a specified scale,
confidence ratings--respondents report the degree of confidence they have in the accuracy of their answers,
memory cues--the interviewer reads terms intended as aids to recall,
response latency--measurement of the elapsed time between the presentation of a question and the respondent's answer,
paraphrasing--respondents repeat the questions in their own words.
In addition, BSRL increasingly provides evaluation of and development assistance with BLS electronic data collection and data dissemination instruments (e.g., usability tests of Bureau websites, interviewing software, etc.). BSRL’s usability laboratory offers both on-site and remote testing capabilities.
5. General BSRL Procedures for Submitting Packages to OMB
In accordance with study guidelines, studies originating from the BSRL submit supporting documentation that outlines the purpose, cost, and estimated burden. This documentation also provides a description of the study design, the data collection methodology, and the guidelines used for ensuring confidentiality, and includes copies of all relevant project materials (e.g., contact letters, collection instruments, confidentiality forms, protocols). An Inventory Correction Worksheet (ICW), OMB Form 83-C, is submitted with each package. The ICW indicates the title of the study, the assigned project number (based on the fiscal year and number of submissions), and the number of burden hours requested. Three copies of the entire study clearance package are submitted to OMB for consideration (DOL and BLS/DMS also retain a copy). Within 10 working days, OMB will review, provide comment, and take action on each package request. A written decision, including any terms of clearance, is provided to BLS/DMS and forwarded to the primary investigator. Upon completion of the project, the primary investigator submits a summary report, including a statement of the actual burden hours used, to BLS/DMS and OMB.
6. Statistical Consultants
The individual acting as a consultant to the Laboratory on statistical aspects of the basic research design is:
Dr. N. Clyde Tucker
Senior Survey Methodologist
Office of Survey Methods Research
Bureau of Labor Statistics
PSB Room 1950
2 Mass Ave., NE
Washington, DC 20212
(202) 691-7371
Individuals collecting data and analyzing information are:
Dr. John Bosley
(202) 691-7514
Dr. Monica Dashen
Office of Survey Methods Research
(202) 691-7530
Dr. John Dixon
Office of Survey Methods Research
(202) 691-7516
Dr. Kathy Downey-Sargent
Office of Survey Methods Research
(202) 691-7382
Dr. Jennifer Edgar
Office of Survey Methods Research
(202) 691-7528
Dr. Jean Fox
Office of Survey Methods Research
(202) 691-7370
Scott Fricker, Doctoral Candidate
Office of Survey Methods Research
(202) 691-7390
Dr. Brian Meekins
Office of Survey Methods Research
(202) 691-7594
Dr. William Mockovak
Office of Survey Methods Research
(202) 691-7414
Dr. Polly Phipps
Office of Survey Methods Research
(202) 691-7513
Dr. Christine Rho
Office of Survey Methods Research
(202) 691-7399
Dr. Roberta Sangster
Office of Survey Methods Research
(202) 691-7517
Dr. N. Clyde Tucker
Office of Survey Methods Research
(202) 691-7371
Dr. Margaret Vernon
Office of Survey Methods Research
(202) 691-7386
ATTACHMENT I
BEHAVIORAL SCIENCE RESEARCH LAB BIBLIOGRAPHY
Bosley, John, Dashen, Monica, and Fox, Jean (1999), "Effects on List Length and
Recall Accuracy of Order of Asking Follow-up Questions About Lists of Items
Recalled in Surveys," Proceedings of the Section on Survey Research Methods,
American Statistical Association.
Bosley, J.J., Eltinge, J.L., Fox, J.E., and Fricker, S.S. (2003). Conceptual and Practical Issues in the Statistical Design and Analysis of Usability Tests. Presented at The Federal Committee on Statistical Methodology’s Research Conference, Arlington, VA.
Butani, Shail J., and McElroy, Michael (1999), "Managing Various Customer Needs
for Occupational Employment Statistics and Wage Survey," Proceedings of the
Section on Survey Research Methods, American Statistical Association.
Butani, Shail, Robertson, Kenneth and Mueller, Kirk (1998), “Assigning Permanent
Random Numbers to the Bureau of Labor Statistics Longitudinal (Universe) Data
Base,” Proceedings of the Section on Survey Research Methods, American
Statistical Association, 451-462.
Chen, Baoline and Zadrozny, Peter (1998), "An Extended Yule-Walker Method for
Estimating a Vector Autoregressive Model with Mixed-Frequency Data,"
Advances in Econometrics: Messy Data--Missing Observations, Outliers, and
Mixed-Frequency Data, Vol. 13, T.B. Fomby and R.C. Hill (eds.), JAI Press Inc.,
Greenwich, CT.
Cho, Moon Jung and Eltinge, John, (2001), “Diagnostics for Evaluation of Superpopulation Models for Variance Estimation Under Systematic Sampling,” Proceedings of the Section on Survey Research Methods, American
Statistical Association.
Clements, Joseph, (2000), "Protecting Data in Two-Way Statistical Tables Using
Network Flow Methodology," Proceedings of the Section on Government
Statistics, American Statistical Association.
Cohen, Stephen (1997), “The National Compensation Survey: The New BLS Integrated
Compensation Program,” Proceedings of the Section on Survey Research
Methods, American Statistical Association, 451-456.
Cohen, Stephen, Wright, Tommy, Vangel, Mark, and Wacholder, Sholom, (2000),
"Postdoctoral Research Programs in the Federal Statistical System," Proceedings
of the Section on Survey Research Methods, American Statistical Association.
Conrad, Frederick, Blair, Johnny and Tracy, Elena (1999). “Verbal Reports are Data! A
Theoretical Approach to Cognitive Interviews.” Proceedings of the Federal
Committee on Statistical Methodology Research Conference.
Conrad, Frederick, Brown, Norman and Dashen, Monica (1999). “Estimating the
Frequency of Events from Unnatural Categories.” Proceedings of the Section on
Survey Research Methods, American Statistical Association.
Conrad, Frederick G. and Schober, Michael F. (1999) “Conversational Interviewing
and Data Quality.” Proceedings of the Federal Committee on Statistical
Methodology Research Conference.
Couper, Mick and Stinson, Linda (1999), “Completion of Self Administered
Questionnaires in a Sex Survey," The Journal of Sex Research, vol. 36, pp. 321-330.
Dashen, Monica and Fricker, Scott (1998), “Taking Different Perspectives on a Survey
Question,” Journal of Official Statistics, 17(4), 457-479.
Dashen, Monica, (2000), "Improving Purchase Recollection," Proceedings of the Section
on Survey Research Methods, American Statistical Association.
Dashen, Monica and Sangster, Roberta L. (1997), “Does Item Similarity And Word
Order Influence Comparative Judgments?,” Proceedings of the Section on Survey
Research Methods, American Statistical Association.
Dippo, Cathryn S. and Gillman, Daniel W. (1999), "The Role of Metadata in Statistics,"
Paper presented at the UN/ECE Work Session on Statistical Metadata.
Dippo, Cathryn S. and Hoy, Easley (1997), “Providing Metadata to Survey Staff Via
Internet," Presented at the ISI 51st Session, Istanbul, Turkey.
Dippo, Cathryn S. and Tupek, Alan (1997), “Quantitative Literacy: New Website for
Federal Statistics Provides Research Opportunities,” D-Lib Magazine, December
1997.
Dixon, John, (2001), “Using 'Gross Flows' to Evaluate the Impact of Nonresponse on Federal Household Survey Estimates,” Proceedings of the Section on Survey Research Methods, American Statistical Association.
Dixon, John, (2000), "The Relationship Between Household Moving, Nonresponse, and
the Unemployment Rate in the Current Population Survey," Proceedings of the
Section on Government Statistics, American Statistical Association.
Dorfman, Alan H. (1999), "The Stochastic Approach to Price Indices," Proceedings of
the Section on Survey Research Methods, American Statistical Association.
Dorfman, Alan H. (1999), "Issues in the Analysis of Complex Surveys," Proceedings
Book 2, Topic 67 (Bulletin of the International Statistical Institute).
Dorfman, Alan H., (2000), "Non-Parametric Regression for Estimating Totals in Finite
Populations," Proceedings of the Section on Survey Research Methods, American
Statistical Association.
Dorfman, Alan, Leaver, Sylvia, and Lent, Janice (1999), "Some Observations on Price
Index Estimators," Statistical Policy Working Paper 29 - Part 2 of 5, pages 56-65.
Dorfman, Alan H. and Valliant, Richard (1997), “The Hajek Estimator Revisited,”
Proceedings of the Section on Survey Research Methods, American Statistical
Association, 760-765.
Eltinge, John, (2001), “Accounting for Design and Superpopulation Components of Variability in Approximations for Design Effects, Generalized Variance Functions and Related Quantities,” Proceedings of the Section on Survey Research Methods, American Statistical Association.
Eltinge, John, (2000), "Implications of Model Validation Criteria for the Performance of
Small Domain Estimation Methods," Proceedings of the Section on Survey
Research Methods, American Statistical Association.
Eltinge, John (1999), "Evaluation and Reduction of Cluster-Level Identification Risk for
Public-Use Survey Microdata Files," Proceedings of the Section on Survey
Research Methods, American Statistical Association.
Ernst, Lawrence and Paben, Steven, (2000), "Maximizing and Minimizing Overlap
When Selecting Any Number of Units per Stratum Simultaneously for Two
Designs with Different Stratifications," Proceedings of the Section on Survey
Research Methods, American Statistical Association.
Ernst, Lawrence R. (2001), “The History and Mathematics of Apportionment of the U.S. House of Representatives,” Proceedings of the Section on Survey Research Methods, American Statistical Association.
Ernst, Lawrence R. (2001), “Retrospective Assignment of Permanent Random Numbers for Ohlsson's Exponential Sampling Overlap Maximization Procedure for Designs with More than One Sample Unit per Stratum,” Proceedings of the Section on Survey Research Methods, American Statistical Association.
Ernst, Lawrence R. (1999), “The Maximization and Minimization of Sample Overlap
Problems: A Half Century of Results,” Bulletin of the International Statistical
Institute, Proceedings Tome LVII, Book 2, 293-296.
Ernst, Lawrence R., Valliant, Richard and Casady, Robert J. (1998), “Permanent and
Collocated Random Number Sampling and the Coverage of Births and Deaths,”
Proceedings of the Section on Survey Research Methods, American Statistical
Association, 457-462.
Ernst, Lawrence R. and Ponikowski, Chester H. (1998), “Selecting the Employment
Cost Index Survey Sample as a Subsample of the National Compensation Survey,” Proceedings of the Section on Survey Research Methods, American Statistical Association, 517-522.
Ernst, Lawrence R. (1998), “Maximizing And Minimizing Overlap When Selecting A
Large Number Of Units Per Stratum With Simultaneous Selection,” Journal of
Official Statistics, 14, 297-314. Also in Proceedings of the Section on Survey
Research Methods, American Statistical Association (1997), 475-480.
Esposito, James L., (1999) "Evaluating the Displaced Worker/Job-Tenure Supplement to the CPS: An Illustration of Multimethod Quality Assessment Research," Paper
Presented at the Conference of the Federal Committee on Statistical Methodology.
Esposito, James L. and Fisher, Sylvia (1998), “A Summary of Quality-Assessment
Research Conducted on the 1996 Displaced-Worker/Job-Tenure/Occupational-
Mobility Supplement,” BLS Statistical Notes (No. 43), Bureau of Labor Statistics,
Washington, DC.
Fisher, Sylvia K. (2001), “A Clinic-Based Needs Assessment Study of Women Who Partner with Women: The Relationship Between Sexual Orientation/Gender Identity, Health-Seeking Behaviors and Perceived Quality of Care Issues,” Proceedings of the Section on Committee on Gay and Lesbian Concerns in Statistics, American Statistical Association.
Fisher, Sylvia K. (2000), "Improving the Quality of Data Reporting in Business Surveys:
Discussant Comments," Proceedings of the International Conference on Establishment Surveys II.
Fisher, Sylvia K., Ramirez, Carl, McCarthy, Jaki Stanley, and Shimizu, Iris (2000),
"Examining Standardization of Response Rate Measures in Establishment Surveys," Proceedings of the Council of Professional Associations on Federal Statistics.
Fox, J.E. (2005). How Do You Resolve Conflicting Requirements from Different User Groups? Presented at the Usability Professionals' Association Annual Meeting. (http://www.upassoc.org/usability_resources/conference/2005/im_fox.html)
Fox, J.E., Mockovak, W., Fisher, S.K., Rho, C. (2003). Usability Issues Associated with Converting Establishment Surveys to Web-Based Data Collection. Presented at The Federal Committee on Statistical Methodology’s Research Conference, Arlington, VA.
Fox, J.E., Fisher, S.K., Tucker, N.C., Sangster, R.L., Rho, C. (2003). A Qualitative Approach to the Study of BLS Establishment Survey Nonresponse. Presented at The 163rd Annual Joint Statistical Meetings, San Francisco, CA, August 4, 2003.
Fox, J.E. (2003). Designing the User Interface of a Data Collection Instrument for the Consumer Price Index. CHI 2003, Ft. Lauderdale, FL.
Fricker, S., Galesic, M., Tourangeau, R. and Yan, T. (2005). “An Experimental Comparison of Web and Telephone Surveys.” Public Opinion Quarterly, 69, 370-392.
Fricker, S. (2005). “The Relation Between Response Propensity and Data Quality in the American Time Use Survey” American Statistical Association, Minneapolis, Minnesota.
Fricker, S. and Dashen, M., (2001), “How Do People Interpret Open-ended Categorical Questions?” Paper Presented at the American Association for Public Opinion Research Conference.
Garner, Thesia, Stinson, Linda and Shipp, Stephanie (1998), “Subjective Assessments
of Economic Well-Being: Cognitive Research at the U.S. Bureau of Labor
Statistics,” Focus, vol. 19, pp. 43-46.
Goldenberg, Karen, and Phillips, May Anne, (2000) "Now that the Study is Over,
What have You Told Us? Identifying and Correcting Measurement Error in the
Job Openings and Labor Turnover Survey Pilot Test," Paper Presented at the
International Conference on Establishment Surveys II, Buffalo, NY, June 2000
(Proceedings Forthcoming).
Goldenberg, Karen L., and Stewart, Jay (1999), "Earnings Concepts and Data
Availability for the Current Employment Statistics Survey: Findings from
Cognitive Interviews," Proceedings of the Section on Survey Research Methods,
American Statistical Association.
Goldenberg, Karen L., Levin, Kerry, Hagerty, Tracey, Shen, Ted and Cantor, David
(1997), "Procedures for Reducing Measurement Error in Establishment Surveys,"
Proceedings of the Section on Survey Research Methods, American Statistical
Association, 994-999.
Gregg, Valerie and Dippo, Cathryn S. (1999), "FedStats: Partnering to Create the
National Statistical Information Infrastructure of the 21st Century," Proceedings
of the Section on Government Statistics, American Statistical Association.
Guciardo, Christopher (2001), “Estimating Variances in the National Compensation Survey Using Balanced Repeated Replication,” Proceedings of the Section on Survey Research Methods, American Statistical Association.
Harris-Kojetin, Brian A., and Fricker, Scott (1999), "The Influence of Environmental
Characteristics on Survey Cooperation: A Comparison of Metropolitan Areas,"
Paper Presented at 28th Session International Conference on Survey
Nonresponse, Portland, OR.
Heo, Sunyeong and Eltinge, John (1999), "The Analysis of Categorical Data from a
Complex Sample Survey: Chi-squared Tests for Homogeneity Subject to
Misclassification Error," Proceedings of the Section on Survey Research Methods,
American Statistical Association.
Kydoniefs, Leda and Stinson, Linda (1999) "Standing on the Outside, Looking In:
Tapping Data Users to Compare and Review Surveys," Proceedings of the Section on Survey Research Methods, American Statistical Association.
Lee, Sangrae and Eltinge, John (1999), "Diagnostics for the Stability of an Estimated
Misspecification Effect Matrix," Proceedings of the Section on Survey Research
Methods, American Statistical Association.
Lent, Janice, (2000). "Chain Drift in Some Price Index Estimators," Proceedings of the
Section on Survey Research Methods, American Statistical Association.
Lent, Janice, Miller, Stephen, Duff, Martha and Cantwell, Patrick (1998),
“Comparing Current Population Survey Estimates Computed Using Different
Composite Estimators,” Proceedings of the Section on Survey Research Methods,
American Statistical Association, 564-569.
Levi, Michael (1997), “A Shaker Approach To Web Site Design,” Proceedings of the
Section on Statistical Computing, American Statistical Association.
Mason, Charles, Sangster, Roberta and Wirth, Cassandra (2001), “Comparison of Final Disposition Codes Used for Attrition Calculations for Telephone Samples,” Proceedings of the Section on Survey Research Methods, American Statistical Association.
McKay, Ruth B. (1997), "The Multiracial Category as 'Wild Card' in Racial
Questionnaire Design," Proceedings of the Section on Survey Research Methods
(AAPOR), American Statistical Association, 916-921.
Meekins, B. and Sangster, R. (2004). "Telephone Point of Purchase Advance Letter Study: Technical Report for the Cost-Weights Division for the Consumer Price Index."
Mockovak, W., and Fox, J.E. (2002). Approaches for incorporating user-centered design into CAI development. Proceedings of the International Conference on Questionnaire Development, Evaluation, and Testing Methods (QDET).
Moore, Jeff, Stinson, Linda and Welniak, Ed. Jr. (2001), “Income Measurement Error in Surveys: A Review,” Journal of Official Statistics, vol. 16.
Moore, Jeff, Stinson, Linda, and Welniak, Ed, Jr. (1999), “Income Reporting in
Surveys: Cognitive Issues and Measurement Error.” In Monroe Sirken, Douglas
Herrmann, Susan Schecter, Norbert Schwarz, Judith Tanur, and Roger
Tourangeau (eds.), Cognitive and Survey Research. New York: John Wiley &
Sons, Inc.
Moy, Luann and Stinson, Linda (1999) "Two Sides of A Single Coin?: Dimensions of
Change in Different Settings," Proceedings of the Section on Survey Research
Methods, American Statistical Association.
O'Neill, G. & Vernon, M. (2005). "Revising the American Time Use Survey Advance Materials." The International Field Directors and Technologies Conference, Miami, FL.
Park, Inho and Eltinge, John (1999), "Fitting Complex Survey Data to the Tail of a
Parametric Distribution," Proceedings of the Section on Survey Research
Methods, American Statistical Association.
Park, Inho, and Eltinge, John, (2001), "The Effect of Cluster Sampling on the
Covariance and Correlation Matrices of Sample Distribution Functions,"
Proceedings of the Section on Survey Research Methods, American Statistical
Association.
Parsons, Van and Eltinge, John (1999), "Stratum Partition, Collapse and Mixing in
Construction of Balanced Repeated Replication Variance Estimators,"
Proceedings of the Section on Survey Research Methods, American Statistical
Association.
Pfeffermann, Danny and Scott, Stuart (1997), “Variance Measures For X-11 Seasonally
Adjusted Estimators; Some New Developments with Application to Labor Force
Series,” Proceedings of the Section on Business & Economic Statistics, American
Statistical Association, 211-216.
Pfeffermann, Danny, Tiller, Richard, and Zimmerman, Tamara (2000),
"Accounting for Sampling Error Autocorrelations Towards Signal Extraction from
Models with Sampling Error," Proceedings of the Section on Business and
Economic Statistics, American Statistical Association.
Polivka, Anne E. and West, Sandra A. (1997), “Earnings Data From The Current
Population Survey After The Redesign,” Proceedings of the Section on Survey
Research Methods, American Statistical Association.
Poole, R. and Sangster, R. (2005). "Housing Tenure Study: Technical Report for Survey Methods Division of the Consumer Price Index."
Presser, Stanley and Stinson, Linda (1998). “Data Collection Mode and Social
Desirability Bias in Self-Reported Religious Attendance,” American Sociological
Review, vol. 63, pp. 137-145.
Rips, L. J., Conrad, F.G. & Fricker, S. S. (2004). Straightening out the seam effect in panel surveys. Public Opinion Quarterly, 67, 522-554.
Rips, Lance, Conrad, Frederick and Fricker, Scott, (2000), "Unraveling the Seam
Effect," Proceedings of the Section on Survey Research Methods, American
Statistical Association.
Sangster, R. (2005). "Consumer Price Index (CPI) Housing Survey Sample Attrition." Presentation for the 16th International Workshop on Household Survey Nonresponse, Tällberg, Sweden, August 28-31, 2005.
Sangster, R., and Meekins, B. (2004). "Assessing Data Quality for Hard to Reach and Reluctant Respondents in an RDD Telephone Panel Survey." Poster for the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ.
Sangster, R and Meekins, B. (2004). "Modeling the Likelihood of Interviews and Refusals: Using Call History Data to Improve Efficiency of Effort in a National RDD Survey." Proceedings of the Section on Survey Research Methods, American Statistical Association, Toronto, Canada.
Sangster, R and Meekins, B. (2003). "Data Concerns for Hard to Reach and Reluctant Respondents in Telephone Panel Surveys." Presentation for the 14th International Workshop on Household Survey Nonresponse, Leuven, Belgium, 22-24 September 2003.
Sangster, Roberta and Willits, Fern (2001), “Evaluating Numeric Rating Scales: Replicated Results,” Proceedings of the Section on Survey Research Methods, American Statistical Association.
Schecter, Susan, Stinson, Linda and Moy, Luann (1999), "Developing and Testing
Aggregate Reporting Forms for Data on Race and Ethnicity," Paper Presented at
the Conference of the Federal Committee on Statistical Methodology.
Schober, M.F., Conrad, F.G. and Fricker, S.S. (2004). Misunderstanding standardized language in research interviews. Applied Cognitive Psychology, 18, 169-188.
Schober, Michael F., Conrad, Frederick G., and Fricker, Scott S. (2000). “When and
How Should Survey Interviewers Clarify Question Meaning?” Proceedings of the
Section on Survey Research Methods, American Statistical Association.
Scott, Stuart and Zadrozny, Peter (1999), "Aggregation and Model-based Methods in
Seasonal Adjustment of Labor Force Series," Proceedings of the Section on
Business and Economic Statistics, American Statistical Association.
Schwartz, Lisa and Paulin, Geoffrey (2000) “Improving Response Rates to Income
Questions: A Comparison of Range Techniques” Proceedings of the Section on
Survey Research Methods, American Statistical Association.
Shettle, Carolyn, Ahmed, Susan, Cohen, Steve, Miller, Renee and Waite, Preston (1997),
“Improving Statistical Reports From Government Agencies Through The Reports
Review Process,” Proceedings of the Section on Survey Research Methods,
American Statistical Association.
Sirken, Monroe, Tanur, Judith, Tucker, Clyde N. and Martin, Elizabeth A. (1997),
"Synthesis of CASM II: Current and Future Directions in Interdisciplinary
Research On Cognitive Aspects Of Survey Methods,” Proceedings of the Section
on Survey Research Methods, American Statistical Association, 1-10.
Stamas, George, Goldenberg, Karen, Levin, Kerry and Cantor, David (1997),
“Sampling For Employment At New Establishments In A Monthly Business
Survey,” Proceedings of the Section on Survey Research Methods, American
Statistical Association, 279-284.
Steiger, Darby Miller, Mainieri, Tina, and Stinson, Linda (1997), “Subjective
Assessments of Economic Well-Being: Understanding the Minimum Income
Question,” Proceedings of the Section on Survey Methods Research, American
Statistical Association, 899-903.
Stewart, Jay, Goldenberg, Karen, Gomes, Tony, and Manser, Marilyn (2000),
"Collecting All-Employee Earnings Data in the Current Employment Statistics
Stewart, Jay and Joyce, Mary (1999), "Why Do We Need Time-Use Data?," Proceedings of the Section on Social Statistics, American Statistical Association.
Stewart, Jay and Frazis, Harley (1998), “Keying Errors Caused by Unusual Response
Categories: Evidence from a Current Population Survey Test,” Proceedings of the
Section on Survey Research Methods, American Statistical Association, 131-134.
Stinson, Linda (2000), “‘Day of Week’ Differences and Implications for Time-Use
Research," Proceedings of the Section on Social Statistics, American Statistical
Association.
Stinson, Linda (1999), "Measuring How People Spend Time," Proceedings of the
Section on Social Statistics, American Statistical Association.
Stinson, Linda (1997), “Using the Delighted/Terrible Scale to Measure Feelings About
Income and Expenses,” Proceedings of the Section on Survey Methods Research,
American Statistical Association, 904-909.
Sukasih, Amang and Eltinge, John (2001), "A Goodness-of-Fit Test for Response Probability Models in the Analysis of Complex Survey Data," Proceedings of the Section on Survey Research Methods, American Statistical Association.
Sverchkov, Michael, and Pfeffermann, Danny (2000), "Prediction of Finite Population
Totals Under Informative Sampling Utilizing the Sample Distribution,"
Proceedings of the Section on Survey Research Methods, American Statistical
Association.
Swanson, David C., Hauge, Sharon K., and Schmidt, Mary Lynn (1999), "Evaluation of
Composite Estimation Methods for Cost Weights in the CPI," Proceedings of the
Section on Survey Research Methods, American Statistical Association.
Tourangeau, Roger, Shapiro, Gary, Kearney, Anne and Ernst, Lawrence (1997), "Who
Lives Here? Survey Undercoverage and Household Roster Questions," Journal of
Official Statistics, 13, 1-18.
Tucker, Clyde (2001), "Using the New Race and Ethnicity Data," Proceedings of the Section on Survey Research Methods, American Statistical Association.
Valliant, Richard and Dorfman, Alan H. (1997), “Stratification On A Size Variable
Revisited,” Proceedings of the Section on Survey Research Methods, American
Statistical Association, 766-771.
Vernon, M. (2005). "Pre-testing Sensitive Questions: Perceived Sensitivity, Comprehension, and Order Effects of Questions about Income and Weight." American Statistical Association, Minneapolis, Minnesota.
Walker, Ed and Mesenbourg, Tom (1997), “The Census Bureau's Business Register:
Quality Issues And Observations,” Proceedings of the Section on Survey Research Methods, American Statistical Association.
Walker, Martha A. C. and Bergman, Bruce (1997), “Estimates Of Year-To-Year
Change In Costs Per Hour Worked From The Employer Costs For Employee
Compensation Survey,” Proceedings of the Business and Economic Statistics
Section, American Statistical Association.
Wang, Suojin, Dorfman, Alan H., and Chambers, Raymond (1999), "Maximum
Likelihood Under Informative Sampling," Proceedings of the Section on Survey
Research Methods, American Statistical Association.
Weber, Wolf, (1999), "A Method of Microdata Disclosure Limitation based on Noise
Infusion and Outlier Substitution," Proceedings of the Section on Survey Research
Methods, American Statistical Association.
Werking, George S. Jr. (1997), “Overview Of The CES Redesign Research,”
Proceedings of the Section on Survey Research Methods, American Statistical
Association, 512-516.
West, Sandra A., Kratzke, Tran and Grden, Paul (1997), “Estimators For Average
Hourly Earnings And Average Weekly Hours For The Current Employment
Statistics Survey,” Proceedings of the Section on Survey Research Methods,
American Statistical Association, 529-534.
Wohlford, John and Mueller, Charlotte, (2000), "The Debut of a New Establishment
Survey: The Job Openings and Labor Turnover Survey at the Bureau of Labor
Statistics," Proceedings of the Section on Survey Research Methods, American
Statistical Association.
Yansaneh, Ibrahim, and Eltinge, John (2001), "Design Effect and Cost Issues for
Surveys in Developing Countries," Proceedings of the Section on Survey Research Methods, American Statistical Association.
Zadrozny, Peter (2001), "An Estimated Autoregressive Model for Forecasting U.S. GDP Based on Real-time Data," Proceedings of the Section on Business and Economic Statistics, American Statistical Association.
Zadrozny, Peter, (2000), "Modelling Survey-Error Autocorrelations Subject to Time-in-
Sample Effects for Model-Based Seasonal Adjustments," Proceedings of the
Section on Business and Economic Statistics, American Statistical Association.
Zadrozny, Peter and Chen, Baoline, (1999), "Estimation of Capital and Technology with
a Dynamic Economic Model," Proceedings of the Section on Business and
Economic Statistics, American Statistical Association.
Zarate, Alvan, Greenberg, Brian, Bournazian, Cohen, Stephen and Eden, Donna (2001),
“Privacy, Confidentiality and the Protection of Health Data - A Statistical Perspective,” Proceedings of the Section on Government Statistics, American Statistical Association.
ATTACHMENT II
CONSULTANTS TO THE
BEHAVIORAL SCIENCE RESEARCH LABORATORY
Dr. Paul Biemer, Distinguished Fellow
Research Triangle Institute
3040 Cornwallis Rd.
Ragland Building
Research Triangle Park, NC 27709
(919) 541-6000
Pamela Doty, Senior Policy Analyst
Division of Disability, Aging and Long-term Care Policy
Office of the Assistant Secretary for Planning and Evaluation
U.S. Department of Health and Human Services
200 Independence Ave, SW
Washington, DC 20201
Phone: (202) 690-6449
Carl Ramirez, Senior Design Methodologist
Government Accountability Office
441 G St., NW Room 6K17R
Washington, DC 20548
(202) 512-3721
Kristin Stettler
Survey Methodologist, Establishment Survey Methods Staff
U.S. Census Bureau
FOB4-3110
Washington, DC 20296
301-763-7596
Diane Willimack
Chief, Establishment Survey Methods Staff, ESMPD
U.S. Census Bureau
4700 Silver Hill Road #6200
Washington, DC 20233-6200
The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.
The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.
During this research you may be audio and/or videotaped, or you may be observed. If you do not wish to be taped, you still may participate in this research.
We estimate it will take you an average of xx minutes to participate in this research (ranging from xx minutes to xx minutes).
Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.
Persons are not required to respond to this collection of information unless it displays a currently valid OMB control number. The OMB control number for this study is 1220-0141, and it expires 01/31/06.
------------------------------------------------------------------------------------------------------------
I have read and understand the statements above. I consent to participate in this study.
___________________________________ ___________________________
Participant's signature Date
___________________________________
Participant's printed name
___________________________________
Researcher's signature
OMB Control Number: 1220-0141
Expiration Date: 01/31/06
In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The BLS may release individually identifiable information to individuals designated as agents of the BLS in accordance with Public Law 107-347 to perform exclusively statistical activities. Individuals designated as agents of the BLS may be imprisoned for not more than 5 years or fined not more than $250,000, or both, for any knowing and willful disclosure of respondent information to unauthorized persons. Such designated agents may include individuals from other sponsoring agencies; contractors, grantees, and their employees or volunteers who are working on this study for the BLS and who need access to the information; or the National Archives and Records Administration or the General Services Administration for records management purposes. Under written agreements to protect the confidentiality and security of individually identifiable information, the BLS may provide individually identifiable information to other researchers designated as agents of the BLS to conduct statistical research projects that further the mission and functions of the BLS.