
OMB Approval No. 1225-0059


CUSTOMER SATISFACTION SURVEY AND CONFERENCE EVALUATION CLEARANCE FORM


A. SUPPLEMENTAL SUPPORTING STATEMENT


A.1. Title:

BLS.gov User’s Survey

A.2. Compliance with 5 CFR 1320.5:

Yes ___X__ No _____


A.3. Assurances of confidentiality:

The data will be held in confidence in accordance with the confidentiality pledge on the survey instrument.


A.4. Federal cost: $2,330 (20 hours of BLS employee work)

A.5. Requested expiration date (Month/Year): 09/2008


A.6. Burden Hour estimates:


a. Number of Respondents: 5,800

a.1. % Received Electronically: 100%

b. Frequency: one-time

c. Average Response Time: 3.57 minutes

d. Total Annual Burden Hours: 345
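
As a cross-check, the total burden figure is simply the number of respondents multiplied by the average response time, converted to hours. A minimal sketch of that arithmetic follows (the function name is ours, purely for illustration):

    // Cross-check of item d: hours = respondents * average minutes / 60.
    function totalBurdenHours(respondents: number, avgMinutes: number): number {
      return Math.round((respondents * avgMinutes) / 60);
    }

    // 5,800 respondents at 3.57 minutes each is roughly 345 hours.
    console.log(totalBurdenHours(5_800, 3.57)); // 345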


A.7. Does the collection of information employ statistical methods?


_____ No


__X__ Yes (Complete Section B and attach BLS review sheet).


A.8. Abstract:

The BLS Website (www.bls.gov) served an average of 1.5 million users each month in 2005. There were over four million visits and over two million unique visitors to the website in June 2007. Since its introduction in September 1995, this website has become the primary way for the overwhelming majority of our customers to obtain BLS data and for BLS to communicate with those customers. The purpose of this user survey is to obtain information about BLS.gov users, specifically to gain an understanding of who the users are and how well the website is meeting their needs.

The BLS.gov Website is undergoing revisions. An internal BLS team has been organized to design, test, and implement a new navigation structure and look-and-feel for the BLS Website that better serves our customers by making BLS information easier to find and easier to use. This redesign will concentrate on the organization of materials on the BLS Website, the navigation between pages, and the common structural elements of individual web pages. This survey, designed and implemented by another BLS team, will be fielded twice: once before and once after the redesigned website has been implemented. The baseline data will be compared to the post-redesign data to determine the impact of the redesign on meeting website users’ needs.


Darrin A. King


Acting Departmental Clearance Officer


Date


08/30/2007



B. SURVEYS AND EVALUATIONS EMPLOYING STATISTICAL METHODS




B.1. Sampling Frame

The BLS Website (www.bls.gov) served an average of 1.5 million users each month in 2005. There were over four million visits and over two million unique visitors to the website in June 2007. We will use BLS Website users for this survey.



Expected Response Rates


One section of the BLS Website, the Occupational Outlook Handbook (OOH), currently hosts a pop-up survey of users as they use the site. The OOH survey is the American Customer Satisfaction Index (ACSI) survey conducted by ForeSee Results, Inc. for BLS. Its current response rate is about 4%. This rate is similar to other user surveys conducted on websites, including a Census Bureau survey that had a response rate of 2%. Given this information, we anticipate a response rate of 4% for the longer survey (see Table 1). However, since the short survey is briefer than the OOH or Census pop-up surveys, we expect a response rate of 6%.



B.2. Describe the procedures for the collection of information including:


Statistical methodology for stratification and sample selection,

Estimation procedure,

Degree of accuracy needed for the purpose described in the justification,

Unusual problems requiring specialized sampling procedures, and

Any use of periodic (less frequently than annual) data collection cycles to reduce burden.


Procedures


This study has two phases. The first is a very short survey designed only to provide input for question wording on the second, longer survey. In the second phase of the project, a longer survey will be administered, aimed at answering the research questions of the project.

For the short survey, a randomly selected 0.5% of users will see a pop-up window containing a brief introductory sentence and will be asked to click Continue to view and answer the questions. The appearance of this window will depend on the number of mouse clicks the user makes while on the website. We will change the click criterion so that we sample different types of users at different times. For example, we will use one click as the selection criterion for 1-2 weeks, then change it to two clicks for another 1-2 weeks, and so on. The survey will be fielded for three months: October to December 2007.

For the long survey, a randomly selected 1% of users will be invited to participate in the survey. The survey will be fielded for about two and a half months, both before and after the BLS Website has been redesigned. The redesigned site will be implemented in the fall of 2008, so the long survey will be fielded from mid-March to June 2008 and again from mid-March to June 2009.¹

The same procedures and sampling design will be used for both implementations of the long survey. There is no way to ensure that users who are selected for the short survey are not also selected for the long survey, but the brevity of the first survey should minimize any impact on responses to the second.

Both surveys will use a pop-under approach. After the specified criterion has been met (e.g., two clicks for the short survey), users will see a pop-up window containing a brief introductory sentence and will be asked to click Continue to view and answer the questions. If they agree, a new browser window will open underneath their current browser window. Once they finish their work on the BLS.gov website and close the browser window, they will see the survey.
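
As an illustration only, the click-criterion selection and pop-under mechanics described above might be implemented along the following lines. This is a hedged sketch, not the actual BLS code; the survey URL, window name, and prompt text are assumptions.

    // Illustrative sketch of click-criterion sampling with a pop-under survey.
    const SAMPLING_RATE = 0.005;  // 0.5% for the short survey
    const CLICK_CRITERION = 2;    // e.g., invite on the second click

    let clickCount = 0;
    let alreadyInvited = false;

    document.addEventListener("click", () => {
      clickCount += 1;
      if (alreadyInvited || clickCount < CLICK_CRITERION) return;
      alreadyInvited = true;  // each visitor is considered at most once
      if (Math.random() < SAMPLING_RATE &&
          confirm("You have been selected for a brief BLS.gov survey. Continue?")) {
        // Open the survey in a new window, then refocus the current window so
        // the survey sits underneath until the visitor finishes and closes it.
        const surveyWindow = window.open("/survey.htm", "blsSurvey");  // hypothetical URL
        surveyWindow?.blur();
        window.focus();
      }
    });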


Sampling

Table 1 shows the sampling assumptions for both surveys. We made a general assumption about the sample size for the long survey, but this will be re-evaluated after the data from the short survey have been analyzed. Since the long survey will be conducted to measure differences in user opinion before and after the redesign, the sample size will need to be large enough to detect statistically significant differences of a size that is practically meaningful (see the sketch following Table 1). The size of the variances in responses to the short survey will allow us to refine our original estimate for the long survey sample size. If the sample size for the long survey changes, we will submit an amended package to OMB prior to data collection.


Table 1: Sampling Assumptions

                                                      Short Survey   Long Survey
Time period for data collection                       3 months       2.5 months
Estimated unique visitors (for the time period)       6,000,000      5,000,000
Sampling rate                                         0.5%           1.00%
Estimated response rate                               6%             4%
Estimated number of participants                      1,800          2,000
Estimated number of participants (post-redesign)*     --             2,000

* The post-redesign survey follows the same sampling structure as the pre-redesign survey.
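
The participant counts in Table 1 follow from multiplying unique visitors by the sampling rate and the expected response rate, and the planned refinement of the long-survey sample size can be framed as a standard sample-size calculation. A minimal sketch of both follows, using the normal-approximation formula at the conventional 5% two-sided significance level and 80% power (the function names are ours):

    // Expected participants = unique visitors * sampling rate * response rate.
    function expectedParticipants(visitors: number, samplingRate: number,
                                  responseRate: number): number {
      return visitors * samplingRate * responseRate;
    }

    console.log(expectedParticipants(6_000_000, 0.005, 0.06)); // short survey: 1800
    console.log(expectedParticipants(5_000_000, 0.01, 0.04));  // long survey:  2000

    // Rough per-group sample size needed to detect a mean difference `delta`
    // between the pre- and post-redesign surveys, given a response standard
    // deviation `sigma` (normal approximation; z-values are hard-coded for
    // two-sided alpha = 0.05 and power = 0.80).
    function sampleSizePerGroup(sigma: number, delta: number): number {
      const zAlpha = 1.96;
      const zBeta = 0.84;
      return Math.ceil((2 * (zAlpha + zBeta) ** 2 * sigma ** 2) / delta ** 2);
    }

For example, with sigma = 1 and delta = 0.2, sampleSizePerGroup returns 392 per group, comfortably below the roughly 2,000 participants expected in each fielding.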





B.3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Methods to Reduce Non-Response


As stated above, pop-up surveys of web users have notoriously low response rates. We have attempted to make both questionnaires (the “short” and “long” versions) relatively short in order to improve response rates. We also do not ask for any identifying information, which could otherwise hurt the response rate.

Additionally, we have informally tested the questionnaires (with BLS employees) to improve the question items and the flow of the survey. This should reduce item non-response.




B.4. Describe any tests of procedures or methods to be undertaken.


Questionnaire Testing


The questionnaires have already been informally tested with several BLS employees, who gave feedback on the items. This testing improved the text of the question items and the overall flow of the surveys.

Depending on the results of the short survey, items in the long survey may be revised. The responses from the short survey will be examined to determine whether the existing response options are appropriate for use on the long survey. We will look at the frequencies of the response options used and examine whether there are any response options that should be added or removed. For example, if an item receives many “other” responses, we will examine its response categories for possible changes.
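
By way of illustration, that frequency check could look like the following sketch; the answer strings and the 10% “other” threshold here are assumptions, not values from the study:

    // Tally how often each response option was chosen on the short survey,
    // and flag a high share of "Other" answers for category review.
    function optionFrequencies(responses: string[]): Map<string, number> {
      const counts = new Map<string, number>();
      for (const r of responses) counts.set(r, (counts.get(r) ?? 0) + 1);
      return counts;
    }

    const answers = ["Find data", "Other", "Find data", "Publications", "Other"];
    const counts = optionFrequencies(answers);
    const otherShare = (counts.get("Other") ?? 0) / answers.length;
    if (otherShare > 0.10) {
      console.log("High 'Other' share: consider adding or revising response options.");
    }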



B.5. Provide the name, affiliation (company, agency, or organization) and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Name             Agency/Company/Organization    Telephone Number
Kathy Downey     BLS – OSMR                     202-691-7382
Jennifer Edgar   BLS – OSMR                     202-691-7528
Bill Mockovak    BLS – OSMR                     202-691-7414
JoAnn Yu         BLS – OTSP                     202-691-5041










¹ We understand that this will have to be cleared under a new package, since the OMB expiration for 1225-0059 is September 2009.
