1. Circumstances Necessitating Collection of Information
We are requesting a three-year extension of the generic clearance to conduct customer satisfaction surveys of federal government websites utilizing the methodology of the American Customer Satisfaction Index (ACSI) [see attached methodology paper]. An extension will allow us to continue to use a data-driven and statistically valid approach to understanding customer satisfaction with agency websites, which are playing a strategic role of ever-increasing importance.
Collecting, analyzing, and acting upon customer satisfaction data are vital to the government's ability to achieve its E-Government strategy as mandated by Executive Order 12862 and the President's Management Agenda. Under the President's Management Agenda, the Internet has been identified as the centerpiece of a strategy to use technology to make major productivity gains and improve citizen satisfaction. According to the President's Management Agenda, although the federal government is the world's largest single consumer of information technology, it has failed to produce measurable gains in public-sector worker productivity on the World Wide Web. One of the major causes of this failure is that agencies do not employ performance measures to evaluate the success of their Internet programs in terms of desirable outcomes.
The President’s Management Agenda established clear standards for success, and agencies developed action plans to achieve these goals. Agencies have been held accountable for their performance by a simple grading system of red (unsatisfactory), yellow (mixed) or green (satisfactory). In the Administration’s initial management scorecard for the Executive Branch’s performance in the area of E-government, all of the agencies measured scored either a red or a yellow. Since then, Executive Branch agencies have shown considerable progress as nearly all agencies have achieved either a green or a yellow rating for their E-Government performance or progress in implementation.
During the past three years, the generic clearance has been critical to the ability of agencies and their web teams to:
Better identify who is visiting their websites.
Determine what drives visitor satisfaction.
Understand the relationship between visitors’ satisfaction with their experience and future behaviors.
Prioritize resource allocation based on their ability to drive Return on Investment.
Measure customer satisfaction continuously.
Benchmark performance against public and private sector websites with a similar mission.
Identify areas for improvement.
Quantify the impact of improving visitor satisfaction on future behaviors.
Drill down to evaluate satisfaction of different user groups and various sections of their websites.
In addition, the generic clearance has enabled the Federal Consulting Group (FCG) of the U.S. Department of the Treasury to provide the general public and policymakers in the Executive, Legislative, and Judicial Branches with data reporting on trends in overall customer satisfaction with federal government websites, as well as other insights into citizen behaviors and website usage.
Since the generic clearance was issued, approximately 200 websites have adopted the ACSI as a research tool and customer satisfaction metric. Agencies have been able to utilize the data from this research to guide their website redesign and improvement efforts with a greater and entirely appropriate focus on customer needs and desires, and many agencies have reported on their successes and lessons learned in a Community of Practice that meets quarterly at the Federal Consulting Group.
The website customer satisfaction measures are based on the methodology of the ACSI. The ACSI is the only internationally recognized cross-industry, cross-agency methodology for obtaining comparable measures of customer satisfaction. In a competitive procurement, the FCG selected ForeSee Results, Inc., which uses a unique website customer satisfaction measurement survey and model that employs the ACSI methodology. This survey and the related analysis and reporting enable agencies to obtain insights that help them make sound resource allocation decisions based on customer feedback. ForeSee Results utilizes the proprietary methodology behind the ACSI econometric model to link the drivers and consequences of satisfaction. An important advantage, in contrast to methods that rely solely on survey questions, is that the model produces results with statistical stability and low chance variation. This helps ensure uniform and consistent results that allow cross-agency comparisons and benchmarking.
ForeSee Results was initially a joint venture between Compuware, a software development and professional services company, and the CFI Group USA, LLC, an international leader in assessing and understanding customer satisfaction. Dr. Claes Fornell, the Director of the National Quality Research Center at the Stephen M. Ross School of Business at the University of Michigan and creator of the ACSI, founded the CFI Group in 1988. Along with other economic objectives, such as employment and growth, the quality of output (goods and services) is important to living standards. Like other objectives, the quality of goods and services should be subjected to systematic and uniform measurement. This is the rationale for the ACSI. In the most general sense, the ultimate purpose of the ACSI is to help improve the quality of goods and services available to American citizens, regardless of whether they come from the public or private sectors. The benefits to government agencies of using the ACSI for customer satisfaction measures are:
Reliance on the only national uniform and scientifically established measure of customer satisfaction.
Confidence in having the most accurate and researched index of customer satisfaction available.
Capability to benchmark against other agencies as well as private sector companies.
Information on how to improve website satisfaction.
Impartiality, objectivity, and stature of a leading consulting company and two of the world's leading non-profit organizations for customer satisfaction and quality: the Stephen M. Ross School of Business at the University of Michigan and the American Society for Quality (ASQ).
High quality of data.
Ability to measure customer satisfaction continuously.
In 1999, the federal government selected the ACSI to be a standard metric for agency evaluation of customer satisfaction. Since December 1999, the University of Michigan's Stephen M. Ross School of Business has published annually a national index of customer satisfaction with federal government services. Since September 2003, the University of Michigan has published quarterly E-Government scores for websites that participate in the ACSI. In 2008, the Federal Consulting Group and the CFI Group partnered to produce the first Government Call Center Satisfaction Index. The Federal Consulting Group is the executive agent for the ACSI in the federal government and offers the ForeSee Results web survey tool to federal agencies on an annual subscription basis.
2. Use of the Data
All data is collected on-line, and the agencies receive access to their data and related reports 24/7 via a sophisticated on-line portal. This portal provides accurate and actionable information that enables agency web teams and managers to focus time, energy, and resources on areas that matter most to their web customers. A brief survey made up of a combination of standard and custom questions is triggered randomly for the smallest possible percentage of site visitors needed to achieve statistically valid information. The survey continuously and unobtrusively gathers information from agency website visitors about their overall satisfaction with the agency’s site, satisfaction with specific site elements, and their likelihood to return to, recommend, or transact with the agency site in the future. All reporting and data storage are done through secure servers that reside at the ForeSee Results site so that agency site performance is not affected. In addition, aggregate data on government website satisfaction is maintained and available for comparative purposes.
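For illustration only, the random triggering described above can be thought of as a simple per-visitor sampling check. The sketch below (Python) is a hypothetical simplification; the function name, sampling rate, and page threshold are placeholders and do not represent ForeSee Results' actual implementation or qualification rules.

    import random

    def should_offer_survey(pages_viewed, sampling_rate=0.02, min_pages=3):
        """Hypothetical illustration: invite only a small random fraction of
        visitors, and only after they have viewed enough pages to form a
        reasonable impression of the site. The values shown are placeholders,
        not the rates actually used in the ACSI web surveys."""
        if pages_viewed < min_pages:
            return False
        return random.random() < sampling_rate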
3. Use of Improved Information Technology to Reduce Burden
No other web survey instrument employs the patented methodology of the ACSI. Other tools available to agencies measure activities such as the number of page views, the amount of time per visit to a website, and website reliability, but they do not capture data on customer satisfaction. Moreover, most other customer satisfaction survey tools are not able to capture data on the customer experience both randomly and after the customer has visited enough web pages to render a reasonable evaluation of their experience.
4. Efforts to Identify Duplication
Respondents for the web survey are selected at random and, typically, only after the website visitor has had a unique experience with the agency’s website. For agencies with large numbers of visitors, it is unlikely that individual respondents will be selected to complete more than one random survey. There are no "special circumstances" as contemplated within item 7 of the "Certification Requirements for Paperwork Reduction Act Submissions." There are no situations where respondents would be required to: prepare a written response to the survey, submit more than an original and two copies of any document, or retain records for more than three years.
5. Methods to Minimize Burden on Small Businesses or Other Small Entities
The collection of information will not impact small businesses or other small entities as indicated in item 5 of OMB Form 83-I.
6. Consequences of Less Frequent Collection on Federal Programs or Policy Activities
Agencies that do not evaluate the customer satisfaction of their websites are at risk that:
They will focus on the wrong measure of success – how well the website serves the agency’s needs instead of citizens’ needs.
The President's Management Agenda goal of creating a citizen-centered, electronic government that provides the best possible service and information to citizens may not be realized.
Citizens will benchmark agency website performance against the “best in business” and will not return to or recommend government websites that do not meet their expectations.
They will not see productivity gains, necessary improvements and sufficient returns on their information technology budgets.
Potential savings of doing government business via websites will not be realized, thus missing an important opportunity to reduce costs.
Citizen satisfaction will decline, which will lead to an overall reduction in citizen trust in government.
7. Special Circumstances Requiring Data Collection to be Inconsistent with Guidelines in 5 CFR 1320.5(d)(2)
ForeSee Results will collect information under this clearance in a manner that complies with 5 CFR 1320.5(d)(2).
8. Consultation with Individuals Outside of the Agency on Availability of Data, Frequency of Collection, Clarity of Instructions and Forms, and Data Elements
This survey employs a methodology that was previously reviewed and approved by the Office of Management and Budget. It does not require respondents to submit proprietary trade secrets or other confidential information, and it does not include a pledge of confidentiality.
The Federal Consulting Group published a notice in the Federal Register on June 18, 2008, as required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. No public comments were received.
9. Explanation of Decision to Provide Any Payment or Gift to Respondents
No payments or gifts will be made to respondents.
10. Assurance of Confidentiality of Responses
Individuals and organizations given the opportunity to take a survey will be assured of the confidentiality of their replies under 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974) and OMB Circular No. A-130. Survey respondents will be advised on the survey form or in a privacy statement that participation is voluntary and that the data provided will be kept confidential.
11. Justification of Sensitive Questions
This website survey will not ask questions or collect data of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. However, on occasion, some respondents may consider some of the standard demographic questions as sensitive in nature (e.g., questions that request the respondent’s age, gender, education, or household income). Demographic questions are useful in segmenting the responses of different user groups or visitor profiles and are helpful in evaluating the results; therefore, respondents will be encouraged to answer these questions but assured that their participation is completely voluntary.
12. Estimated Burden of Information Collection
The total respondent burden on the public of the ACSI website survey measurements during this three-year approval period is estimated to be 140,625 hours. The actual number of surveys is unknown at this time and will vary based on participation by federal agencies and as new websites are added or deleted. This estimate is based on our experience from the previous three-year approval period, as further explained in item 15. The projected estimates for fiscal years 2008-2010 are as follows:
Fiscal Year 2008. 200 Website Customer Satisfaction Measurements: The burden for 200 website customer satisfaction measurements is estimated at 1,000,000 completed surveys, consuming 41,667 hours per year. This was calculated as follows: 5,000 respondents surveyed for each of 200 websites, at 2.5 minutes per survey.
Fiscal Year 2009. 225 Website Customer Satisfaction Measurements: The burden for 225 website customer satisfaction measurements is estimated at 1,125,000 completed surveys, consuming 46,875 hours per year. This was calculated as follows: 5,000 respondents surveyed for each of 225 websites, at 2.5 minutes per survey.
Fiscal Year 2010. 250 Website Customer Satisfaction Measurements: The burden for 250 website customer satisfaction measurements is estimated at 1,250,000 completed surveys, consuming 52,083 hours per year. This was calculated as follows: 5,000 respondents surveyed for each of 250 websites, at 2.5 minutes per survey.
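The yearly figures above follow directly from the stated assumptions of 5,000 respondents per website and 2.5 minutes per survey. The short sketch below (Python) simply re-derives them as an arithmetic cross-check; it is illustrative only and is not part of the survey methodology.

    # Arithmetic cross-check of the burden-hour estimates stated above.
    RESPONDENTS_PER_SITE = 5000
    MINUTES_PER_SURVEY = 2.5

    for fiscal_year, sites in [(2008, 200), (2009, 225), (2010, 250)]:
        surveys = sites * RESPONDENTS_PER_SITE
        hours = round(surveys * MINUTES_PER_SURVEY / 60)
        print(fiscal_year, surveys, hours)
    # FY 2008: 1,000,000 surveys, 41,667 hours
    # FY 2009: 1,125,000 surveys, 46,875 hours
    # FY 2010: 1,250,000 surveys, 52,083 hours
    # Three-year total: 140,625 hours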
13. Estimated Total Annual Cost Burden to Respondents
There is no monetary cost to respondents for participating in these data gathering efforts. The vast majority of the costs associated with these efforts are borne by the participating government agencies.
14. Estimated Total Annual Cost Burden to the Federal Government
The total annual cost burden to the government resulting from the collection of information will vary based on participation by federal agencies, as new website measures are added or existing measures are not renewed. We are not able to account for the net effect of an ACSI website measure replacing survey work performed by other contractors or by internal agency staff. Projected estimates for fiscal years 2008-2010 are as follows:
Fiscal Year 2008. 200 Website Customer Satisfaction Measurements (of which 25 are new website measures and 175 are repeat measures): The total annualized cost for website customer satisfaction measurement of 200 websites is estimated at $7,625,000. This was calculated by adding the annualized capital/startup costs of $225,000 to the total annual costs of $7,400,000. The annualized capital/startup costs reflect each of the 25 new agency websites contributing 0.1 of a full-time equivalent (FTE) in preparation time at $90,000 per FTE. The total annual costs reflect each of the 200 agency websites contributing an average of $28,000 plus 0.1 of an FTE annually at $90,000 per FTE.
Fiscal Year 2009. 225 Website Customer Satisfaction Measurements (of which 25 are new website measures and 200 are repeat measures): The total annualized cost for website customer satisfaction measurement of 225 websites is estimated at $8,550,000. This was calculated by adding the annualized capital/startup costs of $225,000 to the total annual costs of $8,325,000. The annualized capital/startup costs reflect each of the 25 new agency websites contributing 0.1 of an FTE in preparation time at $90,000 per FTE. The total annual costs reflect each of the 225 agency websites contributing an average of $28,000 plus 0.1 of an FTE annually at $90,000 per FTE.
Fiscal Year 2010. 250 Website Customer Satisfaction Measurements (of which 25 are new website measures and 225 are repeat measures): The total annualized cost for website customer satisfaction measurement of 250 websites is estimated at $9,725,000. This was calculated by adding the annualized capital/startup costs of $225,000 to the total annual costs of $9,500,000. The annualized capital/startup costs reflect each of the 25 new agency websites contributing 0.1 of an FTE in preparation time at $90,000 per FTE. The total annual costs reflect each of the 250 agency websites contributing an average of $29,000 plus 0.1 of an FTE annually at $90,000 per FTE.
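As an arithmetic cross-check, each year's total can be re-derived from the figures stated above: annualized capital/startup costs of 0.1 FTE at $90,000 per FTE for each of the 25 new websites, plus each participating website's average contribution and 0.1 FTE of annual staff time. The sketch below (Python) is illustrative only.

    # Arithmetic cross-check of the annualized cost estimates stated above.
    STAFF_COST_PER_SITE = 9000   # 0.1 FTE at $90,000 per FTE

    def total_cost(sites, new_sites, avg_contribution):
        startup = new_sites * STAFF_COST_PER_SITE              # $225,000 per year
        annual = sites * (avg_contribution + STAFF_COST_PER_SITE)
        return startup + annual

    print(total_cost(200, 25, 28000))  # 7,625,000 (FY 2008)
    print(total_cost(225, 25, 28000))  # 8,550,000 (FY 2009)
    print(total_cost(250, 25, 29000))  # 9,725,000 (FY 2010)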
15. Reasons for Change in Burden
We estimated that the respondent burden for the prior three years would be approximately 156,250 hours. That estimate was based on an expected participation of 200 websites in the first year of the clearance, growing by 50 additional websites in each succeeding year (that is, 200, 250, and 300 websites, which at 5,000 respondents per website and 2.5 minutes per survey total 156,250 hours). The rate of growth realized was lower than expected, due to limited agency funding and fewer agencies than anticipated adding customer surveys to their websites.
Using the past three years as a baseline, we estimate that we will use 41,667 burden hours in year one, 46,875 burden hours in year two, and 52,083 burden hours in year three, for a three-year total of 140,625 hours. This is a reduction of 15,625 burden hours from the previous three-year projection, reflecting a more modest growth estimate for agency use of the ACSI surveys on websites.
16. Plans for Tabulation, Statistical Analysis and Publication
Each agency that participates in the ACSI E-Government website survey has access to its ACSI scores and detailed statistical and analytical data through an on-line reporting portal maintained by ForeSee Results. In addition to receiving monthly on-line reports, agencies generally select a level of service that includes a satisfaction research analyst provided by the contractor. This analyst prepares a detailed satisfaction insight review each quarter for review with the agency. This review of an agency website provides more detailed statistical data, analysis, and trend information about the satisfaction results. Recommendations for improvement are also an important part of each deliverable.
The Stephen M. Ross School of Business at the University of Michigan intends to continue quarterly publication of the ACSI E-Government website scores for participating websites. In addition, coinciding with the release of the E-Government website scores, ForeSee Results publishes a commentary analyzing government-wide trends.
17. Reasons Why Displaying the OMB Expiration Date is Inappropriate
We believe that not displaying the OMB approval expiration date for the information collection on the website will increase response rates. Would-be respondents might refuse to participate if the form carries an authorization date that has expired or will soon expire.
18. Exceptions to the Certification Statement on OMB Form 83-I
No exception to the certification statement is being requested.