2007 TreasuryDirect Customer Satisfaction Survey


Voluntary Customer Satisfaction Survey to Implement Executive Order 12862


OMB: 1535-0122


November 30, 2007

Background


The Bureau of the Public Debt plays an integral role in financing the operating expenses of the federal government. We sell U.S. Treasury securities to the public to help fund our country’s expenses and to pay off maturing debt. Launched in 2001, the web-based TreasuryDirect system allows customers to purchase and manage U.S. Treasury securities directly from Treasury.


To provide high-quality, responsive customer service and meet our obligations under Executive Order 12862, we need to understand our customers' needs and the new system features they want. Surveying our customers lets us know them better and gives us valuable feedback on the quality of our services.


Getting in touch now is important because of the rapid advances in business technology. Today’s investors are more electronically savvy than their predecessors. They can tell us whether the newest services we’ve provided in TreasuryDirect meet their investing needs, what investment features are most important to them, and how we can continue to improve and gain new investors. Again, the bottom line is customer satisfaction. Have we achieved it…and are we positioned to continue achieving it?


This is the first comprehensive survey of our TreasuryDirect customers, setting the baseline for future surveys. We’ve recently contracted the services of a research company experienced in survey methodologies. Working with the vendor, we will:


  • Ensure the survey is conducted with statistical accuracy;

  • Ensure the data collected will be accurately analyzed;

  • Ensure the data we benchmark will reliably support comparisons we may want to make when conducting future online surveys; and

  • Build our own in-house knowledge of online data sampling.


Survey Goals


The survey will gather basic demographics, general information about our customers’ investment experiences with Treasury securities, and a few key facts about their Internet usage. Most importantly, it will measure our customers’ satisfaction with TreasuryDirect. Through survey responses, we will:


  • Measure customer satisfaction, both quantitatively and qualitatively, with our TreasuryDirect services and with our customer service representatives.

  • Gather attitudes and opinions about:

    • the information and transactions available through the TreasuryDirect web application, and

    • other financial services sites frequented by TreasuryDirect customers.

  • Better understand why some customers have inactive TreasuryDirect accounts; i.e., why they have never invested or have had no purchases in the past two years.

  • Gauge customer interest in new or prospective features and services that are designed to enhance the TreasuryDirect investment process.

  • Better understand the financial investment behaviors and practices of our customers.

  • Establish a benchmark for future TreasuryDirect surveys and support our goal to have the best business practices.

  • Fulfill Public Debt’s commitment under Executive Order 12862 to regularly survey our customers.


Research Methodology


Population


The TreasuryDirect system has just over 564,000 accounts from which we’ll draw our random sample.


Survey Sample


We’ll divide our population into four sub-groups, based on whether customers have made TreasuryDirect purchases and accessed their accounts in the past two years. Group 1 consists of customers who have made purchases within the past two years; 99 percent of these customers have logged in to their accounts in the past two years. This is the largest of the four groups. Groups 2 and 3 consist of customers with a history of purchasing but not within the past two years. Group 2 customers have accessed their accounts within the past two years, whereas Group 3 customers have not. Group 4 customers are those who have never made a TreasuryDirect purchase but at some point in time logged into their accounts. We view these customers as inactive, and while they are not the primary group of study, we hope to learn from them the reasons for their lack of interest in TreasuryDirect. Using random selection, we will disproportionately sample from the four groups to ensure adequate numbers for analysis. When we analyze the survey results for the entire sample taken as a whole, we will weight the responses to reflect the proportions in which Groups 1 through 4 occur in TreasuryDirect’s customer population.
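
As a rough illustration of that weighting step, the sketch below uses the group sizes and projected completed surveys from Table 1 to compute relative post-stratification weights. It applies a standard textbook formula and is shown only as an example; it is not the vendor's actual procedure.

```python
# Illustrative sketch: weighting a disproportionate stratified sample back to
# population proportions. Group sizes and projected completes are from Table 1;
# the weighting shown is a standard post-stratification formula, not
# necessarily the vendor's exact method.

population = {"Group 1": 246_435, "Group 2": 93_725,
              "Group 3": 57_342, "Group 4": 166_913}
completes  = {"Group 1": 1_000, "Group 2": 700,
              "Group 3": 700, "Group 4": 300}

N = sum(population.values())          # 564,415 accounts
n = sum(completes.values())           # 2,700 projected respondents

for group in population:
    pop_share    = population[group] / N      # share of the customer population
    sample_share = completes[group] / n       # share of completed surveys
    weight       = pop_share / sample_share   # relative weight per respondent
    print(f"{group}: population share {pop_share:.1%}, "
          f"sample share {sample_share:.1%}, weight {weight:.2f}")
```

Respondents in over-sampled groups (such as Group 3) receive weights below 1, and those in under-sampled groups receive weights above 1, so the weighted totals mirror the customer population.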

Table 1 summarizes the sampling frames, survey samples, and projected number of respondents. The estimated response rates reflect the customers’ varying degrees of involvement in the TreasuryDirect program and their consequent willingness to participate in the survey. Although 30 percent for Group 4 is lower than we would like, we believe it is a realistic projection for customers whose lack of account activity reflects little interest or involvement in TreasuryDirect. These “inactive” customers are nevertheless of interest in this study because their responses might point to improvements that could stimulate dormant accounts. Table 2 shows the estimated burden per respondent and for the project overall.
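
For reference, the margins of error shown in Table 1 are consistent with the conventional 95-percent confidence half-width for a proportion of 0.5 (the most conservative case), assuming simple random sampling within each group and ignoring the finite population correction. A short sketch of that calculation:

```python
import math

# Margin of error at 95% confidence for a proportion p = 0.5, assuming simple
# random sampling and no finite population correction. These values reproduce
# the Margin of Error column in Table 1.

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for label, n in [("Group 1", 1_000), ("Group 2", 700),
                 ("Group 3", 700), ("Group 4", 300), ("Overall", 2_700)]:
    print(f"{label}: n = {n:5d}, margin of error = {margin_of_error(n):.1%}")
```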


Table 1.

Group   | Purchased      | Accessed         | Respondent Universe | Survey Sample | Est. Response Rate | Projected Completed Surveys | Margin of Error
Group 1 | within 2 yrs   | 99% within 2 yrs | 246,435             | 1,250         | 80%                | 1,000                       | 3.1%
Group 2 | over 2 yrs ago | within 2 yrs     | 93,725              | 1,000         | 70%                | 700                         | 3.7%
Group 3 | over 2 yrs ago | over 2 yrs ago   | 57,342              | 1,400         | 50%                | 700                         | 3.7%
Group 4 | never          | at some point    | 166,913             | 1,000         | 30%                | 300                         | 5.7%
Total   |                |                  | 564,415             | 4,650         |                    | 2,700                       | 1.9%



Table 2.


Number of Respondents | Total Burden per Respondent (Minutes) | Total Annual Burden (Minutes) | Total Annual Burden (Hours)
2,700                 | 8.5                                   | 22,950                        | 383

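The Table 2 figures follow directly from the respondent count and the per-respondent time; a quick arithmetic check (with hours rounded up):

```python
import math

# Quick arithmetic check of the Table 2 burden estimate.
respondents = 2_700
minutes_per_respondent = 8.5

total_minutes = respondents * minutes_per_respondent   # 22,950 minutes
total_hours = math.ceil(total_minutes / 60)             # 382.5, rounded up to 383

print(f"Total annual burden: {total_minutes:,.0f} minutes (about {total_hours} hours)")
```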

Collection Method


We believe that the web-based relationship we share with our customers makes an online collection method appropriate. To become a TreasuryDirect account holder, customers apply online by providing personal information (e.g., name, Social Security number, address, email address, and bank information). We then verify their identity before we allow them to purchase and hold Treasury securities in their accounts. This process gives us a high-quality sampling frame for our target population, reduces the expense of contacting customers, and lets us leverage the built-in edits and prompts of online surveys to increase data integrity.


Procedures to Deal with Non-Response


Consistent with the research literature on survey data collection, we have planned at least four communications with the survey sample. First, we’ll send a letter on letterhead through surface mail, informing the customer sample of the upcoming Internet survey and asking for their participation. A week later, our vendor will email the sample an invitation to the survey with a link to the site and a password for access. Two follow-up reminders are planned, one by surface mail and another by email, if we don’t get responses within a reasonable time. Third and fourth reminders, by email, are also in our budget; we will judge the need to send them once we see the response rate. All survey correspondence will carry our vendor’s toll-free number, with encouragement to call with questions. If a potential respondent prefers to respond by some means other than the Internet, our vendor will offer a variety of options: telephone interview, hard copy with a postage-paid return envelope, or fax. Experienced interviewers with knowledge of the study will explain its purpose and importance.


Response Rate


Although the information being gathered does not fall under the influential category, we’ll take the following approaches to get an 80 percent response rate:


  • Working with the vendor to create clear, concise questions and a user-friendly online format. The vendor will pretest the survey questionnaire on nine or fewer customers.

  • Mailing a letter to respondents in advance of the survey (as mentioned above). The letter will be personally addressed to the respondent and will include a) a pledge of confidentiality, b) a senior agency official’s signature, c) the approximate time needed to complete the survey, d) acknowledgment of the vendor conducting the survey, and e) a contact number and email address so respondents can verify the survey’s legitimacy.

  • Emailing an invitation to take the survey (as mentioned above). The invitation, sent by the vendor, will include a link to the survey. Respondents will use a password to gain access to the survey site; this should give them confidence that unauthorized people cannot tamper with their responses.

  • Assuring customers that the online survey is safe and secure by using Secure Sockets Layer (SSL) software to encrypt all respondents' information so that it cannot be read as the information travels over the Internet.

  • Sending follow-up email(s) and/or letter(s) to non-respondents (as mentioned above). We’ll assign a unique tracking number to each survey. The tracking number ensures each customer polled may submit only one survey, and it allows us to easily identify non-respondents for follow-up (see the sketch after this list).

  • Working with our vendor to take advantage of other reasonable avenues to encourage respondent participation. The vendor will follow the guidelines for maximizing response rates listed in Questions and Answers When Designing Surveys for Information Collection.
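
As a rough sketch of how unique tracking numbers can support both duplicate prevention and non-respondent follow-up, the example below uses hypothetical data structures and function names; it is illustrative only and does not describe the vendor's actual system.

```python
import secrets

# Hypothetical illustration: use a unique tracking number per sampled customer
# to accept at most one submission each and to list non-respondents for
# follow-up. Nothing here reflects the vendor's actual implementation.

def assign_tracking_numbers(customer_ids):
    """Map a random, hard-to-guess tracking number to each sampled customer."""
    return {secrets.token_urlsafe(8): cid for cid in customer_ids}

def record_response(tracking_number, tracking, completed):
    """Accept a submission only if the tracking number is valid and unused."""
    if tracking_number not in tracking:
        return False                      # unknown number: reject
    if tracking_number in completed:
        return False                      # already submitted: reject duplicate
    completed.add(tracking_number)
    return True

def non_respondents(tracking, completed):
    """Customers whose tracking numbers have no completed survey yet."""
    return [cid for tn, cid in tracking.items() if tn not in completed]

# Example: three sampled customers, one completes the survey.
tracking = assign_tracking_numbers(["cust-001", "cust-002", "cust-003"])
completed = set()
first_number = next(iter(tracking))
record_response(first_number, tracking, completed)   # accepted
record_response(first_number, tracking, completed)   # duplicate, rejected
print(non_respondents(tracking, completed))          # two remaining customers
```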


Research suggests that response rates to web-based surveys have declined in recent years, most likely because of the proliferation of junk email, spam filters, and competing Internet surveys, and perhaps because the Internet has lost its novelty as a data collection vehicle. Nevertheless, we believe the online relationship we have with our customers has created trust and has allowed us to assemble reasonably accurate contact information, two key factors that should help us achieve a good response rate.


The vendor will also supply a professional statistician to ensure that non-response does not bias our analysis. We don’t believe the variables being measured will discourage participation. Because the survey includes no sensitive issues, we don’t expect respondents to refuse to answer individual items. This is a simple survey about satisfaction with an online method of buying Treasury products.


We’ll use the survey data mainly to gauge customer satisfaction and to improve customer service. Respondents’ feedback could affect product decisions.
