1545-1432 Follow-up


Voluntary Customer Surveys to Implement E.O. 12862 Coordinated by the Corporate Planning and Performance Division on Behalf of All IRS Operations Functions


OMB: 1545-1432


Appendix B


Office of Management and Budget (1545-1432)

Summary of Projects Submitted for Approval and Results of Data Collections

(August 2007 - August 2009)




Control # and Name: CS-06-18 Re: 2007 eService Study

Participants: 1,897 (3,210 Requests)

Data Collection Began: 4/16/2007 Data Collection Ended: 5/31/2007 Burden Hours: 474

Cost: $110,618 Response Rate: 59.1%


Purpose: IRS intends to track customer satisfaction with the e-Services program and asked Russell to review all past e-Services research to determine if benchmarks exist for tracking. There were no true benchmarks among the tightly-focused e-Services studies conducted previously, so a new e-Services Customer Satisfaction Tracking Study was authorized and conducted. The purpose of this study was to benchmark User satisfaction with, attitudes toward, and concerns about e-Services as well as Non-User reasons for non-usage and interest in e-Services.

Findings: Benchmark results show high satisfaction with e-Services among Heavy Users, but not among others – especially Medium Users.

While Users clearly like e-Services and the vast majority would recommend it to others, a high proportion feel e-Services should be improved. Leading suggestions for improvement include making it easier to use, improving specific services, providing easier access to information, and improving speed of activity.

Users (especially Heavy Users) are highly satisfied with specific dimensions of e-Services such as Site Appearance, Security, Speed In Transmission, Services, Types Of Info, Site Content, and Response/Acknowledgement. However, several areas generate notably low satisfaction among one or more User segments: availability of help at the help desk, understanding error messages and reject codes, site tutorials for different services, and the 6-month password reset.

Aside from Registration, the e-Services with the highest claimed usage are PTIN Application and DA, though usage levels tend to vary across the three User segments.

Satisfaction with each Service (among its Users) is at about 80% or higher, with PTIN Application satisfaction highest, at 91%.

Among Non-Users of e-Services, we found that 83% are aware of the program and that 53% say they are Personally Registered (IRS believes them not to be registered).

Their top reasons for non-usage to this point are Do Not Need It and Need More Information – which, incidentally, are the top two things that USERS indicate IRS should do to stimulate usage – Create Demand and Provide More Information about the program.

After hearing the program description, 79% of Non-Users say they are likely to use e-Services (57% Very Likely), and three-fourths or more believe they are qualified to use each Service in the e-Services suite.

In other learning from the Benchmark study, we found that Users tend to have higher usage of other types of websites than Non-Users but that, for both groups, there are specific websites which stand out as exceptional and may offer lessons for the e-Services program.

Finally, we found that there are notable differences in the make-up of Users and Non-Users, with Non-Users being more likely to work in a firm, and in far larger firms, with far more active Tax Preparers, who are filing far more total tax returns.

Actions taken or lessons learned: TBD by IRS product team. Due to budget constraints and business ownership issues, ETA was not able to act on the survey results to make timely e-Services continuous improvements for the 2008 filing season. ETA will share results with W&I CAS EPSS (the business owner for e-Services) to identify and implement next steps.


Control # and Name: CS-06-19 Re: LMSB CAP Study

Participants: 30

Data Collection Began: 3/26/2007 Data Collection Ended: 4/20/2007 Burden Hours: 5.2

Cost: $46,000 Response Rate: 73%


Purpose: To understand satisfaction levels among taxpayers of the Compliance Assurance Process (CAP) program for tax years 2005 and 2006 and to compare findings with information gathered from qualitative CAP interviews conducted in June and July 2006.

Findings: Overall, CAP participants are satisfied with the CAP program. The average overall satisfaction rating with the CAP program is 4.23, and 93% of CAP program participants indicated that they would recommend CAP to others.

Actions taken or lessons learned: The results were used, along with a survey of revenue agents, in evaluating whether the program should be continued, expanded, or abandoned. Results from the survey supported expanding the number of participants in the program.


Control # and Name: CS-06-20 Re: Practitioner Priority Service Study

Participants: 2,526 (6,853 Requests)

Data Collection Began: January 2007 Data Collection Ended: December 2007 Burden Hours: 397

Cost: $77,160 Response Rate: 37%


Purpose: The research was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives are to identify what PPS staff and managers can do to improve customer service and to track customer satisfaction with PPS's progress over time.

Findings: Ninety-two percent of PPS customers are satisfied with the service received, while only three percent are dissatisfied. Customers continue to be most satisfied with “Professionalism of Representative” and “Representative’s willingness to listen to you and help with your issue”. PPS customers are continually least satisfied with “Length of your wait to talk to a representative”.

Actions taken or lessons learned: It is important to set customer expectations concerning hold time, and make sure that representatives are able to fully answer customer’s questions and resolve their issues before ending the call. It is also important to provide PPS representatives with the appropriate level of authority to completely address all taxpayer issues during the call, and if not, train them to transfer the taxpayer to someone who can effectively help them.


Control # and Name: CS-06-21 Re: Toll Free Study

Participants: 12,799 (35,250 Requests)

Data Collection Began: January 2007 Data Collection Ended: December 2007 Burden Hours: 2,810

Cost: $113,560 Response Rate: 36%


Purpose: This research is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are:

  • to identify what Toll-free staff and managers can do to improve customer service, and

  • to track customer satisfaction with Toll-free's progress over time.

Findings: In the current period, 95% of Toll-free customers are satisfied (giving a rating of 4 or 5 on a 5-point scale), and only 2% are dissatisfied (giving a rating of 1 or 2). The overall satisfaction rating of 4.64 for the current period (October through December 2007) is not significantly different from the 4.66 rating for the previous period (July through September 2007) or from the 4.65 rating for the corresponding period of last year (October through December 2006); customers' perceptions of Toll-free service have remained constant across these periods.

Actions taken or lessons learned: To increase customer satisfaction with Toll-free, focus improvement efforts on the following items:

  • Time to Get through to the IRS is continually the top improvement priority for Toll-free customers and received the lowest satisfaction rating (3.89). This is the only rating item to fall below a rating of 4.

  • After You Reached a Representative, Time to Complete Call is the second-highest improvement priority for Toll-free customers.

Recognizing that it is difficult to make changes to the automated phone system, a separate leverage analysis was performed to look at the items that Toll-free can control: the customer service representative (CSR) attributes. Within this separate analysis, After You Reached a Representative, Time to Complete Call is continually the top improvement priority.

Conclusions from open-ended questions, which provide insight into the top improvement priorities:

    • Toll-free customers want the automated answering system menu options to be simplified and clarified so they are able to speak with a representative as quickly as possible. Customers reported that it not only takes too long to get through to a representative, but the length of the call is also extended due to the representative’s lack of knowledge. Customers suggested that the call length could be shortened by increasing the knowledge of representatives. Depending on their needs and experiences, customers can evaluate the Toll-free process differently, which can cause significant variations between customer subgroups.

    • Customers who feel their issue was completely or partly resolved at the end of their phone call, customers whose call lasted 20 minutes or less, individual taxpayers, customers who experienced no transfers, customers who called only one time about their issue, and customers whose call was answered by the Puerto Rico call site gave higher overall satisfaction ratings than customers as a whole.

    • Customers who were either business taxpayers, exempt organizations, or tax professionals; customers whose call lasted 21 minutes or longer; customers who experienced two or more transfers; customers who called four or five times about their issue; customers who feel their issue was not resolved at the end of their phone call; and customers whose call was answered by the Oakland, Baltimore, or Cleveland call site gave lower ratings than customers as a whole.


Control # and Name: CS-06-22 Re: Automated Collection System

Participants: 3,910 (9,506 Requests)

Data Collection Began: October 2006 Data Collection Ended: September 2007 Burden Hours: 568

Cost: $89,689 Response Rate: 41%

Purpose: The survey was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. It assesses customer satisfaction with the Automated Collection System (ACS) and the service customers received during their calls. The objectives are to administer the survey using an IVR system, to track customer satisfaction at eight ACS sites and nationwide, and to identify actionable improvement opportunities.

Findings: The cumulative year-to-date overall satisfaction rating is 4.63 on a 5-point scale, while only 3% of customers were dissatisfied (giving a rating of 1 or 2). ACS customers continue to express their appreciation for representatives who are helpful, friendly, knowledgeable, and fair.

Actions taken or lessons learned: It is important to increase customer satisfaction by improving the tone of IRS correspondence so that it is precise and clear. Customers remain frustrated with the amount of time they spend calling the ACS line. The IRS needs to incorporate procedures that will reduce the amount of time customers spend on hold when calling the ACS phone number and the amount of time it takes to complete their call.


Control # and Name: CS-07-23 Re: Field Assistance Study

Participants: 463,035 (6,169,124 Requests)

Data Collection Began: October 2006 Data Collection Ended: September 2007 Burden Hours: 1,575

Cost: $246,626 Response Rate: 8%

Purpose: The survey was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The IRS has been measuring customer satisfaction in its Taxpayer Assistance Centers (TACs) since January 1998 using a survey comment card. The overall goal of this survey is to provide meaningful feedback to managers and staff in those field offices to improve the services provided at these Taxpayer Assistance Centers.

Findings: The overall satisfaction is 89% (rating of 4 or 5), and 7% are dissatisfied (rating of 1 or 2). The national overall satisfaction rating is 4.55. Customers remain most satisfied with “Employee Attitude”, giving it an average satisfaction rating of 4.65 on a 5-point scale. Customers are also satisfied with “Employee Skill or Knowledge” and “Listening to Your Concerns”, giving both an average satisfaction rating of 4.64. Of the 7% of customers who gave an overall satisfaction rating of 1 or 2 (dissatisfied), 38% do not feel they have a better understanding of their tax responsibilities after visiting a Field Assistance office, compared to just 10% for all other customers combined (those giving an overall rating of 3, 4, or 5).

Actions taken or lessons learned: Field Assistance wait times exceed customers’ expectations. Although nearly half of all customers waited less than 5 minutes for service, “Promptness of Service” remains the top improvement priority for customers and a very important item for them. Field Assistance customers also remain concerned with resolving their question/issue, which is their second highest improvement priority. The IRS needs to look at ways to decrease the wait time and to continue to train Customer Service Representatives so that taxpayers receive the correct answer to their question/issue.

Control # and Name: CS-06-24 Re: Innocent Spouse

Participants: 2,276 (6,600 Requests)

Data Collection Began: October 2006 Data Collection Ended: September 2007 Burden Hours: 302

Cost: $79,689 Response Rate: 36%

Purpose: The survey was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The Innocent Spouse Program is responsible for protecting the rights of the requesting and non-requesting spouses. They ensure each claim receives timely and consistent treatment in accordance with established guidelines and the law. The project has three primary goals: 1) to identify customer expectations of the Innocent Spouse program; 2) to track customer satisfaction for the Innocent Spouse Program on a national level; and, 3) to identify operational improvements.

Findings: The overall satisfaction rating (3.24) for the current year (October 2006 through September 2007) is significantly lower than the rating (3.46) for the previous year (October 2005 through September 2006); the decrease is due to a real change in customers' perceptions, as well as differences in customer characteristics. When comparing the current year to the previous year, more customers had their claim disallowed (39% vs. 26%), and more customers experienced a claim process that lasted 9 months to less than 12 months (28% vs. 25%). These customers tend to give lower ratings. Fewer customers had their claim allowed (44% vs. 55%), and these customers tend to give higher ratings.

Actions taken or lessons learned: The IRS will continue to work on shortening the length of the claim process and the time that customers spend on their claim. Communicating the length of the process may help set customers' expectations regarding how long it will take. Customers should be provided detailed explanations of why documents are requested, and these documents should be requested all at once.


Control # and Name: CS-06-25 Re: CSCO Survey

Participants: 1,820

Data Collection Began: October 2006 Data Collection Ended: September 2007 Burden Hours: 385

Cost: $93,656 Response Rate: 21%

Purpose: The survey was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The Compliance Services Collection Operation (CSCO) within the Wage & Investment Division is responsible for resolving taxpayer accounts using correspondence in a method that reduces taxpayer burden and increases voluntary compliance. As an important customer interface for W&I, CSCO needs feedback from its customers to continually improve operations. The CSCO customer satisfaction survey project has three primary goals: 1) to survey external customers on an ongoing basis regarding their expectations of CSCO; 2) to track customer satisfaction at the 5 W&I sites; and 3) to identify operational improvements.

Findings: CSCO customers remain most satisfied with the IRS's explanation of payment options, giving it a rating of 3.86. Customers also remain relatively satisfied with the IRS's ability to follow through (3.75) and the fair treatment they receive (3.74). The annual overall satisfaction rating of 3.67 for the current year (October 2006 through September 2007) is not significantly different from the 3.68 rating for the previous year. Sixty percent of CSCO W&I customers are satisfied with the service they received this year, and 17% are dissatisfied. The top annual improvement priority for all CSCO customers is the “Length of the Correspondence Collection Process”.

Actions taken or lessons learned: It is important to respond to customers' written inquiries more quickly. Customers remain frustrated with the length of the process; providing them with a timeline of approximately how long the process will take will help ease their frustration in this area. We will strive to ensure that it is easy for customers to resolve their issues through written correspondence, that notices sent to customers are easy to read and understand, and that customers are kept better informed of the status of their case.


Control # and Name: CS-07-26 Re: CC Exam

Participants: 1,788 (8,698 Requests)

Data Collection Began: October 2006 Data Collection Ended: September 2007 Burden Hours: 439

Cost: $77,176 Response Rate: 21%

Purpose: This research is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are to identify what CC Exam staff and managers can do to improve customer service and to track customer satisfaction with CC Exam’s progress over time.

Findings: Customers remain most satisfied with the employees’ courtesy and professionalism, giving it a rating of 3.73 on a 5-point scale. The actual length of the exam is greater than expected. 49% of respondents expected their exam to be completed in two months or less, but only 15% of respondents experienced an exam of that length. In addition, 27% of respondents experienced an exam that lasted more than six months (only 9% expected their exam to be that long). Depending on their needs and experiences, customers can evaluate their experience with CCE differently, which can cause significant variations between customer subgroups.

Actions taken or lessons learned: Focusing on the improvement priorities for customers as a whole will address the priorities for satisfied customers. (The top two improvement priorities for satisfied customers are the same as those for customers as a whole: Length of Audit Process and Time You Spent on Audit.) Work on reducing the length of the audit and the amount of time customers spend on the process.


Control # and Name: CS-06-27 Re: Accounts Management (Adjustments)

Participants: 3,500 (13,371 Requests)

Data Collection Began: January 2007 Data Collection Ended: December 2007 Burden Hours: 621

Cost: $107,336 Response Rate: 26%

Purpose: This research is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are to identify what Adjustments’ staff and managers can do to improve customer service and to track customer satisfaction with Adjustments’ progress over time.

Findings: Sixty-five percent of all Adjustments customers are satisfied with the way their issue was handled, while 19% are dissatisfied. The overall satisfaction rating is 3.70. Adjustments customers as a whole, as well as BMF and IMF customers individually, remain most satisfied with the fair treatment they receive from the IRS and the appropriateness of the information the IRS requests.

Actions taken or lessons learned: 1) Find ways to decrease the processing time of amended returns and resolve customers' issues more quickly. Ensure all customers are provided with a detailed timeline before the process begins and that every IRS branch, department, and representative can provide customers with the same information. Length of Time to Resolve Your Issue is the top improvement priority for all customers and for non-correspondence customers. 2) Try to make it easier for customers to get more information about their issue. Improve the online experience for customers so they can access the most recent information regarding their issue. Also, ensure that mail correspondence always states where the customer can receive more information. Ease of Getting More Information about Your Issue is the second-highest improvement priority for all customers and the top priority for correspondence customers.


Control # and Name: CS-06-28 Re:

Participants: 1,718 (6950 Requests)

Data Collection Began: January 2007 Data Collection Ended: March 2008 Burden Hours: 318

Cost: $126,680 Response Rate: 32%

Purpose: To measure customer perceptions and expectations, track customer satisfaction progress at a national level, and identify operational improvements for SB/SE.

Findings: Estate and Gift customers vary in overall satisfaction from quarter to quarter. However, for the period of April 2007-March 2008, overall satisfaction averaged 66%, and 17% of customers were dissatisfied overall. Over these individual quarters, most attributes varied by quarter but remained close to the same on average. October-December 2007 yielded much higher satisfaction scores than the other quarters. Attributes related to the auditor (e.g., courtesy, flexibility in scheduling meetings) have consistently high satisfaction ratings, while attributes that relate to time (e.g., amount of time spent on examination, length of examination) generally have low satisfaction ratings.

Actions Taken/Lessons Learned: Estate and Gift continues to actively monitor the findings as legislated.


Control # and Name: CS-06-29 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-06-30 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-31 Re: 2007 Compliance Services Operations Study

Participants: 1,424 (7,124 Requests)

Data Collection Began: January 2007 Data Collection Ended: March 2008 Burden Hours: 309

Cost: $93,319 Response Rate: 20%

Purpose: This research was conducted as part of the IRS agency-wide initiative to monitor taxpayer satisfaction with the service provided. The objectives of this study were to identify what CSCO staff and managers can do to improve customer service, and to track customer satisfaction with CSCO’s progress over time.

Findings: During the most recent reporting period of October through December 2007, 54% of customers are satisfied with the service they received from CSCO (giving an overall satisfaction rating of 4 or 5 on a 5-point scale), while 25% are dissatisfied (giving a rating of 1 or 2). The average overall satisfaction rating is 3.40. The top improvement priorities for SB/SE CSCO are: Resolving Matter through Written Correspondence, Notifying You of Case Closures, and Time IRS Took to Respond to Written Inquiry.

Actions Taken/Lessons Learned: CSCO continues to actively monitor the findings as legislated.


Control # and Name: CS-07-32 Re: 2007 Field Collection Study

Participants: 2,470 (14,021 Requests)

Data Collection Began: January 2008 Data Collection Ended: March 2008 Burden Hours: 591

Cost: $168,613 Response Rate: 18%

Purpose: This research was conducted as part of the IRS agency-wide initiative to monitor taxpayer satisfaction with the service provided. The objectives of this study were to identify what Collection staff and managers can do to improve customer service, and to track customer satisfaction with Collection’s progress over time.

Findings: During the most recent reporting period of October through December 2007, 61% of customers are satisfied with the service they received from Collection (giving an overall satisfaction rating of 4 or 5 on a 5-point scale), while 19% are dissatisfied (giving a rating of 1 or 2). The average overall satisfaction rating is 3.61. The top improvement priorities for SB/SE Collection are: Time You Spent on Issue, Keeping You Up-to-Date on Collection Process, and Flexibility in Resolving Issue.

Actions Taken/Lessons Learned: Collection continues to actively monitor the findings as legislated.


Control # and Name: CS-07-33 Re: 2007 Compliance Center Exam Study

Participants: 1,930 (7,743 Requests)

Data Collection Began: January 2008 Data Collection Ended: March 2008 Burden Hours: 355

Cost: $104,250 Response Rate: 25%

Purpose: This research was conducted as part of the IRS agency-wide initiative to monitor taxpayer satisfaction with the service provided. The objectives of this study were to identify what CCE staff and managers can do to improve customer service, and to track customer satisfaction with CCE’s progress over time.

Findings: During the most recent reporting period of October through December 2007, 49% of customers are satisfied with the service they received from CCE (giving an overall satisfaction rating of 4 or 5), and 29% are dissatisfied (giving a rating of 1 or 2). The current overall satisfaction rating is 3.27 on a 5-point scale. The top improvement priorities for SB/SE CCE are: Ease of Getting Through to the Right Person, Length of Correspondence Exam Process, and Providing You Consistent Information about Case.

Actions Taken/Lessons Learned: CCE continues to actively monitor the findings as legislated.


Control # and Name: CS-07-34 Re: 2007 Automated Under-reporter Study

Participants: 1,786 (6,389 Requests)

Data Collection Began: January 2007 Data Collection Ended: March 2008 Burden Hours: 302

Cost: $104,868 Response Rate: 28%

Purpose: This research was conducted as part of the IRS agency-wide initiative to monitor taxpayer satisfaction with the service provided. The objectives of this study were to identify what AUR staff and managers can do to improve customer service, and to track customer satisfaction with AUR’s progress over time.

Findings: During the most recent reporting period of October through December 2007, 57% of customers are satisfied with the service they received from AUR (giving an overall satisfaction rating of 4 or 5 on a 5-point scale), while 61% are dissatisfied (giving a rating of 1 or 2). The average overall satisfaction rating is 3.57. The top improvement priorities for SB/SE AUR are: Length of Time to Hear from IRS That There Was a Discrepancy, Ease of Understanding Notices Explaining Action Taken, and Time IRS Took to Respond to You.

Actions Taken/Lessons Learned: AUR continues to actively monitor the findings as legislated.


Control # and Name: CS-07-35 Re: 2007 Automated Collection System

Participants: 2,911 (6,760 Requests)

Data Collection Began: January 2007 Data Collection Ended: March 2008 Burden Hours: 598

Cost: $118,391 Response Rate: 43%

Purpose: This research was conducted as part of the IRS agency-wide initiative to monitor taxpayer satisfaction with the service provided. The objectives of this study were to identify what ACS staff and managers can do to improve customer service, and to track customer satisfaction with ACS’s progress over time.

Findings: During the most recent reporting period of October through December 2007, 91% of customers are satisfied with the service they received from ACS (giving an overall satisfaction rating of 4 or 5 on a 5-point scale), while 3% are dissatisfied (giving a rating of 1 or 2). The average overall satisfaction rating is 4.55. The top improvement priorities for SB/SE ACS are: Tone of IRS Notice, Bill or Letter*; and After You Reached a Representative, Time to Complete Call.

* Customers who called regarding a recent IRS notice, bill, or letter (90%) were asked these questions.

Actions Taken/Lessons Learned: ACS continues to actively monitor the findings as legislated.


Control # and Name: CS-07-36 Re: 2007 Field Examination

Participants: 2,554 (8,400 Requests)

Data Collection Began: February 2007 Data Collection Ended: May 2008 Burden Hours: 385

Cost: $139,485 Response Rate: 33%

Purpose: To measure customer perceptions and expectations, track customer satisfaction progress at a national level, and identify operational improvements for SB/SE.

Findings: Field Exam customers vary in overall satisfaction from quarter to quarter. For the period of April 2007-March 2008, overall satisfaction averaged 64%, and 22% of customers were dissatisfied overall. Fairness of treatment by the IRS was a “driver” of overall satisfaction for all quarters, meaning that if improvements are made to this area, overall satisfaction is likely to increase. Additionally, attributes that relate to the auditor (e.g., courtesy, flexibility in scheduling meetings) have high satisfaction ratings, while attributes that relate to time (e.g., amount of time spent on examination, length of examination) have low satisfaction ratings.

Actions Taken/Lessons Learned: Field Exam continues to actively monitor the findings as legislated.


Control # and Name: CS-07-37 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-38 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-39A Re: Collection Improvement Pilot Closing Letter

Participants: 213 (2,255 Requests)

Data Collection Began: April 2007 Data Collection Ended: September 2007 Burden Hours: 86

Cost: $8,000 Response Rate: 9%

Purpose: The Collection Improvement Pilot Closing Letter Survey was administered to determine whether the Collection closing letters sent to a pilot group of customers helped improve customer satisfaction overall and specifically with “notifying you of case closure” and related items, “Time customer spent on issue” and “Keeping customer up-to-date on the collection process.”

Findings: Collection customers in the pilot groups, all of whom received the closing letter, reported higher overall satisfaction than Collection transactional survey respondents who had not received the closing letter. Pilot groups also had higher overall satisfaction when comparisons were done by case disposition. As hypothesized, satisfaction ratings for Notifying You of Case Closure were significantly higher in pilot groups compared to transactional survey respondents. Satisfaction ratings were higher for all rating items in the pilot groups compared to the transactional survey respondents of the same disposition type, suggesting that the positive experience of receiving a case closure letter may have influenced customers’ responses to other rating items. While no direct comparisons exist, pilot respondents are more satisfied with the Ease of Understanding the Closing Letter than transactional survey respondents are with the Ease of Understanding Collection Notices. Those customers who completely understand the resolution letter gave higher overall satisfaction ratings and those who did not completely understand the letter gave lower overall satisfaction ratings.

Actions taken or lessons learned: This survey has shown the positive impact of these closing letters on Collection customers. Therefore, Collection will implement case closing letters for certain field collection cases. They are in the process of determining the most efficient method of issuing the letters.
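Throughout this appendix, the reported response rates appear to equal Participants divided by Requests, rounded to the nearest percent. A minimal sketch of that calculation (an illustrative helper, not part of the survey methodology; figures taken from the CS-07-39A entry above):

```python
def response_rate(participants: int, requests: int) -> float:
    """Response rate as a percentage: completed surveys / surveys requested."""
    return 100.0 * participants / requests

# Figures from CS-07-39A: 213 participants out of 2,255 requests.
print(round(response_rate(213, 2255)))  # prints 9, matching the reported 9%
```

The same relation holds for other entries, e.g. CS-07-40 (3,385 of 8,619 requests, reported as 39%).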


Control # and Name: CS-07-39B Re: Collection Improvement Pilot Postcard Survey

Participants: 132 (1,082 Requests)

Data Collection Began: April 2007 Data Collection Ended: December 2007 Burden Hours: 43

Cost: $5,000 Response Rate: 12.2%

Purpose: The Collection Improvement Pilot Postcard Survey was administered to determine whether the Collection Postcards distributed to a pilot group of customers helped improve customer satisfaction overall and specifically with “Acknowledge Receipt of Information submitted” which has been determined to be a driver of Collection customer overall satisfaction.

Findings: Collection customers in the pilot groups, all of whom received the postcard, reported overall satisfaction that did not significantly differ from Collection transactional survey respondents who had not received the postcard. In addition, satisfaction ratings for Acknowledge Receipt of Information Submitted also did not significantly differ in pilot groups compared to transactional survey respondents.

Actions taken or lessons learned: This survey has shown that the postcard method of acknowledging receipt of information submitted by Collection customers did not have a measurable impact on their overall satisfaction with the Collection process, nor on their satisfaction with the receiving acknowledgement from IRS for information they submitted aspect of service. Given the lack of positive survey results from this pilot, Collection will not implement this process further.


Control # and Name: CS-07-40 Re: Automated Collection System

Participants: 3,385 (8,619 Requests)

Data Collection Began: October 2007 Data Collection Ended: September 2008 Burden Hours: 495

Cost: $89,689 Response Rate: 39%

Purpose: The survey was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. It assesses customer satisfaction with the Automated Collection System (ACS) and with the service customers received during their calls. The objectives are to administer the survey using an IVR system, to identify customer expectations of ACS through this research, to track customer satisfaction at eight ACS sites and nationwide, and to identify actionable improvement opportunities.

Findings: The cumulative year-to-date overall satisfaction is 92.5% (a rating of 4 or 5 on a 5-point scale). ACS customers continue to express their appreciation for representatives who are helpful, friendly, knowledgeable, and fair.

Actions taken or lessons learned: It is important to increase customer satisfaction by improving the tone of IRS correspondence so that it is precise and clear. Customers remain frustrated with the amount of time they spend calling the ACS line. The IRS needs to incorporate procedures that reduce the time customers spend on hold when calling the ACS phone number and the time it takes to complete their call.


Control # and Name: CS-07-41 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-42 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-43 Re: Compliance Check Study

Participants: 36 (285 Requests)

Data Collection Began: 12/07 Data Collection Ended: 9/08 Burden Hours: 2

Cost: $0 Response Rate: 13%

Purpose: The Exempt Organizations Compliance Area (EOCA) within TEGE developed and implemented a Customer Satisfaction survey with the purpose of measuring the satisfaction of our customers during the compliance check process. Through this feedback, our goal is to determine our strengths and identify any areas we need to improve.

Findings: Seventy-nine percent of our customers indicated they were satisfied with the “ease of finding an IRS contact number.” Seventy-five percent of our customers indicated they were satisfied that “The person with whom you spoke was knowledgeable about the topic of the letter.” In addition, sixty-four percent of those who responded indicated they will modify the way they previously prepared their information return (Form 990 series) based on the information contained in the letter. Nine percent indicated they were dissatisfied with “The time given to you to respond to IRS’s additional request(s)” and “The amount of time you had to spend responding to the IRS’s additional request(s).”

Actions taken or lessons learned: In an effort to reduce taxpayer burden, we modified the instructions given to our tax examiners to allow extensions if the taxpayer indicates they are having difficulties meeting the due date of the information requested.

When we originally stated we would randomly select 1,140 compliance checks to receive the survey, we did not take into account that many compliance check accounts are resolved through research without having to contact the taxpayer. To remedy this, we selected a sample of only those cases in which the taxpayer was actually contacted. We were also receiving a large amount of undeliverable mail, so we changed our procedures to update the addresses on our database after running a mass update. We estimated an 85% response rate for the survey based on the response rate to the compliance check letters we mail out. We overestimated this number: the actual response rate was only 13%. We are adjusting our future samples accordingly.


Control # and Name: CS-07-44 Re: Education Letter Survey Study

Participants: 0 (0 Requests)

Data Collection Began: Delayed/Cancelled Study Data Collection Ended: N/A Burden Hours: N/A

Cost: $0 Response Rate: N/A

Purpose: The Exempt Organizations Compliance Area (EOCA) within TEGE developed and implemented a Customer Satisfaction survey with the purpose of measuring the satisfaction of our customers when sent an Educational Letter. Through this feedback, our goal is to determine our strengths and identify any areas we need to improve.

Findings: The Educational Letter project we had scheduled for fiscal year 2008 was delayed and is scheduled to begin in fiscal year 2009. Therefore, we did not mail out any Educational Letter customer satisfaction surveys.

Actions taken or lessons learned: N/A


Control # and Name: CS-07-45 Re: Field Assistance

Participants: 452,326 (6,704,578 Requests)

Data Collection Began: October 2007 Data Collection Ended: September 2008 Burden Hours: 11,308

Cost: $224,252 Response Rate: 7%

Purpose: The survey was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The IRS has been measuring customer satisfaction in its Taxpayer Assistance Centers (TACs) since January 1998 using a survey comment card. The overall goal of this survey is to provide meaningful feedback to managers and staff in those field offices to improve the services provided at the TACs. This report represents the results from the 183,022 visitors who completed a comment card at a Field Assistance Taxpayer Assistance Center (175,119 English and 7,903 Spanish responses).

Findings: The overall satisfaction is 91% (rating of 4-5), 5% are dissatisfied (rating of 1-2) and 4% neutral (rating of 3). The national overall satisfaction rating is 4.61. Customers remain most satisfied with “Employee Professionalism and Courtesy” giving it an average satisfaction rating of 4.70, on a 5-point scale. Customers are also satisfied with “Employee Skill or Knowledge” and “Listening to Your Concerns”, giving them both an average satisfaction rating of 4.69. Only 5% of customers gave an overall satisfaction rating of 1 or 2 (dissatisfied). Spanish-speaking customers are less satisfied overall (4.45) when compared to customers as a whole (4.57).

Actions taken or lessons learned: Field Assistance wait times exceed customers’ expectations. Although the majority (70%) of customers waited 30 minutes or less for service, “Promptness of Service” remains the top improvement priority for Field Assistance customers. While “Promptness of Service” is also the top improvement priority among Spanish-speaking customers, “Understanding Who Was Next in Line” is the second-highest improvement priority for these customers—higher than for all customers. This suggests the IRS should do a better job of explaining the queuing system to Spanish-speaking customers. Field Assistance customers also remain concerned with resolving their question/issue, which is their second highest improvement priority. The IRS needs to look at ways to decrease the wait time and to continue to train Customer Service Representatives so that taxpayers receive the correct answer to their question/issue.

Control # and Name: CS-07-46 Re: IC and CIC Customer Survey Studies

Participants: 1,203/148 (IC/CIC)

Data Collection Began: August 2006 Data Collection Ended: December 2007 Burden Hours: 316/50

Cost: $240,000 Response Rate: 67%/54%

Purpose: To identify what Large and Mid-Size Business (LMSB) Industry staff and managers can do to improve customer service and to track customer satisfaction with LMSB Industry’s progress over time.

Findings:

IC: Most IC customers (82%) are satisfied with their audit experience during the FY 06 period. Customers remain most satisfied with Your Treatment as a Taxpayer and least satisfied with Audit Completion.

CIC: Most CIC customers (83%) are satisfied with their audit experience during the FY 07 period. The average overall satisfaction rating is 4.01. The arena with the highest percentage of satisfied customers is Taxpayer Treatment, and the arena with the lowest is IRS Specialists Managers.

Actions taken or lessons learned:

Reports were communicated to all Industry Directors and their staff. Specific improvement items noted in the reports were used to draft sample management performance commitments for all levels of management to include in their Management Official Performance Agreements. Talking points are provided to executives when meeting with or presenting to industry groups.


Control # and Name: CS-07-47 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-48 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-49 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-50 Re: CC Exam

Participants: 2,232 (9,120 Requests)

Data Collection Began: October 2007 Data Collection Ended: September 2008 Burden Hours: 396

Cost: $63,709 Response Rate: 26%

Purpose: To identify what CC Exam staff and managers can do to improve customer service and to track customer satisfaction with CC Exam’s progress over time.

Findings: Customers remain most satisfied with the courtesy and professionalism of IRS employees, giving it an average satisfaction rating of 3.63 on a 5-point scale. The majority (76%) of customers contacted the IRS Toll-Free exam number listed on the letter they received, evidence that the Toll-Free line is an important service channel for CC Exam taxpayers. On average, CC Exam customers contact the IRS by phone or mail five times before their issue is resolved. Resolving taxpayer issues with the fewest contacts benefits the IRS and reduces taxpayer frustration with the process.

Actions Taken/Lessons Learned: Focusing on the improvement priorities for customers as a whole will also address the priorities for satisfied customers. (The top two improvement priorities for satisfied customers are the same as those for customers as a whole: Time You Spent on Audit and Length of Audit Process.) Work on reducing the length of the audit and the amount of time customers spend on the process.


Control # and Name: CS-07-51 Re: Compliance Services Collection Operation

Participants: 1,582 (10,708 Requests)

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: $95,320 Response Rate: 31%

Purpose: This research is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are: to identify what CSCO staff and managers can do to improve customer service and to track customer satisfaction with CSCO’s progress over time.

Findings: For the current period, 67.5% of CSCO customers are satisfied with the service they received from CSCO W&I (giving an overall rating of 4 or 5 on a 5-point scale), while 16% are dissatisfied (giving a rating of 1 or 2). The average overall satisfaction rating is 3.76, which is not significantly different from the 3.91 rating for the corresponding period of last year.

Actions Taken/Lessons Learned: Key customer suggestions for the IRS relate to shortening the process. Length of Correspondence Collection Process, from When You Received the Initial Notice to Finish, is the top improvement priority both for CSCO customers as a whole and for customers who called the Toll-free line (59% of respondents). Customers also want to reach someone who has their information readily available: Ease of Obtaining Information You Needed from Collection Operation is the second-highest improvement priority for CSCO customers as a whole. Time Collection Operation Took to Respond to Your Written Inquiry is the fourth-highest improvement priority for CSCO customers as a whole and the third-highest priority for customers who called the Toll-free line. Length of Time It Took to Get through to an IRS Employee is the second-highest priority for customers who called the Toll-free line.


Control # and Name: CS-07-52 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-53 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-54 Re: SIS Customer Survey

Participants: 143 (305 Requests)

Data Collection Began: January 2008 Data Collection Ended: March 2008 Burden Hours: 17

Cost: $0 Response Rate:

Purpose: To measure customer satisfaction among Statistical Information Services (SIS) Office customers. The universe of this survey consists of SIS Office customers who called or emailed the SIS office with an inquiry.

Findings: Just over 94 percent of respondents rated their overall satisfaction with their latest inquiry to SIS as either ‘satisfied’ (25.74 percent) or ‘totally satisfied’ (68.38 percent). The majority (82 percent) said that their request was satisfied, whereas 13 percent said it was partially satisfied.

Actions taken/lessons learned: The feedback provided helped the SIS office determine some areas that need improving. For example, many callers were not aware of the products we offered and were looking for something other than what was available. SIS also implemented a new payment method in the fall of 2007. Feedback from the survey showed that customers are very happy with this new method (89 percent were either ‘satisfied’ or ‘totally satisfied’).


Control # and Name: CS-07-55 Re: Appeals Customer Satisfaction Study

Participants: 887 (2,639 Requests)

Data Collection Began: November 2007 Data Collection Ended: January 2008 Burden Hours: 163

Cost: $78,532 Response Rate: 34%

Purpose: The objective of the customer satisfaction survey was to examine customer expectations and perceptions of Appeals services. Each customer surveyed was given an opportunity to express an opinion about the services received. The survey scores facilitate more effective management of Appeals by providing insight from the customer’s perspective about possible improvements and by providing useful input for program evaluation.

Findings: Overall Satisfaction decreased to 65% from 69% in FY2006; however, it is still the second highest it has been since the inception of the study. Business taxpayers are more satisfied than individual taxpayers (71% and 64%, respectively), and Represented taxpayers (72%) are significantly more satisfied than Pro Se taxpayers (58%).

Overall Satisfaction is significantly higher among respondents who agreed with the Appeals decision, or whose OIC was accepted or CDP case was not fully sustained (79%), than among other respondents. Satisfaction varies by Category of Work: Examination/TEGE customers reported relatively high satisfaction at 75%, while OIC and CDP customers were least satisfied (58%). Field Operations customers are significantly more satisfied than Campus Operations customers. At the attribute level, there were a few meaningful differences. Customers were most satisfied with Degree of Respect Shown (80%) and Professionalism of Appeals Person (78%). Customers were least satisfied with Time to Hear from Appeals (52%) and Length of Appeals Process (54%).

Alternative Dispute Resolution (ADR): Of those who used the ADR process, 70% were satisfied overall with the ADR process. Customers who used the ADR program were most satisfied with the impartiality of the process (74%), the effectiveness of the process (73%) and the impartiality of the mediator/arbitrator (73%). Most of the ADR customers reported saving money (82%) and time (74%) by using the ADR program. Note there were only 32 respondents who used the ADR program.

Of the customers who used the ADR program:

93% said the established timeframes met their expectations.

86% said they would use the program again.

81% of customers who used the ADR program said they would recommend the ADR program to others. Many of their reasons for doing so focused around saving time and money, in addition to the effectiveness of the program.

Respondents were asked if there was anything the IRS could do to improve the ADR process. 37% of customers provided suggestions, which related to the desire for more mediators/arbitrators, raising the awareness of the program, and making it more available to customers.

Lessons learned: Drivers of overall satisfaction: The survey identified six attributes that are drivers of overall satisfaction, meaning they have the strongest impact on customers’ perceptions of their experience:

Application of the law to facts in your case

Fairness in resolving your case

Consideration of information presented

Listening to your concerns

Adequacy of resources applied by appeals

Clarity of records and documents needed

The strongest driver, application of the law to facts in your case, also received one of the lowest satisfaction ratings among the strongest drivers (61%); Macro therefore recommends targeting this area for improvement.

Three other drivers of overall satisfaction received ratings below 65%—Fairness in Resolving your Case (61%), Consideration of Information Presented (63%), Adequacy of Resources applied by Appeals (63%).


Control # and Name: CS-07-56 Re: Tax Professional Survey 2007

Participants: 1,800 (4,761 Requests)

Data Collection Began: 12/07 Data Collection Ended: 2/08 Burden Hours: 838

Cost: $253,938 Response Rate: 38%

Purpose: The research is designed to help SB/SE understand who their practitioners are, recognize how practitioners are contacting the IRS and their success with different methods, develop new strategies for improving the practitioners’ effectiveness and ease of dealing with the IRS and, ultimately, their satisfaction, and develop new strategies to address the Tax Gap in support of SB/SE’s current goals. Specifically, the research documents practitioner behavior patterns in their pre-filing, filing, and post-filing experiences on behalf of their SB/SE clients and links key improvement opportunities to these patterns; provides practitioner feedback on the IRS Web site, e-filing, and other services; and suggests tactics for improving practitioner satisfaction and business results.

Findings: Overall satisfaction ratings are generally lower this year (59% satisfied) than last year (64% satisfied). However, shifts in respondent composition in 2007 compared to 2006 explain the apparent decrease, with fewer respondents among the groups that tend to give higher ratings and more respondents among the groups that tend to give lower ratings. Groups that tend to give higher ratings and had fewer respondents in 2007 include: (1) practitioners who did not help clients with IRS notices and (2) practitioners who contacted the IRS for forms or guidance using a Toll-free line. Groups that tend to give lower ratings and had more respondents in 2007 include: (1) CPAs who prepared 225 or more SB/SE returns, (2) practitioners who helped clients with mostly employment tax notices, and (3) practitioners who did not contact the IRS for forms or guidance using a Toll-free line. Practitioners who reported the highest percentage of clients receiving notices include other practitioners (not CPAs or Enrolled Agents) who file 225 or more SB/SE returns, especially those who charge a flat fee or retainer. CPAs tend to be less satisfied and tend to use IRS pre-filing services less. They also tend to help clients more with complex issues such as resolving notices and audits.

Resolving Your Clients’ IRS Notices and Getting Client Account Information from IRS Pre-Filing are the two highest priority improvement arenas. These are the two areas where tax professionals must rely on the IRS for success. Success in these areas means more timely issue resolution and more accurate returns.

Most practitioners rely on non-IRS sources for information or guidance on tax issues. The most frequently used non-IRS source is a paid tax service. The most frequently used IRS source is IRS.gov. Those who use IRS.gov find information most often by using the search engine. Among those who do not use IRS.gov and instead call the IRS for forms, form instructions, information, or guidance, the most common reasons given are that they prefer to talk to a live person or that they cannot get the information needed on the Internet. 64% of practitioners surveyed felt that their contacts with the IRS helped them avoid subsequent problems or errors.

Only 45% of practitioners were enrolled in e-services, with only 14% to 19% of all practitioners actually using each service. The most common reasons for not enrolling in and not using e-services are lack of knowledge and the belief that the services are not needed or useful; these reasons can be addressed by marketing the availability and use of e-services. E-file usage increased in 2007: 45% claim they e-filed all or most of their clients’ income tax returns in 2007, versus 38% in 2006. Those who e-file income taxes reported higher satisfaction with “Preparing/Filing Your Client’s Income Tax Return.” Almost all practitioners surveyed (99%) help their clients resolve post-filing issues such as IRS notices. On average, practitioners believe 50% of their cases were resolved with no change in the amount owed, 14% with the IRS owing money, and 34% with the client owing money. Practitioners who commonly assist their clients with notices due to unreported income or problems with deductions reported that, in an average of 39% of cases, they had submitted information with the original return that in their professional judgment should have prevented the notice from being issued. However, the IRS does not look at additional information unless an audit is in progress.

Actions taken or lessons learned: Overall directions from the Tax Professional Survey will be included in a Joint Recommendations report, together with directions from the Customer Base Survey.


Control # and Name: CS-07-57 Re: ACS Support

Participants: 1,318 (7,355 Requests)

Data Collection Began: October 2007 Data Collection Ended: September 2008 Burden Hours: 139

Cost: $104,352 Response Rate: 18.75%

Purpose: The survey was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. It assesses customer satisfaction with the ACS Correspondence Support process. The objectives are: 1) to identify customer expectations of ACS Support through this research, 2) to track customer satisfaction at two (2) ACS Support sites and nationwide, and 3) to identify actionable improvement opportunities.

Findings: For the annual overall satisfaction results, 60% of customers are satisfied and 20% are dissatisfied. The annual overall satisfaction rating is 3.63. ACS Support customers remain most satisfied with the “Tone of correspondence you received” (3.99) and “Understanding that you have payment options” (3.95).

Actions taken or lessons learned: “Ease of Resolving Matter through Written Correspondence” is the top improvement priority for ACS Support customers. Customers will benefit from receiving proper expectations from the IRS about items that are factored into resolving a case. “Ease of Obtaining Information You Needed from IRS” is the second highest improvement priority for customers.


Control # and Name: CS-07-58 Re: Practitioner Priority Service

Participants: 2,636 (7,109 Requests)

Data Collection Began: January 2008 Data Collection Ended: December 2008 Burden Hours: 373

Cost: $88,940 Response Rate: 37.25%

Purpose: The research was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives are to identify what PPS staff and managers can do to improve customer service and to track customer satisfaction with PPS’s progress over time.

Findings: Ninety-three percent of PPS customers are satisfied with the service received, while only two percent are dissatisfied. The overall satisfaction rating of 4.55 for the current period (October-December 2008) is higher than the 4.40 rating for the previous period (July-September 2008). Customers continue to be most satisfied with “Professionalism of Representative” and “Representative’s willingness to listen to you and help with your issue”. PPS customers are consistently least satisfied with “Length of your wait to talk to a representative”.

Actions taken or lessons learned: PPS needs to set customer expectations concerning hold time and provide representatives who are able to fully answer customers’ questions and resolve their issues before ending the call. It is also important to give PPS representatives the appropriate level of authority to completely address all taxpayer issues during the call and, if that is not possible, to train them to transfer the taxpayer to someone who can effectively help.


Control # and Name: CS-07-59 Re: Compliance Services Collection Operation

Participants: 1,582 (10,708 Requests)

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: $95,320 Response Rate: 31%

Purpose: This research is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are: to identify what CSCO staff and managers can do to improve customer service and to track customer satisfaction with CSCO’s progress over time.

Findings: For the current period, 67.5% of CSCO customers are satisfied with the service they received from CSCO W&I (giving an overall rating of 4 or 5 on a 5-point scale), while 16% are dissatisfied (giving a rating of 1 or 2). The average overall satisfaction rating is 3.76, which is not significantly different from the 3.91 rating for the corresponding period of last year.

Actions Taken/Lessons Learned: Key customer suggestions for the IRS relate to shortening the process. Length of Correspondence Collection Process, from When You Received the Initial Notice to Finish, is the top improvement priority both for CSCO customers as a whole and for customers who called the Toll-free line (59% of respondents). Customers also want to reach someone who has their information readily available: Ease of Obtaining Information You Needed from Collection Operation is the second-highest improvement priority for CSCO customers as a whole. Time Collection Operation Took to Respond to Your Written Inquiry is the fourth-highest improvement priority for CSCO customers as a whole and the third-highest priority for customers who called the Toll-free line. Length of Time It Took to Get through to an IRS Employee is the second-highest priority for customers who called the Toll-free line.


Control # and Name: CS-07-60 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-61 Re: Toll Free

Participants: 12,399 (34,443 Requests)

Data Collection Began: October 2007 Data Collection Ended: October 2008 Burden Hours: 5,070

Cost: $115,773 Response Rate: 36%

Purpose: This research is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are:

    • to identify what Toll-free staff and managers can do to improve customer service and

    • to track customer satisfaction with Toll-free’s progress over time.

Findings: In the most recent cumulative year-to-date period, covering October 2007 through September 2008, 93% of Toll-free customers are satisfied (giving a rating of 4 or 5 on a 5-point scale), and only 3% are dissatisfied (giving a rating of 1 or 2). Similar results were found for the previous annual reporting period (October 2006 through September 2007), when 94% of Toll-free customers were satisfied and only 2% were dissatisfied.

Actions taken or lessons learned: To increase customer satisfaction with Toll-free, focus improvement efforts on the following item:

    • Time to Get through to the IRS is continually the top improvement priority for Toll-free customers and received the lowest satisfaction rating.

Recognizing that it is difficult to make changes to the automated phone system, separate leverage analysis was performed to look at the items that Toll-free can control—the customer service representative (CSR) attributes. Within this separate analysis,

    • After You Reached a Representative, Time to Complete Call is continually the top improvement priority.


Control # and Name: CS-07-62 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-63 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-64 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-65 Re: TAS Customer Satisfaction Survey

Participants: 15,433 (58,071 Requests)

Data Collection Began: February 2007 Data Collection Ended: November 2007 Burden Hours: 2,572

Cost: $529,497 Response Rate:

Purpose: The mission of the Taxpayer Advocate Service is to serve as an in-house advocate for taxpayers, representing them within IRS when they have had problems in dealing with the operating functions of the IRS. In this context, TAS provides service remediation services that span almost all IRS operations and functions.

Findings: Overall Customer Satisfaction: In FY 2007, TAS achieved a customer satisfaction mean score of 4.29. Stated as a frequency score, 83% of all TAS customers surveyed reported being satisfied overall with the services they received. The customer issues identified in the survey that are most highly correlated with overall satisfaction involve the perception that the customer’s case advocate:

1. resolved their problem in a reasonable timeframe;

2. kept them informed about progress in solving their problem;

3. did their best to resolve their problem;

4. showed concern about helping to solve their problem; and

5. treated them fairly.

Trends: TAS’s customer satisfaction survey scores have been stagnant or in a narrowly declining range for the past three years.

  • Work Category Analysis: TAS has approximately seventy-five discrete categories of work, which mirror the work performed throughout the other IRS divisions. The composition of TAS’s workload has been changing significantly over the past few years (Exhibit 3, below), with an increasing percentage of closed cases involving Examination and Collection issues. These types of cases are typically among the most challenging in which to produce satisfied customers because of their potential for financial loss. Of these two categories, Collection is the larger and also typically yields lower satisfaction scores.

  • Impact of Relief on Satisfaction: Predictably, customers who receive some form of relief or assistance from TAS are more likely to report being satisfied with the service provided. This conclusion has been borne out over several years of data collection and, in FY 2007, is reflected by the fact that 86% of the customers who received relief or assistance from TAS reported being satisfied, compared to 63% satisfaction for those who received neither relief nor assistance. However, it is with respect to the group of non-relief customers that TAS has learned the quantifiable value that emerges when case advocates are attentive to customer needs. Specifically, among non-relief customers, comparisons have been made between those who reported being satisfied with TAS overall and those who were dissatisfied.

  • Impact of TAS Experience on Taxpayer Opinions of IRS: Since TAS provides a form of service remediation that is intended to restore taxpayers back to mainstream IRS processes, it is important to understand how taxpayers who have been through this process view the overall tax environment. In this respect, 50% of the taxpayers who responded to the survey in FY 2007 indicated that they had a higher opinion of IRS as a result of their TAS experience.

Actions Taken/Lessons Learned:

TAS’s customer satisfaction data represent the end of a service cycle for IRS taxpayer accounts that are owned by the operating divisions. TAS’s surveys therefore have strategic utility to the operating divisions in understanding potential service failures. TAS uses this data to engage operating divisions in partnerships aimed at ensuring that their customers are provided with one-stop service at the first point of contact. For instance, TAS is currently working with the Wage and Investment Division to examine the underlying reasons why so many taxpayers who file amended returns find it necessary to seek TAS’s assistance in having their return-related issues resolved. In response to the data and trends cited above, in FY 2007 TAS began a number of broad-based management actions to bolster its ability to accomplish its mission effectively and to better serve its constituency. These include changes in the survey process, infrastructure changes, employee engagement, and TAS systems improvements.


Control # and Name: CS-07-066 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: $ Response Rate: %


Purpose:

Findings:

Actions taken or lessons learned:


Control # and Name: CS-07-067 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:


Purpose:

Findings:

Actions taken or lessons learned:


Control # and Name: CS-07-068 Re: TE/GE Toll Free

Participants: 1,346 (2,690)

Data Collection Began: October 2007 Data Collection Ended: September 2008 Burden Hours: 51.5

Cost: $62,557 Response Rate: 50%


Purpose: This research is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are:

  • to identify what TE/GE Toll-free staff and managers can do to improve customer service,

  • to track progress in improving customer satisfaction with TE/GE Toll-free service, and

  • to understand taxpayers’ experience with resolving issues on the IRS.gov Web site.

Findings: In the most recent cumulative year-to-date period, covering October 2007 through September 2008, 91% of Toll-free customers are satisfied (a rating of 4 or 5 on a 5-point scale) and only 3% are dissatisfied (a rating of 1 or 2). Results for the previous annual reporting period, October 2006 through September 2007, were similar: 91% satisfied and only 4% dissatisfied.

Actions taken or lessons learned: To increase customer satisfaction with TE/GE Toll-free, focus improvement efforts on the following item:

  • Customers remain frustrated with the time it takes to get through to a Customer Service Representative (CSR). Time to Get through by Phone remains the top improvement priority for customers and received the lowest satisfaction rating. Also, when customers were asked what the IRS could do to improve service, their most frequently cited response was to improve the wait time and hold time.

Recognizing that it is difficult to make changes to the automated phone system, additional analysis was performed to prioritize improvement among the CSR attributes that TE/GE can control.

  • Within this separate analysis, Getting All the Information You Needed during This Call remains the highest improvement priority even though customers are relatively satisfied with this item. Ensure CSRs ask customers whether they have received everything they need.


Control # and Name: CS-07-69 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:


Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-70 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-71 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-72 Re: 2008 W&I Tax Professional Survey

Participants: 2,375 (7,200)

Data Collection Began: 3/08 Data Collection Ended: 5/08 Burden Hours: 1032

Cost: $163,278 Response Rate: 33%

Purpose: The objective of the study was to expand the IRS’s understanding of W&I tax professionals’ characteristics, needs, and behaviors, and to measure their satisfaction with select IRS products and services. Specific research questions include:

  1. Which IRS and non-IRS services are practitioners aware of, and which do they use?

  2. How satisfied are practitioners with the services provided by the IRS?

  3. How do practitioners get their tax law information, forms, and publications?

  4. Where do tax professionals get assistance for tax law issues when preparing their clients’ Form 1040 tax returns?

  5. How would practitioners improve or change the services provided by the IRS?

Findings: When tax professionals’ characteristics (i.e., work status, number of years preparing returns, Electronic Return Originator status, and type of taxpayers served) were examined, most varied significantly by type of tax professional (i.e., Certified Public Accountant, Enrolled Agent, other tax professionals) and type of employer (i.e., national tax preparation firm, other tax preparation firm, and national accounting and/or law firm). Analysis also showed that, in general, awareness and usage of IRS resources specific to tax professionals varied significantly by reported type of tax professional. Overall satisfaction with IRS services was relatively high, with 68 percent of W&I tax professionals indicating they were satisfied or very satisfied with IRS services; interestingly, satisfaction with IRS products and services varied by the characteristics of tax professionals. For example, part-time tax professionals and those with less than six years of experience preparing tax returns were generally more satisfied with IRS services than their counterparts.

Actions taken or lessons learned: Findings from this study suggest that additional research in the form of surveys and/or focus groups should be conducted. Subsequent research efforts should attempt to identify what IRS resources and services are most important to tax professionals in assisting taxpayers. An examination of tax professionals’ issue resolution using IRS products and services should also be conducted. Along with further data collection and analysis, additional profiling analyses should be performed. We should continue to build a profile of W&I tax professionals’ characteristics as they are related to awareness, usage, and satisfaction with IRS products and services. Findings from this study and future research will further the IRS’ understanding of W&I tax professionals’ characteristics, needs, and behavior. This knowledge will lead to better alignment of IRS services and resources by removing services no longer needed and identifying new service offerings that may better serve the practitioner community – thereby enabling practitioners to effectively serve taxpayers and ultimately improve taxpayer compliance. Findings from the 2008 W&I Tax Professionals Survey will also be combined with results from other preparer-related studies to build a comprehensive body of knowledge of the universe of tax professionals.


Control # and Name: CS-07-73 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-74 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-75 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-76 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-07-77 Re: SB/SE Customer Base Survey

Participants: 2,504 (8,646)

Data Collection Began: March 2008 Data Collection Ended: July 2008 Burden Hours: 1,150

Cost: $359,714 Response Rate: 29%

Purpose: The research is designed to help SB/SE understand who its taxpayers are; recognize how customers are contacting the IRS and their success with different methods; develop new strategies for improving taxpayers’ effectiveness, ease of dealing with the IRS and, ultimately, satisfaction; and develop new strategies to address the Tax Gap in support of SB/SE’s current goals. Specifically, the research documents taxpayer behavior patterns in their pre-filing, filing, and post-filing experiences and links key improvement opportunities to these patterns; shows differences among customer groups; and provides taxpayer feedback on the IRS Web site, e-filing, and other services.

Findings: Satisfaction ratings are generally significantly lower this year than last year. In general, different taxpayer groups (e.g., those who file different forms, use a practitioner, or self-file) do not give significantly different satisfaction ratings, either overall or for the various service experience categories. When asked how to improve service in each arena, taxpayers consistently mention making forms and instructions easier to understand and providing better explanations of the changes from previous years among the top three. Preparing/Filing Your Tax Return and Keeping Your Tax Records are the top two improvement priority arenas. Most taxpayers who call the Toll-Free line before filing could easily use the IRS Web site instead. E-file usage remains low among all taxpayers. Only 31% actually e-filed their 2006 income taxes according to the IRS. Among taxpayers who received a notice in the mail that dealt with returns they had previously filed, only 38% ended up owing money to the IRS. 76% of those who received a notice were surprised to find that they had an error.

Actions taken or lessons learned: Overall directions from the Customer Base and Tax Professional surveys that address the IRS Strategic Plan:


Control # and Name: CS-07-78 Re: Form 944 Participant Survey

Participants: 1,100 (7,905)

Data Collection Began: 3/08 Data Collection Ended: 6/08 Burden Hours: 717

Cost: $6,728 Response Rate: 57%/42%

Purpose: To better evaluate the overall Form 944 program, the Office of Taxpayer Burden Reduction (TBR) needed to gather feedback from small business taxpayers that qualified for and participated in the program during its first year. TBR asked the Denver SB/SE Research office to conduct a survey to assist in this effort.

In our initial discussions, we determined that participants who received the initial letter and filed a Form 944 would evaluate the program much differently than taxpayers who received multiple letters and had yet to file a Form 944. We recommended surveying the two groups independently to provide useful results. The objectives of the participant surveys were to:

  • determine overall satisfaction with the Form 944 program

  • determine why taxpayers continue to file unpostable Form 941 returns

  • evaluate what taxpayers think the IRS should do with the program

Findings: Customer Satisfaction Survey

  • Participants are very satisfied with the Form 944 program.

  • Better understanding of the Form 944 program criteria and opt-out options, along with timely notification, are the aspects of the program participants feel need the most improvement.

  • It is unclear whether participants wish the program to remain mandatory or be given the choice to participate.

Program Survey

  • Participants are undecided about their satisfaction with the program.

  • The notification letters are the key problem with the unpostable population.

  • Participants are in greater favor of being able to choose to participate in the program.

Common Themes: The two populations of Form 944 participants surveyed exhibited different behaviors and revealed different experiences during the first year of the program. There were, however, many common themes detected throughout the survey responses of both groups. We can conclude the following observations apply to the entire population of Form 944 program participants:

  • The key benefits of the program are: simplicity; less paperwork; saves time; and the ability to make deposits and payments.

  • The IRS should raise the threshold.

  • The IRS should send year-end reminders to program participants.

  • The IRS needs to improve both the timing and clarity of the notices and letters associated with the Form 944 program.

  • The IRS should take steps to eliminate the difficulty in switching back and forth between a quarterly Form 941 requirement and the annual Form 944 filing requirement.

Actions taken or lessons learned: The Form 944 project team and TBR added the qualitative survey results to their overall program evaluation for the executive briefing held in August 2008. The Form 944 project team recommended that IRS executives: (1) continue the program; (2) consider making the program voluntary; and (3) raise the threshold amount to include more small business taxpayers. IRS executives agreed with the recommendations, and changes to the Form 944 program will be implemented for the 2010 tax year. Additionally, the Form 944 project team is using the survey results to prioritize improvement areas and implement suggestions offered by respondents.


Control # and Name: CS-08-79 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-80 Re: SPEC Partner Local/National Survey

Participants: 1108/42 (1108/53 Requests)

Data Collection Began: March 2008 Data Collection Ended: April 2008 Burden Hours: 824/18

Cost: $205,033 Response Rate: 47%/79%

Purpose: The 2008 SPEC Local Partner Survey is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are:

  • to identify what SPEC staff and managers can do to improve customer service to their Local Partners and

  • to track customer satisfaction with SPEC’s progress over time.

Findings: Overall satisfaction with SPEC products and services is 84%, with 4% being dissatisfied. Satisfaction with the quality of SPEC products and services increased from 77% in 2007 to 80% in 2008. Satisfaction with quality shows a similar trend to overall satisfaction results. Of the general areas covered in the survey, partners provided the highest ratings for their relationship managers (87%) and SPEC’s efforts to improve privacy, confidentiality, and security (86%). There were only two areas whose ratings did not meet or exceed 80%: satisfaction with tax law training service and marketing products and materials. The lowest rated area of the survey was SPEC’s marketing products (69%). Based on the driver analysis, the two areas that most drive local partners’ overall satisfaction are SPEC’s guidance, tools, and support for VITA/TCE (VRPP-QIP) and the relationship manager. These were also the drivers of overall satisfaction in 2007.

Actions taken or lessons learned: Based on the driver analysis, the two areas that most drive local partners’ overall satisfaction are the relationship manager and SPEC’s guidance, tools, and support for VITA/TCE (VRPP-QIP). The strongest driver of overall satisfaction is the guidance, tools, and support for VITA/TCE (VRPP-QIP); improving the Quality Review Technique DVD/Video would have considerable impact on the overall satisfaction of those who use it. In general, relationship managers received high ratings, and this is a strong driver of overall satisfaction. Continue to emphasize and reinforce the need for relationship managers to be proactive and to anticipate the needs of partners.

Purpose: The 2008 SPEC National Partner Survey was conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided. The objectives of this study are:

  • to identify what SPEC staff and managers can do to improve customer service to their National Partners and

  • to track customer satisfaction with SPEC’s progress over time.

Findings: The findings from the SPEC national partners are somewhat contradictory. Overall Satisfaction dropped from 91% satisfied in 2007 to 78% satisfied in 2008 (“Satisfied” is defined as those who rated either a 4 or 5). Conversely, the mean increased slightly from 4.35 to 4.37. Why this contradictory result in Overall Satisfaction? Our analysis showed that, compared with 2007, more partners in 2008 were “Very Satisfied” (rating of 5) and more partners were “Neutral” (rating of 3), while far fewer rated their overall satisfaction a 4. Very Satisfied (5) customers increased from 44% to 61%. Neutral (3) customers also increased from 9% to 20%. However, customers who gave a rating of 4 dropped from 47% to just 17% in 2008.

What is the impact of this shift? The increase in “3” scores produced a marked decrease in the overall percentage “Satisfied” (4s and 5s). However, the mean score remained relatively unchanged because the downward movement of “4” scores to “3” scores was largely offset by the upward movement from “4” scores to “5” scores. Who are the 3s? All but 2 of the 8 Neutral (3) ratings were given by respondents who had not been surveyed the previous year. Nineteen respondents participated in the 2008 survey who had not taken part in the 2007 survey; their ratings were split between Very Satisfied (13 respondents) and Neutral (6 respondents), and no new respondent rated Overall Satisfaction a 4. Another factor was a drop in Overall Satisfaction of two points or more in 2008 from three National Partners who were Very Satisfied in 2007. Follow-up calls surfaced a different reason for each partner’s decline:

  • Respondent 1 (Government Agency): drop from 5 (2007) to 2 (2008). Principal reason: dissatisfaction with the Relationship Manager.

  • Respondent 2 (Volunteer/Community Partnership): drop from 5 (2007) to 3 (2008). Principal reason: dissatisfaction with the impact of National SPEC on Local SPEC activities.

  • Respondent 3 (Volunteer/Community Partnership): drop from 5 (2007) to 3 (2008). Principal reason: lack of information about additional partners.
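The arithmetic behind this shift can be checked with a small sketch. The rating counts below are hypothetical distributions of 100 respondents constructed to mirror the reported percentages (they are not the actual survey records, which had different respondent counts); they show how the top-two-box percentage can fall while the mean barely moves.

```python
# Illustrative only: hypothetical 1-5 rating distributions chosen to
# mirror the percentages reported in the text, not actual survey data.

def summarize(ratings):
    """Return (mean score, percent rating 4 or 5) for a list of 1-5 ratings."""
    mean = sum(ratings) / len(ratings)
    top_two_box = 100 * sum(1 for r in ratings if r >= 4) / len(ratings)
    return round(mean, 2), round(top_two_box)

# 2007-like distribution: 44% fives, 47% fours, 9% threes
y2007 = [5] * 44 + [4] * 47 + [3] * 9
# 2008-like distribution: 61% fives, 17% fours, 20% threes, 2% twos
y2008 = [5] * 61 + [4] * 17 + [3] * 20 + [2] * 2

print(summarize(y2007))  # → (4.35, 91): mean 4.35, 91% satisfied
print(summarize(y2008))  # → (4.37, 78): mean up slightly, % satisfied down
```

The movement of former “4” raters down to “3” pulls the top-two-box score from 91% to 78%, while the simultaneous movement of other “4” raters up to “5” keeps the mean essentially flat, reproducing the apparent contradiction described above.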

Actions taken or lessons learned: Emphasize with Relationship Managers the importance for Overall Satisfaction of identifying and establishing partnerships with community based organizations—the two areas with which National Partners were least satisfied and for which satisfaction scores are substantially lower than last year. Renew efforts to improve communication about SPEC’s efforts to maintain and improve the privacy, security and confidentiality of taxpayer data with National Partners who provide return preparation services.


Control # and Name: CS-08-81 Re: Outreach and Educational Activities

Participants: 1,810 (3,298 Requests)

Data Collection Began: December 2007 Data Collection Ended: September 2008 Burden Hours: 115

Cost: $300 Response Rate: 55%

Purpose: The objective of the research was to determine customer opinion on outreach activities and on information which FSLG provides to their customers.

Findings: For all characteristics of the outreach activity covered by the feedback form, the majority of participants chose the most favorable option, indicating they were very satisfied. Over 79 percent were satisfied (the two most favorable choices) with every factor; fewer than 3 percent were dissatisfied (the two least favorable choices). Over 25 percent of the participants offered comments, the majority of which were positive. Suggestions included allowing more time for individual sessions, conducting sessions more frequently, and topics for future agendas. Overall, respondent comments indicated that they were very satisfied with the presenters’ knowledge, presentation skills, the subject matter of the sessions, the tailoring of outreach sessions toward specific audiences, and presentations showing that the IRS was approachable and helpful.

Actions taken or lessons learned: The results provided the information and customer opinion needed to determine, in an effective and immediate manner, whether the outreach events are accomplishing their goals. FSLG will be able to identify and implement changes accommodating customer needs. FSLG also used the results to make decisions on the most effective methods of distributing outreach program information, as well as which audiences to target.


Control # and Name: CS-08-82 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-83 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-84 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-85 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-86 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-87 Re: C&L 2008 Tax Forum Survey

Participants: 1,202 (3,333)

Data Collection Began: July 2008 Data Collection Ended: October 2008 Burden Hours: 200

Cost: $190 Response Rate: 45%


Purpose: The Tax Forum Survey asked recent attendees of the 2008 Nationwide Tax Forums about the quality of the content, logistics, and their overall experience.

Findings: The 2008 survey provided input on how the individual seminars could be modified to provide more current and relevant tax administration content.

Actions taken or lessons learned:

  • We supplied the findings to the executives of participating IRS divisions, with comments and input on how their seminars can be enhanced.



Control # and Name: CS-08-88 Re: CAP Survey

Participants: 42

Data Collection Began: 7/08 Data Collection Ended: 8/08 Burden Hours: 7

Cost: $46,000 Response Rate: 65%

Purpose: To understand satisfaction levels among taxpayers in the Compliance Assurance Process (CAP) program for tax year 2007 and to compare findings with survey data from 2006. This study included 42 participants, an increase from the 30 participants last year.

Findings: Overall satisfaction is significantly higher this year (4.60) compared to 2007 (4.23). Customers’ overall perceptions of the CAP process have improved. It appears that the IRS has been successful in addressing key concerns from prior years related to the real-time nature of CAP audits. CAP program loyalty remains very high. Nearly all (97%) of CAP participants reported being likely to recommend the program.

Actions taken or lessons learned: The results were used, along with a survey of revenue agents, in evaluating whether the program should be continued, expanded, or abandoned. Results from the survey supported expanding the number of participants in the program. Additionally, the results are being used to improve the program by addressing customer concerns and issues.


Control # and Name: CS-08-89 Re: Indian Tribal Governments Survey

Participants: 197 (562)

Data Collection Began: 1/08 Data Collection Ended: 8/08 Burden Hours: 58

Cost: $847 Response Rate: 29%

Purpose: The survey was designed to ascertain the level of customer satisfaction with products and services from the IRS that affect Indian tribes.

Findings: In general, tribes reported a 76% satisfaction rate but believe the IRS needs to improve by being more timely in its interactions with tribes and by conducting more extensive and frequent outreach/education. When contrasted with FY 2007 survey results, significant increases in customer satisfaction were realized with the Navajo Chapters (autonomous villages) and Alaska tribal villages, where extensive outreach was performed in FY 2008. These entities are now more satisfied with the IRS than the tribal entities in the remainder of the country.

Actions taken or lessons learned: We are developing an Action Plan to implement some new products and revised procedures that will address specific areas of dissatisfaction noted above. In particular, we are designing an Outreach program modeled after the successful programs in Alaska and on the Navajo reservation.


Control # and Name: CS-08-90 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-91 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-92 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-93 Re: IRS Web Site Help Desk

Participants: 3,605 (55,937 Requests)

Data Collection Began: August 2008 Data Collection Ended: November 2008 Burden Hours: 222

Cost: $2,500 Response Rate: 6.4%

Purpose: Measure satisfaction with the level of service provided to IRS Web Site visitors who contacted the IRS Web Site Help Desk via phone, web chat, and e-mail.

Findings: Satisfaction varied by contact method. Respondents surveyed post-call were the most satisfied with all CSR attributes, as evidenced by mean scores ranging from 6.3 to 6.6 on a 7-point scale. Satisfaction ratings for e-mail contacts ranged from 3.6 to 3.8, and ratings based on web chat experiences ranged from 1.9 to 5.5, on the same scale. Regression analysis showed that the representative’s ability to take care of the taxpayer’s issue or question had the most impact on overall satisfaction for both the phone and web chat experiences. Cross-tabulation with purpose of call showed that respondents who contacted the IRS Web Site Help Desk “to check the status of a refund” were least satisfied with this attribute. By phone, 59% had issues resolved in the first call, and 14% had issues that were partially but not completely resolved. By e-mail, 18% had issues resolved in the first contact, and 66% were not resolved. By web chat, 38% indicated issues were resolved in one chat session, and 47% had unresolved issues.

Actions taken or lessons learned: Sample size and margin of error varied by channel, with phone having the largest sample (n=2,900) and lowest margin of error (1.82%). Due to a small sample (n=95), the margin of error for surveys regarding web chat experiences was high (10.05%), rendering the validity of this quarter’s web chat results questionable. The target margin of error is less than 5%; it is recommended that web chat surveys be cumulated over a prolonged period to reach the target margin of error and increase the data’s validity. Respondents selected ‘other’ as their purpose of contact the majority of the time, regardless of channel. It is unclear whether respondents selected ‘other’ even when an applicable alternative was listed, whether the options provided need to be improved, or whether a one-time program, such as stimulus checks, influenced this. Continued monitoring for another quarter is recommended to see if ‘other’ continues to dominate the responses to this question. The findings demonstrate a continuing disconnect between what taxpayers expect from the IRS Web Site Help Desk and the team’s actual intended purpose, which is to help taxpayers navigate the web site.
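The reported margins of error are consistent with the standard worst-case formula for a proportion estimated from a simple random sample. A minimal sketch, assuming (as is conventional, though the report does not state it) a 95% confidence level (z = 1.96) and the conservative p = 0.5:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Margin of error, in percent, for a proportion estimated from a
    simple random sample of size n at ~95% confidence (z = 1.96).
    p = 0.5 gives the conservative (maximum) margin."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(2900), 2))  # phone sample → 1.82
print(round(margin_of_error(95), 2))    # web chat sample → 10.05
```

Because the margin shrinks only with the square root of n, pooling web chat surveys across quarters (as recommended above) is the practical way to bring the 10.05% figure under the 5% target: roughly four times the sample is needed to halve the margin.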


Control # and Name: CS-08-94 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-95 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-96 Re: Taxpayer Advocate FY2008 Survey

Participants: 16,648

Data Collection Began: January 2008 Data Collection Ended: November 2008 Burden Hours: 2,497

Cost: $611,889 Response Rate: 32%

Purpose: To measure customer satisfaction of the Taxpayer Advocate Service at all office levels.

Findings: In FY 2008, 85% of all TAS customers surveyed reported being satisfied overall with the services they received. The customer issues identified in the survey that correlate most highly with overall satisfaction involve whether the case advocate:

provided an estimate of how long it might take to resolve the problem;

solved the problem in the timeframe provided;

kept the customer informed about progress in solving the problem; and

resolved the problem in a reasonable timeframe.

Additional analysis of the FY 2008 data showed the following. Impact of Relief on Satisfaction: Predictably, customers who receive some form of relief or assistance from TAS are more likely to report being satisfied with the service provided. This conclusion has been borne out over several years of data collection and, in FY 2008, is reflected in the fact that 89% of the customers who received relief or assistance from TAS reported being satisfied, compared with 66% satisfaction for those who received neither relief nor assistance.

Impact of TAS Experience on Taxpayer Opinions of IRS: Since TAS provides a form of service remediation that is intended to restore taxpayers to mainstream IRS processes, it is important to understand how taxpayers view the overall tax environment after their issue has been resolved by TAS. In this respect, 50% of the taxpayers who responded to the survey in FY 2008 indicated that they had a higher opinion of the IRS as a result of their TAS experience.

Action taken or lessons learned: TAS Initiatives

Hiring Initiatives: The hiring initiatives begun in FY 2006 have continued through FY 2008 and are expected to extend beyond FY 2009. This should reduce local workload and increase the quality of service.

Office Consultation Visits: Starting this year, TAS formed a team to work with some of its offices to identify opportunities to improve customer satisfaction attributes. The team consists of a consultant from Macro International and representatives from various levels within TAS. TAS uses a continuous improvement methodology, DMAIC (Define-Measure-Analyze-Improve-Control), as part of these office visits to analyze survey data, identify improvement opportunities, formalize actions for improvement, and monitor results achieved. TAS schedules follow-up meetings with each office to provide assistance, reinforce the process principles presented during the consultations, and monitor the post-visit results.

Nationwide Inventory Balancing: To optimize the opportunity for case advocates to provide quality service, TAS has implemented a process for moving workload among its offices based upon available resources.

Revised Service Level Agreements: TAS renegotiates its agreements with the IRS operating divisions that handle taxpayer cases sent to them by TAS for follow-up or action. Timeliness is one of the primary drivers of customer satisfaction, and efforts to provide more efficient operating procedures should have a positive impact on customer satisfaction scores. One of TAS’s primary tools is the Operational Assistance Request (OAR), a mechanism for engaging the IRS operating divisions’ cooperation in resolving its customers’ cases. As a result of recently revised agreements, TAS was successful in shortening the cycle time of this process.


Control # and Name: CS-08-97 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-98 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-99 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-100 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-101 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-102 Re: SB/SE Tax Professional Survey

Participants: 1,800 (5,124 Requests)

Data Collection Began: November 2008 Data Collection Ended: January 2009 Burden Hours: 864

Cost: $350,766 Response Rate: 35%

Purpose: The research is designed to help SB/SE understand who their practitioners are, recognize how practitioners are contacting the IRS and their success with different methods, develop new strategies for improving the practitioners’ effectiveness and ease of dealing with the IRS and, ultimately, their satisfaction, and develop new strategies to address the Tax Gap in support of SB/SE’s current goals. Specifically, the research documents practitioner behavior patterns in their pre-filing, filing, and post-filing experiences on behalf of their SB/SE clients and links key improvement opportunities to these patterns; provides practitioner feedback on the IRS Web site, e-filing, and other services; and suggests tactics for improving practitioner satisfaction and business results.

Findings: Overall satisfaction ratings are slightly but not significantly higher this year (63% satisfied) compared to last year (59% satisfied). Practitioners who reported the highest percentage of clients receiving notices include those whose clients mostly live in an area with a population of more than 200,000, and specifically Unenrolled Return Preparers whose clients mostly live in an area with a population of 200,000 or fewer. CPAs tend to be less satisfied, and they also tend to report fewer clients with notices. They tend to rely on non-IRS sources of information, are less likely to enroll in e-services and to e-file, and tend to help clients more with complex issues such as resolving notices.

Resolving Your Clients’ IRS Notices and Getting Client Account Information from IRS Pre-Filing are the two highest-priority improvement arenas. These are the two areas where tax professionals must rely on the IRS for success; success in these areas means more timely issue resolution and more accurate returns. Fifty percent of practitioners rely mostly on non-IRS sources for information or guidance on tax issues; the most frequently used non-IRS source is a paid tax service. The most frequently used IRS source is IRS.gov, with 85% of practitioners visiting for 2007 taxes. Eighty percent of practitioners contacted the IRS for forms or guidance, and 25% contacted the IRS for client history. Among practitioners who helped clients with notices, 94% contacted the IRS, most often through the mail (82%). Sixty-five percent of practitioners surveyed felt that their contacts with the IRS helped them avoid subsequent problems or errors. The most common additional information or guidance desired from the IRS is how to respond to and resolve notices or account issues (83%).

Only 48% of practitioners were enrolled in e-services, with only 17% to 23% of all practitioners actually using each service. The most common reasons for not enrolling in and not using e-services are lack of knowledge and the belief that the services are not needed or useful. CPAs and Unenrolled Return Preparers are less likely than Enrolled Agents to be enrolled in e-services. E-file usage increased in 2008: 51% claim they e-filed all or most of their clients’ income tax returns in 2008, versus 45% in 2007. Those who e-file income taxes reported higher satisfaction with Preparing/Filing Your Client’s Income Tax Return. Those most likely to file by mail rather than e-file include practitioners with clients mostly in an area with a population of more than 200,000 (especially CPAs and Unenrolled Return Preparers) and practitioners with clients mostly in an area with a population of 200,000 or fewer who have more than 5 employees in their firm.

Almost all practitioners surveyed (95%) help their clients resolve post-filing issues such as IRS notices. On average, practitioners believe 52% of their cases were resolved with no change in the amount owed, 15% with the IRS owing money, and 33% with the client owing money. Practitioners who commonly assist their clients with notices due to unreported income or problems with deductions reported an average of 34% of cases in which they had submitted information with the original return that, in their professional judgment, should have prevented the notice from being issued. However, the IRS does not look at additional information unless an audit is in progress. On average, practitioners represented about 8% of their clients for compliance problems in the past year. Of those clients represented by the practitioner, on average, 16% were represented even though the practitioner did not prepare and file the original return.

Actions taken or lessons learned:

Overall directions from the Tax Professional Survey will be included in a Joint Recommendations report, together with directions from the Customer Base Survey.

Control # and Name: CS-08-103 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-104 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-105 Re: Correspondex Letter 4310-C

Participants: 61 (165 Requests)

Data Collection Began: November 2008 Data Collection Ended: December 2008 Burden Hours: 27

Cost: $732 Response Rate: 37%

Purpose: In January of 2008, the IRS Office of Identity Theft and Incident Management selected 100 previously investigated taxpayers for a pilot program. A 4310-C Refund Crime letter was mailed to the 100 pilot program taxpayers whose PII had been used to file multiple returns for tax year 2006. Following that pilot program, the IRS mailed the same letter to taxpayers whose PII had been used to file more than one 2007 tax return. In the letter, the IRS explained that the taxpayer’s PII may have been fraudulently used, provided educational information on resources available for self-protection, and recommended actions to take if the taxpayer was in fact a victim of identity theft. The letter stated that no action was needed and did not ask the taxpayer to contact the IRS. Furthermore, the letter referred taxpayers to the Taxpayer Advocate Service (TAS) for any questions and for assistance resolving issues pertaining to identity theft.

The overall objective of the identity theft project was to determine the effectiveness of the letter as well as determine the reasons why taxpayers may respond to the letter by contacting the IRS.

Findings: The majority of survey respondents indicated that they had received, read, and understood the letter. On average, survey respondents were satisfied with the letter. Specifically, respondents felt the language was appropriate and easy to follow. Sixty-five percent of the 58 survey respondents reported monitoring their financial accounts. However, most respondents who had been victims of identity theft did not report taking the proper actions to protect their financial information, which included contacting their financial institution, contacting the fraud department at one of the three major credit bureaus, filing a police report, and filing a complaint with the Federal Trade Commission (FTC). Although letter recipients were asked not to contact the IRS, 80% reported doing so. The most common reasons given for calling were to be sure that their financial information was safe and to get more information about their tax accounts. As an alternative to calling the IRS, the letter stated that the taxpayer could contact the Taxpayer Advocate Service (TAS); however, only 31% reported utilizing this resource.

Actions taken or lessons learned: Findings will be used to evaluate the notice when it is up for revision.


Control # and Name: CS-08-106 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-107 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-108 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-109 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-110 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-111 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-112 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-113 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-114 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-115 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-116 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-117 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-118 Re:

Participants:

Data Collection Began: Data Collection Ended: Burden Hours:

Cost: Response Rate:

Purpose:

Findings:

Actions Taken/Lessons Learned:


Control # and Name: CS-08-119 Re: Notice Simplification Web Survey

Participants: 36,124 (1,200 Requests)

Data Collection Began: January 2009 Data Collection Ended: January 2009 Burden Hours: 473

Cost: $15,605 Response Rate: 84% (combined notices)

Purpose: The primary objective of this web-based research is to understand respondent behavior and comprehension of existing notices and proposed redesigns. More specifically, we collected answers related to perception and comprehension of both “before” and “after” versions of three IRS notices: the CP2000, CP521, and L-1058. The research data and findings will be used to validate the improvement of IRS notices and to create a benchmark and a library of modules.

Findings: The revised CP2000 was a significant improvement over the original version. The revised version greatly reduced the number of problem areas where respondents had difficulty in the original, and it halved the number of respondents having difficulty understanding any part of the notice. The majority of respondents felt that the revised version provided a clearer explanation of what they need to do. There was improvement in both the simplicity and comprehension indexes, and the vast majority of respondents commented on how they prefer the notice in its short form.

The revised CP521 was a significant improvement over the original version. The revised version reduced the number of problem areas where respondents had difficulty in the original, and it reduced the number of respondents having difficulty understanding any part of the notice. Most respondents felt that the revised version provided a clearer explanation of what they need to do, and respondents appreciated that the bill had fewer pages.

The revised L-1058 was a significant improvement over the original version. The revised version greatly reduced the number of problem areas where respondents had difficulty in the original, and it more than halved the number of respondents having difficulty understanding any part of the notice. The vast majority of respondents felt that the revised version provided a clearer explanation of what they need to do, and there was substantial improvement in both the simplicity and comprehension indexes.

Actions taken or lessons learned: Changes to improve the CP2000: Placing the detailed calculations after the summary created an overwhelming list of numbers for respondents to read. Our remedies will be aimed at emphasizing the most critical amounts, visually separating the details, and telegraphing the action (see appendix for details). Include the amount due in the title using large, bold, red type. Further separate the summary information from the more detailed calculation. Streamline the calculation in the billing summary and shorten the text accompanying the calculation. Move the section “What to do next” to the first page to make it easier to find; this will also make the summary information more distinct. Remove the description of penalty charges and refer to them only as something that may apply in the future.

Changes to improve the CP521: Highlighting the subtotal before interest and penalty charges might have distracted respondents by placing three items in bold type within one calculation stream. Our remedies include eliminating the subtotal and isolating the amount due this month. Include the monthly payment in the title using large, bold, red type. Streamline the calculation in the billing summary and simplify the text accompanying the calculation; specifically, eliminate “remaining balance” (the balance after the taxpayer’s previous payment). Separate the taxpayer’s balance from the required monthly payment.

Changes to improve the L-1058: The remedy seems to be a combination of using design for greater emphasis and removing unnecessary subtotaling (see appendix for details). Include the amount owed in the title using large, bold, red type. Simplify the text accompanying the calculation. Consider whether the appeals process should appear on page 1.



