CS-10-214 Supporting Statement

Voluntary Customer Surveys to Implement E.O. 12862 Coordinated by the Corporate Planning and Performance Division on Behalf of All IRS Operations Functions

OMB: 1545-1432

OMB SUPPORTING STATEMENT
STUDY TO MEASURE CUSTOMER SATISFACTION
COMPLIANCE CENTER EXAMINATION – CY2010
FEBRUARY 15, 2010 – DECEMBER 31, 2010
TIRNO-05-Z-000XX



Introduction


Background/Overview


The purpose of the survey is to gauge customer expectations and perceptions about the Compliance Center Examination (CC Exam) process. This research is being conducted as part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction with the service provided.


The Internal Revenue Service (IRS) employs a balanced measurement system consisting of business results, customer satisfaction, and employee satisfaction. The Compliance Center Examination section within the Compliance Operating Unit (OU) of Wage and Investment is responsible for responding to customer technical and account inquiries, resolving customer account issues, providing account settlement (payment options), and working related issues. As an important customer interface for Wage and Investment, CC Exam needs feedback from customers to continuously improve its operations. This initiative is part of the Service-wide effort to establish a system of balanced organizational performance measures mandated by the IRS Restructuring and Reform Act of 1998. It is also a result of Executive Order 12862, which requires all government agencies to survey their customers and incorporate customer preferences in their process improvement efforts.


The key goals of the survey are to:


  • Track nationwide customer satisfaction at the five Wage and Investment Compliance Center Examination (CC Exam) sites and

  • Identify operational improvements.



The results should facilitate more effective management of W&I CC Exam by providing:

  • Insight from the customer’s perspective about possible improvements.

  • Useful input for program evaluation and execution at the programmatic and field office levels of service delivery.



Objectives of Data Collection


The objectives of this study are to:


  • Identify what CC Exam staff and managers can do to improve customer service and

  • Track customer satisfaction with CC Exam’s progress over time.




Methodology


Sample Design


The survey universe includes individuals whose income tax returns were examined through correspondence with the IRS and whose cases were then closed. The closing categories include examinations where there were agreements, disagreements, or no changes in the tax liabilities.


The sample does not include businesses that file corporate and partnership returns. It does, however, include individual shareholders and partners examined as a result of a corporate audit, as well as sole proprietors and self-employed farmers.


The CC Exam sample is derived from the Audit Information Management System (AIMS) database, which the five compliance centers send to the contractor each month. The contractor selects 9,120 cases per year and targets 2,280 completed interviews per year (456 per site).
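
For reference, the arithmetic behind these figures works out as follows. This is a minimal sketch in Python; the even allocation across sites and months is an assumption made for illustration, though it matches the per-site, per-month count cited in the Burden Hours section.

    # Sampling arithmetic implied by the figures above (an even split
    # across sites and months is assumed for illustration).
    SITES = 5
    MONTHS = 12
    ANNUAL_SAMPLE = 9120          # cases selected per year
    TARGET_COMPLETES = 2280       # completed interviews targeted per year

    per_site_per_month = ANNUAL_SAMPLE / SITES / MONTHS       # 152 cases
    completes_per_site = TARGET_COMPLETES / SITES             # 456 interviews
    implied_response_rate = TARGET_COMPLETES / ANNUAL_SAMPLE  # 0.25

    print(per_site_per_month, completes_per_site, implied_response_rate)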


Data to be Collected


Customer satisfaction data is collected from Compliance Center Examination taxpayer respondents.


How Data Collected & Used


The contractor administers the survey by mail on a monthly basis. Standard procedures are used in order to obtain the highest response rate possible for the mail survey. These include: 1) an advance letter about the survey; 2) the initial survey with a cover letter; 3) a postcard reminder; and, 4) a second letter and survey to non-respondents.


The contractor, on a quarterly basis, summarizes the quantitative ratings and produces a national report showing customer satisfaction scores on all CC Exam survey items and overall improvement priorities for the function. Quarterly, the contractor also produces 12-month rolling data by site, presented as an extra slide in the quarterly report. On an annual basis, the contractor prepares five site reports containing individual site scores on each of the survey items and improvement priorities for the individual sites. The contractor includes any relevant database variables in the analysis and weights the survey responses as necessary to accurately reflect the entire customer base.
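
The contractor's specific weighting procedure is not described in this statement. The sketch below shows one common approach (post-stratification by site, so the weighted respondent mix matches each site's share of the customer base) purely for illustration; the site labels and counts are invented.

    # Hypothetical post-stratification weighting by site. A respondent's
    # weight is the site's share of the customer base divided by the
    # site's share of respondents, so weighted totals match the base.
    population = {"site_a": 1824, "site_b": 1824, "site_c": 1824,
                  "site_d": 1824, "site_e": 1824}   # sampled cases (invented)
    respondents = {"site_a": 510, "site_b": 430, "site_c": 470,
                   "site_d": 420, "site_e": 450}    # completes (invented)

    total_pop = sum(population.values())
    total_resp = sum(respondents.values())

    weights = {site: (population[site] / total_pop) / (respondents[site] / total_resp)
               for site in population}
    print(weights)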


Reports of survey findings are distributed to the IRS quarterly. Each report is delivered approximately seven weeks after the survey cut-off date for the quarter.


For the quarterly reports, the contractor uses basic and advanced statistical techniques including, but not limited to, analysis of variance and the identification of improvement priorities using the contractor's established technique.


Dates of Collection Begin/End


Data collection runs from February 15, 2010 through December 31, 2010.


Who is Conducting the Research/Where


The contractor is responsible for ensuring the sample is pulled and for conducting the data analysis. A separate GPO contractor is responsible for printing and administering the survey via mail, and then providing the dataset to the contractor.


Cost of Study


The estimated cost for this survey is $108,399.


Expected Response Rate


The expected response rate is 25%. The expected response rate on this survey is lower than the OMB 50% target rate because the sample consists of taxpayers with compliance issues: they either owe money to the IRS or have unfiled tax returns. This group of taxpayers generally does not seek out the IRS. In addition, no incentives are offered to encourage taxpayers to respond to the survey. Mail surveys also traditionally yield lower response rates than other methodologies such as telephone or in-person interviews, which offer the advantage of interviewer contact; interviewers can further encourage taxpayers to respond through refusal conversion techniques. The mail survey methodology employs best practices in maximizing response rates by sending as many mailings as justified without creating extra burden for taxpayers.


With regard to the low response rate, the IRS will assume that all data collected from this survey is qualitative in nature, and that no critical decisions will be made by this office solely from the analysis of data from this survey. The results from this survey are simply one piece of a larger set of information needed to assess the needs related to services provided by the IRS.


Methods to Maximize Response Rate


The questionnaire length is minimized to reduce respondent burden, which tends to increase response rates. Respondents are assured anonymity of their responses. Also, weighting procedures can be applied to adjust aggregated data from those who do respond.


Test Structure and Design


The Compliance Center Examination questionnaire is an established and tested survey instrument. If changes are made to the questionnaire, they are expected to be minor.


The questionnaire is based on the contractor's leverage analysis, which asks respondents to evaluate various aspects of their experience and to provide an overall summary evaluation. The questionnaire was developed based on input from a focus group with customers who had a recently closed case with Compliance Center Examination.


The survey will include several rating questions evaluating service delivery during the CC Exam process as well as several demographic items. In addition, ample space will be provided for suggestions for improvement.


Survey scoring for this contract will be based on the Customer Satisfaction Survey Score, the response average to the keystone question – "How would you rate your overall experience with the way your audit was handled?" Questions will utilize a 5-point rating scale, with 1 being very dissatisfied and 5 being very satisfied. All survey responses are kept private to the extent allowed by law, and the contractor ensures that taxpayers responding to the survey are guaranteed anonymity.
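
As a minimal illustration of this scoring, the score is simply the mean of the keystone-question responses on the 5-point scale; the responses below are invented.

    # Customer Satisfaction Survey Score: the response average to the
    # keystone question on a 5-point scale (1 = very dissatisfied,
    # 5 = very satisfied). Responses are invented for illustration.
    keystone_responses = [5, 4, 3, 5, 2, 4, 4, 5, 1, 4]

    score = sum(keystone_responses) / len(keystone_responses)
    print(f"Customer Satisfaction Survey Score: {score:.2f}")  # 3.70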


Efforts to Avoid Duplicate Research


This is the only mail-based customer satisfaction survey currently conducted by W&I Compliance for Compliance Center Examination customers.


Participant Criteria


The CC Exam survey participants are pulled as a random sample from the Audit Information Management System (AIMS) database, which the five compliance centers send to the contractor each month.


Privacy, Disclosure and Security Issues


The IRS will ensure compliance with the Taxpayer Bill of Rights II. All participants will be treated fairly and appropriately.


The security of the data used in this project and the privacy of participants will be carefully safeguarded at all times. Security requirements are based on the Computer Security Act of 1987 and Office of Management and Budget Circular A-130, Appendices A & B. Physical security measures include a locked, secure office. Audiotapes are stored in locked cabinets. Transcriptions of audiotapes are stored in locked cabinets or shredded. Data security at the appropriate levels has been accomplished: systems are password protected, users are profiled for authorized use, and individual audit trails are generated and reviewed periodically.


The IRS will apply and meet fair information and record-keeping practices to ensure privacy protection of all participants. This includes criteria for disclosure (laid out in the Privacy Act of 1974, the Freedom of Information Act, and Section 6103 of the Internal Revenue Code), all of which provide for the protection of taxpayer information as well as its release to authorized recipients. Privacy will be safeguarded; participants will not be identified to IRS personnel, and no participant names will be mentioned in the reports or data files. Participants will be advised that comments will be audiotaped. Privacy is assured by virtue of agency policy.


Pursuant to the Federal Information Security Management Act (FISMA), Title III of the E-Government Act of 2002, P.L. 107-347, the contractor shall provide the minimum security controls required to protect Federal information and information systems. The term "information security" means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide confidentiality, integrity, and availability.


The contractor shall provide information security protections commensurate with the risk and magnitude of the harm resulting from the unauthorized access, use, disclosure, disruption, modification, or destruction of information collected or maintained by or on behalf of the agency, or of information systems used or operated by an agency or by a contractor of an agency. This applies to individuals and organizations having contractual arrangements with the IRS, including employees, contractors, and outsourcing providers, which use or operate information technology systems containing IRS data. The contractor shall comply with Department of Treasury Directive TD P 85-01, Treasury Security Manual TDP 71-10, and Internal Revenue Manual 10.8.1, Information Technology Security Policy and Guidance. The contractor shall comply with IRS Internal Revenue Manuals (IRM) and Law Enforcement Manuals (LEM) when developing or administering IRS information and information systems. The contractor shall comply with the Taxpayer Browsing Protection Act of 1997 - Unauthorized Access (UNAX), which amends Section 6103 of the Internal Revenue Code of 1986 to prevent the unauthorized inspection of taxpayer returns or tax return information. Contractor systems that collect, maintain, operate, or use agency information or an information system on behalf of the agency (a General Support System (GSS) or a Major or Minor Application with a FIPS 199 security categorization) must ensure that annual reviews, risk assessments, security plans, control testing, a Privacy Impact Assessment (PIA), contingency planning, and certification and accreditation at a minimum meet NIST guidance, if required by the IRS.


The contractor shall be subject, at the option/discretion of the agency, to periodic testing (but no less than annually) and evaluation of the effectiveness of its information security controls and techniques. The assessment of information security controls may be performed by an agency independent auditor, security team, or Inspector General, and shall include testing of management, operational, and technical controls of every information system that maintains, collects, operates, or uses federal information on behalf of the agency. The agency and contractor shall document and maintain a remedial action plan, also known as a Plan of Action and Milestones (POA&M), to address any deficiencies identified during the test and evaluation. The contractor must cost-effectively reduce information security risks to an acceptable level within the scope, terms, and conditions of the contract.


The contractor shall maintain procedures for detecting, reporting, and responding to security incidents, and for mitigating risks associated with such incidents before substantial damage is done to federal information or information systems. The contractor shall immediately report all computer security incidents that involve IRS information systems to the IRS Computer Security Incident Response Center (CSIRC). Any theft or loss of IT equipment containing federal information/data must be reported to CSIRC within one hour of the incident. Incidents involving the loss or theft of sensitive but unclassified (SBU) data (i.e., taxpayer data or PII) shall be reported to CSIRC, the first-line manager, and the Treasury Inspector General for Tax Administration (TIGTA). Based on the computer security incident type, CSIRC may further notify the Treasury Computer Security Incident Response Capability (TCSIRC) in accordance with TCSIRC procedures.


Burden Hours


The survey interview is designed to minimize burden on the taxpayer. The time that a respondent takes to complete the mail survey is carefully considered and only the most important areas are being surveyed. The average time of survey completion is expected to be 7 minutes. The questions are generally one sentence in structure and on an elementary concept level.


Based on a sample of potential respondents of 9,120 (approximately 152 per site per month) and a response rate of 25%, we expect 2,280 survey participants per year, leaving 6,840 non-participants. The contact time to determine non-participation could take up to two minutes, with the resulting burden for non-participants being 6,840 × 2 minutes = 13,680 minutes ÷ 60 = 228 burden hours.


For participants, the time to complete the survey is 7 minutes. This reflects the time to read the pre-read letter (2 minutes) as well as the time to complete the survey (5 minutes maximum). The time burden for participants is 2,280 × 7 minutes = 15,960 minutes ÷ 60 = 266 burden hours. Thus the total burden for the survey would be 228 + 266 = 494 burden hours.
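
The burden-hour arithmetic above can be checked with a short calculation; this sketch simply reproduces the figures stated in this section.

    # Burden-hour arithmetic as stated in this section.
    sample = 9120
    response_rate = 0.25

    participants = int(sample * response_rate)          # 2,280
    non_participants = sample - participants            # 6,840

    non_participant_hours = non_participants * 2 / 60   # 2 min each -> 228.0
    participant_hours = participants * 7 / 60           # 7 min each -> 266.0

    total_hours = non_participant_hours + participant_hours
    print(total_hours)                                   # 494.0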


Attachments


Compliance Center Examination survey
L1 - Advance letter (pre-note) about the survey
L2 - Cover letter with the survey
L3 - Postcard reminder
L4 - Second letter and survey to non-respondents

