CS-10-210 Supporting Statement

Voluntary Customer Surveys to Implement E.O. 12862 Coordinated by the Corporate Planning and Performance Division on Behalf of All IRS Operations Functions

OMB: 1545-1432

OMB SUPPORTING STATEMENT STUDY TO MEASURE CUSTOMER SATISFACTION AUTOMATED UNDERREPORTER (AUR) – CY2010


FEBRUARY 1, 2010 - DECEMBER 31, 2010 TIRNO-05-Z-000XX



Introduction


Background/Overview


The Internal Revenue Service (IRS) employs a balanced measurement system consisting of business results, customer satisfaction, and employee satisfaction. This initiative is part of the Service-wide effort to establish a system of balanced organizational performance measures mandated by the IRS Restructuring and Reform Act of 1998. It also responds to Executive Order 12862, which requires all government agencies to survey their customers and incorporate customer preferences in their process improvement efforts.


The Automated Underreporter (AUR) section within the Compliance Operating Unit (OU) of the Wage and Investment (W&I) Division is responsible for notifying taxpayers of discrepancies between the income information reported on their tax returns and the income information supplied by their employers and other organizations. As an important customer interface for Wage & Investment, Automated Underreporter needs feedback from customers (i.e., taxpayers) to continuously improve its operations.


The key goals of the survey are:


  • Identify customer expectations of Automated Underreporter,

  • Track customer satisfaction at the three Wage and Investment Automated Underreporter sites and nationwide, and

  • Identify operational improvements.



Objectives of Data Collection


The objective of the survey is to gauge customer expectations and perceptions about the AUR process. The survey results should facilitate more effective management of W&I Automated Underreporter by:


  • Providing insight from the customer’s perspective about possible improvements.

  • Providing useful input for program evaluation and execution at the programmatic and field-office levels of service delivery.




Methodology


Sample Design


The sample database consists of taxpayers with closed AUR cases. The contractor selects 1,600 taxpayers per quarter (stratified across the three W&I AUR sites), which, at the expected 30% response rate, yields approximately 480 completed questionnaires each quarter (160 per site) and 1,920 completed questionnaires per year (640 per site).
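
For illustration only, the quarterly stratified selection described above could be sketched as follows. This is a minimal Python example; the site labels, field names, and function are hypothetical and do not represent the contractor's actual sampling procedure.

    # Illustrative sketch only: quarterly sample selection stratified by AUR site.
    # Site labels and field names are hypothetical.
    import random

    SITES = ["Site A", "Site B", "Site C"]        # the three W&I AUR sites
    QUARTERLY_SAMPLE = 1600                        # taxpayers selected per quarter
    PER_SITE = QUARTERLY_SAMPLE // len(SITES)      # roughly equal allocation per site

    def select_quarterly_sample(closed_cases):
        """closed_cases: list of dicts such as {'case_id': ..., 'site': 'Site A'}."""
        sample = []
        for site in SITES:
            site_cases = [c for c in closed_cases if c["site"] == site]
            sample.extend(random.sample(site_cases, min(PER_SITE, len(site_cases))))
        return sample

    # At the expected 30% response rate, 1,600 mailed surveys per quarter yield
    # about 480 completed questionnaires (160 per site), or 1,920 per year.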










Data to be Collected


The survey collects customer satisfaction data from taxpayers whose Automated Underreporter cases have been closed.


How Data Collected & Used


The questionnaire is based on the contractor’s Net Impression methodology, which asks respondents to evaluate various aspects of their experience and to provide an overall summary evaluation. The questionnaire was developed based on input from focus groups with customers who had received an IRS notice of a discrepancy between their Forms W-2 or 1099 and the earnings information reported on their tax returns.


The contractor administers the survey by mail on a monthly basis. Standard procedures will be used to obtain the highest possible response rate for the mail survey. These include: 1) an advance letter about the survey; 2) the initial survey with a cover letter; 3) a postcard reminder; and 4) a second letter and survey to non-respondents.


The contractor, on a quarterly basis, summarizes the quantitative ratings and produces a national report showing customer satisfaction scores on all AUR survey items and overall improvement priorities for the function. The contractor delivers two versions of each national report, one with any appropriate site-level data and one without site references. On an annual basis, the contractor prepares three site reports containing individual site scores on each of the survey items and improvement priorities for the individual sites. The contractor includes any relevant database variables in the analysis and weights the survey responses as necessary to accurately reflect the entire customer base.
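
For illustration only, a minimal sketch of how site-level weights might be derived so that aggregated responses reflect the entire customer base. The population and respondent counts below, and the choice of site as the weighting variable, are hypothetical assumptions rather than figures from this study.

    # Illustrative sketch only: simple post-stratification weights by site.
    # All counts are hypothetical placeholders.
    population_counts = {"Site A": 120_000, "Site B": 95_000, "Site C": 110_000}
    respondent_counts = {"Site A": 640, "Site B": 640, "Site C": 640}

    total_pop = sum(population_counts.values())
    total_resp = sum(respondent_counts.values())

    # Weight = (site share of the customer base) / (site share of respondents),
    # so weighted tallies reflect the entire customer base rather than the sample.
    weights = {
        site: (population_counts[site] / total_pop) / (respondent_counts[site] / total_resp)
        for site in population_counts
    }
    print(weights)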


Reports of survey findings are distributed to the IRS each quarter. Each report is delivered approximately seven weeks after the survey cut-off date for the quarter.


Dates of Collection Begin/End


Data collection runs from February 1, 2010 through December 31, 2010.


Who is Conducting the Research/Where


The contractor is responsible for ensuring the sample is pulled and for conducting the data analysis. A separate GPO contractor is responsible for printing and administering the survey by mail and then providing the dataset to the contractor.


Cost of Study


The estimated cost for this survey is $112,187.


Expected Response Rate


The expected response rate is 30%. The expected response rate on this survey is lower than the OMB 50% target rate because the sample consists of taxpayers with compliance issues: they did not report all of their income to the IRS, and this group of taxpayers generally does not seek out contact with the IRS. In addition, no incentives are offered to encourage taxpayers to respond to the survey. Mail surveys also traditionally yield lower response rates than other methodologies such as telephone or in-person interviews. Telephone and in-person techniques offer the advantage of contact with an interviewer, who can further encourage taxpayers to respond through refusal conversion techniques. The mail survey methodology employs best practices for maximizing response rates by sending out as many mailings as justified without creating extra burden for taxpayers.









With regard to the low response rate, the IRS will assume that all data collected from this survey is qualitative in nature, and that no critical decisions will be made by this office solely from the analysis of data from this survey. The results from this survey are simply one piece of a larger set of information needed to assess the needs related to services provided by the IRS.


Methods to Maximize Response Rate


The questionnaire length is minimized to reduce respondent burden, which tends to increase response rates. Respondents are assured that their responses are anonymous. In addition, weighting procedures can be applied to adjust aggregated data from those who do respond.


Test Structure and Design


The Automated Underreporter questionnaire is an established and tested survey instrument. If changes are made to the questionnaire, they are expected to be minor.


The survey includes several rating questions evaluating service delivery during the AUR process, as well as several demographic items. In addition, ample space is provided for suggestions for improvement. Survey scoring for this contract is based on the average response to the keystone question: “Regardless of whether you agree or disagree with the final outcome, how would you rate your overall experience with the way your discrepancy was handled?” Questions use a 5-point rating scale, with 1 being very dissatisfied and 5 being very satisfied. All survey responses will be released only as summaries. The contractor shall hold the identities of the taxpayers responding to the survey private to the extent permitted by law. The contractor ensures that taxpayers responding to the survey are guaranteed anonymity.
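
For illustration only, a minimal sketch of computing the score as the average response to the keystone question on the 5-point scale described above. The response values shown are hypothetical.

    # Illustrative sketch only: overall satisfaction score as the average
    # keystone-question rating (1 = very dissatisfied, 5 = very satisfied).
    keystone_responses = [5, 4, 4, 3, 5, 2, 4, 5, 1, 4]   # hypothetical ratings

    satisfaction_score = sum(keystone_responses) / len(keystone_responses)
    print(f"Customer Satisfaction Survey Score: {satisfaction_score:.2f} (1-5 scale)")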


Efforts to avoid Duplicate Research


This is the only customer satisfaction survey currently conducted by W&I Compliance for the Automated Underreporter program.


Participants Criteria


Survey participants are drawn as a random sample from closed Automated Underreporter cases by IRS employees at each of the three site locations.


Privacy, Disclosure and Security Issues


The IRS will ensure compliance with the Taxpayer Bill of Rights II. All participants will be treated fairly and appropriately.


The security of the data used in this project and the privacy of participants will be carefully safeguarded at all times. Security requirements are based on the Computer Security Act of 1987 and Office of Management and Budget Circular A-130, Appendices A & B. Physical security measures include a locked, secure office. Audiotapes are stored in locked cabinets. Transcriptions of audiotapes are stored in locked cabinets or shredded. Data security at the appropriate levels has been accomplished. Systems are password protected, users are profiled for authorized use, and individual audit trails are generated and reviewed periodically. The IRS will apply and meet fair information and recordkeeping practices to ensure privacy protection of all participants. This includes the criteria for disclosure laid out in the Privacy Act of 1974, the Freedom of Information Act, and Section 6103 of the Internal Revenue Code, all of which provide for the protection of taxpayer information as well as its release to authorized recipients. Privacy will be safeguarded; participants will not be identified to IRS personnel. In addition, no participant names will be mentioned in the reports or data files. Participants will be advised that comments will be audiotaped. Privacy is assured by virtue of agency policy.









Pursuant to the Federal Information Security Management Act (FISMA), Title III of the E-Government Act of 2002, P.L. 107-347, the contractor shall provide the minimum security controls required to protect Federal information and information systems. The term ‘information security’ means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide confidentiality, integrity, and availability.


The contractor shall provide information security protections commensurate with the risk and magnitude of the harm resulting from the unauthorized access, use, disclosure, disruption, modification, or destruction of information collected or maintained by or on behalf of the agency, or of information systems used or operated by an agency or by a contractor of an agency. This applies to individuals and organizations having contractual arrangements with the IRS, including employees, contractors, and outsourcing providers, that use or operate information technology systems containing IRS data. The contractor shall comply with Department of Treasury Directive TD P 85-01, Treasury Security Manual TD P 71-10, and Internal Revenue Manual 10.8.1, Information Technology Security Policy and Guidance. The contractor shall comply with IRS Internal Revenue Manuals (IRM) and Law Enforcement Manuals (LEM) when developing or administering IRS information and information systems. The contractor shall comply with the Taxpayer Browsing Protection Act of 1997 - Unauthorized Access (UNAX), which amends Section 6103 of the Internal Revenue Code of 1986 to prevent the unauthorized inspection of taxpayer returns or tax return information. Contractor systems that collect, maintain, operate, or use agency information or an information system on behalf of the agency (a General Support System (GSS) or a Major or Minor Application with a FIPS 199 security categorization) must ensure that annual reviews, risk assessments, security plans, control testing, a Privacy Impact Assessment (PIA), contingency planning, and certification and accreditation, at a minimum, meet NIST guidance, if required by the IRS.


The contractor shall be subject, at the option/discretion of the agency, to periodic testing (but no less than annually) and evaluation of the effectiveness of information security controls and techniques. The assessment of information security controls may be performed by an agency independent auditor, security team, or Inspector General, and shall include testing of the management, operational, and technical controls of every information system that maintains, collects, operates, or uses federal information on behalf of the agency. The agency and contractor shall document and maintain a remedial action plan, also known as a Plan of Action and Milestones (POA&M), to address any deficiencies identified during the test and evaluation. The contractor must cost-effectively reduce information security risks to an acceptable level within the scope, terms, and conditions of the contract. The contractor shall maintain procedures for detecting, reporting, and responding to security incidents, and for mitigating risks associated with such incidents before substantial damage is done to federal information or information systems. The contractor shall immediately report all computer security incidents that involve IRS information systems to the IRS Computer Security Incident Response Center (CSIRC). Any theft or loss of IT equipment containing federal information/data must be reported to CSIRC within one hour of the incident. Incidents involving the loss or theft of sensitive but unclassified (SBU) data (i.e., taxpayer data or PII) shall be reported to CSIRC, the first-line manager, and the Treasury Inspector General for Tax Administration (TIGTA). Based on the computer security incident type, CSIRC may further notify the Treasury Computer Security Incident Response Capability (TCSIRC) in accordance with TCSIRC procedures.









Burden Hours


The survey is designed to minimize burden on the taxpayer. The time a respondent takes to complete the mail survey has been carefully considered, and only the most important areas are being surveyed. The average time to complete the survey is expected to be 7 minutes. The questions are generally one sentence long and written at an elementary concept level.


Based on an annual sample of 6,400 potential respondents (three sites, an average of approximately 178 per site each month) and a response rate of 30%, we expect 1,920 survey participants (640 per site), leaving 4,480 non-participants. Non-participants are assumed to spend up to two minutes reading the pre-contact letter, so the resulting burden for non-participants is 4,480 x 2 minutes = 8,960 minutes / 60 = approximately 149 burden hours.


For participants, the time to complete the survey is 7 minutes. This reflects the time to read the pre-notification letter (2 minutes) and the time needed to complete the survey (5 minutes maximum). The time burden for participants is 1,920 x 7 minutes / 60 = 224 burden hours.


Thus, the total burden for the survey is 149 + 224 = 373 burden hours.
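
For illustration only, a minimal sketch that reproduces the burden-hour arithmetic above using the figures stated in this section.

    # Illustrative check of the burden-hour arithmetic (figures from the text).
    annual_sample = 6400                                       # potential respondents per year
    response_rate = 0.30
    participants = int(annual_sample * response_rate)          # 1,920
    non_participants = annual_sample - participants            # 4,480

    non_participant_hours = non_participants * 2 / 60          # pre-contact letter only: ~149 hours
    participant_hours = participants * 7 / 60                  # letter + survey: 224 hours

    total_burden = round(non_participant_hours) + round(participant_hours)  # 149 + 224 = 373
    print(round(non_participant_hours), round(participant_hours), total_burden)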


Attachments


13257-G Automated Underreporter Survey

  • L1 - Advance letter (pre-note) about the survey

  • L2 - Cover letter with the survey

  • L3 - Postcard reminder

  • L4 - Second letter and survey to non-respondents










OMB # 1545-1432

