Supporting Statement - Part B


IRS Taxpayer Burden Surveys

OMB: 1545-2212


OMB SUPPORTING STATEMENT

Internal Revenue Service

IRS Taxpayer Burden Surveys

TIRNO-10-Q-00152


PART B – JUSTIFICATION


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used.


The suite of burden surveys recognizes differences among taxpayer types (individuals, corporations, partnerships, tax-exempt organizations, trusts, estates, employers, and information document issuers). Across all of the surveys, however, the data are captured in an internally consistent manner (in terms of time and money). Some populations are explicitly excluded from the survey population, including minors, deceased taxpayers, and, in most circumstances, taxpayers with international addresses, including active-duty military serving overseas.


Tables 1 – 12 below show the potential respondent universe (population count), stratification plan, and sample allocation for each survey and special study for which a sampling plan exists as of the date this request was submitted. When subpopulations vary considerably, it is advantageous to sample each subpopulation (stratum) independently. Stratification is the process of grouping members of the population into relatively homogeneous subgroups before sampling; an illustrative sketch of the within-stratum selection step follows the list below. The strata should be:


  • Mutually exclusive: each member of the population must be assigned to exactly one stratum, and

  • Collectively exhaustive: no member of the population may be excluded.
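
To make the selection step concrete, the following sketch draws a simple random sample within each stratum, given per-stratum frames and target allocations of the kind shown in the tables below. It is a minimal illustration in Python, with hypothetical frame contents and allocation sizes; it is not the IRS's production sampling procedure.

    # Illustrative sketch only (not IRS production code): draw a stratified random
    # sample given per-stratum frames and target allocations. Frame contents and
    # allocation sizes below are hypothetical placeholders.
    import random

    def stratified_sample(frames, allocations, seed=2021):
        """frames: dict mapping stratum label -> list of sampling-unit IDs.
        allocations: dict mapping stratum label -> target sample size.
        Strata are assumed mutually exclusive and collectively exhaustive."""
        rng = random.Random(seed)
        sample = {}
        for stratum, units in frames.items():
            n = min(allocations[stratum], len(units))
            sample[stratum] = rng.sample(units, n)  # simple random sample within the stratum
        return sample

    # Toy example mirroring Table 1's layout (preparation method x complexity):
    frames = {
        ("Paid Professional", "Low"): [f"unit_{i}" for i in range(10_000)],
        ("Self-Prepared by Software", "High"): [f"unit_{i}" for i in range(10_000, 12_000)],
    }
    allocations = {
        ("Paid Professional", "Low"): 25,
        ("Self-Prepared by Software", "High"): 10,
    }
    print({stratum: len(ids) for stratum, ids in stratified_sample(frames, allocations).items()})

Because every unit belongs to exactly one stratum and every unit belongs to some stratum, the per-stratum samples can be drawn and weighted independently, which is what gives the stratum-level allocations in Tables 1 – 12 their meaning.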



Table 1 – 2019, 2020, and 2021 Individual Taxpayer Burden Surveys

Preparation Method | Complexity | Population Count | Sample Allocation
Used a Paid Professional | Low | 9,758,524 | 2,218
Used a Paid Professional | Medium-Low | 22,015,186 | 4,382
Used a Paid Professional | Medium | 15,372,213 | 3,912
Used a Paid Professional | Medium-High - Simpler | 6,496,505 | 1,546
Used a Paid Professional | Medium-High - Moderate | 4,455,117 | 1,266
Used a Paid Professional | Medium-High - Difficult | 5,235,768 | 2,486
Used a Paid Professional | High - Simpler | 4,333,619 | 1,324
Used a Paid Professional | High - Moderate | 2,500,073 | 1,302
Used a Paid Professional | High - Difficult | 4,446,633 | 3,636
Self-Prepared by Hand | Low | 1,277,493 | 1,526
Self-Prepared by Hand | Medium-Low | 1,581,900 | 846
Self-Prepared by Hand | Medium | 932,670 | 844
Self-Prepared by Hand | Medium-High | 525,801 | 790
Self-Prepared by Hand | High | 201,726 | 806
Self-Prepared by Software | Low | 13,352,100 | 2,018
Self-Prepared by Software | Medium-Low | 21,713,432 | 4,002
Self-Prepared by Software | Medium | 11,570,065 | 3,040
Self-Prepared by Software | Medium-High | 7,571,992 | 2,130
Self-Prepared by Software | High | 1,681,784 | 948
VITA Prepared | ANY | 3,421,941 | 978

Total sample allocation: 40,000



Table 2 – 2019 Individual Taxpayer Compliance (Post-Filing) Burden Survey

Post-Filing Issue Complexity | Return Complexity | Preparation Method | Population Count | Sample Allocation
Appeal | Other than High | Other than Assisted | 2,552 | 1,363
Exam - High | Other than High | Other than Assisted | 4,548 | 1,363
Appeal | High | Assisted | 8,294 | 1,363
Exam - Medium | Other than High | Other than Assisted | 10,168 | 1,363
Collection - High | High | Unassisted | 17,175 | 1,363
Collection - High | Low | Unassisted | 23,015 | 1,363
Amended | High | Unassisted | 27,091 | 1,363
Amended | Low | Unassisted | 32,183 | 1,363
Exam - High | High | Assisted | 65,083 | 1,363
Collection - High | Low | Assisted | 74,933 | 1,363
Exam - Low | High | Unassisted | 90,776 | 1,363
Exam - Medium | High | Assisted | 92,153 | 1,363
Exam - Low | Low | Unassisted | 132,179 | 1,363
Collection - High | High | Assisted | 193,443 | 1,363
Amended | Low | Assisted | 257,990 | 1,363
Collection - Low | High | Unassisted | 308,263 | 1,363
Collection - Low | Low | Unassisted | 386,219 | 1,363
Amended | High | Assisted | 472,121 | 1,377
Exam - Low | Low | Assisted | 1,459,608 | 1,363
Exam - Low | High | Assisted | 2,055,678 | 1,363
Collection - Low | Low | Assisted | 2,361,685 | 1,363
Collection - Low | High | Assisted | 4,037,175 | 1,363

Total sample allocation: 30,000


Table 3 – 2021 Tax-Exempt Organization Burden Survey

Preparation Method | Total Revenue | Population Count | Sample Allocation
Self-Prepared | Less than $5,000 | 31,081 | 832
Self-Prepared | $5,001 - $100,000 | 81,248 | 705
Self-Prepared | $100,001 - $1,000,000 | 48,578 | 1,490
Self-Prepared | $1,000,001 or more | 18,489 | 767
Used a Paid Professional | Equal to zero | 6,273 | 573
Used a Paid Professional | $1 - $5,000 | 16,889 | 465
Used a Paid Professional | $5,001 - $50,000 | 80,604 | 590
Used a Paid Professional | $50,001 - $100,000 | 69,488 | 1,110
Used a Paid Professional | $100,001 - $500,000 | 134,164 | 3,588
Used a Paid Professional | $500,001 - $1,000,000 | 41,783 | 1,085
Used a Paid Professional | $1,000,001 - $5,000,000 | 55,838 | 4,732
Used a Paid Professional | $5,000,001 - $10,000,000 | 12,481 | 1,043
Used a Paid Professional | $10,000,001 or more | 19,014 | 3,020

Total sample allocation: 20,000


Table 4 – 2019 Information Return Burden Survey

# | Clients | Types | Forms | Population Count | Sample Allocation
1 | 500 or Less | Filed 1 Type | F1099MISC - Rent and NonEmp Comp | 877,910 | 989
2 | 500 or Less | Filed 1 Type | F1099MISC - Rent | 290,278 | 395
3 | 500 or Less | Filed 1 Type | F1099MISC - NonEmp Comp | 3,926,013 | 3,757
4 | 500 or Less | Filed 1 Type | F1099MISC - No Rent or No Emp Comp | 180,426 | 395
5 | 500 or Less | Filed 1 Type | F1098 | 49,799 | 1,055
6 | 500 or Less | Filed 1 Type | F1099INT | 92,999 | 1,055
7 | 500 or Less | Filed 1 Type | F1099R | 39,929 | 1,055
8 | 500 or Less | Filed 1 Type | F1099DIV | 23,433 | 1,055
9 | 500 or Less | Filed 1 Type | F1098T | 13,121 | 659
10 | 500 or Less | Filed 1 Type | F1099S | 8,728 | 330
11 | 500 or Less | Filed 1 Type | F1042S | 7,384 | 1,055
12 | 500 or Less | Filed 1 Type | F1099B | 1,193 | 791
13 | 500 or Less | Filed 1 Type | F1099C | 1,319 | 264
14 | 500 or Less | Filed 1 Type | F1099K | 154 | 330
15 | 500 or Less | Filed 1 Type | F5498 | 79 | 227
16 | 500 or Less | Filed 2 or More Types | MISC / No Int / No Div | 69,160 | 527
17 | 500 or Less | Filed 2 or More Types | MISC / No Int / Div | 30,382 | 527
18 | 500 or Less | Filed 2 or More Types | MISC / Int | 207,819 | 659
19 | 500 or Less | Filed 2 or More Types | No MISC / No Int | 3,631 | 527
20 | 500 or Less | Filed 2 or More Types | No MISC / Int / B | 4,323 | 527
21 | 500 or Less | Filed 2 or More Types | No MISC / Int / No B | 2,821 | 527
22 | More than 500 | Filed 1 Type | Form 1099-MISC | 7,307 | 659
23 | More than 500 | Filed 1 Type | Not Form 1099-MISC | 1,454 | 659
24 | More than 500 | Filed 2 or More Types | Form 1099-MISC | 9,768 | 1,318
25 | More than 500 | Filed 2 or More Types | Not Form 1099-MISC | 2,464 | 658

Total sample allocation: 20,000



Table 5 – 2020 Trust and Estate Income Tax Return Burden Survey

Form 1041 Type | Complexity | Population Count | Sample Allocation
Complex Trust or Generation-Skipping Trust | Low | 1,750,000 | 4,502
Complex Trust or Generation-Skipping Trust | High | | 3,001
Simple Trust or Taxable Grantor Type Trust | Low | 750,000 | 2,487
Simple Trust or Taxable Grantor Type Trust | High | | 1,658
Non-taxable Grantor Type Trust | Low | 500,000 | 2,055
Non-taxable Grantor Type Trust | High | | 1,370
Decedent’s Estate | Low | 400,000 | 1,717
Decedent’s Estate | High | | 1,145
Qualified Disability Trust | ANY | 25,000 | 689
Non-Exempt Charitable and Split Interest Trust | ANY | 7,500 | 689
Bankruptcy Estate | ANY | 750 | 688

Note: For the first four Form 1041 types, the population count covers both complexity levels.

Total sample allocation: 20,000



Table 6 – Tax Year 2019 Employment Tax Burden Survey

Primary Form | Number of W-2s | Share with Benefits | Tips | Population Count | Sample Allocation
Form 941 | 1 to 4 | None | ANY | 1,846,642 | 2,604
Form 941 | 5 to 19 | None | ANY | 768,328 | 947
Form 941 | 20 to 99 | None | No | 271,569 | 613
Form 941 | 100 to 299 | None | No | 23,184 | 497
Form 941 | 300 to 999 | None | No | 3,664 | 444
Form 941 | Over 1,000 | None | No | 792 | 445
Form 941 | 1 to 19 | Less Than 10% | No | 410,266 | 841
Form 941 | 20 to 99 | Less Than 10% | No | 252,724 | 613
Form 941 | 100 to 299 | Less Than 10% | No | 56,339 | 497
Form 941 | 300 to 999 | Less Than 10% | No | 11,699 | 448
Form 941 | Over 1,000 | Less Than 10% | No | 3,192 | 466
Form 941 | 1 to 19 | Greater Than 10% | No | 167,492 | 841
Form 941 | 20 to 99 | Greater Than 10% | No | 20,499 | 611
Form 941 | 100 to 299 | Greater Than 10% | No | 10,684 | 496
Form 941 | 300 to 999 | Greater Than 10% | No | 11,572 | 450
Form 941 | Over 1,000 | Greater Than 10% | No | 4,839 | 469
Form 941 | 1 to 19 | ANY | Yes | 82,030 | 841
Form 941 | 20 to 99 | ANY | Yes | 90,295 | 613
Form 941 | 100 to 299 | ANY | Yes | 14,041 | 496
Form 941 | 300 to 999 | ANY | Yes | 2,198 | 441
Form 941 | Over 1,000 | ANY | Yes | 924 | 453
Form 944 | ANY | ANY | ANY | 31,693 | 472
Form 943 | ANY | ANY | ANY | 75,426 | 402

Total sample allocation: 15,000








Table 7 – 2020 Estate Tax Burden Survey





Table 8 – 2020 Gift Tax Burden Survey


Table 9 – 2021 Nonfiler/Late-Filer Burden Survey



Table 10 – 2021 Excise Tax Burden Survey



Table 11 – 2021 Business Entity Special Survey

Strata | Population Count | Sample Allocation
Form 1120 | 1,774,426 | 3,655
Form 1120-S | 4,265,196 | 8,786
Other Form 1120 | 36,324 | 575
Form 1065 | 3,390,363 | 6,984

Total sample allocation: 20,000













Table 12 – 2021 Pension Plan Burden Survey



  2. Describe the procedures for the collection of information.


We have two objectives in the design of the following contact protocols: the efficient collection of the current sample and informing the design of future studies. The exact form of each contact may vary somewhat, depending on whether any survey-related research is conducted during survey administration. Examples of such potential research include the timing of mailings, messaging, shorter-length surveys, and response mode options.









Individual Taxpayer Burden Surveys (TY2019, TY2020, and TY2021)

Contact 1: Pre-note (beginning of data collection period)

The pre-note is a hardcopy letter from an IRS official endorsing the survey and emphasizing the importance of the data collection effort. It notifies the respondent of selection for the survey and provides information about the survey and assurances that there is no risk associated with participation. In addition, respondents are given directions on how to view the survey on the taxstats website. Provided in English and Spanish.

Contact 2: Survey packet (1 – 2 weeks after Contact 1 mails)

The survey packet consists of a paper-and-pencil questionnaire, a pre-addressed postage-paid reply envelope, and a letter from the survey vendor indicating that the enclosed survey is the one referred to in the previously received pre-note and reminding the respondent that completing the survey is voluntary. The paper survey also includes information on how respondents may complete the survey on the web, if so desired.

ITB only: A $2 incentive will be enclosed in this mailing. The survey vendor letter states that the incentive is a token of appreciation.

Contact 3: Thank you/reminder letter (3 weeks after Contact 2 mails)

All respondents will be mailed a thank you/reminder letter. The letter will thank those who have already submitted a completed survey and ask those who have not responded to please do so.

Contact 4: Survey packet (2 weeks after Contact 3 mails)

Nonrespondents are sent the same packet as Contact 2.

ITB only: No incentive is included, and the survey vendor letter is replaced with a letter that does not mention an incentive.

Contact 5: Thank you/reminder letter or phone call (2 weeks after Contact 4 mails)

If no completed survey is received, nonrespondents will receive a follow-up thank you/reminder letter, similar to Contact 3.

ITB only: Nonrespondents who have been matched to a phone number will receive an IVR prompt asking them to complete and return the survey or to call the survey vendor with any questions.

Contact 6: Survey packet (2 weeks after Contact 5 phone or letter follow-up)

Nonrespondents are sent the survey packet, which provides a third copy of the paper-and-pencil questionnaire, a pre-addressed postage-paid reply envelope, and a letter from the survey vendor asking for response. As with the first and second survey packets, the paper survey also includes information on how respondents may complete the survey on the web, if so desired.









All Other Mixed-Mode (Paper and Web) Surveys/Special Studies

Contact 1: Initial survey packet (beginning of data collection period)

The initial survey packet consists of a paper-and-pencil (TeleForm) survey, a letter from the IRS endorsing the survey, a letter from the survey vendor with instructions on completing the survey online, and a postage-paid return envelope.

Contact 2: Thank you/reminder letter (7 – 10 days after Contact 1 mails)

All respondents will be mailed a thank you/reminder letter. The letter will thank those who have already submitted a completed survey and ask those who have not responded to please do so.

Contact 3: Follow-up survey packet (7 – 10 days after Contact 1 mails)

All sampled organizations will receive a follow-up survey packet, which will include the paper-and-pencil (TeleForm) survey, a pre-addressed postage-paid reply envelope, and a letter from the survey vendor asking for response. The letter will be tailored to acknowledge the earlier survey package sent to the respondent.

Contact 4: Thank you/reminder letter or phone call (7 – 10 days after Contact 2 mails)

If no completed survey is received, nonrespondents will either receive a follow-up thank you/reminder letter, similar to Contact 3, or, if they have been matched to a phone number, an IVR prompt asking them to complete and return the survey or to call the survey vendor with any questions.

Contact 5: Survey packet (2 weeks after Contact 4 phone or letter follow-up)

Nonrespondents are sent the survey packet, which provides a third copy of the paper-and-pencil questionnaire, a pre-addressed postage-paid reply envelope, and a letter from the survey vendor asking for response. As with the first and second survey packets, the paper survey also includes information on how respondents may complete the survey on the web, if so desired.




All Web-Only Surveys

Contact 1: Initial packet (beginning of data collection period)

The initial packet consists of a letter from the IRS endorsing the survey and instructions for completing the survey online.

Contact 2: Thank you/reminder letter (7 – 10 days after Contact 1 mails)

All respondents will be mailed a thank you/reminder letter. The letter will thank those who have already submitted a completed survey and ask those who have not responded to please do so.

Contact 3: Thank you/reminder letter (7 – 10 days after Contact 2 mails)

All nonrespondents will be mailed a reminder letter with instructions for completing the survey online.

Contact 4: Thank you/reminder letter (7 – 10 days after Contact 3 mails)

All nonrespondents will be mailed a reminder letter with instructions for completing the survey online.


Web survey. The secure web survey will be posted online using a proprietary web survey delivery system developed by our contractor. The software easily accommodates different question formats, including open-ended response fields. It also allows participants to skip questions and to complete the survey in more than one session (i.e., the respondent can leave the web survey and come back to finish it at a later time). Development and testing of the web survey will follow well-established, documented best practices.


Paper-and-pencil survey. The paper-and-pencil mail survey will be designed to be user friendly and easy to navigate, with clear and simple instructions. The survey will be created using TeleForm technology, a software system for intelligent data capture and image processing. The software extracts indexing information automatically from any document type through the use of multiple recognition engines; TeleForm reads hand print, machine print, optical marks, bar codes, and signatures.


Data storage and usage. Response data will be stored and tracked in a response database, which can then be used to update and extend the IRS compliance burden model. In addition, a tailored Survey Management System will track cases throughout all modes of contact, including mail, telephone, and IVR; a hypothetical sketch of such a case record follows.
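
As an illustration of the kind of record such a tracking system maintains, the Python sketch below defines a per-case structure that accumulates contact attempts across modes and records the eventual response. The field names, status values, and methods are illustrative assumptions only and are not the survey vendor's actual Survey Management System schema.

    # Hypothetical sketch of a per-case tracking record; field names and values
    # are illustrative only, not the vendor's actual schema.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Optional

    @dataclass
    class ContactEvent:
        contact_number: int   # e.g., 1 = pre-note, 2 = survey packet, ...
        mode: str             # "mail", "phone", or "IVR"
        sent_on: date

    @dataclass
    class SurveyCase:
        case_id: str
        stratum: str          # stratum label from the sampling plan
        contacts: List[ContactEvent] = field(default_factory=list)
        response_mode: Optional[str] = None   # "paper" or "web" once a return is received
        completed_on: Optional[date] = None

        def record_contact(self, number: int, mode: str, when: date) -> None:
            self.contacts.append(ContactEvent(number, mode, when))

        def record_response(self, mode: str, when: date) -> None:
            self.response_mode, self.completed_on = mode, when

    # Example: a sampled case receives the pre-note by mail and later responds on the web.
    case = SurveyCase(case_id="000123", stratum="Paid Professional / Low")
    case.record_contact(1, "mail", date(2021, 3, 1))
    case.record_response("web", date(2021, 3, 20))

Keeping the contact history and the eventual response mode on the same record is what allows response rates and mode choices to be summarized by stratum and by contact attempt.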


Focus groups. Focus groups allow the IRS to speak directly to industry stakeholders and taxpayers regarding the primary drivers of burden to inform survey instrument design. They are extremely important to the design of a new survey because they offer the opportunity to increase and validate the understanding of the burden incurred by the relevant population as well as to develop and test meaningful survey questions.



  3. Describe methods to maximize response rates and to deal with issues of non-response.


The survey instrument design and administration protocol are informed by currently accepted best practices that support survey response rates, such as including an official IRS letter as a pre-note, item formatting, and survey length. Survey research conducted by the IRS during prior burden survey administrations, such as the incentive studies discussed in Section A9, has also provided important insight.


Upon completion of each survey data collection, we will conduct a nonresponse bias analysis. This analysis will use a raking technique to control for differences between the characteristics of those who respond and those who do not. The process is further outlined in the paper "Response Mode and Bias Analysis in the IRS’ Individual Taxpayer Burden Survey" by J. Michael Brick, George Contos, Karen Masken, and Roy Nord; a generic illustration of the raking adjustment follows.
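
As a concrete illustration of raking (also known as iterative proportional fitting) in general, the Python sketch below adjusts respondent weights so that the weighted margins match known population totals for two control variables. The control variables, categories, and totals are hypothetical, and the sketch shows the generic technique rather than the specific procedure documented in the Brick et al. paper.

    # Generic raking (iterative proportional fitting) sketch: adjust respondent
    # weights so weighted counts match known population margins. Control
    # variables and totals below are hypothetical.
    import numpy as np

    def rake(weights, categories, margins, max_iter=50, tol=1e-8):
        """weights: initial design weights, shape (n,).
        categories: dict of control variable -> per-respondent category codes, shape (n,).
        margins: dict of control variable -> {category: known population total}."""
        w = weights.astype(float).copy()
        for _ in range(max_iter):
            max_change = 0.0
            for var, codes in categories.items():
                for cat, target in margins[var].items():
                    mask = codes == cat
                    current = w[mask].sum()
                    if current > 0:
                        factor = target / current
                        w[mask] *= factor  # scale this category to hit its margin
                        max_change = max(max_change, abs(factor - 1.0))
            if max_change < tol:  # stop once all margins are (nearly) satisfied
                break
        return w

    # Toy example: six respondents, control totals for preparation method and complexity.
    weights = np.ones(6)
    categories = {
        "prep": np.array(["paid", "paid", "self", "self", "self", "paid"]),
        "complexity": np.array(["low", "high", "low", "high", "low", "low"]),
    }
    margins = {
        "prep": {"paid": 60.0, "self": 40.0},
        "complexity": {"low": 70.0, "high": 30.0},
    }
    print(rake(weights, categories, margins).round(2))

After adjustment, the weighted respondent distribution matches the population on every raking margin, which mitigates, though cannot fully eliminate, bias arising from differential nonresponse on those characteristics.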


  4. Describe any tests of procedures or methods to be undertaken.


To ensure that the collection of information is not burdensome and that the questions are clearly written and will produce accurate and valid results, the IRS will conduct cognitive testing for any new or revised survey instrument. Cognitive testing is a well-established qualitative research method intended to identify problems respondents have with comprehension of survey questions (Willis, 2005). The testing will be conducted with taxpayers in the Washington, D.C. area. Respondents will be recruited according to specific criteria (e.g., filing status, complexity of return, and filing method). Efforts will be made to recruit respondents who are demographically representative of the population being surveyed.


In addition, at the outset as well as after each iteration of testing, the instrument will undergo extensive review by the IRS, the contractor, and stakeholders.





  5. Provide the names and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


IRS Research, Applied Analytics, and Statistics and Treasury Office of Tax Analysis


Statistical Design:

Ahmad Qadri, IRS Research, Analysis, & Statistics, 202-803-9373

Ishani Roy, IRS Research, Analysis, & Statistics, 202-803-9372


Collection and Analysis:

Brenda Schafer (overall lead), IRS Research, Applied Analytics, and Statistics, 202-803-9412

Patrick Langetieg (deputy lead), IRS Research, Applied Analytics, and Statistics, 202-803-9419

Jose Colon de la Matta, IRS Research, Applied Analytics, and Statistics, 202-803-9412

Ronald H. Hodge II, IRS Research, Applied Analytics, and Statistics, 202-803-9414

Rizwan Javaid, IRS Research, Applied Analytics, and Statistics, 240-613-5023

Yuri Katrinic, IRS Research, Applied Analytics, and Statistics, 202-803-9443

Scott Leary, IRS Research, Applied Analytics, and Statistics, 202-803-9909

Alexander Saak, IRS Research, Applied Analytics, and Statistics, 202-803-9450

Melissa Vigil, IRS Research, Applied Analytics, and Statistics, 202-803-9404


Survey Administrators:


Westat

Kerry Levin, Project Manager, 301-294-3928

Jocelyn Newsome, Research Analyst

Martha Stapleton, Project Manager

Karen Stein, Project Manager

Reina Sprankle, Survey Intake Manager

Statistical Design and Analysis:

Mike Brick, Statistician


Fors Marsh Group

Justin Baer, Project Manager, 571-444-1781

Kimberly Wyborski, Project Lead



APPENDIX A – Citations


American Academy of Political and Social Science, "The Nonresponse Challenge to Surveys and Statistics" (May 13, 2013), http://www.socialsciencespace.com/2013/05/the-nonresponse-challenge-to-surveys-and-statistics/

Church, A.H., "Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis", Public Opinion Quarterly, 57: 62-79 (1993).

Dillman, D., Smyth, J., and Christian, L., Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, Hoboken, NJ: John Wiley & Sons (2008).

Groves, R., Dillman, D., Eltinge, J., and Little, R., Survey Nonresponse, New York: John Wiley & Sons (2002).

James, J.M., and Bolstein, R., "The Effect of Monetary Incentives and Follow-Up Mailings on the Response Rate and Response Quality in Mail Surveys", Public Opinion Quarterly, 54: 346-361 (1990).

Millar, M.M., and Dillman, D.A., "Improving Response to Web and Mixed-Mode Surveys", Public Opinion Quarterly, 1-21 (2011).

Petrolia, D.R., and Bhattacharjee, S., "Revisiting Incentive Effects: Evidence from a Random-Sample Mail Survey on Consumer Preferences for Fuel Ethanol", Public Opinion Quarterly, 73: 537-550 (2009).

Shaw, M.J., et al., "The Use of Monetary Incentives in a Community Survey: Impact on Response Rates, Data Quality, and Cost", Health Services Research, 35: 1339-1346 (2001).

Shettle, C., and Mooney, G., "Evaluation of Using Monetary Incentives in a Government Survey", Mathematica Policy Research, National Science Foundation (1999).

Teisl, M.F., Roe, B., and Vayda, M., "Incentive Effects on Response Rates, Data Quality, and Survey Administration Costs", International Journal of Public Opinion Research, 18 (2005).

Trussell, N., and Lavrakas, P.J., "The Influence of Incremental Increases in Token Cash Incentives on Mail Survey Response: Is There an Optimal Amount?", Public Opinion Quarterly, 68: 349-367 (2004).


Willis, G.B. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications.


