2018 SIPP Panel Supporting Statement - Section A Final 082817

OMB: 0607-1000

SUPPORTING STATEMENT

U.S. Department of Commerce

U.S. Census Bureau

2018 Survey of Income and Program Participation Panel

OMB Control No. 0607-XXXX


A. Justification


1. Necessity of Information Collection


The U.S. Census Bureau requests authorization from the Office of Management and Budget (OMB) to conduct the 2018 Survey of Income and Program Participation (SIPP) Panel. The SIPP is authorized by Title 13, United States Code, Sections 141 and 182.


The Census Bureau plans to conduct the 2018 SIPP Panel in four waves beginning in February 2018. The Census Bureau's SIPP Computer-Assisted Personal Interviewing (CAPI) instrument will use an Event History Calendar (EHC) interviewing method and a 12-month, calendar-year reference period, the same approach as the 2014 SIPP Panel. The 2018 SIPP Panel instrument content and post-production processing will remain similar to those of the 2014 SIPP Panel. See Attachment A, Wave 1 SIPP 2018 Instrument Items Booklet, for the interview questions.


The main objective of the SIPP is to provide accurate and comprehensive information about the income and program participation of individuals and households in the United States. The survey’s mission is to provide a nationally representative sample for evaluating: 1) annual and sub-annual income dynamics; 2) movements into and out of government transfer programs; 3) family and social context of individuals and households; and 4) interactions among these items. A major use of the SIPP has been to evaluate the use of and eligibility for government programs and to analyze the impacts of modifications to those programs. The SIPP collects detailed information on cash and non-cash income (including participation in government transfer programs) once per year. The current SIPP panel continues to reduce the cost of collection, improve accuracy, increase relevance and timeliness, reduce respondent burden, and increase accessibility.


Providing the same, or better, quality data at a reduced burden to respondents is a high priority for the Census Bureau and for the SIPP program. To accomplish this, the Census Bureau uses an EHC-based instrument to gather SIPP data. The content of the 2018 SIPP Panel will match that of the 2014 SIPP very closely. The EHC allows recording dates of events and spells of coverage, and provides measures of monthly transitions of program receipt and coverage, labor force transitions, health insurance transitions and others.


The 2018 SIPP Panel begins with a new “Wave 1” sample of survey respondents who were not interviewed in the previous 2014 SIPP. The 2018 SIPP Panel sample is based on the 2010 Decennial Census. The 2018 SIPP Panel Wave 1 will interview respondents using the previous calendar year 2017 as the reference period and will proceed with annual interviewing going forward. The 2018 SIPP Panel will follow persons aged 15 years and older who move from the prior wave household. Consequently, future waves will incorporate dependent data, which is information collected from the prior wave interview brought forward to the current interview.


New Sample Design and Overlapping Panels


To increase the ability to respond to changing budget constraints, the 2018 Panel will have a modified sample design and data collection strategy. Instead of one large sample created prior to Wave 1 and interviewed over a four- to five-month period, we will draw five smaller, independent samples, interviewing and closing out one of these samples in each month of the Wave 1 interview period, which will run for five months, from February through June, in each panel year. Each independent sample will be drawn from the same geographies, so that staffing needs will be constant in level and location from month to month. We will treat each of the five smaller samples independently during data collection but combine them for data processing and file release as the full-wave SIPP data file. Except that the reference period for all of the smaller samples reaches back to the same January 2017 starting month, this design is conceptually analogous to the way SIPP fielded the rotation groups that existed prior to the 2014 SIPP Panel.


Because we treat each month's sample independently, this design allows flexibility in responding to budgetary levels while still planning for the full workload. We will plan for all sample months but field and work as many as the budget allows, permitting orderly decisions about each month's interviewing based on current funding, without haphazard impacts on data quality.


In this design, if interviewing is cut off early, the loss of data will affect the precision of the estimates and the cases available for longitudinal use, but it will not affect the representativeness of the sample composition based on the completed months. We would keep this as a contingency, but we plan to interview for the full period of months and cover the entire sample.

In Waves 2+, we will again interview for as many months as the budget allows. We will continue this pattern in each subsequent wave.


If we are funded for a sample the same size as the 2014 SIPP Panel's, a Wave 1 sample of 53,000 households, we expect approximately 31,800 households with completed interviews. We estimate that each household contains 2.1 people aged 15 and above, yielding approximately 66,800 person-level interviews per wave in this panel. We estimate that completing the SIPP interview will take approximately 60 minutes per adult on average; consequently, the total annual burden for 2018 SIPP interviews will be 66,800 hours per year in FY 2018, 2019, 2020, and 2021.


If we are funded at current levels, for a Wave 1 sample of 35,000 households, we expect approximately 20,000 households with completed interviews. We estimate that each household contains 2.1 people aged 15 and above, yielding approximately 42,000 person-level interviews per wave in this panel. We estimate that completing the SIPP interview will take approximately 60 minutes per adult on average; consequently, the total annual burden for 2018 SIPP interviews will be 42,000 hours per year in FY 2018, 2019, 2020, and 2021.
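The burden arithmetic above can be verified with a minimal sketch, using only the figures stated in the text (the household counts, the 2.1 persons-per-household estimate, and the 60-minute average interview); the function name is illustrative:

```python
# Burden-hour arithmetic for the two funding scenarios described above.
# All inputs are figures stated in the text.

def burden_hours(completed_households, persons_per_household=2.1,
                 minutes_per_interview=60):
    """Return (person-level interviews, total annual burden hours)."""
    interviews = completed_households * persons_per_household
    hours = interviews * minutes_per_interview / 60
    return interviews, hours

# Current-funding scenario: 20,000 completed households.
interviews, hours = burden_hours(20_000)
print(f"{interviews:,.0f} interviews, {hours:,.0f} burden hours")
# Larger-sample scenario: 31,800 x 2.1 = 66,780 interviews,
# which the text rounds to 66,800.
```

Note that the 66,800 figure in the text is a rounding of 66,780; the 42,000 figure is exact.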


Design plans depend on funding, with a sample size of 35,000 households as the minimum threshold. If we are funded for a Wave 1 household sample size of more than 35,000 (e.g., the 2014 Panel sample size of 53,000), we will conduct the 2018 Panel in the same manner as the 1996-2014 Panels, with abutting panels.


If we are not able to initiate the 2018 Panel with more than 35,000 households, we plan to institute an overlapping panel (sample) design for initiating new panels and maintaining the level of interviewed cases in each subsequent year, keeping the total pool of interviewed cases constant from year to year. New cases would come from the same geographic areas as the original sample and would constitute new representative samples for incoming annual panels. This will maintain an approximately equal annual and monthly workload during the interview period. In addition, longitudinal analyses can use sets of overlapping four-year panels when calendar year is not critical to the design of the specific analysis.


The table below illustrates how the overlapping panels work. The initial sample of 7,000 cases per month will be interviewed over the next four years. In 2019, based on estimated sample loss from Wave 1 (2018) to Wave 2 (2019), 4,060 original cases remain. Adding 2,800 new cases brings the monthly total back to around 7,000. The 2,800-case supplement acts as a new panel, as these cases are interviewed for the next four years. In 2020, three groups make up the 7,000 monthly total: Wave 3 of the 2018 Panel, Wave 2 of the 2019 Panel, and Wave 1 of the 2,500 new cases added in 2020.


SIPP OVERLAPPING PANELS

| Year | 2018 | 2019 | 2020 | 2021 | 2022 | 2023 | 2024 | 2025 | 2026 | 2027 |
|------|------|------|------|------|------|------|------|------|------|------|
| Initial monthly sample | 7,000 | 4,060** | 2,842 | 1,989 | 3,800 | 2,204 | 1,543 | 3,080 | 3,100 | 1,798 |
| Year 2 supplement | N/A | 2,800 | 1,624 | 1,137 | 796 | 3,100 | 1,798 | 1,259 | 881 | 3,100 |
| Year 3 supplement | N/A | N/A | 2,500 | 1,450 | 1,015 | 711 | 2,900 | 1,682 | 1,177 | 824 |
| Year 4 supplement | N/A | N/A | N/A | 2,400 | 1,392 | 974 | 682 | 2,900 | 1,682 | 1,177 |
| Total monthly cases interviewed | 7,000 | 6,860 | 6,966 | 6,976 | 7,003 | 6,989 | 6,923 | 6,921 | 6,840 | 6,900 |
| Total annual cases interviewed | 35,000* | 34,300 | 34,830 | 34,881 | 35,014 | 34,945 | 34,614 | 34,603 | 34,202 | 34,498 |
| Initial Wave 1 sample/panel | 35,000 | 14,000 | 12,500 | 12,000 | 19,000 | 15,500 | 14,500 | 14,500 | 15,500 | 15,000 |

*Based on 5 months of interviewing 5 independent samples of 7,000 each

**Based on sample loss from Wave 1
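The attrition pattern in the table can be reproduced, approximately, with the retention rates implied by its first few columns. This is a sketch, not a stated methodology: the 58% Wave 1-to-Wave 2 retention rate and the 70% retention rate for later waves are inferred from the table values (e.g., 7,000 to 4,060 to 2,842), not given in the text.

```python
# Sketch of the overlapping-panel attrition arithmetic implied by the table.
# Retention rates are INFERRED from the table values, not stated in the text:
#   Wave 1 -> Wave 2: ~58%  (e.g., 7,000 -> 4,060)
#   later waves:      ~70%  (e.g., 4,060 -> 2,842)
W1_TO_W2_RETENTION = 0.58
LATER_WAVE_RETENTION = 0.70

def panel_trajectory(wave1_monthly_cases, n_waves=4):
    """Monthly interviewed cases for one panel across its four waves."""
    cases = [wave1_monthly_cases]
    for wave in range(2, n_waves + 1):
        rate = W1_TO_W2_RETENTION if wave == 2 else LATER_WAVE_RETENTION
        cases.append(round(cases[-1] * rate))
    return cases

print(panel_trajectory(7_000))  # matches the 2018 panel row: 7000, 4060, 2842, 1989
print(panel_trajectory(2_800))  # matches the 2019 supplement: 2800, 1624, 1137, 796
```

Each year's supplement is then sized so that the surviving cases from all active panels plus the new cases sum back to roughly 7,000 per month.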


After about eight years, the samples will reach equilibrium, with each new panel consisting of approximately 15,000-16,000 households per year and approximately 35,000 households attempted for interview each calendar year.


The Census Bureau plans to use Computer Audio-Recorded Interview (CARI) technology during the 2018 SIPP Panel. CARI is a tool available during data collection to capture audio along with response data. With the respondent’s consent, a portion of each interview is recorded unobtrusively and both the sound file and screen images are returned with the response data to Census Headquarters for evaluation. Census staff may review the recorded portions of the interview to improve questionnaire design and for quality assurance purposes.


Wave 1 of the SIPP 2018 Panel will be conducted from February to June of 2018. Wave 2 is scheduled to be conducted from February to June of 2019. Wave 3 is scheduled to be conducted from February to June of 2020. Wave 4 is scheduled to be conducted from February to June of 2021. This OMB clearance request is for the 2018 SIPP Panel (Waves 1, 2, and 3).


2. Needs and Uses


The 2018 SIPP collects information about a variety of topics including demographics, household composition, education, nativity and citizenship, health insurance coverage, Medicaid, Medicare, employment and earnings, unemployment insurance, assets, child support, disability, housing subsidies, migration, Old-Age, Survivors, and Disability Insurance (OASDI), poverty, and participation in various government programs like the Supplemental Nutrition Assistance Program (SNAP), Supplemental Security Income (SSI), and Temporary Assistance for Needy Families (TANF). The 2018 SIPP Panel will continue the EHC methodology implemented in the 2014 SIPP instruments. The 2018 SIPP sample is nationally representative, with an oversample of low-income areas, in order to increase the ability to measure participation in government programs.


The 2018 SIPP program provides critical information necessary to understand patterns and relationships in income and program participation. It will carry forward the objectives established with the 2014 Panel: keeping respondent burden and costs low, maintaining high data quality and timeliness, and using a refined and vetted instrument and processing system. The 2018 SIPP data collection instrument maintains the improved data collection experience for respondents and interviewers and focuses on improvements in data quality and better topic integration. While evaluations will be ongoing, the Census Bureau is pleased with the results and is confident in continuing to provide high-quality SIPP data using the current annual SIPP data collection instrument and procedures.


The 2018 SIPP instrument is currently written in Blaise and C#. It incorporates an EHC design to help ensure that the 2018 panel will collect intra-year dynamics of income, program participation, and other activities with at least the same data quality as earlier panels. The EHC is intended to help respondents recall information in a more natural “autobiographical” manner by using life events as triggers to recall other economic events. For example, a residence change may often occur contemporaneously with a change in employment. The entire process of compiling the calendar focuses, by its nature, on consistency and sequential order of events, and attempts to correct for otherwise missing data.


Information quality, as described by the Census Bureau’s Information Quality Guidelines, is an integral part of the pre-dissemination review of information released by the Census Bureau. Information quality is essential to data collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act.


3. Use of Information Technology


The survey is administered using CAPI and CARI methodologies. Census Bureau field representatives (FRs) collect the data from respondents using laptop computers and transmit them to Census Bureau headquarters via high-speed modems. Automation significantly enhances our efforts to collect high-quality data, with skip instructions programmed into the instrument and information obtained in earlier interview segments fed back to the respondent. Response burden is minimized by incorporating design features that make it easier to collect and record respondent information. For example, screening questions and lead-in questions are built into the automated instrument to skip respondents out of sections of the questionnaire that are not relevant or applicable.


Review of current internet instruments and analysis from an internet field test conducted by the SIPP Methods Panel (2000) suggest that using the internet as a data-collection mode for a long, complex demographic survey such as the SIPP is not currently feasible for the SIPP program. The SIPP automated instrument contains many complicated skip patterns and roster-related components. While the Methods Panel findings are now seventeen years old, they remain salient: the public is not expected to tolerate long, complex, self-administered internet data collections. Major revisions to design and content would be necessary to shift to online data collection, and the costs of converting a complex questionnaire such as the SIPP's to an online survey far outweigh the benefits, even in a multimode environment. The SIPP program continuously reviews options for simplifying data collection, for multi-mode data collection, and for reducing respondent burden.


4. Efforts to Identify Duplication


The demographic data collected in the SIPP must be collected in conjunction with the labor force and program participation data for the information to be most useful; therefore, although we collect demographic data in almost all Census Bureau surveys, we need to continue collecting them in the SIPP. No other current data source provides as comprehensive a set of statistics for analysis as described in section 2 above.


5. Minimizing Burden


The Census Bureau uses appropriate technology to keep respondent burden to a minimum. Examples of technology used to minimize respondent burden include: use of appropriate screening and lead-in questions that serve to skip respondents out of sections of the CAPI instrument that are not relevant or applicable to them; use of flash cards to aid respondents with multiple response categories; and the arrangement of questions and sections of the CAPI instrument that facilitate the flow of administration from one topic area to another. We will also use paradata to help evaluate respondent burden. For a more detailed explanation on paradata evaluation, see section 16.


6. Frequency of Collection


The 2018 SIPP Panel will interview respondents annually, using the previous calendar year as the reference period. One possible consequence of the one-year reference period is the possibility of increased recall difficulties for respondents. However, use of the EHC methodology of interview should help to alleviate these issues by linking respondents’ memories to significant life events. See earlier explanation in section 2.


7. Special Circumstances


There are no special circumstances associated with this clearance request.


8. Consultations Outside the Agency


The SIPP program has always relied upon and valued the input of stakeholders outside the Census Bureau. Input from partner agencies and OMB continues to provide valuable and welcome guidance on the content and procedures of the SIPP.


Additionally, the Census Bureau has continued its involvement with the Committee on National Statistics (CNSTAT) at the National Academies of Sciences, Engineering, and Medicine, and with other interested representatives from policy, research, and government. The Census Bureau has continued to work actively to ensure that SIPP stakeholders' interests and priorities are represented, that the content and procedures used to collect SIPP data are appropriate, and that duplication between surveys is minimized to the extent possible.


We published a notice in the Federal Register on April 19, 2017 (Vol. 82, No. 74, pages 18418-18419), inviting public comment on our plans to submit this request (Attachment M). We received one comment concerning the necessity of the survey; the comment was not relevant to this request.


9. Paying Respondents


For the 2018 SIPP Panel, we will continue our multi-wave incentive experiment and continue to evaluate the efficacy of incentives as a means of increasing respondent cooperation with the SIPP.


We plan to divide the panel into two incentive groups and a control group to monitor the effectiveness of incentives over time. Group 1 is the control group; households in this group will not be eligible for an incentive in any wave of the 2018 Panel. For Wave 1, incentives will be assigned randomly to households in incentive Group 2; these households will be eligible to receive an incentive in Wave 1 and later waves. In subsequent waves, households in Group 2 will receive incentives of $40 based on a propensity model that considers the effectiveness of the incentive in generating an interview. This assignment plan will help increase the response rate among households where the absence of an incentive would lead to differential attrition. The use of incentives in this model-based framework will also lower costs, since we would not focus incentives on households that would be over-represented in the absence of an incentive. Additionally, we will continue to evaluate the incentive modeling and develop specifications that incorporate ongoing work on responsive and adaptive design to prioritize cases in interviewers' workloads.


The third group of respondents (Group 3) will receive a $40 incentive in Wave 1 and all subsequent waves. All incentives are conditional on completing the interview. We will inform households in both the advance letter and the introduction to the survey of their eligibility for an incentive.


Inputs for the propensity model for Group 2 will come from the Wave 1 responses, Wave 1 contact data, sample frame data, data linkable to the sample frame, and results from comparing Groups 1 and 2 with Group 3. We will evaluate what characteristics of households seem to make them more or less likely to complete interviews, how they contribute to the eventual sample representation, and how likely they are to respond to incentives. Using this knowledge, we can design a model for Waves 2+ that optimizes our distribution of incentives to Group 2 in a way that maximizes the return on each dollar spent, reduces non-response bias, and improves sample representativeness.


For all waves, we will distribute the incentives centrally, from our National Processing Center (NPC). This centralized distribution eliminates any discretion on the part of the field representatives, ensuring that only eligible households are given (or promised) incentives. This control is necessary to ensure the success of the propensity modeling experiment. We plan to mail the debit cards containing the incentives on a weekly basis. That is, as we receive completed interviews from eligible households, we will send a list of these households to the NPC, which will mail the Thank You letters containing the PIN information (Attachment K) and then mail the debit cards in a separate mailing. Splitting the mailings this way allows us to avoid the additional expense of sending the debit cards via a signature-required service such as FedEx.


10. Assurance of Confidentiality


The U.S. Census Bureau is required by law to protect all respondent information. The Census Bureau is not permitted to publicly release responses in any way that could identify an individual or household. We are conducting this survey under the authority of Title 13, United States Code, Sections 141 and 182. Federal law protects respondent privacy and keeps all answers confidential under Title 13, United States Code, Section 9. Per the Federal Cybersecurity Enhancement Act of 2015, all respondent data are protected from cybersecurity risks through screening of the systems that transmit the data.


SIPP respondents will be informed of the confidentiality of their responses and that this is a voluntary survey by an annual (per wave) letter from the Director of the Census Bureau that will be sent to all participants in the survey in advance of the interview (Attachments B and C).


11. Justification for Sensitive Questions


Data on sources of income and assets are among the items collected that may be considered sensitive. The Census Bureau takes the position that collecting these data is necessary for the analysis of important policy and program issues and has consequently structured the questions to lessen their sensitivity.


12. Estimates of Annualized Respondent Hour and Cost Burden


Based on our experience with the 2014 SIPP Panel waves, the burden estimates for the 2018 SIPP Panel (per wave) are as follows:


12a. 2018 SIPP PANEL ESTIMATED ANNUALIZED BURDEN HOURS SUMMARY


Table 1: For a 35,000 Household Sample


| | Expected Number of Respondents | Waves | Number of Responses per Respondent | Average Burden per Response (in Hours) | Total Burden Hours |
|---|---|---|---|---|---|
| Interview | 42,000 | 1 | 1 | 1.0 | 42,000 |
| Total | 42,000 | 1 | 1 | 1.0 | 42,000 |


Approximately 20,000 households are expected to have completed interviews for the initial wave of the 2018 SIPP Panel. We estimate that each household contains 2.1 people aged 15 and above, yielding approximately 42,000 person-level interviews per wave in this panel. We estimate that completing the SIPP interview takes approximately 60 minutes per adult on average; consequently, the total annual burden for 2018 SIPP interviews will be 42,000 hours per year in FY 2018, 2019, 2020, and 2021.


Table 2: For a 53,000 Household Sample


| | Expected Number of Respondents | Waves | Number of Responses per Respondent | Average Burden per Response (in Hours) | Total Burden Hours |
|---|---|---|---|---|---|
| Interview | 66,800 | 1 | 1 | 1.0 | 66,800 |
| Total | 66,800 | 1 | 1 | 1.0 | 66,800 |


Approximately 31,800 households are expected to have completed interviews for the initial wave of the 2018 SIPP Panel. We estimate that each household contains 2.1 people aged 15 and above, yielding approximately 66,800 person-level interviews per wave in this panel. We estimate that completing the SIPP interview takes approximately 60 minutes per adult on average; consequently, the total annual burden for 2018 SIPP interviews will be 66,800 hours per year in FY 2018, 2019, 2020, and 2021.


12b. 2018 SIPP PANEL ESTIMATED ANNUALIZED BURDEN COSTS


Table 1: For a 35,000 Household Sample


| | Total Burden Hours | Hourly Wage Rate¹ | Total Respondent Costs |
|---|---|---|---|
| Respondents | 42,000 | $26.25 | $1,102,500 |
| Total | 42,000 | $26.25 | $1,102,500 |


Table 2: For a 53,000 Household Sample


| | Total Burden Hours | Hourly Wage Rate¹ | Total Respondent Costs |
|---|---|---|---|
| Respondents | 66,800 | $26.25 | $1,753,500 |
| Total | 66,800 | $26.25 | $1,753,500 |


¹ For individuals, the wage rate is $26.25 per hour (as of June 2017). This is based on the average hourly earnings for all employees on private nonfarm payrolls, as reported by the Bureau of Labor Statistics (http://www.bls.gov/news.release).



13. Estimate of Cost Burden


There are no direct costs to respondents participating in the survey other than the time involved in answering the survey questions.


14. Cost to Federal Government


The data collection costs of the 2018 SIPP Panel are approximately $18,000,000 for 35,000 households ($510 per case x 7,000 cases per month x 5 months) or $27,000,000 for 53,000 households ($510 per case x 10,600 cases per month x 5 months) in each year from FY 2018 to FY 2021 (assuming we interview in all 5 months of a wave). That amount is included in the estimate of total costs to the federal government of the Census Bureau's current programs supplied to the OMB. Items included in the cost of data collection are: instrument review, printing of materials, hiring and training of field representatives, data collection, interview monitoring, respondent engagement, initial data review, overhead, and support staff for the Field Division.
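The per-case arithmetic behind the rounded totals works out as follows (a sketch using only the $510-per-case rate, monthly caseloads, and 5-month interview period stated above):

```python
# Annual data-collection cost: $510 per case, cases per month, 5 months per wave.
COST_PER_CASE = 510
MONTHS_PER_WAVE = 5

def annual_cost(cases_per_month):
    """Exact annual data-collection cost in dollars before rounding."""
    return COST_PER_CASE * cases_per_month * MONTHS_PER_WAVE

print(annual_cost(7_000))   # 17850000, stated as approximately $18,000,000
print(annual_cost(10_600))  # 27030000, stated as approximately $27,000,000
```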


15. Reason for Change in Burden


The 2018 SIPP Panel is being submitted as a new collection; therefore, a change in burden will occur.


16. Project Schedule


The 2018 SIPP Panel Advance Letters will be mailed prior to interviewing. Wave 1 of the SIPP 2018 Panel will be conducted from February to June of 2018. Wave 2 is scheduled to be conducted from February to June of 2019. Wave 3 is scheduled to be conducted from February to June of 2020. Wave 4 is scheduled to be conducted from February to June of 2021. We will release public use data products on a schedule to be determined.


We will evaluate the survey using both collected data and paradata. As each subject-matter area evaluates its content, we expect that, in addition to comparing results from the 2018 SIPP to the 2014 SIPP, analysts will, where possible, also compare the results to those from other surveys, such as the American Community Survey (ACS) or the Current Population Survey (CPS). While we expect some variation in estimates across surveys due to sample size, survey universe, and similar factors, these comparisons will tell us whether SIPP's results are broadly in line with those from other surveys.


The 2018 SIPP instrument will also generate a large amount of paradata that the survey team will use to evaluate the survey. First, we want to run a number of comparisons related to interview timing; for example, how much longer are adults' interviews than children's, and how do proxy interviews differ from self-reports? Those comparisons involve respondents' behavior, but we also want to use paradata to evaluate how much influence field representatives' behavior has on the survey results. We will have results from the FR certification test, so we can determine how much more successful high-scoring FRs are than low-scoring ones. Differential success by certification test score will help identify areas to target for improvement in training and supplemental interviewer observation.


The paradata will also provide metrics that allow us to evaluate respondent burden and to produce better cost estimates. For example, we will know the average number of questions asked during each interview, allowing us to pinpoint content areas that we could streamline or change during the research panel. We will also know how many visits to a household it takes to obtain a completed interview, so we can use this statistic to estimate our 2019 costs more precisely.


The paradata can also provide information we can use to improve the overall survey or correct errors in the existing instrument. We will review item-level "don't know" and refusal rates, as well as particularly time-consuming items, to further target instrument improvements. Field representatives can enter notes at both the item level and the case level; we used these notes to identify and correct a number of bugs in the 2014 instrument, and we hope to duplicate this success in 2018. Additionally, we developed an FR debriefing instrument, which allows FRs, once they have completed all the cases in their workload, to offer detailed comments on each section of the instrument and to express any concerns or problems they had when fielding the survey.


In addition to these evaluation tools, the Census Bureau plans to continue using CARI technology (see section 1). We will record items from each household's interview, provided the household consents to the quality assurance recordings. The Census Bureau plans to evaluate both the quality of the SIPP data and the possibility that CARI could supplement or replace standard reinterview.


Additionally, because recordings are collected from all consenting households, supervisors will be able to select recordings for supplemental observation and coaching of interviewers. As the recordings from 2018 are used during and after Wave 1 data collection, the Census Bureau will use them to develop standards for using recorded interviews to coach interviewers, as well as options for using the recordings as supplements to both in-person observation and reinterview. The 2018 SIPP Panel will utilize the CARI Interactive Data Access System (CARI System), an integrated, multifaceted monitoring system featuring a configurable web-based interface for behavior coding, quality assurance, and coaching. This system assists in coding interviews to measure question and interviewer performance and the interaction between interviewers and respondents.


Finally, for a small subset of characteristics, and for a subset of sample areas, we will have access to administrative record data, which we will use for an objective assessment of the validity of survey estimates matched to administrative data.


17. Display of OMB Approval Information


The OMB control number is displayed in the advance letter that will be sent to eligible households before each wave’s interview. We request not to display the expiration date so we can reuse materials.


18. Exceptions to the Certification


There are no exceptions to the certification.
