
SUPPORTING STATEMENT

U.S. Department of Commerce

U.S. Census Bureau

Survey of Health Insurance and Program Participation (SHIPP)

OMB Control Number 0607-0959


Part A – Justification


Question 1. Necessity of the Information Collection


The U.S. health care system is decentralized; there is thus no comprehensive database of the insured, and no way to derive the number of uninsured from such a database. Surveys offer the only data source for estimating the uninsured. Measuring the uninsured in surveys, however, has proved to be a persistent challenge to the research community. The Census Bureau has been conducting research for more than a decade on measurement error in its surveys that measure health insurance, including the Current Population Survey Annual Social and Economic Supplement (commonly called the CPS ASEC), the American Community Survey (ACS), and the Survey of Income and Program Participation (SIPP). This research fed into the development of an experimental set of questions on health insurance (the Redesign), which has the potential to reduce measurement error. The next step in this line of research is a split-ballot experiment planned for the spring of 2010, called the “Survey of Health Insurance and Program Participation” (SHIPP), which will include three panels of questions on health insurance: one modeled on the CPS ASEC series, one modeled on the ACS series, and the Redesign (see attached questionnaire and additional lists of state-specific program names).


The SHIPP will be conducted by telephone from the Census Bureau’s telephone data collection center in Hagerstown, Md.; the field period is scheduled for March 22 through May 10, 2010. Two types of sample will be used: a random digit dial (RDD) sample, and a "seeded" sample of known Medicare enrollees drawn from Centers for Medicare and Medicaid Services administrative records. The survey is being conducted under the legal authority of Title 13, United States Code, Section 182.


With regard to the circumstances necessitating an emergency clearance: on January 21, 2010, we submitted a request to conduct this survey under the Statistical Research Division’s (SRD) generic clearance, which covers basic methodological research on questionnaire design and evaluation (split-ballot field tests, respondent debriefings, interviewer evaluations, etc.). Turnaround time for the generic clearance is generally 10 days, and since 1999 SRD has conducted several similar (and related) studies under this clearance. Results from some of these studies are documented in the list of references in Question 8 below. In early February 2010, however, OMB informed us that this particular study did not fall under the generic clearance but required a separate package, because of the increased visibility of health insurance measurement issues arising from recent high-profile efforts to evaluate various health system reform proposals.




Given the timing of this determination that a separate OMB clearance package is needed, the choice is either to delay the survey by about six months or to pursue an emergency clearance. Delaying the survey has several negative consequences. In the short run, significant resources have been dedicated to running this survey in the spring of 2010; shifting the timing would not only squander those resources, but it is also unlikely that sufficient staff would be available later. Relatedly, from May 2010 through September 2010, several decennial follow-up operations will be conducted out of the Hagerstown telephone facility, and the SHIPP study would directly compete with resources dedicated to those efforts. Perhaps the most compelling reason the survey cannot be delayed, however, is the nature of the research questions. The Redesign is aimed at reducing measurement error associated with the calendar year reference period, in tandem with the approximately three-month lag between the end of the reference period and the interview date. Thus, as noted in Question 6 below, to assess whether the Redesign results in any improvement in reporting of retrospective coverage relative to the CPS ASEC, it is essential that the field study parallel the timing of production CPS ASEC data collection as closely as possible. A six-month delay would seriously threaten the applicability of the results.


Question 2. Needs and Uses


The primary purpose of the field study is to evaluate the Redesign and assess any improvements over the CPS ASEC design. A secondary purpose is to compare estimates from the CPS and ACS test panels. Evaluations will be carried out by Housing and Household Economic Statistics Division (HHES) and SRD staff and will involve a range of methods, including analysis of: (1) the point estimates of the uninsured, and of those insured by various types of coverage (such as employer-sponsored plans, Medicare and Medicaid); (2) the accuracy of the survey data (as compared to administrative records on health coverage); (3) interview administration time; (4) interviewer feedback; (5) interviewer-respondent interaction (through behavior coding); and (6) respondent debriefings (scripted into the questionnaire). The evaluation will be used to help interpret estimates from CPS ASEC and ACS production data, and to determine whether particular survey design features of the CPS ASEC would benefit from modifications based on the Redesign. One particular survey design feature – the calendar year reference period – has been demonstrated to result in under-reported coverage. The Redesign, therefore, collects data on current coverage (a much less problematic reference period) and then uses this information as an anchor in order to ask about retrospective coverage during the past calendar year. If results show that this alternative method does in fact reduce under-reporting of past coverage, the CPS ASEC could adopt this type of question sequence in order to (1) produce statistics on current coverage and (2) produce past calendar year statistics that are more accurate.
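
As a rough illustration of this anchoring sequence, the sketch below (in Python, with invented question wording and a simple input() stand-in for the CATI prompt; it is not the actual SHIPP questionnaire) shows how a report of current coverage becomes the reference point for the retrospective question:

    # Illustrative only: invented wording, not the SHIPP questionnaire.
    ask = input  # stand-in for the CATI prompt; returns the typed answer

    current_plan = ask("Are you covered by a health plan NOW? If so, what type? ")
    if current_plan:
        # Anchor: the retrospective question refers back to the reported plan.
        retro = ask(f"Have you been covered by that {current_plan} plan since "
                    "January 1, 2009, or did it start later? ")
    else:
        retro = ask("Were you covered by any plan at any time during 2009? ")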


Information quality is an integral part of the pre-dissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau's Information Quality Guidelines). Information quality is also integral to the information collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act.


Question 3. Use of Information Technology


All interviews will be carried out using a Computer-Assisted Telephone Interviewing (CATI) instrument, and data will be transmitted electronically from the Hagerstown, Md., facility to Census Bureau headquarters in Suitland, Md. In general, CATI instruments offer smooth, efficient administration of questionnaires, since the sequencing of questions is handled behind the scenes by the program, not by the interviewer. The Redesign exploits this automation heavily: information gathered early in the interview is stored and reused later in the interview to enhance efficiency and reduce burden.

Specifically, the Redesign is technically a person-level interview, in which a series of questions about health insurance coverage is asked about each household member by name (though the entire interview is conducted with a single household respondent, who answers these questions for all household members). For the first person listed on the roster (the household respondent), the full series of health insurance questions is asked. These questions determine whether he or she had any coverage at any time from January 1, 2009 until the interview day and, if so, the plan type and months of coverage. In multi-person households, an additional set of questions determines whether other household members are (or were) also covered by that same plan, and their months of coverage.

After the household respondent’s interview is complete, he or she is asked a set of questions about the second person on the roster. If that second person was mentioned as being covered by a plan already reported in the household respondent’s interview, then in most cases the series of health insurance questions for this second person is much abbreviated, consisting of only two questions: one to see whether the person now has any additional plans, and one to determine whether he or she had any additional plans at any point from January 1, 2009 up until the interview day. If the household member was covered by additional plans, details on plan type, months of coverage, and other household members covered by the same plan are gathered.

This routine is repeated for all subsequent household members. That is, at the start of the series about any given household member, the program first searches all previously reported plans to determine whether that person was already reported to have coverage. If so, the two questions on additional plans are asked; if not, the full series is administered. This approach offers the specificity of asking about each household member by name, but substantially reduces burden in cases where multiple household members share the same coverage. A simplified sketch of this logic follows.
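
The sketch below, in Python, is a highly simplified illustration of that plan-reuse routine; the function names, data structures, and question wording are all invented here, not taken from the production CATI instrument.

    # Simplified sketch of the Redesign's plan-reuse logic. All names and
    # question wording are hypothetical, not the production CATI instrument.

    def plan_details(person, roster, ask):
        # Collect one plan's details: type, months of coverage, and which
        # other household members are covered by the same plan.
        plan = {"type": ask(f"What type of plan covers {person}?"),
                "members": {person},
                "months": {person: ask(f"Which months was {person} covered?")}}
        for other in roster:
            if other != person and ask(f"Is {other} also covered by this plan?"):
                plan["members"].add(other)
                plan["months"][other] = ask(f"Which months was {other} covered?")
        return plan

    def interview_household(roster, ask):
        # `roster` lists household members by name; `ask` poses a question and
        # returns the answer (a truthy value counts as "yes" for yes/no items).
        plans = []
        for person in roster:
            # Search all previously reported plans for this person's name.
            if any(person in plan["members"] for plan in plans):
                # Abbreviated series: only the two "additional plans" questions.
                more = (ask(f"Does {person} NOW have any additional plans?") or
                        ask(f"Did {person} have any additional plans at any time "
                            f"since January 1, 2009?"))
            else:
                # Full series: any coverage at all since January 1, 2009.
                more = ask(f"Has {person} had any coverage at any time since "
                           f"January 1, 2009?")
            while more:
                plans.append(plan_details(person, roster, ask))
                more = ask(f"Any other plans for {person}?")
        return plans

Calling interview_household(["Ann", "Bob"], ask=input), for example, walks the roster interactively; if Ann reports a plan that also covers Bob, Bob's series collapses to the two additional-plan questions.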



Question 4. Efforts to Identify Duplication


While other federal and non-federal agencies collect data on health insurance coverage (indeed, the Census Bureau itself carries out three surveys that ask about health insurance: the CPS ASEC, the ACS and the SIPP), each of these surveys has a different history, purpose and – perhaps most relevant to this field study – methodology, and each yields different estimates. There have been several comparative studies, but most have been post hoc and thus cannot control for the range of variation in survey design features, such as sampling, weighting and imputation. The split-ballot study offers the only opportunity to isolate and manipulate particular survey design features in order to measure the specific contribution these features make to measurement error.


Question 5. Minimizing Burden


Small businesses or other small entities are not asked to report information.



Question 6. Consequences of Less Frequent Collection


The Redesign is aimed at reducing measurement error in the CPS ASEC, focusing especially on error related to the calendar year reference period. Thus, to evaluate the effectiveness of the Redesign’s questions on retrospective coverage, it is essential that the field study parallel the timing of production CPS ASEC data collection as closely as possible. If the study is not carried out in 2010, it will have to wait an entire year, to be conducted in the spring of 2011. This would delay research findings that could inform interpretations of estimates from CPS ASEC and ACS production data, and it would delay further testing and implementation of any improvements to the CPS ASEC production instrument – which in turn would delay data collection using techniques that could improve the accuracy of health insurance estimates.



Question 7. Special Circumstances


There are no special circumstances.



Question 8. Consultations Outside the Agency


For more than a decade, Census Bureau staff have been collaborating and communicating with individuals outside the Bureau who have been closely involved in the technical matters of health insurance measurement. These individuals include Rob Stewart (then at the U.S. Department of Health and Human Services Office of the Assistant Secretary for Planning and Evaluation), Mike Davern and Kathleen Call (of the State Health Access Data Assistance Center), Linda Bilheimer and Diane Makuc (at the National Center for Health Statistics) and Steve Hill (at the Agency for Healthcare Research and Quality). Efforts have also been made to inform, and solicit comments from, the research community on research findings and plans for future tests through a variety of publications, conferences and seminars. Below is a partial list.


Pascale, J. (2009), “Findings from a Pretest of a New Approach to Measuring Health Insurance in the Current Population Survey.” Proceedings of the Federal Committee on Statistical Methodology Conference, November 2009.

Pascale, J., M. I. Roemer, and D. M. Resnick (2009), “Medicaid Underreporting in the CPS: Results from a Record Check Study.” Public Opinion Quarterly 73: 497-520. (Also presented at a DC Chapter of the American Association for Public Opinion Research seminar, August 21, 2007, and a Washington Statistical Society seminar, January 16, 2008.)

Pascale, J. (2008), “Assessing Measurement Error in Health Insurance Reporting: A Qualitative Study of the Current Population Survey.” Inquiry 45(5), Winter 2008/2009, pp. 422-437.

Pascale, J. (2008), “Health Insurance Measurement: A Synthesis of Cognitive Testing Results.” Paper presented at the annual meeting of the American Association for Public Opinion Research (AAPOR), American Statistical Association, New Orleans, LA.

Pascale, J. (2004), “Medicaid and Medicare Reporting in Surveys: An Experiment on Order Effects and Program Definitions.” Proceedings of the American Association for Public Opinion Research (AAPOR), American Statistical Association.

Pascale, J. (2002), “Data Quality of Health Insurance Surveys.” Roundtable organized for the 2002 Annual Meetings of the American Association for Public Opinion Research. Participants: Timothy Beebe, PhD, State Health Access Data Assistance Center, University of Minnesota; Joanne Pascale, MA, Census Bureau; Terry L. Richardson, PhD, MPA, National Center for Health Statistics; Anthony M. Roman, MA, Center for Survey Research, University of Massachusetts-Boston; and Stephen Zuckerman, PhD, Urban Institute.

Pascale, J. (2002), “Health Insurance Measurement Methodologies: A Data Quality Assessment.” Presentation to the Data Analysis Group, National Center for Health Statistics, April 15, 2002.

Pascale, J. (2001), "Measuring Private and Public Health Coverage: Results from a Split-Ballot Experiment on Order Effects." Paper presented at the 2001 Annual Meetings of the American Association for Public Opinion Research; Proceedings of the Section on Survey Research Methods, American Statistical Association, 2001.

Hess, J., J. Moore, J. Pascale, J. Rothgeb, and C. Keeley (2001), "The Effects of Person-level vs. Household-level Questionnaire Design on Survey Estimates and Data Quality." Public Opinion Quarterly 65: 574-584. (Also presented at a Washington Statistical Society seminar, March 21, 2001.)

Pascale, J. (1999), "Methodological Issues in Measuring the Uninsured." Paper presented at the Seventh Health Survey Research Methods Conference; Proceedings, U.S. Department of Health and Human Services, National Center for Health Statistics, Hyattsville, Maryland, pp. 167-173.



Question 9. Paying Respondents


This study will not involve any payments to respondents.


Question 10. Assurance of Confidentiality


Respondents are informed through an advance letter (see attachment) and in the survey introduction that the survey: (1) is being conducted under the authority of Title 13, United States Code, Sections 182 and 9; (2) has been approved by OMB under control number 0607-0959; (3) takes an average of 12 minutes per household to complete; and (4) is voluntary. Respondents are also informed that the Census Bureau is required to keep their information confidential and to use it for statistical purposes only. The advance letter also solicits comments from respondents, provides a mailing address and an email address for sending these comments, and states that the OMB number legally certifies the information collection.


Question 11. Justification for Sensitive Questions


No sensitive questions are asked in this study.



Question 12. Estimate of Hour Burden


The SHIPP survey will be conducted only one time, by telephone, with two types of sample: an RDD sample (n = 3,000 completed interviews) and a "seeded" sample of known Medicare enrollees from the Centers for Medicare and Medicaid Services (n = 2,000 completed interviews). A single household respondent (18 years old or older) is asked to report for the entire household. The interview is expected to take 12 minutes per household on average, resulting in 1,000 total annual burden hours.
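
In other words: (3,000 + 2,000) completed interviews × 12 minutes per interview = 60,000 minutes, or 1,000 hours.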



Question 13. Estimate of Cost Burden


There are no costs to respondents other than that of their time to respond.



Question 14. Cost to Federal Government


The total cost estimate for the production of the CATI instrument, testing, and all phases of preparation and administration of data collection is $1,088,308. The Census Bureau will bear the full cost of the study.



Question 15. Reason for Change in Burden


The increase in burden is attributable to the information collection being submitted as new.



Question 16. Project Schedule


General research leading up to this study has been ongoing. Preliminary planning for this particular field test began in December 2008, with plans for a March 2009 pretest followed by a March 2010 field test. Below is a detailed schedule.


Figure 1: SHIPP Field Test Preparation and Data Collection Schedule


2009

January - March: Develop SHIPP instrument and training materials for pretest.

March: Conduct and analyze SHIPP pretest results; modify instrument accordingly.

April - June: Solicit and select contractor to develop CATI instrument; develop management plan for administration of contract.

April - June: Develop detailed specifications for CATI instrument.

June: Hold kick-off meeting among inter-divisional Census staff; develop schedule; assign tasks.

June - December: Contractor develops CATI instrument; Census conducts tests of working instrument.

December: (1) Conduct first formal test of CATI instrument (“user’s test”).
(2) Develop processing specifications.


2010

January: (1) Develop specifications for sample selection.
(2) Develop advance letter and procedures.
(3) Conduct second formal test of CATI instrument (“systems test”).
(4) Develop training materials.

February: (1) Select and prepare sample.
(2) Conduct third formal test of CATI instrument (“verifications test”).

March: (1) Mail advance letters.
(2) Conduct training.

March - May: Conduct training and data collection (see Table 1 for details).

June - August: Analysis and preparation of reports for internal use and research conferences and publications (see Figure 2 for details).


Table 1: 2010 SHIPP Data Collection Training and Production Interviewing Schedule


Phase  Training Content                                   Training Dates   Time     Data Collection
1      Base training (all interviewer groups together)    March 20         4 hours  March 22-April 6
       ACS health section (interviewer group A)           March 20         3 hours
       CPS health section (interviewer group B)           March 21 (a.m.)  3 hours
       EXP health section (interviewer group C)           March 21 (p.m.)  3 hours
2      ACS health section (interviewer group B)           April 7 (a.m.)   2 hours  April 9-23
       CPS health section (interviewer group C)           April 7 (p.m.)   2 hours
       EXP health section (interviewer group A)           April 8 (a.m.)   2 hours
3      ACS health section (interviewer group C)           April 24 (a.m.)  2 hours  April 26-May 10
       CPS health section (interviewer group A)           April 24 (p.m.)  2 hours
       EXP health section (interviewer group B)           April 25 (a.m.)  2 hours


Figure 2: 2010 SHIPP Analysis, Tabulation and Publication Plans


The following tasks are planned for the SHIPP data. Dates for completion of each task have not been determined:

  • Conduct preliminary analysis; develop data cleaning programs

  • Finalize processing specifications to derive flat person-level file

  • Produce estimates of health insurance coverage by questionnaire treatment. Produce separate estimates for uninsured, general plan type (public vs. private) and specific plan type (employer-sponsored, Medicare, etc.).

  • Compare and analyze estimates of coverage across treatments (a simplified illustration follows this list)

  • Conduct analysis on months and duration of coverage across treatments

  • Conduct multivariate analysis of health insurance coverage using auxiliary variables on respondent and household characteristics

  • Conduct analysis of response rates

  • Tabulate average interview administration time

  • Analyze interviewer debriefing and questionnaire evaluation forms

  • Analyze respondent debriefing data

  • Match survey data to administrative records when possible

  • Prepare internal research reports

  • Prepare reports for presentation to research community
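
As a rough illustration of the treatment-comparison step flagged above, the sketch below (in Python) applies a two-proportion z test to uninsured rates from two questionnaire treatments. The rates and sample sizes are invented placeholders, not SHIPP results, and a production analysis would use survey weights and design-based variance estimates rather than this simple unweighted test.

    # Hypothetical illustration; the rates and sample sizes are placeholders,
    # not SHIPP results. A real analysis would be design-based and weighted.
    from math import sqrt

    def two_proportion_z(p1, n1, p2, n2):
        # z statistic for the difference between two independent proportions.
        pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    # Placeholder uninsured rates and completed-interview counts per treatment.
    cps_rate, cps_n = 0.152, 1650   # CPS ASEC-style panel
    exp_rate, exp_n = 0.139, 1690   # experimental Redesign panel

    z = two_proportion_z(exp_rate, exp_n, cps_rate, cps_n)
    print(f"uninsured rate difference z statistic: {z:.2f}")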



Question 17. Request to Not Display Expiration Date


The expiration date will be contained in the advance letter sent to respondents.



Question 18. Exceptions to the Certification


There are no exceptions to the Certification for Paperwork Reduction Act Submissions.


