
SUPPORTING STATEMENT PART B:
Consumer Survey of Attitudes Toward the Privacy and Security Aspects of Electronic Health Records and Electronic Health Information Exchange

The United States Department of Health and Human Services (HHS)

The Office of the National Coordinator for Health Information Technology (ONC)

Office of the Chief Privacy Officer (OCPO)

January 30, 2021

B. Collections of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

B.1.a Respondent Universe

The respondent universe for the Consumer Survey of Attitudes Toward the Privacy and Security Aspects of Electronic Health Records and Electronic Health Information Exchange is the civilian, noninstitutionalized population ages 18 and older within the 50 states and the District of Columbia. To administer the questionnaire to Spanish-speaking respondents, NORC will work with Research Support Services in Evanston, Illinois, to translate the final OMB-approved English questionnaire into Spanish. The firm specializes in developing dialect-neutral translations, making the questionnaire comprehensible and acceptable to a breadth of speakers, regardless of dialect or origin.

B.1.b Sampling Plan

This survey will utilize a dual random digit dialing (RDD) frame of landline phone numbers and wireless/mobile phone numbers developed by Survey Sampling International (SSI). To reduce sampling variability and to represent the nation, NORC proposes to stratify the landline RDD frame by the four Census Regions. The frame will be further stratified by Latino and African American population density so that Latinos and African Americans can be implicitly oversampled. The cell phone RDD frame will not be stratified, because cell phone numbers are not reliably associated with the geographic location of households.

To create the stratification by Latino and African American population density, NORC will use SSI's exchange-level population estimates by ethnic group (SSI, 2009b). Within each Census Region of the landline RDD frame, the proportion of Latinos and African Americans is computed for each area code and exchange. The combined proportion of Latinos and African Americans is sufficient for identifying high-density exchanges. The upper quartile (25%) of exchanges in each Region is then grouped as the high-density stratum with respect to the Latino and African American population.
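To make the stratification rule concrete, the sketch below flags the upper quartile of exchanges within each Region as high density. It is a minimal illustration with hypothetical record fields; SSI's actual exchange-level data and NORC's production processing differ.

```python
from dataclasses import dataclass

@dataclass
class Exchange:
    region: str            # Census Region: West, Midwest, South, or Northeast
    prefix: str            # area code + exchange, e.g., "773256" (hypothetical field)
    pct_latino: float      # Latino share of the exchange's population
    pct_african_am: float  # African American share of the exchange's population

def assign_density_strata(exchanges):
    """Within each Region, flag the upper quartile of exchanges, ranked by
    combined Latino and African American proportion, as the high-density stratum."""
    strata = {}
    for region in {e.region for e in exchanges}:
        in_region = [e for e in exchanges if e.region == region]
        combined = sorted(e.pct_latino + e.pct_african_am for e in in_region)
        cutoff = combined[int(0.75 * (len(combined) - 1))]  # ~75th percentile
        for e in in_region:
            high = (e.pct_latino + e.pct_african_am) >= cutoff
            strata[e.prefix] = (region, "high" if high else "low")
    return strata
```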

By defining low- and high-density strata within each Region and applying different sampling fractions, we can probabilistically oversample African Americans and Latinos. This implicit oversampling method does not require a separate frame of African Americans and Latinos (e.g., a frame of race/ethnicity-targeted listed phone numbers), which is not always well defined, particularly when it forms a dual frame with an RDD frame. Most importantly, this method requires no race/ethnicity screening, which reduces burden on respondents and leads to higher interview completion rates and lower selection or interview bias. However, the race/ethnicity of actual respondents will need to be monitored, along with response rates and other outcome measures.

For the main survey data collection, the target number of completed interviews is 2,000. The expected number of completed cases is allocated to each sample frame and stratum: landline RDD frame (stratified by Census region and African American and Latino population density) and cell phone RDD frame, as follows:

Frame           Region      Density   Expected Completes
Landline RDD    West        High*            192
Landline RDD    West        Low**            192
Landline RDD    Midwest     High             179
Landline RDD    Midwest     Low              179
Landline RDD    South       High             303
Landline RDD    South       Low              303
Landline RDD    Northeast   High             148
Landline RDD    Northeast   Low              148
Cell-only RDD   National    All              356
Total                                      2,000

* Latino and African American high density
** Latino and African American low density

First, the 2,000 completed interviews are distributed between the landline RDD frame (1,644) and the cell-phone-only RDD frame (356), in proportions of 82.2% and 17.8%, respectively. These proportions are based on estimates derived from the National Health Interview Survey (Pew, 2008).

Next, the landline RDD frame's 1,644 completed interviews are distributed across the four Census Regions in proportion to the most recent population estimates available from the Census Bureau (http://www.census.gov/popest/national/files/NST_EST2009_ALLDATA.csv): 71,568,081 (West), 66,836,911 (Midwest), 113,317,879 (South), and 55,283,679 (Northeast), yielding 384, 358, 606, and 296 completed interviews, respectively.

Within each Region, the completed interviews are divided equally between the Latino-African American high- and low-density strata.
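The allocation arithmetic can be reproduced from the figures above. The short sketch below uses the quoted Census estimates; simple rounding may differ from the published allocation (384, 358, 606, 296) by a case or two.

```python
# Reproduce the regional allocation of landline completes described above.
landline_total, cell_total = 1644, 356   # 82.2% and 17.8% of 2,000 completes

population = {
    "West": 71_568_081,
    "Midwest": 66_836_911,
    "South": 113_317_879,
    "Northeast": 55_283_679,
}
total_pop = sum(population.values())

for region, pop in population.items():
    completes = round(landline_total * pop / total_pop)
    # Each Region's completes are split evenly between the high- and
    # low-density strata.
    print(f"{region}: {completes} ({completes // 2} per density stratum)")
print(f"Cell-only RDD (national): {cell_total}")
```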

B.1.c Sampling Frames

As previously noted, this survey will utilize the landline RDD frame and the wireless/mobile RDD frame developed by Survey Sampling International (SSI). SSI's landline RDD frame is organized by area codes and exchanges, which allows NORC to directly implement a list-assisted RDD method (Lepkowski, 1988; Tucker et al., 1993). The method selects a random sample of telephone numbers from the banks of 100 consecutive telephone numbers (e.g., 773-256-0000 to 773-256-0099) that contain at least one directory-listed residential telephone number.
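As an illustration of the list-assisted selection rule, the sketch below draws numbers only from "1+ banks" (banks with at least one directory-listed residential number). The frame representation is a hypothetical stand-in for SSI's database, not its actual schema.

```python
import random

def sample_list_assisted(bank_listed_counts, n, rng=random.Random(0)):
    """bank_listed_counts maps a 100-bank prefix (first 8 digits, e.g.,
    '77325600' for 773-256-0000 through 773-256-0099) to its count of
    directory-listed residential numbers. Only banks with at least one
    listed number are retained, per the list-assisted method."""
    eligible = [b for b, listed in bank_listed_counts.items() if listed >= 1]
    # Each eligible bank holds exactly 100 numbers, so a uniform draw of a
    # bank followed by uniform last two digits is uniform over all numbers
    # in eligible banks.
    return [rng.choice(eligible) + f"{rng.randrange(100):02d}" for _ in range(n)]
```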

The cell phone RDD frame will also come from SSI, which maintains a wireless/mobile RDD frame with wireless prefixes and blocks that do not overlap with the landline RDD frame (SSI, 2009c). Households with no phone service of any type (about 2% of households, according to Blumberg et al. (2010)) would not be covered by the survey, but cell-phone-only households (about 17.8% of households with phone service (Pew, 2008)) would be.

From each household with a selected phone number in a given frame, only one adult will be selected to complete the telephone interview. The survey will utilize the last-birthday respondent-selection method. It simply asks for the eligible person (adult at least 18 years old) within the sampling unit (i.e., household) who had the most recent birthday or will have the next birthday. This method provides a true within-unit probability sample without intrusive or burdensome screening of eligible persons in the household (Lavrakas, et al., 1994). The proposed respondent selection method ensures maximum respondent anonymity, as no identifying information is collected.

B.1.d Sample Allocation and Precision

Based on the anticipated response rates, household eligibility rates, and desired number of respondents, NORC proposes drawing a dual landline and cellular telephone sample of 31,705 telephone numbers to achieve the study's design goals. This includes a sample of 14,999 cell phone numbers; cell phone numbers cannot be prescreened for known nonworking and business numbers, so more of them must be dialed to achieve the same number of completed interviews.
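A rough consistency check on these figures can be made by backing out the implied yield per dialed number. The actual design calculation also uses working-residential and eligibility rates that are not reproduced here.

```python
# Back out the implied completes per dialed number from the quoted targets.
total_numbers, cell_numbers = 31_705, 14_999
landline_numbers = total_numbers - cell_numbers   # 16,706 landline numbers

landline_completes, cell_completes = 1_644, 356
print(f"landline: {landline_completes / landline_numbers:.3f} completes/number")  # ~0.098
print(f"cell:     {cell_completes / cell_numbers:.3f} completes/number")          # ~0.024
# Roughly 10 landline numbers, but about 42 cell numbers, must be sampled
# per completed interview, reflecting the inability to prescreen cell samples.
```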



The precision of the survey estimates will depend primarily on the sample sizes within each group of analytic interest. For the key variables shown in the table below, we have constructed estimates of the group sizes based on Census data for age and urbanicity. The distribution of race and ethnicity was computed from the GENESYS system provided by MSG, our sample vendor, which accounts for the oversampling of high-density African American and Hispanic telephone exchanges, as described in our sampling plan. Note that for race/ethnicity and urbanicity, some categories were modified to conform to the data available from MSG and the Census Bureau, respectively. The estimate of the number of respondents under treatment for a chronic health condition was taken from a report based on NHANES data [1].

The computed minimum detectable differences are based on a two-sample test of proportions where the reference proportion is assumed to be 50% and the desired power of the test is 0.80. The comparisons across years assume that similar sample sizes will be available in each year. The comparison between groups compares the selected group with all other respondents. The actual minimum detectable difference for any particular comparison will depend on the final sample sizes and the values of the compared proportions.




Minimum Detectable Differences

Group                                Group Size   Compared Across Years   Group Compared
                                                  (within Group)          to All Others

Race/Ethnicity
  Non-Hispanic White                 1,151        ±6%                     ±7%
  Non-Hispanic Black                 302          ±12%                    ±9%
  Hispanic                           404          ±10%                    ±8%
  Asian/Pacific Islander             89           ±22%                    ±16%
  Other                              55           ±27%                    ±20%
  Total                              2,000        ±5%

Urbanicity
  Urban Center                       190          ±15%                    ±11%
  Urban Area                         1,425        ±6%                     ±7%
  Rural                              385          ±11%                    ±9%

Age
  18 to 24 years                     261          ±13%                    ±10%
  25 to 44 years                     699          ±8%                     ±7%
  45 to 64 years                     696          ±8%                     ±7%
  65 years and over                  344          ±11%                    ±9%

Chronic Condition Under Treatment
  Yes                                580          ±9%                     ±7%
  No                                 1,420        ±6%                     ±7%


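The table's entries can be approximated with the standard normal-approximation formula for a two-sample test of proportions. The sketch below assumes a two-sided test at alpha = 0.05, which is not stated above.

```python
from scipy.stats import norm

def mdd(n1, n2, p=0.50, alpha=0.05, power=0.80):
    """Minimum detectable difference for a two-sample test of proportions
    at the given power, evaluated at the reference proportion p."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)   # approx. 1.96 + 0.84
    return z * (p * (1 - p) * (1 / n1 + 1 / n2)) ** 0.5

# Across years: the same group size is assumed in each year.
print(f"Non-Hispanic White, across years: ±{mdd(1151, 1151):.0%}")      # ~±6%
# Group vs. all others: 302 Non-Hispanic Black vs. the remaining 1,698.
print(f"Non-Hispanic Black vs. others:    ±{mdd(302, 2000 - 302):.0%}") # ~±9%
```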

B.1.e Responsive Design

The design of the HHS ONC Consumer Survey is based on established, proven methods for telephone RDD sampling and data collection. The protocols have been developed to minimize sampling and measurement errors in the survey process. However, a certain level of uncertainty is always present in any survey and survey design. To address these uncertainties, the survey will incorporate responsive design, using survey paradata and other real-time information to manage survey operations to reduce costs, gain efficiencies, and improve response rates. We will use real-time information on sample outcomes (e.g., rates of non-residential telephone numbers, disconnected lines, refusals, completed interviews) and survey data to monitor costs and errors, and make mid-course design alterations if needed.

At the start of data collection, the survey will identify the design areas with the potential to affect costs and measurement errors. For example, landline and cellular phone samples perform differently because cell phone samples cannot be screened for known nonworking and business numbers. The overall interviewing hours and cost per interview in each sample will be closely monitored, and the allocation between landline and cellular phone numbers may be adjusted during data collection to achieve a more efficient allocation. We will also monitor participation rates among the Latino and African American oversamples and adjust the sampling frame allocation as needed. In addition to monitoring the sampling procedures, the survey will be adaptive to any potential issues with respondent selection, refusal conversion strategies, and other data collection procedures.

B.1.f Response Rates

A survey response rate expresses completed interviews as a percentage of estimated eligible units. It can be decomposed into three component rates if we assume that the working-residential and eligibility rates of unresolved cases equal those of resolved cases:

Response rate = working residential number resolution rate × household screening completion rate × survey interview completion rate.

This is the so-called CASRO (Council of American Survey Research Organizations) response rate, equivalent to the American Association for Public Opinion Research's (AAPOR's) Response Rate 3 when the working-residential and eligibility rates of unresolved cases are assumed equal to those of resolved cases. We use this definition of the response rate here.

Survey response rates depend not only on the sampled population, or potential respondents, but also on operational factors that can affect response propensity. Response rates also usually change during data collection. Initial response rates tend to be higher because the sample is fresh. Intermediate response rates tend to dip as the more difficult cases in the sample are worked. Final response rates often recover somewhat because of the extra operational effort put forth near the end of data collection. Sometimes contact modes or rules are modified to increase contact opportunities and response propensity.

We expect the following outcome rates for the HHS ONC Consumer Survey:

Consumer Survey                        RDD landline   RDD cell
Resolution rate                            0.79         0.56
Household screening completion rate        1.00         1.00
Interview completion rate                  0.55         0.70
CASRO response rate                        0.43         0.39
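As a check, the CASRO rates in the table are simply the product of the three component rates, per the decomposition above.

```python
# Multiply the component rates from the table to recover the CASRO rates.
rates = {
    "RDD landline": (0.79, 1.00, 0.55),
    "RDD cell": (0.56, 1.00, 0.70),
}
for frame, (resolution, screening, interview) in rates.items():
    print(f"{frame}: {resolution * screening * interview:.2f}")
# RDD landline: 0.79 * 1.00 * 0.55 = 0.4345, rounded to 0.43
# RDD cell:     0.56 * 1.00 * 0.70 = 0.392,  rounded to 0.39
```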

A recently conducted national telephone survey, the National 2009 H1N1 Flu Survey by the Centers for Disease Control and Prevention (CDC) (http://www.cdc.gov/nchs/nis/h1n1_introduction.htm), was similar to the current survey in several critical respects. First, it was a national survey like the HHS ONC Consumer Survey. Second, it was conducted by NORC, utilizing NORC's telephone interviewers at NORC's telephone center. Finally, and most importantly, the Flu Survey drew a list-assisted RDD sample of both landline and cell telephone numbers. We therefore consider its outcome rates in predicting the response rates for this survey. The Flu Survey achieved a CASRO response rate of 35% for RDD landline telephone lines and 27% for RDD cell-only or cell-mostly telephone lines. Compared to the Flu Survey, the interview completion rates for this survey are expected to be somewhat higher (by 25%), because the survey content is easier and the questionnaire will be well tested in advance, including cognitive testing.

B.2 Procedures for the Collection of Information

B.2.a Interviewer Training

NORC telephone interviewers will be trained to administer the CATI questionnaire. All interviewers staffed on the project will have completed a general training that covers standard data collection techniques, including proper reading of questions, probing, and dictation protocols, and requirements for maintaining privacy and confidentiality, as well as an overview of our data collection software, including the dialer, the calling algorithms, and the system of disposition coding and call notes. During this training, heavy emphasis is also placed on gaining respondent cooperation. In addition to this general training, the telephone interviewers staffed on the survey will participate in project-specific training for the HHS ONC Consumer Survey, held at the telephone production center in Chicago. Project training will address the background and purpose of the HHS ONC Consumer Survey, interviewer expectations, administration of the CATI questionnaire, and other study-specific materials and procedures. NORC has a well-established interviewer training program that includes role playing, practice sessions, and supervisor evaluation of trainee competencies to ensure that all interviewers successfully complete training. Bilingual interviewers will receive additional training in administering the Spanish version of the CATI questionnaire. Interviewers begin data collection only after they have been certified as ready by telephone center management.

B.2.b Collection of Survey Data

The finalized English and Spanish versions of the approved questionnaire for the HHS ONC Consumer Survey will be programmed into electronic versions suitable for administration in Computer Assisted Telephone Interviewing (CATI) mode. The questionnaire consists mainly of multiple-choice and closed-ended, scaled questions and is estimated to take approximately 15 minutes to administer.

NORC's CATI software is a full-featured dialing, data collection, and sample management system designed to collect survey data efficiently and accurately. This state-of-the-art system handles both complex surveys with involved skip patterns and questionnaires of limited complexity. The system also performs internal consistency checks to ensure that only valid, applicable codes are entered for each question and that skip patterns function appropriately.

The CATI system also features powerful sample management software providing review, prioritization, and overall management of samples. It also provides enhanced call scheduling capabilities that support intelligent calling rules. These rules reference both case-level call history and questionnaire-embedded sample management data to distribute future call attempts to new days and times and to finalize sample that has reached the maximum number of call attempts. During data collection, we will continue to refine the calling rules to ensure the most efficient set of rules is in place.

NORC conducts extensive testing on each new CATI instrument before it is deployed into production. Such testing allows project staff to ensure that the system works as specified with regard to skip patterns, data capture, and points of return. Testing involves exercising targeted questionnaire aspects by "calling" cases through a test environment that mimics the production questionnaire, as well as simulating interview conditions by conducting mock interviews with NORC staff members.

NORC’s CATI system will allow for fast and easy access to the Spanish version of the electronic survey questionnaire. After contacting a Spanish-speaking household and selecting the appropriate respondent within the household, bilingual interviewers will be able to seamlessly transition to either the English or Spanish version of the CATI questionnaire, depending on the respondent’s language preference.

B.2.c Quality Control

Throughout the course of data collection, NORC's telephone supervisors will use the CATI monitoring system to observe interviewer performance, recording and maintaining the results of these observations in a database. Interviewer monitoring and recording capabilities allow supervisors to evaluate interviewers' performance and to identify motivational programs, training needs, and specialized skills accordingly. NORC telephone interviewers understand that they could be monitored at any time; however, they will not know whether they are actually being monitored until after the call. Shortly after completing the monitoring session, the supervisor will meet with the interviewer to provide feedback about performance. Monitoring algorithms recommend which interviewers should be monitored in order to fulfill their required monitoring levels. The algorithms are highly customizable and evaluate many interviewer-level factors, including tenure, previous monitoring scores, production metrics, and length of time since the last monitoring session. A minimum of 10% of the CATI interviews will be monitored for this survey.


In addition to interviewer monitoring, all data collection efforts, from case management to survey data, will be subject to strict quality assurance and security standards for data capture, processing, and cleaning. NORC's security program complies with federal government regulations, and the security of all servers and processing equipment is handled under a specific protocol. All NORC server rooms and wiring closets are located behind locked doors within the boundaries of a secure area inside the facility. Access to these areas requires either a key or a security code, which are made available only to the specific individuals designated to work in these areas. All of NORC's data collection and processing sites are located in highly restricted areas. Only those NORC employees who have read and signed a copy of NORC's confidentiality pledge (or their escorted guests) are allowed on the premises. When handling and reviewing project-specific materials and data, only the staff members assigned to that specific project are granted access to those data and materials.

B.2.d Estimation Procedure

Analysis weights will be computed so that the weighted sample data represent the population, allowing not only segment-specific analyses but also overall national analyses with as little selection bias as possible. The survey will employ Census-based population totals for race/ethnicity, gender, age, income, and geography in deriving the survey weights. We expect these geographic and demographic groups to be the most appropriate for ensuring sample representativeness of the population, thereby reducing the potential for bias in the resultant survey estimates. This weighting approach controls the weighted sample counts to population totals for characteristics presumed to be correlated with nonresponse, undercoverage, and/or the survey variables of interest. Analyses for the total population as well as population subgroups based on the resultant survey weights should thus produce accurate and reliable results.
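One common way to implement this kind of control to population totals is raking (iterative proportional fitting) to the Census margins. The sketch below is a minimal illustration under that assumption; it is not NORC's production weighting procedure, which would also involve dual-frame base weights, trimming, and related refinements.

```python
# Minimal raking (iterative proportional fitting) sketch: base weights are
# scaled so weighted margins match population control totals, cycling through
# one control dimension at a time.
def rake(records, weights, controls, n_iter=50):
    """records: list of dicts, e.g., {'age': '18-24', 'gender': 'F'}.
    controls: {dimension: {category: population_total}}."""
    w = list(weights)
    for _ in range(n_iter):
        for dim, totals in controls.items():
            # Current weighted total in each category of this dimension.
            current = {cat: 0.0 for cat in totals}
            for rec, wi in zip(records, w):
                current[rec[dim]] += wi
            # Scale each weight so the weighted margin matches its control.
            for i, rec in enumerate(records):
                if current[rec[dim]] > 0:
                    w[i] *= totals[rec[dim]] / current[rec[dim]]
    return w
```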

B.3 Methods to Maximize Response Rates

Declining response rates have been reported by all survey organizations and statistical agencies regardless of respondent type and modes of data collection. To encourage participation, the telephone introductory script will clearly describe the goals and importance of the survey, highlighting that findings from the survey will be used to develop national policy recommendations given the importance of the privacy and security of electronic health information exchange.

The survey questionnaire will be designed for optimal telephone administration. Specifically, the questionnaire items will be tested for suitability for the telephone data collection mode and for comprehension level. The overall interview is estimated to take approximately 15 minutes, in order to reduce respondent fatigue and the risk of break-offs.

To aid in gaining respondent cooperation, hard-copy materials will be developed for use by the telephone interviewers. These include a Frequently Asked Questions (FAQ) job aid that anticipates questions from respondents and provides scripted answers for the interviewers. Cooperation-gaining strategies are implemented at the first contact with a respondent, before the respondent actually refuses to participate. The first few seconds of the interviewer-respondent interaction are critical: the interviewer must successfully engage the respondent during the introduction to the survey, before the respondent hangs up the telephone. In addition to averting refusals, interviewers will be trained to convert reluctant respondents by addressing possible concerns they might have about participating in the survey.

NORC will also maintain a toll-free line for respondents who choose to call to confirm the legitimacy of the survey. The toll-free line will be staffed by a telephone supervisor trained in gaining cooperation for the survey, and will also have a voicemail message designed to put respondents at ease and encourage participation.

During the course of data collection, the NORC project team will communicate frequently with the telephone center staff to identify any potential issues related to contacting the sampled respondents or administering the CATI questionnaire. Helpful strategies for gaining cooperation and tips garnered by the interviewers will also be shared and disseminated to improve data collection efforts.

Contact attempts to respondents will also be carefully designed and monitored during data collection. Multiple contact attempts will be made to each sampled residential or cellular telephone number in order to identify and interview the appropriate adult respondent. Call attempts will be made during different days of the week and hours of the day so as to increase the likelihood of reaching someone at the household.

B.3.a Methods to Maximize Coverage

As described in Section B.1.c above, the survey utilizes a dual-frame RDD sample of landline and cellular telephone lines to maximize coverage. By augmenting the landline RDD sample with cellular lines, the survey will maximize the percentage of the national population that is eligible for inclusion in the survey.

B.3.b Addressing Nonresponse Bias

Unit non-response has two negative consequences for the quality of the estimates derived from the data. First, nonresponse reduces the sample size and, as the number of responses decreases, the variability of survey estimates increases. Second, and more importantly, nonresponse has the potential to cause bias in the estimates. For means and proportions, the bias depends on two factors: the response rate, and the difference in the means or proportions of the respondents and non-respondents. Therefore, bias can be expressed as follows:

Bias = (1 − RR) × (S_r − S_n),

where RR = the unit response rate, S_r = the mean or proportion for respondents, and S_n = the mean or proportion for non-respondents.

Thus, bias increases as the difference in means or proportions between respondents and non-respondents grows, or as the unit nonresponse rate increases. Unfortunately, while the response rate can be calculated, the mean or proportion for the non-respondents is unknown. The actual amount of non-response bias in any estimate produced from the survey will generally be impossible to know, as no benchmark external estimates are available for comparison.
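A small worked example of the bias expression, using the expected landline CASRO response rate and a hypothetical value for the unobservable non-respondent proportion:

```python
# Worked example of Bias = (1 - RR) * (S_r - S_n).
RR = 0.43     # unit response rate (expected landline CASRO rate)
S_r = 0.60    # proportion among respondents (hypothetical)
S_n = 0.55    # proportion among non-respondents (unobservable in practice)
print(f"bias = {(1 - RR) * (S_r - S_n):.4f}")   # 0.57 * 0.05 = 0.0285, ~2.9 points
```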

However, the potential effect of unit nonresponse can be reduced through the use of population-based weighting that adjusts not only for under-coverage but also for non-response. This weighting approach controls the weighted sample counts to population totals for characteristics presumed to be correlated with non-response, under-coverage, and/or the survey variables of interest. For example, if any potential non-response is due to differences in income, the weighting adjustments will help compensate as long as income is used in the population controls. Analyses for the total population as well as population subgroups based on the resultant survey weights should thus produce accurate and reliable results.

The survey will make use of Census-based population totals for race/ethnicity, gender, age, income, and geography in deriving the survey weights. We expect that these geographic and demographic groups would be most appropriate for ensuring sample representativeness of the population, thereby reducing the potential for bias in the resultant survey estimates.

In order to assess the above weighting scheme and potential non-response bias, we recommend comparing demographic profiles and income distribution derived from our data against several sources, including those published by the Census Bureau for the Current Population Survey and the American Community Survey.

B.4 Tests of Procedures

B.4.a Cognitive and Psychometric Testing

The National Cancer Institute (NCI) will include 5 core questions from the survey questionnaire for the ONC proposed information collection, entitled “Consumer Survey of Attitudes Toward the Privacy and Security Aspects of Electronic Health Records and Electronic Health Information Exchange,” in at least two (of four) cycles of the Health Information National Trends Survey (HINTS) IV data collection.

As part of the HINTS IV data collection program, NCI has selected Westat, an employee-owned corporation providing research services to agencies of the U.S. Government, to conduct cognitive interviews focused on comprehension of, and response to, items never before included in a HINTS IV data collection. The Westat cognitive testing is expected to fine-tune the 5 core questions from the survey questionnaire. Westat conducted its cognitive testing during June 2011. Participants from different demographic segments were recruited through a variety of methods, such as advertising in the classifieds section of the Craigslist website (www.craigslist.org) and community-based outreach. People interested in participating were asked to call a toll-free number and leave their contact information. A trained Westat interviewer contacted all interested persons and conducted a screening interview that collected demographic information and previous experience in any Westat research studies (e.g., focus groups or cognitive interviews).

Westat interviewers selected and scheduled 9 participants for in-person cognitive interviews, each approximately 120 minutes in length. All cognitive interviews used the same basic procedures, starting with a review of the consent form and permission to record the interview. Once the participant signed the consent form, the cognitive interviewer provided the participant with the survey questionnaire mailing package. Westat interviewers asked the participants to open the package and work through the materials as they would if they had received it in the mail at home. In addition to making observations, Westat interviewers stopped and probed participants about their understanding of the survey topic, purpose, and questionnaire. NCI provided ONC with a copy of the Westat report on the findings from the cognitive testing. These findings may recommend changes to the 5 core questions for ONC to incorporate into the draft survey questionnaire as non-substantive changes. ONC will benefit from comparative analysis of survey methodologies, such as Random Digit Dialing (RDD) and postal mail, which may offer opportunities to improve efficiency and decrease cost in later years.

For more information regarding the Health Information National Trends Survey (HINTS) please refer to OMB Control No: 0925-0538. For more information regarding cognitive testing at the National Cancer Institute (NCI) Applied Research Program, in the Division of Cancer Control and Population Sciences please refer to OMB Control No: 0925-0589.

For more information regarding the cognitive testing to be performed by ONC with regard to this proposed information collection, please refer to OMB Control No: 0990-0376, Communications Testing for Comprehensive Communication Campaign for HITECH Act (ICR No. 201105-0990-005). The cognitive testing of the survey questionnaire for this proposed information collection was completed in February 2012. The briefing slides containing the findings and recommendations from the cognitive testing are attached as a supplemental document.

Following the completion of the main survey data collection, NORC will perform psychometric analysis of the questionnaire. A trained NORC psychometrician will evaluate the psychometric properties of response scales in the survey instrument, using both classical and item response theory (IRT) methods. Following the analyses, NORC will prepare a report that presents the results and recommendations for improving the scales.

B.4.b Pretesting

Prior to the start of the main survey data collection, NORC will conduct a pretest of 100 completed cases to assess feasibility and operational costs. The pretest will utilize a representative subsample of the main survey's RDD landline and cell phone sample. The goals of the pretest are mainly operational: to test the CATI instrument in a live setting, measure the average administration length of the questionnaire, and assess the RDD sampling procedures and expected sample eligibility rates.

Any CATI programming issues identified during the pretest will be addressed and resolved prior to starting the main survey data collection. We will use the questionnaire timings collected during the pretest to further confirm the expected respondent burden estimates. We have collected questionnaire timings on the final survey during ad-hoc interviews with project staff and found them to be within the expected burden estimate, so we do not expect to need to shorten the questionnaire further. Additionally, we will examine the actual household eligibility rates from the dual landline and cell phone RDD samples. We will also use these results to confirm that the sample design for the main survey will produce the desired number of respondents, including the expected proportion of oversampled African American and Latino respondents. Any sampling issues uncovered during the pretest will be addressed prior to the main data collection. Should the agency decide to make any changes to the sampling plan or CATI questionnaire based on the results from the pretest, we will submit a non-substantive change request.

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Individuals who have participated in designing the data collection:

Melissa M. Goldstein, JD

Office of the Chief Privacy Officer

Office of the National Coordinator for Health Information Technology (ONC)

U.S. Department of Health and Human Services

[email protected]

202.205.9277



Penelope P. Hughes, JD, MPH

Office of the Chief Privacy Officer

Office of the National Coordinator for Health Information Technology (ONC)

U.S. Department of Health and Human Services

[email protected]

202.690.3895



Vaishali Patel, PhD MPH

Office of Economic Analysis, Evaluation, and Modeling

Office of the National Coordinator for Health Information Technology (ONC)

[email protected]

202.690.3912



Linda D. Koontz, CIPP/G

The MITRE Corporation

[email protected]

703.983.3463



Ted Sienknecht, PMP

The MITRE Corporation

[email protected]

703.983.1035



Alison R. Brunelle, CIPP

The MITRE Corporation

[email protected]

571.420.8384



The following individuals from NORC will participate in the collection of data:

Kristina Hanson Lowell, Ph.D.

NORC at the University of Chicago

[email protected]

301.634.9488



Laurie Imhof, M.P.P.

NORC at the University of Chicago

[email protected]

312.325.2530



Lisa Lee, Ph.D.

NORC at the University of Chicago

[email protected]

312.759.4284



Hiroaki Minato, Ph.D.

NORC at the University of Chicago

[email protected]

312.759.4223



The following individual from NORC will participate in the psychometric analysis:

Michele Zimowski, Ph.D.

NORC at the University of Chicago

[email protected]

773.256.6099



The following individuals from The MITRE Corporation will participate in data analysis:

Linda D. Koontz, CIPP/G

The MITRE Corporation

[email protected]

703.983.3463



Ted Sienknecht, PMP

The MITRE Corporation

[email protected]

703.983.1035



Attachments

Attachment A: Bibliography: Literature Review

Attachment B: Introductory Script and Questionnaire

References

The American Association for Public Opinion Research (AAPOR) (2011). “Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys.” 7th edition. Lenexa, Kansas: AAPOR.

Battaglia, Michael P., Khare, Meena, Frankel, Martin R., Murray, Mary Cay, Buckley, Paul, and Peritz, Saralyn (2006). “Response Rates: How Have They Changed and Where are They Headed?” Presented at the Telephone Survey Methodology II Conference, Miami, FL.

Blumberg, Stephen J. and Luke, Julian V. (2010). “Wireless Substitution: Early Release of Estimates from the National Health Interview Survey, July–December 2009” http://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless201005.pdf.

Consumers and Health Information Technology: A National Survey. The California Health Care Foundation, April 2010.

Ezzati-Rice, Trena M., Frankel, Martin R., Hoaglin, David C., Loft, John D., Coronado, Victor G., and Wright, Robert A. (2000). “An Alternative Measure of Response Rate in Random-Digit-Dialing Surveys That Screen for Eligible Subpopulations.” Journal of Economic and Social Measurement, 26, 99–109.

Gaylin, Daniel S., Adil Moiduddin, Shamis Mohamoud, Katie Lundeen, and Jennifer A. Kelly. “Public Attitudes About Health Information Technology, and Its Relationship to Health Care Quality, Costs, and Privacy.” Health Services Research, doi:10.1111/j.1475-6773.2010.01233.x.

Lavrakas, P.J., Bauman, S.L., and Merkle, D.A. (1994). “The Last Birthday Selection Method and Within-Unit Coverage Problems.” American Statistical Association 1993 Proceedings: Section on Survey Research Methods, 1107–1112.

Lepkowski, James M. (1988). “Telephone Sampling Methods in the United States,” pp. 73–98 in R. M. Groves et al. (eds.), Telephone Survey Methodology. New York: Wiley.

The Pew Research Center for the People & the Press (Pew) (2008). "Calling Cell Phones in '08 Pre-Election Polls.” http://people-press.org/reports/pdf/cell-phone-commentary.pdf.

[1] Presentation by David Cutler. Alliance for Health Reform Novartis/NIHCM Briefing. March 28, 2008.

