HIO Supporting Statement B


National Survey of Health Information Exchange Organizations (HIO)

OMB: 0955-0019


B. Statistical Methods


1. Respondent Universe and Sampling Method

Given the relatively small number of HIOs in the U.S. (<200), the proposed approach to achieving nationally representative survey results is to conduct a census. Under this approach, UCSF will develop a comprehensive list of all HIOs in the nation by drawing on five sources and collect survey responses from all HIOs on the list. First, UCSF has access to the lists of HIOs from prior years of conducting the survey; the most recent, from 2015, contains 160 HIOs. Second, any organizations on the Strategic Health Information Exchange Collaborative (SHIEC) member list, which includes 57 HIOs, that are not already on the master list will be added. Third, the same process will be followed for the eHealth Initiative (eHI) HIE directory, which includes more than 100 entries from its efforts to survey HIEs dating back to 2003. Fourth, the same process will be followed for the Healthcare Information and Management Systems Society (HIMSS) directory provided by ONC. Fifth and finally, a targeted web search of state government websites and other HIE resources will be conducted to identify any new HIOs that may have been established in the past 18 months and were not captured in the other sources. Together, these five sources should create a robust sampling frame that will serve as the basis for the census.
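
For illustration only, the merge and de-duplication step could be scripted in Python along the following lines. The file names, the org_name column, and the name-normalization rule are hypothetical; in practice, automated matching would be supplemented by manual review of near-matches before the frame is finalized.

    import pandas as pd

    # Hypothetical file names; each list is assumed to have an 'org_name' column.
    sources = [
        "ucsf_2015_survey.csv",   # prior UCSF survey list (160 HIOs)
        "shiec_members.csv",      # SHIEC member list
        "ehi_directory.csv",      # eHealth Initiative HIE directory
        "himss_directory.csv",    # HIMSS directory provided by ONC
        "web_search.csv",         # targeted web search results
    ]

    def normalize(name):
        # Collapse case, punctuation, and spacing so near-duplicate names match.
        return " ".join(str(name).lower().replace(".", "").replace(",", "").split())

    frames = []
    for path in sources:
        df = pd.read_csv(path)
        df["source"] = path
        df["match_key"] = df["org_name"].map(normalize)
        frames.append(df)

    combined = pd.concat(frames, ignore_index=True)
    # Keep the first occurrence of each organization across the five sources.
    census_frame = combined.drop_duplicates(subset="match_key", keep="first")
    print(f"{len(census_frame)} unique HIOs in the census frame")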



2. Procedures for the Collection of Information

The final survey will be programmed into the online survey tool Qualtrics. This tool has been used for past surveys and has strong capabilities to support complex survey design (e.g., branching logic) as well as respondent communication and tracking. The tool will be extensively tested to ensure accuracy of branching and skip logic, accuracy of piped text, clarity of question display, and adherence to other survey usability guidelines. The HIO contact list will be loaded into Qualtrics to support communication and response tracking (i.e., by generating a unique survey link for each target respondent).


UCSF plans to use the standard survey data collection methodology from past surveys, modified to reflect any changes requested by ONC. This methodology begins with an email sent to respondents 10-14 days before the survey, letting them know to expect it. The survey is then sent via email through Qualtrics. Using Qualtrics features, responses will be tracked daily, and response status will be updated in a separate MS Excel tracking spreadsheet based on the HIO sampling frame.
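
As a minimal sketch, the daily status update could be scripted by joining a Qualtrics response export against the frame-based tracker. The file names and columns below (hio_id, a 0-100 progress field) are assumptions, not the actual tracking format.

    import pandas as pd

    # Assumed inputs: the frame-based Excel tracker and a daily Qualtrics export.
    tracker = pd.read_excel("hio_tracking.xlsx")      # one row per HIO; 'hio_id' column
    responses = pd.read_csv("qualtrics_export.csv")   # 'hio_id' and 'progress' (0-100)

    progress = responses.set_index("hio_id")["progress"]

    def status(hio_id):
        p = progress.get(hio_id, 0)
        if p == 100:
            return "complete"
        return "partial" if p > 0 else "not started"

    tracker["status"] = tracker["hio_id"].map(status)
    tracker.to_excel("hio_tracking.xlsx", index=False)  # refresh the MS Excel tracker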


As responses are received, they will be reviewed for completeness, internal consistency, and accuracy. Beyond general quality assurance (QA) techniques (e.g., examining systematic patterns in responses), a more specific set of QA checks has been developed based on prior experience. For example, we typically check for high outlier values in the number of hospitals and number of ambulatory practices participating in each HIO, as well as for reported geographic coverage spanning non-contiguous states. We will follow up with individual respondents to correct any errors. At the end of this process we will have a quality-assured data set.
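
To illustrate, the outlier check on participation counts could take the following form. The column names and the three-standard-deviation cutoff are illustrative assumptions; actual review also relies on analyst judgment.

    import pandas as pd

    df = pd.read_csv("qa_extract.csv")   # hypothetical extract of submitted responses

    # Flag participation counts far above the rest of the sample.
    for col in ["n_hospitals", "n_ambulatory_practices"]:
        cutoff = df[col].mean() + 3 * df[col].std()
        for _, row in df[df[col] > cutoff].iterrows():
            print(f"Check {row['hio_name']}: {col} = {row[col]} (cutoff {cutoff:.0f})")

    # Geographic coverage spanning non-contiguous states is flagged from the
    # reported state lists and resolved with the respondent directly.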



3. Methods to Maximize Response Rates and Deal with Nonresponse

In past surveys, two types of response rates have typically been tracked. The first is the proportion of all HIOs in the sampling frame whose status can be determined: operational, planning, or defunct. This response rate is critical to tracking the number of efforts over time, and the aim is to achieve a 100% response rate, as with prior surveys. The second is the proportion of all operational and planning HIOs that respond to the survey. Here, we have had substantial success achieving response rates above 80%, and that will be the goal for the current survey. Although the scope of the survey is somewhat broader and will include sensitive questions about information blocking, it is likely that we can achieve at least an 80% response rate to the survey module that asks about key HIO demographics (e.g., governance, key activities, size, geographic coverage, support for reform efforts). For the modules related to standards and information blocking, the goal is an equally high response rate, but we could end up closer to 65% (the rate achieved in the first information blocking survey). However, by collecting the demographic measures for the broader sample, we will be able to assess whether HIOs that respond to the standards and information blocking modules look systematically different from the overall group of HIOs. Item-level response rates will also be calculated, and targeted follow-up will be conducted with respondents if any items have substantially lower response rates (i.e., 10 percentage points or more below the overall response rate).
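
A minimal sketch of the response-rate and item-nonresponse calculations described above follows; the file names and columns (status, completed_survey, per-item response_rate) are assumptions for illustration.

    import pandas as pd

    frame = pd.read_csv("hio_frame_with_status.csv")   # hypothetical merged file

    # Rate 1: proportion of the frame whose status could be determined at all.
    rate_status = frame["status"].isin(["operational", "planning", "defunct"]).mean()

    # Rate 2: proportion of operational and planning HIOs completing the survey.
    eligible = frame[frame["status"].isin(["operational", "planning"])]
    rate_survey = eligible["completed_survey"].mean()
    print(f"Status determined: {rate_status:.0%}; survey response: {rate_survey:.0%}")

    # Item nonresponse: flag items answered at a rate 10+ percentage points below
    # the overall survey response rate, triggering targeted follow-up.
    items = pd.read_csv("item_response_rates.csv")     # columns: item, response_rate
    print(items[items["response_rate"] <= rate_survey - 0.10])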


A robust approach to achieving high response rates, based on prior experience surveying HIOs, will be implemented. First, respondents will be emailed 10-14 days before the survey is sent to let them know to expect it. Then, after they receive the survey, weekly or bi-weekly email follow-up will be conducted for approximately four weeks. Next, non-responders will be called to ensure they received the survey and to answer any questions. Calls and emails will continue until the target response rate has been achieved. We have found that this contact strategy is most effective when (1) respondents are offered the choice to complete the survey via MS Word or over the phone if they prefer that to the online platform, and (2) a financial incentive is offered for survey completion. As in prior years, respondents will be offered a small incentive ($10) for completing the screening questions that enable us to determine whether they are operational, planning, or defunct, and a larger incentive ($50) for completing the entire survey if they are eligible (i.e., not defunct). For the first time, this approach will be further supplemented with communications from SHIEC and eHI to their respective members to raise awareness of the survey and encourage completion. Both organizations will send emails, include information about the survey in newsletters, and mention it during regular member phone calls. In addition, these organizations will not administer their own HIO surveys, and any other substantive requests to their members will be timed so as not to overlap with the data collection period for this survey.



4. Tests of Procedures or Methods to be Undertaken

We conducted two rounds of pilot testing of the survey in order to estimate the time required for completion, identify any potential concerns that could impede survey completion (e.g., the selection of mandatory questions), and obtain input on the clarity of the instrument. Round 1 included five participants who were SHIEC members recruited by SHIEC; this round focused on areas of the survey to cut and/or restructure, the clarity of questions, important concepts that may have been missed, and specific questions flagged for feedback. We followed our standard approach to survey pilot testing. Specifically, we sent the pilot respondents the survey with instructions asking them to complete it, track the time it takes, and note any questions that are confusing or otherwise problematic. We then conducted a phone interview with each respondent to review their feedback. The goals of the interview were to ensure that (1) respondents understand the questions as intended and (2) the questions are written in a manner that respondents can answer. Through these interviews, we identified potential response errors as well as errors in question interpretation. We then updated the survey questions after Round 1 so that the questions were clearer, more comprehensible, and better captured the concepts we are trying to ascertain. The revised survey from Round 1 was used for Round 2 pilot testing, which included four respondents. Round 2 focused on gathering feedback on the estimated time required for completion, the clarity of questions, important concepts that may have been missed, and responses to specific flags in the survey. After each round of pilot testing, we prepared a tracked-changes version of the survey for ONC that briefly summarized the results of the pilot testing and made recommendations for survey modifications to facilitate ONC review.


5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The information for this study is being collected by the UCSF Center for Clinical Informatics and Improvement Research, on behalf of ONC. With ONC oversight, UCSF is responsible for the study design, instrument development, data collection, analysis, and report preparation.

The instrument for this study and the plans for statistical analyses were developed by Dr. Julia Adler-Milstein with input from ONC. The staff team is composed of Dr. Julia Adler-Milstein, Principal Investigator, and Anjali Garg, Project Manager. Contact information for these individuals is provided below.

Name                    Email                   Phone
Julia Adler-Milstein    [email protected]       415-476-9562
Anjali Garg             [email protected]       415-502-3729





