Changes to Previous Documents based on the 03/19/09 OMB conference call
General issue #1 (Research Methodology):
Clarify the timing of the follow-up for agencies that do not respond to the initial request to complete the Web-based survey
Provide a summary/outline of the Web-based survey in the initial request to complete the Web-based survey rather than the survey itself
Provide the hard copy of the Web-based survey as an item in one of the follow-up reminders (rather than with the initial invitation) to the agencies that do not complete the survey within the initial timeframe
Responses to General issue #1 (Research Methodology):
A. The re-contact schedule has been changed, as have the materials provided to the agencies at each time point. The new schedule is as follows:
Initial notification to home health agencies participating in the P4P Demonstration
Materials/Method:
Notification letter addressed by name to the administrator or Director of Nursing for participating home health agencies from the CMS P4P Demonstration Evaluation Project Officer (William Buczko, PhD) inviting their participation in completing the Web-based survey
Information sheet that 1) outlines items that will be included in the survey and can be used as a navigation aid while completing the Web-based survey, 2) provides the URL address for accessing the Web-based survey, 3) reiterates the security protocols in place to ensure that the information that is provided will remain secure, 4) includes the date for completing the Web-based survey (the work day nearest to two weeks and three days from the date of the mailing), and 5) provides an abbreviated summary of expected follow-up contacts if the Web-based survey is not completed by the specified date.
First follow-up (within two working days after specified date in the initial notification)
Materials/Method:
Email sent to administrator or Director of Nursing with a colorful, animated reminder message about completing the Web-based survey and the new date to complete (one week from the date of the email).
Second follow-up (within two working days after date specified in first follow-up)
Materials/Method:
Letter addressed by name to administrator or Director of Nursing from the CMS contractor (University of Colorado, Denver (Anschutz Medical Center)) with a request to complete the Web-based survey, statistics on how many agencies have already completed the survey, the challenge to “be counted”, and a new completion date of one week and three days from the date of the mailing.
Third follow-up (within two working days after the date specified in the second follow-up)
Materials/Method:
Letter addressed by name to administrator or Director of Nursing from the CMS contractor (University of Colorado, Denver (Anschutz Medical Center)) with a hard copy of the survey instrument. The letter will explain the agency’s option either to complete the hard copy of the survey, mail it back to the CMS contractor, and have the contractor enter their data or use the hard copy as a guide when they complete the Web-based survey themselves. The new date to complete will be one week and three days from the date of the mailing.
Fourth follow-up (within two working days after the date specified in the third follow-up)
Materials/Method:
Personal phone call to administrator or Director of Nursing from the CMS contractor (University of Colorado, Denver (Anschutz Medical Center)). The phone call will follow a script where the goal is to gather the data needed to complete the Web-based survey.
The entire period from initial contact to fourth follow-up (if needed) is approximately two calendar months.
As detailed in the previous section, HHAs will not receive a hard copy of the survey in the initial contact announcing the Web-based survey. The cover memo with directions for the Web-based survey includes an outline of the information elements contained in the survey.
The hard copy of the survey is now provided in the third follow-up rather than in the initial contact, as it was in the previous version of the proposed follow-up contacts.
General issue #2 (Instrument Item Specific):
Remove the references to “Treatment” and “Control” groups from cover page information
Clarify/Specify the meaning of the word “Change” in item #4
Item #5 is an example of an item that presents too large a cognitive burden for the user (same comment about item #17)
Provide options for user for items 6a and 7a
Randomize (preferably) the order of the long lists of alternatives for “check all that apply” items
Consider repositioning item 12
Items 14 – 17 appear to be “opinion” questions. How are you going to use these data? Are there other data available to gather this information?
The total number of items (<20) does not seem to be problematic, even if some of the individual items are probably too cognitively burdensome.
Response to General issue #2 (Instrument Item Specific):
The words “Treatment” and “Control” no longer appear on the Web surveys.
The original item #4 was split into two items.
This item was redesigned to simplify the options.
The open-ended portion of this item was replaced with a restricted set of options.
The list of items was reviewed to establish commonality among items. Specific cognitive burden questions addressed the efficacy of randomizing and the burden of completing the items. After reviewing the results of this cognitive burden testing (see results described in section “h”) and recognizing the limitations of the software used to create the Web-based survey, the decision was made to not randomize the short list of items for the “Mark all that apply” questions.
This item was redesigned to provide better clarity to the difference between “local actions” and “corporate actions”.
An introduction to this group of questions was created to emphasize that these were opinion questions and that the data would be corroborated using other secondary data available through CMS.
We conducted cognitive burden testing as part of the revision to the previous survey instruments proposed for this study. Cognitive burden testing mirrored common practice as described in the scientific literature and used the actual instrument as delivered via the Web site. Three senior clinicians, each with numerous years in leadership positions in home health agencies, participated in the cognitive burden testing. Each of the three senior clinicians also reviewed the revised survey items prior to completing the Web-based version of the instrument and undergoing cognitive burden testing on that instrument. Twenty-five specific cognitive probes were used throughout the interview/testing process, such as “Please think ‘aloud’ as you answer this question. Please tell me how you chose your answer. What did you have to think about? Do the column headings for the matrix make sense to you? Why/Why not?” The specific responses are documented in the PRA package. Survey items and the Web-based format were revised based on the responses to these cognitive probes. Based on the comments made during the cognitive burden testing, 3 changes to the cover memo and 11 changes to the Web-based survey were made.
The alternatives for items with lists, e.g., items #8 – 11, were reviewed during the cognitive burden testing to evaluate any “fatigue” issue in selecting “all that apply” on these items. Each participant was adamant that there was no fatigue, and all participants chose options from the beginning, middle, and end of the lists. The fixed grouping of items, e.g., communications items followed by business-related items, was identified as easing the cognitive burden when choosing answers to these items.
Summary of Changes (06/03/09)
File Type | application/msword |
File Title | Cognitive Testing of Web-based Survey Items |
Author | Gene Nuccio |
Last Modified By | Gene Nuccio |
File Modified | 2009-06-03 |
File Created | 2009-06-03 |