Attachment 5 - P4P_Eval - Part A_Attachment_2_ OMB Conf Call Resp w Appendices 052209

Evaluation of the Home Health Pay for Performance Demonstration: Survey instrument

OMB: 0938-1064

Issues raised on the 03/19/09 OMB conference call and specific actions taken to address these issues


Attendees: Bonnie Harkless, William Buczko (CMS)

Bridget Dooling, Shelley Martinez (OMB)

Eugene Nuccio, Angela Richard (UCD, AMC)


General issue #1: There was a large change in the estimated response rate between the information provided in the original (May 2008) documentation and the estimated response rate provided in the follow-up responses to OMB Passback questions (January 2009). Why?


Specific OMB suggestions related to General issue #1:

  1. Clarify the timing of the follow-up for agencies that do not respond to the initial request to complete the Web-based survey

  2. Provide a summary/outline of the Web-based survey in the initial request to complete the Web-based survey rather than the survey itself

  3. Provide the hard copy of the Web-based survey as an item in one of the follow-up reminders (rather than with the initial invitation) to the agencies that do not complete the survey within the initial timeframe


Response to General issue #1:

General issue #1 encompasses several points. Each is addressed below, along with the linkages between the general issue and the specific OMB suggestions.


Regarding the increased response rate for the survey, as specified in the response to the OMB Passback questions, the revised rates were based on a much more aggressive approach to the use of multiple modalities when re-contacting agencies. This more aggressive approach and the use of multiple modalities for re-contacting are supported by the research and principles set forth by Dillman and others (1998, 2007). Additionally, the character of the home health agencies involved in the P4P Demonstration is clarified below to show that this group is highly motivated to participate in these kinds of activities.


The re-contact schedule has been changed, as have the materials provided to the agencies at each time point. The new schedule is as follows:

  1. Initial notification to home health agencies participating in the P4P Demonstration

Materials/Method:

    1. Notification letter addressed by name to the administrator or Director of Nursing for participating home health agencies from the CMS P4P Demonstration Evaluation Project Officer (William Buczko, PhD) inviting their participation in completing the Web-based survey

    2. Information sheet that 1) outlines items that will be included in the survey and that can be used as a navigation aid while completing the Web-based survey, 2) provides the URL address for accessing the Web-based survey, 3) reiterates the security protocols in place to ensure that the information provided will remain secure, 4) includes the date for completing the Web-based survey (the work day nearest to two weeks and three days from the date of the mailing), and 5) provides an abbreviated summary of expected follow-up contacts if the Web-based survey is not completed by the specified date.

  2. First follow-up (within two working days after the specified date in the initial notification)

Materials/Method:

    1. Email sent to the administrator or Director of Nursing with a colorful, animated reminder message about completing the Web-based survey and a new completion date (one week from the date of the email).

  3. Second follow-up (within two working days after the date specified in the first follow-up)

Materials/Method:

    1. Letter addressed by name to the administrator or Director of Nursing from the CMS contractor (University of Colorado, Denver (Anschutz Medical Center)) requesting completion of the Web-based survey, along with statistics on how many agencies have already completed the survey, a challenge to “be counted”, and a new completion date of one week and three days from the date of the mailing.

  4. Third follow-up (within two working days after the date specified in the second follow-up)

Materials/Method:

    1. Letter addressed by name to the administrator or Director of Nursing from the CMS contractor (University of Colorado, Denver (Anschutz Medical Center)) with a hard copy of the survey instrument. The letter will explain the agency’s options: either complete the hard copy of the survey, mail it back to the CMS contractor, and have the contractor enter the data, or use the hard copy as a guide while completing the Web-based survey. The new completion date will be one week and three days from the date of the mailing.

  5. Fourth follow-up (within two working days after the date specified in the third follow-up)

Materials/Method:

    1. Personal phone call to the administrator or Director of Nursing from the CMS contractor (University of Colorado, Denver (Anschutz Medical Center)). The phone call will follow a script whose goal is to gather the data needed to complete the Web-based survey.

The entire period from initial contact to fourth follow-up (if needed) is approximately two calendar months.
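
For illustration only, the timing rules above can be expressed as a small date computation. The sketch below is a hypothetical Python rendering of the schedule, not part of the approved protocol; it assumes a Monday-Friday working week, and all names are our own.

    from datetime import date, timedelta

    def add_working_days(start: date, n: int) -> date:
        """Advance n working days (Mon-Fri), skipping weekends."""
        d = start
        while n > 0:
            d += timedelta(days=1)
            if d.weekday() < 5:  # Monday=0 ... Friday=4
                n -= 1
        return d

    def nearest_work_day(d: date) -> date:
        """Shift a weekend date to the nearest working day."""
        if d.weekday() == 5:   # Saturday -> back to Friday
            return d - timedelta(days=1)
        if d.weekday() == 6:   # Sunday -> forward to Monday
            return d + timedelta(days=1)
        return d

    def contact_schedule(mailing: date) -> dict:
        """Contact dates implied by the re-contact schedule above."""
        events = {"initial mailing": mailing}
        # Initial due date: work day nearest to two weeks and three days out.
        due = nearest_work_day(mailing + timedelta(days=17))
        events["initial due date"] = due
        # Completion windows granted at each follow-up: one week (email),
        # then one week and three days (each letter); the phone call
        # (fourth follow-up) sets no new date.
        windows = [7, 10, 10, None]
        for i, window in enumerate(windows, start=1):
            sent = add_working_days(due, 2)  # within two working days
            events[f"follow-up {i} sent by"] = sent
            if window is not None:
                # Weekend due dates shifted to the nearest work day (an
                # assumption extending the initial-notification rule).
                due = nearest_work_day(sent + timedelta(days=window))
                events[f"follow-up {i} due date"] = due
        return events

    for event, when in contact_schedule(date(2009, 6, 1)).items():
        print(f"{event}: {when:%a %Y-%m-%d}")

Running the sketch from a hypothetical June 1, 2009 mailing places the fourth follow-up in late July, consistent with the approximately two-month window stated above.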


In addition to this multi-modality approach to increasing response rates, the 570 home health agencies that are eligible participants in the Web-based survey are highly motivated. First, each of these home health agencies volunteered to participate in the P4P Demonstration project. Second, half of these home health agencies are eligible for potentially significant monetary awards based on their performance during each calendar year of the project. Third, based on site visit focus groups conducted with participating home health agencies across the four regions of the country, the volunteer agencies were high performers and highly motivated organizations prior to beginning the incentive-based demonstration.


In summary, the completion rate of approximately 80% for the Web-based survey, as projected in the response to the OMB Passback questions, is a reasonable estimate based on the aggressive use of a multi-modality, repeated-contact approach with these home health care agencies. Additionally, the home health agencies in this study have demonstrated over several years that they are highly motivated to present themselves as high performers. Finally, as discussed in the next section of this response, the Web-based survey was designed to be user-friendly in both its interface and its content. The ease of answering the Web-based survey items will further enhance completion rates.
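
As a purely arithmetic illustration of how repeated, multi-modality contacts compound into a high overall completion rate, the following Python sketch applies hypothetical per-wave response probabilities (illustrative values only, not estimates from the study) to the 570 eligible agencies.

    # Hypothetical completion probabilities among agencies still
    # outstanding at each contact; illustrative values only, not
    # figures from the P4P evaluation.
    waves = {
        "initial mailing": 0.35,
        "email reminder": 0.20,
        "letter with completion statistics": 0.20,
        "letter with hard-copy survey": 0.25,
        "telephone follow-up": 0.40,
    }

    agencies = 570
    remaining = float(agencies)
    completed = 0.0
    for wave, p in waves.items():
        completed += remaining * p
        remaining -= remaining * p
        print(f"after {wave}: {completed / agencies:.1%} cumulative")

Under these assumed wave-level rates, the cumulative completion rate passes 80% only after the telephone follow-up, which is the arithmetic intuition behind projecting roughly 80% from the full five-contact sequence.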

General issue #2: There was a concern about the cognitive complexity of specific items on the Web-based survey instrument. What cognitive testing was done to assess the cognitive burden of these items?


Specific OMB suggestions related to General issue #2:

  1. Remove the references to “Treatment” and “Control” groups from cover page information

  2. Clarify/Specify the meaning of the word “Change” in item #4

  3. Item #5 is an example of an item that presents too large a cognitive burden for the user (same comment about item #17)

  4. Provide options for the user for items 6a and 7a

  5. Randomize (preferably) the order of the long lists of alternatives for “check all that apply” items

  6. Consider repositioning item 12

  7. Items 14 – 17 appear to be “opinion” questions. How are you going to use these data? Are there other data available to gather this information?

  8. The total number of items (<20) does not seem to be problematic, even if some of the individual items are probably too cognitively burdensome.


Response to General issue #2:

We support the concept of cognitive testing during the development of survey instruments such as the one proposed for this study. Cognitive testing of survey items during the development phase has been shown to be an effective tool for enhancing both the construct validity and the reliability of survey items (Beatty and Willis (2007), Goldenberg (1996), Levine et al. (2005), and Uhrig et al. (2002)). The initial Web-based survey instrument items were developed using a structured approach that is outlined in the following section. Based on a review of the literature on cognitive testing, the Web-based survey instrument was then revised and reviewed by senior clinical personnel with extensive home health experience. A significant new element in the development of the second Web-based survey was the cognitive testing of the actual instrument as delivered via the Web site. The details of the development of the second Web-based survey follow the next section.


Initial Survey Development Process

  1. Using the information presented to CMS as part of the University of Colorado, Denver (Anschutz Medical Center) response to the CMS request for proposals to evaluate the Pay for Performance (P4P) Demonstration project, a group of technical experts in survey design and home health practices developed survey items for both the Treatment and Control groups participating in the Demonstration project.

  2. These survey items were reviewed by three senior clinicians, all of whom had numerous years in leadership positions in home health agencies. They were asked to 1) review each item and suggest wording improvements and/or identify needed clarifications, and 2) keep track of their time in completing the survey items.

  3. The feedback on survey items was provided in both written and verbal form during an interview conducted after each senior clinician completed the survey.

  4. The information provided by the senior clinicians was incorporated into the Web-survey.

  5. The revised version was presented to each of the senior clinicians to ensure that questions/issues raised had been addressed satisfactorily.

  6. The revised version was presented as part of the PRA OMB package on May 16, 2008.


Second Survey Development Process

  1. Based on information shared during a conference call with OMB on March 19, 2009 to discuss the Web-survey, the following cognitive testing procedures were used to revise the proposed survey items.

    1. Each item in the survey was reviewed and revised using the suggestions provided by OMB. Additionally, background methodology in cognitive interviewing and survey design was reviewed, including articles/reports by Beatty and Willis (2007), Goldenberg (1996), Levine et al. (2005), and Uhrig et al. (2002).

    2. Special attention was paid to 1) ensuring clarity of terminology in the item stem; 2) reducing the complexity of the matrices used to capture answers; 3) reducing the number of open-ended questions during the redesign of the items; and 4) splitting items that contained multiple constructs, e.g., readiness and openness, into separate items.

    3. Each of the three senior clinicians involved in the previous review also reviewed the revised survey items using the same review protocol as used in the initial survey development process (see item 2 in the previous section).

    4. The senior clinician suggestions were incorporated into the revised survey and the materials were provided to the Web-survey programmer.

    5. After the survey items were transformed into their Web-based format, each of the senior clinicians was asked to complete the survey using the Web-based format while being interviewed by a senior member of the project team.

    6. Specific cognitive probes were used throughout the interview/testing process, such as “Please think ‘aloud’ as you answer this question. Please tell me how you chose your answer. What did you have to think about? Do the column headings for the matrix make sense to you? Why/Why not?” The specific responses by the senior clinicians to the 25 cognitive burden questions related to the survey are included in Appendix B of this document.

    7. Survey items and the Web-based format were revised based on the responses to these cognitive probes. Based on the comments made during the cognitive burden testing, 3 changes were made to the cover memo and 11 changes to the Web-based survey.

  2. General format and presentation changes were also incorporated into the revised instrument. These included:

    1. Removing references to “Treatment” or “Control” from the instrument.

    2. The alternatives for items with lists, e.g., items #8 – 11, were reviewed during the cognitive burden testing to evaluate any “fatigue” issue in selecting “all that apply” on these items. Each participant was adamant that there was no fatigue, and all participants chose options from the beginning, middle, and end of the lists. The fixed, grouped order of the options, e.g., communications items grouped together followed by business-related items, was identified as easing the cognitive burden when choosing answers to these items (a sketch of the randomization alternative suggested by OMB follows this list).

    3. Specific directions for items 14 – 17 were created to indicate that these are opinion/perception questions and that secondary data will be used to measure these areas.
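
Although the fixed, grouped option order was retained after the fatigue testing described above, the randomization that OMB suggested could be implemented along the following lines. This is a hypothetical Python sketch; the option labels are illustrative stand-ins and do not reproduce the actual item 8 list.

    import random

    # Illustrative stand-ins for a "check all that apply" option list;
    # these are not the actual item 8 policy options.
    policy_options = [
        "revised communication policies",
        "new referral procedures",
        "updated billing practices",
        "staff incentive programs",
    ]

    def randomized_options(options, respondent_key):
        """Return a respondent-specific ordering. Seeding on a stable
        key (e.g., the agency's CCN) keeps each respondent's order
        reproducible across sessions."""
        rng = random.Random(respondent_key)
        shuffled = list(options)
        rng.shuffle(shuffled)
        return shuffled

    print(randomized_options(policy_options, "CCN-123456"))

The cognitive testing results above (no fatigue, answers drawn from all regions of the lists) supported retaining the fixed, grouped order instead of this per-respondent shuffle.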


Survey Results Within the Context of the Evaluation of the CMS P4P Demonstration Project Research Plan

The results from the Web-based survey represent one element in the overall analysis plan for the evaluation of the CMS P4P Demonstration. These results, coupled with qualitative data gathered from eight site visit focus groups, represent the primary data for this evaluation.


Even combined, these primary data are dwarfed by the secondary data that will be used in this study. All OBQI episodes of care for calendar years 2007 – 2009 for all 570 volunteer home health agencies, plus all OBQI episodes of care for all other home health agencies in the seven states (MA, CT, TN, AL, GA, IL, and CA) participating in the P4P Demonstration will be analyzed. These data will probably exceed two million episodes of care. These data will be used to determine the patient outcome performance of the Treatment, Control, and non-participant home health agencies on the target OBQI measures on which monetary awards are based. In addition to these patient outcome performance data, health care claims for patients served by the 570 participant organizations during calendar years 2008 and 2009 will be analyzed to identify Medicare cost differences between the Treatment and Control agencies. Finally, cost report data (financial data related to the operation of the home health agency) for the 570 participant organizations during calendar years 2008 and 2009 will be analyzed to determine if Treatment agencies spent more per patient after controlling for case mix differences than Control agencies.
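
A minimal sketch of the kind of Treatment/Control comparison described above, assuming an episode-level extract with an outcome indicator, a study-group label, and a case-mix score; the file name and column names are hypothetical, and the evaluation's actual statistical models are not specified here.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical episode-level extract; the file and column names are
    # illustrative only, not the evaluation's actual data layout.
    episodes = pd.read_csv("obqi_episodes_2007_2009.csv")

    # Unadjusted rates on a target OBQI measure by study group
    # (treatment / control / nonparticipant) and calendar year.
    print(episodes.groupby(["group", "year"])["target_outcome"].mean())

    # Case-mix-adjusted comparison: the group coefficients estimate
    # performance differences after controlling for a case-mix score,
    # paralleling the adjusted cost comparisons described above.
    model = smf.ols("target_outcome ~ C(group) + case_mix_score",
                    data=episodes).fit()
    print(model.summary())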


In summary, the concerns about the cognitive burden presented by the Web-based survey items have been addressed by:

  1. redesigning, where necessary, the initially proposed items to create more singularly focused items

  2. refining the redesigned items based on comments/interviews with experienced health care professionals

  3. completing a formalized cognitive testing of the revised survey items as they will be presented with the Web interface

  4. revising the survey items based on the cognitive testing results.


References


Beatty, PC and Willis, GB, “Research Synthesis: The Practice of Cognitive Interviewing”, Public Opinion Quarterly, May 2007, pp. 1-25.

Dillman, DA; Tortora, RD; and Bowker, D, “Principles for Constructing Web Surveys”, SESRC Technical Report 98-50, Pullman, WA, 1998.

Dillman, DA, Mail and Internet Surveys: The Tailored Design Method, Second Edition, 2007 Update, John Wiley: Hoboken, NJ, 2007.

Goldenberg, KL, “Using Cognitive Testing in the Design of a Business Survey Questionnaire”, American Association for Public Opinion Research, Salt Lake City, UT, May 1996.

Levine, RE; Fowler, Jr., FJ; and Brown, JA, “Role of Cognitive Testing in the Development of CAHPS Hospital Survey”, Health Services Research, Vol. 40, No. 6, Dec. 2005, pp. 2037-2056.

Uhrig, JD; Squire, C; McCormick, LA; Bann, C; Hall, PK; An, C; and Bonito, AJ, “Questionnaire Development and Cognitive Testing Using Item Response Theory (IRT)”, Final Report presented to the Centers for Medicare & Medicaid Services, Baltimore, MD, Feb. 5, 2002.

Westat, “Survey of ATO Applicants 2000: Methods Report”, Report submitted to the National Institute of Standards and Technology, Advanced Technology Program, Gaithersburg, MD, December 2003.




Appendices
Appendix A ----- Revised Web-based survey instrument

Appendix B ----- Results from the cognitive burden testing

Appendix A: Web Survey


See separate .pdf attachment for Web Survey instrument.

Appendix B: Cognitive Burden questions and responses


COGNITIVE TESTING FOR SURVEY WEB SITE DISPLAYS


Explanation of the questioning process:

  1. Requested by OMB

  2. Technique to document (to some extent) the underlying thought processes used by an end-user when trying to answer survey questions

  3. Uses leading questions such as

    1. “What were you thinking when you read….?”

    2. “What did you need to recall in order to answer….”

    3. “How could the item be restructured to make answering the item easier?”

    to document underlying thought processes and to estimate the difficulty/burden imposed by the survey items

  4. No “right or wrong”; documents individual styles of approaching how to answer standard questions


COGNITIVE BURDEN TESTING PROBES

  1. As you read the information/direction sheet, what were you anticipating about the Web-based survey? Karin: Helpful. Angela: brief, on-line survey; data will be useful; add CCN reference. Kathy (post changes): relatively easy to follow; paragraph on radio buttons a little complicated.

  2. Were there any errors when you tried to access the URL and the opening pages of the survey? If so, what were they? Why do you think they occurred? Karin: No errors; would prefer “Start” to be capitalized. Angela: No errors. Kathy (post changes): No errors.

  3. Describe what you needed to recall when you completed the first two Web screens prior to beginning the survey items. Karin: CCN and password. Angela: very clear. Kathy (post changes): No major problems; CCN value very common knowledge for HHAs; liked the opportunity to print hard copy of survey especially if the task of actually filling in the survey is going to be delegated to someone else in the organization.

  4. The first three items are basic demographic information. Was there any confusion regarding how to enter your answers to these items? If so, what? Karin: no problems; did hit “enter”, used back arrow to return; emphasize “use tab”. Angela: very clear. Kathy (post changes): No problems here.

  5. Item 4 is the first of several pre-filled items. Did you experience any problems clicking on the radio buttons? If so, what problems? Karin: no; even checked the “fill in”. Angela: No problems. Kathy (post changes): None; very easy.

  6. What information did you need to recall to answer item 4? Rate the difficulty in answering this question (1 = extremely easy; 10 = extremely hard). Give a verbal description of what your rating means, e.g., “not too hard”. Karin: be aware of all staff changes; “6” = need to know when changes occurred and number of changes. Angela: May need to get this information from others; larger agencies will find this harder than smaller agencies; “2” = getting information may be harder for larger agencies. Kathy (post changes): Think about when these events happened; “3” depending upon need to find the information on staffing.

  7. Item 5 asks about staff turnover. Did you understand that the item implied that if there was no one in that position before 2008 and no one was hired during 2008, there was no turnover in 2008? Karin: Yes. Angela: Should contracted staff be included in (or excluded from) this question? Suggested wording clarification to explicitly exclude contracted staff. Kathy (post changes): Focused on not “adding or subtracting” staff, but on replacing existing staff with no net change to the number of staff.

  8. Item 6 has a complex option (f. "Combination" position(s) that include two or more of the "a-e" functions). What were you thinking when you read that stem? Karin: went back to previous options to ensure that there were no redundant answers. Angela: makes sense. Kathy (post changes): Immediately thought that outcome analysis and QI could be an example of this combination position.

  9. Item 6 required you to click on a radio button to the left of the pre-filled button. Was this in any way a problem when compared with clicking on radio buttons to the right or left of the pre-filled button? Karin: no problem. Angela: no. Kathy (post changes): No problem.

  10. Item 7 uses an acronym (QIO) and uses names of home health care outcome measures. Were there any terms used on this item that were confusing or unknown to you? If so, what terms? Karin: No. Angela: No. Kathy (post changes): No.

  11. What information did you need to recall to answer item 7? Rate the difficulty in answering this question (1 = extremely easy; 10 = extremely hard). Give a verbal description of what your rating means, e.g., “not too hard”. Karin: What we did; “6” = need to confirm or check info to answer Q. Angela: Get info from QI person; “2” = need to get info, but not hard to rate. Kathy (post changes): What my agency was doing in CY2008; “4” = may be a little confused about the actual dates of events; looking at a calendar might help.

  12. Complete items 8 – 11. Each of these items asks you to “check all that apply”. Angela: 8a needs another “ mark; 9 has spelling error “oversight”; NOTE: all three testers selected items from all “regions” of the answer list (beginning, middle, end).

  13. When you were thinking about how to answer Item 8 (Policies), how would the order of the options in the item have helped you to answer the item more easily/quickly? Karin: order is fine; no big deal. Angela: order was not a problem; order is OK. Kathy (post changes): set-up was fine; communication items grouped; business items grouped.

  14. Did you become fatigued while reading the options for this item? If so, did this make you stop answering the item before you read/considered all of the options? Karin: No. Angela: No. Kathy (post changes): No.

  15. Repeat Questions 13 and 14 for Items 9 – 11. Karin: No order difficulties; no difficulty getting through both of the lists. Angela: items grouped OK; not fatigued. Kathy (post changes): OK flow for each item; no fatigue.

  16. How clear was the distinction between corporate initiatives vs. local initiatives (Item 12) with regard to the P4P Demonstration? If unclear, how could the item be restructured to make answering the item easier? Karin: Clarify/define corporate/chain; no change = no program. Angela: unclear meaning of corporate; define as multi-agency corporate chain. Kathy (post changes): Corporate distinction very clear (asked if I wanted her to make-believe she was part of a corporation); marking “NA” if not multi-agency group very clear.

  17. What were you thinking as you tried to rate the impact of local and regional situations on your HHA’s activities during CY2008? Rate the difficulty in answering this question (1 = extremely easy; 10 = extremely hard). Give a verbal description of what your rating means, e.g., “not too hard”. Karin: I was trying to remember what happened locally; “4” = these are “biggies” and would be easy for the administrator to be aware of these. Angela: spelling 13a (“aides”); “5” = need to figure out if a change occurred and then what the impact was. Kathy (post changes): needed to think about two things—first, whether change occurred and then how they influenced the agency; “5” = middle; some items would be very easy (nurses available), others more complex (local medical practices).

  18. Complete items 14 – 16. How well do the directions and the structure of the items convey that we are asking for your opinions about finances, quality improvement, and P4P Demonstration impact? Karin: pretty clear; estimate/approximate answers are fine. Angela: good; you say in your stems “do you think”. Kathy (post changes): Clear; item says “best estimate” and “you”.

  19. Which of items 14 – 16 was the most difficult to answer? Describe why the item was so challenging to answer. Karin: #14 due to lack of information; would want to be at least in the “ballpark” with estimate of profitability. Angela: All items were pretty simple; none were difficult. Kathy (post changes): #16 because #14 and 15 focus on “my agency” whereas #16 includes a look statewide which could be difficult in larger states.

  20. Complete items 17a – c. How easy was it to make the distinctions among “commitment”, “readiness”, and “willingness to sustain” for these three items? Karin: Easy. Angela: Correct stem underlines to match option title in 17a and 17c; clear distinction. Kathy (post changes): Commitment is the most difficult to judge because a person may have started out very committed but then drifted off the high level of commitment.

  21. Rate the difficulty in answering the most difficult section of this question (1 = extremely easy; 10 = extremely hard). Give a verbal description of what your rating means, e.g., “not too hard”. Karin: “2” = very easy. Angela: “2” = answered “off the top of my head”. Kathy (post changes): #17a would be the hardest, “6”, because if there was a change in the level of commitment, choosing a single level could be a challenge.

  22. What were you thinking when you read Item 18? Karin: try to recall information. Angela: this is easy. Kathy (post changes): Knowledge of (awareness of) any feedback would be helpful.

  23. Item 19 is the only open-ended item on the survey. Were there other items on the survey where you wanted to provide a long, open-ended response? Which item(s)? Why? Karin: Nothing comes to mind. Angela: No. Kathy (post changes): Gosh, no!

  24. Describe in a few words your experience when you exited the survey. Karin: placement of “Ready to submit” is confusing. Angela: re-arrange “Ready to submit” placement. Kathy (post changes): Glad it worked; easy.

  25. Thinking about the entire survey, rate its difficulty/challenge (1 = extremely easy; 10 = extremely hard). Give a verbal description of what your rating means, e.g., “not too hard”. Karin: “5” = medium; not a burden; does take time; may need to ask others for input. Angela: “3” = depends upon how much I would need to go back to get data. Kathy (post changes): “3” = not particularly difficult; may vary some by size of HHA and position of person completing the survey.



