Generic Supporting Statement B
(CMS-10694; OMB 0938-New)
TABLE OF CONTENTS
B1. Respondent Universe and Respondent Selection
B2. Data Collection Procedures
B3. Response Rates and Non-Response
B4. Tests of Procedures or Methods
Data Use
Data collection is expected to occur between approximately 2020 and 2022.
The data collected under this clearance request will be used for survey research and testing activities. Consequently, many of the small-scale methods proposed (cognitive testing, focus groups, usability testing, pilot testing, respondent debriefing, etc.) will involve respondents selected either to cover a broad range of demographic subgroups of interest or for specific characteristics related to the specific content being tested. For other methods, including pilot tests, split ballot experiments or other methodological experiments, statistically representative subsamples of existing survey panel respondents will be selected for inclusion. A description of the plans for selecting respondents for each individual test will be provided to OMB at the time the testing plans are submitted.
While survey research employs a quantitative methodology and relies on a relatively large population-based probability sample to support statistical inference and representativeness, cognitive testing, usability testing and focus groups conducted in a laboratory usually employ a qualitative methodology and generally rely on relatively small samples. Unlike survey research, the primary objective of such testing is not to produce statistical data that can be generalized to an entire population. Rather, the objective is to provide an in-depth exploration of particular concepts, processes and/or patterns of interpretation. Cognitive interviewing samples generally do not achieve full inclusivity of all social and demographic groups. However, as a general rule, respondents are selected for inclusion based on similarity of characteristics to survey respondents.
Selecting and Recruiting Entities for Pilot Testing
For pilot testing that requires random assignment of individuals within entities (e.g., hospitals, emergency departments, hospices, physician groups), entities will be sampled subject to minimum sample size constraints that ensure adequate power based on the proposed testing design. If testing results are intended to be nationally representative, sampling will be done by proportionately stratifying on selected entity characteristics, and recruitment will proceed using randomized queues. For example, if hospital emergency departments (EDs) are to be recruited, queues may be constructed based on the annual number of ED visits (to denote a small, medium, or large ED) and geographic region (Northeast, Southeast, Northwest, Southwest). EDs within each queue would be randomly sorted, and recruitment would proceed through each queue in order. Following the model used to recruit entities for past survey testing, we will conduct initial outreach to entities to secure an initial agreement to participate. Once entities have agreed to participate, CMS' contractor will follow up to discuss data transmission requirements and secure fully executed business associate agreements/data use agreements. If testing results are not intended to be nationally representative, proportional stratification and recruitment by queues are not necessary.
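As a rough illustration only, the sketch below (in Python) shows one way proportionately stratified, randomly ordered recruitment queues could be constructed for EDs. The sampling frame fields (ed_id, annual_visits, region) and the visit-volume cut points are illustrative assumptions, not part of any approved design.

# Illustrative sketch: build proportionately stratified, randomly ordered
# recruitment queues for EDs. Field names and cut points are hypothetical.
import random
from collections import defaultdict

def size_stratum(annual_visits):
    # Hypothetical cut points denoting small, medium, and large EDs.
    if annual_visits < 20_000:
        return "small"
    if annual_visits < 50_000:
        return "medium"
    return "large"

def build_recruitment_queues(frame, seed=12345):
    """Group EDs into strata by size and region, then randomly sort each queue."""
    rng = random.Random(seed)
    queues = defaultdict(list)
    for ed in frame:
        stratum = (size_stratum(ed["annual_visits"]), ed["region"])
        queues[stratum].append(ed["ed_id"])
    for stratum in queues:
        rng.shuffle(queues[stratum])      # randomized recruitment order within each queue
    return queues

def allocate_targets(frame, total_target):
    """Proportionate allocation: each stratum's target reflects its share of the frame."""
    counts = defaultdict(int)
    for ed in frame:
        counts[(size_stratum(ed["annual_visits"]), ed["region"])] += 1
    n = len(frame)
    return {s: round(total_target * c / n) for s, c in counts.items()}

# Usage: recruit EDs from the top of each queue, in order, until the
# stratum's target number of participating EDs is reached.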
Depending on the details of the testing design, eligibility for participation may depend on the capabilities of the entity. For example, it may be necessary for the entity to have the capability to collect and transfer email addresses and cell phone numbers and to comply with the Telephone Consumer Protection Act (TCPA).
The number of individuals to be sampled within each entity and overall will depend on the testing design. Sample size determinations will be based on (1) the ability to detect a small difference in response rates between two protocols being tested and/or (2) the ability to detect a small difference in response patterns between two protocols being tested. Depending on the sample size required and the time period allotted, it may not be feasible for smaller entities to participate in testing. For example, for a testing design aiming to develop mode adjustments in an ED setting for various mode protocols, some of which involve web, a total of 31,000 patients would be sampled: 620 patients from each of 50 hospitals over a 3-month period. An ED with fewer than 620 patients discharged home over a 3-month period would likely not be eligible to participate in the test.
Power calculations will incorporate any known information about survey response rates relevant to the components being tested. In general, sample size calculations will be based on being able to detect a minimum difference in response rates (between tested mode protocols) ranging from 2% to 5% and a minimum difference in response patterns ranging from an effect size of 0.2 to 0.4 (Cohen's d).
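As a hedged illustration of the kind of calculation involved, the sketch below computes per-arm sample sizes for both criteria using Python's statsmodels package. The specific response rates, effect size, alpha, and power shown are example values only, not the design parameters of any planned test.

# Illustrative power calculation (example values only): per-arm sample size
# needed to detect (1) a 3-percentage-point difference in response rates
# (27% vs. 30%) and (2) a response-pattern difference of Cohen's d = 0.3,
# with two-sided alpha = 0.05 and 80% power.
from statsmodels.stats.power import NormalIndPower, TTestIndPower
from statsmodels.stats.proportion import proportion_effectsize

alpha, power = 0.05, 0.80

# (1) Response-rate comparison: convert the two proportions to Cohen's h,
# then solve for the per-arm sample size.
h = proportion_effectsize(0.30, 0.27)
n_rates = NormalIndPower().solve_power(effect_size=h, alpha=alpha, power=power)

# (2) Response-pattern comparison on a continuous score, Cohen's d = 0.3.
n_patterns = TTestIndPower().solve_power(effect_size=0.3, alpha=alpha, power=power)

print(f"Per-arm n for response-rate test: {n_rates:.0f}")
print(f"Per-arm n for response-pattern test: {n_patterns:.0f}")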
Determining Individual Eligibility
Individual eligibility criteria will depend on the testing design and will be determined in consultation with CMS. Additional inclusion criteria will be applied to reflect the respective survey. For instance, Fee-For-Service (FFS) Consumer Assessment of Healthcare Providers and Systems (CAHPS) requires a 6-month minimum length of continuous enrollment, and In-Center Hemodialysis (ICH) CAHPS requires that patients receive outpatient care from an ICH facility for 3 months or longer. Generally, all individuals are eligible for inclusion in the sampling universe, with the exception of the following ineligible groups:
• Individuals under the age of 18 (except for pediatric surveys)
• Individuals known to have died
• Individuals who request that they not be contacted (those who sign "no publicity" requests while hospitalized or otherwise directly request not to be contacted)
• Individuals in court or law enforcement custody (i.e., prisoners)
• Individuals with a foreign (non-US or US territory) home address
• Individuals who are excluded because of state regulations that place further restrictions on which individuals may be contacted after discharge
In general, identification of individuals for exclusion will be based on administrative data provided by the entities.
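To make the exclusion logic concrete, the sketch below shows one way administrative records could be screened against these criteria. The record fields, flag names, and state-rule hook are illustrative assumptions, not a prescribed implementation; actual data elements will depend on what each entity provides.

# Illustrative screening of administrative records against the exclusion
# criteria above. Field names are hypothetical placeholders.
def is_eligible(record, pediatric_survey=False, state_excludes=None):
    """Return True if the sampled individual passes the general exclusion rules."""
    if record["age"] < 18 and not pediatric_survey:
        return False                        # minors (unless a pediatric survey)
    if record["deceased"]:
        return False                        # known to have died
    if record["no_contact_request"]:        # "no publicity" or direct request
        return False
    if record["court_or_law_enforcement"]:  # e.g., prisoners
        return False
    if record["foreign_address"]:           # foreign home address
        return False
    if state_excludes and state_excludes(record):
        return False                        # state-specific post-discharge restrictions
    return True

# Usage: sampling_universe = [r for r in admin_records if is_eligible(r)]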
Data collection procedures will depend on the testing design; we provide some examples below:
Cognitive Testing Interviews
Participants scheduled for in-person cognitive testing interviews will usually travel to the contractor's facilities. On rare occasions, a participant may be unable to travel to the intended location (e.g., an individual may be housebound or have limited mobility). In such cases, the interview may be conducted in his/her home or at a location normally frequented by the participant, such as a senior center. To reduce the number of "no shows" for cognitive interviews, participants scheduled more than a week in advance receive a reminder telephone call from survey staff the day prior to the scheduled interview.
When the respondent arrives for the scheduled interview, he/she is greeted by a survey staff person. The participant will then be brought to the interview room and asked to read (or have read aloud to them) a consent form. The form contains a brief description of the study and assurances of confidentiality, explains the voluntary nature of the study, and describes the risks and benefits. The need to record the interview (audio or video) is explained, and the respondent is asked to provide consent. In the rare instance that the participant consents to the cognitive interview but not to recording it, the session will be carried out but not recorded. If the respondent grants consent to record the interview but changes his/her mind while the session is being recorded, the interviewer will ask for verbal consent to retain the interviewing materials and the portion already recorded.
The interviewer, usually a survey methodologist or other project staff member, will begin the cognitive interview by reading a more detailed explanation of the purpose of the interview and the procedures to be used. Interviewing procedures vary depending on the specific testing technique to be applied. The selection of the technique is determined by the nature of the project or the stage of development of the questionnaire or set of questions under study. The most commonly used method is the cognitive interview with concurrent probing. In these interviews, respondents are presented with draft survey questions and asked to "think aloud" about how and why they answered as they did. The interviewer usually probes extensively to ascertain the degree of comprehension and the recall processes involved.
If possible, the cognitive interview will be conducted in the mode intended for the instrument: face-to-face, telephone, or self-administered using a paper or web-based instrument. For a telephone interview, the respondent is called from one room to another, and an in-person debriefing follows.
As described in Supporting Statement A, Section A.9, participants will receive a small incentive. They will be asked to sign a receipt form indicating receipt of the remuneration. Immediately following the interview, any hardcopy interviewing data (e.g., questionnaires) will be separated from any signed consent form and receipt form, so that no demographic information will be associated with the individual's name. Signed consent forms will be stored separately from any data collected from the participant. Audio and video recording files will be stored on the CMS contractor's secure servers, and the files will be deleted from the recording device.
Focus Groups
Participants scheduled for focus groups will usually travel to the contractor's facilities. To reduce the number of "no shows" for focus group sessions, participants scheduled more than a week in advance receive a reminder telephone call from survey staff the day prior to the scheduled session. When participants arrive, they are greeted by survey project staff and directed to the focus group room. Participants are given a consent form to read (or to have read to them by project staff). The consent form contains a brief description of the study and assurances of confidentiality, explains the voluntary nature of the study, and describes the risks and benefits. The need to record the focus group session (audio and/or video) is explained, and the participant is asked to provide consent.
Survey staff person(s) will moderate the focus group. Before the discussion begins, the moderator will distribute name tags and ask respondents to pick a name to put on the name tag; respondents will be told that they do not have to use their real names. The moderator will then describe the focus group process and ask if there are any questions. After all questions are answered, the moderator will begin the focus group discussion, following the moderator guide designed for that particular study.
Once the focus group has concluded, participants will receive a small incentive. They will be asked to sign a receipt form indicating receipt of the remuneration. Immediately following the session, any hardcopy materials completed by the participant will be separated from any signed consent form and receipt form, so that no demographic information will be associated with the individual's name. Signed consent forms will be stored separately from any data collected from the participant. Audio and video recording files will be stored on the CMS contractor's secure servers, and the files will be deleted from the recording device.
Usability Testing
Participants scheduled for usability testing will usually travel to the contractor's facilities. When the participant arrives for the scheduled session, he/she is greeted by a survey staff person. The participant will then be brought to the testing room and asked to read (or have read aloud to them) a consent form. The form contains a brief description of the study and assurances of confidentiality, explains the voluntary nature of the study, and describes the risks and benefits. The need to record the session (audio or video) is explained, and the respondent is asked to provide consent. In the rare instance that the participant consents to the usability session but not to recording it, the session will be carried out but not recorded. If the respondent grants consent to record the session but changes his/her mind while the session is being recorded, the interviewer will ask for verbal consent to retain the testing materials and the portion already recorded.
The survey project staff person will begin the testing session by reading a more detailed explanation of the purpose of the session and the procedures to be used. Testing procedures vary depending on the specific technique to be applied. Usability could be evaluated with cognitive testing techniques (think-aloud, debriefing questions), behavior coding (e.g., coding errors made by the interviewer or respondent while trying to navigate the instrument), and analysis of interview completion time.
As described in Supporting Statement A, Section A.9, participants will receive a small incentive. They will be asked to sign a receipt form indicating receipt of the remuneration. Immediately following the session, any hardcopy data (e.g., questionnaires) will be separated from any signed consent form and receipt form, so that no demographic information will be associated with the individual's name. Signed consent forms will be stored separately from any data collected from the participant. Audio and video recording files will be stored on the CMS contractor's secure servers, and the files will be deleted from the recording device.
Pilot Tests
Pilot tests will be conducted with subsamples of the main survey cohorts as part of the regular survey data collection cycle. The test questions will be incorporated into the questionnaires, and administration will follow survey protocols.
Pilot testing will be conducted by CMS contracted interviewers or survey vendors. Cases to be included will be pre-identified in the survey sample, based on the desired sampling allocation. For all interviews administered in this manner, the pilot interviewer will follow approved survey administration procedures. As time and resources allow, a subset of the interviews may be observed by CMS staff or survey project staff and observations manually recorded to allow for systematic analysis. In addition, CMS or project staff may conduct analysis of outcome data such as response rates and response distributions to key items, paradata, interviewer observations, and respondent debriefing data. Subject matter staff are debriefed, and findings are used to modify the questionnaire for follow-up pilot tests or incorporation into the main survey questionnaires.
Pilot test experimental modes might include the following (an illustrative summary of the contact schedules for all modes appears after the list of comparison modes below):
Web-mail-phone: individuals will receive an invitation to the web survey (see Attachment 1) by email (see Attachment 2), text, and/or paper invitation. Non-responders will receive 3 reminders (e.g., emails) for the web survey spaced 2 days apart. For sampled individuals randomized to this mode who have not responded by web by a cutoff date 10 days after the first contact attempt, we will mail a letter, survey, and business reply envelope. For sampled individuals randomized to this mode who have not responded by web or mail by a cutoff date 3 weeks after the mailed survey was sent, we will begin attempts to complete surveys by phone using computer-assisted telephone interviewing (CATI). We will make five attempts to reach each sampled individual. Calls will be made only between the hours of 9:00 am and 9:00 pm respondent local time (unless a respondent specifically requests a callback outside of this range). These attempts will be made over a period of time such that the entire pilot period is no longer than six weeks.
Web-mail: individuals will receive an invitation to the web survey by email, text, and/or paper invitation. Non-responders will receive 3 reminders (e.g., emails) for the web survey spaced 2 days apart. For sampled individuals randomized to this mode who have not responded by web by a cutoff date 10 days after the first contact attempt, we will mail a letter, survey, and business reply envelope. All non-responders will be sent a second mailing three weeks after the initial mailed survey.
Web-phone: individuals will receive an invitation to the web survey by email, text, and/or paper invitation. Non-responders will receive 3 reminders (e.g., emails) for the web survey spaced 2 days apart. For sampled individuals randomized to this mode who have not responded by web by a cutoff date 10 days after the first contact attempt, we will begin attempts to complete surveys by phone using CATI. We will make five attempts to reach each sampled individual. Calls will be made only between the hours of 9:00 am and 9:00 pm respondent local time (unless a respondent specifically requests a callback outside of this range). These attempts will be made over a period of time such that the entire pilot period is no longer than six weeks.
Pilot test comparison modes might include:
Mail only: We will mail an initial letter, survey, and a business reply envelope. All non-responders will be sent a second mailing three weeks after the initial mailing.
Telephone only: We will make five attempts to reach each sampled individual over a six week maximum period using CATI. Calls will be made only between the hours of 9:00 am and 9:00 pm respondent local time (unless a respondent specifically requests a callback outside of this range).
Mail with telephone follow up: individuals will receive an initial mailing containing an initial letter (see Attachment 3), survey (see Attachment 4), and business reply envelope. For sampled individuals randomized to this mode who have not responded by mail by a cutoff date three weeks after the first survey mailing, we will begin attempts to complete surveys by phone using CATI. We will make five attempts to reach each sampled individual. Calls will be made only between the hours of 9:00 am and 9:00 pm respondent local time (unless a respondent specifically requests a callback outside of this range). These attempts will be made over a period of three weeks such that the entire pilot period is no longer than six weeks.
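The sketch below encodes the contact schedules described above as simple Python data structures. The field names, day offsets, and structure are our own illustrative encoding, not an approved specification; days are counted from the first contact attempt.

# Illustrative encoding of the pilot-test contact schedules (hypothetical
# field names; values reflect the schedules described in the text).
WEB_REMINDERS = {"count": 3, "spacing_days": 2}            # e.g., email reminders
PHONE_FOLLOWUP = {"max_attempts": 5,
                  "calling_window_local": ("09:00", "21:00")}

MODE_PROTOCOLS = {
    "web-mail-phone": {
        "web_invite_day": 0,
        "web_reminders": WEB_REMINDERS,
        "mail_survey_day": 10,          # mail if no web response by day 10
        "phone_start_day": 10 + 21,     # CATI begins 3 weeks after the mailing
        "phone": PHONE_FOLLOWUP,
        "max_field_period_weeks": 6,
    },
    "web-mail": {
        "web_invite_day": 0,
        "web_reminders": WEB_REMINDERS,
        "mail_survey_day": 10,
        "second_mailing_day": 10 + 21,  # second mailing 3 weeks after the first
    },
    "web-phone": {
        "web_invite_day": 0,
        "web_reminders": WEB_REMINDERS,
        "phone_start_day": 10,
        "phone": PHONE_FOLLOWUP,
        "max_field_period_weeks": 6,
    },
    "mail-only": {"mail_survey_day": 0, "second_mailing_day": 21},
    "phone-only": {"phone_start_day": 0, "phone": PHONE_FOLLOWUP,
                   "max_field_period_weeks": 6},
    "mail-phone": {"mail_survey_day": 0, "phone_start_day": 21,
                   "phone": PHONE_FOLLOWUP, "max_field_period_weeks": 6},
}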
Expected response rates will vary depending on the setting and the protocols being tested. Protocols involving only a web survey are expected to receive very low response rates, ranging from 0% to 5%. Protocols involving multiple modes with a web survey as one component of the protocol could be expected to receive response rates as high as 40-50%. We will employ multiple email, text, mail, and phone contacts to minimize non-response.
All testing designs will include planned analyses of survey unit and item nonresponse to evaluate response propensity across modes. These statistics will be computed overall and separately by mode of administration and, potentially, by mode of completion, if relevant.
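As a rough illustration of the kind of tabulation involved, the sketch below computes unit response rates overall and by mode of administration from a case-level disposition file. The disposition codes and field names are assumptions for illustration, not a prescribed analysis specification.

# Illustrative tabulation of unit response rates overall and by mode of
# administration. Disposition codes and field names are hypothetical.
from collections import defaultdict

def response_rates(cases):
    """cases: iterable of dicts with 'mode_assigned' and 'disposition' keys."""
    eligible = defaultdict(int)
    completes = defaultdict(int)
    for case in cases:
        if case["disposition"] == "ineligible":
            continue                      # exclude ineligibles from the denominator
        mode = case["mode_assigned"]
        eligible[mode] += 1
        if case["disposition"] == "complete":
            completes[mode] += 1
    rates = {m: completes[m] / eligible[m] for m in eligible}
    rates["overall"] = sum(completes.values()) / sum(eligible.values())
    return rates

# Example:
# response_rates([{"mode_assigned": "web-mail", "disposition": "complete"},
#                 {"mode_assigned": "web-mail", "disposition": "nonresponse"}])
# -> {"web-mail": 0.5, "overall": 0.5}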
The data collection effort will depend on the testing design; we provide some examples below of procedures and methods that will likely be tested:
• A test of the effect of mode of survey administration on response rates
• A test of the effect of mode of survey administration on response patterns
• A test of the feasibility of the use of a web survey in a setting with multiple survey vendors
The survey, sampling approach, and data collection procedures were designed in consultation with the RAND Corporation under the leadership of:
Layla Parast, PhD
RAND Corporation
1776 Main Street
Santa Monica, CA 90407

Kirsten Becker, MS
RAND Corporation
1776 Main Street
Santa Monica, CA 90407
Attachment 1 – Emergency Department Patient Experience of Care (EDPEC) Discharged to Community (DTC) web survey – sample screen shots, mobile and computer versions
Attachment 2 - Emergency Department Patient Experience of Care (EDPEC) Discharged to Community (DTC) email invitation to the web survey
Attachment 3 - Emergency Department Patient Experience of Care (EDPEC) Discharged to Community (DTC) mailed survey cover letter
Attachment 4 - Emergency Department Patient Experience of Care (EDPEC) Discharged to Community (DTC) survey instrument