Supporting Statement – Part B
Substance Use-Disorder Prevention that Promotes Opioid Recovery and Treatment (SUPPORT) for Patients and Communities Act Section 1003 Demonstration Evaluation
Information Collection Request
Collections of Information Employing Statistical Methods
1. The respondent universe and respondent selection method.
The respondent universe for the survey consists of current prescribers or dispensers of buprenorphine or methadone, who may also dispense extended-release naltrexone, in the 15 planning grant states. We will work with the Contracting Officer’s Representative (COR) and other stakeholders, including state Medicaid points of contact in the post-planning states, to obtain information on relevant providers for the surveys and focus groups. Our sample frame ensures that we recruit all types of providers who are qualified to provide substance use disorder (SUD) treatment or recovery services and are the targets of each state’s demonstration. We specifically seek to include providers who work in communities with, or who serve, high-need Medicaid populations, such as perinatal women with opioid use disorder (OUD) and their infants and children, adolescents and young adults, and American Indian and Alaska Native individuals.
The Substance Use-Disorder Prevention that Promotes Opioid Recovery and Treatment (SUPPORT) Act section 1003 demonstration evaluation will not employ any statistical methods for selecting respondents for either the surveys or the focus groups. The decision not to use statistical methods to select respondents was based on the limited number of respondents in each state and the need to reflect the diversity of provider types. There are many possible combinations of organizations and providers with different disciplines (e.g., medical doctors with different specialties, nurse practitioners, physician assistants) that necessitate data collection from all eligible respondents to ensure an understanding of efforts to strengthen provider capacity.
Given the potential diversity in respondents, the evaluation team determined that sampling could not ensure a representative sample and would be unlikely to improve the robustness of the data gathered. Furthermore, there were no reasonable criteria for purposively sampling among the possible provider types. As a standard objective, we aim to achieve a high survey response rate (e.g., 80 percent). Even if the response rate falls short of that target, a census helps us retain representation of provider subgroups in each state. A stratified sample, by contrast, may not yield enough responses, or sufficient statistical power, to support subgroup analyses that capture the diversity of outcomes across the different types of respondents. If we do not achieve sufficient responses in these subgroups even with a census approach, we will aggregate across groups. The census approach is therefore the optimal way to ensure that all groups are represented.
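The following sketch is illustrative only and is not part of the evaluation’s analytic plan. It uses assumed counts (a hypothetical subgroup of 60 providers of one type in a state and an 80 percent response rate; actual provider universes will differ by state) to show why a census yields more precise subgroup estimates than a 50 percent stratified sample of the same subgroup.

import math

def margin_of_error(n_responses, population, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion, with finite population correction."""
    se = math.sqrt(p * (1 - p) / n_responses)
    fpc = math.sqrt((population - n_responses) / (population - 1))
    return z * se * fpc

# Hypothetical subgroup of 60 providers of one type in one state (illustrative only).
subgroup_size = 60
census_responses = round(subgroup_size * 0.80)         # census with an 80% response rate -> 48
sample_responses = round(subgroup_size * 0.50 * 0.80)  # 50% stratified sample, 80% response rate -> 24

print(f"Census:  +/- {margin_of_error(census_responses, subgroup_size):.1%}")   # about +/- 6%
print(f"Sample:  +/- {margin_of_error(sample_responses, subgroup_size):.1%}")   # about +/- 16%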
2. Procedures for the collection of information.
We have designed data collection procedures to support a high response rate, reduce burden to respondents, and promote accuracy and completeness of responses. The types and timing of respondent contact efforts are summarized in Table 1.
Survey procedures. We start with an advance notification of the survey, via a hard-copy letter, cosigned by the Centers for Medicare & Medicaid Services (CMS) COR and representatives from the State Medicaid Agency, which announces the survey and describes the importance of the data being collected. This letter will let respondents know that an email with a secure, personalized link to the web survey will follow in approximately 1 week. We will maintain a list of returned emails (accounts that are no longer valid). We will email all respondents to introduce the evaluation team, provide instructions on completing the survey, and send a secure link to the survey. For the duration of the fielding, the evaluation team will carry out outreach strategies tailored to each type of respondent to help obtain a high response rate (see Table 1).
Table 1: Survey Respondent Outreach

Contact Timing | Contact Details | Mode
Advance notice | Advance introduction from survey and focus group team mailed to respondents, announcing the survey team’s upcoming request for participation | Letter (hard-copy, U.S. mail)
Start of survey fielding | Email with link to surveys | Email
Focus group participation confirmation | Follow-up by phone to confirm participation and scheduling for focus groups | Phone call
Start +1 week | Email reminder to survey respondents | Email
Start +2 weeks | Reminder email with link to surveys | Email
Start +3 weeks | Reminder email with link to surveys; coordinated effort with CMS and state Medicaid offices to implement outreach to nonresponders via existing communication channels (e.g., announcement on agency website or newsletter) | Email/agency communication channels
Start +4 weeks | Reminder email with link to surveys; phone follow-up if possible | Email and phone call
Start +5 weeks | Reminder email with link to surveys; phone follow-up if possible | Email and phone call
Start +6 weeks | “Last chance” additional email | Email
Notes: For the first round of data collection, outreach will begin immediately after Office of Management and Budget approval, pending contact information availability; for the second round of data collection, outreach will begin per agreed-on timing with the evaluation team and COR.
Focus group procedures. We recruit a range of SUD treatment providers who are primary targets of the state’s section 1003 demonstration. We choose provider types to include in focus groups based on the goals of each state, with particular attention given to areas that the state aimed to support and develop. In discussion with the COR, we identify subgroups of providers, for example, those serving specific client populations or those providing a particular type of service. We identify 24 potential candidates in each state and anticipate six to eight participants for each of the two focus groups per state. Contact information will be obtained from states or from provider survey respondents who agree to be contacted for focus group discussions. To minimize provider burden, we offer flexible scheduling, including times outside of normal working hours.
Each provider receives a mailed (hard-copy) invitation that describes the purpose of the study and the role and importance of the focus group data in the evaluation, the intended deliverables, and institutional review board (IRB) review. The letter describes the different topics the focus group will cover, such as—
Changes in provider capacity;
Views on reimbursement policies;
Barriers to providing SUD treatment, including OUD treatment;
Experiences with changes in care coordination and communication to support integrated care;
Provider facilitation via targeted case management or referral to resources for social determinants of health to mitigate social risks and support recovery;
Experiences with training and technical assistance; and
Engagement as stakeholders in the demonstration.
We follow up by phone with a designated contact to confirm participation and scheduling at a time that is mutually convenient for all participants (whether during the day or in the evening). We expect six to eight participants to attend each group. Focus groups last 60 to 90 minutes.
An experienced lead facilitator is responsible for guiding and moderating the discussion, ensuring that all participants have an opportunity to share their perspectives. A cofacilitator is responsible for ensuring that all major themes included in the discussion guide are addressed as extensively as possible. A research assistant takes accurate and comprehensive notes on the focus group proceedings and discussion, noting which participant provided which comments as well as nonverbal cues. Provided that we receive permission from every participant during the consent process, we will audio record the focus group discussions.
3. Methods to maximize response rates and to deal with issues of nonresponse.
Provider surveys. In addition to the advance outreach and continuous follow-up described above, we employ the following design and implementation strategies to achieve a high response rate.
Efficient questionnaires: To develop questionnaires that efficiently capture the necessary data, minimize respondent burden, and maximize response rates, we carefully reviewed awardee and secondary data to ensure that we constrain primary data collection to those elements that are not otherwise available. We consulted current relevant literature on provider perceptions and barriers to OUD medication treatment to inform questionnaire development. Our survey also asks providers whether they can be contacted for their participation in focus groups and, if so, requests the best contact information.
User-friendly questionnaire design and mode: To help facilitate cooperation and reduce item nonresponse, we focused on creating a logical, clear questionnaire with concrete question wording, closed-ended response choices, simple grammar, and questions grouped by subject area. In addition, the web-based Qualtrics survey platform will make it easy for respondents to participate. We will prepopulate fields where relevant to allow respondents to complete the survey more efficiently and skip questions that do not pertain to them. The web-based platform will also allow respondents to save and continue their work, which adds convenience in completing the surveys.
Robust follow-up: For the first 4 weeks of the survey, we send email reminders to follow up with respondents who have not completed the survey. In these communications, we will emphasize the critical importance of their participation. After approximately 4 weeks, we will begin prompting nonrespondents by phone. Phone reminders will continue for approximately 2 weeks, with each nonrespondent receiving at least two reminders. Following the phone prompting phase, we will send an additional email reminder before closing the survey (at approximately 8 weeks). We will track survey completion and routinely update our COR on the number of new and cumulative completes.
Midpoint support from State Medicaid Agency leadership: We also propose working with state Medicaid offices to encourage provider participation. Leveraging the support of state leadership for survey participation underscores the importance of the survey to providers. After approximately 3 weeks of survey fielding, we will coordinate with our COR in reaching out to state Medicaid offices (via email or phone call) to discuss outreach to nonresponders via existing communication channels (e.g., email blast through a provider portal, posting on social media, or blurb in a provider newsletter). We will provide state Medicaid offices with a template blurb about the survey that they may use for this special outreach.
Respondent support (or technical assistance): We will provide contact information for the survey team should respondents have questions or concerns about the surveys, as well as contact information for our IRB should respondents have concerns about their rights as study participants. Respondents will be provided a toll-free number to speak directly with staff who are trained to assist survey respondents and to be as responsive as possible in addressing their concerns.
4. Describe any tests of procedures or methods to be undertaken.
The data collection instruments were developed with input from CMS staff and were reviewed by NORC survey methodologists and SUD and OUD subject matter experts.
5. Name and telephone number of individuals consulted on statistical aspects of the design and the name of the contractor and persons who will actually collect and/or analyze the information for the agency.
Data collection and analysis will be conducted by NORC at the University of Chicago, in coordination with the IBM Watson Health evaluation team. Contact information for NORC’s principal staff on the project is listed below. All four individuals were involved in designing the data collection. The Watson Health staff shown below were consulted on the survey and focus group questionnaire development.
IBM Watson Health Staff
Peggy O’Brien
Evaluation Project Director
Phone: 339/927-1064
Email: [email protected]
Lavonia Smith LeBeau
Evaluation Primary Data Collection Lead
Phone: 781/507-4140
Email: [email protected]
NORC Staff
Kathy Rowan
NORC Team Lead
Phone: 301/921-6067
Email: [email protected]
Jennifer Satorius
NORC Survey Lead
Phone: 217/632-7161
Email: [email protected]
Maysoun Freij
NORC Focus Group Technical Advisor
Phone: 240/762-1411
Email: [email protected]
Christina Rotondo
NORC Focus Group Lead
Phone: 312/859-1346
Email: [email protected]