Respondent Universe and Sampling Methods
The proposed practice surveys will be fielded among the full universe of medical practices participating in the MAPCP Demonstration at the time of survey administration (a 6-week period beginning 3 months after OMB approval of this information collection request). In the email asking practice managers to complete our practice characteristics survey and to ask the providers in their practice to complete our medical home survey, we will request that each provider complete the survey independently.
Tables 1 and 2 display the total number of providers and practice managers we propose to survey and the expected number of completed surveys within each state and across all eight states, assuming an 80% response rate for both surveys.
We estimate that after inviting 7,249 providers to complete our web-based medical home survey, 5,799 providers (80%) will complete it. We also estimate that after inviting 1,005 practice managers to complete our web-based practice characteristics survey, 803 practice managers (approximately 80%, after rounding within each state) will complete it. We expect a high response rate to our surveys because we have existing relationships with practices through the practice feedback reports they regularly receive from us, which provide complimentary quality and utilization measure data for their practices. (The approaches we will use to achieve this response rate are described in the following section.)
Table 1. Expected Number of Providers in the MAPCP Demonstration at the Time of Survey Administration and Expected Number of Respondents

State | Expected # of Providers Participating in the MAPCP Demonstration (# of Providers Invited to Complete Survey) | Expected # of Providers Responding to the Survey (Assuming an 80% Response Rate)
Maine | 514 | 411
Vermont | 1,157 | 926
Rhode Island | 98 | 78
New York | 191 | 153
Pennsylvania | 408 | 326
North Carolina | 141 | 113
Michigan | 1,581 | 1,265
Minnesota | 3,159 | 2,527
Total | 7,249 | 5,799
Table 2. Expected Number of Practice Managers in the MAPCP Demonstration at the Time of Survey Administration and Expected Number of Respondents

State | Expected # of Practice Managers Participating in the MAPCP Demonstration (# of Practice Managers Invited to Complete Survey) | Expected # of Practice Managers Responding to the Survey (Assuming an 80% Response Rate)
Maine | 81 | 65
Michigan | 393 | 314
Minnesota | 213 | 170
New York | 43 | 34
North Carolina | 58 | 46
Pennsylvania | 57 | 46
Rhode Island | 21 | 17
Vermont | 139 | 111
Total | 1,005 | 803
We do not believe that practices’ prior completion of a medical home practice recognition survey will deter them from filling out our survey. State staff administering the MAPCP Demonstration have told us that many providers do not believe the patient-centered medical home recognition survey that practices were required to complete to enter the MAPCP Demonstration in most states (NCQA’s PCMH standards) accurately captures whether a practice has fully adopted all of the key components of the medical home model of care. We believe providers will welcome the opportunity to use an alternative instrument to capture the extent to which they have adopted the medical home model of care. In our pilot testing of this survey with providers participating in the MAPCP Demonstration, providers consistently told us they felt we were capturing the right aspects of the medical home model of care.
We also note that many, perhaps most, respondents will not have filled out NCQA’s practice recognition survey before, since that survey is often completed by a single person on behalf of a whole practice, or even on behalf of a group of practices owned by the same entity. In many states, the survey was completed years ago, so the burden of completing it is unlikely to be fresh in respondents’ minds; in such cases, respondents may even have a psychological incentive to complete our survey, to see for themselves what progress they have made in their mastery of the medical home model since they first completed NCQA’s practice recognition survey.
We will analyze survey data for non-response bias by estimating how the probability of response varies with practice attributes that we are able to observe for all practices that received the survey, whether or not they responded. Universally available data elements from CMS’s Enrollment Database, Medicare claims data, and U.S. Census data include practice-level estimates of Medicare beneficiary characteristics (age distribution, % female, % non-white, % disabled, % Medicaid dual eligible, % ESRD, mean hierarchical condition category (HCC) risk score, mean Charlson index score, and median household income and population density of beneficiaries’ counties of residence), as well as practice size, practice type (primary care only, multi-specialty practice, FQHC, critical access hospital, or rural health center), and mean total annual Medicare expenditures.
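As a minimal illustration of this approach, the sketch below fits a logistic regression of an indicator for survey response on observable practice attributes. The file and variable names (e.g., practice_frame.csv, mean_hcc_score) are hypothetical placeholders; the actual specification would draw on the data elements described above.

```python
# Sketch of a non-response propensity model; file and column names are
# hypothetical placeholders, not the evaluation's actual data dictionary.
import pandas as pd
import statsmodels.formula.api as smf

# One row per practice invited to complete the survey; `responded` is 1/0.
frame = pd.read_csv("practice_frame.csv")

model = smf.logit(
    "responded ~ practice_size + pct_female + pct_nonwhite"
    " + mean_hcc_score + median_household_income",
    data=frame,
).fit()

# Attributes with large, significant coefficients flag dimensions along
# which respondents differ from non-respondents (potential bias).
print(model.summary())
```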
To ensure that states with especially small or large numbers of practices or providers are not under- or over-represented when we conduct demonstration-wide statistical analyses, we will use survey weights. We will also primarily present state-level descriptive statistics and results from statistical analyses, since we are particularly interested in understanding how different MAPCP Demonstration states compare in terms of the makeup of their participating practices and the results they achieve.
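One simple weighting scheme consistent with this goal, sketched below under assumed column names, gives each responding practice the inverse of its state’s response rate, so that each state’s respondents collectively stand in for all of that state’s invited practices.

```python
import pandas as pd

# Hypothetical input: one row per invited practice, with a `state` column
# and a 1/0 `responded` indicator.
frame = pd.read_csv("practice_frame.csv")

# Response rate among invited practices, by state.
state_rates = frame.groupby("state")["responded"].mean()

# Inverse-response-rate weights: respondents in a state with a 50%
# response rate each represent two invited practices in that state.
respondents = frame[frame["responded"] == 1].copy()
respondents["weight"] = 1.0 / respondents["state"].map(state_rates)
```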
Procedures for the Collection of Information
Survey Materials. Characteristics and medical home-related activities and care processes of participating providers and their practices will be documented through surveys developed by Robert A. Berenson, M.D., a former Vice Chair of MedPAC and Institute Fellow at the Urban Institute. The two proposed companion surveys are based on a survey originally developed by Deborah Peikes and colleagues at Mathematica Policy Research, Inc. (MPR) for use in CMS’s evaluation of the Comprehensive Primary Care Initiative (CPCI), and they include questions similar to those in MPR’s survey to allow meta-analyses to be conducted with the survey data in the future. The proposed surveys include two sets of questions: 1) 23 closed-ended questions about a practice’s medical home-related activities and care processes, which will be asked only of health care providers; and 2) 17 questions about basic practice characteristics and infrastructure, which will be asked only of practice managers, plus questions about the provider who completed the survey, which will be answered by providers. Some questions are closed-ended and some include short write-in answer options of a few words. The medical home survey is estimated to take 12 minutes for providers to complete, and the practice characteristics survey is estimated to take 6 minutes for practice managers (who are generally non-clinical administrative staff) to complete.
Mode of Administration. The proposed surveys will be administered through interactive, 508-compliant websites created and maintained by RTI. Evaluators are exploring the possibility of designing this web-based survey so that it can also be completed on smartphones (e.g., iPhones). Administering these surveys online will minimize the chance of receiving incomplete surveys, since the websites will be programmed to give users error messages if they do not complete a question and to draw their attention to the incomplete responses. Collecting these data electronically will allow for automatic, inexpensive tabulation. Providers will not have the option of completing a hard-copy version of the survey during the regular survey administration period (starting 3 months after OMB approves this survey and extending for 6 weeks); however, if CMS’s evaluators have not obtained an 80% response rate within the regular survey administration period, non-responding practice staff will be offered the chance to complete hard-copy versions of the surveys and fax or mail them back to evaluators, who will then manually enter responses into the web-based survey tools. A second person on staff at one of CMS’s evaluation contractors (RTI, the Urban Institute, or NASHP) will then verify that hard-copy responses were correctly entered into the web-based surveys.
Recruitment Communications. To encourage participating MAPCP practices to complete the proposed surveys, CMS will ask state staff administering the MAPCP Demonstration (with whom CMS already has ongoing relationships and speaks monthly via standing conference calls) to let participating practices in their respective states know that they will soon receive an email asking them to complete online surveys. State staff will be asked to assure practices that these surveys are an authorized component of the MAPCP Demonstration evaluation, and to encourage practices to complete them. CMS will ask state staff to mention the surveys to practices during existing webinars, conference calls, and in-person MAPCP meetings with practices, and/or in an email sent to practices.
After these advance communications, CMS’s evaluators will send customized, 508-compliant emails to each practice participating in the MAPCP Demonstration with individualized hyperlinks to the proposed online surveys. Using individualized hyperlinks will obviate the need for respondents to type in their name, identifying number, and the name and identifying number of their practice – thus slightly reducing respondent burden. More importantly, such individualized hyperlinks will also allow CMS’s evaluators to identify non-responders – to facilitate weekly follow-up communications aimed at encouraging these remaining providers to complete the survey (which are described in greater detail in the next section). The emails that are sent by CMS’s evaluators to practice managers will assure respondents that their responses will be kept private, to the extent permitted by law, and all required elements of informed consent will be included in the introductory section of the online surveys.
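For illustration only: individualized links of this kind are typically built around a random, hard-to-guess token that the survey system maps back to a specific respondent. The sketch below, with hypothetical names and URL, shows one way such tokens might be generated.

```python
import secrets

def make_survey_link(base_url: str) -> tuple[str, str]:
    """Return a (token, url) pair. The token is stored with the
    respondent's record so completed surveys can be tied back to
    them and non-responders identified for follow-up."""
    token = secrets.token_urlsafe(16)  # ~128 bits of randomness
    return token, f"{base_url}?respondent={token}"

# Hypothetical usage; the URL is a placeholder, not the actual survey site.
token, url = make_survey_link("https://example.org/mapcp-survey")
```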
The specific recruitment emails we propose sending are included in Attachment C.
Quality Control. CMS and its evaluation contractors, RTI and its subcontractors the Urban Institute and NASHP, will implement quality control procedures throughout the survey pre-testing, recruitment, and administration periods. Specifically, they will verify that hyperlinks and all pages of the online surveys work properly. Aggregated responses will be reviewed for face validity, to ensure respondents’ answers have been correctly coded in the data file that is created by the software program used to administer the proposed online surveys. And if some responses from hard-to-reach respondents are ultimately received via fax or mail rather than online survey submission, the responses that RTI staff manually enter into the online survey website will be verified by a second evaluation staff member.
Methods to Maximize Response Rates and Deal with Nonresponse
A number of approaches will be used to maximize response rates, including:
Having trusted state leaders tell providers about the proposed surveys in advance and encourage them to complete the surveys via emails and/or remarks during existing webinars, conference calls, and/or in-person meetings.
Having practice managers remind providers to complete the medical home survey (e.g., through oral announcements at staff meetings and personal emails and communications to non-responders). The Urban Institute has consulted with its Institutional Review Board, which advised that having CMS’s evaluation contractors disclose the identities of non-responding providers to practice managers (to allow them to identify which providers to remind to complete the survey) would not raise confidentiality concerns as long as the initial email to these providers about the survey informs them that they may receive reminders from these individuals to complete the survey.
Engaging in the following follow-up communications with non-responders:
On a weekly basis throughout the 6-week survey administration period: CMS’s evaluators will send emails to our contact person in each practice reminding them to complete the practice characteristics survey and to ask the providers in their practice to complete the medical home survey.
2 weeks after initial email: CMS’s evaluators will ask state staff members to remind practices to complete the proposed surveys.
4 weeks after initial email: CMS’s project officer for the evaluation of the MAPCP Demonstration will send an email to state staff asking them to remind practices to complete the surveys.
5 weeks after initial email: CMS’s project officer for the MAPCP Demonstration will send an email to state staff asking them to remind practices to complete the surveys.
If CMS’s evaluators have not achieved an 80% response rate on either of the two proposed surveys after the 6-week survey administration period, this period will be extended for a few weeks, and evaluators will place direct phone calls and send emails to non-responders offering them the option of completing hard-copy versions of the surveys and faxing or mailing them back to CMS’s evaluators. Evaluation staff would then manually enter these practice staff members’ responses into the online surveys on their behalf, and a second evaluation staff member would verify that the responses were entered correctly.
Test of Procedures or Methods to be Undertaken
To pilot-test the proposed surveys, which were originally envisioned as a single provider survey, eight providers participating in the MAPCP Demonstration (one from each of the eight MAPCP Demonstration states) were recruited and asked to complete the proposed provider survey online. To identify these providers, CMS’s evaluators asked the state agency staff leading each state’s MAPCP Demonstration to recruit a volunteer from among participating providers.
As part of the pilot-testing, the online provider survey recorded how long each respondent took to complete it; these recorded completion times informed the estimated completion times for the current version of the surveys. After filling out the survey, pilot testers were taken to a webpage with text fields below each question and asked to provide any suggested revisions. Providers were also asked to comment on whether the survey covered an appropriate set of topics, given its intent of capturing the medical home care processes and activities that practices engage in. All pilot testers who provided feedback thought the survey covered appropriate topics and had no major suggested revisions, though they suggested rewording some questions slightly to improve clarity and reader comprehension.
Based on the feedback obtained through this pilot-testing, minor revisions have been made to the surveys, as identified in Attachment E. The survey developers have also taken this opportunity to propose additional minor refinements, which also are described and documented in Attachment E.
Following OMB review of this survey, CMS has also decided to split the survey into two surveys: one for providers (Attachment D1), which will ask about medical home care processes and activities adopted in the practice; and a survey for non-clinical practice managers (Attachment D2), which will ask about basic practice characteristics.
Analysis and Reporting of Information Collected
The information collected through these surveys will be used to evaluate the effects of the MAPCP Demonstration on the Medicare and Medicaid programs and their beneficiaries. Specifically, these survey data will be used to answer the following research questions:
What are the features of practices participating in the MAPCP Demonstration?
All questions in the two companion surveys will be used to answer this research question. We will use the survey data to present basic descriptive statistics about the makeup of the practices and providers participating in the demonstration (e.g., the prevalence of care coordinators in practices, and how long ago participating practices adopted EHRs). We will also use the survey data to present descriptive statistics identifying the components of the patient-centered medical home model of care that practices adopted most frequently (e.g., actively coordinating care with other providers, using patient registries to identify patients to remind to come in for visits). These descriptive statistics will be presented for each state, to allow us to observe how participating practices vary across demonstration states.
What changes did practices make to enter and maintain participation in their state’s MAPCP Demonstration initiative?
The questions that ask about medical home care processes (with answer options of “1” to “9”) will allow us to identify which aspects of the patient-centered medical home model of care each provider has adopted and incorporated into his or her daily practice.
Do features of the participating practices result in more efficient delivery of health services, improved access, or higher quality of care to Medicare and Medicaid beneficiaries? If so, what features facilitate these improved outcomes?
To answer this question, we will combine data collected from both of our companion surveys (which will give us information about practices’ basic characteristics as well as their adoption of specific medical home care processes) with identifiable claims data (which will allow us to identify which practices achieved the largest improvements in, or highest absolute performance on, measures of health care utilization, spending, and quality). Survey data and claims data for each practice will be linked, and statistical analyses will be performed to identify which practice characteristics and/or medical home activities are associated with the largest gains on health care quality and/or utilization measures. More information on our statistical analyses is included in Supporting Statement A.
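As a schematic of the linkage step, assuming both files carry a common practice identifier (the practice_id column and file names below are hypothetical placeholders), the merge might look like:

```python
import pandas as pd

# Hypothetical inputs: practice-level survey measures and practice-level
# measures derived from Medicare claims, sharing a practice identifier.
survey = pd.read_csv("survey_measures.csv")  # e.g., medical home scores
claims = pd.read_csv("claims_measures.csv")  # e.g., utilization, spending

# An inner join keeps only practices present in both sources.
linked = survey.merge(claims, on="practice_id", how="inner")
```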
CMS’s evaluators’ analysis will include numerous variables designed to capture variation at the clinician, practice, demonstration, and state levels. The data collected through our surveys, along with the data collected through practices’ responses to state-endorsed medical home recognition surveys (e.g., NCQA’s), will be used to measure core medical home activities and capabilities at the practice level. The final number and focus of these variables will depend on the available data, but CMS’s evaluators currently expect to create separate variables for the following types of medical home activities or capabilities:
Care coordination—the practice communicates with the patient and their family as well as other providers, such as specialists and hospitals, to coordinate care received (i.e., questions #8, 10, 15, 16, 17, 18, and 19 of our medical home survey for providers).
Care transitions—the practice identifies high-risk patients who have recently been discharged from a hospital and follows up with them to ensure they understand their discharge instructions and to reconcile new medications with pre-existing ones (i.e., question #20).
Quality measurement—the practice measures the quality of the care it delivers (i.e., questions #21 and 23).
Population management—the practice uses a registry to proactively manage care for patients with a given chronic condition (i.e., questions #6, 7, and 11).
Access to care—the practice offers its patients enhanced access to care through same-day appointments, night or weekend office hours, and/or clinicians answering patient emails (i.e., questions #1-4).
Quality improvement—the practice engages in quality improvement projects and/or sets specific performance targets based on quality measure data collected (i.e., question #22).
Care plan—the practice regularly develops individualized treatment plans for designated groups of patients, basing this care plan on an individualized health risk assessment and patient goals and preferences (i.e., question #9).
Patient engagement & self-management—the practice counsels patients to adopt healthier behaviors or to learn how to better manage their chronic condition(s) (i.e., questions #10, 12-14).
Continuity of care—within the practice, patients are seen by the same clinician over time (i.e., question #5).
Electronic health records (EHRs)—the practice uses EHRs not just to document services rendered but also to create clinical decision support prompts and reminders and to generate quality measure data used for internal quality improvement purposes (i.e., question #21).
Some of these variables may be constructed based on providers’ responses to a single question, whereas others may reflect responses to multiple questions – in which case we will add up the number of questions providers were able to answer “yes” to and create a single summary variable for a particular medical home activity or care process (e.g., “care coordination”).
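As a minimal sketch of that summary-variable construction, assuming yes/no items have been coded 1/0 and using hypothetical column names, the tally might be computed as follows; the domain-to-question mapping mirrors the groupings listed above.

```python
import pandas as pd

# Hypothetical provider-level responses; q1..q23 coded 1 ("yes") or 0 ("no").
responses = pd.read_csv("provider_responses.csv")

# Illustrative mapping of medical home domains to survey items, following
# the groupings described above (column names are placeholders).
domains = {
    "care_coordination": ["q8", "q10", "q15", "q16", "q17", "q18", "q19"],
    "population_management": ["q6", "q7", "q11"],
    "access_to_care": ["q1", "q2", "q3", "q4"],
}

# Each summary variable counts the domain's items answered "yes."
for domain, items in domains.items():
    responses[domain] = responses[items].sum(axis=1)
```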
Cross-sectional analyses will be used to determine which medical home variables (such as those listed above) are most strongly associated with high-quality, efficient, low-cost care. Multivariate regression models will be used to estimate the impact of engaging in different medical home care processes on expenditures, health care utilization, and performance on quality-of-care measures; medical home variables will serve as the independent or “explanatory” variables, along with other contextual variables at the state, practice, and clinician levels.
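A schematic version of one such model, with hypothetical file and variable names standing in for the medical home summary variables and contextual variables described above, might look like:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analytic file: one row per practice, combining medical home
# summary scores with claims-based outcomes and contextual variables.
analytic = pd.read_csv("linked_practice_data.csv")

# Outcome: mean total annual Medicare expenditures; explanatory variables:
# medical home summary scores plus practice size and state fixed effects.
model = smf.ols(
    "total_expenditures ~ care_coordination + population_management"
    " + access_to_care + practice_size + C(state)",
    data=analytic,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

print(model.summary())
```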
Descriptive statistics will be reported in the annual report that will be submitted to CMS following the administration of the survey. The survey data will also be used to create variables that will be used in regression analyses performed to identify which medical home-related activities or care processes are associated with the largest gains in health care quality and reductions in health care cost trends. Findings from these statistical analyses will be included in the final report submitted to CMS. Further information on our statistical analyses is included in Supporting Statement A.
Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Ann O’Malley, M.D., M.P.H.
Center for Studying Health System Change
Telephone: 202-484-5261
Robert A. Berenson, M.D.
The Urban Institute, Health Policy Center
Telephone: 202-261-5886
Stephen Zuckerman, Ph.D.
The Urban Institute, Health Policy Center
Telephone: 202-261-5679
Rachel Burton, M.P.P.
The Urban Institute, Health Policy Center
Telephone: 202-261-5825
Nancy McCall, Sc.D.
RTI International
Telephone: 202-728-1968
Thomas Morgan
RTI International
Telephone: 919-541-7414