Evaluation of the Medicare Care Management Performance Demonstration and the Electronic Health Records Demonstration

OMB: 0938-1072
C. SUPPORTING STATEMENT - PART B
COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe and Sampling Methods

a. Respondent Universe

MCMP OSS
The MCMP Demonstration targets small to medium-sized group practices (10 or fewer
physicians) serving at least 50 traditional fee-for-service Medicare beneficiaries with selected
chronic conditions for whom the practices are providing primary care. Targeted in addition to the
three primary conditions—congestive heart failure, coronary artery disease, and diabetes
mellitus—are Alzheimer’s disease or other mental, psychiatric, or neurological disorders; any
heart condition (such as arteriosclerosis, myocardial infarction, angina pectoris, or stroke); any
cancer; arthritis and osteoporosis; kidney disease; and lung disease.
CMS selected four states—Arkansas, California, Massachusetts, and Utah—following
criteria specified in the enabling legislation. The QIOs in the four MCMP Demonstration states
recruited practices based on relationships built through CMS’s Doctor’s Office Quality-Information Technology (DOQ-IT) program, in which practices must participate to be eligible
for the demonstration. Table C.1 presents the distribution of eligible practices, by size, in the
four states.9 Each demonstration state will be matched to a comparison state based on criteria that include demographic features, the extent of HIT adoption and pay-for-performance programs currently under way, and other factors. Alternate comparison states will be chosen if the initially selected comparison state does not have enough practices for matching.

9 We cannot present a similar table for the EHRD due to the timing of practice recruiting.

 


TABLE C.1

DISTRIBUTION OF ELIGIBLE PRACTICES BY SIZE IN MCMP DEMONSTRATION STATES

State           Practice Size   Number of Practices   Percentage
Arkansas        1               99                    18.6
                2               259                   48.8
                3               45                    8.5
                4               48                    9.0
                5               21                    4.0
                6               20                    3.8
                7               8                     1.5
                8               16                    3.0
                9               4                     0.8
                10              11                    2.1
California      1               3,051                 43.6
                2               2,570                 36.7
                3               518                   7.4
                4               302                   4.3
                5               154                   2.2
                6               134                   1.9
                7               103                   1.5
                8               73                    1.0
                9               45                    0.6
                10              45                    0.6
Massachusetts   1               468                   41.8
                2               324                   29.0
                3               90                    8.0
                4               63                    5.6
                5               52                    4.7
                6               34                    3.0
                7               26                    2.3
                8               26                    2.3
                9               22                    2.0
                10              14                    1.3
Utah            1               39                    16.3
                2               112                   46.7
                3               23                    9.6
                4               23                    9.6
                5               15                    6.3
                6               9                     3.8
                7               3                     1.3
                8               7                     2.9
                9               6                     2.5
                10              3                     1.3

Source: MCMP financial support contractor.

EHRD OSS
 


The EHRD targets small to medium-sized practices (20 or fewer physicians) that provide
primary care10 to at least 50 fee-for-service Medicare beneficiaries with congestive heart failure,
coronary artery disease, diabetes, or other chronic diseases. CMS selected Louisiana, Maryland/
DC, Pennsylvania, and South Dakota as the sites for the demonstration. These four sites were not
randomly selected from a population of states and practices; rather, they were chosen through a
competitive process to identify “community partners” to assist CMS with education, outreach
activities, and recruitment of physician practices. Community partners represent stakeholders
that have ties to primary care physicians but are not required to have a specific type of
organizational entity or structure.11 Practices are invited to begin their participation by filling out
and submitting a practice application form.
b. Sampling Methods
MCMP OSS
CMS recruited a total of 699 practices across the four MCMP Demonstration states using
the QIOs’ relationships with practices, and will select about 700 practices in the comparison
states, in roughly equal proportion to the demonstration practices recruited in the four states. We
will match practices in the comparison states to the demonstration practices based on size,
experience with HIT, and use of ambulatory Medicare-covered services. Table C.2 lists the
number of enrolled demonstration practices by state.
10 The following physician specialties will be eligible to participate in the EHRD if they provide primary care:
general practice, allergy/immunology, cardiology, family practice, gastroenterology, internal medicine, pulmonary
disease, geriatric medicine, osteopathic medicine, nephrology, infectious disease, endocrinology, multispecialty
clinic or group practice, hematology, hematology/oncology, preventive medicine, rheumatology, and medical
oncology.
11 For example, in the MCMP Demonstration, practices were recruited by the Quality Improvement
Organizations (QIOs) in the four demonstration states, using relationships built through CMS’s Doctor’s Office
Quality-Information Technology (DOQ-IT) program, in which practices must participate to be eligible for the
demonstration.

 


The demonstration states, and the practices within them, were not randomly selected from a
population of states and practices. Furthermore, physician practices volunteered to participate.
Thus, findings cannot be generalized beyond those practices enrolled in the demonstration and their matched counterparts.
TABLE C.2

MCMP DEMONSTRATION: NUMBER OF ENROLLED PRACTICES BY SITE

State           Number of Enrolled Practices
Arkansas        106
California      236
Massachusetts   236
Utah            121

We will ask all demonstration and comparison practices to complete the second and final
round of the OSS. The survey will start in September 2009, 27 months after the start of the
demonstration’s operations. ARC, the MCMP implementation contractor, will provide MPR with
a list of all demonstration and comparison group practices. The goal will be a 70 percent response rate from both groups, for a total of 980 completed surveys (490 demonstration and 490 comparison group practices) from the 1,399 practices.
EHRD OSS
With the assistance of the community partners, CMS will recruit about 200 practices per
site, for a total of 800. If, in a particular site, CMS receives more practice application forms than
the target, preference will be given to the smallest practices and to those that are in the early
stages of EHR adoption.

 


The sites and practices are not randomly selected from a population of states and practices.
Furthermore, physician practices within each site volunteer to participate. Thus, findings cannot be generalized beyond those practices enrolled in the demonstration.
However, if the four demonstration sites represent a substantial portion of the physician
practices’ efforts to adopt and use EHRs nationwide, the evaluation’s assessment of the overall
impacts of the demonstration will reflect the experiences of a larger share of small to medium-sized practices in the nation.
In spring 2009, before the start of operations, the small to medium-sized practices that have
volunteered to participate will be randomly allocated within each site to the treatment or control
group, which will generate 100 treatment and 100 control group practices per site, or 400
treatment and 400 control group practices in total. The design plan randomly allocates each
practice to the treatment or control group within specified strata. Four key stratifying variables
capture practice characteristics, measured at baseline, that are likely to be associated with EHR
adoption, quality of care, and Medicare expenditures: (1) site, (2) size, (3) urban or rural
location, and (4) whether the practice already had an EHR at the time of application.
Stratification ensures that treatment group practices are similar to control group practices on key
factors likely to be associated with outcomes of interest. Adopting a stratified design maximizes
the precision of impact estimates and avoids compromising the credibility of the evaluation with
chance imbalances across stratifying factors.
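The within-stratum random allocation described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the evaluation's actual procedure; the field names (site, size_group, urban, has_ehr) and the shuffle-and-split mechanics are assumptions for the sketch.

```python
import random
from collections import defaultdict

def stratified_assignment(practices, seed=2009):
    """Randomly split practices into treatment/control within each stratum.

    Each practice is a dict; a stratum is defined by the four baseline
    variables named in the design: site, size, urban/rural location, and
    whether the practice already had an EHR. (Field names are illustrative.)
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for p in practices:
        key = (p["site"], p["size_group"], p["urban"], p["has_ehr"])
        strata[key].append(p)

    assignments = {}
    for key, members in strata.items():
        rng.shuffle(members)
        half = len(members) // 2
        for i, p in enumerate(members):
            # First half of the shuffled stratum goes to treatment, the rest
            # to control; an odd-sized stratum gives its extra practice to control.
            assignments[p["id"]] = "treatment" if i < half else "control"
    return assignments
```

Because the split is made inside each stratum, treatment and control groups are balanced on the stratifying variables by construction, which is the point of the design described above.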
We will ask treatment group practices to complete the OSS at the end of each of the five
demonstration years. We will use their responses regarding use of EHR functions to calculate the
payment given to each practice at the end of each demonstration year. We expect that up to
5 percent of treatment group practices (or a total of 20) may withdraw from the demonstration
after the second year of operations as a result of their inability to implement the minimum EHR functions required. Therefore, we expect that 100 percent of treatment group practices will
complete the OSS within the first two years of the demonstration, but also that the number may
drop slightly for each round of the OSS in years 3 through 5.
We will ask control group practices to complete the OSS after demonstration years 2 and 5.
Since responses for these practices will not yield a payment, we do not expect a 100 percent
response to each round, as we do for treatment group practices. Our goal instead is to obtain a 70
percent response rate to the OSS at each round, or 280 practices per round, from the 400 control
group practices, at the end of years 2 and 5. Table C.3 lists the expected number of completed
OSS surveys for the MCMP Demonstration and the EHRD by year.
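The expected EHRD counts in Table C.3 follow directly from the design parameters stated above; a quick arithmetic check (the variable names are ours, the rates are from the text):

```python
# Design parameters for the EHRD arm, as stated in the text.
treatment_n = 400         # treatment practices randomized
control_n = 400           # control practices randomized
attrition = 0.05          # expected treatment withdrawals after year 2
control_response = 0.70   # target control-group response rate (years 2 and 5)

treatment_years_1_2 = treatment_n                           # 400 per round
treatment_years_3_5 = round(treatment_n * (1 - attrition))  # 380 per round
control_per_round = round(control_n * control_response)     # 280 per round

print(treatment_years_3_5, control_per_round,
      treatment_years_3_5 + control_per_round)  # 380 280 660
```

The year-5 total of 660 completes (380 treatment plus 280 control) matches the final EHRD row of Table C.3.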
TABLE C.3

EXPECTED OSS COMPLETES BY DEMONSTRATION AND YEAR

Study (Year)                 Number of Responding    Number of Responding   Total Number of
                             Treatment Practices*    Control Practices      Responding Practices
MCMP Demonstration (2009)    490                     490                    980
EHRD (2010)                  400                     –                      400
EHRD (2011)                  400                     280                    680
EHRD (2012)                  380                     –                      380
EHRD (2013)                  380                     –                      380
EHRD (2014)                  380                     280                    660

*Assumes that 5 percent of treatment group practices will withdraw at the end of demonstration year 2.

EHRD – Qualitative Discussions with Demonstration Practices and Community Partners

We will conduct two rounds of qualitative discussions with 4 treatment group practices, up to 2 control group practices, and the community partner from each of the four sites. In total, discussions will be held with 16 treatment group practices, up to 8 control group practices, and 4 community partners per round. The average number of respondents per practice will be 3. For the first round, we will (non-randomly) select practices in each site to ensure that we achieve a mix of sizes, locations (urban and rural), and practices with and without an EHR in place. We will cluster practices geographically to make the site visits efficient, though the clustering does not have to be tight. For example, the practices in a large site could include 2 treatment group practices and 1 control group practice from a large urban area, a third treatment group practice from a rural area outside that urban area, and a fourth treatment group and second control group practice from or near a small urban area two hours away from the large one. Based on the data from the practice application forms, we will select some practices that did and some that did not have an EHR in place at the start of the demonstration. We will conduct discussions for the second round with the same practices as in the first round.
In addition, up to 6 practices that withdraw from the demonstration will be selected (non-randomly) for telephone discussions in spring 2012. All practices that have withdrawn up to that
point will be eligible for selection. If more than 6 have withdrawn, we will use application or
OSS data (to the extent they are available) to select a mix of practices in terms of the month they
withdrew, their size, their geographic location, and their stage of EHR implementation (for example, we would select at least 1 practice with an EHR in place if some practices with EHRs
withdrew). Table C.4 presents the number of sites, contacts, and discussions by demonstration
year.
TABLE C.4

EHRD PRACTICE DISCUSSIONS BY YEAR AND ROUND

Year  Practices (Round)                   Number of Sites  Total Number of Practices/Community Partners Contacted                   Total Number of Staff Discussions
2010  Participating practices (Round 1)   4                16 treatment practices; up to 8 control practices; 4 community partners  72 practice staff; 4 community partners
2012  Practices that withdrew             –                6 treatment practices                                                    6 practice staff
2014  Participating practices (Round 2)a  4                16 treatment practices; up to 8 control practices; 4 community partners  72 practice staff; 4 community partners

a Round 2 interviews are conducted with the same practices that were interviewed in Round 1.

2. Procedures for the Collection of Information

EHRD OSS
A self-administered web survey will be the primary data collection mode for the EHRD
OSS. Respondents will be sent (1) a letter, printed on CMS letterhead and signed by the CMS
Privacy Officer, describing the survey and providing a secure login and PIN; and (2) a fact sheet
of answers to commonly asked questions about the study and the survey. (A copy of the letter is
included as Appendix D to this submission; Appendix E contains the fact sheet.) The OSS
questionnaire (Appendix F) has been designed so that office managers or administrators (the likely respondents) will be able to complete the survey in 29 minutes. We expect that 90 percent
of responding practices will complete the OSS on the web, with a small number completing it via
a paper-and-pencil questionnaire.
The following topics will be covered by the EHRD OSS:
• Section 1: General Practice Information. This section collects information on the
enrolled practices, including name, address, telephone number, affiliation with an
independent practice association, and participation in any other quality reporting or
improvement initiatives.
• Section 2: Provider Profile. This section asks about the providers enrolled in the
demonstration, such as their name, specialty, credentials, primary practice location,
and provider and Medicare billing numbers (provider identification number or PIN).
• Section 3: Use/Planned Use of EHRs, E-Patient Registry, or E-Prescribing
Systems. This section collects information about the various types of electronic
systems (EHRs, patient registries, and prescribing) currently in use or planned for use
in the practice, and the number of providers who use these systems.
• Section 4: EHR, Patient Registry, and Prescribing System Functions. This section collects information about the various functions that practices use for each of the systems identified in Section 3, and the proportion of patients for which they use each function. Functions are organized under five domains: (1) completeness of information; (2) communication of care outside the practice; (3) clinical decision support; (4) use of the system to increase patient engagement/adherence; and (5) medication safety.
• Section 5: Data Attestation. This section requests that the respondent confirm that the
responses are a correct assessment of the practice, that the survey responses are
accurate, and that they may be subject to validation.
Each round of the OSS will be administered over a two-month period beginning in April.
We will establish a toll-free help desk and general email address at MPR for assistance with the
survey. Table C.5 presents the expected data collection activity by week during the field period.
The advance letter will be mailed to practices just prior to the start of data collection (i.e., in late
March) when the web survey is available to access online. We will supplement the advance letter
with three follow-up mailings: (1) a thank-you postcard or email sent to all practices roughly 2
weeks later, (2) a reminder postcard or email to all nonresponding practices 4 weeks later, and (3) a second reminder postcard or email to all nonresponding practices 6 weeks later. About
halfway through the field period (or 4 weeks after the advance letter mailing), MPR will call
nonresponding practices to request their participation, answer questions, and identify practices
that prefer a paper-and-pencil questionnaire. Practices that request such a version will be sent
one. Data collection will continue through week 8.
We will validate OSS responses for 25 percent of responding treatment practices using the
OSS Validation Form (Appendix G). We will ask practices to complete the Validation Form
after they have completed the OSS in order to verify their survey responses.
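Selecting the 25 percent validation subsample could be implemented as a simple random draw from the responding treatment practices. This is a sketch under assumptions: the text specifies only the 25 percent share, so simple random sampling, the function name, and the seed are ours.

```python
import math
import random

def select_validation_sample(responding_ids, fraction=0.25, seed=0):
    """Draw the subset of responding treatment practices whose OSS answers
    will be checked against the Validation Form.

    Simple random sampling without replacement is an assumption here; the
    text specifies only that 25 percent of responding treatment practices
    are validated.
    """
    # Round up so that even a small pool of respondents yields a sample.
    k = math.ceil(len(responding_ids) * fraction)
    return sorted(random.Random(seed).sample(list(responding_ids), k))
```

For example, with all 400 treatment practices responding, the draw would contain 100 practice IDs.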
TABLE C.5

OSS DATA COLLECTION ACTIVITY BY WEEK

Week of Data Collection   Activity
1                         Advance letter mailed to practices
2                         Thank-you postcard/email sent to practices
4                         Reminder postcard/email sent to nonresponding practices
4                         Telephone reminder calls to nonresponding practices
5                         Paper-and-pencil questionnaires mailed/faxed to practices that request them
6                         Second reminder postcard/email sent to nonresponding practices
8                         Survey field period ends

MCMP OSS
A self-administered web survey will be the primary data collection mode for the MCMP
OSS. Respondents will be sent a letter, printed on CMS letterhead and signed by the CMS
Privacy Officer, describing the survey and providing a secure login and PIN, and a fact sheet of
answers to commonly asked questions. A copy of the letter is included as Appendix H to this
submission; the fact sheet is in Appendix I. The OSS questionnaire (Appendix J) has been
designed so that office managers or administrators (the likely respondents) will be able to complete it in 29 minutes. Although the EHRD and MCMP OSS instruments cover many of the
same topics, the organization and content of the specific survey questions differ between the two
instruments.
The following topics will be covered by the MCMP OSS:
• Section 1: General Practice Information. This section collects information on the
enrolled practices, including name, address, telephone number, affiliation with an
independent practice association, and participation in any other quality-reporting or
improvement initiatives.
• Section 2: Provider Profile. This section asks about the providers enrolled in the
demonstration, such as their name, specialty, credentials, primary practice location,
and provider and Medicare billing numbers (PIN).
• Section 3: Office Practice. This section asks about changes in practice workflows
that have occurred as a result of transitioning from paper to electronic systems.
• Section 4: Electronic Health Record. This section collects information about the use
or planned use of an EHR in the practice, the types of functions used, and the
proportion of patient visits/encounters for which the functions are used.
• Section 5: Patient Registry/Care Management Processes. This section collects
information about the use or planned use of an electronic patient registry or care
management process in the practice, the types of functions used, and the proportion of
patient visits/encounters for which the functions are used.
• Section 6: Electronic Prescribing. This section collects information about the use or
planned use of electronic prescribing in the practice, the types of functions used, and
the proportion of patient visits/encounters for which the functions are used.
• Section 7: Data Attestation. This section requests that the respondent confirm that the
responses are a correct assessment of the practice.
• Section 8: Final Comments. This section invites the respondent to provide any
additional comments.
MPR’s goal is to complete surveys with 980 practices (490 demonstration and 490
comparison) for a 70 percent response rate from the 1,399 practices. The data collection
procedures for conducting the MCMP OSS are the same as for the EHRD OSS, except that the field period will run for 17 weeks, no validation of MCMP survey responses will be conducted, and no $50 incentive will be offered to comparison group practices for completing the survey.
 


EHRD Discussion with Demonstration Practices and Community Partners
We will use a semistructured guide for discussions with practices and community partners
(see Appendix K for the discussion guides for treatment and control practices, withdrawn
practices, and community partners). The guides for treatment and control group practices will be
similar, except that questions pertaining to the demonstration will not apply to the latter.
We will conduct in-person visits across all four sites for practices selected for discussions.
Each site visit will be led by a two-person MPR team consisting of (1) a senior project team
member, and (2) a research analyst responsible for documenting each visit and making sure the
content areas are covered.
Scheduling for each contact will begin four to eight weeks before the target date for the
contact. To each practice selected for discussion, we will send an introductory letter, signed by a
CMS Project Officer, describing the study and encouraging participation. A copy of this letter is
included as Appendix L to this submission. Follow-up calls will be made a week later, and many
offices will need to have the letter faxed or emailed to them again. A confirmation letter will be
faxed or emailed to sites that agree to participate, and a second confirmation call will be made
the week before the visit.
Each office visit will be two to three hours long, depending on practice size, and will include
discussions with three office staff: (1) a physician representative (30 minutes in small practices,
up to 60 minutes in larger ones); (2) a nurse or other clinical staff member involved in care
management and care coordination (30 minutes); and (3) an administrative person responsible
for overseeing the adoption and implementation of the IT system (60 minutes). The physician
who has been most involved with any changes (or planned changes) in response to the
demonstration will be targeted for the physician discussion. If all potential physician respondents
are equal in this regard, the one with the most Medicare patients will be targeted. In practices with a medical director (assuming the person is also a practicing physician), he or she will be
interviewed as the physician representative, and the time frame will be extended from 30 to 60
minutes. For larger practices that have a chief executive officer, a chief financial officer, a
marketing director, or similar personnel who could inform the project, we will schedule an
additional 45-minute group interview.
Discussion topics will include (1) practice perspectives on the demonstration; (2) facilitators
and barriers to adopting and implementing HIT; (3) participation in other pay-for-performance or
quality-reporting programs; (4) adaptation of practice operations as HIT is implemented; (5) staff
attitudes toward HIT; (6) staff use of HIT functions and changes in job responsibilities as a result
of adopting HIT; and (7) effects of HIT on the practice, such as increasing orientation to quality
and safety.
For each practice that uses an EHR, e-prescribing, or e-registry for at least some care
management functions, a 30-minute demonstration will be requested to show in real time how
they use their EHR for care management functions. In the second round of in-person visits, the requests to demonstrate will be based on the practice’s year 4 OSS responses.
In addition, during the site visit discussions, documentation related to the types of changes
that are discussed may be requested, with any personal health information de-identified.
Examples of such documentation include a copy of an internal quality indicators report that the
practice produces, or a copy of the type of self-management plan the practice uses its system to
generate. Any documentation obtained will contain no identifiable information. Under no
circumstances will any personal health information be recorded in discussion notes.
Before each round of practice contacts, we will have a one-hour discussion with key staff
from each of the community partners who helped CMS recruit, and provide outreach and
education to, the practices enrolled in the demonstration. Discussions with community partners will include several topics: (1) partners’ experience recruiting practices, (2) practice needs for the
demonstration to be successful, (3) plans for working with and facilitating assistance to practices,
(4) perceptions of practices’ progress under the demonstration, and (5) other (non-EHRD) HIT
activities in the site. This will provide a solid background on the demonstration recruitment
(Round 1) and operational experience (Round 2), as well as communication strategies and
messages (both rounds) prior to undertaking any practice-level discussions. Such discussions will
greatly assist the MPR team in interpreting practice staff statements and probing appropriately.
We will conduct telephone interviews with up to 6 practices that withdraw from the
demonstration, to determine the reasons for withdrawal and assess whether withdrawing
practices are systematically different from remaining practices. Discussions with withdrawn
practices will include (1) why the practice enrolled in the demonstration and what it hoped to
gain; (2) why the practice decided to withdraw (extensive probes); (3) whether the practice
participates in other pay-for-performance programs and, if so, how they differ from the EHRD; (4) if
not, under what circumstances they would participate in a pay-for-performance program again;
and (5) what, if anything, could have been done differently that would have prevented the
withdrawal.
3. Methods to Maximize Response Rates

EHRD OSS
MPR will send an initial advance mailing to alert both treatment and control group
practices about the survey. The letter will be printed on CMS letterhead, personally addressed,
and signed by the CMS Project Officer. It will include the email address and toll-free telephone
number of Mindy Hu, MPR’s survey manager for the OSS, whom practices can call for
assistance, as well as a fact sheet with answers to commonly asked questions. Treatment
practices will be aware that an annual survey is conducted as part of the demonstration and will be motivated to respond, since their payment will be calculated based on their responses. Control
group practices will also be aware of the survey but may be less motivated to respond since they
will receive no payment. Therefore, to ensure a high response rate, we will offer control group
practices $50 to complete the OSS.
Roughly two weeks after the initial letter is mailed, postcards or emails will be sent to all
practices to thank those who completed the survey and to encourage those who haven’t
responded to log on and do so. Another round of reminder postcards or emails will be sent
roughly four weeks after the initial letter is mailed. About four weeks after the initial mailing,
telephone interviewers trained at negotiating with gatekeepers for access to health care
administrators and physicians will make reminder calls to sampled practices. The purpose of
these telephone contacts will be to encourage participation in the OSS, answer questions, and
identify practices that prefer to complete a paper-and-pencil survey. Such surveys will be mailed
to those practices that request one, along with a postage-paid, addressed return envelope. A final
round of reminder postcards or emails will be sent roughly six weeks after the initial letter. These
efforts are projected to yield a response rate of 100 percent among treatment group practices and
70 percent among control group practices.
MCMP OSS
The MCMP OSS data collection will use the same methods to maximize response rates as
the EHR OSS, except that the field period will be 17 weeks long and comparison group practices
will not receive any incentive for completing the survey. These efforts are projected to yield a
response rate of 70 percent for both demonstration and comparison group practices.

 


EHRD Discussions with Demonstration Practices and Community Partners
MPR will take a number of steps to gain practices’ participation in the in-person and
telephone discussions. All practices will be sent a personalized introductory letter, printed on
CMS letterhead and signed by the CMS Project Officer, providing an overview of the study and
highlighting the importance of the practice contacts to the evaluation. The letter will provide the
telephone number and email address of Ms. Sue Felt-Lisk, MPR task leader for the
implementation analysis.
Practices will be contacted by telephone within a week of the mailing so that the letter is still
fresh in their minds. During the calls, trained recruiters will emphasize that their participation is
important and that the schedule will be arranged to accommodate the demands of their practice.
We will re-mail or fax a copy of the letter to practices that did not receive or that misplaced their
letter, and follow up with these practices a few days afterwards. Unresponsive practices will be
recontacted on a regular basis, unless they have clearly refused to participate. After a practice has
agreed to participate, we will send a confirmation letter and call to confirm a week prior to the
in-person or telephone discussion.
4. Tests of Procedures or Methods

EHRD OSS
We conducted a pretest of the OSS with a convenience sample of eight practices that have
and use EHR systems in central New Jersey. We made initial calls to solicit their cooperation,
and mailed a paper-and-pencil questionnaire to eight practice administrators or office managers,
instructing them to fill out the questionnaire, keep track of how long it took to complete it, and
fax it back to MPR. A telephone debriefing was conducted with pretest respondents about their
experience completing the survey. MPR staff asked questions to assess respondents’ cognitive
understanding of key terms and to identify any problems in answering questions or navigating the instrument. Estimates of the time it took to complete the questionnaire were also obtained, as well as whether the respondent needed help from other practice staff and the level of effort required of those staff.
MPR staff asked the same eight practices to complete the Validation Form and to estimate the time required to complete that instrument. The goal was to obtain a burden estimate for the validation of OSS responses. Estimates of how long it took to complete the Validation Form were obtained, as well as whether the respondent needed help from other practice staff and the level of effort required of those staff.
MCMP OSS
A pretest of the MCMP OSS was conducted with a convenience sample of three practices in central New Jersey that have and use EHR systems. The pretest procedures were the same as those used for the EHRD OSS pretest.
EHRD Discussions with Demonstration Practices and Community Partners
The discussion guides developed for the EHRD evaluation were taken in large part from
guides developed and used on similar evaluations to interview practice staff and local
community agency staff. The guides have been tested and validated on these other evaluations
and therefore do not need to be pretested.
5. People Involved in Design

The following people have contributed to the study design and to the design of the OSS instruments, site visit protocols, and telephone interview protocols:
• Ms. Jennifer Schore, an MPR senior health researcher and study project director,
(609) 275-2380
• Ms. Martha Kovac, an MPR associate director of survey research and study survey
director, (609) 275-2331
 


• Ms. Sue Felt-Lisk, an MPR senior health researcher and study co-principal
investigator, (202) 484-4519
• Dr. Lorenzo Moreno, an MPR senior health researcher and study principal
investigator, (609) 936-2776
• Dr. Lorraine Johnson, CMS Project Officer, Office of Research, Development, and
Information, (410) 786-9457

 


REFERENCES

Thompson, Tommy G., and David J. Brailer. “The Decade of Health Information Technology:
Delivering Consumer-centric and Information-rich Health Care. Framework for Strategic
Action.” Washington, DC: Department of Health and Human Services, July 1, 2004.
Wilkin, John C., Kerry E. Moroz, Erika G. Yoshino, and Laurie E. Pekala. “Electronic Health Records Demonstration Waiver Cost Estimate.” Columbia, MD: Actuarial Research Corporation, December 13, 2007.

 



File Typeapplication/pdf
File TitleEvaluation of the Electronic Health Records Demonstration (EHRD) and theMedicare Care Management Performance (MCMP) Demonstratio
AuthorMartha Kovac, Jennifer Schore, Nancy Duda, Mindy Hu
File Modified2009-05-28
File Created2009-04-30

© 2024 OMB.report | Privacy Policy