Newsletter Article


Health Affairs Examines Why Medical Providers Often Don't Provide the Most Effective Treatments

By Rebecca Adams, CQ HealthBeat Associate Editor

October 9, 2012 -- Researchers often puzzle over why medical practitioners treat patients in ways that studies have shown to be ineffective or less effective than alternatives. A new article in Health Affairs identifies five reasons it takes so long for evidence-based practices to change what health care professionals do.

The article by RAND Corp. researchers suggests that perverse financial incentives, ambiguous results, biases among medical professionals, different goals for information and the limited use of tools that promote best practices in decision-making are common reasons why treatment is not always based on the best evidence.

"The nation is making substantial investments in new comparative effectiveness research in the hope the results will improve the quality of medical care and reduce its cost," said a statement from Justin Timbie, who, along with Eric Schneider, was a lead researcher for the study. "Before we can achieve these benefits, we must address the issues that impede the translation of evidence into medical practice."

In a separate article, Dartmouth professor Harold Sox argues that officials at the Patient Centered Outcomes Research Institute (PCORI) should develop a greater sense of urgency and "plan its research agenda strategically, so that it addresses research questions that comparative effectiveness research could answer quickly and decisively." That article is also included in the October issue of Health Affairs.

Barriers to Adoption of Best Practices

The RAND Corp. researchers noted that some treatment approaches changed quickly after new evidence emerged. However, the authors said that "translating evidence into changes in clinical practice is rarely rapid." For instance, prostate-specific antigen testing is still often used despite evidence that it offers little benefit, the authors wrote. And, they found, cost-effective treatments for patients with hypertension, such as thiazide diuretics, are not used as much as heavily marketed alternatives that are less effective.

One explanation is that "perverse financial incentives push both patients and providers to disregard the evidence and pursue aggressive treatments even if they are no more effective than more conservative treatment approaches," according to the study. Under fee-for-service systems, providers are paid well for performing procedures while they receive little or no compensation for the time it takes to counsel patients about varying treatment options.

Medical providers also might hear incomplete information about study results, in part because industry groups can finance biased publicity about research, and there is often little funding provided to disseminate results in an objective manner.

Another issue is that sometimes it's difficult for providers to sift through ambiguous results, which "become fuel for competing interpretations," the authors said.

In addition, medical professionals often have three kinds of cognitive biases: a tendency to accept evidence that confirms previous understanding and reject anything that upends pre-conceived ideas; a preference for intervention over inaction; and using newer technology, with the thought that newer technology is better.

A separate challenge is that sometimes clinicians and patients have different expectations about research. Some are interested in personalized medicine that can tailor treatments to the patients who will most benefit from an approach, while others "might prefer research whose results can be generalized to larger populations," the study said.

Decision-making support tools could help overcome providers' tendencies to ignore recent research, but they are not widely used, the study said.

The authors offered three ways to reduce barriers to the adoption of evidence-based medicine.

One idea is to reach consensus on a study's goals and the standards for interpreting its results before the research starts. The article said that most comparative effectiveness studies should include a consensus development process to debate the appropriate design of each study, reducing the chance that investigations are ignored after they are done.

Additionally, multidisciplinary teams should develop treatment guidelines, rather than such efforts being dominated by one kind of medical specialty.

The researchers said that a third way to push for more evidence-based care is to increase the use of payment systems that pay providers for the most effective treatments.

PCORI's Progress

The second article, by Sox, argued that comparative effectiveness research should focus on specific, high-impact questions and proceed quickly, given that authority for a major funding source for PCORI will expire in 2019 unless Congress reauthorizes it. The health care law (PL 111-148, PL 111-152), which created PCORI, said that a trust fund for it can no longer be used after Sept. 30, 2019.

"The fledgling institute's leadership has a difficult task," Sox wrote. "The institute is a start-up that has one year to scale up to an organization that dispenses a half-billion dollars in research funding each year and only seven years to win reauthorization."

Sox's concerns about the pace of PCORI's action echo those of others who would like the nonprofit to target questions that would help a large number of patients judge the effectiveness of different treatments. PCORI officials have decided to initially accept research grant proposals without limiting applications to specific medical dilemmas. Some critics say PCORI needs to decide first which treatment questions it is most interested in rather than allowing applications on an unlimited array of issues.

"PCORI's actions have not conveyed a sense of urgency or strategic direction," Sox wrote. "The institute's first substantive utterance—its National Priorities for Research and Research Agenda—does not list high-priority research questions or specify research methods to address them. It leaves these important tasks entirely to the research community, to patients and to stakeholders. This approach could be effective in the long term, but it is not sufficiently responsive to the urgent circumstances dictated by the 2019 sunset date. PCORI must start now to implement a strategy to make the largest possible impact before its day of reckoning."

Joe V. Selby, PCORI's executive director, said in a statement that the institute "is aggressively developing a robust, systematic multi-stakeholder process for identifying and prioritizing research questions for targeted funding. We expect to issue a first set of targeted research funding calls within about 90 days, and to have a fully developed process in place early in 2013. Additional funding announcements designed to support specific research questions will follow." An aide to Selby said he had not yet read Sox's article.
