A Coalition Builds a Report Card

By Brian Schilling

Summary

The Puget Sound Health Alliance is a Seattle-area coalition of payers, purchasers, providers, and consumers working together to improve health care quality and control costs. In 2005, the group began work on an initiative to make information about the quality of care at area hospitals and clinics readily available to consumers. In September 2008, the Alliance launched the “Community Checkup,” a free, online comparison tool with more than 20 different performance measures. The effort enjoys broad community support and is widely cited as a model public reporting effort.

Background

The Puget Sound Health Alliance, formed in 2004 to address health cost and quality issues, counts among its members more than 150 different public and private employers, health plans, physicians, hospitals, and consumers. These organizations represent more than 1.5 million people in Washington State's Puget Sound region, or about one in three residents. Every health plan in the state participates, as do most major clinic groups.

The collaborative, inclusive nature of the Alliance reflects the vision of its founder and first president, Ron Sims, now deputy secretary of Housing and Urban Development in the Obama Administration.  Sims formed the Alliance and recruited its initial members in early 2005, several years into his tenure as King County Executive—during which he watched area health care costs skyrocket. 

“Alliance leaders know that we can accomplish far more together than any single health plan, medical group, hospital, employer, or patient could on their own,” said Mary McWilliams, executive director of the Alliance. “We encourage everyone to participate, from the largest employers to individual consumers who want to help improve health care in the region.”

The organization has established an impressive track record in its first four years:

  • Care Guidelines -- One of its initial efforts involved reaching community agreement about the national evidence-based clinical guidelines that should be used in the region to ensure effective treatment of diabetes, heart disease, back pain, asthma, and depression. These guidelines ultimately became the basis for the measures in the public Community Checkup report. 
  • Health IT -- The Alliance has successfully pushed for greater adoption of health information technology, and has been involved in awarding more than $2.5 million in grants to clinics and hospitals for adoption of such technology.
  • Audience-specific Information -- The Alliance also provides a wide variety of information resources to promote evidence-based medical care.  Consumers, for example, can find tips and advice on managing their health in the form of one-page fact sheets on certain common illnesses.  Similar materials are available for employers and providers to distribute to their employees or patients.

A longstanding goal of the organization was to produce a report comparing health care provided in the region's medical groups and hospitals. Such efforts are rare in the U.S.; only a handful of regional models, in Minnesota and Wisconsin, exist. One reason for the paucity of such “report cards”: they’re tough to get right. Clinics and physicians tend to focus on what’s being measured and how, insisting on fair, objective metrics that support improvement efforts; consumers tend to gravitate toward simplified reports or rankings; and employers may prefer different metrics altogether, focusing on cost and value.

What Did They Do?

Initial Discussions: In keeping with its commitment to collaboration, the Alliance’s first step in producing a comparison report was to discuss it among the growing number of organizations at the table.  Once the draft approach to developing the report was defined, the Alliance sought input from area physicians, purchasers, health plans, and the public to refine the approach. More than 40 presentations were given to physician groups across the five counties to pose specific questions and give doctors the opportunity to weigh in. 

The Alliance hoped to get about 30 clinics to step forward. Instead, once certain medical groups volunteered, others followed suit. The inaugural report, launched in January 2008, showed results for more than 80 clinics in the region. The second report, published in November 2008, had results for more than 170 clinics.

Limiting the Report’s Scope: One of the first decisions the Alliance made was a pragmatic one: limiting the report to measures that could be calculated using claims data. That decision didn’t limit measurement options as much as one might expect. Claims data can reliably show whether patients received various needed tests (cancer screening, cholesterol checks) and whether unnecessary tests were avoided (antibiotics for the common cold)—making it possible to assess care for everything from diabetes to heart disease to depression to asthma. Over time, the Alliance intends to expand the Community Checkup to include measures based on data from registry software, labs, and patient satisfaction surveys.
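To make the claims-based approach concrete, the sketch below shows the arithmetic behind a typical measure: for each clinic, count the eligible patients and the subset whose claims include the recommended service, then report the ratio. It is a minimal illustration only; the field names, sample records, and single "screened" flag are hypothetical, not the Alliance's or its vendor's actual data model.

    from collections import defaultdict

    # Hypothetical claims-derived records: each row names the clinic, the
    # patient, and whether a qualifying service (e.g., a cancer screening)
    # appears in that patient's claims during the measurement period.
    claims = [
        {"clinic": "Cedar Clinic", "patient": "p1", "screened": True},
        {"clinic": "Cedar Clinic", "patient": "p2", "screened": False},
        {"clinic": "Harbor Medical", "patient": "p3", "screened": True},
    ]

    def measure_rates(claims):
        """Per-clinic rate: share of eligible patients whose claims show the service."""
        eligible = defaultdict(set)   # clinic -> all eligible patients
        met = defaultdict(set)        # clinic -> patients who received the service
        for row in claims:
            eligible[row["clinic"]].add(row["patient"])
            if row["screened"]:
                met[row["clinic"]].add(row["patient"])
        return {c: len(met[c]) / len(eligible[c]) for c in eligible}

    print(measure_rates(claims))  # {'Cedar Clinic': 0.5, 'Harbor Medical': 1.0}

The same counting logic covers both kinds of measures mentioned above: a "needed test" measure rewards a high rate, while an overuse measure (such as antibiotics for the common cold) is better when the rate is low.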

Ultimately, the group selected 21 measures of medical group care and 41 measures of hospital care to include in the report card. The medical group measures focus on issues such as prevention (cancer screening); overuse (prescriptions for antibiotics); generic drug use; and certain chronic conditions (asthma, depression, diabetes, and heart disease). The hospital measures focus on care related to heart failure, pneumonia, surgery, and patient satisfaction. For the medical group results, local physicians validated the draft results, which were calculated using measures and guidelines from the National Quality Forum, the Institute of Medicine, and NCQA’s HEDIS.

Getting Consumer Buy-in: To ensure the report meets the needs of consumers, the Alliance vetted an early draft with its Consumer Advisory Group, a 15-member panel of area patients that initially met monthly to advise the Alliance Board. Feedback from that group helped cement the Alliance’s decision to use a Consumer Reports™–style approach to presenting the results. Full, half-full, or empty circles indicate whether a clinic or hospital is above, at, or below the regional average on a given measure.
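A minimal sketch of that presentation logic appears below. The three-point "average" band is an assumption for illustration; the report's actual statistical test for whether a clinic differs from the regional average is not described here.

    def checkup_symbol(clinic_rate, regional_avg, band=0.03):
        """Map a clinic's rate to a Consumer Reports-style circle.

        Rates within +/- `band` of the regional average count as average;
        the 3-point band is illustrative, not the Alliance's actual test.
        """
        if clinic_rate > regional_avg + band:
            return "●"   # full circle: above the regional average
        if clinic_rate < regional_avg - band:
            return "○"   # empty circle: below the regional average
        return "◐"       # half-full circle: at the regional average

    print(checkup_symbol(0.82, 0.75))  # ● (above average)
    print(checkup_symbol(0.74, 0.75))  # ◐ (at average)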

“Patients and other consumers play a big role in their health and health care, so to get people to use the report it must be user-friendly for the average person,” said Diane Giese, Alliance director of communication and development. “Other comparison reports exist, even in our region, but they’re often complicated and difficult to navigate.” 

As user-friendly as the comparison report is, the Alliance is careful not to oversell it.  The report notes that there is more to quality care than what it measures and that people should not use the report to choose a doctor, but rather to improve their relationship with their doctor.

Ensuring Provider Support: To ensure provider support for the effort, the Alliance sought input from the Washington State Medical Association on the proposed reports. That feedback convinced the Alliance to produce an entirely separate, more detailed report for use by clinics. Those private reports include physician-level data, which enables clinics to target interventions and refine practice patterns. The same data, however, could easily be misinterpreted by consumers.

“A practice manager has the context to know whether a physician’s very high or very low rate on a given measure means what it looks like it means,” said Karen Onstad, director of health information for the Alliance.  “Even with our large multi-payer database, when results are attributed to a specific clinician the amount of data upon which the result is based can be very small, so it’s important that the person doing the interpretation understands statistical validity.” 
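Onstad's caution can be made concrete with a confidence interval: the same observed rate is far less informative when it rests on a handful of patients. The sketch below uses the standard Wilson score interval; the choice of interval and the sample numbers are illustrative, not the Alliance's methodology.

    from math import sqrt

    def wilson_interval(successes, n, z=1.96):
        """95% Wilson score confidence interval for a proportion."""
        if n == 0:
            return (0.0, 1.0)
        p = successes / n
        denom = 1 + z**2 / n
        center = (p + z**2 / (2 * n)) / denom
        half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return (center - half, center + half)

    # The same 60 percent rate at the clinic level vs. for one physician:
    print(wilson_interval(300, 500))  # (~0.56, ~0.64): precise enough to act on
    print(wilson_interval(6, 10))     # (~0.31, ~0.83): far too wide to judge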

Data Gathering and Analysis: One of the decisions the Alliance made early on was to negotiate individual data-sharing agreements with each of the 14 health plans and self-insured purchasers that contributed data to the effort. The process was described as gut-wrenching, but ultimately a key success factor.

“If we had attempted to apply a uniform agreement to the disparate organizations that supplied their data, we may not be enjoying the continued and strong support we have today,” said Onstad. “Going slow and working with each organization separately helped us address everyone’s concerns and get stronger buy-in.”

Data aggregation and analysis are conducted by a third-party vendor, Milliman, Inc., a highly regarded actuarial and consulting firm with roots in the Seattle area. Milliman was selected in part because the firm does more than crunch numbers: it has the expertise to advise the Alliance on a planned next-generation report card featuring data derived from electronic medical records.

A Paper Version: Before the online version of the report was launched, a paper version was released.  Well before then, clinics and medical groups privately received their own draft results for review, comment, and, if necessary, correction.

Promotion: With no real budget for promoting the public report, the Alliance relied on its member organizations and the local media to raise awareness about the launch. Member organizations were given talking points and encouraged to highlight the report in company newsletters and at appropriate gatherings. At the Alliance, Giese spent the weeks leading up to the launch pitching the story to area media, many of whom jumped at it.

What Happened?

In its first month, September 2008, the Community Checkup Web site drew about 5,000 unique visitors. Traffic climbed to 7,000 by November. In a typical month, the site now draws about 1,000 visitors.

The fact that 35 percent of the site’s visitors come from participating organizations is good news. One of the main goals of the report has always been to support quality improvement efforts, and that so much traffic is coming from participating organizations suggests they are looking at the data carefully.  Anecdotes suggest the same thing.  One clinic noticed that it was scoring below average on all four generic prescribing measures and responded by implementing a plan to be more conscientious about prescribing habits. The group also began tracking progress on the measures through its electronic medical record system.  Another clinic reported accelerating plans to transition to an electronic medical record system, and another launched a breast cancer screening awareness campaign to help improve its scores on related measures.

“What we’ve heard from medical groups, hospitals, and other organizations about how they are using the report confirms that everyone is committed to making changes to improve patient care,” said McWilliams. “Not until the Alliance existed did we all have a way to agree on what was needed and then take coordinated action to improve health care value.”

Local media have embraced the report. Every major paper in the region has run articles or editorials lauding the effort, and some have done so more than once. National Public Radio, the Associated Press, and several professional journals also covered the launch of the report card.

The Future

The Alliance is presently working to develop ways to measure and report on "resource use" or efficiency, a special area of interest for the public and private employers involved in the effort.  It is also working on the best way to gather data to measure and report on patients' experiences in medical groups.  Further in the future, the Alliance expects to incorporate measures based on data from electronic medical records and possibly expand the report to feature Medicare data.

Annual fees from members and other general Alliance support so far have been sufficient to fund the reporting effort, but producing and maintaining such a report is not cheap; the Alliance estimated that it spent about $1.1 million in 2007 alone (much of it for start-up costs) to get the comparison report up and running. Annual operating costs—to collect data, maintain the Web site, refine the report, etc.—will be some fraction of that amount, but significant nonetheless.

Lessons Learned

  • Collaboration is critical. Giese traces the success of the Community Checkup effort to the culture that was established at the Alliance from the outset. “Getting everyone to the table and building trust is absolutely imperative,” said Giese. “A comparison report that will be trusted and used has to be a collaborative effort.” 
  • Don’t be afraid to take the difficult route. The Alliance’s experience negotiating data-sharing contracts was particularly instructive. Rather than attempt to apply a uniform agreement, the Alliance negotiated all 14 initial data-supplier agreements separately. The individual attention was appreciated by participants and helped cement relationships during the early stages of the effort.
  • Don’t let the perfect be the enemy of the good. The Alliance has never been afraid to point out the limitations of the Community Checkup report.  It views the effort as a platform on which to build even better versions of the report (with more data, more measures, and more frequent updates) in the future.
