Algorithm 2: Know Your Costs

How can we control our costs?

Hospitals are in the hot seat. Every week in the media, some article lasers in on the outrageously high and variable costs of health care and how those costs are bankrupting both individuals and the country. No one can argue the point. This year, the U.S. is projected to spend nearly $2.8 trillion on health care—18 percent of the entire U.S. GDP—which is more than Canada, Japan, Germany, France, China, the U.K., Italy, Brazil, Spain, and Australia will spend combined.

The high cost of health care in the U.S. has been reported in the media for years. What has changed is the intensity of the public’s focus on the issue, as well as the shifting of blame from insurance companies to the hospitals themselves. In his scathing Time magazine exposé on hospital medical bills, Steven Brill spotlights egregious charges, like a 10,000 percent markup on acetaminophen, a $77 price tag for a box of gauze pads, and lab work that cost more than a car.

What’s not garnering as much attention is a more complicated, if no less disturbing, angle to the health care cost story. The truth is that we are in the dark about where all that money is going. Health care systems like ours, let alone individual providers, have very little idea what their actual costs look like, or how they break down over the full cycle of a patient’s care. We’re not referring to the charges billed or reimbursements paid, but to the true, real-world costs. “There’s a lot of fiction floating around here, and nobody’s been able to get to the truth,” admits Chief Financial Officer Gordon Crabtree, M.B.A. Moreover, we have even less of an idea how, or if, all the money we’re spending is improving patient outcomes or experiences.

In our defense, the current structure of health care pricing is unlike that of any other industry. Former Secretary of Health and Human Services Michael Leavitt once said, “The way we price health care cannot be understood by a human being of average intelligence and limited patience.” And the sheer numbers are overwhelming. A knee replacement without complications, for example, racks up an average of 1,300 cost allocations under 20 different organizations using 13 different costing methods. The box of gauze pads would be one of those charges.

If that sounds like an excuse, it’s not. Rather, the complex nature of health care pricing makes an even stronger case for the imperative to know where our money is going and how it’s impacting the health of patients.


It was Michael Porter, Ph.D., and Robert Kaplan, Ph.D., who thrust this decades-overdue costing conversation into the spotlight two years ago with a game-changing paper, “The Big Idea: How to Solve the Cost Crisis in Health Care.” Published in Harvard Business Review, the paper zeroed in on providers’ “complete lack of understanding” about the costs of health care delivery. This costing void, they explained, makes it nearly impossible to improve processes, eliminate unnecessary procedures, and deliver better outcomes. According to the Harvard business professors, solving the cost crisis would be the “single most important lever to transform the value of health care.”

On the heels of the Kaplan-Porter article, Senior Vice President Vivian S. Lee, M.D., Ph.D., M.B.A., met with the department chairs of the School of Medicine in the spring of 2012 and challenged them to find a way to lower costs. The conversation turned out to be a pivotal moment that brought to light a huge missing piece of the costing mystery: There was no way for the chairs, or anyone, to tackle costs because the data didn’t exist.

The next day, Lee used a rare opening in her calendar to rally a group of senior leaders from finance, decision support, quality improvement, biomedical informatics, and IT. By the end of the impromptu hour-long meeting, the group had arrived at an ambitious goal: to create, by the end of the summer, a tool that would provide access to real-time, accurate costing data at the provider or patient level and, better yet, overlay outcome and patient satisfaction data against costs.

“We’re looking to change the value proposition altogether,” says Lee, “so we can deliver the best outcomes at the lowest possible cost and with the greatest patient satisfaction.”


Six months later, we had a revolutionary tool that we call Value Driven Outcomes, or VDO. To get an idea of the Herculean effort behind the creation of VDO, visualize this: Each year, the costing data set for the University of Utah Health Care system includes approximately 135 million rows of data, with each row as wide as a football field. To develop the tool, we took some of the brightest minds from four key areas (decision support, biomedical informatics, IT, and the medical group), released them from the responsibilities of their day jobs, put them in a room together and shut the door. In short, we sequestered them. The team brought together expertise across a variety of disciplines to work on a single task: getting a clinically focused costing tool up and running within months, not years. The group’s makeshift command center was a 25-by-25-foot room devoid of any furnishings or decorations except rows of office cubicles and, occasionally, stacks of empty pizza boxes. For the next six months, that nondescript room became their second home, as many of them spent virtually their entire summer working there.

In the sequestered environment, the team was able to speed up the process of collaboration and innovation. “Being sequestered meant we could say no to everything else,” says Charlton Park, M.B.A., M.H.S.M., director of decision support and cost accounting. Communication also flowed freely. “Instead of sending an email and waiting days or weeks for a response, we could just stand up and talk to each other over the cubicles. We got our answers immediately.”

Every single person in that room was so vital, says Kensaku Kawamoto, M.D., Ph.D., assistant professor of biomedical informatics and associate chief medical information officer, that if someone was sick, important aspects of the project would come to a screeching halt.

As the deadline drew near, some team members were working 100-hour weeks. “It was intense but needed,” says Cheri Hunter, IT director of the data warehouse, who saved some of the instant messaging conversations held at 2 and 3 a.m. “We were willing to work late because we felt like we were doing something that would really add value.” Hunter believes that kind of sequestered environment is a model for how teams can speed the pace of innovation. “So many things should be run like this,” she says. “It was the ideal blending of business, technical, medical, and top-level leadership expertise.”

Park agrees, calling it “the perfect project.” Released from their siloed everyday jobs, collaborating across disciplines and reporting directly to the senior vice president and her executive team, each person on the team had a tremendous opportunity to add value to the tool they were creating. The team members also realized that the work they were doing could potentially be the transformational lever that Porter and Kaplan called for.


With a mature data warehouse that had been painstakingly built over two decades, we had institutional billing, clinical, general ledger, and payroll data already in place at the start of the project. The team’s mission was to harness these masses of data and figure out how to allocate costs at the patient-visit level—from the cost of gauze pads to individual chemotherapy treatments to minutes of nursing labor. Each of these expenses was itemized for more than 1,200 operating units in our academic medical center, effectively creating general ledgers for each and every unit. “In the accounting world, the gospel of finances is the general ledger,” says Park. “You can’t get to the truth without it.”
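The core idea, allocating a unit’s general-ledger total down to individual patient visits, can be sketched in a few lines. This is a hypothetical illustration, not the VDO implementation: the unit total, visit IDs, and activity weights (here, nursing minutes) are all invented, and real activity-based costing uses many more cost drivers.

```python
# Hypothetical sketch of patient-visit-level cost allocation: one operating
# unit's ledger total is spread across its visits in proportion to each
# visit's share of a measured activity driver (here, nursing minutes).

def allocate_unit_costs(unit_ledger_total, visits):
    """Return {visit_id: allocated cost}, weighted by each visit's activity."""
    total_activity = sum(v["activity"] for v in visits)
    return {
        v["visit_id"]: unit_ledger_total * v["activity"] / total_activity
        for v in visits
    }

# Example: a nursing unit's $12,000 monthly ledger total, allocated by
# minutes of care delivered during each visit (invented numbers).
visits = [
    {"visit_id": "V001", "activity": 120},
    {"visit_id": "V002", "activity": 480},
    {"visit_id": "V003", "activity": 600},
]
costs = allocate_unit_costs(12_000.00, visits)
```

Because the weights are normalized, the allocated amounts always sum back to the unit’s ledger total, which is what keeps the patient-level view reconcilable with the general ledger.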

Taking institutional expenses and applying them to the patient level was only the first step. To create a meaningful tool, variations in clinical activity had to be considered too. Filters were created that enabled users to adjust for a variety of situations, including the severity of a case, different types of patients, and the length of a patient visit. “We’re dealing with complex paths of care and a heterogeneous population of patients,” says Hunter. “Physicians need to have precise data that they can filter.”
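The case-mix filtering described above can be pictured as a simple predicate over visit records. This is a hedged sketch with invented field names and example data, not VDO’s actual schema:

```python
# Hypothetical sketch of case-mix filters: narrow a set of visit records
# by severity, patient type, and a length-of-stay ceiling. A filter left
# as None matches everything.

def filter_visits(visits, severity=None, patient_type=None, max_los_days=None):
    """Return only the visits matching every requested filter."""
    result = []
    for v in visits:
        if severity is not None and v["severity"] != severity:
            continue
        if patient_type is not None and v["patient_type"] != patient_type:
            continue
        if max_los_days is not None and v["los_days"] > max_los_days:
            continue
        result.append(v)
    return result

# Invented example records.
visits = [
    {"visit_id": "V1", "severity": "high", "patient_type": "inpatient", "los_days": 9},
    {"visit_id": "V2", "severity": "low", "patient_type": "inpatient", "los_days": 2},
    {"visit_id": "V3", "severity": "low", "patient_type": "outpatient", "los_days": 0},
]
low_acuity_inpatients = filter_visits(visits, severity="low", patient_type="inpatient")
```

Filtering before comparison is what makes cost numbers meaningful: comparing an uncomplicated case against a high-severity one tells a physician nothing.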


After the first iteration of the VDO tool was released, the team threw themselves into phase two. Now sequestered for just two days a week, the group is integrating quality data (including mortality, length of stay, readmissions, and bleeding and infection rates) into the tool. With costs on the x-axis and outcomes on the y-axis, the tool now enables users to see direct correlations between the cost of every choice and its effect on the quality of care. “We need to debunk the myth that high costs go hand-in-hand with quality,” says Park. “When mistakes happen, they cost.” As we refine the outcomes data, the team is also integrating patient satisfaction data and developing measures of patient-reported outcomes.
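The cost-on-x, outcome-on-y idea amounts to reducing each case type to a single (cost, quality) point. A minimal sketch, with invented procedures, costs, and readmission flags:

```python
# Hypothetical sketch of pairing cost (x) with an outcome measure (y):
# collapse raw cases into one (mean cost, readmission rate) point per
# procedure, ready for a cost-versus-quality scatter plot.

cases = [
    {"procedure": "knee_replacement", "cost": 18_500, "readmitted": False},
    {"procedure": "knee_replacement", "cost": 21_000, "readmitted": True},
    {"procedure": "knee_replacement", "cost": 17_900, "readmitted": False},
    {"procedure": "appendectomy",     "cost": 9_400,  "readmitted": False},
]

def cost_outcome_points(cases):
    """Return {procedure: (mean cost, readmission rate)}."""
    by_proc = {}
    for c in cases:
        by_proc.setdefault(c["procedure"], []).append(c)
    return {
        proc: (
            sum(c["cost"] for c in group) / len(group),
            sum(c["readmitted"] for c in group) / len(group),
        )
        for proc, group in by_proc.items()
    }

points = cost_outcome_points(cases)
```

Plotted across many procedures or providers, points that sit high on cost but low on quality are exactly the myth-debunking cases Park describes.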

“The tool’s ability to drill down to the most granular of details is a powerful way to get physicians thinking differently about care delivery,” says Chief Medical Quality Officer Robert Pendleton, M.D. For instance, when the VDO tool alerted physicians that a $15 bronchodilator could deliver the same outcomes for most patients as the $200 bronchodilator they were habitually prescribing, it was easy to switch. “We can take the VDO data, have a 15-minute conversation between physicians, and within two days we can change care delivery to save several hundred thousand dollars a year. And in the case of the bronchodilators, we’re just talking about one tiny grain of sand in the beach of opportunity.”
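The arithmetic behind Pendleton’s bronchodilator example is simple enough to show directly. The two unit prices come from the text; the annual volume is an invented assumption chosen only to illustrate how a small per-dose difference compounds into the “several hundred thousand dollars a year” he cites:

```python
# Back-of-the-envelope savings from a drug substitution. Unit prices are
# from the bronchodilator anecdote; the annual volume is hypothetical.

old_price = 200.00      # habitually prescribed bronchodilator, per course
new_price = 15.00       # equally effective alternative, per course
annual_courses = 2_000  # assumed yearly volume (invented)

annual_savings = (old_price - new_price) * annual_courses
```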

The VDO tool doesn’t just drive individual decision making about treatments and medications. By looking at information in aggregate, we can ask larger questions about overall health care delivery—and find new ways to redesign care pathways. “The tool allows us to do the good, hard work of redesigning care,” says Pendleton.


“Just making this data available, without a single directive, has the power to really change things,” says Kawamoto. But his vision for VDO is much more ambitious. He wants to take the standardized care pathways that we create from the data and design clinical decision support tools that hardwire best practices into the electronic health record. Protocol-based computer reminders can then help guide practitioners to make evidence-based decisions right at the point of care. His ultimate goal? Make the tools as easy to implement as an automated Windows update.
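A protocol-based reminder of the kind Kawamoto describes is, at its core, a rule that checks a patient record and surfaces a suggestion when its condition holds. The rules, field names, and patient data below are all invented for illustration; production clinical decision support systems are far more rigorous:

```python
# Minimal sketch of protocol-based reminders: each rule pairs a condition
# over the patient record with an evidence-based suggestion to surface at
# the point of care. Rules and record fields are hypothetical.

reminder_rules = [
    {
        "condition": lambda pt: pt["age"] >= 65 and "flu_vaccine" not in pt["orders"],
        "message": "Patient is 65+ with no influenza vaccination on record.",
    },
    {
        "condition": lambda pt: pt.get("on_anticoagulant") and "inr_check" not in pt["orders"],
        "message": "Anticoagulated patient has no recent INR check ordered.",
    },
]

def reminders_for(patient):
    """Return the message of every rule whose condition the patient meets."""
    return [r["message"] for r in reminder_rules if r["condition"](patient)]

# Invented example: a 72-year-old with no flu vaccine ordered.
patient = {"age": 72, "orders": ["metformin"], "on_anticoagulant": False}
alerts = reminders_for(patient)
```

Keeping the rules as data rather than hand-written code is what would make them distributable across institutions, in the spirit of the automated-update analogy.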

“We’ve known for a long time that this is the right thing to do,” says Kawamoto, citing a 1976 article by C.J. McDonald on the use of computer-based care suggestions to reduce errors. “But we’ve failed as a field to take information published more than 30 years ago and make it happen. Our new financing model gives us an opportunity to do it now.”


As the entire nation tries to figure out how to control health care costs, will every health care institution have to scramble to build its own proprietary costing tool? Kawamoto would like to share the VDO technology with institutions nationwide to show them how we’ve managed to measure costs and outcomes throughout our patients’ entire cycle of care. He even entertains the idea of making the tool open-source and creating an open consortium to enable collaborative, cross-institutional refinement and deployment of the tool. “If we can empower other institutions to reduce costs, avoid unneeded care and deliver safer, better experiences to patients,” says Kawamoto, “just think what we’d all be getting in return.”