Stanford University
News Service

Dawn Levy, News Service (650) 725-1944; e-mail:
Laboring in obscurity is what risk management's all about

Imagine having a job where your success is measured by how anonymous you remain.

Nobody knows a successful risk manager's name -- and few know what somebody with that job title does for a living. The resume looks like this: planes that didn't crash; breweries whose vats didn't spill their contents all over the floor when The Big One hit; patients who lived to count their surgery stitches.

Simply put, risk management is the art of identifying accident scenarios and their probabilities -- including accidents that haven't happened yet but could -- and then working out how to prevent them, said Elisabeth Paté-Cornell, chair of Stanford's Department of Management Science and Engineering, during a workshop held at Stanford on April 26. "If we're doing our job well," she said, "nobody hears about it."
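Paté-Cornell's definition can be sketched as a simple probability-weighted sum over accident scenarios. The scenarios and numbers below are purely illustrative assumptions, not figures from the workshop:

```python
# Minimal sketch of scenario-based risk quantification. Every scenario,
# probability and consequence here is a hypothetical, illustrative value.
scenarios = [
    # (name, annual probability, consequence in dollars)
    ("vat rupture in earthquake", 0.002, 500_000_000),
    ("minor pipe leak",           0.10,      200_000),
    ("control-system failure",    0.01,    5_000_000),
]

# Total risk: the probability-weighted sum of each scenario's consequence.
expected_annual_loss = sum(p * c for _, p, c in scenarios)
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
```

Laying risk out this way is what lets an analyst compare a rare catastrophe against a chronic nuisance in the same units.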

Titled "Engineering Risk Analysis and Management," the workshop was organized by Paté-Cornell and sponsored by the Alliance for Innovative Manufacturing at Stanford (AIMS). AIMS is a campus-based joint venture of Stanford's business and engineering schools and several corporate partners. Its mission is to encourage advances in manufacturing and to disseminate these advances throughout industry and academia.

It's difficult enough to identify, much less to quantify, risks involving complex engineered systems, for which there is often little experience from which to mine data. And even when you've done it, "convincing top management to invest in low-likelihood events with high consequences is not an easy task," said Ted Marston, chief nuclear officer for the Electric Power Research Institute in Palo Alto. But assessing and managing risk can pay off, noted Marston, who has logged 25 years of international experience. He recalled the example of a major American brewer that, by seismically retrofitting its Los Angeles brewery in the mid-1980s at a cost of $10 million, avoided a potential earthquake-damage loss of between $750 million and $1 billion.
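The brewery example reduces to an expected-value comparison. The retrofit cost and loss range are from the article; the earthquake probability is an assumed placeholder, since the article does not give one:

```python
# Back-of-the-envelope retrofit decision for the brewery example.
retrofit_cost = 10_000_000    # from the article
loss_if_quake = 750_000_000   # low end of the article's $750M-$1B range
p_major_quake = 0.05          # ASSUMED probability over the building's life

expected_loss_avoided = p_major_quake * loss_if_quake
print(f"Expected loss avoided: ${expected_loss_avoided:,.0f}")

# The retrofit pays off whenever the quake probability exceeds break-even:
break_even_p = retrofit_cost / loss_if_quake
print(f"Break-even probability: {break_even_p:.1%}")
```

Under these numbers, the retrofit is worthwhile if the lifetime chance of a damaging quake exceeds roughly 1.3 percent, which is the kind of threshold argument Marston describes making to top management.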

Nor is it easy to focus executive attention on addressing myriad tiny, repetitive problems that can add up to potential catastrophes, said Jimmy Benjamin, a manager in electronics giant Hewlett-Packard's hardware services division. Benjamin recounted a lesson learned from his involvement in a risk study commissioned by a major airline.

"Some problems are chronic," Benjamin said. "You think you've fixed it, but it's still there. For example, say you're an airline mechanic. You keep replacing that part, but it keeps wearing out. Meanwhile, your manager grades you on your ability to turn a plane around in 20 minutes, rather than considering the cost to the company of not fixing it right. So what will you do? You'll fix it quick, the way you know will get it in the air fast, rather than inspect the maintenance log and ask: 'Hey, has anyone noticed that this widget's broken 20 times in the last 35 flights?'" The solution, Benjamin said, lies in restructuring the maintenance process to provide incentives to pay attention to chronic problems. By implementing that restructuring, the airline will probably save billions of dollars.

Paté-Cornell related a case study on improving safety among anesthesia patients. The particular concern was how to minimize risk to patients from anesthesiologists abusing drugs or alcohol. "While they should know better, young anesthesiologists seem to have drug use and addiction rates on the same order as that of the general population," she said -- probably because they have easy access to drugs. "And older anesthesiologists may well have a higher alcoholism rate" -- perhaps due to the pressures under which they've worked for decades.

The risk of anesthesia-related deaths in big hospitals is less than one in 10,000, said Paté-Cornell, but when such deaths do occur, they may be blamed on the malady that brought the patient to the operating table, when in reality they may have been an avoidable consequence of small incidents of neglect, misjudgment or malperformance.
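A quick calculation shows why a per-operation risk below one in 10,000 still matters at hospital scale. The per-operation risk is from the article; the annual caseload is an assumed round number:

```python
# Why rare per-case risks add up. Caseload is hypothetical.
p_death = 1 / 10_000    # per-operation risk, from the article
n_operations = 20_000   # ASSUMED annual caseload for a big hospital

expected_deaths = p_death * n_operations
p_at_least_one = 1 - (1 - p_death) ** n_operations
print(f"Expected deaths per year: {expected_deaths:.1f}")   # ~2.0
print(f"Chance of at least one:   {p_at_least_one:.0%}")    # ~86%
```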

Working with a medical school in Adelaide, Australia, and a large hospital in California, Paté-Cornell developed a model for assessing the role of such events. To identify risky behaviors, conditions and situations, her team talked to numerous experts: anesthesiologists, surgeons and operating-room nurses, who, she said, were close observers of the scene with a wealth of knowledge. Paté-Cornell and her team also talked to lawyers who regaled them with horror stories.

The team identified two general sources of lapses on the part of anesthesiologists. The first was lack of alertness. "An anesthesiologist's job is like that of a pilot," said Paté-Cornell. "Once you take off, your job can be very boring, but you'd better be there if something goes wrong -- for example, if the tube that brings the oxygen from the machine to the patient's lungs gets disconnected. An anesthesiologist has to both recognize a problem and fix it, generally within less than two minutes."

Fatigue really gets in the way of detection and diagnosis and thinking straight, said Paté-Cornell: "If the anesthesiologist has been working 22 hours in a row, you don't want to be the patient that shows up in the 23rd hour." Paté-Cornell's group found that 10 percent of the time, fatigue was a problem -- one which, Paté-Cornell said, can be solved by placing limits on how long an anesthesiologist is allowed to stay on duty, as the State of New York has done.

The other source of anesthesiologist error was incompetence. Crisis management requires a certain kind of personality, but beyond admission to medical school there's really not much selection for those traits, said Paté-Cornell. Moreover, anesthesiology crises are sufficiently rare that an anesthesiologist who isn't in the operating room often enough to encounter problems with any frequency may not remember what he or she learned way back in medical school. Paté-Cornell's recommendations included crisis training (on simulators like those used to train airline pilots), which quantitative methods indicated could reduce risk by a far-from-negligible 16 percent, and more-effective supervision and back-up of resident anesthesiologists, which she calculated could lower risk by about 14 percent.

The real surprise, Paté-Cornell told the audience, was her finding that even aggressive random testing for alcohol or drug abuse -- the issues that had triggered the study in the first place -- would reduce risk by an anemic 2 percent and 1 percent, respectively. In comparison, a periodic, formal recertification process, which may detect a performance-hindering deterioration in an anesthesiologist's health and/or competence, could reduce risk by an impressive 23 to 29 percent.
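The reported reductions invite a side-by-side ranking. All percentages below come from the article (using the midpoint of the 23-to-29-percent recertification range):

```python
# Ranking the interventions by the risk reductions Paté-Cornell reported.
risk_reduction = {
    "periodic recertification":        0.26,  # midpoint of 23-29%
    "simulator-based crisis training": 0.16,
    "better supervision of residents": 0.14,
    "random alcohol testing":          0.02,
    "random drug testing":             0.01,
}

# Print from largest reduction to smallest.
for name, r in sorted(risk_reduction.items(), key=lambda kv: -kv[1]):
    print(f"{r:4.0%}  {name}")
```

The ranking makes the study's punch line visible at a glance: the measures that triggered the study sit at the bottom of the list.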

Lee Merkhofer, vice president of Applied Decision Analysis (a wholly owned subsidiary of PricewaterhouseCoopers), captured the counterintuitive essence of Paté-Cornell's conclusion in a remark he said has been attributed to Albert Einstein: "There is a simple solution to every complex problem. Unfortunately, it is wrong."

Of course, Merkhofer said, the fact that something is complex does not mean it is impossible to quantify. He referred to an exercise he has conducted with clients. "I ask, 'What's your estimate of the annual production of eggs in the United States?'" The answers are typically way off. "Then we break people into groups and get them thinking analytically: 'Well, how many eggs per day per person are eaten in the U.S.?' That's something they can get a handle on, and the resulting production estimates invariably come much closer to the mark."
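Merkhofer's egg exercise is a classic Fermi decomposition, and the analytic step his groups take can be written in two lines. The population and consumption figures below are rough guesses of the sort a group might make, not data from the article:

```python
# Fermi-style decomposition of the egg question. Inputs are rough guesses.
us_population = 280_000_000    # ASSUMED: roughly the U.S. population c. 2000
eggs_per_person_per_day = 0.7  # ASSUMED: a plausible per-capita guess

annual_production = us_population * eggs_per_person_per_day * 365
print(f"Estimated annual U.S. egg production: {annual_production:,.0f}")
```

The estimate lands in the tens of billions of eggs per year, which is the right order of magnitude -- exactly the improvement over unstructured guessing that Merkhofer describes.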

Still, he said, the fact that the devil is so often in the details makes explaining his job to anybody but another risk manager a bit tough. Once, Merkhofer tagged along with a somewhat wayward high-school buddy turned alchemist and a third man (a friend of his friend) to meet the actress Shirley MacLaine, who has a strong interest in the occult. "She talked with Charlie for a while about turning lead to gold, then turned to Charlie's friend and asked him what he did. He said, 'I lift weights.' She stared at him for a few minutes, then turned to me and asked me the same question. I wanted to wow her, so I said, 'I quantify risks. In fact, I just worked with NASA assessing the risks of the Galileo space mission . . . ' and she cut me off and turned back to the other guy. 'Tell me about weight lifting,' she said."


By Bruce Goldman

© Stanford University. All Rights Reserved. Stanford, CA 94305. (650) 723-2300.