CONTACT: Stanford University News Service (650) 723-2558
IS SELF-DECEPTION NORMAL, RATIONAL? SCHOLARS ASK
STANFORD - A team of researchers recently reported that balding was correlated with a higher incidence of heart disease. The findings refuted rumors that treatments for bald spots might cause heart attacks, but did the results also set up balding men to deceive themselves into taking treatments for baldness in the hope that doing so would help their hearts?
Philosophers, psychologists and others wrestled with this question at a recent Stanford University colloquium on self-deception and the limits of what is thought of as human rationality.
"Nobody has asserted a causal mechanism between baldness and heart disease. In fact, there's a causal theory about both having to do with male hormones," said philosopher Kent Bach of San Francisco State University. "However, some people who didn't mind being bald before [the research] now might start using Rogaine [a pharmaceutical that treats baldness] because of the correlation."
For philosophers and psychologists, "this is a new version of an old problem," Bach said, the problem of people mistaking the "sign" or the "evidence" of something for "the real thing" or its "cause."
"We are dealing with a fundamental existential problem that raises its head every time we are confronted with absolute uncertainty concerning a variable on which our 'salvation' depends," said the organizer of the colloquium, Jean-Pierre Dupuy, a philosopher on the faculties of both Stanford University and Ecole Polytechnique in Paris. Health is often such an issue.
A highly regarded study in the last decade by Stanford University psychologist Amos Tversky and George Quattrone found that, when given the chance, people are willing to "cheat" on medical exams to get a better diagnosis. In other words, they try to influence the results even though it's not rational to believe that good examination results, by themselves, could affect one's state of health, Tversky said.
Furthermore, the researchers found evidence that people deceived themselves into believing they did not try to influence their diagnosis, he said.
In the experiment to test conventional theories of rationality, Tversky and Quattrone told Stanford undergraduate volunteers that they were to be part of an experiment examining the effects of rapid changes in temperature on heart rate after exercise. The volunteers were asked to hold their hands and forearms in ice water as long as they could tolerate the discomfort.
Next, one group of volunteers was told that people can tolerate more cold water after exercise if they have the better of two types of hearts, while another group was told the opposite. After exercise, the volunteers were asked to repeat the cold-water test.
The researchers found most people's tolerance for the ice water changed in the direction of what they believed to be the healthier sign. Most also denied that they had made a conscious effort to tolerate more or less cold the second time. The few who admitted they had tried to influence the results expressed less confidence in the validity of the diagnosis of their health.
"Denying to yourself the fact that you tried to influence the medical exam is an essential step to making the inference that you are in better health," Tversky said.
Tversky's research has been one of the main challenges to the long-standing theory of rational choice, colloquium participants said. Rational choice theory says that people weigh the costs and benefits of their actions and make decisions in their best interests. This capacity for rationality is often said to be responsible for the efficiency of free markets. When people do not behave according to the theory, as in many of Tversky's experiments, their actions are called irrational or self-deceiving.
Philosopher Dupuy said he disagrees with the concept of self-deception. He sees instead "two irreducible forms of rationality."
One is based on "projected time" - the desire to plan or take actions today that can cause benefits in the future - quitting smoking to increase one's chances of future health, for example. The other is based on "occurring time" - the desire to provide evidence today of one's salvation, as with the volunteers who tried to influence their medical exam results.
Projected time is emphasized by economists, but occurring time may be equally valid, Dupuy said, especially when individuals are not in a position to know the "causes" of future events. "Both conceptions of time correspond to a fundamental human experience," he said.
Max Weber pointed to the consequences of two forms of rationality in his 1930 tract on the doctrine of predestination, Dupuy said. During the Reformation, Calvinists believed that they were either "chosen" or not chosen by God at birth, whereas Catholics believed good works on earth determined their chances of entry into heaven.
Calvinists found it as essential as Catholics to do good works on earth, Dupuy said, as a "sign" of their having been chosen.
A Calvinist may even be more motivated to do good works than a Catholic, Tversky and Quattrone argued in a 1985 article, because "even a single sinful deed is evidence enough that he or she is not among the chosen. To the Catholic, it is more a matter of one's total good and bad deeds that determines heaven or hell. And besides, there is always confession."
"One of the downsides of discussions of rationality is that we tend to view it as all or nothing, black and white, whereas it is more graded," Tversky said at the colloquium.
He cited, for example, President Reagan's statements about his knowledge of his administration's guns-for-hostages deal with Iran.
"At some point, it seems, President Reagan was aware of it, and sometimes he didn't think about it. There was a certain lack of clarity, a fleeting thought quality to it," Tversky said.
Self-deception may be a "failure to think through the decision tree," he said. "There is a certain extent to which we deceive ourselves all the time on small matters. We're never quite really frank."
With the unity of self called into question, some researchers try to understand individuals by comparing them to organizations, Tversky said. The human mind may be a little like "the accounting department that doesn't pass on all the essential information to the production department."
"Self-deception isn't always conniving and calculated," he said. "Sometimes it's better not to know, we think. It's more like a cover-up - a failure to transmit information rather than have a direct contradiction between two beliefs."
The "urgent question" to psychologists in his specialty, Tversky said, is "What conditions allow us to deceive ourselves?"