CONTACT: Stanford University News Service (650) 723-2558
TELEVISIONS, COMPUTERS CAN FOOL HUMAN NATURE
STANFORD -- If you watch a political candidate on a 30-inch television, chances are you will like him or her more than if you watch on a 13-inch set, Stanford's Byron Reeves says.
"We've found in the laboratory that big pictures automatically take more of a viewer's attention," he said. "You will like someone more on the large screen and pay more attention to what he or she says but remember less."
New television and computer technology is increasingly blurring humans' ability to distinguish between natural and artificial experience, say Reeves and Clifford Nass, Stanford University communication professors.
Their work on the psychological effects of these technologies on humans may have profound implications for how electronics are designed and used in everything from schools to on-line information systems, entertainment and advertising.
It may be, for example, that with wider-angle screens that better fill a human's natural visual field, TV producers could "make Sesame Street so arousing that children learn less by watching it," Reeves said.
Team the latest 70-inch screen with a "surround-sound" system that can give the illusion of bombs falling all around your living room, he said, and for $7,000 to $10,000 you can radically alter the psychological effects of viewing.
In computers, better voice output, better screen displays and more moving pictures, Nass said, will increasingly fool users into treating the computer as if it were human.
"The distinction between the two media - television and computers - is becoming blurred," he said, which is why he and Reeves now work together.
If computer programmers simply switch from text to voice communication without changing what the programs say, they are likely to alienate many of their users, Nass said, because people react to voice output as if it were a specific human talking to them.
Different voices on programs also might be used to reinforce stereotypes about women and men or class or ethnicity, he said.
For decades, people have speculated about whether a computer can be made smart enough to fool humans into thinking they are interacting with another human.
"This is the wrong question," Nass said. "We should be asking, how pathetic can the machine be and still fool a human?"
People rarely say they think a computer is human, but when Nass designs experiments to elicit their reactions to computers, they often use rules derived from human - rather than machine - interaction. Voice output is one of the characteristics that triggers such a mistake, Nass said, because "humans have enormous, primitive capabilities for discriminating human-sounding voices. We can distinguish our mother's voice within a week of birth, and perhaps even in the womb."
Thus, a computer with a human-sounding voice gets categorized as a distinct individual.
The fact that a computer interacts with its user and often fills a social role - such as teacher or adviser - also prompts its users to treat it as human, he said.
Communication theorists have assumed that when all but the youngest children sat down in front of a television set, they said to themselves "this is artificial" - a symbolic representation of the real world - and processed the information accordingly, Reeves said.
"What we are discovering is that there is little in the brain that says 'artificial' and a lot in the brain that says 'natural, real, just like everything else,' " he said.
"Our bodies are simply not able to 'discount' information just because it is symbolic rather than real. When a TV character moves quickly in our direction, our visual perception system makes us aware of this and prepares us for retreat or confrontation.
"Even though our body remains stationary - actually, it probably moves a bit - the perceptual readiness to respond can influence how we describe our experience, and how we remember and apply its lessons."
That's why more compelling TV or computer technology may not aid education, Reeves said, and why research that focuses on whether the content of a program is good or bad for children may miss an important point.
"If you are just trying to figure out which way to go, you remember less of the content," he said.
Those who doubt that they treat computers and TV as reality may wish to test how they think they would react to the following experiments and compare that to the responses Reeves and Nass found in their laboratories:
Q: You are invited to be tutored by a computer with a black-and-white display screen and a CD-quality voice. In one case, a second computer voice will evaluate the tutor's performance; in another, the same voice will. Will that make any difference?
A: Yes. "Criticism is nasty, but modesty is nice in computers as well as humans," Nass said. A computer that praised itself in its own voice was disliked and distrusted more, apparently because it violated the social norm against humans praising themselves.
A second voice that was critical of the tutor was seen as "smarter but less friendly" than when it praised the tutor.
"People who criticize others are perceived as brilliant but cruel," Nass said. "Each voice - or each computer box in another experiment with two computers and the same voice - has a social role."
Q: You are asked to watch news and entertainment shows on two television sets. The sets are identical but the researchers tell you they normally use one to show entertainment shows and one to show news. Will your evaluation of the shows be affected by the set upon which you watched them?
A: Yes. Research subjects found news to be "newsier" on the set that was normally used to show news and entertainment to be more "entertaining" on the set that was normally used for entertainment.
The subjects apparently applied a social norm to the TV sets: people are seen as less competent when they are "jacks of all trades, masters of none."
© Stanford University. All Rights Reserved. Stanford, CA 94305. (650) 723-2300.