When Engineering Dean Jim Plummer was asked recently to predict the hottest new field, the uncharted territory just ahead, he took only a second to reply.
“Nano,” he said. “It’s a sea change.”
Nanotechnology, in which physical and chemical techniques give researchers control over extraordinarily tiny structures, with potential benefits for everything from medicine to sportswear, is occupying the time of a growing number of Stanford researchers. The university has two principal facilities. The Stanford Nanocharacterization Laboratory, launched in October 2005, is in the Geballe Laboratory for Advanced Materials. The Stanford Nanofabrication Facility (SNF), whose lab members work in a 10,500-square-foot clean room surrounded by observation windows, is in the Paul Allen CIS Building.
“These places are very expensive to run, so people from various fields converge on them,” Plummer said. “Physics, chemistry, engineering all use them. The labs are like cafes; people go there to accomplish something. They have the same magic.”
But while nanotechnology is a technically exciting domain of inquiry with enormous potential, its possible effects on society are a focus of persistent controversy.
In 2003 the National Science Foundation announced a competition to establish a network of nanotechnology facilities that would be open to academic, industrial and government researchers. SNF, together with labs at 12 other universities, submitted an application. The agency required that each proposal indicate how the group would address the “social and ethical implications of nanotechnology.” To help formulate that part of the proposal, SNF invited Professor (Teaching) Robert McGinn, of the Management Science and Engineering Department, to get involved, and he proposed carrying out a detailed empirical study of what researchers at the 13 labs thought about ethics and research.
Ultimately, that proposal won the competition, and in 2004, SNF became part of the 13-node National Nanotechnology Infrastructure Network (NNIN).
After working for about a year with other researchers at SNF, McGinn finalized a questionnaire for an online survey titled “Ethics and Nanotechnology: Mapping the Views of the NNIN Community.” It was accessible to researchers from September 2005 to July 2006.
The most important of McGinn’s results, in his view, is that it appears that most researchers (professors, engineers, scientists, postdoctoral scholars and graduate students) believe they have an ethical responsibility to anticipate the impact of their scientific work. In other words, they have a responsibility not only to ensure that they themselves cause no harm, but also to alert authorities if they think applications of their work might pose risks down the line. This, McGinn said, could well indicate a paradigm shift.
His other principal takeaway is that it is up to managers to take responsibility for the safety and ethical culture of their labs. In somewhat contradictory fashion, one-quarter of researchers said their colleagues probably would do nothing if someone were taking shortcuts, though a large majority (77 percent) also disagreed with the proposition that their only responsibility is to follow lab safety rules.
These matters are of particular concern in nanotechnology, as opposed to other technologies, because the enormous projected benefits may conceal substantial ills. That a material is safe at the macro or micro scale in no way guarantees it will behave the same way, or remain safe, at the nanoscale.
That understanding led the National Science Foundation to issue its request for proposals; it also led the United Kingdom’s Royal Society and the Royal Academy of Engineering to recommend in 2004 that consideration of ethical and social implications of nanotechnology form part of all researchers’ training.
McGinn’s survey garnered 1,037 responses (90 of them from Stanford), or about one-quarter of the total number of researchers—a sample he calls “robust but not random.” Eighty percent of respondents were men and about two-thirds were U.S. citizens. He divided the questionnaire into three categories: general beliefs about ethics and nanotechnology; specific ethical issues in the lab; and experiences and beliefs about the study of ethics in general.
Half the respondents either somewhat or strongly agreed with the statement that “there are significant ethical issues related to nanotechnology,” and 27 percent somewhat or strongly disagreed. When asked to compare the importance of the ethical dimension of the nanotech field with its scientific dimension, 43 percent said the two were equally important; only 8 percent ranked ethics above science, while 49 percent ranked science above ethics. When asked how interested they are in ethical issues related to nanotech, 39 percent said they are quite or very interested; just 6 percent said they are not at all interested.
Respondents were then presented with specific scenarios that might play out in their labs and asked to rate how ethical or unethical each was. For instance, if a researcher never before involved in an accident planned to carry out a potentially hazardous procedure, 93 percent of respondents said it would be completely or somewhat unethical for that researcher not to inform fellow bench workers. The disapprobation declined, however, regarding the researcher’s obligation to consult scholarly literature beforehand (72 percent thought the omission was unethical) or to inform administrators (37 percent).
Regarding a situation in which a researcher takes “a relatively safe, time-saving shortcut,” researchers were asked what the most likely response would be in their lab. In other words, respondents were asked not what they would do but rather what everyone else would do. Around one-fifth said the individual would be reported to lab management, while 44 percent said colleagues would try to persuade the individual to stop taking the shortcut.
One-quarter said no one would do anything, which is what led McGinn to assert that managers have a responsibility to prevent what he calls a “laissez-faire culture.”
Seventy-seven percent disagreed with the idea that following the rules is one’s only ethical responsibility; on the other side, 44 respondents (4 percent of the total) strongly agreed with the statement: “The only ethical responsibility of a researcher at a [nanotech] lab is to follow laboratory rules.”
It’s a tiny percentage, McGinn noted, though somewhat higher at Stanford, and it makes clear that not all researchers hold similar beliefs about the relationship between ethics and unsafe lab behavior. Asked whether they believe ethical guidelines are necessary for nanotech research, 59 percent nationwide said yes; at Stanford the figure was just 50 percent. The share who said guidelines were “neither necessary nor desirable” was 7 percent nationwide and 12 percent at Stanford.
The survey also asked researchers about their obligation beyond the laboratory walls to anticipate ethical issues and to alert appropriate parties of potential danger. Two-thirds agreed that researchers “should always strive to anticipate ethical issues” (6 percent strongly disagreed), and 76 percent strongly agreed that researchers have the responsibility to alert others of danger (4 percent strongly disagreed).
Researchers also were asked how morally acceptable three nanotech goals are to them: Eighty-five percent said cleaning the environment with nanotech is quite or very morally acceptable; three-quarters said repairing damaged human body parts is; and 35 percent said increasing human mental abilities is. But 18 percent said the latter category—enhancement—is morally unacceptable.
Turning to the field of ethics in general, McGinn asked all respondents whether they had ever taken an ethics course. Thirty-four percent said yes, with a slightly higher rate among U.S. citizens than among non-citizens. Thirty-six percent said they had taken a course in which ethical issues closely related to science, technology and/or engineering were discussed. Despite that, or perhaps because of it, 77 percent said they were somewhat, quite or very willing to spend time learning about the ethical issues related to nanotechnology, and two-thirds said such issues should become a standard part of the education of future engineers and scientists.
In general, the numbers for Stanford mirror the national numbers, with some exceptions.
Among McGinn’s principal conclusions are that respondents believe quite strongly that it is important for ethical issues to be considered but are themselves only moderately interested; they believe themselves inadequately informed about ethical issues and want them incorporated into curricula; they need a better grasp of what constitutes ethical judgment, negligence and action; and they believe researchers have ethical responsibilities to society. Their notion of what constitutes “harm” is amorphous, he said, yet he took heart in the number who believe that researchers must anticipate the ethical consequences of their work.
“Concern for man himself and his fate must always form the chief interest of all technical endeavors,” Albert Einstein said in 1931. “Never forget this amidst all your diagrams and equations.” This admonition is one of McGinn’s favorites, and he often ends his engineering ethics presentations with it.
“Engineers and engineering students like formulas,” McGinn said. “But you can’t calculate ethical issues. At the end of the day, there is no substitute for mature, independent judgment. I hope the results of this survey make a modest contribution to fostering exactly that.”