
News Service

April 18, 2006

Singularity Summit at Stanford will explore the future of human and machine 'cognition'

How would life change if computers became so smart that they could pass the Turing Test—that is, had intelligence and conversational ability on par with people? Will advanced technologies serve us, control us, change us, destroy us? What are the implications of a controversial notion known as "the Singularity"—a hypothesized "event horizon" in human technological development beyond which our models of the future cease to give reliable answers?

These are among the questions that leading thinkers will explore at the Singularity Summit at Stanford on Saturday, May 13, from 9 a.m. to 5 p.m. in Memorial Auditorium. Stanford's Symbolic Systems Program and Center for the Study of Language and Information are co-hosting the summit. Co-sponsors include Clarium Capital Management, KurzweilAI.net, MINE, the Singularity Institute for Artificial Intelligence, the Stanford Transhumanist Association and United Therapeutics.

The event is free and open to the public, but seating is limited. While 500 seats have been reserved on a first-come, first-served basis for Stanford faculty, staff and students who RSVP, at least 900 additional seats will be available. To RSVP, go to http://sss.stanford.edu/rsvptoday/. For more information, go to http://sss.stanford.edu or call (650) 353-6063.

"The Singularity will be a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed," said Ray Kurzweil, keynote speaker and author of the best-selling The Singularity Is Near: When Humans Transcend Biology (Viking, 2005). "Based on models of technology development that I've used to forecast technological change successfully for more than 25 years, I believe computers will pass the Turing Test by 2029, and by the 2040s our civilization will be billions of times more intelligent."

Smarter-than-human intelligence in humans and machines may become possible through singularity technologies such as artificial intelligence (AI) and brain-computer interfaces. Just as our current model of physics breaks down when scientists try to model the singularity at the center of a black hole, our model of the world breaks down after it predicts the rise of smarter-than-human intelligence. Imagine, for example, an AI agent able to improve its own source code. It could make itself smarter and then, being smarter, could redesign itself again. Through positive-feedback loops, smart technologies may be able to produce even smarter technologies that modern humans cannot fathom.
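The positive-feedback loop described above can be made concrete with a toy numerical sketch. Everything in the snippet below is illustrative: the function name, the fixed 10 percent capability gain per redesign cycle and the billionfold stopping threshold are assumptions chosen for demonstration, not figures proposed by the summit organizers or speakers.

# Toy model of recursive self-improvement: each redesign cycle
# multiplies the agent's "capability" by a constant factor, so
# growth compounds exponentially rather than linearly.

def redesign_cycles(capability: float = 1.0,
                    gain: float = 0.10,
                    threshold: float = 1e9) -> int:
    """Count redesign cycles until capability passes the threshold."""
    cycles = 0
    while capability < threshold:
        capability *= 1.0 + gain  # a smarter agent makes a bigger improvement
        cycles += 1
    return cycles

if __name__ == "__main__":
    # With a 10 percent gain per cycle, a billionfold increase takes
    # 218 cycles; raising the gain shortens the runaway dramatically.
    print(redesign_cycles())

Even this crude model captures the qualitative point of the paragraph: any self-improvement process whose per-cycle gain scales with current capability produces compounding, exponential growth.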

"Some regard the Singularity as a positive event and work to hasten its arrival, while others view it as unlikely, or even dangerous and undesirable," said Todd Davies, associate director of the Symbolic Systems Program. "The conference will bring together a range of thinkers about AI, nanotechnology, cognitive science and related areas for a public discussion of these important questions about our future."

Confirmed speakers in addition to Kurzweil include:

  • Douglas R. Hofstadter, cognitive scientist and Pulitzer-winning author;
  • K. Eric Drexler, nanotechnology pioneer;
  • Christine L. Peterson, vice president of public policy for the Foresight Nanotech Institute;
  • Cory Doctorow, science-fiction novelist;
  • Nick Bostrom, philosopher and director of the Oxford Future of Humanity Institute;
  • Max More, futurist and cofounder of the Extropy Institute;
  • Eliezer S. Yudkowsky, a research fellow of the Singularity Institute for Artificial Intelligence;
  • John Smart, president of the Acceleration Studies Foundation;
  • Peter Thiel, president of Clarium Capital Management and former chief executive officer of PayPal;
  • Bill McKibben, author of Enough: Staying Human in an Engineered Age; and
  • Sebastian Thrun, director of the Stanford Artificial Intelligence Laboratory and leader of the Stanford Racing Team, whose robotic car won $2 million last year in the Grand Challenge sponsored by the Defense Advanced Research Projects Agency.
Among the issues the speakers will address: Can we shape the intelligence explosion for the benefit of humanity? Will our emotional, social, psychological and ethical intelligence keep up with our expanding cognitive abilities? Will smarter-than-human intelligence help reduce existential risks, such as the risk that advanced technology will be used in warfare or terrorism? Will productive nanosystems enable the development of intricate and complex productive systems, creating a feedback loop that accelerates change?

Steve Jurvetson, a Silicon Valley venture capitalist, and Thiel will moderate the summit.

    "To any thoughtful person, the Singularity idea, even if it seems wild, raises a gigantic, swirling cloud of profound and vital questions about humanity and the powerful technologies it is producing," said Hofstadter, author of Gödel, Escher, Bach: An Eternal Golden Braid, which won a Pulitzer Prize in 1980. "Given this mysterious and rapidly approaching cloud, there can be no doubt that the time has come for the scientific and technological community to seriously try to figure out what is on humanity's collective horizon. Not to do so would be hugely irresponsible."

Editor's Note:

The event is free and open to the public, but seating is limited. To RSVP, go to http://sss.stanford.edu/rsvptoday/.

-30-

Contact

Dawn Levy, News Service: (650) 725-1944, dawnlevy@stanford.edu

Comment

Todd Davies, Symbolic Systems Program: (650) 723-4091, tdavies@csli.stanford.edu
