
News Service

November 23, 2009

As robots become more common, Stanford experts consider the legal challenges

By Adam Gorlick

They already detect and defuse bombs, control traffic patterns and do some basic household chores. And scientists predict that pretty soon, robots will be using artificial intelligence to play a larger role on the battlefield, operate our vehicles and take care of us in old age.

But who will be to blame if a robot-controlled weapon kills a civilian? Who can be sued if one of those new cars takes an unexpected turn into a crowd of pedestrians? And who is liable if the robot you programmed to bathe your elderly mother drowns her in the tub?

As mechanical engineers and computer scientists at Stanford develop technology that will transform the stuff of science fiction into everyday machinery, scholars at the Law School are thinking about the legal challenges that will arise.

"I worry that in the absence of some good, up-front thought about the question of liability, we'll have some high-profile cases that will turn the public against robots or chill innovation and make it less likely for engineers to go into the field and less likely for capital to flow in the area," said M. Ryan Calo, a residential fellow at the Law School's Center for Internet and Society.

And the consequence of a flood of lawsuits, he said, is that the United States will fall behind other countries – like Japan and South Korea – that are also at the forefront of personal robot technology, a field that some analysts expect to exceed $5 billion in annual sales by 2015.

"We're going to need to think about how to immunize manufacturers from lawsuits in appropriate circumstances," Calo said, adding that defense contractors are usually shielded from liability when the robots and machines they make for the military accidentally injure a soldier.

"If we don't do that, we're going to move too slowly in development," Calo said. "When something goes wrong, people are going to go after the deep pockets of the manufacturer."

Calo and his colleagues at Stanford are among the first in the country to ponder the potential legal questions facing the emerging field of personal robotics. And the issues go beyond claims of personal injury and property damage.

To navigate, robots are outfitted with cameras and sensors. And because they run on computer software, they're susceptible to hacking. So a robot designed to clean your house could be turned into a spy, vandal or thief.

And some predict that someone will eventually sue for the right to marry a robot.

"Don't laugh," Paul Saffo, a technology forecaster and visiting scholar at Stanford's Media X project, said during a recent panel discussion held at the Law School to address the legal challenges surrounding robotics. "People get emotionally attached to their robots."

Saffo said about two-thirds of people who own Roombas – robotic vacuum cleaners made by the Massachusetts company iRobot – have given names to their machines. Some owners even take them on vacation and treat them like friends or family members, he said.

Some soldiers in Iraq and Afghanistan have reportedly developed unusual bonds with the robots they've used to detect roadside bombs, said Kenneth Anderson, a research fellow at the Hoover Institution.

"Soldiers come back and talk about their IED-detector robots in a way that [shows] they've developed deep relationships," he said. "They'll risk their lives so the robot doesn't get shot."

While most robots don't bear a strong physical resemblance to humans, they are increasingly being built to think like them. In the Gates Computer Science Building, researchers are developing personal robots that can make their own way in the world.

Mounted on the bottom half of a Segway scooter, the metal frame of STAIR – the Stanford Artificial Intelligence Robot – is stacked with cameras, sensors and wires. From the center of its "body," a single arm with pincers at its end can extend to grasp and lift items, push buttons and move things out of the way.

Depending on how it's programmed, STAIR can move through a hallway to call an elevator, fetch a stapler from a desk in another room or unload a dishwasher.

"One of the things robotics researchers often think about is how to design robots that are safe and can help us in our homes so that we can even trust them around our children," said Andrew Ng, an associate professor of computer science who helped design and build STAIR.

Before autonomous robots become commonplace in our homes and offices, we'll be dealing more with the consequences of small robots that operate behind the scenes: the machines that run MRI scanners, subway systems and city traffic lights.

"That's the scary part," Saffo said. "I predict some company will go bankrupt because of a little bot that goes out of control. We're not heading into a nirvana, and it's not going to be hell. But along the way, humans will get killed and cars will go out of control."

-30-

Contact

Adam Gorlick, Stanford News Service: (650) 725-0224, agorlick@stanford.edu

Comment

Ryan Calo, Stanford Law School: (650) 736-8675, rcalo@stanford.edu
