This robot learns to clean your space just the way you like it
The one-armed “TidyBot” needs only a few starter examples to figure out how you like your stuff put away.
Whenever Shuran Song teaches robotics, she asks her students the same question: What daily tasks would you most like a robot to do?
“The two requests that always come up are ‘washing dishes’ and ‘cleaning my room,’” said Song, assistant professor of electrical engineering at Stanford.
So far, the first task is best left to a dishwasher. But for the second, Song and an interinstitutional team of roboticists may have a solution: TidyBot, a one-armed helper that cleans according to personal preferences. The group, which includes researchers from Princeton University, The Nueva School, Columbia University, and Google, presented a paper on the robot and its performance at the International Conference on Intelligent Robots and Systems on October 2nd. An extended journal version of the paper will appear soon in Autonomous Robots in the special issue on robots and large language models.
“We all experience it: our home or lab or office gets messy,” said Jeannette Bohg, assistant professor of computer science at Stanford and another of the paper’s authors. “Now imagine you just tell a robot, ‘In my place, this is where that thing goes, and that is where this thing goes.’ Then you leave and let the robot tidy up.”
For personal pick-up, no one size fits all
To navigate your space, TidyBot rolls around on powered wheels hidden beneath a box-shaped platform. A seven-jointed arm sprouting from the top ends in a two-fingered gripper for manipulating objects, from opening a drawer and gently placing a toy inside to precisely tossing an empty can into a bin. Cameras on the ceiling allow TidyBot to pinpoint objects and obstacles across a room, while others on its body provide close-up views to analyze items in front of it.
But it’s TidyBot’s adaptability, not just its physical form, that could make it so useful around the house.
“Household organization is so personal,” Bohg said. Recognizing this, the group worked to make TidyBot customizable to different preferences.
TidyBot uses a large language model (a type of AI similar to the one used in ChatGPT) trained on massive amounts of information from the internet to identify broad categories of objects like clothes, food, and toys. This means you can give TidyBot just a few examples of where certain belongings go and it will understand to clean up similar items in the same ways – even if it’s never seen those items before.
Suppose, for instance, that you want your kid’s loose toys back on their shelf. “You could say, ‘I want the teddy bear on the shelf,’ and TidyBot is able to generalize that concept to other toys in your home,” Song explained. TidyBot also recognizes qualities like color, allowing it to do things like sort light and dark laundry. In real-world tests, the robot can properly put away 85% of objects.
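The generalization step described above works by turning a handful of user-supplied examples into a few-shot prompt for the language model, which then infers the general rule (for example, "toys go on the shelf") and applies it to unseen objects. The sketch below is a minimal, hypothetical illustration of that prompting pattern; the function name, prompt wording, and example objects are assumptions for illustration, not the paper's exact format.

```python
def build_placement_prompt(examples, new_object):
    """Format user-supplied (object, receptacle) examples into a
    few-shot prompt. An LLM completing this prompt is implicitly
    asked to infer the underlying rule (e.g. 'toys -> shelf') and
    apply it to an object it has never seen."""
    lines = ["Put away objects according to these examples:"]
    for obj, place in examples:
        lines.append(f"- {obj} -> {place}")
    # The unfinished final line is what the model completes.
    lines.append(f"- {new_object} ->")
    return "\n".join(lines)

prompt = build_placement_prompt(
    [("teddy bear", "shelf"),
     ("toy car", "shelf"),
     ("empty soda can", "recycling bin")],
    "stuffed rabbit",
)
print(prompt)
```

Sent to a large language model, a prompt like this would typically be completed with "shelf", since the model's internet-scale training lets it recognize that a stuffed rabbit belongs to the same broad category as a teddy bear.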
An endearingly imperfect helper
That 85% success rate means TidyBot still makes mistakes, of course – sometimes to comical effect. Bohg recalls a visit from a former adviser during a TidyBot demo.
“He was visible in TidyBot’s ‘playing field,’ and TidyBot tried to pick him up,” she laughed.
Luckily, it didn’t succeed: the team designed the robot mainly for handheld-sized objects or soft, deformable items like clothing. TidyBot also struggles with difficult-to-grasp shapes, like a credit card lying flat on a table. Bohg hopes to improve robots’ abilities to replicate human strategies in scenarios like this – such as dragging the card to the table’s edge to pick it up. And she’d like to investigate whether multiple TidyBots could cooperate to accomplish bigger tasks like moving furniture or making a bed.
Like most humans, TidyBot also has more trouble with new spaces than with new objects: an unfamiliar kind of drawer in a strange location might trip it up, just as you might need time to learn your way around a new kitchen.
“The challenge is having TidyBot work in all sorts of environments – in your home and my home, in this lab and the other lab,” Bohg said.
Getting TidyBot to generalize across environments as well as objects will take time and effort. But as large language models and vision models improve, so will TidyBot’s ability to process information and make correct decisions in new scenarios. In theory, Song said, there may be no limit to how many combinations of items and environments TidyBot could eventually work with. As long as an object exists on the internet, TidyBot could learn to recognize and categorize it.
“Maybe in the long term we can think about a more general-purpose robot that’s also able to cook my meal, make my bed, do my laundry,” Song mused. But for now, she said, “If I have a robot that’s able to reliably put my things away, that’s a pretty good goal to achieve already.”
Jimmy Wu, a graduate student at Princeton University and visiting scholar at Stanford, is lead author of the forthcoming paper. Additional Stanford co-authors include postdoctoral fellow Rika Antonova and graduate student Marion Lepert. Bohg is also a faculty affiliate of the Institute for Human-Centered Artificial Intelligence (HAI). Song is also an assistant professor, by courtesy, of computer science.
This research was funded by the Princeton School of Engineering, the Toyota Research Institute, and the National Science Foundation.