
Teaching a robot to dance? That, and maybe a lot more


Consisting of small motors and sensors under a flexible skin, the robot Keepon mimics dancing by coordinating random movements with the rhythm of a song. (Photo courtesy Marek Michalowski)

The robot named Keepon looks like two tennis balls stuck together, until it starts dancing. In a YouTube video seen nearly two million times, it bobs its head and gyrates to the indie rock band Spoon’s song “I Turn My Camera On.” Then there’s CB2, or “Child-robot with Biomimetic Body.” Meant to mimic a human baby, it looks like a huge gawky toddler, with a body covered in gray silicone. Kismet, a “social” robot with expressive eyes, eyebrows, lips and ears, can follow a human with its gaze and react to a person’s tone of voice.

Engineers say that these cute robots, which can recreate and react to our emotions, are helping them understand how humans interact with lifelike machines. This matters, they say, because robots could be working alongside humans in hospitals, homes and offices within a few years; in fact, many types of robots are already deployed in factories and offices in Japan. If humans are to accept these robots, engineers say, the robots will need to behave in a way that makes humans comfortable.

“A robot in a hospital should understand how quickly a person is walking down the hall,” said Marek Michalowski, a Ph.D. student at Carnegie Mellon University who programmed the dancing Keepon to study the importance of rhythm in human interaction. “A humanoid robot that we have in five or ten years needs to be able to nod as I talk to it. When we have interactive robots in the future, they will need to match our rhythms, otherwise our interactions will not be smooth.”

The Japanese engineers who created CB2 say its tactile sensors and cameras allow it to react to touch and movement, and its compressed air motors simulate the movement of muscles. They say they plan to teach the robot how to move more like a human. For now, the robot can only squirm at random, but its unveiling drew international attention, and many who viewed a video of the robot on YouTube thought CB2 seemed a little too human.

“Very creepy,” wrote one YouTube user.

“It’s odd, it makes me kinda sad to see this,” wrote another. “It almost looks like there is a person in there.”

American engineers have also designed robots that seem uncannily lifelike. Kismet was developed in the late 1990s by M.I.T. engineers, who used it to study human expression. Since then, engineer Cynthia Breazeal and her team have also created Leonardo, a furry robot with big eyes and ears that looks like a character from the 1984 movie Gremlins. Another robot called Autom, which takes the form of a simplified head and torso, is designed to give moral support to people who are trying to lose weight. Using face-tracking software, it maintains eye contact while a human enters information about their diet and exercise on a touch-screen. The Huggable is a robotic teddy bear, which researchers are planning to test as part of therapy for hospital patients.

As people interact with these expressive robots, it’s easy for them to feel that the robots are expressing real emotions. Nicholas Epley, a behavioral scientist at the University of Chicago who studies anthropomorphism, said that this reaction is only natural.

“Cues that another agent might be a little like us, or like humans in general,” Epley said, lead us to draw on “our own experience and the intuitive knowledge we have about how people work.” Humans have been programmed by evolution to recognize lifelike characteristics and expect things to behave in a lifelike manner. In a sense, we’re easily tricked by robots.

In truth, most of these robots are fairly simple. Keepon, for example, consists of small motors under a soft rubber skin. It has cameras in its eyes and a microphone in its nose, which allow it to respond to motion and sound with movements of its own. It can turn its head, nod, rock from side to side and bob up and down, but that’s it. To make Keepon dance, Michalowski explained, he programmed it to make random movements in time with the music.
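Michalowski’s description amounts to a simple recipe: choose a movement at random, but trigger it on the beat. As a rough sketch of that idea (not Michalowski’s actual code; the Robot class, movement names and fixed tempo below are invented for illustration), the logic might look like this in Python:

```python
import random
import time

# A minimal sketch of rhythm-synced "dancing" in the spirit of
# Michalowski's description: pick movements at random, but trigger
# each one on the beat. Everything here is a hypothetical stand-in,
# not Keepon's actual software interface.

BPM = 120                   # assumed tempo of the song
BEAT_INTERVAL = 60.0 / BPM  # seconds between beats

class Robot:
    """Hypothetical wrapper for Keepon's four degrees of freedom."""
    def nod(self):  print("nod")
    def turn(self): print("turn head")
    def rock(self): print("rock side to side")
    def bob(self):  print("bob up and down")

def dance(robot, beats=16):
    moves = [robot.nod, robot.turn, robot.rock, robot.bob]
    for _ in range(beats):
        random.choice(moves)()     # a random movement...
        time.sleep(BEAT_INTERVAL)  # ...locked to the rhythm

if __name__ == "__main__":
    dance(Robot())
```

The point of the sketch is the timing loop: the movements themselves are arbitrary, and the impression of dancing comes entirely from locking them to the beat.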

“I think it’s one of the best robots around,” Michalowski said, of the device designed by Japanese engineer Hideki Kozima, “because it’s simple in appearance, so you don’t expect it to do that much.”

Human-robot interaction is a popular and highly visible area of study, but robots that look emotional can make the field of robotics appear much more advanced than it is. As the latest humanoid robots make the rounds of the Internet, it is tempting to conclude that the age of sentient androids--think “Blade Runner” or “Terminator”--is imminent. But today’s robots have a long way to go before they can do the dishes, let alone take over the world. Though the hardware is advanced, researchers are still working on fundamental software problems, like recognizing shapes, grasping objects, and finding the best path around a crowded room.

“Robotics is a very wide field,” said Matei Ciocarlie, a Ph.D. student at Columbia University’s robotics lab. “We’ve got a lot going on.”

At one end of Columbia’s lab, researchers are using a computer program to manipulate a tiny needle and microscope, which are used to analyze protein crystals, in a process called micro-robotics, Ciocarlie said. Other engineers are investigating the use of robots for remote-controlled laparoscopic surgery. A small robot that looks vaguely like a dog on wheels can be controlled wirelessly by programs running on a laptop. “This little guy,” Ciocarlie said, is used mainly for teaching undergraduate students about robots.

Ciocarlie studies grasping, or the best way for a robot hand to hold and pick up an object. A robot hand that can grasp objects well, he said, could be used in a prosthesis for humans. But this is one of the biggest challenges for robots, Ciocarlie said. “People pick up and grasp things intuitively, but for robots it’s still a very big problem.”

There are many unsolved problems when it comes to the all-purpose humanoid robot that people imagine when they think of robots, Ciocarlie said. Such a robot would have to be good at navigating around its environment, seeing objects, picking things up, and interacting with humans--all at the same time.

“It’s a lot of little problems, and none has been solved,” Ciocarlie said.

While we might one day have robots in our living rooms or kitchens, scientists can’t predict how long it will take for these creations to genuinely understand emotion, rather than simply mimic facial expressions. In the meantime, however, Keepon and his pals keep generating interest, and curiosity about what comes next.

E-mail: amc2235@columbia.edu