Can Humans and Robots Interact Naturally?
Humans draw on a multitude of impulse reactions while navigating their surroundings. These signals, gestures, and movements, made without thinking, help people interact smoothly with one another; they prevent people from bumping into each other, for example. But what happens when humans and machines interact? Can these impulse reactions be automated in robots?
This question fascinated Dr. Wendy Ju, who joined the Jacobs Technion-Cornell Institute at Cornell Tech as Assistant Professor of Information Science last year. Ju is setting up a research lab for human-robot interaction, where she will explore how to design machines that interact easily and naturally with humans.
Watching what humans do
Dr. Ju has a PhD in mechanical engineering from Stanford and a master’s degree in media arts and sciences from MIT. She has spent years observing human interactions. “They’re basically very small conversations that enable everyday life: Who’s going to open the door? Who’s going to go in first or second? With cars we do it at four-way stops,” she said.
Watching what happens when things go wrong during these interactions can also be useful, she said. “I’m just a person that really likes to watch people. I really love looking at breakdowns, I like being in public places and seeing when these things get tripped up.”
Ju explained that machines do not automatically pick up the cues that people use every day; rather, everything they do requires a button press or an explicit command. Humans are constantly signaling to one another, so when a robot is added to the mix, how is it given the skills it needs to interact?
“I’m interested in that kind of interaction, knowing that is how people interact with one another, and thinking about what that means for how we design machines.”
According to Ju, these interactions are not always obvious to the people building robots, because the cues work so invisibly that they often go unnoticed. “I feel like I have an obligation, since I can see these things, to design the machines so that they work that way as well,” she said.
On the road with ghost drivers
One of Ju’s principal areas of research is autonomous cars. She’s carried out field experiments to see how people behave around them. To gauge these reactions, Ju and her team use what she calls “ghost drivers.” In light of a recent accident in Tempe, Arizona, where a woman was killed by an autonomous Uber car undergoing testing, Ju’s methods have particular significance. “We basically have a person dressed in a costume to look like a car seat driving on the road, and then we can do experiments to see how people react, but with the safety of an actual human controlling the car,” she said.
Through her work, Ju aims to answer specific questions about set scenarios. For example, how quickly will people react if a car hands control over to its passenger? What will people do when they encounter an autonomous car on the road? The results could be surprising. People tend to understand that autonomous cars and robots are learning, Ju said, and they often want to help. “People do different things to enforce norms about where the car’s supposed to be and when it’s supposed to stop,” she said.
Ju’s other work on robot-and-human interactions is focused on everyday situations. The machines she deploys are not actually autonomous; they’re controlled by people. The point is to try to anticipate how people might behave around robots in the future. How will they react to a robot collecting trash, cooking food, or running on the sidewalk?
Some people test the robots to see what they are capable of, while others try to help them. “If the robot plays dumb, if it doesn’t pick up a signal, people will do different things to really exaggerate what they’re doing, like wave the garbage or point at empty spaces,” Ju said. These, she said, are implicit cues which people amplify to teach the robot the correct way to do something.
By observing all of the different cues and behaviors, Ju can then incorporate them into machine-learning algorithms. This way, robots can learn from the natural things that humans do.
A new context in NYC
To date, Ju has carried out most of her experiments around Stanford. She has found that people there are pretty nonchalant when they encounter robots. “A lot of people are very pro-technology and they also work in technology so they’re very much like, ‘Oh yeah, this is happening now,’” she said.
Having recently moved to New York City, she is curious to find out whether people will behave differently or the same way. One of her first projects at Cornell Tech will explore how people interact with a troupe of chair robots: essentially chairs that can reconfigure themselves in a space. She will also carry out further on-road experiments with ghost drivers.
Ju relishes the opportunity to continue her work in this new context. The diverse department at the Jacobs Technion-Cornell Institute also appealed to her. “There are so many people with shared interests in both technology and the social issues around that,” she said. The city played its part too. “New York City was a big draw because I’m so interested in observing people,” she said. “I’m working at the intersection of technology and design; New York City’s such a great place for that.”