Date: 01/03/16

People will follow a robot in an emergency – even if it’s wrong

A university student is holed up in a small office with a robot, completing an academic survey. Suddenly, an alarm rings and smoke fills the hall outside the door.

The student is forced to make a quick choice: escape via the clearly marked exit that they entered through, or head in the direction the robot is pointing, along an unknown path and through an obscure door.

That was the real choice posed to 30 subjects in a recent experiment at the Georgia Institute of Technology in Atlanta. The results surprised researchers: almost everyone elected to follow the robot – even though it was taking them away from the real exit.

“We were surprised,” says Paul Robinette, the graduate student who led the study. “We thought that there wouldn’t be enough trust, and that we’d have to do something to prove the robot was trustworthy.”

The unexpected result is another piece of a puzzle that roboticists are struggling to solve. If people don’t trust robots enough, the machines probably won’t succeed in helping us escape disasters or otherwise navigate the real world. But we also don’t want people to follow the instructions of a malicious or buggy machine. For researchers, the nature of that human-robot relationship remains elusive.

In the emergency study, Robinette’s team used a modified Pioneer P3-AT, a robot that looks like a small bin on wheels with light-up foam arms it uses to point. Each participant individually followed the robot along a hallway until it pointed to the room they were to enter. They then filled in a survey rating the robot’s navigation skills and read a magazine article. The emergency was simulated with artificial smoke and a First Alert smoke detector.

A total of 26 of the 30 participants chose to follow the robot during the emergency. Of the remaining four, two were thrown out of the study for unrelated reasons, and the other two never left the room.

The results suggest that if people are told a robot is designed for a particular task – as was the case in this experiment – they will probably trust it to perform that task automatically, the researchers say. Indeed, in a survey given after the fake emergency was over, many participants explained that they followed the robot specifically because it was wearing a sign that read “EMERGENCY GUIDE ROBOT.”



