Date: 05/07/18

New System Enables Robot Control Using Brainwaves and Hand Gestures

A new system enables users to control the movements of robots using brain signals and simple hand gestures.
 
The new system, created by researchers from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL), works by harnessing the power of electroencephalography (EEG) and electromyography (EMG). A human user is outfitted with a series of electrodes on their scalp and forearm, which are then linked to the robot.
 
Traditionally, EEG signals are not always reliably detectable, and EMG signals are often difficult to map to motions more specific than simple commands like "move left" or "move right." By merging the two signals, however, the researchers created a system with more robust biosensing.
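To make that fusion concrete, here is a minimal, hypothetical sketch of how an error score from EEG and a gesture decision from EMG might be combined. The thresholds, two-channel layout, and simple rectified-amplitude averaging are illustrative assumptions, not the trained classifiers described in the paper.

    import numpy as np

    # Illustrative fixed cutoffs; the actual system learns its
    # classifiers from data rather than using hand-set thresholds.
    EEG_ERROR_THRESHOLD = 0.5
    EMG_GESTURE_THRESHOLD = 0.3

    def detect_error(eeg_window: np.ndarray) -> bool:
        # Stand-in for an error-related-potential detector: flag a
        # window whose mean rectified amplitude exceeds a threshold.
        return float(np.abs(eeg_window).mean()) > EEG_ERROR_THRESHOLD

    def classify_gesture(emg_left: np.ndarray, emg_right: np.ndarray) -> str:
        # Stand-in for an EMG gesture classifier: compare rectified
        # activity on two forearm channels to pick a scroll direction;
        # strong activity on both channels is read as "confirm".
        left = float(np.abs(emg_left).mean())
        right = float(np.abs(emg_right).mean())
        if left > EMG_GESTURE_THRESHOLD and right > EMG_GESTURE_THRESHOLD:
            return "confirm"
        if max(left, right) < EMG_GESTURE_THRESHOLD:
            return "none"
        return "scroll_left" if left > right else "scroll_right"

    # Example with synthetic windows: quiet EEG, active left forearm.
    rng = np.random.default_rng(0)
    print(detect_error(rng.normal(scale=0.3, size=256)))               # False
    print(classify_gesture(rng.normal(size=256),
                           0.05 * rng.normal(size=256)))               # scroll_left

The point of the pairing is that each channel covers the other's weakness: the EEG side only has to answer a coarse yes/no question (did the user notice a mistake?), while the EMG side supplies the finer spatial choice.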
 
"By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong," PhD candidate Joseph DelPreto, lead author on the study, said in a statement.
 
As a robot performs a task, the new system monitors brain activity to detect in real time whether the user has noticed an error. A second interface measures muscle activity, letting the person make hand gestures to scroll through options and select the correct one for the robot to execute.
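Put together, the interaction is a supervision loop: the robot acts on its current choice, the EEG monitor can interrupt it, and EMG gestures steer the correction. Below is a hypothetical sketch of that loop, reusing the toy detect_error and classify_gesture helpers above; every method on the robot and stream objects (done, move_toward, pause, latest_window, and so on) is an invented interface, named only to make the control flow readable.

    def supervise(robot, eeg_stream, emg_stream, targets):
        # Act on the current choice; if an error is detected in the
        # EEG, pause and let gestures scroll to the intended target.
        choice = 0
        while not robot.done():
            robot.move_toward(targets[choice])
            if detect_error(eeg_stream.latest_window()):
                robot.pause()
                while True:
                    gesture = classify_gesture(*emg_stream.latest_channels())
                    if gesture == "scroll_left":
                        choice = (choice - 1) % len(targets)
                    elif gesture == "scroll_right":
                        choice = (choice + 1) % len(targets)
                    elif gesture == "confirm":
                        break
                robot.resume()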
 
The researchers demonstrated the new system by having a humanoid robot move a power drill to one of three possible targets on the body of a mock plane. With human supervision, the robot went from correctly choosing the intended target 70 percent of the time to more than 97 percent of the time.
 
The results showed that the system works for people who have never used it before, meaning that companies could deploy it in real-world settings without lengthy and costly training sessions.
 
"This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we've been able to do before using only EEG feedback," CSAIL director Daniela Rus, who supervised the work, said in a statement. "By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity."
 
Previously, robotic systems could generally recognize brain signals only when people trained themselves to think in very specific but arbitrary ways that the system had also been trained on. For example, a human might look at different light displays, each corresponding to a different robotic task, during a training session.
 
The new system harnesses the power of brain signals called error-related potentials (ErrPs), which naturally occur when people notice mistakes.
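As a rough illustration of what an error-related-potential detector looks at: the EEG is epoched around the moment the robot commits to an action, and a deflection in a short post-event window is scored. The window bounds below (roughly 200 to 500 ms after onset) reflect commonly reported ErrP latencies and are an assumption, as is the fixed-window averaging; the paper's detector is a trained classifier, not a window mean.

    import numpy as np

    def errp_score(eeg: np.ndarray, fs: float, onset_s: float) -> float:
        # Toy score: mean amplitude in a window roughly 200-500 ms
        # after the event onset, where ErrP deflections are often
        # reported. A real detector would be learned from data.
        start = int((onset_s + 0.2) * fs)
        stop = int((onset_s + 0.5) * fs)
        return float(eeg[start:stop].mean())

    # Example: 2 s of synthetic EEG at 256 Hz, event at t = 1.0 s.
    fs = 256.0
    eeg = np.random.default_rng(1).normal(size=int(2 * fs))
    print(errp_score(eeg, fs, onset_s=1.0))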
 
"What's great about this approach is that there's no need to train users to think in a prescribed way," DelPreto said. "The machine adapts to you, and not the other way around."
 
The researchers will present their new paper at the Robotics: Science and Systems (RSS) conference in Pittsburgh from June 26 to 30.




