Date: 02/02/16
Scientists decode brain signals nearly at speed of perception
Neuroscientists from the University of Washington have decoded brain signals in real time and with remarkable accuracy, as revealed in a recent study published in PLOS Computational Biology.
Researchers attached electrodes to the temporal lobes of seven epilepsy patients for roughly one week. The implants were part of a program that aimed to locate the sources of these patients' seizures, but while the electrodes were active, the patients also participated in this brain-wave study. Researchers were in the neighborhood, after all.
The participants viewed a series of houses and faces that appeared on a screen for 400 milliseconds at a time, and were told to look for the upside-down building. An algorithm tracked the brain waves from their temporal lobes, which deal in sensory input. By the end of each session, the program could pinpoint, in real time and with roughly 96 percent accuracy, which image a patient was looking at. The program knew whether the patient was seeing a house, a face or a gray screen within 20 milliseconds of actual perception.
"Clinically, you could think of our result as a proof of concept toward building a communication mechanism for patients who are paralyzed or have had a stroke and are completely locked-in," UW computational neuroscientist Rajesh Rao said.
©ictnews.az. All rights reserved.