Date: 05/04/18
New device will allow you to talk to your computer telepathically
Researcher Arnav Kapur and Professor Pattie Maes have developed their prototype, AlterEgo, at MIT’s Media Lab – and are already foreseeing huge benefits in our increasingly technologically driven world.
The equipment has a hook which slips over the right ear, and sensors placed at seven key areas on the cheeks, jaw, and chin.
These sensors pick up speech-related neuromuscular signals and “translate” them using sophisticated AI algorithms.
Currently AlterEgo can recognise digits zero to nine and about one hundred words.
It is also linked to a program that can query Google and delivers answers via built-in headphones, giving users access to a vast store of knowledge using just their minds.
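The pipeline described above – sensor readings on the face turned into signals, classified into a small vocabulary, and answered aloud – can be sketched in outline. The Python below is purely illustrative and is not MIT's implementation: the feature extraction and the nearest-template classifier are assumptions standing in for the "sophisticated AI algorithms" the researchers use.

```python
# Illustrative sketch only: a stand-in for AlterEgo-style silent-speech
# recognition. Real systems use trained neural models, not this toy classifier.
import math

# A tiny vocabulary; the real device recognises digits and ~100 words.
def extract_features(raw_signal):
    """Reduce a raw 7-electrode signal window (one list of samples per
    electrode) to a small feature vector: mean absolute amplitude per
    electrode. A stand-in for real signal processing."""
    return [sum(abs(s) for s in channel) / len(channel) for channel in raw_signal]

def nearest_word(features, templates):
    """Classify by Euclidean distance to per-word template vectors,
    which would be learned during a user's training session."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda word: dist(features, templates[word]))

# Demo with synthetic data: 7 channels of a (fake) signal window.
templates = {"zero": [0.0] * 7, "one": [1.0] * 7}
signal = [[1, -1]] * 7                    # strong activity on all electrodes
word = nearest_word(extract_features(signal), templates)  # → "one"
```

In a real system the recognised word would then be passed to the search back-end and the result synthesised to the built-in headphones.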
In one of the researchers’ experiments, subjects used the system to report opponents’ moves in a chess game and receive silent computer-recommended responses.
In one test, Mr Kapur was asked the time in New Zealand; the answer appeared on a computer screen while AlterEgo whispered it through the headphones.
Getting the device to recognise internal speech patterns requires 31 hours of training from users.
However, Mr Kapur believes the more it is used, the more accurate it will become.
He said: “The motivation for this was to build an IA device — an intelligence-augmentation device.
“Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”
Professor Maes added: “We basically can’t live without our cellphones, our digital devices.
“But at the moment, the use of those devices is very disruptive.
“If I want to look something up that’s relevant to a conversation I’m having, I have to find my phone and type in the passcode and open an app and type in some search keyword, and the whole thing requires that I completely shift attention from my environment and the people that I’m with to the phone itself.
“So, my students and I have for a very long time been experimenting with new form factors and new types of experience that enable people to still benefit from all the wonderful knowledge and services that these devices give us, but do it in a way that lets them remain in the present.”
Facebook and Neuralink, Elon Musk’s brain-science venture, are also working on brain-computer interfaces, though theirs read signals directly from the brain rather than the nerve impulses AlterEgo relies on.
©ictnews.az. All rights reserved.