Date: 29/10/16

Google's neural network learns to encrypt its own messages

Researchers from the Google Brain deep learning project have already taught AI systems to make trippy works of art, but now they're moving on to something potentially darker: AI-generated, human-independent encryption. According to a new research paper, Googlers Martín Abadi and David G. Andersen have willingly allowed three test subjects -- neural networks named Alice, Bob and Eve -- to pass each other notes using an encryption method they created themselves.
 
As New Scientist reports, Abadi and Andersen assigned each AI a task: Alice had to send a secret message that only Bob could read, while Eve would try to eavesdrop and decode the message herself. The experiment started with a plain-text message that Alice converted into unreadable gibberish, which Bob could decode using a cipher key. At first, Alice and Bob were apparently bad at hiding their secrets, but over the course of 15,000 attempts Alice worked out her own encryption strategy and Bob simultaneously figured out how to decrypt it. The message was only 16 bits long, with each bit being a 1 or a 0, so the fact that Eve could only guess about half of the bits correctly means she was doing no better than flipping a coin for each one.
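For the curious, the setup can be sketched in a few lines of PyTorch. This is a loose illustration, not the researchers' actual method: the paper's networks mix fully connected and convolutional layers and weight their loss terms differently, so the layer sizes, optimizer settings and loss below are assumptions chosen for brevity.

    # Minimal sketch of adversarial neural cryptography (assumptions noted above).
    import torch
    import torch.nn as nn

    N = 16  # message and key length in bits, as in the experiment

    def mlp(in_dim, out_dim):
        # Small fully connected network; the paper's architecture differs.
        return nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, out_dim), nn.Tanh(),  # outputs in (-1, 1), read as bits
        )

    alice = mlp(2 * N, N)  # sees plaintext + key, emits a ciphertext
    bob   = mlp(2 * N, N)  # sees ciphertext + key, recovers the plaintext
    eve   = mlp(N, N)      # sees only the ciphertext

    opt_ab = torch.optim.Adam(list(alice.parameters()) + list(bob.parameters()))
    opt_e  = torch.optim.Adam(eve.parameters())

    for step in range(15000):  # roughly the number of attempts reported
        # Random plaintexts and keys with bits encoded as -1/+1.
        p = torch.randint(0, 2, (256, N)).float() * 2 - 1
        k = torch.randint(0, 2, (256, N)).float() * 2 - 1

        # Alice and Bob: make Bob accurate while holding Eve at chance level.
        c = alice(torch.cat([p, k], dim=1))
        bob_err = (bob(torch.cat([c, k], dim=1)) - p).abs().mean()
        eve_err = (eve(c) - p).abs().mean()
        # A random +/-1 guesser has a mean absolute error of 1.0 per bit,
        # so penalize any deviation of Eve's error from that chance level.
        loss_ab = bob_err + (1.0 - eve_err) ** 2
        opt_ab.zero_grad(); loss_ab.backward(); opt_ab.step()

        # Eve: try to reconstruct the plaintext from the ciphertext alone.
        c = alice(torch.cat([p, k], dim=1)).detach()
        loss_e = (eve(c) - p).abs().mean()
        opt_e.zero_grad(); loss_e.backward(); opt_e.step()

Eve recovering roughly half the bits, as reported, corresponds to that chance-level error: Alice and Bob succeed exactly when Eve can do no better than guessing.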
 
Of course, the personification of these three neural networks oversimplifies things a little: because of the way machine learning works, even the researchers don't know what kind of encryption method Alice devised, so it won't be very useful in any practical applications. In the end, it's an interesting exercise, but we don't have to worry about the machines talking behind our backs just yet. With open-source deep learning tools like Microsoft's Cognitive Toolkit now widely available, it would be interesting to see this play out on an even larger scale.











