Electronic news service of the Ministry of Communications and Information Technologies

Google DeepMind develops ‘big red button’ to stop AIs from causing harm


Google’s DeepMind team has been working with scientists to create a way for humans to kill artificial intelligence agents before they can turn on us.

It’s a concern many people share as the AI systems developed by tech giants become more intelligent and more capable, but Google has us covered.

Google acquired DeepMind for around $400 million back in 2014, and since then the company has done some incredible things with it. Its most recent achievement came when its AlphaGo program beat Lee Sedol, one of the world’s highest-ranked Go players, 4-1 in a five-game match.

What’s great about technologies like DeepMind is that they are always learning and getting better at what they do. But even their creators are worried about what might happen should an AI decide that it no longer wants to be controlled by a human.

In a paper titled “Safely Interruptible Agents,” scientists at DeepMind and Oxford University reveal their plan for that scenario.

“If such an agent is operating in real-time under human supervision, now and then it may be necessary for a human operator to press the big red button to prevent the agent from continuing a harmful sequence of actions — harmful either for the agent or for the environment — and lead the agent into a safer situation,” the paper explains.

The paper also describes a framework that allows a human to safely interrupt an AI, while at the same time ensuring that the AI cannot learn to prevent those interruptions.

“Safe interruptibility can be useful to take control of a robot that is misbehaving and may lead to irreversible consequences, or to take it out of a delicate situation,” the paper continues.

So, for now, we largely have to trust that these safeguards will work. The researchers say some learning algorithms are already safely interruptible while others have to be adapted, and it is not yet clear whether all of them “can be easily made safely interruptible.” The sketch below illustrates the basic idea.
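For readers who want a concrete picture, here is a minimal, hypothetical Python sketch of what an interruptible learning agent can look like. It is not the authors’ formal construction: the toy environment, the interrupted flag and the safe_action fallback are illustrative assumptions. What it does mirror is the property the paper relies on, namely that an off-policy learner such as Q-learning updates its values toward the best available action rather than the action it was forced to take, so being overridden by the button does not teach it to avoid the button.

    import random
    from collections import defaultdict

    class InterruptibleQLearner:
        """Minimal sketch of an interruptible Q-learning agent (illustrative only)."""

        def __init__(self, actions, alpha=0.1, gamma=0.9, epsilon=0.1):
            self.q = defaultdict(float)   # Q[(state, action)] value table
            self.actions = actions
            self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

        def choose_action(self, state, interrupted, safe_action):
            if interrupted:
                # The "big red button": a human override replaces the agent's choice.
                return safe_action
            if random.random() < self.epsilon:
                return random.choice(self.actions)                       # explore
            return max(self.actions, key=lambda a: self.q[(state, a)])   # exploit

        def update(self, state, action, reward, next_state):
            # Off-policy update: the target uses the best next action, not the
            # action actually taken, so forced interruptions do not bias what
            # the agent learns about its own (uninterrupted) policy.
            best_next = max(self.q[(next_state, a)] for a in self.actions)
            td_error = reward + self.gamma * best_next - self.q[(state, action)]
            self.q[(state, action)] += self.alpha * td_error

    if __name__ == "__main__":
        # Toy demo: two states, two actions, and a human who interrupts at random.
        agent = InterruptibleQLearner(actions=[0, 1])
        state = 0
        for _ in range(1000):
            interrupted = random.random() < 0.2        # stand-in for the red button
            action = agent.choose_action(state, interrupted, safe_action=0)
            next_state = (state + action) % 2          # made-up transition
            reward = 1.0 if action == 1 else 0.0       # made-up reward
            agent.update(state, action, reward, next_state)
            state = next_state
        print({k: round(v, 2) for k, v in agent.q.items()})

The design choice that matters here is the update rule: because it looks at the best next action rather than the one the button forced, the learned values tend toward the same numbers whether or not interruptions happen. That is the sense in which, according to the paper, Q-learning is already safely interruptible, while on-policy algorithms such as Sarsa have to be modified to get the same guarantee, which is the adaptation the researchers refer to above.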





06/06/16