Google DeepMind breakthrough: the "Neural Turing Machine" mimics the brain's short-term memory

DeepMind has built a computer it calls a "Neural Turing Machine": a neural network with access to an external memory, much like the memory of a conventional Turing machine. The result is a machine whose short-term memory can mimic that of the human brain.

In neuroscience, one of the biggest challenges is to understand short-term working memory in the human brain. At the same time, computer scientists have long struggled to reproduce the same kind of working memory in silicon.

Early this year, Google spent 400 million dollars to acquire the startup DeepMind, which today announced a prototype computer that tries to mimic some of the properties of the human brain's short-term working memory.
The new computer couples a neural network to an external memory it can work with. The result is a machine that can store variables in its memory, retrieve them later to perform simple logical tasks, and be trained to do all of this.
DeepMind's breakthrough builds on a long history of research into human short-term memory. In the 1950s, the American cognitive psychologist George Miller carried out one of the most famous experiments in the history of brain science. Miller was fascinated by the capacity of the human brain's working memory, and he set out to measure it, inviting a large number of students to take part in his experiments.

Miller's results showed that the capacity of short-term memory cannot be measured by the amount of information it contains. Instead, his experiments suggested that working memory stores information in "chunks," and that it can hold roughly seven of them.

This raises a strange question: what is a "chunk"? In Miller's experiments, a chunk could be a single digit such as "4", a letter such as "Q", a word, or a short phrase. So each chunk can represent anything from a tiny amount of information to a highly complex idea equivalent to a great deal of information.

Yet no matter how much information a single chunk represents, the human brain can hold only about seven of them in working memory.

Here is an example. Read the following sentence carefully: "This book is a thrilling read with a complex plot and lifelike characters."

This sentence is made up of about seven chunks of information and is clearly easy for any ordinary reader to understand.

By contrast, try reading this sentence: "This book about the Roman Empire during the first years of Augustus Caesar's reign at the end of the Roman Republic, describes the events following the bloody Battle of Actium in 31 BC when the young emperor defeated Mark Antony and Cleopatra by comprehensively outmaneuvering them in a major naval engagement."

This sentence contains at least 20 chunks of information, so do not be surprised if you find it harder to read. The human brain's working memory genuinely struggles to handle that many chunks at once.

In cognitive science, the ability to parse part of a sentence and hold it in working memory is called variable binding. It is the ability to take a piece of data, assign it to a slot in memory, and repeat that operation over and over.

Throughout the 1990s and 2000s, computer scientists repeatedly tried to design algorithms, circuits, and neural networks that could perform this kind of operation. Such a computer should be able to parse a simple sentence like "Mary told John" and work out that Mary plays the role of the one doing the telling, that telling is the action, and that John plays the role of the listener.
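
To see what variable binding amounts to, here is a toy sketch in Python. It is only an illustration of the idea, not DeepMind's system or any of the earlier approaches: the sentence is split into pieces and each piece is assigned to a named slot in a small "working memory".

```python
# Toy illustration of variable binding: parse "Mary told John" into role slots.
# This is not DeepMind's method; it only shows what "binding" means here.
sentence = "Mary told John"
subject, verb, obj = sentence.split()

# Bind each piece of data to a named slot in a tiny "working memory".
working_memory = {
    "speaker": subject,   # who performs the action
    "action": verb,       # what is being done
    "listener": obj,      # who receives the action
}

print(working_memory)  # {'speaker': 'Mary', 'action': 'told', 'listener': 'John'}
```

The hard part, and what the earlier research struggled with, is getting a neural network to learn this kind of store-and-retrieve behaviour on its own rather than having it hand-coded as above.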

DeepMind's researchers say these earlier machines had very limited abilities. Their new architecture draws on that previous work and improves on it.

They began by rethinking the nature of neural networks. Until now, a neural network has been a pattern of interconnected "neurons" that can change the strength of their connections in response to external input. That is a form of learning which lets the network find similarities between different inputs.
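
As a rough idea of what "changing the strength of connections" means in practice, here is a toy sketch of a single artificial neuron trained by ordinary gradient descent. It has nothing to do with DeepMind's architecture; it only shows connection weights adapting in response to inputs until the desired outputs appear.

```python
import numpy as np

# A single "neuron" whose connection strengths (weights) change in response
# to external inputs, so that different inputs produce the desired outputs.
rng = np.random.default_rng(0)
weights = rng.normal(size=3)                     # connection strengths to 3 inputs

def neuron(x):
    return 1.0 / (1.0 + np.exp(-weights @ x))    # sigmoid activation

# Teach the neuron to fire for one input pattern and stay silent for another.
examples = [(np.array([1.0, 0.0, 1.0]), 1.0),
            (np.array([0.0, 1.0, 0.0]), 0.0)]

for _ in range(1000):
    for x, target in examples:
        out = neuron(x)
        weights += 0.5 * (target - out) * out * (1 - out) * x   # adjust strengths

print(round(neuron(np.array([1.0, 0.0, 1.0])), 2))   # close to 1.0
print(round(neuron(np.array([0.0, 1.0, 0.0])), 2))   # close to 0.0
```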

However, ordinary computation involves an important extra ingredient: an external memory that is read from and written to as the computation proceeds. In Turing's famous description of a computer, the memory is like a paper tape that moves back and forth, storing all kinds of symbols for the machine to process later.

Conventional neural networks have no such readable and writable memory, so the DeepMind team simply added one. This lets the neural network store variables in its memory and return to them in subsequent computations.

This is similar to the way an ordinary computer puts the numbers 3 and 4 into internal registers and then adds them to produce 7. The difference is that the neural network can store much more complex representations of its variables.
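
As a rough sketch of how such a read/write memory can work, the snippet below implements, in NumPy, the kind of content-addressed memory matrix described in DeepMind's Neural Turing Machine paper (Graves et al., 2014): the network reads and writes every memory row through soft attention weights. The trainable controller network, location-based addressing, and the training procedure are omitted, and all names and sizes here are illustrative.

```python
import numpy as np

N, W = 8, 4                       # number of memory rows, width of each row

def content_weights(memory, key, beta=10.0):
    """Soft attention over memory rows via cosine similarity to a key vector."""
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    scores = np.exp(beta * (memory @ key) / norms)
    return scores / scores.sum()

def write(memory, weights, erase, add):
    """Blend new content into every row in proportion to its attention weight."""
    return memory * (1 - np.outer(weights, erase)) + np.outer(weights, add)

def read(memory, weights):
    """Return a weighted average of the memory rows."""
    return weights @ memory

memory = np.zeros((N, W))

# Store a 'variable' in row 2 (a fixed location, for clarity), then retrieve it
# later purely by content: a key resembling the stored vector pulls it back out.
write_w = np.zeros(N)
write_w[2] = 1.0
memory = write(memory, write_w, erase=np.ones(W), add=np.array([3.0, 4.0, 0.0, 0.0]))

read_w = content_weights(memory, key=np.array([3.0, 4.0, 0.0, 0.0]))
print(read(memory, read_w))       # approximately [3. 4. 0. 0.]
```

Because every read and write is a weighted blend over all of the rows, the whole mechanism is differentiable, which is what allows the network to be trained to decide for itself what to store and when to retrieve it.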

Because this form of computing behaves differently from a traditional neural network, DeepMind gave it a new name: the Neural Turing Machine, the first of which has now been built. A Neural Turing Machine learns like a conventional neural network, by taking in input from the outside world, but it also learns how to store that information and when to retrieve it.

DeepMind's work had two parts: first build the device, then experiment with it. The experiments tested whether, once a Neural Turing Machine has learned to perform a particular task, it can extend that ability to larger or more complex versions of the same task.
