06 Jul 2017 | World innovation news | Information and Communications Technologies
The Memristor, a Pillar in Adaptive Neural Network Architecture
A team from the University of Michigan has created a prototype computer circuit inspired by mammalian vision. Using an unsupervised learning algorithm, the circuit can process complex data such as images and video faster, and with far less energy, than the most advanced existing systems.
In today’s computers, shuttling data between the processor and memory is slow and energy intensive. The system created by the team can process images and videos 1,000 times faster with 10,000 times less power, without losing accuracy. Such rapid image processing could be used in several stand-alone devices, such as self-driving vehicles.
Adaptive Neural Network Architecture
This technology uses an efficient recognition model that streamlines image processing. The model operates through an adaptive neural network that uses the memristor as both a data processing and a storage unit.
Adaptive neural networks learn the characteristics of an image instead of memorizing the values of each pixel. This method generates simpler representations in memory—for example, only two features, “round” and “red”, may be sufficient to determine that a traffic light indicates “stop”.
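The idea of deciding from a few learned features rather than raw pixels can be illustrated with a small sketch. The feature names, thresholds, and functions below are purely hypothetical, not part of the team's system:

```python
# Hypothetical sketch: interpreting a scene from two compact features
# instead of memorizing every pixel value. All names and thresholds
# here are illustrative.

def extract_features(image):
    """Reduce an image to a compact feature vector.

    A real system would compute these from learned filters; here we
    simply read two precomputed scores, 'roundness' and 'redness'."""
    return (image["roundness"], image["redness"])

def interpret_traffic_light(features):
    roundness, redness = features
    # Two features suffice: a round, red region signals "stop".
    if roundness > 0.8 and redness > 0.8:
        return "stop"
    return "unknown"

scene = {"roundness": 0.95, "redness": 0.9}   # stands in for an image
print(interpret_traffic_light(extract_features(scene)))  # prints "stop"
```

The point of the sketch is the size of the representation: two numbers per scene instead of thousands of pixel values.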
This mirrors how the mammalian brain processes images: incoming visual information is broken down into characteristic elements, such as shape, color, and movement, which are transmitted to cortical areas that give meaning to the representation.
[Figure: Reception and interpretation of visual information in the human brain]
How did the team translate this neurological pattern into hardware?
Memristor, a Component that can Detect, Store and Analyze Data
Memristor is a word formed by combining the words memory and resistor.
Memristors are passive electronic components that act as variable resistors for data storage: a memristor’s resistance, which encodes a specific data value, depends on the history of the voltage applied to it. Because it can store and process data simultaneously, the memristor is much more efficient than existing systems, in which the logic and memory functions are located in different parts of the circuit.
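The way a memristor’s resistance follows its voltage history can be seen in a minimal simulation. This sketch uses the well-known linear ion-drift model of a memristor; the parameter values are illustrative and are not taken from the team’s devices:

```python
# Minimal simulation of the linear ion-drift memristor model.
# Parameter values are illustrative, not measured device values.

R_ON, R_OFF = 100.0, 16e3    # low / high resistance states (ohms)
D, MU = 1e-8, 1e-14          # device thickness (m), ion mobility (m^2/(s*V))
dt, steps = 1e-4, 10_000     # time step (s), number of steps

w = 0.1 * D                  # width of the doped (conductive) region

def resistance(w):
    x = w / D                # fraction of the device that is doped
    return R_ON * x + R_OFF * (1 - x)

r_start = resistance(w)
for _ in range(steps):
    i = 1.0 / resistance(w)                        # constant 1 V applied; Ohm's law
    w = min(max(w + MU * (R_ON / D) * i * dt, 0.0), D)  # state drifts with charge
r_end = resistance(w)

# After sustained positive voltage the device sits in a lower-resistance
# state: the resistance "remembers" the voltage history.
```

Reversing the voltage polarity would drive the state back toward the high-resistance end, which is what lets the same device both store a value and be rewritten.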
The goal of the project is to build a network of memristors that function as artificial synapses between circuits simulating neurons. The research team will test two models. The first, a simpler version of the network, uses memristors as memory nodes to store data that is then processed separately by supervised learning algorithms. The second, with a more complex architecture, imitates the brain by using memristors as synaptic units capable of unsupervised self-learning; these memristors will support a sparse coding algorithm that decodes the data.
Once this complex architecture is built, thousands of images will be stored in the memristors, and the circuits will be trained to recognize the representations that correspond to them. The memristors will also form the connections between circuits, so that after the learning phase each circuit can identify a particular characteristic or shape. When a similar characteristic is detected, only the circuits associated with that model or shape will fire and transmit the required information.
All of the circuits and memristors function like areas of the visual cortex connected to other areas of the brain, such as memory, in order to recognize a type of movement or a shape with a particular meaning. The more a neural network is exposed to a representation during the learning phase, the more efficient the synaptic connections that detect this characteristic become. A crossbar of memristors, using tungsten oxide as the variable-resistance material, will connect the circuits.
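A crossbar is efficient because it computes a whole vector-matrix product in one physical step: each cross-point conductance multiplies its row voltage (Ohm’s law), and each column wire sums the resulting currents (Kirchhoff’s current law). A small numerical sketch, with illustrative values:

```python
# Why a memristor crossbar computes a vector-matrix product in place:
# row voltages are the inputs, cross-point conductances are the stored
# "weights", and column currents are the outputs. Values illustrative.

V = [0.2, 0.5, 0.1]                 # input voltages on the rows (V)
G = [[1e-3, 2e-3],                  # conductances at the cross-points (S)
     [3e-3, 1e-3],
     [2e-3, 4e-3]]

# Current collected on column j: I[j] = sum_i V[i] * G[i][j]
I = [sum(V[i] * G[i][j] for i in range(len(V))) for j in range(len(G[0]))]
print(I)   # an analog multiply-accumulate, computed where the data lives
```

Because the multiply-accumulate happens in the analog domain at the storage site, no data has to travel to a separate processor, which is the source of the speed and energy gains described above.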
The team led by Wei Lu, Professor of Electrical and Computer Engineering, was awarded a contract worth $5.7 million from the Defense Advanced Research Projects Agency (DARPA) to build the system.
Their study, entitled “Sparse Adaptive Local Learning for Sensing and Analytics” and co-written by Zhengya Zhang and Michael Flynn of the U-M Department of Electrical Engineering and Computer Science, Garrett Kenyon of the Los Alamos National Lab, and Christof Teuscher of Portland State University, was published on May 22, 2017, in the journal Nature Nanotechnology.