
September 22, 2015

Synapses need only a few bits

Deep learning is possibly the most exciting branch of contemporary machine learning. Complex image analysis, speech recognition and self-driving cars are just a few popular examples of a multitude of new applications where machine learning, and deep learning in particular, show their amazing capabilities.

Deep neural networks are made up of many layers of artificial neurons with hundreds of millions of connections between them. The structure of such deep networks is reminiscent of the brain, where billions of neurons are connected through thousands of synaptic contacts each. These types of networks can be trained to perform hard classification tasks over huge datasets, with the remarkable property of extracting information from examples and generalizing them to unseen items.
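As a purely illustrative picture of such a layered structure, the short Python sketch below builds a tiny feed-forward network with arbitrarily chosen layer sizes and propagates an input through it; it is not code from the study.

```python
# Illustrative sketch (not from the paper): a tiny feed-forward network,
# showing layers of neurons connected by synaptic weight matrices.
# Layer sizes and the activation function are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

# input -> hidden -> hidden -> output
layer_sizes = [784, 256, 128, 10]
weights = [rng.normal(0, 0.1, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x, weights):
    """Propagate an input vector through the layers."""
    for w in weights[:-1]:
        x = np.tanh(x @ w)          # nonlinear response of each neuron
    return x @ weights[-1]          # linear readout layer

x = rng.normal(size=784)            # a fake "image" input
print(forward(x, weights).shape)    # -> (10,), one score per class
```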

The way neural networks learn is by tuning their multitude of connections, or synaptic weights, following the signal provided by a learning algorithm that reacts to the input data. This process is in some respects similar to what happens throughout the nervous system, where plastic modifications of synapses are considered to be responsible for the formation and stabilization of memories. The problem of devising efficient and scalable learning algorithms for realistic synapses is crucial for both technological and biological applications.
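To illustrate the general idea of learning as weight tuning, here is a minimal sketch of the classic perceptron rule on a toy dataset: a single artificial neuron nudges its synaptic weights whenever it misclassifies an example. This is a generic textbook illustration, not the algorithm analyzed in the study.

```python
# Minimal sketch of "learning as tuning synaptic weights" (perceptron rule).
import numpy as np

rng = np.random.default_rng(1)
n_inputs, n_examples = 50, 200

# Random binary patterns labeled by a hidden "teacher" vector (toy data)
X = rng.choice([-1.0, 1.0], size=(n_examples, n_inputs))
teacher = rng.normal(size=n_inputs)
y = np.sign(X @ teacher)

w = np.zeros(n_inputs)              # synaptic weights, initially silent
for epoch in range(100):
    errors = 0
    for x, target in zip(X, y):
        if np.sign(x @ w) != target:        # wrong output: plastic change
            w += target * x                 # strengthen/weaken synapses
            errors += 1
    if errors == 0:                         # all examples learned
        break
print("epochs used:", epoch + 1)
```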

In a recent study, published in Physical Review Letters, researchers from Politecnico di Torino and the Human Genetics Foundation (Italy) showed that extremely simple synaptic contacts, even one-bit switch-like synapses, can be used efficiently for learning in large-scale neural networks, and can lead to unexpectedly high computational performance. The study was conducted by a research group led by Riccardo Zecchina and composed of Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello and Luca Saglietti.


Until now, theoretical analyses suggested that learning with such simple, discretized synaptic connections was exceedingly difficult and therefore impractical. Applying principles from the statistical physics of disordered systems, the researchers found that the problem can in fact become extremely simple. The authors provide an in-depth theoretical explanation of why this is the case and propose concrete learning strategies.
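One simple heuristic for learning with one-bit synapses keeps a small hidden integer counter per synapse and uses only its sign as the weight; the sketch below illustrates this "clipped perceptron" idea on random patterns. It is meant only to make the setting concrete; the learning strategies analyzed in the paper are different and more sophisticated.

```python
# Hedged sketch of learning with one-bit synapses: each synapse has a
# hidden counter bounded to a few bits, and the weight actually used for
# classification is just the counter's sign.
import numpy as np

rng = np.random.default_rng(2)
n_inputs, n_examples = 101, 60

X = rng.choice([-1, 1], size=(n_examples, n_inputs))
teacher = rng.choice([-1, 1], size=n_inputs)     # binary teacher weights
y = np.sign(X @ teacher)

h = np.zeros(n_inputs, dtype=int)                # hidden counters
K = 8                                            # counter bound (a few bits)

for epoch in range(500):
    errors = 0
    for x, target in zip(X, y):
        w = np.where(h >= 0, 1, -1)              # one-bit synapses
        if np.sign(x @ w) != target:
            h = np.clip(h + target * x, -K, K)   # nudge hidden counters
            errors += 1
    if errors == 0:
        break
print("errors in last epoch:", errors)
```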

These new results are consistent with biological considerations and with recent experimental evidence suggesting that synaptic weights are not arbitrarily graded but store only a few bits each. Still, the most immediate follow-ups will be of a technological nature: hardware implementations of neural networks relying on extremely simple synapses can overcome many of the computational bottlenecks (memory and speed) that the next generation of algorithms will have to face.
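To make the memory argument concrete, the following back-of-the-envelope sketch (sizes chosen arbitrarily, not taken from the paper) bit-packs a layer of one-bit weights and compares its footprint with standard 32-bit floating-point storage.

```python
# One-bit synapses can be bit-packed, so a layer's weights occupy roughly
# 1/32 of the memory needed for 32-bit floats. Illustrative numbers only.
import numpy as np

n_in, n_out = 1024, 1024
float_weights = np.random.choice([-1.0, 1.0], size=(n_in, n_out)).astype(np.float32)
bits = np.packbits(float_weights > 0)            # 1 bit per synapse

print("float32 storage:", float_weights.nbytes, "bytes")   # ~4 MiB
print("bit-packed     :", bits.nbytes, "bytes")            # ~128 KiB
```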

More information: "Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses." Phys. Rev. Lett. 115, 128101 – Published 18 September 2015.

Journal information: Physical Review Letters

Provided by Politecnico di Torino
