Convergence of stochastic learning in perceptrons with binary synapses
- 16 June 2005
- research article
- Published by American Physical Society (APS) in Physical Review E
- Vol. 71 (6), 061907
- https://doi.org/10.1103/physreve.71.061907
Abstract
The efficacy of a biological synapse is naturally bounded and, at some resolution, discrete, ultimately at the level of single vesicles. The finite number of synaptic states dramatically reduces the storage capacity of a network when online learning is considered (i.e., when the synapses are immediately modified by each pattern): the trace of old memories decays exponentially with the number of new memories (palimpsest property). Moreover, finding the discrete synaptic strengths which enable the classification of linearly separable patterns is a combinatorially hard problem known to be NP-complete. In this paper we show that learning with discrete (binary) synapses is nevertheless possible with high probability if a randomly selected fraction of the synapses is modified following each stimulus presentation (slow stochastic learning). As an additional constraint, the synapses are changed only if the output neuron does not give the desired response, as in classical perceptron learning. We prove that for linearly separable classes of patterns the stochastic learning algorithm converges with arbitrarily high probability in a finite number of presentations, provided that the number of neurons encoding the patterns is large enough. The stochastic learning algorithm is successfully applied to a standard classification problem of nonlinearly separable patterns by using multiple, stochastically independent output units, with an achieved performance comparable to the best reported for the task.
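The learning rule described in the abstract can be sketched in code: binary synapses, error-driven updates, and a randomly selected fraction of synapses modified per presentation. The sketch below is an illustrative reconstruction under stated assumptions, not the paper's exact algorithm; the {-1, +1} weight coding, the flip probability `q`, and all parameter names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def slow_stochastic_perceptron(patterns, labels, q=0.1, max_epochs=5000):
    """Train a perceptron whose synapses are binary, w_i in {-1, +1}.

    Slow stochastic learning: a pattern is presented, and only if the
    output is wrong is each misaligned synapse flipped toward the
    desired direction, independently with small probability q.
    Correctly classified patterns leave the weights untouched.
    Returns the weights and whether training error reached zero.
    """
    n = patterns.shape[1]
    w = rng.choice([-1, 1], size=n)          # random binary initialization
    for _ in range(max_epochs):
        errors = 0
        for x, y in zip(patterns, labels):
            out = 1 if w @ x > 0 else -1     # binary output neuron
            if out != y:
                errors += 1
                # synapses not aligned with the desired update y * x
                misaligned = w != y * x
                # flip a random fraction q of them (stochastic selection)
                flip = misaligned & (rng.random(n) < q)
                w[flip] = -w[flip]
        if errors == 0:
            return w, True                   # all patterns classified
    return w, False
```

For a linearly separable task (e.g., labels generated by a binary teacher vector in a high-dimensional input space), this converges with high probability when `q` is small and the number of input neurons is large, mirroring the convergence statement in the abstract.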