EEPROM synapses exhibiting pseudo-Hebbian plasticity

Abstract
Dynamic analogue learning in VLSI neural networks can be achieved with low synaptic complexity using floating-gate tunnel-oxide EEPROM devices. When neural voltages are encoded logarithmically, synaptic plasticity rules that approximate multiplicative Hebbian learning can be implemented without the high transistor count of conventional analogue multiplier circuits.
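The advantage of logarithmic encoding can be illustrated numerically. A multiplicative Hebbian update Δw = η·x_pre·x_post requires a product of two signals; in the log domain this product becomes a sum, which is cheap to realise in analogue hardware. The sketch below (hypothetical, not taken from the paper; the function name and parameters are illustrative) shows the log-domain equivalence:

```python
import math

def hebbian_update_log(w, x_pre, x_post, eta=0.01):
    """Multiplicative Hebbian update computed in the log domain.

    The product x_pre * x_post is replaced by exp(log x_pre + log x_post),
    mirroring how logarithmic neural voltages let summed voltages stand in
    for a multiplication, avoiding an explicit analogue multiplier.
    Assumes both activities are positive, as required by the log encoding.
    """
    log_sum = math.log(x_pre) + math.log(x_post)  # addition replaces multiplication
    dw = eta * math.exp(log_sum)                  # decode back to the linear domain
    return w + dw
```

The update is numerically identical to the direct product form (η·x_pre·x_post); only the intermediate representation changes, which is what permits the low synaptic circuit complexity claimed in the abstract.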
