A neural network solution to the transverse patterning problem depends on repetition of the input code

Abstract
Using computer simulations, this paper investigates how input codes affect a minimal computational model of the hippocampal region CA3. Because encoding context seems to be a function of the hippocampus, we have studied problems that require learning context for their solution. Here we study a hippocampally dependent, configural learning problem called transverse patterning. Previously, we showed that the network does not produce long local context codes when the sequential input patterns are orthogonal, and it fails to solve many context-dependent problems in such situations. Here we show that this need not be the case if we assume that the input changes more slowly than a processing interval. Stuttering, i.e., repeating inputs, allows the network to create long local context firings even for orthogonal inputs. With these long local context firings, the network is able to solve the transverse patterning problem. Without stuttering, transverse patterning is not learned. Because stuttering is so useful, we investigate the relationship between the stuttering repetition length and relative context length in a simple, idealized sequence prediction problem. The relative context length, defined as the average length of the local context codes divided by the stuttering length, interacts with activity levels and has an optimal stuttering repetition length. Moreover, the increase in average context length can reach this maximum without loss of relative capacity. Finally, we note that stuttering is an example of maintained or introduced redundancy that can improve neural computations.
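The two operations named in the abstract can be sketched concretely. The following is a minimal illustration only, not the paper's implementation: "stuttering" is taken to mean repeating each input pattern r times before presenting the next, and the relative context length follows the abstract's definition (average local context code length divided by the stuttering length). The function names, the toy sequence, and the example context lengths are assumptions introduced here for illustration.

```python
# Illustrative sketch of the two quantities named in the abstract.
# All names and example values are hypothetical, not from the paper.

def stutter(sequence, r):
    """Repeat each input pattern in `sequence` r times in succession."""
    return [pattern for pattern in sequence for _ in range(r)]

def relative_context_length(context_lengths, r):
    """Average local context code length divided by the stuttering length r,
    per the abstract's definition."""
    return sum(context_lengths) / len(context_lengths) / r

# Toy sequence of orthogonal input labels (stand-ins for input codes).
inputs = ["A", "B", "C"]
print(stutter(inputs, 3))
# ['A', 'A', 'A', 'B', 'B', 'B', 'C', 'C', 'C']

# Hypothetical context-code lengths measured after learning.
print(relative_context_length([6, 4], r=2))
# 2.5
```

With r = 1 (no stuttering), `stutter` returns the sequence unchanged, matching the abstract's baseline condition in which transverse patterning is not learned.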