Abstract
The differential Hebbian law $\dot{e}_{ij} = \dot{C}_i\,\dot{C}_j$ is examined as an alternative to the traditional Hebbian law $\dot{e}_{ij} = C_i\,C_j$ for updating edge connection strengths in neural networks. The motivation is that concurrent change, rather than just concurrent activation, more accurately captures the "concomitant variation" that is central to inductively inferred functional relationships. The resulting networks are characterized by a kinetic, rather than potential, energy. Yet we prove that both system energies are given by the same entropy-like functional of the connection matrices, $\operatorname{Trace}(\dot{E}E)$. We prove that the differential Hebbian law is equivalent to stochastic-process correlation (a cross-covariance kernel). We solve the differential Hebbian law exactly, interpret the resulting sequence of edges as a stochastic process, and show that the edge process is a submartingale: the edges are expected to increase with time. The submartingale edges decompose into a martingale (unchanging) process and an increasing (novelty) process. Hence conditioned averages of edge residuals are encoded in learning even though the network only "experiences" the unconditioned edge residuals.
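As a rough illustration only (not part of the paper), the two learning laws can be contrasted in a discrete-time sketch: the classical Hebbian step accumulates outer products of concurrent activations, while the differential Hebbian step accumulates outer products of concurrent signal velocities. The Euler discretization, the toy activation trajectories, the learning rate, and all identifiers below are assumptions for exposition.

```python
import numpy as np

def hebbian_step(E, C_t, lr=0.01):
    """One Euler step of the classical law: e_ij grows with concurrent activation C_i * C_j."""
    return E + lr * np.outer(C_t, C_t)

def differential_hebbian_step(E, C_t, C_prev, lr=0.01, dt=1.0):
    """One Euler step of the differential law: e_ij grows with concurrent change dC_i * dC_j,
    with the signal velocities approximated by finite differences."""
    dC = (C_t - C_prev) / dt
    return E + lr * np.outer(dC, dC)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, n = 200, 4
    C = np.cumsum(rng.standard_normal((T, n)), axis=0)  # toy node-activation trajectories
    E_hebb = np.zeros((n, n))
    E_diff = np.zeros((n, n))
    for t in range(1, T):
        E_hebb = hebbian_step(E_hebb, C[t])
        E_diff = differential_hebbian_step(E_diff, C[t], C[t - 1])
    print("Hebbian connection matrix:\n", E_hebb)
    print("Differential Hebbian connection matrix:\n", E_diff)
```

In this toy setting the Hebbian matrix grows wherever activations are large together, whereas the differential Hebbian matrix responds only where the signals change together, which is the behavioral distinction the abstract emphasizes.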