Primary cortical representation of sounds by the coordination of action-potential timing

Abstract
Cortical population coding could in principle rely on either the mean rate of neuronal action potentials, or the relative timing of action potentials, or both. When a single sensory stimulus drives many neurons to fire at elevated rates, the spikes of these neurons become tightly synchronized, which could be involved in 'binding' together individual firing-rate feature representations into a unified object percept. Here we demonstrate that the relative timing of cortical action potentials can signal stimulus features themselves, a function even more basic than feature grouping. Populations of neurons in the primary auditory cortex can coordinate the relative timing of their action potentials such that spikes occur closer together in time during continuous stimuli. In this way cortical neurons can signal stimuli even when their firing rates do not change. Population coding based on relative spike timing can systematically signal stimulus features, it is topographically mapped, and it follows the stimulus time course even where mean firing rate does not.
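The kind of coordination described here, spikes from different neurons occurring closer together in time while overall firing rates stay constant, is commonly quantified with a spike-train cross-correlogram. The sketch below is purely illustrative and is not the analysis pipeline of this paper: it builds two synthetic spike trains whose mean rates are identical throughout a recording, but whose spikes are drawn near shared event times during the second half, so the central peak of the cross-correlogram grows even though rate carries no signal. All function names, parameter values, and the synthetic-data construction are invented for this example.

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, max_lag=0.05, bin_width=0.001):
    """Histogram of spike-time differences (b - a) within +/- max_lag seconds."""
    diffs = []
    for t in spikes_a:
        nearby = spikes_b[(spikes_b >= t - max_lag) & (spikes_b <= t + max_lag)]
        diffs.extend(nearby - t)
    edges = np.arange(-max_lag, max_lag + bin_width, bin_width)
    counts, _ = np.histogram(diffs, bins=edges)
    return counts, edges

# Two synthetic spike trains with identical mean rates. In the first half of the
# recording spikes are independent; in the second ("stimulus") half both trains
# fire near shared event times, so coordination changes while rate does not.
rng = np.random.default_rng(0)
duration, rate = 10.0, 20.0                    # seconds, spikes per second
events = np.sort(rng.uniform(5.0, duration, int(rate * 5)))   # shared event times
jitter = 0.002                                 # 2 ms jitter around shared events
spikes_a = np.sort(np.concatenate([
    rng.uniform(0.0, 5.0, int(rate * 5)),              # independent spikes
    events + rng.normal(0.0, jitter, events.size)]))    # coordinated spikes
spikes_b = np.sort(np.concatenate([
    rng.uniform(0.0, 5.0, int(rate * 5)),
    events + rng.normal(0.0, jitter, events.size)]))

counts, edges = cross_correlogram(spikes_a, spikes_b)
central_peak = counts[np.abs(edges[:-1] + 0.0005) < 0.005].sum()
print("mean rate, train A (spikes/s):", spikes_a.size / duration)
print("mean rate, train B (spikes/s):", spikes_b.size / duration)
print("coincidences within +/- 5 ms :", central_peak)
```

Running the sketch shows both trains holding the same mean rate across the recording while the count of near-coincident spikes is dominated by the coordinated half, which is the sense in which relative spike timing can carry information that firing rate does not.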