Decoding reveals the contents of visual working memory in early visual areas

Abstract
Although we can hold several different items in visual working memory, how we remember specific details and visual features of individual objects remains a mystery. The neurons in the higher-order areas responsible for working memory seem to exhibit no selectivity for visual detail, and the early visual areas of the cerebral cortex are well suited to processing incoming visual signals from the eye but, it was thought, not to performing higher cognitive functions such as memory. Using a new technique for decoding data from functional magnetic resonance imaging (fMRI), Stephanie Harrison and Frank Tong have found that early visual areas can retain specific information about features held in working memory. Volunteers were shown two striped patterns at different orientations and asked to memorize one of the orientations whilst being scanned by fMRI. From analysis of the scans it was possible to predict which of the two orientations a subject was retaining in over 80% of tests. This study shows that early visual areas can retain specific information about features held in working memory even when no physical stimulus is present: using fMRI decoding methods, visual features could be predicted from early visual area activity with a high degree of accuracy.

Visual working memory provides an essential link between perception and higher cognitive functions, allowing for the active maintenance of information about stimuli no longer in view1,2. Research suggests that sustained activity in higher-order prefrontal, parietal, inferotemporal and lateral occipital areas supports visual maintenance3,4,5,6,7,8,9,10,11, and may account for the limited capacity of working memory to hold up to 3–4 items9,10,11.
Because higher-order areas lack the visual selectivity of early sensory areas, it has remained unclear how observers can remember specific visual features, such as the precise orientation of a grating, with minimal decay in performance over delays of many seconds12. One proposal is that sensory areas serve to maintain fine-tuned feature information13, but early visual areas show little to no sustained activity over prolonged delays14,15,16. Here we show that orientations held in working memory can be decoded from activity patterns in the human visual cortex, even when overall levels of activity are low. Using functional magnetic resonance imaging and pattern classification methods, we found that activity patterns in visual areas V1–V4 could predict which of two oriented gratings was held in memory with mean accuracy levels upwards of 80%, even in participants whose activity fell to baseline levels after a prolonged delay. These orientation-selective activity patterns were sustained throughout the delay period, evident in individual visual areas, and similar to the responses evoked by unattended, task-irrelevant gratings. Our results demonstrate that early visual areas can retain specific information about visual features held in working memory, over periods of many seconds when no physical stimulus is present.
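The logic of pattern-classification decoding described above can be illustrated with a minimal toy sketch. This is not the authors' actual analysis pipeline (which applied linear classifiers to real fMRI voxel data from retinotopically mapped visual areas); here, simulated voxel activity patterns for two remembered orientations are decoded with leave-one-out cross-validation, classifying each held-out trial by its correlation with the two class-mean patterns. All numbers and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 100 voxels, 40 trials per remembered orientation.
# Each orientation evokes a weak but consistent voxel activity pattern
# buried in trial-to-trial noise -- a toy stand-in for delay-period
# fMRI activity patterns in areas V1-V4.
n_voxels, n_trials = 100, 40
pattern_a = rng.normal(0, 1, n_voxels)  # template for one grating orientation
pattern_b = rng.normal(0, 1, n_voxels)  # template for the other orientation
X = np.vstack([pattern_a + rng.normal(0, 3, (n_trials, n_voxels)),
               pattern_b + rng.normal(0, 3, (n_trials, n_voxels))])
y = np.repeat([0, 1], n_trials)

# Leave-one-out cross-validated nearest-centroid decoding: each held-out
# trial is assigned to whichever class-mean pattern (computed from the
# remaining trials) it correlates with more strongly.
correct = 0
for i in range(len(y)):
    train = np.arange(len(y)) != i
    mean_a = X[train & (y == 0)].mean(axis=0)
    mean_b = X[train & (y == 1)].mean(axis=0)
    r_a = np.corrcoef(X[i], mean_a)[0, 1]
    r_b = np.corrcoef(X[i], mean_b)[0, 1]
    pred = 0 if r_a > r_b else 1
    correct += pred == y[i]

accuracy = correct / len(y)
print(f"decoding accuracy: {accuracy:.0%}")
```

The key point the sketch captures is that orientation information can be distributed across many weakly tuned voxels: no single simulated voxel discriminates the two orientations reliably, yet the multivariate pattern supports accurate classification, analogous to how decoding recovers remembered orientation even when overall activity levels are low.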