Abstract
Two experiments explored whether older adults compensate for slower language processing and hearing loss by relying more heavily on the visual modality. Experiment 1 examined the influence of visual articulatory movements of the face (visible speech) on auditory-visual syllable classification in young and older adults. Older adults showed a significantly greater influence of visible speech. Experiment 2 examined immediate recall of spoken sentences in three conditions: speech alone, speech with visible speech, or speech with both visible speech and iconic gestures. Sentences also varied in meaningfulness and speech rate. For meaningful sentences, older adults recalled sentences containing visible speech better than speech-alone sentences, but their recall showed no overall benefit from the presence of gestures. Young adults' recall of meaningful sentences was not higher with visible speech than with speech alone, whereas recall improved significantly with the addition of iconic gestures. For anomalous sentences, both young and older adults showed a recall advantage when visible speech was present. Together, the experiments provide converging evidence of older adults' greater reliance on visible speech when processing visual-spoken language.