You must see the point: Automatic processing of cues to the direction of social attention.

Abstract
Four experiments explored the processing of pointing gestures comprising hand and combined head and gaze cues to direction. The cross-modal interference effect exerted by pointing hand gestures on the processing of spoken directional words, first noted by S. R. H. Langton, C. O'Malley, and V. Bruce (1996), was found to be moderated by the orientation of the gesturer's head-gaze (Experiment 1). Hand and head cues also produced bidirectional interference effects in a within-modalities version of the task (Experiment 2). These findings suggest that both head-gaze and hand cues to direction are processed automatically and in parallel up to the stage at which a directional decision is computed. In support of this model, head-gaze cues had no influence on nondirectional decisions about social emblematic gestures in Experiment 3 but exerted significant interference effects on directional responses to arrows in Experiment 4. It is suggested that the automatic analysis of head, gaze, and pointing gestures occurs because these directional signals are processed as cues to the direction of another individual's social attention.