Title

Audiovisual asymmetries in speech and nonspeech

Date of Completion

January 2005

Keywords

Psychology, Experimental

Degree

Ph.D.

Abstract

The individual contributions of the perceptual systems to multimodal perception are typically examined by placing information from different perceptual systems in conflict and measuring how much each contributes to the resulting percept. When one perceptual system dominates perception more than another, this dominance is called intersensory bias. The present research examined intersensory bias between audition and vision using a tapping task in which participants were asked to tap to the frequency of either an auditory or a visual stimulus. The stimuli were presented either alone or simultaneously at the same or different frequencies. When the stimuli were a pulsating tone and a flashing ellipse, audition dominated temporal perception. When the stimuli were speech (an acoustic syllable and a speaking face) or bouncing basketballs, intersensory bias substantially weakened or disappeared. These results do not support the modality appropriateness hypothesis, which contends that audition should dominate temporal perception. They also do not support the assumption of unity theory, which claims that intersensory bias should be stronger for more unified events, such as speech and bouncing basketballs. Instead, the results indicate that intersensory bias is a consequence of an imbalance in the information available to the perceptual systems. When information from one perceptual system is difficult to detect, the perceiver seeks other sources of information to complete the task.