Evolution and Multisensory Integration
By Vincent A. Medina
Vision, hearing, smell, taste, and touch: when combined, these senses form a single coherent perception of the world around us. The combination of senses for this purpose is called multisensory integration, a concept popularized by Stein & Meredith’s The Merging of the Senses (1993). Though the psychology and neuroscience literature on sensory perception and attention has blossomed through research on humans, it is important to recognize the evolutionary roots of multisensory integration. Evolution has favored species that can make the most of the constant stream of sensory stimuli in their environments. Efficient multisensory integration has been demonstrated in a great number of species, two of which are discussed in detail below.
Starting simply with invertebrates, one representative example is the marine snail. A study of learning in a marine snail (Alkon, 1983) was motivated by a desire to understand classical conditioning in invertebrates. It was already well known that more complex animals such as humans and dogs can be conditioned to associate one sensory stimulus with another. The classic example is Pavlov’s dog: consistently pairing a ringing bell with food led the dog to form a cross-modal association between the auditory stimulus (the bell) and the gustatory stimulus (the taste of food). As a result, the dog salivated whenever a bell was rung, even in the absence of food.
Similarly, the marine snail study involved two sensory modalities, vision and touch, chosen because of two characteristic behaviors of the marine snail in its natural ocean environment. First, the marine snail tends to move toward light because its prey, small organisms called hydroids, live in well-lit water. Second, the marine snail clings to surfaces when exposed to turbulent water during storms. After experiments that repeatedly paired light with water turbulence, the marine snails formed cross-modal associations between the visual stimulus (light) and the tactile stimulus (water turbulence). This was evident from clinging behavior when light was later presented without water turbulence, as well as from decreased light-approaching behavior. These results suggest that the neural bases for associative learning based on multisensory integration are present even in simple organisms like sea snails.
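The learning pattern in both the Pavlov and snail examples can be illustrated with a simple associative-learning simulation. The sketch below is not the model used by Alkon (1983); it is a minimal Rescorla-Wagner-style account of classical conditioning, and the learning rate, trial count, and stimulus labels are assumptions chosen purely for illustration.

```python
# Minimal Rescorla-Wagner-style sketch of cross-modal conditioning.
# Illustrative only: the learning rate, trial count, and stimulus names
# are assumptions, not parameters from Alkon (1983).

def condition(n_trials: int, learning_rate: float = 0.3, max_strength: float = 1.0):
    """Return associative strength over repeated pairings of two stimuli
    (e.g., light + turbulence for the snail, bell + food for the dog)."""
    strength = 0.0
    history = []
    for _ in range(n_trials):
        # On each paired trial, strength moves a fraction of the way toward
        # its maximum, producing the classic negatively accelerated curve.
        strength += learning_rate * (max_strength - strength)
        history.append(strength)
    return history

if __name__ == "__main__":
    for trial, strength in enumerate(condition(n_trials=10), start=1):
        print(f"trial {trial:2d}: associative strength = {strength:.2f}")
    # After enough pairings, the visual cue alone (light) is strongly
    # associated with the tactile cue (turbulence), so the snail clings
    # even when no turbulence follows.
```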
Regarding non-human vertebrates, one representative example is the cat. One study (Stein et al., 1989) used a paradigm in which each cat was surrounded by five locations, each containing a speaker and a light; this type of multisensory paradigm has stood the test of time and has been extended to current cognitive research in humans (Pomper & Chait, 2017; Best et al., 2020). There were two conditions: spatial congruence (the two stimuli were presented in the same location) and spatial incongruence (the two stimuli were presented in different locations). Cats in the spatial congruence group were trained with food rewards to walk toward noise bursts and lights, both when the two occurred simultaneously and when each was presented alone. Cats in the spatial incongruence group were trained with food rewards to walk toward lights while ignoring the simultaneous noise bursts. During the experiments, cats performed significantly better during spatial congruence and significantly worse during spatial incongruence (relative to trials with a single stimulus), suggesting that space plays an important role in whether two stimuli become associated with each other.
Better performance during spatial congruence may sound intuitive, but this finding laid the groundwork for later research on humans and more complex stimuli. One human study found that spatial congruence results in worse performance than spatial incongruence when the two sensory stimuli have conflicting meanings, because the mismatching distractor is harder to ignore (Spence et al., 2000). Another finding from the cat study, which tested different intensities for both stimuli, was that the largest performance gains under spatial congruence occurred when the sensory stimuli were at low intensity. This suggests that multisensory integration is most beneficial when the stimuli do not strongly command attention on their own.
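This "weak stimuli benefit most" pattern is often called inverse effectiveness in the multisensory literature, and one common way to quantify it is to express the combined response as a percentage gain over the best single-modality response, a convention associated with work descending from Stein & Meredith (1993). The sketch below uses that formula, gain = (combined response - best single-modality response) / best single-modality response, expressed as a percentage; all of the response values are invented purely for illustration.

```python
# Toy illustration of inverse effectiveness: the proportional gain from
# combining two stimuli is largest when each stimulus is weak.
# All response values below are invented for illustration only.

def enhancement(multisensory: float, best_unisensory: float) -> float:
    """Percent gain of the combined response over the best single-modality
    response: (CM - SMmax) / SMmax * 100."""
    return (multisensory - best_unisensory) / best_unisensory * 100.0

weak = enhancement(multisensory=6.0, best_unisensory=2.0)      # faint light + faint sound
strong = enhancement(multisensory=24.0, best_unisensory=20.0)  # bright light + loud sound

print(f"weak stimuli:   {weak:.0f}% gain")    # 200% gain
print(f"strong stimuli: {strong:.0f}% gain")  # 20% gain
```

With these made-up numbers, the absolute improvement is the same in both cases, but the proportional benefit of combining the cues is ten times larger when each cue is faint.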
Both findings from the cat study make sense from a survival standpoint. Regarding better performance during spatial congruence, two sensory stimuli arising from the same location are usually linked by causality in the real world. For example, crunching leaves and a sudden smell, both coming from behind an animal, are more likely to share a common cause (perhaps a predator) than to have independent causes. Additionally, an animal's likelihood of survival increases if it can effectively combine two faint sensory stimuli, neither of which might be informative enough on its own.
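The common-cause intuition can be made concrete with a toy Bayesian calculation. None of the numbers below come from the studies discussed above; the prior and likelihoods are invented solely to show how co-located cues shift the odds toward a single source.

```python
# Toy Bayesian illustration of the common-cause intuition: two cues
# arriving from the same location are strong evidence for a single source.
# All probabilities are invented for illustration.

prior_common = 0.30            # prior probability that two cues share one cause
prior_separate = 1.0 - prior_common

# Likelihood of observing both cues at the SAME location...
p_same_given_common = 0.90     # ...if they come from one source (e.g., a predator)
p_same_given_separate = 0.10   # ...if they come from two unrelated sources

# Bayes' rule: posterior probability of a common cause given co-located cues.
evidence = (p_same_given_common * prior_common
            + p_same_given_separate * prior_separate)
posterior_common = p_same_given_common * prior_common / evidence

print(f"P(common cause | cues co-located) = {posterior_common:.2f}")  # ~0.79
```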
To place these examples in a larger evolutionary context, it is worth recognizing that species react differently to the same sensory stimulus depending on their ecological niches. For example, many species can react to the visual cue of an owl gliding in to attack even though owls have near-silent flight. A bat, however, might have trouble reacting to that important visual cue because it prioritizes hearing over seeing, a consequence of the environmental demands of dark caves. Some species also have unique senses that could be integrated, such as snakes that can detect infrared radiation (Gracheva et al., 2010). Despite the vast variability found throughout the animal kingdom, distinct sensory modalities are used together an overwhelming majority of the time: there is no known animal with a nervous system whose sensory representations are kept entirely separate from one another (Stein & Meredith, 1993). Understanding evolutionary forces can help add necessary perspective to the field of multisensory integration as it moves forward.
~~~
Written by Vincent Medina
Illustrated by Sumana Shrestha
Edited by Chris Gabriel and Lauren Wagner
~~~
References
- Alkon, D. L. (1983). Learning in a marine snail. Scientific American, 249(1), 70-85.
- Best, V., Jennings, T. R., & Kidd Jr, G. (2020). An effect of eye position in cocktail party listening. In Proceedings of Meetings on Acoustics 179ASA (Vol. 42, No. 1, p. 050001). Acoustical Society of America.
- Gracheva, E. O., Ingolia, N. T., Kelly, Y. M., Cordero-Morales, J. F., Hollopeter, G., Chesler, A. T., … & Julius, D. (2010). Molecular basis of infrared detection by snakes. Nature, 464(7291), 1006-1011.
- Pomper, U., & Chait, M. (2017). The impact of visual gaze direction on auditory object tracking. Scientific Reports, 7(1), 1-16.
- Spence, C., Ranson, J., & Driver, J. (2000). Cross-modal selective attention: On the difficulty of ignoring sounds at the locus of visual attention. Perception & Psychophysics, 62(2), 410-424.
- Stein, B. E., & Meredith, M. A. (1993). The merging of the senses. MIT Press.
- Stein, B. E., Meredith, M. A., Huneycutt, W. S., & McDade, L. (1989). Behavioral indices of multisensory integration: orientation to visual cues is affected by auditory stimuli. Journal of Cognitive Neuroscience, 1(1), 12-24.