A new discovery about the brain indicates that the processing of sound is enhanced by the visual cortex, and it may offer keys to unlocking mental health mysteries such as autism and schizophrenia. The discovery comes from a study conducted at the University of Glasgow's Institute of Neuroscience and Psychology. The team of researchers was led by Professor Lars Muckli, and the findings were reported in the journal Current Biology.
Until now, medical authorities had believed that auditory information was processed separately from visual information; anatomical evidence of interconnection between sight and sound had not previously been documented. This new research found that the visual cortex, the part of the brain that receives and processes what is seen, can process auditory information as well, creating visual imagery.
Professor Muckli suggests that the processing of what is heard gives the visual system a heads-up, enabling it to predict what lies ahead. This provides a distinct survival advantage. Sounds actually generate visual imagery: automatic projections as well as mental images. Muckli used the example of standing on a street corner and hearing a motorbike approach. A motorbike is what one would expect to see; it would be most surprising if a horse came galloping around the corner instead. What is heard produces a visual image. Scientists traditionally thought that the early visual cortex processed only simple information relayed from the retina, such as spatial frequency, orientation, and contrast. The team also noted that the "non-retinal" information received by the early visual cortex has not been thoroughly investigated, even though the feedback connections from other regions of the brain vastly outnumber the feedforward connections.
Ten volunteers were involved in five experiments. Early visual cortex activity was observed using functional magnetic resonance imaging (fMRI). One experiment revealed increased activity in the early visual cortex even in the absence of sound and sight input. In another experiment, the volunteers were blindfolded while listening to traffic noise, crowds of people talking, and birdsong. Special algorithms were used to interpret the patterns of brain activity and indicate how different categories of sound were being processed in the early visual cortex. The researchers learned that auditory perception and imagery send information to the visual cortex, which can receive common abstract information carried by non-retinal input. This demonstrates how different regions of the brain are interconnected. Professor Muckli expects that further study will provide insight into how this sound-driven enhancement in the visual cortex differs in individuals with autism and schizophrenia, and may unlock keys to understanding different forms of mental health issues.
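The article does not name the algorithms the researchers used. A common approach to this kind of analysis is multivoxel pattern decoding: a classifier is trained to tell sound categories apart from the pattern of fMRI activity across voxels, and above-chance accuracy on held-out trials indicates that the region carries category information. The sketch below illustrates the idea on simulated data; the voxel counts, noise levels, and category labels are hypothetical, not from the study.

```python
# Illustrative multivoxel pattern decoding on simulated "early visual cortex"
# data: can we tell traffic, voices, and birdsong apart from voxel patterns?
# All numbers here are made up for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, trials_per_cat = 50, 40
categories = ["traffic", "voices", "birdsong"]

# Give each sound category its own mean voxel pattern, plus per-trial noise.
means = rng.normal(0, 1, size=(len(categories), n_voxels))
X = np.vstack([m + rng.normal(0, 0.8, size=(trials_per_cat, n_voxels))
               for m in means])
y = np.repeat(np.arange(len(categories)), trials_per_cat)

# Shuffle trials and hold out a quarter of them for testing.
idx = rng.permutation(len(y))
split = len(y) * 3 // 4
train, test = idx[:split], idx[split:]

# Nearest-centroid classifier: assign each held-out trial to the category
# whose mean training pattern is closest (a minimal stand-in for the more
# sophisticated decoders used in fMRI research).
centroids = np.array([X[train][y[train] == c].mean(axis=0)
                      for c in range(len(categories))])
dists = np.linalg.norm(X[test][:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(f"decoding accuracy: {accuracy:.2f}")  # chance level would be ~0.33
```

Accuracy well above the one-in-three chance level on held-out trials is what lets researchers conclude that the region's activity patterns distinguish the sound categories.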
The above studies were financed by the Biotechnology and Biological Sciences Research Council and the European Research Council. Further study will delve into predictive coding in the brain to see how precise it can be, using a broader range of sounds. Another study, published in the journal Neuron, validated how the visual region affects the auditory processing system, confirming the interconnectedness of the brain. Researchers found that blocking sight for a short time, even a week, can enable the brain to process sound more efficiently.
The University of Iowa conducted an experiment testing short-term memory in 100 of its undergraduate students. Students were asked to look at various shades of red squares, to grip an aluminum bar that emitted low-intensity vibrations, and to put on headphones and listen to pure tones. The squares, vibrations, and tones were separated by time delays of one to 32 seconds. According to researchers in UI's Department of Psychology, the students retained the memory of the squares and vibrations longer, while the memory of sounds declined more rapidly. In another study, students watched silent videos of a basketball game, felt familiar objects, and listened to dogs barking. The results were the same: retention was shorter for what they heard than for what they saw and felt. Amy Poremba, an associate professor at UI, commented that these studies indicate the brain processes visual and tactile information differently than it processes auditory information. In other words, sound is stored differently. These studies are the first to indicate that memory for touch and memory for sight are roughly equal.
James Bigelow, a graduate student at UI, commented that there is merit in the Chinese proverb, "I hear, and I forget; I see, and I remember." Teachers have long known that students do better with visual aids than without. These new studies, and the sensory data they collect, will be instrumental in looking deeper into the brain's mental health mysteries, as researchers continue to cross-reference how sound processing is enhanced by the visual cortex and how the visual regions of the brain react to tone.
By Jill Boyer-Adriance