Wednesday, January 23, 2008

Music, the Brain, and Learning

Relax and let the music flow through you.

Wait! Is that really what happens? What does listening to your favorite music reveal about your brain? Quite a bit, actually! In This Is Your Brain on Music: The Science of a Human Obsession, author and neuroscientist Daniel J. Levitin traces how the brain processes the music we hear.


Two cognitive processes, feature extraction and feature integration, create the experience of hearing music. Interestingly, they mirror two processes that enable learning: comprehension and elaboration.


Feature extraction happens in the brain’s posterior regions. As sound waves cause the eardrum to vibrate, the brain receives sensory input. “The brain extracts basic, low-level features from the music, using specialized neural networks that decompose the signal into information about pitch, timbre, spatial location, loudness, reverberant environment, tone durations, and the onset times for different notes (and for different components of tones)” (p. 103). As feature extraction begins, feature integration also activates.
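If it helps to see the idea in code, here is a toy Python sketch of what “extracting low-level features” from a sound might look like. To be clear, this is only an illustration of the concept, not Levitin’s model and nothing like what the brain’s neural networks actually do; the function name, the choice of features, and the 440 Hz test tone are all mine.

import numpy as np

def extract_features(signal, sample_rate=44100):
    # Toy "feature extraction": pull a few low-level attributes
    # out of a raw audio signal (a 1-D array of samples).
    loudness = float(np.sqrt(np.mean(signal ** 2)))            # overall loudness (RMS amplitude)
    spectrum = np.abs(np.fft.rfft(signal))                     # magnitude spectrum of the signal
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    pitch_hz = float(freqs[np.argmax(spectrum)])               # crude pitch: strongest spectral peak
    duration_s = len(signal) / sample_rate                     # tone duration in seconds
    return {"pitch_hz": pitch_hz, "loudness": loudness, "duration_s": duration_s}

# A half-second 440 Hz sine wave stands in for a real recorded note.
sample_rate = 44100
t = np.linspace(0, 0.5, int(sample_rate * 0.5), endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
print(extract_features(tone, sample_rate))   # pitch_hz close to 440.0, loudness about 0.35, duration_s = 0.5

Notice that each feature is computed separately, loosely echoing Levitin’s description of specialized networks each decomposing one aspect of the signal.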


In the brain’s frontal areas, feature integration combines the extracted features “into a perceptual whole” (p. 105). According to Levitin, the brain constructs a “representation of reality” from the component features identified during feature extraction (p. 103). That representation is what you experience: the music you actually “hear.”
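Continuing the same toy analogy, “feature integration” would take those separate per-note features and assemble them into something whole, for example a melodic contour across several notes. Again, the names and the choice of what to integrate are mine, purely for illustration.

def integrate_features(note_features):
    # Toy "feature integration": combine per-note features into a
    # higher-level description, a crude stand-in for a "perceptual whole".
    pitches = [f["pitch_hz"] for f in note_features]
    contour = ["up" if b > a else "down" if b < a else "same"
               for a, b in zip(pitches, pitches[1:])]           # direction of each melodic step
    total_duration = sum(f["duration_s"] for f in note_features)
    return {"melodic_contour": contour, "total_duration_s": total_duration}

# Three "notes" already reduced to features by extract_features() above.
notes = [
    {"pitch_hz": 440.0, "loudness": 0.35, "duration_s": 0.5},
    {"pitch_hz": 494.0, "loudness": 0.35, "duration_s": 0.5},
    {"pitch_hz": 440.0, "loudness": 0.30, "duration_s": 1.0},
]
print(integrate_features(notes))   # {'melodic_contour': ['up', 'down'], 'total_duration_s': 2.0}

The point of the sketch is only the direction of the data flow: isolated low-level measurements go in, and a single higher-level description comes out.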


What can this tell us about learning? First, note the order of basic processing. The brain begins by identifying, sorting, and labeling individual components of sensory data, perceiving the “elemental or building-block attributes of a sensory stimulus” (p. 104). This “low-level processing” is followed by “higher-level” processing: the brain’s frontal regions construct meaning from the sensory data’s “building blocks.” The basic elements form patterns that the brain recognizes, and it attributes meaning and significance to them. In short, the brain receives data, sorts and identifies it, and constructs meaning from it.


Second, note that both processes are necessary for sensory data to become meaningful. If the brain engaged only in feature extraction, listening to music would generate a frustrating collection of isolated data; coherence and meaning would be lost, and music as we experience it would not exist. Now transfer this to learning. Memorizing data on its own, devoid of any higher-level processing, produces only the elemental building blocks of understanding; understanding cannot be achieved through extraction alone. Just as the brain must engage both processes to experience music, it must engage both processes to construct understanding of new instructional material.


And you thought the music just flowed through you!


Levitin, Daniel J. (2007). This is your brain on music: The science of a human obsession. New York: Plume.
