AsianScientist (Jun. 15, 2012) – Ever been stuck in traffic when a feel-good song comes on the radio and suddenly your mood lightens?
Most people will agree that emotions and feelings are typically associated with the right side of the brain; processing the emotion in human facial expressions, for example, is handled by the right hemisphere.
New research from Australia published in the journal Neuropsychologia is challenging the widely held view that emotions and feelings are the domain of only the right hemisphere.
Dr. Sharpley Hsieh and colleagues from Neuroscience Research Australia (NeuRA) studied people with Alzheimer’s disease, semantic dementia, and healthy people without either disease. People with Alzheimer’s disease lose episodic memory (‘What did I do yesterday?’) while people with semantic dementia lose semantic memory (‘What is a zebra?’).
Participants listened to unfamiliar pieces of music and indicated whether each one sounded happy, sad, peaceful, or scary. MRI images of the patients’ brains were then taken so that diseased regions could be compared statistically with the answers given in the music test.
The study showed that patients with Alzheimer’s and semantic dementia have problems deciding whether a human face looks happy or sad because the amygdala in the right hemisphere is diseased.
Patients with semantic dementia have additional problems labeling whether a piece of music is happy or sad because the anterior temporal lobe in the left hemisphere is diseased.
“It’s known that processing whether a face is happy or sad is impaired in people who lose key regions of the right hemisphere, as happens in people with Alzheimer’s and semantic dementia,” said Hsieh.
“What we have now learnt from looking at people with semantic dementia is that understanding emotions in music involves key parts of the other side of the brain as well.”
Hsieh says that this is the first study from patients with dementia to show that language-based areas of the brain, primarily on the left, are important for extracting emotional meaning from music.
“Our findings suggest that the brain considers melodies and speech to be similar and that overlapping parts of the brain are required for both,” said Hsieh.
The article can be found at: Hsieh S et al. (2012) Brain correlates of musical and facial emotion recognition: Evidence from the dementias.
Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.