Document Detail


Audiovisual integration of emotional signals from music improvisation does not depend on temporal correspondence.
MedLine Citation:
PMID:  20153297     Owner:  NLM     Status:  MEDLINE    
Abstract/OtherAbstract:
In the present study we applied a paradigm often used in face-voice affect perception to solo music improvisation, to examine how the emotional valence of sound and gesture is integrated when perceiving an emotion. Three brief emotion-expressing excerpts produced by a drummer and three by a saxophonist were selected. From these bimodal congruent displays, the audio-only, visual-only, and audiovisually incongruent conditions (obtained by combining the two signals both within and between instruments) were derived. In Experiment 1, twenty musical novices judged the perceived emotion and rated the strength of each emotion. The results indicate that sound dominated the visual signal in the perception of affective expression, though this was more evident for the saxophone. In Experiment 2, a further sixteen musical novices were asked to attend either to the musicians' movements or to the sound when judging the perceived emotions. The results showed no effect of visual information when judging the sound. In contrast, when judging the emotional content of the visual information, performance worsened in the incongruent condition that combined different emotional auditory and visual information from the same instrument. The effect of emotionally discordant information thus became evident only when the auditory and visual signals belonged to the same categorical event, despite their temporal mismatch. This suggests that the integration of emotional information may be reinforced by its semantic attributes but may be independent of temporal features.
Authors:
Karin Petrini; Phil McAleer; Frank Pollick
Publication Detail:
Type:  Journal Article; Research Support, Non-U.S. Gov't     Date:  2010-02-11
Journal Detail:
Title:  Brain research     Volume:  1323     ISSN:  1872-6240     ISO Abbreviation:  Brain Res.     Publication Date:  2010 Apr 
Date Detail:
Created Date:  2010-08-13     Completed Date:  2010-11-22     Revised Date:  2014-10-28    
Medline Journal Info:
Nlm Unique ID:  0045503     Medline TA:  Brain Res     Country:  Netherlands    
Other Details:
Languages:  eng     Pagination:  139-48     Citation Subset:  IM    
Copyright Information:
Copyright 2010 Elsevier B.V. All rights reserved.
MeSH Terms
Descriptor/Qualifier:
Acoustic Stimulation
Adult
Analysis of Variance
Attention
Auditory Perception / physiology*
Emotions*
Female
Humans
Male
Music / psychology
Photic Stimulation
Time Factors
Visual Perception / physiology*
Grant Support
ID/Acronym/Agency:
MC_G1001214//Medical Research Council

From MEDLINE®/PubMed®, a database of the U.S. National Library of Medicine

