Lecture in Linköping 23/9
Music, the Brain and Multimodal Communication
Björn Vickhoff
[email protected]
Is music communication unimodal or multimodal? To answer this question, we have to consider the production of music, the message, and the perception of the message. Let us do that.
For the production of music, musicians depend on the following senses:
1. Auditory (feedback for technical control).
2. Visual (reading the score, looking at the instrument and the fingers).
3. Sensorimotor (movements vis-à-vis the instrument and feedback from touching the instrument).
4. Proprioception (sense of body position, balance).
5. The sense of space and direction (e.g. a drummer knowing the positions of his drums).
Furthermore, the following cognitive faculties are involved in music production:
1. Memory (procedural, echoic, episodic, semantic)
2. Sequencing
3. Musical rule processing
4. Imagery (= perceptions not caused by sensory input, such as felt continuations in composing and improvising)
5. Emotion (for expression)
The listener perceives:
1. Sounds
2. Visions (imagining the artist, or any visual imagery triggered by the
music)
3. Sensorimotor sensations (manifested in one's own body movements)
4. Proprioception (dancing and moving to the music)
5. A sense of space and direction (such as the perception of music as a
landscape we are passing through or the sense of the melody moving up
& down)
Furthermore, listening involves the following faculties:
1. Memory (procedural, echoic, episodic, semantic)
2. Sequencing
3. Musical rule processing
4. Imagery (for anticipation and perception)
5. Emotion
Thus we can see that both the process of sending and the process of receiving are multimodal and involve a multitude of cognitive faculties.
But: The transmission – the message as such – is unimodal.
Although the transmission is unimodal, the listener has, to a large extent, the same rich multimodal experience as the musician. How is this possible? The explanation lies in the difference between sensory input and perception. Even if we receive in just one modality, the perception of music is multimodal. Perception depends on perceptual learning, imagery, and contextually derived perspectives engaging procedural, echoic, episodic and semantic memories. Besides these memories, I want to emphasize the memory for sequences and rules. Let us examine these mechanisms one by one.
Perceptual learning: Pieces of information that are presented together become
linked together in the brain irrespective of modality. This is called unification or
pattern completion.
Examples:
1. Lip-reading ↔ hearing the lyrics ↔ hearing the melody.
2. The sound of the instrument: The sound of an acoustic guitar may act as a cue to memories of a guitar in all modalities, such as shape and color, and even the smell of spruce wood, as well as the feeling of holding it and playing it. The sound of a Les Paul guitar holds a universe of implicit understanding for the jazz guitarist.
Episodic memories. Music often evokes memories of situations. These
memories are multimodal and emotional.
Procedural memories create bodily felt expectancies (motor imagery).
Rule memory: We learn musical rules just by being exposed to styles and genres. Rule memory can be demonstrated in ERP experiments (EEG). This memory is implicit and independent of modality.
In music, the rule-governed pattern (the structure) can be presented as:
1. The visual pattern (the music score)
2. The sensorimotor pattern of the fingers (touch + motor)
3. The sounding structure
In the experienced musician, these modalities are automatically interconnected. The listener perceives the implicit rules of the style. As in the musician, these are intermodal connections. Even if the listener is not an experienced musician, there is a synchronicity between the heard music, one's own body movement and the visual imagination of the artist in action.
The perspective is a way to perceive the sensory input. Perspectives are triggered by the situation. They decide what will be figure and what will be ground (external context). They also select appropriate memories (internal context).
Examples:
1. In the mood – the rhythmic context changes the perception of the melody (musical context).
2. When we are in a concert hall, we perceive silence as music (environmental context).
3. Hip-hop, heavy metal, indie, folk music - any music is connected with a
tribal understanding. People listen, move, dress, and use emblems
according to tribal belonging. The tribe lives the music as an
understanding of who they are. I call this the tribal perspective.
4. Semantic knowledge, such as facts about the composer, knowledge of the lyrics, etc., affects listening. This can be called a semantic perspective.
Thus situation-derived perspectives select memories, and this process creates rich multimodal perceptions.
The challenge
I have to a certain extent explained how unimodal transmission can evoke multimodal perception. Any musical perception that is not auditory is imagery, since the sensory input is just auditory. But imagery is quite vivid, and it is processed by the same brain areas as perceptions derived from sensory input. On a neurological level, imagery is not separable from any other perception.
But: How can we explain the perception of movement (sensorimotor) and emotion (visceral sensorimotor)? Neither movement nor emotion has been
transmitted in any obvious sense. Let me sketch some ideas.
Music is procedural knowledge. The knowledge is in the fingers of the
musician, in the voice of the singer, in the body of the dancer, in the baton of
the conductor (!), in proprioception and in the bodily relation to the
instrument(s).
Question: Can music be communicated from body to body? Or, if we think of the body in terms of its representations in the brain: can music be communicated from body map to body map?
Maps and mirrors in the brain
Picture 1. Body maps. Map = a one-to-one representation.
Picture 2. The brain
Picture 3. Primary sensory cortex & primary motor cortex (Wilder Penfield).
Picture 4. Proportion of body representation in the brain.
Picture 5. Listening to the same music leads to differing perceptions, depending on differences in sensory equipment and perception.
Picture 6. The cerebellum contains body maps important for sequential
movements. Other parts of the brain that map the body are the colliculus and
the parietal lobe. By moving in the environment we develop a body schema =
the sense of the position of the body parts ≠ body image.
Picture 7. The perception of space
1. Peripersonal space: The body schema contains representations for reachable positions in space (such as positions on the instrument) and representations for positions reachable with a tool (such as the drumsticks). Where to reach for an object (bimodal sensorimotor neurons) ≠ canonical neurons: how to hold → affordances (sensorimotor). Trimodal neurons respond to somatosensory, visual and auditory stimuli in peripersonal space (Fogassi 1996 for monkeys; Brenner 2001 for humans). Perspective: manual – instrumental
2. Extrapersonal space is another system. Spatial neglect can be limited to
one of these systems. Extrapersonal space may be constructed from
peripersonal space. Perspective: Egocentric
To sum up: Space can be heard, seen and felt. We have multimodal
representations for space.
Picture 9. How do we communicate body movements? → communicative neurons → mirror neurons (explain). Mirror neurons map the other's body/face onto your own body/face. They can be visuomotor as well as auditory-motor, transitive as well as intransitive. Perspective: dyadic = a shared space of action.
Picture 10. Mirror neuron areas in the brain.
1. Multimodal structure planning in the frontal lobe, creating expectations. Broca's area has mirror neurons.
2. Secondary areas in the temporal lobe (strongest in STS for perception of body movement) and the parietal lobe, coding the body action of the other, form multimodal circuits with
3. the premotor area → motor planning & execution of the motor act.
This leads to an action/situation understanding = the understanding of an affordance in the eyes of the other and the motor act implied → the motivation/intention of the other + complementary action.
Picture 11. Examples. In the case of music, heard music triggers finger
movements. Mirror neurons are implicated in music listening.
Picture 12. The insula and the cingulate cortex have mirror neurons (Rizzolatti
2008). The neurons in the insula are multimodal (visual/acoustic) (Calder et al. 2000).
The insula is the visceromotor centre (Craig 2002).
Function:
1. Anterior → gustatory, olfactory + face reactions
2. Posterior: auditory, somatosensory, premotor
Visual/auditory info ↔ insula ↔ visceral reactions (body states) +
visceromotor reactions ↔ emotions (James-Lange)
Thus we can compare the two circuits:
1. Premotor/temporal/parietal mirror neurons ↔ motor imitation
2. Insula/cingulate cortex mirror neurons ↔ visceral motor imitation
Tribal perspective: I have shown how mirror neurons can create dyadic perspectives. By extension, they can also provide a tribal perspective, created by joint action and collective communication. Joint action concerns the implicit understanding of one's own contribution to group goals. This is important in music. The rhythm in particular provides a shared temporal space, which is interesting to watch on the dance floor, as well as a shared emotion, as we can see at concerts (Vickhoff 2008).
The musical brain
For the perception of sound we have (Zatorre et al. 2007):
1. A ventral stream for identification of sound, accessible to consciousness.
This stream produces pre-motor imitation of sounds.
2. A dorsal stream that is pre-attentive and rule-based. This stream is insensitive to modality and organizes the information along sequential, rhythmical and tonal parameters (allocentric perspective).
3. And, recently (Rizzolatti 2008), a stream directly to the insula/cingulate cortex, producing emotions.
Conclusion: Music, although unimodal as a message, triggers multisensory
mirror neurons which communicate from body to body and create implicit
understandings of goals and actions as well as emotional understandings. The
sound is the seed that produces the rose.
Film music
Are these principles important for how music can be used in multimodal
communication? Let us discuss this for film music.
1. The principle of unification: Because of unification we know what to look
for when we hear a sound. Sounds direct the gaze. The eye is looking for
the sound source. And in doing so it looks for synchronicity. If you see ten
bouncing balls and hear the sound from one of them, your eyes will automatically find the ball whose movement is synchronized with the sound. We
are, to take another example, sensitive to the synchronization between lip
movements and speech. It has been shown that babies attend to voices in
synch with lip movements and lose interest if the lip movements are
unsynchronized. By synchronizing musical structure with film structure, or by imitating sounds, film musicians can help the viewer focus her perception on features that are important to the narrative.
2. The principle of emotional communication: When we watch a film we are
usually not aware of the music, or of our own emotions. We project our
emotions into the intentional object. This creates a situation
understanding. This, I believe, is what emotions are for: to make us react appropriately. So when the film music provides us with an emotion, this emotion is used to understand the scene on the screen.
3. The principle of perspective: Music provides the emotional, tribal or
cultural context which makes us select a perspective. The scene is
perceived from this perspective. Example: In a course on film music, I replaced the piano music added to a silent movie (Buster Keaton) with Threnody to the Victims of Hiroshima by Krzysztof Penderecki, for pedagogic purposes.
Task: Daniel Gonzaga http://www.youtube.com/watch?v=MVepjSMsgKc