MUSICAL SPACE REVISITED
Danczi Csaba László
Center for Culture and Communication, ParaRadio
H-2500, Esztergom, Dobogókői út 50.
tel.: +36-30/2013781
[email protected]
ABSTRACT
Among the ancient Greeks, many subscribed to the notion of the Pythagoreans that music is a reflection of the universal harmony (the Music of the Spheres). Centuries later, Alberti, in his treatise on architecture, prescribed for buildings the very same ratios that had previously been attributed to the consonances.
The first scientific theory of music was Helmholtz's, based on the sensation of tones. Yet if music is considered a pure manifestation of vibrations and periodicities, there is no obvious reason why virtually all languages use a spatial analogy to describe musical events. Apart from spatial arrangement, stereophonic listening, or score writing, music has nothing to do with space. Or does it?
In the early 1980s, researchers began to explore the auditory midbrain. By the late 1990s, it had become clear that there is a multimodal channel combining auditory, visual, and cutaneous information, consisting of the inferior and superior colliculi and the anterior ectosylvian cortex.
The main organising principle of the superior colliculus is space. Is it a sub-cortical substrate for musical space? Possible applications to music therapy are discussed.
MUSIC AND SPACE
Since the ancient Greeks, philosophers, musicians and musicologists have thought of musical expression in differing terms. For the ancient Greeks, music expressed "ethos" (human character) and was a manifestation of the universal order (the Music of the Spheres; Pythagoras and his followers). Pythagoras (or, rather, his followers) regarded number as the principal arkhe of the Universe. According to the Music of the Spheres theory, the heavenly bodies, depending on their distance from the Earth, gave off sounds, and those distances, or rather their ratios, were considered the basis for the "consonances" of music, thus reflecting the universal order. Most probably Aristoxenos, back in ancient Greece, was the first to allege that musical notes make movements, thereby licensing the spatial analogy for pitch that has prevailed ever since. In the Pythagorean sense, healing meant a re-synchronisation of the micro- and the macro-universe. This view, saturated with a profoundly religious meaning, remained in use through the Middle Ages; later, Alberti, in his treatise on architecture, required the proportions of buildings to conform to the very same ratios that the Pythagoreans had earlier required for consonances.
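The arithmetic behind these consonance ratios is easily made explicit. The short Python sketch below shows how the classical string ratios 2:1, 3:2 and 4:3 yield the octave, fifth and fourth; the 200 Hz base tone is an arbitrary, purely illustrative choice rather than a figure from the ancient sources.

    # Pythagorean consonances as simple frequency ratios (illustrative sketch).
    PYTHAGOREAN_CONSONANCES = {
        "octave (diapason)": (2, 1),
        "fifth (diapente)": (3, 2),
        "fourth (diatessaron)": (4, 3),
    }

    def interval_frequency(base_hz, ratio):
        """Apply a string ratio to a base tone and return the resulting frequency."""
        numerator, denominator = ratio
        return base_hz * numerator / denominator

    base = 200.0  # arbitrary base tone in Hz, for illustration only
    for name, (num, den) in PYTHAGOREAN_CONSONANCES.items():
        print(f"{name}: {num}:{den} -> {interval_frequency(base, (num, den)):.1f} Hz")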
In French Classicism, J.-Ph. Rameau introspectively arrived at the "basse fondamentale" and the notion of combined sounds. More than a hundred years later, Helmholtz (1954) advanced a novel theory of music based on the sensation of tones, describing pitch in terms of frequency (oscillation) and timbre in terms of the combination of simple tones. He thereby founded a scientific language in which to describe musical phenomena.
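Helmholtz's decomposition lends itself to a direct numerical sketch: pitch corresponds to the fundamental frequency of oscillation, while timbre corresponds to the relative strengths of the simple (sinusoidal) partials that are combined. In the Python fragment below, the 220 Hz fundamental and the two amplitude profiles are illustrative assumptions rather than values taken from Helmholtz.

    import math

    def complex_tone(t, f0, partial_amplitudes):
        """Sample, at time t, a complex tone built as a sum of harmonic simple tones.
        The fundamental f0 carries the pitch; the amplitude profile of the partials
        shapes the timbre."""
        return sum(
            amp * math.sin(2 * math.pi * (k + 1) * f0 * t)
            for k, amp in enumerate(partial_amplitudes)
        )

    bright = [1.0, 0.8, 0.6, 0.4, 0.2]  # strong upper partials: a brighter timbre
    dull = [1.0, 0.2, 0.05, 0.0, 0.0]   # energy concentrated in the fundamental
    print(complex_tone(0.001, 220.0, bright), complex_tone(0.001, 220.0, dull))

Both tones share the same pitch (220 Hz) yet differ in timbre, which is the core of Helmholtz's account.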
MUSIC AND LANGUAGE
Most contemporary languages use a spatial metaphor to describe musical events. To name but a few, Hungarian, English, German, Finnish, Italian, Portuguese, Czech, Slovak and other languages use the same words for spatial position and musical pitch ("high" and "low" pitch), whereas Chinese and Hindi use spatial-position words for "high" pitches and special expressions for "low" pitches. Most interesting are Romanian and French: in French a "high" pitch is "acute", and in Romanian a "low" pitch is "gross", while both languages also use spatial metaphors for pitch. Despite differences in terminology, all the languages mentioned above describe musical events using a spatial metaphor.
The term "spatial metaphor" is used here to emphasise that musical notes occupy no real space, make no spatial movement, and so on. If so, why should we speak of high and low pitches, and of musical movement?
CROSS-MODAL LINKS
In the 1970s and 1980s, Marks and his colleagues provided a vast amount of empirical data substantiating that people (even in the absence of any perceptual aid) consistently associate musical notes with spatial positions (high pitch with an upper position, low pitch with a lower position) and with form (high pitch with acute forms, low pitch with large, rounded forms) in the visual modality.
The early work on intersensory development was framed in terms of one of two opposing theoretical views: the integration view (e.g. Piaget) or the differentiation view (e.g. Bower and Gibson). The integration view, which is constructivist in nature, assumes that the senses operate as separate avenues of sensory input and that the liaisons among the senses develop only slowly, over many months or even several years. In contrast, the differentiation view holds that we are born with an ability to perceive intersensory relations.
AUDITORY PATHWAYS
Since the pioneering work of Aitkin (1), a clear picture of a multisensory (auditory, visual, tactile/cutaneous, somesthetic, and motor) channel has emerged in the neurophysiological literature, a channel consisting of parts of the inferior colliculus, the superior colliculus, and the anterior ectosylvian gyrus of the cortex. The most interesting feature of this channel is that it is intimately intertwined with the auditory pathway.
Inferior Colliculus (IC)
The inferior colliculus (IC) is a mandatory relay for information ascending to the auditory
cortex. It is subdivided into a central nucleus surrounded by the external nucleus and the
pericentral nucleus.
The IC receives its main inputs from both the auditory cortex and brainstem structures of the lateral lemniscal system, as well as from "nonauditory" structures, e.g. the spinal cord, the dorsal column nuclei, retinal ganglion cells, and the sensorimotor cortex.
Neurons of the external nucleus of the IC are influenced by both somatic and acoustic stimuli, and the external and pericentral nuclei also receive some fibers from the ipsilateral auditory cortex.
Every subdivision of the IC contributes, to a greater or lesser extent, to the projections to the pons, the superior olive, and the cochlear nuclei. However, IC-pontine neurons are largely confined to the external and pericentral nuclei, whereas neurons projecting to the superior olivary complex and the cochlear nuclei are found mainly in the central nucleus.
Most of the neurons in the central nucleus exhibit sharp sensitivity to tonal frequencies with a sustained discharge pattern, whereas those in the external and pericentral nuclei are broadly tuned and respond at the onset of the stimulus. Acoustic responses similar to those of external nucleus neurons have been shown in the dorsolateral pontine nucleus, where the collicular fibers terminate. The IC uses GABAergic circuits to modify its response properties.
Superior Colliculus (SC)
The mammalian superior colliculus consists of seven alternating fibrous and cellular laminae, which have been
grouped into superficial and deep divisions. Neurons in the superficial layers are responsive only to visual stimuli.
Nonvisual stimuli not only fail to produce a response in these neurons but also fail to modulate their responses to visual stimuli. All superficial layer neurons respond to moving visual stimuli, and every neuron with a receptive field within the area of binocular overlap can be driven by both eyes. However, these neurons are not selective for stimulus orientation or shape, and in one study (2) only 5% were selective for the direction of movement, although a wide range of velocity preferences was represented.
The most characteristic feature of the deep layers is the appearance of neurons responsive to
nonvisual stimuli (i.e., auditory and somatosensory), as well as the appearance of multisensory
neurons. Of the 108 sensory-responsive neurons examined in Wallace et al.'s (2) study of the deep layers, more than one-quarter were multisensory. All but one of these multisensory neurons were
responsive to visual stimuli.
The large majority of somatosensory-responsive neurons also have receptive fields on the
contralateral body surface. These fields are smallest on the face and forelimb and largest on the
trunk and hind limb. Somatosensory-responsive neurons are overwhelmingly cutaneous (hair:
65%; skin: 30%) and, regardless of the location of a neuron's receptive field on the body surface
or whether the neuron is unimodal or multisensory, responses are generally most vigorous to
high-velocity stimuli. Typically, somatosensory-responsive neurons are most sensitive to rapid
deflection of the hairs. Responses are transient, and a sustained response can be elicited only by a
stimulus moving continuously across the cutaneous surface (2).
The presence of extensive connections between superficial and deep regions of the colliculus
in the cat supports the idea that receptive field organization in the deep layers is modulated by
visual input from the overlying layers. Thus, a complex network of connections within and
between both superficial and deep regions of the colliculus may participate in forming the output signal to the saccadic control system.
The superior colliculus plays an integral role in cross-modal behavior. Its neurons are capable not only of responding to cues from different sensory modalities (e.g., visual, auditory, and somatosensory) but also of synthesizing this information (3). The capacity to integrate cross-modal information plays a
major role in the fundamental attentive and orientation functions of the structure, and is
particularly evident in one of its more dynamic sensorimotor roles, the facilitation of gaze shifts.
The integration of multisensory information in the SC is predictable based on the organization
of its various sensory representations and the spatial relationships among stimuli. Each sensory
representation in the SC is topographically organized and in spatial correspondence with the
others. Two different sensory stimuli that originate from the same location in space (e.g. derived from the same event) will effect a pronounced enhancement of the neuron's response beyond that predicted by the sum of its activations to the two cues presented individually. When one of
the two cues falls outside the neuron's receptive field and is disparate from the other (as is the
case when the two stimuli are unrelated), either no integration occurs or the response to the
stimulus still within the receptive field is markedly depressed (3).
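Read schematically, this spatial rule amounts to a simple decision procedure. The Python sketch below is only such a schematic reading of the principle reported in (3), with hypothetical cue positions; it is not a model of actual SC firing.

    def multisensory_outcome(visual_in_rf, auditory_in_rf):
        """Schematic spatial rule for a visual-auditory SC neuron.
        Both cues inside the receptive field: response enhancement.
        Only one cue inside, the other disparate: depression or no interaction."""
        if visual_in_rf and auditory_in_rf:
            return "enhancement: response exceeds the sum of the unimodal responses"
        if visual_in_rf or auditory_in_rf:
            return "depression of the within-field response, or no interaction"
        return "no response"

    print(multisensory_outcome(visual_in_rf=True, auditory_in_rf=True))
    print(multisensory_outcome(visual_in_rf=True, auditory_in_rf=False))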
Combinations of stimuli can have very different consequences in the same neuron, depending
on their temporal and spatial relationships. Generally, multisensory interactions are evident when
pairs of stimuli are separated from one another by < 500 ms, and the products of these interactions far exceed the sum of their unimodal components. Whether the combination of
stimuli produces response enhancement, response depression, or no interaction depends on the
location of the stimuli relative to one another and to their respective receptive fields.
Furthermore, maximal response interactions can be seen with the pairing of weakly effective
unimodal stimuli. As the individual unimodal stimuli become increasingly effective, the levels of response enhancement to stimulus combinations decline, a principle referred to as 'inverse
effectiveness' (2). In all the stimulus combinations and neurons that Wallace et al. (2) studied,
progressively greater response enhancements were seen with combinations of stimuli that were
progressively less effective.
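Inverse effectiveness can be made concrete by expressing the combined response relative to the best unimodal response. The sketch below follows that convention; the spike counts are invented purely for illustration.

    def enhancement_percent(combined, best_unimodal):
        """Multisensory enhancement relative to the best unimodal response, in percent."""
        return 100.0 * (combined - best_unimodal) / best_unimodal

    # Hypothetical spike counts per trial: weakly effective unimodal cues yield a
    # proportionally larger enhancement than strongly effective ones.
    print(enhancement_percent(combined=6.0, best_unimodal=2.0))    # 200.0 % for weak cues
    print(enhancement_percent(combined=24.0, best_unimodal=20.0))  # 20.0 % for strong cues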
SC-mediated orientation and localization responses are effected through the descending
efferent projections of deep laminae cells to premotor and motor areas of the brain stem and
spinal cord. In one study (3), > 90% of the cells identified as descending efferents also responded to sensory stimuli, and > 70% were multisensory. In the SC, NMDA receptor-mediated and GABAergic transmission help to shape the auditory space map.
Anterior Ectosylvian Sulcus (AES)
Recently, the cortex in and around the anterior ectosylvian sulcus (AES) has been shown to play a major role in gating the multisensory integration of SC neurons. The AES is composed of three modality-specific areas: a visual area, AEV; a somatosensory area, SIV; and an auditory area, Field AES. The AES projects heavily to the ipsilateral SC, and Wallace and Stein (4) showed that deactivation of the AES depressed or eliminated the multisensory enhancement evoked by two spatially coincident stimuli. These deactivation effects were selective, reversible, and
often produced only minor changes in the neuron's responses to unimodal cues. Thus, in the absence of AES
influences the neuron would retain its multisensory character and respond to cues from more than one modality, but
would lose its ability to integrate multisensory information.
The axons of unimodal AES neurons from different modalities converge on neurons in the
ipsilateral SC. Sometimes these inputs are the only effective sensory inputs to an SC neuron and
thereby render it multisensory. In most cases AES inputs converge on SC neurons that already
respond to stimuli from more than one modality; AES multisensory neurons project elsewhere
(4).
REFERENCES
1: Aitkin, L. The Auditory Midbrain: Structure and Function in the Central Auditory Pathway. Clifton, New Jersey: Humana Press, 1985.
2: Wallace, M.T., Wilkinson, L.K., & Stein, B.E. Representation and Integration of Multiple Sensory Inputs in Primate Superior Colliculus. Journal of Neurophysiology, Vol. 76, No. 2, 1996.
3: Stein, B.E., & Meredith, M.A. The Merging of the Senses. Cambridge, MA: MIT Press, 1993.
4: Wallace, M.T., & Stein, B.E. Cross-modal synthesis in the midbrain depends on input from association cortex. Journal of Neurophysiology, 1994, 71: 429-432.