Tonal Space and the Human Mind
P. Janata
Presented by Deepak Natarajan

Background
• Simple musical stimuli were used in the past to measure expectancy violations
• Recently, neuroscientists have preferred more natural stimuli (REAL music), creating a need to model tonality in more sophisticated ways
• The neural, perceptual, and cognitive constraints of the human brain can provide additional insight to music theorists and musicologists

Background
• Of the major perceptual dimensions of music (tonality, rhythm, timbre), the focus of this research is tonality
• Analysis must be conscious of the human mind
– Must establish the short-term and long-term context of tonality
– Must consider other factors, such as human attention span
• Physiological knowledge can be used to inform models of music processing

Goal
• Develop models of tonal structure that are suitable for analyzing behavioral and neurophysiological data (Janata, 2005)

Navigating Tonal Space
• The toroidal model links music theory, cognitive psychology, and computational modeling (Krumhansl & Kessler, 1982)

Navigating Tonal Space
• Supports various distance metrics between keys
• Different disciplines seek specific relationships (e.g., tonal/chordal relationships such as the circle of fifths vs. psychological values such as relatedness)
• All of these relationships can be modeled using self-organizing maps

Self-Organizing Maps
• An SOM is a type of artificial neural network that is trained using unsupervised learning
– Makes no assumptions about relationships among elements of the source data
– Produces a low-dimensional, discretized representation of the input space of the training samples
– Can uncover structure in the source data

SOMs in Music Theory
• The approach assumes that nervous systems learn to identify recurring patterns of sensory input; the brain is a statistical engine
• Similarities between the pitch-class distributions of the input data and template data are used to train the neural network
• The learning algorithm adjusts the weights (ties) between input and output units to determine the most probable key

SOMs in Music Theory
• Three types
– Probe tone: ratings of how well each of the twelve pitch classes is perceived to fit into a particular key serve as input to the SOM
– Pitch class: similar to probe tone, but uses music theory (non-subjective) to create pitch-class distributions representing the importance of each pitch in a key
– Acoustic waveform: uses models of known physiological mechanisms to define transformations of the auditory input and subsequent representations

Acoustic Waveform SOMs
[figure] (Janata, 2007)

Perception of Tonal Regions in a Modulating Melody
• To perform key finding, a SOM was trained on an 8-minute melody that modulated through all 24 major and minor keys
– This yielded an equal representation of PERCEIVED key regions
• The model was probed at various timescales with known stimuli, and the results were projected onto the SOM to determine activation dynamics at those timescales
• Input stimuli: B-major scale + variations (Janata, 2007)

Results: Activation Images
[figure: SOM activation maps at the 0.2 s and 2.0 s timescales] (Janata, 2007)

Results
• Activation consistently appears in the vicinity of the B-major label
• However, activation is biased toward different key regions, and the bias depends on the harmonic structure of the input stimuli
• The 2.0 s timescale activation patterns indicate the stable key (see the sketch below)
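To make the SOM key-finding and timescale-probing pipeline concrete, here is a minimal sketch. It is not the code used in the studies: the grid size, training schedule, the toy B-major stimulus, and the helper names (key_profiles, train_som, activation, windowed_pc) are all assumptions for illustration; only the profile values are taken from Krumhansl & Kessler (1982).

```python
# Minimal sketch (NOT the paper's implementation) of training a toroidal
# SOM on pitch-class key profiles and probing it at two timescales.
import numpy as np

# Krumhansl & Kessler (1982) probe-tone profiles for C major and C minor.
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])

def key_profiles():
    """Profiles for all 24 keys, obtained by rotating the C profiles."""
    rows = [np.roll(p, k) for p in (MAJOR, MINOR) for k in range(12)]
    X = np.array(rows)
    return X / np.linalg.norm(X, axis=1, keepdims=True)

def train_som(X, rows=8, cols=12, epochs=500, seed=0):
    """Plain SOM with a toroidal (wrap-around) neighborhood."""
    rng = np.random.default_rng(seed)
    W = rng.random((rows, cols, X.shape[1]))
    gy, gx = np.mgrid[0:rows, 0:cols]
    for t in range(epochs):
        lr = 0.5 * (1 - t / epochs)               # decaying learning rate
        sigma = max(0.7, 3.0 * (1 - t / epochs))  # shrinking neighborhood
        for x in X[rng.permutation(len(X))]:
            d = np.linalg.norm(W - x, axis=2)
            by, bx = np.unravel_index(d.argmin(), d.shape)  # best unit
            # Toroidal distance: wrap differences around both grid axes.
            dy = np.minimum(np.abs(gy - by), rows - np.abs(gy - by))
            dx = np.minimum(np.abs(gx - bx), cols - np.abs(gx - bx))
            h = np.exp(-(dy**2 + dx**2) / (2 * sigma**2))
            W += lr * h[..., None] * (x - W)
    return W

def activation(W, pc_dist):
    """Map a pitch-class distribution to an activation image on the grid."""
    v = pc_dist / (np.linalg.norm(pc_dist) + 1e-12)
    return -np.linalg.norm(W - v, axis=2)  # higher = closer to the input

def windowed_pc(events, t, width):
    """Pitch-class histogram of (onset_s, pitch_class) events in a window."""
    h = np.zeros(12)
    for onset, pc in events:
        if t - width <= onset <= t:
            h[pc] += 1
    return h

W = train_som(key_profiles())
# Hypothetical stimulus: an ascending B-major scale, one note per 0.25 s.
scale = [(i * 0.25, pc) for i, pc in enumerate([11, 1, 3, 4, 6, 8, 10, 11])]
fast = activation(W, windowed_pc(scale, t=2.0, width=0.2))  # 0.2 s window
slow = activation(W, windowed_pc(scale, t=2.0, width=2.0))  # 2.0 s window
print(np.unravel_index(fast.argmax(), fast.shape),
      np.unravel_index(slow.argmax(), slow.shape))
```

The short window reflects only the last note or two and so wanders across neighboring key regions, while the long window integrates more of the scale and settles near a single stable key region, which is the contrast the 0.2 s vs. 2.0 s activation images illustrate.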
• This analysis can be extended to even longer timescales

Video
• http://atonal.ucdavis.edu/projects/musical_spaces/tonal/torus_animations

Brain Networks That Track Musical Structure
• Identify regions of the brain that perform tonal analysis (if any do!)
– Compare model data to fMRI data and look for similarities; if matches are found, we can assume we are essentially modeling this function of the brain
– Use outside knowledge to raise/answer questions about other brain functions occurring in those regions

Attentive Listening
• Music provides a complex soundscape for attention to roam over
• We are interested in brain states that correspond to attentive, engaged listening
• To elicit such states, neuroimaging experiments were performed in which subjects were asked to identify some phenomenon (e.g., a tonal expectancy violation)

Brain Activity
• fMRI data showed activity in premotor areas of the brain, which are known to be active in primarily perceptual tasks that have strong, directed anticipatory components
• This encourages viewing music within a perception/action cycle framework: the task demands shape the activity that we label as the brain's processing of music

Tracking Movement Through Tonal Space
[figure] (Janata, 2005)

Tracking Movement Through Tonal Space
• Regions of high correlation between spherical-harmonic model data and brain activity suggest that other functions are linked to tracking tonality (see the sketch after the references)
– Model data is strongly correlated with activity in the rostral medial prefrontal cortex (RMPFC)
– This region is generally involved in the cognitive control and evaluation of emotion

Tracking Movement Through Tonal Space
[figure] (Janata, 2005)

Revisit Goal
• Develop models of tonal structure that are suitable for analyzing behavioral and neurophysiological data (Janata, 2005)

Other Observations
• Rostral and ventral aspects of the MPFC are among the last regions in which significant cortical atrophy (weakening/degeneration) is observed in Alzheimer's disease (AD) patients
– AD patients have responded very positively to familiar music from their childhood, often singing along and readily detecting deviations embedded in the musical stimuli
– This possibly suggests that the RMPFC is a locus at which music and autobiographical memories are bound together

References
• Janata, P. (2005). Brain networks that track musical structure. In The Neurosciences and Music II: From Perception to Performance (Vol. 1060). New York Academy of Sciences.
• Janata, P. (2007). Navigating tonal space. In E. Selfridge-Field (Ed.), Tonal Theory for the Digital Age (Computing in Musicology, Vol. 15, pp. 39–50).
• Krumhansl, C. L., & Kessler, E. J. (1982). Tracing the dynamic changes in perceived tonal organization in a spatial representation of musical keys. Psychological Review, 89(4), 334–368.
• Janata Lab (Center for Mind and Brain, UC Davis): http://atonal.ucdavis.edu/
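A minimal sketch of the model-to-brain correlation idea from "Tracking Movement Through Tonal Space". This is not the study's pipeline: in practice the model regressor would be convolved with a hemodynamic response function and fit within a general linear model, but a plain voxelwise Pearson correlation conveys the idea. The function name, array shapes, and synthetic data are all assumptions for illustration.

```python
# Minimal sketch (NOT the study's pipeline): score each voxel by the
# Pearson correlation between its BOLD time series and a model-derived
# tonal-tracking regressor.
import numpy as np

def voxelwise_correlation(bold, regressor):
    """Pearson r between one regressor and every voxel's time series.

    bold:      array of shape (n_timepoints, n_voxels)
    regressor: array of shape (n_timepoints,), e.g. the time course of
               movement through tonal space predicted by the model
    """
    b = bold - bold.mean(axis=0)
    r = regressor - regressor.mean()
    num = b.T @ r
    den = np.linalg.norm(b, axis=0) * np.linalg.norm(r) + 1e-12
    return num / den

# Toy demonstration with synthetic data: 200 timepoints, 5000 voxels,
# with the regressor deliberately mixed into voxel 0.
rng = np.random.default_rng(1)
regressor = rng.standard_normal(200)
bold = rng.standard_normal((200, 5000))
bold[:, 0] += 2.0 * regressor
r = voxelwise_correlation(bold, regressor)
print(r[0], np.abs(r[1:]).max())  # voxel 0 should stand out clearly
```

Voxels whose correlation with the model regressor is high relative to the rest of the brain are the candidate regions for tracking tonality; in the study, this kind of analysis highlighted the RMPFC.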