Science at the heart of medicine
September 17th, 2010
Rose F. Kennedy Intellectual and Developmental Disabilities
Research Center (IDDRC) Seminar Series
Director: Steve Walkley, DVM, Ph.D.
Associate Director: John J. Foxe, Ph.D.
Advances in Multisensory Integration Science:
The Multisensory Research Network Symposium
LeFrak Auditorium, Price Center, Room 151
9am, Friday, September 17th, 2010
SYMPOSIUM OUTLINE:
08:15am
Breakfast (Price Center, lobby below the LeFrak Auditorium)
09:00am
Title: Adult Plasticity of Multisensory Integration
Speaker: Barry E. Stein, Ph.D., Professor and Chairperson, Department of
Neurobiology and Anatomy, Wake Forest University School of Medicine, North Carolina,
USA
09:25am
Title: Visual Speech Perception
Speaker: Lynne E. Bernstein, Ph.D., Program Director, Cognitive Neuroscience Program,
Division of Social, Behavioral, and Economic Sciences, National Science Foundation
(NSF); Professor, Department of Speech and Hearing Sciences, George Washington
University, Washington, DC, USA
09:50am
Title: The Development of Multisensory Integration and its Breakdown in Autism
Speaker: Sophie Molholm, Ph.D. Associate Professor, Departments of Pediatrics and
Neuroscience, Children’s Evaluation and Rehabilitation Center, Associate Director,
Cognitive Neurophysiology Laboratory, Albert Einstein College of Medicine, New York,
USA
10:15am
Title: How and Why Primary Sensory Cortices Participate in Multisensory
Integration
Speaker: Charles E. Schroeder, Ph.D., Professor, Cognitive Neuroscience &
Schizophrenia Program, Nathan Kline Institute for Psychiatric Research and Department of
Psychiatry, Columbia University College of Physicians and Surgeons, New York, USA
10:40am
Title: Making Waves: Evoked Periodicity in Visual-Target Detection Demonstrates
the Behavioral Consequences of Cross-sensory Phase Reset in Humans
Speaker: Ian Fiebelkorn, Doctoral Student, The City College of New York, New York, USA
11:05am
Title: Multisensory Integration of Vision and Touch
Speaker: Krish Sathian, MD, Ph.D.
Professor of Neurology, Rehabilitation Medicine & Psychology, Emory University, Interim
Director, Rehabilitation R&D Center of Excellence, Atlanta VAMC, Georgia, USA
Title: Adult Plasticity of Multisensory Integration
Speaker: Barry E. Stein, Ph.D., Professor and Chairperson, Department of Neurobiology and Anatomy,
Wake Forest University School of Medicine, North Carolina, USA
Abstract:
During early postnatal life the brain is highly plastic and gradually develops and adapts its multisensory
integration processing capabilities to deal with the presumptive environment in which it will function. Spike-timing-dependent
plasticity (STDP) provides a framework for understanding these adaptations by predicting how superior
colliculus (SC) afferents that are co-activated mutually reinforce one another. Recently, however, we have found that if this
functional organization has not taken place, because the essential experiences to instantiate it have been
precluded, the mature brain retains the capacity to incorporate it. These observations contrast with expectations
of a critical period during early life for building this essential circuitry. Indeed, animals reared in the dark
until adulthood (and thus with no experience of the statistics of visual-auditory events) rapidly develop this
capacity once presented with cross-modal events. Not only do they develop this capacity far more rapidly than
their neonatal counterparts, but they can do so while anesthetized. Even once fully formed, this
organizational structure is not static; it continues to show rapid adaptations to changes in the statistics of
cross-modal events. Thus, repeated exposure of animals to pairs of sequential, temporally disparate visual and auditory
stimuli (that initially produce independent, unisensory responses) alters the properties of multisensory SC
neurons in ways predicted by the STDP learning algorithm: the duration and magnitude of the first unisensory
response increase, and the latency of the second unisensory response decreases. As a result, the two initially
independent responses begin to fuse and appear more like an integrated multisensory response. This effect was
evident in most neurons within 15-20 repetitions of the stimuli; it persisted well beyond the “training” period; and
it was independent of the order of the stimuli. Apparently, adult multisensory SC neurons adapt to the statistics of
short-term sensory experiences much as their neonatal counterparts adapt to the statistics of long-term sensory
experiences. Research was supported by NIH grants EY016716 and NS036916, and the Wallace Foundation.
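As a rough illustration of the STDP logic the abstract invokes, the following minimal Python sketch (not the authors' model; the amplitudes, time constant, and spike timings are assumed values chosen for demonstration) implements a canonical pair-based STDP window: a synapse is potentiated when the presynaptic spike precedes the postsynaptic spike and depressed when the order reverses, which is the sense in which temporally aligned, co-activated afferents mutually reinforce one another over repeated pairings.

```python
import math

# Illustrative pair-based STDP rule; amplitudes and time constant are assumed
# values for demonstration, not parameters from the study.
A_PLUS = 0.01    # potentiation amplitude (pre precedes post)
A_MINUS = 0.012  # depression amplitude (post precedes pre)
TAU_MS = 20.0    # width of the STDP window, in milliseconds

def stdp_delta_w(t_pre_ms: float, t_post_ms: float) -> float:
    """Weight change for a single pre/post spike pairing."""
    dt = t_post_ms - t_pre_ms
    if dt > 0:
        # Causal pairing: the afferent helped drive the SC neuron, so strengthen it.
        return A_PLUS * math.exp(-dt / TAU_MS)
    # Anti-causal pairing: weaken the afferent.
    return -A_MINUS * math.exp(dt / TAU_MS)

# Co-activated visual and auditory afferents arriving just before the SC
# neuron's spike fall on the causal side of the window, so repeated pairings
# reinforce them, echoing the abstract's 15-20 stimulus repetitions.
w = 0.5
for _ in range(20):
    w += stdp_delta_w(t_pre_ms=0.0, t_post_ms=5.0)
print(f"synaptic weight after 20 pairings: {w:.3f}")
```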
Title: Visual Speech Perception
Speaker: Lynne E. Bernstein, Ph.D., Program Director, Cognitive Neuroscience Program, Division of Social,
Behavioral, and Economic Sciences, National Science Foundation (NSF), and Professor, Department of Speech and
Hearing Sciences, George Washington University, Washington, DC, USA
Abstract:
Audiovisual speech perception is known as a quintessential example of multisensory integration. Its underlying
mechanisms are so effective that, for decades, speech perception was considered by many researchers to be an
auditory function only. That is, the contribution of vision went almost unnoticed. Contemporary views that visible
speech is processed cortically along the same pathways as audible speech expand the domain of the speech
stimulus but attend little to the distinct properties and processing demands of visible speech. We regard visible
speech as fundamentally both linguistic and visual. In this talk, I present evidence on what constitutes the
effective visible speech stimulus, how it is perceived by lip readers, where it is processed cortically, and how
lifelong deafness affects lip-reading capacity. The extensive, yet frequently underestimated, visual system
contribution to processing audiovisual speech stimuli is itself an outcome of remarkable multisensory speech
integrative mechanisms.
Title: The Development of Multisensory Integration and its Breakdown in Autism
Speaker: Sophie Molholm, Ph.D. Associate Professor, Departments of Pediatrics and Neuroscience,
Children’s Evaluation and Rehabilitation Center, Associate Director, Cognitive Neurophysiology
Laboratory, Albert Einstein College of Medicine, New York, USA
Abstract:
The integration of multisensory inputs is essential to forming meaningful representations of the environment.
Optimal benefit from multisensory inputs often requires experience, and there is every reason to expect that
there is a typical developmental course for the “tuning-up” of multisensory integration. Here I will consider
recent behavioral and electrophysiological data from our laboratory on the developmental course of multisensory
processing in school-aged children. These data are suggestive of protracted plasticity in a dynamic system
that continues to update the relative significance of multidimensional inputs from the environment. In addition, I
will present data on multisensory integration and its breakdown in children with autism.
Title: How and Why Primary Sensory Cortices Participate in Multisensory Integration
Speaker: Charles E. Schroeder, Ph.D., Professor, Cognitive Neuroscience & Schizophrenia Program, Nathan Kline
Institute for Psychiatric Research and Department of Psychiatry, Columbia University College of Physicians and
Surgeons, New York, USA
Abstract:
Multisensory interactions have been widely reported in primary auditory, visual and somatosensory cortices. How
do these interactions operate? Several lines of evidence indicate that these effects predominantly reflect an
interaction of a driving input (i.e., one that causes local cortical neurons to fire action potentials) through the
preferred modality with a modulatory input mediated by a non-preferred modality. The latter type of input
operates by raising or lowering excitability, and thus the probability and degree of neuronal firing in response
to a preferred modality input. Modulatory inputs can control local excitability dynamically by adjusting the phase of
ongoing neuronal oscillations. Because these oscillations reflect cyclical excitability variations in local neurons,
response amplitudes depend on the oscillatory phase at which inputs arrive. By resetting activity to a high- or
low-excitability phase at a particular moment in time, the modulatory input can determine whether a coincident
driving input is amplified or suppressed. For near-threshold inputs this can be an “all-or-none” effect. Further,
because oscillations are often coupled across low, middle and high frequency ranges, these effects can support
multisensory facilitation of complex sensory processing, such as that occurring in the parsing of
spoken language. Finally, it is clear that attention is critical in determining whether heteromodal inputs will
modulate processing in a primary sensory region. Why do interactions occur in primary cortices? Paradoxically, the
most important effect of low-level multisensory interactions may be that they enhance the brain’s representation of
a unisensory stream, helping you to hear and see better and more selectively. In a cocktail party conversation, for
example, attending to a person’s head, facial, and hand gestures is predicted to give these visual cues the power
to dynamically modulate oscillatory phase in a number of frequencies, helping to amplify that specific conversation
stream against the background of the party. The extent to which the visual inputs also enrich the low-level
auditory representation in this case, for example, by adding “information” to it, is not yet clear.
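To make the phase-reset mechanism concrete, here is a toy Python sketch (assumed frequency, threshold, and timings; not the speaker's model). Local excitability is treated as a sinusoidal oscillation whose phase is reset by a modulatory input; a near-threshold driving input arriving 10 ms later is then amplified or fully suppressed depending on whether the reset landed it on a high- or low-excitability phase.

```python
import math

F_OSC = 8.0        # ongoing oscillation frequency in Hz (assumed, alpha-range)
THRESHOLD = 1.0    # firing threshold for the driving input (arbitrary units)

def excitability(t: float, phase: float) -> float:
    """Cyclical excitability of the local population at time t (seconds)."""
    return math.sin(2 * math.pi * F_OSC * t + phase)

def response(drive: float, t: float, phase: float) -> float:
    """A near-threshold drive produces output only when excitability is high."""
    effective = drive + 0.5 * excitability(t, phase)
    return effective if effective >= THRESHOLD else 0.0

# A modulatory (e.g., auditory) input at t_reset resets the oscillation to a
# chosen phase; a visual driving input follows 10 ms later.
t_reset, t_drive = 0.0, 0.010

for reset_phase, label in [(math.pi / 2, "high-excitability"),
                           (-math.pi / 2, "low-excitability")]:
    # Phase offset so the oscillation sits at reset_phase at t_reset.
    phase = reset_phase - 2 * math.pi * F_OSC * t_reset
    r = response(drive=0.8, t=t_drive, phase=phase)
    print(f"reset to {label} phase -> response {r:.2f}")
```

Running this prints a nonzero response only for the high-excitability reset, the "all-or-none" behavior the abstract describes for near-threshold inputs.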
Title: Making Waves: Evoked Periodicity in Visual-Target Detection Demonstrates the Behavioral
Consequences of Cross-sensory Phase Reset in Humans
Speaker: Ian Fiebelkorn, Doctoral Student, The City College of New York, New York, USA
Abstract:
The simultaneous presentation of a stimulus in one sensory modality often enhances the detectability of targets in
another sensory modality, but the neural mechanisms that govern these effects are still under investigation. Here
we test a hypothesis proposed in the neurophysiologic literature: that multisensory enhancement of visual-target
detection in humans operates through top-down phase reset of ongoing oscillations. Our behavioral approach
bypasses issues that complicate the detection of phase reset in neurophysiologic data when stimuli from multiple
modalities are simultaneously presented. Fluctuations in visual-target detection at consecutive time points reveal
evoked periodicity in behavioral performance, time-locked to a temporally informative sound. We argue that these
data demonstrate the behavioral consequences of cross-sensory phase reset, both at the moment that it occurs
and persisting for seconds thereafter. To test the dependence of our results on endogenous factors such as top-down attentional control, we also manipulated the probability that a low-contrast visual target would co-occur with
the sound. The probability of simultaneity seems to determine whether ongoing oscillations are reset such that
co-occurring visual targets align with high-excitability phases (and detection is therefore enhanced).
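The behavioral approach can be sketched in a few lines of Python (an illustrative pipeline on synthetic data; the data layout, sampling step, and 10 Hz rhythm are assumptions, not the study's actual analysis): detection rate is computed at consecutive sound-to-target delays, and a Fourier transform of that time course exposes any evoked periodicity time-locked to the sound.

```python
import numpy as np

# Assumed layout: hits[s, d] = 1 if the observer detected the target presented
# delay_ms[d] after the sound on trial s, else 0. Here we synthesize data in
# which detection oscillates at ~10 Hz around a 50% baseline.
rng = np.random.default_rng(0)
delay_ms = np.arange(0, 1000, 10)          # delays sampled every 10 ms
n_trials = 200
p_hit = 0.5 + 0.15 * np.sin(2 * np.pi * 10.0 * delay_ms / 1000.0)
hits = rng.random((n_trials, delay_ms.size)) < p_hit

# Detection rate as a function of delay, then its amplitude spectrum.
rate = hits.mean(axis=0)
rate -= rate.mean()                        # remove the DC component
spectrum = np.abs(np.fft.rfft(rate))
freqs = np.fft.rfftfreq(rate.size, d=0.010)  # 10-ms steps -> frequencies in Hz

peak = freqs[spectrum.argmax()]
print(f"dominant behavioral periodicity: {peak:.1f} Hz")
```

Working from hit rates rather than electrophysiology is what lets this approach sidestep the artifacts that complicate detecting phase reset in neural recordings during simultaneous multimodal stimulation.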
Title: Multisensory Integration of Vision and Touch
Speaker: Krish Sathian, MD, Ph.D.
Professor of Neurology, Rehabilitation Medicine & Psychology, Emory University
Interim Director, Rehabilitation R&D Center of Excellence, Atlanta VAMC, Georgia, USA
Abstract:
It is now established that visual cortical areas are routinely recruited, in a task-specific manner, for processing
those aspects of touch for which they are specialized in vision. Our recent work has shown that processing of
somatosensory information is segregated along dual pathways: a dorsal stream involved in processing “where”
information (location) and a ventral stream involved in processing “what” information (texture), with considerable
multisensory overlap. Both visual and haptic perception of object shape engage processing in the lateral occipital
complex (LOC) and intraparietal sulcus (IPS). The IPS regions (but not the LOC) are also active during visual and
haptic perception of object location. The common processing in IPS regions during perception of both shape and
location information suggests that activity in these regions during shape perception may reflect processing of the
relative locations of parts of objects, whereas LOC activity may reflect global shape processing. Visual object
imagery plays a role in LOC activation during haptic perception of familiar, but not unfamiliar, objects, whereas
spatial imagery may facilitate haptic perception of unfamiliar objects via pathways involving the IPS.
Price Center, Albert Einstein College of Medicine
Albert Einstein College of Medicine
of Yeshiva University
Jack and Pearl Resnick Campus
1300 Morris Park Avenue
Bronx, NY 10461
www.einstein.yu.edu