Transcript
Emergent Semantics:
Meaning and Metaphor
Jay McClelland
Department of Psychology and
Center for Mind, Brain, and Computation
Stanford University
The Parallel Distributed Processing Approach to Semantic Cognition
• Representation is a pattern of activation distributed over neurons within and across brain areas.
• Bidirectional propagation of activation underlies the ability to bring these representations to mind from given inputs.
• The knowledge underlying propagation of activation is in the connections.
• Experience affects our knowledge representations through a gradual connection-adjustment process.
Distributed Representations: Overlapping Patterns for Related Concepts
[Figure: distributed activation patterns over a shared set of units for dog, goat, and hammer]
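The idea of overlapping patterns can be made concrete with a toy computation. The sketch below uses made-up binary activation patterns for dog, goat, and hammer (an illustrative assumption, not data from the talk) and measures their overlap with cosine similarity.

```python
import numpy as np

# Hypothetical distributed patterns over 8 shared units (1 = unit active).
# The specific values are illustrative only.
patterns = {
    "dog":    np.array([1, 1, 0, 1, 0, 1, 0, 0], dtype=float),
    "goat":   np.array([1, 1, 0, 0, 1, 1, 0, 0], dtype=float),
    "hammer": np.array([0, 0, 1, 0, 0, 0, 1, 1], dtype=float),
}

def overlap(a, b):
    """Cosine similarity between two activation patterns."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

for x, y in [("dog", "goat"), ("dog", "hammer"), ("goat", "hammer")]:
    print(f"{x} vs {y}: {overlap(patterns[x], patterns[y]):.2f}")
# Related concepts (dog, goat) share active units, so their patterns overlap;
# the hammer pattern overlaps little with either.
```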
Responses of Four Neurons to Face and Non-Face Stimuli
Emergence of Meaning and Metaphor
• Learned distributed representations that capture important aspects of meaning emerge through a gradual learning process in simple connectionist networks.
• Metaphor arises naturally as a byproduct of learning information in homologous domains in models of this type.
Emergence of Meaning: Differentiation, Domain-Specificity, and Reorganization
The Rumelhart Model
The Training Data:
All propositions true of items at the bottom level of the tree, e.g.:
Robin can {grow, move, fly}
(the target output for the ‘robin can’ input)
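How such a training corpus can be encoded for a network is easy to sketch. In the snippet below, each example pairs an (item, relation) input with a multi-hot attribute target; only "Robin can {grow, move, fly}" comes from the slide, and the remaining items, relations, attributes, and facts are illustrative assumptions in the spirit of the original corpus.

```python
# Item/relation/attribute vocabularies and the extra facts are assumptions.
items = ["robin", "canary", "salmon", "oak", "rose", "pine"]
relations = ["isa", "is", "can", "has"]
attributes = ["grow", "move", "fly", "swim", "living", "pretty", "wings", "roots"]

corpus = {
    ("robin", "can"): {"grow", "move", "fly"},   # from the slide
    ("salmon", "can"): {"grow", "move", "swim"}, # assumed
    ("oak", "can"): {"grow"},                    # assumed
}

def one_hot(value, vocabulary):
    return [1.0 if v == value else 0.0 for v in vocabulary]

def encode(item, relation):
    """Input: concatenated one-hot item and relation; target: multi-hot attributes."""
    x = one_hot(item, items) + one_hot(relation, relations)
    t = [1.0 if a in corpus[(item, relation)] else 0.0 for a in attributes]
    return x, t

x, t = encode("robin", "can")   # t marks grow, move, and fly
```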
Forward Propagation of Activation
Activation flows forward from sending units j, through weights w_ij, to receiving units i:
  net_i = Σ_j a_j w_ij
The resulting activation a_i is then passed on through the next layer of weights w_ki.
Back Propagation of Error (δ)
Error signals propagate backward through the same weights:
  δ_i ∝ Σ_k δ_k w_ki
Error-correcting learning:
At the output layer:  δ_k ∝ (t_k - a_k),  Δw_ki = ε δ_k a_i
At the prior layer:   δ_i ∝ Σ_k δ_k w_ki,  Δw_ij = ε δ_i a_j
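To make the three slides above concrete, here is a minimal NumPy sketch of the same computations: forward propagation of activation, back propagation of error, and the error-correcting weight changes. The layer sizes, logistic activation function, and learning rate ε = 0.1 are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny network in the slide's notation: sending units j -> hidden units i
# (weights w_ij) -> output units k (weights w_ki). Sizes are assumptions.
n_in, n_hid, n_out = 8, 4, 8
w_ij = rng.normal(0.0, 0.1, size=(n_hid, n_in))
w_ki = rng.normal(0.0, 0.1, size=(n_out, n_hid))

def train_step(w_ij, w_ki, a_j, t_k, eps=0.1):
    # Forward propagation: net_i = sum_j a_j w_ij, a_i = f(net_i)
    a_i = logistic(w_ij @ a_j)
    a_k = logistic(w_ki @ a_i)

    # Output-layer error: delta_k ~ (t_k - a_k), scaled by the logistic slope.
    d_k = (t_k - a_k) * a_k * (1.0 - a_k)
    # Back propagation of error: delta_i ~ sum_k delta_k w_ki.
    d_i = (w_ki.T @ d_k) * a_i * (1.0 - a_i)

    # Error-correcting weight changes: Dw_ki = eps*delta_k*a_i, Dw_ij = eps*delta_i*a_j.
    w_ki = w_ki + eps * np.outer(d_k, a_i)
    w_ij = w_ij + eps * np.outer(d_i, a_j)
    return w_ij, w_ki

# One (made-up) input/target pair, trained for a few sweeps.
a_j = rng.integers(0, 2, size=n_in).astype(float)
t_k = rng.integers(0, 2, size=n_out).astype(float)
for _ in range(100):
    w_ij, w_ki = train_step(w_ij, w_ki, a_j, t_k)
```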
[Figure: the model's internal representations shown Early, Later, and Later Still in training, arranged along an axis labeled "Experience"]
Why Does the Model Show Progressive Differentiation?
• Learning is sensitive to patterns of coherent covariation.
• Coherent covariation: the tendency for properties of objects to co-vary in clusters.
What Drives Progressive Differentiation?
• Waves of differentiation reflect coherent covariation of properties across items.
• Patterns of coherent covariation are reflected in the principal components of the property covariance matrix (see the sketch below).
• Figure shows attribute loadings on the first three principal components:
  – 1. Plants vs. animals
  – 2. Birds vs. fish
  – 3. Trees vs. flowers
• Same color = features that covary within a component
• Different color = anti-covarying features
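The role of the principal components can be illustrated with a toy property covariance matrix. The items, properties, and 0/1 values below are illustrative assumptions (not the model's actual training corpus), chosen so that properties covary in clusters; the leading components of the resulting covariance matrix then line up with the plant/animal and finer-grained distinctions.

```python
import numpy as np

# Toy item-by-property matrix (rows: items, columns: properties); 1 = has property.
items = ["robin", "canary", "salmon", "sunfish", "oak", "pine", "rose", "daisy"]
props = ["can move", "can fly", "can swim", "has wings",
         "has gills", "has roots", "has bark", "has petals"]
P = np.array([
    [1, 1, 0, 1, 0, 0, 0, 0],  # robin
    [1, 1, 0, 1, 0, 0, 0, 0],  # canary
    [1, 0, 1, 0, 1, 0, 0, 0],  # salmon
    [1, 0, 1, 0, 1, 0, 0, 0],  # sunfish
    [0, 0, 0, 0, 0, 1, 1, 0],  # oak
    [0, 0, 0, 0, 0, 1, 1, 0],  # pine
    [0, 0, 0, 0, 0, 1, 0, 1],  # rose
    [0, 0, 0, 0, 0, 1, 0, 1],  # daisy
], dtype=float)

# Covariance of properties across items, and its principal components.
cov = np.cov(P, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]      # largest eigenvalue first

for c in range(3):
    loadings = eigvecs[:, order[c]]
    print(f"Component {c + 1}:")
    for name, w in zip(props, loadings):
        print(f"  {name:11s} {w:+.2f}")
# In this toy matrix the first component contrasts animal properties with
# plant properties; later components pick out finer distinctions (e.g. bird
# vs. fish properties), echoing the waves of differentiation described above.
```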
Sensitivity to Coherence Requires Convergence
[Figure: network diagrams contrasting convergent and non-convergent architectures]
Conceptual Reorganization (Carey, 1985)
• Carey demonstrated that young children ‘discover’ the unity of plants and animals as living things with many shared properties only around the age of 10.
• She suggested that the coalescence of the concept of living thing depends on learning about diverse aspects of plants and animals, including:
  – Nature of life-sustaining processes
  – What it means to be dead vs. alive
  – Reproductive properties
• Can reorganization occur in a connectionist net?
Conceptual Reorganization in the Model
• Suppose superficial appearance information, which is not coherent with much else, is always available…
• And there is a pattern of coherent covariation across information that is contingently available in different contexts (sketched below).
• The model forms initial representations based on superficial appearances.
• Later, it discovers the shared structure that cuts across the different contexts, reorganizing its representations.
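This training regime is easy to sketch. In the snippet below, superficial appearance features appear in every training example, while the deeper properties that covary coherently across plants and animals appear only when their context happens to be sampled; all item names, feature labels, and the sampling probability are illustrative assumptions.

```python
import random

random.seed(0)

# Each item has always-available appearance features and deeper, coherently
# covarying features that surface only in a "biology" context. All values
# here are made up for illustration.
ITEMS = {
    "robin":  {"appearance": {"small", "red-breasted"},
               "biology": {"is alive", "can die", "reproduces"}},
    "oak":    {"appearance": {"tall", "green"},
               "biology": {"is alive", "can die", "reproduces"}},
    "salmon": {"appearance": {"silver", "streamlined"},
               "biology": {"is alive", "can die", "reproduces"}},
}

def sample_example(p_biology=0.2):
    """Appearance info is always in the target; biological info only sometimes."""
    item = random.choice(list(ITEMS))
    target = set(ITEMS[item]["appearance"])       # always available
    if random.random() < p_biology:               # contingently available
        target |= ITEMS[item]["biology"]
    return item, target

# Early in training most examples carry only appearance information, so the
# model's first representations group items by superficial similarity; the
# cross-cutting 'living thing' structure only dominates once the coherent
# biological features have been encountered often enough.
```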
Organization of Conceptual Knowledge Early and Late in Development
Proposed Architecture for the Organization of Semantic Memory
[Figure: modality-specific regions for name, action, motion, color, valence, and form converging on the temporal pole, with the medial temporal lobe also indicated]
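This kind of architecture can be sketched as a simple hub-and-spokes network: modality-specific pools ("spokes") connected through a shared hub layer standing in for the temporal pole. The pool sizes, the single hub layer, and the one-pass read-out below are all simplifying assumptions, not the proposal itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Modality-specific pools and a shared hub; all sizes are arbitrary assumptions.
SPOKES = {"name": 10, "action": 8, "motion": 6, "color": 4, "valence": 2, "form": 12}
HUB_SIZE = 16

weights_in = {m: rng.normal(0, 0.1, size=(HUB_SIZE, n)) for m, n in SPOKES.items()}
weights_out = {m: rng.normal(0, 0.1, size=(n, HUB_SIZE)) for m, n in SPOKES.items()}

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def readout(inputs):
    """Drive the hub from the given spoke patterns, then read out every spoke."""
    drive = sum(weights_in[m] @ a for m, a in inputs.items())
    hub = logistic(drive)
    return {m: logistic(weights_out[m] @ hub) for m in SPOKES}

# Example: present a pattern on the 'form' spoke and read out predictions for
# name, action, motion, color, and valence (untrained weights, so the outputs
# are uninformative until the connections have been learned).
out = readout({"form": rng.integers(0, 2, SPOKES["form"]).astype(float)})
```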
Metaphor in Connectionist Models of Semantics
• By metaphor I mean: the application of a relation learned in one domain to a novel situation in another.
Hinton’s Family Tree Network
[Figure: network that takes Person 1 and Relation as input and produces Person 2 as output]
Understanding Via Metaphor in the Family Trees Network
Marco’s father is Pierro. Who is James’s father?
Christopher’s daughter is Victoria. Who is Roberto’s daughter?
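A minimal sketch of a family-trees-style network in the spirit of Hinton's model is shown below: Person 1 and Relation are embedded as small distributed codes, combined in a hidden layer, and decoded to a distribution over Person 2 candidates. The name list, relation list, layer sizes, and initialization are assumptions; only the people and relations mentioned on the slide come from the talk.

```python
import numpy as np

rng = np.random.default_rng(2)

# People and relations; entries beyond the slide's examples are assumptions.
PEOPLE = ["Christopher", "Victoria", "James", "Marco", "Pierro", "Roberto"]
RELATIONS = ["father", "mother", "son", "daughter"]

EMB, HID = 6, 12
emb_person = rng.normal(0, 0.1, size=(len(PEOPLE), EMB))
emb_rel = rng.normal(0, 0.1, size=(len(RELATIONS), EMB))
w_hidden = rng.normal(0, 0.1, size=(HID, 2 * EMB))
w_out = rng.normal(0, 0.1, size=(len(PEOPLE), HID))

def forward(person1, relation):
    """Person 1 + Relation -> probability distribution over Person 2."""
    x = np.concatenate([emb_person[PEOPLE.index(person1)],
                        emb_rel[RELATIONS.index(relation)]])
    h = np.tanh(w_hidden @ x)
    logits = w_out @ h
    p = np.exp(logits - logits.max())
    return p / p.sum()

# With untrained weights this is near-uniform noise; after backprop training
# on both families' facts, the shared, distributed relation code is what lets
# the net answer "Who is Roberto's daughter?" by analogy with the English
# family fact "Christopher's daughter is Victoria."
print(dict(zip(PEOPLE, forward("Roberto", "daughter").round(2))))
```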
Emergence of Meaning and Metaphor
• Learned distributed representations that capture important aspects of meaning emerge through a gradual learning process in simple connectionist networks.
• Metaphor arises naturally as a byproduct of learning information in homologous domains in models of this type.