Cognition-Based Intelligent Agent Design: Perception
Associative computer: a hybrid
connectionistic production system
Action Editor : John Barnden
Presenter : Choi Bong-hwan, 04/07/2009
Outline
• Introduce the Associative computer
= "a connectionistic hybrid production system"
– relies on : distributed representation
– uses : associative memory
– acts as : production system
– contribution : learns from experience
• Explain the "Associative computer"
– Visual representation of state
– Associative memory for state transitions
– Permutation associative memory
– Problem space
• Demonstrated by empirical experiments in the block world
– what the block world is
Motivated from
Biology : Neural assembly theory
• bridge between the structures found in the nervous system
and high-level cognition such as problem solving
– an assembly of neurons
• acts as a closed system, represents a complex object
• activation of some neurons → activation of the entire assembly ( Hebb, 1958; Palm, 1993 )
• Associative memory
– Neural net model + assembly concept ( Palm, 1982 )
– A group of interconnected neurons = Hebbian network
• stores patterns → when a new pattern is presented,
a pattern is formed which closely resembles a stored one
• The pump of thought model
– Theoretical assembly model (Braitenberg, 1973,1984; Palm, 1982)
– How thoughts represented by assemblies
• can be propagated and changed by the brain
– The transformation of thoughts through a sequence of assemblies
• describe process of human problem solving (Braitenberg, 1978; Palm, 1982)
Motivated from
Psychology : Mental representation theory
• Thoughts = Description of complex objects
– Complex objects : structured and formed by different fragments
• can be represented by categories (Smith, 1995).
• categorical representation : how to deal with similarity between objects
• Complex Object description
– verbal : prototypical features
– visual (picture) : detailed shape representation
– by binary pictograms : size + orientation (Feldman, 1985).
• Similarity = the amount of shared area
(Biederman & Ju, 1988; Kurbat, Smith, & Medin, 1994; Smith & Sloman, 1994).
• items = ( vectors or vector parts ), not symbols
(Anderson, 1995a; Gärdenfors, 2000; McClelland & Rumelhart, 1985; Wichert, 2000,
2001).
Motivated from
Computer science : Production system
• Production systems = composed of productions
– production = if–then rules
– One of the most successful models of human problem solving
• (Anderson, 1983; Klahr & Waterman, 1986; Newell & Simon, 1972; Newell, 1990)
• how to form a sequence of actions which lead to a goal
(Newell, 1990; Winston, 1992).
• Memory components
– Long-term memory : complete set of productions
• precondition = triggered by specific combinations of symbols
– Short-term memory : Problem-space
• "state" = human thought or situation
• computation (action) = stepwise transformation
• Searching : backtracking + avoiding repetitions
• (Anderson, 1995b; Newell & Simon, 1972; Newell, 1990)
– Problem description = initial state + desired state.
– Solution = set of productions [ initial state → desired state ]
• choose actions by heuristic functions
( = specified depending on the problem domain )
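The production-system cycle above can be sketched in a few lines of Python (a hypothetical toy, not the paper's implementation): productions are if–then rules over a set of working-memory symbols, and breadth-first search with repetition avoidance chains them from the initial state to the desired state.

```python
from collections import deque

# A production: (name, precondition symbols, symbols to add, symbols to delete).
# The rule names and symbols are invented for illustration.
productions = [
    ("pickup", {"clear_A", "hand_empty"}, {"holding_A"}, {"clear_A", "hand_empty"}),
    ("putdown", {"holding_A"}, {"clear_A", "hand_empty"}, {"holding_A"}),
]

def solve(initial, goal):
    """Breadth-first search over the problem space, avoiding repeated states."""
    seen = {frozenset(initial)}
    queue = deque([(frozenset(initial), [])])
    while queue:
        state, plan = queue.popleft()
        if goal <= state:                      # desired state reached
            return plan
        for name, pre, add, delete in productions:
            if pre <= state:                   # precondition matches working memory
                nxt = frozenset((state - delete) | add)
                if nxt not in seen:            # avoid repetitions
                    seen.add(nxt)
                    queue.append((nxt, plan + [name]))
    return None

print(solve({"clear_A", "hand_empty"}, {"holding_A"}))  # -> ['pickup']
```

The solution is exactly the "set of productions [ initial state → desired state ]" of the slide, found here by blind search; heuristic ordering is introduced later.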
Related models
Connectionistic models
• rule-based reasoning + ( distributed | localist )
representation
– A two-level neural system (Sun, 1995)
• distributed(level 2) and localistic(level 1) representation (Acyclic directed graph)
• 1st level : preconditions and conclusions → localist, linked to 2nd level's features
• 2nd level : distributed rules, uncertainty → ANN + reinforcement learning
– DCPS: Distributed connectionist production system (Touretzky, 1985)
• production rule = premise + a conclusion
– premise = two triples, matched against the working memory
– conclusion = commands for adding and deleting triples of the WM
• no backtracking and no learning
• Statistical models
– recurrent neural nets
• no separation of the problem space and the
problem-dependent knowledge
• less transparency
Associative computer
Introduce
• Based on the connectionistic production system
– different heuristic functions + learned from experience
– The states correspond to pictograms.
• Example domain : the block world
• ≡ A production system
– solves problems by forming a chain of associations
• a sequence of actions which leads to a solution of a problem
• Permutation associative memory (Wichert, 2001)
• The associations : stored in a new associative memory
• learning from experience + using an additional associative memory
→ Learning from experience
– which associations should be used (heuristics) results
from the distributed representation of the problems
Associative computer
Structured binary vector representation
• Structuring
– Used by the permutation associative memory
– during recognition and execution
– without crosstalk and with graceful degradation
• Similarity (Sim)
– a, b : binary pattern vectors, a ≠ b
• Quality criterion (qc)
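The transcript drops the Sim and qc formulas themselves; as a stand-in, a common overlap measure for binary vectors (an assumption here, not necessarily the paper's exact definition) can be sketched as:

```python
import numpy as np

def sim(a, b):
    """Overlap similarity of two binary vectors: shared 1-bits over total 1-bits.
    This is one conventional choice; the paper's Sim/qc may differ in detail."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return (a & b).sum() / max((a | b).sum(), 1)

a = np.array([1, 1, 0, 1, 0])
b = np.array([1, 0, 0, 1, 1])
print(sim(a, b))  # 2 shared bits out of 4 active overall -> 0.5
```

Measuring similarity as shared active area matches the slide's earlier point that similarity between pictograms is "the amount of shared area".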
Associative computer
Structured representation
• Transition → a pair of binary pictograms
• Cognitive entities : pieces of an object that represent a scene
– 'what' pathway : visual categorization (Posner, 1994), temporal lobe
– 'where' pathway : parietal lobe
Associative computer
representation of Association
• Frame problem (Winston, 1992)
– which parts of the description should change and which should not
– an empty cognitive entity is required
• The accepted uncertainty
– Dependent on the threshold value
Associative computer
Associative memory for state transitions
• Associative memory
– model of the long-term memory for stored associations
– a single input → several possible associations arise
• cannot be learned by one associative memory (Anderson, 1995)
– a nonlinear mechanism is required
• select one branch or avoid the sum of output branches (Anderson, 1995)
Traditional associative memory model
• unstructured pictograms stored in, and represented by, binary vectors
• Lernmatrix ( Steinbuch )
– Permutation associative memory → composed of Lernmatrices
– composed of a cluster of units
– unit : simple model of
a real biological neuron
– learning : process of association
• units indicate 'one' or 'zero'
T : threshold of the unit
wij : weight of the connection from input j to unit i
Associative computer
Associative memory : Detail
• Learning ( binary Hebb rule )
– Initialization phase
– No information stored
– Information = weight ( wij )
• Backward projection ( y → x )
– reverse of retrieval
• x = question, y = answer
• Retrieval ( x → y )
– Phase 1 : recall the appropriate answer
• fault-tolerant answering mechanism
– most similar learned xl
• to the presented question
– by Hamming distance
→ appropriate answer
• Reliability of the answer
– normalized contrast model
(Smith, 1995; Tversky & Kahneman, 1973)
– xl : x recalled from y by backward projection
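The learning and retrieval phases above can be illustrated with a minimal Lernmatrix sketch (a simplified reading of Steinbuch's model under the binary Hebb rule; class and method names are invented for illustration, and the threshold is taken as the maximum activation rather than the paper's exact rule):

```python
import numpy as np

class Lernmatrix:
    def __init__(self, n_in, n_out):
        self.W = np.zeros((n_out, n_in), dtype=np.uint8)  # binary weights, no info stored

    def learn(self, x, y):
        # Binary Hebb rule: set w_ij = 1 wherever both y_i and x_j are active.
        self.W |= np.outer(y, x).astype(np.uint8)

    def retrieve(self, x):
        # Units whose summed input reaches the threshold fire; thresholding at
        # the maximum makes retrieval fault tolerant to a few corrupted bits.
        s = self.W @ x
        return (s >= s.max()).astype(np.uint8) if s.max() > 0 else np.zeros_like(s)

    def backward(self, y):
        # Backward projection y -> x: run the transposed weight matrix.
        s = self.W.T @ y
        return (s >= s.max()).astype(np.uint8) if s.max() > 0 else np.zeros_like(s)

m = Lernmatrix(4, 3)
m.learn(np.array([1, 1, 0, 0]), np.array([0, 1, 0]))
print(m.retrieve(np.array([1, 1, 0, 0])))   # -> [0 1 0]
print(m.retrieve(np.array([1, 0, 0, 0])))   # corrupted question, same answer
print(m.backward(np.array([0, 1, 0])))      # -> [1 1 0 0]
```

The backward projection recovers the learned xl from the answer y, which is what the normalized-contrast reliability check above compares against the presented question.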
Associative computer
Permutation associative memory (1)
• δ-permutations of a Δ-set
– a state is represented by Δ cognitive entities
→ Association = transition between the pictograms
– Premise : δ cognitive entities which form a correlation of objects
[ should be present ]
– IF the state matches the premise THEN the δ cognitive entities of the conclusion
• in general : δ << Δ
– In the recognition phase
• all possible δ-permutations of Δ cognitive entities
should be composed to test if the premise of
an association is valid
– In the retrieval phase
• Ξ = Δ! / (Δ − δ)! permutations are formed
– i) question → answer
– ii) if qc < threshold then associate
– Permutation problem : reducing the
computation over all permutations
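A naive sketch of the recognition step (hedged: this enumerates all Ξ permutations brute-force, which is precisely the cost the permutation associative memory is designed to reduce; the entity names and premise set are invented):

```python
from itertools import permutations

def recognize(state_entities, premises, delta):
    """Test every δ-permutation of the Δ state entities against stored premises.
    The 'permutation problem': Δ!/(Δ-δ)! candidates must be checked when no
    reduction (constraints, parallel memory parts) is applied."""
    return [p for p in permutations(state_entities, delta) if p in premises]

state = ["A", "B", "C", "D"]                 # Δ = 4 cognitive entities
premises = {("B", "A"), ("C", "D")}          # stored premises with δ = 2
print(recognize(state, premises, 2))         # -> [('B', 'A'), ('C', 'D')]
```

With Δ = 4 and δ = 2 there are Ξ = 4!/2! = 12 candidate permutations; the constraints and parallel memory parts of the next slide exist to prune exactly this enumeration.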
Associative computer
Permutation associative memory (2)
• Parts
– Permute the δ arrangement of
entities → same answer as before the permutation
• δ parts of the associative memory are permuted
– R ( parts of ) associative memories
• computed in parallel
• Constraints : check facts and thresholds
– reduce the # of possible combinations of
possible associative memories
Associative computer
Permutation associative memory (3)
• A model of the thalamus
– Spotlight theory (Downing & Pinker, 1985)
• recognition of visual objects by the brain corresponds to a spotlight
• Retrieval : searchlight model ( thalamus ) (Crick, 2003)
≒ spotlight
– Attention ∝ a spotlight (Kosslyn, 1994; Posner, 1994)
• cued to a location and shifted as necessary
• by the mechanism of an attention window
– Binding stage
• associative memories are
formed successively
Associative computer
Problem Space (1) : Representation
• Representation
– synchronous : models the sequence of carried-out states
• a state : represented by cognitive entities
• a sequence of states (pictograms) described by cognitive entities can be
represented by connected units
Associative computer
Problem Space (2) : Linkage
• Linkage
– A pattern matcher
• compute qcCa(b(i)) → if the criterion fails,
mark the chain as disabled
– Ca = category, b = state
• if ( qcCa(b(i)) = 1 ) then the desired state is reached
– Backtracker
• if [ all units in l are disabled ] then
enable all units
→ together they implement the search algorithm
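The pattern matcher / backtracker pair behaves like depth-first search with repetition avoidance; a toy sketch (the unit-level enable/disable machinery is abstracted into a visited set, and the state space is an invented example):

```python
def search(state, goal_test, successors, visited=None):
    """Depth-first search: the pattern matcher is goal_test, the backtracker
    is the recursive unwinding when every successor is 'disabled' (visited)."""
    visited = visited if visited is not None else set()
    visited.add(state)
    if goal_test(state):                       # qc = 1: desired state reached
        return [state]
    for nxt in successors(state):
        if nxt not in visited:                 # skip disabled (visited) units
            path = search(nxt, goal_test, successors, visited)
            if path:
                return [state] + path
    return None                                # all branches disabled -> backtrack

# Toy chain 0 -> 1 -> 2 -> 3, with extra branches that trigger backtracking.
succ = lambda s: [s + 1, -s] if s < 3 else []
print(search(0, lambda s: s == 3, succ))       # -> [0, 1, 2, 3]
```

Re-enabling all units when a whole layer is disabled corresponds here to unwinding one recursion level and trying the next branch.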
Associative computer
Problem Space (3)
• Pattern heuristics
– qcCa(b(i)) → interpreted by h#()
• h# is a heuristic function for calculating the
distance to the desired state
• h0 : blind search
• h1 : heuristic for the block world
• Prediction heuristics
– search for similar problems to
speed up
– prediction associative memory
• after "learning", the sequence can be
recalled
• Learning strategy
– Unsupervised learning
– Hebb rules
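A heuristic function plugs into the search as an ordering criterion; a hypothetical sketch where h ranks frontier states by estimated distance to the goal (a constant h corresponds to blind search h0; a real block-world h1 would encode domain knowledge, which is not reproduced here):

```python
import heapq

def best_first(start, goal, successors, h):
    """Greedy best-first search: always expand the state with the smallest h."""
    frontier = [(h(start, goal), start, [start])]
    seen = {start}
    while frontier:
        _, state, path = heapq.heappop(frontier)
        if state == goal:
            return path
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                heapq.heappush(frontier, (h(nxt, goal), nxt, path + [nxt]))
    return None

# Toy 1-D world: move left or right on the integers.
succ = lambda s: [s - 1, s + 1]
h1 = lambda s, g: abs(s - g)                   # distance-to-goal heuristic
print(best_first(0, 4, succ, h1))              # -> [0, 1, 2, 3, 4]
```

The prediction heuristics of the slide would replace h with a lookup in the prediction associative memory, recalling the action sequence of a previously solved, similar problem.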
Associative computer
Architecture
Associative computer
Experiments : Geomatrix blocks world