CS621: Artificial Intelligence
Lecture 27: Backpropagation applied to recognition problems; start of logic
Pushpak Bhattacharyya
Computer Science and Engineering Department
IIT Bombay
Backpropagation algorithm
[Figure: a fully connected feed-forward network: input layer (n i/p neurons), hidden layers, output layer (m o/p neurons); w_ji denotes the weight on the connection from neuron i to neuron j.]
• Fully connected feed-forward network
• Pure FF network (no jumping of connections over layers)
General Backpropagation Rule
• General weight updating rule:
$$\Delta w_{ji} = \eta\,\delta_j\,o_i$$
• Where
$$\delta_j = \begin{cases} (t_j - o_j)\,o_j(1 - o_j) & \text{for the outermost layer} \\[4pt] \Big(\sum_{k \in \text{next layer}} w_{kj}\,\delta_k\Big)\,o_j(1 - o_j) & \text{for hidden layers} \end{cases}$$
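A minimal NumPy sketch of this rule for one hidden layer, assuming sigmoid units throughout; the array names and shapes are illustrative, not from the lecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(x, t, W1, W2, eta=0.1):
    """One update with the general rule: delta_w_ji = eta * delta_j * o_i."""
    h = sigmoid(W1 @ x)                  # hidden-layer outputs
    y = sigmoid(W2 @ h)                  # output-layer outputs
    # Outermost layer: delta_j = (t_j - o_j) * o_j * (1 - o_j)
    d_out = (t - y) * y * (1 - y)
    # Hidden layer: delta_j = (sum_k w_kj * delta_k) * o_j * (1 - o_j)
    d_hid = (W2.T @ d_out) * h * (1 - h)
    W2 += eta * np.outer(d_out, h)       # w_kj += eta * delta_k * o_j
    W1 += eta * np.outer(d_hid, x)       # w_ji += eta * delta_j * o_i
    return W1, W2
```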
Local Minima
Due to the greedy nature of BP, it can get stuck in a local minimum m and never reach the global minimum g, since the error can only decrease with each weight change.
Momentum factor
1. Introduce a momentum factor:
$$(\Delta w_{ji})_{n^{\text{th}}\text{ iteration}} = \eta\,\delta_j\,o_i + \beta\,(\Delta w_{ji})_{(n-1)^{\text{th}}\text{ iteration}}$$
• Accelerates the movement out of the trough.
• Dampens oscillation inside the trough.
• Choosing β: if β is large, we may jump over the minimum.
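A sketch of the same update with the momentum term carried across iterations (function and variable names are mine, not from the lecture):

```python
import numpy as np

def momentum_update(W, grad_term, dW_prev, eta=0.1, beta=0.9):
    """(dW)_n = eta * delta_j * o_i  +  beta * (dW)_{n-1}"""
    dW = eta * grad_term + beta * dW_prev
    return W + dW, dW    # carry dW forward as the next iteration's dW_prev

# Usage inside a training loop (grad_term = np.outer(d_out, h), etc.):
# W2, dW2_prev = momentum_update(W2, np.outer(d_out, h), dW2_prev)
```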
Symmetry breaking
• If the mapping demands different weights, but we start with the same weights everywhere, then BP will never converge.
[Figure: XOR network: an output neuron with θ = 0.5 and weights w1 = w2 = 1 from two hidden neurons computing x1·x̄2 and x̄1·x2; the hidden units receive input weights of 1 and -1, with thresholds of 1.5.]
XOR n/w: if we started with identical weights everywhere, BP will not converge.
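A small demonstration of why (a hypothetical setup, with thresholds/biases omitted for brevity): if the two hidden units start identical, they receive identical deltas and remain clones forever, so the network can never compute XOR:

```python
import numpy as np

sigmoid = lambda z: 1 / (1 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])        # XOR targets

W1 = np.ones((2, 2))                      # identical weights everywhere
W2 = np.ones(2)
for _ in range(1000):
    for x, t in zip(X, T):
        h = sigmoid(W1 @ x)
        y = sigmoid(W2 @ h)
        d_out = (t - y) * y * (1 - y)
        d_hid = W2 * d_out * h * (1 - h)
        W2 += 0.5 * d_out * h
        W1 += 0.5 * np.outer(d_hid, x)

print(W1)   # both rows are still identical: the hidden units never differentiate
```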
Backpropagation Applications
Feed-forward network architecture:
• O/P layer: defined by the problem
• Hidden layer: decided by trial and error
• I/P layer: defined by the problem
Digit Recognition Problem
• Digit recognition:
– 7-segment display
– A segment being on/off defines a digit
[Figure: 7-segment display with segments numbered 1–7.]
[Figure: the network: input neurons 1–7 (Seg-1 … Seg-7), full connection to a hidden layer, full connection to the output neurons (one per digit).]
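A compact sketch of this network; the 7-segment encodings and the hidden-layer size of 8 are illustrative assumptions (the lecture leaves the hidden size to trial and error), and a constant bias input is added for trainability:

```python
import numpy as np

# Standard 7-segment encodings (segments a-g) for digits 0-9 -- illustrative,
# not taken from the lecture.
SEG = ["1111110", "0110000", "1101101", "1111001", "0110011",
       "1011011", "1011111", "1110000", "1111111", "1111011"]
X = np.array([[int(b) for b in s] + [1.0] for s in SEG])   # +1 = bias input
T = np.eye(10)                                             # one o/p neuron per digit

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.5, (8, 8))    # 7 inputs + bias -> 8 hidden units
W2 = rng.normal(0, 0.5, (10, 9))   # 8 hidden + bias -> 10 output neurons

sigmoid = lambda z: 1 / (1 + np.exp(-z))
for _ in range(5000):
    for x, t in zip(X, T):
        h = np.append(sigmoid(W1 @ x), 1.0)    # hidden outputs + bias
        y = sigmoid(W2 @ h)
        d_out = (t - y) * y * (1 - y)
        d_hid = (W2[:, :-1].T @ d_out) * h[:-1] * (1 - h[:-1])
        W2 += 0.5 * np.outer(d_out, h)
        W1 += 0.5 * np.outer(d_hid, x)

pred = [np.argmax(sigmoid(W2 @ np.append(sigmoid(W1 @ x), 1.0))) for x in X]
print(pred)    # should recover 0..9
```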
Example - Character Recognition
• Output layer: 26 neurons (all capital letters)
• The first output neuron has the responsibility of detecting all forms of ‘A’
• This is a centralized representation of the outputs
• In distributed representations, all output neurons participate in the output
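A tiny illustration of the two output encodings; the 5-bit binary code used for the distributed case is an arbitrary choice, not from the lecture:

```python
import numpy as np

letters = [chr(c) for c in range(ord('A'), ord('Z') + 1)]

# Centralized: one dedicated output neuron per letter (26 neurons)
centralized = {ch: np.eye(26)[i] for i, ch in enumerate(letters)}

# Distributed: all output neurons participate in every output;
# here an (arbitrary) 5-bit binary code over 5 neurons
distributed = {ch: np.array([int(b) for b in format(i, '05b')])
               for i, ch in enumerate(letters)}

print(centralized['B'])   # [0 1 0 ... 0] -- only neuron 2 is responsible
print(distributed['B'])   # [0 0 0 0 1] -- pattern spread across all neurons
```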
An Application in the Medical Domain
Expert System for Skin Disease Diagnosis
• Bumpiness and scaliness of skin
• Mostly for symptom gathering and for developing diagnosis skills
• Not a replacement for the doctor’s diagnosis
Architecture of the FF NN
• 96-20-10
• 96 input neurons, 20 hidden-layer neurons, 10 output neurons
• Inputs: skin disease symptoms and their parameters
– Location, distribution, shape, arrangement, pattern, number of lesions, presence of an active border, amount of scale, elevation of papules, color, altered pigmentation, itching, pustules, lymphadenopathy, palmar thickening, results of microscopic examination, presence of herald patch, result of the dermatology test called KOH
Output
• 10 neurons indicative of the diseases:
– psoriasis, pityriasis rubra pilaris, lichen planus, pityriasis rosea, tinea versicolor, dermatophytosis, cutaneous T-cell lymphoma, secondary syphilis, chronic contact dermatitis, seborrheic dermatitis
Training data
• Input specs of 10 model diseases from 250 patients
• 0.5 if some specific symptom value is not known
• Trained using the standard error backpropagation algorithm
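A structural sketch of this setup; the patient data below is random placeholder data, since the real encodings are not given in the lecture. Only the 96-20-10 shape, the 0.5-for-unknown convention, and standard error backpropagation come from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder stand-in for the 250-patient training set:
# 96 symptom inputs per patient, 10 one-hot disease outputs.
X = rng.random((250, 96))
X[rng.random(X.shape) < 0.1] = 0.5    # 0.5 wherever a symptom value is unknown
T = np.eye(10)[rng.integers(0, 10, 250)]

# 96-20-10 network trained with standard error backpropagation
W1 = rng.normal(0, 0.1, (20, 96))
W2 = rng.normal(0, 0.1, (10, 20))
sigmoid = lambda z: 1 / (1 + np.exp(-z))
for _ in range(100):
    for x, t in zip(X, T):
        h = sigmoid(W1 @ x)
        y = sigmoid(W2 @ h)
        d_out = (t - y) * y * (1 - y)
        d_hid = (W2.T @ d_out) * h * (1 - h)
        W2 += 0.1 * np.outer(d_out, h)
        W1 += 0.1 * np.outer(d_hid, x)
```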
Testing
• Previously unused symptom and disease data of 99 patients
• Results:
– Correct diagnosis achieved for 70% of the papulosquamous group of skin diseases
– Success rate above 80% for the remaining diseases, except for psoriasis
– Psoriasis diagnosed correctly in only 30% of the cases
– Psoriasis resembles other diseases within the papulosquamous group and is somewhat difficult even for specialists to recognise.
Explanation capability
• Rule-based systems reveal the explicit path of reasoning through textual statements
• Connectionist expert systems reach conclusions through complex, non-linear, simultaneous interactions of many units
• Analysing the effect of a single input or a single group of inputs would be difficult and would yield incorrect results
Explanation contd.
• The hidden layer re-represents the data
• Outputs of hidden neurons are neither symptoms nor decisions
[Figure: symptoms & parameters (duration of lesions in weeks, minimal itching, positive KOH test, lesions located on feet, minimal increase in pigmentation, positive test for pseudohyphae and spores, among others, plus a bias unit; 96 inputs in all) feed an internal (hidden) representation of 20 units, which feeds the disease-diagnosis outputs, including the psoriasis, dermatophytosis, and seborrheic dermatitis nodes.]
Figure: Explanation of a dermatophytosis diagnosis using the DESKNET expert system.
Discussion
• Symptoms and parameters contributing to the diagnosis found from the n/w
• Standard deviation, mean, and other tests of significance used to arrive at the importance of the contributing parameters
• The n/w acts as an apprentice to the expert
Exercise
• Find the weakest condition for symmetry breaking. It is not the case that the network faces the symmetry problem only when ALL weights are equal.
Logic
Logic and inferencing
[Figure: logic and inferencing at the core of AI: Search, Reasoning, Learning, and Knowledge, surrounded by the application areas Vision, NLP, Robotics, Expert Systems, and Planning.]
Obtaining the implications of given facts and rules -- the hallmark of intelligence
Inferencing through:
− Deduction (general to specific)
− Induction (specific to general)
− Abduction (conclusion to hypothesis, in the absence of any evidence to the contrary)
Deduction
Given:
All men are mortal (rule)
Shakespeare is a man (fact)
To prove:
Shakespeare is mortal (inference)
Induction
Given:
Shakespeare is mortal
Newton is mortal
Dijkstra is mortal (observations)
To prove:
All men are mortal (generalization)
If there is rain, then there will be no picnic
Deduction:
Fact 1: There was rain
Conclude: There was no picnic
Abduction:
Fact 2: There was no picnic
Conclude: There was rain (?)
Induction and abduction are fallible forms of reasoning; their conclusions are susceptible to retraction.
Two systems of logic
1) Propositional calculus
2) Predicate calculus
Propositions
− Stand for facts/assertions
− Declarative statements
− As opposed to interrogative statements (questions) or imperative statements (requests, orders)
Operators
AND (∧), OR (∨), NOT (~), IMPLICATION (⇒)

⇒ and ¬ form a minimal set (can express the other operations)
- Prove it.
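As a hint (these encodings are standard ones, not spelled out in the slides), the other operators can be written using ⇒ and ¬ alone:

$$P \lor Q \;\equiv\; \lnot P \Rightarrow Q, \qquad P \land Q \;\equiv\; \lnot(P \Rightarrow \lnot Q)$$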
Tautologies are formulae whose truth value is always T, whatever the assignment is.
Model
In propositional calculus, any formula with n propositions has $2^n$ models (assignments).
- Tautologies evaluate to T in all models.
Examples:
1) $P \lor \lnot P$
2) $\lnot(P \land Q) \Leftrightarrow (\lnot P \lor \lnot Q)$ (De Morgan with AND)
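Checking a formula in all $2^n$ models can be done mechanically; a small Python sketch (the function name is mine):

```python
from itertools import product

def is_tautology(formula, n_vars):
    """True iff the formula evaluates to T in all 2^n models."""
    return all(formula(*vals) for vals in product([True, False], repeat=n_vars))

# Example 1: P v ~P
print(is_tautology(lambda p: p or not p, 1))                              # True
# Example 2 (De Morgan with AND): ~(P ^ Q) <=> (~P v ~Q)
print(is_tautology(lambda p, q: (not (p and q)) == (not p or not q), 2))  # True
```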
Semantic Tree/Tableau method of proving tautology
Start with the negation of the formula:
$\lnot[\lnot(P \land Q) \Rightarrow (\lnot P \lor \lnot Q)]$ (α-formula)
An α-formula adds both of its components to the path; a β-formula splits the path. The negated implication yields:
$\lnot(P \land Q)$ (β-formula)
$\lnot(\lnot P \lor \lnot Q)$ (α-formula), which yields $P$ and $Q$
The β-formula then branches into $\lnot P$ and $\lnot Q$, so every path contains a contradiction ($P$ with $\lnot P$, or $Q$ with $\lnot Q$) and the formula is a tautology.
Example 2:
$\lnot[(A \lor (B \land C)) \Rightarrow ((A \lor B) \land (A \lor C))]$ (α-formula)
This yields:
$A \lor (B \land C)$ (β-formula)
$\lnot((A \lor B) \land (A \lor C))$ (β-formula), branching into $\lnot(A \lor B)$ and $\lnot(A \lor C)$
$\lnot(A \lor B)$ yields $\lnot A$ and $\lnot B$; $\lnot(A \lor C)$ yields $\lnot A$ and $\lnot C$ (α-formulae)
On each branch, $A \lor (B \land C)$ splits into $A$ and $B \land C$, and $B \land C$ yields $B$ and $C$.
Contradictions in all paths ($A$ with $\lnot A$, $B$ with $\lnot B$, or $C$ with $\lnot C$), so the formula is a tautology.
A puzzle
(Zohar Manna, Mathematical Theory of Computation, 1974)
From Propositional Calculus
Tourist in a country of truth-sayers and liars
• Facts and Rules: In a certain country, people either always speak the truth or always lie. A tourist T comes to a junction in the country and finds an inhabitant S of the country standing there. One of the roads at the junction leads to the capital of the country and the other does not. S can be asked only yes/no questions.
• Question: What single yes/no question can T ask of S, so that the direction of the capital is revealed?
Diagrammatic representation
[Figure: a road junction, one road leading to the Capital; S (who either always tells the truth or always lies) stands at the junction; T is the tourist.]
Deciding the propositions: a very difficult step; needs human intelligence
• P: The left road leads to the capital
• Q: S always speaks the truth
Meta question: What question should the tourist ask?
• The form of the question
• Very difficult: needs human intelligence
• The tourist should ask
– Is R true?
– The answer is “yes” if and only if the left road leads to the capital
– The structure of R is to be found as a function of P and Q
A more mechanical part: use of the truth table

P  Q  R  S's answer
T  T  T  Yes
T  F  F  Yes
F  T  F  No
F  F  T  No
Get form of R: quite mechanical
• From the truth table
– R is of the form (P x-nor Q), i.e. (P ≡ Q)
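A quick mechanical check of this answer over all four models (a sketch: the truth-teller answers R, the liar answers the negation of R):

```python
from itertools import product

# P: the left road leads to the capital; Q: S always speaks the truth.
# The tourist asks "Is R true?" with R = (P <=> Q).
for P, Q in product([True, False], repeat=2):
    R = (P == Q)
    answer = R if Q else not R    # a liar negates the true value of R
    assert answer == P            # "yes" exactly when the left road leads to the capital
print("S answers 'yes' iff the left road leads to the capital.")
```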
Get R in English/Hindi/Hebrew…
• Natural Language Generation: non-trivial
• The question the tourist will ask is
– Is it true that the left road leads to the capital if and only if you speak the truth?
• Exercise: A better-known form of this question asked by the tourist uses the X-OR operator instead of the X-NOR. What changes do you have to incorporate into the solution to get that answer?