Introduction to Artificial Intelligence and Search Problems

Artificial Intelligence
Lesson 1
About
• Lecturer: Prof. Sarit Kraus
• TA: Galit Haim: [email protected]
• (almost) All you need can be found on the
course website:
– http://u.cs.biu.ac.il/~haimga/Teaching/AI/
Course Requirements 1
• The grade is comprised of 70% exam and 30% exercises.
• 3 programming exercises will be given. Work individually.
• All the exercises count toward the final grade: 10% each.
• Exercises must be written in C++ or Java only. They must compile and run on the planet machine, and will be submitted via “submit”. Be precise!
Course Requirements 2
• Exercises are not hard, but work is required. Plan your
time ahead!
• When sending me mail please include the course number
(89-570) in the header, to pass the automatic spam filter.
• You (probably) will be required to participate in AI
experiments.
• See other general rules in:
http://u.cs.biu.ac.il/~haimga/Teaching/AI/assignments/general-rules.pdf
Course Schedule
• Lesson 1:
– Introduction
– Translating a general problem into a graph-search problem.
• Lesson 2
– Uninformed Search (BFS, DFS etc.).
• Lesson 3
– Informed Search (A*, Best-First-Search, etc.).
Course Schedule – Cont.
• Lesson 4
– Local Search (Hill Climbing, Genetic
algorithms etc.).
• Lesson 5
– “Search algorithms” chapter summary.
• Lesson 6-7
– Game-Trees: Min-Max & Alpha-Beta
algorithms.
Course Schedule – Cont.
• Lesson 8-9
– Planning: STRIPS algorithm
• Lesson 10-11-12
– Learning: Decision Trees, Neural Networks,
Naïve Bayes, Bayesian Networks and more.
• Lesson 13: Robotics
• Lesson 14
– Questions and exercise.
AI – Alternative Definitions
• Elaine Rich and Kevin Knight: AI is the study of how to
make computers do things at which, at the moment, people
are better.
• Stuart Russell and Peter Norvig: [AI] has to do with
smart programs, so let's get on and write some.
• Claudson Bornstein: AI is the science of common sense.
• Douglas Baker: AI is the attempt to make computers do
what people think computers cannot do.
• Astro Teller: AI is the attempt to make computers do what
they do in the movies.
AI Domains
• Games – chess, checkers, tile puzzle.
• Expert systems
• Speech recognition and Natural language
processing, Computer vision, Robotics.
AI & Search
• "The two most fundamental concerns of AI
researchers are knowledge representation and
search”
• “knowledge representation … addresses the
problem of capturing in a language…suitable for
computer manipulation”
• “Search is a problem-solving technique that
systematically explores a space of problem
states”.
(Luger, G.F., Artificial Intelligence: Structures and Strategies for Complex Problem Solving)
Solving Problems with Search
Algorithms
• Input: a problem P.
• Preprocessing:
– Define states and a state space
– Define Operators
– Define a start state and goal set of states.
• Processing:
– Run a search algorithm to find a path from
the start state to one of the goal states.
Example - Missionaries & Cannibals
• State space – [M,C,B]: the number of missionaries and cannibals on the starting bank, and the boat position (1 = starting bank, 0 = far bank).
• Initial State – [3,3,1]
• Goal State – [0,0,0]
• Operators – adding or subtracting the vectors [1,0,1], [2,0,1], [0,1,1], [0,2,1] or [1,1,1] (a code sketch of this formulation follows below).
• Path – moves from [3,3,1] to [0,0,0]
• Path Cost – number of river trips
• http://www.plastelina.net/game2.html
• http://www.youtube.com/watch?v=W9NEWxabGmg
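As a concrete illustration, here is a minimal Java sketch of this formulation (assuming the standard 3-missionaries / 3-cannibals version; the class and method names are illustrative, not course code):

// A sketch of the [M,C,B] state encoding and its operators; names are illustrative.
import java.util.ArrayList;
import java.util.List;

public class MissionariesCannibals {
    // State: missionaries and cannibals on the starting bank, plus the boat flag (1 = starting bank).
    record State(int m, int c, int b) {}

    // The five possible boat loads: [1,0,1], [2,0,1], [0,1,1], [0,2,1], [1,1,1].
    static final int[][] MOVES = {{1,0,1}, {2,0,1}, {0,1,1}, {0,2,1}, {1,1,1}};

    // A state is legal if on neither bank the missionaries are outnumbered by cannibals.
    static boolean legal(State s) {
        if (s.m() < 0 || s.m() > 3 || s.c() < 0 || s.c() > 3) return false;
        boolean startBankOk = s.m() == 0 || s.m() >= s.c();
        boolean farBankOk = (3 - s.m()) == 0 || (3 - s.m()) >= (3 - s.c());
        return startBankOk && farBankOk;
    }

    // Operators: subtract a move vector when the boat leaves the starting bank, add it when it returns.
    static List<State> successors(State s) {
        List<State> out = new ArrayList<>();
        int sign = (s.b() == 1) ? -1 : 1;
        for (int[] mv : MOVES) {
            State next = new State(s.m() + sign * mv[0], s.c() + sign * mv[1], s.b() + sign * mv[2]);
            if (legal(next)) out.add(next);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(successors(new State(3, 3, 1))); // legal moves from the initial state [3,3,1]
    }
}

Any of the search algorithms below can then be run on the graph induced by successors(), with [3,3,1] as the start state and [0,0,0] as the single goal state.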
Breadth-First-Search Pseudo code
• Intuition: Treating the graph as a tree and scanning it top-down.
• Algorithm:
BFS(Graph graph, Node start, Vector goals)
1. L ← make_queue(start)
2. While L not empty loop
   2.1 n ← L.remove_front()
   2.2 If goal(n) return true
   2.3 S ← successors(n)
   2.4 L.insert(S)
3. Return false
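As a concrete illustration, a minimal Java sketch of the same loop (the adjacency-list graph type and all names are assumptions for illustration, not course code). Like the pseudocode, it keeps no record of visited nodes, so it can loop forever on cyclic graphs; the closed list that fixes this is introduced later:

import java.util.*;

public class Bfs {
    // graph: adjacency lists; returns true if some goal node is reachable from start.
    static boolean bfs(Map<String, List<String>> graph, String start, Set<String> goals) {
        Deque<String> frontier = new ArrayDeque<>();      // L in the pseudocode, used as a FIFO queue
        frontier.add(start);                              // 1. L <- make_queue(start)
        while (!frontier.isEmpty()) {                     // 2. while L not empty
            String n = frontier.removeFirst();            // 2.1 n <- L.remove_front()
            if (goals.contains(n)) return true;           // 2.2 goal test
            List<String> successors = graph.getOrDefault(n, List.of()); // 2.3 S <- successors(n)
            frontier.addAll(successors);                  // 2.4 L.insert(S): append at the back
        }
        return false;                                     // 3. no goal found
    }
}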
Breadth-First-Search Attributes
• Completeness – yes (b < ∞, d < ∞)
• Optimality – yes, if graph is unweighted.
• Time Complexity: O(b^(d+1))
• Memory Complexity: O(b^(d+1))
  – Where b is the branching factor and d is the solution depth
• See the water tanks example:
  http://u.cs.biu.ac.il/~haimga/Teaching/AI/lessons/lesson1&2b-%20WaterTankBFS_DFS.pdf
Artificial Intelligence
Lesson 2
Uninformed Search
• Uninformed search methods use only
information available in the problem
definition.
– Breadth First Search (BFS)
– Depth First Search (DFS)
– Iterative Deepening DFS (IDS / DFID)
– Bi-directional search
– Uniform Cost Search (a.k.a. Dijkstra's algorithm)
Depth-First-Search Pseudo code
DFS(Graph graph, Node start, Vector goals)
1. L ← make_stack(start)
2. While L not empty loop
   2.1 n ← L.remove_front()
   2.2 If goal(n) return true
   2.3 S ← successors(n)
   2.4 L.insert(S)
3. Return false
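A matching Java sketch (same assumptions as the BFS sketch above); the only change is that the frontier is used as a stack, so the most recently generated node is expanded first:

import java.util.*;

public class Dfs {
    static boolean dfs(Map<String, List<String>> graph, String start, Set<String> goals) {
        Deque<String> frontier = new ArrayDeque<>();      // L, used as a LIFO stack
        frontier.push(start);                             // 1. L <- make_stack(start)
        while (!frontier.isEmpty()) {                     // 2.
            String n = frontier.pop();                    // 2.1 remove from the front (top of stack)
            if (goals.contains(n)) return true;           // 2.2 goal test
            for (String s : graph.getOrDefault(n, List.of()))
                frontier.push(s);                         // 2.3-2.4 successors go on top of the stack
        }
        return false;                                     // 3.
    }
}

Without a closed list or a depth limit this can descend forever, which is exactly the completeness problem noted in the attributes below.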
Depth-First-Search Attributes
• Completeness – No. Infinite loops or infinite depth can occur.
• Optimality – No.
• Time Complexity: O(b^m)
• Memory Complexity: O(bm)
  – Where b is the branching factor and m is the maximum depth of the search tree
• See the water tanks example
[Figure: a small example tree with nodes numbered in DFS expansion order.]
Limited DFS Attributes
• Completeness – Yes, if d≤l
• Optimality – No.
• Time Complexity: O(b^l)
  – If d < l, it is larger than in BFS
• Memory Complexity: O(bl)
  – Where b is the branching factor and l is the depth limit.
Depth-First Iterative-Deepening (DFID)
[Figure: an example tree (depths 0-3) in which each node is labeled with the order(s) in which DFID generates it across its successive depth-limited iterations. The numbers represent the order generated by DFID.]
Iterative-Deepening Attributes
• Completeness – Yes
• Optimality – yes, if graph is un-weighted.
• Time Complexity: O(d·b + (d−1)·b^2 + ... + 1·b^d) = O(b^d)
• Memory Complexity: O(db)
  – Where b is the branching factor and d is the maximum depth of the search tree
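A minimal Java sketch of iterative deepening, built from a depth-limited DFS (names and the graph type are assumptions, as in the earlier sketches):

import java.util.*;

public class IterativeDeepening {
    // Depth-limited DFS: recurse, but never go more than 'limit' edges below n.
    static boolean depthLimited(Map<String, List<String>> graph, String n,
                                Set<String> goals, int limit) {
        if (goals.contains(n)) return true;
        if (limit == 0) return false;
        for (String s : graph.getOrDefault(n, List.of()))
            if (depthLimited(graph, s, goals, limit - 1)) return true;
        return false;
    }

    // IDS/DFID: repeat depth-limited DFS with limits 0, 1, 2, ... so the shallowest goal is found first.
    static boolean ids(Map<String, List<String>> graph, String start,
                       Set<String> goals, int maxDepth) {
        for (int limit = 0; limit <= maxDepth; limit++)
            if (depthLimited(graph, start, goals, limit)) return true;
        return false;
    }
}

Each iteration repeats the work of the shallower ones, which is where the sum in the time-complexity formula above comes from.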
State Redundancies
• Closed list – a hash table which holds the visited nodes.
  – Prevents re-exploring of nodes.
  – Holds the solution path from start to goal.
• For example, in BFS:
[Figure: the already-explored nodes form the Closed List; the nodes waiting to be expanded form the Open List (Frontier).]
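For illustration, the BFS sketch from earlier with a closed list added (assumed names; a HashSet stands in for the hash table):

import java.util.*;

public class BfsWithClosedList {
    static boolean bfs(Map<String, List<String>> graph, String start, Set<String> goals) {
        Deque<String> open = new ArrayDeque<>();   // open list (frontier)
        Set<String> closed = new HashSet<>();      // closed list: nodes already expanded
        open.add(start);
        while (!open.isEmpty()) {
            String n = open.removeFirst();
            if (goals.contains(n)) return true;
            if (!closed.add(n)) continue;          // already expanded once: skip it
            for (String s : graph.getOrDefault(n, List.of()))
                if (!closed.contains(s)) open.add(s);
        }
        return false;
    }
    // To also recover the solution path, store a parent pointer per node (e.g. a Map<String, String>)
    // instead of a bare set, and walk it back from the goal to the start.
}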
Bi-directional Search
• Search simultaneously forward from the initial state and backward from the goal state.
• Operators must be symmetric (reversible).
[Figure: two search frontiers growing from S (start) and G (goal) toward each other.]
Bi-directional Search Attributes
• Completeness – Yes, if both directions use BFS
• Optimality – yes, if graph is un-weighted and both
directions use BFS.
• Time and Memory Complexity: O(b^(d/2))
• Pros.
– Cuts the search depth in half, so each of the two trees is exponentially smaller (at least theoretically).
• Cons.
– Frontiers must be constantly compared.
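A minimal Java sketch of the idea on an undirected (symmetric) graph, growing a BFS layer from each side and stopping when the frontiers meet; the graph type and names are assumptions, and only reachability (not the path itself) is returned:

import java.util.*;

public class BidirectionalSearch {
    static boolean connected(Map<String, List<String>> graph, String start, String goal) {
        if (start.equals(goal)) return true;
        Set<String> frontierS = new HashSet<>(Set.of(start));  // forward frontier (from S)
        Set<String> frontierG = new HashSet<>(Set.of(goal));   // backward frontier (from G)
        Set<String> seenS = new HashSet<>(frontierS);
        Set<String> seenG = new HashSet<>(frontierG);

        while (!frontierS.isEmpty() && !frontierG.isEmpty()) {
            // Expand the smaller frontier first (cheaper side), swapping roles if needed.
            if (frontierS.size() > frontierG.size()) {
                Set<String> t = frontierS; frontierS = frontierG; frontierG = t;
                t = seenS; seenS = seenG; seenG = t;
            }
            Set<String> next = new HashSet<>();
            for (String n : frontierS)
                for (String s : graph.getOrDefault(n, List.of())) {
                    if (seenG.contains(s)) return true;         // the two searches have met
                    if (seenS.add(s)) next.add(s);
                }
            frontierS = next;
        }
        return false;                                           // one side exhausted: no path
    }
}

The seenG.contains(s) test is exactly the "frontiers must be constantly compared" overhead listed above.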
Minimum cost path
• General minimum cost path-search problem:
  – Find the shortest path from the start state to one of the goal states in a weighted graph.
  – The path cost function is g(n): the sum of edge weights along the path from the start state to n.
Uniform Cost Search
• Also known as Dijkstra’s algorithm.
• Expand the node with the minimum path
cost first.
• Implementation: priority queue.
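A minimal Java sketch of Uniform Cost Search (graph type and names are assumptions): the open list is a priority queue ordered by the accumulated path cost g(n), so the cheapest open node is always expanded first:

import java.util.*;

public class UniformCostSearch {
    record Edge(String to, double weight) {}
    record Path(String node, double g) {}    // a node together with its path cost from the start

    // Returns the minimum path cost from start to any goal, or -1 if no goal is reachable.
    static double ucs(Map<String, List<Edge>> graph, String start, Set<String> goals) {
        PriorityQueue<Path> open =
            new PriorityQueue<>(Comparator.comparingDouble(Path::g)); // open list keyed by g(n)
        open.add(new Path(start, 0.0));
        Set<String> closed = new HashSet<>();                         // closed list

        while (!open.isEmpty()) {
            Path p = open.poll();                                     // cheapest open node
            if (goals.contains(p.node())) return p.g();               // first goal expanded is optimal
            if (!closed.add(p.node())) continue;                      // already expanded more cheaply
            for (Edge e : graph.getOrDefault(p.node(), List.of()))
                if (!closed.contains(e.to()))
                    open.add(new Path(e.to(), p.g() + e.weight()));
        }
        return -1;                                                    // unreachable
    }
}

The first time a goal is removed from the priority queue its path cost is minimal, assuming all edge weights are positive (as required for completeness in the attributes below).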
Example of Uniform Cost Search
• Assume an example tree with different edge costs, represented by numbers next to the edges.
• Notations for this example: generated node, expanded node.
[Figure: a sequence of slides traces UCS on a small example tree with nodes a–g and edge costs 1 and 2. After each expansion the slides show the closed list and the open list together with each node's accumulated path cost; the open node with the smallest path cost is always expanded next.]
Uniform Cost Search Attributes
• Completeness: yes, for positive weights
• Optimality: yes
• Time & Memory Complexity: O(b^⌈c/e⌉)
  – Where b is the branching factor, c is the optimal solution cost and e is the minimum edge cost (each operator costs at least e).
  – UCS uses path cost rather than path length, therefore the ‘d’ term is not used here.
Informed Search
• Incorporates an additional measure of the potential of a specific state to reach the goal.
• The potential of a state to reach a goal is measured by a heuristic function h(n).
• The evaluation function is denoted f(n).
Best First Search Algorithms
• Principle: Expand node n with the best
evaluation function value f(n).
• Implement via a priority queue
• Algorithms differ in the definition of f:
  – Greedy Search: f(n) = h(n)
  – A*: f(n) = g(n) + h(n)
  – IDA*: iterative deepening version of A*
  – Etc.
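As a sketch of this idea (not course code; the graph type, heuristic, and names are assumed), a single generic best-first loop can implement all of these by swapping the evaluation function f:

import java.util.*;
import java.util.function.ToDoubleFunction;

public class BestFirst {
    record Edge(String to, double weight) {}
    record Node(String state, double g) {}                 // state plus its path cost from the start

    // Generic best-first search: the evaluation function f is the only thing that varies.
    static boolean search(Map<String, List<Edge>> graph, String start, Set<String> goals,
                          ToDoubleFunction<Node> f) {
        PriorityQueue<Node> open = new PriorityQueue<>(Comparator.comparingDouble(f));
        open.add(new Node(start, 0.0));
        Set<String> closed = new HashSet<>();
        while (!open.isEmpty()) {
            Node n = open.poll();                          // node with the best f(n)
            if (goals.contains(n.state())) return true;
            if (!closed.add(n.state())) continue;
            for (Edge e : graph.getOrDefault(n.state(), List.of()))
                open.add(new Node(e.to(), n.g() + e.weight()));
        }
        return false;
    }

    public static void main(String[] args) {
        Map<String, List<Edge>> g = Map.of();              // hypothetical graph
        Map<String, Double> h = Map.of();                  // hypothetical heuristic values h(n)
        search(g, "s", Set.of("t"), n -> n.g());                                  // UCS: f(n) = g(n)
        search(g, "s", Set.of("t"), n -> h.getOrDefault(n.state(), 0.0));         // Greedy: f(n) = h(n)
        search(g, "s", Set.of("t"), n -> n.g() + h.getOrDefault(n.state(), 0.0)); // A*: f(n) = g(n) + h(n)
    }
}

A full A* would also re-open a closed node when a cheaper path to it is found (needed when the heuristic is admissible but inconsistent); the sketch ignores that detail.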
Exercise
• Q: Can Uniform-Cost Search be considered a Best-First algorithm?
• A: Yes. It can be considered a Best-First algorithm with evaluation function f(n) = g(n).
• Q: In what scenarios does IDS outperform DFS? BFS?
• A:
  – IDS outperforms DFS when the search tree is a lot deeper than the solution depth.
  – IDS outperforms BFS when BFS runs out of memory.
Exercise – Cont.
• Q: Why do we need a closed list?
• A: Generally, a closed list has two main functionalities:
  – Preventing re-exploration of nodes.
  – Holding the solution path from start to goal (DFS-based algorithms have it anyway).
• Q: Does BFS find the optimal (minimum-cost) path in general?
• A: No, unless the search graph is un-weighted.
• Q: Will IDS always find the same solution as BFS, given that the node expansion order is deterministic?
• A: Yes. Each iteration of IDS explores new nodes in the same order as BFS does.