Visibility Graph
Voronoi Diagram
• Control is easy: stay equidistant from the closest obstacles.
Exact Cell Decomposition
• Plan over this graph.

Localization
• Two types of approaches:
– Iconic: use raw sensor data directly. Match current sensor readings with what was observed in the past.
– Feature-based: extract features of the environment, such as corners and doorways. Match current observations against those features.

Continuous Localization and Mapping
• Start from an initial map; over time, maintain a registered global map.
• At each update: sensor data → construct a local map; encoder data → pose estimation → k possible poses.
• Match and score the local map against the global map at each candidate pose; keep the best pose.
• Registration: the best pose yields (x, y, θ) offsets to correct odometry.

Matching
• Where am I on the global map?
• Examine different possible robot positions and how well each one explains the local map.
(figure: a local occupancy grid compared against the global map at several candidate robot positions, with obstacle cells marked and a probability score per position)
• This sounds hard; do we need to localize? https://www.youtube.com/watch?v=6KRjuuEVEZs

Matching and Registration
• Collect n sensor readings and create a local map.
• Estimate k poses (x, y, θ) that the robot is likely to be in, given the distance travelled since the last map update.
– In theory k is infinite.
– Discretize the space of possible positions (e.g., consider errors in increments of 5°).
– Try to model the likely behavior of your robot, and account for systematic errors (e.g., a robot that tends to drift to one side).
• For each pose, score how well the local map matches the global map at that position.
• Choose the pose with the best score; update the robot's position to the corresponding (x, y, θ) location. (A code sketch of this loop appears at the end of this section.)
• What if you were tracking multiple possible poses? How would you combine this information with the previous estimate of global position plus odometry?

Representations
• Line-based map (~100 lines)
• Grid-based map (3,000 cells)
• Topological map (50 features, 18 nodes)
• One location vs. a distribution over locations

Feature-Based Localization
• Extract features such as doorways, corners, and intersections.
• Either:
– use continuous localization to try to match features at each update, or
– use topological information to create a graph of the environment.

Topological Map of Office Building
• The robot has identified 10 doorways, each marked by a single number.
• Hallways between doorways are labeled by gateway-to-gateway pairings.
• What if the robot is told it is at position A but it is actually at B? How could it correct that information?

Localization Problem(s)
• Position Tracking
• Global Localization
• Kidnapped Robot Problem
• Multi-Robot Localization

General approach:
• A: action, S: pose, O: observation
• The pose at time t depends on the previous pose, the action taken, and the current observation.
• The pose at time t determines the observation at time t: if we know the pose, we can say what the observation should be.
• But this is backwards… Hello Bayes!
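The match-and-score loop promised above can be sketched in a few lines. This is not from the slides: it is a minimal illustration assuming the global map is a dict from integer (x, y) cells to occupancy (1 = obstacle) and the local map is a list of obstacle cells in the robot's frame; the names score_pose and best_pose, and the simple overlap score, are illustrative assumptions.

    import math

    def score_pose(global_map, local_obstacles, pose):
        """Count how many locally observed obstacles land on global obstacles at this pose."""
        x, y, theta = pose
        c, s = math.cos(theta), math.sin(theta)
        score = 0
        for lx, ly in local_obstacles:
            gx = round(x + c * lx - s * ly)       # rotate, then translate into the map frame
            gy = round(y + s * lx + c * ly)
            score += global_map.get((gx, gy), 0)  # reward overlap with known obstacles
        return score

    def best_pose(global_map, local_obstacles, candidate_poses):
        """Choose the candidate (x, y, theta) whose local map best matches the global map."""
        return max(candidate_poses, key=lambda p: score_pose(global_map, local_obstacles, p))

Here candidate_poses would be the k discretized poses generated from odometry, e.g., the dead-reckoned estimate perturbed in 5° heading increments.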
Quiz!
• If events a and b are independent, p(a, b) = ?
• If events a and b are not independent, p(a, b) = ?
• p(c|d) = ?
(image: Mattel, 1992)

Quiz! (answers)
• If events a and b are independent: p(a, b) = p(a) × p(b)
• If events a and b are not independent: p(a, b) = p(a) × p(b|a) = p(b) × p(a|b)
• p(c|d) = p(c, d) / p(d) = p(d|c) p(c) / p(d)

Bayes Filtering
• We want a way of representing uncertainty.
• Probability distribution
– Could be discrete or continuous.
– The probability of each pose in the set of all possible poses.
• Belief, prior, posterior

Models of Belief
1. Uniform prior
2. Observation: see pillar
3. Action: move right
4. Observation: see pillar

Modeling objects in the environment
http://www.cs.washington.edu/research/rse-lab/projects/mcl

Axioms of Probability Theory
• Pr(A) denotes the probability that proposition A is true; Pr(¬A) denotes the probability that A is false.
1. 0 ≤ Pr(A) ≤ 1
2. Pr(True) = 1, Pr(False) = 0
3. Pr(A ∨ B) = Pr(A) + Pr(B) − Pr(A ∧ B)

A Closer Look at Axiom 3
Pr(A ∨ B) = Pr(A) + Pr(B) − Pr(A ∧ B)
(Venn diagram: within True, regions A and B overlap in A ∧ B, which would otherwise be counted twice)

Discrete Random Variables
• X denotes a random variable.
• X can take on a countable number of values in {x1, x2, …, xn}.
• P(X = xi), or P(xi), is the probability that the random variable X takes on value xi.
• P(xi) is called the probability mass function.
• E.g., P(Room) = 0.2

Continuous Random Variables
• X takes on values in the continuum.
• p(X = x), or p(x), is a probability density function:
Pr(x ∈ (a, b)) = ∫_a^b p(x) dx

Probability Density Function
• The magnitude of the curve may be greater than 1 in some areas, but the total area under the curve must add up to 1.
• Since continuous probability functions are defined over an infinite number of points in a continuous interval, the probability at any single point is always 0.

Inference by Enumeration
P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
= (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064) = 0.4

Law of Total Probability
• Discrete case: Σ_x P(x) = 1, P(x) = Σ_y P(x, y), P(x) = Σ_y P(x|y) P(y)
• Continuous case: ∫ p(x) dx = 1, p(x) = ∫ p(x, y) dy, p(x) = ∫ p(x|y) p(y) dy

Bayes Formula
P(x, y) = P(x|y) P(y) = P(y|x) P(x)
⇒ P(x|y) = P(y|x) P(x) / P(y) = (likelihood × prior) / evidence
• If y is a new sensor reading:
– p(x): prior probability distribution
– p(x|y): posterior (conditional) probability distribution
– p(y|x): model of the characteristics of the sensor
– p(y): does not depend on x
• Expanding the evidence with the law of total probability:
P(x|y) = P(y|x) P(x) / Σ_x P(y|x) P(x)

Bayes Rule with Background Knowledge
P(x|y, z) = P(y|x, z) P(x|z) / P(y|z)

Conditional Independence
P(x, y|z) = P(x|z) P(y|z)
is equivalent to
P(x|z) = P(x|z, y) and P(y|z) = P(y|z, x)

Simple Example of State Estimation
• Suppose a robot obtains measurement z.
• What is P(open|z)?

Causal vs. Diagnostic Reasoning
• P(open|z) is diagnostic.
• P(z|open) is causal.
• Often causal knowledge is easier to obtain.
• Bayes rule allows us to use causal knowledge, where P(z|open) comes from the sensor model:
P(open|z) = P(z|open) P(open) / P(z)
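Not on the slides: the formula above turns into a few lines of code. This sketch assumes a discrete state space represented as a dict from state to probability; the name bayes_update is hypothetical.

    def bayes_update(prior, likelihood):
        """Posterior P(x|z) proportional to P(z|x) P(x), normalized to sum to 1."""
        unnormalized = {x: likelihood[x] * p for x, p in prior.items()}
        evidence = sum(unnormalized.values())   # P(z) = sum over x of P(z|x) P(x)
        return {x: p / evidence for x, p in unnormalized.items()}

The evidence term is exactly the law-of-total-probability expansion of P(z) from the Bayes Formula slide.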
Example
• P(z|open) = 0.6, P(z|¬open) = 0.3
• P(open) = P(¬open) = 0.5
P(open|z) = P(z|open) P(open) / P(z)
= P(z|open) P(open) / (P(z|open) P(open) + P(z|¬open) P(¬open))
= (0.6 × 0.5) / (0.6 × 0.5 + 0.3 × 0.5) = 2/3 ≈ 0.67
• z raises the probability that the door is open.

Combining Evidence
• Suppose our robot obtains another observation z2.
• How can we integrate this new information?
• More generally, how can we estimate P(x|z1, …, zn)?

Recursive Bayesian Updating
P(x|z1, …, zn) = P(zn|x, z1, …, zn-1) P(x|z1, …, zn-1) / P(zn|z1, …, zn-1)
• Markov assumption: zn is independent of z1, …, zn-1 if we know x, so this simplifies to
P(x|z1, …, zn) = P(zn|x) P(x|z1, …, zn-1) / P(zn|z1, …, zn-1)
• This has the same form as a single Bayes update, with the previous posterior playing the role of the prior:
P(open|z1, …, zn) = P(zn|open) P(open|z1, …, zn-1) / P(zn|z1, …, zn-1)

Example: Second Measurement
• P(z2|open) = 0.5, P(z2|¬open) = 0.6
• P(open|z1) = 2/3
P(open|z2, z1) = P(z2|open) P(open|z1) / (P(z2|open) P(open|z1) + P(z2|¬open) P(¬open|z1))
= (1/2 × 2/3) / (1/2 × 2/3 + 3/5 × 1/3) = 5/8 = 0.625
• z2 lowers the probability that the door is open.
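As a check on the two worked examples, the bayes_update sketch from earlier can replay both door measurements. The sensor-model numbers come from the slides; the dict representation and variable names are illustrative.

    def bayes_update(prior, likelihood):   # repeated from the earlier sketch
        unnormalized = {x: likelihood[x] * p for x, p in prior.items()}
        evidence = sum(unnormalized.values())
        return {x: p / evidence for x, p in unnormalized.items()}

    belief = {"open": 0.5, "closed": 0.5}                        # uniform prior

    belief = bayes_update(belief, {"open": 0.6, "closed": 0.3})  # measurement z1
    print(belief["open"])                                        # 0.666…: z1 raises P(open)

    belief = bayes_update(belief, {"open": 0.5, "closed": 0.6})  # measurement z2
    print(belief["open"])                                        # 0.625: z2 lowers P(open)

Note that the posterior after z1 is simply fed back in as the prior for z2, which is the recursion on the Recursive Bayesian Updating slide.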