Particle Filters
Dalian University of Technology, 金乃高
2009-01-03
Introduction
- Sequential Monte Carlo Methods in Practice, Springer-Verlag, 2001.
- IEEE Transactions on Signal Processing, Special Issue on Monte Carlo Methods for Statistical Signal Processing, 2002, 50(2).
- Proceedings of the IEEE, Special Issue on Sequential State Estimation, 2004, 92(3).
- Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
Monte Carlo Method
Pioneers: Ulam, von Neumann, Metropolis, Fermi.
Buffon's needle experiment:
- N: number of needle throws
- M: number of times the needle crosses one of the parallel lines
- D: spacing between the lines
- L: length of the needle

$$\pi \approx \frac{2LN}{DM}$$
[Figure: estimating π from the ratio of the areas S1 (inscribed circle) and S2 (enclosing square), π ≈ 4 S1/S2.]
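As a quick numerical illustration of the area-ratio idea (my own sketch, not from the original slides; the sample count and seed are arbitrary):

```python
import numpy as np

def estimate_pi(n=1_000_000, seed=0):
    """Monte Carlo estimate of pi: draw points uniformly in the unit square
    and count the fraction that falls inside the inscribed quarter circle."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, n)
    y = rng.uniform(0.0, 1.0, n)
    inside = (x**2 + y**2) <= 1.0    # hits inside the quarter circle (area pi/4)
    return 4.0 * inside.mean()       # the area ratio estimates pi/4, so scale by 4

print(estimate_pi())  # roughly 3.14 for large n
```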
Monte Carlo Methods
- Importance sampling
- Rejection sampling
- Metropolis-Hastings
- Gibbs sampling
Sequential Monte Carlo
- Bootstrap filtering (Gordon 1993)
- Condensation algorithm (Isard and Blake 1996)
- Particle filtering (Doucet 2001)

Sequential Monte Carlo
- As early as the 1950s, Hammersley was using Monte Carlo methods based on sequential importance sampling (SIS) to solve problems in statistics.
- In the late 1960s, Handschin and Mayne applied sequential Monte Carlo methods to problems in automatic control.
- In the 1970s, Handschin, Akashi, Zaritskii ...
- Tanizaki, Geweke, and others used importance-sampling-based Monte Carlo methods to successfully solve a range of high-dimensional integration problems.
- The sampling-resampling idea proposed by Smith and Gelfand provided an easy-to-implement computational strategy for Bayesian inference.
- In the early 1990s, Smith, working with Gordon and others, introduced the resampling step into particle filtering and proposed the Bootstrap filter algorithm.
- Nodestar, part of the US Navy's integrated undersea surveillance system, is one example of a particle filter application.
Applications of Particle Filters
- Navigation, positioning, tracking
- Channel equalization
Fundamental Concepts
- Bayesian inference
- Monte Carlo simulation
- Sequential importance sampling
- Resampling
Bayesian Inference
- X is unknown: a random variable or a set (vector) of random variables.
- Y is observed: also a set of random variables.
- We wish to infer X by observing Y.
- The probability distribution p(X) models our prior knowledge of X.
- The conditional probability distribution p(Y|X) models the relationship between Y and X.
Bayesian Filtering
- General problem statement
- State-space formulation (a standard form is sketched below)
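The equations on the state-space slide did not survive extraction. As a reference, a standard discrete-time state-space formulation (my notation: transition function f_k, observation function h_k, process noise v_{k-1}, measurement noise n_k) is:

$$x_k = f_k(x_{k-1}, v_{k-1}) \;\;\leftrightarrow\;\; p(x_k \mid x_{k-1})$$
$$y_k = h_k(x_k, n_k) \;\;\leftrightarrow\;\; p(y_k \mid x_k)$$

Here Y_k denotes the set of observations y_1, ..., y_k, matching the notation used in the weight recursion later in the slides.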
Bayes Theorem
The conditional distribution p(x|y) represents posterior information about x given y:

$$p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}$$
Recursive Bayesian Estimation
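The recursion equations on these slides did not survive extraction. The standard two-step recursion, consistent with the state-space model above, alternates prediction and update:

$$p(x_k \mid Y_{k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid Y_{k-1})\, dx_{k-1}$$
$$p(x_k \mid Y_k) = \frac{p(y_k \mid x_k)\, p(x_k \mid Y_{k-1})}{p(y_k \mid Y_{k-1})}$$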
Monte Carlo Sampling
Problem → solution chain:
- State-space model: estimate the posterior
- Integrals are not tractable → Monte Carlo sampling
- Difficult to draw samples → importance sampling
Monte Carlo Simulation
- The posterior distribution p(x|y) may be difficult or impossible to compute in closed form.
- An alternative is to represent p(x|y) using Monte Carlo samples (particles):
  - Each particle has a value and a weight.
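A minimal sketch of this representation (my own illustration, not from the slides; the values and weights are arbitrary toy numbers): each particle is a value with an attached weight, and posterior expectations are approximated by weighted sums.

```python
import numpy as np

# A particle set approximating p(x|y): values and normalized weights.
values  = np.array([0.2, 1.1, 1.4, 2.3, 3.0])       # particle values x^(i)
weights = np.array([0.05, 0.30, 0.40, 0.20, 0.05])  # weights w^(i), summing to 1

# Any posterior expectation E[f(X) | y] is approximated by a weighted sum.
posterior_mean = np.sum(weights * values)
posterior_var  = np.sum(weights * (values - posterior_mean) ** 2)
print(posterior_mean, posterior_var)
```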
Importance Sampling
- Ideally, the particles would represent samples drawn from the distribution p(x|y).
  - In practice, we usually cannot get p(x|y) in closed form; in any case, it would usually be difficult to draw samples from p(x|y).
- Importance sampling introduces a known proposal distribution that is easy to sample from; the weights describe the discrepancy between the proposal distribution and the actual posterior. Importance sampling is a variance-reduction strategy in Monte Carlo integration; in Bayesian filtering, the weighted samples drawn from the importance function can be viewed as a weighted approximation of the posterior probability density.
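A minimal sketch of plain importance sampling (my own illustration; the target and proposal here are arbitrary choices, not taken from the slides): the target p is N(1, 1), the proposal q is a wider Gaussian, and the weights p(x)/q(x) correct for sampling from the wrong distribution.

```python
import numpy as np

rng = np.random.default_rng(1)

def gauss_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

n = 100_000
# Proposal q: easy to sample from (a wide Gaussian).
x = rng.normal(0.0, 3.0, n)
# Unnormalized importance weights w = p(x) / q(x) for the target p = N(1, 1).
w = gauss_pdf(x, 1.0, 1.0) / gauss_pdf(x, 0.0, 3.0)
w /= w.sum()                      # normalize the weights

# Weighted estimate of E[X] under the target (should be close to 1).
print(np.sum(w * x))
```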
Sequential Importance Sampling
The recursive form of the particle weights can be written as

$$w_k^{(i)} \propto \frac{p(x_{0:k}^{(i)} \mid Y_k)}{q(x_{0:k}^{(i)} \mid Y_k)}
\propto \frac{p(y_k \mid x_k^{(i)})\, p(x_k^{(i)} \mid x_{k-1}^{(i)})\, p(x_{0:k-1}^{(i)} \mid Y_{k-1})}
             {q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, Y_k)\, q(x_{0:k-1}^{(i)} \mid Y_{k-1})}
\propto w_{k-1}^{(i)}\, \frac{p(y_k \mid x_k^{(i)})\, p(x_k^{(i)} \mid x_{k-1}^{(i)})}
                             {q(x_k^{(i)} \mid x_{0:k-1}^{(i)}, Y_k)}$$
Resampling
- Ideally, after a number of iterations the variance of the weights would approach zero, giving a correct estimate. In the SIS algorithm, however, the variance of the weights grows over time, producing the weight degeneracy phenomenon.
- In 1993, Gordon proposed the idea of resampling to overcome this problem, which broadened the range of applications of particle filtering.
- The basic idea of resampling is to discard particles with small weights, which are of no further interest, and to replace them with copies of particles with large weights.
Resampling
- In inference problems, most weights tend to zero except a few (from particles that closely match the observations), which become large.
- We resample to concentrate particles in regions where p(x|y) is larger.
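A minimal sketch of one common resampling scheme, systematic resampling (my own illustration; the slides do not specify which scheme was used): particles are duplicated in proportion to their weights and the weights are then reset to uniform.

```python
import numpy as np

def systematic_resample(particles, weights, rng):
    """Systematic resampling: duplicate particles in proportion to their
    weights using a single uniform offset, then reset weights to 1/N."""
    n = len(weights)
    positions = (rng.uniform() + np.arange(n)) / n   # stratified positions in [0, 1)
    cum = np.cumsum(weights)
    cum[-1] = 1.0                                    # guard against round-off
    indices = np.searchsorted(cum, positions)
    return particles[indices], np.full(n, 1.0 / n)

# Usage with the particle set from the SIS sketch above:
# particles, weights = systematic_resample(particles, weights, np.random.default_rng(3))
```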
Resampling
Drawbacks:
- Breaks the parallelism of the algorithm
- Loss of particle diversity
Remedies:
- Increase the number of particles
- Add random noise to the particles after resampling
- Apply a Markov chain Monte Carlo move step
- Kernel smoothing: replace the Dirac delta function with a kernel function
Schematic of the particle filter [figure]
Variations
- Use a different importance distribution.
- Use a different resampling technique:
  - Resampling adds variance to the estimate; several resampling techniques are available that minimize this added variance.
  - Our simple resampling leaves several particles with the same value; methods for spreading them out are available.
Variations
- Reduce the resampling frequency:
  - Our implementation resamples after every observation, which may add unneeded variance to the estimate.
  - Alternatively, one can resample only when the particle weights warrant it. This can be determined by the effective sample size:

$$\hat{N}_{\mathrm{eff}} = \frac{1}{\sum_{i=1}^{N} \bigl(w_k^{(i)}\bigr)^2}$$
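A minimal sketch of the corresponding check (my own illustration; the N/2 threshold is a common but arbitrary choice, not stated on the slides):

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum_i (w_i^2) for normalized weights."""
    return 1.0 / np.sum(np.asarray(weights) ** 2)

# Resample only when degeneracy is severe, e.g. N_eff < N / 2,
# reusing the systematic_resample sketch above (hypothetical variables):
# if effective_sample_size(weights) < len(weights) / 2:
#     particles, weights = systematic_resample(particles, weights, rng)
```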
Rao-Blackwellization
- Some components of the model may have linear dynamics and can be well estimated using a conventional Kalman filter.
- The Kalman filter / extended Kalman filter / unscented Kalman filter / Gauss-Hermite filter is combined with a particle filter to reduce the number of particles needed to obtain a given level of performance.
Advantages of Particle Filters
- Under general conditions, the particle filter estimate becomes asymptotically optimal as the number of particles goes to infinity.
- Non-linear, non-Gaussian state update and observation equations can be used.
- Multi-modal distributions are not a problem.
Disadvantages of Particle Filters
- Naïve formulations of problems usually result in significant computation times.
- The number of particles required may be large.
- The best importance distribution and/or resampling method may be very problem specific.
Conclusions
The particle filter makes tractable many estimation problems that were previously difficult or impossible to solve.
Tutorial Article
M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, 2002, 50(2), 174-188.
Related Websites
Google: "Sequential Monte Carlo"
http://www-sigproc.eng.cam.ac.uk/smc/papers.html
Thank you!