Particle filter
Particle filters, also known as sequential Monte Carlo (SMC) methods, are simulation-based techniques for model estimation. Particle filters have important applications in econometrics.[1]
They are usually used to estimate Bayesian models and are the sequential ('on-line') analogue of Markov chain Monte Carlo (MCMC) batch methods and are often similar to importance sampling methods. Well-designed particle filters can often be much faster than MCMC. They are often an alternative to the Extended Kalman filter (EKF) or Unscented Kalman filter (UKF) with the advantage that, with sufficient samples, they approach the Bayesian optimal estimate, so they can be made more accurate than either the EKF or UKF. However, when the simulated sample is not sufficiently large, they might suffer from sample impoverishment. The approaches can also be combined by using a version of the Kalman filter as a proposal distribution for the particle filter.
Goal
The particle filter aims to estimate the sequence of hidden parameters $x_k$ for $k = 0, 1, 2, \ldots$, based only on the observed data $y_k$ for $k = 1, 2, \ldots$. All Bayesian estimates of $x_k$ follow from the posterior distribution $p(x_k \mid y_1, \ldots, y_k)$. In contrast, the MCMC or importance sampling approach would model the full posterior $p(x_0, \ldots, x_k \mid y_1, \ldots, y_k)$.
Model
Particle methods assume $x_k$ and the observations $y_k$ can be modeled in this form:
- $x_0, x_1, \ldots$ is a first-order Markov process such that $x_k \mid x_{k-1} \sim p_{x_k \mid x_{k-1}}(x \mid x_{k-1})$, with an initial distribution $p(x_0)$.
- The observations $y_1, y_2, \ldots$ are conditionally independent provided that $x_0, x_1, \ldots$ are known. In other words, each $y_k$ depends only on $x_k$: $y_k \mid x_k \sim p_{y \mid x}(y \mid x_k)$.
One example form of this scenario is
$$x_k = f_k(x_{k-1}) + v_k$$
$$y_k = h_k(x_k) + w_k$$
where both $v_k$ and $w_k$ are mutually independent and identically distributed sequences with known probability density functions, and $f_k(\cdot)$ and $h_k(\cdot)$ are known functions. These two equations can be viewed as state space equations and look similar to the state space equations for the Kalman filter. If the functions $f_k(\cdot)$ and $h_k(\cdot)$ are linear, and if both $v_k$ and $w_k$ are Gaussian, the Kalman filter finds the exact Bayesian filtering distribution. If not, Kalman filter based methods are a first-order approximation. Particle filters are also an approximation, but with enough particles they can be much more accurate.
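State-space equations of this form can be simulated directly. The sketch below draws one trajectory from a hypothetical scalar model with additive Gaussian noise; the particular $f_k$, $h_k$, and noise scales are illustrative assumptions, not prescribed by the model above:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, k):
    # Hypothetical nonlinear state transition f_k (illustrative choice)
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * k)

def h(x):
    # Hypothetical nonlinear observation function h_k (illustrative choice)
    return x**2 / 20.0

T = 50
x = np.zeros(T)  # hidden states x_k
y = np.zeros(T)  # observations y_k
x[0] = rng.normal()  # draw x_0 from the initial distribution
for k in range(1, T):
    v_k = rng.normal(scale=np.sqrt(10.0))  # process noise v_k
    w_k = rng.normal(scale=1.0)            # observation noise w_k
    x[k] = f(x[k - 1], k) + v_k
    y[k] = h(x[k]) + w_k
```

Because $h$ here is quadratic, the observations are a nonlinear function of the state, which is exactly the regime where Kalman-filter-based methods are only approximate.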
Monte Carlo approximation
Particle methods, like all sampling-based approaches (e.g., MCMC), generate a set of samples that approximate the filtering distribution $p(x_k \mid y_1, \ldots, y_k)$. So, with $P$ samples, expectations with respect to the filtering distribution are approximated by
$$E[f(x_k)] \approx \frac{1}{P} \sum_{L=1}^{P} f\big(x_k^{(L)}\big)$$
and $f(\cdot)$, in the usual way for Monte Carlo, can give all the moments etc. of the distribution up to some degree of approximation.
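Concretely, once samples approximating the filtering distribution are available, any moment is just a sample average. A minimal sketch (the samples here are faked with draws from a known normal, purely to illustrate the arithmetic):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for P samples approximating the filtering distribution
# p(x_k | y_1..y_k); here simply drawn from N(2, 0.5^2) for illustration.
P = 100_000
samples = rng.normal(loc=2.0, scale=0.5, size=P)

# Expectations under the filtering distribution become sample averages:
mean_est = samples.mean()              # approximates E[x_k]
second_moment = np.mean(samples**2)    # approximates E[x_k^2]
var_est = second_moment - mean_est**2  # approximates Var[x_k]
```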
Sampling Importance Resampling (SIR)
Sampling importance resampling (SIR), the original particle filtering algorithm (Gordon et al. 1993), is a very commonly used algorithm which approximates the filtering distribution $p(x_k \mid y_1, \ldots, y_k)$ by a weighted set of $P$ particles
- $\{(w_k^{(L)}, x_k^{(L)}) : L = 1, \ldots, P\}$.
The importance weights $w_k^{(L)}$ are approximations to the relative posterior probabilities (or densities) of the particles such that $\sum_{L=1}^{P} w_k^{(L)} = 1$.
SIR is a sequential (i.e., recursive) version of importance sampling. As in importance sampling, the expectation of a function $f(\cdot)$ can be approximated as a weighted average
$$E[f(x_k)] \approx \sum_{L=1}^{P} w_k^{(L)} f\big(x_k^{(L)}\big).$$
For a finite set of particles, the algorithm performance is dependent on the choice of the proposal distribution
- $\pi(x_k \mid x_{0:k-1}, y_{0:k})$.
The optimal proposal distribution is given as the target distribution
$$\pi(x_k \mid x_{0:k-1}, y_{0:k}) = p(x_k \mid x_{k-1}, y_k).$$
However, the transition prior is often used as the importance function, since it is easier to draw particles (or samples) and perform subsequent importance weight calculations:
$$\pi(x_k \mid x_{0:k-1}, y_{0:k}) = p(x_k \mid x_{k-1}).$$
Sampling importance resampling (SIR) filters with the transition prior as importance function are commonly known as the bootstrap filter and the condensation algorithm.
Resampling is used to avoid the problem of degeneracy of the algorithm, that is, to avoid the situation where all but one of the importance weights are close to zero. The performance of the algorithm can also be affected by the choice of resampling method. The stratified resampling proposed by Kitagawa (1996) is optimal in terms of variance.
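A minimal sketch of stratified resampling as attributed to Kitagawa (1996): the unit interval is split into $P$ equal strata, one uniform draw is taken per stratum, and the cumulative weight distribution is inverted at those positions. The implementation details below are an illustrative reading of that scheme:

```python
import numpy as np

def stratified_resample(weights, rng):
    """Stratified resampling: one uniform draw per equal stratum of [0, 1),
    then invert the cumulative distribution of the weights."""
    P = len(weights)
    # One uniform position inside each of the P strata [i/P, (i+1)/P)
    positions = (np.arange(P) + rng.uniform(size=P)) / P
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cumulative, positions)

rng = np.random.default_rng(2)
w = np.array([0.1, 0.2, 0.3, 0.4])     # normalized importance weights
indices = stratified_resample(w, rng)  # indices of the surviving particles
```

Compared with plain multinomial resampling, each particle's number of copies can differ from its expected count $P w^{(L)}$ by less than one stratum, which is the source of the variance reduction.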
A single step of sequential importance resampling is as follows:
- 1) For $L = 1, \ldots, P$, draw samples from the proposal distribution
$$x_k^{(L)} \sim \pi(x_k \mid x_{0:k-1}^{(L)}, y_{0:k})$$
- 2) For $L = 1, \ldots, P$, update the importance weights up to a normalizing constant:
$$\hat{w}_k^{(L)} = w_{k-1}^{(L)} \frac{p(y_k \mid x_k^{(L)})\, p(x_k^{(L)} \mid x_{k-1}^{(L)})}{\pi(x_k^{(L)} \mid x_{0:k-1}^{(L)}, y_{0:k})}$$
- Note that this simplifies to the following when we use the transition prior as the importance function, $\pi(x_k \mid x_{0:k-1}, y_{0:k}) = p(x_k \mid x_{k-1})$:
$$\hat{w}_k^{(L)} = w_{k-1}^{(L)}\, p(y_k \mid x_k^{(L)})$$
- 3) For $L = 1, \ldots, P$, compute the normalized importance weights:
$$w_k^{(L)} = \frac{\hat{w}_k^{(L)}}{\sum_{J=1}^{P} \hat{w}_k^{(J)}}$$
- 4) Compute an estimate of the effective number of particles as
$$\hat{N}_{\mathrm{eff}} = \frac{1}{\sum_{L=1}^{P} \big(w_k^{(L)}\big)^2}$$
- 5) If the effective number of particles is less than a given threshold, $\hat{N}_{\mathrm{eff}} < N_{\mathrm{thr}}$, then perform resampling:
- a) Draw $P$ particles from the current particle set with probabilities proportional to their weights. Replace the current particle set with this new one.
- b) For $L = 1, \ldots, P$, set $w_k^{(L)} = 1/P$.
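The five steps above can be sketched as a single function. This sketch assumes the transition prior as proposal (so step 2 reduces to the likelihood), additive Gaussian noise, and a simple multinomial resampler in step 5; the function names and noise models are illustrative assumptions:

```python
import numpy as np

def sir_step(particles, weights, y_k, f, h, q_std, r_std, rng, n_thr=None):
    """One SIR step using the transition prior as the proposal (bootstrap
    filter). f and h are assumed state-transition and observation functions;
    q_std and r_std are the (assumed Gaussian) noise standard deviations."""
    P = len(particles)
    # 1) Draw from the proposal: propagate through the transition prior
    particles = f(particles) + rng.normal(scale=q_std, size=P)
    # 2) With the transition prior as proposal, the weight update reduces
    #    to multiplying by the likelihood p(y_k | x_k)
    weights = weights * np.exp(-0.5 * ((y_k - h(particles)) / r_std) ** 2)
    # 3) Normalize the importance weights
    weights = weights / weights.sum()
    # 4) Effective number of particles
    n_eff = 1.0 / np.sum(weights**2)
    # 5) Resample (multinomial, for simplicity) when n_eff drops too low
    if n_thr is None:
        n_thr = P / 2.0
    if n_eff < n_thr:
        idx = rng.choice(P, size=P, p=weights)
        particles, weights = particles[idx], np.full(P, 1.0 / P)
    return particles, weights
```

With, say, `f = lambda x: 0.5 * x` and `h = lambda x: x` this behaves as a bootstrap filter for a linear-Gaussian model; the stratified scheme mentioned earlier could be substituted for the multinomial draw in step 5.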
The term Sequential Importance Resampling is also sometimes used when referring to SIR filters.
Sequential Importance Sampling (SIS)
- The same as sampling importance resampling, but without the resampling stage.
"Direct version" algorithm
The "direct version" algorithm is rather simple (compared to other particle filtering algorithms) and uses composition and rejection. To generate a single sample $x$ at $k$ from $p_{x_k \mid y_{1:k}}(x \mid y_{1:k})$:
- 1) Set $p = 1$
- 2) Uniformly generate $L$ from $\{1, \ldots, P\}$
- 3) Generate a test $\hat{x}$ from its distribution $p_{x_k \mid x_{k-1}}(x \mid x_{k-1 \mid k-1}^{(L)})$
- 4) Generate the probability of $\hat{y}$ using $\hat{x}$ from $p_{y \mid x}(y_k \mid \hat{x})$, where $y_k$ is the measured value
- 5) Generate another uniform $u$ from $[0, m_k]$, where $m_k = \sup_x p_{y \mid x}(y_k \mid x)$
- 6) Compare $u$ and $p(\hat{y})$
- 6a) If $u$ is larger, then repeat from step 2
- 6b) If $u$ is smaller, then save $\hat{x}$ as $x_{k \mid k}^{(p)}$ and increment $p$
- 7) If $p > P$ then quit
The goal is to generate $P$ "particles" at $k$ using only the particles from $k-1$. This requires that a Markov equation can be written (and computed) to generate an $x_k$ based only upon $x_{k-1}$. This algorithm uses composition of the $P$ particles from $k-1$ to generate a particle at $k$ and repeats (steps 2–6) until $P$ particles are generated at $k$.
This can be more easily visualized if $x$ is viewed as a two-dimensional array. One dimension is $k$ and the other dimension is the particle number. For example, $x(k, L)$ would be the $L$th particle at $k$, and can also be written $x_k^{(L)}$ (as done above in the algorithm). Step 3 generates a potential $x_k$ based on a randomly chosen particle $x_{k-1}^{(L)}$ at time $k-1$ and rejects or accepts it in step 6. In other words, the $x_k$ values are generated using the previously generated $x_{k-1}$.
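The composition-and-rejection loop can be sketched as follows, assuming additive Gaussian process and observation noise so that the rejection bound $m_k$ is simply the peak of the Gaussian likelihood; all function names and noise scales are illustrative assumptions:

```python
import numpy as np

def direct_version(prev_particles, y_k, f, h, q_std, r_std, rng):
    """Sketch of the composition-and-rejection ('direct version') step,
    assuming additive Gaussian noise. prev_particles holds the P particles
    at time k-1; returns P accepted particles at time k."""
    P = len(prev_particles)
    # Rejection bound m_k: the Gaussian likelihood p(y_k | x) is largest
    # when h(x) == y_k, i.e. the density's peak value
    m_k = 1.0 / (np.sqrt(2.0 * np.pi) * r_std)
    out = np.empty(P)
    p = 0
    while p < P:
        L = rng.integers(P)  # 2) pick a particle at k-1 uniformly
        # 3) propose a test state from the transition density
        x_hat = f(prev_particles[L]) + rng.normal(scale=q_std)
        # 4) likelihood of the measurement under the proposed state
        lik = np.exp(-0.5 * ((y_k - h(x_hat)) / r_std) ** 2) / (
            np.sqrt(2.0 * np.pi) * r_std)
        # 5-6) accept the proposal with probability lik / m_k
        if rng.uniform(0.0, m_k) < lik:
            out[p] = x_hat
            p += 1
    return out
```

The while-loop corresponds to steps 2–7: proposals inconsistent with the measurement are rejected, so the accepted set is drawn (approximately) from the filtering distribution at $k$.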
Other particle filters
- Auxiliary particle filter
- Gaussian particle filter
- Unscented particle filter
- Monte Carlo particle filter
- Gauss-Hermite particle filter
- Cost Reference particle filter
- Rao-Blackwellized particle filter
References
- Flury, Thomas; Shephard, Neil (2008). "Bayesian inference based only on simulated likelihood: particle filter analysis of dynamic economic models". OFRC Working Papers Series 2008fe32, Oxford Financial Research Centre.
- Doucet, A.; de Freitas, N.; Gordon, N. J., eds. (2001). Sequential Monte Carlo Methods in Practice. Springer.
- Cappé, O.; Moulines, E.; Rydén, T. (2005). Inference in Hidden Markov Models. Springer.
- Liu, J. S. (2001). Monte Carlo Strategies in Scientific Computing. Springer.
- Ristic, B.; Arulampalam, S.; Gordon, N. (2004). Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House.
- Doucet, A.; Johansen, A. M. (December 2008). "A tutorial on particle filtering and smoothing: fifteen years later" (PDF). Technical report, Department of Statistics, University of British Columbia.
- Doucet, A.; Godsill, S.; Andrieu, C. (2000). "On Sequential Monte Carlo Methods for Bayesian Filtering". Statistics and Computing. 10 (3): 197–208. doi:10.1023/A:1008935410038.
- Arulampalam, M. S.; Maskell, S.; Gordon, N.; Clapp, T. (2002). "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking". IEEE Transactions on Signal Processing. 50 (2): 174–188. doi:10.1109/78.978374.
- Cappé, O.; Godsill, S. J.; Moulines, E. (2007). "An overview of existing methods and recent advances in sequential Monte Carlo". Proceedings of the IEEE. 95 (5): 899. doi:10.1109/JPROC.2007.893250.
- Kitagawa, G. (1996). "Monte Carlo filter and smoother for non-Gaussian nonlinear state space models". Journal of Computational and Graphical Statistics. 5 (1): 1–25. doi:10.2307/1390750.
- Kotecha, J. H.; Djurić, P. M. (2003). "Gaussian Particle Filtering". IEEE Transactions on Signal Processing. 51 (10).
- Haug, A.J. (2005). "A Tutorial on Bayesian Estimation and Tracking Techniques Applicable to Nonlinear and Non-Gaussian Processes" (PDF). The MITRE Corporation, USA, Tech. Rep., Feb. Retrieved 2008-05-06.
- Pitt, M. K.; Shephard, N. (1999). "Filtering via Simulation: Auxiliary Particle Filters". Journal of the American Statistical Association. 94 (446): 590–591. doi:10.2307/2670179. Retrieved 2008-05-06.
- Gordon, N. J.; Salmond, D. J.; Smith, A. F. M. (1993). "Novel approach to nonlinear/non-Gaussian Bayesian state estimation". IEE Proceedings F on Radar and Signal Processing. 140 (2): 107–113. Retrieved 2009-09-19.
External links
- Sequential Monte Carlo Methods (Particle Filtering) homepage at the University of Cambridge
- Dieter Fox's MCL Animations
- Rob Hess' free software
- SMCTC: A Template Class for Implementing SMC algorithms in C++
- Tutorial on particle filtering with the MRPT C++ library, and a mobile robot localization video.
- Java applet on particle filtering