Particle filter: Difference between revisions
'''Particle filters''' or '''Sequential Monte Carlo''' (SMC) methods are a set of on-line posterior [[density estimation]] algorithms that estimate the posterior density of the state-space by directly implementing the [[Recursive Bayesian estimation|Bayesian recursion]] equations. The term "particle filters" was first coined in 1996 by Del Moral,<ref name="dm96">{{cite journal|last1 = Del Moral|first1 = Pierre|title = Non Linear Filtering: Interacting Particle Solution.|journal = Markov Processes and Related Fields|date = 1996|volume = 2|issue = 4|pages = 555–580|url = http://web.maths.unsw.edu.au/~peterdel-moral/mprfs.pdf}}</ref> and the terminology "sequential Monte Carlo" was proposed by Liu and Chen in 1998. Particle filters and SMC methods use a sampling approach, with a set of particles to represent the posterior density. The state-space model can be non-linear and the initial state and noise distributions can take any form required. SMC methods provide a well-established methodology<ref name="dm96"/><ref name=":2">{{cite journal|last1 = Del Moral|first1 = Pierre|title = Measure Valued Processes and Interacting Particle Systems. Application to Non Linear Filtering Problems|journal = Annals of Applied Probability|date = 1998|edition = Publications du Laboratoire de Statistique et Probabilités, 96-15 (1996)|volume = 8|issue = 2|pages = 438–495|url = http://projecteuclid.org/download/pdf_1/euclid.aoap/1028903535|doi = 10.1214/aoap/1028903535}}</ref><ref name=":1">{{Cite book|title = Feynman-Kac formulae. Genealogical and interacting particle approximations.|last = Del Moral|first = Pierre|publisher = Springer. Series: Probability and Applications|year = 2004|isbn = 978-0-387-20268-6|location = http://www.springer.com/gp/book/9780387202686|pages = 556}}</ref> for generating samples from the required distribution without requiring assumptions about the state-space model or the state distributions. 
However, these methods do not perform well when applied to high-dimensional systems. SMC methods implement the Bayesian recursion equations directly by using an ensemble-based approach. The samples from the distribution are represented by a set of particles; each particle has a weight assigned to it that represents the probability of that particle being sampled from the probability density function.
Weight disparity leading to weight collapse is a common issue encountered in these filtering algorithms; however, it can be mitigated by including a resampling step before the weights become too uneven. Several adaptive resampling criteria can be used, including the variance of the weights and the relative entropy with respect to the uniform distribution.<ref name=":0">{{cite journal|last1 = Del Moral|first1 = Pierre|last2 = Doucet|first2 = Arnaud|last3 = Jasra|first3 = Ajay|title = On Adaptive Resampling Procedures for Sequential Monte Carlo Methods|journal = Bernoulli|date = 2012|volume = 18|issue = 1|pages = 252–278|url = http://hal.inria.fr/docs/00/33/25/83/PDF/RR-6700.pdf|doi = 10.3150/10-bej335}}</ref> In the resampling step, the particles with negligible weights are replaced by new particles in the proximity of the particles with higher weights.
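As a concrete illustration of the adaptive resampling step described above, the following Python sketch uses the effective sample size as the unevenness criterion; the 50% threshold, helper names, and multinomial scheme are common illustrative choices, not prescriptions from the article:

```python
import numpy as np

def effective_sample_size(weights):
    """ESS = 1 / sum(w_i^2) for normalized weights; a low ESS signals weight collapse."""
    return 1.0 / np.sum(weights ** 2)

def multinomial_resample(particles, weights, rng):
    """Replace low-weight particles by drawing N indices with probability = weight."""
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)
    # After resampling, all surviving particles carry equal weight 1/N.
    return particles[idx], np.full(n, 1.0 / n)

def maybe_resample(particles, weights, rng, ess_threshold=0.5):
    """Resample only when the weights have become too uneven (adaptive criterion)."""
    n = len(particles)
    if effective_sample_size(weights) < ess_threshold * n:
        return multinomial_resample(particles, weights, rng)
    return particles, weights

rng = np.random.default_rng(0)
particles = rng.normal(size=1000)
weights = np.ones(1000) / 1000   # uniform weights: ESS = N, so no resampling is triggered
particles, weights = maybe_resample(particles, weights, rng)
```

Systematic or stratified resampling is often preferred in practice for its lower variance; multinomial resampling is used here only because it is the shortest to state.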
==History==
Traces of particle filters date back to the 1950s: the 'Poor Man's Monte Carlo' proposed by Hammersley et al. in 1954 already contained hints of the SMC methods used today. Later, in the 1970s, similar attempts were made in the control and genetic algorithms communities. In the signal processing literature, however, it was in 1993 that Gordon et al. published their seminal work 'A novel approach to nonlinear/non-Gaussian Bayesian state estimation', which provided the first true heuristic-like implementation of the SMC methods used today in Bayesian statistical inference. The authors named their algorithm 'the bootstrap filter' and demonstrated that, compared to other filtering methods, their bootstrap algorithm does not require any assumption about the state-space or the noise of the system. We also quote another pioneering article in this field by Genshiro Kitagawa,<ref>{{cite journal|last = Kitagawa|first = G.|year = 1996|title = Monte Carlo filter and smoother for non-Gaussian nonlinear state space models|volume = 5|issue = 1|journal = Journal of Computational and Graphical Statistics|pages = 1–25|doi = 10.2307/1390750|jstor = 1390750}}</ref> and the one by Himilcon Carvalho, Pierre Del Moral, André Monin and Gérard Salut<ref>{{cite journal|last1 = Carvalho|first1 = Himilcon|last2 = Del Moral|first2 = Pierre|last3 = Monin|first3 = André|last4 = Salut|first4 = Gérard|title = Optimal Non-linear Filtering in GPS/INS Integration.|journal = IEEE-Trans. on Aerospace and electronic systems|date = July 1997|volume = 33|issue = 3|url = http://homepages.laas.fr/monin/Version_anglaise/Publications_files/GPS.pdf}}</ref> published in the 1990s. All of these articles present particle filters as natural heuristic-like algorithms without a single proof of their consistency.
Particle filters belong to the class of [[genetic algorithms|genetic type algorithms]] and [[Mean field particle methods|mean field type particle methods]]. We quote a pioneering article by [[Ted Harris (mathematician)|Theodore E. Harris]] and Herman Kahn, published in 1951, using mean field but heuristic-like genetic methods for estimating particle transmission energies.<ref>{{cite journal|last1 = Kahn|first1 = Herman|last2 = Harris|first2 = Theodore, E.|title = Estimation of particle transmission by random sampling|journal = Natl. Bur. Stand. Appl. Math. Ser.|date = 1951|volume = 12|pages = 27–30|url = https://dornsifecms.usc.edu/assets/sites/520/docs/kahnharris.pdf}}</ref> Mean field genetic type particle methodologies are also used as heuristic natural search algorithms (a.k.a. [[Metaheuristic]]s) in evolutionary computing. The origins of these mean field computational techniques can be traced back to 1950 and 1954 with the work of [[Alan Turing]] on genetic type mutation-selection learning machines<ref>{{cite journal|last1 = Turing|first1 = Alan M.|title = Computing machinery and intelligence|journal = Mind|volume = LIX|issue = 236|pages = 433–460|doi = 10.1093/mind/LIX.236.433|url = http://mind.oxfordjournals.org/content/LIX/236/433}}</ref> and the articles by [[Nils Aall Barricelli]] at the [[Institute for Advanced Study]] in [[Princeton, New Jersey]].<ref>{{cite journal|last = Barricelli|first = Nils Aall|year = 1954|authorlink = Nils Aall Barricelli|title = Esempi numerici di processi di evoluzione|journal = Methodos|pages = 45–68}}</ref><ref>{{cite journal|last = Barricelli|first = Nils Aall|year = 1957|authorlink = Nils Aall Barricelli|title = Symbiogenetic evolution processes realized by artificial methods|journal = Methodos|pages = 143–182}}</ref> The Australian geneticist [[Alex Fraser (scientist)|Alex Fraser]] also published in 1957 a series of papers on the genetic type simulation of [[artificial selection]] of organisms.<ref>{{cite journal|last = Fraser|first = Alex|authorlink = Alex Fraser (scientist)|year = 1957|title = Simulation of genetic systems by automatic digital computers. I. Introduction|journal = Aust. J. Biol. Sci.|volume = 10|pages = 484–491}}</ref>
The conditional distribution of the random states of a signal, given some partial and noisy observations, is described by a Feynman-Kac probability measure on the random trajectories of the signal, weighted by likelihood potential functions. The first rigorous mathematical analysis of these models is due to Pierre Del Moral<ref name="dm96"/><ref name=":2"/> in 1996. This article also contains a proof of the unbiasedness properties of particle approximations of likelihood functions and unnormalized conditional probability measures. Branching type particle methodologies with varying population sizes were also developed at the end of the 1990s by Dan Crisan, Jessica Gaines and Terry Lyons,<ref>{{cite journal|last1 = Crisan|first1 = Dan|last2 = Gaines|first2 = Jessica|last3 = Lyons|first3 = Terry|title = Convergence of a branching particle method to the solution of the Zakai equation|journal = SIAM Journal on Applied Mathematics|date = 1998|volume = 58|issue = 5|pages = 1568–1590|doi = 10.1137/s0036139996307371}}</ref><ref>{{cite journal|last1 = Crisan|first1 = Dan|last2 = Lyons|first2 = Terry|title = Nonlinear filtering and measure-valued processes|journal = Probability Theory and Related Fields|date = 1997|volume = 109|issue = 2|pages = 217–244|doi = 10.1007/s004400050131}}</ref><ref>{{cite journal|last1 = Crisan|first1 = Dan|last2 = Lyons|first2 = Terry|title = A particle approximation of the solution of the Kushner–Stratonovitch equation|journal = Probability Theory and Related Fields|date = 1999|volume = 115|issue = 4|pages = 549–578|doi = 10.1007/s004400050249}}</ref> and by Dan Crisan, Pierre Del Moral and Terry Lyons.<ref>{{cite journal|last1 = Crisan|first1 = Dan|last2 = Del Moral|first2 = Pierre|last3 = Lyons|first3 = Terry|title = Discrete filtering using branching and interacting particle systems|journal = Markov Processes and Related Fields|date = 1999|volume = 5|issue = 3|pages = 293–318|url = http://web.maths.unsw.edu.au/~peterdel-moral/crisan98discrete.pdf}}</ref>
The theory of Feynman-Kac particle methodologies and related particle filter algorithms was developed in 2000 and 2004 in the books.<ref name="dmm00">{{cite book|last1 = Del Moral|first1 = Pierre|last2 = Miclo|first2 = Laurent|title = Branching and Interacting Particle Systems Approximations of Feynman-Kac Formulae with Applications to Non-Linear Filtering.|journal = Lecture Notes in Mathematics|date = 2000|volume = 1729|pages = 1–145|url = http://archive.numdam.org/ARCHIVE/SPS/SPS_2000__34_/SPS_2000__34__1_0/SPS_2000__34__1_0.pdf|doi = 10.1007/bfb0103798}}</ref><ref name=":1" /> These probabilistic models encapsulate genetic type algorithms, particle and bootstrap filters, interacting Kalman filters (a.k.a. Rao–Blackwellized particle filter<ref name="rbpf1999">{{cite conference
| id = {{citeseerx|10.1.1.137.5199}}
| title = Rao–Blackwellised particle filtering for dynamic Bayesian networks
| accessdate = 2012-04-09
}}
</ref>), importance sampling and resampling style particle filter techniques, genealogical tree based and particle backward methodologies for solving filtering and smoothing problems.
Other classes of particle filtering methodologies include genealogical tree based models,<ref name="dp13">{{cite book|last = Del Moral|first = Pierre|title = Mean field simulation for Monte Carlo integration|year = 2013|publisher = Chapman & Hall/CRC Press|quote = Monographs on Statistics & Applied Probability|url = http://www.crcpress.com/product/isbn/9781466504059|pages = 626}}</ref><ref name="dp04">{{cite book|last = Del Moral|first = Pierre|title = Feynman-Kac formulae. Genealogical and interacting particle approximations|year = 2004|publisher = Springer|quote = Series: Probability and Applications|url = http://www.springer.com/mathematics/probability/book/978-0-387-20268-6|pages = 575}}</ref><ref>{{cite journal|last1 = Del Moral|first1 = Pierre|last2 = Miclo|first2 = Laurent|title = Genealogies and Increasing Propagations of Chaos for Feynman-Kac and Genetic Models|journal = Annals of Applied Probability|date = 2001|volume = 11|issue = 4|pages = 1166–1198|url = http://web.maths.unsw.edu.au/~peterdel-moral/spc.ps}}</ref> backward particle models,<ref name="dp13" /><ref>{{cite journal|last1 = Del Moral|first1 = Pierre|last2 = Doucet|first2 = Arnaud|last3 = Singh|first3 = Sumeetpal, S.|title = A Backward Particle Interpretation of Feynman-Kac Formulae|journal = M2AN|date = 2010|volume = 44|issue = 5|pages = 947–976|url = http://hal.inria.fr/docs/00/42/13/56/PDF/RR-7019.pdf|doi = 10.1051/m2an/2010048}}</ref> adaptive mean field particle models,<ref name=":0"/> island type particle models,<ref>{{cite journal|last1 = Vergé|first1 = Christelle|last2 = Dubarry|first2 = Cyrille|last3 = Del Moral|first3 = Pierre|last4 = Moulines|first4 = Eric|title = On parallel implementation of Sequential Monte Carlo methods: the island particle model|journal = Statistics and Computing|date = 2013|doi = 10.1007/s11222-013-9429-x|volume = 25|pages = 243–260}}</ref><ref>{{cite journal|last1 = Chopin|first1 = Nicolas|last2 = Jacob|first2 = Pierre, E.|last3 = Papaspiliopoulos|first3 = Omiros|title = SMC^2: an efficient algorithm for sequential analysis of state-space models|journal = arXiv:1101.1528v3|url = http://arxiv.org/pdf/1101.1528v3.pdf}}</ref> and particle Markov chain Monte Carlo methodologies.<ref>{{cite journal|last1 = Andrieu|first1 = Christophe|last2 = Doucet|first2 = Arnaud|last3 = Holenstein|first3 = Roman|title = Particle Markov chain Monte Carlo methods|journal = Journal Royal Statistical Society B|date = 2010|volume = 72|issue = 3|pages = 269–342|doi = 10.1111/j.1467-9868.2009.00736.x}}</ref><ref>{{cite journal|last1 = Del Moral|first1 = Pierre|last2 = Patras|first2 = Frédéric|last3 = Kohn|first3 = Robert|title = On Feynman-Kac and particle Markov chain Monte Carlo models|journal = arXiv preprint arXiv:1404.5733|date = 2014|url = http://arxiv.org/pdf/1404.5733.pdf}}</ref> The first uniform convergence results with respect to the time parameter for particle filters were developed at the end of the 1990s by Pierre Del Moral and Alice Guionnet.<ref name="dg01">{{cite journal|last1 = Del Moral|first1 = Pierre|last2 = Guionnet|first2 = Alice|title = On the stability of interacting processes with applications to filtering and genetic algorithms|journal = Annales de l'Institut Henri Poincaré|date = 2001|volume = 37|issue = 2|pages = 155–194|url = http://web.maths.unsw.edu.au/~peterdel-moral/ihp.ps|doi = 10.1016/s0246-0203(00)01064-5}}</ref><ref name="dg99">{{cite journal|last1 = Del Moral|first1 = Pierre|last2 = Guionnet|first2 = Alice|title = On the stability of Measure Valued Processes with Applications to filtering|journal = C.R. Acad. Sci. Paris|date = 1999|volume = 39|issue = 1|pages = 429–434}}</ref> The first rigorous analysis of genealogical tree based particle filter smoothers is due to P. Del Moral and L. Miclo in 2001.<ref>{{Cite journal|title = Genealogies and Increasing Propagation of Chaos For Feynman-Kac and Genetic Models|url = http://projecteuclid.org/euclid.aoap/1015345399|journal = The Annals of Applied Probability|date = 2001|issn = 1050-5164|pages = 1166–1198|volume = 11|issue = 4|doi = 10.1214/aoap/1015345399|first = Pierre|last = Del Moral|first2 = Laurent|last2 = Miclo}}</ref>
==Objective==
Particle methods, like all sampling-based approaches (e.g., [[Markov chain Monte Carlo|MCMC]]), generate a set of samples that approximate the filtering distribution <math>p(x_k|y_0,\dots,y_k)</math>. For example, we may have <math>N</math> samples from the approximate posterior distribution of <math>x_k</math>, where the samples are labeled with superscripts as <math>x_k^{(1)}, x_k^{(2)}, \ldots, x_k^{(N)}</math>. Then, expectations with respect to the filtering distribution are approximated by
:<math display="block">\int f(x_k)p(x_k|y_0,\dots,y_k) \, dx_k\approx\frac1N \sum_{i=1}^Nf\left(x_k^{(i)}\right)=\int f(x_k)~\widehat{p}(dx_k|y_0,\dots,y_k)\quad\mbox{with}\quad\widehat{p}(dx_k|y_0,\dots,y_k)=\frac{1}{N}\sum_{1\leq i\leq N}\delta_{x^{(i)}_k}(dx_k)</math>

In the above display <math display="inline">\delta_a</math> stands for the [[Dirac measure]] at a given state <math display="inline">a</math>. The function <math>f(\cdot)</math>, in the usual way for Monte Carlo, can give all the [[moment (mathematics)|moments]] etc. of the distribution up to some degree of approximation. Particle filters can be interpreted as a genetic type particle algorithm evolving with mutation and selection transitions. If we keep track of the ancestral lines <math display="inline">(x^{i}_{0,k},x^{i}_{1,k},...,x^{i}_{k,k})</math> of the particles <math display="inline">i=1,...,N</math> (the lower indices l=0,...,k of <math display="inline">x^{i}_{l,k}</math> stand for the levels of the ancestors), then we have the approximation formula
<math display="block">\int F(x_0,...,x_k)p(x_0,...,x_k|y_0,\dots,y_k) \, dx_0...dx_k\approx \frac1N \sum_{i=1}^NF\left(x^{i}_{0,k},x^{i}_{1,k},...,x^{i}_{k,k}\right)</math>
Here F stands for some function on the path space of the signal.
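To make the two particle averages above concrete, here is a short Python sketch (an illustration, not part of the original article): a toy particle system whose ancestral lines are stored, so that both a moment of the filtering distribution and a path-space functional F can be estimated by averaging over particles. The Gaussian weights, the AR(1)-style mutation, and the choices of f and F are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 5000, 10

# Ancestral lines: row i holds the genealogy (x^i_{0,k}, ..., x^i_{k,k}) of particle i.
paths = rng.normal(size=(N, 1))                 # x_0^i drawn from a toy initial density
for _ in range(T):
    w = np.exp(-0.5 * paths[:, -1] ** 2)        # toy likelihood weights at the current state
    w /= w.sum()
    anc = rng.choice(N, size=N, p=w)            # selection duplicates whole ancestral lines
    paths = paths[anc]                          # genealogical bookkeeping: copy full histories
    new = 0.9 * paths[:, -1] + rng.normal(scale=0.5, size=N)   # mutation step
    paths = np.column_stack([paths, new])       # extend each ancestral line by one level

f = lambda x: x ** 2                            # a moment of the filtering distribution
moment_estimate = f(paths[:, -1]).mean()        # (1/N) * sum_i f(x_k^(i))

F = lambda p: p.sum(axis=1)                     # an example function on path space
path_estimate = F(paths).mean()                 # (1/N) * sum_i F(ancestral line i)
```

Storing full ancestral lines costs O(N·k) memory; practical smoothers often exploit the tree structure of the genealogy instead of copying histories as done here for clarity.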
== [[Mean field particle methods|Mean field particle simulation]] ==
The mean field particle interpretation of the nonlinear filtering equation is better explained using the evolution of the one-step optimal predictors
<math display="block">\boldsymbol{p(x_{k}|y_0,\ldots,y_{k-1})}\quad\longrightarrow\quad p(x_{k+1}|y_0,\ldots,y_k)=\int~p(x_{k+1}|x'_{k})~\displaystyle\frac{p(y_k|x_k')~\boldsymbol{p(x'_k|y_0,\dots,y_{k-1})}~dx'_k}{\displaystyle\int~p(y_k|x''_k)~\boldsymbol{p(x''_k|y_0,\ldots,y_{k-1})}~dx''_{k}}</math>For k=0 we use the convention <math>p(x_0|y_0,\dots,y_{-1}):=p(x_0)</math>.
One of the simplest ways to approximate these probability measures is to start with <math display="inline">N</math> independent random variables <math display="inline">\left(\xi^i_0\right)_{1\leq i\leq N}</math> with common probability density <math display="inline">p(x_0)</math>. By the law of large numbers, we have
<math display="block">\widehat{p}(dx_0)=\frac{1}{N}\sum_{1\leq i\leq N}\delta_{\xi^{i}_0}(dx_0)\approx_{N\uparrow\infty}~p(x_0)dx_0
\quad \mbox{in the sense that}\quad
\int f(x_0)\widehat{p}(dx_0)=\frac{1}{N}\sum_{1\leq i\leq N}f(\xi^i_0)\approx_{N\uparrow\infty} \int f(x_0)p(x_0)dx_0</math>for any bounded function <math>f</math>. We further assume that we have constructed a sequence of particles <math display="inline">\left(\xi^i_k\right)_{1\leq i\leq N}</math> at some rank k such that
<math display="block">\widehat{p}(dx_k|y_0,\ldots,y_{k-1}):=\frac{1}{N}\sum_{1\leq i\leq N}\delta_{\xi^{i}_k}(dx_k)\approx_{N\uparrow\infty}~p(x_k~|~y_0,\ldots,y_{k-1})dx_k</math>in the sense that for any bounded function <math>f</math> we have<math display="block">\int f(x_k)\widehat{p}(dx_k~|~y_0,\ldots,y_{k-1})=\frac{1}{N}\sum_{1\leq i\leq N}f(\xi^i_k)\approx_{N\uparrow\infty} \int f(x_k)p(x_k|y_0,\ldots,y_{k-1})dx_k</math>In this situation, '''replacing <math>p(x_k~|~y_0,\ldots,y_{k-1})~dx_k</math> by the empirical measure <math>\widehat{p}(dx_k~|~y_0,\ldots,y_{k-1})</math> in the evolution equation of the one-step optimal filter stated above,''' we find that
<math display="block">p(x_{k+1}|y_0,\ldots,y_k)\approx_{N\uparrow\infty} \int~p(x_{k+1}|x'_{k})~\displaystyle\frac{p(y_k|x_k')~\widehat{p}(dx'_k|y_0,\dots,y_{k-1})}{\displaystyle\int~p(y_k|x''_k)~\widehat{p}(dx''_k|y_0,\ldots,y_{k-1})}=\sum_{1\leq i\leq N}~\frac{p(y_k~|~\xi^i_k)}{\sum_{1\leq j\leq N}p(y_k~|~\xi^j_k)}~~p(x_{k+1}~|~\xi^i_k):=\widehat{q}(x_{k+1}|y_0,\ldots,y_k)</math>
where <math>p(y_k~|~\xi^i_k)</math> stands for the density <math>p(y_k~|~x_k)</math> evaluated at <math>x_k=\xi^i_k</math>, and <math>p(x_{k+1}~|~\xi^i_k)</math> stands for the density <math>p(x_{k+1}~|~x_k)</math> evaluated at <math>x_k=\xi^i_k</math>, for <math>i=1,\ldots,N</math>. Then, we sample <math display="inline">N</math> independent random variables <math display="inline">\left(\xi^i_{k+1}\right)_{1\leq i\leq N}</math> with common probability density <math display="inline">\widehat{q}(x_{k+1}|y_0,\ldots,y_k)</math> so that
<math display="block">\widehat{p}(dx_{k+1}|y_0,\ldots,y_{k}):=\frac{1}{N}\sum_{1\leq i\leq N}\delta_{\xi^{i}_{k+1}}(dx_{k+1})\approx_{N\uparrow\infty}~\widehat{q}(x_{k+1}~|~y_0,\ldots,y_{k})dx_{k+1}\approx_{N\uparrow\infty}~p(x_{k+1}~|~y_0,\ldots,y_{k})dx_{k+1}</math>Iterating this procedure, we design a Markov chain <math display="inline">\left(\xi^i_{k}\right)_{1\leq i\leq N}</math> such that
<math display="block">\widehat{p}(dx_k|y_0,\ldots,y_{k-1}):=\frac{1}{N}\sum_{1\leq i\leq N}\delta_{\xi^{i}_k}(dx_k)\approx_{N\uparrow\infty}~p(x_k~|~y_0,\ldots,y_{k-1})dx_k</math>
Notice that the optimal filter is approximated at each time step k using the Bayes' formula<math display="block">p(x_{k}|y_0,\ldots,y_{k})dx_k
\approx_{N\uparrow\infty}\displaystyle\frac{p(y_{k}|x_{k})~\widehat{p}(dx_{k}|y_0,\ldots,y_{k-1})}{\displaystyle\int p(y_{k}|x'_{k})\widehat{p}(dx'_{k}|y_0,\ldots,y_{k-1})}=\sum_{1\leq i\leq N}~\frac{p(y_k~|~\xi^i_k)}{\sum_{1\leq j\leq N}p(y_k~|~\xi^j_k)}~\delta_{\xi^i_k}(dx_k)</math>The terminology "mean field approximation" comes from the fact that we replace at each time step the probability measure '''<math>p(x_k~|~y_0,\ldots,y_{k-1})~dx_k</math>''' by the empirical approximation '''<math>\widehat{p}(dx_k~|~y_0,\ldots,y_{k-1})</math>.'''
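The mean field recursion above — weight each particle by the likelihood <math>p(y_k|\xi^i_k)</math>, select proportionally to the weights, then mutate through the transition density <math>p(x_{k+1}|x_k)</math> — can be sketched in Python for a toy linear-Gaussian model. The model, noise scales, particle count, and horizon below are illustrative assumptions, not part of the algorithm's statement:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5000                      # number of particles
T = 30                        # time horizon

# Toy state-space model (assumed): x_{k+1} = 0.9 x_k + noise, y_k = x_k + noise.
def transition(x):
    """Sample from p(x_{k+1} | x_k) for an array of states."""
    return 0.9 * x + rng.normal(scale=0.5, size=x.shape)

def likelihood(y, x):
    """p(y_k | x_k) up to a constant, for unit-variance Gaussian observation noise."""
    return np.exp(-0.5 * (y - x) ** 2)

# Simulate a synthetic signal and its observation sequence.
x_true, ys = 0.0, []
for _ in range(T):
    x_true = 0.9 * x_true + rng.normal(scale=0.5)
    ys.append(x_true + rng.normal(scale=1.0))

# Mean field particle recursion: selection by the likelihood, then mutation.
xi = rng.normal(size=N)                       # xi_0^i ~ p(x_0)
for y in ys:
    w = likelihood(y, xi)
    w = w / w.sum()                           # normalized selection weights
    ancestors = rng.choice(N, size=N, p=w)    # select proportionally to p(y_k | xi_k)
    xi = transition(xi[ancestors])            # mutate through p(x_{k+1} | x_k)

# xi now approximates the one-step predictor p(x_T | y_0, ..., y_{T-1}).
predictor_mean = xi.mean()
```

Each pass of the loop is exactly one application of the sampled mixture <math>\widehat{q}(x_{k+1}|y_0,\ldots,y_k)</math>: choosing an ancestor index with probability proportional to its likelihood weight and then drawing from the transition density centred at that ancestor.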
== Sequential importance resampling (SIR) ==
Revision as of 22:44, 1 June 2015
However, these methods do not perform well when applied to high-dimensional systems. SMC methods implement the Bayesian recursion equations directly by using an ensemble based approach. The samples from the distribution are represented by a set of particles; each particle has a weight assigned to it that represents the probability of that particle being sampled from the probability density function.
Weight disparity leading to weight collapse is a common issue encountered in these filtering algorithms; however it can be mitigated by including a resampling step before the weights become too uneven. Several adaptive resampling criteria can be used, including the variance of the weights and the relative entropy w.r.t. the uniform distribution.[4] In the resampling step, the particles with negligible weights are replaced by new particles in the proximity of the particles with higher weights.
History
Traces of particle filters date back to the 1950s: the 'Poor Man's Monte Carlo', proposed by Hammersley et al. in 1954, contained hints of the SMC methods used today. Later, in the 1970s, similar attempts were made in the control and genetic algorithms communities. In the signal processing literature, however, it was in 1993 that Gordon et al. published their seminal work 'A Novel Approach to Nonlinear/Non-Gaussian Bayesian State Estimation', which provided the first true heuristic-like implementation of the SMC methods used today in Bayesian statistical inference. The authors named their algorithm 'the bootstrap filter', and demonstrated that, compared to other filtering methods, their bootstrap algorithm does not require any assumption about the state-space or the noise of the system. We also quote another pioneering article in this field by Genshiro Kitagawa,[5] and the one by Himilcon Carvalho, Pierre Del Moral, André Monin and Gérard Salut[6] published in the 1990s. All of these articles present particle filters as natural heuristic-like algorithms without a single proof of their consistency.
Particle filters belong to the class of genetic type algorithms and mean field type particle methods. We quote a pioneering article by Theodore E. Harris and Herman Kahn, published in 1951, using mean field but heuristic-like genetic methods for estimating particle transmission energies.[7] Mean field genetic type particle methodologies are also used as heuristic natural search algorithms (a.k.a. Metaheuristic) in evolutionary computing. The origins of these mean field computational techniques can be traced to 1950 and 1954 with the work of Alan Turing on genetic type mutation-selection learning machines[8] and the articles by Nils Aall Barricelli at the Institute for Advanced Study in Princeton, New Jersey.[9][10] In 1957, the Australian geneticist Alex Fraser also published a series of papers on the genetic type simulation of artificial selection of organisms.[11]
The conditional distribution of the random states of a signal, given some partial and noisy observations, is described by a Feynman-Kac probability on the random trajectories of the signal weighted by likelihood potential functions. The first rigorous mathematical analysis of these models is due to Pierre Del Moral[1][2] in 1996. This article also contains a proof of the unbiasedness properties of particle approximations of likelihood functions and unnormalized conditional probability measures. Branching type particle methodologies with varying population sizes were also developed at the end of the 1990s by Dan Crisan, Jessica Gaines and Terry Lyons,[12][13][14] and by Dan Crisan, Pierre Del Moral and Terry Lyons.[15]
The theory of Feynman-Kac particle methodologies and related particle filter algorithms was developed in 2000 and 2004 in the books.[16][3] These probabilistic models encapsulate genetic type algorithms, particle and bootstrap filters, interacting Kalman filters (a.k.a. Rao–Blackwellized particle filters[17]), importance sampling and resampling style particle filter techniques, and genealogical tree based and particle backward methodologies for solving filtering and smoothing problems. Other classes of particle filtering methodologies include genealogical tree based models,[18][19][20] backward particle models,[18][21] adaptive mean field particle models,[4] island type particle models,[22][23] and particle Markov chain Monte Carlo methodologies.[24][25] The first uniform convergence results with respect to the time parameter for particle filters were developed at the end of the 1990s by Pierre Del Moral and Alice Guionnet.[26][27] The first rigorous analysis of genealogical tree based particle filter smoothers is due to P. Del Moral and L. Miclo in 2001.[28]
Objective
The objective of a particle filter is to estimate the posterior density of the state variables given the observation variables. The particle filter is designed for a hidden Markov model, where the system consists of hidden and observable variables. The observable variables (observation process) are related to the hidden variables (state process) by some functional form that is known. Similarly, the dynamical system describing the evolution of the state variables is also known probabilistically.
A generic particle filter estimates the posterior distribution of the hidden states using the observation measurement process. Consider a state-space shown in the diagram (Figure 2).
The objective of the particle filter is to estimate the values of the hidden states x, given the values of the observation process y.
The particle filter aims to estimate the sequence of hidden parameters, xk for k = 0,1,2,3,…, based only on the observed data yk for k = 0,1,2,3,…. All Bayesian estimates of xk follow from the posterior distribution p(xk | y0,y1,…,yk). In contrast, the MCMC or importance sampling approach would model the full posterior p(x0,x1,…,xk | y0,y1,…,yk).
Model
Particle methods assume that the hidden states xk and the observations yk can be modeled in this form:
- x0, x1, … is a first order Markov process that evolves according to the transition density p(xk | xk−1), and with an initial distribution p(x0).
- The observations y0, y1, … are conditionally independent provided that x0, x1, … are known.
- In other words, each yk only depends on xk. This conditional distribution for yk given xk is written p(yk | xk).
An example system with these properties is:
xk = g(xk−1) + wk
yk = h(xk) + vk
where both wk and vk are mutually independent and identically distributed sequences with known probability density functions, and g(·) and h(·) are known functions. These two equations can be viewed as state space equations and look similar to the state space equations for the Kalman filter. The conditional distributions p(x0,x1,…,xk | y0,y1,…,yk) are given by Bayes' rule
p(x0,…,xk | y0,…,yk) ∝ p(y0,…,yk | x0,…,xk) p(x0,…,xk)
with
p(y0,…,yk | x0,…,xk) = p(y0 | x0) ⋯ p(yk | xk) and p(x0,…,xk) = p(x0) p(x1 | x0) ⋯ p(xk | xk−1)
If the functions g and h are linear, and if both the system noise wk and the observation noise vk are Gaussian, the Kalman filter finds the exact Bayesian filtering distribution. If not, Kalman filter based methods are a first-order approximation (EKF) or a second-order approximation (UKF in general, but if the probability distribution is Gaussian a third-order approximation is possible). Particle filters are also an approximation, but with enough particles they can be much more accurate.[1][2][3][26][27]
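The example state-space system above can be simulated directly. The sketch below assumes NumPy is available and uses an illustrative scalar choice of g and h (these particular functions and noise variances are hypothetical, picked only to make the sketch concrete):

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # illustrative nonlinear state transition function
    return 0.5 * x + 25.0 * x / (1.0 + x**2)

def h(x):
    # illustrative nonlinear observation function
    return x**2 / 20.0

T = 50
x = np.zeros(T)  # hidden states x_k
y = np.zeros(T)  # observations y_k

x[0] = rng.normal(0.0, 1.0)            # initial state from p(x_0)
y[0] = h(x[0]) + rng.normal(0.0, 1.0)  # v_0 ~ N(0, 1)
for k in range(1, T):
    x[k] = g(x[k - 1]) + rng.normal(0.0, np.sqrt(10.0))  # w_k ~ N(0, 10)
    y[k] = h(x[k]) + rng.normal(0.0, 1.0)                # v_k ~ N(0, 1)
```

The hidden trajectory `x` would be unknown in practice; a particle filter sees only `y` and tries to recover the distribution of each xk.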
Monte Carlo approximation
Particle methods, like all sampling-based approaches (e.g., MCMC), generate a set of samples that approximate the filtering distribution p(xk | y0,…,yk). For example, we may have N samples from the approximate posterior distribution of xk, where the samples are labeled with superscripts as xk(1), …, xk(N). Then, expectations with respect to the filtering distribution are approximated by
∫ f(xk) p(xk | y0,…,yk) dxk ≈ (1/N) ∑i=1…N f(xk(i))
In the above display, the empirical measure (1/N) ∑i δxk(i)(dxk) approximates the filtering distribution, where δa stands for the Dirac measure at a given state a. The function f, in the usual way for Monte Carlo, can give all the moments etc. of the distribution up to some degree of approximation. Particle filters can be interpreted as a genetic type particle algorithm evolving with mutation and selection transitions. If we keep track of the ancestral lines (ξ0,k(i), ξ1,k(i), …, ξk,k(i)) of the particles (the lower indices l, with l = 0,…,k, stand for the levels of the ancestors), then we have the approximation formula
(1/N) ∑i=1…N F(ξ0,k(i), …, ξk,k(i)) ≈ ∫ F(x0,…,xk) p(x0,…,xk | y0,…,yk) dx0⋯dxk
Here F stands for some function on the path space of the signal.
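As a toy illustration of the Monte Carlo approximation of expectations described above (assuming NumPy; the Gaussian here is only a stand-in for samples a filter would actually produce):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
# pretend these N samples approximate the filtering distribution p(x_k | y_0..k)
samples = rng.normal(2.0, 1.0, size=N)

# E[f(x_k)] ≈ (1/N) * sum_i f(x_k^(i)) for any test function f
mean_est = np.mean(samples)          # f(x) = x   -> first moment
second_moment = np.mean(samples**2)  # f(x) = x^2 -> second moment
```

For a N(2, 1) stand-in the true moments are E[x] = 2 and E[x²] = 5, so both estimates land close to those values as N grows.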
Sequential importance resampling (SIR)
Sequential importance resampling (SIR), the original particle filtering algorithm (Gordon et al. 1993), is a very commonly used particle filtering algorithm, which approximates the filtering distribution p(xk | y0,…,yk) by a weighted set of P particles
{(wk(i), xk(i)) : i = 1, …, P}.
The importance weights wk(i) are approximations to the relative posterior probabilities (or densities) of the particles such that ∑i=1…P wk(i) = 1.
SIR is a sequential (i.e., recursive) version of importance sampling. As in importance sampling, the expectation of a function f can be approximated as a weighted average
∫ f(xk) p(xk | y0,…,yk) dxk ≈ ∑i=1…P wk(i) f(xk(i)).
For a finite set of particles, the algorithm performance is dependent on the choice of the proposal distribution
- π(xk | x0:k−1, y0:k).
The "optimal" proposal distribution is given as the target distribution
π(xk | x0:k−1, y0:k) = p(xk | xk−1, yk).
This particular choice of proposal transition was proposed by P. Del Moral in 1996 and 1998.[2] When it is difficult to sample transitions according to the distribution p(xk | xk−1, yk), one natural strategy is to use the following particle approximation
p(yk | xk) p̂(dxk | xk−1) / ∫ p(yk | x′k) p̂(dx′k | xk−1)
with the empirical approximation
p̂(dxk | xk−1) = (1/N) ∑i=1…N δXk(i)(dxk) ≈ p(xk | xk−1) dxk
associated with N (or any other large number of samples) independent random samples Xk(i), i = 1, …, N, with the conditional distribution of the random state xk given xk−1. The consistency of the particle filter resulting from this approximation, and other extensions, are developed in [2]. In the above display δa stands for the Dirac measure at a given state a.
However, the transition prior probability distribution is often used as importance function, since it is easier to draw particles (or samples) and perform subsequent importance weight calculations:
π(xk | x0:k−1, y0:k) = p(xk | xk−1).
Sequential Importance Resampling (SIR) filters with the transition prior probability distribution as importance function are commonly known as the bootstrap filter and the condensation algorithm.
Resampling is used to avoid the problem of degeneracy of the algorithm, that is, avoiding the situation in which all but one of the importance weights are close to zero. The performance of the algorithm can also be affected by the proper choice of resampling method. The stratified sampling proposed by Kitagawa (1996) is optimal in terms of variance.
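A low-variance resampling step can be sketched as follows, assuming NumPy. This is one standard stratified scheme (one uniform draw per stratum of the unit interval), offered as an illustration rather than as Kitagawa's exact formulation:

```python
import numpy as np

def stratified_resample(weights, rng):
    """Return P indices into the particle set, drawn by stratified resampling.
    `weights` must be normalized (sum to 1)."""
    P = len(weights)
    # one uniform position inside each stratum [i/P, (i+1)/P)
    positions = (np.arange(P) + rng.uniform(size=P)) / P
    # map each position to the particle whose cumulative weight covers it
    return np.searchsorted(np.cumsum(weights), positions)

rng = np.random.default_rng(2)
w = np.array([0.5, 0.3, 0.1, 0.1])
idx = stratified_resample(w, rng)  # e.g. the heavy particle 0 is kept twice
```

After resampling, the particle set is replaced by `particles[idx]` and all weights are reset to 1/P.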
A single step of sequential importance resampling is as follows:
- 1) For i = 1, …, P draw samples from the proposal distribution
xk(i) ~ π(xk | x0:k−1(i), y0:k)
- 2) For i = 1, …, P update the importance weights up to a normalizing constant:
ŵk(i) = wk−1(i) p(yk | xk(i)) p(xk(i) | xk−1(i)) / π(xk(i) | x0:k−1(i), y0:k)
- Note that when we use the transition prior probability distribution as the importance function, π(xk | x0:k−1(i), y0:k) = p(xk | xk−1(i)), this simplifies to the following:
ŵk(i) = wk−1(i) p(yk | xk(i))
- 3) For i = 1, …, P compute the normalized importance weights:
wk(i) = ŵk(i) / ∑j=1…P ŵk(j)
- 4) Compute an estimate of the effective number of particles as
N̂eff = 1 / ∑i=1…P (wk(i))²
This criterion reflects the variance of the weights; other criteria can be found in the article,[4] including their rigorous analysis and central limit theorems.
- 5) If the effective number of particles is less than a given threshold, N̂eff < Nthr, then perform resampling:
- a) Draw P particles from the current particle set with probabilities proportional to their weights. Replace the current particle set with this new one.
- b) For i = 1, …, P set wk(i) = 1/P.
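Steps 1–5 above, specialized to the transition prior as proposal (the bootstrap filter), can be sketched as follows. NumPy is assumed, and the Gaussian random-walk transition and Gaussian likelihood are illustrative stand-ins for p(xk | xk−1) and p(yk | xk), not part of the algorithm itself:

```python
import numpy as np

def sir_step(particles, weights, y_k, rng, n_threshold):
    """One bootstrap-filter step: propagate, reweight, normalize, maybe resample."""
    P = len(particles)
    # 1) proposal = transition prior p(x_k | x_{k-1}); here x_k = x_{k-1} + N(0, 1)
    particles = particles + rng.normal(0.0, 1.0, size=P)
    # 2) weight update simplifies to w_k ∝ w_{k-1} p(y_k | x_k); here y_k ~ N(x_k, 1)
    weights = weights * np.exp(-0.5 * (y_k - particles) ** 2)
    # 3) normalize the weights
    weights = weights / np.sum(weights)
    # 4) effective number of particles
    n_eff = 1.0 / np.sum(weights**2)
    # 5) resample (multinomial, for simplicity) when N_eff drops below the threshold
    if n_eff < n_threshold:
        idx = rng.choice(P, size=P, p=weights)
        particles = particles[idx]
        weights = np.full(P, 1.0 / P)
    return particles, weights

rng = np.random.default_rng(3)
P = 500
particles = rng.normal(0.0, 1.0, size=P)   # samples from p(x_0)
weights = np.full(P, 1.0 / P)
particles, weights = sir_step(particles, weights, y_k=1.0, rng=rng, n_threshold=P / 2)
est = np.sum(weights * particles)          # posterior mean estimate of x_1
```

With these Gaussian stand-ins the exact posterior mean after one observation is 2/3, which `est` approximates.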
The term Sampling Importance Resampling is also sometimes used when referring to SIR filters.
Sequential importance sampling (SIS)
- Sequential importance sampling is the same as sequential importance resampling, but without the resampling stage.
"direct version" algorithm
The "direct version" algorithm [citation needed] is rather simple (compared to other particle filtering algorithms) and it uses composition and rejection. To generate a single sample x at time k from p(xk | y0,…,yk):
- 1) Set n = 0 (this will count the number of particles generated so far)
- 2) Uniformly choose an index i from the range {1, …, P}
- 3) Generate a test x̂ from the distribution p(xk | xk−1(i))
- 4) Generate the probability of ŷ using x̂ from p(y | x̂), where yk is the measured value
- 5) Generate another uniform u from [0, mk], where mk is an upper bound on p(y | x)
- 6) Compare u and p(ŷ)
- 6a) If u is larger, then repeat from step 2
- 6b) If u is smaller, then save x̂ as xk(n) and increment n
- 7) If n = P, then quit
The goal is to generate P "particles" at time k using only the particles from time k−1. This requires that a Markov equation can be written (and computed) to generate a xk based only upon xk−1. This algorithm uses composition of the P particles from k−1 to generate a particle at k and repeats (steps 2–6) until P particles are generated at k.
This can be more easily visualized if x is viewed as a two-dimensional array. One dimension is time k and the other dimension is the particle number. For example, x(i|k) would be the ith particle at time k, and can also be written xk(i) (as done above in the algorithm). Step 3 generates a potential xk based on a randomly chosen particle xk−1(i) at time k−1 and rejects or accepts it in step 6. In other words, the xk values are generated using the previously generated xk−1 values.
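The composition-and-rejection scheme above can be sketched as follows, assuming NumPy. The Gaussian transition and likelihood densities are illustrative stand-ins, and the bound m on the likelihood is taken as the Gaussian density maximum:

```python
import numpy as np

def direct_version_step(prev_particles, y_k, rng):
    """Generate P new particles at time k by composition and rejection."""
    P = len(prev_particles)
    m = 1.0 / np.sqrt(2.0 * np.pi)  # upper bound on the N(y_k; x, 1) likelihood
    new_particles = np.empty(P)
    n = 0                                             # step 1: particles generated so far
    while n < P:                                      # step 7: quit when n = P
        i = rng.integers(P)                           # step 2: uniform index
        x_test = prev_particles[i] + rng.normal()     # step 3: draw from p(x_k | x_{k-1}^(i))
        p_y = np.exp(-0.5 * (y_k - x_test) ** 2) / np.sqrt(2.0 * np.pi)  # step 4
        u = rng.uniform(0.0, m)                       # step 5
        if u <= p_y:                                  # step 6: accept or reject
            new_particles[n] = x_test
            n += 1
    return new_particles

rng = np.random.default_rng(4)
prev = rng.normal(0.0, 1.0, size=300)        # particles at time k-1
new = direct_version_step(prev, y_k=0.5, rng=rng)
```

Accepted particles are unweighted draws that already account for the observation, so no separate weighting or resampling step is needed.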
Other particle filters
- Auxiliary particle filter [29]
- Regularized auxiliary particle filter [30]
- Gaussian particle filter
- Unscented particle filter
- Gauss–Hermite particle filter
- Cost Reference particle filter
- Hierarchical/Scalable particle filter [31]
- Rao–Blackwellized particle filter [17]
- Rejection-sampling based optimal particle filter [32][33]
- Feynman-Kac and mean field particle methodologies[1][3][18]
See also
- Ensemble Kalman filter
- Generalized filtering
- Moving horizon estimation
- Recursive Bayesian estimation
- Monte Carlo localization
- Mean field particle methods
References
- ^ a b c d e Del Moral, Pierre (1996). "Non Linear Filtering: Interacting Particle Solution" (PDF). Markov Processes and Related Fields. 2 (4): 555–580.
- ^ a b c d e Del Moral, Pierre (1998). "Measure Valued Processes and Interacting Particle Systems. Application to Non Linear Filtering Problems". Annals of Applied Probability. 8 (2) (Publications du Laboratoire de Statistique et Probabilités, 96-15 (1996) ed.): 438–495. doi:10.1214/aoap/1028903535.
- ^ a b c d Del Moral, Pierre (2004). Feynman-Kac formulae. Genealogical and interacting particle approximations. Springer, Series: Probability and Applications. p. 556. ISBN 978-0-387-20268-6.
- ^ a b c Del Moral, Pierre; Doucet, Arnaud; Jasra, Ajay (2012). "On Adaptive Resampling Procedures for Sequential Monte Carlo Methods" (PDF). Bernoulli. 18 (1): 252–278. doi:10.3150/10-bej335.
- ^ Kitagawa, G. (1996). "Monte carlo filter and smoother for non-Gaussian nonlinear state space models". Journal of Computational and Graphical Statistics. 5 (1): 1–25. doi:10.2307/1390750. JSTOR 1390750.
- ^ Carvalho, Himilcon; Del Moral, Pierre; Monin, André; Salut, Gérard (July 1997). "Optimal Non-linear Filtering in GPS/INS Integration" (PDF). IEEE-Trans. on Aerospace and electronic systems. 33 (3).
- ^ Kahn, Herman; Harris, Theodore E. (1951). "Estimation of particle transmission by random sampling" (PDF). Natl. Bur. Stand. Appl. Math. Ser. 12: 27–30.
- ^ Turing, Alan M. (1950). "Computing machinery and intelligence". Mind. LIX (236): 433–460. doi:10.1093/mind/LIX.236.433.
- ^ Barricelli, Nils Aall (1954). "Esempi numerici di processi di evoluzione". Methodos: 45–68.
- ^ Barricelli, Nils Aall (1957). "Symbiogenetic evolution processes realized by artificial methods". Methodos: 143–182.
- ^ Fraser, Alex (1957). "Simulation of genetic systems by automatic digital computers. I. Introduction". Aust. J. Biol. Sci. 10: 484–491.
- ^ Crisan, Dan; Gaines, Jessica; Lyons, Terry (1998). "Convergence of a branching particle method to the solution of the Zakai". SIAM Journal on Applied Mathematics. 58 (5): 1568–1590. doi:10.1137/s0036139996307371.
- ^ Crisan, Dan; Lyons, Terry (1997). "Nonlinear filtering and measure-valued processes". Probability Theory and Related Fields. 109 (2): 217–244. doi:10.1007/s004400050131.
- ^ Crisan, Dan; Lyons, Terry (1999). "A particle approximation of the solution of the Kushner–Stratonovitch equation". Probability Theory and Related Fields. 115 (4): 549–578. doi:10.1007/s004400050249.
- ^ Crisan, Dan; Del Moral, Pierre; Lyons, Terry (1999). "Discrete filtering using branching and interacting particle systems" (PDF). Markov Processes and Related Fields. 5 (3): 293–318.
- ^ Del Moral, Pierre; Miclo, Laurent (2000). Branching and Interacting Particle Systems Approximations of Feynman-Kac Formulae with Applications to Non-Linear Filtering (PDF). Lecture Notes in Mathematics. Vol. 1729. pp. 1–145. doi:10.1007/bfb0103798.
- ^ a b Doucet, A. (2000). Rao–Blackwellised particle filtering for dynamic Bayesian networks. Proceedings of the Sixteenth conference on Uncertainty in artificial intelligence. pp. 176–183. CiteSeerX 10.1.1.137.5199.
- ^ a b c Del Moral, Pierre (2013). Mean field simulation for Monte Carlo integration. Monographs on Statistics & Applied Probability. Chapman & Hall/CRC Press. p. 626.
- ^ Del Moral, Pierre (2004). Feynman-Kac formulae. Genealogical and interacting particle approximations. Series: Probability and Applications. Springer. p. 575.
- ^ Del Moral, Pierre; Miclo, Laurent (2001). "Genealogies and Increasing Propagations of Chaos for Feynman-Kac and Genetic Models". Annals of Applied Probability. 11 (4): 1166–1198.
- ^ Del Moral, Pierre; Doucet, Arnaud; Singh, Sumeetpal S. (2010). "A Backward Particle Interpretation of Feynman-Kac Formulae" (PDF). M2AN. 44 (5): 947–976. doi:10.1051/m2an/2010048.
- ^ Vergé, Christelle; Dubarry, Cyrille; Del Moral, Pierre; Moulines, Eric (2013). "On parallel implementation of Sequential Monte Carlo methods: the island particle model". Statistics and Computing. 25: 243–260. doi:10.1007/s11222-013-9429-x.
- ^ Chopin, Nicolas; Jacob, Pierre E.; Papaspiliopoulos, Omiros. "SMC^2: an efficient algorithm for sequential analysis of state-space models" (PDF). arXiv:1101.1528v3.
- ^ Andrieu, Christophe; Doucet, Arnaud; Holenstein, Roman (2010). "Particle Markov chain Monte Carlo methods". Journal Royal Statistical Society B. 72 (3): 269–342. doi:10.1111/j.1467-9868.2009.00736.x.
- ^ Del Moral, Pierre; Patras, Frédéric; Kohn, Robert (2014). "On Feynman-Kac and particle Markov chain Monte Carlo models" (PDF). arXiv preprint arXiv:1404.5733.
- ^ a b Del Moral, Pierre; Guionnet, Alice (2001). "On the stability of interacting processes with applications to filtering and genetic algorithms". Annales de l'Institut Henri Poincaré. 37 (2): 155–194. doi:10.1016/s0246-0203(00)01064-5.
- ^ a b Del Moral, Pierre; Guionnet, Alice (1999). "On the stability of Measure Valued Processes with Applications to filtering". C.R. Acad. Sci. Paris. 39 (1): 429–434.
- ^ Del Moral, Pierre; Miclo, Laurent (2001). "Genealogies and Increasing Propagation of Chaos For Feynman-Kac and Genetic Models". The Annals of Applied Probability. 11 (4): 1166–1198. doi:10.1214/aoap/1015345399. ISSN 1050-5164.
- ^ Pitt, M.K.; Shephard, N. (1999). "Filtering Via Simulation: Auxiliary Particle Filters". Journal of the American Statistical Association. 94 (446). American Statistical Association: 590–591. doi:10.2307/2670179. JSTOR 2670179. Retrieved 2008-05-06.
- ^ Liu, J.; Wang, W.; Ma, F. (2011). "A Regularized Auxiliary Particle Filtering Approach for System State Estimation and Battery Life Prediction". Smart Materials and Structures. 20 (7): 1–9. doi:10.1088/0964-1726/20/7/075021.
- ^ Canton-Ferrer, C.; Casas, J.R.; Pardàs, M. (2011). "Human Motion Capture Using Scalable Body Models". Computer Vision and Image Understanding. 115 (10). Elsevier: 1363–1374. doi:10.1016/j.cviu.2011.06.001.
- ^ Blanco, J.L. (2008). An Optimal Filtering Algorithm for Non-Parametric Observation Models in Robot Localization. IEEE International Conference on Robotics and Automation (ICRA'08). pp. 461–466. CiteSeerX 10.1.1.190.7092.
- ^ Blanco, J.L. (2010). "Optimal Filtering for Non-Parametric Observation Models: Applications to Localization and SLAM" (PDF). The International Journal of Robotics Research (IJRR). 29 (14): 1726–1742. doi:10.1177/0278364910364165.
Bibliography
- Cappe, O. (2005). Inference in Hidden Markov Models. Springer.
- Liu, J.S.; Chen, R. (1998). "Sequential Monte Carlo methods for dynamic systems" (PDF). Journal of the American Statistical Association. 93 (443). Taylor & Francis Group: 1032–1044. doi:10.1080/01621459.1998.10473765.
- Liu, J.S. (2001). Monte Carlo strategies in Scientific Computing. Springer.
- Kong, A.; Liu, J.S.; Wong, W.H. (1994). "Sequential imputations and Bayesian missing data problems" (PDF). Journal of the American Statistical Association. 89 (425). Taylor & Francis Group: 278–288. doi:10.1080/01621459.1994.10476469.
- Liu, J.S.; Chen, R. (1995). "Blind deconvolution via sequential imputations" (PDF). Journal of the American Statistical Association. 90 (430). Taylor & Francis Group: 567–576. doi:10.2307/2291068.
- Ristic, B. (2004). Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House.
- Doucet, A. (December 2008). "A tutorial on particle filtering and smoothing: fifteen years later" (PDF). Technical report. Department of Statistics, University of British Columbia.
- Doucet, A. (2000). "On sequential Monte Carlo sampling methods for Bayesian filtering". Statistics and Computing. 10 (3): 197–208. doi:10.1023/A:1008935410038.
- Arulampalam, M.S. (2002). "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking". IEEE Transactions on Signal Processing. 50 (2): 174–188. doi:10.1109/78.978374.
- Cappe, O. (2007). "An overview of existing methods and recent advances in sequential Monte Carlo". Proceedings of IEEE. 95 (5): 899–924. doi:10.1109/JPROC.2007.893250.
- Kitagawa, G. (1996). "Monte carlo filter and smoother for non-Gaussian nonlinear state space models". Journal of Computational and Graphical Statistics. 5 (1): 1–25. doi:10.2307/1390750. JSTOR 1390750.
- Kotecha, J.H. (2003). "Gaussian Particle filtering". IEEE Transactions on Signal Processing. 51 (10).
- Haug, A.J. (2005). "A Tutorial on Bayesian Estimation and Tracking Techniques Applicable to Nonlinear and Non-Gaussian Processes" (PDF). The MITRE Corporation, USA, Tech. Rep., Feb. Retrieved 2008-05-06.
- Pitt, M.K.; Shephard, N. (1999). "Filtering Via Simulation: Auxiliary Particle Filters". Journal of the American Statistical Association. 94 (446): 590–591. doi:10.2307/2670179. JSTOR 2670179. Retrieved 2008-05-06.
- Gordon, N. J. (1993). "Novel approach to nonlinear/non-Gaussian Bayesian state estimation". IEE Proceedings F on Radar and Signal Processing. 140 (2): 107–113. doi:10.1049/ip-f-2.1993.0015. Retrieved 2009-09-19.
- Chen, Z. (2003). "Bayesian Filtering: From Kalman Filters to Particle Filters, and Beyond". CiteSeerx: 10.1.1.107.7415.
- Vaswani, N. (2007). "Tracking deforming objects using particle filtering for geometric active contours". IEEE Trans. on Pattern Analysis and Machine Intelligence. 29 (8): 1470–1475. doi:10.1109/tpami.2007.1081.
External links
- Feynman–Kac models and interacting particle algorithms (a.k.a. Particle Filtering) Theoretical aspects and a list of application domains of particle filters
- Sequential Monte Carlo Methods (Particle Filtering) homepage on University of Cambridge
- Dieter Fox's MCL Animations
- Rob Hess' free software
- SMCTC: A Template Class for Implementing SMC algorithms in C++
- Java applet on particle filtering
- vSMC : Vectorized Sequential Monte Carlo