Multilevel sequential Monte Carlo samplers

Alexandros Beskos, Ajay Jasra, Kody Law, Raul Tempone, Yan Zhou

Research output: Contribution to journal › Article › peer-review


Abstract

In this article we consider the approximation of expectations w.r.t. probability distributions associated with the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort required to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated with a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve i.i.d. sampling from the associated sequence, and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that, under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context; that is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
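The telescoping identity underlying MLMC writes the expectation at the finest level as the level-0 expectation plus a sum of level-by-level corrections, E[f_L] = E[f_0] + Σ_{l=1}^{L} E[f_l − f_{l−1}], where each correction is estimated with coupled samples so its variance shrinks as h_l decreases. A minimal sketch in Python, using a hypothetical toy functional with an additive O(h_l) discretization bias as a stand-in for a PDE solve at mesh width h_l (the function names and sample sizes are illustrative assumptions, not the paper's method):

```python
import random

def f(x, level):
    # Toy level-l functional: stand-in for a quantity of interest computed
    # from a numerical PDE solve at step size h_l = 2**-level (hypothetical).
    # Its bias relative to the exact value x**2 is x**2 * h_l.
    h = 2.0 ** (-level)
    return x * x * (1.0 + h)

def mlmc_estimate(L, samples_per_level, seed=0):
    rng = random.Random(seed)
    # Level 0: plain Monte Carlo estimate of E[f_0(X)], X ~ N(0, 1).
    n0 = samples_per_level[0]
    est = sum(f(rng.gauss(0.0, 1.0), 0) for _ in range(n0)) / n0
    # Levels 1..L: telescoping corrections E[f_l(X) - f_{l-1}(X)],
    # evaluated with the SAME sample X at both levels (the coupling
    # that makes the correction variance decay with the level).
    for l in range(1, L + 1):
        n = samples_per_level[l]
        corr = 0.0
        for _ in range(n):
            x = rng.gauss(0.0, 1.0)
            corr += f(x, l) - f(x, l - 1)
        est += corr / n
    return est

# With bias x**2 * 2**-l, the level-L estimator targets E[X^2] * (1 + 2**-L),
# so for moderate L the estimate should be close to E[X^2] = 1.
print(mlmc_estimate(L=6, samples_per_level=[4000] + [1000] * 6))
```

Because the same draw X appears at levels l and l−1, each correction term equals x² (h_l − h_{l−1}), whose variance decays geometrically in l; this is what lets MLMC spend fewer samples on the expensive fine levels. The SMC variant of the paper replaces the unavailable i.i.d. coupled sampling with sequential Monte Carlo approximations of the level distributions.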
Original language: English (US)
Pages (from-to): 1417-1440
Number of pages: 24
Journal: Stochastic Processes and their Applications
Volume: 127
Issue number: 5
DOIs
State: Published - Aug 29 2016

