Unbiased Inference for Discretely Observed Hidden Markov Model Diffusions

Neil Kumar Chada, Jordan Franks, Ajay Jasra, Kody J H Law, Matti Vihola

Research output: Contribution to journal › Article › peer-review

Abstract

We develop a Bayesian inference method for diffusions observed discretely and with noise, which is free of discretization bias. Unlike existing unbiased inference methods, our method does not rely on exact simulation techniques. Instead, it uses standard time-discretized approximations of diffusions, such as the Euler--Maruyama scheme. Our approach is based on particle marginal Metropolis--Hastings, a particle filter, randomized multilevel Monte Carlo, and an importance-sampling-type correction of approximate Markov chain Monte Carlo. The resulting estimator leads to inference without a bias from the time-discretization as the number of Markov chain iterations increases. We give convergence results and recommend allocations for algorithm inputs. Our method admits a straightforward parallelization and can be computationally efficient. The user-friendly approach is illustrated on three examples, where the underlying diffusion is an Ornstein--Uhlenbeck process, a geometric Brownian motion, and a 2D nonreversible Langevin equation.
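As a point of reference for the time-discretized approximations the abstract mentions, the following is a minimal sketch of an Euler--Maruyama discretization of an Ornstein--Uhlenbeck process (the first of the paper's three examples). The parameter names (`theta`, `mu`, `sigma`) and step counts here are illustrative assumptions, not taken from the paper's experiments.

```python
import numpy as np

def euler_maruyama_ou(theta, mu, sigma, x0, T, n_steps, rng):
    """Simulate dX_t = theta * (mu - X_t) dt + sigma dW_t
    on [0, T] with the Euler--Maruyama scheme.

    Parameter names are illustrative, not from the paper.
    """
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        # Brownian increment over one time step
        dw = rng.normal(0.0, np.sqrt(dt))
        x[k + 1] = x[k] + theta * (mu - x[k]) * dt + sigma * dw
    return x

rng = np.random.default_rng(0)
path = euler_maruyama_ou(theta=1.0, mu=0.0, sigma=0.5,
                         x0=2.0, T=1.0, n_steps=100, rng=rng)
```

In multilevel schemes such as the one used in the paper, discretizations like this are run at a hierarchy of step sizes (e.g. `n_steps = 2**l` for level `l`), with a randomized level choice removing the discretization bias in expectation.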
Original language: English (US)
Pages (from-to): 763-787
Number of pages: 25
Journal: SIAM/ASA Journal on Uncertainty Quantification
Volume: 9
Issue number: 2
DOIs
State: Published - Jun 8 2021
