Variational inference for diffusion processes

Cédric Archambeau, Manfred Opper, Yuan Shen, Dan Cornford, John Shawe-Taylor

Research output: Chapter in Book / Published conference output › Conference publication

Abstract

Diffusion processes are a family of continuous-time continuous-state stochastic processes that are in general only partially observed. The joint estimation of the forcing parameters and the system noise (volatility) in these dynamical systems is a crucial, but non-trivial task, especially when the system is nonlinear and multimodal. We propose a variational treatment of diffusion processes, which allows us to compute type II maximum likelihood estimates of the parameters by simple gradient techniques and which is computationally less demanding than most MCMC approaches. We also show how a cheap estimate of the posterior over the parameters can be constructed based on the variational free energy.
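The abstract's idea of estimating drift (forcing) parameters by simple gradient ascent on a likelihood bound can be illustrated on a toy case. The sketch below is not the paper's algorithm: it uses a fully observed Ornstein-Uhlenbeck process dx = -θx dt + σ dW, discretised by Euler-Maruyama, where the Gaussian transition likelihood is tractable and gradient ascent in θ is straightforward; all parameter values and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, sigma, dt, n = 1.0, 0.3, 0.01, 20000

# Simulate an Ornstein-Uhlenbeck path with the Euler-Maruyama scheme:
# x[t+1] = x[t] - theta * x[t] * dt + sigma * sqrt(dt) * noise
x = np.empty(n + 1)
x[0] = 0.0
for t in range(n):
    x[t + 1] = x[t] - theta_true * x[t] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

def grad_loglik(theta, x, dt, sigma):
    """Gradient in theta of the Euler-Maruyama Gaussian log-likelihood."""
    # Residuals of the assumed drift model for the increments.
    r = x[1:] - x[:-1] + theta * x[:-1] * dt
    return -np.sum(r * x[:-1]) / sigma**2

# Simple gradient ascent on the log-likelihood, starting far from the truth.
theta = 0.1
for _ in range(200):
    theta += 5e-4 * grad_loglik(theta, x, dt, sigma)

print(theta)  # close to theta_true = 1.0 (up to sampling noise)
```

In the partially observed, nonlinear setting treated in the paper, this exact likelihood is unavailable; the variational free energy plays the role of the objective being ascended instead.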
Original language: English
Title of host publication: Annual Conference on Neural Information Processing Systems 2007
Editors: J.C. Platt, D. Koller, Y. Singer, S. Roweis
Place of publication: Cambridge, MA (US)
Publisher: MIT Press
Pages: 17-24
Number of pages: 8
ISBN (Print): 978-1-60560-352-0
Publication status: Published - 2008
Event: 21st Annual Conference on Neural Information Processing Systems, NIPS 2007 - Vancouver, BC, Canada
Duration: 3 Dec 2007 - 6 Dec 2007

Publication series

Name: Advances in Neural Information Processing Systems
Publisher: Massachusetts Institute of Technology Press
Volume: 20

Conference

Conference: 21st Annual Conference on Neural Information Processing Systems, NIPS 2007
Country/Territory: Canada
City: Vancouver, BC
Period: 3/12/07 - 6/12/07

Bibliographical note

Copyright of the Massachusetts Institute of Technology Press (MIT Press)

Keywords

  • diffusion processes
  • continuous-time continuous-state stochastic processes
  • system noise
  • volatility
  • variational free energy
