Scalable importance sampling estimation of Gaussian mixture posteriors in Bayesian networks

Darío Ramos-López, Andrés Masegosa, Antonio Salmerón, Rafael Rumí, Helge Langseth, Thomas Dyhre Nielsen, Anders Læsø Madsen

Research output: Contribution to journal › Journal article › Research › peer-review

13 Citations (Scopus)

Abstract

In this paper, we propose a scalable importance sampling algorithm for computing Gaussian mixture posteriors in conditional linear Gaussian Bayesian networks. Our contribution is based on a stochastic gradient ascent procedure that takes as input a stream of importance sampling weights, so that a mixture of Gaussians is updated dynamically without the need to store the full sample. The algorithm has been designed following a Map/Reduce approach and is therefore scalable with respect to the available computing resources. The implementation of the proposed algorithm is available as part of the AMIDST open-source toolbox for scalable probabilistic machine learning (http://www.amidsttoolbox.com).
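
To make the abstract's description more concrete, the following is a minimal, hypothetical sketch of the underlying idea: samples are drawn from a proposal distribution, each sample contributes an importance weight, and the parameters of a Gaussian mixture are updated by stochastic gradient ascent on the importance-weighted log-likelihood as the stream arrives, so the sample never has to be stored. The toy target `target_pdf`, the fixed univariate proposal, the three-component mixture, and the learning rate are all assumptions made for illustration; the sketch is not the AMIDST implementation and omits the Map/Reduce layer and the conditional linear Gaussian machinery.

```python
# Illustrative sketch only -- NOT the AMIDST implementation. It fits a
# one-dimensional Gaussian mixture to a toy unnormalised target by
# stochastic gradient ascent on the importance-weighted log-likelihood,
# processing one (sample, weight) pair at a time so the full sample never
# needs to be stored. Densities, sizes and learning rate are assumptions.
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x (vectorised over mu, sigma)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def target_pdf(x):
    """Toy unnormalised bimodal 'posterior' standing in for p(x | evidence)."""
    return 0.5 * gaussian_pdf(x, -2.0, 0.7) + 0.5 * gaussian_pdf(x, 2.5, 1.0)

rng = np.random.default_rng(0)

# Mixture parameters: component logits (softmax -> weights), means, log std devs.
K = 3
logits = np.zeros(K)
means = np.array([-3.0, 0.0, 3.0])
log_sigmas = np.zeros(K)

# Proposal distribution; in the paper's setting, samples and weights would come
# from the conditional linear Gaussian network. Here a broad Gaussian stands in.
prop_mu, prop_sigma = 0.0, 4.0

lr = 0.01
for step in range(50_000):
    # One streamed sample and its importance weight w = p(x) / proposal(x).
    x = rng.normal(prop_mu, prop_sigma)
    w = target_pdf(x) / gaussian_pdf(x, prop_mu, prop_sigma)

    # Current mixture density q(x) and per-component responsibilities.
    pis = np.exp(logits - logits.max()); pis /= pis.sum()
    sigmas = np.exp(log_sigmas)
    comp = pis * gaussian_pdf(x, means, sigmas)
    resp = comp / comp.sum()

    # Gradients of log q(x) w.r.t. logits, means and log std devs.
    g_logits = resp - pis
    g_means = resp * (x - means) / sigmas ** 2
    g_log_sigmas = resp * (((x - means) / sigmas) ** 2 - 1.0)

    # Stochastic gradient ascent step on the importance-weighted log-likelihood.
    logits += lr * w * g_logits
    means += lr * w * g_means
    log_sigmas += lr * w * g_log_sigmas

pis = np.exp(logits - logits.max()); pis /= pis.sum()
print("weights:", np.round(pis, 3))
print("means:  ", np.round(means, 3))
print("stddevs:", np.round(np.exp(log_sigmas), 3))
```

Because each update ascends the importance-weighted log-likelihood, the mixture is pulled towards the target density in expectation, which mirrors the role the streamed weights play in the algorithm described in the abstract.
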
Original language: English
Article number: 100
Journal: International Journal of Approximate Reasoning
Volume: 100
Pages (from-to): 115-134
Number of pages: 20
ISSN: 0888-613X
DOIs
Publication status: Published - 1 September 2018

Keywords

  • Bayesian networks
  • Conditional linear Gaussian models
  • Gaussian mixtures
  • Importance sampling
  • Scalable inference
