Mixtures of truncated basis functions

Helge Langseth, Thomas Dyhre Nielsen, Rafael Rumí, Antonio Salmerón

Research output: Contribution to journal › Journal article › Research › peer-review



In this paper we propose a framework, called mixtures of truncated basis functions (MoTBFs), for representing general hybrid Bayesian networks. The proposed framework generalizes both the mixture of truncated exponentials (MTEs) framework and the mixture of polynomials (MoPs) framework. Similar to MTEs and MoPs, MoTBFs are defined so that the potentials are closed under combination and marginalization, which ensures that inference in MoTBF networks can be performed efficiently using the Shafer-Shenoy architecture.
Based on a generalized Fourier series approximation, we devise a method for efficiently approximating an arbitrary density function using the MoTBF framework. The translation method is more flexible than existing MTE or MoP-based methods, and it supports an online/anytime tradeoff between the accuracy and the complexity of the approximation. Experimental results show that the approximations obtained are either comparable or significantly better than the approximations obtained using existing methods.
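To make the idea of a generalized Fourier series approximation concrete, the sketch below projects a density onto an orthonormal basis on a truncated interval and sums the resulting series, which is the general shape of an MoTBF-style potential. This is an illustration only, not the authors' algorithm from the paper; the choice of a standard normal target, the interval [-3, 3], the basis size K, and the use of shifted Legendre polynomials as the basis are all assumptions made for the example.

```python
# Minimal sketch (assumptions noted above, not the paper's exact method):
# approximate a density on a truncated interval by a generalized Fourier
# series in an orthonormal polynomial basis, i.e. an MoP-type MoTBF potential.
import numpy as np
from numpy.polynomial import legendre
from scipy.stats import norm
from scipy.integrate import quad

a, b = -3.0, 3.0          # truncation interval (assumption)
K = 8                     # number of basis functions (assumption)
target = norm(0, 1).pdf   # density to approximate (assumption)

def basis(k, x):
    """k-th Legendre polynomial, shifted to [a, b] and L2-normalised."""
    t = 2.0 * (x - a) / (b - a) - 1.0            # map [a, b] onto [-1, 1]
    coeffs = np.zeros(k + 1)
    coeffs[k] = 1.0
    norm_const = np.sqrt((2 * k + 1) / (b - a))  # makes the basis orthonormal on [a, b]
    return norm_const * legendre.legval(t, coeffs)

# Generalized Fourier coefficients: c_k = <f, phi_k> = integral of f(x) * phi_k(x) over [a, b]
coefs = [quad(lambda x, k=k: target(x) * basis(k, x), a, b)[0] for k in range(K)]

def motbf_approx(x):
    """Series approximation g(x) = sum_k c_k * phi_k(x) on [a, b]."""
    return sum(c * basis(k, x) for k, c in enumerate(coefs))

# Check accuracy and how much probability mass the approximation captures.
xs = np.linspace(a, b, 501)
max_err = np.max(np.abs(motbf_approx(xs) - target(xs)))
mass = quad(motbf_approx, a, b)[0]
print(f"max abs error: {max_err:.4f}, integrated mass on [a, b]: {mass:.4f}")
```

Increasing K refines the approximation at the cost of a larger potential, which is one way to picture the accuracy/complexity tradeoff described in the abstract.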
Original language: English
Journal: International Journal of Approximate Reasoning
Issue number: 2
Pages (from-to): 212-227
Publication status: Published - 2012
