A Reparameterization of Mixtures of Truncated Basis Functions and its Applications

Antonio Salmerón, Helge Langseth, Andres Masegosa, Thomas Dyhre Nielsen

Research output: Article in proceedings · Research · Peer-reviewed


Mixtures of truncated basis functions (MoTBFs) are a popular tool within the context of hybrid Bayesian networks, mainly because they are compatible with efficient probabilistic inference schemes. However, their standard parameterization allows negative mixture weights as well as non-normalized mixture terms, which prevents them from benefiting from existing likelihood-based mixture estimation methods such as the EM algorithm. Furthermore, the standard parameterization does not facilitate the definition of a Bayesian framework that would ideally allow conjugate analysis. In this paper we show how MoTBFs can be reparameterized by applying a strategy already used in the literature for Gaussian mixture models with negative terms. We exemplify how the new parameterization is compatible with the EM algorithm and conjugate analysis.
Original language: English
Title of host publication: Proceedings of the 11th International Conference on Probabilistic Graphical Models: PMLR
Editors: Antonio Salmerón, Rafael Rumí
Publisher: PMLR Press
Publication date: 2022
Publication status: Published - 2022
Event: International Conference on Probabilistic Graphical Models - Almería, Spain
Duration: 5 Oct 2022 - 7 Oct 2022


Conference: International Conference on Probabilistic Graphical Models
Series: The Proceedings of Machine Learning Research


