Predictive Distribution of the Dirichlet Mixture Model by the Local Variational Inference Method

Zhanyu Ma, Arne Leijon, Zheng-Hua Tan, Sheng Gao

Research output: Contribution to journal › Journal article › Research › peer-review

4 Citations (Scopus)

Abstract

In Bayesian analysis of a statistical model, the predictive distribution is obtained by marginalizing over the parameters with respect to their posterior distributions. Compared to the frequently used point-estimate plug-in method, the predictive distribution gives a more reliable result when calculating the predictive likelihood of newly arriving data, especially when the amount of training data is small. Bayesian estimation of a Dirichlet mixture model (DMM) is, in general, not analytically tractable. In our previous work, we proposed a global variational inference based method for calculating approximate posterior distributions of the DMM parameters analytically. In this paper, we extend that study and propose an algorithm for calculating the predictive distribution of the DMM with the local variational inference (LVI) method. The true predictive distribution of the DMM is analytically intractable. By exploiting the concavity of the multivariate inverse beta function, we introduce an upper bound on the true predictive distribution. Since this upper bound has a global minimum, the problem reduces to seeking an approximation to the true predictive distribution. The approximate predictive distribution obtained by minimizing the upper bound is analytically tractable, which facilitates the computation of the predictive likelihood. Evaluations on both synthesized and real data demonstrate the good performance of the proposed LVI-based method in comparison with conventionally used methods.
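For context, the predictive distribution referred to above has the standard Bayesian form (notation ours, not taken from the paper):

    p(\mathbf{x}^* \mid \mathbf{X}) = \int p(\mathbf{x}^* \mid \boldsymbol{\Theta}) \, p(\boldsymbol{\Theta} \mid \mathbf{X}) \, d\boldsymbol{\Theta},

whereas the plug-in method evaluates p(\mathbf{x}^* \mid \hat{\boldsymbol{\Theta}}) at a single point estimate \hat{\boldsymbol{\Theta}}. For a Dirichlet mixture with M components, the observation density is

    p(\mathbf{x} \mid \boldsymbol{\Theta}) = \sum_{m=1}^{M} \pi_m \, \mathrm{Dir}(\mathbf{x} \mid \boldsymbol{\alpha}_m), \qquad
    \mathrm{Dir}(\mathbf{x} \mid \boldsymbol{\alpha}) = \frac{\Gamma(\sum_k \alpha_k)}{\prod_k \Gamma(\alpha_k)} \prod_k x_k^{\alpha_k - 1},

and the multivariate inverse beta function mentioned in the abstract is presumably the Dirichlet normalizing constant 1/\mathrm{B}(\boldsymbol{\alpha}), with \mathrm{B}(\boldsymbol{\alpha}) = \prod_k \Gamma(\alpha_k) / \Gamma(\sum_k \alpha_k).

The sketch below is not the paper's LVI algorithm; it only illustrates, under assumed posterior samples of the concentration parameters, the difference between the plug-in likelihood and the marginalized (predictive) likelihood for a single Dirichlet component. The Monte Carlo averaging and the alpha_samples array are hypothetical stand-ins; in the paper the marginalization is handled analytically through the LVI upper bound.

    # Illustrative sketch only: plug-in vs. marginalized predictive likelihood
    # for one Dirichlet component. Monte Carlo over hypothetical posterior
    # samples stands in for the analytic LVI-based approximation in the paper.
    import numpy as np
    from scipy.stats import dirichlet

    rng = np.random.default_rng(0)

    # Hypothetical posterior samples of the concentration parameters alpha.
    alpha_samples = rng.gamma(shape=5.0, scale=1.0, size=(1000, 3))

    # A new observation on the probability simplex (components sum to 1).
    x_new = np.array([0.2, 0.3, 0.5])

    # Plug-in: evaluate the likelihood at a single point estimate of alpha.
    alpha_hat = alpha_samples.mean(axis=0)
    plugin_likelihood = dirichlet.pdf(x_new, alpha_hat)

    # Predictive: average the likelihood over the posterior samples,
    # approximating p(x* | X) = E[Dir(x* | alpha)] under p(alpha | X).
    predictive_likelihood = np.mean([dirichlet.pdf(x_new, a) for a in alpha_samples])

    print(plugin_likelihood, predictive_likelihood)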
Original language: English
Journal: Journal of Signal Processing Systems
Volume: 74
Issue number: 3
Pages (from-to): 359-374
ISSN: 1939-8018
DOI: 10.1007/s11265-013-0769-8
Publication status: Published - 2014

Fingerprint

Predictive Distribution, Mixture Model, Dirichlet, Upper bound, Posterior distribution, Likelihood, Plug-in Method, Beta function, Inverse function, Point Estimate, Bayesian Estimation, Global Minimum, Bayesian Analysis, Statistical Model, Calculate, Evaluation, Approximation

Cite this

@article{60f68a6b31dd49feb3cc3ea56f4432b1,
title = "Predictive Distribution of the Dirichlet Mixture Model by the Local Variational Inference Method",
author = "Zhanyu Ma and Arne Leijon and Zheng-Hua Tan and Sheng Gao",
year = "2014",
doi = "10.1007/s11265-013-0769-8",
language = "English",
volume = "74",
pages = "359--374",
journal = "Journal of Signal Processing Systems",
issn = "1939-8018",
publisher = "Springer",
number = "3",

}
