Abstract
Dependency networks have previously been proposed as alternatives to, e.g., Bayesian networks because they support fast algorithms for automatic learning. Recently, dependency networks have also been proposed as classification models, but, as with general probabilistic inference, the reported speed-ups are often obtained at the expense of accuracy.
In this paper we try to address this issue through the use of mixtures of dependency networks. To reduce learning time and improve robustness when dealing with data-sparse classes, we outline methods for reusing calculations across mixture components. Finally, the proposed model is empirically compared to other state-of-the-art classifiers, both in terms of accuracy and learning time.
Original language | English |
---|---|
Title | Proceedings of the Fourth European Workshop on Probabilistic Graphical Models |
Number of pages | 8 |
Publication date | 2008 |
Pages | 129-136 |
Status | Published - 2008 |
Event | The Fourth European Workshop on Probabilistic Graphical Models - Hirtshals, Denmark. Duration: 17 Sep 2008 → 19 Sep 2008. Conference number: 4 |
Conference

Conference | The Fourth European Workshop on Probabilistic Graphical Models |
---|---|
Number | 4 |
Country/Territory | Denmark |
City | Hirtshals |
Period | 17/09/2008 → 19/09/2008 |