Abstract
Dependency networks have previously been proposed as alternatives to, for example, Bayesian networks, because they support fast algorithms for automatic learning. Recently, dependency networks have also been proposed as classification models, but, as with general probabilistic inference, the reported speed-ups are often obtained at the expense of accuracy.
In this paper we address this issue through the use of mixtures of dependency networks. To reduce learning time and improve robustness when dealing with data-sparse classes, we outline methods for reusing calculations across mixture components. Finally, the proposed model is empirically compared to other state-of-the-art classifiers, in terms of both accuracy and learning time.
Original language | English |
---|---|
Title of host publication | Proceedings of the Fourth European Workshop on Probabilistic Graphical Models |
Number of pages | 8 |
Publication date | 2008 |
Pages | 129-136 |
Publication status | Published - 2008 |
Event | The Fourth European Workshop on Probabilistic Graphical Models (Conference number: 4), Hirtshals, Denmark, 17 Sept 2008 → 19 Sept 2008 |
Conference
Conference | The Fourth European Workshop on Probabilistic Graphical Models |
---|---|
Number | 4 |
Country/Territory | Denmark |
City | Hirtshals |
Period | 17/09/2008 → 19/09/2008 |