Scalable learning of probabilistic latent models for collaborative filtering

Helge Langseth, Thomas Dyhre Nielsen

Publication: Contribution to journal › Journal article › Research › peer-reviewed

9 Citations (Scopus)
223 Downloads (Pure)

Abstract

Collaborative filtering has emerged as a popular way of making user recommendations, but with the increasing sizes of the underlying databases scalability is becoming a crucial issue. In this paper we focus on a recently proposed probabilistic collaborative filtering model that explicitly represents all users and items simultaneously in the model. This model class has several desirable properties, including high recommendation accuracy and principled support for group recommendations. Unfortunately, it also suffers from poor scalability. We address this issue by proposing a scalable variational Bayes learning and inference algorithm for these types of models. Empirical results show that the proposed algorithm achieves significantly better accuracy results than other straw-men models evaluated on a collection of well-known data sets. We also demonstrate that the algorithm has a highly favorable behavior in relation to cold-start situations.
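The abstract refers to variational Bayes learning and inference for a probabilistic latent model. As an illustrative sketch only, and not the model or algorithm from the paper, the Python below shows coordinate-ascent variational Bayes updates for a simple Bayesian matrix-factorisation model with Gaussian priors and a Gaussian likelihood; the function name, hyperparameters (alpha, tau, k) and the data layout are assumptions chosen for clarity.

# Illustrative sketch: coordinate-ascent variational Bayes for a simple
# Bayesian matrix-factorisation model (Gaussian likelihood, Gaussian priors).
# NOT the model/algorithm from the paper; all names and priors are assumed.
import numpy as np

def vb_matrix_factorisation(ratings, n_users, n_items, k=10,
                            alpha=1.0, tau=2.0, n_iters=20, seed=0):
    """ratings: list of (user, item, rating) triples with 0-indexed ids."""
    rng = np.random.default_rng(seed)
    # Variational posteriors q(u_u) = N(mu[u], Su[u]) and q(v_i) = N(mv[i], Sv[i]).
    mu = rng.normal(scale=0.1, size=(n_users, k))
    mv = rng.normal(scale=0.1, size=(n_items, k))
    Su = np.tile(np.eye(k) / alpha, (n_users, 1, 1))
    Sv = np.tile(np.eye(k) / alpha, (n_items, 1, 1))

    # Group the observed ratings by user and by item.
    by_user = [[] for _ in range(n_users)]
    by_item = [[] for _ in range(n_items)]
    for u, i, r in ratings:
        by_user[u].append((i, r))
        by_item[i].append((u, r))

    for _ in range(n_iters):
        # Update each user factor given the current item statistics.
        for u in range(n_users):
            prec = alpha * np.eye(k)
            lin = np.zeros(k)
            for i, r in by_user[u]:
                prec += tau * (Sv[i] + np.outer(mv[i], mv[i]))  # E[v_i v_i^T]
                lin += tau * r * mv[i]                          # E[v_i]
            Su[u] = np.linalg.inv(prec)
            mu[u] = Su[u] @ lin
        # Update each item factor symmetrically.
        for i in range(n_items):
            prec = alpha * np.eye(k)
            lin = np.zeros(k)
            for u, r in by_item[i]:
                prec += tau * (Su[u] + np.outer(mu[u], mu[u]))
                lin += tau * r * mu[u]
            Sv[i] = np.linalg.inv(prec)
            mv[i] = Sv[i] @ lin
    return mu, mv

# A predicted rating for a (user, item) pair is the posterior-mean inner product:
# mu, mv = vb_matrix_factorisation(triples, n_users, n_items)
# r_hat = mu[u] @ mv[i]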
Original language: English
Journal: Decision Support Systems
Volume: 74
ISSN: 0167-9236
DOI: 10.1016/j.dss.2015.03.006
Status: Published - 2015

Fingerprint

Collaborative filtering
Statistical Models
Learning
Self-Help Groups
Scalability
Databases

Cite this

@article{1b926423b72e45728240110190297b89,
title = "Scalable learning of probabilistic latent models for collaborative filtering",
abstract = "Collaborative filtering has emerged as a popular way of making user recommendations, but with the increasing sizes of the underlying databases scalability is becoming a crucial issue. In this paper we focus on a recently proposed probabilistic collaborative filtering model that explicitly represents all users and items simultaneously in the model. This model class has several desirable properties, including high recommendation accuracy and principled support for group recommendations. Unfortunately, it also suffers from poor scalability. We address this issue by proposing a scalable variational Bayes learning and inference algorithm for these types of models. Empirical results show that the proposed algorithm achieves significantly better accuracy results than other straw-men models evaluated on a collection of well-known data sets. We also demonstrate that the algorithm has a highly favorable behavior in relation to cold-start situations.",
author = "Helge Langseth and Nielsen, {Thomas Dyhre}",
year = "2015",
doi = "10.1016/j.dss.2015.03.006",
language = "English",
volume = "74",
journal = "Decision Support Systems",
issn = "0167-9236",
publisher = "Elsevier",

}

Scalable learning of probabilistic latent models for collaborative filtering. / Langseth, Helge; Nielsen, Thomas Dyhre.

In: Decision Support Systems, Vol. 74, 2015.

