Sedhain, Suvash
Low-Rank Linear Cold-Start Recommendation from Social Data
Sedhain, Suvash (Australian National University) | Menon, Aditya Krishna (DATA61 and Australian National University) | Sanner, Scott (University of Toronto) | Xie, Lexing (Australian National University and DATA61) | Braziunas, Darius (Rakuten Kobo Inc.)
The cold-start problem involves recommendation of content to new users of a system, for whom there is no historical preference information available. This poses a challenge for collaborative filtering algorithms that inherently rely on such information. Recent work has shown that social metadata, such as users' friend groups and page likes, can strongly mitigate the problem. However, such approaches either lack an interpretation as optimising some principled objective, involve iterative non-convex optimisation with limited scalability, or require tuning several hyperparameters. In this paper, we first show how three popular cold-start models are special cases of a linear content-based model, with implicit constraints on the weights. Leveraging this insight, we propose Loco, a new model for cold-start recommendation based on three ingredients: (a) linear regression to learn an optimal weighting of social signals for preferences, (b) a low-rank parametrisation of the weights to overcome the high dimensionality common in social data, and (c) scalable learning of such low-rank weights using randomised SVD. Experiments on four real-world datasets show that Loco yields significant improvements over state-of-the-art cold-start recommenders that exploit high-dimensional social network metadata.
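A minimal sketch of the low-rank linear approach outlined in this abstract, assuming a preference matrix R (users x items) and a high-dimensional social-feature matrix X (users x features): take a rank-k randomised SVD of X, then fit a ridge regression from the projected features to R, so the learned weight matrix is low-rank by construction. The function name, the rank k, the regularisation parameter lam, and the use of NumPy and scikit-learn are illustrative assumptions, not the authors' reference implementation.

    # Sketch of low-rank linear cold-start recommendation via randomised SVD.
    # Assumptions: X is (n_users x n_features) social metadata, R is the
    # (n_users x n_items) preference matrix; k and lam are chosen by the user.
    import numpy as np
    from sklearn.utils.extmath import randomized_svd

    def low_rank_linear_coldstart(X, R, k=50, lam=1.0):
        """Return a rank-k weight matrix W such that X @ W approximates R."""
        # Rank-k randomised SVD of the social features: X ~= U * diag(S) * Vt
        U, S, Vt = randomized_svd(X, n_components=k, random_state=0)
        Z = U * S                      # projected features, shape (n_users, k)
        # Ridge regression in the k-dimensional space: (Z'Z + lam*I) B = Z'R
        B = np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ R)
        return Vt.T @ B                # low-rank weights, (n_features, n_items)

    # Cold-start scoring for a new user with social feature vector x_new:
    #   scores = x_new @ W

Because the regression is solved in the k-dimensional projected space rather than over the raw high-dimensional social features, training cost is governed by k rather than by the feature dimensionality, which is the scalability argument the abstract makes.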
On the Effectiveness of Linear Models for One-Class Collaborative Filtering
Sedhain, Suvash (Australian National University) | Menon, Aditya Krishna (Australian National University and NICTA) | Sanner, Scott (Oregon State University and Australian National University) | Braziunas, Darius (Rakuten Kobo Inc.)
In many personalised recommendation problems, there are examples of items users prefer or like, but no examples of items they dislike. A state-of-the-art method for such implicit feedback, or one-class collaborative filtering (OC-CF), problems is SLIM, which makes recommendations based on a learned item-item similarity matrix. While SLIM has been shown to perform well on implicit feedback tasks, we argue that it is hindered by two limitations: first, it does not produce user-personalised predictions, which hampers recommendation performance; second, it involves solving a constrained optimisation problem, which impedes fast training. In this paper, we propose LRec, a variant of SLIM that overcomes these limitations without sacrificing any of SLIM's strengths. At its core, LRec employs linear logistic regression; despite this simplicity, LRec consistently and significantly outperforms all existing methods on a range of datasets. Our results thus illustrate that the OC-CF problem can be effectively tackled via linear classification models.
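A minimal sketch of the linear-classification view of OC-CF described in this abstract, assuming a binary interaction matrix R (users x items): for each user, fit an L2-regularised logistic regression that predicts that user's interactions, with each item represented by its column of R (i.e. by which users liked it), and rank items by the resulting scores. The per-user loop, the regularisation constant C, and the scikit-learn solver are illustrative assumptions rather than the authors' exact training procedure.

    # Sketch of one-class collaborative filtering via per-user logistic regression.
    # Assumptions: R is a dense binary NumPy array (n_users x n_items); the
    # regularisation strength C is a hypothetical tuning parameter.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def oc_cf_linear_scores(R, C=1.0):
        """Return a (n_users x n_items) score matrix for ranking items per user."""
        n_users, n_items = R.shape
        X = R.T                               # each item described by who liked it
        scores = np.zeros_like(R, dtype=float)
        for u in range(n_users):
            y = R[u]                          # implicit feedback for user u
            if y.sum() in (0, n_items):       # skip users with no signal to learn from
                continue
            clf = LogisticRegression(C=C, max_iter=1000)
            clf.fit(X, y)
            scores[u] = clf.decision_function(X)  # rank unseen items by these scores
        return scores

Each user gets their own weight vector over users, which is what makes the predictions user-personalised; the per-user problems are independent unconstrained logistic regressions, so they can be trained in parallel with off-the-shelf solvers.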