Classic Matrix Factorization¶
LKPY provides classical matrix factorization implementations.
Common Support¶
The mf_common module contains common support code for matrix factorization algorithms. Its class, MFPredictor, defines the parameters that are estimated during the Algorithm.fit() process of common matrix factorization algorithms.

class lenskit.algorithms.mf_common.MFPredictor¶ Bases: lenskit.Predictor
Common predictor for matrix factorization.

user_index_¶ Users in the model (length \(m\)).

item_index_¶ Items in the model (length \(n\)).

user_features_¶ The \(m \times k\) user-feature matrix.

item_features_¶ The \(n \times k\) item-feature matrix.

property n_features¶ The number of features.

property n_users¶ The number of users.

property n_items¶ The number of items.

lookup_user(user)¶ Look up the index for a user.
 Parameters
user – the user ID to look up
 Returns
the user index.
lookup_items(items)¶ Look up the indices for a set of items.
 Parameters
items (array-like) – the item IDs to look up.
 Returns
the item indices. Unknown items will have negative indices.
score(user, items, u_features=None)¶ Score a set of items for a user. User and item parameters must be indices into the matrices.
 Returns
the scores for the items.
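The score computation itself is just a dot product between the user's feature vector and each item's feature vector. A minimal numpy sketch of the idea (the array names mirror the attributes above; this is an illustration, not LensKit's actual implementation):

```python
import numpy as np

# Toy model state mirroring MFPredictor's estimated parameters:
# an m x k user-feature matrix and an n x k item-feature matrix.
m, n, k = 4, 5, 3
rng = np.random.default_rng(42)
user_features = rng.normal(size=(m, k))
item_features = rng.normal(size=(n, k))

def score(user_idx, item_idxs):
    """Score items for a user: the dot product of the user's
    feature row with each requested item's feature row."""
    return item_features[item_idxs] @ user_features[user_idx]

scores = score(1, np.array([0, 2, 4]))  # one score per item
```
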

Alternating Least Squares¶
LensKit provides alternating least squares implementations of matrix factorization suitable for explicit feedback data. These implementations are parallelized with Numba, and perform best with the MKL from Conda.

class lenskit.algorithms.als.BiasedMF(features, *, iterations=20, reg=0.1, damping=5, bias=True, method='cd', rng_spec=None, progress=None, save_user_features=True)¶ Bases: lenskit.algorithms.mf_common.MFPredictor
Biased matrix factorization trained with alternating least squares [ZWSP2008]. This is a prediction-oriented algorithm suitable for explicit feedback data.
It provides two solvers for the optimization step (the method parameter):
'cd' (the default)
Coordinate descent [TPT2011], adapted for a separately-trained bias model and to use weighted regularization as in the original ALS paper [ZWSP2008].
'lu'
A direct implementation of the original ALS concept [ZWSP2008] using LU decomposition to solve for the optimized matrices.
See the base class MFPredictor for documentation on the estimated parameters you can extract from a trained model.
 ZWSP2008
Yunhong Zhou, Dennis Wilkinson, Robert Schreiber, and Rong Pan. 2008. Large-Scale Parallel Collaborative Filtering for the Netflix Prize. In _Algorithmic Aspects in Information and Management_, LNCS 5034, 337–348. DOI 10.1007/978-3-540-68880-8_32.
 TPT2011
Gábor Takács, István Pilászy, and Domonkos Tikk. 2011. Applications of the Conjugate Gradient Method for Implicit Feedback Collaborative Filtering.
 Parameters
features (int) – the number of features to train
iterations (int) – the number of iterations to train
reg (float) – the regularization factor; can also be a tuple (ureg, ireg) to specify separate user and item regularization terms.
damping (float) – damping factor for the underlying bias.
bias (bool or Bias) – the bias model. If True, fits a Bias with damping damping.
method (str) – the solver to use (see above).
rng_spec – random number generator or state (see lenskit.util.random.rng()).
progress – a tqdm.tqdm()-compatible progress bar function

fit(ratings, **kwargs)¶ Run ALS to train a model.
 Parameters
ratings – the ratings data frame.
 Returns
The algorithm (for chaining).

fit_iters(ratings, **kwargs)¶ Run ALS to train a model, yielding the algorithm after each training iteration (as a generator).
 Parameters
ratings – the ratings data frame.
 Returns
The algorithm (for chaining).

predict_for_user(user, items, ratings=None)¶ Compute predictions for a user and items.
 Parameters
user – the user ID
items (array-like) – the items to predict
ratings (pandas.Series) – the user's ratings (indexed by item ID); if provided, they may be used to override or augment the model's notion of a user's preferences.
 Returns
scores for the items, indexed by item ID.
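With the 'lu' solver, each user's feature vector is the solution of a regularized least-squares problem over the items that user rated, with the penalty scaled by the user's rating count (the weighted regularization of [ZWSP2008]). A hypothetical numpy sketch of one such user update (simplified; LensKit's implementation is parallelized and operates on sparse matrices):

```python
import numpy as np

def als_user_update(ratings, item_feats, reg):
    """Solve one user's feature vector by ridge regression:
    (Q^T Q + reg * n_u * I) p_u = Q^T r_u, where Q holds the
    feature rows of the items the user rated."""
    idx = np.fromiter(ratings.keys(), dtype=int)
    r = np.fromiter(ratings.values(), dtype=float)
    Q = item_feats[idx]                       # rated items' features
    k = item_feats.shape[1]
    A = Q.T @ Q + reg * len(idx) * np.eye(k)  # weighted regularization
    b = Q.T @ r
    return np.linalg.solve(A, b)

rng = np.random.default_rng(0)
item_feats = rng.normal(size=(10, 4))
p_u = als_user_update({2: 4.0, 5: 3.0, 7: 5.0}, item_feats, reg=0.1)
```

A full ALS sweep alternates this update over all users (holding items fixed), then over all items (holding users fixed).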

class lenskit.algorithms.als.ImplicitMF(features, *, iterations=20, reg=0.1, weight=40, method='cg', rng_spec=None, progress=None, save_user_features=True)¶ Bases: lenskit.algorithms.mf_common.MFPredictor
Implicit matrix factorization trained with alternating least squares [HKV2008]. This algorithm outputs 'predictions', but they are not on a meaningful scale. If its input data contains rating values, these will be used as the 'confidence' values; otherwise, confidence will be 1 for every rated item.
It provides two solvers for the optimization step (the method parameter):
'cg' (the default)
Conjugate gradient method [TPT2011].
'lu'
A direct implementation of the original implicit-feedback ALS concept [HKV2008] using LU decomposition to solve for the optimized matrices.
See the base class MFPredictor for documentation on the estimated parameters you can extract from a trained model.
 HKV2008
Y. Hu, Y. Koren, and C. Volinsky. 2008. Collaborative Filtering for Implicit Feedback Datasets. In _Proceedings of the 2008 Eighth IEEE International Conference on Data Mining_, 263–272. DOI 10.1109/ICDM.2008.22
 TPT2011
Gábor Takács, István Pilászy, and Domonkos Tikk. 2011. Applications of the Conjugate Gradient Method for Implicit Feedback Collaborative Filtering.
 Parameters
features (int) – the number of features to train
iterations (int) – the number of iterations to train
reg (double) – the regularization factor
weight (double) – the scaling weight for positive samples (\(\alpha\) in [HKV2008]).
rng_spec – random number generator or state (see lenskit.util.random.rng()).
progress – a tqdm.tqdm()-compatible progress bar function

fit(ratings, **kwargs)¶ Train a model using the specified ratings (or similar) data.
 Parameters
ratings (pandas.DataFrame) – The ratings data.
kwargs – Additional training data the algorithm may require. Algorithms should avoid using the same keyword arguments for different purposes, so that they can be more easily hybridized.
 Returns
The algorithm object.

predict_for_user(user, items, ratings=None)¶ Compute predictions for a user and items.
 Parameters
user – the user ID
items (array-like) – the items to predict
ratings (pandas.Series) – the user's ratings (indexed by item ID); if provided, they may be used to override or augment the model's notion of a user's preferences.
 Returns
scores for the items, indexed by item ID.
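In the implicit-feedback formulation of [HKV2008], each observed rating r becomes a confidence c = 1 + weight * r on a binary preference of 1, and the 'lu' solver solves a confidence-weighted ridge regression per user. A simplified numpy sketch of one user update (hypothetical and dense for clarity; the real algorithm exploits sparsity so unrated items never need materializing):

```python
import numpy as np

def implicit_user_update(item_idxs, rvals, item_feats, reg, weight=40.0):
    """One implicit-ALS user step: preferences p are 1 for rated items
    and 0 otherwise; confidences are 1 + weight * r; then solve
    (Q^T C Q + reg I) p_u = Q^T C p with C = diag(c)."""
    n, k = item_feats.shape
    p = np.zeros(n)
    p[item_idxs] = 1.0                   # binary preference
    c = np.ones(n)
    c[item_idxs] += weight * rvals       # confidence for rated items
    Q = item_feats
    A = Q.T @ (c[:, None] * Q) + reg * np.eye(k)
    b = Q.T @ (c * p)
    return np.linalg.solve(A, b)

rng = np.random.default_rng(1)
feats = rng.normal(size=(8, 3))
p_u = implicit_user_update(np.array([1, 4]), np.array([2.0, 1.0]),
                           feats, reg=0.1)
```
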
SciKit SVD¶
This code implements a traditional SVD using scikit-learn. It requires scikit-learn to be installed in order to function.

class lenskit.algorithms.svd.BiasedSVD(features, *, damping=5, bias=True, algorithm='randomized')¶ Bases: lenskit.Predictor
Biased matrix factorization for implicit feedback using scikit-learn's SVD solver (sklearn.decomposition.TruncatedSVD). It operates by first computing the bias, then computing the SVD of the bias residuals.
You'll generally want one of the iterative SVD implementations such as lenskit.algorithms.als.BiasedMF; this is here primarily as an example and for cases where you want to evaluate a pure SVD implementation.
fit(ratings, **kwargs)¶ Train a model using the specified ratings (or similar) data.
 Parameters
ratings (pandas.DataFrame) – The ratings data.
kwargs – Additional training data the algorithm may require. Algorithms should avoid using the same keyword arguments for different purposes, so that they can be more easily hybridized.
 Returns
The algorithm object.

predict_for_user(user, items, ratings=None)¶ Compute predictions for a user and items.
 Parameters
user – the user ID
items (array-like) – the items to predict
ratings (pandas.Series) – the user's ratings (indexed by item ID); if provided, they may be used to override or augment the model's notion of a user's preferences.
 Returns
scores for the items, indexed by item ID.
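BiasedSVD's overall strategy can be sketched with plain numpy (a hypothetical dense illustration; the actual implementation uses sklearn.decomposition.TruncatedSVD on sparse matrices, and its Bias model supports damping, omitted here): subtract the bias, take a truncated SVD of the residuals, and predict by reconstruction.

```python
import numpy as np

# Toy dense ratings matrix (kept dense and complete for clarity).
R = np.array([[4., 5., 3., 4.],
              [3., 4., 2., 3.],
              [5., 5., 4., 5.]])

# 1. Bias: global mean, then item and user offsets (no damping here).
mu = R.mean()
item_b = R.mean(axis=0) - mu
user_b = (R - mu - item_b).mean(axis=1)
resid = R - mu - item_b[None, :] - user_b[:, None]

# 2. Truncated SVD of the residuals, keeping k components.
k = 2
U, s, Vt = np.linalg.svd(resid, full_matrices=False)
P = U[:, :k] * s[:k]          # user embedding
Q = Vt[:k]                    # item embedding

# 3. Prediction = bias + reconstructed residual.
pred = mu + item_b[None, :] + user_b[:, None] + P @ Q
```

By the Eckart–Young theorem, the rank-k reconstruction is the best rank-k approximation of the residual matrix in Frobenius norm.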

FunkSVD¶
FunkSVD is an SVD-like matrix factorization that uses stochastic gradient descent, configured much like coordinate descent, to train the user-feature and item-feature matrices. We generally don't recommend using it in new applications or experiments; the ALS-based algorithms are less sensitive to hyperparameters, and the TensorFlow algorithms provide more optimized gradient descent training of the same prediction model.

class lenskit.algorithms.funksvd.FunkSVD(features, iterations=100, *, lrate=0.001, reg=0.015, damping=5, range=None, bias=True, random_state=None)¶ Bases: lenskit.algorithms.mf_common.MFPredictor
Algorithm class implementing FunkSVD matrix factorization. FunkSVD is a regularized biased matrix factorization technique trained with feature-wise stochastic gradient descent.
See the base class MFPredictor for documentation on the estimated parameters you can extract from a trained model.
 Parameters
features (int) – the number of features to train
iterations (int) – the number of iterations to train each feature
lrate (double) – the learning rate
reg (double) – the regularization factor
damping (double) – damping factor for the underlying mean
bias (Predictor) – the underlying bias model to fit. If True, then a bias.Bias model is fit with damping.
range (tuple) – the (min, max) rating values to clamp ratings, or None to leave predictions unclamped.
random_state – the random state for shuffling the data prior to training.

fit(ratings, **kwargs)¶ Train a FunkSVD model.
 Parameters
ratings – the ratings data frame.

predict_for_user(user, items, ratings=None)¶ Compute predictions for a user and items.
 Parameters
user – the user ID
items (array-like) – the items to predict
ratings (pandas.Series) – the user's ratings (indexed by item ID); if provided, they may be used to override or augment the model's notion of a user's preferences.
 Returns
scores for the items, indexed by item ID.
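The core SGD update underlying FunkSVD can be sketched in a few lines of numpy (a hypothetical simplification: all features are updated together and the bias model is omitted, whereas FunkSVD itself trains feature-wise over bias residuals):

```python
import numpy as np

def sgd_epoch(samples, P, Q, lrate=0.001, reg=0.015):
    """One pass of stochastic gradient descent over (user, item, rating)
    triples, updating user matrix P and item matrix Q in place."""
    for u, i, r in samples:
        err = r - P[u] @ Q[i]               # prediction error
        pu, qi = P[u].copy(), Q[i].copy()
        P[u] += lrate * (err * qi - reg * pu)
        Q[i] += lrate * (err * pu - reg * qi)

rng = np.random.default_rng(7)
P = rng.normal(scale=0.1, size=(3, 4))      # 3 users, 4 features
Q = rng.normal(scale=0.1, size=(5, 4))      # 5 items, 4 features
samples = [(0, 1, 4.0), (1, 2, 3.0), (2, 4, 5.0)]
before = sum((r - P[u] @ Q[i]) ** 2 for u, i, r in samples)
for _ in range(100):
    sgd_epoch(samples, P, Q, lrate=0.05)
after = sum((r - P[u] @ Q[i]) ** 2 for u, i, r in samples)
# squared error on the training triples should decrease
```
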