TY - JOUR
T1 - The probabilistic tensor decomposition toolbox
AU - Hinrich, Jesper L.
AU - Madsen, Kristoffer H.
AU - Mørup, Morten
N1 - Publisher Copyright:
© 2020 The Author(s).
PY - 2020/6/17
Y1 - 2020/6/17
AB - This article introduces the probabilistic tensor decomposition toolbox, a MATLAB toolbox for tensor decomposition using variational Bayesian inference and Gibbs sampling. An introduction to and overview of probabilistic tensor decomposition and its connection with classical, maximum-likelihood-based tensor decomposition methods is provided. We subsequently describe the probabilistic tensor decomposition toolbox, which encompasses the Canonical Polyadic, Tucker, and Tensor Train decomposition models. Currently, unconstrained, non-negative, orthogonal, and sparse factors are supported. Bayesian inference provides a principled way of incorporating prior knowledge, predicting held-out data, and estimating posterior probabilities. Furthermore, it facilitates automatic model order determination and automatic regularization of factors (e.g. sparsity), and it inherently penalizes model complexity, which is beneficial when inferring hierarchical models such as heteroscedastic noise models. The toolbox allows researchers to easily apply Bayesian tensor decomposition methods without having to derive or implement these methods themselves. In addition, it serves as a reference implementation for comparing existing and new tensor decomposition methods. The software is available from https://github.com/JesperLH/prob-tensor-toolbox/.
KW - Bayesian inference
KW - Candecomp/PARAFAC
KW - Multi-way modelling
KW - Tensor decomposition
KW - Tensor train
KW - Tucker
UR - http://www.scopus.com/inward/record.url?scp=85125604089&partnerID=8YFLogxK
U2 - 10.1088/2632-2153/ab8241
DO - 10.1088/2632-2153/ab8241
M3 - Journal article
AN - SCOPUS:85125604089
SN - 2632-2153
VL - 1
SP - 1
EP - 20
JO - Machine Learning: Science and Technology
JF - Machine Learning: Science and Technology
IS - 2
M1 - 025011
ER -