The probabilistic tensor decomposition toolbox

Jesper L. Hinrich, Kristoffer H. Madsen, Morten Mørup

Abstract

This article introduces the probabilistic tensor decomposition toolbox, a MATLAB toolbox for tensor decomposition using variational Bayesian inference and Gibbs sampling. An introduction to and overview of probabilistic tensor decomposition and its connection with classical tensor decomposition methods based on maximum likelihood are provided. We subsequently describe the probabilistic tensor decomposition toolbox, which encompasses the Canonical Polyadic, Tucker, and Tensor Train decomposition models. Currently, unconstrained, non-negative, orthogonal, and sparse factors are supported. Bayesian inference forms a principled way of incorporating prior knowledge, predicting held-out data, and estimating posterior probabilities. Furthermore, it facilitates automatic model order determination and automatic regularization of factors (e.g. sparsity), and it inherently penalizes model complexity, which is beneficial when inferring hierarchical models such as heteroscedastic noise models. The toolbox allows researchers to easily apply Bayesian tensor decomposition methods without having to derive or implement these methods themselves. Furthermore, it serves as a reference implementation for comparing existing and new tensor decomposition methods. The software is available from https://github.com/JesperLH/prob-tensor-toolbox/.
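For readers less familiar with the models named in the abstract, the Canonical Polyadic (CP) decomposition can be written in its standard textbook form; the notation below is generic and not taken from the toolbox itself. A three-way array $\mathcal{X} \in \mathbb{R}^{I \times J \times K}$ is approximated by $D$ rank-one components with factor matrices $A \in \mathbb{R}^{I \times D}$, $B \in \mathbb{R}^{J \times D}$, and $C \in \mathbb{R}^{K \times D}$:

$x_{ijk} \approx \sum_{d=1}^{D} a_{id} \, b_{jd} \, c_{kd}$

Assuming i.i.d. Gaussian noise on the entries, the maximum-likelihood estimate of $A$, $B$, and $C$ coincides with the classical least-squares CP fit, which is the connection to maximum likelihood mentioned above; the probabilistic treatment additionally places priors on the factors (e.g. sparsity- or non-negativity-inducing priors) and infers their posterior distributions by variational Bayes or Gibbs sampling.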

Original language: English
Article number: 025011
Journal: Machine Learning: Science and Technology
Volume: 1
Issue number: 2
Pages (from-to): 1-20
Number of pages: 20
DOIs
Publication status: Published - 17 Jun 2020

Keywords

  • Bayesian inference
  • Candecomp/PARAFAC
  • Multi-way modelling
  • Tensor decomposition
  • Tensor train
  • Tucker
