Efficient Probabilistic Tensor Networks

Hameed, Marawan Gamal Abdel, Rabusseau, Guillaume

arXiv.org Artificial Intelligence

Tensor networks (TNs) enable compact representations of large tensors through shared parameters. Their use in probabilistic modeling is particularly appealing, as probabilistic tensor networks (PTNs) allow for tractable computation of marginals. However, existing approaches for learning the parameters of PTNs are either computationally demanding and not fully compatible with automatic differentiation frameworks, or numerically unstable. In this work, we propose a conceptually simple, efficient, and numerically stable approach for learning PTNs. We show that our method provides significant improvements in time and space complexity, achieving a 10x reduction in latency for generative modeling on the MNIST dataset. Furthermore, our approach enables learning distributions with 10x more variables than previous approaches when applied to a variety of density estimation benchmarks. Our code is publicly available at github.com/marawangamal/ptn.


Below, we address the main concerns raised in the reviews

Neural Information Processing Systems

We thank the reviewers for their comments and suggestions, which we will incorporate into our revised version. Below, we address the main concerns raised in the reviews. The work of Morris et al. [2019] was one of our main inspirations. We note that the time complexity of Morris et al. can probably be improved. Nevertheless, we admit that trying to prove our results without using it is an intriguing question.