Distributed Flexible Nonlinear Tensor Factorization
Shandian Zhe, Kai Zhang, Pengyuan Wang, Kuang-chih Lee, Zenglin Xu, Yuan Qi, Zoubin Ghahramani
Neural Information Processing Systems
Tensor factorization is a powerful tool for analysing multi-way data. Recently proposed nonlinear factorization methods, although capable of capturing complex relationships, are computationally expensive and may suffer severe learning bias in cases of extreme data sparsity. We therefore propose a distributed, flexible nonlinear tensor factorization model that avoids the expensive computations and structural restrictions of the Kronecker product in existing tensor Gaussian process (TGP) formulations, allowing an arbitrary subset of tensor entries to be selected for training. In addition, we derive a tractable and tight variational evidence lower bound (ELBO) that enables highly decoupled, parallel computations and high-quality inference. Based on the new bound, we develop a distributed, key-value-free inference algorithm in the MapReduce framework, which can fully exploit the memory-cache mechanism of fast MapReduce systems such as Spark.
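The key-value-free aggregation the abstract refers to can be illustrated with a minimal sketch. The idea is that because the ELBO decomposes into independent per-entry terms, each worker can fold its partition of observed entries into a single partial sum locally, and the driver combines the partial sums with a commutative, associative reduce; no `(key, value)` pairs are emitted, so no shuffle is required. The function names and the stand-in per-entry term below are hypothetical, not the paper's actual variational expressions.

```python
from functools import reduce

def local_elbo_term(entry):
    # Hypothetical stand-in for one observed entry's contribution to the
    # decoupled ELBO; in the paper this would be a variational term.
    return entry

def partition_partial_sum(partition):
    # Each worker folds its partition into one scalar partial sum.
    # No (key, value) pairs are emitted, hence no shuffle stage.
    return sum(local_elbo_term(e) for e in partition)

def aggregate_elbo(partitions):
    # Commutative, associative reduce over per-partition partial sums,
    # analogous to a Spark treeAggregate over cached partitions.
    return reduce(lambda a, b: a + b,
                  (partition_partial_sum(p) for p in partitions), 0.0)

# Three simulated worker partitions of observed-entry terms.
partitions = [[1.0, 2.0], [3.0], [4.0, 5.0]]
print(aggregate_elbo(partitions))  # 15.0
```

In a real Spark deployment the per-partition fold and the final combine would map onto `mapPartitions` plus `treeAggregate` over an RDD cached in memory, which is what lets the algorithm exploit the memory-cache mechanism mentioned above.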