Legendre Decomposition for Tensors
Mahito Sugiyama, Hiroyuki Nakahara, Koji Tsuda
Neural Information Processing Systems
CP decomposition compresses an input tensor into a sum of rank-one components, and Tucker decomposition approximates an input tensor by a core tensor multiplied by matrices. To date, matrix and tensor decomposition have been extensively analyzed, and there are a number of variations of such decomposition (Kolda and Bader, 2009), where the common goal is to approximate a given tensor by a smaller number of components, or parameters, in an efficient manner. However, despite the recent advances in decomposition techniques, a learning theory that can systematically define decomposition for tensors of any order, including vectors and matrices, is still under development. Moreover, it is well known that CP and Tucker tensor decomposition involve non-convex optimization and that global convergence is not guaranteed.
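To make the two models in the first sentence concrete, the following is a minimal NumPy sketch (not the paper's method) that constructs a third-order tensor in CP form, as a sum of R rank-one outer products, and in Tucker form, as a small core tensor multiplied by a factor matrix along each mode. All dimensions and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# CP form: T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r],
# i.e. a sum of R rank-one components.
I, J, K, R = 4, 5, 6, 3
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))
T_cp = np.einsum('ir,jr,kr->ijk', A, B, C)

# Tucker form: a core tensor G contracted with a factor
# matrix U_n along each of the three modes.
P, Q, S = 2, 2, 2
G = rng.standard_normal((P, Q, S))
U1 = rng.standard_normal((I, P))
U2 = rng.standard_normal((J, Q))
U3 = rng.standard_normal((K, S))
T_tucker = np.einsum('pqs,ip,jq,ks->ijk', G, U1, U2, U3)

print(T_cp.shape, T_tucker.shape)  # both (4, 5, 6)
```

Both constructions use far fewer parameters than the 4×5×6 = 120 entries of the full tensor, which is the compression the abstract refers to; fitting A, B, C (or G and the U_n) to a given tensor is the non-convex optimization problem mentioned at the end of the paragraph.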