Matrix Exponential Gradient Updates for On-line Learning and Bregman Projection
Neural Information Processing Systems
We address the problem of learning a symmetric positive definite matrix. The central issue is to design parameter updates that preserve positive definiteness. Our updates are motivated by the von Neumann divergence. Rather than treating the most general case, we focus on two key applications that exemplify our methods: on-line learning with a simple square loss, and finding a symmetric positive definite matrix subject to symmetric linear constraints. The updates generalize the Exponentiated Gradient (EG) update and AdaBoost, respectively: the parameter is now a symmetric positive definite matrix of trace one instead of a probability vector (which in this context is a diagonal positive definite matrix with trace one).
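As an illustration of the kind of update the abstract describes, here is a minimal sketch of a matrix exponentiated gradient step for on-line learning with a square loss. The loss form `(y - tr(W X))^2`, the learning rate, and the toy data are assumptions for this sketch, not details taken from the paper; the key property shown is that updating in the matrix-log domain and renormalizing keeps the parameter symmetric positive definite with trace one.

```python
import numpy as np

def sym_apply(W, f):
    """Apply a scalar function f to the eigenvalues of a symmetric matrix W."""
    vals, vecs = np.linalg.eigh(W)
    return (vecs * f(vals)) @ vecs.T

def meg_update(W, X, y, eta=0.05):
    """One matrix exponentiated gradient step on the square loss
    (y - tr(W X))^2. W must be symmetric positive definite; the result
    is again symmetric positive definite with trace one."""
    y_hat = np.trace(W @ X)
    grad = 2.0 * (y_hat - y) * X            # gradient of the loss w.r.t. W
    M = sym_apply(W, np.log) - eta * grad   # step in the matrix-log domain
    M = (M + M.T) / 2.0                     # symmetrize against numerical drift
    W_new = sym_apply(M, np.exp)            # matrix exponential
    return W_new / np.trace(W_new)          # renormalize to trace one

# Toy usage: symmetric instance matrices, targets from a hidden matrix.
rng = np.random.default_rng(0)
d = 3
W_star = np.diag([0.6, 0.3, 0.1])           # hidden trace-one target (assumed)
W = np.eye(d) / d                           # start at the maximum-entropy matrix
for _ in range(200):
    A = rng.standard_normal((d, d))
    X = (A + A.T) / 2.0                     # symmetric instance
    W = meg_update(W, X, np.trace(W_star @ X))

print(np.trace(W), np.linalg.eigvalsh(W).min())
```

The eigendecomposition route (`sym_apply`) plays the role of the matrix exponential and logarithm; because the update never leaves the exponential's range, positive definiteness is preserved without any explicit projection onto the cone.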