Supplementary Material
Primal-Attention: Self-attention through Asymmetric Kernel SVD in Primal Representation
Yingyi Chen
Neural Information Processing Systems
Comments on Theorem 3.2

With the primal problem in (6) in the paper, Theorem 3.2 provides the corresponding dual problem, which takes the form of a shifted eigenvalue problem (a numerical sketch is given at the end of this section). Additionally, [27] presents the optimization w.r.t. a single projection direction, whereas our KSVD jointly optimizes multiple projection directions with possibly data-dependent projection weights. Therefore, our KSVD is more general in the data setups.

Comments on Remark 3.3

In Remark 3.3, we show that the attention values can be regarded as playing the role of the dual variables (see the primal-dual sketch at the end of this section). Using data-dependent projection weights does not affect the derivation of the shifted eigenvalue problem in the dual. With the derivations of the primal and dual optimization problems above, the primal-dual model representation of our KSVD problem can be provided correspondingly.

Comments on Lemma 4.2

Lemma 4.2 evaluates the objective value of the KSVD problem (6) at its stationary points. Moreover, as in the proof of Theorem 3.2, we note that the regularization coefficient in (6) only affects the scaling of the solutions and does not change the structure of the derivations.

Implementation Details

This section provides the implementation details of all experiments included in the paper. These will be illustrated in detail in the following.

Algorithm 1 Learning with Primal-Attention
Require: X := [x_1, …, x_N]

An illustrative PyTorch sketch of a layer of this form is given at the end of this section.

UEA Time Series

The UEA time series benchmark [31] consists of 30 datasets. Following the setup in [11], we select 10 datasets for evaluation.
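The shifted eigenvalue problem referenced in the comments on Theorem 3.2 is, in the dual, the classical Lanczos formulation of SVD applied to the asymmetric kernel matrix. Below is a minimal NumPy sketch under our own assumptions (tanh feature maps and random data are illustrative choices, not the paper's): it builds an asymmetric kernel matrix K from two different feature maps and verifies that its singular pairs satisfy K h_r = λ h_e and K^T h_e = λ h_r.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical asymmetric kernel: inner products of two different
# feature maps phi_q and phi_k applied to the same data X.
N, d, p = 8, 4, 6
X = rng.normal(size=(N, d))
phi_q = np.tanh(X @ rng.normal(size=(d, p)))  # queries' feature map (assumed form)
phi_k = np.tanh(X @ rng.normal(size=(d, p)))  # keys' feature map (assumed form)
K = phi_q @ phi_k.T                           # asymmetric kernel matrix, K != K.T

# SVD of K: K = H_e diag(s) H_r^T.
H_e, s, H_rT = np.linalg.svd(K)
H_r = H_rT.T

# Shifted eigenvalue problem (Lanczos): each singular pair satisfies
#   K h_r = s * h_e   and   K^T h_e = s * h_r.
j = 0
assert np.allclose(K @ H_r[:, j], s[j] * H_e[:, j])
assert np.allclose(K.T @ H_e[:, j], s[j] * H_r[:, j])
```

Because K is asymmetric, the two sets of dual variables h_e and h_r generally differ, which is what distinguishes KSVD from symmetric kernel eigendecompositions.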
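To illustrate the primal-dual model representation mentioned above, the following NumPy sketch (again with assumed, illustrative feature maps) takes the top singular vectors of K as dual variables and forms the primal weights via the cross representation W_e = Φ_k^T H_r and W_r = Φ_q^T H_e; treating the dual variables as entering without extra scaling factors is our simplifying assumption about the KKT structure. The primal scores then reproduce the dual products K H_r and K^T H_e without ever materializing the N × N kernel matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
N, d, p, s = 8, 4, 6, 3
X = rng.normal(size=(N, d))
Phi_q = np.tanh(X @ rng.normal(size=(d, p)))  # assumed feature map for queries
Phi_k = np.tanh(X @ rng.normal(size=(d, p)))  # assumed feature map for keys

K = Phi_q @ Phi_k.T                  # dual view: N x N asymmetric kernel matrix
U, sing, Vt = np.linalg.svd(K)
H_e, H_r = U[:, :s], Vt[:s].T        # dual variables: top-s singular vectors

# Primal weights from the dual variables (cross representation, assumed
# here without extra scaling): W_e = Phi_k^T H_r, W_r = Phi_q^T H_e.
W_e = Phi_k.T @ H_r
W_r = Phi_q.T @ H_e

# The primal scores (computed without forming K) match the dual scores.
assert np.allclose(Phi_q @ W_e, K @ H_r)
assert np.allclose(Phi_k @ W_r, K.T @ H_e)
```

This is the computational appeal of the primal representation: the scores cost O(Nps) rather than the O(N^2) of forming the kernel matrix.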
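Algorithm 1 learns a model whose attention is computed in the primal. As a rough, self-contained PyTorch sketch (the module name, the tanh feature maps, and the output head are our own illustrative assumptions, not the authors' exact design), a Primal-Attention-style layer can project each token through two feature maps and output the concatenated e- and r-scores. The paper additionally trains with a KSVD-based regularization loss motivated by Lemma 4.2, whose exact form we omit here.

```python
import torch
import torch.nn as nn

class PrimalAttentionSketch(nn.Module):
    """Illustrative attention layer in the primal representation."""

    def __init__(self, d_model: int, p_feat: int, s_rank: int):
        super().__init__()
        # Two different feature maps for queries and keys (assumed form).
        self.phi_q = nn.Sequential(nn.Linear(d_model, p_feat), nn.Tanh())
        self.phi_k = nn.Sequential(nn.Linear(d_model, p_feat), nn.Tanh())
        # Primal projection weights for the e- and r-scores.
        self.W_e = nn.Parameter(torch.randn(p_feat, s_rank) / p_feat ** 0.5)
        self.W_r = nn.Parameter(torch.randn(p_feat, s_rank) / p_feat ** 0.5)
        # Output head mapping the concatenated scores back to d_model.
        self.out = nn.Linear(2 * s_rank, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, N, d_model); no N x N attention matrix is formed.
        e = self.phi_q(x) @ self.W_e  # (batch, N, s_rank)
        r = self.phi_k(x) @ self.W_r  # (batch, N, s_rank)
        return self.out(torch.cat([e, r], dim=-1))

# Quick shape check.
layer = PrimalAttentionSketch(d_model=32, p_feat=64, s_rank=8)
x = torch.randn(2, 16, 32)
print(layer(x).shape)  # torch.Size([2, 16, 32])
```

Since the layer never builds an N × N attention matrix, its forward pass is linear in the sequence length.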