Supplementary Material for: Adversarial Regression with Doubly Non-negative Weighting Matrices
In this supplementary material, we give detailed proofs of our technical results in Section A and illustrate additional empirical results in Section C. For Proposition 3.4, we begin by computing the support function of the convex cone of symmetric positive semidefinite matrices, which vanishes precisely when Tr(AΩ) ≤ 0 for all Ω ⪰ 0; therefore, the desired result follows. Moreover, the infimum in problem (A.2) can be restricted to γ > 0, and this observation completes the proof. The following elementary fact is well known; for completeness, we include a proof here.
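The support function computation mentioned above is the standard fact about the positive semidefinite cone; a short derivation, written here for reference, is:

```latex
% Support function of the PSD cone S^n_+ at a symmetric matrix A:
\delta^*(A \mid \mathbb{S}^n_+)
  = \sup_{\Omega \succeq 0} \operatorname{Tr}(A\Omega)
  = \begin{cases}
      0        & \text{if } A \preceq 0, \\
      +\infty  & \text{otherwise.}
    \end{cases}
% If A \preceq 0, then \operatorname{Tr}(A\Omega) \le 0 for every
% \Omega \succeq 0, with equality at \Omega = 0.
% If A has an eigenvector v with eigenvalue \lambda > 0, taking
% \Omega = t\, vv^\top and letting t \to \infty drives
% \operatorname{Tr}(A\Omega) = t\lambda \to +\infty.
```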
Adversarial Regression with Doubly Non-negative Weighting Matrices
Tam Le, Truyen Nguyen, Makoto Yamada, Jose Blanchet, Viet Anh Nguyen
Many machine learning tasks that involve predicting an output response can be solved by training a weighted regression model. Unfortunately, the predictive power of such models may severely deteriorate under low sample sizes or under covariate perturbations. Reweighting the training samples has emerged as an effective mitigation strategy for these problems. In this paper, we propose a novel and coherent scheme for kernel-reweighted regression by reparametrizing the sample weights using a doubly non-negative matrix. When the weighting matrix is confined to an uncertainty set defined by either the log-determinant divergence or the Bures-Wasserstein distance, we show that the adversarially reweighted estimate can be computed efficiently using first-order methods. Numerical experiments show that our reweighting strategy delivers promising results on numerous datasets.
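The two matrix discrepancies named in the abstract have closed-form expressions for symmetric positive (semi)definite matrices. A minimal sketch of both, using standard formulas (the function names here are our own, not from the paper's code):

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein(A, B):
    """Bures-Wasserstein distance between symmetric PSD matrices A and B:
    d(A, B)^2 = Tr(A) + Tr(B) - 2 Tr((A^{1/2} B A^{1/2})^{1/2})."""
    sqrt_A = sqrtm(A)
    cross = sqrtm(sqrt_A @ B @ sqrt_A)
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    # Clip tiny negative values caused by floating-point round-off.
    return np.sqrt(max(np.real(d2), 0.0))

def logdet_divergence(X, Y):
    """Log-determinant (Stein) divergence between symmetric PD matrices:
    D(X, Y) = Tr(X Y^{-1}) - log det(X Y^{-1}) - n."""
    n = X.shape[0]
    M = X @ np.linalg.inv(Y)
    _, logdet = np.linalg.slogdet(M)
    return np.trace(M) - logdet - n
```

For commuting (e.g. diagonal) matrices the Bures-Wasserstein distance reduces to the Euclidean distance between matrix square roots, which gives a quick sanity check: `bures_wasserstein(np.diag([1., 4.]), np.diag([4., 1.]))` returns `sqrt(2)`, and both discrepancies vanish when the arguments coincide.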