Supplementary Material for Adversarial Robustness of Supervised Sparse Coding

A Encoder Gap for k-sparse signals

Herein we show that a positive encoder gap exists for signals that are (approximately) k-sparse.
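As a numerical sketch of the claim above, the snippet below encodes an exactly 3-sparse signal with a Lasso encoder (computed by plain ISTA) and measures the off-support dual slack, lam - max_{i not in supp} |d_i^T (x - Dz)|, as one natural proxy for an encoder gap. Both the choice of encoder and the slack proxy are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the prox of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_encode(D, x, lam, n_iter=2000):
    """Solve min_z 0.5||x - Dz||^2 + lam*||z||_1 with plain ISTA."""
    L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the smooth part
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = soft_threshold(z - D.T @ (D @ z - x) / L, lam / L)
    return z

rng = np.random.default_rng(0)
n, m, lam = 20, 50, 0.1
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)           # unit-norm dictionary atoms
z_true = np.zeros(m)
z_true[:3] = [1.0, -0.5, 0.8]
x = D @ z_true                           # exactly 3-sparse signal

z = lasso_encode(D, x, lam)
resid = x - D @ z
corr = np.abs(D.T @ resid)               # |d_i^T (x - Dz)| per atom
support = np.abs(z) > 1e-8
slack = lam - corr[~support].max()       # positive slack off the support
print(f"nonzeros: {support.sum()}, off-support slack: {slack:.4f}")
```

At the Lasso optimum, KKT conditions force every off-support correlation strictly below lam (generically), so the printed slack is positive; for an incoherent random dictionary and a well-separated sparse signal the margin is comfortably bounded away from zero.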

Neural Information Processing Systems 

Herein we prove our generalization bound, which we first re-state for completeness.

Let us bound the first of these terms. Let us now focus on the second and third terms in Eq. […]. Likewise, the loss ℓ(y, f(x)) is Lipschitz continuous w.r.t. w […]. Note that the third term in Eq. […].

In this section we prove the key result in Lemma 4.2, guaranteeing that the perturbation in the encoded representation is bounded. With this result at hand, the proof of Lemma 4.2 follows directly from Remark B.2. For the setting above, we have that a) […]. The proof mimics that in [Mehta and Gray, 2013, Lemma 10-11], though accommodating for the adversarial perturbation.
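The inline equations in this passage were lost in extraction; as a hedged reconstruction of the standard chaining step (the symbols $\ell$ for the loss, $z(x)$ for the encoder, and $L_\ell$ for the Lipschitz constant are assumed here rather than taken verbatim from the original), the Lipschitz argument typically reads:

\begin{equation*}
\big|\,\ell\big(y,\, w^\top z(x+v)\big) - \ell\big(y,\, w^\top z(x)\big)\,\big|
\;\le\; L_\ell \,\|w\|_2\, \big\|z(x+v) - z(x)\big\|_2,
\end{equation*}

so that a bound on the perturbation of the encoded representation, $\|z(x+v) - z(x)\|_2$ (the content of Lemma 4.2), transfers directly to a bound on the change in the loss under the adversarial perturbation $v$.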
