Defending Against Adversarial Attacks via Neural Dynamic System

Neural Information Processing Systems 

Some recent works have accordingly proposed to enhance the robustness of DNNs from a dynamic system perspective. Following this line of inquiry, and inspired by the asymptotic stability of general nonautonomous dynamical systems, we propose to make each clean instance an asymptotically stable equilibrium point of a slowly time-varying system in order to defend against adversarial attacks. We present a theoretical guarantee that if a clean instance is an asymptotically stable equilibrium point and the adversarial instance lies in its neighborhood, the asymptotic stability will reduce the adversarial noise and bring the adversarial instance close to the clean instance. Motivated by these theoretical results, we propose a nonautonomous neural ordinary differential equation (ASODE) and place constraints on its corresponding linear time-variant system so that all clean instances act as its asymptotically stable equilibrium points. Our analysis shows that these constraints can be converted into regularizers in the implementation.
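As a rough illustration of the idea (not the authors' released implementation), the sketch below trains a time-dependent (nonautonomous) neural ODE with two stability-inspired penalties: one pushing the vector field toward zero at clean inputs, so that clean instances behave as equilibrium points, and one penalizing local expansion of the dynamics around them. The specific regularizer forms, the weights lam_eq and lam_stab, and the fixed-step Euler solver are illustrative assumptions; the paper derives its constraints from the corresponding linear time-variant system.

```python
# Illustrative sketch of stability-inspired regularizers for a nonautonomous
# neural ODE; the exact constraint forms in the paper may differ.
import torch
import torch.nn as nn


class TimeVaryingODEFunc(nn.Module):
    """f(x, t): both the state x and the time t are inputs (nonautonomous)."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        t_col = t.expand(x.shape[0], 1)            # broadcast scalar time
        return self.net(torch.cat([x, t_col], dim=1))


def integrate(func, x0, t0=0.0, t1=1.0, steps=20):
    """Simple fixed-step Euler solver (stand-in for an adaptive ODE solver)."""
    x, dt = x0, (t1 - t0) / steps
    for k in range(steps):
        x = x + dt * func(x, torch.tensor(t0 + k * dt))
    return x


def stability_regularizers(func, x_clean, t):
    """Hypothetical penalties encouraging clean inputs to be asymptotically
    stable equilibria of the learned dynamics."""
    x = x_clean.detach().requires_grad_(True)
    f = func(x, t)
    eq_term = f.pow(2).sum(dim=1).mean()           # want f(x_clean, t) ~ 0
    v = torch.randn_like(x)                        # random probe direction
    jvp = torch.autograd.grad(f, x, grad_outputs=v, create_graph=True)[0]
    stab_term = torch.relu((v * jvp).sum(dim=1)).mean()  # penalize v^T J v > 0
    return eq_term, stab_term


# Usage sketch: combine the task loss with the two regularizers.
dim, batch = 8, 16
func = TimeVaryingODEFunc(dim)
head = nn.Linear(dim, 2)
x_clean = torch.randn(batch, dim)
y = torch.randint(0, 2, (batch,))

x_T = integrate(func, x_clean)
task_loss = nn.functional.cross_entropy(head(x_T), y)
eq_term, stab_term = stability_regularizers(func, x_clean, torch.tensor(0.0))
lam_eq, lam_stab = 1.0, 1.0                        # hypothetical weights
loss = task_loss + lam_eq * eq_term + lam_stab * stab_term
loss.backward()
```

The equilibrium term makes clean instances fixed points of the flow, while the expansion penalty is one simple proxy for local asymptotic stability; under these assumptions, a perturbed input started near a clean instance is pulled back toward it as the ODE evolves.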
