Few-shot Image Generation via Adaptation-Aware Kernel Modulation -- Supplementary -- Overview

Neural Information Processing Systems

On the other hand, FSIG with extremely limited data (10 samples) poses unique challenges. In particular, as pointed out in [4, 5], severe mode collapse and loss of diversity are critical challenges in FSIG that require special attention. We remark that in [12], a technique called AdaFM is introduced to update kernels.
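To make the kernel-modulation idea concrete, the sketch below is an illustrative (not the authors') PyTorch module in the spirit of AdaFM: a frozen source-domain convolution kernel is adapted through a small number of learnable per-channel scale and shift parameters, which is one way to limit overfitting when only a handful of target samples are available. The class and parameter names (ModulatedConv2d, gamma, beta) are assumptions made for illustration.

# Illustrative AdaFM-style kernel modulation (a sketch, not the paper's implementation):
# the pretrained kernel W is frozen and adapted as gamma * W + beta, so only the
# modulation parameters are trained on the few-shot target data.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModulatedConv2d(nn.Module):
    def __init__(self, pretrained_conv: nn.Conv2d):
        super().__init__()
        # Freeze the source-domain kernel and bias.
        self.weight = nn.Parameter(pretrained_conv.weight.detach(), requires_grad=False)
        self.bias = (nn.Parameter(pretrained_conv.bias.detach(), requires_grad=False)
                     if pretrained_conv.bias is not None else None)
        out_c, in_c = self.weight.shape[:2]
        # Learnable modulation parameters: per input/output channel scale and shift.
        self.gamma = nn.Parameter(torch.ones(out_c, in_c, 1, 1))
        self.beta = nn.Parameter(torch.zeros(out_c, in_c, 1, 1))
        self.stride = pretrained_conv.stride
        self.padding = pretrained_conv.padding

    def forward(self, x):
        w = self.gamma * self.weight + self.beta  # modulate the frozen kernels
        return F.conv2d(x, w, self.bias, stride=self.stride, padding=self.padding)

# Usage: wrap a pretrained layer and fine-tune only gamma/beta on the target domain.
conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)
adapted = ModulatedConv2d(conv)
out = adapted(torch.randn(1, 64, 32, 32))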





How degenerate is the parametrization of neural networks with the ReLU activation function?

Neural Information Processing Systems

Neural network training is usually accomplished by solving a non-convex optimization problem using stochastic gradient descent. Although one optimizes over the network's parameters, the main loss function generally depends only on the realization of the neural network, i.e. the function it computes. Studying the optimization problem over the space of realizations opens up new ways to understand neural network training. In particular, common loss functions such as mean squared error and categorical cross-entropy are convex on spaces of neural network realizations, which themselves are non-convex. Approximation capabilities of neural networks can be used to deal with the latter non-convexity, which allows us to establish that, for sufficiently large networks, local minima of a regularized optimization problem on the realization space are almost optimal.
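The degeneracy of the ReLU parametrization can be illustrated with a standard rescaling argument (an illustration, not necessarily the paper's construction): because ReLU is positively homogeneous, scaling a hidden neuron's incoming weights by some lambda > 0 and its outgoing weights by 1/lambda leaves the realized function unchanged, so distinct parameter vectors yield the same realization. In LaTeX, for a two-layer network:

\[
R_\theta(x) = W_2\,\sigma(W_1 x + b_1) + b_2,
\qquad \sigma(t) = \max(t,0).
\]
For any \(\lambda > 0\) and \(D_\lambda = \operatorname{diag}(1,\dots,\lambda,\dots,1)\) scaling a single hidden unit,
\[
\big(W_2 D_\lambda^{-1}\big)\,\sigma\big(D_\lambda W_1 x + D_\lambda b_1\big) + b_2
= W_2\,\sigma(W_1 x + b_1) + b_2
\quad\text{for all } x,
\]
since \(\sigma(\lambda t) = \lambda\,\sigma(t)\) whenever \(\lambda > 0\). Hence different parameter vectors realize the same function, while a loss defined on realizations is unaffected by the reparametrization.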


Curvature Regularization to Prevent Distortion in Graph Embedding

Neural Information Processing Systems

Recent research on graph embedding has achieved success in various applications. Most graph embedding methods preserve the proximity structure of a graph in a manifold within an embedding space. We identify an important but neglected problem with this proximity-preserving strategy: graph topology patterns, while preserved well in the embedding manifold, may become distorted in the ambient Euclidean embedding space, and hence difficult for machine learning models to detect. To address this problem, we propose curvature regularization, which enforces flatness of embedding manifolds and thereby prevents the distortion. We present a novel angle-based sectional curvature, termed ABS curvature, and accordingly three kinds of curvature regularization to induce flat embedding manifolds during graph embedding. We integrate curvature regularization into five popular proximity-preserving embedding methods, and empirical results in two applications show significant improvements on a wide range of open graph datasets.
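As a rough illustration of how a flatness-encouraging regularizer can be attached to a proximity-preserving objective, the sketch below uses a simple local-flatness proxy (the residual energy outside a low-rank fit of each node's neighborhood in the embedding space) rather than the paper's ABS curvature, whose definition is not reproduced here; the function names, the rank parameter d, and the weight lam are hypothetical.

# Hypothetical sketch: penalize how far each node's neighborhood in the embedding
# space deviates from a low-dimensional affine subspace (a crude flatness proxy,
# NOT the paper's angle-based sectional curvature).
import torch

def local_flatness_penalty(Z, neighbors, d=1):
    # Z: (n, k) node embeddings; neighbors: list of LongTensors of neighbor indices.
    penalty = Z.new_zeros(())
    for idx in neighbors:
        P = Z[idx]                                # neighborhood points, shape (m, k)
        P = P - P.mean(dim=0, keepdim=True)       # center the neighborhood
        s = torch.linalg.svdvals(P)               # singular values of the centered cloud
        penalty = penalty + (s[d:] ** 2).sum()    # energy outside the top-d directions
    return penalty

def total_loss(Z, proximity_loss, neighbors, lam=0.1):
    # proximity_loss: any proximity-preserving objective evaluated on the embeddings Z.
    return proximity_loss(Z) + lam * local_flatness_penalty(Z, neighbors)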