Goto


One-step differentiation of iterative algorithms

Neural Information Processing Systems

For iterative algorithms, implicit differentiation alleviates the cost of backpropagating through every iteration, but it requires a custom implementation of Jacobian evaluation. In this paper, we study one-step differentiation, also known as Jacobian-free backpropagation, a method as easy as automatic differentiation and as efficient as implicit differentiation for fast algorithms (e.g., superlinearly convergent methods).
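The idea in the abstract can be made concrete on a toy problem. Below is a minimal NumPy sketch (not taken from the paper; all names are illustrative): Heron's iteration x ← (x + θ/x)/2 converges superlinearly to √θ, and one-step differentiation means differentiating a single application of the update at the fixed point while treating the incoming iterate as a constant. Because ∂F/∂x vanishes at the fixed point for superlinear methods, this already recovers the exact derivative d√θ/dθ = 1/(2√θ).

```python
import numpy as np

def F(x, theta):
    # Heron / Newton step for solving x**2 = theta; converges superlinearly.
    return 0.5 * (x + theta / x)

def solve(theta, n_iter=20, x0=1.0):
    x = x0
    for _ in range(n_iter):
        x = F(x, theta)
    return x

def one_step_derivative(theta, eps=1e-6):
    # One-step differentiation: differentiate a SINGLE application of F
    # at the fixed point x*, treating x* itself as a constant
    # (here via central finite differences in theta).
    x_star = solve(theta)
    return (F(x_star, theta + eps) - F(x_star, theta - eps)) / (2 * eps)

theta = 4.0
x_star = solve(theta)                  # fixed point: sqrt(4) = 2.0
d_onestep = one_step_derivative(theta)
d_true = 1.0 / (2.0 * np.sqrt(theta))  # analytic d sqrt(theta)/d theta
print(x_star, d_onestep, d_true)
```

For slower (linearly convergent) iterations, ∂F/∂x at the fixed point is nonzero and one-step differentiation is only an approximation of the implicit-differentiation Jacobian (I − ∂F/∂x)⁻¹ ∂F/∂θ.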



Latent SDEs on Homogeneous Spaces

Neural Information Processing Systems

We consider the problem of variational Bayesian inference in a latent variable model where a (possibly complex) observed stochastic process is governed by the solution of a latent stochastic differential equation (SDE).
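As an illustration of the setting only (a hedged sketch, not the paper's model or code; the dynamics, decoder, and parameter names below are invented for the example): a latent path governed by an SDE can be simulated with an Euler-Maruyama discretization, and an observed process defined as a noisy function of that latent state.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_latent_sde(z0, drift, diffusion, dt=0.01, n_steps=500):
    """Euler-Maruyama discretization of dz = drift(z) dt + diffusion(z) dW."""
    z = np.empty(n_steps + 1)
    z[0] = z0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
        z[k + 1] = z[k] + drift(z[k]) * dt + diffusion(z[k]) * dW
    return z

# Ornstein-Uhlenbeck latent dynamics (an illustrative choice, not the paper's).
theta_ou, sigma = 2.0, 0.3
path = simulate_latent_sde(0.0, lambda z: -theta_ou * z, lambda z: sigma)

# Toy observation process: a nonlinearity of the latent state plus noise.
observations = np.tanh(path) + rng.normal(0.0, 0.05, size=path.shape)
print(path.shape, observations.shape)
```

Variational inference in such a model would then fit an approximate posterior over latent paths given `observations`; that machinery is beyond this sketch.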


Trial matching: capturing variability with data-constrained spiking neural networks

Neural Information Processing Systems

Simultaneous behavioral and electrophysiological recordings call for new methods to reveal the interactions between neural activity and behavior. A milestone would be an interpretable model of the co-variability of spiking activity and behavior across trials. Here, we model a mouse cortical sensory-motor pathway in a tactile detection task reported by licking with a large recurrent spiking neural network (RSNN), fitted to the recordings via gradient-based optimization. We focus specifically on the difficulty to match the trial-to-trial variability in the data. Our solution relies on optimal transport to define a distance between the distributions of generated and recorded trials. The technique is applied to artificial data and neural recordings covering six cortical areas. We find that the resulting RSNN can generate realistic cortical activity and predict jaw movements across the main modes of trial-to-trial variability. Our analysis also identifies an unexpected mode of variability in the data corresponding to task-irrelevant movements of the mouse.
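The optimal-transport distance between generated and recorded trial distributions can be sketched as follows (an illustrative implementation, not the authors' code; the paper's variant may differ, e.g., by using a soft or entropic matching). With equal trial counts and uniform weights, the optimal transport plan reduces to a one-to-one matching, which the Hungarian algorithm computes exactly.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def trial_matching_distance(recorded, generated):
    """OT-style distance between two sets of trials.

    Each row is one trial's feature vector (e.g., a binned spike-count or
    jaw-movement trace). With equal counts and uniform weights the optimal
    transport plan is a hard one-to-one matching, recovered exactly by the
    Hungarian algorithm on the pairwise cost matrix.
    """
    cost = cdist(recorded, generated, metric="euclidean")
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].mean()

rng = np.random.default_rng(1)
recorded = rng.normal(0.0, 1.0, size=(50, 20))  # 50 trials x 20 features (toy data)
shuffled = recorded[rng.permutation(50)]        # same trials, different order
print(trial_matching_distance(recorded, shuffled))  # 0.0: invariant to trial order
```

The key property for trial-to-trial variability is visible in the last line: the distance compares distributions of trials, not trials in a fixed order, so a model need only reproduce the right variability, not the exact per-trial sequence.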



Supplementary Materials for "Deep Fractional Fourier Transform" (Hu Yu)

Neural Information Processing Systems

This supplementary document is organized as follows: Section 1 proves that the formula of the FRFT degrades to that of the FT when α = π/2. Section 2 presents the discrete implementation of the 2D FRFT. Section 4 reports the experimental results with a single branch. Section 5 details the architecture design of SFC and gives example usage of SFC and MFRFC. Section 6 introduces the periodicity of the FRFT. Section 7 introduces the energy distribution of the FRFT.
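The degeneration mentioned for Section 1 can be seen directly at the kernel level. Assuming the standard FRFT kernel convention (this derivation is a sketch and is not copied from the supplement; sign conventions vary across references):

```latex
% Standard FRFT kernel for angle \alpha:
K_\alpha(u,t) = \sqrt{\frac{1 - i\cot\alpha}{2\pi}}\,
  \exp\!\Big( i\,\frac{t^2 + u^2}{2}\,\cot\alpha - i\,u t \csc\alpha \Big)

% At \alpha = \pi/2 we have \cot(\pi/2) = 0 and \csc(\pi/2) = 1, so
K_{\pi/2}(u,t) = \frac{1}{\sqrt{2\pi}}\, e^{-i u t},

% which is exactly the kernel of the ordinary Fourier transform.
```

The chirp factors (the terms involving cot α) vanish at α = π/2, leaving only the plane-wave kernel of the FT.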