CORNN: Convex optimization of recurrent neural networks for rapid inference of neural dynamics
Adam Shai
– Neural Information Processing Systems
Advances in optical and electrophysiological recording technologies have made it possible to record the dynamics of thousands of neurons, opening new possibilities for interpreting and controlling large neural populations in behaving animals. A promising way to extract computational principles from these large datasets is to train data-constrained recurrent neural networks (dRNNs). Performing this training in real time would enable research techniques and medical applications that model and control interventions at single-cell resolution and drive desired forms of animal behavior. However, existing training algorithms for dRNNs are inefficient and scale poorly, making it challenging to analyze large neural recordings even in offline scenarios.