Supplementary Material: Reverse engineering recurrent neural networks with Jacobian switching linear dynamical systems
Neural Information Processing Systems
In general, we have found the JSLDS loss-function strengths relatively easy to select (see the example settings in the specific experiment sections below). Note that allowing the JSLDS training signal to alter the RNN's fixed points or slow points would defeat the primary purpose of the method, so gradients from the JSLDS loss are prevented from flowing back into the RNN; however, other variations are possible. We set the number of timesteps to T = 25 and trained both methods with the Adam optimizer with default settings.
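As a minimal sketch of the gradient-stopping idea described above, the toy example below combines an RNN loss with an auxiliary loss computed on stop-gradiented RNN states, so the auxiliary term cannot move the RNN's dynamics. The scalar weights, the tanh step, and the loss-strength hyperparameter `alpha` are all hypothetical simplifications, not the paper's actual architecture:

```python
import jax
import jax.numpy as jnp

def rnn_step(w, h, x):
    # Toy scalar "RNN" update (hypothetical stand-in for the real network).
    return jnp.tanh(w * h + x)

def combined_loss(params, xs, target):
    w, a = params  # w: RNN weight, a: auxiliary (JSLDS-like) weight
    h, rnn_loss, aux_loss = 0.0, 0.0, 0.0
    for x in xs:
        h = rnn_step(w, h, x)
        rnn_loss += (h - target) ** 2
        # Stop-gradient: the auxiliary term reads the RNN state but
        # pushes no gradient back into w, so it cannot alter the
        # RNN's fixed points or slow points.
        h_sg = jax.lax.stop_gradient(h)
        aux_loss += (a * h_sg - target) ** 2
    alpha = 1.0  # hypothetical loss-strength hyperparameter
    return rnn_loss + alpha * aux_loss

xs = jnp.array([0.5, -0.2, 0.1])
grads = jax.grad(combined_loss)((0.3, 0.8), xs, 0.2)
# grads[0] (d/dw) is unaffected by the auxiliary term; grads[1] (d/da) is not.
```

The same pattern applies with any optimizer; here the gradient structure, not the update rule, is the point.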