Supplementary information for: H-Mem: Harnessing synaptic plasticity with Hebbian Memory Networks
Neural Information Processing Systems
Here we provide details of our models and of the encoding and representation of the images and questions used in them. We used a CNN as the input encoder in this task. The last fully connected layer had size 128 and was followed by a ReLU nonlinearity (BatchNorm denotes a batch normalization layer [2]). Training examples were generated as described in the main text. Here, an example is one full sequence of image pairs (including random images) together with one query image.
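As a rough illustration of the encoder head described above, the sketch below passes flattened CNN features through a batch-normalization step and a fully connected layer of size 128 followed by a ReLU. The feature dimension, batch size, and weight initialization are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(x, eps=1e-5):
    # Normalize each feature over the batch (inference-style sketch,
    # without learned scale and shift parameters).
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def encoder_head(features, W, b):
    # Fully connected layer of size 128 followed by a ReLU nonlinearity,
    # as described in the text.
    h = features @ W + b
    return np.maximum(h, 0.0)

feat_dim, out_dim = 512, 128                 # assumed flattened CNN feature size
W = rng.standard_normal((feat_dim, out_dim)) * 0.05
b = np.zeros(out_dim)

batch = rng.standard_normal((4, feat_dim))   # e.g. features for a batch of 4 images
codes = encoder_head(batch_norm(batch), W, b)
print(codes.shape)                           # (4, 128)
```

In the actual model the convolutional layers would produce `features`; only the final fully connected layer and its ReLU are spelled out here.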