A Computational Model of Prefrontal Cortex Function
Braver, Todd S., Cohen, Jonathan D., Servan-Schreiber, David
Accumulating data from neurophysiology and neuropsychology have suggested two information processing roles for prefrontal cortex (PFC): 1) short-term active memory; and 2) inhibition. We present a new behavioral task and a computational model that were developed in parallel. The task was developed to probe both of these prefrontal functions simultaneously, and produces a rich set of behavioral data that act as constraints on the model. The model is implemented in continuous time, thus providing a natural framework in which to study the temporal dynamics of processing in the task. We show how the model can be used to examine the behavioral consequences of neuromodulation in PFC. Specifically, we use the model to make novel and testable predictions regarding the behavioral performance of schizophrenics, who are hypothesized to suffer from reduced dopaminergic tone in this brain area.
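The abstract leaves the model's equations implicit, but a continuous-time connectionist network of the kind described is commonly integrated with an Euler-style update in which each unit's net input is a running average of its instantaneous input. The sketch below is a minimal illustration under that assumption; the logistic activation, the rate parameter tau, the response threshold, and the toy two-unit network are illustrative choices, not the paper's actual architecture.

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(W, ext_input, n_steps=200, tau=0.1):
    """Euler-style continuous-time dynamics: each unit's net input is a
    running average (rate tau) of its instantaneous input, so activation
    builds and decays gradually over processing cycles."""
    net = np.zeros(W.shape[0])
    trace = []
    for _ in range(n_steps):
        inst = W @ logistic(net) + ext_input   # instantaneous net input
        net = tau * inst + (1.0 - tau) * net   # time-averaged net input
        trace.append(logistic(net))
    return np.array(trace)

# Illustrative two-unit network: mutual inhibition, external input to
# unit 0. A response time can be read off as the first cycle on which
# an output activation crosses a fixed threshold.
W = np.array([[0.0, -2.0],
              [-2.0, 0.0]])
trace = simulate(W, ext_input=np.array([1.0, 0.0]))
rt = int(np.argmax(trace[:, 0] > 0.6))   # 0 if the threshold is never crossed
```

Because activations evolve over cycles rather than settling in one step, manipulations such as delay length or reduced neuromodulatory gain translate directly into changes in these simulated trajectories.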
The Effect of Catecholamines on Performance: From Unit to System Behavior
Servan-Schreiber, David, Printz, Harry, Cohen, Jonathan D.
We present a model of catecholamine effects in a network of neural-like elements. We argue that changes in the responsivity of individual elements do not affect their ability to detect a signal and ignore noise. However, the same changes in cell responsivity in a network of such elements do improve the signal detection performance of the network as a whole. We show how this result can be used in a computer simulation of behavior to account for the effect of CNS stimulants on the signal detection performance of human subjects.
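The logic of this result can be illustrated concretely. If catecholaminergic modulation is modeled as a change in the gain multiplying a unit's net input, then for a single element the gain change is a monotonic transform of the same noisy input, so its ROC area cannot change; but in a chain of elements where fresh noise enters after each nonlinearity, higher gain does sharpen discrimination. The Monte Carlo sketch below makes that point; the gain values, bias, noise levels, and two-unit chain are illustrative assumptions, not the paper's simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 2000
noise1 = rng.normal(0.0, 0.5, n_trials)   # noise at the first net input
noise2 = rng.normal(0.0, 0.5, n_trials)   # noise at the second net input

def logistic(net, gain):
    # Catecholamine level is modeled as the gain on a unit's net input;
    # the 0.5 bias centers the unit between signal (1) and noise (0) trials.
    return 1.0 / (1.0 + np.exp(-gain * (net - 0.5)))

def roc_area(pos, neg):
    # Probability that a signal trial outscores a noise trial (ROC area).
    return (pos[:, None] > neg[None, :]).mean()

for gain in (1.0, 2.0, 4.0):
    # Single element: gain is a monotonic transform of the same noisy
    # input, so its ROC area is identical at every gain.
    u_pos = logistic(1.0 + noise1, gain)
    u_neg = logistic(0.0 + noise1, gain)
    # Two elements in series: fresh noise enters after the first
    # nonlinearity, and here higher gain improves discrimination.
    c_pos = logistic(u_pos + noise2, gain)
    c_neg = logistic(u_neg + noise2, gain)
    print(f"gain={gain:.1f}  unit AUC={roc_area(u_pos, u_neg):.3f}"
          f"  chain AUC={roc_area(c_pos, c_neg):.3f}")
```

With these parameters the single-unit AUC is exactly unchanged across gains by construction, while the chain's AUC rises with gain, mirroring the unit-versus-system contrast the abstract describes.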
Learning Sequential Structure in Simple Recurrent Networks
Servan-Schreiber, David, Cleeremans, Axel, McClelland, James L.
The network uses the pattern of activation over a set of hidden units from time step t-1, together with element t, to predict element t+1. When the network is trained with strings from a particular finite-state grammar, it can learn to be a perfect finite-state recognizer for the grammar. Cluster analyses of the hidden-layer patterns of activation showed that they encode prediction-relevant information about the entire path traversed through the network. We illustrate the phases of learning with cluster analyses performed at different points during training. Several connectionist architectures that are explicitly constrained to capture sequential information have been developed. Examples are Time Delay Networks (e.g.
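The architecture described is the Elman-style simple recurrent network (SRN): the hidden layer is copied back as a "context" input on the next time step, and because that copy is treated as a fixed input, prediction error is propagated back only one step. A minimal sketch follows; the layer sizes, learning rate, initialization, and example string are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed dimensions: 7 symbols, 3 hidden units (illustrative only).
n_sym, n_hid = 7, 3
W_xh = rng.normal(0.0, 0.5, (n_hid, n_sym))   # input -> hidden
W_hh = rng.normal(0.0, 0.5, (n_hid, n_hid))   # context (hidden copy) -> hidden
W_hy = rng.normal(0.0, 0.5, (n_sym, n_hid))   # hidden -> next-element prediction

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def train_on_string(seq, lr=0.1):
    """One pass over a string (a list of symbol indices). The previous
    hidden state serves as the context layer; it is treated as a fixed
    input, so error is propagated back only one time step."""
    global W_xh, W_hh, W_hy
    h = np.zeros(n_hid)
    for t in range(len(seq) - 1):
        x = np.eye(n_sym)[seq[t]]
        h_prev = h
        h = logistic(W_xh @ x + W_hh @ h_prev)   # element t + context
        p = softmax(W_hy @ h)                    # prediction of element t+1
        d_y = p - np.eye(n_sym)[seq[t + 1]]      # cross-entropy gradient
        d_h = (W_hy.T @ d_y) * h * (1.0 - h)     # through the logistic
        W_hy -= lr * np.outer(d_y, h)
        W_xh -= lr * np.outer(d_h, x)
        W_hh -= lr * np.outer(d_h, h_prev)

# e.g., one training pass over an (assumed) encoded string
train_on_string([0, 2, 2, 1, 3])
```

Even though gradients never flow more than one step back, the recurrent copy lets prediction-relevant information about earlier elements accumulate in the hidden patterns, which is what the cluster analyses reveal.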
Learning Sequential Structure in Simple Recurrent Networks
Servan-Schreiber, David, Cleeremans, Axel, McClelland, James L.
This tendency to preserve information about the path is not a characteristic of traditional finite-state automata.
ENCODING PATH INFORMATION
In a different set of experiments, we asked whether the SRN could learn to use the information about the path that is encoded in the hidden units' patterns of activation. In one of these experiments, we tested whether the network could master length constraints. When strings generated from the small finite-state grammar are limited to a maximum of 8 letters, the prediction following the presentation of the same letter in position number six or seven may be different. For example, following the sequence 'TSSSXXV', 'V' is the seventh letter and only another 'V' would be a legal successor.
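A small generator makes the length-constraint manipulation concrete. The transition table below is a Reber-style grammar consistent with the 'TSSSXXV' example; the excerpt does not spell out the exact grammar, so treat the table as an assumption. Rejection sampling enforces the 8-letter cap, under which the legal successors of a letter come to depend on its position in the string.

```python
import random

# Reber-style finite-state grammar (an assumption; see lead-in). Each
# state maps to its available (symbol, next_state) transitions; a state
# with no transitions is accepting.
GRAMMAR = {
    0: [("T", 1), ("P", 2)],
    1: [("S", 1), ("X", 3)],
    2: [("T", 2), ("V", 4)],
    3: [("X", 2), ("S", 5)],
    4: [("P", 3), ("V", 5)],
    5: [],  # accepting state
}

def generate(max_len=8):
    """Sample a legal string, rejecting any longer than max_len letters."""
    while True:
        state, out = 0, []
        while GRAMMAR[state]:
            sym, state = random.choice(GRAMMAR[state])
            out.append(sym)
        if len(out) <= max_len:
            return "".join(out)
```

Tracing 'TSSSXXV' through this table ends in state 4, where 'P' or 'V' are grammatical; but 'P' leads to a state that needs at least one more letter, pushing the string past eight letters, so under the cap only 'V' is a legal eighth letter, exactly as the excerpt notes.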