Towards an Organizing Principle for a Layered Perceptual Network
This principle of "maximum information preservation" states that the signal transformation to be realized at each stage is the one that maximizes the information that the output signal values (from that stage) convey about the input signal values (to that stage), subject to certain constraints and in the presence of processing noise. The quantity being maximized is a Shannon information rate. I provide motivation for this principle and, for some simple model cases, derive some of its consequences, discuss an algorithmic implementation, and show how the principle may lead to biologically relevant neural architectural features such as topographic maps, map distortions, orientation selectivity, and extraction of spatial and temporal signal correlations. A possible connection between this information-theoretic principle and a principle of minimum entropy production in nonequilibrium thermodynamics is suggested.

Introduction

This paper describes some properties of a proposed information-theoretic organizing principle for the development of a layered perceptual network. The purpose of this paper is to provide an intuitive and qualitative understanding of how the principle leads to specific feature-analyzing properties and signal transformations in some simple model cases. More detailed analysis is required in order to apply the principle to cases involving more realistic patterns of signaling activity as well as specific constraints on network connectivity. This section gives a brief summary of the results that motivated the formulation of the organizing principle, which I call the principle of "maximum information preservation." In later sections the principle is stated and its consequences studied.
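To make the maximized quantity concrete, the following is a minimal numerical sketch, not taken from the paper, of the Shannon information rate for an assumed linear-Gaussian model: the output is a linear transform of Gaussian inputs plus independent Gaussian processing noise. The variable names, the unit-norm constraint on connection strengths, and the finite-difference gradient ascent are illustrative assumptions standing in for the paper's constraints and algorithmic implementation.

```python
import numpy as np

def info_rate(W, Q_in, noise_var):
    """I(output; input) in nats for Y = W @ X + n,
    with X ~ N(0, Q_in) and n ~ N(0, noise_var * I)."""
    k = W.shape[0]
    cov_y = W @ Q_in @ W.T + noise_var * np.eye(k)  # output covariance
    _, logdet_y = np.linalg.slogdet(cov_y)
    logdet_n = k * np.log(noise_var)                # log det of noise covariance
    return 0.5 * (logdet_y - logdet_n)

# Example: 4 correlated input cells feeding 2 output cells.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
Q_in = A @ A.T / 4.0                   # positive-definite input covariance
W = rng.normal(size=(2, 4)) * 0.1      # initial connection strengths
noise_var = 0.5

# Gradient ascent on the information rate, with unit-norm rows of W
# serving as a stand-in constraint.
for _ in range(200):
    eps = 1e-4
    base = info_rate(W, Q_in, noise_var)
    grad = np.zeros_like(W)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            Wp = W.copy()
            Wp[i, j] += eps
            grad[i, j] = (info_rate(Wp, Q_in, noise_var) - base) / eps
    W += 0.05 * grad
    W /= np.linalg.norm(W, axis=1, keepdims=True)

print("information rate (nats):", info_rate(W, Q_in, noise_var))
```

Under these assumptions the maximizing solution aligns the rows of W with the dominant correlational structure of the inputs, which is the sense in which the principle extracts spatial signal correlations in the simple model cases discussed in the paper.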
Neural Information Processing Systems
Dec-31-1988