A Trellis-Structured Neural Network
Petsche, Thomas; Dickinson, Bradley W.
We have presented a locally interconnected network that minimizes a function analogous to the log-likelihood function near the global minimum. Simulation results demonstrate that the network decodes noiseless input sequences at least as well as the globally connected Hopfield-Tank [6] decomposition network, and they strongly support the conjecture that, in the noiseless case, the network is guaranteed to converge to the global minimum. For low error rates, the network can also decode noisy received sequences. We have applied the Cohen-Grossberg stability proof for "on-center off-surround" networks to show that each stage maximizes the desired local "likelihood" for noisy received sequences. We have also shown that, in the large-gain limit, the network as a whole is stable and that the equilibrium points correspond to the maximum-likelihood sequence estimation (MLSE) decoder output. Simulations confirm this stability even for relatively small gains. A proof of strict Lyapunov stability, however, is very difficult and may not be possible because of the cooperative connections in the network. This network demonstrates that interesting functions can be performed even when only localized connections are allowed, although with some loss of performance.
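The per-stage dynamics referred to above are of the on-center off-surround type covered by the Cohen-Grossberg theorem. As a minimal sketch only, with assumed parameter values and a generic sigmoid rather than the paper's exact equations, the following Python snippet simulates one such stage: each unit excites itself and inhibits its neighbors, so the stage settles into a winner-take-all equilibrium that selects the branch with the largest local "likelihood" input.

```python
import numpy as np

def wta_stage(inputs, gain=10.0, self_excite=1.2, inhibit=1.5,
              dt=0.01, steps=5000):
    """Simulate one on-center off-surround (winner-take-all) stage.

    Parameter names and values are hypothetical; the dynamics follow
    the generic continuous Hopfield / Cohen-Grossberg form, not
    necessarily the exact equations of the paper.
    """
    n = len(inputs)
    u = np.zeros(n)                            # internal unit states
    for _ in range(steps):
        v = 1.0 / (1.0 + np.exp(-gain * u))    # sigmoid outputs
        # on-center (self-excitation) minus off-surround (lateral inhibition)
        lateral = self_excite * v - inhibit * (v.sum() - v)
        u += dt * (-u + lateral + inputs)      # Euler integration step
    return 1.0 / (1.0 + np.exp(-gain * u))

# The stage should settle with the unit receiving the largest
# local "likelihood" input near 1 and all the others near 0.
branch_metrics = np.array([0.2, 0.9, 0.4, 0.1])
print(np.round(wta_stage(branch_metrics), 2))
```

In the large-gain limit the sigmoid approaches a hard threshold, so the equilibria become nearly binary; this mirrors the large-gain stability argument summarized above, under which the equilibrium points correspond to the MLSE decoder output.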
Neural Information Processing Systems
Dec-31-1988