Learning Graphical Models



Performance of Synthetic Neural Network Classification of Noisy Radar Signals

Neural Information Processing Systems

This study evaluates the performance of the multilayer perceptron and the frequency-sensitive competitive learning network in identifying five commercial aircraft from radar backscatter measurements. The performance of the neural network classifiers is compared with that of the nearest-neighbor and maximum-likelihood classifiers. Our results indicate that, for this problem, the neural network classifiers are relatively insensitive to changes in the network topology and to the noise level in the training data. While the traditional algorithms outperform these simple neural classifiers on this problem, we feel that neural networks show the potential for improved performance.
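A minimal sketch of this kind of classifier comparison, using synthetic feature vectors in place of the radar backscatter data; the class count mirrors the five-aircraft setup, but the feature dimension, noise level, network size, and use of scikit-learn are illustrative assumptions, not the paper's experiment.

```python
# Hedged sketch: compare a small multilayer perceptron against a
# nearest-neighbor classifier on synthetic "radar-like" feature vectors.
# The five classes stand in for the five aircraft; all sizes and noise
# levels are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_classes, n_features, n_per_class = 5, 16, 200

# Each class is a fixed "signature" vector corrupted by additive noise.
signatures = rng.normal(size=(n_classes, n_features))
X = np.vstack([sig + 0.5 * rng.normal(size=(n_per_class, n_features))
               for sig in signatures])
y = np.repeat(np.arange(n_classes), n_per_class)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
knn = KNeighborsClassifier(n_neighbors=1)  # nearest-neighbor baseline

for name, clf in [("MLP", mlp), ("1-NN", knn)]:
    clf.fit(X_tr, y_tr)
    print(name, "test accuracy:", clf.score(X_te, y_te))
```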



Optimization by Mean Field Annealing

Neural Information Processing Systems

Nearly optimal solutions to many combinatorial problems can be found using stochastic simulated annealing. This paper extends the concept of simulated annealing from its original formulation as a Markov process to a new formulation based on mean field theory. Mean field annealing (MFA) essentially replaces the discrete degrees of freedom in simulated annealing with their average values as computed by the mean field approximation. The net result is that equilibrium at a given temperature is achieved one to two orders of magnitude faster than with simulated annealing. A general framework for the mean field annealing algorithm is derived, and its relationship to Hopfield networks is shown. The behavior of MFA is examined both analytically and experimentally for a generic combinatorial optimization problem: graph bipartitioning. This analysis indicates the presence of critical temperatures which could be important in improving the performance of neural networks.
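A minimal sketch of mean field annealing applied to graph bipartitioning, the benchmark named above: each discrete spin is replaced by its mean value, updated by a tanh mean-field equation while the temperature is lowered. The coupling matrix, balance penalty alpha, and cooling schedule below are standard choices assumed for illustration, not the paper's exact formulation.

```python
# Hedged sketch of mean field annealing for graph bipartitioning.
# Spins s_i in {-1,+1} assign node i to one of two parts; the energy
# penalizes cut edges plus an imbalance term.  Each discrete spin is
# replaced by its mean value m_i, updated by the mean-field equation
#   m_i = tanh( (1/T) * sum_j J_ij * m_j ),
# while the temperature T is lowered.  The coupling matrix, penalty
# weight, and cooling schedule here are illustrative assumptions.
import numpy as np

def mean_field_bipartition(adj, alpha=0.5, t_start=2.0, t_end=0.05,
                           cool=0.9, sweeps=50, seed=0):
    n = adj.shape[0]
    # J rewards keeping adjacent nodes together and penalizes imbalance.
    J = adj - alpha * (1 - np.eye(n))
    rng = np.random.default_rng(seed)
    m = 0.01 * rng.standard_normal(n)      # mean "spins", near zero at high T
    T = t_start
    while T > t_end:
        for _ in range(sweeps):
            for i in range(n):             # asynchronous mean-field updates
                m[i] = np.tanh(J[i] @ m / T)
        T *= cool                          # geometric cooling schedule
    return np.sign(m)                      # harden means into a partition

# Two 4-node cliques joined by a single edge: the cut should fall on it.
A = np.zeros((8, 8))
A[:4, :4] = 1; A[4:, 4:] = 1; A[3, 4] = A[4, 3] = 1
np.fill_diagonal(A, 0)
print(mean_field_bipartition(A))
```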


The Boltzmann Perceptron Network: A Multi-Layered Feed-Forward Network Equivalent to the Boltzmann Machine

Neural Information Processing Systems

The concept of the stochastic Boltzmann machine (BM) is attractive for decision making and pattern classification purposes since the probability of attaining the network states is a function of the network energy. Hence, the probability of attaining particular energy minima may be associated with the probabilities of making certain decisions (or classifications). However, because of its stochastic nature, the complexity of the BM is fairly high, and therefore such networks are not very likely to be used in practice. In this paper we suggest a way to alleviate this drawback by converting the stochastic BM into a deterministic network which we call the Boltzmann Perceptron Network (BPN). The BPN is functionally equivalent to the BM but has a feed-forward structure and low complexity.
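A minimal sketch of the general idea of computing Boltzmann-machine class probabilities deterministically in a single feed-forward pass. The connectivity here is restricted so the sum over hidden states has a closed form; this simplification, and all weights and sizes, are assumptions for illustration rather than the BPN architecture itself.

```python
# Hedged sketch of a deterministic, feed-forward computation of
# Boltzmann-machine class probabilities.  Connectivity is restricted
# (inputs -> hidden, class -> hidden, no hidden-hidden links) so the sum
# over hidden states has a closed form; this simplification is an
# assumption, not the exact BPN architecture of the paper.
import numpy as np

def softplus(z):
    return np.logaddexp(0.0, z)            # log(1 + exp(z)), numerically stable

def class_probabilities(x, W, U, b):
    """P(class k | x) for a Boltzmann machine with energy
       E(x, k, h) = -x.W.h - U[k].h - b[k], h in {0,1}^H,
       computed deterministically by marginalizing the hidden units."""
    # log P(k|x) = b[k] + sum_h softplus(x.W[:,h] + U[k,h]) + const
    scores = b + softplus(x @ W + U).sum(axis=1)
    scores -= scores.max()                  # stabilize the softmax
    p = np.exp(scores)
    return p / p.sum()

rng = np.random.default_rng(0)
D, H, K = 8, 6, 3                           # input, hidden, class sizes (assumed)
W, U, b = rng.normal(size=(D, H)), rng.normal(size=(K, H)), rng.normal(size=K)
x = rng.normal(size=D)
print(class_probabilities(x, W, U, b))      # sums to 1, no stochastic sampling
```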



HUGIN: A shell for building Bayesian belief universes for expert systems

Classics

Causal probabilistic networks have proved to be a useful knowledge representation tool for modelling domains where causal relations in a broad sense are a natural way of relating domain objects and where uncertainty is inherent in these relations. This paper outlines an implementation, the HUGIN shell, for handling a domain model expressed by a causal probabilistic network. The only topological restriction imposed on the network is that it must not contain any directed loops. The approach is illustrated step by step by solving a genetic breeding problem. A graph representation of the domain model is interactively created by using instances of the basic network components (nodes and arcs) as building blocks. This structure, together with the quantitative relations between nodes and their immediate causes expressed as conditional probabilities, is automatically transformed into a tree structure, a junction tree. Here a computationally efficient and conceptually simple algebra of Bayesian belief universes supports incorporation of new evidence, propagation of information, and calculation of revised beliefs in the states of the nodes in the network. Finally, as an example of a real-world application, MUNIN, an expert system for electromyography, is discussed. IJCAI-89, Vol. 2, pp. 1080–1085.
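A minimal sketch of belief revision in a tiny causal probabilistic network, done here by brute-force enumeration of the joint distribution rather than the junction-tree algebra HUGIN uses; the rain/sprinkler network and its probabilities are illustrative assumptions, not the paper's genetic breeding example.

```python
# Hedged sketch: exact posterior in a tiny causal probabilistic network by
# brute-force enumeration of the joint distribution.  HUGIN instead compiles
# the network into a junction tree and propagates beliefs there, which scales
# far better; the network and numbers below are illustrative assumptions.
from itertools import product

# Network: Rain and Sprinkler are independent parents of WetGrass.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.0,
}

def joint(r, s, w):
    pw = P_wet[(r, s)] if w else 1.0 - P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[s] * pw

# Revised belief P(Rain | WetGrass=True), summing the joint over Sprinkler.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print("P(Rain=True | WetGrass=True) =", num / den)
```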