Fault Diagnosis of Antenna Pointing Systems using Hybrid Neural Network and Signal Processing Models
Smyth, Padhraic, Mellstrom, Jeff
We describe in this paper a novel application of neural networks to system health monitoring of a large antenna for deep space communications. The paper outlines our approach to building a monitoring system using hybrid signal processing and neural network techniques, including autoregressive modelling, pattern recognition, and Hidden Markov models. We discuss several problems which are somewhat generic in applications of this kind; in particular, we address the problem of detecting classes which were not present in the training data. Experimental results indicate that the proposed system is sufficiently reliable for practical implementation.

1 Background: The Deep Space Network

The Deep Space Network (DSN), designed and operated by the Jet Propulsion Laboratory (JPL) for the National Aeronautics and Space Administration (NASA), is unique in providing end-to-end telecommunication capabilities between Earth and various interplanetary spacecraft throughout the solar system. The ground component of the DSN consists of three ground station complexes located in California, Spain, and Australia, giving full 24-hour coverage for deep space communications.
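The abstract mentions autoregressive (AR) modelling as one of the signal processing components. As a hedged illustration only (the paper's actual feature pipeline is not specified here), the sketch below fits AR coefficients to a windowed sensor signal by least squares; such coefficients are commonly used as a feature vector for a downstream fault classifier. The function name and the AR order are assumptions for illustration.

```python
import numpy as np

def ar_features(signal, order=3):
    """Estimate AR coefficients of a 1-D signal by least squares.

    Models x[t] ~ sum_k a[k] * x[t-k] for k = 1..order and returns the
    fitted coefficients a, which can serve as a feature vector for a
    pattern-recognition stage.
    """
    # Each column holds the signal delayed by lag k+1, aligned with y.
    X = np.column_stack([signal[order - k - 1 : len(signal) - k - 1]
                         for k in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs
```

For a signal generated by a true AR(1) process with coefficient 0.8, the lag-1 estimate recovers a value close to 0.8 given enough samples.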
A Topographic Product for the Optimization of Self-Organizing Feature Maps
Bauer, Hans-Ulrich, Pawelzik, Klaus, Geisel, Theo
Self-organizing feature maps like the Kohonen map (Kohonen, 1989; Ritter et al., 1990) not only provide a plausible explanation for the formation of maps in brains, e.g. in the visual system (Obermayer et al., 1990), but have also been applied to problems like vector quantization or robot arm control (Martinetz et al., 1990). The underlying organizing principle is the preservation of neighborhood relations. For this principle to lead to a most useful map, the topological structure of the output space must roughly fit the structure of the input data. However, in technical applications this structure is often not a priori known. For this reason several attempts have been made to modify the Kohonen algorithm such that not only the weights but also the output space topology itself is adapted during learning (Kangas et al., 1990; Martinetz et al., 1991). Our contribution is also concerned with optimal output space topologies, but we follow a different approach, which avoids a possibly complicated structure of the output space. First we describe a quantitative measure for the preservation of neighborhood relations in maps, the topographic product P. The topographic product was originally invented under the name "wavering product" in nonlinear dynamics, in order to optimize the embeddings of chaotic attractors (Liebert et al., 1991).
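The abstract does not reproduce the definition of P, so the following is a hedged sketch based on the commonly cited formulation of the topographic product: for each node, compare the orderings of its k-nearest neighbors in output space and in weight (input) space, and average the logarithms of the resulting distance ratios. Function and variable names are illustrative, not from the paper.

```python
import numpy as np

def topographic_product(weights, grid):
    """Sketch of the topographic product P.

    weights: (N, d) codebook vectors in input space V
    grid:    (N, q) node positions in output space A
    Values of P near 0 indicate that the output-space topology matches
    the structure of the mapped data; deviations signal a mismatch.
    """
    N = len(weights)
    dV = np.linalg.norm(weights[:, None] - weights[None], axis=-1)
    dA = np.linalg.norm(grid[:, None] - grid[None], axis=-1)
    # k-th nearest neighbours of each node, excluding the node itself.
    nnV = np.argsort(dV, axis=1)[:, 1:]
    nnA = np.argsort(dA, axis=1)[:, 1:]
    total = 0.0
    for j in range(N):
        logprod = 0.0
        for k in range(1, N):
            # Distance ratios between the k-th neighbour orderings
            # induced by the two spaces.
            q1 = dV[j, nnA[j, k - 1]] / dV[j, nnV[j, k - 1]]
            q2 = dA[j, nnA[j, k - 1]] / dA[j, nnV[j, k - 1]]
            logprod += np.log(q1 * q2)
            total += logprod / (2 * k)
    return total / (N * (N - 1))
```

For a one-dimensional chain of nodes whose weights lie evenly spaced on a line, both neighbor orderings agree and P evaluates to (numerically) zero, the topology-preserving ideal.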
Neural Computing with Small Weights
Siu, Kai-Yeung, Bruck, Jehoshua
An important issue in neural computation is the dynamic range of weights in the neural networks. Many experimental results on learning indicate that the weights in the networks can grow prohibitively large with the size of the inputs. Here we address this issue by studying the tradeoffs between the depth and the size of weights in polynomial-size networks of linear threshold elements (LTEs). We show that there is an efficient way of simulating a network of LTEs with large weights by a network of LTEs with small weights. To prove these results, we use tools from harmonic analysis of Boolean functions.
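To make the dynamic-range issue concrete, here is a small illustrative sketch (not from the paper) of a linear threshold element and of the classic COMPARISON function, which a single LTE can compute only with weights that grow exponentially in the input size; depth-versus-weight-size tradeoffs of the kind studied above ask when such large weights can be traded for extra layers.

```python
def lte(weights, threshold, x):
    """Linear threshold element: output 1 iff w . x >= threshold."""
    return int(sum(w * xi for w, xi in zip(weights, x)) >= threshold)

def comparison_lte(xbits, ybits):
    """COMPARISON(x, y) = 1 iff the n-bit number x >= y, as one LTE.

    The weights are +/- powers of two, so their magnitude grows
    exponentially with the input length n.
    """
    n = len(xbits)
    weights = [2 ** (n - 1 - i) for i in range(n)] + \
              [-(2 ** (n - 1 - i)) for i in range(n)]
    return lte(weights, 0, list(xbits) + list(ybits))
```

For example, with 3-bit inputs, comparing 5 with 3 yields 1 and comparing 2 with 4 yields 0.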
Benchmarking Feed-Forward Neural Networks: Models and Measures
Existing metrics for the learning performance of feed-forward neural networks do not provide a satisfactory basis for comparison because the choice of the training epoch limit can determine the results of the comparison. I propose new metrics which have the desirable property of being independent of the training epoch limit. The efficiency measures the yield of correct networks in proportion to the training effort expended. The optimal epoch limit provides the greatest efficiency. The learning performance is modelled statistically, and asymptotic performance is estimated. Implementation details may be found in (Harney, 1992).
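As a hedged reading of the efficiency metric described above (the precise definition is in Harney, 1992), the sketch below counts the yield of runs that produce a correct network within a given epoch limit, divided by the total training effort charged to all runs; scanning candidate limits then locates the limit with greatest efficiency. All names and conventions here are assumptions for illustration.

```python
def efficiency(epochs_to_success, epoch_limit):
    """Yield of correct networks per unit of training effort.

    epochs_to_success: one entry per training run; the epoch at which
    that run first produced a correct network, or None if it never did.
    A run counts as a success only if it finishes within epoch_limit;
    every run is charged min(its epochs, epoch_limit) epochs of effort.
    """
    successes = sum(1 for e in epochs_to_success
                    if e is not None and e <= epoch_limit)
    effort = sum(min(e, epoch_limit) if e is not None else epoch_limit
                 for e in epochs_to_success)
    return successes / effort

def optimal_epoch_limit(epochs_to_success, candidate_limits):
    """The candidate limit that maximizes efficiency over the runs."""
    return max(candidate_limits,
               key=lambda limit: efficiency(epochs_to_success, limit))
```

Note how the metric is independent of any single arbitrary epoch limit: the limit becomes a parameter to be optimized rather than a hidden choice that determines the comparison.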
Adaptive Synchronization of Neural and Physical Oscillators
Animal locomotion patterns are controlled by recurrent neural networks called central pattern generators (CPGs). Although a CPG can oscillate autonomously, its rhythm and phase must be well coordinated with the state of the physical system using sensory inputs. In this paper we propose a learning algorithm for synchronizing neural and physical oscillators with specific phase relationships. Sensory input connections are modified by the correlation between cellular activities and input signals. Simulations show that the learning rule can be used for setting sensory feedback connections to a CPG as well as coupling connections between CPGs.

1 CENTRAL AND SENSORY MECHANISMS IN LOCOMOTION CONTROL

Patterns of animal locomotion, such as walking, swimming, and flying, are generated by recurrent neural networks that are located in segmental ganglia of invertebrates and spinal cords of vertebrates (Barnes and Gladden, 1985).
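The phase-locking behavior that such a learning rule must establish can be illustrated with a minimal sketch, assuming a Kuramoto-style phase model (this shows entrainment through a fixed sensory coupling, not the paper's learning rule itself; all names and parameter values are hypothetical).

```python
import math

def entrain(omega_cpg, omega_body, coupling, steps=20000, dt=0.001):
    """Two coupled phase oscillators: a CPG and a physical rhythm.

    The CPG phase is pulled toward the body's phase through a sensory
    coupling term; when the coupling exceeds the frequency mismatch the
    two oscillators phase-lock.  Returns the final phase difference,
    wrapped to (-pi, pi].
    """
    theta_cpg, theta_body = 0.0, 1.0
    for _ in range(steps):
        theta_cpg += dt * (omega_cpg
                           + coupling * math.sin(theta_body - theta_cpg))
        theta_body += dt * omega_body
    diff = theta_body - theta_cpg
    return math.atan2(math.sin(diff), math.cos(diff))
```

With a frequency mismatch of 0.3 rad/s and unit coupling, the locked phase difference settles at arcsin(0.3), the fixed point where the coupling term exactly cancels the mismatch.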