
Collaborating Authors

 Tenorio, Manoel Fernando


Generalized Hopfield Networks and Nonlinear Optimization

Neural Information Processing Systems

A nonlinear neural framework, called the Generalized Hopfield network (GHN), is proposed, which is able to solve in a parallel, distributed manner systems of nonlinear equations. The method is applied to the general nonlinear optimization problem. We demonstrate GHNs implementing the three most important optimization algorithms, namely the Augmented Lagrangian, Generalized Reduced Gradient, and Successive Quadratic Programming methods. The study results in a dynamic view of the optimization problem and offers a straightforward model for the parallelization of the optimization computations, thus significantly extending the practical limits of problems that can be formulated as an optimization problem and that can gain from the introduction of nonlinearities in their structure (e.g. The ability of networks of highly interconnected simple nonlinear analog processors (neurons) to solve complicated optimization problems was demonstrated in a series of papers by Hopfield and Tank (Hopfield, 1984; Tank, 1986).
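
To make the network-dynamics view of constrained optimization concrete, the following minimal sketch integrates gradient-flow ("neural") dynamics on an augmented Lagrangian for a toy two-variable problem. The specific problem, penalty weight, and step size are illustrative assumptions and do not reproduce the GHN formulation in the paper.

```python
# Illustrative sketch only: gradient flow on an augmented Lagrangian,
# loosely in the spirit of network dynamics for constrained optimization.
import numpy as np

def augmented_lagrangian_flow(x0, lam0, steps=20000, dt=1e-3, rho=10.0):
    """Minimize f(x) = (x1-1)^2 + (x2-2)^2 subject to g(x) = x1 + x2 - 2 = 0
    by integrating gradient descent in x and gradient ascent in the multiplier."""
    x, lam = np.array(x0, dtype=float), float(lam0)
    for _ in range(steps):
        g = x[0] + x[1] - 2.0                             # constraint residual
        grad_f = np.array([2*(x[0]-1), 2*(x[1]-2)])       # gradient of the objective
        grad_g = np.array([1.0, 1.0])                     # gradient of the constraint
        # "Neuron" state update: descend the augmented Lagrangian in x ...
        x -= dt * (grad_f + lam*grad_g + rho*g*grad_g)
        # ... and ascend in the multiplier (dual) variable.
        lam += dt * rho * g
    return x, lam

x_star, lam_star = augmented_lagrangian_flow([0.0, 0.0], 0.0)
print(x_star)  # approaches (0.5, 1.5), the constrained minimizer
```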


The Cocktail Party Problem: Speech/Data Signal Separation Comparison between Backpropagation and SONN

Neural Information Processing Systems

This work introduces a new method, the Self Organizing Neural Network (SONN) algorithm, and compares its performance with Back Propagation (BP) in a signal separation application. The problem is to separate two signals, a modem data signal and a male speech signal, that have been added and transmitted through a 4 kHz channel. The signals are sampled at 8 kHz, and using supervised learning an attempt is made to reconstruct them. The SONN is an algorithm that constructs its own network topology during training; the resulting network is shown to be much smaller than the BP network, faster to train, and free from the trial-and-error network design that characterizes BP.

1. INTRODUCTION The research in Neural Networks has witnessed major changes in algorithm design focus, motivated by the limitations perceived in the algorithms available at the time. With the extensive work performed in the last few years using multilayered networks, it was soon discovered that these networks present limitations in tasks in which: (a) it is difficult to determine the problem complexity a priori, and thus to design a network of the correct size; (b) training not only takes prohibitively long, but also requires a large number of samples as well as fine parameter adjustment, without a guarantee of convergence; (c) such networks do not handle the system identification task efficiently for systems whose time-varying structure changes radically; and (d) the trained network is little more than a black box of weights and connections, revealing little about the problem structure: it is hard to justify the algorithm's weight choices or to explain the output decisions produced for a given input vector.
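
As a rough illustration of the supervised separation setup described above (not the SONN and not the exact BP configuration used in the paper), the sketch below trains a tiny one-hidden-layer backpropagation network to map a sliding window of an additively mixed signal back to one of its components. The synthetic "speech" and "data" surrogates, the window length, and the network size are assumptions.

```python
# Minimal supervised-separation sketch with plain backpropagation (numpy only).
import numpy as np

rng = np.random.default_rng(0)
fs, n = 8000, 4000                                    # 8 kHz sampling, 0.5 s of signal
t = np.arange(n) / fs
speech = np.sin(2*np.pi*300*t) * np.sin(2*np.pi*3*t)  # toy "speech" surrogate (assumed)
data = np.sign(np.sin(2*np.pi*600*t))                 # toy binary "modem" surrogate (assumed)
mixture = speech + data                               # additive mixing

win = 16                                              # sliding-window input length
X = np.stack([mixture[i:i+win] for i in range(n - win)])
y = speech[win:][:, None]                             # supervised target: recover speech

# One hidden tanh layer trained by full-batch gradient descent (plain backprop).
W1 = rng.normal(0, 0.1, (win, 20)); b1 = np.zeros(20)
W2 = rng.normal(0, 0.1, (20, 1));   b2 = np.zeros(1)
lr, m = 0.05, len(X)
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                          # hidden activations
    out = h @ W2 + b2                                 # linear output
    err = out - y                                     # error signal, shape (m, 1)
    gW2 = h.T @ err / m;  gb2 = err.mean(0)
    gh  = (err @ W2.T) * (1 - h**2)                   # backpropagated hidden gradient
    gW1 = X.T @ gh / m;   gb1 = gh.mean(0)
    W2 -= lr*gW2; b2 -= lr*gb2; W1 -= lr*gW1; b1 -= lr*gb1

pred = np.tanh(X @ W1 + b1) @ W2 + b2
print("reconstruction MSE:", float(np.mean((pred - y)**2)))
```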


Self Organizing Neural Networks for the Identification Problem

Neural Information Processing Systems

This work introduces a new method, the Self Organizing Neural Network (SONN) algorithm, and demonstrates its use in a system identification task. The algorithm constructs the network, chooses the neuron functions, and adjusts the weights. It is compared to the Back-Propagation algorithm in the identification of a chaotic time series. The results show that SONN constructs a simpler, more accurate model.
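
The sketch below only gestures at the identification setup described above: a chaotic series is generated, and a small search over candidate basis functions picks a compact one-step-ahead model, loosely in the constructive spirit of choosing neuron functions. The logistic-map data, the candidate pool, and the exhaustive pair search are assumptions for illustration and are not the algorithm in the paper.

```python
# Toy model-structure selection for one-step-ahead identification of a chaotic series.
from itertools import combinations
import numpy as np

# Chaotic series from the logistic map (stand-in for the paper's time series).
x = np.empty(500); x[0] = 0.3
for t in range(499):
    x[t+1] = 4.0 * x[t] * (1.0 - x[t])

u, y = x[:-1], x[1:]                      # identify y_t = f(u_t)
candidates = {                            # small pool of candidate "neuron" functions (assumed)
    "u":      u,
    "u^2":    u**2,
    "u^3":    u**3,
    "sin(u)": np.sin(u),
}

# Pick the pair of candidate terms (plus an intercept) with the smallest
# least-squares residual; the chosen structure is the identified model.
best_pair, best_sse = None, np.inf
for pair in combinations(candidates, 2):
    A = np.column_stack([np.ones_like(u)] + [candidates[k] for k in pair])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    sse = float(np.sum((A @ w - y)**2))
    if sse < best_sse:
        best_pair, best_sse = pair, sse

print("selected terms:", best_pair)       # ('u', 'u^2'): exact, since y = 4u - 4u^2
print("residual SSE:", best_sse)
```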


Using Neural Networks to Improve Cochlear Implant Speech Perception

Neural Information Processing Systems

An increasing number of profoundly deaf patients suffering from sensorineural deafness are using cochlear implants as prostheses. After the implant, sound can be detected through electrical stimulation of the remaining peripheral auditory nervous system. Although great progress has been achieved in this area, no useful speech recognition has been attained with either single- or multiple-channel cochlear implants. Coding evidence suggests that any implant that is to couple effectively with the natural speech perception system must simulate the temporal dispersion and other phenomena found in the natural receptors, which are currently not implemented in any cochlear implant. To this end, a computational model using artificial neural networks (ANN) is presented here to incorporate these natural phenomena in the artificial cochlea. The ANN model presents a series of advantages for the implementation of such systems.
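
As a toy, non-neural stand-in that only makes the notion of temporal dispersion concrete (the paper's ANN model is not reproduced here), the sketch below smears an impulsive input with a different causal exponential kernel per channel before it would drive each electrode. The channel count, the time constants, and the omission of band filtering are all assumptions.

```python
# Toy per-channel temporal dispersion of an impulsive input (numpy only).
import numpy as np

fs = 8000                                        # sampling rate (Hz)
t = np.arange(0, 0.05, 1/fs)
click = np.zeros_like(t); click[0] = 1.0         # impulsive input sound

# One dispersion kernel per (assumed, already band-filtered) channel; longer
# time constants for later channels are chosen purely for illustration.
taus = [0.001, 0.002, 0.004, 0.008]              # seconds
channels = []
for tau in taus:
    kernel = np.exp(-t / tau)
    kernel /= kernel.sum()                       # causal, unit-area exponential kernel
    channels.append(np.convolve(click, kernel)[:len(t)])

stimulus = np.stack(channels)                    # rows = dispersed electrode drive signals
print(stimulus.shape)                            # (4, 400)
```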

