Stochastic Descent Analysis of Representation Learning Algorithms
Golden, Richard M.
Although stochastic approximation learning methods have been widely used in the machine learning literature for over 50 years, formal theoretical analyses of specific machine learning algorithms are less common because stochastic approximation theorems typically possess assumptions which are difficult to communicate and verify. This paper presents a new stochastic approximation theorem for state-dependent noise with easily verifiable assumptions applicable to the analysis and design of important deep learning algorithms including: adaptive learning, contrastive divergence learning, stochastic descent expectation maximization, and active learning.
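The stochastic approximation setting the abstract describes can be illustrated with a minimal Robbins-Monro sketch: iterates follow noisy gradient samples with step sizes that decay slowly enough to keep moving but fast enough to average out noise. This is an illustrative toy, not the paper's theorem or algorithm; the objective, step-size schedule, and function names here are all assumptions chosen for the example.

```python
import random

def stochastic_descent(grad_sample, theta0, steps=5000, a=1.0, b=10.0):
    """Robbins-Monro iteration theta <- theta - gamma_t * g_t with
    gamma_t = a/(t+b), so sum(gamma_t) diverges and sum(gamma_t^2) converges."""
    theta = theta0
    for t in range(steps):
        gamma = a / (t + b)
        theta -= gamma * grad_sample(theta)
    return theta

# Toy objective: minimize E[(theta - X)^2]/2 with X ~ Uniform(3, 5),
# whose minimizer is E[X] = 4.  Each gradient sample is unbiased but noisy.
random.seed(0)
def noisy_grad(theta):
    return theta - random.uniform(3.0, 5.0)

est = stochastic_descent(noisy_grad, theta0=0.0)
```

Despite never seeing an exact gradient, the iterate settles near the true minimizer 4 because the decaying step sizes average the state-dependent noise away.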
Generative Modeling of Hidden Functional Brain Networks
Nandy, Shaurabh, Golden, Richard M.
Functional connectivity refers to the temporal statistical relationship between spatially distinct brain regions and is usually inferred from the time series coherence/correlation in brain activity between regions of interest. In human functional brain networks, the network structure is often inferred from functional magnetic resonance imaging (fMRI) blood oxygen level dependent (BOLD) signal. Since the BOLD signal is a proxy for neuronal activity, it is of interest to learn the latent functional network structure. Additionally, despite a core set of observations about functional networks such as small-worldness, modularity, exponentially truncated degree distributions, and presence of various types of hubs, very little is known about the computational principles which can give rise to these observations. This paper introduces a Hidden Markov Random Field framework for the purpose of representing, estimating, and evaluating latent neuronal functional relationships between different brain regions using fMRI data.
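The opening definition of functional connectivity, as temporal correlation between regional signals, can be sketched concretely: compute pairwise Pearson correlations between synthetic BOLD-like time series and keep an edge wherever the correlation magnitude clears a threshold. This is the standard correlation-based network construction the abstract contrasts with its latent-network approach, not the paper's Hidden Markov Random Field model; the signals, threshold, and function names are illustrative assumptions.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def connectivity_graph(series, threshold=0.5):
    """Edge between regions i and j when |corr(i, j)| exceeds the threshold."""
    n = len(series)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(pearson(series[i], series[j])) > threshold}

# Synthetic BOLD-like signals: regions 0 and 1 share a common driving signal,
# region 2 is independent noise.
random.seed(1)
drive = [random.gauss(0, 1) for _ in range(200)]
series = [
    [d + random.gauss(0, 0.3) for d in drive],
    [d + random.gauss(0, 0.3) for d in drive],
    [random.gauss(0, 1) for _ in range(200)],
]
edges = connectivity_graph(series)
```

The recovered graph links only the two co-driven regions, which is exactly the surface-level "observed" network; the paper's point is to instead infer the latent neuronal relationships that generate such signals.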
Clustering Documents Along Multiple Dimensions
Dasgupta, Sajib (IBM Almaden Research Center) | Golden, Richard M. (University of Texas at Dallas) | Ng, Vincent (University of Texas at Dallas)
Traditional clustering algorithms are designed to search for a single clustering solution despite the fact that multiple alternative solutions might exist for a particular dataset. For example, a set of news articles might be clustered by topic or by the author's gender or age. Similarly, book reviews might be clustered by sentiment or comprehensiveness. In this paper, we address the problem of identifying alternative clustering solutions by developing a Probabilistic Multi-Clustering (PMC) model that discovers multiple, maximally different clusterings of a data sample. Empirical results on six datasets representative of real-world applications show that our PMC model exhibits superior performance to comparable multi-clustering algorithms.
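The idea of multiple, maximally different clusterings of the same data can be shown with a toy example: documents scored along two dimensions (say topic and sentiment) admit one valid partition per dimension, and a pair-counting disagreement measure quantifies how different the two partitions are. This sketch only illustrates the problem statement; it is not the PMC model, and the data, threshold clustering, and disagreement measure are assumptions made for the example.

```python
# Toy data: each document as (topic_score, sentiment_score).  Clustering along
# either axis yields a valid but very different partition of the same items.
docs = [(0.1, 0.9), (0.2, 0.1), (0.9, 0.8), (0.8, 0.2)]

def cluster_along(points, dim, split=0.5):
    """Partition items by thresholding one feature dimension."""
    return tuple(int(p[dim] > split) for p in points)

by_topic = cluster_along(docs, dim=0)
by_sentiment = cluster_along(docs, dim=1)

def disagreement(c1, c2):
    """Fraction of item pairs grouped together in one clustering but apart
    in the other (0 = identical partitions, higher = more different)."""
    n = len(c1)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    diff = sum((c1[i] == c1[j]) != (c2[i] == c2[j]) for i, j in pairs)
    return diff / len(pairs)
```

Here the topic-based and sentiment-based partitions disagree on most item pairs, which is the kind of "maximally different" pair of solutions a multi-clustering model is designed to discover jointly rather than by ad hoc per-feature splits.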
Kirchoff Law Markov Fields for Analog Circuit Design
Golden, Richard M.
Three contributions to developing an algorithm for assisting engineers in designing analog circuits are provided in this paper. First, a method for representing highly nonlinear and noncontinuous analog circuits using Kirchoff current law potential functions within the context of a Markov field is described. Second, a relatively efficient algorithm for optimizing the Markov field objective function is briefly described and the convergence proof is briefly sketched. And third, empirical results illustrating the strengths and limitations of the approach are provided within the context of a JFET transistor design problem. The proposed algorithm generated a set of circuit components for the JFET circuit model that accurately generated the desired characteristic curves.
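The notion of a Kirchhoff-current-law potential function can be sketched on a trivial circuit: define the potential as the squared net current into a node, so its minimum (zero) is attained exactly at the physically consistent node voltage. The paper optimizes over circuit components in a full Markov field; this toy only optimizes one node voltage of a two-resistor divider, and the circuit values, search routine, and names are illustrative assumptions.

```python
def kcl_residual(v, vs=10.0, r1=2.0, r2=3.0):
    """Net current into the node: source branch minus ground branch (amps).
    Circuit: vs -- r1 -- node(v) -- r2 -- ground."""
    return (vs - v) / r1 - v / r2

def kcl_potential(v):
    """Kirchhoff-current-law potential: squared node-current residual.
    Zero exactly when Kirchhoff's current law is satisfied."""
    return kcl_residual(v) ** 2

def minimize(f, lo=0.0, hi=10.0, iters=60):
    """Ternary search for the minimum of a unimodal function on [lo, hi]."""
    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

v_node = minimize(kcl_potential)
```

The minimizer recovers the voltage-divider solution v = 10 * 3/(2+3) = 6 V; encoding circuit laws as potentials in this way is what lets a Markov field treat even noncontinuous device characteristics as an energy to be minimized.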
Probabilistic Characterization of Neural Model Computations
Golden, Richard M.
This viewpoint allows the class of probability distributions, P, that the neural network can acquire to be explicitly specified. Learning algorithms for the neural network which search for the "most probable" member of P can then be designed. Statistical tests which decide whether the "true" or environmental probability distribution is in P can also be developed. Example applications of the theory to the highly nonlinear back-propagation learning algorithm, and to the networks of Hopfield and Anderson, are discussed.

INTRODUCTION

A connectionist system is a network of simple neuron-like computing elements which can store and retrieve information and, most importantly, make generalizations. Using terminology suggested by Rumelhart and McClelland [1], the computing elements of a connectionist system are called units, and each unit is associated with a real number indicating its activity level. The activity level of a given unit in the system can also influence the activity level of another unit. The degree of influence between two such units is often characterized by a parameter of the system known as a connection strength. During the information retrieval process, some subset of the units in the system are activated, and these units in turn activate neighboring units via the inter-unit connection strengths.
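The retrieval process described above, units activating neighbors through connection strengths, can be sketched as a simple settling computation: each unit's new activity is a squashed weighted sum of the other units' activities. This is a generic Hopfield-style update for illustration, not the paper's specific model; the weights, squashing function, and update schedule are assumptions made for the example.

```python
import math

def update_unit(activities, weights, i):
    """New activity of unit i: squash the weighted sum of the activities
    reaching it through its connection strengths."""
    net = sum(w * a for w, a in zip(weights[i], activities))
    return math.tanh(net)  # activity level as a real number in (-1, 1)

# Tiny two-unit network with mutually excitatory connection strengths.
weights = [[0.0, 2.0],
           [2.0, 0.0]]
acts = [0.1, 0.1]
for _ in range(20):  # settle by repeated asynchronous updates
    for i in range(len(acts)):
        acts[i] = update_unit(acts, weights, i)
```

Starting from weak activity, the mutual excitation drives both units toward a stable high-activity state (the fixed point of a = tanh(2a), about 0.96), which is the sense in which such a network "retrieves" a stored pattern from a partial cue.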