nn representation
Nearest Neighbor Representations of Neural Circuits
Kilic, Kordag Mehmet, Sima, Jin, Bruck, Jehoshua
Neural networks successfully capture the computational power of the human brain for many tasks. Similarly inspired by the brain's architecture, Nearest Neighbor (NN) representations are a novel approach to computation. We establish a firmer correspondence between NN representations and neural networks. Although it was known how to represent a single neuron by an NN representation, there were no results even for small-depth neural networks. Specifically, for depth-2 threshold circuits, we provide explicit constructions of their NN representations, with an explicit bound on the number of bits needed to represent them. Example functions include NN representations of convex polytopes (AND of threshold gates), IP2, OR of threshold gates, and linear or exact decision lists.
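To make the NN-representation model concrete, here is a minimal sketch (an illustrative example, not the paper's construction): a single threshold gate, MAJ3, is represented by two anchors, one per label, whose perpendicular-bisector boundary reproduces the gate's output on every binary input.
```python
# Minimal sketch: a 2-anchor Nearest Neighbor representation of the threshold
# gate MAJ3 (output 1 iff at least two of the three inputs are 1).
# The anchor choice is illustrative, not the paper's general construction.
from itertools import product
import numpy as np

anchors = np.array([[0.0, 0.0, 0.0],   # anchor with label 0
                    [1.0, 1.0, 1.0]])  # anchor with label 1
labels = np.array([0, 1])

def nn_label(x):
    # output = label of the nearest anchor under Euclidean distance
    d = np.sum((anchors - x) ** 2, axis=1)
    return labels[np.argmin(d)]

for x in product([0, 1], repeat=3):
    maj = int(sum(x) >= 2)
    assert nn_label(np.array(x, dtype=float)) == maj
print("the two anchors realize MAJ3 on all 8 binary inputs")
```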
On the Information Capacity of Nearest Neighbor Representations
Kilic, Kordag Mehmet, Sima, Jin, Bruck, Jehoshua
The $\textit{von Neumann Computer Architecture}$ has a distinction between computation and memory. In contrast, the brain has an integrated architecture where computation and memory are indistinguishable. Motivated by the architecture of the brain, we propose a model of $\textit{associative computation}$ where memory is defined by a set of vectors in $\mathbb{R}^n$ (that we call $\textit{anchors}$), computation is performed by convergence from an input vector to a nearest neighbor anchor, and the output is a label associated with an anchor. Specifically, in this paper, we study the representation of Boolean functions in the associative computation model, where the inputs are binary vectors and the corresponding outputs are the labels ($0$ or $1$) of the nearest neighbor anchors. The information capacity of a Boolean function in this model is associated with two quantities: $\textit{(i)}$ the number of anchors (called $\textit{Nearest Neighbor (NN) Complexity}$) and $\textit{(ii)}$ the maximal number of bits representing entries of anchors (called $\textit{Resolution}$). We study symmetric Boolean functions and present constructions that have optimal NN complexity and resolution.
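As a concrete illustration of the associative computation model (this anchor set is an example, not the paper's optimal construction): the symmetric function XOR on two bits cannot be realized with two anchors, because the two-anchor decision boundary is a single hyperplane and XOR is not a threshold function; three anchors with half-integer entries suffice.
```python
# Minimal sketch: a 3-anchor NN representation of XOR on two bits.
# Anchors are illustrative; they are not claimed to be the paper's construction.
from itertools import product
import numpy as np

anchors = np.array([[0.5, 0.5],   # label 0 (closest to inputs 00 and 11)
                    [1.0, 0.0],   # label 1
                    [0.0, 1.0]])  # label 1
labels = np.array([0, 1, 1])

def nn_label(x):
    d = np.sum((anchors - x) ** 2, axis=1)
    return labels[np.argmin(d)]

for x in product([0, 1], repeat=2):
    assert nn_label(np.array(x, dtype=float)) == (x[0] ^ x[1])
```
The anchor entries here are multiples of 1/2, so each entry needs only a couple of bits, which loosely corresponds to a low value of the resolution quantity defined above.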
Reconstructing spectral functions via automatic differentiation
Wang, Lingxiao, Shi, Shuzhe, Zhou, Kai
Reconstructing spectral functions from Euclidean Green's functions is an important inverse problem in many-body physics. However, the inversion is ill-posed for realistic systems with noisy Green's functions. In this Letter, we propose an automatic differentiation (AD) framework as a generic tool for spectral reconstruction from propagator observables. Exploiting the regularization provided by neural networks as a non-local smoothness regulator of the spectral function, we represent spectral functions by neural networks and use the propagator's reconstruction error to optimize the network parameters in an unsupervised manner. In the training process, aside from the positive-definite form of the spectral function, no other explicit physical priors are embedded in the neural networks. The reconstruction performance is assessed through relative entropy and mean squared error for two different network representations. Compared to the maximum entropy method, the AD framework achieves better performance in the large-noise regime. Notably, the freedom to introduce non-local regularization is an inherent advantage of the present framework and may lead to substantial improvements in solving inverse problems.
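A minimal sketch of the idea (the kernel, network size, and synthetic data below are assumptions, not the Letter's setup): parametrize a positive spectral function with a small network, map it to a Euclidean propagator through a kernel integral, and use automatic differentiation (here via JAX) to minimize the propagator reconstruction error.
```python
# Toy AD-based spectral reconstruction: rho(omega) is a small network with a
# softplus output (positive-definite), the propagator is a discretized kernel
# integral, and jax.grad optimizes the network parameters unsupervisedly.
import jax
import jax.numpy as jnp

omegas = jnp.linspace(0.0, 5.0, 128)          # frequency grid
taus = jnp.linspace(0.1, 3.0, 32)             # Euclidean time points
kernel = jnp.exp(-jnp.outer(taus, omegas))    # assumed Laplace-type kernel K(tau, omega)
d_omega = omegas[1] - omegas[0]

def rho(params, w):
    # tiny one-hidden-layer network; softplus keeps rho(omega) >= 0
    h = jnp.tanh(jnp.outer(w, params["w1"]) + params["b1"])
    return jax.nn.softplus(h @ params["w2"] + params["b2"]).squeeze()

def propagator(params):
    # G(tau) = sum_omega K(tau, omega) rho(omega) d_omega
    return kernel @ rho(params, omegas) * d_omega

def loss(params, g_obs):
    return jnp.mean((propagator(params) - g_obs) ** 2)

key = jax.random.PRNGKey(0)
params = {
    "w1": 0.1 * jax.random.normal(key, (16,)),
    "b1": jnp.zeros(16),
    "w2": 0.1 * jax.random.normal(key, (16, 1)),
    "b2": jnp.zeros(1),
}
# synthetic "observed" propagator built from a Gaussian spectral function
g_obs = kernel @ jnp.exp(-(omegas - 2.0) ** 2) * d_omega

grad_fn = jax.jit(jax.grad(loss))
for _ in range(2000):
    grads = grad_fn(params, g_obs)
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-2 * g, params, grads)
```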
Hybrid modeling: Applications in real-time diagnosis
Matei, Ion, de Kleer, Johan, Feldman, Alexander, Rai, Rahul, Chowdhury, Souma
Reduced-order models that accurately abstract high-fidelity models and enable faster simulation are vital for real-time, model-based diagnosis applications. In this paper, we outline a novel hybrid modeling approach that combines machine-learning-inspired models and physics-based models to generate reduced-order models from high-fidelity models; we use such models for real-time diagnosis applications. Specifically, we have developed machine-learning-inspired representations that generate reduced-order component models preserving, in part, the physical interpretation of the original high-fidelity component models. To ensure the accuracy, scalability, and numerical stability of the learning algorithms when training the reduced-order models, we use optimization platforms featuring automatic differentiation. Training data is generated by simulating the high-fidelity model. We showcase our approach in the context of fault diagnosis of a rail switch system, presenting three new model abstractions whose complexities are two orders of magnitude smaller than that of the high-fidelity model, in both the number of equations and the simulation time. The numerical experiments and results demonstrate the efficacy of the proposed hybrid modeling approach.
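A minimal sketch of this workflow under assumed toy models (a mass-spring-damper stands in for the high-fidelity component and a first-order lag for the reduced-order model; neither is from the paper): simulate the high-fidelity model to produce training data, then fit the reduced model's physically interpretable parameters by gradient descent using automatic differentiation.
```python
# Toy hybrid-modeling loop: high-fidelity data from a second-order
# mass-spring-damper, reduced-order first-order lag with interpretable
# parameters (gain K, time constant tau) fit via jax.grad.
import jax
import jax.numpy as jnp

dt, steps = 0.01, 500
u = jnp.ones(steps)                      # step input

def high_fidelity(u):
    # m x'' + c x' + k x = u, integrated with explicit Euler
    m, c, k = 1.0, 2.5, 4.0
    def step(state, ui):
        x, v = state
        a = (ui - c * v - k * x) / m
        return (x + dt * v, v + dt * a), x
    _, xs = jax.lax.scan(step, (jnp.array(0.0), jnp.array(0.0)), u)
    return xs

def reduced(params, u):
    # first-order lag: tau x' = K u - x
    def step(x, ui):
        x_next = x + dt * (params["K"] * ui - x) / params["tau"]
        return x_next, x_next
    _, xs = jax.lax.scan(step, jnp.array(0.0), u)
    return xs

def loss(params, u, target):
    return jnp.mean((reduced(params, u) - target) ** 2)

target = high_fidelity(u)                # training data from the high-fidelity model
params = {"K": 1.0, "tau": 1.0}
grad_fn = jax.jit(jax.grad(loss))
for _ in range(3000):
    grads = grad_fn(params, u, target)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g, params, grads)
print(params)                            # fitted gain and time constant
```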