Representation and Induction of Finite State Machines using Time-Delay Neural Networks
This work investigates the representational and inductive capabilities of time-delay neural networks (TDNNs) in general, and of two subclasses of TDNN, those with delays only on the inputs (IDNN), and those which include delays on hidden units (HDNN). Both architectures are capable of representing the same class of languages, the definite memory machine (DMM) languages, but the delays on the hidden units in the HDNN help it outperform the IDNN on problems composed of repeated features over short time windows.
Intrinsic and extrinsic deep learning on manifolds
Fang, Yihao, Ohn, Ilsang, Gupta, Vijay, Lin, Lizhen
We propose extrinsic and intrinsic deep neural network architectures as general frameworks for deep learning on manifolds. Specifically, extrinsic deep neural networks (eDNNs) preserve geometric features on manifolds by utilizing an equivariant embedding from the manifold to its image in Euclidean space. Intrinsic deep neural networks (iDNNs), in contrast, incorporate the underlying intrinsic geometry of manifolds via exponential and log maps with respect to a Riemannian structure. We prove that the empirical risk minimizers (ERMs) of eDNNs and iDNNs converge at optimal rates. Overall, the eDNN framework is simple and easy to compute, while the iDNN framework is accurate and fast converging. To demonstrate the utility of our frameworks, various simulation studies and real data analyses are presented with eDNNs and iDNNs.
- North America > United States > New York (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Europe > Netherlands > North Holland > Amsterdam (0.04)
- Health & Medicine > Therapeutic Area > Neurology (0.68)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology (0.46)
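The exponential and log maps the iDNN construction relies on can be made concrete on a simple example manifold. Below is a minimal NumPy sketch of both maps on the unit sphere (the sphere itself and the names `sphere_exp`/`sphere_log` are illustrative choices, not details from the paper):

```python
import numpy as np

def sphere_exp(p, v, eps=1e-12):
    """Riemannian exponential map on the unit sphere:
    follow the geodesic from base point p along tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < eps:
        return p.copy()
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def sphere_log(p, q, eps=1e-12):
    """Riemannian log map: the tangent vector at p pointing toward q
    whose length equals the geodesic distance from p to q."""
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)              # geodesic distance
    if theta < eps:
        return np.zeros_like(p)
    u = q - c * p                     # component of q orthogonal to p
    return theta * u / np.linalg.norm(u)

# Round trip: log followed by exp recovers the target point.
p = np.array([0.0, 0.0, 1.0])
q = np.array([1.0, 0.0, 0.0])
v = sphere_log(p, q)
q_back = sphere_exp(p, v)
```

An iDNN-style layer would apply such a log map to pull manifold-valued data into a tangent space where ordinary network operations are defined, and an exp map to push outputs back onto the manifold.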
Scale bridging materials physics: Active learning workflows and integrable deep neural networks for free energy function representations in alloys
Teichert, Gregory, Natarajan, Anirudh, Van der Ven, Anton, Garikipati, Krishna
The free energy plays a fundamental role in descriptions of many systems in continuum physics. Notably, in multiphysics applications, it encodes thermodynamic coupling between different fields. It thereby gives rise to driving forces on the dynamics of interaction between the constituent phenomena. In mechano-chemically interacting materials systems, even consideration of only compositions, order parameters and strains can render the free energy reasonably high-dimensional. In proposing the free energy as a paradigm for scale bridging, we have previously exploited neural networks for their representation of such high-dimensional functions. Specifically, we have developed an integrable deep neural network (IDNN) that can be trained on free energy derivative data obtained from atomic scale models and statistical mechanics, then analytically integrated to recover a free energy density function. The motivation comes from the statistical mechanics formalism, in which certain free energy derivatives are accessible for control of the system, rather than the free energy itself. Our current work combines the IDNN with an active learning workflow to improve sampling of the free energy derivative data in a high-dimensional input space. Treated as input-output maps, machine learning accommodates role reversals between independent and dependent quantities as the mathematical descriptions change with scale bridging. As a prototypical system we focus on Ni-Al. Phase field simulations using the resulting IDNN representation for the free energy density of Ni-Al demonstrate that the appropriate physics of the material have been learned. To the best of our knowledge, this represents the most complete treatment of scale bridging, using the free energy for a practical materials system, that starts with electronic structure calculations and proceeds through statistical mechanics to continuum physics.
- North America > United States > California (0.28)
- Europe > Italy (0.14)
- Workflow (0.73)
- Research Report (0.50)
- Government > Regional Government > North America Government > United States Government (0.68)
- Energy > Oil & Gas > Upstream (0.50)
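The core IDNN idea — fit a representation to free energy *derivative* data, then integrate that representation in closed form to recover the free energy itself — can be illustrated with a toy stand-in. The sketch below substitutes a polynomial for the neural network; the double-well free energy and all names are illustrative assumptions, not the paper's Ni-Al system:

```python
import numpy as np

# Toy free energy F(x) = x**4 - x**2 (a double well); its derivative
# mu(x) = 4*x**3 - 2*x plays the role of the sampled chemical potential,
# the quantity that is actually accessible in the statistical mechanics.
x = np.linspace(-1.2, 1.2, 201)
mu = 4 * x**3 - 2 * x

# Fit the *derivative* data with a polynomial -- a stand-in for the IDNN,
# i.e. a representation whose antiderivative is available in closed form.
coeffs = np.polyfit(x, mu, deg=5)        # highest power first

# Closed-form integration of the fitted representation recovers F up to
# an additive constant, mirroring how the trained IDNN is integrated.
F_coeffs = np.polyint(coeffs)
F_fit = np.polyval(F_coeffs, x)
F_true = x**4 - x**2
F_fit -= F_fit.mean() - F_true.mean()    # fix the integration constant
```

The active learning workflow in the paper then decides *where* in the high-dimensional composition/order-parameter space to request new derivative samples; the toy above only shows the fit-then-integrate step.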
Representation and Induction of Finite State Machines using Time-Delay Neural Networks
Clouse, Daniel S., Giles, C. Lee, Horne, Bill G., Cottrell, Garrison W.
This work investigates the representational and inductive capabilities of time-delay neural networks (TDNNs) in general, and of two subclasses of TDNN, those with delays only on the inputs (IDNN), and those which include delays on hidden units (HDNN). Both architectures are capable of representing the same class of languages, the definite memory machine (DMM) languages, but the delays on the hidden units in the HDNN help it outperform the IDNN on problems composed of repeated features over short time windows.
- North America > United States > New Jersey > Mercer County > Princeton (0.14)
- North America > United States > California > San Diego County > San Diego (0.05)
- North America > United States > California > San Diego County > La Jolla (0.05)
- (2 more...)
Representation and Induction of Finite State Machines using Time-Delay Neural Networks
Clouse, Daniel S., Giles, C. Lee, Horne, Bill G., Cottrell, Garrison W.
This work investigates the representational and inductive capabilities of time-delay neural networks (TDNNs) in general, and of two subclasses of TDNN, those with delays only on the inputs (IDNN), and those which include delays on hidden units (HDNN). Both architectures are capable of representing the same class of languages, the definite memory machine (DMM) languages, but the delays on the hidden units in the HDNN help it outperform the IDNN on problems composed of repeated features over short time windows.

1 Introduction

In this paper we consider the representational and inductive capabilities of time-delay neural networks (TDNN) [Waibel et al., 1989] [Lang et al., 1990], also known as NNFIR [Wan, 1993]. A TDNN is a feed-forward network in which the set of inputs to any node i may include the output from previous layers not only in the current time step t, but from d earlier time steps as well. The activation function for node i at time t in such a network is given by equation 1:

x_i(t) = f( sum_j sum_{k=0}^{d} w_{ijk} x_j(t - k) )    (1)

TDNNs have been used in speech recognition [Waibel et al., 1989] and time series prediction [Wan, 1993]. In this paper we concentrate on the language induction problem.
- North America > United States > California > San Diego County > San Diego (0.05)
- North America > United States > New Jersey > Mercer County > Princeton (0.05)
- North America > United States > California > San Diego County > La Jolla (0.05)
- (2 more...)
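The node activation described in the introduction above — a weighted sum over each input's values at times t, t-1, ..., t-d, passed through a nonlinearity — can be sketched directly. The sigmoid choice and the function name are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def tdnn_node(inputs, weights, bias=0.0):
    """Activation of a single TDNN node at time t.

    inputs  : array of shape (n_inputs, d+1) -- input j's values at
              times t, t-1, ..., t-d (one row per input line)
    weights : array of the same shape -- one weight per (input, delay)
    """
    net = np.sum(weights * inputs) + bias   # sum over inputs and delays
    return 1.0 / (1.0 + np.exp(-net))       # sigmoid nonlinearity

# Two input lines, delay window d = 2 (current step plus two earlier).
inputs = np.array([[1.0, 0.0, 1.0],
                   [0.0, 1.0, 0.0]])
weights = np.zeros_like(inputs)             # zero weights -> net input 0
out = tdnn_node(inputs, weights)            # sigmoid(0) = 0.5
```

An IDNN in this paper's sense restricts the delay taps to the input layer, while an HDNN also feeds delayed hidden-unit outputs into such nodes.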