
Supplementary Material: Appendices

Neural Information Processing Systems

Symplectic integrators are the numerical integrators that preserve this conservation law; hence, they can in a sense be considered as a discrete Hamiltonian system that is an approximation to the target Hamiltonian system. As shown above, a discrete gradient is defined in Definition 1. However, most of the existing discrete gradients require an explicit representation of the Hamiltonian; hence, they are not available for neural networks. An exception is the Ito-Abe method [24]. Hence, the proposed automatic discrete differentiation algorithm is indispensable for practical application of the discrete gradient method for neural networks. See also [17, 22]. The target equations for this study are the differential equations with a certain geometric structure. The typical examples of the manifolds with such a 2-tensor are the Riemannian manifold [4] and the symplectic manifold [29].
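The snippet refers to Definition 1 without restating it; the standard defining property of a discrete gradient is H(v) - H(u) = discrete-gradient(u, v) dotted with (v - u). As a rough illustration of why an Ito-Abe-style construction needs only evaluations of H (and therefore applies when H is a black-box neural network), here is a minimal NumPy sketch; the function name and the quadratic test Hamiltonian are illustrative assumptions, and this is not the paper's automatic discrete differentiation algorithm.

```python
import numpy as np

def itoh_abe_discrete_gradient(H, u, v):
    """Coordinate-wise (Itoh-Abe-type) discrete gradient of H between states u and v.

    Needs only evaluations of H (no symbolic derivatives), and satisfies the
    discrete chain rule  H(v) - H(u) = dg . (v - u)  exactly.
    """
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    dg = np.zeros_like(u)
    w = u.copy()                      # walk from u to v one coordinate at a time
    for i in range(len(u)):
        H_prev = H(w)
        w[i] = v[i]
        H_next = H(w)
        du = v[i] - u[i]
        # If a coordinate does not move, it contributes nothing to dg . (v - u),
        # so any finite value keeps the discrete chain rule exact; we use 0.
        dg[i] = (H_next - H_prev) / du if du != 0.0 else 0.0
    return dg

# Usage with a toy Hamiltonian H(q, p) = (q^2 + p^2) / 2 (ours, for illustration):
H = lambda x: 0.5 * float(np.sum(x ** 2))
u, v = np.array([1.0, 0.0]), np.array([0.9, 0.4])
dg = itoh_abe_discrete_gradient(H, u, v)
assert np.isclose(H(v) - H(u), dg @ (v - u))
```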


Supplementary Material: Appendices, A: Geometric Numerical Integration. Geometric numerical integration is a study on the numerical integrators of ODEs that preserve the ...

Neural Information Processing Systems

Due to this property, there should exist a corresponding Hamiltonian function, i.e., an energy function. As such, the discrete gradient method has achieved great success. As shown above, a discrete gradient is defined in Definition 1. The target equations for this study are the differential equations with a certain geometric structure. In local coordinates, this structure is expressed through a matrix A(u) as in Eq. (16); hence, Eq. (15) is shown to be equivalent to our target equation in Eq. (1). This section provides the proofs of the Theorems in the main text.
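For context, the discrete gradient method is usually stated as follows (our notation; the paper's Eqs. (1), (15), and (16) may differ in detail): for an equation du/dt = G(u) grad H(u) with G skew-symmetric or negative semi-definite, replacing the gradient by a discrete gradient reproduces the energy law exactly at the discrete level.

```latex
\[
  \frac{du}{dt} = G(u)\,\nabla H(u), \qquad
  H(v) - H(u) = \overline{\nabla} H(u,v)^{\top}(v - u),
\]
\[
  \frac{u^{(n+1)} - u^{(n)}}{\Delta t}
  = G\,\overline{\nabla} H\bigl(u^{(n)}, u^{(n+1)}\bigr)
  \;\Rightarrow\;
  H\bigl(u^{(n+1)}\bigr) - H\bigl(u^{(n)}\bigr)
  = \Delta t\,\overline{\nabla} H^{\top}\, G\,\overline{\nabla} H .
\]
```

The right-hand side vanishes when G is skew-symmetric (energy conservation) and is non-positive when G is negative semi-definite (dissipation).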


Molecular Geometry-aware Transformer for accurate 3D Atomic System modeling

Yuan, Zheng, Zhang, Yaoyun, Tan, Chuanqi, Wang, Wei, Huang, Fei, Huang, Songfang

arXiv.org Artificial Intelligence

Molecular dynamics simulations are important in computational physics, chemistry, materials science, and biology. Machine learning-based methods have shown strong abilities in predicting molecular energies and properties and are much faster than DFT calculations. Molecular energy is related at least to atoms, bonds, bond angles, torsion angles, and nonbonding atom pairs. Previous Transformer models use only atoms as inputs, which lacks explicit modeling of the aforementioned factors. To alleviate this limitation, we propose Moleformer, a novel Transformer architecture that takes nodes (atoms) and edges (bonds and nonbonding atom pairs) as inputs and models the interactions among them using a rotationally and translationally invariant geometry-aware spatial encoding. The proposed spatial encoding calculates relative position information, including distances and angles among nodes and edges. We benchmark Moleformer on the OC20 and QM9 datasets; our model achieves state of the art on the initial-state-to-relaxed-energy prediction task of OC20 and is very competitive on QM9 in predicting quantum chemical properties compared to other Transformer and Graph Neural Network (GNN) methods, which demonstrates the effectiveness of the proposed geometry-aware spatial encoding in Moleformer.
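The abstract describes a rotation- and translation-invariant spatial encoding built from distances and angles. As a rough sketch of the kind of invariant quantities involved (not Moleformer's actual encoding, which also covers node-edge and edge-edge relations), they can be computed from Cartesian coordinates as follows; the function name and the toy water geometry are illustrative assumptions.

```python
import numpy as np

def invariant_geometry_features(coords):
    """Rotation- and translation-invariant geometry from Cartesian coordinates.

    coords: (N, 3) array of atomic positions.
    Returns:
      dist  -- (N, N) pairwise distances
      angle -- (N, N, N), where angle[j, i, k] is the angle at atom j between
               the directions j->i and j->k, in radians.
    Both depend only on relative positions, so rotating or translating the
    whole system leaves them unchanged.
    """
    coords = np.asarray(coords, dtype=float)
    diff = coords[None, :, :] - coords[:, None, :]        # diff[j, i] = r_i - r_j
    dist = np.linalg.norm(diff, axis=-1)

    safe = np.where(dist[..., None] > 0.0, dist[..., None], 1.0)
    unit = diff / safe                                     # unit[j, i]: direction j -> i
    cos_angle = np.einsum('jid,jkd->jik', unit, unit)
    angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))       # entries with i == j or k == j are meaningless
    return dist, angle

# Usage: a rough H2O geometry (ours); angle[0, 1, 2] is the H-O-H angle at the oxygen.
water = np.array([[0.000, 0.000, 0.000],    # O
                  [0.957, 0.000, 0.000],    # H
                  [-0.240, 0.927, 0.000]])  # H
dist, angle = invariant_geometry_features(water)
print(np.degrees(angle[0, 1, 2]))           # ~104.5 degrees
```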


RedEye: Analog ConvNet Image Sensor Architecture for Continuous Mobile Vision – implementation –

#artificialintelligence

Continuous mobile vision is limited by the inability to efficiently capture image frames and process vision features. This is largely due to the energy burden of analog readout circuitry, data traffic, and intensive computation. To promote efficiency, we shift early vision processing into the analog domain. This results in RedEye, an analog convolutional image sensor that performs layers of a convolutional neural network in the analog domain before quantization. We design RedEye to mitigate analog design complexity, using a modular column-parallel design to promote physical design reuse and algorithmic cyclic reuse.
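As a back-of-the-envelope way to picture the pipeline described above (convolve in the analog domain, digitize only the resulting features), here is a toy numerical model; it is not RedEye's circuit or its column-parallel design, and the noise and ADC parameters are arbitrary assumptions.

```python
import numpy as np

def quantize(x, bits):
    """Uniform ADC model: clip to [0, 1] and round to 2**bits levels."""
    levels = 2 ** bits - 1
    return np.round(np.clip(x, 0.0, 1.0) * levels) / levels

def analog_conv_then_adc(image, kernel, adc_bits=4, noise_std=0.01, seed=0):
    """Toy model of analog-domain early vision: the convolution is computed
    'in analog' (exact arithmetic plus Gaussian circuit noise), and only the
    resulting feature map crosses a low-resolution ADC.
    Parameter values are arbitrary illustrations, not RedEye's specifications.
    """
    rng = np.random.default_rng(seed)
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):                    # valid (no-padding) sliding window
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    out += rng.normal(0.0, noise_std, out.shape)     # analog noise before readout
    return quantize(out, adc_bits)                   # digitize features, not pixels

# Usage: a 3x3 averaging kernel on a random 'scene' (both ours, for illustration).
image = np.random.default_rng(1).random((8, 8))
features = analog_conv_then_adc(image, np.full((3, 3), 1.0 / 9.0))
```

A conventional digital baseline would instead quantize every pixel at readout and convolve afterwards; the sketch only illustrates the ordering of operations that the abstract emphasizes.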