On the equivalence of molecular graph convolution and molecular wave function with poor basis set

Neural Information Processing Systems

In this study, we demonstrate that the linear combination of atomic orbitals (LCAO), an approximation introduced by Pauling and Lennard-Jones in the 1920s, corresponds to graph convolutional networks (GCNs) for molecules. However, GCNs involve unnecessary nonlinearity and depth. We also verify that molecular GCNs are based on a poor basis function set compared with the standard one used in theoretical calculations or quantum chemical simulations. From these observations, we describe the quantum deep field (QDF), a machine learning (ML) model grounded in underlying quantum physics, in particular density functional theory (DFT). We believe that the QDF model is easy to understand because it can be regarded as a single-linear-layer GCN. Moreover, it uses two vanilla feedforward neural networks to learn an energy functional and a Hohenberg-Kohn map, which have nonlinearities inherent in quantum physics and DFT. For molecular energy prediction tasks, we demonstrated the viability of "extrapolation": we trained a QDF model with small molecules, tested it with large molecules, and achieved high extrapolation performance. We believe that we should move away from competing on interpolation accuracy within benchmark datasets and instead evaluate physics-based ML models in an extrapolation setting; this will lead to reliable and practical applications, such as fast, large-scale molecular screening for discovering effective materials.
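To make the LCAO picture the abstract invokes concrete, here is a minimal sketch (not the paper's implementation; the Gaussian basis and all names are illustrative assumptions): a molecular orbital is a single linear combination of atomic basis functions, i.e. one linear layer with no nonlinearity.

```python
import numpy as np

def gaussian_basis(r, center, alpha):
    """Spherical Gaussian-type orbital exp(-alpha * |r - center|^2)."""
    d = np.asarray(r, dtype=float) - np.asarray(center, dtype=float)
    return np.exp(-alpha * np.dot(d, d))

def lcao_orbital(r, centers, alphas, coeffs):
    """psi(r) = sum_i c_i * phi_i(r): a single linear layer over basis values."""
    phi = np.array([gaussian_basis(r, c, a) for c, a in zip(centers, alphas)])
    return float(np.asarray(coeffs) @ phi)

# Two "atoms" on the x-axis; equal coefficients give a bonding-like combination.
centers = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
alphas = [1.0, 1.0]
coeffs = [1.0, 1.0]
print(lcao_orbital((0.5, 0.0, 0.0), centers, alphas, coeffs))
```

The point of the sketch is structural: evaluating basis functions and mixing them with learned coefficients is exactly one linear layer, which is why the abstract can read QDF as a single-linear-layer GCN.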


We thank the reviewers for the thoughtful feedback in these difficult times caused by the global COVID-19 pandemic

Neural Information Processing Systems

We thank the reviewers for their thoughtful feedback in these difficult times caused by the global COVID-19 pandemic. When QM9 is used for training, the model must be based on LCAO; with this design, QDF achieved high extrapolation performance. We emphasize that even this LDA-like Hohenberg-Kohn (HK) map achieved high extrapolation performance. We will address this in future work. Of course, QDF can be proposed without a comparison to GCNs.


Quadratic Direct Forecast for Training Multi-Step Time-Series Forecast Models

Wang, Hao, Pan, Licheng, Lu, Yuan, Chen, Zhichao, Liu, Tianqiao, He, Shuting, Chu, Zhixuan, Wen, Qingsong, Li, Haoxuan, Lin, Zhouchen

arXiv.org Machine Learning

The design of the training objective is central to training time-series forecasting models. Existing training objectives such as mean squared error mostly treat each future step as an independent, equally weighted task, which we found leads to two issues: (1) they overlook the label autocorrelation among future steps, biasing the training objective; and (2) they fail to set heterogeneous task weights for the forecasting tasks corresponding to different future steps, limiting forecasting performance. To fill this gap, we propose a novel quadratic-form weighted training objective that addresses both issues simultaneously. Specifically, the off-diagonal elements of the weighting matrix account for the label autocorrelation effect, whereas the non-uniform diagonal elements are expected to match the preferred weights of the forecasting tasks at varying future steps. To achieve this, we propose a Quadratic Direct Forecast (QDF) learning algorithm, which trains the forecast model using an adaptively updated quadratic-form weighting matrix. Experiments show that QDF effectively improves the performance of various forecast models, achieving state-of-the-art results. Code is available at https://anonymous.4open.science/r/QDF-8937.
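A hedged sketch of the quadratic-form objective the abstract describes (not the authors' code; the function and matrix below are illustrative assumptions): the per-step forecast errors e enter the loss as e^T W e, where the diagonal of W weights individual horizons and the off-diagonal entries couple errors at different future steps, modeling label autocorrelation.

```python
import numpy as np

def quadratic_form_loss(y_true, y_pred, W):
    """Loss = e^T W e averaged over the batch, with e = y_true - y_pred."""
    e = np.asarray(y_true) - np.asarray(y_pred)   # shape (batch, horizon)
    return float(np.mean(np.einsum('bi,ij,bj->b', e, W, e)))

horizon = 3
y_true = np.array([[1.0, 2.0, 3.0]])
y_pred = np.array([[1.5, 1.5, 2.0]])  # errors e = [-0.5, 0.5, 1.0]

# With W = I this reduces to the horizon-summed squared error.
identity_loss = quadratic_form_loss(y_true, y_pred, np.eye(horizon))

# Positive off-diagonals additionally penalize same-sign errors that
# co-occur at neighboring steps (a stand-in for label autocorrelation).
W = np.eye(horizon) + 0.5 * (np.eye(horizon, k=1) + np.eye(horizon, k=-1))
coupled_loss = quadratic_form_loss(y_true, y_pred, W)
print(identity_loss, coupled_loss)
```

In the paper's algorithm the weighting matrix is updated adaptively during training; the fixed W here only illustrates how the quadratic form differs from an independent per-step MSE.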

