A Appendix

We prove the result by induction. Suppose the formula holds for N; by the definition in Eq. 4 and the chain rule, it follows that the formula also holds for N + 1.

In this section, we give error bounds for the spline representation. In the present work, we focus on using splines to smooth noisy data. Following [51], we have the spline fitting error bounds given in Eq. 11.

Algorithm output: mean estimation θ.

A.4 Training Details

Additional training hyperparameters used in Sec. 4 are shown in Tab. 2.

Table 2: Training Details

We list additional discovery and UQ results in this section.
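The appendix's approach of fitting a smoothing spline to noisy measurements and then differentiating the fit analytically can be sketched as follows. This is a minimal illustration using SciPy's `UnivariateSpline` on synthetic data; the paper's actual basis choice and smoothing parameters may differ.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Noisy samples of x(t) = sin(t)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 200)
x_noisy = np.sin(t) + 0.05 * rng.standard_normal(t.size)

# Smoothing spline: s ~ n * sigma^2 balances fidelity against roughness
spline = UnivariateSpline(t, x_noisy, k=3, s=t.size * 0.05**2)

# Derivatives come analytically from the spline coefficients,
# avoiding noisy finite differences on the raw data
x_fit = spline(t)
dx_fit = spline.derivative()(t)
```

The analytically differentiated spline is what makes a library of candidate derivative terms computable from sparse, noisy samples.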
HiGPT: Heterogeneous Graph Language Model
Tang, Jiabin; Yang, Yuhao; Wei, Wei; Shi, Lei; Xia, Long; Yin, Dawei; Huang, Chao
Heterogeneous graph learning aims to capture complex relationships and diverse relational semantics among entities in a heterogeneous graph to obtain meaningful representations for nodes and edges. Recent advancements in heterogeneous graph neural networks (HGNNs) have achieved state-of-the-art performance by considering relation heterogeneity and using specialized message functions and aggregation rules. However, existing frameworks for heterogeneous graph learning have limitations in generalizing across diverse heterogeneous graph datasets. Most of these frameworks follow the "pre-train" and "fine-tune" paradigm on the same dataset, which restricts their capacity to adapt to new and unseen data. This raises the question: "Can we generalize heterogeneous graph models to be well-adapted to diverse downstream learning tasks with distribution shifts in both node token sets and relation type heterogeneity?" To tackle these challenges, we propose HiGPT, a general large graph model with a heterogeneous graph instruction-tuning paradigm. Our framework enables learning from arbitrary heterogeneous graphs without any fine-tuning on downstream datasets. To handle distribution shifts in heterogeneity, we introduce an in-context heterogeneous graph tokenizer that captures semantic relationships in different heterogeneous graphs, facilitating model adaptation. We incorporate a large corpus of heterogeneity-aware graph instructions into our HiGPT, enabling the model to effectively comprehend complex relation heterogeneity and distinguish between various types of graph tokens. Furthermore, we introduce the Mixture-of-Thought (MoT) instruction augmentation paradigm to mitigate data scarcity by generating diverse and informative instructions. Through comprehensive evaluations, our proposed framework demonstrates exceptional generalization performance.
- North America > United States > Texas (0.04)
- Asia > Myanmar > Tanintharyi Region > Dawei (0.04)
- North America > United States > New York > New York County > New York City (0.04)
- Media > Film (1.00)
- Leisure & Entertainment (1.00)
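As a rough illustration of the tokenization idea in the HiGPT abstract above, the sketch below maps nodes of different types, each with its own feature dimension, into one shared token space via per-type projections. All names here (`feat_dims`, `tokenize`) and the random projection matrices are hypothetical stand-ins: HiGPT's actual in-context tokenizer learns these mappings and conditions them on relation semantics rather than using fixed random weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Per-type feature dimensions differ across heterogeneous graphs
feat_dims = {"movie": 8, "director": 5, "actor": 6}
d_token = 16

# One projection per node type (random stand-ins for learned weights)
proj = {t: rng.standard_normal((d, d_token)) / np.sqrt(d)
        for t, d in feat_dims.items()}

def tokenize(node_feats):
    """Map type-specific node features into one shared token sequence."""
    return np.vstack([node_feats[t] @ proj[t] for t in node_feats])

tokens = tokenize({
    "movie": rng.standard_normal((4, 8)),
    "director": rng.standard_normal((2, 5)),
    "actor": rng.standard_normal((3, 6)),
})
# tokens.shape == (9, 16): all nodes now live in one token space
```

Keeping one projection per node type is what lets a single downstream model consume graphs whose node types carry incompatible raw feature spaces.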
Bayesian Spline Learning for Equation Discovery of Nonlinear Dynamics with Quantified Uncertainty
Sun, Luning; Huang, Daniel Zhengyu; Sun, Hao; Wang, Jian-Xun
Nonlinear dynamics are ubiquitous in science and engineering applications, but the physics of most complex systems is far from being fully understood. Discovering interpretable governing equations from measurement data can help us understand and predict the behavior of complex dynamic systems. Although extensive work has recently been done in this field, robustly distilling explicit model forms from very sparse data with considerable noise remains intractable. Moreover, quantifying and propagating the uncertainty of the identified system from noisy data is challenging, and relevant literature is still limited. To bridge this gap, we develop a novel Bayesian spline learning framework to identify parsimonious governing equations of nonlinear (spatio)temporal dynamics from sparse, noisy data with quantified uncertainty. The proposed method utilizes a spline basis to handle the data scarcity and measurement noise, upon which a group of derivatives can be accurately computed to form a library of candidate model terms. The equation residuals are used to inform the spline learning in a Bayesian manner, where approximate Bayesian uncertainty calibration techniques are employed to approximate posterior distributions of the trainable parameters. To promote sparsity, an iterative sequential-threshold Bayesian learning approach is developed, using an alternating direction optimization strategy to systematically approximate L0 sparsity constraints. The proposed algorithm is evaluated on multiple nonlinear dynamical systems governed by canonical ordinary and partial differential equations, and the superiority of the proposed method is demonstrated by comparison with state-of-the-art methods.
- North America > United States > New York (0.04)
- North America > United States > Indiana > St. Joseph County > Notre Dame (0.04)
- North America > United States > California (0.04)
- Health & Medicine (0.46)
- Government (0.46)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
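The sparsity-promoting step described in the Bayesian spline learning abstract can be approximated, in a much-simplified deterministic form, by sequentially thresholded least squares over a candidate library. The sketch below recovers a cubic dynamic from a polynomial library; the function name `stlsq` and the threshold value are illustrative, and the paper's actual method is Bayesian and uses alternating direction optimization rather than plain least squares.

```python
import numpy as np

def stlsq(Theta, dx, threshold=0.1, iters=10):
    """Sequentially thresholded least squares: a deterministic, simplified
    stand-in for the paper's sequential-threshold Bayesian learning."""
    xi = np.linalg.lstsq(Theta, dx, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold          # prune weak candidate terms
        xi[small] = 0.0
        if (~small).any():                      # refit the surviving terms
            xi[~small] = np.linalg.lstsq(Theta[:, ~small], dx, rcond=None)[0]
    return xi

# Recover dx/dt = 2x - 0.5x^3 from a polynomial candidate library [1, x, x^2, x^3]
x = np.linspace(-2.0, 2.0, 100)
Theta = np.column_stack([np.ones_like(x), x, x**2, x**3])
dx = 2.0 * x - 0.5 * x**3
xi = stlsq(Theta, dx)
# xi ≈ [0, 2, 0, -0.5]: only the true terms survive thresholding
```

Iterating the threshold-then-refit loop is what drives the coefficient vector toward an L0-sparse solution without solving the combinatorial problem directly.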