SCINet



SCINet: Time Series Modeling and Forecasting with Sample Convolution and Interaction

Neural Information Processing Systems

One unique property of time series is that the temporal relations are largely preserved after downsampling into two sub-sequences. By taking advantage of this property, we propose a novel neural network architecture that conducts sample convolution and interaction for temporal modeling and forecasting, named SCINet. Specifically, SCINet is a recursive downsample-convolve-interact architecture. In each layer, we use multiple convolutional filters to extract distinct yet valuable temporal features from the downsampled sub-sequences or features. Experimental results show that SCINet achieves significant forecasting accuracy improvements over both existing convolutional models and Transformer-based solutions across various real-world time series forecasting datasets.
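To make the downsample-convolve-interact idea concrete, the sketch below shows a single SCI-style step: the input is split into its even- and odd-indexed sub-sequences, each is convolved, and the two branches exchange information. The class name SCIBlockSketch and the exact interaction form are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of one downsample-convolve-interact step for input of shape
# (batch, channels, length); a simplified illustration, not the official SCINet code.
import torch
import torch.nn as nn

class SCIBlockSketch(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        conv = lambda: nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad), nn.Tanh()
        )
        self.phi, self.psi, self.rho, self.eta = conv(), conv(), conv(), conv()

    def forward(self, x: torch.Tensor):
        # Downsample into the even- and odd-indexed sub-sequences.
        x_even, x_odd = x[..., ::2], x[..., 1::2]
        # Interactive learning: each sub-sequence is rescaled by features of the other.
        x_odd_s = x_odd * torch.exp(self.phi(x_even))
        x_even_s = x_even * torch.exp(self.psi(x_odd))
        # Exchange information once more before passing to the next level.
        return x_even_s + self.rho(x_odd_s), x_odd_s - self.eta(x_even_s)

even_out, odd_out = SCIBlockSketch(channels=7)(torch.randn(4, 7, 48))
print(even_out.shape, odd_out.shape)  # torch.Size([4, 7, 24]) each
```

Stacking such blocks recursively on each output sub-sequence yields the multi-resolution structure described in the abstract.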


Appendix

Neural Information Processing Systems

This appendix provides additional material, beginning with Section A; extra experimental results are given in Section B, and further details are presented in Section C. Each data point includes an "oil temperature" value. For data pre-processing, we perform zero-mean normalization. Table 8 reports the error bars of SCINet with 5 runs on the ETTh1 dataset (Seed 1 through Seed 5, together with their mean and standard deviation); the prediction horizon is fixed to 24. Our code is implemented with PyTorch and runs on a single SXM2 GPU (32 GB memory), which is sufficient for all our experiments. The appendix also discusses how performance is enhanced in single-step (short-term) time series forecasting.
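As a quick illustration of the pre-processing and error-bar reporting described above, the snippet below applies zero-mean normalization to a multivariate array and aggregates per-seed errors into a mean and standard deviation. The numeric seed values are placeholders, not results from Table 8.

```python
# Minimal sketch (not the authors' released code): zero-mean normalization of an
# ETT-style multivariate series and mean/std aggregation over multiple seeds.
import numpy as np

def zero_mean_normalize(x: np.ndarray) -> np.ndarray:
    """Normalize each channel to zero mean and unit variance using data statistics."""
    mean = x.mean(axis=0, keepdims=True)
    std = x.std(axis=0, keepdims=True) + 1e-8  # avoid division by zero
    return (x - mean) / std

# Example multivariate series: 1000 time steps, 7 channels (shapes are illustrative).
normalized = zero_mean_normalize(np.random.default_rng(0).normal(size=(1000, 7)))

# Hypothetical per-seed MSE values for 5 runs; the real numbers come from Table 8.
seed_mse = np.array([0.301, 0.298, 0.305, 0.299, 0.302])
print(f"mean={seed_mse.mean():.4f}, std={seed_mse.std(ddof=1):.4f}")
```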



Exploring Partial Multi-Label Learning via Integrating Semantic Co-occurrence Knowledge

Wu, Xin, Teng, Fei, Feng, Yue, Shi, Kaibo, Lin, Zhuosheng, Zhang, Ji, Wang, James

arXiv.org Artificial Intelligence

Partial multi-label learning aims to extract knowledge from incompletely annotated data, which includes known correct labels, known incorrect labels, and unknown labels. The core challenge lies in accurately identifying the ambiguous relationships between labels and instances. In this paper, we emphasize that matching co-occurrence patterns between labels and instances is key to addressing this challenge. To this end, we propose the Semantic Co-occurrence Insight Network (SCINet), a novel and effective framework for partial multi-label learning. Specifically, SCINet introduces a bi-dominant prompter module, which leverages an off-the-shelf multimodal model to capture text-image correlations and enhance semantic alignment. To reinforce instance-label interdependencies, we develop a cross-modality fusion module that jointly models inter-label correlations, inter-instance relationships, and co-occurrence patterns across instance-label assignments. Moreover, we propose an intrinsic semantic augmentation strategy that enhances the model's understanding of intrinsic data semantics by applying diverse image transformations, thereby fostering a synergistic relationship between label confidence and sample difficulty. Extensive experiments on four widely-used benchmark datasets demonstrate that SCINet surpasses state-of-the-art methods.

Multi-label learning has demonstrated tremendous potential in many fields. However, due to the high cost of labeling and the subjectivity of annotators, real-world datasets often suffer from incomplete and noisy labels. This challenge has spurred the exploration of partial multi-label learning methods aimed at addressing these issues more effectively. Consequently, driven by this research need, partial multi-label learning has garnered vibrant attention in machine learning [1], [2], and it represents a new paradigm for multi-label recognition.
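To make the partial-label setting concrete, the snippet below is an illustrative sketch only, not SCINet's actual objective: a masked binary cross-entropy that trains on known-correct and known-incorrect labels while ignoring unknown ones. The function partial_bce and the example tensors are hypothetical.

```python
# Illustrative sketch of the partial multi-label setting (not SCINet's method):
# compute BCE only over annotated labels; unknown labels contribute no gradient.
import torch
import torch.nn.functional as F

def partial_bce(logits: torch.Tensor, targets: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """
    logits:  (batch, num_labels) raw scores from any classifier.
    targets: (batch, num_labels) 1 = known correct, 0 = known incorrect.
    mask:    (batch, num_labels) 1 = label is annotated, 0 = unknown.
    """
    loss = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    return (loss * mask).sum() / mask.sum().clamp(min=1)

# Example: 2 samples, 4 labels; the last label of each sample is unannotated.
logits = torch.randn(2, 4)
targets = torch.tensor([[1., 0., 1., 0.], [0., 1., 0., 0.]])
mask = torch.tensor([[1., 1., 1., 0.], [1., 1., 1., 0.]])
print(partial_bce(logits, targets, mask))
```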



DeepFIB: Self-Imputation for Time Series Anomaly Detection

Liu, Minhao, Xu, Zhijian, Xu, Qiang

arXiv.org Artificial Intelligence

Time series (TS) anomaly detection (AD) plays an essential role in various applications, e.g., fraud detection in finance and healthcare monitoring. Due to the inherently unpredictable and highly varied nature of anomalies and the lack of anomaly labels in historical data, the AD problem is typically formulated as an unsupervised learning problem. The performance of existing solutions is often not satisfactory, especially in data-scarce scenarios. To tackle this problem, we propose a novel self-supervised learning technique for AD in time series, namely DeepFIB. We model the problem as a Fill In the Blank game by masking some elements in the TS and imputing them with the rest. Considering the two common anomaly shapes (point- or sequence-outliers) in TS data, we implement two masking strategies with many self-generated training samples. The corresponding self-imputation networks can extract more robust temporal relations than existing AD solutions and effectively facilitate identifying the two types of anomalies. For continuous outliers, we also propose an anomaly localization algorithm that dramatically reduces AD errors. Experiments on various real-world TS datasets demonstrate that DeepFIB outperforms state-of-the-art methods by a large margin, achieving up to 65.2% relative improvement in F1-score.
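The Fill-In-the-Blank formulation can be sketched in a few lines: mask either isolated points or a contiguous segment, impute the masked values with some model, and score anomalies by the imputation residual. The snippet below is a hedged illustration of that idea, not DeepFIB's exact recipe; the trivial linear-interpolation imputer stands in for the learned self-imputation network.

```python
# Hedged sketch of the "Fill In the Blank" idea (not DeepFIB's actual networks):
# two masking strategies for the two anomaly shapes, plus a residual-based score.
import numpy as np

rng = np.random.default_rng(0)

def mask_points(x: np.ndarray, ratio: float = 0.1):
    """Point-outlier style masking: hide random individual time steps."""
    mask = rng.random(x.shape[0]) < ratio
    return np.where(mask, np.nan, x), mask

def mask_segment(x: np.ndarray, length: int = 16):
    """Sequence-outlier style masking: hide one contiguous segment."""
    start = rng.integers(0, x.shape[0] - length)
    mask = np.zeros(x.shape[0], dtype=bool)
    mask[start:start + length] = True
    return np.where(mask, np.nan, x), mask

def anomaly_score(x: np.ndarray, x_imputed: np.ndarray) -> np.ndarray:
    """Larger reconstruction error at a time step suggests an anomaly."""
    return np.abs(x - x_imputed)

x = np.sin(np.linspace(0, 8 * np.pi, 256))
masked, m = mask_segment(x)
# A trivial stand-in imputer: linear interpolation over the masked span.
x_hat = np.interp(np.arange(x.size), np.arange(x.size)[~m], x[~m])
print(anomaly_score(x, x_hat).max())
```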


Time Series is a Special Sequence: Forecasting with Sample Convolution and Interaction

Liu, Minhao, Zeng, Ailing, Lai, Qiuxia, Xu, Qiang

arXiv.org Artificial Intelligence

Time series is a special type of sequence data, a set of observations collected at even intervals of time and ordered chronologically. Existing deep learning techniques use generic sequence models (e.g., recurrent neural network, Transformer model, or temporal convolutional network) for time series analysis, which ignore some of its unique properties. For example, the downsampling of time series data often preserves most of the information in the data, while this is not true for general sequence data such as text sequence and DNA sequence. Motivated by the above, in this paper, we propose a novel neural network architecture and apply it for the time series forecasting problem, wherein we conduct sample convolution and interaction at multiple resolutions for temporal modeling. The proposed architecture, namely SCINet, facilitates extracting features with enhanced predictability. Experimental results show that SCINet achieves significant prediction accuracy improvement over existing solutions across various real-world time series forecasting datasets. In particular, it can achieve high forecasting accuracy for those temporal-spatial datasets without using sophisticated spatial modeling techniques. Our codes and data are presented in the supplemental material.
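The motivating property, that downsampling a time series largely preserves its temporal structure, can be checked with a small sketch. The example below uses a synthetic seasonal series purely for illustration; the signal and threshold of "strongly correlated" are assumptions, not figures from the paper.

```python
# Small sketch of the motivating property: splitting a series into its even- and
# odd-indexed sub-sequences keeps the temporal structure largely intact.
import numpy as np

t = np.arange(512)
series = np.sin(2 * np.pi * t / 64) + 0.05 * np.random.default_rng(1).standard_normal(t.size)

even, odd = series[::2], series[1::2]
# The two sub-series remain strongly correlated with each other, reflecting the
# shared trend/seasonality; generic sequences (text, DNA) need not behave this way.
print(np.corrcoef(even, odd)[0, 1])  # close to 1 for this smooth seasonal series
```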


Viewpoint: Physics Insights from Neural Networks

#artificialintelligence

Machine-learning models based on neural networks are behind many recent technological advances, including high-accuracy translations of text and self-driving cars. They are also increasingly used by researchers to help solve physics problems [1]. Neural networks have identified new phases of matter (see Q&A: A Condensed Matter Theorist Embraces AI) [2], detected interesting outliers in data from high-energy physics experiments [3], and found astronomical objects known as gravitational lenses in maps of the night sky (see Q&A: Paving A Path for AI in Physics Research) [4]. But, while the results obtained by neural networks proliferate, the inner workings of this tool remain elusive, and it is often unclear exactly how the network processes information in order to solve a problem. Now a team at the Swiss Federal Institute of Technology (ETH) in Zurich has demonstrated a way to find this information [5].


AI discovered Copernicus' heliocentricity on its own

#artificialintelligence

In the process, SciNet generated formulas that place the Sun at the center of our solar system. Remarkably, SciNet accomplished this in a way similar to how astronomer Nicolaus Copernicus discovered heliocentricity. "In the 16th century, Copernicus measured the angles between a distant fixed star and several planets and celestial bodies and hypothesized that the Sun, and not the Earth, is in the centre of our solar system and that the planets move around the Sun on simple orbits," the team wrote in a paper published on the preprint repository arXiv. "This explains the complicated orbits as seen from Earth." The team "encouraged" SciNet to come up with ways to predict the movements of the Sun and Mars in the simplest way possible.