
Collaborating Authors

 Zhan, Tianxiang


Generalized Uncertainty-Based Evidential Fusion with Hybrid Multi-Head Attention for Weak-Supervised Temporal Action Localization

arXiv.org Artificial Intelligence

Weakly supervised temporal action localization (WS-TAL) aims to localize complete action instances and categorize them using only video-level labels. Action-background ambiguity, caused primarily by background noise introduced during aggregation and by intra-action variation, is a significant challenge for existing WS-TAL methods. In this paper, we introduce a hybrid multi-head attention (HMHA) module and a generalized uncertainty-based evidential fusion (GUEF) module to address this problem. The proposed HMHA effectively enhances RGB and optical-flow features by filtering redundant information and adjusting their feature distribution to better suit the WS-TAL task. The proposed GUEF adaptively eliminates the interference of background noise by fusing snippet-level evidence to refine the uncertainty measurement and select superior foreground features, which enables the model to concentrate on complete action instances and achieve better localization and classification performance. Experimental results on the THUMOS14 dataset demonstrate that our method outperforms state-of-the-art methods. Our code is available at \url{https://github.com/heyuanpengpku/GUEF/tree/main}.
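The evidential-fusion step above builds on Dempster-Shafer combination. As a reference point only (not the paper's GUEF code; the two-class frame, masses, and names below are illustrative), Dempster's rule over a foreground/background frame can be sketched in Python:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) with Dempster's rule,
    normalizing out the mass assigned to conflicting (empty) intersections."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    norm = 1.0 - conflict
    return {s: v / norm for s, v in combined.items()}

F, B = frozenset({"fg"}), frozenset({"bg"})
T = F | B                        # mass on the whole frame encodes uncertainty
m1 = {F: 0.6, B: 0.1, T: 0.3}    # evidence from one snippet (hypothetical)
m2 = {F: 0.5, B: 0.2, T: 0.3}    # evidence from another snippet
fused = dempster_combine(m1, m2)
```

Fusing two sources that both lean toward foreground strengthens the foreground mass and shrinks the uncertainty mass, which is the qualitative behavior exploited when refining uncertainty measurement.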


Residual Feature-Reutilization Inception Network for Image Classification

arXiv.org Artificial Intelligence

Deep learning has contributed greatly to this field. The most representative deep neural network architectures in computer vision can be roughly divided into transformer-based and CNN-based models. The transformer was originally proposed for natural language processing and has since been transferred to vision tasks, where it achieves highly competitive performance. Specifically, the Vision Transformer [1] first introduced the attention mechanism into computer vision; its strategy of information interaction noticeably enlarges the effective receptive field of related models so that crucial information can be better captured. Owing to the efficiency of this architecture, transformer variants have been devised for specific demands, and improvements fall into two main categories: integrating the transformer framework with other models for particular uses, and modifying the original architecture. For the former, DS-TransUNet [2] is a typical example: it combines dual transformer-based branches with U-Net to achieve a breakthrough in medical image segmentation. Other works focus on the transformer architecture itself; for instance, Mix-ViT [3] designs a mixed attention mechanism to create more channels for information interaction.


Reliability Assessment of Information Sources Based on Random Permutation Set

arXiv.org Artificial Intelligence

In pattern recognition, handling uncertainty is a critical challenge that significantly affects decision-making and classification accuracy. Dempster-Shafer Theory (DST) is an effective reasoning framework for addressing uncertainty, and the Random Permutation Set (RPS) extends it by additionally considering the internal order of elements, forming a more ordered generalization of DST. However, there is no transformation method between RPS and DST based on permutation order, nor a sequence-based probability transformation method for RPS; moreover, the reliability of RPS sources remains an open issue. To address these challenges, this paper proposes an RPS transformation approach and a probability transformation method tailored to RPS. On this basis, a reliability computation method for RPS sources, based on the RPS probability transformation, is introduced and applied to pattern recognition. Experimental results demonstrate that the proposed approach effectively bridges the gap between DST and RPS and achieves superior recognition accuracy in classification problems.
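For orientation, the classic pignistic transformation — the standard DST-to-probability bridge that a sequence-based RPS transformation would generalize — can be sketched as follows (a minimal illustration; the example mass function is hypothetical):

```python
def pignistic(m):
    """Pignistic transform: BetP(x) = sum over focal sets A containing x
    of m(A) / |A|, turning a mass function into a probability distribution."""
    bet = {}
    for A, mass in m.items():
        share = mass / len(A)   # each element of A gets an equal share
        for x in A:
            bet[x] = bet.get(x, 0.0) + share
    return bet

m = {frozenset({"a"}): 0.5,
     frozenset({"a", "b"}): 0.3,
     frozenset({"a", "b", "c"}): 0.2}
p = pignistic(m)
```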


Evaluating Evidential Reliability In Pattern Recognition Based On Intuitionistic Fuzzy Sets

arXiv.org Artificial Intelligence

Determining the reliability of evidence sources is a crucial topic in Dempster-Shafer theory (DST). Previous approaches have addressed high conflict between evidence sources with discounting methods, but these methods do not guarantee efficient classification models. In this paper, we combine DST with Intuitionistic Fuzzy Sets (IFS) and propose an algorithm for quantifying the reliability of evidence sources, called the Fuzzy Reliability Index (FRI). The FRI algorithm is based on decision-quantification rules derived from IFS: it defines the contribution of different BPAs to correct decisions and derives the evidential reliability from these contributions. The proposed method effectively improves the rationality of reliability estimation for evidence sources, making it particularly suitable for classification decision problems in complex scenarios. Comparisons with DST-based algorithms and classical machine learning algorithms demonstrate the superiority and generalizability of the FRI algorithm, which provides a new perspective for future decision-probability conversion and reliability analysis of evidence sources.
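Reliability weights of the kind FRI produces are conventionally applied through Shafer's discounting operation, which the "discounting methods" above refers to. A minimal sketch (example masses hypothetical):

```python
def discount(m, alpha, frame):
    """Shafer discounting: a source with reliability alpha keeps alpha of each
    mass; the remaining (1 - alpha) is transferred to the whole frame."""
    out = {A: alpha * v for A, v in m.items()}
    theta = frozenset(frame)
    out[theta] = out.get(theta, 0.0) + (1.0 - alpha)
    return out

m = {frozenset({"a"}): 0.7, frozenset({"b"}): 0.3}   # a fully committed source
md = discount(m, 0.8, {"a", "b"})                    # treat it as 80% reliable
```

Discounting keeps the result a valid mass function while softening an unreliable source's commitments toward total ignorance.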


Time Evidence Fusion Network: Multi-source View in Long-Term Time Series Forecasting

arXiv.org Artificial Intelligence

In real-world scenarios, time series forecasting often demands timeliness, making research on model backbones a perennially hot topic. To meet these demands, we propose a novel backbone from the perspective of information fusion. Introducing a Basic Probability Assignment (BPA) module and the Time Evidence Fusion Network (TEFN), both grounded in evidence theory, allows us to achieve superior performance; the multi-source information-fusion perspective effectively improves forecasting accuracy. Because the BPA is generated by fuzzy theory, TEFN also offers considerable interpretability. In experiments on real data, TEFN achieves state-of-the-art results on part of the benchmarks, with low errors comparable to PatchTST and operating efficiency surpassing models such as DLinear. TEFN is also highly robust, showing small error fluctuations under random hyperparameter selection. It is not a model that pushes a single aspect to the extreme, but one that balances performance, accuracy, stability, and interpretability.
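The BPA module above is fuzzy-theoretic. As a loose illustration of how a fuzzy membership function can generate a BPA — a generic sketch under assumed Gaussian memberships, not TEFN's actual module — one might write:

```python
import math

def fuzzy_bpa(x, centers, sigma=1.0):
    """Turn a scalar observation into a BPA over singleton classes via
    Gaussian memberships; masses are proportional to membership, and a
    fixed 10% remainder goes to the full frame as explicit uncertainty.
    (Illustrative assumption, not the paper's construction.)"""
    mu = {c: math.exp(-0.5 * ((x - c) / sigma) ** 2) for c in centers}
    total = sum(mu.values())
    scale = 0.9 / total if total > 0 else 0.0
    m = {frozenset({c}): v * scale for c, v in mu.items()}
    m[frozenset(centers)] = 1.0 - sum(m.values())   # uncertainty mass
    return m

m = fuzzy_bpa(1.2, [0.0, 1.0, 2.0])   # observation nearest the center 1.0
```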


Isopignistic Canonical Decomposition via Belief Evolution Network

arXiv.org Artificial Intelligence

Developing a general information processing model in uncertain environments is fundamental for the advancement of explainable artificial intelligence. Dempster-Shafer theory of evidence is a well-known and effective reasoning method for representing epistemic uncertainty, and is closely related to subjective probability theory and possibility theory. Although these frameworks can be transformed into each other under certain belief structures, there remains a lack of a clear and interpretable transformation process, as well as a unified approach to information processing. In this paper, we address these issues from the perspectives of isopignistic belief functions and the hyper-cautious transferable belief model. First, we propose an isopignistic transformation based on the belief evolution network. This transformation allows the information granule to be adjusted while retaining the potential decision outcome. The isopignistic transformation is integrated with the hyper-cautious transferable belief model to establish a new canonical decomposition, which offers a reverse path between a possibility distribution and its isopignistic mass functions. The result of the canonical decomposition, called the isopignistic function, is a distribution of identical information content that reflects the propensity and relative commitment degree of the BPA. Furthermore, this paper introduces a method to reconstruct the basic belief assignment by adjusting the isopignistic function, and explores the advantages of this approach for modeling and handling uncertainty within the hyper-cautious transferable belief model. More generally, this paper establishes a theoretical basis for building general models of artificial intelligence based on probability theory, Dempster-Shafer theory, and possibility theory.
Introduction: Dempster-Shafer (DS) theory of evidence, also known as belief function theory, is an effective artificial intelligence tool for modeling and handling uncertainty in partial-knowledge environments.
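As background for the belief-function machinery above, belief and plausibility functions can be sketched as follows. For a consonant (nested) mass function, the plausibilities of singletons form a possibility distribution — the kind of correspondence between belief functions and possibility theory that an isopignistic transformation makes precise. The example masses are hypothetical:

```python
def belief(m, A):
    """Bel(A): total mass of focal sets entirely contained in A."""
    A = frozenset(A)
    return sum(v for B, v in m.items() if B <= A)

def plausibility(m, A):
    """Pl(A): total mass of focal sets intersecting A."""
    A = frozenset(A)
    return sum(v for B, v in m.items() if B & A)

# a consonant mass function: focal sets are nested, {a} < {a,b} < {a,b,c}
m = {frozenset({"a"}): 0.5,
     frozenset({"a", "b"}): 0.3,
     frozenset({"a", "b", "c"}): 0.2}
```

Here Pl({a}) = 1.0, Pl({b}) = 0.5, Pl({c}) = 0.2: a possibility distribution read directly off the nested structure.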


Learnable WSN Deployment of Evidential Collaborative Sensing Model

arXiv.org Artificial Intelligence

In wireless sensor networks (WSNs), coverage and deployment are the two most crucial issues in detection tasks. However, the detection information collected by sensors is often not fully utilized or efficiently integrated, so existing sensing models and deployment strategies cannot reach the maximum quality of coverage, particularly when the number of sensors in a WSN grows large. In this article, we aim to achieve optimal coverage quality in WSN deployment. We develop a collaborative sensing model that enhances the detection capability of WSNs by leveraging collaborative information derived from the combination rule of evidence theory. In this model, the performance evaluation of the evidential fusion system serves as the criterion for sensor selection. A learnable sensor deployment network (LSDNet), considering both sensor contribution and detection capability, is proposed to achieve optimal WSN deployment. Moreover, we investigate an algorithm for finding the minimum number of sensors required for full coverage. A series of numerical examples, along with an application to forest-area monitoring, demonstrate the effectiveness and robustness of the proposed algorithms.
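One plausible way to formalize the collaborative coverage discussed above — a common probabilistic sensing model from the WSN literature, not necessarily the paper's exact model — treats a sensor's detection probability as decaying with distance and combines independent sensors multiplicatively:

```python
import math

def detect_prob(sensor, target, lam=0.5):
    """Single-sensor detection probability, decaying exponentially with
    Euclidean distance (lam is an assumed decay parameter)."""
    d = math.dist(sensor, target)
    return math.exp(-lam * d)

def joint_coverage(sensors, target, lam=0.5):
    """Probability that at least one sensor detects the target,
    assuming independent detections: 1 - product of miss probabilities."""
    miss = 1.0
    for s in sensors:
        miss *= 1.0 - detect_prob(s, target, lam)
    return 1.0 - miss

single = joint_coverage([(0.0, 0.0)], (1.0, 0.0))
pair = joint_coverage([(0.0, 0.0), (2.0, 0.0)], (1.0, 0.0))
```

Adding sensors can only increase joint coverage under this model, which is why deployment reduces to placing sensors so the joint coverage over the region is maximized.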


Random Graph Set and Evidence Pattern Reasoning Model

arXiv.org Artificial Intelligence

Evidence theory is widely used in decision-making and reasoning systems. The Transferable Belief Model (TBM) is a commonly used evidential decision-making model, but it expresses no preferences. To better fit decision-making goals, we propose the Evidence Pattern Reasoning Model (EPRM): by defining pattern operators and decision-making operators, corresponding preferences can be set for different tasks. The Random Permutation Set (RPS) adds order information to evidence theory, but it is hard for RPS to characterize complex relationships between samples, such as cyclic or parallel relationships. We therefore propose the Random Graph Set (RGS) to model complex relationships and represent more event types. To illustrate the significance of RGS and EPRM, an aircraft velocity-ranking experiment was designed and 10,000 cases were simulated. An implementation of EPRM called Conflict Resolution Decision improved 18.17\% of the cases compared to Mean Velocity Decision, effectively improving the aircraft velocity ranking. EPRM provides a unified solution for evidence-based decision making.
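The Mean Velocity Decision baseline mentioned above can be illustrated with a toy ranking; the aircraft names and velocity estimates are hypothetical, and the paper's EPRM operators are not reproduced here:

```python
def mean_velocity_ranking(estimates):
    """Rank aircraft by the mean of their velocity estimates from several
    sources — a simple preference-free baseline of the kind EPRM's
    pattern/decision operators are designed to improve on."""
    means = {ac: sum(v) / len(v) for ac, v in estimates.items()}
    return sorted(means, key=means.get, reverse=True)

# hypothetical multi-source velocity estimates (e.g., from two radars)
est = {"A1": [300.0, 320.0], "A2": [280.0, 330.0], "A3": [250.0, 260.0]}
ranking = mean_velocity_ranking(est)
```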


Differential Convolutional Fuzzy Time Series Forecasting

arXiv.org Artificial Intelligence

Fuzzy time series forecasting (FTSF) is a typical forecasting method with wide application. Traditional FTSF is implemented as an expert system, which costs it the ability to recognize undefined features; this is the main reason for its poor forecasting. To solve the problem, the proposed Differential Fuzzy Convolutional Neural Network (DFCNN) uses a convolutional neural network to re-implement FTSF with learnable parameters, enabling it to recognize latent information and improve forecasting accuracy. Thanks to the network's learnability, the fuzzy rules established in FTSF can be extended to arbitrary lengths that an expert system cannot handle. At the same time, FTSF usually performs poorly on non-stationary time series: the trend invalidates the fuzzy sets established by FTSF and causes forecasting to fail. DFCNN applies differencing to weaken the non-stationarity of a series, so it can forecast non-stationary series with low error where FTSF cannot achieve satisfactory performance. Extensive experiments show that DFCNN has an excellent prediction effect, ahead of existing FTSF and common time series forecasting algorithms. DFCNN thus offers further ideas for improving FTSF and holds continued research value.
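The differencing step DFCNN relies on is standard: first-order differencing removes a linear trend and is exactly invertible given the first value. A minimal sketch on a hypothetical trending series:

```python
def difference(series):
    """First-order differencing: d[t] = x[t+1] - x[t]."""
    return [b - a for a, b in zip(series, series[1:])]

def restore(diffed, x0):
    """Invert first-order differencing by cumulative summation from x0."""
    out = [x0]
    for d in diffed:
        out.append(out[-1] + d)
    return out

x = [1.0, 2.0, 4.0, 7.0, 11.0]   # strongly trending (non-stationary) toy series
d = difference(x)                # trend reduced to a slowly growing sequence
```

Forecasting on the differenced series and then restoring via cumulative sums lets a model work on a (more) stationary signal without losing the original scale.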


Evaluating importance of nodes in complex networks with local volume information dimension

arXiv.org Artificial Intelligence

Evaluating the importance of nodes is essential in complex network research. Many methods have been proposed for this problem, but they still leave room for improvement. In this paper, a new approach called the local volume information dimension is proposed. In this method, the sum of the degrees of the nodes within different distances of a central node is calculated, and the information within each distance is described by information entropy. Compared to other methods, the proposed method considers the information of nodes at different distances more comprehensively. To show its effectiveness, experiments on real-world networks are implemented; the promising results indicate the effectiveness of the proposed method.
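The described measure — degree "volume" accumulated within growing distances of a central node, summarized by information entropy — can be sketched roughly as follows. This is a simplified reading of the method; details such as normalization and the radius range are assumed:

```python
from collections import deque
import math

def degrees_within(adj, src, radius):
    """Sum of degrees of all nodes within `radius` hops of src (BFS),
    including src itself. `adj` is a dict: node -> list of neighbors."""
    seen = {src: 0}
    q = deque([src])
    total = 0
    while q:
        u = q.popleft()
        total += len(adj[u])
        for v in adj[u]:
            if v not in seen and seen[u] + 1 <= radius:
                seen[v] = seen[u] + 1
                q.append(v)
    return total

def local_volume_entropy(adj, src, max_r):
    """Shannon entropy of the normalized degree volumes at radii 1..max_r,
    a rough sketch of the local-volume information idea."""
    vols = [degrees_within(adj, src, r) for r in range(1, max_r + 1)]
    total = sum(vols)
    return -sum((v / total) * math.log(v / total) for v in vols if v > 0)

# a small star graph: node 0 is the hub
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
```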