Collaborating Authors: Deng, Yong


Reliability Assessment of Information Sources Based on Random Permutation Set

arXiv.org Artificial Intelligence

In pattern recognition, handling uncertainty is a critical challenge that significantly affects decision-making and classification accuracy. Dempster-Shafer Theory (DST) is an effective reasoning framework for addressing uncertainty, and the Random Permutation Set (RPS) extends DST by additionally considering the internal order of elements. However, there is no transformation method between RPS and DST based on permutation order, nor a sequence-based probability transformation method for RPS, and the reliability of RPS sources remains an open issue. To address these challenges, this paper proposes an RPS transformation approach and a probability transformation method tailored to RPS. On this basis, a reliability computation method for RPS sources, based on the RPS probability transformation, is introduced and applied to pattern recognition. Experimental results demonstrate that the proposed approach effectively bridges the gap between DST and RPS and achieves superior recognition accuracy on classification problems.
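The abstract does not spell out the proposed transformation, so the following Python sketch only illustrates the general idea of an order-aware probability transformation: the mass on each permutation event is split among its elements with position-dependent weights. The geometric decay factor `alpha` is an illustrative assumption, not the paper's formula.

```python
from collections import defaultdict

def rps_to_probability(pmf, alpha=0.5):
    """Order-weighted probability transformation for a random permutation set.

    pmf: dict mapping ordered tuples (permutation events) to mass.
    Earlier positions in an event receive larger weights, decaying
    geometrically with factor alpha (an illustrative choice).
    """
    prob = defaultdict(float)
    for event, mass in pmf.items():
        weights = [alpha ** i for i in range(len(event))]
        total = sum(weights)
        for elem, w in zip(event, weights):
            prob[elem] += mass * w / total
    return dict(prob)

# Example: masses on ordered events over the frame {a, b, c}.
pmf = {("a",): 0.3, ("a", "b"): 0.4, ("c", "b", "a"): 0.3}
print(rps_to_probability(pmf))  # values sum to 1
```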


Evaluating Evidential Reliability In Pattern Recognition Based On Intuitionistic Fuzzy Sets

arXiv.org Artificial Intelligence

Determining the reliability of evidence sources is a crucial topic in Dempster-Shafer theory (DST). Previous approaches have addressed high conflict between evidence sources with discounting methods, but these methods do not necessarily yield efficient classification models. In this paper, we combine DST with Intuitionistic Fuzzy Sets (IFS) and propose an algorithm for quantifying the reliability of evidence sources, called the Fuzzy Reliability Index (FRI). The FRI algorithm is based on decision quantification rules derived from IFS: it defines the contribution of different basic probability assignments (BPAs) to correct decisions and derives the evidential reliability from these contributions. The proposed method makes reliability estimation for evidence sources more rational, which is particularly suitable for classification decision problems in complex scenarios. Comparisons with DST-based algorithms and classical machine learning algorithms demonstrate the superiority and generalizability of the FRI algorithm, which offers a new perspective on decision probability conversion and reliability analysis of evidence sources.
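As a rough illustration of the pipeline this abstract describes, the sketch below combines the classical IFS score function s = mu - nu with Shafer's standard discounting rule. The specific FRI decision quantification rules are not reproduced here, and the mu, nu statistics are hypothetical placeholders.

```python
def ifs_score(mu, nu):
    # Classical IFS score function s = mu - nu; hesitancy is pi = 1 - mu - nu.
    return mu - nu

def discount(bpa, alpha, frame):
    """Shafer's classical discounting: scale all masses by reliability alpha
    and move the remaining 1 - alpha to the full frame (total ignorance)."""
    out = {A: alpha * m for A, m in bpa.items()}
    theta = frozenset(frame)
    out[theta] = out.get(theta, 0.0) + (1.0 - alpha)
    return out

# Hypothetical training statistics: mu = rate at which a source's BPAs
# support the true class most strongly, nu = rate of support for a wrong class.
mu, nu = 0.7, 0.2
alpha = 0.5 * (ifs_score(mu, nu) + 1.0)  # map score from [-1, 1] to [0, 1]
bpa = {frozenset({"A"}): 0.6, frozenset({"A", "B"}): 0.4}
print(discount(bpa, alpha, {"A", "B", "C"}))
```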


Transferable Belief Model on Quantum Circuits

arXiv.org Artificial Intelligence

The transferable belief model, as a semantic interpretation of Dempster-Shafer theory, enables agents to perform reasoning and decision making in imprecise and incomplete environments. The model offers distinct semantics for handling unreliable testimonies, allowing for a more reasonable and general process of belief transfer than the Bayesian approach. However, because updating belief functions must account for both the belief masses and the structure of the focal sets, which adds computational complexity during reasoning, the transferable belief model has gradually lost favor among researchers. In this paper, we implement the transferable belief model on quantum circuits and demonstrate that belief functions offer a more concise and effective alternative to Bayesian approaches within the quantum computing framework. Furthermore, leveraging the unique characteristics of quantum computing, we propose several novel belief transfer approaches. More broadly, this paper introduces a new perspective on basic information representation for quantum AI models, suggesting that belief functions are more suitable than the Bayesian approach for handling uncertainty on quantum circuits.
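One intuition for why belief functions fit quantum registers is amplitude encoding: a BPA over the 2^n subsets of an n-element frame maps onto the 2^n computational basis states of n qubits, whereas a Bayesian distribution only occupies the n singleton states. The sketch below (plain NumPy, no quantum SDK) shows this encoding only; it is an assumption-level illustration, not the paper's circuit construction.

```python
import numpy as np

# Subsets of an n-element frame are indexed by bitmasks, so a basic belief
# assignment (BBA) over 2^n subsets maps onto the 2^n basis states of n qubits.
elements = ["a", "b"]
n = len(elements)

bba = {0b01: 0.5, 0b10: 0.2, 0b11: 0.3}  # m({a}), m({b}), m({a,b})

# Amplitude encoding: |psi> = sum_A sqrt(m(A)) |A>.
state = np.zeros(2 ** n)
for subset, mass in bba.items():
    state[subset] = np.sqrt(mass)

assert np.isclose(np.linalg.norm(state), 1.0)  # a valid quantum state
# Measuring in the computational basis recovers the masses:
print({f"{i:0{n}b}": p for i, p in enumerate(state ** 2) if p > 0})
```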


Time Evidence Fusion Network: Multi-source View in Long-Term Time Series Forecasting

arXiv.org Artificial Intelligence

In real-world scenarios, time series forecasting often demands timeliness, making research on model backbones a perennially hot topic. To meet these performance demands, we propose a novel backbone from the perspective of information fusion: the Basic Probability Assignment (BPA) Module and the Time Evidence Fusion Network (TEFN), grounded in evidence theory. The multi-source information fusion perspective effectively improves forecasting accuracy, and because the BPA is generated using fuzzy theory, TEFN also offers considerable interpretability. In experiments on real data, TEFN achieves partially state-of-the-art results, with errors comparable to PatchTST while surpassing models such as DLinear in operating efficiency. TEFN is also highly robust, with small error fluctuations under random hyperparameter selection. Rather than pushing any single aspect to the extreme, TEFN balances performance, accuracy, stability, and interpretability.
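As a hedged sketch of what a BPA module of this kind might look like, the NumPy snippet below maps inputs to masses over fuzzy events via Gaussian membership functions and fuses them by expectation. In TEFN these parameters would be learned end to end; the exact module design is not reproduced here.

```python
import numpy as np

def bpa_module(x, centers, widths):
    """Map each scalar input to a basic probability assignment over fuzzy
    events via Gaussian membership functions, then normalize row-wise.
    x: (batch,) inputs; centers/widths: (k,) fuzzy-set parameters."""
    memberships = np.exp(-((x[:, None] - centers[None, :]) ** 2)
                         / (2 * widths[None, :] ** 2))
    return memberships / memberships.sum(axis=1, keepdims=True)

def fuse(bpa, event_values):
    # Expectation-style fusion: weight each fuzzy event's prototype value
    # by its mass. In TEFN these weights would be trained.
    return bpa @ event_values

x = np.array([0.1, 0.5, 0.9])
centers = np.array([0.0, 0.5, 1.0])
widths = np.full(3, 0.25)
bpa = bpa_module(x, centers, widths)
print(fuse(bpa, centers))  # smooth, interpretable point forecasts
```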


Isopignistic Canonical Decomposition via Belief Evolution Network

arXiv.org Artificial Intelligence

Developing a general information processing model for uncertain environments is fundamental to the advancement of explainable artificial intelligence. The Dempster-Shafer theory of evidence is a well-known and effective reasoning method for representing epistemic uncertainty and is closely related to subjective probability theory and possibility theory. Although these theories can be transformed into one another under particular belief structures, there is still no clear, interpretable transformation process between them, nor a unified approach for information processing. In this paper, we address these issues from the perspectives of isopignistic belief functions and the hyper-cautious transferable belief model. First, we propose an isopignistic transformation based on the belief evolution network. This transformation adjusts the information granule while retaining the potential decision outcome. The isopignistic transformation is then integrated with the hyper-cautious transferable belief model to establish a new canonical decomposition, which offers a reverse path between a possibility distribution and its isopignistic mass functions. The result of the canonical decomposition, called the isopignistic function, is an identical-information-content distribution that reflects the propensity and relative commitment degree of the basic probability assignment (BPA). Furthermore, this paper introduces a method to reconstruct the basic belief assignment by adjusting the isopignistic function and explores the advantages of this approach for modeling and handling uncertainty within the hyper-cautious transferable belief model. More generally, this paper establishes a theoretical basis for building general artificial intelligence models based on probability theory, Dempster-Shafer theory, and possibility theory.
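For readers unfamiliar with the term, "isopignistic" mass functions are those sharing the same pignistic transformation, the standard decision-level transform of the transferable belief model. A minimal Python implementation of BetP follows; only the standard transform is shown, not the paper's decomposition.

```python
def pignistic(bpa):
    """Pignistic transformation BetP(x) = sum over focal sets A containing x
    of m(A) / |A|, assuming a normal BPA (no mass on the empty set)."""
    betp = {}
    for A, m in bpa.items():
        for x in A:
            betp[x] = betp.get(x, 0.0) + m / len(A)
    return betp

bpa = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.3,
       frozenset({"a", "b", "c"}): 0.2}
print(pignistic(bpa))  # a: 0.7167, b: 0.2167, c: 0.0667 (sums to 1)
```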


Random Walk in Random Permutation Set Theory

arXiv.org Artificial Intelligence

Random walk is an explainable approach for modeling natural processes at the molecular level. Random Permutation Set Theory (RPST) serves as a framework for uncertainty reasoning, extending the applicability of Dempster-Shafer Theory. Recent explorations indicate a promising link between RPST and random walks. In this study, we construct a random walk model based on the properties of RPST and analyze it with Monte Carlo simulations. Our findings reveal that the random walk generated through RPST exhibits characteristics similar to those of a Gaussian random walk and can be transformed into a Wiener process through a specific limiting scaling procedure. This investigation establishes a novel connection between RPST and random walk theory, not only expanding the applicability of RPST but also demonstrating the potential of combining the strengths of both approaches to improve problem-solving abilities.
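The RPST-specific rule for generating steps is not given in the abstract, so the Monte Carlo sketch below uses a placeholder zero-mean step law simply to illustrate the diffusive scaling check: endpoints rescaled by the square root of the step count approach a Gaussian, the discrete shadow of a Wiener process.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_walk(n_steps, n_walkers, step_values, step_probs):
    """Monte Carlo simulation of i.i.d. walks. In the paper the step
    distribution is derived from RPST; here it is a placeholder."""
    steps = rng.choice(step_values, p=step_probs, size=(n_walkers, n_steps))
    return steps.sum(axis=1)

# Placeholder zero-mean step law with per-step variance 0.6.
endpoints = random_walk(1_000, 2_000, np.array([-1.0, 0.0, 1.0]),
                        np.array([0.3, 0.4, 0.3]))
scaled = endpoints / np.sqrt(1_000)
print(scaled.mean(), scaled.std())  # approx 0 and sqrt(0.6) ~ 0.775
```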


Learnable WSN Deployment of Evidential Collaborative Sensing Model

arXiv.org Artificial Intelligence

In wireless sensor networks (WSNs), coverage and deployment are two of the most crucial issues in detection tasks. However, the detection information collected from sensors is often neither fully utilized nor efficiently integrated, so such sensing models and deployment strategies cannot reach the maximum quality of coverage, particularly when the number of sensors in a WSN grows significantly. In this article, we aim to achieve the optimal coverage quality of WSN deployment. We develop a collaborative sensing model that enhances the detection capability of WSNs by leveraging the collaborative information derived from the combination rule under the framework of evidence theory. In this model, the performance evaluation of evidential fusion systems is adopted as the criterion for sensor selection. We then propose a learnable sensor deployment network (LSDNet), which considers both sensor contribution and detection capability, to achieve the optimal deployment of WSNs. Moreover, we investigate an algorithm for finding the minimum number of sensors required for full coverage of a WSN. A series of numerical examples, along with an application to forest area monitoring, demonstrate the effectiveness and robustness of the proposed algorithms.
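The combination rule referred to here is presumably Dempster's rule; a minimal Python sketch of combining two sensors' detection BPAs follows. The sensor masses are hypothetical placeholders rather than the paper's distance-dependent sensing model.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for BPAs over frozenset focal sets."""
    combined, conflict = {}, 0.0
    for A, mA in m1.items():
        for B, mB in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mA * mB
            else:
                conflict += mA * mB  # mass assigned to the empty set
    k = 1.0 - conflict               # normalization constant
    return {A: m / k for A, m in combined.items()}

# Two sensors reporting on {target (T), no target (N)}.
T, N, TN = frozenset("T"), frozenset("N"), frozenset("TN")
s1 = {T: 0.6, TN: 0.4}
s2 = {T: 0.5, N: 0.2, TN: 0.3}
print(dempster_combine(s1, s2))  # fused belief sharpens toward T
```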


Limit of the Maximum Random Permutation Set Entropy

arXiv.org Artificial Intelligence

The Random Permutation Set (RPS) is a recently proposed type of set that can be regarded as a generalization of evidence theory. To measure the uncertainty of an RPS, the entropy of RPS and its corresponding maximum entropy have been proposed, and exploring the maximum entropy offers a possible way to understand the physical meaning of RPS. In this paper, a new concept, the envelope of the entropy function, is defined, and the limit of the envelope of RPS entropy is derived and proved. Compared with the existing method, the proposed method greatly reduces the computational complexity of calculating the envelope of RPS entropy. The result shows that as $N \to \infty$, the limit form of the envelope of RPS entropy converges to $e \times (N!)^2$, which is closely connected to the constant $e$ and the factorial. Finally, numerical examples validate the efficiency and conciseness of the proposed envelope, providing new insight into the maximum entropy function.
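As a numerical sanity check of the stated limit, the sketch below evaluates the argument of the maximum RPS entropy in the form reported in the RPS-entropy literature, with F(i) = sum over k from 0 to i of i!/k!, and compares it against $e \times (N!)^2$. The exact formula is an assumption taken from that literature; the slow drift of the ratio toward 1 is consistent with the stated limit.

```python
from math import e, factorial

def perm(n, k):
    # Number of k-permutations of n elements, P(n, k) = n!/(n-k)!.
    return factorial(n) // factorial(n - k)

def F(i):
    # F(i) = sum_{k=0}^{i} i!/k!, the count used in the RPS entropy.
    return sum(factorial(i) // factorial(k) for k in range(i + 1))

def envelope(N):
    # Argument of the maximum RPS entropy: sum_i P(N, i) * (F(i) - 1).
    return sum(perm(N, i) * (F(i) - 1) for i in range(1, N + 1))

for N in (5, 10, 20, 40, 80):
    ratio = envelope(N) / (e * factorial(N) ** 2)
    print(N, round(ratio, 4))  # ratio -> 1 slowly (roughly 1 + 1/N)
```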


Random Graph Set and Evidence Pattern Reasoning Model

arXiv.org Artificial Intelligence

Evidence theory is widely used in decision-making and reasoning systems. The Transferable Belief Model (TBM) is a commonly used evidential decision-making model, but it encodes no preferences. To better fit decision-making goals, the Evidence Pattern Reasoning Model (EPRM) is proposed: by defining pattern operators and decision-making operators, corresponding preferences can be set for different tasks. The Random Permutation Set (RPS) adds order information to evidence theory, but it is hard for RPS to characterize complex relationships between samples, such as cyclic and parallel relationships. Therefore, the Random Graph Set (RGS) is proposed to model complex relationships and represent more event types. To illustrate the significance of RGS and EPRM, an aircraft velocity ranking experiment was designed and 10,000 cases were simulated. The implementation of EPRM, called Conflict Resolution Decision, improved 18.17% of the cases compared with Mean Velocity Decision, effectively improving aircraft velocity ranking. EPRM provides a unified solution for evidence-based decision making.
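The formal definition of RGS is not reproduced here; the sketch below only illustrates, with hypothetical events, why graph-valued events can encode cycles and parallel branches that permutation events cannot.

```python
# A permutation event can only encode a total order; a directed graph can
# also encode cycles and parallel (incomparable) branches. Minimal sketch:
# events are edge sets, and mass is assigned to each graph-valued event.

chain_event    = frozenset({("a", "b"), ("b", "c")})              # a -> b -> c
cycle_event    = frozenset({("a", "b"), ("b", "c"), ("c", "a")})  # a cycle
parallel_event = frozenset({("a", "b"), ("a", "c")})              # b, c parallel

rgs_mass = {chain_event: 0.5, cycle_event: 0.3, parallel_event: 0.2}

def successors(event, node):
    # Nodes reachable from `node` in one step within a graph-valued event.
    return {v for u, v in event if u == node}

for event, m in rgs_mass.items():
    print(sorted(event), m, "succ(a):", successors(event, "a"))
```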


Differential Convolutional Fuzzy Time Series Forecasting

arXiv.org Artificial Intelligence

Fuzzy time series forecasting (FTSF) is a typical forecasting method with wide application. Traditional FTSF is an expert system, which leaves it unable to recognize undefined features; this is the main reason for its poor forecasting performance. To solve this problem, the proposed Differential Fuzzy Convolutional Neural Network (DFCNN) re-implements FTSF with a convolutional neural network, making it learnable. DFCNN is capable of recognizing potential information and improving forecasting accuracy. Thanks to the learnable ability of the neural network, the fuzzy rules established in FTSF are extended to arbitrary lengths that an expert system cannot handle. At the same time, FTSF usually cannot achieve satisfactory performance on non-stationary time series: their trend invalidates the fuzzy sets established by FTSF and causes forecasting to fail. DFCNN therefore applies differencing to weaken the non-stationarity of a time series, so that it can forecast non-stationary series with low error where FTSF cannot perform satisfactorily. Extensive experiments show that DFCNN achieves an excellent prediction effect, ahead of existing FTSF methods and common time series forecasting algorithms. DFCNN provides further ideas for improving FTSF and holds continued research value.
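A hedged sketch of the two ingredients named here, differencing plus a convolutional matcher, is given below in NumPy. The kernel is a placeholder for weights DFCNN would learn, not the paper's architecture.

```python
import numpy as np

def difference(series, order=1):
    """First-order differencing weakens trend-driven non-stationarity,
    so fuzzy sets fitted on the differenced series stay valid."""
    for _ in range(order):
        series = np.diff(series)
    return series

def conv1d(x, kernel):
    # Valid-mode 1-D convolution standing in for the learnable fuzzy-rule
    # matcher; in DFCNN the kernel weights would be trained.
    return np.convolve(x, kernel[::-1], mode="valid")

t = np.arange(200)
series = 0.05 * t + np.sin(0.3 * t)   # trend plus seasonal component
d = difference(series)                 # roughly stationary
kernel = np.ones(5) / 5                # placeholder for a learned filter
features = conv1d(d, kernel)
# Forecast the next difference, then undo the differencing:
next_diff = features[-1]
print(series[-1] + next_diff)
```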