Rule-Based Reasoning
Dissect Black Box: Interpreting for Rule-Based Explanations in Unsupervised Anomaly Detection, Nengwu Wu, Qing Li
In high-stakes sectors such as network security and IoT security, accurately distinguishing between normal and anomalous data is critical due to the significant implications for operational success and safety in decision-making. The complexity is exacerbated by the presence of unlabeled data and the opaque nature of black-box anomaly detection models, which obscure the rationale behind their predictions. In this paper, we present a novel method to interpret the decision-making processes of these models, which are essential for detecting malicious activities without labeled attack data. We put forward the Segmentation Clustering Decision Tree (SCD-Tree), designed to dissect and understand the structure of normal data distributions.
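The SCD-Tree construction itself is specific to the paper; as a generic, hypothetical illustration of how clustered "normal" data can be turned into human-readable rules, one can fit a surrogate decision tree to cluster labels and read each root-to-leaf path as a rule. The sketch below uses scikit-learn and synthetic data; it is not the paper's method.

```python
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.datasets import make_blobs

# Generic surrogate-tree illustration (not the paper's SCD-Tree):
# 1) cluster unlabeled "normal" data, 2) fit a shallow tree to the
# cluster labels, 3) read each root-to-leaf path as a readable rule.
X, _ = make_blobs(n_samples=500, centers=3, n_features=4, random_state=0)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, clusters)
print(export_text(tree, feature_names=[f"f{i}" for i in range(4)]))
```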
A Dynamic Programs for SSK Evaluations and Gradients
We now detail recursive calculation strategies for computing k(a, b) and its gradients at O(nl²) cost.
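For reference, here is a minimal memoized sketch of the classic SSK recursion of Lodhi et al. (2002). This is the straightforward variant, not the optimized dynamic program with gradients derived in this appendix; lam is the usual match-decay parameter λ, and n is the subsequence length (assumed ≥ 1).

```python
from functools import lru_cache

def ssk(a: str, b: str, n: int, lam: float) -> float:
    """String subsequence kernel k_n(a, b) (Lodhi et al., 2002):
    sums lam**(total matched span) over common subsequences of length n."""

    @lru_cache(maxsize=None)
    def k_prime(i, la, lb):
        # K'_i over the prefixes a[:la], b[:lb]
        if i == 0:
            return 1.0
        if min(la, lb) < i:
            return 0.0
        x = a[la - 1]
        total = lam * k_prime(i, la - 1, lb)   # drop the last char of a
        for j in range(1, lb + 1):             # match x against b[j-1]
            if b[j - 1] == x:
                total += k_prime(i - 1, la - 1, j - 1) * lam ** (lb - j + 2)
        return total

    @lru_cache(maxsize=None)
    def k(la, lb):
        # K_n over the prefixes a[:la], b[:lb]
        if min(la, lb) < n:
            return 0.0
        x = a[la - 1]
        total = k(la - 1, lb)
        for j in range(1, lb + 1):
            if b[j - 1] == x:
                total += k_prime(n - 1, la - 1, j - 1) * lam ** 2
        return total

    return k(len(a), len(b))

print(ssk("science", "silence", n=2, lam=0.5))
```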
A recursive strategy can efficiently calculate the contribution of a particular substring by pre-calculating the contributions of the smaller substrings contained within the target string.

Context-free grammars (CFGs) are 4-tuples G = (V, Σ, R, S), consisting of: a set of non-terminal symbols V; a set of terminal symbols Σ (also known as an alphabet); a set of production rules R; and a non-terminal starting symbol S from which all strings are generated. Production rules are simple maps permitting the swapping of non-terminals for other non-terminals or terminals. Every string generated by the CFG can be broken down into a (non-unique) tree of production rules with the starting symbol S at its head. These trees are known as parse trees and are demonstrated in Figure 3 in the main paper.
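For instance, the following toy grammar (an illustration, not one from the paper) generates the language {aⁿbⁿ : n ≥ 0}; the string aabb has the parse tree given by the derivation S → aSb → aaSbb → aabb:

```latex
G = (V, \Sigma, R, S), \qquad
V = \{S\}, \quad \Sigma = \{a, b\}, \quad
R = \{\, S \to a S b, \;\; S \to \varepsilon \,\}
```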
Probabilistic Logic Neural Networks for Reasoning
Knowledge graph reasoning, which aims at predicting missing facts by reasoning with the observed facts, is critical to many applications. This problem has been widely explored by traditional logic rule-based approaches and recent knowledge graph embedding methods. A principled logic rule-based approach is the Markov Logic Network (MLN), which leverages domain knowledge expressed in first-order logic while handling uncertainty. However, inference in MLNs is usually very difficult due to their complicated graph structures.
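To make the MLN semantics concrete: an MLN defines P(x) = exp(Σᵢ wᵢ nᵢ(x)) / Z, where nᵢ(x) counts the true groundings of formula i in world x. The sketch below computes this by brute-force enumeration for the standard "smokers" toy rule; the rule weight and domain are hypothetical, and this illustrates only the textbook definition, not the paper's inference method.

```python
import itertools
import math

# Toy MLN with one weighted rule over a two-constant domain:
#   w : Smokes(x) ^ Friends(x, y) => Smokes(y)
people = ["anna", "bob"]
w = 1.5  # rule weight (hypothetical)

def n_true_groundings(smokes, friends):
    """Count true groundings of Smokes(x) ^ Friends(x, y) => Smokes(y)."""
    count = 0
    for x, y in itertools.product(people, repeat=2):
        count += (not (smokes[x] and friends[(x, y)])) or smokes[y]
    return count

def unnormalized_prob(smokes, friends):
    # P(world) is proportional to exp(sum_i w_i * n_i(world))
    return math.exp(w * n_true_groundings(smokes, friends))

# Enumerate all worlds over Smokes for a fixed friendship graph
friends = {(x, y): (x != y) for x in people for y in people}
worlds = []
for bits in itertools.product([False, True], repeat=len(people)):
    smokes = dict(zip(people, bits))
    worlds.append((smokes, unnormalized_prob(smokes, friends)))

Z = sum(p for _, p in worlds)  # partition function by enumeration
for smokes, p in worlds:
    print(smokes, round(p / Z, 3))
```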
BoxE: A Box Embedding Model for Knowledge Base Completion
Knowledge base completion (KBC) aims to automatically infer missing facts by exploiting information already present in a knowledge base (KB). A promising approach for KBC is to embed knowledge into latent spaces and make predictions from learned embeddings. However, existing embedding models are subject to at least one of the following limitations: (1) theoretical inexpressivity, (2) lack of support for prominent inference patterns (e.g., hierarchies), (3) lack of support for KBC over higher-arity relations, and (4) lack of support for incorporating logical rules. Here, we propose a spatio-translational embedding model, called BoxE, that simultaneously addresses all these limitations. BoxE embeds entities as points, and relations as a set of hyper-rectangles (or boxes), which spatially characterize basic logical properties. This seemingly simple abstraction yields a fully expressive model offering a natural encoding for many desired logical properties. BoxE can both capture and inject rules from rich classes of rule languages, going well beyond individual inference patterns. By design, BoxE naturally applies to higher-arity KBs. We conduct a detailed experimental analysis, and show that BoxE achieves state-of-the-art performance, both on benchmark knowledge graphs and on more general KBs, and we empirically show the power of integrating logical rules.
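A simplified numerical sketch of the spatio-translational idea for binary facts: each entity's point, translated ("bumped") by the other argument, should land inside the relation's box for that argument position. BoxE's actual distance function additionally rescales by box width inside and outside the box and learns all parameters; the 2-D vectors below are made up for illustration.

```python
import numpy as np

def dist_to_box(point, lower, upper):
    """L1 distance from a point to an axis-aligned box (0 if inside).
    Simplified: real BoxE rescales inside/outside distances by box width."""
    return np.abs(point - np.clip(point, lower, upper)).sum()

def boxe_score(head, tail, bump_head, bump_tail, r_boxes):
    """Score a fact r(head, tail); lower score = more plausible."""
    (lo1, up1), (lo2, up2) = r_boxes  # one box per argument position
    p1 = head + bump_tail  # head point bumped by the tail entity
    p2 = tail + bump_head  # tail point bumped by the head entity
    return dist_to_box(p1, lo1, up1) + dist_to_box(p2, lo2, up2)

# Hypothetical 2-D embeddings
head, bump_head = np.array([0.2, 0.1]), np.array([0.0, 0.3])
tail, bump_tail = np.array([0.5, 0.4]), np.array([0.1, 0.0])
r_boxes = ((np.array([0.0, 0.0]), np.array([1.0, 1.0])),
           (np.array([0.0, 0.0]), np.array([1.0, 1.0])))
print(boxe_score(head, tail, bump_head, bump_tail, r_boxes))  # 0.0: inside
```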
SEN. JEANNE SHAHEEN: If Trump wants a Ukraine deal, he should reread his own book
Since his first day in office, President Donald Trump has mismanaged negotiations over an end to the war in Ukraine. More than 100 days later, innocent Ukrainians are still dying while the president gets played by Russian President Vladimir Putin, as illustrated starkly by the barrages of drones and missiles continually aimed at Ukrainian cities while Trump posts online. It's good to hear Trump finally express some frustration toward Putin and admit that his negotiating tactics aren't working, that, as he says, Putin is "just tapping me along, and has to be dealt with differently." The reasons for this aren't complicated. Instead of increasing his leverage over Russia, Trump offered concession after concession before talks even began.
Large Language Models-guided Dynamic Adaptation for Temporal Knowledge Graph Reasoning
Temporal Knowledge Graph Reasoning (TKGR) is the process of utilizing temporal information to capture complex relations within a Temporal Knowledge Graph (TKG) in order to infer new knowledge. Conventional methods in TKGR typically depend on deep learning algorithms or temporal logical rules. However, deep learning-based TKGRs often lack interpretability, whereas rule-based TKGRs struggle to effectively learn temporal rules that capture temporal patterns. Recently, Large Language Models (LLMs) have demonstrated extensive knowledge and remarkable proficiency in temporal reasoning. Consequently, the employment of LLMs for TKGR has sparked increasing interest among researchers.
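For concreteness, a TKG can be viewed as a set of (subject, relation, object, timestamp) quadruples, and a temporal rule infers new quadruples from existing ones. The toy sketch below applies one hand-written rule; the entities, relations, and rule are hypothetical stand-ins for the learned rules that rule-based TKGRs produce.

```python
from datetime import date

# Toy temporal KG as (subject, relation, object, timestamp) quadruples
tkg = [
    ("alice", "worksFor", "acme", date(2020, 1, 1)),
    ("acme", "locatedIn", "paris", date(2019, 6, 1)),
]

def apply_rule(quads, query_time):
    """Temporal rule: worksFor(x, y, t1) ^ locatedIn(y, z, t2),
    with t1, t2 <= t, implies livesIn(x, z, t)."""
    inferred = []
    for s1, r1, o1, t1 in quads:
        for s2, r2, o2, t2 in quads:
            if (r1 == "worksFor" and r2 == "locatedIn" and o1 == s2
                    and t1 <= query_time and t2 <= query_time):
                inferred.append((s1, "livesIn", o2, query_time))
    return inferred

print(apply_rule(tkg, date(2021, 1, 1)))
# [('alice', 'livesIn', 'paris', datetime.date(2021, 1, 1))]
```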
Learning Rule-Induced Subgraph Representations for Inductive Relation Prediction, Jie Wang
Inductive relation prediction (IRP)--where entities can differ between training and inference--has shown great power for completing evolving knowledge graphs. Existing works mainly focus on using graph neural networks (GNNs) to learn the representation of the subgraph induced from the target link, which can be seen as an implicit rule-mining process for measuring the plausibility of the target link. However, these methods cannot differentiate the target link from other links during message passing, so the final subgraph representation contains rule information irrelevant to the target link, which reduces reasoning performance and severely hinders application to real-world scenarios. To tackle this problem, we propose a novel single-source edge-wise GNN model to learn Rule-inducEd Subgraph represenTations (REST), which encodes relevant rules and eliminates irrelevant rules within the subgraph. Specifically, we propose a single-source initialization approach that initializes edge features only for the target link, which guarantees the relevance of the mined rules to the target link. We then propose several RNN-based functions for edge-wise message passing to model the sequential property of mined rules. REST is a simple and effective approach with theoretical support for learning the rule-induced subgraph representation. Moreover, REST does not need node labeling, which accelerates subgraph preprocessing time by up to 11.66×.
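A minimal sketch of what single-source edge-wise initialization could look like, as an illustration of the idea stated in the abstract rather than REST's actual implementation; the relation embedding table and edges are hypothetical.

```python
import numpy as np

def init_edge_features(edges, target_idx, rel_emb, dim):
    """Single-source initialization: only the target link's edge feature
    is set to its relation embedding; every other edge starts at zero,
    so messages during propagation can only originate from the target."""
    feats = np.zeros((len(edges), dim))
    _, rel, _ = edges[target_idx]  # edges are (head, relation, tail)
    feats[target_idx] = rel_emb[rel]
    return feats

rel_emb = {"friendOf": np.ones(4)}  # hypothetical embedding table
edges = [("a", "friendOf", "b"), ("b", "friendOf", "c")]
print(init_edge_features(edges, target_idx=0, rel_emb=rel_emb, dim=4))
```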
Multi-value Rule Sets for Interpretable Classification with Feature-Efficient Representations
We present the Multi-value Rule Set (MRS) for interpretable classification with feature-efficient representations. Compared to rule sets built from single-value rules, MRS adopts a more generalized form of association rules that allows multiple values in a condition. Rules of this form are more concise than classical single-value rules in capturing and describing patterns in data. Our formulation also pursues a higher efficiency of feature utilization, which reduces the possible cost of data collection and storage. We propose a Bayesian framework for formulating an MRS model and develop an efficient inference method for learning a maximum a posteriori model, incorporating theoretically grounded bounds to iteratively reduce the search space and improve search efficiency. Experiments on synthetic and real-world data demonstrate that MRS models have significantly smaller complexity and fewer features than baseline models while being competitive in predictive accuracy. Human evaluations show that MRS is easier to understand and use compared to other rule-based models.
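To illustrate the rule form, a multi-value rule allows a set of values per condition, replacing several single-value rules with one concise condition. The toy rule and records below are hypothetical, not from the paper.

```python
# Multi-value rule: IF color in {red, blue} AND size in {S, M} THEN class 1.
# A single-value rule set would need one rule per value combination.
rule = {"color": {"red", "blue"}, "size": {"S", "M"}}

def rule_covers(rule, record):
    """A record satisfies a multi-value rule if, for every condition,
    its feature value lies in the condition's allowed value set."""
    return all(record[feat] in values for feat, values in rule.items())

records = [
    {"color": "red", "size": "M"},    # covered
    {"color": "green", "size": "S"},  # not covered (color fails)
]
for r in records:
    print(r, "->", rule_covers(rule, r))
```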