BoxE: A Box Embedding Model for Knowledge Base Completion
Knowledge base completion (KBC) aims to automatically infer missing facts by exploiting information already present in a knowledge base (KB). A promising approach for KBC is to embed knowledge into latent spaces and make predictions from learned embeddings. However, existing embedding models are subject to at least one of the following limitations: (1) theoretical inexpressivity, (2) lack of support for prominent inference patterns (e.g., hierarchies), (3) lack of support for KBC over higher-arity relations, and (4) lack of support for incorporating logical rules. Here, we propose a spatio-translational embedding model, called BoxE, that simultaneously addresses all these limitations. BoxE embeds entities as points, and relations as a set of hyper-rectangles (or boxes), which spatially characterize basic logical properties. This seemingly simple abstraction yields a fully expressive model offering a natural encoding for many desired logical properties. BoxE can both capture and inject rules from rich classes of rule languages, going well beyond individual inference patterns. By design, BoxE naturally applies to higher-arity KBs. We conduct a detailed experimental analysis, and show that BoxE achieves state-of-the-art performance, both on benchmark knowledge graphs and on more general KBs, and we empirically show the power of integrating logical rules.
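To make the abstraction above concrete: in a model of this kind, a fact r(h, t) can be scored by checking whether each entity's "final" embedding, its base point shifted by the other argument's translational bump, falls inside the box of r for that argument position. The following is a minimal illustrative sketch, not the authors' implementation; the hard in/out membership test (in place of the paper's graded distance function) and all names are assumptions for illustration.

```python
import numpy as np

# Toy setup: each entity has a base point and a translational bump;
# each binary relation has two axis-aligned boxes (one per argument position),
# each stored as a (lower corner, upper corner) pair.
def in_box(point, low, high):
    """True iff `point` lies inside the axis-aligned box [low, high]."""
    return bool(np.all((point >= low) & (point <= high)))

def fact_holds(r_boxes, e_base, e_bump, h, t):
    """Check r(h, t): h's point bumped by t must land in box 1 of r,
    and t's point bumped by h must land in box 2 of r."""
    (lo1, hi1), (lo2, hi2) = r_boxes
    return (in_box(e_base[h] + e_bump[t], lo1, hi1)
            and in_box(e_base[t] + e_bump[h], lo2, hi2))

# Two entities in 2D; one relation whose boxes cover the square [-1, 1]^2.
e_base = {"a": np.array([0.2, 0.1]), "b": np.array([-0.3, 0.4])}
e_bump = {"a": np.array([0.1, 0.0]), "b": np.array([0.0, -0.1])}
r_boxes = ((np.array([-1.0, -1.0]), np.array([1.0, 1.0])),
           (np.array([-1.0, -1.0]), np.array([1.0, 1.0])))

print(fact_holds(r_boxes, e_base, e_bump, "a", "b"))  # True: both bumped points fall inside
```

Because each entity's final position depends on which other entity it co-occurs with, the same entity can satisfy a relation with one partner and violate it with another, which is what lets boxes characterize logical properties spatially.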
Author Response for NeurIPS paper: BoxE: A Box Embedding Model for Knowledge Base Completion
We thank the reviewers for their valuable and insightful feedback, and respond to their comments and questions below. We are pleased that the different conceptual aspects of BoxE are clear, and that our experiments are [...].
Model expressivity and compression: The bound in Theorem 5.1 is a worst-case bound that is only tight when all KB [...]. In fact, higher-arity experiments (see Section 6.2) are [...]. Furthermore, we have evaluated model robustness in Appendix H.1, and [...]. We use the Adam optimizer, and hyper-parameters (including negative samples) are given in Table 6. We will mention this in the paper.
Novelty of the model: BoxE is substantially different from any existing box model. We will make these differences more explicit in the paper.
Review for NeurIPS paper: BoxE: A Box Embedding Model for Knowledge Base Completion
Additional Feedback: Please number ALL equations for easy reference, at least in the preliminary submission.
L139: Translational bumps are certainly very expressive, but a likely first reaction is that they are too expressive. Perhaps you need a couple of sentences right here on how you control their power.
L153: "for the sample KG, there are 4^2 potential configurations" — There are four entities and two binary relations. For each relation, each slot can be occupied by any one of four entities (assuming selectively reflexive and symmetric relations are allowed).
Review for NeurIPS paper: BoxE: A Box Embedding Model for Knowledge Base Completion
The paper aims to improve knowledge base modelling. In this regard, the authors propose a rather ingenious use of box embeddings as the latent representation for the relations. Specifically, each n-ary relation is represented by n boxes, and each entity is represented by two vectors. Having a pair of vectors is very powerful, as it allows the model to capture complex interactions across entities. In particular, the authors show how their proposed box embeddings can simultaneously handle symmetry, asymmetry, anti-symmetry, and transitivity. No previous framework is claimed to be as flexible, nor as capable of handling all these patterns.
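The pattern-handling the reviewer highlights can be made concrete for symmetry: in a box model of this kind, r(h, t) and r(t, h) impose exactly the same membership constraints when the relation's two argument boxes coincide, so a symmetric relation can be encoded by sharing one box across both positions. The sketch below is a hedged illustration under that toy encoding, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical encoding: a binary relation is a pair of axis-aligned boxes,
# one per argument position, each stored as (lower corner, upper corner).
def boxes_equal(box_a, box_b):
    return np.allclose(box_a[0], box_b[0]) and np.allclose(box_a[1], box_b[1])

def is_symmetric(relation):
    """In this toy encoding, swapping the arguments of r swaps which box each
    bumped point must lie in, so r is symmetric exactly when its boxes coincide."""
    box1, box2 = relation
    return boxes_equal(box1, box2)

sym = ((np.array([0.0, 0.0]), np.array([1.0, 1.0])),
       (np.array([0.0, 0.0]), np.array([1.0, 1.0])))
asym = ((np.array([0.0, 0.0]), np.array([1.0, 1.0])),
        (np.array([2.0, 2.0]), np.array([3.0, 3.0])))

print(is_symmetric(sym), is_symmetric(asym))  # True False
```

Analogous spatial conditions (e.g., disjoint or nested boxes) can encode other inference patterns, which is why a single representation covers the whole list the reviewer mentions.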