Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification
Hongyuan Mei, Guanghui Qin, Minjie Xu, Jason Eisner
arXiv.org Artificial Intelligence
Learning how to predict future events from patterns of past events is difficult when the set of possible event types is large. Training an unrestricted neural model might overfit to spurious patterns. To exploit domain-specific knowledge of how past events might affect an event's present probability, we propose using a temporal deductive database to track structured facts over time. Rules serve to prove facts from other facts and from past events. Each fact has a time-varying state: a vector computed by a neural net whose topology is determined by the fact's provenance, including its experience of past events. The possible event types at any time are given by special facts, whose probabilities are neurally modeled alongside their states. In both synthetic and real-world domains, we show that neural probabilistic models derived from concise Datalog programs improve prediction by encoding appropriate domain knowledge in their architecture.
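The mechanism the abstract describes can be illustrated with a toy sketch: facts in a database each carry a state vector, past events update the states of the facts they affect, and the probability of an event-type fact is read off its current state. This is a minimal illustration only, not the authors' implementation; the class name `ToyTemporalDB`, the fixed `tanh` update, the logistic read-out, and the soccer-flavored fact and event names are all hypothetical stand-ins for the paper's learned neural components.

```python
import math

class ToyTemporalDB:
    """Toy stand-in for a temporal deductive database: each proved fact
    carries a state vector that past events can update. (Illustrative
    sketch only; the paper learns these updates with neural nets.)"""

    def __init__(self, dim=4):
        self.dim = dim
        self.facts = {}  # fact name -> state vector (list of floats)

    def assert_fact(self, fact):
        # A newly proved fact starts with a zero state vector.
        self.facts.setdefault(fact, [0.0] * self.dim)

    def observe(self, event, affected_facts):
        # A past event updates the states of the facts it affects.
        # Here a fixed tanh bump stands in for a learned LSTM-style update.
        for fact in affected_facts:
            state = self.facts[fact]
            self.facts[fact] = [math.tanh(s + 0.5) for s in state]

    def event_probability(self, fact):
        # An event-type fact's probability is modeled from its state;
        # here, a fixed-weight logistic read-out for illustration.
        score = sum(self.facts[fact])
        return 1.0 / (1.0 + math.exp(-score))

db = ToyTemporalDB()
db.assert_fact("can_shoot(alice)")          # hypothetical event-type fact
p_before = db.event_probability("can_shoot(alice)")
db.observe("pass(bob, alice)", ["can_shoot(alice)"])
p_after = db.event_probability("can_shoot(alice)")
print(p_before, p_after)                    # probability rises after the event
```

In the paper itself, which facts exist, which events affect them, and how their provenance shapes the neural net's topology are all specified by Datalog rules rather than hard-coded as above.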
Aug-16-2020
- Country:
- North America > United States (1.00)
- Genre:
- Research Report > New Finding (0.46)
- Industry:
- Leisure & Entertainment > Sports > Soccer (0.46)
- Technology:
  - Information Technology > Artificial Intelligence
    - Cognitive Science (1.00)
    - Machine Learning
      - Learning Graphical Models
        - Directed Networks > Bayesian Learning (0.67)
        - Undirected Networks > Markov Models (0.92)
      - Neural Networks > Deep Learning (0.69)
      - Statistical Learning (1.00)
    - Natural Language (1.00)
    - Representation & Reasoning > Uncertainty (0.87)