Distributional Semantic Features as Semantic Primitives — Or Not

AAAI Conferences

We argue that distributional semantics can serve as the basis for a semantic representation of words and phrases that fulfills many of the purposes semantic primitives were designed for, without running into many of their philosophical, empirical, and practical problems.
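As a toy illustration of the core idea (a minimal sketch, not the paper's code, with hypothetical co-occurrence counts), the snippet below represents words as distributional vectors and replaces membership in a fixed inventory of primitives with graded cosine similarity:

```python
import numpy as np

# Hypothetical co-occurrence counts over three context words.
vectors = {
    "cat": np.array([8.0, 2.0, 0.0]),
    "dog": np.array([7.0, 3.0, 0.0]),
    "car": np.array([0.0, 1.0, 9.0]),
}

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two distributional vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vectors["cat"], vectors["dog"]))  # high: shared contexts
print(cosine(vectors["cat"], vectors["car"]))  # low: disjoint contexts
```

Similarity here is graded rather than categorical, which is what lets the representation sidestep the commitment to a fixed set of primitives.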


Distributional-Relational Models: Scalable Semantics for Databases

AAAI Conferences

The crisp, brittle semantic model behind databases limits the scale at which data consumers can query, explore, integrate, and process structured data. Purely logic-based approaches that aim to provide more comprehensive semantic models for databases (e.g., Semantic Web databases) face major scalability limitations in the acquisition of structured semantic and commonsense data. This work describes a complementary semantic model for databases that places semantic approximation at its center. The model uses distributional semantic models (DSMs) to extend structured data semantics. DSMs support the automatic construction of semantic and commonsense models from large-scale unstructured text and provide a simple way to analyze similarities in the structured data. The combination of distributional and structured data semantics offers a simple and promising solution to the challenges of interacting with and processing structured data.
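The sketch below illustrates what semantic approximation over a schema could look like (assumed embeddings and attribute names are illustrative only; the paper's actual DSM construction from corpora differs): a query term is resolved to the nearest schema attribute by distributional similarity instead of exact string matching.

```python
import numpy as np

# Hypothetical distributional embeddings for schema attributes and a
# query term; in practice these would come from a DSM built over a
# large text corpus.
attributes = {
    "salary":     np.array([0.90, 0.10, 0.00]),
    "birthplace": np.array([0.00, 0.80, 0.30]),
}
query_term, query_vec = "wage", np.array([0.85, 0.15, 0.05])

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Resolve the query vocabulary onto the schema by nearest
# distributional neighbour rather than exact symbol match.
best = max(attributes, key=lambda a: cosine(query_vec, attributes[a]))
print(f"'{query_term}' ~ attribute '{best}'")  # 'wage' ~ 'salary'
```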


Distributional Relational Networks

AAAI Conferences

This work introduces distributional relational networks (DRNs), a knowledge representation (KR) framework that supports semantic approximation over large-scale, heterogeneous knowledge bases. The proposed model uses the distributional semantic information embedded in large text and data corpora to provide a comprehensive and principled solution for semantic approximation. DRNs can be applied to open-domain knowledge bases and can serve as a KR model for commonsense reasoning. Experimental results show the suitability of DRNs as a semantically flexible KR framework.
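One way to picture semantic approximation at the knowledge-base level (a hedged sketch with invented relation names and vectors, not the DRN formalism itself) is approximate triple matching: a query whose relation is absent from the KB vocabulary is answered by ranking KB triples by the distributional similarity of their relations.

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical distributional vectors for KB relation terms.
vec = {
    "worksFor":   np.array([0.80, 0.10, 0.30]),
    "employedBy": np.array([0.75, 0.20, 0.25]),
    "bornIn":     np.array([0.10, 0.90, 0.20]),
}

# KB triples (subject, relation, object).
kb = [("alice", "employedBy", "acme"), ("alice", "bornIn", "dublin")]

# 'worksFor' never occurs in the KB, yet the query still resolves via
# the relation's nearest distributional neighbour.
query_relation = "worksFor"
ranked = sorted(kb, key=lambda t: cosine(vec[query_relation], vec[t[1]]),
                reverse=True)
print(ranked[0])  # ('alice', 'employedBy', 'acme')
```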


Manfred: Slow Free Agent Market Is 'Distributional Problem'

U.S. News

Manfred's comments came as owners and players neared an agreement to allow teams to carry an extra player for most of the season starting in 2020 and to make other changes to the way rosters are managed. The deal would include a limit on September call-ups -- lowering the maximum roster size from 40 players to 28 -- and a limit on the number of pitchers, designed to end the parade of relievers that can cause games to run long. Baseball also agreed not to pursue a pitch clock through the 2021 season.


Nonlinear Distributional Gradient Temporal-Difference Learning

arXiv.org Artificial Intelligence

We devise a distributional variant of gradient temporal-difference (TD) learning. Distributional reinforcement learning has been shown to outperform its standard counterpart in a recent study \citep{bellemare2017distributional}. In this paper, we design two new algorithms, distributional GTD2 and distributional TDC, using the Cram{\'e}r distance on the distributional version of the Bellman error objective function; they inherit the advantages of both the nonlinear gradient TD algorithms and the distributional RL approach. We prove asymptotic almost-sure convergence to a local optimum for general smooth function approximators, including the neural networks widely used in recent work to solve real-life RL problems. The per-step computational complexity is linear in the number of parameters of the function approximator, so the algorithms can be implemented efficiently for neural networks.
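As a hedged illustration of the metric these objectives are built on (assumed discretization details, not the paper's implementation), the sketch below computes the Cramér distance between two categorical return distributions on a shared, evenly spaced support: the l2 distance between their CDFs.

```python
import numpy as np

def cramer_distance(p: np.ndarray, q: np.ndarray, dz: float) -> float:
    """Cramer distance: sqrt of the integral of (F_p - F_q)^2,
    approximated on an evenly spaced support with atom spacing dz."""
    cdf_gap = np.cumsum(p) - np.cumsum(q)
    return float(np.sqrt(np.sum(cdf_gap ** 2) * dz))

# Two toy return distributions over 5 evenly spaced atoms.
support = np.linspace(0.0, 4.0, 5)        # atom locations
dz = support[1] - support[0]
p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # predicted distribution
q = np.array([0.0, 0.1, 0.3, 0.4, 0.2])   # distributional TD target
print(cramer_distance(p, q, dz))          # ~0.387
```

Unlike the KL divergence used in earlier distributional RL work, the Cramér distance remains finite for distributions with disjoint support, which is part of what makes it attractive as a Bellman error objective.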