Enriching Word Embeddings Using Knowledge Graph for Semantic Tagging in Conversational Dialog Systems
Celikyilmaz, Asli (Microsoft) | Hakkani-Tur, Dilek (Microsoft Research) | Pasupat, Panupong (Stanford University) | Sarikaya, Ruhi (Microsoft)
Unsupervised word embeddings provide rich linguistic and conceptual information about words. However, they may provide only weak information about domain-specific semantic relations for tasks such as semantic parsing of natural language queries, where such information can be valuable. To encode prior knowledge about semantic word relations, we present a new method: we extend the neural-network-based lexical word embedding objective function of Mikolov et al. (2013) by incorporating information about relationships between entities that we extract from knowledge bases. Our model jointly learns lexical word representations from free text, enriched by relational word embeddings learned from relational data (e.g., Freebase) for each type of entity relation. On the task of semantic tagging of natural language queries, we empirically show that our enriched embeddings capture not only short-range syntactic dependencies but also long-range semantic dependencies between words. Using the enriched embeddings, we obtain an average 2% improvement in F-score over the previous baselines.
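As a rough illustration of the kind of joint objective the abstract describes, the sketch below trains skip-gram-style lexical embeddings on a toy corpus and adds a simple relational term that pulls together words linked by a knowledge-base relation. The toy vocabulary, the relation pairs, the negative-sampling setup, and the weighting parameter lam are illustrative assumptions, not the authors' actual model or data.

    # Minimal sketch: jointly training lexical (skip-gram style) and relational
    # embeddings. All names, the toy corpus, and the loss weighting are assumed
    # for illustration; they are not the paper's implementation.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy vocabulary, corpus, and knowledge-base relation pairs (assumed data).
    vocab = ["book", "flight", "to", "seattle", "boston", "city", "airport"]
    word2id = {w: i for i, w in enumerate(vocab)}
    corpus = ["book", "flight", "to", "seattle", "book", "flight", "to", "boston"]
    # (head, tail) pairs sharing a relation, e.g. an "is-a city" relation.
    relation_pairs = [("seattle", "city"), ("boston", "city")]

    dim, window, lr, lam = 16, 2, 0.05, 0.5   # lam weights the relational term
    W_in = rng.normal(scale=0.1, size=(len(vocab), dim))   # input (word) vectors
    W_out = rng.normal(scale=0.1, size=(len(vocab), dim))  # output (context) vectors

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    for epoch in range(50):
        # Lexical term: skip-gram with one random negative sample per context word
        # (the negative may occasionally coincide with the true context in this toy setup).
        for pos, word in enumerate(corpus):
            w = word2id[word]
            for ctx_pos in range(max(0, pos - window), min(len(corpus), pos + window + 1)):
                if ctx_pos == pos:
                    continue
                c = word2id[corpus[ctx_pos]]
                neg = rng.integers(len(vocab))
                for target, label in ((c, 1.0), (neg, 0.0)):
                    score = sigmoid(W_in[w] @ W_out[target])
                    grad = lr * (label - score)
                    dw = grad * W_out[target]
                    W_out[target] += grad * W_in[w]
                    W_in[w] += dw
        # Relational term: pull embeddings of related entities toward each other,
        # a stand-in for encoding knowledge-base relations alongside free text.
        for head, tail in relation_pairs:
            h, t = word2id[head], word2id[tail]
            diff = W_in[h] - W_in[t]
            W_in[h] -= lr * lam * diff
            W_in[t] += lr * lam * diff

    # Entities sharing a relation ("seattle", "boston") should end up more similar.
    print("similarity(seattle, boston):",
          W_in[word2id["seattle"]] @ W_in[word2id["boston"]])

In this sketch the relational term is a simple attraction between related entities; the paper's approach learns relation-specific embeddings jointly with the lexical objective, which this toy example only approximates.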
Mar-16-2015