Evaluating KGR10 Polish word embeddings in the recognition of temporal expressions using BiLSTM-CRF
Recent studies in the information extraction domain (and in other natural language processing fields) show that deep learning models produce state-of-the-art results [38]. Deep architectures employ multiple layers to learn hierarchical representations of the input data. In the last few years, neural networks based on dense vector representations have provided the best results in various NLP tasks, including named entity recognition [32], semantic role labelling [6], question answering [39] and multitask learning [4]. The core element of most deep learning solutions is the dense distributed semantic representation of words, often called word embeddings. Distributional vectors follow the distributional hypothesis: words with similar meanings tend to appear in similar contexts.
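As a minimal illustration of this view of word embeddings, the sketch below loads pretrained vectors in word2vec text format and queries nearest neighbours in the embedding space; the file name kgr10.vec is a hypothetical placeholder, and gensim is used only as one common way to read such vectors, not necessarily the toolchain used in this work.

# A minimal sketch, assuming pretrained embeddings in word2vec text format.
# The file name "kgr10.vec" is hypothetical, not the actual KGR10 distribution.
from gensim.models import KeyedVectors

# Load dense word vectors; each word maps to a fixed-size real-valued vector.
vectors = KeyedVectors.load_word2vec_format("kgr10.vec", binary=False)

# Under the distributional hypothesis, words appearing in similar contexts
# get nearby vectors, so nearest neighbours tend to be semantically related.
for word, score in vectors.most_similar("poniedzialek", topn=5):  # "Monday"
    print(word, round(score, 3))

Cosine similarity over such vectors is what downstream models like a BiLSTM-CRF implicitly exploit: the embedding layer supplies semantically meaningful input features instead of sparse one-hot codes.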