Word Embeddings
A word embedding is a representation of a word as a vector, i.e. a sequence of numbers. These vectors typically encode how the word is used alongside other words in a dataset. Both the encoding technique and the dataset vary widely, and the right choice depends on the use case. Word embeddings are ubiquitous in NLP and machine learning because they let computers, and mathematical models, reason about words. A computer sees a word only as a sequence of individual characters, which says little about the word's semantic or syntactic role in a language.
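As a minimal sketch of the idea, the toy example below (not a production method; the corpus, window size, and raw co-occurrence counts are illustrative assumptions) builds each word's vector from how often it appears next to other words, then compares vectors with cosine similarity. Words used in similar contexts, like "cat" and "dog" here, end up with similar vectors:

```python
from collections import Counter
from math import sqrt

# Tiny illustrative corpus; real embeddings are trained on far larger datasets.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

tokens = [sentence.split() for sentence in corpus]
vocab = sorted({w for sent in tokens for w in sent})

# Count co-occurrences within a +/-1 word window.
cooc = {w: Counter() for w in vocab}
for sent in tokens:
    for i, w in enumerate(sent):
        for j in (i - 1, i + 1):
            if 0 <= j < len(sent):
                cooc[w][sent[j]] += 1

def embedding(word):
    """A word's vector: its co-occurrence counts over the whole vocabulary."""
    return [cooc[word][v] for v in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# "cat" and "dog" appear in similar contexts, so their vectors align more
# closely than "cat" and a function word like "on".
print(cosine(embedding("cat"), embedding("dog")))
print(cosine(embedding("cat"), embedding("on")))
```

Modern techniques (e.g. word2vec or GloVe) replace these raw counts with learned, dense, low-dimensional vectors, but the underlying intuition, that a word is characterized by the company it keeps, is the same.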
Aug-31-2021, 14:18:34 GMT