Word embeddings are models that generate computer-friendly numeric vector representations for words. In this story, we will visualise word embedding vectors to understand the relations between words that the embeddings capture. The focus is on word2vec and BERT. This story does not aim to explain the embeddings themselves; for that, I suggest reading a separate introduction (like this). This story is part of my journey to develop Neural Machine Translation (NMT) using BERT contextualised embedding vectors.
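As a minimal sketch of the idea, the snippet below uses hand-made toy vectors (not a trained word2vec or BERT model) to show the two operations the story relies on: measuring relatedness with cosine similarity and projecting high-dimensional vectors down to 2-D (here via an SVD-based PCA) so they can be plotted. All words and vector values are illustrative assumptions.

```python
import numpy as np

# Hypothetical 4-d embedding vectors; a real model would produce these.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.2, 0.9, 0.8]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction, 0.0 unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words should score higher than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))   # high
print(cosine(embeddings["king"], embeddings["apple"]))   # low

# PCA via SVD: centre the vectors, keep the top-2 principal directions,
# giving 2-D coordinates suitable for a scatter plot of the words.
words = list(embeddings)
X = np.stack([embeddings[w] for w in words])
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
coords = Xc @ Vt[:2].T  # shape (n_words, 2)
for w, (x, y) in zip(words, coords):
    print(f"{w}: ({x:.3f}, {y:.3f})")
```

With real embeddings, the same projection step is what lets us draw the word clouds shown later in the story.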
Nov-19-2019, 09:58:24 GMT