Visualisation of embedding relations (Word2Vec, BERT)


In this story, we will visualise word embedding vectors to understand the relations between words that the embeddings capture. The story focuses on word2vec [1] and BERT [2]. Word embeddings are models that generate computer-friendly numeric vector representations of words. To understand the embeddings themselves, I suggest reading a separate introduction first, as this story does not aim to describe them. This story is part of my journey to develop Neural Machine Translation (NMT) using BERT contextualised embedding vectors.
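Before visualising anything, the embedding vectors have to be projected down to two dimensions. A minimal sketch of that step, using PCA via NumPy's SVD on a handful of toy vectors (the words and their values here are hypothetical stand-ins for real word2vec or BERT outputs, not actual model weights):

```python
import numpy as np

# Toy embedding vectors (hypothetical stand-ins for word2vec/BERT outputs).
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.0, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.1, 0.2]),
    "man":   np.array([0.9, 0.1, 0.0, 0.0, 0.3]),
    "woman": np.array([0.8, 0.2, 0.0, 0.1, 0.3]),
    "apple": np.array([0.1, 0.0, 0.9, 0.7, 0.1]),
    "pear":  np.array([0.0, 0.1, 0.8, 0.8, 0.1]),
}

def pca_2d(vectors):
    """Project high-dimensional vectors to 2D via PCA (SVD on centred data)."""
    X = np.stack(vectors)
    X = X - X.mean(axis=0)           # centre the data
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:2].T              # coordinates along the top-2 components

words = list(embeddings)
coords = pca_2d([embeddings[w] for w in words])
for w, (x, y) in zip(words, coords):
    print(f"{w:>6}: ({x:+.2f}, {y:+.2f})")
```

The resulting 2D coordinates can then be drawn with any scatter-plot tool (e.g. `matplotlib.pyplot.scatter`), which is where the relations between words become visible: semantically close words end up near each other on the plot.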
