Word Embeddings and RNNs

One of the simplest ways to convert words from a natural language into mathematical tensors is to represent them as one-hot vectors, where the length of each vector equals the size of the vocabulary from which the words are drawn. For example, given a vocabulary of size 8 containing the words "a", "apple", "has", "matrix", "pineapple", "python", "the", "you", the word "matrix" (the fourth word in the vocabulary) can be represented as: [0,0,0,1,0,0,0,0] ...
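As a minimal sketch, the one-hot encoding above can be produced like this (the `one_hot` helper is illustrative, not from any particular library):

```python
# Hypothetical helper: build a one-hot vector for a word over a fixed vocabulary.
vocab = ["a", "apple", "has", "matrix", "pineapple", "python", "the", "you"]

def one_hot(word, vocab):
    """Return a list of 0s with a single 1 at the word's vocabulary index."""
    vec = [0] * len(vocab)
    vec[vocab.index(word)] = 1
    return vec

print(one_hot("matrix", vocab))  # [0, 0, 0, 1, 0, 0, 0, 0]
```

Note that the vector length grows with the vocabulary, so real vocabularies (tens of thousands of words) make these vectors very large and sparse.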

October 20, 2018