Understanding ResNets

I’m currently enrolled in fastai’s Deep Learning MOOC (version 3), and I’m loving it so far. It’s only been 2 lectures as of today, but folks are already building awesome stuff based on the content taught so far. The course starts with the application of DL in Computer Vision, and in the very first lecture, course instructor Jeremy teaches us how to leverage transfer learning by making use of pre-trained ResNet models. I’ve been meaning to dive into the details of ResNets for a while, and this seems like a good time to do so. ...

November 7, 2018

Word Embeddings and RNNs

One of the simplest ways to convert words from a natural language into mathematical tensors is to represent them as one-hot vectors, where the length of each vector equals the size of the vocabulary from which the words are drawn. For example, if we have a vocabulary of size 8 containing the words "a", "apple", "has", "matrix", "pineapple", "python", "the", "you", the word “matrix” can be represented as: [0,0,0,1,0,0,0,0] ...
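The encoding described above can be sketched in a few lines of Python; the `one_hot` helper and the vocabulary list are just illustrative, using the 8-word vocabulary from the example:

```python
# Hypothetical 8-word vocabulary from the example above, in sorted order.
vocab = ["a", "apple", "has", "matrix", "pineapple", "python", "the", "you"]

def one_hot(word, vocab):
    # Build a vector of zeros with a single 1 at the word's vocabulary index.
    vec = [0] * len(vocab)
    vec[vocab.index(word)] = 1
    return vec

print(one_hot("matrix", vocab))  # → [0, 0, 0, 1, 0, 0, 0, 0]
```

Note that the vector length grows linearly with vocabulary size, which is one motivation for the denser word embeddings the post goes on to discuss.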

October 20, 2018