Natural Language Processing

Text generation, language translation, spoken language understanding, sentiment analysis and dialogue systems

Natural Language Processing · Attention and Transformers · Computer Vision

Vision Language models: towards multi-modal deep learning

A review of state-of-the-art vision-language models such as CLIP, DALL-E, ALIGN and SimVLM

Attention and Transformers · Natural Language Processing

Why multi-head self-attention works: math, intuitions and 10+1 hidden insights

Learn everything there is to know about the attention mechanisms of the famous transformer, through 10+1 hidden insights and observations
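
As a rough illustration of the idea (not code from the article), here is a minimal PyTorch sketch of self-attention split across heads; the weight matrices and shapes are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def multi_head_self_attention(x, w_q, w_k, w_v, num_heads):
    """Minimal multi-head self-attention: project, split into heads, attend, merge."""
    batch, seq_len, d_model = x.shape
    head_dim = d_model // num_heads
    # Linear projections to queries, keys and values, split across heads
    q = (x @ w_q).view(batch, seq_len, num_heads, head_dim).transpose(1, 2)
    k = (x @ w_k).view(batch, seq_len, num_heads, head_dim).transpose(1, 2)
    v = (x @ w_v).view(batch, seq_len, num_heads, head_dim).transpose(1, 2)
    # Scaled dot-product attention, computed independently per head
    scores = q @ k.transpose(-2, -1) / head_dim ** 0.5
    weights = F.softmax(scores, dim=-1)
    out = weights @ v                                    # (batch, heads, seq, head_dim)
    return out.transpose(1, 2).reshape(batch, seq_len, d_model)

x = torch.randn(2, 5, 64)                                # (batch, sequence, embedding dim)
w = [torch.randn(64, 64) for _ in range(3)]              # illustrative projection matrices
print(multi_head_self_attention(x, *w, num_heads=8).shape)   # torch.Size([2, 5, 64])
```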

Attention and Transformers · Natural Language Processing · Pytorch

How Positional Embeddings work in Self-Attention (code in Pytorch)

Understand how positional embeddings emerged and how we use them inside self-attention to model highly structured data such as images
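
As a rough companion to the article (not its actual code), a minimal PyTorch sketch of the original fixed sinusoidal encodings added to token embeddings; learned and relative variants, e.g. for images, differ.

```python
import math
import torch

def sinusoidal_positional_encoding(seq_len, d_model):
    """Fixed sinusoidal positional encodings: sine on even dims, cosine on odd dims."""
    position = torch.arange(seq_len).unsqueeze(1).float()
    div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)
    pe[:, 1::2] = torch.cos(position * div_term)
    return pe

# Added to token embeddings before self-attention so the model can use order information
tokens = torch.randn(10, 512)                      # (sequence length, embedding dim)
inputs = tokens + sinusoidal_positional_encoding(10, 512)
print(inputs.shape)                                # torch.Size([10, 512])
```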

Attention and Transformers · Natural Language Processing

How Transformers work in deep learning and NLP: an intuitive introduction

An intuitive understanding of Transformers and how they are used in Machine Translation. After analyzing all the subcomponents one by one, such as self-attention and positional encodings, we explain the principles behind the Encoder and Decoder and why Transformers work so well
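
For a shape-level feel of the Encoder-Decoder setup (an illustrative sketch, not the article's code), PyTorch's built-in nn.Transformer can be called directly; token embeddings, positional encodings and the final projection to target-vocabulary logits are omitted here.

```python
import torch
import torch.nn as nn

# Encoder-decoder Transformer as used in machine translation
model = nn.Transformer(d_model=512, nhead=8, num_encoder_layers=6, num_decoder_layers=6)

src = torch.randn(20, 32, 512)   # source sentence: (src_len, batch, d_model) embeddings
tgt = torch.randn(15, 32, 512)   # shifted target sentence fed to the decoder
out = model(src, tgt)            # decoder states, one vector per target position
print(out.shape)                 # torch.Size([15, 32, 512])
```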

Attention and Transformers · Natural Language Processing

How Attention works in Deep Learning: understanding the attention mechanism in sequence models

New to Natural Language Processing? This is the ultimate beginner’s guide to the attention mechanism and sequence learning to get you started
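
As a bare-bones illustration (an assumption-laden sketch, not the guide's code), one decoding step with dot-product attention over the encoder states looks roughly like this; additive (Bahdanau-style) scoring replaces the dot product with a small learned network.

```python
import torch
import torch.nn.functional as F

def attend(decoder_state, encoder_states):
    """One decoding step: score each encoder state, softmax into weights, return the context vector."""
    scores = encoder_states @ decoder_state   # (src_len,) alignment scores
    weights = F.softmax(scores, dim=0)        # attention distribution over source positions
    context = weights @ encoder_states        # weighted sum of encoder states
    return context, weights

encoder_states = torch.randn(7, 256)   # one hidden state per source token
decoder_state = torch.randn(256)       # current decoder hidden state
context, weights = attend(decoder_state, encoder_states)
print(context.shape, weights.shape)    # torch.Size([256]) torch.Size([7])
```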

Natural Language Processing · Unsupervised Learning · Machine Learning

Document clustering

Use unsupervised learning to cluster documents based on their content
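
One common recipe (a hedged sketch, not necessarily the approach taken in the article) is to represent documents as TF-IDF vectors and cluster them with k-means via scikit-learn.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy corpus; a real run would use the documents you want to group
docs = [
    "the transformer uses self-attention",
    "attention mechanisms in neural translation",
    "k-means groups documents by similarity",
    "unsupervised clustering of text corpora",
]

# Represent each document as a TF-IDF vector, then cluster the vectors
vectors = TfidfVectorizer(stop_words="english").fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print(labels)   # cluster id per document, e.g. [0 0 1 1]
```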