AI Summer is a free educational platform covering research and applied trends in AI and Deep Learning. We provide accessible and comprehensive content from the entire spectrum of AI that aims to bridge the gap between researchers and the public.
Our mission is to simplify complex concepts and drive scientific research. We try to accomplish that by writing highly detailed overviews of recent deep learning developments as well as thorough tutorials on popular frameworks.
But above all, we are a community that seeks to demystify the AI landscape and enable new technological innovations.
Simplified but technically informed overviews of recent research trends and deep learning breakthroughs. Our articles cover both popular concepts in depth and state-of-the-art algorithms.
Thorough and highly detailed tutorials on popular AI libraries and frameworks. We discuss best practices and principles on how to use deep learning architectures in real-life projects.
Clear explanations and step-by-step guides of fundamental architectures and concepts from the machine learning literature. In most cases, code is also available.
An online community that collaborates on novel articles and open-source projects. If you are looking to co-author and publish an article on our platform, join us on Discord.
An Artificial Intelligence hub where you can find and learn anything related to Deep Learning, from fundamental principles to state-of-the-art research and real-life applications.
New to Natural Language Processing? This is the ultimate beginner’s guide to the attention mechanism and sequence learning to get you started
An intuitive understanding of Transformers and how they are used in Machine Translation. After analyzing all the subcomponents one by one, such as self-attention and positional encodings, we explain the principles behind the Encoder and Decoder and why Transformers work so well.
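As a quick taste of the self-attention idea the article covers, here is a minimal sketch of scaled dot-product attention in plain Python. The toy matrices and helper names are our own illustration, not code from the article:

```python
import math

def softmax(row):
    # Numerically stable softmax over a list of scores
    m = max(row)
    exps = [math.exp(s - m) for s in row]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = len(K[0])
    # Score each query against each key, scaled by sqrt(d_k)
    scores = [[sum(q * k for q, k in zip(qrow, krow)) / math.sqrt(d_k)
               for krow in K] for qrow in Q]
    weights = [softmax(row) for row in scores]
    # Each output is a convex combination of the value vectors
    return [[sum(w * V[j][d] for j, w in enumerate(wrow))
             for d in range(len(V[0]))] for wrow in weights]

Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
out = attention(Q, K, V)
```

Because the attention weights sum to one, every output vector stays inside the range spanned by the value vectors, which is the "weighted lookup" intuition behind the mechanism.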
In this article you will learn how the vision transformer works for image classification problems. We distill all the important details you need to grasp, along with the reasons it can work very well given enough data for pretraining.
An intuitive guide on why it is important to inspect the receptive field, as well as how the receptive field affects the design choices of deep convolutional networks.
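The core arithmetic behind the receptive field fits in a few lines: for each conv layer with kernel size k and stride s, the receptive field r grows by (k - 1) jumps, and the jump j (spacing between adjacent feature centers, in input pixels) is multiplied by the stride. A minimal sketch of that recurrence (our own helper, under the standard no-dilation assumption):

```python
def receptive_field(layers):
    """Receptive field of a stack of conv layers.

    layers: list of (kernel_size, stride) tuples, input-to-output order.
    """
    r, j = 1, 1  # a single input pixel sees itself; unit spacing
    for k, s in layers:
        r += (k - 1) * j  # each layer widens the field by (k-1) jumps
        j *= s            # striding spreads the feature centers apart
    return r

# Two stacked 3x3 stride-1 convolutions see a 5x5 input patch,
# which is why stacking small kernels can replace one large kernel.
rf = receptive_field([(3, 1), (3, 1)])
```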
How do convolutional neural networks work? What are the principles behind designing a CNN architecture? How did we go from AlexNet to EfficientNet?
What are skip connections, why do we need them, and how are they applied in architectures such as ResNet, DenseNet and UNet?
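The additive (ResNet-style) variant fits in one line: the block's input is added to its output, y = F(x) + x, so gradients can always flow through the identity path. A toy sketch in plain Python (our own naming, vectors as lists):

```python
def residual_block(x, f):
    """ResNet-style additive skip connection: y = f(x) + x, elementwise."""
    fx = f(x)
    return [a + b for a, b in zip(fx, x)]

# If the learned branch outputs zeros, the block reduces to the identity
# map, which is part of why very deep residual stacks remain trainable.
identity_out = residual_block([1.0, 2.0, 3.0], lambda v: [0.0] * len(v))
```

DenseNet-style skips differ in that they concatenate features instead of adding them, but the motivation (short gradient paths, feature reuse) is the same.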
The basic MRI foundations are presented for tensor representation, along with the basic components for applying a deep learning method that handles the task-specific problems (class imbalance, limited data). Moreover, we present some features of the open-source medical image segmentation library. Finally, we discuss our preliminary experimental results and provide sources to find medical imaging data.
Learn how to apply 3D transformations for medical image preprocessing and augmentation, to set up your awesome deep learning pipeline.
How can deep learning revolutionize medical image analysis beyond segmentation? In this article, we will look at a couple of interesting applications in medical imaging, such as image reconstruction, image synthesis, super-resolution, and registration.
What is Kubernetes? What are the basic principles behind it? Why might it be the best option for deploying Machine Learning applications? What features does it provide to help us maintain and scale our infrastructure? How can you set up a simple Kubernetes cluster on Google Cloud?
Learn how to containerize a deep learning model using Docker. Start with the basic concepts behind containers, package a TensorFlow application with Docker, and combine multiple images using Docker Compose.
How to train your models on multiple GPUs or machines using distributed methods such as mirrored strategy, parameter server and central storage.
Learn about the Hugging Face ecosystem with a hands-on tutorial on the datasets and transformers libraries. Explore how to fine-tune a Vision Transformer (ViT).
How to develop and train a Transformer with JAX, Haiku and Optax. Learn by example how to code Deep Learning models in JAX
Learn about the einsum notation and einops by coding a custom multi-head self-attention unit and a transformer block
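To give a flavor of the notation, here is a minimal einsum sketch of single-head attention scores and outputs (NumPy shown for brevity; the article builds a full multi-head unit with einops, and the shapes here are our own toy example):

```python
import numpy as np

rng = np.random.default_rng(0)
b, t, d = 2, 4, 8          # batch size, sequence length, head dimension
q = rng.normal(size=(b, t, d))
k = rng.normal(size=(b, t, d))
v = rng.normal(size=(b, t, d))

# "bid,bjd->bij": dot every query i with every key j, independently per batch
scores = np.einsum("bid,bjd->bij", q, k) / np.sqrt(d)

# Softmax over the key axis, then mix the values with the attention weights
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
out = np.einsum("bij,bjd->bid", weights, v)
```

The appeal of einsum is that the subscript string documents the contraction: repeated indices are summed over, and the output indices spell out the result's shape.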
Although 99% of our content is available for free, we do offer some paid courses and books. Why?
Because we need a way to cover hosting and other expenses. So you can consider buying them just to support our work.
However, we invest even more effort into our paid content in order to keep the quality as high as possible. Towards that goal, we try to a) maximize the flow between concepts, b) minimize external links and c) update them as frequently as possible.
This book will teach you how to build, train, deploy, scale and maintain deep learning models. You will understand ML infrastructure and MLOps using hands-on examples with Tensorflow, Flask, Docker, Kubernetes, Google Cloud and more.
This course is a highly interactive, hands-on introduction to the most popular deep learning architectures. It will help you learn the intuition and the mathematics behind deep learning and will provide you with practical experience in PyTorch. The course is 100% text-based and is hosted on educative.io.
Implement and understand BYOL, a self-supervised computer vision method without negative samples. Learn how BYOL learns robust representations for image classification.
Learn how distributed training works in PyTorch: data parallel, distributed data parallel and automatic mixed precision. Train your deep learning models with massive speedups.
Learn how to implement the popular contrastive self-supervised learning method called SimCLR. Step-by-step implementation in PyTorch and PyTorch Lightning.
A review of state-of-the-art vision-language models such as CLIP, DALL-E, ALIGN and SimVLM.
This article demystifies the ML modeling process through the prism of statistics. We will understand how our assumptions about the data enable us to create meaningful optimization problems.
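A one-line example of that statistical view: if we assume the data is Gaussian with unknown mean, maximizing the likelihood turns "fit the data" into a concrete optimization problem whose closed-form solution is the sample mean. A sketch with our own toy numbers (not from the article):

```python
def negative_log_likelihood(mu, data, sigma=1.0):
    # Gaussian NLL up to an additive constant: sum of squared residuals
    # scaled by 1 / (2 sigma^2)
    return sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

data = [1.0, 2.0, 3.0, 6.0]
sample_mean = sum(data) / len(data)  # the MLE for the mean, in closed form

# A coarse grid search over candidate means lands on the sample mean,
# showing that the optimization problem and the statistic agree
candidates = [i / 100 for i in range(0, 801)]
best = min(candidates, key=lambda mu: negative_log_likelihood(mu, data))
```

The same recipe (assume a distribution, write the likelihood, minimize its negative log) is what connects squared-error loss to the Gaussian assumption in the first place.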
Explore what neural architecture search is, compare the most popular SOTA methodologies, and implement one with NNI.
A list of the top books to learn deep learning divided into four distinct categories. Personal reviews are included for each one of them.
Implement a UNETR to perform 3D medical image segmentation on the BRATS dataset
Discover how to formulate and train Spiking Neural Networks (SNNs) using the LIF model, and how to encode data so that it can be processed by SNNs.