An Interactive Introduction to Model-Agnostic Meta-Learning

Exploring the world of model-agnostic meta-learning and its variants.

This page is part of a multi-part series on Model-Agnostic Meta-Learning. If you are already familiar with the topic, use the menu on the right side to jump straight to the part that interests you. Otherwise, we suggest you start at the beginning.

Conclusion

In this multi-part series on Model-Agnostic Meta-Learning (MAML), we have studied an optimization-based meta-learning approach to problems such as few-shot learning, where a learner sees only a few samples of a task and should nevertheless converge to a good solution. MAML builds up a model of the world by studying other tasks, thereby finding an initialization from which only a few gradient descent steps are needed to learn a new task. As long as we can differentiate through the optimizer we use on the tasks (also called the inner optimizer), MAML can be applied to any architecture and learning scheme, making it a versatile tool for few-shot learning. However, we also studied the computational overhead that comes with this differentiation. This motivates variants of MAML that bypass the costly second-order terms in the meta-gradient. While FOMAML simply drops these terms from the meta-gradient, Reptile abandons the computation of a meta-gradient altogether and instead computes a moving average of optimal task parameters. In contrast, iMAML ties the task-adapted parameters to the initialization through a regularized inner objective, which allows the meta-gradient to be computed without differentiating through the inner optimizer.
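To make the differences concrete, here is a minimal sketch (not the code behind the visualizations in this series) that contrasts the three meta-updates on a hypothetical one-dimensional quadratic task loss; the task optimum `c`, the learning rates, and the step counts are made-up illustration values.

```python
import jax
import jax.numpy as jnp

inner_lr, meta_lr, inner_steps = 0.1, 0.01, 5  # hypothetical values

def task_loss(theta, c):
    # Toy task: quadratic loss with task-specific optimum c.
    return 0.5 * (theta - c) ** 2

def adapt(theta, c):
    # Inner loop: a few plain gradient-descent steps on the task loss.
    for _ in range(inner_steps):
        theta = theta - inner_lr * jax.grad(task_loss)(theta, c)
    return theta

def maml_meta_grad(theta, c):
    # MAML: differentiate through the whole inner loop,
    # including the second-order terms.
    return jax.grad(lambda t: task_loss(adapt(t, c), c))(theta)

def fomaml_meta_grad(theta, c):
    # FOMAML: take the gradient only at the adapted parameters, dropping the
    # terms that come from differentiating through the inner optimizer.
    return jax.grad(task_loss)(adapt(theta, c), c)

def reptile_update(theta, c):
    # Reptile: no meta-gradient at all, just move toward the adapted parameters.
    return theta + meta_lr * (adapt(theta, c) - theta)

theta = jnp.array(0.0)
c = jnp.array(2.0)  # hypothetical task optimum
print(maml_meta_grad(theta, c), fomaml_meta_grad(theta, c), reptile_update(theta, c))
```

On this toy loss the difference is easy to verify by hand: each inner step contributes a factor of (1 − inner_lr) to the Jacobian of the adapted parameters, so the MAML meta-gradient equals the FOMAML gradient scaled by (1 − inner_lr)^inner_steps, which is exactly the correction FOMAML omits.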

Before you leave: After letting your browser compute a bunch of meta-updates, you may want to let it cool off and take a look at our computationally lightweight but not-to-be-missed further reading section. As you will see, MAML has sparked a plethora of research on optimization-based meta-learning with many interesting approaches.

Consider leaving feedback or suggestions, or simply get in contact with us, either via GitHub or by e-mail at luismueller2309@gmail.com.

Author Contributions

Luis Müller implemented the visualization of MAML, FOMAML, Reptile, and the Comparison. Max Ploner created the visualization of iMAML and the Svelte elements and components. Both wrote the introduction together and contributed most of the text of the other parts. Thomas Goerttler came up with the idea and sketched out the project. He also wrote parts of the manuscript and helped with finalizing the document. Klaus Obermayer provided feedback on the project.

† equal contributors