Beyond Hand-Crafted Networks: Neural Architecture Search
Dott. Stefano Alletto
Online lecture – Thursday, 07 May 2020, 08:30 a.m. (GMT+1)
With performance on several benchmarks approaching saturation, pushing the state of the art is often a tedious process of hyperparameter tuning and network architecture optimization. Finding the perfect neural network for a given task by hand is often infeasible due to time constraints, but what if we could design a system capable of automatically designing architectures, testing their performance, and improving itself by learning from its previous mistakes?
This is the goal of neural architecture search (NAS): an automated system that explores search spaces whose size is beyond human capabilities, samples network structures from them, and improves its decision making by using the performance of the architectures it finds as a supervision signal.
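To make this loop concrete (this is only an illustrative sketch, not the specific method presented in the talk), a minimal Python example of sample-evaluate-improve search might look as follows; the search space, the evaluate placeholder, and the mutate helper are all hypothetical:

import random

# Hypothetical toy search space: an architecture is a stack of layer choices.
SEARCH_SPACE = {
    "num_layers": [4, 8, 12],
    "op": ["conv3x3", "conv5x5", "sep_conv3x3", "max_pool"],
    "width": [32, 64, 128],
}

def sample_architecture():
    """Sample one candidate network description from the search space."""
    n = random.choice(SEARCH_SPACE["num_layers"])
    return {
        "ops": [random.choice(SEARCH_SPACE["op"]) for _ in range(n)],
        "width": random.choice(SEARCH_SPACE["width"]),
    }

def evaluate(arch):
    """Placeholder for training the candidate and measuring validation
    accuracy; in a real NAS system this step dominates the search cost."""
    return random.random()  # stand-in score

def mutate(arch):
    """Perturb the current best architecture: past performance acts as the
    supervision signal that steers where the search looks next."""
    child = {"ops": list(arch["ops"]), "width": arch["width"]}
    i = random.randrange(len(child["ops"]))
    child["ops"][i] = random.choice(SEARCH_SPACE["op"])
    return child

best_arch, best_score = sample_architecture(), float("-inf")
for step in range(100):
    candidate = mutate(best_arch) if random.random() < 0.5 else sample_architecture()
    score = evaluate(candidate)
    if score > best_score:
        best_arch, best_score = candidate, score

print(best_score, best_arch)

Real NAS methods replace the random sampler and mutation with a learned controller (e.g. reinforcement learning, evolution, or gradient-based relaxation), but the feedback loop has the same shape.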
In this talk, after introducing the task in more detail, I will give an overview of recent NAS approaches and discuss the opportunities and limitations in the field. Finally, I will present an application of NAS to the multimodal domain.
Stefano Alletto completed his Ph.D. at the University of Modena and Reggio Emilia in 2018. Since then, he has worked as a Deep Learning engineer for Panasonic in Osaka, Japan, and Mountain View, California. During this period, he spent one year as a visiting researcher at Mila (Quebec Artificial Intelligence Institute) under the supervision of Prof. Yoshua Bengio (in 2019, Prof. Bengio received the ACM A.M. Turing Award, "the Nobel Prize of Computing"). His research focuses on using generative adversarial networks to improve image quality, denoising, and inpainting. He also works on neural architecture search and meta-learning for multimodal data.
If you are interested in participating, please contact Prof. Giuseppe Serra (giuseppe.serra@uniud.it).