Friday, December 14th, 14:30 (room R2014, building 660) (see location)
Title: The shallow learning quest
Abstract: Successful deep neural networks are systematically trained via the end-to-end back-propagation algorithm. We challenge this optimization procedure. We show it is possible to train layers sequentially, using ad-hoc auxiliary classifiers. For instance, one can match AlexNet performance on ImageNet by successively training a sequence of 1-hidden-layer CNNs. Using deeper auxiliary classifiers, we match VGG performance. We propose an empirical analysis through the lens of progressive linear separation. Applications are discussed.
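The idea of sequential layer-wise training with auxiliary classifiers can be illustrated with a minimal sketch (a toy illustration, not the speaker's actual setup): each hidden layer is trained jointly with its own auxiliary linear classifier, then frozen, and the next layer is trained on the frozen representation. All names and hyperparameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (linearly separable, for illustration only).
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def train_layer(X, y, hidden=16, lr=0.5, steps=500):
    """Train one hidden layer jointly with an auxiliary linear head
    (logistic loss, plain gradient descent)."""
    n, d = X.shape
    W = rng.normal(size=(d, hidden)) * 0.1   # layer weights (kept afterwards)
    v = rng.normal(size=hidden) * 0.1        # auxiliary classifier (discarded)
    for _ in range(steps):
        H = np.tanh(X @ W)                   # hidden representation
        p = 1.0 / (1.0 + np.exp(-(H @ v)))   # auxiliary prediction
        g = (p - y) / n                      # logistic-loss gradient wrt logits
        grad_v = H.T @ g
        grad_W = X.T @ (np.outer(g, v) * (1.0 - H**2))
        v -= lr * grad_v
        W -= lr * grad_W
    return W, v

# Train two layers sequentially; each gets its own auxiliary head.
reps, layers, accs = X, [], []
for _ in range(2):
    W, v = train_layer(reps, y)
    layers.append(W)
    reps = np.tanh(reps @ W)                 # frozen representation for next layer
    acc = np.mean(((reps @ v) > 0) == y)     # auxiliary-head accuracy at this depth
    accs.append(acc)
```

The accuracy of each auxiliary head gives a simple probe of how linearly separable the representation has become after each layer, in the spirit of the "progressive linear separation" analysis mentioned in the abstract.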
Contact: guillaume.charpiat at inria.fr
All TAU seminars: here