Friday, February 16th

14:30 (room 2014, 'Digiteo Shannon' building 660)

Rémi Leblond

(INRIA Sierra team)

Title: SeaRNN: training RNNs with global-local losses


Recurrent neural networks (RNNs) have been widely successful in structured prediction applications such as machine translation or parsing, and are commonly trained using maximum likelihood estimation (MLE). Unfortunately, this training loss is not always an appropriate surrogate for the test error: by only maximizing the probability of the ground truth, it fails to exploit the wealth of information offered by structured losses. Further, it introduces discrepancies between training and prediction: during training the model conditions on ground-truth prefixes, whereas at test time it must condition on its own outputs.
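This train/test discrepancy can be illustrated with a minimal toy sketch (my own example, not code from the talk): a hypothetical bigram "model" is scored with teacher forcing at training time, but has to feed back its own predictions when decoding.

```python
import math

# Toy "RNN" reduced to a bigram table p(next | prev); the numbers are
# hypothetical and chosen purely for illustration.
probs = {
    ("<s>", "a"): 0.9, ("<s>", "b"): 0.1,
    ("a", "a"): 0.2,   ("a", "b"): 0.8,
    ("b", "a"): 0.6,   ("b", "b"): 0.4,
}

def mle_loss(target):
    """Teacher forcing: each step conditions on the ground-truth prefix."""
    prev, loss = "<s>", 0.0
    for tok in target:
        loss -= math.log(probs[(prev, tok)])
        prev = tok  # the ground-truth token is fed back in
    return loss

def greedy_decode(length):
    """Test time: each step conditions on the model's own prediction."""
    prev, out = "<s>", []
    for _ in range(length):
        tok = max(("a", "b"), key=lambda t: probs[(prev, t)])
        out.append(tok)
        prev = tok
    return out

print(mle_loss(["a", "b"]))  # negative log-likelihood under gold prefixes
print(greedy_decode(2))      # ['a', 'b'] here, but in general the decoded
                             # trajectory can drift from anything seen in training
```

The MLE objective only ever sees ground-truth prefixes, so nothing in it penalizes how the model behaves once it starts conditioning on its own (possibly wrong) predictions.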
In this talk, I will present our work on SeaRNN, a novel training algorithm for RNNs inspired by the "learning to search" (L2S) approach to structured prediction. The outline will be as follows.
First, I will briefly discuss the limitations of MLE training for RNNs and introduce the L2S approach.
Second, I will explain how SeaRNN leverages search space exploration to introduce global-local losses that are closer to the test error, and show that it outperforms MLE on two small-scale tasks.
Third, I will introduce a subsampling strategy that allows SeaRNN to scale to large vocabulary sizes, and validate its benefits on a neural machine translation task.
Finally, I will draw interesting links between our approach and the related literature (including Reinforcement and Imitation Learning), and briefly talk about future directions.
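As a rough illustration of the L2S idea mentioned in the outline (a hypothetical toy of my own, assuming a Hamming task cost and an oracle roll-out policy; none of these names are taken from the paper): at a given step, each candidate token is rolled out to a complete sequence and scored with the global task cost, yielding a per-step cost vector that a cost-sensitive local loss can then be trained against.

```python
def hamming(pred, gold):
    """Global task cost on full sequences (toy choice for this sketch)."""
    return sum(p != g for p, g in zip(pred, gold))

def rollout_costs(prefix, gold, vocab, rollout_policy):
    """Cost of committing to each candidate token at the current step:
    choose the token, roll out to the end, score the full sequence."""
    costs = {}
    for tok in vocab:
        completion = rollout_policy(prefix + [tok], len(gold))
        costs[tok] = hamming(completion, gold)
    return costs

# Reference roll-out policy: copy the gold suffix (an "oracle").
def oracle_rollout(prefix, length):
    return prefix + gold[len(prefix):length]

gold = ["a", "b", "b"]
vocab = ["a", "b"]
print(rollout_costs(["a"], gold, vocab, oracle_rollout))  # {'a': 1, 'b': 0}
```

Here the lowest-cost candidate at the current step is "b", so a cost-sensitive local loss would push probability mass toward it; the global task cost thus shapes a local, per-step training signal, which is the sense in which such losses are "global-local".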


Corresponding paper:


Rémi is a third-year Ph.D. student at INRIA in the SIERRA team under the supervision of Simon Lacoste-Julien. His current focus is on structured prediction using neural networks and asynchronous parallel algorithms for large-scale optimization.
After graduating from Ecole Polytechnique and Corps des Mines (French MPA), he worked in the big data field for three years, as a Data Scientist at SpotRight in the US (2011-2012) and as a Data Engineer at the Ministry of Defense (2013-2015).

Contact: guillaume.charpiat at
All TAU seminars: here

Contributor(s) to this page: guillaume.
Page last modified on Sunday, 11 February 2018, 13:03:39 CET by guillaume.