Friday, 5th of October
11h30 (usual room R2014, 660 building) (see location)
Thomas Lucas (Thoth team, INRIA Grenoble)
Title: Mixed batches and symmetric discriminators for GAN training
Abstract
Generative adversarial networks (GANs) are powerful generative models based on providing feedback to a generative network via a
discriminator network. However, the discriminator usually assesses
individual samples. This prevents the discriminator from accessing
global distributional statistics of generated samples, and often leads
to mode dropping: the generator models only part of the target
distribution. We propose to feed the discriminator with mixed batches
of true and fake samples, and train it to predict the ratio of true
samples in the batch. The latter score does not depend on the order of
samples in a batch. Rather than learning this invariance, we introduce
a generic permutation-invariant discriminator architecture. This
architecture is provably a universal approximator of all symmetric
functions. Experimentally, our approach reduces mode collapse in GANs
on two synthetic datasets, and obtains good results on the CIFAR10 and
CelebA datasets, both qualitatively and quantitatively.
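As an illustration of the idea described in the abstract, here is a minimal sketch (assuming PyTorch; the layer sizes, mean-pooling aggregation, and MSE objective are illustrative assumptions, not the speaker's exact architecture or loss) of a permutation-invariant discriminator that scores a mixed batch and predicts the fraction of real samples it contains:

```python
import torch
import torch.nn as nn

class BatchRatioDiscriminator(nn.Module):
    """Permutation-invariant batch discriminator (sketch).

    Per-sample features are aggregated by mean pooling, which is
    independent of the order of samples in the batch, then mapped to a
    single predicted ratio of real samples.
    """
    def __init__(self, in_dim, hidden=128):
        super().__init__()
        self.phi = nn.Sequential(            # per-sample feature extractor
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.rho = nn.Sequential(            # acts on the pooled batch statistic
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, batch):                # batch: (B, in_dim)
        feats = self.phi(batch)              # (B, hidden)
        pooled = feats.mean(dim=0)           # symmetric aggregation over samples
        return torch.sigmoid(self.rho(pooled))  # predicted fraction of real samples


def ratio_loss(disc, real, fake):
    """Mix real and fake samples, train the discriminator to recover the true ratio.
    (Hypothetical helper for illustration; the actual training objective may differ.)"""
    mixed = torch.cat([real, fake], dim=0)
    mixed = mixed[torch.randperm(mixed.size(0))]   # shuffle: the score should not change
    target = torch.tensor(real.size(0) / mixed.size(0))
    pred = disc(mixed).squeeze()
    return nn.functional.mse_loss(pred, target)
```

Because the pooling step is a symmetric function of the batch, the predicted ratio is invariant to any permutation of the samples by construction, rather than having to be learned.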
Contact: guillaume.charpiat at inria.fr
All TAU seminars: here