Three-Player Wasserstein GAN via Amortised Duality

Nhan Dam, Quan Hoang, Trung Le, Tu Dinh Nguyen, Hung Bui, Dinh Phung

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 2202-2208. https://doi.org/10.24963/ijcai.2019/305

We propose a new formulation for learning generative adversarial networks (GANs) that uses the optimal transport cost (the general form of the Wasserstein distance) as the objective criterion measuring the dissimilarity between the target and learned distributions. Our formulation is based on the general form of the Kantorovich duality, which applies to optimal transport with a wide range of cost functions that are not necessarily metrics. To make optimising this dual form amenable to gradient-based methods, we employ a function that acts as an amortised optimiser for the innermost optimisation problem. Interestingly, the amortised optimiser can be viewed as a mover, since it strategically shifts data points around. The resulting formulation is a sequential min-max-min game with three players: the generator, the critic, and the mover, where the new player, the mover, attempts to fool the critic by shifting the data around. Despite involving three players, we demonstrate that our proposed formulation can be trained effectively via a simple alternating gradient learning strategy. Compared with the existing Lipschitz-constrained formulations of Wasserstein GAN on CIFAR-10, our model yields significantly better diversity scores than weight clipping and performance comparable to the gradient-penalty method.
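The min-max-min game described above can be sketched as follows; the notation here is our own assumption rather than the paper's. Writing $P_d$ for the data distribution, $P_z$ for the noise prior, $c$ for the transport cost, $f$ for the critic, $G$ for the generator, and $h$ for the mover that amortises the inner infimum of the Kantorovich dual (the $c$-transform $f^c(y) = \inf_x \, c(x, y) - f(x)$), a plausible form of the objective is:

```latex
\min_{G} \; \max_{f} \; \min_{h} \;
\mathbb{E}_{x \sim P_d}\bigl[ f(x) \bigr]
\;+\;
\mathbb{E}_{z \sim P_z}\Bigl[ \, c\bigl(h(G(z)),\, G(z)\bigr) \;-\; f\bigl(h(G(z))\bigr) \Bigr]
```

Here the mover $h$ maps each generated sample $G(z)$ to the point that (approximately) attains the infimum in the $c$-transform, so the inner $\min_h$ replaces a per-sample optimisation with a single learned function; the critic maximises the dual objective, and the generator minimises the resulting transport cost.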
Keywords:
Machine Learning: Learning Generative Models
Machine Learning: Unsupervised Learning
Machine Learning: Deep Learning