Mixture of GANs for Clustering

Yang Yu, Wen-Ji Zhou

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 3047-3053. https://doi.org/10.24963/ijcai.2018/423

For data clustering, the Gaussian mixture model (GMM) is a typical method that trains several Gaussian models to capture the data, each of which then provides the distribution information of one cluster. For clustering high-dimensional and complex data, models more flexible than Gaussians are desired. Recently, generative adversarial networks (GANs) have shown effectiveness in capturing complex data distributions, so a GAN mixture model (GANMM) would be a promising alternative to GMM. However, we notice that the non-flexibility of the Gaussian model is essential to the expectation-maximization procedure for training GMM. A GAN can be far more flexible, which disables the commonly employed expectation-maximization procedure, since the maximization step can no longer change the result of the expectation step. In this paper, we propose the ε-expectation-maximization procedure for training GANMM. Experiments show that the proposed GANMM achieves good performance on complex as well as simple data.
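
To illustrate the ε-expectation-maximization idea described above, the following is a minimal NumPy sketch of a hard-EM clustering loop in which, with probability epsilon, a point is assigned to a random cluster instead of its best-fitting one. For simplicity the per-cluster model here is an isotropic Gaussian; in GANMM each cluster would instead be modeled by a GAN and scored accordingly. Names such as epsilon_em and fit_score, the annealing schedule, and the Gaussian stand-in are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def fit_score(x, mean, var):
    # Log-likelihood of x under an isotropic Gaussian (stand-in for a GAN-based score).
    d = x.shape[-1]
    return -0.5 * (np.sum((x - mean) ** 2, axis=-1) / var + d * np.log(2 * np.pi * var))

def epsilon_em(X, k=3, epsilon=0.2, iters=50):
    n, d = X.shape
    means = X[rng.choice(n, size=k, replace=False)]  # initialize cluster models
    var = np.ones(k)
    for t in range(iters):
        # epsilon-E-step: mostly greedy assignment, occasionally random exploration
        scores = np.stack([fit_score(X, means[j], var[j]) for j in range(k)], axis=1)
        greedy = np.argmax(scores, axis=1)
        explore = rng.random(n) < epsilon
        assign = np.where(explore, rng.integers(k, size=n), greedy)
        # M-step: refit each cluster model on its currently assigned points
        for j in range(k):
            pts = X[assign == j]
            if len(pts) > 0:
                means[j] = pts.mean(axis=0)
                var[j] = max(pts.var(), 1e-3)
        epsilon *= 0.95  # anneal the exploration rate over iterations (an assumption)
    return assign, means

if __name__ == "__main__":
    # Three well-separated 2-D blobs as toy data.
    X = np.concatenate([rng.normal(loc=c, scale=0.5, size=(100, 2)) for c in (-3, 0, 3)])
    labels, centers = epsilon_em(X, k=3)
    print("cluster sizes:", np.bincount(labels, minlength=3))

The random assignments keep every cluster model exposed to fresh points, which is the role the ε-step plays when the per-cluster models are flexible enough to fit whatever data they are given.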
Keywords:
Machine Learning: Unsupervised Learning
Machine Learning: Deep Learning
Machine Learning: Clustering