GeNAS: Neural Architecture Search with Better Generalization

Joonhyun Jeong, Joonsang Yu, Geondo Park, Dongyoon Han, YoungJoon Yoo

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 911-919. https://doi.org/10.24963/ijcai.2023/101

Neural Architecture Search (NAS) aims to automatically discover the optimal network architecture with superior test performance. Recent NAS approaches rely on validation loss or accuracy to find the superior network for the target data. In this paper, we investigate a new NAS measure for discovering architectures with better generalization. We demonstrate that the flatness of the loss surface can be a promising proxy for predicting the generalization capability of neural network architectures. We evaluate the proposed method on various search spaces, showing similar or even better performance compared with state-of-the-art NAS methods. Notably, the resulting architecture found by the flatness measure generalizes robustly to various shifts in data distribution (e.g., ImageNet-V2, -A, -O), as well as to downstream tasks such as object detection and semantic segmentation.
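For intuition about a flatness-based search proxy, the sketch below perturbs a candidate network's weights within a small radius several times and measures the resulting rise in loss; a smaller rise suggests a flatter minimum and, per the paper's claim, better generalization. This is a minimal illustrative sketch under assumed conventions, not the authors' exact measure: the function names (flatness_score, _mean_loss), the perturbation radius, and the sample count are hypothetical.

```python
import copy
import torch

def flatness_score(model, loss_fn, data_loader, radius=0.01,
                   n_samples=5, device="cpu"):
    """Illustrative flatness proxy (hypothetical, not the paper's exact
    measure): average rise in loss under random weight perturbations of
    a fixed relative radius. Lower values suggest a flatter minimum."""
    model = model.to(device).eval()
    base_loss = _mean_loss(model, loss_fn, data_loader, device)
    perturbed_losses = []
    for _ in range(n_samples):
        noisy = copy.deepcopy(model)
        with torch.no_grad():
            for p in noisy.parameters():
                # Draw a random direction and rescale it so each parameter
                # tensor is perturbed proportionally to its own norm,
                # keeping perturbations comparable across layers.
                noise = torch.randn_like(p)
                noise = radius * p.norm() * noise / (noise.norm() + 1e-12)
                p.add_(noise)
        perturbed_losses.append(_mean_loss(noisy, loss_fn, data_loader, device))
    return sum(perturbed_losses) / n_samples - base_loss

def _mean_loss(model, loss_fn, data_loader, device):
    """Average loss over the loader, weighted by batch size."""
    total, count = 0.0, 0
    with torch.no_grad():
        for x, y in data_loader:
            x, y = x.to(device), y.to(device)
            total += loss_fn(model(x), y).item() * x.size(0)
            count += x.size(0)
    return total / count
```

Scaling the noise by each parameter's norm follows the filter-normalization idea common in loss-landscape analysis, so layers with very different weight magnitudes receive comparable relative perturbations; within a NAS loop, candidates would then be ranked by this score instead of (or alongside) validation accuracy.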
Keywords:
Computer Vision: CV: Machine learning for vision
Computer Vision: CV: Recognition (object detection, categorization)
Computer Vision: CV: Segmentation
Computer Vision: CV: Transfer, low-shot, semi- and un-supervised learning