FreqMoE: Dynamic Frequency Enhancement for Neural PDE Solvers

Tianyu Chen, Haoyi Zhou, Ying Li, Hao Wang, Zhenzhe Zhang, Tianchen Zhu, Shanghang Zhang, Jianxin Li

Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 7356-7364. https://doi.org/10.24963/ijcai.2025/818

Fourier Neural Operators (FNOs) have emerged as promising solutions for efficiently solving partial differential equations (PDEs) by learning infinite-dimensional function mappings through frequency-domain transformations. However, the sparsity of high-frequency signals limits computational efficiency for high-dimensional inputs, and fixed-pattern truncation often causes high-frequency signal loss, reducing performance in scenarios such as high-resolution inputs or long-term predictions. To address these challenges, we propose FreqMoE, an efficient and progressive training framework that exploits the dependency of high-frequency signals on low-frequency components. The model first learns low-frequency weights and then applies a sparse upward-cycling strategy to construct a mixture of experts (MoE) in the frequency domain, effectively extending the learned weights to high-frequency regions. Experiments on both regular- and irregular-grid PDEs demonstrate that FreqMoE achieves up to 16.6 percent accuracy improvement while using merely 2.1 percent of the parameters (a 47.32x reduction) compared to dense FNO. Furthermore, the approach demonstrates remarkable stability in long-term predictions and generalizes seamlessly to various FNO variants and grid structures, establishing a new "Low-frequency Pretraining, High-frequency Fine-tuning" paradigm for solving PDEs.
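To make the abstract's mechanism concrete, below is a minimal 1D sketch of the "low-frequency pretraining, high-frequency fine-tuning" idea, assuming a PyTorch-style FNO spectral layer. The class names (SpectralExpert1d, FreqBandLayer1d) and parameters (n_low_modes, n_modes) are illustrative assumptions, not the paper's actual API; the sketch only shows the general pattern of truncated spectral convolution, frozen low-frequency weights, and upcycled high-frequency experts.

```python
# Hypothetical sketch: frequency-band experts for a 1D FNO-style layer.
import torch
import torch.nn as nn


class SpectralExpert1d(nn.Module):
    """Learnable complex weights for one contiguous band of Fourier modes."""

    def __init__(self, channels, n_modes):
        super().__init__()
        scale = 1.0 / (channels * channels)
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, n_modes, dtype=torch.cfloat))

    def forward(self, x_ft_band):
        # x_ft_band: (batch, channels, n_modes) complex spectrum slice.
        return torch.einsum("bim,iom->bom", x_ft_band, self.weight)


class FreqBandLayer1d(nn.Module):
    """Spectral layer that applies one expert per frequency band.

    Stage 1: train with only the low-frequency expert (classic FNO truncation).
    Stage 2: "upcycle" -- initialize a new high-frequency expert from the
    pretrained low-frequency weights, freeze the low band, and fine-tune
    only the new band.
    """

    def __init__(self, channels, n_low_modes):
        super().__init__()
        self.n_low_modes = n_low_modes
        self.experts = nn.ModuleList([SpectralExpert1d(channels, n_low_modes)])
        self.band_sizes = [n_low_modes]

    def add_high_freq_expert(self, n_modes):
        channels = self.experts[0].weight.shape[0]
        new = SpectralExpert1d(channels, n_modes)
        with torch.no_grad():
            # Upcycle: seed the new band from the pretrained low-band weights.
            k = min(n_modes, self.n_low_modes)
            new.weight[..., :k].copy_(self.experts[0].weight[..., :k])
        self.experts[0].weight.requires_grad_(False)  # freeze the low band
        self.experts.append(new)
        self.band_sizes.append(n_modes)

    def forward(self, x):
        # x: (batch, channels, grid); rFFT yields grid // 2 + 1 modes.
        x_ft = torch.fft.rfft(x)
        out_ft = torch.zeros_like(x_ft)
        start = 0
        for expert, size in zip(self.experts, self.band_sizes):
            out_ft[..., start:start + size] = expert(x_ft[..., start:start + size])
            start += size
        return torch.fft.irfft(out_ft, n=x.shape[-1])
```

In this sketch, the stage-1 model is exactly a truncated spectral convolution; calling add_high_freq_expert later extends coverage to higher modes at a parameter cost that grows only with the new band, which mirrors the parameter-efficiency claim in the abstract.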
Keywords:
Multidisciplinary Topics and Applications: MTA: Physical sciences
Machine Learning: ML: Deep learning architectures