A Symmetric Relative-Error Loss Function for Intermittent Multiscale Signal Modelling
Sergio M. Vanegas Arias, Lasse Lensu, Fredy Ruiz Palacios
Proceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence
Main Track. Pages 6281-6288.
https://doi.org/10.24963/ijcai.2025/699
Multiscale signals represent a formidable modelling challenge in Machine Learning, as the ubiquitous Mean Squared Error loss function neglects signal behaviour at smaller values. Several scale-equalizing error metrics have been devised to tackle this problem, amongst which the Mean Absolute Percentage Error (MAPE) remains the most widely used due to its simplicity and interpretability. However, by its very definition, MAPE introduces three major issues: asymptotic behaviour at zero-target values, asymptotic gradient behaviour at zero error, and accuracy loss at large signal scales. We address these limitations by proposing the Symmetric Mean Arctangent Squared Percentage Error (SMASPE), which builds on the Mean Arctangent Absolute Percentage Error (MAAPE) and leverages a mathematically smoother definition along with user-provided signal bounds to extend its functionality. The numerical properties of SMASPE are explored, and its performance is tested in two real-life cases for deterministic and stochastic optimization. The experiments show a clear advantage of the proposed loss function, with an improvement of up to 42% over MAAPE in terms of Mean Absolute Error for deep learning models when appropriate bounds are selected.
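To illustrate the zero-target issue motivating this line of work, the sketch below contrasts plain MAPE with MAAPE as defined by Kim and Kim (2016): MAAPE replaces each relative-error term with its arctangent, which bounds every term in [0, pi/2] and keeps zero-valued targets finite. This is a minimal illustration of the background metrics only; the proposed SMASPE is defined in the paper itself and is not reproduced here.

```python
import math

def mape(y_true, y_pred):
    # Mean Absolute Percentage Error: diverges whenever a target is zero.
    return sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def maape(y_true, y_pred):
    # Mean Arctangent Absolute Percentage Error (Kim & Kim, 2016):
    # arctan caps each term at pi/2, so zero targets remain well-defined.
    def term(t, p):
        if t == 0:
            return 0.0 if p == 0 else math.pi / 2
        return math.atan(abs((t - p) / t))
    return sum(term(t, p) for t, p in zip(y_true, y_pred)) / len(y_true)

# A zero-valued target breaks MAPE but leaves MAAPE finite and bounded.
print(maape([1.0, 0.0, 2.0], [1.1, 0.5, 1.8]))
```

Note that MAAPE is bounded but not smooth at zero error, which is one of the gradient issues the proposed SMASPE is designed to remove.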
Keywords:
Machine Learning: ML: Regression
Machine Learning: ML: Optimization
Machine Learning: ML: Supervised Learning
Machine Learning: ML: Time series and data streams
