# Some Might Say All You Need Is Sum

## Eran Rosenbluth, Jan Tönshoff, Martin Grohe

Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence

Main Track. Pages 4172-4179.
https://doi.org/10.24963/ijcai.2023/464

The expressivity of Graph Neural Networks (GNNs) depends on the aggregation functions they employ. Theoretical works have suggested that Sum-aggregation GNNs subsume all other GNNs, while certain practical works have observed a clear advantage to using Mean and Max. An examination of the theoretical guarantee identifies two caveats. First, it is size-restricted: the power of every specific GNN is limited to graphs of a specific size. Successfully processing larger graphs may require another GNN, and so on. Second, it concerns the power to distinguish non-isomorphic graphs, not the power to approximate general functions on graphs, and the former does not necessarily imply the latter.
Ideally, a GNN's usability should not be limited to graphs of any specific size. Therefore, we explore the realm of unrestricted-size expressivity. We prove that basic functions, which can be computed exactly by Mean or Max GNNs, are inapproximable by any Sum GNN. We prove that under certain restrictions, every Mean or Max GNN can be approximated by a Sum GNN, but even there, a combination of (Sum, [Mean/Max]) is more expressive than Sum alone. Lastly, we prove further expressivity limitations for GNNs with a broad class of aggregations.
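The size-restriction caveat can be illustrated with a minimal sketch (not the paper's construction; the star-graph setup and function names here are illustrative assumptions): in a single message-passing step, a node's Sum aggregate grows with its degree, whereas its Mean or Max aggregate stays bounded, so a fixed Sum GNN's activations inevitably depend on graph size.

```python
import numpy as np

def aggregate(features, neighbors, mode):
    """One message-passing step: aggregate each node's neighbor features.

    features:  (n, d) array of node features
    neighbors: list of neighbor-index lists, one per node
    mode:      "sum", "mean", or "max"
    """
    out = []
    for nbrs in neighbors:
        msgs = features[nbrs]
        if mode == "sum":
            out.append(msgs.sum(axis=0))
        elif mode == "mean":
            out.append(msgs.mean(axis=0))
        elif mode == "max":
            out.append(msgs.max(axis=0))
        else:
            raise ValueError(f"unknown mode: {mode}")
    return np.array(out)

# Star graphs of growing size, constant node features of 1 (hypothetical
# example): the center's Sum aggregate grows as n-1, while Mean and Max
# stay fixed at 1, so any fixed Sum GNN sees size-dependent activations.
for n in [3, 10, 100]:
    feats = np.ones((n, 1))
    neighbors = [list(range(1, n))] + [[0]] * (n - 1)  # node 0 is the center
    s = aggregate(feats, neighbors, "sum")[0, 0]
    m = aggregate(feats, neighbors, "mean")[0, 0]
    print(n, s, m)  # Sum grows with n; Mean is constant
```

For instance, a function like "average neighbor feature" is computed exactly by the Mean aggregate above for every graph size, which is the kind of basic, size-independent behavior the abstract shows Sum GNNs cannot approximate in general.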

Keywords:

Machine Learning: ML: Theory of deep learning

Machine Learning: ML: Sequence and graph learning

Machine Learning: ML: Learning theory