Efficient Non-parametric Bayesian Hawkes Processes

Rui Zhang, Christian Walder, Marian-Andrei Rizoiu, Lexing Xie

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 4299-4305. https://doi.org/10.24963/ijcai.2019/597

In this paper, we develop an efficient non-parametric Bayesian method for estimating the triggering kernel of Hawkes processes. The non-parametric Bayesian approach is important because it provides flexible Hawkes kernels and quantifies their uncertainty. Our method is based on the cluster representation of Hawkes processes. By exploiting the stationarity of the Hawkes process, we efficiently sample random branching structures, thereby splitting the Hawkes process into clusters of Poisson processes. We derive two algorithms --- a block Gibbs sampler and a maximum a posteriori estimator based on expectation maximization --- and we show, both theoretically and empirically, that they have linear time complexity. On synthetic data, we show that our methods can infer flexible Hawkes triggering kernels. On two large-scale Twitter diffusion datasets, we show that our methods outperform the current state of the art in goodness-of-fit and that their runtime scales linearly with the size of the dataset. We also observe that, on diffusions related to online videos, the learned kernels reflect the perceived longevity of different content types, such as music or pet videos.
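The abstract refers to sampling random branching structures under the cluster representation. Below is a minimal illustrative sketch (not the authors' implementation) of that latent-parent resampling step: each event is attributed either to the background rate or to an earlier event, with probabilities proportional to the background intensity `mu` and the triggering kernel evaluated at the time gaps. The exponential kernel `phi`, the toy event times, and the function name `sample_branching` are assumptions for illustration only. Note that this naive loop is quadratic in the number of events; the linear complexity reported in the paper comes from its use of stationarity, which is not reproduced here.

```python
# Illustrative sketch of branching-structure sampling for a Hawkes process.
# Not the paper's algorithm; kernel, rates, and data below are assumptions.
import numpy as np

def sample_branching(times, mu, phi, rng=None):
    """For each event time t_i, sample a parent index:
    0 means the event is a background (immigrant) event;
    j >= 1 means it was triggered by the earlier event times[j - 1]."""
    rng = np.random.default_rng() if rng is None else rng
    parents = np.zeros(len(times), dtype=int)
    for i, t_i in enumerate(times):
        # Unnormalised parent probabilities: background rate versus the
        # triggering kernel evaluated at the gaps to all earlier events.
        weights = np.concatenate(([mu], phi(t_i - np.asarray(times[:i]))))
        weights /= weights.sum()
        parents[i] = rng.choice(len(weights), p=weights)
    return parents

# Toy usage with an assumed exponential triggering kernel.
if __name__ == "__main__":
    phi = lambda dt: 0.8 * np.exp(-dt)       # decaying excitation
    times = [0.3, 0.9, 1.1, 2.5, 2.6, 2.7]   # toy event times
    print(sample_branching(times, mu=0.5, phi=phi))
```

Given a sampled branching structure, the events attributed to each parent form independent Poisson processes, which is the decomposition the block Gibbs sampler and EM-based MAP estimator build on.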
Keywords:
Machine Learning: Time-series; Data Streams
Machine Learning: Learning Generative Models
Machine Learning: Probabilistic Machine Learning