Hierarchical Diffusion Attention Network

Zhitao Wang, Wenjie Li

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 3828-3834. https://doi.org/10.24963/ijcai.2019/531

A series of recent studies formulated the diffusion prediction problem as a sequence prediction task and proposed sequential models based on recurrent neural networks. However, real diffusion cascades exhibit non-sequential properties that violate the strict sequential assumptions of this prior work. In this paper, we propose the Hierarchical Diffusion Attention Network (HiDAN), which adopts a non-sequential framework with two levels of attention for diffusion prediction. At the user level, a dependency attention mechanism dynamically captures historical user-to-user dependencies and extracts dependency-aware user representations. At the cascade (i.e., sequence) level, a time-aware influence attention infers a potential future user's dependencies on historical users by considering both inherent user importance and time-decay effects. Experiments on three real diffusion datasets demonstrate that HiDAN is significantly more effective and efficient than state-of-the-art sequential models. Further case studies illustrate that HiDAN accurately captures diffusion dependencies.
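The abstract describes the model only at a high level. As a rough aid to intuition, the following is a minimal PyTorch sketch of what such a two-level attention cascade encoder could look like. It is not the authors' implementation: the class name HiDANSketch, all layer shapes, the causal-masked dot-product form of the dependency attention, and the additive, discretized-interval parameterization of the time decay are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HiDANSketch(nn.Module):
    """Illustrative two-level attention cascade encoder (not the paper's code)."""
    def __init__(self, num_users, dim, num_time_bins):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)
        # User level: dependency attention over the cascade history.
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Cascade level: inherent user importance plus time decay
        # (assumed here to be a learned weight per discretized interval).
        self.importance = nn.Linear(dim, 1)
        self.time_decay = nn.Embedding(num_time_bins, 1)
        self.out = nn.Linear(dim, num_users)

    def forward(self, user_ids, time_bins):
        # user_ids:  (B, L) historical users in infection order
        # time_bins: (B, L) discretized elapsed time to the prediction point
        x = self.user_emb(user_ids)                          # (B, L, D)

        # User-level dependency attention: each user attends to all
        # earlier users (causal mask) instead of a recurrent chain,
        # yielding a dependency-aware representation per user.
        scores = self.q(x) @ self.k(x).transpose(1, 2)       # (B, L, L)
        scores = scores / x.size(-1) ** 0.5
        L = user_ids.size(1)
        causal = torch.triu(torch.ones(L, L, dtype=torch.bool,
                                       device=x.device), diagonal=1)
        scores = scores.masked_fill(causal, float('-inf'))
        dep = F.softmax(scores, dim=-1) @ self.v(x)          # (B, L, D)

        # Cascade-level time-aware influence attention: combine an
        # inherent-importance score with a per-interval decay weight,
        # then pool the history into a single cascade representation.
        alpha = self.importance(dep).squeeze(-1)             # (B, L)
        alpha = alpha + self.time_decay(time_bins).squeeze(-1)
        w = F.softmax(alpha, dim=-1).unsqueeze(-1)           # (B, L, 1)
        cascade = (w * dep).sum(dim=1)                       # (B, D)

        return self.out(cascade)                             # next-user logits

# Usage: score candidate next users for a batch of cascades.
model = HiDANSketch(num_users=1000, dim=64, num_time_bins=32)
logits = model(torch.randint(0, 1000, (4, 10)),
               torch.randint(0, 32, (4, 10)))                # (4, 1000)
```

The non-sequential character of such a model shows up in how the history is aggregated: attention weights pool the dependency-aware user states directly, rather than threading them through a recurrent hidden state.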
Keywords:
Machine Learning: Data Mining
Machine Learning: Deep Learning