CLC number: TP391  Document code: A  DOI: 10.19358/j.issn.2096-5133.2022.06.011  Citation: Yang Guangqian, Li Jinlong. Graph neural networks based on direct high-order attention and multi-scale routing[J]. Information Technology and Network Security, 2022, 41(6): 64-72.
Direct high-order attention and multi-scale routing for graph neural networks
Yang Guangqian, Li Jinlong
(School of Computer Science and Technology, University of Science and Technology of China, Hefei 230026, China)
Abstract: Recently, the attention mechanism in Graph Neural Networks has shown excellent performance in processing graph-structured data. Traditional graph attention calculates attention only between directly connected nodes and captures high-order information implicitly by stacking layers. Despite extensive research on the graph attention mechanism, we argue that this stacking paradigm is less effective at modeling long-range dependencies. To improve expressive power, we design a novel direct attention mechanism that computes attention between higher-order neighbors directly via the K-th power of the adjacency matrix. We further propagate the higher-order information with an adaptive routing aggregation process, which allows the aggregation to adapt flexibly to the properties of different graphs. We perform extensive experiments on node classification on citation networks. Experiments show that our method consistently outperforms state-of-the-art baselines, which validates its effectiveness.
Key words: graph neural networks; attention; dynamic routing
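The core idea in the abstract, computing attention directly between K-hop neighbors by masking scores with the K-th power of the adjacency matrix, can be sketched as follows. This is a minimal single-head NumPy illustration under stated assumptions: the additive (GAT-style) attention form, the `(A + I)^K` reachability mask, and all function and parameter names are illustrative choices, not the authors' actual implementation, and the adaptive routing aggregation step is omitted.

```python
import numpy as np

def k_power_mask(adj, k):
    """Reachability within k hops: entries of (A + I)^k that are nonzero."""
    a = adj + np.eye(adj.shape[0])
    return (np.linalg.matrix_power(a, k) > 0).astype(float)

def direct_high_order_attention(x, adj, k, w, a_vec):
    """One attention head computed directly over k-hop neighborhoods.

    x: (n, d_in) node features; w: (d_in, d_out) projection;
    a_vec: (2 * d_out,) additive-attention parameters (assumed form).
    """
    h = x @ w                                   # linear projection of node features
    d = h.shape[1]
    mask = k_power_mask(adj, k)                 # k-hop neighborhood from A^k
    # additive attention scores e_ij = a_l^T h_i + a_r^T h_j
    scores = (h @ a_vec[:d])[:, None] + (h @ a_vec[d:])[None, :]
    scores = np.where(mask > 0, scores, -1e9)   # restrict attention to k-hop pairs
    # row-wise softmax over each node's k-hop neighborhood
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha = e / e.sum(axis=1, keepdims=True)
    return alpha @ h                            # aggregate k-hop neighbor features
```

With `k = 1` this reduces to ordinary first-order graph attention; increasing `k` lets a node attend to distant neighbors in a single layer instead of relying on depth to propagate information.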