Article

Hypergraph convolution and hypergraph attention

Journal

PATTERN RECOGNITION
Volume 110

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2020.107637

Keywords

Graph learning; Hypergraph learning; Graph neural networks; Semi-supervised learning

Funding

  1. EPSRC [Seebibyte EP/M013774/1]
  2. EPSRC/MURI grant [EP/N019474/1]
  3. EPSRC [EP/M013774/1, EP/N019474/1] Funding Source: UKRI

Summary

Graph neural networks have recently attracted significant attention; this work introduces two operators, hypergraph convolution and hypergraph attention, for efficient learning on high-order graph-structured data. The operators enhance representation-learning capacity and extend graph neural networks into more flexible models for applications with non-pairwise relationships.

Abstract

Recently, graph neural networks have attracted great attention and achieved prominent performance in various research fields. Most of those algorithms have assumed pairwise relationships between objects of interest. However, in many real applications, the relationships between objects are of higher order, beyond a pairwise formulation. To efficiently learn deep embeddings on high-order graph-structured data, we introduce two end-to-end trainable operators to the family of graph neural networks, i.e., hypergraph convolution and hypergraph attention. Whilst hypergraph convolution defines the basic formulation of performing convolution on a hypergraph, hypergraph attention further enhances the capacity of representation learning by leveraging an attention module. With the two operators, a graph neural network is readily extended to a more flexible model and applied to diverse applications where non-pairwise relationships are observed. Extensive experimental results with semi-supervised node classification demonstrate the effectiveness of hypergraph convolution and hypergraph attention. (c) 2020 Elsevier Ltd. All rights reserved.
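For readers wanting a concrete picture of the operator the abstract describes, below is a minimal sketch (not the authors' released code) of hypergraph convolution in PyTorch, assuming the standard normalized-incidence formulation X' = sigma(Dv^{-1/2} H W B^{-1} H^T Dv^{-1/2} X P); all class and variable names are illustrative.

```python
# Minimal sketch of hypergraph convolution, assuming the formulation
#   X' = sigma( Dv^{-1/2} H W B^{-1} H^T Dv^{-1/2} X P )
# where H (n x m) is the node/hyperedge incidence matrix, W the diagonal
# hyperedge weights, B the hyperedge-degree matrix, Dv the vertex-degree
# matrix, and P a learnable projection. Names are illustrative; see the
# paper for the exact operator and its attention variant.
from typing import Optional

import torch
import torch.nn as nn


class HypergraphConv(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)  # the matrix P

    def forward(self, x: torch.Tensor, H: torch.Tensor,
                w: Optional[torch.Tensor] = None) -> torch.Tensor:
        n, m = H.shape
        if w is None:
            w = torch.ones(m, device=H.device)   # default: unit hyperedge weights
        b = H.sum(dim=0).clamp(min=1.0)          # hyperedge degrees (diagonal of B)
        dv = (H * w).sum(dim=1).clamp(min=1.0)   # vertex degrees (diagonal of Dv)
        dv_inv_sqrt = dv.pow(-0.5).unsqueeze(1)  # Dv^{-1/2} as a column scale

        out = H.t() @ (dv_inv_sqrt * x)          # gather node features into hyperedges
        out = (w / b).unsqueeze(1) * out         # apply W B^{-1} per hyperedge
        out = dv_inv_sqrt * (H @ out)            # scatter back to nodes
        return torch.relu(self.proj(out))


# Toy usage: 4 nodes, 2 hyperedges, binary incidence matrix.
x = torch.randn(4, 16)
H = torch.tensor([[1., 0.],
                  [1., 1.],
                  [0., 1.],
                  [1., 1.]])
out = HypergraphConv(16, 8)(x, H)                # shape: (4, 8)
```

Hypergraph attention, as the abstract sketches it, would make H itself data-dependent: replacing the fixed 0/1 incidence entries with softmax-normalized compatibility scores between each node and the hyperedges it may belong to.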

Authors

Song Bai, Feihu Zhang, Philip H. S. Torr

Reviews

Primary Rating

4.7 (not enough ratings)

Secondary Ratings

Novelty: -
Significance: -
Scientific rigor: -
