{ "id": "2105.00490", "version": "v1", "published": "2021-05-02T14:53:32.000Z", "updated": "2021-05-02T14:53:32.000Z", "title": "Residual Enhanced Multi-Hypergraph Neural Network", "authors": [ "Jing Huang", "Xiaolin Huang", "Jie Yang" ], "comment": "Submitted to ICIP 2021", "categories": [ "cs.CV" ], "abstract": "Hypergraphs generalize graphs to model higher-order correlations among entities and have been successfully adopted in various research domains. The HyperGraph Neural Network (HGNN) is currently the de-facto method for hypergraph representation learning. However, HGNN is designed for single-hypergraph learning and resorts to pre-concatenation when confronted with multi-modal datasets, which leads to sub-optimal exploitation of the inter-correlations among multi-modal hypergraphs. HGNN also suffers from the over-smoothing issue: its performance drops significantly as layers are stacked. To resolve these issues, we propose the Residual enhanced Multi-Hypergraph Neural Network, which not only fuses multi-modal information from each hypergraph effectively, but also circumvents the over-smoothing issue associated with HGNN. We conduct experiments on two 3D benchmarks, the NTU and ModelNet40 datasets, and compare against multiple state-of-the-art methods. Experimental results demonstrate that both the residual hypergraph convolutions and the multi-fusion architecture improve the performance of the base model, and the combined model achieves a new state-of-the-art. Code is available at \\url{https://github.com/OneForward/ResMHGNN}.", "revisions": [ { "version": "v1", "updated": "2021-05-02T14:53:32.000Z" } ], "analyses": { "keywords": [ "residual enhanced multi-hypergraph neural network", "experimental results demonstrate", "over-smoothing issue", "model higher-order correlations", "multiple state-of-the-art methods" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }