Improving Expressivity of Graph Neural Networks
Stanisław Purgał. International Joint Conference on Neural Networks (IJCNN), pp. 1–7, 2020.
Abstract
We propose a Graph Neural Network with greater expressive power than commonly used GNNs, one not constrained to differentiating only between graphs that the Weisfeiler-Lehman test recognizes as non-isomorphic. We use a graph attention network with an expanding attention window that aggregates information from nodes exponentially far away. We also use partially random initial embeddings, allowing differentiation between nodes that would otherwise look the same. This could cause problems with a traditional dropout mechanism, so we instead use a "head dropout", randomly ignoring some attention heads rather than some dimensions of the embedding.
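As a rough illustration of the two ideas mentioned in the abstract, the sketch below shows what "partially random initial embeddings" and "head dropout" could look like in PyTorch. The function names, tensor shapes, and the appended-noise formulation are assumptions for illustration, not code from the paper.

```python
import torch

def partially_random_init(node_features: torch.Tensor, random_dims: int) -> torch.Tensor:
    """Append `random_dims` random values to every node's feature vector so that
    structurally identical nodes receive distinct initial embeddings.
    (Hypothetical helper; shapes and naming are assumptions, not the paper's code.)"""
    noise = torch.rand(node_features.size(0), random_dims)
    return torch.cat([node_features, noise], dim=-1)

def head_dropout(head_outputs: torch.Tensor, p: float, training: bool = True) -> torch.Tensor:
    """Zero out whole attention heads with probability `p`, instead of dropping
    individual embedding dimensions as standard dropout would.
    `head_outputs` is assumed to have shape (num_nodes, num_heads, head_dim)."""
    if not training or p == 0.0:
        return head_outputs
    num_heads = head_outputs.size(1)
    keep = (torch.rand(num_heads) >= p).float()   # one keep/drop decision per head
    keep = keep.view(1, num_heads, 1)             # broadcast over nodes and dimensions
    return head_outputs * keep / (1.0 - p)        # inverted-dropout rescaling

# Usage sketch
x = torch.ones(5, 8)                 # 5 nodes with identical features
x = partially_random_init(x, 4)      # now distinguishable by the random part
h = torch.randn(5, 3, 16)            # outputs of 3 attention heads
h = head_dropout(h, p=0.5)           # some heads zeroed during training
```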
BibTeX
@inproceedings{sp-ijcnn20,
  author    = {Stanislaw J. Purgal},
  title     = {Improving Expressivity of Graph Neural Networks},
  booktitle = {2020 International Joint Conference on Neural Networks, {IJCNN} 2020},
  pages     = {1--7},
  publisher = {IEEE},
  year      = {2020},
  url       = {https://doi.org/10.1109/IJCNN48605.2020.9206591},
  doi       = {10.1109/IJCNN48605.2020.9206591},
}