We present an approach to deep learning on graphs that treats graph learning as a continuous diffusion process and graph neural networks (GNNs) as discretisations of an underlying continuous model.
Our approach allows a principled development of a broad new class of graph learning models able to address common plights of graph learning, such as depth, oversmoothing, and bottlenecks.
We develop linear and nonlinear versions of our model that achieve competitive results on many standard graph benchmarks.
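The central idea, that a GNN layer can be viewed as one explicit Euler step of a graph diffusion equation dx/dt = (A − I)x, can be sketched as follows. This is a minimal illustration only: the toy graph, the random-walk normalisation, and the step size τ are assumptions for the sketch, not the paper's exact model.

```python
import numpy as np

# Toy undirected graph: a 4-node cycle (illustrative, not from the paper)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Random-walk normalised adjacency: each row sums to 1
A_hat = A / A.sum(axis=1, keepdims=True)

def diffuse(x, tau=0.5, steps=10):
    """Explicit Euler discretisation of dx/dt = (A_hat - I) x.

    Each step x <- x + tau * (A_hat @ x - x) mixes a node's feature
    with its neighbours' average, resembling a linear GNN layer.
    """
    for _ in range(steps):
        x = x + tau * (A_hat @ x - x)
    return x

# One "hot" node; diffusion smooths features toward the graph average
x0 = np.array([[1.0], [0.0], [0.0], [0.0]])
xT = diffuse(x0)
```

Running the diffusion for many steps drives all node features toward a common value, which is the oversmoothing phenomenon the continuous viewpoint makes explicit: it corresponds to integrating the diffusion equation for too long.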
Authors
Benjamin Paul Chamberlain, James Rowbottom, Maria Gorinova, Stefan Webb, Emanuele Rossi, Michael M. Bronstein