graph convolutional networks
1. polynomials of the graph laplacian
- The $k$-th power of the graph Laplacian reaches all nodes at most $k$ steps away from a given vertex: the entry $(L^k)_{uv}$ can be nonzero only if $u$ and $v$ are within $k$ hops of each other
- a polynomial filter is a linear combination of powers of the graph Laplacian, $p_w(L) = w_0 I + w_1 L + \dots + w_d L^d$
- the input to such a filter is a matrix of node features – one feature vector per node
- the output is of the same shape
- the polynomial coefficients $w_0, \dots, w_d$ are the learnable parameters (see the sketch after this list)
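
A minimal NumPy sketch of a polynomial filter $p_w(L)\,X = \sum_k w_k L^k X$, assuming the unnormalized Laplacian $L = D - A$; the toy path graph, the function names (`graph_laplacian`, `poly_filter`), and the weights are illustrative, not from the notes above.

```python
import numpy as np

def graph_laplacian(A):
    """Unnormalized graph Laplacian L = D - A from an adjacency matrix A."""
    return np.diag(A.sum(axis=1)) - A

def poly_filter(L, X, w):
    """Apply p_w(L) = w[0] I + w[1] L + ... + w[d] L^d to node features X.

    L : (n, n) graph Laplacian
    X : (n, f) node feature matrix, one feature vector per node
    w : (d+1,) polynomial coefficients (the learnable weights)
    """
    out = np.zeros_like(X, dtype=float)
    Lk_X = X.astype(float)          # L^0 X = X
    for wk in w:
        out += wk * Lk_X            # accumulate w_k * (L^k X)
        Lk_X = L @ Lk_X             # advance to L^{k+1} X
    return out                      # same shape as X

# Toy example: a path graph on 4 nodes with a one-hot scalar feature.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
L = graph_laplacian(A)
X = np.array([[1.0], [0.0], [0.0], [0.0]])
w = np.array([0.5, 0.3, 0.2])       # degree-2 polynomial: mixes nodes within 2 hops
print(poly_filter(L, X, w))
```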
2. global convolutions
- Take the eigenvectors of the graph Laplacian and write the node features in that basis, keeping only the eigenvectors with the smallest eigenvalues (the lowest-frequency components) for efficiency
- in this spectral basis, convolution amounts to elementwise multiplication of the transformed features by learned filter values; multiplying by the (truncated) eigenvector matrix transforms the result back to the node domain (see the sketch below)
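
A minimal sketch of this spectral convolution, under the assumption that "convolution" means elementwise multiplication by learned filter values in the Laplacian's eigenbasis; the truncation size `m`, the function name `spectral_conv`, and the weights are illustrative.

```python
import numpy as np

def spectral_conv(A, X, spectral_weights, m):
    """Filter node features X in a truncated eigenbasis of the Laplacian.

    A : (n, n) adjacency matrix
    X : (n, f) node feature matrix
    spectral_weights : (m,) learned filter values, one per kept eigenvector
    m : number of lowest-frequency eigenvectors to keep
    """
    L = np.diag(A.sum(axis=1)) - A
    eigvals, eigvecs = np.linalg.eigh(L)      # eigh returns eigenvalues in ascending order
    U_m = eigvecs[:, :m]                      # keep the m lowest-frequency eigenvectors
    X_hat = U_m.T @ X                         # transform features into the spectral basis
    X_hat = spectral_weights[:, None] * X_hat # elementwise "convolution" in the spectral domain
    return U_m @ X_hat                        # transform back to the node domain, same shape as X

# Toy example on the same 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 2)                     # 2-dimensional features per node
w_hat = np.array([1.0, 0.5])                  # filter values for the 2 kept eigenvectors
print(spectral_conv(A, X, w_hat, m=2))        # output shape (4, 2), same as X
```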