NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 3832
Title: Universal Invariant and Equivariant Graph Neural Networks

Reviewer 1


This work proves that any continuous equivariant function of graphs can be approximated by a one-hidden-layer graph neural network that uses higher-order tensorization and an arbitrary squashing function. One of the main contributions of the paper is the introduction of a framework based on the Stone-Weierstrass theorem, which can be used to reprove a version of a previously known result for invariant graph neural networks. On the negative side, this framework seems to yield fewer insights, for example about the required order of tensorization or about whether the underlying permutation group can be different from the set of all permutations. Personally, I found the mathematical part interesting, and this result could be accepted at NeurIPS, but given what we already know about invariant functions, it is not as exciting and general as it could be.
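For concreteness, the sketch below shows what a permutation-equivariant layer with a squashing nonlinearity looks like. It is only a minimal illustration, not the paper's actual construction: the small basis of equivariant linear maps (row sums, column sums, diagonal), the tanh squashing, and the scalar bias are illustrative assumptions, and the higher-order tensorization discussed above is omitted.

    import numpy as np

    rng = np.random.default_rng(0)

    def squash(x):
        # any bounded squashing nonlinearity; tanh is used here purely as an example
        return np.tanh(x)

    def equivariant_layer(A, alpha, beta, gamma, b):
        # Hypothetical permutation-equivariant linear layer on an adjacency matrix:
        # a small basis of equivariant linear maps R^{n x n} -> R^n (row sums,
        # column sums, diagonal), plus a constant bias, followed by the squashing.
        return squash(alpha * A.sum(axis=1) + beta * A.sum(axis=0)
                      + gamma * np.diag(A) + b)

    # Numerical check of equivariance: f(P A P^T) = P f(A) for a random permutation P.
    n = 5
    A = rng.random((n, n))
    P = np.eye(n)[rng.permutation(n)]
    assert np.allclose(equivariant_layer(P @ A @ P.T, 0.3, -0.7, 1.1, 0.2),
                       P @ equivariant_layer(A, 0.3, -0.7, 1.1, 0.2))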

Reviewer 2


Although the paper is overly technical, it addresses an important problem in deep learning with graph data. In terms of results, the novelty of the paper is in proving universality for the equivariant case in a limited setting; this limitation to the setting where the output is a function over “nodes of the (hyper)graph” should be clarified in the abstract. For the invariant case, the advantage of the proposed technique over the proof of Maron et al. ’19 is that it shows that uniform approximation of a function over graphs of varying size is possible. The disadvantage is that it provides no bound on the order of the equivariant tensors produced in the hidden layers. I have not checked the proofs in the appendix and cannot comment on the significance of the techniques. However, I wonder if the authors could comment on the viability of similar techniques for showing the universality of equivariant networks for a broader class of discrete group actions (e.g., when the feed-forward layer is uniquely equivariant to a group action)?
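To make the invariant/equivariant distinction concrete (a toy illustration only; the specific functions below are assumptions, not the paper's construction): an invariant network outputs a single graph-level value unchanged by node relabelling, whereas the equivariant setting discussed above outputs one value per node, permuted along with the nodes.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 4
    A = rng.random((n, n))
    P = np.eye(n)[rng.permutation(n)]

    graph_level = lambda A: np.tanh(A.sum())       # invariant: f(P A P^T) == f(A)
    node_level = lambda A: np.tanh(A.sum(axis=1))  # equivariant: f(P A P^T) == P f(A)

    assert np.isclose(graph_level(P @ A @ P.T), graph_level(A))
    assert np.allclose(node_level(P @ A @ P.T), P @ node_level(A))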

Reviewer 3


This paper proves the universality of invariant and equivariant graph neural networks by using the Stone-Weierstrass theorem. The paper is original: the invariance and equivariance properties of graph neural networks have seldom been formalized in the existing literature, and this paper lays a solid theoretical foundation for the universality of invariant and equivariant graph neural networks defined by a single set of parameters. However, unless the graph neural network proposed in Equation 1 covers a large number of existing graph neural networks, the superiority of the proposed graph neural networks over other graph neural networks needs to be validated with empirical studies such as graph classification. Since the paper neither generalizes existing graph neural networks nor provides experimental studies, at this stage its contribution is of limited significance to the graph deep learning community.

Other aspects of this paper are summarized below:

Quality: The quality of this paper largely depends on the correctness of the proofs of the proposed theorems. I haven't checked all the proofs carefully.

Clarity: This paper is well written in clear language. There are occasional cases where notation is not explained when it first appears, for example:
* Page 2, line 86: what does ‘i1,…,id’ represent?
* Page 3, line 90: what does [n]={1,…,n} mean?
* Page 3, line 115, Equation 1: what is b?