Abstract
We relate properties of attributed random graph models to the performance of GNN architectures. We identify regimes where GNNs outperform feedforward neural networks and non-attributed graph clustering methods. We compare GNN performance on our synthetic benchmark with performance on popular real-world datasets. We analyze the theoretical foundations of weak recovery for popular one- and two-layer GNN architectures, obtaining an explicit formula for the performance of a one-layer GNN and useful insights on how to proceed in the two-layer case. Finally, we improve by 1 the bound in a notable result on the GNN size generalization problem.
Degree
MS
College and Department
Physical and Mathematical Sciences; Mathematics
Rights
https://lib.byu.edu/about/copyright/
BYU ScholarsArchive Citation
Carson, Brigham Stone, "A Survey of Graph Neural Networks on Synthetic Data" (2023). Theses and Dissertations. 9929.
https://scholarsarchive.byu.edu/etd/9929
Date Submitted
2023-04-18
Document Type
Thesis
Handle
http://hdl.lib.byu.edu/1877/etd12767
Keywords
graph neural networks, machine learning, network theory
Language
English