Witbrock, Michael
Amor, Robert
Zhu, Yonghua
2025-03-11
2025
https://hdl.handle.net/2292/71665

Node classification is a core task in graph-based machine learning: the goal is to predict labels or categories for the nodes of a graph by leveraging the relational structure and attributes of the data. To enhance the expressiveness of node representations, various Graph Neural Networks (GNNs) have been proposed that aggregate information from neighbouring nodes, conducting message passing within local receptive fields, a process also referred to as local learning. However, the available graph structure is often suboptimal, characterized by noisy or missing edges, which generally degrades node classification performance. To address this issue, this dissertation explores node classification that considers relations among all nodes, referred to as global learning, and investigates the opportunities and challenges of applying it across different scenarios.

We introduce Chain of Propagation Prompting (CPP) to enhance the expressiveness of node representations while reducing their dependency on label information. CPP designs a simple message-passing pattern and incorporates it into node representations using graph contrastive learning. This simple pattern prompts multi-head self-attention layers to capture more complex patterns globally while minimizing their reliance on label information. Additionally, we implement majority voting across the multiple heads to enhance predictive confidence.

We introduce Robust Node Classification under Graph and Label Noise (RNCGLN) to improve the robustness of node classification when both graph noise and label noise are present. By integrating local graph learning and global graph learning, RNCGLN provides comprehensive information that enhances node classification performance.
Additionally, we develop graph and label self-improvement modules to improve and supplement the quality of the supervisory information. RNCGLN leverages self-training and pseudo-labelling techniques to drive these two self-improvement processes within an end-to-end learning framework.

We introduce a Flexible-pass Filter-based Graph Transformer (FFGT) to resist adversarial attacks on graph data. Leveraging self-attention's ability to capture arbitrary graph filters, our three-head self-attention layers capture multi-frequency representations across the low-frequency, hybrid-frequency, and high-frequency ranges. Additionally, we design graph learning and fusion modules that improve the effectiveness of self-attention in capturing the designed information, yielding a flexible-frequency representation. Consequently, FFGT shows consistent resistance to adversarial perturbation across multiple datasets and against diverse adversarial attacks.

We conducted theoretical analyses and numerical evaluations of the proposed methods on diverse graph data. The experimental results show that our methods, which leverage global learning strategies, consistently outperform traditional GNNs based on local learning. This superior performance holds across various scenarios, including diverse graph datasets, graph noise, label noise, and multiple adversarial attacks. Additionally, our theoretical analysis validates the effectiveness of each proposed method in addressing these challenges. Together, these findings confirm that our methods enhance expressiveness, improve robustness to noise, and strengthen resilience against adversarial attacks.

Items in ResearchSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
https://researchspace.auckland.ac.nz/docs/uoa-docs/rights.htm
Node Classification on Graph Data with Global Learning
Thesis
Copyright: The author
Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/
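The contrast the abstract draws between local learning (message passing within one-hop neighbourhoods) and global learning (relating all node pairs, as self-attention does) can be illustrated with a minimal sketch. The toy path graph, scalar features, and single unparameterised attention step below are illustrative assumptions for exposition only, not the dissertation's actual models.

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

# Toy path graph: 4 nodes with scalar features and an adjacency list.
features = [1.0, 2.0, 3.0, 4.0]
edges = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}

def local_step(feats, adj):
    """Local learning: one message-passing step averages each node's
    feature with its immediate neighbours (its local receptive field)."""
    out = []
    for i, f in enumerate(feats):
        neigh = [feats[j] for j in adj[i]]
        out.append(sum([f] + neigh) / (1 + len(neigh)))
    return out

def global_step(feats):
    """Global learning: one self-attention step lets every node attend
    to every other node, regardless of the given edge structure."""
    out = []
    for fi in feats:
        scores = [fi * fj for fj in feats]      # dot-product similarity
        weights = softmax(scores)               # attention weights over all nodes
        out.append(sum(w * fj for w, fj in zip(weights, feats)))
    return out

print(local_step(features, edges))  # → [1.5, 2.0, 3.0, 3.5]
print(global_step(features))
```

The local step only mixes information along existing edges, so a noisy or missing edge directly distorts the result; the global step forms a weighted combination over all nodes, which is the property the dissertation's global-learning methods exploit.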