Graph Efficient Pre-training for Knowledge Transfer


dc.contributor.advisor Chiang, Meng-Fen
dc.contributor.author Lu, Sirui
dc.date.accessioned 2022-08-02T03:51:33Z
dc.date.available 2022-08-02T03:51:33Z
dc.date.issued 2022 en
dc.identifier.uri https://hdl.handle.net/2292/60656
dc.description Full Text is available to authenticated members of The University of Auckland only.
dc.description.abstract Self-supervised graph pre-training frameworks have proven effective for in-domain knowledge transfer, where a model is typically pre-trained on massive unlabeled graph data to learn general transferable knowledge before being fine-tuned for specific downstream tasks. However, their capability to learn domain-invariant knowledge for cross-domain transfer remains unknown. Moreover, information about how individual nodes are positioned within the entire graph is largely overlooked. To bridge this gap, we propose the Graph Efficient (GrapE) pre-training framework, which seamlessly integrates augmented graphs with complementary positioning information to enhance domain-invariant knowledge learning in a self-supervised manner for cross-domain graph transfer. First, to obtain a global positioning perspective, we propose to augment the original graph with a component graph, which reveals dual topological structures at the node level and the component level. The resulting global proximity estimates enrich the self-supervised signals between sets of nodes and components, offering a rich perspective on positioning individual nodes within the entire graph. Second, to alleviate the tremendous computational burden of pre-training on massive graphs, GrapE adopts a sequential training paradigm that continually accumulates transferable knowledge over a limited set of sampled graph instances, improving data efficiency. In extensive experiments on four benchmarks, across two fine-tuning tasks in both in-domain and cross-domain transfer settings, GrapE achieves better data efficiency, generalization performance, and transferability by a considerable margin.
dc.publisher ResearchSpace@Auckland en
dc.relation.ispartof Masters Thesis - University of Auckland en
dc.relation.isreferencedby UoA en
dc.rights Items in ResearchSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
dc.rights Restricted Item. Full Text is available to authenticated members of The University of Auckland only.
dc.rights.uri https://researchspace.auckland.ac.nz/docs/uoa-docs/rights.htm en
dc.rights.uri http://creativecommons.org/licenses/by-nc-sa/3.0/nz/
dc.title Graph Efficient Pre-training for Knowledge Transfer
dc.type Thesis en
thesis.degree.discipline Computer Science
thesis.degree.grantor The University of Auckland en
thesis.degree.level Masters en
dc.date.updated 2022-07-05T05:57:29Z
dc.rights.holder Copyright: the author en

