
Introduction

Graph data is used extensively across many domains, owing to its capacity to represent complex structural relationships among diverse real-world entities. However, the rapid growth of graph data introduces significant challenges for storage, transmission, and the training of graph neural networks (GNNs) for effective graph analysis. In light of these challenges, graph condensation (GC) has emerged as a data-centric solution: it synthesizes a compact yet representative graph to replace the original large graph in GNN training. GNNs trained on condensed graphs can achieve performance comparable to models trained on the full-scale data, which has attracted substantial attention and stimulated extensive research. In response to this trend, this tutorial provides a comprehensive and up-to-date overview of GC research. It systematically organizes existing studies into five categories aligned with critical GC evaluation criteria: effectiveness, generalization, efficiency, fairness, and robustness. It also provides an in-depth analysis of two fundamental components of GC, optimization strategies and condensed graph generation, elucidating their key characteristics and underlying technologies. Finally, the tutorial explores GC applications across various fields and outlines potential directions for future research in this rapidly evolving and impactful domain.
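To make the core idea concrete, below is a minimal sketch of gradient-matching graph condensation in PyTorch, in the spirit of GCond-style methods discussed later in the tutorial. The `DenseGCN` model, the `grad_match_loss` helper, the random placeholder data, and all hyperparameters are illustrative assumptions for this sketch, not code from the tutorial or from GCondenser.

```python
# Illustrative sketch of gradient-matching graph condensation.
# All names, shapes, and hyperparameters are assumptions for demonstration only.
import torch
import torch.nn.functional as F


class DenseGCN(torch.nn.Module):
    """Two-layer GCN operating on a dense normalized adjacency matrix."""
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.w1 = torch.nn.Linear(in_dim, hid_dim)
        self.w2 = torch.nn.Linear(hid_dim, out_dim)

    def forward(self, x, adj):
        x = F.relu(adj @ self.w1(x))
        return adj @ self.w2(x)


def normalize_adj(adj):
    """Symmetric normalization with self-loops: D^{-1/2}(A + I)D^{-1/2}."""
    adj = adj + torch.eye(adj.size(0))
    d_inv_sqrt = adj.sum(1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)


def grad_match_loss(model, loss_real, loss_syn):
    """Cosine distance between gradients of the real-graph and synthetic-graph losses."""
    params = list(model.parameters())
    g_real = torch.autograd.grad(loss_real, params, retain_graph=True)
    g_syn = torch.autograd.grad(loss_syn, params, create_graph=True)
    return sum(1 - F.cosine_similarity(gr.flatten(), gs.flatten(), dim=0)
               for gr, gs in zip(g_real, g_syn))


# Toy "original" graph: random placeholders standing in for a real dataset.
n, n_syn, d, c = 200, 20, 16, 4
x_real = torch.randn(n, d)
a = (torch.rand(n, n) > 0.95).float()
adj_real = normalize_adj(torch.maximum(a, a.t()))  # symmetric toy adjacency
y_real = torch.randint(0, c, (n,))

# Learnable condensed graph: node features and adjacency logits; fixed balanced labels.
x_syn = torch.nn.Parameter(torch.randn(n_syn, d))
adj_logits = torch.nn.Parameter(torch.randn(n_syn, n_syn))
y_syn = torch.arange(n_syn) % c
opt_syn = torch.optim.Adam([x_syn, adj_logits], lr=0.01)

for step in range(200):
    # Fresh random GNN initialization at each step, as in gradient-matching GC.
    model = DenseGCN(d, 32, c)
    adj_syn = normalize_adj(torch.sigmoid((adj_logits + adj_logits.t()) / 2))

    loss_real = F.cross_entropy(model(x_real, adj_real), y_real)
    loss_syn = F.cross_entropy(model(x_syn, adj_syn), y_syn)

    opt_syn.zero_grad()
    grad_match_loss(model, loss_real, loss_syn).backward()
    opt_syn.step()
```

After condensation, a fresh GNN would be trained only on the small synthetic graph (x_syn, adj_syn, y_syn) and evaluated on the original graph's test nodes, which is how the effectiveness of GC methods is typically measured.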



Outline

This is a lecture-style tutorial that includes:

Section 1: Welcome and Introduction (10 mins)

Section 2: Definition and Taxonomy of GC (20 mins)

Section 3: A Review of GC Methods (90 mins)

Section 4: GC Toolkit: GCondenser (20 mins)

Section 5: Applications and New Trends (20 mins)

Section 6: Conclusion and Open Discussions (20 mins)


Presenter


Our Papers on Graph Condensation