Papers about large graph models.
Updated Mar 24, 2024
Accompanying repositories for our paper Graph Foundation Model
Official implementation of GraphCLIP: Enhancing Transferability in Graph Foundation Models for Text-Attributed Graphs
[NeurIPS 2024 🔥] TEG-DB: A Comprehensive Dataset and Benchmark of Textual-Edge Graphs
GFT: Graph Foundation Model with Transferable Tree Vocabulary, NeurIPS 2024.
Subgraph-conditioned Graph Information Bottleneck (S-CGIB) is a novel architecture for pre-training Graph Neural Networks for molecular property prediction, developed by NS Lab, CUK on a pure PyTorch backend.