Awesome-Language-Model-on-Graphs

A curated list of papers and resources about large language models (LLMs) on graphs based on our survey paper: Large Language Models on Graphs: A Comprehensive Survey.

This repo will be continuously updated. Don't forget to star it and stay tuned!

Please cite the paper in Citations if you find the resource helpful for your research. Thanks!

Why LLMs on graphs?

Large language models (LLMs), such as ChatGPT and LLaMA, are driving significant advances in natural language processing thanks to their strong text encoding/decoding ability and newly discovered emergent capabilities (e.g., reasoning). While LLMs are mainly designed to process pure text, there are many real-world scenarios where text data are associated with rich structural information in the form of graphs (e.g., academic networks and e-commerce networks), or where graph data are paired with rich textual information (e.g., molecules with descriptions). Moreover, although LLMs have shown strong text-based reasoning ability, it remains underexplored whether such ability can be generalized to graphs (i.e., graph-based reasoning). In our survey, we provide a comprehensive review of scenarios and techniques related to large language models on graphs.

Keywords Convention

Each paper entry is annotated with two badges:

- Architecture: the Transformer architecture used in the work, e.g., EncoderOnly, DecoderOnly, EncoderDecoder.
- Size: the size of the language model, i.e., medium (less than 1B parameters) or LLM (more than 1B parameters).
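For instance, a paper built on a decoder-only model with more than 1B parameters would carry badges along the lines of ![](https://img.shields.io/badge/DecoderOnly-blue) ![](https://img.shields.io/badge/LLM-red); this is an illustrative combination following the badge template in the Contribution section, not a badge set copied from a specific entry.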

Perspectives

  1. Unifying Large Language Models and Knowledge Graphs: A Roadmap. preprint

    Shirui Pan, Linhao Luo, Yufei Wang, Chen Chen, Jiapu Wang, Xindong Wu. [PDF], 2023.6

  2. Integrating Graphs with Large Language Models: Methods and Prospects. preprint

    Shirui Pan, Yizhen Zheng, Yixin Liu. [PDF], 2023.10

  3. Towards graph foundation models: A survey and beyond. preprint

    Jiawei Liu, Cheng Yang, Zhiyuan Lu, Junze Chen, Yibo Li, Mengmei Zhang, Ting Bai, Yuan Fang, Lichao Sun, Philip S. Yu, Chuan Shi. [PDF], 2023.10

  4. A Survey of Graph Meets Large Language Model: Progress and Future Directions. preprint

    Yuhan Li, Zhixun Li, Peisong Wang, Jia Li, Xiangguo Sun, Hong Cheng, Jeffrey Xu Yu. [PDF], 2023.11

Pure Graphs

Datasets

See Table 3 in our survey paper Large Language Models on Graphs: A Comprehensive Survey.

Direct Answering

  1. Can Language Models Solve Graph Problems in Natural Language? preprint

    Heng Wang, Shangbin Feng, Tianxing He, Zhaoxuan Tan, Xiaochuang Han, Yulia Tsvetkov. [PDF] [Code], 2023.5,

  2. GPT4Graph: Can Large Language Models Understand Graph Structured Data? An Empirical Evaluation and Benchmarking. preprint

    Jiayan Guo, Lun Du, Hengyu Liu, Mengyu Zhou, Xinyi He, Shi Han. [PDF], 2023.5,

  3. Evaluating Large Language Models on Graphs: Performance Insights and Comparative Analysis. preprint

    Chang Liu, Bo Wu. [PDF] [Code], 2023.8,

  4. Talk Like A Graph: Encoding Graphs For Large Language Models. preprint

    Bahare Fatemi, Jonathan Halcrow, Bryan Perozzi. [PDF], 2023.10,

  5. GraphLLM: Boosting Graph Reasoning Ability of Large Language Model. preprint

    Ziwei Chai, Tianjie Zhang, Liang Wu, Kaiqiao Han, Xiaohai Hu, Xuanwen Huang, Yang Yang. [PDF] [Code], 2023.10,

  6. LLM4DyG: Can Large Language Models Solve Problems on Dynamic Graphs? preprint

    Zeyang Zhang, Xin Wang, Ziwei Zhang, Haoyang Li, Yijian Qin, Simin Wu, Wenwu Zhu [PDF] [Code], 2023.10,

  7. Which Modality should I use - Text, Motif, or Image?: Understanding Graphs with Large Language Models. preprint

    Debarati Das, Ishaan Gupta, Jaideep Srivastava, Dongyeop Kang [PDF] [Code], 2023.11,

Heuristic Reasoning

  1. StructGPT: A General Framework for Large Language Model to Reason over Structured Data. preprint

    Jinhao Jiang, Kun Zhou, Zican Dong, Keming Ye, Wayne Xin Zhao, Ji-Rong Wen. [PDF] [Code], 2023.5,

  2. Think-on-Graph: Deep and Responsible Reasoning of Large Language Model on Knowledge Graph. preprint

    Jiashuo Sun, Chengjin Xu, Lumingyuan Tang, Saizhuo Wang, Chen Lin, Yeyun Gong, Lionel M. Ni, Heung-Yeung Shum, Jian Guo. [PDF] [Code], 2023.7,

  3. Exploring Large Language Model for Graph Data Understanding in Online Job Recommendations. preprint

    Likang Wu, Zhaopeng Qiu, Zhi Zheng, Hengshu Zhu, Enhong Chen. [PDF] [Code], 2023.7,

  4. Knowledge Graph Prompting for Multi-Document Question Answering. AAAI2024

    Yu Wang, Nedim Lipka, Ryan Rossi, Alex Siu, Ruiyi Zhang, Tyler Derr. [PDF] [Code], 2023.8,

  5. ChatRule: Mining Logical Rules with Large Language Models for Knowledge Graph Reasoning. preprint

    Linhao Luo, Jiaxin Ju, Bo Xiong, Yuan-Fang Li, Gholamreza Haffari, Shirui Pan. [PDF] [Code], 2023.9,

  6. Reasoning on Graphs: Faithful and Interpretable Large Language Model Reasoning. preprint

    Linhao Luo, Yuan-Fang Li, Gholamreza Haffari, Shirui Pan. [PDF] [Code], 2023.10,

  7. Thought Propagation: An Analogical Approach to Complex Reasoning with Large Language Models. preprint

    Junchi Yu, Ran He, Rex Ying. [PDF], 2023.10,

  8. Large Language Models Can Learn Temporal Reasoning. preprint

    Siheng Xiong, Ali Payani, Ramana Kompella, Faramarz Fekri. [PDF], 2024.1,

  9. Exploring the Limitations of Graph Reasoning in Large Language Models. preprint

    Palaash Agrawal, Shavak Vasania, Cheston Tan. [PDF], 2024.2,

  10. Rendering Graphs for Graph Reasoning in Multimodal Large Language Models. preprint

    Yanbin Wei, Shuai Fu, Weisen Jiang, James T. Kwok, Yu Zhang. [PDF], 2024.2,

  11. Graph-enhanced Large Language Models in Asynchronous Plan Reasoning. preprint

    Fangru Lin, Emanuele La Malfa, Valentin Hofmann, Elle Michelle Yang, Anthony Cohn, Janet B. Pierrehumbert. [PDF], 2024.2,

  12. Microstructures and Accuracy of Graph Recall by Large Language Models. preprint

    Yanbang Wang, Hejie Cui, Jon Kleinberg. [PDF], 2024.2,

  13. Structure Guided Prompt: Instructing Large Language Model in Multi-Step Reasoning by Exploring Graph Structure of the Text. preprint

    Kewei Cheng, Nesreen K. Ahmed, Theodore Willke, Yizhou Sun. [PDF], 2024.2,

  14. GraphInstruct: Empowering Large Language Models with Graph Understanding and Reasoning Capability. preprint

    Zihan Luo, Xiran Song, Hong Huang, Jianxun Lian, Chenhao Zhang, Jinqi Jiang, Xing Xie, Hai Jin. [PDF], 2024.3,

  15. Call Me When Necessary: LLMs can Efficiently and Faithfully Reason over Structured Environments. preprint

    Sitao Cheng, Ziyuan Zhuang, Yong Xu, Fangkai Yang, Chaoyun Zhang, Xiaoting Qin, Xiang Huang, Ling Chen, Qingwei Lin, Dongmei Zhang, Saravan Rajmohan, Qi Zhang. [PDF], 2024.3,

  16. Exploring the Potential of Large Language Models in Graph Generation. preprint

    Yang Yao, Xin Wang, Zeyang Zhang, Yijian Qin, Ziwei Zhang, Xu Chu, Yuekui Yang, Wenwu Zhu, Hong Mei. [PDF], 2024.3,

Algorithmic Reasoning

  1. Graph-ToolFormer: To Empower LLMs with Graph Reasoning Ability via Prompt Augmented by ChatGPT. preprint

    Jiawei Zhang. [PDF] [Code], 2023.4,

Text-Attributed Graphs

Datasets

See Table 7 in our survey paper Large Language Models on Graphs: A Comprehensive Survey.

LLM as Predictor (Node)

Graph As Sequence (Node)

  1. MATCH: Metadata-Aware Text Classification in A Large Hierarchy. WWW 2021

    Yu Zhang, Zhihong Shen, Yuxiao Dong, Kuansan Wang, Jiawei Han. [PDF] [Code], 2021.2,

  2. QA-GNN: Reasoning with Language Models and Knowledge Graphs for Question Answering. NAACL 2021

    Michihiro Yasunaga, Hongyu Ren, Antoine Bosselut, Percy Liang, Jure Leskovec. [PDF] [Code], 2021.4,

  3. Natural Language is All a Graph Needs. preprint

    Ruosong Ye, Caiqi Zhang, Runhui Wang, Shuyuan Xu, Yongfeng Zhang. [PDF], 2023.8,

  4. Can LLMs Effectively Leverage Graph Structural Information: When and Why. preprint

    Jin Huang, Xingjian Zhang, Qiaozhu Mei, Jiaqi Ma. [PDF] [Code], 2023.9,

  5. Graph Neural Prompting with Large Language Models. preprint

    Yijun Tian, Huan Song, Zichen Wang, Haozhu Wang, Ziqing Hu, Fang Wang, Nitesh V. Chawla, Panpan Xu. [PDF], 2023.9,

  6. Prompt-based Node Feature Extractor for Few-shot Learning on Text-Attributed Graphs. preprint

    Xuanwen Huang, Kaiqiao Han, Dezheng Bao, Quanjin Tao, Zhisheng Zhang, Yang Yang, Qi Zhu. [PDF], 2023.9,

  7. GraphText: Graph Reasoning in Text Space. preprint

    Jianan Zhao, Le Zhuo, Yikang Shen, Meng Qu, Kai Liu, Michael Bronstein, Zhaocheng Zhu, Jian Tang [PDF], 2023.10,

  8. GraphGPT: Graph Instruction Tuning for Large Language Models. preprint

    Jiabin Tang, Yuhao Yang, Wei Wei, Lei Shi, Lixin Su, Suqi Cheng, Dawei Yin, Chao Huang. [PDF], 2023.10,

  9. Learning Multiplex Embeddings on Text-rich Networks with One Text Encoder. preprint

    Bowen Jin, Wentao Zhang, Yu Zhang, Yu Meng, Han Zhao, Jiawei Han. [PDF][Code], 2023.10,

  10. Disentangled Representation Learning with Large Language Models for Text-Attributed Graphs. preprint

    Yijian Qin, Xin Wang, Ziwei Zhang, Wenwu Zhu. [PDF], 2023.10,

  11. ReasoningLM: Enabling Structural Subgraph Reasoning in Pre-trained Language Models for Question Answering over Knowledge Graph. EMNLP 2023

    Jinhao Jiang, Kun Zhou, Wayne Xin Zhao, Yaliang Li, Ji-Rong Wen. [PDF], 2023.12,

  12. Chain of History: Learning and Forecasting with LLMs for Temporal Knowledge Graph Completion. preprint

    Ruilin Luo, Tianle Gu, Haoling Li, Junzhe Li, Zicheng Lin, Jiayi Li, Yujiu Yang. [PDF] [Code], 2024.1,

  13. Scalable Link Prediction on Large-Scale Heterogeneous Graphs with Large Language Models. preprint

    Baolong Bi, Shenghua Liu, Yiwei Wang, Lingrui Mei, Xueqi Chen. [PDF], 2024.1,

  14. Similarity-based Neighbor Selection for Graph LLMs. preprint

    Rui Li, Jiwei Li, Jiawei Han, Guoyin Wang. [PDF], 2024.2,

  15. Let Your Graph Do the Talking: Encoding Structured Data for LLMs. preprint

    Bryan Perozzi, Bahare Fatemi, Dustin Zelle, Anton Tsitsulin, Mehran Kazemi, Rami Al-Rfou, Jonathan Halcrow. [PDF], 2024.2,

  16. InstructGraph: Boosting Large Language Models via Graph-centric Instruction Tuning and Preference Alignment. preprint

    Jianing Wang, Junda Wu, Yupeng Hou, Yao Liu, Ming Gao, Julian McAuley. [PDF], 2024.2,

Graph-Empowered LLM (Node)

  1. Text Generation from Knowledge Graphs with Graph Transformers. NAACL 2019

    Rik Koncel-Kedziorski, Dhanush Bekal, Yi Luan, Mirella Lapata, Hannaneh Hajishirzi. [PDF] [Code], 2019.4,

  2. GraphFormers: GNN-nested Transformers for Representation Learning on Textual Graph. NeurIPS 2021

    Junhan Yang, Zheng Liu, Shitao Xiao, Chaozhuo Li, Defu Lian, Sanjay Agrawal, Amit Singh, Guangzhong Sun, Xing Xie. [PDF][Code], 2021.5,

  3. GreaseLM: Graph Reasoning Enhanced Language Models for Question Answering. ICLR 2022

    Xikun Zhang, Antoine Bosselut, Michihiro Yasunaga, Hongyu Ren, Percy Liang, Christopher D Manning and Jure Leskovec. [PDF] [Code], 2022.1,

  4. Heterformer: Transformer-based Deep Node Representation Learning on Heterogeneous Text-Rich Networks. KDD 2023

    Bowen Jin, Yu Zhang, Qi Zhu, Jiawei Han. [PDF][Code], 2022.5,

  5. Hidden Schema Networks. preprint

    Ramsés J. Sánchez, Lukas Conrads, Pascal Welke, Kostadin Cvejoski, César Ojeda. [PDF], 2022.7,

  6. DRAGON: Deep Bidirectional Language-Knowledge Graph Pretraining. NeurIPS 2022

    Michihiro Yasunaga, Antoine Bosselut, Hongyu Ren, Xikun Zhang, Christopher D. Manning, Percy Liang, Jure Leskovec. [PDF][Code], 2022.10,

  7. Edgeformers: Graph-Empowered Transformers for Representation Learning on Textual-Edge Networks. ICLR 2023

    Bowen Jin, Yu Zhang, Yu Meng, Jiawei Han. [PDF][Code], 2023.1,

  8. Patton: Language Model Pretraining on Text-rich Networks. ACL 2023

    Bowen Jin, Wentao Zhang, Yu Zhang, Yu Meng, Xinyang Zhang, Qi Zhu, Jiawei Han. [PDF][Code], 2023.5,

  9. Graph Language Models. preprint

    Moritz Plenz, Anette Frank. [PDF], 2024.1,

  10. Efficient Tuning and Inference for Large Language Models on Textual Graphs. preprint

    Yun Zhu, Yaoke Wang, Haizhou Shi, Siliang Tang. [PDF], 2024.1,

  11. HyperBERT: Mixing Hypergraph-Aware Layers with Language Models for Node Classification on Text-Attributed Hypergraphs. preprint

    Adrián Bazaga, Pietro Liò, Gos Micklem. [PDF], 2024.2,

Graph-Aware LLM Finetuning

  1. Explaining Relationships Between Scientific Documents. ACL 2021

    Kelvin Luu, Xinyi Wu, Rik Koncel-Kedziorski, Kyle Lo, Isabel Cachola, Noah A. Smith. [PDF], 2020.2,

  2. SPECTER: Document-level Representation Learning using Citation-informed Transformers. ACL 2020

    Arman Cohan, Sergey Feldman, Iz Beltagy, Doug Downey, Daniel S. Weld. [PDF], 2020.4,

  3. Pre-training for Ad-hoc Retrieval: Hyperlink is Also You Need. CIKM 2021

    Zhengyi Ma, Zhicheng Dou, Wei Xu, Xinyu Zhang, Hao Jiang, Zhao Cao, Ji-Rong Wen. [PDF] [Code], 2021.8,

  4. Neighborhood Contrastive Learning for Scientific Document Representations with Citation Embeddings. EMNLP 2022

    Malte Ostendorff, Nils Rethmeier, Isabelle Augenstein, Bela Gipp, Georg Rehm. [PDF][Code], 2022.2,

  5. Metadata-Induced Contrastive Learning for Zero-Shot Multi-Label Text Classification. WWW 2022

    Yu Zhang, Zhihong Shen, Chieh-Han Wu, Boya Xie, Junheng Hao, Ye-Yi Wang, Kuansan Wang, Jiawei Han. [PDF][Code], 2022.2,

  6. LinkBERT: Pretraining Language Models with Document Links. ACL 2022

    Michihiro Yasunaga, Jure Leskovec, Percy Liang. [PDF][Code], 2022.3,

  7. E2EG: End-to-End Node Classification Using Graph Topology and Text-based Node Attributes. ICDM 2023

    Tu Anh Dinh, Jeroen den Boef, Joran Cornelisse, Paul Groth. [PDF], 2022.8,

  8. TwHIN-BERT: A Socially-Enriched Pre-trained Language Model for Multilingual Tweet Representations at Twitter. KDD 2023

    Xinyang Zhang, Yury Malkov, Omar Florez, Serim Park, Brian McWilliams, Jiawei Han, Ahmed El-Kishky. [PDF] [Code], 2022.9,

  9. TouchUp-G: Improving Feature Representation through Graph-Centric Finetuning. preprint

    Jing Zhu, Xiang Song, Vassilis N. Ioannidis, Danai Koutra, Christos Faloutsos. [PDF], 2023.9,

  10. Wikiformer: Pre-training with Structured Information of Wikipedia for Ad-hoc Retrieval. preprint

    Weihang Su, Qingyao Ai, Xiangsheng Li, Jia Chen, Yiqun Liu, Xiaolong Wu, Shengluan Hou. [PDF], 2023.12,

  11. WalkLM: A Uniform Language Model Fine-tuning Framework for Attributed Graph Embedding. NeurIPS 2023

    Yanchao Tan, Zihao Zhou, Hang Lv, Weiming Liu, Carl Yang. [PDF], 2023.12,

  12. GLaM: Fine-Tuning Large Language Models for Domain Knowledge Graph Alignment via Neighborhood Partitioning and Generative Subgraph Encoding. AAAI 2024

    Stefan Dernbach, Khushbu Agarwal, Alejandro Zuniga, Michael Henry, Sutanay Choudhury. [PDF], 2024.2,

LLM as Encoder

Optimization

  1. GNN-LM: Language Modeling based on Global Contexts via GNN. ICLR 2022

    Yuxian Meng, Shi Zong, Xiaoya Li, Xiaofei Sun, Tianwei Zhang, Fei Wu, Jiwei Li. [PDF] [Code], 2021.10,

  2. Node Feature Extraction by Self-Supervised Multi-Scale Neighborhood Prediction. ICLR 2022

    Eli Chien, Wei-Cheng Chang, Cho-Jui Hsieh, Hsiang-Fu Yu, Jiong Zhang, Olgica Milenkovic, Inderjit S Dhillon. [PDF][Code], 2021.11,

  3. TextGNN: Improving Text Encoder via Graph Neural Network in Sponsored Search. WWW 2021

    Jason Yue Zhu, Yanling Cui, Yuming Liu, Hao Sun, Xue Li, Markus Pelger, Tianqi Yang, Liangjie Zhang, Ruofei Zhang, Huasha Zhao. [PDF] [Code], 2022.1

  4. AdsGNN: Behavior-Graph Augmented Relevance Modeling in Sponsored Search. SIGIR 2021

    Chaozhuo Li, Bochen Pang, Yuming Liu, Hao Sun, Zheng Liu, Xing Xie, Tianqi Yang, Yanling Cui, Liangjie Zhang, Qi Zhang. [PDF] [Code], 2022.4

  5. Efficient and effective training of language and graph neural network models. AAAI 2023

    Vassilis N Ioannidis, Xiang Song, Da Zheng, Houyu Zhang, Jun Ma, Yi Xu, Belinda Zeng, Trishul Chilimbi, George Karypis. [PDF], 2022.6,

  6. Graph-Aware Language Model Pre-Training on a Large Graph Corpus Can Help Multiple Graph Applications. KDD 2023

    Han Xie, Da Zheng, Jun Ma, Houyu Zhang, Vassilis N. Ioannidis, Xiang Song, Qing Ping, Sheng Wang, Carl Yang, Yi Xu, Belinda Zeng, Trishul Chilimbi. [PDF], 2023.6,

  7. Exploring the Potential of Large Language Models (LLMs) in Learning on Graphs. preprint

    Zhikai Chen, Haitao Mao, Hang Li, Wei Jin, Hongzhi Wen, Xiaochi Wei, Shuaiqiang Wang, Dawei Yin, Wenqi Fan, Hui Liu, Jiliang Tang. [PDF] [Code], 2023.7,

  8. SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning. preprint

    Keyu Duan, Qian Liu, Tat-Seng Chua, Shuicheng Yan, Wei Tsang Ooi, Qizhe Xie, Junxian He. [PDF] [Code], 2023.8,

Data Augmentation

  1. Explanations as Features: LLM-Based Features for Text-Attributed Graphs. preprint

    Xiaoxin He, Xavier Bresson, Thomas Laurent, Adam Perold, Yann LeCun, Bryan Hooi. [PDF] [Code], 2023.5,

  2. Label-free Node Classification on Graphs with Large Language Models (LLMS). preprint

    Zhikai Chen, Haitao Mao, Hongzhi Wen, Haoyu Han, Wei Jin, Haiyang Zhang, Hui Liu, Jiliang Tang. [PDF], 2023.9,

  3. Empower Text-Attributed Graphs Learning with Large Language Models (LLMs). preprint

    Jianxiang Yu, Yuxiang Ren, Chenghua Gong, Jiaqi Tan, Xiang Li, Xuecang Zhang. [PDF], 2023.10,

  4. Large Language Models as Topological Structure Enhancers for Text-Attributed Graphs. preprint

    Shengyin Sun, Yuxiang Ren, Chen Ma, Xuecang Zhang. [PDF], 2023.11,

  5. A Comprehensive Study on Text-attributed Graphs: Benchmarking and Rethinking. NeurIPS 2023

    Hao Yan, Chaozhuo Li, Ruosong Long, Chao Yan, Jianan Zhao, Wenwen Zhuang, Jun Yin, Peiyan Zhang, Weihao Han, Hao Sun, Weiwei Deng, Qi Zhang, Lichao Sun, Xing Xie, Senzhang Wang [PDF] [Code], 2023.11,

  6. Distilling Event Sequence Knowledge From Large Language Models. preprint

    Somin Wadhwa, Oktie Hassanzadeh, Debarun Bhattacharjya, Ken Barker, Jian Ni. [PDF], 2024.1,

Efficiency

  1. Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs. PKDD 2023

    C. Mavromatis, V. N. Ioannidis, S. Wang, D. Zheng, S. Adeshina, J. Ma, H. Zhao, C. Faloutsos, G. Karypis. [PDF], 2023.4,

  2. Efficient Large Language Models Fine-Tuning On Graphs. preprint

    Rui Xue, Xipeng Shen, Ruozhou Yu, Xiaorui Liu. [PDF], 2023.12,

  3. Distilling Large Language Models for Text-Attributed Graph Learning. preprint

    Bo Pan, Zheng Zhang, Yifei Zhang, Yuntong Hu, Liang Zhao. [PDF], 2024.2,

LLM as Aligner (Node)

Prediction Alignment

  1. Minimally-Supervised Structure-Rich Text Categorization via Learning on Text-Rich Networks. WWW 2021

    Xinyang Zhang, Chenwei Zhang, Luna Xin Dong, Jingbo Shang, Jiawei Han. [PDF] [Code], 2021.2,

  2. Learning on Large-scale Text-attributed graphs via variational inference. ICLR 2023

    Jianan Zhao, Meng Qu, Chaozhuo Li, Hao Yan, Qian Liu, Rui Li, Xing Xie, Jian Tang. [PDF][Code], 2023.1,

Latent Space Alignment (Node)

  1. ConGraT: Self-Supervised Contrastive Pretraining for Joint Graph and Text Embeddings. preprint

    William Brannon, Suyash Fulay, Hang Jiang, Wonjune Kang, Brandon Roy, Jad Kabbara, Deb Roy. [PDF] [Code], 2023.5,

  2. Augmenting Low-Resource Text Classification with Graph-Grounded Pre-training and Prompting. SIGIR 2023

    Zhihao Wen, Yuan Fang. [PDF], 2023.5,

  3. GRENADE: Graph-Centric Language Model for Self-Supervised Representation Learning on Text-Attributed Graphs. EMNLP 2023

    Yichuan Li, Kaize Ding, Kyumin Lee. [PDF][Code], 2023.10,

  4. Pretraining Language Models with Text-Attributed Heterogeneous Graphs. preprint

    Tao Zou, Le Yu, Yifei HUANG, Leilei Sun, Bowen Du. [PDF] [Code], 2023.10,

Text-Paired Graphs (Molecules)

Datasets

See Table 8 in our survey paper Large Language Models on Graphs: A Comprehensive Survey.

LLM as Predictor (Graph)

Graph As Sequence

  1. SMILES-BERT: Large Scale Unsupervised Pre-Training for Molecular Property Prediction. BCB 2019

    Sheng Wang, Yuzhi Guo, Yuhong Wang, Hongmao Sun, Junzhou Huang. [PDF] [Code], 2019.09,

  2. MolGPT: Molecular Generation Using a Transformer-Decoder Model. Journal of Chemical Information and Modeling

    Viraj Bagal, Rishal Aggarwal, P. K. Vinod, and U. Deva Priyakumar. [PDF] [Code], 2021.10,

  3. A Deep-learning System Bridging Molecule Structure and Biomedical Text with Comprehension Comparable to Human Professionals. Nature Communications

    Zheni Zeng, Yuan Yao, Zhiyuan Liu, Maosong Sun [PDF] [Code], 2022.02,

  4. Chemformer: a pre-trained transformer for computational chemistry. Machine Learning: Science and Technology

    Ross Irwin, Spyridon Dimitriadis, Jiazhen He and Esben Jannik Bjerrum [PDF] [Code], 2022.02,

  5. Large-Scale Distributed Training of Transformers for Chemical Fingerprinting. Journal of Chemical Information and Modeling

    Hisham Abdel-Aty, Ian R. Gould [PDF] [Code], 2022.06,

  6. Galactica: A Large Language Model for Science. Preprint

    Ross Taylor, Marcin Kardas, Guillem Cucurull, Thomas Scialom, Anthony Hartshorn, Elvis Saravia, Andrew Poulton, Viktor Kerkez, Robert Stojnic [PDF] [Code], 2022.11,

  7. Translation between Molecules and Natural Language. EMNLP 2022

    Carl Edwards, Tuan Lai, Kevin Ros, Garrett Honke, Kyunghyun Cho, Heng Ji. [PDF] [Code], 2022.12,

  8. Unifying Molecular and Textual Representations via Multi-task Language Modelling. ICML 2023

    Dimitrios Christofidellis, Giorgio Giannone, Jannis Born, Ole Winther, Teodoro Laino, Matteo Manica [PDF] [Code], 2023.05,

  9. What can Large Language Models do in chemistry? A comprehensive benchmark on eight tasks. NeurIPS 2023

    Taicheng Guo, Kehan Guo, Bozhao Nan, Zhenwen Liang, Zhichun Guo, Nitesh V. Chawla, Olaf Wiest, Xiangliang Zhang [PDF] [Code], 2023.9,

  10. MolXPT: Wrapping Molecules with Text for Generative Pre-training. ACL 2023

    Zequn Liu, Wei Zhang, Yingce Xia, Lijun Wu, Shufang Xie, Tao Qin, Ming Zhang, Tie-Yan Liu. [PDF], 2023.05,

  11. Interactive Molecular Discovery with Natural Language. preprint

    Zheni Zeng, Bangchen Yin, Shipeng Wang, Jiarui Liu, Cheng Yang, Haishen Yao, Xingzhi Sun, Maosong Sun, Guotong Xie, Zhiyuan Liu. [PDF] [Code], 2023.06,

  12. Empowering Molecule Discovery for Molecule-Caption Translation with Large Language Models: A ChatGPT Perspective. preprint

    Jiatong Li, Yunqing Liu, Wenqi Fan, Xiao-Yong Wei, Hui Liu, Jiliang Tang, Qing Li. [PDF] [Code], 2023.06,

  13. Regression Transformer enables concurrent sequence regression and generation for molecular language modelling. Nature Machine Intelligence

    Jannis Born, Matteo Manica. [PDF] [Code], 2023.6,

  14. Can Large Language Models Empower Molecular Property Prediction? preprint

    Chen Qian, Huayi Tang, Zhirui Yang, Hong Liang, Yong Liu. [PDF] [Code], 2023.7,

  15. Mol-Instructions: A Large-Scale Biomolecular Instruction Dataset for Large Language Models. preprint

    Yin Fang, Xiaozhuan Liang, Ningyu Zhang, Kangwei Liu, Rui Huang, Zhuo Chen, Xiaohui Fan, Huajun Chen [PDF], 2023.07,

  16. GPT-MolBERTa: GPT Molecular Features Language Model for molecular property prediction. preprint

    Suryanarayanan Balaji, Rishikesh Magar, and Yayati Jadhav. [PDF], 2023.10,

  17. The Impact of Large Language Models on Scientific Discovery: a Preliminary Study using GPT-4. preprint

    Microsoft Research AI4Science, Microsoft Azure Quantum [PDF], 2023.11,

  18. Towards 3D Molecule-Text Interpretation in Language Models. ICLR 2024

    Sihang Li, Zhiyuan Liu, Yanchen Luo, Xiang Wang, Xiangnan He, Kenji Kawaguchi, Tat-Seng Chua, Qi Tian. [PDF], 2024.1,

  19. MolTC: Towards Molecular Relational Modeling In Language Models. preprint

    Junfeng Fang, Shuai Zhang, Chang Wu, Zhengyi Yang, Zhiyuan Liu, Sihang Li, Kun Wang, Wenjie Du, Xiang Wang. [PDF], 2024.2,

  20. Large Language Models are In-Context Molecule Learners. preprint

    Jiatong Li, Wei Liu, Zhihao Ding, Wenqi Fan, Yuqiang Li, Qing Li. [PDF], 2024.3,

Graph-Empowered LLM (Graph)

  1. Text2Mol: Cross-Modal Molecule Retrieval with Natural Language Queries. EMNLP 2021

    Carl Edwards, ChengXiang Zhai, Heng Ji. [PDF] [Code], 2021.10,

  2. GIMLET: A Unified Graph-Text Model for Instruction-Based Molecule Zero-Shot Learning. preprint

    Haiteng Zhao, Shengchao Liu, Chang Ma, Hannan Xu, Jie Fu, Zhi-Hong Deng, Lingpeng Kong, Qi Liu. [PDF] [Code], 2023.6,

  3. Prot2Text: Multimodal Protein's Function Generation with GNNs and Transformers. preprint

    Hadi Abdine, Michail Chatzianastasis, Costas Bouyioukos, Michalis Vazirgiannis [PDF], 2023.07,

  4. ReLM: Leveraging Language Models for Enhanced Chemical Reaction Prediction. EMNLP 2023

    Yaorui Shi, An Zhang, Enzhi Zhang, Zhiyuan Liu, Xiang Wang. [PDF] [Code], 2023.10,

  5. Biomedical knowledge graph-enhanced prompt generation for large language models. preprint

    Karthik Soman, Peter W Rose, John H Morris, Rabia E Akbas, Brett Smith, Braian Peetoom, Catalina Villouta-Reyes, Gabriel Cerono, Yongmei Shi, Angela Rizk-Jackson, Sharat Israni, Charlotte A Nelson, Sui Huang, Sergio E Baranzini [PDF] [Code], 2023.11,

  6. L+M-24: Building a Dataset for Language + Molecules @ ACL 2024. preprint

    Carl Edwards, Qingyun Wang, Lawrence Zhao, Heng Ji [PDF] [Code], 2024.2,

LLM as Aligner (Graph)

Latent Space Alignment (Graph)

  1. A Molecular Multimodal Foundation Model Associating Molecule Graphs with Natural Language. preprint

    Bing Su, Dazhao Du, Zhao Yang, Yujie Zhou, Jiangmeng Li, Anyi Rao, Hao Sun, Zhiwu Lu, Ji-Rong Wen [PDF] [Code], 2022.09,

  2. Multi-modal Molecule Structure-text Model for Text-based Retrieval and Editing. preprint

    Shengchao Liu, Weili Nie, Chengpeng Wang, Jiarui Lu, Zhuoran Qiao, Ling Liu, Jian Tang, Chaowei Xiao, Anima Anandkumar [PDF], 2022.10,

  3. Extracting Molecular Properties from Natural Language with Multimodal Contrastive Learning. ICML 2023 Workshop on Computational Biology

    Romain Lacombe, Andrew Gaut, Jeff He, David Lüdeke, Kateryna Pistunova [PDF], 2023.07,

  4. ProtST: Multi-Modality Learning of Protein Sequences and Biomedical Texts. ICML 2023

    Minghao Xu, Xinyu Yuan, Santiago Miret, Jian Tang. [PDF] [Code], 2023.1,

  5. Enhancing Activity Prediction Models in Drug Discovery with the Ability to Understand Human Language. ICML 2023

    Philipp Seidl, Andreu Vall, Sepp Hochreiter, Günter Klambauer. [PDF] [Code], 2023.5,

  6. MolFM: A multimodal molecular foundation model. Preprint

    Yizhen Luo, Kai Yang, Massimo Hong, Xing Yi Liu, Zaiqing Nie. [PDF] [Code], 2023.7,

  7. GIT-Mol: A Multi-modal Large Language Model for Molecular Science with Graph, Image, and Text. Preprint

    Pengfei Liu, Yiming Ren, Zhixiang Ren [PDF], 2023.08,

  8. MolCA: Molecular Graph-Language Modeling with Cross-Modal Projector and Uni-Modal Adapter. EMNLP 2023

    Zhiyuan Liu, Sihang Li, Yanchen Luo, Hao Fei, Yixin Cao, Kenji Kawaguchi, Xiang Wang, Tat-Seng Chua. [PDF] [Code], 2023.10,

  9. CHEMREASONER: Heuristic Search over a Large Language Model's Knowledge Space using Quantum-Chemical Feedback. preprint

    Henry W. Sprueill, Carl Edwards, Khushbu Agarwal, Mariefel V. Olarte, Udishnu Sanyal, Conrad Johnston, Hongbin Liu, Heng Ji, Sutanay Choudhury. [PDF], 2024.2,

Contribution

Contributions to this repository are welcome!

If you find any error or have relevant resources, feel free to open an issue or a pull request.

Paper format:

1. **[paper title].** `[]`

    *[authors].* [[PDF]([pdf link])] [[Code]([code link])], published time, ![](https://img.shields.io/badge/[architecture]-blue) ![](https://img.shields.io/badge/[size]-red)
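
For illustration, a filled-in entry following this format could look like the example below (the paper title, authors, and links are placeholders, not a real paper):

1. **A Hypothetical Paper on LLM-based Graph Reasoning.** `preprint`

    *Jane Doe, John Smith.* [[PDF](https://arxiv.org/abs/0000.00000)] [[Code](https://github.com/username/repo)], 2024.1, ![](https://img.shields.io/badge/DecoderOnly-blue) ![](https://img.shields.io/badge/LLM-red)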

Citations

Please cite the following paper if you find the resource helpful for your research.

@article{jin@llmgraph,
  title={Large Language Models on Graphs: A Comprehensive Survey},
  author={Jin, Bowen and Liu, Gang and Han, Chi and Jiang, Meng and Ji, Heng and Han, Jiawei},
  journal={arXiv preprint arXiv:2312.02783},
  year={2023}
}
