CogNLG: Cognitive Graph for KG-to-text Generation

Peichao Lai, Feiyang Ye, Yang-Geng Fu, Zhiwei Chen, Yingjie Wu, Yilei Wang, Victor Chang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Knowledge graphs (KGs) have been widely adopted in natural language generation (NLG) tasks. A KG can help models generate controllable text and achieve better performance. However, most existing approaches still lack explainability and scalability for large-scale knowledge reasoning. In this work, we propose a novel CogNLG framework for KG-to-text generation tasks. CogNLG is based on the dual-process theory from cognitive science and consists of two systems: an analytic system for knowledge extraction and a perceptual system that generates text from the extracted knowledge. During text generation, CogNLG provides a visible and explainable reasoning path. Our framework shows excellent performance on all datasets and achieves a BLEU score of 36.7, an improvement of 6.7 over the best competitor.
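The dual-system design described in the abstract can be illustrated with a toy sketch: an analytic step that expands outward from a seed entity over KG triples while recording its reasoning path, and a perceptual step that verbalizes the collected triples. All names here (`analytic_system`, `perceptual_system`, the toy triples, the templates) are hypothetical illustrations, not the authors' actual CogNLG implementation.

```python
# Toy knowledge graph as (head, relation, tail) triples -- illustrative only.
KG = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Paris", "population", "2.1 million"),
]

def analytic_system(kg, seed, hops=2):
    """Iteratively expand from a seed entity, recording an explicit reasoning path."""
    frontier, path = {seed}, []
    for _ in range(hops):
        new_frontier = set()
        for h, r, t in kg:
            if h in frontier and (h, r, t) not in path:
                path.append((h, r, t))   # the visible, explainable path
                new_frontier.add(t)
        frontier = new_frontier
    return path

def perceptual_system(path):
    """Verbalize the extracted triples with simple surface templates."""
    templates = {
        "capital_of": "{h} is the capital of {t}.",
        "located_in": "{h} is located in {t}.",
        "population": "{h} has a population of {t}.",
    }
    return " ".join(templates[r].format(h=h, t=t) for h, r, t in path)

path = analytic_system(KG, "Paris")
text = perceptual_system(path)
print(path)  # reasoning path: 3 triples reached within 2 hops of "Paris"
print(text)
```

In CogNLG itself the two systems are learned models rather than hand-written rules; the point of the sketch is only the division of labor and the explicit path that makes the reasoning inspectable.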
Original language: English
Article number: e13461
Journal: Expert Systems
Issue number: 1
Early online date: 10 Oct 2023
Publication status: Published - Jan 2024

Bibliographical note

© 2023 The Authors. Expert Systems published by John Wiley & Sons Ltd.

This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.


Keywords

  • KG-to-text
  • cognitive graph
  • natural language generation


