Abstract
Knowledge graphs (KGs) have been widely adopted in natural language generation (NLG) tasks. A KG can help models generate controllable text and achieve better performance. However, most existing approaches still lack explainability and scalability in large-scale knowledge reasoning. In this work, we propose CogNLG, a novel framework for KG-to-text generation. CogNLG is built on the dual-process theory from cognitive science and consists of two systems: one acts as the analytic system for knowledge extraction, while the other serves as the perceptual system that generates text from the extracted knowledge. During generation, CogNLG provides a visible and explainable reasoning path. Our framework shows excellent performance on all datasets and achieves a BLEU score of 36.7, which is 6.7 higher than the best competitor.
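The abstract describes the architecture only at a high level. The sketch below illustrates how a dual-process KG-to-text pipeline of this shape could be organized: an analytic component that retrieves knowledge from the graph and a perceptual component that verbalizes it, with the retrieved triples recorded as a reasoning path. All class names, the one-hop retrieval, and the template-based realization are illustrative assumptions, not the authors' implementation.

```python
# Minimal structural sketch of a dual-process KG-to-text pipeline.
# Component names and logic are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Triple:
    head: str
    relation: str
    tail: str


@dataclass
class AnalyticSystem:
    """Analytic system: extracts relevant knowledge from the KG."""
    kg: List[Triple]

    def extract(self, entity: str) -> List[Triple]:
        # Toy one-hop retrieval; a real system would perform multi-hop reasoning.
        return [t for t in self.kg if t.head == entity or t.tail == entity]


@dataclass
class PerceptualSystem:
    """Perceptual system: generates text from the extracted knowledge."""

    def generate(self, triples: List[Triple]) -> str:
        # Template-based surface realization standing in for a neural decoder.
        return " ".join(
            f"{t.head} {t.relation.replace('_', ' ')} {t.tail}." for t in triples
        )


@dataclass
class CogNLGSketch:
    analytic: AnalyticSystem
    perceptual: PerceptualSystem
    reasoning_path: List[Triple] = field(default_factory=list)

    def run(self, entity: str) -> str:
        triples = self.analytic.extract(entity)  # knowledge extraction
        self.reasoning_path.extend(triples)      # visible, explainable reasoning path
        return self.perceptual.generate(triples) # text generation


if __name__ == "__main__":
    kg = [
        Triple("Alan Turing", "born_in", "London"),
        Triple("Alan Turing", "field", "computer science"),
    ]
    model = CogNLGSketch(AnalyticSystem(kg), PerceptualSystem())
    print(model.run("Alan Turing"))
    print("Reasoning path:", model.reasoning_path)
```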
| Original language | English |
| --- | --- |
| Article number | e13461 |
| Journal | Expert Systems |
| Volume | 41 |
| Issue number | 1 |
| Early online date | 10 Oct 2023 |
| DOIs | |
| Publication status | Published - Jan 2024 |
Bibliographical note
© 2023 The Authors. Expert Systems published by John Wiley & Sons Ltd. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
Keywords
- KG-to-text
- cognitive graph
- natural language generation