Automatic Evaluation of Text Summarization Based on Semantic Link Network

Mengyun Cao, Hai Zhuge

    Research output: Chapter in Book/Published conference output › Conference publication

    Abstract

    This paper proposes an approach to automatically evaluating summaries based on the Semantic Link Network (SLN). Three factors concerning summary quality are taken into account: 1) Fidelity, inspecting whether a summary conveys the core themes of the source text; 2) Conciseness, inspecting the non-redundancy among the sentences of a summary; and 3) Coherence, inspecting the relevance among all the themes contained in a summary. A summary receives a quality rating based on its performance on these three factors. Experimental results show that the quality ratings given by our approach are close to those of manual evaluation.
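    The abstract above describes scoring a summary on three factors and combining them into a single quality rating. The following is a minimal, hypothetical sketch of that idea, not the paper's SLN method: simple token-overlap proxies stand in for Fidelity, Conciseness, and Coherence, and the function names, weights, and inputs are illustrative assumptions.

    ```python
    # Hypothetical sketch of a three-factor summary quality rating.
    # The paper builds a Semantic Link Network; here, crude token-overlap
    # proxies stand in for each factor purely for illustration.

    def _tokens(sentence):
        """Lowercased token set of a sentence."""
        return set(sentence.lower().split())

    def fidelity(summary_sents, source_themes):
        """Fraction of source themes whose tokens overlap some summary sentence."""
        covered = sum(
            1 for theme in source_themes
            if any(_tokens(theme) & _tokens(s) for s in summary_sents)
        )
        return covered / len(source_themes)

    def conciseness(summary_sents):
        """One minus the mean pairwise Jaccard overlap between sentences
        (higher means less redundancy)."""
        pairs = [
            (a, b) for i, a in enumerate(summary_sents)
            for b in summary_sents[i + 1:]
        ]
        if not pairs:
            return 1.0
        overlap = sum(
            len(_tokens(a) & _tokens(b)) / len(_tokens(a) | _tokens(b))
            for a, b in pairs
        ) / len(pairs)
        return 1.0 - overlap

    def coherence(summary_sents):
        """Fraction of adjacent sentence pairs sharing at least one token."""
        if len(summary_sents) < 2:
            return 1.0
        shared = sum(
            1 for a, b in zip(summary_sents, summary_sents[1:])
            if _tokens(a) & _tokens(b)
        )
        return shared / (len(summary_sents) - 1)

    def quality_rating(summary_sents, source_themes,
                       weights=(1 / 3, 1 / 3, 1 / 3)):
        """Weighted combination of the three factor scores (weights assumed)."""
        return (weights[0] * fidelity(summary_sents, source_themes)
                + weights[1] * conciseness(summary_sents)
                + weights[2] * coherence(summary_sents))
    ```

    For example, a two-sentence summary that covers both source themes but whose sentences share no vocabulary would score 1.0 on Fidelity and Conciseness but 0.0 on Coherence.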
    Original language: English
    Title of host publication: Proceedings - 15th International Conference on Semantics, Knowledge and Grids
    Subtitle of host publication: On Big Data, AI and Future Interconnection Environment, SKG 2019
    Editors: Hai Zhuge, Xiaoping Sun
    Publisher: IEEE
    Pages: 107-114
    Number of pages: 8
    ISBN (Electronic): 978-1-7281-5823-5
    ISBN (Print): 978-1-7281-5824-2
    DOIs
    Publication status: Published - 23 Mar 2020
    Event: 2019 15th International Conference on Semantics, Knowledge and Grids (SKG) - Guangzhou, China
    Duration: 17 Sept 2019 - 18 Sept 2019

    Publication series

    Name: Proceedings - 15th International Conference on Semantics, Knowledge and Grids: On Big Data, AI and Future Interconnection Environment, SKG 2019

    Conference

    Conference: 2019 15th International Conference on Semantics, Knowledge and Grids (SKG)
    Period: 17/09/19 - 18/09/19

    Keywords

    • Automatic text summarization
    • Semantic link network
    • Summarization evaluation
    • Textual coherence
