TY - JOUR
T1 - Overview of the Authorship Verification Task at PAN 2022
AU - Stamatatos, Efstathios
AU - Kestemont, Mike
AU - Kredens, Krzysztof
AU - Pezik, Piotr
AU - Heini, Annina
AU - Bevendorff, Janek
AU - Stein, Benno
AU - Potthast, Martin
N1 - © 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
PY - 2022/9/5
Y1 - 2022/9/5
AB - The authorship verification task at PAN 2022 follows the experimental setup of similar shared tasks in the recent past. However, it focuses on a different and very challenging scenario: given two texts belonging to different discourse types, the task is to determine whether they were written by the same author. Based on a new corpus in English, we provide pairs of texts drawn from four discourse types: essays, emails, text messages, and business memos. Differences in communicative purpose, intended audience, and level of formality make cross-discourse-type authorship verification very hard. We received seven submissions and evaluated them, along with two baseline approaches, using the TIRA integrated research architecture. This paper reviews the submissions and presents a detailed discussion of the evaluation results.
UR - http://www.scopus.com/inward/record.url?scp=85136949420&partnerID=8YFLogxK
UR - https://ceur-ws.org/
M3 - Conference article
AN - SCOPUS:85136949420
SN - 1613-0073
VL - 3180
SP - 2301
EP - 2313
JO - CEUR Workshop Proceedings
JF - CEUR Workshop Proceedings
T2 - Proceedings of the Working Notes of CLEF 2022 - Conference and Labs of the Evaluation Forum
Y2 - 5 September 2022 through 8 September 2022
ER -