Spurious Correlations in Cross-Topic Argument Mining
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Documents
- Fulltext: Final published version, 1.55 MB, PDF document
Recent work in cross-topic argument mining attempts to learn models that generalise across topics rather than merely relying on within-topic spurious correlations. We examine the effectiveness of this approach by analysing the output of single-task and multi-task models for cross-topic argument mining, through a combination of linear approximations of their decision boundaries, manual feature grouping, challenge examples, and ablations across the input vocabulary. Surprisingly, we show that cross-topic models still rely mostly on spurious correlations and only generalise within closely related topics, e.g., a model trained only on closed-class words and a few common open-class words outperforms a state-of-the-art cross-topic model on distant target topics.
Original language | English
---|---
Title of host publication | Proceedings of *SEM 2021: The Tenth Joint Conference on Lexical and Computational Semantics
Publisher | Association for Computational Linguistics
Publication date | 2021
Pages | 263-277
DOIs |
Publication status | Published - 2021
Event | Tenth Joint Conference on Lexical and Computational Semantics - *SEM 2021, Online, 5 Aug 2021 → 6 Aug 2021
Conference
Conference | Tenth Joint Conference on Lexical and Computational Semantics - *SEM 2021
---|---
City | Online
Period | 05/08/2021 → 06/08/2021
ID: 300082790