Multi-head Self-attention with Role-Guided Masks

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › Peer-reviewed

Documents

  • Full text

    Submitted manuscript, 218 KB, PDF document

The state of the art in learning meaningful semantic representations of words is the Transformer model and its attention mechanisms. Simply put, the attention mechanisms learn to attend to specific parts of the input, dispensing with recurrence and convolutions. While some of the learned attention heads have been found to play linguistically interpretable roles, they can be redundant or prone to errors. We propose a method to guide the attention heads towards roles identified in prior work as important. We do this by defining role-specific masks that constrain each head to attend to specific parts of the input, so that different heads are designed to play different roles. Experiments on text classification and machine translation across seven datasets show that our method outperforms competitive attention-based, CNN, and RNN baselines.
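
The core idea, constraining each attention head with a role-specific mask over the attention logits, can be illustrated with a short hypothetical PyTorch sketch. This is not the authors' implementation: the role set (previous token, next token, the token itself), the mask construction, and names such as `build_role_masks` and `RoleMaskedSelfAttention` are assumptions chosen for illustration only.

```python
# Illustrative sketch only (not the paper's released code): multi-head
# self-attention where each head's attention logits are restricted by a
# role-specific binary mask, e.g. "previous token", "next token", "self".
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_role_masks(seq_len: int) -> torch.Tensor:
    """Return three example role masks of shape (num_roles, seq_len, seq_len)."""
    prev_mask = torch.diag(torch.ones(seq_len - 1), diagonal=-1)  # attend to previous token
    next_mask = torch.diag(torch.ones(seq_len - 1), diagonal=1)   # attend to next token
    self_mask = torch.eye(seq_len)                                # attend to the token itself
    return torch.stack([prev_mask, next_mask, self_mask])


class RoleMaskedSelfAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor, role_masks: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); role_masks: (num_heads, seq_len, seq_len)
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape each projection to (batch, heads, seq_len, d_head)
        q, k, v = (t.view(b, n, self.num_heads, self.d_head).transpose(1, 2) for t in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5      # (b, h, n, n)
        # forbid positions outside each head's role by setting their logits to -inf
        scores = scores.masked_fill(role_masks.unsqueeze(0) == 0, float("-inf"))
        # rows with no allowed position produce NaN after softmax; zero them out
        attn = F.softmax(scores, dim=-1).nan_to_num(0.0)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(out)


x = torch.randn(2, 5, 12)            # batch of 2, seq_len 5, d_model 12
masks = build_role_masks(seq_len=5)  # 3 roles, so use 3 heads here
print(RoleMaskedSelfAttention(d_model=12, num_heads=3)(x, masks).shape)  # torch.Size([2, 5, 12])
```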

Original language: English
Title: Advances in Information Retrieval - 43rd European Conference on IR Research, ECIR 2021, Proceedings, Part II
Editors: Djoerd Hiemstra, Marie-Francine Moens, Josiane Mothe, Raffaele Perego, Martin Potthast, Fabrizio Sebastiani
Publisher: Springer
Publication date: 2021
Pages: 432-439
ISBN (Print): 9783030722395
DOI
Status: Published - 2021
Event: 43rd European Conference on Information Retrieval, ECIR 2021 - Virtual, Online
Duration: 28 Mar 2021 - 1 Apr 2021

Conference

Conference: 43rd European Conference on Information Retrieval, ECIR 2021
City: Virtual, Online
Period: 28/03/2021 - 01/04/2021
Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12657 LNCS
ISSN: 0302-9743

Bibliographic note

Funding Information:
Acknowledgments. This work is supported by the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 721321 (QUARTZ project) and No. 893667 (METER project).

Publisher Copyright:
© 2021, Springer Nature Switzerland AG.


ID: 283133892