MS-Transformer: Introduce multiple structural priors into a unified transformer for encoding sentences. (March 2022)
- Record Type:
- Journal Article
- Title:
- MS-Transformer: Introduce multiple structural priors into a unified transformer for encoding sentences. (March 2022)
- Main Title:
- MS-Transformer: Introduce multiple structural priors into a unified transformer for encoding sentences
- Authors:
- Qi, Le
Zhang, Yu
Yin, Qingyu
Liu, Ting
- Abstract:
- Transformers have been widely used in recent NLP studies. Unlike CNNs or RNNs, the vanilla Transformer is position-insensitive and therefore cannot capture structural priors over sequences of words. Existing studies commonly apply a single mask strategy to Transformers to incorporate structural priors, failing to model the richer structural information of texts. In this paper, we aim at introducing multiple types of structural priors into Transformers, proposing the Multiple Structural Priors Guided Transformer (MS-Transformer), which maps different structural priors to different attention heads via a novel multi-mask based multi-head attention mechanism. In particular, we integrate two categories of structural priors: the sequential order and the relative position of words. To capture the latent hierarchical structure of texts, we extract this information not only from word contexts but also from dependency syntax trees. Experimental results on three tasks show that MS-Transformer achieves significant improvements over other strong baselines.
- Highlights:
- Multi-mask strategies can introduce different priors into different attention heads.
Multi-mask strategies can guide models toward learning more precise dependencies.
The sequential order and relative position of words are taken as structural priors.
Structural priors benefit models in modeling sentence structures from multiple aspects.
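The abstract describes a multi-mask based multi-head attention mechanism in which each attention head attends under its own structural mask (one prior per head). The following is a minimal NumPy sketch of that idea, not the authors' implementation: the function names, the toy causal mask (a stand-in for the sequential-order prior), and the local-window mask (a stand-in for the relative-position prior) are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def masked_attention(q, k, v, mask):
    # scaled dot-product attention; positions where mask is False
    # receive a large negative score and thus ~zero attention weight
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)
    return softmax(scores) @ v

def multi_mask_attention(x, masks, d_head=8, seed=0):
    """Each head attends under its own structural mask (one prior per head)."""
    rng = np.random.default_rng(seed)
    heads = []
    for mask in masks:
        # per-head projection matrices (random here, learned in practice)
        Wq, Wk, Wv = (rng.standard_normal((x.shape[-1], d_head)) * 0.1
                      for _ in range(3))
        heads.append(masked_attention(x @ Wq, x @ Wk, x @ Wv, mask))
    # concatenate head outputs, as in standard multi-head attention
    return np.concatenate(heads, axis=-1)

n = 5
idx = np.arange(n)
# Head 1 prior (illustrative): sequential order -- attend to self and preceding words.
causal = idx[:, None] >= idx[None, :]
# Head 2 prior (illustrative): relative position -- attend within a +/-1 window.
local = np.abs(idx[:, None] - idx[None, :]) <= 1
x = np.random.default_rng(1).standard_normal((n, 16))
out = multi_mask_attention(x, [causal, local])
print(out.shape)  # (5, 16)
```

With two heads of dimension 8, the concatenated output restores the 16-dimensional input width; swapping in other boolean masks (e.g. derived from a dependency tree) changes only which positions each head may attend to.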
- Is Part Of:
- Computer speech & language. Volume 72 (2022)
- Journal:
- Computer speech & language
- Issue:
- Volume 72 (2022)
- Issue Display:
- Volume 72, Issue 2022 (2022)
- Year:
- 2022
- Volume:
- 72
- Issue:
- 2022
- Issue Sort Value:
- 2022-0072-2022-0000
- Page Start:
- Page End:
- Publication Date:
- 2022-03
- Subjects:
- 68T50
Sentence representation -- Transformer -- Natural language processing
Speech processing systems -- Periodicals
Automatic speech recognition -- Periodicals
Computers -- Periodicals
Linguistics -- Periodicals
Speech-Language Pathology -- Periodicals
Traitement automatique de la parole -- Périodiques
Reconnaissance automatique de la parole -- Périodiques
Automatic speech recognition
Speech processing systems
Electronic journals
Periodicals
006.454
- Journal URLs:
- http://www.journals.elsevier.com/computer-speech-and-language/
http://www.elsevier.com/journals
- DOI:
- 10.1016/j.csl.2021.101304
- Languages:
- English
- ISSNs:
- 0885-2308
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 3394.276600
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 20111.xml