Stochastic Attention Head Removal: A Simple and Effective Method for Improving Transformer Based ASR Models

Authors
Shucong Zhang
Erfan Loweimi
Peter Bell
Steve Renals
Abstract
Recently, Transformer based models have shown competitive automatic speech recognition (ASR) performance. One key factor in the success of these models is the multi-head attention mechanism. However, for trained models, we have previously observed that many attention matrices are close to diagonal, indicating the redundancy of the corresponding attention heads. We have also found that some architectures with reduced numbers of attention heads achieve better performance. Since searching for the best structure is prohibitively time-consuming, we propose to randomly remove attention heads during training and keep all attention heads at test time, so that the final model is an ensemble of models with different architectures. The proposed method also forces each head to independently learn the most useful patterns. We apply the proposed method to train Transformer based and Convolution-augmented Transformer (Conformer) based ASR models. Our method gives consistent performance gains over strong baselines on the Wall Street Journal, AISHELL, Switchboard and AMI datasets. To the best of our knowledge, we have achieved state-of-the-art end-to-end Transformer based model performance on Switchboard and AMI.
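As an illustration of the technique described in the abstract, the following is a minimal PyTorch sketch of stochastic attention head removal, not the authors' implementation: the module name `HeadRemovalAttention`, the `p_remove` hyperparameter, and the per-utterance Bernoulli mask are illustrative assumptions, and details such as the sampling granularity and any rescaling of the kept heads may differ from the paper.

```python
import torch
import torch.nn as nn


class HeadRemovalAttention(nn.Module):
    """Multi-head self-attention with stochastic attention head removal.

    During training each head is zeroed out independently with probability
    p_remove; at test time (model.eval()) all heads are kept, so inference
    behaves like an ensemble of sub-models with different head counts.
    """

    def __init__(self, d_model: int, n_heads: int, p_remove: float = 0.2):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.p_remove = p_remove  # assumed hyperparameter name
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split into heads: (batch, n_heads, time, d_head).
        q, k, v = (y.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
                   for y in (q, k, v))
        att = torch.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5,
                            dim=-1)
        heads = att @ v  # (batch, n_heads, time, d_head)
        if self.training and self.p_remove > 0:
            # Bernoulli mask over heads; sampling one mask per utterance and
            # leaving kept heads unscaled are simplifications in this sketch.
            keep = (torch.rand(b, self.n_heads, 1, 1, device=x.device)
                    >= self.p_remove).to(heads.dtype)
            heads = heads * keep
        return self.out(heads.transpose(1, 2).reshape(b, t, -1))
```

At test time the module is simply switched to evaluation mode (`model.eval()`), which disables the mask so all heads contribute, matching the train-with-removal, test-with-all-heads recipe described in the abstract.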
Citation
Zhang, S., Loweimi, E., Bell, P., & Renals, S. (2021). Stochastic Attention Head Removal: A Simple and Effective Method for Improving Transformer Based ASR Models. In Proc. Interspeech 2021 (pp. 2541-2545). https://doi.org/10.21437/interspeech.2021-280
| Field | Value |
| --- | --- |
| Presentation Conference Type | Conference Paper (Published) |
| Conference Name | Interspeech 2021 |
| Start Date | Aug 30, 2021 |
| End Date | Sep 1, 2021 |
| Online Publication Date | Aug 30, 2021 |
| Publication Date | 2021 |
| Deposit Date | Apr 3, 2024 |
| Pages | 2541-2545 |
| Book Title | Proc. Interspeech 2021 |
| DOI | https://doi.org/10.21437/interspeech.2021-280 |
| Public URL | http://researchrepository.napier.ac.uk/Output/3585843 |