Self-Attention Aligner: A Latency-Control End-to-End Model for ASR using Self-attention Network and Chunk-hopping
Dong, Linhao1,2; Wang, Feng2; Xu, Bo2
2019-05
Conference Date: 2019-05
Conference Venue: Brighton, United Kingdom
Keywords: speech recognition; self-attention network; encoder-decoder; end-to-end; latency-control
Pages: 5656-5660
Abstract

The self-attention network, an attention-based feedforward neural network, has recently shown the potential to replace recurrent neural networks (RNNs) in a variety of NLP tasks. However, it is not clear whether the self-attention network could be a good alternative to RNNs in automatic speech recognition (ASR), which processes longer sequences and may have online recognition requirements. In this paper, we present an RNN-free end-to-end model: the self-attention aligner (SAA), which applies self-attention networks to a simplified recurrent neural aligner (RNA) framework. We also propose a chunk-hopping mechanism, which enables the SAA model to encode segmented frame chunks one after another to support online recognition. Experiments on two Mandarin ASR datasets show that replacing RNNs with self-attention networks yields an 8.4%-10.2% relative character error rate (CER) reduction. In addition, the chunk-hopping mechanism limits the SAA to only a 2.5% relative CER degradation at a 320 ms latency. After joint training with a self-attention network language model, our SAA model obtains further error rate reductions on multiple datasets. In particular, it achieves a 24.12% CER on the Mandarin ASR benchmark (HKUST), surpassing the best previous end-to-end model by over 2% absolute CER.
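The abstract describes chunk-hopping as encoding segmented frame chunks one after another. A minimal sketch of the segmentation step in Python, assuming a fixed chunk size and hop size with overlap between consecutive chunks (the function name and parameters here are illustrative, not from the paper):

```python
def chunk_hop(frames, chunk_size, hop_size):
    """Segment a frame sequence into (possibly overlapping) chunks.

    Illustrative sketch: each chunk covers `chunk_size` frames, and
    consecutive chunks start `hop_size` frames apart, so chunks overlap
    whenever hop_size < chunk_size. An online encoder would then process
    these chunks one after another instead of the full utterance.
    """
    chunks = []
    for start in range(0, len(frames), hop_size):
        chunks.append(frames[start:start + chunk_size])
        # Stop once a chunk has reached the end of the sequence.
        if start + chunk_size >= len(frames):
            break
    return chunks

# Example: 10 frames, chunks of 4 frames, hopping 2 frames at a time.
chunks = chunk_hop(list(range(10)), chunk_size=4, hop_size=2)
```

With a 10 ms frame shift, a chunk of a few dozen frames would correspond to the latency range (e.g. 320 ms) the paper reports; the overlap lets each chunk carry some left context from its predecessor.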

Proceedings Publisher: IEEE Xplore
Content Type: Conference Paper
Source URL: [http://ir.ia.ac.cn/handle/173211/39276]
Collection: Research Center for Digital Content Technology and Service, Auditory Model and Cognitive Computation
Author Affiliations:
1. University of Chinese Academy of Sciences, China
2. Institute of Automation, Chinese Academy of Sciences, China
Recommended Citation
GB/T 7714
Dong, Linhao, Wang, Feng, Xu, Bo. Self-Attention Aligner: A Latency-Control End-to-End Model for ASR using Self-attention Network and Chunk-hopping[C]. Brighton, United Kingdom, 2019-05.