Multi-Level Attention-Based Categorical Emotion Recognition Using Modulation-Filtered Cochleagram
Authors: Peng, Zhichao [4]; He, Wenhua [4]; Li, Yongwei [1]; Du, Yegang [5]; Dang, Jianwu [2,3]
Journal: APPLIED SCIENCES-BASEL
Publication Date: 2023-06-01
Volume: 13; Issue: 11; Pages: 16
Keywords: categorical emotion recognition; auditory signal processing; modulation-filtered cochleagram; multi-level attention
DOI: 10.3390/app13116749
Corresponding Authors: Peng, Zhichao (zcpeng@tju.edu.cn); Dang, Jianwu (jdang@jaist.ac.jp)
Abstract: Speech emotion recognition is a critical component for achieving natural human-robot interaction. The modulation-filtered cochleagram is a feature based on auditory modulation perception that contains a multi-dimensional spectral-temporal modulation representation. In this study, we propose an emotion recognition framework that uses a multi-level attention network to extract high-level emotional feature representations from the modulation-filtered cochleagram. Our approach applies channel-level and spatial-level attention modules to generate emotional saliency maps of the channel and spatial feature representations, capturing the salient emotional channels and feature regions from the 3D convolution feature maps. Furthermore, a temporal-level attention module captures significant emotional regions from the concatenated feature sequence of these emotional saliency maps. Experiments on the Interactive Emotional Dyadic Motion Capture (IEMOCAP) dataset demonstrate that the modulation-filtered cochleagram significantly improves categorical emotion prediction compared with the other evaluated features. Moreover, the proposed framework achieves an unweighted accuracy of 71% in categorical emotion recognition, comparable to several existing approaches. In summary, this study demonstrates the effectiveness of the modulation-filtered cochleagram for speech emotion recognition, and the proposed multi-level attention framework provides a promising direction for future research in this field.
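The abstract describes three attention levels applied to 3D convolution feature maps of the modulation-filtered cochleagram. The following is a minimal illustrative sketch, not the authors' released code: it assumes a CBAM-style channel/spatial attention, an additive temporal attention, and arbitrary layer sizes, input dimensions, and class count chosen only for demonstration.

```python
# Illustrative sketch of a multi-level attention pipeline for speech emotion
# recognition (assumed architecture, not the paper's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Channel-level saliency over 3D conv feature maps (B, C, T, F, M)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        avg = self.mlp(x.mean(dim=(2, 3, 4)))   # squeeze time/frequency/modulation dims
        mx = self.mlp(x.amax(dim=(2, 3, 4)))
        w = torch.sigmoid(avg + mx).view(x.size(0), -1, 1, 1, 1)
        return x * w                             # channel saliency map

class SpatialAttention(nn.Module):
    """Spatial-level saliency over the same feature maps."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv3d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)        # pool across channels
        mx = x.amax(dim=1, keepdim=True)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w                             # spatial saliency map

class TemporalAttention(nn.Module):
    """Temporal-level attention over a frame-level feature sequence (B, T, D)."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, seq):
        alpha = torch.softmax(self.score(seq), dim=1)
        return (alpha * seq).sum(dim=1)          # weighted sum over time

class MultiLevelAttentionSER(nn.Module):
    def __init__(self, in_ch=1, conv_ch=32, num_classes=4):
        super().__init__()
        self.conv3d = nn.Conv3d(in_ch, conv_ch, kernel_size=3, padding=1)
        self.chan_att = ChannelAttention(conv_ch)
        self.spat_att = SpatialAttention()
        self.temp_att = TemporalAttention(conv_ch)
        self.classifier = nn.Linear(conv_ch, num_classes)

    def forward(self, x):                        # x: (B, 1, T, F, M) modulation-filtered cochleagram
        h = F.relu(self.conv3d(x))
        h = self.spat_att(self.chan_att(h))      # channel then spatial saliency
        seq = h.mean(dim=(3, 4)).transpose(1, 2) # pool F and M axes -> (B, T, C) sequence
        utt = self.temp_att(seq)                 # utterance-level representation
        return self.classifier(utt)

# Example: 2 utterances, 100 frames, 64 cochlear channels, 8 modulation bands
logits = MultiLevelAttentionSER()(torch.randn(2, 1, 100, 64, 8))
```

In this sketch the channel and spatial saliency maps are applied sequentially before the cochlear-channel and modulation axes are pooled into a frame-level sequence, which the temporal attention then summarizes into an utterance-level vector; the paper's actual layer configuration and feature concatenation may differ.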
Funding Projects: Hunan Provincial Natural Science Foundation of China [2021JJ30379]; Youth Fund of the National Natural Science Foundation of China [62201571]
WOS Keywords: FEATURES
WOS Research Areas: Chemistry; Engineering; Materials Science; Physics
Language: English
Publisher: MDPI
WOS Accession Number: WOS:001003438300001
Funding Organizations: Hunan Provincial Natural Science Foundation of China; Youth Fund of the National Natural Science Foundation of China
Document Type: Journal Article
Source URL: http://ir.ia.ac.cn/handle/173211/53498
Collection: National Laboratory of Pattern Recognition_Intelligent Interaction
Author Affiliations:
1. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100045, Peoples R China
2. Tianjin Univ, Coll Intelligence & Comp, Tianjin 300350, Peoples R China
3. Pengcheng Lab, Shenzhen 518055, Peoples R China
4. Hunan Univ Humanities Sci & Technol, Informat Sch, Loudi 417000, Peoples R China
5. Waseda Univ, Future Robot Org, Tokyo 1698050, Japan
Recommended Citation:
GB/T 7714: Peng, Zhichao, He, Wenhua, Li, Yongwei, et al. Multi-Level Attention-Based Categorical Emotion Recognition Using Modulation-Filtered Cochleagram[J]. APPLIED SCIENCES-BASEL, 2023, 13(11): 16.
APA: Peng, Zhichao, He, Wenhua, Li, Yongwei, Du, Yegang, & Dang, Jianwu. (2023). Multi-Level Attention-Based Categorical Emotion Recognition Using Modulation-Filtered Cochleagram. APPLIED SCIENCES-BASEL, 13(11), 16.
MLA: Peng, Zhichao, et al. "Multi-Level Attention-Based Categorical Emotion Recognition Using Modulation-Filtered Cochleagram". APPLIED SCIENCES-BASEL 13.11 (2023): 16.