University Of Tasmania

Deep auto-encoders with sequential learning for multimodal dimensional emotion recognition

journal contribution
posted on 2023-05-21, 01:58 authored by Nguyen, D, Nguyen, DT, Zeng, R, Nguyen, TT, Son Tran, Nguyen, TK, Sridharan, S, Fookes, C
Multimodal dimensional emotion recognition has drawn great attention from the affective computing community, and numerous schemes have been extensively investigated, making significant progress in this area. However, several questions remain unanswered by most existing approaches, including: (i) how to simultaneously learn compact yet representative features from multimodal data, (ii) how to effectively capture complementary features from multimodal streams, and (iii) how to perform all the tasks in an end-to-end manner. To address these challenges, in this paper we propose a novel deep neural network architecture consisting of a two-stream auto-encoder and a long short-term memory (LSTM) network for effectively integrating visual and audio signal streams for emotion recognition. To validate the robustness of our proposed architecture, we carry out extensive experiments on the multimodal emotion-in-the-wild dataset RECOLA. Experimental results show that the proposed method achieves state-of-the-art recognition performance.
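The architecture described in the abstract can be sketched roughly as follows: one auto-encoder per modality learns a compact latent feature, the two latents are concatenated, and an LSTM models the fused sequence before a regression head predicts dimensional emotion (e.g. valence and arousal). This is a minimal illustrative sketch, not the authors' implementation; all dimensions, layer choices, and names (`TwoStreamAutoEncoderLSTM`, `visual_dim`, `audio_dim`, etc.) are assumptions for illustration.

```python
# Hedged sketch of a two-stream auto-encoder + LSTM pipeline.
# All architecture details (layer sizes, single linear encoders, ReLU)
# are illustrative assumptions, not the paper's actual configuration.
import torch
import torch.nn as nn


class TwoStreamAutoEncoderLSTM(nn.Module):
    """Per-modality auto-encoders learn compact latents; an LSTM fuses
    the concatenated latents over time; a linear head regresses
    dimensional emotion (here: valence and arousal)."""

    def __init__(self, visual_dim=512, audio_dim=128,
                 latent_dim=64, hidden_dim=128):
        super().__init__()
        # Encoder/decoder pair for each stream (dims are illustrative).
        self.vis_enc = nn.Linear(visual_dim, latent_dim)
        self.vis_dec = nn.Linear(latent_dim, visual_dim)
        self.aud_enc = nn.Linear(audio_dim, latent_dim)
        self.aud_dec = nn.Linear(latent_dim, audio_dim)
        # Sequential model over the fused latent features.
        self.lstm = nn.LSTM(2 * latent_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 2)  # valence, arousal

    def forward(self, visual, audio):
        # visual: (B, T, visual_dim), audio: (B, T, audio_dim)
        zv = torch.relu(self.vis_enc(visual))
        za = torch.relu(self.aud_enc(audio))
        recon_v = self.vis_dec(zv)  # reconstructions for the AE loss term
        recon_a = self.aud_dec(za)
        fused = torch.cat([zv, za], dim=-1)
        out, _ = self.lstm(fused)
        pred = self.head(out)       # (B, T, 2) per-frame predictions
        return pred, recon_v, recon_a


model = TwoStreamAutoEncoderLSTM()
v = torch.randn(4, 10, 512)  # 4 clips, 10 frames of visual features
a = torch.randn(4, 10, 128)  # aligned audio features
pred, rv, ra = model(v, a)
print(pred.shape)  # torch.Size([4, 10, 2])
```

Training such a model end-to-end would typically combine a regression loss on `pred` (e.g. against continuous valence/arousal annotations, as in RECOLA) with reconstruction losses on `rv` and `ra`, so that the latents stay both compact and representative.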


Publication title

IEEE Transactions on Multimedia

School of Information and Communication Technology

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.

Place of publication

445 Hoes Lane, Piscataway, NJ 08855, USA

Rights statement

Copyright 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Repository Status

  • Open

Socio-economic Objectives

Behaviour and health