Despite significant recent advances in human emotion recognition, combining upper-body movements with facial expressions remains a severe challenge in human-robot interaction. This article presents a model that learns emotions from upper-body movements and maps them to corresponding facial expressions. Once this correspondence is established, tasks such as emotion and gesture recognition can be performed using facial features and movement vectors.
Our method uses a deep convolutional neural network trained on benchmark datasets exhibiting various emotions and their corresponding body movements. Features obtained from facial movements and body motion are fused to improve emotion recognition performance. We have implemented several fusion methodologies to integrate multimodal features for non-verbal emotion identification.
Our system achieves 76.8% accuracy in emotion recognition from upper-body movements alone, surpassing the previous 73.1% on the FABO dataset. In addition, employing multimodal compact bilinear pooling with temporal information surpassed the state-of-the-art method, reaching 94.41% accuracy on the FABO dataset. This system can lead to better human-machine interaction by enabling robots to recognize users' emotions and body actions and react accordingly, thus enriching the user experience.
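Compact bilinear pooling approximates the outer product of two feature vectors via count-sketch projections combined by FFT-based convolution, avoiding the quadratic cost of a full bilinear map. The sketch below is an illustrative NumPy implementation of this general technique, not the authors' code; the function names (`count_sketch`, `mcb_fuse`), the output dimension `d=1024`, and the signed-square-root normalisation step are assumptions for the example.

```python
import numpy as np

def count_sketch(x, h, s, d):
    # Project vector x into d dimensions using a count sketch
    # defined by hash indices h and random signs s.
    y = np.zeros(d)
    np.add.at(y, h, s * x)
    return y

def mcb_fuse(face_feat, body_feat, d=1024, seed=0):
    # Illustrative multimodal compact bilinear (MCB) fusion of a
    # facial-expression feature vector and a body-motion feature vector.
    rng = np.random.default_rng(seed)
    n1, n2 = face_feat.shape[0], body_feat.shape[0]
    h1, h2 = rng.integers(0, d, n1), rng.integers(0, d, n2)
    s1, s2 = rng.choice([-1.0, 1.0], n1), rng.choice([-1.0, 1.0], n2)
    y1 = count_sketch(face_feat, h1, s1, d)
    y2 = count_sketch(body_feat, h2, s2, d)
    # Circular convolution of the two sketches, computed as an
    # element-wise product in the frequency domain.
    fused = np.fft.irfft(np.fft.rfft(y1, d) * np.fft.rfft(y2, d), d)
    # Signed square root and L2 normalisation, a common post-processing
    # step after bilinear pooling (assumed here, not taken from the paper).
    fused = np.sign(fused) * np.sqrt(np.abs(fused))
    return fused / (np.linalg.norm(fused) + 1e-8)
```

In practice the fused vector would feed a classifier head over the emotion classes; the projection dimension trades approximation quality against memory.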
Original language: English
Title of host publication: 16th International Conference on Computer Vision Theory and Application (VISAPP 2021)
Number of pages: 11
Publisher: SCITEPRESS Digital Library
Publication date: 2021
Article number: 229
ISBN (Electronic): 978-989-758-488-6
Publication status: Published - 2021
Event: 16th International Conference on Computer Vision Theory and Application - Online Streaming
Duration: 8 Feb 2021 - 10 Feb 2021


Conference: 16th International Conference on Computer Vision Theory and Application
Location: Online Streaming

Title: Deep Emotion Recognition through Upper Body Movements and Facial Expression
