Abstract
Despite significant recent advances in human emotion recognition, using upper-body movements alongside facial expressions remains a severe challenge in human-robot interaction. This article presents a model that learns emotions from upper-body movements and their correspondence with facial expressions. Once this correspondence is mapped, tasks such as emotion and gesture recognition can be performed using facial features and movement vectors. Our method uses a deep convolutional neural network trained on benchmark datasets exhibiting various emotions and the corresponding body movements. Features obtained from facial movements and body motion are fused to improve emotion recognition performance, and we implement several fusion methodologies to integrate the multimodal features for non-verbal emotion identification. Our system achieves 76.8% emotion recognition accuracy from upper-body movements alone, surpassing the previous 73.1% on the FABO dataset. In addition, employing multimodal compact bilinear pooling with temporal information surpasses the state-of-the-art method with an accuracy of 94.41% on the FABO dataset. This system can lead to better human-machine interaction by enabling robots to recognize users' emotions and body actions and react accordingly, thus enriching the user experience.
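The abstract mentions fusing facial and body features via multimodal compact bilinear pooling. As a hedged illustration (not the paper's actual implementation), the standard MCB trick approximates the outer product of two feature vectors by convolving their count sketches in the frequency domain; the feature dimensions and sketch size below are arbitrary assumptions:

```python
import numpy as np

def count_sketch(x, h, s, d):
    """Project x into a d-dimensional count sketch using hash indices h and signs s."""
    sketch = np.zeros(d)
    np.add.at(sketch, h, s * x)  # unbuffered scatter-add handles hash collisions
    return sketch

def mcb_pool(x, y, d=512, seed=0):
    """Multimodal compact bilinear pooling of feature vectors x and y.

    Approximates the bilinear (outer-product) interaction x ⊗ y by the
    circular convolution of the two count sketches, computed via FFT.
    """
    rng = np.random.default_rng(seed)
    hx = rng.integers(0, d, size=x.shape[0])       # random hash for modality 1
    sx = rng.choice([-1.0, 1.0], size=x.shape[0])  # random signs for modality 1
    hy = rng.integers(0, d, size=y.shape[0])
    sy = rng.choice([-1.0, 1.0], size=y.shape[0])
    fx = np.fft.rfft(count_sketch(x, hx, sx, d))
    fy = np.fft.rfft(count_sketch(y, hy, sy, d))
    return np.fft.irfft(fx * fy, n=d)              # convolution theorem

# Hypothetical feature vectors standing in for CNN outputs of the two modalities.
face_feat = np.random.default_rng(1).standard_normal(256)  # facial-expression features
body_feat = np.random.default_rng(2).standard_normal(256)  # upper-body motion features
fused = mcb_pool(face_feat, body_feat, d=512)
print(fused.shape)  # (512,)
```

The fused vector would then feed a classifier head; in practice MCB implementations also apply signed square-root and L2 normalization before classification.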
Original language | English |
---|---|
Title | 16th International Conference on Computer Vision Theory and Application (VISAPP 2021) |
Editors | Giovanni Maria Farinella, Petia Radeva, Jose Braz, Kadi Bouatouch |
Number of pages | 11 |
Publisher | SCITEPRESS Digital Library |
Publication date | 2021 |
Pages | 669-679 |
Article number | 229 |
ISBN (Electronic) | 978-989-758-488-6 |
DOI | |
Status | Published - 2021 |
Event | 16th International Conference on Computer Vision Theory and Application - Online Streaming Duration: 8 Feb 2021 → 10 Feb 2021 http://www.visapp.visigrapp.org/ImportantDates.aspx |
Conference
Conference | 16th International Conference on Computer Vision Theory and Application |
---|---|
Location | Online Streaming |
Period | 08/02/2021 → 10/02/2021 |
Internet address |
Projects
- 1 Ongoing
- NET4Age-Friendly: International Interdisciplinary Network on Smart Healthy Age-friendly Environments
01/10/2020 → 30/09/2024
Projects: Project › Research
Publication
- 9 Citations
- 1 PhD thesis
- Facial Emotion Recognition for Citizens with Traumatic Brain Injury for Therapeutic Robot Interaction
Ilyas, C. M., 2021, Aalborg Universitetsforlag. 196 p. Publication: PhD thesis
Open access · File · 207 Downloads (Pure)