Abstract
Recently, there has been a significant amount of work on the recognition of emotions from speech and biosignals. Most approaches to emotion recognition so far concentrate on a single modality and do not take advantage of the fact that an integrated multimodal analysis may help to resolve ambiguities and compensate
for errors. In this paper, we describe various methods for fusing physiological and voice data at the feature-level and the decision-level as well as a hybrid integration scheme. The results of the integrated recognition approach are then compared with the individual recognition results from each modality.
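The abstract names two generic fusion strategies, feature-level and decision-level fusion. The following is a minimal illustrative sketch of those two schemes in general terms, not the authors' actual implementation; the function names, the weighted-average combination rule, and the example numbers are assumptions made for illustration only.

```python
# Hedged sketch of the two generic fusion schemes named in the abstract.
# Feature-level fusion: concatenate per-modality feature vectors into one
# joint vector before classification.
# Decision-level fusion: combine per-class posteriors produced by separate
# classifiers for each modality (here, a simple weighted average).

def feature_level_fusion(physio_features, voice_features):
    """Concatenate modality feature vectors into one joint vector."""
    return physio_features + voice_features

def decision_level_fusion(physio_probs, voice_probs, w_physio=0.5):
    """Weighted average of per-class posteriors from each modality."""
    w_voice = 1.0 - w_physio
    return [w_physio * p + w_voice * v
            for p, v in zip(physio_probs, voice_probs)]

# Illustrative example: three emotion classes, two modalities
physio = [0.2, 0.5, 0.3]   # posteriors from a physiological classifier
voice  = [0.6, 0.3, 0.1]   # posteriors from a speech classifier
print(decision_level_fusion(physio, voice))
```

A hybrid integration scheme, as mentioned in the abstract, would typically combine both ideas, e.g. feeding joint features to one classifier and fusing its output with per-modality decisions.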
Original language | English |
---|---|
Title | Proceedings of the 9th European Conference on Speech Communication and Technology |
Number of pages | 4 |
Publication date | 2005 |
Pages | 809-812 |
Status | Published - 2005 |
Fingerprint
Dive into the research topics of 'Integrating Information from Speech and Physiological Signals to Achieve Emotional Sensitivity'. Together they form a unique fingerprint.
Projects
- 1 Finished
-
HUMAINE: HUman MAchine Interaction Network on Emotions
01/03/2004 → 29/02/2008
Projects: Project › Research