Project Details
Description
Communicating and socializing is a persistent challenge for people with physical and cognitive impairments. This project investigates emotional signals, particularly negative cues such as stress, fatigue, and anger, in brain-injured patients, in order to develop practical strategies for improving the quality of life of the residents at the Danish neurocenter. It focuses on extracting social signals from facial expressions and on using these emotional cues for productive human-robot interaction. The project has three phases:
1. Development of a multi-modal database of real patients in specific scenarios.
2. Use of deep transfer learning approaches to build a unique, customized facial expression recognition (FER) system for these residents.
3. Deployment of this FER system on a social robot (in our case, the Pepper robot) to enhance human-robot interaction and social interaction.
The Pepper robot acts as assistive technology for both residents and staff. For residents, it provides a stimulus for greater social engagement through its speech and visual input, analyzing mood in particular during therapy sessions. For staff, it serves as a monitoring and feedback tool, tracking emotional reactions across therapy sessions and allowing rehabilitation strategies to be adapted based on this additional input.
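The second phase relies on transfer learning: a feature extractor pretrained on large generic face datasets is kept frozen, and only a small classification head is fine-tuned on the scarce patient-specific recordings. The sketch below illustrates that idea in miniature, assuming nothing about the project's actual model: the random-projection "backbone", the synthetic data, and the three emotion classes (stress, fatigue, anger) are illustrative stand-ins only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen, pretrained feature extractor (in practice a CNN
# backbone): a fixed projection from flattened face crops to feature vectors.
W_frozen = rng.standard_normal((2048, 64))

def extract_features(images):
    # images: (n, 2048) flattened face crops; these weights are never updated.
    return np.tanh(images @ W_frozen)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Tiny synthetic dataset standing in for the patient-specific database.
n_classes = 3  # illustrative: e.g. stress, fatigue, anger
X = rng.standard_normal((30, 2048))
y = rng.integers(0, n_classes, size=30)
Y = np.eye(n_classes)[y]  # one-hot labels

# Only this small head is trained (the "transfer" part): softmax regression
# on the frozen features, via plain gradient descent on cross-entropy.
W_head = np.zeros((64, n_classes))
feats = extract_features(X)
for _ in range(200):
    P = softmax(feats @ W_head)
    W_head -= 0.1 * feats.T @ (P - Y) / len(X)

preds = softmax(feats @ W_head).argmax(axis=1)
accuracy = (preds == y).mean()  # training accuracy on the tiny set
```

Freezing the backbone is what makes fine-tuning feasible with only a handful of labeled examples per resident, which is the typical constraint in a clinical setting.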
| Status | Finished |
| --- | --- |
| Effective start/end date | 03/04/2017 → 14/04/2020 |
Keywords
- Facial expression recognition
- Emotional signal
- Assistive and Augmentative technology (AAT)
- Human-Robot Interaction
- Social robotics
- Rehabilitation
- Social interaction
Projects
- 2 Finished
- RETRO: Regulating Trust in Human Robot Interaction
  Rehm, M. (PI) & Fischer, K. (PI)
  Independent Research Fund Denmark
  01/09/2021 → 31/03/2025
  Project: Research
- NET4Age-Friendly: International Interdisciplinary Network on Smart Healthy Age-friendly Environments
  Rehm, M. (PI)
  01/10/2020 → 30/09/2024
  Project: Research
Research output
- 5 Article in proceeding
- Developing a user-centred Communication Pad for Cognitive and Physical Impaired People
  Ilyas, C. M. A., Rodil, K. & Rehm, M., 2020. In: Proceedings of the 8th EAI International Conference: ArtsIT, Interactivity & Game Creation. Brooks, A. & Brooks, E. I. (eds.). Springer, p. 124–137 (Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (LNICST), Vol. 328).
  Research output: Article in proceeding › Research › peer-review
  5 citations (Scopus), 188 downloads (Pure)
- Effective Facial Expression Recognition Through Multimodal Imaging for Traumatic Brain Injured Patient’s Rehabilitation
  Ilyas, C. M. A., Haque, M. A., Rehm, M., Nasrollahi, K. & Moeslund, T. B., 2 Jul 2019. In: Computer Vision, Imaging and Computer Graphics Theory and Applications: 13th International Joint Conference, VISIGRAPP 2018, Funchal–Madeira, Portugal, January 27–29, 2018, Revised Selected Papers. Bechmann, D., Chessa, M., Cláudio, A. P., Imai, F., Kerren, A., Richard, P., Telea, A. & Tremeau, A. (eds.). Springer, p. 369–389 (Communications in Computer and Information Science, Vol. 997).
  Research output: Article in proceeding › Research › peer-review
- Teaching Pepper Robot to Recognize Emotions of Traumatic Brain Injured Patients Using Deep Neural Networks
  Ilyas, C. M. A., Schmuck, V., Haque, M. A., Nasrollahi, K., Rehm, M. & Moeslund, T. B., 12 Oct 2019. In: 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE (Institute of Electrical and Electronics Engineers), 8956445 (IEEE RO-MAN Proceedings).
  Research output: Article in proceeding › Research › peer-review
  12 citations (Scopus), 371 downloads (Pure)