Project Details


Communication and socialization are persistently challenging for people with physical and cognitive impairments. This project investigates emotional signals, particularly negative cues such as stress, fatigue, and anger, in brain-injured patients, in order to develop practical strategies for improving the quality of life of the residents at the Danish neurocenter. It focuses on extracting social signals from facial expressions and using these emotional cues for productive human-robot interaction. The project has three phases:
1. Development of a multi-modal database of real patients in specific scenarios.
2. Use of deep transfer learning approaches to build a unique, customized facial expression recognition (FER) system for these residents.
3. Deployment of this FER system on a social robot (in our case, the Pepper robot) to enhance human-robot interaction and social interaction.
The Pepper robot serves as assistive technology for both residents and staff. For residents, it provides a stimulus to engage more socially through its speech and visual input, analyzing mood in particular during therapy sessions. For staff, it acts as a monitoring and feedback tool, tracking emotional reactions across therapy sessions and allowing rehabilitation strategies to be adapted based on this additional input.
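The transfer-learning idea in phase 2 can be illustrated with a minimal sketch: a pre-trained feature extractor is kept frozen, and only a small patient-specific classification head is trained on the new data. Everything below is a stand-in, not the project's actual implementation: a fixed random projection plays the role of the pre-trained deep network, the data and labels are synthetic, and the head is a simple logistic-regression classifier trained by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen, pre-trained feature extractor (in the real
# system this would be a deep CNN trained on a large generic face
# dataset; here it is just a fixed random projection plus ReLU).
W_frozen = rng.normal(size=(64, 16))

def extract_features(images):
    """Map flattened inputs of shape (n, 64) to 16-d embeddings.

    The weights are never updated: this is the 'transfer' part,
    where generic representations are reused for a new person.
    """
    return np.maximum(images @ W_frozen, 0.0)  # ReLU

# Synthetic patient-specific data: 200 samples, two emotion classes.
X = rng.normal(size=(200, 64))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

feats = extract_features(X)

# Trainable classification head: logistic regression fitted with
# plain gradient descent on the frozen features.
w = np.zeros(feats.shape[1])
b = 0.0
for _ in range(500):
    z = np.clip(feats @ w + b, -30.0, 30.0)      # avoid exp overflow
    p = 1.0 / (1.0 + np.exp(-z))                 # sigmoid
    w -= 0.5 * (feats.T @ (p - y)) / len(y)      # gradient step on weights
    b -= 0.5 * np.mean(p - y)                    # gradient step on bias

pred = (feats @ w + b > 0.0).astype(int)
accuracy = np.mean(pred == y)
```

Only `w` and `b` are updated, so adapting the system to a new resident requires far fewer labeled examples than training a full network from scratch, which is the motivation for transfer learning with small clinical populations.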
Effective start/end date: 03/04/2017 to 14/04/2020


Keywords

  • Facial expression recognition
  • Emotional signals
  • Assistive and augmentative technology (AAT)
  • Human-robot interaction
  • Social robotics
  • Rehabilitation
  • Social interaction

Research Output

  • 5 Articles in proceedings

Developing a user-centred Communication Pad for Cognitive and Physical Impaired People

Ilyas, C. M. A., Rodil, K. & Rehm, M., 2020, In proceedings of 8th EAI International Conference: ArtsIT, Interactivity & Game Creation. Brooks, A. & Brooks, E. I. (eds.). Springer Publishing Company, p. 124-137 14 p.

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

  • 1 Citation (Scopus)
  • 29 Downloads (Pure)

Effective Facial Expression Recognition Through Multimodal Imaging for Traumatic Brain Injured Patient’s Rehabilitation

Ilyas, C. M. A., Haque, M. A., Rehm, M., Nasrollahi, K. & Moeslund, T. B., 2 Jul 2019, Computer Vision, Imaging and Computer Graphics Theory and Applications: 13th International Joint Conference, VISIGRAPP 2018, Funchal–Madeira, Portugal, January 27–29, 2018, Revised Selected Papers. Bechmann, D., Chessa, M., Cláudio, A. P., Imai, F., Kerren, A., Richard, P., Telea, A. & Tremeau, A. (eds.). Springer Publishing Company, p. 369-389 21 p. (Communications in Computer and Information Science, vol. 997).

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

Teaching Pepper Robot to Recognize Emotions of Traumatic Brain Injured Patients Using Deep Neural Networks

Ilyas, C. M. A., Schmuck, V., Haque, M. A., Nasrollahi, K., Rehm, M. & Moeslund, T. B., 12 Oct 2019, 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 8956445. (IEEE RO-MAN proceedings).

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

  • 59 Downloads (Pure)