This paper presents the extraction of emotional signals from patients with traumatic brain injury (TBI) through the analysis of facial features, and the deployment of an effective emotion-recognition model on the Pepper robot to assist in the rehabilitation process. Identifying emotional cues in TBI patients is very challenging because of unique and diverse psychological, physiological, and behavioral difficulties such as non-cooperation, facial or body paralysis, upper- or lower-limb impairments, and inhibited cognitive, motor, and hearing skills. Reading subtle changes in the emotional cues of TBI patients is essential for effective communication and for the development of affect-based systems. To analyze variations in the emotional signals of TBI patients, a new database was collected in a natural, unconstrained environment from eleven residents of a neurological center, in three modalities (RGB, thermal, and depth) and in three specified scenarios covering physical, cognitive, and social-communication rehabilitation activities. Owing to the lack of labeled data, a deep transfer learning method is applied to classify emotions efficiently. The emotion classification model is evaluated through a closed-field study and the deployment of a Pepper robot equipped with the trained model. Our deep, fine-tuned CNN-LSTM emotion-recognition model improves performance by 1.47% on the MMI and by 4.96% on the FER2013 validation sets. In addition, the use of temporal information and transfer learning to overcome the scarcity of TBI data increases performance on this challenging dataset of neurologically impaired people. The findings of the study illustrate the noticeable effectiveness of a SoftBank Pepper robot equipped with a deep-trained emotion-recognition model in developing rehabilitation strategies by monitoring TBI patients' emotions.
This research article thus presents a technical solution for real therapeutic robot interaction that supports patient rehabilitation with standardized monitoring, assessment, and feedback in neurological centers.
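As a rough illustration of the transfer-learning setup described in the abstract (the paper's exact architecture and layer sizes are not given here, so all names, dimensions, and hyperparameters below are assumptions), a CNN backbone pretrained on a large facial-expression corpus can be frozen as a per-frame feature extractor, while an LSTM and a classification head are fine-tuned on the new, smaller TBI sequences. A minimal PyTorch sketch:

```python
import torch
import torch.nn as nn


class CnnLstmEmotionNet(nn.Module):
    """Hypothetical CNN-LSTM sketch: a (normally pretrained) CNN encodes each
    video frame, an LSTM models the temporal sequence of frame features, and
    a linear head classifies the clip into one of `num_emotions` classes."""

    def __init__(self, num_emotions=7, feat_dim=64, hidden_dim=128):
        super().__init__()
        # Stand-in CNN backbone. In a transfer-learning setting these layers
        # would be loaded with weights pretrained on a large facial-expression
        # dataset (e.g. FER2013) and then frozen.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        for p in self.backbone.parameters():   # freeze transferred weights
            p.requires_grad = False
        # Temporal model and classifier: the layers fine-tuned on TBI data.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_emotions)

    def forward(self, clips):
        # clips: (batch, time, channels, height, width)
        b, t = clips.shape[:2]
        feats = self.backbone(clips.flatten(0, 1)).view(b, t, -1)
        _, (h_n, _) = self.lstm(feats)         # last hidden state per clip
        return self.head(h_n[-1])              # logits: (batch, num_emotions)


model = CnnLstmEmotionNet()
logits = model(torch.randn(2, 8, 3, 48, 48))   # 2 clips of 8 frames each
```

Freezing the backbone and training only the LSTM and head is one common way to exploit temporal information while avoiding overfitting on a small labeled set, which matches the data-scarcity problem the abstract describes.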

Original language: English
Journal: Pattern Analysis and Applications
Pages (from-to): 1-25
Number of pages: 25
Publication status: Published - 2021

Bibliographical note

Part of a collection:
Special Issue on Computer Vision and Machine Learning for Healthcare Applications
ISSN: 1433-7541 (Print) 1433-755X (Online)


Keywords:

  • Assessment and monitoring
  • Assistive care
  • Augmentative and Assistive Technology (AAT)
  • Cognitive, social, and physical therapy
  • Deep transfer learning
  • Emotion recognition
  • Human–robot interaction
  • Rehabilitation strategies
  • TBI patients database
  • Traumatic brain injury (TBI)


Title: Deep Transfer Learning in Human-Robot Interaction for Cognitive and Physical Rehabilitation Purposes
