The human brain is often praised for its raw processing power, which in many respects still surpasses that of the most powerful supercomputers available today. These computational abilities allow humans to perform complex tasks such as sensing, perceiving, acting, interacting, and understanding stimuli from several different modalities (e.g. visual, auditory, tactile, nociceptive) by simultaneously processing a huge amount of sensory information. Very recent evidence indicates that brain responses recorded after nociceptive (i.e. painful) stimulation are likely to be unspecific for pain and instead to reflect a generic alarm system involved in detecting and reacting to salient sensory events, regardless of their modality. In that context, the initial step of this project consists of recording brain responses to simultaneous multisensory stimulation in a novel, controlled fashion. From there, the challenges of the project are: 1) to accurately estimate the brain responses in the shortest possible time, in order to properly synchronize the individual modalities; 2) to isolate and quantify the contribution of each specific modality to the resulting brain responses; and 3) to determine how the brain encodes and represents multimodal sensory information. Advanced signal processing methods, such as joint time-frequency analysis, blind source separation and mutual information analysis, will be adapted (and possibly improved) for these tasks, with the aim of devising tools that provide specific, objective and robust indices of brain activity elicited by different sensory inputs.
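As a minimal sketch of the mutual-information idea mentioned above (quantifying how much a response feature tells us about which modality was stimulated), the following toy example uses a standard histogram-based plug-in estimator. All names, the synthetic "evoked response" feature, and the effect sizes are hypothetical illustrations, not the project's actual data or pipeline:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based plug-in estimate of mutual information (in bits)
    between two 1-D samples x and y."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()               # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)     # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)     # marginal p(y)
    nz = pxy > 0                            # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
n = 5000
# Hypothetical setup: a modality label (0 = visual, 1 = auditory, 2 = tactile)
# shifts the mean of a simulated evoked-response amplitude.
modality = rng.integers(0, 3, size=n).astype(float)
response = modality * 1.5 + rng.normal(0.0, 1.0, size=n)  # modality-dependent
noise = rng.normal(0.0, 1.0, size=n)                      # modality-independent

mi_dep = mutual_information(modality, response)  # clearly above zero
mi_ind = mutual_information(modality, noise)     # near zero (small plug-in bias)
```

In practice, estimators of this kind would be applied to features extracted from the recorded brain responses (e.g. time-frequency amplitudes) rather than to synthetic data, and bias-corrected variants are typically preferred for small samples.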
Effective start/end date: 01/02/2012 → 31/10/2016