Project Details


Unilateral spatial neglect (USN) is a frequent impairment after stroke with a detrimental influence on recovery and independence in activities of daily living. The VR@SN project will provide the basis for comprehensive diagnosis and targeted treatment for the large number of patients affected by USN. The project is highly relevant both for improving the recovery of patients with stroke and for research into the diagnosis of the diverse condition of USN, by capturing patients' head, eye, and hand movements in relation to a gamified task (Whack-a-mole). This work is timely, given demands for evidence-based treatments, the digitization of healthcare, the leveraging of health care records, and the need to achieve better care outcomes with fewer resources.
VR@SN is an innovative, interdisciplinary project that validates the design and diagnostic capabilities of an existing custom-made virtual reality (VR) prototype through a series of pilot single case experimental designs (SCEDs) [3], with the aim of improving evidence-based treatment and health-related quality of life for thousands of patients with unilateral spatial neglect (USN) in Denmark alone. Proper diagnosis and monitoring of USN and its subtypes allows for targeted training in rehabilitation. Regionshospitalet Hammel Neurocenter (RHN) will provide the expertise to validate the prototype's diagnostic capabilities and collect patient feedback, so that the Department of Architecture, Design and Media Technology (ADMT) can refine the game-based activities designed in VR. Volunteering patients who meet the inclusion criteria at Hammel Neurocenter will participate in the SCEDs.
Following a user-centered design (UCD) approach, our current application (details available on GitHub) was conceived through a number of co-design activities with therapists and refined through observations of patient interactions with previous prototype versions and their feedback on them. This project focuses on validating the diagnostics derived from data logged by the current prototype, through data collections in single case experimental designs (SCEDs).

The project has two main goals:
1. Develop, validate, and refine the diagnostic precision of the prototype (e.g., from eye, head, and torso tracking).
2. Implement and evaluate treatment approaches (e.g., mirror therapy, prismatic shift, half-field eye-patching) for individualised neglect rehabilitation based on the improved diagnostics.
Specifically, our previous tests showed that we could extract relevant diagnostic markers for neglect from simple activities such as looking at pictures in VR. Thus, we can contribute to continuous data collection and evidence-based targeted treatments (manuscript in preparation).
We therefore predict that we can diagnose neglect and its subtypes with greater accuracy than current pen-and-paper techniques, and that we can include many scientifically validated sub-tests that are not part of day-to-day practice due to resource limitations or insufficient training and knowledge among personnel. Based on these diagnoses, individually tailored rehabilitation can be provided.
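As a minimal sketch of how such a diagnostic marker could be derived from logged task data, the following computes a simple left/right lateralisation index over Whack-a-mole hits. The data format and function name are illustrative assumptions, not the prototype's actual code: targets are represented by their horizontal angle from the body midline (negative = left), and a strongly negative asymmetry would flag left-sided neglect.

```python
# Illustrative sketch (assumed data format, not the prototype's API):
# a lateralisation index over logged target presentations and hits.

def lateralisation_index(hit_angles, presented_angles):
    """Return (left_hit_rate, right_hit_rate, asymmetry in [-1, 1]).

    asymmetry < 0 means proportionally fewer left-side targets were hit,
    which is a candidate marker for left-sided neglect.
    """
    left_presented = sum(1 for a in presented_angles if a < 0)
    right_presented = sum(1 for a in presented_angles if a >= 0)
    left_hit = sum(1 for a in hit_angles if a < 0)
    right_hit = sum(1 for a in hit_angles if a >= 0)
    left_rate = left_hit / left_presented if left_presented else 0.0
    right_rate = right_hit / right_presented if right_presented else 0.0
    denom = left_rate + right_rate
    asym = (left_rate - right_rate) / denom if denom else 0.0
    return left_rate, right_rate, asym

if __name__ == "__main__":
    presented = [-30, -20, -10, 10, 20, 30]   # degrees from midline
    hits = [-10, 10, 20, 30]                  # only one of three left targets hit
    print(lateralisation_index(hits, presented))
```

In practice, the same idea would be applied per data stream (eye, head, hand) and per session, so that a SCED can track the index across baseline and intervention phases.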
The current prototype supports this plan by allowing clinicians to pre-define a set of conditions for the virtually displayed task, Whack-a-mole. This engaging game-like task requires players to hit (whack) appearing targets (moles). The approach can combine therapeutic modalities such as mirroring patients' movements, eye patching, prism adaptation, and scaling the motor space to patients' physical abilities, so that even patients who cannot move their hands or arms, or can do so only within a small range, can use the application.
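To make the idea of pre-defined conditions concrete, here is a hypothetical sketch of how such a session configuration could look. All field names and the mapping function are illustrative assumptions for this description, not the prototype's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical session configuration; fields mirror the modalities
# described above (mirroring, eye patching, prism shift, motor scaling).
@dataclass
class SessionCondition:
    mirror_movements: bool = False       # mirror-therapy style feedback
    patched_eye: Optional[str] = None    # "left", "right", or None
    prism_shift_deg: float = 0.0         # horizontal visual-field offset
    motor_scale: float = 1.0             # >1 amplifies small real movements

    def map_hand_x(self, x: float) -> float:
        """Map a real hand x-position into the virtual task space."""
        vx = x * self.motor_scale
        if self.mirror_movements:
            vx = -vx
        return vx

# e.g. a patient with a small arm range: amplify movements threefold
cond = SessionCondition(prism_shift_deg=10.0, motor_scale=3.0)
print(cond.map_hand_x(0.05))  # a small real movement becomes a larger virtual reach
```

The point of such a structure is that one logged record per session captures exactly which therapeutic modalities were active, which is what a SCED needs to attribute changes in performance to a condition.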

Layman's description

Stroke, caused by a blood clot or bleeding in the brain, is the leading cause of severe disability and affects 12,000 people a year in Denmark. Of these, 30-50% suffer from a condition called unilateral spatial neglect: the difficulty of paying attention to events or objects on one side of the body and the space around it. For example, a patient with neglect might not shave the left side of the face even though their eyes are fine and can see the whole face. Patients with neglect benefit less from rehabilitation than patients without neglect, and mild cases of neglect are difficult to diagnose, resulting in treatment that is not as effective as it could be. There are many variants of neglect, many of which go undetected because clinicians do not pay attention to them or lack the necessary tests. Current paper-and-pencil tests are not sufficiently sensitive, leaving patients with milder symptoms underdiagnosed and thus undertreated. Moreover, these tests are not well suited to distinguishing the different subtypes of neglect, so treatment is not sufficiently targeted and individualized for the individual patient. Finally, they are not sufficiently sensitive to rehabilitation effects and therefore do little to increase patient motivation.
We have developed a virtual reality (VR) application together with patients and clinicians over the last three years. The latest improvements were funded by the health innovation pool in the Central Region (Grant no. 208282401048). The application combines concepts from several paper-based tests with a range of recognized treatment methods, so that the assessment and treatment of neglect are carried out in one and the same solution. To patients, the application looks like a whack-a-mole game, which, in our own studies, they find entertaining. In this project, we will evaluate whether data from our application supports a more sensitive and accurate diagnosis of neglect and its subtypes. In addition, the solution allows us to track patients' hand, head, and eye movements during use to clarify subtypes and distinguish them from differential diagnoses. Unlike paper-based tests, the application can be adjusted in difficulty and adapted to different levels of the patient's physical abilities (e.g., if they cannot move their arms much).
Short title: VR assessment and treatment of spatial neglect
Effective start/end date: 01/04/2021 to 30/09/2022
