Project Details

Description

This pilot project combines machine learning and wearable sensing devices to develop intuitive audiovisual displays that accurately reflect both physical activity and the felt experience of human movement. Tracking human motion is important for a range of activities and applications, from dance and music performance to rehabilitation and human-robot interaction. Wearable devices that record physiological sensor data capture precise information about muscle activity, but reveal very little about how a person feels during the activity. This project combines a real-time interactive audiovisual system with machine learning techniques to develop algorithms that provide additional high-level information about the mover's emotional and affective states. The goal is to improve algorithms for movement and effort tracking by incorporating people's felt experience of movement.
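
The project page does not describe the underlying implementation, so the following is only a minimal, hypothetical sketch of the general idea: summarising windowed wearable-sensor signals into simple features and mapping them to a high-level affect label with an off-the-shelf classifier (here scikit-learn's RandomForestClassifier). The synthetic data, channel counts, feature choices, and label scheme are all assumptions, not the project's actual method.

    # Hypothetical sketch: affect classification from windowed wearable-sensor features.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Stand-in for windowed physiological data (e.g. EMG envelopes):
    # 200 movement windows x 4 sensor channels x 100 samples per window.
    windows = rng.normal(size=(200, 4, 100))
    # Hypothetical affect labels per window (0 = calm, 1 = energetic).
    labels = rng.integers(0, 2, size=200)

    def extract_features(w: np.ndarray) -> np.ndarray:
        """Per-channel summary features: mean amplitude and variability."""
        return np.concatenate([np.abs(w).mean(axis=-1), w.std(axis=-1)])

    features = np.stack([extract_features(w) for w in windows])

    # Fit a small classifier mapping sensor features to an affect label.
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(features[:150], labels[:150])
    print("held-out accuracy:", clf.score(features[150:], labels[150:]))

In a real-time setting, a model of this kind would run on incoming sensor windows and its predictions would drive the audiovisual display, but the actual sensors, features, and models used in the project are not specified on this page.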
Short title: ImprovAIze
Status: Finished
Effective start/end date: 01/09/2022 – 31/12/2022

UN Sustainable Development Goals

In 2015, UN member states agreed to 17 global Sustainable Development Goals (SDGs) to end poverty, protect the planet and ensure prosperity for all. This project contributes towards the following SDG(s):

  • SDG 3 - Good Health and Well-being
