Abstract
Ubiquitous mobile sensing for human activity recognition poses the threat of leaking personal information that is implicitly contained within time-series sensor signals and can be extracted by attackers. Existing protective methods support only specific sensitive attributes and require massive amounts of relevant sensitive ground truth for training, which is unfavourable to users. To fill this gap, we propose a novel data-transformation framework that prevents the leakage of sensitive information from sensor data. The proposed framework transforms raw sensor data into a new format in which the sensitive information is hidden and the desired information (e.g., human activities) is retained. Training can be conducted without using any personal information as ground truth. Meanwhile, multiple sensitive attributes (e.g., age, gender) can be hidden collectively through a one-time transformation. Experimental results on two multimodal sensor-based human activity datasets demonstrate the feasibility of the presented framework in hiding users' sensitive information (inference MAE increases ∼2 times and inference accuracy degrades ∼50%) without degrading the usability of the data for activity recognition (only ∼2% accuracy degradation).
| Original language | English |
|---|---|
| Article number | 9424974 |
| Journal | IEEE Transactions on Mobile Computing |
| Volume | 21 |
| Issue number | 12 |
| Pages (from-to) | 4517-4528 |
| Number of pages | 12 |
| ISSN | 1536-1233 |
| DOI | |
| Status | Published - 1 Dec. 2022 |
Bibliographic note
Publisher Copyright: IEEE