Abstract
Ubiquitous mobile sensors used for human activity recognition pose the threat of leaking personal information that is implicitly contained in the time-series sensor signals and can be extracted by attackers. Existing protective methods support only specific sensitive attributes and require large amounts of the relevant sensitive attributes as ground truth for training, which is unfavourable to users. To fill this gap, we propose a novel data transformation framework for prohibiting the leakage of sensitive information from sensor data. The proposed framework transforms raw sensor data into a new format in which the sensitive information is hidden and the desired information (e.g., human activities) is retained. Training can be conducted without using any personal information as ground truth. Meanwhile, multiple attributes of sensitive information (e.g., age, gender) can be collectively hidden through a one-time transformation. Experimental results on two multimodal sensor-based human activity datasets demonstrate the feasibility of the presented framework in hiding users' sensitive information (inference MAE increases ∼2 times and inference accuracy degrades ∼50%) without degrading the usability of the data for activity recognition (only ∼2% accuracy degradation).
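The abstract does not detail the training objective, but the core idea it describes (a learned transformation that keeps activity-relevant information while being trained without any sensitive labels) can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the authors' published method: the module names (`Transformer1D`, `ActivityHead`), the channel and window sizes, and the heuristic dissimilarity term that pushes transformed windows away from the raw signal are all assumptions.

```python
# Hypothetical sketch of a label-free privacy transformation for sensor data.
# Only activity labels are used; no sensitive attributes (age, gender, ...)
# appear anywhere in training, mirroring the claim in the abstract.
import torch
import torch.nn as nn


class Transformer1D(nn.Module):
    """Maps a raw sensor window (channels x time) to a transformed window
    of the same shape, so existing activity models can consume it."""
    def __init__(self, channels: int = 6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, channels, kernel_size=5, padding=2),
        )

    def forward(self, x):
        return self.net(x)


class ActivityHead(nn.Module):
    """Activity classifier operating on the transformed data."""
    def __init__(self, channels: int = 6, num_activities: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(64, num_activities)

    def forward(self, x):
        return self.classifier(self.features(x))


def training_step(transform, activity_head, x, y_activity, optimizer,
                  dissimilarity_weight: float = 0.1):
    """One optimisation step: preserve activity information (cross-entropy
    on activity labels only) while penalising transformed windows that stay
    too close to the raw signal -- one plausible, label-free heuristic for
    suppressing residual sensitive content (an assumption of this sketch)."""
    optimizer.zero_grad()
    x_t = transform(x)
    utility_loss = nn.functional.cross_entropy(activity_head(x_t), y_activity)
    # Hinge term: zero once the transformed data is at least a margin of 1.0
    # (in MSE) away from the raw data, so the push is bounded.
    dissimilarity = torch.relu(1.0 - nn.functional.mse_loss(x_t, x))
    loss = utility_loss + dissimilarity_weight * dissimilarity
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    transform, head = Transformer1D(), ActivityHead()
    opt = torch.optim.Adam(
        list(transform.parameters()) + list(head.parameters()), lr=1e-3)
    x = torch.randn(8, 6, 128)      # batch of 8 windows, 6 channels, 128 samples
    y = torch.randint(0, 6, (8,))   # activity labels only; no sensitive labels
    print(training_step(transform, head, x, y, opt))
```

Because the objective never references sensitive attributes, adding another attribute to protect requires no new labels or retraining target, which is consistent with the abstract's claim that multiple attributes can be hidden through a one-time transformation.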
| Original language | English |
| --- | --- |
| Article number | 9424974 |
| Journal | IEEE Transactions on Mobile Computing |
| Volume | 21 |
| Issue number | 12 |
| Pages (from-to) | 4517-4528 |
| Number of pages | 12 |
| ISSN | 1536-1233 |
| DOIs | |
| Publication status | Published - 1 Dec 2022 |
Bibliographical note
Publisher Copyright: IEEE
Keywords
- Activity recognition
- Legged locomotion
- Medical services
- Mobile computing
- Neural networks
- Task analysis
- Training
- human activity recognition
- mobile sensors
- neural network
- sensitive information protection