Abstract
In this paper, we present a novel method for utilising wearable devices with Convolutional Neural Networks (CNNs) trained on acoustic and accelerometer signals to provide real-time quality inspection during manual operations in smart manufacturing environments. We show through our framework how recorded or streamed sound and accelerometer data gathered from a wrist-attached device can be used to classify certain user actions as successful or unsuccessful. The classifier is a deep CNN model trained on Mel-frequency Cepstral Coefficients (MFCCs) extracted from the acoustic input signals. The wearable device provides feedback through three modalities: audio, visual and haptic, thus ensuring the worker's awareness at all times. We validate our findings through deployments of the complete AI-enabled device in production facilities of Mercedes-Benz AG. The conducted experiments show that acoustic and accelerometer data are valuable for training a classifier for action examination during industrial assembly operations, and that the device provides an intuitive interface for ensuring continued and improved quality inspection.
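As a hedged illustration of the feature pipeline the abstract describes, the sketch below computes MFCCs from a raw audio signal using NumPy only (framing, Hamming window, power spectrum, mel filterbank, log, DCT-II). All parameter values here (sample rate, FFT size, filter and coefficient counts) are assumptions chosen for illustration, not the paper's actual configuration, and the function name `mfcc` is hypothetical.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(signal, sr=16000, n_fft=512, hop=256, n_mels=26, n_ceps=13):
    """Compute MFCC features; parameters are illustrative assumptions."""
    # Slice the signal into overlapping frames and apply a Hamming window
    n_frames = 1 + (len(signal) - n_fft) // hop
    window = np.hamming(n_fft)
    frames = np.stack([signal[i * hop : i * hop + n_fft] * window
                       for i in range(n_frames)])
    # Per-frame power spectrum
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # Triangular mel filterbank, equally spaced on the mel scale
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        fb[m - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)   # rising edge
        fb[m - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)   # falling edge
    # Log mel energies (small offset avoids log(0))
    log_mel = np.log(power @ fb.T + 1e-10)
    # DCT-II over the filterbank axis decorrelates the log energies;
    # the first n_ceps coefficients are kept as the MFCCs
    n = np.arange(n_mels)
    dct = np.cos(np.pi * np.outer(np.arange(n_ceps), (2 * n + 1) / (2 * n_mels)))
    return log_mel @ dct.T  # shape: (n_frames, n_ceps)
```

In the paper's setting, matrices of such coefficients (a time-frequency representation) would be fed to a CNN as image-like input; the convolutional layers then learn local spectro-temporal patterns that distinguish successful from unsuccessful actions.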
Original language | English |
---|---|
Journal | Procedia Manufacturing |
Volume | 51 |
Pages (from-to) | 373-380 |
ISSN | 2351-9789 |
DOIs | |
Publication status | Published - Nov 2020 |
Event | 30th International Conference on Flexible Automation and Intelligent Manufacturing, Athens, Greece. Duration: 15 Jun 2021 → 18 Jun 2021. https://www.faimconference.org/ |
Conference
Conference | 30th International Conference on Flexible Automation and Intelligent Manufacturing |
---|---|
Country/Territory | Greece |
City | Athens |
Period | 15/06/2021 → 18/06/2021 |
Internet address |
Keywords
- Deep Learning
- CNN
- Convolutional Neural Network
- MFCC
- Smart Wearable Devices
- Sound Classification
- Smart Manufacturing
- Quality Inspection