Abstract
In this paper, we present a novel method for utilising wearable devices with Convolutional Neural Networks (CNNs) trained on acoustic and accelerometer signals in smart manufacturing environments, providing real-time quality inspection during manual operations. Our framework shows how recorded or streamed sound and accelerometer data gathered from a wrist-attached device can be used to classify user actions as successful or unsuccessful. The classifier is a deep CNN model trained on Mel-Frequency Cepstral Coefficients (MFCCs) extracted from the acoustic input signals. The wearable device provides feedback through three modalities: audio, visual and haptic, thus ensuring the worker's awareness at all times. We validate our findings through deployments of the complete AI-enabled device in production facilities of Mercedes-Benz AG. From the conducted experiments we conclude that acoustic and accelerometer data are valuable for training a classifier to examine actions during industrial assembly operations, and that the device provides an intuitive interface for ensuring continued and improved quality inspection.
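To make the described pipeline concrete, the sketch below shows how MFCC features can be extracted from an audio clip and fed to a small CNN for binary success/failure classification. This is a minimal illustration, assuming librosa for feature extraction and Keras for the model; the function names, layer sizes and hyperparameters are illustrative and not the authors' actual architecture.

```python
# Minimal sketch: MFCC extraction + small CNN for binary
# success/failure classification. All hyperparameters illustrative.
import numpy as np
import librosa
import tensorflow as tf

def extract_mfcc(path, sr=16000, n_mfcc=40, frames=128):
    """Load a clip and compute a fixed-size MFCC time-frequency map."""
    y, _ = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    # Pad or truncate the time axis to a fixed number of frames
    # so every clip yields an input of the same shape.
    if mfcc.shape[1] < frames:
        mfcc = np.pad(mfcc, ((0, 0), (0, frames - mfcc.shape[1])))
    return mfcc[:, :frames]

def build_cnn(n_mfcc=40, frames=128):
    """Small 2D CNN over the MFCC map; sigmoid output = P(success)."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(n_mfcc, frames, 1)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

model = build_cnn()
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```

In this framing each recorded operation becomes one fixed-size MFCC "image", which is what lets a standard 2D CNN be applied to the acoustic signal; the accelerometer channel described in the paper could be handled analogously or fused as an additional input branch.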
Original language | English |
---|---|
Journal | Procedia Manufacturing |
Volume | 51 |
Pages (from-to) | 373-380 |
ISSN | 2351-9789 |
DOI | |
Status | Published - Nov 2020 |
Event | 30th International Conference on Flexible Automation and Intelligent Manufacturing - Athens, Greece. Duration: 15 Jun 2021 → 18 Jun 2021. https://www.faimconference.org/ |
Conference
Conference | 30th International Conference on Flexible Automation and Intelligent Manufacturing |
---|---|
Country/Territory | Greece |
City | Athens |
Period | 15/06/2021 → 18/06/2021 |
Internet address | https://www.faimconference.org/ |