In this paper, we present a novel method for utilising wearable devices with Convolutional Neural Networks (CNNs) trained on acoustic and accelerometer signals to provide real-time quality inspection of manual operations in smart manufacturing environments. We show through our framework how recorded or streamed sound and accelerometer data, gathered from a wrist-attached device, can be used to classify user actions as successful or unsuccessful. The classifier is a deep CNN model trained on Mel-frequency cepstral coefficients (MFCCs) extracted from the acoustic input signals. The wearable device provides feedback through three modalities: audio, visual, and haptic, thus keeping the worker aware at all times. We validate our findings through deployments of the complete AI-enabled device in production facilities of Mercedes-Benz AG. The conducted experiments show that acoustic and accelerometer data are valuable for training a classifier to examine actions during industrial assembly operations, and that the device provides an intuitive interface for continued and improved quality inspection.
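The abstract names MFCCs as the acoustic feature representation fed to the CNN but does not include implementation details. The following is a minimal NumPy sketch of a standard MFCC pipeline (framing, Hann window, power spectrum, mel filterbank, log, DCT-II); the function names and the parameter defaults (16 kHz sample rate, 26 mel filters, 13 coefficients) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_filters, n_fft, sr):
    # Triangular filters spaced evenly on the mel scale
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        left, centre, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, centre):
            fbank[i - 1, k] = (k - left) / max(centre - left, 1)
        for k in range(centre, right):
            fbank[i - 1, k] = (right - k) / max(right - centre, 1)
    return fbank

def mfcc(signal, sr=16000, frame_len=400, hop=160,
         n_fft=512, n_filters=26, n_coeffs=13):
    # Frame the signal (25 ms frames, 10 ms hop at 16 kHz) and window it
    n_frames = 1 + (len(signal) - frame_len) // hop
    window = np.hanning(frame_len)
    frames = np.stack([signal[i * hop:i * hop + frame_len] * window
                       for i in range(n_frames)])
    # Power spectrum of each frame (zero-padded FFT)
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # Mel filterbank energies, then log compression
    log_e = np.log(power @ mel_filterbank(n_filters, n_fft, sr).T + 1e-10)
    # DCT-II to decorrelate; keep the first n_coeffs coefficients
    n = np.arange(n_filters)
    dct = np.cos(np.pi / n_filters * (n[None, :] + 0.5)
                 * np.arange(n_coeffs)[:, None])
    return log_e @ dct.T  # shape: (n_frames, n_coeffs)
```

One second of 16 kHz audio yields a (98, 13) feature matrix with these settings; such frame-by-coefficient matrices are what a 2-D CNN would typically consume as input "images" for the success/failure classification described above.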
|Status||Accepted/In press - 2020|
|Event||30th International Conference on Flexible Automation and Intelligent Manufacturing - Athens, Greece|
Duration: 15 Jun 2021 → 18 Jun 2021
|Conference||30th International Conference on Flexible Automation and Intelligent Manufacturing|
|Period||15/06/2021 → 18/06/2021|
Sarivan, I-M., Greiner, J., Díez Alvarez, D., Euteneuer, F., Reichenbach, M., Madsen, O., & Bøgh, S. (Accepted/In press). Enabling Real-Time Quality Inspection in Smart Manufacturing Through Wearable Smart Devices and Deep Learning. Procedia Manufacturing.