In this paper, we present a novel method for real-time quality inspection of manual operations in smart manufacturing environments, using wearable devices with Convolutional Neural Networks (CNNs) trained on acoustic and accelerometer signals. We show how recorded or streamed sound and accelerometer data gathered from a wrist-attached device can be used to classify user actions as successful or unsuccessful. The classifier is a deep CNN model trained on Mel-Frequency Cepstral Coefficients (MFCCs) extracted from the acoustic input signals. The wearable device provides feedback through three modalities: audio, visual, and haptic, thus ensuring the worker's awareness at all times. We validate our findings through deployments of the complete AI-enabled device in production facilities of Mercedes-Benz AG. From the conducted experiments, we conclude that acoustic and accelerometer data are valuable for training a classifier to examine actions during industrial assembly operations, and that the device provides an intuitive interface for continued and improved quality inspection.
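The abstract names MFCCs as the feature representation fed to the CNN. As a rough illustration of that pipeline stage (not the paper's implementation), the following is a minimal numpy-only sketch of MFCC extraction: frame the signal, window it, take the power spectrum, apply a mel filterbank, and decorrelate the log energies with a DCT-II. All parameter values (16 kHz sample rate, 26 mel filters, 13 coefficients) are illustrative assumptions, not values reported by the authors.

```python
import numpy as np

def mel_filterbank(n_filters, n_fft, sr):
    # Triangular filters spaced evenly on the mel scale.
    hz_to_mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
    mel_to_hz = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            fb[i - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[i - 1, k] = (right - k) / max(right - center, 1)
    return fb

def mfcc(signal, sr=16000, frame_len=400, hop=160,
         n_fft=512, n_filters=26, n_coeffs=13):
    # Frame the signal with a Hamming window (25 ms frames, 10 ms hop at 16 kHz).
    frames = [signal[i:i + frame_len] * np.hamming(frame_len)
              for i in range(0, len(signal) - frame_len + 1, hop)]
    # Power spectrum of each frame.
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # Log mel filterbank energies.
    energies = np.log(power @ mel_filterbank(n_filters, n_fft, sr).T + 1e-10)
    # DCT-II across the filterbank axis yields the cepstral coefficients.
    n = np.arange(n_filters)
    dct = np.cos(np.pi * np.outer(np.arange(n_coeffs), (2 * n + 1) / (2 * n_filters)))
    return energies @ dct.T  # shape: (n_frames, n_coeffs)
```

The resulting (frames × coefficients) matrix is the kind of 2D time-frequency input a CNN classifier can consume directly, one channel per recording.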
Publication status: Accepted/In press - 2020
Event: 30th International Conference on Flexible Automation and Intelligent Manufacturing - Athens, Greece
Duration: 15 Jun 2021 → 18 Jun 2021
- Deep Learning
- Convolutional Neural Network
- Smart Wearable Devices
- Sound Classification
- Smart Manufacturing
- Quality Inspection
Sarivan, I-M., Greiner, J., Díez Alvarez, D., Euteneuer, F., Reichenbach, M., Madsen, O., & Bøgh, S. (Accepted/In press). Enabling Real-Time Quality Inspection in Smart Manufacturing Through Wearable Smart Devices and Deep Learning. Procedia Manufacturing.