Self-supervised masked convolutional transformer block for anomaly detection

Neelu Madan, Nicolae Catalin Ristea, Radu Tudor Ionescu, Kamal Nasrollahi, Fahad Shahbaz Khan, Thomas B. Moeslund, Mubarak Shah

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

Anomaly detection has recently gained increasing attention in the field of computer vision, likely due to its broad set of applications ranging from product fault detection on industrial production lines and impending event detection in video surveillance to finding lesions in medical scans. Regardless of the domain, anomaly detection is typically framed as a one-class classification task, where learning is conducted on normal examples only. An entire family of successful anomaly detection methods is based on learning to reconstruct masked normal inputs (e.g., patches, future frames) and using the magnitude of the reconstruction error as an indicator of the abnormality level. Unlike other reconstruction-based methods, we present a novel self-supervised masked convolutional transformer block (SSMCTB) that integrates the reconstruction-based functionality at a core architectural level. The proposed self-supervised block is extremely flexible, enabling information masking at any layer of a neural network and being compatible with a wide range of neural architectures. In this work, we extend our previous self-supervised predictive convolutional attentive block (SSPCAB) with a 3D masked convolutional layer, a transformer for channel-wise attention, as well as a novel self-supervised objective based on Huber loss. Furthermore, we show that our block is applicable to a wider variety of tasks, adding anomaly detection in medical images and thermal videos to the previously considered tasks based on RGB images and surveillance videos. We demonstrate the generality and flexibility of SSMCTB by integrating it into multiple state-of-the-art neural models for anomaly detection, bringing forth empirical results that confirm considerable performance improvements on five benchmarks: MVTec AD, BRATS, Avenue, ShanghaiTech, and Thermal Rare Event.
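The abstract describes the block in terms of three ingredients: a masked convolution that forces the block to reconstruct hidden information from its context, channel-wise attention, and a self-supervised Huber-loss reconstruction objective whose error doubles as an anomaly score. As a rough illustration only, and not the authors' implementation, the PyTorch sketch below shows how such a block and objective can be wired together; the use of a 2D (rather than 3D) convolution, the squeeze-and-excitation-style channel attention standing in for the paper's transformer attention, and all layer sizes are assumptions made for brevity.

    # Minimal sketch of a masked-convolution reconstruction block with channel
    # attention and a Huber-loss objective. Illustrative only; NOT the exact
    # SSMCTB architecture (2D conv and SE-style attention are simplifications).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MaskedConvAttentionBlock(nn.Module):
        def __init__(self, channels: int, kernel_size: int = 3):
            super().__init__()
            # Convolution whose centre tap is zeroed out, so the block must
            # predict the masked (centre) content from the surrounding context.
            self.conv = nn.Conv2d(channels, channels, kernel_size,
                                  padding=kernel_size // 2)
            mask = torch.ones_like(self.conv.weight)
            mask[:, :, kernel_size // 2, kernel_size // 2] = 0.0
            self.register_buffer("mask", mask)
            # Lightweight channel-wise attention (squeeze-and-excitation style).
            self.attn = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(channels, channels // 2, 1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels // 2, channels, 1),
                nn.Sigmoid(),
            )

        def forward(self, x):
            # Masked convolution, then re-weight channels by attention.
            out = F.conv2d(x, self.conv.weight * self.mask, self.conv.bias,
                           padding=self.conv.padding)
            return out * self.attn(out)

    # Self-supervised objective: the block's output should reconstruct its own
    # input. The Huber reconstruction error is the training signal on normal
    # data and, at test time, a large error suggests an anomaly.
    block = MaskedConvAttentionBlock(channels=16)
    x = torch.randn(2, 16, 32, 32)          # dummy feature maps from normal data
    loss = F.huber_loss(block(x), x)
    loss.backward()

Because the masking lives inside the block itself, the same reconstruction objective can be attached at any layer of a host network, which is the flexibility the abstract emphasizes.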

Original language: English
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 46
Issue number: 1
Pages (from-to): 525-542
Number of pages: 18
ISSN: 0162-8828
DOIs
Publication status: Published - 1 Jan 2024

Keywords

  • Anomaly detection
  • Abnormal event detection
  • Attention mechanism
  • Benchmark testing
  • Convolution
  • Image reconstruction
  • Masked convolution
  • Self-attention
  • Self-supervised learning
  • Task analysis
  • Three-dimensional displays
  • Transformers
