OSLNet: Deep Small-Sample Classification with an Orthogonal Softmax Layer

Xiaoxu Li, Dongliang Chang, Zhanyu Ma, Zheng-Hua Tan, Jing-Hao Xue, Jie Cao, Jingyi Yu, Jun Guo

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

A deep neural network with multiple nonlinear layers forms a large function space, which can easily lead to overfitting on small-sample data. To mitigate overfitting in small-sample classification, learning more discriminative features from small-sample data is becoming a new trend. To this end, this paper aims to find a subspace of neural networks that can facilitate a large decision margin. Specifically, we propose the Orthogonal Softmax Layer (OSL), which keeps the weight vectors of the classification layer orthogonal during both training and testing. The Rademacher complexity of a network using the OSL is only 1/K of that of a network using a fully connected classification layer, where K is the number of classes, leading to a tighter generalization error bound. Experimental results demonstrate that the proposed OSL outperforms the comparison methods on four small-sample benchmark datasets and is also applicable to large-sample datasets. Code is available at: https://github.com/dongliangchang/OSLNet.
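The mechanism summarized above lends itself to a compact illustration. The following is a minimal, hypothetical PyTorch sketch, assuming orthogonality is enforced by assigning each class's weight vector a disjoint block of the input features through a fixed binary mask, so that the vectors are orthogonal by construction throughout training and testing. The class name and the even block-partition scheme are illustrative assumptions, not the authors' definitive implementation; consult the linked repository for the actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OrthogonalSoftmaxLayer(nn.Module):
    """Classification layer whose class weight vectors stay orthogonal.

    Sketch assumption: each class is given a disjoint block of the input
    features via a fixed (non-trainable) binary mask, so any two masked
    weight vectors have non-overlapping support and are therefore
    orthogonal at every step of training and inference.
    """

    def __init__(self, in_features: int, num_classes: int):
        super().__init__()
        assert in_features % num_classes == 0, "in_features must split evenly across classes"
        self.weight = nn.Parameter(torch.randn(num_classes, in_features) * 0.01)
        # Fixed mask: class k only "sees" its own feature block.
        block = in_features // num_classes
        mask = torch.zeros(num_classes, in_features)
        for k in range(num_classes):
            mask[k, k * block:(k + 1) * block] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The element-wise mask keeps the effective weight vectors
        # orthogonal; softmax is applied inside the cross-entropy loss.
        return F.linear(x, self.weight * self.mask)

# Usage sketch: logits from 512-dim features for a 4-class problem.
layer = OrthogonalSoftmaxLayer(in_features=512, num_classes=4)
features = torch.randn(8, 512)
loss = F.cross_entropy(layer(features), torch.randint(0, 4, (8,)))
```

Because the mask zeroes all cross-class overlap, the inner product of any two effective class weight vectors is exactly zero after every gradient update, with no extra orthogonality penalty needed.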

Original language: English
Article number: 9088302
Journal: IEEE Transactions on Image Processing
Volume: 29
Pages (from-to): 6482-6495
Number of pages: 14
ISSN: 1057-7149
DOIs
Publication status: Published - 2020

Keywords

  • Deep neural network
  • Orthogonal softmax layer
  • Overfitting
  • Small-sample classification
