Back-dropout Transfer Learning for Action Recognition

Huamin Ren, Nattiya Kanhabua, Andreas Møgelmose, Weifeng Liu, Sergio Escalera, Xavier Baro, Thomas B. Moeslund

Research output: Contribution to journal › Journal article › Research › peer-review

Abstract

Transfer learning aims to adapt a model learned from a source dataset to a target dataset. It is particularly beneficial when annotating the target dataset is expensive or infeasible, and it has demonstrated strong learning capabilities across a variety of vision tasks. Despite its promise, how to adapt a model learned from the source dataset to the target dataset remains an open question. One major challenge is preventing dataset bias from degrading classification performance: dataset bias arises when two images of the same category, drawn from different datasets, are not classified identically. To address this problem, a transfer learning algorithm called negative back-dropout transfer learning (NB-TL) is proposed, which takes misclassified images and applies a back-dropout strategy to them to penalize errors. Experimental results demonstrate the effectiveness of the proposed algorithm. In particular, the authors evaluate the proposed NB-TL algorithm on the UCF 101 action recognition dataset, achieving an 88.9% recognition rate.
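
The abstract only outlines the idea of penalizing misclassified samples with a back-dropout strategy; it does not specify the exact update rule. The sketch below is therefore one possible reading, not the paper's method: during fine-tuning on the target dataset, misclassified ("negative") samples in a batch receive an extra penalty computed with a dropout mask applied to their inputs. The function name nb_tl_step, the dropout probability, and the penalty weight are all hypothetical choices for illustration.

```python
# Illustrative sketch only; the paper's actual NB-TL formulation may differ.
import torch
import torch.nn as nn

def nb_tl_step(model, features, labels, optimizer,
               criterion=nn.CrossEntropyLoss(), drop_p=0.5, penalty_weight=0.5):
    """One fine-tuning step with an extra, dropout-masked penalty on
    misclassified samples (a loose interpretation of 'back-dropout')."""
    model.train()
    optimizer.zero_grad()

    logits = model(features)
    loss = criterion(logits, labels)  # standard fine-tuning loss on the target batch

    # Identify misclassified ("negative") samples in the current batch.
    with torch.no_grad():
        wrong = logits.argmax(dim=1) != labels

    if wrong.any():
        # Randomly drop activations of the misclassified samples and
        # add an extra error penalty computed on the dropped inputs.
        dropped = nn.functional.dropout(features[wrong], p=drop_p, training=True)
        penalty = criterion(model(dropped), labels[wrong])
        loss = loss + penalty_weight * penalty

    loss.backward()
    optimizer.step()
    return loss.item()
```

Under this reading, correctly classified samples follow ordinary fine-tuning, while misclassified ones contribute a second, noisier gradient signal intended to penalize the errors more heavily.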

Original language: English
Journal: IET Computer Vision
Volume: 12
Issue number: 4
Pages (from-to): 484-491
Number of pages: 8
ISSN: 1751-9632
DOIs
Publication status: Published - 2018
