Self-Supervised Knowledge Transfer via Loosely Supervised Auxiliary Tasks

Knowledge transfer using convolutional neural networks (CNNs) can help efficiently train a CNN with fewer parameters or maximize generalization performance under limited supervision. To enable more efficient transfer of pretrained knowledge under relaxed conditions, we propose a simple yet powerful knowledge transfer methodology that places no restrictions on the network structure or dataset used: self-supervised knowledge transfer (SSKT) via loosely supervised auxiliary tasks. To this end, we devise a training methodology that transfers previously learned knowledge into the current training process as an auxiliary task for the target task, through self-supervision using soft labels. Because SSKT is independent of the network structure and dataset, and is trained differently from existing knowledge transfer methods, prior knowledge acquired from various tasks can be transferred naturally to the target task during training. Furthermore, it improves generalization performance on most datasets through the proposed knowledge transfer between different problem domains from multiple source networks. In experiments under various knowledge transfer settings, SSKT outperforms other transfer learning methods (KD, DML, and MAXL). The source code will be made publicly available.
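The abstract describes the mechanism only at a high level. The following PyTorch sketch illustrates one plausible reading of it: a frozen, pretrained source network provides soft labels, and the target network learns to reproduce them through an auxiliary head alongside its main task. The class and function names, the KL-divergence formulation, the temperature, and the aux_weight are all illustrative assumptions, not the authors' actual SSKT implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetWithAuxHead(nn.Module):
    # Target network: a shared backbone, a head for the target task, and an
    # auxiliary head trained to reproduce a source network's soft labels.
    def __init__(self, backbone, feat_dim, num_target_classes, num_source_classes):
        super().__init__()
        self.backbone = backbone
        self.target_head = nn.Linear(feat_dim, num_target_classes)
        self.aux_head = nn.Linear(feat_dim, num_source_classes)

    def forward(self, x):
        feats = self.backbone(x)
        return self.target_head(feats), self.aux_head(feats)

def auxiliary_transfer_loss(target_logits, aux_logits, labels, source_logits,
                            temperature=4.0, aux_weight=1.0):
    # Main-task cross-entropy plus a KL term that pushes the auxiliary head
    # toward the frozen source network's softened predictions (assumed form).
    ce = F.cross_entropy(target_logits, labels)
    soft_targets = F.softmax(source_logits / temperature, dim=1)
    log_aux = F.log_softmax(aux_logits / temperature, dim=1)
    kl = F.kl_div(log_aux, soft_targets, reduction="batchmean") * (temperature ** 2)
    return ce + aux_weight * kl

# Dummy usage: a frozen 1000-class "source" classifier provides soft labels
# while a 10-class target model is trained; only the target model receives gradients.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
model = TargetWithAuxHead(backbone, feat_dim=128, num_target_classes=10, num_source_classes=1000)
source_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 1000)).eval()

images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
with torch.no_grad():
    source_logits = source_net(images)
target_logits, aux_logits = model(images)
loss = auxiliary_transfer_loss(target_logits, aux_logits, labels, source_logits)
loss.backward()

Per the abstract, the soft labels may come from one or more separately pretrained source networks, possibly trained on different problem domains, which stay fixed while the target network and its auxiliary head are updated.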
Publisher
IEEE Computer Society
Issue Date
2022-01
Language
English
Citation
22nd IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), pp. 2947-2956
ISSN
2472-6737
DOI
10.1109/WACV51458.2022.00300
URI
http://hdl.handle.net/10203/298333
Appears in Collection
EE-Conference Papers (Conference Papers)
