Deep Self-Supervised Diversity Promoting Learning on Hierarchical Hyperspheres for Regularization

In this paper, we propose a novel approach to enhance the generalization performance of deep neural networks. Our method imposes a hypersphere-based constraint that organizes weight vectors into a hierarchy derived from the observed data. By diversifying the parameter space of the hyperplanes in the classification layer, we encourage discriminative generalization. We introduce a self-supervised grouping method that uncovers hierarchical structure when no hierarchy information is available. To maximize the distances between weight vectors on multiple hyperspheres, we propose a novel metric that combines discrete and continuous measures. This regularization encourages diverse orientations of the weight vectors and consequently improves generalization. Extensive evaluations on CUB200-2011, Stanford-Cars, CIFAR-100, and TinyImageNet consistently demonstrate improvements in classification performance over baseline settings.
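The abstract describes the regularizer only at a high level. The sketch below illustrates one plausible form of such a hypersphere diversity penalty in PyTorch, assuming a standard linear classification head; the function name, the grouping argument, and the loss weight in the usage comment are hypothetical and not taken from the paper.

```python
# Minimal sketch (not the authors' released code) of a diversity-promoting
# penalty on classifier weight vectors projected onto the unit hypersphere.
from typing import Optional

import torch
import torch.nn.functional as F


def hyperspherical_diversity_loss(weight: torch.Tensor,
                                  groups: Optional[torch.Tensor] = None) -> torch.Tensor:
    """Penalize similar orientations among classifier weight vectors.

    weight: (num_classes, feat_dim) weight matrix of the classification layer.
    groups: optional (num_classes,) integer group labels; when given, the
            penalty is applied only within each (hypothetical) hierarchy group.
    """
    w = F.normalize(weight, dim=1)          # project each row onto the unit hypersphere
    cos = w @ w.t()                         # pairwise cosine similarities
    mask = ~torch.eye(len(w), dtype=torch.bool, device=w.device)
    if groups is not None:
        same_group = groups.unsqueeze(0) == groups.unsqueeze(1)
        mask = mask & same_group            # compare only weights sharing a group
    if not mask.any():
        return weight.new_zeros(())
    # Minimizing the mean cosine similarity pushes weight vectors apart,
    # i.e. maximizes their angular distances on the hypersphere.
    return cos[mask].mean()


# Usage (hypothetical weighting): add the penalty to the task loss.
# logits = model(x)
# loss = F.cross_entropy(logits, y) \
#     + 0.1 * hyperspherical_diversity_loss(model.fc.weight, group_labels)
```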
Publisher
IEEE (Institute of Electrical and Electronics Engineers, Inc.)
Issue Date
2023
Language
English
Article Type
Article
Citation

IEEE Access, vol. 11, pp. 146208-146222

ISSN
2169-3536
DOI
10.1109/ACCESS.2023.3346430
URI
http://hdl.handle.net/10203/317884
Appears in Collection
AI-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
