Entropy regularization for weakly supervised object localization

Cited 5 times in Web of Science · Cited 0 times in Scopus
  • Hits: 193
  • Downloads: 0
DC Field | Value | Language
dc.contributor.author | Hwang, Dongjun | ko
dc.contributor.author | Ha, Jung-Woo | ko
dc.contributor.author | Shim, Hyunjung | ko
dc.contributor.author | Choe, Junsuk | ko
dc.date.accessioned | 2023-05-02T07:00:19Z | -
dc.date.available | 2023-05-02T07:00:19Z | -
dc.date.created | 2023-05-02 | -
dc.date.issued | 2023-05 | -
dc.identifier.citation | PATTERN RECOGNITION LETTERS, v.169, pp.1 - 7 | -
dc.identifier.issn | 0167-8655 | -
dc.identifier.uri | http://hdl.handle.net/10203/306409 | -
dc.description.abstract | The goal of weakly-supervised object localization (WSOL) is to train a localization model without the location information of the object(s). Recently, most existing WSOL methods capture the object with an attention map extracted from a classification network. However, it has been observed that we need to sacrifice classification performance to achieve the best WSOL score. We conjecture that this is because the objective of classification training, minimizing the cross-entropy between the one-hot ground truth and the predicted class probability, is not entirely consistent with that of localization. In this paper, we investigate how the entropy of the predicted class probability affects localization performance, and we conclude that there is a sweet spot for localization with respect to entropy. Hence, we propose a new training strategy that adopts entropy regularization to find the optimal point effectively. Specifically, we add an additional term to the loss function, which minimizes the cross-entropy between a uniform distribution and the predicted class probability vector. The proposed method is easy to implement, since we do not need to modify the architecture or data pipeline. In addition, our method is efficient in that almost zero additional resources are required. Most importantly, our method improves WSOL scores significantly, as shown through extensive experiments. © 2023 Elsevier B.V. All rights reserved. | -
dc.language | English | -
dc.publisher | ELSEVIER | -
dc.title | Entropy regularization for weakly supervised object localization | -
dc.type | Article | -
dc.identifier.wosid | 000967562500001 | -
dc.identifier.scopusid | 2-s2.0-85150900216 | -
dc.type.rims | ART | -
dc.citation.volume | 169 | -
dc.citation.beginningpage | 1 | -
dc.citation.endingpage | 7 | -
dc.citation.publicationname | PATTERN RECOGNITION LETTERS | -
dc.identifier.doi | 10.1016/j.patrec.2023.03.018 | -
dc.contributor.localauthor | Shim, Hyunjung | -
dc.contributor.nonIdAuthor | Hwang, Dongjun | -
dc.contributor.nonIdAuthor | Ha, Jung-Woo | -
dc.contributor.nonIdAuthor | Choe, Junsuk | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Deep learning | -
dc.subject.keywordAuthor | Weakly-supervised learning | -
dc.subject.keywordAuthor | Computer vision | -
dc.subject.keywordAuthor | Object localization | -
dc.subject.keywordAuthor | Entropy regularization | -
dc.subject.keywordAuthor | Dense attention map | -
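
The abstract above describes the method at the level of a single extra loss term: alongside the usual classification objective, minimize the cross-entropy between a uniform distribution and the predicted class probability vector. Below is a minimal sketch of what such a loss could look like in PyTorch; the function name entropy_regularized_loss, the weight lambda_ent, and the tensor shapes are illustrative assumptions, not taken from the paper.

import torch
import torch.nn.functional as F

def entropy_regularized_loss(logits: torch.Tensor,
                             targets: torch.Tensor,
                             lambda_ent: float = 0.1) -> torch.Tensor:
    """Standard cross-entropy plus a term that pulls the predicted class
    distribution toward the uniform distribution (hypothetical sketch)."""
    # Usual classification objective: cross-entropy with the one-hot ground truth.
    ce_loss = F.cross_entropy(logits, targets)

    # Log-probabilities of the predicted class distribution, shape (batch, num_classes).
    log_probs = F.log_softmax(logits, dim=1)

    # Cross-entropy between the uniform distribution u and the predictions p:
    # H(u, p) = -(1/K) * sum_k log p_k, averaged over the batch.
    uniform_ce = -log_probs.mean(dim=1).mean()

    return ce_loss + lambda_ent * uniform_ce

# Illustrative usage with random tensors standing in for a real batch.
logits = torch.randn(8, 200, requires_grad=True)   # e.g., a 200-way classifier head
targets = torch.randint(0, 200, (8,))
loss = entropy_regularized_loss(logits, targets, lambda_ent=0.1)
loss.backward()

Minimizing the H(u, p) term drives the predicted distribution toward uniform, so lambda_ent controls how far training moves away from the sharp, confident predictions favored by plain cross-entropy toward the flatter predictions the abstract associates with better localization.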
Appears in Collection
AI-Journal Papers (저널논문)
Files in This Item
There are no files associated with this item.
This item is cited by other documents in WoS
