DAT: Domain Adaptive Transformer for Domain Adaptive Semantic Segmentation

Unsupervised domain adaptation (UDA) for semantic segmentation aims to predict class annotations on an unlabeled target dataset by training on a richly labeled source dataset. A key requirement in UDA semantic segmentation is reducing the domain gap by learning domain-invariant feature representations across both domains. In this paper, we propose a novel transformer-based network, called the domain adaptive transformer (DAT), trained with a self-training scheme. We introduce domain-invariant attention (DIA), which enables the DAT to exploit high-level domain-invariant features at the patch level. Moreover, an entropy-based selective pseudo-labeling algorithm provides the DAT with reliable pseudo-labels of target samples for domain adaptive self-training, correcting noisy pseudo-labels online. We show that our DAT greatly improves domain adaptability and achieves state-of-the-art results on the SYNTHIA-to-Cityscapes benchmark.
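
The full paper is not attached to this record, so the following is only a minimal sketch of the general entropy-based selective pseudo-labeling idea mentioned in the abstract: per-pixel prediction entropy is used to keep confident target-domain predictions as pseudo-labels and to ignore unreliable ones during self-training. The function name, the normalized entropy threshold, the ignore index of 255, and the PyTorch framing are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def select_pseudo_labels(logits: torch.Tensor,
                         entropy_threshold: float = 0.5,
                         ignore_index: int = 255) -> torch.Tensor:
    """Keep only low-entropy (confident) pixel predictions as pseudo-labels.

    logits: (B, C, H, W) raw segmentation outputs for unlabeled target images.
    Pixels whose normalized prediction entropy exceeds `entropy_threshold`
    are set to `ignore_index` so they are excluded from the self-training loss.
    """
    num_classes = logits.shape[1]
    probs = F.softmax(logits, dim=1)                               # (B, C, H, W)
    entropy = -(probs * torch.log(probs + 1e-12)).sum(dim=1)      # (B, H, W)
    entropy = entropy / torch.log(torch.tensor(float(num_classes)))  # normalize to [0, 1]
    pseudo_labels = probs.argmax(dim=1)                            # (B, H, W)
    pseudo_labels[entropy > entropy_threshold] = ignore_index      # drop unreliable pixels
    return pseudo_labels
```

In a self-training loop, such a mask would typically be recomputed from the current model outputs on each target batch, which is one way the noisy pseudo-labels could be corrected online as training progresses.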
Publisher
IEEE
Issue Date
2022-10
Language
English
Citation
IEEE International Conference on Image Processing, ICIP 2022, pp. 4183-4187
ISSN
1522-4880
DOI
10.1109/ICIP46576.2022.9897293
URI
http://hdl.handle.net/10203/300290
Appears in Collection
EE-Conference Papers (Conference Papers)
