TENAS: Using Taylor Expansion and Channel-level Skip Connection for Neural Architecture Search

There is growing interest in automating the design of good neural network architectures. Recently proposed neural architecture search (NAS) methods have significantly reduced the architecture search cost by sharing parameters, but designing the search space remains a challenging problem. Existing operation-level architecture search methods require either a large amount of computing power or a very carefully designed search space of operations. For example, the types of operations in the search space (e.g., convolutions with 3 × 3 filters) must be carefully selected in these methods. In this paper, we investigate whether competitive performance can be achieved with only a small amount of computing power and without careful search-space design. We propose TENAS, which uses Taylor expansion and only a single, fixed type of operation. The resulting architecture is sparse at the channel level and has a different topology in each cell. Experimental results on CIFAR-10 and ImageNet show that the fine-grained, sparse models found by TENAS achieve performance highly competitive with the dense models found by existing methods.
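As a rough illustration of the channel-scoring idea described in the abstract, the following is a minimal sketch (in PyTorch, not the paper's actual code) of a first-order Taylor criterion for channel importance: each candidate channel is gated by a learnable scale, and the loss change caused by removing the channel is approximated by the magnitude of the first-order term |gate · dL/dgate|. The GatedConv module, the gate parameter, and the top-k channel budget are illustrative assumptions, not details taken from the paper.

# Sketch only: first-order Taylor importance of output channels in a fixed 3x3 conv.
import torch
import torch.nn as nn

class GatedConv(nn.Module):
    """3x3 convolution whose output channels are gated by learnable scalars (hypothetical)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False)
        self.gate = nn.Parameter(torch.ones(out_ch))  # one gate per output channel

    def forward(self, x):
        # Scale each output channel by its gate so the gate's gradient reflects channel usefulness.
        return self.conv(x) * self.gate.view(1, -1, 1, 1)

def taylor_channel_importance(layer):
    # |gate * dL/dgate| approximates the loss change from zeroing each channel (first-order Taylor term).
    return (layer.gate.detach() * layer.gate.grad.detach()).abs()

# Usage: after a backward pass, rank channels and keep only a budgeted subset per layer.
layer = GatedConv(16, 32)
x = torch.randn(4, 16, 8, 8)
loss = layer(x).mean()          # stand-in for the training loss
loss.backward()
scores = taylor_channel_importance(layer)
keep = scores.topk(k=16).indices  # hypothetical budget of 16 channels

Keeping only the highest-scoring channels in each layer yields a channel-sparse cell whose retained topology can differ from cell to cell, consistent with the fine-grained sparsity described in the abstract.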
Publisher
IEEE (Institute of Electrical and Electronics Engineers Inc.)
Issue Date
2022-08
Language
English
Article Type
Article
Citation

IEEE Access, v.10, pp. 84790-84798

ISSN
2169-3536
DOI
10.1109/ACCESS.2022.3195208
URI
http://hdl.handle.net/10203/298371
Appears in Collection
CS-Journal Papers (Journal Papers)