Lazy Net: Lazy Entry Neural Networks for Accelerated and Efficient Inference

Modern edge devices have become powerful enough to run deep learning tasks, but challenges remain, chief among them limited resources such as computing power, memory, and energy. To address these challenges, methods such as channel pruning, network quantization, and early exiting have been introduced to reduce the computational load of such tasks. In this paper, we propose LazyNet, an alternative that applies skip modules instead of early exits to a pre-trained neural network. We use a small module that preserves spatial information and provides a metric for deciding the computational flow. If a data sample is easy, the network skips most of the computation; if not, the network processes the sample fully for accurate classification. We test our model with various backbone networks on the CIFAR-10 dataset and show reduced inference time and memory consumption along with increased accuracy.
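The skip mechanism described in the abstract can be illustrated with a short sketch. Below is a minimal PyTorch-style example, assuming a gate built from average pooling and a single linear layer plus a fixed confidence threshold; all module names, shapes, and the threshold value are assumptions made for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class LazyGate(nn.Module):
    """Hypothetical lightweight skip module: pools an intermediate
    feature map into a per-channel summary (retaining spatially
    aggregated information) and emits a confidence score that is
    used to decide the computational flow."""

    def __init__(self, channels: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # (N, C, H, W) -> (N, C, 1, 1)
        self.fc = nn.Linear(channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Confidence in [0, 1] that this sample is "easy".
        return torch.sigmoid(self.fc(self.pool(x).flatten(1)))


class LazyNet(nn.Module):
    """Pre-trained backbone split into an early and a late stage:
    easy samples exit through a small head after the early stage
    instead of running the expensive late stage."""

    def __init__(self, early: nn.Module, late: nn.Module,
                 early_head: nn.Module, late_head: nn.Module,
                 channels: int, threshold: float = 0.9):
        super().__init__()
        self.early, self.late = early, late
        self.early_head, self.late_head = early_head, late_head
        self.gate = LazyGate(channels)
        self.threshold = threshold  # assumed accuracy/latency trade-off knob

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.early(x)
        # At inference with batch size 1, skip the late stage when the
        # gate judges the sample easy; during training, always compute
        # the full path.
        if not self.training and self.gate(x).item() > self.threshold:
            return self.early_head(x)
        return self.late_head(self.late(x))
```

Raising the threshold in this sketch forces more samples through the full network (higher accuracy, less savings); lowering it skips more aggressively.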
Publisher
IEEE Computer Society
Issue Date
2022-10
Language
English
Citation
13th International Conference on Information and Communication Technology Convergence, ICTC 2022, pp. 495-497
DOI
10.1109/ICTC55196.2022.9953031
URI
http://hdl.handle.net/10203/312677
Appears in Collection
CS-Conference Papers (학술회의논문)