A brain-inspired network architecture for cost-efficient object recognition in shallow hierarchical neural networks

Abstract
The brain successfully performs visual object recognition with a limited number of hierarchical networks that are much shallower than the artificial deep neural networks (DNNs) that perform similar tasks. Here, we show that long-range horizontal connections (LRCs), often observed in the visual cortex of mammalian species, enable such cost-efficient visual object recognition in shallow neural networks. Using simulations of a model hierarchical network with convergent feedforward connections and LRCs, we found that adding LRCs to the shallow feedforward network significantly enhances image-classification performance, to a degree comparable to much deeper networks. We found that a combination of sparse LRCs and dense local connections dramatically increases performance per wiring cost. From network pruning with gradient-based optimization, we also confirmed that LRCs can emerge spontaneously when the total connection length is minimized while performance is maintained. Ablation of the emerged LRCs led to a significant reduction in classification performance, implying that these LRCs are crucial for image classification. Taken together, our findings suggest a brain-inspired strategy for constructing a cost-efficient network architecture that implements parsimonious object recognition under physical constraints such as shallow hierarchical depth.
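The connectivity pattern described above — dense local connections combined with sparse long-range horizontal connections, evaluated against a wiring-cost objective — can be sketched roughly as follows. This is a minimal NumPy illustration; the grid size, the local radius, the LRC distance threshold, and the LRC density are illustrative assumptions, not the parameters used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Units arranged on a 16x16 grid, standing in for one feature-map layer.
H = W = 16
coords = np.stack(np.meshgrid(np.arange(H), np.arange(W), indexing="ij"),
                  axis=-1).reshape(-1, 2)
n = H * W

# Pairwise Euclidean distances between unit positions:
# the "length" of each potential horizontal connection.
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

# Dense local connections: all pairs within a small radius (assumed radius 2).
local = (dist > 0) & (dist <= 2.0)

# Sparse LRCs: a small random subset (assumed 1%) of distant pairs (assumed > 8).
lrc = (dist > 8.0) & (rng.random((n, n)) < 0.01)

mask = local | lrc

# Random synaptic weights restricted to the allowed connections.
weights = rng.normal(size=(n, n)) * mask

# Total wiring cost: sum of |weight| * connection length — the kind of
# quantity minimized during pruning, under which LRCs emerge in the paper.
wiring_cost = np.abs(weights * dist).sum()
```

Because the LRC mask keeps only a small fraction of long-range pairs, most of the wiring budget stays in the dense local neighborhood while a few long connections span the grid, mirroring the sparse-LRC-plus-dense-local combination the abstract reports as most cost-efficient.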
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
Issue Date
2021-02
Language
English
Article Type
Article
Citation
NEURAL NETWORKS, v.134, pp.76 - 85
ISSN
0893-6080
DOI
10.1016/j.neunet.2020.11.013
URI
http://hdl.handle.net/10203/280461
Appears in Collection
BiS-Journal Papers (Journal Papers)
Files in This Item
1-s2.0-S0893608020304111-main.pdf (2.95 MB)