Provable Memorization via Deep Neural Networks using Sub-linear Parameters

It is known that O(N) parameters are sufficient for neural networks to memorize arbitrary N input-label pairs. By exploiting depth, we show that O(N^{2/3}) parameters suffice to memorize N pairs, under a mild condition on the separation of input points. In particular, deeper networks (even with width 3) are shown to memorize more pairs than shallow networks, which also agrees with the recent line of works on the benefits of depth for function approximation. We also provide empirical results that support our theoretical findings.
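The sub-linear scaling in the abstract can be illustrated with simple parameter-count arithmetic: a width-3 network whose depth grows like N^{2/3} has a total parameter count that also grows like N^{2/3}. The sketch below is purely illustrative (the helper `width3_param_count` is a hypothetical function, not the paper's construction), assuming a fully connected ReLU network with scalar input and output.

```python
def width3_param_count(depth, d_in=1, d_out=1):
    """Count weights and biases of a width-3 fully connected network
    with `depth` hidden layers (illustrative helper, not from the paper)."""
    total = 3 * d_in + 3              # input layer: d_in -> 3, plus biases
    total += (depth - 1) * (3 * 3 + 3)  # hidden layers: 3 -> 3, plus biases
    total += d_out * 3 + d_out        # output layer: 3 -> d_out, plus bias
    return total

# If depth is chosen on the order of N^(2/3), the parameter count is
# also on the order of N^(2/3), i.e. sub-linear in N.
for N in (10**3, 10**6):
    depth = round(N ** (2 / 3))
    print(N, width3_param_count(depth))
```

For N = 10^6 pairs this gives a depth-10000 network with roughly 1.2 x 10^5 parameters, well below the O(N) count needed by classical shallow constructions; the paper's contribution is showing such a narrow deep network can actually realize the memorization, not just afford the parameter budget.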
Publisher
JMLR (Journal of Machine Learning Research)
Issue Date
2021-08
Language
English
Citation

Conference on Learning Theory (COLT) 2021

URI
http://hdl.handle.net/10203/291826
Appears in Collection
RIMS Conference Papers