Learning Polymorphic Neural ODEs with Time-evolving Mixture

Cited 2 times in Web of Science; cited 0 times in Scopus
Neural ordinary differential equations (NODEs) offer a new way of viewing a deep residual network as a structure that is continuous in layer depth. However, NODEs suffer from a representational limit: they cannot learn all possible homeomorphisms of the input data space, so their performance quickly saturates even as the number of layers increases. Here, we show that simply stacking neural ODE blocks can improve performance by alleviating this issue. Furthermore, we suggest a more effective way of training neural ODEs by using a time-evolving mixture weight over multiple ODE functions, where the mixture weight itself evolves according to a separate neural ODE. We provide empirical results suggesting improved performance over stacked as well as vanilla neural ODEs, and we confirm that our approach can be combined orthogonally with recent advances in neural ODEs.
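
The mixture mechanism described in the abstract can be sketched in code. The following is a minimal PyTorch illustration, not the authors' implementation: K candidate vector fields f_k are blended by softmaxed logits w(t), and w(t) itself is integrated under a separate learned ODE. A fixed-step Euler loop stands in for whatever solver the paper actually uses; all module names, layer sizes, and the choice to make the gate depend only on (w, t) are assumptions for illustration.

```python
import torch
import torch.nn as nn

class PolymorphicODE(nn.Module):
    """Hypothetical sketch: K candidate dynamics mixed by time-evolving weights.

    The mixture logits w(t) follow their own small ODE, so the effective
    vector field can morph between the K candidates as integration time
    advances. Layer sizes and names are illustrative only.
    """

    def __init__(self, dim, num_funcs=4, hidden=64):
        super().__init__()
        # K candidate ODE functions f_k(y, t), each seeing the state and time
        self.funcs = nn.ModuleList(
            nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(),
                          nn.Linear(hidden, dim))
            for _ in range(num_funcs))
        # Separate ODE for the mixture logits: dw/dt = g(w, t)
        self.gate = nn.Sequential(nn.Linear(num_funcs + 1, hidden), nn.Tanh(),
                                  nn.Linear(hidden, num_funcs))
        self.w0 = nn.Parameter(torch.zeros(num_funcs))  # initial logits

    def forward(self, y, t0=0.0, t1=1.0, steps=20):
        # Joint fixed-step Euler integration of (y, w); an adaptive solver
        # (e.g., torchdiffeq's odeint) could replace this loop.
        dt = (t1 - t0) / steps
        w = self.w0
        for i in range(steps):
            t_now = t0 + i * dt
            t = torch.full((y.size(0), 1), t_now, device=y.device)
            alpha = torch.softmax(w, dim=0)  # mixture weights over the K fields
            dy = sum(a * f(torch.cat([y, t], dim=1))
                     for a, f in zip(alpha, self.funcs))
            tw = torch.tensor([t_now], device=y.device)
            dw = self.gate(torch.cat([w, tw]))  # evolve the mixture logits
            y = y + dt * dy
            w = w + dt * dw
        return y
```

A quick smoke test under these assumptions: `model = PolymorphicODE(dim=2); out = model(torch.randn(8, 2))` returns the integrated state y(t1) with shape (8, 2). Because the gate here sees only the logits and the current time, the mixture schedule is shared across all inputs; making it state-dependent would be a straightforward variant.
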
Publisher
IEEE COMPUTER SOC
Issue Date
2023-01
Language
English
Article Type
Article
Citation

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, v.45, no.1, pp.712 - 721

ISSN
0162-8828
DOI
10.1109/TPAMI.2022.3145013
URI
http://hdl.handle.net/10203/302130
Appears in Collection
AI-Journal Papers (Journal Papers)