Neuromimetic metaplasticity for adaptive continual learning without catastrophic forgetting

Cited 1 time in Web of Science; cited 0 times in Scopus.
Conventional intelligent systems based on deep neural network (DNN) models struggle to achieve human-like continual learning due to catastrophic forgetting. Here, we propose a metaplasticity model inspired by human working memory that enables DNNs to perform catastrophic-forgetting-resistant continual learning without any pre- or post-processing. A key aspect of our approach is implementing distinct synapse types, ranging from stable to flexible, and randomly intermixing them so that synaptic connections are trained with different degrees of flexibility. This strategy allowed the network to successfully learn a continuous stream of information, even under unexpected changes in input length. The model achieved a balanced tradeoff between memory capacity and performance without requiring additional training or structural modifications, dynamically allocating memory resources to retain both old and new information. Furthermore, the model demonstrated robustness against data-poisoning attacks by selectively filtering out erroneous memories, leveraging the Hebb repetition effect to reinforce the retention of significant data.
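The core mechanism described in the abstract, i.e. randomly intermixing synapses of different flexibility so that stable connections preserve old tasks while flexible ones absorb new data, can be sketched roughly as follows. This is a minimal illustration under assumptions, not the paper's actual implementation; the flexibility levels, shapes, and the per-synapse learning-rate scaling are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch: each synapse gets a fixed "flexibility" factor,
# randomly intermixed across the weight matrix. Stable synapses (small
# factor) change little and retain old information; flexible synapses
# (factor near 1) update freely and absorb new information.
n_in, n_out = 8, 4
weights = rng.normal(0.0, 0.1, size=(n_in, n_out))

# Randomly assign each synapse one of several flexibility levels
# (values here are assumptions, ordered stable -> flexible).
levels = np.array([0.05, 0.3, 1.0])
flex = rng.choice(levels, size=weights.shape)

def update(weights, grad, lr=0.1):
    """Plain SGD step, scaled per-synapse by its flexibility factor."""
    return weights - lr * flex * grad

# One illustrative step with a dummy gradient.
grad = rng.normal(size=weights.shape)
new_weights = update(weights, grad)

# Stable synapses move far less than flexible ones on average.
delta = np.abs(new_weights - weights)
print(delta[flex == levels[0]].mean(), delta[flex == levels[-1]].mean())
```

Because the flexibility factors are fixed and spatially intermixed, gradient updates for a new task are concentrated in the flexible subset of weights, which is one simple way to trade memory stability against plasticity without extra training phases or architectural changes.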
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
Issue Date
2025-10
Language
English
Article Type
Article
Citation

NEURAL NETWORKS, v.190

ISSN
0893-6080
DOI
10.1016/j.neunet.2025.107762
URI
http://hdl.handle.net/10203/330619
Appears in Collection
RIMS Journal Papers; BC-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
