References
- [1]
- [2]
- [3] What is Catastrophic Forgetting? - IBM. Also known as "catastrophic interference," this phenomenon causes trained networks to lose information related to old tasks when being trained on new data.
- [4] [PDF] Catastrophic Interference in Connectionist Networks - Michael McCloskey and Neal J. Cohen. Discusses the generality of the interference problem, drawing on arithmetic and retroactive interference results.
- [5] Overcoming catastrophic forgetting in neural networks - PNAS, Mar 14, 2017. This phenomenon, termed catastrophic forgetting, occurs specifically when the network is trained sequentially on multiple tasks.
- [6]
- [7] Catastrophic Interference in Connectionist Networks: The Sequential Learning Problem. New learning may interfere catastrophically with old learning when networks are trained sequentially.
- [8] Catastrophic Interference in Connectionist Networks: The Sequential Learning Problem - M. McCloskey and N. J. Cohen, 1989. Computer Science, Psychology.
- [9] (PDF) Catastrophic forgetting in connectionist networks. Examines the causes, consequences, and numerous solutions to the problem of catastrophic forgetting in neural networks.
- [10] [PDF] A Comprehensive Survey of Forgetting in Deep Learning Beyond ..., Jul 16, 2023. Notes that the concept of catastrophic forgetting was first formally introduced by McCloskey and Cohen [1].
- [11] Continual lifelong learning with neural networks: A review. Critically summarizes the main challenges linked to lifelong learning for artificial learning systems and compares existing neural network approaches.
- [12] [PDF] Using Semi-Distributed Representations to Overcome Catastrophic ... Presents an algorithm that allows semi-distributed representations to evolve, significantly reducing catastrophic forgetting.
- [13] (PDF) Catastrophic Interference is Eliminated in Pretrained Networks. Outlines the major cause of catastrophic interference in standard networks and describes recent approaches to the problem.
- [14] [1703.04200] Continual Learning Through Synaptic Intelligence - arXiv. Introduces intelligent synapses that bring some biological complexity into artificial neural networks.
- [15] [PDF] Learning without Forgetting - arXiv, Feb 14, 2017. Z. Li and D. Hoiem, "Learning without Forgetting," European Conference on Computer Vision, Springer, 2016, pp. 614-629.
- [16] Understanding Catastrophic Forgetting and Remembering in ... - arXiv, Feb 22, 2021. Notes that current approaches that deal with forgetting ignore the problem of catastrophic remembering.
- [17] The stability-plasticity dilemma: investigating the continuum from ... The stability-plasticity dilemma is a well-known constraint for artificial and biological neural systems.
- [18] Progressive learning: A deep learning framework for continual ... A deep learning framework for continual learning that comprises three procedures: curriculum, progression, and pruning.
- [19]
- [20] [1606.04671] Progressive Neural Networks - arXiv, Jun 15, 2016. Progressive networks are immune to forgetting and can leverage prior knowledge via lateral connections.
- [21] Memory Efficient Continual Learning with Transformers. Devises a method to incrementally train a model on a sequence of tasks using pre-trained Transformers extended with Adapters.
- [22] [2308.08747] An Empirical Study of Catastrophic Forgetting in Large ... - arXiv, Aug 17, 2023. Experiments reveal that catastrophic forgetting is generally observed in LLMs ranging from 1B to 7B parameters.
- [23] New Algorithm Enables Neural Networks to Learn Continuously - Caltech, Oct 9, 2024. Caltech researchers have developed a new type of algorithm that enables neural networks to be continuously updated with new data.
- [24] Hybrid neural networks for continual learning inspired by ... - Nature, Feb 2, 2025. A brain-inspired algorithm that mitigates catastrophic forgetting in artificial and spiking neural networks with low computational cost.
- [25] [2510.23756] Explaining Robustness to Catastrophic Forgetting ... - arXiv, Oct 27, 2025. Catastrophic forgetting remains a central challenge in continual learning, where models are required to integrate new knowledge over time.
- [26] Humans and neural networks show similar patterns of transfer and ..., Oct 30, 2025. Directly compares humans and linear ANNs on how catastrophic interference relates to transfer during continual learning.
- [27] Catastrophic Forgetting in LLMs: A Comparative Analysis Across ..., Apr 1, 2025. Evaluates continual fine-tuning of various open-source LLMs of different parameter sizes (models under 10 billion parameters).
- [28] Mitigating Catastrophic Forgetting in Large Language Models with ... Proposes the Forgetting-Aware Pruning Metric (FAPM), a pruning-based approach to balance catastrophic forgetting and downstream task performance.
- [29] Mitigating Catastrophic Forgetting in Large Language Models with ... Proposes Self-Synthesized Rehearsal (SSR), a framework that uses the LLM to generate synthetic instances for rehearsal.
- [30] Understanding Catastrophic Forgetting in Continual Learning, Jun 11, 2025. Proposes a set of metrics to evaluate models learning over a continuum of data.
- [31] New Study Warns of Catastrophic Overtraining in Large ... - HPCwire, Apr 3, 2025. When researchers added Gaussian noise to pre-trained models, performance became significantly worse with increasing pre-training.