
Decay theory

Decay theory is a model proposing that forgetting occurs due to the passive deterioration or fading of memory traces over time, particularly in short-term or working memory, when information is not actively rehearsed or retrieved. The theory posits that trace strength diminishes gradually as neural activation dissipates, contrasting with alternative explanations such as interference from competing information. The roots of decay theory trace back to early empirical work on forgetting, such as Hermann Ebbinghaus's 1885 forgetting curve, which demonstrated rapid memory loss over time for nonsense syllables, though Ebbinghaus emphasized disuse rather than explicit neural decay. The modern formulation emerged during the cognitive revolution of the 1950s, with John Brown introducing it in 1958 to explain forgetting in immediate serial recall tasks, where participants lost memory for consonant trigrams after delays as short as 1.5 seconds, even without distracting stimuli. This was quickly corroborated by Lloyd and Margaret Peterson in 1959, whose experiments showed near-complete forgetting of trigrams after 18 seconds of simple arithmetic tasks, attributing the loss to time-based decay revealed once rehearsal was minimized.

Decay theory has primarily been applied to short-term memory, where it suggests a limited capacity constrained by rapid trace decay at rates estimated between 0.125 and 1 item per second. Proponents such as Nelson Cowan argue in later models that it accounts for phenomena like the phonological similarity effect in verbal recall, where similar items are lost faster due to overlapping traces. Evidence supporting pure decay includes low-interference paradigms, such as visual working memory tasks showing limits tied to retention duration rather than item count. However, the theory faces significant criticism for lacking direct neurobiological evidence of passive trace fading, with many researchers favoring accounts that explain time-dependent forgetting as arising from proactive or retroactive interference between memories. For instance, studies by Stephan Lewandowsky and colleagues in 2009 demonstrated that apparent decay effects disappear when interference is controlled, suggesting that time alone does not cause forgetting.
Despite these challenges, hybrid models like the time-based resource-sharing framework of Pierre Barrouillet and colleagues (2004) integrate temporal decay with attention-based refreshing, maintaining the theory's relevance in contemporary research. In long-term memory, decay theory is less influential, as evidence points more toward retrieval failure or interference processes than simple fading. Overall, while decay theory provides an intuitive explanation aligned with everyday experiences of memories fading, its empirical support remains debated, fueling ongoing investigation into the mechanisms of forgetting.

Overview and Fundamentals

Core Principles

Decay theory posits that memory traces weaken or fade over time through passive processes, independent of interference from other memories or activities. The model emphasizes an intrinsic deterioration of the memory representation, where the strength of the encoded trace diminishes solely as a result of the passage of time since initial learning. A core assumption of decay theory is structural decay, in which trace strength decreases either exponentially or linearly as a function of time elapsed since encoding. Without active maintenance, the neural or representational trace erodes progressively, leading to reduced accessibility or complete loss. The theory's foundational illustration is the Ebbinghaus-inspired forgetting curve, which depicts retention as declining rapidly at first and then more gradually; a common mathematical representation is the formula R = e^{-t/s}, where R is retention, t is time since learning, and s is a parameter reflecting the relative strength or durability of the trace. In sensory and perceptual memory, decay occurs rapidly, typically over seconds, as fleeting inputs like visual icons or auditory echoes fade without transfer to more durable stores. While decay theory originated in explanations of short-term memory limitations, it has been extended conceptually to other memory types, though the rate and mechanisms may vary. Central to the theory is the role of disuse: memories decay in the absence of rehearsal or retrieval, with time serving as the primary independent variable driving forgetting, underscoring a passive rather than active erasure.
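The forgetting-curve formula above is easy to evaluate numerically. A minimal sketch of the R = e^{-t/s} form (the time units and the particular strength values below are illustrative, not from any cited dataset):

```python
import math

def retention(t, s):
    """Ebbinghaus-style retention: R = exp(-t / s).

    t: time since learning (arbitrary units); s: strength parameter in the
    same units -- a larger s means a more durable trace."""
    return math.exp(-t / s)

# Doubling the strength parameter slows fading: after 24 time units,
# a weak trace (s = 12) has faded far more than a strong one (s = 48).
weak = retention(24, 12)    # e^-2, about 14% retained
strong = retention(24, 48)  # e^-0.5, about 61% retained
```

The same function reproduces the curve's characteristic shape: steep loss at first, then a long shallow tail as t grows relative to s.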

Comparison to Interference Theory

Interference theory posits that forgetting arises from competition between similar memory traces rather than the mere passage of time. In this framework, proactive interference occurs when older memories hinder retrieval of newer ones, while retroactive interference happens when subsequent learning disrupts recall of prior material. These processes require overlapping or similar content between memories, emphasizing active disruption over passive fading. Decay theory, by contrast, emphasizes passive temporal erosion of traces in the absence of rehearsal or use, without requiring interference from other memories. A key divergence lies in the mechanisms: decay operates through time-dependent weakening of isolated traces, whereas interference demands intervening activities or materials that overlap with the target memory. For instance, under decay theory an individual might forget a recently heard phone number after a delay due to the lapse of time alone; under interference theory, forgetting the same number would stem from subsequently learning a similar sequence, such as another phone number. These differences yield distinct predictions about forgetting patterns. Decay theory anticipates uniform rates of memory loss over time when no interfering events occur, reflecting a steady temporal process. Interference theory, however, predicts variability in forgetting based on the similarity and recency of competing materials, with greater disruption from highly overlapping interveners. Early theoretical debates often framed decay and interference as mutually exclusive accounts of forgetting, with proponents like McGeoch arguing against time-based disuse in favor of competitive processes. Contemporary perspectives recognize that both mechanisms can coexist in hybrid models, where decay serves a functional role in mitigating interference by clearing outdated traces.
Cause of forgetting — Decay theory: passage of time leading to trace erosion, independent of other activities. Interference theory: competition from similar prior (proactive) or subsequent (retroactive) memories.
Testability — Decay theory: assessed in experiments via time delays without intervening tasks. Interference theory: evaluated through controlled exposure to similar or dissimilar lists or materials.
Scope — Decay theory: primarily applies to short-term and sensory traces fading over time, with conceptual extensions to other memory types. Interference theory: primarily affects verbal, associative, or otherwise similar content prone to overlap.
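The contrasting predictions can be made concrete with a deliberately toy sketch. Both loss functions and their rate constants below are illustrative assumptions invented for this example, not parameters from either literature:

```python
def decay_loss(delay_s, rate=0.04):
    """Toy decay prediction: forgetting grows with elapsed time only."""
    return min(1.0, rate * delay_s)

def interference_loss(n_similar, strength=0.15):
    """Toy interference prediction: forgetting grows with the number of
    similar competing items, regardless of elapsed time."""
    return min(1.0, strength * n_similar)

# A long but empty delay produces loss only under decay; a brief burst
# of similar material produces loss only under interference.
quiet_wait = (decay_loss(10), interference_loss(0))   # decay hurts, interference doesn't
busy_moment = (decay_loss(1), interference_loss(4))   # interference hurts, decay barely does
```

The point of the contrast is experimental: manipulating delay with no intervening material tests the first function, while holding delay constant and varying similar interveners tests the second.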

Historical Development

Early Formulations

The early formulations of decay theory emerged in the late nineteenth century, rooted in experimental investigations of human memory. Hermann Ebbinghaus is widely recognized as the foundational figure: in his seminal 1885 monograph Über das Gedächtnis (translated as Memory: A Contribution to Experimental Psychology), he introduced the concept of forgetting as a spontaneous, time-dependent process. Through self-experiments using nonsense syllables to minimize prior associations, Ebbinghaus observed that retention of learned material declined progressively over intervals of time, even in the absence of interfering factors, laying the groundwork for viewing memory traces as subject to natural dissipation. This idea drew from the broader tradition of associationist psychology, which portrayed memory as a network of mental connections between ideas or sensations that could weaken or fade, like natural entropy, if not actively maintained, predating more formalized models of forgetting. Earlier associationists had emphasized how ideas link through contiguity and resemblance, implying that unused links might dissipate over time and providing a philosophical precursor to empirical accounts of decay. In the early twentieth century, Edward Thorndike extended these notions to associative learning in animals and humans. In his 1913 work Educational Psychology, Volume II: The Psychology of Learning, Thorndike proposed the "law of disuse," asserting that the bonds or connections between stimuli and responses weaken and decay when not exercised or reinforced over time. However, Thorndike later revised this view in 1932, concluding on the basis of further evidence that disuse does not reliably weaken connections. The principle complemented his connectionist framework, in which learning formed stimulus-response (S-R) associations that faded without repetition, offering a mechanistic explanation for the erosion of habits and skills.
Amid the rising influence of behaviorism in the early 1900s, decay theory aligned well with the movement's emphasis on observable processes, as it accounted for the loss of learned behaviors and skills without relying on introspective or mentalistic constructs. Thorndike's formulations, in particular, bridged associationism and emerging behaviorist paradigms by framing decay as a passive weakening of neural pathways, influencing subsequent research on the persistence of learning.

Key Experimental Foundations

One of the earliest empirical foundations for decay theory came from Hermann Ebbinghaus's self-experiments in the 1880s, where he systematically studied forgetting through the serial learning of nonsense syllables composed of meaningless letter combinations. Using the "savings" method—comparing the time required for initial learning with the time needed for relearning after retention intervals ranging from 20 minutes to over a day—Ebbinghaus observed that the amount of forgetting increased progressively with the passage of time, even in the absence of interfering material. This time-correlated pattern of retention loss provided initial quantitative evidence for memory traces fading through disuse over extended periods. Building on this, John Brown's 1958 experiments tested decay in immediate memory using lists of 10 consonant syllables, with participants counting aloud during retention intervals to prevent rehearsal. Brown's results showed that recall probability declined linearly with the duration of the filled interval, up to 15 seconds, supporting a time-based decay process independent of the number of intervening items. This approach isolated temporal factors in short-term forgetting, laying groundwork for subsequent paradigms. The Peterson and Peterson task of 1959 further refined this methodology to examine short-term memory decay, presenting auditory consonant trigrams followed immediately by a three-digit number from which participants counted backward as a distractor to suppress rehearsal. With 24 participants, recall accuracy was nearly perfect at 3 seconds but fell to about 10% correct after 18 seconds, indicating rapid time-dependent loss in short-term storage. This roughly 90% forgetting over 18 seconds implies an exponential decay with a half-life of roughly 5 seconds for such verbal material.
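The half-life figure can be checked directly. Assuming a pure exponential form R(t) = e^{-t/τ} and the reported 10% recall at 18 seconds, the time constant and half-life follow from solving for τ (a back-of-the-envelope check, not a fit to the full dataset):

```python
import math

# Peterson & Peterson (1959): roughly 10% correct recall after 18 s.
t, R = 18.0, 0.10

# Solve R = exp(-t / tau) for tau, then convert to a half-life.
tau = -t / math.log(R)           # time constant, about 7.8 s
half_life = tau * math.log(2)    # about 5.4 s -- the "roughly 5 seconds" figure
```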
Extensions of the Brown-Peterson paradigm in the late 1950s and early 1960s incorporated varied distractor tasks, such as non-verbal or spatial activities, to control for verbal rehearsal and more cleanly attribute performance declines to decay rather than proactive or retroactive inhibition. For instance, using distractors that minimized phonological overlap with the trigrams preserved the observed time-based forgetting curve, reinforcing decay as the primary mechanism in isolated short-term retention. Waugh and Norman's 1965 study employed a probe-digit technique on sequences of spoken digits presented at rates of 1 or 4 per second, with a probe digit cueing recall after a variable number of intervening items. With four participants, they found that the probability of an item remaining in primary memory decreased gradually with the number of intervening items and with time since presentation, and they estimated that interference from new items primarily drives forgetting, though presentation rate (affecting elapsed time) also influenced recall. This work quantified forgetting factors in immediate memory, highlighting interactions between time and capacity overload in auditory stimuli.

Empirical Evidence

Support in Short-Term Memory

Evidence from sensory memory systems provides foundational support for decay theory in short-term memory processes. In the visual modality, iconic memory stores brief images for approximately 250 milliseconds to 1 second before rapid decay occurs, as demonstrated by Sperling's partial report experiments, where recall accuracy declined sharply with delays in cue presentation, revealing decay curves in information availability. Similarly, echoic memory in the auditory domain retains sounds for 2 to 4 seconds, with partial report paradigms showing time-based loss of phonetic detail without external interference, as evidenced by studies adapting Sperling's method to auditory stimuli. Short-term memory (STM) studies further affirm decay through tasks that isolate temporal factors from interference. In low-interference paradigms preventing rehearsal, such as recent-probes tasks with digits, memory traces exhibit rapid initial decay within approximately 3 seconds, followed by a plateau, even in the absence of proactive or retroactive interference. These findings align with early demonstrations like the Peterson and Peterson task, where consonant trigrams faded over filled delays of similar durations. Quantitative models of short-term memory incorporate decay parameters to explain these patterns. Baddeley's phonological loop model posits a passive store in which trace strength diminishes exponentially over time, characterized by the equation S(t) = S_0 e^{-t/\tau} with \tau \approx 2 seconds, reflecting the rapid fading of verbal information unless refreshed by subvocal rehearsal. Recent studies suggest nuances, such as familiar items resisting decay more than unfamiliar ones in recognition tasks, potentially due to stronger long-term associations. Cross-species evidence reinforces the universality of time-based decay in short-term systems.
In monkeys performing delayed matching-to-sample tasks, accuracy decreases monotonically with delay duration up to 30 seconds, even under conditions minimizing interference, accompanied by decaying neuronal firing rates in prefrontal areas that mirror the behavioral loss.
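The phonological-loop equation above is straightforward to sketch. The refresh schedule below is a simplifying assumption added for illustration: neither the 1.5 s rehearsal period nor the full-restore rule comes from Baddeley's model.

```python
import math

def trace_strength(t, s0=1.0, tau=2.0):
    """Passive phonological-store decay: S(t) = S0 * exp(-t / tau), tau ~ 2 s."""
    return s0 * math.exp(-t / tau)

def strength_with_refresh(t, period, s0=1.0, tau=2.0):
    """Assume subvocal rehearsal restores the trace to S0 every `period`
    seconds (an illustrative assumption), so only the time since the
    last refresh contributes to decay."""
    return trace_strength(t % period, s0, tau)

# Unrefreshed, a verbal trace falls below 10% of S0 within about 5 s;
# refreshed every 1.5 s, it never drops below roughly 47% of S0.
unrefreshed = trace_strength(5.0)
refreshed_floor = trace_strength(1.5)
```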

Applications to Long-Term Memory

Proposals for decay in long-term memory (LTM) suggest a slow, asymptotic fading of memory traces over extended periods, often spanning years or decades, in contrast to the rapid decay seen in shorter-term processes. A seminal study by Bahrick (1984) examined retention of Spanish learned in high school or college among 733 participants up to 50 years after instruction, revealing retention curves that decline exponentially during the first 3-6 years, plateau with minimal loss for approximately 25 years, and then exhibit a gradual further decline, with substantial portions of semantic knowledge—such as vocabulary recognition—remaining accessible even after half a century without rehearsal. This pattern supports the idea of a "permastore" for overlearned material, in which decay slows to near-asymptotic levels, allowing persistent but imperfect retention in LTM. In skill-based LTM, such as motor memories acquired by musicians or athletes, decay manifests during prolonged disuse, leading to measurable losses in proficiency, though relearning often benefits from residual savings. Research on procedural skill retention indicates that motor abilities degrade gradually with nonuse, with effect sizes of approximately d = -0.9 across intervals of nonuse, influenced by factors like task complexity and degree of original learning. For instance, studies of musicians demonstrate that fine motor control for instruments deteriorates during extended breaks, but relearning times are reduced compared to novices, with savings persisting even after years of inactivity, albeit diminishing progressively. Quantitative analyses suggest that relearning savings diminish over extended disuse, highlighting decay's role in eroding procedural LTM efficiency without complete erasure. Distinctions between episodic and semantic LTM further illustrate differential decay rates, with event-based episodic memories fading more rapidly than abstract semantic knowledge.
Episodic memories, particularly vivid ones like flashbulb recollections of major public events, lose peripheral details over time despite high initial confidence in their accuracy; for example, retellings of memories of September 11, 2001, showed significant inconsistencies and detail erosion after 10 years, consistent with gradual trace decay. In contrast, semantic memories—such as factual knowledge or vocabulary—exhibit greater stability, with slower asymptotic fading that aligns with the permastore concept observed in language retention studies. Mathematical models of LTM decay often employ power-law functions to capture this protracted fading, particularly for verbal recall over long intervals. The retention function R(t) = t^{-\alpha}, where t is time and \alpha \approx 1 for extended retention periods, has been derived from analyses of free recall data and fits empirical curves better than exponential models for LTM phenomena like word-list retention. Large-scale corpus-based studies of word usage and recall corroborate this, showing power-law decay in the accessibility of linguistic items over decades, reflecting how LTM traces weaken with disuse. Emerging longitudinal research from the 2010s addresses gaps in understanding LTM decay during developmental periods, such as infancy and early childhood, where early episodic memories fade rapidly due to immature neural mechanisms. Cohort studies link this accelerated decay to synaptic pruning in the hippocampus during early childhood, a developmental process that eliminates excess connections and contributes to the boundary of autobiographical recall around age 3-4 years, with quantitative models estimating near-total loss of pre-verbal memories over time. These findings extend decay theory to LTM formation, emphasizing how structural changes amplify trace weakening during vulnerable periods.
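The contrast between power-law and exponential retention can be seen numerically. In the sketch below, matching the two curves at t = 10 and using the α = 1 exponent are arbitrary choices for illustration:

```python
import math

def power_retention(t, alpha=1.0):
    """Power-law retention R(t) = t**(-alpha), defined for t >= 1."""
    return t ** (-alpha)

def exp_retention(t, tau):
    """Exponential retention R(t) = exp(-t / tau)."""
    return math.exp(-t / tau)

# Calibrate the exponential so both curves agree at t = 10,
# then compare their long tails.
tau = -10 / math.log(power_retention(10))

tails = (power_retention(100), exp_retention(100, tau))
# The power law still retains 1% at t = 100, while the matched
# exponential has collapsed to roughly 1e-10; this heavy tail is why
# power functions fit decades-long retention data better.
```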

Criticisms and Limitations

Inconsistencies in Working Memory

Working memory, as proposed in Baddeley's influential model, comprises a central executive that oversees attention and coordination, alongside subordinate systems like the phonological loop for verbal material and the visuospatial sketchpad for visual-spatial information, enabling active maintenance and manipulation of information over brief periods. Within this framework, decay theory posits that traces in these storage buffers fade passively over time unless refreshed through mechanisms such as articulatory rehearsal in the phonological loop. However, this assumption encounters significant challenges in dynamic contexts involving active processing, where rehearsal is often disrupted or unavailable. A primary inconsistency arises from dual-task paradigms, which reveal stable error rates in recall performance across varying retention intervals, indicating an absence of pure time-based forgetting when interference is controlled. For instance, studies from the late 1970s and 1980s, building on Baddeley and Hitch's foundational work, demonstrated that accuracy in verbal working memory tasks remained largely constant despite extended delays under articulatory suppression conditions that prevented rehearsal, with deficits attributable primarily to the interfering demands of concurrent processing rather than elapsed time. These findings underscore that loss in such setups stems from resource competition or proactive interference, not temporal degradation alone. Methodological challenges further complicate isolating decay in working memory, as experiments struggle to disentangle time-based loss from output interference or transient attentional shifts. In complex span tasks, for example, Conway et al.'s review highlights how variations in processing intervals confound interpretation, with secondary tasks introducing unavoidable interference that masks any potential decay effects.
Similarly, in n-back tasks—which require ongoing monitoring and updating of sequences—performance plateaus at moderate loads (e.g., 2-back), suggesting that intrusions from successive items, modulated by factors such as stimulus-onset asynchrony, rather than trace weakening over time, dictate error patterns. Controlling for verbal suppression in these paradigms reveals minimal evidence of independent time-based loss, as recall errors correlate more strongly with interference load than with delay duration. The tension extends theoretically: if decay were the dominant mechanism, working memory capacity—often cited as stable at around 7 ± 2 chunks over seconds-long intervals, per Miller's seminal analysis—should erode progressively with time, yet its empirical stability points to interference as the primary limiter. Such paradoxes imply that decay theory inadequately explains working memory in active manipulation scenarios, favoring models that emphasize interference dominance.

Challenges from System Interactions

One key challenge to decay theory arises in the multi-store model of memory proposed by Atkinson and Shiffrin, in which short-term memory (STM) traces are expected to decay over time unless rehearsed into long-term memory (LTM) via maintenance rehearsal. However, empirical observations reveal that transfer rates from STM to LTM do not align with predictions of isolated temporal decay, as proactive and retroactive interference frequently disrupt the process independently of duration in STM. For instance, studies controlling for rehearsal show that interference from competing traces during the transfer window explains consolidation failures better than passive fading does, undermining the model's reliance on time-based loss. Interactions between memory systems further highlight decay theory's limitations, particularly in scenarios where active processes override temporal erosion. During sleep, hippocampal replay of recent experiences—occurring as sharp-wave ripples—strengthens engram cells and counteracts synaptic weakening, prioritizing vulnerable memories for consolidation while interference from unrelated neural activity is minimized. This mechanism demonstrates that in transitional phases, such as the passage from STM to LTM, forgetting is dominated by dynamic interactions rather than inevitable time-dependent decline, as disrupting replay leads to impaired retention despite short delays. Empirical challenges are also evident in free recall paradigms from the 1970s, such as those of Tulving, which showed that loss of accessibility is primarily driven by mismatches between encoding and retrieval contexts rather than elapsed time alone. In these experiments, recall probability remained stable over intervals when cues matched original encoding conditions but plummeted with contextual shifts, indicating that decay cannot account for variability in retrieval success across systems. Cross-system dynamics, particularly in hippocampal-prefrontal loops, reveal that decay is not purely temporal but is modulated by attentional mechanisms that stabilize or accelerate trace weakening.
Theta and gamma oscillations between these regions enhance coordination during attention-demanding tasks, countering potential decay by reactivating neural ensembles and integrating sensory inputs with stored representations. This attentional gating explains why memory persistence varies with cognitive load, challenging decay theory's emphasis on uniform time-based fading. Quantitative analyses of forgetting in compound tasks, which engage multiple systems simultaneously, underscore these issues: pure decay models predict steeper forgetting curves than observed, explaining less than 30% of variance in retention, while factors like interference and neural noise account for the remainder. For example, in visual working memory tasks with overlaid features, increasing internal noise over time fits the data better than systematic decay does, with time alone predicting minimal performance decline amid interactive demands.

Contemporary Advances

Hybrid Theoretical Models

Hybrid theoretical models in decay theory seek to reconcile the time-based fading of memory traces with other mechanisms, such as interference, by integrating them into unified frameworks that better account for observed patterns in both short- and long-term memory. These models address limitations of pure decay accounts by incorporating contextual or associative factors that modulate the rate or impact of temporal decay, yielding more robust explanations of empirical phenomena like serial position effects and load-dependent forgetting. A prominent example is the time-based resource-sharing (TBRS) model, which combines time-based decay—where memory traces weaken in proportion to the duration of unattended intervals—with interference arising from the processing of distractors in complex span tasks. In this framework, attention refreshes traces to counteract decay, but cognitive load from secondary tasks reduces refreshing opportunities, so longer retention intervals exacerbate forgetting under high load. Developed by Barrouillet and colleagues, TBRS simulates key effects, such as reduced recall in dual-task conditions, outperforming interference-only models in capturing time-sensitive declines. In the domain of episodic memory, contextual binding theories treat decay as the progressive loss of the temporal or situational context bindings that link items to their encoding episode, integrated with interference from overlapping contexts in retrieval cues. Yonelinas, Ranganath, Ekstrom, and Wiltgen's contextual binding theory (CBT) frames hippocampal-dependent episodic recall as dependent on retrieving these bindings: time-based decay weakens binding strength, and interference occurs when similar contexts activate competing bindings, leading to source misattribution or omissions. This integration explains why episodic forgetting accelerates with temporal distance yet is mitigated by distinctive cues, as seen in studies of recognition memory where context reinstatement reduces decay-like losses.
CBT thus positions decay not as passive dissipation but as context-specific degradation vulnerable to proactive and retroactive interference. Computational implementations further exemplify hybrids, notably in the ACT-R cognitive architecture, where declarative memory activation incorporates a decay term alongside associative spreading. The activation equation is A_i = B_i + \sum_j W_j S_{ji}, where B_i is the base-level activation reflecting recency and frequency via B_i = \ln\left(\sum_k t_k^{-d}\right) (with t_k the times since previous accesses and d the decay parameter, typically 0.5), and \sum_j W_j S_{ji} captures spreading activation from related knowledge (analogous to interference facilitation or competition). This blends passive temporal decay with dynamic influence from the knowledge network, enabling simulations of memory tasks in which activation noise leads to probabilistic forgetting. ACT-R models using this hybrid mechanism predict error patterns in problem-solving and learning scenarios, such as the fan effect, where additional associations increase retrieval latency and accelerate effective forgetting. These hybrid models offer advantages over pure decay theories by achieving superior fits to complex task data and resolving inconsistencies like the absence of pure time effects in low-load conditions. Recent developments in the 2020s extend these hybrids to machine learning analogs, where recurrent neural networks (RNNs) simulate memory decay through time-weighted parameters integrated with interference-mitigating gates. In long short-term memory (LSTM) networks, the forget gate applies a sigmoid-modulated decay to cell states based on prior hidden states and inputs, effectively combining temporal fading with selective retention against input interference, as demonstrated in tasks modeling human working memory capacity limits. Such models replicate decay-like forgetting curves in sequence learning while capturing interference from sequence overlaps, bridging cognitive theory with AI applications in natural language processing.
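The ACT-R activation equation can be rendered directly. In this minimal sketch, the access times and source weights are made-up illustrations, and real ACT-R additionally includes activation noise and partial-matching penalties:

```python
import math

def base_level_activation(access_times, now, d=0.5):
    """ACT-R base-level learning: B = ln( sum_k (now - t_k)^(-d) ).

    access_times: times of previous retrievals of the chunk;
    d: decay parameter (0.5 is the conventional default)."""
    return math.log(sum((now - t) ** (-d) for t in access_times))

def activation(base, sources):
    """Total activation A_i = B_i + sum_j W_j * S_ji: base-level decay
    plus spreading activation from associated sources (W_j, S_ji pairs)."""
    return base + sum(w * s for w, s in sources)

# A fact retrieved recently and often keeps a higher base level than
# one last touched long ago -- the decay half of the hybrid.
recent = base_level_activation([90, 95, 99], now=100)
stale = base_level_activation([1, 5, 10], now=100)
```

Spreading activation then raises or lowers the total depending on how strongly the current context is associated with the chunk, which is where interference-like competition enters the same equation.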

Neurobiological and Neuronal Evidence

Neurobiological evidence for decay theory draws on synaptic mechanisms that could underlie the passive weakening of memory traces over time. Long-term potentiation (LTP), a cellular model of learning and memory, can reverse through depotentiation, in which previously potentiated synapses return to baseline strength via mechanisms such as long-term depression (LTD) or active forgetting processes. This reversal is mediated by removal of AMPA-type glutamate receptors (AMPARs) from synapses, leading to synaptic weakening that aligns with behavioral decay timelines; in short-term contexts, for instance, such fading occurs on scales of seconds to minutes, matching the rapid loss observed in working memory tasks. In hippocampal circuits, depotentiation involves NMDA receptor activation and phosphatase signaling, promoting the removal of GluA2-containing AMPARs from the postsynaptic membrane. Neuroimaging studies provide further support by demonstrating temporal dynamics in brain activity consistent with trace decay. Functional magnetic resonance imaging (fMRI) research has shown that blood-oxygen-level-dependent (BOLD) signals in the hippocampus decline over retention intervals when rehearsal is prevented, reflecting a passive weakening of neural representations. For example, in associative memory paradigms, hippocampal activation during maintenance phases fades without active engagement, correlating with poorer subsequent retrieval performance. Positron emission tomography (PET) and fMRI data similarly indicate reduced metabolic and hemodynamic activity in medial temporal structures over delays, underscoring decay as a default process in the absence of rehearsal. At the cellular level, experiments on hippocampal slices reveal passive rundown of excitatory postsynaptic potentials (EPSPs), where synaptic responses diminish over minutes without ongoing stimulation. These studies demonstrate that EPSP amplitudes decrease due to gradual receptor desensitization and reduced transmitter release probability, mimicking the spontaneous weakening posited by decay theory.
Such rundown is activity-independent in controlled slice preparations, highlighting an intrinsic synaptic instability that contributes to trace fading across timescales. Molecular investigations further elucidate how decay is accelerated under specific conditions, particularly through disruptions in the protein synthesis essential for long-term memory (LTM) stabilization. Inhibition of protein synthesis, as explored in foundational consolidation work, renders consolidated memories labile and prone to rapid destabilization, effectively hastening their decay by preventing the maintenance of synaptic changes. The process involves the reversal of consolidation phases, where blocking de novo protein production after learning leads to accelerated loss of LTM traces in the hippocampus. Recent advances in the 2020s using optogenetics have provided direct evidence of targeted weakening within engrams in rodents. Optogenetic manipulation of engram cells in the hippocampus and amygdala during fear conditioning reveals that these ensembles weaken over time, with fear memory strength declining over days to weeks when reactivation is withheld. For instance, chronic low-level stimulation or inhibition of tagged engram neurons demonstrates reversible modulation of fear responses, where natural forgetting correlates with reduced engram excitability and synaptic connectivity on timescales of 1-2 weeks. These findings highlight active yet time-bound processes in engram destabilization, complementing passive synaptic mechanisms.

References

  1. [1]
    Decay Theory of Immediate Memory: From Brown (to Today (2014)
    In this work, Brown proposed a theory of forgetting based upon memory traces that lose activation, or decay, with the passage of time.
  2. [2]
  3. [3]
  4. [4]
  5. [5]
  6. [6]
    Forgetting Details in Visual Long-Term Memory: Decay or ...
    Jul 19, 2022 · Two main explanations for memory loss have been proposed. On the one hand, decay theories consider that over time memory fades away.
  7. [7]
    Replication and Analysis of Ebbinghaus' Forgetting Curve - PMC - NIH
    Jul 6, 2015 · We present a successful replication of Ebbinghaus' classic forgetting curve from 1880 based on the method of savings.
  8. [8]
    8.1 How Memory Functions - Psychology 2e | OpenStax
    Apr 22, 2020 · Memory trace decay and interference are two factors that affect short-term memory retention. Peterson and Peterson (1959) investigated short- ...
  9. [9]
    [PDF] Interference theory: History and current status - University of Waterloo
    underlying forgetting: John McGeoch's paper in 1932 emphasizing the key role of retroactive interference and Benton Underwood's paper in 1957 emphasizing ...
  10. [10]
    The Functional Relationship of Decay and Interference
    Aug 7, 2025 · Functional decay theory proposes that decay and interference, historically viewed as competing accounts of forgetting, are instead functionally related.
  11. [11]
    [PDF] Memory; a contribution to experimental psychology
    TRANSLATORS' INTRODUCTION. The publication by Ebbinghaus of the results of his experi- mental investigation of memory (1885) marks the application of.<|control11|><|separator|>
  12. [12]
    Associationist Theories of Thought
    Mar 17, 2015 · Associationism is a theory that connects learning to thought based on principles of the organism's causal history.
  13. [13]
    Associationism in the Philosophy of Mind
    Association is variously treated as a relation between functionally defined representational mental states such as concepts, “subrepresentational” states.
  14. [14]
    The psychology of learning : Thorndike, Edward L ... - Internet Archive
    Aug 26, 2015 · The psychology of learning. by: Thorndike, Edward L. (Edward Lee), 1874-1949; University of Leeds. Library. Publication date: 1913. Publisher ...
  15. [15]
    Edward Thorndike: The Law of Effect - Simply Psychology
    Oct 23, 2025 · Law of Effect: Stated that behaviors followed by positive outcomes are strengthened, while those followed by negative ones are weakened.
  16. [16]
    Thorndike (1911) Chapter 5 - Classics in the History of Psychology
    "The law may be expressed briefly as follows:-- The resolution of one physiological state into another becomes easier and more rapid after it has taken place a ...
  17. [17]
    Ebbinghaus (1885/1913) Chapter 1
    Memory: A Contribution to Experimental Psychology. Hermann Ebbinghaus (1885) ... Originally published in New York by Teachers College, Columbia University.
  18. [18]
    Some tests of the decay theory of immediate memory
    Apr 7, 2008 · Some tests of the decay theory of immediate memory. John Brown Department of Psychology, Birkbeck College, University of London. Pages 12-21 ...
  19. [19]
    Short-term retention of individual verbal items - ResearchGate
    Sep 27, 2025 · Short-term retention of individual verbal items. September 1959; Journal of Experimental Psychology 58(3):193-198 ... Peterson; Peterson ...
  20. [20]
    [PDF] Sperling, G. (1960). The information available in brief visual ...
    For all stimuli and for all Ss, the available information calculated from the partial report is greater than that contained in the immediate-memory report.
  21. [21]
    [PDF] An auditory analogue of the sperling partial report procedure
    Apr 1, 1972 · An auditory analogue of the sperling partial report procedure: Evidence for brief auditory storage · C. Darwin, M. Turvey, R. G. Crowder ...
  22. [22]
    [PDF] Evidence for Decay in Verbal Short-Term Memory
    None of the experiments found a reduction in proactive interference over time, which they interpreted as evidence against time-based decay. However, it is ...
  23. [23]
    Temporal Expectation Modulates the Cortical Dynamics of Short ...
    Aug 22, 2018 · We show here that this decay of short-term memory can be counteracted by so-called temporal expectation; that is, knowledge of when to expect a sensory event.
  24. [24]
    fifty years of memory for Spanish learned in school - PubMed - NIH
    The analysis yields memory curves which decline exponentially for the first 3-6 years of the retention interval. After that retention remains unchanged for ...
  25. [25]
    Semantic memory content in permastore: Fifty years ... - APA PsycNet
    Citation. Bahrick, H. P. (1984). · Abstract. Tested retention of Spanish among 587 Ss who had studied the language in high school or college 1–50 yrs previously.
  26. [26]
    Factors That Influence Skill Decay and Retention: A Quantitative ...
    Nov 13, 2009 · This article presents a review of the skill retention and skill decay literature that focuses on factors that influence the loss of trained skills or knowledge.
  27. [27]
    Retention and Relearning of Gross Motor Skills after Long Periods of ...
    Aug 7, 2025 · A high degree of skill was retained after approximately one year of no practice. Relearning to previously attained skill levels was rapid. There ...
  28. [28]
    [PDF] The Long-Term Retention of Knowledge and Skills - DTIC
    For motor skills, the time saved in relearning to the original mastery criterion is generally more than 50%. 6. Variable: Individual Differences a. There ...
  29. [29]
    [PDF] Changes over 10 years in the retelling of the flashbulb memories of ...
    Nov 29, 2021 · In this study, we explored changes over time in American individuals' retelling of their flashbulb memories of the terrorist attack of September ...
  30. [30]
    (PDF) Note on the power law of forgetting - ResearchGate
    One example of this is the power law of forgetting: The decline in memory performance with time or intervening events is well fit by a power function. This ...
  31. [31]
    Infantile Amnesia: A Critical Period of Learning to Learn and ...
    Infantile amnesia, the inability of adults to recollect early episodic memories, is associated with the rapid forgetting that occurs in childhood.
  32. [32]
    Complement Dependent Synaptic Reorganisation During Critical ...
    Complement-mediated synaptic pruning during critical periods of early life may play a key role in shaping brain development and subsequent risk for ...
  33. [33]
    Working memory, attention control, and the n-back task - APA PsycNet
    The n-back task requires participants to decide whether each stimulus in a sequence matches the one that appeared n items ago.
  34. [34]
  35. [35]
  36. [36]
    Engram neurons: Encoding, consolidation, retrieval, and forgetting ...
    Jun 28, 2023 · ... replay prevents consolidation of the corresponding memory [77]. ... prevents synaptic decay and promotes memory persistence, increasing ...
  37. [37]
    Multiple modes of hippocampal-prefrontal interactions in memory ...
    Recent studies have shown that disrupting anatomical connections between the hippocampus and PFC leads to deficits in spatial navigation and memory [30,38], but ...
  38. [38]
  39. [39]
    Role of the Hippocampus in Adaptive Forgetting - IntechOpen
    The decay of NMDA receptor-dependent late (but not early) LTP could thus be mediated by active processes such as depotentiation or a reversal of LTP by LTD that ...
  40. [40]
    Long-term potentiation decay and memory loss are mediated ... - NIH
    Dec 1, 2014 · Long-term potentiation (LTP) of synaptic strength between hippocampal neurons is associated with learning and memory, and LTP dysfunction is ...
  41. [41]
    Brain activation during associative short-term memory maintenance ...
    This, in turn, may explain why hippocampal activation is demonstrated in some fMRI studies during WM tasks and why patients with hippocampal lesions are ...
  42. [42]
    Effects of healthy aging on hippocampal and rhinal memory functions
    Event-related functional magnetic resonance imaging was used to study the effects of healthy aging on hippocampal and rhinal memory functions.
  43. [43]
    Estimating the Time Course of the Excitatory Synaptic Conductance ...
    We use the method experimentally to determine the decay time course of excitatory synaptic conductances in neocortical pyramidal cells. The relatively rapid ...
  44. [44]
    Activity-dependent Decay of Early LTP Revealed by Dual EPSP ...
    The early maintenance of long-term potentiation (LTP) was studied in the CA1 region of hippocampal slices from 12- to 18-day-old rats in a low-magnesium ...
  45. [45]
    The role of protein synthesis during the labile phases of memory
    Despite the fact that extensive evidence supports the view that phases of de novo protein synthesis are necessary for memory formation and maintenance, doubts ...
  46. [46]
    PROTEIN SYNTHESIS INHIBITION AND MEMORY: FORMATION ...
    Studies using protein synthesis inhibitors have provided key support for the prevalent view that memory formation requires the initiation of protein synthesis.
  47. [47]
    Natural forgetting reversibly modulates engram expression - eLife
    In order to assess engram reactivation on object memory retrievability an AAV9-TRE-ChR2-EYFP virus was injected into the dentate gyrus of c-fos-tTA mice (36, 37) ...
  48. [48]
    Chronic activation of fear engrams induces extinction-like behavior ...
    Sep 18, 2020 · Numerous studies demonstrate that forced alcohol abstinence, which may lead to withdrawal, can impair fear-related memory processes in rodents ...
  49. [49]
    Time-dependent consolidation mechanisms of durable memory in ...
    Apr 1, 2025 · Emerging studies suggest that time-dependent consolidation enables memory stabilization by promoting memory integration and hippocampal-cortical transfer.