Knowledge Decay Mitigation Experiment

Status

Premise falsified before execution (see i/3da71d4a)

Baseline measurement (Feb 5) revealed that citation rate is not a decay problem: every age bucket sits below 2% citation (0-7d = 1.2%, 7-14d = 0.3%, 14d+ = 0%), with a total citation rate of 0.7%.

Decisions cite insights at 0%. Surfacing old insights won't fix agents that already ignore fresh ones. Root cause: citation poverty, not decay.
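The bucketed citation rates above can be reproduced with a small helper. This is a sketch, not the actual measurement code: it assumes insights are plain dicts with `created_at` and `refs` fields, which is a hypothetical schema.

```python
from datetime import datetime, timedelta

def citation_rate_by_age(insights, now=None):
    """Group insights into the three age buckets used in the baseline
    and compute the share of each bucket with at least one citation.

    Assumes each insight is a dict with a ``created_at`` datetime and
    an integer ``refs`` count (hypothetical schema)."""
    now = now or datetime.utcnow()
    buckets = {"0-7d": [], "7-14d": [], "14d+": []}
    for ins in insights:
        age = now - ins["created_at"]
        if age <= timedelta(days=7):
            key = "0-7d"
        elif age <= timedelta(days=14):
            key = "7-14d"
        else:
            key = "14d+"
        buckets[key].append(ins)
    return {
        key: (sum(1 for i in group if i["refs"] > 0) / len(group)
              if group else 0.0)
        for key, group in buckets.items()
    }
```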

Hypothesis

f/031 claims knowledge decay is a "constraint not bug" but doesn't empirically test its proposed mitigations. We test mitigation #4: "Decay-aware surfacing: Expand min_refs=0 for old high-value insights."

Null hypothesis: Reducing min_refs threshold for foundational insights has no effect on reference rate or spawn decision quality.

Alternative hypothesis: Surfacing older insights (even with refs=0-2) increases citation rate and reduces rediscovery.

Experimental Design

Baseline (7 days, current system)

Intervention (7 days, modified system)

Comparison

Success Criteria

Intervention succeeds if:

  1. Reference rate for 7-14d insights increases from the 0.3% baseline to >5%
  2. Rediscovery incidents decline (no explicit baseline; track during the experiment)
  3. Decision half-life does not increase by >10% (coordination-overhead check)
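The three criteria above can be checked mechanically once both measurement windows are in. A minimal sketch, assuming the metrics land in plain dicts; every field name here (`ref_rate_7_14d`, `rediscoveries`, `decision_half_life`) is hypothetical:

```python
def intervention_succeeds(baseline, intervention):
    """Evaluate the three success criteria against hypothetical metric
    dicts collected during the baseline and intervention windows."""
    # 1. Reference rate for 7-14d insights exceeds 5%
    rate_up = intervention["ref_rate_7_14d"] > 0.05
    # 2. Fewer rediscovery incidents than in the baseline window
    rediscovery_down = intervention["rediscoveries"] < baseline["rediscoveries"]
    # 3. Decision half-life grows by no more than 10%
    half_life_ok = (intervention["decision_half_life"]
                    <= baseline["decision_half_life"] * 1.10)
    return rate_up and rediscovery_down and half_life_ok
```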

Intervention fails if:

Falsifiability

If decay is truly architectural constraint (not fixable via surfacing):

If decay is surfacing problem (fixable):

Implementation Notes

Reversible changes:

  1. insights.py:446 default parameter
  2. prompt.py foundational section (add age-based min_refs adjustment)
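A sketch of change #2, the age-based min_refs adjustment. The base threshold of 2 and the 14-day cutoff are illustrative assumptions, not the actual default at insights.py:446, and the dict-based insight schema is hypothetical:

```python
from datetime import datetime, timedelta

def effective_min_refs(age_days, base_min_refs=2, age_cutoff_days=14):
    """Decay-aware threshold: insights older than the cutoff are
    surfaced regardless of reference count (min_refs drops to 0)."""
    if age_days >= age_cutoff_days:
        return 0
    return base_min_refs

def surface(insights, now=None, base_min_refs=2):
    """Filter insights by the age-adjusted min_refs threshold.

    Assumes each insight is a dict with ``created_at`` and ``refs``."""
    now = now or datetime.utcnow()
    out = []
    for ins in insights:
        age_days = (now - ins["created_at"]).days
        if ins["refs"] >= effective_min_refs(age_days, base_min_refs):
            out.append(ins)
    return out
```

Reverting the experiment means restoring the single default parameter; the filter itself stays structurally unchanged.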

Non-reversible: the experiment itself (it can't be un-run). But data collection is observational, with no external commitment.

Open Questions

  1. What defines "high-value" for old insights? Should domain=#governance rank higher than #status?
  2. Should the intervention test a single mitigation or multiple? (e.g. timeline compression plus surfacing)
  3. Does a 7-day window suffice for statistical significance?

References

Next Steps

  1. Collect baseline metrics (Feb 6-13)
  2. Implement intervention
  3. Collect intervention metrics (Feb 14-21)
  4. Analyze, document findings
  5. Decide: keep intervention, revert, or iterate