Recent research has proposed that systematic biases in human memory, while seemingly highlighting a proclivity for failure, can be understood as hallmarks of optimised lossy compression: specifically, a form of compression termed semantic compression, whereby an internal model of the environment is recruited to encode memories. Semantic compression casts memory errors in the normative framework of information theory, describing how limited memory resources should be distributed to optimise recall performance. Notably, the theory does not define a single best compression; rather, it admits a continuum of trade-offs between utilised capacity and expected distortion. However, the possible consequences of this characteristic feature have not been tested explicitly. Here we test the idea that the gradual degradation of memories with time corresponds to a decrease in the amount of resources allocated to their storage. We apply the general framework to remembering synthetic words in a delayed recognition experiment and find that subjects are indeed less sensitive to intrusions generated by our model than to generic distortions, and that delay length modulates recall rates in line with the predictions of the theory.
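For concreteness, the continuum of trade-offs invoked above is the one standardly formalised by the rate-distortion function of information theory; the following is a minimal sketch under illustrative assumptions (a source $X$ over memoranda, a reconstruction $\hat{X}$, and a distortion measure $d$, none of which are specified in this section):
\[
R(D) \;=\; \min_{p(\hat{x} \mid x)\,:\;\mathbb{E}\left[d(X,\hat{X})\right] \le D} I(X;\hat{X}),
\]
where $I(X;\hat{X})$ is the mutual information between the original and its reconstruction. Sweeping the admissible expected distortion $D$ traces out a curve of optimal compressions rather than a single best one; on this reading, allocating fewer resources to a memory over time corresponds to moving along the curve towards lower rate and higher expected distortion.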