On 2/14/23 15:07, Dino wrote:

Here's my problem today. I am using a dict() to implement a quick and dirty in-memory cache.

I stop adding elements once I reach 1000 entries (a totally arbitrary number), but I would like something slightly more sophisticated to free up space for newer and potentially more relevant entries.

I am thinking of the Least Recently Used principle, but how to implement it is not immediately obvious. Before I embark on reinventing the wheel, is there a tool, library, or smart trick that will let me evict elements with LRU logic?

It depends how fancy you want to get. If you really need the behavior of a size-limited cache and you expect there to be a lot of churn, then you'll probably want a combination of data structures... e.g. a dict-like thing for quick hashed lookups (which means your keys need to be hashable) and a doubly-linked list for ordering, so you can quickly move a hit to the most-recent position - and maybe you want to keep cache stats as well. A sketch of that combination follows.
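
In current Python, collections.OrderedDict already pairs a hash table with a doubly-linked list under the hood, so a minimal sketch of that combination (the LRUCache name and the hit/miss counters are just illustrative) might look like this:

    from collections import OrderedDict

    class LRUCache:
        """Size-limited cache: hashed lookups plus recency ordering."""

        def __init__(self, maxsize=1000):
            self.maxsize = maxsize
            self._data = OrderedDict()   # hash table + doubly-linked list
            self.hits = 0
            self.misses = 0

        def get(self, key, default=None):
            if key in self._data:
                self.hits += 1
                self._data.move_to_end(key)     # mark as most recently used
                return self._data[key]
            self.misses += 1
            return default

        def put(self, key, value):
            if key in self._data:
                self._data.move_to_end(key)
            self._data[key] = value
            if len(self._data) > self.maxsize:
                self._data.popitem(last=False)  # evict least recently used

Here get() moves a hit to the most-recently-used end and put() evicts from the least-recently-used end once maxsize is exceeded, which is exactly the two-structure trick described above.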

Do look at functools if that well-tested standard library module fits your needs. If you need, or are motivated (as a learning experience?), to roll your own, my memory tells me the RealPython crew did a tutorial on implementing an LRU cache, which might be worth a look (not sure if it's one of the all-free ones, or one of the must-be-a-paid-subscriber-to-get-the-advanced-stuff ones).
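
In case it helps, the functools tool in question is functools.lru_cache, which memoizes a function with LRU eviction and keeps stats for you; if your cache is naturally "value = f(key)", a sketch like this may be all you need (fetch_from_backend is a hypothetical stand-in for whatever expensive lookup you're caching):

    from functools import lru_cache

    @lru_cache(maxsize=1000)
    def lookup(key):
        # fetch_from_backend is a hypothetical stand-in for whatever
        # expensive computation or I/O you're caching
        return fetch_from_backend(key)

    # lookup.cache_info() reports hits, misses, maxsize and current size;
    # lookup.cache_clear() empties the cache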

