LRU cache based on stucchio's py-lru-cache module.
The original copy is at https://github.com/stucchio/Python-LRU-cache, licensed under MIT.
LRUCacheDict(max_size=1024, expiration=900, thread_clear=False, concurrent=True)
A dictionary-like object, supporting LRU caching semantics.
>>> d = LRUCacheDict(max_size=3, expiration=3)
>>> d['foo'] = 'bar'
>>> d['foo']
'bar'
>>> import time
>>> time.sleep(4)  # 4 seconds > 3 second cache expiry of d
>>> d['foo']
Traceback (most recent call last):
    ...
KeyError: 'foo'
>>> d['a'] = 'A'
>>> d['b'] = 'B'
>>> d['c'] = 'C'
>>> d['d'] = 'D'
>>> d['a']  # Should raise KeyError, since we exceeded the max cache size
Traceback (most recent call last):
    ...
KeyError: 'a'
By default, this cache only expires items when you poke it: every method on this class triggers a cleanup. If the thread_clear option is specified, a background thread will clean it up every thread_clear_min_check seconds.
If this class must be used in a multithreaded environment, the option concurrent should be set to True. Note that the cache will always be concurrent if a background cleanup thread is used.
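The combination of lazy cleanup, LRU eviction, and lock-guarded access can be sketched as follows. This is a minimal illustration of the semantics described above, not the module's actual implementation; the class name MiniLRUDict and its internals are assumptions for the sketch.

```python
import time
import threading
from collections import OrderedDict

class MiniLRUDict:
    """Minimal sketch: items expire lazily on access, the oldest entry is
    evicted past max_size, and a lock plays the role of concurrent=True."""

    def __init__(self, max_size=1024, expiration=900):
        self.max_size = max_size
        self.expiration = expiration
        self._data = OrderedDict()      # key -> (value, expiry timestamp)
        self._lock = threading.RLock()  # guards every access, as concurrent=True would

    def _cleanup(self):
        # Drop expired entries, then evict least-recently-used items past max_size.
        now = time.time()
        for key in [k for k, (_, exp) in self._data.items() if exp <= now]:
            del self._data[key]
        while len(self._data) > self.max_size:
            self._data.popitem(last=False)  # front of the OrderedDict = least recently used

    def __setitem__(self, key, value):
        with self._lock:
            self._data[key] = (value, time.time() + self.expiration)
            self._data.move_to_end(key)
            self._cleanup()

    def __getitem__(self, key):
        with self._lock:
            self._cleanup()                  # "poking" the cache triggers a cleanup
            value, _ = self._data[key]       # raises KeyError if expired or evicted
            self._data.move_to_end(key)      # mark as most recently used
            return value
```

Using an OrderedDict keeps insertion/use order for free: `move_to_end` marks recency and `popitem(last=False)` evicts the stalest entry.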
Initialize the LRUCacheDict object.
- max_size (int) – Maximum number of elements in the cache.
- expiration (int) – Number of seconds an item can be in the cache before it expires.
- thread_clear (bool) – True to use a background thread to remove expired items from the cache.
- concurrent (bool) – True to make access to the cache thread-safe.
Background thread that expires elements out of the cache.
Initialize the EmptyCacheThread.
- cache (LRUCacheDict) – The cache to be monitored.
- peek_duration (int) – The delay between “sweeps” of the cache.
Execute the background cleanup.
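The sweeping behavior described for this thread can be sketched as a daemon loop that periodically asks the cache to expire items. The function name start_sweeper and the cache's cleanup() method are assumptions for illustration, not the module's actual API.

```python
import time
import threading

def start_sweeper(cache, peek_duration=60):
    """Sketch of a background cleanup thread: every peek_duration seconds,
    ask the monitored cache to drop its expired entries."""
    def run():
        while True:
            time.sleep(peek_duration)  # delay between "sweeps" of the cache
            cache.cleanup()            # assumed method that expires stale items

    # daemon=True so the sweeper never blocks interpreter shutdown
    t = threading.Thread(target=run, daemon=True)
    t.start()
    return t
```

Marking the thread as a daemon matters here: a non-daemon infinite loop would keep the process alive after the main program finishes.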
A memoized function, backed by an LRU cache.
>>> def f(x):
...     print("Calling f(" + str(x) + ")")
...     return x
>>> f = LRUCachedFunction(f, LRUCacheDict(max_size=3, expiration=3))
>>> f(3)
Calling f(3)
3
>>> f(3)
3
>>> import time
>>> time.sleep(4)  # Cache should now be empty, since expiration time is 3.
>>> f(3)
Calling f(3)
3
>>> f(4)
Calling f(4)
4
>>> f(5)
Calling f(5)
5
>>> f(3)  # Still in cache, so nothing is printed. At this point, 4 is the least recently used.
3
>>> f(6)
Calling f(6)
6
>>> f(4)  # No longer in cache - 4 was the least recently used of [3, 4, 5, 6] when f(6) was called, so it was evicted.
Calling f(4)
4
Initialize the LRUCachedFunction object.
- function (func) – The function to be used to create new items in the cache.
- cache (LRUCacheDict) – The internal cache structure.
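The memoization pattern behind this class can be sketched as a wrapper that keys the cache on the call's arguments: on a hit, return the cached value; on a miss, call the wrapped function and store its result. The class name MemoizedFunction and the repr-based key scheme are assumptions for illustration.

```python
class MemoizedFunction:
    """Sketch of a function memoized through a dict-like cache. The cache is
    expected to raise KeyError on a miss, as LRUCacheDict does."""

    def __init__(self, function, cache):
        self.function = function
        self.cache = cache

    def __call__(self, *args, **kwargs):
        # Build a hashable key from the call signature (illustrative scheme).
        key = repr((args, sorted(kwargs.items())))
        try:
            return self.cache[key]                  # cache hit
        except KeyError:
            value = self.function(*args, **kwargs)  # miss: compute...
            self.cache[key] = value                 # ...and remember the result
            return value
```

Because the wrapper only needs `__getitem__` and `__setitem__`, any dict-like object works as the backing cache; plugging in an LRU dict adds eviction and expiry on top.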
Decorator wrapping a function in a least-recently-used cache.
>>> @lru_cache_function(3, 1)
... def f(x):
...     print("Calling f(" + str(x) + ")")
...     return x
>>> f(3)
Calling f(3)
3
>>> f(3)
3