
Python In-memory Cache With Time To Live

I have multiple threads running the same process that need to be able to notify each other that something should not be worked on for the next n seconds. It's not the end of the world …

Solution 1:

In case you don't want to use any third-party libraries, you can add one more parameter to your expensive function: ttl_hash=None. This new parameter is a so-called "time-sensitive hash"; its only purpose is to affect lru_cache.

For example:

from functools import lru_cache
import time


@lru_cache()
def my_expensive_function(a, b, ttl_hash=None):
    del ttl_hash  # to emphasize we don't use it and to shut pylint up
    return a + b  # horrible CPU load...


def get_ttl_hash(seconds=3600):
    """Return the same value within `seconds` time period"""
    return round(time.time() / seconds)


# somewhere in your code...
res = my_expensive_function(2, 2, ttl_hash=get_ttl_hash())
# cache will be updated once in an hour
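A quick way to convince yourself the trick works is to pass the hash explicitly instead of computing it from the clock. The call counter below is my own addition for illustration; it shows that calls sharing a ttl_hash hit the cache, while a new hash (i.e. a new time window) forces recomputation:

```python
from functools import lru_cache

calls = 0  # counts how often the function body actually runs


@lru_cache()
def my_expensive_function(a, b, ttl_hash=None):
    global calls
    calls += 1
    del ttl_hash  # unused; its only job is to key the cache
    return a + b


# Same ttl_hash -> same cache key -> the body runs only once.
my_expensive_function(2, 2, ttl_hash=1)
my_expensive_function(2, 2, ttl_hash=1)
print(calls)  # 1

# A new ttl_hash simulates the time window rolling over: cache miss.
my_expensive_function(2, 2, ttl_hash=2)
print(calls)  # 2
```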

Solution 2:

The OP is using Python 2.7, but if you're using Python 3, ExpiringDict mentioned in the accepted answer is currently, well, expired. The last commit to the GitHub repo was June 17, 2017, and there is an open issue that it doesn't work with Python 3.5.

As of September 1, 2020, there is a more recently maintained project cachetools.

pip install cachetools

from cachetools import TTLCache

cache = TTLCache(maxsize=10, ttl=360)
cache['apple'] = 'top dog'
...
>>> cache['apple']
'top dog'
... after 360 seconds ...
>>> cache['apple']
KeyError exception raised

ttl is the time to live in seconds.
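If you don't want to wait 360 real seconds to see the expiry, TTLCache accepts a `timer` argument (it defaults to time.monotonic), so the snippet above can be made deterministic by driving a fake clock by hand. This is a sketch of the same behavior, not a pattern you'd need in production code:

```python
from cachetools import TTLCache

# A hand-driven clock stands in for real time so expiry is deterministic.
clock = [0]
cache = TTLCache(maxsize=10, ttl=360, timer=lambda: clock[0])

cache['apple'] = 'top dog'
print(cache['apple'])  # 'top dog'

clock[0] = 361  # pretend 361 seconds have passed
try:
    cache['apple']
except KeyError:
    print('expired')  # the entry is gone after ttl seconds
```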

Solution 3:

You can use the expiringdict module:

The core of the library is ExpiringDict class which is an ordered dictionary with auto-expiring values for caching purposes.

In the description they do not talk about multithreading, so in order not to mess up, use a Lock.
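To make the "wrap it in a Lock" advice concrete, here is a minimal stdlib sketch of the same idea: a dict whose entries expire after a fixed age, with a Lock guarding every access so multiple threads can share it. The class is a hypothetical stand-in for illustration, not the expiringdict API:

```python
import threading
import time


class SimpleExpiringDict:
    """Dict-like cache whose entries expire after max_age_seconds."""

    def __init__(self, max_age_seconds):
        self.max_age = max_age_seconds
        self._data = {}  # key -> (value, stored_at)
        self._lock = threading.Lock()

    def __setitem__(self, key, value):
        with self._lock:
            self._data[key] = (value, time.monotonic())

    def __getitem__(self, key):
        with self._lock:
            value, stored_at = self._data[key]
            if time.monotonic() - stored_at > self.max_age:
                del self._data[key]  # evict the stale entry
                raise KeyError(key)
            return value

    def __contains__(self, key):
        try:
            self[key]
            return True
        except KeyError:
            return False


cache = SimpleExpiringDict(max_age_seconds=0.1)
cache['apple'] = 'top dog'
print('apple' in cache)  # True right away
time.sleep(0.2)
print('apple' in cache)  # False once the entry has expired
```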

Solution 4:

Regarding an expiring in-memory cache for general-purpose use, a common design pattern is to do this not via a dictionary but via a function or method decorator; a cache dictionary is managed behind the scenes. As such, this answer somewhat complements the answer by User, which uses a dictionary rather than a decorator.

The ttl_cache decorator in cachetools works a lot like functools.lru_cache, but with a time to live.

import cachetools.func

@cachetools.func.ttl_cache(maxsize=128, ttl=10 * 60)
def example_function(key):
    return get_expensively_computed_value(key)


class ExampleClass:
    EXP = 2

    @classmethod
    @cachetools.func.ttl_cache()
    def example_classmethod(cls, i):
        return i * cls.EXP

    @staticmethod
    @cachetools.func.ttl_cache()
    def example_staticmethod(i):
        return i * 3
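Within the ttl window, ttl_cache memoizes exactly like functools.lru_cache. A small self-check (the counter is my own addition for illustration):

```python
import cachetools.func

calls = 0  # counts how often the function body actually runs


@cachetools.func.ttl_cache(maxsize=128, ttl=600)
def square(x):
    global calls
    calls += 1
    return x * x


square(4)
square(4)  # cache hit: still within the 600 s window
print(calls)  # 1
```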

Solution 5:

I absolutely love the idea from @iutinvg; I just wanted to take it a little further: decouple it from having to know to pass the ttl into every function, and just make it a decorator so you don't have to think about it. If you have Django and Python 3 and don't feel like pip-installing any dependencies, try this out.

import time
from django.utils.functional import lazy
from functools import lru_cache, partial, update_wrapper


def lru_cache_time(seconds, maxsize=None):
    """
    Adds time-aware caching to lru_cache.
    """
    def wrapper(func):
        # Lazy function that makes sure the lru_cache() invalidates after X secs
        ttl_hash = lazy(lambda: round(time.time() / seconds), int)()

        @lru_cache(maxsize)
        def time_aware(__ttl, *args, **kwargs):
            """
            Main wrapper; note that the first argument, __ttl, is not passed
            down. This is because no wrapped function should need to know
            that it is here.
            """
            def wrapping(*args, **kwargs):
                return func(*args, **kwargs)
            return wrapping(*args, **kwargs)
        return update_wrapper(partial(time_aware, ttl_hash), func)
    return wrapper

Proving it works (with examples):

@lru_cache_time(seconds=10)
def meaning_of_life():
    """
    This message should show up if you call help().
    """
    print('this better only show up once!')
    return 42


@lru_cache_time(seconds=10)
def multiply(a, b):
    """
    This message should show up if you call help().
    """
    print('this better only show up once!')
    return a * b


# This is a test: it prints a `.` every second. There should be 10 s between
# each "this better only show up once!" (*2 because of the two functions).
for _ in range(20):
    meaning_of_life()
    multiply(50, 99991)
    print('.')
    time.sleep(1)
