How long does the lru_cache decorator's cache live?

I’m developing a Python script that needs to remember information (essentially a cache) from previous executions, and I ended up finding the @functools.lru_cache decorator, but I was left with doubts after reading the documentation.

Does this decorator keep the memoized information only until the current execution ends, or does it also keep it after the execution ends?

If not, could you suggest how I can do this, or a path I can follow to get there?

  • If you need a cache that survives even after the application restarts, consider Redis: https://redis.io (a minimal persistence sketch follows below).
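
If the goal is just to keep results between runs on the same machine, a minimal sketch using the standard-library shelve module could look like the following (the disk_cache decorator and the foobar_cache file name are illustrative assumptions, not part of lru_cache; a Redis-backed version would follow the same wrapper pattern):

import shelve
from functools import wraps

def disk_cache(path):
    """Memoize results in a shelve file so they survive process restarts."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args):
            key = repr(args)  # simple key; assumes arguments have a stable repr()
            with shelve.open(path) as db:
                if key in db:
                    return db[key]   # reuse a result stored by a previous run
                result = func(*args)
                db[key] = result     # persist the new result to disk
                return result
        return wrapper
    return decorator

@disk_cache("foobar_cache")
def foobar(n):
    return n * n  # stands in for an expensive computation

print(foobar(3))  # computed on the first run, read back from disk afterwards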

1 answer

There is no specification for lifetime, only for size. @functools.lru_cache works like a dictionary: the cache exists as long as your application is running and the decorated function exists, so it does not survive the end of the process. It is therefore recommended to use maxsize= with a value that is not too large (the default is 128) to avoid uncontrolled memory use; if you set maxsize=None, the LRU behavior is disabled and the cache can grow without limit, which can cause serious problems.
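
As a quick illustration of how the size limit behaves (the function name slow_square is just an example), cache_info() reports hits, misses, the configured maxsize and the current size:

from functools import lru_cache

@lru_cache(maxsize=128)
def slow_square(n):
    return n * n  # stands in for an expensive computation

slow_square(2)                   # miss: computed and stored
slow_square(2)                   # hit: answered from the cache
print(slow_square.cache_info())
# CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)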

Note that if you use typed=True, arguments of different types are cached separately; for example, foobar(3) and foobar(3.0) will be treated as distinct calls with distinct cache entries, which uses more memory, so it is worth assessing whether you need it.
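
A minimal sketch of that difference (foobar here is just a placeholder):

from functools import lru_cache

@lru_cache(maxsize=128, typed=True)
def foobar(x):
    return x * 2

foobar(3)                   # stored under the int key 3
foobar(3.0)                 # stored again under the float key 3.0
print(foobar.cache_info())  # misses=2: the two calls did not share an entry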

There is also the cache_clear() method, with which you can invalidate a function's cache at any time, for example:

from functools import lru_cache

@lru_cache
def foobar():
    return ...  # some expensive computation here

foobar()  # returns the result
foobar()  # returns the same result, possibly from the cache

# discards the existing cache for "foobar()"
foobar.cache_clear()

The LRU algorithm used by this decorator discards the least recently used entries first when the cache is full.
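
A small sketch of that eviction (maxsize=2 and the function double are chosen only for illustration):

from functools import lru_cache

@lru_cache(maxsize=2)
def double(n):
    return n * 2

double(1)  # cache now holds 1
double(2)  # cache now holds 1 and 2
double(3)  # cache is full: the least recently used entry (1) is evicted
double(1)  # miss again, because 1 was discarded
print(double.cache_info())  # CacheInfo(hits=0, misses=4, maxsize=2, currsize=2)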
