I wrote some code to (naively) perform LRU memoisation. I tried to write it in a functional programming style, so I did not use any global variables.
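The core idea, keeping the cache in the decorator's closure instead of in a global variable, can be sketched minimally like this (`memo` and `square` are illustrative names, not part of the code below):

```python
def memo(f):
    # The cache lives in the closure, not at module level.
    cache = {}

    def wrapper(*args):
        if args not in cache:
            cache[args] = f(*args)
        return cache[args]
    return wrapper


@memo
def square(x):
    return x * x
```

The full version below generalises this sketch with per-function caches, an optional list-backed cache, and LRU eviction.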
```python
from collections import defaultdict as dd
from collections import deque


def naive_lru(mx=None):
    """
    Closure that returns a memoiser.

    Parameters:
        `mx` (`int`): The maximum number of function caches the returned
            memoiser may hold.

    Returns:
        `memoise` (`function`): A memoiser.

    Variables:
        `caches` (`defaultdict`): Stores the caches for the various functions.
        `lim` (`bool`): Set to `True` if `mx` is not `None`; indicates that
            the cache size should be tracked.
        `ln` (`int`): Tracks the number of function caches.
        `deck` (`deque`): Manages the cache; its elements are ordered by when
            they were last accessed.
        `lengths` (`defaultdict`): Stores the lengths of the caches for the
            various functions.
        `lims` (`defaultdict`): Stores whether a given cache has a maximum
            size.
        `decks` (`defaultdict`): Stores the deques of the caches for the
            various functions.
        `maxes` (`defaultdict`): Stores the maximum sizes of the caches for
            the various functions.
    """
    caches, lim = dd(lambda: dd(lambda: None)), False
    if mx is not None:
        lim, ln = True, 0
        deck = deque()
    lengths, lims, decks, maxes = (dd(lambda: 0), dd(lambda: False),
                                   dd(lambda: deque()), dd(lambda: None))

    def memoise(mxsize=None, lst=False, idx=None, init=0):
        """
        Returns a memoisation decorator for a given function.

        Parameters:
            `mxsize` (`int`): The maximum size of the cache for the memoised
                variant of the input function.
            `lst` (`bool`): Determines whether a list is used for the
                function's cache. For functions whose domain can be mapped to
                a dense set, a list cache is more efficient (and faster) than
                a dict. The cache is only set to a list if `mxsize` is `None`:
                the LRU implementation requires frequent deletions, which
                would destroy the dense property that justified using a list
                in the first place, so LRU functionality is only provided when
                the cache is a dict.
            `idx` (`function`): A transformation that maps the input arguments
                to their corresponding index in the cache.
            `init` (`int`): The initial size of the cache list. If an upper
                bound on the size of the function's domain is known, a list of
                that size can be created up front, which is more efficient
                than repeatedly appending or extending.

        Returns:
            `memoised` (`function`): A memoising decorator.
        """
        def memoised(f):
            """
            Returns a memoised variant of a given input function.

            Parameters:
                `f` (`function`): The function to memoise.

            Returns:
                `mem_f` (`function`): Memoised variant of the input function.
            """
            nonlocal lst, ln
            if lim:
                deck.appendleft(f)
                ln += 1
                if ln > mx:
                    # Remove the least recently used function cache.
                    del caches[deck.pop()]
            if mxsize is not None:
                lims[f], lst, maxes[f] = True, False, mxsize
            elif lst:
                caches[f] = [None] * init

            def mem_f(*args, **kwargs):
                tpl = (args, frozenset(kwargs.items()))
                index = tpl if idx is None else idx(tpl)
                if caches[f][index] is None:
                    caches[f][index] = f(*args, **kwargs)
                    if lims[f]:
                        decks[f].appendleft(index)
                        lengths[f] += 1
                        if lengths[f] > maxes[f]:
                            # Remove the least recently used argument cache.
                            del caches[f][decks[f].pop()]
                return caches[f][index]
            return mem_f
        return memoised
    return memoise
```
Sample usage
```python
mem = naive_lru()

# idx maps the key ((n,), frozenset()) to the list index n.
@mem(lst=True, init=10000, idx=lambda x: x[0][0])
def fib(n):
    if n in [0, 1]:
        return n
    return fib(n - 2) + fib(n - 1)

print(fib(500))  # Prints the 500th Fibonacci number, a 105-digit integer.
```
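For comparison, the standard library already provides a dict-backed LRU memoiser; the same `fib` can be written with `functools.lru_cache` and none of the bookkeeping above:

```python
from functools import lru_cache


@lru_cache(maxsize=None)  # maxsize=None means the cache is unbounded.
def fib(n):
    if n in [0, 1]:
        return n
    return fib(n - 2) + fib(n - 1)

print(fib(500))
```

Unlike the list-backed cache, `lru_cache` hashes the whole argument tuple, so no `idx` mapping or preallocated `init` size is needed; the trade-off is dict lookups instead of list indexing.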