CacheCascade
- class hybrid_learning.datasets.caching.CacheCascade(*caches, sync_by='precedence')[source]
Bases: Cache
Combine several caches by trying to load from first to last. In case of a put, all caches are updated. In case of a load, the object is collected from the first cache holding it, and all previous ones are updated to also hold it. If sync_by is 'all', then on load the first match is put to all other cache instances, not only the previous ones. I.e. the order of caches also determines the precedence.
Some use-cases:
- Combine in-memory and persistent cache: Combine a DictCache with a FileCache instance: files are stored in the file system for later runs (respectively loaded from previous runs), and additionally stored in memory for even faster access.
- Cache to different cache locations: Combine several FileCache caches with sync_by='all' to write the cache to several locations (can be used as a sort of lazy copy).
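The first use-case can be sketched with a minimal stand-in for the cascade logic. Plain dicts play the role of the cache instances here; the ToyCascade class below is purely illustrative, the real DictCache, FileCache, and CacheCascade classes provide this behavior out of the box:

```python
# Illustrative stand-in for the cascade logic: plain dicts play the role
# of a fast in-memory cache and a slower persistent cache.
class ToyCascade:
    def __init__(self, *caches):
        self.caches = list(caches)  # earlier = higher precedence

    def put(self, descriptor, obj):
        # A put updates all caches in the cascade.
        for cache in self.caches:
            cache[descriptor] = obj

    def load(self, descriptor):
        # Load from the first cache holding the object, and back-fill
        # all caches of higher precedence (the 'precedence' sync mode).
        for i, cache in enumerate(self.caches):
            if descriptor in cache:
                obj = cache[descriptor]
                for earlier in self.caches[:i]:
                    earlier[descriptor] = obj
                return obj
        return None

memory, files = {}, {}
cascade = ToyCascade(memory, files)
files["img_0"] = "tensor"           # pretend a previous run left this on disk
assert cascade.load("img_0") == "tensor"
assert memory["img_0"] == "tensor"  # in-memory cache was filled on load
```

With the real classes, this corresponds to a CacheCascade built from a DictCache and a FileCache instance.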
Public Methods:
- put(descriptor, obj): Put obj to all caches under key descriptor.
- load(descriptor): Load object stored under descriptor from the cache with highest precedence holding it.
- clear(): Clears all caches in the cascade.
- descriptors(): This returns the united descriptor lists of all sub-caches.
- append(cache): Append cache to the cascade and return self.
- insert(i, cache): Insert cache at position i in cascade and return self.
- remove(i): Remove the cache at position i in cascade and return self.

Inherited from Cache:
- put(descriptor, obj): Put obj to all caches under key descriptor.
- load(descriptor): Load object stored under descriptor from the cache with highest precedence holding it.
- put_batch(descriptors, objs): Store a batch of objs in this cache using according descriptors.
- load_batch(descriptors[, return_none_if]): Load a batch of objects.
- clear(): Clears all caches in the cascade.
- descriptors(): This returns the united descriptor lists of all sub-caches.
- as_dict(): Return a dict with all cached descriptors and objects.
- wrap(getitem[, descriptor_map]): Add this cache to the deterministic function getitem (which should have no side effects).

Special Methods:
- __init__(*caches[, sync_by]): Init.
- __repr__(): Return repr(self).

Inherited from Cache:
- __repr__(): Return repr(self).
- __add__(other): Return a (cascaded) cache which will first lookup self, then other, with default sync mode.
- __radd__(other): Return a (cascaded) cache which will first lookup other, then self, with default sync mode.
- append(cache)[source]
Append cache to the cascade and return self.
Parameters: cache (Cache)
Return type: CacheCascade
- descriptors()[source]
This returns the united descriptor lists of all sub-caches.
Warning: This may be very computationally expensive, depending on the size of the lists to merge into a set. So use with care!
- insert(i, cache)[source]
Insert cache at position i in cascade and return self.
Parameters: i (int), cache (Cache)
Return type: CacheCascade
- load(descriptor)[source]
Load object stored under descriptor from the cache with highest precedence holding it. Possibly update other cache instances according to the sync_by mode.
- put(descriptor, obj)[source]
Put obj to all caches under key descriptor. Beware that the key may be changed (e.g. transformed to string) in sub-caches, leading to non-unique descriptors.
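The caveat about non-unique descriptors can be illustrated with a sketched sub-cache that stringifies its keys, as a file-based cache typically must to build file names. The dict below is purely illustrative, not the real FileCache:

```python
# A sub-cache that turns descriptors into strings (e.g. file names)
# can map distinct descriptors to the same key:
file_like_store = {}
for descriptor in (1, "1"):                # two distinct descriptors ...
    file_like_store[str(descriptor)] = descriptor
assert len(file_like_store) == 1           # ... collapsed into one entry
```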
- remove(i)[source]
Remove the cache at position i in cascade and return self.
Parameters: i (int)
Return type: CacheCascade
- sync_by: bool
Synchronization mode for loading. Update other instances according to the following settings:
- False: no sync; simply load from the first cache holding an object without updating the others
- 'precedence': when loading, update all caches with higher precedence (earlier in the caches list) that do not hold the object
- 'all': put object to all other caches when a value was loaded from one
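The three settings can be sketched as follows. Plain dicts stand in for cache instances, and the load_with_sync function is illustrative only, not the library implementation:

```python
def load_with_sync(caches, descriptor, sync_by='precedence'):
    """Load descriptor from the first dict in caches holding it,
    syncing the other dicts according to sync_by."""
    for i, cache in enumerate(caches):
        if descriptor in cache:
            obj = cache[descriptor]
            if sync_by == 'precedence':   # update earlier caches only
                targets = caches[:i]
            elif sync_by == 'all':        # update every other cache
                targets = caches[:i] + caches[i + 1:]
            else:                         # False: no synchronization
                targets = []
            for other in targets:
                other[descriptor] = obj
            return obj
    return None

first, second, third = {}, {"x": 42}, {}
load_with_sync([first, second, third], "x", sync_by='all')
assert first == third == {"x": 42}    # 'all' also updates later caches
```

With sync_by='precedence', only the earlier dict would have been updated; with sync_by=False, neither.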