CacheCascade
- class hybrid_learning.datasets.caching.CacheCascade(*caches, sync_by='precedence')[source]
- Bases: Cache
Combine several caches by trying to load from first to last. In case of a put, all caches are updated. In case of a load, the object is collected from the first cache holding it, and all previous ones are updated to also hold it. If sync_by is set to 'all', then on load the first match is put to all other cache instances, not only the previous ones. I.e., the order of caches also determines the precedence.
Some use-cases:
- Combine in-memory and persistent cache: Combine a DictCache with a FileCache instance: files are stored in the file system for later runs, respectively loaded from previous runs, and additionally stored in memory for even faster access.
- Cache to different cache locations: Combine several FileCache caches with sync_by='all' to write the cache to several locations (can be used as a sort of lazy copy).
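The load/put semantics described above can be sketched with minimal dict-backed stand-in caches. This is a toy illustration of the cascade behavior only, not the library implementation; the MemCache and ToyCascade names are assumptions introduced here:

```python
class MemCache:
    """Toy in-memory cache standing in for DictCache or FileCache."""

    def __init__(self):
        self._store = {}

    def put(self, descriptor, obj):
        self._store[descriptor] = obj

    def load(self, descriptor):
        return self._store.get(descriptor)  # None if not cached


class ToyCascade:
    """Minimal sketch of CacheCascade load/put semantics
    (default 'precedence' sync mode only)."""

    def __init__(self, *caches):
        self.caches = list(caches)

    def put(self, descriptor, obj):
        for cache in self.caches:  # a put updates all caches
            cache.put(descriptor, obj)

    def load(self, descriptor):
        for i, cache in enumerate(self.caches):
            obj = cache.load(descriptor)
            if obj is not None:
                # Promote the hit into all higher-precedence caches
                # (those earlier in the caches list) that missed it:
                for earlier in self.caches[:i]:
                    earlier.put(descriptor, obj)
                return obj
        return None


fast, slow = MemCache(), MemCache()
cascade = ToyCascade(fast, slow)
slow.put("img_1", "tensor")              # present only in the slow cache
assert fast.load("img_1") is None
assert cascade.load("img_1") == "tensor"
assert fast.load("img_1") == "tensor"    # promoted into the fast cache
```

This mirrors the in-memory plus persistent use-case: the first (fast) cache is filled lazily from hits in the later (slow) cache.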
Public Methods:
put(descriptor, obj): Put obj to all caches under key descriptor.
load(descriptor): Load object stored under descriptor from the cache with highest precedence holding it.
clear(): Clears all caches in the cascade.
descriptors(): Return the united descriptor lists of all sub-caches.
append(cache): Append cache to the cascade and return self.
insert(i, cache): Insert cache at position i in cascade and return self.
remove(i): Remove the cache at position i in cascade and return self.
Inherited from Cache:
put(descriptor, obj): Put obj to all caches under key descriptor.
load(descriptor): Load object stored under descriptor from the cache with highest precedence holding it.
put_batch(descriptors, objs): Store a batch of objs in this cache using according descriptors.
load_batch(descriptors[, return_none_if]): Load a batch of objects.
clear(): Clears all caches in the cascade.
descriptors(): Return the united descriptor lists of all sub-caches.
as_dict(): Return a dict with all cached descriptors and objects.
wrap(getitem[, descriptor_map]): Add this cache to the deterministic function getitem (which should have no side effects).
Special Methods:
__init__(*caches[, sync_by]): Init.
__repr__(): Return repr(self).
Inherited from Cache:
__repr__(): Return repr(self).
__add__(other): Return a (cascaded) cache which will first lookup self, then other, with default sync mode.
__radd__(other): Return a (cascaded) cache which will first lookup other, then self, with default sync mode.
- append(cache)[source]
Append cache to the cascade and return self.
- Parameters
cache (Cache) –
- Return type
CacheCascade
- descriptors()[source]
This returns the united descriptor lists of all sub-caches.
Warning
This may be very computationally expensive, depending on the size of the lists to merge into a set. So use with care!
- Return type
- insert(i, cache)[source]
Insert cache at position i in cascade and return self.
- Parameters
i (int) –
cache (Cache) –
- Return type
CacheCascade
- load(descriptor)[source]
Load object stored under descriptor from the cache with highest precedence holding it. Possibly update other cache instances according to the sync_by mode.
- put(descriptor, obj)[source]
Put obj to all caches under key descriptor. Beware that the key may be changed (e.g. transformed to a string) in sub-caches, leading to non-unique descriptors.
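The non-unique-descriptor pitfall can be made concrete with a toy sub-cache that stringifies its keys (an assumption introduced here for illustration; StrKeyCache is not part of the library):

```python
class StrKeyCache:
    """Toy cache that transforms keys to strings, as a file-backed
    cache might do when deriving file names from descriptors."""

    def __init__(self):
        self._store = {}

    def put(self, descriptor, obj):
        self._store[str(descriptor)] = obj  # key transformed to string

    def load(self, descriptor):
        return self._store.get(str(descriptor))


cache = StrKeyCache()
cache.put(1, "first")
cache.put("1", "second")  # collides: both 1 and "1" map to key "1"
assert cache.load(1) == "second"
```

Distinct descriptors that stringify identically thus silently overwrite each other in such a sub-cache.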
- remove(i)[source]
Remove the cache at position i in cascade and return self.
- Parameters
i (int) –
- Return type
CacheCascade
- sync_by: Union[bool, str]
Synchronization mode for loading. Update other instances according to the following settings:
False: no sync; simply load from the first cache holding an object without updating the others
'precedence': when loading, update all caches with higher precedence (earlier in the caches list) that do not hold the object
'all': put the object to all other caches when a value was loaded from one
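The three modes can be compared with a small self-contained sketch using plain dicts as stand-in caches (a toy illustration of the mode semantics; the cascade_load helper is an assumption introduced here, not library code):

```python
def cascade_load(caches, descriptor, sync_by='precedence'):
    """Load descriptor from the first dict in caches holding it,
    syncing the other dicts according to sync_by."""
    for i, cache in enumerate(caches):
        if descriptor in cache:
            obj = cache[descriptor]
            if sync_by == 'precedence':
                targets = caches[:i]                   # earlier caches only
            elif sync_by == 'all':
                targets = caches[:i] + caches[i + 1:]  # every other cache
            else:                                      # False: no sync
                targets = []
            for other in targets:
                other.setdefault(descriptor, obj)
            return obj
    return None  # no cache holds the descriptor


a, b, c = {}, {"x": 42}, {}
assert cascade_load([a, b, c], "x", sync_by='precedence') == 42
assert "x" in a and "x" not in c   # only higher-precedence cache updated

a, b, c = {}, {"x": 42}, {}
cascade_load([a, b, c], "x", sync_by='all')
assert "x" in a and "x" in c       # all other caches updated
```

With False, a hit is returned as-is and no other cache changes, which keeps loads cheap at the cost of repeated misses in the faster caches.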