Cache
- class hybrid_learning.datasets.caching.Cache[source]
Bases: ABC
Caching base handle. Put objects into the cache using put(), and load cached objects by their cache descriptor using load(). Derive custom caching handles from this class (a minimal sketch of such a subclass follows the method overview below).
Public Methods:
put(descriptor, obj): Store obj in this cache.
load(descriptor): Load the object stored under key descriptor from cache.
put_batch(descriptors, objs): Store a batch of objs in this cache using the corresponding descriptors.
load_batch(descriptors[, return_none_if]): Load a batch of objects.
clear(): Clear the current cache.
descriptors(): Return all descriptors for which an element is cached.
as_dict(): Return a dict with all cached descriptors and objects.
wrap(getitem[, descriptor_map]): Add this cache to the deterministic function getitem (which should have no side effects).

Special Methods:

__repr__(): Return repr(self).
__add__(other): Return a (cascaded) cache which will first lookup self then other with default sync mode.
__radd__(other): Return a (cascaded) cache which will first lookup other then self with default sync mode.
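The following sketch illustrates the intended subclassing and put()/load() usage. It assumes that only put() and load() must be overridden (clear() and descriptors() are implemented here as well for completeness); the DictBackedCache name and its plain-dict backend are illustrative, not part of the library:

```python
from hybrid_learning.datasets.caching import Cache

class DictBackedCache(Cache):
    """Illustrative in-memory cache backed by a plain dict."""

    def __init__(self):
        super().__init__()
        self._storage = {}

    def put(self, descriptor, obj):
        # Overwrite any object already stored under this descriptor:
        self._storage[descriptor] = obj

    def load(self, descriptor):
        # Return None on a cache miss, as required by Cache.load():
        return self._storage.get(descriptor, None)

    def clear(self):
        self._storage.clear()

    def descriptors(self):
        return list(self._storage.keys())

cache = DictBackedCache()
cache.put("sample_0", [1, 2, 3])
assert cache.load("sample_0") == [1, 2, 3]
assert cache.load("unknown") is None  # misses yield None
```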
- __add__(other)[source]
Return a (cascaded) cache which will first lookup self then other with default sync mode. In case other is None or a dummy NoCache, return self.
- Returns
one of the summands in case the other is a no-op, else a CacheCascade
- __radd__(other)[source]
Return a (cascaded) cache which will first lookup other then self with default sync mode. See __add__().
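A short sketch of cache addition, reusing the hypothetical DictBackedCache from above: the + operator yields a cascade that is searched left to right, while adding None (or a NoCache) simply returns the original cache.

```python
fast, slow = DictBackedCache(), DictBackedCache()
slow.put("x", "value only present in the slow cache")

cascade = fast + slow        # lookups consult `fast` first, then `slow`
print(cascade.load("x"))     # found via `slow`, since `fast` is empty

assert (fast + None) is fast  # a no-op summand returns the cache itself
```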
- as_dict()[source]
Return a dict with all cached descriptors and objects. Beware: This can be very large!
- abstract load(descriptor)[source]
Load the object stored under key descriptor from cache. None is returned if the object is not in the cache.
- load_batch(descriptors, return_none_if='any')[source]
Load a batch of objects. Return None according to return_none_if.
- abstract put(descriptor, obj)[source]
Store obj in this cache. In case an object already exists under descriptor, the existing object is overwritten.
- put_batch(descriptors, objs)[source]
Store a batch of objs in this cache using the corresponding descriptors.
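A usage sketch for put_batch() and load_batch(), again using the hypothetical DictBackedCache from above. That return_none_if='any' makes a single missing descriptor turn the whole batch result into None is an assumption read from the parameter name:

```python
cache = DictBackedCache()
cache.put_batch(["a", "b"], [0.1, 0.2])   # store several objects at once

print(cache.load_batch(["a", "b"]))       # e.g. [0.1, 0.2]
# With return_none_if='any' (the default), one miss yields None for the batch:
print(cache.load_batch(["a", "missing"]))
```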
- wrap(getitem, descriptor_map=None)[source]
Add this cache to the deterministic function getitem (which should have no side effects). When the wrapped method is called, the desired item from the cache is returned. Only if it does not exist in the cache is the original getitem called, and its output cached and returned. The optional descriptor_map function should map a getitem input to the hash value which is to be used for the cache, e.g. it could map an index to the underlying file name.

getitem should
- have no side effects, and
- have deterministic output, i.e. calls with equal input values return equal output values.

descriptor_map should
- accept elements from the same domain as getitem, and
- be injective, i.e. map each getitem input to a unique descriptor value.
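A sketch of wrap(), assuming it returns a callable with the same signature as getitem; the getitem function, its print call (only there to visualize when the original function actually runs), and the descriptor_map lambda are purely illustrative, and DictBackedCache is the hypothetical subclass from above:

```python
def getitem(idx):
    # Deterministic, possibly expensive computation:
    print(f"computing item {idx} ...")
    return idx ** 2

cache = DictBackedCache()
cached_getitem = cache.wrap(
    getitem,
    descriptor_map=lambda idx: f"item_{idx}",  # injective index -> descriptor
)

cached_getitem(3)  # first call: getitem() runs and its result is cached
cached_getitem(3)  # second call: served from the cache, getitem() is skipped
```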