DictCache
- class hybrid_learning.datasets.caching.DictCache(thread_safe=True)[source]
Bases: Cache
Cache objects in a (multiprocessing capable) dictionary in memory. In case this cache is used, the multiprocessing sharing strategy is automatically set to 'file_system', since otherwise the ulimit of multiprocessing is exceeded for larger cache sizes. See pytorch issue 973 for this, and the pytorch doc on multiprocessing for the drawbacks of this sharing strategy.

Warning: Be sure to have enough RAM!
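
A minimal usage sketch (assuming hybrid_learning is installed; import path and signature as documented on this page):

    from hybrid_learning.datasets.caching import DictCache

    cache = DictCache()  # thread_safe=True by default

    # Store an object under a descriptor key, then retrieve it:
    cache.put("img_0", {"pixels": [0, 1, 2]})
    assert cache.load("img_0") == {"pixels": [0, 1, 2]}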
Public Methods:
- put(descriptor, obj): Store obj under key descriptor in an in-memory cache.
- load(descriptor): Load the object stored under descriptor from the in-memory cache.
- clear(): Empty the cache dict.
- descriptors(): Return the keys (descriptors) of the cache dict.
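
A short sketch of the housekeeping methods listed above (assuming descriptors() returns an iterable of the stored keys; printed values are illustrative):

    from hybrid_learning.datasets.caching import DictCache

    cache = DictCache()
    cache.put("a", 1)
    cache.put("b", 2)
    print(sorted(cache.descriptors()))  # ['a', 'b']

    cache.clear()                       # empty the cache dict
    print(list(cache.descriptors()))    # []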
Inherited from Cache:
- put(descriptor, obj): Store obj under key descriptor in an in-memory cache.
- load(descriptor): Load the object stored under descriptor from the in-memory cache.
- put_batch(descriptors, objs): Store a batch of objs in this cache using the corresponding descriptors.
- load_batch(descriptors[, return_none_if]): Load a batch of objects.
- clear(): Empty the cache dict.
- descriptors(): Return the keys (descriptors) of the cache dict.
- as_dict(): Return a dict with all cached descriptors and objects.
- wrap(getitem[, descriptor_map]): Add this cache to the deterministic function getitem (which should have no side effects).
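
A sketch of the inherited batch helpers (assuming put_batch and load_batch take parallel sequences of descriptors and objects, as their summaries suggest, and that load_batch returns the objects in descriptor order):

    from hybrid_learning.datasets.caching import DictCache

    cache = DictCache()
    cache.put_batch(["x", "y"], [10, 20])  # parallel descriptor/object sequences
    print(cache.load_batch(["x", "y"]))    # e.g. [10, 20]
    print(cache.as_dict())                 # {'x': 10, 'y': 20}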
Special Methods:
- __init__([thread_safe]): Init.
Inherited from Cache:
- __repr__(): Return repr(self).
- __add__(other): Return a (cascaded) cache which will first look up self, then other, with default sync mode.
- __radd__(other): Return a (cascaded) cache which will first look up other, then self, with default sync mode.
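
The addition operators cascade caches. A sketch (both levels are DictCache instances purely for illustration; in practice the second level would typically be a persistent cache):

    from hybrid_learning.datasets.caching import DictCache

    fast, slow = DictCache(), DictCache()
    combined = fast + slow       # lookups try fast first, then slow

    slow.put("key", "value")
    print(combined.load("key"))  # 'value', found in the second-level cache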
- Parameters:
  thread_safe (bool) – whether to use a multiprocessing-capable dict
- __init__(thread_safe=True)[source]
Init.
- Parameters:
  thread_safe (bool) – whether to use a multiprocessing-capable dict
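
If the cache is only ever accessed from a single process, the multiprocessing-capable dict can be skipped; a sketch:

    from hybrid_learning.datasets.caching import DictCache

    cache = DictCache(thread_safe=False)  # plain dict, single-process use only
    cache.put("sample", 42)
    assert cache.load("sample") == 42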