DictCache

class hybrid_learning.datasets.caching.DictCache(thread_safe=True)[source]

Bases: Cache

Cache objects in a (multiprocessing-capable) dictionary in memory. When this cache is used, the multiprocessing sharing strategy is automatically set to 'file_system', since larger cache sizes would otherwise exceed the system's ulimit on open file descriptors. See PyTorch issue 973 for details, and the PyTorch documentation on multiprocessing for the drawbacks of this sharing strategy.

Warning

Be sure to have enough RAM!
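
A minimal usage sketch (assuming put() returns nothing and loading an unknown descriptor yields None, in line with the Optional[Any] return type documented below):

>>> from hybrid_learning.datasets.caching import DictCache
>>> cache = DictCache()
>>> cache.put("sample_0", {"label": 1})
>>> cache.load("sample_0")
{'label': 1}
>>> cache.load("never_stored")  # assumed to return None for unknown descriptors
>>> list(cache.descriptors())
['sample_0']
>>> cache.clear()
>>> list(cache.descriptors())
[]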

Public Methods:

put(descriptor, obj)

Store obj under key descriptor in an in-memory cache.

load(descriptor)

Load the object stored under descriptor from the in-memory cache.

clear()

Empty cache dict.

descriptors()

Return the keys (descriptors) of the cache dict.

Inherited from Cache:

put(descriptor, obj)

Store obj under key descriptor in an in-memory cache.

load(descriptor)

Load the object stored under descriptor from the in-memory cache.

put_batch(descriptors, objs)

Store a batch of objs in this cache under the corresponding descriptors.

load_batch(descriptors[, return_none_if])

Load a batch of objects.

clear()

Empty cache dict.

descriptors()

Return the keys (descriptors) of the cache dict.

as_dict()

Return a dict with all cached descriptors and objects.

wrap(getitem[, descriptor_map])

Add this cache to the deterministic function getitem (which should have no side effects).
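
The inherited batch and wrapping helpers can be sketched as follows; that put_batch returns nothing and that wrap returns a cached drop-in replacement for getitem are assumptions based on the summaries above, not guarantees:

>>> cache = DictCache()
>>> cache.put_batch(["a", "b"], [1, 2])
>>> objs = cache.load_batch(["a", "b"])       # assumed to yield the objects in order
>>> def getitem(descriptor):
...     return descriptor * 2                 # stands in for an expensive, deterministic load
>>> cached_getitem = cache.wrap(getitem)      # results are served from / stored to the cache
>>> value = cached_getitem("c")               # computed once, then cached under "c"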

Special Methods:

__init__([thread_safe])

Init.

Inherited from Cache:

__repr__()

Return repr(self).

__add__(other)

Return a (cascaded) cache which will first look up self, then other, using the default sync mode.

__radd__(other)

Return a (cascaded) cache which will first look up other, then self, using the default sync mode.
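
For example, two caches can be combined into a cascade via +; that the left operand is consulted first follows from the summary above, while the exact write-back behaviour depends on the default sync mode:

>>> fast = DictCache()
>>> slow = DictCache()         # stands in for e.g. a slower, persistent cache
>>> cascade = fast + slow      # lookup order: fast first, then slow
>>> slow.put("x", 42)
>>> cascade.load("x")
42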


Parameters

thread_safe (bool) – whether to use a multiprocessing-capable dict

__init__(thread_safe=True)[source]

Init.

Parameters

thread_safe (bool) – whether to use a multiprocessing-capable dict
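
For a cache that is only ever accessed from a single process, the multiprocessing-capable dict can presumably be skipped:

>>> cache = DictCache(thread_safe=False)   # plain dict; no sharing across worker processes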

clear()[source]

Empty cache dict.

descriptors()[source]

Return the keys (descriptors) of the cache dict.

Return type

Iterable

load(descriptor)[source]

Load the object stored under descriptor from the in-memory cache.

Parameters

descriptor (Hashable) –

Return type

Optional[Any]

put(descriptor, obj)[source]

Store obj under key descriptor in the in-memory cache. If an object already exists under that descriptor, it is overwritten.

Parameters
  • descriptor (Hashable) – the descriptor key under which to store the object; used to access the object later

  • obj (Any) – the object to put into cache; must not be None
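
A small sketch of the overwrite behaviour (put() is assumed to return nothing):

>>> cache = DictCache()
>>> cache.put("img_1", "first version")
>>> cache.put("img_1", "second version")   # same descriptor: the first entry is overwritten
>>> cache.load("img_1")
'second version'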