CacheTuple

class hybrid_learning.datasets.caching.CacheTuple(*caches, return_none_if='any')[source]

Bases: Cache

Cache the values of tuples using different caches. Given a descriptor and a tuple of objects, each value of the tuple is stored in a different cache under the given descriptor.

Can be used e.g. to store transformed pairs of (input, target) using two different caches.
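A minimal usage sketch (DictCache is assumed here to be the module's in-memory cache; any two Cache subclasses can be used instead):

    from hybrid_learning.datasets.caching import CacheTuple, DictCache

    # One cache for the inputs, one for the targets
    # (DictCache assumed; swap in any Cache subclass):
    cache = CacheTuple(DictCache(), DictCache())

    # Store a transformed (input, target) pair under a single descriptor:
    cache.put("img_0", ([0.1, 0.2, 0.3], 1))

    # load() collects the value from each sub-cache and returns the tuple:
    assert cache.load("img_0") == ([0.1, 0.2, 0.3], 1)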

Public Methods:

load(descriptor)

Load all objects stored under descriptor and return as tuple.

put(descriptor, obj)

Put obj[i] into caches[i] under key descriptor.

clear()

Clear all caches in the tuple.

descriptors()

Return all descriptors that occur in any of the given caches.

Inherited from Cache

put(descriptor, obj)

Put obj[i] into caches[i] under key descriptor.

load(descriptor)

Load all objects stored under descriptor and return as tuple.

put_batch(descriptors, objs)

Store a batch of objs in this cache under the corresponding descriptors.

load_batch(descriptors[, return_none_if])

Load a batch of objects.

clear()

Clear all caches in the tuple.

descriptors()

Return all descriptors that occur in any of the given caches.

as_dict()

Return a dict with all cached descriptors and objects.

wrap(getitem[, descriptor_map])

Add this cache to the deterministic function getitem (which should have no side effects).

Special Methods:

__init__(*caches[, return_none_if])

Init.

__repr__()

Return repr(self).

Inherited from Cache

__repr__()

Return repr(self).

__add__(other)

Return a (cascaded) cache which will first look up self, then other, with the default sync mode.

__radd__(other)

Return a (cascaded) cache which will first look up other, then self, with the default sync mode.
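Since these operators are inherited, a CacheTuple can be chained with another cache into a cascade. A short sketch (the cache instances are illustrative; DictCache is assumed as the in-memory cache):

    from hybrid_learning.datasets.caching import CacheTuple, DictCache

    fast = CacheTuple(DictCache(), DictCache())  # in-memory tuple cache
    slow = CacheTuple(DictCache(), DictCache())  # stand-in for a slower, persistent cache

    # The cascade looks up `fast` first and falls back to `slow` on a miss:
    combined = fast + slow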


__init__(*caches, return_none_if='any')[source]

Init.

Parameters
  • caches (Cache) – the caches to use to cache the values of given tuples

  • return_none_if (Union[str, int]) – see the return_none_if attribute; may be one of 'any', 'all', 'never', 1, 0, -1.

__repr__()[source]

Return repr(self).

Return type

str

clear()[source]

Clear all caches in the tuple.

descriptors()[source]

Return all descriptors that occur in any of the given caches.

Warning

This may be slow, as the descriptor sets of all sub-caches need to be merged. If you know the caches all hold the same descriptors, collect the descriptors of a single cache instead: tuple_cache.caches[0].descriptors().

Return type

Iterable
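
A short sketch of both lookup variants (DictCache again assumed as the module's in-memory cache):

    from hybrid_learning.datasets.caching import CacheTuple, DictCache

    tuple_cache = CacheTuple(DictCache(), DictCache())
    tuple_cache.put("img_0", ("input", "target"))

    # Potentially slow: builds the union of all sub-caches' descriptor sets:
    keys = set(tuple_cache.descriptors())

    # Cheaper if all sub-caches are known to hold the same descriptors:
    keys = set(tuple_cache.caches[0].descriptors())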

load(descriptor)[source]

Load all objects stored under descriptor and return as tuple. Return None according to the setting of return_none_if.

Parameters

descriptor (Hashable) –

Return type

Optional[Tuple[Any, …]]

put(descriptor, obj)[source]

Put obj[i] into caches[i] under key descriptor.

caches: Tuple[Cache]

The tuple of caches to handle tuple values.

return_none_if: int

Mode by which to return None on load(). Possible modes:

  • 1 / 'any': Return None if any cache load returns None.

  • 0 / 'all': Return None only if all cache loads return None.

  • -1 / 'never': Never return None, but always a tuple (possibly holding only None values).

The string specifiers are mapped to the corresponding integer values for speed.
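
A sketch of how the modes differ on a partial miss, i.e. when only some sub-caches hold the descriptor (DictCache assumed as the in-memory cache):

    from hybrid_learning.datasets.caching import CacheTuple, DictCache

    first, second = DictCache(), DictCache()
    first.put("img_0", "input")  # only the first sub-cache holds the key

    # 'any' (default): a single missing value already yields None:
    assert CacheTuple(first, second).load("img_0") is None

    # 'all': only return None if every sub-cache misses:
    assert CacheTuple(first, second, return_none_if='all').load("img_0") == ("input", None)

    # 'never': always return a tuple, possibly containing None values:
    assert CacheTuple(first, second, return_none_if='never').load("img_0") == ("input", None)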