opt_einsum.paths.DynamicProgramming

class opt_einsum.paths.DynamicProgramming(minimize='flops', cost_cap=True, search_outer=False)[source]

Finds the optimal path of pairwise contractions without intermediate outer products, based on a dynamic programming approach presented in Phys. Rev. E 90, 033315 (2014) (the corresponding preprint is publicly available at https://arxiv.org/abs/1304.6112). This method is especially well-suited to tensor network states, where it usually outperforms all the other optimization strategies.

This algorithm scales exponentially with the number of inputs in the worst case (see example below). If the graph to be contracted consists of disconnected subgraphs, the algorithm scales linearly in the number of disconnected subgraphs and exponentially only in the number of inputs per subgraph.
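
As a sketch of typical usage (assuming opt_einsum is installed), the string `'dp'` is the registered shortcut that routes `contract_path` through this optimizer:

```python
import numpy as np
import opt_einsum as oe

# Three-tensor chain contraction; optimize='dp' selects the
# DynamicProgramming path optimizer.
a = np.random.rand(4, 8)
b = np.random.rand(8, 16)
c = np.random.rand(16, 2)

path, info = oe.contract_path('ij,jk,kl->il', a, b, c, optimize='dp')
print(path)   # a list of pairwise contractions, e.g. [(1, 2), (0, 1)]
```

Each tuple in `path` names the positions of the two tensors contracted at that step, with the intermediate appended to the remaining tensor list.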

Parameters
  • minimize ({‘flops’, ‘size’}, optional) – Whether to find the contraction that minimizes the number of operations or the size of the largest intermediate tensor.

  • cost_cap ({True, False, int}, optional) –

    How to implement cost-capping:

    • True - iteratively increase the cost-cap

    • False - implement no cost-cap at all

    • int - use explicit cost cap

  • search_outer (bool, optional) – In rare circumstances the optimal contraction may involve an outer product. This option allows searching for such contractions, but may considerably slow down path finding on all but very small graphs.
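
To use non-default options, an instance can be constructed explicitly and passed as the `optimize` argument of `contract_path`; a minimal sketch (the equation and shapes here are made up for illustration):

```python
import numpy as np
import opt_einsum as oe

# Construct the optimizer explicitly to minimize the size of the
# largest intermediate tensor rather than the flop count.
opt = oe.DynamicProgramming(minimize='size', cost_cap=True)

arrays = [np.ones((8, 8)) for _ in range(4)]
path, info = oe.contract_path('ab,bc,cd,de->ae', *arrays, optimize=opt)
print(path)
```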

__init__(minimize='flops', cost_cap=True, search_outer=False)[source]

Initialize self. See help(type(self)) for accurate signature.

Methods

__call__(inputs, output, size_dict[, …])

Parameters
  • inputs (list) – List of sets, each holding the index labels of one input tensor (the left-hand side of the einsum subscript)
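
The optimizer can also be called directly, in the same way `contract_path` invokes it internally; the index sets and `size_dict` below are made-up illustration values:

```python
from opt_einsum.paths import DynamicProgramming

# Index sets for three tensors contracted as 'ij,jk,kl->il';
# size_dict maps each index label to its dimension.
inputs = [set('ij'), set('jk'), set('kl')]
output = set('il')
size_dict = {'i': 4, 'j': 8, 'k': 16, 'l': 2}

opt = DynamicProgramming(minimize='flops')
path = opt(inputs, output, size_dict)
print(path)   # pairwise contraction order over the remaining tensor list
```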

__init__([minimize, cost_cap, search_outer])

Initialize self.

_check_args_against_first_call(inputs, …)

Utility that stateful optimizers can use to ensure they are not called with different contractions across separate runs.
