
Add possibility to use opt_einsum instead of pure Numpy #2

Draft · dtellenbach wants to merge 1 commit into master
Conversation

@dtellenbach (Author) commented Feb 15, 2022

One of the most challenging parts of contracting a tensor network is finding a good contraction ordering. Currently ncon just uses a simple default ordering or a user-provided one. opt_einsum is a highly optimized tensor network contractor that is capable of finding very good contraction sequences.

To make the ncon interface usable with a high-quality network contractor, this patch adds two new arguments to ncon (a short usage sketch follows the list):

  1. `backend`, a string that can be set to `"numpy"` or `"opt_einsum"`; setting it to `"opt_einsum"` enables the opt_einsum contractor. Defaults to `"numpy"`.
  2. `opt_einsum_strategy`, a string that selects the contraction path-finding strategy used by opt_einsum. Defaults to `"auto"`.
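
For illustration, here is a minimal sketch of how a call could look with the new arguments. It assumes the rest of the ncon call signature (tensor list plus index lists) stays as it is today, and that the strategy strings mirror opt_einsum's `optimize` options such as `"auto"` or `"greedy"`; treat it as a sketch rather than the final interface.

```python
import numpy as np
from ncon import ncon

A = np.random.rand(10, 10, 10)
B = np.random.rand(10, 10, 10)

# Existing behaviour: pure-Numpy contraction with ncon's default ordering.
result_np = ncon([A, B], [[-1, 1, 2], [2, 1, -2]])

# With this patch: let opt_einsum pick the contraction sequence.
# The `order` argument would be ignored here, as noted in the commit message.
result_oe = ncon([A, B], [[-1, 1, 2], [2, 1, -2]],
                 backend="opt_einsum", opt_einsum_strategy="auto")
```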

This patch makes it possible to use the highly optimized tensor network contractor
opt_einsum instead of pure Numpy. opt_einsum in particular optimizes contraction
orders, so the `order` argument is ignored when opt_einsum is used.
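
For context (not part of the patch), a small sketch of opt_einsum finding a contraction path on its own via its public `contract_path` function, which is why a user-supplied `order` becomes redundant:

```python
import numpy as np
import opt_einsum as oe

# A small chain of three matrices; opt_einsum searches for a good
# pairwise contraction order instead of taking one from the caller.
A = np.random.rand(8, 8)
B = np.random.rand(8, 8)
C = np.random.rand(8, 8)

path, info = oe.contract_path("ab,bc,cd->ad", A, B, C, optimize="auto")
print(path)  # the pairwise contraction order opt_einsum chose, e.g. [(0, 1), (0, 1)]
print(info)  # summary of the chosen path, including estimated cost
```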
@dtellenbach (Author)
Still a draft since I haven't updated the README yet.

@mhauru (Owner) commented Mar 2, 2022

Hi @dtellenbach! What's the advantage of opt_einsum over Numpy's own einsum with contraction order optimisation?

@dtellenbach (Author)

Hi @mhauru! Although many of opt_einsum's strategies for finding good contraction sequences have been upstreamed into Numpy's einsum, opt_einsum still offers newer and additional strategies.

Another big advantage of opt_einsum over einsum is that the contractors provided by opt_einsum are ready to be automatically differentiated using, e.g., autograd or JAX (see the sketch below). This is interesting, for example, in the context of optimizing tensor networks over Riemannian manifolds.
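
To illustrate that second point, a minimal sketch (not from this patch) that calls opt_einsum directly on JAX arrays; it assumes opt_einsum's backend dispatch picks up the JAX arrays, as it does for the backends it ships with:

```python
import jax
import jax.numpy as jnp
import opt_einsum as oe

A = jnp.ones((4, 4))
B = jnp.ones((4, 4))

def network_value(A, B):
    # Scalar value of a tiny two-tensor network; the contraction runs on
    # JAX arrays, so the whole expression stays differentiable.
    return oe.contract("ij,jk->", A, B, optimize="auto")

grad_A = jax.grad(network_value, argnums=0)(A, B)
print(grad_A.shape)  # (4, 4)
```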
