⚡️Optimizing einsum functions in NumPy, Tensorflow, Dask, and more with contraction order optimization.
Bug fixes:
- README.md
- versioneer.py is in MANIFEST.in for a clean pip install.
opt_einsum continues to improve its support for backends beyond NumPy, now including PyTorch.
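As a minimal sketch of backend-agnostic use (assuming opt_einsum is installed, and PyTorch for the commented line), a contraction expression can be built once from shapes and then evaluated with different array libraries:

```python
import numpy as np
import opt_einsum as oe

# Build a reusable expression from shapes; the contraction path is
# found once and cached inside the expression object.
expr = oe.contract_expression("ij,jk->ik", (5, 4), (4, 3))

x = np.random.rand(5, 4)
y = np.random.rand(4, 3)
out = expr(x, y)  # evaluated with the NumPy backend by default

# With PyTorch installed, the same expression accepts torch tensors:
# out_t = expr(torch.rand(5, 4), torch.rand(4, 3), backend="torch")
```

The shapes and subscripts here are illustrative; the point is that the planned path is reused across backends.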
We have also published the opt_einsum package in the Journal of Open Source Software. If you use this package in your work, please consider citing us!
opt_einsum is a powerful tensor contraction order optimizer for NumPy and related ecosystems.
The greedy order optimization algorithm has been tuned to handle hundreds of tensors in several seconds.
Official 1.0 release.
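As a hedged illustration of the greedy strategy (shown here through NumPy's einsum_path, which exposes a greedy path finder), the chosen pairwise contraction order for a chain of tensors can be inspected without performing the contraction:

```python
import numpy as np

# Five chained matrices; the greedy strategy picks cheap pairwise
# contractions one at a time instead of searching every ordering.
ops = [np.random.rand(8, 8) for _ in range(5)]
path, report = np.einsum_path("ab,bc,cd,de,ef->af", *ops, optimize="greedy")
print(path)  # a list starting with 'einsum_path', followed by pairwise steps
```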
Einsum is a very powerful function for contracting tensors of arbitrary dimension and index. However, it is only optimized to contract two terms at a time, resulting in non-optimal scaling for contractions with many terms. opt_einsum aims to fix this by optimizing the contraction order, which can lead to arbitrarily large speedups at the cost of additional intermediate tensors.
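To make the cost argument concrete, here is a small worked example (the shapes are illustrative choices, not from the release notes): the two parenthesizations of a three-matrix chain give the same result at very different multiplication counts.

```python
import numpy as np

A = np.random.rand(10, 1000)
B = np.random.rand(1000, 20)
C = np.random.rand(20, 1000)

# Scalar-multiplication counts for the two possible contraction orders:
left  = 10 * 1000 * 20 + 10 * 20 * 1000      # (AB)C -> 400,000
right = 1000 * 20 * 1000 + 10 * 1000 * 1000  # A(BC) -> 30,000,000
assert left < right

# Both orders produce the same values; only the cost differs.
r1 = (A @ B) @ C
r2 = A @ (B @ C)
```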
opt_einsum is also built into the np.einsum function as of NumPy v1.12.
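For example (with illustrative shapes), the built-in optimization is switched on with the optimize keyword and returns the same values as the unoptimized call:

```python
import numpy as np

a = np.random.rand(6, 90)
b = np.random.rand(90, 6)
c = np.random.rand(6, 90)

naive = np.einsum("ij,jk,kl->il", a, b, c)                 # unoptimized contraction
fast  = np.einsum("ij,jk,kl->il", a, b, c, optimize=True)  # path-optimized
```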
A large step towards a full 1.0 release. BLAS usage is now automatically applied to all operations. Future releases will be more careful with regard to views and needless data copying.
Adds Python 3 support in addition to installation through a setup.py command.