Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)
Major changes:
- torch.compile just works; registration of operations happens automatically
- einops operations can be used with any framework that follows the standard (see einops.array_api)

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.6.1...v0.7.0
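The framework-agnostic point above can be sketched in plain numpy: an einops pattern such as `b h w c -> b c (h w)` (an illustrative pattern, not one from the release notes) reduces to the transpose and reshape primitives that any framework following the array API standard provides, which is what lets einops.array_api target them uniformly.

```python
import numpy as np

# Illustrative sketch: what rearrange(x, "b h w c -> b c (h w)") denotes,
# expressed with the two primitives einops compiles patterns down to.
x = np.zeros((2, 3, 4, 5))                      # b=2, h=3, w=4, c=5
y = x.transpose(0, 3, 1, 2).reshape(2, 5, 3 * 4)  # move c forward, merge h and w
assert y.shape == (2, 5, 12)
```

Any array library exposing the same transpose/reshape semantics can therefore execute the same pattern.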
Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.7.0rc1...v0.7.0rc2
Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.6.1...v0.7.0rc1
Pre-release published to allow public testing of the new caching logic (pattern analysis now depends on input dimensionality, to preserve axis identity).
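A minimal sketch of what dimensionality-aware caching could look like, using `functools.lru_cache` keyed on (pattern, ndim); the function name and "plan" representation are hypothetical illustrations, not einops internals:

```python
from functools import lru_cache

@lru_cache(maxsize=1024)
def analyze(pattern: str, ndim: int):
    # Hypothetical stand-in for pattern analysis: caching on (pattern, ndim)
    # means an ellipsis pattern applied to a 3D input and a 4D input gets
    # two separate cache entries instead of sharing one plan.
    return f"plan for {pattern!r} with {ndim} input axes"

a = analyze("... h w -> ... (h w)", 3)
b = analyze("... h w -> ... (h w)", 4)
assert a != b                                   # different ndim, different plan
assert analyze("... h w -> ... (h w)", 3) is a  # same ndim, cache hit
```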
Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.6.1...v0.6.2rc0
Call einops._torch_specific.allow_ops_in_compiled_graph() before torch.compile.
- allow_ops_in_compiled_graph to support torch.compile by @arogozhnikov in https://github.com/arogozhnikov/einops/pull/251
Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.6.0...v0.6.1
Sunsetting experimental mxnet support: there is no demand, and the package is outdated, with numerous deprecations and poor support for corner cases. 0.6.0 will be the last release with the mxnet backend.
Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.5.0...v0.6.0
Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.4.1...v0.5.0
Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.4.0...v0.4.1
Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.3.2...v0.4.0