Einops Versions

Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others)

v0.7.0

6 months ago

Major changes:

  • torch.compile just works: registration of operations happens automatically (see the sketch after this list)
  • JAX's distributed arrays can use ellipses, and in general ellipsis processing now preserves axis identity. This involved changing internal gears of einops.
  • Array API: einops operations can be used with any framework that follows the standard (see einops.array_api)
  • Python 3.7 is dead. Goodbye, you were great at the time
  • Gluon is dropped as previously announced
  • reduce/repeat/rearrange all accept lists now
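
Below is a minimal sketch of the torch.compile change, assuming PyTorch 2.x; the function name, pattern, and shapes are illustrative:

```python
import torch
from einops import rearrange

# With v0.7.0, einops operations are registered for torch.compile
# automatically, so no manual allow_ops_in_compiled_graph() call is needed.
@torch.compile
def space_to_depth(x):
    return rearrange(x, 'b c (h p1) (w p2) -> b (c p1 p2) h w', p1=2, p2=2)

y = space_to_depth(torch.randn(4, 3, 32, 32))  # -> shape (4, 12, 16, 16)
```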

PRs list

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.6.1...v0.7.0

v0.7.0rc2

8 months ago

What's Changed

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.7.0rc1...v0.7.0rc2

v0.7.0rc1

9 months ago

Major changes:

  • torch.compile just works, registration of operations happens automatically
  • JAX's distributed arrays can use ellipses, and in general ellipsis processing now preserves axis identity. This involved changing internal gears of einops.
  • Array API: einops operations can be used with any framework that follows the standard (see einops.array_api)
  • Python 3.7 is dead. Goodbye, you were great at the time
  • Gluon is dropped as previously announced
  • reduce/repeat/rearrange all accept lists now (see the sketch after this list)
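
A small sketch of the list-input change; the array shapes and the 'time' axis name are illustrative:

```python
import numpy as np
from einops import reduce

# A list of same-shaped arrays is treated as if stacked along a new
# leading axis (named 'time' here); reduce and repeat now accept lists
# just like rearrange does.
frames = [np.random.rand(64, 64) for _ in range(10)]
mean_frame = reduce(frames, 'time h w -> h w', 'mean')  # -> shape (64, 64)
```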

What's Changed

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.6.1...v0.7.0rc1

v0.6.2rc0

9 months ago

This pre-release is published to allow public testing of the new caching logic (pattern analysis now depends on input dimensionality to preserve axis identity).
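
A short illustration (with made-up shapes) of why the cache is keyed on input dimensionality: the same ellipsis pattern expands differently for inputs of different rank:

```python
import numpy as np
from einops import rearrange

x3 = np.zeros((2, 3, 4))
x5 = np.zeros((2, 3, 4, 5, 6))
# the ellipsis absorbs one axis for a 3D input, three axes for a 5D input
rearrange(x3, 'b ... c -> b c ...').shape  # (2, 4, 3)
rearrange(x5, 'b ... c -> b c ...').shape  # (2, 6, 3, 4, 5)
```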

What's Changed

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.6.1...v0.6.2rc0

v0.6.1

1 year ago
  • einops layers work seamlessly with torch.compile
  • einops operations need registration: run einops._torch_specific.allow_ops_in_compiled_graph() before torch.compile (see the sketch after this list)
  • paddle is now supported (thanks to @zhouwei25)
  • as previously announced, support for mxnet is dropped
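
A minimal sketch of the registration step, assuming PyTorch 2.x; the pattern and shapes are illustrative:

```python
import torch
from einops import rearrange
from einops._torch_specific import allow_ops_in_compiled_graph

# One-time registration so einops functions do not cause graph breaks
# when used inside a compiled function.
allow_ops_in_compiled_graph()

@torch.compile
def to_patches(x):
    return rearrange(x, 'b c (h p1) (w p2) -> b (h w) (p1 p2 c)', p1=8, p2=8)

patches = to_patches(torch.randn(2, 3, 32, 32))  # -> shape (2, 16, 192)
```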

What's Changed

New Contributors

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.6.0...v0.6.1

v0.6.0

1 year ago

What's Changed

New Contributors

Announcement

Sunsetting experimental mxnet support: there is no demand, and the package is outdated, with numerous deprecations and poor support for corner cases. 0.6.0 will be the last release with the mxnet backend.

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.5.0...v0.6.0

v0.5.0

1 year ago

What's Changed

New Contributors

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.4.1...v0.5.0

v0.4.1

2 years ago

What's Changed

New Contributors

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.4.0...v0.4.1

v0.4.0

2 years ago

Main Changes

  • torch.jit.script is supported (in addition to the previously supported torch.jit.trace)
  • EinMix (a Swiss-army knife for next-gen MLPs) is added: a much-improved einsum/linear layer (see the sketch after this list)
  • einops.repeat in torch does not create a copy when possible
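
A hedged sketch of EinMix used as a token-wise linear projection; the axis names and sizes are illustrative:

```python
import torch
from einops.layers.torch import EinMix

# weight_shape/bias_shape declare the learned tensors; the pattern says
# how they mix the input axes (here: project c_in -> c_out per token).
mix = EinMix('b t c_in -> b t c_out',
             weight_shape='c_in c_out', bias_shape='c_out',
             c_in=256, c_out=512)
y = mix(torch.randn(8, 100, 256))  # -> shape (8, 100, 512)
```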

Detailed PRs

New Contributors

Full Changelog: https://github.com/arogozhnikov/einops/compare/v0.3.2...v0.4.0

v0.3.2

2 years ago
  • documentation and domain (#75, #76, #77, #79, #81), thanks to @cgarciae
  • typos and spellcheck (thanks to @ollema and @GarrettMooney)
  • moved away from keras to tf.keras
  • adjustments to tutorials and testing
  • other minor improvements