allennlp-optuna Versions

⚡️ AllenNLP plugin for adding subcommands to use Optuna, making hyperparameter optimization easy

v0.1.7

2 years ago

allennlp-optuna v0.1.7 is now available. We no longer support Python 3.6, so please upgrade if you are using Python 3.6 or earlier. :bow:

This version introduces one experimental option for the tune command, --skip-exception. If it is specified, the optimization won't stop even when an exception is raised in a trial. This feature is very experimental and may be removed in a later version.

allennlp tune test_fixtures/config/classifier.jsonnet test_fixtures/config/hparams.json --serialization-dir tmp --study-name test --skip-exception
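
For reference, the hparams.json file passed to tune defines the search space. The sketch below assumes the format described in the project README (a list of entries whose type selects the Optuna suggest method and whose attributes are passed to it); the parameter names and ranges here are only illustrative and are not the contents of the test fixture.

  [
    {
      "type": "int",
      "attributes": {
        "name": "embedding_dim",
        "low": 64,
        "high": 256
      }
    },
    {
      "type": "float",
      "attributes": {
        "name": "lr",
        "low": 5e-3,
        "high": 5e-1,
        "log": true
      }
    }
  ]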

Documentation

  • Add example with optimization metrics and direction (#43)
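
For example, an invocation that specifies the optimization metric and direction could look like the line below. The --metrics and --direction option names and the best_validation_accuracy metric are assumptions based on that documentation example, so check allennlp tune --help for the exact options.

  allennlp tune test_fixtures/config/classifier.jsonnet test_fixtures/config/hparams.json --serialization-dir tmp --study-name test --metrics best_validation_accuracy --direction maximize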

Bug

  • Allow users to specify pruner/sampler without attribute (#40)
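
Concretely, a sampler or pruner entry in the Optuna settings file (passed via --optuna-param-path) no longer needs an attributes block. A minimal sketch, assuming the configuration format shown in the project README:

  {
    "sampler": {
      "type": "TPESampler"
    },
    "pruner": {
      "type": "HyperbandPruner",
      "attributes": {
        "min_resource": 1,
        "reduction_factor": 5
      }
    }
  }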

Other

  • Update pyproject.toml to require py36 or newer (#52)
  • CI: Drop python36 and add python39 (#51)
  • tune: Add option to continue even when exception is raised (#50)
  • Update .gitignore to ignore invisible files (#42)

v0.1.6

3 years ago

allennlp-optuna v0.1.6 is out. This release mainly focuses on bug fixes.

Enhancement

  • Create a temporary config file to avoid configuration error in retraining (#37)
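
The retraining referred to here is the plugin's retrain subcommand, which trains a model with the best hyperparameters found by tune. A minimal sketch of its usage, assuming the command form shown in the project README (paths and study name are illustrative):

  allennlp retrain test_fixtures/config/classifier.jsonnet --serialization-dir retrain_output --study-name test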

Bug

  • Allow users to specify pruner/sampler without attribute (#40)

v0.1.5

3 years ago

allennlp-optuna v0.1.5 is released! This release only contains dependency upgrades for AllenNLP and Optuna.

Other

  • Upgrade dependency: AllenNLP (#34)

v0.1.4

3 years ago

Documentation

  • Readme/pruning (#21)
  • Fix naming (#22)
  • Update README for AllenNLP plugins (#23)
  • Add supported environments to README (#24)
  • Fix typo in allennlp plugins instruction of README (#26)
  • Remove poetry run commands (#27)
  • Remove poetry run on docs (#28)

Other

  • Bump/0.1.4 (#29)
  • Create LICENSE (#31)

v0.1.3

3 years ago

Bug

  • Specify include_package explicitly (#19)