CLU lets you write beautiful training loops in JAX.
## v0.0.12

- Moved `jax.tree_map` (deprecated since JAX 0.4.26) to `jax.tree_util.tree_map`.

Full Changelog: https://github.com/google/CommonLoopUtils/compare/v0.0.11...v0.0.12
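For users upgrading, the change above is a call-site rename: `jax.tree_map(f, tree)` becomes `jax.tree_util.tree_map(f, tree)`. As a rough sketch of what `tree_map` does - apply a function to every leaf of a nested container - here is a plain-Python illustration (not JAX's implementation, which supports arbitrary registered pytree nodes):

```python
def tree_map(fn, tree):
    """Illustrative sketch of jax.tree_util.tree_map semantics.

    Only recurses into built-in dicts, lists, and tuples; real JAX
    handles any registered pytree node type.
    """
    if isinstance(tree, dict):
        return {k: tree_map(fn, v) for k, v in tree.items()}
    if isinstance(tree, (list, tuple)):
        return type(tree)(tree_map(fn, v) for v in tree)
    return fn(tree)  # a leaf: apply the function

# Double every parameter in a (toy) nested parameter tree.
params = {"dense": {"kernel": [1.0, 2.0], "bias": [0.5]}}
doubled = tree_map(lambda x: x * 2, params)
```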
## v0.0.11

- Changes to `parameter_overview`.
- Made `clu.metrics.Std` support the same shapes as `clu.metrics.Average`.

Full Changelog: https://github.com/google/CommonLoopUtils/compare/v0.0.10...v0.0.11
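A `Std` metric follows the same accumulate-and-merge pattern as `Average`. As an illustration of the underlying math only - a hypothetical minimal sketch, not the clu implementation (which is built on flax structs with `from_model_output`/`merge`/`compute`) - the `from_values` helper below is made up for the example:

```python
import math

class Average:
    """Sketch of a streaming mean: accumulate a sum and a count."""
    def __init__(self, total=0.0, count=0):
        self.total, self.count = total, count

    @classmethod
    def from_values(cls, values):
        return cls(sum(values), len(values))

    def merge(self, other):
        # Merging two partial metrics just adds their statistics.
        return Average(self.total + other.total, self.count + other.count)

    def compute(self):
        return self.total / self.count

class Std:
    """Sketch of a streaming standard deviation via sum and sum of squares."""
    def __init__(self, total=0.0, sum_sq=0.0, count=0):
        self.total, self.sum_sq, self.count = total, sum_sq, count

    @classmethod
    def from_values(cls, values):
        return cls(sum(values), sum(v * v for v in values), len(values))

    def merge(self, other):
        return Std(self.total + other.total, self.sum_sq + other.sum_sq,
                   self.count + other.count)

    def compute(self):
        # std = sqrt(E[x^2] - E[x]^2)
        mean = self.total / self.count
        return math.sqrt(self.sum_sq / self.count - mean * mean)

# Two "batches" merged into one metric over [1, 2, 3, 4].
m = Std.from_values([1.0, 2.0]).merge(Std.from_values([3.0, 4.0]))
```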
## v0.0.10

This release requires Python>=3.10.

- `clu.parameter_overview` now supports JAX global arrays.
- Changes to the `clu.metrics` module.

Full Changelog: https://github.com/google/CommonLoopUtils/compare/v0.0.9...v0.0.10
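`clu.parameter_overview` summarizes a model's parameters as a table of names, shapes, and sizes. A toy sketch of the idea, using plain shape tuples instead of JAX (global) arrays - the function name and formatting here are illustrative, not clu's actual API or output:

```python
def parameter_overview(params):
    """Sketch of a parameter-overview table.

    `params` maps parameter names to shape tuples. The real
    clu.parameter_overview works on trees of JAX arrays (including
    global/sharded arrays as of CLU v0.0.10).
    """
    lines, total = [], 0
    for name, shape in sorted(params.items()):
        size = 1
        for dim in shape:
            size *= dim  # number of elements in this parameter
        total += size
        lines.append(f"{name:<20} {str(shape):<12} {size}")
    lines.append(f"Total: {total}")
    return "\n".join(lines)

overview = parameter_overview({"dense/kernel": (128, 10), "dense/bias": (10,)})
```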
## v0.0.9

Last release before dropping support for Python 3.7 and 3.8.

Full Changelog: https://github.com/google/CommonLoopUtils/compare/v0.0.8...v0.0.9
## v0.0.8

- Moved `clu.internal.asynclib` to `clu.asynclib`.
- Changes to `MetricWriter`.
- Added `clu.values` to annotate arrays with a modality.
- Added `clu.data.DatasetIterator` - a generic interface between input pipelines and training loops.
- Changes to `clu.metrics`.

This will be the last release supporting Python 3.7.
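The point of `clu.data.DatasetIterator` is to decouple the training loop from the input pipeline behind one small interface. A hypothetical in-memory sketch of that idea - the method names `get_next`/`reset` are illustrative; see the `clu.data` module for the actual contract, which also covers checkpointing:

```python
from typing import Any, Dict, List

class DatasetIterator:
    """Sketch of a generic iterator contract between an input pipeline
    and a training loop: the loop only ever calls get_next()/reset(),
    regardless of what produces the batches."""

    def __init__(self, batches: List[Dict[str, Any]]):
        self._batches = list(batches)
        self._pos = 0

    def get_next(self) -> Dict[str, Any]:
        """Return the next batch as a dict of arrays (lists here)."""
        batch = self._batches[self._pos]
        self._pos += 1
        return batch

    def reset(self) -> None:
        """Restart the iterator from the beginning."""
        self._pos = 0

it = DatasetIterator([{"x": [1, 2]}, {"x": [3, 4]}])
first = it.get_next()
it.reset()
again = it.get_next()
```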
- Made `deterministic_data` work with `tfds>4.4.0` and `tfds<=4.4.0`.
- Do not fail if `profiler.start()` raises an exception.
- Made `periodic_actions.ProgressUpdate` show the total number of steps.
- Made `AsyncWriter` non-blocking with respect to JAX async computations.
- Added the `clu_synopsis.ipynb` Colab as initial documentation.
- Made `PreprocessFn` addable.

Changes to the following modules:

- `deterministic_data`
- `metric_writers`
- `metrics`: `Metric.from_output()`, `Metric.from_fun()`, `Collections.unreplicate()`, `Collections.create()`.
- `periodic_actions`
- `preprocess_spec`: `preprocess_spec.get_all_ops()`.
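`Metric.from_fun()` in `clu.metrics` builds a metric from a function applied to model outputs. A plain-Python sketch of that factory pattern - the `metric_from_fun` helper, its signature, and the `Average` class here are hypothetical stand-ins, not clu's API:

```python
class Average:
    """Minimal averaging metric used by the sketch below (not clu's)."""
    def __init__(self, total, count):
        self.total, self.count = total, count

    def compute(self):
        return self.total / self.count

def metric_from_fun(fun):
    """Sketch of the from_fun idea: given a function from model outputs
    to per-example values, return a constructor that averages them."""
    def from_model_output(**outputs):
        values = fun(**outputs)
        return Average(sum(values), len(values))
    return from_model_output

# Hypothetical usage: an accuracy metric derived from a plain function.
accuracy = metric_from_fun(
    lambda logits, labels: [1.0 if p == y else 0.0
                            for p, y in zip(logits, labels)])
m = accuracy(logits=[0, 1, 1], labels=[0, 1, 0])
```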
Documentation: `clu_synopsis.ipynb` Colab.

Changelog:

- `metric_writers`: Lets `SummaryWriter` write nested dictionaries.
- `internal`: Adds `async.Pool`.
- `preprocess_spec`: Support nested dictionaries.
- `profile`: Use JAX profiler APIs instead of TF profiler APIs.
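Writing nested dictionaries with a summary writer amounts to flattening them into `outer/inner` scalar tags. A sketch of that flattening step (illustrative only; clu's `SummaryWriter` handles this internally):

```python
def flatten(d, parent_key="", sep="/"):
    """Flatten a nested dict of scalars into {"outer/inner": value} form,
    the shape a scalar summary writer expects for its tag names."""
    out = {}
    for k, v in d.items():
        key = f"{parent_key}{sep}{k}" if parent_key else k
        if isinstance(v, dict):
            out.update(flatten(v, key, sep))  # recurse into sub-dicts
        else:
            out[key] = v
    return out

scalars = flatten({"train": {"loss": 0.3, "acc": 0.9}, "lr": 1e-3})
```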