BatchFlow helps you conveniently work with random or sequential batches of your data and define data processing and machine learning workflows even for datasets that do not fit into memory.
- Update package versions
- Added `Normalizer` and `Quantizer` classes for convenience
- Minor fixes
- Fix both versions to be the same
- Fix model decay behaviour
Lots of incremental improvements. Main features are:
- `Pipeline.run` with prefetch turned on
- `Profiler` is a lot faster, which allows turning it on by default
- `TorchModel` can work with `outputs` on the fly
- Fix a small bug related to padding in `ResBlock`
This release fixes `crop` behavior of `TorchModel`, as well as adds new blocks and methods:
- `InternBlock` with deformable convolutions
- `BottleneckBlock` that extends the functionality of `ResBlock`
- `TorchModel` instance inside `train`/`predict` contexts
- `mode` parameter for `train` and `predict` methods to control `nn.Module` behavior

Also, this is the first version after the `numpy` deprecation of autocasting misshaped arrays to `dtype=object`, so this is fixed in some places.
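For context, recent `numpy` versions (1.24+) raise an error instead of silently autocasting ragged nested sequences to `dtype=object`; passing `dtype=object` explicitly opts back into the old behavior. A minimal illustration (plain NumPy, not BatchFlow code):

```python
import numpy as np

# Rows of different lengths: since NumPy 1.24, np.array(rows) raises
# ValueError instead of silently producing an object array.
rows = [[1, 2, 3], [4, 5]]

try:
    arr = np.array(rows)                    # inhomogeneous shape -> error
except ValueError:
    arr = np.array(rows, dtype=object)      # explicit opt-in keeps old behavior

print(arr.dtype)   # object
print(arr.shape)   # (2,)
```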
This release fixes one small `TorchModel` bug.
This release changes the way `Batch.apply_parallel` works: now it accepts both `init` and `post` functions, and should be the preferable way to decorate batch methods (by marking them with `decorators.apply_parallel`).
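To illustrate the `init`/`post` pattern named in the release note, here is a simplified stand-in decorator (not the actual `decorators.apply_parallel` implementation): `init` splits the batch into items, the decorated method handles one item, and `post` assembles the per-item results back into a batch.

```python
from functools import wraps

def apply_parallel(init, post):
    """Simplified stand-in for an init/post-style decorator:
    `init` splits the batch into items, the wrapped method processes
    one item, and `post` gathers the per-item results."""
    def decorator(method):
        @wraps(method)
        def wrapper(self, *args, **kwargs):
            items = init(self)                         # split batch into items
            results = [method(self, item, *args, **kwargs) for item in items]
            return post(self, results)                 # assemble results
        return wrapper
    return decorator

class Batch:
    def __init__(self, data):
        self.data = data

    @apply_parallel(init=lambda self: self.data,
                    post=lambda self, results: Batch(results))
    def double(self, item):
        return item * 2

batch = Batch([1, 2, 3]).double()
print(batch.data)  # [2, 4, 6]
```

In BatchFlow itself the per-item calls can run in parallel workers; the sketch above keeps them sequential to show only the init/method/post contract.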
Other than that, there are a few new building blocks for `TorchModel`, a parameter to `pad` the last microbatches to full `microbatch_size`, and small bug fixes.
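As a rough sketch of what padding the last microbatch might mean (the `pad` and `microbatch_size` names come from the release note; the helper below is illustrative, not BatchFlow's implementation): when the batch length is not divisible by `microbatch_size`, the final chunk is padded by repeating its last item up to the full size.

```python
def microbatches(items, microbatch_size, pad=False):
    """Yield consecutive microbatches of `items`; optionally pad the last,
    incomplete one to full microbatch_size by repeating its last item.
    Illustrative sketch, not BatchFlow's actual implementation."""
    for start in range(0, len(items), microbatch_size):
        chunk = items[start:start + microbatch_size]
        if pad and len(chunk) < microbatch_size:
            chunk = chunk + [chunk[-1]] * (microbatch_size - len(chunk))
        yield chunk

print(list(microbatches([1, 2, 3, 4, 5], 2)))            # [[1, 2], [3, 4], [5]]
print(list(microbatches([1, 2, 3, 4, 5], 2, pad=True)))  # [[1, 2], [3, 4], [5, 5]]
```

Padding keeps every microbatch the same shape, which simplifies stacking tensors for the model; the padded items' outputs are simply discarded afterwards.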