BatchFlow Versions

BatchFlow helps you conveniently work with random or sequential batches of your data and define data processing and machine learning workflows even for datasets that do not fit into memory.

0.8.7

9 months ago

Update package versions

0.8.6

9 months ago

Added Normalizer and Quantizer classes for convenience
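
A minimal usage sketch, assuming a scikit-learn-style fit/transform interface; the import path, the method names, and the n_bins parameter below are assumptions, not the documented API:

    import numpy as np
    from batchflow import Normalizer, Quantizer  # import path is an assumption

    data = np.random.rand(100, 3)

    # Assumed interface: fit statistics on the data, then transform it
    normalizer = Normalizer()
    normalized = normalizer.fit_transform(data)   # sklearn-style method is an assumption

    # Assumed interface: map continuous values into a fixed number of bins
    quantizer = Quantizer(n_bins=16)              # 'n_bins' is a hypothetical parameter
    quantized = quantizer.fit_transform(data)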

0.8.5

10 months ago

Minor fixes

0.8.4a

1 year ago

Fix both versions to be the same

0.8.3

1 year ago

Fix model decay behaviour

0.8.2a

1 year ago

Lots of incremental improvements. The main changes are:

  • fix a memory leak in Pipeline.run with prefetch turned on
  • make Profiler much faster, which allows it to be turned on by default
  • make TorchModel work with outputs on the fly
  • fix the release action

0.8.1

1 year ago

Fix a small bug related to padding in ResBlock

0.8.0

1 year ago

This release fixes the crop behavior of TorchModel and adds new blocks and methods:

  • InternBlock with deformable convolutions
  • a separate BottleneckBlock that extends the functionality of ResBlock
  • a method for getting a reference to the current TorchModel instance inside train/predict contexts
  • a mode parameter for the train and predict methods to control nn.Module behavior (illustrated below)
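
For context, this is the plain PyTorch switching that the mode parameter presumably controls (generic PyTorch behavior, not BatchFlow code; how BatchFlow maps mode values onto it is an assumption):

    import torch
    from torch import nn

    module = nn.Dropout(p=0.5)
    x = torch.ones(4)

    module.train()          # training mode: dropout randomly zeroes activations
    out_train = module(x)   # some entries zeroed, the rest scaled by 2

    module.eval()           # evaluation mode: dropout becomes a no-op
    out_eval = module(x)    # identical to x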

Also, this is the first version released after NumPy deprecated the automatic cast of ragged (inconsistently shaped) arrays to dtype=object, so the affected places are fixed accordingly.
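
For example, building an array from differently shaped items now requires an explicit dtype (this is generic NumPy behavior, not BatchFlow-specific):

    import numpy as np

    # Items of different shapes: NumPy no longer silently casts these to dtype=object
    images = [np.zeros((2, 2)), np.zeros((3, 3))]

    batch = np.array(images, dtype=object)   # explicit dtype=object avoids the error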

0.7.7

1 year ago

This release fixes one small TorchModel bug.

0.7.6

1 year ago

This release changes the way Batch.apply_parallel works: it now accepts both init and post functions and should be the preferable way to decorate batch methods (by marking them with decorators.apply_parallel). A sketch of the decorator usage follows below.
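
A minimal sketch, assuming the decorated method processes a single batch item and that the init/post functions mentioned above control how items are dispatched and reassembled; the bare decorator form and these defaults are assumptions:

    from batchflow import Batch
    from batchflow.decorators import apply_parallel

    class ImagesBatch(Batch):
        # Assumed semantics: the method runs once per batch item, with 'init'
        # and 'post' hooks (defaults assumed here) dispatching the items and
        # reassembling the results into the batch.
        @apply_parallel
        def invert(self, item):
            return 255 - item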

Other than that, there are a few new building blocks for TorchModel, a parameter to pad the last microbatch to the full microbatch_size, and small bug fixes.