PyTorch Toolbelt Versions

PyTorch extensions for fast R&D prototyping and Kaggle farming

0.4.2

3 years ago

Breaking Changes

  • Bump up minimal PyTorch version to 1.7.1

New features

  • New dataset classes ClassificationDataset and SegmentationDataset for everyday use in Kaggle competitions
  • New losses: FocalCosineLoss, BiTemperedLogisticLoss, SoftF1Loss
  • Support for new activations in get_activation_block (Silu, Softplus, Gelu)
  • More encoders from the timm package: NFNets, NFRegNet, HRNet, DPN
  • RocAucMetricCallback for Catalyst
  • MultilabelAccuracyCallback and AccuracyCallback with DDP support
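For context, a soft F1 loss makes the F1 score differentiable by replacing hard TP/FP/FN counts with their probabilistic counterparts. A minimal plain-Python sketch of the idea for the binary case (the function name and details here are illustrative; SoftF1Loss itself operates on tensors and may differ in reduction and epsilon handling):

```python
# Sketch of a soft (differentiable) F1 loss for binary targets.
# Illustrative only; not the library's implementation.

def soft_f1_loss(probs, targets, eps=1e-6):
    """probs: predicted probabilities in [0, 1]; targets: 0/1 labels."""
    tp = sum(p * t for p, t in zip(probs, targets))        # soft true positives
    fp = sum(p * (1 - t) for p, t in zip(probs, targets))  # soft false positives
    fn = sum((1 - p) * t for p, t in zip(probs, targets))  # soft false negatives
    soft_f1 = 2 * tp / (2 * tp + fp + fn + eps)
    return 1.0 - soft_f1  # minimizing the loss maximizes F1
```

Perfect predictions drive the loss toward 0, and every term is differentiable with respect to the probabilities, which is what enables direct optimization of F1.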

Bugfixes

  • Fixed invalid prefix in the Catalyst registry (from tbt to tbt.)

0.4.1

3 years ago

New features

  • Added Soft-F1 loss for direct optimization of F1 score (Binary case only)
  • Fully reworked the TTA module for inference (backward compatibility kept where possible).
  • Added support of ignore_index to Dice & Jaccard losses.
  • Improved Lovasz loss to work in fp16 mode.
  • Added option to override selected params in make_n_channel_input.
  • More encoders from the timm package.
  • FPNFuse module now works on 2D, 3D and N-D inputs.
  • Added Global K-Max 2D pooling block.
  • Added Generalized mean pooling 2D block.
  • Added softmax_over_dim_X, argmax_over_dim_X shorthand functions for use in metrics to get soft/hard labels without using lambda functions.
  • Added helper visualization functions to add a fancy header to an image and stack images of different sizes.
  • Improved rendering of confusion matrix.
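The softmax/argmax shorthands above exist so metrics can obtain soft or hard labels without ad-hoc lambda functions. The underlying operations, sketched in plain Python for a single row of logits (the library versions act on tensors along a chosen dimension):

```python
import math

def softmax(logits):
    """Numerically stable softmax for one row of logits (soft labels)."""
    m = max(logits)  # subtract the max to avoid exp overflow
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def argmax(logits):
    """Index of the largest logit (the hard label)."""
    return logits.index(max(logits))
```

Passing these helpers (or their tensor equivalents) to a metric keeps the metric code free of inline lambdas.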

Catalyst goodies

  • Encoders & Losses are available in Catalyst registry
  • StopIfNanCallback
  • Added OutputDistributionCallback to log the distribution of predictions to TensorBoard.
  • Added UMAPCallback to visualize embedding space using UMAP in TensorBoard.

Breaking Changes

  • Renamed CudaTileMerger to TileMerger, which now allows specifying the target device explicitly.
  • tensor_from_rgb_image removed in favor of image_to_tensor.
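For context, tiled inference slices a large image into overlapping tiles, runs the model on each tile, and a merger like TileMerger accumulates the predictions back into a full-size output (now on any device, not only CUDA). A rough sketch of how overlapping tile origins can be computed along one axis, assuming a simple stride-based layout (the library's slicer additionally handles padding and blending weights):

```python
def tile_origins(size, tile, overlap):
    """Top-left offsets of overlapping 1-D tiles covering `size` pixels.

    Illustrative helper, not the library's API.
    """
    stride = tile - overlap
    origins = list(range(0, max(size - tile, 0) + 1, stride))
    # Ensure the last tile touches the far edge of the image.
    if origins[-1] + tile < size:
        origins.append(size - tile)
    return origins
```

Applying this independently to height and width gives the grid of tiles; overlapping regions are averaged (or weighted) when merging predictions back.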

Bug fixes & Improvements

  • Improve numeric stability of focal_loss_with_logits when reduction="sum"
  • Prevent NaN in FocalLoss when all elements are equal to ignore_index value.
  • A LOT of type hints.

0.4.0

3 years ago

Fixes

  • Fixed incorrect default value for ignore_index in SoftCrossEntropyLoss

Breaking changes

  • All catalyst-related utils updated to be compatible with Catalyst 20.8.2
  • Remove PIL package dependency

Improvements

  • More comments, more type hints

0.3.2

4 years ago

New features

  • Many helpful callbacks for the Catalyst library: HyperParameterCallback and LossAdapter, to name a few.
  • New losses for deep model supervision (helpful when the target and output masks differ in size)
  • Stacked Hourglass encoder
  • Context Aggregation Network decoder

Breaking Changes

  • ABN module will now resolve as nn.Sequential(BatchNorm2d, Activation) instead of a hand-crafted module. This enables easier conversion of batch normalization modules to nn.SyncBatchNorm.

  • Almost every Encoder/Decoder implementation has been refactored for better clarity and flexibility. Please double-check your pipelines.

Important bugfixes

  • Improved numerical stability of Dice / Jaccard losses (Using log_sigmoid() + exp() instead of plain sigmoid() )
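The stability trick here: computing sigmoid as exp(log_sigmoid(x)) keeps the intermediate value in log space, so extreme logits neither overflow nor collapse to exactly 0/1 prematurely inside the loss. A plain-Python illustration of the idea (this mirrors the technique, not the exact library code):

```python
import math

def log_sigmoid(x):
    """log(sigmoid(x)) computed without overflow for any x."""
    # log(1 / (1 + exp(-x))) = -log1p(exp(-x)), safe for x >= 0.
    if x >= 0:
        return -math.log1p(math.exp(-x))
    # For very negative x, exp(-x) would overflow; rewrite instead as
    # x - log1p(exp(x)), where exp(x) is tiny and safe.
    return x - math.log1p(math.exp(x))

def stable_sigmoid(x):
    """sigmoid(x) via the log-space formulation."""
    return math.exp(log_sigmoid(x))
```

A naive `1 / (1 + exp(-x))` raises an overflow for x around -1000, while the log-space form returns the correct (denormal or zero) probability and an exact log-probability.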

Other

  • Lots of comments for functions and modules
  • Code cleanup, thanks to DeepSource
  • Type annotations for modules and functions
  • Updated README

0.3.1

4 years ago

Fixes

  • Fixed a bug in the computation of the IoU metric in the binary_dice_iou_score function
  • Fixed incorrect default value in SoftCrossEntropyLoss #38
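For reference, Dice and IoU on binary masks are tied by the identity dice = 2·iou / (1 + iou), which makes a handy sanity check for functions like binary_dice_iou_score. A plain-Python check on sets of positive pixel indices (the helper names below are hypothetical, not the library's API):

```python
def iou_score(pred, target):
    """IoU (Jaccard) of two sets of positive pixel indices."""
    union = len(pred | target)
    return len(pred & target) / union if union else 1.0

def dice_score(pred, target):
    """Dice coefficient of two sets of positive pixel indices."""
    total = len(pred) + len(target)
    return 2 * len(pred & target) / total if total else 1.0
```

Verifying the identity on a few mask pairs is a quick way to catch a miscomputed metric.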

Improvements

  • Function draw_binary_segmentation_predictions now has an image_format parameter (rgb|bgr|gray) to specify the image format, so images are visualized correctly in TensorBoard
  • More type annotations across the codebase

New features

  • New visualization function draw_multilabel_segmentation_predictions

0.3.0

4 years ago

PyTorch Toolbelt 0.3.0

This release has a huge set of new features, bugfixes, and breaking changes, so be careful when upgrading: pip install pytorch-toolbelt==0.3.0

New features

Encoders

  • HRNetV2
  • DenseNets
  • EfficientNet
  • Encoder class now has a change_input_channels method to change the number of channels in the input image

New losses

  • BCELoss with support of ignore_index
  • SoftBCELoss (Label smoothing loss for binary case with support of ignore_index)
  • SoftCrossEntropyLoss (Label smoothing loss for multiclass case with support of ignore_index)
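Label smoothing replaces a one-hot target with a softened distribution, so the loss also penalizes overconfident predictions; positions equal to ignore_index simply contribute nothing. A sketch of the smoothed target for the multiclass case, using the common convention that spreads the smoothing mass uniformly over all classes (illustrative of the standard formulation, not the exact SoftCrossEntropyLoss internals):

```python
def smooth_one_hot(label, num_classes, smoothing=0.1):
    """One-hot target vector softened by label smoothing.

    Each wrong class receives smoothing/num_classes mass; the true
    class keeps the remainder, so the vector still sums to 1.
    """
    off = smoothing / num_classes
    on = 1.0 - smoothing + off
    return [on if c == label else off for c in range(num_classes)]
```

Cross-entropy against this soft target is then the "soft" loss; smoothing=0.0 recovers the plain one-hot case.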

Catalyst goodies

  • Online pseudolabeling callback
  • Training signal annealing callback

Other

  • New activation functions support in ABN block: Swish, Mish, HardSigmoid
  • New decoders (Unet, FPN, DeeplabV3, PPM) to simplify creation of segmentation models
  • CREDITS.md to include all the references to code/articles. The existing list is definitely not complete, so feel free to make PRs
  • Object context block from OCNet

API changes

  • Focal loss now supports normalized focal loss and reduced focal loss extensions.
  • Optimize computation of pyramid weight matrix #34
  • Default value align_corners=False in F.interpolate when doing bilinear upsampling.
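Focal loss down-weights easy examples via the modulating factor (1 − p_t)^γ; the normalized variant additionally rescales by the sum of these factors over the batch. A plain-Python sketch of the per-sample term and the normalized reduction (standard formulations; details may differ from the library's focal loss):

```python
import math

def focal_term(p_t, gamma=2.0):
    """Focal loss for one sample; p_t = predicted prob of the true class."""
    return -((1.0 - p_t) ** gamma) * math.log(p_t)

def normalized_focal_loss(p_ts, gamma=2.0):
    """Normalized variant: divide by the sum of modulating factors,
    so the loss scale does not vanish when most examples are easy."""
    norm = sum((1.0 - p) ** gamma for p in p_ts)
    total = sum(focal_term(p, gamma) for p in p_ts)
    return total / norm if norm > 0 else 0.0
```

With gamma=0 the per-sample term reduces to plain cross-entropy, which is a useful correctness check.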

Bugfixes

  • Fix missing call to batch normalization block in FPNBottleneckBN
  • Fix numerical stability for DiceLoss and JaccardLoss when log_loss=True
  • Fix numerical stability when computing normalized focal loss

0.2.1

4 years ago

New features

  • Added normalized focal loss

Bugfixes

  • Fixed wrong shape of intermediate layers of DenseNet

0.2.0

4 years ago

PyTorch Toolbelt 0.2.0

This release is dedicated to housekeeping work. The Dice/IoU metrics and losses have been redesigned to reduce the amount of duplicated code and bring more clarity. Code is now auto-formatted using Black.

pip install pytorch_toolbelt==0.2.0

Catalyst contrib

  • Refactored the Dice/IoU losses into a single IoUMetricsCallback metric with a few cool features: metric="dice|jaccard" to choose which metric should be used; mode=binary|multiclass|multilabel to specify the problem type; classes_of_interest=[1,2,4] to select the set of classes for which the metric should be computed; and nan_score_on_empty=False to compute Dice Accuracy (counts as 1.0 if both y_true and y_pred are empty; 0.0 if y_pred is not empty).
  • Added an L-p regularization callback to apply L1 and L2 regularization to the model, with support for regularization strength scheduling.
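The nan_score_on_empty switch decides what the metric reports when the ground-truth mask is empty: NaN (so the sample is skipped in averaging) or a hit/miss score. A sketch of that logic for binary Dice (illustrative, not the callback's actual code):

```python
import math

def binary_dice(pred, target, nan_score_on_empty=False):
    """Dice on 0/1 lists, with configurable handling of empty masks."""
    inter = sum(p * t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    if total == 0:
        # Both masks empty: NaN to skip the sample, or 1.0 to count
        # the correctly-predicted-empty case as a hit ("Dice Accuracy").
        return math.nan if nan_score_on_empty else 1.0
    return 2 * inter / total
```

With nan_score_on_empty=False, an empty target paired with a non-empty prediction naturally scores 0.0, matching the behavior described above.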

Losses

  • Refactored the DiceLoss/JaccardLoss losses in the same fashion as the metrics.

Models

  • Added DenseNet encoders
  • Bugfix: Fixed missing BN+Relu in UNetDecoder
  • Global pooling modules can squeeze spatial dimensions if flatten=True.

Misc

  • Add more unit tests
  • Code-style is now managed with Black
  • to_numpy now supports int and float scalar types

0.1.4

4 years ago

PyTorch Toolbelt 0.1.4

  • Minor release to update Catalyst contrib modules to latest Catalyst (requires catalyst>=19.8)

0.1.3

4 years ago

PyTorch Toolbelt 0.1.3

  1. Added ignore_index for focal loss
  2. Added ignore_index to some metrics for Catalyst
  3. Added the tif extension to find_images_in_dir