PyTorch extensions for fast R&D prototyping and Kaggle farming
- `ClassificationDataset`, `SegmentationDataset` for easy every-day use in Kaggle
- `FocalCosineLoss`, `BiTemperedLogisticLoss`, `SoftF1Loss`
- `get_activation_block` (Silu, Softplus, Gelu)
- `RocAucMetricCallback` for Catalyst
- `MultilabelAccuracyCallback` and `AccuracyCallback` with DDP support
- `tbt` to `tbt.`
- `ignore_index` to Dice & Jaccard losses
- `fp16` mode
- `make_n_channel_input`
- `timm` package
- `FPNFuse` module now works on 2D, 3D and N-D inputs
- `softmax_over_dim_X`, `argmax_over_dim_X` shorthand functions for use in metrics to get soft/hard labels without using lambda functions
- `StopIfNanCallback`
- `OutputDistributionCallback` to log the distribution of predictions to TensorBoard
- `UMAPCallback` to visualize the embedding space in TensorBoard using UMAP
- `CudaTileMerger` renamed to `TileMerger`; `TileMerger` allows specifying the target device explicitly
- `tensor_from_rgb_image` removed in favor of `image_to_tensor`
- Fixed `focal_loss_with_logits` when `reduction="sum"`
- Fixed `NaN` in `FocalLoss` when all elements are equal to the `ignore_index` value
- `Swish` and `Mish` activation functions (credits go to http://github.com/rwightman/pytorch-image-models)
- `ignore_index` in `SoftCrossEntropyLoss`
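The `ignore_index` additions to the Dice & Jaccard losses, and the `NaN` fix when every element equals `ignore_index`, come down to masking: ignored positions are excluded from the statistics, and a fully ignored input must still produce a finite score. A minimal pure-Python sketch of the idea (illustrative only, not the library's implementation; the function name is made up):

```python
def soft_dice_score(y_pred, y_true, ignore_index=None, eps=1e-7):
    """Soft Dice over flat lists of probabilities/labels.

    Positions where y_true == ignore_index are excluded entirely,
    and a fully ignored input returns 0.0 instead of NaN.
    """
    pairs = [
        (p, t) for p, t in zip(y_pred, y_true)
        if ignore_index is None or t != ignore_index
    ]
    if not pairs:  # everything ignored -> finite score, not NaN
        return 0.0
    intersection = sum(p * t for p, t in pairs)
    cardinality = sum(p + t for p, t in pairs)
    return (2.0 * intersection + eps) / (cardinality + eps)
```

For example, `soft_dice_score([0.5, 0.5], [-100, -100], ignore_index=-100)` returns `0.0` rather than dividing zero by zero.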
- `ABN` module will now resolve as `nn.Sequential(BatchNorm2d, Activation)` instead of a hand-crafted module. This enables easier conversion of batch normalization modules to `nn.SyncBatchNorm`.
- Almost every Encoder/Decoder implementation has been refactored for better clarity and flexibility. Please double-check your pipelines.
- `binary_dice_iou_score` function
- `SoftCrossEntropyLoss` #38
- `draw_binary_segmentation_predictions` now has an `image_format` parameter (`rgb`|`bgr`|`gray`) specifying the image format, so images are visualized correctly in TensorBoard
- `draw_multilabel_segmentation_predictions`
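Dice and IoU (Jaccard) as computed by a helper like `binary_dice_iou_score` share the same intersection/cardinality statistics and differ only in the final formula. A rough pure-Python sketch of that relationship (the function name here is illustrative, not the package's code):

```python
def binary_overlap_score(y_pred, y_true, metric="dice", eps=1e-7):
    """Dice or IoU for two flat binary masks (lists of 0/1 ints)."""
    intersection = sum(p & t for p, t in zip(y_pred, y_true))
    total = sum(y_pred) + sum(y_true)
    if metric == "dice":
        # Dice = 2|A ∩ B| / (|A| + |B|)
        return (2.0 * intersection + eps) / (total + eps)
    if metric == "jaccard":
        # IoU = |A ∩ B| / |A ∪ B|
        union = total - intersection
        return (intersection + eps) / (union + eps)
    raise ValueError(metric)
```

The two are monotonically related (Dice = 2·IoU / (1 + IoU)), which is why a single helper can serve both metrics.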
This release has a huge set of new features, bug fixes and breaking changes, so be careful when upgrading.

`pip install pytorch-toolbelt==0.3.0`
- `Encoder` class has a `change_input_channels` method to change the number of channels in the input image
- `BCELoss` with support of `ignore_index`
- `SoftBCELoss` (label smoothing loss for the binary case with support of `ignore_index`)
- `SoftCrossEntropyLoss` (label smoothing loss for the multiclass case with support of `ignore_index`)
- `ABN` block: Swish, Mish, HardSigmoid
- `CREDITS.md` to include all the references to code/articles. The existing list is definitely not complete, so feel free to make PRs
- `align_corners=False` in `F.interpolate` when doing bilinear upsampling
- `FPNBottleneckBN`
- `DiceLoss` and `JaccardLoss` when `log_loss=True`
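Label smoothing, as used by losses such as `SoftBCELoss` and `SoftCrossEntropyLoss`, mixes the hard one-hot target with a uniform distribution so the model is never pushed toward fully saturated probabilities. The target transformation can be sketched in plain Python (illustrative only; the library losses additionally operate on logits and handle `ignore_index`):

```python
def smooth_one_hot(target, num_classes, smooth_factor=0.1):
    """Smoothed one-hot target: the true class gets 1 - s + s/C,
    every other class gets s/C, so the vector still sums to 1."""
    off = smooth_factor / num_classes
    on = 1.0 - smooth_factor + off
    return [on if c == target else off for c in range(num_classes)]
```

With `smooth_factor=0`, this degenerates to the ordinary one-hot target.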
This release is dedicated to housekeeping work. Dice/IoU metrics and losses have been redesigned to reduce the amount of duplicated code and bring more clarity. Code is now auto-formatted using Black.

`pip install pytorch_toolbelt==0.2.0`
- `IoUMetricsCallback` with a few cool features: `metric="dice|jaccard"` to choose which metric should be used; `mode=binary|multiclass|multilabel` to specify the problem type (binary, multiclass or multi-label segmentation); `classes_of_interest=[1,2,4]` to select the set of classes for which the metric should be computed; and `nan_score_on_empty=False` to compute Dice Accuracy (counts as 1.0 if both `y_true` and `y_pred` are empty; 0.0 if `y_pred` is not empty)
- `DiceLoss`/`JaccardLoss` losses in the same fashion as metrics
- `UNetDecoder`
- `flatten=True`
- `to_numpy` now supports `int` and `float` scalar types
- `ignore_index` for focal loss
- `ignore_index` to some metrics for Catalyst
- `tif` extension for `find_images_in_dir`
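The `nan_score_on_empty` switch controls what happens when the ground-truth mask is empty: with `True` the sample yields NaN and can be skipped when averaging, while `False` scores an empty/empty pair as a correct 1.0 and any false-positive prediction as 0.0 (the "Dice Accuracy" behavior described above). A small sketch of that edge-case handling (illustrative, not the callback's actual code):

```python
def dice_with_empty_handling(y_true, y_pred, nan_score_on_empty=False, eps=1e-7):
    """Dice for flat binary masks with an explicit empty-mask policy."""
    if sum(y_true) == 0:
        if nan_score_on_empty:
            return float("nan")  # sample excluded from averaging
        # "Dice Accuracy": empty prediction on empty target is correct
        return 1.0 if sum(y_pred) == 0 else 0.0
    intersection = sum(p * t for p, t in zip(y_pred, y_true))
    return (2.0 * intersection + eps) / (sum(y_true) + sum(y_pred) + eps)
```

Without such handling, an empty ground-truth mask makes plain Dice 0/0, which is why the policy has to be chosen explicitly.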