If you are hiring a Python-focused software engineer or data scientist, email me at nickcullen31 at gmail dot com.
v0.1.3 JUST RELEASED - contains significant improvements, bug fixes, and additional support. Get it from the releases, or pull the master branch.
This package provides a few things:
- A high-level `ModuleTrainer` interface for Keras-like training, with callbacks, constraints, and regularizers
- Comprehensive data augmentation and transforms which can be applied during data loading
- Flexible dataset classes for sampling from in-memory or out-of-memory data
- Utility tensor functions not commonly found elsewhere
Have any feature requests? Submit an issue! I'll make it happen - especially any data augmentation, data loading, or sampling functions.
Want to contribute? Check the issues page for those tagged with [contributions welcome].
The `ModuleTrainer` class provides a high-level training interface which abstracts
away the training loop while providing callbacks, constraints, initializers, regularizers, and more:
```python
import torch.nn as nn
import torch.nn.functional as F

from torchsample.modules import ModuleTrainer

# Define your model EXACTLY as normal
class Network(nn.Module):
    def __init__(self):
        super(Network, self).__init__()
        self.conv1 = nn.Conv2d(1, 32, kernel_size=3)
        self.conv2 = nn.Conv2d(32, 64, kernel_size=3)
        self.fc1 = nn.Linear(1600, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2(x), 2))
        x = x.view(-1, 1600)
        x = F.relu(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)

model = Network()
trainer = ModuleTrainer(model)

trainer.compile(loss='nll_loss', optimizer='adadelta')

trainer.fit(x_train, y_train,
            val_data=(x_test, y_test),
            num_epoch=20,
            batch_size=128,
            verbose=1)
```
You also have access to the standard evaluation and prediction functions:
```python
loss = trainer.evaluate(x_train, y_train)
y_pred = trainer.predict(x_train)
```
Torchsample provides a wide range of callbacks, generally mimicking the interface found in Keras:
```python
from torchsample.callbacks import EarlyStopping

callbacks = [EarlyStopping(monitor='val_loss', patience=5)]
trainer.set_callbacks(callbacks)
```
Torchsample also provides regularizers (such as `L1Regularizer`) and constraints (such as `MaxNorm` and `NonNeg`).
Both regularizers and constraints can be selectively applied to layers using regular expressions and the
`module_filter` argument. Constraints can be explicit (hard) constraints applied at an arbitrary batch or
epoch frequency, or they can be implicit (soft) constraints similar to regularizers,
where the constraint deviation is added as a penalty to the total model loss:
```python
from torchsample.constraints import MaxNorm, NonNeg
from torchsample.regularizers import L1Regularizer

# hard constraint applied every 5 batches
hard_constraint = MaxNorm(value=2., frequency=5, unit='batch', module_filter='*fc*')

# implicit constraint added as a penalty term to model loss
soft_constraint = NonNeg(lagrangian=True, scale=1e-3, module_filter='*fc*')

constraints = [hard_constraint, soft_constraint]
trainer.set_constraints(constraints)

regularizers = [L1Regularizer(scale=1e-4, module_filter='*conv*')]
trainer.set_regularizers(regularizers)
```
You can also fit directly on a `torch.utils.data.DataLoader`, with an optional validation loader as well:
```python
from torchsample import TensorDataset
from torch.utils.data import DataLoader

train_dataset = TensorDataset(x_train, y_train)
train_loader = DataLoader(train_dataset, batch_size=32)

val_dataset = TensorDataset(x_val, y_val)
val_loader = DataLoader(val_dataset, batch_size=32)

trainer.fit_loader(train_loader, val_loader=val_loader, num_epoch=100)
```
Finally, torchsample provides a few utility functions not commonly found:
- `th_gather_nd` (N-dimensional version of `torch.gather`)
- `th_affine3d` (affine transforms on `torch.Tensor`s)
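As a rough illustration of what an N-dimensional gather does (a hypothetical re-implementation for exposition, not torchsample's actual code), each row of a coordinate tensor selects one element of the input:

```python
import torch

# Hypothetical sketch of N-dimensional gather semantics (not torchsample's
# implementation): each row of `coords` is a full index tuple into `x`.
def gather_nd_sketch(x, coords):
    flat_idx = torch.zeros(coords.size(0), dtype=torch.long)
    stride = 1
    # convert each coordinate tuple to a flat (row-major) index
    for d in range(x.dim() - 1, -1, -1):
        flat_idx = flat_idx + coords[:, d] * stride
        stride *= x.size(d)
    return x.reshape(-1)[flat_idx]

x = torch.arange(12.).reshape(3, 4)
coords = torch.tensor([[0, 1], [2, 3]])
print(gather_nd_sketch(x, coords))  # tensor([ 1., 11.])
```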
The torchsample package provides a ton of good data augmentation and transformation
tools which can be applied during data loading. The package also provides the flexible
`TensorDataset` and `FolderDataset` classes to handle most dataset needs.
These transforms work directly on torch tensors.
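For a sense of the pattern these classes follow, here is a minimal sketch of a range-normalization transform written as a callable over torch tensors (an illustrative re-implementation with a hypothetical class name, not the package's code):

```python
import torch

# Illustrative callable transform over torch tensors (hypothetical class,
# not torchsample's implementation): rescale values into [min_val, max_val].
class RangeNormalizeSketch(object):
    def __init__(self, min_val=0., max_val=1.):
        self.min_val = min_val
        self.max_val = max_val

    def __call__(self, x):
        scale = (self.max_val - self.min_val) / (x.max() - x.min())
        return (x - x.min()) * scale + self.min_val

x = torch.randint(0, 256, (1, 28, 28)).float()
y = RangeNormalizeSketch(0., 1.)(x)  # y now spans exactly [0., 1.]
```

Because the transform is just a callable on tensors, it can be passed anywhere a transform is accepted, including the dataset classes below.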
The following transforms perform affine (or affine-like) transforms on torch tensors.
We also provide a class for stringing multiple affine transformations together so that only one interpolation takes place.
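To make the single-interpolation idea concrete, here is a sketch using plain PyTorch (not torchsample's API): two affine maps are composed by multiplying their 3x3 matrices, and the image is resampled exactly once:

```python
import math
import torch
import torch.nn.functional as F

def rotation_matrix(degrees):
    t = math.radians(degrees)
    return torch.tensor([[math.cos(t), -math.sin(t), 0.],
                         [math.sin(t),  math.cos(t), 0.],
                         [0., 0., 1.]])

def translation_matrix(tx, ty):
    # tx, ty are in the normalized [-1, 1] grid coordinates of affine_grid
    return torch.tensor([[1., 0., tx],
                         [0., 1., ty],
                         [0., 0., 1.]])

def apply_affine(img, matrix):
    # img: (C, H, W); a single grid_sample call = a single interpolation
    theta = matrix[:2].unsqueeze(0)  # (1, 2, 3) as affine_grid expects
    grid = F.affine_grid(theta, (1, *img.shape), align_corners=False)
    return F.grid_sample(img.unsqueeze(0), grid, align_corners=False)[0]

img = torch.rand(1, 32, 32)
combined = translation_matrix(0.1, 0.) @ rotation_matrix(30.)  # compose matrices
out = apply_affine(img, combined)  # interpolate once, not once per transform
```

Chaining resampling calls instead would blur the image a little more with every interpolation, which is exactly what composing the matrices first avoids.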
The following dataset classes provide general structure and iterators for sampling from, and applying transforms to, in-memory or out-of-memory data: `TensorDataset` and `FolderDataset`.
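The general shape of these classes can be sketched with a plain `torch.utils.data.Dataset` (a hypothetical illustration, not torchsample's implementation): hold inputs and targets, and apply transforms per sample at access time:

```python
from torch.utils.data import Dataset

# Hypothetical sketch of the shared structure (not torchsample's code):
# index into stored data and apply an optional transform on the way out.
class TransformDatasetSketch(Dataset):
    def __init__(self, inputs, targets, input_transform=None):
        self.inputs = inputs
        self.targets = targets
        self.input_transform = input_transform

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, idx):
        x, y = self.inputs[idx], self.targets[idx]
        if self.input_transform is not None:
            x = self.input_transform(x)
        return x, y
```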
Thank you to the following people and contributors: