mlpack: a fast, header-only C++ machine learning library
mlpack 3.3.1, released April 29th, 2020.
Minor Julia and Python documentation fixes (#2373).
Updated terminal state and fixed bugs for Pendulum environment (#2354, #2369).
Added ELiSH activation function (#2323).
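For reference, the ELiSH (Exponential Linear Sigmoid Squashing) function combines a sigmoid gate with an ELU-like negative branch. A minimal standalone sketch of the standard formula (not mlpack's layer implementation, which uses its Forward/Backward interface):

```cpp
#include <cmath>

// Sigmoid helper: 1 / (1 + e^(-x)).
double Sigmoid(const double x) { return 1.0 / (1.0 + std::exp(-x)); }

// ELiSH(x) = x * sigmoid(x)            for x >= 0,
//            (e^x - 1) * sigmoid(x)    for x <  0.
double ELiSH(const double x)
{
  return (x >= 0.0) ? x * Sigmoid(x) : (std::exp(x) - 1.0) * Sigmoid(x);
}
```

The positive branch matches the SiLU/Swish activation, while the negative branch saturates smoothly like ELU.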
Add L1 Loss function (#2203).
Pass CMAKE_CXX_FLAGS (compilation options) correctly to Python build (#2367).
Expose ensmallen Callbacks for sparse autoencoder (#2198).
Bugfix for LARS class causing invalid read (#2374).
Add serialization support from Julia; use mlpack.serialize() and mlpack.deserialize() to save and load from IOBuffers.
mlpack 3.3.0, released April 7th, 2020.
Templated return type of Forward function of loss functions (#2339).
Added R2 Score regression metric (#2323).
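The R2 score (coefficient of determination) measures how much of the target variance a regressor explains. A sketch of the standard formula, R2 = 1 - SS_res / SS_tot (this is the underlying definition, not mlpack's metric class):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// R2 = 1 - (sum of squared residuals) / (total sum of squares about the mean).
// A perfect fit scores 1.0; always predicting the mean scores 0.0.
double R2Score(const std::vector<double>& yTrue, const std::vector<double>& yPred)
{
  double mean = 0.0;
  for (double y : yTrue)
    mean += y;
  mean /= yTrue.size();

  double ssRes = 0.0, ssTot = 0.0;
  for (std::size_t i = 0; i < yTrue.size(); ++i)
  {
    ssRes += (yTrue[i] - yPred[i]) * (yTrue[i] - yPred[i]);
    ssTot += (yTrue[i] - mean) * (yTrue[i] - mean);
  }
  return 1.0 - ssRes / ssTot;
}
```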
Added mean squared logarithmic error loss function for neural networks (#2210).
Added mean bias loss function for neural networks (#2210).
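The two losses from #2210 have simple closed forms; the sketches below use the standard definitions rather than mlpack's layer code. MSLE compares log-transformed values, so it penalizes relative rather than absolute error; mean bias keeps the sign of the residual, measuring systematic over- or under-prediction:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// MSLE = mean of (log(1 + pred) - log(1 + true))^2.
double MeanSquaredLogarithmicError(const std::vector<double>& yTrue,
                                   const std::vector<double>& yPred)
{
  double sum = 0.0;
  for (std::size_t i = 0; i < yTrue.size(); ++i)
  {
    const double d = std::log1p(yPred[i]) - std::log1p(yTrue[i]);
    sum += d * d;
  }
  return sum / yTrue.size();
}

// Mean bias = mean of (true - pred); positive values mean under-prediction.
double MeanBiasError(const std::vector<double>& yTrue,
                     const std::vector<double>& yPred)
{
  double sum = 0.0;
  for (std::size_t i = 0; i < yTrue.size(); ++i)
    sum += yTrue[i] - yPred[i];
  return sum / yTrue.size();
}
```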
The DecisionStump class has been marked deprecated; use the DecisionTree class with NoRecursion=true, or use ID3DecisionStump instead (#2099).
Added probabilities_file parameter to get the probabilities matrix of the AdaBoost classifier (#2050).
Fix STB header search paths (#2104).
Add DISABLE_DOWNLOADS CMake configuration option (#2104).
Add padding layer in TransposedConvolutionLayer (#2082).
Fix pkgconfig generation on non-Linux systems (#2101).
Use log-space to represent HMM initial state and transition probabilities (#2081).
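The motivation for the log-space change: products of many small probabilities underflow double precision, while sums of log-probabilities do not. Adding probabilities that are stored as logs requires the standard log-sum-exp trick, sketched below (this is the general technique, not mlpack's HMM code):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Computes log(sum_i exp(logProbs[i])) without overflow or underflow by
// factoring out the largest term first.
double LogSumExp(const std::vector<double>& logProbs)
{
  const double maxLog = *std::max_element(logProbs.begin(), logProbs.end());
  double sum = 0.0;
  for (double lp : logProbs)
    sum += std::exp(lp - maxLog);  // Each term is <= 1, so no overflow.
  return maxLog + std::log(sum);
}
```

Naively computing exp(-1000) + exp(-1000) would underflow to 0; the shifted form recovers the exact answer, -1000 + log(2).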
Add functions to access parameters of Convolution and AtrousConvolution layers (#1985).
Add ComputeError function to LARS regression, and change the Train function to return the computed error (#2139).
Add Julia bindings (#1949). Build settings can be controlled with the BUILD_JULIA_BINDINGS=(ON/OFF) and JULIA_EXECUTABLE=/path/to/julia CMake parameters.
CMake fix for finding STB include directory (#2145).
Add bindings for loading and saving images (#2019); mlpack_image_converter from the command line, mlpack.image_converter() from Python.
Add normalization support for CF binding (#2136).
Add Mish activation function (#2158).
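Mish is a smooth, non-monotonic activation with a simple closed form. A standalone sketch of the formula (mlpack's layer implements the same function through its Forward/Backward interface):

```cpp
#include <cmath>

// Mish(x) = x * tanh(softplus(x)), where softplus(x) = log(1 + e^x).
// log1p is used for better numerical accuracy near zero.
double Mish(const double x)
{
  return x * std::tanh(std::log1p(std::exp(x)));
}
```

For large positive x it approaches the identity; for negative x it stays bounded and smooth, unlike ReLU's hard cutoff.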
Update init_rules in AMF to allow users to merge two initialization rules (#2151).
Add GELU activation function (#2183).
Better error handling of eigendecompositions and Cholesky decompositions (#2088, #1840).
Add LiSHT activation function (#2182).
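Two of the activations added in this release, GELU and LiSHT, are one-liners worth seeing side by side. These are the standard formulas (GELU in its common tanh approximation, with constants from the GELU paper), not mlpack's implementations:

```cpp
#include <cmath>

// GELU(x) ~= 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))).
double GELU(const double x)
{
  const double pi = std::acos(-1.0);
  const double c = std::sqrt(2.0 / pi);
  return 0.5 * x * (1.0 + std::tanh(c * (x + 0.044715 * x * x * x)));
}

// LiSHT(x) = x * tanh(x); always non-negative, since x and tanh(x)
// share the same sign.
double LiSHT(const double x)
{
  return x * std::tanh(x);
}
```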
Add Valid and Same Padding for Transposed Convolution layer (#2163).
Add CELU activation function (#2191).
Add Log-Hyperbolic-Cosine Loss function (#2207).
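Sketches of the two formulas just named, using the standard definitions rather than mlpack's exact code. CELU is a continuously differentiable variant of ELU with a scale parameter alpha; log-cosh behaves like squared error for small residuals and like absolute error for large ones, making it robust to outliers:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// CELU(x, alpha) = max(0, x) + min(0, alpha * (e^(x/alpha) - 1)).
double CELU(const double x, const double alpha = 1.0)
{
  return std::max(0.0, x) + std::min(0.0, alpha * (std::exp(x / alpha) - 1.0));
}

// Log-cosh loss: sum of log(cosh(pred - true)) over all samples.
double LogCoshLoss(const std::vector<double>& yTrue,
                   const std::vector<double>& yPred)
{
  double sum = 0.0;
  for (std::size_t i = 0; i < yTrue.size(); ++i)
    sum += std::log(std::cosh(yPred[i] - yTrue[i]));
  return sum;
}
```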
Change neural network types to avoid unnecessary use of rvalue references (#2259).
Bump minimum Boost version to 1.58 (#2305).
Refactor STB support so the HAS_STB macro is not needed when compiling against mlpack (#2312).
Add Hard Shrink Activation Function (#2186).
Add Soft Shrink Activation Function (#2174).
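The two shrinkage activations above differ only in how they treat values beyond the threshold lambda: hard shrink passes them through unchanged, while soft shrink moves them toward zero by lambda. Standard definitions, sketched standalone (not mlpack's implementations):

```cpp
#include <cmath>

// HardShrink(x) = x if |x| > lambda, else 0.
double HardShrink(const double x, const double lambda = 0.5)
{
  return (std::abs(x) > lambda) ? x : 0.0;
}

// SoftShrink(x) = x - lambda if x > lambda,
//                 x + lambda if x < -lambda,
//                 0 otherwise.
double SoftShrink(const double x, const double lambda = 0.5)
{
  if (x > lambda)
    return x - lambda;
  if (x < -lambda)
    return x + lambda;
  return 0.0;
}
```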
Add Hinge Embedding Loss Function (#2229).
Add Cosine Embedding Loss Function (#2209).
Add Margin Ranking Loss Function (#2264).
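The embedding and ranking losses above share a common pattern: pull matching pairs together and push non-matching pairs apart, up to a margin. Sketches of the standard per-sample forms with label y in {+1, -1} (cosine embedding applies the same scheme with 1 - cosine similarity as the distance); these are textbook definitions, not mlpack's code:

```cpp
#include <algorithm>

// Hinge embedding: x is a distance between two embeddings. Matching pairs
// (y = +1) are penalized by their distance; non-matching pairs (y = -1)
// are penalized only when closer than the margin.
double HingeEmbeddingLoss(const double x, const int y, const double margin = 1.0)
{
  return (y == 1) ? x : std::max(0.0, margin - x);
}

// Margin ranking: x1 should score above x2 when y = +1 (below when
// y = -1), by at least the margin.
double MarginRankingLoss(const double x1, const double x2, const int y,
                         const double margin = 0.0)
{
  return std::max(0.0, -y * (x1 - x2) + margin);
}
```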
Bugfix for incorrect parameter vector sizes in logistic regression and softmax regression (#2359).
mlpack 3.2.1, released Oct. 1st, 2019. (But I forgot to release it on GitHub; sorry about that.)
mlpack 3.2.0, released Sept. 25th, 2019.
Fix occasionally-failing RADICAL test (#1924).
Fix gcc 9 OpenMP compilation issue (#1970).
Added support for loading and saving of images (#1903).
Add Multiple Pole Balancing Environment (#1901, #1951).
Added functionality for scaling of data (#1876); see the command-line binding mlpack_preprocess_scale or the Python binding preprocess_scale().
Add new parameter maximum_depth to decision tree and random forest bindings (#1916).
Fix prediction output of softmax regression when test set accuracy is calculated (#1922).
Pendulum environment now checks for termination. All RL environments now have an option to terminate after a set number of time steps (no limit by default) (#1941).
Add support for probabilistic KDE (kernel density estimation) error bounds when using the Gaussian kernel (#1934).
Fix negative distances for cover tree computation (#1979).
Fix cover tree building when all pairwise distances are 0 (#1986).
Improve KDE pruning by reclaiming unused error tolerance (#1954, #1984).
Optimizations for sparse matrix accesses in z-score normalization for CF (#1989).
Add kmeans_max_iterations option to GMM training binding gmm_train_main.
Bump minimum Armadillo version to 8.400.0 due to ensmallen dependency requirement (#2015).
mlpack 3.1.1, released May 26th, 2019.
Random forest now has minimum_gain_split and subspace_dim parameters (#1887).
The decision tree parameter print_training_error is deprecated in favor of print_training_accuracy.
The output option changed to predictions for the adaboost and perceptron bindings. Old options are now deprecated and will be preserved until mlpack 4.0.0 (#1882).
Accelerate NormalizeLabels function using hashing instead of linear search (see src/mlpack/core/data/normalize_labels_impl.hpp) (#1780).
Add ConfusionMatrix() function for checking performance of classifiers (#1798).
mlpack 3.1.0, released April 25, 2019.
Add DiagonalGaussianDistribution and DiagonalGMM classes to speed up the diagonal covariance computation and deprecate DiagonalConstraint (#1666).
Add kernel density estimation (KDE) implementation with bindings to other languages (#1301).
Where relevant, all models with a Train() method now return a double value representing the goodness of fit (i.e., final objective value, error, etc.) (#1678).
Add implementation for linear support vector machine (see src/mlpack/methods/linear_svm).
Change DBSCAN to use PointSelectionPolicy and add OrderedPointSelection (#1625).
Residual block support (#1594).
Bidirectional RNN (#1626).
Dice loss layer (#1674, #1714) and hard sigmoid layer (#1776).
The output option changed to predictions and output_probabilities to probabilities for the Naive Bayes binding (mlpack_nbc/nbc()). Old options are now deprecated and will be preserved until mlpack 4.0.0 (#1616).
Add support for Diagonal GMMs to HMM code (#1658, #1666). This can provide large speedup when a diagonal GMM is acceptable as an emission probability distribution.
Python binding improvements: check parameter type (#1717), avoid copying Pandas dataframes (#1711), handle Pandas Series objects (#1700).
mlpack 3.0.4, released November 13, 2018.
mlpack 3.0.3, released July 27th, 2018.
mlpack 3.0.2, released June 8th, 2018.
mlpack 3.0.1, released May 10th, 2018.