BetaML.jl Versions

Beta Machine Learning Toolkit

v0.12.0

2 weeks ago

BetaML v0.12.0

Diff since v0.11.4

  • Added FeatureRanker, a flexible feature-ranking estimator using multiple feature importance metrics
  • Added the new functions kl_divergence and sobol_index
  • Added the ignore_dims keyword to the predict function of tree-based models, allowing specific variables to be ignored in prediction by following both branches of any node that splits on those dimensions
  • Added the sampling_share option to the RandomForestEstimator model
  • DOC: added a Benchmarks section (then temporarily removed, as SystemBenchmark is currently not installable; see the related issue)
  • DOC: added a FeatureRanker tutorial
  • Bugfix on l2loss_by_cv for unsupervised models
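The new ignore_dims keyword can be sketched as follows. This is a minimal, hypothetical example with made-up data; the n_trees hyper-parameter name is an assumption, and sampling_share is the option introduced in this release:

```julia
using BetaML

# Made-up regression data: y depends on features 1 and 3, not on feature 2
x = rand(100, 3)
y = 2 .* x[:, 1] .+ x[:, 3]

# Hyper-parameter names below are assumptions for illustration
m = RandomForestEstimator(n_trees=30, sampling_share=0.8)
fit!(m, x, y)

ŷ_full    = predict(m, x)                    # predictions using all dimensions
ŷ_reduced = predict(m, x, ignore_dims=[2])   # follow both branches of splits on dim 2
```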

v0.11.4

2 months ago

BetaML v0.11.4

Diff since v0.11.3

Bugfix: solved an issue in cosine_distance, which was actually computing the cosine similarity

v0.11.3

3 months ago

BetaML v0.11.3

Diff since v0.11.2

  • Bugfixes (removed the old, undocumented, unused, type-pirating findfirst and findall functions)

v0.11.2

4 months ago

BetaML v0.11.2

Diff since v0.11.1

  • bugfixes

v0.11.1

4 months ago

BetaML v0.11.1

Diff since v0.11.0

  • Changed some keyword arguments of AutoEncoder and PCAEncoder: outdims => encoded_size and innerdims => layers_size

This shouldn't be breaking, as the constructors have been tweaked to accept the older names (until the next breaking version, 0.12).
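The renamed keywords can be sketched as follows (made-up data; per the note above, the old names remain accepted until 0.12):

```julia
using BetaML

x = rand(50, 4)                  # made-up data: 50 records, 4 features
m = PCAEncoder(encoded_size=2)   # was: outdims=2
fit!(m, x)
x_latent = predict(m, x)         # encoded data, 50 × 2
```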

v0.11.0

4 months ago

BetaML v0.11.0

Diff since v0.10.4

Attention: this version contains many breaking changes!

  • Experimental new ConvLayer and PoolLayer for convolutional networks. BetaML neural networks work only on CPU, and even on CPU the convolutional layers (but not the dense ones) are 2-3 times slower than Flux's. Still, they have some unique characteristics, such as working with any number of dimensions and not requiring AD in most cases, so they may be useful in some corner situations. And if you want to help port them to GPU... ;-)
  • Isolated MLJ interface models into their own Bmlj submodule
  • Renamed many models in a consistent way
  • Shortened the hyper-parameters and learnable-parameters struct names
  • Corrected many documentation bugs
  • Several bugfixes

v0.10.4

5 months ago

BetaML v0.10.4

Diff since v0.10.3

  • Added the AutoEncoder model and its MLJ wrapper AutoEncoderMLJ, with a m=AutoEncoder(hp); fit!(m,x); x_latent = predict(m,x); x̂ = inverse_predict(m,x_latent) interface. Users can optionally specify the number of dimensions to shrink the data to (outdims), the number of neurons of the inner layers (innerdims), or the full details of the encoding and decoding layers and all the underlying NN options, but this remains optional.
  • Adapted the l2loss_by_cv function to unsupervised models with inverse_predict
  • Several bugfixes
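The AutoEncoder interface described above can be sketched as follows (made-up data; outdims is the keyword name at this version):

```julia
using BetaML

x = rand(100, 5)                  # made-up data: 100 records, 5 features
m = AutoEncoder(outdims=2)        # shrink to 2 latent dimensions
fit!(m, x)
x_latent = predict(m, x)          # encoded data, 100 × 2
x̂ = inverse_predict(m, x_latent)  # reconstruction back to 100 × 5
```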

Merged pull requests:

  • CompatHelper: add new compat entry for Statistics at version 1, (keep existing compat) (#61) (@github-actions[bot])
  • correct typo in AbstractTrees.printnode (#62) (@roland-KA)

Closed issues:

  • Deprecation warning from ProgressMeter.jl (#58)

v0.10.3

9 months ago

BetaML v0.10.3

Diff since v0.10.2

v0.10.2

10 months ago

BetaML v0.10.2

Diff since v0.10.1

Merged pull requests:

  • CompatHelper: add new compat entry for DelimitedFiles at version 1, (keep existing compat) (#55) (@github-actions[bot])

v0.10.1

1 year ago

BetaML v0.10.1

Diff since v0.10.0

Closed issues:

  • target_scitype for MultitargetNeuralNetworkRegressor is too broad (#53)

Merged pull requests:

  • CompatHelper: bump compat for StatsBase to 0.34, (keep existing compat) (#54) (@github-actions[bot])