Beta Machine Learning Toolkit
- new `FeatureRanker`, a flexible feature ranking estimator using multiple feature importance metrics
- new `kl_divergence` and `sobol_index` functions
- added `ignore_dims` to the `predict` function
- added `sampling_share` to the `RandomForestEstimator` model
- new `FeatureRanker` tutorial
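Given BetaML's uniform `fit!`/`predict` API, using the new ranker plausibly looks like the sketch below. The keyword names (`model`, `nsplits`, `recursive`) and the returned ranking are assumptions based on general BetaML conventions, not taken from this changelog, so check the `FeatureRanker` docstring before relying on them.

```julia
# Hedged sketch: rank the columns of x by their importance for predicting y.
# The FeatureRanker keyword names (model, nsplits, recursive) are assumptions.
using BetaML

x = rand(100, 3)
y = 2 .* x[:, 1] .+ 0.1 .* randn(100)   # only column 1 really matters here

fr   = FeatureRanker(model=RandomForestEstimator(), nsplits=3, recursive=false)
rank = fit!(fr, x, y)   # assumed: vector of feature indices, least to most important
```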
- added `l2loss_by_cv` for unsupervised models

Attention: many breaking changes in this version!!
- new `ConvLayer` and `PoolLayer` for convolutional networks. BetaML neural networks work only on CPU, and even on CPU the convolution layers (but not the dense ones) are 2-3 times slower than Flux's. Still, they have some quite unique characteristics, such as working with any number of dimensions and not requiring AD in most cases, so they may still be useful in some corner situations. And if you would like to help port them to GPU... ;-)
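As a very rough illustration of how the new layers might be composed into a network: every constructor signature and size below is an assumption inferred from the layers' stated purpose, not from this changelog, so consult the `ConvLayer` and `PoolLayer` docstrings for the real arguments before use.

```julia
# ASSUMED signatures: ConvLayer(input_size, kernel_size, nchannels_in, nchannels_out; f=...)
# and PoolLayer(input_size, kernel_size, nchannels_in). Intermediate sizes are illustrative
# and depend on the layers' real padding/stride defaults -- verify against the API docs.
using BetaML

layers = [
    ConvLayer((8, 8), (3, 3), 1, 4, f=relu),  # 8×8 single-channel input → 4 channels
    PoolLayer((8, 8, 4), (2, 2)),             # assumed (max) pooling over 2×2 windows
    ReshaperLayer((4, 4, 4), 64),             # flatten for the dense head (assumed layer)
    DenseLayer(64, 1),                        # scalar output
]
m = NeuralNetworkEstimator(layers=layers, epochs=10)
```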
- new `Bmlj` submodule
- new `AutoEncoder` model and MLJ wrapper `AutoEncoderMLJ` with a `m=AutoEncoder(hp); fit!(m,x); x_latent = predict(m,x); x̂ = inverse_predict(m,x_latent)` interface. Users can optionally specify the number of dimensions to which to shrink the data (`outdims`), the number of neurons of the inner layers (`innerdims`), or the full details of the encoding and decoding layers and all the underlying NN options, but this remains optional.
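The quoted interface expands into a small end-to-end sketch like the following; the data, the `outdims=2` value, and the keyword-style constructor call are illustrative assumptions.

```julia
# Compress 5-dimensional data to 2 latent dimensions and reconstruct it, following
# the fit!/predict/inverse_predict interface described above. outdims=2 is illustrative.
using BetaML

x = rand(1000, 5)
m = AutoEncoder(outdims=2)            # shrink to a 2-dimensional latent space
fit!(m, x)                            # unsupervised training
x_latent = predict(m, x)              # 1000×2 latent representation
x̂ = inverse_predict(m, x_latent)      # reconstruction in the original 5-dim space
```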
- added `l2loss_by_cv` function to unsupervised models with `inverse_predict`

Merged pull requests:

- `AbstractTrees.printnode` (#62) (@roland-KA)

Closed issues: