v0.8.0

@github-actions released this 02 Oct 10:45
· 169 commits to master since this release

BetaML v0.8.0

Diff since v0.7.1

  • support for all models of the new "V2" API, which implements a standard mod = Model([Options]), fit!(mod,X,[Y]), predict(mod,[X]) workflow (details here). The classic API is now deprecated; some of its functions will be removed in the next BetaML 0.9 versions and some will be unexported.
  • standardised function names to follow the Julia style guidelines and the new [BetaML code style guidelines](https://sylvaticus.github.io/BetaML.jl/dev/StyleGuide_templates.html)
  • new hyper-parameter autotuning method:
    mod = ModelXX(autotune=true)  # --> control autotune with the parameter `tunemethod`
    fit!(mod,x,[y])               # --> autotune happens here, together with the final training
    est = predict(mod,xnew)
    
    Autotune is multithreaded, with model-specific defaults. For example, for Random Forests the defaults are:
    tunemethod=SuccessiveHalvingSearch(
        hpranges     = Dict("n_trees"   => [10, 20, 30, 40],
                         "max_depth"    => [5,10,nothing],
                         "min_gain"     => [0.0, 0.1, 0.5],
                         "min_records"  => [2,3,5],
                         "max_features" => [nothing,5,10,30],
                         "beta"         => [0,0.01,0.1]),
        loss         = l2loss_by_cv, # works for both regression and classification
        res_shares   = [0.08, 0.1, 0.13, 0.15, 0.2, 0.3, 0.4],
        multithreads = false) # RFs are already multi-threaded
    
    At each step the number of candidate models is reduced, so that the search ends with a single, fully trained model.
    Only supervised model autotuning is currently implemented; GMM-based clustering autotuning is planned, using BIC or AIC.
  • new functions model_save and model_load to save/load trained models to/from the filesystem
  • new MinMaxScaler (StandardScaler was already available as the classical-API functions scale and getScalingFactors)
  • many bugfixes/improvements in corner cases
  • new MLJ interface models for NeuralNetworkEstimator
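As a worked illustration of the new V2 workflow described above, the following sketch trains a random forest regressor with autotuning enabled. It assumes the `RandomForestEstimator` model name and the `autotune` keyword shown in these notes; exact names may differ in your BetaML version.

```julia
using BetaML

# Toy regression data
x = [1.0 10.0; 2.0 20.0; 3.0 30.0; 4.0 40.0; 5.0 50.0; 6.0 60.0]
y = [1.1, 2.1, 3.0, 4.2, 4.9, 6.1]

mod = RandomForestEstimator(autotune=true)  # 1. construct the model, optionally tuning hyper-parameters
fit!(mod, x, y)                             # 2. tune (if requested) and train, in place
ŷ   = predict(mod, x)                       # 3. predict on (new) data
```

The same three-call pattern applies to unsupervised models, where `fit!(mod, x)` takes no `y`.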
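The new scaler and the model save/load functions can be sketched together. This is a hedged example: it assumes `MinMaxScaler` is used as a method of a `Scaler` model following the same `fit!`/`predict` workflow, and that `model_save`/`model_load` store fitted models by name in a single file; check the BetaML API documentation for the exact signatures.

```julia
using BetaML

x  = [10.0 100.0; 20.0 200.0; 30.0 300.0]

sc = Scaler(MinMaxScaler())    # assumed: MinMaxScaler passed as the scaling method
fit!(sc, x)
xs = predict(sc, x)            # features rescaled to the [0,1] range

model_save("models.jld2"; sc)           # assumed: models saved under their keyword names
sc2 = model_load("models.jld2", "sc")   # reload the fitted scaler later
```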

Closed issues:

  • Improve oneHotEncode stability for encoding integers embedding categories (#29)
  • initVarainces! doesn't support mixed-type variances (#33)
  • Error generating MLJ model registry (#37)
  • WARNING: could not import Perceptron ... (#38)
  • MLJ model BetaMLGMMRegressor predicting row vectors instead of column vectors (#40)