Tunes the hyperparameters of a learner for a given task, searching for the parameter setting that yields the best performance.

hyperopt(task, learner = NULL, par.config = NULL, hyper.control = NULL,
  show.info = getMlrOptions()$show.info)

Arguments

task

[Task] The Task

learner

[Learner] The learner that is subject to the hyperparameter tuning. If no learner is given, the learner referenced in the par.config is used, if available; a sketch of this is given after the argument list.

par.config

[ParConfig] The Parameter Configuration

hyper.control

[HyperControl] The Hyperparameter Control Object

show.info

[logical(1)] Print verbose output on console? Default is set via configureMlr.
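As noted for the learner argument, a ParConfig that names a learner makes the learner argument optional. A minimal sketch, not taken from the package documentation; the cost/gamma bounds and the 2^x transformation merely mirror the search space printed in the first example below:

# ParConfig that carries the learner name, so hyperopt() can infer the learner
par.config = makeParConfig(
  par.set = makeParamSet(
    makeNumericParam("cost", lower = -15, upper = 15, trafo = function(x) 2^x),
    makeNumericParam("gamma", lower = -15, upper = 15, trafo = function(x) 2^x)
  ),
  learner.name = "classif.svm"
)
res = hyperopt(iris.task, par.config = par.config)  # no learner argument needed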

Value

[TuneResult]
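The returned object is a regular mlr TuneResult, so the usual accessors apply. A minimal sketch, assuming the result of a hyperopt() call is stored in res:

res = hyperopt(iris.task, "classif.svm")
res$x  # best hyperparameter setting found, as a named list
res$y  # performance of that setting, e.g. mmce.test.mean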

Examples

# the shortest way of hyperparameter optimization
hyperopt(iris.task, "classif.svm")
#> [Tune] Started tuning learner classif.svm for parameter set:
#>          Type len Def    Constr Req Tunable Trafo
#> cost  numeric   -   0 -15 to 15   -    TRUE     Y
#> gamma numeric   -  -2 -15 to 15   -    TRUE     Y
#> With control class: TuneControlMBO
#> Imputation value: 1
#> [Tune-x] 1: cost=0.00973; gamma=0.878
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 1: mmce.test.mean=0.7800000; time: 0.0 min
#> [Tune-x] 2: cost=0.082; gamma=1.51e+03
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 2: mmce.test.mean=0.8266667; time: 0.0 min
#> [Tune-x] 3: cost=3.22; gamma=0.0355
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 3: mmce.test.mean=0.0466667; time: 0.0 min
#> [Tune-x] 4: cost=1.44e+04; gamma=6.61
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 4: mmce.test.mean=0.0800000; time: 0.0 min
#> [Tune-x] 5: cost=0.000267; gamma=171
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 5: mmce.test.mean=0.8200000; time: 0.0 min
#> [Tune-x] 6: cost=604; gamma=6.12e-05
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 6: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 7: cost=0.00061; gamma=0.00461
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 7: mmce.test.mean=0.7800000; time: 0.0 min
#> [Tune-x] 8: cost=32; gamma=5.23e+03
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 8: mmce.test.mean=0.8066667; time: 0.0 min
#> Loading required package: rgenoud
#> ##  rgenoud (Version 5.7-12.4, Build Date: 2015-07-19)
#> ##  See http://sekhon.berkeley.edu/rgenoud for additional documentation.
#> ##  Please cite software as:
#> ##   Walter Mebane, Jr. and Jasjeet S. Sekhon. 2011.
#> ##   ``Genetic Optimization Using Derivatives: The rgenoud package for R.''
#> ##   Journal of Statistical Software, 42(11): 1-26.
#> ##
#> [Tune-x] 9: cost=1.7e+03; gamma=0.021
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 9: mmce.test.mean=0.0533333; time: 0.0 min
#> [Tune-x] 10: cost=8.9; gamma=0.000184
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 10: mmce.test.mean=0.4466667; time: 0.0 min
#> [Tune-x] 11: cost=1.47e+04; gamma=0.0175
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 11: mmce.test.mean=0.0600000; time: 0.0 min
#> [Tune-x] 12: cost=9.87e+03; gamma=0.0219
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 12: mmce.test.mean=0.0600000; time: 0.0 min
#> [Tune-x] 13: cost=3.12e+03; gamma=0.0331
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 13: mmce.test.mean=0.0533333; time: 0.0 min
#> [Tune-x] 14: cost=1.74; gamma=3.12e-05
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 14: mmce.test.mean=0.7800000; time: 0.0 min
#> [Tune-x] 15: cost=1.56e+04; gamma=0.000108
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 15: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 16: cost=45.5; gamma=0.0721
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 16: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 17: cost=2.36e+03; gamma=0.000475
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 17: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 18: cost=3.26e+04; gamma=0.515
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 18: mmce.test.mean=0.0666667; time: 0.0 min
#> [Tune-x] 19: cost=3.38e+03; gamma=3.06e-05
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 19: mmce.test.mean=0.0466667; time: 0.0 min
#> [Tune-x] 20: cost=318; gamma=0.00435
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 20: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 21: cost=10.5; gamma=0.0482
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 21: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 22: cost=222; gamma=0.0345
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 22: mmce.test.mean=0.0600000; time: 0.0 min
#> [Tune-x] 23: cost=1.66e+03; gamma=0.000111
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 23: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 24: cost=733; gamma=0.00174
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 24: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 25: cost=4.89; gamma=0.167
#> Resampling: cross-validation
#> Measures: mmce
#> [Tune-y] 25: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune] Result: cost=1.66e+03; gamma=0.000111 : mmce.test.mean=0.0333333
#> Tune result:
#> Op. pars: cost=1.66e+03; gamma=0.000111
#> mmce.test.mean=0.0333333
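To actually use the tuned setting, the TuneResult can be fed back into mlr's standard workflow. A minimal sketch that stores the result of the call above in res and then fits the tuned SVM:

res = hyperopt(iris.task, "classif.svm")
tuned.lrn = setHyperPars(makeLearner("classif.svm"), par.vals = res$x)  # learner with tuned cost/gamma
mod = train(tuned.lrn, iris.task)                                       # fit on the full task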
# manually defining the parameter space configuration
par.config = makeParConfig(
  par.set = makeParamSet(
    makeIntegerParam("mtry", lower = 1, upper = 4),
    makeDiscreteParam("ntree", values = c(10, 25, 50))
  ),
  par.vals = list(replace = FALSE),
  learner.name = "randomForest"
)
hyperopt(bh.task, par.config = par.config)
#> [Tune] Started tuning learner regr.randomForest for parameter set:
#>           Type len Def   Constr Req Tunable Trafo
#> mtry   integer   -   -   1 to 4   -    TRUE     -
#> ntree discrete   -   - 10,25,50   -    TRUE     -
#> With control class: TuneControlGrid
#> Imputation value: Inf
#> [Tune-x] 1: mtry=1; ntree=10
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 1: mse.test.mean=22.2012745; time: 0.0 min
#> [Tune-x] 2: mtry=2; ntree=10
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 2: mse.test.mean=16.5385299; time: 0.0 min
#> [Tune-x] 3: mtry=3; ntree=10
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 3: mse.test.mean=14.2338994; time: 0.0 min
#> [Tune-x] 4: mtry=4; ntree=10
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 4: mse.test.mean=12.2771992; time: 0.0 min
#> [Tune-x] 5: mtry=1; ntree=25
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 5: mse.test.mean=19.8272853; time: 0.0 min
#> [Tune-x] 6: mtry=2; ntree=25
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 6: mse.test.mean=14.5461377; time: 0.0 min
#> [Tune-x] 7: mtry=3; ntree=25
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 7: mse.test.mean=11.8587858; time: 0.0 min
#> [Tune-x] 8: mtry=4; ntree=25
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 8: mse.test.mean=12.5758066; time: 0.0 min
#> [Tune-x] 9: mtry=1; ntree=50
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 9: mse.test.mean=20.9448880; time: 0.0 min
#> [Tune-x] 10: mtry=2; ntree=50
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 10: mse.test.mean=13.6173787; time: 0.0 min
#> [Tune-x] 11: mtry=3; ntree=50
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 11: mse.test.mean=12.4899408; time: 0.0 min
#> [Tune-x] 12: mtry=4; ntree=50
#> Resampling: cross-validation
#> Measures: mse
#> [Tune-y] 12: mse.test.mean=11.4071097; time: 0.0 min
#> [Tune] Result: mtry=4; ntree=50 : mse.test.mean=11.4071097
#> Tune result:
#> Op. pars: mtry=4; ntree=50
#> mse.test.mean=11.4071097
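For the grid search above, every evaluated mtry/ntree combination is recorded in the optimization path of the TuneResult. A minimal sketch, assuming the call above is assigned to res:

res = hyperopt(bh.task, par.config = par.config)
opt.grid = as.data.frame(res$opt.path)  # one row per evaluated setting with its mse.test.mean
head(opt.grid)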