Tunes the hyperparameters of a learner for a given task. Tries to find the best parameter settings for the given learner.

hyperopt(task, learner = NULL, par.config = NULL, hyper.control = NULL,
  show.info = getMlrOptions()$show.info)

Arguments

task

[Task] The task.

learner

[Learner] The learner that is subject to the hyperparameter tuning. If no learner is given, the learner referenced in the par.config is used, if available.

par.config

[ParConfig] The parameter configuration.

hyper.control

[HyperControl] The hyperparameter control object.
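
A minimal sketch of constructing such a control object, assuming makeHyperControl() accepts an mlr tune control, a resampling strategy, and a list of measures (check the package reference for the exact signature):

```r
library(mlrHyperopt)

# assumed: wrap a random-search budget, 3-fold CV, and the mmce measure
hyper.control = makeHyperControl(
  mlr.control = makeTuneControlRandom(maxit = 10),  # 10 random evaluations
  resampling = cv3,                                 # mlr's predefined 3-fold CV
  measures = list(mmce)
)
hyperopt(iris.task, learner = "classif.svm", hyper.control = hyper.control)
```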

show.info

[logical(1)] Print verbose output on the console? The default is set via configureMlr.
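
Since the default comes from mlr's global options, the tuning trace can be silenced once for the whole session instead of per call. A short sketch:

```r
library(mlr)
library(mlrHyperopt)

# set the global default; subsequent hyperopt() calls inherit it
configureMlr(show.info = FALSE)
res = hyperopt(iris.task, "classif.svm")  # runs without the [Tune] trace

# or override it for a single call
res = hyperopt(iris.task, "classif.svm", show.info = TRUE)
```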

Value

[TuneResult]
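
The returned object is a regular mlr TuneResult, so the usual accessors apply. A brief sketch (slot names as documented for mlr's tuning results):

```r
library(mlrHyperopt)

res = hyperopt(iris.task, "classif.svm")
res$x         # named list of the best parameter settings found
res$y         # performance of the best setting, e.g. mmce.test.mean
res$opt.path  # full optimization path of all evaluated settings
```

The optimal settings in res$x can then be applied to the learner with mlr's setHyperPars() before training a final model.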

Examples

# the shortest way of hyperparameter optimization
hyperopt(iris.task, "classif.svm")
#> [Tune] Started tuning learner classif.svm for parameter set:
#>          Type len Def    Constr Req Tunable Trafo
#> cost  numeric   -   0 -15 to 15   -    TRUE     Y
#> gamma numeric   -  -2 -15 to 15   -    TRUE     Y
#> With control class: TuneControlMBO
#> Imputation value: 1
#> [Tune-x] 1: cost=0.0383; gamma=2.73e+04
#> [Tune-y] 1: mmce.test.mean=0.8266667; time: 0.0 min
#> [Tune-x] 2: cost=545; gamma=0.0878
#> [Tune-y] 2: mmce.test.mean=0.0533333; time: 0.0 min
#> [Tune-x] 3: cost=17.6; gamma=4.63e-05
#> [Tune-y] 3: mmce.test.mean=0.6466667; time: 0.0 min
#> [Tune-x] 4: cost=0.138; gamma=3.27
#> [Tune-y] 4: mmce.test.mean=0.1400000; time: 0.0 min
#> [Tune-x] 5: cost=0.000291; gamma=35.3
#> [Tune-y] 5: mmce.test.mean=0.8000000; time: 0.0 min
#> [Tune-x] 6: cost=5.01; gamma=0.0029
#> [Tune-y] 6: mmce.test.mean=0.0600000; time: 0.0 min
#> [Tune-x] 7: cost=8.99e+03; gamma=1.32e+03
#> [Tune-y] 7: mmce.test.mean=0.8066667; time: 0.0 min
#> [Tune-x] 8: cost=0.000411; gamma=0.00814
#> [Tune-y] 8: mmce.test.mean=0.7800000; time: 0.0 min
#> Loading required package: rgenoud
#> ##  rgenoud (Version 5.8-1.0, Build Date: 2017-10-10)
#> ##  See http://sekhon.berkeley.edu/rgenoud for additional documentation.
#> ##  Please cite software as:
#> ##   Walter Mebane, Jr. and Jasjeet S. Sekhon. 2011.
#> ##   ``Genetic Optimization Using Derivatives: The rgenoud package for R.''
#> ##   Journal of Statistical Software, 42(11): 1-26.
#> ##
#> [Tune-x] 9: cost=1.2e+04; gamma=1.54
#> [Tune-y] 9: mmce.test.mean=0.0466667; time: 0.0 min
#> [Tune-x] 10: cost=0.138; gamma=124
#> [Tune-y] 10: mmce.test.mean=0.8200000; time: 0.0 min
#> [Tune-x] 11: cost=23.8; gamma=0.818
#> [Tune-y] 11: mmce.test.mean=0.0666667; time: 0.0 min
#> [Tune-x] 12: cost=1.19e+03; gamma=0.0048
#> [Tune-y] 12: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 13: cost=81.5; gamma=0.0137
#> [Tune-y] 13: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 14: cost=3.26e+04; gamma=0.0269
#> [Tune-y] 14: mmce.test.mean=0.0533333; time: 0.0 min
#> [Tune-x] 15: cost=3.27e+04; gamma=0.304
#> [Tune-y] 15: mmce.test.mean=0.0533333; time: 0.0 min
#> [Tune-x] 16: cost=90.4; gamma=0.00372
#> [Tune-y] 16: mmce.test.mean=0.0266667; time: 0.0 min
#> [Tune-x] 17: cost=235; gamma=2.92
#> [Tune-y] 17: mmce.test.mean=0.0600000; time: 0.0 min
#> [Tune-x] 18: cost=3.26e+04; gamma=0.0022
#> [Tune-y] 18: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 19: cost=342; gamma=0.00206
#> [Tune-y] 19: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 20: cost=13; gamma=0.0405
#> [Tune-y] 20: mmce.test.mean=0.0333333; time: 0.0 min
#> Warning: Stopped because hard maximum generation limit was hit.
#> [Tune-x] 21: cost=6.06; gamma=0.169
#> [Tune-y] 21: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 22: cost=3.27e+04; gamma=3.39
#> [Tune-y] 22: mmce.test.mean=0.0600000; time: 0.0 min
#> [Tune-x] 23: cost=5.84e+03; gamma=0.0031
#> [Tune-y] 23: mmce.test.mean=0.0466667; time: 0.0 min
#> [Tune-x] 24: cost=170; gamma=0.00611
#> [Tune-y] 24: mmce.test.mean=0.0333333; time: 0.0 min
#> Warning: Stopped because hard maximum generation limit was hit.
#> [Tune-x] 25: cost=6.02; gamma=2.04
#> [Tune-y] 25: mmce.test.mean=0.0533333; time: 0.0 min
#> [Tune] Result: cost=90.4; gamma=0.00372 : mmce.test.mean=0.0266667
#> Tune result:
#> Op. pars: cost=90.4; gamma=0.00372
#> mmce.test.mean=0.0266667
# manually defining the parameter space configuration
par.config = makeParConfig(
  par.set = makeParamSet(
    makeIntegerParam("mtry", lower = 1, upper = 4),
    makeDiscreteParam("ntree", values = c(10, 25, 50))
  ),
  par.vals = list(replace = FALSE),
  learner.name = "randomForest"
)
hyperopt(bh.task, par.config = par.config)
#> [Tune] Started tuning learner regr.randomForest for parameter set:
#>           Type len Def   Constr Req Tunable Trafo
#> mtry   integer   -   -   1 to 4   -    TRUE     -
#> ntree discrete   -   - 10,25,50   -    TRUE     -
#> With control class: TuneControlGrid
#> Imputation value: Inf
#> [Tune-x] 1: mtry=1; ntree=10
#> [Tune-y] 1: mse.test.mean=22.1903200; time: 0.0 min
#> [Tune-x] 2: mtry=2; ntree=10
#> [Tune-y] 2: mse.test.mean=14.7553338; time: 0.0 min
#> [Tune-x] 3: mtry=3; ntree=10
#> [Tune-y] 3: mse.test.mean=14.0760079; time: 0.0 min
#> [Tune-x] 4: mtry=4; ntree=10
#> [Tune-y] 4: mse.test.mean=12.1160928; time: 0.0 min
#> [Tune-x] 5: mtry=1; ntree=25
#> [Tune-y] 5: mse.test.mean=20.1467504; time: 0.0 min
#> [Tune-x] 6: mtry=2; ntree=25
#> [Tune-y] 6: mse.test.mean=14.3714560; time: 0.0 min
#> [Tune-x] 7: mtry=3; ntree=25
#> [Tune-y] 7: mse.test.mean=11.4527973; time: 0.0 min
#> [Tune-x] 8: mtry=4; ntree=25
#> [Tune-y] 8: mse.test.mean=11.0731591; time: 0.0 min
#> [Tune-x] 9: mtry=1; ntree=50
#> [Tune-y] 9: mse.test.mean=20.7157229; time: 0.0 min
#> [Tune-x] 10: mtry=2; ntree=50
#> [Tune-y] 10: mse.test.mean=13.4406667; time: 0.0 min
#> [Tune-x] 11: mtry=3; ntree=50
#> [Tune-y] 11: mse.test.mean=11.1015338; time: 0.0 min
#> [Tune-x] 12: mtry=4; ntree=50
#> [Tune-y] 12: mse.test.mean=10.6527538; time: 0.0 min
#> [Tune] Result: mtry=4; ntree=50 : mse.test.mean=10.6527538
#> Tune result:
#> Op. pars: mtry=4; ntree=50
#> mse.test.mean=10.6527538