Tunes the hyperparameters of a learner for a given task. Tries to find the parameter set that yields the best performance for the given learner.

hyperopt(
  task,
  learner = NULL,
  par.config = NULL,
  hyper.control = NULL,
  show.info = getMlrOptions()$show.info
)

Arguments

task

[Task] The task.

learner

[Learner] The learner whose hyperparameters are tuned. If no learner is given, the learner referenced in the par.config is used, if available.

par.config

[ParConfig] The parameter configuration.

hyper.control

[HyperControl] The hyperparameter control object.

show.info

[logical(1)]
Print verbose output on console? Default is set via configureMlr.

Value

[TuneResult]

Examples

# the shortest way of hyperparameter optimization
hyperopt(iris.task, "classif.svm")
#> [Tune] Started tuning learner classif.svm for parameter set:
#>          Type len Def    Constr Req Tunable Trafo
#> cost  numeric   -   0 -15 to 15   -    TRUE     Y
#> gamma numeric   -  -2 -15 to 15   -    TRUE     Y
#> With control class: TuneControlMBO
#> Imputation value: 1
#> [Tune-x] 1: cost=166; gamma=1.05
#> [Tune-y] 1: mmce.test.mean=0.0600000; time: 0.0 min
#> [Tune-x] 2: cost=0.13; gamma=88.1
#> [Tune-y] 2: mmce.test.mean=0.8066667; time: 0.0 min
#> [Tune-x] 3: cost=1.64e+04; gamma=0.0116
#> [Tune-y] 3: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 4: cost=0.000347; gamma=5.45e+03
#> [Tune-y] 4: mmce.test.mean=0.7933333; time: 0.0 min
#> [Tune-x] 5: cost=1.67; gamma=0.203
#> [Tune-y] 5: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 6: cost=0.000737; gamma=0.00367
#> [Tune-y] 6: mmce.test.mean=0.7533333; time: 0.0 min
#> [Tune-x] 7: cost=294; gamma=0.000242
#> [Tune-y] 7: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 8: cost=0.0565; gamma=205
#> [Tune-y] 8: mmce.test.mean=0.8133333; time: 0.0 min
#> [Tune-x] 9: cost=7.58; gamma=0.000172
#> [Tune-y] 9: mmce.test.mean=0.4400000; time: 0.0 min
#> [Tune-x] 10: cost=244; gamma=0.0519
#> [Tune-y] 10: mmce.test.mean=0.0466667; time: 0.0 min
#> [Tune-x] 11: cost=2.89e+03; gamma=0.0019
#> [Tune-y] 11: mmce.test.mean=0.0266667; time: 0.0 min
#> [Tune-x] 12: cost=885; gamma=46.3
#> [Tune-y] 12: mmce.test.mean=0.4066667; time: 0.0 min
#> [Tune-x] 13: cost=1.73e+04; gamma=0.000198
#> [Tune-y] 13: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 14: cost=43.2; gamma=0.239
#> [Tune-y] 14: mmce.test.mean=0.0466667; time: 0.0 min
#> [Tune-x] 15: cost=1.63e+03; gamma=0.000361
#> [Tune-y] 15: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 16: cost=6.84e+03; gamma=0.317
#> [Tune-y] 16: mmce.test.mean=0.0733333; time: 0.0 min
#> [Tune-x] 17: cost=606; gamma=0.00487
#> [Tune-y] 17: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 18: cost=781; gamma=3.05e-05
#> [Tune-y] 18: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 19: cost=3.28e+04; gamma=0.00185
#> [Tune-y] 19: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 20: cost=347; gamma=0.25
#> [Tune-y] 20: mmce.test.mean=0.0600000; time: 0.0 min
#> [Tune-x] 21: cost=5.55; gamma=0.454
#> [Tune-y] 21: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 22: cost=486; gamma=0.00114
#> [Tune-y] 22: mmce.test.mean=0.0333333; time: 0.0 min
#> [Tune-x] 23: cost=783; gamma=9.87e-05
#> [Tune-y] 23: mmce.test.mean=0.0466667; time: 0.0 min
#> [Tune-x] 24: cost=6.22e+03; gamma=3.05e-05
#> [Tune-y] 24: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune-x] 25: cost=9.09e+03; gamma=0.00289
#> [Tune-y] 25: mmce.test.mean=0.0400000; time: 0.0 min
#> [Tune] Result: cost=2.89e+03; gamma=0.0019 : mmce.test.mean=0.0266667
#> Tune result:
#> Op. pars: cost=2.89e+03; gamma=0.0019
#> mmce.test.mean=0.0266667
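If the built-in defaults are not suitable, the tuning budget, resampling strategy, and performance measure can be set through the hyper.control argument. A minimal sketch, assuming makeHyperControl() accepts the arguments mlr.control, resampling, and measures (check your mlrHyperopt version if the signature differs):

```r
# Sketch of a custom control object; mlr.control, resampling and measures
# are assumed argument names of mlrHyperopt's makeHyperControl().
hyper.control = makeHyperControl(
  mlr.control = makeTuneControlRandom(maxit = 20),  # random search, 20 evaluations
  resampling = cv5,                                 # 5-fold cross-validation
  measures = mmce                                   # mean misclassification error
)
hyperopt(iris.task, learner = "classif.svm", hyper.control = hyper.control)
```
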
# manually defining the parameter space configuration
par.config = makeParConfig(
  par.set = makeParamSet(
    makeIntegerParam("mtry", lower = 1, upper = 4),
    makeDiscreteParam("ntree", values = c(10, 25, 50))
  ),
  par.vals = list(replace = FALSE),
  learner.name = "randomForest"
)
hyperopt(bh.task, par.config = par.config)
#> [Tune] Started tuning learner regr.randomForest for parameter set:
#>           Type len Def   Constr Req Tunable Trafo
#> mtry   integer   -   -   1 to 4   -    TRUE     -
#> ntree discrete   -   - 10,25,50   -    TRUE     -
#> With control class: TuneControlGrid
#> Imputation value: Inf
#> [Tune-x] 1: mtry=1; ntree=10
#> [Tune-y] 1: mse.test.mean=23.2214697; time: 0.0 min
#> [Tune-x] 2: mtry=2; ntree=10
#> [Tune-y] 2: mse.test.mean=15.9537085; time: 0.0 min
#> [Tune-x] 3: mtry=3; ntree=10
#> [Tune-y] 3: mse.test.mean=11.4362343; time: 0.0 min
#> [Tune-x] 4: mtry=4; ntree=10
#> [Tune-y] 4: mse.test.mean=12.8585559; time: 0.0 min
#> [Tune-x] 5: mtry=1; ntree=25
#> [Tune-y] 5: mse.test.mean=21.8955031; time: 0.0 min
#> [Tune-x] 6: mtry=2; ntree=25
#> [Tune-y] 6: mse.test.mean=13.6541942; time: 0.0 min
#> [Tune-x] 7: mtry=3; ntree=25
#> [Tune-y] 7: mse.test.mean=11.9837778; time: 0.0 min
#> [Tune-x] 8: mtry=4; ntree=25
#> [Tune-y] 8: mse.test.mean=11.8104998; time: 0.0 min
#> [Tune-x] 9: mtry=1; ntree=50
#> [Tune-y] 9: mse.test.mean=20.3497017; time: 0.0 min
#> [Tune-x] 10: mtry=2; ntree=50
#> [Tune-y] 10: mse.test.mean=13.6800808; time: 0.0 min
#> [Tune-x] 11: mtry=3; ntree=50
#> [Tune-y] 11: mse.test.mean=10.9099665; time: 0.0 min
#> [Tune-x] 12: mtry=4; ntree=50
#> [Tune-y] 12: mse.test.mean=10.8271211; time: 0.0 min
#> [Tune] Result: mtry=4; ntree=50 : mse.test.mean=10.8271211
#> Tune result:
#> Op. pars: mtry=4; ntree=50
#> mse.test.mean=10.8271211
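The returned TuneResult is a regular mlr tune result, so the best parameter values can be read from its x element and applied to a learner with setHyperPars(). A minimal sketch, assuming mlr and mlrHyperopt are loaded:

```r
# Using the TuneResult returned by hyperopt() to train a final model.
res = hyperopt(iris.task, learner = "classif.svm")

res$x  # best parameter values found (a named list, e.g. cost and gamma)
res$y  # performance of the best setting, e.g. mmce.test.mean

# apply the tuned values to a fresh learner and train on the full task
lrn = setHyperPars(makeLearner("classif.svm"), par.vals = res$x)
mod = train(lrn, iris.task)
```
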