Hyperparameter optimization package of the 'mlr3' ecosystem. It
features highly configurable search spaces via the 'paradox' package and
finds optimal hyperparameter configurations for any 'mlr3' learner.
'mlr3tuning' works with several optimization algorithms, e.g. Random
Search, Iterated Racing, Bayesian Optimization (via 'mlr3mbo') and
Hyperband (via 'mlr3hyperband'). Moreover, it can automatically optimize
learners and estimate the performance of optimized models with nested
resampling.
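
A minimal sketch of the workflow described above, assuming the `tune()` sugar function and the `lrn()`/`tsk()`/`rsmp()`/`msr()` helpers available in recent 'mlr3' and 'mlr3tuning' releases; the task, learner, and parameter ranges are illustrative:

```r
library(mlr3)
library(mlr3tuning)

# Search space declared inline via paradox's to_tune()
# (ranges here are illustrative, not recommendations)
learner = lrn("classif.rpart",
  cp       = to_tune(1e-4, 1e-1, logscale = TRUE),
  minsplit = to_tune(2, 64)
)

# Random search with a fixed evaluation budget
instance = tune(
  tuner      = tnr("random_search"),
  task       = tsk("sonar"),
  learner    = learner,
  resampling = rsmp("cv", folds = 3),
  measures   = msr("classif.ce"),
  term_evals = 20
)

instance$result  # best hyperparameter configuration found
```

For nested resampling, the same components can be wrapped in an AutoTuner (e.g. via `auto_tuner()` in recent releases), which behaves like an ordinary learner and can be passed to `resample()` to get an unbiased performance estimate of the tuned model.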
Version: 0.17.2
Depends: mlr3 (≥ 0.14.1), paradox (≥ 0.10.0), R (≥ 3.1.0)
Imports: bbotk (≥ 0.7.2), checkmate (≥ 2.0.0), data.table, lgr, mlr3misc (≥ 0.11.0), R6
Suggests: adagio, GenSA, irace, mlr3learners (≥ 0.5.5), mlr3pipelines, nloptr, rpart, testthat (≥ 3.0.0), xgboost
Published: 2022-12-22
Author: Marc Becker [cre, aut], Michel Lang [aut], Jakob Richter [aut], Bernd Bischl [aut], Daniel Schalk [aut]
Maintainer: Marc Becker <marcbecker at posteo.de>
BugReports: https://github.com/mlr-org/mlr3tuning/issues
License: LGPL-3
URL: https://mlr3tuning.mlr-org.com, https://github.com/mlr-org/mlr3tuning
NeedsCompilation: no
Materials: README NEWS
CRAN checks: mlr3tuning results