Calls mboost::mboost().

The family parameter is specified slightly differently than in mboost. Whereas mboost takes a mboost::Family() object, this learner takes a string that identifies the distribution to use. Because the default in mboost is the Gaussian family, which is not compatible with survival models, the default here is "coxph".
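
For instance, a different loss can be selected by passing one of the accepted strings at construction. A minimal sketch (the "weibull" value and the attached mlr3proba package are assumptions; consult the learner's parameter set for the exact list of accepted strings):

library(mlr3)
library(mlr3proba)

# pass the family as a string at construction; "weibull" is assumed here
# to be one of the accepted identifiers, "coxph" is the default
learner = lrn("surv.mboost", family = "weibull")
learner$param_set$values$family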

If the value given to the family parameter is "custom.family", then an object of class mboost::Family() needs to be passed to the custom.family parameter.
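
As a sketch of this custom route (the "custom.family" value string follows the paragraph above, and mboost::CoxPH() merely stands in for any user-defined family; verify both against the learner's parameter set):

library(mlr3)
library(mlr3proba)

# hand the learner a ready-made mboost family object;
# mboost::CoxPH() stands in for any object built with mboost::Family()
learner = lrn("surv.mboost")
learner$param_set$values = mlr3misc::insert_named(
  learner$param_set$values,
  list(family = "custom.family", custom.family = mboost::CoxPH())
)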

The only difference between LearnerSurvGamboost and LearnerSurvMboost is that the latter allows default degrees of freedom to be specified for smooth effects selected via baselearner = "bbs". In all other cases, degrees of freedom need to be set manually through an explicit definition of the corresponding base-learner.
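
As a sketch, smooth P-spline base-learners are requested through the baselearner parameter (parameter name taken from the example at the end of this page; attaching mlr3proba and the simsurv task generator are assumptions):

library(mlr3)
library(mlr3proba)

# component-wise P-spline (smooth) base-learners; "bols" and "btree"
# are the usual alternatives, cf. the example at the end of this page
task = tgen("simsurv")$generate(50)
learner = lrn("surv.mboost", baselearner = "bbs")
learner$train(task)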

Format

R6::R6Class() inheriting from LearnerSurv.

Construction

LearnerSurvMboost$new()
mlr_learners$get("surv.mboost")
lrn("surv.mboost")

Meta Information

  • Type: "surv"

  • Predict Types: distr, crank, lp

  • Feature Types: integer, numeric, factor, logical

  • Packages: mboost, distr6, survival

References

Bühlmann P, Yu B (2003). “Boosting With the L2 Loss.” Journal of the American Statistical Association, 98(462), 324--339. doi: 10.1198/016214503000125.

Bühlmann P, Hothorn T (2007). “Boosting Algorithms: Regularization, Prediction and Model Fitting.” Statistical Science, 22(4), 477--505. doi: 10.1214/07-sts242.

Kneib T, Hothorn T, Tutz G (2008). “Variable Selection and Model Choice in Geoadditive Regression Models.” Biometrics, 65(2), 626--634. doi: 10.1111/j.1541-0420.2008.01112.x.

Schmid M, Hothorn T (2008). “Boosting additive models using component-wise P-Splines.” Computational Statistics & Data Analysis, 53(2), 298--311. doi: 10.1016/j.csda.2008.09.009.

Hothorn T, Bühlmann P, Kneib T, Schmid M, Hofner B (2010). “Model-based boosting 2.0.” Journal of Machine Learning Research, 11(Aug), 2109--2113. http://www.jmlr.org/papers/v11/hothorn10a.html.

Hofner B, Mayr A, Robinzonov N, Schmid M (2012). “Model-based boosting in R: a hands-on tutorial using the R package mboost.” Computational Statistics, 29(1-2), 3--35. doi: 10.1007/s00180-012-0382-5.

See also

Examples

library(mlr3)
task = tgen("simsurv")$generate(20)
learner = lrn("surv.mboost")
learner$param_set$values = mlr3misc::insert_named(learner$param_set$values,
  list(center = TRUE, baselearner = "bols"))
resampling = rsmp("cv", folds = 2)
resample(task, learner, resampling)
#> <ResampleResult> of 2 iterations
#> * Task: simsurv
#> * Learner: surv.mboost
#> * Warnings: 0 in 0 iterations
#> * Errors: 0 in 0 iterations