Calls mboost::mboost().

The family parameter is specified slightly differently than in mboost. Whereas the latter takes in objects of class mboost::Family(), this learner instead takes a string that identifies which family to use. As the default in mboost is the Gaussian family, which is not compatible with survival models, the default here is "coxph".

If the family parameter is set to "custom", an object of class mboost::Family() needs to be passed to the custom.family parameter.
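
For illustration, a minimal sketch of both mechanisms; it assumes the package providing this learner (e.g. mlr3proba) is loaded so that lrn("surv.mboost") is available:

library(mlr3)
learner = lrn("surv.mboost")

# choose a family by its string identifier rather than an mboost object
learner$param_set$values$family = "coxph"

# or signal a custom family and pass the mboost::Family() object directly
learner$param_set$values$family = "custom"
learner$param_set$values$custom.family = mboost::CoxPH()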

The only difference between LearnerSurvMboost and LearnerSurvGamboost is that the latter allows one to specify default degrees of freedom for smooth effects specified via baselearner = "bbs". In all other cases, degrees of freedom need to be set manually via a specific definition of the corresponding base-learner.
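
To make the distinction concrete, a sketch of the two underlying mboost calls on simulated data (the data frame and variable names are purely illustrative):

library(mboost)
library(survival)

set.seed(1)
dat = data.frame(time = rexp(100), status = rbinom(100, 1, 0.7),
                 x1 = runif(100), x2 = rnorm(100))

# gamboost: a default df for all bbs() smooth effects can be set via dfbase
fit_gam = gamboost(Surv(time, status) ~ x1 + x2, data = dat,
                   family = CoxPH(), baselearner = "bbs", dfbase = 4)

# mboost: degrees of freedom are set inside each base-learner definition
fit_mb = mboost(Surv(time, status) ~ bbs(x1, df = 4) + bbs(x2, df = 4),
                data = dat, family = CoxPH())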

Format

R6::R6Class() inheriting from LearnerSurv.

Construction

LearnerSurvMboost$new()
mlr_learners$get("surv.mboost")
lrn("surv.mboost")

Meta Information

  • Type: "surv"

  • Predict Types: distr, crank, lp (illustrated below)

  • Feature Types: integer, numeric, factor, logical

  • Packages: mboost, distr6, survival
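
The predict types listed above can be inspected after training; a short sketch, assuming the rats task also used in the Examples below:

library(mlr3)
task = tsk("rats")
learner = lrn("surv.mboost")
learner$train(task)
p = learner$predict(task)
p$lp     # linear predictor
p$crank  # continuous risk ranking
p$distr  # predicted survival distribution (a distr6 object)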

References

Peter Buehlmann and Bin Yu (2003), Boosting with the L2 loss: regression and classification. Journal of the American Statistical Association, 98, 324–339.

Peter Buehlmann and Torsten Hothorn (2007), Boosting algorithms: regularization, prediction and model fitting. Statistical Science, 22(4), 477–505.

Thomas Kneib, Torsten Hothorn and Gerhard Tutz (2009), Variable selection and model choice in geoadditive regression models, Biometrics, 65(2), 626–634.

Matthias Schmid and Torsten Hothorn (2008), Boosting additive models using component-wise P-splines as base-learners. Computational Statistics & Data Analysis, 53(2), 298–311.

Torsten Hothorn, Peter Buehlmann, Thomas Kneib, Matthias Schmid and Benjamin Hofner (2010), Model-based Boosting 2.0. Journal of Machine Learning Research, 11, 2109–2113.

Benjamin Hofner, Andreas Mayr, Nikolay Robinzonov and Matthias Schmid (2014). Model-based Boosting in R: A Hands-on Tutorial Using the R Package mboost. Computational Statistics, 29, 3–35. doi: 10.1007/s00180-012-0382-5

Examples

library(mlr3)
task = tsk("rats")
learner = lrn("surv.mboost")
learner$param_set$values = mlr3misc::insert_named(learner$param_set$values,
  list(center = TRUE, baselearner = "bols"))
resampling = rsmp("cv", folds = 3)
resample(task, learner, resampling)
#> <ResampleResult> of 3 iterations
#> * Task: rats
#> * Learner: surv.mboost
#> * Warnings: 0 in 0 iterations
#> * Errors: 0 in 0 iterations
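
As a possible follow-up, the resampling result could be stored and scored with a survival measure; this assumes mlr3proba is loaded so that msr("surv.cindex") is available:

rr = resample(task, learner, resampling)
rr$aggregate(msr("surv.cindex"))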