Calls mboost::gamboost().

The family parameter is specified slightly differently than in mboost: whereas mboost takes a family object, this learner takes a string identifying which distribution to use. As the default family in mboost is Gaussian, which is not compatible with survival models, the default here is "coxph".

If the value given to the family parameter is "", then an object of class mboost::Family() needs to be passed to the learner instead.
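An illustrative sketch of string-based family selection (hedged: assumes the learner is available, e.g. via mlr3proba, and that "weibull" is among the accepted family strings alongside the default "coxph"):

```r
library(mlr3)
library(mlr3proba)  # assumed to provide the "surv.gamboost" learner

learner = lrn("surv.gamboost")

# Select the distribution by string rather than by an mboost family object;
# "weibull" is assumed here to be one of the accepted values.
learner$param_set$values$family = "weibull"
```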

The only difference between LearnerSurvGamboost and LearnerSurvMboost is that the former allows one to specify default degrees of freedom for smooth effects specified via baselearner = "bbs". In all other cases, degrees of freedom need to be set manually via a specific definition of the corresponding base-learner.


This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():
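For example (a minimal sketch; assumes the package defining this learner, e.g. mlr3proba, is installed and loaded):

```r
library(mlr3)
library(mlr3proba)  # assumed to register "surv.gamboost" with mlr_learners

# Via the sugar function
learner = lrn("surv.gamboost")

# Equivalently, via the dictionary
learner = mlr_learners$get("surv.gamboost")
```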


Meta Information

  • Type: "surv"

  • Predict Types: distr, crank, lp, response

  • Feature Types: integer, numeric, factor, logical

  • Properties: importance, selected_features, weights

  • Packages: mboost, distr6, survival


References

Bühlmann P, Yu B (2003). “Boosting With the L2 Loss: Regression and Classification.” Journal of the American Statistical Association, 98(462), 324--339. doi:10.1198/016214503000125.

Bühlmann P, Hothorn T (2007). “Boosting Algorithms: Regularization, Prediction and Model Fitting.” Statistical Science, 22(4), 477--505. doi:10.1214/07-sts242.

Kneib T, Hothorn T, Tutz G (2008). “Variable Selection and Model Choice in Geoadditive Regression Models.” Biometrics, 65(2), 626--634. doi:10.1111/j.1541-0420.2008.01112.x.

Schmid M, Hothorn T (2008). “Boosting additive models using component-wise P-Splines.” Computational Statistics & Data Analysis, 53(2), 298--311. doi:10.1016/j.csda.2008.09.009.

Hothorn T, Bühlmann P, Kneib T, Schmid M, Hofner B (2010). “Model-based boosting 2.0.” Journal of Machine Learning Research, 11(Aug), 2109--2113.

Hofner B, Mayr A, Robinzonov N, Schmid M (2012). “Model-based boosting in R: a hands-on tutorial using the R package mboost.” Computational Statistics, 29(1-2), 3--35. doi:10.1007/s00180-012-0382-5.

See also

Super classes

mlr3::Learner -> mlr3proba::LearnerSurv -> LearnerSurvGamboost


Public methods

Inherited methods

Method new()

Creates a new instance of this R6 class.

LearnerSurvGamboost$new()



Method importance()

The importance scores are extracted with the function mboost::varimp() with the default arguments.




Returns a named numeric() vector of importance scores.
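A usage sketch (the simsurv task generator and parameter choices are illustrative only; a trained model is required before importance scores can be extracted):

```r
library(mlr3)
library(mlr3proba)  # assumed to provide "surv.gamboost" and the simsurv generator

task = tgen("simsurv")$generate(50)
learner = lrn("surv.gamboost", baselearner = "bols")
learner$train(task)

# Named numeric vector of importance scores, via mboost::varimp()
imp = learner$importance()
```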

Method selected_features()

Selected features are extracted with the function mboost::variable.names.mboost(), with used.only = TRUE. Returns a character() vector of feature names.
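A usage sketch (illustrative task and parameters as above; only features actually used by the boosted model are returned):

```r
library(mlr3)
library(mlr3proba)  # assumed to provide "surv.gamboost" and the simsurv generator

task = tgen("simsurv")$generate(50)
learner = lrn("surv.gamboost", baselearner = "bols")
learner$train(task)

# Character vector of features selected by the boosting procedure
sel = learner$selected_features()
```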





Method clone()

The objects of this class are cloneable with this method.


LearnerSurvGamboost$clone(deep = FALSE)



deep: Whether to make a deep clone.


Examples

library(mlr3)
task = tgen("simsurv")$generate(20)
learner = lrn("surv.gamboost")
learner$param_set$values = mlr3misc::insert_named(
  learner$param_set$values,
  list(dfbase = 3, center = TRUE, baselearner = "bols"))
resampling = rsmp("cv", folds = 2)
resample(task, learner, resampling)
#> <ResampleResult> of 2 iterations
#> * Task: simsurv
#> * Learner: surv.gamboost
#> * Warnings: 0 in 0 iterations
#> * Errors: 0 in 0 iterations