Gradient boosting with regression trees for survival analysis. Calls mboost::blackboost() from package mboost.

The family parameter is specified slightly differently than in mboost. Whereas mboost takes Family objects, this learner instead takes a string identifying which distribution to use. Because the default in mboost is the Gaussian family, which is not compatible with survival models, the default here is "coxph".

If the family parameter is set to "", an object of class mboost::Family() needs to be passed to the learner instead.
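A minimal sketch of selecting the distribution by string, as described above. It assumes the learner is available via mlr3proba and that the parameter is named family; the exact parameter set may differ between versions.

```r
library(mlr3)
library(mlr3proba)

# Select the distribution with a string instead of an mboost Family object.
# "coxph" is also the default, so passing it here is only for illustration.
learner = lrn("surv.blackboost", family = "coxph")
```

Passing a custom mboost::Family() object follows the "" mechanism described above rather than the string shortcut.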


R6::R6Class() inheriting from LearnerSurv.



Meta Information


References

Peter Buehlmann and Torsten Hothorn (2007), Boosting algorithms: regularization, prediction and model fitting. Statistical Science, 22(4), 477–505.

Torsten Hothorn, Kurt Hornik and Achim Zeileis (2006). Unbiased recursive partitioning: A conditional inference framework. Journal of Computational and Graphical Statistics, 15(3), 651–674.

Yoav Freund and Robert E. Schapire (1996), Experiments with a new boosting algorithm. In Machine Learning: Proc. Thirteenth International Conference, 148–156.

Jerome H. Friedman (2001), Greedy function approximation: A gradient boosting machine. The Annals of Statistics, 29, 1189–1232.

Greg Ridgeway (1999), The state of boosting. Computing Science and Statistics, 31, 172–181.

See also


library(mlr3)
library(mlr3proba)
task = tgen("simsurv")$generate(200)
learner = lrn("surv.blackboost")
resampling = rsmp("cv", folds = 3)
resample(task, learner, resampling)
#> <ResampleResult> of 3 iterations
#> * Task: simsurv
#> * Learner: surv.blackboost
#> * Warnings: 0 in 0 iterations
#> * Errors: 0 in 0 iterations
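Beyond resampling, the learner can be trained and used for prediction directly. The following is a sketch using a single train/test split; it assumes mlr3proba is installed and uses mlr3::partition() for the split.

```r
library(mlr3)
library(mlr3proba)

task = tgen("simsurv")$generate(200)
learner = lrn("surv.blackboost")

# Hold out 20% of the rows, train on the rest, and predict on the held-out set
split = partition(task, ratio = 0.8)
learner$train(task, row_ids = split$train)
prediction = learner$predict(task, row_ids = split$test)

# Evaluate with the task's default survival measure
prediction$score()
```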