Gradient Boosting Machine for survival analysis. Calls gbm::gbm() from package gbm.

The parameter distribution is initialized to "coxph", as this is the only distribution gbm::gbm() implements for survival analysis; keep.data is initialized to FALSE to reduce memory usage.
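Other gbm::gbm() hyperparameters remain configurable through the learner's parameter set. A minimal sketch, assuming the mlr3 package is loaded and the learner key "surv.gbm" is registered (the specific values below are illustrative, not defaults):

```r
library(mlr3)

learner = lrn("surv.gbm")

# distribution = "coxph" and keep.data = FALSE are preset by the learner;
# further gbm::gbm() arguments can be supplied via the parameter set, e.g.:
learner$param_set$values$n.trees = 500
learner$param_set$values$interaction.depth = 2
```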

Format

R6::R6Class() inheriting from LearnerSurv.

Construction

LearnerSurvGBM$new()
mlr_learners$get("surv.gbm")
lrn("surv.gbm")
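Once constructed, the learner is trained and queried like any other mlr3 learner. A minimal sketch, assuming mlr3 and mlr3proba are installed and using the "rats" survival task bundled with mlr3proba (depending on your installation, the learner may be provided by an extension package such as mlr3extralearners):

```r
library(mlr3)
library(mlr3proba)

# load a bundled survival task and construct the learner
task = tsk("rats")
learner = lrn("surv.gbm")

# train on one subset of rows, predict on another
learner$train(task, row_ids = 1:250)
prediction = learner$predict(task, row_ids = 251:300)

# linear predictors (risk scores) from the boosted Cox model
prediction$lp
```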

Meta Information

  • Type: "surv"

  • Predict Types: crank, lp

  • Feature Types: integer, numeric, factor, ordered

  • Packages: gbm

References

Y. Freund and R.E. Schapire (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences 55(1):119-139.

G. Ridgeway (1999). The state of boosting. Computing Science and Statistics 31:172-181.

J.H. Friedman, T. Hastie, R. Tibshirani (2000). Additive Logistic Regression: a Statistical View of Boosting. Annals of Statistics 28(2):337-374.

J.H. Friedman (2001). Greedy Function Approximation: A Gradient Boosting Machine. Annals of Statistics 29(5):1189-1232.

J.H. Friedman (2002). Stochastic Gradient Boosting. Computational Statistics and Data Analysis 38(4):367-378.

B. Kriegler (2007). Cost-Sensitive Stochastic Gradient Boosting Within a Quantitative Regression Framework. Ph.D. Dissertation. University of California at Los Angeles, Los Angeles, CA, USA. Advisor(s) Richard A. Berk. https://dl.acm.org/citation.cfm?id=1354603

C. Burges (2010). From RankNet to LambdaRank to LambdaMART: An Overview. Microsoft Research Technical Report MSR-TR-2010-82.

See also