eXtreme Gradient Boosting regression for the Cox PH model. Calls xgboost::xgb.train() from package xgboost.

We changed the following defaults for this learner:

  • Verbosity is reduced by setting verbose to 0.

  • Number of boosting iterations nrounds is set to 1.

  • The objective is set to survival:cox.

  • The eval_metric is set to cox-nloglik.
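These defaults can be overridden at construction time. A minimal sketch, assuming mlr3proba is installed and "surv.xgboost" is registered in mlr_learners; the chosen hyperparameter values are illustrative only:

```r
library(mlr3proba)

# Override the learner's defaults; additional arguments are passed
# through to xgboost::xgb.train().
learner = lrn("surv.xgboost",
  nrounds = 100,  # raise the number of boosting iterations from the default 1
  eta     = 0.1,  # learning rate, forwarded to xgboost
  verbose = 0     # keep xgboost quiet (already the learner's default)
)

learner$param_set$values$nrounds  # inspect the configured value
```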

Dictionary

This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

LearnerSurvXgboost$new()
mlr_learners$get("surv.xgboost")
lrn("surv.xgboost")

Meta Information

  • Type: "surv"

  • Predict Types: crank, lp

  • Feature Types: logical, integer, numeric

  • Properties: importance, missings, weights

  • Packages: xgboost
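The following sketch shows a typical train/predict round trip, assuming mlr3proba ships the "rats" survival task; the factor column is dropped because this learner only supports logical, integer, and numeric features:

```r
library(mlr3proba)

task = tsk("rats")
task$select(c("litter", "rx"))  # drop the factor feature "sex"

learner = lrn("surv.xgboost", nrounds = 50)

# Train on one subset of rows, predict on another.
learner$train(task, row_ids = 1:200)
pred = learner$predict(task, row_ids = 201:300)

head(pred$lp)     # linear predictor
head(pred$crank)  # continuous ranking, derived from the linear predictor
```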

References

Chen T, Guestrin C (2016). “XGBoost: A Scalable Tree Boosting System.” Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining - KDD '16, 785--794. arXiv: 1603.02754, http://arxiv.org/abs/1603.02754.

Super classes

mlr3::Learner -> mlr3proba::LearnerSurv -> LearnerSurvXgboost

Methods

Public methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerSurvXgboost$new()


Method importance()

The importance scores are calculated with xgboost::xgb.importance().

Usage

LearnerSurvXgboost$importance()

Returns

Named numeric().
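A short sketch of retrieving importance scores, assuming the same "rats" task as above; importance() requires a trained model and errors otherwise:

```r
library(mlr3proba)

task = tsk("rats")
task$select(c("litter", "rx"))  # keep only supported feature types

learner = lrn("surv.xgboost", nrounds = 20)
learner$train(task)

# Named numeric vector of per-feature importance scores,
# computed internally via xgboost::xgb.importance().
learner$importance()
```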


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerSurvXgboost$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.