
Calculates the cross-entropy, or logarithmic (log), loss.

The logloss, in the context of probabilistic predictions, is defined as the negative logarithm of the predicted probability density function, \(f\), evaluated at the observation time, \(t\): $$L(f, t) = -\log(f(t))$$

The standard error of the logloss, \(L\), is approximated via $$se(L) = \frac{sd(L)}{\sqrt{N}}$$ where \(N\) is the number of observations in the test set and \(sd\) is the standard deviation.

The IPCW (inverse probability of censoring weighted) log loss is defined by $$L(f, t, \Delta) = -\Delta \log(f(t))/G(t)$$ where \(\Delta\) is the censoring indicator and \(G\) is the Kaplan-Meier estimate of the censoring distribution.
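As an illustration of these formulas, here is a minimal sketch in base R, assuming hypothetical vectors of predicted densities, censoring indicators, and censoring survival probabilities (none of these objects or values come from the package):

# Hypothetical predicted densities f(t_i) at each observed time (assumed values)
f_t   <- c(0.10, 0.25, 0.05, 0.40)
# Censoring indicators: 1 = event observed, 0 = censored (assumed values)
delta <- c(1, 1, 0, 1)
# Kaplan-Meier estimate of the censoring distribution G(t_i) (assumed values)
G_t   <- c(0.95, 0.90, 0.85, 0.80)

# Plain log loss per observation and its mean
loss <- -log(f_t)
mean(loss)

# Approximate standard error: sd(L) / sqrt(N)
sd(loss) / sqrt(length(loss))

# IPCW log loss: censored observations contribute zero,
# uncensored ones are weighted by 1 / G(t)
ipcw_loss <- -delta * log(f_t) / G_t
mean(ipcw_loss)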

Details

If task and train_set are passed to $score then \(G\) is fit on the training data, otherwise on the testing data. The former is likely to reduce any bias caused by calculating parts of the measure on the test data it is evaluating. The training data is automatically used when scoring resamplings.
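A minimal usage sketch of both variants, assuming the rats task and the Kaplan-Meier learner shipped with mlr3proba (the task, learner, and split are illustrative choices, not part of this page):

library(mlr3)
library(mlr3proba)

set.seed(1)
task = tsk("rats")
split = partition(task)
learner = lrn("surv.kaplan")
learner$train(task, row_ids = split$train)
prediction = learner$predict(task, row_ids = split$test)

# G fit on the testing data (default)
prediction$score(msr("surv.logloss"))

# G fit on the training data by passing task and train_set
prediction$score(msr("surv.logloss"), task = task, train_set = split$train)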

Dictionary

This Measure can be instantiated via the dictionary mlr_measures or with the associated sugar function msr():

MeasureSurvLogloss$new()
mlr_measures$get("surv.logloss")
msr("surv.logloss")

Meta Information

  • Type: "surv"

  • Range: \([0, \infty)\)

  • Minimize: TRUE

  • Required prediction: distr

Super classes

mlr3::Measure -> mlr3proba::MeasureSurv -> MeasureSurvLogloss

Methods

Method new()

Creates a new instance of this R6 class.

Usage

MeasureSurvLogloss$new(ERV = FALSE)

Arguments

ERV

(logical(1))
Standardize the measure against a Kaplan-Meier baseline (Explained Residual Variation).
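For instance, a sketch of constructing the standardized variant directly via the constructor shown above:

# Log loss standardized against a Kaplan-Meier baseline (ERV)
m = MeasureSurvLogloss$new(ERV = TRUE)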


Method clone()

The objects of this class are cloneable with this method.

Usage

MeasureSurvLogloss$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.