A convenience wrapper to evaluate a data frame of prognostic predictions. This function is designed to evaluate the output of apply_pro(), computing time-dependent AUROC at each year supplied in years_to_evaluate.
Usage
evaluate_predictions_pro(prediction_df, years_to_evaluate = c(1, 3, 5))
Value
A list of evaluation metrics, including the concordance index (C_index), time-dependent AUROC values for each evaluated year (AUROC_Years) together with their average (AUROC_Average), and Kaplan-Meier analysis results (KM_HR, KM_P_value, KM_Cutoff).
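Individual metrics can be extracted from the list by name. A minimal sketch, assuming the list structure shown in the example output below:

metrics <- evaluate_predictions_pro(prediction_df, years_to_evaluate = c(1, 3))
metrics$C_index              # concordance index
metrics$AUROC_Years[["1"]]   # time-dependent AUROC at year 1
metrics$KM_P_value           # p-value from the Kaplan-Meier analysis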
Examples
# \donttest{
# Assume the 'train_pro' and 'test_pro' datasets are available
if (requireNamespace("E2E", quietly = TRUE) &&
"train_pro" %in% utils::data(package = "E2E")$results[,3] &&
"test_pro" %in% utils::data(package = "E2E")$results[,3]) {
data(train_pro, package = "E2E")
data(test_pro, package = "E2E")
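# Initialize the prognosis modeling system and train a LASSO model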
initialize_modeling_system_pro()
model_results <- models_pro(data = train_pro, model = "lasso_pro")
# 1. Get predictions on new data
predictions <- apply_pro(model_results$lasso_pro$model_object, test_pro)
# 2. Evaluate these predictions using the simplified function
evaluation_metrics <- evaluate_predictions_pro(predictions, years_to_evaluate = c(1, 3))
print(evaluation_metrics)
}
#> Prognosis modeling system already initialized.
#> Running model: lasso_pro
#> Applying model on new data...
#> $C_index
#> [1] 0.6639451
#>
#> $AUROC_Years
#> $AUROC_Years$`1`
#> [1] 0.6548161
#>
#> $AUROC_Years$`3`
#> [1] 0.6039198
#>
#>
#> $AUROC_Average
#> [1] 0.629368
#>
#> $KM_HR
#> [1] 1.944295
#>
#> $KM_P_value
#> [1] 0.03222287
#>
#> $KM_Cutoff
#> [1] 0.449566
#>
# }
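The returned Kaplan-Meier cutoff can also be reused to dichotomize new patients into risk groups. A minimal sketch, assuming the prediction data frame carries a numeric risk-score column (the 'risk_score' column name here is hypothetical):

# Assign risk groups by thresholding at the reported KM cutoff;
# 'risk_score' is a hypothetical name for the predicted-risk column
risk_group <- ifelse(predictions$risk_score > evaluation_metrics$KM_Cutoff,
                     "high_risk", "low_risk")
table(risk_group)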