Extracts aggregated performance metrics from a model evaluated with rf_evaluate().

Usage

get_evaluation(model)

Arguments

model

A model of class "rf_evaluate", as returned by rf_evaluate().

Value

A data frame of aggregated evaluation metrics with the following columns:

  • model: Model type - "Full" (original model), "Training" (trained on training folds), or "Testing" (performance on testing folds, representing generalization ability).

  • metric: Metric name - "rmse", "nrmse", "r.squared", or "pseudo.r.squared".

  • mean, sd, min, max: Summary statistics across cross-validation repetitions.

Details

This function returns aggregated statistics across all cross-validation repetitions. The "Testing" model metrics indicate the model's ability to generalize to unseen spatial locations.
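Because the returned data frame is in long format (one row per model/metric pair), the "Testing" rows can be pulled out with base R subsetting. A minimal sketch using a mock data frame shaped like the columns documented above (the numbers are illustrative placeholders, not real spatialRF results):

```r
# Mock data frame with the columns documented under "Value";
# values are illustrative placeholders, not real model output.
eval_metrics <- data.frame(
  model  = c("Full", "Training", "Testing"),
  metric = "r.squared",
  mean   = c(0.90, 0.88, 0.75),
  sd     = c(0.00, 0.02, 0.05),
  min    = c(0.90, 0.85, 0.70),
  max    = c(0.90, 0.91, 0.82)
)

# The "Testing" rows summarize generalization to unseen spatial locations
testing_metrics <- subset(eval_metrics, model == "Testing")
testing_metrics
```

The same `subset()` call works on the real output of get_evaluation(), since it carries the same column names.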

Examples


if (interactive()) {

data(plants_rf, plants_xy)

# Evaluate model with spatial cross-validation
m_evaluated <- rf_evaluate(
  model = plants_rf,
  xy = plants_xy,
  repetitions = 5,
  n.cores = 1
)

# Extract evaluation metrics
eval_metrics <- get_evaluation(m_evaluated)
eval_metrics

}