Trains a Gradient Boosting Machine (GBM) model for binary classification using caret::train.

Usage

gbm_dia(X, y, tune = FALSE, cv_folds = 5)

Arguments

X

A data frame of features.

y

A factor vector of class labels.

tune

Logical. If TRUE, performs hyperparameter tuning over interaction.depth, n.trees, and shrinkage; if FALSE, uses fixed values for these parameters.

cv_folds

An integer, the number of cross-validation folds for caret.
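
When tune = TRUE, a wrapper like this typically hands caret::train a grid of candidate hyperparameter combinations. The sketch below builds such a grid with base R's expand.grid; the specific candidate values are illustrative assumptions, not the grid gbm_dia actually uses.

```r
# Hypothetical tuning grid of the kind tune = TRUE might search over.
# These candidate values are assumptions for illustration only.
tune_grid <- expand.grid(
  interaction.depth = c(1, 3, 5),     # tree depth
  n.trees           = c(50, 100, 150), # boosting iterations
  shrinkage         = c(0.01, 0.1),    # learning rate
  n.minobsinnode    = 10               # held fixed, as in the example output
)
nrow(tune_grid)  # 3 * 3 * 2 * 1 = 18 candidate combinations
```

Cross-validation (cv_folds folds) is then used to score each row of the grid, and the combination with the best resampled ROC is refit on the full data.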

Value

A caret::train object representing the trained GBM model.

Examples

# \donttest{
set.seed(42)
n_obs <- 200
X_toy <- data.frame(
  FeatureA = rnorm(n_obs),
  FeatureB = runif(n_obs, 0, 100)
)
y_toy <- factor(sample(c("Control", "Case"), n_obs, replace = TRUE),
                levels = c("Control", "Case"))

# Train the model
gbm_model <- gbm_dia(X_toy, y_toy)
print(gbm_model)
#> Stochastic Gradient Boosting 
#> 
#> 200 samples
#>   2 predictor
#>   2 classes: 'Control', 'Case' 
#> 
#> No pre-processing
#> Resampling: Cross-Validated (5 fold) 
#> Summary of sample sizes: 161, 159, 160, 160, 160 
#> Resampling results:
#> 
#>   ROC        Sens       Spec     
#>   0.4855489  0.4684211  0.5104762
#> 
#> Tuning parameter 'n.trees' was held constant at a value of 100
#> 
#> Tuning parameter 'shrinkage' was held constant at a value of 0.1
#> 
#> Tuning parameter 'n.minobsinnode' was held constant at a value of 10
# }
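
Because the return value is a caret::train object, the standard caret predict method applies. A short follow-on sketch, reusing gbm_model and X_toy from the example above:

```r
# \donttest{
# Class-probability predictions (one column per class)
probs <- predict(gbm_model, newdata = X_toy, type = "prob")
head(probs)

# Hard class labels, cross-tabulated against the true labels
preds <- predict(gbm_model, newdata = X_toy)
table(preds, y_toy)
# }
```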