Set up hyperparameters for LightGBM training.
Usage
setup_LightGBM(
max_nrounds = 1000L,
force_nrounds = NULL,
early_stopping_rounds = 10L,
num_leaves = 8L,
max_depth = -1L,
learning_rate = 0.01,
feature_fraction = 1,
subsample = 1,
subsample_freq = 1L,
lambda_l1 = 0,
lambda_l2 = 0,
max_cat_threshold = 32L,
min_data_per_group = 32L,
linear_tree = FALSE,
ifw = FALSE,
objective = NULL,
device_type = "cpu",
tree_learner = "serial",
force_col_wise = TRUE
)
Arguments
- max_nrounds
Positive integer: Maximum number of boosting rounds.
- force_nrounds
Positive integer: Use exactly this many boosting rounds, disabling the search for the optimal nrounds.
- early_stopping_rounds
Positive integer: Number of consecutive rounds without improvement after which training stops.
- num_leaves
(Tunable) Positive integer: Maximum number of leaves in one tree.
- max_depth
(Tunable) Integer: Maximum depth of trees; -1 means no limit.
- learning_rate
(Tunable) Numeric: Learning rate.
- feature_fraction
(Tunable) Numeric: Fraction of features randomly selected for each tree.
- subsample
(Tunable) Numeric: Fraction of observations randomly sampled for each iteration.
- subsample_freq
(Tunable) Positive integer: Frequency, in boosting iterations, at which subsampling is performed.
- lambda_l1
(Tunable) Numeric: L1 regularization.
- lambda_l2
(Tunable) Numeric: L2 regularization.
- max_cat_threshold
(Tunable) Positive integer: Maximum number of split points considered for categorical features.
- min_data_per_group
(Tunable) Positive integer: Minimum number of observations per categorical group.
- linear_tree
Logical: If TRUE, use linear trees.
- ifw
Logical: If TRUE, use Inverse Frequency Weighting in classification; see the sketch after this list.
- objective
Character: Objective function.
- device_type
Character: "cpu" or "gpu".
- tree_learner
Character: "serial", "feature", "data", or "voting".
- force_col_wise
Logical: If TRUE, force column-wise histogram building (CPU only).
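When ifw = TRUE, observations are weighted inversely to their class frequency, so minority classes contribute more to the loss. A minimal sketch of the idea (illustrative only; not the package's internal implementation):

# Inverse Frequency Weighting, sketched: weight each observation by the
# inverse of its class frequency (illustrative, not the package's code)
y <- factor(c("no", "no", "no", "no", "yes"))
w <- 1 / as.numeric(table(y)[as.character(y)])
w
#> [1] 0.25 0.25 0.25 0.25 1.00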
Details
See lightgbm::lgb.train for more information.
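These settings map onto parameters of lightgbm::lgb.train(). A hedged sketch of roughly how (illustrative only; setup_LightGBM() handles this internally, and the toy mtcars task is an assumption):

library(lightgbm)
# Toy binary task (illustrative): predict mtcars$am from three features
X <- as.matrix(mtcars[, c("mpg", "hp", "wt")])
idx <- 1:24
dtrain <- lgb.Dataset(X[idx, ], label = mtcars$am[idx])
dvalid <- lgb.Dataset.create.valid(dtrain, X[-idx, ], label = mtcars$am[-idx])
params <- list(
  objective = "binary",                      # 'objective' (NULL by default above)
  num_leaves = 8L, max_depth = -1L, learning_rate = 0.01,
  feature_fraction = 1,
  bagging_fraction = 1, bagging_freq = 1L,   # lgb.train's names for subsample*
  lambda_l1 = 0, lambda_l2 = 0,
  min_data_in_leaf = 5L,                     # relaxed for this tiny toy dataset
  device_type = "cpu", tree_learner = "serial", force_col_wise = TRUE
)
fit <- lgb.train(
  params = params, data = dtrain,
  nrounds = 1000L,                           # max_nrounds
  valids = list(valid = dvalid),
  early_stopping_rounds = 10L                # early_stopping_rounds
)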
Examples
lightgbm_hyperparams <- setup_LightGBM(
  max_nrounds = 500L,
  learning_rate = c(0.001, 0.01, 0.05),
  ifw = TRUE
)
lightgbm_hyperparams
#> <LightGBMHyperparameters>
#> hyperparameters:
#> nrounds: <NUL> NULL
#> max_nrounds: <int> 500
#> force_nrounds: <NUL> NULL
#> early_stopping_rounds: <int> 10
#> num_leaves: <int> 8
#> max_depth: <int> -1
#> learning_rate: <nmr> 1e-03, 0.01, 0.05
#> feature_fraction: <nmr> 1.00
#> subsample: <nmr> 1.00
#> subsample_freq: <int> 1
#> lambda_l1: <nmr> 0.00
#> lambda_l2: <nmr> 0.00
#> max_cat_threshold: <int> 32
#> min_data_per_group: <int> 32
#> linear_tree: <lgc> FALSE
#> ifw: <lgc> TRUE
#> objective: <NUL> NULL
#> device_type: <chr> cpu
#> tree_learner: <chr> serial
#> force_col_wise: <lgc> TRUE
#> tunable_hyperparameters: <chr> num_leaves, max_depth, learning_rate, feature_fraction, subsample, subsample_freq, lambda_l1, lambda_l2, max_cat_threshold, min_data_per_group, linear_tree, ifw
#> fixed_hyperparameters: <chr> max_nrounds, force_nrounds, early_stopping_rounds, objective, device_type, tree_learner, force_col_wise
#> tuned: <int> 0
#> resampled: <int> 0
#> n_workers: <int> 1
#>
#> Hyperparameters learning_rate and nrounds need tuning.
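Tunable hyperparameters accept vectors of candidate values to tune over, as learning_rate does above, and force_nrounds skips the search for the number of boosting rounds. A small sketch under those assumptions (the object name is hypothetical):

# Fix the boosting rounds and tune tree size instead (hypothetical example)
fixed_rounds <- setup_LightGBM(
  force_nrounds = 200L,            # use exactly 200 rounds; no nrounds search
  num_leaves = c(8L, 16L, 32L),    # tunable: candidate values to tune over
  max_depth = c(3L, -1L)           # tunable: shallow trees vs. no depth limit
)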