Set up hyperparameters for LightRF training.
Usage
setup_LightRF(
nrounds = 500L,
num_leaves = 4096L,
max_depth = -1L,
feature_fraction = 0.7,
subsample = 0.623,
lambda_l1 = 0,
lambda_l2 = 0,
max_cat_threshold = 32L,
min_data_per_group = 32L,
linear_tree = FALSE,
ifw = FALSE,
objective = NULL,
device_type = "cpu",
tree_learner = "serial",
force_col_wise = TRUE,
num_threads = 0L
)
Arguments
- nrounds
(Tunable) Positive integer: Number of boosting rounds.
- num_leaves
(Tunable) Positive integer: Maximum number of leaves in one tree.
- max_depth
(Tunable) Integer: Maximum depth of trees; a value < 0 means no depth limit.
- feature_fraction
(Tunable) Numeric: Fraction of features to sample for each tree.
- subsample
(Tunable) Numeric: Fraction of the training data to sample (bag) for each tree.
- lambda_l1
(Tunable) Numeric: L1 regularization.
- lambda_l2
(Tunable) Numeric: L2 regularization.
- max_cat_threshold
(Tunable) Positive integer: Maximum number of categories for categorical features.
- min_data_per_group
(Tunable) Positive integer: Minimum number of data per categorical group.
- linear_tree
Logical: If TRUE, use linear trees.
- ifw
Logical: If TRUE, apply Inverse Frequency Weighting to the class labels in classification tasks.
- objective
Character: Objective function.
- device_type
Character: "cpu" or "gpu".
- tree_learner
Character: "serial", "feature", "data", or "voting".
- force_col_wise
Logical: Use only with CPU. If TRUE, force column-wise histogram building.
- num_threads
Integer: Number of threads to use. 0 means default number of threads in OpenMP.
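A minimal call might look like the following. This is a usage sketch only: the non-default values chosen here are illustrative, and the return value is assumed to be a parameter object that the package's training routine consumes.

```r
# Hypothetical usage sketch: configure a smaller forest for a modest dataset.
# The specific values below are illustrative, not recommendations.
params <- setup_LightRF(
  nrounds = 500L,          # number of boosting rounds
  num_leaves = 255L,       # smaller trees than the 4096-leaf default
  feature_fraction = 0.5,  # sample half of the features per tree
  subsample = 0.632,       # bootstrap-like bagging fraction
  ifw = TRUE               # reweight classes by inverse frequency
)
```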
Details
See lightgbm::lgb.train for more information.
Note that the hyperparameters subsample_freq and early_stopping_rounds are fixed
and cannot be set here, because their fixed values are what make lightgbm
train a random forest rather than a gradient-boosted ensemble.
Both can be set freely when training gradient boosting with LightGBM.
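For reference, a roughly equivalent configuration built directly on lightgbm would pin the random-forest-specific parameters along the following lines. This is a sketch based on the public LightGBM parameter documentation (where subsample is an alias of bagging_fraction and subsample_freq of bagging_freq), not the package's actual internals:

```r
library(lightgbm)

# Sketch of an equivalent random-forest setup, using the defaults shown in
# Usage above. boosting = "rf" requires bagging to be active, which is why
# bagging_freq (alias: subsample_freq) must stay fixed rather than tunable.
params <- list(
  boosting = "rf",           # random forest mode instead of gradient boosting
  bagging_fraction = 0.623,  # 'subsample' above
  bagging_freq = 1L,         # fixed: rebag on every iteration
  feature_fraction = 0.7,
  num_leaves = 4096L,
  max_depth = -1L,
  lambda_l1 = 0,
  lambda_l2 = 0,
  num_threads = 0L
)
# dtrain <- lgb.Dataset(X, label = y)
# model  <- lgb.train(params = params, data = dtrain, nrounds = 500L)
```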