Set up hyperparameters for GLMNET training.
Usage
setup_GLMNET(
alpha = 1,
family = NULL,
offset = NULL,
which_lambda_cv = "lambda.1se",
nlambda = 100L,
lambda = NULL,
penalty_factor = NULL,
standardize = TRUE,
intercept = TRUE,
ifw = TRUE
)
Arguments
- alpha
(Tunable) Numeric in [0, 1]: Elastic net mixing parameter; 1 gives the lasso penalty, 0 gives ridge.
- family
Character: Family for GLMNET, e.g. "gaussian" or "binomial".
- offset
Numeric: Offset for GLMNET.
- which_lambda_cv
Character: Which lambda to use for prediction: "lambda.1se" (default) or "lambda.min".
- nlambda
Positive integer: Number of lambda values.
- lambda
Numeric vector: Lambda values; if NULL, the sequence is computed automatically.
- penalty_factor
Numeric vector: Separate penalty factor applied to each feature; a factor of 0 means no shrinkage for that feature.
- standardize
Logical: If TRUE, standardize features.
- intercept
Logical: If TRUE, include intercept.
- ifw
Logical: If TRUE, use Inverse Frequency Weighting in classification.
Details
See glmnet::glmnet for more information.
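As a hedged sketch (not this package's internal code), the settings above correspond roughly to arguments of glmnet::cv.glmnet, with which_lambda_cv choosing the lambda used at prediction time:

```r
library(glmnet)

# Illustrative data; x must be a matrix for glmnet
x <- as.matrix(mtcars[, -1])
y <- mtcars$mpg

# alpha, nlambda, standardize, and intercept map directly to glmnet arguments
mod <- cv.glmnet(
  x, y,
  alpha = 1,
  nlambda = 100L,
  standardize = TRUE,
  intercept = TRUE
)

# which_lambda_cv = "lambda.1se" corresponds to selecting this lambda
coef(mod, s = "lambda.1se")
```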
Examples
glm_hyperparams <- setup_GLMNET(alpha = 1, ifw = TRUE)
glm_hyperparams
#> <GLMNETHyperparameters>
#> hyperparameters:
#> alpha: <nmr> 1.00
#> family: <NUL> NULL
#> offset: <NUL> NULL
#> which_lambda_cv: <chr> lambda.1se
#> nlambda: <int> 100
#> lambda: <NUL> NULL
#> penalty_factor: <NUL> NULL
#> standardize: <lgc> TRUE
#> intercept: <lgc> TRUE
#> ifw: <lgc> TRUE
#> tunable_hyperparameters: <chr> alpha, ifw
#> fixed_hyperparameters: <chr> family, offset, which_lambda_cv, nlambda, penalty_factor, standardize, intercept
#> tuned: <int> 0
#> resampled: <int> 0
#> n_workers: <int> 1
#>
#> Hyperparameter lambda needs tuning.
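A second minimal sketch, using only the arguments documented above: a ridge-penalized configuration that predicts with the lambda minimizing CV error rather than the 1-SE rule.

```r
# Ridge penalty (alpha = 0); predict with the CV-error-minimizing lambda
ridge_hyperparams <- setup_GLMNET(
  alpha = 0,
  which_lambda_cv = "lambda.min"
)
ridge_hyperparams
```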