couldn't get the same results with FLAML when using the original ML model
kurtsenol opened this issue · 1 comment
kurtsenol commented
I couldn't reproduce FLAML's result when I trained the original LightGBM model with the configuration that FLAML found.
Below is my code for FLAML and the result:
```python
from flaml import AutoML

automl = AutoML()
automl.fit(
    X.values,
    y.values,
    task="regression",
    estimator_list=["lgbm"],
    eval_method="cv",
    metric="mape",
    time_budget=100,
)
print("best estimator: ", automl.best_estimator)
print("best_loss: ", automl.best_loss)
print("best config: ", automl.best_config)
```
```
best estimator: lgbm
best_loss: 0.027535881847679154
best config: {'n_estimators': 309, 'num_leaves': 12, 'min_child_samples': 2, 'learning_rate': 0.041140404681460824, 'colsample_bytree': 0.6137508653610831, 'reg_alpha': 0.5906654749123738, 'reg_lambda': 3.8530166960799934}
```
Below is my code for the original LightGBM model, using the parameters obtained from FLAML:
```python
import lightgbm as lgb
from sklearn.model_selection import cross_val_score

lgb_model = lgb.LGBMRegressor(
    n_estimators=309,
    num_leaves=12,
    min_child_samples=2,
    learning_rate=0.041140404681460824,
    colsample_bytree=0.6137508653610831,
    reg_alpha=0.5906654749123738,
    reg_lambda=3.8530166960799934,
)
print(
    "neg_mean_absolute_percentage_error:",
    cross_val_score(
        lgb_model, X, y.values.ravel(),
        scoring="neg_mean_absolute_percentage_error",
    ).mean(),
)
```
```
neg_mean_absolute_percentage_error: -0.03579415775688318
```
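As an aside, a less error-prone way to rebuild the model is to clone the estimator FLAML actually trained rather than retyping the config. This is only a sketch, assuming `automl.model.estimator` exposes the fitted `LGBMRegressor` as described in FLAML's docs; even then, scores can differ because sklearn's default CV splits and FLAML's internal preprocessing need not match:

```python
from sklearn.base import clone
from sklearn.model_selection import cross_val_score

# Clone the exact estimator FLAML trained (an unfitted copy with the
# same hyperparameters), instead of transcribing best_config by hand.
lgb_model = clone(automl.model.estimator)
score = cross_val_score(
    lgb_model,
    X.values,
    y.values.ravel(),
    scoring="neg_mean_absolute_percentage_error",
).mean()
print("neg_mean_absolute_percentage_error:", score)
```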
thinkall commented
Hi @kurtsenol, the reason could be the same as discussed in #1054 and #1287.
Could you try setting `skip_transform` to `True` in the `automl.fit` function?
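For example, a minimal sketch of the retry (assuming a FLAML version whose `AutoML.fit` accepts `skip_transform`):

```python
from flaml import AutoML

automl = AutoML()
automl.fit(
    X.values,
    y.values,
    task="regression",
    estimator_list=["lgbm"],
    eval_method="cv",
    metric="mape",
    time_budget=100,
    skip_transform=True,  # skip FLAML's internal data preprocessing
)
```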