Hi, I'm new to this field. I would like to suggest a function that lets a saved model be trained further on a small, new dataset to update its weights, both to accommodate concept/data drift and to obtain a more accurate model when only a small amount of new training data is available. Currently, the fit function resets the model's weights and retrains it from scratch.
I found an example in xgboost, where the train and fit functions take a parameter to continue training from a saved model: https://xgboost.readthedocs.io/en/latest/python/python_api.html#module-xgboost.sklearn. The parameter is documented as: xgb_model (file name of stored xgb model or 'Booster' instance) – Xgb model to be loaded before training (allows training continuation).
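For reference, this is roughly how that continuation works in xgboost. This is only a minimal sketch assuming a recent xgboost version; the dataset, split, and file name below are placeholders, not part of the original report.

```python
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Placeholder data: "old" data for the initial fit, "new" data arriving later.
X, y = make_classification(n_samples=1000, random_state=0)
X_old, X_new, y_old, y_new = train_test_split(X, y, test_size=0.2, random_state=0)

# Initial training on the old data, then save the model.
model = xgb.XGBClassifier(n_estimators=100)
model.fit(X_old, y_old)
model.save_model("model_old.json")  # placeholder file name

# Later: continue training on the new data starting from the saved booster,
# via the xgb_model argument of fit(), instead of retraining from scratch.
model_updated = xgb.XGBClassifier(n_estimators=50)
model_updated.fit(X_new, y_new, xgb_model="model_old.json")
```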
Something like scikit-learn's partial_fit method could also work (though I'm not too confident in how that function works exactly).
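For comparison, here is a small sketch of how scikit-learn's partial_fit is typically used for incremental updates, using SGDClassifier as an example; the random data here is made up purely for illustration.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Placeholder "old" and "new" batches of data.
rng = np.random.RandomState(0)
X_old, y_old = rng.randn(500, 10), rng.randint(0, 2, 500)
X_new, y_new = rng.randn(50, 10), rng.randint(0, 2, 50)

clf = SGDClassifier()
# The first partial_fit call must list all classes that will ever appear.
clf.partial_fit(X_old, y_old, classes=np.array([0, 1]))

# Later, when new data arrives, update the existing weights in place
# instead of refitting from scratch.
clf.partial_fit(X_new, y_new)
```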
Thank you.