clarification needed for model saving/loading #10111
Comments
It should be the right way. I don't know why there's an error (assuming you have trained the model before serialization). Will look into it; is there an easy way that I can reproduce?
Sorry, I seem to have gotten myself confused somehow... I thought I had reproduced the problem, and indeed the error above was real, but today I went to reproduce it to report back here and I can't. At this point I'm not sure if this is a mistake on my end or an intermittent (or hardware-dependent) issue in …
@ExpandingMan Any updates on this issue?
Feel free to reopen if there's a way to reproduce.
Sorry for not responding, yeah, never ran into this again, don't know what I did.
A model saved via `XGBoosterSaveModelToBuffer` can be loaded by creating a booster object with the parameter `model_buffer`; however, attempting to load this model into a booster object using `XGBoosterLoadModelFromBuffer` gives an error. This may be a bug in `XGBoosterLoadModelFromBuffer`, but it seems more likely that I just don't understand how to use that function properly. This comment in the Python wrapper would seem to suggest that this is not the intended use of `XGBoosterLoadModelFromBuffer`. At the very least, I find this very confusing, since `XGBoosterSaveModelToBuffer` and `XGBoosterLoadModelFromBuffer` certainly sound like they are inverses of each other. I think some clarification in the docs would be useful (I didn't see any, but I might be missing it).

For context, I'm one of the maintainers of the Julia wrapper, and confusion over serialization methods has popped up a number of times, most recently here.