Loading a model saved via python does not appear to save the "missing" parameter. #151
Replies: 2 comments
-
I even tried adding -999 / "NaN" / "NAN" / "nan" as the missing value using SetParameter on the booster,
and converting all -999 values to double.NaN. However, I still get the error:
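For what it's worth, the sentinel-to-NaN conversion I tried is conceptually like this (a minimal Python sketch of the idea only; `replace_sentinel` is a hypothetical helper, not part of SharpLearning or XGBoost):

```python
import math

# Hypothetical helper, for illustration only: replace the -999 sentinel
# with NaN so the booster's "missing" handling can treat those cells as absent.
MISSING_SENTINEL = -999.0

def replace_sentinel(features):
    """Return a copy of the feature rows with the sentinel replaced by NaN."""
    return [[float("nan") if value == MISSING_SENTINEL else value
             for value in row]
            for row in features]

rows = [[1.0, -999.0, 3.5],
        [-999.0, 2.0, 0.1]]
converted = replace_sentinel(rows)
print(math.isnan(converted[0][1]))  # True: the sentinel cell is now NaN
```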
-
Hi @cpr-pcurry, SharpLearning just wraps the booster class from XGBoost.Net: https://github.com/PicNet/XGBoost.Net/blob/master/XGBoost/lib/Booster.cs#L35

```csharp
public Booster(string fileName, int silent = 1)
{
    IntPtr tempPtr;
    var newBooster = XGBOOST_NATIVE_METHODS.XGBoosterCreate(null, 0, out tempPtr);
    var output = XGBOOST_NATIVE_METHODS.XGBoosterLoadModel(tempPtr, fileName);
    if (output == -1) throw new DllFailException(XGBOOST_NATIVE_METHODS.XGBGetLastError());
    handle = tempPtr;
}
```

To test whether the original issue (different results when missing values are included) is caused by the conversions SharpLearning performs when passing the feature vector to the model, you can try the example above using XGBoost.Net directly:

```csharp
// Parse the feature vector (as a jagged array).
var floatObservation = new float[][] { /* ... */ };

using (var data = new DMatrix(floatObservation))
{
    // For classification, this contains the probabilities for each class.
    var prediction = m_model.Predict(data);
}
```

Hope this can help you in debugging the issue.
-
I have an XGBoost model that I saved in Python (using xgboost == 0.8), with the parameter "missing" = -999.
However, when I load the model in SharpLearning and run a regression test on the training features, I see an 8% decrease in performance. This goes away when I remove the missing values (-999) from the training data.
This suggests to me that the "missing" parameter is not being loaded correctly by SharpLearning. Is the "missing" parameter currently supported?
Thanks for your help!