Hyperparameter optimization for kriging model #388
As per this paper: "Since there is no analytical solution for estimating the hyperparameters θ, it is necessary to …"
Try `p = 1.99 * ones(dim)` instead of `p = 2 * ones(dim)`. This will give better numerical stability (your covariance matrix is singular here, so its determinant is zero, and that is why the logpdf is Inf).
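For reference, a minimal sketch of how that suggestion could be applied when constructing the surrogate. The `p` keyword and the `sample`/`SobolSample` calls follow common Surrogates.jl usage, but the bounds and response below are placeholders, so check the v6.3.0 docs for the exact constructor signature:

```julia
using Surrogates

dim = 24
lb, ub = zeros(dim), ones(dim)            # placeholder bounds

x = sample(500, lb, ub, SobolSample())    # 500 design points (vector of 24-tuples)
f(x) = sum(x)                             # placeholder for the real response
y = f.(x)

krig = Kriging(x, y, lb, ub; p = 1.99 * ones(dim))
```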
@archermarx why?
When p = 2.0, the correlation function is nearly flat for points that are close together, which causes ill-conditioning in the correlation matrix. If p is just slightly less than 2.0, the correlation function has a small slope, so even nearby points are not treated as effectively the same. An update I'm working on will change the default from 2.0 to 1.99.
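As a rough stand-alone illustration of that explanation (not Surrogates.jl internals), one can build the power-exponential correlation matrix R[i,j] = exp(-θ|xᵢ - xⱼ|^p) for densely sampled points and compare its condition number at p = 2.0 and p = 1.99; the points, θ, and problem size here are arbitrary:

```julia
using LinearAlgebra

# power-exponential correlation between two scalar locations
corr(xi, xj, θ, p) = exp(-θ * abs(xi - xj)^p)

xs = range(0.0, 1.0, length = 100)    # densely sampled 1-D locations
θ  = 10.0

R2   = [corr(xi, xj, θ, 2.0)  for xi in xs, xj in xs]
R199 = [corr(xi, xj, θ, 1.99) for xi in xs, xj in xs]

# compare how close each correlation matrix is to singular
cond(R2), cond(R199)
```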
I have tried `p = 1.99 * ones(dim)`. The model still overfits and there is almost no improvement: the coefficient of determination (R²) is 1 on the training set but only 0.00186 on the test set.
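A hedged sketch of that train/test R² check, reusing the hypothetical `krig`, `x`, `y` names from the earlier snippet and assuming held-out `x_test`, `y_test` arrays:

```julia
using Statistics: mean

# coefficient of determination: R² = 1 - SS_res / SS_tot
r_squared(y_true, y_pred) = 1 - sum(abs2, y_true .- y_pred) / sum(abs2, y_true .- mean(y_true))

y_pred_train = krig.(x)            # Surrogates.jl surrogates are callable on sample points
y_pred_test  = krig.(x_test)

r_squared(y, y_pred_train)         # reported as 1 in this thread
r_squared(y_test, y_pred_test)     # reported as 0.00186 in this thread
```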
Does your loss function still produce Inf? |
Hello,
I am using Surrogates.jl v6.3.0 to build a Kriging surrogate for a dataset of 500 samples, with 24 input variables and 1 output. The accuracy is terrible: the model overfits severely.
Following #368, I calculated the loss function and obtained Inf! I don't know how to improve my model, as p and theta are already set to the suggested values.
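For context, here is a hedged, self-contained sketch of the kind of Gaussian-process loss that the discussion around #368 points to: a concentrated negative log marginal likelihood with the power-exponential correlation. This is not the Surrogates.jl internal code; the small `nugget` added to the diagonal is one common way to keep R numerically positive definite, so that `logdet` stays finite and the loss does not come out as Inf:

```julia
using LinearAlgebra

# X is an n × d matrix of inputs, y a length-n vector of outputs,
# theta and p are length-d hyperparameter vectors.
function neg_log_likelihood(X, y, theta, p; nugget = 1e-8)
    n = size(X, 1)
    R = [exp(-sum(theta .* abs.(X[i, :] .- X[j, :]) .^ p)) for i in 1:n, j in 1:n]
    R += nugget * I                      # tiny jitter keeps R positive definite
    C = cholesky(Symmetric(R))           # throws if R is still numerically singular
    α = C \ y
    σ² = dot(y, α) / n                   # profiled-out process variance
    return 0.5 * (n * log(σ²) + logdet(C) + n)
end

# Hypothetical call with the thread's settings (500 samples, 24 inputs):
# neg_log_likelihood(X, y, theta, 1.99 * ones(24))
```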