Hello,
I am analysing interpolation techniques and ran into something that seems odd to me (I might be wrong).
I am using 100 random points in Spain where precipitation has been measured. The precipitation does not follow a normal distribution, so I applied a log-transformation, after which it does.
I ran two analyses, one with the non-transformed variable and one with the transformed variable. With ordinary kriging, the precipitation predictions are consistent in both cases. However, the variance differs substantially between the two methods.
Transformed option: the variance ranges between 1 and 1.20 after back-transforming with exp()
Non-transformed option: the variance ranges between 2.5 and 364
I tried this with other examples, including the meuse data, and the range of the variance is always odd when I use a transformation. Here I paste the code I am using to transform the variable and invert the transformation. If you need the full code for a reproducible example, please let me know.
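(The code snippet referred to above did not come through. For context, the comparison described is presumably along the lines of the following sketch, where `precip_pts` and `precip_grid` are hypothetical stand-ins for the Spanish rainfall points and the prediction grid, and the variogram parameters are placeholders rather than the values from the original post:)

```r
library(sp)
library(gstat)

## precip_pts:  hypothetical SpatialPointsDataFrame with a 'precip' column
## precip_grid: hypothetical prediction grid (SpatialPixelsDataFrame)

## (1) ordinary kriging on the raw precipitation
v_raw  <- fit.variogram(variogram(precip ~ 1, precip_pts),
                        vgm(psill = 1, model = "Sph", range = 100000, nugget = 1))
ok_raw <- krige(precip ~ 1, precip_pts, precip_grid, model = v_raw)
summary(ok_raw$var1.var)            # variance on the original scale (the 2.5-364 case)

## (2) ordinary kriging on log(precip), then a naive exp() back-transform
v_log  <- fit.variogram(variogram(log(precip) ~ 1, precip_pts),
                        vgm(psill = 1, model = "Sph", range = 100000, nugget = 1))
ok_log <- krige(log(precip) ~ 1, precip_pts, precip_grid, model = v_log)
pred_bt <- exp(ok_log$var1.pred)    # predictions: close to those from (1)
var_bt  <- exp(ok_log$var1.var)     # ~1-1.2: exp() of a log-scale variance,
                                    # not a variance in the original units
```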
The back-transform is not a simple exp() for the log-normal distribution; see e.g. here. You can look into gstat::krigeTg(): with lambda = 0, the Box-Cox transform is the log-transform.
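(A minimal sketch of that suggestion on the meuse data, assuming the variogram is fitted on the log scale as in the krigeTg() help example and that lambda = 0 selects the log transform; the transform and back-transform are then handled inside krigeTg():)

```r
library(sp)
library(gstat)

data(meuse)
coordinates(meuse) <- ~x + y
data(meuse.grid)
gridded(meuse.grid) <- ~x + y

## variogram fitted to the transformed (log) variable
m <- fit.variogram(variogram(log(zinc) ~ 1, meuse), vgm(1, "Exp", 300))

## trans-Gaussian kriging: Box-Cox with lambda = 0 is the log transform;
## the back-transform is done internally, so the results below are on the
## original (untransformed) scale
tg <- krigeTg(zinc ~ 1, meuse, meuse.grid, m, lambda = 0)

summary(tg$var1TG.pred)   # back-transformed predictions
summary(tg$var1TG.var)    # back-transformed kriging variance
```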
I am not sure I got it. When using krigeTg() with the lambda argument, we apply the Box-Cox transformation. The function handles this internally, so var1TG.pred and var1TG.var are computed using the Box-Cox transformation and then back-transformed to the original scale. Is this right, or am I misinterpreting it?
I am a bit confused by this, because in most of the examples I found using the meuse dataset, people use the following approach (and I was also taught to do this):
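(The snippet referred to here did not come through either; presumably it is the pattern shown in many meuse tutorials, roughly like the following, reusing the meuse setup from the sketch above:)

```r
## commonly shown pattern: ordinary kriging of log(zinc), then a plain exp()
## back-transform of the prediction (and sometimes of the variance as well)
v   <- fit.variogram(variogram(log(zinc) ~ 1, meuse), vgm(1, "Sph", 900, 1))
lzn <- krige(log(zinc) ~ 1, meuse, meuse.grid, model = v)
lzn$pred_bt <- exp(lzn$var1.pred)   # the simple exp() back-transform the reply warns about
```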