You have two options:

1. Run predictions using tiling (https://github.com/Envirometrix/BigSpatialDataR#dem-analysis-using-tiling-and-parallelization)

2. Buy more RAM.

I suggest option 1, since option 2 has no end point: as your data grow, no amount of RAM is ever enough. A minimal sketch of the tiling approach is below.
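For illustration only, here is a rough sketch of option 1 reusing the object names from your message below (rk_rf_ac, covar_finales_sp); the tile size and core count are assumptions you will need to tune to your grid and hardware:

library(GSIF)
library(sp)
library(parallel)

## derive tile bounding boxes covering the prediction grid
## (block.x is in map units -- adjust so one tile fits in RAM)
tiles <- getSpatialTiles(covar_finales_sp, block.x = 2e4,
                         return.SpatialPolygons = FALSE)

predict_tile <- function(i) {
  ## keep only the pixels that fall inside the i-th tile
  xy <- coordinates(covar_finales_sp)
  sel <- xy[,1] >= tiles$xl[i] & xy[,1] < tiles$xu[i] &
         xy[,2] >= tiles$yl[i] & xy[,2] < tiles$yu[i]
  GSIF::predict(rk_rf_ac, covar_finales_sp[sel,], predict.method = "KED")
}

## each tile needs only a fraction of the 19.4 Gb, and tiles can
## run in parallel (mc.cores depends on your machine)
pred_tiles <- mclapply(1:nrow(tiles), predict_tile, mc.cores = 4)

The tutorial linked under option 1 works through this in more detail, including writing each tile to disk so that the full prediction never has to sit in memory at once.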

PS: I am working on a new package (https://github.com/Envirometrix/landmap/) that should give users more flexibility and may eventually tile large objects by default.


On 9/13/19 5:38 PM, Manuel Spínola wrote:
Dear list members,

I am fitting a model with the GSIF package, but I ran into a vector
allocation error. Is there any way to solve this problem? See the code and
error message below.

I am using:
R 3.6.1
GSIF 0.5-5
Mac with 16 GB of RAM

> rk_rf_ac <- fit.gstatModel(variables_todos_sp["ac"], ac_formulaString_correlacion,
+     covar_finales_sp, method = "quantregForest")
Fitting a Quantile Regression Forest model...
Shapiro-Wilk normality test and Anderson-Darling normality test report
probability of < .05 indicating lack of normal distribution for residuals
Fitting a 2D variogram...
Saving an object of class 'gstatModel'...
> rk_rf_ac_pred <- GSIF::predict(rk_rf_ac, covar_finales_sp, predict.method = "KED")
Error: cannot allocate vector of size 19.4 Gb

Thank you very much,

Manuel


