Dear Vlad and Loic,
Some more info on the issue: the error ("Error: cannot allocate vector of
size 1.7 Gb") shows random behaviour; sometimes things work (e.g., after
a reboot) and sometimes not.
Anyway, here is a reproducible example:
theTest=rast(nrows=5000, ncols=5000, nlyrs=18, xmin=0,
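Whatever the truncated extent arguments were, the dimensions alone account for the failure. A back-of-the-envelope calculation in plain R (no terra needed; the 8 bytes assumes double-precision values, which is what R uses once cell values are loaded):

```r
# Memory needed for the values of a 5000 x 5000 raster with 18 layers,
# assuming 8-byte doubles:
cells  <- 5000 * 5000   # cells per layer
layers <- 18
gib <- function(bytes) bytes / 1024^3

gib(cells * layers * 8)        # raster values alone: ~3.35 GiB
gib(cells * (layers + 2) * 8)  # as.data.frame(r, xy = TRUE) adds x and y: ~3.73 GiB
```

So the final data frame alone is close to 4 GiB, and the 1.7 Gb allocation in the error message is just one intermediate copy along the way.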
Thank you Vlad!
This is a good solution; I will keep it for the future.
My issue is that I would like to understand why things don't work on the
best-performing PC.
I'll do more experiments, to isolate the issue.
Sebastiano
On Mon, 23 Jan 2023 at 18:48, Vlad Amihaesei wrote:
You can try converting each layer to a data frame instead of all the layers
at once. You can do this using a loop.
r <- terra::rast("file.nc")
df <- NULL
for (i in 1:terra::nlyr(r)) {
  r1 <- r[[i]]                           # extract one layer
  r1.df <- terra::as.data.frame(r1, xy = TRUE)  # convert it with coordinates
  df <- rbind(df, r1.df)                 # append to the result
}
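A caveat with growing df via rbind() inside the loop: the accumulated data frame is copied on every iteration, which can itself exhaust memory, and the rbind() will fail if the layers have different names. A sketch of the same per-layer idea that normalises names and binds once at the end (the small in-memory raster is a self-contained stand-in; replace it with rast("file.nc") for real data):

```r
library(terra)

# Small stand-in raster (3 layers) so the sketch runs on its own;
# use rast("file.nc") for the actual file.
r <- rast(nrows = 10, ncols = 10, nlyrs = 3, vals = runif(300))

# Convert one layer at a time, collect the pieces in a list,
# and bind once at the end instead of copying a growing data frame.
pieces <- vector("list", nlyr(r))
for (i in seq_len(nlyr(r))) {
  d <- as.data.frame(r[[i]], xy = TRUE)
  names(d) <- c("x", "y", "value")  # normalise so the pieces rbind cleanly
  d$layer <- names(r)[i]            # keep track of which layer each row came from
  pieces[[i]] <- d
}
df <- do.call(rbind, pieces)
```

This keeps the peak memory use at one layer's data frame plus the final result, rather than repeatedly duplicating the partial result.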
Best regards
Vlad Alexandru AMIHAESEI
Hi Loïc,
thank you for your suggestion! This was one of the possibilities I checked.
However, I tried using terraOptions() to change the memfrac parameter from
0.6 to 0.1, so as to force writing to disk, but things didn't work.
It seems like it is working as a 32-bit system, but the R
Hi Sebastiano,
Try comparing terra::terraOptions() and terra::mem_info(r) on both systems.
Just guessing here, but it could be that terra is trying to do everything in
memory and failing on your 16 GB system, while taking a memory-friendly route on
your 8 GB system.
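For reference, a diagnostic snippet along those lines, to be run on both machines (the raster here is a stand-in with the dimensions from the example; substitute the real object):

```r
library(terra)

# Stand-in for the raster that triggers the error.
r <- rast(nrows = 5000, ncols = 5000, nlyrs = 18)

terraOptions()  # current settings: memfrac, memmax, tempdir, ...
mem_info(r)     # memory terra estimates it needs vs. what is available
free_RAM()      # RAM available as terra sees it
```

Differences in memfrac, memmax, or the reported available RAM between the two machines would explain terra choosing different processing routes.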
Cheers,
Loïc
Dear list members,
When converting a raster like the one below to a data frame, I have a
memory-allocation issue only when I use a specific PC, and in particular the
one with more memory. To be more precise, I am using the same versions of
R (4.2.2) and terra (1.6.47) on both machines, by means of as.data.frame()