Dear all,
I have a general question regarding the implementation of
randomness in RandomFields and sequential simulation.
(1) Is there a difference between running
krige(nsim = 5, )
and running
krige(nsim = 1, )
five times?
(2) Does repeating the simulation with the s
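I can't speak for krige() internals, but the RNG-stream part of question (1) can be checked in base R, without gstat: a single call drawing five values and five calls drawing one value each agree only when the RNG state is carried across the calls, which is what krige(nsim = 5) does internally. A minimal sketch (base R only; the seed value 42 is arbitrary):

```r
# One call drawing 5 values vs. five calls drawing 1 value each:
# identical results only if the RNG state is carried across the calls.
set.seed(42)
one_call <- rnorm(5)                              # one call, 5 draws

set.seed(42)
five_calls <- sapply(1:5, function(i) rnorm(1))   # 5 calls, state carried over

identical(one_call, five_calls)                   # TRUE: same stream, same draws

# Resetting the seed before each call breaks the equivalence:
reseeded <- sapply(1:5, function(i) { set.seed(42); rnorm(1) })
length(unique(reseeded))                          # 1: every draw repeats the first
```

So the two usages coincide only if nothing resets or re-initialises the RNG between the five single-simulation calls.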
On Thu, 16 Jul 2009, Paulo E. Cardoso wrote:
I think I got it working, although without truly understanding why some aspects
seem to be mandatory, such as assigning NAs into the SGDF to get labcon()
working.
In labcon, you see the lines:
x[!is.na(x)] <- 1
x[is.na(x)] <- 0
which will just set every non-NA cell to 1 and every NA cell to 0.
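What those two lines do can be seen on a small stand-in matrix (the toy `x` below is made up; inside labcon the real input is the image band):

```r
# A small matrix with some missing cells, standing in for the image band:
x <- matrix(c(1.5, NA, 2.0, NA, NA, 3.1), nrow = 2)

x[!is.na(x)] <- 1   # every observed cell becomes 1
x[is.na(x)] <- 0    # every missing cell becomes 0

x                   # now a 0/1 mask; labelling of connected components
                    # then runs on the cells equal to 1
```

This is why the SGDF needs NAs in it: without them the whole grid collapses to a single region of 1s, and only one level comes back.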
I think I got it working, although without truly understanding why some aspects
seem to be mandatory, such as assigning NAs into the SGDF to get labcon()
working.
grelha <- readGDAL(paste("A:\\", i, sep = ""))
gr.topo <- slot(grelha, "grid")
#! GridTopology
codcores <- SGDF2PCT(grelha)$idx
On Thu, Jul 16, 2009 at 5:44 AM, Roger Bivand wrote:
> On Wed, 15 Jul 2009, Steve Hong wrote:
>
>> First I apologize to all of you for the annoying messages. Since I did not
>> receive the mail I sent, I thought there might be some errors.
>>
>
> OK. Note that there may be latency issues - occasional
Roger,
I'm not able to get it working, even with a very simple image.
The attached file is a BMP image; the type of data I'll need to work with.
labcon is returning a single level when it should return 3.
Paulo E. Cardoso
> -Original Message-
> From: Roger Bivand [mailto:roger.biv...@nhh.no]
On Wed, 15 Jul 2009, Steve Hong wrote:
First I apologize to all of you for the annoying messages. Since I did not receive
the mail I sent, I thought there might be some errors.
OK. Note that there may be latency issues - occasionally, it takes much
longer for the mail servers to process submitted posts.
I cannot reproduce Emmanuel's bug with my configuration: I loaded a
216,149-observation matrix into a geodata object (using geoR) and
also into a gstat object.
But as Tomislav noticed, this is pretty unusable, and a simple variogram
plotting takes a long time. I do subsample this dataset.
For the
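The subsampling workaround mentioned above can be sketched in base R; the column names and sizes here are made up for illustration, and the subset, not the full table, would then go into gstat::variogram() or geoR::variog():

```r
# Simulated stand-in for a large point dataset (~216,149 rows, as in the thread):
set.seed(1)
n   <- 216149
pts <- data.frame(x = runif(n), y = runif(n), depth = rnorm(n))

# Random subsample for variogram estimation; a few thousand points is
# usually plenty, and cuts the O(n^2) pair computations dramatically.
sub <- pts[sample(nrow(pts), 5000), ]

nrow(sub)   # 5000
```

Variogram estimation compares all point pairs, so halving the point count quarters the work; that is why the full dataset feels unusable while a subsample is fast.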
Dear Emmanuel,
I have the same problem. I either cannot process large data sets in R, or I
cannot even load such data into R. Then, if I want to do any geostatistics, it
takes forever. R (gstat/geoR) is simply not as efficient with large spatial
data as, e.g., GIS software.
What you
Pierre Roudier wrote:
Hi Emmanuel,
This is surprising, as I am currently working on a dataset with
215,000+ locations. I managed to create an sp object
(SpatialPointsDataFrame) seamlessly. I can also use it in gstat (while
time-consuming) for analyses like variograms.
Do you have any error message?
Hi Emmanuel,
This is surprising, as I am currently working on a dataset with
215,000+ locations. I managed to create an sp object
(SpatialPointsDataFrame) seamlessly. I can also use it in gstat (while
time-consuming) for analyses like variograms.
Do you have any error message?
Pierre
--
Dr. Pi
Dear all,
I would like to perform a geostatistical analysis using R.
To do so, I classically use the geoR or gstat packages.
In the present case, I have a dataset of around 157,000 locations where I
have a value (depth).
I have not been able to create either a valid geodata or a valid gstat R object
because appa